# LLaMA Chat
A modern chat interface built with SvelteKit and Tailwind CSS that connects to various LLMs through the GROQ API.
## Live Demo

Try it out: [chat.ifsvivek.in](https://chat.ifsvivek.in)
## Features

- Multiple AI model support (LLaMA, Mixtral, Gemma)
- Fast and responsive UI with SvelteKit
- Beautiful design using Tailwind CSS
- Real-time chat interface
- Dark mode
- Model switching on the fly (see the sketch below)
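
The app's exact request code is not included in this README, but conceptually, switching models comes down to sending a different model ID with each chat-completion request. Below is a minimal sketch, assuming the official `groq-sdk` npm package; the model IDs and the `chat` helper are illustrative, not taken from the repository:

```ts
// Minimal sketch of calling the Groq chat-completions API with a selectable model.
// Assumes the official `groq-sdk` package; model IDs below are examples only.
import Groq from 'groq-sdk';

const groq = new Groq({ apiKey: process.env.GROQ_API_KEY });

// Hypothetical model map; the app's actual options may differ.
const MODELS = {
	llama: 'llama-3.1-8b-instant',
	mixtral: 'mixtral-8x7b-32768',
	gemma: 'gemma-7b-it'
} as const;

export async function chat(message: string, model: keyof typeof MODELS = 'llama') {
	const completion = await groq.chat.completions.create({
		model: MODELS[model],
		messages: [{ role: 'user', content: message }]
	});
	return completion.choices[0]?.message?.content ?? '';
}
```

Because only the `model` string changes between requests, switching models at runtime requires no change to the message format.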
## Getting Started

### Prerequisites
- Node.js 16 or later
- npm or pnpm
### Installation

- Clone the repository:

  ```bash
  git clone https://github.com/ifsvivek/chat.git
  ```

- Install dependencies:

  ```bash
  npm install
  # or
  pnpm install
  ```

- Create a `.env` file in the root directory and add your GROQ API key (see the server-side sketch after these steps):

  ```
  GROQ_API_KEY=your_api_key_here
  ```

- Start the development server:

  ```bash
  npm run dev
  # or
  pnpm dev
  ```

- Open http://localhost:5173 in your browser.
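
How the key is consumed is not shown above, but in SvelteKit, private environment variables such as `GROQ_API_KEY` are only readable from server-side code. Here is a minimal sketch of a server endpoint that forwards a chat message to Groq; the route path, request shape, and fallback model ID are assumptions for illustration, not necessarily what this repository does:

```ts
// Hypothetical route file (e.g. src/routes/api/chat/+server.ts); the repo's actual endpoint may differ.
// GROQ_API_KEY comes from the .env file via SvelteKit's private env module.
import { GROQ_API_KEY } from '$env/static/private';
import { json } from '@sveltejs/kit';
import type { RequestHandler } from './$types';
import Groq from 'groq-sdk';

const groq = new Groq({ apiKey: GROQ_API_KEY });

export const POST: RequestHandler = async ({ request }) => {
	const { message, model } = await request.json();

	const completion = await groq.chat.completions.create({
		model: model ?? 'llama-3.1-8b-instant', // illustrative fallback model ID
		messages: [{ role: 'user', content: message }]
	});

	return json({ reply: completion.choices[0]?.message?.content ?? '' });
};
```

A client component would then `fetch('/api/chat', { method: 'POST', body: JSON.stringify({ message, model }) })` and render the returned `reply`.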
## Tech Stack