This project is a chatbot application built using Next.js, TypeScript, Supabase, Langchain, and FastAPI. The frontend is responsible for the user interface and handles user interactions, while the backend, powered by FastAPI + SQLAlchemy, processes chat messages using OpenAI's GPT-3.5 model and manages API endpoints. The application includes user authentication via Supabase, allowing users to sign up, log in, and manage their sessions. This architecture ensures a clear separation of concerns, with the backend handling all AI-related logic and the frontend focused on delivering a seamless user experience.
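To make the frontend/backend contract concrete, here is a minimal sketch of the kind of chat payload the Next.js frontend might POST to the FastAPI backend. The field names below are illustrative assumptions, not taken from the repository.

```typescript
// Hypothetical shape of a chat request; field names are assumptions.
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

// The frontend sends a JSON body like this to the backend's chat endpoint;
// the backend feeds the message history to GPT-3.5 via Langchain and
// returns the assistant's reply.
export function buildChatRequest(messages: ChatMessage[]): string {
  return JSON.stringify({ messages });
}
```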
Clone the repository:
git clone https://github.com/ojasskapre/nextjs-starter-template.git
cd nextjs-starter-template/frontend
Install dependencies:
npm install
or
yarn install
Set up your Supabase project:
Set up your OpenAI API key:
Create a .env.local file in the root of the project and add your Supabase and OpenAI credentials:
NEXT_PUBLIC_SUPABASE_URL=your_supabase_url
NEXT_PUBLIC_SUPABASE_ANON_KEY=your_supabase_anon_key
OPENAI_API_KEY=your_openai_api_key
Start the development server:
npm run dev
or
yarn dev
Open your browser and navigate to http://localhost:3000.
Navigate to the backend folder:
cd ../backend
Install Poetry if you haven't already:
curl -sSL https://install.python-poetry.org | python3 -
Install the backend dependencies:
poetry install
Set up environment variables: create a .env file in the backend directory and add:
OPENAI_API_KEY=your_openai_api_key
DATABASE_URL=your_database_url
SUPABASE_URL=your-project-url
SUPABASE_ANON_KEY=your-anon-key
Run the FastAPI server:
poetry run uvicorn app.main:app --reload
The FastAPI server will be running at http://127.0.0.1:8000.
Make sure you have Docker and Docker Compose installed on your system.
Ensure that you have set up the environment variables in the .env files in the frontend and backend directories.
Run the following command to start the application:
docker compose up --build
The application will be running at http://localhost:3000. The FastAPI server will be running at http://localhost:8000.
Remember to stop the containers using docker compose down when you are done.
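The repository's compose file isn't reproduced in this README, but a two-service setup consistent with the ports and .env locations above might look roughly like this (service names and build paths are assumptions):

```yaml
# Hypothetical sketch, not the repository's actual docker-compose.yml.
services:
  frontend:
    build: ./frontend
    ports:
      - "3000:3000"
    env_file:
      - ./frontend/.env
    depends_on:
      - backend
  backend:
    build: ./backend
    ports:
      - "8000:8000"
    env_file:
      - ./backend/.env
```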
Instead of using FastAPI to handle and respond to chat queries, you can handle everything directly in the Next.js frontend by leveraging langchain.js and @langchain/openai. This approach simplifies the architecture by keeping everything within the frontend, which can be useful for certain use cases.
Steps to implement:
Create the API Route in Next.js
Navigate to the @/app/api/chat/ directory and create a new file named route.ts.
cd frontend/app/api/chat
Copy the contents from @/example/chat.route.ts to the newly created route.ts file.
Update the api parameter in useChat() in @/components/chat/Section.tsx from ${backendUrl}/api/chat to /api/chat.
Follow the frontend installation steps and start the application.
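The contents of @/example/chat.route.ts aren't reproduced in this README, so the following is only a rough sketch of the shape such a route handler can take. The model call is injected as a parameter here so the skeleton stays self-contained; in the real route you would invoke GPT-3.5 through @langchain/openai at that point. The names makeChatHandler and ModelFn are hypothetical.

```typescript
// Hypothetical skeleton of a chat route handler — not the actual
// contents of @/example/chat.route.ts.
type ChatMessage = { role: string; content: string };
type ModelFn = (messages: ChatMessage[]) => Promise<string>;

export function makeChatHandler(callModel: ModelFn) {
  // Mirrors a Next.js App Router POST handler: parse the message
  // history from the request body, get a reply, and return it.
  return async function POST(req: { json(): Promise<{ messages: ChatMessage[] }> }) {
    const { messages } = await req.json();
    // In the real route, this is where @langchain/openai would call GPT-3.5.
    const content = await callModel(messages);
    return { status: 200, body: JSON.stringify({ role: "assistant", content }) };
  };
}
```

Injecting the model call keeps the handler's control flow visible without depending on the SDK; swapping in the real Langchain client only changes the callModel argument.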