This is a simple chat application built with Next.js, Convex, and Tailwind CSS. It allows users to chat with an AI assistant powered by OpenAI.
It uses Convex together with the Vercel AI SDK (`useChat`) for data synchronization and state management.

Key features, with the files that implement them:

- Background AI responses from multiple models (`convex/multiModelAI.ts`).
- Chat session management (`convex/chats.ts`).
- AI model preference handling (`convex/modelPreferences.ts`).
- Chat archival (`convex/chat.ts`).
- An auto-resizing message input (`react-textarea-autosize`).
- Toast notifications (`hooks/use-toast.ts`).

To run it locally:

Clone the repository:
git clone https://github.com/waynesutton/nextjsaichatconvextemplate
cd nextjsaichatconvextemplate
Install dependencies:
npm install
# or
yarn install
# or
pnpm install
Set up Convex:
npm install -g convex
npx convex login
npx convex dev
This command watches the `convex/` directory for changes and provides a local development backend. Note your development deployment URL from the `npx convex dev` output or the Convex dashboard.

Set up Environment Variables:

Create a `.env.local` file in the root directory of your project:

NEXT_PUBLIC_CONVEX_URL=<your-convex-dev-url>

In the Convex dashboard, add an environment variable named `OPENAI_API_KEY` with your OpenAI API key as the value. (The provider sketch after this section shows how the Convex URL is consumed by the frontend.)

Run the Next.js development server:
npm run dev
# or
yarn dev
# or
pnpm dev
Open http://localhost:3000 with your browser to see the result.
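For reference, `app/providers.tsx` is where the frontend connects to your Convex deployment using `NEXT_PUBLIC_CONVEX_URL`. A minimal sketch of that wiring (not necessarily the file's exact contents; the `Providers` component name is an assumption) looks like this:

```tsx
// Minimal sketch of a Convex provider for the Next.js App Router.
// See app/providers.tsx for the template's actual code.
"use client";

import { ConvexProvider, ConvexReactClient } from "convex/react";
import { ReactNode } from "react";

// One client for the whole app, pointed at the URL from .env.local.
const convex = new ConvexReactClient(process.env.NEXT_PUBLIC_CONVEX_URL!);

export function Providers({ children }: { children: ReactNode }) {
  return <ConvexProvider client={convex}>{children}</ConvexProvider>;
}
```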
Key dependencies:

- Vercel AI SDK (`ai` package): Provides hooks and utilities (`useChat`) for building chat interfaces (see the sketch below).
- `react-textarea-autosize`: Component for automatically adjusting textarea height based on content.
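As a generic illustration of these two dependencies (this is not the template's actual wiring, which lives in `components/convex-chat-provider.tsx` and `components/chat.tsx`), a plain `useChat` form with an auto-resizing input might look like the following. Note that `useChat` posts to an `/api/chat` route by default, whereas this template routes messages through Convex, and depending on your `ai` version the hook may be imported from `ai/react` or `@ai-sdk/react`:

```tsx
// Generic useChat + react-textarea-autosize sketch (the component name is illustrative).
"use client";

import { useChat } from "ai/react";
import TextareaAutosize from "react-textarea-autosize";

export function MiniChat() {
  // Message list, input value, and submit handling come from the Vercel AI SDK hook.
  const { messages, input, handleInputChange, handleSubmit } = useChat();

  return (
    <form onSubmit={handleSubmit}>
      {messages.map((m) => (
        <p key={m.id}>
          <strong>{m.role}:</strong> {m.content}
        </p>
      ))}
      {/* TextareaAutosize grows with its content instead of scrolling. */}
      <TextareaAutosize minRows={1} value={input} onChange={handleInputChange} />
      <button type="submit">Send</button>
    </form>
  );
}
```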
Project structure:

nextjs-convex-demo/
├── app/
│ ├── layout.tsx # Main application layout
│ ├── page.tsx # Main page component (renders Chat)
│ ├── providers.tsx # Context providers (Convex, Theme, etc.)
│ └── globals.css # Global styles and Tailwind directives
├── components/
│ ├── chat.tsx # Core chat UI component
│ ├── chat-message.tsx # Renders individual messages
│ ├── convex-chat-provider.tsx # Integrates Convex with useChat
│ ├── navbar.tsx # Application navigation bar
│ ├── footer.tsx # Application footer
│ └── ui/ # Shadcn/ui components (toast.tsx, button.tsx, etc.)
├── convex/
│ ├── schema.ts # Database schema definition
│ ├── chat.ts # Chat archival logic
│ ├── directMessages.ts # Saving AI responses
│ ├── init.ts # Initial data seeding
│ ├── messages.ts # Message query/mutation functions
│ ├── modelPreferences.ts # AI model preference logic
│ ├── multiModelAI.ts # Core Convex action that calls AI models (e.g., OpenAI) asynchronously
│ ├── openai.ts # OpenAI action wrappers (re-exports)
│ ├── useOpenAI.ts # Direct OpenAI interaction actions
│ └── _generated/ # Auto-generated Convex types and API (DO NOT EDIT)
├── hooks/
│ └── use-toast.ts # Custom hook for toast notifications
├── lib/
│ └── utils.ts # Utility functions (e.g., cn for classnames)
├── public/ # Static assets (images, fonts, etc.)
├── .env.local # Local environment variables (Convex URL)
├── .eslintrc.json # ESLint configuration
├── components.json # Shadcn/ui configuration
├── next.config.js # Next.js configuration
├── package.json # Project dependencies and scripts
├── postcss.config.js # PostCSS configuration (Tailwind)
├── tailwind.config.ts # Tailwind CSS configuration
├── tsconfig.json # TypeScript configuration
├── README.md # Project overview and setup guide (this file)
├── convexsetup.md # Convex-specific setup guide
├── filesjason.md # Descriptions of project files
└── nextchatjsonprompt.md # JSON prompt structure for the app
Key files:

- `app/page.tsx`: The main entry point and layout for the application using the Next.js App Router. Renders the `Chat` component.
- `components/chat.tsx`: The main chat interface component. It uses `useConvexChat` for state management and renders `ChatMessage` components.
- `components/chat-message.tsx`: Renders individual chat messages (user or assistant).
- `components/convex-chat-provider.tsx`: Contains the `ConvexChatProvider` and the `useConvexChat` hook, which integrates Convex with the Vercel AI SDK's `useChat` hook for managing chat state, sending messages, and handling AI responses via Convex actions.
- `convex/schema.ts`: Defines the database schema for Convex tables (e.g., `messages`, `chats`). See the sketches after this list.
- `convex/messages.ts`: Contains Convex query and mutation functions related to messages (e.g., `list`, `send`).
- `convex/chats.ts`: Contains Convex query and mutation functions related to chat sessions (e.g., `getOrCreate`, `clear`).
- `convex/openai.ts`: Contains the Convex action (`chat`) responsible for interacting with the OpenAI API to generate AI responses.
- `convex/multiModelAI.ts`: Core Convex action responsible for interacting with AI models (e.g., OpenAI) asynchronously in the background.
- `convex/_generated/`: Files generated automatically by Convex, including API definitions and types based on your schema and functions. Do not edit directly.
- `.env.local`: Local environment variables (only `NEXT_PUBLIC_CONVEX_URL` for development). Sensitive keys like `OPENAI_API_KEY` should be managed in the Convex dashboard.
- `README.md`: This file, providing information about the project.
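To make the backend side more concrete, here are minimal sketches of the kinds of definitions these Convex files contain. They are illustrative only: the field names, index names, argument shapes, and the `generateReply` function name below are assumptions, not the template's exact code.

```typescript
// convex/schema.ts -- sketch; the real schema likely defines more fields and indexes.
import { defineSchema, defineTable } from "convex/server";
import { v } from "convex/values";

export default defineSchema({
  chats: defineTable({
    title: v.string(),
  }),
  messages: defineTable({
    chatId: v.id("chats"),
    role: v.union(v.literal("user"), v.literal("assistant")),
    content: v.string(),
  }).index("by_chat", ["chatId"]),
});
```

Query and mutation functions such as `list` and `send` in `convex/messages.ts` then read and write those tables:

```typescript
// convex/messages.ts -- sketch of a list query and a send mutation (argument shapes assumed).
import { query, mutation } from "./_generated/server";
import { v } from "convex/values";

export const list = query({
  args: { chatId: v.id("chats") },
  handler: async (ctx, { chatId }) =>
    ctx.db
      .query("messages")
      .withIndex("by_chat", (q) => q.eq("chatId", chatId))
      .collect(),
});

export const send = mutation({
  args: { chatId: v.id("chats"), content: v.string() },
  handler: async (ctx, { chatId, content }) => {
    await ctx.db.insert("messages", { chatId, role: "user", content });
  },
});
```

Finally, an AI action in the spirit of `convex/openai.ts` or `convex/multiModelAI.ts` calls the model provider using the OpenAI key stored in the Convex dashboard:

```typescript
// Sketch of an AI action; the function name and model are assumptions.
"use node";
import { action } from "./_generated/server";
import { v } from "convex/values";
import OpenAI from "openai";

export const generateReply = action({
  args: { prompt: v.string() },
  handler: async (_ctx, { prompt }) => {
    // OPENAI_API_KEY comes from the Convex deployment's environment variables.
    const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });
    const completion = await openai.chat.completions.create({
      model: "gpt-4o-mini",
      messages: [{ role: "user", content: prompt }],
    });
    return completion.choices[0].message.content ?? "";
  },
});
```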
Learn more about the concepts and best practices behind Convex in the Convex documentation.
Follow these steps to deploy your application to Vercel:
Create a Vercel Account: If you don't have one, sign up at vercel.com.
Link Your Project: Import your Git repository into Vercel.
Override the Build Command:
npx convex deploy --cmd 'npm run build'
Set Production Environment Variables in Vercel:
Generate a `CONVEX_DEPLOY_KEY` for Production in the Convex dashboard and add it to your Vercel project as an environment variable named `CONVEX_DEPLOY_KEY`.

Set Production Environment Variables in Convex:

Ensure `OPENAI_API_KEY` is set for the Production environment. This is separate from your development variables.

Deploy:
Vercel will now automatically deploy your Convex functions and frontend changes whenever you push to the designated branch (e.g., main
). The npx convex deploy
command uses the CONVEX_DEPLOY_KEY
to push backend changes and sets the NEXT_PUBLIC_CONVEX_URL
environment variable for the build, pointing your frontend to the correct production Convex deployment.
To enable preview deployments for branches/pull requests:
Generate a Preview Deploy Key in the Convex dashboard.

Add a Preview Environment Variable in Vercel: add the preview deploy key as `CONVEX_DEPLOY_KEY` for Preview deployments.

Now, when Vercel creates a preview deployment for a branch, `npx convex deploy`
will use the preview key to create a unique, isolated Convex backend deployment for that preview. Your frontend preview will automatically connect to this isolated backend.
(Optional) Set Default Preview Variables in Convex: If your preview deployments require specific Convex environment variables (like a default OPENAI_API_KEY
), you can configure "Default Environment Variables" for Preview/Dev deployments in your Convex project settings.
(Optional) Run Setup Function for Previews: If you need to seed data in your preview deployments, add --preview-run 'yourFunctionName'
to the Vercel Build Command. For example: npx convex deploy --cmd 'npm run build' --preview-run 'internal.setup:seedData'
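If you use `--preview-run`, the referenced function needs to exist as an internal Convex function. A minimal seeding sketch in the style of `convex/init.ts` might look like this (the `seedData` name, table fields, and seed content are assumptions matching the schema sketch above):

```typescript
// Sketch of an internal seeding mutation for preview deployments (names and fields assumed).
import { internalMutation } from "./_generated/server";

export const seedData = internalMutation({
  args: {},
  handler: async (ctx) => {
    // Skip seeding if this preview backend already has data.
    const existing = await ctx.db.query("chats").first();
    if (existing) return;

    const chatId = await ctx.db.insert("chats", { title: "Welcome" });
    await ctx.db.insert("messages", {
      chatId,
      role: "assistant",
      content: "Hi! Ask me anything.",
    });
  },
});
```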
We welcome contributions! Here's how you can help:

1. Create a feature branch: `git checkout -b feature/amazing-feature`
2. Commit your changes: `git commit -m 'Add amazing feature'`
3. Push the branch: `git push origin feature/amazing-feature`
4. Open a pull request.
This project is open source and available under the MIT License.