Building a Next.js Chatbot with OpenAI
At Interstellar Solutions, we set out to build an AI-powered chatbot that could provide intelligent, conversational responses to users. By leveraging OpenAI’s API and the flexibility of Next.js, we created a seamless, scalable chatbot integrated into our web application.
To achieve this, I initialized a Next.js project, set up API routes for secure OpenAI communication, and built a client-side interface for real-time chat. Below, I’ll walk you through the process of building a chatbot with Next.js and OpenAI.
Setting Up the Next.js Project
Start by creating a new Next.js project and installing the OpenAI client library:
```bash
npx create-next-app@latest my-chatbot-app
cd my-chatbot-app
npm install openai
```
You’ll also need an OpenAI API key, which you can obtain from the OpenAI platform. Store the key in a .env.local file:
```
OPENAI_API_KEY=your-openai-api-key
```
Server-Side API Route for OpenAI
To securely handle OpenAI requests, we created a Next.js API route in pages/api/chat.ts. This route processes user messages and sends them to OpenAI’s API:
```typescript
import type { NextApiRequest, NextApiResponse } from 'next';
import OpenAI from 'openai';

const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
});

export default async function handler(req: NextApiRequest, res: NextApiResponse) {
  if (req.method !== 'POST') {
    return res.status(405).json({ error: 'Method not allowed' });
  }

  try {
    const { message } = req.body;
    if (!message) {
      return res.status(400).json({ error: 'Message is required' });
    }

    const completion = await openai.chat.completions.create({
      model: 'gpt-4o-mini',
      messages: [
        { role: 'system', content: 'You are a helpful assistant.' },
        { role: 'user', content: message },
      ],
    });

    const response = completion.choices[0]?.message?.content || 'No response';
    res.status(200).json({ response });
  } catch (error) {
    console.error('OpenAI error:', error);
    res.status(500).json({ error: 'Internal server error' });
  }
}
```
This API route accepts POST requests with a user message, sends it to OpenAI’s gpt-4o-mini model, and returns the AI’s response.
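As a quick reference, the JSON contract the route implements can be sketched in TypeScript. The type and helper names below (ChatRequest, ChatResponse, isValidChatRequest) are illustrative, not part of the route’s code:

```typescript
// Sketch of the JSON shapes exchanged with /api/chat.
type ChatRequest = { message: string };
type ChatResponse = { response: string } | { error: string };

// Mirrors the route's input check: the body must carry a non-empty
// `message` string, otherwise the route responds with a 400.
function isValidChatRequest(body: unknown): body is ChatRequest {
  return (
    typeof body === 'object' &&
    body !== null &&
    typeof (body as ChatRequest).message === 'string' &&
    (body as ChatRequest).message.length > 0
  );
}
```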
Client-Side Chat Interface
We built a simple chat interface in pages/index.tsx using React state to manage the conversation. The interface sends user input to the /api/chat endpoint and displays the AI’s responses:
```tsx
import { useState } from 'react';

type Message = { role: 'user' | 'assistant'; content: string };

export default function Chatbot() {
  const [messages, setMessages] = useState<Message[]>([]);
  const [input, setInput] = useState('');

  const handleSend = async () => {
    if (!input.trim()) return;
    const userMessage: Message = { role: 'user', content: input };
    // Use functional updates so state stays correct even if sends
    // overlap (avoids stale-closure bugs).
    setMessages((prev) => [...prev, userMessage]);
    setInput('');

    try {
      const res = await fetch('/api/chat', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ message: userMessage.content }),
      });
      const data = await res.json();
      if (data.error) {
        setMessages((prev) => [...prev, { role: 'assistant', content: `Error: ${data.error}` }]);
      } else {
        setMessages((prev) => [...prev, { role: 'assistant', content: data.response }]);
      }
    } catch (error) {
      setMessages((prev) => [...prev, { role: 'assistant', content: 'Error: Failed to connect to the server' }]);
    }
  };

  return (
    <div className="max-w-2xl mx-auto p-4">
      <h1 className="text-2xl font-bold mb-4">AI Chatbot</h1>
      <div className="border rounded-lg p-4 mb-4 h-96 overflow-y-auto">
        {messages.map((msg, index) => (
          <div key={index} className={`mb-2 ${msg.role === 'user' ? 'text-right' : 'text-left'}`}>
            <span className={`inline-block p-2 rounded ${msg.role === 'user' ? 'bg-blue-100' : 'bg-gray-100'}`}>
              {msg.content}
            </span>
          </div>
        ))}
      </div>
      <div className="flex gap-2">
        <input
          type="text"
          value={input}
          onChange={(e) => setInput(e.target.value)}
          className="flex-1 p-2 border rounded"
          placeholder="Type your message..."
          onKeyDown={(e) => e.key === 'Enter' && handleSend()}
        />
        <button onClick={handleSend} className="p-2 bg-blue-500 text-white rounded">
          Send
        </button>
      </div>
    </div>
  );
}
```
Next.js Routes
The Next.js application uses the following routes:
- / - Main chat interface (defined in pages/index.tsx)
- /api/chat - API route for handling OpenAI chat requests
Best Practices for Next.js and OpenAI
- Secure API Key: Store the OpenAI API key in environment variables and never expose it client-side.
- Error Handling: Implement robust error handling on both client and server to handle API failures gracefully.
- Rate Limiting: Consider adding rate limiting to the /api/chat route to prevent abuse.
- Styling: Use Tailwind CSS (as shown in the example) for a responsive, modern interface.
- Conversation Context: Enhance the chatbot by maintaining conversation history in the OpenAI API calls for context-aware responses.
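The rate-limiting point above can be sketched with a minimal in-memory fixed-window limiter. The names (RateLimiter, allow) and the window parameters are illustrative, and a real deployment would use a shared store such as Redis, since serverless instances don’t share memory:

```typescript
// Fixed-window rate limiter sketch: at most `limit` requests per
// `windowMs` milliseconds for each key (e.g. a client IP address).
class RateLimiter {
  private hits = new Map<string, { count: number; windowStart: number }>();

  constructor(private limit: number, private windowMs: number) {}

  // Returns true while the caller is within its allowance for the
  // current window; a new window resets the count.
  allow(key: string, now: number = Date.now()): boolean {
    const entry = this.hits.get(key);
    if (!entry || now - entry.windowStart >= this.windowMs) {
      this.hits.set(key, { count: 1, windowStart: now });
      return true;
    }
    entry.count += 1;
    return entry.count <= this.limit;
  }
}
```

In the API route, the handler could call allow() with the request’s IP before contacting OpenAI and respond with status 429 when it returns false.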
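The conversation-context point can likewise be sketched as a small helper, assuming the API route is changed to accept a messages array rather than a single message. The helper name (buildChatMessages) and the 10-message cap are illustrative choices, not part of the code above:

```typescript
type ChatMessage = { role: 'system' | 'user' | 'assistant'; content: string };

// Builds the `messages` array for a chat completion: the system prompt
// first, then the most recent turns plus the new user input, trimmed
// to `maxTurns` entries to stay within the model's context window.
function buildChatMessages(
  history: ChatMessage[],
  userInput: string,
  maxTurns: number = 10
): ChatMessage[] {
  const recent = [...history, { role: 'user' as const, content: userInput }].slice(-maxTurns);
  return [{ role: 'system', content: 'You are a helpful assistant.' }, ...recent];
}
```

The client would POST this array, and the route would pass it straight through as messages to openai.chat.completions.create, giving the model the earlier turns as context.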