
Using API Route

Chat interface for your LlamaIndexTS application using API Route

With chat-ui, it's easy to add a chat interface to your LlamaIndexTS application. You only need an API route that exposes an api/chat endpoint and a chat component that consumes it.

API route

As an example, this is an API route for the Next.js App Router. Copy the following code into your app/api/chat/route.ts file to get started:

import { MockLLM } from "@llamaindex/core/utils";
import { LlamaIndexAdapter, type Message } from "ai";
import { Settings, SimpleChatEngine, type ChatMessage } from "llamaindex";
import { NextResponse, type NextRequest } from "next/server";
 
Settings.llm = new MockLLM(); // configure your LLM here
 
export async function POST(request: NextRequest) {
  try {
    const { messages } = (await request.json()) as { messages: Message[] };
    const userMessage = messages[messages.length - 1];
    if (!userMessage || userMessage.role !== "user") {
      return NextResponse.json(
        { detail: "Last message is not a user message" },
        { status: 400 },
      );
    }
 
    const chatEngine = new SimpleChatEngine();
 
    return LlamaIndexAdapter.toDataStreamResponse(
      await chatEngine.chat({
        message: userMessage.content,
        chatHistory: messages as ChatMessage[],
        stream: true,
      }),
      {},
    );
  } catch (error) {
    const detail = (error as Error).message;
    return NextResponse.json({ detail }, { status: 500 });
  }
}
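The request handling in the route can be factored into small pure helpers, which makes it easy to unit-test without spinning up a server. A minimal sketch, assuming message shapes that mirror `Message` from the `ai` package and `ChatMessage` from `llamaindex` (role + content); `getUserMessage` and `toChatHistory` are hypothetical helpers for illustration, not library exports:

```typescript
// Message shape mirroring what the route receives: a role plus text content.
type Role = "user" | "assistant" | "system";

interface RequestMessage {
  role: Role;
  content: string;
}

// Returns the last message only if it came from the user -- the same
// guard the route applies before calling the chat engine.
function getUserMessage(messages: RequestMessage[]): RequestMessage | null {
  const last = messages[messages.length - 1];
  if (!last || last.role !== "user") return null;
  return last;
}

// Explicitly maps request messages to the chat-history shape, rather
// than relying on an `as ChatMessage[]` cast as the route does.
function toChatHistory(
  messages: RequestMessage[],
): { role: Role; content: string }[] {
  return messages.map(({ role, content }) => ({ role, content }));
}
```

Keeping the validation separate from the route handler also makes the 400 path (no trailing user message) trivial to cover in tests.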

Chat UI

This is the simplest way to add a chat interface to your application. Copy the following code into your application to consume the API:

"use client";
import {
  ChatHandler,
  ChatInput,
  ChatMessages,
  ChatSection,
} from "@llamaindex/chat-ui";
import { useChat } from "ai/react";
 
export const ChatDemo = () => {
  const handler = useChat();
  return (
    <ChatSection handler={handler as ChatHandler}>
      <ChatMessages>
        <ChatMessages.List className="h-auto max-h-[400px]" />
        <ChatMessages.Actions />
      </ChatMessages>
      <ChatInput />
    </ChatSection>
  );
};
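Under the hood, useChat keeps the message list in state and, on submit, POSTs it as a JSON body containing a messages array to /api/chat by default — the same shape the route above destructures. A minimal sketch of that payload; `buildChatRequestBody` is a hypothetical helper for illustration, not an AI SDK API:

```typescript
// Shape of the entries useChat sends to the API route.
interface UIMessage {
  role: "user" | "assistant" | "system";
  content: string;
}

// Serializes the message list into the `{ messages: [...] }` JSON body
// that the API route's `request.json()` call destructures.
function buildChatRequestBody(messages: UIMessage[]): string {
  return JSON.stringify({ messages });
}
```

Because both sides agree on this shape, the route's `const { messages } = await request.json()` picks the history up directly.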

Combining both, you get a fully functional chat interface.

Next Steps

The steps above are the bare minimum to get a chat interface working. From here, you can go two ways:

  1. Use create-llama to scaffold a new LlamaIndexTS project including complex API routes and chat interfaces, or
  2. Learn more about chat-ui and LlamaIndexTS to customize the chat interface and API routes to your needs.
