Using Next.js RSC

Chat interface for your LlamaIndexTS application using Next.js RSC

Using chat-ui, it's easy to add a chat interface to your LlamaIndexTS application using Next.js RSC and Vercel AI RSC.

With RSC, the chat messages are not returned as JSON from the server (as they are when using an API route); instead, the chat message components are rendered on the server side. This is useful, for example, for rendering a whole chat history on the server before sending it to the client. Check here for a discussion of when to use RSC.
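
To make the difference concrete, here is a rough sketch (the function is made up and not part of the example below): instead of returning JSON for the client to render, an RSC server action can return a React node that was already rendered on the server:

"use server";
 
// Sketch only: the action returns a server-rendered React node instead of
// JSON data (e.g. Response.json(...) from an API route).
export async function hello() {
  return <p>Hello from the server</p>;
}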

For implementing a chat interface with RSC, you need to create an AI action and then connect the chat interface to use it.

Create an AI action

First, define an AI context provider with a chat server action:

import { Markdown } from "@llamaindex/chat-ui/widgets";
import { MockLLM } from "@llamaindex/core/utils";
import { generateId, Message } from "ai";
import { createAI, createStreamableUI, getMutableAIState } from "ai/rsc";
import { type ChatMessage, Settings, SimpleChatEngine } from "llamaindex";
import { ReactNode } from "react";
 
type ServerState = Message[];
type FrontendState = Array<Message & { display: ReactNode }>;
type Actions = {
  chat: (message: Message) => Promise<Message & { display: ReactNode }>;
};
 
Settings.llm = new MockLLM(); // config your LLM here
 
export const AI = createAI<ServerState, FrontendState, Actions>({
  initialAIState: [],
  initialUIState: [],
  actions: {
    chat: async (message: Message) => {
      "use server";
 
      const aiState = getMutableAIState<typeof AI>();
      aiState.update((prev) => [...prev, message]);
 
      // stream the assistant message UI to the client while it is generated
      const uiStream = createStreamableUI();
      const chatEngine = new SimpleChatEngine();
      const assistantMessage: Message = {
        id: generateId(),
        role: "assistant",
        content: "",
      };
 
      // run the async function without blocking
      (async () => {
        const chatResponse = await chatEngine.chat({
          stream: true,
          message: message.content,
          chatHistory: aiState.get() as ChatMessage[],
        });
 
        for await (const chunk of chatResponse) {
          assistantMessage.content += chunk.delta;
          uiStream.update(<Markdown content={assistantMessage.content} />);
        }
 
        aiState.done([...aiState.get(), assistantMessage]);
        uiStream.done();
      })();
 
      return {
        ...assistantMessage,
        display: uiStream.value,
      };
    },
  },
});

The chat server action uses LlamaIndexTS to generate a response based on the chat history and the user input.
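
The example configures MockLLM so it runs without credentials. For a real application, you would set Settings.llm to an actual model, for example via the OpenAI integration (the package name and model shown here are assumptions, adjust them to your LlamaIndexTS setup):

import { OpenAI } from "@llamaindex/openai"; // assumed integration package
import { Settings } from "llamaindex";
 
// Requires OPENAI_API_KEY to be set in the environment
Settings.llm = new OpenAI({ model: "gpt-4o-mini" });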

Create the chat UI

The entrypoint of our application initializes the AI provider and renders the ChatSection component:

import { AI } from "./ai-action";
import { ChatSectionRSC } from "./chat-section";
 
export const ChatDemoRSC = () => (
  <AI>
    <ChatSectionRSC />
  </AI>
);
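
In a Next.js App Router project, this entrypoint can then be rendered from a page, for example (the file paths are illustrative):

// app/page.tsx
import { ChatDemoRSC } from "./chat-demo";
 
export default function Page() {
  return <ChatDemoRSC />;
}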

The ChatSection component is built using chat components from @llamaindex/chat-ui:

"use client";
 
import {
  ChatHandler,
  ChatInput,
  ChatMessage,
  ChatMessages,
  ChatSection as ChatSectionUI,
  Message,
} from "@llamaindex/chat-ui";
import { useChatRSC } from "./use-chat-rsc";
 
export const ChatSectionRSC = () => {
  const handler = useChatRSC();
  return (
    <ChatSectionUI handler={handler as ChatHandler}>
      <ChatMessages>
        <ChatMessages.List className="h-auto max-h-[400px]">
          {handler.messages.map((message, index) => (
            <ChatMessage
              key={index}
              message={message as Message}
              isLast={index === handler.messages.length - 1}
            >
              <ChatMessage.Avatar />
              <ChatMessage.Content>{message.display}</ChatMessage.Content>
            </ChatMessage>
          ))}
          <ChatMessages.Loading />
        </ChatMessages.List>
      </ChatMessages>
      <ChatInput />
    </ChatSectionUI>
  );
};

It uses the useChatRSC hook to connect the chat interface to the chat AI action that we defined earlier:

"use client";
 
import { useActions } from "ai/rsc";
 
import { generateId, Message } from "ai";
import { useUIState } from "ai/rsc";
import { useState } from "react";
import { AI } from "./ai-action";
 
export function useChatRSC() {
  const [input, setInput] = useState<string>("");
  const [isLoading, setIsLoading] = useState<boolean>(false);
  const [messages, setMessages] = useUIState<typeof AI>();
  const { chat } = useActions<typeof AI>();
 
  const append = async (message: Omit<Message, "id">) => {
    const newMsg: Message = { ...message, id: generateId() };
 
    setIsLoading(true);
    try {
      setMessages((prev) => [...prev, { ...newMsg, display: message.content }]);
      const assistantMsg = await chat(newMsg);
      setMessages((prev) => [...prev, assistantMsg]);
    } catch (error) {
      console.error(error);
    }
    setIsLoading(false);
    setInput("");
 
    return message.content;
  };
 
  return {
    input,
    setInput,
    isLoading,
    messages,
    setMessages,
    append,
  };
}
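
The append function returned by useChatRSC is used by the chat components, but it can also be called directly, for example to send a predefined question from a button. A hypothetical snippet (the component must be rendered inside the AI provider):

"use client";
 
import { useChatRSC } from "./use-chat-rsc";
 
export const SuggestedQuestion = () => {
  const { append, isLoading } = useChatRSC();
  return (
    <button
      disabled={isLoading}
      onClick={() => append({ role: "user", content: "What is LlamaIndexTS?" })}
    >
      What is LlamaIndexTS?
    </button>
  );
};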

Next Steps

The steps above are the bare minimum to get a chat interface working with RSC. From here, you can go two ways:

  1. Use our full-stack RSC example based on create-llama to get started quickly with a fully working chat interface or
  2. Learn more about AI RSC, chat-ui and LlamaIndexTS to customize the chat interface and AI actions to your needs.