Using Next.js RSC
Chat interface for your LlamaIndexTS application using Next.js RSC
With chat-ui, it's easy to add a chat interface to your LlamaIndexTS application using Next.js RSC and Vercel AI RSC.
With RSC, the chat messages are not returned as JSON from the server (as they are when using an API route); instead, the chat message components are rendered on the server side. This is useful, for example, for rendering a whole chat history on the server before sending it to the client. Check here for a discussion of when to use RSC.
To implement a chat interface with RSC, you need to create an AI action and then connect the chat interface to it.
Create an AI action
First, define an AI context provider with a chat server action:
The chat server action uses LlamaIndexTS to generate a response based on the chat history and the user input.
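As a minimal sketch, the action might look like the following. It assumes the Vercel AI RSC primitives (`createAI`, `createStreamableUI`, `getMutableAIState`) together with LlamaIndexTS's `SimpleChatEngine`; the file name `app/ai.tsx`, the action name `chat`, and the OpenAI model choice are all illustrative, not the exact code from the example project:

```tsx
// app/ai.tsx -- a minimal sketch, not a drop-in implementation
import { createAI, createStreamableUI, getMutableAIState } from 'ai/rsc'
import { OpenAI, SimpleChatEngine, type ChatMessage } from 'llamaindex'
import type { ReactNode } from 'react'

async function chat(message: string) {
  'use server'

  // Record the user message in the server-side AI state
  const aiState = getMutableAIState<typeof AI>()
  const history = aiState.get() as ChatMessage[]
  aiState.update([...history, { role: 'user', content: message }])

  // Stream the assistant reply as a server-rendered component
  const ui = createStreamableUI(<p>Loading...</p>)

  // Generate the response in the background so the streamable UI
  // is returned to the client immediately
  void (async () => {
    const engine = new SimpleChatEngine({
      llm: new OpenAI({ model: 'gpt-4o-mini' }), // model choice is illustrative
    })
    const stream = await engine.chat({ message, chatHistory: history, stream: true })

    let text = ''
    for await (const chunk of stream) {
      text += chunk.delta
      ui.update(<p>{text}</p>)
    }
    ui.done(<p>{text}</p>)
    aiState.done([...aiState.get(), { role: 'assistant', content: text }])
  })()

  return ui.value
}

// The AI context provider exposes the chat action to client components
export const AI = createAI({
  actions: { chat },
  initialAIState: [] as ChatMessage[],
  initialUIState: [] as ReactNode[],
})
```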
Create the chat UI
The entrypoint of our application initializes the AI provider and adds a ChatSection component:
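For example (file and component names are assumptions carried over from the sketch above), a minimal page could look like this:

```tsx
// app/page.tsx -- sketch; file and component names are assumptions
import { AI } from './ai'
import { ChatSection } from './chat-section'

export default function Home() {
  return (
    // The AI provider makes the chat action and UI state available below
    <AI>
      <ChatSection />
    </AI>
  )
}
```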
The ChatSection component is created using chat components from @llamaindex/chat-ui:
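A minimal sketch, assuming chat-ui's `ChatSection` component accepts a `handler` object and that the `useChatRSC` hook (sketched after the next paragraph) lives in the same app:

```tsx
'use client'
// app/chat-section.tsx -- minimal sketch

import { ChatSection as ChatSectionUI } from '@llamaindex/chat-ui'
import { useChatRSC } from './use-chat-rsc'

export function ChatSection() {
  // The handler connects chat-ui's input and message list to the AI action
  const handler = useChatRSC()
  return <ChatSectionUI handler={handler} />
}
```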
It uses a useChatRSC hook to connect the chat interface to the chat AI action that we defined earlier:
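Here is a sketch of such a hook built on the ai/rsc client primitives `useActions` and `useUIState`; the handler shape returned here is simplified and may not match chat-ui's full handler interface exactly:

```tsx
'use client'
// app/use-chat-rsc.tsx -- sketch; adapts the chat AI action to a chat-ui handler

import { useActions, useUIState } from 'ai/rsc'
import { useState } from 'react'
import type { AI } from './ai'

export function useChatRSC() {
  const [input, setInput] = useState('')
  const [isLoading, setIsLoading] = useState(false)
  const [messages, setMessages] = useUIState<typeof AI>()
  const { chat } = useActions<typeof AI>()

  const append = async (message: { role: string; content: string }) => {
    setIsLoading(true)
    try {
      // Show the user message immediately, then stream the assistant reply
      setMessages(prev => [...prev, <p key={prev.length}>{message.content}</p>])
      const display = await chat(message.content)
      setMessages(prev => [...prev, display])
    } finally {
      setIsLoading(false)
    }
    return message.content
  }

  return { input, setInput, isLoading, messages, append }
}
```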
Next Steps
The steps above are the bare minimum to get a chat interface working with RSC. From here, you can go two ways:
- Use our full-stack RSC example based on create-llama to get started quickly with a fully working chat interface, or
- Learn more about AI RSC, chat-ui and LlamaIndexTS to customize the chat interface and AI actions to your needs.