Hooks
Understanding the hook system for accessing chat state and functionality
The hook system provides access to chat state (e.g. message content), context, and functionality throughout the component tree without the need to pass props. These hooks enable deep customization and integration with the chat interface.
Context Hooks
useChatUI
The primary hook for accessing chat state and the handlers that modify it.
```tsx
import { useChatUI } from '@llamaindex/chat-ui'

function CustomChatComponent() {
  const {
    messages,
    input,
    setInput,
    isLoading,
    status,
    sendMessage,
    regenerate,
    stop,
    setMessages,
    requestData,
    setRequestData,
  } = useChatUI()

  const handleSendMessage = async () => {
    await sendMessage({
      id: 'user-msg-1',
      role: 'user',
      parts: [{ type: 'text', text: input }],
    })
  }

  return (
    <div>
      <p>Messages: {messages.length}</p>
      <p>Loading: {isLoading ? 'Yes' : 'No'}</p>
      {status === 'error' && <p>Something went wrong</p>}
      <button onClick={handleSendMessage} disabled={isLoading}>
        Send
      </button>
    </div>
  )
}
```
Returned Properties:

- `messages` - Array of chat messages with parts
- `input` - Current input value
- `setInput` - Function to update input
- `isLoading` - Loading state boolean (computed from status)
- `status` - Current chat status (`'submitted' | 'streaming' | 'ready' | 'error'`)
- `sendMessage` - Function to send a message
- `stop` - Function to stop current generation
- `regenerate` - Function to regenerate a message
- `setMessages` - Function to update the message array
- `requestData` - Additional request data
- `setRequestData` - Function to update request data
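Since `isLoading` is described as computed from `status`, the relationship can be sketched as a small standalone helper. This is an illustration of the likely derivation, not the library's actual source:

```typescript
type ChatStatus = 'submitted' | 'streaming' | 'ready' | 'error'

// Illustrative only: a request counts as "loading" while it has been
// submitted or the response is still streaming.
function computeIsLoading(status: ChatStatus): boolean {
  return status === 'submitted' || status === 'streaming'
}
```

In practice you can simply read `isLoading` from the hook; the sketch only clarifies how the two properties relate.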
useChatMessage
Access the current message context within message components.
```tsx
import { useChatMessage } from '@llamaindex/chat-ui'

function CustomMessageContent() {
  const { message, isLast } = useChatMessage()

  return (
    <div>
      <p>Role: {message.role}</p>
      <p>Parts: {message.parts.length}</p>
      <p>Is last message: {isLast ? 'Yes' : 'No'}</p>
      {message.parts.map((part, index) => (
        <div key={index}>
          <p>
            Part {index + 1}: {part.type}
          </p>
        </div>
      ))}
    </div>
  )
}
```
Returned Properties:

- `message` - Current message object with parts array
- `isLast` - Boolean indicating if this is the last message
usePart
Access the current message content within part components. This hook provides type-safe access to specific part types in the current message.
```tsx
import { usePart } from '@llamaindex/chat-ui'

function TextPartComponent() {
  const textPart = usePart('text')
  if (!textPart) return null

  return <p>{textPart.text}</p>
}

function ArtifactPartComponent() {
  const artifactPart = usePart('data-artifact')
  if (!artifactPart) return null

  return (
    <div>
      <h4>{artifactPart.title}</h4>
      <p>Type: {artifactPart.data.type}</p>
    </div>
  )
}

function CustomPartComponent() {
  const customPart = usePart<CustomPartType>('data-custom-type')
  if (!customPart) return null

  return <CustomRenderer data={customPart.data} />
}
```
Function Overloads:

The hook provides automatic type inference for built-in part types:

- `usePart('text')` → `TextPart | null`
- `usePart('data-file')` → `FilePart | null`
- `usePart('data-artifact')` → `ArtifactPart | null`
- `usePart('data-event')` → `EventPart | null`
- `usePart('data-sources')` → `SourcesPart | null`
- `usePart('data-suggestion')` → `SuggestionPart | null`
Usage Notes:

- Must be used within a `ChatPartProvider` context
- Returns `null` if the part type doesn't match the current part
- For custom part types, use the generic parameter: `usePart<CustomPart>('custom-type')`
useChatInput
Access input form state and handlers.
```tsx
import { useChatInput } from '@llamaindex/chat-ui'

function CustomInputField() {
  const {
    input,
    setInput,
    handleSubmit,
    isLoading,
    onFileUpload,
    uploadedFiles,
  } = useChatInput()

  return (
    <form onSubmit={handleSubmit}>
      <textarea
        value={input}
        onChange={e => setInput(e.target.value)}
        disabled={isLoading}
        placeholder="Type your message..."
      />
      <p>Uploaded files: {uploadedFiles.length}</p>
    </form>
  )
}
```
Returned Properties:

- `input` - Current input value
- `setInput` - Function to update input
- `handleSubmit` - Form submit handler
- `isLoading` - Loading state
- `onFileUpload` - File upload handler
- `uploadedFiles` - Array of uploaded files
useChatMessages
Access messages list state and handlers.
```tsx
import { useChatMessages } from '@llamaindex/chat-ui'

function CustomMessageList() {
  const { messages, isLoading, regenerate, stop, isEmpty, scrollToBottom } =
    useChatMessages()

  return (
    <div>
      <div className="messages">
        {messages.map((msg, i) => (
          <div key={i}>
            <p>Role: {msg.role}</p>
            <div>
              {msg.parts.map((part, index) => (
                <div key={index}>
                  {part.type === 'text' ? (
                    <p>{part.text}</p>
                  ) : (
                    <p>{JSON.stringify(part.data)}</p>
                  )}
                </div>
              ))}
            </div>
          </div>
        ))}
      </div>
      {isEmpty && <p>No messages yet</p>}
      <button onClick={regenerate} disabled={isLoading}>
        Regenerate
      </button>
      <button onClick={stop}>Stop</button>
      <button onClick={scrollToBottom}>Scroll to Bottom</button>
    </div>
  )
}
```
Returned Properties:

- `messages` - Array of messages
- `isLoading` - Loading state
- `regenerate` - Regenerate the last message
- `stop` - Stop generation
- `isEmpty` - Boolean indicating there are no messages
- `scrollToBottom` - Function to scroll to the bottom
useChatCanvas
Access artifact canvas state and handlers.
```tsx
import { useChatCanvas } from '@llamaindex/chat-ui'

function CustomCanvasControls() {
  const {
    currentArtifact,
    isVisible,
    showCanvas,
    hideCanvas,
    updateArtifact,
    artifacts,
  } = useChatCanvas()

  return (
    <div>
      <p>Canvas visible: {isVisible ? 'Yes' : 'No'}</p>
      <p>Current artifact: {currentArtifact?.title || 'None'}</p>
      <p>Total artifacts: {artifacts.length}</p>
      <button onClick={showCanvas}>Show Canvas</button>
      <button onClick={hideCanvas}>Hide Canvas</button>
    </div>
  )
}
```
Returned Properties:

- `currentArtifact` - Currently displayed artifact
- `isVisible` - Canvas visibility state
- `showCanvas` - Function to show the canvas
- `hideCanvas` - Function to hide the canvas
- `updateArtifact` - Function to update an artifact
- `artifacts` - Array of all artifacts
Utility Hooks
useFile
Handle file uploads and processing.
```tsx
import { useFile } from '@llamaindex/chat-ui'

function FileUploadComponent() {
  const {
    uploadFile,
    uploadFiles,
    isUploading,
    uploadedFiles,
    removeFile,
    clearFiles,
  } = useFile({
    maxSize: 10 * 1024 * 1024, // 10MB
    accept: ['image/*', 'application/pdf'],
    multiple: true,
  })

  const handleFileSelect = async (files: FileList) => {
    await uploadFiles(Array.from(files))
  }

  return (
    <div>
      <input
        type="file"
        multiple
        onChange={e => handleFileSelect(e.target.files!)}
      />
      {isUploading && <p>Uploading...</p>}
      <div>
        {uploadedFiles.map((file, i) => (
          <div key={i}>
            <span>{file.name}</span>
            <button onClick={() => removeFile(i)}>Remove</button>
          </div>
        ))}
      </div>
      <button onClick={clearFiles}>Clear All</button>
    </div>
  )
}
```
Options:

- `maxSize` - Maximum file size in bytes
- `accept` - Array of accepted MIME types
- `multiple` - Allow multiple files
- `onUpload` - Upload completion callback
- `onError` - Error callback

Returned Properties:

- `uploadFile` - Upload a single file
- `uploadFiles` - Upload multiple files
- `isUploading` - Upload state
- `uploadedFiles` - Array of uploaded files
- `removeFile` - Remove a file by index
- `clearFiles` - Clear all files
useCopyToClipboard
Copy content to clipboard with feedback.
```tsx
import { useCopyToClipboard } from '@llamaindex/chat-ui'

function CopyButton({ content }) {
  const { copyToClipboard, isCopied } = useCopyToClipboard({
    timeout: 2000, // Reset after 2 seconds
  })

  const handleCopy = () => {
    copyToClipboard(content)
  }

  return (
    <button
      onClick={handleCopy}
      className={isCopied ? 'text-green-600' : 'text-gray-600'}
    >
      {isCopied ? 'Copied!' : 'Copy'}
    </button>
  )
}
```
Options:

- `timeout` - Time before resetting the copied state

Returned Properties:

- `copyToClipboard` - Function to copy text
- `isCopied` - Boolean indicating if recently copied
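The timeout behaviour can be sketched independently of the DOM clipboard API. The helper below is a hypothetical illustration (not the library's implementation); the clipboard write and the scheduler are injected so the reset logic is visible on its own:

```typescript
// Hypothetical sketch of the isCopied/timeout mechanics behind
// useCopyToClipboard. `write` stands in for navigator.clipboard.writeText,
// and `schedule` defaults to setTimeout but can be swapped out in tests.
function createCopyState(
  write: (text: string) => void,
  timeoutMs: number,
  schedule: (fn: () => void, ms: number) => void = (fn, ms) => {
    setTimeout(fn, ms)
  }
) {
  let copied = false
  return {
    copyToClipboard(text: string) {
      write(text)
      copied = true // the flag flips on immediately...
      schedule(() => {
        copied = false // ...and resets after the timeout
      }, timeoutMs)
    },
    get isCopied() {
      return copied
    },
  }
}
```

The real hook additionally re-renders the component when the flag changes; the sketch only shows the state transitions.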
Hook Composition
Custom Chat Hook
Create custom hooks by composing existing ones:
```tsx
import { useChatUI, useChatMessages } from '@llamaindex/chat-ui'

function useCustomChat() {
  const chatUI = useChatUI()
  const messages = useChatMessages()

  const sendMessageWithMetadata = async (content: string, metadata: any) => {
    await chatUI.sendMessage(
      {
        id: `user-${Date.now()}`,
        role: 'user',
        parts: [{ type: 'text', text: content }],
      },
      { data: metadata }
    )
  }

  const getLastAssistantMessage = () => {
    return chatUI.messages.filter(m => m.role === 'assistant').pop()
  }

  return {
    ...chatUI,
    ...messages,
    sendMessageWithMetadata,
    getLastAssistantMessage,
  }
}
```
Hook Context Boundaries
Context Availability
Different hooks are available in different contexts:
```tsx
function App() {
  return (
    <ChatSection handler={handler}>
      {/* useChatUI available here */}
      <ChatMessages>
        {/* useChatMessages available here */}
        <ChatMessages.List>
          {messages.map(message => (
            <ChatMessage key={message.id} message={message}>
              {/* useChatMessage available here */}
              <CustomMessageContent />
            </ChatMessage>
          ))}
        </ChatMessages.List>
      </ChatMessages>
      <ChatInput>
        {/* useChatInput available here */}
        <CustomInputField />
      </ChatInput>
    </ChatSection>
  )
}
```
Error Boundaries
Handle hook errors gracefully:
```tsx
function SafeHookUsage() {
  try {
    const { messages } = useChatUI()
    return <MessageList messages={messages} />
  } catch (error) {
    console.error('Hook error:', error)
    return <ErrorFallback />
  }
}
```
Next Steps
- Parts - Learn how hooks work with message parts
- Customization - Use hooks for custom styling and behavior
- Examples - See complete examples using hooks
- Core Components - Understand component-hook relationships
useWorkflow
Manages LlamaIndex workflows that are deployed on LlamaDeploy. Start new instances, send events, and handle streamed event responses. Key features:
- Client Management: Automatically creates and manages the LlamaDeploy client connection
- Run Lifecycle: Creates and manages workflow runs
- Event Streaming: Real-time streaming of workflow events
- State Management: Tracks workflow status (running, complete, error) and event history
- Error Handling: Provides callbacks for handling errors and stop events
To get started, you need a deployment with a workflow running. You can create one by following the LlamaDeploy documentation.
To create the hook, pass the `baseUrl` of the LlamaDeploy server plus the `deployment` and `workflow` names to the `useWorkflow` hook. Here's an example:
```tsx
import { useWorkflow } from '@llamaindex/chat-ui'

function WorkflowComponent() {
  const { runId, start, stop, sendEvent, events, status } = useWorkflow({
    baseUrl: 'http://localhost:8000',
    deployment: 'my-deployment',
    workflow: 'my-workflow',
    onStopEvent: event => {
      console.log('Workflow stopped:', event)
    },
    onError: error => {
      console.error('Workflow error:', error)
    },
  })

  const handleStartWorkflow = async () => {
    await start({ message: 'Hello workflow!' })
  }

  const handleSendCustomEvent = async () => {
    await sendEvent({
      type: 'workflow.ProgressEvent',
      data: { custom: 'data' },
    })
  }

  const handleStopWorkflow = async () => {
    await stop({ reason: 'User requested stop' })
  }

  return (
    <div>
      <p>Run ID: {runId}</p>
      <p>Status: {status}</p>
      <p>Events received: {events.length}</p>
      <button onClick={handleStartWorkflow} disabled={status === 'running'}>
        Start Workflow
      </button>
      <button onClick={handleSendCustomEvent} disabled={!runId}>
        Send Event
      </button>
      <button onClick={handleStopWorkflow} disabled={status !== 'running'}>
        Stop Workflow
      </button>
      <div>
        <h4>Events:</h4>
        {events.map((event, i) => (
          <div key={i}>
            <strong>{event.type}</strong>: {JSON.stringify(event.data)}
          </div>
        ))}
      </div>
    </div>
  )
}
```
Options:

- `baseUrl` - Base URL for the workflow API (optional)
- `deployment` - Name of the registered deployment
- `workflow` - The workflow to run; if not set, uses the default workflow (optional)
- `runId` - Optional run ID for resuming a previous workflow run
- `onStopEvent` - Callback when the workflow receives a stop event
- `onError` - Error callback for workflow errors
Returned Properties:

- `runId` - Current workflow run ID
- `sendEvent` - Function to send events of type `WorkflowEvent` to the workflow. Just pass the event object with the `type` and `data` fields as parameter.
- `start` - Function to start a new workflow run. Besides starting the run, this sends a `StartEvent`, so you pass the `data` object for the `StartEvent` as parameter.
- `stop` - Function to send a stop event to the current workflow. Besides stopping the run, this sends a `StopEvent`, so you pass the `data` object for the `StopEvent` as parameter.
- `events` - Array of all received workflow events of type `WorkflowEvent`
- `status` - Current workflow run status (`'idle' | 'running' | 'complete' | 'error'`)
Workflow Event Types:
A workflow event is identified by its `type` and can have optional `data` associated with it carrying any JSON-serializable value:

```ts
interface WorkflowEvent {
  type: string
  data?: JSONValue | undefined
}
```
The `type` field is the fully qualified name of the event type in Python.
There are a few built-in event types that you can use with the `type` field:

```ts
// Built-in event types
enum WorkflowEventType {
  StartEvent = 'llama_index.core.workflow.events.StartEvent',
  StopEvent = 'llama_index.core.workflow.events.StopEvent',
}
```
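To make these shapes concrete, here is a small self-contained helper (hypothetical, not part of chat-ui) that separates built-in lifecycle events from custom application events in a received stream, using the fully qualified type names above:

```typescript
// Local copies of the shapes above, so this snippet stands alone.
type JSONValue =
  | string
  | number
  | boolean
  | null
  | JSONValue[]
  | { [key: string]: JSONValue }

interface WorkflowEvent {
  type: string
  data?: JSONValue | undefined
}

// Same values as the WorkflowEventType enum above.
const BUILT_IN_EVENT_TYPES = new Set<string>([
  'llama_index.core.workflow.events.StartEvent',
  'llama_index.core.workflow.events.StopEvent',
])

// Hypothetical helper: split a stream into lifecycle and custom events.
function splitEvents(events: WorkflowEvent[]) {
  return {
    lifecycle: events.filter(e => BUILT_IN_EVENT_TYPES.has(e.type)),
    custom: events.filter(e => !BUILT_IN_EVENT_TYPES.has(e.type)),
  }
}
```

The same pattern is what the custom event handling example further below uses to pull `workflow.ProgressEvent` entries out of `workflow.events`.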
Advanced Usage:
Resume an existing workflow run:

```tsx
import { useEffect } from 'react'

import { useWorkflow } from '@llamaindex/chat-ui'

function ResumeWorkflowComponent() {
  const workflow = useWorkflow({
    deployment: 'my-deployment',
    runId: 'existing-run-id', // Resume the existing run
    onStopEvent: event => {
      console.log('Workflow completed:', event)
    },
  })

  // Events from the existing run are fetched automatically
  useEffect(() => {
    console.log('Restored events:', workflow.events)
  }, [workflow.events])

  return (
    <div>
      <p>Resumed run: {workflow.runId}</p>
      <p>Events: {workflow.events.length}</p>
    </div>
  )
}
```
Custom event handling:
```tsx
import { useWorkflow } from '@llamaindex/chat-ui'

// WorkflowEvent is the event shape shown above
interface CustomWorkflowEvent extends WorkflowEvent {
  type: 'workflow.ProgressEvent' | 'workflow.ResultEvent' | string
  data: {
    progress?: number
    result?: any
    message?: string
  }
}

function CustomWorkflowComponent() {
  const workflow = useWorkflow<CustomWorkflowEvent>({
    deployment: 'progress-deployment',
    workflow: 'progress-workflow',
    onStopEvent: event => {
      if (event.type === 'workflow.ResultEvent') {
        console.log('Final result:', event.data.result)
      }
    },
  })

  const progressEvents = workflow.events.filter(
    e => e.type === 'workflow.ProgressEvent'
  )
  const latestProgress =
    progressEvents[progressEvents.length - 1]?.data?.progress || 0

  return (
    <div>
      <p>Progress: {latestProgress}%</p>
      <progress value={latestProgress} max="100" />
    </div>
  )
}
```
useChatWorkflow
The `useChatWorkflow` hook is a specialized version of `useWorkflow` designed specifically for chat interfaces. It is used for building chat interfaces for workflows running on LlamaDeploy.
It returns a `ChatHandler`, so it works seamlessly with the chat-ui chat components. It is therefore also a drop-in replacement for the `useChat` hook from the Vercel AI SDK.
Here's an example of how to use it:
```tsx
import {
  ChatSection,
  ChatMessages,
  ChatMessage,
  ChatInput,
  useChatWorkflow,
} from '@llamaindex/chat-ui'

function WorkflowChatApp() {
  const handler = useChatWorkflow({
    baseUrl: 'http://localhost:4501',
    deployment: 'my-deployment',
    workflow: 'chat-workflow',
    onError: error => console.error(error),
  })

  return (
    <ChatSection handler={handler}>
      <ChatMessages>
        <ChatMessages.List>
          {handler.messages.map((message, index) => (
            <ChatMessage
              key={index}
              message={message}
              isLast={index === handler.messages.length - 1}
            >
              <ChatMessage.Avatar />
              <ChatMessage.Content>
                <ChatMessage.Part.Markdown />
                <ChatMessage.Part.Source />
                {/* Renderer for custom message parts */}
                <WeatherPart />
              </ChatMessage.Content>
              <ChatMessage.Actions />
            </ChatMessage>
          ))}
        </ChatMessages.List>
      </ChatMessages>
      <ChatInput />
    </ChatSection>
  )
}
```
Parameters:

- `baseUrl` - Base URL for the workflow API (optional)
- `deployment` - Name of the registered deployment
- `workflow` - The workflow to run; if not set, uses the default workflow (optional)
- `onError` - Error callback for workflow errors
Chat Workflows
To be used as a chat workflow, your workflow needs to follow the same calling convention as an Agent Workflow - that's why Agent Workflows work out of the box with the `useChatWorkflow` hook.
Your workflow is executed once for each chat request, with the following input parameters included in the workflow's `StartEvent`:

- `user_msg` [str]: The current user message
- `chat_history` [list[ChatMessage]]: All the previous messages of the conversation
Example:
```python
from llama_index.core.workflow import StartEvent, step


@step
def handle_start_event(ev: StartEvent) -> MyNextEvent:
    user_msg = ev.user_msg
    chat_history = ev.chat_history
    ...
```
Built-in Workflow Events
The hook automatically processes workflow events (using useWorkflow
) and renders them as parts in the chat interface, making it easy to build rich conversational experiences with LlamaDeploy workflows.
Your LlamaDeploy workflows can send three main types of events to enhance the chat experience:
1. SourceNodesEvent - Citations and References
Send source nodes to display citations and references for generated content:
```python
from llama_index.core.chat_ui.events import (
    SourceNodesEvent,
)
from llama_index.core.schema import NodeWithScore
from llama_index.core.data_structs import Node

ctx.write_event_to_stream(
    SourceNodesEvent(
        nodes=[
            NodeWithScore(
                node=Node(
                    text="sample node content",
                    metadata={"URL": "https://example.com/document.pdf"},
                ),
                score=0.8,
            ),
        ],
    )
)
```
Note: Your `ChatMessage.Content` component needs to have the `ChatMessage.Part.Source` component as a child to display the citations and references (as shown in the example above).
2. ArtifactEvent - Code and Artifacts
Send code snippets, documents, or other artifacts that can be displayed in a dedicated canvas:
```python
import time

from llama_index.core.chat_ui.events import (
    ArtifactEvent,
)

ctx.write_event_to_stream(
    ArtifactEvent(
        data={
            "type": "code",
            "created_at": int(time.time()),
            "data": {
                "language": "typescript",
                "file_name": "example.ts",
                "code": 'console.log("Hello, world!");',
            },
        }
    )
)
```
Note: Your `ChatMessage.Content` component needs to have the `ChatMessage.Part.Markdown` component as a child to display the artifacts inline in the markdown component (as shown in the example above).
3. UIEvent - Custom UI Components
Send custom data to render in specialized UI components. In your workflow code, you can send `UIEvent`s for this - for example, to render a weather part in the chat interface:
```python
from llama_index.core.chat_ui.events import (
    UIEvent,
)
from pydantic import BaseModel


class WeatherData(BaseModel):
    location: str
    temperature: float
    condition: str
    humidity: int
    windSpeed: int


weather_data = WeatherData(
    location="San Francisco, CA",
    temperature=22,
    condition="sunny",
    humidity=65,
    windSpeed=12,
)

ctx.write_event_to_stream(
    UIEvent(type="weather", data=weather_data)
)
```
To render this custom UI component, you need to add it as a child to your `ChatMessage.Content` component. The example above will render the `WeatherPart` component in the chat interface.
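Assuming the client follows the `data-<type>` naming shown in the usePart section (`data-artifact`, `data-sources`, ...), a `UIEvent` with `type="weather"` would arrive as a message part of type `data-weather`. A hypothetical sketch of that mapping, plus a client-side shape mirroring the Python `WeatherData` model:

```typescript
// Hypothetical mapping — assumes a UIEvent's type surfaces on the client
// as a "data-<type>" message part, following the convention of the
// built-in parts (data-artifact, data-event, data-sources, ...).
function uiEventToPartType(eventType: string): string {
  return `data-${eventType}`
}

// Client-side mirror of the WeatherData model from the Python example.
interface WeatherData {
  location: string
  temperature: number
  condition: string
  humidity: number
  windSpeed: number
}

// Small formatter a WeatherPart component might use when rendering.
function describeWeather(data: WeatherData): string {
  return `${data.condition} in ${data.location}, ${data.temperature}°C`
}
```

A `WeatherPart` component could then read the part via the generic form of `usePart` (as in the custom part example earlier) and render these fields.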