# Vercel

Integrate LlamaIndex with Vercel's AI SDK
LlamaIndex provides integration with Vercel's AI SDK, allowing you to create powerful search and retrieval applications. You can:
- Use any of Vercel AI's model providers as LLMs in LlamaIndex
- Use indexes (e.g. VectorStoreIndex, LlamaCloudIndex) from LlamaIndexTS in your Vercel AI applications
## Setup
First, install the required dependencies:
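A typical setup installs the core LlamaIndexTS package alongside the Vercel adapter and the AI SDK (the exact package names below follow LlamaIndexTS's scoped-package convention and should be checked against your version):

```shell
npm install llamaindex @llamaindex/vercel ai
```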
## Using Vercel AI's Model Providers
Using the `VercelLLM` adapter, you can use any of Vercel AI's model providers as an LLM in LlamaIndex. Here's an example using OpenAI's GPT-4o model:
## Use Indexes

### Using VectorStoreIndex
Here's how to create a simple vector store index and query it using Vercel's AI SDK:
Note: the Vercel AI model passed to the `llamaindex` function is used by the response synthesizer to generate a response for the tool call.
### Using LlamaCloud
For production deployments, you can use LlamaCloud to store and manage your documents:
## Next Steps
- Explore LlamaCloud for managed document storage and retrieval
- Join our Discord community for support and discussions