Using other LLM APIs
By default LlamaIndex.TS uses OpenAI's LLMs and embedding models, but we support many other LLMs, including models from Mistral (Mistral, Mixtral), Anthropic (Claude), and Google (Gemini).
If you don't want to use an API at all you can run a local model.
This example walks you through the process of setting up a Mistral model:
Installation
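Provider integrations ship as separate packages. The exact package name depends on your LlamaIndex.TS version; a typical setup (assumed here) installs the core package alongside the Mistral integration:

```shell
npm install llamaindex @llamaindex/mistral
```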
Using another LLM
You can specify which LLM LlamaIndex.TS will use on the Settings object, like this:
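A minimal sketch, assuming the Mistral integration exports a `MistralAI` class from `@llamaindex/mistral` (the package layout may differ across versions):

```typescript
import { Settings } from "llamaindex";
import { MistralAI } from "@llamaindex/mistral"; // assumed provider package

// Every index and query engine created after this point uses Mistral as the LLM.
Settings.llm = new MistralAI({
  model: "mistral-tiny",
  apiKey: process.env.MISTRAL_API_KEY, // read from the environment; never hard-code keys
});
```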
You can see examples of other APIs we support by checking out "Available LLMs" in the sidebar of our LLMs section.
Using another embedding model
A frequent gotcha when switching to a different LLM API is that LlamaIndex will still index and embed your data with OpenAI's embeddings by default. To switch away from OpenAI completely, you need to set your embedding model as well, for example:
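A sketch of swapping the embedding model, assuming the Mistral package exposes a `MistralAIEmbedding` class (an assumption; check your installed version's exports):

```typescript
import { Settings } from "llamaindex";
import { MistralAIEmbedding } from "@llamaindex/mistral"; // assumed provider package

// Replace the default OpenAI embedding model so indexing
// and retrieval no longer call the OpenAI API.
Settings.embedModel = new MistralAIEmbedding();
```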
We support many different embeddings.
Full example
This example uses Mistral's mistral-tiny model as the LLM, and Mistral for embeddings as well.
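Putting the pieces together, a minimal end-to-end sketch (the `MistralAI`/`MistralAIEmbedding` class names, the document text, and the response shape are assumptions for illustration):

```typescript
import { Document, Settings, VectorStoreIndex } from "llamaindex";
import { MistralAI, MistralAIEmbedding } from "@llamaindex/mistral"; // assumed provider package

// Route both completion and embedding calls to Mistral.
Settings.llm = new MistralAI({ model: "mistral-tiny" });
Settings.embedModel = new MistralAIEmbedding();

async function main() {
  // Index a single in-memory document, then query it.
  const document = new Document({ text: "The capital of France is Paris." });
  const index = await VectorStoreIndex.fromDocuments([document]);

  const queryEngine = index.asQueryEngine();
  const response = await queryEngine.query({
    query: "What is the capital of France?",
  });
  console.log(response.toString());
}

main().catch(console.error);
```

Because both `Settings.llm` and `Settings.embedModel` are set before the index is built, no OpenAI credentials are needed anywhere in this flow.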