Portkey LLM
Installation
Install the LlamaIndex core package and the Portkey integration with your preferred package manager:
npm i llamaindex @llamaindex/portkey-ai
pnpm add llamaindex @llamaindex/portkey-ai
yarn add llamaindex @llamaindex/portkey-ai
bun add llamaindex @llamaindex/portkey-ai
Usage
Configure Portkey as the global LLM through the Settings object:
import { Portkey } from "@llamaindex/portkey-ai";
import { Settings } from "llamaindex";
Settings.llm = new Portkey({
  apiKey: "<YOUR_API_KEY>",
});
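To confirm the configuration, you can call the LLM directly. This is a minimal sketch that assumes the standard complete method exposed by LlamaIndex.TS LLMs:
// Quick sanity check that the Portkey-backed LLM responds
const completion = await Settings.llm.complete({ prompt: "Say hello" });
console.log(completion.text);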
Load and index documents
For this example, we will use a single document. In a real-world scenario, you would have multiple documents to index.
import { Document, VectorStoreIndex } from "llamaindex";

// "essay" is a string containing the text you want to index
const document = new Document({ text: essay, id_: "essay" });
const index = await VectorStoreIndex.fromDocuments([document]);
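If your text lives on disk, one way to populate essay (assuming a local essay.txt file) is:
import fs from "node:fs/promises";

// Read the raw text that will back the Document (the file name here is an assumption)
const essay = await fs.readFile("essay.txt", "utf-8");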
Query
With the index in place, create a query engine and ask a question in natural language:
const queryEngine = index.asQueryEngine();
const query = "What is the meaning of life?";
const results = await queryEngine.query({
  query,
});
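The query engine returns a response object; as in the full example below, the generated answer is available on its response field:
// Print the synthesized answer
console.log(results.response);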
Full Example
import { Portkey } from "@llamaindex/portkey-ai";
import { Document, Settings, VectorStoreIndex } from "llamaindex";
import fs from "node:fs/promises";

// Use the Portkey LLM
Settings.llm = new Portkey({
  apiKey: "<YOUR_API_KEY>",
});

async function main() {
  // Load the source text (a local "essay.txt" file is assumed here)
  const essay = await fs.readFile("essay.txt", "utf-8");

  // Create a document
  const document = new Document({ text: essay, id_: "essay" });

  // Load and index documents
  const index = await VectorStoreIndex.fromDocuments([document]);

  // Get a retriever over the index
  const retriever = index.asRetriever();

  // Create a query engine backed by that retriever
  const queryEngine = index.asQueryEngine({
    retriever,
  });

  const query = "What is the meaning of life?";

  // Query
  const response = await queryEngine.query({
    query,
  });

  // Log the response
  console.log(response.response);
}

main().catch(console.error);
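To try the example end to end, save it as a TypeScript file (the file name below is only a placeholder) and run it with a TypeScript runner such as tsx:
npx tsx portkey.ts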
API Reference