With Next.js

In this guide, you'll learn how to use LlamaIndex with Next.js.

Before you start, make sure you have tried LlamaIndex.TS in Node.js so that you understand the basics.

Getting Started with LlamaIndex.TS in Node.js

Differences between Node.js and Next.js

Next.js is a React framework with both server-side and client-side rendering. This means you need to be careful when using LlamaIndex.TS in Next.js: never leak sensitive data such as API keys to the client side.
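
For example, keep every LlamaIndex call inside a server-only Route Handler, so the key is read from server-side environment variables and never shipped to the browser. The sketch below is a minimal example, assuming the @llamaindex/openai provider and an OPENAI_API_KEY environment variable; adapt the route path and model setup to your project.

// app/api/chat/route.ts (server-only Route Handler)
import { OpenAI } from "@llamaindex/openai";

export async function POST(request: Request) {
  const { message } = await request.json();
  // The key is read from a server-side env var; never expose it via NEXT_PUBLIC_*
  const llm = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });
  const response = await llm.complete({ prompt: message });
  return Response.json({ text: response.text });
}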

Next.js also distinguishes between build time and runtime. Some computations can be moved to build time for better performance; for example, document embeddings can be computed when the site is built rather than on every request (see the sketch below). While the llamaindex package itself works with Next.js, some provider packages such as @llamaindex/huggingface do not work well with it, due to the upstream dependencies those provider packages use.
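
As a rough sketch, build-time embedding can be done in a small script that runs before next build and persists the index to disk, so the runtime only loads it. The file path, persistDir, and sample document below are assumptions, and an embedding provider (for example OpenAI via OPENAI_API_KEY) is assumed to be configured.

// scripts/build-index.ts, run before `next build` (e.g. a "prebuild" script in package.json)
import { Document, VectorStoreIndex, storageContextFromDefaults } from "llamaindex";

async function main() {
  // Persist the index so the app only loads it at runtime instead of re-embedding
  const storageContext = await storageContextFromDefaults({ persistDir: "./.index" });
  const documents = [new Document({ text: "Embedded once, at build time." })];
  await VectorStoreIndex.fromDocuments(documents, { storageContext });
}

main().catch(console.error);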

Use withLlamaIndex to ensure that LlamaIndex.TS works well with Next.js.

// next.config.mjs / next.config.ts
import withLlamaIndex from "llamaindex/next";
 
/** @type {import('next').NextConfig} */
const nextConfig = {};
 
export default withLlamaIndex(nextConfig);

If you run into any dependency issues, you are welcome to open an issue on GitHub.

Edge Runtime

The Vercel Edge Runtime supports only a subset of the Node.js APIs. Similar to Cloudflare Workers, it is a serverless platform that runs your code on the edge.

Because not all Node.js features are available in the Vercel Edge Runtime, not all of LlamaIndex.TS works there either. We are working on improving compatibility across JavaScript runtimes.
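
If you do target the Edge Runtime, you opt a route into it with the runtime segment config, as sketched below. The route path is an assumption; whether a given LlamaIndex.TS feature works there depends on which Node.js APIs it relies on.

// app/api/health/route.ts: opt this route into the Edge Runtime
export const runtime = "edge";

export async function GET() {
  // Keep edge routes to features that do not rely on unsupported Node.js APIs
  return Response.json({ ok: true });
}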
