Prompts
Prompting is the fundamental input that gives LLMs their expressive power. LlamaIndex uses prompts to build the index, perform insertion, traverse during querying, and synthesize the final answer.
Users may also provide their own prompt templates to further customize the behavior of the framework. The best method for customizing is to copy the default prompt (linked above) and use it as the base for any modifications.
Usage Pattern
Currently, there are two ways to customize prompts in LlamaIndex.
For both methods, you will need to create a function that overrides the default prompt.
1. Customizing the default prompt on initialization
The first method is to create a new instance of the module whose prompt you would like to update (for example, a Response Synthesizer) using the getResponseSynthesizer function. Instead of passing the custom prompt to the deprecated responseBuilder parameter, call getResponseSynthesizer with the mode as the first argument and supply the new prompt via the options parameter.
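As a sketch of this pattern, the snippet below defines a custom text-QA prompt as a template function and shows (in comments) where it would be passed on initialization. The option name `textQATemplate` and the exact template shape vary across llamaindex versions, so treat this as an assumption-laden illustration rather than a drop-in snippet.

```typescript
// A custom text-QA prompt, written as a template function that receives the
// retrieved context and the user query and returns the final prompt string.
const customTextQaPrompt = ({ context = "", query = "" }) =>
  `Context information is below.
---------------------
${context}
---------------------
Answer the query using ONLY the context above, in the style of a pirate.
Query: ${query}
Answer:`;

// With the real library, the prompt would be supplied via the options
// parameter on initialization (option name is an assumption), e.g.:
//
// import { getResponseSynthesizer } from "llamaindex";
// const synthesizer = getResponseSynthesizer("compact", {
//   textQATemplate: customTextQaPrompt,
// });

console.log(
  customTextQaPrompt({
    context: "LlamaIndex is a data framework.",
    query: "What is LlamaIndex?",
  }),
);
```

The key point is that the mode (e.g. `"compact"`) is the first argument and the prompt override travels in the options object, rather than through the deprecated `responseBuilder` parameter.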
2. Customizing submodule prompts
The second method relies on the fact that most modules in LlamaIndex expose a getPrompts and an updatePrompt method, which allow you to override the default prompts. This method is useful when you want to change a prompt on the fly, or override it at a more granular level in submodules.
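To make the flow concrete, here is a minimal stand-in class (not the real llamaindex module) that mimics the getPrompts / updatePrompt pattern described above: getPrompts returns the current prompt dictionary keyed by prompt name, and updatePrompt merges in overrides at runtime. All names here are illustrative assumptions.

```typescript
// Illustrative stand-in for a module that supports prompt inspection and
// on-the-fly overrides via getPrompts / updatePrompt.
type PromptFn = (vars: Record<string, string>) => string;

class ExampleSynthesizer {
  private prompts: Record<string, PromptFn> = {
    // Default prompt (hypothetical key name).
    textQATemplate: ({ context, query }) =>
      `Context: ${context}\nQuery: ${query}\nAnswer:`,
  };

  // Return the current prompt dictionary, keyed by prompt name.
  getPrompts(): Record<string, PromptFn> {
    return { ...this.prompts };
  }

  // Override one or more prompts at runtime.
  updatePrompt(updates: Record<string, PromptFn>): void {
    Object.assign(this.prompts, updates);
  }
}

const synth = new ExampleSynthesizer();

// Swap in a custom prompt without re-creating the module.
synth.updatePrompt({
  textQATemplate: ({ context, query }) =>
    `Using ONLY this context:\n${context}\nanswer concisely: ${query}`,
});

console.log(synth.getPrompts()["textQATemplate"]({ context: "C", query: "Q" }));
```

Because the override happens on an existing instance, this approach also works for prompts buried in submodules, where re-initializing the whole pipeline would be inconvenient.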