LLMQuestionGenerator
Defined in: llamaindex/src/QuestionGenerator.ts:22
LLMQuestionGenerator uses the LLM to generate sub-questions from a user query and a set of tools, so each sub-question can be answered by the tool best suited to it.
Extends
PromptMixin
Implements
BaseQuestionGenerator
Constructors
new LLMQuestionGenerator()
new LLMQuestionGenerator(init?): LLMQuestionGenerator
Defined in: llamaindex/src/QuestionGenerator.ts:30
Parameters
init?
Partial<LLMQuestionGenerator>
Returns
LLMQuestionGenerator
Overrides
PromptMixin.constructor
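Because the constructor takes a `Partial<LLMQuestionGenerator>`, you can override any subset of `llm`, `prompt`, and `outputParser` while the rest fall back to defaults. The snippet below is an illustrative sketch of that partial-init pattern, not the actual llamaindex implementation; `ConfigurableGenerator` and its fields are made-up stand-ins.

```typescript
// Illustrative sketch of the Partial<T> constructor pattern (hypothetical class).
class ConfigurableGenerator {
  model = "default-model"; // default, analogous to falling back to Settings.llm
  temperature = 0;

  constructor(init?: Partial<ConfigurableGenerator>) {
    // Only the fields present in `init` override the defaults;
    // Object.assign ignores an undefined source.
    Object.assign(this, init);
  }
}

const gen = new ConfigurableGenerator({ temperature: 0.7 });
console.log(gen.model, gen.temperature); // prints: default-model 0.7
```

The same call shape applies to the real class: `new LLMQuestionGenerator({ llm })` would override only the LLM and keep the default prompt and output parser.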
Properties
llm
llm: LLM<object, object>
Defined in: llamaindex/src/QuestionGenerator.ts:26
prompt
prompt: SubQuestionPrompt
Defined in: llamaindex/src/QuestionGenerator.ts:27
outputParser
outputParser: BaseOutputParser<StructuredOutput<SubQuestion[]>>
Defined in: llamaindex/src/QuestionGenerator.ts:28
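The output parser turns the LLM's raw text reply into a typed `SubQuestion[]`. The sketch below shows, under assumptions, what such a parser might do; the `SubQuestion` and `StructuredOutput` shapes mirror the llamaindex types, but `parseSubQuestions` and its JSON-extraction logic are illustrative, not the library's actual parser.

```typescript
// Hypothetical sketch of a sub-question output parser.
interface SubQuestion {
  subQuestion: string;
  toolName: string;
}

interface StructuredOutput<T> {
  rawOutput: string;
  parsedOutput: T;
}

// Extract a JSON array from the raw LLM text (which may wrap it in
// markdown fences or prose) and validate each entry.
function parseSubQuestions(raw: string): StructuredOutput<SubQuestion[]> {
  const match = raw.match(/\[[\s\S]*\]/);
  if (!match) throw new Error("No JSON array found in LLM output");
  const parsed = JSON.parse(match[0]) as SubQuestion[];
  for (const q of parsed) {
    if (typeof q.subQuestion !== "string" || typeof q.toolName !== "string") {
      throw new Error("Malformed sub-question entry");
    }
  }
  return { rawOutput: raw, parsedOutput: parsed };
}
```

Validating each entry up front means a malformed LLM reply fails loudly at parse time rather than surfacing later as a missing tool name.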
Methods
_getPrompts()
protected _getPrompts(): object
Defined in: llamaindex/src/QuestionGenerator.ts:38
Returns
object
Overrides
PromptMixin._getPrompts
_updatePrompts()
protected _updatePrompts(promptsDict): void
Defined in: llamaindex/src/QuestionGenerator.ts:44
Parameters
promptsDict
subQuestion
SubQuestionPrompt
Returns
void
Overrides
PromptMixin._updatePrompts
generate()
generate(tools, query): Promise<SubQuestion[]>
Defined in: llamaindex/src/QuestionGenerator.ts:52
Parameters
tools
ToolMetadata
[]
query
QueryType
Returns
Promise<SubQuestion[]>
Implementation of
BaseQuestionGenerator.generate
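Conceptually, `generate()` formats the sub-question prompt from the tool metadata and the query, sends it to the LLM, and runs the reply through the output parser. The following is a hedged, self-contained sketch of that flow with a stubbed LLM; `generateSubQuestions`, `CompleteFn`, and the prompt wording are assumptions for illustration, not the library's internals.

```typescript
// Illustrative sketch of the generate() flow (hypothetical names).
interface ToolMetadata { name: string; description: string; }
interface SubQuestion { subQuestion: string; toolName: string; }

type CompleteFn = (prompt: string) => Promise<string>;

async function generateSubQuestions(
  complete: CompleteFn, // stand-in for llm.complete
  tools: ToolMetadata[],
  query: string,
): Promise<SubQuestion[]> {
  // 1. Build a prompt listing the tools and the user query.
  const toolsStr = tools.map((t) => `- ${t.name}: ${t.description}`).join("\n");
  const prompt =
    `Given the tools:\n${toolsStr}\n` +
    `Break the question "${query}" into sub-questions, one per relevant tool.\n` +
    `Answer as a JSON array of {"subQuestion": ..., "toolName": ...}.`;
  // 2. Ask the LLM, then 3. parse the structured reply.
  const raw = await complete(prompt);
  return JSON.parse(raw.match(/\[[\s\S]*\]/)![0]) as SubQuestion[];
}

// A stubbed LLM for demonstration; a real one would call a model API.
const stubLLM: CompleteFn = async () =>
  '[{"subQuestion": "What is the revenue of Uber?", "toolName": "uber10k"}]';

generateSubQuestions(
  stubLLM,
  [{ name: "uber10k", description: "Uber 10-K filings" }],
  "Compare Uber revenue",
).then((qs) => console.log(qs));
```

In the real class, the prompt template, LLM, and parser are the `prompt`, `llm`, and `outputParser` members documented above, so each stage can be swapped independently.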
_getPromptModules()
protected _getPromptModules(): ModuleRecord
Defined in: llamaindex/src/QuestionGenerator.ts:72
Return a dictionary of sub-modules within the current module that also implement PromptMixin, so that their prompts can be retrieved and set as well.
May be empty if there are no sub-modules.
Returns
ModuleRecord
Overrides
PromptMixin._getPromptModules