CondenseQuestionChatEngine
Defined in: llamaindex/src/engines/chat/CondenseQuestionChatEngine.ts:33
CondenseQuestionChatEngine is used in conjunction with an Index (for example, VectorStoreIndex). When it receives a user's chat message, it performs two steps: first, it condenses the message together with the previous chat history into a standalone question with more context. Then, it queries the underlying Index using this new question and returns the response. CondenseQuestionChatEngine performs well when the input is primarily questions about the underlying data. It performs less well when the chat messages are not questions about the data, or refer heavily to previous context.
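The two-step flow can be sketched with stand-in components (the stub condenseQuestion and queryIndex below are illustrative only, not part of llamaindex):

```typescript
// Minimal sketch of the condense-then-query flow, with stub components.
type Message = { role: "user" | "assistant"; content: string };

// Stand-in for the LLM call that condenses history + message into a question.
function condenseQuestion(history: Message[], message: string): string {
  if (history.length === 0) return message;
  const context = history.map((m) => `${m.role}: ${m.content}`).join("\n");
  return `Given the conversation:\n${context}\nStandalone question: ${message}`;
}

// Stand-in for querying the underlying Index with the condensed question.
function queryIndex(question: string): string {
  return `Answer for: ${question}`;
}

function chat(history: Message[], message: string): string {
  // Step 1: condense the chat message with the history into a question.
  const condensed = condenseQuestion(history, message);
  // Step 2: query the underlying Index and record the exchange.
  const response = queryIndex(condensed);
  history.push({ role: "user", content: message });
  history.push({ role: "assistant", content: response });
  return response;
}
```

Because every turn is rewritten into a standalone question before it reaches the Index, a follow-up like "Why?" only works to the extent the condensing step can resolve it from the history.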
Extends
BaseChatEngine
Constructors
new CondenseQuestionChatEngine()
new CondenseQuestionChatEngine(init): CondenseQuestionChatEngine
Defined in: llamaindex/src/engines/chat/CondenseQuestionChatEngine.ts:43
Parameters
init
queryEngine: BaseQueryEngine
chatHistory: ChatMessage[]
condenseMessagePrompt?: CondenseQuestionPrompt
Returns
CondenseQuestionChatEngine
Overrides
BaseChatEngine.constructor
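The shape of the init object can be sketched as follows; the local interfaces below only mimic the real BaseQueryEngine, ChatMessage, and CondenseQuestionPrompt types for illustration:

```typescript
// Local stand-ins for the real llamaindex types, for illustration only.
interface ChatMessage { role: string; content: string }
interface BaseQueryEngine { query(params: { query: string }): Promise<string> }
type CondenseQuestionPrompt = (ctx: { chatHistory: string; question: string }) => string;

// The init object passed to the constructor: a query engine and starting
// chat history are required; the condense prompt is optional.
interface CondenseQuestionChatEngineInit {
  queryEngine: BaseQueryEngine;
  chatHistory: ChatMessage[];
  condenseMessagePrompt?: CondenseQuestionPrompt;
}

const init: CondenseQuestionChatEngineInit = {
  queryEngine: { query: async ({ query }) => `Answer for: ${query}` },
  chatHistory: [],
  // Optional: override the default prompt used to condense the question.
  condenseMessagePrompt: ({ chatHistory, question }) =>
    `Chat history:\n${chatHistory}\nStandalone question: ${question}`,
};
```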
Properties
queryEngine
queryEngine: BaseQueryEngine
Defined in: llamaindex/src/engines/chat/CondenseQuestionChatEngine.ts:34
memory
memory: BaseMemory<object>
Defined in: llamaindex/src/engines/chat/CondenseQuestionChatEngine.ts:35
llm
llm: LLM<object, object>
Defined in: llamaindex/src/engines/chat/CondenseQuestionChatEngine.ts:36
condenseMessagePrompt
condenseMessagePrompt: CondenseQuestionPrompt
Defined in: llamaindex/src/engines/chat/CondenseQuestionChatEngine.ts:37
Accessors
chatHistory
Get Signature
get chatHistory(): ChatMessage<object>[] | Promise<ChatMessage<object>[]>
Defined in: llamaindex/src/engines/chat/CondenseQuestionChatEngine.ts:39
Returns
ChatMessage<object>[] | Promise<ChatMessage<object>[]>
Overrides
BaseChatEngine.chatHistory
Methods
_getPromptModules()
protected _getPromptModules(): ModuleRecord
Defined in: llamaindex/src/engines/chat/CondenseQuestionChatEngine.ts:59
Returns
ModuleRecord
_getPrompts()
protected _getPrompts(): object
Defined in: llamaindex/src/engines/chat/CondenseQuestionChatEngine.ts:63
Returns
object
condenseMessagePrompt
condenseMessagePrompt: CondenseQuestionPrompt
_updatePrompts()
protected _updatePrompts(promptsDict): void
Defined in: llamaindex/src/engines/chat/CondenseQuestionChatEngine.ts:69
Parameters
promptsDict
condenseMessagePrompt: CondenseQuestionPrompt
Returns
void
chat()
Call Signature
chat(params): Promise<EngineResponse>
Defined in: llamaindex/src/engines/chat/CondenseQuestionChatEngine.ts:88
Parameters
params
NonStreamingChatEngineParams<object, object>
Returns
Promise<EngineResponse>
Overrides
BaseChatEngine.chat
Call Signature
chat(params): Promise<AsyncIterable<EngineResponse, any, any>>
Defined in: llamaindex/src/engines/chat/CondenseQuestionChatEngine.ts:89
Parameters
params
StreamingChatEngineParams<object, object>
Returns
Promise<AsyncIterable<EngineResponse, any, any>>
Overrides
BaseChatEngine.chat
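The two call signatures follow the common TypeScript overload pattern for streaming APIs: one overload resolves to a single complete response, the other to an async iterable of response chunks. A self-contained sketch of that pattern, with a plain string standing in for EngineResponse:

```typescript
// Illustrative overload pattern; a plain string stands in for EngineResponse.
type Params = { message: string };

function chat(params: Params & { stream: true }): Promise<AsyncIterable<string>>;
function chat(params: Params): Promise<string>;
async function chat(
  params: Params & { stream?: boolean },
): Promise<string | AsyncIterable<string>> {
  const full = `Answer for: ${params.message}`;
  if (!params.stream) return full; // non-streaming: one complete response
  // Streaming: yield the response in chunks via an async generator.
  async function* chunks() {
    for (const word of full.split(" ")) yield word + " ";
  }
  return chunks();
}
```

With overloads, callers get a precise return type without a manual type assertion: the compiler picks the streaming signature only when the argument carries the stream flag.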
reset()
reset(): void
Defined in: llamaindex/src/engines/chat/CondenseQuestionChatEngine.ts:137
Returns
void