Class: CondenseQuestionChatEngine
CondenseQuestionChatEngine is used in conjunction with an Index (for example, VectorStoreIndex). For each incoming chat message it performs two steps: first, it condenses the message together with the previous chat history into a standalone question with more context; then, it queries the underlying Index with that question and returns the response. CondenseQuestionChatEngine performs well when the input consists primarily of questions about the underlying data. It performs less well when the chat messages are not questions about the data, or refer heavily to previous context.
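The two steps above can be sketched in plain TypeScript. Everything here — the message type, the condense wording, and the query-engine stand-in — is a hypothetical simplification for illustration, not the real llamaindex implementation:

```typescript
// A minimal sketch of the condense-then-query flow, using hypothetical
// stand-ins for the LLM condense step and the query engine.

type ChatMessage = { role: "user" | "assistant"; content: string };

// Step 1: condense the new message plus the chat history into a
// standalone question. A real implementation calls an LLM with a
// condense prompt; here we simply inline the history.
function condenseQuestion(history: ChatMessage[], message: string): string {
  if (history.length === 0) return message;
  const context = history.map((m) => `${m.role}: ${m.content}`).join("\n");
  return `Given the conversation:\n${context}\nStandalone question: ${message}`;
}

// Step 2: query the underlying index with the condensed question.
function queryIndex(question: string): string {
  return `answer to: ${question}`;
}

// The chat engine wires the two steps together and records the turn.
function chat(history: ChatMessage[], message: string): string {
  const response = queryIndex(condenseQuestion(history, message));
  history.push({ role: "user", content: message });
  history.push({ role: "assistant", content: response });
  return response;
}
```

Because the condensed question inlines earlier turns, a follow-up like "How is it built?" reaches the index with the context of the first question attached.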
Constructors
new CondenseQuestionChatEngine()
new CondenseQuestionChatEngine(init): CondenseQuestionChatEngine
Parameters
• init
• init.chatHistory: ChatMessage[]
• init.condenseMessagePrompt?
• init.queryEngine: QueryEngine
• init.serviceContext?: ServiceContext
Returns

CondenseQuestionChatEngine
Defined in
packages/llamaindex/src/engines/chat/CondenseQuestionChatEngine.ts:44
Properties
chatHistory
chatHistory: ChatHistory<object>
Defined in
packages/llamaindex/src/engines/chat/CondenseQuestionChatEngine.ts:40
condenseMessagePrompt()
condenseMessagePrompt: (__namedParameters) => string
Parameters
• __namedParameters
• __namedParameters.chatHistory: undefined | string = ""
• __namedParameters.question: undefined | string = ""
Returns
string
Defined in
packages/llamaindex/src/engines/chat/CondenseQuestionChatEngine.ts:42
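As the signature above shows, this property is a template function whose named parameters `chatHistory` and `question` both default to `""`. A custom template of the same shape might look like this (the wording is illustrative, not the library's default prompt):

```typescript
// A prompt template matching the shape of the condenseMessagePrompt
// property: named parameters chatHistory and question, both defaulting
// to "", returning a string. The prompt text itself is hypothetical.
const condenseMessagePrompt = ({
  chatHistory = "",
  question = "",
}: {
  chatHistory?: string;
  question?: string;
}): string =>
  `Given the following conversation between a user and an assistant:\n` +
  `${chatHistory}\n` +
  `Rewrite the follow-up message as a standalone question: ${question}`;
```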
llm
llm: LLM<object, object>
Defined in
packages/llamaindex/src/engines/chat/CondenseQuestionChatEngine.ts:41
queryEngine
queryEngine: QueryEngine
Defined in
packages/llamaindex/src/engines/chat/CondenseQuestionChatEngine.ts:39
Methods
_getPromptModules()
protected _getPromptModules(): Record<string, any>
Returns
Record<string, any>
Defined in
packages/llamaindex/src/prompts/Mixin.ts:83
_getPrompts()
protected _getPrompts(): object
Returns
object
condenseMessagePrompt()
condenseMessagePrompt: (__namedParameters) => string
Parameters
• __namedParameters
• __namedParameters.chatHistory: undefined | string = ""
• __namedParameters.question: undefined | string = ""
Returns
string
Defined in
packages/llamaindex/src/engines/chat/CondenseQuestionChatEngine.ts:59
_updatePrompts()
protected _updatePrompts(promptsDict): void
Parameters
• promptsDict
• promptsDict.condenseMessagePrompt
Returns
void
Defined in
packages/llamaindex/src/engines/chat/CondenseQuestionChatEngine.ts:65
chat()
chat(params)
chat(params): Promise<AsyncIterable<EngineResponse>>
Sends the message, along with the class's current chat history, to the LLM.
Parameters
• params: ChatEngineParamsStreaming
Returns
Promise<AsyncIterable<EngineResponse>>
Defined in
packages/llamaindex/src/engines/chat/CondenseQuestionChatEngine.ts:86
chat(params)
chat(params): Promise<EngineResponse>
Sends the message, along with the class's current chat history, to the LLM.
Parameters
• params: ChatEngineParamsNonStreaming
Returns
Promise<EngineResponse>
Defined in
packages/llamaindex/src/engines/chat/CondenseQuestionChatEngine.ts:89
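The relationship between the two overloads can be sketched with a hypothetical EngineResponse stand-in: a streaming call resolves to an AsyncIterable of chunks, while a non-streaming call resolves to a single response. This mirrors only the shape of the API, not the real llamaindex types:

```typescript
// Hypothetical stand-in for the library's EngineResponse.
type EngineResponse = { message: string };

// Yields the answer word by word, as a stand-in for token streaming.
async function* streamWords(text: string): AsyncIterable<EngineResponse> {
  for (const word of text.split(" ")) yield { message: word };
}

// Overload 1: streaming — resolves to an AsyncIterable of chunks.
function chat(params: { message: string; stream: true }): Promise<AsyncIterable<EngineResponse>>;
// Overload 2: non-streaming — resolves to a single response.
function chat(params: { message: string; stream?: false }): Promise<EngineResponse>;
async function chat(params: {
  message: string;
  stream?: boolean;
}): Promise<EngineResponse | AsyncIterable<EngineResponse>> {
  const answer = `echo ${params.message}`;
  return params.stream ? streamWords(answer) : { message: answer };
}
```

A caller consumes the streaming variant with `for await`, and the non-streaming variant with a plain `await`.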
getPrompts()
getPrompts(): PromptsDict
Returns all prompts from the mixin and its modules
Returns
PromptsDict
Defined in
packages/llamaindex/src/prompts/Mixin.ts:27
reset()
reset(): void
Resets the chat history so that it's empty.
Returns
void
Defined in
packages/llamaindex/src/engines/chat/CondenseQuestionChatEngine.ts:130
updatePrompts()
updatePrompts(promptsDict): void
Updates the prompts in the mixin and its modules
Parameters
• promptsDict: PromptsDict
Returns
void
Defined in
packages/llamaindex/src/prompts/Mixin.ts:48
validatePrompts()
validatePrompts(promptsDict, moduleDict): void
Validates the prompt keys and module keys
Parameters
• promptsDict: PromptsDict
• moduleDict: ModuleDict
Returns
void
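The prompt-management methods above follow a common mixin contract: updatePrompts validates keys before applying them, so unknown prompt names are rejected. A simplified, hypothetical sketch of that contract (it compresses the real PromptsDict/ModuleDict machinery into one class):

```typescript
// Hypothetical, simplified prompt dictionary: name -> template function.
type PromptsDict = Record<string, (args: Record<string, string>) => string>;

class PromptMixinSketch {
  private prompts: PromptsDict = {
    condenseMessagePrompt: ({ question = "" }) => `Standalone question: ${question}`,
  };

  // Returns all prompts currently registered.
  getPrompts(): PromptsDict {
    return { ...this.prompts };
  }

  // Validates that every key refers to a known prompt.
  validatePrompts(promptsDict: PromptsDict): void {
    for (const key of Object.keys(promptsDict)) {
      if (!(key in this.prompts)) {
        throw new Error(`Invalid prompt key: ${key}`);
      }
    }
  }

  // Updates prompts only after validation succeeds.
  updatePrompts(promptsDict: PromptsDict): void {
    this.validatePrompts(promptsDict);
    Object.assign(this.prompts, promptsDict);
  }
}
```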