# ToolCallLLM

Defined in: packages/core/src/llms/base.ts:71

Unified language model interface

## Extends

- `BaseLLM`<`AdditionalChatOptions`, `AdditionalMessageOptions`>

## Extended by
## Type Parameters

### AdditionalChatOptions

`AdditionalChatOptions` extends `object` = `object`

### AdditionalMessageOptions

`AdditionalMessageOptions` extends `ToolCallLLMMessageOptions` = `ToolCallLLMMessageOptions`
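To illustrate how the two type parameters flow through the API, here is a minimal, self-contained sketch. The interfaces below (`ToolCallLLMMessageOptions`, `ChatMessage`, and the hypothetical `MyMessageOptions`) are simplified local stand-ins, not the real `@llamaindex/core` definitions:

```typescript
// Simplified stand-in for the real ToolCallLLMMessageOptions type.
interface ToolCallLLMMessageOptions {
  toolCall?: { name: string; input: string }[];
}

// A provider could extend the default with extra per-message metadata.
interface MyMessageOptions extends ToolCallLLMMessageOptions {
  cacheKey?: string;
}

// Messages are parameterized by the options type, so custom fields
// survive the round trip through chat() with full type safety.
interface ChatMessage<Options extends ToolCallLLMMessageOptions> {
  role: "user" | "assistant";
  content: string;
  options?: Options;
}

const msg: ChatMessage<MyMessageOptions> = {
  role: "user",
  content: "What is 2 + 2?",
  options: { cacheKey: "math-1" },
};
```

When the parameters are omitted, both default to their constraint (`object` and `ToolCallLLMMessageOptions`), so `ToolCallLLM` works out of the box without custom option types.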
## Constructors

### Constructor

```ts
new ToolCallLLM<AdditionalChatOptions, AdditionalMessageOptions>(): ToolCallLLM<AdditionalChatOptions, AdditionalMessageOptions>
```

#### Returns

`ToolCallLLM`<`AdditionalChatOptions`, `AdditionalMessageOptions`>

#### Inherited from

`BaseLLM.constructor`
## Properties

### metadata

```ts
abstract metadata: LLMMetadata;
```

Defined in: packages/core/src/llms/base.ts:20

#### Inherited from

`BaseLLM.metadata`

### supportToolCall

```ts
abstract supportToolCall: boolean;
```

Defined in: packages/core/src/llms/base.ts:76
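A concrete provider class fills in these abstract members. The following is a minimal, self-contained sketch of that contract; the interfaces are simplified stand-ins for the real `@llamaindex/core` types, and `EchoLLM` is a hypothetical toy subclass, not an actual provider:

```typescript
// Simplified stand-ins for the real core types.
interface LLMMetadata {
  model: string;
  contextWindow: number;
}

interface ChatMessage {
  role: "user" | "assistant" | "system";
  content: string;
}

interface ChatResponse {
  message: ChatMessage;
}

// Minimal sketch of the abstract contract: subclasses must provide
// `metadata`, `supportToolCall`, and `chat`.
abstract class ToolCallLLM {
  abstract metadata: LLMMetadata;
  abstract supportToolCall: boolean;
  abstract chat(params: { messages: ChatMessage[] }): Promise<ChatResponse>;
}

// A toy subclass that echoes the last user message back.
class EchoLLM extends ToolCallLLM {
  metadata: LLMMetadata = { model: "echo-model", contextWindow: 4096 };
  supportToolCall = true;

  async chat(params: { messages: ChatMessage[] }): Promise<ChatResponse> {
    const last = params.messages[params.messages.length - 1];
    return { message: { role: "assistant", content: last.content } };
  }
}
```

Callers can check `supportToolCall` at runtime to decide whether to pass tool definitions to a given LLM instance.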
## Methods

### complete()

#### Call Signature

```ts
complete(params): Promise<AsyncIterable<CompletionResponse, any, any>>
```

Defined in: packages/core/src/llms/base.ts:22

Get a prompt completion from the LLM

##### Parameters

###### params

`LLMCompletionParamsStreaming`

##### Returns

`Promise`<`AsyncIterable`<`CompletionResponse`, `any`, `any`>>

##### Inherited from

`BaseLLM.complete`

#### Call Signature

```ts
complete(params): Promise<CompletionResponse>
```

Defined in: packages/core/src/llms/base.ts:25

##### Parameters

###### params

`LLMCompletionParamsNonStreaming`

##### Returns

`Promise`<`CompletionResponse`>

##### Inherited from

`BaseLLM.complete`
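The two call signatures form an overload pair: the streaming variant resolves to an async iterable of responses, the non-streaming variant to a single response. A self-contained sketch of that overload pattern, assuming a `stream` flag selects the variant (the types and `TinyLLM` class here are hypothetical simplifications, not the real library code):

```typescript
// Simplified stand-in for the real CompletionResponse type.
interface CompletionResponse {
  text: string;
}

type CompletionParams = { prompt: string; stream?: boolean };

class TinyLLM {
  // Overloads: `stream: true` yields an async iterable of chunks,
  // otherwise a single CompletionResponse.
  complete(params: { prompt: string; stream: true }): Promise<AsyncIterable<CompletionResponse>>;
  complete(params: { prompt: string; stream?: false }): Promise<CompletionResponse>;
  async complete(
    params: CompletionParams,
  ): Promise<CompletionResponse | AsyncIterable<CompletionResponse>> {
    if (params.stream) {
      // Toy streaming: emit one chunk per word of the prompt.
      const words = params.prompt.split(" ");
      async function* chunks() {
        for (const w of words) yield { text: w };
      }
      return chunks();
    }
    // Toy non-streaming: echo the prompt back as the completion.
    return { text: params.prompt };
  }
}
```

The compiler picks the return type from the argument, so `for await` is only allowed on the result when `stream: true` was passed.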
### chat()

#### Call Signature

```ts
abstract chat(params): Promise<AsyncIterable<ChatResponseChunk, any, any>>
```

Defined in: packages/core/src/llms/base.ts:57

Get a chat response from the LLM

##### Parameters

###### params

`LLMChatParamsStreaming`<`AdditionalChatOptions`, `AdditionalMessageOptions`>

##### Returns

`Promise`<`AsyncIterable`<`ChatResponseChunk`, `any`, `any`>>

##### Inherited from

`BaseLLM.chat`

#### Call Signature

```ts
abstract chat(params): Promise<ChatResponse<AdditionalMessageOptions>>
```

Defined in: packages/core/src/llms/base.ts:63

##### Parameters

###### params

`LLMChatParamsNonStreaming`<`AdditionalChatOptions`, `AdditionalMessageOptions`>

##### Returns

`Promise`<`ChatResponse`<`AdditionalMessageOptions`>>
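As with `complete()`, the streaming `chat()` overload resolves to an async iterable of `ChatResponseChunk` values that are typically accumulated into the final text. A self-contained sketch of consuming both variants, assuming a `stream` flag selects the overload (the type names mirror this page but are local simplifications, and `FakeChatLLM` is a hypothetical stand-in):

```typescript
// Simplified local stand-ins for the real chat types.
interface ChatMessage {
  role: "user" | "assistant";
  content: string;
}
interface ChatResponse {
  message: ChatMessage;
}
interface ChatResponseChunk {
  delta: string;
}

class FakeChatLLM {
  chat(params: { messages: ChatMessage[]; stream: true }): Promise<AsyncIterable<ChatResponseChunk>>;
  chat(params: { messages: ChatMessage[]; stream?: false }): Promise<ChatResponse>;
  async chat(params: {
    messages: ChatMessage[];
    stream?: boolean;
  }): Promise<ChatResponse | AsyncIterable<ChatResponseChunk>> {
    const reply = "ok: " + params.messages[params.messages.length - 1].content;
    if (params.stream) {
      // Toy streaming: one chunk per character of the reply.
      async function* chunks() {
        for (const ch of reply) yield { delta: ch };
      }
      return chunks();
    }
    return { message: { role: "assistant", content: reply } };
  }
}

// Streaming consumption: accumulate the `delta` of each chunk.
async function demo(): Promise<string> {
  const llm = new FakeChatLLM();
  const stream = await llm.chat({
    messages: [{ role: "user", content: "hi" }],
    stream: true,
  });
  let text = "";
  for await (const chunk of stream) text += chunk.delta;
  return text; // "ok: hi"
}
```

Non-streaming callers simply omit the flag and `await` a single `ChatResponse` instead of iterating.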