# HuggingFaceInferenceAPI

Defined in: providers/huggingface/src/shared.ts:89

Wrapper around Hugging Face's Inference API. API docs: https://huggingface.co/docs/huggingface.js/inference/README. List of tasks with models: https://huggingface.co/api/tasks

Note that the Conversational API is not yet supported by the Inference API; Hugging Face recommends using the text generation API instead. See: https://github.com/huggingface/huggingface.js/issues/586#issuecomment-2024059308

## Extends

- `BaseLLM`
## Constructors

### new HuggingFaceInferenceAPI()

```typescript
new HuggingFaceInferenceAPI(init): HuggingFaceInferenceAPI
```

Defined in: providers/huggingface/src/shared.ts:97

#### Parameters

##### init

#### Returns

`HuggingFaceInferenceAPI`

#### Overrides

`BaseLLM.constructor`
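These docs do not spell out the shape of `init`. A hypothetical sketch inferred from the documented instance properties below (the field names are assumptions, and the real config type likely also carries a Hugging Face access token):

```typescript
// Hypothetical sketch of the constructor's `init` argument, inferred from the
// documented properties; the actual type in the library may differ.
interface HuggingFaceInitSketch {
  model: string; // a Hugging Face model id
  temperature?: number;
  topP?: number;
  maxTokens?: number;
  contextWindow?: number;
}

// Illustrative values only.
const init: HuggingFaceInitSketch = {
  model: "mistralai/Mistral-7B-Instruct-v0.2",
  temperature: 0.1,
  topP: 1,
  contextWindow: 4096,
};

console.log(init.model);
```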
## Properties

### model

```typescript
model: string
```

Defined in: providers/huggingface/src/shared.ts:90

### temperature

```typescript
temperature: number
```

Defined in: providers/huggingface/src/shared.ts:91

### topP

```typescript
topP: number
```

Defined in: providers/huggingface/src/shared.ts:92

### maxTokens?

```typescript
optional maxTokens: number
```

Defined in: providers/huggingface/src/shared.ts:93

### contextWindow

```typescript
contextWindow: number
```

Defined in: providers/huggingface/src/shared.ts:94

### hf

```typescript
hf: HfInference
```

Defined in: providers/huggingface/src/shared.ts:95
## Accessors

### metadata

#### Get Signature

```typescript
get metadata(): LLMMetadata
```

Defined in: providers/huggingface/src/shared.ts:118

##### Returns

`LLMMetadata`

#### Overrides

`BaseLLM.metadata`
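The getter presumably assembles the documented instance properties into the returned `LLMMetadata`. A minimal self-contained sketch of that assembly (the shape mirrors the properties documented above; the library's actual `LLMMetadata` type may carry additional fields):

```typescript
// Self-contained sketch: a class with the documented properties and a
// metadata getter that mirrors them. Values are illustrative defaults.
class LLMSketch {
  model = "gpt2";
  temperature = 0.1;
  topP = 1;
  maxTokens?: number;
  contextWindow = 1024;

  get metadata() {
    const { model, temperature, topP, maxTokens, contextWindow } = this;
    return { model, temperature, topP, maxTokens, contextWindow };
  }
}

const meta = new LLMSketch().metadata;
console.log(meta.model, meta.contextWindow);
```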
## Methods

### chat()

#### Call Signature

```typescript
chat(params): Promise<AsyncIterable<ChatResponseChunk, any, any>>
```

Defined in: providers/huggingface/src/shared.ts:129

##### Parameters

###### params

`LLMChatParamsStreaming`<`object`, `object`>

##### Returns

`Promise`<`AsyncIterable`<`ChatResponseChunk`, `any`, `any`>>

##### Overrides

`BaseLLM.chat`

#### Call Signature

```typescript
chat(params): Promise<ChatResponse<object>>
```

Defined in: providers/huggingface/src/shared.ts:132

##### Parameters

###### params

`LLMChatParamsNonStreaming`<`object`, `object`>

##### Returns

`Promise`<`ChatResponse`<`object`>>

##### Overrides

`BaseLLM.chat`
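The two call signatures above form a standard TypeScript overload pair: streaming params yield a `Promise` of an `AsyncIterable` of chunks, non-streaming params yield a `Promise` of one complete response. A simplified, self-contained sketch of the pattern (all types and method bodies here are illustrative stand-ins, not the library's actual code):

```typescript
// Illustrative stand-ins for the documented types.
interface ChatMessage { role: string; content: string }
interface ChatResponseSketch { message: ChatMessage }
interface ChatChunkSketch { delta: string }

interface ParamsStreaming { messages: ChatMessage[]; stream: true }
interface ParamsNonStreaming { messages: ChatMessage[]; stream?: false }

class ChatSketch {
  // Overload pair mirroring the documented chat() signatures.
  chat(params: ParamsStreaming): Promise<AsyncIterable<ChatChunkSketch>>;
  chat(params: ParamsNonStreaming): Promise<ChatResponseSketch>;
  async chat(
    params: ParamsStreaming | ParamsNonStreaming,
  ): Promise<AsyncIterable<ChatChunkSketch> | ChatResponseSketch> {
    // The `stream` flag discriminates the two documented call signatures.
    if (params.stream) return this.streamChat(params);
    return { message: { role: "assistant", content: "hello" } };
  }

  // Async generator: calling it returns an AsyncIterable of chunks.
  protected async *streamChat(params: ParamsStreaming) {
    for (const delta of ["hel", "lo"]) yield { delta };
  }
}

async function demo() {
  const llm = new ChatSketch();

  // Non-streaming: one complete response.
  const res = await llm.chat({ messages: [{ role: "user", content: "hi" }] });
  console.log(res.message.content); // "hello"

  // Streaming: iterate the chunks as they arrive.
  const stream = await llm.chat({
    messages: [{ role: "user", content: "hi" }],
    stream: true,
  });
  let out = "";
  for await (const chunk of stream) out += chunk.delta;
  console.log(out); // "hello"
}
demo();
```

Callers pick the return type purely by the presence of `stream: true` in the params, which is how the real `chat()` can dispatch to `streamChat()` or `nonStreamChat()` internally.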
### nonStreamChat()

```typescript
protected nonStreamChat(params): Promise<ChatResponse<object>>
```

Defined in: providers/huggingface/src/shared.ts:161

#### Parameters

##### params

`LLMChatParamsNonStreaming`<`object`, `object`>

#### Returns

`Promise`<`ChatResponse`<`object`>>
### streamChat()

```typescript
protected streamChat(params): AsyncIterable<ChatResponseChunk, any, any>
```

Defined in: providers/huggingface/src/shared.ts:178

#### Parameters

##### params

`LLMChatParamsStreaming`<`object`, `object`>

#### Returns

`AsyncIterable`<`ChatResponseChunk`, `any`, `any`>