VectorStoreIndex
Defined in: llamaindex/src/indices/vectorStore/index.ts:73
The VectorStoreIndex, an index that stores nodes according only to their vector embeddings.
Extends
BaseIndex<IndexDict>
Properties
storageContext
storageContext: StorageContext
Defined in: llamaindex/src/indices/BaseIndex.ts:27
Inherited from
BaseIndex.storageContext
docStore
docStore: BaseDocumentStore
Defined in: llamaindex/src/indices/BaseIndex.ts:28
Inherited from
BaseIndex.docStore
indexStruct
indexStruct: IndexDict
Defined in: llamaindex/src/indices/BaseIndex.ts:30
Inherited from
BaseIndex.indexStruct
indexStore
indexStore: BaseIndexStore
Defined in: llamaindex/src/indices/vectorStore/index.ts:74
Overrides
BaseIndex.indexStore
embedModel?
optional embedModel: BaseEmbedding
Defined in: llamaindex/src/indices/vectorStore/index.ts:75
vectorStores
vectorStores: VectorStoreByType
Defined in: llamaindex/src/indices/vectorStore/index.ts:76
Methods
insert()
insert(document): Promise<void>
Defined in: llamaindex/src/indices/BaseIndex.ts:68
Insert a document into the index.
Parameters
document
Document<Metadata>
Returns
Promise<void>
Inherited from
BaseIndex.insert
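Example
A minimal sketch of inserting a document into an existing index; it assumes the top-level llamaindex package re-exports Document and VectorStoreIndex and that an embedding model is already configured (for example via Settings.embedModel).
```ts
import { Document, VectorStoreIndex } from "llamaindex";

// Build a small index first; fromDocuments() is documented further down this page.
const index = await VectorStoreIndex.fromDocuments([
  new Document({ text: "Initial document." }),
]);

// insert() splits the new document into nodes, embeds them, and adds them to the index.
await index.insert(new Document({ text: "A document added after construction." }));
```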
init()
static init(options): Promise<VectorStoreIndex>
Defined in: llamaindex/src/indices/vectorStore/index.ts:90
The async init function creates a new VectorStoreIndex.
Parameters
options
Returns
Promise<VectorStoreIndex>
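Example
A hedged sketch of calling init() directly; the option names used here (nodes, storageContext, logProgress) are assumptions based on VectorIndexOptions and may differ between versions, and storageContextFromDefaults is assumed to be exported from the top-level package.
```ts
import { TextNode, VectorStoreIndex, storageContextFromDefaults } from "llamaindex";

// Persist index data under ./storage instead of keeping it purely in memory.
const storageContext = await storageContextFromDefaults({ persistDir: "./storage" });

const index = await VectorStoreIndex.init({
  nodes: [new TextNode({ text: "Pre-chunked content." })],
  storageContext,
  logProgress: true,
});
```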
getNodeEmbeddingResults()
getNodeEmbeddingResults(nodes, options?): Promise<BaseNode<Metadata>[]>
Defined in: llamaindex/src/indices/vectorStore/index.ts:170
Calculates the embeddings for the given nodes.
Parameters
nodes
BaseNode<Metadata>[]
An array of BaseNode objects representing the nodes for which embeddings are to be calculated.
options?
An optional object containing additional parameters.
logProgress?
boolean
A boolean indicating whether to log progress to the console (useful for debugging).
Returns
Promise<BaseNode<Metadata>[]>
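Example
A sketch of computing embeddings for nodes without inserting them; the index is built from a single seed document and an embedding model is assumed to be configured.
```ts
import { Document, TextNode, VectorStoreIndex } from "llamaindex";

const index = await VectorStoreIndex.fromDocuments([
  new Document({ text: "Seed document." }),
]);

// Calculates embeddings for standalone nodes; the nodes come back with their
// embedding field populated but are not added to the index.
const nodes = [new TextNode({ text: "A chunk to embed." })];
const embedded = await index.getNodeEmbeddingResults(nodes, { logProgress: true });

console.log(embedded[0]?.embedding?.length); // embedding dimensionality, if set
```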
buildIndexFromNodes()
buildIndexFromNodes(nodes, options?): Promise<void>
Defined in: llamaindex/src/indices/vectorStore/index.ts:193
Get embeddings for nodes and place them into the index.
Parameters
nodes
BaseNode<Metadata>[]
options?
logProgress?
boolean
Returns
Promise<void>
fromDocuments()
static fromDocuments(documents, args): Promise<VectorStoreIndex>
Defined in: llamaindex/src/indices/vectorStore/index.ts:206
High level API: split documents, get embeddings, and build index.
Parameters
documents
Document<Metadata>[]
args
VectorIndexOptions & object = {}
Returns
Promise<VectorStoreIndex>
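Example
The typical high level path; it assumes an embedding model is configured (many versions default to OpenAI embeddings read from the environment).
```ts
import { Document, VectorStoreIndex } from "llamaindex";

const documents = [
  new Document({ text: "LlamaIndex.TS is a data framework for LLM applications." }),
  new Document({ text: "A VectorStoreIndex retrieves nodes by embedding similarity." }),
];

// Splits the documents into nodes, embeds them, and builds the index in one call.
const index = await VectorStoreIndex.fromDocuments(documents);
```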
fromVectorStores()
static fromVectorStores(vectorStores): Promise<VectorStoreIndex>
Defined in: llamaindex/src/indices/vectorStore/index.ts:251
Parameters
vectorStores
VectorStoreByType
Returns
Promise<VectorStoreIndex>
fromVectorStore()
static fromVectorStore(vectorStore): Promise<VectorStoreIndex>
Defined in: llamaindex/src/indices/vectorStore/index.ts:270
Parameters
vectorStore
BaseVectorStore<unknown>
Returns
Promise<VectorStoreIndex>
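Example
A sketch of attaching an index to an existing vector store; existingVectorStore is a placeholder for an already-populated store integration (none is constructed here), and BaseVectorStore is assumed to be re-exported from the top-level package.
```ts
import { VectorStoreIndex, type BaseVectorStore } from "llamaindex";

// Placeholder: stands in for a populated store such as a Postgres or Qdrant integration.
declare const existingVectorStore: BaseVectorStore;

const index = await VectorStoreIndex.fromVectorStore(existingVectorStore);
const retriever = index.asRetriever();
```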
asRetriever()
asRetriever(options?): VectorIndexRetriever
Defined in: llamaindex/src/indices/vectorStore/index.ts:274
Create a new retriever from the index.
Parameters
options?
Omit<object & object, "index"> | Omit<object & object, "index">
Returns
VectorIndexRetriever
Overrides
BaseIndex.asRetriever
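Example
A sketch of retrieving nodes directly; similarityTopK is assumed to be accepted among the retriever options and caps the number of nodes returned per query.
```ts
import { Document, VectorStoreIndex } from "llamaindex";

const index = await VectorStoreIndex.fromDocuments([
  new Document({ text: "Retrievers fetch the nodes most similar to a query." }),
]);

const retriever = index.asRetriever({ similarityTopK: 3 });
const nodesWithScores = await retriever.retrieve({ query: "How are nodes fetched?" });
```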
asQueryEngine()
asQueryEngine(options?): RetrieverQueryEngine
Defined in: llamaindex/src/indices/vectorStore/index.ts:284
Create a RetrieverQueryEngine. similarityTopK is only used if no existing retriever is provided.
Parameters
options?
retriever?
BaseRetriever
responseSynthesizer?
BaseSynthesizer
preFilters?
MetadataFilters
nodePostprocessors?
BaseNodePostprocessor
[]
similarityTopK?
number
Returns
RetrieverQueryEngine
Overrides
BaseIndex.asQueryEngine
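Example
A minimal end-to-end query; because no custom retriever is passed, the similarityTopK option is honored.
```ts
import { Document, VectorStoreIndex } from "llamaindex";

const index = await VectorStoreIndex.fromDocuments([
  new Document({ text: "The query engine retrieves context and synthesizes an answer." }),
]);

const queryEngine = index.asQueryEngine({ similarityTopK: 3 });
const response = await queryEngine.query({ query: "What does the query engine do?" });

console.log(response.toString());
```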
asChatEngine()
asChatEngine(options): ContextChatEngine
Defined in: llamaindex/src/indices/vectorStore/index.ts:310
Convert the index to a chat engine.
Parameters
options
VectorIndexChatEngineOptions = {}
The options for creating the chat engine
Returns
ContextChatEngine
A ContextChatEngine that uses the index's retriever to get context for each query
Overrides
BaseIndex.asChatEngine
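Example
A sketch of a single chat turn; toString() on the reply is assumed to return the response text.
```ts
import { Document, VectorStoreIndex } from "llamaindex";

const index = await VectorStoreIndex.fromDocuments([
  new Document({ text: "Chat engines keep history and retrieve context on every turn." }),
]);

// The ContextChatEngine uses the index's retriever to gather context for each message.
const chatEngine = index.asChatEngine();
const reply = await chatEngine.chat({ message: "What does the chat engine retrieve?" });

console.log(reply.toString());
```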
insertNodesToStore()
protected insertNodesToStore(newIds, nodes, vectorStore): Promise<void>
Defined in: llamaindex/src/indices/vectorStore/index.ts:324
Parameters
newIds
string[]
nodes
BaseNode<Metadata>[]
vectorStore
BaseVectorStore<unknown>
Returns
Promise<void>
insertNodes()
insertNodes(nodes, options?): Promise<void>
Defined in: llamaindex/src/indices/vectorStore/index.ts:348
Parameters
nodes
BaseNode<Metadata>[]
options?
logProgress?
boolean
Returns
Promise<void>
Overrides
BaseIndex.insertNodes
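Example
A sketch of inserting pre-built nodes directly, bypassing document splitting; an embedding model is assumed to be configured.
```ts
import { Document, TextNode, VectorStoreIndex } from "llamaindex";

const index = await VectorStoreIndex.fromDocuments([
  new Document({ text: "Seed document." }),
]);

// Embeds the given nodes and adds them to the index and its vector store(s).
await index.insertNodes(
  [new TextNode({ text: "A manually constructed chunk." })],
  { logProgress: true },
);
```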
deleteRefDoc()
deleteRefDoc(refDocId, deleteFromDocStore): Promise<void>
Defined in: llamaindex/src/indices/vectorStore/index.ts:364
Parameters
refDocId
string
deleteFromDocStore
boolean = true
Returns
Promise<void>
Overrides
BaseIndex.deleteRefDoc
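Example
A sketch of removing everything derived from one source document; passing true (the default) also deletes the source document from the document store.
```ts
import { Document, VectorStoreIndex } from "llamaindex";

const doc = new Document({ text: "Temporary content.", id_: "doc-to-remove" });
const index = await VectorStoreIndex.fromDocuments([doc]);

// Deletes the nodes whose ref doc id matches, then removes the source document itself.
await index.deleteRefDoc(doc.id_, true);
```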
deleteRefDocFromStore()
protected deleteRefDocFromStore(vectorStore, refDocId): Promise<void>
Defined in: llamaindex/src/indices/vectorStore/index.ts:376
Parameters
vectorStore
BaseVectorStore<unknown>
refDocId
string
Returns
Promise<void>