I’ve wanted to ask this question for a while but didn’t know who to ask.
Can someone explain the use case for vector DBs like Pinecone, Milvus, etc. vs. a fully featured search engine like Vespa or Elasticsearch, which also supports vector search?
Is there something about running this type of index operationally that is particularly difficult?
I’ve been playing around here a bit. One feature is that documents have their embeddings computed and stored at index time, which means query performance is quite good. Another is that the model used to embed documents and queries can be swapped and configured per use case. A pretty cool feature of Marqo is multi-part queries, which let you search with multiple positive terms and even include negative terms to push unwanted results down the ranking.
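To make the multi-part query idea concrete, here’s a toy sketch of how positive and negative weighted query terms can be combined into a single query vector and ranked by cosine similarity. This is just an illustration of the underlying idea, not Marqo’s actual implementation — the `embed` function here is a fake deterministic embedding, and the document strings are made up.

```python
import hashlib
import numpy as np

def embed(text: str, dim: int = 8) -> np.ndarray:
    """Toy stand-in for a real embedding model (illustration only)."""
    seed = int.from_bytes(hashlib.md5(text.encode()).digest()[:4], "little")
    rng = np.random.default_rng(seed)
    v = rng.normal(size=dim)
    return v / np.linalg.norm(v)

def weighted_query(parts: dict[str, float]) -> np.ndarray:
    """Combine several query strings into one unit vector.

    Positive weights pull results toward a phrase; negative weights
    push results away from it -- the same idea as a multi-part
    query with negative terms.
    """
    q = sum(w * embed(text) for text, w in parts.items())
    return q / np.linalg.norm(q)

# Index time: documents are embedded once and stored.
docs = ["red cotton shirt", "blue denim jacket", "red wool sweater"]
doc_vecs = np.stack([embed(d) for d in docs])

# Query time: one combined vector, documents ranked by cosine similarity.
q = weighted_query({"red shirt": 1.0, "denim": -0.5})
scores = doc_vecs @ q
ranking = [docs[i] for i in np.argsort(-scores)]
print(ranking)
```

Because documents are embedded up front, query time is just one embedding call plus a vector similarity scan (or an ANN index lookup at scale), which is where the good query performance comes from.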