Posted On: Nov 29, 2023
Amazon DocumentDB (with MongoDB compatibility) now supports vector search, a new capability that enables you to store, index, and search millions of vectors with millisecond response times. Vectors are numerical representations of unstructured data, such as text, created by machine learning (ML) models to capture the semantic meaning of the underlying data. Vector search for Amazon DocumentDB can store vectors produced by Amazon Web Services ML services such as Amazon SageMaker, by third-party services, and by proprietary models. There are no upfront commitments or additional costs to use vector search; you pay only for the data you store and the compute resources you use.
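To make the store-and-index part concrete, here is a minimal sketch using pymongo: a document carries a numeric embedding field, and a vector index is created over that field. The cluster endpoint, credentials, collection names, and the vectorOptions parameters (type, dimensions, similarity, lists) are placeholders or assumptions taken from the developer guide's index options; verify the exact syntax there.

    from pymongo import MongoClient

    # Endpoint, credentials, and TLS options are placeholders for your own cluster.
    client = MongoClient(
        "mongodb://user:password@sample-cluster.cluster-xxxx.docdb.amazonaws.com.cn:27017/"
        "?tls=true&replicaSet=rs0&retryWrites=false"
    )
    db = client["catalog"]
    collection = db["products"]

    # Each document stores a numeric embedding produced by an ML model
    # (for example, a model hosted on Amazon SageMaker).
    collection.insert_one({
        "name": "trail running shoe",
        "description": "lightweight shoe with an aggressive grip",
        "embedding": [0.12, -0.48, 0.33, 0.91],  # toy 4-dimensional vector
    })

    # Create a vector index over the embedding field via the raw createIndexes
    # command. The vectorOptions fields below are assumptions based on the
    # developer guide and should be checked against it.
    db.command("createIndexes", "products", indexes=[{
        "name": "embedding_vector_idx",
        "key": {"embedding": "vector"},
        "vectorOptions": {
            "type": "ivfflat",
            "dimensions": 4,
            "similarity": "cosine",
            "lists": 1,
        },
    }])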
With vector search for Amazon DocumentDB, you can easily set up, operate, and scale databases for your ML workloads, including generative AI applications. You no longer have to spend time managing separate vector infrastructure, writing code to connect to another service, or duplicating data from your source database. Together with large language models (LLMs), the vector search capability enables you to search the database based on meaning, unlocking a wide range of use cases, including semantic search, product recommendations, personalization, and chatbots.
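The sketch below continues the one above to illustrate the semantic-search flow: a query string is turned into an embedding by whatever model you use (the embed_text helper here is a hypothetical stand-in, not a real library call), and the nearest stored vectors are retrieved with the $search / vectorSearch aggregation stage described in the developer guide. Treat the stage and parameter names as assumptions to verify there.

    def embed_text(text: str) -> list[float]:
        """Placeholder: call your embedding model here (for example, a SageMaker
        endpoint or a third-party LLM API) and return a vector whose
        dimensionality matches the index."""
        return [0.10, -0.40, 0.30, 0.95]  # toy stand-in for a real embedding

    query_vector = embed_text("shoes for muddy trail runs")

    # k-nearest-neighbour lookup against the indexed embedding field.
    results = collection.aggregate([
        {"$search": {"vectorSearch": {
            "vector": query_vector,
            "path": "embedding",
            "similarity": "cosine",
            "k": 5,
        }}},
        {"$project": {"_id": 0, "name": 1, "description": 1}},
    ])
    for doc in results:
        print(doc)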
Vector search for Amazon DocumentDB is available on Amazon DocumentDB 5.0 instance-based clusters in the Amazon Web Services China (Beijing) Region, operated by Sinnet, and the Amazon Web Services China (Ningxia) Region, operated by NWCD.
You can get started by launching an Amazon DocumentDB cluster directly from the Amazon Web Services Management Console or the Amazon Web Services CLI. Learn more about vector search on the product page and in the developer guide.
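If you prefer to script the setup, a minimal boto3 sketch is shown below, under the assumption that a DocumentDB instance-based cluster on engine version 5.0.0 is all vector search requires; the identifiers, credentials, region, and instance class are placeholders, and the console or CLI steps in the developer guide remain the authoritative reference.

    import boto3

    # Region shown is China (Ningxia), operated by NWCD; use cn-north-1 for Beijing.
    docdb = boto3.client("docdb", region_name="cn-northwest-1")

    # Create a DocumentDB 5.0 cluster (engine version 5.0.0).
    docdb.create_db_cluster(
        DBClusterIdentifier="vector-search-demo",
        Engine="docdb",
        EngineVersion="5.0.0",
        MasterUsername="demouser",
        MasterUserPassword="choose-a-strong-password",
    )

    # The cluster needs at least one instance before it can accept connections.
    docdb.create_db_instance(
        DBInstanceIdentifier="vector-search-demo-1",
        DBInstanceClass="db.r6g.large",
        Engine="docdb",
        DBClusterIdentifier="vector-search-demo",
    )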