MongoDB has introduced new AI-focused capabilities aimed at helping enterprises run AI agents in production environments. The updates include automated vector embeddings, persistent agent memory, performance improvements in MongoDB 8.3, and expanded deployment support across cloud and hybrid infrastructure.
The company said enterprises often struggle to combine multiple tools for vector search, memory management, embeddings, and operational databases. MongoDB’s latest release aims to bring these functions together within a single platform.
The announcement includes updates to MongoDB Atlas, MongoDB Vector Search, and MongoDB 8.3, along with integrations for LangGraph.js and AWS PrivateLink.
MongoDB focuses on AI agents in production
As businesses move beyond AI experiments and into production deployments, data infrastructure has become a bottleneck.
CJ Desai, president and CEO of MongoDB, said the challenge is no longer limited to AI models themselves.
“The hardest part of running agents in production isn’t the model. It’s the data layer underneath it,” Desai said.
According to MongoDB, AI agents need fast access to accurate data, persistent memory across sessions, and low-latency infrastructure to function reliably in enterprise environments.
Automated embeddings for real-time AI search
One of the major announcements is Automated Voyage AI Embeddings for MongoDB Vector Search, now available in public preview.
Embedding models convert data into numerical vectors that AI systems use to understand meaning and context. MongoDB said embeddings will now be generated automatically whenever data is written or updated.
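To make the idea concrete, the sketch below shows how similarity between embedding vectors is typically measured with cosine similarity. The three-dimensional vectors and the words attached to them are invented for illustration; real embedding models such as Voyage AI's produce vectors with hundreds or thousands of dimensions.

```typescript
// Toy illustration of how vector embeddings encode meaning.
// Vectors pointing in similar directions represent similar concepts.

function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Hypothetical embeddings: "dog" and "puppy" point in similar
// directions; "invoice" points elsewhere.
const dog = [0.9, 0.1, 0.0];
const puppy = [0.85, 0.15, 0.05];
const invoice = [0.0, 0.2, 0.95];

console.log(cosineSimilarity(dog, puppy));   // high similarity
console.log(cosineSimilarity(dog, invoice)); // low similarity
```

A vector search index answers the question "which stored vectors are closest to this query vector?" at scale, which is what lets a semantic query match documents by meaning rather than exact keywords.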
The company claims this removes much of the manual infrastructure setup traditionally required for semantic search systems.
MongoDB also highlighted that Voyage AI embedding models currently rank first on the Retrieval Embedding Benchmark (RTEB), a benchmark measuring retrieval accuracy.
The goal is to help enterprises deploy AI-powered search systems faster while improving the quality of context retrieval for AI agents.
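For context on what such a query looks like, a semantic search in MongoDB is expressed as an aggregation pipeline with a $vectorSearch stage. The fragment below is a sketch: the index name, field name, and vector values are placeholders, and in practice the query vector would come from the same embedding model used when the data was written.

```typescript
// Sketch of a MongoDB Vector Search aggregation pipeline.
// "vector_index" and "embedding" are assumed names for illustration.
const pipeline = [
  {
    $vectorSearch: {
      index: "vector_index",          // Atlas Vector Search index (assumed name)
      path: "embedding",              // document field holding the vector (assumed name)
      queryVector: [0.12, -0.07, 0.33], // placeholder query embedding
      numCandidates: 100,             // candidates considered before final ranking
      limit: 5,                       // top results returned
    },
  },
];

// Usage against a live Atlas cluster (connection code omitted):
// const results = await collection.aggregate(pipeline).toArray();
```

With automated embeddings, the write-time half of this workflow (generating and syncing the `embedding` field) is handled by the platform rather than by application code.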
Persistent memory for AI agents
MongoDB also announced general availability of the LangGraph.js Long-Term Memory Store integration.
The feature gives JavaScript and TypeScript developers persistent memory capabilities for AI agents using MongoDB Atlas as the backend database.
This allows AI agents to retain information across conversations and sessions without relying on separate memory databases.
Pablo Stern, chief product officer for AI and Emerging Products at MongoDB, said many AI failures are tied to weak data infrastructure rather than model limitations.
“Developers no longer have to build and maintain data infrastructure, wire up embeddings, or manage syncing between systems,” Stern said.
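The kind of capability the integration provides can be sketched as a namespaced key-value memory store. The interface below is illustrative rather than the library's exact API, and a Map stands in for the MongoDB Atlas collection that backs the real integration, so the example is self-contained.

```typescript
// Simplified sketch of a long-term memory store for an AI agent.
// In the LangGraph.js integration, MongoDB Atlas is the backend;
// an in-memory Map stands in here for illustration.

type MemoryValue = Record<string, unknown>;

class InMemoryStore {
  private data = new Map<string, MemoryValue>();

  private keyFor(namespace: string[], key: string): string {
    return [...namespace, key].join("/");
  }

  // Persist a memory under a namespace (e.g. per user, per agent).
  put(namespace: string[], key: string, value: MemoryValue): void {
    this.data.set(this.keyFor(namespace, key), value);
  }

  // Retrieve it in a later session or conversation.
  get(namespace: string[], key: string): MemoryValue | undefined {
    return this.data.get(this.keyFor(namespace, key));
  }
}

// Session 1: the agent learns a user preference.
const store = new InMemoryStore();
store.put(["user-42", "preferences"], "language", { value: "German" });

// Session 2 (a later conversation): the memory survives.
const pref = store.get(["user-42", "preferences"], "language");
console.log(pref);
```

Because the real store is backed by a durable database rather than process memory, the agent's accumulated knowledge outlives any single conversation or server restart.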
MongoDB 8.3 brings higher database performance
The company also introduced MongoDB 8.3, which it said improves database performance across several workloads without requiring application changes.
According to MongoDB, version 8.3 delivers:
- Up to 45% higher read throughput
- Up to 35% higher write throughput
- Up to 15% higher ACID transaction throughput
- Up to 30% better performance for complex operations
MongoDB said the update is aimed at enterprises running large AI workloads that require rapid data retrieval and near real-time context updates.
Hybrid and multi-cloud deployment support expands
MongoDB also announced general availability of cross-region connectivity for AWS PrivateLink.
The feature keeps traffic between MongoDB Atlas clusters inside AWS private networks without exposing traffic to the public internet.
The company said this is particularly important for sectors such as banking, healthcare, and government, where compliance and data residency requirements often shape infrastructure decisions.
MongoDB said its platform continues to support deployments across AWS, Microsoft Azure, Google Cloud, on-premises systems, and hybrid environments.
