Top AI Announcements at MongoDB.local NYC
The AI landscape is evolving so quickly that it’s no surprise customers feel overwhelmed by their options. Between foundation models for everything from text to code, a growing set of AI frameworks, and the steady stream of AI-related companies founded daily, developers and organizations face a dizzying array of choices.
MongoDB empowers customers through a developer data platform that helps them avoid vendor lock-in from cloud providers or AI vendors in this fast-moving space. This freedom allows customers to choose the large language model (LLM) that best suits their needs, now or in the future, whether it's open source or proprietary. Today at MongoDB.local NYC, we announced many new product capabilities, partner integrations, services, and solution offerings that enable development teams to get started and build customer-facing solutions with AI.
Run everywhere, with whatever technology you are using in your AI stack
MongoDB’s flexible document model is built on the ethos of “data that is accessed and used together is stored together.” Vectors are a natural extension of this capability, meaning customers can store their source data, metadata, and related vector embeddings in the same document. All of this is accessed and queried with a common Query API, making it easy to combine and work with vector data alongside the other types of data stored in MongoDB.
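To make this concrete, here is a minimal sketch using PyMongo. The connection string, the "products.items" collection, the Atlas Vector Search index named "vector_index" on the "embedding" field, and the placeholder embed() function are assumptions for illustration, not a prescribed setup.

```python
# A minimal sketch, assuming an Atlas cluster, a "products.items" collection,
# and an Atlas Vector Search index named "vector_index" defined on the
# "embedding" field. embed() is a placeholder for whichever embedding model
# (open source or proprietary) you choose; its output dimension must match
# the index definition.
from pymongo import MongoClient

client = MongoClient("mongodb+srv://<user>:<password>@<cluster>.mongodb.net")
items = client["products"]["items"]


def embed(text: str) -> list[float]:
    # Placeholder embedding for illustration only; replace with a real model call.
    return [float((hash(text) >> (8 * i)) & 0xFF) / 255.0 for i in range(8)]


# Source data, metadata, and the vector embedding are stored together
# in the same document.
description = "Lightweight trail shoe with aggressive grip for muddy runs."
items.insert_one({
    "name": "Trail runner",
    "category": "footwear",
    "description": description,
    "embedding": embed(description),
})

# The same Query API (an aggregation pipeline) runs the vector search and
# combines it with ordinary projections over the rest of the document.
results = items.aggregate([
    {"$vectorSearch": {
        "index": "vector_index",
        "path": "embedding",
        "queryVector": embed("shoes for wet off-road running"),
        "numCandidates": 100,
        "limit": 5,
    }},
    {"$project": {"_id": 0, "name": 1, "category": 1,
                  "score": {"$meta": "vectorSearchScore"}}},
])
for doc in results:
    print(doc)
```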
MongoDB Atlas—our fully managed, multi-cloud developer data platform—makes it easy to build AI-powered applications and experiences, with the breadth and depth of MongoDB’s AI partnerships and integrations—no matter which language, application framework, foundation model, or technology partner is used or preferred by developers.
This year, we’re continuing to focus on our AI partnerships and integrations to make it easier for developers to build innovative applications with generative AI, including:
- Python and JavaScript support through the dedicated LangChain-MongoDB package (see the sketch after this list)
- Python and C# Microsoft Semantic Kernel integration for Atlas Vector Search
- AI models from Mistral and Cohere
- AI models on the Fireworks AI platform
- Addition of Atlas Vector Search as a knowledge base in Amazon Bedrock
- Atlas as a datastore enabling storage, query, and retrieval using natural language in ChatGPT
- Atlas Vector Search as a datastore on Haystack
- Atlas Vector Search as a datastore on DocArray
- Collaboration with Google Gemini Code Assist and Amazon Q to quickly prototype new features and accelerate application development
- Google Vertex AI Extension to harness natural language with MongoDB queries
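The LangChain integration from the list above might look like the following in practice; a minimal sketch assuming the langchain-mongodb package is installed, an Atlas cluster is reachable, and a vector search index named "vector_index" exists on the "embedding" field of the search.docs collection. ToyEmbeddings is a deterministic stand-in for whichever embedding provider you prefer.

```python
# A minimal sketch of the dedicated langchain-mongodb package (pip install
# langchain-mongodb). The connection string, "search.docs" namespace, and the
# "vector_index" Atlas Vector Search index are assumptions for illustration.
from langchain_core.embeddings import Embeddings
from langchain_mongodb import MongoDBAtlasVectorSearch
from pymongo import MongoClient


class ToyEmbeddings(Embeddings):
    """Deterministic stand-in embeddings; swap in a real provider in practice."""

    def embed_query(self, text: str) -> list[float]:
        return [float((hash(text) >> (8 * i)) & 0xFF) / 255.0 for i in range(8)]

    def embed_documents(self, texts: list[str]) -> list[list[float]]:
        return [self.embed_query(t) for t in texts]


collection = MongoClient(
    "mongodb+srv://<user>:<password>@<cluster>.mongodb.net"
)["search"]["docs"]

store = MongoDBAtlasVectorSearch(
    collection=collection,
    embedding=ToyEmbeddings(),
    index_name="vector_index",   # must match the Atlas Vector Search index name
    text_key="text",             # document field holding the raw text
    embedding_key="embedding",   # document field holding the vector
)

store.add_texts([
    "MongoDB.local NYC featured new AI announcements.",
    "Atlas Vector Search stores vectors next to source data.",
])
for doc in store.similarity_search("What happened at .local NYC?", k=2):
    print(doc.page_content)
```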
MongoDB integrates with a rich ecosystem of AI developer frameworks, LLMs, and embedding providers. We continue to invest in making the entire AI stack work together seamlessly so developers can easily take advantage of generative AI capabilities in their applications. MongoDB’s integrations and our industry-leading multi-cloud capabilities allow organizations to move quickly and avoid lock-in to any particular cloud provider or AI technology in a rapidly evolving space.
Build high-performance AI applications securely and at scale
Workload isolation, without data isolation, is critical for building performant, scalable AI applications. Search Nodes in MongoDB Atlas provide dedicated computing resources and enable users to isolate memory-intensive AI workloads for superior performance and higher availability. Users can optimize resource consumption for their use case, upsizing or downsizing the hardware for that specific node irrespective of the rest of the database cluster. Search Nodes make it easy to optimize performance for vector search queries without over- or under-provisioning an entire cluster. Infrastructure-as-code (IaC) integrations with the HashiCorp Terraform Atlas Provider and AWS CloudFormation enable developers to configure and programmatically deploy Search Nodes at scale.
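For a rough sense of what programmatic deployment can look like outside Terraform or CloudFormation, the sketch below calls the Atlas Administration API from Python. The endpoint path, versioned Accept header, payload shape, and instance size name are assumptions based on the Search Nodes (search deployment) resource; verify them against the current Atlas Administration API reference before relying on them.

```python
# A hedged sketch of creating dedicated Search Nodes via the Atlas
# Administration API. The endpoint path, Accept header version, payload
# shape, and instance size below are assumptions to verify against the
# current API reference; the Terraform Atlas Provider and CloudFormation
# resources mentioned above wrap the same operation.
import requests
from requests.auth import HTTPDigestAuth

GROUP_ID = "<atlas-project-id>"     # placeholder
CLUSTER_NAME = "<cluster-name>"     # placeholder
AUTH = HTTPDigestAuth("<public-key>", "<private-key>")  # programmatic API key

resp = requests.post(
    f"https://cloud.mongodb.com/api/atlas/v2/groups/{GROUP_ID}"
    f"/clusters/{CLUSTER_NAME}/search/deployment",
    auth=AUTH,
    headers={"Accept": "application/vnd.atlas.2024-05-30+json"},
    json={
        # Assumed spec: two dedicated search nodes, sized independently of
        # the rest of the cluster (the isolation benefit described above).
        "specs": [{"instanceSize": "S30_HIGHCPU_NVME", "nodeCount": 2}]
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```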
Search Nodes are an integral part of Atlas - our fully managed, battle-tested, multi-cloud platform. Previously, we announced the availability of Search Nodes for our AWS and Google Cloud customers, and at MongoDB.local NYC we are excited to announce the preview of Search Nodes for our Azure customers. Search Nodes on Atlas help developers move faster by removing the friction of integrating, securing, and maintaining the essential data components required to build and deploy modern AI applications.
Improve developer productivity with AI-powered experiences
Today, we also announced new and improved releases of our intelligent developer experiences in MongoDB Compass, MongoDB Relational Migrator, and MongoDB Atlas Charts, aimed at enhancing developer productivity and velocity. With these updated releases, developers can use natural language to query their data in MongoDB Compass, troubleshoot common problems during development, perform SQL-to-Query API conversion right from within MongoDB Relational Migrator, and quickly build charts and dashboards using natural language prompts in MongoDB Atlas Charts.
Collectively, these intelligent experiences will help developers build differentiated features with greater control and flexibility, making it easier than ever to build applications with MongoDB.
Enable development teams to get started and build customer-facing solutions with AI faster and more easily
MongoDB makes it easy for companies of all sizes to build AI-powered applications. To provide customers with a straightforward way to get started with generative AI, MongoDB is announcing the MongoDB AI Application Program (MAAP). Based on usage patterns for common AI use cases, customers receive a functioning application built on a reference architecture backed by MongoDB Atlas, vetted AI models and hosting solutions, technical support, and a full-service engagement led by our Professional Services team.
We’re launching with an incredible group of industry-leading partners, including Anthropic, Anyscale, AWS, Cohere, Credal.ai, Fireworks.ai, Google Cloud, gravity9, LangChain, LlamaIndex, Microsoft Azure, Nomic, PeerIslands, Pureinsights, and Together AI. MongoDB is in a unique position in the market to be able to pull together such an impressive AI partner ecosystem in a single customer-focused program, and we’re excited to see how MAAP will help customers more easily go from ideation to fully functioning generative AI applications.
Last year, to further enable startups to build AI solutions with MongoDB Atlas, we launched the AI Innovators Program, an extension of MongoDB for Startups, which offers an additional $5,000 in Atlas credits to our AI startups. This year, we are expanding the program by introducing an AI Startup Hub, which features a curated guide for getting started with MongoDB and AI, quickstarts for MongoDB and select AI partners, and startup credit offerings from our AI partners.
We provide two new AI Accelerator consulting packages for larger enterprise companies: AI Essentials and AI Implementation. While MAAP is aimed exclusively at building highly vetted reference architectures, these consulting packages allow customers to design, build, and deploy open-ended AI prototypes and solutions into their applications.
Data has always been a competitive advantage for organizations, and MongoDB makes it easy, fast, and flexible to innovate with data. We continue to invest in making every other part of the AI stack easy for organizations: vetting top partners to ensure compatibility across the application stack, operating a managed service that spans multiple clouds, and preserving the openness that has always been part of MongoDB, helping customers avoid vendor lock-in.
How does MongoDB Atlas unify operational, analytical, and generative AI data services to streamline building AI-enriched applications? Check out our MongoDB for AI page to learn more.
May 2, 2024