AWS, Anthropic, and the Model Context Protocol
The year ends now. We and only about 50,000 of our closest friends are about to hop a plane to Amazon Web Services (AWS) #reInvent2024 tomorrow. With AWS kicking in another $4 billion to Anthropic, there's little doubt that we'll be hearing lots about #genAI and agents. And we're wondering what we'll hear about multicloud and, in an era where national borders are hardening, sovereign cloud. What other themes are we going to be hearing about?
OK, AWS, you're making Anthropic your BFF in the LLM arms race. So let's talk about what you're investing in. We're especially intrigued by a recent announcement from Anthropic on the Model Context Protocol (MCP) -- the notion of standardizing the connection between language models and data sources. We're excited because, as organizations look beyond kicking the tires on gen AI, the spotlight is increasingly going back to data. Models are only as good as the data they train on and connect to. The need for sound data is, of course, behind the surging interest in retrieval augmented generation (#RAG), not to mention various approaches to grounding models and their answers. Given the arms race among the usual LLM suspects, we'd be surprised if the industry comes together around a common protocol for connecting to data. We'd love to hear AWS give MCP a shout-out this week. One can always dream.