This is Part 3 in a five-part series on optimizing websites for the agentic web. Part 1 covered the evolution from SEO to AAIO. Part 2 explored how to get your content cited in AI responses. This article goes deeper: the protocols forming the infrastructure layer that makes everything else possible.
The early web needed HTTP to transport data, HTML to structure content, and the W3C to keep everyone building on the same foundation. Without those shared standards, we would have ended up with a fragmented collection of incompatible networks instead of a single web.
The agentic web is at that same inflection point. AI agents need standardized ways to connect to tools, talk to each other, query websites, and understand codebases. Without shared protocols, every AI vendor builds proprietary integrations, and the result is the same fragmentation the early web narrowly avoided.
Four protocols are emerging as the foundational layer. This article covers what each one does, who's behind it, and what it means for your business. Throughout this series, we draw exclusively from official documentation, research papers, and announcements from the companies building this infrastructure.
Why Standards Matter
Consider how the original web came together. In the early 1990s, competing browser vendors and incompatible standards were fragmenting what should have been a unified network. The W3C brought order by establishing shared protocols. HTTP handled transport. HTML handled structure. Everyone agreed on the rules, and the web took off.
AI is at a similar crossroads. Right now, every major AI company is building agents that need to interact with external tools, data sources, other agents, and websites. Without standards, connecting your business systems to AI means building separate integrations for Claude, ChatGPT, Gemini, Copilot, and whatever comes next. That's the M × N problem: M different AI models times N different tools equals an unsustainable number of custom connections.
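To make the arithmetic concrete, here is a small sketch. The platform names are real; the count of 25 business tools is a hypothetical figure for illustration:

```python
# Hypothetical scale: M AI platforms, N business tools.
platforms = ["Claude", "ChatGPT", "Gemini", "Copilot"]  # M = 4
num_tools = 25                                          # N = 25, illustrative

# Without a shared protocol: one custom integration per (platform, tool) pair.
custom_integrations = len(platforms) * num_tools        # M x N = 100

# With a shared protocol: each platform and each tool implements it once.
shared_protocol_work = len(platforms) + num_tools       # M + N = 29

print(custom_integrations, shared_protocol_work)  # 100 29
```

The gap widens fast: every new platform or tool adds M × N integrations to the first column but only one implementation to the second.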
What makes this moment remarkable is who's building the solution together. On Dec. 9, 2025, the Linux Foundation announced the Agentic AI Foundation (AAIF), a vendor-neutral governance body for agentic AI standards. Eight platinum members anchor it: AWS, Anthropic, Block, Bloomberg, Cloudflare, Google, Microsoft, and OpenAI.
OpenAI, Anthropic, Google, and Microsoft. Competing on AI products, collaborating on AI infrastructure. As Linux Foundation Executive Director Jim Zemlin put it: "We're seeing AI enter a new phase, as conversational systems shift to autonomous agents that can work together."
This is a bigger deal than most people realize. Competitors are building shared infrastructure because they all recognize that proprietary standards would hold back the entire ecosystem, including themselves.
MCP: The Universal Adapter
What it is: The Model Context Protocol (MCP) is an open standard for connecting AI applications to external tools, data sources, and workflows.
The official analogy is apt:
"Think of MCP like a USB-C port for AI applications. Just as USB-C provides a standardized way to connect electronic devices, MCP provides a standardized way to connect AI applications to external systems."
Before MCP, if you wanted your database, CRM, or internal tools accessible to an AI assistant, you had to build a custom integration for each AI platform. MCP replaces that with a single standard interface. Build one MCP server for your data, and every MCP-compatible AI system can connect to it.
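Under the hood, MCP clients and servers exchange JSON-RPC 2.0 messages. Here is a minimal sketch of what one tool invocation looks like on the wire; the `check_stock` tool and its arguments are hypothetical placeholders, while the `tools/call` envelope follows the MCP specification:

```python
import json

def make_tool_call(request_id: int, tool_name: str, arguments: dict) -> str:
    """Build an MCP tools/call request as a JSON-RPC 2.0 message."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",  # standard MCP method for invoking a tool
        "params": {"name": tool_name, "arguments": arguments},
    })

# Hypothetical inventory tool a business might expose via its MCP server.
request = make_tool_call(1, "check_stock", {"sku": "ABC-123"})
print(request)
```

The point of the standard is that every MCP-compatible client emits this same message shape, so one server can serve them all.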
The numbers are striking. MCP launched as an open-source project from Anthropic on Nov. 25, 2024. In just over a year, it reached 97 million monthly SDK downloads across Python and TypeScript, with over 10,000 public MCP servers built by the community.
The adoption timeline tells the story. Anthropic's Claude had native MCP support from day one. In March 2025, OpenAI CEO Sam Altman announced support across OpenAI's products, stating: "People love MCP and we are excited to add support across our products." Google followed in April, confirming MCP support in Gemini. Microsoft joined the MCP steering committee at Build 2025 in May, with MCP support in VS Code reaching general availability in July 2025.
From internal experiment to industry standard in 12 months. That pace of adoption signals something real.
What this means for your business: If your data, tools, or services are MCP-accessible, every major AI platform can use them. That's not a theoretical benefit. It means an AI assistant helping your customer can pull real-time product availability from your inventory system, check order status from your CRM, or retrieve pricing from your database, all through one standardized connection rather than platform-specific integrations.
A2A: How Agents Talk To Each Other
What it is: The Agent2Agent protocol (A2A) enables AI agents from different vendors to discover each other's capabilities and collaborate on tasks.
If MCP is how agents connect to tools, A2A is how agents connect to each other. The distinction matters. In a world where businesses use AI agents from Salesforce for CRM, ServiceNow for IT, and an internal agent for billing, those agents need a way to discover what each other can do, delegate tasks, and coordinate responses. A2A provides that.
Google launched A2A on April 9, 2025 with over 50 technology partners. By June, Google had donated the protocol to the Linux Foundation. By July, version 0.3 shipped with over 150 supporting organizations, including Salesforce, SAP, ServiceNow, PayPal, Atlassian, Microsoft, and AWS.
The core concept is the Agent Card: a JSON metadata document that serves as a digital business card for agents. Each A2A-compatible agent publishes an Agent Card at a standard web address (/.well-known/agent-card.json) describing its identity, capabilities, skills, and authentication requirements. When one agent needs help with a task, it reads another agent's card to understand what that agent can do, then communicates through A2A to request collaboration.
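An Agent Card is plain JSON, so it is easy to sketch. The fields below follow the general shape described in the A2A documentation but should be treated as illustrative rather than authoritative; the endpoint URL and the skill are hypothetical:

```python
import json

# Illustrative Agent Card -- the exact schema is defined by the A2A spec;
# field names here follow its general shape but are not authoritative.
agent_card = {
    "name": "Billing Agent",
    "description": "Calculates refunds and adjusts invoices.",
    "url": "https://agents.example.com/billing",  # hypothetical endpoint
    "version": "1.0.0",
    "capabilities": {"streaming": True},
    "skills": [
        {
            "id": "calculate-refund",
            "name": "Calculate refund",
            "description": "Computes the refund owed for a given order.",
        }
    ],
}

# A peer agent would fetch this document from the well-known path and
# inspect the skills before delegating a task.
print(json.dumps(agent_card, indent=2))
```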
Google's own framing of how these pieces fit together is useful: "Build with ADK, equip with MCP, communicate with A2A." ADK (Agent Development Kit) is Google's framework for building agents, MCP gives them access to tools, and A2A lets them talk to other agents.
Here's a practical example. A customer contacts your company with a billing question that requires a refund. Your customer service agent (built on one platform) identifies the issue, passes the context to your billing agent (built on another platform) via A2A, which calculates the refund amount and hands off to your payments agent (yet another platform) to process it. The customer sees one seamless interaction. Behind the scenes, three agents from different vendors collaborated through a shared protocol.
The enterprise adoption signal is strong. When Salesforce, SAP, ServiceNow, and every major consultancy sign on to a protocol within months, it's because their enterprise clients are already running into the multi-vendor agent coordination problem that A2A solves.
NLWeb: Making Websites Conversational
What it is: NLWeb (Natural Language Web) is an open project from Microsoft that turns any website into a natural language interface, queryable by both humans and AI agents.
Of the four protocols covered here, NLWeb is the most directly relevant to this series' audience. MCP, A2A, and AGENTS.md are primarily developer concerns. NLWeb is about your website.
NLWeb was introduced at Microsoft Build 2025 on May 19, 2025. It was conceived and developed by R.V. Guha, who joined Microsoft as CVP and Technical Fellow. If that name sounds familiar, it should: Guha is the creator of RSS, RDF, and Schema.org, three standards that fundamentally shaped how the web organizes and syndicates information. When the person behind Schema.org builds a new web protocol, it's worth paying attention.
The key insight behind NLWeb is that websites already publish structured data: Schema.org markup, RSS feeds, product catalogs, recipe databases. NLWeb leverages these existing formats, combining them with AI to let users and agents query a site's content using natural language instead of clicking through pages.
Microsoft's framing is deliberate: "NLWeb can play a similar role to HTML in the emerging agentic web." The NLWeb README puts it even more directly: "NLWeb is to MCP/A2A what HTML is to HTTP."
Every NLWeb instance is automatically an MCP server. That means any website running NLWeb immediately becomes accessible to the entire ecosystem of MCP-compatible AI assistants and agents. Your site's content doesn't just sit there waiting for visitors. It becomes actively queryable by any AI system that speaks MCP.
Early adopters include Eventbrite, Shopify, Tripadvisor, O'Reilly Media, Common Sense Media, and Hearst. These are content-rich websites that already invest heavily in structured data. NLWeb builds directly on that investment.
Here's what this looks like in practice. Instead of a user navigating Tripadvisor's search filters to find family-friendly restaurants in Barcelona with outdoor seating, an AI agent could query Tripadvisor's NLWeb endpoint: "Find family-friendly restaurants in Barcelona with outdoor seating and good reviews." The response comes back as structured Schema.org JSON, ready for the agent to present to the user or act on.
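A response of that kind is ordinary Schema.org JSON, which an agent can filter programmatically instead of scraping pages. The envelope below is an illustrative sketch, not NLWeb's official response format, and the restaurant data is invented; only the Schema.org types and property names are real:

```python
import json

# Hypothetical NLWeb-style payload: results expressed as Schema.org JSON.
# The "results" envelope is illustrative, not the official NLWeb format.
response_text = json.dumps({
    "results": [
        {
            "@type": "Restaurant",
            "name": "Casa Example",
            "servesCuisine": "Catalan",
            "aggregateRating": {"@type": "AggregateRating",
                                "ratingValue": 4.6},
        }
    ]
})

data = json.loads(response_text)
# Because results use Schema.org types, an agent can filter on well-known
# properties such as aggregateRating rather than parsing HTML.
top_rated = [r["name"] for r in data["results"]
             if r["aggregateRating"]["ratingValue"] >= 4.5]
print(top_rated)  # ['Casa Example']
```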
If your business has already invested in Schema.org markup (and Part 2 of this series explained why you should), you're closer to NLWeb readiness than you might think.
AGENTS.md: Instructions For AI Coders
What it is: AGENTS.md is a standardized Markdown file that provides AI coding agents with project-specific guidance, essentially a README written for machines instead of humans.
This protocol is less directly relevant to the marketers and strategists reading this series, but it's an important piece of the complete picture, especially if your organization has development teams using AI coding tools.
AGENTS.md emerged from a collaboration between OpenAI Codex, Google Jules, Cursor, Amp, and Factory. The problem they were solving: AI coding agents need to understand project conventions, build steps, testing requirements, and architectural decisions before they can contribute useful code. Without explicit guidance, agents make assumptions that lead to inconsistent, buggy output.
Since its launch in August 2025, AGENTS.md has been adopted by over 60,000 open-source projects and is supported by tools including GitHub Copilot, Claude Code, Cursor, Gemini CLI, VS Code, and many others. It is now governed by the Agentic AI Foundation, alongside MCP.
The file itself is simple. Plain Markdown, typically under 150 lines, covering build commands, architectural overview, coding conventions, and testing requirements. Agents read it before making any changes, getting the same tribal knowledge that senior engineers carry in their heads.
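For illustration, a minimal AGENTS.md might look like this; the commands, paths, and rules are hypothetical placeholders for whatever applies in your codebase:

```markdown
# AGENTS.md

## Build and test
- Install dependencies: `npm install`
- Run the test suite: `npm test` — all tests must pass before committing.

## Conventions
- TypeScript strict mode; avoid `any`.
- Components live in `src/components/`, one component per file.

## Boundaries
- Never edit files under `migrations/` — they are generated.
- Ask before adding new runtime dependencies.
```

There is no special syntax to learn: it is ordinary Markdown that agents read top to bottom before touching the code.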
GitHub reports that Copilot now generates 46% of code for its users. When nearly half of code is AI-generated, having a standard way to ensure agents follow your conventions, security practices, and architectural patterns isn't optional. It's quality control.
Why this matters for your business: If your development teams use AI coding tools (and most do), AGENTS.md ensures those tools produce code that matches your standards. It reduces agent-generated bugs, cuts onboarding time for AI tools on new projects, and provides consistency across teams.
How They Fit Together
These four protocols aren't competing. They're complementary layers in the same stack.
| Protocol | Created By | Purpose | Web Analogy |
|---|---|---|---|
| MCP | Anthropic | Connect agents to tools and data | USB ports |
| A2A | Google | Agent-to-agent communication | Email/messaging |
| NLWeb | Microsoft | Make websites queryable by agents | HTML |
| AGENTS.md | OpenAI + collaborators | Guide AI coding agents | README files |
| AAIF | Linux Foundation | Governance and standards body | W3C |
The stack works like this: MCP provides the plumbing for agents to access tools and data. A2A enables agents to coordinate with each other. NLWeb makes website content accessible to the entire ecosystem. AGENTS.md ensures AI coding agents build correctly. And the Agentic AI Foundation provides the governance layer, keeping these protocols open, vendor-neutral, and interoperable.
The parallel to the original web is impossible to ignore:
- HTTP (transport) maps to MCP (tool access) and A2A (agent communication).
- HTML (content structure) maps to NLWeb (website content for agents).
- W3C (governance) maps to AAIF (governance).
What's different this time is the speed. HTTP took years to reach broad adoption. MCP went from launch to universal platform support in 12 months. A2A grew from 50 to 150+ partner organizations in three months. NLWeb shipped with major publisher adoption at launch. AGENTS.md reached 60,000 projects within its first few months.
The infrastructure is being built at internet speed, not standards-committee speed. That's partly because the companies involved are the same ones building the agents that need these protocols. They're motivated.
And these four aren't the only protocols emerging. Commerce-specific standards are building the transaction layer: Shopify and Google co-developed the Universal Commerce Protocol (UCP), launched in January 2026 with support from Etsy, Target, Walmart, and Wayfair. OpenAI and Stripe co-developed the Agentic Commerce Protocol (ACP), which powers Instant Checkout in ChatGPT. CopilotKit's AG-UI protocol addresses agent-to-frontend communication, with integrations from LangGraph, CrewAI, and Google ADK. We'll cover the commerce protocols in depth in Part 5.
What This Means For Your Business
You don't need to implement all four protocols tomorrow. But you do need to understand what's being built, because it shapes what your website, tools, and teams should be ready for.
If you've already invested in Schema.org markup, NLWeb is your closest on-ramp. It builds directly on the structured data you already maintain. As NLWeb adoption grows, your Schema.org investment becomes the foundation for making your website conversationally accessible to AI agents. Keep your structured data current and comprehensive.
If you have APIs or internal tools, consider MCP accessibility. Making your services available through MCP means any AI platform can interact with them. For ecommerce, that could mean product catalogs, inventory systems, and order tracking becoming accessible to AI shopping assistants across ChatGPT, Claude, Gemini, and whatever comes next.
If you're evaluating multi-vendor agent workflows, A2A is the protocol to watch. Enterprise organizations running agents from multiple vendors (Salesforce, ServiceNow, internal tools) will increasingly need those agents to coordinate. A2A is the emerging standard for that coordination.
If your development teams use AI coding tools, adopt AGENTS.md now. It's the easiest protocol to implement (it's a single Markdown file) and the one with the most immediate, tangible benefit: fewer bugs, more consistent output, faster onboarding for AI tools in your codebase.
The underlying message across all four protocols is the same: the agentic web is being built on open standards, not proprietary ones. The businesses that understand these standards early will be better positioned as AI agents become a primary way customers interact with companies.
These aren't things you need to implement today. But they are things you need to understand, because Part 4 of this series gets into the technical specifics of making your website agent-ready.
Key Takeaways
- Four protocols form the agentic web's infrastructure. MCP (tools), A2A (agent communication), NLWeb (website content), and AGENTS.md (code guidance) are complementary layers, not rivals.
- The speed of adoption signals real urgency. MCP reached 97 million monthly SDK downloads and universal platform support in 12 months. A2A grew from 50 to 150+ partner organizations in three months. These are not experiments.
- Competitors are collaborating on infrastructure. OpenAI, Anthropic, Google, and Microsoft are all building shared protocols under the Agentic AI Foundation. This mirrors the W3C moment that unified the early web.
- NLWeb is perhaps the most relevant protocol for website owners. Built by the creator of Schema.org, it turns your existing structured data into a conversational interface for AI agents. Every NLWeb instance is automatically an MCP server.
- MCP is the universal adapter. Build one MCP connection to your data, and every major AI platform (Claude, ChatGPT, Gemini, Copilot) can access it. No more building separate integrations for each platform.
- Start with what you have. Schema.org markup readies you for NLWeb. Existing APIs can become MCP servers. AGENTS.md is a single file your dev team can create today. You don't need to start from scratch.
The original web succeeded because rivals agreed on shared standards. The agentic web is following the same playbook, just faster. The protocols are being established now. The governance is in place. The agents are already using them.
Up next in Part 4: the hands-on technical guide for making your website ready for autonomous AI agents, from semantic HTML to accessibility standards to testing with real agent tools.
This post was originally published on No Hacks.
Featured Image: Collagery/Shutterstock
