MCP Hit 97 Million Installs — And It Actually Matters for How You Build

Remember when every AI tool had its own bespoke plugin system? OpenAI had ChatGPT plugins (RIP), Google had extensions, and every platform was reinventing the wheel with incompatible standards. That era is over.

MCP — Anthropic's Model Context Protocol — crossed 97 million monthly SDK downloads in March 2026. For context, Kubernetes took four years to reach comparable deployment density. MCP did it in about 16 months from its November 2024 launch.

If you've been tuning this out as "infrastructure stuff that doesn't affect me," it's time to tune back in.

What MCP Actually Is

The simplest explanation: MCP is a standard way for AI agents to connect to external tools and data sources. Think USB-C for AI — one protocol, any model, any tool.

Before MCP, if you wanted your product to work with Claude, you built a Claude integration. If you also wanted it to work with GPT, you built a separate GPT integration. And Gemini? Another one. Every AI platform had its own way of doing things, and every integration was bespoke work.

MCP collapses that into one standard. You build one MCP server that exposes your tool's capabilities, and any MCP-compatible client can use it — Claude, GPT, Gemini, whatever comes next.
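To make "exposes your tool's capabilities" concrete, here's a toy sketch of the two request shapes at the heart of the protocol: clients discover tools with tools/list, then invoke them with tools/call, all over JSON-RPC 2.0. This is plain Python with no SDK; the get_pageviews tool, its schema, and its return values are made up for illustration, and a real server would use an official MCP SDK and also handle initialization, transports, and errors.

```python
import json

# Toy registry standing in for the tools an MCP server exposes.
# Tool name, schema, and return values are illustrative only.
TOOLS = {
    "get_pageviews": {
        "description": "Return the pageview count for a URL path",
        "inputSchema": {
            "type": "object",
            "properties": {"path": {"type": "string"}},
            "required": ["path"],
        },
        "handler": lambda args: {"path": args["path"], "views": 1234},
    },
}

def handle_request(raw: str) -> str:
    """Dispatch one JSON-RPC 2.0 request the way an MCP server does."""
    req = json.loads(raw)
    if req["method"] == "tools/list":
        # Clients call this first to discover what the server offers.
        result = {
            "tools": [
                {"name": name, "description": t["description"],
                 "inputSchema": t["inputSchema"]}
                for name, t in TOOLS.items()
            ]
        }
    elif req["method"] == "tools/call":
        # Then they invoke a tool by name with JSON arguments.
        params = req["params"]
        result = TOOLS[params["name"]]["handler"](params["arguments"])
    else:
        result = {"error": f"unknown method: {req['method']}"}
    return json.dumps({"jsonrpc": "2.0", "id": req["id"], "result": result})
```

The point of the sketch: nothing in it mentions Claude, GPT, or Gemini. The server describes itself in a standard format, and any client that speaks the protocol can use it.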

The ecosystem is now massive: over 5,800 community and enterprise servers covering databases, CRMs, cloud providers, productivity tools, dev tools, e-commerce platforms, and analytics services. Basically every category of software that an AI agent might need to interact with already has an MCP server.

The big players all bought in. OpenAI, Google, Microsoft, AWS, and Cloudflare all support MCP through the Linux Foundation's Agentic AI Foundation. When every major company agrees on a standard, the protocol war is over. MCP won.

Why This Matters for Solo Builders

Here's where it gets interesting for people like us.

Integration leverage. If you're building an AI-powered product, MCP means you don't have to build integrations from scratch. The hard part of connecting to Slack, GitHub, Notion, Postgres, or whatever your users need is already done. You wire up MCP, and your product inherits the entire ecosystem's integrations.

Distribution through agents. This is the one most people miss. If you build an MCP server for your product, every AI assistant becomes a potential distribution channel. When someone asks Claude or GPT to "check my analytics" or "update my project board," and your tool has an MCP server, you're in the running. You didn't build a Claude plugin and a GPT plugin and a Gemini plugin — you built one MCP server and you're everywhere.
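The flip side of that is how little the client side varies. A minimal sketch, again stdlib-only and not a real SDK: every MCP client builds the same tools/call request, so the check_analytics tool name and arguments below (hypothetical, for illustration) would arrive at your server in an identical format no matter which assistant sent them.

```python
import itertools
import json

# Monotonically increasing JSON-RPC request ids.
_ids = itertools.count(1)

def tools_call_request(name: str, arguments: dict) -> str:
    """Build the JSON-RPC "tools/call" request an MCP client sends."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": next(_ids),
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    })

# Claude, GPT, and Gemini all emit this same wire format, so one
# server-side implementation covers every assistant:
request = tools_call_request("check_analytics", {"range": "7d"})
```

One server implementation, every agent as a potential caller: that is the distribution channel.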

One-person scale. Before MCP, a solo builder couldn't realistically offer a product that integrates with 50 different tools. The integration work alone would take a team. With MCP, you build to one standard and the ecosystem does the rest. This is a genuine leverage multiplier — the kind that makes solo businesses viable in categories that used to require teams.

The Practical "Now What"

If you already have a product: build an MCP server for it. I'm not being dramatic when I say this is becoming table stakes. As AI agents become a primary interface for how people interact with software, not having an MCP server is like not having an API in 2015. You can survive without one, but you're cutting yourself off from a growing distribution channel.

If you're starting something new: the MCP ecosystem itself is a business opportunity. There are 5,800+ servers, but there are gaps. Some existing servers are poorly maintained. Some categories are underserved. Building and maintaining high-quality MCP servers for specific niches is a valid product in itself.

The New Stack reported that MCP's biggest production pain points — authentication, error handling, streaming — are actively being addressed in the 2026 roadmap. The protocol is still maturing, which means early builders have the advantage of shaping best practices in their niche before the category gets crowded.

What I'm Doing About It

I've been building with MCP for this blog's workflow — using MCP servers for content management, deployment, and even the idea generation process. The experience has been good enough that I'm convinced this is the right abstraction layer.

My next project will have an MCP server from day one. Not as an afterthought or a "nice to have," but as a core part of the product architecture. If AI agents are going to be how people discover and interact with software, I want to be discoverable.

MCP won the protocol war by being boring and useful — the two best traits for infrastructure. If you're building anything that touches AI, stop treating it as "that Anthropic thing" and start treating it as plumbing you need to understand.
