MCP Guide 2026: What It Is, Best Servers & Setup
The Model Context Protocol hit 97M installs. Learn what MCP is, the best servers, how to set it up in Claude & Cursor, and why it matters.
TL;DR — MCP in 2026
| What | Open protocol connecting AI models to external tools and data sources |
|---|---|
| Created by | Anthropic (Nov 2024), now governed by the Linux Foundation's Agentic AI Foundation |
| Installs | 97 million+ SDK downloads as of March 2026 |
| Servers available | 12,000+ across npm, PyPI, GitHub, and registries like Smithery |
| Supported clients | Claude Desktop, Claude Code, Cursor, Windsurf, VS Code (Copilot), Zed, Cline, Replit |
| Transport | Stdio (local), Streamable HTTP (remote) |
| Cost | Free and open-source (Apache 2.0) |
What Is the Model Context Protocol?
The Model Context Protocol (MCP) is an open standard created by Anthropic that defines how AI applications connect to external tools, APIs, and data sources. Think of it as USB-C for AI: a single, universal plug that lets any AI model talk to any service.
Before MCP, every AI tool had its own proprietary way of calling external services. If you built a tool integration for ChatGPT, you had to rebuild it for Claude, and again for Cursor. MCP eliminates that fragmentation. Build one MCP server, and it works everywhere.
Anthropic open-sourced MCP in November 2024. By December 2025, it was donated to the Linux Foundation's Agentic AI Foundation (AAIF), co-founded by Anthropic, OpenAI, Block, Google, Microsoft, AWS, and Cloudflare. By March 25, 2026, it had crossed 97 million installs, one of the fastest adoption curves of any AI infrastructure standard.
How Does MCP Work?
MCP uses a client-server architecture with three key components:
The Architecture
┌─────────────────────────────────────────┐
│ Host (Claude Desktop, Cursor, etc.) │
│ │
│ ┌───────────┐ ┌───────────┐ │
│ │ MCP Client│ │ MCP Client│ ... │
│ └─────┬─────┘ └─────┬─────┘ │
└────────┼───────────────┼────────────────┘
│ │
┌─────▼─────┐ ┌─────▼─────┐
│ MCP Server│ │ MCP Server│
│ (GitHub) │ │ (Postgres)│
└───────────┘ └───────────┘
- Host: The AI application you use (Claude Desktop, Cursor, Claude Code)
- MCP Client: Embedded inside the host, maintains a 1:1 connection with each server
- MCP Server: A lightweight program that exposes tools, resources, or prompts from an external service
Three Primitives
MCP defines three primitives for how data flows:
| Primitive | Controlled By | Example |
|---|---|---|
| Tools | The AI model | "Search this GitHub repo," "Run this SQL query" |
| Resources | The application | File contents, database schemas, API responses |
| Prompts | The user | Pre-built prompt templates for specific workflows |
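Each primitive maps to its own family of JSON-RPC methods in the protocol: tools/list and tools/call, resources/list and resources/read, prompts/list and prompts/get. A sketch of what a client sends for each primitive (the tool name, resource URI, and prompt name here are illustrative, not from a real server):

```json
{ "jsonrpc": "2.0", "id": 1, "method": "tools/call",
  "params": { "name": "run_query", "arguments": { "sql": "SELECT 1" } } }

{ "jsonrpc": "2.0", "id": 2, "method": "resources/read",
  "params": { "uri": "file:///schema.sql" } }

{ "jsonrpc": "2.0", "id": 3, "method": "prompts/get",
  "params": { "name": "code-review" } }
```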
Transport
All communication uses JSON-RPC 2.0. Two transport options:
- Stdio: For local servers. The client launches the server as a subprocess and communicates via stdin/stdout. Zero network overhead, maximum simplicity.
- Streamable HTTP: For remote servers. Uses a single HTTP endpoint for bidirectional messaging. Replaced the older SSE transport in 2025.
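The framing itself is plain JSON-RPC 2.0: every request carries a jsonrpc version, an id, a method, and optional params. A minimal sketch of that envelope in TypeScript (an illustrative helper, not part of the official SDK):

```typescript
// Minimal JSON-RPC 2.0 request envelope as used by MCP transports.
// Illustrative only; the official SDKs handle this for you.
type JsonRpcRequest = {
  jsonrpc: "2.0";
  id: number;
  method: string;
  params?: Record<string, unknown>;
};

let nextId = 0;

// Build a request object. Over the stdio transport, this would be
// serialized to one line of JSON and written to the server's stdin.
function makeRequest(
  method: string,
  params?: Record<string, unknown>
): JsonRpcRequest {
  nextId += 1;
  return { jsonrpc: "2.0", id: nextId, method, ...(params ? { params } : {}) };
}

const req = makeRequest("tools/call", {
  name: "get_weather",
  arguments: { city: "Paris" },
});
console.log(JSON.stringify(req));
```

The id lets the client match responses to requests, which matters because both transports allow multiple requests to be in flight at once.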
Top MCP Servers in 2026
The ecosystem has exploded to 12,000+ servers. These are the most widely adopted:
| Server | Connects To | Installs | Maintained By |
|---|---|---|---|
| Filesystem | Local files (read/write) | 485K+ | Anthropic (official) |
| GitHub | Repos, PRs, issues, CI/CD | 398K+ | GitHub (official) |
| PostgreSQL | Postgres databases | 312K+ | Anthropic (official) |
| Brave Search | Web search results | 287K+ | Anthropic (official) |
| Playwright | Browser automation | 180K+ | Microsoft (official) |
| Slack | Messages, channels, threads | 150K+ | Anthropic (official) |
| Context7 | Up-to-date library docs | 120K+ | Upstash |
| Supabase | Full Supabase platform | 95K+ | Supabase |
| Firecrawl | Web scraping and crawling | 85K+ | Firecrawl |
| Notion | Pages, databases, search | 70K+ | Community |
Standout Picks
Context7 is arguably the highest-impact server for daily coding. It gives your AI access to current, version-specific library documentation, eliminating hallucinated APIs and outdated code examples. If your AI agent keeps suggesting deprecated syntax, Context7 fixes that.
Playwright (by Microsoft) gives your AI control over a real browser using Playwright's accessibility tree. It is faster and more reliable than screenshot-based approaches, and ideal for testing, scraping, and visual verification.
Firecrawl handles web data extraction without leaving your editor. Its firecrawl_agent tool plans its own browsing strategy, gathering data from multiple sources and returning structured results.
Which AI Tools Support MCP?
Every major AI platform now supports MCP as a client:
| Tool | MCP Support | Notes |
|---|---|---|
| Claude Desktop | Full | Deepest integration (Anthropic built both Claude and MCP) |
| Claude Code | Full | No tool limit, terminal-based, supports both local and remote servers |
| Cursor | Full | Easiest setup via Settings UI, 40-tool cap per server |
| Windsurf | Full | Strong enterprise controls, admin-managed MCP configs |
| VS Code + Copilot | Full | Native MCP support in GitHub Copilot agent mode |
| Zed | Full | Built-in MCP support in the editor |
| Cline | Full | VS Code extension with MCP integration |
| ChatGPT | Partial | OpenAI adopted MCP support in 2025 |
| Replit | Full | Cloud-native MCP integration |
How to Set Up MCP Servers
In Claude Desktop
- Open Claude Desktop and go to Settings > Developer > Edit Config
- This opens claude_desktop_config.json. Add your servers:
{
"mcpServers": {
"filesystem": {
"command": "npx",
"args": [
"-y",
"@modelcontextprotocol/server-filesystem",
"/Users/you/projects"
]
},
"github": {
"command": "npx",
"args": ["-y", "@modelcontextprotocol/server-github"],
"env": {
"GITHUB_TOKEN": "ghp_your_token_here"
}
},
"postgres": {
"command": "npx",
"args": [
"-y",
"@modelcontextprotocol/server-postgres",
"postgresql://localhost:5432/mydb"
]
}
}
}
- Fully quit and reopen Claude Desktop (not just close the window)
- You should see a hammer icon in the chat input indicating available tools
In Cursor
Option A — Settings UI:
- Go to Settings > Features > MCP
- Click Add Server, fill in the name, command, and args
- Restart Cursor completely
Option B — Project config: Create .cursor/mcp.json in your project root:
{
"mcpServers": {
"context7": {
"command": "npx",
"args": ["-y", "@upstash/context7-mcp"]
},
"brave-search": {
"command": "npx",
"args": ["-y", "@modelcontextprotocol/server-brave-search"],
"env": {
"BRAVE_API_KEY": "your_key_here"
}
}
}
}
This method is ideal for team sharing via version control.
In Claude Code
Add servers via the command line:
# Add a stdio server
claude mcp add filesystem -- npx -y @modelcontextprotocol/server-filesystem /Users/you/projects
# Add a server with environment variables
claude mcp add github -e GITHUB_TOKEN=ghp_your_token -- npx -y @modelcontextprotocol/server-github
# List configured servers
claude mcp list
# Remove a server
claude mcp remove filesystem
Key Setup Tips
- Never hardcode secrets in config files. Use environment variables or a secrets manager.
- Any server that works in Claude Desktop also works in Cursor — the JSON format is identical.
- Stdio servers require Node.js (for npx) or Python (for uvx) installed locally.
- Remote servers only need a URL — no local dependencies.
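For remote servers, clients that support Streamable HTTP (Cursor, Claude Code, and others) accept a url field in place of a command. A sketch of what that looks like in an mcp.json file; the endpoint below is a placeholder, not a real service:

```json
{
  "mcpServers": {
    "my-remote-server": {
      "url": "https://mcp.example.com/mcp"
    }
  }
}
```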
Building Your Own MCP Server
If your tool or service does not have an MCP server yet, you can build one. Official SDKs exist for TypeScript, Python, Java, Kotlin, C#, Swift, and Go.
Here is a minimal example in TypeScript:
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";
const server = new McpServer({
name: "my-server",
version: "1.0.0",
});
server.tool(
"get_weather",
"Get current weather for a city",
{ city: z.string() },
async ({ city }) => ({
content: [{ type: "text", text: `Weather in ${city}: 72°F, sunny` }],
})
);
const transport = new StdioServerTransport();
await server.connect(transport);
Publish to npm and anyone can add it with npx -y your-package-name.
Why MCP Won
MCP succeeded where previous attempts at AI tool standards failed for three reasons:
- Ship first, standardize later. Anthropic shipped MCP with working servers and real client support before asking for industry buy-in. By the time competitors evaluated it, thousands of developers were already using it.
- Simplicity. A stdio-based MCP server is a single file. JSON-RPC is a well-understood protocol. The barrier to building a server is minutes, not weeks.
- Neutral governance. Donating MCP to the Linux Foundation's Agentic AI Foundation — with OpenAI, Google, Microsoft, and AWS as co-members — removed the "Anthropic lock-in" concern. It is now genuinely vendor-neutral.
Deploying MCP-Powered Apps
Once you have built an application using MCP-connected AI agents, you need to ship it. Y Build handles the deployment side — one-click deploy to Cloudflare's global edge network, with built-in analytics and SEO. Pair your MCP-powered AI workflow with Y Build to go from prototype to production in minutes.
Start building for free →
Frequently Asked Questions
What is MCP in simple terms?
MCP (Model Context Protocol) is a universal standard that lets AI assistants like Claude, ChatGPT, and Cursor connect to external tools and data. Instead of each AI building its own integrations, MCP provides one protocol that works everywhere — similar to how USB-C provides one cable for all devices.
Is MCP free to use?
Yes. MCP is fully open-source under the Apache 2.0 license. The protocol specification, SDKs, and official reference servers are all free. Some third-party MCP servers may require API keys for the underlying service (e.g., a Brave Search API key), but MCP itself costs nothing.
Do I need to be a developer to use MCP?
For basic setup (adding servers to Claude Desktop or Cursor), you need minimal technical knowledge — mostly copy-pasting JSON config. Building your own MCP server requires programming experience, but using existing servers is straightforward.
What is the difference between MCP and function calling?
Function calling is a model-level feature where you define tools in your API request. MCP is a protocol-level standard that sits above function calling — it defines how clients discover, connect to, and invoke tools hosted on external servers. MCP servers can expose tools that are then invoked via the model's function calling capability.
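To make the distinction concrete: with raw function calling you declare each tool's schema inline in every API request, whereas an MCP client discovers the same schema at runtime by sending tools/list to the server. A sketch of that discovery exchange, with an illustrative tool definition:

```json
{ "jsonrpc": "2.0", "id": 1, "method": "tools/list" }

{ "jsonrpc": "2.0", "id": 1,
  "result": {
    "tools": [{
      "name": "get_weather",
      "description": "Get current weather for a city",
      "inputSchema": {
        "type": "object",
        "properties": { "city": { "type": "string" } },
        "required": ["city"]
      }
    }]
  } }
```

The host then forwards these discovered schemas into the model's function calling interface, so the model never needs to know which server a tool came from.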
Which MCP server should I install first?
Start with Filesystem (for local file access) and Context7 (for up-to-date documentation). These two cover the most common use cases for developers. Add GitHub if you work with repos, and PostgreSQL or Supabase if you interact with databases.
Can I use MCP with ChatGPT?
Yes. OpenAI adopted MCP support in 2025 and co-founded the Agentic AI Foundation alongside Anthropic. ChatGPT supports MCP, though Claude Desktop and Claude Code currently offer the deepest integration since Anthropic created both the model and the protocol.
How many MCP servers can I run at once?
There is no hard protocol limit. Claude Desktop and Claude Code support as many servers as you configure. Cursor has a 40-tool cap per server but supports multiple servers simultaneously. In practice, most developers run 3-8 servers covering their core workflow.
Sources:
- Model Context Protocol — Official Site
- Anthropic — Introducing the Model Context Protocol
- Anthropic — Donating MCP to the Agentic AI Foundation
- Linux Foundation — Agentic AI Foundation Announcement
- MCP Hits 97M Installs — AI Unfiltered
- Why the Model Context Protocol Won — The New Stack
- MCP Specification
- GitHub MCP Servers Repository
- Cursor MCP Documentation
- Claude Code MCP Documentation