From USB-C analogy to building your own server. The Model Context Protocol is now backed by the Linux Foundation, adopted by OpenAI and Google, and powering 10,000+ integrations.
Just as USB-C gave us one universal port for charging, data, and display—MCP gives AI one universal protocol for every tool, API, and data source.
Before USB-C, you needed a different cable for every device: Lightning for iPhones, Micro-USB for Android, barrel connectors for laptops. Before MCP, AI assistants faced the same fragmentation—every tool required a custom integration.
The Model Context Protocol solves this by defining a universal interface between AI models and external capabilities. An MCP server exposes tools, resources, and prompts through a standardized JSON-RPC 2.0 protocol. Any MCP-compatible AI client—Claude Desktop, Cursor, Cline, Windsurf, or an OpenAI-compatible agent—can connect to any MCP server without custom code.
The result? A server you build once works everywhere. A tool catalog that grows with the community. And AI assistants that can do real work—managing databases, deploying code, orchestrating infrastructure—through natural language.
Tools: Functions the AI can call. Each tool has a name, description, and typed input schema. The AI decides when and how to invoke them based on the user's request.
Resources: Read-only data the AI can access. File contents, database records, and API responses provide context without requiring the user to paste data manually.
Prompts: Reusable prompt templates that servers can expose. Predefined workflows and instruction sets help the AI use tools more effectively.
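Concretely, a server advertises each tool as a descriptor with those three pieces, its input schema expressed as JSON Schema. A sketch of what one such descriptor might look like (the weather tool is illustrative):

```json
{
  "name": "get_weather",
  "description": "Get current weather for a city",
  "inputSchema": {
    "type": "object",
    "properties": {
      "city": { "type": "string", "description": "City name" }
    },
    "required": ["city"]
  }
}
```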
On December 9, 2025, Anthropic donated MCP to the newly formed Agentic AI Foundation under the Linux Foundation—transforming it from a single-company project into a true industry standard.
November 2024: Anthropic open-sources MCP; Claude Desktop launches with MCP support
2025: MCP adoption accelerates: Cursor, Cline, Windsurf, and community servers proliferate
December 9, 2025: Anthropic donates MCP to the Agentic AI Foundation (AAIF) under the Linux Foundation
Early 2026: Ecosystem surpasses 10,000 published MCP servers; OpenAI announces MCP support
April 2-3, 2026: MCP Dev Summit in New York City
The companies steering the future of MCP: AWS, Anthropic, Block, Bloomberg, Cloudflare, Google, Microsoft, and OpenAI.
MCP follows a client-server architecture built on JSON-RPC 2.0 with multiple transport options.
An MCP Host (Claude Desktop, Cursor, etc.) contains an MCP Client that connects to one or more MCP Servers. Each server exposes a set of tools, resources, and prompts. The client discovers capabilities at startup and invokes them on behalf of the AI model.
All communication uses JSON-RPC 2.0. The client sends tools/list to discover available tools, then tools/call to invoke them. Response types include text content, images, and structured data. Error codes follow the JSON-RPC spec.
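A discovery-and-invocation round trip might look like the following sketch (message shapes follow JSON-RPC 2.0; the tool name and arguments are illustrative, not from any real server):

```json
// Client discovers available tools
{ "jsonrpc": "2.0", "id": 1, "method": "tools/list" }

// Client invokes one of them
{ "jsonrpc": "2.0", "id": 2, "method": "tools/call",
  "params": { "name": "get_weather", "arguments": { "city": "Berlin" } } }

// Server returns a structured result
{ "jsonrpc": "2.0", "id": 2,
  "result": { "content": [ { "type": "text", "text": "Weather in Berlin: 22C, partly cloudy" } ] } }
```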
MCP supports multiple transports: stdio (standard input/output) for local processes, SSE (Server-Sent Events) for HTTP streaming, and Streamable HTTP for stateless deployments. Choose based on whether your server runs locally or in the cloud.
"Create a US proxy with iOS fingerprint"
Model selects the right tool and parameters
JSON-RPC call to MCP server via transport
Server calls external API/database/service
Structured result returned to the AI model
AI formats the result for the user
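Step 3 of the flow above reduces to framing a JSON-RPC request. A minimal stdlib-only sketch of how a client might build that message for the stdio transport, where each message is one newline-delimited JSON object (the tool name and arguments are illustrative):

```typescript
// Shape of a JSON-RPC 2.0 request as used by MCP.
interface JsonRpcRequest {
  jsonrpc: "2.0";
  id: number;
  method: string;
  params?: Record<string, unknown>;
}

// Build a tools/call request the way an MCP client would.
function buildToolCall(
  id: number,
  name: string,
  args: Record<string, unknown>
): JsonRpcRequest {
  return { jsonrpc: "2.0", id, method: "tools/call", params: { name, arguments: args } };
}

// Over the stdio transport, each message is serialized as a single line of JSON.
const wire =
  JSON.stringify(buildToolCall(1, "create_port", { country: "US", carrier: "t-mobile" })) + "\n";
console.log(wire.trim());
```

In a real client the official SDK handles this framing for you; the point is that there is no magic underneath, just JSON over a pipe or HTTP connection.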
Get the PROXIES.SX MCP server running in Claude Desktop in under 5 minutes.
Open the Claude Desktop configuration file. The path depends on your OS:

```shell
# macOS
~/Library/Application Support/Claude/claude_desktop_config.json
# Windows
%APPDATA%\Claude\claude_desktop_config.json
# Linux
~/.config/Claude/claude_desktop_config.json
```

Add this configuration block. No installation step is needed; npx handles everything automatically:
```json
{
  "mcpServers": {
    "proxies-sx": {
      "command": "npx",
      "args": ["-y", "@proxies-sx/mcp-server"],
      "env": {
        "PROXIES_API_URL": "https://api.proxies.sx/v1",
        "PROXIES_EMAIL": "your@email.com",
        "PROXIES_PASSWORD": "your-password"
      }
    }
  }
}
```

Close and reopen Claude Desktop. You should see the MCP server icon indicating that 42 tools are available. Claude now has full access to your proxy infrastructure.
To verify everything works, try a prompt like "How much bandwidth do I have left?" You should get a response along the lines of:

47.2 GB remaining across 12 active ports. Next billing cycle: March 1, 2026.

The @proxies-sx/mcp-server package demonstrates what a production MCP server looks like: 42 tools organized into 10 categories covering every aspect of mobile proxy management.
"Create a US proxy on T-Mobile"
→ create_port(country="US", carrier="t-mobile")
"Rotate my proxy IP now"
→ rotate_port(portId="us-tmobile-8847")
"How much bandwidth do I have left?"
→ get_account_summary()
"Set up iOS fingerprint spoofing"
→ update_os_fingerprint(osFingerprint="ios:2")
"Open a support ticket about latency"
→ create_ticket(subject="...", message="...")
"Buy 50GB of bandwidth with USDC"
→ x402_create_session(gb=50, network="base")
Create your own MCP server in TypeScript with the official SDK. This minimal example gives you a working server in under 50 lines of code.
```shell
npm init -y && npm install @modelcontextprotocol/sdk zod
```

Then create server.ts:

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

// Create MCP server instance
const server = new McpServer({
  name: "my-custom-server",
  version: "1.0.0",
});

// Define a tool with typed parameters
server.tool(
  "get_weather",
  "Get current weather for a city",
  {
    city: z.string().describe("City name"),
    units: z.enum(["celsius", "fahrenheit"]).default("celsius"),
  },
  async ({ city, units }) => {
    // Your implementation here
    const temp = units === "celsius" ? "22C" : "72F";
    return {
      content: [
        {
          type: "text",
          text: `Weather in ${city}: ${temp}, partly cloudy`,
        },
      ],
    };
  }
);

// Define a resource
server.resource(
  "config",
  "config://app",
  async (uri) => ({
    contents: [
      {
        uri: uri.href,
        mimeType: "application/json",
        text: JSON.stringify({ version: "1.0", debug: false }),
      },
    ],
  })
);

// Connect via stdio transport
const transport = new StdioServerTransport();
await server.connect(transport);
```

Register the server in your client configuration, using tsx to run the TypeScript file directly:

```json
{
  "mcpServers": {
    "my-custom-server": {
      "command": "npx",
      "args": ["tsx", "server.ts"]
    }
  }
}
```

How does MCP compare to other approaches for giving AI models access to tools?
The key differentiator: MCP is the only approach that is simultaneously an open standard, vendor-neutral, backed by major industry players through the Linux Foundation, and supported by a rich ecosystem of community-built servers. OpenAI Function Calling is powerful but locked to OpenAI models. ChatGPT Plugins were deprecated. LangChain Tools are flexible but framework-specific. MCP works everywhere.
From GitHub to PostgreSQL to Stripe—the MCP ecosystem has exploded. Here are the major categories and notable servers driving adoption.
Directories like mcp.so, glama.ai, and the official MCP Server Registry catalog thousands of servers. The MCP Dev Summit on April 2-3, 2026 in New York City will bring the community together for the first major in-person gathering.
What is the Model Context Protocol?
MCP is an open standard that provides a universal protocol for connecting AI models to external tools, data sources, and services. Think of it as USB-C for AI: one standardized connection that works across all AI assistants. Originally created by Anthropic, it was donated to the Agentic AI Foundation under the Linux Foundation on December 9, 2025.

How many MCP servers exist?
As of early 2026, there are over 10,000 published MCP servers. They span every category imaginable: developer tools (GitHub, GitLab), databases (PostgreSQL, MongoDB), cloud services (AWS, Cloudflare), APIs (Stripe, Twilio), productivity apps (Notion, Slack), and infrastructure management (PROXIES.SX).

Which AI clients support MCP?
Major MCP-compatible clients include Claude Desktop, Claude Code (CLI), Cursor, Cline, Windsurf, Continue, and increasingly OpenAI-compatible agents. Since MCP is now a Linux Foundation standard, adoption is accelerating across the entire AI ecosystem.

What is the Agentic AI Foundation?
The AAIF was formed under the Linux Foundation on December 9, 2025, when Anthropic donated the Model Context Protocol. Its platinum members are AWS, Anthropic, Block, Bloomberg, Cloudflare, Google, Microsoft, and OpenAI. The foundation governs the MCP specification and fosters the ecosystem.

Do I need programming knowledge to use MCP?
No. Using an existing MCP server only requires editing a JSON configuration file to tell your AI client where to find the server. For example, adding the PROXIES.SX MCP server to Claude Desktop takes a single JSON block. Building your own server does require programming knowledge (TypeScript or Python).

What is the difference between tools, resources, and prompts?
Tools are functions the AI can call (like creating a proxy or querying a database). Resources are read-only data the AI can access (like configuration files or documentation). Prompts are reusable instruction templates that help the AI use the server more effectively.

What does the PROXIES.SX MCP server do?
It provides 42 tools across 10 categories (Account, Ports, Status, Rotation, Billing, Reference, Utilities, Payments, Support Tickets, x402 Sessions) that let AI agents manage mobile proxy infrastructure through natural language. Install it with: npx -y @proxies-sx/mcp-server

When is the MCP Dev Summit?
The MCP Dev Summit is scheduled for April 2-3, 2026, in New York City. It is the first major in-person gathering for MCP server developers, AI client builders, and the broader agentic AI community.
Whether you want to use existing servers or build your own, MCP is the standard for connecting AI agents to the real world. The PROXIES.SX MCP server is a great place to start.
npx -y @proxies-sx/mcp-server

PROXIES.SX Team
Building AI-native proxy infrastructure