What MCP Actually Does

Model Context Protocol is an open standard that lets AI assistants call external tools, read files, and hit APIs – all through a single, consistent interface. Instead of every AI client inventing its own plugin system, MCP gives you one protocol that works across Claude Desktop, VS Code, Cursor, ChatGPT, and others.

MCP servers expose three types of capabilities:

  • Tools – functions the model can call (with user approval), like querying a database or calling an API
  • Resources – read-only data the model can pull in, like file contents or API responses
  • Prompts – pre-built templates that guide the model through specific tasks

The protocol uses JSON-RPC over stdio or HTTP. Your server is just a process that reads JSON-RPC messages and responds. The Python SDK (v1.x stable, v2 in pre-alpha) and TypeScript SDK (v1.26.0) handle all the protocol plumbing.
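
To make that concrete, here is roughly what one exchange looks like on the wire. This is an illustrative sketch rather than a capture from a real client: the method name tools/list comes from the MCP spec, the framing is standard JSON-RPC 2.0, and example_tool is a placeholder.

The client sends:

{"jsonrpc": "2.0", "id": 1, "method": "tools/list"}

Your server replies:

{
  "jsonrpc": "2.0",
  "id": 1,
  "result": {
    "tools": [
      {
        "name": "example_tool",
        "description": "A placeholder tool",
        "inputSchema": {"type": "object", "properties": {}}
      }
    ]
  }
}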

Build a Python MCP Server in 5 Minutes

You need Python 3.10+ and uv (Astral’s fast package manager). If you don’t have uv:

curl -LsSf https://astral.sh/uv/install.sh | sh

Set up the project:

uv init my-mcp-server
cd my-mcp-server
uv venv
source .venv/bin/activate
uv add "mcp[cli]" httpx

Now create server.py. This server exposes a tool that fetches the top Hacker News stories:

from typing import Any
import httpx
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("hackernews")

HN_API = "https://hacker-news.firebaseio.com/v0"


async def fetch_json(url: str) -> Any:
    async with httpx.AsyncClient() as client:
        resp = await client.get(url, timeout=10.0)
        resp.raise_for_status()
        return resp.json()


@mcp.tool()
async def top_stories(count: int = 5) -> str:
    """Fetch the top Hacker News stories.

    Args:
        count: Number of stories to return (default 5, max 30)
    """
    count = min(count, 30)
    story_ids = await fetch_json(f"{HN_API}/topstories.json")
    stories = []
    for sid in story_ids[:count]:
        item = await fetch_json(f"{HN_API}/item/{sid}.json")
        title = item.get("title", "No title")
        url = item.get("url", f"https://news.ycombinator.com/item?id={sid}")
        score = item.get("score", 0)
        stories.append(f"- [{title}]({url}) ({score} points)")
    return "\n".join(stories)


@mcp.tool()
async def story_comments(story_id: int, limit: int = 5) -> str:
    """Fetch top comments for a Hacker News story.

    Args:
        story_id: The Hacker News story ID
        limit: Max number of comments to return (default 5)
    """
    item = await fetch_json(f"{HN_API}/item/{story_id}.json")
    kids = item.get("kids", [])[:limit]
    if not kids:
        return "No comments on this story."
    comments = []
    for cid in kids:
        comment = await fetch_json(f"{HN_API}/item/{cid}.json")
        text = comment.get("text", "")[:300]
        author = comment.get("by", "anonymous")
        comments.append(f"**{author}:** {text}")
    return "\n\n".join(comments)


if __name__ == "__main__":
    mcp.run(transport="stdio")

The FastMCP class reads your type hints and docstrings to auto-generate the tool schemas that clients need. No manual JSON Schema definitions required.
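
For reference, the listing entry FastMCP generates for top_stories looks roughly like this. Treat it as a sketch: the exact fields vary by SDK version, but the name comes from the function, the description from the docstring, and the input schema from the type hints and defaults.

{
  "name": "top_stories",
  "description": "Fetch the top Hacker News stories. ...",
  "inputSchema": {
    "type": "object",
    "properties": {
      "count": {"type": "integer", "default": 5}
    }
  }
}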

Run it to verify there are no import errors:

uv run server.py

The process will sit there waiting for JSON-RPC input on stdin. That’s correct – kill it with Ctrl+C.
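
If you want to see it answer before wiring up a client, you can pipe a single initialize request into stdin. This is a rough smoke test, not an official workflow, and the protocolVersion and clientInfo values here are placeholders:

echo '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2025-06-18","capabilities":{},"clientInfo":{"name":"smoke-test","version":"0.0.1"}}}' | uv run server.py

You should see a JSON-RPC result containing the server's name and capabilities on stdout before the process exits when stdin closes.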

Connect It to Claude Desktop

Open your Claude Desktop config file:

  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows: %APPDATA%\Claude\claude_desktop_config.json

Create the file if it doesn’t exist, and add your server:

{
  "mcpServers": {
    "hackernews": {
      "command": "uv",
      "args": [
        "--directory",
        "/absolute/path/to/my-mcp-server",
        "run",
        "server.py"
      ]
    }
  }
}

Two things to get right here: the path must be absolute, and the command might need to be the full path to uv (run which uv to find it).

Quit Claude Desktop completely (Cmd+Q on macOS, not just closing the window) and reopen it. You should see a connector icon indicating available MCP tools.

Ask Claude: “What are the top stories on Hacker News right now?” It will call your top_stories tool and return real results.

Add Resources for Static Data

Tools are for actions. Resources are for data you want Claude to be able to read. Here’s how to expose a local config file as a resource:

import json

@mcp.resource("config://app-settings")
def get_app_settings() -> str:
    """Return the current application settings."""
    with open("settings.json", "r") as f:
        return json.dumps(json.load(f), indent=2)

Resources use URI schemes. When a client requests config://app-settings, your function runs and returns the data. This is useful for giving Claude access to project configuration, database schemas, or documentation without the model needing to call a tool.
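
Resource URIs can also be parameterized: FastMCP treats placeholders in the URI as function arguments, so one handler can serve a whole family of resources. A minimal sketch, assuming a local docs/ directory of Markdown files (the docs:// scheme and file layout are made up for illustration):

@mcp.resource("docs://{page}")
def get_doc(page: str) -> str:
    """Return a documentation page by name."""
    # Each docs://<page> request maps to docs/<page>.md on disk.
    with open(f"docs/{page}.md", "r") as f:
        return f.read()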

Debugging with MCP Inspector

The MCP Inspector is a standalone tool for testing servers without launching a full client. Install and run it:

npx @modelcontextprotocol/inspector uv run server.py

This opens a web UI where you can list available tools, call them with test inputs, and see the raw JSON-RPC messages. It’s the fastest way to iterate when you’re building a server.

Common Errors and Fixes

“Could not attach to MCP server” in Claude Desktop. This almost always means Claude can’t find or execute the command. Check these in order:

  1. Is the path in claude_desktop_config.json absolute? Relative paths silently fail.
  2. Can Claude find uv? Run which uv and use that full path as the command value.
  3. Is your JSON valid? A trailing comma or missing quote will break the entire config. Run it through python3 -m json.tool claude_desktop_config.json to validate.

spawn uv ENOENT in the logs. Claude Desktop can’t find the uv binary because desktop apps don’t inherit your shell’s PATH. Fix it by using the absolute path to uv as the command:

{
  "mcpServers": {
    "hackernews": {
      "command": "/home/youruser/.local/bin/uv",
      "args": ["--directory", "/absolute/path/to/my-mcp-server", "run", "server.py"]
    }
  }
}

Server starts but tools don’t appear. Your server is probably writing to stdout accidentally. In MCP stdio servers, stdout is reserved for JSON-RPC messages. Any stray print() call corrupts the protocol stream. Use print(..., file=sys.stderr) or the logging module (which defaults to stderr).
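
A safe pattern is to route all diagnostics through logging configured for stderr, so nothing ever lands on the protocol stream:

import logging
import sys

# Send log output to stderr; stdout stays reserved for JSON-RPC messages.
logging.basicConfig(stream=sys.stderr, level=logging.INFO)
logger = logging.getLogger("hackernews")

logger.info("server starting")   # safe
# print("server starting")       # unsafe in a stdio server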

“Connection timeout” or tools hang. Check that your tool functions actually return. If an HTTP request inside a tool hangs, the whole server stalls. Always set explicit timeouts on network calls:

async with httpx.AsyncClient() as client:
    resp = await client.get(url, timeout=10.0)  # Don't leave this open-ended
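
For a harder guarantee, you can also cap the whole operation with asyncio.wait_for so one slow dependency can’t leave the tool call hanging. A sketch; the 30-second budget is arbitrary:

import asyncio

async def bounded(coro, seconds: float = 30.0):
    """Await a coroutine with a hard overall deadline."""
    try:
        return await asyncio.wait_for(coro, timeout=seconds)
    except asyncio.TimeoutError:
        return "Request timed out."

# Inside a tool: stories = await bounded(fetch_json(f"{HN_API}/topstories.json"))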

Check the logs. Claude Desktop writes MCP logs to ~/Library/Logs/Claude/ on macOS. Look at mcp.log for connection issues and mcp-server-hackernews.log for your server’s stderr output.

MCP Apps: Interactive UIs in Chat

The newest MCP feature (January 2026) is MCP Apps – an official extension that lets tools return interactive UI components. Instead of dumping raw text, your tool can render dashboards, forms, and visualizations directly in the conversation.

Tools include a _meta.ui.resourceUri field pointing to an HTML resource served via the ui:// scheme. The host renders it in a sandboxed iframe with bidirectional JSON-RPC communication. This works today in Claude, VS Code Insiders, and Goose.
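
Pieced together from that description, a tool definition might carry something like the following. This is a paraphrase of the prose above, not spec-exact JSON: only the _meta.ui.resourceUri field and the ui:// scheme come from the description, everything else is a placeholder, so check the MCP Apps extension spec before depending on it.

{
  "name": "show_dashboard",
  "description": "Render a monitoring dashboard",
  "_meta": {
    "ui": {
      "resourceUri": "ui://hackernews/dashboard.html"
    }
  }
}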

This is still early, but the practical upshot: tools that return data (like database queries or monitoring dashboards) can now show interactive tables, charts, and filters instead of making the user ask follow-up questions to drill down.

Where MCP Is Headed

The protocol spec (currently version 2025-11-25) is actively evolving. The team is working on async operation support for long-running tasks, stateless scaling for enterprise deployments, and .well-known URLs for server discovery. SDK downloads hit 97 million per month, with over 10,000 active servers in the ecosystem.

The important thing: MCP is the standard now. Claude, ChatGPT, Gemini, Copilot, and VS Code all support it. If you’re building any kind of AI integration, building an MCP server is the way to make it work everywhere.