mcp-agent is built on top of the Model Context Protocol (MCP). Agents connect to MCP servers to gain tools, data, prompts, and filesystem-style access. If you are new to the protocol, start with the official MCP introduction; this page shows how MCP fits into mcp-agent.

MCP primitives at a glance

Tools

Functions exposed by MCP servers—use agent.call_tool or let an AugmentedLLM invoke them during generation.

Resources

Structured content retrievable via URIs (agent.list_resources, agent.read_resource).

Prompts

Parameterised templates listed with agent.list_prompts and fetched via agent.get_prompt.

Roots

Named filesystem locations agents can browse; list with agent.list_roots.

Elicitation

Servers can pause a tool to request structured user input; see the elicitation example under examples/mcp.

Sampling

Some servers provide LLM completion endpoints; try the sampling demo in examples/mcp.
The Supported Capabilities page covers each primitive in depth.

Example-driven overview

The quickest way to learn is to run the projects in examples/mcp:
Example                     Focus                                                      Transport
mcp_streamable_http         Connect to a remote HTTP MCP server, streaming responses   streamable_http
mcp_sse                     Subscribe to an SSE MCP server                             sse
mcp_websockets              Bi-directional WebSocket communication                     websocket
mcp_prompts_and_resources   List and consume prompts/resources                         stdio
mcp_roots                   Browse server roots (filesystem access)                    stdio
mcp_elicitation             Handle elicitation (interactive prompts)                   stdio
Each example includes a minimal server configuration and client code that connects via gen_client.

Configuring servers

Add servers to mcp_agent.config.yaml:
mcp:
  servers:
    filesystem:
      command: "npx"
      args: ["-y", "@modelcontextprotocol/server-filesystem", "/data"]

    docs_api:
      transport: "streamable_http"
      url: "https://api.example.com/mcp"
      headers:
        Authorization: "Bearer ${DOCS_API_TOKEN}"
Store secrets in mcp_agent.secrets.yaml, environment variables, or preload settings (see Specify Secrets).
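The `${DOCS_API_TOKEN}` placeholder above is resolved from the environment at load time. As an illustration only (this is a sketch of the substitution pattern, not mcp-agent's actual config loader), the behavior is roughly:

```python
import os
import re

# Matches ${VAR_NAME} placeholders in a config value.
_PLACEHOLDER = re.compile(r"\$\{(\w+)\}")

def substitute_env(value: str) -> str:
    """Replace each ${VAR} with os.environ[VAR]; unset vars are left intact."""
    return _PLACEHOLDER.sub(
        lambda m: os.environ.get(m.group(1), m.group(0)), value
    )

os.environ["DOCS_API_TOKEN"] = "secret-123"
print(substitute_env("Bearer ${DOCS_API_TOKEN}"))  # Bearer secret-123
```

Keeping tokens out of the YAML itself means the config file can be committed while secrets stay in the environment or in mcp_agent.secrets.yaml.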

Using MCP capabilities from an agent

from mcp_agent.agents.agent import Agent

agent = Agent(
    name="mcp_demo",
    instruction="Use all available MCP capabilities.",
    server_names=["filesystem", "docs_api"],
)

async with agent:
    tools = await agent.list_tools()
    resources = await agent.list_resources()
    prompts = await agent.list_prompts()
    roots = await agent.list_roots()

    print("Tools:", [t.name for t in tools.tools])
    print("Resources:", [r.uri for r in resources.resources])
Common API calls:
  • await agent.call_tool("tool_name", arguments={...})
  • await agent.read_resource(uri)
  • await agent.get_prompt(name, arguments)
  • await agent.list_roots()
AugmentedLLMs inherit these capabilities automatically.
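On the wire, each of these helpers corresponds to a JSON-RPC 2.0 request defined by the MCP specification; `agent.call_tool`, for instance, issues a `tools/call` request with the tool name and arguments as params. A dependency-free sketch of the envelope (the tool name and arguments here are placeholders, not part of any real server):

```python
import json

def jsonrpc_request(method: str, params: dict, request_id: int = 1) -> str:
    """Build the JSON-RPC 2.0 envelope MCP uses for requests."""
    return json.dumps(
        {"jsonrpc": "2.0", "id": request_id, "method": method, "params": params}
    )

# agent.call_tool("read_file", {"path": "/data/config.json"}) corresponds to:
body = jsonrpc_request(
    "tools/call",
    {"name": "read_file", "arguments": {"path": "/data/config.json"}},
)
print(body)
```

The transport (stdio, SSE, WebSocket, or streamable HTTP) only changes how this payload is carried, not its shape.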

Lightweight MCP client (gen_client)

import asyncio

from mcp_agent.app import MCPApp
from mcp_agent.mcp.gen_client import gen_client

app = MCPApp(name="mcp_client_demo")

async def main():
    async with app.run():
        async with gen_client("filesystem", app.server_registry, context=app.context) as session:
            tools = await session.list_tools()
            print("Tools:", [t.name for t in tools.tools])

if __name__ == "__main__":
    asyncio.run(main())
For persistent connections or aggregators, see Connecting to MCP Servers.

Detailed reference

Transport configurations

STDIO transport, best for local subprocess servers:
mcp:
  servers:
    filesystem:
      transport: "stdio"
      command: "npx"
      args: ["-y", "@modelcontextprotocol/server-filesystem"]
SSE transport, ideal for streaming responses and near-real-time updates:
mcp:
  servers:
    sse_server:
      transport: "sse"
      url: "http://localhost:8000/sse"
      headers:
        Authorization: "Bearer ${SSE_TOKEN}"
WebSocket transport, for bi-directional, persistent connections:
mcp:
  servers:
    websocket_server:
      transport: "websocket"
      url: "ws://localhost:8001/ws"
Streamable HTTP transport, for HTTP servers with streaming support:
mcp:
  servers:
    http_server:
      transport: "streamable_http"
      url: "https://api.example.com/mcp"
      headers:
        Authorization: "Bearer ${API_TOKEN}"

Build a minimal MCP server

demo_server.py
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("Resource Demo MCP Server")

@mcp.resource("demo://docs/readme")
def get_readme():
    """Provide the README file content."""
    return "# Demo Resource Server\n\nThis is a sample README resource."

@mcp.prompt()
def echo(message: str) -> str:
    """Echo the provided message."""
    return f"Prompt: {message}"

if __name__ == "__main__":
    mcp.run()

Agent configuration for that server

mcp_agent.config.yaml
execution_engine: asyncio

mcp:
  servers:
    demo:
      command: "python"
      args: ["demo_server.py"]

openai:
  default_model: "gpt-4o-mini"

Using tools, resources, prompts, and roots

# Tools
result = await agent.call_tool("read_file", {"path": "/data/config.json"})

# Resources
resource = await agent.read_resource("file:///data/report.pdf")

# Prompts
prompt = await agent.get_prompt("code_review", {"language": "python", "file": "main.py"})

# Roots
roots = await agent.list_roots()

Elicitation example

from mcp.server.fastmcp import FastMCP, Context
from mcp.server.elicitation import (
    AcceptedElicitation,
    DeclinedElicitation,
    CancelledElicitation,
)
from pydantic import BaseModel, Field

mcp = FastMCP("Interactive Server")

@mcp.tool()
async def deploy_application(app_name: str, environment: str, ctx: Context) -> str:
    class DeploymentConfirmation(BaseModel):
        confirm: bool = Field(description="Confirm deployment?")
        notify_team: bool = Field(default=False)
        message: str = Field(default="")

    result = await ctx.elicit(
        message=f"Confirm deployment of {app_name} to {environment}?",
        schema=DeploymentConfirmation,
    )

    match result:
        case AcceptedElicitation(data=data):
            return "Deployed" if data.confirm else "Deployment cancelled"
        case DeclinedElicitation():
            return "Deployment declined"
        case CancelledElicitation():
            return "Deployment cancelled"

Capability matrix

Primitive     Status (STDIO, SSE, WebSocket, HTTP)
Tools         Fully supported
Resources     Fully supported
Prompts       Fully supported
Roots         Fully supported
Elicitation   Fully supported
Sampling      Supported via examples