mcp-agent uses YAML configuration files to manage application settings, MCP servers, and model providers.

Configuration files

Start with two YAML files at the root of your project:

mcp_agent.config.yaml

Application configuration, MCP servers, logging, execution engine, model defaults

mcp_agent.secrets.yaml

API keys, OAuth credentials, and other secrets (gitignored)
See Specify Secrets for credential management patterns and production tips.
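The secrets file mirrors the provider keys from the main config. A minimal sketch, assuming the same nesting the Model Providers section uses below (the anthropic entry is an assumption; include only the providers you actually use):

mcp_agent.secrets.yaml
openai:
  api_key: "sk-..."        # matches the Model Providers example below
anthropic:
  api_key: "sk-ant-..."    # assumed layout; see the Configuration Reference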

Basic configuration

Here’s a minimal configuration:
execution_engine: asyncio
logger:
  transports: [console]
  level: info

mcp:
  servers:
    fetch:
      command: "uvx"
      args: ["mcp-server-fetch"]

openai:
  default_model: gpt-4o

Execution Engine

Choose how your workflows execute:
  • asyncio
  • Temporal (see the sketch at the end of this section)
asyncio runs workflows in-memory, which suits local development and simple deployments:
execution_engine: asyncio
Best for:
  • Local development
  • Simple agents
  • Quick prototyping
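Temporal-backed execution is configured in the same file. A minimal sketch, assuming the namespace and task_queue field names (temporal.host also appears in the Environment Variables section below):
execution_engine: temporal

temporal:
  host: "localhost:7233"    # Temporal server endpoint
  namespace: "default"      # assumed field name; check the Configuration Reference
  task_queue: "mcp-agent"   # assumed field name; check the Configuration Reference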
Learn more about Execution Engines →

Logging

Configure logging output and level:
mcp_agent.config.yaml
logger:
  transports: [console, file]  # Output to console and file
  level: info  # debug, info, warning, error
  path: "logs/mcp-agent.jsonl"  # For file transport
You can also use dynamic log filenames:
logger:
  transports: [file]
  level: debug
  path_settings:
    path_pattern: "logs/mcp-agent-{unique_id}.jsonl"
    unique_id: "timestamp"  # Or "session_id"
    timestamp_format: "%Y%m%d_%H%M%S"
Learn more about Logging →

MCP Servers

Define MCP servers your agents can connect to:
mcp_agent.config.yaml
mcp:
  servers:
    fetch:
      command: "uvx"
      args: ["mcp-server-fetch"]
      description: "Fetch web content"

    filesystem:
      command: "npx"
      args: ["-y", "@modelcontextprotocol/server-filesystem", "."]
      description: "Local filesystem access"

    sqlite:
      command: "uvx"
      args: ["mcp-server-sqlite", "--db-path", "data.db"]
      description: "SQLite database operations"
Learn more about MCP Servers →

Model Providers

Configure your LLM provider. Many examples follow this layout; the basic finder agent, for instance, sets OpenAI defaults exactly this way. Supported providers include:
  • OpenAI
  • Anthropic
  • Azure OpenAI
  • AWS Bedrock
mcp_agent.config.yaml
openai:
  default_model: gpt-4o
  temperature: 0.7
  max_tokens: 4096
mcp_agent.secrets.yaml
openai:
  api_key: "sk-..."

OAuth configuration

Two places control OAuth behaviour:
  1. Global OAuth settings (settings.oauth) configure token storage and callback behaviour (loopback ports, preload timeouts, Redis support).
  2. Per-server auth (mcp.servers[].auth.oauth) specifies client credentials, scopes, and provider overrides.
mcp_agent.config.yaml
oauth:
  token_store:
    backend: redis
    redis_url: ${OAUTH_REDIS_URL}

mcp:
  servers:
    github:
      command: "uvx"
      args: ["mcp-server-github"]
      auth:
        oauth:
          enabled: true
          client_id: ${GITHUB_CLIENT_ID}
          client_secret: ${GITHUB_CLIENT_SECRET}
          redirect_uri_options:
            - "http://127.0.0.1:33418/callback"
          include_resource_parameter: false
Pair this with secrets in mcp_agent.secrets.yaml or environment variables. For concrete walkthroughs, study the OAuth basic agent and the interactive OAuth tool. The pre-authorize workflow example shows how to seed credentials before a background workflow runs.
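One way to supply those credentials without committing them is to mirror the server's nesting in the secrets file, which is merged on top of the main config (a sketch; values are placeholders):
mcp_agent.secrets.yaml
mcp:
  servers:
    github:
      auth:
        oauth:
          client_id: "..."
          client_secret: "..."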

Programmatic configuration

You can bypass file discovery by passing a fully-formed Settings object (or a path) to MCPApp. This is especially useful for tests and scripts that compose configuration dynamically.
from mcp_agent.app import MCPApp
from mcp_agent.config import Settings, OpenAISettings

settings = Settings(
    execution_engine="asyncio",
    openai=OpenAISettings(
        default_model="gpt-4o-mini",
        temperature=0.3,
    ),
)

app = MCPApp(name="dynamic", settings=settings)
Because Settings extends BaseSettings, environment variables still override any fields you set explicitly.
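As noted above, a plain path works too: MCPApp(name="dynamic", settings="mcp_agent.config.yaml") loads that file directly instead of running discovery.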

Configuration discovery

When MCPApp starts, it resolves settings in this order:
  • MCP_APP_SETTINGS_PRELOAD / MCP_APP_SETTINGS_PRELOAD_STRICT
  • Explicit settings argument passed to MCPApp
  • mcp_agent.config.yaml (or mcp-agent.config.yaml) discovered in the working directory, parent directories, .mcp-agent/ folders, or ~/.mcp-agent/
  • mcp_agent.secrets.yaml / mcp-agent.secrets.yaml merged on top
  • Environment variables (including values from .env, using __ for nesting)
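For example, OPENAI__DEFAULT_MODEL=gpt-4o-mini in the environment would override openai.default_model from the config file.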
Environment variables override file-based values, while the preload option short-circuits everything else—handy for containerised deployments that mount secrets from a vault. Specify Secrets covers strategies for each stage.

Environment Variables

You can reference environment variables in configuration:
mcp_agent.config.yaml
openai:
  default_model: ${OPENAI_MODEL:-gpt-4o}  # Default to gpt-4o

temporal:
  host: ${TEMPORAL_HOST:-localhost:7233}
Use environment variables for deployment-specific settings like endpoints and regions, while keeping model choices in the config file.

Project Structure

Recommended project layout:
your-project/
├── agent.py                  # Your agent code
├── mcp_agent.config.yaml     # Application configuration
├── mcp_agent.secrets.yaml    # API keys (gitignored)
├── .gitignore                # Ignore secrets file
├── requirements.txt          # Python dependencies
└── logs/                     # Execution logs
Add to .gitignore:
mcp_agent.secrets.yaml
logs/
*.log

Complete Configuration Reference

For all available configuration options, see the Configuration Reference.

Next Steps
