## Summary

<img width="1000" alt="image" src="https://github.com/user-attachments/assets/18e8ef34-d222-453c-8b0a-1b25ef8cf806" />
<img width="250" alt="image" src="https://github.com/user-attachments/assets/ba97556c-09c5-4f76-9f4e-49a2e8e57468" />
<img width="250" alt="image" src="https://github.com/user-attachments/assets/68f7804a-fe74-442d-9849-39a229c052cf" />
<img width="250" alt="image" src="https://github.com/user-attachments/assets/700690ba-f9fe-4726-8871-3bfbab586001" />

Full-stack MCP (Model Context Protocol) tool block integration that allows users to connect to any MCP server, discover available tools, authenticate via OAuth, and execute tools — all through the standard AutoGPT credential system.

### Backend

- **MCPToolBlock** (`blocks/mcp/block.py`): New block using the `CredentialsMetaInput` pattern with optional credentials (`default={}`), supporting both authenticated (OAuth) and public MCP servers. Includes an auto-lookup fallback for backward compatibility.
- **MCP Client** (`blocks/mcp/client.py`): HTTP transport with JSON-RPC 2.0, tool discovery, and tool execution with robust error handling (type-checked error fields, non-JSON response handling).
- **MCP OAuth Handler** (`blocks/mcp/oauth.py`): RFC 8414 discovery, dynamic per-server OAuth with PKCE, token storage and refresh via `raise_for_status=True`.
- **MCP API Routes** (`api/features/mcp/routes.py`): `discover-tools`, `oauth/login`, and `oauth/callback` endpoints with credential cleanup and defensive OAuth metadata validation.
- **Credential system integration**:
  - A `CredentialsMetaInput` model_validator normalizes the legacy `"ProviderName.MCP"` format produced by Python 3.13's `str(StrEnum)` change.
  - `CredentialsFieldInfo.combine()` supports URL-based credential discrimination (each MCP server gets its own credential entry).
  - `aggregate_credentials_inputs` checks block schema defaults for credential optionality.
  - The executor normalizes credential data for both the Pydantic and JSON schema validation paths.
  - Chat credential matching handles MCP server URL filtering.
  - The `provider_matches()` helper is used consistently for Python 3.13 StrEnum compatibility.
- **Pre-run validation**: `_validate_graph_get_errors` now calls `get_missing_input()` for custom block-level validation (MCP tool arguments).
- **Security**: HTML tag stripping loop to prevent XSS bypass; SSRF protection (removed trusted_origins).

### Frontend

- **MCPToolDialog** (`MCPToolDialog.tsx`): Full tool discovery UI — enter a server URL, authenticate if needed, browse tools, then select and configure a tool.
- **OAuth popup** (`oauth-popup.ts`): Shared utility supporting cross-origin MCP OAuth flows with BroadcastChannel plus a localStorage fallback (see the sketch after this summary).
- **Credential integration**: MCP-specific OAuth flow in `useCredentialsInput`, server URL filtering in `useCredentials`, and an MCP callback page.
- **CredentialsSelect**: Auto-selects the first available credential instead of defaulting to "None"; credentials are listed before "None" in the dropdown.
- **Node rendering**: Dynamic tool input schema rendering on MCP nodes, handled properly in both the legacy and new flow editors.
- **Block title persistence**: `customized_name` is set at block creation for both MCP and Agent blocks — no fallback logic needed, and titles reliably survive save/load.
- **Stable credential ordering**: Removed `sortByUnsetFirst`, which caused credential inputs to jump when selected.

### Tests (~2060 lines)

- Unit tests: block, client, tool execution
- Integration tests: mock MCP server with auth
- OAuth flow tests
- API endpoint tests
- Credential combining/optionality tests
- E2E tests (skipped in CI, run manually)

## Key Design Decisions

1. **Optional credentials via `default={}`**: MCP servers can be public (no auth) or private (OAuth). The `credentials` field has `default={}`, making it optional at the schema level, so public servers work without prompting for credentials.
2. **URL-based credential discrimination**: Each MCP server URL gets its own credential entry in the "Run agent" form (via `discriminator="server_url"`), so agents using multiple MCP servers prompt for each independently.
3. **Model-level normalization**: Python 3.13 changed `str(StrEnum)` to return `"ClassName.MEMBER"`. Rather than scattering fixes across the codebase, a Pydantic `model_validator(mode="before")` on `CredentialsMetaInput` handles normalization centrally, and `provider_matches()` handles lookups.
4. **Credential auto-select**: The `CredentialsSelect` component defaults to the first available credential and notifies the parent state, ensuring credentials are pre-filled in the "Run agent" dialog without requiring manual selection.
5. **`customized_name` for block titles**: Both MCP and Agent blocks set `customized_name` in metadata at creation time. This eliminates convoluted runtime fallback logic (`agent_name`, hostname extraction) — the title is persisted once and read directly.

## Test plan

- [x] Unit/integration tests pass (68 MCP + 11 graph = 79 tests)
- [x] Manual: MCP block with a public server (DeepWiki) — no credentials needed, tools discovered and executable
- [x] Manual: MCP block with an OAuth server (Linear, Sentry) — OAuth flow prompts correctly
- [x] Manual: "Run agent" form shows the correct credential requirements per MCP server
- [x] Manual: Credential auto-selects when exactly one matches, pre-selects the first when multiple exist
- [x] Manual: Credential ordering stays stable when selecting/deselecting
- [x] Manual: MCP block title persists after save and refresh
- [x] Manual: Agent block title persists after save and refresh (via `customized_name`)
- [ ] Manual: Shared agent with MCP block prompts a new user for credentials

---------

Co-authored-by: Otto <otto@agpt.co>
Co-authored-by: Ubbe <hi@ubbe.dev>
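The `oauth-popup.ts` utility is described above only in prose, so here is a minimal TypeScript sketch of the general BroadcastChannel-plus-localStorage-fallback pattern it refers to. Everything in it (the function names, channel name, storage key, and `OAuthResult` shape) is a hypothetical illustration, not the repository's actual API:

```ts
// Hypothetical sketch of an OAuth popup result channel. The OAuth provider is
// cross-origin, but the popup finishes on the app's own callback page, which
// can then signal the opener via same-origin mechanisms.
type OAuthResult = { state: string; code?: string; error?: string };

const CHANNEL = "mcp-oauth";            // hypothetical channel name
const STORAGE_KEY = "mcp-oauth-result"; // hypothetical localStorage key

// Opener side: resolve once the callback page reports a result for our state.
export function waitForOAuthResult(expectedState: string): Promise<OAuthResult> {
  return new Promise((resolve) => {
    const bc = "BroadcastChannel" in window ? new BroadcastChannel(CHANNEL) : null;

    function finish(result: OAuthResult): void {
      if (result.state !== expectedState) return; // ignore unrelated flows
      bc?.close();
      window.removeEventListener("storage", onStorage);
      resolve(result);
    }

    // Fallback path: a localStorage write in the callback page fires a
    // "storage" event in every other same-origin tab/window, including here.
    function onStorage(e: StorageEvent): void {
      if (e.key === STORAGE_KEY && e.newValue) finish(JSON.parse(e.newValue));
    }

    // Primary path: a BroadcastChannel message from the callback page.
    if (bc) bc.onmessage = (e: MessageEvent<OAuthResult>) => finish(e.data);
    window.addEventListener("storage", onStorage);
  });
}

// Callback-page side: publish through both channels so at least one arrives.
export function publishOAuthResult(result: OAuthResult): void {
  if ("BroadcastChannel" in window) {
    new BroadcastChannel(CHANNEL).postMessage(result);
  }
  localStorage.setItem(STORAGE_KEY, JSON.stringify(result));
}
```

BroadcastChannel is the tidier mechanism, but the "storage" event covers browsers and contexts where the channel is unavailable, which is presumably why the shared utility carries both.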
# AutoGPT Platform
Welcome to the AutoGPT Platform - a powerful system for creating and running AI agents to solve business problems. This platform enables you to harness the power of artificial intelligence to automate tasks, analyze data, and generate insights for your organization.
## Getting Started
### Prerequisites
- Docker
- Docker Compose V2 (comes with Docker Desktop, or can be installed separately)
### Running the System
To run the AutoGPT Platform, follow these steps:
1. Clone this repository to your local machine and navigate to the `autogpt_platform` directory within the repository:

   ```shell
   git clone https://github.com/Significant-Gravitas/AutoGPT.git
   # or, via SSH:
   # git clone git@github.com:Significant-Gravitas/AutoGPT.git
   cd AutoGPT/autogpt_platform
   ```

2. Run the following command:

   ```shell
   cp .env.default .env
   ```

   This command will copy the `.env.default` file to `.env`. You can modify the `.env` file to add your own environment variables.

3. Run the following command:

   ```shell
   docker compose up -d
   ```

   This command will start all the necessary backend services defined in the `docker-compose.yml` file in detached mode.

4. Once all the services are ready, open your browser and navigate to `http://localhost:3000` to access the AutoGPT Platform frontend.
### Running Just Core Services

You can run the following commands to start and manage just the core services:
```shell
# For help
make help

# Run just Supabase + Redis + RabbitMQ
make start-core

# Stop core services
make stop-core

# View logs from core services
make logs-core

# Run formatting and linting for backend and frontend
make format

# Run migrations for backend database
make migrate

# Run backend server
make run-backend

# Run frontend development server
make run-frontend
```
## Docker Compose Commands
Here are some useful Docker Compose commands for managing your AutoGPT Platform:
- `docker compose up -d`: Start the services in detached mode.
- `docker compose stop`: Stop the running services without removing them.
- `docker compose rm`: Remove stopped service containers.
- `docker compose build`: Build or rebuild services.
- `docker compose down`: Stop and remove containers, networks, and volumes.
- `docker compose watch`: Watch for changes in your services and automatically update them.
## Sample Scenarios
Here are some common scenarios where you might use multiple Docker Compose commands:
1. Updating and restarting a specific service:

   ```shell
   docker compose build api_srv
   docker compose up -d --no-deps api_srv
   ```

   This rebuilds the `api_srv` service and restarts it without affecting other services.

2. Viewing logs for troubleshooting:

   ```shell
   docker compose logs -f api_srv ws_srv
   ```

   This shows and follows the logs for both the `api_srv` and `ws_srv` services.

3. Scaling a service for increased load:

   ```shell
   docker compose up -d --scale executor=3
   ```

   This scales the `executor` service to 3 instances to handle increased load.

4. Stopping the entire system for maintenance:

   ```shell
   docker compose stop
   docker compose rm -f
   docker compose pull
   docker compose up -d
   ```

   This stops all services, removes containers, pulls the latest images, and restarts the system.

5. Developing with live updates:

   ```shell
   docker compose watch
   ```

   This watches for changes in your code and automatically updates the relevant services.

6. Checking the status of services:

   ```shell
   docker compose ps
   ```

   This shows the current status of all services defined in your `docker-compose.yml` file.
These scenarios demonstrate how to use Docker Compose commands in combination to manage your AutoGPT Platform effectively.
## Persisting Data

To persist data for PostgreSQL and Redis, you can modify the `docker-compose.yml` file to add volumes. Here's how:
1. Open the `docker-compose.yml` file in a text editor.

2. Add volume configurations for the PostgreSQL and Redis services:

   ```yaml
   services:
     postgres:
       # ... other configurations ...
       volumes:
         - postgres_data:/var/lib/postgresql/data

     redis:
       # ... other configurations ...
       volumes:
         - redis_data:/data

   volumes:
     postgres_data:
     redis_data:
   ```

3. Save the file and run `docker compose up -d` to apply the changes.
This configuration will create named volumes for PostgreSQL and Redis, ensuring that your data persists across container restarts.
## API Client Generation
The platform includes scripts for generating and managing the API client:
- `pnpm fetch:openapi`: Fetches the OpenAPI specification from the backend service (requires the backend to be running on port 8006)
- `pnpm generate:api-client`: Generates the TypeScript API client from the OpenAPI specification using Orval
- `pnpm generate:api`: Runs both the fetch and generate commands in sequence
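Orval is typically driven by an `orval.config.ts` file. The sketch below shows the general shape of such a config; the project key, file paths, and output options here are assumptions for illustration, not the repository's actual settings:

```ts
// orval.config.ts - illustrative sketch only; the real paths and options
// in this repository may differ.
import { defineConfig } from "orval";

export default defineConfig({
  backend: {
    // Spec fetched by `pnpm fetch:openapi` (hypothetical location).
    input: "./openapi.json",
    output: {
      target: "./src/api/__generated__", // hypothetical output directory
      client: "react-query",             // generate typed hooks over the REST calls
      mode: "tags-split",                // split output into one file per OpenAPI tag
    },
  },
});
```

With a config along these lines, `pnpm generate:api-client` would simply run Orval against the previously fetched specification.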
### Manual API Client Updates
If you need to update the API client after making changes to the backend API:
1. Ensure the backend services are running:

   ```shell
   docker compose up -d
   ```

2. Generate the updated API client:

   ```shell
   pnpm generate:api
   ```
This will fetch the latest OpenAPI specification and regenerate the TypeScript client code.