Compare commits


2 Commits

Author SHA1 Message Date
Bentlybro
e934df3c0c fix: address code review feedback
- Add 'text' language identifier to code blocks (MD040)
- Add VAULT_ENC_KEY generation command (openssl rand -hex 16)
- Fix DB_HOST default to 'localhost' (not 'db')
- Add info box clarifying port numbers are internal Docker ports
- Update OAuth callback URL to not include port by default
- Clarify Docker service names are internal container DNS
2026-02-16 12:10:09 +00:00
Bentlybro
8d557d33e1 docs: add deployment environment variables guide
Closes #10961, Closes OPEN-2715

Documents all environment variables that must be configured when deploying
AutoGPT to a new server:

- Quick reference table of critical URLs that must change
- Configuration file locations and loading order
- Security keys that must be regenerated (with generation commands)
- Database, Redis, RabbitMQ configuration
- Default ports for all services
- OAuth callback URLs for all supported providers
- Full deployment checklist
- Docker vs external services guidance
2026-02-16 11:59:34 +00:00
18 changed files with 606 additions and 733 deletions

View File

@@ -41,18 +41,13 @@ jobs:
ports:
- 6379:6379
rabbitmq:
image: rabbitmq:4.1.4
image: rabbitmq:3.12-management
ports:
- 5672:5672
- 15672:15672
env:
RABBITMQ_DEFAULT_USER: ${{ env.RABBITMQ_DEFAULT_USER }}
RABBITMQ_DEFAULT_PASS: ${{ env.RABBITMQ_DEFAULT_PASS }}
options: >-
--health-cmd "rabbitmq-diagnostics -q ping"
--health-interval 30s
--health-timeout 10s
--health-retries 5
--health-start-period 10s
clamav:
image: clamav/clamav-debian:latest
ports:

View File

@@ -6,16 +6,10 @@ on:
paths:
- ".github/workflows/platform-frontend-ci.yml"
- "autogpt_platform/frontend/**"
- "autogpt_platform/backend/Dockerfile"
- "autogpt_platform/docker-compose.yml"
- "autogpt_platform/docker-compose.platform.yml"
pull_request:
paths:
- ".github/workflows/platform-frontend-ci.yml"
- "autogpt_platform/frontend/**"
- "autogpt_platform/backend/Dockerfile"
- "autogpt_platform/docker-compose.yml"
- "autogpt_platform/docker-compose.platform.yml"
merge_group:
workflow_dispatch:

View File

@@ -53,6 +53,63 @@ COPY autogpt_platform/backend/backend/data/partial_types.py ./backend/data/parti
COPY autogpt_platform/backend/gen_prisma_types_stub.py ./
RUN poetry run prisma generate && poetry run gen-prisma-stub
# ============================== BACKEND SERVER ============================== #
FROM debian:13-slim AS server
WORKDIR /app
ENV POETRY_HOME=/opt/poetry \
POETRY_NO_INTERACTION=1 \
POETRY_VIRTUALENVS_CREATE=true \
POETRY_VIRTUALENVS_IN_PROJECT=true \
DEBIAN_FRONTEND=noninteractive
ENV PATH=/opt/poetry/bin:$PATH
# Install Python, FFmpeg, ImageMagick, and CLI tools for agent use.
# bubblewrap provides OS-level sandbox (whitelist-only FS + no network)
# for the bash_exec MCP tool.
# Using --no-install-recommends saves ~650MB by skipping unnecessary deps like llvm, mesa, etc.
RUN apt-get update && apt-get install -y --no-install-recommends \
python3.13 \
python3-pip \
ffmpeg \
imagemagick \
jq \
ripgrep \
tree \
bubblewrap \
&& rm -rf /var/lib/apt/lists/*
COPY --from=builder /usr/local/lib/python3* /usr/local/lib/python3*
COPY --from=builder /usr/local/bin/poetry /usr/local/bin/poetry
# Copy Node.js installation for Prisma
COPY --from=builder /usr/bin/node /usr/bin/node
COPY --from=builder /usr/lib/node_modules /usr/lib/node_modules
COPY --from=builder /usr/bin/npm /usr/bin/npm
COPY --from=builder /usr/bin/npx /usr/bin/npx
COPY --from=builder /root/.cache/prisma-python/binaries /root/.cache/prisma-python/binaries
WORKDIR /app/autogpt_platform/backend
# Copy only the .venv from builder (not the entire /app directory)
# The .venv includes the generated Prisma client
COPY --from=builder /app/autogpt_platform/backend/.venv ./.venv
ENV PATH="/app/autogpt_platform/backend/.venv/bin:$PATH"
# Copy dependency files + autogpt_libs (path dependency)
COPY autogpt_platform/autogpt_libs /app/autogpt_platform/autogpt_libs
COPY autogpt_platform/backend/poetry.lock autogpt_platform/backend/pyproject.toml ./
# Copy backend code + docs (for Copilot docs search)
COPY autogpt_platform/backend ./
COPY docs /app/docs
RUN poetry install --no-ansi --only-root
ENV PORT=8000
CMD ["poetry", "run", "rest"]
# =============================== DB MIGRATOR =============================== #
# Lightweight migrate stage - only needs Prisma CLI, not full Python environment
@@ -84,59 +141,3 @@ COPY autogpt_platform/backend/schema.prisma ./
COPY autogpt_platform/backend/backend/data/partial_types.py ./backend/data/partial_types.py
COPY autogpt_platform/backend/gen_prisma_types_stub.py ./
COPY autogpt_platform/backend/migrations ./migrations
# ============================== BACKEND SERVER ============================== #
FROM debian:13-slim AS server
WORKDIR /app
ENV DEBIAN_FRONTEND=noninteractive
# Install Python, FFmpeg, ImageMagick, and CLI tools for agent use.
# bubblewrap provides OS-level sandbox (whitelist-only FS + no network)
# for the bash_exec MCP tool.
# Using --no-install-recommends saves ~650MB by skipping unnecessary deps like llvm, mesa, etc.
RUN apt-get update && apt-get install -y --no-install-recommends \
python3.13 \
python3-pip \
ffmpeg \
imagemagick \
jq \
ripgrep \
tree \
bubblewrap \
&& rm -rf /var/lib/apt/lists/*
# Copy poetry (build-time only, for `poetry install --only-root` to create entry points)
COPY --from=builder /usr/local/lib/python3* /usr/local/lib/python3*
COPY --from=builder /usr/local/bin/poetry /usr/local/bin/poetry
# Copy Node.js installation for Prisma
COPY --from=builder /usr/bin/node /usr/bin/node
COPY --from=builder /usr/lib/node_modules /usr/lib/node_modules
COPY --from=builder /usr/bin/npm /usr/bin/npm
COPY --from=builder /usr/bin/npx /usr/bin/npx
COPY --from=builder /root/.cache/prisma-python/binaries /root/.cache/prisma-python/binaries
WORKDIR /app/autogpt_platform/backend
# Copy only the .venv from builder (not the entire /app directory)
# The .venv includes the generated Prisma client
COPY --from=builder /app/autogpt_platform/backend/.venv ./.venv
ENV PATH="/app/autogpt_platform/backend/.venv/bin:$PATH"
# Copy dependency files + autogpt_libs (path dependency)
COPY autogpt_platform/autogpt_libs /app/autogpt_platform/autogpt_libs
COPY autogpt_platform/backend/poetry.lock autogpt_platform/backend/pyproject.toml ./
# Copy backend code + docs (for Copilot docs search)
COPY autogpt_platform/backend ./
COPY docs /app/docs
# Install the project package to create entry point scripts in .venv/bin/
# (e.g., rest, executor, ws, db, scheduler, notification - see [tool.poetry.scripts])
RUN POETRY_VIRTUALENVS_CREATE=true POETRY_VIRTUALENVS_IN_PROJECT=true \
poetry install --no-ansi --only-root
ENV PORT=8000
CMD ["rest"]

View File

@@ -23,7 +23,6 @@ from .model import (
ChatSession,
append_and_save_message,
create_chat_session,
delete_chat_session,
get_chat_session,
get_user_sessions,
)
@@ -212,43 +211,6 @@ async def create_session(
)
@router.delete(
"/sessions/{session_id}",
dependencies=[Security(auth.requires_user)],
status_code=204,
responses={404: {"description": "Session not found or access denied"}},
)
async def delete_session(
session_id: str,
user_id: Annotated[str, Security(auth.get_user_id)],
) -> Response:
"""
Delete a chat session.
Permanently removes a chat session and all its messages.
Only the owner can delete their sessions.
Args:
session_id: The session ID to delete.
user_id: The authenticated user's ID.
Returns:
204 No Content on success.
Raises:
HTTPException: 404 if session not found or not owned by user.
"""
deleted = await delete_chat_session(session_id, user_id)
if not deleted:
raise HTTPException(
status_code=404,
detail=f"Session {session_id} not found or access denied",
)
return Response(status_code=204)
@router.get(
"/sessions/{session_id}",
)

View File

@@ -5,7 +5,7 @@ import re
from datetime import datetime, timedelta, timezone
from typing import Any
from pydantic import BaseModel, Field, field_validator
from pydantic import BaseModel, field_validator
from backend.api.features.chat.model import ChatSession
from backend.api.features.library import db as library_db
@@ -14,7 +14,6 @@ from backend.data import execution as execution_db
from backend.data.execution import ExecutionStatus, GraphExecution, GraphExecutionMeta
from .base import BaseTool
from .execution_utils import TERMINAL_STATUSES, wait_for_execution
from .models import (
AgentOutputResponse,
ErrorResponse,
@@ -35,7 +34,6 @@ class AgentOutputInput(BaseModel):
store_slug: str = ""
execution_id: str = ""
run_time: str = "latest"
wait_if_running: int = Field(default=0, ge=0, le=300)
@field_validator(
"agent_name",
@@ -119,11 +117,6 @@ class AgentOutputTool(BaseTool):
Select which run to retrieve using:
- execution_id: Specific execution ID
- run_time: 'latest' (default), 'yesterday', 'last week', or ISO date 'YYYY-MM-DD'
Wait for completion (optional):
- wait_if_running: Max seconds to wait if execution is still running (0-300).
If the execution is running/queued, waits up to this many seconds for it to complete.
Returns current status on timeout. If already finished, returns immediately.
"""
@property
@@ -153,13 +146,6 @@ class AgentOutputTool(BaseTool):
"Time filter: 'latest', 'yesterday', 'last week', or 'YYYY-MM-DD'"
),
},
"wait_if_running": {
"type": "integer",
"description": (
"Max seconds to wait if execution is still running (0-300). "
"If running, waits for completion. Returns current state on timeout."
),
},
},
"required": [],
}
@@ -237,14 +223,10 @@ class AgentOutputTool(BaseTool):
execution_id: str | None,
time_start: datetime | None,
time_end: datetime | None,
include_running: bool = False,
) -> tuple[GraphExecution | None, list[GraphExecutionMeta], str | None]:
"""
Fetch execution(s) based on filters.
Returns (single_execution, available_executions_meta, error_message).
Args:
include_running: If True, also look for running/queued executions (for waiting)
"""
# If specific execution_id provided, fetch it directly
if execution_id:
@@ -257,22 +239,11 @@ class AgentOutputTool(BaseTool):
return None, [], f"Execution '{execution_id}' not found"
return execution, [], None
# Determine which statuses to query
statuses = [ExecutionStatus.COMPLETED]
if include_running:
statuses.extend(
[
ExecutionStatus.RUNNING,
ExecutionStatus.QUEUED,
ExecutionStatus.INCOMPLETE,
]
)
# Get executions with time filters
# Get completed executions with time filters
executions = await execution_db.get_graph_executions(
graph_id=graph_id,
user_id=user_id,
statuses=statuses,
statuses=[ExecutionStatus.COMPLETED],
created_time_gte=time_start,
created_time_lte=time_end,
limit=10,
@@ -339,28 +310,10 @@ class AgentOutputTool(BaseTool):
for e in available_executions[:5]
]
# Build appropriate message based on execution status
if execution.status == ExecutionStatus.COMPLETED:
message = f"Found execution outputs for agent '{agent.name}'"
elif execution.status == ExecutionStatus.FAILED:
message = f"Execution for agent '{agent.name}' failed"
elif execution.status == ExecutionStatus.TERMINATED:
message = f"Execution for agent '{agent.name}' was terminated"
elif execution.status in (
ExecutionStatus.RUNNING,
ExecutionStatus.QUEUED,
ExecutionStatus.INCOMPLETE,
):
message = (
f"Execution for agent '{agent.name}' is still {execution.status.value}. "
"Results may be incomplete. Use wait_if_running to wait for completion."
)
else:
message = f"Found execution for agent '{agent.name}' (status: {execution.status.value})"
message = f"Found execution outputs for agent '{agent.name}'"
if len(available_executions) > 1:
message += (
f" Showing latest of {len(available_executions)} matching executions."
f". Showing latest of {len(available_executions)} matching executions."
)
return AgentOutputResponse(
@@ -475,17 +428,13 @@ class AgentOutputTool(BaseTool):
# Parse time expression
time_start, time_end = parse_time_expression(input_data.run_time)
# Check if we should wait for running executions
wait_timeout = input_data.wait_if_running
# Fetch execution(s) - include running if we're going to wait
# Fetch execution(s)
execution, available_executions, exec_error = await self._get_execution(
user_id=user_id,
graph_id=agent.graph_id,
execution_id=input_data.execution_id or None,
time_start=time_start,
time_end=time_end,
include_running=wait_timeout > 0,
)
if exec_error:
@@ -494,17 +443,4 @@ class AgentOutputTool(BaseTool):
session_id=session_id,
)
# If we have an execution that's still running and we should wait
if execution and wait_timeout > 0 and execution.status not in TERMINAL_STATUSES:
logger.info(
f"Execution {execution.id} is {execution.status}, "
f"waiting up to {wait_timeout}s for completion"
)
execution = await wait_for_execution(
user_id=user_id,
graph_id=agent.graph_id,
execution_id=execution.id,
timeout_seconds=wait_timeout,
)
return self._build_response(agent, execution, available_executions, session_id)

View File

@@ -1,124 +0,0 @@
"""Shared utilities for execution waiting and status handling."""
import asyncio
import logging
from typing import Any
from backend.data import execution as execution_db
from backend.data.execution import (
AsyncRedisExecutionEventBus,
ExecutionStatus,
GraphExecution,
GraphExecutionEvent,
)
logger = logging.getLogger(__name__)
# Terminal statuses that indicate execution is complete
TERMINAL_STATUSES = frozenset(
{
ExecutionStatus.COMPLETED,
ExecutionStatus.FAILED,
ExecutionStatus.TERMINATED,
}
)
async def wait_for_execution(
user_id: str,
graph_id: str,
execution_id: str,
timeout_seconds: int,
) -> GraphExecution | None:
"""
Wait for an execution to reach a terminal status using Redis pubsub.
Uses asyncio.wait_for to ensure timeout is respected even when no events
are received.
Args:
user_id: User ID
graph_id: Graph ID
execution_id: Execution ID to wait for
timeout_seconds: Max seconds to wait
Returns:
The execution with current status, or None if not found
"""
# First check current status - maybe it's already done
execution = await execution_db.get_graph_execution(
user_id=user_id,
execution_id=execution_id,
include_node_executions=False,
)
if not execution:
return None
# If already in terminal state, return immediately
if execution.status in TERMINAL_STATUSES:
logger.debug(
f"Execution {execution_id} already in terminal state: {execution.status}"
)
return execution
logger.info(
f"Waiting up to {timeout_seconds}s for execution {execution_id} "
f"(current status: {execution.status})"
)
# Subscribe to execution updates via Redis pubsub
event_bus = AsyncRedisExecutionEventBus()
channel_key = f"{user_id}/{graph_id}/{execution_id}"
try:
# Use wait_for to enforce timeout on the entire listen operation
result = await asyncio.wait_for(
_listen_for_terminal_status(event_bus, channel_key, user_id, execution_id),
timeout=timeout_seconds,
)
return result
except asyncio.TimeoutError:
logger.info(f"Timeout waiting for execution {execution_id}")
except Exception as e:
logger.error(f"Error waiting for execution: {e}", exc_info=True)
# Return current state on timeout/error
return await execution_db.get_graph_execution(
user_id=user_id,
execution_id=execution_id,
include_node_executions=False,
)
async def _listen_for_terminal_status(
event_bus: AsyncRedisExecutionEventBus,
channel_key: str,
user_id: str,
execution_id: str,
) -> GraphExecution | None:
"""
Listen for execution events until a terminal status is reached.
This is a helper that gets wrapped in asyncio.wait_for for timeout handling.
"""
async for event in event_bus.listen_events(channel_key):
# Only process GraphExecutionEvents (not NodeExecutionEvents)
if isinstance(event, GraphExecutionEvent):
logger.debug(f"Received execution update: {event.status}")
if event.status in TERMINAL_STATUSES:
# Fetch full execution with outputs
return await execution_db.get_graph_execution(
user_id=user_id,
execution_id=execution_id,
include_node_executions=False,
)
# Should not reach here normally (generator should yield indefinitely)
return None
def get_execution_outputs(execution: GraphExecution | None) -> dict[str, Any] | None:
"""Extract outputs from an execution, or return None."""
if execution is None:
return None
return execution.outputs

View File

@@ -192,7 +192,6 @@ class ExecutionStartedResponse(ToolResponseBase):
library_agent_id: str | None = None
library_agent_link: str | None = None
status: str = "QUEUED"
outputs: dict[str, Any] | None = None # Populated when wait_for_result is used
# Auth/error models

View File

@@ -12,7 +12,6 @@ from backend.api.features.chat.tracking import (
track_agent_scheduled,
)
from backend.api.features.library import db as library_db
from backend.data.execution import ExecutionStatus
from backend.data.graph import GraphModel
from backend.data.model import CredentialsMetaInput
from backend.data.user import get_user_by_id
@@ -25,7 +24,6 @@ from backend.util.timezone_utils import (
)
from .base import BaseTool
from .execution_utils import get_execution_outputs, wait_for_execution
from .helpers import get_inputs_from_schema
from .models import (
AgentDetails,
@@ -72,7 +70,6 @@ class RunAgentInput(BaseModel):
schedule_name: str = ""
cron: str = ""
timezone: str = "UTC"
wait_for_result: int = Field(default=0, ge=0, le=300)
@field_validator(
"username_agent_slug",
@@ -154,14 +151,6 @@ class RunAgentTool(BaseTool):
"type": "string",
"description": "IANA timezone for schedule (default: UTC)",
},
"wait_for_result": {
"type": "integer",
"description": (
"Max seconds to wait for execution to complete (0-300). "
"If >0, blocks until the execution finishes or times out. "
"Returns execution outputs when complete."
),
},
},
"required": [],
}
@@ -358,7 +347,6 @@ class RunAgentTool(BaseTool):
graph=graph,
graph_credentials=graph_credentials,
inputs=params.inputs,
wait_for_result=params.wait_for_result,
)
except NotFoundError as e:
@@ -442,9 +430,8 @@ class RunAgentTool(BaseTool):
graph: GraphModel,
graph_credentials: dict[str, CredentialsMetaInput],
inputs: dict[str, Any],
wait_for_result: int = 0,
) -> ToolResponseBase:
"""Execute an agent immediately, optionally waiting for completion."""
"""Execute an agent immediately."""
session_id = session.session_id
# Check rate limits
@@ -481,60 +468,6 @@ class RunAgentTool(BaseTool):
)
library_agent_link = f"/library/agents/{library_agent.id}"
# If wait_for_result is specified, wait for execution to complete
if wait_for_result > 0:
logger.info(
f"Waiting up to {wait_for_result}s for execution {execution.id}"
)
result = await wait_for_execution(
user_id=user_id,
graph_id=library_agent.graph_id,
execution_id=execution.id,
timeout_seconds=wait_for_result,
)
final_status = result.status if result else ExecutionStatus.FAILED
outputs = get_execution_outputs(result)
# Build message based on final status
if final_status == ExecutionStatus.COMPLETED:
message = (
f"Agent '{library_agent.name}' execution completed successfully. "
f"{MSG_DO_NOT_RUN_AGAIN}"
)
elif final_status == ExecutionStatus.FAILED:
message = (
f"Agent '{library_agent.name}' execution failed. "
f"View details at {library_agent_link}. "
f"{MSG_DO_NOT_RUN_AGAIN}"
)
elif final_status == ExecutionStatus.TERMINATED:
message = (
f"Agent '{library_agent.name}' execution was terminated. "
f"View details at {library_agent_link}. "
f"{MSG_DO_NOT_RUN_AGAIN}"
)
else:
message = (
f"Agent '{library_agent.name}' execution is still {final_status.value} "
f"(timed out after {wait_for_result}s). "
f"View at {library_agent_link}. "
f"{MSG_DO_NOT_RUN_AGAIN}"
)
return ExecutionStartedResponse(
message=message,
session_id=session_id,
execution_id=execution.id,
graph_id=library_agent.graph_id,
graph_name=library_agent.name,
library_agent_id=library_agent.id,
library_agent_link=library_agent_link,
status=final_status.value,
outputs=outputs,
)
# Default: return immediately without waiting
return ExecutionStartedResponse(
message=(
f"Agent '{library_agent.name}' execution started successfully. "

View File

@@ -53,7 +53,7 @@ services:
rabbitmq:
<<: *agpt-services
image: rabbitmq:4.1.4
image: rabbitmq:management
container_name: rabbitmq
healthcheck:
test: rabbitmq-diagnostics -q ping
@@ -66,6 +66,7 @@ services:
- RABBITMQ_DEFAULT_PASS=k0VMxyIJF9S35f3x2uaw5IWAl6Y536O7
ports:
- "5672:5672"
- "15672:15672"
clamav:
image: clamav/clamav-debian:latest
ports:

View File

@@ -75,7 +75,7 @@ services:
timeout: 5s
retries: 5
rabbitmq:
image: rabbitmq:4.1.4
image: rabbitmq:management
container_name: rabbitmq
healthcheck:
test: rabbitmq-diagnostics -q ping
@@ -88,13 +88,14 @@ services:
<<: *backend-env
ports:
- "5672:5672"
- "15672:15672"
rest_server:
build:
context: ../
dockerfile: autogpt_platform/backend/Dockerfile
target: server
command: ["rest"] # points to entry in [tool.poetry.scripts] in pyproject.toml
command: ["python", "-m", "backend.rest"]
develop:
watch:
- path: ./
@@ -127,7 +128,7 @@ services:
context: ../
dockerfile: autogpt_platform/backend/Dockerfile
target: server
command: ["executor"] # points to entry in [tool.poetry.scripts] in pyproject.toml
command: ["python", "-m", "backend.exec"]
develop:
watch:
- path: ./
@@ -162,7 +163,7 @@ services:
context: ../
dockerfile: autogpt_platform/backend/Dockerfile
target: server
command: ["ws"] # points to entry in [tool.poetry.scripts] in pyproject.toml
command: ["python", "-m", "backend.ws"]
develop:
watch:
- path: ./
@@ -195,7 +196,7 @@ services:
context: ../
dockerfile: autogpt_platform/backend/Dockerfile
target: server
command: ["db"] # points to entry in [tool.poetry.scripts] in pyproject.toml
command: ["python", "-m", "backend.db"]
develop:
watch:
- path: ./
@@ -224,7 +225,7 @@ services:
context: ../
dockerfile: autogpt_platform/backend/Dockerfile
target: server
command: ["scheduler"] # points to entry in [tool.poetry.scripts] in pyproject.toml
command: ["python", "-m", "backend.scheduler"]
develop:
watch:
- path: ./
@@ -272,7 +273,7 @@ services:
context: ../
dockerfile: autogpt_platform/backend/Dockerfile
target: server
command: ["notification"] # points to entry in [tool.poetry.scripts] in pyproject.toml
command: ["python", "-m", "backend.notification"]
develop:
watch:
- path: ./

View File

@@ -1,8 +1,6 @@
"use client";
import { SidebarProvider } from "@/components/ui/sidebar";
// TODO: Replace with modern Dialog component when available
import DeleteConfirmDialog from "@/components/__legacy__/delete-confirm-dialog";
import { ChatContainer } from "./components/ChatContainer/ChatContainer";
import { ChatSidebar } from "./components/ChatSidebar/ChatSidebar";
import { MobileDrawer } from "./components/MobileDrawer/MobileDrawer";
@@ -33,12 +31,6 @@ export function CopilotPage() {
handleDrawerOpenChange,
handleSelectSession,
handleNewChat,
// Delete functionality
sessionToDelete,
isDeleting,
handleDeleteClick,
handleConfirmDelete,
handleCancelDelete,
} = useCopilotPage();
if (isUserLoading || !isLoggedIn) {
@@ -56,19 +48,7 @@ export function CopilotPage() {
>
{!isMobile && <ChatSidebar />}
<div className="relative flex h-full w-full flex-col overflow-hidden bg-[#f8f8f9] px-0">
{isMobile && (
<MobileHeader
onOpenDrawer={handleOpenDrawer}
showDelete={!!sessionId}
isDeleting={isDeleting}
onDelete={() => {
const session = sessions.find((s) => s.id === sessionId);
if (session) {
handleDeleteClick(session.id, session.title);
}
}}
/>
)}
{isMobile && <MobileHeader onOpenDrawer={handleOpenDrawer} />}
<div className="flex-1 overflow-hidden">
<ChatContainer
messages={messages}
@@ -95,16 +75,6 @@ export function CopilotPage() {
onOpenChange={handleDrawerOpenChange}
/>
)}
{/* Delete confirmation dialog - rendered at top level for proper z-index on mobile */}
{isMobile && (
<DeleteConfirmDialog
entityType="chat"
entityName={sessionToDelete?.title || "Untitled chat"}
open={!!sessionToDelete}
onOpenChange={(open) => !open && handleCancelDelete()}
onDoDelete={handleConfirmDelete}
/>
)}
</SidebarProvider>
);
}

View File

@@ -1,15 +1,8 @@
"use client";
import {
getGetV2ListSessionsQueryKey,
useDeleteV2DeleteSession,
useGetV2ListSessions,
} from "@/app/api/__generated__/endpoints/chat/chat";
import { useGetV2ListSessions } from "@/app/api/__generated__/endpoints/chat/chat";
import { Button } from "@/components/atoms/Button/Button";
import { LoadingSpinner } from "@/components/atoms/LoadingSpinner/LoadingSpinner";
import { Text } from "@/components/atoms/Text/Text";
import { toast } from "@/components/molecules/Toast/use-toast";
// TODO: Replace with modern Dialog component when available
import DeleteConfirmDialog from "@/components/__legacy__/delete-confirm-dialog";
import {
Sidebar,
SidebarContent,
@@ -19,52 +12,18 @@ import {
useSidebar,
} from "@/components/ui/sidebar";
import { cn } from "@/lib/utils";
import { PlusCircleIcon, PlusIcon, TrashIcon } from "@phosphor-icons/react";
import { useQueryClient } from "@tanstack/react-query";
import { PlusCircleIcon, PlusIcon } from "@phosphor-icons/react";
import { motion } from "framer-motion";
import { useState } from "react";
import { parseAsString, useQueryState } from "nuqs";
export function ChatSidebar() {
const { state } = useSidebar();
const isCollapsed = state === "collapsed";
const [sessionId, setSessionId] = useQueryState("sessionId", parseAsString);
const [sessionToDelete, setSessionToDelete] = useState<{
id: string;
title: string | null | undefined;
} | null>(null);
const queryClient = useQueryClient();
const { data: sessionsResponse, isLoading: isLoadingSessions } =
useGetV2ListSessions({ limit: 50 });
const { mutate: deleteSession, isPending: isDeleting } =
useDeleteV2DeleteSession({
mutation: {
onSuccess: () => {
// Invalidate sessions list to refetch
queryClient.invalidateQueries({
queryKey: getGetV2ListSessionsQueryKey(),
});
// If we deleted the current session, clear selection
if (sessionToDelete?.id === sessionId) {
setSessionId(null);
}
setSessionToDelete(null);
},
onError: (error) => {
toast({
title: "Failed to delete chat",
description:
error instanceof Error ? error.message : "An error occurred",
variant: "destructive",
});
setSessionToDelete(null);
},
},
});
const sessions =
sessionsResponse?.status === 200 ? sessionsResponse.data.sessions : [];
@@ -76,22 +35,6 @@ export function ChatSidebar() {
setSessionId(id);
}
function handleDeleteClick(
e: React.MouseEvent,
id: string,
title: string | null | undefined,
) {
e.stopPropagation(); // Prevent session selection
if (isDeleting) return; // Prevent double-click during deletion
setSessionToDelete({ id, title });
}
function handleConfirmDelete() {
if (sessionToDelete) {
deleteSession({ sessionId: sessionToDelete.id });
}
}
function formatDate(dateString: string) {
const date = new Date(dateString);
const now = new Date();
@@ -118,152 +61,128 @@ export function ChatSidebar() {
}
return (
<>
<Sidebar
variant="inset"
collapsible="icon"
className="!top-[50px] !h-[calc(100vh-50px)] border-r border-zinc-100 px-0"
>
{isCollapsed && (
<SidebarHeader
className={cn(
"flex",
isCollapsed
? "flex-row items-center justify-between gap-y-4 md:flex-col md:items-start md:justify-start"
: "flex-row items-center justify-between",
)}
<Sidebar
variant="inset"
collapsible="icon"
className="!top-[50px] !h-[calc(100vh-50px)] border-r border-zinc-100 px-0"
>
{isCollapsed && (
<SidebarHeader
className={cn(
"flex",
isCollapsed
? "flex-row items-center justify-between gap-y-4 md:flex-col md:items-start md:justify-start"
: "flex-row items-center justify-between",
)}
>
<motion.div
key={isCollapsed ? "header-collapsed" : "header-expanded"}
className="flex flex-col items-center gap-3 pt-4"
initial={{ opacity: 0, filter: "blur(3px)" }}
animate={{ opacity: 1, filter: "blur(0px)" }}
transition={{ type: "spring", bounce: 0.2 }}
>
<motion.div
key={isCollapsed ? "header-collapsed" : "header-expanded"}
className="flex flex-col items-center gap-3 pt-4"
initial={{ opacity: 0, filter: "blur(3px)" }}
animate={{ opacity: 1, filter: "blur(0px)" }}
transition={{ type: "spring", bounce: 0.2 }}
>
<div className="flex flex-col items-center gap-2">
<SidebarTrigger />
<Button
variant="ghost"
onClick={handleNewChat}
style={{ minWidth: "auto", width: "auto" }}
>
<PlusCircleIcon className="!size-5" />
<span className="sr-only">New Chat</span>
</Button>
</div>
</motion.div>
</SidebarHeader>
)}
<SidebarContent className="gap-4 overflow-y-auto px-4 py-4 [-ms-overflow-style:none] [scrollbar-width:none] [&::-webkit-scrollbar]:hidden">
{!isCollapsed && (
<motion.div
initial={{ opacity: 0 }}
animate={{ opacity: 1 }}
transition={{ duration: 0.2, delay: 0.1 }}
className="flex items-center justify-between px-3"
>
<Text variant="h3" size="body-medium">
Your chats
</Text>
<div className="relative left-6">
<SidebarTrigger />
</div>
</motion.div>
)}
{!isCollapsed && (
<motion.div
initial={{ opacity: 0 }}
animate={{ opacity: 1 }}
transition={{ duration: 0.2, delay: 0.15 }}
className="mt-4 flex flex-col gap-1"
>
{isLoadingSessions ? (
<div className="flex min-h-[30rem] items-center justify-center py-4">
<LoadingSpinner size="small" className="text-neutral-600" />
</div>
) : sessions.length === 0 ? (
<p className="py-4 text-center text-sm text-neutral-500">
No conversations yet
</p>
) : (
sessions.map((session) => (
<div
key={session.id}
className={cn(
"group relative w-full rounded-lg transition-colors",
session.id === sessionId
? "bg-zinc-100"
: "hover:bg-zinc-50",
)}
>
<button
onClick={() => handleSelectSession(session.id)}
className="w-full px-3 py-2.5 pr-10 text-left"
>
<div className="flex min-w-0 max-w-full flex-col overflow-hidden">
<div className="min-w-0 max-w-full">
<Text
variant="body"
className={cn(
"truncate font-normal",
session.id === sessionId
? "text-zinc-600"
: "text-zinc-800",
)}
>
{session.title || `Untitled chat`}
</Text>
</div>
<Text variant="small" className="text-neutral-400">
{formatDate(session.updated_at)}
</Text>
</div>
</button>
<button
onClick={(e) =>
handleDeleteClick(e, session.id, session.title)
}
disabled={isDeleting}
className="absolute right-2 top-1/2 -translate-y-1/2 rounded p-1.5 text-zinc-400 opacity-0 transition-all group-hover:opacity-100 hover:bg-red-100 hover:text-red-600 focus-visible:opacity-100 disabled:cursor-not-allowed disabled:opacity-50"
aria-label="Delete chat"
>
<TrashIcon className="h-4 w-4" />
</button>
</div>
))
)}
</motion.div>
)}
</SidebarContent>
{!isCollapsed && sessionId && (
<SidebarFooter className="shrink-0 bg-zinc-50 p-3 pb-1 shadow-[0_-4px_6px_-1px_rgba(0,0,0,0.05)]">
<motion.div
initial={{ opacity: 0 }}
animate={{ opacity: 1 }}
transition={{ duration: 0.2, delay: 0.2 }}
>
<div className="flex flex-col items-center gap-2">
<SidebarTrigger />
<Button
variant="primary"
size="small"
variant="ghost"
onClick={handleNewChat}
className="w-full"
leftIcon={<PlusIcon className="h-4 w-4" weight="bold" />}
style={{ minWidth: "auto", width: "auto" }}
>
New Chat
<PlusCircleIcon className="!size-5" />
<span className="sr-only">New Chat</span>
</Button>
</motion.div>
</SidebarFooter>
</div>
</motion.div>
</SidebarHeader>
)}
<SidebarContent className="gap-4 overflow-y-auto px-4 py-4 [-ms-overflow-style:none] [scrollbar-width:none] [&::-webkit-scrollbar]:hidden">
{!isCollapsed && (
<motion.div
initial={{ opacity: 0 }}
animate={{ opacity: 1 }}
transition={{ duration: 0.2, delay: 0.1 }}
className="flex items-center justify-between px-3"
>
<Text variant="h3" size="body-medium">
Your chats
</Text>
<div className="relative left-6">
<SidebarTrigger />
</div>
</motion.div>
)}
</Sidebar>
<DeleteConfirmDialog
entityType="chat"
entityName={sessionToDelete?.title || "Untitled chat"}
open={!!sessionToDelete}
onOpenChange={(open) => !open && setSessionToDelete(null)}
onDoDelete={handleConfirmDelete}
/>
</>
{!isCollapsed && (
<motion.div
initial={{ opacity: 0 }}
animate={{ opacity: 1 }}
transition={{ duration: 0.2, delay: 0.15 }}
className="mt-4 flex flex-col gap-1"
>
{isLoadingSessions ? (
<div className="flex min-h-[30rem] items-center justify-center py-4">
<LoadingSpinner size="small" className="text-neutral-600" />
</div>
) : sessions.length === 0 ? (
<p className="py-4 text-center text-sm text-neutral-500">
No conversations yet
</p>
) : (
sessions.map((session) => (
<button
key={session.id}
onClick={() => handleSelectSession(session.id)}
className={cn(
"w-full rounded-lg px-3 py-2.5 text-left transition-colors",
session.id === sessionId
? "bg-zinc-100"
: "hover:bg-zinc-50",
)}
>
<div className="flex min-w-0 max-w-full flex-col overflow-hidden">
<div className="min-w-0 max-w-full">
<Text
variant="body"
className={cn(
"truncate font-normal",
session.id === sessionId
? "text-zinc-600"
: "text-zinc-800",
)}
>
{session.title || `Untitled chat`}
</Text>
</div>
<Text variant="small" className="text-neutral-400">
{formatDate(session.updated_at)}
</Text>
</div>
</button>
))
)}
</motion.div>
)}
</SidebarContent>
{!isCollapsed && sessionId && (
<SidebarFooter className="shrink-0 bg-zinc-50 p-3 pb-1 shadow-[0_-4px_6px_-1px_rgba(0,0,0,0.05)]">
<motion.div
initial={{ opacity: 0 }}
animate={{ opacity: 1 }}
transition={{ duration: 0.2, delay: 0.2 }}
>
<Button
variant="primary"
size="small"
onClick={handleNewChat}
className="w-full"
leftIcon={<PlusIcon className="h-4 w-4" weight="bold" />}
>
New Chat
</Button>
</motion.div>
</SidebarFooter>
)}
</Sidebar>
);
}

View File

@@ -1,46 +1,22 @@
import { Button } from "@/components/atoms/Button/Button";
import { NAVBAR_HEIGHT_PX } from "@/lib/constants";
import { ListIcon, TrashIcon } from "@phosphor-icons/react";
import { ListIcon } from "@phosphor-icons/react";
interface Props {
onOpenDrawer: () => void;
showDelete?: boolean;
isDeleting?: boolean;
onDelete?: () => void;
}
export function MobileHeader({
onOpenDrawer,
showDelete,
isDeleting,
onDelete,
}: Props) {
export function MobileHeader({ onOpenDrawer }: Props) {
return (
<div
className="fixed z-50 flex gap-2"
<Button
variant="icon"
size="icon"
aria-label="Open sessions"
onClick={onOpenDrawer}
className="fixed z-50 bg-white shadow-md"
style={{ left: "1rem", top: `${NAVBAR_HEIGHT_PX + 20}px` }}
>
<Button
variant="icon"
size="icon"
aria-label="Open sessions"
onClick={onOpenDrawer}
className="bg-white shadow-md"
>
<ListIcon width="1.25rem" height="1.25rem" />
</Button>
{showDelete && onDelete && (
<Button
variant="icon"
size="icon"
aria-label="Delete current chat"
onClick={onDelete}
disabled={isDeleting}
className="bg-white text-red-500 shadow-md hover:bg-red-50 hover:text-red-600 disabled:opacity-50"
>
<TrashIcon width="1.25rem" height="1.25rem" />
</Button>
)}
</div>
<ListIcon width="1.25rem" height="1.25rem" />
</Button>
);
}

View File

@@ -1,15 +1,10 @@
import {
getGetV2ListSessionsQueryKey,
useDeleteV2DeleteSession,
useGetV2ListSessions,
} from "@/app/api/__generated__/endpoints/chat/chat";
import { useGetV2ListSessions } from "@/app/api/__generated__/endpoints/chat/chat";
import { toast } from "@/components/molecules/Toast/use-toast";
import { useBreakpoint } from "@/lib/hooks/useBreakpoint";
import { useSupabase } from "@/lib/supabase/hooks/useSupabase";
import { useChat } from "@ai-sdk/react";
import { useQueryClient } from "@tanstack/react-query";
import { DefaultChatTransport } from "ai";
import { useCallback, useEffect, useMemo, useRef, useState } from "react";
import { useEffect, useMemo, useRef, useState } from "react";
import { useChatSession } from "./useChatSession";
import { useLongRunningToolPolling } from "./hooks/useLongRunningToolPolling";
@@ -19,11 +14,6 @@ export function useCopilotPage() {
const { isUserLoading, isLoggedIn } = useSupabase();
const [isDrawerOpen, setIsDrawerOpen] = useState(false);
const [pendingMessage, setPendingMessage] = useState<string | null>(null);
const [sessionToDelete, setSessionToDelete] = useState<{
id: string;
title: string | null | undefined;
} | null>(null);
const queryClient = useQueryClient();
const {
sessionId,
@@ -34,30 +24,6 @@ export function useCopilotPage() {
isCreatingSession,
} = useChatSession();
const { mutate: deleteSessionMutation, isPending: isDeleting } =
useDeleteV2DeleteSession({
mutation: {
onSuccess: () => {
queryClient.invalidateQueries({
queryKey: getGetV2ListSessionsQueryKey(),
});
if (sessionToDelete?.id === sessionId) {
setSessionId(null);
}
setSessionToDelete(null);
},
onError: (error) => {
toast({
title: "Failed to delete chat",
description:
error instanceof Error ? error.message : "An error occurred",
variant: "destructive",
});
setSessionToDelete(null);
},
},
});
const breakpoint = useBreakpoint();
const isMobile =
breakpoint === "base" || breakpoint === "sm" || breakpoint === "md";
@@ -177,24 +143,6 @@ export function useCopilotPage() {
if (isMobile) setIsDrawerOpen(false);
}
const handleDeleteClick = useCallback(
(id: string, title: string | null | undefined) => {
if (isDeleting) return;
setSessionToDelete({ id, title });
},
[isDeleting],
);
const handleConfirmDelete = useCallback(() => {
if (sessionToDelete) {
deleteSessionMutation({ sessionId: sessionToDelete.id });
}
}, [sessionToDelete, deleteSessionMutation]);
const handleCancelDelete = useCallback(() => {
setSessionToDelete(null);
}, []);
return {
sessionId,
messages,
@@ -217,11 +165,5 @@ export function useCopilotPage() {
handleDrawerOpenChange,
handleSelectSession,
handleNewChat,
// Delete functionality
sessionToDelete,
isDeleting,
handleDeleteClick,
handleConfirmDelete,
handleCancelDelete,
};
}

View File

@@ -1151,36 +1151,6 @@
}
},
"/api/chat/sessions/{session_id}": {
"delete": {
"tags": ["v2", "chat", "chat"],
"summary": "Delete Session",
"description": "Delete a chat session.\n\nPermanently removes a chat session and all its messages.\nOnly the owner can delete their sessions.\n\nArgs:\n session_id: The session ID to delete.\n user_id: The authenticated user's ID.\n\nReturns:\n 204 No Content on success.\n\nRaises:\n HTTPException: 404 if session not found or not owned by user.",
"operationId": "deleteV2DeleteSession",
"security": [{ "HTTPBearerJWT": [] }],
"parameters": [
{
"name": "session_id",
"in": "path",
"required": true,
"schema": { "type": "string", "title": "Session Id" }
}
],
"responses": {
"204": { "description": "Successful Response" },
"401": {
"$ref": "#/components/responses/HTTP401NotAuthenticatedError"
},
"404": { "description": "Session not found or access denied" },
"422": {
"description": "Validation Error",
"content": {
"application/json": {
"schema": { "$ref": "#/components/schemas/HTTPValidationError" }
}
}
}
}
},
"get": {
"tags": ["v2", "chat", "chat"],
"summary": "Get Session",

View File

@@ -115,7 +115,7 @@ const DialogFooter = ({
}: React.HTMLAttributes<HTMLDivElement>) => (
<div
className={cn(
"flex flex-col-reverse gap-2 sm:flex-row sm:justify-end",
"flex flex-col-reverse sm:flex-row sm:justify-end sm:space-x-2",
className,
)}
{...props}

View File

@@ -15,6 +15,7 @@
## Advanced Setup
* [Advanced Setup](advanced_setup.md)
* [Deployment Environment Variables](deployment-environment-variables.md)
## Building Blocks

View File

@@ -0,0 +1,397 @@
# Deployment Environment Variables
This guide documents **all environment variables that must be configured** when deploying AutoGPT to a new server or environment. Use this as a checklist to ensure your deployment works correctly.
## Quick Reference: What MUST Change
When deploying to a new server, these variables **must** be updated from their localhost defaults:
| Variable | Location | Default | Purpose |
|----------|----------|---------|---------|
| `SITE_URL` | `.env` | `http://localhost:3000` | Frontend URL for auth redirects |
| `API_EXTERNAL_URL` | `.env` | `http://localhost:8000` | Public Supabase API URL |
| `SUPABASE_PUBLIC_URL` | `.env` | `http://localhost:8000` | Studio dashboard URL |
| `PLATFORM_BASE_URL` | `backend/.env` | `http://localhost:8000` | Backend platform URL |
| `FRONTEND_BASE_URL` | `backend/.env` | `http://localhost:3000` | Frontend URL for webhooks/OAuth |
| `NEXT_PUBLIC_SUPABASE_URL` | `frontend/.env` | `http://localhost:8000` | Client-side Supabase URL |
| `NEXT_PUBLIC_AGPT_SERVER_URL` | `frontend/.env` | `http://localhost:8006/api` | Client-side backend API URL |
| `NEXT_PUBLIC_AGPT_WS_SERVER_URL` | `frontend/.env` | `ws://localhost:8001/ws` | Client-side WebSocket URL |
| `NEXT_PUBLIC_FRONTEND_BASE_URL` | `frontend/.env` | `http://localhost:3000` | Client-side frontend URL |
---
## Configuration Files
AutoGPT uses multiple `.env` files across different components:
```text
autogpt_platform/
├── .env # Supabase/infrastructure config
├── backend/
│ ├── .env.default # Backend defaults (DO NOT EDIT)
│ └── .env # Your backend overrides
└── frontend/
├── .env.default # Frontend defaults (DO NOT EDIT)
└── .env # Your frontend overrides
```
**Loading Order** (later overrides earlier):
1. `*.env.default` - Base defaults
2. `*.env` - Your overrides (see the example below)
3. Docker `environment:` section
4. Shell environment variables
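
For step 2, the override files can be created from the shipped defaults (a sketch; the root `.env` is created from whatever template your checkout provides):

```bash
cd autogpt_platform
cp backend/.env.default backend/.env
cp frontend/.env.default frontend/.env
```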
---
## 1. URL Configuration (REQUIRED)
These URLs must be updated to match your deployment domain/IP.
### Root `.env` (Supabase)
```bash
# Auth redirects - where users return after login
SITE_URL=https://your-domain.com:3000
# Public API URL - exposed to clients
API_EXTERNAL_URL=https://your-domain.com:8000
# Studio dashboard URL
SUPABASE_PUBLIC_URL=https://your-domain.com:8000
```
### Backend `.env`
```bash
# Platform URLs for webhooks and OAuth callbacks
PLATFORM_BASE_URL=https://your-domain.com:8000
FRONTEND_BASE_URL=https://your-domain.com:3000
# Internal Supabase URL (use Docker service name if containerized)
SUPABASE_URL=http://kong:8000 # Docker
# SUPABASE_URL=https://your-domain.com:8000 # External
```
### Frontend `.env`
```bash
# Client-side URLs (used in browser)
NEXT_PUBLIC_SUPABASE_URL=https://your-domain.com:8000
NEXT_PUBLIC_AGPT_SERVER_URL=https://your-domain.com:8006/api
NEXT_PUBLIC_AGPT_WS_SERVER_URL=wss://your-domain.com:8001/ws
NEXT_PUBLIC_FRONTEND_BASE_URL=https://your-domain.com:3000
```
!!! warning "HTTPS Note"
For production, use HTTPS URLs and `wss://` for WebSocket. You'll need a reverse proxy (nginx, Caddy) with SSL certificates.
!!! info "Port Numbers"
The port numbers shown (`:3000`, `:8000`, `:8001`, `:8006`) are internal Docker service ports. In production with a reverse proxy, your public URLs typically won't include port numbers (e.g., `https://your-domain.com` instead of `https://your-domain.com:3000`). Configure your reverse proxy to route external traffic to the internal service ports.
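
As a starting point, here is a minimal nginx sketch, assuming TLS certificates are already provisioned and the services are published on localhost; the server name and certificate paths are placeholders:

```nginx
server {
    listen 443 ssl;
    server_name your-domain.com;

    ssl_certificate     /etc/ssl/your-domain.com/fullchain.pem;
    ssl_certificate_key /etc/ssl/your-domain.com/privkey.pem;

    # Frontend (Next.js on internal port 3000)
    location / {
        proxy_pass http://localhost:3000;
        proxy_set_header Host $host;
    }

    # WebSocket server (internal port 8001) needs the upgrade headers
    location /ws {
        proxy_pass http://localhost:8001;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}
```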
---
## 2. Security Keys (MUST REGENERATE)
These default values ship in the public repository, so anyone can read them; they **must be changed** for production.
### Root `.env`
```bash
# Database password
POSTGRES_PASSWORD=<generate-strong-password>
# JWT secret for Supabase auth (min 32 chars)
JWT_SECRET=<generate-random-string>
# Supabase keys (regenerate with matching JWT_SECRET)
ANON_KEY=<regenerate>
SERVICE_ROLE_KEY=<regenerate>
# Studio dashboard credentials
DASHBOARD_USERNAME=<your-username>
DASHBOARD_PASSWORD=<strong-password>
# Encryption keys
SECRET_KEY_BASE=<generate-random-string>
VAULT_ENC_KEY=<generate-32-char-key> # Run: openssl rand -hex 16
```
### Backend `.env`
```bash
# Must match root POSTGRES_PASSWORD
DB_PASS=<same-as-POSTGRES_PASSWORD>
# Must match root SERVICE_ROLE_KEY
SUPABASE_SERVICE_ROLE_KEY=<same-as-SERVICE_ROLE_KEY>
# Must match root JWT_SECRET
JWT_VERIFY_KEY=<same-as-JWT_SECRET>
# Generate new encryption keys
# Run: python -c "from cryptography.fernet import Fernet;print(Fernet.generate_key().decode())"
ENCRYPTION_KEY=<generated-fernet-key>
UNSUBSCRIBE_SECRET_KEY=<generated-fernet-key>
```
### Generating Keys
```bash
# Generate Fernet encryption key (for ENCRYPTION_KEY, UNSUBSCRIBE_SECRET_KEY)
python -c "from cryptography.fernet import Fernet;print(Fernet.generate_key().decode())"
# Generate random string (for JWT_SECRET, SECRET_KEY_BASE)
openssl rand -base64 32
# Generate 32-character key (for VAULT_ENC_KEY)
openssl rand -hex 16
# Generate Supabase keys (requires matching JWT_SECRET)
# Use: https://supabase.com/docs/guides/self-hosting/docker#generate-api-keys
```
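
The `ANON_KEY` and `SERVICE_ROLE_KEY` are JWTs signed with your `JWT_SECRET`. If you prefer minting them locally over the generator page linked above, here is a hedged sketch using PyJWT (`pip install pyjwt`); swap `'anon'` for `'service_role'` to produce the service key:

```bash
# Mint a long-lived key for the given role, signed with JWT_SECRET
python -c "import jwt, time; now = int(time.time()); print(jwt.encode({'role': 'anon', 'iss': 'supabase', 'iat': now, 'exp': now + 5*365*24*3600}, 'your-jwt-secret', algorithm='HS256'))"
```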
---
## 3. Database Configuration
### Root `.env`
```bash
POSTGRES_HOST=db # Docker service name or external host
POSTGRES_DB=postgres
POSTGRES_PORT=5432
POSTGRES_PASSWORD=<your-password>
```
### Backend `.env`
```bash
DB_USER=postgres
DB_PASS=<your-password>
DB_NAME=postgres
DB_PORT=5432
DB_HOST=localhost # Default is localhost; use 'db' in Docker
DB_SCHEMA=platform
# Connection pooling
DB_CONNECTION_LIMIT=12
DB_CONNECT_TIMEOUT=60
DB_POOL_TIMEOUT=300
# Full connection URL (auto-constructed from above in .env.default)
# Variable substitution is handled automatically; only override if you need custom parameters
DATABASE_URL="postgresql://${DB_USER}:${DB_PASS}@${DB_HOST}:${DB_PORT}/${DB_NAME}?schema=${DB_SCHEMA}"
```
---
## 4. Service Dependencies
### Redis
```bash
REDIS_HOST=redis # Docker: 'redis', External: hostname/IP
REDIS_PORT=6379
# REDIS_PASSWORD= # Uncomment if using authentication
```
### RabbitMQ
```bash
RABBITMQ_DEFAULT_USER=<username>
RABBITMQ_DEFAULT_PASS=<strong-password>
# In Docker, host is 'rabbitmq'
```
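
Once the stack is up, both brokers can be smoke-tested using the compose service names from this repo:

```bash
docker compose exec redis redis-cli ping                   # expect: PONG
docker compose exec rabbitmq rabbitmq-diagnostics -q ping  # exits 0 when healthy
```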
---
## 5. Default Ports
| Service | Port | Purpose |
|---------|------|---------|
| Frontend | 3000 | Next.js web UI |
| Kong (Supabase API) | 8000 | API gateway |
| WebSocket Server | 8001 | Real-time updates |
| Executor | 8002 | Agent execution |
| Scheduler | 8003 | Scheduled tasks |
| Database Manager | 8005 | DB operations |
| REST Server | 8006 | Main API |
| Notification Server | 8007 | Notifications |
| PostgreSQL | 5432 | Database |
| Redis | 6379 | Cache/queue |
| RabbitMQ | 5672/15672 | Message queue (AMQP / management UI) |
| ClamAV | 3310 | Antivirus scanning |
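
To spot-check that the externally exposed services are listening after startup (ports as in the table above; run from the host):

```bash
curl -sI http://localhost:3000 | head -n 1   # Frontend
curl -sI http://localhost:8000 | head -n 1   # Kong (Supabase API)
curl -sI http://localhost:8006 | head -n 1   # REST server
```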
---
## 6. OAuth Callbacks
When configuring OAuth providers, use this callback URL format:
```text
https://your-domain.com/auth/integrations/oauth_callback
# Or with explicit port if not using a reverse proxy:
# https://your-domain.com:3000/auth/integrations/oauth_callback
```
### Supported OAuth Providers
| Provider | Env Variables | Setup URL |
|----------|---------------|-----------|
| GitHub | `GITHUB_CLIENT_ID`, `GITHUB_CLIENT_SECRET` | [github.com/settings/developers](https://github.com/settings/developers) |
| Google | `GOOGLE_CLIENT_ID`, `GOOGLE_CLIENT_SECRET` | [console.cloud.google.com](https://console.cloud.google.com/apis/credentials) |
| Discord | `DISCORD_CLIENT_ID`, `DISCORD_CLIENT_SECRET` | [discord.com/developers](https://discord.com/developers/applications) |
| Twitter/X | `TWITTER_CLIENT_ID`, `TWITTER_CLIENT_SECRET` | [developer.x.com](https://developer.x.com) |
| Notion | `NOTION_CLIENT_ID`, `NOTION_CLIENT_SECRET` | [developers.notion.com](https://developers.notion.com) |
| Linear | `LINEAR_CLIENT_ID`, `LINEAR_CLIENT_SECRET` | [linear.app/settings/api](https://linear.app/settings/api/applications/new) |
| Reddit | `REDDIT_CLIENT_ID`, `REDDIT_CLIENT_SECRET` | [reddit.com/prefs/apps](https://reddit.com/prefs/apps) |
| Todoist | `TODOIST_CLIENT_ID`, `TODOIST_CLIENT_SECRET` | [developer.todoist.com](https://developer.todoist.com/appconsole.html) |
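
Each provider's credentials go in `backend/.env` under the variable names listed above; for example, for GitHub:

```bash
GITHUB_CLIENT_ID=<your-client-id>
GITHUB_CLIENT_SECRET=<your-client-secret>
```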
---
## 7. Optional Services
### AI/LLM Providers
```bash
OPENAI_API_KEY=
ANTHROPIC_API_KEY=
GROQ_API_KEY=
OPEN_ROUTER_API_KEY=
NVIDIA_API_KEY=
```
### Email (SMTP)
```bash
# Supabase auth emails
SMTP_HOST=smtp.example.com
SMTP_PORT=587
SMTP_USER=<username>
SMTP_PASS=<password>
SMTP_ADMIN_EMAIL=admin@example.com
# Application emails (Postmark)
POSTMARK_SERVER_API_TOKEN=
POSTMARK_SENDER_EMAIL=noreply@your-domain.com
```
### Payments (Stripe)
```bash
STRIPE_API_KEY=
STRIPE_WEBHOOK_SECRET=
```
### Error Tracking (Sentry)
```bash
SENTRY_DSN=
```
### Analytics (PostHog)
```bash
POSTHOG_API_KEY=
POSTHOG_HOST=https://eu.i.posthog.com
# Frontend
NEXT_PUBLIC_POSTHOG_KEY=
NEXT_PUBLIC_POSTHOG_HOST=https://eu.i.posthog.com
```
---
## 8. Deployment Checklist
Use this checklist when deploying to a new environment:
### Pre-deployment
- [ ] Clone repository and navigate to `autogpt_platform/`
- [ ] Copy all `.env.default` files to `.env`
- [ ] Determine your deployment domain/IP
### URL Configuration
- [ ] Update `SITE_URL` in root `.env`
- [ ] Update `API_EXTERNAL_URL` in root `.env`
- [ ] Update `SUPABASE_PUBLIC_URL` in root `.env`
- [ ] Update `PLATFORM_BASE_URL` in `backend/.env`
- [ ] Update `FRONTEND_BASE_URL` in `backend/.env`
- [ ] Update all `NEXT_PUBLIC_*` URLs in `frontend/.env`
### Security
- [ ] Generate new `POSTGRES_PASSWORD`
- [ ] Generate new `JWT_SECRET` (min 32 chars)
- [ ] Regenerate `ANON_KEY` and `SERVICE_ROLE_KEY`
- [ ] Change `DASHBOARD_USERNAME` and `DASHBOARD_PASSWORD`
- [ ] Generate new `ENCRYPTION_KEY` (backend)
- [ ] Generate new `UNSUBSCRIBE_SECRET_KEY` (backend)
- [ ] Update `DB_PASS` to match `POSTGRES_PASSWORD`
- [ ] Update `JWT_VERIFY_KEY` to match `JWT_SECRET`
- [ ] Update `SUPABASE_SERVICE_ROLE_KEY` to match
### Services
- [ ] Configure Redis connection (if external)
- [ ] Configure RabbitMQ credentials
- [ ] Configure SMTP for emails (if needed)
### OAuth (if using integrations)
- [ ] Register OAuth apps with your callback URL
- [ ] Add client IDs and secrets to `backend/.env`
### Post-deployment
- [ ] Run `docker compose up -d --build`
- [ ] Verify frontend loads at your URL
- [ ] Test authentication flow
- [ ] Test WebSocket connection (real-time updates; see the smoke test below)
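
For the WebSocket check, a quick smoke test (assumes the `websocat` CLI is installed; adjust the URL for your domain and proxy setup):

```bash
websocat ws://localhost:8001/ws
```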
---
## 9. Docker vs External Services
### Running Everything in Docker (Default)
The docker-compose files automatically set internal hostnames:
```yaml
# Internal Docker service names (container-to-container communication)
# These are set automatically in docker-compose.platform.yml
DB_HOST: db
REDIS_HOST: redis
RABBITMQ_HOST: rabbitmq
SUPABASE_URL: http://kong:8000
```
### Using External Services
If using managed services (AWS RDS, Redis Cloud, etc.), override in your `.env`:
```bash
# External PostgreSQL
DB_HOST=your-rds-instance.region.rds.amazonaws.com
DB_PORT=5432
# External Redis
REDIS_HOST=your-redis.cache.amazonaws.com
REDIS_PORT=6379
REDIS_PASSWORD=<if-required>
# External Supabase (hosted)
SUPABASE_URL=https://your-project.supabase.co
SUPABASE_SERVICE_ROLE_KEY=<your-service-role-key>
```
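
It can save debugging time to verify connectivity from the deployment host before starting the stack (a sketch; assumes `psql` and `redis-cli` are installed locally):

```bash
psql "postgresql://postgres:<your-password>@your-rds-instance.region.rds.amazonaws.com:5432/postgres" -c "SELECT 1;"
redis-cli -h your-redis.cache.amazonaws.com -p 6379 -a <password> ping
```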
---
## Related Documentation
- [Getting Started](getting-started.md) - Basic setup guide
- [Advanced Setup](advanced_setup.md) - Development configuration
- [OAuth & SSO](integrating/oauth-guide.md) - Integration setup