## Summary

This PR adds the frontend service to the Docker Compose configuration, enabling `docker compose up` to run the complete stack, including the frontend. It also implements comprehensive environment variable improvements, unified .env file support, and fixes for Docker networking issues.

## Key Changes

### 🐳 Docker Compose Improvements

- **Added frontend service** to `docker-compose.yml` and `docker-compose.platform.yml`
- **Production build**: Uses `pnpm build + serve` instead of the dev server for better stability and lower memory usage
- **Service dependencies**: Frontend now waits for backend services (`rest_server`, `websocket_server`) to be ready
- **YAML anchors**: Implemented DRY configuration to avoid duplicating environment values

### 📁 Unified .env File Support

- **Frontend .env loading**: Automatically loads the `.env` file during Docker build and runtime
- **Backend .env loading**: Optional `.env` file support with fallback to sensible defaults in `settings.py` (the pattern is sketched below)
- **Single source of truth**: All `NEXT_PUBLIC_*` variables and API keys can be defined in the respective `.env` files
- **Docker integration**: Updated `.dockerignore` to include `.env` files in the build context
- **Git tracking**: Frontend and backend `.env` files are now trackable (removed from `.gitignore`)

### 🔧 Environment Variable Architecture

- **Dual environment strategy**:
  - Server-side code uses Docker service names (`http://rest_server:8006/api`)
  - Client-side code uses localhost URLs (`http://localhost:8006/api`)
- **Comprehensive config**: Added build args and runtime environment variables
- **Network compatibility**: Fixes connection issues between frontend and backend containers
- **Shared backend variables**: Common environment variables (service hosts, auth settings) centralized using YAML anchors

### 🛠️ Code Improvements

- **Centralized env-config helper** (`/frontend/src/lib/env-config.ts`) with server-side priority
- **Updated all frontend code** to use the shared environment helpers instead of direct `process.env` access
- **Consistent API**: All environment variable access now goes through helper functions
- **settings.py improvements**: Better defaults for CORS origins and optional .env file loading

### 🔗 Files Changed

- `docker-compose.yml` & `docker-compose.platform.yml` - Added frontend service and shared backend env vars
- `frontend/Dockerfile` - Simplified build process to use .env files directly
- `backend/settings.py` - Optional .env loading and better defaults
- `frontend/src/lib/env-config.ts` - New centralized environment configuration
- `.dockerignore` - Allow .env files in build context
- `.gitignore` - Updated to allow frontend/backend .env files
- Multiple frontend files - Updated to use env helpers
- Updates to both auto-installer scripts to work with the latest setup!
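The backend side of the optional .env loading can be pictured with a short sketch. The snippet below is illustrative only and is not the actual `backend/settings.py`; it assumes `python-dotenv` is installed, and the variable names are hypothetical, chosen to show the pattern of loading an optional `.env` file and falling back to sensible defaults.

```python
# Illustrative sketch only -- not the real backend/settings.py.
# Assumes python-dotenv is installed; variable names are hypothetical.
import os
from pathlib import Path

from dotenv import load_dotenv

ENV_FILE = Path(__file__).parent / ".env"

# Load the .env file if it exists; continue silently if it does not,
# so the backend still starts with the defaults below.
if ENV_FILE.exists():
    load_dotenv(ENV_FILE)

# Fallback defaults keep the services usable without any .env file present.
REST_SERVER_PORT = int(os.getenv("REST_SERVER_PORT", "8006"))
WEBSOCKET_SERVER_PORT = int(os.getenv("WEBSOCKET_SERVER_PORT", "8001"))
CORS_ALLOW_ORIGINS = os.getenv("CORS_ALLOW_ORIGINS", "http://localhost:3000").split(",")
```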
## Benefits

- ✅ **Single command deployment**: `docker compose up` now runs everything
- ✅ **Better reliability**: Production build reduces memory usage and crashes
- ✅ **Network compatibility**: Proper container-to-container communication
- ✅ **Maintainable config**: Centralized environment variable management with .env files
- ✅ **Development friendly**: Works in both Docker and local development
- ✅ **API key management**: Easy configuration through .env files for all services
- ✅ **No more manual env vars**: Frontend and backend automatically load their respective .env files

## Testing

- ✅ Verified Docker service communication works correctly
- ✅ Frontend responds and serves content properly
- ✅ Environment variables are correctly resolved in both server and client contexts
- ✅ No connection errors after implementing service dependencies
- ✅ .env file loading works correctly in both build and runtime phases
- ✅ Backend services work with and without .env files present

### Checklist 📋

#### For configuration changes:

- [x] `.env.default` is updated or already compatible with my changes
- [x] `docker-compose.yml` is updated or already compatible with my changes
- [x] I have included a list of my configuration changes in the PR description (under **Changes**)

🤖 Generated with [Claude Code](https://claude.ai/code)

---------

Co-authored-by: Lluis Agusti <hi@llu.lu>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
Co-authored-by: Claude <claude@users.noreply.github.com>
Co-authored-by: Bentlybro <Github@bentlybro.com>
# Getting Started with AutoGPT: Self-Hosting Guide

## Introduction

This guide will help you set up the server and builder for the project.

!!! warning
    DO NOT FOLLOW ANY OUTSIDE TUTORIALS AS THEY WILL LIKELY BE OUT OF DATE
## Prerequisites

To set up the server, you need to have the following installed:

- Node.js & NPM
- Docker & Docker Compose

### Checking if you have Node.js & NPM installed

We use Node.js to run our frontend application.

If you need assistance installing Node.js: https://nodejs.org/en/download/

NPM is included with Node.js, but if you need assistance installing NPM: https://docs.npmjs.com/downloading-and-installing-node-js-and-npm

You can check if you have Node.js & NPM installed by running the following commands:

```bash
node -v
npm -v
```

Once you have Node.js installed, you can proceed to the next step.
### Checking if you have Docker & Docker Compose installed

Docker containerizes applications, while Docker Compose orchestrates multi-container Docker applications.

If you need assistance installing Docker: https://docs.docker.com/desktop/

Docker Compose is included with Docker Desktop, but if you need assistance installing Docker Compose: https://docs.docker.com/compose/install/

You can check if you have Docker and Docker Compose installed by running the following commands:

```bash
docker -v
docker compose -v
```

Once you have Docker and Docker Compose installed, you can proceed to the next step.
### Raspberry Pi 5 Specific Notes

On Raspberry Pi 5 with Raspberry Pi OS, the default 16K kernel page size causes issues with the `supabase-vector` container (which expects 4K).

To fix this, edit /boot/firmware/config.txt and add:

```ini
kernel=kernel8.img
```

Then reboot. You can check your page size with:

```bash
getconf PAGESIZE
```

16384 means 16K (incorrect), and 4096 means 4K (correct).

After adjusting, `docker compose up -d --build` should work normally.

See supabase/supabase#33816 for additional context.
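If you prefer to check from Python rather than the shell, the equivalent one-off check is shown below; this is just a convenience sketch and only applies on Linux.

```python
# Quick page-size check (Linux only); equivalent to `getconf PAGESIZE`.
import os

page_size = os.sysconf("SC_PAGE_SIZE")
print(f"Page size: {page_size} bytes")  # 4096 is correct; 16384 will break supabase-vector
```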
## Quick Setup with Auto Setup Script (Recommended)

If you're self-hosting AutoGPT locally, we recommend using our official setup script to simplify the process. This will install dependencies (like Docker), pull the latest code, and launch the app with minimal effort.

For macOS/Linux:

```bash
curl -fsSL https://setup.agpt.co/install.sh -o install.sh && bash install.sh
```

For Windows (PowerShell):

```powershell
powershell -c "iwr https://setup.agpt.co/install.bat -o install.bat; ./install.bat"
```

This method is ideal if you're setting up for development or testing and want to skip manual configuration.
## Manual Setup

### Cloning the Repository

The first step is cloning the AutoGPT repository to your computer. To do this, open a terminal window in a folder on your computer and run:

```bash
git clone https://github.com/Significant-Gravitas/AutoGPT.git
```

If you get stuck, follow this guide.

Once that's complete, you can continue the setup process.
### Running the AutoGPT Platform

To run the platform, follow these steps:

- Navigate to the `autogpt_platform` directory inside the AutoGPT folder:

    ```bash
    cd AutoGPT/autogpt_platform
    ```

- Copy the `.env.default` file to `.env` in `autogpt_platform`:

    ```bash
    cp .env.default .env
    ```

    This command will copy the `.env.default` file to `.env` in the `autogpt_platform` directory. You can modify the `.env` file to add your own environment variables.

- Run the platform services:

    ```bash
    docker compose up -d --build
    ```

    This command will start all the necessary backend services defined in the `docker-compose.yml` file in detached mode.
### Checking if the application is running

You can check if the server is running by visiting http://localhost:3000 in your browser.

**Notes:**

By default, the services run on the following ports:

- Frontend UI Server: 3000
- Backend Websocket Server: 8001
- Execution API Rest Server: 8006
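If you want to verify that the services are listening without opening a browser, a quick check like the one below works; it assumes the default ports from the list above and is only a convenience sketch.

```python
# Hypothetical helper: check that the default AutoGPT platform ports accept connections.
import socket

SERVICES = {
    "Frontend UI Server": 3000,
    "Backend Websocket Server": 8001,
    "Execution API Rest Server": 8006,
}

for name, port in SERVICES.items():
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.settimeout(2)
        status = "up" if sock.connect_ex(("localhost", port)) == 0 else "down"
        print(f"{name} (port {port}): {status}")
```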
### Additional Notes

You may want to change your encryption key in the `.env` file in the `autogpt_platform/backend` directory.

To generate a new encryption key, run the following command in Python:

```python
from cryptography.fernet import Fernet
Fernet.generate_key().decode()
```

Or run the following command in the `autogpt_platform/backend` directory:

```bash
poetry run cli gen-encrypt-key
```

Then, replace the existing key in the `autogpt_platform/backend/.env` file with the new one.
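For context, the value is a standard Fernet key from the `cryptography` library. The round-trip below is illustrative only and shows what such a key is used for; the environment variable name is an assumption, not necessarily the one used in the `.env` file.

```python
# Illustrative only: the value is a Fernet key usable for symmetric encryption.
# The environment variable name here is an assumption.
import os
from cryptography.fernet import Fernet

key = os.environ.get("ENCRYPTION_KEY", Fernet.generate_key().decode())

f = Fernet(key.encode())
token = f.encrypt(b"secret value")
print(f.decrypt(token))  # b'secret value'
```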
### 📌 Windows Installation Note

When installing Docker on Windows, it is highly recommended to select **WSL 2** instead of Hyper-V. Using Hyper-V can cause compatibility issues with Supabase, leading to the `supabase-db` container being marked as unhealthy.

#### Steps to enable WSL 2 for Docker:

- Install WSL 2.
- Ensure that your Docker settings use WSL 2 as the default backend:
    - Open Docker Desktop.
    - Navigate to Settings > General.
    - Check "Use the WSL 2 based engine".
- Restart Docker Desktop.

#### Already Installed Docker with Hyper-V?

If you initially installed Docker with Hyper-V, you don't need to reinstall it. You can switch to WSL 2 by following these steps:

- Open Docker Desktop.
- Go to Settings > General.
- Enable "Use the WSL 2 based engine".
- Restart Docker.

🚨 **Warning:** Enabling WSL 2 may erase your existing containers and build history. If you have important containers, consider backing them up before switching.

For more details, refer to Docker's official documentation.
## Development

### Frontend Development

#### Running the frontend locally

To run the frontend locally, you need to have Node.js and PNPM installed on your machine.

Install Node.js to manage dependencies and run the frontend application.

Install PNPM to manage the frontend dependencies.

Run the service dependencies (backend, database, message queues, etc.):

```bash
docker compose --profile local up deps_backend --build --detach
```

Go to the `autogpt_platform/frontend` directory:

```bash
cd frontend
```

Install the dependencies:

```bash
pnpm install
```

Generate the API client:

```bash
pnpm generate:api-client
```

Run the frontend application:

```bash
pnpm dev
```

#### Formatting & Linting

Auto formatter and linter are set up in the project. To run them:

Format the code:

```bash
pnpm format
```

Lint the code:

```bash
pnpm lint
```

#### Testing

To run the tests, you can use the following command:

```bash
pnpm test
```
### Backend Development

#### Running the backend locally

To run the backend locally, you need to have Python 3.10 or higher installed on your machine.

Install Poetry to manage dependencies and virtual environments.

Run the backend dependencies (database, message queues, etc.):

```bash
docker compose --profile local up deps --build --detach
```

Go to the `autogpt_platform/backend` directory:

```bash
cd backend
```

Install the dependencies:

```bash
poetry install --with dev
```

Run the backend server:

```bash
poetry run app
```

#### Formatting & Linting

Auto formatter and linter are set up in the project. To run them:

Format the code:

```bash
poetry run format
```

Lint the code:

```bash
poetry run lint
```

#### Testing

To run the tests:

```bash
poetry run pytest -s
```
### Adding a New Agent Block

To add a new agent block, you need to create a new class that inherits from `Block` and provides the following information (a minimal sketch follows the list):

- All the block code should live in the `blocks` (`backend.blocks`) module.
- `input_schema`: the schema of the input data, represented by a Pydantic object.
- `output_schema`: the schema of the output data, represented by a Pydantic object.
- `run` method: the main logic of the block.
- `test_input` & `test_output`: the sample input and output data for the block, which will be used to auto-test the block.
- You can mock the functions declared in the block using the `test_mock` field for your unit tests.
- Once you finish creating the block, you can test it by running `poetry run pytest backend/blocks/test/test_block.py -s`.
- Create a Pull Request to the `dev` branch of the repository with your changes so you can share it with the community :)
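The sketch below shows roughly what such a class can look like. It is illustrative only: the import path, constructor arguments, and `run` signature are assumptions about the current `Block` base class, so treat it as a shape to follow rather than copy-paste code, and check existing blocks in `backend.blocks` for the authoritative API.

```python
# Illustrative sketch only -- the exact Block API in the codebase may differ.
from backend.data.block import Block, BlockOutput, BlockSchema  # import path is an assumption


class WordCountBlock(Block):
    class Input(BlockSchema):
        text: str

    class Output(BlockSchema):
        word_count: int

    def __init__(self):
        super().__init__(
            id="00000000-0000-0000-0000-000000000000",  # placeholder; use a freshly generated UUID
            input_schema=WordCountBlock.Input,
            output_schema=WordCountBlock.Output,
            test_input={"text": "hello world"},
            test_output=("word_count", 2),
        )

    def run(self, input_data: Input, **kwargs) -> BlockOutput:
        # Yield each output field as a (name, value) pair.
        yield "word_count", len(input_data.text.split())
```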