feat: Complete container publishing implementation with deployment tools and templates

Co-authored-by: ntindle <8845353+ntindle@users.noreply.github.com>
copilot-swe-agent[bot]
2025-09-17 18:28:23 +00:00
parent 1d207a9b52
commit 9a61b45644
4 changed files with 492 additions and 3 deletions


@@ -2,16 +2,38 @@
Welcome to the AutoGPT Platform - a powerful system for creating and running AI agents to solve business problems. This platform enables you to harness the power of artificial intelligence to automate tasks, analyze data, and generate insights for your organization.
## Getting Started
## Deployment Options
### Quick Deploy with Published Containers (Recommended)
The fastest way to get started is using our pre-built containers:
```bash
# Download and run with published images
curl -fsSL https://raw.githubusercontent.com/Significant-Gravitas/AutoGPT/master/autogpt_platform/deploy.sh -o deploy.sh
chmod +x deploy.sh
./deploy.sh deploy
```
Access the platform at http://localhost:3000 after deployment completes.
### Platform-Specific Deployments
- **Unraid**: [Deployment Guide](../docs/content/platform/deployment/unraid.md)
- **Home Assistant**: [Add-on Guide](../docs/content/platform/deployment/home-assistant.md)
- **Kubernetes**: [K8s Deployment](../docs/content/platform/deployment/kubernetes.md)
- **General Containers**: [Container Guide](../docs/content/platform/container-deployment.md)
## Development Setup
### Prerequisites
- Docker
- Docker Compose V2 (comes with Docker Desktop, or can be installed separately)
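The prerequisites above can be verified with a quick preflight check. Note that Compose V2 is invoked as `docker compose` (a subcommand of `docker`), not the legacy `docker-compose` binary:

```shell
# Preflight check for the prerequisites listed above.
# Compose V2 ships as the `compose` subcommand of the docker CLI.
if command -v docker > /dev/null 2>&1 \
   && docker compose version > /dev/null 2>&1; then
  echo "prerequisites ok"
else
  echo "prerequisites missing"
fi
```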
### Running from Source
To run the AutoGPT Platform from source for development:
1. Clone this repository to your local machine and navigate to the `autogpt_platform` directory within the repository:
@@ -157,3 +179,28 @@ If you need to update the API client after making changes to the backend API:
```
This will fetch the latest OpenAPI specification and regenerate the TypeScript client code.
## Container Deployment
For production deployments and specific platforms, see our container deployment guides:
- **[Container Deployment Overview](CONTAINERS.md)** - Complete guide to using published containers
- **[Deployment Script](deploy.sh)** - Automated deployment and management tool
- **[Published Images](docker-compose.published.yml)** - Docker Compose for published containers
### Published Container Images
- **Backend**: `ghcr.io/significant-gravitas/autogpt-platform-backend:latest`
- **Frontend**: `ghcr.io/significant-gravitas/autogpt-platform-frontend:latest`
### Quick Production Deployment
```bash
# Deploy with published containers
./deploy.sh deploy
# Or use the published compose file directly
docker compose -f docker-compose.published.yml up -d
```
For detailed deployment instructions, troubleshooting, and platform-specific guides, see the [Container Documentation](CONTAINERS.md).
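For orientation, `docker-compose.published.yml` pulls the two published images rather than building from source. A minimal sketch of its shape (the service names and port mappings here are assumptions; consult the actual file for the full stack and environment configuration):

```yaml
# Sketch only: service names and ports are assumptions, not the real file.
services:
  backend:
    image: ghcr.io/significant-gravitas/autogpt-platform-backend:latest
    ports:
      - "8000:8000"
  frontend:
    image: ghcr.io/significant-gravitas/autogpt-platform-frontend:latest
    ports:
      - "3000:3000"
    depends_on:
      - backend
```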

autogpt_platform/build-test.sh Executable file

@@ -0,0 +1,262 @@
#!/bin/bash
# AutoGPT Platform Container Build Test Script
# This script tests container builds locally before CI/CD

set -euo pipefail

# Colors for output
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
BLUE='\033[0;34m'
NC='\033[0m' # No Color

# Configuration
REGISTRY="ghcr.io"
IMAGE_PREFIX="significant-gravitas/autogpt-platform"
VERSION="test"
BUILD_ARGS=""

# Functions
info() {
    echo -e "${BLUE}[INFO]${NC} $1"
}

success() {
    echo -e "${GREEN}[SUCCESS]${NC} $1"
}

warning() {
    echo -e "${YELLOW}[WARNING]${NC} $1"
}

error() {
    echo -e "${RED}[ERROR]${NC} $1"
}

usage() {
    cat << EOF
AutoGPT Platform Container Build Test Script

Usage: $0 [OPTIONS] [COMPONENT]

COMPONENTS:
    backend     Build backend container only
    frontend    Build frontend container only
    all         Build both containers (default)

OPTIONS:
    -r, --registry REGISTRY  Container registry (default: ghcr.io)
    -t, --tag TAG            Image tag (default: test)
    --no-cache               Build without cache
    --push                   Push images after build
    -h, --help               Show this help message

EXAMPLES:
    $0                  # Build both containers
    $0 backend          # Build backend only
    $0 --no-cache all   # Build without cache
    $0 --push frontend  # Build and push frontend
EOF
}

check_docker() {
    if ! command -v docker &> /dev/null; then
        error "Docker is not installed"
        exit 1
    fi
    if ! docker info &> /dev/null; then
        error "Docker daemon is not running"
        exit 1
    fi
    success "Docker is available"
}
build_backend() {
    info "Building backend container..."
    local image_name="$REGISTRY/$IMAGE_PREFIX-backend:$VERSION"
    local dockerfile="autogpt_platform/backend/Dockerfile"

    info "Building: $image_name"
    info "Dockerfile: $dockerfile"
    info "Context: ."
    info "Target: server"

    if docker build \
        -t "$image_name" \
        -f "$dockerfile" \
        --target server \
        $BUILD_ARGS \
        .; then
        success "Backend container built successfully: $image_name"

        # Test the container
        info "Testing backend container..."
        if docker run --rm -d --name autogpt-backend-test "$image_name" > /dev/null; then
            sleep 5
            if docker ps | grep -q autogpt-backend-test; then
                success "Backend container is running"
                docker stop autogpt-backend-test > /dev/null
            else
                warning "Backend container started but may have issues"
            fi
        else
            warning "Failed to start backend container for testing"
        fi
        return 0
    else
        error "Backend container build failed"
        return 1
    fi
}
build_frontend() {
    info "Building frontend container..."
    local image_name="$REGISTRY/$IMAGE_PREFIX-frontend:$VERSION"
    local dockerfile="autogpt_platform/frontend/Dockerfile"

    info "Building: $image_name"
    info "Dockerfile: $dockerfile"
    info "Context: ."
    info "Target: prod"

    if docker build \
        -t "$image_name" \
        -f "$dockerfile" \
        --target prod \
        $BUILD_ARGS \
        .; then
        success "Frontend container built successfully: $image_name"

        # Test the container
        info "Testing frontend container..."
        if docker run --rm -d --name autogpt-frontend-test -p 3001:3000 "$image_name" > /dev/null; then
            sleep 10
            if docker ps | grep -q autogpt-frontend-test; then
                if curl -s -o /dev/null -w "%{http_code}" http://localhost:3001 | grep -q "200\|302\|404"; then
                    success "Frontend container is responding"
                else
                    warning "Frontend container started but not responding to HTTP requests"
                fi
                docker stop autogpt-frontend-test > /dev/null
            else
                warning "Frontend container started but may have issues"
            fi
        else
            warning "Failed to start frontend container for testing"
        fi
        return 0
    else
        error "Frontend container build failed"
        return 1
    fi
}
push_images() {
    if [[ "$PUSH_IMAGES" == "true" ]]; then
        info "Pushing images to registry..."
        local backend_image="$REGISTRY/$IMAGE_PREFIX-backend:$VERSION"
        local frontend_image="$REGISTRY/$IMAGE_PREFIX-frontend:$VERSION"
        for image in "$backend_image" "$frontend_image"; do
            # `docker images` lists repository and tag in separate columns,
            # so grepping for "name:tag" never matches; use `inspect` instead.
            if docker image inspect "$image" > /dev/null 2>&1; then
                info "Pushing $image..."
                if docker push "$image"; then
                    success "Pushed $image"
                else
                    error "Failed to push $image"
                fi
            fi
        done
    fi
}

show_images() {
    info "Built images:"
    # `|| true` keeps `set -e` from aborting when grep finds no match
    docker images | grep "$IMAGE_PREFIX" | grep "$VERSION" || true
}

cleanup_test_containers() {
    # Clean up any test containers that might be left running
    docker ps -a | grep "autogpt-.*-test" | awk '{print $1}' | xargs -r docker rm -f > /dev/null 2>&1 || true
}
# Parse command line arguments
COMPONENT="all"
PUSH_IMAGES="false"

while [[ $# -gt 0 ]]; do
    case $1 in
        -r|--registry)
            REGISTRY="$2"
            shift 2
            ;;
        -t|--tag)
            VERSION="$2"
            shift 2
            ;;
        --no-cache)
            BUILD_ARGS="$BUILD_ARGS --no-cache"
            shift
            ;;
        --push)
            PUSH_IMAGES="true"
            shift
            ;;
        -h|--help)
            usage
            exit 0
            ;;
        backend|frontend|all)
            COMPONENT="$1"
            shift
            ;;
        *)
            error "Unknown option: $1"
            usage
            exit 1
            ;;
    esac
done
# Main execution
info "AutoGPT Platform Container Build Test"
info "Component: $COMPONENT"
info "Registry: $REGISTRY"
info "Tag: $VERSION"

check_docker
cleanup_test_containers

# Build containers based on component selection
case "$COMPONENT" in
    backend)
        build_backend
        ;;
    frontend)
        build_frontend
        ;;
    all)
        if build_backend && build_frontend; then
            success "All containers built successfully"
        else
            error "Some container builds failed"
            exit 1
        fi
        ;;
esac

push_images
show_images
cleanup_test_containers

success "Build test completed successfully"
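A note on the frontend smoke test: it treats any of the HTTP status codes 200, 302, or 404 as proof the server is up, since even a 404 means the HTTP stack answered. That check reduces to a small predicate, sketched here outside the script (the function name `is_responding` is illustrative, not part of the script above):

```shell
# Returns success if $1 is a status code the smoke test accepts
# as "server is responding" (200, 302, or 404).
is_responding() {
  echo "$1" | grep -q "200\|302\|404"
}

is_responding 200 && echo "up"
is_responding 500 || echo "down"
```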


@@ -0,0 +1,137 @@
{
  "name": "AutoGPT Platform",
  "version": "1.0.0",
  "slug": "autogpt-platform",
  "description": "AutoGPT Platform for creating and managing AI agents with Home Assistant integration",
  "url": "https://github.com/Significant-Gravitas/AutoGPT",
  "codenotary": "notary@home-assistant.io",
  "arch": [
    "aarch64",
    "amd64",
    "armhf",
    "armv7",
    "i386"
  ],
  "startup": "services",
  "init": false,
  "privileged": [
    "SYS_ADMIN"
  ],
  "hassio_api": true,
  "hassio_role": "default",
  "homeassistant_api": true,
  "host_network": false,
  "host_pid": false,
  "host_ipc": false,
  "auto_uart": false,
  "devices": [],
  "udev": false,
  "tmpfs": false,
  "environment": {
    "LOG_LEVEL": "info"
  },
  "map": [
    "config:rw",
    "ssl:ro",
    "addons_config:rw"
  ],
  "ports": {
    "3000/tcp": 3000,
    "8000/tcp": 8000
  },
  "ports_description": {
    "3000/tcp": "AutoGPT Platform Frontend",
    "8000/tcp": "AutoGPT Platform Backend API"
  },
  "webui": "http://[HOST]:[PORT:3000]",
  "ingress": true,
  "ingress_port": 3000,
  "ingress_entry": "/",
  "panel_icon": "mdi:robot",
  "panel_title": "AutoGPT Platform",
  "panel_admin": true,
  "audio": false,
  "video": false,
  "gpio": false,
  "usb": false,
  "uart": false,
  "kernel_modules": false,
  "devicetree": false,
  "docker_api": false,
  "full_access": false,
  "apparmor": true,
  "auth_api": true,
  "snapshot_exclude": [
    "tmp/**",
    "logs/**"
  ],
  "options": {
    "database": {
      "host": "localhost",
      "port": 5432,
      "username": "autogpt",
      "password": "!secret autogpt_db_password",
      "database": "autogpt"
    },
    "redis": {
      "host": "localhost",
      "port": 6379,
      "password": "!secret autogpt_redis_password"
    },
    "auth": {
      "jwt_secret": "!secret autogpt_jwt_secret",
      "admin_email": "admin@example.com"
    },
    "homeassistant": {
      "enabled": true,
      "token": "!secret ha_long_lived_token",
      "api_url": "http://supervisor/core/api"
    },
    "logging": {
      "level": "INFO",
      "file_logging": true
    }
  },
  "schema": {
    "database": {
      "host": "str",
      "port": "int(1,65535)",
      "username": "str",
      "password": "password",
      "database": "str"
    },
    "redis": {
      "host": "str",
      "port": "int(1,65535)",
      "password": "password?"
    },
    "auth": {
      "jwt_secret": "password",
      "admin_email": "email"
    },
    "homeassistant": {
      "enabled": "bool",
      "token": "password?",
      "api_url": "url?"
    },
    "logging": {
      "level": "list(DEBUG|INFO|WARNING|ERROR)?",
      "file_logging": "bool?"
    },
    "advanced": {
      "backend_workers": "int(1,10)?",
      "max_agents": "int(1,100)?",
      "resource_limits": {
        "cpu": "float(0.1,4.0)?",
        "memory": "str?"
      }
    }
  },
  "image": "ghcr.io/significant-gravitas/autogpt-platform-{arch}",
  "services": [
    "mqtt:want"
  ],
  "discovery": [
    "autogpt_platform"
  ]
}
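The `!secret` references in the `options` block above resolve against `secrets.yaml` in the Home Assistant configuration directory. The entries would look something like this (key names must match the references exactly; the values are placeholders):

```yaml
# secrets.yaml in the Home Assistant configuration directory
autogpt_db_password: "change-me"
autogpt_redis_password: "change-me"
autogpt_jwt_secret: "generate-a-long-random-string"
ha_long_lived_token: "paste-a-long-lived-access-token-here"
```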


@@ -0,0 +1,43 @@
<?xml version="1.0"?>
<Container version="2">
<Name>AutoGPT-Platform</Name>
<Repository>ghcr.io/significant-gravitas/autogpt-platform-frontend:latest</Repository>
<Registry>https://ghcr.io/significant-gravitas/autogpt-platform-frontend</Registry>
<Network>bridge</Network>
<MyIP/>
<Shell>bash</Shell>
<Privileged>false</Privileged>
<Support>https://github.com/Significant-Gravitas/AutoGPT/issues</Support>
<Project>https://github.com/Significant-Gravitas/AutoGPT</Project>
<Overview>AutoGPT Platform is a powerful system for creating, deploying, and managing continuous AI agents that automate complex workflows. This template sets up the complete platform including frontend, backend services, database, and message queue.&#xD;
&#xD;
This is a complete stack deployment that includes:&#xD;
- Frontend web interface&#xD;
- Backend API services&#xD;
- PostgreSQL database with pgvector&#xD;
- Redis for caching&#xD;
- RabbitMQ for task queuing&#xD;
- Supabase for authentication&#xD;
&#xD;
IMPORTANT: This template creates multiple containers. Make sure you have sufficient resources (minimum 8GB RAM, 4 CPU cores) and available disk space (minimum 20GB).</Overview>
<Category>Productivity: Tools: Other:</Category>
<WebUI>http://[IP]:[PORT:3000]</WebUI>
<TemplateURL>https://raw.githubusercontent.com/Significant-Gravitas/AutoGPT/master/autogpt_platform/templates/unraid-template.xml</TemplateURL>
<Icon>https://raw.githubusercontent.com/Significant-Gravitas/AutoGPT/master/assets/autogpt_logo.png</Icon>
<ExtraParams>--network=autogpt</ExtraParams>
<PostArgs/>
<CPUset/>
<DateInstalled>1704067200</DateInstalled>
<DonateText>AutoGPT is an open-source project. Consider supporting development.</DonateText>
<DonateLink>https://github.com/sponsors/Significant-Gravitas</DonateLink>
<Requires>This template requires the AutoGPT Platform stack to be deployed. Please follow the setup instructions at: https://docs.agpt.co/platform/deployment/unraid/</Requires>
<Config Name="WebUI Port" Target="3000" Default="3000" Mode="tcp" Description="Port for the AutoGPT Platform web interface" Type="Port" Display="always" Required="true" Mask="false">3000</Config>
<Config Name="Backend API URL" Target="AGPT_SERVER_URL" Default="http://[IP]:8006/api" Mode="" Description="URL for the backend API server. Replace [IP] with your Unraid server IP." Type="Variable" Display="always" Required="true" Mask="false">http://[IP]:8006/api</Config>
<Config Name="Supabase URL" Target="SUPABASE_URL" Default="http://[IP]:8000" Mode="" Description="URL for Supabase authentication. Replace [IP] with your Unraid server IP." Type="Variable" Display="always" Required="true" Mask="false">http://[IP]:8000</Config>
<Config Name="WebSocket URL" Target="AGPT_WS_SERVER_URL" Default="ws://[IP]:8001/ws" Mode="" Description="WebSocket URL for real-time communication. Replace [IP] with your Unraid server IP." Type="Variable" Display="always" Required="true" Mask="false">ws://[IP]:8001/ws</Config>
<Config Name="Config Directory" Target="/app/config" Default="/mnt/user/appdata/autogpt-platform/frontend" Mode="rw" Description="Directory for frontend configuration files" Type="Path" Display="advanced" Required="true" Mask="false">/mnt/user/appdata/autogpt-platform/frontend</Config>
<Config Name="Log Level" Target="LOG_LEVEL" Default="INFO" Mode="" Description="Logging level (DEBUG, INFO, WARNING, ERROR)" Type="Variable" Display="advanced" Required="false" Mask="false">INFO</Config>
<Config Name="Node Environment" Target="NODE_ENV" Default="production" Mode="" Description="Node.js environment mode" Type="Variable" Display="advanced" Required="false" Mask="false">production</Config>
<Config Name="PUID" Target="PUID" Default="99" Mode="" Description="User ID for file permissions" Type="Variable" Display="advanced" Required="false" Mask="false">99</Config>
<Config Name="PGID" Target="PGID" Default="100" Mode="" Description="Group ID for file permissions" Type="Variable" Display="advanced" Required="false" Mask="false">100</Config>
</Container>
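For reference, the template's key fields map onto a plain `docker run` invocation roughly as sketched below. The IP `192.168.1.10` stands in for your Unraid server address, and the template assumes the rest of the stack is already running on the `autogpt` network; the command is built as a string here so it can be inspected before being run:

```shell
# Hypothetical docker run equivalent of this template's main settings.
# Adapt the IP and network name to your environment before running.
run_cmd="docker run -d --name AutoGPT-Platform --network=autogpt \
  -p 3000:3000 \
  -e AGPT_SERVER_URL=http://192.168.1.10:8006/api \
  -e SUPABASE_URL=http://192.168.1.10:8000 \
  -e AGPT_WS_SERVER_URL=ws://192.168.1.10:8001/ws \
  -e NODE_ENV=production \
  ghcr.io/significant-gravitas/autogpt-platform-frontend:latest"
echo "$run_cmd"
```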