Mirror of https://github.com/Significant-Gravitas/AutoGPT.git, synced 2026-02-11 15:25:16 -05:00

Compare commits (15 commits)
autogpt-pl...otto/secrt
| Author | SHA1 | Date |
|---|---|---|
| | 6a40a29f13 | |
| | fa8930da4c | |
| | 8c79e170e7 | |
| | 698dc45146 | |
| | 214ab25b3c | |
| | 4daa25e3dc | |
| | 7195f7e298 | |
| | 582754256e | |
| | 36aeb0b2b3 | |
| | 2a189c44c4 | |
| | 508759610f | |
| | 062fe1aa70 | |
| | 2cd0d4fe0f | |
| | 1ecae8c87e | |
| | 659338f90c | |
.devcontainer/platform/README.md (new file, 206 lines)

@@ -0,0 +1,206 @@
# GitHub Codespaces for AutoGPT Platform

This dev container provides a complete development environment for the AutoGPT Platform, optimized for PR reviews.

## 🚀 Quick Start

1. **Open in Codespaces:**
   - Go to the repository on GitHub
   - Click **Code** → **Codespaces** → **Create codespace on dev**
   - Or click the badge: [](https://codespaces.new/Significant-Gravitas/AutoGPT?quickstart=1)

2. **Wait for setup** (~60 seconds with prebuild, ~5-10 min without)

3. **Start the servers:**

   ```bash
   # Terminal 1
   make run-backend

   # Terminal 2
   make run-frontend
   ```

4. **Start developing!**
   - Frontend: `http://localhost:3000`
   - Login with: `test123@gmail.com` / `testpassword123`

## 🏗️ Architecture

**Dependencies run in Docker** (cached by prebuild):

- PostgreSQL, Redis, RabbitMQ, Supabase Auth

**Backend & Frontend run natively** (not cached):

- This ensures you're always running the current branch's code
- Enables hot-reload during development
- VS Code debugger can attach directly
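In practice the split looks like this; a minimal sketch of a session (the compose service names match the `docker compose` calls in `.devcontainer/platform/scripts/postcreate.sh`):

```bash
# Infrastructure dependencies run in Docker and are what the prebuild caches:
docker compose up -d db redis rabbitmq kong auth

# Backend and frontend run natively from the checked-out branch, so edits
# hot-reload and the VS Code debugger can attach to the local processes:
make run-backend    # Terminal 1
make run-frontend   # Terminal 2
```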
## 📍 Available Services

| Service | URL | Notes |
|---------|-----|-------|
| Frontend | http://localhost:3000 | Next.js app |
| REST API | http://localhost:8006 | FastAPI backend |
| WebSocket | ws://localhost:8001 | Real-time updates |
| Supabase | http://localhost:8000 | Auth & API gateway |
| Supabase Studio | http://localhost:5555 | Database admin |
| RabbitMQ | http://localhost:15672 | Queue management |

## 🔑 Test Accounts

| Email | Password | Role |
|-------|----------|------|
| test123@gmail.com | testpassword123 | Featured Creator |

The test account has:

- Pre-created agents and workflows
- Published store listings
- Active agent executions
- Reviews and ratings

## 🛠️ Development Commands

```bash
# Navigate to platform directory (terminal starts here by default)
cd autogpt_platform

# Start all services
docker compose up -d

# Or just core services (DB, Redis, RabbitMQ)
make start-core

# Run backend in dev mode (hot reload)
make run-backend

# Run frontend in dev mode (hot reload)
make run-frontend

# Run both backend and frontend
# (Use VS Code's "Full Stack" launch config for debugging)

# Format code
make format

# Run tests
make test-data     # Regenerate test data
poetry run test    # Backend tests (from backend/)
pnpm test:e2e      # E2E tests (from frontend/)
```

## 🐛 Debugging

### VS Code Launch Configs

> **Note:** Launch and task configs are in `.devcontainer/platform/vscode-templates/`.
> To use them locally, copy them to `.vscode/`:
>
> ```bash
> cp .devcontainer/platform/vscode-templates/*.json .vscode/
> ```
>
> In Codespaces, core settings are auto-applied via devcontainer.json.

Press `F5` or use the Run and Debug panel:

- **Backend: Debug FastAPI** - Debug the REST API server
- **Backend: Debug Executor** - Debug the agent executor
- **Frontend: Debug Next.js** - Debug with browser DevTools
- **Full Stack: Backend + Frontend** - Debug both simultaneously
- **Tests: Run E2E Tests** - Run Playwright tests

### VS Code Tasks

Press `Ctrl+Shift+P` → "Tasks: Run Task":

- Start/Stop All Services
- Run Migrations
- Seed Test Data
- View Docker Logs
- Reset Database

## 📁 Project Structure

```text
autogpt_platform/          # This folder
├── .devcontainer/         # Codespaces/devcontainer config
├── .vscode/               # VS Code settings
├── backend/               # Python FastAPI backend
│   ├── backend/           # Application code
│   ├── test/              # Test files + data seeders
│   └── migrations/        # Prisma migrations
├── frontend/              # Next.js frontend
│   ├── src/               # Application code
│   └── e2e/               # Playwright E2E tests
├── db/                    # Supabase configuration
├── docker-compose.yml     # Service orchestration
└── Makefile               # Common commands
```

## 🔧 Troubleshooting

### Services not starting?

```bash
# Check service status
docker compose ps

# View logs
docker compose logs -f

# Restart everything
docker compose down && docker compose up -d
```

### Database issues?

```bash
# Reset database (destroys all data)
make reset-db

# Re-run migrations
make migrate

# Re-seed test data
make test-data
```

### Port already in use?

```bash
# Check what's using the port
lsof -i :3000

# Kill process (if safe)
kill -9 <PID>
```

### Can't login?

- Ensure all services are running: `docker compose ps`
- Check auth service: `docker compose logs auth`
- Try seeding data again: `make test-data`

## 📝 Making Changes

### Backend Changes

1. Edit files in `backend/backend/`
2. If using `make run-backend`, changes auto-reload
3. Run `poetry run format` before committing

### Frontend Changes

1. Edit files in `frontend/src/`
2. If using `make run-frontend`, changes auto-reload
3. Run `pnpm format` before committing

### Database Schema Changes

1. Edit `backend/schema.prisma`
2. Run `poetry run prisma migrate dev --name your_migration`
3. Run `poetry run prisma generate`

## 🔒 Environment Variables

Default environment variables are configured for local development. For production secrets, use GitHub Codespaces Secrets:

1. Go to GitHub Settings → Codespaces → Secrets
2. Add secrets with names matching `.env` variables
3. They'll be automatically available in your codespace
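As a quick sanity check that a secret has landed in a fresh codespace, look for it in the shell environment; a minimal sketch (the name `OPENAI_API_KEY` is just an example, not a required variable):

```bash
# Codespaces injects each secret as an environment variable of the same name.
# Print only a prefix so the full value never lands in your terminal history:
printenv OPENAI_API_KEY | cut -c1-6 && echo "...(set)"
```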
## 📚 More Resources

- [AutoGPT Platform Docs](https://docs.agpt.co)
- [Codespaces Documentation](https://docs.github.com/en/codespaces)
- [Dev Containers Spec](https://containers.dev)
.devcontainer/platform/devcontainer.json (new file, 152 lines)

@@ -0,0 +1,152 @@
{
  "name": "AutoGPT Platform",
  "dockerComposeFile": "docker-compose.devcontainer.yml",
  "service": "devcontainer",
  "workspaceFolder": "/workspaces/AutoGPT/autogpt_platform",
  "shutdownAction": "stopCompose",

  // Features - Docker-in-Docker for full compose support
  "features": {
    "ghcr.io/devcontainers/features/docker-in-docker:2": {
      "dockerDashComposeVersion": "v2"
    },
    "ghcr.io/devcontainers/features/github-cli:1": {},
    "ghcr.io/devcontainers/features/node:1": {
      "version": "22",
      "nodeGypDependencies": true
    },
    "ghcr.io/devcontainers/features/python:1": {
      "version": "3.13",
      "installTools": true,
      "toolsToInstall": "flake8,autopep8,black,mypy,pytest,poetry"
    }
  },

  // Lifecycle scripts - paths relative to repo root
  "onCreateCommand": "bash .devcontainer/platform/scripts/oncreate.sh",
  "postCreateCommand": "bash .devcontainer/platform/scripts/postcreate.sh",
  "postStartCommand": "bash .devcontainer/platform/scripts/poststart.sh",

  // Port forwarding
  "forwardPorts": [
    3000,  // Frontend
    8006,  // REST API
    8001,  // WebSocket
    8000,  // Supabase Kong
    5432,  // PostgreSQL
    6379,  // Redis
    15672, // RabbitMQ Management
    5555   // Supabase Studio
  ],

  "portsAttributes": {
    "3000": { "label": "Frontend", "onAutoForward": "openBrowser" },
    "8006": { "label": "REST API", "onAutoForward": "notify" },
    "8001": { "label": "WebSocket", "onAutoForward": "silent" },
    "8000": { "label": "Supabase", "onAutoForward": "silent" },
    "5432": { "label": "PostgreSQL", "onAutoForward": "silent" },
    "6379": { "label": "Redis", "onAutoForward": "silent" },
    "15672": { "label": "RabbitMQ", "onAutoForward": "silent" },
    "5555": { "label": "Supabase Studio", "onAutoForward": "silent" }
  },

  // VS Code customizations
  "customizations": {
    "vscode": {
      "settings": {
        // Python
        "python.defaultInterpreterPath": "${workspaceFolder}/backend/.venv/bin/python",
        "python.analysis.typeCheckingMode": "basic",
        "python.testing.pytestEnabled": true,
        "python.testing.pytestArgs": ["backend"],

        // Formatting
        "[python]": {
          "editor.defaultFormatter": "ms-python.black-formatter",
          "editor.formatOnSave": true
        },
        "[typescript]": {
          "editor.defaultFormatter": "esbenp.prettier-vscode",
          "editor.formatOnSave": true
        },
        "[typescriptreact]": {
          "editor.defaultFormatter": "esbenp.prettier-vscode",
          "editor.formatOnSave": true
        },
        "[javascript]": {
          "editor.defaultFormatter": "esbenp.prettier-vscode",
          "editor.formatOnSave": true
        },

        // Editor
        "editor.rulers": [88, 120],
        "editor.tabSize": 2,
        "files.trimTrailingWhitespace": true,

        // Terminal
        "terminal.integrated.defaultProfile.linux": "bash",
        "terminal.integrated.cwd": "${workspaceFolder}",

        // Git
        "git.autofetch": true,
        "git.enableSmartCommit": true,
        "git.openRepositoryInParentFolders": "always",

        // Prisma
        "prisma.showPrismaDataPlatformNotification": false
      },

      "extensions": [
        // Python
        "ms-python.python",
        "ms-python.vscode-pylance",
        "ms-python.black-formatter",
        "ms-python.isort",

        // JavaScript/TypeScript
        "dbaeumer.vscode-eslint",
        "esbenp.prettier-vscode",
        "bradlc.vscode-tailwindcss",

        // Database
        "Prisma.prisma",

        // Testing
        "ms-playwright.playwright",

        // GitHub
        "GitHub.vscode-pull-request-github",
        "GitHub.copilot",
        "github.vscode-github-actions",

        // Utilities
        "eamodio.gitlens",
        "usernamehw.errorlens",
        "christian-kohler.path-intellisense",
        "mikestead.dotenv"
      ]
    },

    "codespaces": {
      "openFiles": [
        "README.md"
      ]
    }
  },

  // Environment variables
  "containerEnv": {
    "CODESPACES": "true",
    "POETRY_VIRTUALENVS_IN_PROJECT": "true"
  },

  // Run as non-root for security
  "remoteUser": "vscode",

  // Host requirements for performance
  "hostRequirements": {
    "cpus": 4,
    "memory": "8gb",
    "storage": "32gb"
  }
}
.devcontainer/platform/docker-compose.devcontainer.yml (new file, 21 lines)

@@ -0,0 +1,21 @@
# Standalone devcontainer service
# The platform services (db, redis, etc.) are started from within
# the container using docker compose commands in the lifecycle scripts.

services:
  devcontainer:
    image: mcr.microsoft.com/devcontainers/base:ubuntu-24.04
    volumes:
      # Mount the entire AutoGPT repo
      - ../..:/workspaces/AutoGPT:cached

    # Keep container running
    command: sleep infinity

    # Environment for development
    environment:
      - CODESPACES=true
      - POETRY_VIRTUALENVS_IN_PROJECT=true
      - POETRY_NO_INTERACTION=1
      - NEXT_TELEMETRY_DISABLED=1
      - DO_NOT_TRACK=1
.devcontainer/platform/scripts/oncreate.sh (new executable file, 142 lines)

@@ -0,0 +1,142 @@
#!/bin/bash
# =============================================================================
# ONCREATE SCRIPT - Runs during prebuild
# =============================================================================
# This script runs during the prebuild phase (GitHub Actions).
# It caches everything that's safe to cache:
#   ✅ Dependency Docker images (postgres, redis, rabbitmq, etc.)
#   ✅ Python packages (poetry install)
#   ✅ Node packages (pnpm install)
#
# It does NOT build backend/frontend Docker images because those would
# contain stale code from the prebuild branch, not the PR being reviewed.
# =============================================================================

set -e  # Exit on error
set -x  # Print commands for debugging

echo "🚀 Starting prebuild setup..."

# =============================================================================
# Setup PATH for tools installed by devcontainer features
# =============================================================================
# Python feature installs pipx at /usr/local/py-utils/bin
# Node feature installs nvm, node, pnpm at various locations
export PATH="/usr/local/py-utils/bin:$PATH"

# Source nvm if available (Node feature uses nvm)
export NVM_DIR="${NVM_DIR:-/usr/local/share/nvm}"
if [ -s "$NVM_DIR/nvm.sh" ]; then
  . "$NVM_DIR/nvm.sh"
fi

# =============================================================================
# Verify and Install Poetry
# =============================================================================
echo "📦 Setting up Poetry..."

if command -v poetry &> /dev/null; then
  echo "  Poetry already installed: $(poetry --version)"
else
  echo "  Installing Poetry via pipx..."
  if command -v pipx &> /dev/null; then
    pipx install poetry
  else
    echo "  pipx not found, installing poetry via pip..."
    pip install --user poetry
    export PATH="$HOME/.local/bin:$PATH"
  fi
fi

poetry --version || { echo "❌ Poetry installation failed"; exit 1; }

# =============================================================================
# Verify and Install pnpm
# =============================================================================
echo "📦 Setting up pnpm..."

if command -v pnpm &> /dev/null; then
  echo "  pnpm already installed: $(pnpm --version)"
else
  echo "  Installing pnpm via npm..."
  npm install -g pnpm
fi

pnpm --version || { echo "❌ pnpm installation failed"; exit 1; }

# =============================================================================
# Navigate to workspace
# =============================================================================
cd /workspaces/AutoGPT/autogpt_platform

# =============================================================================
# Install Backend Dependencies
# =============================================================================
echo "📦 Installing backend dependencies..."

cd backend
poetry install --no-interaction --no-ansi

# Generate Prisma client (schema only, no DB needed)
echo "🔧 Generating Prisma client..."
poetry run prisma generate || true
poetry run gen-prisma-stub || true

cd ..

# =============================================================================
# Install Frontend Dependencies
# =============================================================================
echo "📦 Installing frontend dependencies..."

cd frontend
pnpm install --frozen-lockfile
cd ..

# =============================================================================
# Pull Dependency Docker Images
# =============================================================================
# Use docker compose pull to get exact versions from compose files
# (single source of truth, no version drift)
# =============================================================================
echo "🐳 Pulling dependency Docker images..."

# Start Docker daemon if using docker-in-docker
if [ -e /var/run/docker-host.sock ]; then
  sudo ln -sf /var/run/docker-host.sock /var/run/docker.sock || true
fi

# Check if Docker is available
if command -v docker &> /dev/null && docker info &> /dev/null; then
  # Pull images defined in docker-compose.yml (single source of truth)
  docker compose pull db redis rabbitmq kong auth || echo "⚠️ Some images may not have pulled"
  echo "✅ Dependency images pulled"
else
  echo "⚠️ Docker not available during prebuild, images will be pulled on first start"
fi

# =============================================================================
# Copy environment files
# =============================================================================
echo "📄 Setting up environment files..."

cd /workspaces/AutoGPT/autogpt_platform

[ ! -f backend/.env ] && cp backend/.env.default backend/.env
[ ! -f frontend/.env ] && cp frontend/.env.default frontend/.env
[ ! -f .env ] && cp .env.default .env

# =============================================================================
# Done!
# =============================================================================
echo ""
echo "=============================================="
echo "✅ PREBUILD COMPLETE"
echo "=============================================="
echo ""
echo "Cached:"
echo "  ✅ Poetry $(poetry --version 2>/dev/null || echo '(check path)')"
echo "  ✅ pnpm $(pnpm --version 2>/dev/null || echo '(check path)')"
echo "  ✅ Python packages"
echo "  ✅ Node packages"
echo ""
.devcontainer/platform/scripts/postcreate.sh (new executable file, 131 lines)

@@ -0,0 +1,131 @@
#!/bin/bash
# =============================================================================
# POSTCREATE SCRIPT - Runs after container creation
# =============================================================================
# This script runs once when a codespace is first created. It starts the
# dependency services and prepares the environment for development.
#
# NOTE: Backend and Frontend run NATIVELY (not in Docker) to ensure you're
# always running the current branch's code, not stale prebuild code.
# =============================================================================

set -e  # Exit on error

echo "🚀 Setting up your development environment..."

# Ensure PATH includes pipx binaries (where poetry is installed)
export PATH="/usr/local/py-utils/bin:$PATH"

cd /workspaces/AutoGPT/autogpt_platform

# =============================================================================
# Ensure Docker is available
# =============================================================================
if [ -e /var/run/docker-host.sock ]; then
  sudo ln -sf /var/run/docker-host.sock /var/run/docker.sock 2>/dev/null || true
fi

# Wait for Docker to be ready
echo "⏳ Waiting for Docker..."
timeout 60 bash -c 'until docker info &>/dev/null; do sleep 1; done'
echo "✅ Docker is ready"

# =============================================================================
# Start Dependency Services ONLY
# =============================================================================
# We only start infrastructure deps in Docker.
# Backend/Frontend run natively to use the current branch's code.
# =============================================================================
echo "🐳 Starting dependency services..."

# Start core dependencies (DB, Auth, Redis, RabbitMQ)
docker compose up -d db redis rabbitmq kong auth

# Wait for PostgreSQL to be healthy
echo "⏳ Waiting for PostgreSQL..."
timeout 120 bash -c '
until docker compose exec -T db pg_isready -U postgres &>/dev/null; do
  sleep 2
  echo "  Waiting for database..."
done
'
echo "✅ PostgreSQL is ready"

# Wait for Redis
echo "⏳ Waiting for Redis..."
timeout 60 bash -c 'until docker compose exec -T redis redis-cli ping &>/dev/null; do sleep 1; done'
echo "✅ Redis is ready"

# Wait for RabbitMQ
echo "⏳ Waiting for RabbitMQ..."
timeout 90 bash -c 'until docker compose exec -T rabbitmq rabbitmq-diagnostics -q ping &>/dev/null; do sleep 2; done'
echo "✅ RabbitMQ is ready"

# =============================================================================
# Run Database Migrations
# =============================================================================
echo "🔄 Running database migrations..."

cd backend

# Run migrations
poetry run prisma migrate deploy
poetry run prisma generate
poetry run gen-prisma-stub || true

cd ..

# =============================================================================
# Seed Test Data (Minimal)
# =============================================================================
echo "🌱 Checking test data..."

cd backend

# Check if test data already exists (idempotent)
if poetry run python -c "
import asyncio
from backend.data.db import prisma

async def check():
    await prisma.connect()
    count = await prisma.user.count()
    await prisma.disconnect()
    return count > 0

print('exists' if asyncio.run(check()) else 'empty')
" 2>/dev/null | grep -q "exists"; then
  echo "  Test data already exists, skipping seed"
else
  echo "  Running E2E test data creator..."
  poetry run python test/e2e_test_data.py || echo "⚠️ Test data seeding had issues (may be partial)"
fi

cd ..

# =============================================================================
# Print Welcome Message
# =============================================================================
echo ""
echo "=============================================="
echo "🎉 CODESPACE READY!"
echo "=============================================="
echo ""
echo "📍 Services Running (Docker):"
echo "  PostgreSQL:  localhost:5432"
echo "  Redis:       localhost:6379"
echo "  RabbitMQ:    localhost:5672 (mgmt: 15672)"
echo "  Supabase:    localhost:8000"
echo ""
echo "🚀 Start Development:"
echo "  make run-backend   # Start backend (localhost:8006)"
echo "  make run-frontend  # Start frontend (localhost:3000)"
echo ""
echo "  Or run both in separate terminals!"
echo ""
echo "🔑 Test Account:"
echo "  Email:    test123@gmail.com"
echo "  Password: testpassword123"
echo ""
echo "📚 Full docs: .devcontainer/platform/README.md"
echo ""
.devcontainer/platform/scripts/poststart.sh (new executable file, 44 lines)

@@ -0,0 +1,44 @@
#!/bin/bash
# =============================================================================
# POSTSTART SCRIPT - Runs every time the codespace starts
# =============================================================================
# This script runs when:
#   1. Codespace is first created (after postcreate)
#   2. Codespace resumes from stopped state
#   3. Codespace rebuilds
#
# It ensures dependency services are running. Backend/Frontend are run
# manually by the developer for hot-reload during development.
# =============================================================================

echo "🔄 Starting dependency services..."

cd /workspaces/AutoGPT/autogpt_platform || { echo "❌ Failed to cd to workspace"; exit 1; }

# Ensure Docker socket is available
if [ -e /var/run/docker-host.sock ]; then
  sudo ln -sf /var/run/docker-host.sock /var/run/docker.sock 2>/dev/null || true
fi

# Wait for Docker
timeout 30 bash -c 'until docker info &>/dev/null; do sleep 1; done' || {
  echo "⚠️ Docker not available, services may need manual start"
  exit 0
}

# Start only dependency services (not backend/frontend)
docker compose up -d db redis rabbitmq kong auth

# Quick health check
echo "⏳ Waiting for services..."
sleep 5

if docker compose ps | grep -q "running"; then
  echo "✅ Dependency services are running"
  echo ""
  echo "🚀 Start development with:"
  echo "  make run-backend   # Terminal 1"
  echo "  make run-frontend  # Terminal 2"
else
  echo "⚠️ Some services may not be running. Try: docker compose up -d"
fi
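For reference, these three scripts are the entrypoints wired up by `devcontainer.json` (`onCreateCommand`, `postCreateCommand`, `postStartCommand`). When debugging the container setup it can help to run them by hand from the repo root; a sketch:

```bash
# Same entrypoints the devcontainer lifecycle hooks use, run manually:
bash .devcontainer/platform/scripts/oncreate.sh     # prebuild-safe caching
bash .devcontainer/platform/scripts/postcreate.sh   # start deps, migrate, seed
bash .devcontainer/platform/scripts/poststart.sh    # ensure deps are running
```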
.devcontainer/platform/vscode-templates/launch.json (new file, 110 lines)

@@ -0,0 +1,110 @@
{
  "version": "0.2.0",
  "configurations": [
    {
      "name": "Backend: Debug FastAPI",
      "type": "debugpy",
      "request": "launch",
      "module": "uvicorn",
      "args": [
        "backend.rest:app",
        "--reload",
        "--host", "0.0.0.0",
        "--port", "8006"
      ],
      "cwd": "${workspaceFolder}/backend",
      "env": {
        "PYTHONPATH": "${workspaceFolder}/backend"
      },
      "console": "integratedTerminal",
      "justMyCode": false
    },
    {
      "name": "Backend: Debug Executor",
      "type": "debugpy",
      "request": "launch",
      "module": "backend.exec",
      "cwd": "${workspaceFolder}/backend",
      "env": {
        "PYTHONPATH": "${workspaceFolder}/backend"
      },
      "console": "integratedTerminal",
      "justMyCode": false
    },
    {
      "name": "Backend: Debug WebSocket",
      "type": "debugpy",
      "request": "launch",
      "module": "backend.ws",
      "cwd": "${workspaceFolder}/backend",
      "env": {
        "PYTHONPATH": "${workspaceFolder}/backend"
      },
      "console": "integratedTerminal",
      "justMyCode": false
    },
    {
      "name": "Frontend: Debug Next.js",
      "type": "node",
      "request": "launch",
      "runtimeExecutable": "pnpm",
      "runtimeArgs": ["dev"],
      "cwd": "${workspaceFolder}/frontend",
      "console": "integratedTerminal",
      "serverReadyAction": {
        "pattern": "- Local:.+(https?://\\S+)",
        "uriFormat": "%s",
        "action": "openExternally"
      }
    },
    {
      "name": "Frontend: Debug Next.js (Server-side)",
      "type": "node",
      "request": "launch",
      "runtimeExecutable": "pnpm",
      "runtimeArgs": ["dev"],
      "cwd": "${workspaceFolder}/frontend",
      "env": {
        "NODE_OPTIONS": "--inspect"
      },
      "console": "integratedTerminal"
    },
    {
      "name": "Tests: Run Backend Tests",
      "type": "debugpy",
      "request": "launch",
      "module": "pytest",
      "args": ["-v", "--tb=short"],
      "cwd": "${workspaceFolder}/backend",
      "console": "integratedTerminal",
      "justMyCode": false
    },
    {
      "name": "Tests: Run E2E Tests (Playwright)",
      "type": "node",
      "request": "launch",
      "runtimeExecutable": "pnpm",
      "runtimeArgs": ["test:e2e"],
      "cwd": "${workspaceFolder}/frontend",
      "console": "integratedTerminal"
    },
    {
      "name": "Scripts: Seed Test Data",
      "type": "debugpy",
      "request": "launch",
      "program": "${workspaceFolder}/backend/test/e2e_test_data.py",
      "cwd": "${workspaceFolder}/backend",
      "env": {
        "PYTHONPATH": "${workspaceFolder}/backend"
      },
      "console": "integratedTerminal"
    }
  ],
  "compounds": [
    {
      "name": "Full Stack: Backend + Frontend",
      "configurations": ["Backend: Debug FastAPI", "Frontend: Debug Next.js"],
      "stopAll": true
    }
  ]
}
.devcontainer/platform/vscode-templates/tasks.json (new file, 147 lines)

@@ -0,0 +1,147 @@
{
  "version": "2.0.0",
  "tasks": [
    {
      "label": "Start All Services",
      "type": "shell",
      "command": "docker compose up -d",
      "options": { "cwd": "${workspaceFolder}" },
      "problemMatcher": [],
      "group": "build"
    },
    {
      "label": "Stop All Services",
      "type": "shell",
      "command": "docker compose stop",
      "options": { "cwd": "${workspaceFolder}" },
      "problemMatcher": []
    },
    {
      "label": "Start Core Services",
      "type": "shell",
      "command": "make start-core",
      "options": { "cwd": "${workspaceFolder}" },
      "problemMatcher": [],
      "group": "build"
    },
    {
      "label": "Run Backend (Dev Mode)",
      "type": "shell",
      "command": "poetry run app",
      "options": { "cwd": "${workspaceFolder}/backend" },
      "problemMatcher": [],
      "isBackground": true,
      "group": {
        "kind": "build",
        "isDefault": false
      }
    },
    {
      "label": "Run Frontend (Dev Mode)",
      "type": "shell",
      "command": "pnpm dev",
      "options": { "cwd": "${workspaceFolder}/frontend" },
      "problemMatcher": [],
      "isBackground": true,
      "group": {
        "kind": "build",
        "isDefault": true
      }
    },
    {
      "label": "Run Migrations",
      "type": "shell",
      "command": "make migrate",
      "options": { "cwd": "${workspaceFolder}" },
      "problemMatcher": []
    },
    {
      "label": "Seed Test Data",
      "type": "shell",
      "command": "make test-data",
      "options": { "cwd": "${workspaceFolder}" },
      "problemMatcher": []
    },
    {
      "label": "Format Code",
      "type": "shell",
      "command": "make format",
      "options": { "cwd": "${workspaceFolder}" },
      "problemMatcher": []
    },
    {
      "label": "Backend: Run Tests",
      "type": "shell",
      "command": "poetry run test",
      "options": { "cwd": "${workspaceFolder}/backend" },
      "problemMatcher": [],
      "group": "test"
    },
    {
      "label": "Frontend: Run Tests",
      "type": "shell",
      "command": "pnpm test",
      "options": { "cwd": "${workspaceFolder}/frontend" },
      "problemMatcher": [],
      "group": "test"
    },
    {
      "label": "Frontend: Run E2E Tests",
      "type": "shell",
      "command": "pnpm test:e2e",
      "options": { "cwd": "${workspaceFolder}/frontend" },
      "problemMatcher": [],
      "group": "test"
    },
    {
      "label": "Generate API Client",
      "type": "shell",
      "command": "pnpm generate:api",
      "options": { "cwd": "${workspaceFolder}/frontend" },
      "problemMatcher": []
    },
    {
      "label": "View Docker Logs",
      "type": "shell",
      "command": "docker compose logs -f",
      "options": { "cwd": "${workspaceFolder}" },
      "problemMatcher": [],
      "isBackground": true
    },
    {
      "label": "Reset Database",
      "type": "shell",
      "command": "make reset-db",
      "options": { "cwd": "${workspaceFolder}" },
      "problemMatcher": []
    }
  ]
}
@@ -22,7 +22,7 @@ jobs:
     runs-on: ubuntu-latest
     steps:
       - name: Checkout code
-        uses: actions/checkout@v4
+        uses: actions/checkout@v6
         with:
           ref: ${{ github.event.workflow_run.head_branch }}
           fetch-depth: 0

.github/workflows/claude-dependabot.yml (vendored, 2 changes)

@@ -30,7 +30,7 @@ jobs:
       actions: read # Required for CI access
     steps:
       - name: Checkout code
-        uses: actions/checkout@v4
+        uses: actions/checkout@v6
         with:
           fetch-depth: 1

.github/workflows/claude.yml (vendored, 2 changes)

@@ -40,7 +40,7 @@ jobs:
       actions: read # Required for CI access
     steps:
       - name: Checkout code
-        uses: actions/checkout@v4
+        uses: actions/checkout@v6
         with:
           fetch-depth: 1

.github/workflows/codeql.yml (vendored, 2 changes)

@@ -58,7 +58,7 @@ jobs:
     # your codebase is analyzed, see https://docs.github.com/en/code-security/code-scanning/creating-an-advanced-setup-for-code-scanning/codeql-code-scanning-for-compiled-languages
     steps:
       - name: Checkout repository
-        uses: actions/checkout@v4
+        uses: actions/checkout@v6

       # Initializes the CodeQL tools for scanning.
       - name: Initialize CodeQL

.github/workflows/copilot-setup-steps.yml (vendored, 2 changes)

@@ -27,7 +27,7 @@ jobs:
     # If you do not check out your code, Copilot will do this for you.
     steps:
       - name: Checkout code
-        uses: actions/checkout@v4
+        uses: actions/checkout@v6
         with:
           fetch-depth: 0
           submodules: true

.github/workflows/docs-block-sync.yml (vendored, 2 changes)

@@ -23,7 +23,7 @@ jobs:
     steps:
       - name: Checkout code
-        uses: actions/checkout@v4
+        uses: actions/checkout@v6
         with:
           fetch-depth: 1

.github/workflows/docs-claude-review.yml (vendored, 2 changes)

@@ -23,7 +23,7 @@ jobs:
     steps:
       - name: Checkout code
-        uses: actions/checkout@v4
+        uses: actions/checkout@v6
         with:
           fetch-depth: 0

.github/workflows/docs-enhance.yml (vendored, 2 changes)

@@ -28,7 +28,7 @@ jobs:
     steps:
       - name: Checkout code
-        uses: actions/checkout@v4
+        uses: actions/checkout@v6
         with:
           fetch-depth: 1

@@ -25,7 +25,7 @@ jobs:
     steps:
       - name: Checkout code
-        uses: actions/checkout@v4
+        uses: actions/checkout@v6
         with:
           ref: ${{ github.event.inputs.git_ref || github.ref_name }}

@@ -52,7 +52,7 @@ jobs:
     runs-on: ubuntu-latest
     steps:
       - name: Trigger deploy workflow
-        uses: peter-evans/repository-dispatch@v3
+        uses: peter-evans/repository-dispatch@v4
        with:
          token: ${{ secrets.DEPLOY_TOKEN }}
          repository: Significant-Gravitas/AutoGPT_cloud_infrastructure

@@ -17,7 +17,7 @@ jobs:
     steps:
       - name: Checkout code
-        uses: actions/checkout@v4
+        uses: actions/checkout@v6
         with:
           ref: ${{ github.ref_name || 'master' }}

@@ -45,7 +45,7 @@ jobs:
     runs-on: ubuntu-latest
     steps:
       - name: Trigger deploy workflow
-        uses: peter-evans/repository-dispatch@v3
+        uses: peter-evans/repository-dispatch@v4
         with:
           token: ${{ secrets.DEPLOY_TOKEN }}
           repository: Significant-Gravitas/AutoGPT_cloud_infrastructure

.github/workflows/platform-backend-ci.yml (vendored, 2 changes)

@@ -68,7 +68,7 @@ jobs:
     steps:
       - name: Checkout repository
-        uses: actions/checkout@v4
+        uses: actions/checkout@v6
         with:
           fetch-depth: 0
           submodules: true

@@ -82,7 +82,7 @@ jobs:
       - name: Dispatch Deploy Event
         if: steps.check_status.outputs.should_deploy == 'true'
-        uses: peter-evans/repository-dispatch@v3
+        uses: peter-evans/repository-dispatch@v4
         with:
           token: ${{ secrets.DISPATCH_TOKEN }}
           repository: Significant-Gravitas/AutoGPT_cloud_infrastructure

@@ -110,7 +110,7 @@ jobs:
       - name: Dispatch Undeploy Event (from comment)
         if: steps.check_status.outputs.should_undeploy == 'true'
-        uses: peter-evans/repository-dispatch@v3
+        uses: peter-evans/repository-dispatch@v4
         with:
           token: ${{ secrets.DISPATCH_TOKEN }}
           repository: Significant-Gravitas/AutoGPT_cloud_infrastructure

@@ -168,7 +168,7 @@ jobs:
           github.event_name == 'pull_request' &&
           github.event.action == 'closed' &&
           steps.check_pr_close.outputs.should_undeploy == 'true'
-        uses: peter-evans/repository-dispatch@v3
+        uses: peter-evans/repository-dispatch@v4
         with:
           token: ${{ secrets.DISPATCH_TOKEN }}
           repository: Significant-Gravitas/AutoGPT_cloud_infrastructure

.github/workflows/platform-frontend-ci.yml (vendored, 10 changes)

@@ -31,7 +31,7 @@ jobs:
     steps:
       - name: Checkout repository
-        uses: actions/checkout@v4
+        uses: actions/checkout@v6

       - name: Check for component changes
         uses: dorny/paths-filter@v3

@@ -71,7 +71,7 @@ jobs:
     steps:
       - name: Checkout repository
-        uses: actions/checkout@v4
+        uses: actions/checkout@v6

       - name: Set up Node.js
         uses: actions/setup-node@v6

@@ -107,7 +107,7 @@ jobs:
     steps:
       - name: Checkout repository
-        uses: actions/checkout@v4
+        uses: actions/checkout@v6
         with:
           fetch-depth: 0

@@ -148,7 +148,7 @@ jobs:
     steps:
       - name: Checkout repository
-        uses: actions/checkout@v4
+        uses: actions/checkout@v6
         with:
           submodules: recursive

@@ -277,7 +277,7 @@ jobs:
     steps:
       - name: Checkout repository
-        uses: actions/checkout@v4
+        uses: actions/checkout@v6
         with:
           submodules: recursive

.github/workflows/platform-fullstack-ci.yml (vendored, 4 changes)

@@ -29,7 +29,7 @@ jobs:
     steps:
       - name: Checkout repository
-        uses: actions/checkout@v4
+        uses: actions/checkout@v6

       - name: Set up Node.js
         uses: actions/setup-node@v6

@@ -63,7 +63,7 @@ jobs:
     steps:
       - name: Checkout repository
-        uses: actions/checkout@v4
+        uses: actions/checkout@v6
         with:
           submodules: recursive

.github/workflows/repo-workflow-checker.yml (vendored, 2 changes)

@@ -11,7 +11,7 @@ jobs:
     steps:
       # - name: Wait some time for all actions to start
       #   run: sleep 30
-      - uses: actions/checkout@v4
+      - uses: actions/checkout@v6
       # with:
       #   fetch-depth: 0
       - name: Set up Python
@@ -10,6 +10,8 @@ from typing import Any

 from pydantic import BaseModel, Field

+from backend.util.json import dumps as json_dumps
+

 class ResponseType(str, Enum):
     """Types of streaming responses following AI SDK protocol."""

@@ -193,6 +195,18 @@ class StreamError(StreamBaseResponse):
         default=None, description="Additional error details"
     )

+    def to_sse(self) -> str:
+        """Convert to SSE format, only emitting fields required by AI SDK protocol.
+
+        The AI SDK uses z.strictObject({type, errorText}) which rejects
+        any extra fields like `code` or `details`.
+        """
+        data = {
+            "type": self.type.value,
+            "errorText": self.errorText,
+        }
+        return f"data: {json_dumps(data)}\n\n"
+

 class StreamHeartbeat(StreamBaseResponse):
     """Heartbeat to keep SSE connection alive during long-running operations.
@@ -21,43 +21,71 @@ logger = logging.getLogger(__name__)

 class HumanInTheLoopBlock(Block):
     """
-    This block pauses execution and waits for human approval or modification of the data.
+    Pauses execution and waits for human approval or rejection of the data.

-    When executed, it creates a pending review entry and sets the node execution status
-    to REVIEW. The execution will remain paused until a human user either:
-    - Approves the data (with or without modifications)
-    - Rejects the data
+    When executed, this block creates a pending review entry and sets the node execution
+    status to REVIEW. The execution remains paused until a human user either approves
+    or rejects the data.

-    This is useful for workflows that require human validation or intervention before
-    proceeding to the next steps.
+    **How it works:**
+    - The input data is presented to a human reviewer
+    - The reviewer can approve or reject (and optionally modify the data if editable)
+    - On approval: the data flows out through the `approved_data` output pin
+    - On rejection: the data flows out through the `rejected_data` output pin
+
+    **Important:** The output pins yield the actual data itself, NOT status strings.
+    The approval/rejection decision determines WHICH output pin fires, not the value.
+    You do NOT need to compare the output to "APPROVED" or "REJECTED" - simply connect
+    downstream blocks to the appropriate output pin for each case.
+
+    **Example usage:**
+    - Connect `approved_data` → next step in your workflow (data was approved)
+    - Connect `rejected_data` → error handling or notification (data was rejected)
     """

     class Input(BlockSchemaInput):
-        data: Any = SchemaField(description="The data to be reviewed by a human user")
+        data: Any = SchemaField(
+            description="The data to be reviewed by a human user. "
+            "This exact data will be passed through to either approved_data or "
+            "rejected_data output based on the reviewer's decision."
+        )
         name: str = SchemaField(
-            description="A descriptive name for what this data represents",
+            description="A descriptive name for what this data represents. "
+            "This helps the reviewer understand what they are reviewing.",
         )
         editable: bool = SchemaField(
-            description="Whether the human reviewer can edit the data",
+            description="Whether the human reviewer can edit the data before "
+            "approving or rejecting it",
             default=True,
             advanced=True,
         )

     class Output(BlockSchemaOutput):
         approved_data: Any = SchemaField(
-            description="The data when approved (may be modified by reviewer)"
+            description="Outputs the input data when the reviewer APPROVES it. "
+            "The value is the actual data itself (not a status string like 'APPROVED'). "
+            "If the reviewer edited the data, this contains the modified version. "
+            "Connect downstream blocks here for the 'approved' workflow path."
         )
         rejected_data: Any = SchemaField(
-            description="The data when rejected (may be modified by reviewer)"
+            description="Outputs the input data when the reviewer REJECTS it. "
+            "The value is the actual data itself (not a status string like 'REJECTED'). "
+            "If the reviewer edited the data, this contains the modified version. "
+            "Connect downstream blocks here for the 'rejected' workflow path."
         )
         review_message: str = SchemaField(
-            description="Any message provided by the reviewer", default=""
+            description="Optional message provided by the reviewer explaining their "
+            "decision. Only outputs when the reviewer provides a message; "
+            "this pin does not fire if no message was given.",
+            default="",
         )

     def __init__(self):
         super().__init__(
             id="8b2a7b3c-6e9d-4a5f-8c1b-2e3f4a5b6c7d",
-            description="Pause execution and wait for human approval or modification of data",
+            description="Pause execution for human review. Data flows through "
+            "approved_data or rejected_data output based on the reviewer's decision. "
+            "Outputs contain the actual data, not status strings.",
             categories={BlockCategory.BASIC},
             input_schema=HumanInTheLoopBlock.Input,
             output_schema=HumanInTheLoopBlock.Output,
@@ -743,6 +743,11 @@ class GraphModel(Graph, GraphMeta):
             # For invalid blocks, we still raise immediately as this is a structural issue
             raise ValueError(f"Invalid block {node.block_id} for node #{node.id}")

+        if block.disabled:
+            raise ValueError(
+                f"Block {node.block_id} is disabled and cannot be used in graphs"
+            )
+
         node_input_mask = (
             nodes_input_masks.get(node.id, {}) if nodes_input_masks else {}
         )
@@ -213,6 +213,9 @@ async def execute_node(
         block_name=node_block.name,
     )

+    if node_block.disabled:
+        raise ValueError(f"Block {node_block.id} is disabled and cannot be executed")
+
     # Sanity check: validate the execution input.
     input_data, error = validate_exec(node, data.inputs, resolve_input=False)
     if input_data is None:
@@ -364,6 +364,44 @@ def _remove_orphan_tool_responses(
     return result


+def validate_and_remove_orphan_tool_responses(
+    messages: list[dict],
+    log_warning: bool = True,
+) -> list[dict]:
+    """
+    Validate tool_call/tool_response pairs and remove orphaned responses.
+
+    Scans messages in order, tracking all tool_call IDs. Any tool response
+    referencing an ID not seen in a preceding message is considered orphaned
+    and removed. This prevents API errors like Anthropic's "unexpected tool_use_id".
+
+    Args:
+        messages: List of messages to validate (OpenAI or Anthropic format)
+        log_warning: Whether to log a warning when orphans are found
+
+    Returns:
+        A new list with orphaned tool responses removed
+    """
+    available_ids: set[str] = set()
+    orphan_ids: set[str] = set()
+
+    for msg in messages:
+        available_ids |= _extract_tool_call_ids_from_message(msg)
+        for resp_id in _extract_tool_response_ids_from_message(msg):
+            if resp_id not in available_ids:
+                orphan_ids.add(resp_id)
+
+    if not orphan_ids:
+        return messages
+
+    if log_warning:
+        logger.warning(
+            f"Removing {len(orphan_ids)} orphan tool response(s): {orphan_ids}"
+        )
+
+    return _remove_orphan_tool_responses(messages, orphan_ids)
+
+
 def _ensure_tool_pairs_intact(
     recent_messages: list[dict],
     all_messages: list[dict],

@@ -723,6 +761,13 @@ async def compress_context(

     # Filter out any None values that may have been introduced
     final_msgs: list[dict] = [m for m in msgs if m is not None]

+    # ---- STEP 6: Final tool-pair validation ---------------------------------
+    # After all compression steps, verify that every tool response has a
+    # matching tool_call in a preceding assistant message. Remove orphans
+    # to prevent API errors (e.g., Anthropic's "unexpected tool_use_id").
+    final_msgs = validate_and_remove_orphan_tool_responses(final_msgs)
+
     final_count = sum(_msg_tokens(m, enc) for m in final_msgs)
     error = None
     if final_count + reserve > target_tokens:
autogpt_platform/backend/poetry.lock (generated, 10 changes)

@@ -46,14 +46,14 @@ pycares = ">=4.9.0,<5"

 [[package]]
 name = "aiofiles"
-version = "24.1.0"
+version = "25.1.0"
 description = "File support for asyncio."
 optional = false
-python-versions = ">=3.8"
+python-versions = ">=3.9"
 groups = ["main"]
 files = [
-    {file = "aiofiles-24.1.0-py3-none-any.whl", hash = "sha256:b4ec55f4195e3eb5d7abd1bf7e061763e864dd4954231fb8539a0ef8bb8260e5"},
-    {file = "aiofiles-24.1.0.tar.gz", hash = "sha256:22a075c9e5a3810f0c2e48f3008c94d68c65d763b9b03857924c99e57355166c"},
+    {file = "aiofiles-25.1.0-py3-none-any.whl", hash = "sha256:abe311e527c862958650f9438e859c1fa7568a141b22abcd015e120e86a85695"},
+    {file = "aiofiles-25.1.0.tar.gz", hash = "sha256:a8d728f0a29de45dc521f18f07297428d56992a742f0cd2701ba86e44d23d5b2"},
 ]

 [[package]]

@@ -8440,4 +8440,4 @@ cffi = ["cffi (>=1.17,<2.0) ; platform_python_implementation != \"PyPy\" and pyt
 [metadata]
 lock-version = "2.1"
 python-versions = ">=3.10,<3.14"
-content-hash = "fc135114e01de39c8adf70f6132045e7d44a19473c1279aee0978de65aad1655"
+content-hash = "c06e96ad49388ba7a46786e9ea55ea2c1a57408e15613237b4bee40a592a12af"

@@ -76,7 +76,7 @@ yt-dlp = "2025.12.08"
 zerobouncesdk = "^1.1.2"
 # NOTE: please insert new dependencies in their alphabetical location
 pytest-snapshot = "^0.9.0"
-aiofiles = "^24.1.0"
+aiofiles = "^25.1.0"
 tiktoken = "^0.12.0"
 aioclamd = "^1.0.0"
 setuptools = "^80.9.0"
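For context, a lockfile hunk like the one above is typically produced by bumping the constraint and re-locking; a sketch of the usual Poetry workflow (run from `backend/`), not necessarily the exact commands used here:

```bash
# Bump aiofiles and refresh poetry.lock (this regenerates the content-hash):
poetry add "aiofiles@^25.1.0"

# Or, after editing pyproject.toml by hand:
poetry lock && poetry install
```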
@@ -10,8 +10,9 @@ import {
   MessageResponse,
 } from "@/components/ai-elements/message";
 import { LoadingSpinner } from "@/components/atoms/LoadingSpinner/LoadingSpinner";
+import { toast } from "@/components/molecules/Toast/use-toast";
 import { ToolUIPart, UIDataTypes, UIMessage, UITools } from "ai";
-import { useEffect, useState } from "react";
+import { useEffect, useRef, useState } from "react";
 import { CreateAgentTool } from "../../tools/CreateAgent/CreateAgent";
 import { EditAgentTool } from "../../tools/EditAgent/EditAgent";
 import { FindAgentsTool } from "../../tools/FindAgents/FindAgents";

@@ -121,6 +122,7 @@ export const ChatMessagesContainer = ({
   isLoading,
 }: ChatMessagesContainerProps) => {
   const [thinkingPhrase, setThinkingPhrase] = useState(getRandomPhrase);
+  const lastToastTimeRef = useRef(0);

   useEffect(() => {
     if (status === "submitted") {

@@ -128,6 +130,20 @@ export const ChatMessagesContainer = ({
     }
   }, [status]);

+  // Show a toast when a new error occurs, debounced to avoid spam
+  useEffect(() => {
+    if (!error) return;
+    const now = Date.now();
+    if (now - lastToastTimeRef.current < 3_000) return;
+    lastToastTimeRef.current = now;
+    toast({
+      variant: "destructive",
+      title: "Something went wrong",
+      description:
+        "The assistant encountered an error. Please try sending your message again.",
+    });
+  }, [error]);
+
   const lastMessage = messages[messages.length - 1];
   const lastAssistantHasVisibleContent =
     lastMessage?.role === "assistant" &&

@@ -263,8 +279,12 @@ export const ChatMessagesContainer = ({
         </Message>
       )}
       {error && (
-        <div className="rounded-lg bg-red-50 p-3 text-red-600">
-          Error: {error.message}
+        <div className="rounded-lg bg-red-50 p-4 text-sm text-red-700">
+          <p className="font-medium">Something went wrong</p>
+          <p className="mt-1 text-red-600">
+            The assistant encountered an error. Please try sending your
+            message again.
+          </p>
         </div>
       )}
     </ConversationContent>
@@ -30,7 +30,7 @@ export function ContentCard({
   return (
     <div
       className={cn(
-        "rounded-lg bg-gradient-to-r from-purple-500/30 to-blue-500/30 p-[1px]",
+        "min-w-0 rounded-lg bg-gradient-to-r from-purple-500/30 to-blue-500/30 p-[1px]",
         className,
       )}
     >

@@ -4,7 +4,6 @@ import { WarningDiamondIcon } from "@phosphor-icons/react";
 import type { ToolUIPart } from "ai";
 import { useCopilotChatActions } from "../../components/CopilotChatActionsProvider/useCopilotChatActions";
 import { MorphingTextAnimation } from "../../components/MorphingTextAnimation/MorphingTextAnimation";
-import { OrbitLoader } from "../../components/OrbitLoader/OrbitLoader";
 import { ProgressBar } from "../../components/ProgressBar/ProgressBar";
 import {
   ContentCardDescription,

@@ -77,7 +76,7 @@ function getAccordionMeta(output: CreateAgentToolOutput) {
     isOperationInProgressOutput(output)
   ) {
     return {
-      icon: <OrbitLoader size={32} />,
+      icon,
       title: "Creating agent, this may take a few minutes. Sit back and relax.",
     };
   }

@@ -203,7 +203,7 @@ export function getAccordionMeta(output: RunAgentToolOutput): {
     ? output.status.trim()
     : "started";
   return {
-    icon: <OrbitLoader size={28} className="text-neutral-700" />,
+    icon,
     title: output.graph_name,
     description: `Status: ${statusText}`,
   };

@@ -149,7 +149,7 @@ export function getAccordionMeta(output: RunBlockToolOutput): {
   if (isRunBlockBlockOutput(output)) {
     const keys = Object.keys(output.outputs ?? {});
     return {
-      icon: <OrbitLoader size={24} className="text-neutral-700" />,
+      icon,
       title: output.block_name,
       description:
         keys.length > 0
@@ -1,11 +1,8 @@
|
||||
import { environment } from "@/services/environment";
|
||||
import { getServerAuthToken } from "@/lib/autogpt-server-api/helpers";
|
||||
import { NextRequest } from "next/server";
|
||||
import { normalizeSSEStream, SSE_HEADERS } from "../../../sse-helpers";
|
||||
|
||||
/**
|
||||
* SSE Proxy for chat streaming.
|
||||
* Supports POST with context (page content + URL) in the request body.
|
||||
*/
|
||||
export async function POST(
|
||||
request: NextRequest,
|
||||
{ params }: { params: Promise<{ sessionId: string }> },
|
||||
@@ -23,17 +20,14 @@ export async function POST(
|
||||
);
|
||||
}
|
||||
|
||||
// Get auth token from server-side session
|
||||
const token = await getServerAuthToken();
|
||||
|
||||
// Build backend URL
|
||||
const backendUrl = environment.getAGPTServerBaseUrl();
|
||||
const streamUrl = new URL(
|
||||
`/api/chat/sessions/${sessionId}/stream`,
|
||||
backendUrl,
|
||||
);
|
||||
|
||||
// Forward request to backend with auth header
|
||||
const headers: Record<string, string> = {
|
||||
"Content-Type": "application/json",
|
||||
Accept: "text/event-stream",
|
||||
@@ -63,14 +57,15 @@ export async function POST(
       });
     }

-    // Return the SSE stream directly
-    return new Response(response.body, {
-      headers: {
-        "Content-Type": "text/event-stream",
-        "Cache-Control": "no-cache, no-transform",
-        Connection: "keep-alive",
-        "X-Accel-Buffering": "no",
-      },
+    if (!response.body) {
+      return new Response(
+        JSON.stringify({ error: "Empty response from chat service" }),
+        { status: 502, headers: { "Content-Type": "application/json" } },
+      );
+    }
+
+    return new Response(normalizeSSEStream(response.body), {
+      headers: SSE_HEADERS,
     });
   } catch (error) {
     console.error("SSE proxy error:", error);
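For orientation, a minimal sketch of how a client might consume this proxy; the request body shape (`{ message, context }`) is an assumption for illustration, not confirmed by this diff:

```ts
// Hypothetical client call to the POST proxy above; body fields are assumed.
async function sendChatMessage(sessionId: string, message: string) {
  const res = await fetch(`/api/chat/sessions/${sessionId}/stream`, {
    method: "POST",
    headers: { "Content-Type": "application/json", Accept: "text/event-stream" },
    body: JSON.stringify({
      message,
      context: { url: window.location.href }, // page context, per the doc comment
    }),
  });
  if (!res.ok || !res.body) throw new Error(`Stream failed: ${res.status}`);

  const reader = res.body.pipeThrough(new TextDecoderStream()).getReader();
  for (;;) {
    const { value, done } = await reader.read();
    if (done) break;
    console.log(value); // raw SSE frames, error events already normalized
  }
}
```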
@@ -87,13 +82,6 @@ export async function POST(
   }
 }

 /**
  * Resume an active stream for a session.
  *
  * Called by the AI SDK's `useChat(resume: true)` on page load.
  * Proxies to the backend which checks for an active stream and either
  * replays it (200 + SSE) or returns 204 No Content.
  */
 export async function GET(
   _request: NextRequest,
   { params }: { params: Promise<{ sessionId: string }> },
@@ -124,7 +112,6 @@ export async function GET(
     headers,
   });

   // 204 = no active stream to resume
   if (response.status === 204) {
     return new Response(null, { status: 204 });
   }
@@ -137,12 +124,13 @@ export async function GET(
       });
     }

-    return new Response(response.body, {
-      headers: {
-        "Content-Type": "text/event-stream",
-        "Cache-Control": "no-cache, no-transform",
-        Connection: "keep-alive",
-        "X-Accel-Buffering": "no",
-        "x-vercel-ai-ui-message-stream": "v1",
-      },
+    if (!response.body) {
+      return new Response(null, { status: 204 });
+    }
+
+    return new Response(normalizeSSEStream(response.body), {
+      headers: {
+        ...SSE_HEADERS,
+        "x-vercel-ai-ui-message-stream": "v1",
+      },
     });
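For context, the resume path on the client might be wired like this sketch, based only on the `useChat(resume: true)` reference in the doc comment; option names may vary across AI SDK versions:

```tsx
"use client";
import { useChat } from "@ai-sdk/react";

export function ChatSession({ sessionId }: { sessionId: string }) {
  // On mount, resume issues GET /api/chat/sessions/{sessionId}/stream;
  // a 200 replays the active stream, a 204 means nothing to resume.
  const { messages } = useChat({ id: sessionId, resume: true });

  return <pre>{JSON.stringify(messages, null, 2)}</pre>;
}
```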
autogpt_platform/frontend/src/app/api/chat/sse-helpers.ts (new file, +72 lines)
@@ -0,0 +1,72 @@
+export const SSE_HEADERS = {
+  "Content-Type": "text/event-stream",
+  "Cache-Control": "no-cache, no-transform",
+  Connection: "keep-alive",
+  "X-Accel-Buffering": "no",
+} as const;
+
+export function normalizeSSEStream(
+  input: ReadableStream<Uint8Array>,
+): ReadableStream<Uint8Array> {
+  const decoder = new TextDecoder();
+  const encoder = new TextEncoder();
+  let buffer = "";
+
+  return input.pipeThrough(
+    new TransformStream<Uint8Array, Uint8Array>({
+      transform(chunk, controller) {
+        buffer += decoder.decode(chunk, { stream: true });
+
+        const parts = buffer.split("\n\n");
+        buffer = parts.pop() ?? "";
+
+        for (const part of parts) {
+          const normalized = normalizeSSEEvent(part);
+          controller.enqueue(encoder.encode(normalized + "\n\n"));
+        }
+      },
+      flush(controller) {
+        if (buffer.trim()) {
+          const normalized = normalizeSSEEvent(buffer);
+          controller.enqueue(encoder.encode(normalized + "\n\n"));
+        }
+      },
+    }),
+  );
+}
+
+function normalizeSSEEvent(event: string): string {
+  const lines = event.split("\n");
+  const dataLines: string[] = [];
+  const otherLines: string[] = [];
+
+  for (const line of lines) {
+    if (line.startsWith("data: ")) {
+      dataLines.push(line.slice(6));
+    } else {
+      otherLines.push(line);
+    }
+  }
+
+  if (dataLines.length === 0) return event;
+
+  const dataStr = dataLines.join("\n");
+  try {
+    const parsed = JSON.parse(dataStr) as Record<string, unknown>;
+    if (parsed.type === "error") {
+      const normalized = {
+        type: "error",
+        errorText:
+          typeof parsed.errorText === "string"
+            ? parsed.errorText
+            : "An unexpected error occurred",
+      };
+      const newData = `data: ${JSON.stringify(normalized)}`;
+      return [...otherLines.filter((l) => l.length > 0), newData].join("\n");
+    }
+  } catch {
+    // Not valid JSON — pass through as-is
+  }
+
+  return event;
+}
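To make the helper's behavior concrete, a small sketch (going through the exported `normalizeSSEStream`, since `normalizeSSEEvent` is module-private; the relative import path is assumed):

```ts
import { normalizeSSEStream } from "./sse-helpers";

// An error event whose errorText is not a string...
const raw = 'data: {"type":"error","errorText":42}\n\n';
const input = new Response(raw).body!; // ReadableStream<Uint8Array>

const out = await new Response(normalizeSSEStream(input)).text();
// out === 'data: {"type":"error","errorText":"An unexpected error occurred"}\n\n'
// Non-JSON data and non-error events pass through unchanged.
```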
@@ -1,20 +1,8 @@
|
||||
import { environment } from "@/services/environment";
|
||||
import { getServerAuthToken } from "@/lib/autogpt-server-api/helpers";
|
||||
import { NextRequest } from "next/server";
|
||||
import { normalizeSSEStream, SSE_HEADERS } from "../../../sse-helpers";
|
||||
|
||||
/**
|
||||
* SSE Proxy for task stream reconnection.
|
||||
*
|
||||
* This endpoint allows clients to reconnect to an ongoing or recently completed
|
||||
* background task's stream. It replays missed messages from Redis Streams and
|
||||
* subscribes to live updates if the task is still running.
|
||||
*
|
||||
* Client contract:
|
||||
* 1. When receiving an operation_started event, store the task_id
|
||||
* 2. To reconnect: GET /api/chat/tasks/{taskId}/stream?last_message_id={idx}
|
||||
* 3. Messages are replayed from the last_message_id position
|
||||
* 4. Stream ends when "finish" event is received
|
||||
*/
|
||||
export async function GET(
|
||||
request: NextRequest,
|
||||
{ params }: { params: Promise<{ taskId: string }> },
|
||||
@@ -24,15 +12,12 @@ export async function GET(
   const lastMessageId = searchParams.get("last_message_id") || "0-0";

   try {
     // Get auth token from server-side session
     const token = await getServerAuthToken();

     // Build backend URL
     const backendUrl = environment.getAGPTServerBaseUrl();
     const streamUrl = new URL(`/api/chat/tasks/${taskId}/stream`, backendUrl);
     streamUrl.searchParams.set("last_message_id", lastMessageId);

     // Forward request to backend with auth header
     const headers: Record<string, string> = {
       Accept: "text/event-stream",
       "Cache-Control": "no-cache",
@@ -56,14 +41,12 @@ export async function GET(
       });
     }

-    // Return the SSE stream directly
-    return new Response(response.body, {
-      headers: {
-        "Content-Type": "text/event-stream",
-        "Cache-Control": "no-cache, no-transform",
-        Connection: "keep-alive",
-        "X-Accel-Buffering": "no",
-      },
+    if (!response.body) {
+      return new Response(null, { status: 204 });
+    }
+
+    return new Response(normalizeSSEStream(response.body), {
+      headers: SSE_HEADERS,
     });
   } catch (error) {
     console.error("Task stream proxy error:", error);
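The client contract above might translate into a reconnect helper like this sketch (hypothetical; only the URL shape, the `last_message_id` query param, and the finish event come from the doc comment):

```ts
// Reconnect to a running task's stream, replaying from the last seen id.
async function resumeTaskStream(taskId: string, lastMessageId = "0-0") {
  const url = `/api/chat/tasks/${taskId}/stream?last_message_id=${encodeURIComponent(lastMessageId)}`;
  const res = await fetch(url, { headers: { Accept: "text/event-stream" } });
  if (res.status === 204 || !res.body) return; // nothing to resume

  const reader = res.body.pipeThrough(new TextDecoderStream()).getReader();
  for (;;) {
    const { value, done } = await reader.read();
    if (done) break;
    console.log(value); // replayed frames first, then live updates
    if (value.includes('"finish"')) break; // stream ends on the finish event
  }
}
```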
@@ -61,7 +61,7 @@ Below is a comprehensive list of all available blocks, categorized by their prim
 | [Get List Item](block-integrations/basic.md#get-list-item) | Returns the element at the given index |
 | [Get Store Agent Details](block-integrations/system/store_operations.md#get-store-agent-details) | Get detailed information about an agent from the store |
 | [Get Weather Information](block-integrations/basic.md#get-weather-information) | Retrieves weather information for a specified location using OpenWeatherMap API |
-| [Human In The Loop](block-integrations/basic.md#human-in-the-loop) | Pause execution and wait for human approval or modification of data |
+| [Human In The Loop](block-integrations/basic.md#human-in-the-loop) | Pause execution for human review |
 | [List Is Empty](block-integrations/basic.md#list-is-empty) | Checks if a list is empty |
 | [List Library Agents](block-integrations/system/library_operations.md#list-library-agents) | List all agents in your personal library |
 | [Note](block-integrations/basic.md#note) | A visual annotation block that displays a sticky note in the workflow editor for documentation and organization purposes |
@@ -975,7 +975,7 @@ A travel planning application could use this block to provide users with current
 ## Human In The Loop

 ### What it is
-Pause execution and wait for human approval or modification of data
+Pause execution for human review. Data flows through approved_data or rejected_data output based on the reviewer's decision. Outputs contain the actual data, not status strings.

 ### How it works
 <!-- MANUAL: how_it_works -->
@@ -988,18 +988,18 @@ This enables human oversight at critical points in automated workflows, ensuring

 | Input | Description | Type | Required |
 |-------|-------------|------|----------|
-| data | The data to be reviewed by a human user | Data | Yes |
-| name | A descriptive name for what this data represents | str | Yes |
-| editable | Whether the human reviewer can edit the data | bool | No |
+| data | The data to be reviewed by a human user. This exact data will be passed through to either approved_data or rejected_data output based on the reviewer's decision. | Data | Yes |
+| name | A descriptive name for what this data represents. This helps the reviewer understand what they are reviewing. | str | Yes |
+| editable | Whether the human reviewer can edit the data before approving or rejecting it | bool | No |

 ### Outputs

 | Output | Description | Type |
 |--------|-------------|------|
 | error | Error message if the operation failed | str |
-| approved_data | The data when approved (may be modified by reviewer) | Approved Data |
-| rejected_data | The data when rejected (may be modified by reviewer) | Rejected Data |
-| review_message | Any message provided by the reviewer | str |
+| approved_data | Outputs the input data when the reviewer APPROVES it. The value is the actual data itself (not a status string like 'APPROVED'). If the reviewer edited the data, this contains the modified version. Connect downstream blocks here for the 'approved' workflow path. | Approved Data |
+| rejected_data | Outputs the input data when the reviewer REJECTS it. The value is the actual data itself (not a status string like 'REJECTED'). If the reviewer edited the data, this contains the modified version. Connect downstream blocks here for the 'rejected' workflow path. | Rejected Data |
+| review_message | Optional message provided by the reviewer explaining their decision. Only outputs when the reviewer provides a message; this pin does not fire if no message was given. | str |
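As a hypothetical sketch of the pass-through semantics described above (invented names; not the block's actual implementation):

```ts
type ReviewDecision = {
  approved: boolean;
  editedData?: unknown; // set only when the reviewer edited the data
  message?: string; // set only when the reviewer left a message
};

// The block forwards the (possibly edited) input to exactly one output pin.
function resolveOutputs(data: unknown, decision: ReviewDecision) {
  const value = decision.editedData ?? data;
  return {
    approved_data: decision.approved ? value : undefined, // approved path
    rejected_data: decision.approved ? undefined : value, // rejected path
    review_message: decision.message, // fires only when defined
  };
}
```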
 ### Possible use case
 <!-- MANUAL: use_case -->