Compare commits

12 Commits

Author SHA1 Message Date
Nick Tindle
a2856c1863 fix: Prevent binary file data loss on storage failure
- Log at WARNING level instead of DEBUG
- Fall back to storing data URI in content field for binary files
  if workspace storage fails, preventing permanent data loss
2026-02-11 12:15:32 -06:00
Nick Tindle
d9daf3e6db fix: Remove redundant Dockerfile check
TEXT_EXTENSIONS already contains 'Dockerfile'
2026-02-11 12:14:58 -06:00
Nick Tindle
66f9f3a12a docs: Regenerate block docs after schema changes 2026-02-11 12:13:23 -06:00
Nick Tindle
9c4c29b096 fix: Type errors in sandbox_files.py
- Convert bytearray to bytes for ExtractedFile.content
- Wrap data_uri with MediaFileType for store_media_file
2026-02-11 11:54:14 -06:00
Nick Tindle
2f2a031b2c refactor: Use SandboxFileOutput directly in ClaudeCodeBlock
- Remove duplicate FileOutput class from ClaudeCodeBlock
- Use shared SandboxFileOutput model directly in output schema
- Remove unnecessary conversion loop
- Remove unused BaseModel import
2026-02-11 11:35:05 -06:00
Nick Tindle
e72c6681d8 fix: Remove out-of-scope files and regenerate docs
- Remove accidentally committed test_disabled_block_bypass.py
- Remove accidentally committed ActivityDropdown.stories.tsx
- Regenerate block documentation with updated output schemas
2026-02-11 11:14:29 -06:00
Nicholas Tindle
8bed3aee27 Delete autogpt_platform/frontend/src/components/layout/Navbar/components/AgentActivityDropdown/components/ActivityDropdown/ActivityDropdown.stories.tsx 2026-02-11 11:09:21 -06:00
Nicholas Tindle
488ba642c6 Delete autogpt_platform/backend/test_disabled_block_bypass.py 2026-02-11 11:08:35 -06:00
Nick Tindle
c839fee53d fix: Remove default from output schema field 2026-02-11 10:58:35 -06:00
Nick Tindle
931c1c2fcd fix: Update test outputs for workspace_ref field
- Add files output to ExecuteCodeBlock test_output
- Add workspace_ref: None to ClaudeCodeBlock test expected output
2026-02-11 10:55:12 -06:00
Nicholas Tindle
3f36be2d7a Merge branch 'dev' into feat/sandbox-file-workspace-storage 2026-02-11 10:38:16 -06:00
Nick Tindle
b98fbc40ee feat(blocks): Store sandbox files to workspace
- Add shared sandbox_files.py utility for file extraction and workspace storage
- Update Claude Code block to use shared utility and add workspace_ref field
- Update Code Executor block to extract files from /output directory
- Files are stored via store_media_file() with virus scanning and size limits
- Backward compatible: content field preserved, workspace_ref added as optional

Closes SECRT-1931
2026-02-11 10:18:07 -06:00
13 changed files with 380 additions and 1113 deletions

View File

@@ -1,206 +0,0 @@
# GitHub Codespaces for AutoGPT Platform
This dev container provides a complete development environment for the AutoGPT Platform, optimized for PR reviews.
## 🚀 Quick Start
1. **Open in Codespaces:**
- Go to the repository on GitHub
- Click **Code** → **Codespaces** → **Create codespace on dev**
- Or click the badge: [![Open in GitHub Codespaces](https://github.com/codespaces/badge.svg)](https://codespaces.new/Significant-Gravitas/AutoGPT?quickstart=1)
2. **Wait for setup** (~60 seconds with prebuild, ~5-10 min without)
3. **Start the servers:**
```bash
# Terminal 1
make run-backend
# Terminal 2
make run-frontend
```
4. **Start developing!**
- Frontend: `http://localhost:3000`
- Login with: `test123@gmail.com` / `testpassword123`
## 🏗️ Architecture
**Dependencies run in Docker** (cached by prebuild):
- PostgreSQL, Redis, RabbitMQ, Supabase Auth
**Backend & Frontend run natively** (not cached):
- This ensures you're always running the current branch's code
- Enables hot-reload during development
- VS Code debugger can attach directly
## 📍 Available Services
| Service | URL | Notes |
|---------|-----|-------|
| Frontend | http://localhost:3000 | Next.js app |
| REST API | http://localhost:8006 | FastAPI backend |
| WebSocket | ws://localhost:8001 | Real-time updates |
| Supabase | http://localhost:8000 | Auth & API gateway |
| Supabase Studio | http://localhost:5555 | Database admin |
| RabbitMQ | http://localhost:15672 | Queue management |
## 🔑 Test Accounts
| Email | Password | Role |
|-------|----------|------|
| test123@gmail.com | testpassword123 | Featured Creator |
The test account has:
- Pre-created agents and workflows
- Published store listings
- Active agent executions
- Reviews and ratings
## 🛠️ Development Commands
```bash
# Navigate to platform directory (terminal starts here by default)
cd autogpt_platform
# Start all services
docker compose up -d
# Or just core services (DB, Redis, RabbitMQ)
make start-core
# Run backend in dev mode (hot reload)
make run-backend
# Run frontend in dev mode (hot reload)
make run-frontend
# Run both backend and frontend
# (Use VS Code's "Full Stack" launch config for debugging)
# Format code
make format
# Run tests
make test-data # Regenerate test data
poetry run test # Backend tests (from backend/)
pnpm test:e2e # E2E tests (from frontend/)
```
## 🐛 Debugging
### VS Code Launch Configs
> **Note:** Launch and task configs are in `.devcontainer/vscode-templates/`.
> To use them locally, copy to `.vscode/`:
> ```bash
> cp .devcontainer/vscode-templates/*.json .vscode/
> ```
> In Codespaces, core settings are auto-applied via devcontainer.json.
Press `F5` or use the Run and Debug panel:
- **Backend: Debug FastAPI** - Debug the REST API server
- **Backend: Debug Executor** - Debug the agent executor
- **Frontend: Debug Next.js** - Debug with browser DevTools
- **Full Stack: Backend + Frontend** - Debug both simultaneously
- **Tests: Run E2E Tests** - Run Playwright tests
### VS Code Tasks
Press `Ctrl+Shift+P` → "Tasks: Run Task":
- Start/Stop All Services
- Run Migrations
- Seed Test Data
- View Docker Logs
- Reset Database
## 📁 Project Structure
```text
autogpt_platform/ # This folder
├── .devcontainer/ # Codespaces/devcontainer config
├── .vscode/ # VS Code settings
├── backend/ # Python FastAPI backend
│ ├── backend/ # Application code
│ ├── test/ # Test files + data seeders
│ └── migrations/ # Prisma migrations
├── frontend/ # Next.js frontend
│ ├── src/ # Application code
│ └── e2e/ # Playwright E2E tests
├── db/ # Supabase configuration
├── docker-compose.yml # Service orchestration
└── Makefile # Common commands
```
## 🔧 Troubleshooting
### Services not starting?
```bash
# Check service status
docker compose ps
# View logs
docker compose logs -f
# Restart everything
docker compose down && docker compose up -d
```
### Database issues?
```bash
# Reset database (destroys all data)
make reset-db
# Re-run migrations
make migrate
# Re-seed test data
make test-data
```
### Port already in use?
```bash
# Check what's using the port
lsof -i :3000
# Kill process (if safe)
kill -9 <PID>
```
### Can't login?
- Ensure all services are running: `docker compose ps`
- Check auth service: `docker compose logs auth`
- Try seeding data again: `make test-data`
## 📝 Making Changes
### Backend Changes
1. Edit files in `backend/backend/`
2. If using `make run-backend`, changes auto-reload
3. Run `poetry run format` before committing
### Frontend Changes
1. Edit files in `frontend/src/`
2. If using `make run-frontend`, changes auto-reload
3. Run `pnpm format` before committing
### Database Schema Changes
1. Edit `backend/schema.prisma`
2. Run `poetry run prisma migrate dev --name your_migration`
3. Run `poetry run prisma generate`
## 🔒 Environment Variables
Default environment variables are configured for local development. For production secrets, use GitHub Codespaces Secrets:
1. Go to GitHub Settings → Codespaces → Secrets
2. Add secrets with names matching `.env` variables
3. They'll be automatically available in your codespace
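For example, a secret named `OPENAI_API_KEY` (a hypothetical name; use whatever variable your `.env` expects) becomes a plain environment variable inside the codespace. A minimal sketch to verify it from Python:
```python
import os

# Hypothetical secret name; substitute the variable your .env expects.
api_key = os.environ.get("OPENAI_API_KEY")
if api_key:
    print("Secret is available in this codespace")
else:
    print("Not set: add it under GitHub Settings → Codespaces → Secrets")
```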
## 📚 More Resources
- [AutoGPT Platform Docs](https://docs.agpt.co)
- [Codespaces Documentation](https://docs.github.com/en/codespaces)
- [Dev Containers Spec](https://containers.dev)

View File

@@ -1,152 +0,0 @@
{
"name": "AutoGPT Platform",
"dockerComposeFile": "docker-compose.devcontainer.yml",
"service": "devcontainer",
"workspaceFolder": "/workspaces/AutoGPT/autogpt_platform",
"shutdownAction": "stopCompose",
// Features - Docker-in-Docker for full compose support
"features": {
"ghcr.io/devcontainers/features/docker-in-docker:2": {
"dockerDashComposeVersion": "v2"
},
"ghcr.io/devcontainers/features/github-cli:1": {},
"ghcr.io/devcontainers/features/node:1": {
"version": "22",
"nodeGypDependencies": true
},
"ghcr.io/devcontainers/features/python:1": {
"version": "3.13",
"installTools": true,
"toolsToInstall": "flake8,autopep8,black,mypy,pytest,poetry"
}
},
// Lifecycle scripts - paths relative to repo root
"onCreateCommand": "bash .devcontainer/platform/scripts/oncreate.sh",
"postCreateCommand": "bash .devcontainer/platform/scripts/postcreate.sh",
"postStartCommand": "bash .devcontainer/platform/scripts/poststart.sh",
// Port forwarding
"forwardPorts": [
3000, // Frontend
8006, // REST API
8001, // WebSocket
8000, // Supabase Kong
5432, // PostgreSQL
6379, // Redis
15672, // RabbitMQ Management
5555 // Supabase Studio
],
"portsAttributes": {
"3000": { "label": "Frontend", "onAutoForward": "openBrowser" },
"8006": { "label": "REST API", "onAutoForward": "notify" },
"8001": { "label": "WebSocket", "onAutoForward": "silent" },
"8000": { "label": "Supabase", "onAutoForward": "silent" },
"5432": { "label": "PostgreSQL", "onAutoForward": "silent" },
"6379": { "label": "Redis", "onAutoForward": "silent" },
"15672": { "label": "RabbitMQ", "onAutoForward": "silent" },
"5555": { "label": "Supabase Studio", "onAutoForward": "silent" }
},
// VS Code customizations
"customizations": {
"vscode": {
"settings": {
// Python
"python.defaultInterpreterPath": "${workspaceFolder}/backend/.venv/bin/python",
"python.analysis.typeCheckingMode": "basic",
"python.testing.pytestEnabled": true,
"python.testing.pytestArgs": ["backend"],
// Formatting
"[python]": {
"editor.defaultFormatter": "ms-python.black-formatter",
"editor.formatOnSave": true
},
"[typescript]": {
"editor.defaultFormatter": "esbenp.prettier-vscode",
"editor.formatOnSave": true
},
"[typescriptreact]": {
"editor.defaultFormatter": "esbenp.prettier-vscode",
"editor.formatOnSave": true
},
"[javascript]": {
"editor.defaultFormatter": "esbenp.prettier-vscode",
"editor.formatOnSave": true
},
// Editor
"editor.rulers": [88, 120],
"editor.tabSize": 2,
"files.trimTrailingWhitespace": true,
// Terminal
"terminal.integrated.defaultProfile.linux": "bash",
"terminal.integrated.cwd": "${workspaceFolder}",
// Git
"git.autofetch": true,
"git.enableSmartCommit": true,
"git.openRepositoryInParentFolders": "always",
// Prisma
"prisma.showPrismaDataPlatformNotification": false
},
"extensions": [
// Python
"ms-python.python",
"ms-python.vscode-pylance",
"ms-python.black-formatter",
"ms-python.isort",
// JavaScript/TypeScript
"dbaeumer.vscode-eslint",
"esbenp.prettier-vscode",
"bradlc.vscode-tailwindcss",
// Database
"Prisma.prisma",
// Testing
"ms-playwright.playwright",
// GitHub
"GitHub.vscode-pull-request-github",
"GitHub.copilot",
"github.vscode-github-actions",
// Utilities
"eamodio.gitlens",
"usernamehw.errorlens",
"christian-kohler.path-intellisense",
"mikestead.dotenv"
]
},
"codespaces": {
"openFiles": [
"README.md"
]
}
},
// Environment variables
"containerEnv": {
"CODESPACES": "true",
"POETRY_VIRTUALENVS_IN_PROJECT": "true"
},
// Run as non-root for security
"remoteUser": "vscode",
// Host requirements for performance
"hostRequirements": {
"cpus": 4,
"memory": "8gb",
"storage": "32gb"
}
}

View File

@@ -1,21 +0,0 @@
# Standalone devcontainer service
# The platform services (db, redis, etc.) are started from within
# the container using docker compose commands in the lifecycle scripts.
services:
devcontainer:
image: mcr.microsoft.com/devcontainers/base:ubuntu-24.04
volumes:
# Mount the entire AutoGPT repo
- ../..:/workspaces/AutoGPT:cached
# Keep container running
command: sleep infinity
# Environment for development
environment:
- CODESPACES=true
- POETRY_VIRTUALENVS_IN_PROJECT=true
- POETRY_NO_INTERACTION=1
- NEXT_TELEMETRY_DISABLED=1
- DO_NOT_TRACK=1

View File

@@ -1,142 +0,0 @@
#!/bin/bash
# =============================================================================
# ONCREATE SCRIPT - Runs during prebuild
# =============================================================================
# This script runs during the prebuild phase (GitHub Actions).
# It caches everything that's safe to cache:
# ✅ Dependency Docker images (postgres, redis, rabbitmq, etc.)
# ✅ Python packages (poetry install)
# ✅ Node packages (pnpm install)
#
# It does NOT build backend/frontend Docker images because those would
# contain stale code from the prebuild branch, not the PR being reviewed.
# =============================================================================
set -e # Exit on error
set -x # Print commands for debugging
echo "🚀 Starting prebuild setup..."
# =============================================================================
# Setup PATH for tools installed by devcontainer features
# =============================================================================
# Python feature installs pipx at /usr/local/py-utils/bin
# Node feature installs nvm, node, pnpm at various locations
export PATH="/usr/local/py-utils/bin:$PATH"
# Source nvm if available (Node feature uses nvm)
export NVM_DIR="${NVM_DIR:-/usr/local/share/nvm}"
if [ -s "$NVM_DIR/nvm.sh" ]; then
. "$NVM_DIR/nvm.sh"
fi
# =============================================================================
# Verify and Install Poetry
# =============================================================================
echo "📦 Setting up Poetry..."
if command -v poetry &> /dev/null; then
echo " Poetry already installed: $(poetry --version)"
else
echo " Installing Poetry via pipx..."
if command -v pipx &> /dev/null; then
pipx install poetry
else
echo " pipx not found, installing poetry via pip..."
pip install --user poetry
export PATH="$HOME/.local/bin:$PATH"
fi
fi
poetry --version || { echo "❌ Poetry installation failed"; exit 1; }
# =============================================================================
# Verify and Install pnpm
# =============================================================================
echo "📦 Setting up pnpm..."
if command -v pnpm &> /dev/null; then
echo " pnpm already installed: $(pnpm --version)"
else
echo " Installing pnpm via npm..."
npm install -g pnpm
fi
pnpm --version || { echo "❌ pnpm installation failed"; exit 1; }
# =============================================================================
# Navigate to workspace
# =============================================================================
cd /workspaces/AutoGPT/autogpt_platform
# =============================================================================
# Install Backend Dependencies
# =============================================================================
echo "📦 Installing backend dependencies..."
cd backend
poetry install --no-interaction --no-ansi
# Generate Prisma client (schema only, no DB needed)
echo "🔧 Generating Prisma client..."
poetry run prisma generate || true
poetry run gen-prisma-stub || true
cd ..
# =============================================================================
# Install Frontend Dependencies
# =============================================================================
echo "📦 Installing frontend dependencies..."
cd frontend
pnpm install --frozen-lockfile
cd ..
# =============================================================================
# Pull Dependency Docker Images
# =============================================================================
# Use docker compose pull to get exact versions from compose files
# (single source of truth, no version drift)
# =============================================================================
echo "🐳 Pulling dependency Docker images..."
# Start Docker daemon if using docker-in-docker
if [ -e /var/run/docker-host.sock ]; then
sudo ln -sf /var/run/docker-host.sock /var/run/docker.sock || true
fi
# Check if Docker is available
if command -v docker &> /dev/null && docker info &> /dev/null; then
# Pull images defined in docker-compose.yml (single source of truth)
docker compose pull db redis rabbitmq kong auth || echo "⚠️ Some images may not have been pulled"
echo "✅ Dependency images pulled"
else
echo "⚠️ Docker not available during prebuild, images will be pulled on first start"
fi
# =============================================================================
# Copy environment files
# =============================================================================
echo "📄 Setting up environment files..."
cd /workspaces/AutoGPT/autogpt_platform
[ ! -f backend/.env ] && cp backend/.env.default backend/.env
[ ! -f frontend/.env ] && cp frontend/.env.default frontend/.env
[ ! -f .env ] && cp .env.default .env
# =============================================================================
# Done!
# =============================================================================
echo ""
echo "=============================================="
echo "✅ PREBUILD COMPLETE"
echo "=============================================="
echo ""
echo "Cached:"
echo " ✅ Poetry $(poetry --version 2>/dev/null || echo '(check path)')"
echo " ✅ pnpm $(pnpm --version 2>/dev/null || echo '(check path)')"
echo " ✅ Python packages"
echo " ✅ Node packages"
echo ""

View File

@@ -1,131 +0,0 @@
#!/bin/bash
# =============================================================================
# POSTCREATE SCRIPT - Runs after container creation
# =============================================================================
# This script runs once when a codespace is first created. It starts the
# dependency services and prepares the environment for development.
#
# NOTE: Backend and Frontend run NATIVELY (not in Docker) to ensure you're
# always running the current branch's code, not stale prebuild code.
# =============================================================================
set -e # Exit on error
echo "🚀 Setting up your development environment..."
# Ensure PATH includes pipx binaries (where poetry is installed)
export PATH="/usr/local/py-utils/bin:$PATH"
cd /workspaces/AutoGPT/autogpt_platform
# =============================================================================
# Ensure Docker is available
# =============================================================================
if [ -e /var/run/docker-host.sock ]; then
sudo ln -sf /var/run/docker-host.sock /var/run/docker.sock 2>/dev/null || true
fi
# Wait for Docker to be ready
echo "⏳ Waiting for Docker..."
timeout 60 bash -c 'until docker info &>/dev/null; do sleep 1; done'
echo "✅ Docker is ready"
# =============================================================================
# Start Dependency Services ONLY
# =============================================================================
# We only start infrastructure deps in Docker.
# Backend/Frontend run natively to use the current branch's code.
# =============================================================================
echo "🐳 Starting dependency services..."
# Start core dependencies (DB, Auth, Redis, RabbitMQ)
docker compose up -d db redis rabbitmq kong auth
# Wait for PostgreSQL to be healthy
echo "⏳ Waiting for PostgreSQL..."
timeout 120 bash -c '
until docker compose exec -T db pg_isready -U postgres &>/dev/null; do
sleep 2
echo " Waiting for database..."
done
'
echo "✅ PostgreSQL is ready"
# Wait for Redis
echo "⏳ Waiting for Redis..."
timeout 60 bash -c 'until docker compose exec -T redis redis-cli ping &>/dev/null; do sleep 1; done'
echo "✅ Redis is ready"
# Wait for RabbitMQ
echo "⏳ Waiting for RabbitMQ..."
timeout 90 bash -c 'until docker compose exec -T rabbitmq rabbitmq-diagnostics -q ping &>/dev/null; do sleep 2; done'
echo "✅ RabbitMQ is ready"
# =============================================================================
# Run Database Migrations
# =============================================================================
echo "🔄 Running database migrations..."
cd backend
# Run migrations
poetry run prisma migrate deploy
poetry run prisma generate
poetry run gen-prisma-stub || true
cd ..
# =============================================================================
# Seed Test Data (Minimal)
# =============================================================================
echo "🌱 Checking test data..."
cd backend
# Check if test data already exists (idempotent)
if poetry run python -c "
import asyncio
from backend.data.db import prisma

async def check():
    await prisma.connect()
    count = await prisma.user.count()
    await prisma.disconnect()
    return count > 0

print('exists' if asyncio.run(check()) else 'empty')
" 2>/dev/null | grep -q "exists"; then
echo " Test data already exists, skipping seed"
else
echo " Running E2E test data creator..."
poetry run python test/e2e_test_data.py || echo "⚠️ Test data seeding had issues (may be partial)"
fi
cd ..
# =============================================================================
# Print Welcome Message
# =============================================================================
echo ""
echo "=============================================="
echo "🎉 CODESPACE READY!"
echo "=============================================="
echo ""
echo "📍 Services Running (Docker):"
echo " PostgreSQL: localhost:5432"
echo " Redis: localhost:6379"
echo " RabbitMQ: localhost:5672 (mgmt: 15672)"
echo " Supabase: localhost:8000"
echo ""
echo "🚀 Start Development:"
echo " make run-backend # Start backend (localhost:8006)"
echo " make run-frontend # Start frontend (localhost:3000)"
echo ""
echo " Or run both in separate terminals!"
echo ""
echo "🔑 Test Account:"
echo " Email: test123@gmail.com"
echo " Password: testpassword123"
echo ""
echo "📚 Full docs: .devcontainer/platform/README.md"
echo ""

View File

@@ -1,44 +0,0 @@
#!/bin/bash
# =============================================================================
# POSTSTART SCRIPT - Runs every time the codespace starts
# =============================================================================
# This script runs when:
# 1. Codespace is first created (after postcreate)
# 2. Codespace resumes from stopped state
# 3. Codespace rebuilds
#
# It ensures dependency services are running. Backend/Frontend are run
# manually by the developer for hot-reload during development.
# =============================================================================
echo "🔄 Starting dependency services..."
cd /workspaces/AutoGPT/autogpt_platform || { echo "❌ Failed to cd to workspace"; exit 1; }
# Ensure Docker socket is available
if [ -e /var/run/docker-host.sock ]; then
sudo ln -sf /var/run/docker-host.sock /var/run/docker.sock 2>/dev/null || true
fi
# Wait for Docker
timeout 30 bash -c 'until docker info &>/dev/null; do sleep 1; done' || {
echo "⚠️ Docker not available, services may need manual start"
exit 0
}
# Start only dependency services (not backend/frontend)
docker compose up -d db redis rabbitmq kong auth
# Quick health check
echo "⏳ Waiting for services..."
sleep 5
if docker compose ps | grep -q "running"; then
echo "✅ Dependency services are running"
echo ""
echo "🚀 Start development with:"
echo " make run-backend # Terminal 1"
echo " make run-frontend # Terminal 2"
else
echo "⚠️ Some services may not be running. Try: docker compose up -d"
fi

View File

@@ -1,110 +0,0 @@
{
"version": "0.2.0",
"configurations": [
{
"name": "Backend: Debug FastAPI",
"type": "debugpy",
"request": "launch",
"module": "uvicorn",
"args": [
"backend.rest:app",
"--reload",
"--host", "0.0.0.0",
"--port", "8006"
],
"cwd": "${workspaceFolder}/backend",
"env": {
"PYTHONPATH": "${workspaceFolder}/backend"
},
"console": "integratedTerminal",
"justMyCode": false
},
{
"name": "Backend: Debug Executor",
"type": "debugpy",
"request": "launch",
"module": "backend.exec",
"cwd": "${workspaceFolder}/backend",
"env": {
"PYTHONPATH": "${workspaceFolder}/backend"
},
"console": "integratedTerminal",
"justMyCode": false
},
{
"name": "Backend: Debug WebSocket",
"type": "debugpy",
"request": "launch",
"module": "backend.ws",
"cwd": "${workspaceFolder}/backend",
"env": {
"PYTHONPATH": "${workspaceFolder}/backend"
},
"console": "integratedTerminal",
"justMyCode": false
},
{
"name": "Frontend: Debug Next.js",
"type": "node",
"request": "launch",
"runtimeExecutable": "pnpm",
"runtimeArgs": ["dev"],
"cwd": "${workspaceFolder}/frontend",
"console": "integratedTerminal",
"serverReadyAction": {
"pattern": "- Local:.+(https?://\\S+)",
"uriFormat": "%s",
"action": "openExternally"
}
},
{
"name": "Frontend: Debug Next.js (Server-side)",
"type": "node",
"request": "launch",
"runtimeExecutable": "pnpm",
"runtimeArgs": ["dev"],
"cwd": "${workspaceFolder}/frontend",
"env": {
"NODE_OPTIONS": "--inspect"
},
"console": "integratedTerminal"
},
{
"name": "Tests: Run Backend Tests",
"type": "debugpy",
"request": "launch",
"module": "pytest",
"args": ["-v", "--tb=short"],
"cwd": "${workspaceFolder}/backend",
"console": "integratedTerminal",
"justMyCode": false
},
{
"name": "Tests: Run E2E Tests (Playwright)",
"type": "node",
"request": "launch",
"runtimeExecutable": "pnpm",
"runtimeArgs": ["test:e2e"],
"cwd": "${workspaceFolder}/frontend",
"console": "integratedTerminal"
},
{
"name": "Scripts: Seed Test Data",
"type": "debugpy",
"request": "launch",
"program": "${workspaceFolder}/backend/test/e2e_test_data.py",
"cwd": "${workspaceFolder}/backend",
"env": {
"PYTHONPATH": "${workspaceFolder}/backend"
},
"console": "integratedTerminal"
}
],
"compounds": [
{
"name": "Full Stack: Backend + Frontend",
"configurations": ["Backend: Debug FastAPI", "Frontend: Debug Next.js"],
"stopAll": true
}
]
}

View File

@@ -1,147 +0,0 @@
{
"version": "2.0.0",
"tasks": [
{
"label": "Start All Services",
"type": "shell",
"command": "docker compose up -d",
"options": {
"cwd": "${workspaceFolder}"
},
"problemMatcher": [],
"group": "build"
},
{
"label": "Stop All Services",
"type": "shell",
"command": "docker compose stop",
"options": {
"cwd": "${workspaceFolder}"
},
"problemMatcher": []
},
{
"label": "Start Core Services",
"type": "shell",
"command": "make start-core",
"options": {
"cwd": "${workspaceFolder}"
},
"problemMatcher": [],
"group": "build"
},
{
"label": "Run Backend (Dev Mode)",
"type": "shell",
"command": "poetry run app",
"options": {
"cwd": "${workspaceFolder}/backend"
},
"problemMatcher": [],
"isBackground": true,
"group": {
"kind": "build",
"isDefault": false
}
},
{
"label": "Run Frontend (Dev Mode)",
"type": "shell",
"command": "pnpm dev",
"options": {
"cwd": "${workspaceFolder}/frontend"
},
"problemMatcher": [],
"isBackground": true,
"group": {
"kind": "build",
"isDefault": true
}
},
{
"label": "Run Migrations",
"type": "shell",
"command": "make migrate",
"options": {
"cwd": "${workspaceFolder}"
},
"problemMatcher": []
},
{
"label": "Seed Test Data",
"type": "shell",
"command": "make test-data",
"options": {
"cwd": "${workspaceFolder}"
},
"problemMatcher": []
},
{
"label": "Format Code",
"type": "shell",
"command": "make format",
"options": {
"cwd": "${workspaceFolder}"
},
"problemMatcher": []
},
{
"label": "Backend: Run Tests",
"type": "shell",
"command": "poetry run test",
"options": {
"cwd": "${workspaceFolder}/backend"
},
"problemMatcher": [],
"group": "test"
},
{
"label": "Frontend: Run Tests",
"type": "shell",
"command": "pnpm test",
"options": {
"cwd": "${workspaceFolder}/frontend"
},
"problemMatcher": [],
"group": "test"
},
{
"label": "Frontend: Run E2E Tests",
"type": "shell",
"command": "pnpm test:e2e",
"options": {
"cwd": "${workspaceFolder}/frontend"
},
"problemMatcher": [],
"group": "test"
},
{
"label": "Generate API Client",
"type": "shell",
"command": "pnpm generate:api",
"options": {
"cwd": "${workspaceFolder}/frontend"
},
"problemMatcher": []
},
{
"label": "View Docker Logs",
"type": "shell",
"command": "docker compose logs -f",
"options": {
"cwd": "${workspaceFolder}"
},
"problemMatcher": [],
"isBackground": true
},
{
"label": "Reset Database",
"type": "shell",
"command": "make reset-db",
"options": {
"cwd": "${workspaceFolder}"
},
"problemMatcher": []
}
]
}

View File

@@ -1,10 +1,10 @@
import json
import shlex
import uuid
from typing import Literal, Optional
from typing import TYPE_CHECKING, Literal, Optional
from e2b import AsyncSandbox as BaseAsyncSandbox
from pydantic import BaseModel, SecretStr
from pydantic import SecretStr
from backend.data.block import (
Block,
@@ -20,6 +20,13 @@ from backend.data.model import (
SchemaField,
)
from backend.integrations.providers import ProviderName
from backend.util.sandbox_files import (
SandboxFileOutput,
extract_and_store_sandbox_files,
)
if TYPE_CHECKING:
from backend.executor.utils import ExecutionContext
class ClaudeCodeExecutionError(Exception):
@@ -174,22 +181,15 @@ class ClaudeCodeBlock(Block):
advanced=True,
)
class FileOutput(BaseModel):
"""A file extracted from the sandbox."""
path: str
relative_path: str # Path relative to working directory (for GitHub, etc.)
name: str
content: str
class Output(BlockSchemaOutput):
response: str = SchemaField(
description="The output/response from Claude Code execution"
)
files: list["ClaudeCodeBlock.FileOutput"] = SchemaField(
files: list[SandboxFileOutput] = SchemaField(
description=(
"List of text files created/modified by Claude Code during this execution. "
"Each file has 'path', 'relative_path', 'name', and 'content' fields."
"Each file has 'path', 'relative_path', 'name', 'content', and 'workspace_ref' fields. "
"workspace_ref contains a workspace:// URI if the file was stored to workspace."
)
)
conversation_history: str = SchemaField(
@@ -252,6 +252,7 @@ class ClaudeCodeBlock(Block):
"relative_path": "index.html",
"name": "index.html",
"content": "<html>Hello World</html>",
"workspace_ref": None,
}
],
),
@@ -267,11 +268,12 @@ class ClaudeCodeBlock(Block):
"execute_claude_code": lambda *args, **kwargs: (
"Created index.html with hello world content", # response
[
ClaudeCodeBlock.FileOutput(
SandboxFileOutput(
path="/home/user/index.html",
relative_path="index.html",
name="index.html",
content="<html>Hello World</html>",
workspace_ref=None,
)
], # files
"User: Create a hello world HTML file\n"
@@ -294,7 +296,8 @@ class ClaudeCodeBlock(Block):
existing_sandbox_id: str,
conversation_history: str,
dispose_sandbox: bool,
) -> tuple[str, list["ClaudeCodeBlock.FileOutput"], str, str, str]:
execution_context: "ExecutionContext",
) -> tuple[str, list[SandboxFileOutput], str, str, str]:
"""
Execute Claude Code in an E2B sandbox.
@@ -449,14 +452,18 @@ class ClaudeCodeBlock(Block):
else:
new_conversation_history = turn_entry
# Extract files created/modified during this run
files = await self._extract_files(
sandbox, working_directory, start_timestamp
# Extract files created/modified during this run and store to workspace
sandbox_files = await extract_and_store_sandbox_files(
sandbox=sandbox,
working_directory=working_directory,
execution_context=execution_context,
since_timestamp=start_timestamp,
text_only=True,
)
return (
response,
files,
sandbox_files, # Already SandboxFileOutput objects
new_conversation_history,
current_session_id,
sandbox_id,
@@ -471,140 +478,6 @@ class ClaudeCodeBlock(Block):
if dispose_sandbox and sandbox:
await sandbox.kill()
async def _extract_files(
self,
sandbox: BaseAsyncSandbox,
working_directory: str,
since_timestamp: str | None = None,
) -> list["ClaudeCodeBlock.FileOutput"]:
"""
Extract text files created/modified during this Claude Code execution.
Args:
sandbox: The E2B sandbox instance
working_directory: Directory to search for files
since_timestamp: ISO timestamp - only return files modified after this time
Returns:
List of FileOutput objects with path, relative_path, name, and content
"""
files: list[ClaudeCodeBlock.FileOutput] = []
# Text file extensions we can safely read as text
text_extensions = {
".txt",
".md",
".html",
".htm",
".css",
".js",
".ts",
".jsx",
".tsx",
".json",
".xml",
".yaml",
".yml",
".toml",
".ini",
".cfg",
".conf",
".py",
".rb",
".php",
".java",
".c",
".cpp",
".h",
".hpp",
".cs",
".go",
".rs",
".swift",
".kt",
".scala",
".sh",
".bash",
".zsh",
".sql",
".graphql",
".env",
".gitignore",
".dockerfile",
"Dockerfile",
".vue",
".svelte",
".astro",
".mdx",
".rst",
".tex",
".csv",
".log",
}
try:
# List files recursively using find command
# Exclude node_modules and .git directories, but allow hidden files
# like .env and .gitignore (they're filtered by text_extensions later)
# Filter by timestamp to only get files created/modified during this run
safe_working_dir = shlex.quote(working_directory)
timestamp_filter = ""
if since_timestamp:
timestamp_filter = f"-newermt {shlex.quote(since_timestamp)} "
find_result = await sandbox.commands.run(
f"find {safe_working_dir} -type f "
f"{timestamp_filter}"
f"-not -path '*/node_modules/*' "
f"-not -path '*/.git/*' "
f"2>/dev/null"
)
if find_result.stdout:
for file_path in find_result.stdout.strip().split("\n"):
if not file_path:
continue
# Check if it's a text file we can read
is_text = any(
file_path.endswith(ext) for ext in text_extensions
) or file_path.endswith("Dockerfile")
if is_text:
try:
content = await sandbox.files.read(file_path)
# Handle bytes or string
if isinstance(content, bytes):
content = content.decode("utf-8", errors="replace")
# Extract filename from path
file_name = file_path.split("/")[-1]
# Calculate relative path by stripping working directory
relative_path = file_path
if file_path.startswith(working_directory):
relative_path = file_path[len(working_directory) :]
# Remove leading slash if present
if relative_path.startswith("/"):
relative_path = relative_path[1:]
files.append(
ClaudeCodeBlock.FileOutput(
path=file_path,
relative_path=relative_path,
name=file_name,
content=content,
)
)
except Exception:
# Skip files that can't be read
pass
except Exception:
# If file extraction fails, return empty results
pass
return files
def _escape_prompt(self, prompt: str) -> str:
"""Escape the prompt for safe shell execution."""
# Use single quotes and escape any single quotes in the prompt
@@ -617,6 +490,7 @@ class ClaudeCodeBlock(Block):
*,
e2b_credentials: APIKeyCredentials,
anthropic_credentials: APIKeyCredentials,
execution_context: "ExecutionContext",
**kwargs,
) -> BlockOutput:
try:
@@ -637,6 +511,7 @@ class ClaudeCodeBlock(Block):
existing_sandbox_id=input_data.sandbox_id,
conversation_history=input_data.conversation_history,
dispose_sandbox=input_data.dispose_sandbox,
execution_context=execution_context,
)
yield "response", response

View File

@@ -1,5 +1,5 @@
from enum import Enum
from typing import Any, Literal, Optional
from typing import TYPE_CHECKING, Any, Literal, Optional
from e2b_code_interpreter import AsyncSandbox
from e2b_code_interpreter import Result as E2BExecutionResult
@@ -20,6 +20,13 @@ from backend.data.model import (
SchemaField,
)
from backend.integrations.providers import ProviderName
from backend.util.sandbox_files import (
SandboxFileOutput,
extract_and_store_sandbox_files,
)
if TYPE_CHECKING:
from backend.executor.utils import ExecutionContext
TEST_CREDENTIALS = APIKeyCredentials(
id="01234567-89ab-cdef-0123-456789abcdef",
@@ -85,6 +92,9 @@ class CodeExecutionResult(MainCodeExecutionResult):
class BaseE2BExecutorMixin:
"""Shared implementation methods for E2B executor blocks."""
# Default output directory for file extraction
OUTPUT_DIR = "/output"
async def execute_code(
self,
api_key: str,
@@ -95,14 +105,21 @@ class BaseE2BExecutorMixin:
timeout: Optional[int] = None,
sandbox_id: Optional[str] = None,
dispose_sandbox: bool = False,
execution_context: Optional["ExecutionContext"] = None,
extract_files: bool = False,
):
"""
Unified code execution method that handles all three use cases:
1. Create new sandbox and execute (ExecuteCodeBlock)
2. Create new sandbox, execute, and return sandbox_id (InstantiateCodeSandboxBlock)
3. Connect to existing sandbox and execute (ExecuteCodeStepBlock)
Args:
extract_files: If True and execution_context provided, extract files from
/output directory and store to workspace.
""" # noqa
sandbox = None
files: list[SandboxFileOutput] = []
try:
if sandbox_id:
# Connect to existing sandbox (ExecuteCodeStepBlock case)
@@ -114,6 +131,9 @@ class BaseE2BExecutorMixin:
sandbox = await AsyncSandbox.create(
api_key=api_key, template=template_id, timeout=timeout
)
# Create /output directory for file extraction
if extract_files:
await sandbox.commands.run(f"mkdir -p {self.OUTPUT_DIR}")
if setup_commands:
for cmd in setup_commands:
await sandbox.commands.run(cmd)
@@ -133,7 +153,24 @@ class BaseE2BExecutorMixin:
stdout_logs = "".join(execution.logs.stdout)
stderr_logs = "".join(execution.logs.stderr)
return results, text_output, stdout_logs, stderr_logs, sandbox.sandbox_id
# Extract files from /output if requested
if extract_files and execution_context:
files = await extract_and_store_sandbox_files(
sandbox=sandbox,
working_directory=self.OUTPUT_DIR,
execution_context=execution_context,
since_timestamp=None, # Get all files in /output
text_only=False, # Include binary files too
)
return (
results,
text_output,
stdout_logs,
stderr_logs,
sandbox.sandbox_id,
files,
)
finally:
# Dispose of sandbox if requested to reduce usage costs
if dispose_sandbox and sandbox:
@@ -238,6 +275,12 @@ class ExecuteCodeBlock(Block, BaseE2BExecutorMixin):
description="Standard output logs from execution"
)
stderr_logs: str = SchemaField(description="Standard error logs from execution")
files: list[SandboxFileOutput] = SchemaField(
description=(
"Files written to /output directory during execution. "
"Each file has path, name, content, and workspace_ref (if stored)."
),
)
def __init__(self):
super().__init__(
@@ -259,23 +302,30 @@ class ExecuteCodeBlock(Block, BaseE2BExecutorMixin):
("results", []),
("response", "Hello World"),
("stdout_logs", "Hello World\n"),
("files", []),
],
test_mock={
"execute_code": lambda api_key, code, language, template_id, setup_commands, timeout, dispose_sandbox: ( # noqa
"execute_code": lambda api_key, code, language, template_id, setup_commands, timeout, dispose_sandbox, execution_context, extract_files: ( # noqa
[], # results
"Hello World", # text_output
"Hello World\n", # stdout_logs
"", # stderr_logs
"sandbox_id", # sandbox_id
[], # files
),
},
)
async def run(
self, input_data: Input, *, credentials: APIKeyCredentials, **kwargs
self,
input_data: Input,
*,
credentials: APIKeyCredentials,
execution_context: "ExecutionContext",
**kwargs,
) -> BlockOutput:
try:
results, text_output, stdout, stderr, _ = await self.execute_code(
results, text_output, stdout, stderr, _, files = await self.execute_code(
api_key=credentials.api_key.get_secret_value(),
code=input_data.code,
language=input_data.language,
@@ -283,6 +333,8 @@ class ExecuteCodeBlock(Block, BaseE2BExecutorMixin):
setup_commands=input_data.setup_commands,
timeout=input_data.timeout,
dispose_sandbox=input_data.dispose_sandbox,
execution_context=execution_context,
extract_files=True,
)
# Determine result object shape & filter out empty formats
@@ -296,6 +348,8 @@ class ExecuteCodeBlock(Block, BaseE2BExecutorMixin):
yield "stdout_logs", stdout
if stderr:
yield "stderr_logs", stderr
# Always yield files (empty list if none)
yield "files", [f.model_dump() for f in files]
except Exception as e:
yield "error", str(e)
@@ -393,6 +447,7 @@ class InstantiateCodeSandboxBlock(Block, BaseE2BExecutorMixin):
"Hello World\n", # stdout_logs
"", # stderr_logs
"sandbox_id", # sandbox_id
[], # files
),
},
)
@@ -401,7 +456,7 @@ class InstantiateCodeSandboxBlock(Block, BaseE2BExecutorMixin):
self, input_data: Input, *, credentials: APIKeyCredentials, **kwargs
) -> BlockOutput:
try:
_, text_output, stdout, stderr, sandbox_id = await self.execute_code(
_, text_output, stdout, stderr, sandbox_id, _ = await self.execute_code(
api_key=credentials.api_key.get_secret_value(),
code=input_data.setup_code,
language=input_data.language,
@@ -500,6 +555,7 @@ class ExecuteCodeStepBlock(Block, BaseE2BExecutorMixin):
"Hello World\n", # stdout_logs
"", # stderr_logs
sandbox_id, # sandbox_id
[], # files
),
},
)
@@ -508,7 +564,7 @@ class ExecuteCodeStepBlock(Block, BaseE2BExecutorMixin):
self, input_data: Input, *, credentials: APIKeyCredentials, **kwargs
) -> BlockOutput:
try:
results, text_output, stdout, stderr, _ = await self.execute_code(
results, text_output, stdout, stderr, _, _ = await self.execute_code(
api_key=credentials.api_key.get_secret_value(),
code=input_data.step_code,
language=input_data.language,

View File

@@ -0,0 +1,288 @@
"""
Shared utilities for extracting and storing files from E2B sandboxes.
This module provides common file extraction and workspace storage functionality
for blocks that run code in E2B sandboxes (Claude Code, Code Executor, etc.).
"""
import base64
import logging
import mimetypes
import shlex
from dataclasses import dataclass
from typing import TYPE_CHECKING
from pydantic import BaseModel
from backend.util.file import store_media_file
from backend.util.type import MediaFileType
if TYPE_CHECKING:
from e2b import AsyncSandbox as BaseAsyncSandbox
from backend.executor.utils import ExecutionContext
logger = logging.getLogger(__name__)
# Text file extensions that can be safely read and stored as text
TEXT_EXTENSIONS = {
".txt",
".md",
".html",
".htm",
".css",
".js",
".ts",
".jsx",
".tsx",
".json",
".xml",
".yaml",
".yml",
".toml",
".ini",
".cfg",
".conf",
".py",
".rb",
".php",
".java",
".c",
".cpp",
".h",
".hpp",
".cs",
".go",
".rs",
".swift",
".kt",
".scala",
".sh",
".bash",
".zsh",
".sql",
".graphql",
".env",
".gitignore",
".dockerfile",
"Dockerfile",
".vue",
".svelte",
".astro",
".mdx",
".rst",
".tex",
".csv",
".log",
}
class SandboxFileOutput(BaseModel):
"""A file extracted from a sandbox and optionally stored in workspace."""
path: str
"""Full path in the sandbox."""
relative_path: str
"""Path relative to the working directory."""
name: str
"""Filename only."""
content: str
"""File content as text (for backward compatibility)."""
workspace_ref: str | None = None
"""Workspace reference (workspace://{id}#mime) if stored, None otherwise."""
@dataclass
class ExtractedFile:
"""Internal representation of an extracted file before storage."""
path: str
relative_path: str
name: str
content: bytes
is_text: bool
async def extract_sandbox_files(
sandbox: "BaseAsyncSandbox",
working_directory: str,
since_timestamp: str | None = None,
text_only: bool = True,
) -> list[ExtractedFile]:
"""
Extract files from an E2B sandbox.
Args:
sandbox: The E2B sandbox instance
working_directory: Directory to search for files
since_timestamp: ISO timestamp - only return files modified after this time
text_only: If True, only extract text files (default). If False, extract all files.
Returns:
List of ExtractedFile objects with path, content, and metadata
"""
files: list[ExtractedFile] = []
try:
# Build find command
safe_working_dir = shlex.quote(working_directory)
timestamp_filter = ""
if since_timestamp:
timestamp_filter = f"-newermt {shlex.quote(since_timestamp)} "
find_result = await sandbox.commands.run(
f"find {safe_working_dir} -type f "
f"{timestamp_filter}"
f"-not -path '*/node_modules/*' "
f"-not -path '*/.git/*' "
f"2>/dev/null"
)
if not find_result.stdout:
return files
for file_path in find_result.stdout.strip().split("\n"):
if not file_path:
continue
# Check if it's a text file
is_text = any(file_path.endswith(ext) for ext in TEXT_EXTENSIONS)
# Skip non-text files if text_only mode
if text_only and not is_text:
continue
try:
# Read file content as bytes
content = await sandbox.files.read(file_path, format="bytes")
if isinstance(content, str):
content = content.encode("utf-8")
elif isinstance(content, bytearray):
content = bytes(content)
# Extract filename from path
file_name = file_path.split("/")[-1]
# Calculate relative path
relative_path = file_path
if file_path.startswith(working_directory):
relative_path = file_path[len(working_directory) :]
if relative_path.startswith("/"):
relative_path = relative_path[1:]
files.append(
ExtractedFile(
path=file_path,
relative_path=relative_path,
name=file_name,
content=content,
is_text=is_text,
)
)
except Exception as e:
logger.debug(f"Failed to read file {file_path}: {e}")
continue
except Exception as e:
logger.warning(f"File extraction failed: {e}")
return files
async def store_sandbox_files(
extracted_files: list[ExtractedFile],
execution_context: "ExecutionContext",
) -> list[SandboxFileOutput]:
"""
Store extracted sandbox files to workspace and return output objects.
Args:
extracted_files: List of files extracted from sandbox
execution_context: Execution context for workspace storage
Returns:
List of SandboxFileOutput objects with workspace refs
"""
outputs: list[SandboxFileOutput] = []
for file in extracted_files:
# Decode content for text files (for backward compat content field)
if file.is_text:
try:
content_str = file.content.decode("utf-8", errors="replace")
except Exception:
content_str = ""
else:
content_str = f"[Binary file: {len(file.content)} bytes]"
# Build the data URI up front so the fallback in the except branch
# below can always reference it if storage fails
mime_type = mimetypes.guess_type(file.name)[0] or "application/octet-stream"
data_uri = (
f"data:{mime_type};base64,{base64.b64encode(file.content).decode()}"
)
# Try to store in workspace
workspace_ref: str | None = None
try:
result = await store_media_file(
file=MediaFileType(data_uri),
execution_context=execution_context,
return_format="for_block_output",
)
# Result is workspace://... or data:... depending on context
if result.startswith("workspace://"):
workspace_ref = result
except Exception as e:
logger.warning(f"Failed to store file {file.name} to workspace: {e}")
# For binary files, fall back to data URI to prevent data loss
if not file.is_text:
content_str = data_uri
outputs.append(
SandboxFileOutput(
path=file.path,
relative_path=file.relative_path,
name=file.name,
content=content_str,
workspace_ref=workspace_ref,
)
)
return outputs
async def extract_and_store_sandbox_files(
sandbox: "BaseAsyncSandbox",
working_directory: str,
execution_context: "ExecutionContext",
since_timestamp: str | None = None,
text_only: bool = True,
) -> list[SandboxFileOutput]:
"""
Extract files from sandbox and store them in workspace.
This is the main entry point combining extraction and storage.
Args:
sandbox: The E2B sandbox instance
working_directory: Directory to search for files
execution_context: Execution context for workspace storage
since_timestamp: ISO timestamp - only return files modified after this time
text_only: If True, only extract text files
Returns:
List of SandboxFileOutput objects with content and workspace refs
"""
extracted = await extract_sandbox_files(
sandbox=sandbox,
working_directory=working_directory,
since_timestamp=since_timestamp,
text_only=text_only,
)
return await store_sandbox_files(extracted, execution_context)
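Taken together, a block that already holds an E2B sandbox and an `ExecutionContext` only needs the single entry point. A minimal sketch of a hypothetical caller (the `collect_outputs` wrapper and the `/home/user` working directory are illustrative):
```python
from backend.util.sandbox_files import extract_and_store_sandbox_files

async def collect_outputs(sandbox, execution_context):
    # sandbox (e2b AsyncSandbox) and execution_context (ExecutionContext)
    # are assumed to be provided by the surrounding block.
    files = await extract_and_store_sandbox_files(
        sandbox=sandbox,
        working_directory="/home/user",
        execution_context=execution_context,
        since_timestamp=None,  # no time filter: take every file found
        text_only=True,        # skip binaries, as ClaudeCodeBlock does
    )
    for f in files:
        print(f.name, f.workspace_ref or "(content field only)")
    return files
```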

View File

@@ -563,7 +563,7 @@ The block supports conversation continuation through three mechanisms:
|--------|-------------|------|
| error | Error message if execution failed | str |
| response | The output/response from Claude Code execution | str |
| files | List of text files created/modified by Claude Code during this execution. Each file has 'path', 'relative_path', 'name', and 'content' fields. | List[FileOutput] |
| files | List of text files created/modified by Claude Code during this execution. Each file has 'path', 'relative_path', 'name', 'content', and 'workspace_ref' fields. workspace_ref contains a workspace:// URI if the file was stored to workspace. | List[SandboxFileOutput] |
| conversation_history | Full conversation history including this turn. Pass this to conversation_history input to continue on a fresh sandbox if the previous sandbox timed out. | str |
| session_id | Session ID for this conversation. Pass this back along with sandbox_id to continue the conversation. | str |
| sandbox_id | ID of the sandbox instance. Pass this back along with session_id to continue the conversation. This is None if dispose_sandbox was True (sandbox was disposed). | str |

View File

@@ -215,6 +215,7 @@ The sandbox includes pip and npm pre-installed. Set timeout to limit execution t
| response | Text output (if any) of the main execution result | str |
| stdout_logs | Standard output logs from execution | str |
| stderr_logs | Standard error logs from execution | str |
| files | Files written to /output directory during execution. Each file has path, name, content, and workspace_ref (if stored). | List[SandboxFileOutput] |
### Possible use case
<!-- MANUAL: use_case -->
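Downstream consumers can branch on `workspace_ref` to decide between fetching the stored copy and falling back to the inline `content`. A minimal sketch of splitting the `workspace://{id}#mime` format described above (the parsing is an assumption based on that format string, not a documented helper):
```python
def parse_workspace_ref(ref: str) -> tuple[str, str | None]:
    """Split 'workspace://{id}#mime' into (file_id, mime_type)."""
    if not ref.startswith("workspace://"):
        raise ValueError(f"not a workspace reference: {ref}")
    file_id, _, mime = ref[len("workspace://"):].partition("#")
    return file_id, mime or None

# Hypothetical consumer: route each output file by whether it was stored.
def describe(file: dict) -> str:
    ref = file.get("workspace_ref")
    if ref:
        file_id, mime = parse_workspace_ref(ref)
        return f"{file['name']}: stored as {file_id} ({mime or 'unknown mime'})"
    return f"{file['name']}: inline content only"
```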