Compare commits
14 Commits
autogpt-pl ... fix/execut

| SHA1 |
|---|
| dc74d8d89c |
| 6fce3a09ea |
| 9158d4b6a2 |
| 2403931c2e |
| af58b316a2 |
| 03e3e2ea9a |
| 6bb6a081a2 |
| df20b70f44 |
| 21faf1b677 |
| b53c373a59 |
| 4bfeddc03d |
| af7d56612d |
| 0dd30e275c |
| a135f09336 |
@@ -15,6 +15,7 @@
!autogpt_platform/backend/pyproject.toml
!autogpt_platform/backend/poetry.lock
!autogpt_platform/backend/README.md
!autogpt_platform/backend/.env

# Platform - Market
!autogpt_platform/market/market/
@@ -34,6 +35,7 @@
## config
!autogpt_platform/frontend/*.config.*
!autogpt_platform/frontend/.env.*
!autogpt_platform/frontend/.env

# Classic - AutoGPT
!classic/original_autogpt/autogpt/
.github/PULL_REQUEST_TEMPLATE.md (3 changes, vendored)
@@ -24,7 +24,8 @@
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `.env.default` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my changes
- [ ] I have included a list of my configuration changes in the PR description (under **Changes**)
.github/workflows/platform-frontend-ci.yml (15 changes, vendored)
@@ -176,11 +176,7 @@ jobs:
      - name: Copy default supabase .env
        run: |
          cp ../.env.example ../.env

      - name: Copy backend .env
        run: |
          cp ../backend/.env.example ../backend/.env
          cp ../.env.default ../.env

      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3
@@ -252,15 +248,6 @@ jobs:
      - name: Install dependencies
        run: pnpm install --frozen-lockfile

      - name: Setup .env
        run: cp .env.example .env

      - name: Build frontend
        run: pnpm build --turbo
        # uses Turbopack, much faster and safe enough for a test pipeline
        env:
          NEXT_PUBLIC_PW_TEST: true

      - name: Install Browser 'chromium'
        run: pnpm playwright install --with-deps chromium
.gitignore (3 changes, vendored)
@@ -5,6 +5,8 @@ classic/original_autogpt/*.json
auto_gpt_workspace/*
*.mpeg
.env
# Root .env files
/.env
azure.yaml
.vscode
.idea/*
@@ -121,7 +123,6 @@ celerybeat.pid

# Environments
.direnv/
.env
.venv
env/
venv*/
@@ -114,13 +114,31 @@ Key models (defined in `/backend/schema.prisma`):
- `StoreListing`: Marketplace listings for sharing agents

### Environment Configuration
- Backend: `.env` file in `/backend`
- Frontend: `.env.local` file in `/frontend`
- Both require Supabase credentials and API keys for various services

#### Configuration Files

- **Backend**: `/backend/.env.default` (defaults) → `/backend/.env` (user overrides)
- **Frontend**: `/frontend/.env.default` (defaults) → `/frontend/.env` (user overrides)
- **Platform**: `/.env.default` (Supabase/shared defaults) → `/.env` (user overrides)

#### Docker Environment Loading Order

1. `.env.default` files provide base configuration (tracked in git)
2. `.env` files provide user-specific overrides (gitignored)
3. Docker Compose `environment:` sections provide service-specific overrides
4. Shell environment variables have highest precedence
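The four-layer order above amounts to a last-writer-wins merge. A minimal sketch (the function name `effective_env` and the sample values are hypothetical, for illustration only):

```python
def effective_env(env_default: dict, env_user: dict,
                  compose_environment: dict, shell_env: dict) -> dict:
    """Merge the four layers; later layers win, mirroring the order above:
    .env.default -> .env -> docker-compose environment: -> shell."""
    merged: dict = {}
    for layer in (env_default, env_user, compose_environment, shell_env):
        merged.update(layer)
    return merged

# Illustrative values only:
print(effective_env(
    {"DB_PORT": "5432", "LOG_LEVEL": "INFO"},  # .env.default (tracked in git)
    {"LOG_LEVEL": "DEBUG"},                    # .env (gitignored user overrides)
    {"DB_PORT": "5433"},                       # docker-compose environment:
    {},                                        # shell environment (highest)
))  # {'DB_PORT': '5433', 'LOG_LEVEL': 'DEBUG'}
```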

#### Key Points

- All services use hardcoded defaults in docker-compose files (no `${VARIABLE}` substitutions)
- The `env_file` directive loads variables INTO containers at runtime
- Backend/Frontend services use YAML anchors for consistent configuration
- Supabase services (`db/docker/docker-compose.yml`) follow the same pattern

### Common Development Tasks

**Adding a new block:**

1. Create new file in `/backend/backend/blocks/`
2. Inherit from `Block` base class
3. Define input/output schemas
@@ -162,6 +180,11 @@ ex: do the inputs and outputs tie well together?
- Fill out the .github/PULL_REQUEST_TEMPLATE.md template as the PR description.
- Run the github pre-commit hooks to ensure code quality.

### Reviewing/Revising Pull Requests
- When the user runs /pr-comments or tries to fetch them, also run `gh api /repos/Significant-Gravitas/AutoGPT/pulls/[issuenum]/reviews` to get the reviews
- Use `gh api /repos/Significant-Gravitas/AutoGPT/pulls/[issuenum]/reviews/[review_id]/comments` to get the review contents
- Use `gh api /repos/Significant-Gravitas/AutoGPT/issues/9924/comments` to get the PR-specific comments

### Conventional Commits

Use this format for commit messages and Pull Request titles:
@@ -8,7 +8,6 @@ Welcome to the AutoGPT Platform - a powerful system for creating and running AI

- Docker
- Docker Compose V2 (comes with Docker Desktop, or can be installed separately)
- Node.js & NPM (for running the frontend application)

### Running the System

@@ -24,10 +23,10 @@ To run the AutoGPT Platform, follow these steps:
2. Run the following command:

   ```
   cp .env.example .env
   cp .env.default .env
   ```

   This command will copy the `.env.example` file to `.env`. You can modify the `.env` file to add your own environment variables.
   This command will copy the `.env.default` file to `.env`. You can modify the `.env` file to add your own environment variables.

3. Run the following command:

@@ -37,44 +36,7 @@ To run the AutoGPT Platform, follow these steps:

   This command will start all the necessary backend services defined in the `docker-compose.yml` file in detached mode.

4. Navigate to `frontend` within the `autogpt_platform` directory:

   ```
   cd frontend
   ```

   You will need to run your frontend application separately on your local machine.

5. Run the following command:

   ```
   cp .env.example .env.local
   ```

   This command will copy the `.env.example` file to `.env.local` in the `frontend` directory. You can modify the `.env.local` within this folder to add your own environment variables for the frontend application.

6. Run the following command:

   Enable corepack and install dependencies by running:

   ```
   corepack enable
   pnpm i
   ```

   Generate the API client (this step is required before running the frontend):

   ```
   pnpm generate:api-client
   ```

   Then start the frontend application in development mode:

   ```
   pnpm dev
   ```

7. Open your browser and navigate to `http://localhost:3000` to access the AutoGPT Platform frontend.
4. After all the services are in ready state, open your browser and navigate to `http://localhost:3000` to access the AutoGPT Platform frontend.

### Docker Compose Commands

@@ -184,6 +146,7 @@ The platform includes scripts for generating and managing the API client:
If you need to update the API client after making changes to the backend API:

1. Ensure the backend services are running:

   ```
   docker compose up -d
   ```
@@ -1,39 +1,5 @@
import logging
import re
from typing import Any

import uvicorn.config
from colorama import Fore


def remove_color_codes(s: str) -> str:
    return re.sub(r"\x1B(?:[@-Z\\-_]|\[[0-?]*[ -/]*[@-~])", "", s)


def fmt_kwargs(kwargs: dict) -> str:
    return ", ".join(f"{n}={repr(v)}" for n, v in kwargs.items())


def print_attribute(
    title: str, value: Any, title_color: str = Fore.GREEN, value_color: str = ""
) -> None:
    logger = logging.getLogger()
    logger.info(
        str(value),
        extra={
            "title": f"{title.rstrip(':')}:",
            "title_color": title_color,
            "color": value_color,
        },
    )


def generate_uvicorn_config():
    """
    Generates a uvicorn logging config that silences uvicorn's default logging and tells it to use the native logging module.
    """
    log_config = dict(uvicorn.config.LOGGING_CONFIG)
    log_config["loggers"]["uvicorn"] = {"handlers": []}
    log_config["loggers"]["uvicorn.error"] = {"handlers": []}
    log_config["loggers"]["uvicorn.access"] = {"handlers": []}
    return log_config
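For reference, the `remove_color_codes` helper in the hunk above strips ANSI escape sequences; a standalone copy of the same regex shows the behavior:

```python
import re

def remove_color_codes(s: str) -> str:
    # Strips ANSI escape sequences, e.g. terminal color codes like "\x1b[31m".
    return re.sub(r"\x1B(?:[@-Z\\-_]|\[[0-?]*[ -/]*[@-~])", "", s)

print(remove_color_codes("\x1b[32mINFO\x1b[0m: ready"))  # INFO: ready
```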
autogpt_platform/backend/.dockerignore (new file, 52 lines)
@@ -0,0 +1,52 @@
# Development and testing files
**/__pycache__
**/*.pyc
**/*.pyo
**/*.pyd
**/.Python
**/env/
**/venv/
**/.venv/
**/pip-log.txt
**/.pytest_cache/
**/test-results/
**/snapshots/
**/test/

# IDE and editor files
**/.vscode/
**/.idea/
**/*.swp
**/*.swo
*~

# OS files
.DS_Store
Thumbs.db

# Logs
**/*.log
**/logs/

# Git
.git/
.gitignore

# Documentation
**/*.md
!README.md

# Local development files
.env
.env.local
**/.env.test

# Build artifacts
**/dist/
**/build/
**/target/

# Docker files (avoid recursion)
Dockerfile*
docker-compose*
.dockerignore
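The `!README.md` re-include above works because ignore patterns are applied in order and a later `!` pattern overrides earlier matches. A rough Python sketch of that rule (Docker's real matcher follows Go filepath semantics, so this is illustrative only):

```python
from fnmatch import fnmatch

def ignored(path: str, patterns: list[str]) -> bool:
    """Rough sketch of .dockerignore semantics: patterns apply in order,
    and a later '!' pattern re-includes a previously ignored path.
    (Illustrative only; not Docker's actual matching algorithm.)"""
    result = False
    for pat in patterns:
        negate = pat.startswith("!")
        if negate:
            pat = pat[1:]
        # Treat a leading '**/' as also matching at the repo root.
        candidates = [pat] + ([pat[3:]] if pat.startswith("**/") else [])
        if any(fnmatch(path, c) for c in candidates):
            result = not negate
    return result

print(ignored("docs/guide.md", ["**/*.md", "!README.md"]))  # True
print(ignored("README.md", ["**/*.md", "!README.md"]))      # False
```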

@@ -1,3 +1,9 @@
# Backend Configuration
# This file contains environment variables that MUST be set for the AutoGPT platform
# Variables with working defaults in settings.py are not included here

## ===== REQUIRED DATABASE CONFIGURATION ===== ##
# PostgreSQL Database Connection
DB_USER=postgres
DB_PASS=your-super-secret-and-long-postgres-password
DB_NAME=postgres
@@ -10,74 +16,50 @@ DB_SCHEMA=platform
DATABASE_URL="postgresql://${DB_USER}:${DB_PASS}@${DB_HOST}:${DB_PORT}/${DB_NAME}?schema=${DB_SCHEMA}&connect_timeout=${DB_CONNECT_TIMEOUT}"
DIRECT_URL="postgresql://${DB_USER}:${DB_PASS}@${DB_HOST}:${DB_PORT}/${DB_NAME}?schema=${DB_SCHEMA}&connect_timeout=${DB_CONNECT_TIMEOUT}"
PRISMA_SCHEMA="postgres/schema.prisma"
ENABLE_AUTH=true

# EXECUTOR
NUM_GRAPH_WORKERS=10

BACKEND_CORS_ALLOW_ORIGINS=["http://localhost:3000"]

# generate using `from cryptography.fernet import Fernet;Fernet.generate_key().decode()`
ENCRYPTION_KEY='dvziYgz0KSK8FENhju0ZYi8-fRTfAdlz6YLhdB_jhNw='
UNSUBSCRIBE_SECRET_KEY = 'HlP8ivStJjmbf6NKi78m_3FnOogut0t5ckzjsIqeaio='

## ===== REQUIRED SERVICE CREDENTIALS ===== ##
# Redis Configuration
REDIS_HOST=localhost
REDIS_PORT=6379
REDIS_PASSWORD=password

ENABLE_CREDIT=false
STRIPE_API_KEY=
STRIPE_WEBHOOK_SECRET=
# RabbitMQ Credentials
RABBITMQ_DEFAULT_USER=rabbitmq_user_default
RABBITMQ_DEFAULT_PASS=k0VMxyIJF9S35f3x2uaw5IWAl6Y536O7

# What environment things should be logged under: local dev or prod
APP_ENV=local
# What environment to behave as: "local" or "cloud"
BEHAVE_AS=local
PYRO_HOST=localhost
SENTRY_DSN=

# Email For Postmark so we can send emails
POSTMARK_SERVER_API_TOKEN=
POSTMARK_SENDER_EMAIL=invalid@invalid.com
POSTMARK_WEBHOOK_TOKEN=

## User auth with Supabase is required for any of the 3rd party integrations with auth to work.
ENABLE_AUTH=true
# Supabase Authentication
SUPABASE_URL=http://localhost:8000
SUPABASE_SERVICE_ROLE_KEY=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyAgCiAgICAicm9sZSI6ICJzZXJ2aWNlX3JvbGUiLAogICAgImlzcyI6ICJzdXBhYmFzZS1kZW1vIiwKICAgICJpYXQiOiAxNjQxNzY5MjAwLAogICAgImV4cCI6IDE3OTk1MzU2MDAKfQ.DaYlNEoUrrEn2Ig7tqibS-PHK5vgusbcbo7X36XVt4Q
SUPABASE_JWT_SECRET=your-super-secret-jwt-token-with-at-least-32-characters-long

# RabbitMQ credentials -- Used for communication between services
RABBITMQ_HOST=localhost
RABBITMQ_PORT=5672
RABBITMQ_DEFAULT_USER=rabbitmq_user_default
RABBITMQ_DEFAULT_PASS=k0VMxyIJF9S35f3x2uaw5IWAl6Y536O7
## ===== REQUIRED SECURITY KEYS ===== ##
# Generate using: from cryptography.fernet import Fernet;Fernet.generate_key().decode()
ENCRYPTION_KEY=dvziYgz0KSK8FENhju0ZYi8-fRTfAdlz6YLhdB_jhNw=
UNSUBSCRIBE_SECRET_KEY=HlP8ivStJjmbf6NKi78m_3FnOogut0t5ckzjsIqeaio=
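As the comment above notes, these keys should be generated with `cryptography.fernet.Fernet.generate_key()`. A Fernet key is just 32 random bytes, URL-safe base64-encoded; a stdlib-only sketch of that format (the helper name here is hypothetical):

```python
import base64
import secrets

def generate_fernet_style_key() -> str:
    # A Fernet key is 32 random bytes, URL-safe base64-encoded (44 chars).
    # In practice: from cryptography.fernet import Fernet; Fernet.generate_key()
    return base64.urlsafe_b64encode(secrets.token_bytes(32)).decode()

key = generate_fernet_style_key()
print(len(key))  # 44
```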

## GCS bucket is required for marketplace and library functionality
## ===== IMPORTANT OPTIONAL CONFIGURATION ===== ##
# Platform URLs (set these for webhooks and OAuth to work)
PLATFORM_BASE_URL=http://localhost:8000
FRONTEND_BASE_URL=http://localhost:3000

# Media Storage (required for marketplace and library functionality)
MEDIA_GCS_BUCKET_NAME=

## For local development, you may need to set FRONTEND_BASE_URL for the OAuth flow
## for integrations to work. Defaults to the value of PLATFORM_BASE_URL if not set.
# FRONTEND_BASE_URL=http://localhost:3000
## ===== API KEYS AND OAUTH CREDENTIALS ===== ##
# All API keys below are optional - only add what you need

## PLATFORM_BASE_URL must be set to a *publicly accessible* URL pointing to your backend
## to use the platform's webhook-related functionality.
## If you are developing locally, you can use something like ngrok to get a public URL
## and tunnel it to your locally running backend.
PLATFORM_BASE_URL=http://localhost:3000

## Cloudflare Turnstile (CAPTCHA) Configuration
## Get these from the Cloudflare Turnstile dashboard: https://dash.cloudflare.com/?to=/:account/turnstile
## This is the backend secret key
TURNSTILE_SECRET_KEY=
## This is the verify URL
TURNSTILE_VERIFY_URL=https://challenges.cloudflare.com/turnstile/v0/siteverify

LAUNCH_DARKLY_SDK_KEY=

## == INTEGRATION CREDENTIALS == ##
# Each set of server side credentials is required for the corresponding 3rd party
# integration to work.
# AI/LLM Services
OPENAI_API_KEY=
ANTHROPIC_API_KEY=
GROQ_API_KEY=
LLAMA_API_KEY=
AIML_API_KEY=
V0_API_KEY=
OPEN_ROUTER_API_KEY=
NVIDIA_API_KEY=

# OAuth Credentials
# For the OAuth callback URL, use <your_frontend_url>/auth/integrations/oauth_callback,
# e.g. http://localhost:3000/auth/integrations/oauth_callback

@@ -87,7 +69,6 @@ GITHUB_CLIENT_SECRET=

# Google OAuth App server credentials - https://console.cloud.google.com/apis/credentials, and enable gmail api and set scopes
# https://console.cloud.google.com/apis/credentials/consent ?project=<your_project_id>

# You'll need to add/enable the following scopes (minimum):
# https://console.developers.google.com/apis/api/gmail.googleapis.com/overview ?project=<your_project_id>
# https://console.cloud.google.com/apis/library/sheets.googleapis.com/ ?project=<your_project_id>
@@ -123,104 +104,66 @@ LINEAR_CLIENT_SECRET=
TODOIST_CLIENT_ID=
TODOIST_CLIENT_SECRET=

## ===== OPTIONAL API KEYS ===== ##

# LLM
OPENAI_API_KEY=
ANTHROPIC_API_KEY=
AIML_API_KEY=
GROQ_API_KEY=
OPEN_ROUTER_API_KEY=
LLAMA_API_KEY=

# Reddit
# Go to https://www.reddit.com/prefs/apps and create a new app
# Choose "script" for the type
# Fill in the redirect uri as <your_frontend_url>/auth/integrations/oauth_callback, e.g. http://localhost:3000/auth/integrations/oauth_callback
NOTION_CLIENT_ID=
NOTION_CLIENT_SECRET=
REDDIT_CLIENT_ID=
REDDIT_CLIENT_SECRET=
REDDIT_USER_AGENT="AutoGPT:1.0 (by /u/autogpt)"

# Discord
DISCORD_BOT_TOKEN=
# Payment Processing
STRIPE_API_KEY=
STRIPE_WEBHOOK_SECRET=

# SMTP/Email
SMTP_SERVER=
SMTP_PORT=
SMTP_USERNAME=
SMTP_PASSWORD=
# Email Service (for sending notifications and confirmations)
POSTMARK_SERVER_API_TOKEN=
POSTMARK_SENDER_EMAIL=invalid@invalid.com
POSTMARK_WEBHOOK_TOKEN=

# D-ID
# Error Tracking
SENTRY_DSN=

# Cloudflare Turnstile (CAPTCHA) Configuration
# Get these from the Cloudflare Turnstile dashboard: https://dash.cloudflare.com/?to=/:account/turnstile
# This is the backend secret key
TURNSTILE_SECRET_KEY=
# This is the verify URL
TURNSTILE_VERIFY_URL=https://challenges.cloudflare.com/turnstile/v0/siteverify

# Feature Flags
LAUNCH_DARKLY_SDK_KEY=

# Content Generation & Media
DID_API_KEY=
FAL_API_KEY=
IDEOGRAM_API_KEY=
REPLICATE_API_KEY=
REVID_API_KEY=
SCREENSHOTONE_API_KEY=
UNREAL_SPEECH_API_KEY=

# Open Weather Map
# Data & Search Services
E2B_API_KEY=
EXA_API_KEY=
JINA_API_KEY=
MEM0_API_KEY=
OPENWEATHERMAP_API_KEY=

# SMTP
SMTP_SERVER=
SMTP_PORT=
SMTP_USERNAME=
SMTP_PASSWORD=

# Medium
MEDIUM_API_KEY=
MEDIUM_AUTHOR_ID=

# Google Maps
GOOGLE_MAPS_API_KEY=

# Replicate
REPLICATE_API_KEY=
# Communication Services
DISCORD_BOT_TOKEN=
MEDIUM_API_KEY=
MEDIUM_AUTHOR_ID=
SMTP_SERVER=
SMTP_PORT=
SMTP_USERNAME=
SMTP_PASSWORD=

# Ideogram
IDEOGRAM_API_KEY=

# Fal
FAL_API_KEY=

# Exa
EXA_API_KEY=

# E2B
E2B_API_KEY=

# Mem0
MEM0_API_KEY=

# Nvidia
NVIDIA_API_KEY=

# Apollo
# Business & Marketing Tools
APOLLO_API_KEY=

# SmartLead
SMARTLEAD_API_KEY=

# ZeroBounce
ZEROBOUNCE_API_KEY=

# Ayrshare
ENRICHLAYER_API_KEY=
AYRSHARE_API_KEY=
AYRSHARE_JWT_KEY=
SMARTLEAD_API_KEY=
ZEROBOUNCE_API_KEY=

## ===== OPTIONAL API KEYS END ===== ##

# Block Error Rate Monitoring
BLOCK_ERROR_RATE_THRESHOLD=0.5
BLOCK_ERROR_RATE_CHECK_INTERVAL_SECS=86400

# Logging Configuration
LOG_LEVEL=INFO
ENABLE_CLOUD_LOGGING=false
ENABLE_FILE_LOGGING=false
# Use to manually set the log directory
# LOG_DIR=./logs

# Example Blocks Configuration
# Set to true to enable example blocks in development
# These blocks are disabled by default in production
ENABLE_EXAMPLE_BLOCKS=false

# Cloud Storage Configuration
# Cleanup interval for expired files (hours between cleanup runs, 1-24 hours)
CLOUD_STORAGE_CLEANUP_INTERVAL_HOURS=6
# Other Services
AUTOMOD_API_KEY=
autogpt_platform/backend/.gitignore (1 change, vendored)
@@ -1,3 +1,4 @@
.env
database.db
database.db-journal
dev.db

@@ -8,14 +8,14 @@ WORKDIR /app

RUN echo 'Acquire::http::Pipeline-Depth 0;\nAcquire::http::No-Cache true;\nAcquire::BrokenProxy true;\n' > /etc/apt/apt.conf.d/99fixbadproxy

RUN apt-get update --allow-releaseinfo-change --fix-missing

# Install build dependencies
RUN apt-get install -y build-essential
RUN apt-get install -y libpq5
RUN apt-get install -y libz-dev
RUN apt-get install -y libssl-dev
RUN apt-get install -y postgresql-client
# Update package list and install build dependencies in a single layer
RUN apt-get update --allow-releaseinfo-change --fix-missing \
    && apt-get install -y \
    build-essential \
    libpq5 \
    libz-dev \
    libssl-dev \
    postgresql-client

ENV POETRY_HOME=/opt/poetry
ENV POETRY_NO_INTERACTION=1
@@ -68,6 +68,12 @@ COPY autogpt_platform/backend/poetry.lock autogpt_platform/backend/pyproject.tom

WORKDIR /app/autogpt_platform/backend

FROM server_dependencies AS migrate

# Migration stage only needs schema and migrations - much lighter than full backend
COPY autogpt_platform/backend/schema.prisma /app/autogpt_platform/backend/
COPY autogpt_platform/backend/migrations /app/autogpt_platform/backend/migrations

FROM server_dependencies AS server

COPY autogpt_platform/backend /app/autogpt_platform/backend
408
autogpt_platform/backend/backend/blocks/enrichlayer/_api.py
Normal file
@@ -0,0 +1,408 @@
|
||||
"""
|
||||
API module for Enrichlayer integration.
|
||||
|
||||
This module provides a client for interacting with the Enrichlayer API,
|
||||
which allows fetching LinkedIn profile data and related information.
|
||||
"""
|
||||
|
||||
import datetime
|
||||
import enum
|
||||
import logging
|
||||
from json import JSONDecodeError
|
||||
from typing import Any, Optional, TypeVar
|
||||
|
||||
from pydantic import BaseModel, Field
|
||||
|
||||
from backend.data.model import APIKeyCredentials
|
||||
from backend.util.request import Requests
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
T = TypeVar("T")
|
||||
|
||||
|
||||
class EnrichlayerAPIException(Exception):
|
||||
"""Exception raised for Enrichlayer API errors."""
|
||||
|
||||
def __init__(self, message: str, status_code: int):
|
||||
super().__init__(message)
|
||||
self.status_code = status_code
|
||||
|
||||
|
||||
class FallbackToCache(enum.Enum):
|
||||
ON_ERROR = "on-error"
|
||||
NEVER = "never"
|
||||
|
||||
|
||||
class UseCache(enum.Enum):
|
||||
IF_PRESENT = "if-present"
|
||||
NEVER = "never"
|
||||
|
||||
|
||||
class SocialMediaProfiles(BaseModel):
|
||||
"""Social media profiles model."""
|
||||
|
||||
twitter: Optional[str] = None
|
||||
facebook: Optional[str] = None
|
||||
github: Optional[str] = None
|
||||
|
||||
|
||||
class Experience(BaseModel):
|
||||
"""Experience model for LinkedIn profiles."""
|
||||
|
||||
company: Optional[str] = None
|
||||
title: Optional[str] = None
|
||||
description: Optional[str] = None
|
||||
location: Optional[str] = None
|
||||
starts_at: Optional[dict[str, int]] = None
|
||||
ends_at: Optional[dict[str, int]] = None
|
||||
company_linkedin_profile_url: Optional[str] = None
|
||||
|
||||
|
||||
class Education(BaseModel):
|
||||
"""Education model for LinkedIn profiles."""
|
||||
|
||||
school: Optional[str] = None
|
||||
degree_name: Optional[str] = None
|
||||
field_of_study: Optional[str] = None
|
||||
starts_at: Optional[dict[str, int]] = None
|
||||
ends_at: Optional[dict[str, int]] = None
|
||||
school_linkedin_profile_url: Optional[str] = None
|
||||
|
||||
|
||||
class PersonProfileResponse(BaseModel):
|
||||
"""Response model for LinkedIn person profile.
|
||||
|
||||
This model represents the response from Enrichlayer's LinkedIn profile API.
|
||||
The API returns comprehensive profile data including work experience,
|
||||
education, skills, and contact information (when available).
|
||||
|
||||
Example API Response:
|
||||
{
|
||||
"public_identifier": "johnsmith",
|
||||
"full_name": "John Smith",
|
||||
"occupation": "Software Engineer at Tech Corp",
|
||||
"experiences": [
|
||||
{
|
||||
"company": "Tech Corp",
|
||||
"title": "Software Engineer",
|
||||
"starts_at": {"year": 2020, "month": 1}
|
||||
}
|
||||
],
|
||||
"education": [...],
|
||||
"skills": ["Python", "JavaScript", ...]
|
||||
}
|
||||
"""
|
||||
|
||||
public_identifier: Optional[str] = None
|
||||
profile_pic_url: Optional[str] = None
|
||||
full_name: Optional[str] = None
|
||||
first_name: Optional[str] = None
|
||||
last_name: Optional[str] = None
|
||||
occupation: Optional[str] = None
|
||||
headline: Optional[str] = None
|
||||
summary: Optional[str] = None
|
||||
country: Optional[str] = None
|
||||
country_full_name: Optional[str] = None
|
||||
city: Optional[str] = None
|
||||
state: Optional[str] = None
|
||||
experiences: Optional[list[Experience]] = None
|
||||
education: Optional[list[Education]] = None
|
||||
languages: Optional[list[str]] = None
|
||||
skills: Optional[list[str]] = None
|
||||
inferred_salary: Optional[dict[str, Any]] = None
|
||||
personal_email: Optional[str] = None
|
||||
personal_contact_number: Optional[str] = None
|
||||
social_media_profiles: Optional[SocialMediaProfiles] = None
|
||||
extra: Optional[dict[str, Any]] = None
|
||||
|
||||
|
||||
class SimilarProfile(BaseModel):
|
||||
"""Similar profile model for LinkedIn person lookup."""
|
||||
|
||||
similarity: float
|
||||
linkedin_profile_url: str
|
||||
|
||||
|
||||
class PersonLookupResponse(BaseModel):
|
||||
"""Response model for LinkedIn person lookup.
|
||||
|
||||
This model represents the response from Enrichlayer's person lookup API.
|
||||
The API returns a LinkedIn profile URL and similarity scores when
|
||||
searching for a person by name and company.
|
||||
|
||||
Example API Response:
|
||||
{
|
||||
"url": "https://www.linkedin.com/in/johnsmith/",
|
||||
"name_similarity_score": 0.95,
|
||||
"company_similarity_score": 0.88,
|
||||
"title_similarity_score": 0.75,
|
||||
"location_similarity_score": 0.60
|
||||
}
|
||||
"""
|
||||
|
||||
url: str | None = None
|
||||
name_similarity_score: float | None
|
||||
company_similarity_score: float | None
|
||||
title_similarity_score: float | None
|
||||
location_similarity_score: float | None
|
||||
last_updated: datetime.datetime | None = None
|
||||
profile: PersonProfileResponse | None = None
|
||||
|
||||
|
||||
class RoleLookupResponse(BaseModel):
|
||||
"""Response model for LinkedIn role lookup.
|
||||
|
||||
This model represents the response from Enrichlayer's role lookup API.
|
||||
The API returns LinkedIn profile data for a specific role at a company.
|
||||
|
||||
Example API Response:
|
||||
{
|
||||
"linkedin_profile_url": "https://www.linkedin.com/in/johnsmith/",
|
||||
"profile_data": {...} // Full PersonProfileResponse data when enrich_profile=True
|
||||
}
|
||||
"""
|
||||
|
||||
linkedin_profile_url: Optional[str] = None
|
||||
profile_data: Optional[PersonProfileResponse] = None
|
||||
|
||||
|
||||
class ProfilePictureResponse(BaseModel):
|
||||
"""Response model for LinkedIn profile picture.
|
||||
|
||||
This model represents the response from Enrichlayer's profile picture API.
|
||||
The API returns a URL to the person's LinkedIn profile picture.
|
||||
|
||||
Example API Response:
|
||||
{
|
||||
"tmp_profile_pic_url": "https://media.licdn.com/dms/image/..."
|
||||
}
|
||||
"""
|
||||
|
||||
tmp_profile_pic_url: str = Field(
|
||||
..., description="URL of the profile picture", alias="tmp_profile_pic_url"
|
||||
)
|
||||
|
||||
@property
|
||||
def profile_picture_url(self) -> str:
|
||||
"""Backward compatibility property for profile_picture_url."""
|
||||
return self.tmp_profile_pic_url
|
||||
|
||||
|
||||
class EnrichlayerClient:
|
||||
"""Client for interacting with the Enrichlayer API."""
|
||||
|
||||
API_BASE_URL = "https://enrichlayer.com/api/v2"
|
||||
|
||||
def __init__(
|
||||
self,
|
||||
credentials: Optional[APIKeyCredentials] = None,
|
||||
custom_requests: Optional[Requests] = None,
|
||||
):
|
||||
"""
|
||||
Initialize the Enrichlayer client.
|
||||
|
||||
Args:
|
||||
credentials: The credentials to use for authentication.
|
||||
custom_requests: Custom Requests instance for testing.
|
||||
"""
|
||||
if custom_requests:
|
||||
self._requests = custom_requests
|
||||
else:
|
||||
headers: dict[str, str] = {
|
||||
"Content-Type": "application/json",
|
||||
}
|
||||
if credentials:
|
||||
headers["Authorization"] = (
|
||||
f"Bearer {credentials.api_key.get_secret_value()}"
|
||||
)
|
||||
|
||||
self._requests = Requests(
|
||||
extra_headers=headers,
|
||||
raise_for_status=False,
|
||||
)
|
||||
|
||||
async def _handle_response(self, response) -> Any:
|
||||
"""
|
||||
Handle API response and check for errors.
|
||||
|
||||
Args:
|
||||
response: The response object from the request.
|
||||
|
||||
Returns:
|
||||
The response data.
|
||||
|
||||
Raises:
|
||||
EnrichlayerAPIException: If the API request fails.
|
||||
"""
|
||||
if not response.ok:
|
||||
try:
|
||||
error_data = response.json()
|
||||
error_message = error_data.get("message", "")
|
||||
except JSONDecodeError:
|
||||
error_message = response.text
|
||||
|
||||
raise EnrichlayerAPIException(
|
||||
f"Enrichlayer API request failed ({response.status_code}): {error_message}",
|
||||
response.status_code,
|
||||
)
|
||||
|
||||
return response.json()
|
||||
|
||||
async def fetch_profile(
|
||||
self,
|
||||
linkedin_url: str,
|
||||
fallback_to_cache: FallbackToCache = FallbackToCache.ON_ERROR,
|
||||
use_cache: UseCache = UseCache.IF_PRESENT,
|
||||
include_skills: bool = False,
|
||||
include_inferred_salary: bool = False,
|
||||
include_personal_email: bool = False,
|
||||
include_personal_contact_number: bool = False,
|
||||
include_social_media: bool = False,
|
||||
include_extra: bool = False,
|
||||
) -> PersonProfileResponse:
|
||||
"""
|
||||
Fetch a LinkedIn profile with optional parameters.
|
||||
|
||||
Args:
|
||||
linkedin_url: The LinkedIn profile URL to fetch.
|
||||
fallback_to_cache: Cache usage if live fetch fails ('on-error' or 'never').
|
||||
use_cache: Cache utilization ('if-present' or 'never').
|
||||
include_skills: Whether to include skills data.
|
||||
include_inferred_salary: Whether to include inferred salary data.
|
||||
include_personal_email: Whether to include personal email.
|
||||
include_personal_contact_number: Whether to include personal contact number.
|
||||
include_social_media: Whether to include social media profiles.
|
||||
include_extra: Whether to include additional data.
|
||||
|
||||
Returns:
|
||||
The LinkedIn profile data.
|
||||
|
||||
Raises:
|
||||
EnrichlayerAPIException: If the API request fails.
|
||||
"""
|
||||
params = {
|
||||
"url": linkedin_url,
|
||||
"fallback_to_cache": fallback_to_cache.value.lower(),
|
||||
"use_cache": use_cache.value.lower(),
|
||||
}
|
||||
|
||||
if include_skills:
|
||||
params["skills"] = "include"
|
||||
if include_inferred_salary:
|
||||
params["inferred_salary"] = "include"
|
||||
if include_personal_email:
|
||||
params["personal_email"] = "include"
|
||||
if include_personal_contact_number:
|
||||
params["personal_contact_number"] = "include"
|
||||
if include_social_media:
|
||||
params["twitter_profile_id"] = "include"
|
||||
params["facebook_profile_id"] = "include"
|
||||
params["github_profile_id"] = "include"
|
||||
if include_extra:
|
||||
params["extra"] = "include"
|
||||
|
||||
response = await self._requests.get(
|
||||
f"{self.API_BASE_URL}/profile", params=params
|
||||
)
|
||||
return PersonProfileResponse(**await self._handle_response(response))
|
||||
|
||||
    async def lookup_person(
        self,
        first_name: str,
        company_domain: str,
        last_name: str | None = None,
        location: Optional[str] = None,
        title: Optional[str] = None,
        include_similarity_checks: bool = False,
        enrich_profile: bool = False,
    ) -> PersonLookupResponse:
        """
        Look up a LinkedIn profile by a person's information.

        Args:
            first_name: The person's first name.
            last_name: The person's last name.
            company_domain: The domain of the company they work for.
            location: The person's location.
            title: The person's job title.
            include_similarity_checks: Whether to include similarity checks.
            enrich_profile: Whether to enrich the profile.

        Returns:
            The LinkedIn profile lookup result.

        Raises:
            EnrichlayerAPIException: If the API request fails.
        """
        params = {"first_name": first_name, "company_domain": company_domain}

        if last_name:
            params["last_name"] = last_name
        if location:
            params["location"] = location
        if title:
            params["title"] = title
        if include_similarity_checks:
            params["similarity_checks"] = "include"
        if enrich_profile:
            params["enrich_profile"] = "enrich"

        response = await self._requests.get(
            f"{self.API_BASE_URL}/profile/resolve", params=params
        )
        return PersonLookupResponse(**await self._handle_response(response))

    async def lookup_role(
        self, role: str, company_name: str, enrich_profile: bool = False
    ) -> RoleLookupResponse:
        """
        Look up a LinkedIn profile by role in a company.

        Args:
            role: The role title (e.g., CEO, CTO).
            company_name: The name of the company.
            enrich_profile: Whether to enrich the profile.

        Returns:
            The LinkedIn profile lookup result.

        Raises:
            EnrichlayerAPIException: If the API request fails.
        """
        params = {
            "role": role,
            "company_name": company_name,
        }

        if enrich_profile:
            params["enrich_profile"] = "enrich"

        response = await self._requests.get(
            f"{self.API_BASE_URL}/find/company/role", params=params
        )
        return RoleLookupResponse(**await self._handle_response(response))

    async def get_profile_picture(
        self, linkedin_profile_url: str
    ) -> ProfilePictureResponse:
        """
        Get a LinkedIn profile picture URL.

        Args:
            linkedin_profile_url: The LinkedIn profile URL.

        Returns:
            The profile picture URL.

        Raises:
            EnrichlayerAPIException: If the API request fails.
        """
        params = {
            "linkedin_person_profile_url": linkedin_profile_url,
        }

        response = await self._requests.get(
            f"{self.API_BASE_URL}/person/profile-picture", params=params
        )
        return ProfilePictureResponse(**await self._handle_response(response))
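The `include_*` flags on `fetch_profile` each map to a `<field>: "include"` query parameter, with `include_social_media` fanning out to three social-media fields. A minimal sketch of that parameter assembly in plain Python (`build_profile_params` is an illustrative helper mirroring the client logic, not part of the client itself):

```python
def build_profile_params(
    linkedin_url: str,
    fallback_to_cache: str = "on-error",
    use_cache: str = "if-present",
    include_skills: bool = False,
    include_social_media: bool = False,
) -> dict[str, str]:
    # Required parameters are always present.
    params = {
        "url": linkedin_url,
        "fallback_to_cache": fallback_to_cache,
        "use_cache": use_cache,
    }
    # Each boolean flag toggles a "<field>": "include" entry.
    if include_skills:
        params["skills"] = "include"
    if include_social_media:
        # One flag fans out to several social-media fields.
        for field in ("twitter_profile_id", "facebook_profile_id", "github_profile_id"):
            params[field] = "include"
    return params
```

The API only sees keys for the data it should return, so omitted flags add no parameters at all.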
autogpt_platform/backend/backend/blocks/enrichlayer/_auth.py (new file, 34 lines)
@@ -0,0 +1,34 @@
"""
|
||||
Authentication module for Enrichlayer API integration.
|
||||
|
||||
This module provides credential types and test credentials for the Enrichlayer API.
|
||||
"""
|
||||
|
||||
from typing import Literal
|
||||
|
||||
from pydantic import SecretStr
|
||||
|
||||
from backend.data.model import APIKeyCredentials, CredentialsMetaInput
|
||||
from backend.integrations.providers import ProviderName
|
||||
|
||||
# Define the type of credentials input expected for Enrichlayer API
|
||||
EnrichlayerCredentialsInput = CredentialsMetaInput[
|
||||
Literal[ProviderName.ENRICHLAYER], Literal["api_key"]
|
||||
]
|
||||
|
||||
# Mock credentials for testing Enrichlayer API integration
|
||||
TEST_CREDENTIALS = APIKeyCredentials(
|
||||
id="1234a567-89bc-4def-ab12-3456cdef7890",
|
||||
provider="enrichlayer",
|
||||
api_key=SecretStr("mock-enrichlayer-api-key"),
|
||||
title="Mock Enrichlayer API key",
|
||||
expires_at=None,
|
||||
)
|
||||
|
||||
# Dictionary representation of test credentials for input fields
|
||||
TEST_CREDENTIALS_INPUT = {
|
||||
"provider": TEST_CREDENTIALS.provider,
|
||||
"id": TEST_CREDENTIALS.id,
|
||||
"type": TEST_CREDENTIALS.type,
|
||||
"title": TEST_CREDENTIALS.title,
|
||||
}
|
||||
autogpt_platform/backend/backend/blocks/enrichlayer/linkedin.py (new file, 527 lines)
@@ -0,0 +1,527 @@
"""
|
||||
Block definitions for Enrichlayer API integration.
|
||||
|
||||
This module implements blocks for interacting with the Enrichlayer API,
|
||||
which provides access to LinkedIn profile data and related information.
|
||||
"""
|
||||
|
||||
import logging
|
||||
from typing import Optional
|
||||
|
||||
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
|
||||
from backend.data.model import APIKeyCredentials, CredentialsField, SchemaField
|
||||
from backend.util.type import MediaFileType
|
||||
|
||||
from ._api import (
|
||||
EnrichlayerClient,
|
||||
Experience,
|
||||
FallbackToCache,
|
||||
PersonLookupResponse,
|
||||
PersonProfileResponse,
|
||||
RoleLookupResponse,
|
||||
UseCache,
|
||||
)
|
||||
from ._auth import TEST_CREDENTIALS, TEST_CREDENTIALS_INPUT, EnrichlayerCredentialsInput
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class GetLinkedinProfileBlock(Block):
    """Block to fetch LinkedIn profile data using Enrichlayer API."""

    class Input(BlockSchema):
        """Input schema for GetLinkedinProfileBlock."""

        linkedin_url: str = SchemaField(
            description="LinkedIn profile URL to fetch data from",
            placeholder="https://www.linkedin.com/in/username/",
        )
        fallback_to_cache: FallbackToCache = SchemaField(
            description="Cache usage if live fetch fails",
            default=FallbackToCache.ON_ERROR,
            advanced=True,
        )
        use_cache: UseCache = SchemaField(
            description="Cache utilization strategy",
            default=UseCache.IF_PRESENT,
            advanced=True,
        )
        include_skills: bool = SchemaField(
            description="Include skills data",
            default=False,
            advanced=True,
        )
        include_inferred_salary: bool = SchemaField(
            description="Include inferred salary data",
            default=False,
            advanced=True,
        )
        include_personal_email: bool = SchemaField(
            description="Include personal email",
            default=False,
            advanced=True,
        )
        include_personal_contact_number: bool = SchemaField(
            description="Include personal contact number",
            default=False,
            advanced=True,
        )
        include_social_media: bool = SchemaField(
            description="Include social media profiles",
            default=False,
            advanced=True,
        )
        include_extra: bool = SchemaField(
            description="Include additional data",
            default=False,
            advanced=True,
        )
        credentials: EnrichlayerCredentialsInput = CredentialsField(
            description="Enrichlayer API credentials"
        )

    class Output(BlockSchema):
        """Output schema for GetLinkedinProfileBlock."""

        profile: PersonProfileResponse = SchemaField(
            description="LinkedIn profile data"
        )
        error: str = SchemaField(description="Error message if the request failed")

    def __init__(self):
        """Initialize GetLinkedinProfileBlock."""
        super().__init__(
            id="f6e0ac73-4f1d-4acb-b4b7-b67066c5984e",
            description="Fetch LinkedIn profile data using Enrichlayer",
            categories={BlockCategory.SOCIAL},
            input_schema=GetLinkedinProfileBlock.Input,
            output_schema=GetLinkedinProfileBlock.Output,
            test_input={
                "linkedin_url": "https://www.linkedin.com/in/williamhgates/",
                "include_skills": True,
                "include_social_media": True,
                "credentials": TEST_CREDENTIALS_INPUT,
            },
            test_output=[
                (
                    "profile",
                    PersonProfileResponse(
                        public_identifier="williamhgates",
                        full_name="Bill Gates",
                        occupation="Co-chair at Bill & Melinda Gates Foundation",
                        experiences=[
                            Experience(
                                company="Bill & Melinda Gates Foundation",
                                title="Co-chair",
                                starts_at={"year": 2000},
                            )
                        ],
                    ),
                )
            ],
            test_credentials=TEST_CREDENTIALS,
            test_mock={
                "_fetch_profile": lambda *args, **kwargs: PersonProfileResponse(
                    public_identifier="williamhgates",
                    full_name="Bill Gates",
                    occupation="Co-chair at Bill & Melinda Gates Foundation",
                    experiences=[
                        Experience(
                            company="Bill & Melinda Gates Foundation",
                            title="Co-chair",
                            starts_at={"year": 2000},
                        )
                    ],
                ),
            },
        )

    @staticmethod
    async def _fetch_profile(
        credentials: APIKeyCredentials,
        linkedin_url: str,
        fallback_to_cache: FallbackToCache = FallbackToCache.ON_ERROR,
        use_cache: UseCache = UseCache.IF_PRESENT,
        include_skills: bool = False,
        include_inferred_salary: bool = False,
        include_personal_email: bool = False,
        include_personal_contact_number: bool = False,
        include_social_media: bool = False,
        include_extra: bool = False,
    ):
        client = EnrichlayerClient(credentials)
        profile = await client.fetch_profile(
            linkedin_url=linkedin_url,
            fallback_to_cache=fallback_to_cache,
            use_cache=use_cache,
            include_skills=include_skills,
            include_inferred_salary=include_inferred_salary,
            include_personal_email=include_personal_email,
            include_personal_contact_number=include_personal_contact_number,
            include_social_media=include_social_media,
            include_extra=include_extra,
        )
        return profile

    async def run(
        self, input_data: Input, *, credentials: APIKeyCredentials, **kwargs
    ) -> BlockOutput:
        """
        Run the block to fetch LinkedIn profile data.

        Args:
            input_data: Input parameters for the block
            credentials: API key credentials for Enrichlayer
            **kwargs: Additional keyword arguments

        Yields:
            Tuples of (output_name, output_value)
        """
        try:
            profile = await self._fetch_profile(
                credentials=credentials,
                linkedin_url=input_data.linkedin_url,
                fallback_to_cache=input_data.fallback_to_cache,
                use_cache=input_data.use_cache,
                include_skills=input_data.include_skills,
                include_inferred_salary=input_data.include_inferred_salary,
                include_personal_email=input_data.include_personal_email,
                include_personal_contact_number=input_data.include_personal_contact_number,
                include_social_media=input_data.include_social_media,
                include_extra=input_data.include_extra,
            )
            yield "profile", profile
        except Exception as e:
            logger.error(f"Error fetching LinkedIn profile: {str(e)}")
            yield "error", str(e)


class LinkedinPersonLookupBlock(Block):
    """Block to look up LinkedIn profiles by a person's information using the Enrichlayer API."""

    class Input(BlockSchema):
        """Input schema for LinkedinPersonLookupBlock."""

        first_name: str = SchemaField(
            description="Person's first name",
            placeholder="John",
            advanced=False,
        )
        last_name: str | None = SchemaField(
            description="Person's last name",
            placeholder="Doe",
            default=None,
            advanced=False,
        )
        company_domain: str = SchemaField(
            description="Domain of the company they work for",
            placeholder="example.com",
            advanced=False,
        )
        location: Optional[str] = SchemaField(
            description="Person's location (optional)",
            placeholder="San Francisco",
            default=None,
        )
        title: Optional[str] = SchemaField(
            description="Person's job title (optional)",
            placeholder="CEO",
            default=None,
        )
        include_similarity_checks: bool = SchemaField(
            description="Include similarity checks",
            default=False,
            advanced=True,
        )
        enrich_profile: bool = SchemaField(
            description="Enrich the profile with additional data",
            default=False,
            advanced=True,
        )
        credentials: EnrichlayerCredentialsInput = CredentialsField(
            description="Enrichlayer API credentials"
        )

    class Output(BlockSchema):
        """Output schema for LinkedinPersonLookupBlock."""

        lookup_result: PersonLookupResponse = SchemaField(
            description="LinkedIn profile lookup result"
        )
        error: str = SchemaField(description="Error message if the request failed")

    def __init__(self):
        """Initialize LinkedinPersonLookupBlock."""
        super().__init__(
            id="d237a98a-5c4b-4a1c-b9e3-e6f9a6c81df7",
            description="Look up LinkedIn profiles by person information using Enrichlayer",
            categories={BlockCategory.SOCIAL},
            input_schema=LinkedinPersonLookupBlock.Input,
            output_schema=LinkedinPersonLookupBlock.Output,
            test_input={
                "first_name": "Bill",
                "last_name": "Gates",
                "company_domain": "gatesfoundation.org",
                "include_similarity_checks": True,
                "credentials": TEST_CREDENTIALS_INPUT,
            },
            test_output=[
                (
                    "lookup_result",
                    PersonLookupResponse(
                        url="https://www.linkedin.com/in/williamhgates/",
                        name_similarity_score=0.93,
                        company_similarity_score=0.83,
                        title_similarity_score=0.3,
                        location_similarity_score=0.20,
                    ),
                )
            ],
            test_credentials=TEST_CREDENTIALS,
            test_mock={
                "_lookup_person": lambda *args, **kwargs: PersonLookupResponse(
                    url="https://www.linkedin.com/in/williamhgates/",
                    name_similarity_score=0.93,
                    company_similarity_score=0.83,
                    title_similarity_score=0.3,
                    location_similarity_score=0.20,
                )
            },
        )

    @staticmethod
    async def _lookup_person(
        credentials: APIKeyCredentials,
        first_name: str,
        company_domain: str,
        last_name: str | None = None,
        location: Optional[str] = None,
        title: Optional[str] = None,
        include_similarity_checks: bool = False,
        enrich_profile: bool = False,
    ):
        client = EnrichlayerClient(credentials=credentials)
        lookup_result = await client.lookup_person(
            first_name=first_name,
            last_name=last_name,
            company_domain=company_domain,
            location=location,
            title=title,
            include_similarity_checks=include_similarity_checks,
            enrich_profile=enrich_profile,
        )
        return lookup_result

    async def run(
        self, input_data: Input, *, credentials: APIKeyCredentials, **kwargs
    ) -> BlockOutput:
        """
        Run the block to look up LinkedIn profiles.

        Args:
            input_data: Input parameters for the block
            credentials: API key credentials for Enrichlayer
            **kwargs: Additional keyword arguments

        Yields:
            Tuples of (output_name, output_value)
        """
        try:
            lookup_result = await self._lookup_person(
                credentials=credentials,
                first_name=input_data.first_name,
                last_name=input_data.last_name,
                company_domain=input_data.company_domain,
                location=input_data.location,
                title=input_data.title,
                include_similarity_checks=input_data.include_similarity_checks,
                enrich_profile=input_data.enrich_profile,
            )
            yield "lookup_result", lookup_result
        except Exception as e:
            logger.error(f"Error looking up LinkedIn profile: {str(e)}")
            yield "error", str(e)


class LinkedinRoleLookupBlock(Block):
    """Block to look up LinkedIn profiles by role in a company using Enrichlayer API."""

    class Input(BlockSchema):
        """Input schema for LinkedinRoleLookupBlock."""

        role: str = SchemaField(
            description="Role title (e.g., CEO, CTO)",
            placeholder="CEO",
        )
        company_name: str = SchemaField(
            description="Name of the company",
            placeholder="Microsoft",
        )
        enrich_profile: bool = SchemaField(
            description="Enrich the profile with additional data",
            default=False,
            advanced=True,
        )
        credentials: EnrichlayerCredentialsInput = CredentialsField(
            description="Enrichlayer API credentials"
        )

    class Output(BlockSchema):
        """Output schema for LinkedinRoleLookupBlock."""

        role_lookup_result: RoleLookupResponse = SchemaField(
            description="LinkedIn role lookup result"
        )
        error: str = SchemaField(description="Error message if the request failed")

    def __init__(self):
        """Initialize LinkedinRoleLookupBlock."""
        super().__init__(
            id="3b9fc742-06d4-49c7-b5ce-7e302dd7c8a7",
            description="Look up LinkedIn profiles by role in a company using Enrichlayer",
            categories={BlockCategory.SOCIAL},
            input_schema=LinkedinRoleLookupBlock.Input,
            output_schema=LinkedinRoleLookupBlock.Output,
            test_input={
                "role": "Co-chair",
                "company_name": "Gates Foundation",
                "enrich_profile": True,
                "credentials": TEST_CREDENTIALS_INPUT,
            },
            test_output=[
                (
                    "role_lookup_result",
                    RoleLookupResponse(
                        linkedin_profile_url="https://www.linkedin.com/in/williamhgates/",
                    ),
                )
            ],
            test_credentials=TEST_CREDENTIALS,
            test_mock={
                "_lookup_role": lambda *args, **kwargs: RoleLookupResponse(
                    linkedin_profile_url="https://www.linkedin.com/in/williamhgates/",
                ),
            },
        )

    @staticmethod
    async def _lookup_role(
        credentials: APIKeyCredentials,
        role: str,
        company_name: str,
        enrich_profile: bool = False,
    ):
        client = EnrichlayerClient(credentials=credentials)
        role_lookup_result = await client.lookup_role(
            role=role,
            company_name=company_name,
            enrich_profile=enrich_profile,
        )
        return role_lookup_result

    async def run(
        self, input_data: Input, *, credentials: APIKeyCredentials, **kwargs
    ) -> BlockOutput:
        """
        Run the block to look up LinkedIn profiles by role.

        Args:
            input_data: Input parameters for the block
            credentials: API key credentials for Enrichlayer
            **kwargs: Additional keyword arguments

        Yields:
            Tuples of (output_name, output_value)
        """
        try:
            role_lookup_result = await self._lookup_role(
                credentials=credentials,
                role=input_data.role,
                company_name=input_data.company_name,
                enrich_profile=input_data.enrich_profile,
            )
            yield "role_lookup_result", role_lookup_result
        except Exception as e:
            logger.error(f"Error looking up role in company: {str(e)}")
            yield "error", str(e)


class GetLinkedinProfilePictureBlock(Block):
    """Block to get LinkedIn profile pictures using Enrichlayer API."""

    class Input(BlockSchema):
        """Input schema for GetLinkedinProfilePictureBlock."""

        linkedin_profile_url: str = SchemaField(
            description="LinkedIn profile URL",
            placeholder="https://www.linkedin.com/in/username/",
        )
        credentials: EnrichlayerCredentialsInput = CredentialsField(
            description="Enrichlayer API credentials"
        )

    class Output(BlockSchema):
        """Output schema for GetLinkedinProfilePictureBlock."""

        profile_picture_url: MediaFileType = SchemaField(
            description="LinkedIn profile picture URL"
        )
        error: str = SchemaField(description="Error message if the request failed")

    def __init__(self):
        """Initialize GetLinkedinProfilePictureBlock."""
        super().__init__(
            id="68d5a942-9b3f-4e9a-b7c1-d96ea4321f0d",
            description="Get LinkedIn profile pictures using Enrichlayer",
            categories={BlockCategory.SOCIAL},
            input_schema=GetLinkedinProfilePictureBlock.Input,
            output_schema=GetLinkedinProfilePictureBlock.Output,
            test_input={
                "linkedin_profile_url": "https://www.linkedin.com/in/williamhgates/",
                "credentials": TEST_CREDENTIALS_INPUT,
            },
            test_output=[
                (
                    "profile_picture_url",
                    "https://media.licdn.com/dms/image/C4D03AQFj-xjuXrLFSQ/profile-displayphoto-shrink_800_800/0/1576881858598?e=1686787200&v=beta&t=zrQC76QwsfQQIWthfOnrKRBMZ5D-qIAvzLXLmWgYvTk",
                )
            ],
            test_credentials=TEST_CREDENTIALS,
            test_mock={
                "_get_profile_picture": lambda *args, **kwargs: "https://media.licdn.com/dms/image/C4D03AQFj-xjuXrLFSQ/profile-displayphoto-shrink_800_800/0/1576881858598?e=1686787200&v=beta&t=zrQC76QwsfQQIWthfOnrKRBMZ5D-qIAvzLXLmWgYvTk",
            },
        )

    @staticmethod
    async def _get_profile_picture(
        credentials: APIKeyCredentials, linkedin_profile_url: str
    ):
        client = EnrichlayerClient(credentials=credentials)
        profile_picture_response = await client.get_profile_picture(
            linkedin_profile_url=linkedin_profile_url,
        )
        return profile_picture_response.profile_picture_url

    async def run(
        self, input_data: Input, *, credentials: APIKeyCredentials, **kwargs
    ) -> BlockOutput:
        """
        Run the block to get LinkedIn profile pictures.

        Args:
            input_data: Input parameters for the block
            credentials: API key credentials for Enrichlayer
            **kwargs: Additional keyword arguments

        Yields:
            Tuples of (output_name, output_value)
        """
        try:
            profile_picture = await self._get_profile_picture(
                credentials=credentials,
                linkedin_profile_url=input_data.linkedin_profile_url,
            )
            yield "profile_picture_url", profile_picture
        except Exception as e:
            logger.error(f"Error getting profile picture: {str(e)}")
            yield "error", str(e)
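All four blocks share the same `run` shape: an async generator that yields `(output_name, value)` tuples and converts any exception into an `error` output instead of raising. A stripped-down sketch of that control flow (the `fetch` callables here are stand-ins, not the real client):

```python
import asyncio


async def run_block(fetch, url: str):
    """Yield ("profile", data) on success, ("error", message) on failure."""
    try:
        profile = await fetch(url)
        yield "profile", profile
    except Exception as e:
        # Failures become a normal output instead of propagating.
        yield "error", str(e)


async def main():
    async def ok(url):
        return {"url": url}

    async def boom(url):
        raise ValueError("bad url")

    outputs = [item async for item in run_block(ok, "https://example.com")]
    errors = [item async for item in run_block(boom, "x")]
    return outputs, errors


outputs, errors = asyncio.run(main())
```

Because errors surface as outputs, a downstream node in the graph can route on the `error` pin rather than the whole execution aborting.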
@@ -37,6 +37,7 @@ LLMProviderName = Literal[
     ProviderName.OPENAI,
     ProviderName.OPEN_ROUTER,
     ProviderName.LLAMA_API,
+    ProviderName.V0,
 ]
 AICredentials = CredentialsMetaInput[LLMProviderName, Literal["api_key"]]

@@ -155,6 +156,10 @@ class LlmModel(str, Enum, metaclass=LlmModelMeta):
     LLAMA_API_LLAMA4_MAVERICK = "Llama-4-Maverick-17B-128E-Instruct-FP8"
     LLAMA_API_LLAMA3_3_8B = "Llama-3.3-8B-Instruct"
     LLAMA_API_LLAMA3_3_70B = "Llama-3.3-70B-Instruct"
+    # v0 by Vercel models
+    V0_1_5_MD = "v0-1.5-md"
+    V0_1_5_LG = "v0-1.5-lg"
+    V0_1_0_MD = "v0-1.0-md"

     @property
     def metadata(self) -> ModelMetadata:

@@ -280,6 +285,10 @@ MODEL_METADATA = {
     LlmModel.LLAMA_API_LLAMA4_MAVERICK: ModelMetadata("llama_api", 128000, 4028),
     LlmModel.LLAMA_API_LLAMA3_3_8B: ModelMetadata("llama_api", 128000, 4028),
     LlmModel.LLAMA_API_LLAMA3_3_70B: ModelMetadata("llama_api", 128000, 4028),
+    # v0 by Vercel models
+    LlmModel.V0_1_5_MD: ModelMetadata("v0", 128000, 64000),
+    LlmModel.V0_1_5_LG: ModelMetadata("v0", 512000, 64000),
+    LlmModel.V0_1_0_MD: ModelMetadata("v0", 128000, 64000),
 }

 for model in LlmModel:

@@ -676,7 +685,11 @@ async def llm_call(
         client = openai.OpenAI(
             base_url="https://api.aimlapi.com/v2",
             api_key=credentials.api_key.get_secret_value(),
-            default_headers={"X-Project": "AutoGPT"},
+            default_headers={
+                "X-Project": "AutoGPT",
+                "X-Title": "AutoGPT",
+                "HTTP-Referer": "https://github.com/Significant-Gravitas/AutoGPT",
+            },
         )

         completion = client.chat.completions.create(

@@ -696,6 +709,42 @@ async def llm_call(
             ),
             reasoning=None,
         )
+    elif provider == "v0":
+        tools_param = tools if tools else openai.NOT_GIVEN
+        client = openai.AsyncOpenAI(
+            base_url="https://api.v0.dev/v1",
+            api_key=credentials.api_key.get_secret_value(),
+        )
+
+        response_format = None
+        if json_format:
+            response_format = {"type": "json_object"}
+
+        parallel_tool_calls_param = get_parallel_tool_calls_param(
+            llm_model, parallel_tool_calls
+        )
+
+        response = await client.chat.completions.create(
+            model=llm_model.value,
+            messages=prompt,  # type: ignore
+            response_format=response_format,  # type: ignore
+            max_tokens=max_tokens,
+            tools=tools_param,  # type: ignore
+            parallel_tool_calls=parallel_tool_calls_param,
+        )
+
+        tool_calls = extract_openai_tool_calls(response)
+        reasoning = extract_openai_reasoning(response)
+
+        return LLMResponse(
+            raw_response=response.choices[0].message,
+            prompt=prompt,
+            response=response.choices[0].message.content or "",
+            tool_calls=tool_calls,
+            prompt_tokens=response.usage.prompt_tokens if response.usage else 0,
+            completion_tokens=response.usage.completion_tokens if response.usage else 0,
+            reasoning=reasoning,
+        )
     else:
         raise ValueError(f"Unsupported LLM provider: {provider}")

@@ -5,6 +5,12 @@ from backend.blocks.ai_shortform_video_block import AIShortformVideoCreatorBlock
 from backend.blocks.apollo.organization import SearchOrganizationsBlock
 from backend.blocks.apollo.people import SearchPeopleBlock
 from backend.blocks.apollo.person import GetPersonDetailBlock
+from backend.blocks.enrichlayer.linkedin import (
+    GetLinkedinProfileBlock,
+    GetLinkedinProfilePictureBlock,
+    LinkedinPersonLookupBlock,
+    LinkedinRoleLookupBlock,
+)
 from backend.blocks.flux_kontext import AIImageEditorBlock, FluxKontextModelName
 from backend.blocks.ideogram import IdeogramModelBlock
 from backend.blocks.jina.embeddings import JinaEmbeddingBlock

@@ -30,6 +36,7 @@ from backend.integrations.credentials_store import (
     anthropic_credentials,
     apollo_credentials,
     did_credentials,
+    enrichlayer_credentials,
     groq_credentials,
     ideogram_credentials,
     jina_credentials,

@@ -39,6 +46,7 @@ from backend.integrations.credentials_store import (
     replicate_credentials,
     revid_credentials,
     unreal_credentials,
+    v0_credentials,
 )

 # =============== Configure the cost for each LLM Model call =============== #

@@ -115,6 +123,10 @@ MODEL_COST: dict[LlmModel, int] = {
     LlmModel.GEMINI_2_5_FLASH_LITE_PREVIEW: 1,
     LlmModel.GEMINI_2_0_FLASH_LITE: 1,
     LlmModel.DEEPSEEK_R1_0528: 1,
+    # v0 by Vercel models
+    LlmModel.V0_1_5_MD: 1,
+    LlmModel.V0_1_5_LG: 2,
+    LlmModel.V0_1_0_MD: 1,
 }

 for model in LlmModel:

@@ -204,6 +216,23 @@ LLM_COST = (
         for model, cost in MODEL_COST.items()
         if MODEL_METADATA[model].provider == "llama_api"
     ]
+    # v0 by Vercel Models
+    + [
+        BlockCost(
+            cost_type=BlockCostType.RUN,
+            cost_filter={
+                "model": model,
+                "credentials": {
+                    "id": v0_credentials.id,
+                    "provider": v0_credentials.provider,
+                    "type": v0_credentials.type,
+                },
+            },
+            cost_amount=cost,
+        )
+        for model, cost in MODEL_COST.items()
+        if MODEL_METADATA[model].provider == "v0"
+    ]
     # AI/ML Api Models
     + [
         BlockCost(

@@ -376,6 +405,54 @@ BLOCK_COSTS: dict[Type[Block], list[BlockCost]] = {
             },
         )
     ],
+    GetLinkedinProfileBlock: [
+        BlockCost(
+            cost_amount=1,
+            cost_filter={
+                "credentials": {
+                    "id": enrichlayer_credentials.id,
+                    "provider": enrichlayer_credentials.provider,
+                    "type": enrichlayer_credentials.type,
+                }
+            },
+        )
+    ],
+    LinkedinPersonLookupBlock: [
+        BlockCost(
+            cost_amount=2,
+            cost_filter={
+                "credentials": {
+                    "id": enrichlayer_credentials.id,
+                    "provider": enrichlayer_credentials.provider,
+                    "type": enrichlayer_credentials.type,
+                }
+            },
+        )
+    ],
+    LinkedinRoleLookupBlock: [
+        BlockCost(
+            cost_amount=3,
+            cost_filter={
+                "credentials": {
+                    "id": enrichlayer_credentials.id,
+                    "provider": enrichlayer_credentials.provider,
+                    "type": enrichlayer_credentials.type,
+                }
+            },
+        )
+    ],
+    GetLinkedinProfilePictureBlock: [
+        BlockCost(
+            cost_amount=3,
+            cost_filter={
+                "credentials": {
+                    "id": enrichlayer_credentials.id,
+                    "provider": enrichlayer_credentials.provider,
+                    "type": enrichlayer_credentials.type,
+                }
+            },
+        )
+    ],
     SmartDecisionMakerBlock: LLM_COST,
     SearchOrganizationsBlock: [
         BlockCost(
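The v0 section above follows the same pattern as the other providers: derive one cost entry per model by filtering the shared `MODEL_COST` table on the provider recorded in the model metadata. Reduced to plain dicts (illustrative data, not the real tables):

```python
# Illustrative stand-ins for MODEL_COST and MODEL_METADATA.provider
MODEL_COST = {"v0-1.5-md": 1, "v0-1.5-lg": 2, "gpt-x": 3}
MODEL_PROVIDER = {"v0-1.5-md": "v0", "v0-1.5-lg": "v0", "gpt-x": "openai"}


def costs_for_provider(provider: str) -> list[dict]:
    # One cost entry per model whose metadata names this provider,
    # mirroring the list-comprehension sections in LLM_COST.
    return [
        {"cost_filter": {"model": model}, "cost_amount": cost}
        for model, cost in MODEL_COST.items()
        if MODEL_PROVIDER[model] == provider
    ]
```

Adding a provider then only requires new `MODEL_COST`/metadata rows plus one more filtered comprehension; the cost amounts never need to be duplicated.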
109
autogpt_platform/backend/backend/data/generate_data.py
Normal file
@@ -0,0 +1,109 @@
|
||||
import logging
|
||||
from collections import defaultdict
|
||||
from datetime import datetime
|
||||
|
||||
from prisma.enums import AgentExecutionStatus
|
||||
|
||||
from backend.data.execution import get_graph_executions
|
||||
from backend.data.graph import get_graph_metadata
|
||||
from backend.data.model import UserExecutionSummaryStats
|
||||
from backend.server.v2.store.exceptions import DatabaseError
|
||||
from backend.util.logging import TruncatedLogger
|
||||
|
||||
logger = TruncatedLogger(logging.getLogger(__name__), prefix="[SummaryData]")
|
||||
|
||||
|
||||
async def get_user_execution_summary_data(
    user_id: str, start_time: datetime, end_time: datetime
) -> UserExecutionSummaryStats:
    """Gather all summary data for a user in a time range.

    This function fetches graph executions once and aggregates all required
    statistics in a single pass for efficiency.
    """
    try:
        # Fetch graph executions once
        executions = await get_graph_executions(
            user_id=user_id,
            created_time_gte=start_time,
            created_time_lte=end_time,
        )

        # Initialize aggregation variables
        total_credits_used = 0.0
        total_executions = len(executions)
        successful_runs = 0
        failed_runs = 0
        terminated_runs = 0
        execution_times = []
        agent_usage = defaultdict(int)
        cost_by_graph_id = defaultdict(float)

        # Single pass through executions to aggregate all stats
        for execution in executions:
            # Count execution statuses (including TERMINATED as failed)
            if execution.status == AgentExecutionStatus.COMPLETED:
                successful_runs += 1
            elif execution.status == AgentExecutionStatus.FAILED:
                failed_runs += 1
            elif execution.status == AgentExecutionStatus.TERMINATED:
                terminated_runs += 1

            # Aggregate costs from stats
            if execution.stats and hasattr(execution.stats, "cost"):
                cost_in_dollars = execution.stats.cost / 100
                total_credits_used += cost_in_dollars
                cost_by_graph_id[execution.graph_id] += cost_in_dollars

            # Collect execution times
            if execution.stats and hasattr(execution.stats, "duration"):
                execution_times.append(execution.stats.duration)

            # Count agent usage
            agent_usage[execution.graph_id] += 1

        # Calculate derived stats
        total_execution_time = sum(execution_times)
        average_execution_time = (
            total_execution_time / len(execution_times) if execution_times else 0
        )

        # Find most used agent
        most_used_agent = "No agents used"
        if agent_usage:
            most_used_agent_id = max(agent_usage, key=lambda k: agent_usage[k])
            try:
                graph_meta = await get_graph_metadata(graph_id=most_used_agent_id)
                most_used_agent = (
                    graph_meta.name if graph_meta else f"Agent {most_used_agent_id[:8]}"
                )
            except Exception:
                logger.warning(f"Could not get metadata for graph {most_used_agent_id}")
                most_used_agent = f"Agent {most_used_agent_id[:8]}"

        # Convert graph_ids to agent names for cost breakdown
        cost_breakdown = {}
        for graph_id, cost in cost_by_graph_id.items():
            try:
                graph_meta = await get_graph_metadata(graph_id=graph_id)
                agent_name = graph_meta.name if graph_meta else f"Agent {graph_id[:8]}"
            except Exception:
                logger.warning(f"Could not get metadata for graph {graph_id}")
                agent_name = f"Agent {graph_id[:8]}"
            cost_breakdown[agent_name] = cost

        # Build the summary stats object (include terminated runs as failed)
        return UserExecutionSummaryStats(
            total_credits_used=total_credits_used,
            total_executions=total_executions,
            successful_runs=successful_runs,
            failed_runs=failed_runs + terminated_runs,
            most_used_agent=most_used_agent,
            total_execution_time=total_execution_time,
            average_execution_time=average_execution_time,
            cost_breakdown=cost_breakdown,
        )

    except Exception as e:
        logger.error(f"Failed to get user summary data: {e}")
        raise DatabaseError(f"Failed to get user summary data: {e}") from e
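The single-pass aggregation above can be sketched in isolation. `Execution` and its fields below are simplified stand-ins for the real models (assumption: status strings, a `graph_id`, and a cost in cents), not the project's actual types:

```python
from collections import defaultdict
from dataclasses import dataclass

# Hypothetical, simplified stand-in for the real execution record.
@dataclass
class Execution:
    status: str
    graph_id: str
    cost_cents: int

def summarize(executions):
    """Aggregate success/failure counts and per-agent cost in one pass."""
    successes = failures = 0
    cost_by_graph = defaultdict(float)
    for ex in executions:
        if ex.status == "COMPLETED":
            successes += 1
        elif ex.status in ("FAILED", "TERMINATED"):  # terminated counts as failed
            failures += 1
        cost_by_graph[ex.graph_id] += ex.cost_cents / 100  # cents -> credits
    return successes, failures, dict(cost_by_graph)

executions = [
    Execution("COMPLETED", "g1", 150),
    Execution("FAILED", "g1", 50),
    Execution("TERMINATED", "g2", 25),
]
print(summarize(executions))  # (1, 2, {'g1': 2.0, 'g2': 0.25})
```

The point of the single pass is that status counts, durations, and per-graph costs all come from one iteration over one fetched result set, instead of one query per statistic.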
@@ -821,3 +821,21 @@ class GraphExecutionStats(BaseModel):
    activity_status: Optional[str] = Field(
        default=None, description="AI-generated summary of what the agent did"
    )


class UserExecutionSummaryStats(BaseModel):
    """Summary of user statistics for a specific user."""

    model_config = ConfigDict(
        extra="allow",
        arbitrary_types_allowed=True,
    )

    total_credits_used: float = Field(default=0)
    total_executions: int = Field(default=0)
    successful_runs: int = Field(default=0)
    failed_runs: int = Field(default=0)
    most_used_agent: str = Field(default="")
    total_execution_time: float = Field(default=0)
    average_execution_time: float = Field(default=0)
    cost_breakdown: dict[str, float] = Field(default_factory=dict)
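`cost_breakdown` uses `Field(default_factory=dict)` rather than a literal `{}` default; the reason is the usual mutable-default pitfall, which the stdlib `dataclasses` equivalent makes easy to see (a sketch, not the actual Pydantic model):

```python
from dataclasses import dataclass, field

@dataclass
class SummaryStats:
    total_executions: int = 0
    # default_factory gives each instance its own dict; a single shared
    # default dict would let one summary's entries leak into every instance.
    cost_breakdown: dict = field(default_factory=dict)

a, b = SummaryStats(), SummaryStats()
a.cost_breakdown["agent1"] = 1.0
print(b.cost_breakdown)  # {} - b is unaffected
```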
@@ -20,6 +20,7 @@ from backend.data.execution import (
    upsert_execution_input,
    upsert_execution_output,
)
from backend.data.generate_data import get_user_execution_summary_data
from backend.data.graph import (
    get_connected_output_nodes,
    get_graph,
@@ -144,6 +145,9 @@ class DatabaseManager(AppService):
        get_user_notification_oldest_message_in_batch
    )

    # Summary data - async
    get_user_execution_summary_data = _(get_user_execution_summary_data)


class DatabaseManagerClient(AppServiceClient):
    d = DatabaseManager
@@ -169,6 +173,9 @@ class DatabaseManagerClient(AppServiceClient):
    spend_credits = _(d.spend_credits)
    get_credits = _(d.get_credits)

    # Summary data - async
    get_user_execution_summary_data = _(d.get_user_execution_summary_data)

    # Block error monitoring
    get_block_error_stats = _(d.get_block_error_stats)

@@ -215,3 +222,6 @@ class DatabaseManagerAsyncClient(AppServiceClient):
    get_user_notification_oldest_message_in_batch = (
        d.get_user_notification_oldest_message_in_batch
    )

    # Summary data
    get_user_execution_summary_data = d.get_user_execution_summary_data
@@ -1208,6 +1208,9 @@ class ExecutionManager(AppProcess):
            )
            return

        # Check if channel is closed and force reconnection if needed
        if not self.cancel_client.is_ready:
            self.cancel_client.disconnect()
            self.cancel_client.connect()
        cancel_channel = self.cancel_client.get_channel()
        cancel_channel.basic_consume(
@@ -1237,6 +1240,9 @@ class ExecutionManager(AppProcess):
            )
            return

        # Check if channel is closed and force reconnection if needed
        if not self.run_client.is_ready:
            self.run_client.disconnect()
            self.run_client.connect()
        run_channel = self.run_client.get_channel()
        run_channel.basic_qos(prefetch_count=self.pool_size)
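Both consumers add the same guard: if the channel has gone stale, disconnect and reconnect before using it. A minimal sketch with a hypothetical client class — the real `is_ready`/`connect`/`disconnect` belong to the project's RabbitMQ wrapper, not this stand-in:

```python
class FakeClient:
    """Hypothetical stand-in for the RabbitMQ client wrapper."""

    def __init__(self):
        self.is_ready = False
        self.reconnects = 0

    def disconnect(self):
        self.is_ready = False

    def connect(self):
        self.is_ready = True
        self.reconnects += 1

def ensure_connected(client):
    # If the channel went stale, force a clean reconnect before consuming.
    if not client.is_ready:
        client.disconnect()
        client.connect()

client = FakeClient()
ensure_connected(client)  # stale: reconnects
ensure_connected(client)  # already ready: no-op
print(client.reconnects)  # 1
```

Disconnecting before reconnecting keeps the wrapper from piling up half-open connections when the broker dropped the channel underneath it.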
@@ -305,9 +305,10 @@ class Scheduler(AppService):

        if self.register_system_tasks:
            # Notification PROCESS WEEKLY SUMMARY
            # Runs every Monday at 9 AM UTC
            self.scheduler.add_job(
                process_weekly_summary,
                CronTrigger.from_crontab("0 * * * *"),
                CronTrigger.from_crontab("0 9 * * 1"),
                id="process_weekly_summary",
                kwargs={},
                replace_existing=True,
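The corrected crontab `0 9 * * 1` reads minute 0, hour 9, any day-of-month, any month, weekday 1 (Monday in cron numbering), replacing the old `0 * * * *`, which fired every hour. That specific schedule can be checked with plain `datetime` (a stand-alone sketch, not APScheduler itself):

```python
from datetime import datetime

def matches_weekly_summary(dt: datetime) -> bool:
    """True when dt falls on the '0 9 * * 1' schedule: Mondays at 09:00.

    Note: cron numbers Monday as 1, while datetime.weekday() numbers it 0.
    """
    return dt.weekday() == 0 and dt.hour == 9 and dt.minute == 0

print(matches_weekly_summary(datetime(2024, 1, 1, 9, 0)))   # Mon 1 Jan 2024 -> True
print(matches_weekly_summary(datetime(2024, 1, 1, 10, 0)))  # wrong hour -> False
print(matches_weekly_summary(datetime(2024, 1, 2, 9, 0)))   # Tuesday -> False
```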
@@ -182,6 +182,15 @@ zerobounce_credentials = APIKeyCredentials(
    expires_at=None,
)

enrichlayer_credentials = APIKeyCredentials(
    id="d9fce73a-6c1d-4e8b-ba2e-12a456789def",
    provider="enrichlayer",
    api_key=SecretStr(settings.secrets.enrichlayer_api_key),
    title="Use Credits for Enrichlayer",
    expires_at=None,
)


llama_api_credentials = APIKeyCredentials(
    id="d44045af-1c33-4833-9e19-752313214de2",
    provider="llama_api",
@@ -190,6 +199,14 @@ llama_api_credentials = APIKeyCredentials(
    expires_at=None,
)

v0_credentials = APIKeyCredentials(
    id="c4e6d1a0-3b5f-4789-a8e2-9b123456789f",
    provider="v0",
    api_key=SecretStr(settings.secrets.v0_api_key),
    title="Use Credits for v0 by Vercel",
    expires_at=None,
)

DEFAULT_CREDENTIALS = [
    ollama_credentials,
    revid_credentials,
@@ -203,6 +220,7 @@ DEFAULT_CREDENTIALS = [
    jina_credentials,
    unreal_credentials,
    open_router_credentials,
    enrichlayer_credentials,
    fal_credentials,
    exa_credentials,
    e2b_credentials,
@@ -213,6 +231,8 @@ DEFAULT_CREDENTIALS = [
    smartlead_credentials,
    zerobounce_credentials,
    google_maps_credentials,
    llama_api_credentials,
    v0_credentials,
]


@@ -279,6 +299,8 @@ class IntegrationCredentialsStore:
            all_credentials.append(unreal_credentials)
        if settings.secrets.open_router_api_key:
            all_credentials.append(open_router_credentials)
        if settings.secrets.enrichlayer_api_key:
            all_credentials.append(enrichlayer_credentials)
        if settings.secrets.fal_api_key:
            all_credentials.append(fal_credentials)
        if settings.secrets.exa_api_key:
@@ -25,6 +25,7 @@ class ProviderName(str, Enum):
    GROQ = "groq"
    HTTP = "http"
    HUBSPOT = "hubspot"
    ENRICHLAYER = "enrichlayer"
    IDEOGRAM = "ideogram"
    JINA = "jina"
    LLAMA_API = "llama_api"
@@ -47,6 +48,7 @@ class ProviderName(str, Enum):
    TWITTER = "twitter"
    TODOIST = "todoist"
    UNREAL_SPEECH = "unreal_speech"
    V0 = "v0"
    ZEROBOUNCE = "zerobounce"

    @classmethod
@@ -223,10 +223,14 @@ class NotificationManager(AppService):
        processed_count = 0
        current_time = datetime.now(tz=timezone.utc)
        start_time = current_time - timedelta(days=7)
        logger.info(
            f"Querying for active users between {start_time} and {current_time}"
        )
        users = await get_database_manager_async_client().get_active_user_ids_in_timerange(
            end_time=current_time.isoformat(),
            start_time=start_time.isoformat(),
        )
        logger.info(f"Found {len(users)} active users in the last 7 days")
        for user in users:
            await self._queue_scheduled_notification(
                SummaryParamsEventModel(
@@ -384,10 +388,13 @@ class NotificationManager(AppService):
    async def _queue_scheduled_notification(self, event: SummaryParamsEventModel):
        """Queue a scheduled notification - exposed method for other services to call"""
        try:
            logger.debug(f"Received Request to queue scheduled notification {event=}")
            logger.info(
                f"Queueing scheduled notification type={event.type} user_id={event.user_id}"
            )

            exchange = "notifications"
            routing_key = get_routing_key(event.type)
            logger.info(f"Using routing key: {routing_key}")

            # Publish to RabbitMQ
            await self.rabbit.publish_message(
@@ -395,6 +402,7 @@ class NotificationManager(AppService):
                message=event.model_dump_json(),
                exchange=next(ex for ex in EXCHANGES if ex.name == exchange),
            )
            logger.info(f"Successfully queued notification for user {event.user_id}")

        except Exception as e:
            logger.exception(f"Error queueing notification: {e}")
@@ -416,85 +424,99 @@ class NotificationManager(AppService):
        # only if both are true, should we email this person
        return validated_email and preference

    def _gather_summary_data(
    async def _gather_summary_data(
        self, user_id: str, event_type: NotificationType, params: BaseSummaryParams
    ) -> BaseSummaryData:
        """Gathers the data to build a summary notification"""

        logger.info(
            f"Gathering summary data for {user_id} and {event_type} wiht {params=}"
            f"Gathering summary data for {user_id} and {event_type} with {params=}"
        )

        # total_credits_used = self.run_and_wait(
        #     get_total_credits_used(user_id, start_time, end_time)
        # )

        # total_executions = self.run_and_wait(
        #     get_total_executions(user_id, start_time, end_time)
        # )

        # most_used_agent = self.run_and_wait(
        #     get_most_used_agent(user_id, start_time, end_time)
        # )

        # execution_times = self.run_and_wait(
        #     get_execution_time(user_id, start_time, end_time)
        # )

        # runs = self.run_and_wait(
        #     get_runs(user_id, start_time, end_time)
        # )
        total_credits_used = 3.0
        total_executions = 2
        most_used_agent = {"name": "Some"}
        execution_times = [1, 2, 3]
        runs = [{"status": "COMPLETED"}, {"status": "FAILED"}]

        successful_runs = len([run for run in runs if run["status"] == "COMPLETED"])
        failed_runs = len([run for run in runs if run["status"] != "COMPLETED"])
        average_execution_time = (
            sum(execution_times) / len(execution_times) if execution_times else 0
        )
        # cost_breakdown = self.run_and_wait(
        #     get_cost_breakdown(user_id, start_time, end_time)
        # )

        cost_breakdown = {
            "agent1": 1.0,
            "agent2": 2.0,
        }

        if event_type == NotificationType.DAILY_SUMMARY and isinstance(
            params, DailySummaryParams
        ):
            return DailySummaryData(
                total_credits_used=total_credits_used,
                total_executions=total_executions,
                most_used_agent=most_used_agent["name"],
                total_execution_time=sum(execution_times),
                successful_runs=successful_runs,
                failed_runs=failed_runs,
                average_execution_time=average_execution_time,
                cost_breakdown=cost_breakdown,
                date=params.date,
        try:
            # Get summary data from the database
            summary_data = await get_database_manager_async_client().get_user_execution_summary_data(
                user_id=user_id,
                start_time=params.start_date,
                end_time=params.end_date,
            )
        elif event_type == NotificationType.WEEKLY_SUMMARY and isinstance(
            params, WeeklySummaryParams
        ):
            return WeeklySummaryData(
                total_credits_used=total_credits_used,
                total_executions=total_executions,
                most_used_agent=most_used_agent["name"],
                total_execution_time=sum(execution_times),
                successful_runs=successful_runs,
                failed_runs=failed_runs,
                average_execution_time=average_execution_time,
                cost_breakdown=cost_breakdown,
                start_date=params.start_date,
                end_date=params.end_date,
            )
        else:
            raise ValueError("Invalid event type or params")

            # Extract data from summary
            total_credits_used = summary_data.total_credits_used
            total_executions = summary_data.total_executions
            most_used_agent = summary_data.most_used_agent
            successful_runs = summary_data.successful_runs
            failed_runs = summary_data.failed_runs
            total_execution_time = summary_data.total_execution_time
            average_execution_time = summary_data.average_execution_time
            cost_breakdown = summary_data.cost_breakdown

            if event_type == NotificationType.DAILY_SUMMARY and isinstance(
                params, DailySummaryParams
            ):
                return DailySummaryData(
                    total_credits_used=total_credits_used,
                    total_executions=total_executions,
                    most_used_agent=most_used_agent,
                    total_execution_time=total_execution_time,
                    successful_runs=successful_runs,
                    failed_runs=failed_runs,
                    average_execution_time=average_execution_time,
                    cost_breakdown=cost_breakdown,
                    date=params.date,
                )
            elif event_type == NotificationType.WEEKLY_SUMMARY and isinstance(
                params, WeeklySummaryParams
            ):
                return WeeklySummaryData(
                    total_credits_used=total_credits_used,
                    total_executions=total_executions,
                    most_used_agent=most_used_agent,
                    total_execution_time=total_execution_time,
                    successful_runs=successful_runs,
                    failed_runs=failed_runs,
                    average_execution_time=average_execution_time,
                    cost_breakdown=cost_breakdown,
                    start_date=params.start_date,
                    end_date=params.end_date,
                )
            else:
                raise ValueError("Invalid event type or params")

        except Exception as e:
            logger.error(f"Failed to gather summary data: {e}")
            # Return sensible defaults in case of error
            if event_type == NotificationType.DAILY_SUMMARY and isinstance(
                params, DailySummaryParams
            ):
                return DailySummaryData(
                    total_credits_used=0.0,
                    total_executions=0,
                    most_used_agent="No data available",
                    total_execution_time=0.0,
                    successful_runs=0,
                    failed_runs=0,
                    average_execution_time=0.0,
                    cost_breakdown={},
                    date=params.date,
                )
            elif event_type == NotificationType.WEEKLY_SUMMARY and isinstance(
                params, WeeklySummaryParams
            ):
                return WeeklySummaryData(
                    total_credits_used=0.0,
                    total_executions=0,
                    most_used_agent="No data available",
                    total_execution_time=0.0,
                    successful_runs=0,
                    failed_runs=0,
                    average_execution_time=0.0,
                    cost_breakdown={},
                    start_date=params.start_date,
                    end_date=params.end_date,
                )
            else:
                raise ValueError("Invalid event type or params") from e

    async def _should_batch(
        self, user_id: str, event_type: NotificationType, event: NotificationEventModel
@@ -764,7 +786,7 @@ class NotificationManager(AppService):
            )
            return True

        summary_data = self._gather_summary_data(
        summary_data = await self._gather_summary_data(
            event.user_id, event.type, model.data
        )
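The new error path returns zeroed summary data instead of propagating the failure, so a transient database error degrades the email rather than dropping it. The pattern in isolation, with a hypothetical fetch callable and a simplified dict payload standing in for the real summary models:

```python
def gather_summary(fetch):
    """Return real stats when fetch succeeds, zeroed defaults when it fails."""
    try:
        return fetch()
    except Exception:
        # Sensible defaults: the notification still goes out, just empty.
        return {
            "total_executions": 0,
            "most_used_agent": "No data available",
            "cost_breakdown": {},
        }

def fetch_ok():
    return {
        "total_executions": 5,
        "most_used_agent": "Agent X",
        "cost_breakdown": {"Agent X": 2.5},
    }

def fetch_down():
    raise RuntimeError("db down")

print(gather_summary(fetch_ok)["total_executions"])   # 5
print(gather_summary(fetch_down)["most_used_agent"])  # No data available
```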
@@ -5,23 +5,64 @@ data.start_date: the start date of the summary
data.end_date: the end date of the summary
data.total_credits_used: the total credits used during the summary
data.total_executions: the total number of executions during the summary
data.most_used_agent: the most used agent's nameduring the summary
data.most_used_agent: the most used agent's name during the summary
data.total_execution_time: the total execution time during the summary
data.successful_runs: the total number of successful runs during the summary
data.failed_runs: the total number of failed runs during the summary
data.average_execution_time: the average execution time during the summary
data.cost_breakdown: the cost breakdown during the summary
data.cost_breakdown: the cost breakdown during the summary (dict mapping agent names to credit amounts)
#}

<h1>Weekly Summary</h1>
<h1 style="color: #5D23BB; font-size: 32px; font-weight: 600; margin-bottom: 25px; margin-top: 0;">
  Weekly Summary
</h1>

<p>Start Date: {{ data.start_date }}</p>
<p>End Date: {{ data.end_date }}</p>
<p>Total Credits Used: {{ data.total_credits_used }}</p>
<p>Total Executions: {{ data.total_executions }}</p>
<p>Most Used Agent: {{ data.most_used_agent }}</p>
<p>Total Execution Time: {{ data.total_execution_time }}</p>
<p>Successful Runs: {{ data.successful_runs }}</p>
<p>Failed Runs: {{ data.failed_runs }}</p>
<p>Average Execution Time: {{ data.average_execution_time }}</p>
<p>Cost Breakdown: {{ data.cost_breakdown }}</p>
<h2 style="color: #070629; font-size: 24px; font-weight: 500; margin-bottom: 20px;">
  Your Agent Activity: {{ data.start_date.strftime('%B %-d') }} – {{ data.end_date.strftime('%B %-d') }}
</h2>

<div style="background-color: #ffffff; border-radius: 8px; padding: 20px; margin-bottom: 25px;">
  <ul style="list-style-type: disc; padding-left: 20px; margin: 0;">
    <li style="font-size: 16px; line-height: 1.8; margin-bottom: 8px;">
      <strong>Total Executions:</strong> {{ data.total_executions }}
    </li>
    <li style="font-size: 16px; line-height: 1.8; margin-bottom: 8px;">
      <strong>Total Credits Used:</strong> {{ "%.2f"|format(data.total_credits_used) }}
    </li>
    <li style="font-size: 16px; line-height: 1.8; margin-bottom: 8px;">
      <strong>Total Execution Time:</strong> {{ "%.1f"|format(data.total_execution_time) }} seconds
    </li>
    <li style="font-size: 16px; line-height: 1.8; margin-bottom: 8px;">
      <strong>Successful Runs:</strong> {{ data.successful_runs }}
    </li>
    <li style="font-size: 16px; line-height: 1.8; margin-bottom: 8px;">
      <strong>Failed Runs:</strong> {{ data.failed_runs }}
    </li>
    <li style="font-size: 16px; line-height: 1.8; margin-bottom: 8px;">
      <strong>Average Execution Time:</strong> {{ "%.1f"|format(data.average_execution_time) }} seconds
    </li>
    <li style="font-size: 16px; line-height: 1.8; margin-bottom: 8px;">
      <strong>Most Used Agent:</strong> {{ data.most_used_agent }}
    </li>
    {% if data.cost_breakdown %}
    <li style="font-size: 16px; line-height: 1.8; margin-bottom: 8px;">
      <strong>Cost Breakdown:</strong>
      <ul style="list-style-type: disc; padding-left: 40px; margin-top: 8px;">
        {% for agent_name, credits in data.cost_breakdown.items() %}
        <li style="font-size: 16px; line-height: 1.8; margin-bottom: 4px;">
          {{ agent_name }}: {{ "%.2f"|format(credits) }} credits
        </li>
        {% endfor %}
      </ul>
    </li>
    {% endif %}
  </ul>
</div>

<p style="font-size: 16px; line-height: 165%; margin-top: 20px; margin-bottom: 10px;">
  Thank you for being a part of the AutoGPT community! 🎉
</p>

<p style="font-size: 16px; line-height: 165%; margin-bottom: 0;">
  Join the conversation on <a href="https://discord.gg/autogpt" style="color: #4285F4; text-decoration: underline;">Discord here</a>.
</p>
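A note on the template's number formatting: Jinja's builtin `format` filter is printf-style, and the *filtered value* is the format string, so the working direction is `{{ "%.2f"|format(x) }}` (writing `x|format("%.2f")` would try to use the number itself as a format string and fail at render time). In plain Python it corresponds to `%` formatting:

```python
# Jinja's `format` filter: {{ "%.2f"|format(x) }} behaves like "%.2f" % x.
credits = 3.14159
print("%.2f" % credits)  # 3.14
print("%.1f" % 12.345)   # 12.3
```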
@@ -9,7 +9,6 @@ import fastapi.responses
import pydantic
import starlette.middleware.cors
import uvicorn
from autogpt_libs.logging.utils import generate_uvicorn_config
from fastapi.exceptions import RequestValidationError
from fastapi.routing import APIRoute

@@ -247,7 +246,7 @@ class AgentServer(backend.util.service.AppProcess):
            server_app,
            host=backend.util.settings.Config().agent_api_host,
            port=backend.util.settings.Config().agent_api_port,
            log_config=generate_uvicorn_config(),
            log_config=None,
        )

    def cleanup(self):

@@ -6,7 +6,6 @@ from typing import Protocol
import pydantic
import uvicorn
from autogpt_libs.auth import parse_jwt_token
from autogpt_libs.logging.utils import generate_uvicorn_config
from fastapi import Depends, FastAPI, WebSocket, WebSocketDisconnect
from starlette.middleware.cors import CORSMiddleware

@@ -309,7 +308,7 @@ class WebsocketServer(AppProcess):
            server_app,
            host=Config().websocket_server_host,
            port=Config().websocket_server_port,
            log_config=generate_uvicorn_config(),
            log_config=None,
        )

    def cleanup(self):

@@ -24,7 +24,6 @@ from typing import (

import httpx
import uvicorn
from autogpt_libs.logging.utils import generate_uvicorn_config
from fastapi import FastAPI, Request, responses
from pydantic import BaseModel, TypeAdapter, create_model

@@ -266,7 +265,7 @@ class AppService(BaseAppService, ABC):
                self.fastapi_app,
                host=api_host,
                port=self.get_port(),
                log_config=generate_uvicorn_config(),
                log_config=None,  # Explicitly None to avoid uvicorn replacing the logger.
                log_level=self.log_level,
            )
        )
@@ -360,7 +360,7 @@ class Config(UpdateTrackingModel["Config"], BaseSettings):
        description="Maximum message size limit for communication with the message bus",
    )

    backend_cors_allow_origins: List[str] = Field(default_factory=list)
    backend_cors_allow_origins: List[str] = Field(default=["http://localhost:3000"])

    @field_validator("backend_cors_allow_origins")
    @classmethod
@@ -472,6 +472,7 @@ class Secrets(UpdateTrackingModel["Secrets"], BaseSettings):
    groq_api_key: str = Field(default="", description="Groq API key")
    open_router_api_key: str = Field(default="", description="Open Router API Key")
    llama_api_key: str = Field(default="", description="Llama API Key")
    v0_api_key: str = Field(default="", description="v0 by Vercel API key")

    reddit_client_id: str = Field(default="", description="Reddit client ID")
    reddit_client_secret: str = Field(default="", description="Reddit client secret")
@@ -521,6 +522,7 @@ class Secrets(UpdateTrackingModel["Secrets"], BaseSettings):
    apollo_api_key: str = Field(default="", description="Apollo API Key")
    smartlead_api_key: str = Field(default="", description="SmartLead API Key")
    zerobounce_api_key: str = Field(default="", description="ZeroBounce API Key")
    enrichlayer_api_key: str = Field(default="", description="Enrichlayer API Key")

    # AutoMod API credentials
    automod_api_key: str = Field(default="", description="AutoMod API key")
@@ -534,7 +536,6 @@ class Secrets(UpdateTrackingModel["Secrets"], BaseSettings):
    ayrshare_api_key: str = Field(default="", description="Ayrshare API Key")
    ayrshare_jwt_key: str = Field(default="", description="Ayrshare private Key")
    # Add more secret fields as needed

    model_config = SettingsConfigDict(
        env_file=".env",
        env_file_encoding="utf-8",
@@ -1,123 +0,0 @@
############
# Secrets
# YOU MUST CHANGE THESE BEFORE GOING INTO PRODUCTION
############

POSTGRES_PASSWORD=your-super-secret-and-long-postgres-password
JWT_SECRET=your-super-secret-jwt-token-with-at-least-32-characters-long
ANON_KEY=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyAgCiAgICAicm9sZSI6ICJhbm9uIiwKICAgICJpc3MiOiAic3VwYWJhc2UtZGVtbyIsCiAgICAiaWF0IjogMTY0MTc2OTIwMCwKICAgICJleHAiOiAxNzk5NTM1NjAwCn0.dc_X5iR_VP_qT0zsiyj_I_OZ2T9FtRU2BBNWN8Bu4GE
SERVICE_ROLE_KEY=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyAgCiAgICAicm9sZSI6ICJzZXJ2aWNlX3JvbGUiLAogICAgImlzcyI6ICJzdXBhYmFzZS1kZW1vIiwKICAgICJpYXQiOiAxNjQxNzY5MjAwLAogICAgImV4cCI6IDE3OTk1MzU2MDAKfQ.DaYlNEoUrrEn2Ig7tqibS-PHK5vgusbcbo7X36XVt4Q
DASHBOARD_USERNAME=supabase
DASHBOARD_PASSWORD=this_password_is_insecure_and_should_be_updated
SECRET_KEY_BASE=UpNVntn3cDxHJpq99YMc1T1AQgQpc8kfYTuRgBiYa15BLrx8etQoXz3gZv1/u2oq
VAULT_ENC_KEY=your-encryption-key-32-chars-min


############
# Database - You can change these to any PostgreSQL database that has logical replication enabled.
############

POSTGRES_HOST=db
POSTGRES_DB=postgres
POSTGRES_PORT=5432
# default user is postgres


############
# Supavisor -- Database pooler
############
POOLER_PROXY_PORT_TRANSACTION=6543
POOLER_DEFAULT_POOL_SIZE=20
POOLER_MAX_CLIENT_CONN=100
POOLER_TENANT_ID=your-tenant-id


############
# API Proxy - Configuration for the Kong Reverse proxy.
############

KONG_HTTP_PORT=8000
KONG_HTTPS_PORT=8443


############
# API - Configuration for PostgREST.
############

PGRST_DB_SCHEMAS=public,storage,graphql_public


############
# Auth - Configuration for the GoTrue authentication server.
############

## General
SITE_URL=http://localhost:3000
ADDITIONAL_REDIRECT_URLS=
JWT_EXPIRY=3600
DISABLE_SIGNUP=false
API_EXTERNAL_URL=http://localhost:8000

## Mailer Config
MAILER_URLPATHS_CONFIRMATION="/auth/v1/verify"
MAILER_URLPATHS_INVITE="/auth/v1/verify"
MAILER_URLPATHS_RECOVERY="/auth/v1/verify"
MAILER_URLPATHS_EMAIL_CHANGE="/auth/v1/verify"

## Email auth
ENABLE_EMAIL_SIGNUP=true
ENABLE_EMAIL_AUTOCONFIRM=false
SMTP_ADMIN_EMAIL=admin@example.com
SMTP_HOST=supabase-mail
SMTP_PORT=2500
SMTP_USER=fake_mail_user
SMTP_PASS=fake_mail_password
SMTP_SENDER_NAME=fake_sender
ENABLE_ANONYMOUS_USERS=false

## Phone auth
ENABLE_PHONE_SIGNUP=true
ENABLE_PHONE_AUTOCONFIRM=true


############
# Studio - Configuration for the Dashboard
############

STUDIO_DEFAULT_ORGANIZATION=Default Organization
STUDIO_DEFAULT_PROJECT=Default Project

STUDIO_PORT=3000
# replace if you intend to use Studio outside of localhost
SUPABASE_PUBLIC_URL=http://localhost:8000

# Enable webp support
IMGPROXY_ENABLE_WEBP_DETECTION=true

# Add your OpenAI API key to enable SQL Editor Assistant
OPENAI_API_KEY=


############
# Functions - Configuration for Functions
############
# NOTE: VERIFY_JWT applies to all functions. Per-function VERIFY_JWT is not supported yet.
FUNCTIONS_VERIFY_JWT=false


############
# Logs - Configuration for Logflare
# Please refer to https://supabase.com/docs/reference/self-hosting-analytics/introduction
############

LOGFLARE_LOGGER_BACKEND_API_KEY=your-super-secret-and-long-logflare-key

# Change vector.toml sinks to reflect this change
LOGFLARE_API_KEY=your-super-secret-and-long-logflare-key

# Docker socket location - this value will differ depending on your OS
DOCKER_SOCKET_LOCATION=/var/run/docker.sock

# Google Cloud Project details
GOOGLE_PROJECT_ID=GOOGLE_PROJECT_ID
GOOGLE_PROJECT_NUMBER=GOOGLE_PROJECT_NUMBER
autogpt_platform/db/docker/.gitignore
@@ -1,5 +1,4 @@
volumes/db/data
volumes/storage
.env
test.http
docker-compose.override.yml
@@ -5,8 +5,101 @@
# Destroy: docker compose -f docker-compose.yml -f ./dev/docker-compose.dev.yml down -v --remove-orphans
# Reset everything: ./reset.sh

# Environment Variable Loading Order (first → last, later overrides earlier):
#   1. ../../.env.default - Default values for all Supabase settings
#   2. ../../.env         - User's custom configuration (if exists)
#   3. ./.env             - Local overrides specific to db/docker (if exists)
#   4. environment key    - Service-specific overrides defined below
#   5. Shell environment  - Variables exported before running docker compose

name: supabase

# Common env_file configuration for all Supabase services
x-supabase-env-files: &supabase-env-files
  env_file:
    - ../../.env.default  # Base defaults from platform root
    - path: ../../.env    # User overrides from platform root (optional)
      required: false
    - path: ./.env        # Local overrides for db/docker (optional)
      required: false
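The load order above is "later overrides earlier": the effective environment is a left-to-right merge where each layer only replaces the keys it actually sets. A sketch in Python (file names are the ones from the comment; the values are invented):

```python
def effective_env(*layers):
    """Merge env layers left to right; later layers win on key conflicts."""
    merged = {}
    for layer in layers:
        merged.update(layer)
    return merged

env_default = {"POSTGRES_PORT": "5432", "SITE_URL": "http://localhost:3000"}  # ../../.env.default
user_env = {"SITE_URL": "https://example.com"}                                # ../../.env
local_env = {"POSTGRES_PORT": "6543"}                                         # ./.env
print(effective_env(env_default, user_env, local_env))
# {'POSTGRES_PORT': '6543', 'SITE_URL': 'https://example.com'}
```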
# Common Supabase environment - hardcoded defaults to avoid variable substitution
|
||||
x-supabase-env: &supabase-env
|
||||
# Core PostgreSQL settings
|
||||
POSTGRES_PASSWORD: your-super-secret-and-long-postgres-password
|
||||
POSTGRES_HOST: db
|
||||
POSTGRES_PORT: "5432"
|
||||
POSTGRES_DB: postgres
|
||||
|
||||
# Authentication & Security
|
||||
JWT_SECRET: your-super-secret-jwt-token-with-at-least-32-characters-long
|
||||
ANON_KEY: eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyAgCiAgICAicm9sZSI6ICJhbm9uIiwKICAgICJpc3MiOiAic3VwYWJhc2UtZGVtbyIsCiAgICAiaWF0IjogMTY0MTc2OTIwMCwKICAgICJleHAiOiAxNzk5NTM1NjAwCn0.dc_X5iR_VP_qT0zsiyj_I_OZ2T9FtRU2BBNWN8Bu4GE
|
||||
SERVICE_ROLE_KEY: eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyAgCiAgICAicm9sZSI6ICJzZXJ2aWNlX3JvbGUiLAogICAgImlzcyI6ICJzdXBhYmFzZS1kZW1vIiwKICAgICJpYXQiOiAxNjQxNzY5MjAwLAogICAgImV4cCI6IDE3OTk1MzU2MDAKfQ.DaYlNEoUrrEn2Ig7tqibS-PHK5vgusbcbo7X36XVt4Q
DASHBOARD_USERNAME: supabase
DASHBOARD_PASSWORD: this_password_is_insecure_and_should_be_updated
SECRET_KEY_BASE: UpNVntn3cDxHJpq99YMc1T1AQgQpc8kfYTuRgBiYa15BLrx8etQoXz3gZv1/u2oq
VAULT_ENC_KEY: your-encryption-key-32-chars-min

# URLs and Endpoints
SITE_URL: http://localhost:3000
API_EXTERNAL_URL: http://localhost:8000
SUPABASE_PUBLIC_URL: http://localhost:8000
ADDITIONAL_REDIRECT_URLS: ""

# Feature Flags
DISABLE_SIGNUP: "false"
ENABLE_EMAIL_SIGNUP: "true"
ENABLE_EMAIL_AUTOCONFIRM: "false"
ENABLE_ANONYMOUS_USERS: "false"
ENABLE_PHONE_SIGNUP: "true"
ENABLE_PHONE_AUTOCONFIRM: "true"
FUNCTIONS_VERIFY_JWT: "false"
IMGPROXY_ENABLE_WEBP_DETECTION: "true"

# Email/SMTP Configuration
SMTP_ADMIN_EMAIL: admin@example.com
SMTP_HOST: supabase-mail
SMTP_PORT: "2500"
SMTP_USER: fake_mail_user
SMTP_PASS: fake_mail_password
SMTP_SENDER_NAME: fake_sender

# Mailer URLs
MAILER_URLPATHS_CONFIRMATION: /auth/v1/verify
MAILER_URLPATHS_INVITE: /auth/v1/verify
MAILER_URLPATHS_RECOVERY: /auth/v1/verify
MAILER_URLPATHS_EMAIL_CHANGE: /auth/v1/verify

# JWT Settings
JWT_EXPIRY: "3600"

# Database Schemas
PGRST_DB_SCHEMAS: public,storage,graphql_public

# Studio Settings
STUDIO_DEFAULT_ORGANIZATION: Default Organization
STUDIO_DEFAULT_PROJECT: Default Project

# Logging
LOGFLARE_API_KEY: your-super-secret-and-long-logflare-key

# Pooler Settings
POOLER_DEFAULT_POOL_SIZE: "20"
POOLER_MAX_CLIENT_CONN: "100"
POOLER_TENANT_ID: your-tenant-id
POOLER_PROXY_PORT_TRANSACTION: "6543"

# Kong Ports
KONG_HTTP_PORT: "8000"
KONG_HTTPS_PORT: "8443"

# Docker
DOCKER_SOCKET_LOCATION: /var/run/docker.sock

# Google Cloud (if needed)
GOOGLE_PROJECT_ID: GOOGLE_PROJECT_ID
GOOGLE_PROJECT_NUMBER: GOOGLE_PROJECT_NUMBER

services:

studio:
@@ -24,24 +117,24 @@ services:
timeout: 10s
interval: 5s
retries: 3
depends_on:
analytics:
condition: service_healthy
<<: *supabase-env-files
environment:
<<: *supabase-env
# Keep any existing environment variables specific to that service
STUDIO_PG_META_URL: http://meta:8080
POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
POSTGRES_PASSWORD: your-super-secret-and-long-postgres-password

DEFAULT_ORGANIZATION_NAME: ${STUDIO_DEFAULT_ORGANIZATION}
DEFAULT_PROJECT_NAME: ${STUDIO_DEFAULT_PROJECT}
OPENAI_API_KEY: ${OPENAI_API_KEY:-}
DEFAULT_ORGANIZATION_NAME: Default Organization
DEFAULT_PROJECT_NAME: Default Project
OPENAI_API_KEY: ""

SUPABASE_URL: http://kong:8000
SUPABASE_PUBLIC_URL: ${SUPABASE_PUBLIC_URL}
SUPABASE_ANON_KEY: ${ANON_KEY}
SUPABASE_SERVICE_KEY: ${SERVICE_ROLE_KEY}
AUTH_JWT_SECRET: ${JWT_SECRET}
SUPABASE_PUBLIC_URL: http://localhost:8000
SUPABASE_ANON_KEY: eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyAgCiAgICAicm9sZSI6ICJhbm9uIiwKICAgICJpc3MiOiAic3VwYWJhc2UtZGVtbyIsCiAgICAiaWF0IjogMTY0MTc2OTIwMCwKICAgICJleHAiOiAxNzk5NTM1NjAwCn0.dc_X5iR_VP_qT0zsiyj_I_OZ2T9FtRU2BBNWN8Bu4GE
SUPABASE_SERVICE_KEY: eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyAgCiAgICAicm9sZSI6ICJzZXJ2aWNlX3JvbGUiLAogICAgImlzcyI6ICJzdXBhYmFzZS1kZW1vIiwKICAgICJpYXQiOiAxNjQxNzY5MjAwLAogICAgImV4cCI6IDE3OTk1MzU2MDAKfQ.DaYlNEoUrrEn2Ig7tqibS-PHK5vgusbcbo7X36XVt4Q
AUTH_JWT_SECRET: your-super-secret-jwt-token-with-at-least-32-characters-long

LOGFLARE_API_KEY: ${LOGFLARE_API_KEY}
LOGFLARE_API_KEY: your-super-secret-and-long-logflare-key
LOGFLARE_URL: http://analytics:4000
NEXT_PUBLIC_ENABLE_LOGS: true
# Comment to use Big Query backend for analytics
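The `<<: *supabase-env` lines in the diff rely on YAML merge keys: the anchored mapping's entries are spliced into the service's `environment`, and any key declared explicitly next to the merge overrides the anchor's value. A minimal sketch of the mechanism (the `x-demo-env` anchor, service name, and variables are illustrative, not from the real compose file):

```yaml
x-demo-env: &demo-env
  JWT_EXPIRY: "3600"
  LOG_LEVEL: info

services:
  demo:
    environment:
      <<: *demo-env    # pulls in JWT_EXPIRY and LOG_LEVEL from the anchor
      LOG_LEVEL: debug # a key defined locally wins over the merged value
```

This is why each service below only lists its service-specific overrides after the merge line.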
@@ -54,15 +147,15 @@ services:
image: kong:2.8.1
restart: unless-stopped
ports:
- ${KONG_HTTP_PORT}:8000/tcp
- ${KONG_HTTPS_PORT}:8443/tcp
- 8000:8000/tcp
- 8443:8443/tcp
volumes:
# https://github.com/supabase/supabase/issues/12661
- ./volumes/api/kong.yml:/home/kong/temp.yml:ro
depends_on:
analytics:
condition: service_healthy
<<: *supabase-env-files
environment:
<<: *supabase-env
# Keep any existing environment variables specific to that service
KONG_DATABASE: "off"
KONG_DECLARATIVE_CONFIG: /home/kong/kong.yml
# https://github.com/supabase/cli/issues/14
@@ -70,10 +163,10 @@ services:
KONG_PLUGINS: request-transformer,cors,key-auth,acl,basic-auth
KONG_NGINX_PROXY_PROXY_BUFFER_SIZE: 160k
KONG_NGINX_PROXY_PROXY_BUFFERS: 64 160k
SUPABASE_ANON_KEY: ${ANON_KEY}
SUPABASE_SERVICE_KEY: ${SERVICE_ROLE_KEY}
DASHBOARD_USERNAME: ${DASHBOARD_USERNAME}
DASHBOARD_PASSWORD: ${DASHBOARD_PASSWORD}
SUPABASE_ANON_KEY: eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyAgCiAgICAicm9sZSI6ICJhbm9uIiwKICAgICJpc3MiOiAic3VwYWJhc2UtZGVtbyIsCiAgICAiaWF0IjogMTY0MTc2OTIwMCwKICAgICJleHAiOiAxNzk5NTM1NjAwCn0.dc_X5iR_VP_qT0zsiyj_I_OZ2T9FtRU2BBNWN8Bu4GE
SUPABASE_SERVICE_KEY: eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyAgCiAgICAicm9sZSI6ICJzZXJ2aWNlX3JvbGUiLAogICAgImlzcyI6ICJzdXBhYmFzZS1kZW1vIiwKICAgICJpYXQiOiAxNjQxNzY5MjAwLAogICAgImV4cCI6IDE3OTk1MzU2MDAKfQ.DaYlNEoUrrEn2Ig7tqibS-PHK5vgusbcbo7X36XVt4Q
DASHBOARD_USERNAME: supabase
DASHBOARD_PASSWORD: this_password_is_insecure_and_should_be_updated
# https://unix.stackexchange.com/a/294837
entrypoint: bash -c 'eval "echo \"$$(cat ~/temp.yml)\"" > ~/kong.yml && /docker-entrypoint.sh kong docker-start'
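The Kong `entrypoint` above uses an eval-echo trick to expand `$VAR` references inside the declarative config template before Kong starts (the doubled `$$` is only compose-file escaping). A minimal sketch of the same mechanism in plain shell; the `/tmp` paths and `DEMO_KEY` value are illustrative:

```shell
# Expand $VAR references inside a template file with the shell, then write the
# result to the real config location -- the same mechanism as the entrypoint's
# eval "echo \"$(cat ~/temp.yml)\"" > ~/kong.yml.
export DEMO_KEY=demo-anon-key
printf 'keyauth_credentials:\n  - key: $DEMO_KEY\n' > /tmp/temp.yml
eval "echo \"$(cat /tmp/temp.yml)\"" > /tmp/kong.yml
cat /tmp/kong.yml   # the $DEMO_KEY placeholder is now the exported value
```

Note the trade-off: any literal `$` or `"` in the template will also be interpreted by the shell, which is why the upstream issue links accompany this entrypoint.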
@@ -98,48 +191,49 @@ services:
db:
# Disable this if you are using an external Postgres database
condition: service_healthy
analytics:
condition: service_healthy
<<: *supabase-env-files
environment:
<<: *supabase-env
# Keep any existing environment variables specific to that service
GOTRUE_API_HOST: 0.0.0.0
GOTRUE_API_PORT: 9999
API_EXTERNAL_URL: ${API_EXTERNAL_URL}
API_EXTERNAL_URL: http://localhost:8000

GOTRUE_DB_DRIVER: postgres
GOTRUE_DB_DATABASE_URL: postgres://supabase_auth_admin:${POSTGRES_PASSWORD}@${POSTGRES_HOST}:${POSTGRES_PORT}/${POSTGRES_DB}
GOTRUE_DB_DATABASE_URL: postgres://supabase_auth_admin:your-super-secret-and-long-postgres-password@db:5432/postgres

GOTRUE_SITE_URL: ${SITE_URL}
GOTRUE_URI_ALLOW_LIST: ${ADDITIONAL_REDIRECT_URLS}
GOTRUE_DISABLE_SIGNUP: ${DISABLE_SIGNUP}
GOTRUE_SITE_URL: http://localhost:3000
GOTRUE_URI_ALLOW_LIST: ""
GOTRUE_DISABLE_SIGNUP: false

GOTRUE_JWT_ADMIN_ROLES: service_role
GOTRUE_JWT_AUD: authenticated
GOTRUE_JWT_DEFAULT_GROUP_NAME: authenticated
GOTRUE_JWT_EXP: ${JWT_EXPIRY}
GOTRUE_JWT_SECRET: ${JWT_SECRET}
GOTRUE_JWT_EXP: 3600
GOTRUE_JWT_SECRET: your-super-secret-jwt-token-with-at-least-32-characters-long

GOTRUE_EXTERNAL_EMAIL_ENABLED: ${ENABLE_EMAIL_SIGNUP}
GOTRUE_EXTERNAL_ANONYMOUS_USERS_ENABLED: ${ENABLE_ANONYMOUS_USERS}
GOTRUE_MAILER_AUTOCONFIRM: ${ENABLE_EMAIL_AUTOCONFIRM}
GOTRUE_EXTERNAL_EMAIL_ENABLED: true
GOTRUE_EXTERNAL_ANONYMOUS_USERS_ENABLED: false
GOTRUE_MAILER_AUTOCONFIRM: false

# Uncomment to bypass nonce check in ID Token flow. Commonly set to true when using Google Sign In on mobile.
# GOTRUE_EXTERNAL_SKIP_NONCE_CHECK: true

# GOTRUE_MAILER_SECURE_EMAIL_CHANGE_ENABLED: true
# GOTRUE_SMTP_MAX_FREQUENCY: 1s
GOTRUE_SMTP_ADMIN_EMAIL: ${SMTP_ADMIN_EMAIL}
GOTRUE_SMTP_HOST: ${SMTP_HOST}
GOTRUE_SMTP_PORT: ${SMTP_PORT}
GOTRUE_SMTP_USER: ${SMTP_USER}
GOTRUE_SMTP_PASS: ${SMTP_PASS}
GOTRUE_SMTP_SENDER_NAME: ${SMTP_SENDER_NAME}
GOTRUE_MAILER_URLPATHS_INVITE: ${MAILER_URLPATHS_INVITE}
GOTRUE_MAILER_URLPATHS_CONFIRMATION: ${MAILER_URLPATHS_CONFIRMATION}
GOTRUE_MAILER_URLPATHS_RECOVERY: ${MAILER_URLPATHS_RECOVERY}
GOTRUE_MAILER_URLPATHS_EMAIL_CHANGE: ${MAILER_URLPATHS_EMAIL_CHANGE}
GOTRUE_SMTP_ADMIN_EMAIL: admin@example.com
GOTRUE_SMTP_HOST: supabase-mail
GOTRUE_SMTP_PORT: 2500
GOTRUE_SMTP_USER: fake_mail_user
GOTRUE_SMTP_PASS: fake_mail_password
GOTRUE_SMTP_SENDER_NAME: fake_sender
GOTRUE_MAILER_URLPATHS_INVITE: /auth/v1/verify
GOTRUE_MAILER_URLPATHS_CONFIRMATION: /auth/v1/verify
GOTRUE_MAILER_URLPATHS_RECOVERY: /auth/v1/verify
GOTRUE_MAILER_URLPATHS_EMAIL_CHANGE: /auth/v1/verify

GOTRUE_EXTERNAL_PHONE_ENABLED: ${ENABLE_PHONE_SIGNUP}
GOTRUE_SMS_AUTOCONFIRM: ${ENABLE_PHONE_AUTOCONFIRM}
GOTRUE_EXTERNAL_PHONE_ENABLED: true
GOTRUE_SMS_AUTOCONFIRM: true
# Uncomment to enable custom access token hook. Please see: https://supabase.com/docs/guides/auth/auth-hooks for full list of hooks and additional details about custom_access_token_hook

# GOTRUE_HOOK_CUSTOM_ACCESS_TOKEN_ENABLED: "true"
@@ -168,16 +262,17 @@ services:
db:
# Disable this if you are using an external Postgres database
condition: service_healthy
analytics:
condition: service_healthy
<<: *supabase-env-files
environment:
PGRST_DB_URI: postgres://authenticator:${POSTGRES_PASSWORD}@${POSTGRES_HOST}:${POSTGRES_PORT}/${POSTGRES_DB}
PGRST_DB_SCHEMAS: ${PGRST_DB_SCHEMAS}
<<: *supabase-env
# Keep any existing environment variables specific to that service
PGRST_DB_URI: postgres://authenticator:your-super-secret-and-long-postgres-password@db:5432/postgres
PGRST_DB_SCHEMAS: public,storage,graphql_public
PGRST_DB_ANON_ROLE: anon
PGRST_JWT_SECRET: ${JWT_SECRET}
PGRST_JWT_SECRET: your-super-secret-jwt-token-with-at-least-32-characters-long
PGRST_DB_USE_LEGACY_GUCS: "false"
PGRST_APP_SETTINGS_JWT_SECRET: ${JWT_SECRET}
PGRST_APP_SETTINGS_JWT_EXP: ${JWT_EXPIRY}
PGRST_APP_SETTINGS_JWT_SECRET: your-super-secret-jwt-token-with-at-least-32-characters-long
PGRST_APP_SETTINGS_JWT_EXP: 3600
command:
[
"postgrest"
@@ -192,8 +287,6 @@ services:
db:
# Disable this if you are using an external Postgres database
condition: service_healthy
analytics:
condition: service_healthy
healthcheck:
test:
[
@@ -204,23 +297,26 @@ services:
"-o",
"/dev/null",
"-H",
"Authorization: Bearer ${ANON_KEY}",
"Authorization: Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyAgCiAgICAicm9sZSI6ICJhbm9uIiwKICAgICJpc3MiOiAic3VwYWJhc2UtZGVtbyIsCiAgICAiaWF0IjogMTY0MTc2OTIwMCwKICAgICJleHAiOiAxNzk5NTM1NjAwCn0.dc_X5iR_VP_qT0zsiyj_I_OZ2T9FtRU2BBNWN8Bu4GE",
"http://localhost:4000/api/tenants/realtime-dev/health"
]
timeout: 5s
interval: 5s
retries: 3
<<: *supabase-env-files
environment:
<<: *supabase-env
# Keep any existing environment variables specific to that service
PORT: 4000
DB_HOST: ${POSTGRES_HOST}
DB_PORT: ${POSTGRES_PORT}
DB_HOST: db
DB_PORT: 5432
DB_USER: supabase_admin
DB_PASSWORD: ${POSTGRES_PASSWORD}
DB_NAME: ${POSTGRES_DB}
DB_PASSWORD: your-super-secret-and-long-postgres-password
DB_NAME: postgres
DB_AFTER_CONNECT_QUERY: 'SET search_path TO _realtime'
DB_ENC_KEY: supabaserealtime
API_JWT_SECRET: ${JWT_SECRET}
SECRET_KEY_BASE: ${SECRET_KEY_BASE}
API_JWT_SECRET: your-super-secret-jwt-token-with-at-least-32-characters-long
SECRET_KEY_BASE: UpNVntn3cDxHJpq99YMc1T1AQgQpc8kfYTuRgBiYa15BLrx8etQoXz3gZv1/u2oq
ERL_AFLAGS: -proto_dist inet_tcp
DNS_NODES: "''"
RLIMIT_NOFILE: "10000"
@@ -256,12 +352,15 @@ services:
condition: service_started
imgproxy:
condition: service_started
<<: *supabase-env-files
environment:
ANON_KEY: ${ANON_KEY}
SERVICE_KEY: ${SERVICE_ROLE_KEY}
<<: *supabase-env
# Keep any existing environment variables specific to that service
ANON_KEY: eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyAgCiAgICAicm9sZSI6ICJhbm9uIiwKICAgICJpc3MiOiAic3VwYWJhc2UtZGVtbyIsCiAgICAiaWF0IjogMTY0MTc2OTIwMCwKICAgICJleHAiOiAxNzk5NTM1NjAwCn0.dc_X5iR_VP_qT0zsiyj_I_OZ2T9FtRU2BBNWN8Bu4GE
SERVICE_KEY: eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyAgCiAgICAicm9sZSI6ICJzZXJ2aWNlX3JvbGUiLAogICAgImlzcyI6ICJzdXBhYmFzZS1kZW1vIiwKICAgICJpYXQiOiAxNjQxNzY5MjAwLAogICAgImV4cCI6IDE3OTk1MzU2MDAKfQ.DaYlNEoUrrEn2Ig7tqibS-PHK5vgusbcbo7X36XVt4Q
POSTGREST_URL: http://rest:3000
PGRST_JWT_SECRET: ${JWT_SECRET}
DATABASE_URL: postgres://supabase_storage_admin:${POSTGRES_PASSWORD}@${POSTGRES_HOST}:${POSTGRES_PORT}/${POSTGRES_DB}
PGRST_JWT_SECRET: your-super-secret-jwt-token-with-at-least-32-characters-long
DATABASE_URL: postgres://supabase_storage_admin:your-super-secret-and-long-postgres-password@db:5432/postgres
FILE_SIZE_LIMIT: 52428800
STORAGE_BACKEND: file
FILE_STORAGE_BACKEND_PATH: /var/lib/storage
@@ -288,11 +387,14 @@ services:
timeout: 5s
interval: 5s
retries: 3
<<: *supabase-env-files
environment:
<<: *supabase-env
# Keep any existing environment variables specific to that service
IMGPROXY_BIND: ":5001"
IMGPROXY_LOCAL_FILESYSTEM_ROOT: /
IMGPROXY_USE_ETAG: "true"
IMGPROXY_ENABLE_WEBP_DETECTION: ${IMGPROXY_ENABLE_WEBP_DETECTION}
IMGPROXY_ENABLE_WEBP_DETECTION: true

meta:
container_name: supabase-meta
@@ -302,15 +404,16 @@ services:
db:
# Disable this if you are using an external Postgres database
condition: service_healthy
analytics:
condition: service_healthy
<<: *supabase-env-files
environment:
<<: *supabase-env
# Keep any existing environment variables specific to that service
PG_META_PORT: 8080
PG_META_DB_HOST: ${POSTGRES_HOST}
PG_META_DB_PORT: ${POSTGRES_PORT}
PG_META_DB_NAME: ${POSTGRES_DB}
PG_META_DB_HOST: db
PG_META_DB_PORT: 5432
PG_META_DB_NAME: postgres
PG_META_DB_USER: supabase_admin
PG_META_DB_PASSWORD: ${POSTGRES_PASSWORD}
PG_META_DB_PASSWORD: your-super-secret-and-long-postgres-password

functions:
container_name: supabase-edge-functions
@@ -318,17 +421,17 @@ services:
restart: unless-stopped
volumes:
- ./volumes/functions:/home/deno/functions:Z
depends_on:
analytics:
condition: service_healthy
<<: *supabase-env-files
environment:
JWT_SECRET: ${JWT_SECRET}
<<: *supabase-env
# Keep any existing environment variables specific to that service
JWT_SECRET: your-super-secret-jwt-token-with-at-least-32-characters-long
SUPABASE_URL: http://kong:8000
SUPABASE_ANON_KEY: ${ANON_KEY}
SUPABASE_SERVICE_ROLE_KEY: ${SERVICE_ROLE_KEY}
SUPABASE_DB_URL: postgresql://postgres:${POSTGRES_PASSWORD}@${POSTGRES_HOST}:${POSTGRES_PORT}/${POSTGRES_DB}
SUPABASE_ANON_KEY: eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyAgCiAgICAicm9sZSI6ICJhbm9uIiwKICAgICJpc3MiOiAic3VwYWJhc2UtZGVtbyIsCiAgICAiaWF0IjogMTY0MTc2OTIwMCwKICAgICJleHAiOiAxNzk5NTM1NjAwCn0.dc_X5iR_VP_qT0zsiyj_I_OZ2T9FtRU2BBNWN8Bu4GE
SUPABASE_SERVICE_ROLE_KEY: eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyAgCiAgICAicm9sZSI6ICJzZXJ2aWNlX3JvbGUiLAogICAgImlzcyI6ICJzdXBhYmFzZS1kZW1vIiwKICAgICJpYXQiOiAxNjQxNzY5MjAwLAogICAgImV4cCI6IDE3OTk1MzU2MDAKfQ.DaYlNEoUrrEn2Ig7tqibS-PHK5vgusbcbo7X36XVt4Q
SUPABASE_DB_URL: postgresql://postgres:your-super-secret-and-long-postgres-password@db:5432/postgres
# TODO: Allow configuring VERIFY_JWT per function. This PR might help: https://github.com/supabase/cli/pull/786
VERIFY_JWT: "${FUNCTIONS_VERIFY_JWT}"
VERIFY_JWT: "false"
command:
[
"start",
@@ -362,26 +465,29 @@ services:
db:
# Disable this if you are using an external Postgres database
condition: service_healthy
<<: *supabase-env-files
environment:
<<: *supabase-env
# Keep any existing environment variables specific to that service
LOGFLARE_NODE_HOST: 127.0.0.1
DB_USERNAME: supabase_admin
DB_DATABASE: _supabase
DB_HOSTNAME: ${POSTGRES_HOST}
DB_PORT: ${POSTGRES_PORT}
DB_PASSWORD: ${POSTGRES_PASSWORD}
DB_HOSTNAME: db
DB_PORT: 5432
DB_PASSWORD: your-super-secret-and-long-postgres-password
DB_SCHEMA: _analytics
LOGFLARE_API_KEY: ${LOGFLARE_API_KEY}
LOGFLARE_API_KEY: your-super-secret-and-long-logflare-key
LOGFLARE_SINGLE_TENANT: true
LOGFLARE_SUPABASE_MODE: true
LOGFLARE_MIN_CLUSTER_SIZE: 1

# Comment variables to use Big Query backend for analytics
POSTGRES_BACKEND_URL: postgresql://supabase_admin:${POSTGRES_PASSWORD}@${POSTGRES_HOST}:${POSTGRES_PORT}/_supabase
POSTGRES_BACKEND_URL: postgresql://supabase_admin:your-super-secret-and-long-postgres-password@db:5432/_supabase
POSTGRES_BACKEND_SCHEMA: _analytics
LOGFLARE_FEATURE_FLAG_OVERRIDE: multibackend=true
# Uncomment to use Big Query backend for analytics
# GOOGLE_PROJECT_ID: ${GOOGLE_PROJECT_ID}
# GOOGLE_PROJECT_NUMBER: ${GOOGLE_PROJECT_NUMBER}
# GOOGLE_PROJECT_ID: GOOGLE_PROJECT_ID
# GOOGLE_PROJECT_NUMBER: GOOGLE_PROJECT_NUMBER

# Comment out everything below this point if you are using an external Postgres database
db:
@@ -419,19 +525,19 @@ services:
interval: 5s
timeout: 5s
retries: 10
depends_on:
vector:
condition: service_healthy
<<: *supabase-env-files
environment:
<<: *supabase-env
# Keep any existing environment variables specific to that service
POSTGRES_HOST: /var/run/postgresql
PGPORT: ${POSTGRES_PORT}
POSTGRES_PORT: ${POSTGRES_PORT}
PGPASSWORD: ${POSTGRES_PASSWORD}
POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
PGDATABASE: ${POSTGRES_DB}
POSTGRES_DB: ${POSTGRES_DB}
JWT_SECRET: ${JWT_SECRET}
JWT_EXP: ${JWT_EXPIRY}
PGPORT: 5432
POSTGRES_PORT: 5432
PGPASSWORD: your-super-secret-and-long-postgres-password
POSTGRES_PASSWORD: your-super-secret-and-long-postgres-password
PGDATABASE: postgres
POSTGRES_DB: postgres
JWT_SECRET: your-super-secret-jwt-token-with-at-least-32-characters-long
JWT_EXP: 3600
command:
[
"postgres",
@@ -447,7 +553,7 @@ services:
restart: unless-stopped
volumes:
- ./volumes/logs/vector.yml:/etc/vector/vector.yml:ro
- ${DOCKER_SOCKET_LOCATION}:/var/run/docker.sock:ro
- /var/run/docker.sock:/var/run/docker.sock:ro
healthcheck:
test:
[
@@ -461,8 +567,11 @@ services:
timeout: 5s
interval: 5s
retries: 3
<<: *supabase-env-files
environment:
LOGFLARE_API_KEY: ${LOGFLARE_API_KEY}
<<: *supabase-env
# Keep any existing environment variables specific to that service
LOGFLARE_API_KEY: your-super-secret-and-long-logflare-key
command:
[
"--config",
@@ -475,8 +584,8 @@ services:
image: supabase/supavisor:2.4.12
restart: unless-stopped
ports:
- ${POSTGRES_PORT}:5432
- ${POOLER_PROXY_PORT_TRANSACTION}:6543
- 5432:5432
- 6543:6543
volumes:
- ./volumes/pooler/pooler.exs:/etc/pooler/pooler.exs:ro
healthcheck:
@@ -498,22 +607,25 @@ services:
condition: service_healthy
analytics:
condition: service_healthy
<<: *supabase-env-files
environment:
<<: *supabase-env
# Keep any existing environment variables specific to that service
PORT: 4000
POSTGRES_PORT: ${POSTGRES_PORT}
POSTGRES_DB: ${POSTGRES_DB}
POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
DATABASE_URL: ecto://supabase_admin:${POSTGRES_PASSWORD}@db:${POSTGRES_PORT}/_supabase
POSTGRES_PORT: 5432
POSTGRES_DB: postgres
POSTGRES_PASSWORD: your-super-secret-and-long-postgres-password
DATABASE_URL: ecto://supabase_admin:your-super-secret-and-long-postgres-password@db:5432/_supabase
CLUSTER_POSTGRES: true
SECRET_KEY_BASE: ${SECRET_KEY_BASE}
VAULT_ENC_KEY: ${VAULT_ENC_KEY}
API_JWT_SECRET: ${JWT_SECRET}
METRICS_JWT_SECRET: ${JWT_SECRET}
SECRET_KEY_BASE: UpNVntn3cDxHJpq99YMc1T1AQgQpc8kfYTuRgBiYa15BLrx8etQoXz3gZv1/u2oq
VAULT_ENC_KEY: your-encryption-key-32-chars-min
API_JWT_SECRET: your-super-secret-jwt-token-with-at-least-32-characters-long
METRICS_JWT_SECRET: your-super-secret-jwt-token-with-at-least-32-characters-long
REGION: local
ERL_AFLAGS: -proto_dist inet_tcp
POOLER_TENANT_ID: ${POOLER_TENANT_ID}
POOLER_DEFAULT_POOL_SIZE: ${POOLER_DEFAULT_POOL_SIZE}
POOLER_MAX_CLIENT_CONN: ${POOLER_MAX_CLIENT_CONN}
POOLER_TENANT_ID: your-tenant-id
POOLER_DEFAULT_POOL_SIZE: 20
POOLER_MAX_CLIENT_CONN: 100
POOLER_POOL_MODE: transaction
command:
[
@@ -34,11 +34,11 @@ else
echo "No .env file found. Skipping .env removal step..."
fi

if [ -f ".env.example" ]; then
echo "Copying .env.example to .env..."
cp .env.example .env
if [ -f ".env.default" ]; then
echo "Copying .env.default to .env..."
cp .env.default .env
else
echo ".env.example file not found. Skipping .env reset step..."
echo ".env.default file not found. Skipping .env reset step..."
fi

echo "Cleanup complete!"
@@ -1,9 +1,39 @@
# Environment Variable Loading Order (first → last, later overrides earlier):
# 1. backend/.env.default - Default values for all settings
# 2. backend/.env - User's custom configuration (if exists)
# 3. environment key - Docker-specific overrides defined below
# 4. Shell environment - Variables exported before running docker compose
# 5. CLI arguments - docker compose run -e VAR=value

# Common backend environment - Docker service names
x-backend-env:
&backend-env # Docker internal service hostnames (override localhost defaults)
PYRO_HOST: "0.0.0.0"
AGENTSERVER_HOST: rest_server
SCHEDULER_HOST: scheduler_server
DATABASEMANAGER_HOST: database_manager
EXECUTIONMANAGER_HOST: executor
NOTIFICATIONMANAGER_HOST: notification_server
CLAMAV_SERVICE_HOST: clamav
DB_HOST: db
REDIS_HOST: redis
RABBITMQ_HOST: rabbitmq
# Override Supabase URL for Docker network
SUPABASE_URL: http://kong:8000

# Common env_file configuration for backend services
x-backend-env-files: &backend-env-files
env_file:
- backend/.env.default # Base defaults (always exists)
- path: backend/.env # User overrides (optional)
required: false
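The load order described in the comment block above can be emulated in plain shell: source the defaults first, then the optional user file, so later definitions win — the same effect `docker compose` achieves by listing `backend/.env.default` before the optional `backend/.env`. The file paths and variable values below are illustrative:

```shell
# Build throwaway stand-ins for backend/.env.default and backend/.env.
printf 'DB_HOST=localhost\nREDIS_PORT=6379\n' > /tmp/env.default
printf 'DB_HOST=db\n' > /tmp/env.user

set -a                                   # export everything the sourced files define
. /tmp/env.default                       # layer 1: base defaults
[ -f /tmp/env.user ] && . /tmp/env.user  # layer 2: optional user overrides
set +a

echo "$DB_HOST $REDIS_PORT"              # prints: db 6379
```

The `[ -f ... ]` guard mirrors `required: false`: a missing user file is simply skipped rather than being an error.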
services:
migrate:
build:
context: ../
dockerfile: autogpt_platform/backend/Dockerfile
target: server
target: migrate
command: ["sh", "-c", "poetry run prisma migrate deploy"]
develop:
watch:
@@ -20,10 +50,11 @@ services:
- app-network
restart: on-failure
healthcheck:
test: ["CMD", "poetry", "run", "prisma", "migrate", "status"]
interval: 10s
timeout: 5s
retries: 5
test: ["CMD-SHELL", "poetry run prisma migrate status | grep -q 'No pending migrations' || exit 1"]
interval: 30s
timeout: 10s
retries: 3
start_period: 5s
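The new `CMD-SHELL` healthcheck passes only when the migration status output contains "No pending migrations"; the old `CMD` form succeeded whenever the status command merely exited zero. Its exit-code logic can be checked in isolation — here `poetry run prisma migrate status` is stubbed with `printf`:

```shell
# Stand-in for: poetry run prisma migrate status | grep -q 'No pending migrations' || exit 1
# Healthy only when the status text reports no pending migrations.
check() { printf '%s\n' "$1" | grep -q 'No pending migrations' || return 1; }

check 'Database schema is up to date. No pending migrations.' && echo healthy
check '2 migrations found, 1 pending' || echo unhealthy
```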
redis:
image: redis:latest
@@ -73,29 +104,12 @@ services:
condition: service_completed_successfully
rabbitmq:
condition: service_healthy
<<: *backend-env-files
environment:
- SUPABASE_URL=http://kong:8000
- SUPABASE_JWT_SECRET=your-super-secret-jwt-token-with-at-least-32-characters-long
- SUPABASE_SERVICE_ROLE_KEY=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyAgCiAgICAicm9sZSI6ICJzZXJ2aWNlX3JvbGUiLAogICAgImlzcyI6ICJzdXBhYmFzZS1kZW1vIiwKICAgICJpYXQiOiAxNjQxNzY5MjAwLAogICAgImV4cCI6IDE3OTk1MzU2MDAKfQ.DaYlNEoUrrEn2Ig7tqibS-PHK5vgusbcbo7X36XVt4Q
- DATABASE_URL=postgresql://postgres:your-super-secret-and-long-postgres-password@db:5432/postgres?connect_timeout=60&schema=platform
- DIRECT_URL=postgresql://postgres:your-super-secret-and-long-postgres-password@db:5432/postgres?connect_timeout=60&schema=platform
- REDIS_HOST=redis
- REDIS_PORT=6379
- RABBITMQ_HOST=rabbitmq
- RABBITMQ_PORT=5672
- RABBITMQ_DEFAULT_USER=rabbitmq_user_default
- RABBITMQ_DEFAULT_PASS=k0VMxyIJF9S35f3x2uaw5IWAl6Y536O7
- REDIS_PASSWORD=password
- ENABLE_AUTH=true
- PYRO_HOST=0.0.0.0
- SCHEDULER_HOST=scheduler_server
- EXECUTIONMANAGER_HOST=executor
- NOTIFICATIONMANAGER_HOST=notification_server
- CLAMAV_SERVICE_HOST=clamav
- NEXT_PUBLIC_FRONTEND_BASE_URL=http://localhost:3000
- BACKEND_CORS_ALLOW_ORIGINS=["http://localhost:3000"]
- ENCRYPTION_KEY=dvziYgz0KSK8FENhju0ZYi8-fRTfAdlz6YLhdB_jhNw= # DO NOT USE IN PRODUCTION!!
- UNSUBSCRIBE_SECRET_KEY=HlP8ivStJjmbf6NKi78m_3FnOogut0t5ckzjsIqeaio= # DO NOT USE IN PRODUCTION!!
<<: *backend-env
# Service-specific overrides
DATABASE_URL: postgresql://postgres:your-super-secret-and-long-postgres-password@db:5432/postgres?connect_timeout=60&schema=platform
DIRECT_URL: postgresql://postgres:your-super-secret-and-long-postgres-password@db:5432/postgres?connect_timeout=60&schema=platform
ports:
- "8006:8006"
networks:
@@ -123,26 +137,12 @@ services:
condition: service_completed_successfully
database_manager:
condition: service_started
<<: *backend-env-files
environment:
- DATABASEMANAGER_HOST=database_manager
- SUPABASE_URL=http://kong:8000
- SUPABASE_JWT_SECRET=your-super-secret-jwt-token-with-at-least-32-characters-long
- SUPABASE_SERVICE_ROLE_KEY=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyAgCiAgICAicm9sZSI6ICJzZXJ2aWNlX3JvbGUiLAogICAgImlzcyI6ICJzdXBhYmFzZS1kZW1vIiwKICAgICJpYXQiOiAxNjQxNzY5MjAwLAogICAgImV4cCI6IDE3OTk1MzU2MDAKfQ.DaYlNEoUrrEn2Ig7tqibS-PHK5vgusbcbo7X36XVt4Q
- DATABASE_URL=postgresql://postgres:your-super-secret-and-long-postgres-password@db:5432/postgres?connect_timeout=60&schema=platform
- DIRECT_URL=postgresql://postgres:your-super-secret-and-long-postgres-password@db:5432/postgres?connect_timeout=60&schema=platform
- REDIS_HOST=redis
- REDIS_PORT=6379
- REDIS_PASSWORD=password
- RABBITMQ_HOST=rabbitmq
- RABBITMQ_PORT=5672
- RABBITMQ_DEFAULT_USER=rabbitmq_user_default
- RABBITMQ_DEFAULT_PASS=k0VMxyIJF9S35f3x2uaw5IWAl6Y536O7
- ENABLE_AUTH=true
- PYRO_HOST=0.0.0.0
- AGENTSERVER_HOST=rest_server
- NOTIFICATIONMANAGER_HOST=notification_server
- CLAMAV_SERVICE_HOST=clamav
- ENCRYPTION_KEY=dvziYgz0KSK8FENhju0ZYi8-fRTfAdlz6YLhdB_jhNw= # DO NOT USE IN PRODUCTION!!
<<: *backend-env
# Service-specific overrides
DATABASE_URL: postgresql://postgres:your-super-secret-and-long-postgres-password@db:5432/postgres?connect_timeout=60&schema=platform
DIRECT_URL: postgresql://postgres:your-super-secret-and-long-postgres-password@db:5432/postgres?connect_timeout=60&schema=platform
ports:
- "8002:8002"
networks:
@@ -168,22 +168,12 @@ services:
condition: service_completed_successfully
database_manager:
condition: service_started
<<: *backend-env-files
environment:
- DATABASEMANAGER_HOST=database_manager
- SUPABASE_JWT_SECRET=your-super-secret-jwt-token-with-at-least-32-characters-long
- DATABASE_URL=postgresql://postgres:your-super-secret-and-long-postgres-password@db:5432/postgres?connect_timeout=60&schema=platform
- DIRECT_URL=postgresql://postgres:your-super-secret-and-long-postgres-password@db:5432/postgres?connect_timeout=60&schema=platform
- REDIS_HOST=redis
- REDIS_PORT=6379
- REDIS_PASSWORD=password
# - RABBITMQ_HOST=rabbitmq
# - RABBITMQ_PORT=5672
# - RABBITMQ_DEFAULT_USER=rabbitmq_user_default
# - RABBITMQ_DEFAULT_PASS=k0VMxyIJF9S35f3x2uaw5IWAl6Y536O7
- ENABLE_AUTH=true
- PYRO_HOST=0.0.0.0
- BACKEND_CORS_ALLOW_ORIGINS=["http://localhost:3000"]

<<: *backend-env
# Service-specific overrides
DATABASE_URL: postgresql://postgres:your-super-secret-and-long-postgres-password@db:5432/postgres?connect_timeout=60&schema=platform
DIRECT_URL: postgresql://postgres:your-super-secret-and-long-postgres-password@db:5432/postgres?connect_timeout=60&schema=platform
ports:
- "8001:8001"
networks:
@@ -205,11 +195,12 @@ services:
condition: service_healthy
migrate:
condition: service_completed_successfully
<<: *backend-env-files
environment:
- DATABASE_URL=postgresql://postgres:your-super-secret-and-long-postgres-password@db:5432/postgres?connect_timeout=60&schema=platform
- DIRECT_URL=postgresql://postgres:your-super-secret-and-long-postgres-password@db:5432/postgres?connect_timeout=60&schema=platform
- PYRO_HOST=0.0.0.0
- ENCRYPTION_KEY=dvziYgz0KSK8FENhju0ZYi8-fRTfAdlz6YLhdB_jhNw= # DO NOT USE IN PRODUCTION!!
<<: *backend-env
# Service-specific overrides
DATABASE_URL: postgresql://postgres:your-super-secret-and-long-postgres-password@db:5432/postgres?connect_timeout=60&schema=platform
DIRECT_URL: postgresql://postgres:your-super-secret-and-long-postgres-password@db:5432/postgres?connect_timeout=60&schema=platform
ports:
- "8005:8005"
networks:
@@ -250,23 +241,12 @@ services:
# interval: 10s
# timeout: 10s
# retries: 5
<<: *backend-env-files
environment:
- DATABASEMANAGER_HOST=database_manager
- NOTIFICATIONMANAGER_HOST=notification_server
- SUPABASE_JWT_SECRET=your-super-secret-jwt-token-with-at-least-32-characters-long
- DATABASE_URL=postgresql://postgres:your-super-secret-and-long-postgres-password@db:5432/postgres?connect_timeout=60&schema=platform
- DIRECT_URL=postgresql://postgres:your-super-secret-and-long-postgres-password@db:5432/postgres?connect_timeout=60&schema=platform
- REDIS_HOST=redis
- REDIS_PORT=6379
- REDIS_PASSWORD=password
- RABBITMQ_HOST=rabbitmq
- RABBITMQ_PORT=5672
- RABBITMQ_DEFAULT_USER=rabbitmq_user_default
- RABBITMQ_DEFAULT_PASS=k0VMxyIJF9S35f3x2uaw5IWAl6Y536O7
- ENABLE_AUTH=true
- PYRO_HOST=0.0.0.0
- BACKEND_CORS_ALLOW_ORIGINS=["http://localhost:3000"]

<<: *backend-env
# Service-specific overrides
DATABASE_URL: postgresql://postgres:your-super-secret-and-long-postgres-password@db:5432/postgres?connect_timeout=60&schema=platform
DIRECT_URL: postgresql://postgres:your-super-secret-and-long-postgres-password@db:5432/postgres?connect_timeout=60&schema=platform
ports:
- "8003:8003"
networks:
@@ -292,52 +272,39 @@ services:
|
||||
condition: service_completed_successfully
|
||||
database_manager:
|
||||
condition: service_started
|
||||
<<: *backend-env-files
|
||||
environment:
|
||||
- DATABASEMANAGER_HOST=database_manager
|
||||
- REDIS_HOST=redis
|
||||
- REDIS_PORT=6379
|
||||
- REDIS_PASSWORD=password
|
||||
- RABBITMQ_HOST=rabbitmq
|
||||
- RABBITMQ_PORT=5672
|
||||
- RABBITMQ_DEFAULT_USER=rabbitmq_user_default
|
||||
- RABBITMQ_DEFAULT_PASS=k0VMxyIJF9S35f3x2uaw5IWAl6Y536O7
|
||||
- ENABLE_AUTH=true
|
||||
- PYRO_HOST=0.0.0.0
|
||||
- BACKEND_CORS_ALLOW_ORIGINS=["http://localhost:3000"]
|
||||
|
||||
<<: *backend-env
|
||||
ports:
|
||||
- "8007:8007"
|
||||
networks:
|
||||
- app-network
|
||||
|
||||
# frontend:
#   build:
#     context: ../
#     dockerfile: autogpt_platform/frontend/Dockerfile
#     target: dev
#   depends_on:
#     db:
#       condition: service_healthy
#     rest_server:
#       condition: service_started
#     websocket_server:
#       condition: service_started
#     migrate:
#       condition: service_completed_successfully
#   environment:
#     - NEXT_PUBLIC_SUPABASE_URL=http://kong:8000
#     - NEXT_PUBLIC_SUPABASE_ANON_KEY=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyAgCiAgICAicm9sZSI6ICJhbm9uIiwKICAgICJpc3MiOiAic3VwYWJhc2UtZGVtbyIsCiAgICAiaWF0IjogMTY0MTc2OTIwMCwKICAgICJleHAiOiAxNzk5NTM1NjAwCn0.dc_X5iR_VP_qT0zsiyj_I_OZ2T9FtRU2BBNWN8Bu4GE
#     - DATABASE_URL=postgresql://agpt_user:pass123@postgres:5432/postgres?connect_timeout=60&schema=platform
#     - DIRECT_URL=postgresql://agpt_user:pass123@postgres:5432/postgres?connect_timeout=60&schema=platform
#     - NEXT_PUBLIC_AGPT_SERVER_URL=http://localhost:8006/api
#     - NEXT_PUBLIC_AGPT_WS_SERVER_URL=ws://localhost:8001/ws
#     - NEXT_PUBLIC_AGPT_MARKETPLACE_URL=http://localhost:8015/api/v1/market
#     - NEXT_PUBLIC_BEHAVE_AS=LOCAL
#   ports:
#     - "3000:3000"
#   networks:
#     - app-network

frontend:
  build:
    context: ../
    dockerfile: autogpt_platform/frontend/Dockerfile
    target: prod
  depends_on:
    db:
      condition: service_healthy
    migrate:
      condition: service_completed_successfully
  ports:
    - "3000:3000"
  networks:
    - app-network
  # Load environment variables in order (later overrides earlier)
  env_file:
    - path: ./frontend/.env.default # Base defaults (always exists)
    - path: ./frontend/.env # User overrides (optional)
      required: false
  environment:
    # Server-side environment variables (Docker service names)
    # These override the localhost URLs from env files when running in Docker
    AUTH_CALLBACK_URL: http://rest_server:8006/auth/callback
    SUPABASE_URL: http://kong:8000
    AGPT_SERVER_URL: http://rest_server:8006/api
    AGPT_WS_SERVER_URL: ws://websocket_server:8001/ws

networks:
  app-network:
    driver: bridge

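The "later overrides earlier" rule in the `env_file` list above means a key repeated in the optional `./frontend/.env` wins over the same key in `.env.default`. A minimal sketch of that precedence, simulated with plain shell sourcing in the same order (the file contents here are illustrative, not the real defaults):

```shell
# Write a defaults file and a user-override file (hypothetical values).
printf 'NEXT_PUBLIC_SUPABASE_URL=http://localhost:8000\nNEXT_PUBLIC_BEHAVE_AS=LOCAL\n' > .env.default
printf 'NEXT_PUBLIC_BEHAVE_AS=CLOUD\n' > .env

# Source in the same order Compose reads env_file entries:
# later assignments overwrite earlier ones.
set -a
. ./.env.default
. ./.env
set +a

echo "$NEXT_PUBLIC_BEHAVE_AS"   # → CLOUD (override wins)
echo "$NEXT_PUBLIC_SUPABASE_URL" # → default survives, nothing overrode it
```

Keys set under `environment:` take precedence over both files, which is why the Docker-internal URLs above reliably replace the localhost ones.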
@@ -20,6 +20,7 @@ x-supabase-services:
- app-network
- shared-network

services:
# AGPT services
migrate:
@@ -96,19 +97,13 @@ services:
timeout: 10s
retries: 3

# frontend:
#   <<: *agpt-services
#   extends:
#     file: ./docker-compose.platform.yml
#     service: frontend

# Supabase services
studio:
<<: *supabase-services
frontend:
<<: *agpt-services
extends:
file: ./db/docker/docker-compose.yml
service: studio
file: ./docker-compose.platform.yml
service: frontend

# Supabase services (minimal: auth + db + kong)
kong:
<<: *supabase-services
extends:
@@ -123,61 +118,35 @@ services:
environment:
GOTRUE_MAILER_AUTOCONFIRM: true

rest:
<<: *supabase-services
extends:
file: ./db/docker/docker-compose.yml
service: rest

realtime:
<<: *supabase-services
extends:
file: ./db/docker/docker-compose.yml
service: realtime

storage:
<<: *supabase-services
extends:
file: ./db/docker/docker-compose.yml
service: storage

imgproxy:
<<: *supabase-services
extends:
file: ./db/docker/docker-compose.yml
service: imgproxy

meta:
<<: *supabase-services
extends:
file: ./db/docker/docker-compose.yml
service: meta

functions:
<<: *supabase-services
extends:
file: ./db/docker/docker-compose.yml
service: functions

analytics:
<<: *supabase-services
extends:
file: ./db/docker/docker-compose.yml
service: analytics

db:
<<: *supabase-services
extends:
file: ./db/docker/docker-compose.yml
service: db
ports:
- ${POSTGRES_PORT}:5432 # We don't use Supavisor locally, so we expose the db directly.
- 5432:5432 # We don't use Supavisor locally, so we expose the db directly.

vector:
# Studio and its dependencies for local development only
meta:
<<: *supabase-services
profiles:
- local
extends:
file: ./db/docker/docker-compose.yml
service: vector
service: meta

studio:
<<: *supabase-services
profiles:
- local
extends:
file: ./db/docker/docker-compose.yml
service: studio
depends_on:
meta:
condition: service_healthy
# environment:
#   NEXT_PUBLIC_ENABLE_LOGS: false # Disable analytics/logging features

deps:
<<: *supabase-services
@@ -186,13 +155,24 @@ services:
image: busybox
command: /bin/true
depends_on:
- studio
- kong
- auth
- meta
- analytics
- db
- vector
- studio
- redis
- rabbitmq
- clamav
- migrate

deps_backend:
<<: *agpt-services
profiles:
- local
image: busybox
command: /bin/true
depends_on:
- deps
- rest_server
- executor
- websocket_server
- database_manager

autogpt_platform/frontend/.env.default (new file)
@@ -0,0 +1,20 @@
NEXT_PUBLIC_SUPABASE_URL=http://localhost:8000
NEXT_PUBLIC_SUPABASE_ANON_KEY=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyAgCiAgICAicm9sZSI6ICJhbm9uIiwKICAgICJpc3MiOiAic3VwYWJhc2UtZGVtbyIsCiAgICAiaWF0IjogMTY0MTc2OTIwMCwKICAgICJleHAiOiAxNzk5NTM1NjAwCn0.dc_X5iR_VP_qT0zsiyj_I_OZ2T9FtRU2BBNWN8Bu4GE

NEXT_PUBLIC_AGPT_SERVER_URL=http://localhost:8006/api
NEXT_PUBLIC_AGPT_WS_SERVER_URL=ws://localhost:8001/ws
NEXT_PUBLIC_FRONTEND_BASE_URL=http://localhost:3000

NEXT_PUBLIC_APP_ENV=local
NEXT_PUBLIC_BEHAVE_AS=LOCAL

NEXT_PUBLIC_LAUNCHDARKLY_ENABLED=false
NEXT_PUBLIC_LAUNCHDARKLY_CLIENT_ID=687ab1372f497809b131e06e

NEXT_PUBLIC_SHOW_BILLING_PAGE=false
NEXT_PUBLIC_TURNSTILE=disabled
NEXT_PUBLIC_REACT_QUERY_DEVTOOL=true

NEXT_PUBLIC_GA_MEASUREMENT_ID=G-FH2XK2W4GN
NEXT_PUBLIC_PW_TEST=true

@@ -1,44 +0,0 @@
NEXT_PUBLIC_FRONTEND_BASE_URL=http://localhost:3000

NEXT_PUBLIC_AUTH_CALLBACK_URL=http://localhost:8006/auth/callback
NEXT_PUBLIC_AGPT_SERVER_URL=http://localhost:8006/api
NEXT_PUBLIC_AGPT_WS_SERVER_URL=ws://localhost:8001/ws
NEXT_PUBLIC_AGPT_MARKETPLACE_URL=http://localhost:8015/api/v1/market
NEXT_PUBLIC_LAUNCHDARKLY_ENABLED=false
NEXT_PUBLIC_LAUNCHDARKLY_CLIENT_ID=687ab1372f497809b131e06e # Local environment on Launch darkly
NEXT_PUBLIC_APP_ENV=local

NEXT_PUBLIC_AGPT_SERVER_BASE_URL=http://localhost:8006

## Locale settings

NEXT_PUBLIC_DEFAULT_LOCALE=en
NEXT_PUBLIC_LOCALES=en,es

## Supabase credentials

NEXT_PUBLIC_SUPABASE_URL=http://localhost:8000
NEXT_PUBLIC_SUPABASE_ANON_KEY=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyAgCiAgICAicm9sZSI6ICJhbm9uIiwKICAgICJpc3MiOiAic3VwYWJhc2UtZGVtbyIsCiAgICAiaWF0IjogMTY0MTc2OTIwMCwKICAgICJleHAiOiAxNzk5NTM1NjAwCn0.dc_X5iR_VP_qT0zsiyj_I_OZ2T9FtRU2BBNWN8Bu4GE

## OAuth Callback URL
## This should be {domain}/auth/callback
## Only used if you're using Supabase and OAuth
AUTH_CALLBACK_URL="${NEXT_PUBLIC_FRONTEND_BASE_URL}/auth/callback"
GA_MEASUREMENT_ID=G-FH2XK2W4GN

# When running locally, set NEXT_PUBLIC_BEHAVE_AS=CLOUD to use the a locally hosted marketplace (as is typical in development, and the cloud deployment), otherwise set it to LOCAL to have the marketplace open in a new tab
NEXT_PUBLIC_BEHAVE_AS=LOCAL
NEXT_PUBLIC_SHOW_BILLING_PAGE=false

## Cloudflare Turnstile (CAPTCHA) Configuration
## Get these from the Cloudflare Turnstile dashboard: https://dash.cloudflare.com/?to=/:account/turnstile
## This is the frontend site key
NEXT_PUBLIC_CLOUDFLARE_TURNSTILE_SITE_KEY=
NEXT_PUBLIC_TURNSTILE=disabled

# Devtools
NEXT_PUBLIC_REACT_QUERY_DEVTOOL=true

# In case you are running Playwright locally
# NEXT_PUBLIC_PW_TEST=true

autogpt_platform/frontend/.gitignore (vendored)
@@ -31,6 +31,7 @@ yarn.lock
package-lock.json

# local env files
.env
.env*.local

# vercel

@@ -17,7 +17,12 @@ CMD ["pnpm", "run", "dev", "--hostname", "0.0.0.0"]
FROM base AS build
COPY autogpt_platform/frontend/ .
ENV SKIP_STORYBOOK_TESTS=true
RUN pnpm build
RUN if [ -f .env ]; then \
    cat .env.default .env > .env.merged && mv .env.merged .env; \
    else \
    cp .env.default .env; \
    fi
RUN pnpm build --turbo
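The Dockerfile hunk above concatenates `.env.default` and the user's `.env` so that the user's values appear last; a parser that keeps the last occurrence of a key (as most dotenv-style loaders do) then sees the override win. A quick sketch of that merge step, with hypothetical key values:

```shell
# Defaults shipped with the repo vs. a user override (illustrative values).
printf 'NEXT_PUBLIC_BEHAVE_AS=LOCAL\n' > .env.default
printf 'NEXT_PUBLIC_BEHAVE_AS=CLOUD\n' > .env

# The same merge the Dockerfile performs when .env exists:
cat .env.default .env > .env.merged && mv .env.merged .env

cat .env
# NEXT_PUBLIC_BEHAVE_AS=LOCAL   <- default, first
# NEXT_PUBLIC_BEHAVE_AS=CLOUD   <- user override, last
```

Whether the last line actually wins depends on the consuming loader's duplicate-key rule, which is why the merge keeps both lines rather than de-duplicating.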

# Prod stage - based on NextJS reference Dockerfile https://github.com/vercel/next.js/blob/64271354533ed16da51be5dce85f0dbd15f17517/examples/with-docker/Dockerfile
FROM node:21-alpine AS prod

@@ -18,31 +18,58 @@ Make sure you have Node.js 16.10+ installed. Corepack is included with Node.js b
>
> Then follow the setup steps below.

### Setup
## Setup

1. **Enable corepack** (run this once on your system):
### 1. **Enable corepack** (run this once on your system):

```bash
corepack enable
```
```bash
corepack enable
```

This enables corepack to automatically manage pnpm based on the `packageManager` field in `package.json`.
This enables corepack to automatically manage pnpm based on the `packageManager` field in `package.json`.

2. **Install dependencies**:
### 2. **Install dependencies**:

```bash
pnpm i
```
```bash
pnpm i
```

3. **Start the development server**:
```bash
pnpm dev
```
### 3. **Start the development server**:

Open [http://localhost:3000](http://localhost:3000) with your browser to see the result.
#### Running the Front-end & Back-end separately

We recommend this approach if you are doing active development on the project. First spin up the Back-end:

```bash
# on `autogpt_platform`
docker compose --profile local up deps_backend -d
# on `autogpt_platform/backend`
poetry run app
```

Then start the Front-end:

```bash
# on `autogpt_platform/frontend`
pnpm dev
```

Open [http://localhost:3000](http://localhost:3000) with your browser to see the result. If the server starts on `http://localhost:3001` instead, the Front-end is already running via Docker; kill that container or run `docker compose down` first.

You can start editing the page by modifying `app/page.tsx`. The page auto-updates as you edit the file.

#### Running both the Front-end and Back-end via Docker

If you run:

```bash
# on `autogpt_platform`
docker compose up -d
```

It will spin up the Back-end and Front-end via Docker. The Front-end will start on port `3000`. This might not be
what you want when actively contributing to the Front-end as you won't have direct/easy access to the Next.js dev server.

### Subsequent Runs

For subsequent development sessions, you only need to run:

@@ -8,7 +8,7 @@
"start": "next start",
"start:standalone": "cd .next/standalone && node server.js",
"lint": "next lint && prettier --check .",
"format": "prettier --write .",
"format": "next lint --fix; prettier --write .",
"type-check": "tsc --noEmit",
"test": "next build --turbo && playwright test",
"test-ui": "next build --turbo && playwright test --ui",
@@ -27,35 +27,35 @@
],
"dependencies": {
"@faker-js/faker": "9.9.0",
"@hookform/resolvers": "5.2.0",
"@next/third-parties": "15.4.4",
"@hookform/resolvers": "5.2.1",
"@next/third-parties": "15.4.6",
"@phosphor-icons/react": "2.1.10",
"@radix-ui/react-alert-dialog": "1.1.14",
"@radix-ui/react-alert-dialog": "1.1.15",
"@radix-ui/react-avatar": "1.1.10",
"@radix-ui/react-checkbox": "1.3.2",
"@radix-ui/react-collapsible": "1.1.11",
"@radix-ui/react-context-menu": "2.2.15",
"@radix-ui/react-dialog": "1.1.14",
"@radix-ui/react-dropdown-menu": "2.1.15",
"@radix-ui/react-checkbox": "1.3.3",
"@radix-ui/react-collapsible": "1.1.12",
"@radix-ui/react-context-menu": "2.2.16",
"@radix-ui/react-dialog": "1.1.15",
"@radix-ui/react-dropdown-menu": "2.1.16",
"@radix-ui/react-icons": "1.3.2",
"@radix-ui/react-label": "2.1.7",
"@radix-ui/react-popover": "1.1.14",
"@radix-ui/react-radio-group": "1.3.7",
"@radix-ui/react-scroll-area": "1.2.9",
"@radix-ui/react-select": "2.2.5",
"@radix-ui/react-popover": "1.1.15",
"@radix-ui/react-radio-group": "1.3.8",
"@radix-ui/react-scroll-area": "1.2.10",
"@radix-ui/react-select": "2.2.6",
"@radix-ui/react-separator": "1.1.7",
"@radix-ui/react-slot": "1.2.3",
"@radix-ui/react-switch": "1.2.5",
"@radix-ui/react-tabs": "1.1.12",
"@radix-ui/react-toast": "1.2.14",
"@radix-ui/react-tooltip": "1.2.7",
"@radix-ui/react-switch": "1.2.6",
"@radix-ui/react-tabs": "1.1.13",
"@radix-ui/react-toast": "1.2.15",
"@radix-ui/react-tooltip": "1.2.8",
"@sentry/nextjs": "9.42.0",
"@supabase/ssr": "0.6.1",
"@supabase/supabase-js": "2.52.1",
"@tanstack/react-query": "5.83.0",
"@supabase/supabase-js": "2.55.0",
"@tanstack/react-query": "5.85.3",
"@tanstack/react-table": "8.21.3",
"@types/jaro-winkler": "0.2.4",
"@xyflow/react": "12.8.2",
"@xyflow/react": "12.8.3",
"boring-avatars": "1.11.2",
"class-variance-authority": "0.7.1",
"clsx": "2.1.1",
@@ -65,22 +65,22 @@
"dotenv": "17.2.1",
"elliptic": "6.6.1",
"embla-carousel-react": "8.6.0",
"framer-motion": "12.23.9",
"framer-motion": "12.23.12",
"geist": "1.4.2",
"jaro-winkler": "0.2.8",
"launchdarkly-react-client-sdk": "3.8.1",
"lodash": "4.17.21",
"lucide-react": "0.525.0",
"lucide-react": "0.539.0",
"moment": "2.30.1",
"next": "15.4.4",
"next": "15.4.6",
"next-themes": "0.4.6",
"nuqs": "2.4.3",
"party-js": "2.2.0",
"react": "18.3.1",
"react-day-picker": "9.8.0",
"react-day-picker": "9.8.1",
"react-dom": "18.3.1",
"react-drag-drop-files": "2.4.0",
"react-hook-form": "7.61.1",
"react-hook-form": "7.62.0",
"react-icons": "5.5.0",
"react-markdown": "9.0.3",
"react-modal": "3.16.3",
@@ -88,7 +88,7 @@
"react-window": "1.8.11",
"recharts": "2.15.3",
"shepherd.js": "14.5.1",
"sonner": "2.0.6",
"sonner": "2.0.7",
"tailwind-merge": "2.6.0",
"tailwindcss-animate": "1.0.7",
"uuid": "11.1.0",
@@ -96,42 +96,42 @@
"zod": "3.25.76"
},
"devDependencies": {
"@chromatic-com/storybook": "4.0.1",
"@playwright/test": "1.54.1",
"@storybook/addon-a11y": "9.0.17",
"@storybook/addon-docs": "9.0.17",
"@storybook/addon-links": "9.0.17",
"@storybook/addon-onboarding": "9.0.17",
"@storybook/nextjs": "9.0.17",
"@tanstack/eslint-plugin-query": "5.81.2",
"@tanstack/react-query-devtools": "5.83.0",
"@chromatic-com/storybook": "4.1.0",
"@playwright/test": "1.54.2",
"@storybook/addon-a11y": "9.1.2",
"@storybook/addon-docs": "9.1.2",
"@storybook/addon-links": "9.1.2",
"@storybook/addon-onboarding": "9.1.2",
"@storybook/nextjs": "9.1.2",
"@tanstack/eslint-plugin-query": "5.83.1",
"@tanstack/react-query-devtools": "5.84.2",
"@types/canvas-confetti": "1.9.0",
"@types/lodash": "4.17.20",
"@types/negotiator": "0.6.4",
"@types/node": "24.0.15",
"@types/node": "24.2.1",
"@types/react": "18.3.17",
"@types/react-dom": "18.3.5",
"@types/react-modal": "3.16.3",
"@types/react-window": "1.8.8",
"axe-playwright": "2.1.0",
"chromatic": "13.1.2",
"chromatic": "13.1.3",
"concurrently": "9.2.0",
"cross-env": "7.0.3",
"eslint": "8.57.1",
"eslint-config-next": "15.4.2",
"eslint-plugin-storybook": "9.0.17",
"eslint-config-next": "15.4.6",
"eslint-plugin-storybook": "9.1.2",
"import-in-the-middle": "1.14.2",
"msw": "2.10.4",
"msw-storybook-addon": "2.0.5",
"orval": "7.10.0",
"orval": "7.11.2",
"pbkdf2": "3.1.3",
"postcss": "8.5.6",
"prettier": "3.6.2",
"prettier-plugin-tailwindcss": "0.6.14",
"require-in-the-middle": "7.5.2",
"storybook": "9.0.17",
"storybook": "9.1.2",
"tailwindcss": "3.4.17",
"typescript": "5.8.3"
"typescript": "5.9.2"
},
"msw": {
"workerDirectory": [

@@ -45,7 +45,7 @@ export default defineConfig({
webServer: {
command: "pnpm start",
url: "http://localhost:3000",
reuseExistingServer: !process.env.CI,
reuseExistingServer: true,
},

/* Configure projects for major browsers */

autogpt_platform/frontend/pnpm-lock.yaml (generated, 1455 lines changed)
@@ -14,12 +14,7 @@ export async function addDollars(formData: FormData) {
comments: formData.get("comments") as string,
};
const api = new BackendApi();
const resp = await api.addUserCredits(
data.user_id,
data.amount,
data.comments,
);
console.log(resp);
await api.addUserCredits(data.user_id, data.amount, data.comments);
revalidatePath("/admin/spending");
}

@@ -29,6 +29,7 @@ function SpendingDashboard({
</div>

<Suspense
key={`${page}-${status}-${search}`}
fallback={
<div className="py-10 text-center">Loading submissions...</div>
}

@@ -66,6 +66,7 @@ export const AgentTableRow = ({
return (
<div
data-testid="agent-table-row"
data-agent-id={agent_id}
data-agent-name={agentName}
className="hidden items-center border-b border-neutral-300 px-4 py-4 hover:bg-neutral-50 dark:border-neutral-700 dark:hover:bg-neutral-800 md:flex"
>

@@ -1,67 +0,0 @@
"use server";

import { revalidatePath } from "next/cache";
import { getServerSupabase } from "@/lib/supabase/server/getServerSupabase";
import { NotificationPreferenceDTO } from "@/lib/autogpt-server-api/types";
import {
postV1UpdateNotificationPreferences,
postV1UpdateUserEmail,
} from "@/app/api/__generated__/endpoints/auth/auth";

export async function updateSettings(formData: FormData) {
const supabase = await getServerSupabase();
const {
data: { user },
} = await supabase.auth.getUser();

// Handle auth-related updates
const password = formData.get("password") as string;
const email = formData.get("email") as string;

if (password) {
const { error: passwordError } = await supabase.auth.updateUser({
password,
});

if (passwordError) {
throw new Error(`${passwordError.message}`);
}
}

if (email !== user?.email) {
const { error: emailError } = await supabase.auth.updateUser({
email,
});
await postV1UpdateUserEmail(email);

if (emailError) {
throw new Error(`${emailError.message}`);
}
}

try {
const preferences: NotificationPreferenceDTO = {
email: user?.email || "",
preferences: {
AGENT_RUN: formData.get("notifyOnAgentRun") === "true",
ZERO_BALANCE: formData.get("notifyOnZeroBalance") === "true",
LOW_BALANCE: formData.get("notifyOnLowBalance") === "true",
BLOCK_EXECUTION_FAILED:
formData.get("notifyOnBlockExecutionFailed") === "true",
CONTINUOUS_AGENT_ERROR:
formData.get("notifyOnContinuousAgentError") === "true",
DAILY_SUMMARY: formData.get("notifyOnDailySummary") === "true",
WEEKLY_SUMMARY: formData.get("notifyOnWeeklySummary") === "true",
MONTHLY_SUMMARY: formData.get("notifyOnMonthlySummary") === "true",
},
daily_limit: 0,
};
await postV1UpdateNotificationPreferences(preferences);
} catch (error) {
console.error(error);
throw new Error(`Failed to update preferences: ${error}`);
}

revalidatePath("/profile/settings");
return { success: true };
}

@@ -1,316 +1,22 @@
"use client";

import { Button } from "@/components/ui/button";
import {
Form,
FormControl,
FormDescription,
FormField,
FormItem,
FormLabel,
FormMessage,
} from "@/components/ui/form";
import { Input } from "@/components/ui/input";
import { Switch } from "@/components/ui/switch";
import { Separator } from "@/components/ui/separator";
import { NotificationPreference } from "@/app/api/__generated__/models/notificationPreference";
import { User } from "@supabase/supabase-js";
import { useSettingsForm } from "./useSettingsForm";
import { EmailForm } from "./components/EmailForm/EmailForm";
import { NotificationForm } from "./components/NotificationForm/NotificationForm";

export const SettingsForm = ({
preferences,
user,
}: {
type SettingsFormProps = {
preferences: NotificationPreference;
user: User;
}) => {
const { form, onSubmit, onCancel } = useSettingsForm({
preferences,
user,
});

return (
<Form {...form}>
<form
onSubmit={form.handleSubmit(onSubmit)}
className="flex flex-col gap-8"
>
{/* Account Settings Section */}
<div className="flex flex-col gap-4">
<FormField
control={form.control}
name="email"
render={({ field }) => (
<FormItem>
<FormLabel>Email</FormLabel>
<FormControl>
<Input {...field} type="email" />
</FormControl>
<FormMessage />
</FormItem>
)}
/>

<FormField
control={form.control}
name="password"
render={({ field }) => (
<FormItem>
<FormLabel>New Password</FormLabel>
<FormControl>
<Input
{...field}
type="password"
placeholder="************"
/>
</FormControl>
<FormMessage />
</FormItem>
)}
/>

<FormField
control={form.control}
name="confirmPassword"
render={({ field }) => (
<FormItem>
<FormLabel>Confirm New Password</FormLabel>
<FormControl>
<Input
{...field}
type="password"
placeholder="************"
/>
</FormControl>
<FormMessage />
</FormItem>
)}
/>
</div>

<Separator />

{/* Notifications Section */}
<div className="flex flex-col gap-6">
<h3 className="text-lg font-medium">Notifications</h3>

{/* Agent Notifications */}
<div className="flex flex-col gap-4">
<h4 className="text-sm font-medium text-muted-foreground">
Agent Notifications
</h4>
<FormField
control={form.control}
name="notifyOnAgentRun"
render={({ field }) => (
<FormItem className="flex flex-row items-center justify-between p-4">
<div className="space-y-0.5">
<FormLabel className="text-base">
Agent Run Notifications
</FormLabel>
<FormDescription>
Receive notifications when an agent starts or completes a
run
</FormDescription>
</div>
<FormControl>
<Switch
checked={field.value}
onCheckedChange={field.onChange}
/>
</FormControl>
</FormItem>
)}
/>

<FormField
control={form.control}
name="notifyOnBlockExecutionFailed"
render={({ field }) => (
<FormItem className="flex flex-row items-center justify-between p-4">
<div className="space-y-0.5">
<FormLabel className="text-base">
Block Execution Failures
</FormLabel>
<FormDescription>
Get notified when a block execution fails during agent
runs
</FormDescription>
</div>
<FormControl>
<Switch
checked={field.value}
onCheckedChange={field.onChange}
/>
</FormControl>
</FormItem>
)}
/>

<FormField
control={form.control}
name="notifyOnContinuousAgentError"
render={({ field }) => (
<FormItem className="flex flex-row items-center justify-between p-4">
<div className="space-y-0.5">
<FormLabel className="text-base">
Continuous Agent Errors
</FormLabel>
<FormDescription>
Receive alerts when an agent encounters repeated errors
</FormDescription>
</div>
<FormControl>
<Switch
checked={field.value}
onCheckedChange={field.onChange}
/>
</FormControl>
</FormItem>
)}
/>
</div>

{/* Balance Notifications */}
<div className="flex flex-col gap-4">
<h4 className="text-sm font-medium text-muted-foreground">
Balance Notifications
</h4>
<FormField
control={form.control}
name="notifyOnZeroBalance"
render={({ field }) => (
<FormItem className="flex flex-row items-center justify-between p-4">
<div className="space-y-0.5">
<FormLabel className="text-base">
Zero Balance Alert
</FormLabel>
<FormDescription>
Get notified when your account balance reaches zero
</FormDescription>
</div>
<FormControl>
<Switch
checked={field.value}
onCheckedChange={field.onChange}
/>
</FormControl>
</FormItem>
)}
/>

<FormField
control={form.control}
name="notifyOnLowBalance"
render={({ field }) => (
<FormItem className="flex flex-row items-center justify-between p-4">
<div className="space-y-0.5">
<FormLabel className="text-base">
Low Balance Warning
</FormLabel>
<FormDescription>
Receive warnings when your balance is running low
</FormDescription>
</div>
<FormControl>
<Switch
checked={field.value}
onCheckedChange={field.onChange}
/>
</FormControl>
</FormItem>
)}
/>
</div>

{/* Summary Reports */}
<div className="flex flex-col gap-4">
<h4 className="text-sm font-medium text-muted-foreground">
Summary Reports
</h4>
<FormField
control={form.control}
name="notifyOnDailySummary"
render={({ field }) => (
<FormItem className="flex flex-row items-center justify-between p-4">
<div className="space-y-0.5">
<FormLabel className="text-base">Daily Summary</FormLabel>
<FormDescription>
Receive a daily summary of your account activity
</FormDescription>
</div>
<FormControl>
<Switch
checked={field.value}
onCheckedChange={field.onChange}
/>
</FormControl>
</FormItem>
)}
/>

<FormField
control={form.control}
name="notifyOnWeeklySummary"
render={({ field }) => (
<FormItem className="flex flex-row items-center justify-between p-4">
<div className="space-y-0.5">
<FormLabel className="text-base">Weekly Summary</FormLabel>
<FormDescription>
Get a weekly overview of your account performance
</FormDescription>
</div>
<FormControl>
<Switch
checked={field.value}
onCheckedChange={field.onChange}
/>
</FormControl>
</FormItem>
)}
/>

<FormField
control={form.control}
name="notifyOnMonthlySummary"
render={({ field }) => (
<FormItem className="flex flex-row items-center justify-between p-4">
<div className="space-y-0.5">
<FormLabel className="text-base">Monthly Summary</FormLabel>
<FormDescription>
Receive a comprehensive monthly report of your account
</FormDescription>
</div>
<FormControl>
<Switch
checked={field.value}
onCheckedChange={field.onChange}
/>
</FormControl>
</FormItem>
)}
/>
</div>
</div>

{/* Form Actions */}
<div className="flex justify-end gap-4">
<Button
variant="outline"
type="button"
onClick={onCancel}
disabled={form.formState.isSubmitting}
>
Cancel
</Button>
<Button
type="submit"
disabled={form.formState.isSubmitting || !form.formState.isDirty}
>
{form.formState.isSubmitting ? "Saving..." : "Save changes"}
</Button>
</div>
</form>
</Form>
);
};

export function SettingsForm({ preferences, user }: SettingsFormProps) {
return (
<div className="flex flex-col gap-8">
<EmailForm user={user} />
<Separator />
<NotificationForm preferences={preferences} user={user} />
</div>
);
}

@@ -0,0 +1,72 @@
"use client";

import { Form, FormControl, FormField, FormItem } from "@/components/ui/form";
import { Input } from "@/components/atoms/Input/Input";
import { Text } from "@/components/atoms/Text/Text";
import { Button } from "@/components/atoms/Button/Button";
import { User } from "@supabase/supabase-js";
import { useEmailForm } from "./useEmailForm";

type EmailFormProps = {
user: User;
};

export function EmailForm({ user }: EmailFormProps) {
const { form, onSubmit, isLoading, currentEmail } = useEmailForm({ user });

const hasError = Object.keys(form.formState.errors).length > 0;
const isSameEmail = form.watch("email") === currentEmail;

return (
<div>
<Text variant="h3" size="large-semibold">
Security & Access
</Text>
<Form {...form}>
<form
onSubmit={form.handleSubmit(onSubmit)}
className="mt-6 flex flex-col gap-4"
>
<FormField
control={form.control}
name="email"
render={({ field, fieldState }) => (
<FormItem>
<FormControl>
<Input
id={field.name}
label="Email"
placeholder="m@example.com"
type="text"
autoComplete="off"
className="w-full"
error={fieldState.error?.message}
{...field}
/>
</FormControl>
</FormItem>
)}
/>
<div className="flex items-center gap-4">
<Button
variant="outline"
as="NextLink"
href="/reset-password"
className="min-w-[10rem]"
>
Reset password
</Button>
<Button
type="submit"
disabled={hasError || isSameEmail}
loading={isLoading}
className="min-w-[10rem]"
>
{isLoading ? "Saving..." : "Update email"}
</Button>
</div>
</form>
</Form>
</div>
);
}
@@ -0,0 +1,10 @@
|
||||
import { getServerSupabase } from "@/lib/supabase/server/getServerSupabase";
|
||||
|
||||
export async function updateSupabaseUserEmail(email: string) {
|
||||
const supabase = await getServerSupabase();
|
||||
const { data, error } = await supabase.auth.updateUser({
|
||||
email,
|
||||
});
|
||||
|
||||
return { data, error };
|
||||
}
|
||||
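Note that `updateSupabaseUserEmail` returns a `{ data, error }` pair rather than throwing, following the Supabase client convention. A hypothetical, dependency-free sketch of how callers consume that shape (the `fakeUpdateUser` stand-in below is illustrative, not the real client):

```typescript
// Result shape mirroring what supabase.auth.updateUser resolves to.
type Result<T> = { data: T | null; error: { message: string } | null };

// Stand-in for the Supabase call — validation rule is invented for illustration.
function fakeUpdateUser(email: string): Result<{ email: string }> {
  if (!email.includes("@")) {
    return { data: null, error: { message: "Invalid email" } };
  }
  return { data: { email }, error: null };
}

// Callers branch on `error` instead of wrapping the call in try/catch.
function describeResult<T>(result: Result<T>): string {
  return result.error ? `failed: ${result.error.message}` : "ok";
}
```

This keeps control flow linear in route handlers, which can translate `error` directly into an HTTP status instead of catching exceptions.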
@@ -0,0 +1,92 @@
"use client";

import { useForm } from "react-hook-form";
import { z } from "zod";
import { zodResolver } from "@hookform/resolvers/zod";
import { useToast } from "@/components/molecules/Toast/use-toast";
import { User } from "@supabase/supabase-js";
import { usePostV1UpdateUserEmail } from "@/app/api/__generated__/endpoints/auth/auth";

const emailFormSchema = z.object({
  email: z
    .string()
    .min(1, "Email is required")
    .email("Please enter a valid email address"),
});

function createEmailDefaultValues(user: { email?: string }) {
  return {
    email: user.email || "",
  };
}

async function updateUserEmailAPI(email: string) {
  const response = await fetch("/api/auth/user", {
    method: "PUT",
    headers: {
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ email }),
  });

  if (!response.ok) {
    const error = await response.json();
    throw new Error(error.error || "Failed to update email");
  }

  return response.json();
}

export function useEmailForm({ user }: { user: User }) {
  const { toast } = useToast();
  const defaultValues = createEmailDefaultValues(user);
  const currentEmail = user.email;

  const form = useForm<z.infer<typeof emailFormSchema>>({
    resolver: zodResolver(emailFormSchema),
    defaultValues,
    mode: "onSubmit",
  });

  const updateEmailMutation = usePostV1UpdateUserEmail({
    mutation: {
      onError: (error) => {
        toast({
          title: "Error updating email",
          description:
            error instanceof Error ? error.message : "Failed to update email",
          variant: "destructive",
        });
      },
    },
  });

  async function onSubmit(values: z.infer<typeof emailFormSchema>) {
    try {
      if (values.email !== user.email) {
        await Promise.all([
          updateUserEmailAPI(values.email),
          updateEmailMutation.mutateAsync({ data: values.email }),
        ]);

        toast({
          title: "Successfully updated email",
        });
      }
    } catch (error) {
      toast({
        title: "Error updating email",
        description:
          error instanceof Error ? error.message : "Something went wrong",
        variant: "destructive",
      });
    }
  }

  return {
    form,
    onSubmit,
    isLoading: updateEmailMutation.isPending,
    currentEmail,
  };
}
@@ -0,0 +1,260 @@
"use client";

import { Form, FormControl, FormField, FormItem } from "@/components/ui/form";
import { Switch } from "@/components/ui/switch";
import { Text } from "@/components/atoms/Text/Text";
import { Button } from "@/components/atoms/Button/Button";
import { NotificationPreference } from "@/app/api/__generated__/models/notificationPreference";
import { User } from "@supabase/supabase-js";
import { useNotificationForm } from "./useNotificationForm";

type NotificationFormProps = {
  preferences: NotificationPreference;
  user: User;
};

export function NotificationForm({ preferences, user }: NotificationFormProps) {
  const { form, onSubmit, onCancel, isLoading } = useNotificationForm({
    preferences,
    user,
  });

  return (
    <div>
      <Text variant="h3" size="large-semibold">
        Notifications
      </Text>
      <Form {...form}>
        <form
          onSubmit={form.handleSubmit(onSubmit)}
          className="mt-6 flex flex-col gap-10"
        >
          {/* Agent Notifications */}
          <div className="flex flex-col gap-6">
            <Text variant="h4" size="body-medium" className="text-slate-400">
              Agent Notifications
            </Text>
            <FormField
              control={form.control}
              name="notifyOnAgentRun"
              render={({ field }) => (
                <FormItem className="flex flex-row items-center justify-between">
                  <div className="space-y-0.5">
                    <Text variant="h4" size="body-medium">
                      Agent Run Notifications
                    </Text>
                    <Text variant="body">
                      Receive notifications when an agent starts or completes a
                      run
                    </Text>
                  </div>
                  <FormControl>
                    <Switch
                      checked={field.value}
                      onCheckedChange={field.onChange}
                    />
                  </FormControl>
                </FormItem>
              )}
            />

            <FormField
              control={form.control}
              name="notifyOnBlockExecutionFailed"
              render={({ field }) => (
                <FormItem className="flex flex-row items-center justify-between">
                  <div className="space-y-0.5">
                    <Text variant="h4" size="body-medium">
                      Block Execution Failures
                    </Text>
                    <Text variant="body">
                      Get notified when a block execution fails during agent
                      runs
                    </Text>
                  </div>
                  <FormControl>
                    <Switch
                      checked={field.value}
                      onCheckedChange={field.onChange}
                    />
                  </FormControl>
                </FormItem>
              )}
            />

            <FormField
              control={form.control}
              name="notifyOnContinuousAgentError"
              render={({ field }) => (
                <FormItem className="flex flex-row items-center justify-between">
                  <div className="space-y-0.5">
                    <Text variant="h4" size="body-medium">
                      Continuous Agent Errors
                    </Text>
                    <Text variant="body">
                      Receive alerts when an agent encounters repeated errors
                    </Text>
                  </div>
                  <FormControl>
                    <Switch
                      checked={field.value}
                      onCheckedChange={field.onChange}
                    />
                  </FormControl>
                </FormItem>
              )}
            />
          </div>

          {/* Balance Notifications */}
          <div className="flex flex-col gap-4">
            <Text variant="h4" size="body-medium" className="text-slate-400">
              Balance Notifications
            </Text>
            <FormField
              control={form.control}
              name="notifyOnZeroBalance"
              render={({ field }) => (
                <FormItem className="flex flex-row items-center justify-between">
                  <div className="space-y-0.5">
                    <Text variant="h4" size="body-medium">
                      Zero Balance Alert
                    </Text>
                    <Text variant="body">
                      Get notified when your account balance reaches zero
                    </Text>
                  </div>
                  <FormControl>
                    <Switch
                      checked={field.value}
                      onCheckedChange={field.onChange}
                    />
                  </FormControl>
                </FormItem>
              )}
            />

            <FormField
              control={form.control}
              name="notifyOnLowBalance"
              render={({ field }) => (
                <FormItem className="flex flex-row items-center justify-between">
                  <div className="space-y-0.5">
                    <Text variant="h4" size="body-medium">
                      Low Balance Warning
                    </Text>
                    <Text variant="body">
                      Receive warnings when your balance is running low
                    </Text>
                  </div>
                  <FormControl>
                    <Switch
                      checked={field.value}
                      onCheckedChange={field.onChange}
                    />
                  </FormControl>
                </FormItem>
              )}
            />
          </div>

          {/* Summary Reports */}
          <div className="flex flex-col gap-4">
            <Text variant="h4" size="body-medium" className="text-slate-400">
              Summary reports
            </Text>
            <FormField
              control={form.control}
              name="notifyOnDailySummary"
              render={({ field }) => (
                <FormItem className="flex flex-row items-center justify-between">
                  <div className="space-y-1">
                    <Text variant="h4" size="body-medium">
                      Daily Summary
                    </Text>
                    <Text variant="body">
                      Receive a daily summary of your account activity
                    </Text>
                  </div>
                  <FormControl>
                    <Switch
                      checked={field.value}
                      onCheckedChange={field.onChange}
                    />
                  </FormControl>
                </FormItem>
              )}
            />

            <FormField
              control={form.control}
              name="notifyOnWeeklySummary"
              render={({ field }) => (
                <FormItem className="flex flex-row items-center justify-between">
                  <div className="space-y-0.5">
                    <Text variant="h4" size="body-medium">
                      Weekly Summary
                    </Text>
                    <Text variant="body">
                      Get a weekly overview of your account performance
                    </Text>
                  </div>
                  <FormControl>
                    <Switch
                      checked={field.value}
                      onCheckedChange={field.onChange}
                    />
                  </FormControl>
                </FormItem>
              )}
            />

            <FormField
              control={form.control}
              name="notifyOnMonthlySummary"
              render={({ field }) => (
                <FormItem className="flex flex-row items-center justify-between">
                  <div className="space-y-0.5">
                    <Text variant="h4" size="body-medium">
                      Monthly Summary
                    </Text>
                    <Text variant="body">
                      Receive a comprehensive monthly report of your account
                    </Text>
                  </div>
                  <FormControl>
                    <Switch
                      checked={field.value}
                      onCheckedChange={field.onChange}
                    />
                  </FormControl>
                </FormItem>
              )}
            />
          </div>

          {/* Form Actions */}
          <div className="flex justify-end gap-4 pt-8">
            <Button
              variant="outline"
              type="button"
              onClick={onCancel}
              disabled={isLoading}
              className="min-w-[10rem]"
            >
              Cancel
            </Button>
            <Button
              type="submit"
              disabled={isLoading || !form.formState.isDirty}
              className="min-w-[10rem]"
              loading={isLoading}
            >
              {isLoading ? "Saving..." : "Save preferences"}
            </Button>
          </div>
        </form>
      </Form>
    </div>
  );
}
@@ -0,0 +1,114 @@
"use client";

import { useForm } from "react-hook-form";
import { z } from "zod";
import { zodResolver } from "@hookform/resolvers/zod";
import { useToast } from "@/components/molecules/Toast/use-toast";
import { NotificationPreference } from "@/app/api/__generated__/models/notificationPreference";
import { User } from "@supabase/supabase-js";
import { usePostV1UpdateNotificationPreferences } from "@/app/api/__generated__/endpoints/auth/auth";
import { NotificationPreferenceDTO } from "@/lib/autogpt-server-api/types";

const notificationFormSchema = z.object({
  notifyOnAgentRun: z.boolean(),
  notifyOnZeroBalance: z.boolean(),
  notifyOnLowBalance: z.boolean(),
  notifyOnBlockExecutionFailed: z.boolean(),
  notifyOnContinuousAgentError: z.boolean(),
  notifyOnDailySummary: z.boolean(),
  notifyOnWeeklySummary: z.boolean(),
  notifyOnMonthlySummary: z.boolean(),
});

function createNotificationDefaultValues(preferences: {
  preferences?: Record<string, boolean>;
}) {
  return {
    notifyOnAgentRun: preferences.preferences?.AGENT_RUN,
    notifyOnZeroBalance: preferences.preferences?.ZERO_BALANCE,
    notifyOnLowBalance: preferences.preferences?.LOW_BALANCE,
    notifyOnBlockExecutionFailed:
      preferences.preferences?.BLOCK_EXECUTION_FAILED,
    notifyOnContinuousAgentError:
      preferences.preferences?.CONTINUOUS_AGENT_ERROR,
    notifyOnDailySummary: preferences.preferences?.DAILY_SUMMARY,
    notifyOnWeeklySummary: preferences.preferences?.WEEKLY_SUMMARY,
    notifyOnMonthlySummary: preferences.preferences?.MONTHLY_SUMMARY,
  };
}

export function useNotificationForm({
  preferences,
  user,
}: {
  preferences: NotificationPreference;
  user: User;
}) {
  const { toast } = useToast();
  const defaultValues = createNotificationDefaultValues(preferences);

  const form = useForm<z.infer<typeof notificationFormSchema>>({
    resolver: zodResolver(notificationFormSchema),
    defaultValues,
  });

  const updateNotificationsMutation = usePostV1UpdateNotificationPreferences({
    mutation: {
      onError: (error) => {
        toast({
          title: "Error updating notifications",
          description:
            error instanceof Error
              ? error.message
              : "Failed to update notification preferences",
          variant: "destructive",
        });
      },
    },
  });

  async function onSubmit(values: z.infer<typeof notificationFormSchema>) {
    try {
      const notificationPreferences: NotificationPreferenceDTO = {
        email: user.email || "",
        preferences: {
          AGENT_RUN: values.notifyOnAgentRun,
          ZERO_BALANCE: values.notifyOnZeroBalance,
          LOW_BALANCE: values.notifyOnLowBalance,
          BLOCK_EXECUTION_FAILED: values.notifyOnBlockExecutionFailed,
          CONTINUOUS_AGENT_ERROR: values.notifyOnContinuousAgentError,
          DAILY_SUMMARY: values.notifyOnDailySummary,
          WEEKLY_SUMMARY: values.notifyOnWeeklySummary,
          MONTHLY_SUMMARY: values.notifyOnMonthlySummary,
        },
        daily_limit: 0,
      };

      await updateNotificationsMutation.mutateAsync({
        data: notificationPreferences,
      });

      toast({
        title: "Successfully updated notification preferences",
      });
    } catch (error) {
      toast({
        title: "Error updating notifications",
        description:
          error instanceof Error ? error.message : "Something went wrong",
        variant: "destructive",
      });
    }
  }

  function onCancel() {
    form.reset(defaultValues);
  }

  return {
    form,
    onSubmit,
    onCancel,
    isLoading: updateNotificationsMutation.isPending,
  };
}
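The hook above translates camelCase form fields into the SCREAMING_SNAKE keys the backend DTO expects, and the inverse mapping fills the form's default values. A hypothetical reduced sketch of that round trip (two fields only; the real hook maps eight):

```typescript
// Illustrative subset of the form's shape — not the full schema.
type FormValues = {
  notifyOnAgentRun: boolean;
  notifyOnZeroBalance: boolean;
};

// Form values → backend preference flags (as in onSubmit).
function toPreferenceFlags(values: FormValues): Record<string, boolean> {
  return {
    AGENT_RUN: values.notifyOnAgentRun,
    ZERO_BALANCE: values.notifyOnZeroBalance,
  };
}

// Backend flags → form defaults (as in createNotificationDefaultValues).
function toFormDefaults(flags: Record<string, boolean | undefined>): FormValues {
  return {
    notifyOnAgentRun: flags.AGENT_RUN ?? false,
    notifyOnZeroBalance: flags.ZERO_BALANCE ?? false,
  };
}
```

Keeping both directions in one module makes it easy to see that every form field has a matching backend key.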
@@ -1,51 +0,0 @@
import { z } from "zod";

export const formSchema = z
  .object({
    email: z.string().email(),
    password: z
      .string()
      .optional()
      .refine((val) => {
        if (val) return val.length >= 12;
        return true;
      }, "String must contain at least 12 character(s)"),
    confirmPassword: z.string().optional(),
    notifyOnAgentRun: z.boolean(),
    notifyOnZeroBalance: z.boolean(),
    notifyOnLowBalance: z.boolean(),
    notifyOnBlockExecutionFailed: z.boolean(),
    notifyOnContinuousAgentError: z.boolean(),
    notifyOnDailySummary: z.boolean(),
    notifyOnWeeklySummary: z.boolean(),
    notifyOnMonthlySummary: z.boolean(),
  })
  .refine((data) => {
    if (data.password || data.confirmPassword) {
      return data.password === data.confirmPassword;
    }
    return true;
  });

export const createDefaultValues = (
  user: { email?: string },
  preferences: { preferences?: Record<string, boolean> },
) => {
  const defaultValues = {
    email: user.email || "",
    password: "",
    confirmPassword: "",
    notifyOnAgentRun: preferences.preferences?.AGENT_RUN,
    notifyOnZeroBalance: preferences.preferences?.ZERO_BALANCE,
    notifyOnLowBalance: preferences.preferences?.LOW_BALANCE,
    notifyOnBlockExecutionFailed:
      preferences.preferences?.BLOCK_EXECUTION_FAILED,
    notifyOnContinuousAgentError:
      preferences.preferences?.CONTINUOUS_AGENT_ERROR,
    notifyOnDailySummary: preferences.preferences?.DAILY_SUMMARY,
    notifyOnWeeklySummary: preferences.preferences?.WEEKLY_SUMMARY,
    notifyOnMonthlySummary: preferences.preferences?.MONTHLY_SUMMARY,
  };

  return defaultValues;
};

@@ -1,57 +0,0 @@
"use client";
import { useForm } from "react-hook-form";
import { createDefaultValues, formSchema } from "./helper";
import { z } from "zod";
import { zodResolver } from "@hookform/resolvers/zod";
import { updateSettings } from "../../actions";
import { useToast } from "@/components/molecules/Toast/use-toast";
import { NotificationPreference } from "@/app/api/__generated__/models/notificationPreference";
import { User } from "@supabase/supabase-js";

export const useSettingsForm = ({
  preferences,
  user,
}: {
  preferences: NotificationPreference;
  user: User;
}) => {
  const { toast } = useToast();
  const defaultValues = createDefaultValues(user, preferences);

  const form = useForm<z.infer<typeof formSchema>>({
    resolver: zodResolver(formSchema),
    defaultValues,
  });

  async function onSubmit(values: z.infer<typeof formSchema>) {
    try {
      const formData = new FormData();

      Object.entries(values).forEach(([key, value]) => {
        if (key !== "confirmPassword") {
          formData.append(key, value.toString());
        }
      });

      await updateSettings(formData);

      toast({
        title: "Successfully updated settings",
      });
    } catch (error) {
      toast({
        title: "Error",
        description:
          error instanceof Error ? error.message : "Something went wrong",
        variant: "destructive",
      });
      throw error;
    }
  }

  function onCancel() {
    form.reset(defaultValues);
  }

  return { form, onSubmit, onCancel };
};
@@ -5,6 +5,7 @@ import { useSupabase } from "@/lib/supabase/hooks/useSupabase";
import * as React from "react";
import SettingsLoading from "./loading";
import { redirect } from "next/navigation";
import { Text } from "@/components/atoms/Text/Text";

export default function SettingsPage() {
  const {
@@ -35,11 +36,11 @@ export default function SettingsPage() {

  return (
    <div className="container max-w-2xl space-y-6 py-10">
      <div>
        <h3 className="text-lg font-medium">My account</h3>
        <p className="text-sm text-muted-foreground">
      <div className="flex flex-col gap-2">
        <Text variant="h3">My account</Text>
        <Text variant="large">
          Manage your account settings and preferences.
        </p>
        </Text>
      </div>
      <SettingsForm preferences={preferences} user={user} />
    </div>

@@ -79,7 +79,6 @@ export default function SignupPage() {
              control={form.control}
              name="password"
              render={({ field }) => {
                console.log(field);
                return (
                  <Input
                    id={field.name}

autogpt_platform/frontend/src/app/api/auth/user/route.ts (new file, 39 lines)
@@ -0,0 +1,39 @@
import { getServerSupabase } from "@/lib/supabase/server/getServerSupabase";
import { NextResponse } from "next/server";

export async function GET() {
  const supabase = await getServerSupabase();
  const { data, error } = await supabase.auth.getUser();

  if (error) {
    return NextResponse.json({ error: error.message }, { status: 400 });
  }

  return NextResponse.json(data);
}

export async function PUT(request: Request) {
  try {
    const supabase = await getServerSupabase();
    const { email } = await request.json();

    if (!email) {
      return NextResponse.json({ error: "Email is required" }, { status: 400 });
    }

    const { data, error } = await supabase.auth.updateUser({
      email,
    });

    if (error) {
      return NextResponse.json({ error: error.message }, { status: 400 });
    }

    return NextResponse.json(data);
  } catch {
    return NextResponse.json(
      { error: "Failed to update user email" },
      { status: 500 },
    );
  }
}
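The PUT handler above maps failures to three outcomes: 400 when the body has no email, 400 when Supabase rejects the update, and 500 for anything unexpected. A hypothetical pure function capturing just that status-code decision (names are illustrative, not part of the route):

```typescript
// Sketch of the PUT handler's branching, with the I/O stripped away.
// `supabaseError` stands in for the `error` returned by auth.updateUser.
function resolveStatus(
  body: { email?: string },
  supabaseError?: string,
  unexpectedFailure = false,
): number {
  if (unexpectedFailure) return 500; // thrown before/around the update
  if (!body.email) return 400; // "Email is required"
  if (supabaseError) return 400; // Supabase rejected the update
  return 200;
}
```

Separating the decision from the handler makes the status matrix easy to unit-test without a running Supabase instance.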
@@ -3,6 +3,7 @@ import {
  getServerAuthToken,
} from "@/lib/autogpt-server-api/helpers";
import { isServerSide } from "@/lib/utils/is-server-side";
import { getAgptServerBaseUrl } from "@/lib/env-config";

const FRONTEND_BASE_URL =
  process.env.NEXT_PUBLIC_FRONTEND_BASE_URL || "http://localhost:3000";
@@ -12,9 +13,7 @@ const getBaseUrl = (): string => {
  if (!isServerSide()) {
    return API_PROXY_BASE_URL;
  } else {
    return (
      process.env.NEXT_PUBLIC_AGPT_SERVER_BASE_URL || "http://localhost:8006"
    );
    return getAgptServerBaseUrl();
  }
};

@@ -93,7 +92,7 @@ export const customMutator = async <T = any>(
  // 4. If the request succeeds on the server side, the data will be cached, and the client will use it instead of sending a request to the proxy.

  if (!response.ok && isServerSide()) {
    console.error("Request failed on server side", response);
    console.error("Request failed on server side", response, fullUrl);
    throw new Error(`Request failed with status ${response.status}`);
  }

@@ -3,19 +3,12 @@ import {
  makeAuthenticatedFileUpload,
  makeAuthenticatedRequest,
} from "@/lib/autogpt-server-api/helpers";
import { getAgptServerBaseUrl } from "@/lib/env-config";
import { NextRequest, NextResponse } from "next/server";

function getBackendBaseUrl() {
  if (process.env.NEXT_PUBLIC_AGPT_SERVER_URL) {
    return process.env.NEXT_PUBLIC_AGPT_SERVER_URL.replace("/api", "");
  }

  return "http://localhost:8006";
}

function buildBackendUrl(path: string[], queryString: string): string {
  const backendPath = path.join("/");
  return `${getBackendBaseUrl()}/${backendPath}${queryString}`;
  return `${getAgptServerBaseUrl()}/${backendPath}${queryString}`;
}

async function handleJsonRequest(

@@ -28,7 +28,7 @@ export default async function RootLayout({
    >
      <head>
        <GoogleAnalytics
          gaId={process.env.GA_MEASUREMENT_ID || "G-FH2XK2W4GN"} // This is the measurement Id for the Google Analytics dev project
          gaId={process.env.NEXT_PUBLIC_GA_MEASUREMENT_ID || "G-FH2XK2W4GN"} // This is the measurement Id for the Google Analytics dev project
        />
      </head>
      <body>

@@ -15,6 +15,7 @@ import { Textarea } from "@/components/ui/textarea";
import { Input } from "@/components/ui/input";
import { useRouter } from "next/navigation";
import { addDollars } from "@/app/(platform)/admin/spending/actions";
import { useToast } from "@/components/molecules/Toast/use-toast";

export function AdminAddMoneyButton({
  userId,
@@ -30,18 +31,32 @@ export function AdminAddMoneyButton({
  defaultComments?: string;
}) {
  const router = useRouter();
  const { toast } = useToast();
  const [isAddMoneyDialogOpen, setIsAddMoneyDialogOpen] = useState(false);
  const [isSubmitting, setIsSubmitting] = useState(false);
  const [dollarAmount, setDollarAmount] = useState(
    defaultAmount ? Math.abs(defaultAmount / 100).toFixed(2) : "1.00",
  );

  const handleApproveSubmit = async (formData: FormData) => {
    setIsAddMoneyDialogOpen(false);
    setIsSubmitting(true);
    try {
      await addDollars(formData);
      setIsAddMoneyDialogOpen(false);
      toast({
        title: "Success",
        description: `Added $${dollarAmount} to ${userEmail}'s balance`,
      });
      router.refresh(); // Refresh the current route
    } catch (error) {
      console.error("Error adding dollars:", error);
      toast({
        title: "Error",
        description: "Failed to add dollars. Please try again.",
        variant: "destructive",
      });
    } finally {
      setIsSubmitting(false);
    }
  };

@@ -122,10 +137,13 @@ export function AdminAddMoneyButton({
              type="button"
              variant="outline"
              onClick={() => setIsAddMoneyDialogOpen(false)}
              disabled={isSubmitting}
            >
              Cancel
            </Button>
            <Button type="submit">Add Dollars</Button>
            <Button type="submit" disabled={isSubmitting}>
              {isSubmitting ? "Adding..." : "Add Dollars"}
            </Button>
          </DialogFooter>
        </form>
      </DialogContent>

@@ -64,6 +64,7 @@ export const providerIcons: Partial<
  open_router: fallbackIcon,
  llama_api: fallbackIcon,
  pinecone: fallbackIcon,
  enrichlayer: fallbackIcon,
  slant3d: fallbackIcon,
  screenshotone: fallbackIcon,
  smtp: fallbackIcon,

@@ -3,6 +3,12 @@ import { getServerSupabase } from "@/lib/supabase/server/getServerSupabase";
import { createBrowserClient } from "@supabase/ssr";
import type { SupabaseClient } from "@supabase/supabase-js";
import { Key, storage } from "@/services/storage/local-storage";
import {
  getAgptServerApiUrl,
  getAgptWsServerUrl,
  getSupabaseUrl,
  getSupabaseAnonKey,
} from "@/lib/env-config";
import * as Sentry from "@sentry/nextjs";
import type {
  AddUserCreditsResponse,
@@ -86,10 +92,8 @@ export default class BackendAPI {
  heartbeatTimeoutID: number | null = null;

  constructor(
    baseUrl: string = process.env.NEXT_PUBLIC_AGPT_SERVER_URL ||
      "http://localhost:8006/api",
    wsUrl: string = process.env.NEXT_PUBLIC_AGPT_WS_SERVER_URL ||
      "ws://localhost:8001/ws",
    baseUrl: string = getAgptServerApiUrl(),
    wsUrl: string = getAgptWsServerUrl(),
  ) {
    this.baseUrl = baseUrl;
    this.wsUrl = wsUrl;
@@ -97,11 +101,9 @@ export default class BackendAPI {

  private async getSupabaseClient(): Promise<SupabaseClient | null> {
    return isClient
      ? createBrowserClient(
          process.env.NEXT_PUBLIC_SUPABASE_URL!,
          process.env.NEXT_PUBLIC_SUPABASE_ANON_KEY!,
          { isSingleton: true },
        )
      ? createBrowserClient(getSupabaseUrl(), getSupabaseAnonKey(), {
          isSingleton: true,
        })
      : await getServerSupabase();
  }

@@ -635,6 +637,7 @@ export default class BackendAPI {
    search?: string;
    page?: number;
    page_size?: number;
    transaction_filter?: string;
  }): Promise<UsersBalanceHistoryResponse> {
    return this._get("/credits/admin/users_history", params);
  }

@@ -1,5 +1,6 @@
import { getServerSupabase } from "@/lib/supabase/server/getServerSupabase";
import { Key, storage } from "@/services/storage/local-storage";
import { getAgptServerApiUrl } from "@/lib/env-config";
import { isServerSide } from "../utils/is-server-side";

import { GraphValidationErrorResponse } from "./types";
@@ -56,9 +57,7 @@ export function buildClientUrl(path: string): string {
}

export function buildServerUrl(path: string): string {
  const baseUrl =
    process.env.NEXT_PUBLIC_AGPT_SERVER_URL || "http://localhost:8006/api";
  return `${baseUrl}${path}`;
  return `${getAgptServerApiUrl()}${path}`;
}

export function buildUrlWithQuery(
@@ -229,7 +228,13 @@ export async function makeAuthenticatedRequest(
  const payloadAsQuery = ["GET", "DELETE"].includes(method);
  const hasRequestBody = !payloadAsQuery && payload !== undefined;

  const response = await fetch(url, {
  // Add query parameters for GET/DELETE requests
  let requestUrl = url;
  if (payloadAsQuery && payload) {
    requestUrl = buildUrlWithQuery(url, payload);
  }

  const response = await fetch(requestUrl, {
    method,
    headers: createRequestHeaders(token, hasRequestBody, contentType),
    body: hasRequestBody
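The fix above routes GET/DELETE payloads into the query string rather than a request body, since those methods carry no body here. A hypothetical sketch of what `buildUrlWithQuery` is assumed to do (the real helper may handle arrays and encoding differently):

```typescript
// Fold a flat payload into the URL's query string; leave the URL untouched
// when the payload is empty.
function withQuery(url: string, payload: Record<string, unknown>): string {
  const query = new URLSearchParams(
    Object.entries(payload).map(([key, value]) => [key, String(value)]),
  ).toString();
  return query ? `${url}?${query}` : url;
}
```

Without this branch, a GET request's `payload` was silently dropped, since `hasRequestBody` is false for GET and DELETE.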
@@ -6,14 +6,12 @@ import {
|
||||
makeAuthenticatedFileUpload,
|
||||
makeAuthenticatedRequest,
|
||||
} from "./helpers";
|
||||
|
||||
const DEFAULT_BASE_URL = "http://localhost:8006/api";
|
||||
import { getAgptServerApiUrl } from "@/lib/env-config";
|
||||
|
||||
export interface ProxyRequestOptions {
|
||||
method: "GET" | "POST" | "PUT" | "PATCH" | "DELETE";
|
||||
path: string;
|
||||
payload?: Record<string, any>;
|
||||
baseUrl?: string;
|
||||
contentType?: string;
|
||||
}
|
||||
|
||||
@@ -21,13 +19,13 @@ export async function proxyApiRequest({
|
||||
method,
|
||||
path,
|
||||
payload,
|
||||
baseUrl = process.env.NEXT_PUBLIC_AGPT_SERVER_URL || DEFAULT_BASE_URL,
|
||||
contentType = "application/json",
|
||||
}: ProxyRequestOptions) {
|
||||
return await Sentry.withServerActionInstrumentation(
|
||||
"proxyApiRequest",
|
||||
{},
|
||||
async () => {
|
||||
const baseUrl = getAgptServerApiUrl();
|
||||
const url = buildRequestUrl(baseUrl, path, method, payload);
|
||||
return makeAuthenticatedRequest(method, url, payload, contentType);
|
||||
},
|
||||
@@ -37,13 +35,12 @@ export async function proxyApiRequest({
|
||||
export async function proxyFileUpload(
|
||||
path: string,
|
||||
formData: FormData,
|
||||
baseUrl = process.env.NEXT_PUBLIC_AGPT_SERVER_URL ||
|
||||
"http://localhost:8006/api",
|
||||
): Promise<string> {
|
||||
return await Sentry.withServerActionInstrumentation(
|
||||
"proxyFileUpload",
|
||||
{},
|
||||
async () => {
|
||||
const baseUrl = getAgptServerApiUrl();
|
||||
const url = baseUrl + path;
|
||||
return makeAuthenticatedFileUpload(url, formData);
|
||||
},
|
||||
|
||||
65
autogpt_platform/frontend/src/lib/env-config.ts
Normal file
@@ -0,0 +1,65 @@
/**
 * Environment configuration helper
 *
 * Provides unified access to environment variables with server-side priority.
 * Server-side code uses Docker service names, client-side falls back to localhost.
 */

import { isServerSide } from "./utils/is-server-side";

/**
 * Gets the AGPT server URL with server-side priority
 * Server-side: Uses AGPT_SERVER_URL (http://rest_server:8006/api)
 * Client-side: Falls back to NEXT_PUBLIC_AGPT_SERVER_URL (http://localhost:8006/api)
 */
export function getAgptServerApiUrl(): string {
  // If server-side and server URL exists, use it
  if (isServerSide() && process.env.AGPT_SERVER_URL) {
    return process.env.AGPT_SERVER_URL;
  }

  // Otherwise use the public URL
  return process.env.NEXT_PUBLIC_AGPT_SERVER_URL || "http://localhost:8006/api";
}

export function getAgptServerBaseUrl(): string {
  return getAgptServerApiUrl().replace("/api", "");
}

/**
 * Gets the AGPT WebSocket URL with server-side priority
 * Server-side: Uses AGPT_WS_SERVER_URL (ws://websocket_server:8001/ws)
 * Client-side: Falls back to NEXT_PUBLIC_AGPT_WS_SERVER_URL (ws://localhost:8001/ws)
 */
export function getAgptWsServerUrl(): string {
  // If server-side and server URL exists, use it
  if (isServerSide() && process.env.AGPT_WS_SERVER_URL) {
    return process.env.AGPT_WS_SERVER_URL;
  }

  // Otherwise use the public URL
  return process.env.NEXT_PUBLIC_AGPT_WS_SERVER_URL || "ws://localhost:8001/ws";
}

/**
 * Gets the Supabase URL with server-side priority
 * Server-side: Uses SUPABASE_URL (http://kong:8000)
 * Client-side: Falls back to NEXT_PUBLIC_SUPABASE_URL (http://localhost:8000)
 */
export function getSupabaseUrl(): string {
  // If server-side and server URL exists, use it
  if (isServerSide() && process.env.SUPABASE_URL) {
    return process.env.SUPABASE_URL;
  }

  // Otherwise use the public URL
  return process.env.NEXT_PUBLIC_SUPABASE_URL || "http://localhost:8000";
}

/**
 * Gets the Supabase anon key
 * Uses NEXT_PUBLIC_SUPABASE_ANON_KEY since anon keys are public and same across environments
 */
export function getSupabaseAnonKey(): string {
  return process.env.NEXT_PUBLIC_SUPABASE_ANON_KEY || "";
}
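All of the helpers in this new file share the same resolution order. A minimal standalone sketch of that order (independent of Next.js; the `resolve` function and its arguments are hypothetical, used here only to illustrate the priority rule):

```typescript
// Hypothetical sketch of the resolution order used by the env-config
// helpers: a server-only variable wins when running on the server,
// otherwise the NEXT_PUBLIC_* value applies, otherwise a localhost default.
function resolve(
  serverValue: string | undefined,
  publicValue: string | undefined,
  onServer: boolean,
  fallback: string,
): string {
  if (onServer && serverValue) return serverValue;
  return publicValue || fallback;
}

// Server-side render: the Docker service name takes priority.
const serverUrl = resolve(
  "http://kong:8000",
  "http://localhost:8000",
  true,
  "http://localhost:8000",
);

// Client-side: the server-only variable is ignored entirely.
const clientUrl = resolve(
  "http://kong:8000",
  "http://localhost:8000",
  false,
  "http://localhost:8000",
);

console.log(serverUrl, clientUrl);
```

This keeps a single source of truth for each URL while letting containers talk to each other by service name and the browser keep using localhost.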
@@ -4,6 +4,7 @@ import { User } from "@supabase/supabase-js";
 import { usePathname, useRouter } from "next/navigation";
 import { useEffect, useMemo, useRef, useState } from "react";
 import { useBackendAPI } from "@/lib/autogpt-server-api/context";
+import { getSupabaseUrl, getSupabaseAnonKey } from "@/lib/env-config";
 import {
   getCurrentUser,
   refreshSession,
@@ -32,16 +33,12 @@ export function useSupabase() {

   const supabase = useMemo(() => {
     try {
-      return createBrowserClient(
-        process.env.NEXT_PUBLIC_SUPABASE_URL!,
-        process.env.NEXT_PUBLIC_SUPABASE_ANON_KEY!,
-        {
-          isSingleton: true,
-          auth: {
-            persistSession: false, // Don't persist session on client with httpOnly cookies
-          },
-        },
-      );
+      return createBrowserClient(getSupabaseUrl(), getSupabaseAnonKey(), {
+        isSingleton: true,
+        auth: {
+          persistSession: false, // Don't persist session on client with httpOnly cookies
+        },
+      });
     } catch (error) {
       console.error("Error creating Supabase client", error);
       return null;
@@ -1,47 +1,43 @@
 import { createServerClient } from "@supabase/ssr";
 import { NextResponse, type NextRequest } from "next/server";
 import { getCookieSettings, isAdminPage, isProtectedPage } from "./helpers";
+import { getSupabaseUrl, getSupabaseAnonKey } from "../env-config";

 export async function updateSession(request: NextRequest) {
   let supabaseResponse = NextResponse.next({
     request,
   });

-  const isAvailable = Boolean(
-    process.env.NEXT_PUBLIC_SUPABASE_URL &&
-      process.env.NEXT_PUBLIC_SUPABASE_ANON_KEY,
-  );
+  const supabaseUrl = getSupabaseUrl();
+  const supabaseKey = getSupabaseAnonKey();
+  const isAvailable = Boolean(supabaseUrl && supabaseKey);

   if (!isAvailable) {
     return supabaseResponse;
   }

   try {
-    const supabase = createServerClient(
-      process.env.NEXT_PUBLIC_SUPABASE_URL!,
-      process.env.NEXT_PUBLIC_SUPABASE_ANON_KEY!,
-      {
-        cookies: {
-          getAll() {
-            return request.cookies.getAll();
-          },
-          setAll(cookiesToSet) {
-            cookiesToSet.forEach(({ name, value }) =>
-              request.cookies.set(name, value),
-            );
-            supabaseResponse = NextResponse.next({
-              request,
-            });
-            cookiesToSet.forEach(({ name, value, options }) => {
-              supabaseResponse.cookies.set(name, value, {
-                ...options,
-                ...getCookieSettings(),
-              });
-            });
-          },
-        },
-      },
-    );
+    const supabase = createServerClient(supabaseUrl, supabaseKey, {
+      cookies: {
+        getAll() {
+          return request.cookies.getAll();
+        },
+        setAll(cookiesToSet) {
+          cookiesToSet.forEach(({ name, value }) =>
+            request.cookies.set(name, value),
+          );
+          supabaseResponse = NextResponse.next({
+            request,
+          });
+          cookiesToSet.forEach(({ name, value, options }) => {
+            supabaseResponse.cookies.set(name, value, {
+              ...options,
+              ...getCookieSettings(),
+            });
+          });
+        },
+      },
+    });

     const userResponse = await supabase.auth.getUser();
     const user = userResponse.data.user;
@@ -1,5 +1,6 @@
 import { createServerClient, type CookieOptions } from "@supabase/ssr";
 import { getCookieSettings } from "../helpers";
+import { getSupabaseUrl, getSupabaseAnonKey } from "../../env-config";

 type Cookies = { name: string; value: string; options?: CookieOptions }[];

@@ -11,8 +12,8 @@ export async function getServerSupabase() {

   try {
     const supabase = createServerClient(
-      process.env.NEXT_PUBLIC_SUPABASE_URL!,
-      process.env.NEXT_PUBLIC_SUPABASE_ANON_KEY!,
+      getSupabaseUrl(),
+      getSupabaseAnonKey(),
       {
         cookies: {
           getAll() {
@@ -2,6 +2,7 @@
  * Utility functions for working with Cloudflare Turnstile
  */
 import { BehaveAs, getBehaveAs } from "@/lib/utils";
+import { getAgptServerApiUrl } from "@/lib/env-config";

 export async function verifyTurnstileToken(
   token: string,
@@ -19,19 +20,16 @@ export async function verifyTurnstileToken(
   }

   try {
-    const response = await fetch(
-      `${process.env.NEXT_PUBLIC_AGPT_SERVER_URL}/turnstile/verify`,
-      {
-        method: "POST",
-        headers: {
-          "Content-Type": "application/json",
-        },
-        body: JSON.stringify({
-          token,
-          action,
-        }),
-      },
-    );
+    const response = await fetch(`${getAgptServerApiUrl()}/turnstile/verify`, {
+      method: "POST",
+      headers: {
+        "Content-Type": "application/json",
+      },
+      body: JSON.stringify({
+        token,
+        action,
+      }),
+    });

     if (!response.ok) {
       console.error("Turnstile verification failed:", await response.text());
@@ -2,7 +2,7 @@ import { LoginPage } from "./pages/login.page";
 import test, { expect } from "@playwright/test";
 import { TEST_AGENT_DATA, TEST_CREDENTIALS } from "./credentials";
 import { getSelectors } from "./utils/selectors";
-import { hasUrl } from "./utils/assertion";
+import { hasUrl, isHidden } from "./utils/assertion";

 test.describe("Agent Dashboard", () => {
   test.beforeEach(async ({ page }) => {
@@ -89,6 +89,7 @@ test.describe("Agent Dashboard", () => {
     }

     const firstRow = rows.first();
+    const deletedAgentId = await firstRow.getAttribute("data-agent-id");
     await firstRow.scrollIntoViewIfNeeded();

     const delActionsButton = firstRow.getByTestId("agent-table-row-actions");
@@ -100,9 +101,7 @@ test.describe("Agent Dashboard", () => {
     await expect(deleteButton).toBeVisible();
     await deleteButton.click();

-    // Wait for row count to drop by 1
-    await expect
-      .poll(async () => await rows.count(), { timeout: 15000 })
-      .toBe(beforeCount - 1);
+    // Assert that the card with the deleted agent ID is not visible
+    await isHidden(page.locator(`[data-agent-id="${deletedAgentId}"]`));
   });
 });
@@ -53,8 +53,6 @@ test.describe("Build", () => { //(1)!
     for (const block of blocksToAdd) {
       await buildPage.addBlock(block);
     }

     await buildPage.saveAgent(`Saved blocks ${letter} test part ${part}`);
   }

   // Reason Ignore: admonishment is in the wrong place visually with correct prettier rules
@@ -35,7 +35,11 @@ test("user can publish an agent through the complete flow", async ({
   const agentToSelect = publishAgentModal.getByTestId("agent-card").first();
   await agentToSelect.click();

-  const nextButton = publishAgentModal.getByRole("button", { name: "Next" });
+  const nextButton = publishAgentModal.getByRole("button", {
+    name: "Next",
+    exact: true,
+  });

   await isEnabled(nextButton);
   await nextButton.click();

@@ -101,7 +105,10 @@ test("should validate all form fields in publish agent form", async ({
   const agentToSelect = publishAgentModal.getByTestId("agent-card").first();
   await agentToSelect.click();

-  const nextButton = publishAgentModal.getByRole("button", { name: "Next" });
+  const nextButton = publishAgentModal.getByRole("button", {
+    name: "Next",
+    exact: true,
+  });
   await nextButton.click();

   await isVisible(getText("Write a bit of details about your agent"));
144
autogpt_platform/frontend/src/tests/settings.spec.ts
Normal file
@@ -0,0 +1,144 @@
import test, { expect } from "@playwright/test";
import { getTestUser } from "./utils/auth";
import { LoginPage } from "./pages/login.page";
import { hasAttribute, hasUrl, isHidden, isVisible } from "./utils/assertion";
import { getSelectors } from "./utils/selectors";

test.beforeEach(async ({ page }) => {
  const testUser = await getTestUser();
  const loginPage = new LoginPage(page);

  // Login and navigate to settings
  await page.goto("/login");
  await loginPage.login(testUser.email, testUser.password);
  await hasUrl(page, "/marketplace");

  // Navigate to settings page
  await page.goto("/profile/settings");
  await hasUrl(page, "/profile/settings");
});

test("should display email form elements correctly", async ({ page }) => {
  const { getField, getButton, getText, getLink } = getSelectors(page);

  // Check email form elements are displayed
  await isVisible(getText("Security & Access"));
  await isVisible(getField("Email"));
  await isVisible(getLink("Reset password"));
  await isVisible(getButton("Update email"));

  const updateEmailButton = getButton("Update email");
  const resetPasswordButton = getLink("Reset password");

  // Button should be disabled initially (no changes)
  await expect(updateEmailButton).toBeDisabled();

  // Test reset password navigation
  await hasAttribute(resetPasswordButton, "href", "/reset-password");
});

test("should show validation error for empty email", async ({ page }) => {
  const { getField, getButton } = getSelectors(page);

  const emailField = getField("Email");
  const updateEmailButton = getButton("Update email");

  await emailField.fill("");
  await updateEmailButton.click();
  await isVisible(page.getByText("Email is required"));
});

test("should show validation error for invalid email", async ({ page }) => {
  const { getField, getButton } = getSelectors(page);

  const emailField = getField("Email");
  const updateEmailButton = getButton("Update email");

  await emailField.fill("invalid email");
  await updateEmailButton.click();
  await isVisible(page.getByText("Please enter a valid email address"));
});

test("should handle valid email", async ({ page }) => {
  const { getField, getButton } = getSelectors(page);

  const emailField = getField("Email");
  const updateEmailButton = getButton("Update email");

  // Test successful email update
  const newEmail = `test+${Date.now()}@example.com`;
  await emailField.fill(newEmail);
  await expect(updateEmailButton).toBeEnabled();
  await updateEmailButton.click();
  await isHidden(page.getByText("Email is required"));
  await isHidden(page.getByText("Please enter a valid email address"));
});

test("should handle complete notification form functionality and form interactions", async ({
  page,
}) => {
  const { getButton } = getSelectors(page);

  // Check notification form elements are displayed
  await isVisible(
    page.getByRole("heading", { name: "Notifications", exact: true }),
  );

  await isVisible(getButton("Cancel"));
  await isVisible(getButton("Save preferences"));

  // Check all notification switches are present - get all switches on page
  const switches = await page.getByRole("switch").all();

  for (const switchElement of switches) {
    await isVisible(switchElement);
  }

  const savePreferencesButton = getButton("Save preferences");
  const cancelButton = getButton("Cancel");

  // Button should be disabled initially (no changes)
  await expect(savePreferencesButton).toBeDisabled();

  // Test switch toggling functionality
  for (const switchElement of switches) {
    const initialState = await switchElement.isChecked();
    await switchElement.click();
    const newState = await switchElement.isChecked();
    expect(newState).toBe(!initialState);
  }

  // Test button enabling when changes are made
  if (switches.length > 0) {
    await expect(savePreferencesButton).toBeEnabled();
  }

  // Test cancel functionality
  await cancelButton.click();
  // Wait for form state to update after cancel
  await page.waitForTimeout(100);
  await expect(savePreferencesButton).toBeDisabled();

  // Test successful save with multiple switches
  const testSwitches = switches.slice(0, Math.min(3, switches.length));
  for (const switchElement of testSwitches) {
    await switchElement.click();
  }
  await expect(savePreferencesButton).toBeEnabled();
  await savePreferencesButton.click();
  await isVisible(getButton("Saving..."));
  await isVisible(
    page.getByText("Successfully updated notification preferences"),
  );

  // Test persistence after page reload
  if (testSwitches.length > 0) {
    const finalState = await testSwitches[0].isChecked();
    await page.reload();
    await hasUrl(page, "/profile/settings");
    const reloadedSwitches = await page.getByRole("switch").all();
    if (reloadedSwitches.length > 0) {
      expect(await reloadedSwitches[0].isChecked()).toBe(finalState);
    }
  }
});
@@ -1,58 +1,46 @@
@echo off
setlocal enabledelayedexpansion

goto :main
REM Variables
set SCRIPT_DIR=%~dp0
set REPO_DIR=%SCRIPT_DIR%..\..
set CLONE_NEEDED=0
set LOG_FILE=

REM --- Helper: Check command existence ---
:check_command
if "%1"=="" (
    echo ERROR: check_command called with no command argument!
    pause
    exit /b 1
)
where %1 >nul 2>nul
if errorlevel 1 (
    echo %2 is not installed. Please install it and try again.
    pause
    exit /b 1
) else (
    echo %2 is installed.
)
goto :eof

:main
echo =============================
echo AutoGPT Windows Setup
echo =============================
echo.

REM --- Variables ---
set SCRIPT_DIR=%~dp0
set LOG_DIR=%SCRIPT_DIR%logs
set BACKEND_LOG=%LOG_DIR%\backend_setup.log
set FRONTEND_LOG=%LOG_DIR%\frontend_setup.log
set CLONE_NEEDED=0
set REPO_DIR=%SCRIPT_DIR%..\..

REM --- Create logs folder immediately ---
if not exist "%LOG_DIR%" mkdir "%LOG_DIR%"

REM Check prerequisites
echo Checking prerequisites...
call :check_command git Git
call :check_command docker Docker
call :check_command npm Node.js
call :check_command pnpm pnpm
where git >nul 2>nul
if errorlevel 1 (
    echo Git is not installed. Please install it and try again.
    pause
    exit /b 1
)
echo Git is installed.

where docker >nul 2>nul
if errorlevel 1 (
    echo Docker is not installed. Please install it and try again.
    pause
    exit /b 1
)
echo Docker is installed.
echo.

REM --- Detect repo ---
REM Detect repo
if exist "%REPO_DIR%\.git" (
    echo Using existing AutoGPT repository.
    set CLONE_NEEDED=0
) else (
    set REPO_DIR=%SCRIPT_DIR%AutoGPT
    set CLONE_NEEDED=1
)

REM --- Clone repo if needed ---
REM Clone repo if needed
if %CLONE_NEEDED%==1 (
    echo Cloning AutoGPT repository...
    git clone https://github.com/Significant-Gravitas/AutoGPT.git "%REPO_DIR%"
@@ -61,72 +49,47 @@ if %CLONE_NEEDED%==1 (
        pause
        exit /b 1
    )
) else (
    echo Using existing AutoGPT repository.
    echo Repository cloned successfully.
)
echo.

REM --- Prompt for Sentry enablement ---
set SENTRY_ENABLED=0
echo Would you like to enable debug information to be shared so we can fix your issues? [Y/n]
set /p sentry_answer="Enable Sentry? [Y/n]: "
if /I "%sentry_answer%"=="" set SENTRY_ENABLED=1
if /I "%sentry_answer%"=="y" set SENTRY_ENABLED=1
if /I "%sentry_answer%"=="yes" set SENTRY_ENABLED=1
if /I "%sentry_answer%"=="n" set SENTRY_ENABLED=0
if /I "%sentry_answer%"=="no" set SENTRY_ENABLED=0

REM --- Setup backend ---
echo Setting up backend services...
echo.
REM Navigate to autogpt_platform
cd /d "%REPO_DIR%\autogpt_platform"
if exist .env.example copy /Y .env.example .env >nul
cd backend
if exist .env.example copy /Y .env.example .env >nul

REM --- Set SENTRY_DSN in backend/.env ---
set SENTRY_DSN=https://11d0640fef35640e0eb9f022eb7d7626@o4505260022104064.ingest.us.sentry.io/4507890252447744
if %SENTRY_ENABLED%==1 (
    powershell -Command "(Get-Content .env) -replace '^SENTRY_DSN=.*', 'SENTRY_DSN=%SENTRY_DSN%' | Set-Content .env"
    echo Sentry enabled in backend.
) else (
    powershell -Command "(Get-Content .env) -replace '^SENTRY_DSN=.*', 'SENTRY_DSN=' | Set-Content .env"
    echo Sentry not enabled in backend.
)
cd ..

docker compose down > "%BACKEND_LOG%" 2>&1
if errorlevel 1 echo (docker compose down failed, continuing...)
docker compose up -d --build >> "%BACKEND_LOG%" 2>&1
if errorlevel 1 (
    echo Backend setup failed. See log: %BACKEND_LOG%
    echo Failed to navigate to autogpt_platform directory.
    pause
    exit /b 1
)
echo Backend services started successfully.
echo.

REM --- Setup frontend ---
echo Setting up frontend application...
REM Create logs directory
if not exist logs mkdir logs

REM Run docker compose with logging
echo Starting AutoGPT services with Docker Compose...
echo This may take a few minutes on first run...
echo.
cd frontend
if exist .env.example copy /Y .env.example .env >nul
call pnpm.cmd install
set LOG_FILE=%REPO_DIR%\autogpt_platform\logs\docker_setup.log
docker compose up -d > "%LOG_FILE%" 2>&1
if errorlevel 1 (
    echo pnpm install failed!
    echo Docker compose failed. Check log file for details: %LOG_FILE%
    echo.
    echo Common issues:
    echo - Docker is not running
    echo - Insufficient disk space
    echo - Port conflicts (check if ports 3000, 8000, etc. are in use)
    pause
    exit /b 1
)
echo Frontend dependencies installed successfully.
echo.

REM --- Start frontend dev server in the same terminal ---
echo Setup complete!
echo =============================
echo Setup Complete!
echo =============================
echo.
echo Access AutoGPT at: http://localhost:3000
echo To stop services, press Ctrl+C and run "docker compose down" in %REPO_DIR%\autogpt_platform
echo API available at: http://localhost:8000
echo.
echo The frontend will now start in this terminal. Closing this window will stop the frontend.
echo Press Ctrl+C to stop the frontend at any time.
echo To stop services: docker compose down
echo To view logs: docker compose logs -f
echo.

call pnpm.cmd dev
echo Press any key to exit (services will keep running)...
pause >nul
325
autogpt_platform/installer/setup-autogpt.sh
Normal file → Executable file
@@ -4,9 +4,7 @@
|
||||
# AutoGPT Setup Script
|
||||
# ------------------------------------------------------------------------------
|
||||
# This script automates the installation and setup of AutoGPT on Linux systems.
|
||||
# It checks prerequisites, clones the repository, sets up backend and frontend,
|
||||
# configures Sentry (optional), and starts all services. Designed for clarity
|
||||
# and maintainability. Run this script from a terminal.
|
||||
# It checks prerequisites, clones the repository, and starts all services.
|
||||
# ------------------------------------------------------------------------------
|
||||
|
||||
# --- Global Variables ---
|
||||
@@ -14,24 +12,19 @@ GREEN='\033[0;32m'
|
||||
BLUE='\033[0;34m'
|
||||
YELLOW='\033[1;33m'
|
||||
RED='\033[0;31m'
|
||||
NC='\033[0m' # No Color
|
||||
NC='\033[0m'
|
||||
|
||||
# Variables
|
||||
REPO_DIR=""
|
||||
CLONE_NEEDED=false
|
||||
DOCKER_CMD="docker"
|
||||
DOCKER_COMPOSE_CMD="docker compose"
|
||||
LOG_DIR=""
|
||||
SENTRY_ENABLED=0
|
||||
LOG_FILE=""
|
||||
|
||||
# ------------------------------------------------------------------------------
|
||||
# Helper Functions
|
||||
# ------------------------------------------------------------------------------
|
||||
|
||||
# Print colored text
|
||||
print_color() {
|
||||
printf "${!1}%s${NC}\n" "$2"
|
||||
}
|
||||
|
||||
# Print the ASCII banner
|
||||
print_banner() {
|
||||
print_color "BLUE" "
|
||||
d8888 888 .d8888b. 8888888b. 88888888888
|
||||
@@ -45,295 +38,109 @@ d88P 888 \"Y88888 \"Y888 \"Y88P\" \"Y8888P88 888 888
|
||||
"
|
||||
}
|
||||
|
||||
# Handle errors and exit
|
||||
handle_error() {
|
||||
echo ""
|
||||
print_color "RED" "Error: $1"
|
||||
print_color "YELLOW" "Press Enter to exit..."
|
||||
read -r
|
||||
if [ -n "$LOG_FILE" ] && [ -f "$LOG_FILE" ]; then
|
||||
print_color "RED" "Check log file for details: $LOG_FILE"
|
||||
fi
|
||||
exit 1
|
||||
}
|
||||
|
||||
# ------------------------------------------------------------------------------
|
||||
# Logging Functions
|
||||
# ------------------------------------------------------------------------------
|
||||
|
||||
# Prepare log directory
|
||||
setup_logs() {
|
||||
LOG_DIR="$REPO_DIR/autogpt_platform/logs"
|
||||
mkdir -p "$LOG_DIR"
|
||||
}
|
||||
|
||||
# ------------------------------------------------------------------------------
|
||||
# Health Check Functions
|
||||
# ------------------------------------------------------------------------------
|
||||
|
||||
# Check service health by polling an endpoint
|
||||
check_health() {
|
||||
local url=$1
|
||||
local expected=$2
|
||||
local name=$3
|
||||
local max_attempts=$4
|
||||
local timeout=$5
|
||||
|
||||
if ! command -v curl &> /dev/null; then
|
||||
echo "curl not found. Skipping health check for $name."
|
||||
return 0
|
||||
fi
|
||||
|
||||
echo "Checking $name health..."
|
||||
for ((attempt=1; attempt<=max_attempts; attempt++)); do
|
||||
echo "Attempt $attempt/$max_attempts"
|
||||
response=$(curl -s --max-time "$timeout" "$url")
|
||||
if [[ "$response" == *"$expected"* ]]; then
|
||||
echo "✓ $name is healthy"
|
||||
return 0
|
||||
fi
|
||||
echo "Waiting 5s before next attempt..."
|
||||
sleep 5
|
||||
done
|
||||
echo "✗ $name health check failed after $max_attempts attempts"
|
||||
return 1
|
||||
}
|
||||
|
||||
# ------------------------------------------------------------------------------
|
||||
# Prerequisite and Environment Functions
|
||||
# ------------------------------------------------------------------------------
|
||||
|
||||
# Check for required commands
|
||||
check_command() {
|
||||
local cmd=$1
|
||||
local name=$2
|
||||
local url=$3
|
||||
|
||||
if ! command -v "$cmd" &> /dev/null; then
|
||||
handle_error "$name is not installed. Please install it and try again. Visit $url"
|
||||
check_prerequisites() {
|
||||
print_color "BLUE" "Checking prerequisites..."
|
||||
|
||||
if ! command -v git &> /dev/null; then
|
||||
handle_error "Git is not installed. Please install it and try again."
|
||||
else
|
||||
print_color "GREEN" "✓ $name is installed"
|
||||
print_color "GREEN" "✓ Git is installed"
|
||||
fi
|
||||
}
|
||||
|
||||
# Check for optional commands
|
||||
check_command_optional() {
|
||||
local cmd=$1
|
||||
if command -v "$cmd" &> /dev/null; then
|
||||
print_color "GREEN" "✓ $cmd is installed"
|
||||
|
||||
if ! command -v docker &> /dev/null; then
|
||||
handle_error "Docker is not installed. Please install it and try again."
|
||||
else
|
||||
print_color "YELLOW" "$cmd is not installed. Some features will be skipped."
|
||||
print_color "GREEN" "✓ Docker is installed"
|
||||
fi
|
||||
}
|
||||
|
||||
# Check Docker permissions and adjust commands if needed
|
||||
check_docker_permissions() {
|
||||
|
||||
if ! docker info &> /dev/null; then
|
||||
print_color "YELLOW" "Docker requires elevated privileges. Using sudo for Docker commands..."
|
||||
print_color "YELLOW" "Using sudo for Docker commands..."
|
||||
DOCKER_CMD="sudo docker"
|
||||
DOCKER_COMPOSE_CMD="sudo docker compose"
|
||||
fi
|
||||
|
||||
print_color "GREEN" "All prerequisites installed!"
|
||||
}
|
||||
|
||||
# Check all prerequisites
|
||||
check_prerequisites() {
|
||||
print_color "GREEN" "AutoGPT's Automated Setup Script"
|
||||
print_color "GREEN" "-------------------------------"
|
||||
print_color "BLUE" "This script will automatically install and set up AutoGPT for you."
|
||||
echo ""
|
||||
print_color "YELLOW" "Checking prerequisites:"
|
||||
|
||||
check_command git "Git" "https://git-scm.com/downloads"
|
||||
check_command docker "Docker" "https://docs.docker.com/get-docker/"
|
||||
check_docker_permissions
|
||||
check_command npm "npm (Node.js)" "https://nodejs.org/en/download/"
|
||||
check_command pnpm "pnpm (Node.js package manager)" "https://pnpm.io/installation"
|
||||
check_command_optional curl "curl"
|
||||
|
||||
print_color "GREEN" "All prerequisites are installed! Starting installation..."
|
||||
echo ""
|
||||
}
|
||||
|
||||
# Detect installation mode and set repo directory
|
||||
# (Clones if not in a repo, otherwise uses current directory)
|
||||
detect_installation_mode() {
|
||||
detect_repo() {
|
||||
if [[ "$PWD" == */autogpt_platform/installer ]]; then
|
||||
if [[ -d "../../.git" ]]; then
|
||||
REPO_DIR="$(cd ../..; pwd)"
|
||||
CLONE_NEEDED=false
|
||||
cd ../.. || handle_error "Failed to navigate to repository root."
|
||||
cd ../.. || handle_error "Failed to navigate to repo root"
|
||||
print_color "GREEN" "Using existing AutoGPT repository."
|
||||
else
|
||||
CLONE_NEEDED=true
|
||||
REPO_DIR="$(pwd)/AutoGPT"
|
||||
cd "$(dirname \"$(dirname \"$(dirname \"$PWD\")\")\")" || handle_error "Failed to navigate to parent directory."
|
||||
fi
|
||||
elif [[ -d ".git" && -d "autogpt_platform/installer" ]]; then
|
||||
REPO_DIR="$PWD"
|
||||
CLONE_NEEDED=false
|
||||
print_color "GREEN" "Using existing AutoGPT repository."
|
||||
else
|
||||
CLONE_NEEDED=true
|
||||
REPO_DIR="$(pwd)/AutoGPT"
|
||||
fi
|
||||
}
|
||||
|
||||
# Clone the repository if needed
|
||||
clone_repository() {
|
||||
clone_repo() {
|
||||
if [ "$CLONE_NEEDED" = true ]; then
|
||||
print_color "BLUE" "Cloning AutoGPT repository..."
|
||||
if git clone https://github.com/Significant-Gravitas/AutoGPT.git "$REPO_DIR"; then
|
||||
print_color "GREEN" "✓ Repo cloned successfully!"
|
||||
else
|
||||
handle_error "Failed to clone the repository."
|
||||
fi
|
||||
else
|
||||
print_color "GREEN" "Using existing AutoGPT repository"
|
||||
git clone https://github.com/Significant-Gravitas/AutoGPT.git "$REPO_DIR" || handle_error "Failed to clone repository"
|
||||
print_color "GREEN" "Repository cloned successfully."
|
||||
fi
|
||||
}
|
||||
|
||||
# Prompt for Sentry enablement and set global flag
|
||||
prompt_sentry_enablement() {
|
||||
print_color "YELLOW" "Would you like to enable debug information to be shared so we can fix your issues? [Y/n]"
|
||||
read -r sentry_answer
|
||||
case "${sentry_answer,,}" in
|
||||
""|y|yes)
|
||||
SENTRY_ENABLED=1
|
||||
;;
|
||||
n|no)
|
||||
SENTRY_ENABLED=0
|
||||
;;
|
||||
*)
|
||||
print_color "YELLOW" "Invalid input. Defaulting to yes. Sentry will be enabled."
|
||||
SENTRY_ENABLED=1
|
||||
;;
|
||||
esac
|
||||
}
|
||||
|
||||
# ------------------------------------------------------------------------------
|
||||
# Setup Functions
|
||||
# ------------------------------------------------------------------------------
|
||||
|
||||
# Set up backend services and configure Sentry if enabled
|
||||
setup_backend() {
|
||||
print_color "BLUE" "Setting up backend services..."
|
||||
cd "$REPO_DIR/autogpt_platform" || handle_error "Failed to navigate to backend directory."
|
||||
cp .env.example .env || handle_error "Failed to copy environment file."
|
||||
|
||||
# Set SENTRY_DSN in backend/.env
|
||||
cd backend || handle_error "Failed to navigate to backend subdirectory."
|
||||
cp .env.example .env || handle_error "Failed to copy backend environment file."
|
||||
sentry_url="https://11d0640fef35640e0eb9f022eb7d7626@o4505260022104064.ingest.us.sentry.io/4507890252447744"
|
||||
if [ "$SENTRY_ENABLED" = "1" ]; then
|
||||
sed -i "s|^SENTRY_DSN=.*$|SENTRY_DSN=$sentry_url|" .env || echo "SENTRY_DSN=$sentry_url" >> .env
|
||||
print_color "GREEN" "Sentry enabled in backend."
|
||||
run_docker() {
|
||||
cd "$REPO_DIR/autogpt_platform" || handle_error "Failed to navigate to autogpt_platform"
|
||||
|
||||
print_color "BLUE" "Starting AutoGPT services with Docker Compose..."
|
||||
print_color "YELLOW" "This may take a few minutes on first run..."
|
||||
echo
|
||||
|
||||
mkdir -p logs
|
||||
LOG_FILE="$REPO_DIR/autogpt_platform/logs/docker_setup.log"
|
||||
|
||||
if $DOCKER_COMPOSE_CMD up -d > "$LOG_FILE" 2>&1; then
|
||||
print_color "GREEN" "✓ Services started successfully!"
|
||||
else
|
||||
sed -i "s|^SENTRY_DSN=.*$|SENTRY_DSN=|" .env || echo "SENTRY_DSN=" >> .env
|
||||
print_color "YELLOW" "Sentry not enabled in backend."
|
||||
fi
|
||||
cd .. # back to autogpt_platform
|
||||
|
||||
$DOCKER_COMPOSE_CMD down || handle_error "Failed to stop existing backend services."
|
||||
$DOCKER_COMPOSE_CMD up -d --build || handle_error "Failed to start backend services."
|
||||
print_color "GREEN" "✓ Backend services started successfully"
|
||||
}
|
||||
|
||||
# Set up frontend application
|
||||
setup_frontend() {
|
||||
print_color "BLUE" "Setting up frontend application..."
|
||||
cd "$REPO_DIR/autogpt_platform/frontend" || handle_error "Failed to navigate to frontend directory."
|
||||
cp .env.example .env || handle_error "Failed to copy frontend environment file."
|
||||
corepack enable || handle_error "Failed to enable corepack."
|
||||
pnpm install || handle_error "Failed to install frontend dependencies."
|
||||
print_color "GREEN" "✓ Frontend dependencies installed successfully"
|
||||
}

# Run backend and frontend setup concurrently and manage logs
run_concurrent_setup() {
    setup_logs
    backend_log="$LOG_DIR/backend_setup.log"
    frontend_log="$LOG_DIR/frontend_setup.log"

    : > "$backend_log"
    : > "$frontend_log"

    setup_backend > "$backend_log" 2>&1 &
    backend_pid=$!
    echo "Backend setup started in background."

    setup_frontend > "$frontend_log" 2>&1 &
    frontend_pid=$!
    echo "Frontend setup started in background."

    show_spinner "$backend_pid" "$frontend_pid"

    wait $backend_pid; backend_status=$?
    wait $frontend_pid; frontend_status=$?

    if [ $backend_status -ne 0 ]; then
        print_color "RED" "Backend setup failed. See log: $backend_log"
        print_color "RED" "Docker compose failed. Check log file for details: $LOG_FILE"
        print_color "YELLOW" "Common issues:"
        print_color "YELLOW" "- Docker is not running"
        print_color "YELLOW" "- Insufficient disk space"
        print_color "YELLOW" "- Port conflicts (check if ports 3000, 8000, etc. are in use)"
        exit 1
    fi

    if [ $frontend_status -ne 0 ]; then
        print_color "RED" "Frontend setup failed. See log: $frontend_log"
        exit 1
    fi
}

# Show a spinner while background jobs run
show_spinner() {
    local backend_pid=$1
    local frontend_pid=$2
    spin='-\|/'
    i=0
    messages=("Working..." "Still working..." "Setting up dependencies..." "Almost there...")
    msg_index=0
    msg_counter=0
    clear_line=" "

    while kill -0 $backend_pid 2>/dev/null || kill -0 $frontend_pid 2>/dev/null; do
        i=$(( (i+1) % 4 ))
        msg_counter=$(( (msg_counter+1) % 300 ))
        if [ $msg_counter -eq 0 ]; then
            msg_index=$(( (msg_index+1) % ${#messages[@]} ))
        fi
        printf "\r${clear_line}\r${YELLOW}[%c]${NC} %s" "${spin:$i:1}" "${messages[$msg_index]}"
        sleep .1
    done
    printf "\r${clear_line}\r${GREEN}[✓]${NC} Setup completed!\n"
}

# ------------------------------------------------------------------------------
# Main Entry Point
# ------------------------------------------------------------------------------

main() {
    print_banner
    print_color "GREEN" "AutoGPT Setup Script"
    print_color "GREEN" "-------------------"

    check_prerequisites
    prompt_sentry_enablement
    detect_installation_mode
    clone_repository
    setup_logs
    run_concurrent_setup

    print_color "YELLOW" "Starting frontend..."
    (cd "$REPO_DIR/autogpt_platform/frontend" && pnpm dev > "$LOG_DIR/frontend_dev.log" 2>&1 &)

    print_color "YELLOW" "Waiting for services to start..."
    sleep 20

    print_color "YELLOW" "Verifying services health..."
    check_health "http://localhost:8006/health" "\"status\":\"healthy\"" "Backend" 6 15
    backend_status=$?
    check_health "http://localhost:3000/health" "Yay im healthy" "Frontend" 6 15
    frontend_status=$?

    if [ $backend_status -ne 0 ] || [ $frontend_status -ne 0 ]; then
        print_color "RED" "Setup failed. See logs for details."
        exit 1
    fi

    echo
    print_color "GREEN" "============================="
    print_color "GREEN" "       Setup Complete!"
    print_color "GREEN" "============================="
    echo
    print_color "BLUE" "🚀 Access AutoGPT at: http://localhost:3000"
    print_color "BLUE" "📡 API available at: http://localhost:8000"
    echo
    print_color "YELLOW" "To stop services: docker compose down"
    print_color "YELLOW" "To view logs: docker compose logs -f"
    print_color "YELLOW" "All commands should be run in: $REPO_DIR/autogpt_platform"
    echo
    print_color "GREEN" "Press Enter to exit (services will keep running)..."
    read -r
}

main
BIN  docs/content/imgs/aimlapi/Step 1 AutoGPT Running.png  (new file, 19 KiB)
BIN  docs/content/imgs/aimlapi/Step 2 Build Screen.png  (new file, 134 KiB)
BIN  docs/content/imgs/aimlapi/Step 3 AI Block.png  (new file, 7.5 KiB)
BIN  docs/content/imgs/aimlapi/Step 4 AI Generator Block.png  (new file, 45 KiB)
BIN  docs/content/imgs/aimlapi/Step 5 AIMLAPI Models.png  (new file, 28 KiB)
BIN  docs/content/imgs/aimlapi/Step 6.1 Key Placeholder.png  (new file, 9.4 KiB)
BIN  docs/content/imgs/aimlapi/Step 6.2 No Fill Key Placeholder.png  (new file, 24 KiB)
BIN  docs/content/imgs/aimlapi/Step 6.3 Filled Key Placeholder.png  (new file, 22 KiB)
BIN  docs/content/imgs/aimlapi/Step 6.4 Overview.png  (new file, 54 KiB)
BIN  docs/content/imgs/aimlapi/Step 7.1 Save.png  (new file, 26 KiB)
BIN  docs/content/imgs/aimlapi/Step 8 Run.png  (new file, 22 KiB)
BIN  docs/content/imgs/aimlapi/Step 9 Output.png  (new file, 23 KiB)
@@ -64,11 +64,12 @@ You can learn more under: [Build your own Blocks](platform/new_blocks.md)

The platform comes pre-integrated with cutting-edge LLM providers:

- OpenAI
- Anthropic
- AI/ML API
- Groq
- Llama
- OpenAI - https://openai.com/
- Anthropic - https://www.anthropic.com/
- Groq - https://groq.com/
- Llama - https://llamaindex.ai/
- AI/ML API - [https://aimlapi.com/](https://aimlapi.com/?utm_source=autogpt&utm_medium=github&utm_campaign=integration)
  - AI/ML API provides 300+ AI models, including DeepSeek, Gemini, and ChatGPT. The models run at enterprise-grade rate limits and uptimes.

## License Overview

@@ -20,11 +20,11 @@ KEY2=value2

The server will automatically load the `.env` file when it starts. You can also set the environment variables directly in your shell. Refer to your operating system's documentation on how to set environment variables in the current session.

The valid options are listed in `.env.example` in the root of the builder and server directories. You can copy the `.env.example` file to `.env` and modify the values as needed.
The valid options are listed in `.env.default` in the root of the builder and server directories. You can copy the `.env.default` file to `.env` and modify the values as needed.

```bash
# Copy the .env.example file to .env
cp .env.example .env
# Copy the .env.default file to .env
cp .env.default .env
```

### Secrets directory

@@ -88,17 +88,17 @@ We use the Poetry to manage the dependencies. To set up the project, follow thes

```sh
poetry shell
```

3. Install dependencies

```sh
poetry install
```

4. Copy .env.example to .env

4. Copy .env.default to .env

```sh
cp .env.example .env
cp .env.default .env
```

5. Generate the Prisma client

@@ -106,7 +106,6 @@ We use the Poetry to manage the dependencies. To set up the project, follow thes

```sh
poetry run prisma generate
```

> In case Prisma generates the client for the global Python installation instead of the virtual environment, the current mitigation is to just uninstall the global Prisma package:
>
@@ -114,7 +113,7 @@ We use the Poetry to manage the dependencies. To set up the project, follow thes
> pip uninstall prisma
> ```
>
> Then run the generation again. The path *should* look something like this:
> Then run the generation again. The path _should_ look something like this:
> `<some path>/pypoetry/virtualenvs/backend-TQIRSwR6-py3.12/bin/prisma`

6. Run the postgres database from the /rnd folder

139  docs/content/platform/aimlapi.md  Normal file
@@ -0,0 +1,139 @@

# 🧠 Running AI/ML API with AutoGPT

Follow these steps to connect **AI/ML API** with the **AutoGPT** platform for high-performance AI text generation.

---

## ✅ Prerequisites

1. Make sure you have gone through and completed the [AutoGPT Setup Guide](https://docs.agpt.co/platform/getting-started/), and AutoGPT is running locally at `http://localhost:3000`.
2. You have an **API key** from [AI/ML API](https://aimlapi.com/app/keys?utm_source=autogpt&utm_medium=github&utm_campaign=integration).

---

## ⚙️ Setup Steps

### 1. Start AutoGPT Locally

Follow the official guide:
[📖 AutoGPT Getting Started Guide](https://docs.agpt.co/platform/getting-started/)

Make sure AutoGPT is running and accessible at:
[http://localhost:3000](http://localhost:3000)

> 💡 Keep AutoGPT running in a terminal or Docker throughout the session.

![Step 1](../imgs/aimlapi/Step%201%20AutoGPT%20Running.png)

---

### 2. Open the Visual Builder

Open your browser and go to:
[http://localhost:3000/build](http://localhost:3000/build)

Or click **“Build”** in the navigation bar.

![Step 2](../imgs/aimlapi/Step%202%20Build%20Screen.png)

---

### 3. Add an AI Text Generator Block

1. Click the **"Blocks"** button on the left sidebar.

![Step 3](../imgs/aimlapi/Step%203%20AI%20Block.png)

2. In the search bar, type `AI Text Generator`.
3. Drag the block onto the canvas.

![Step 4](../imgs/aimlapi/Step%204%20AI%20Generator%20Block.png)

---

### 4. Select an AI/ML API Model

Click the AI Text Generator block to configure it.

In the **LLM Model** dropdown, select one of the supported models from AI/ML API:

![Step 5](../imgs/aimlapi/Step%205%20AIMLAPI%20Models.png)

| Model ID | Speed | Reasoning Quality | Best For |
| ---------------------------------------------- | ------ | ----------------- | ------------------------ |
| `Qwen/Qwen2.5-72B-Instruct-Turbo` | Medium | High | Text-based tasks |
| `nvidia/llama-3.1-nemotron-70b-instruct` | Medium | High | Analytics and reasoning |
| `meta-llama/Llama-3.3-70B-Instruct-Turbo` | Low | Very High | Complex multi-step tasks |
| `meta-llama/Meta-Llama-3.1-70B-Instruct-Turbo` | Low | Very High | Deep reasoning |
| `meta-llama/Llama-3.2-3B-Instruct-Turbo` | High | Medium | Fast responses |

> ✅ These models are available via an OpenAI-compatible API from [AI/ML API](https://aimlapi.com/app/?utm_source=autogpt&utm_medium=github&utm_campaign=integration)
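Because the endpoint is OpenAI-compatible, the same models can be called directly outside AutoGPT. The sketch below builds (but does not send) a chat-completions request using only the standard library; the base URL `https://api.aimlapi.com/v1` is an assumption — verify it against the AI/ML API docs, and substitute your real key for the placeholder.

```python
# Sketch: constructing an OpenAI-compatible chat-completions request for
# an AI/ML API model. The base URL is an assumption; the key is a placeholder.
import json
import urllib.request

API_BASE = "https://api.aimlapi.com/v1"  # assumed endpoint, check the docs
API_KEY = "your-aimlapi-key"             # placeholder, use your own key


def build_chat_request(
    prompt: str,
    model: str = "meta-llama/Llama-3.3-70B-Instruct-Turbo",
) -> urllib.request.Request:
    """Build the request without sending it (no network access needed)."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{API_BASE}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
    )


req = build_chat_request("Say hello in one word.")
print(req.full_url)  # https://api.aimlapi.com/v1/chat/completions
# To actually send it: urllib.request.urlopen(req)
```

Any OpenAI-style SDK can target the same endpoint by pointing its base URL at `API_BASE`.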

---

### 5. Configure the Prompt and API Key

Inside the **AI Text Generator** block:

1. Enter your prompt text in the **Prompt** field.
2. Enter your **AI/ML API Key** in the designated field.

🔐 You can get your key from:
[https://aimlapi.com/app/keys/](https://aimlapi.com/app/keys?utm_source=autogpt&utm_medium=github&utm_campaign=integration)

![Step 6.1](../imgs/aimlapi/Step%206.1%20Key%20Placeholder.png)

![Step 6.2](../imgs/aimlapi/Step%206.2%20No%20Fill%20Key%20Placeholder.png)

![Step 6.3](../imgs/aimlapi/Step%206.3%20Filled%20Key%20Placeholder.png)

![Step 6.4](../imgs/aimlapi/Step%206.4%20Overview.png)

---

### 6. Save Your Agent

Click the **“Save”** button at the top right of the builder interface:

1. Give your agent a name (e.g., `aimlapi_test_agent`).
2. Click **“Save Agent”** to confirm.

![Step 7](../imgs/aimlapi/Step%207.1%20Save.png)

> 💡 Saving allows reuse, scheduling, and chaining in larger workflows.

---

### 7. Run Your Agent

From the workspace:

1. Press **“Run”** next to your saved agent.
2. The request will be sent to the selected AI/ML API model.

![Step 8](../imgs/aimlapi/Step%208%20Run.png)

---

### 8. View the Output

1. Scroll to the **AI Text Generator** block.
2. Check the **Output** panel below it.
3. You can copy, export, or pass the result to further blocks.

![Step 9](../imgs/aimlapi/Step%209%20Output.png)

---

## 🔄 Expand Your Agent

Now that AI/ML API is connected, expand your workflow by chaining additional blocks:

* 🔧 **Tools** – fetch URLs, call APIs, scrape data
* 🧠 **Memory** – retain context across interactions
* ⚙️ **Actions / Chains** – create full pipelines

---

🎉 You’re now generating AI responses using enterprise-grade models from **AI/ML API** in **AutoGPT**!
@@ -107,53 +107,28 @@ If you get stuck, follow [this guide](https://docs.github.com/en/repositories/cr

Once that's complete you can continue the setup process.

### Running the backend services
### Running the AutoGPT Platform

To run the backend services, follow these steps:
To run the platform, follow these steps:

* Navigate to the `autogpt_platform` directory inside the AutoGPT folder:
  ```bash
  cd AutoGPT/autogpt_platform
  ```

* Copy the `.env.example` file to `.env` in `autogpt_platform`:
  ```
  cp .env.example .env
  ```
  This command will copy the `.env.example` file to `.env` in the `supabase` directory. You can modify the `.env` file to add your own environment variables.
- Copy the `.env.default` file to `.env` in `autogpt_platform`:

* Run the backend services:
  ```
  cp .env.default .env
  ```

  This command will copy the `.env.default` file to `.env` in the `autogpt_platform` directory. You can modify the `.env` file to add your own environment variables.

- Run the platform services:
  ```
  docker compose up -d --build
  ```
  This command will start all the necessary backend services defined in the `docker-compose.combined.yml` file in detached mode.

### Running the frontend application

To run the frontend application open a new terminal and follow these steps:

- Navigate to `frontend` folder within the `autogpt_platform` directory:
  ```
  cd frontend
  ```

- Copy the `.env.example` file available in the `frontend` directory to `.env` in the same directory:
  ```
  cp .env.example .env
  ```
  You can modify the `.env` within this folder to add your own environment variables for the frontend application.

- Run the following command:
  ```
  corepack enable
  pnpm install
  pnpm dev
  ```
  This command will enable corepack, install the necessary dependencies with pnpm, and start the frontend application in development mode.
  This command will start all the necessary backend services defined in the `docker-compose.yml` file in detached mode.

### Checking if the application is running

@@ -185,127 +160,6 @@ poetry run cli gen-encrypt-key

Then, replace the existing key in the `autogpt_platform/backend/.env` file with the new one.

!!! Note
    *The steps below are an alternative to [Running the backend services](#running-the-backend-services)*

<details>
<summary><strong>Alternate Steps</strong></summary>

#### AutoGPT Agent Server (OLD)
This is an initial project for creating the next generation of agent execution, which is an AutoGPT agent server.
The agent server will enable the creation of composite multi-agent systems that utilize AutoGPT agents and other non-agent components as its primitives.

##### Docs

You can access the docs for the [AutoGPT Agent Server here](https://docs.agpt.co/#1-autogpt-server).

##### Setup

We use Poetry to manage the dependencies. To set up the project, follow these steps inside this directory:

0. Install Poetry

```sh
pip install poetry
```

1. Configure Poetry to use .venv in your project directory

```sh
poetry config virtualenvs.in-project true
```

2. Enter the poetry shell

```sh
poetry shell
```

3. Install dependencies

```sh
poetry install
```

4. Copy .env.example to .env

```sh
cp .env.example .env
```

5. Generate the Prisma client

```sh
poetry run prisma generate
```

> In case Prisma generates the client for the global Python installation instead of the virtual environment, the current mitigation is to just uninstall the global Prisma package:
>
> ```sh
> pip uninstall prisma
> ```
>
> Then run the generation again. The path *should* look something like this:
> `<some path>/pypoetry/virtualenvs/backend-TQIRSwR6-py3.12/bin/prisma`

6. Migrate the database. Be careful because this deletes current data in the database.

```sh
docker compose up db -d
poetry run prisma migrate deploy
```

</details>

### Starting the AutoGPT server without Docker

To run the server locally, start in the autogpt_platform folder:

```sh
cd ..
```

Run the following command to run the database in Docker but the application locally:

```sh
docker compose --profile local up deps --build --detach
cd backend
poetry run app
```

### Starting the AutoGPT server with Docker

Run the following command to build the dockerfiles:

```sh
docker compose build
```

Run the following command to run the app:

```sh
docker compose up
```

Run the following to automatically rebuild when code changes, in another terminal:

```sh
docker compose watch
```

Run the following command to shut down:

```sh
docker compose down
```

If you run into issues with dangling orphans, try:

```sh
docker compose down --volumes --remove-orphans && docker compose up --force-recreate --renew-anon-volumes --remove-orphans
```

### 📌 Windows Installation Note

When installing Docker on Windows, it is **highly recommended** to select **WSL 2** instead of Hyper-V. Using Hyper-V can cause compatibility issues with Supabase, leading to the `supabase-db` container being marked as **unhealthy**.
@@ -332,14 +186,92 @@ For more details, refer to [Docker's official documentation](https://docs.docker

## Development

### Formatting & Linting
Auto formatter and linter are set up in the project. To run them:
### Frontend Development

Install:
#### Running the frontend locally

To run the frontend locally, you need to have Node.js and PNPM installed on your machine.

Install [Node.js](https://nodejs.org/en/download/) to manage dependencies and run the frontend application.

Install [PNPM](https://pnpm.io/installation) to manage the frontend dependencies.

Run the service dependencies (backend, database, message queues, etc.):

```sh
docker compose --profile local up deps_backend --build --detach
```

Go to the `autogpt_platform/frontend` directory:

```sh
cd frontend
```

Install the dependencies:

```sh
pnpm install
```

Generate the API client:

```sh
pnpm generate:api-client
```

Run the frontend application:

```sh
pnpm dev
```

#### Formatting & Linting

Auto formatter and linter are set up in the project. To run them:

Format the code:

```sh
pnpm format
```

Lint the code:

```sh
pnpm lint
```

#### Testing

To run the tests, you can use the following command:

```sh
pnpm test
```

### Backend Development

#### Running the backend locally

To run the backend locally, you need to have Python 3.10 or higher installed on your machine.

Install [Poetry](https://python-poetry.org/docs/#installation) to manage dependencies and virtual environments.

Run the backend dependencies (database, message queues, etc.):

```sh
docker compose --profile local up deps --build --detach
```

Go to the `autogpt_platform/backend` directory:

```sh
cd backend
```

Install the dependencies:

```sh
poetry install --with dev
```

Run the backend server:

```sh
poetry run app
```

#### Formatting & Linting
Auto formatter and linter are set up in the project. To run them:

Format the code:

```sh
poetry run format
@@ -350,71 +282,14 @@ Lint the code:

poetry run lint
```

### Testing
#### Testing

To run the tests:

```sh
poetry run test
poetry run pytest -s
```

To update stored snapshots after intentional API changes:

```sh
pytest --snapshot-update
```

## Project Outline

The current project has the following main modules:

#### **blocks**

This module stores all the Agent Blocks, which are reusable components to build a graph that represents the agent's behavior.

#### **data**

This module stores the logical model that is persisted in the database.
It abstracts the database operations into functions that can be called by the service layer.
Any code that interacts with Prisma objects or the database should reside in this module.
The main models are:

* `block`: anything related to the blocks used in the graph
* `execution`: anything related to the execution of the graph
* `graph`: anything related to the graph, its nodes, and their relations

#### **execution**

This module stores the business logic of executing the graph.
It currently has the following main modules:

* `manager`: A service that consumes the queue of graph executions and executes the graph. It contains both pieces of logic.
* `scheduler`: A service that triggers scheduled graph executions based on a cron expression. It pushes an execution request to the manager.

#### **server**

This module stores the logic for the server API.
It contains all the logic used for the API that allows the client to create, execute, and monitor the graph and its execution.
This API service interacts with other services like those defined in `manager` and `scheduler`.

#### **utils**

This module stores utility functions that are used across the project.
Currently, it has two main modules:

* `process`: A module that contains the logic to spawn a new process.
* `service`: A module that serves as a parent class for all the services in the project.

## Service Communication

Currently, there are only 3 active services:

- AgentServer (the API, defined in `server.py`)
- ExecutionManager (the executor, defined in `manager.py`)
- Scheduler (the scheduler, defined in `scheduler.py`)

The services run in independent Python processes and communicate through IPC.
A communication layer (`service.py`) is created to decouple the communication library from the implementation.

Currently, the IPC is done using Pyro5 and abstracted in a way that allows a function decorated with `@expose` to be called from a different process.
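The `@expose` idea can be illustrated with a minimal stand-in. This is a sketch of the pattern only, not the actual `service.py` (which wraps Pyro5 so exposed methods become remotely callable); the service and method names here are hypothetical:

```python
# Sketch of the @expose pattern: a decorator marks which methods are part of
# the service's cross-process interface, and the communication layer can then
# discover and register only those methods. Names below are illustrative.
def expose(func):
    """Mark a method as callable from another process."""
    func._exposed = True
    return func


class ExecutionManager:
    @expose
    def add_execution(self, graph_id: str) -> str:
        # Hypothetical exposed entry point: queue a graph for execution.
        return f"queued:{graph_id}"

    def _internal_cleanup(self):
        # Not decorated, so it stays process-local.
        pass


def exposed_methods(service) -> list[str]:
    """What a communication layer would register with the IPC library."""
    return [
        name for name in dir(service)
        if getattr(getattr(service, name), "_exposed", False)
    ]


print(exposed_methods(ExecutionManager()))  # ['add_execution']
```

In the real layer, the registered methods are proxied over Pyro5, so a caller in another process invokes `add_execution` as if it were local.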

## Adding a New Agent Block

To add a new agent block, you need to create a new class that inherits from `Block` and provides the following information:
@@ -424,4 +299,5 @@ To add a new agent block, you need to create a new class that inherits from `Blo

* `run` method: the main logic of the block.
* `test_input` & `test_output`: the sample input and output data for the block, which will be used to auto-test the block.
* You can mock the functions declared in the block using the `test_mock` field for your unit tests.
* Once you finish creating the block, you can test it by running `poetry run pytest -s test/block/test_block.py`.
* Once you finish creating the block, you can test it by running `poetry run pytest backend/blocks/test/test_block.py -s`.
* Create a Pull Request to the `dev` branch of the repository with your changes so you can share it with the community :)
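Putting the pieces above together, a hypothetical block might look like the sketch below. The real `Block` base class lives in the backend package and has a richer interface (schemas, IDs, `test_mock`); a bare stand-in is defined here so the example is self-contained, and all names are illustrative:

```python
# Hedged sketch of a new agent block, following the fields listed above.
# `Block` here is a stand-in for the platform's real base class.
class Block:  # stand-in, not the actual backend Block class
    pass


class ReverseTextBlock(Block):
    # Placeholder ID; real blocks use a unique UUID.
    id = "00000000-0000-0000-0000-000000000001"

    # Sample input/output used to auto-test the block.
    test_input = {"text": "hello"}
    test_output = {"reversed": "olleh"}

    def run(self, input_data: dict) -> dict:
        # Main logic of the block: reverse the input text.
        return {"reversed": input_data["text"][::-1]}


block = ReverseTextBlock()
print(block.run(block.test_input))  # {'reversed': 'olleh'}
```

The auto-test harness described above would call `run` with `test_input` and compare the result against `test_output`.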

@@ -19,6 +19,7 @@ nav:
  - Agent Blocks: platform/agent-blocks.md
  - Build your own Blocks: platform/new_blocks.md
  - Using Ollama: platform/ollama.md
  - Using AI/ML API: platform/aimlapi.md
  - Using D-ID: platform/d_id.md
  - Blocks: platform/blocks/blocks.md
- Contributing: