Mirror of https://github.com/Significant-Gravitas/AutoGPT.git, synced 2026-04-08 03:00:28 -04:00
Revert "feat(docker): add frontend service to docker-compose with env config improvements" (#10577)
Reverts Significant-Gravitas/AutoGPT#10536 to bring the platform back up due to this error:

```
Error creating Supabase client Error: @supabase/ssr: Your project's URL and API key are required to create a Supabase client!

Check your Supabase project's API settings to find these values

https://supabase.com/dashboard/project/_/settings/api

    at <unknown> (https://supabase.com/dashboard/project/_/settings/api)
    at bX (.next/server/chunks/3873.js:6:90688)
    at <unknown> (.next/server/chunks/150.js:6:13460)
    at n (.next/server/chunks/150.js:6:13419)
    at o (.next/server/chunks/150.js:6:14187)
⨯ Error: Your project's URL and Key are required to create a Supabase client!

Check your Supabase project's API settings to find these values

https://supabase.com/dashboard/project/_/settings/api

    at <unknown> (https://supabase.com/dashboard/project/_/settings/api)
    at bY (.next/server/chunks/3006.js:10:486)
    at g (.next/server/app/(platform)/auth/callback/route.js:1:5890)
    at async e (.next/server/chunks/9836.js:1:101814)
    at async k (.next/server/chunks/9836.js:1:15611)
    at async l (.next/server/chunks/9836.js:1:15817) {
  digest: '424987633'
}
Error creating Supabase client Error: @supabase/ssr: Your project's URL and API key are required to create a Supabase client!
    at <unknown> (https://supabase.com/dashboard/project/_/settings/api)
    at bX (.next/server/chunks/3873.js:6:90688)
    at <unknown> (.next/server/chunks/150.js:6:13460)
    at n (.next/server/chunks/150.js:6:13419)
    at j (.next/server/chunks/150.js:6:7482)
Error creating Supabase client Error: @supabase/ssr: Your project's URL and API key are required to create a Supabase client!
    at <unknown> (https://supabase.com/dashboard/project/_/settings/api)
    at bX (.next/server/chunks/3873.js:6:90688)
    at <unknown> (.next/server/chunks/150.js:6:13460)
    at n (.next/server/chunks/150.js:6:13419)
    at h (.next/server/chunks/150.js:6:10561)
Error creating Supabase client Error: @supabase/ssr: Your project's URL and API key are required to create a Supabase client!
    at <unknown> (https://supabase.com/dashboard/project/_/settings/api)
    at bX (.next/server/chunks/3873.js:6:90688)
    at <unknown> (.next/server/chunks/150.js:6:13460)
    at n (.next/server/chunks/150.js:6:13419)
```
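The crash above happens because `@supabase/ssr` is called with undefined `NEXT_PUBLIC_SUPABASE_URL` / `NEXT_PUBLIC_SUPABASE_ANON_KEY` values at runtime. A hypothetical guard (not the project's code; the function name is illustrative) that would surface the misconfiguration before handing values to the Supabase SDK might look like:

```typescript
// Hypothetical sketch: fail fast when the public Supabase variables are
// missing, instead of crashing inside @supabase/ssr at request time.
function getSupabaseConfig(env: Record<string, string | undefined>): {
  url: string;
  anonKey: string;
} {
  const url = env.NEXT_PUBLIC_SUPABASE_URL;
  const anonKey = env.NEXT_PUBLIC_SUPABASE_ANON_KEY;
  if (!url || !anonKey) {
    // Mirrors the message the platform hit after #10536 landed.
    throw new Error(
      "Your project's URL and API key are required to create a Supabase client!",
    );
  }
  return { url, anonKey };
}
```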
`.github/workflows/platform-frontend-ci.yml`
```diff
@@ -195,7 +195,7 @@ jobs:
       - name: Run docker compose
         run: |
-          NEXT_PUBLIC_PW_TEST=true docker compose -f ../docker-compose.yml up -d
+          docker compose -f ../docker-compose.yml up -d
         env:
           DOCKER_BUILDKIT: 1
           BUILDX_CACHE_FROM: type=local,src=/tmp/.buildx-cache
```
```diff
@@ -258,6 +258,8 @@ jobs:
       - name: Build frontend
         run: pnpm build --turbo
         # uses Turbopack, much faster and safe enough for a test pipeline
+        env:
+          NEXT_PUBLIC_PW_TEST: true

       - name: Install Browser 'chromium'
         run: pnpm playwright install --with-deps chromium
```
```diff
@@ -8,6 +8,7 @@ Welcome to the AutoGPT Platform - a powerful system for creating and running AI

 - Docker
 - Docker Compose V2 (comes with Docker Desktop, or can be installed separately)
+- Node.js & NPM (for running the frontend application)

 ### Running the System
```
````diff
@@ -36,7 +37,44 @@ To run the AutoGPT Platform, follow these steps:

    This command will start all the necessary backend services defined in the `docker-compose.yml` file in detached mode.

-4. After all the services are in ready state, open your browser and navigate to `http://localhost:3000` to access the AutoGPT Platform frontend.
+4. Navigate to `frontend` within the `autogpt_platform` directory:
+
+   ```
+   cd frontend
+   ```
+
+   You will need to run your frontend application separately on your local machine.
+
+5. Run the following command:
+
+   ```
+   cp .env.example .env.local
+   ```
+
+   This command will copy the `.env.example` file to `.env.local` in the `frontend` directory. You can modify the `.env.local` within this folder to add your own environment variables for the frontend application.
+
+6. Enable corepack and install dependencies by running:
+
+   ```
+   corepack enable
+   pnpm i
+   ```
+
+   Generate the API client (this step is required before running the frontend):
+
+   ```
+   pnpm generate:api-client
+   ```
+
+   Then start the frontend application in development mode:
+
+   ```
+   pnpm dev
+   ```
+
+7. Open your browser and navigate to `http://localhost:3000` to access the AutoGPT Platform frontend.

 ### Docker Compose Commands
````
```diff
@@ -277,47 +277,34 @@ services:
     networks:
       - app-network

-# Frontend environment variables
-x-frontend-vars: &frontend-vars
-  NEXT_PUBLIC_SUPABASE_URL: http://localhost:8000
-  NEXT_PUBLIC_SUPABASE_ANON_KEY: eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyAgCiAgICAicm9sZSI6ICJhbm9uIiwKICAgICJpc3MiOiAic3VwYWJhc2UtZGVtbyIsCiAgICAiaWF0IjogMTY0MTc2OTIwMCwKICAgICJleHAiOiAxNzk5NTM1NjAwCn0.dc_X5iR_VP_qT0zsiyj_I_OZ2T9FtRU2BBNWN8Bu4GE
-  NEXT_PUBLIC_AGPT_SERVER_URL: http://localhost:8006/api
-  NEXT_PUBLIC_AGPT_WS_SERVER_URL: ws://localhost:8001/ws
-  NEXT_PUBLIC_FRONTEND_BASE_URL: http://localhost:3000
-  NEXT_PUBLIC_BEHAVE_AS: LOCAL
-  NEXT_PUBLIC_LAUNCHDARKLY_ENABLED: false
-  NEXT_PUBLIC_LAUNCHDARKLY_CLIENT_ID: 687ab1372f497809b131e06e
-  NEXT_PUBLIC_APP_ENV: local
-  NEXT_PUBLIC_DEFAULT_LOCALE: en
-  NEXT_PUBLIC_LOCALES: en,es
-  NEXT_PUBLIC_SHOW_BILLING_PAGE: false
-  NEXT_PUBLIC_TURNSTILE: disabled
-  NEXT_PUBLIC_REACT_QUERY_DEVTOOL: true
-  NEXT_PUBLIC_GA_MEASUREMENT_ID: G-FH2XK2W4GN
-  NEXT_PUBLIC_PW_TEST: "${NEXT_PUBLIC_PW_TEST:-}"
+  # frontend:
+  #   build:
+  #     context: ../
+  #     dockerfile: autogpt_platform/frontend/Dockerfile
+  #     target: dev
+  #   depends_on:
+  #     db:
+  #       condition: service_healthy
+  #     rest_server:
+  #       condition: service_started
+  #     websocket_server:
+  #       condition: service_started
+  #     migrate:
+  #       condition: service_completed_successfully
+  #   environment:
+  #     - NEXT_PUBLIC_SUPABASE_URL=http://kong:8000
+  #     - NEXT_PUBLIC_SUPABASE_ANON_KEY=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyAgCiAgICAicm9sZSI6ICJhbm9uIiwKICAgICJpc3MiOiAic3VwYWJhc2UtZGVtbyIsCiAgICAiaWF0IjogMTY0MTc2OTIwMCwKICAgICJleHAiOiAxNzk5NTM1NjAwCn0.dc_X5iR_VP_qT0zsiyj_I_OZ2T9FtRU2BBNWN8Bu4GE
+  #     - DATABASE_URL=postgresql://agpt_user:pass123@postgres:5432/postgres?connect_timeout=60&schema=platform
+  #     - DIRECT_URL=postgresql://agpt_user:pass123@postgres:5432/postgres?connect_timeout=60&schema=platform
+  #     - NEXT_PUBLIC_AGPT_SERVER_URL=http://localhost:8006/api
+  #     - NEXT_PUBLIC_AGPT_WS_SERVER_URL=ws://localhost:8001/ws
+  #     - NEXT_PUBLIC_AGPT_MARKETPLACE_URL=http://localhost:8015/api/v1/market
+  #     - NEXT_PUBLIC_BEHAVE_AS=LOCAL
+  #   ports:
+  #     - "3000:3000"
+  #   networks:
+  #     - app-network

-  frontend:
-    build:
-      context: ../
-      dockerfile: autogpt_platform/frontend/Dockerfile
-      target: prod
-      args: *frontend-vars
-    depends_on:
-      db:
-        condition: service_healthy
-      migrate:
-        condition: service_completed_successfully
-    ports:
-      - "3000:3000"
-    networks:
-      - app-network
-    environment:
-      <<: *frontend-vars
-      # Server-side environment variables (Docker service names)
-      AUTH_CALLBACK_URL: http://rest_server:8006/auth/callback
-      SUPABASE_URL: http://kong:8000
-      AGPT_SERVER_URL: http://rest_server:8006/api
-      AGPT_WS_SERVER_URL: ws://websocket_server:8001/ws

 networks:
   app-network:
     driver: bridge
```
```diff
@@ -90,11 +90,11 @@ services:
       timeout: 10s
       retries: 3

-  frontend:
-    <<: *agpt-services
-    extends:
-      file: ./docker-compose.platform.yml
-      service: frontend
+  # frontend:
+  #   <<: *agpt-services
+  #   extends:
+  #     file: ./docker-compose.platform.yml
+  #     service: frontend

   # Supabase services
   studio:
```
```diff
@@ -190,17 +190,3 @@ services:
       - redis
       - rabbitmq
       - clamav
      - migrate
-
-  deps_backend:
-    <<: *agpt-services
-    profiles:
-      - local
-    image: busybox
-    command: /bin/true
-    depends_on:
-      - deps
-      - rest_server
-      - executor
-      - websocket_server
-      - database_manager
```
```diff
@@ -3,10 +3,13 @@ NEXT_PUBLIC_FRONTEND_BASE_URL=http://localhost:3000
+NEXT_PUBLIC_AUTH_CALLBACK_URL=http://localhost:8006/auth/callback
 NEXT_PUBLIC_AGPT_SERVER_URL=http://localhost:8006/api
 NEXT_PUBLIC_AGPT_WS_SERVER_URL=ws://localhost:8001/ws
+NEXT_PUBLIC_AGPT_MARKETPLACE_URL=http://localhost:8015/api/v1/market
 NEXT_PUBLIC_LAUNCHDARKLY_ENABLED=false
 NEXT_PUBLIC_LAUNCHDARKLY_CLIENT_ID=687ab1372f497809b131e06e # Local environment on Launch darkly
 NEXT_PUBLIC_APP_ENV=local

+NEXT_PUBLIC_AGPT_SERVER_BASE_URL=http://localhost:8006

 ## Locale settings

 NEXT_PUBLIC_DEFAULT_LOCALE=en
@@ -17,10 +20,15 @@ NEXT_PUBLIC_LOCALES=en,es
 NEXT_PUBLIC_SUPABASE_URL=http://localhost:8000
 NEXT_PUBLIC_SUPABASE_ANON_KEY=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyAgCiAgICAicm9sZSI6ICJhbm9uIiwKICAgICJpc3MiOiAic3VwYWJhc2UtZGVtbyIsCiAgICAiaWF0IjogMTY0MTc2OTIwMCwKICAgICJleHAiOiAxNzk5NTM1NjAwCn0.dc_X5iR_VP_qT0zsiyj_I_OZ2T9FtRU2BBNWN8Bu4GE

+## OAuth Callback URL
+## This should be {domain}/auth/callback
+## Only used if you're using Supabase and OAuth
+AUTH_CALLBACK_URL="${NEXT_PUBLIC_FRONTEND_BASE_URL}/auth/callback"
+GA_MEASUREMENT_ID=G-FH2XK2W4GN

 # When running locally, set NEXT_PUBLIC_BEHAVE_AS=CLOUD to use a locally hosted marketplace (as is typical in development, and the cloud deployment), otherwise set it to LOCAL to have the marketplace open in a new tab
 NEXT_PUBLIC_BEHAVE_AS=LOCAL
 NEXT_PUBLIC_SHOW_BILLING_PAGE=false
 NEXT_PUBLIC_GA_MEASUREMENT_ID=G-FH2XK2W4GN

 ## Cloudflare Turnstile (CAPTCHA) Configuration
 ## Get these from the Cloudflare Turnstile dashboard: https://dash.cloudflare.com/?to=/:account/turnstile
```
```diff
@@ -15,42 +15,6 @@ CMD ["pnpm", "run", "dev", "--hostname", "0.0.0.0"]

 # Build stage for prod
 FROM base AS build
-# Accept build args for NEXT_PUBLIC_* variables
-ARG NEXT_PUBLIC_SUPABASE_URL
-ARG NEXT_PUBLIC_SUPABASE_ANON_KEY
-ARG NEXT_PUBLIC_AGPT_SERVER_URL
-ARG NEXT_PUBLIC_AGPT_WS_SERVER_URL
-ARG NEXT_PUBLIC_FRONTEND_BASE_URL
-ARG NEXT_PUBLIC_BEHAVE_AS
-ARG NEXT_PUBLIC_LAUNCHDARKLY_ENABLED
-ARG NEXT_PUBLIC_LAUNCHDARKLY_CLIENT_ID
-ARG NEXT_PUBLIC_APP_ENV
-ARG NEXT_PUBLIC_DEFAULT_LOCALE
-ARG NEXT_PUBLIC_LOCALES
-ARG NEXT_PUBLIC_SHOW_BILLING_PAGE
-ARG NEXT_PUBLIC_TURNSTILE
-ARG NEXT_PUBLIC_REACT_QUERY_DEVTOOL
-ARG NEXT_PUBLIC_GA_MEASUREMENT_ID
-ARG NEXT_PUBLIC_PW_TEST
-
-# Set environment variables from build args
-ENV NEXT_PUBLIC_SUPABASE_URL=$NEXT_PUBLIC_SUPABASE_URL
-ENV NEXT_PUBLIC_SUPABASE_ANON_KEY=$NEXT_PUBLIC_SUPABASE_ANON_KEY
-ENV NEXT_PUBLIC_AGPT_SERVER_URL=$NEXT_PUBLIC_AGPT_SERVER_URL
-ENV NEXT_PUBLIC_AGPT_WS_SERVER_URL=$NEXT_PUBLIC_AGPT_WS_SERVER_URL
-ENV NEXT_PUBLIC_FRONTEND_BASE_URL=$NEXT_PUBLIC_FRONTEND_BASE_URL
-ENV NEXT_PUBLIC_BEHAVE_AS=$NEXT_PUBLIC_BEHAVE_AS
-ENV NEXT_PUBLIC_LAUNCHDARKLY_ENABLED=$NEXT_PUBLIC_LAUNCHDARKLY_ENABLED
-ENV NEXT_PUBLIC_LAUNCHDARKLY_CLIENT_ID=$NEXT_PUBLIC_LAUNCHDARKLY_CLIENT_ID
-ENV NEXT_PUBLIC_APP_ENV=$NEXT_PUBLIC_APP_ENV
-ENV NEXT_PUBLIC_DEFAULT_LOCALE=$NEXT_PUBLIC_DEFAULT_LOCALE
-ENV NEXT_PUBLIC_LOCALES=$NEXT_PUBLIC_LOCALES
-ENV NEXT_PUBLIC_SHOW_BILLING_PAGE=$NEXT_PUBLIC_SHOW_BILLING_PAGE
-ENV NEXT_PUBLIC_TURNSTILE=$NEXT_PUBLIC_TURNSTILE
-ENV NEXT_PUBLIC_REACT_QUERY_DEVTOOL=$NEXT_PUBLIC_REACT_QUERY_DEVTOOL
-ENV NEXT_PUBLIC_GA_MEASUREMENT_ID=$NEXT_PUBLIC_GA_MEASUREMENT_ID
-ENV NEXT_PUBLIC_PW_TEST=$NEXT_PUBLIC_PW_TEST
-
 COPY autogpt_platform/frontend/ .
 ENV SKIP_STORYBOOK_TESTS=true
 RUN pnpm build
```
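The ARG/ENV plumbing being removed above existed because Next.js inlines `NEXT_PUBLIC_*` reads into the client bundle at build time, so the values had to be present when `pnpm build` ran inside the image. A hypothetical illustration of that inlining (a naive text substitution, not Next.js's actual mechanism):

```typescript
// Hypothetical illustration of build-time inlining: occurrences of
// process.env.NEXT_PUBLIC_* in source text are replaced with literal
// values; anything not provided at build time stays unresolved.
function inlinePublicEnv(
  source: string,
  env: Record<string, string>,
): string {
  return source.replace(
    /process\.env\.(NEXT_PUBLIC_[A-Z_]+)/g,
    (match, name) => (name in env ? JSON.stringify(env[name]) : match),
  );
}
```

This is why a production image built without the build args crashes at runtime: the missing values were baked in as `undefined` and cannot be supplied later via `environment:`.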
```diff
@@ -45,7 +45,7 @@ export default defineConfig({
   webServer: {
     command: "pnpm start",
     url: "http://localhost:3000",
-    reuseExistingServer: true,
+    reuseExistingServer: !process.env.CI,
   },

   /* Configure projects for major browsers */
```
```diff
@@ -3,11 +3,14 @@ import {
   makeAuthenticatedFileUpload,
   makeAuthenticatedRequest,
 } from "@/lib/autogpt-server-api/helpers";
-import { getAgptServerUrl } from "@/lib/env-config";
 import { NextRequest, NextResponse } from "next/server";

 function getBackendBaseUrl() {
-  return getAgptServerUrl().replace("/api", "");
+  if (process.env.NEXT_PUBLIC_AGPT_SERVER_URL) {
+    return process.env.NEXT_PUBLIC_AGPT_SERVER_URL.replace("/api", "");
+  }
+
+  return "http://localhost:8006";
 }

 function buildBackendUrl(path: string[], queryString: string): string {
```
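The restored `getBackendBaseUrl` derives the backend origin by stripping the `/api` suffix from the configured server URL. A standalone sketch of the same logic, with the env value passed as a parameter so it is easy to exercise (assumes the configured URL ends in `/api`, as the defaults in this diff do):

```typescript
// Sketch of the restored fallback: strip "/api" from the configured
// server URL, or fall back to the local default.
function getBackendBaseUrl(serverUrl?: string): string {
  if (serverUrl) {
    // Note: String.replace removes only the first "/api" occurrence.
    return serverUrl.replace("/api", "");
  }
  return "http://localhost:8006";
}
```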
```diff
@@ -28,7 +28,7 @@ export default async function RootLayout({
     >
       <head>
         <GoogleAnalytics
-          gaId={process.env.NEXT_PUBLIC_GA_MEASUREMENT_ID || "G-FH2XK2W4GN"} // This is the measurement Id for the Google Analytics dev project
+          gaId={process.env.GA_MEASUREMENT_ID || "G-FH2XK2W4GN"} // This is the measurement Id for the Google Analytics dev project
         />
       </head>
       <body>
```
```diff
@@ -3,12 +3,6 @@ import { getServerSupabase } from "@/lib/supabase/server/getServerSupabase";
 import { createBrowserClient } from "@supabase/ssr";
 import type { SupabaseClient } from "@supabase/supabase-js";
 import { Key, storage } from "@/services/storage/local-storage";
-import {
-  getAgptServerUrl,
-  getAgptWsServerUrl,
-  getSupabaseUrl,
-  getSupabaseAnonKey,
-} from "@/lib/env-config";
 import * as Sentry from "@sentry/nextjs";
 import type {
   AddUserCreditsResponse,
@@ -92,8 +86,10 @@ export default class BackendAPI {
   heartbeatTimeoutID: number | null = null;

   constructor(
-    baseUrl: string = getAgptServerUrl(),
-    wsUrl: string = getAgptWsServerUrl(),
+    baseUrl: string = process.env.NEXT_PUBLIC_AGPT_SERVER_URL ||
+      "http://localhost:8006/api",
+    wsUrl: string = process.env.NEXT_PUBLIC_AGPT_WS_SERVER_URL ||
+      "ws://localhost:8001/ws",
   ) {
     this.baseUrl = baseUrl;
     this.wsUrl = wsUrl;
@@ -101,9 +97,11 @@ export default class BackendAPI {

   private async getSupabaseClient(): Promise<SupabaseClient | null> {
     return isClient
-      ? createBrowserClient(getSupabaseUrl(), getSupabaseAnonKey(), {
-          isSingleton: true,
-        })
+      ? createBrowserClient(
+          process.env.NEXT_PUBLIC_SUPABASE_URL!,
+          process.env.NEXT_PUBLIC_SUPABASE_ANON_KEY!,
+          { isSingleton: true },
+        )
       : await getServerSupabase();
   }
```
```diff
@@ -1,6 +1,5 @@
 import { getServerSupabase } from "@/lib/supabase/server/getServerSupabase";
 import { Key, storage } from "@/services/storage/local-storage";
-import { getAgptServerUrl } from "@/lib/env-config";
 import { isServerSide } from "../utils/is-server-side";

 import { GraphValidationErrorResponse } from "./types";
@@ -57,7 +56,9 @@ export function buildClientUrl(path: string): string {
 }

 export function buildServerUrl(path: string): string {
-  return `${getAgptServerUrl()}${path}`;
+  const baseUrl =
+    process.env.NEXT_PUBLIC_AGPT_SERVER_URL || "http://localhost:8006/api";
+  return `${baseUrl}${path}`;
 }

 export function buildUrlWithQuery(
```
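The restored `buildServerUrl` is a plain env-or-default concatenation. A standalone sketch with the env value as an explicit parameter so the default path is easy to test:

```typescript
// Sketch of the restored buildServerUrl: prefix the path with the
// configured server URL, or the local default when none is set.
function buildServerUrl(path: string, envBaseUrl?: string): string {
  const baseUrl = envBaseUrl || "http://localhost:8006/api";
  return `${baseUrl}${path}`;
}
```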
```diff
@@ -6,7 +6,8 @@ import {
   makeAuthenticatedFileUpload,
   makeAuthenticatedRequest,
 } from "./helpers";
-import { getAgptServerUrl } from "@/lib/env-config";
+
+const DEFAULT_BASE_URL = "http://localhost:8006/api";

 export interface ProxyRequestOptions {
   method: "GET" | "POST" | "PUT" | "PATCH" | "DELETE";
@@ -20,7 +21,7 @@ export async function proxyApiRequest({
   method,
   path,
   payload,
-  baseUrl = getAgptServerUrl(),
+  baseUrl = process.env.NEXT_PUBLIC_AGPT_SERVER_URL || DEFAULT_BASE_URL,
   contentType = "application/json",
 }: ProxyRequestOptions) {
   return await Sentry.withServerActionInstrumentation(
@@ -36,7 +37,8 @@ export async function proxyApiRequest({
 export async function proxyFileUpload(
   path: string,
   formData: FormData,
-  baseUrl = getAgptServerUrl(),
+  baseUrl = process.env.NEXT_PUBLIC_AGPT_SERVER_URL ||
+    "http://localhost:8006/api",
 ): Promise<string> {
   return await Sentry.withServerActionInstrumentation(
     "proxyFileUpload",
```
```diff
@@ -1,53 +0,0 @@
-/**
- * Environment configuration helper
- *
- * Provides unified access to environment variables with server-side priority.
- * Server-side code uses Docker service names, client-side falls back to localhost.
- */
-
-/**
- * Gets the AGPT server URL with server-side priority
- * Server-side: Uses AGPT_SERVER_URL (http://rest_server:8006/api)
- * Client-side: Falls back to NEXT_PUBLIC_AGPT_SERVER_URL (http://localhost:8006/api)
- */
-export function getAgptServerUrl(): string {
-  return (
-    process.env.AGPT_SERVER_URL ||
-    process.env.NEXT_PUBLIC_AGPT_SERVER_URL ||
-    "http://localhost:8006/api"
-  );
-}
-
-/**
- * Gets the AGPT WebSocket URL with server-side priority
- * Server-side: Uses AGPT_WS_SERVER_URL (ws://websocket_server:8001/ws)
- * Client-side: Falls back to NEXT_PUBLIC_AGPT_WS_SERVER_URL (ws://localhost:8001/ws)
- */
-export function getAgptWsServerUrl(): string {
-  return (
-    process.env.AGPT_WS_SERVER_URL ||
-    process.env.NEXT_PUBLIC_AGPT_WS_SERVER_URL ||
-    "ws://localhost:8001/ws"
-  );
-}
-
-/**
- * Gets the Supabase URL with server-side priority
- * Server-side: Uses SUPABASE_URL (http://kong:8000)
- * Client-side: Falls back to NEXT_PUBLIC_SUPABASE_URL (http://localhost:8000)
- */
-export function getSupabaseUrl(): string {
-  return (
-    process.env.SUPABASE_URL ||
-    process.env.NEXT_PUBLIC_SUPABASE_URL ||
-    "http://localhost:8000"
-  );
-}
-
-/**
- * Gets the Supabase anon key
- * Uses NEXT_PUBLIC_SUPABASE_ANON_KEY since anon keys are public and same across environments
- */
-export function getSupabaseAnonKey(): string {
-  return process.env.NEXT_PUBLIC_SUPABASE_ANON_KEY || "";
-}
```
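Every helper in the deleted `env-config.ts` follows the same resolution order: server-side variable, then the `NEXT_PUBLIC_*` variable, then a localhost default. A minimal re-statement of that ordering (the default value below is one of the diff's own):

```typescript
// The deleted helpers all reduce to this server-first resolution order.
function resolveUrl(
  serverValue: string | undefined,
  publicValue: string | undefined,
  fallback: string,
): string {
  return serverValue || publicValue || fallback;
}
```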
```diff
@@ -4,7 +4,6 @@ import { User } from "@supabase/supabase-js";
 import { usePathname, useRouter } from "next/navigation";
 import { useEffect, useMemo, useRef, useState } from "react";
 import { useBackendAPI } from "@/lib/autogpt-server-api/context";
-import { getSupabaseUrl, getSupabaseAnonKey } from "@/lib/env-config";
 import {
   getCurrentUser,
   refreshSession,
@@ -33,12 +32,16 @@ export function useSupabase() {

   const supabase = useMemo(() => {
     try {
-      return createBrowserClient(getSupabaseUrl(), getSupabaseAnonKey(), {
-        isSingleton: true,
-        auth: {
-          persistSession: false, // Don't persist session on client with httpOnly cookies
-        },
-      });
+      return createBrowserClient(
+        process.env.NEXT_PUBLIC_SUPABASE_URL!,
+        process.env.NEXT_PUBLIC_SUPABASE_ANON_KEY!,
+        {
+          isSingleton: true,
+          auth: {
+            persistSession: false, // Don't persist session on client with httpOnly cookies
+          },
+        },
+      );
     } catch (error) {
       console.error("Error creating Supabase client", error);
       return null;
```
```diff
@@ -1,43 +1,47 @@
 import { createServerClient } from "@supabase/ssr";
 import { NextResponse, type NextRequest } from "next/server";
 import { getCookieSettings, isAdminPage, isProtectedPage } from "./helpers";
-import { getSupabaseUrl, getSupabaseAnonKey } from "../env-config";

 export async function updateSession(request: NextRequest) {
   let supabaseResponse = NextResponse.next({
     request,
   });

-  const supabaseUrl = getSupabaseUrl();
-  const supabaseKey = getSupabaseAnonKey();
-  const isAvailable = Boolean(supabaseUrl && supabaseKey);
+  const isAvailable = Boolean(
+    process.env.NEXT_PUBLIC_SUPABASE_URL &&
+      process.env.NEXT_PUBLIC_SUPABASE_ANON_KEY,
+  );

   if (!isAvailable) {
     return supabaseResponse;
   }

   try {
-    const supabase = createServerClient(supabaseUrl, supabaseKey, {
-      cookies: {
-        getAll() {
-          return request.cookies.getAll();
-        },
-        setAll(cookiesToSet) {
-          cookiesToSet.forEach(({ name, value }) =>
-            request.cookies.set(name, value),
-          );
-          supabaseResponse = NextResponse.next({
-            request,
-          });
-          cookiesToSet.forEach(({ name, value, options }) => {
-            supabaseResponse.cookies.set(name, value, {
-              ...options,
-              ...getCookieSettings(),
-            });
-          });
-        },
-      },
-    });
+    const supabase = createServerClient(
+      process.env.NEXT_PUBLIC_SUPABASE_URL!,
+      process.env.NEXT_PUBLIC_SUPABASE_ANON_KEY!,
+      {
+        cookies: {
+          getAll() {
+            return request.cookies.getAll();
+          },
+          setAll(cookiesToSet) {
+            cookiesToSet.forEach(({ name, value }) =>
+              request.cookies.set(name, value),
+            );
+            supabaseResponse = NextResponse.next({
+              request,
+            });
+            cookiesToSet.forEach(({ name, value, options }) => {
+              supabaseResponse.cookies.set(name, value, {
+                ...options,
+                ...getCookieSettings(),
+              });
+            });
+          },
+        },
+      },
+    );

     const userResponse = await supabase.auth.getUser();
     const user = userResponse.data.user;
```
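The availability gate restored above is what lets the middleware degrade gracefully instead of crashing: it only attempts to create a Supabase client when both public variables are set. A standalone sketch of that check (env passed as a parameter for testability):

```typescript
// Sketch of the restored availability gate from updateSession.
function isSupabaseAvailable(env: Record<string, string | undefined>): boolean {
  return Boolean(
    env.NEXT_PUBLIC_SUPABASE_URL && env.NEXT_PUBLIC_SUPABASE_ANON_KEY,
  );
}
```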
```diff
@@ -1,6 +1,5 @@
 import { createServerClient, type CookieOptions } from "@supabase/ssr";
 import { getCookieSettings } from "../helpers";
-import { getSupabaseUrl, getSupabaseAnonKey } from "../../env-config";

 type Cookies = { name: string; value: string; options?: CookieOptions }[];

@@ -12,8 +11,8 @@ export async function getServerSupabase() {

   try {
     const supabase = createServerClient(
-      getSupabaseUrl(),
-      getSupabaseAnonKey(),
+      process.env.NEXT_PUBLIC_SUPABASE_URL!,
+      process.env.NEXT_PUBLIC_SUPABASE_ANON_KEY!,
       {
         cookies: {
           getAll() {
```
```diff
@@ -2,7 +2,6 @@
  * Utility functions for working with Cloudflare Turnstile
  */
 import { BehaveAs, getBehaveAs } from "@/lib/utils";
-import { getAgptServerUrl } from "@/lib/env-config";

 export async function verifyTurnstileToken(
   token: string,
@@ -20,16 +19,19 @@ export async function verifyTurnstileToken(
   }

   try {
-    const response = await fetch(`${getAgptServerUrl()}/turnstile/verify`, {
-      method: "POST",
-      headers: {
-        "Content-Type": "application/json",
-      },
-      body: JSON.stringify({
-        token,
-        action,
-      }),
-    });
+    const response = await fetch(
+      `${process.env.NEXT_PUBLIC_AGPT_SERVER_URL}/turnstile/verify`,
+      {
+        method: "POST",
+        headers: {
+          "Content-Type": "application/json",
+        },
+        body: JSON.stringify({
+          token,
+          action,
+        }),
+      },
+    );

     if (!response.ok) {
       console.error("Turnstile verification failed:", await response.text());
```
````diff
@@ -107,9 +107,9 @@ If you get stuck, follow [this guide](https://docs.github.com/en/repositories/cr

 Once that's complete you can continue the setup process.

-### Running the AutoGPT Platform
+### Running the backend services

-To run the platform, follow these steps:
+To run the backend services, follow these steps:

 * Navigate to the `autogpt_platform` directory inside the AutoGPT folder:
   ```bash
@@ -120,13 +120,40 @@ To run the platform, follow these steps:
   ```
   cp .env.example .env
   ```
-  This command will copy the `.env.example` file to `.env` in the `autogpt_platform` directory. You can modify the `.env` file to add your own environment variables.
+  This command will copy the `.env.example` file to `.env` in the `supabase` directory. You can modify the `.env` file to add your own environment variables.

-* Run the platform services:
+* Run the backend services:
   ```
   docker compose up -d --build
   ```
-  This command will start all the necessary backend services defined in the `docker-compose.yml` file in detached mode.
+  This command will start all the necessary backend services defined in the `docker-compose.combined.yml` file in detached mode.

+### Running the frontend application
+
+To run the frontend application open a new terminal and follow these steps:
+
+- Navigate to `frontend` folder within the `autogpt_platform` directory:
+
+  ```
+  cd frontend
+  ```
+
+- Copy the `.env.example` file available in the `frontend` directory to `.env` in the same directory:
+
+  ```
+  cp .env.example .env
+  ```
+
+  You can modify the `.env` within this folder to add your own environment variables for the frontend application.
+
+- Run the following command:
+
+  ```
+  corepack enable
+  pnpm install
+  pnpm dev
+  ```
+
+  This command will enable corepack, install the necessary dependencies with pnpm, and start the frontend application in development mode.

 ### Checking if the application is running
````
````diff
@@ -158,6 +185,127 @@ poetry run cli gen-encrypt-key

 Then, replace the existing key in the `autogpt_platform/backend/.env` file with the new one.

+!!! Note
+    *The steps below are an alternative to [Running the backend services](#running-the-backend-services)*
+
+<details>
+<summary><strong>Alternate Steps</strong></summary>
+
+#### AutoGPT Agent Server (OLD)
+This is an initial project for creating the next generation of agent execution, which is an AutoGPT agent server.
+The agent server will enable the creation of composite multi-agent systems that utilize AutoGPT agents and other non-agent components as its primitives.
+
+##### Docs
+
+You can access the docs for the [AutoGPT Agent Server here](https://docs.agpt.co/#1-autogpt-server).
+
+##### Setup
+
+We use Poetry to manage the dependencies. To set up the project, follow these steps inside this directory:
+
+0. Install Poetry
+
+   ```sh
+   pip install poetry
+   ```
+
+1. Configure Poetry to use .venv in your project directory
+
+   ```sh
+   poetry config virtualenvs.in-project true
+   ```
+
+2. Enter the poetry shell
+
+   ```sh
+   poetry shell
+   ```
+
+3. Install dependencies
+
+   ```sh
+   poetry install
+   ```
+
+4. Copy .env.example to .env
+
+   ```sh
+   cp .env.example .env
+   ```
+
+5. Generate the Prisma client
+
+   ```sh
+   poetry run prisma generate
+   ```
+
+   > In case Prisma generates the client for the global Python installation instead of the virtual environment, the current mitigation is to just uninstall the global Prisma package:
+   >
+   > ```sh
+   > pip uninstall prisma
+   > ```
+   >
+   > Then run the generation again. The path *should* look something like this:
+   > `<some path>/pypoetry/virtualenvs/backend-TQIRSwR6-py3.12/bin/prisma`
+
+6. Migrate the database. Be careful because this deletes current data in the database.
+
+   ```sh
+   docker compose up db -d
+   poetry run prisma migrate deploy
+   ```
+
+</details>
+
+### Starting the AutoGPT server without Docker
+
+To run the server locally, start in the autogpt_platform folder:
+
+```sh
+cd ..
+```
+
+Run the following command to run the database in Docker but the application locally:
+
+```sh
+docker compose --profile local up deps --build --detach
+cd backend
+poetry run app
+```
+
+### Starting the AutoGPT server with Docker
+
+Run the following command to build the dockerfiles:
+
+```sh
+docker compose build
+```
+
+Run the following command to run the app:
+
+```sh
+docker compose up
+```
+
+Run the following to automatically rebuild when code changes, in another terminal:
+
+```sh
+docker compose watch
+```
+
+Run the following command to shut down:
+
+```sh
+docker compose down
+```
+
+If you run into issues with dangling orphans, try:
+
+```sh
+docker compose down --volumes --remove-orphans && docker-compose up --force-recreate --renew-anon-volumes --remove-orphans
+```

 ### 📌 Windows Installation Note

 When installing Docker on Windows, it is **highly recommended** to select **WSL 2** instead of Hyper-V. Using Hyper-V can cause compatibility issues with Supabase, leading to the `supabase-db` container being marked as **unhealthy**.
````
@@ -184,92 +332,14 @@ For more details, refer to [Docker's official documentation](https://docs.docker

## Development

### Frontend Development

#### Running the frontend locally

To run the frontend locally, you need to have Node.js and PNPM installed on your machine.

Install [Node.js](https://nodejs.org/en/download/) to manage dependencies and run the frontend application.

Install [PNPM](https://pnpm.io/installation) to manage the frontend dependencies.

Run the service dependencies (backend, database, message queues, etc.):

```sh
docker compose --profile local up deps_backend --build --detach
```

Go to the `autogpt_platform/frontend` directory:

```sh
cd frontend
```

Install the dependencies:

```sh
pnpm install
```

Generate the API client:

```sh
pnpm generate:api-client
```

Run the frontend application:

```sh
pnpm dev
```

#### Formatting & Linting

An auto-formatter and a linter are set up in the project.

Format the code:

```sh
pnpm format
```

Lint the code:

```sh
pnpm lint
```

#### Testing

To run the tests:

```sh
pnpm test
```

### Backend Development

#### Running the backend locally

To run the backend locally, you need to have Python 3.10 or higher installed on your machine.

Install [Poetry](https://python-poetry.org/docs/#installation) to manage dependencies and virtual environments.

Run the backend dependencies (database, message queues, etc.):

```sh
docker compose --profile local up deps --build --detach
```

Go to the `autogpt_platform/backend` directory:

```sh
cd backend
```

Install the dependencies:

```sh
poetry install --with dev
```

Run the backend server:

```sh
poetry run app
```

#### Formatting & Linting

An auto-formatter and a linter are set up in the project.

Format the code:

```sh
poetry run format
```

Lint the code:

```sh
poetry run lint
```

#### Testing

To run the tests:

```sh
poetry run test
```

To update stored snapshots after intentional API changes:

```sh
pytest --snapshot-update
```

## Project Outline

The current project has the following main modules:

#### **blocks**

This module stores all the Agent Blocks, which are reusable components used to build a graph that represents the agent's behavior.

#### **data**

This module stores the logical model that is persisted in the database.
It abstracts the database operations into functions that can be called by the service layer.
Any code that interacts with Prisma objects or the database should reside in this module.
The main models are:

* `block`: anything related to the blocks used in the graph
* `execution`: anything related to graph execution
* `graph`: anything related to the graph, its nodes, and their relations

#### **execution**

This module stores the business logic for executing the graph.
It currently has the following main modules:

* `manager`: A service that consumes the queue of graph executions and executes the graphs; it contains both the consuming and the executing logic.
* `scheduler`: A service that triggers scheduled graph executions based on a cron expression. It pushes an execution request to the manager.
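
The manager/scheduler split can be illustrated with a minimal queue-consumer sketch (class and method names are illustrative, not the project's actual ones):

```python
# Illustrative sketch of the pattern: the scheduler (or the API) pushes
# execution requests onto a queue, and the manager drains the queue and
# "executes" each graph.
import queue


class ExecutionManager:
    def __init__(self) -> None:
        self.requests: queue.Queue[str] = queue.Queue()
        self.completed: list[str] = []

    def submit(self, graph_id: str) -> None:
        """Called by the scheduler or the API to request an execution."""
        self.requests.put(graph_id)

    def run_pending(self) -> None:
        """Consume the queue; real code would execute the graph here."""
        while not self.requests.empty():
            self.completed.append(self.requests.get())


manager = ExecutionManager()
manager.submit("graph-1")
manager.submit("graph-2")
manager.run_pending()
```

The queue decouples the producers (scheduler, API) from the consumer (manager), so they can run in separate processes.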

#### **server**

This module stores the logic for the server API.
It contains all the logic used for the API that allows the client to create, execute, and monitor the graph and its executions.
This API service interacts with other services such as those defined in `manager` and `scheduler`.

#### **utils**

This module stores utility functions that are used across the project.
Currently, it has two main modules:

* `process`: A module that contains the logic to spawn a new process.
* `service`: A module that serves as a parent class for all the services in the project.

## Service Communication

Currently, there are only 3 active services:

- AgentServer (the API, defined in `server.py`)
- ExecutionManager (the executor, defined in `manager.py`)
- Scheduler (the scheduler, defined in `scheduler.py`)

The services run in independent Python processes and communicate through IPC.
A communication layer (`service.py`) decouples the communication library from the implementation.

Currently, the IPC is done using Pyro5 and abstracted in a way that allows a function decorated with `@expose` to be called from a different process.
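
The decoupling idea can be shown with a stdlib stand-in (this is not the actual Pyro5 wiring; the decorator and dispatcher here are simplified imitations of what the communication layer does):

```python
# Simplified imitation of the @expose pattern: only methods marked by the
# decorator may be invoked through the dispatch layer, which stands in for
# the IPC transport on the receiving side.
def expose(func):
    """Mark a method as callable across the IPC boundary."""
    func._exposed = True
    return func


class Service:
    @expose
    def ping(self) -> str:
        return "pong"

    def _internal(self) -> str:  # not exposed: remote calls must be rejected
        return "secret"


def dispatch(service: object, method: str, *args):
    """What the communication layer does when a remote call arrives."""
    func = getattr(service, method, None)
    if func is None or not getattr(func, "_exposed", False):
        raise PermissionError(f"method {method!r} is not exposed")
    return func(*args)
```

Because callers only see the decorator and the dispatch interface, the underlying transport (here a direct call, in the project Pyro5) can be swapped without touching service code.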

## Adding a New Agent Block

To add a new agent block, you need to create a new class that inherits from `Block` and provides the following information:

* `run` method: the main logic of the block.
* `test_input` & `test_output`: the sample input and output data for the block, which will be used to auto-test the block.
* You can mock the functions declared in the block using the `test_mock` field for your unit tests.
* Once you finish creating the block, you can test it by running `poetry run pytest backend/blocks/test/test_block.py -s`.
* Create a Pull Request to the `dev` branch of the repository with your changes so you can share it with the community :)
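
The shape of a block can be sketched with a self-contained toy (the real `Block` base class lives in the backend package and has a richer interface; this sketch only mirrors the fields mentioned above, and `WordCountBlock` is a hypothetical example):

```python
# Toy stand-in for the Block pattern: a subclass supplies run(), plus
# test_input/test_output samples that a harness can use to auto-test it.
class Block:
    test_input: dict = {}
    test_output: dict = {}

    def run(self, input_data: dict) -> dict:
        raise NotImplementedError

    def auto_test(self) -> bool:
        """Roughly what the block test harness does: run on the sample
        input and compare the result against the sample output."""
        return self.run(self.test_input) == self.test_output


class WordCountBlock(Block):
    test_input = {"text": "hello agent world"}
    test_output = {"count": 3}

    def run(self, input_data: dict) -> dict:
        return {"count": len(input_data["text"].split())}
```

Providing `test_input`/`test_output` alongside `run` is what lets the project's test suite exercise every registered block automatically.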