Latest commit 4bfeddc03d by Zamil Majdy: feat(platform/docker): add frontend service to docker-compose with env config improvements (#10615)
## Summary
This PR adds the frontend service to the Docker Compose configuration,
enabling `docker compose up` to run the complete stack, including the
frontend. It also implements comprehensive environment variable
improvements, unified .env file support, and fixes Docker networking
issues.

## Key Changes

### 🐳 Docker Compose Improvements
- **Added frontend service** to `docker-compose.yml` and
`docker-compose.platform.yml`
- **Production build**: Uses `pnpm build + serve` instead of dev server
for better stability and lower memory usage
- **Service dependencies**: Frontend now waits for backend services
(`rest_server`, `websocket_server`) to be ready
- **YAML anchors**: Implemented DRY configuration to avoid duplicating
environment values

### 📁 Unified .env File Support
- **Frontend .env loading**: Automatically loads `.env` file during
Docker build and runtime
- **Backend .env loading**: Optional `.env` file support with fallback
to sensible defaults in `settings.py`
- **Single source of truth**: All `NEXT_PUBLIC_*` and API keys can be
defined in respective `.env` files
- **Docker integration**: Updated `.dockerignore` to include `.env`
files in build context
- **Git tracking**: Frontend and backend `.env` files are now trackable
(removed from gitignore)

### 🔧 Environment Variable Architecture
- **Dual environment strategy**:
  - Server-side code uses Docker service names (`http://rest_server:8006/api`)
  - Client-side code uses localhost URLs (`http://localhost:8006/api`)
- **Comprehensive config**: Added build args and runtime environment
variables
- **Network compatibility**: Fixes connection issues between frontend
and backend containers
- **Shared backend variables**: Common environment variables (service
hosts, auth settings) centralized using YAML anchors
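
The shared-variable setup can be sketched with YAML anchors. This fragment is illustrative only: the variable names and values are assumptions, not copied from the actual compose files.

```yaml
# Define the shared backend environment once as an extension field...
x-backend-env: &backend-env
  REDIS_HOST: redis
  SUPABASE_URL: http://kong:8000

services:
  rest_server:
    environment:
      # ...and merge it into each backend service
      <<: *backend-env
      PORT: "8006"
  websocket_server:
    environment:
      <<: *backend-env
      PORT: "8001"
```

Any service-specific key set after the `<<:` merge overrides the shared value, so per-service tweaks stay local while the defaults live in one place.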

### 🛠️ Code Improvements
- **Centralized env-config helper** (`/frontend/src/lib/env-config.ts`)
with server-side priority
- **Updated all frontend code** to use shared environment helpers
instead of direct `process.env` access
- **Consistent API**: All environment variable access now goes through
helper functions
- **Settings.py improvements**: Better defaults for CORS origins and
optional .env file loading
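
A minimal sketch of what such a helper can look like, assuming hypothetical variable names (the real `env-config.ts` may differ):

```typescript
// Hypothetical sketch of a centralized env helper with server-side priority.
// Inside a container, server code prefers the Docker-internal service URL,
// while browser code (and local dev) falls back to a localhost URL.
function getAgptServerBaseUrl(): string {
  const isServer = typeof window === "undefined";
  if (isServer && process.env.AGPT_SERVER_INTERNAL_URL) {
    // e.g. http://rest_server:8006/api (container-to-container)
    return process.env.AGPT_SERVER_INTERNAL_URL;
  }
  // e.g. http://localhost:8006/api (browser, or local dev without Docker)
  return process.env.NEXT_PUBLIC_AGPT_SERVER_URL ?? "http://localhost:8006/api";
}
```

Routing all access through one function like this is what lets the same build work both inside the Docker network and from the browser.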

### 🔗 Files Changed
- `docker-compose.yml` & `docker-compose.platform.yml` - Added frontend
service and shared backend env vars
- `frontend/Dockerfile` - Simplified build process to use .env files
directly
- `backend/settings.py` - Optional .env loading and better defaults
- `frontend/src/lib/env-config.ts` - New centralized environment
configuration
- `.dockerignore` - Allow .env files in build context
- `.gitignore` - Updated to allow frontend/backend .env files
- Multiple frontend files - Updated to use env helpers
- Updates to both auto installer scripts to work with the latest setup!

## Benefits
- **Single command deployment**: `docker compose up` now runs everything
- **Better reliability**: Production build reduces memory usage and crashes
- **Network compatibility**: Proper container-to-container communication
- **Maintainable config**: Centralized environment variable management with .env files
- **Development friendly**: Works in both Docker and local development
- **API key management**: Easy configuration through .env files for all services
- **No more manual env vars**: Frontend and backend automatically load their respective .env files

## Testing
- Verified Docker service communication works correctly
- Frontend responds and serves content properly
- Environment variables are correctly resolved in both server and client contexts
- No connection errors after implementing service dependencies
- .env file loading works correctly in both build and runtime phases
- Backend services work with and without .env files present

### Checklist 📋

#### For configuration changes:
- [x] `.env.default` is updated or already compatible with my changes
- [x] `docker-compose.yml` is updated or already compatible with my
changes
- [x] I have included a list of my configuration changes in the PR
description (under **Changes**)

🤖 Generated with [Claude Code](https://claude.ai/code)

---------

Co-authored-by: Lluis Agusti <hi@llu.lu>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
Co-authored-by: Claude <claude@users.noreply.github.com>
Co-authored-by: Bentlybro <Github@bentlybro.com>
2025-08-14 03:28:18 +00:00

This is the frontend for AutoGPT's next-generation platform.

🧢 Getting Started

This project uses pnpm as the package manager via corepack. Corepack is a Node.js tool that automatically manages package managers without requiring global installations.

Prerequisites

Make sure you have Node.js 16.10+ installed. Corepack is included with Node.js by default.

⚠️ Migrating from yarn

This project previously used Yarn 1. If you originally set it up with yarn, clean up the old files first:

rm -f yarn.lock && rm -rf node_modules

Then follow the setup steps below.

Setup

  1. Enable corepack (run this once on your system):

    corepack enable
    

    This enables corepack to automatically manage pnpm based on the packageManager field in package.json.

  2. Install dependencies:

    pnpm i
    
  3. Start the development server:

    pnpm dev
    

Open http://localhost:3000 with your browser to see the result.

You can start editing the page by modifying app/page.tsx. The page auto-updates as you edit the file.

Subsequent Runs

For subsequent development sessions, you only need to run:

pnpm dev

Whenever you or someone else adds a new frontend dependency, run pnpm i again to install it.

Available Scripts

  • pnpm dev - Start development server
  • pnpm build - Build for production
  • pnpm start - Start production server
  • pnpm lint - Run ESLint and Prettier checks
  • pnpm format - Format code with Prettier
  • pnpm type-check - Run TypeScript type checking
  • pnpm test - Run Playwright tests
  • pnpm test-ui - Run Playwright tests with UI
  • pnpm fetch:openapi - Fetch OpenAPI spec from backend
  • pnpm generate:api-client - Generate API client from OpenAPI spec
  • pnpm generate:api-all - Fetch OpenAPI spec and generate API client

This project uses next/font to automatically optimize and load Inter, a custom Google Font.

🔄 Data Fetching Strategy

Note

You don't need to run the OpenAPI commands below to run the frontend. They are only needed when you add or modify backend API endpoints and want to use them from the frontend.

This project uses an auto-generated API client powered by Orval, which creates type-safe API clients from OpenAPI specifications.

How It Works

  1. Backend Requirements: Each API endpoint needs a summary and tag in the OpenAPI spec
  2. Operation ID Generation: FastAPI generates operation IDs using the pattern {method}{tag}{summary}
  3. Spec Fetching: The OpenAPI spec is fetched from http://localhost:8006/openapi.json and saved to the frontend
  4. Spec Transformation: The OpenAPI spec is cleaned up using a custom transformer (see autogpt_platform/frontend/src/app/api/transformers)
  5. Client Generation: Auto-generated client includes TypeScript types, API endpoints, and Zod schemas, organized by tags

API Client Commands

# Fetch OpenAPI spec from backend and generate client
pnpm generate:api-all

# Only fetch the OpenAPI spec
pnpm fetch:openapi

# Only generate the client (after spec is fetched)
pnpm generate:api-client

Using the Generated Client

The generated client provides React Query hooks for both queries and mutations:

Queries (GET requests)

import { useGetV1GetNotificationPreferences } from "@/app/api/__generated__/endpoints/auth/auth";

const { data, isLoading, isError } = useGetV1GetNotificationPreferences({
  query: {
    select: (res) => res.data,
    // Other React Query options
  },
});

Mutations (POST, PUT, DELETE requests)

import { useDeleteV2DeleteStoreSubmission } from "@/app/api/__generated__/endpoints/store/store";
import { getGetV2ListMySubmissionsQueryKey } from "@/app/api/__generated__/endpoints/store/store";
import { useQueryClient } from "@tanstack/react-query";

const queryClient = useQueryClient();

const { mutateAsync: deleteSubmission } = useDeleteV2DeleteStoreSubmission({
  mutation: {
    onSuccess: () => {
      // Invalidate related queries to refresh data
      queryClient.invalidateQueries({
        queryKey: getGetV2ListMySubmissionsQueryKey(),
      });
    },
  },
});

// Usage
await deleteSubmission({
  submissionId: submission_id,
});

Server Actions

For server-side operations, you can also use the generated client functions directly:

import { postV1UpdateNotificationPreferences } from "@/app/api/__generated__/endpoints/auth/auth";

// In a server action
const preferences = {
  email: "user@example.com",
  preferences: {
    AGENT_RUN: true,
    ZERO_BALANCE: false,
    // ... other preferences
  },
  daily_limit: 0,
};

await postV1UpdateNotificationPreferences(preferences);

Server-Side Prefetching

For server-side components, you can prefetch data on the server and hydrate it in the client cache. This allows immediate access to cached data when queries are called:

import { getQueryClient } from "@/lib/tanstack-query/getQueryClient";
import {
  prefetchGetV2ListStoreAgentsQuery,
  prefetchGetV2ListStoreCreatorsQuery
} from "@/app/api/__generated__/endpoints/store/store";
import { HydrationBoundary, dehydrate } from "@tanstack/react-query";

// In your server component
export default async function MarketplacePage() {
  const queryClient = getQueryClient();

  await Promise.all([
    prefetchGetV2ListStoreAgentsQuery(queryClient, {
      featured: true,
    }),
    prefetchGetV2ListStoreAgentsQuery(queryClient, {
      sorted_by: "runs",
    }),
    prefetchGetV2ListStoreCreatorsQuery(queryClient, {
      featured: true,
      sorted_by: "num_agents",
    }),
  ]);

  return (
    <HydrationBoundary state={dehydrate(queryClient)}>
      <MainMarkeplacePage />
    </HydrationBoundary>
  );
}

This pattern improves performance by serving pre-fetched data from the server while maintaining the benefits of client-side React Query features.

Configuration

The Orval configuration is located in autogpt_platform/frontend/orval.config.ts. It generates two separate clients:

  1. autogpt_api_client: React Query hooks for client-side data fetching
  2. autogpt_zod_schema: Zod schemas for validation

For more details, see the Orval documentation or check the configuration file.
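
For orientation, a two-client Orval configuration of this shape typically looks like the following sketch. The input/output paths and options here are assumptions; check orval.config.ts for the real values.

```typescript
import { defineConfig } from "orval";

export default defineConfig({
  // React Query hooks for client-side data fetching
  autogpt_api_client: {
    input: "./src/app/api/openapi.json",
    output: {
      mode: "tags-split",
      target: "./src/app/api/__generated__/endpoints",
      client: "react-query",
    },
  },
  // Zod schemas for validation
  autogpt_zod_schema: {
    input: "./src/app/api/openapi.json",
    output: {
      target: "./src/app/api/__generated__/zod",
      client: "zod",
    },
  },
});
```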

🚩 Feature Flags

This project uses LaunchDarkly for feature flags, allowing us to control feature rollouts and A/B testing.

Using Feature Flags

Check if a feature is enabled

import { Flag, useGetFlag } from "@/services/feature-flags/use-get-flag";

function MyComponent() {
  const isAgentActivityEnabled = useGetFlag(Flag.AGENT_ACTIVITY);

  if (!isAgentActivityEnabled) {
    return null; // Hide feature
  }

  return <div>Feature is enabled!</div>;
}

Protect entire components

import { withFeatureFlag } from "@/services/feature-flags/with-feature-flag";

const MyFeaturePage = withFeatureFlag(MyPageComponent, "my-feature-flag");

Testing with Feature Flags

For local development or running Playwright tests locally, use mocked feature flags by setting NEXT_PUBLIC_PW_TEST=true in your .env file. This bypasses LaunchDarkly and uses the mock values defined in the code.

Adding New Flags

  1. Add the flag to the Flag enum in use-get-flag.ts
  2. Add the flag type to FlagValues type
  3. Add mock value to mockFlags for testing
  4. Configure the flag in LaunchDarkly dashboard
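
The first three steps can be sketched as follows; the flag key, enum shape, and mock names are assumptions based on the steps above, not the actual contents of use-get-flag.ts:

```typescript
// Hypothetical sketch of registering a new feature flag.
enum Flag {
  AGENT_ACTIVITY = "agent-activity",
  MY_NEW_FEATURE = "my-new-feature", // 1. add the flag key to the enum
}

type FlagValues = {
  [Flag.AGENT_ACTIVITY]: boolean;
  [Flag.MY_NEW_FEATURE]: boolean; // 2. declare the flag's value type
};

const mockFlags: FlagValues = {
  [Flag.AGENT_ACTIVITY]: true,
  [Flag.MY_NEW_FEATURE]: true, // 3. mock value used when NEXT_PUBLIC_PW_TEST=true
};
```

Step 4 happens in the LaunchDarkly dashboard rather than in code.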

🚚 Deploy

TODO

📙 Storybook

Storybook is a powerful development environment for UI components. It allows you to build UI components in isolation, making it easier to develop, test, and document your components independently from your main application.

Purpose in the Development Process

  1. Component Development: Develop and test UI components in isolation.
  2. Visual Testing: Easily spot visual regressions.
  3. Documentation: Automatically document components and their props.
  4. Collaboration: Share components with your team or stakeholders for feedback.

How to Use Storybook

  1. Start Storybook: Run the following command to start the Storybook development server:

    pnpm storybook
    

    This will start Storybook on port 6006. Open http://localhost:6006 in your browser to view your component library.

  2. Build Storybook: To build a static version of Storybook for deployment, use:

    pnpm build-storybook
    
  3. Running Storybook Tests: Storybook tests can be run using:

    pnpm test-storybook
    
  4. Writing Stories: Create .stories.tsx files alongside your components to define different states and variations of your components.

By integrating Storybook into our development workflow, we can streamline UI development, improve component reusability, and maintain a consistent design system across the project.

🔭 Tech Stack

Core Framework & Language

  • Next.js - React framework with App Router
  • React - UI library for building user interfaces
  • TypeScript - Typed JavaScript for better developer experience

Styling & UI Components

Development & Testing

Backend & Services

  • Supabase - Backend-as-a-Service (database, auth, storage)
  • Sentry - Error monitoring and performance tracking

Package Management

  • pnpm - Fast, disk space efficient package manager
  • Corepack - Node.js's built-in tool for managing package managers

Additional Libraries

Development Tools

  • NEXT_PUBLIC_REACT_QUERY_DEVTOOL - Set to true to enable the React Query DevTools.