Merge branch 'dev' into ntindle/systemallquietalerts

This commit is contained in:
Nicholas Tindle
2025-10-22 13:33:56 -05:00
committed by GitHub
82 changed files with 1782 additions and 1851 deletions

View File

@@ -12,6 +12,7 @@ This file provides comprehensive onboarding information for GitHub Copilot codin
- **Infrastructure** - Docker configurations, CI/CD, and development tools
**Primary Languages & Frameworks:**
- **Backend**: Python 3.10-3.13, FastAPI, Prisma ORM, PostgreSQL, RabbitMQ
- **Frontend**: TypeScript, Next.js 15, React, Tailwind CSS, Radix UI
- **Development**: Docker, Poetry, pnpm, Playwright, Storybook
@@ -23,15 +24,17 @@ This file provides comprehensive onboarding information for GitHub Copilot codin
**Always run these commands in the correct directory and in this order:**
1. **Initial Setup** (required once):
```bash
# Clone and enter repository
git clone <repo> && cd AutoGPT
# Start all services (database, redis, rabbitmq, clamav)
cd autogpt_platform && docker compose --profile local up deps --build --detach
```
2. **Backend Setup** (always run before backend development):
```bash
cd autogpt_platform/backend
poetry install # Install dependencies
@@ -48,6 +51,7 @@ This file provides comprehensive onboarding information for GitHub Copilot codin
### Runtime Requirements
**Critical:** Always ensure Docker services are running before starting development:
```bash
cd autogpt_platform && docker compose --profile local up deps --build --detach
```
@@ -58,6 +62,7 @@ cd autogpt_platform && docker compose --profile local up deps --build --detach
### Development Commands
**Backend Development:**
```bash
cd autogpt_platform/backend
poetry run serve # Start development server (port 8000)
@@ -68,6 +73,7 @@ poetry run lint # Lint code (ruff) - run after format
```
**Frontend Development:**
```bash
cd autogpt_platform/frontend
pnpm dev # Start development server (port 3000) - use for active development
@@ -81,23 +87,27 @@ pnpm storybook # Start component development server
### Testing Strategy
**Backend Tests:**
- **Block Tests**: `poetry run pytest backend/blocks/test/test_block.py -xvs` (validates all blocks)
- **Specific Block**: `poetry run pytest 'backend/blocks/test/test_block.py::test_available_blocks[BlockName]' -xvs`
- **Snapshot Tests**: Use `--snapshot-update` when output changes, always review with `git diff`
**Frontend Tests:**
- **E2E Tests**: Always run `pnpm dev` before `pnpm test` (Playwright requires running instance)
- **Component Tests**: Use Storybook for isolated component development
### Critical Validation Steps
**Before committing changes:**
1. Run `poetry run format` (backend) and `pnpm format` (frontend)
2. Ensure all tests pass in modified areas
3. Verify Docker services are still running
4. Check that database migrations apply cleanly
**Common Issues & Workarounds:**
- **Prisma issues**: Run `poetry run prisma generate` after schema changes
- **Permission errors**: Ensure Docker has proper permissions
- **Port conflicts**: Check the `docker-compose.yml` file for the current list of exposed ports. You can list all mapped ports with:
@@ -108,6 +118,7 @@ pnpm storybook # Start component development server
### Core Architecture
**AutoGPT Platform** (`autogpt_platform/`):
- `backend/` - FastAPI server with async support
- `backend/backend/` - Core API logic
- `backend/blocks/` - Agent execution blocks
@@ -121,6 +132,7 @@ pnpm storybook # Start component development server
- `docker-compose.yml` - Development stack orchestration
**Key Configuration Files:**
- `pyproject.toml` - Python dependencies and tooling
- `package.json` - Node.js dependencies and scripts
- `schema.prisma` - Database schema and migrations
@@ -136,6 +148,7 @@ pnpm storybook # Start component development server
### Development Workflow
**GitHub Actions**: Multiple CI/CD workflows in `.github/workflows/`
- `platform-backend-ci.yml` - Backend testing and validation
- `platform-frontend-ci.yml` - Frontend testing and validation
- `platform-fullstack-ci.yml` - End-to-end integration tests
@@ -146,11 +159,13 @@ pnpm storybook # Start component development server
### Key Source Files
**Backend Entry Points:**
- `backend/backend/server/server.py` - FastAPI application setup
- `backend/backend/data/` - Database models and user management
- `backend/blocks/` - Agent execution blocks and logic
**Frontend Entry Points:**
- `frontend/src/app/layout.tsx` - Root application layout
- `frontend/src/app/page.tsx` - Home page
- `frontend/src/lib/supabase/` - Authentication and database client
@@ -160,6 +175,7 @@ pnpm storybook # Start component development server
### Agent Block System
Agents are built using a visual block-based system where each block performs a single action. Blocks are defined in `backend/blocks/` and must include:
- Block definition with input/output schemas
- Execution logic with proper error handling
- Tests validating functionality
@@ -167,6 +183,7 @@ Agents are built using a visual block-based system where each block performs a s
### Database & ORM
**Prisma ORM** with PostgreSQL backend including pgvector for embeddings:
- Schema in `schema.prisma`
- Migrations in `backend/migrations/`
- Always run `prisma migrate dev` and `prisma generate` after schema changes
@@ -174,13 +191,15 @@ Agents are built using a visual block-based system where each block performs a s
## Environment Configuration
### Configuration Files Priority Order
1. **Backend**: `/backend/.env.default` → `/backend/.env` (user overrides)
2. **Frontend**: `/frontend/.env.default` → `/frontend/.env` (user overrides)
3. **Platform**: `/.env.default` (Supabase/shared) → `/.env` (user overrides)
4. Docker Compose `environment:` sections override file-based config
5. Shell environment variables have highest precedence
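The precedence chain above can be sketched as a simple layered merge. This is an illustrative sketch only: the file names and keys are hypothetical, not the platform's actual config loader.

```python
# Illustrative sketch of layered config resolution: later layers win.
# Keys and values are made up; the real platform reads .env files and Docker env.

def resolve_config(*layers: dict) -> dict:
    """Merge config layers in precedence order; later layers override earlier ones."""
    merged: dict = {}
    for layer in layers:
        merged.update(layer)
    return merged

env_default = {"DB_HOST": "localhost", "DB_PORT": "5432"}  # .env.default (checked in)
env_user = {"DB_HOST": "db.internal"}                      # .env (user override)
shell_env = {"DB_PORT": "5433"}                            # shell environment (highest)

config = resolve_config(env_default, env_user, shell_env)
```

Here the shell value wins for `DB_PORT` while the `.env` override wins for `DB_HOST`, mirroring the priority order listed above.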
### Docker Environment Setup
- All services use hardcoded defaults (no `${VARIABLE}` substitutions)
- The `env_file` directive loads variables INTO containers at runtime
- Backend/Frontend services use YAML anchors for consistent configuration
@@ -189,6 +208,7 @@ Agents are built using a visual block-based system where each block performs a s
## Advanced Development Patterns
### Adding New Blocks
1. Create file in `/backend/backend/blocks/`
2. Inherit from `Block` base class with input/output schemas
3. Implement `run` method with proper error handling
@@ -198,6 +218,7 @@ Agents are built using a visual block-based system where each block performs a s
7. Consider how inputs/outputs connect with other blocks in graph editor
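The steps above can be illustrated with a self-contained mimic of the block pattern. This is a sketch only: the real base class lives in `backend.data.block` and uses Pydantic schemas; the class and schema shapes here are simplified stand-ins.

```python
# Self-contained mimic of the "new block" pattern; not the real backend.data.block API.
from abc import ABC, abstractmethod

class Block(ABC):
    @abstractmethod
    def run(self, **inputs):
        """Yield (output_name, value) pairs."""

class ReverseTextBlock(Block):
    """Toy block: single action, declared input/output, explicit error handling."""
    input_schema = {"text": str}       # simplified stand-in for a Pydantic schema
    output_schema = {"reversed": str}

    def run(self, **inputs):
        text = inputs.get("text")
        if not isinstance(text, str):
            raise ValueError("'text' input must be a string")
        yield "reversed", text[::-1]

block = ReverseTextBlock()
outputs = dict(block.run(text="hello"))
```

The yielded output names are what connect to other blocks' inputs in the graph editor, which is why step 7 asks you to consider how they tie together.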
### API Development
1. Update routes in `/backend/backend/server/routers/`
2. Add/update Pydantic models in same directory
3. Write tests alongside route files
@@ -205,21 +226,76 @@ Agents are built using a visual block-based system where each block performs a s
5. Run `poetry run test` to verify changes
### Frontend Development
1. Components in `/frontend/src/components/`
2. Use existing UI components from `/frontend/src/components/ui/`
3. Add Storybook stories for component development
4. Test user-facing features with Playwright E2E tests
5. Update protected routes in middleware when needed
**📖 Complete Frontend Guide**: See `autogpt_platform/frontend/CONTRIBUTING.md` and `autogpt_platform/frontend/.cursorrules` for comprehensive patterns and conventions.
**Quick Reference:**
**Component Structure:**
- Separate render logic from data/behavior
- Structure: `ComponentName/ComponentName.tsx` + `useComponentName.ts` + `helpers.ts`
- Exception: Small components (3-4 lines of logic) can be inline
- Render-only components can be direct files without folders
**Data Fetching:**
- Use generated API hooks from `@/app/api/__generated__/endpoints/`
- Generated via Orval from backend OpenAPI spec
- Pattern: `use{Method}{Version}{OperationName}`
- Example: `useGetV2ListLibraryAgents`
- Regenerate with: `pnpm generate:api`
- **Never** use deprecated `BackendAPI` or `src/lib/autogpt-server-api/*`
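The `use{Method}{Version}{OperationName}` naming rule can be illustrated with a small helper. This is a sketch of the naming convention only; Orval's actual name generation has more rules than this.

```python
# Sketch of the generated-hook naming convention described above.
def hook_name(method: str, version: str, operation: str) -> str:
    """Build a hook name following the use{Method}{Version}{OperationName} pattern."""
    return f"use{method.capitalize()}{version.upper()}{operation}"

name = hook_name("get", "v2", "ListLibraryAgents")
# Matches the documented example: useGetV2ListLibraryAgents
```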
**Code Conventions:**
- Use function declarations for components and handlers (not arrow functions)
- Only arrow functions for small inline lambdas (map, filter, etc.)
- Components: `PascalCase`, Hooks: `camelCase` with `use` prefix
- No barrel files or `index.ts` re-exports
- Minimal comments (code should be self-documenting)
**Styling:**
- Use Tailwind CSS utilities only
- Use design system components from `src/components/` (atoms, molecules, organisms)
- Never use `src/components/__legacy__/*`
- Only use Phosphor Icons (`@phosphor-icons/react`)
- Prefer design tokens over hardcoded values
**Error Handling:**
- Render errors: Use `<ErrorCard />` component
- Mutation errors: Display with toast notifications
- Manual exceptions: Use `Sentry.captureException()`
- Global error boundaries already configured
**Testing:**
- Add/update Storybook stories for UI components (`pnpm storybook`)
- Run Playwright E2E tests with `pnpm test`
- Verify in Chromatic after PR
**Architecture:**
- Default to client components ("use client")
- Server components only for SEO or extreme TTFB needs
- Use React Query for server state (via generated hooks)
- Co-locate UI state in components/hooks
### Security Guidelines
**Cache Protection Middleware** (`/backend/backend/server/middleware/security.py`):
- Default: Disables caching for ALL endpoints with `Cache-Control: no-store, no-cache, must-revalidate, private`
- Uses allow list approach for cacheable paths (static assets, health checks, public pages)
- Prevents sensitive data caching in browsers/proxies
- Add new cacheable endpoints to `CACHEABLE_PATHS`
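The default-deny, allow-list behavior described above can be sketched as a pure function. The paths and header values here are illustrative of the approach, not the middleware's exact code.

```python
# Sketch of an allow-list cache policy; CACHEABLE_PATHS entries are examples only.
CACHEABLE_PATHS = {"/health", "/static", "/public"}

def cache_control_header(path: str) -> str:
    """Default-deny caching; only allow-listed paths (or their children) may be cached."""
    if any(path == p or path.startswith(p + "/") for p in CACHEABLE_PATHS):
        return "public, max-age=3600"
    # Everything else gets the restrictive default, preventing sensitive data
    # from being cached by browsers or intermediate proxies.
    return "no-store, no-cache, must-revalidate, private"
```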
### CI/CD Alignment
The repository has comprehensive CI workflows that test:
- **Backend**: Python 3.11-3.13, services (Redis/RabbitMQ/ClamAV), Prisma migrations, Poetry lock validation
- **Frontend**: Node.js 21, pnpm, Playwright with Docker Compose stack, API schema validation
- **Integration**: Full-stack type checking and E2E testing
@@ -229,6 +305,7 @@ Match these patterns when developing locally - the copilot setup environment mir
## Collaboration with Other AI Assistants
This repository is actively developed with assistance from Claude (via CLAUDE.md files). When working on this codebase:
- Check for existing CLAUDE.md files that provide additional context
- Follow established patterns and conventions already in the codebase
- Maintain consistency with existing code style and architecture
@@ -237,8 +314,9 @@ This repository is actively developed with assistance from Claude (via CLAUDE.md
## Trust These Instructions
These instructions are comprehensive and tested. Only perform additional searches if:
1. Information here is incomplete for your specific task
2. You encounter errors not covered by the workarounds
3. You need to understand implementation details not covered above
For detailed platform development patterns, refer to `autogpt_platform/CLAUDE.md` and `AGENTS.md` in the repository root.

View File

@@ -63,6 +63,9 @@ poetry run pytest path/to/test.py --snapshot-update
# Install dependencies
cd frontend && pnpm i
# Generate API client from OpenAPI spec
pnpm generate:api
# Start development server
pnpm dev
@@ -75,12 +78,23 @@ pnpm storybook
# Build production
pnpm build
# Format and lint
pnpm format
# Type checking
pnpm types
```
We have a component library in `autogpt_platform/frontend/src/components/atoms` that should be used when adding new pages and components.
**📖 Complete Guide**: See `/frontend/CONTRIBUTING.md` and `/frontend/.cursorrules` for comprehensive frontend patterns.
**Key Frontend Conventions:**
- Separate render logic from data/behavior in components
- Use generated API hooks from `@/app/api/__generated__/endpoints/`
- Use function declarations (not arrow functions) for components/handlers
- Use design system components from `src/components/` (atoms, molecules, organisms)
- Only use Phosphor Icons
- Never use `src/components/__legacy__/*` or deprecated `BackendAPI`
## Architecture Overview
@@ -95,11 +109,16 @@ We have a components library in autogpt_platform/frontend/src/components/atoms t
### Frontend Architecture
- **Framework**: Next.js App Router with React Server Components
- **State Management**: React hooks + Supabase client for real-time updates
- **Framework**: Next.js 15 App Router (client-first approach)
- **Data Fetching**: Type-safe generated API hooks via Orval + React Query
- **State Management**: React Query for server state, co-located UI state in components/hooks
- **Component Structure**: Separate render logic (`.tsx`) from business logic (`use*.ts` hooks)
- **Workflow Builder**: Visual graph editor using @xyflow/react
- **UI Components**: Radix UI primitives with Tailwind CSS styling
- **UI Components**: shadcn/ui (Radix UI primitives) with Tailwind CSS styling
- **Icons**: Phosphor Icons only
- **Feature Flags**: LaunchDarkly integration
- **Error Handling**: ErrorCard for render errors, toast for mutations, Sentry for exceptions
- **Testing**: Playwright for E2E, Storybook for component development
### Key Concepts
@@ -153,6 +172,7 @@ Key models (defined in `/backend/schema.prisma`):
**Adding a new block:**
Follow the comprehensive [Block SDK Guide](../../../docs/content/platform/block-sdk-guide.md) which covers:
- Provider configuration with `ProviderBuilder`
- Block schema definition
- Authentication (API keys, OAuth, webhooks)
@@ -160,6 +180,7 @@ Follow the comprehensive [Block SDK Guide](../../../docs/content/platform/block-
- File organization
Quick steps:
1. Create new file in `/backend/backend/blocks/`
2. Configure provider using `ProviderBuilder` in `_config.py`
3. Inherit from `Block` base class
@@ -180,10 +201,20 @@ ex: do the inputs and outputs tie well together?
**Frontend feature development:**
1. Components go in `/frontend/src/components/`
2. Use existing UI components from `/frontend/src/components/ui/`
3. Add Storybook stories for new components
4. Test with Playwright if user-facing
See `/frontend/CONTRIBUTING.md` for complete patterns. Quick reference:
1. **Pages**: Create in `src/app/(platform)/feature-name/page.tsx`
- Add `usePageName.ts` hook for logic
- Put sub-components in local `components/` folder
2. **Components**: Structure as `ComponentName/ComponentName.tsx` + `useComponentName.ts` + `helpers.ts`
- Use design system components from `src/components/` (atoms, molecules, organisms)
- Never use `src/components/__legacy__/*`
3. **Data fetching**: Use generated API hooks from `@/app/api/__generated__/endpoints/`
- Regenerate with `pnpm generate:api`
- Pattern: `use{Method}{Version}{OperationName}`
4. **Styling**: Tailwind CSS only, use design tokens, Phosphor Icons only
5. **Testing**: Add Storybook stories for new components, Playwright for E2E
6. **Code conventions**: Function declarations (not arrow functions) for components/handlers
### Security Implementation

View File

@@ -94,42 +94,36 @@ def configure_logging(force_cloud_logging: bool = False) -> None:
config = LoggingConfig()
log_handlers: list[logging.Handler] = []
structured_logging = config.enable_cloud_logging or force_cloud_logging
# Console output handlers
stdout = logging.StreamHandler(stream=sys.stdout)
stdout.setLevel(config.level)
stdout.addFilter(BelowLevelFilter(logging.WARNING))
if config.level == logging.DEBUG:
stdout.setFormatter(AGPTFormatter(DEBUG_LOG_FORMAT))
else:
stdout.setFormatter(AGPTFormatter(SIMPLE_LOG_FORMAT))
if not structured_logging:
stdout = logging.StreamHandler(stream=sys.stdout)
stdout.setLevel(config.level)
stdout.addFilter(BelowLevelFilter(logging.WARNING))
if config.level == logging.DEBUG:
stdout.setFormatter(AGPTFormatter(DEBUG_LOG_FORMAT))
else:
stdout.setFormatter(AGPTFormatter(SIMPLE_LOG_FORMAT))
stderr = logging.StreamHandler()
stderr.setLevel(logging.WARNING)
if config.level == logging.DEBUG:
stderr.setFormatter(AGPTFormatter(DEBUG_LOG_FORMAT))
else:
stderr.setFormatter(AGPTFormatter(SIMPLE_LOG_FORMAT))
stderr = logging.StreamHandler()
stderr.setLevel(logging.WARNING)
if config.level == logging.DEBUG:
stderr.setFormatter(AGPTFormatter(DEBUG_LOG_FORMAT))
else:
stderr.setFormatter(AGPTFormatter(SIMPLE_LOG_FORMAT))
log_handlers += [stdout, stderr]
# Cloud logging setup
if config.enable_cloud_logging or force_cloud_logging:
import google.cloud.logging
from google.cloud.logging.handlers import CloudLoggingHandler
from google.cloud.logging_v2.handlers.transports import (
BackgroundThreadTransport,
)
else:
# Use Google Cloud Structured Log Handler. Log entries are printed to stdout
# in a JSON format which is automatically picked up by Google Cloud Logging.
from google.cloud.logging.handlers import StructuredLogHandler
client = google.cloud.logging.Client()
# Use BackgroundThreadTransport to prevent blocking the main thread
# and deadlocks when gRPC calls to Google Cloud Logging hang
cloud_handler = CloudLoggingHandler(
client,
name="autogpt_logs",
transport=BackgroundThreadTransport,
)
cloud_handler.setLevel(config.level)
log_handlers.append(cloud_handler)
structured_log_handler = StructuredLogHandler(stream=sys.stdout)
structured_log_handler.setLevel(config.level)
log_handlers.append(structured_log_handler)
# File logging setup
if config.enable_file_logging:
@@ -185,7 +179,13 @@ def configure_logging(force_cloud_logging: bool = False) -> None:
# Configure the root logger
logging.basicConfig(
format=DEBUG_LOG_FORMAT if config.level == logging.DEBUG else SIMPLE_LOG_FORMAT,
format=(
"%(levelname)s %(message)s"
if structured_logging
else (
DEBUG_LOG_FORMAT if config.level == logging.DEBUG else SIMPLE_LOG_FORMAT
)
),
level=config.level,
handlers=log_handlers,
)
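The diff above switches between split-stream console handlers and a structured JSON-to-stdout handler based on `structured_logging`. A minimal stdlib-only sketch of the same selection logic (format strings and the structured handler are simplified stand-ins for the real `AGPTFormatter` and `StructuredLogHandler`):

```python
import logging
import sys

SIMPLE_LOG_FORMAT = "%(asctime)s %(levelname)s %(message)s"
DEBUG_LOG_FORMAT = "%(asctime)s %(levelname)s %(name)s %(message)s"

def build_handlers(level: int, structured: bool) -> list[logging.Handler]:
    """Split-stream console handlers normally; a single stdout handler when structured."""
    if structured:
        # Stand-in for StructuredLogHandler: everything goes to stdout for
        # the log collector to pick up.
        handler = logging.StreamHandler(stream=sys.stdout)
        handler.setLevel(level)
        return [handler]
    fmt = DEBUG_LOG_FORMAT if level == logging.DEBUG else SIMPLE_LOG_FORMAT
    stdout = logging.StreamHandler(stream=sys.stdout)
    stdout.setLevel(level)
    stdout.addFilter(lambda record: record.levelno < logging.WARNING)  # below-WARNING only
    stdout.setFormatter(logging.Formatter(fmt))
    stderr = logging.StreamHandler()  # defaults to sys.stderr
    stderr.setLevel(logging.WARNING)
    stderr.setFormatter(logging.Formatter(fmt))
    return [stdout, stderr]
```

This mirrors why the diff also changes `basicConfig`'s `format`: in structured mode the handler emits JSON itself, so the root format shrinks to level and message.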

View File

@@ -4,13 +4,13 @@ import mimetypes
from pathlib import Path
from typing import Any
import aiohttp
import discord
from pydantic import SecretStr
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import APIKeyCredentials, SchemaField
from backend.util.file import store_media_file
from backend.util.request import Requests
from backend.util.type import MediaFileType
from ._auth import (
@@ -114,10 +114,9 @@ class ReadDiscordMessagesBlock(Block):
if message.attachments:
attachment = message.attachments[0] # Process the first attachment
if attachment.filename.endswith((".txt", ".py")):
async with aiohttp.ClientSession() as session:
async with session.get(attachment.url) as response:
file_content = response.text()
self.output_data += f"\n\nFile from user: {attachment.filename}\nContent: {file_content}"
response = await Requests().get(attachment.url)
file_content = response.text()
self.output_data += f"\n\nFile from user: {attachment.filename}\nContent: {file_content}"
await client.close()
@@ -699,16 +698,15 @@ class SendDiscordFileBlock(Block):
elif file.startswith(("http://", "https://")):
# URL - download the file
async with aiohttp.ClientSession() as session:
async with session.get(file) as response:
file_bytes = await response.read()
response = await Requests().get(file)
file_bytes = response.content
# Try to get filename from URL if not provided
if not filename:
from urllib.parse import urlparse
# Try to get filename from URL if not provided
if not filename:
from urllib.parse import urlparse
path = urlparse(file).path
detected_filename = Path(path).name or "download"
path = urlparse(file).path
detected_filename = Path(path).name or "download"
else:
# Local file path - read from stored media file
# This would be a path from a previous block's output

View File

@@ -0,0 +1,9 @@
# Import the provider builder to ensure it's registered
from backend.sdk.registry import AutoRegistry
from .triggers import GenericWebhookTriggerBlock, generic_webhook
# Ensure the SDK registry is patched to include our webhook manager
AutoRegistry.patch_integrations()
__all__ = ["GenericWebhookTriggerBlock", "generic_webhook"]
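The `AutoRegistry.patch_integrations()` call above relies on import-time registration. A generic sketch of that pattern in pure Python (the real `backend.sdk.registry` API differs; the class and names here are illustrative):

```python
# Generic import-time registry sketch; not the real backend.sdk.registry API.
class AutoRegistry:
    _handlers: dict[str, type] = {}

    @classmethod
    def register(cls, name: str):
        def decorator(handler_cls: type) -> type:
            cls._handlers[name] = handler_cls  # runs when the module is imported
            return handler_cls
        return decorator

@AutoRegistry.register("generic_webhook")
class GenericWebhookManager:
    """Hypothetical webhook manager registered at import time."""

registered = "generic_webhook" in AutoRegistry._handlers
```

Because registration happens as a side effect of import, the `__init__.py` above only needs to import the module and call the patch hook for the webhook manager to become visible.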

View File

@@ -1,7 +1,5 @@
import asyncio
import logging
import urllib.parse
import urllib.request
from datetime import datetime, timedelta, timezone
from typing import Any
@@ -10,6 +8,7 @@ import pydantic
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
from backend.util.request import Requests
class RSSEntry(pydantic.BaseModel):
@@ -103,35 +102,29 @@ class ReadRSSFeedBlock(Block):
)
@staticmethod
def parse_feed(url: str) -> dict[str, Any]:
async def parse_feed(url: str) -> dict[str, Any]:
# Security fix: Add protection against memory exhaustion attacks
MAX_FEED_SIZE = 10 * 1024 * 1024 # 10MB limit for RSS feeds
# Validate URL
parsed_url = urllib.parse.urlparse(url)
if parsed_url.scheme not in ("http", "https"):
raise ValueError(f"Invalid URL scheme: {parsed_url.scheme}")
# Download with size limit
# Download feed content with size limit
try:
with urllib.request.urlopen(url, timeout=30) as response:
# Check content length if available
content_length = response.headers.get("Content-Length")
if content_length and int(content_length) > MAX_FEED_SIZE:
raise ValueError(
f"Feed too large: {content_length} bytes exceeds {MAX_FEED_SIZE} limit"
)
response = await Requests(raise_for_status=True).get(url)
# Read with size limit
content = response.read(MAX_FEED_SIZE + 1)
if len(content) > MAX_FEED_SIZE:
raise ValueError(
f"Feed too large: exceeds {MAX_FEED_SIZE} byte limit"
)
# Check content length if available
content_length = response.headers.get("Content-Length")
if content_length and int(content_length) > MAX_FEED_SIZE:
raise ValueError(
f"Feed too large: {content_length} bytes exceeds {MAX_FEED_SIZE} limit"
)
# Parse with feedparser using the validated content
# feedparser has built-in protection against XML attacks
return feedparser.parse(content) # type: ignore
# Get content with size limit
content = response.content
if len(content) > MAX_FEED_SIZE:
raise ValueError(f"Feed too large: exceeds {MAX_FEED_SIZE} byte limit")
# Parse with feedparser using the validated content
# feedparser has built-in protection against XML attacks
return feedparser.parse(content) # type: ignore
except Exception as e:
# Log error and return empty feed
logging.warning(f"Failed to parse RSS feed from {url}: {e}")
@@ -145,7 +138,7 @@ class ReadRSSFeedBlock(Block):
while keep_going:
keep_going = input_data.run_continuously
feed = self.parse_feed(input_data.rss_url)
feed = await self.parse_feed(input_data.rss_url)
all_entries = []
for entry in feed["entries"]:

View File

@@ -1,6 +1,7 @@
from urllib.parse import parse_qs, urlparse
from youtube_transcript_api._api import YouTubeTranscriptApi
from youtube_transcript_api._errors import NoTranscriptFound
from youtube_transcript_api._transcripts import FetchedTranscript
from youtube_transcript_api.formatters import TextFormatter
@@ -64,7 +65,29 @@ class TranscribeYoutubeVideoBlock(Block):
@staticmethod
def get_transcript(video_id: str) -> FetchedTranscript:
return YouTubeTranscriptApi().fetch(video_id=video_id)
"""
Get transcript for a video, preferring English but falling back to any available language.
:param video_id: The YouTube video ID
:return: The fetched transcript
:raises: Any exception except NoTranscriptFound for requested languages
"""
api = YouTubeTranscriptApi()
try:
# Try to get English transcript first (default behavior)
return api.fetch(video_id=video_id)
except NoTranscriptFound:
# If English is not available, get the first available transcript
transcript_list = api.list(video_id)
# Try manually created transcripts first, then generated ones
available_transcripts = list(
transcript_list._manually_created_transcripts.values()
) + list(transcript_list._generated_transcripts.values())
if available_transcripts:
# Fetch the first available transcript
return available_transcripts[0].fetch()
# If no transcripts at all, re-raise the original error
raise
@staticmethod
def format_transcript(transcript: FetchedTranscript) -> str:
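The prefer-then-fall-back control flow in `get_transcript` above (try English, catch `NoTranscriptFound`, take the first available transcript, re-raise if there is none) can be sketched generically. This is pure Python with made-up names; the real code uses `youtube_transcript_api`.

```python
# Generic sketch of the fallback strategy; the fetchers and exception are illustrative.
class NotFound(Exception):
    pass

def fetch_preferred(preferred, fallbacks):
    """Try the preferred fetcher; on NotFound use the first fallback, else re-raise."""
    try:
        return preferred()
    except NotFound:
        for fallback in fallbacks:
            return fallback()  # first available wins
        raise  # no fallbacks at all: surface the original error

def no_english():
    raise NotFound("no English transcript")

result = fetch_preferred(no_english, [lambda: "transcript-de"])
```

The bare `raise` inside the `except` block re-raises the original `NotFound`, matching the diff's behavior when no transcripts exist in any language.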

View File

@@ -1,4 +1,3 @@
import collections
import inspect
import logging
import os
@@ -10,7 +9,6 @@ from typing import (
Any,
Callable,
ClassVar,
Dict,
Generic,
Optional,
Sequence,
@@ -22,8 +20,7 @@ from typing import (
import jsonref
import jsonschema
from prisma import Json
from prisma.models import AgentBlock, BlocksRegistry
from prisma.models import AgentBlock
from prisma.types import AgentBlockCreateInput
from pydantic import BaseModel
@@ -482,50 +479,19 @@ class Block(ABC, Generic[BlockSchemaInputType, BlockSchemaOutputType]):
return self.__class__.__name__
def to_dict(self):
# Sort categories by their name to ensure consistent ordering
sorted_categories = [
category.dict()
for category in sorted(self.categories, key=lambda c: c.name)
]
# Sort dictionary keys recursively for consistent ordering
def sort_dict(obj):
if isinstance(obj, dict):
return collections.OrderedDict(
sorted((k, sort_dict(v)) for k, v in obj.items())
)
elif isinstance(obj, list):
# Check if all items in the list are primitive types that can be sorted
if obj and all(
isinstance(item, (str, int, float, bool, type(None)))
for item in obj
):
# Sort primitive lists for consistent ordering
return sorted(obj, key=lambda x: (x is None, str(x)))
else:
# For lists of complex objects, process each item but maintain order
return [sort_dict(item) for item in obj]
return obj
return collections.OrderedDict(
[
("id", self.id),
("name", self.name),
("inputSchema", sort_dict(self.input_schema.jsonschema())),
("outputSchema", sort_dict(self.output_schema.jsonschema())),
("description", self.description),
("categories", sorted_categories),
(
"contributors",
sorted(
[contributor.model_dump() for contributor in self.contributors],
key=lambda c: (c.get("name", ""), c.get("username", "")),
),
),
("staticOutput", self.static_output),
("uiType", self.block_type.value),
]
)
return {
"id": self.id,
"name": self.name,
"inputSchema": self.input_schema.jsonschema(),
"outputSchema": self.output_schema.jsonschema(),
"description": self.description,
"categories": [category.dict() for category in self.categories],
"contributors": [
contributor.model_dump() for contributor in self.contributors
],
"staticOutput": self.static_output,
"uiType": self.block_type.value,
}
def get_info(self) -> BlockInfo:
from backend.data.credit import get_block_cost
@@ -772,123 +738,3 @@ def get_io_block_ids() -> Sequence[str]:
for id, B in get_blocks().items()
if B().block_type in (BlockType.INPUT, BlockType.OUTPUT)
]
async def get_block_registry() -> Dict[str, BlocksRegistry]:
"""
Retrieves the BlocksRegistry from the database and returns a dictionary mapping
block IDs to BlocksRegistry objects.
Returns:
Dict[str, BlocksRegistry]: A dictionary where each key is a block ID and
each value is a BlocksRegistry instance.
"""
blocks = await BlocksRegistry.prisma().find_many()
return {block.id: block for block in blocks}
def recursive_json_compare(
db_block_definition: Any, local_block_definition: Any
) -> bool:
"""
Recursively compares two JSON objects for equality.
Args:
db_block_definition (Any): The JSON object from the database.
local_block_definition (Any): The local JSON object to compare against.
Returns:
bool: True if the objects are equal, False otherwise.
"""
if isinstance(db_block_definition, dict) and isinstance(
local_block_definition, dict
):
if set(db_block_definition.keys()) != set(local_block_definition.keys()):
logger.error(
f"Keys are not the same: {set(db_block_definition.keys())} != {set(local_block_definition.keys())}"
)
return False
return all(
recursive_json_compare(db_block_definition[k], local_block_definition[k])
for k in db_block_definition
)
values_are_same = db_block_definition == local_block_definition
if not values_are_same:
logger.error(
f"Values are not the same: {db_block_definition} != {local_block_definition}"
)
return values_are_same
def check_block_same(db_block: BlocksRegistry, local_block: Block) -> bool:
"""
Compares a database block with a local block.
Args:
db_block (BlocksRegistry): The block object from the database registry.
local_block (Block[BlockSchema, BlockSchema]): The local block definition.
Returns:
bool: True if the blocks are equal, False otherwise.
"""
local_block_instance = local_block() # type: ignore
local_block_definition = local_block_instance.to_dict()
db_block_definition = db_block.definition
is_same = recursive_json_compare(db_block_definition, local_block_definition)
return is_same
def find_delta_blocks(
db_blocks: Dict[str, BlocksRegistry], local_blocks: Dict[str, Block]
) -> Dict[str, Block]:
"""
Finds the set of blocks that are new or changed compared to the database.
Args:
db_blocks (Dict[str, BlocksRegistry]): Existing blocks from the database, keyed by block ID.
local_blocks (Dict[str, Block]): Local block definitions, keyed by block ID.
Returns:
Dict[str, Block]: Blocks that are missing from or different from the database, keyed by block ID.
"""
block_update: Dict[str, Block] = {}
for block_id, block in local_blocks.items():
if block_id not in db_blocks:
block_update[block_id] = block
else:
if not check_block_same(db_blocks[block_id], block):
block_update[block_id] = block
return block_update
async def upsert_blocks_change_bulk(blocks: Dict[str, Block]):
"""
Bulk upserts blocks into the database if changed.
- Compares the provided local blocks to those in the database via their definition.
- Inserts new or updated blocks.
Args:
blocks (Dict[str, Block]): Local block definitions to upsert.
Returns:
Dict[str, Block]: Blocks that were new or changed and upserted.
"""
db_blocks = await get_block_registry()
block_update = find_delta_blocks(db_blocks, blocks)
for block_id, block in block_update.items():
await BlocksRegistry.prisma().upsert(
where={"id": block_id},
data={
"create": {
"id": block_id,
"name": block().__class__.__name__, # type: ignore
"definition": Json(block.to_dict(block())), # type: ignore
},
"update": {
"name": block().__class__.__name__, # type: ignore
"definition": Json(block.to_dict(block())), # type: ignore
},
},
)
return block_update

View File

@@ -1,191 +0,0 @@
import json
from datetime import datetime
import pytest
from prisma.models import BlocksRegistry
from backend.blocks.basic import (
FileStoreBlock,
PrintToConsoleBlock,
ReverseListOrderBlock,
StoreValueBlock,
)
from backend.data.block import (
check_block_same,
find_delta_blocks,
recursive_json_compare,
)
@pytest.mark.asyncio
async def test_recursive_json_compare():
db_block_definition = {
"a": 1,
"b": 2,
"c": 3,
}
local_block_definition = {
"a": 1,
"b": 2,
"c": 3,
}
assert recursive_json_compare(db_block_definition, local_block_definition)
assert not recursive_json_compare(
db_block_definition, {**local_block_definition, "d": 4}
)
assert not recursive_json_compare(
db_block_definition, {**local_block_definition, "a": 2}
)
assert not recursive_json_compare(
db_block_definition, {**local_block_definition, "b": 3}
)
assert not recursive_json_compare(
db_block_definition, {**local_block_definition, "c": 4}
)
assert not recursive_json_compare(
db_block_definition, {**local_block_definition, "a": 1, "b": 2, "c": 3, "d": 4}
)
assert recursive_json_compare({}, {})
assert recursive_json_compare({"a": 1}, {"a": 1})
assert not recursive_json_compare({"a": 1}, {"b": 1})
assert not recursive_json_compare({"a": 1}, {"a": 2})
assert not recursive_json_compare({"a": 1}, {"a": [1, 2]})
assert not recursive_json_compare({"a": 1}, {"a": {"b": 1}})
assert not recursive_json_compare({"a": 1}, {"a": {"b": 2}})
assert not recursive_json_compare({"a": 1}, {"a": {"b": [1, 2]}})
assert not recursive_json_compare({"a": 1}, {"a": {"b": {"c": 1}}})
assert not recursive_json_compare({"a": 1}, {"a": {"b": {"c": 2}}})
@pytest.mark.asyncio
async def test_check_block_same():
local_block_instance = PrintToConsoleBlock()
db_block = BlocksRegistry(
id="f3b1c1b2-4c4f-4f0d-8d2f-4c4f0d8d2f4c",
name=local_block_instance.__class__.__name__,
definition=json.dumps(local_block_instance.to_dict()), # type: ignore Too much type magic going on here
updatedAt=datetime.now(),
)
assert check_block_same(db_block, PrintToConsoleBlock) # type: ignore
@pytest.mark.asyncio
async def test_check_block_not_same():
local_block_instance = PrintToConsoleBlock()
local_block_data = local_block_instance.to_dict()
local_block_data["description"] = "Hello, World!"
db_block = BlocksRegistry(
id="f3b1c1b2-4c4f-4f0d-8d2f-4c4f0d8d2f4c",
name=local_block_instance.__class__.__name__,
definition=json.dumps(local_block_data),  # type: ignore  # Too much type magic going on here
updatedAt=datetime.now(),
)
assert not check_block_same(db_block, PrintToConsoleBlock) # type: ignore
@pytest.mark.asyncio
async def test_find_delta_blocks():
now = datetime.now()
store_value_block = StoreValueBlock()
local_blocks = {
PrintToConsoleBlock().id: PrintToConsoleBlock,
ReverseListOrderBlock().id: ReverseListOrderBlock,
FileStoreBlock().id: FileStoreBlock,
store_value_block.id: StoreValueBlock,
}
db_blocks = {
PrintToConsoleBlock().id: BlocksRegistry(
id=PrintToConsoleBlock().id,
name=PrintToConsoleBlock().__class__.__name__,
definition=json.dumps(PrintToConsoleBlock().to_dict()),  # type: ignore  # Too much type magic going on here
updatedAt=now,
),
ReverseListOrderBlock().id: BlocksRegistry(
id=ReverseListOrderBlock().id,
name=ReverseListOrderBlock().__class__.__name__,
definition=json.dumps(ReverseListOrderBlock().to_dict()),  # type: ignore  # Too much type magic going on here
updatedAt=now,
),
FileStoreBlock().id: BlocksRegistry(
id=FileStoreBlock().id,
name=FileStoreBlock().__class__.__name__,
definition=json.dumps(FileStoreBlock().to_dict()),  # type: ignore  # Too much type magic going on here
updatedAt=now,
),
}
delta_blocks = find_delta_blocks(db_blocks, local_blocks)
assert len(delta_blocks) == 1
assert store_value_block.id in delta_blocks.keys()
assert delta_blocks[store_value_block.id] == StoreValueBlock
@pytest.mark.asyncio
async def test_find_delta_blocks_block_updated():
now = datetime.now()
store_value_block = StoreValueBlock()
print_to_console_block_definition = PrintToConsoleBlock().to_dict()
print_to_console_block_definition["description"] = "Hello, World!"
local_blocks = {
PrintToConsoleBlock().id: PrintToConsoleBlock,
ReverseListOrderBlock().id: ReverseListOrderBlock,
FileStoreBlock().id: FileStoreBlock,
store_value_block.id: StoreValueBlock,
}
db_blocks = {
PrintToConsoleBlock().id: BlocksRegistry(
id=PrintToConsoleBlock().id,
name=PrintToConsoleBlock().__class__.__name__,
definition=json.dumps(print_to_console_block_definition),  # type: ignore  # Too much type magic going on here
updatedAt=now,
),
ReverseListOrderBlock().id: BlocksRegistry(
id=ReverseListOrderBlock().id,
name=ReverseListOrderBlock().__class__.__name__,
definition=json.dumps(ReverseListOrderBlock().to_dict()),  # type: ignore  # Too much type magic going on here
updatedAt=now,
),
FileStoreBlock().id: BlocksRegistry(
id=FileStoreBlock().id,
name=FileStoreBlock().__class__.__name__,
definition=json.dumps(FileStoreBlock().to_dict()),  # type: ignore  # Too much type magic going on here
updatedAt=now,
),
}
delta_blocks = find_delta_blocks(db_blocks, local_blocks)
assert len(delta_blocks) == 2
assert store_value_block.id in delta_blocks.keys()
assert delta_blocks[store_value_block.id] == StoreValueBlock
assert PrintToConsoleBlock().id in delta_blocks.keys()
@pytest.mark.asyncio
async def test_find_delta_block_no_diff():
now = datetime.now()
local_blocks = {
PrintToConsoleBlock().id: PrintToConsoleBlock,
ReverseListOrderBlock().id: ReverseListOrderBlock,
FileStoreBlock().id: FileStoreBlock,
}
db_blocks = {
PrintToConsoleBlock().id: BlocksRegistry(
id=PrintToConsoleBlock().id,
name=PrintToConsoleBlock().__class__.__name__,
definition=json.dumps(PrintToConsoleBlock().to_dict()),  # type: ignore  # Too much type magic going on here
updatedAt=now,
),
ReverseListOrderBlock().id: BlocksRegistry(
id=ReverseListOrderBlock().id,
name=ReverseListOrderBlock().__class__.__name__,
definition=json.dumps(ReverseListOrderBlock().to_dict()),  # type: ignore  # Too much type magic going on here
updatedAt=now,
),
FileStoreBlock().id: BlocksRegistry(
id=FileStoreBlock().id,
name=FileStoreBlock().__class__.__name__,
definition=json.dumps(FileStoreBlock().to_dict()),  # type: ignore  # Too much type magic going on here
updatedAt=now,
),
}
delta_blocks = find_delta_blocks(db_blocks, local_blocks)
assert len(delta_blocks) == 0
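For reference, the tests above pin down the expected semantics of `recursive_json_compare`. A minimal sketch consistent with those assertions (the real implementation lives in `backend.data.block`; this standalone version is illustrative only, not the shipped code):

```python
from typing import Any


def recursive_json_compare(a: Any, b: Any) -> bool:
    """Deep-compare two JSON-like values; True only when structure and values match exactly."""
    if type(a) is not type(b):
        # e.g. 1 vs [1, 2] or 1 vs {"b": 1} count as different
        return False
    if isinstance(a, dict):
        # Extra or missing keys on either side count as a difference
        return a.keys() == b.keys() and all(
            recursive_json_compare(a[k], b[k]) for k in a
        )
    if isinstance(a, list):
        return len(a) == len(b) and all(
            recursive_json_compare(x, y) for x, y in zip(a, b)
        )
    return a == b
```

Any shape or value mismatch anywhere in the tree yields `False`, which is what lets `find_delta_blocks` treat a single changed field (like `description`) as a full block update.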

View File

@@ -346,8 +346,6 @@ class APIKeyCredentials(_BaseCredentials):
)
"""Unix timestamp (seconds) indicating when the API key expires (if at all)"""
api_key_env_var: Optional[str] = Field(default=None, exclude=True)
def auth_header(self) -> str:
# Linear API keys should not have Bearer prefix
if self.provider == "linear":
@@ -526,13 +524,13 @@ class CredentialsMetaInput(BaseModel, Generic[CP, CT]):
if hasattr(model_class, "allowed_providers") and hasattr(
model_class, "allowed_cred_types"
):
allowed_providers = sorted(model_class.allowed_providers())
allowed_providers = model_class.allowed_providers()
# If no specific providers (None), allow any string
if allowed_providers is None:
schema["credentials_provider"] = ["string"] # Allow any string provider
else:
schema["credentials_provider"] = allowed_providers
schema["credentials_types"] = sorted(model_class.allowed_cred_types())
schema["credentials_types"] = model_class.allowed_cred_types()
# Do not return anything, just mutate schema in place
model_config = ConfigDict(

View File

@@ -39,6 +39,7 @@ from backend.data.notifications import (
)
from backend.data.user import (
get_active_user_ids_in_timerange,
get_user_by_id,
get_user_email_by_id,
get_user_email_verification,
get_user_integrations,
@@ -146,6 +147,7 @@ class DatabaseManager(AppService):
# User Comms - async
get_active_user_ids_in_timerange = _(get_active_user_ids_in_timerange)
get_user_by_id = _(get_user_by_id)
get_user_email_by_id = _(get_user_email_by_id)
get_user_email_verification = _(get_user_email_verification)
get_user_notification_preference = _(get_user_notification_preference)
@@ -231,6 +233,7 @@ class DatabaseManagerAsyncClient(AppServiceClient):
get_node = d.get_node
get_node_execution = d.get_node_execution
get_node_executions = d.get_node_executions
get_user_by_id = d.get_user_by_id
get_user_integrations = d.get_user_integrations
upsert_execution_input = d.upsert_execution_input
upsert_execution_output = d.upsert_execution_output

View File

@@ -34,6 +34,7 @@ from backend.data.graph import GraphModel, Node
from backend.data.model import CredentialsMetaInput
from backend.data.rabbitmq import Exchange, ExchangeType, Queue, RabbitMQConfig
from backend.data.user import get_user_by_id
from backend.util.cache import cached
from backend.util.clients import (
get_async_execution_event_bus,
get_async_execution_queue,
@@ -41,11 +42,12 @@ from backend.util.clients import (
get_integration_credentials_store,
)
from backend.util.exceptions import GraphValidationError, NotFoundError
from backend.util.logging import TruncatedLogger
from backend.util.logging import TruncatedLogger, is_structured_logging_enabled
from backend.util.settings import Config
from backend.util.type import convert
@cached(maxsize=1000, ttl_seconds=3600)
async def get_user_context(user_id: str) -> UserContext:
"""
Get UserContext for a user, always returns a valid context with timezone.
@@ -53,7 +55,11 @@ async def get_user_context(user_id: str) -> UserContext:
"""
user_context = UserContext(timezone="UTC") # Default to UTC
try:
user = await get_user_by_id(user_id)
if prisma.is_connected():
user = await get_user_by_id(user_id)
else:
user = await get_database_manager_async_client().get_user_by_id(user_id)
if user and user.timezone and user.timezone != "not-set":
user_context.timezone = user.timezone
logger.debug(f"Retrieved user context: timezone={user.timezone}")
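The `@cached(maxsize=1000, ttl_seconds=3600)` decorator added here means `get_user_context` hits the database at most once per user per hour. A rough sketch of what such an async-aware TTL cache does (the real decorator is `backend.util.cache.cached`; this toy version, including its naive oldest-insertion eviction, is an assumption for illustration):

```python
import asyncio
import time
from functools import wraps


def cached(maxsize: int = 128, ttl_seconds: float = 3600):
    """Illustrative async TTL cache; keys on positional args only."""

    def decorator(fn):
        store: dict = {}  # args -> (expires_at, value)

        @wraps(fn)
        async def wrapper(*args):
            now = time.monotonic()
            hit = store.get(args)
            if hit and hit[0] > now:
                return hit[1]  # fresh cache entry: skip the wrapped call
            value = await fn(*args)
            if len(store) >= maxsize:
                store.pop(next(iter(store)))  # evict the oldest insertion
            store[args] = (now + ttl_seconds, value)
            return value

        return wrapper

    return decorator
```

With this in place, repeated executions for the same `user_id` within the TTL reuse the cached `UserContext` instead of re-querying Prisma or the database manager client.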
@@ -93,7 +99,11 @@ class LogMetadata(TruncatedLogger):
"node_id": node_id,
"block_name": block_name,
}
prefix = f"[ExecutionManager|uid:{user_id}|gid:{graph_id}|nid:{node_id}]|geid:{graph_eid}|neid:{node_eid}|{block_name}]"
prefix = (
"[ExecutionManager]"
if is_structured_logging_enabled()
else f"[ExecutionManager|uid:{user_id}|gid:{graph_id}|nid:{node_id}|geid:{graph_eid}|neid:{node_eid}|{block_name}]"  # noqa
)
super().__init__(
logger,
max_length=max_length,

View File

@@ -26,7 +26,6 @@ ollama_credentials = APIKeyCredentials(
id="744fdc56-071a-4761-b5a5-0af0ce10a2b5",
provider="ollama",
api_key=SecretStr("FAKE_API_KEY"),
api_key_env_var=None,
title="Use Credits for Ollama",
expires_at=None,
)
@@ -35,7 +34,6 @@ revid_credentials = APIKeyCredentials(
id="fdb7f412-f519-48d1-9b5f-d2f73d0e01fe",
provider="revid",
api_key=SecretStr(settings.secrets.revid_api_key),
api_key_env_var="REVID_API_KEY",
title="Use Credits for Revid",
expires_at=None,
)
@@ -43,7 +41,6 @@ ideogram_credentials = APIKeyCredentials(
id="760f84fc-b270-42de-91f6-08efe1b512d0",
provider="ideogram",
api_key=SecretStr(settings.secrets.ideogram_api_key),
api_key_env_var="IDEOGRAM_API_KEY",
title="Use Credits for Ideogram",
expires_at=None,
)
@@ -51,7 +48,6 @@ replicate_credentials = APIKeyCredentials(
id="6b9fc200-4726-4973-86c9-cd526f5ce5db",
provider="replicate",
api_key=SecretStr(settings.secrets.replicate_api_key),
api_key_env_var="REPLICATE_API_KEY",
title="Use Credits for Replicate",
expires_at=None,
)
@@ -59,7 +55,6 @@ openai_credentials = APIKeyCredentials(
id="53c25cb8-e3ee-465c-a4d1-e75a4c899c2a",
provider="openai",
api_key=SecretStr(settings.secrets.openai_api_key),
api_key_env_var="OPENAI_API_KEY",
title="Use Credits for OpenAI",
expires_at=None,
)
@@ -67,7 +62,6 @@ aiml_api_credentials = APIKeyCredentials(
id="aad82a89-9794-4ebb-977f-d736aa5260a3",
provider="aiml_api",
api_key=SecretStr(settings.secrets.aiml_api_key),
api_key_env_var="AIML_API_KEY",
title="Use Credits for AI/ML API",
expires_at=None,
)
@@ -75,7 +69,6 @@ anthropic_credentials = APIKeyCredentials(
id="24e5d942-d9e3-4798-8151-90143ee55629",
provider="anthropic",
api_key=SecretStr(settings.secrets.anthropic_api_key),
api_key_env_var="ANTHROPIC_API_KEY",
title="Use Credits for Anthropic",
expires_at=None,
)
@@ -83,7 +76,6 @@ groq_credentials = APIKeyCredentials(
id="4ec22295-8f97-4dd1-b42b-2c6957a02545",
provider="groq",
api_key=SecretStr(settings.secrets.groq_api_key),
api_key_env_var="GROQ_API_KEY",
title="Use Credits for Groq",
expires_at=None,
)
@@ -91,7 +83,6 @@ did_credentials = APIKeyCredentials(
id="7f7b0654-c36b-4565-8fa7-9a52575dfae2",
provider="d_id",
api_key=SecretStr(settings.secrets.did_api_key),
api_key_env_var="DID_API_KEY",
title="Use Credits for D-ID",
expires_at=None,
)
@@ -99,7 +90,6 @@ jina_credentials = APIKeyCredentials(
id="7f26de70-ba0d-494e-ba76-238e65e7b45f",
provider="jina",
api_key=SecretStr(settings.secrets.jina_api_key),
api_key_env_var="JINA_API_KEY",
title="Use Credits for Jina",
expires_at=None,
)
@@ -107,7 +97,6 @@ unreal_credentials = APIKeyCredentials(
id="66f20754-1b81-48e4-91d0-f4f0dd82145f",
provider="unreal",
api_key=SecretStr(settings.secrets.unreal_speech_api_key),
api_key_env_var="UNREAL_SPEECH_API_KEY",
title="Use Credits for Unreal",
expires_at=None,
)
@@ -115,7 +104,6 @@ open_router_credentials = APIKeyCredentials(
id="b5a0e27d-0c98-4df3-a4b9-10193e1f3c40",
provider="open_router",
api_key=SecretStr(settings.secrets.open_router_api_key),
api_key_env_var="OPEN_ROUTER_API_KEY",
title="Use Credits for Open Router",
expires_at=None,
)
@@ -123,7 +111,6 @@ fal_credentials = APIKeyCredentials(
id="6c0f5bd0-9008-4638-9d79-4b40b631803e",
provider="fal",
api_key=SecretStr(settings.secrets.fal_api_key),
api_key_env_var="FAL_API_KEY",
title="Use Credits for FAL",
expires_at=None,
)
@@ -131,7 +118,6 @@ exa_credentials = APIKeyCredentials(
id="96153e04-9c6c-4486-895f-5bb683b1ecec",
provider="exa",
api_key=SecretStr(settings.secrets.exa_api_key),
api_key_env_var="EXA_API_KEY",
title="Use Credits for Exa search",
expires_at=None,
)
@@ -139,7 +125,6 @@ e2b_credentials = APIKeyCredentials(
id="78d19fd7-4d59-4a16-8277-3ce310acf2b7",
provider="e2b",
api_key=SecretStr(settings.secrets.e2b_api_key),
api_key_env_var="E2B_API_KEY",
title="Use Credits for E2B",
expires_at=None,
)
@@ -147,7 +132,6 @@ nvidia_credentials = APIKeyCredentials(
id="96b83908-2789-4dec-9968-18f0ece4ceb3",
provider="nvidia",
api_key=SecretStr(settings.secrets.nvidia_api_key),
api_key_env_var="NVIDIA_API_KEY",
title="Use Credits for Nvidia",
expires_at=None,
)
@@ -155,7 +139,6 @@ screenshotone_credentials = APIKeyCredentials(
id="3b1bdd16-8818-4bc2-8cbb-b23f9a3439ed",
provider="screenshotone",
api_key=SecretStr(settings.secrets.screenshotone_api_key),
api_key_env_var="SCREENSHOTONE_API_KEY",
title="Use Credits for ScreenshotOne",
expires_at=None,
)
@@ -163,7 +146,6 @@ mem0_credentials = APIKeyCredentials(
id="ed55ac19-356e-4243-a6cb-bc599e9b716f",
provider="mem0",
api_key=SecretStr(settings.secrets.mem0_api_key),
api_key_env_var="MEM0_API_KEY",
title="Use Credits for Mem0",
expires_at=None,
)
@@ -172,7 +154,6 @@ apollo_credentials = APIKeyCredentials(
id="544c62b5-1d0f-4156-8fb4-9525f11656eb",
provider="apollo",
api_key=SecretStr(settings.secrets.apollo_api_key),
api_key_env_var="APOLLO_API_KEY",
title="Use Credits for Apollo",
expires_at=None,
)
@@ -181,7 +162,6 @@ smartlead_credentials = APIKeyCredentials(
id="3bcdbda3-84a3-46af-8fdb-bfd2472298b8",
provider="smartlead",
api_key=SecretStr(settings.secrets.smartlead_api_key),
api_key_env_var="SMARTLEAD_API_KEY",
title="Use Credits for SmartLead",
expires_at=None,
)
@@ -190,7 +170,6 @@ google_maps_credentials = APIKeyCredentials(
id="9aa1bde0-4947-4a70-a20c-84daa3850d52",
provider="google_maps",
api_key=SecretStr(settings.secrets.google_maps_api_key),
api_key_env_var="GOOGLE_MAPS_API_KEY",
title="Use Credits for Google Maps",
expires_at=None,
)
@@ -199,7 +178,6 @@ zerobounce_credentials = APIKeyCredentials(
id="63a6e279-2dc2-448e-bf57-85776f7176dc",
provider="zerobounce",
api_key=SecretStr(settings.secrets.zerobounce_api_key),
api_key_env_var="ZEROBOUNCE_API_KEY",
title="Use Credits for ZeroBounce",
expires_at=None,
)
@@ -208,7 +186,6 @@ enrichlayer_credentials = APIKeyCredentials(
id="d9fce73a-6c1d-4e8b-ba2e-12a456789def",
provider="enrichlayer",
api_key=SecretStr(settings.secrets.enrichlayer_api_key),
api_key_env_var="ENRICHLAYER_API_KEY",
title="Use Credits for Enrichlayer",
expires_at=None,
)
@@ -218,7 +195,6 @@ llama_api_credentials = APIKeyCredentials(
id="d44045af-1c33-4833-9e19-752313214de2",
provider="llama_api",
api_key=SecretStr(settings.secrets.llama_api_key),
api_key_env_var="LLAMA_API_KEY",
title="Use Credits for Llama API",
expires_at=None,
)
@@ -227,7 +203,6 @@ v0_credentials = APIKeyCredentials(
id="c4e6d1a0-3b5f-4789-a8e2-9b123456789f",
provider="v0",
api_key=SecretStr(settings.secrets.v0_api_key),
api_key_env_var="V0_API_KEY",
title="Use Credits for v0 by Vercel",
expires_at=None,
)

View File

@@ -17,8 +17,9 @@ from backend.data.model import (
)
from backend.integrations.oauth.base import BaseOAuthHandler
from backend.integrations.webhooks._base import BaseWebhooksManager
from backend.sdk.provider import OAuthConfig, Provider, ProviderRegister
from backend.sdk.provider import OAuthConfig, Provider
from backend.sdk.registry import AutoRegistry
from backend.util.settings import Settings
logger = logging.getLogger(__name__)
@@ -39,7 +40,6 @@ class ProviderBuilder:
self._client_id_env_var: Optional[str] = None
self._client_secret_env_var: Optional[str] = None
self._extra_config: dict = {}
self._register: ProviderRegister = ProviderRegister(name=name)
def with_oauth(
self,
@@ -48,11 +48,6 @@ class ProviderBuilder:
client_id_env_var: Optional[str] = None,
client_secret_env_var: Optional[str] = None,
) -> "ProviderBuilder":
self._register.with_oauth = True
self._register.client_id_env_var = client_id_env_var
self._register.client_secret_env_var = client_secret_env_var
"""Add OAuth support."""
if not client_id_env_var or not client_secret_env_var:
client_id_env_var = f"{self.name}_client_id".upper()
@@ -78,8 +73,6 @@ class ProviderBuilder:
def with_api_key(self, env_var_name: str, title: str) -> "ProviderBuilder":
"""Add API key support with environment variable name."""
self._register.with_api_key = True
self._register.api_key_env_var = env_var_name
self._supported_auth_types.add("api_key")
# Register the API key mapping
@@ -98,14 +91,30 @@ class ProviderBuilder:
)
return self
def with_api_key_from_settings(
self, settings_attr: str, title: str
) -> "ProviderBuilder":
"""Use existing API key from settings."""
self._supported_auth_types.add("api_key")
# Try to get the API key from settings
settings = Settings()
api_key = getattr(settings.secrets, settings_attr, None)
if api_key:
self._default_credentials.append(
APIKeyCredentials(
id=f"{self.name}-default",
provider=self.name,
api_key=api_key,
title=title,
)
)
return self
def with_user_password(
self, username_env_var: str, password_env_var: str, title: str
) -> "ProviderBuilder":
"""Add username/password support with environment variable names."""
self._register.with_user_password = True
self._register.username_env_var = username_env_var
self._register.password_env_var = password_env_var
self._supported_auth_types.add("user_password")
# Check if credentials exist in environment
@@ -165,7 +174,6 @@ class ProviderBuilder:
supported_auth_types=self._supported_auth_types,
api_client_factory=self._api_client_factory,
error_handler=self._error_handler,
register=self._register,
**self._extra_config,
)
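The new `with_api_key_from_settings` flow can be pictured with a toy model of the builder (hypothetical `MiniSecrets` attribute and provider name; the real class reads `Settings().secrets` and builds `APIKeyCredentials`):

```python
from typing import Optional


class MiniSecrets:
    """Stand-in for Settings().secrets; `acme_api_key` is a made-up attribute."""
    acme_api_key: Optional[str] = "sk-test"


class MiniProviderBuilder:
    """Toy model of the ProviderBuilder flow above, not the real SDK class."""

    def __init__(self, name: str):
        self.name = name
        self._default_credentials: list = []

    def with_api_key_from_settings(self, settings_attr: str, title: str) -> "MiniProviderBuilder":
        api_key = getattr(MiniSecrets, settings_attr, None)
        if api_key:  # only register default credentials when the secret is actually set
            self._default_credentials.append(
                {"id": f"{self.name}-default", "provider": self.name, "title": title}
            )
        return self  # fluent: calls chain like the real builder
```

Note the quiet fallback: if the settings attribute is missing or empty, the provider still builds, it just ships without default platform credentials.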

View File

@@ -1,123 +0,0 @@
from typing import Dict
from prisma import Prisma
from prisma.models import ProviderRegistry as PrismaProviderRegistry
from backend.sdk.provider import ProviderRegister
def is_providers_different(
current_provider: PrismaProviderRegistry, new_provider: ProviderRegister
) -> bool:
"""
Compare a current provider (as stored in the database) against a new provider registration
and determine if they are different. This is done by converting the database model to a
ProviderRegister and checking for equality (all fields compared).
Args:
current_provider (PrismaProviderRegistry): The provider as stored in the database.
new_provider (ProviderRegister): The provider specification to compare.
Returns:
bool: True if the providers differ, False if they are effectively the same.
"""
current_provider_register = ProviderRegister(
name=current_provider.name,
with_oauth=current_provider.with_oauth,
client_id_env_var=current_provider.client_id_env_var,
client_secret_env_var=current_provider.client_secret_env_var,
with_api_key=current_provider.with_api_key,
api_key_env_var=current_provider.api_key_env_var,
with_user_password=current_provider.with_user_password,
username_env_var=current_provider.username_env_var,
password_env_var=current_provider.password_env_var,
)
if current_provider_register == new_provider:
return False
return True
def find_delta_providers(
current_providers: Dict[str, PrismaProviderRegistry],
providers: Dict[str, ProviderRegister],
) -> Dict[str, ProviderRegister]:
"""
Identify providers that are either new or updated compared to the current providers list.
Args:
current_providers (Dict[str, PrismaProviderRegistry]): Dictionary of current provider models keyed by provider name.
providers (Dict[str, ProviderRegister]): Dictionary of new provider registrations keyed by provider name.
Returns:
Dict[str, ProviderRegister]: Providers that need to be added/updated in the registry.
- Includes providers not in current_providers.
- Includes providers where the data differs from what's in current_providers.
"""
provider_update = {}
for name, provider in providers.items():
if name not in current_providers:
provider_update[name] = provider
else:
if is_providers_different(current_providers[name], provider):
provider_update[name] = provider
return provider_update
async def get_providers() -> Dict[str, PrismaProviderRegistry]:
"""
Retrieve all provider registries from the database.
Returns:
Dict[str, PrismaProviderRegistry]: Dictionary of all current providers, keyed by provider name.
"""
async with Prisma() as prisma:
providers = await prisma.providerregistry.find_many()
return {
provider.name: PrismaProviderRegistry(**provider.model_dump())
for provider in providers
}
async def upsert_providers_change_bulk(providers: Dict[str, ProviderRegister]):
"""
Bulk upsert providers into the database after checking for changes.
Args:
providers (Dict[str, ProviderRegister]): Dictionary of new provider registrations keyed by provider name.
"""
current_providers = await get_providers()
provider_update = find_delta_providers(current_providers, providers)
"""Async version of bulk upsert providers with all fields using transaction for atomicity"""
async with Prisma() as prisma:
async with prisma.tx() as tx:
results = []
for name, provider in provider_update.items():
result = await tx.providerregistry.upsert(
where={"name": name},
data={
"create": {
"name": name,
"with_oauth": provider.with_oauth,
"client_id_env_var": provider.client_id_env_var,
"client_secret_env_var": provider.client_secret_env_var,
"with_api_key": provider.with_api_key,
"api_key_env_var": provider.api_key_env_var,
"with_user_password": provider.with_user_password,
"username_env_var": provider.username_env_var,
"password_env_var": provider.password_env_var,
},
"update": {
"with_oauth": provider.with_oauth,
"client_id_env_var": provider.client_id_env_var,
"client_secret_env_var": provider.client_secret_env_var,
"with_api_key": provider.with_api_key,
"api_key_env_var": provider.api_key_env_var,
"with_user_password": provider.with_user_password,
"username_env_var": provider.username_env_var,
"password_env_var": provider.password_env_var,
},
},
)
results.append(result)
return results

View File

@@ -1,127 +0,0 @@
from datetime import datetime
import pytest
from prisma.models import ProviderRegistry as PrismaProviderRegistry
from backend.sdk.db import find_delta_providers, is_providers_different
from backend.sdk.provider import ProviderRegister
@pytest.mark.asyncio
def test_is_providers_different_same():
current_provider = PrismaProviderRegistry(
name="test_provider",
with_oauth=True,
client_id_env_var="TEST_CLIENT_ID",
client_secret_env_var="TEST_CLIENT_SECRET",
with_api_key=True,
api_key_env_var="TEST_API_KEY",
with_user_password=True,
username_env_var="TEST_USERNAME",
password_env_var="TEST_PASSWORD",
updatedAt=datetime.now(),
)
new_provider = ProviderRegister(
name="test_provider",
with_oauth=True,
client_id_env_var="TEST_CLIENT_ID",
client_secret_env_var="TEST_CLIENT_SECRET",
with_api_key=True,
api_key_env_var="TEST_API_KEY",
with_user_password=True,
username_env_var="TEST_USERNAME",
password_env_var="TEST_PASSWORD",
)
assert not is_providers_different(current_provider, new_provider)
@pytest.mark.asyncio
def test_is_providers_different_different():
current_provider = PrismaProviderRegistry(
name="test_provider",
with_oauth=True,
client_id_env_var="TEST_CLIENT_ID",
client_secret_env_var="TEST_CLIENT_SECRET",
with_api_key=True,
api_key_env_var="TEST_API_KEY",
with_user_password=True,
username_env_var="TEST_USERNAME",
password_env_var="TEST_PASSWORD",
updatedAt=datetime.now(),
)
new_provider = ProviderRegister(
name="test_provider",
with_oauth=False,
with_api_key=True,
api_key_env_var="TEST_API_KEY",
with_user_password=True,
username_env_var="TEST_USERNAME",
password_env_var="TEST_PASSWORD",
)
assert is_providers_different(current_provider, new_provider)
@pytest.mark.asyncio
def test_find_delta_providers():
current_providers = {
"test_provider": PrismaProviderRegistry(
name="test_provider",
with_oauth=True,
client_id_env_var="TEST_CLIENT_ID",
client_secret_env_var="TEST_CLIENT_SECRET",
with_api_key=True,
api_key_env_var="TEST_API_KEY",
with_user_password=True,
username_env_var="TEST_USERNAME",
password_env_var="TEST_PASSWORD",
updatedAt=datetime.now(),
),
"test_provider_2": PrismaProviderRegistry(
name="test_provider_2",
with_oauth=True,
client_id_env_var="TEST_CLIENT_ID_2",
client_secret_env_var="TEST_CLIENT_SECRET_2",
with_api_key=True,
api_key_env_var="TEST_API_KEY_2",
with_user_password=True,
username_env_var="TEST_USERNAME_2",
password_env_var="TEST_PASSWORD_2",
updatedAt=datetime.now(),
),
}
new_providers = {
"test_provider": ProviderRegister(
name="test_provider",
with_oauth=True,
client_id_env_var="TEST_CLIENT_ID",
client_secret_env_var="TEST_CLIENT_SECRET",
with_api_key=True,
api_key_env_var="TEST_API_KEY",
with_user_password=True,
username_env_var="TEST_USERNAME",
password_env_var="TEST_PASSWORD",
),
"test_provider_2": ProviderRegister(
name="test_provider_2",
with_oauth=False,
with_api_key=True,
api_key_env_var="TEST_API_KEY_2",
with_user_password=True,
username_env_var="TEST_USERNAME_2",
password_env_var="TEST_PASSWORD_2",
),
"test_provider_3": ProviderRegister(
name="test_provider_3",
with_oauth=True,
client_id_env_var="TEST_CLIENT_ID_3",
client_secret_env_var="TEST_CLIENT_SECRET_3",
with_api_key=False,
with_user_password=True,
username_env_var="TEST_USERNAME_3",
password_env_var="TEST_PASSWORD_3",
),
}
assert find_delta_providers(current_providers, new_providers) == {
"test_provider_2": new_providers["test_provider_2"],
"test_provider_3": new_providers["test_provider_3"],
}

View File

@@ -30,23 +30,6 @@ class OAuthConfig(BaseModel):
client_secret_env_var: str
class ProviderRegister(BaseModel):
"""Provider log configuration for SDK providers."""
name: str
with_oauth: bool = False
client_id_env_var: Optional[str] = None
client_secret_env_var: Optional[str] = None
with_api_key: bool = False
api_key_env_var: Optional[str] = None
with_user_password: bool = False
username_env_var: Optional[str] = None
password_env_var: Optional[str] = None
class Provider:
"""A configured provider that blocks can use.
@@ -65,7 +48,6 @@ class Provider:
def __init__(
self,
name: str,
register: ProviderRegister,
oauth_config: Optional[OAuthConfig] = None,
webhook_manager: Optional[Type[BaseWebhooksManager]] = None,
default_credentials: Optional[List[Credentials]] = None,
@@ -83,7 +65,7 @@ class Provider:
self.supported_auth_types = supported_auth_types or set()
self._api_client_factory = api_client_factory
self._error_handler = error_handler
self.register = register
# Store any additional configuration
self._extra_config = kwargs
self.test_credentials_uuid = uuid.uuid4()

View File

@@ -13,11 +13,9 @@ from backend.data.model import Credentials
from backend.integrations.oauth.base import BaseOAuthHandler
from backend.integrations.providers import ProviderName
from backend.integrations.webhooks._base import BaseWebhooksManager
from backend.sdk.db import upsert_providers_change_bulk
from backend.sdk.provider import ProviderRegister
if TYPE_CHECKING:
from backend.sdk.provider import Provider, ProviderRegister
from backend.sdk.provider import Provider
logger = logging.getLogger(__name__)
@@ -59,7 +57,6 @@ class AutoRegistry:
_webhook_managers: Dict[str, Type[BaseWebhooksManager]] = {}
_block_configurations: Dict[Type[Block], BlockConfiguration] = {}
_api_key_mappings: Dict[str, str] = {} # provider -> env_var_name
_provider_registry: Dict[str, ProviderRegister] = {}
@classmethod
def register_provider(cls, provider: "Provider") -> None:
@@ -67,7 +64,6 @@ class AutoRegistry:
with cls._lock:
cls._providers[provider.name] = provider
cls._provider_registry[provider.name] = provider.register
# Register OAuth handler if provided
if provider.oauth_config:
# Dynamically set PROVIDER_NAME if not already set
@@ -167,7 +163,7 @@ class AutoRegistry:
cls._api_key_mappings.clear()
@classmethod
async def patch_integrations(cls) -> None:
def patch_integrations(cls) -> None:
"""Patch existing integration points to use AutoRegistry."""
# OAuth handlers are handled by SDKAwareHandlersDict in oauth/__init__.py
# No patching needed for OAuth handlers
@@ -217,73 +213,6 @@ class AutoRegistry:
creds_store: Any = backend.integrations.credentials_store
if "backend.integrations.providers" in sys.modules:
providers: Any = sys.modules["backend.integrations.providers"]
else:
import backend.integrations.providers
providers: Any = backend.integrations.providers
legacy_oauth_providers = {
providers.ProviderName.DISCORD.value: ProviderRegister(
name=providers.ProviderName.DISCORD.value,
with_oauth=True,
client_id_env_var="DISCORD_CLIENT_ID",
client_secret_env_var="DISCORD_CLIENT_SECRET",
),
providers.ProviderName.GITHUB.value: ProviderRegister(
name=providers.ProviderName.GITHUB.value,
with_oauth=True,
client_id_env_var="GITHUB_CLIENT_ID",
client_secret_env_var="GITHUB_CLIENT_SECRET",
),
providers.ProviderName.GOOGLE.value: ProviderRegister(
name=providers.ProviderName.GOOGLE.value,
with_oauth=True,
client_id_env_var="GOOGLE_CLIENT_ID",
client_secret_env_var="GOOGLE_CLIENT_SECRET",
),
providers.ProviderName.NOTION.value: ProviderRegister(
name=providers.ProviderName.NOTION.value,
with_oauth=True,
client_id_env_var="NOTION_CLIENT_ID",
client_secret_env_var="NOTION_CLIENT_SECRET",
),
providers.ProviderName.TWITTER.value: ProviderRegister(
name=providers.ProviderName.TWITTER.value,
with_oauth=True,
client_id_env_var="TWITTER_CLIENT_ID",
client_secret_env_var="TWITTER_CLIENT_SECRET",
),
providers.ProviderName.TODOIST.value: ProviderRegister(
name=providers.ProviderName.TODOIST.value,
with_oauth=True,
client_id_env_var="TODOIST_CLIENT_ID",
client_secret_env_var="TODOIST_CLIENT_SECRET",
),
}
if hasattr(creds_store, "DEFAULT_CREDENTIALS"):
DEFAULT_CREDENTIALS = creds_store.DEFAULT_CREDENTIALS
for item in DEFAULT_CREDENTIALS:
new_cred = ProviderRegister(
name=item.provider,
with_api_key=True,
api_key_env_var=item.api_key_env_var,
)
if item.provider in legacy_oauth_providers:
new_cred.with_oauth = True
new_cred.client_id_env_var = legacy_oauth_providers[
item.provider
].client_id_env_var
new_cred.client_secret_env_var = legacy_oauth_providers[
item.provider
].client_secret_env_var
cls._provider_registry[item.provider] = new_cred
await upsert_providers_change_bulk(providers=cls._provider_registry)
if hasattr(creds_store, "IntegrationCredentialsStore"):
store_class = creds_store.IntegrationCredentialsStore
if hasattr(store_class, "get_all_creds"):
@@ -308,6 +237,5 @@ class AutoRegistry:
logger.info(
"Successfully patched IntegrationCredentialsStore.get_all_creds"
)
except Exception as e:
logging.warning(f"Failed to patch credentials store: {e}")

View File

@@ -16,7 +16,6 @@ from fastapi.middleware.gzip import GZipMiddleware
from fastapi.routing import APIRoute
from prisma.errors import PrismaError
import backend.blocks
import backend.data.block
import backend.data.db
import backend.data.graph
@@ -96,13 +95,10 @@ async def lifespan_context(app: fastapi.FastAPI):
# Ensure SDK auto-registration is patched before initializing blocks
from backend.sdk.registry import AutoRegistry
await AutoRegistry.patch_integrations()
AutoRegistry.patch_integrations()
await backend.data.block.initialize_blocks()
blocks = backend.blocks.load_all_blocks()
await backend.data.block.upsert_blocks_change_bulk(blocks)
await backend.data.user.migrate_and_encrypt_user_integrations()
await backend.data.graph.fix_llm_provider_credentials()
await backend.data.graph.migrate_llm_models(LlmModel.GPT4O)

View File

@@ -63,9 +63,9 @@ def initialize_launchdarkly() -> None:
config = Config(sdk_key)
ldclient.set_config(config)
global _is_initialized
_is_initialized = True
if ldclient.get().is_initialized():
global _is_initialized
_is_initialized = True
logger.info("LaunchDarkly client initialized successfully")
else:
logger.error("LaunchDarkly client failed to initialize")
@@ -218,7 +218,8 @@ def feature_flag(
if not get_client().is_initialized():
logger.warning(
f"LaunchDarkly not initialized, using default={default}"
"LaunchDarkly not initialized, "
f"using default {flag_key}={repr(default)}"
)
is_enabled = default
else:
@@ -232,8 +233,9 @@ def feature_flag(
else:
# Log warning and use default for non-boolean values
logger.warning(
f"Feature flag {flag_key} returned non-boolean value: {flag_value} (type: {type(flag_value).__name__}). "
f"Using default={default}"
f"Feature flag {flag_key} returned non-boolean value: "
f"{repr(flag_value)} (type: {type(flag_value).__name__}). "
f"Using default value {repr(default)}"
)
is_enabled = default
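The non-boolean branch above can be isolated as a small helper to show the intended contract (hypothetical function name; the shipped code inlines this inside the `feature_flag` decorator):

```python
import logging

logger = logging.getLogger(__name__)


def resolve_bool_flag(flag_value, flag_key: str, default: bool) -> bool:
    """Trust a flag evaluation only when it is a real boolean; otherwise warn and fall back."""
    if isinstance(flag_value, bool):
        return flag_value
    logger.warning(
        f"Feature flag {flag_key} returned non-boolean value: "
        f"{repr(flag_value)} (type: {type(flag_value).__name__}). "
        f"Using default value {repr(default)}"
    )
    return default
```

This keeps a misconfigured string or JSON flag from silently being treated as truthy.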

View File

@@ -8,10 +8,7 @@ settings = Settings()
def configure_logging():
import autogpt_libs.logging.config
if (
settings.config.behave_as == BehaveAs.LOCAL
or settings.config.app_env == AppEnvironment.LOCAL
):
if not is_structured_logging_enabled():
autogpt_libs.logging.config.configure_logging(force_cloud_logging=False)
else:
autogpt_libs.logging.config.configure_logging(force_cloud_logging=True)
@@ -20,6 +17,14 @@ def configure_logging():
logging.getLogger("httpx").setLevel(logging.WARNING)
def is_structured_logging_enabled() -> bool:
"""Check if structured logging (cloud logging) is enabled."""
return not (
settings.config.behave_as == BehaveAs.LOCAL
or settings.config.app_env == AppEnvironment.LOCAL
)
class TruncatedLogger:
def __init__(
self,

View File

@@ -4,15 +4,18 @@ from typing import Literal
import sentry_sdk
from pydantic import BaseModel, Field, SecretStr
from pydantic import SecretStr
from sentry_sdk.integrations import DidNotEnable
from sentry_sdk.integrations.anthropic import AnthropicIntegration
from sentry_sdk.integrations.asyncio import AsyncioIntegration
from sentry_sdk.integrations.launchdarkly import LaunchDarklyIntegration
from sentry_sdk.integrations.logging import LoggingIntegration
from backend.util.feature_flag import get_client, is_configured
from backend.util import feature_flag
from backend.util.settings import Settings
settings = Settings()
logger = logging.getLogger(__name__)
class DiscordChannel(str, Enum):
@@ -23,8 +26,11 @@ class DiscordChannel(str, Enum):
def sentry_init():
sentry_dsn = settings.secrets.sentry_dsn
integrations = []
if is_configured():
integrations.append(LaunchDarklyIntegration(get_client()))
if feature_flag.is_configured():
try:
integrations.append(LaunchDarklyIntegration(feature_flag.get_client()))
except DidNotEnable as e:
logger.error(f"Error enabling LaunchDarklyIntegration for Sentry: {e}")
sentry_sdk.init(
dsn=sentry_dsn,
traces_sample_rate=1.0,

View File

@@ -3,6 +3,7 @@ import logging
import bleach
from bleach.css_sanitizer import CSSSanitizer
from jinja2 import BaseLoader
from jinja2.exceptions import TemplateError
from jinja2.sandbox import SandboxedEnvironment
from markupsafe import Markup
@@ -101,8 +102,11 @@ class TextFormatter:
def format_string(self, template_str: str, values=None, **kwargs) -> str:
"""Regular template rendering with escaping"""
template = self.env.from_string(template_str)
return template.render(values or {}, **kwargs)
try:
template = self.env.from_string(template_str)
return template.render(values or {}, **kwargs)
except TemplateError as e:
raise ValueError(e) from e
def format_email(
self,

View File

@@ -1,31 +0,0 @@
-- CreateTable
CREATE TABLE "ProviderRegistry" (
"name" TEXT NOT NULL,
"with_oauth" BOOLEAN NOT NULL DEFAULT false,
"client_id_env_var" TEXT,
"client_secret_env_var" TEXT,
"with_api_key" BOOLEAN NOT NULL DEFAULT false,
"api_key_env_var" TEXT,
"with_user_password" BOOLEAN NOT NULL DEFAULT false,
"username_env_var" TEXT,
"password_env_var" TEXT,
"updatedAt" TIMESTAMP(3) NOT NULL,
CONSTRAINT "ProviderRegistry_pkey" PRIMARY KEY ("name")
);
-- CreateTable
CREATE TABLE "BlocksRegistry" (
"id" TEXT NOT NULL,
"name" TEXT NOT NULL,
"definition" JSONB NOT NULL,
"updatedAt" TIMESTAMP(3) NOT NULL,
CONSTRAINT "BlocksRegistry_pkey" PRIMARY KEY ("id")
);
-- CreateIndex
CREATE INDEX "ProviderRegistry_updatedAt_idx" ON "ProviderRegistry"("updatedAt");
-- CreateIndex
CREATE INDEX "BlocksRegistry_updatedAt_idx" ON "BlocksRegistry"("updatedAt");

View File

@@ -61,34 +61,6 @@ model User {
NotificationBatches UserNotificationBatch[]
}
// This model describes the providers that are available to the user.
model ProviderRegistry {
name String @id
with_oauth Boolean @default(false)
client_id_env_var String?
client_secret_env_var String?
with_api_key Boolean @default(false)
api_key_env_var String?
with_user_password Boolean @default(false)
username_env_var String?
password_env_var String?
updatedAt DateTime @updatedAt
@@index([updatedAt])
}
model BlocksRegistry {
id String @id @default(uuid())
name String
definition Json
updatedAt DateTime @updatedAt
@@index([updatedAt])
}
enum OnboardingStep {
// Introductory onboarding (Library)
WELCOME

View File

@@ -0,0 +1,140 @@
from unittest.mock import Mock, patch
import pytest
from youtube_transcript_api._errors import NoTranscriptFound
from youtube_transcript_api._transcripts import FetchedTranscript, Transcript
from backend.blocks.youtube import TranscribeYoutubeVideoBlock
class TestTranscribeYoutubeVideoBlock:
"""Test cases for TranscribeYoutubeVideoBlock language fallback functionality."""
def setup_method(self):
"""Set up test fixtures."""
self.youtube_block = TranscribeYoutubeVideoBlock()
def test_extract_video_id_standard_url(self):
"""Test extracting video ID from standard YouTube URL."""
url = "https://www.youtube.com/watch?v=dQw4w9WgXcQ"
video_id = self.youtube_block.extract_video_id(url)
assert video_id == "dQw4w9WgXcQ"
def test_extract_video_id_short_url(self):
"""Test extracting video ID from shortened youtu.be URL."""
url = "https://youtu.be/dQw4w9WgXcQ"
video_id = self.youtube_block.extract_video_id(url)
assert video_id == "dQw4w9WgXcQ"
def test_extract_video_id_embed_url(self):
"""Test extracting video ID from embed URL."""
url = "https://www.youtube.com/embed/dQw4w9WgXcQ"
video_id = self.youtube_block.extract_video_id(url)
assert video_id == "dQw4w9WgXcQ"
@patch("backend.blocks.youtube.YouTubeTranscriptApi")
def test_get_transcript_english_available(self, mock_api_class):
"""Test getting transcript when English is available."""
# Setup mock
mock_api = Mock()
mock_api_class.return_value = mock_api
mock_transcript = Mock(spec=FetchedTranscript)
mock_api.fetch.return_value = mock_transcript
# Execute
result = TranscribeYoutubeVideoBlock.get_transcript("test_video_id")
# Assert
assert result == mock_transcript
mock_api.fetch.assert_called_once_with(video_id="test_video_id")
mock_api.list.assert_not_called()
@patch("backend.blocks.youtube.YouTubeTranscriptApi")
def test_get_transcript_fallback_to_first_available(self, mock_api_class):
"""Test fallback to first available language when English is not available."""
# Setup mock
mock_api = Mock()
mock_api_class.return_value = mock_api
# Create mock transcript list with Hungarian transcript
mock_transcript_list = Mock()
mock_transcript_hu = Mock(spec=Transcript)
mock_fetched_transcript = Mock(spec=FetchedTranscript)
mock_transcript_hu.fetch.return_value = mock_fetched_transcript
# Set up the transcript list to have manually created transcripts empty
# and generated transcripts with Hungarian
mock_transcript_list._manually_created_transcripts = {}
mock_transcript_list._generated_transcripts = {"hu": mock_transcript_hu}
# Mock API to raise NoTranscriptFound for English, then return list
mock_api.fetch.side_effect = NoTranscriptFound(
"test_video_id", ("en",), mock_transcript_list
)
mock_api.list.return_value = mock_transcript_list
# Execute
result = TranscribeYoutubeVideoBlock.get_transcript("test_video_id")
# Assert
assert result == mock_fetched_transcript
mock_api.fetch.assert_called_once_with(video_id="test_video_id")
mock_api.list.assert_called_once_with("test_video_id")
mock_transcript_hu.fetch.assert_called_once()
@patch("backend.blocks.youtube.YouTubeTranscriptApi")
def test_get_transcript_prefers_manually_created(self, mock_api_class):
"""Test that manually created transcripts are preferred over generated ones."""
# Setup mock
mock_api = Mock()
mock_api_class.return_value = mock_api
# Create mock transcript list with both manual and generated transcripts
mock_transcript_list = Mock()
mock_transcript_manual = Mock(spec=Transcript)
mock_transcript_generated = Mock(spec=Transcript)
mock_fetched_manual = Mock(spec=FetchedTranscript)
mock_transcript_manual.fetch.return_value = mock_fetched_manual
# Set up the transcript list
mock_transcript_list._manually_created_transcripts = {
"es": mock_transcript_manual
}
mock_transcript_list._generated_transcripts = {"hu": mock_transcript_generated}
# Mock API to raise NoTranscriptFound for English
mock_api.fetch.side_effect = NoTranscriptFound(
"test_video_id", ("en",), mock_transcript_list
)
mock_api.list.return_value = mock_transcript_list
# Execute
result = TranscribeYoutubeVideoBlock.get_transcript("test_video_id")
# Assert - should use manually created transcript first
assert result == mock_fetched_manual
mock_transcript_manual.fetch.assert_called_once()
mock_transcript_generated.fetch.assert_not_called()
@patch("backend.blocks.youtube.YouTubeTranscriptApi")
def test_get_transcript_no_transcripts_available(self, mock_api_class):
"""Test that exception is re-raised when no transcripts are available at all."""
# Setup mock
mock_api = Mock()
mock_api_class.return_value = mock_api
# Create mock transcript list with no transcripts
mock_transcript_list = Mock()
mock_transcript_list._manually_created_transcripts = {}
mock_transcript_list._generated_transcripts = {}
# Mock API to raise NoTranscriptFound
original_exception = NoTranscriptFound(
"test_video_id", ("en",), mock_transcript_list
)
mock_api.fetch.side_effect = original_exception
mock_api.list.return_value = mock_transcript_list
# Execute and assert exception is raised
with pytest.raises(NoTranscriptFound):
TranscribeYoutubeVideoBlock.get_transcript("test_video_id")

View File

@@ -52,8 +52,7 @@ class TestWebhookPatching:
"""Clear registry."""
AutoRegistry.clear()
@pytest.mark.asyncio
async def test_webhook_manager_patching(self):
def test_webhook_manager_patching(self):
"""Test that webhook managers are correctly patched."""
# Mock the original load_webhook_managers function
@@ -76,7 +75,7 @@ class TestWebhookPatching:
with patch.dict(
"sys.modules", {"backend.integrations.webhooks": mock_webhooks_module}
):
await AutoRegistry.patch_integrations()
AutoRegistry.patch_integrations()
# Call the patched function
result = mock_webhooks_module.load_webhook_managers()
@@ -88,8 +87,7 @@ class TestWebhookPatching:
assert "webhook_provider" in result
assert result["webhook_provider"] == MockWebhookManager
@pytest.mark.asyncio
async def test_webhook_patching_no_original_function(self):
def test_webhook_patching_no_original_function(self):
"""Test webhook patching when load_webhook_managers doesn't exist."""
# Mock webhooks module without load_webhook_managers
mock_webhooks_module = MagicMock(spec=[])
@@ -105,7 +103,7 @@ class TestWebhookPatching:
"sys.modules", {"backend.integrations.webhooks": mock_webhooks_module}
):
# Should not raise an error
await AutoRegistry.patch_integrations()
AutoRegistry.patch_integrations()
# Function should not be added if it didn't exist
assert not hasattr(mock_webhooks_module, "load_webhook_managers")
@@ -118,8 +116,7 @@ class TestPatchingIntegration:
"""Clear registry."""
AutoRegistry.clear()
@pytest.mark.asyncio
async def test_complete_provider_registration_and_patching(self):
def test_complete_provider_registration_and_patching(self):
"""Test the complete flow from provider registration to patching."""
# Mock webhooks module
mock_webhooks = MagicMock()
@@ -141,7 +138,7 @@ class TestPatchingIntegration:
"backend.integrations.webhooks": mock_webhooks,
},
):
await AutoRegistry.patch_integrations()
AutoRegistry.patch_integrations()
# Verify webhook patching
webhook_result = mock_webhooks.load_webhook_managers()

View File

@@ -25,7 +25,6 @@ from backend.sdk import (
Provider,
ProviderBuilder,
)
from backend.sdk.provider import ProviderRegister
class TestAutoRegistry:
@@ -40,7 +39,6 @@ class TestAutoRegistry:
# Create a test provider
provider = Provider(
name="test_provider",
register=ProviderRegister(name="test_provider"),
oauth_handler=None,
webhook_manager=None,
default_credentials=[],
@@ -80,7 +78,6 @@ class TestAutoRegistry:
default_credentials=[],
base_costs=[],
supported_auth_types={"oauth2"},
register=ProviderRegister(name="oauth_provider"),
)
AutoRegistry.register_provider(provider)
@@ -98,7 +95,6 @@ class TestAutoRegistry:
provider = Provider(
name="webhook_provider",
register=ProviderRegister(name="webhook_provider"),
oauth_handler=None,
webhook_manager=TestWebhookManager,
default_credentials=[],
@@ -132,7 +128,6 @@ class TestAutoRegistry:
provider = Provider(
name="test_provider",
register=ProviderRegister(name="test_provider"),
oauth_handler=None,
webhook_manager=None,
default_credentials=[cred1, cred2],
@@ -199,7 +194,6 @@ class TestAutoRegistry:
):
provider1 = Provider(
name="provider1",
register=ProviderRegister(name="provider1"),
oauth_config=OAuthConfig(
oauth_handler=TestOAuth1,
client_id_env_var="TEST_CLIENT_ID",
@@ -213,7 +207,6 @@ class TestAutoRegistry:
provider2 = Provider(
name="provider2",
register=ProviderRegister(name="provider2"),
oauth_config=OAuthConfig(
oauth_handler=TestOAuth2,
client_id_env_var="TEST_CLIENT_ID",
@@ -260,7 +253,6 @@ class TestAutoRegistry:
# Add some registrations
provider = Provider(
name="test_provider",
register=ProviderRegister(name="test_provider"),
oauth_handler=None,
webhook_manager=None,
default_credentials=[],
@@ -290,8 +282,7 @@ class TestAutoRegistryPatching:
AutoRegistry.clear()
@patch("backend.integrations.webhooks.load_webhook_managers")
@pytest.mark.asyncio
async def test_webhook_manager_patching(self, mock_load_managers):
def test_webhook_manager_patching(self, mock_load_managers):
"""Test that webhook managers are patched into the system."""
# Set up the mock to return an empty dict
mock_load_managers.return_value = {}
@@ -303,7 +294,6 @@ class TestAutoRegistryPatching:
# Register a provider with webhooks
provider = Provider(
name="webhook_provider",
register=ProviderRegister(name="webhook_provider"),
oauth_handler=None,
webhook_manager=TestWebhookManager,
default_credentials=[],
@@ -320,8 +310,8 @@ class TestAutoRegistryPatching:
with patch.dict(
"sys.modules", {"backend.integrations.webhooks": mock_webhooks}
):
# Apply patches - now async
await AutoRegistry.patch_integrations()
# Apply patches
AutoRegistry.patch_integrations()
# Call the patched function
result = mock_webhooks.load_webhook_managers()

View File

@@ -0,0 +1,765 @@
<div align="center">
<h1>AutoGPT Frontend • Contributing ⌨️</h1>
<p>Next.js App Router • Client-first • Type-safe generated API hooks • Tailwind + shadcn/ui</p>
</div>
---
## ☕️ Summary
This document is your reference for contributing to the AutoGPT Frontend. It adapts legacy guidelines to our current stack and practices.
- Architecture and stack
- Component structure and design system
- Data fetching (generated API hooks)
- Feature flags
- Naming and code conventions
- Tooling, scripts, and testing
- PR process and checklist
This is a living document. Open a pull request any time to improve it.
---
## 🚀 Quick Start FAQ
New to the codebase? Here are shortcuts to common tasks:
### I need to make a new page
1. Create page in `src/app/(platform)/your-feature/page.tsx`
2. If it has logic, create `usePage.ts` hook next to it
3. Create sub-components in a `components/` folder
4. Use generated API hooks for data fetching
5. If page needs auth, ensure it's in the `(platform)` route group
**Example structure:**
```
app/(platform)/dashboard/
page.tsx
useDashboardPage.ts
components/
StatsPanel/
StatsPanel.tsx
useStatsPanel.ts
```
See the [Component structure](#-component-structure), [Styling](#-styling), and [Data fetching patterns](#-data-fetching-patterns) sections.
### I need to update an existing component in a page
1. Find the page `src/app/(platform)/your-feature/page.tsx`
2. Check its `components/` folder
3. To update its logic, check the `use[Component].ts` hook
4. To update its rendering, check the `[Component].tsx` file
See [Component structure](#-component-structure) and [Styling](#-styling) sections.
### I need to make a new API call and show it on the UI
1. Ensure the backend endpoint exists in the OpenAPI spec
2. Regenerate API client: `pnpm generate:api`
3. Import the generated hook by typing the operation name (auto-import)
4. Use the hook in your component/custom hook
5. Handle loading, error, and success states
**Example:**
```tsx
import { useGetV2ListLibraryAgents } from "@/app/api/__generated__/endpoints/library/library";
export function useAgentList() {
const { data, isLoading, isError, error } = useGetV2ListLibraryAgents();
return {
agents: data?.data || [],
isLoading,
isError,
error,
};
}
```
See [Data fetching patterns](#-data-fetching-patterns) for more examples.
### I need to create a new component in the Design System
1. Determine the atomic level: atom, molecule, or organism
2. Create folder: `src/components/[level]/ComponentName/`
3. Create `ComponentName.tsx` (render logic)
4. If logic exists, create `useComponentName.ts`
5. Create `ComponentName.stories.tsx` for Storybook
6. Use Tailwind + design tokens (avoid hardcoded values)
7. Only use Phosphor icons
8. Test in Storybook: `pnpm storybook`
9. Verify in Chromatic after PR
**Example structure:**
```
src/components/molecules/DataCard/
DataCard.tsx
DataCard.stories.tsx
useDataCard.ts
```
See [Component structure](#-component-structure) and [Styling](#-styling) sections.
---
## 📟 Contribution process
### 1) Branch off `dev`
- Branch from `dev` for features and fixes
- Keep PRs focused (aim for one ticket per PR)
- Use conventional commit messages with a scope (e.g., `feat(frontend): add X`)
### 2) Feature flags
If a feature will ship across multiple PRs, guard it with a flag so we can merge iteratively.
- Use [LaunchDarkly](https://www.launchdarkly.com) based flags (see Feature Flags below)
- Avoid long-lived feature branches
### 3) Open PR and get reviews ✅
Before requesting review:
- [x] Code follows architecture and conventions here
- [x] `pnpm format && pnpm lint && pnpm types` pass
- [x] Relevant tests pass locally: `pnpm test` (and/or Storybook tests)
- [x] If touching UI, validate against our design system and stories
### 4) Merge to `dev`
- Use squash merges
- Follow conventional commit message format for the squash title
---
## 📂 Architecture & Stack
### Next.js App Router
- We use the [Next.js App Router](https://nextjs.org/docs/app) in `src/app`
- Use [route segments](https://nextjs.org/docs/app/building-your-application/routing) with semantic URLs; no `pages/`
### Component good practices
- Default to client components
- Use server components only when:
- SEO requires server-rendered HTML, or
- Extreme first-byte performance justifies it
- If you render server-side data, prefer server-side prefetch + client hydration (see examples below and [React Query SSR & Hydration](https://tanstack.com/query/latest/docs/framework/react/guides/ssr))
- Prefer [Next.js API routes](https://nextjs.org/docs/pages/building-your-application/routing/api-routes) over [server actions](https://nextjs.org/docs/14/app/building-your-application/data-fetching/server-actions-and-mutations) when possible
- Keep components small and simple
  - Favour composition and splitting large components into smaller bits of UI
  - [Colocate state](https://kentcdodds.com/blog/state-colocation-will-make-your-react-app-faster) when possible
  - Keep render logic and side-effects split for [separation of concerns](https://en.wikipedia.org/wiki/Separation_of_concerns)
  - Do not over-complicate or re-invent the wheel
**❓ Why a client-side first design vs server components/actions?**
While server components and actions are powerful and cutting-edge, they introduce a layer of complexity that is not always justified by the benefits they deliver. Defaulting to client-first keeps the developer's mental model simple, especially for contributors less familiar with Next.js or heavy frontend development.
### Data fetching: prefer generated API hooks
- We generate a type-safe client and React Query hooks from the backend OpenAPI spec via [Orval](https://orval.dev/)
- Prefer the generated hooks under `src/app/api/__generated__/endpoints/...`
- Treat `BackendAPI` and code under `src/lib/autogpt-server-api/*` as deprecated; do not introduce new usages
- Use [Zod](https://zod.dev/) schemas from the generated client where applicable
### State management
- Prefer [React Query](https://tanstack.com/query/latest/docs/framework/react/overview) for server state, colocated near consumers (see [state colocation](https://kentcdodds.com/blog/state-colocation-will-make-your-react-app-faster))
- Co-locate UI state inside components/hooks; keep global state minimal
### Styling and components
- [Tailwind CSS](https://tailwindcss.com/docs) + [shadcn/ui](https://ui.shadcn.com/) ([Radix Primitives](https://www.radix-ui.com/docs/primitives/overview/introduction) under the hood)
- Use the design system under `src/components` for primitives and building blocks
- Do not use anything under `src/components/_legacy__`; migrate away from it when touching old code
- Reference the design system catalog on Chromatic: [`https://dev--670f94474adee5e32c896b98.chromatic.com/`](https://dev--670f94474adee5e32c896b98.chromatic.com/)
- Use the [`tailwind-scrollbar`](https://www.npmjs.com/package/tailwind-scrollbar) plugin utilities for scrollbar styling
---
## 🧱 Component structure
For components, separate render logic from data/behavior, and keep implementation details local.
**Most components should follow this structure.** Pages are just bigger components made of smaller ones, and sub-components can have their own nested sub-components when dealing with complex features.
### Basic structure
When a component has non-trivial logic:
```
FeatureX/
FeatureX.tsx (render logic only)
useFeatureX.ts (hook; data fetching, behavior, state)
helpers.ts (pure helpers used by the hook)
components/ (optional, subcomponents local to FeatureX)
```
### Example: Page with nested components
```tsx
// Page composition
app/(platform)/dashboard/
page.tsx
useDashboardPage.ts
components/ # (Sub-components the dashboard page is made of)
StatsPanel/
StatsPanel.tsx
useStatsPanel.ts
helpers.ts
components/ # (Sub-components belonging to StatsPanel)
StatCard/
StatCard.tsx
ActivityFeed/
ActivityFeed.tsx
useActivityFeed.ts
```
### Guidelines
- Prefer function declarations for components and handlers
- Only use arrow functions for small inline lambdas (e.g., in `map`)
- Avoid barrel files and `index.ts` re-exports
- Keep component files focused and readable; push complex logic to `helpers.ts`
- Abstract reusable, cross-feature logic into `src/services/` or `src/lib/utils.ts` as appropriate
- Keep components encapsulated so they can be easily reused and moved elsewhere
- Nest sub-components within a `components/` folder when they're local to the parent feature
### Exceptions
When to simplify the structure:
**Small hook logic (3-4 lines)**
If the hook logic is minimal, keep it inline with the render function:
```tsx
export function ActivityAlert() {
const [isVisible, setIsVisible] = useState(true);
if (!isVisible) return null;
return (
<Alert onClose={() => setIsVisible(false)}>New activity detected</Alert>
);
}
```
**Render-only components**
Components with no hook logic can be direct files in `components/` without a folder:
```
components/
ActivityAlert.tsx (render-only, no folder needed)
StatsPanel/ (has hook logic, needs folder)
StatsPanel.tsx
useStatsPanel.ts
```
### Hook file structure
When separating logic into a custom hook:
```tsx
// useStatsPanel.ts
export function useStatsPanel() {
const [data, setData] = useState<Stats[]>([]);
const [isLoading, setIsLoading] = useState(true);
useEffect(() => {
fetchStats().then(setData);
}, []);
return {
data,
isLoading,
refresh: () => fetchStats().then(setData),
};
}
```
Rules:
- **Always return an object** that exposes data and methods to the view
- **Export a single function** named after the component (e.g., `useStatsPanel` for `StatsPanel.tsx`)
- **Abstract into helpers.ts** when hook logic grows large, so the hook file remains readable by scanning without diving into implementation details
---
## 🔄 Data fetching patterns
All API hooks are generated from the backend OpenAPI specification using [Orval](https://orval.dev/). The hooks are type-safe and follow the operation names defined in the backend API.
### How to discover hooks
Most of the time you can rely on auto-import by typing the endpoint or operation name. Your IDE will suggest the generated hooks based on the OpenAPI operation IDs.
**Examples of hook naming patterns:**
- `GET /api/v1/notifications` → `useGetV1GetNotificationPreferences`
- `POST /api/v2/store/agents` → `usePostV2CreateStoreAgent`
- `DELETE /api/v2/store/submissions/{id}` → `useDeleteV2DeleteStoreSubmission`
- `GET /api/v2/library/agents` → `useGetV2ListLibraryAgents`
**Pattern**: `use{Method}{Version}{OperationName}`
You can also explore the generated hooks by browsing `src/app/api/__generated__/endpoints/` which is organized by API tags (e.g., `auth`, `store`, `library`).
**OpenAPI specs:**
- Production: [https://backend.agpt.co/openapi.json](https://backend.agpt.co/openapi.json)
- Staging: [https://dev-server.agpt.co/openapi.json](https://dev-server.agpt.co/openapi.json)
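The naming pattern can be sketched as a tiny string transform. This is purely a hypothetical illustration of the convention — the real hook names come from Orval and the OpenAPI operation IDs, not from any function like this:

```ts
// Hypothetical illustration of the `use{Method}{Version}{OperationName}` pattern.
function hookNameFor(
  method: string,
  version: string,
  operationName: string,
): string {
  // Normalize "GET" → "Get", "v2" → "V2"
  const capitalize = (s: string) =>
    s.charAt(0).toUpperCase() + s.slice(1).toLowerCase();
  return `use${capitalize(method)}${capitalize(version)}${operationName}`;
}

hookNameFor("GET", "v2", "ListLibraryAgents"); // "useGetV2ListLibraryAgents"
```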
### Generated hooks (client)
Prefer the generated React Query hooks (via Orval + React Query):
```tsx
import { useGetV1GetNotificationPreferences } from "@/app/api/__generated__/endpoints/auth/auth";
export function PreferencesPanel() {
const { data, isLoading, isError } = useGetV1GetNotificationPreferences({
query: {
select: (res) => res.data,
},
});
if (isLoading) return null;
if (isError) throw new Error("Failed to load preferences");
return <pre>{JSON.stringify(data, null, 2)}</pre>;
}
```
### Generated mutations (client)
```tsx
import { useQueryClient } from "@tanstack/react-query";
import {
useDeleteV2DeleteStoreSubmission,
getGetV2ListMySubmissionsQueryKey,
} from "@/app/api/__generated__/endpoints/store/store";
export function DeleteSubmissionButton({
submissionId,
}: {
submissionId: string;
}) {
const queryClient = useQueryClient();
const { mutateAsync: deleteSubmission, isPending } =
useDeleteV2DeleteStoreSubmission({
mutation: {
onSuccess: () => {
queryClient.invalidateQueries({
queryKey: getGetV2ListMySubmissionsQueryKey(),
});
},
},
});
async function onClick() {
await deleteSubmission({ submissionId });
}
return (
<button disabled={isPending} onClick={onClick}>
Delete
</button>
);
}
```
### Server-side prefetch + client hydration
Use server-side prefetch to improve TTFB while keeping the component tree client-first (see [React Query SSR & Hydration](https://tanstack.com/query/latest/docs/framework/react/guides/ssr)):
```tsx
// in a server component
import { getQueryClient } from "@/lib/tanstack-query/getQueryClient";
import { HydrationBoundary, dehydrate } from "@tanstack/react-query";
import {
prefetchGetV2ListStoreAgentsQuery,
prefetchGetV2ListStoreCreatorsQuery,
} from "@/app/api/__generated__/endpoints/store/store";
export default async function MarketplacePage() {
const queryClient = getQueryClient();
await Promise.all([
prefetchGetV2ListStoreAgentsQuery(queryClient, { featured: true }),
prefetchGetV2ListStoreAgentsQuery(queryClient, { sorted_by: "runs" }),
prefetchGetV2ListStoreCreatorsQuery(queryClient, {
featured: true,
sorted_by: "num_agents",
}),
]);
return (
<HydrationBoundary state={dehydrate(queryClient)}>
{/* Client component tree goes here */}
</HydrationBoundary>
);
}
```
Notes:
- Do not introduce new usages of `BackendAPI` or `src/lib/autogpt-server-api/*`
- Keep transformations and mapping logic close to the consumer (hook), not in the view
---
## ⚠️ Error handling
The app has multiple error handling strategies depending on the type of error:
### Render/runtime errors
Use `<ErrorCard />` to display render or runtime errors gracefully:
```tsx
import { ErrorCard } from "@/components/molecules/ErrorCard";
export function DataPanel() {
const { data, isLoading, isError, error } = useGetData();
if (isLoading) return <Skeleton />;
if (isError) return <ErrorCard error={error} />;
return <div>{data.content}</div>;
}
```
### API mutation errors
Display mutation errors using toast notifications:
```tsx
import { useToast } from "@/components/ui/use-toast";
export function useUpdateSettings() {
const { toast } = useToast();
const { mutateAsync: updateSettings } = useUpdateSettingsMutation({
mutation: {
onError: (error) => {
toast({
title: "Failed to update settings",
description: error.message,
variant: "destructive",
});
},
},
});
return { updateSettings };
}
```
### Manual Sentry capture
When needed, you can manually capture exceptions to Sentry:
```tsx
import * as Sentry from "@sentry/nextjs";
try {
await riskyOperation();
} catch (error) {
Sentry.captureException(error, {
tags: { context: "feature-x" },
extra: { metadata: additionalData },
});
throw error;
}
```
### Global error boundaries
The app has error boundaries already configured to:
- Capture uncaught errors globally and send them to Sentry
- Display a user-friendly error UI when something breaks
- Prevent the entire app from crashing
You don't need to wrap components in error boundaries manually unless you need custom error recovery logic.
---
## 🚩 Feature Flags
- Flags are powered by [LaunchDarkly](https://docs.launchdarkly.com/)
- Use the helper APIs under `src/services/feature-flags`
Check a flag in a client component:
```tsx
import { Flag, useGetFlag } from "@/services/feature-flags/use-get-flag";
export function AgentActivityPanel() {
const enabled = useGetFlag(Flag.AGENT_ACTIVITY);
if (!enabled) return null;
return <div>Feature is enabled!</div>;
}
```
Protect a route or page component:
```tsx
import { withFeatureFlag } from "@/services/feature-flags/with-feature-flag";
export const MyFeaturePage = withFeatureFlag(function Page() {
return <div>My feature page</div>;
}, "my-feature-flag");
```
Local dev and Playwright:
- Set `NEXT_PUBLIC_PW_TEST=true` to use mocked flag values during local development and tests
Adding new flags:
1. Add the flag to the `Flag` enum and `FlagValues` type
2. Provide a mock value in the mock map
3. Configure the flag in LaunchDarkly
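Steps 1 and 2 might look roughly like this. This is a hedged sketch — the actual `Flag` enum, `FlagValues` type, and mock map live under `src/services/feature-flags`, and `MY_NEW_FEATURE` and its string key are made-up names for illustration:

```ts
// Assumed shapes only; check src/services/feature-flags for the real definitions.
export enum Flag {
  AGENT_ACTIVITY = "agent-activity",
  MY_NEW_FEATURE = "my-new-feature", // 1) Add the new flag key
}

export type FlagValues = {
  [Flag.AGENT_ACTIVITY]: boolean;
  [Flag.MY_NEW_FEATURE]: boolean;
};

// 2) Provide a mock value, used when NEXT_PUBLIC_PW_TEST=true
export const mockFlags: FlagValues = {
  [Flag.AGENT_ACTIVITY]: true,
  [Flag.MY_NEW_FEATURE]: false,
};
```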
---
## 📙 Naming conventions
General:
- Variables and functions should read like plain English
- Prefer `const` over `let` unless reassignment is required
- Use searchable constants instead of magic numbers
Files:
- Components and hooks: `PascalCase` for component files, `camelCase` for hooks
- Other files: `kebab-case`
- Do not create barrel files or `index.ts` re-exports
Types:
- Prefer `interface` for object shapes
- Component props should be `interface Props { ... }`
- Use precise types; avoid `any` and unsafe casts
Parameters:
- If more than one parameter is needed, pass a single `Args` object for clarity
Comments:
- Keep comments minimal; code should be clear by itself
- Only document non-obvious intent, invariants, or caveats
Functions:
- Prefer function declarations for components and handlers
- Only use arrow functions for small inline callbacks
Control flow:
- Use early returns to reduce nesting
- Avoid catching errors unless you handle them meaningfully
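A minimal sketch pulling several of these conventions together (illustrative names only — `formatAgentLabel` is not real app code):

```ts
// Searchable constant instead of a magic number
const MAX_VISIBLE_AGENTS = 5;

// Single Args object rather than multiple positional parameters
interface FormatAgentLabelArgs {
  name: string;
  runCount: number;
}

// Function declaration, not an arrow function
function formatAgentLabel({ name, runCount }: FormatAgentLabelArgs): string {
  // Early return to reduce nesting
  if (runCount === 0) return name;
  return `${name} (${runCount} runs)`;
}

// Arrow functions only for small inline callbacks
const labels = ["Scraper", "Summarizer"]
  .slice(0, MAX_VISIBLE_AGENTS)
  .map((name, i) => formatAgentLabel({ name, runCount: i }));
```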
---
## 🎨 Styling
- Use Tailwind utilities; prefer semantic, composable class names
- Use shadcn/ui components as building blocks when available
- Use the `tailwind-scrollbar` utilities for scrollbar styling
- Keep responsive and dark-mode behavior consistent with the design system
Additional requirements:
- Do not import shadcn primitives directly in feature code; only use components exposed in our design system under `src/components`. shadcn is a low-level skeleton we style on top of and is not meant to be consumed directly.
- Prefer design tokens over Tailwind's default theme whenever possible (e.g., color, spacing, radius, and typography tokens). Avoid hardcoded values and the default palette when a token exists.
---
## ⚠️ Errors and ⏳ Loading
- **Errors**: Use the `ErrorCard` component from the design system to display API/HTTP errors and retry actions. Keep error derivation/mapping in hooks; pass the final message to the component.
- Component: `src/components/molecules/ErrorCard/ErrorCard.tsx`
- **Loading**: Use the `Skeleton` component(s) from the design system for loading states. Favor domain-appropriate skeleton layouts (lists, cards, tables) over spinners.
- See Storybook examples under Atoms/Skeleton for patterns.
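A sketch of the split described above: error derivation lives in a hook or helper, and the component only receives the final message. The `ResponseError` shape below is an assumption for illustration; the `ErrorCard` props in the commented usage follow the pattern used elsewhere in this guide.

```typescript
// Assumed minimal error shape; adapt to the actual API error type.
export interface ResponseError {
  status?: number;
  message?: string;
}

// Derivation/mapping stays in the hook layer, not in the render file.
export function deriveErrorMessage(error: ResponseError | undefined): string | null {
  if (!error) return null;
  if (error.status === 429) return "Too many requests. Please try again shortly.";
  return error.message ?? "Something went wrong. Please try again.";
}

// In the render file (sketch):
// <ErrorCard
//   responseError={{ message: deriveErrorMessage(error) ?? "" }}
//   context="profile"
//   onRetry={refetch}
// />
```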
---
## 🧭 Responsive and mobile-first
- Build mobile-first. Ensure new UI looks great from a 375px viewport width (iPhone SE) upwards.
- Validate layouts at common breakpoints (375, 768, 1024, 1280). Prefer stacking and progressive disclosure on small screens.
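The breakpoints above can be captured as a checkable constant. Layout itself should use Tailwind responsive utilities; a JS-side helper like this is only for tests or conditional rendering, and its name is illustrative.

```typescript
// Widths (CSS px) we validate layouts against, per the guidance above
export const VALIDATION_WIDTHS = [375, 768, 1024, 1280] as const;

export function isBelowMd(viewportWidth: number): boolean {
  // Tailwind's default `md` breakpoint is 768px
  return viewportWidth < 768;
}
```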
---
## 🧰 State for complex flows
For components/flows with complex state, multi-step wizards, or cross-component coordination, prefer a small co-located store using [Zustand](https://github.com/pmndrs/zustand).
Guidelines:
- Co-locate the store with the feature (e.g., `FeatureX/store.ts`).
- Expose typed selectors to minimize re-renders.
- Keep effects and API calls in hooks; stores hold state and pure actions.
Example: simple store with selectors
```ts
import { create } from "zustand";
interface WizardState {
step: number;
data: Record<string, unknown>;
next(): void;
back(): void;
setField(args: { key: string; value: unknown }): void;
}
export const useWizardStore = create<WizardState>((set) => ({
step: 0,
data: {},
next() {
set((state) => ({ step: state.step + 1 }));
},
back() {
set((state) => ({ step: Math.max(0, state.step - 1) }));
},
setField({ key, value }) {
set((state) => ({ data: { ...state.data, [key]: value } }));
},
}));
// Usage in a component (selectors keep updates scoped)
function WizardFooter() {
const step = useWizardStore((s) => s.step);
const next = useWizardStore((s) => s.next);
const back = useWizardStore((s) => s.back);
return (
<div className="flex items-center gap-2">
<button onClick={back} disabled={step === 0}>Back</button>
<button onClick={next}>Next</button>
</div>
);
}
```
Example: async action coordinated via hook + store
```ts
// FeatureX/useFeatureX.ts
import { useMutation } from "@tanstack/react-query";
import { useWizardStore } from "./store";
export function useFeatureX() {
const setField = useWizardStore((s) => s.setField);
const next = useWizardStore((s) => s.next);
const { mutateAsync: save, isPending } = useMutation({
mutationFn: async (payload: unknown) => {
// call API here
return payload;
},
onSuccess(data) {
setField({ key: "result", value: data });
next();
},
});
return { save, isSaving: isPending };
}
```
---
## 🖼 Icons
- Only use Phosphor Icons. Treat all other icon libraries as deprecated for new code.
- Package: `@phosphor-icons/react`
- Site: [`https://phosphoricons.com/`](https://phosphoricons.com/)
Example usage:
```tsx
import { Plus } from "@phosphor-icons/react";
export function CreateButton() {
return (
<button type="button" className="inline-flex items-center gap-2">
<Plus size={16} />
Create
</button>
);
}
```
---
## 🧪 Testing & Storybook
- End-to-end: [Playwright](https://playwright.dev/docs/intro) (`pnpm test`, `pnpm test-ui`)
- [Storybook](https://storybook.js.org/docs) for isolated UI development (`pnpm storybook` / `pnpm build-storybook`)
- For Storybook tests in CI, see [`@storybook/test-runner`](https://storybook.js.org/docs/writing-tests/test-runner) (`test-storybook:ci`)
- When changing components in `src/components`, update or add stories and visually verify in Storybook/Chromatic
---
## 🛠 Tooling & Scripts
Common scripts (see `package.json` for full list):
- `pnpm dev` — Start Next.js dev server (generates API client first)
- `pnpm build` — Build for production
- `pnpm start` — Start production server
- `pnpm lint` — ESLint + Prettier check
- `pnpm format` — Format code
- `pnpm types` — Type-check
- `pnpm storybook` — Run Storybook
- `pnpm test` — Run Playwright tests
Generated API client:
- `pnpm generate:api` — Fetch OpenAPI spec and regenerate the client
---
## ✅ PR checklist (Frontend)
- Client-first: server components only for SEO or extreme TTFB needs
- Uses generated API hooks; no new `BackendAPI` usages
- UI uses `src/components` primitives; no new `_legacy__` components
- Logic is separated into `use*.ts` and `helpers.ts` when non-trivial
- Reusable logic extracted to `src/services/` or `src/lib/utils.ts` when appropriate
- Navigation uses the Next.js router
- Lint, format, type-check, and tests pass locally
- Stories updated/added if UI changed; verified in Storybook
---
## ♻️ Migration guidance
When touching legacy code:
- Replace usages of `src/components/_legacy__/*` with the modern design system components under `src/components`
- Replace `BackendAPI` or `src/lib/autogpt-server-api/*` with generated API hooks
- Move presentational logic into render files and data/behavior into hooks
- Keep one-off transformations in local `helpers.ts`; move reusable logic to `src/services/` or `src/lib/utils.ts`
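As a sketch of that last point: a one-off transformation pulled out of a legacy component into a local `helpers.ts`. The function name and output format are hypothetical.

```typescript
// FeatureX/helpers.ts — one-off transformation kept next to the feature
export function formatRunDuration(startedAt: string, endedAt: string): string {
  const ms = new Date(endedAt).getTime() - new Date(startedAt).getTime();
  const seconds = Math.max(0, Math.round(ms / 1000));
  if (seconds < 60) return `${seconds}s`;
  return `${Math.floor(seconds / 60)}m ${seconds % 60}s`;
}
```

If the same transformation turns out to be needed by multiple features, promote it to `src/lib/utils.ts` or `src/services/` instead of duplicating it.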
---
## 📚 References
- Design system (Chromatic): [`https://dev--670f94474adee5e32c896b98.chromatic.com/`](https://dev--670f94474adee5e32c896b98.chromatic.com/)
- Project README for setup and API client examples: `autogpt_platform/frontend/README.md`
- Conventional Commits: [conventionalcommits.org](https://www.conventionalcommits.org/)

View File

@@ -4,20 +4,12 @@ This is the frontend for AutoGPT's next generation
This project uses [**pnpm**](https://pnpm.io/) as the package manager via **corepack**. [Corepack](https://github.com/nodejs/corepack) is a Node.js tool that automatically manages package managers without requiring global installations.
For architecture, conventions, data fetching, feature flags, design system usage, state management, and PR process, see [CONTRIBUTING.md](./CONTRIBUTING.md).
### Prerequisites
Make sure you have Node.js 16.10+ installed. Corepack is included with Node.js by default.
### ⚠️ Migrating from yarn
> This project was previously using yarn1, make sure to clean up the old files if you set it up previously with yarn:
>
> ```bash
> rm -f yarn.lock && rm -rf node_modules
> ```
>
> Then follow the setup steps below.
## Setup
### 1. **Enable corepack** (run this once on your system):
@@ -96,184 +88,13 @@ Every time a new Front-end dependency is added by you or others, you will need t
This project uses [`next/font`](https://nextjs.org/docs/basic-features/font-optimization) to automatically optimize and load Inter, a custom Google Font.
## 🔄 Data Fetching Strategy
## 🔄 Data Fetching
> [!NOTE]
> You don't need to run the OpenAPI commands below to run the Front-end. You will only need to run them when adding or modifying endpoints on the Backend API and wanting to use those on the Frontend.
This project uses an auto-generated API client powered by [**Orval**](https://orval.dev/), which creates type-safe API clients from OpenAPI specifications.
### How It Works
1. **Backend Requirements**: Each API endpoint needs a summary and tag in the OpenAPI spec
2. **Operation ID Generation**: FastAPI generates operation IDs using the pattern `{method}{tag}{summary}`
3. **Spec Fetching**: The OpenAPI spec is fetched from `http://localhost:8006/openapi.json` and saved to the frontend
4. **Spec Transformation**: The OpenAPI spec is cleaned up using a custom transformer (see `autogpt_platform/frontend/src/app/api/transformers`)
5. **Client Generation**: Auto-generated client includes TypeScript types, API endpoints, and Zod schemas, organized by tags
### API Client Commands
```bash
# Fetch OpenAPI spec from backend and generate client
pnpm generate:api
# Only fetch the OpenAPI spec
pnpm fetch:openapi
# Only generate the client (after spec is fetched)
pnpm generate:api-client
```
### Using the Generated Client
The generated client provides React Query hooks for both queries and mutations:
#### Queries (GET requests)
```typescript
import { useGetV1GetNotificationPreferences } from "@/app/api/__generated__/endpoints/auth/auth";
const { data, isLoading, isError } = useGetV1GetNotificationPreferences({
query: {
select: (res) => res.data,
// Other React Query options
},
});
```
#### Mutations (POST, PUT, DELETE requests)
```typescript
import { useDeleteV2DeleteStoreSubmission } from "@/app/api/__generated__/endpoints/store/store";
import { getGetV2ListMySubmissionsQueryKey } from "@/app/api/__generated__/endpoints/store/store";
import { useQueryClient } from "@tanstack/react-query";
const queryClient = useQueryClient();
const { mutateAsync: deleteSubmission } = useDeleteV2DeleteStoreSubmission({
mutation: {
onSuccess: () => {
// Invalidate related queries to refresh data
queryClient.invalidateQueries({
queryKey: getGetV2ListMySubmissionsQueryKey(),
});
},
},
});
// Usage
await deleteSubmission({
submissionId: submission_id,
});
```
#### Server Actions
For server-side operations, you can also use the generated client functions directly:
```typescript
import { postV1UpdateNotificationPreferences } from "@/app/api/__generated__/endpoints/auth/auth";
// In a server action
const preferences = {
email: "user@example.com",
preferences: {
AGENT_RUN: true,
ZERO_BALANCE: false,
// ... other preferences
},
daily_limit: 0,
};
await postV1UpdateNotificationPreferences(preferences);
```
#### Server-Side Prefetching
For server-side components, you can prefetch data on the server and hydrate it in the client cache. This allows immediate access to cached data when queries are called:
```typescript
import { getQueryClient } from "@/lib/tanstack-query/getQueryClient";
import {
prefetchGetV2ListStoreAgentsQuery,
prefetchGetV2ListStoreCreatorsQuery
} from "@/app/api/__generated__/endpoints/store/store";
import { HydrationBoundary, dehydrate } from "@tanstack/react-query";
// In your server component
const queryClient = getQueryClient();
await Promise.all([
prefetchGetV2ListStoreAgentsQuery(queryClient, {
featured: true,
}),
prefetchGetV2ListStoreAgentsQuery(queryClient, {
sorted_by: "runs",
}),
prefetchGetV2ListStoreCreatorsQuery(queryClient, {
featured: true,
sorted_by: "num_agents",
}),
]);
return (
<HydrationBoundary state={dehydrate(queryClient)}>
<MainMarkeplacePage />
</HydrationBoundary>
);
```
This pattern improves performance by serving pre-fetched data from the server while maintaining the benefits of client-side React Query features.
### Configuration
The Orval configuration is located in `autogpt_platform/frontend/orval.config.ts`. It generates two separate clients:
1. **autogpt_api_client**: React Query hooks for client-side data fetching
2. **autogpt_zod_schema**: Zod schemas for validation
For more details, see the [Orval documentation](https://orval.dev/) or check the configuration file.
See [CONTRIBUTING.md](./CONTRIBUTING.md) for guidance on generated API hooks, SSR + hydration patterns, and usage examples. You generally do not need to run OpenAPI commands unless adding/modifying backend endpoints.
## 🚩 Feature Flags
This project uses [LaunchDarkly](https://launchdarkly.com/) for feature flags, allowing us to control feature rollouts and A/B testing.
### Using Feature Flags
#### Check if a feature is enabled
```typescript
import { Flag, useGetFlag } from "@/services/feature-flags/use-get-flag";
function MyComponent() {
const isAgentActivityEnabled = useGetFlag(Flag.AGENT_ACTIVITY);
if (!isAgentActivityEnabled) {
return null; // Hide feature
}
return <div>Feature is enabled!</div>;
}
```
#### Protect entire components
```typescript
import { withFeatureFlag } from "@/services/feature-flags/with-feature-flag";
const MyFeaturePage = withFeatureFlag(MyPageComponent, "my-feature-flag");
```
### Testing with Feature Flags
For local development or running Playwright tests locally, use mocked feature flags by setting `NEXT_PUBLIC_PW_TEST=true` in your `.env` file. This bypasses LaunchDarkly and uses the mock values defined in the code.
### Adding New Flags
1. Add the flag to the `Flag` enum in `use-get-flag.ts`
2. Add the flag type to `FlagValues` type
3. Add mock value to `mockFlags` for testing
4. Configure the flag in LaunchDarkly dashboard
See [CONTRIBUTING.md](./CONTRIBUTING.md) for feature flag usage patterns, local development with mocks, and how to add new flags.
## 🚚 Deploy
@@ -333,7 +154,7 @@ By integrating Storybook into our development workflow, we can streamline UI dev
- [**Tailwind CSS**](https://tailwindcss.com/) - Utility-first CSS framework
- [**shadcn/ui**](https://ui.shadcn.com/) - Re-usable components built with Radix UI and Tailwind CSS
- [**Radix UI**](https://www.radix-ui.com/) - Headless UI components for accessibility
- [**Lucide React**](https://lucide.dev/guide/packages/lucide-react) - Beautiful & consistent icons
- [**Phosphor Icons**](https://phosphoricons.com/) - Icon set used across the app
- [**Framer Motion**](https://motion.dev/) - Animation library for React
### Development & Testing

View File

@@ -2,18 +2,11 @@
// The config you add here will be used whenever a user loads a page in their browser.
// https://docs.sentry.io/platforms/javascript/guides/nextjs/
import {
AppEnv,
BehaveAs,
getAppEnv,
getBehaveAs,
getEnvironmentStr,
} from "@/lib/utils";
import { environment } from "@/services/environment";
import * as Sentry from "@sentry/nextjs";
const isProdOrDev = [AppEnv.PROD, AppEnv.DEV].includes(getAppEnv());
const isCloud = getBehaveAs() === BehaveAs.CLOUD;
const isProdOrDev = environment.isProd() || environment.isDev();
const isCloud = environment.isCloud();
const isDisabled = process.env.DISABLE_SENTRY === "true";
const shouldEnable = !isDisabled && isProdOrDev && isCloud;
@@ -21,7 +14,7 @@ const shouldEnable = !isDisabled && isProdOrDev && isCloud;
Sentry.init({
dsn: "https://fe4e4aa4a283391808a5da396da20159@o4505260022104064.ingest.us.sentry.io/4507946746380288",
environment: getEnvironmentStr(),
environment: environment.getEnvironmentStr(),
enabled: shouldEnable,

View File

@@ -1,16 +1,16 @@
#!/usr/bin/env node
import { getAgptServerBaseUrl } from "@/lib/env-config";
import { execSync } from "child_process";
import * as path from "path";
import * as fs from "fs";
import * as os from "os";
import { environment } from "@/services/environment";
function fetchOpenApiSpec(): void {
const args = process.argv.slice(2);
const forceFlag = args.includes("--force");
const baseUrl = getAgptServerBaseUrl();
const baseUrl = environment.getAGPTServerBaseUrl();
const openApiUrl = `${baseUrl}/openapi.json`;
const outputPath = path.join(
__dirname,

View File

@@ -3,18 +3,11 @@
// Note that this config is unrelated to the Vercel Edge Runtime and is also required when running locally.
// https://docs.sentry.io/platforms/javascript/guides/nextjs/
import { environment } from "@/services/environment";
import * as Sentry from "@sentry/nextjs";
import {
AppEnv,
BehaveAs,
getAppEnv,
getBehaveAs,
getEnvironmentStr,
} from "./src/lib/utils";
const isProdOrDev = [AppEnv.PROD, AppEnv.DEV].includes(getAppEnv());
const isCloud = getBehaveAs() === BehaveAs.CLOUD;
const isProdOrDev = environment.isProd() || environment.isDev();
const isCloud = environment.isCloud();
const isDisabled = process.env.DISABLE_SENTRY === "true";
const shouldEnable = !isDisabled && isProdOrDev && isCloud;
@@ -22,7 +15,7 @@ const shouldEnable = !isDisabled && isProdOrDev && isCloud;
Sentry.init({
dsn: "https://fe4e4aa4a283391808a5da396da20159@o4505260022104064.ingest.us.sentry.io/4507946746380288",
environment: getEnvironmentStr(),
environment: environment.getEnvironmentStr(),
enabled: shouldEnable,
@@ -40,7 +33,7 @@ Sentry.init({
enableLogs: true,
integrations: [
Sentry.captureConsoleIntegration(),
Sentry.captureConsoleIntegration({ levels: ["fatal", "error", "warn"] }),
Sentry.extraErrorDataIntegration(),
],
});

View File

@@ -2,19 +2,12 @@
// The config you add here will be used whenever the server handles a request.
// https://docs.sentry.io/platforms/javascript/guides/nextjs/
import {
AppEnv,
BehaveAs,
getAppEnv,
getBehaveAs,
getEnvironmentStr,
} from "@/lib/utils";
import { environment } from "@/services/environment";
import * as Sentry from "@sentry/nextjs";
// import { NodeProfilingIntegration } from "@sentry/profiling-node";
const isProdOrDev = [AppEnv.PROD, AppEnv.DEV].includes(getAppEnv());
const isCloud = getBehaveAs() === BehaveAs.CLOUD;
const isProdOrDev = environment.isProd() || environment.isDev();
const isCloud = environment.isCloud();
const isDisabled = process.env.DISABLE_SENTRY === "true";
const shouldEnable = !isDisabled && isProdOrDev && isCloud;
@@ -22,7 +15,7 @@ const shouldEnable = !isDisabled && isProdOrDev && isCloud;
Sentry.init({
dsn: "https://fe4e4aa4a283391808a5da396da20159@o4505260022104064.ingest.us.sentry.io/4507946746380288",
environment: getEnvironmentStr(),
environment: environment.getEnvironmentStr(),
enabled: shouldEnable,

View File

@@ -1,13 +1,13 @@
"use client";
import { isServerSide } from "@/lib/utils/is-server-side";
import { useEffect, useState } from "react";
import { Button } from "@/components/atoms/Button/Button";
import { Text } from "@/components/atoms/Text/Text";
import { Card } from "@/components/atoms/Card/Card";
import { WaitlistErrorContent } from "@/components/auth/WaitlistErrorContent";
import { isWaitlistErrorFromParams } from "@/app/api/auth/utils";
import { isWaitlistError } from "@/app/api/auth/utils";
import { useRouter } from "next/navigation";
import { ErrorCard } from "@/components/molecules/ErrorCard/ErrorCard";
import { environment } from "@/services/environment";
export default function AuthErrorPage() {
const [errorType, setErrorType] = useState<string | null>(null);
@@ -17,7 +17,7 @@ export default function AuthErrorPage() {
useEffect(() => {
// This code only runs on the client side
if (!isServerSide()) {
if (!environment.isServerSide()) {
const hash = window.location.hash.substring(1); // Remove the leading '#'
const params = new URLSearchParams(hash);
@@ -38,12 +38,9 @@ export default function AuthErrorPage() {
}
// Check if this is a waitlist/not allowed error
const isWaitlistError = isWaitlistErrorFromParams(
errorCode,
errorDescription,
);
const isWaitlistErr = isWaitlistError(errorCode, errorDescription);
if (isWaitlistError) {
if (isWaitlistErr) {
return (
<div className="flex h-screen items-center justify-center">
<Card className="w-full max-w-md p-8">
@@ -56,34 +53,25 @@ export default function AuthErrorPage() {
);
}
// Default error display for other types of errors
// Use ErrorCard for consistent error display
const errorMessage = errorDescription
? `${errorDescription}. If this error persists, please contact support at contact@agpt.co`
: "An authentication error occurred. Please contact support at contact@agpt.co";
return (
<div className="flex h-screen items-center justify-center">
<Card className="w-full max-w-md p-8">
<div className="flex flex-col items-center gap-6">
<Text variant="h3">Authentication Error</Text>
<div className="flex flex-col gap-2 text-center">
{errorType && (
<Text variant="body">
<strong>Error Type:</strong> {errorType}
</Text>
)}
{errorCode && (
<Text variant="body">
<strong>Error Code:</strong> {errorCode}
</Text>
)}
{errorDescription && (
<Text variant="body">
<strong>Description:</strong> {errorDescription}
</Text>
)}
</div>
<Button variant="primary" onClick={() => router.push("/login")}>
Back to Login
</Button>
</div>
</Card>
<div className="flex h-screen items-center justify-center p-4">
<div className="w-full max-w-md">
<ErrorCard
responseError={{
message: errorMessage,
detail: errorCode
? `Error code: ${errorCode}${errorType ? ` (${errorType})` : ""}`
: undefined,
}}
context="authentication"
onRetry={() => router.push("/login")}
/>
</div>
</div>
);
}

View File

@@ -7,7 +7,7 @@ import {
usePostV1CreateCredentials,
} from "@/app/api/__generated__/endpoints/integrations/integrations";
import { useToast } from "@/components/molecules/Toast/use-toast";
import type { PostV1CreateCredentialsBody } from "@/app/api/__generated__/models/postV1CreateCredentialsBody";
import { APIKeyCredentials } from "@/app/api/__generated__/models/aPIKeyCredentials";
import { useQueryClient } from "@tanstack/react-query";
import { useState } from "react";
@@ -89,7 +89,7 @@ export function useAPIKeyCredentialsModal({
api_key: values.apiKey,
title: values.title,
expires_at: expiresAt,
} as PostV1CreateCredentialsBody,
} as APIKeyCredentials,
});
}

View File

@@ -6,6 +6,7 @@ import ReactMarkdown from "react-markdown";
import type { GraphID } from "@/lib/autogpt-server-api/types";
import { askOtto } from "@/app/(platform)/build/actions";
import { cn } from "@/lib/utils";
import { environment } from "@/services/environment";
interface Message {
type: "user" | "assistant";
@@ -129,7 +130,7 @@ export default function OttoChatWidget({
};
// Don't render the chat widget if we're not on the build page or in local mode
if (process.env.NEXT_PUBLIC_BEHAVE_AS !== "CLOUD") {
if (environment.isLocal()) {
return null;
}

View File

@@ -1,7 +1,7 @@
import Shepherd from "shepherd.js";
import "shepherd.js/dist/css/shepherd.css";
import { sendGAEvent } from "@/services/analytics/google-analytics";
import { Key, storage } from "@/services/storage/local-storage";
import { analytics } from "@/services/analytics";
export const startTutorial = (
emptyNodeList: (forceEmpty: boolean) => boolean,
@@ -555,7 +555,7 @@ export const startTutorial = (
"use client";
console.debug("sendTutorialStep");
sendGAEvent("event", "tutorial_step_shown", { value: step.id });
analytics.sendGAEvent("event", "tutorial_step_shown", { value: step.id });
});
}

View File

@@ -42,8 +42,16 @@ function isVideoUrl(url: string): boolean {
if (url.includes("youtube.com/watch") || url.includes("youtu.be/")) {
return true;
}
if (url.includes("vimeo.com/")) {
return true;
try {
const parsed = new URL(url);
if (
parsed.hostname === "vimeo.com" ||
parsed.hostname === "www.vimeo.com"
) {
return true;
}
} catch {
// If URL parsing fails, treat as not a Vimeo URL.
}
return videoExtensions.some((ext) => url.toLowerCase().includes(ext));
}

View File

@@ -17,6 +17,7 @@ import { GraphExecutionJobInfo } from "@/app/api/__generated__/models/graphExecu
import { LibraryAgentPreset } from "@/app/api/__generated__/models/libraryAgentPreset";
import { useGetV1GetUserTimezone } from "@/app/api/__generated__/endpoints/auth/auth";
import { useOnboarding } from "@/providers/onboarding/onboarding-provider";
import { analytics } from "@/services/analytics";
export type RunVariant =
| "manual"
@@ -78,6 +79,10 @@ export function useAgentRunModal(
agent.graph_id,
).queryKey,
});
analytics.sendDatafastEvent("run_agent", {
name: agent.name,
id: agent.graph_id,
});
setIsOpen(false);
}
},
@@ -105,6 +110,11 @@ export function useAgentRunModal(
agent.graph_id,
),
});
analytics.sendDatafastEvent("schedule_agent", {
name: agent.name,
id: agent.graph_id,
cronExpression: cronExpression,
});
setIsOpen(false);
}
},

View File

@@ -37,6 +37,7 @@ import { useToastOnFail } from "@/components/molecules/Toast/use-toast";
import { AgentRunStatus, agentRunStatusMap } from "./agent-run-status-chip";
import useCredits from "@/hooks/useCredits";
import { AgentRunOutputView } from "./agent-run-output-view";
import { analytics } from "@/services/analytics";
export function AgentRunDetailsView({
agent,
@@ -131,7 +132,13 @@ export function AgentRunDetailsView({
run.inputs!,
run.credential_inputs!,
)
.then(({ id }) => onRun(id))
.then(({ id }) => {
analytics.sendDatafastEvent("run_agent", {
name: graph.name,
id: graph.id,
});
onRun(id);
})
.catch(toastOnFail("execute agent preset"));
}
@@ -142,7 +149,13 @@ export function AgentRunDetailsView({
run.inputs!,
run.credential_inputs!,
)
.then(({ id }) => onRun(id))
.then(({ id }) => {
analytics.sendDatafastEvent("run_agent", {
name: graph.name,
id: graph.id,
});
onRun(id);
})
.catch(toastOnFail("execute agent"));
}, [api, graph, run, onRun, toastOnFail]);

View File

@@ -43,6 +43,7 @@ import {
import { AgentStatus, AgentStatusChip } from "./agent-status-chip";
import { useOnboarding } from "@/providers/onboarding/onboarding-provider";
import { analytics } from "@/services/analytics";
export function AgentRunDraftView({
graph,
@@ -197,6 +198,12 @@ export function AgentRunDraftView({
}
// Mark run agent onboarding step as completed
completeOnboardingStep("MARKETPLACE_RUN_AGENT");
analytics.sendDatafastEvent("run_agent", {
name: graph.name,
id: graph.id,
});
if (runCount > 0) {
completeOnboardingStep("RE_RUN_AGENT");
}
@@ -373,6 +380,12 @@ export function AgentRunDraftView({
})
.catch(toastOnFail("set up agent run schedule"));
analytics.sendDatafastEvent("schedule_agent", {
name: graph.name,
id: graph.id,
cronExpression: cronExpression,
});
if (schedule && onCreateSchedule) onCreateSchedule(schedule);
},
[api, graph, inputValues, inputCredentials, onCreateSchedule, toastOnFail],

View File

@@ -8,9 +8,9 @@ import { EmailNotAllowedModal } from "@/components/auth/EmailNotAllowedModal";
import { GoogleOAuthButton } from "@/components/auth/GoogleOAuthButton";
import Turnstile from "@/components/auth/Turnstile";
import { Form, FormField } from "@/components/__legacy__/ui/form";
import { getBehaveAs } from "@/lib/utils";
import { LoadingLogin } from "./components/LoadingLogin";
import { useLoginPage } from "./useLoginPage";
import { environment } from "@/services/environment";
export default function LoginPage() {
const {
@@ -118,7 +118,7 @@ export default function LoginPage() {
type="login"
message={feedback}
isError={!!feedback}
behaveAs={getBehaveAs()}
behaveAs={environment.getBehaveAs()}
/>
</Form>
<AuthCard.BottomText

View File

@@ -1,6 +1,5 @@
import { useTurnstile } from "@/hooks/useTurnstile";
import { useSupabase } from "@/lib/supabase/hooks/useSupabase";
import { BehaveAs, getBehaveAs } from "@/lib/utils";
import { loginFormSchema, LoginProvider } from "@/types/auth";
import { zodResolver } from "@hookform/resolvers/zod";
import { useRouter } from "next/navigation";
@@ -8,6 +7,7 @@ import { useCallback, useEffect, useState } from "react";
import { useForm } from "react-hook-form";
import z from "zod";
import { useToast } from "@/components/molecules/Toast/use-toast";
import { environment } from "@/services/environment";
export function useLoginPage() {
const { supabase, user, isUserLoading } = useSupabase();
@@ -18,7 +18,7 @@ export function useLoginPage() {
const [isLoading, setIsLoading] = useState(false);
const [isGoogleLoading, setIsGoogleLoading] = useState(false);
const [showNotAllowedModal, setShowNotAllowedModal] = useState(false);
const isCloudEnv = getBehaveAs() === BehaveAs.CLOUD;
const isCloudEnv = environment.isCloud();
const isVercelPreview = process.env.NEXT_PUBLIC_VERCEL_ENV === "preview";
const turnstile = useTurnstile({

View File

@@ -6,10 +6,10 @@ import Link from "next/link";
import { User } from "@supabase/supabase-js";
import { cn } from "@/lib/utils";
import { useAgentInfo } from "./useAgentInfo";
import { LibraryAgent } from "@/app/api/__generated__/models/libraryAgent";
interface AgentInfoProps {
user: User | null;
agentId: string;
name: string;
creator: string;
shortDescription: string;
@@ -20,11 +20,12 @@ interface AgentInfoProps {
lastUpdated: string;
version: string;
storeListingVersionId: string;
libraryAgent: LibraryAgent | undefined;
isAgentAddedToLibrary: boolean;
}
export const AgentInfo = ({
user,
agentId,
name,
creator,
shortDescription,
@@ -35,7 +36,7 @@ export const AgentInfo = ({
lastUpdated,
version,
storeListingVersionId,
libraryAgent,
isAgentAddedToLibrary,
}: AgentInfoProps) => {
const {
handleDownload,
@@ -82,11 +83,15 @@ export const AgentInfo = ({
"transition-colors duration-200 hover:bg-violet-500 disabled:bg-zinc-400",
)}
data-testid={"agent-add-library-button"}
onClick={handleLibraryAction}
disabled={isAddingAgentToLibrary}
onClick={() =>
handleLibraryAction({
isAddingAgentFirstTime: !isAgentAddedToLibrary,
})
}
>
<span className="justify-start font-sans text-sm font-medium leading-snug text-primary-foreground">
{libraryAgent ? "See runs" : "Add to library"}
{isAgentAddedToLibrary ? "See runs" : "Add to library"}
</span>
</button>
)}
@@ -96,7 +101,7 @@ export const AgentInfo = ({
"transition-colors duration-200 hover:bg-zinc-200/70 disabled:bg-zinc-200/40",
)}
data-testid={"agent-download-button"}
onClick={handleDownload}
onClick={() => handleDownload(agentId, name)}
disabled={isDownloadingAgent}
>
<div className="justify-start text-center font-sans text-sm font-medium leading-snug text-zinc-800">

View File

@@ -1,10 +1,11 @@
import { usePostV2AddMarketplaceAgent } from "@/app/api/__generated__/endpoints/library/library";
import { LibraryAgent } from "@/app/api/__generated__/models/libraryAgent";
import { useToast } from "@/components/molecules/Toast/use-toast";
import { useRouter } from "next/navigation";
import * as Sentry from "@sentry/nextjs";
import { useGetV2DownloadAgentFile } from "@/app/api/__generated__/endpoints/store/store";
import { useOnboarding } from "@/providers/onboarding/onboarding-provider";
import { analytics } from "@/services/analytics";
import { LibraryAgent } from "@/app/api/__generated__/models/libraryAgent";
interface UseAgentInfoProps {
storeListingVersionId: string;
@@ -16,29 +17,9 @@ export const useAgentInfo = ({ storeListingVersionId }: UseAgentInfoProps) => {
const { completeStep } = useOnboarding();
const {
mutate: addMarketplaceAgentToLibrary,
mutateAsync: addMarketplaceAgentToLibrary,
isPending: isAddingAgentToLibrary,
} = usePostV2AddMarketplaceAgent({
mutation: {
onSuccess: ({ data }) => {
completeStep("MARKETPLACE_ADD_AGENT");
router.push(`/library/agents/${(data as LibraryAgent).id}`);
toast({
title: "Agent Added",
description: "Redirecting to your library...",
duration: 2000,
});
},
onError: (error) => {
Sentry.captureException(error);
toast({
title: "Error",
description: "Failed to add agent to library. Please try again.",
variant: "destructive",
});
},
},
});
} = usePostV2AddMarketplaceAgent();
const { refetch: downloadAgent, isFetching: isDownloadingAgent } =
useGetV2DownloadAgentFile(storeListingVersionId, {
@@ -50,13 +31,46 @@ export const useAgentInfo = ({ storeListingVersionId }: UseAgentInfoProps) => {
},
});
const handleLibraryAction = async () => {
addMarketplaceAgentToLibrary({
data: { store_listing_version_id: storeListingVersionId },
});
const handleLibraryAction = async ({
isAddingAgentFirstTime,
}: {
isAddingAgentFirstTime: boolean;
}) => {
try {
const { data: response } = await addMarketplaceAgentToLibrary({
data: { store_listing_version_id: storeListingVersionId },
});
const data = response as LibraryAgent;
if (isAddingAgentFirstTime) {
completeStep("MARKETPLACE_ADD_AGENT");
analytics.sendDatafastEvent("add_to_library", {
name: data.name,
id: data.id,
});
}
router.push(`/library/agents/${data.id}`);
toast({
title: "Agent Added",
description: "Redirecting to your library...",
duration: 2000,
});
} catch (error) {
Sentry.captureException(error);
toast({
title: "Error",
description: "Failed to add agent to library. Please try again.",
variant: "destructive",
});
}
};
const handleDownload = async () => {
const handleDownload = async (agentId: string, agentName: string) => {
try {
const { data: file } = await downloadAgent();
@@ -74,6 +88,11 @@ export const useAgentInfo = ({ storeListingVersionId }: UseAgentInfoProps) => {
window.URL.revokeObjectURL(url);
analytics.sendDatafastEvent("download_agent", {
name: agentName,
id: agentId,
});
toast({
title: "Download Complete",
description: "Your agent has been successfully downloaded.",

View File

@@ -82,6 +82,7 @@ export const MainAgentPage = ({ params }: MainAgentPageProps) => {
<div className="w-full md:w-auto md:shrink-0">
<AgentInfo
user={user}
agentId={agent.active_version_id ?? ""}
name={agent.agent_name}
creator={agent.creator}
shortDescription={agent.sub_heading}
@@ -92,7 +93,7 @@ export const MainAgentPage = ({ params }: MainAgentPageProps) => {
lastUpdated={agent.last_updated.toISOString()}
version={agent.versions[agent.versions.length - 1]}
storeListingVersionId={agent.store_listing_version_id}
libraryAgent={libraryAgent}
isAgentAddedToLibrary={Boolean(libraryAgent)}
/>
</div>
<AgentImages

View File

@@ -17,10 +17,10 @@ import {
FormItem,
FormLabel,
} from "@/components/__legacy__/ui/form";
import { getBehaveAs } from "@/lib/utils";
import { WarningOctagonIcon } from "@phosphor-icons/react/dist/ssr";
import { LoadingSignup } from "./components/LoadingSignup";
import { useSignupPage } from "./useSignupPage";
import { environment } from "@/services/environment";
export default function SignupPage() {
const {
@@ -196,7 +196,7 @@ export default function SignupPage() {
type="signup"
message={feedback}
isError={!!feedback}
behaveAs={getBehaveAs()}
behaveAs={environment.getBehaveAs()}
/>
<AuthCard.BottomText

View File

@@ -1,6 +1,5 @@
import { useTurnstile } from "@/hooks/useTurnstile";
import { useSupabase } from "@/lib/supabase/hooks/useSupabase";
import { BehaveAs, getBehaveAs } from "@/lib/utils";
import { LoginProvider, signupFormSchema } from "@/types/auth";
import { zodResolver } from "@hookform/resolvers/zod";
import { useRouter } from "next/navigation";
@@ -8,6 +7,7 @@ import { useCallback, useEffect, useState } from "react";
import { useForm } from "react-hook-form";
import z from "zod";
import { useToast } from "@/components/molecules/Toast/use-toast";
import { environment } from "@/services/environment";
export function useSignupPage() {
const { supabase, user, isUserLoading } = useSupabase();
@@ -19,7 +19,7 @@ export function useSignupPage() {
const [isGoogleLoading, setIsGoogleLoading] = useState(false);
const [showNotAllowedModal, setShowNotAllowedModal] = useState(false);
const isCloudEnv = getBehaveAs() === BehaveAs.CLOUD;
const isCloudEnv = environment.isCloud();
const isVercelPreview = process.env.NEXT_PUBLIC_VERCEL_ENV === "preview";
const turnstile = useTurnstile({
@@ -59,6 +59,7 @@ export function useSignupPage() {
resetCaptcha();
return;
}
try {
const response = await fetch("/api/auth/provider", {
method: "POST",
@@ -71,7 +72,6 @@ export function useSignupPage() {
setIsGoogleLoading(false);
resetCaptcha();
// Check for waitlist error
if (error === "not_allowed") {
setShowNotAllowedModal(true);
return;
@@ -149,6 +149,7 @@ export function useSignupPage() {
setShowNotAllowedModal(true);
return;
}
toast({
title: result?.error || "Signup failed",
variant: "destructive",

View File

@@ -33,7 +33,7 @@ export async function POST(request: Request) {
if (error) {
// Check for waitlist/allowlist error
if (isWaitlistError(error)) {
if (isWaitlistError(error?.code, error?.message)) {
logWaitlistError("OAuth Provider", error.message);
return NextResponse.json({ error: "not_allowed" }, { status: 403 });
}

View File

@@ -30,6 +30,7 @@ export async function POST(request: Request) {
turnstileToken ?? "",
"signup",
);
if (!captchaOk) {
return NextResponse.json(
{ error: "CAPTCHA verification failed. Please try again." },
@@ -48,8 +49,7 @@ export async function POST(request: Request) {
const { data, error } = await supabase.auth.signUp(parsed.data);
if (error) {
// Check for waitlist/allowlist error
if (isWaitlistError(error)) {
if (isWaitlistError(error?.code, error?.message)) {
logWaitlistError("Signup", error.message);
return NextResponse.json({ error: "not_allowed" }, { status: 403 });
}

View File

@@ -1,49 +1,45 @@
/**
* Checks if a Supabase auth error is related to the waitlist/allowlist
* Checks if an error is related to the waitlist/allowlist
*
* Can be used with either:
* - Error objects from Supabase auth operations: `isWaitlistError(error?.code, error?.message)`
* - URL parameters from OAuth callbacks: `isWaitlistError(errorCode, errorDescription)`
*
* The PostgreSQL trigger raises P0001 with message format:
* "The email address "email" is not allowed to register. Please contact support for assistance."
*
* @param error - The error object from Supabase auth operations
* @returns true if this is a waitlist/allowlist error
*/
export function isWaitlistError(error: any): boolean {
if (!error?.message) return false;
if (error?.code === "P0001") return true;
return (
error.message.includes("P0001") || // PostgreSQL custom error code
error.message.includes("not allowed to register") || // Trigger message
error.message.toLowerCase().includes("allowed_users") // Table reference
);
}
/**
* Checks if OAuth callback URL parameters indicate a waitlist error
*
* This is for the auth-code-error page which receives errors via URL hash params
* from Supabase OAuth redirects
*
* @param errorCode - The error_code parameter from the URL
* @param errorDescription - The error_description parameter from the URL
* @param code - Error code (e.g., "P0001", "unexpected_failure") or null
* @param message - Error message/description or null
* @returns true if this appears to be a waitlist/allowlist error
*/
export function isWaitlistErrorFromParams(
errorCode?: string | null,
errorDescription?: string | null,
export function isWaitlistError(
code?: string | null,
message?: string | null,
): boolean {
if (!errorDescription) return false;
// Check for explicit PostgreSQL trigger error code
if (code === "P0001") return true;
if (errorCode === "P0001") return true;
if (!message) return false;
const description = errorDescription.toLowerCase();
const lowerMessage = message.toLowerCase();
// Check for the generic database error that occurs during waitlist check
// This happens when Supabase wraps the PostgreSQL trigger error
if (
code === "unexpected_failure" &&
message === "Database error saving new user"
) {
return true;
}
// Check for various waitlist-related patterns in the message
return (
description.includes("p0001") || // PostgreSQL error code might be in description
description.includes("not allowed") ||
description.includes("waitlist") ||
description.includes("allowlist") ||
description.includes("allowed_users")
lowerMessage.includes("p0001") || // PostgreSQL error code in message
lowerMessage.includes("not allowed") || // Common waitlist message
lowerMessage.includes("waitlist") || // Explicit waitlist mention
lowerMessage.includes("allowlist") || // Explicit allowlist mention
lowerMessage.includes("allowed_users") || // Database table reference
lowerMessage.includes("not allowed to register") // Full trigger message
);
}
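For reference, the merged predicate from the hunks above, assembled into a standalone sketch (the `export` keyword is dropped here so the snippet runs on its own). It accepts either a Supabase auth error's fields or OAuth callback URL parameters:

```typescript
/**
 * Checks if an error is related to the waitlist/allowlist.
 * Call as isWaitlistError(error?.code, error?.message) for Supabase errors,
 * or isWaitlistError(errorCode, errorDescription) for OAuth URL params.
 */
function isWaitlistError(
  code?: string | null,
  message?: string | null,
): boolean {
  // Explicit PostgreSQL trigger error code
  if (code === "P0001") return true;
  if (!message) return false;

  const lowerMessage = message.toLowerCase();

  // Supabase wraps the PostgreSQL trigger error in a generic database failure
  if (
    code === "unexpected_failure" &&
    message === "Database error saving new user"
  ) {
    return true;
  }

  // Waitlist-related patterns in the message text
  return (
    lowerMessage.includes("p0001") ||
    lowerMessage.includes("not allowed") ||
    lowerMessage.includes("waitlist") ||
    lowerMessage.includes("allowlist") ||
    lowerMessage.includes("allowed_users") ||
    lowerMessage.includes("not allowed to register")
  );
}
```

This is why both call sites in the route handlers change from `isWaitlistError(error)` to `isWaitlistError(error?.code, error?.message)`, and the separate `isWaitlistErrorFromParams` helper can be deleted.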

View File

@@ -3,20 +3,19 @@ import {
createRequestHeaders,
getServerAuthToken,
} from "@/lib/autogpt-server-api/helpers";
import { isServerSide } from "@/lib/utils/is-server-side";
import { getAgptServerBaseUrl } from "@/lib/env-config";
import { transformDates } from "./date-transformer";
import { environment } from "@/services/environment";
const FRONTEND_BASE_URL =
process.env.NEXT_PUBLIC_FRONTEND_BASE_URL || "http://localhost:3000";
const API_PROXY_BASE_URL = `${FRONTEND_BASE_URL}/api/proxy`; // Sending request via nextjs Server
const getBaseUrl = (): string => {
if (!isServerSide()) {
if (!environment.isServerSide()) {
return API_PROXY_BASE_URL;
} else {
return getAgptServerBaseUrl();
return environment.getAGPTServerBaseUrl();
}
};
@@ -77,7 +76,7 @@ export const customMutator = async <
// The caching in React Query in our system depends on the url, so the base_url could be different for the server and client sides.
const fullUrl = `${baseUrl}${url}${queryString}`;
if (isServerSide()) {
if (environment.isServerSide()) {
try {
const token = await getServerAuthToken();
const authHeaders = createRequestHeaders(token, !!data, contentType);
@@ -100,7 +99,7 @@ export const customMutator = async <
response_data?.detail || response_data?.message || response.statusText;
console.error(
`Request failed ${isServerSide() ? "on server" : "on client"}`,
`Request failed ${environment.isServerSide() ? "on server" : "on client"}`,
{ status: response.status, url: fullUrl, data: response_data },
);

View File

@@ -207,7 +207,7 @@
"schema": {
"oneOf": [
{ "$ref": "#/components/schemas/OAuth2Credentials" },
{ "$ref": "#/components/schemas/APIKeyCredentials-Input" },
{ "$ref": "#/components/schemas/APIKeyCredentials" },
{ "$ref": "#/components/schemas/UserPasswordCredentials" },
{ "$ref": "#/components/schemas/HostScopedCredentials-Input" }
],
@@ -215,7 +215,7 @@
"propertyName": "type",
"mapping": {
"oauth2": "#/components/schemas/OAuth2Credentials",
"api_key": "#/components/schemas/APIKeyCredentials-Input",
"api_key": "#/components/schemas/APIKeyCredentials",
"user_password": "#/components/schemas/UserPasswordCredentials",
"host_scoped": "#/components/schemas/HostScopedCredentials-Input"
}
@@ -233,7 +233,7 @@
"schema": {
"oneOf": [
{ "$ref": "#/components/schemas/OAuth2Credentials" },
{ "$ref": "#/components/schemas/APIKeyCredentials-Output" },
{ "$ref": "#/components/schemas/APIKeyCredentials" },
{ "$ref": "#/components/schemas/UserPasswordCredentials" },
{
"$ref": "#/components/schemas/HostScopedCredentials-Output"
@@ -243,7 +243,7 @@
"propertyName": "type",
"mapping": {
"oauth2": "#/components/schemas/OAuth2Credentials",
"api_key": "#/components/schemas/APIKeyCredentials-Output",
"api_key": "#/components/schemas/APIKeyCredentials",
"user_password": "#/components/schemas/UserPasswordCredentials",
"host_scoped": "#/components/schemas/HostScopedCredentials-Output"
}
@@ -302,7 +302,7 @@
"schema": {
"oneOf": [
{ "$ref": "#/components/schemas/OAuth2Credentials" },
{ "$ref": "#/components/schemas/APIKeyCredentials-Output" },
{ "$ref": "#/components/schemas/APIKeyCredentials" },
{ "$ref": "#/components/schemas/UserPasswordCredentials" },
{
"$ref": "#/components/schemas/HostScopedCredentials-Output"
@@ -312,7 +312,7 @@
"propertyName": "type",
"mapping": {
"oauth2": "#/components/schemas/OAuth2Credentials",
"api_key": "#/components/schemas/APIKeyCredentials-Output",
"api_key": "#/components/schemas/APIKeyCredentials",
"user_password": "#/components/schemas/UserPasswordCredentials",
"host_scoped": "#/components/schemas/HostScopedCredentials-Output"
}
@@ -4782,41 +4782,7 @@
},
"components": {
"schemas": {
"APIKeyCredentials-Input": {
"properties": {
"id": { "type": "string", "title": "Id" },
"provider": { "type": "string", "title": "Provider" },
"title": {
"anyOf": [{ "type": "string" }, { "type": "null" }],
"title": "Title"
},
"type": {
"type": "string",
"const": "api_key",
"title": "Type",
"default": "api_key"
},
"api_key": {
"type": "string",
"format": "password",
"title": "Api Key",
"writeOnly": true
},
"expires_at": {
"anyOf": [{ "type": "integer" }, { "type": "null" }],
"title": "Expires At",
"description": "Unix timestamp (seconds) indicating when the API key expires (if at all)"
},
"api_key_env_var": {
"anyOf": [{ "type": "string" }, { "type": "null" }],
"title": "Api Key Env Var"
}
},
"type": "object",
"required": ["provider", "api_key"],
"title": "APIKeyCredentials"
},
"APIKeyCredentials-Output": {
"APIKeyCredentials": {
"properties": {
"id": { "type": "string", "title": "Id" },
"provider": { "type": "string", "title": "Provider" },

View File

@@ -3,12 +3,12 @@ import {
makeAuthenticatedFileUpload,
makeAuthenticatedRequest,
} from "@/lib/autogpt-server-api/helpers";
import { getAgptServerBaseUrl } from "@/lib/env-config";
import { environment } from "@/services/environment";
import { NextRequest, NextResponse } from "next/server";
function buildBackendUrl(path: string[], queryString: string): string {
const backendPath = path.join("/");
return `${getAgptServerBaseUrl()}/${backendPath}${queryString}`;
return `${environment.getAGPTServerBaseUrl()}/${backendPath}${queryString}`;
}
async function handleJsonRequest(

View File

@@ -6,13 +6,12 @@ import "./globals.css";
import { Providers } from "@/app/providers";
import TallyPopupSimple from "@/components/molecules/TallyPoup/TallyPopup";
import { GoogleAnalytics } from "@/services/analytics/google-analytics";
import { Toaster } from "@/components/molecules/Toast/toaster";
import { ReactQueryDevtools } from "@tanstack/react-query-devtools";
import { SpeedInsights } from "@vercel/speed-insights/next";
import { Analytics } from "@vercel/analytics/next";
import { headers } from "next/headers";
import Script from "next/script";
import { SetupAnalytics } from "@/services/analytics";
export const metadata: Metadata = {
title: "AutoGPT Platform",
@@ -26,7 +25,7 @@ export default async function RootLayout({
}>) {
const headersList = await headers();
const host = headersList.get("host") || "";
const isPlatformDomain = host === "platform.agpt.co";
return (
<html
lang="en"
@@ -34,17 +33,12 @@ export default async function RootLayout({
suppressHydrationWarning
>
<head>
<GoogleAnalytics
gaId={process.env.NEXT_PUBLIC_GA_MEASUREMENT_ID || "G-FH2XK2W4GN"} // This is the measurement Id for the Google Analytics dev project
<SetupAnalytics
host={host}
ga={{
gaId: process.env.NEXT_PUBLIC_GA_MEASUREMENT_ID || "G-FH2XK2W4GN",
}}
/>
{isPlatformDomain && (
<Script
strategy="afterInteractive"
data-website-id="dfid_g5wtBIiHUwSkWKcGz80lu"
data-domain="agpt.co"
src="https://datafa.st/js/script.js"
/>
)}
</head>
<body>
<Providers

View File

@@ -1,248 +0,0 @@
"use client";
import { StarRatingIcons } from "@/components/__legacy__/ui/icons";
import { Separator } from "@/components/__legacy__/ui/separator";
import BackendAPI, { LibraryAgent } from "@/lib/autogpt-server-api";
import { useRouter } from "next/navigation";
import Link from "next/link";
import { useToast } from "@/components/molecules/Toast/use-toast";
import { useOnboarding } from "@/providers/onboarding/onboarding-provider";
import { User } from "@supabase/supabase-js";
import { cn } from "@/lib/utils";
import { FC, useCallback, useMemo, useState } from "react";
interface AgentInfoProps {
user: User | null;
name: string;
creator: string;
shortDescription: string;
longDescription: string;
rating: number;
runs: number;
categories: string[];
lastUpdated: string;
version: string;
storeListingVersionId: string;
libraryAgent: LibraryAgent | null;
}
export const AgentInfo: FC<AgentInfoProps> = ({
user,
name,
creator,
shortDescription,
longDescription,
rating,
runs,
categories,
lastUpdated,
version,
storeListingVersionId,
libraryAgent,
}) => {
const router = useRouter();
const api = useMemo(() => new BackendAPI(), []);
const { toast } = useToast();
const { completeStep } = useOnboarding();
const [adding, setAdding] = useState(false);
const [downloading, setDownloading] = useState(false);
const libraryAction = useCallback(async () => {
setAdding(true);
if (libraryAgent) {
toast({
description: "Redirecting to your library...",
duration: 2000,
});
// Redirect to the library agent page
router.push(`/library/agents/${libraryAgent.id}`);
return;
}
try {
const newLibraryAgent = await api.addMarketplaceAgentToLibrary(
storeListingVersionId,
);
completeStep("MARKETPLACE_ADD_AGENT");
router.push(`/library/agents/${newLibraryAgent.id}`);
toast({
title: "Agent Added",
description: "Redirecting to your library...",
duration: 2000,
});
} catch (error) {
console.error("Failed to add agent to library:", error);
toast({
title: "Error",
description: "Failed to add agent to library. Please try again.",
variant: "destructive",
});
}
}, [toast, api, storeListingVersionId, completeStep, router]);
const handleDownload = useCallback(async () => {
const downloadAgent = async (): Promise<void> => {
setDownloading(true);
try {
const file = await api.downloadStoreAgent(storeListingVersionId);
// Similar to Marketplace v1
const jsonData = JSON.stringify(file, null, 2);
// Create a Blob from the file content
const blob = new Blob([jsonData], { type: "application/json" });
// Create a temporary URL for the Blob
const url = window.URL.createObjectURL(blob);
// Create a temporary anchor element
const a = document.createElement("a");
a.href = url;
a.download = `agent_${storeListingVersionId}.json`; // Set the filename
// Append the anchor to the body, click it, and remove it
document.body.appendChild(a);
a.click();
document.body.removeChild(a);
// Revoke the temporary URL
window.URL.revokeObjectURL(url);
toast({
title: "Download Complete",
description: "Your agent has been successfully downloaded.",
});
} catch (error) {
console.error(`Error downloading agent:`, error);
toast({
title: "Error",
description: "Failed to download agent. Please try again.",
variant: "destructive",
});
}
};
await downloadAgent();
setDownloading(false);
}, [setDownloading, api, storeListingVersionId, toast]);
return (
<div className="w-full max-w-[396px] px-4 sm:px-6 lg:w-[396px] lg:px-0">
{/* Title */}
<div
data-testid="agent-title"
className="mb-3 w-full font-poppins text-2xl font-medium leading-normal text-neutral-900 dark:text-neutral-100 sm:text-3xl lg:mb-4 lg:text-[35px] lg:leading-10"
>
{name}
</div>
{/* Creator */}
<div className="mb-3 flex w-full items-center gap-1.5 lg:mb-4">
<div className="text-base font-normal text-neutral-800 dark:text-neutral-200 sm:text-lg lg:text-xl">
by
</div>
<Link
data-testid={"agent-creator"}
href={`/marketplace/creator/${encodeURIComponent(creator)}`}
className="text-base font-medium text-neutral-800 hover:underline dark:text-neutral-200 sm:text-lg lg:text-xl"
>
{creator}
</Link>
</div>
{/* Short Description */}
<div className="mb-4 line-clamp-2 w-full text-base font-normal leading-normal text-neutral-600 dark:text-neutral-300 sm:text-lg lg:mb-6 lg:text-xl lg:leading-7">
{shortDescription}
</div>
{/* Buttons */}
<div className="mb-4 flex w-full gap-3 lg:mb-[60px]">
{user && (
<button
className={cn(
"inline-flex min-w-24 items-center justify-center rounded-full bg-violet-600 px-4 py-3",
"transition-colors duration-200 hover:bg-violet-500 disabled:bg-zinc-400",
)}
data-testid={"agent-add-library-button"}
onClick={libraryAction}
disabled={adding}
>
<span className="justify-start font-sans text-sm font-medium leading-snug text-primary-foreground">
{libraryAgent ? "See runs" : "Add to library"}
</span>
</button>
)}
<button
className={cn(
"inline-flex min-w-24 items-center justify-center rounded-full bg-zinc-200 px-4 py-3",
"transition-colors duration-200 hover:bg-zinc-200/70 disabled:bg-zinc-200/40",
)}
data-testid={"agent-download-button"}
onClick={handleDownload}
disabled={downloading}
>
<div className="justify-start text-center font-sans text-sm font-medium leading-snug text-zinc-800">
Download agent
</div>
</button>
</div>
{/* Rating and Runs */}
<div className="mb-4 flex w-full items-center justify-between lg:mb-[44px]">
<div className="flex items-center gap-1.5 sm:gap-2">
<span className="whitespace-nowrap text-base font-semibold text-neutral-800 dark:text-neutral-200 sm:text-lg">
{rating.toFixed(1)}
</span>
<div className="flex gap-0.5">{StarRatingIcons(rating)}</div>
</div>
<div className="whitespace-nowrap text-base font-semibold text-neutral-800 dark:text-neutral-200 sm:text-lg">
{runs.toLocaleString()} runs
</div>
</div>
{/* Separator */}
<Separator className="mb-4 lg:mb-[44px]" />
{/* Description Section */}
<div className="mb-4 w-full lg:mb-[36px]">
<div className="decoration-skip-ink-none mb-1.5 text-base font-medium leading-6 text-neutral-800 dark:text-neutral-200 sm:mb-2">
Description
</div>
<div
data-testid={"agent-description"}
className="whitespace-pre-line text-base font-normal leading-6 text-neutral-600 dark:text-neutral-400"
>
{longDescription}
</div>
</div>
{/* Categories */}
<div className="mb-4 flex w-full flex-col gap-1.5 sm:gap-2 lg:mb-[36px]">
<div className="decoration-skip-ink-none mb-1.5 text-base font-medium leading-6 text-neutral-800 dark:text-neutral-200 sm:mb-2">
Categories
</div>
<div className="flex flex-wrap gap-1.5 sm:gap-2">
{categories.map((category, index) => (
<div
key={index}
className="decoration-skip-ink-none whitespace-nowrap rounded-full border border-neutral-600 bg-white px-2 py-0.5 text-base font-normal leading-6 text-neutral-800 underline-offset-[from-font] dark:border-neutral-700 dark:bg-neutral-800 dark:text-neutral-200 sm:px-[16px] sm:py-[10px]"
>
{category}
</div>
))}
</div>
</div>
{/* Version History */}
<div className="flex w-full flex-col gap-0.5 sm:gap-1">
<div className="decoration-skip-ink-none mb-1.5 text-base font-medium leading-6 text-neutral-800 dark:text-neutral-200 sm:mb-2">
Version history
</div>
<div className="decoration-skip-ink-none text-base font-normal leading-6 text-neutral-600 underline-offset-[from-font] dark:text-neutral-400">
Last updated {lastUpdated}
</div>
<div className="text-xs text-neutral-600 dark:text-neutral-400 sm:text-sm">
Version {version}
</div>
</div>
</div>
);
};

View File

@@ -29,7 +29,7 @@ const TooltipContent = React.forwardRef<
ref={ref}
sideOffset={sideOffset}
className={cn(
"z-50 max-w-xs overflow-hidden rounded-md border border-slate-50 bg-white px-3 py-1.5 text-xs text-gray-900 animate-in fade-in-0 zoom-in-95 data-[state=closed]:animate-out data-[state=closed]:fade-out-0 data-[state=closed]:zoom-out-95 data-[side=bottom]:slide-in-from-top-2 data-[side=left]:slide-in-from-right-2 data-[side=right]:slide-in-from-left-2 data-[side=top]:slide-in-from-bottom-2",
"z-50 max-w-xs overflow-hidden rounded-md bg-white px-3 py-1.5 text-xs font-normal leading-normal text-zinc-900 outline outline-1 outline-offset-[-1px] outline-gray-100 animate-in fade-in-0 zoom-in-95 data-[state=closed]:animate-out data-[state=closed]:fade-out-0 data-[state=closed]:zoom-out-95 data-[side=bottom]:slide-in-from-top-2 data-[side=left]:slide-in-from-right-2 data-[side=right]:slide-in-from-left-2 data-[side=top]:slide-in-from-bottom-2",
className,
)}
{...props}

View File

@@ -1,7 +1,7 @@
import { HelpItem } from "@/components/auth/help-item";
import { Card, CardContent } from "@/components/__legacy__/ui/card";
import { BehaveAs } from "@/lib/utils";
import { AlertCircle, CheckCircle } from "lucide-react";
import { BehaveAs } from "@/services/environment";
interface Props {
type: "login" | "signup";

View File

@@ -1,6 +1,6 @@
"use client";
import { cn } from "@/lib/utils";
import { isServerSide } from "@/lib/utils/is-server-side";
import { environment } from "@/services/environment";
import { useEffect, useRef, useState } from "react";
export interface TurnstileProps {
@@ -32,7 +32,7 @@ export function Turnstile({
// Load the Turnstile script
useEffect(() => {
if (isServerSide() || !shouldRender) return;
if (environment.isServerSide() || !shouldRender) return;
// Skip if already loaded
if (window.turnstile) {

View File

@@ -43,7 +43,15 @@ export function WaitlistErrorContent({
exact same email address you used when signing up.
</Text>
<Text variant="small" className="text-center text-muted-foreground">
If you&apos;re not sure which email you used or need help,{" "}
If you&apos;re not sure which email you used or need help, contact us
at{" "}
<a
href="mailto:contact@agpt.co"
className="underline hover:text-foreground"
>
contact@agpt.co
</a>{" "}
or{" "}
<a
href="https://discord.gg/autogpt"
target="_blank"

View File

@@ -1,4 +1,4 @@
import { isServerSide } from "@/lib/utils/is-server-side";
import { environment } from "@/services/environment";
import { useCallback, useEffect, useRef, useState } from "react";
interface useInfiniteScrollProps {
@@ -34,7 +34,8 @@ export const useInfiniteScroll = ({
}, [hasNextPage, isFetchingNextPage, fetchNextPage, onLoadMore]);
useEffect(() => {
if (!hasNextPage || !endOfListRef.current || isServerSide()) return;
if (!hasNextPage || !endOfListRef.current || environment.isServerSide())
return;
const observer = new IntersectionObserver(
(entries) => {

View File

@@ -5,14 +5,15 @@ import { InfoIcon, AlertTriangleIcon, XCircleIcon } from "lucide-react";
import { cn } from "@/lib/utils";
const alertVariants = cva(
"relative w-full rounded-lg border border-neutral-200 px-4 py-3 text-sm [&>svg]:absolute [&>svg]:left-4 [&>svg]:top-[12px] [&>svg]:text-neutral-950 [&>svg~*]:pl-7",
"relative w-full rounded-lg border border-zinc-200 px-4 py-3 text-sm [&>svg]:absolute [&>svg]:left-4 [&>svg]:top-1/2 [&>svg]:-translate-y-1/2 [&>svg]:text-zinc-800 [&>svg~*]:pl-7",
{
variants: {
variant: {
default: "bg-white text-neutral-950 [&>svg]:text-blue-500",
default: "bg-white text-zinc-800 [&>svg]:text-purple-500",
warning:
"bg-white border-orange-500/50 text-orange-600 [&>svg]:text-orange-500",
error: "bg-white border-red-500/50 text-red-500 [&>svg]:text-red-500",
"bg-[#FFF3E680] border-yellow-300 text-zinc-800 [&>svg]:text-orange-600",
error:
"bg-[#FDECEC80] border-red-300 text-zinc-800 [&>svg]:text-red-500",
},
},
defaultVariants: {
@@ -45,7 +46,7 @@ const Alert = React.forwardRef<HTMLDivElement, AlertProps>(
className={cn(alertVariants({ variant: currentVariant }), className)}
{...props}
>
<IconComponent className="h-4 w-4" />
<IconComponent className="h-[1.125rem] w-[1.125rem]" />
{children}
</div>
);

View File

@@ -1,7 +1,6 @@
import { useState, useCallback, useEffect } from "react";
import { verifyTurnstileToken } from "@/lib/turnstile";
import { BehaveAs, getBehaveAs } from "@/lib/utils";
import { isServerSide } from "@/lib/utils/is-server-side";
import { environment } from "@/services/environment";
interface UseTurnstileOptions {
action?: string;
@@ -46,17 +45,15 @@ export function useTurnstile({
const [widgetId, setWidgetId] = useState<string | null>(null);
useEffect(() => {
const behaveAs = getBehaveAs();
const isCloud = environment.isCloud();
const hasTurnstileKey = !!TURNSTILE_SITE_KEY;
const turnstileDisabled = process.env.NEXT_PUBLIC_TURNSTILE !== "enabled";
// Only render Turnstile in cloud environment if not explicitly disabled
setShouldRender(
behaveAs === BehaveAs.CLOUD && hasTurnstileKey && !turnstileDisabled,
);
setShouldRender(isCloud && hasTurnstileKey && !turnstileDisabled);
// Skip verification if disabled, in local development, or no key
if (turnstileDisabled || behaveAs !== BehaveAs.CLOUD || !hasTurnstileKey) {
if (turnstileDisabled || !isCloud || !hasTurnstileKey) {
setVerified(true);
}
}, []);
@@ -80,7 +77,12 @@ export function useTurnstile({
setError(null);
// Only reset the actual Turnstile widget if it exists and shouldRender is true
if (shouldRender && !isServerSide() && window.turnstile && widgetId) {
if (
shouldRender &&
!environment.isServerSide() &&
window.turnstile &&
widgetId
) {
try {
window.turnstile.reset(widgetId);
} catch (err) {

View File

@@ -3,12 +3,6 @@ import { getServerSupabase } from "@/lib/supabase/server/getServerSupabase";
import { createBrowserClient } from "@supabase/ssr";
import type { SupabaseClient } from "@supabase/supabase-js";
import { Key, storage } from "@/services/storage/local-storage";
import {
getAgptServerApiUrl,
getAgptWsServerUrl,
getSupabaseUrl,
getSupabaseAnonKey,
} from "@/lib/env-config";
import * as Sentry from "@sentry/nextjs";
import type {
AddUserCreditsResponse,
@@ -73,9 +67,9 @@ import type {
UserPasswordCredentials,
UsersBalanceHistoryResponse,
} from "./types";
import { isServerSide } from "../utils/is-server-side";
import { environment } from "@/services/environment";
const isClient = !isServerSide();
const isClient = environment.isClientSide();
export default class BackendAPI {
private baseUrl: string;
@@ -93,8 +87,8 @@ export default class BackendAPI {
heartbeatTimeoutID: number | null = null;
constructor(
baseUrl: string = getAgptServerApiUrl(),
wsUrl: string = getAgptWsServerUrl(),
baseUrl: string = environment.getAGPTServerApiUrl(),
wsUrl: string = environment.getAGPTWsServerUrl(),
) {
this.baseUrl = baseUrl;
this.wsUrl = wsUrl;
@@ -102,9 +96,13 @@ export default class BackendAPI {
private async getSupabaseClient(): Promise<SupabaseClient | null> {
return isClient
? createBrowserClient(getSupabaseUrl(), getSupabaseAnonKey(), {
isSingleton: true,
})
? createBrowserClient(
environment.getSupabaseUrl(),
environment.getSupabaseAnonKey(),
{
isSingleton: true,
},
)
: await getServerSupabase();
}

View File

@@ -1,6 +1,6 @@
"use client";
import { isServerSide } from "../utils/is-server-side";
import { environment } from "@/services/environment";
import BackendAPI from "./client";
import React, { createContext, useMemo } from "react";
@@ -20,7 +20,7 @@ export function BackendAPIProvider({
}): React.ReactNode {
const api = useMemo(() => new BackendAPI(), []);
if (process.env.NEXT_PUBLIC_BEHAVE_AS == "LOCAL" && !isServerSide()) {
if (environment.isLocal() && !environment.isServerSide()) {
window.api = api; // Expose the API globally for debugging purposes
}

View File

@@ -1,7 +1,6 @@
import { getServerSupabase } from "@/lib/supabase/server/getServerSupabase";
import { Key, storage } from "@/services/storage/local-storage";
import { getAgptServerApiUrl } from "@/lib/env-config";
import { isServerSide } from "../utils/is-server-side";
import { environment } from "@/services/environment";
import { GraphValidationErrorResponse } from "./types";
@@ -41,12 +40,11 @@ export function buildRequestUrl(
method: string,
payload?: Record<string, any>,
): string {
let url = baseUrl + path;
const url = baseUrl + path;
const payloadAsQuery = ["GET", "DELETE"].includes(method);
if (payloadAsQuery && payload) {
const queryParams = new URLSearchParams(payload);
url += `?${queryParams.toString()}`;
return buildUrlWithQuery(url, payload);
}
return url;
@@ -57,17 +55,28 @@ export function buildClientUrl(path: string): string {
}
export function buildServerUrl(path: string): string {
return `${getAgptServerApiUrl()}${path}`;
return `${environment.getAGPTServerApiUrl()}${path}`;
}
export function buildUrlWithQuery(
url: string,
payload?: Record<string, any>,
query?: Record<string, any>,
): string {
if (!payload) return url;
if (!query) return url;
const queryParams = new URLSearchParams(payload);
return `${url}?${queryParams.toString()}`;
// Filter out undefined values to prevent them from being included as "undefined" strings
const filteredQuery = Object.entries(query).reduce(
(acc, [key, value]) => {
if (value !== undefined) {
acc[key] = value;
}
return acc;
},
{} as Record<string, any>,
);
const queryParams = new URLSearchParams(filteredQuery);
return queryParams.size > 0 ? `${url}?${queryParams.toString()}` : url;
}
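The rewritten `buildUrlWithQuery` above now drops `undefined` values before serializing, so they no longer appear as the literal string `"undefined"` in query strings. A standalone sketch of the same logic (without the `export` keyword):

```typescript
function buildUrlWithQuery(
  url: string,
  query?: Record<string, any>,
): string {
  if (!query) return url;

  // Filter out undefined values to prevent them from being serialized
  // as the string "undefined"
  const filteredQuery = Object.entries(query).reduce(
    (acc, [key, value]) => {
      if (value !== undefined) {
        acc[key] = value;
      }
      return acc;
    },
    {} as Record<string, any>,
  );

  const queryParams = new URLSearchParams(filteredQuery);
  // Only append "?" when at least one parameter survived filtering
  return queryParams.size > 0 ? `${url}?${queryParams.toString()}` : url;
}
```

For example, `buildUrlWithQuery("/api/graphs", { page: 1, filter: undefined })` yields `/api/graphs?page=1`, and a query object whose values are all `undefined` leaves the URL untouched. `buildRequestUrl` now delegates its GET/DELETE payload handling to this helper instead of building the query string inline.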
export async function handleFetchError(response: Response): Promise<ApiError> {
@@ -197,7 +206,7 @@ function isAuthenticationError(
}
function isLogoutInProgress(): boolean {
if (isServerSide()) return false;
if (environment.isServerSide()) return false;
try {
// Check if logout was recently triggered

View File

@@ -1,65 +0,0 @@
/**
* Environment configuration helper
*
* Provides unified access to environment variables with server-side priority.
* Server-side code uses Docker service names, client-side falls back to localhost.
*/
import { isServerSide } from "./utils/is-server-side";
/**
* Gets the AGPT server URL with server-side priority
* Server-side: Uses AGPT_SERVER_URL (http://rest_server:8006/api)
* Client-side: Falls back to NEXT_PUBLIC_AGPT_SERVER_URL (http://localhost:8006/api)
*/
export function getAgptServerApiUrl(): string {
// If server-side and server URL exists, use it
if (isServerSide() && process.env.AGPT_SERVER_URL) {
return process.env.AGPT_SERVER_URL;
}
// Otherwise use the public URL
return process.env.NEXT_PUBLIC_AGPT_SERVER_URL || "http://localhost:8006/api";
}
export function getAgptServerBaseUrl(): string {
return getAgptServerApiUrl().replace("/api", "");
}
/**
* Gets the AGPT WebSocket URL with server-side priority
* Server-side: Uses AGPT_WS_SERVER_URL (ws://websocket_server:8001/ws)
* Client-side: Falls back to NEXT_PUBLIC_AGPT_WS_SERVER_URL (ws://localhost:8001/ws)
*/
export function getAgptWsServerUrl(): string {
// If server-side and server URL exists, use it
if (isServerSide() && process.env.AGPT_WS_SERVER_URL) {
return process.env.AGPT_WS_SERVER_URL;
}
// Otherwise use the public URL
return process.env.NEXT_PUBLIC_AGPT_WS_SERVER_URL || "ws://localhost:8001/ws";
}
/**
* Gets the Supabase URL with server-side priority
* Server-side: Uses SUPABASE_URL (http://kong:8000)
* Client-side: Falls back to NEXT_PUBLIC_SUPABASE_URL (http://localhost:8000)
*/
export function getSupabaseUrl(): string {
// If server-side and server URL exists, use it
if (isServerSide() && process.env.SUPABASE_URL) {
return process.env.SUPABASE_URL;
}
// Otherwise use the public URL
return process.env.NEXT_PUBLIC_SUPABASE_URL || "http://localhost:8000";
}
/**
* Gets the Supabase anon key
* Uses NEXT_PUBLIC_SUPABASE_ANON_KEY since anon keys are public and same across environments
*/
export function getSupabaseAnonKey(): string {
return process.env.NEXT_PUBLIC_SUPABASE_ANON_KEY || "";
}
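The deleted `env-config` helpers above are replaced throughout this diff by the `@/services/environment` module, whose implementation is not included in this chunk. A minimal hypothetical sketch, assuming it wraps the same server-side-priority logic behind a single object (method names taken from the call sites in the diff; the bodies mirror the deleted helpers):

```typescript
// Hypothetical sketch of the new environment service — not the actual
// module source, which is outside this diff.
const isServerSide = (): boolean => typeof window === "undefined";

const environment = {
  isServerSide,
  isClientSide: (): boolean => !isServerSide(),

  // Server-side: Docker service name from AGPT_SERVER_URL;
  // client-side: NEXT_PUBLIC_ fallback, as in the deleted helper.
  getAGPTServerApiUrl(): string {
    if (isServerSide() && process.env.AGPT_SERVER_URL) {
      return process.env.AGPT_SERVER_URL;
    }
    return (
      process.env.NEXT_PUBLIC_AGPT_SERVER_URL || "http://localhost:8006/api"
    );
  },

  getAGPTServerBaseUrl(): string {
    return this.getAGPTServerApiUrl().replace("/api", "");
  },

  getSupabaseUrl(): string {
    if (isServerSide() && process.env.SUPABASE_URL) {
      return process.env.SUPABASE_URL;
    }
    return process.env.NEXT_PUBLIC_SUPABASE_URL || "http://localhost:8000";
  },

  getSupabaseAnonKey(): string {
    return process.env.NEXT_PUBLIC_SUPABASE_ANON_KEY || "";
  },
};
```

The diff also calls `environment.isCloud()`, `environment.isLocal()`, and `environment.getAGPTWsServerUrl()`, which presumably wrap the old `getBehaveAs()`/`BehaveAs` checks and the WebSocket URL helper in the same way. Consolidating these behind one object is what lets every `isServerSide()` / `getBehaveAs()` import site in this diff collapse to a single `import { environment } from "@/services/environment"`.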

View File

@@ -1,7 +1,7 @@
import { type CookieOptions } from "@supabase/ssr";
import { SupabaseClient } from "@supabase/supabase-js";
import { Key, storage } from "@/services/storage/local-storage";
import { isServerSide } from "../utils/is-server-side";
import { environment } from "@/services/environment";
export const PROTECTED_PAGES = [
"/monitor",
@@ -83,7 +83,7 @@ export function setupSessionEventListeners(
onFocus: () => void,
onStorageChange: (e: StorageEvent) => void,
): EventListeners {
if (isServerSide() || typeof document === "undefined") {
if (environment.isServerSide()) {
return { cleanup: () => {} };
}

View File

@@ -4,7 +4,6 @@ import { User } from "@supabase/supabase-js";
import { usePathname, useRouter } from "next/navigation";
import { useEffect, useMemo, useRef, useState } from "react";
import { useBackendAPI } from "@/lib/autogpt-server-api/context";
import { getSupabaseUrl, getSupabaseAnonKey } from "@/lib/env-config";
import {
getCurrentUser,
refreshSession,
@@ -20,6 +19,7 @@ import {
setWebSocketDisconnectIntent,
setupSessionEventListeners,
} from "../helpers";
import { environment } from "@/services/environment";
export function useSupabase() {
const router = useRouter();
@@ -33,12 +33,16 @@ export function useSupabase() {
const supabase = useMemo(() => {
try {
return createBrowserClient(getSupabaseUrl(), getSupabaseAnonKey(), {
isSingleton: true,
auth: {
persistSession: false, // Don't persist session on client with httpOnly cookies
return createBrowserClient(
environment.getSupabaseUrl(),
environment.getSupabaseAnonKey(),
{
isSingleton: true,
auth: {
persistSession: false, // Don't persist session on client with httpOnly cookies
},
},
});
);
} catch (error) {
console.error("Error creating Supabase client", error);
return null;

View File

@@ -1,15 +1,15 @@
import { createServerClient } from "@supabase/ssr";
import { NextResponse, type NextRequest } from "next/server";
import { getCookieSettings, isAdminPage, isProtectedPage } from "./helpers";
import { getSupabaseUrl, getSupabaseAnonKey } from "../env-config";
import { environment } from "@/services/environment";
export async function updateSession(request: NextRequest) {
let supabaseResponse = NextResponse.next({
request,
});
const supabaseUrl = getSupabaseUrl();
const supabaseKey = getSupabaseAnonKey();
const supabaseUrl = environment.getSupabaseUrl();
const supabaseKey = environment.getSupabaseAnonKey();
const isAvailable = Boolean(supabaseUrl && supabaseKey);
if (!isAvailable) {

View File

@@ -1,6 +1,6 @@
import { createServerClient, type CookieOptions } from "@supabase/ssr";
import { getCookieSettings } from "../helpers";
import { getSupabaseUrl, getSupabaseAnonKey } from "../../env-config";
import { environment } from "@/services/environment";
type Cookies = { name: string; value: string; options?: CookieOptions }[];
@@ -12,8 +12,8 @@ export async function getServerSupabase() {
try {
const supabase = createServerClient(
getSupabaseUrl(),
getSupabaseAnonKey(),
environment.getSupabaseUrl(),
environment.getSupabaseAnonKey(),
{
cookies: {
getAll() {

View File

@@ -1,35 +1,27 @@
/**
* Utility functions for working with Cloudflare Turnstile
*/
import { BehaveAs, getBehaveAs } from "@/lib/utils";
import { getAgptServerApiUrl } from "@/lib/env-config";
import { environment } from "@/services/environment";
export async function verifyTurnstileToken(
token: string,
action?: string,
): Promise<boolean> {
// Skip verification unless explicitly enabled via environment variable
if (process.env.NEXT_PUBLIC_TURNSTILE !== "enabled") {
return true;
}
// Skip verification in local development
const behaveAs = getBehaveAs();
if (behaveAs !== BehaveAs.CLOUD) {
if (!environment.isCAPTCHAEnabled() || environment.isLocal()) {
return true;
}
try {
const response = await fetch(`${getAgptServerApiUrl()}/turnstile/verify`, {
method: "POST",
headers: {
"Content-Type": "application/json",
const response = await fetch(
`${environment.getAGPTServerApiUrl()}/turnstile/verify`,
{
method: "POST",
headers: {
"Content-Type": "application/json",
},
body: JSON.stringify({
token,
action,
}),
},
body: JSON.stringify({
token,
action,
}),
});
);
if (!response.ok) {
console.error("Turnstile verification failed:", await response.text());

View File

@@ -228,36 +228,6 @@ export function getPrimaryCategoryColor(categories: Category[]): string {
);
}
export enum BehaveAs {
CLOUD = "CLOUD",
LOCAL = "LOCAL",
}
export function getBehaveAs(): BehaveAs {
return process.env.NEXT_PUBLIC_BEHAVE_AS === "CLOUD"
? BehaveAs.CLOUD
: BehaveAs.LOCAL;
}
export enum AppEnv {
LOCAL = "local",
DEV = "dev",
PROD = "prod",
}
export function getAppEnv(): AppEnv {
const env = process.env.NEXT_PUBLIC_APP_ENV;
if (env === "dev") return AppEnv.DEV;
if (env === "prod") return AppEnv.PROD;
// Some places use prod and others production
if (env === "production") return AppEnv.PROD;
return AppEnv.LOCAL;
}
export function getEnvironmentStr(): string {
return `app:${getAppEnv().toLowerCase()}-behave:${getBehaveAs().toLowerCase()}`;
}
function rectanglesOverlap(
rect1: { x: number; y: number; width: number; height?: number },
rect2: { x: number; y: number; width: number; height?: number },

View File

@@ -1,3 +0,0 @@
export const isServerSide = (): boolean => {
return typeof window === "undefined";
};

View File

@@ -1,68 +0,0 @@
/**
* Modified copy of ga.tsx from @next/third-parties/google, with modified gtag.js source URL.
* Original source file: https://github.com/vercel/next.js/blob/b304b45e3a6e3e79338568d76e28805e77c03ec9/packages/third-parties/src/google/ga.tsx
*/
"use client";
import type { GAParams } from "@/types/google";
import Script from "next/script";
import { useEffect } from "react";
let currDataLayerName: string | undefined = undefined;
export function GoogleAnalytics(props: GAParams) {
const { gaId, debugMode, dataLayerName = "dataLayer", nonce } = props;
if (currDataLayerName === undefined) {
currDataLayerName = dataLayerName;
}
useEffect(() => {
// Feature usage signal (same as original implementation)
performance.mark("mark_feature_usage", {
detail: {
feature: "custom-ga",
},
});
}, []);
return (
<>
<Script
id="_custom-ga-init"
// Using "afterInteractive" to avoid blocking the initial page rendering
strategy="afterInteractive"
dangerouslySetInnerHTML={{
__html: `
window['${dataLayerName}'] = window['${dataLayerName}'] || [];
function gtag(){window['${dataLayerName}'].push(arguments);}
gtag('js', new Date());
gtag('config', '${gaId}' ${debugMode ? ",{ 'debug_mode': true }" : ""});
`,
}}
nonce={nonce}
/>
<Script
id="_custom-ga"
strategy="afterInteractive"
src="/gtag.js"
nonce={nonce}
/>
</>
);
}
export function sendGAEvent(...args: any[]) {
if (currDataLayerName === undefined) {
console.warn(`Custom GA: GA has not been initialized`);
return;
}
const dataLayer = (window as any)[currDataLayerName];
if (dataLayer) {
dataLayer.push(...args);
} else {
console.warn(`Custom GA: dataLayer ${currDataLayerName} does not exist`);
}
}

View File

@@ -0,0 +1,114 @@
/**
* Modified copy of ga.tsx from @next/third-parties/google, with modified gtag.js source URL.
* Original source file: https://github.com/vercel/next.js/blob/b304b45e3a6e3e79338568d76e28805e77c03ec9/packages/third-parties/src/google/ga.tsx
*/
"use client";
import type { GAParams } from "@/types/google";
import Script from "next/script";
import { useEffect } from "react";
import { environment } from "../environment";
declare global {
interface Window {
datafast: (name: string, metadata: Record<string, unknown>) => void;
[key: string]: unknown[] | ((...args: unknown[]) => void) | unknown;
}
}
let currDataLayerName: string | undefined = undefined;
type SetupProps = {
ga: GAParams;
host: string;
};
export function SetupAnalytics(props: SetupProps) {
const { ga, host } = props;
const { gaId, debugMode, dataLayerName = "dataLayer", nonce } = ga;
const isProductionDomain = host.includes("platform.agpt.co");
// Datafa.st journey analytics only on production
const dataFastEnabled = isProductionDomain;
// We also collect analytics for open source developers running the platform locally
const googleAnalyticsEnabled = environment.isLocal() || isProductionDomain;
if (currDataLayerName === undefined) {
currDataLayerName = dataLayerName;
}
useEffect(() => {
if (!googleAnalyticsEnabled) return;
// Google Analytics: feature usage signal (same as original implementation)
performance.mark("mark_feature_usage", {
detail: {
feature: "custom-ga",
},
});
}, [googleAnalyticsEnabled]);
return (
<>
{/* Google Analytics */}
{googleAnalyticsEnabled ? (
<>
<Script
id="_custom-ga-init"
strategy="afterInteractive"
dangerouslySetInnerHTML={{
__html: `
window['${dataLayerName}'] = window['${dataLayerName}'] || [];
function gtag(){window['${dataLayerName}'].push(arguments);}
gtag('js', new Date());
gtag('config', '${gaId}' ${debugMode ? ",{ 'debug_mode': true }" : ""});
`,
}}
nonce={nonce}
/>
{/* Google Tag Manager */}
<Script
id="_custom-ga"
strategy="afterInteractive"
src="/gtag.js"
nonce={nonce}
/>
</>
) : null}
{/* Datafa.st */}
{dataFastEnabled ? (
<Script
strategy="afterInteractive"
data-website-id="dfid_g5wtBIiHUwSkWKcGz80lu"
data-domain="agpt.co"
src="https://datafa.st/js/script.js"
/>
) : null}
</>
);
}
export const analytics = {
sendGAEvent,
sendDatafastEvent,
};
function sendGAEvent(...args: unknown[]) {
if (typeof window === "undefined") return;
if (currDataLayerName === undefined) return;
const dataLayer = window[currDataLayerName];
if (!dataLayer) return;
if (Array.isArray(dataLayer)) {
dataLayer.push(...args);
} else {
    console.warn(`Custom GA: dataLayer ${currDataLayerName} is not an array`);
}
}
function sendDatafastEvent(name: string, metadata: Record<string, unknown>) {
if (typeof window === "undefined" || !window.datafast) return;
window.datafast(name, metadata);
}

View File

@@ -0,0 +1,115 @@
export enum BehaveAs {
CLOUD = "CLOUD",
LOCAL = "LOCAL",
}
function getBehaveAs(): BehaveAs {
return process.env.NEXT_PUBLIC_BEHAVE_AS === "CLOUD"
? BehaveAs.CLOUD
: BehaveAs.LOCAL;
}
export enum AppEnv {
LOCAL = "local",
DEV = "dev",
PROD = "prod",
}
function getAppEnv(): AppEnv {
const env = process.env.NEXT_PUBLIC_APP_ENV;
if (env === "dev") return AppEnv.DEV;
if (env === "prod") return AppEnv.PROD;
  // Some environments set "prod" and others "production"
if (env === "production") return AppEnv.PROD;
return AppEnv.LOCAL;
}
function getAGPTServerApiUrl() {
if (environment.isServerSide() && process.env.AGPT_SERVER_URL) {
return process.env.AGPT_SERVER_URL;
}
return process.env.NEXT_PUBLIC_AGPT_SERVER_URL || "http://localhost:8006/api";
}
function getAGPTServerBaseUrl() {
return getAGPTServerApiUrl().replace("/api", "");
}
function getAGPTWsServerUrl() {
if (environment.isServerSide() && process.env.AGPT_WS_SERVER_URL) {
return process.env.AGPT_WS_SERVER_URL;
}
return process.env.NEXT_PUBLIC_AGPT_WS_SERVER_URL || "ws://localhost:8001/ws";
}
function getSupabaseUrl() {
if (environment.isServerSide() && process.env.SUPABASE_URL) {
return process.env.SUPABASE_URL;
}
return process.env.NEXT_PUBLIC_SUPABASE_URL || "http://localhost:8000";
}
function getSupabaseAnonKey() {
return process.env.NEXT_PUBLIC_SUPABASE_ANON_KEY || "";
}
function getEnvironmentStr() {
return `app:${getAppEnv().toLowerCase()}-behave:${getBehaveAs().toLowerCase()}`;
}
function isProd() {
return process.env.NODE_ENV === "production";
}
function isDev() {
return process.env.NODE_ENV === "development";
}
function isCloud() {
return getBehaveAs() === BehaveAs.CLOUD;
}
function isLocal() {
return getBehaveAs() === BehaveAs.LOCAL;
}
function isServerSide() {
return typeof window === "undefined";
}
function isClientSide() {
return typeof window !== "undefined";
}
function isCAPTCHAEnabled() {
return process.env.NEXT_PUBLIC_TURNSTILE === "enabled";
}
function areFeatureFlagsEnabled() {
return process.env.NEXT_PUBLIC_LAUNCHDARKLY_ENABLED === "enabled";
}
export const environment = {
// Generic
getEnvironmentStr,
// Get environment variables config
getBehaveAs,
getAppEnv,
getAGPTServerApiUrl,
getAGPTServerBaseUrl,
getAGPTWsServerUrl,
getSupabaseUrl,
getSupabaseAnonKey,
// Assertions
isServerSide,
isClientSide,
isProd,
isDev,
isCloud,
isLocal,
isCAPTCHAEnabled,
areFeatureFlagsEnabled,
};

View File

@@ -4,15 +4,15 @@ import { LDProvider } from "launchdarkly-react-client-sdk";
import type { ReactNode } from "react";
import { useMemo } from "react";
import { useSupabase } from "@/lib/supabase/hooks/useSupabase";
import { BehaveAs, getBehaveAs } from "@/lib/utils";
import * as Sentry from "@sentry/nextjs";
import { environment } from "../environment";
const clientId = process.env.NEXT_PUBLIC_LAUNCHDARKLY_CLIENT_ID;
const envEnabled = process.env.NEXT_PUBLIC_LAUNCHDARKLY_ENABLED === "true";
export function LaunchDarklyProvider({ children }: { children: ReactNode }) {
const { user, isUserLoading } = useSupabase();
const isCloud = getBehaveAs() === BehaveAs.CLOUD;
const isCloud = environment.isCloud();
const isLaunchDarklyConfigured = isCloud && envEnabled && clientId;
const context = useMemo(() => {

View File

@@ -1,8 +1,8 @@
"use client";
import { DEFAULT_SEARCH_TERMS } from "@/app/(platform)/marketplace/components/HeroSection/helpers";
import { BehaveAs, getBehaveAs } from "@/lib/utils";
import { useFlags } from "launchdarkly-react-client-sdk";
import { environment } from "../environment";
export enum Flag {
BETA_BLOCKS = "beta-blocks",
@@ -51,7 +51,7 @@ const mockFlags = {
export function useGetFlag<T extends Flag>(flag: T): FlagValues[T] | null {
const currentFlags = useFlags<FlagValues>();
const flagValue = currentFlags[flag];
const isCloud = getBehaveAs() === BehaveAs.CLOUD;
const isCloud = environment.isCloud();
if ((isPwMockEnabled && !isCloud) || flagValue === undefined) {
return mockFlags[flag];

View File

@@ -1,5 +1,5 @@
import { isServerSide } from "@/lib/utils/is-server-side";
import * as Sentry from "@sentry/nextjs";
import { environment } from "../environment";
export enum Key {
LOGOUT = "supabase-logout",
@@ -10,7 +10,7 @@ export enum Key {
}
function get(key: Key) {
if (isServerSide()) {
if (environment.isServerSide()) {
Sentry.captureException(new Error("Local storage is not available"));
return;
}
@@ -23,7 +23,7 @@ function get(key: Key) {
}
function set(key: Key, value: string) {
if (isServerSide()) {
if (environment.isServerSide()) {
Sentry.captureException(new Error("Local storage is not available"));
return;
}
@@ -31,7 +31,7 @@ function set(key: Key, value: string) {
}
function clean(key: Key) {
if (isServerSide()) {
if (environment.isServerSide()) {
Sentry.captureException(new Error("Local storage is not available"));
return;
}

View File

@@ -7,7 +7,7 @@ A block that transcribes the audio content of a YouTube video into text.
This block takes a YouTube video URL as input and produces a text transcript of the video's audio content. It also extracts and provides the unique video ID associated with the YouTube video.
### How it works
The block first extracts the video ID from the provided YouTube URL. It then uses this ID to fetch the video's transcript. The transcript is processed and formatted into a readable text format. If any errors occur during this process, the block will capture and report them.
The block first extracts the video ID from the provided YouTube URL. It then uses this ID to fetch the video's transcript, preferring English when available. If an English transcript is not available, the block will automatically use the first available transcript in any other language (prioritizing manually created transcripts over auto-generated ones). The transcript is processed and formatted into a readable text format. If any errors occur during this process, the block will capture and report them.
### Inputs
| Input | Description |
@@ -22,5 +22,5 @@ The block first extracts the video ID from the provided YouTube URL. It then use
| Error | Any error message that occurs if the transcription process fails. |
### Possible use case
A content creator could use this block to automatically generate subtitles for their YouTube videos. They could also use it to create text-based summaries of video content for SEO purposes or to make their content more accessible to hearing-impaired viewers.
A content creator could use this block to automatically generate subtitles for their YouTube videos. They could also use it to create text-based summaries of video content for SEO purposes or to make their content more accessible to hearing-impaired viewers. The automatic language fallback feature ensures that transcripts can be obtained even from videos that only have subtitles in non-English languages.