Mirror of https://github.com/Significant-Gravitas/AutoGPT.git (synced 2026-04-08 03:00:28 -04:00)

Compare commits: 48 commits, `hotfix/wai…` → `seer/fix-m…`
**Commits in this comparison:**

`06a4e4defa`, `f4ba02f2f1`, `e2a9923f30`, `39792d517e`, `a6a2f71458`, `788b861bb7`, `e203e65dc4`, `bd03697ff2`, `efd37b7a36`, `bb0b45d7f7`, `04df981115`, `d25997b4f2`, `11d55f6055`, `063dc5cf65`, `b7646f3e58`, `0befaf0a47`, `93f58dec5e`, `3da595f599`, `e5e60921a3`, `90af8f8e1a`, `eba67e0a4b`, `47bb89caeb`, `271a520afa`, `3988057032`, `a6c6e48f00`, `e72ce2f9e7`, `bd7a79a920`, `3f546ae845`, `c958c95d6b`, `3e50cbd2cb`, `1b69f1644d`, `d9035a233c`, `972cbfc3de`, `8f861b1bb2`, `fa2731bb8b`, `2dc0c97a52`, `a1d9b45238`, `29895c290f`, `1ed224d481`, `3b5d919399`, `3c16de22ef`, `3ed1c93ec0`, `773f545cfd`, `84ad4a9f95`, `8610118ddc`, `ebb4ebb025`, `cb532e1c4d`, `794aee25ab`
**`.github/copilot-instructions.md`** (vendored, 94 changed lines)
@@ -12,6 +12,7 @@ This file provides comprehensive onboarding information for GitHub Copilot codin

- **Infrastructure** - Docker configurations, CI/CD, and development tools

**Primary Languages & Frameworks:**

- **Backend**: Python 3.10-3.13, FastAPI, Prisma ORM, PostgreSQL, RabbitMQ
- **Frontend**: TypeScript, Next.js 15, React, Tailwind CSS, Radix UI
- **Development**: Docker, Poetry, pnpm, Playwright, Storybook

@@ -23,15 +24,17 @@ This file provides comprehensive onboarding information for GitHub Copilot codin

**Always run these commands in the correct directory and in this order:**

1. **Initial Setup** (required once):

   ```bash
   # Clone and enter repository
   git clone <repo> && cd AutoGPT

   # Start all services (database, redis, rabbitmq, clamav)
   cd autogpt_platform && docker compose --profile local up deps --build --detach
   ```

2. **Backend Setup** (always run before backend development):

   ```bash
   cd autogpt_platform/backend
   poetry install  # Install dependencies
   ```

@@ -48,6 +51,7 @@ This file provides comprehensive onboarding information for GitHub Copilot codin

### Runtime Requirements

**Critical:** Always ensure Docker services are running before starting development:

```bash
cd autogpt_platform && docker compose --profile local up deps --build --detach
```

@@ -58,6 +62,7 @@ cd autogpt_platform && docker compose --profile local up deps --build --detach

### Development Commands

**Backend Development:**

```bash
cd autogpt_platform/backend
poetry run serve  # Start development server (port 8000)
```

@@ -68,6 +73,7 @@ poetry run lint # Lint code (ruff) - run after format

**Frontend Development:**

```bash
cd autogpt_platform/frontend
pnpm dev  # Start development server (port 3000) - use for active development
```

@@ -81,23 +87,27 @@ pnpm storybook # Start component development server

### Testing Strategy

**Backend Tests:**

- **Block Tests**: `poetry run pytest backend/blocks/test/test_block.py -xvs` (validates all blocks)
- **Specific Block**: `poetry run pytest 'backend/blocks/test/test_block.py::test_available_blocks[BlockName]' -xvs`
- **Snapshot Tests**: Use `--snapshot-update` when output changes; always review the result with `git diff`

**Frontend Tests:**

- **E2E Tests**: Always run `pnpm dev` before `pnpm test` (Playwright requires a running instance)
- **Component Tests**: Use Storybook for isolated component development

### Critical Validation Steps

**Before committing changes:**

1. Run `poetry run format` (backend) and `pnpm format` (frontend)
2. Ensure all tests pass in modified areas
3. Verify Docker services are still running
4. Check that database migrations apply cleanly

**Common Issues & Workarounds:**

- **Prisma issues**: Run `poetry run prisma generate` after schema changes
- **Permission errors**: Ensure Docker has proper permissions
- **Port conflicts**: Check the `docker-compose.yml` file for the current list of exposed ports; `docker compose ps` lists all currently mapped ports
@@ -108,6 +118,7 @@ pnpm storybook # Start component development server

### Core Architecture

**AutoGPT Platform** (`autogpt_platform/`):

- `backend/` - FastAPI server with async support
- `backend/backend/` - Core API logic
- `backend/blocks/` - Agent execution blocks

@@ -121,6 +132,7 @@ pnpm storybook # Start component development server

- `docker-compose.yml` - Development stack orchestration

**Key Configuration Files:**

- `pyproject.toml` - Python dependencies and tooling
- `package.json` - Node.js dependencies and scripts
- `schema.prisma` - Database schema and migrations

@@ -136,6 +148,7 @@ pnpm storybook # Start component development server

### Development Workflow

**GitHub Actions**: Multiple CI/CD workflows in `.github/workflows/`

- `platform-backend-ci.yml` - Backend testing and validation
- `platform-frontend-ci.yml` - Frontend testing and validation
- `platform-fullstack-ci.yml` - End-to-end integration tests

@@ -146,11 +159,13 @@ pnpm storybook # Start component development server

### Key Source Files

**Backend Entry Points:**

- `backend/backend/server/server.py` - FastAPI application setup
- `backend/backend/data/` - Database models and user management
- `backend/blocks/` - Agent execution blocks and logic

**Frontend Entry Points:**

- `frontend/src/app/layout.tsx` - Root application layout
- `frontend/src/app/page.tsx` - Home page
- `frontend/src/lib/supabase/` - Authentication and database client

@@ -160,6 +175,7 @@ pnpm storybook # Start component development server

### Agent Block System

Agents are built using a visual block-based system where each block performs a single action. Blocks are defined in `backend/blocks/` and must include:

- Block definition with input/output schemas
- Execution logic with proper error handling
- Tests validating functionality

@@ -167,6 +183,7 @@ Agents are built using a visual block-based system where each block performs a s

### Database & ORM

**Prisma ORM** with PostgreSQL backend including pgvector for embeddings:

- Schema in `schema.prisma`
- Migrations in `backend/migrations/`
- Always run `prisma migrate dev` and `prisma generate` after schema changes

@@ -174,13 +191,15 @@ Agents are built using a visual block-based system where each block performs a s

## Environment Configuration

### Configuration Files Priority Order

1. **Backend**: `/backend/.env.default` → `/backend/.env` (user overrides)
2. **Frontend**: `/frontend/.env.default` → `/frontend/.env` (user overrides)
3. **Platform**: `/.env.default` (Supabase/shared) → `/.env` (user overrides)
4. Docker Compose `environment:` sections override file-based config
5. Shell environment variables have highest precedence
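The precedence chain above reduces to a layered merge where later layers win. A minimal sketch of that resolution order; the variable names below are illustrative, not actual platform settings:

```python
from typing import Mapping


def resolve_config(*layers: Mapping[str, str]) -> dict[str, str]:
    """Merge config layers; later layers take precedence over earlier ones."""
    merged: dict[str, str] = {}
    for layer in layers:
        merged.update(layer)
    return merged


# Layers ordered lowest to highest precedence, mirroring the list above:
# .env.default -> .env -> docker-compose `environment:` -> shell environment
defaults = {"DB_HOST": "localhost", "DB_PORT": "5432"}
dotenv = {"DB_HOST": "db"}
compose_env = {"DB_PORT": "5433"}
shell_env = {"DB_HOST": "db.internal"}

config = resolve_config(defaults, dotenv, compose_env, shell_env)
print(config["DB_HOST"])  # "db.internal" - shell wins over everything
print(config["DB_PORT"])  # "5433" - compose overrides the file default
```

The same key can appear in every layer; only the highest-precedence occurrence survives.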
### Docker Environment Setup

- All services use hardcoded defaults (no `${VARIABLE}` substitutions)
- The `env_file` directive loads variables INTO containers at runtime
- Backend/Frontend services use YAML anchors for consistent configuration

@@ -189,6 +208,7 @@ Agents are built using a visual block-based system where each block performs a s

## Advanced Development Patterns

### Adding New Blocks

1. Create file in `/backend/backend/blocks/`
2. Inherit from `Block` base class with input/output schemas
3. Implement `run` method with proper error handling

@@ -198,6 +218,7 @@ Agents are built using a visual block-based system where each block performs a s

7. Consider how inputs/outputs connect with other blocks in graph editor
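The steps above can be sketched as follows. This is an illustrative stand-in only: the real `Block` base class lives in `backend/data/block` and uses pydantic schemas with considerably more machinery, so the stub classes and the `WordCountBlock` example here are hypothetical:

```python
# Stand-ins for the real classes in backend.data.block (sketch only).
class BlockSchema(dict):
    """Placeholder for the pydantic-based input/output schema."""


class Block:
    """Placeholder for the real Block base class."""

    def run(self, input_data: dict):
        raise NotImplementedError


class WordCountBlock(Block):
    """Hypothetical block: performs one single action, with error handling."""

    input_schema = BlockSchema(text=str)
    output_schema = BlockSchema(word_count=int)

    def run(self, input_data: dict):
        # Yield (output_name, value) pairs so outputs can feed other blocks
        try:
            text = input_data["text"]
        except KeyError:
            yield "error", "missing required input: text"
            return
        yield "word_count", len(text.split())


block = WordCountBlock()
print(dict(block.run({"text": "agents built from blocks"})))  # {'word_count': 4}
```

Matching the schema names to what downstream blocks expect is what makes outputs connect cleanly in the graph editor.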
### API Development

1. Update routes in `/backend/backend/server/routers/`
2. Add/update Pydantic models in same directory
3. Write tests alongside route files

@@ -205,21 +226,76 @@ Agents are built using a visual block-based system where each block performs a s

5. Run `poetry run test` to verify changes

### Frontend Development

1. Components in `/frontend/src/components/`
2. Use existing UI components from `/frontend/src/components/ui/`
3. Add Storybook stories for component development
4. Test user-facing features with Playwright E2E tests
5. Update protected routes in middleware when needed

**📖 Complete Frontend Guide**: See `autogpt_platform/frontend/CONTRIBUTING.md` and `autogpt_platform/frontend/.cursorrules` for comprehensive patterns and conventions.

**Quick Reference:**

**Component Structure:**

- Separate render logic from data/behavior
- Structure: `ComponentName/ComponentName.tsx` + `useComponentName.ts` + `helpers.ts`
- Exception: Small components (3-4 lines of logic) can be inline
- Render-only components can be direct files without folders

**Data Fetching:**

- Use generated API hooks from `@/app/api/__generated__/endpoints/`
- Generated via Orval from backend OpenAPI spec
- Pattern: `use{Method}{Version}{OperationName}`
- Example: `useGetV2ListLibraryAgents`
- Regenerate with: `pnpm generate:api`
- **Never** use deprecated `BackendAPI` or `src/lib/autogpt-server-api/*`

**Code Conventions:**

- Use function declarations for components and handlers (not arrow functions)
- Arrow functions only for small inline lambdas (map, filter, etc.)
- Components: `PascalCase`; hooks: `camelCase` with `use` prefix
- No barrel files or `index.ts` re-exports
- Minimal comments (code should be self-documenting)

**Styling:**

- Use Tailwind CSS utilities only
- Use design system components from `src/components/` (atoms, molecules, organisms)
- Never use `src/components/__legacy__/*`
- Only use Phosphor Icons (`@phosphor-icons/react`)
- Prefer design tokens over hardcoded values

**Error Handling:**

- Render errors: use the `<ErrorCard />` component
- Mutation errors: display with toast notifications
- Manual exceptions: use `Sentry.captureException()`
- Global error boundaries are already configured

**Testing:**

- Add/update Storybook stories for UI components (`pnpm storybook`)
- Run Playwright E2E tests with `pnpm test`
- Verify in Chromatic after PR

**Architecture:**

- Default to client components ("use client")
- Server components only for SEO or extreme TTFB needs
- Use React Query for server state (via generated hooks)
- Co-locate UI state in components/hooks

### Security Guidelines

**Cache Protection Middleware** (`/backend/backend/server/middleware/security.py`):

- Default: disables caching for ALL endpoints with `Cache-Control: no-store, no-cache, must-revalidate, private`
- Uses an allow-list approach for cacheable paths (static assets, health checks, public pages)
- Prevents sensitive data caching in browsers/proxies
- Add new cacheable endpoints to `CACHEABLE_PATHS`
### CI/CD Alignment

The repository has comprehensive CI workflows that test:

- **Backend**: Python 3.11-3.13, services (Redis/RabbitMQ/ClamAV), Prisma migrations, Poetry lock validation
- **Frontend**: Node.js 21, pnpm, Playwright with Docker Compose stack, API schema validation
- **Integration**: Full-stack type checking and E2E testing

@@ -229,6 +305,7 @@ Match these patterns when developing locally - the copilot setup environment mir

## Collaboration with Other AI Assistants

This repository is actively developed with assistance from Claude (via CLAUDE.md files). When working on this codebase:

- Check for existing CLAUDE.md files that provide additional context
- Follow established patterns and conventions already in the codebase
- Maintain consistency with existing code style and architecture

@@ -237,8 +314,9 @@ This repository is actively developed with assistance from Claude (via CLAUDE.md

## Trust These Instructions

These instructions are comprehensive and tested. Only perform additional searches if:

1. Information here is incomplete for your specific task
2. You encounter errors not covered by the workarounds
3. You need to understand implementation details not covered above

For detailed platform development patterns, refer to `autogpt_platform/CLAUDE.md` and `AGENTS.md` in the repository root.
---

@@ -63,6 +63,9 @@ poetry run pytest path/to/test.py --snapshot-update

```bash
# Install dependencies
cd frontend && pnpm i

# Generate API client from OpenAPI spec
pnpm generate:api

# Start development server
pnpm dev
```

@@ -75,12 +78,23 @@ pnpm storybook

```bash
# Build production
pnpm build

# Format and lint
pnpm format

# Type checking
pnpm types
```

We have a components library in `autogpt_platform/frontend/src/components/atoms` that should be used when adding new pages and components.

**📖 Complete Guide**: See `/frontend/CONTRIBUTING.md` and `/frontend/.cursorrules` for comprehensive frontend patterns.

**Key Frontend Conventions:**

- Separate render logic from data/behavior in components
- Use generated API hooks from `@/app/api/__generated__/endpoints/`
- Use function declarations (not arrow functions) for components/handlers
- Use design system components from `src/components/` (atoms, molecules, organisms)
- Only use Phosphor Icons
- Never use `src/components/__legacy__/*` or deprecated `BackendAPI`

## Architecture Overview

@@ -95,11 +109,16 @@ We have a components library in autogpt_platform/frontend/src/components/atoms t

### Frontend Architecture

- **Framework**: Next.js 15 App Router (client-first approach)
- **Data Fetching**: Type-safe generated API hooks via Orval + React Query
- **State Management**: React Query for server state, co-located UI state in components/hooks
- **Component Structure**: Separate render logic (`.tsx`) from business logic (`use*.ts` hooks)
- **Workflow Builder**: Visual graph editor using @xyflow/react
- **UI Components**: shadcn/ui (Radix UI primitives) with Tailwind CSS styling
- **Icons**: Phosphor Icons only
- **Feature Flags**: LaunchDarkly integration
- **Error Handling**: ErrorCard for render errors, toast for mutations, Sentry for exceptions
- **Testing**: Playwright for E2E, Storybook for component development

### Key Concepts

@@ -153,6 +172,7 @@ Key models (defined in `/backend/schema.prisma`):

**Adding a new block:**

Follow the comprehensive [Block SDK Guide](../../../docs/content/platform/block-sdk-guide.md) which covers:

- Provider configuration with `ProviderBuilder`
- Block schema definition
- Authentication (API keys, OAuth, webhooks)

@@ -160,6 +180,7 @@ Follow the comprehensive [Block SDK Guide](../../../docs/content/platform/block-

- File organization

Quick steps:

1. Create new file in `/backend/backend/blocks/`
2. Configure provider using `ProviderBuilder` in `_config.py`
3. Inherit from `Block` base class

@@ -180,10 +201,20 @@ ex: do the inputs and outputs tie well together?

**Frontend feature development:**

See `/frontend/CONTRIBUTING.md` for complete patterns. Quick reference:

1. **Pages**: Create in `src/app/(platform)/feature-name/page.tsx`
   - Add `usePageName.ts` hook for logic
   - Put sub-components in local `components/` folder
2. **Components**: Structure as `ComponentName/ComponentName.tsx` + `useComponentName.ts` + `helpers.ts`
   - Use design system components from `src/components/` (atoms, molecules, organisms)
   - Never use `src/components/__legacy__/*`
3. **Data fetching**: Use generated API hooks from `@/app/api/__generated__/endpoints/`
   - Regenerate with `pnpm generate:api`
   - Pattern: `use{Method}{Version}{OperationName}`
4. **Styling**: Tailwind CSS only, use design tokens, Phosphor Icons only
5. **Testing**: Add Storybook stories for new components, Playwright for E2E
6. **Code conventions**: Function declarations (not arrow functions) for components/handlers

### Security Implementation
---

@@ -8,6 +8,11 @@ start-core:

```makefile
stop-core:
	docker compose stop deps

reset-db:
	rm -rf db/docker/volumes/db/data
	cd backend && poetry run prisma migrate deploy
	cd backend && poetry run prisma generate

# View logs for core services
logs-core:
	docker compose logs -f deps
```

@@ -35,13 +40,18 @@ run-backend:

```makefile
run-frontend:
	cd frontend && pnpm dev

test-data:
	cd backend && poetry run python test/test_data_creator.py

help:
	@echo "Usage: make <target>"
	@echo "Targets:"
	@echo "  start-core   - Start just the core services (Supabase, Redis, RabbitMQ) in background"
	@echo "  stop-core    - Stop the core services"
	@echo "  reset-db     - Reset the database by deleting the volume"
	@echo "  logs-core    - Tail the logs for core services"
	@echo "  format       - Format & lint backend (Python) and frontend (TypeScript) code"
	@echo "  migrate      - Run backend database migrations"
	@echo "  run-backend  - Run the backend FastAPI server"
	@echo "  run-frontend - Run the frontend Next.js development server"
	@echo "  test-data    - Run the test data creator"
```
---

@@ -94,42 +94,36 @@ def configure_logging(force_cloud_logging: bool = False) -> None:

```diff
     config = LoggingConfig()
     log_handlers: list[logging.Handler] = []
 
+    structured_logging = config.enable_cloud_logging or force_cloud_logging
+
     # Console output handlers
-    stdout = logging.StreamHandler(stream=sys.stdout)
-    stdout.setLevel(config.level)
-    stdout.addFilter(BelowLevelFilter(logging.WARNING))
-    if config.level == logging.DEBUG:
-        stdout.setFormatter(AGPTFormatter(DEBUG_LOG_FORMAT))
-    else:
-        stdout.setFormatter(AGPTFormatter(SIMPLE_LOG_FORMAT))
+    if not structured_logging:
+        stdout = logging.StreamHandler(stream=sys.stdout)
+        stdout.setLevel(config.level)
+        stdout.addFilter(BelowLevelFilter(logging.WARNING))
+        if config.level == logging.DEBUG:
+            stdout.setFormatter(AGPTFormatter(DEBUG_LOG_FORMAT))
+        else:
+            stdout.setFormatter(AGPTFormatter(SIMPLE_LOG_FORMAT))
 
-    stderr = logging.StreamHandler()
-    stderr.setLevel(logging.WARNING)
-    if config.level == logging.DEBUG:
-        stderr.setFormatter(AGPTFormatter(DEBUG_LOG_FORMAT))
-    else:
-        stderr.setFormatter(AGPTFormatter(SIMPLE_LOG_FORMAT))
+        stderr = logging.StreamHandler()
+        stderr.setLevel(logging.WARNING)
+        if config.level == logging.DEBUG:
+            stderr.setFormatter(AGPTFormatter(DEBUG_LOG_FORMAT))
+        else:
+            stderr.setFormatter(AGPTFormatter(SIMPLE_LOG_FORMAT))
 
-    log_handlers += [stdout, stderr]
+        log_handlers += [stdout, stderr]
 
-    # Cloud logging setup
-    if config.enable_cloud_logging or force_cloud_logging:
-        import google.cloud.logging
-        from google.cloud.logging.handlers import CloudLoggingHandler
-        from google.cloud.logging_v2.handlers.transports import (
-            BackgroundThreadTransport,
-        )
+    else:
+        # Use Google Cloud Structured Log Handler. Log entries are printed to stdout
+        # in a JSON format which is automatically picked up by Google Cloud Logging.
+        from google.cloud.logging.handlers import StructuredLogHandler
 
-        client = google.cloud.logging.Client()
-        # Use BackgroundThreadTransport to prevent blocking the main thread
-        # and deadlocks when gRPC calls to Google Cloud Logging hang
-        cloud_handler = CloudLoggingHandler(
-            client,
-            name="autogpt_logs",
-            transport=BackgroundThreadTransport,
-        )
-        cloud_handler.setLevel(config.level)
-        log_handlers.append(cloud_handler)
+        structured_log_handler = StructuredLogHandler(stream=sys.stdout)
+        structured_log_handler.setLevel(config.level)
+        log_handlers.append(structured_log_handler)
 
     # File logging setup
     if config.enable_file_logging:
```

@@ -185,7 +179,13 @@ def configure_logging(force_cloud_logging: bool = False) -> None:

```diff
     # Configure the root logger
     logging.basicConfig(
-        format=DEBUG_LOG_FORMAT if config.level == logging.DEBUG else SIMPLE_LOG_FORMAT,
+        format=(
+            "%(levelname)s %(message)s"
+            if structured_logging
+            else (
+                DEBUG_LOG_FORMAT if config.level == logging.DEBUG else SIMPLE_LOG_FORMAT
+            )
+        ),
         level=config.level,
         handlers=log_handlers,
     )
```
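The console branch above splits output by pairing a below-level filter on the stdout handler with a level threshold on the stderr handler. A self-contained sketch of that pattern, using a stand-in `BelowLevelFilter` and string buffers instead of the real streams:

```python
import io
import logging


class BelowLevelFilter(logging.Filter):
    """Pass only records strictly below a given level (stand-in for the project's filter)."""

    def __init__(self, level: int):
        super().__init__()
        self.level = level

    def filter(self, record: logging.LogRecord) -> bool:
        return record.levelno < self.level


out, err = io.StringIO(), io.StringIO()

stdout = logging.StreamHandler(out)
stdout.addFilter(BelowLevelFilter(logging.WARNING))  # DEBUG/INFO only
stderr = logging.StreamHandler(err)
stderr.setLevel(logging.WARNING)                     # WARNING and above only

log = logging.getLogger("demo")
log.setLevel(logging.INFO)
log.addHandler(stdout)
log.addHandler(stderr)

log.info("routine message")    # goes to the stdout buffer
log.error("something failed")  # goes to the stderr buffer

print(out.getvalue().strip())  # routine message
print(err.getvalue().strip())  # something failed
```

Each record visits both handlers; the filter and the level threshold are complementary, so no record is duplicated or dropped.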
---

@@ -47,6 +47,7 @@ RUN poetry install --no-ansi --no-root

```diff
 # Generate Prisma client
 COPY autogpt_platform/backend/schema.prisma ./
+COPY autogpt_platform/backend/backend/data/partial_types.py ./backend/data/partial_types.py
 RUN poetry run prisma generate
 
 FROM debian:13-slim AS server_dependencies
```

@@ -92,6 +93,7 @@ FROM server_dependencies AS migrate

```diff
 # Migration stage only needs schema and migrations - much lighter than full backend
 COPY autogpt_platform/backend/schema.prisma /app/autogpt_platform/backend/
+COPY autogpt_platform/backend/backend/data/partial_types.py /app/autogpt_platform/backend/backend/data/partial_types.py
 COPY autogpt_platform/backend/migrations /app/autogpt_platform/backend/migrations
 
 FROM server_dependencies AS server
```
---

@@ -4,13 +4,13 @@ import mimetypes

```diff
 from pathlib import Path
 from typing import Any
 
-import aiohttp
 import discord
 from pydantic import SecretStr
 
 from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
 from backend.data.model import APIKeyCredentials, SchemaField
 from backend.util.file import store_media_file
 from backend.util.request import Requests
 from backend.util.type import MediaFileType
 
 from ._auth import (
```

@@ -114,10 +114,9 @@ class ReadDiscordMessagesBlock(Block):

```diff
         if message.attachments:
             attachment = message.attachments[0]  # Process the first attachment
             if attachment.filename.endswith((".txt", ".py")):
-                async with aiohttp.ClientSession() as session:
-                    async with session.get(attachment.url) as response:
-                        file_content = response.text()
-                        self.output_data += f"\n\nFile from user: {attachment.filename}\nContent: {file_content}"
+                response = await Requests().get(attachment.url)
+                file_content = response.text()
+                self.output_data += f"\n\nFile from user: {attachment.filename}\nContent: {file_content}"
 
         await client.close()
```

@@ -699,16 +698,15 @@ class SendDiscordFileBlock(Block):

```diff
         elif file.startswith(("http://", "https://")):
             # URL - download the file
-            async with aiohttp.ClientSession() as session:
-                async with session.get(file) as response:
-                    file_bytes = await response.read()
+            response = await Requests().get(file)
+            file_bytes = response.content
 
-                    # Try to get filename from URL if not provided
-                    if not filename:
-                        from urllib.parse import urlparse
+            # Try to get filename from URL if not provided
+            if not filename:
+                from urllib.parse import urlparse
 
-                        path = urlparse(file).path
-                        detected_filename = Path(path).name or "download"
+                path = urlparse(file).path
+                detected_filename = Path(path).name or "download"
         else:
             # Local file path - read from stored media file
             # This would be a path from a previous block's output
```
---

@@ -62,10 +62,10 @@ TEST_CREDENTIALS_OAUTH = OAuth2Credentials(

```diff
     title="Mock Linear API key",
     username="mock-linear-username",
     access_token=SecretStr("mock-linear-access-token"),
-    access_token_expires_at=None,
+    access_token_expires_at=1672531200,  # Mock expiration time for short-lived token
     refresh_token=SecretStr("mock-linear-refresh-token"),
     refresh_token_expires_at=None,
-    scopes=["mock-linear-scopes"],
+    scopes=["read", "write"],
 )
 
 TEST_CREDENTIALS_API_KEY = APIKeyCredentials(
```
---

@@ -2,7 +2,9 @@

```diff
 Linear OAuth handler implementation.
 """
 
+import base64
 import json
+import time
 from typing import Optional
 from urllib.parse import urlencode
```

@@ -38,8 +40,9 @@ class LinearOAuthHandler(BaseOAuthHandler):

```diff
         self.client_secret = client_secret
         self.redirect_uri = redirect_uri
         self.auth_base_url = "https://linear.app/oauth/authorize"
-        self.token_url = "https://api.linear.app/oauth/token"  # Correct token URL
+        self.token_url = "https://api.linear.app/oauth/token"
         self.revoke_url = "https://api.linear.app/oauth/revoke"
+        self.migrate_url = "https://api.linear.app/oauth/migrate_old_token"
 
     def get_login_url(
         self, scopes: list[str], state: str, code_challenge: Optional[str]
```

@@ -82,19 +85,84 @@ class LinearOAuthHandler(BaseOAuthHandler):

```diff
         return True  # Linear doesn't return JSON on successful revoke
 
+    async def migrate_old_token(
+        self, credentials: OAuth2Credentials
+    ) -> OAuth2Credentials:
+        """
+        Migrate an old long-lived token to a new short-lived token with refresh token.
+
+        This uses Linear's /oauth/migrate_old_token endpoint to exchange current
+        long-lived tokens for short-lived tokens with refresh tokens without
+        requiring users to re-authorize.
+        """
+        if not credentials.access_token:
+            raise ValueError("No access token to migrate")
+
+        request_body = {
+            "client_id": self.client_id,
+            "client_secret": self.client_secret,
+        }
+
+        headers = {
+            "Authorization": f"Bearer {credentials.access_token.get_secret_value()}",
+            "Content-Type": "application/x-www-form-urlencoded",
+        }
+
+        response = await Requests().post(
+            self.migrate_url, data=request_body, headers=headers
+        )
+
+        if not response.ok:
+            try:
+                error_data = response.json()
+                error_message = error_data.get("error", "Unknown error")
+                error_description = error_data.get("error_description", "")
+                if error_description:
+                    error_message = f"{error_message}: {error_description}"
+            except json.JSONDecodeError:
+                error_message = response.text
+            raise LinearAPIException(
+                f"Failed to migrate Linear token ({response.status}): {error_message}",
+                response.status,
+            )
+
+        token_data = response.json()
+
+        # Extract token expiration
+        now = int(time.time())
+        expires_in = token_data.get("expires_in")
+        access_token_expires_at = None
+        if expires_in:
+            access_token_expires_at = now + expires_in
+
+        new_credentials = OAuth2Credentials(
+            provider=self.PROVIDER_NAME,
+            title=credentials.title,
+            username=credentials.username,
+            access_token=token_data["access_token"],
+            scopes=credentials.scopes,  # Preserve original scopes
+            refresh_token=token_data.get("refresh_token"),
+            access_token_expires_at=access_token_expires_at,
+            refresh_token_expires_at=None,
+        )
+
+        new_credentials.id = credentials.id
+        return new_credentials
+
     async def _refresh_tokens(
         self, credentials: OAuth2Credentials
     ) -> OAuth2Credentials:
         if not credentials.refresh_token:
             raise ValueError(
-                "No refresh token available."
-            )  # Linear uses non-expiring tokens
+                "No refresh token available. Token may need to be migrated to the new refresh token system."
+            )
 
         return await self._request_tokens(
             {
                 "refresh_token": credentials.refresh_token.get_secret_value(),
                 "grant_type": "refresh_token",
-            }
+            },
+            current_credentials=credentials,
         )
 
     async def _request_tokens(
```

@@ -102,16 +170,33 @@ class LinearOAuthHandler(BaseOAuthHandler):

```diff
         params: dict[str, str],
         current_credentials: Optional[OAuth2Credentials] = None,
     ) -> OAuth2Credentials:
+        # Determine if this is a refresh token request
+        is_refresh = params.get("grant_type") == "refresh_token"
+
+        # Build request body with appropriate grant_type
         request_body = {
             "client_id": self.client_id,
             "client_secret": self.client_secret,
-            "grant_type": "authorization_code",  # Ensure grant_type is correct
             **params,
         }
 
-        headers = {
-            "Content-Type": "application/x-www-form-urlencoded"
-        }  # Correct header for token request
+        # Set default grant_type if not provided
+        if "grant_type" not in request_body:
+            request_body["grant_type"] = "authorization_code"
+
+        headers = {"Content-Type": "application/x-www-form-urlencoded"}
+
+        # For refresh token requests, support HTTP Basic Authentication as recommended
+        if is_refresh:
+            # Option 1: Use HTTP Basic Auth (preferred by Linear)
+            client_credentials = f"{self.client_id}:{self.client_secret}"
+            encoded_credentials = base64.b64encode(client_credentials.encode()).decode()
+            headers["Authorization"] = f"Basic {encoded_credentials}"
+
+            # Remove client credentials from body when using Basic Auth
+            request_body.pop("client_id", None)
+            request_body.pop("client_secret", None)
 
         response = await Requests().post(
             self.token_url, data=request_body, headers=headers
         )
```

@@ -120,6 +205,9 @@ class LinearOAuthHandler(BaseOAuthHandler):

```diff
             try:
                 error_data = response.json()
                 error_message = error_data.get("error", "Unknown error")
+                error_description = error_data.get("error_description", "")
+                if error_description:
+                    error_message = f"{error_message}: {error_description}"
             except json.JSONDecodeError:
                 error_message = response.text
             raise LinearAPIException(
```

@@ -129,27 +217,84 @@ class LinearOAuthHandler(BaseOAuthHandler):

```diff
         token_data = response.json()
 
-        # Note: Linear access tokens do not expire, so we set expires_at to None
+        # Extract token expiration if provided (for new refresh token implementation)
+        now = int(time.time())
+        expires_in = token_data.get("expires_in")
+        access_token_expires_at = None
+        if expires_in:
+            access_token_expires_at = now + expires_in
+
+        # Get username - preserve from current credentials if refreshing
+        username = None
+        if current_credentials and is_refresh:
+            username = current_credentials.username
+        elif "user" in token_data:
+            username = token_data["user"].get("name", "Unknown User")
+        else:
+            # Fetch username using the access token
+            username = await self._request_username(token_data["access_token"])
+
         new_credentials = OAuth2Credentials(
             provider=self.PROVIDER_NAME,
             title=current_credentials.title if current_credentials else None,
-            username=token_data.get("user", {}).get(
-                "name", "Unknown User"
-            ),  # extract name or set appropriate
+            username=username or "Unknown User",
             access_token=token_data["access_token"],
```
|
||||
scopes=token_data["scope"].split(
|
||||
","
|
||||
), # Linear returns comma-separated scopes
|
||||
refresh_token=token_data.get(
|
||||
"refresh_token"
|
||||
), # Linear uses non-expiring tokens so this might be null
|
||||
access_token_expires_at=None,
|
||||
refresh_token_expires_at=None,
|
||||
scopes=(
|
||||
token_data["scope"].split(",")
|
||||
if "scope" in token_data
|
||||
else (current_credentials.scopes if current_credentials else [])
|
||||
),
|
||||
refresh_token=token_data.get("refresh_token"),
|
||||
access_token_expires_at=access_token_expires_at,
|
||||
refresh_token_expires_at=None, # Linear doesn't provide refresh token expiration
|
||||
)
|
||||
|
||||
if current_credentials:
|
||||
new_credentials.id = current_credentials.id
|
||||
|
||||
return new_credentials
|
||||
|
||||
async def get_access_token(self, credentials: OAuth2Credentials) -> str:
|
||||
"""
|
||||
Returns a valid access token, handling migration and refresh as needed.
|
||||
|
||||
This overrides the base implementation to handle Linear's token migration
|
||||
from old long-lived tokens to new short-lived tokens with refresh tokens.
|
||||
"""
|
||||
# If token has no expiration and no refresh token, it might be an old token
|
||||
# that needs migration
|
||||
if (
|
||||
credentials.access_token_expires_at is None
|
||||
and credentials.refresh_token is None
|
||||
):
|
||||
try:
|
||||
# Attempt to migrate the old token
|
||||
migrated_credentials = await self.migrate_old_token(credentials)
|
||||
# Update the credentials store would need to be handled by the caller
|
||||
# For now, use the migrated credentials for this request
|
||||
credentials = migrated_credentials
|
||||
except LinearAPIException:
|
||||
# Migration failed, try to use the old token as-is
|
||||
# This maintains backward compatibility
|
||||
pass
|
||||
|
||||
# Use the standard refresh logic from the base class
|
||||
if self.needs_refresh(credentials):
|
||||
credentials = await self.refresh_tokens(credentials)
|
||||
|
||||
return credentials.access_token.get_secret_value()
|
||||
|
||||
def needs_migration(self, credentials: OAuth2Credentials) -> bool:
|
||||
"""
|
||||
Check if credentials represent an old long-lived token that needs migration.
|
||||
|
||||
Old tokens have no expiration time and no refresh token.
|
||||
"""
|
||||
return (
|
||||
credentials.access_token_expires_at is None
|
||||
and credentials.refresh_token is None
|
||||
)
|
||||
|
||||
async def _request_username(self, access_token: str) -> Optional[str]:
|
||||
# Use the LinearClient to fetch user details using GraphQL
|
||||
from ._api import LinearClient
|
||||
|
||||
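The refresh branch above swaps body-based client credentials for an HTTP Basic `Authorization` header. A minimal, standalone sketch of that header construction (the `basic_auth_header` helper is illustrative, not part of the handler):

```python
import base64

def basic_auth_header(client_id: str, client_secret: str) -> str:
    """Encode OAuth client credentials as an HTTP Basic Authorization value."""
    raw = f"{client_id}:{client_secret}".encode()
    return "Basic " + base64.b64encode(raw).decode()

header = basic_auth_header("my-client", "my-secret")
```

Note that when this header is sent, the `client_id`/`client_secret` pair is dropped from the request body, matching the `request_body.pop(...)` calls in the diff.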
@@ -1,7 +1,5 @@
 import asyncio
 import logging
 import urllib.parse
-import urllib.request
 from datetime import datetime, timedelta, timezone
 from typing import Any
 
@@ -10,6 +8,7 @@ import pydantic
 
 from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
 from backend.data.model import SchemaField
+from backend.util.request import Requests
 
 
 class RSSEntry(pydantic.BaseModel):
@@ -103,35 +102,29 @@ class ReadRSSFeedBlock(Block):
     )
 
     @staticmethod
-    def parse_feed(url: str) -> dict[str, Any]:
+    async def parse_feed(url: str) -> dict[str, Any]:
         # Security fix: Add protection against memory exhaustion attacks
         MAX_FEED_SIZE = 10 * 1024 * 1024  # 10MB limit for RSS feeds
 
         # Validate URL
         parsed_url = urllib.parse.urlparse(url)
         if parsed_url.scheme not in ("http", "https"):
             raise ValueError(f"Invalid URL scheme: {parsed_url.scheme}")
 
-        # Download with size limit
+        # Download feed content with size limit
         try:
-            with urllib.request.urlopen(url, timeout=30) as response:
-                # Check content length if available
-                content_length = response.headers.get("Content-Length")
-                if content_length and int(content_length) > MAX_FEED_SIZE:
-                    raise ValueError(
-                        f"Feed too large: {content_length} bytes exceeds {MAX_FEED_SIZE} limit"
-                    )
+            response = await Requests(raise_for_status=True).get(url)
 
-                # Read with size limit
-                content = response.read(MAX_FEED_SIZE + 1)
-                if len(content) > MAX_FEED_SIZE:
-                    raise ValueError(
-                        f"Feed too large: exceeds {MAX_FEED_SIZE} byte limit"
-                    )
+            # Check content length if available
+            content_length = response.headers.get("Content-Length")
+            if content_length and int(content_length) > MAX_FEED_SIZE:
+                raise ValueError(
+                    f"Feed too large: {content_length} bytes exceeds {MAX_FEED_SIZE} limit"
                )
 
-                # Parse with feedparser using the validated content
-                # feedparser has built-in protection against XML attacks
-                return feedparser.parse(content)  # type: ignore
+            # Get content with size limit
+            content = response.content
+            if len(content) > MAX_FEED_SIZE:
+                raise ValueError(f"Feed too large: exceeds {MAX_FEED_SIZE} byte limit")
+
+            # Parse with feedparser using the validated content
+            # feedparser has built-in protection against XML attacks
+            return feedparser.parse(content)  # type: ignore
         except Exception as e:
             # Log error and return empty feed
             logging.warning(f"Failed to parse RSS feed from {url}: {e}")
@@ -145,7 +138,7 @@ class ReadRSSFeedBlock(Block):
         while keep_going:
             keep_going = input_data.run_continuously
 
-            feed = self.parse_feed(input_data.rss_url)
+            feed = await self.parse_feed(input_data.rss_url)
             all_entries = []
 
             for entry in feed["entries"]:
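The size check above can be isolated into a small helper. This sketch (names are illustrative, not the project's `Requests` wrapper) reads at most the limit plus one byte so an oversized payload is detected without buffering the whole stream:

```python
import io

MAX_FEED_SIZE = 10 * 1024 * 1024  # 10 MB, mirroring the block above

def read_limited(stream, limit: int = MAX_FEED_SIZE) -> bytes:
    """Read at most `limit` bytes; raise if the stream holds more."""
    content = stream.read(limit + 1)  # one extra byte reveals overflow
    if len(content) > limit:
        raise ValueError(f"Feed too large: exceeds {limit} byte limit")
    return content

data = read_limited(io.BytesIO(b"<rss></rss>"), limit=1024)
```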
@@ -1,6 +1,7 @@
 from urllib.parse import parse_qs, urlparse
 
 from youtube_transcript_api._api import YouTubeTranscriptApi
+from youtube_transcript_api._errors import NoTranscriptFound
 from youtube_transcript_api._transcripts import FetchedTranscript
 from youtube_transcript_api.formatters import TextFormatter
 
@@ -64,7 +65,29 @@ class TranscribeYoutubeVideoBlock(Block):
 
     @staticmethod
     def get_transcript(video_id: str) -> FetchedTranscript:
-        return YouTubeTranscriptApi().fetch(video_id=video_id)
+        """
+        Get transcript for a video, preferring English but falling back to any available language.
+
+        :param video_id: The YouTube video ID
+        :return: The fetched transcript
+        :raises: Any exception except NoTranscriptFound for requested languages
+        """
+        api = YouTubeTranscriptApi()
+        try:
+            # Try to get English transcript first (default behavior)
+            return api.fetch(video_id=video_id)
+        except NoTranscriptFound:
+            # If English is not available, get the first available transcript
+            transcript_list = api.list(video_id)
+            # Try manually created transcripts first, then generated ones
+            available_transcripts = list(
+                transcript_list._manually_created_transcripts.values()
+            ) + list(transcript_list._generated_transcripts.values())
+            if available_transcripts:
+                # Fetch the first available transcript
+                return available_transcripts[0].fetch()
+            # If no transcripts at all, re-raise the original error
+            raise
 
     @staticmethod
     def format_transcript(transcript: FetchedTranscript) -> str:
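The fallback order in `get_transcript` (manual transcripts before auto-generated ones, re-raise when nothing exists) can be sketched independently of the YouTube API; plain lists here stand in for the library's transcript objects:

```python
def pick_transcript(manual: list, generated: list):
    """Return the first manual transcript, else the first generated one."""
    available = list(manual) + list(generated)
    if available:
        return available[0]
    # Mirrors the re-raise in the block above when nothing is available
    raise LookupError("no transcripts available")

choice = pick_transcript([], ["auto-en"])
```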
@@ -1,7 +1,11 @@
 from typing import Type
 
 from backend.blocks.ai_music_generator import AIMusicGeneratorBlock
-from backend.blocks.ai_shortform_video_block import AIShortformVideoCreatorBlock
+from backend.blocks.ai_shortform_video_block import (
+    AIAdMakerVideoCreatorBlock,
+    AIScreenshotToVideoAdBlock,
+    AIShortformVideoCreatorBlock,
+)
 from backend.blocks.apollo.organization import SearchOrganizationsBlock
 from backend.blocks.apollo.people import SearchPeopleBlock
 from backend.blocks.apollo.person import GetPersonDetailBlock
@@ -323,7 +327,31 @@ BLOCK_COSTS: dict[Type[Block], list[BlockCost]] = {
     ],
     AIShortformVideoCreatorBlock: [
         BlockCost(
-            cost_amount=50,
+            cost_amount=307,
             cost_filter={
                 "credentials": {
                     "id": revid_credentials.id,
                     "provider": revid_credentials.provider,
                     "type": revid_credentials.type,
                 }
             },
         )
     ],
+    AIAdMakerVideoCreatorBlock: [
+        BlockCost(
+            cost_amount=714,
+            cost_filter={
+                "credentials": {
+                    "id": revid_credentials.id,
+                    "provider": revid_credentials.provider,
+                    "type": revid_credentials.type,
+                }
+            },
+        )
+    ],
+    AIScreenshotToVideoAdBlock: [
+        BlockCost(
+            cost_amount=612,
+            cost_filter={
+                "credentials": {
+                    "id": revid_credentials.id,
@@ -264,7 +264,16 @@ class BaseGraph(BaseDbModel):
         schema_fields: list[AgentInputBlock.Input | AgentOutputBlock.Input] = []
         for type_class, input_default in props:
             try:
-                schema_fields.append(type_class.model_construct(**input_default))
+                constructed_obj = type_class.model_construct(**input_default)
+                # Validate that the constructed object has required 'name' attribute
+                # model_construct() bypasses validation, so we need to check manually
+                if not hasattr(constructed_obj, "name") or constructed_obj.name is None:
+                    logger.warning(
+                        f"Skipping invalid {type_class.__name__} node: missing required 'name' field. "
+                        f"Input data: {input_default}"
+                    )
+                    continue
+                schema_fields.append(constructed_obj)
             except Exception as e:
                 logger.error(f"Invalid {type_class}: {input_default}, {e}")
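The guard added above exists because pydantic's `model_construct()` skips validation, so a required field can simply be absent. A stdlib-only sketch of the same post-construction check (the `Node` class is a hypothetical stand-in for the pydantic model):

```python
class Node:
    """Stand-in for an object built without field validation."""
    def __init__(self, **fields):
        # No required-field checks, analogous to model_construct()
        self.__dict__.update(fields)

def has_valid_name(obj) -> bool:
    """Mirror of the guard above: require a present, non-None 'name'."""
    return getattr(obj, "name", None) is not None

ok = has_valid_name(Node(name="input_1"))
bad = has_valid_name(Node())
```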
@@ -347,6 +347,9 @@ class APIKeyCredentials(_BaseCredentials):
     """Unix timestamp (seconds) indicating when the API key expires (if at all)"""
 
     def auth_header(self) -> str:
+        # Linear API keys should not have Bearer prefix
+        if self.provider == "linear":
+            return self.api_key.get_secret_value()
         return f"Bearer {self.api_key.get_secret_value()}"
autogpt_platform/backend/backend/data/partial_types.py (new file, +5)
@@ -0,0 +1,5 @@
+import prisma.models
+
+
+class StoreAgentWithRank(prisma.models.StoreAgent):
+    rank: float
@@ -39,6 +39,7 @@ from backend.data.notifications import (
 )
 from backend.data.user import (
     get_active_user_ids_in_timerange,
+    get_user_by_id,
     get_user_email_by_id,
     get_user_email_verification,
     get_user_integrations,
@@ -146,6 +147,7 @@ class DatabaseManager(AppService):
 
     # User Comms - async
     get_active_user_ids_in_timerange = _(get_active_user_ids_in_timerange)
+    get_user_by_id = _(get_user_by_id)
     get_user_email_by_id = _(get_user_email_by_id)
     get_user_email_verification = _(get_user_email_verification)
     get_user_notification_preference = _(get_user_notification_preference)
@@ -231,6 +233,7 @@ class DatabaseManagerAsyncClient(AppServiceClient):
     get_node = d.get_node
     get_node_execution = d.get_node_execution
     get_node_executions = d.get_node_executions
+    get_user_by_id = d.get_user_by_id
     get_user_integrations = d.get_user_integrations
     upsert_execution_input = d.upsert_execution_input
     upsert_execution_output = d.upsert_execution_output
@@ -34,6 +34,7 @@ from backend.data.graph import GraphModel, Node
 from backend.data.model import CredentialsMetaInput
 from backend.data.rabbitmq import Exchange, ExchangeType, Queue, RabbitMQConfig
 from backend.data.user import get_user_by_id
+from backend.util.cache import cached
 from backend.util.clients import (
     get_async_execution_event_bus,
     get_async_execution_queue,
@@ -41,11 +42,12 @@ from backend.util.clients import (
     get_integration_credentials_store,
 )
 from backend.util.exceptions import GraphValidationError, NotFoundError
-from backend.util.logging import TruncatedLogger
+from backend.util.logging import TruncatedLogger, is_structured_logging_enabled
 from backend.util.settings import Config
 from backend.util.type import convert
 
 
+@cached(maxsize=1000, ttl_seconds=3600)
 async def get_user_context(user_id: str) -> UserContext:
     """
     Get UserContext for a user, always returns a valid context with timezone.
@@ -53,7 +55,11 @@ async def get_user_context(user_id: str) -> UserContext:
     """
     user_context = UserContext(timezone="UTC")  # Default to UTC
     try:
-        user = await get_user_by_id(user_id)
+        if prisma.is_connected():
+            user = await get_user_by_id(user_id)
+        else:
+            user = await get_database_manager_async_client().get_user_by_id(user_id)
 
         if user and user.timezone and user.timezone != "not-set":
             user_context.timezone = user.timezone
             logger.debug(f"Retrieved user context: timezone={user.timezone}")
@@ -93,7 +99,11 @@ class LogMetadata(TruncatedLogger):
             "node_id": node_id,
             "block_name": block_name,
         }
-        prefix = f"[ExecutionManager|uid:{user_id}|gid:{graph_id}|nid:{node_id}]|geid:{graph_eid}|neid:{node_eid}|{block_name}]"
+        prefix = (
+            "[ExecutionManager]"
+            if is_structured_logging_enabled()
+            else f"[ExecutionManager|uid:{user_id}|gid:{graph_id}|nid:{node_id}]|geid:{graph_eid}|neid:{node_eid}|{block_name}]"  # noqa
+        )
         super().__init__(
             logger,
             max_length=max_length,
@@ -1,5 +1,6 @@
 import asyncio
 import logging
+import typing
 from datetime import datetime, timezone
 
 import fastapi
@@ -71,64 +72,171 @@ async def get_store_agents(
     logger.debug(
         f"Getting store agents. featured={featured}, creators={creators}, sorted_by={sorted_by}, search={search_query}, category={category}, page={page}"
     )
-    search_term = sanitize_query(search_query)
-    where_clause: prisma.types.StoreAgentWhereInput = {"is_available": True}
-    if featured:
-        where_clause["featured"] = featured
-    if creators:
-        where_clause["creator_username"] = {"in": creators}
-    if category:
-        where_clause["categories"] = {"has": category}
-
-    if search_term:
-        where_clause["OR"] = [
-            {"agent_name": {"contains": search_term, "mode": "insensitive"}},
-            {"description": {"contains": search_term, "mode": "insensitive"}},
-        ]
-
-    order_by = []
-    if sorted_by == "rating":
-        order_by.append({"rating": "desc"})
-    elif sorted_by == "runs":
-        order_by.append({"runs": "desc"})
-    elif sorted_by == "name":
-        order_by.append({"agent_name": "asc"})
-
-    try:
-        agents = await prisma.models.StoreAgent.prisma().find_many(
-            where=where_clause,
-            order=order_by,
-            skip=(page - 1) * page_size,
-            take=page_size,
-        )
-
-        total = await prisma.models.StoreAgent.prisma().count(where=where_clause)
-        total_pages = (total + page_size - 1) // page_size
-
-        store_agents: list[backend.server.v2.store.model.StoreAgent] = []
-        for agent in agents:
-            try:
-                # Create the StoreAgent object safely
-                store_agent = backend.server.v2.store.model.StoreAgent(
-                    slug=agent.slug,
-                    agent_name=agent.agent_name,
-                    agent_image=agent.agent_image[0] if agent.agent_image else "",
-                    creator=agent.creator_username or "Needs Profile",
-                    creator_avatar=agent.creator_avatar or "",
-                    sub_heading=agent.sub_heading,
-                    description=agent.description,
-                    runs=agent.runs,
-                    rating=agent.rating,
-                )
-                # Add to the list only if creation was successful
-                store_agents.append(store_agent)
-            except Exception as e:
-                # Skip this agent if there was an error
-                # You could log the error here if needed
-                logger.error(
-                    f"Error parsing Store agent when getting store agents from db: {e}"
-                )
-                continue
+    # If search_query is provided, use full-text search
+    if search_query:
+        search_term = sanitize_query(search_query)
+        if not search_term:
+            # Return empty results for invalid search query
+            return backend.server.v2.store.model.StoreAgentsResponse(
+                agents=[],
+                pagination=backend.server.v2.store.model.Pagination(
+                    current_page=page,
+                    total_items=0,
+                    total_pages=0,
+                    page_size=page_size,
+                ),
+            )
+
+        offset = (page - 1) * page_size
+
+        # Build filter conditions
+        filter_conditions = []
+        filter_conditions.append("is_available = true")
+
+        if featured:
+            filter_conditions.append("featured = true")
+        if creators:
+            creator_list = "','".join(creators)
+            filter_conditions.append(f"creator_username IN ('{creator_list}')")
+        if category:
+            filter_conditions.append(f"'{category}' = ANY(categories)")
+
+        where_filter = " AND ".join(filter_conditions) if filter_conditions else "1=1"
+
+        # Build ORDER BY clause
+        if sorted_by == "rating":
+            order_by_clause = "rating DESC, rank DESC"
+        elif sorted_by == "runs":
+            order_by_clause = "runs DESC, rank DESC"
+        elif sorted_by == "name":
+            order_by_clause = "agent_name ASC, rank DESC"
+        else:
+            order_by_clause = "rank DESC, updated_at DESC"
+
+        # Execute full-text search query
+        sql_query = f"""
+            SELECT
+                slug,
+                agent_name,
+                agent_image,
+                creator_username,
+                creator_avatar,
+                sub_heading,
+                description,
+                runs,
+                rating,
+                categories,
+                featured,
+                is_available,
+                updated_at,
+                ts_rank_cd(search, query) AS rank
+            FROM "StoreAgent",
+                plainto_tsquery('english', '{search_term}') AS query
+            WHERE {where_filter}
+                AND search @@ query
+            ORDER BY rank DESC, {order_by_clause}
+            LIMIT {page_size} OFFSET {offset}
+        """
+
+        # Count query for pagination
+        count_query = f"""
+            SELECT COUNT(*) as count
+            FROM "StoreAgent",
+                plainto_tsquery('english', '{search_term}') AS query
+            WHERE {where_filter}
+                AND search @@ query
+        """
+
+        # Execute both queries
+        agents = await prisma.client.get_client().query_raw(
+            query=typing.cast(typing.LiteralString, sql_query)
+        )
+
+        count_result = await prisma.client.get_client().query_raw(
+            query=typing.cast(typing.LiteralString, count_query)
+        )
+
+        total = count_result[0]["count"] if count_result else 0
+        total_pages = (total + page_size - 1) // page_size
+
+        # Convert raw results to StoreAgent models
+        store_agents: list[backend.server.v2.store.model.StoreAgent] = []
+        for agent in agents:
+            try:
+                store_agent = backend.server.v2.store.model.StoreAgent(
+                    slug=agent["slug"],
+                    agent_name=agent["agent_name"],
+                    agent_image=(
+                        agent["agent_image"][0] if agent["agent_image"] else ""
+                    ),
+                    creator=agent["creator_username"] or "Needs Profile",
+                    creator_avatar=agent["creator_avatar"] or "",
+                    sub_heading=agent["sub_heading"],
+                    description=agent["description"],
+                    runs=agent["runs"],
+                    rating=agent["rating"],
+                )
+                store_agents.append(store_agent)
+            except Exception as e:
+                logger.error(f"Error parsing Store agent from search results: {e}")
+                continue
+
+    else:
+        # Non-search query path (original logic)
+        where_clause: prisma.types.StoreAgentWhereInput = {"is_available": True}
+        if featured:
+            where_clause["featured"] = featured
+        if creators:
+            where_clause["creator_username"] = {"in": creators}
+        if category:
+            where_clause["categories"] = {"has": category}
+
+        order_by = []
+        if sorted_by == "rating":
+            order_by.append({"rating": "desc"})
+        elif sorted_by == "runs":
+            order_by.append({"runs": "desc"})
+        elif sorted_by == "name":
+            order_by.append({"agent_name": "asc"})
+
+        agents = await prisma.models.StoreAgent.prisma().find_many(
+            where=where_clause,
+            order=order_by,
+            skip=(page - 1) * page_size,
+            take=page_size,
+        )
+
+        total = await prisma.models.StoreAgent.prisma().count(where=where_clause)
+        total_pages = (total + page_size - 1) // page_size
+
+        store_agents: list[backend.server.v2.store.model.StoreAgent] = []
+        for agent in agents:
+            try:
+                # Create the StoreAgent object safely
+                store_agent = backend.server.v2.store.model.StoreAgent(
+                    slug=agent.slug,
+                    agent_name=agent.agent_name,
+                    agent_image=agent.agent_image[0] if agent.agent_image else "",
+                    creator=agent.creator_username or "Needs Profile",
+                    creator_avatar=agent.creator_avatar or "",
+                    sub_heading=agent.sub_heading,
+                    description=agent.description,
+                    runs=agent.runs,
+                    rating=agent.rating,
+                )
+                # Add to the list only if creation was successful
+                store_agents.append(store_agent)
+            except Exception as e:
+                # Skip this agent if there was an error
+                # You could log the error here if needed
+                logger.error(
+                    f"Error parsing Store agent when getting store agents from db: {e}"
+                )
+                continue
 
     logger.debug(f"Found {len(store_agents)} agents")
     return backend.server.v2.store.model.StoreAgentsResponse(
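Both branches above share the same pagination arithmetic (`LIMIT`/`OFFSET` plus ceiling division for the page count); a standalone sketch of that arithmetic:

```python
def total_pages(total_items: int, page_size: int) -> int:
    """Ceiling division, as in (total + page_size - 1) // page_size above."""
    return (total_items + page_size - 1) // page_size

def limit_offset(page: int, page_size: int) -> tuple[int, int]:
    """LIMIT/OFFSET pair for a 1-indexed page number."""
    return page_size, (page - 1) * page_size
```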
@@ -40,23 +40,13 @@ async def get_profile(
|
||||
Get the profile details for the authenticated user.
|
||||
Cached for 1 hour per user.
|
||||
"""
|
||||
try:
|
||||
profile = await backend.server.v2.store.db.get_user_profile(user_id)
|
||||
if profile is None:
|
||||
return fastapi.responses.JSONResponse(
|
||||
status_code=404,
|
||||
content={"detail": "Profile not found"},
|
||||
)
|
||||
return profile
|
||||
except Exception as e:
|
||||
logger.exception("Failed to fetch user profile for %s: %s", user_id, e)
|
||||
profile = await backend.server.v2.store.db.get_user_profile(user_id)
|
||||
if profile is None:
|
||||
return fastapi.responses.JSONResponse(
|
||||
status_code=500,
|
||||
content={
|
||||
"detail": "Failed to retrieve user profile",
|
||||
"hint": "Check database connection.",
|
||||
},
|
||||
status_code=404,
|
||||
content={"detail": "Profile not found"},
|
||||
)
|
||||
return profile
|
||||
|
||||
|
||||
@router.post(
|
||||
@@ -83,20 +73,10 @@ async def update_or_create_profile(
|
||||
Raises:
|
||||
HTTPException: If there is an error updating the profile
|
||||
"""
|
||||
try:
|
||||
updated_profile = await backend.server.v2.store.db.update_profile(
|
||||
user_id=user_id, profile=profile
|
||||
)
|
||||
return updated_profile
|
||||
except Exception as e:
|
||||
logger.exception("Failed to update profile for user %s: %s", user_id, e)
|
||||
return fastapi.responses.JSONResponse(
|
||||
status_code=500,
|
||||
content={
|
||||
"detail": "Failed to update user profile",
|
||||
"hint": "Validate request data.",
|
||||
},
|
||||
)
|
||||
updated_profile = await backend.server.v2.store.db.update_profile(
|
||||
user_id=user_id, profile=profile
|
||||
)
|
||||
return updated_profile
|
||||
|
||||
|
||||
##############################################
|
||||
@@ -155,26 +135,16 @@ async def get_agents(
|
||||
status_code=422, detail="Page size must be greater than 0"
|
||||
)
|
||||
|
||||
try:
|
||||
agents = await store_cache._get_cached_store_agents(
|
||||
featured=featured,
|
||||
creator=creator,
|
||||
sorted_by=sorted_by,
|
||||
search_query=search_query,
|
||||
category=category,
|
||||
page=page,
|
||||
page_size=page_size,
|
||||
)
|
||||
return agents
|
||||
except Exception as e:
|
||||
logger.exception("Failed to retrieve store agents: %s", e)
|
||||
return fastapi.responses.JSONResponse(
|
||||
status_code=500,
|
||||
content={
|
||||
"detail": "Failed to retrieve store agents",
|
||||
"hint": "Check database or search parameters.",
|
||||
},
|
||||
)
|
||||
agents = await store_cache._get_cached_store_agents(
|
||||
featured=featured,
|
||||
creator=creator,
|
||||
sorted_by=sorted_by,
|
||||
search_query=search_query,
|
||||
category=category,
|
||||
page=page,
|
||||
page_size=page_size,
|
||||
)
|
||||
return agents
|
||||
|
||||
|
||||
@router.get(
|
||||
@@ -189,22 +159,13 @@ async def get_agent(username: str, agent_name: str):
|
||||
|
||||
It returns the store listing agents details.
|
||||
"""
|
||||
try:
|
||||
username = urllib.parse.unquote(username).lower()
|
||||
# URL decode the agent name since it comes from the URL path
|
||||
agent_name = urllib.parse.unquote(agent_name).lower()
|
||||
agent = await store_cache._get_cached_agent_details(
|
||||
username=username, agent_name=agent_name
|
||||
)
|
||||
return agent
|
||||
except Exception:
|
||||
logger.exception("Exception occurred whilst getting store agent details")
|
||||
return fastapi.responses.JSONResponse(
|
||||
status_code=500,
|
||||
content={
|
||||
"detail": "An error occurred while retrieving the store agent details"
|
||||
},
|
||||
)
|
||||
username = urllib.parse.unquote(username).lower()
|
||||
# URL decode the agent name since it comes from the URL path
|
||||
agent_name = urllib.parse.unquote(agent_name).lower()
|
||||
agent = await store_cache._get_cached_agent_details(
|
||||
username=username, agent_name=agent_name
|
||||
)
|
||||
return agent
|
||||
|
||||
|
||||
@router.get(
|
||||
@@ -217,17 +178,10 @@ async def get_graph_meta_by_store_listing_version_id(store_listing_version_id: s
|
||||
"""
|
||||
Get Agent Graph from Store Listing Version ID.
|
||||
"""
|
||||
try:
|
||||
graph = await backend.server.v2.store.db.get_available_graph(
|
||||
store_listing_version_id
|
||||
)
|
||||
return graph
|
||||
except Exception:
|
||||
logger.exception("Exception occurred whilst getting agent graph")
|
||||
return fastapi.responses.JSONResponse(
|
||||
status_code=500,
|
||||
content={"detail": "An error occurred while retrieving the agent graph"},
|
||||
)
|
||||
graph = await backend.server.v2.store.db.get_available_graph(
|
||||
store_listing_version_id
|
||||
)
|
||||
return graph
|
||||
|
||||
|
||||
@router.get(
|
||||
@@ -241,18 +195,11 @@ async def get_store_agent(store_listing_version_id: str):
|
||||
"""
|
||||
Get Store Agent Details from Store Listing Version ID.
|
||||
"""
|
||||
try:
|
||||
agent = await backend.server.v2.store.db.get_store_agent_by_version_id(
|
||||
store_listing_version_id
|
||||
)
|
||||
agent = await backend.server.v2.store.db.get_store_agent_by_version_id(
|
||||
store_listing_version_id
|
||||
)
|
||||
|
||||
return agent
|
||||
except Exception:
|
||||
logger.exception("Exception occurred whilst getting store agent")
|
||||
return fastapi.responses.JSONResponse(
|
||||
status_code=500,
|
||||
content={"detail": "An error occurred while retrieving the store agent"},
|
||||
)
|
||||
return agent
|
||||
|
||||
|
||||
@router.post(
|
||||
@@ -280,24 +227,17 @@ async def create_review(
|
||||
Returns:
|
||||
The created review
|
||||
"""
|
||||
try:
|
||||
username = urllib.parse.unquote(username).lower()
|
||||
agent_name = urllib.parse.unquote(agent_name).lower()
|
||||
# Create the review
|
||||
created_review = await backend.server.v2.store.db.create_store_review(
|
||||
user_id=user_id,
|
||||
store_listing_version_id=review.store_listing_version_id,
|
||||
score=review.score,
|
||||
comments=review.comments,
|
||||
)
|
||||
username = urllib.parse.unquote(username).lower()
|
||||
agent_name = urllib.parse.unquote(agent_name).lower()
|
||||
# Create the review
|
||||
created_review = await backend.server.v2.store.db.create_store_review(
|
||||
user_id=user_id,
|
||||
store_listing_version_id=review.store_listing_version_id,
|
||||
score=review.score,
|
||||
comments=review.comments,
|
||||
)
|
||||
|
||||
return created_review
|
||||
except Exception:
|
||||
logger.exception("Exception occurred whilst creating store review")
|
||||
return fastapi.responses.JSONResponse(
|
||||
status_code=500,
|
||||
content={"detail": "An error occurred while creating the store review"},
|
||||
)
|
||||
return created_review
|
||||
|
||||
|
||||
##############################################
|
||||
@@ -340,21 +280,14 @@ async def get_creators(
|
||||
status_code=422, detail="Page size must be greater than 0"
|
||||
)
|
||||
|
||||
try:
|
||||
creators = await store_cache._get_cached_store_creators(
|
||||
featured=featured,
|
||||
search_query=search_query,
|
||||
sorted_by=sorted_by,
|
||||
page=page,
|
||||
page_size=page_size,
|
||||
)
|
||||
return creators
|
||||
except Exception:
|
||||
logger.exception("Exception occurred whilst getting store creators")
|
||||
return fastapi.responses.JSONResponse(
|
||||
status_code=500,
|
||||
content={"detail": "An error occurred while retrieving the store creators"},
|
||||
)
|
||||
creators = await store_cache._get_cached_store_creators(
|
||||
featured=featured,
|
||||
search_query=search_query,
|
||||
sorted_by=sorted_by,
|
||||
page=page,
|
||||
page_size=page_size,
|
||||
)
|
||||
return creators
|
||||
|
||||
|
||||
@router.get(
|
||||
@@ -370,18 +303,9 @@ async def get_creator(
     Get the details of a creator.
     - Creator Details Page
     """
-    try:
-        username = urllib.parse.unquote(username).lower()
-        creator = await store_cache._get_cached_creator_details(username=username)
-        return creator
-    except Exception:
-        logger.exception("Exception occurred whilst getting creator details")
-        return fastapi.responses.JSONResponse(
-            status_code=500,
-            content={
-                "detail": "An error occurred while retrieving the creator details"
-            },
-        )
+    username = urllib.parse.unquote(username).lower()
+    creator = await store_cache._get_cached_creator_details(username=username)
+    return creator
 
 
 ############################################
@@ -404,17 +328,10 @@ async def get_my_agents(
     """
    Get user's own agents.
    """
-    try:
-        agents = await backend.server.v2.store.db.get_my_agents(
-            user_id, page=page, page_size=page_size
-        )
-        return agents
-    except Exception:
-        logger.exception("Exception occurred whilst getting my agents")
-        return fastapi.responses.JSONResponse(
-            status_code=500,
-            content={"detail": "An error occurred while retrieving the my agents"},
-        )
+    agents = await backend.server.v2.store.db.get_my_agents(
+        user_id, page=page, page_size=page_size
+    )
+    return agents
 
 
 @router.delete(
@@ -438,19 +355,12 @@ async def delete_submission(
     Returns:
         bool: True if the submission was successfully deleted, False otherwise
     """
-    try:
-        result = await backend.server.v2.store.db.delete_store_submission(
-            user_id=user_id,
-            submission_id=submission_id,
-        )
+    result = await backend.server.v2.store.db.delete_store_submission(
+        user_id=user_id,
+        submission_id=submission_id,
+    )
 
-        return result
-    except Exception:
-        logger.exception("Exception occurred whilst deleting store submission")
-        return fastapi.responses.JSONResponse(
-            status_code=500,
-            content={"detail": "An error occurred while deleting the store submission"},
-        )
+    return result
 
 
 @router.get(
@@ -488,21 +398,12 @@ async def get_submissions(
         raise fastapi.HTTPException(
             status_code=422, detail="Page size must be greater than 0"
         )
-    try:
-        listings = await backend.server.v2.store.db.get_store_submissions(
-            user_id=user_id,
-            page=page,
-            page_size=page_size,
-        )
-        return listings
-    except Exception:
-        logger.exception("Exception occurred whilst getting store submissions")
-        return fastapi.responses.JSONResponse(
-            status_code=500,
-            content={
-                "detail": "An error occurred while retrieving the store submissions"
-            },
-        )
+    listings = await backend.server.v2.store.db.get_store_submissions(
+        user_id=user_id,
+        page=page,
+        page_size=page_size,
+    )
+    return listings
 
 
 @router.post(
@@ -529,36 +430,23 @@ async def create_submission(
     Raises:
         HTTPException: If there is an error creating the submission
     """
-    try:
-        result = await backend.server.v2.store.db.create_store_submission(
-            user_id=user_id,
-            agent_id=submission_request.agent_id,
-            agent_version=submission_request.agent_version,
-            slug=submission_request.slug,
-            name=submission_request.name,
-            video_url=submission_request.video_url,
-            image_urls=submission_request.image_urls,
-            description=submission_request.description,
-            instructions=submission_request.instructions,
-            sub_heading=submission_request.sub_heading,
-            categories=submission_request.categories,
-            changes_summary=submission_request.changes_summary or "Initial Submission",
-            recommended_schedule_cron=submission_request.recommended_schedule_cron,
-        )
+    result = await backend.server.v2.store.db.create_store_submission(
+        user_id=user_id,
+        agent_id=submission_request.agent_id,
+        agent_version=submission_request.agent_version,
+        slug=submission_request.slug,
+        name=submission_request.name,
+        video_url=submission_request.video_url,
+        image_urls=submission_request.image_urls,
+        description=submission_request.description,
+        instructions=submission_request.instructions,
+        sub_heading=submission_request.sub_heading,
+        categories=submission_request.categories,
+        changes_summary=submission_request.changes_summary or "Initial Submission",
+        recommended_schedule_cron=submission_request.recommended_schedule_cron,
+    )
 
-        return result
-    except backend.server.v2.store.exceptions.SlugAlreadyInUseError as e:
-        logger.warning("Slug already in use: %s", str(e))
-        return fastapi.responses.JSONResponse(
-            status_code=409,
-            content={"detail": str(e)},
-        )
-    except Exception:
-        logger.exception("Exception occurred whilst creating store submission")
-        return fastapi.responses.JSONResponse(
-            status_code=500,
-            content={"detail": "An error occurred while creating the store submission"},
-        )
+    return result
 
 
 @router.put(
@@ -627,36 +515,10 @@ async def upload_submission_media(
     Raises:
         HTTPException: If there is an error uploading the media
     """
-    try:
-        media_url = await backend.server.v2.store.media.upload_media(
-            user_id=user_id, file=file
-        )
-        return media_url
-    except backend.server.v2.store.exceptions.VirusDetectedError as e:
-        logger.warning(f"Virus detected in uploaded file: {e.threat_name}")
-        return fastapi.responses.JSONResponse(
-            status_code=400,
-            content={
-                "detail": f"File rejected due to virus detection: {e.threat_name}",
-                "error_type": "virus_detected",
-                "threat_name": e.threat_name,
-            },
-        )
-    except backend.server.v2.store.exceptions.VirusScanError as e:
-        logger.error(f"Virus scanning failed: {str(e)}")
-        return fastapi.responses.JSONResponse(
-            status_code=503,
-            content={
-                "detail": "Virus scanning service unavailable. Please try again later.",
-                "error_type": "virus_scan_failed",
-            },
-        )
-    except Exception:
-        logger.exception("Exception occurred whilst uploading submission media")
-        return fastapi.responses.JSONResponse(
-            status_code=500,
-            content={"detail": "An error occurred while uploading the media file"},
-        )
+    media_url = await backend.server.v2.store.media.upload_media(
+        user_id=user_id, file=file
+    )
+    return media_url
 
 
 @router.post(
@@ -679,44 +541,35 @@ async def generate_image(
     Returns:
         JSONResponse: JSON containing the URL of the generated image
     """
-    try:
-        agent = await backend.data.graph.get_graph(agent_id, user_id=user_id)
+    agent = await backend.data.graph.get_graph(agent_id, user_id=user_id)
 
-        if not agent:
-            raise fastapi.HTTPException(
-                status_code=404, detail=f"Agent with ID {agent_id} not found"
-            )
-        # Use .jpeg here since we are generating JPEG images
-        filename = f"agent_{agent_id}.jpeg"
+    if not agent:
+        raise fastapi.HTTPException(
+            status_code=404, detail=f"Agent with ID {agent_id} not found"
+        )
+    # Use .jpeg here since we are generating JPEG images
+    filename = f"agent_{agent_id}.jpeg"
 
-        existing_url = await backend.server.v2.store.media.check_media_exists(
-            user_id, filename
-        )
-        if existing_url:
-            logger.info(f"Using existing image for agent {agent_id}")
-            return fastapi.responses.JSONResponse(content={"image_url": existing_url})
-        # Generate agent image as JPEG
-        image = await backend.server.v2.store.image_gen.generate_agent_image(
-            agent=agent
-        )
+    existing_url = await backend.server.v2.store.media.check_media_exists(
+        user_id, filename
+    )
+    if existing_url:
+        logger.info(f"Using existing image for agent {agent_id}")
+        return fastapi.responses.JSONResponse(content={"image_url": existing_url})
+    # Generate agent image as JPEG
+    image = await backend.server.v2.store.image_gen.generate_agent_image(agent=agent)
 
-        # Create UploadFile with the correct filename and content_type
-        image_file = fastapi.UploadFile(
-            file=image,
-            filename=filename,
-        )
+    # Create UploadFile with the correct filename and content_type
+    image_file = fastapi.UploadFile(
+        file=image,
+        filename=filename,
+    )
 
-        image_url = await backend.server.v2.store.media.upload_media(
-            user_id=user_id, file=image_file, use_file_name=True
-        )
+    image_url = await backend.server.v2.store.media.upload_media(
+        user_id=user_id, file=image_file, use_file_name=True
+    )
 
-        return fastapi.responses.JSONResponse(content={"image_url": image_url})
-    except Exception:
-        logger.exception("Exception occurred whilst generating submission image")
-        return fastapi.responses.JSONResponse(
-            status_code=500,
-            content={"detail": "An error occurred while generating the image"},
-        )
+    return fastapi.responses.JSONResponse(content={"image_url": image_url})
 
 
 @router.get(
@@ -63,9 +63,9 @@ def initialize_launchdarkly() -> None:
     config = Config(sdk_key)
     ldclient.set_config(config)
 
-    global _is_initialized
-    _is_initialized = True
     if ldclient.get().is_initialized():
+        global _is_initialized
+        _is_initialized = True
         logger.info("LaunchDarkly client initialized successfully")
     else:
         logger.error("LaunchDarkly client failed to initialize")

@@ -218,7 +218,8 @@ def feature_flag(
 
         if not get_client().is_initialized():
             logger.warning(
-                f"LaunchDarkly not initialized, using default={default}"
+                "LaunchDarkly not initialized, "
+                f"using default {flag_key}={repr(default)}"
             )
             is_enabled = default
         else:

@@ -232,8 +233,9 @@ def feature_flag(
                 else:
                     # Log warning and use default for non-boolean values
                     logger.warning(
-                        f"Feature flag {flag_key} returned non-boolean value: {flag_value} (type: {type(flag_value).__name__}). "
-                        f"Using default={default}"
+                        f"Feature flag {flag_key} returned non-boolean value: "
+                        f"{repr(flag_value)} (type: {type(flag_value).__name__}). "
+                        f"Using default value {repr(default)}"
                     )
                     is_enabled = default
@@ -8,10 +8,7 @@ settings = Settings()
 def configure_logging():
     import autogpt_libs.logging.config
 
-    if (
-        settings.config.behave_as == BehaveAs.LOCAL
-        or settings.config.app_env == AppEnvironment.LOCAL
-    ):
+    if not is_structured_logging_enabled():
         autogpt_libs.logging.config.configure_logging(force_cloud_logging=False)
     else:
         autogpt_libs.logging.config.configure_logging(force_cloud_logging=True)

@@ -20,6 +17,14 @@ def configure_logging():
     logging.getLogger("httpx").setLevel(logging.WARNING)
 
 
+def is_structured_logging_enabled() -> bool:
+    """Check if structured logging (cloud logging) is enabled."""
+    return not (
+        settings.config.behave_as == BehaveAs.LOCAL
+        or settings.config.app_env == AppEnvironment.LOCAL
+    )
+
+
 class TruncatedLogger:
     def __init__(
         self,
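The refactor above extracts the LOCAL check into a single `is_structured_logging_enabled()` predicate that `configure_logging` and other callers can share. A tiny runnable sketch of the inverted predicate, with simplified stand-in enums (`BehaveAs`/`AppEnvironment` members here are assumptions, and the real function reads the values from `Settings` rather than taking parameters):

```python
from enum import Enum


class BehaveAs(Enum):
    LOCAL = "local"
    CLOUD = "cloud"


class AppEnvironment(Enum):
    LOCAL = "local"
    PROD = "prod"


def is_structured_logging_enabled(behave_as: BehaveAs, app_env: AppEnvironment) -> bool:
    """Structured (cloud) logging is on only when neither setting is LOCAL."""
    return not (behave_as == BehaveAs.LOCAL or app_env == AppEnvironment.LOCAL)


print(is_structured_logging_enabled(BehaveAs.CLOUD, AppEnvironment.PROD))  # True
```

Inverting the condition keeps `configure_logging` readable: `if not is_structured_logging_enabled(): ...local... else: ...cloud...` mirrors the old branch order.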
@@ -3,15 +3,17 @@ from enum import Enum
 
 import sentry_sdk
 from pydantic import SecretStr
+from sentry_sdk.integrations import DidNotEnable
 from sentry_sdk.integrations.anthropic import AnthropicIntegration
 from sentry_sdk.integrations.asyncio import AsyncioIntegration
 from sentry_sdk.integrations.launchdarkly import LaunchDarklyIntegration
 from sentry_sdk.integrations.logging import LoggingIntegration
 
-from backend.util.feature_flag import get_client, is_configured
+from backend.util import feature_flag
 from backend.util.settings import Settings
 
 settings = Settings()
+logger = logging.getLogger(__name__)
 
 
 class DiscordChannel(str, Enum):

@@ -22,8 +24,11 @@ class DiscordChannel(str, Enum):
 def sentry_init():
     sentry_dsn = settings.secrets.sentry_dsn
     integrations = []
-    if is_configured():
-        integrations.append(LaunchDarklyIntegration(get_client()))
+    if feature_flag.is_configured():
+        try:
+            integrations.append(LaunchDarklyIntegration(feature_flag.get_client()))
+        except DidNotEnable as e:
+            logger.error(f"Error enabling LaunchDarklyIntegration for Sentry: {e}")
     sentry_sdk.init(
         dsn=sentry_dsn,
         traces_sample_rate=1.0,
@@ -3,6 +3,7 @@
 import bleach
 from bleach.css_sanitizer import CSSSanitizer
 from jinja2 import BaseLoader
+from jinja2.exceptions import TemplateError
 from jinja2.sandbox import SandboxedEnvironment
 from markupsafe import Markup
 

@@ -101,8 +102,11 @@ class TextFormatter:
 
     def format_string(self, template_str: str, values=None, **kwargs) -> str:
         """Regular template rendering with escaping"""
-        template = self.env.from_string(template_str)
-        return template.render(values or {}, **kwargs)
+        try:
+            template = self.env.from_string(template_str)
+            return template.render(values or {}, **kwargs)
+        except TemplateError as e:
+            raise ValueError(e) from e
 
     def format_email(
         self,
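The `format_string` hunk above wraps Jinja2's `TemplateError` in `ValueError`, so callers handle one plain exception type instead of importing Jinja2 internals. A minimal runnable sketch of the pattern (class and method names mirror the diff; `autoescape=True` is an assumption standing in for "rendering with escaping", and the real class carries more configuration):

```python
from jinja2 import BaseLoader
from jinja2.exceptions import TemplateError
from jinja2.sandbox import SandboxedEnvironment


class TextFormatter:
    def __init__(self):
        # Sandboxed environment blocks unsafe attribute access from templates
        self.env = SandboxedEnvironment(loader=BaseLoader(), autoescape=True)

    def format_string(self, template_str: str, values=None, **kwargs) -> str:
        """Regular template rendering with escaping."""
        try:
            template = self.env.from_string(template_str)
            return template.render(values or {}, **kwargs)
        except TemplateError as e:
            # Surface template syntax/runtime errors as a plain ValueError
            raise ValueError(e) from e


formatter = TextFormatter()
print(formatter.format_string("Hello {{ name }}!", {"name": "world"}))  # Hello world!
```

A malformed template such as `"{% bad"` now raises `ValueError` rather than a Jinja2-specific `TemplateSyntaxError`, which keeps Jinja2 types out of calling code.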
@@ -0,0 +1,100 @@
-- AlterTable
ALTER TABLE "StoreListingVersion" ADD COLUMN "search" tsvector DEFAULT ''::tsvector;

-- Add trigger to update the search column with the tsvector of the agent
-- Function to be invoked by trigger

-- Drop the trigger first
DROP TRIGGER IF EXISTS "update_tsvector" ON "StoreListingVersion";

-- Drop the function completely
DROP FUNCTION IF EXISTS update_tsvector_column();

-- Now recreate it fresh
CREATE OR REPLACE FUNCTION update_tsvector_column() RETURNS TRIGGER AS $$
BEGIN
    NEW.search := to_tsvector('english',
        COALESCE(NEW.name, '') || ' ' ||
        COALESCE(NEW.description, '') || ' ' ||
        COALESCE(NEW."subHeading", '')
    );
    RETURN NEW;
END;
$$ LANGUAGE plpgsql SECURITY DEFINER SET search_path = platform, pg_temp;

-- Recreate the trigger
CREATE TRIGGER "update_tsvector"
BEFORE INSERT OR UPDATE ON "StoreListingVersion"
FOR EACH ROW
EXECUTE FUNCTION update_tsvector_column();

UPDATE "StoreListingVersion"
SET search = to_tsvector('english',
    COALESCE(name, '') || ' ' ||
    COALESCE(description, '') || ' ' ||
    COALESCE("subHeading", '')
)
WHERE search IS NULL;

-- Drop and recreate the StoreAgent view with isAvailable field
DROP VIEW IF EXISTS "StoreAgent";

CREATE OR REPLACE VIEW "StoreAgent" AS
WITH latest_versions AS (
    SELECT
        "storeListingId",
        MAX(version) AS max_version
    FROM "StoreListingVersion"
    WHERE "submissionStatus" = 'APPROVED'
    GROUP BY "storeListingId"
),
agent_versions AS (
    SELECT
        "storeListingId",
        array_agg(DISTINCT version::text ORDER BY version::text) AS versions
    FROM "StoreListingVersion"
    WHERE "submissionStatus" = 'APPROVED'
    GROUP BY "storeListingId"
)
SELECT
    sl.id AS listing_id,
    slv.id AS "storeListingVersionId",
    slv."createdAt" AS updated_at,
    sl.slug,
    COALESCE(slv.name, '') AS agent_name,
    slv."videoUrl" AS agent_video,
    COALESCE(slv."imageUrls", ARRAY[]::text[]) AS agent_image,
    slv."isFeatured" AS featured,
    p.username AS creator_username, -- Allow NULL for malformed sub-agents
    p."avatarUrl" AS creator_avatar, -- Allow NULL for malformed sub-agents
    slv."subHeading" AS sub_heading,
    slv.description,
    slv.categories,
    slv.search,
    COALESCE(ar.run_count, 0::bigint) AS runs,
    COALESCE(rs.avg_rating, 0.0)::double precision AS rating,
    COALESCE(av.versions, ARRAY[slv.version::text]) AS versions,
    COALESCE(sl."useForOnboarding", false) AS "useForOnboarding",
    slv."isAvailable" AS is_available -- Add isAvailable field to filter sub-agents
FROM "StoreListing" sl
JOIN latest_versions lv
    ON sl.id = lv."storeListingId"
JOIN "StoreListingVersion" slv
    ON slv."storeListingId" = lv."storeListingId"
    AND slv.version = lv.max_version
    AND slv."submissionStatus" = 'APPROVED'
JOIN "AgentGraph" a
    ON slv."agentGraphId" = a.id
    AND slv."agentGraphVersion" = a.version
LEFT JOIN "Profile" p
    ON sl."owningUserId" = p."userId"
LEFT JOIN "mv_review_stats" rs
    ON sl.id = rs."storeListingId"
LEFT JOIN "mv_agent_run_counts" ar
    ON a.id = ar."agentGraphId"
LEFT JOIN agent_versions av
    ON sl.id = av."storeListingId"
WHERE sl."isDeleted" = false
    AND sl."hasApprovedVersion" = true;

COMMIT;
@@ -5,10 +5,11 @@ datasource db {
 }
 
 generator client {
-  provider             = "prisma-client-py"
-  recursive_type_depth = -1
-  interface            = "asyncio"
-  previewFeatures      = ["views"]
+  provider               = "prisma-client-py"
+  recursive_type_depth   = -1
+  interface              = "asyncio"
+  previewFeatures        = ["views", "fullTextSearch"]
+  partial_type_generator = "backend/data/partial_types.py"
 }
 
 // User model to mirror Auth provider users

@@ -664,6 +665,7 @@ view StoreAgent {
   sub_heading String
   description String
   categories  String[]
+  search      Unsupported("tsvector")? @default(dbgenerated("''::tsvector"))
   runs        Int
   rating      Float
   versions    String[]

@@ -747,7 +749,7 @@ model StoreListing {
   slug String
 
   // Allow this agent to be used during onboarding
-  useForOnboarding Boolean @default(false)
+  useForOnboarding Boolean @default(false)
 
   // The currently active version that should be shown to users
   activeVersionId String? @unique

@@ -798,6 +800,8 @@ model StoreListingVersion {
   // Old versions can be made unavailable by the author if desired
   isAvailable Boolean @default(true)
 
+  search Unsupported("tsvector")? @default(dbgenerated("''::tsvector"))
+
   // Version workflow state
   submissionStatus SubmissionStatus @default(DRAFT)
   submittedAt      DateTime?
140 autogpt_platform/backend/test/blocks/test_youtube.py Normal file
@@ -0,0 +1,140 @@
from unittest.mock import Mock, patch

import pytest
from youtube_transcript_api._errors import NoTranscriptFound
from youtube_transcript_api._transcripts import FetchedTranscript, Transcript

from backend.blocks.youtube import TranscribeYoutubeVideoBlock


class TestTranscribeYoutubeVideoBlock:
    """Test cases for TranscribeYoutubeVideoBlock language fallback functionality."""

    def setup_method(self):
        """Set up test fixtures."""
        self.youtube_block = TranscribeYoutubeVideoBlock()

    def test_extract_video_id_standard_url(self):
        """Test extracting video ID from standard YouTube URL."""
        url = "https://www.youtube.com/watch?v=dQw4w9WgXcQ"
        video_id = self.youtube_block.extract_video_id(url)
        assert video_id == "dQw4w9WgXcQ"

    def test_extract_video_id_short_url(self):
        """Test extracting video ID from shortened youtu.be URL."""
        url = "https://youtu.be/dQw4w9WgXcQ"
        video_id = self.youtube_block.extract_video_id(url)
        assert video_id == "dQw4w9WgXcQ"

    def test_extract_video_id_embed_url(self):
        """Test extracting video ID from embed URL."""
        url = "https://www.youtube.com/embed/dQw4w9WgXcQ"
        video_id = self.youtube_block.extract_video_id(url)
        assert video_id == "dQw4w9WgXcQ"

    @patch("backend.blocks.youtube.YouTubeTranscriptApi")
    def test_get_transcript_english_available(self, mock_api_class):
        """Test getting transcript when English is available."""
        # Setup mock
        mock_api = Mock()
        mock_api_class.return_value = mock_api
        mock_transcript = Mock(spec=FetchedTranscript)
        mock_api.fetch.return_value = mock_transcript

        # Execute
        result = TranscribeYoutubeVideoBlock.get_transcript("test_video_id")

        # Assert
        assert result == mock_transcript
        mock_api.fetch.assert_called_once_with(video_id="test_video_id")
        mock_api.list.assert_not_called()

    @patch("backend.blocks.youtube.YouTubeTranscriptApi")
    def test_get_transcript_fallback_to_first_available(self, mock_api_class):
        """Test fallback to first available language when English is not available."""
        # Setup mock
        mock_api = Mock()
        mock_api_class.return_value = mock_api

        # Create mock transcript list with Hungarian transcript
        mock_transcript_list = Mock()
        mock_transcript_hu = Mock(spec=Transcript)
        mock_fetched_transcript = Mock(spec=FetchedTranscript)
        mock_transcript_hu.fetch.return_value = mock_fetched_transcript

        # Set up the transcript list to have manually created transcripts empty
        # and generated transcripts with Hungarian
        mock_transcript_list._manually_created_transcripts = {}
        mock_transcript_list._generated_transcripts = {"hu": mock_transcript_hu}

        # Mock API to raise NoTranscriptFound for English, then return list
        mock_api.fetch.side_effect = NoTranscriptFound(
            "test_video_id", ("en",), mock_transcript_list
        )
        mock_api.list.return_value = mock_transcript_list

        # Execute
        result = TranscribeYoutubeVideoBlock.get_transcript("test_video_id")

        # Assert
        assert result == mock_fetched_transcript
        mock_api.fetch.assert_called_once_with(video_id="test_video_id")
        mock_api.list.assert_called_once_with("test_video_id")
        mock_transcript_hu.fetch.assert_called_once()

    @patch("backend.blocks.youtube.YouTubeTranscriptApi")
    def test_get_transcript_prefers_manually_created(self, mock_api_class):
        """Test that manually created transcripts are preferred over generated ones."""
        # Setup mock
        mock_api = Mock()
        mock_api_class.return_value = mock_api

        # Create mock transcript list with both manual and generated transcripts
        mock_transcript_list = Mock()
        mock_transcript_manual = Mock(spec=Transcript)
        mock_transcript_generated = Mock(spec=Transcript)
        mock_fetched_manual = Mock(spec=FetchedTranscript)
        mock_transcript_manual.fetch.return_value = mock_fetched_manual

        # Set up the transcript list
        mock_transcript_list._manually_created_transcripts = {
            "es": mock_transcript_manual
        }
        mock_transcript_list._generated_transcripts = {"hu": mock_transcript_generated}

        # Mock API to raise NoTranscriptFound for English
        mock_api.fetch.side_effect = NoTranscriptFound(
            "test_video_id", ("en",), mock_transcript_list
        )
        mock_api.list.return_value = mock_transcript_list

        # Execute
        result = TranscribeYoutubeVideoBlock.get_transcript("test_video_id")

        # Assert - should use manually created transcript first
        assert result == mock_fetched_manual
        mock_transcript_manual.fetch.assert_called_once()
        mock_transcript_generated.fetch.assert_not_called()

    @patch("backend.blocks.youtube.YouTubeTranscriptApi")
    def test_get_transcript_no_transcripts_available(self, mock_api_class):
        """Test that exception is re-raised when no transcripts are available at all."""
        # Setup mock
        mock_api = Mock()
        mock_api_class.return_value = mock_api

        # Create mock transcript list with no transcripts
        mock_transcript_list = Mock()
        mock_transcript_list._manually_created_transcripts = {}
        mock_transcript_list._generated_transcripts = {}

        # Mock API to raise NoTranscriptFound
        original_exception = NoTranscriptFound(
            "test_video_id", ("en",), mock_transcript_list
        )
        mock_api.fetch.side_effect = original_exception
        mock_api.list.return_value = mock_transcript_list

        # Execute and assert exception is raised
        with pytest.raises(NoTranscriptFound):
            TranscribeYoutubeVideoBlock.get_transcript("test_video_id")
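The tests above pin down a specific fallback order in `get_transcript`: fetch English directly; on `NoTranscriptFound`, prefer manually created transcripts over auto-generated ones; if neither pool has an entry, re-raise. A self-contained sketch of that control flow (`FakeApi`, `FakeTranscript`, and the local `NoTranscriptFound` are hypothetical stand-ins for illustration; the real block uses `youtube_transcript_api` and does not take an `api` argument):

```python
class NoTranscriptFound(Exception):
    """Stand-in for youtube_transcript_api._errors.NoTranscriptFound."""


class FakeTranscript:
    def __init__(self, text):
        self.text = text

    def fetch(self):
        return self.text


class FakeApi:
    def __init__(self, english=None, manual=None, generated=None):
        self._english = english
        self._manually_created_transcripts = manual or {}
        self._generated_transcripts = generated or {}

    def fetch(self, video_id):
        if self._english is None:
            raise NoTranscriptFound(video_id)
        return self._english

    def list(self, video_id):
        return self


def get_transcript(api, video_id):
    """English first; else manually created, then generated; else re-raise."""
    try:
        return api.fetch(video_id=video_id)
    except NoTranscriptFound:
        listing = api.list(video_id)
        for pool in (
            listing._manually_created_transcripts,  # manual transcripts win
            listing._generated_transcripts,
        ):
            for transcript in pool.values():
                return transcript.fetch()
        raise  # nothing available at all


api = FakeApi(manual={"es": FakeTranscript("manual-es")},
              generated={"hu": FakeTranscript("gen-hu")})
print(get_transcript(api, "video"))  # manual-es
```

The bare `raise` at the end matches `test_get_transcript_no_transcripts_available`: when both pools are empty, the original `NoTranscriptFound` propagates unchanged.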
@@ -21,6 +21,7 @@ import random
 from datetime import datetime
 
 import prisma.enums
+import pytest
 from autogpt_libs.api_key.keysmith import APIKeySmith
 from faker import Faker
 from prisma import Json, Prisma

@@ -498,9 +499,6 @@ async def main():
                         if store_listing_versions and random.random() < 0.5
                         else None
                     ),
-                    "agentInput": (
-                        Json({"test": "data"}) if random.random() < 0.3 else None
-                    ),
                     "onboardingAgentExecutionId": (
                         random.choice(agent_graph_executions).id
                         if agent_graph_executions and random.random() < 0.3

@@ -570,5 +568,11 @@ async def main():
     print("Test data creation completed successfully!")
 
 
+@pytest.mark.asyncio
+@pytest.mark.integration
+async def test_main_function_runs_without_errors():
+    await main()
+
+
 if __name__ == "__main__":
     asyncio.run(main())
765 autogpt_platform/frontend/CONTRIBUTING.md Normal file
@@ -0,0 +1,765 @@
|
||||
<div align="center">
|
||||
<h1>AutoGPT Frontend • Contributing ⌨️</h1>
|
||||
<p>Next.js App Router • Client-first • Type-safe generated API hooks • Tailwind + shadcn/ui</p>
|
||||
</div>
|
||||
|
||||
---
|
||||
|
||||
## ☕️ Summary
|
||||
|
||||
This document is your reference for contributing to the AutoGPT Frontend. It adapts legacy guidelines to our current stack and practices.
|
||||
|
||||
- Architecture and stack
|
||||
- Component structure and design system
|
||||
- Data fetching (generated API hooks)
|
||||
- Feature flags
|
||||
- Naming and code conventions
|
||||
- Tooling, scripts, and testing
|
||||
- PR process and checklist
|
||||
|
||||
This is a living document. Open a pull request any time to improve it.
|
||||
|
||||
---
|
||||
|
||||
## 🚀 Quick Start FAQ
|
||||
|
||||
New to the codebase? Here are shortcuts to common tasks:
|
||||
|
||||
### I need to make a new page
|
||||
|
||||
1. Create page in `src/app/(platform)/your-feature/page.tsx`
|
||||
2. If it has logic, create `usePage.ts` hook next to it
|
||||
3. Create sub-components in `components/` folder
|
||||
4. Use generated API hooks for data fetching
|
||||
5. If page needs auth, ensure it's in the `(platform)` route group
|
||||
|
||||
**Example structure:**
|
||||
|
||||
```
|
||||
app/(platform)/dashboard/
|
||||
page.tsx
|
||||
useDashboardPage.ts
|
||||
components/
|
||||
StatsPanel/
|
||||
StatsPanel.tsx
|
||||
useStatsPanel.ts
|
||||
```
|
||||
|
||||
See [Component structure](#-component-structure) and [Styling](#-styling) and [Data fetching patterns](#-data-fetching-patterns) sections.
|
||||
|
||||
### I need to update an existing component in a page
|
||||
|
||||
1. Find the page `src/app/(platform)/your-feature/page.tsx`
|
||||
2. Check its `components/` folder
|
||||
3. If needing to update its logic, check the `use[Component].ts` hook
|
||||
4. If the update is related to rendering, check `[Component].tsx` file
|
||||
|
||||
See [Component structure](#-component-structure) and [Styling](#-styling) sections.
|
||||
|
||||
### I need to make a new API call and show it on the UI
|
||||
|
||||
1. Ensure the backend endpoint exists in the OpenAPI spec
|
||||
2. Regenerate API client: `pnpm generate:api`
|
||||
3. Import the generated hook by typing the operation name (auto-import)
|
||||
4. Use the hook in your component/custom hook
|
||||
5. Handle loading, error, and success states
|
||||
|
||||
**Example:**
|
||||
|
||||
```tsx
|
||||
import { useGetV2ListLibraryAgents } from "@/app/api/__generated__/endpoints/library/library";
|
||||
|
||||
export function useAgentList() {
|
||||
const { data, isLoading, isError, error } = useGetV2ListLibraryAgents();
|
||||
|
||||
return {
|
||||
agents: data?.data || [],
|
||||
isLoading,
|
||||
isError,
|
||||
error,
|
||||
};
|
||||
}
|
||||
```
|
||||
|
||||
See [Data fetching patterns](#-data-fetching-patterns) for more examples.
|
||||
|
||||
### I need to create a new component in the Design System
|
||||
|
||||
1. Determine the atomic level: atom, molecule, or organism
|
||||
2. Create folder: `src/components/[level]/ComponentName/`
|
||||
3. Create `ComponentName.tsx` (render logic)
|
||||
4. If logic exists, create `useComponentName.ts`
|
||||
5. Create `ComponentName.stories.tsx` for Storybook
|
||||
6. Use Tailwind + design tokens (avoid hardcoded values)
|
||||
7. Only use Phosphor icons
|
||||
8. Test in Storybook: `pnpm storybook`
|
||||
9. Verify in Chromatic after PR
|
||||
|
||||
**Example structure:**
|
||||
|
||||
```
|
||||
src/components/molecules/DataCard/
|
||||
DataCard.tsx
|
||||
DataCard.stories.tsx
|
||||
useDataCard.ts
|
||||
```
|
||||
|
||||
See [Component structure](#-component-structure) and [Styling](#-styling) sections.
|
||||
|
||||
---
|
||||
|
||||
## 📟 Contribution process
|
||||
|
||||
### 1) Branch off `dev`
|
- Branch from `dev` for features and fixes
- Keep PRs focused (aim for one ticket per PR)
- Use conventional commit messages with a scope (e.g., `feat(frontend): add X`)

### 2) Feature flags

If a feature will ship across multiple PRs, guard it with a flag so we can merge iteratively.

- Use [LaunchDarkly](https://www.launchdarkly.com) based flags (see Feature Flags below)
- Avoid long-lived feature branches

### 3) Open PR and get reviews ✅

Before requesting review:

- [x] Code follows architecture and conventions here
- [x] `pnpm format && pnpm lint && pnpm types` pass
- [x] Relevant tests pass locally: `pnpm test` (and/or Storybook tests)
- [x] If touching UI, validate against our design system and stories

### 4) Merge to `dev`

- Use squash merges
- Follow conventional commit message format for the squash title

---

## 📂 Architecture & Stack

### Next.js App Router

- We use the [Next.js App Router](https://nextjs.org/docs/app) in `src/app`
- Use [route segments](https://nextjs.org/docs/app/building-your-application/routing) with semantic URLs; no `pages/`

### Component good practices

- Default to client components
- Use server components only when:
  - SEO requires server-rendered HTML, or
  - Extreme first-byte performance justifies it
- If you render server-side data, prefer server-side prefetch + client hydration (see examples below and [React Query SSR & Hydration](https://tanstack.com/query/latest/docs/framework/react/guides/ssr))
- Prefer using [Next.js API routes](https://nextjs.org/docs/pages/building-your-application/routing/api-routes) when possible over [server actions](https://nextjs.org/docs/14/app/building-your-application/data-fetching/server-actions-and-mutations)
- Keep components small and simple
  - favour composition and splitting large components into smaller bits of UI
  - [colocate state](https://kentcdodds.com/blog/state-colocation-will-make-your-react-app-faster) when possible
  - keep render/side-effects split for [separation of concerns](https://en.wikipedia.org/wiki/Separation_of_concerns)
  - do not over-complicate or re-invent the wheel

**❓ Why a client-side first design vs server components/actions?**

While server components and actions are cool and cutting-edge, they introduce a layer of complexity that is not always justified by the benefits they deliver. Defaulting to client-first keeps the developer's mental model simple, especially for developers less familiar with Next.js or heavy front-end development.

### Data fetching: prefer generated API hooks

- We generate a type-safe client and React Query hooks from the backend OpenAPI spec via [Orval](https://orval.dev/)
- Prefer the generated hooks under `src/app/api/__generated__/endpoints/...`
- Treat `BackendAPI` and code under `src/lib/autogpt-server-api/*` as deprecated; do not introduce new usages
- Use [Zod](https://zod.dev/) schemas from the generated client where applicable

### State management

- Prefer [React Query](https://tanstack.com/query/latest/docs/framework/react/overview) for server state, colocated near consumers (see [state colocation](https://kentcdodds.com/blog/state-colocation-will-make-your-react-app-faster))
- Co-locate UI state inside components/hooks; keep global state minimal

### Styling and components

- [Tailwind CSS](https://tailwindcss.com/docs) + [shadcn/ui](https://ui.shadcn.com/) ([Radix Primitives](https://www.radix-ui.com/docs/primitives/overview/introduction) under the hood)
- Use the design system under `src/components` for primitives and building blocks
- Do not use anything under `src/components/_legacy__`; migrate away from it when touching old code
- Reference the design system catalog on Chromatic: [`https://dev--670f94474adee5e32c896b98.chromatic.com/`](https://dev--670f94474adee5e32c896b98.chromatic.com/)
- Use the [`tailwind-scrollbar`](https://www.npmjs.com/package/tailwind-scrollbar) plugin utilities for scrollbar styling

---

## 🧱 Component structure

For components, separate render logic from data/behavior, and keep implementation details local.

**Most components should follow this structure.** Pages are just bigger components made of smaller ones, and sub-components can have their own nested sub-components when dealing with complex features.

### Basic structure

When a component has non-trivial logic:

```
FeatureX/
  FeatureX.tsx    (render logic only)
  useFeatureX.ts  (hook; data fetching, behavior, state)
  helpers.ts      (pure helpers used by the hook)
  components/     (optional, subcomponents local to FeatureX)
```

### Example: Page with nested components

```
// Page composition
app/(platform)/dashboard/
  page.tsx
  useDashboardPage.ts
  components/              # (Sub-components the dashboard page is made of)
    StatsPanel/
      StatsPanel.tsx
      useStatsPanel.ts
      helpers.ts
      components/          # (Sub-components belonging to StatsPanel)
        StatCard/
          StatCard.tsx
    ActivityFeed/
      ActivityFeed.tsx
      useActivityFeed.ts
```

### Guidelines

- Prefer function declarations for components and handlers
- Only use arrow functions for small inline lambdas (e.g., in `map`)
- Avoid barrel files and `index.ts` re-exports
- Keep component files focused and readable; push complex logic to `helpers.ts`
- Abstract reusable, cross-feature logic into `src/services/` or `src/lib/utils.ts` as appropriate
- Build components encapsulated so they can be easily reused and abstracted elsewhere
- Nest sub-components within a `components/` folder when they're local to the parent feature

### Exceptions

When to simplify the structure:

**Small hook logic (3-4 lines)**

If the hook logic is minimal, keep it inline with the render function:

```tsx
import { useState } from "react";

export function ActivityAlert() {
  const [isVisible, setIsVisible] = useState(true);
  if (!isVisible) return null;

  return (
    <Alert onClose={() => setIsVisible(false)}>New activity detected</Alert>
  );
}
```

**Render-only components**

Components with no hook logic can be direct files in `components/` without a folder:

```
components/
  ActivityAlert.tsx  (render-only, no folder needed)
  StatsPanel/        (has hook logic, needs folder)
    StatsPanel.tsx
    useStatsPanel.ts
```

### Hook file structure

When separating logic into a custom hook:

```tsx
// useStatsPanel.ts
export function useStatsPanel() {
  const [data, setData] = useState<Stats[]>([]);
  const [isLoading, setIsLoading] = useState(true);

  useEffect(() => {
    // Clear the loading state once the initial fetch settles
    fetchStats()
      .then(setData)
      .finally(() => setIsLoading(false));
  }, []);

  return {
    data,
    isLoading,
    refresh: () => fetchStats().then(setData),
  };
}
```

Rules:

- **Always return an object** that exposes data and methods to the view
- **Export a single function** named after the component (e.g., `useStatsPanel` for `StatsPanel.tsx`)
- **Abstract into helpers.ts** when hook logic grows large, so the hook file remains readable by scanning without diving into implementation details

---

## 🔄 Data fetching patterns

All API hooks are generated from the backend OpenAPI specification using [Orval](https://orval.dev/). The hooks are type-safe and follow the operation names defined in the backend API.

### How to discover hooks

Most of the time you can rely on auto-import by typing the endpoint or operation name. Your IDE will suggest the generated hooks based on the OpenAPI operation IDs.

**Examples of hook naming patterns:**

- `GET /api/v1/notifications` → `useGetV1GetNotificationPreferences`
- `POST /api/v2/store/agents` → `usePostV2CreateStoreAgent`
- `DELETE /api/v2/store/submissions/{id}` → `useDeleteV2DeleteStoreSubmission`
- `GET /api/v2/library/agents` → `useGetV2ListLibraryAgents`

**Pattern**: `use{Method}{Version}{OperationName}`
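As a rough sketch of the pattern (illustrative only — the real hooks are generated by Orval, and no such helper exists in the codebase), the hook name is the capitalized method, version, and operation name joined after `use`:

```typescript
// Illustrative-only sketch of how a generated hook name is composed.
function hookName(method: string, version: string, operation: string): string {
  const cap = (s: string) => s.charAt(0).toUpperCase() + s.slice(1);
  return `use${cap(method)}${cap(version)}${cap(operation)}`;
}

hookName("get", "v2", "listLibraryAgents"); // → "useGetV2ListLibraryAgents"
```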
You can also explore the generated hooks by browsing `src/app/api/__generated__/endpoints/` which is organized by API tags (e.g., `auth`, `store`, `library`).

**OpenAPI specs:**

- Production: [https://backend.agpt.co/openapi.json](https://backend.agpt.co/openapi.json)
- Staging: [https://dev-server.agpt.co/openapi.json](https://dev-server.agpt.co/openapi.json)

### Generated hooks (client)

Prefer the generated React Query hooks (via Orval + React Query):

```tsx
import { useGetV1GetNotificationPreferences } from "@/app/api/__generated__/endpoints/auth/auth";

export function PreferencesPanel() {
  const { data, isLoading, isError } = useGetV1GetNotificationPreferences({
    query: {
      select: (res) => res.data,
    },
  });

  if (isLoading) return null;
  if (isError) throw new Error("Failed to load preferences");
  return <pre>{JSON.stringify(data, null, 2)}</pre>;
}
```

### Generated mutations (client)

```tsx
import { useQueryClient } from "@tanstack/react-query";
import {
  useDeleteV2DeleteStoreSubmission,
  getGetV2ListMySubmissionsQueryKey,
} from "@/app/api/__generated__/endpoints/store/store";

export function DeleteSubmissionButton({
  submissionId,
}: {
  submissionId: string;
}) {
  const queryClient = useQueryClient();
  const { mutateAsync: deleteSubmission, isPending } =
    useDeleteV2DeleteStoreSubmission({
      mutation: {
        onSuccess: () => {
          queryClient.invalidateQueries({
            queryKey: getGetV2ListMySubmissionsQueryKey(),
          });
        },
      },
    });

  async function onClick() {
    await deleteSubmission({ submissionId });
  }

  return (
    <button disabled={isPending} onClick={onClick}>
      Delete
    </button>
  );
}
```

### Server-side prefetch + client hydration

Use server-side prefetch to improve TTFB while keeping the component tree client-first (see [React Query SSR & Hydration](https://tanstack.com/query/latest/docs/framework/react/guides/ssr)):

```tsx
// in a server component
import { getQueryClient } from "@/lib/tanstack-query/getQueryClient";
import { HydrationBoundary, dehydrate } from "@tanstack/react-query";
import {
  prefetchGetV2ListStoreAgentsQuery,
  prefetchGetV2ListStoreCreatorsQuery,
} from "@/app/api/__generated__/endpoints/store/store";

export default async function MarketplacePage() {
  const queryClient = getQueryClient();

  await Promise.all([
    prefetchGetV2ListStoreAgentsQuery(queryClient, { featured: true }),
    prefetchGetV2ListStoreAgentsQuery(queryClient, { sorted_by: "runs" }),
    prefetchGetV2ListStoreCreatorsQuery(queryClient, {
      featured: true,
      sorted_by: "num_agents",
    }),
  ]);

  return (
    <HydrationBoundary state={dehydrate(queryClient)}>
      {/* Client component tree goes here */}
    </HydrationBoundary>
  );
}
```

Notes:

- Do not introduce new usages of `BackendAPI` or `src/lib/autogpt-server-api/*`
- Keep transformations and mapping logic close to the consumer (hook), not in the view

---

## ⚠️ Error handling

The app has multiple error handling strategies depending on the type of error:

### Render/runtime errors

Use `<ErrorCard />` to display render or runtime errors gracefully:

```tsx
import { ErrorCard } from "@/components/molecules/ErrorCard";

export function DataPanel() {
  const { data, isLoading, isError, error } = useGetData();

  if (isLoading) return <Skeleton />;
  if (isError) return <ErrorCard error={error} />;

  return <div>{data.content}</div>;
}
```

### API mutation errors

Display mutation errors using toast notifications:

```tsx
import { useToast } from "@/components/ui/use-toast";

export function useUpdateSettings() {
  const { toast } = useToast();
  const { mutateAsync: updateSettings } = useUpdateSettingsMutation({
    mutation: {
      onError: (error) => {
        toast({
          title: "Failed to update settings",
          description: error.message,
          variant: "destructive",
        });
      },
    },
  });

  return { updateSettings };
}
```

### Manual Sentry capture

When needed, you can manually capture exceptions to Sentry:

```tsx
import * as Sentry from "@sentry/nextjs";

try {
  await riskyOperation();
} catch (error) {
  Sentry.captureException(error, {
    tags: { context: "feature-x" },
    extra: { metadata: additionalData },
  });
  throw error;
}
```

### Global error boundaries

The app has error boundaries already configured to:

- Capture uncaught errors globally and send them to Sentry
- Display a user-friendly error UI when something breaks
- Prevent the entire app from crashing

You don't need to wrap components in error boundaries manually unless you need custom error recovery logic.

---

## 🚩 Feature Flags

- Flags are powered by [LaunchDarkly](https://docs.launchdarkly.com/)
- Use the helper APIs under `src/services/feature-flags`

Check a flag in a client component:

```tsx
import { Flag, useGetFlag } from "@/services/feature-flags/use-get-flag";

export function AgentActivityPanel() {
  const enabled = useGetFlag(Flag.AGENT_ACTIVITY);
  if (!enabled) return null;
  return <div>Feature is enabled!</div>;
}
```

Protect a route or page component:

```tsx
import { withFeatureFlag } from "@/services/feature-flags/with-feature-flag";

export const MyFeaturePage = withFeatureFlag(function Page() {
  return <div>My feature page</div>;
}, "my-feature-flag");
```

Local dev and Playwright:

- Set `NEXT_PUBLIC_PW_TEST=true` to use mocked flag values during local development and tests

Adding new flags:

1. Add the flag to the `Flag` enum and `FlagValues` type
2. Provide a mock value in the mock map
3. Configure the flag in LaunchDarkly
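Steps 1 and 2 can be sketched as follows; the flag names and values here are made up for illustration, and the real definitions live under `src/services/feature-flags`:

```typescript
// Hypothetical sketch of adding a new flag (names are illustrative).
enum Flag {
  AGENT_ACTIVITY = "agent-activity",
  MY_NEW_FEATURE = "my-new-feature", // 1) add the new flag key
}

type FlagValues = {
  [Flag.AGENT_ACTIVITY]: boolean;
  [Flag.MY_NEW_FEATURE]: boolean; // 1) extend the value types
};

// 2) provide a mock value, used when NEXT_PUBLIC_PW_TEST=true
const mockFlags: FlagValues = {
  [Flag.AGENT_ACTIVITY]: true,
  [Flag.MY_NEW_FEATURE]: true,
};
```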
---

## 📙 Naming conventions

General:

- Variables and functions should read like plain English
- Prefer `const` over `let` unless reassignment is required
- Use searchable constants instead of magic numbers
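For example, a minimal sketch of the searchable-constant rule (the constant and helper names here are hypothetical):

```typescript
// A named, searchable constant instead of a bare 64 scattered through the code.
const MAX_AGENT_NAME_LENGTH = 64;

function isValidAgentName(name: string): boolean {
  return name.length > 0 && name.length <= MAX_AGENT_NAME_LENGTH;
}
```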
Files:

- Components and hooks: `PascalCase` for component files, `camelCase` for hooks
- Other files: `kebab-case`
- Do not create barrel files or `index.ts` re-exports

Types:

- Prefer `interface` for object shapes
- Component props should be `interface Props { ... }`
- Use precise types; avoid `any` and unsafe casts

Parameters:

- If more than one parameter is needed, pass a single `Args` object for clarity
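A small sketch of the single-`Args`-object convention (the interface and function names here are hypothetical):

```typescript
// Named fields keep call sites self-documenting, unlike positional booleans.
interface CreateLabelArgs {
  name: string;
  isPublic: boolean;
}

function createAgentLabel({ name, isPublic }: CreateLabelArgs): string {
  return `${name} (${isPublic ? "public" : "private"})`;
}

createAgentLabel({ name: "Summarizer", isPublic: true }); // → "Summarizer (public)"
```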
Comments:

- Keep comments minimal; code should be clear by itself
- Only document non-obvious intent, invariants, or caveats

Functions:

- Prefer function declarations for components and handlers
- Only use arrow functions for small inline callbacks

Control flow:

- Use early returns to reduce nesting
- Avoid catching errors unless you handle them meaningfully
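The early-return rule can be sketched like this (the `Run` shape and helper are hypothetical): each guard exits immediately, so the happy path stays unindented instead of accumulating nested `if/else` blocks.

```typescript
interface Run {
  status: "queued" | "running" | "completed";
  durationMs?: number;
}

// Each guard returns early; no nesting needed for the remaining cases.
function describeRun(run: Run | null): string {
  if (!run) return "No run yet";
  if (run.status === "queued") return "Waiting to start";
  if (run.status === "running") return "Run in progress";
  return `Completed in ${run.durationMs ?? 0}ms`;
}
```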
---

## 🎨 Styling

- Use Tailwind utilities; prefer semantic, composable class names
- Use shadcn/ui components as building blocks when available
- Use the `tailwind-scrollbar` utilities for scrollbar styling
- Keep responsive and dark-mode behavior consistent with the design system

Additional requirements:

- Do not import shadcn primitives directly in feature code; only use components exposed in our design system under `src/components`. shadcn is a low-level skeleton we style on top of and is not meant to be consumed directly.
- Prefer design tokens over Tailwind's default theme whenever possible (e.g., color, spacing, radius, and typography tokens). Avoid hardcoded values and default palette if a token exists.

---

## ⚠️ Errors and ⏳ Loading

- **Errors**: Use the `ErrorCard` component from the design system to display API/HTTP errors and retry actions. Keep error derivation/mapping in hooks; pass the final message to the component.
  - Component: `src/components/molecules/ErrorCard/ErrorCard.tsx`
- **Loading**: Use the `Skeleton` component(s) from the design system for loading states. Favor domain-appropriate skeleton layouts (lists, cards, tables) over spinners.
  - See Storybook examples under Atoms/Skeleton for patterns.

---

## 🧭 Responsive and mobile-first

- Build mobile-first. Ensure new UI looks great from a 375px viewport width (iPhone SE) upwards.
- Validate layouts at common breakpoints (375, 768, 1024, 1280). Prefer stacking and progressive disclosure on small screens.

---

## 🧰 State for complex flows

For components/flows with complex state, multi-step wizards, or cross-component coordination, prefer a small co-located store using [Zustand](https://github.com/pmndrs/zustand).

Guidelines:

- Co-locate the store with the feature (e.g., `FeatureX/store.ts`).
- Expose typed selectors to minimize re-renders.
- Keep effects and API calls in hooks; stores hold state and pure actions.

Example: simple store with selectors

```tsx
import { create } from "zustand";

interface WizardState {
  step: number;
  data: Record<string, unknown>;
  next(): void;
  back(): void;
  setField(args: { key: string; value: unknown }): void;
}

export const useWizardStore = create<WizardState>((set) => ({
  step: 0,
  data: {},
  next() {
    set((state) => ({ step: state.step + 1 }));
  },
  back() {
    set((state) => ({ step: Math.max(0, state.step - 1) }));
  },
  setField({ key, value }) {
    set((state) => ({ data: { ...state.data, [key]: value } }));
  },
}));

// Usage in a component (selectors keep updates scoped)
function WizardFooter() {
  const step = useWizardStore((s) => s.step);
  const next = useWizardStore((s) => s.next);
  const back = useWizardStore((s) => s.back);

  return (
    <div className="flex items-center gap-2">
      <button onClick={back} disabled={step === 0}>
        Back
      </button>
      <button onClick={next}>Next</button>
    </div>
  );
}
```

Example: async action coordinated via hook + store

```ts
// FeatureX/useFeatureX.ts
import { useMutation } from "@tanstack/react-query";
import { useWizardStore } from "./store";

export function useFeatureX() {
  const setField = useWizardStore((s) => s.setField);
  const next = useWizardStore((s) => s.next);

  const { mutateAsync: save, isPending } = useMutation({
    mutationFn: async (payload: unknown) => {
      // call API here
      return payload;
    },
    onSuccess(data) {
      setField({ key: "result", value: data });
      next();
    },
  });

  return { save, isSaving: isPending };
}
```

---

## 🖼 Icons

- Only use Phosphor Icons. Treat all other icon libraries as deprecated for new code.
- Package: `@phosphor-icons/react`
- Site: [`https://phosphoricons.com/`](https://phosphoricons.com/)

Example usage:

```tsx
import { Plus } from "@phosphor-icons/react";

export function CreateButton() {
  return (
    <button type="button" className="inline-flex items-center gap-2">
      <Plus size={16} />
      Create
    </button>
  );
}
```

---

## 🧪 Testing & Storybook

- End-to-end: [Playwright](https://playwright.dev/docs/intro) (`pnpm test`, `pnpm test-ui`)
- [Storybook](https://storybook.js.org/docs) for isolated UI development (`pnpm storybook` / `pnpm build-storybook`)
- For Storybook tests in CI, see [`@storybook/test-runner`](https://storybook.js.org/docs/writing-tests/test-runner) (`test-storybook:ci`)
- When changing components in `src/components`, update or add stories and visually verify in Storybook/Chromatic

---

## 🛠 Tooling & Scripts

Common scripts (see `package.json` for full list):

- `pnpm dev` — Start Next.js dev server (generates API client first)
- `pnpm build` — Build for production
- `pnpm start` — Start production server
- `pnpm lint` — ESLint + Prettier check
- `pnpm format` — Format code
- `pnpm types` — Type-check
- `pnpm storybook` — Run Storybook
- `pnpm test` — Run Playwright tests

Generated API client:

- `pnpm generate:api` — Fetch OpenAPI spec and regenerate the client

---

## ✅ PR checklist (Frontend)

- Client-first: server components only for SEO or extreme TTFB needs
- Uses generated API hooks; no new `BackendAPI` usages
- UI uses `src/components` primitives; no new `_legacy__` components
- Logic is separated into `use*.ts` and `helpers.ts` when non-trivial
- Reusable logic extracted to `src/services/` or `src/lib/utils.ts` when appropriate
- Navigation uses the Next.js router
- Lint, format, type-check, and tests pass locally
- Stories updated/added if UI changed; verified in Storybook

---

## ♻️ Migration guidance

When touching legacy code:

- Replace usages of `src/components/_legacy__/*` with the modern design system components under `src/components`
- Replace `BackendAPI` or `src/lib/autogpt-server-api/*` with generated API hooks
- Move presentational logic into render files and data/behavior into hooks
- Keep one-off transformations in local `helpers.ts`; move reusable logic to `src/services/` or `src/lib/utils.ts`

---

## 📚 References

- Design system (Chromatic): [`https://dev--670f94474adee5e32c896b98.chromatic.com/`](https://dev--670f94474adee5e32c896b98.chromatic.com/)
- Project README for setup and API client examples: `autogpt_platform/frontend/README.md`
- Conventional Commits: [conventionalcommits.org](https://www.conventionalcommits.org/)
@@ -4,20 +4,12 @@ This is the frontend for AutoGPT's next generation

This project uses [**pnpm**](https://pnpm.io/) as the package manager via **corepack**. [Corepack](https://github.com/nodejs/corepack) is a Node.js tool that automatically manages package managers without requiring global installations.

For architecture, conventions, data fetching, feature flags, design system usage, state management, and PR process, see [CONTRIBUTING.md](./CONTRIBUTING.md).

### Prerequisites

Make sure you have Node.js 16.10+ installed. Corepack is included with Node.js by default.

### ⚠️ Migrating from yarn

> This project was previously using yarn 1. If you set it up with yarn before, make sure to clean up the old files:
>
> ```bash
> rm -f yarn.lock && rm -rf node_modules
> ```
>
> Then follow the setup steps below.

## Setup

### 1. **Enable corepack** (run this once on your system):
@@ -96,184 +88,13 @@ Every time a new Front-end dependency is added by you or others, you will need t

This project uses [`next/font`](https://nextjs.org/docs/basic-features/font-optimization) to automatically optimize and load Inter, a custom Google Font.

## 🔄 Data Fetching Strategy
## 🔄 Data Fetching

> [!NOTE]
> You don't need to run the OpenAPI commands below to run the Front-end. You will only need to run them when adding or modifying endpoints on the Backend API and wanting to use those on the Frontend.

This project uses an auto-generated API client powered by [**Orval**](https://orval.dev/), which creates type-safe API clients from OpenAPI specifications.

### How It Works

1. **Backend Requirements**: Each API endpoint needs a summary and tag in the OpenAPI spec
2. **Operation ID Generation**: FastAPI generates operation IDs using the pattern `{method}{tag}{summary}`
3. **Spec Fetching**: The OpenAPI spec is fetched from `http://localhost:8006/openapi.json` and saved to the frontend
4. **Spec Transformation**: The OpenAPI spec is cleaned up using a custom transformer (see `autogpt_platform/frontend/src/app/api/transformers`)
5. **Client Generation**: The auto-generated client includes TypeScript types, API endpoints, and Zod schemas, organized by tags

### API Client Commands

```bash
# Fetch OpenAPI spec from backend and generate client
pnpm generate:api

# Only fetch the OpenAPI spec
pnpm fetch:openapi

# Only generate the client (after spec is fetched)
pnpm generate:api-client
```

### Using the Generated Client

The generated client provides React Query hooks for both queries and mutations:

#### Queries (GET requests)

```typescript
import { useGetV1GetNotificationPreferences } from "@/app/api/__generated__/endpoints/auth/auth";

const { data, isLoading, isError } = useGetV1GetNotificationPreferences({
  query: {
    select: (res) => res.data,
    // Other React Query options
  },
});
```

#### Mutations (POST, PUT, DELETE requests)

```typescript
import { useDeleteV2DeleteStoreSubmission } from "@/app/api/__generated__/endpoints/store/store";
import { getGetV2ListMySubmissionsQueryKey } from "@/app/api/__generated__/endpoints/store/store";
import { useQueryClient } from "@tanstack/react-query";

const queryClient = useQueryClient();

const { mutateAsync: deleteSubmission } = useDeleteV2DeleteStoreSubmission({
  mutation: {
    onSuccess: () => {
      // Invalidate related queries to refresh data
      queryClient.invalidateQueries({
        queryKey: getGetV2ListMySubmissionsQueryKey(),
      });
    },
  },
});

// Usage
await deleteSubmission({
  submissionId: submission_id,
});
```

#### Server Actions

For server-side operations, you can also use the generated client functions directly:

```typescript
import { postV1UpdateNotificationPreferences } from "@/app/api/__generated__/endpoints/auth/auth";

// In a server action
const preferences = {
  email: "user@example.com",
  preferences: {
    AGENT_RUN: true,
    ZERO_BALANCE: false,
    // ... other preferences
  },
  daily_limit: 0,
};

await postV1UpdateNotificationPreferences(preferences);
```

#### Server-Side Prefetching

For server-side components, you can prefetch data on the server and hydrate it in the client cache. This allows immediate access to cached data when queries are called:

```typescript
import { getQueryClient } from "@/lib/tanstack-query/getQueryClient";
import {
  prefetchGetV2ListStoreAgentsQuery,
  prefetchGetV2ListStoreCreatorsQuery,
} from "@/app/api/__generated__/endpoints/store/store";
import { HydrationBoundary, dehydrate } from "@tanstack/react-query";

// In your server component
const queryClient = getQueryClient();

await Promise.all([
  prefetchGetV2ListStoreAgentsQuery(queryClient, {
    featured: true,
  }),
  prefetchGetV2ListStoreAgentsQuery(queryClient, {
    sorted_by: "runs",
  }),
  prefetchGetV2ListStoreCreatorsQuery(queryClient, {
    featured: true,
    sorted_by: "num_agents",
  }),
]);

return (
  <HydrationBoundary state={dehydrate(queryClient)}>
    <MainMarketplacePage />
  </HydrationBoundary>
);
```

This pattern improves performance by serving pre-fetched data from the server while maintaining the benefits of client-side React Query features.

### Configuration

The Orval configuration is located in `autogpt_platform/frontend/orval.config.ts`. It generates two separate clients:

1. **autogpt_api_client**: React Query hooks for client-side data fetching
2. **autogpt_zod_schema**: Zod schemas for validation

For more details, see the [Orval documentation](https://orval.dev/) or check the configuration file.
See [CONTRIBUTING.md](./CONTRIBUTING.md) for guidance on generated API hooks, SSR + hydration patterns, and usage examples. You generally do not need to run OpenAPI commands unless adding/modifying backend endpoints.

## 🚩 Feature Flags

This project uses [LaunchDarkly](https://launchdarkly.com/) for feature flags, allowing us to control feature rollouts and A/B testing.

### Using Feature Flags

#### Check if a feature is enabled

```typescript
import { Flag, useGetFlag } from "@/services/feature-flags/use-get-flag";

function MyComponent() {
  const isAgentActivityEnabled = useGetFlag(Flag.AGENT_ACTIVITY);

  if (!isAgentActivityEnabled) {
    return null; // Hide feature
  }

  return <div>Feature is enabled!</div>;
}
```

#### Protect entire components

```typescript
import { withFeatureFlag } from "@/services/feature-flags/with-feature-flag";

const MyFeaturePage = withFeatureFlag(MyPageComponent, "my-feature-flag");
```

### Testing with Feature Flags

For local development or running Playwright tests locally, use mocked feature flags by setting `NEXT_PUBLIC_PW_TEST=true` in your `.env` file. This bypasses LaunchDarkly and uses the mock values defined in the code.

### Adding New Flags

1. Add the flag to the `Flag` enum in `use-get-flag.ts`
2. Add the flag type to the `FlagValues` type
3. Add a mock value to `mockFlags` for testing
4. Configure the flag in the LaunchDarkly dashboard
See [CONTRIBUTING.md](./CONTRIBUTING.md) for feature flag usage patterns, local development with mocks, and how to add new flags.

## 🚚 Deploy

@@ -333,7 +154,7 @@ By integrating Storybook into our development workflow, we can streamline UI dev

- [**Tailwind CSS**](https://tailwindcss.com/) - Utility-first CSS framework
- [**shadcn/ui**](https://ui.shadcn.com/) - Re-usable components built with Radix UI and Tailwind CSS
- [**Radix UI**](https://www.radix-ui.com/) - Headless UI components for accessibility
- [**Lucide React**](https://lucide.dev/guide/packages/lucide-react) - Beautiful & consistent icons
- [**Phosphor Icons**](https://phosphoricons.com/) - Icon set used across the app
- [**Framer Motion**](https://motion.dev/) - Animation library for React

### Development & Testing

@@ -2,18 +2,11 @@
 // The config you add here will be used whenever a users loads a page in their browser.
 // https://docs.sentry.io/platforms/javascript/guides/nextjs/

-import {
-  AppEnv,
-  BehaveAs,
-  getAppEnv,
-  getBehaveAs,
-  getEnvironmentStr,
-} from "@/lib/utils";
+import { environment } from "@/services/environment";
 import * as Sentry from "@sentry/nextjs";

-const isProdOrDev = [AppEnv.PROD, AppEnv.DEV].includes(getAppEnv());
-
-const isCloud = getBehaveAs() === BehaveAs.CLOUD;
+const isProdOrDev = environment.isProd() || environment.isDev();
+const isCloud = environment.isCloud();
 const isDisabled = process.env.DISABLE_SENTRY === "true";

 const shouldEnable = !isDisabled && isProdOrDev && isCloud;
@@ -21,7 +14,7 @@ const shouldEnable = !isDisabled && isProdOrDev && isCloud;
 Sentry.init({
   dsn: "https://fe4e4aa4a283391808a5da396da20159@o4505260022104064.ingest.us.sentry.io/4507946746380288",

-  environment: getEnvironmentStr(),
+  environment: environment.getEnvironmentStr(),

   enabled: shouldEnable,
@@ -1,16 +1,16 @@
 #!/usr/bin/env node

-import { getAgptServerBaseUrl } from "@/lib/env-config";
 import { execSync } from "child_process";
 import * as path from "path";
 import * as fs from "fs";
 import * as os from "os";
+import { environment } from "@/services/environment";

 function fetchOpenApiSpec(): void {
   const args = process.argv.slice(2);
   const forceFlag = args.includes("--force");

-  const baseUrl = getAgptServerBaseUrl();
+  const baseUrl = environment.getAGPTServerBaseUrl();
   const openApiUrl = `${baseUrl}/openapi.json`;
   const outputPath = path.join(
     __dirname,
@@ -3,18 +3,11 @@
 // Note that this config is unrelated to the Vercel Edge Runtime and is also required when running locally.
 // https://docs.sentry.io/platforms/javascript/guides/nextjs/

+import { environment } from "@/services/environment";
 import * as Sentry from "@sentry/nextjs";
-import {
-  AppEnv,
-  BehaveAs,
-  getAppEnv,
-  getBehaveAs,
-  getEnvironmentStr,
-} from "./src/lib/utils";

-const isProdOrDev = [AppEnv.PROD, AppEnv.DEV].includes(getAppEnv());
-
-const isCloud = getBehaveAs() === BehaveAs.CLOUD;
+const isProdOrDev = environment.isProd() || environment.isDev();
+const isCloud = environment.isCloud();
 const isDisabled = process.env.DISABLE_SENTRY === "true";

 const shouldEnable = !isDisabled && isProdOrDev && isCloud;
@@ -22,7 +15,7 @@ const shouldEnable = !isDisabled && isProdOrDev && isCloud;
 Sentry.init({
   dsn: "https://fe4e4aa4a283391808a5da396da20159@o4505260022104064.ingest.us.sentry.io/4507946746380288",

-  environment: getEnvironmentStr(),
+  environment: environment.getEnvironmentStr(),

   enabled: shouldEnable,
@@ -40,7 +33,7 @@ Sentry.init({

   enableLogs: true,
   integrations: [
-    Sentry.captureConsoleIntegration(),
+    Sentry.captureConsoleIntegration({ levels: ["fatal", "error", "warn"] }),
     Sentry.extraErrorDataIntegration(),
   ],
 });
@@ -2,19 +2,12 @@
 // The config you add here will be used whenever the server handles a request.
 // https://docs.sentry.io/platforms/javascript/guides/nextjs/

-import {
-  AppEnv,
-  BehaveAs,
-  getAppEnv,
-  getBehaveAs,
-  getEnvironmentStr,
-} from "@/lib/utils";
+import { environment } from "@/services/environment";
 import * as Sentry from "@sentry/nextjs";
 // import { NodeProfilingIntegration } from "@sentry/profiling-node";

-const isProdOrDev = [AppEnv.PROD, AppEnv.DEV].includes(getAppEnv());
-
-const isCloud = getBehaveAs() === BehaveAs.CLOUD;
+const isProdOrDev = environment.isProd() || environment.isDev();
+const isCloud = environment.isCloud();
 const isDisabled = process.env.DISABLE_SENTRY === "true";

 const shouldEnable = !isDisabled && isProdOrDev && isCloud;
@@ -22,7 +15,7 @@ const shouldEnable = !isDisabled && isProdOrDev && isCloud;
 Sentry.init({
   dsn: "https://fe4e4aa4a283391808a5da396da20159@o4505260022104064.ingest.us.sentry.io/4507946746380288",

-  environment: getEnvironmentStr(),
+  environment: environment.getEnvironmentStr(),

   enabled: shouldEnable,
@@ -1,13 +1,13 @@
 "use client";

-import { isServerSide } from "@/lib/utils/is-server-side";
 import { useEffect, useState } from "react";
-import { Button } from "@/components/atoms/Button/Button";
 import { Text } from "@/components/atoms/Text/Text";
 import { Card } from "@/components/atoms/Card/Card";
 import { WaitlistErrorContent } from "@/components/auth/WaitlistErrorContent";
-import { isWaitlistErrorFromParams } from "@/app/api/auth/utils";
+import { isWaitlistError } from "@/app/api/auth/utils";
 import { useRouter } from "next/navigation";
+import { ErrorCard } from "@/components/molecules/ErrorCard/ErrorCard";
+import { environment } from "@/services/environment";

 export default function AuthErrorPage() {
   const [errorType, setErrorType] = useState<string | null>(null);
@@ -17,7 +17,7 @@ export default function AuthErrorPage() {

   useEffect(() => {
     // This code only runs on the client side
-    if (!isServerSide()) {
+    if (!environment.isServerSide()) {
       const hash = window.location.hash.substring(1); // Remove the leading '#'
       const params = new URLSearchParams(hash);

@@ -38,12 +38,9 @@ export default function AuthErrorPage() {
   }

   // Check if this is a waitlist/not allowed error
-  const isWaitlistError = isWaitlistErrorFromParams(
-    errorCode,
-    errorDescription,
-  );
+  const isWaitlistErr = isWaitlistError(errorCode, errorDescription);

-  if (isWaitlistError) {
+  if (isWaitlistErr) {
     return (
       <div className="flex h-screen items-center justify-center">
         <Card className="w-full max-w-md p-8">
@@ -56,34 +53,25 @@ export default function AuthErrorPage() {
     );
   }

-  // Default error display for other types of errors
+  // Use ErrorCard for consistent error display
+  const errorMessage = errorDescription
+    ? `${errorDescription}. If this error persists, please contact support at contact@agpt.co`
+    : "An authentication error occurred. Please contact support at contact@agpt.co";
+
   return (
-    <div className="flex h-screen items-center justify-center">
-      <Card className="w-full max-w-md p-8">
-        <div className="flex flex-col items-center gap-6">
-          <Text variant="h3">Authentication Error</Text>
-          <div className="flex flex-col gap-2 text-center">
-            {errorType && (
-              <Text variant="body">
-                <strong>Error Type:</strong> {errorType}
-              </Text>
-            )}
-            {errorCode && (
-              <Text variant="body">
-                <strong>Error Code:</strong> {errorCode}
-              </Text>
-            )}
-            {errorDescription && (
-              <Text variant="body">
-                <strong>Description:</strong> {errorDescription}
-              </Text>
-            )}
-          </div>
-          <Button variant="primary" onClick={() => router.push("/login")}>
-            Back to Login
-          </Button>
-        </div>
-      </Card>
+    <div className="flex h-screen items-center justify-center p-4">
+      <div className="w-full max-w-md">
+        <ErrorCard
+          responseError={{
+            message: errorMessage,
+            detail: errorCode
+              ? `Error code: ${errorCode}${errorType ? ` (${errorType})` : ""}`
+              : undefined,
+          }}
+          context="authentication"
+          onRetry={() => router.push("/login")}
+        />
+      </div>
     </div>
   );
 }
@@ -6,6 +6,7 @@ import ReactMarkdown from "react-markdown";
 import type { GraphID } from "@/lib/autogpt-server-api/types";
 import { askOtto } from "@/app/(platform)/build/actions";
 import { cn } from "@/lib/utils";
+import { environment } from "@/services/environment";

 interface Message {
   type: "user" | "assistant";
@@ -129,7 +130,7 @@ export default function OttoChatWidget({
   };

   // Don't render the chat widget if we're not on the build page or in local mode
-  if (process.env.NEXT_PUBLIC_BEHAVE_AS !== "CLOUD") {
+  if (environment.isLocal()) {
     return null;
   }
@@ -1,7 +1,7 @@
 import Shepherd from "shepherd.js";
 import "shepherd.js/dist/css/shepherd.css";
-import { sendGAEvent } from "@/services/analytics/google-analytics";
 import { Key, storage } from "@/services/storage/local-storage";
+import { analytics } from "@/services/analytics";

 export const startTutorial = (
   emptyNodeList: (forceEmpty: boolean) => boolean,
@@ -555,7 +555,7 @@ export const startTutorial = (
     "use client";
     console.debug("sendTutorialStep");

-    sendGAEvent("event", "tutorial_step_shown", { value: step.id });
+    analytics.sendGAEvent("event", "tutorial_step_shown", { value: step.id });
   });
 }
@@ -42,8 +42,16 @@ function isVideoUrl(url: string): boolean {
   if (url.includes("youtube.com/watch") || url.includes("youtu.be/")) {
     return true;
   }
-  if (url.includes("vimeo.com/")) {
-    return true;
+  try {
+    const parsed = new URL(url);
+    if (
+      parsed.hostname === "vimeo.com" ||
+      parsed.hostname === "www.vimeo.com"
+    ) {
+      return true;
+    }
+  } catch {
+    // If URL parsing fails, treat as not a Vimeo URL.
   }
   return videoExtensions.some((ext) => url.toLowerCase().includes(ext));
 }
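The change above swaps substring matching for hostname comparison via the `URL` API. The same idea can be shown standalone; `isVimeoUrl` is a hypothetical name used here for illustration:

```typescript
// Hostname-based check: parse the URL and compare the host exactly,
// instead of searching for "vimeo.com/" anywhere in the string.
function isVimeoUrl(url: string): boolean {
  try {
    const parsed = new URL(url);
    // Exact hostname comparison avoids false positives such as
    // "https://example.com/vimeo.com/fake", which substring matching accepts.
    return (
      parsed.hostname === "vimeo.com" || parsed.hostname === "www.vimeo.com"
    );
  } catch {
    // Not an absolute URL at all.
    return false;
  }
}

console.log(isVimeoUrl("https://vimeo.com/12345")); // true
console.log(isVimeoUrl("https://example.com/vimeo.com/fake")); // false
```

The `try`/`catch` is required because the `URL` constructor throws on relative or malformed input rather than returning null.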
@@ -17,6 +17,7 @@ import { GraphExecutionJobInfo } from "@/app/api/__generated__/models/graphExecu
 import { LibraryAgentPreset } from "@/app/api/__generated__/models/libraryAgentPreset";
 import { useGetV1GetUserTimezone } from "@/app/api/__generated__/endpoints/auth/auth";
 import { useOnboarding } from "@/providers/onboarding/onboarding-provider";
+import { analytics } from "@/services/analytics";

 export type RunVariant =
   | "manual"
@@ -78,6 +79,10 @@ export function useAgentRunModal(
           agent.graph_id,
         ).queryKey,
       });
+      analytics.sendDatafastEvent("run_agent", {
+        name: agent.name,
+        id: agent.graph_id,
+      });
       setIsOpen(false);
     }
   },
@@ -105,6 +110,11 @@ export function useAgentRunModal(
           agent.graph_id,
         ),
       });
+      analytics.sendDatafastEvent("schedule_agent", {
+        name: agent.name,
+        id: agent.graph_id,
+        cronExpression: cronExpression,
+      });
       setIsOpen(false);
     }
   },
@@ -37,6 +37,7 @@ import { useToastOnFail } from "@/components/molecules/Toast/use-toast";
 import { AgentRunStatus, agentRunStatusMap } from "./agent-run-status-chip";
 import useCredits from "@/hooks/useCredits";
 import { AgentRunOutputView } from "./agent-run-output-view";
+import { analytics } from "@/services/analytics";

 export function AgentRunDetailsView({
   agent,
@@ -131,7 +132,13 @@ export function AgentRunDetailsView({
         run.inputs!,
         run.credential_inputs!,
       )
-      .then(({ id }) => onRun(id))
+      .then(({ id }) => {
+        analytics.sendDatafastEvent("run_agent", {
+          name: graph.name,
+          id: graph.id,
+        });
+        onRun(id);
+      })
       .catch(toastOnFail("execute agent preset"));
   }
@@ -142,7 +149,13 @@ export function AgentRunDetailsView({
         run.inputs!,
         run.credential_inputs!,
       )
-      .then(({ id }) => onRun(id))
+      .then(({ id }) => {
+        analytics.sendDatafastEvent("run_agent", {
+          name: graph.name,
+          id: graph.id,
+        });
+        onRun(id);
+      })
      .catch(toastOnFail("execute agent"));
   }, [api, graph, run, onRun, toastOnFail]);
@@ -43,6 +43,7 @@ import {

 import { AgentStatus, AgentStatusChip } from "./agent-status-chip";
 import { useOnboarding } from "@/providers/onboarding/onboarding-provider";
+import { analytics } from "@/services/analytics";

 export function AgentRunDraftView({
   graph,
@@ -197,6 +198,12 @@ export function AgentRunDraftView({
       }
       // Mark run agent onboarding step as completed
       completeOnboardingStep("MARKETPLACE_RUN_AGENT");

+      analytics.sendDatafastEvent("run_agent", {
+        name: graph.name,
+        id: graph.id,
+      });
+
       if (runCount > 0) {
         completeOnboardingStep("RE_RUN_AGENT");
       }
@@ -373,6 +380,12 @@ export function AgentRunDraftView({
         })
         .catch(toastOnFail("set up agent run schedule"));

+      analytics.sendDatafastEvent("schedule_agent", {
+        name: graph.name,
+        id: graph.id,
+        cronExpression: cronExpression,
+      });
+
       if (schedule && onCreateSchedule) onCreateSchedule(schedule);
     },
     [api, graph, inputValues, inputCredentials, onCreateSchedule, toastOnFail],
@@ -8,9 +8,9 @@ import { EmailNotAllowedModal } from "@/components/auth/EmailNotAllowedModal";
 import { GoogleOAuthButton } from "@/components/auth/GoogleOAuthButton";
 import Turnstile from "@/components/auth/Turnstile";
 import { Form, FormField } from "@/components/__legacy__/ui/form";
-import { getBehaveAs } from "@/lib/utils";
 import { LoadingLogin } from "./components/LoadingLogin";
 import { useLoginPage } from "./useLoginPage";
+import { environment } from "@/services/environment";

 export default function LoginPage() {
   const {
@@ -118,7 +118,7 @@ export default function LoginPage() {
             type="login"
             message={feedback}
             isError={!!feedback}
-            behaveAs={getBehaveAs()}
+            behaveAs={environment.getBehaveAs()}
           />
         </Form>
         <AuthCard.BottomText
@@ -1,6 +1,5 @@
 import { useTurnstile } from "@/hooks/useTurnstile";
 import { useSupabase } from "@/lib/supabase/hooks/useSupabase";
-import { BehaveAs, getBehaveAs } from "@/lib/utils";
 import { loginFormSchema, LoginProvider } from "@/types/auth";
 import { zodResolver } from "@hookform/resolvers/zod";
 import { useRouter } from "next/navigation";
@@ -8,6 +7,7 @@ import { useCallback, useEffect, useState } from "react";
 import { useForm } from "react-hook-form";
 import z from "zod";
 import { useToast } from "@/components/molecules/Toast/use-toast";
+import { environment } from "@/services/environment";

 export function useLoginPage() {
   const { supabase, user, isUserLoading } = useSupabase();
@@ -18,7 +18,7 @@ export function useLoginPage() {
   const [isLoading, setIsLoading] = useState(false);
   const [isGoogleLoading, setIsGoogleLoading] = useState(false);
   const [showNotAllowedModal, setShowNotAllowedModal] = useState(false);
-  const isCloudEnv = getBehaveAs() === BehaveAs.CLOUD;
+  const isCloudEnv = environment.isCloud();
   const isVercelPreview = process.env.NEXT_PUBLIC_VERCEL_ENV === "preview";

   const turnstile = useTurnstile({
@@ -6,10 +6,10 @@ import Link from "next/link";
 import { User } from "@supabase/supabase-js";
 import { cn } from "@/lib/utils";
 import { useAgentInfo } from "./useAgentInfo";
-import { LibraryAgent } from "@/app/api/__generated__/models/libraryAgent";

 interface AgentInfoProps {
   user: User | null;
+  agentId: string;
   name: string;
   creator: string;
   shortDescription: string;
@@ -20,11 +20,12 @@ interface AgentInfoProps {
   lastUpdated: string;
   version: string;
   storeListingVersionId: string;
-  libraryAgent: LibraryAgent | undefined;
+  isAgentAddedToLibrary: boolean;
 }

 export const AgentInfo = ({
   user,
+  agentId,
   name,
   creator,
   shortDescription,
@@ -35,7 +36,7 @@ export const AgentInfo = ({
   lastUpdated,
   version,
   storeListingVersionId,
-  libraryAgent,
+  isAgentAddedToLibrary,
 }: AgentInfoProps) => {
   const {
     handleDownload,
@@ -82,11 +83,15 @@ export const AgentInfo = ({
             "transition-colors duration-200 hover:bg-violet-500 disabled:bg-zinc-400",
           )}
           data-testid={"agent-add-library-button"}
-          onClick={handleLibraryAction}
           disabled={isAddingAgentToLibrary}
+          onClick={() =>
+            handleLibraryAction({
+              isAddingAgentFirstTime: !isAgentAddedToLibrary,
+            })
+          }
         >
           <span className="justify-start font-sans text-sm font-medium leading-snug text-primary-foreground">
-            {libraryAgent ? "See runs" : "Add to library"}
+            {isAgentAddedToLibrary ? "See runs" : "Add to library"}
           </span>
         </button>
       )}
@@ -96,7 +101,7 @@ export const AgentInfo = ({
           "transition-colors duration-200 hover:bg-zinc-200/70 disabled:bg-zinc-200/40",
         )}
         data-testid={"agent-download-button"}
-        onClick={handleDownload}
+        onClick={() => handleDownload(agentId, name)}
         disabled={isDownloadingAgent}
       >
         <div className="justify-start text-center font-sans text-sm font-medium leading-snug text-zinc-800">
@@ -1,10 +1,11 @@
 import { usePostV2AddMarketplaceAgent } from "@/app/api/__generated__/endpoints/library/library";
-import { LibraryAgent } from "@/app/api/__generated__/models/libraryAgent";
 import { useToast } from "@/components/molecules/Toast/use-toast";
 import { useRouter } from "next/navigation";
 import * as Sentry from "@sentry/nextjs";
 import { useGetV2DownloadAgentFile } from "@/app/api/__generated__/endpoints/store/store";
 import { useOnboarding } from "@/providers/onboarding/onboarding-provider";
+import { analytics } from "@/services/analytics";
+import { LibraryAgent } from "@/app/api/__generated__/models/libraryAgent";

 interface UseAgentInfoProps {
   storeListingVersionId: string;
@@ -16,29 +17,9 @@ export const useAgentInfo = ({ storeListingVersionId }: UseAgentInfoProps) => {
   const { completeStep } = useOnboarding();

   const {
-    mutate: addMarketplaceAgentToLibrary,
+    mutateAsync: addMarketplaceAgentToLibrary,
     isPending: isAddingAgentToLibrary,
-  } = usePostV2AddMarketplaceAgent({
-    mutation: {
-      onSuccess: ({ data }) => {
-        completeStep("MARKETPLACE_ADD_AGENT");
-        router.push(`/library/agents/${(data as LibraryAgent).id}`);
-        toast({
-          title: "Agent Added",
-          description: "Redirecting to your library...",
-          duration: 2000,
-        });
-      },
-      onError: (error) => {
-        Sentry.captureException(error);
-        toast({
-          title: "Error",
-          description: "Failed to add agent to library. Please try again.",
-          variant: "destructive",
-        });
-      },
-    },
-  });
+  } = usePostV2AddMarketplaceAgent();

   const { refetch: downloadAgent, isFetching: isDownloadingAgent } =
     useGetV2DownloadAgentFile(storeListingVersionId, {
@@ -50,13 +31,46 @@ export const useAgentInfo = ({ storeListingVersionId }: UseAgentInfoProps) => {
       },
     });

-  const handleLibraryAction = async () => {
-    addMarketplaceAgentToLibrary({
-      data: { store_listing_version_id: storeListingVersionId },
-    });
+  const handleLibraryAction = async ({
+    isAddingAgentFirstTime,
+  }: {
+    isAddingAgentFirstTime: boolean;
+  }) => {
+    try {
+      const { data: response } = await addMarketplaceAgentToLibrary({
+        data: { store_listing_version_id: storeListingVersionId },
+      });
+
+      const data = response as LibraryAgent;
+
+      if (isAddingAgentFirstTime) {
+        completeStep("MARKETPLACE_ADD_AGENT");
+
+        analytics.sendDatafastEvent("add_to_library", {
+          name: data.name,
+          id: data.id,
+        });
+      }
+
+      router.push(`/library/agents/${data.id}`);
+
+      toast({
+        title: "Agent Added",
+        description: "Redirecting to your library...",
+        duration: 2000,
+      });
+    } catch (error) {
+      Sentry.captureException(error);
+
+      toast({
+        title: "Error",
+        description: "Failed to add agent to library. Please try again.",
+        variant: "destructive",
+      });
+    }
   };

-  const handleDownload = async () => {
+  const handleDownload = async (agentId: string, agentName: string) => {
     try {
       const { data: file } = await downloadAgent();
@@ -74,6 +88,11 @@ export const useAgentInfo = ({ storeListingVersionId }: UseAgentInfoProps) => {

       window.URL.revokeObjectURL(url);

+      analytics.sendDatafastEvent("download_agent", {
+        name: agentName,
+        id: agentId,
+      });
+
       toast({
         title: "Download Complete",
         description: "Your agent has been successfully downloaded.",
@@ -82,6 +82,7 @@ export const MainAgentPage = ({ params }: MainAgentPageProps) => {
       <div className="w-full md:w-auto md:shrink-0">
         <AgentInfo
           user={user}
+          agentId={agent.active_version_id ?? "–"}
           name={agent.agent_name}
           creator={agent.creator}
           shortDescription={agent.sub_heading}
@@ -92,7 +93,7 @@ export const MainAgentPage = ({ params }: MainAgentPageProps) => {
           lastUpdated={agent.last_updated.toISOString()}
           version={agent.versions[agent.versions.length - 1]}
           storeListingVersionId={agent.store_listing_version_id}
-          libraryAgent={libraryAgent}
+          isAgentAddedToLibrary={Boolean(libraryAgent)}
         />
       </div>
       <AgentImages
@@ -17,10 +17,10 @@ import {
   FormItem,
   FormLabel,
 } from "@/components/__legacy__/ui/form";
-import { getBehaveAs } from "@/lib/utils";
 import { WarningOctagonIcon } from "@phosphor-icons/react/dist/ssr";
 import { LoadingSignup } from "./components/LoadingSignup";
 import { useSignupPage } from "./useSignupPage";
+import { environment } from "@/services/environment";

 export default function SignupPage() {
   const {
@@ -196,7 +196,7 @@ export default function SignupPage() {
             type="signup"
             message={feedback}
             isError={!!feedback}
-            behaveAs={getBehaveAs()}
+            behaveAs={environment.getBehaveAs()}
           />

         <AuthCard.BottomText
@@ -1,6 +1,5 @@
 import { useTurnstile } from "@/hooks/useTurnstile";
 import { useSupabase } from "@/lib/supabase/hooks/useSupabase";
-import { BehaveAs, getBehaveAs } from "@/lib/utils";
 import { LoginProvider, signupFormSchema } from "@/types/auth";
 import { zodResolver } from "@hookform/resolvers/zod";
 import { useRouter } from "next/navigation";
@@ -8,6 +7,7 @@ import { useCallback, useEffect, useState } from "react";
 import { useForm } from "react-hook-form";
 import z from "zod";
 import { useToast } from "@/components/molecules/Toast/use-toast";
+import { environment } from "@/services/environment";

 export function useSignupPage() {
   const { supabase, user, isUserLoading } = useSupabase();
@@ -19,7 +19,7 @@ export function useSignupPage() {
   const [isGoogleLoading, setIsGoogleLoading] = useState(false);
   const [showNotAllowedModal, setShowNotAllowedModal] = useState(false);

-  const isCloudEnv = getBehaveAs() === BehaveAs.CLOUD;
+  const isCloudEnv = environment.isCloud();
   const isVercelPreview = process.env.NEXT_PUBLIC_VERCEL_ENV === "preview";

   const turnstile = useTurnstile({
@@ -59,6 +59,7 @@ export function useSignupPage() {
       resetCaptcha();
       return;
     }
+
     try {
       const response = await fetch("/api/auth/provider", {
         method: "POST",
@@ -71,7 +72,6 @@ export function useSignupPage() {
       setIsGoogleLoading(false);
       resetCaptcha();
-
       // Check for waitlist error
       if (error === "not_allowed") {
         setShowNotAllowedModal(true);
         return;
@@ -149,6 +149,7 @@ export function useSignupPage() {
         setShowNotAllowedModal(true);
         return;
       }
+
       toast({
         title: result?.error || "Signup failed",
         variant: "destructive",
@@ -33,7 +33,7 @@ export async function POST(request: Request) {

   if (error) {
     // Check for waitlist/allowlist error
-    if (isWaitlistError(error)) {
+    if (isWaitlistError(error?.code, error?.message)) {
       logWaitlistError("OAuth Provider", error.message);
       return NextResponse.json({ error: "not_allowed" }, { status: 403 });
     }
@@ -30,6 +30,7 @@ export async function POST(request: Request) {
     turnstileToken ?? "",
     "signup",
   );
+
   if (!captchaOk) {
     return NextResponse.json(
       { error: "CAPTCHA verification failed. Please try again." },
@@ -48,8 +49,7 @@ export async function POST(request: Request) {
   const { data, error } = await supabase.auth.signUp(parsed.data);

   if (error) {
-    // Check for waitlist/allowlist error
-    if (isWaitlistError(error)) {
+    if (isWaitlistError(error?.code, error?.message)) {
       logWaitlistError("Signup", error.message);
       return NextResponse.json({ error: "not_allowed" }, { status: 403 });
     }
@@ -1,45 +1,45 @@
 /**
- * Checks if a Supabase auth error is related to the waitlist/allowlist
+ * Checks if an error is related to the waitlist/allowlist
  *
+ * Can be used with either:
+ * - Error objects from Supabase auth operations: `isWaitlistError(error?.code, error?.message)`
+ * - URL parameters from OAuth callbacks: `isWaitlistError(errorCode, errorDescription)`
+ *
  * The PostgreSQL trigger raises P0001 with message format:
  * "The email address "email" is not allowed to register. Please contact support for assistance."
  *
- * @param error - The error object from Supabase auth operations
- * @returns true if this is a waitlist/allowlist error
- */
-export function isWaitlistError(error: any): boolean {
-  if (!error?.message) return false;
-
-  return (
-    error.message.includes("P0001") || // PostgreSQL custom error code
-    error.message.includes("not allowed to register") || // Trigger message
-    error.message.toLowerCase().includes("allowed_users") // Table reference
-  );
-}
-
-/**
- * Checks if OAuth callback URL parameters indicate a waitlist error
- *
- * This is for the auth-code-error page which receives errors via URL hash params
- * from Supabase OAuth redirects
- *
- * @param errorCode - The error_code parameter from the URL
- * @param errorDescription - The error_description parameter from the URL
+ * @param code - Error code (e.g., "P0001", "unexpected_failure") or null
+ * @param message - Error message/description or null
  * @returns true if this appears to be a waitlist/allowlist error
  */
-export function isWaitlistErrorFromParams(
-  errorCode?: string | null,
-  errorDescription?: string | null,
+export function isWaitlistError(
+  code?: string | null,
+  message?: string | null,
 ): boolean {
-  if (!errorDescription) return false;
+  // Check for explicit PostgreSQL trigger error code
+  if (code === "P0001") return true;

-  const description = errorDescription.toLowerCase();
+  if (!message) return false;
+
+  const lowerMessage = message.toLowerCase();
+
+  // Check for the generic database error that occurs during waitlist check
+  // This happens when Supabase wraps the PostgreSQL trigger error
+  if (
+    code === "unexpected_failure" &&
+    message === "Database error saving new user"
+  ) {
+    return true;
+  }

+  // Check for various waitlist-related patterns in the message
   return (
-    description.includes("p0001") || // PostgreSQL error code might be in description
-    description.includes("not allowed") ||
-    description.includes("waitlist") ||
-    description.includes("allowlist") ||
-    description.includes("allowed_users")
+    lowerMessage.includes("p0001") || // PostgreSQL error code in message
+    lowerMessage.includes("not allowed") || // Common waitlist message
+    lowerMessage.includes("waitlist") || // Explicit waitlist mention
+    lowerMessage.includes("allowlist") || // Explicit allowlist mention
+    lowerMessage.includes("allowed_users") || // Database table reference
+    lowerMessage.includes("not allowed to register") // Full trigger message
   );
 }
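The unified `isWaitlistError` above can be exercised standalone. This sketch copies the function body from the diff and demonstrates the two call shapes its doc comment describes; the sample error values are hypothetical:

```typescript
// Copy of the new isWaitlistError (code/message pair instead of an error object).
function isWaitlistError(
  code?: string | null,
  message?: string | null,
): boolean {
  // Explicit PostgreSQL trigger error code.
  if (code === "P0001") return true;
  if (!message) return false;
  const lowerMessage = message.toLowerCase();
  // Supabase sometimes wraps the trigger error in a generic database failure.
  if (
    code === "unexpected_failure" &&
    message === "Database error saving new user"
  ) {
    return true;
  }
  return (
    lowerMessage.includes("p0001") ||
    lowerMessage.includes("not allowed") ||
    lowerMessage.includes("waitlist") ||
    lowerMessage.includes("allowlist") ||
    lowerMessage.includes("allowed_users")
  );
}

// Supabase error object shape (hypothetical values):
console.log(isWaitlistError("P0001", null)); // true
// OAuth callback URL params, where Supabase wrapped the trigger error:
console.log(
  isWaitlistError("unexpected_failure", "Database error saving new user"),
); // true
// Unrelated auth error:
console.log(isWaitlistError("invalid_credentials", "Wrong password")); // false
```

Taking `(code, message)` instead of an error object is what lets one function replace both the Supabase-error path and the URL-params path.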
@@ -3,20 +3,19 @@ import {
   createRequestHeaders,
   getServerAuthToken,
 } from "@/lib/autogpt-server-api/helpers";
-import { isServerSide } from "@/lib/utils/is-server-side";
-import { getAgptServerBaseUrl } from "@/lib/env-config";

 import { transformDates } from "./date-transformer";
+import { environment } from "@/services/environment";

 const FRONTEND_BASE_URL =
   process.env.NEXT_PUBLIC_FRONTEND_BASE_URL || "http://localhost:3000";
 const API_PROXY_BASE_URL = `${FRONTEND_BASE_URL}/api/proxy`; // Sending request via nextjs Server

 const getBaseUrl = (): string => {
-  if (!isServerSide()) {
+  if (!environment.isServerSide()) {
     return API_PROXY_BASE_URL;
   } else {
-    return getAgptServerBaseUrl();
+    return environment.getAGPTServerBaseUrl();
   }
 };
@@ -77,7 +76,7 @@ export const customMutator = async <
   // The caching in React Query in our system depends on the url, so the base_url could be different for the server and client sides.
   const fullUrl = `${baseUrl}${url}${queryString}`;

-  if (isServerSide()) {
+  if (environment.isServerSide()) {
     try {
       const token = await getServerAuthToken();
       const authHeaders = createRequestHeaders(token, !!data, contentType);
@@ -100,7 +99,7 @@ export const customMutator = async <
     response_data?.detail || response_data?.message || response.statusText;

   console.error(
     `Request failed ${environment.isServerSide() ? "on server" : "on client"}`,
     { status: response.status, url: fullUrl, data: response_data },
   );
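The `getBaseUrl` logic above can be sketched in isolation: browser requests go through the same-origin Next.js proxy, while server-side code calls the backend directly. The `environment` object and the default backend URL here are stand-ins, not the real `@/services/environment` implementation:

```typescript
// Stand-in for @/services/environment (assumed shape, for illustration only).
const environment = {
  // On the server there is no global window object.
  isServerSide: () => typeof window === "undefined",
  // Hypothetical env var name and default; the real value comes from config.
  getAGPTServerBaseUrl: () =>
    process.env.NEXT_PUBLIC_AGPT_SERVER_URL || "http://localhost:8006/api",
};

const API_PROXY_BASE_URL = "http://localhost:3000/api/proxy";

function getBaseUrl(): string {
  // Client: route through the same-origin proxy so auth cookies are attached
  // and React Query cache keys (which include the URL) stay consistent.
  if (!environment.isServerSide()) return API_PROXY_BASE_URL;
  // Server: skip the proxy and hit the backend directly.
  return environment.getAGPTServerBaseUrl();
}
```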
@@ -3,12 +3,12 @@ import {
   makeAuthenticatedFileUpload,
   makeAuthenticatedRequest,
 } from "@/lib/autogpt-server-api/helpers";
-import { getAgptServerBaseUrl } from "@/lib/env-config";
+import { environment } from "@/services/environment";
 import { NextRequest, NextResponse } from "next/server";

 function buildBackendUrl(path: string[], queryString: string): string {
   const backendPath = path.join("/");
-  return `${getAgptServerBaseUrl()}/${backendPath}${queryString}`;
+  return `${environment.getAGPTServerBaseUrl()}/${backendPath}${queryString}`;
 }

 async function handleJsonRequest(
@@ -6,11 +6,12 @@ import "./globals.css";

 import { Providers } from "@/app/providers";
 import TallyPopupSimple from "@/components/molecules/TallyPoup/TallyPopup";
-import { GoogleAnalytics } from "@/services/analytics/google-analytics";
 import { Toaster } from "@/components/molecules/Toast/toaster";
 import { ReactQueryDevtools } from "@tanstack/react-query-devtools";
 import { SpeedInsights } from "@vercel/speed-insights/next";
 import { Analytics } from "@vercel/analytics/next";
+import { headers } from "next/headers";
+import { SetupAnalytics } from "@/services/analytics";

 export const metadata: Metadata = {
   title: "AutoGPT Platform",
@@ -22,6 +23,9 @@ export default async function RootLayout({
 }: Readonly<{
   children: React.ReactNode;
 }>) {
+  const headersList = await headers();
+  const host = headersList.get("host") || "";
+
   return (
     <html
       lang="en"
@@ -29,8 +33,11 @@ export default async function RootLayout({
       suppressHydrationWarning
     >
       <head>
-        <GoogleAnalytics
-          gaId={process.env.NEXT_PUBLIC_GA_MEASUREMENT_ID || "G-FH2XK2W4GN"} // This is the measurement Id for the Google Analytics dev project
+        <SetupAnalytics
+          host={host}
+          ga={{
+            gaId: process.env.NEXT_PUBLIC_GA_MEASUREMENT_ID || "G-FH2XK2W4GN",
+          }}
         />
       </head>
       <body>
@@ -1,248 +0,0 @@ (file deleted)
"use client";

import { StarRatingIcons } from "@/components/__legacy__/ui/icons";
import { Separator } from "@/components/__legacy__/ui/separator";
import BackendAPI, { LibraryAgent } from "@/lib/autogpt-server-api";
import { useRouter } from "next/navigation";
import Link from "next/link";
import { useToast } from "@/components/molecules/Toast/use-toast";

import { useOnboarding } from "@/providers/onboarding/onboarding-provider";
import { User } from "@supabase/supabase-js";
import { cn } from "@/lib/utils";
import { FC, useCallback, useMemo, useState } from "react";

interface AgentInfoProps {
  user: User | null;
  name: string;
  creator: string;
  shortDescription: string;
  longDescription: string;
  rating: number;
  runs: number;
  categories: string[];
  lastUpdated: string;
  version: string;
  storeListingVersionId: string;
  libraryAgent: LibraryAgent | null;
}

export const AgentInfo: FC<AgentInfoProps> = ({
  user,
  name,
  creator,
  shortDescription,
  longDescription,
  rating,
  runs,
  categories,
  lastUpdated,
  version,
  storeListingVersionId,
  libraryAgent,
}) => {
  const router = useRouter();
  const api = useMemo(() => new BackendAPI(), []);
  const { toast } = useToast();
  const { completeStep } = useOnboarding();
  const [adding, setAdding] = useState(false);
  const [downloading, setDownloading] = useState(false);

  const libraryAction = useCallback(async () => {
    setAdding(true);
    if (libraryAgent) {
      toast({
        description: "Redirecting to your library...",
        duration: 2000,
      });
      // Redirect to the library agent page
      router.push(`/library/agents/${libraryAgent.id}`);
      return;
    }
    try {
      const newLibraryAgent = await api.addMarketplaceAgentToLibrary(
        storeListingVersionId,
      );
      completeStep("MARKETPLACE_ADD_AGENT");
      router.push(`/library/agents/${newLibraryAgent.id}`);
      toast({
        title: "Agent Added",
        description: "Redirecting to your library...",
        duration: 2000,
      });
    } catch (error) {
      console.error("Failed to add agent to library:", error);
      toast({
        title: "Error",
        description: "Failed to add agent to library. Please try again.",
        variant: "destructive",
      });
    }
  }, [toast, api, storeListingVersionId, completeStep, router]);

  const handleDownload = useCallback(async () => {
    const downloadAgent = async (): Promise<void> => {
      setDownloading(true);
      try {
        const file = await api.downloadStoreAgent(storeListingVersionId);

        // Similar to Marketplace v1
        const jsonData = JSON.stringify(file, null, 2);
        // Create a Blob from the file content
        const blob = new Blob([jsonData], { type: "application/json" });

        // Create a temporary URL for the Blob
        const url = window.URL.createObjectURL(blob);

        // Create a temporary anchor element
        const a = document.createElement("a");
        a.href = url;
        a.download = `agent_${storeListingVersionId}.json`; // Set the filename

        // Append the anchor to the body, click it, and remove it
        document.body.appendChild(a);
        a.click();
        document.body.removeChild(a);

        // Revoke the temporary URL
        window.URL.revokeObjectURL(url);

        toast({
          title: "Download Complete",
          description: "Your agent has been successfully downloaded.",
        });
      } catch (error) {
        console.error(`Error downloading agent:`, error);
        toast({
          title: "Error",
          description: "Failed to download agent. Please try again.",
          variant: "destructive",
        });
      }
    };
    await downloadAgent();
    setDownloading(false);
  }, [setDownloading, api, storeListingVersionId, toast]);

  return (
    <div className="w-full max-w-[396px] px-4 sm:px-6 lg:w-[396px] lg:px-0">
      {/* Title */}
      <div
        data-testid="agent-title"
        className="mb-3 w-full font-poppins text-2xl font-medium leading-normal text-neutral-900 dark:text-neutral-100 sm:text-3xl lg:mb-4 lg:text-[35px] lg:leading-10"
      >
        {name}
      </div>

      {/* Creator */}
      <div className="mb-3 flex w-full items-center gap-1.5 lg:mb-4">
        <div className="text-base font-normal text-neutral-800 dark:text-neutral-200 sm:text-lg lg:text-xl">
          by
        </div>
        <Link
          data-testid={"agent-creator"}
          href={`/marketplace/creator/${encodeURIComponent(creator)}`}
          className="text-base font-medium text-neutral-800 hover:underline dark:text-neutral-200 sm:text-lg lg:text-xl"
        >
          {creator}
        </Link>
      </div>

      {/* Short Description */}
      <div className="mb-4 line-clamp-2 w-full text-base font-normal leading-normal text-neutral-600 dark:text-neutral-300 sm:text-lg lg:mb-6 lg:text-xl lg:leading-7">
        {shortDescription}
      </div>

      {/* Buttons */}
      <div className="mb-4 flex w-full gap-3 lg:mb-[60px]">
        {user && (
          <button
            className={cn(
              "inline-flex min-w-24 items-center justify-center rounded-full bg-violet-600 px-4 py-3",
              "transition-colors duration-200 hover:bg-violet-500 disabled:bg-zinc-400",
            )}
            data-testid={"agent-add-library-button"}
            onClick={libraryAction}
            disabled={adding}
          >
            <span className="justify-start font-sans text-sm font-medium leading-snug text-primary-foreground">
              {libraryAgent ? "See runs" : "Add to library"}
            </span>
          </button>
        )}
        <button
          className={cn(
            "inline-flex min-w-24 items-center justify-center rounded-full bg-zinc-200 px-4 py-3",
            "transition-colors duration-200 hover:bg-zinc-200/70 disabled:bg-zinc-200/40",
          )}
          data-testid={"agent-download-button"}
          onClick={handleDownload}
          disabled={downloading}
        >
          <div className="justify-start text-center font-sans text-sm font-medium leading-snug text-zinc-800">
            Download agent
          </div>
        </button>
      </div>

      {/* Rating and Runs */}
      <div className="mb-4 flex w-full items-center justify-between lg:mb-[44px]">
        <div className="flex items-center gap-1.5 sm:gap-2">
          <span className="whitespace-nowrap text-base font-semibold text-neutral-800 dark:text-neutral-200 sm:text-lg">
            {rating.toFixed(1)}
          </span>
          <div className="flex gap-0.5">{StarRatingIcons(rating)}</div>
        </div>
        <div className="whitespace-nowrap text-base font-semibold text-neutral-800 dark:text-neutral-200 sm:text-lg">
          {runs.toLocaleString()} runs
        </div>
      </div>

      {/* Separator */}
      <Separator className="mb-4 lg:mb-[44px]" />

      {/* Description Section */}
      <div className="mb-4 w-full lg:mb-[36px]">
        <div className="decoration-skip-ink-none mb-1.5 text-base font-medium leading-6 text-neutral-800 dark:text-neutral-200 sm:mb-2">
          Description
        </div>
        <div
          data-testid={"agent-description"}
          className="whitespace-pre-line text-base font-normal leading-6 text-neutral-600 dark:text-neutral-400"
        >
          {longDescription}
        </div>
      </div>

      {/* Categories */}
      <div className="mb-4 flex w-full flex-col gap-1.5 sm:gap-2 lg:mb-[36px]">
        <div className="decoration-skip-ink-none mb-1.5 text-base font-medium leading-6 text-neutral-800 dark:text-neutral-200 sm:mb-2">
          Categories
        </div>
        <div className="flex flex-wrap gap-1.5 sm:gap-2">
          {categories.map((category, index) => (
            <div
              key={index}
              className="decoration-skip-ink-none whitespace-nowrap rounded-full border border-neutral-600 bg-white px-2 py-0.5 text-base font-normal leading-6 text-neutral-800 underline-offset-[from-font] dark:border-neutral-700 dark:bg-neutral-800 dark:text-neutral-200 sm:px-[16px] sm:py-[10px]"
            >
              {category}
            </div>
          ))}
        </div>
      </div>

      {/* Version History */}
      <div className="flex w-full flex-col gap-0.5 sm:gap-1">
        <div className="decoration-skip-ink-none mb-1.5 text-base font-medium leading-6 text-neutral-800 dark:text-neutral-200 sm:mb-2">
          Version history
        </div>
        <div className="decoration-skip-ink-none text-base font-normal leading-6 text-neutral-600 underline-offset-[from-font] dark:text-neutral-400">
          Last updated {lastUpdated}
        </div>
        <div className="text-xs text-neutral-600 dark:text-neutral-400 sm:text-sm">
          Version {version}
        </div>
      </div>
    </div>
  );
};
@@ -29,7 +29,7 @@ const TooltipContent = React.forwardRef<
     ref={ref}
     sideOffset={sideOffset}
     className={cn(
-      "z-50 max-w-xs overflow-hidden rounded-md border border-slate-50 bg-white px-3 py-1.5 text-xs text-gray-900 animate-in fade-in-0 zoom-in-95 data-[state=closed]:animate-out data-[state=closed]:fade-out-0 data-[state=closed]:zoom-out-95 data-[side=bottom]:slide-in-from-top-2 data-[side=left]:slide-in-from-right-2 data-[side=right]:slide-in-from-left-2 data-[side=top]:slide-in-from-bottom-2",
+      "z-50 max-w-xs overflow-hidden rounded-md bg-white px-3 py-1.5 text-xs font-normal leading-normal text-zinc-900 outline outline-1 outline-offset-[-1px] outline-gray-100 animate-in fade-in-0 zoom-in-95 data-[state=closed]:animate-out data-[state=closed]:fade-out-0 data-[state=closed]:zoom-out-95 data-[side=bottom]:slide-in-from-top-2 data-[side=left]:slide-in-from-right-2 data-[side=right]:slide-in-from-left-2 data-[side=top]:slide-in-from-bottom-2",
       className,
     )}
     {...props}
@@ -1,7 +1,7 @@
 import { HelpItem } from "@/components/auth/help-item";
 import { Card, CardContent } from "@/components/__legacy__/ui/card";
-import { BehaveAs } from "@/lib/utils";
 import { AlertCircle, CheckCircle } from "lucide-react";
+import { BehaveAs } from "@/services/environment";

 interface Props {
   type: "login" | "signup";
@@ -1,6 +1,6 @@
 "use client";
 import { cn } from "@/lib/utils";
-import { isServerSide } from "@/lib/utils/is-server-side";
+import { environment } from "@/services/environment";
 import { useEffect, useRef, useState } from "react";

 export interface TurnstileProps {
@@ -32,7 +32,7 @@ export function Turnstile({

   // Load the Turnstile script
   useEffect(() => {
-    if (isServerSide() || !shouldRender) return;
+    if (environment.isServerSide() || !shouldRender) return;

     // Skip if already loaded
     if (window.turnstile) {
@@ -43,7 +43,15 @@ export function WaitlistErrorContent({
         exact same email address you used when signing up.
       </Text>
       <Text variant="small" className="text-center text-muted-foreground">
-        If you're not sure which email you used or need help,{" "}
+        If you're not sure which email you used or need help, contact us
+        at{" "}
+        <a
+          href="mailto:contact@agpt.co"
+          className="underline hover:text-foreground"
+        >
+          contact@agpt.co
+        </a>{" "}
+        or{" "}
         <a
           href="https://discord.gg/autogpt"
           target="_blank"
@@ -1,4 +1,4 @@
-import { isServerSide } from "@/lib/utils/is-server-side";
+import { environment } from "@/services/environment";
 import { useCallback, useEffect, useRef, useState } from "react";

 interface useInfiniteScrollProps {
@@ -34,7 +34,8 @@ export const useInfiniteScroll = ({
   }, [hasNextPage, isFetchingNextPage, fetchNextPage, onLoadMore]);

   useEffect(() => {
-    if (!hasNextPage || !endOfListRef.current || isServerSide()) return;
+    if (!hasNextPage || !endOfListRef.current || environment.isServerSide())
+      return;

     const observer = new IntersectionObserver(
       (entries) => {
@@ -5,14 +5,15 @@ import { InfoIcon, AlertTriangleIcon, XCircleIcon } from "lucide-react";
 import { cn } from "@/lib/utils";

 const alertVariants = cva(
-  "relative w-full rounded-lg border border-neutral-200 px-4 py-3 text-sm [&>svg]:absolute [&>svg]:left-4 [&>svg]:top-[12px] [&>svg]:text-neutral-950 [&>svg~*]:pl-7",
+  "relative w-full rounded-lg border border-zinc-200 px-4 py-3 text-sm [&>svg]:absolute [&>svg]:left-4 [&>svg]:top-1/2 [&>svg]:-translate-y-1/2 [&>svg]:text-zinc-800 [&>svg~*]:pl-7",
   {
     variants: {
       variant: {
-        default: "bg-white text-neutral-950 [&>svg]:text-blue-500",
+        default: "bg-white text-zinc-800 [&>svg]:text-purple-500",
         warning:
-          "bg-white border-orange-500/50 text-orange-600 [&>svg]:text-orange-500",
-        error: "bg-white border-red-500/50 text-red-500 [&>svg]:text-red-500",
+          "bg-[#FFF3E680] border-yellow-300 text-zinc-800 [&>svg]:text-orange-600",
+        error:
+          "bg-[#FDECEC80] border-red-300 text-zinc-800 [&>svg]:text-red-500",
       },
     },
     defaultVariants: {
@@ -45,7 +46,7 @@ const Alert = React.forwardRef<HTMLDivElement, AlertProps>(
       className={cn(alertVariants({ variant: currentVariant }), className)}
       {...props}
     >
-      <IconComponent className="h-4 w-4" />
+      <IconComponent className="h-[1.125rem] w-[1.125rem]" />
       {children}
     </div>
   );
@@ -1,7 +1,6 @@
 import { useState, useCallback, useEffect } from "react";
 import { verifyTurnstileToken } from "@/lib/turnstile";
-import { BehaveAs, getBehaveAs } from "@/lib/utils";
-import { isServerSide } from "@/lib/utils/is-server-side";
+import { environment } from "@/services/environment";

 interface UseTurnstileOptions {
   action?: string;
@@ -46,17 +45,15 @@ export function useTurnstile({
   const [widgetId, setWidgetId] = useState<string | null>(null);

   useEffect(() => {
-    const behaveAs = getBehaveAs();
+    const isCloud = environment.isCloud();
     const hasTurnstileKey = !!TURNSTILE_SITE_KEY;
     const turnstileDisabled = process.env.NEXT_PUBLIC_TURNSTILE !== "enabled";

     // Only render Turnstile in cloud environment if not explicitly disabled
-    setShouldRender(
-      behaveAs === BehaveAs.CLOUD && hasTurnstileKey && !turnstileDisabled,
-    );
+    setShouldRender(isCloud && hasTurnstileKey && !turnstileDisabled);

     // Skip verification if disabled, in local development, or no key
-    if (turnstileDisabled || behaveAs !== BehaveAs.CLOUD || !hasTurnstileKey) {
+    if (turnstileDisabled || !isCloud || !hasTurnstileKey) {
       setVerified(true);
     }
   }, []);
@@ -80,7 +77,12 @@ export function useTurnstile({
     setError(null);

     // Only reset the actual Turnstile widget if it exists and shouldRender is true
-    if (shouldRender && !isServerSide() && window.turnstile && widgetId) {
+    if (
+      shouldRender &&
+      !environment.isServerSide() &&
+      window.turnstile &&
+      widgetId
+    ) {
       try {
         window.turnstile.reset(widgetId);
       } catch (err) {
@@ -3,12 +3,6 @@ import { getServerSupabase } from "@/lib/supabase/server/getServerSupabase";
 import { createBrowserClient } from "@supabase/ssr";
 import type { SupabaseClient } from "@supabase/supabase-js";
 import { Key, storage } from "@/services/storage/local-storage";
-import {
-  getAgptServerApiUrl,
-  getAgptWsServerUrl,
-  getSupabaseUrl,
-  getSupabaseAnonKey,
-} from "@/lib/env-config";
 import * as Sentry from "@sentry/nextjs";
 import type {
   AddUserCreditsResponse,
@@ -73,9 +67,9 @@ import type {
   UserPasswordCredentials,
   UsersBalanceHistoryResponse,
 } from "./types";
-import { isServerSide } from "../utils/is-server-side";
+import { environment } from "@/services/environment";

-const isClient = !isServerSide();
+const isClient = environment.isClientSide();

 export default class BackendAPI {
   private baseUrl: string;
@@ -93,8 +87,8 @@ export default class BackendAPI {
   heartbeatTimeoutID: number | null = null;

   constructor(
-    baseUrl: string = getAgptServerApiUrl(),
-    wsUrl: string = getAgptWsServerUrl(),
+    baseUrl: string = environment.getAGPTServerApiUrl(),
+    wsUrl: string = environment.getAGPTWsServerUrl(),
   ) {
     this.baseUrl = baseUrl;
     this.wsUrl = wsUrl;
@@ -102,9 +96,13 @@ export default class BackendAPI {

   private async getSupabaseClient(): Promise<SupabaseClient | null> {
     return isClient
-      ? createBrowserClient(getSupabaseUrl(), getSupabaseAnonKey(), {
-          isSingleton: true,
-        })
+      ? createBrowserClient(
+          environment.getSupabaseUrl(),
+          environment.getSupabaseAnonKey(),
+          {
+            isSingleton: true,
+          },
+        )
       : await getServerSupabase();
   }
@@ -1,6 +1,6 @@
 "use client";

-import { isServerSide } from "../utils/is-server-side";
+import { environment } from "@/services/environment";
 import BackendAPI from "./client";
 import React, { createContext, useMemo } from "react";

@@ -20,7 +20,7 @@ export function BackendAPIProvider({
 }): React.ReactNode {
   const api = useMemo(() => new BackendAPI(), []);

-  if (process.env.NEXT_PUBLIC_BEHAVE_AS == "LOCAL" && !isServerSide()) {
+  if (environment.isLocal() && !environment.isServerSide()) {
     window.api = api; // Expose the API globally for debugging purposes
   }
@@ -1,7 +1,6 @@
 import { getServerSupabase } from "@/lib/supabase/server/getServerSupabase";
 import { Key, storage } from "@/services/storage/local-storage";
-import { getAgptServerApiUrl } from "@/lib/env-config";
-import { isServerSide } from "../utils/is-server-side";
+import { environment } from "@/services/environment";

 import { GraphValidationErrorResponse } from "./types";

@@ -41,12 +40,11 @@ export function buildRequestUrl(
   method: string,
   payload?: Record<string, any>,
 ): string {
-  let url = baseUrl + path;
+  const url = baseUrl + path;
   const payloadAsQuery = ["GET", "DELETE"].includes(method);

   if (payloadAsQuery && payload) {
-    const queryParams = new URLSearchParams(payload);
-    url += `?${queryParams.toString()}`;
+    return buildUrlWithQuery(url, payload);
   }

   return url;
@@ -57,17 +55,28 @@ export function buildClientUrl(path: string): string {
 }

 export function buildServerUrl(path: string): string {
-  return `${getAgptServerApiUrl()}${path}`;
+  return `${environment.getAGPTServerApiUrl()}${path}`;
 }

 export function buildUrlWithQuery(
   url: string,
-  payload?: Record<string, any>,
+  query?: Record<string, any>,
 ): string {
-  if (!payload) return url;
+  if (!query) return url;

-  const queryParams = new URLSearchParams(payload);
-  return `${url}?${queryParams.toString()}`;
+  // Filter out undefined values to prevent them from being included as "undefined" strings
+  const filteredQuery = Object.entries(query).reduce(
+    (acc, [key, value]) => {
+      if (value !== undefined) {
+        acc[key] = value;
+      }
+      return acc;
+    },
+    {} as Record<string, any>,
+  );
+
+  const queryParams = new URLSearchParams(filteredQuery);
+  return queryParams.size > 0 ? `${url}?${queryParams.toString()}` : url;
 }
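The `buildUrlWithQuery` change above guards against `URLSearchParams` stringifying `undefined` values into literal `"undefined"` query parameters. A standalone sketch of that behavior and the fix (the names mirror the diff, but this is an illustration, not the project file):

```typescript
// URLSearchParams coerces every value to a string, so an undefined
// entry becomes "key=undefined" unless it is filtered out first.
function buildUrlWithQuery(url: string, query?: Record<string, any>): string {
  if (!query) return url;

  // Drop undefined values before constructing the query string.
  const filtered = Object.fromEntries(
    Object.entries(query).filter(([, value]) => value !== undefined),
  );

  const qs = new URLSearchParams(filtered).toString();
  return qs ? `${url}?${qs}` : url;
}

console.log(buildUrlWithQuery("/agents", { page: "2", sort: undefined }));
// → /agents?page=2
```

Without the filter, the same call would produce `/agents?page=2&sort=undefined`, which the backend would then receive as the string "undefined".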
export async function handleFetchError(response: Response): Promise<ApiError> {
|
||||
@@ -197,7 +206,7 @@ function isAuthenticationError(
|
||||
}
|
||||
|
||||
function isLogoutInProgress(): boolean {
|
||||
if (isServerSide()) return false;
|
||||
if (environment.isServerSide()) return false;
|
||||
|
||||
try {
|
||||
// Check if logout was recently triggered
|
||||
|
||||
@@ -1,65 +0,0 @@ (file deleted)
/**
 * Environment configuration helper
 *
 * Provides unified access to environment variables with server-side priority.
 * Server-side code uses Docker service names, client-side falls back to localhost.
 */

import { isServerSide } from "./utils/is-server-side";

/**
 * Gets the AGPT server URL with server-side priority
 * Server-side: Uses AGPT_SERVER_URL (http://rest_server:8006/api)
 * Client-side: Falls back to NEXT_PUBLIC_AGPT_SERVER_URL (http://localhost:8006/api)
 */
export function getAgptServerApiUrl(): string {
  // If server-side and server URL exists, use it
  if (isServerSide() && process.env.AGPT_SERVER_URL) {
    return process.env.AGPT_SERVER_URL;
  }

  // Otherwise use the public URL
  return process.env.NEXT_PUBLIC_AGPT_SERVER_URL || "http://localhost:8006/api";
}

export function getAgptServerBaseUrl(): string {
  return getAgptServerApiUrl().replace("/api", "");
}

/**
 * Gets the AGPT WebSocket URL with server-side priority
 * Server-side: Uses AGPT_WS_SERVER_URL (ws://websocket_server:8001/ws)
 * Client-side: Falls back to NEXT_PUBLIC_AGPT_WS_SERVER_URL (ws://localhost:8001/ws)
 */
export function getAgptWsServerUrl(): string {
  // If server-side and server URL exists, use it
  if (isServerSide() && process.env.AGPT_WS_SERVER_URL) {
    return process.env.AGPT_WS_SERVER_URL;
  }

  // Otherwise use the public URL
  return process.env.NEXT_PUBLIC_AGPT_WS_SERVER_URL || "ws://localhost:8001/ws";
}

/**
 * Gets the Supabase URL with server-side priority
 * Server-side: Uses SUPABASE_URL (http://kong:8000)
 * Client-side: Falls back to NEXT_PUBLIC_SUPABASE_URL (http://localhost:8000)
 */
export function getSupabaseUrl(): string {
  // If server-side and server URL exists, use it
  if (isServerSide() && process.env.SUPABASE_URL) {
    return process.env.SUPABASE_URL;
  }

  // Otherwise use the public URL
  return process.env.NEXT_PUBLIC_SUPABASE_URL || "http://localhost:8000";
}

/**
 * Gets the Supabase anon key
 * Uses NEXT_PUBLIC_SUPABASE_ANON_KEY since anon keys are public and same across environments
 */
export function getSupabaseAnonKey(): string {
  return process.env.NEXT_PUBLIC_SUPABASE_ANON_KEY || "";
}
@@ -1,7 +1,7 @@
 import { type CookieOptions } from "@supabase/ssr";
 import { SupabaseClient } from "@supabase/supabase-js";
 import { Key, storage } from "@/services/storage/local-storage";
-import { isServerSide } from "../utils/is-server-side";
+import { environment } from "@/services/environment";

 export const PROTECTED_PAGES = [
   "/monitor",
@@ -83,7 +83,7 @@ export function setupSessionEventListeners(
   onFocus: () => void,
   onStorageChange: (e: StorageEvent) => void,
 ): EventListeners {
-  if (isServerSide() || typeof document === "undefined") {
+  if (environment.isServerSide()) {
     return { cleanup: () => {} };
   }
@@ -4,7 +4,6 @@ import { User } from "@supabase/supabase-js";
 import { usePathname, useRouter } from "next/navigation";
 import { useEffect, useMemo, useRef, useState } from "react";
 import { useBackendAPI } from "@/lib/autogpt-server-api/context";
-import { getSupabaseUrl, getSupabaseAnonKey } from "@/lib/env-config";
 import {
   getCurrentUser,
   refreshSession,
@@ -20,6 +19,7 @@ import {
   setWebSocketDisconnectIntent,
   setupSessionEventListeners,
 } from "../helpers";
+import { environment } from "@/services/environment";

 export function useSupabase() {
   const router = useRouter();
@@ -33,12 +33,16 @@ export function useSupabase() {

   const supabase = useMemo(() => {
     try {
-      return createBrowserClient(getSupabaseUrl(), getSupabaseAnonKey(), {
-        isSingleton: true,
-        auth: {
-          persistSession: false, // Don't persist session on client with httpOnly cookies
+      return createBrowserClient(
+        environment.getSupabaseUrl(),
+        environment.getSupabaseAnonKey(),
+        {
+          isSingleton: true,
+          auth: {
+            persistSession: false, // Don't persist session on client with httpOnly cookies
+          },
         },
-      });
+      );
     } catch (error) {
       console.error("Error creating Supabase client", error);
       return null;
@@ -1,15 +1,15 @@
 import { createServerClient } from "@supabase/ssr";
 import { NextResponse, type NextRequest } from "next/server";
 import { getCookieSettings, isAdminPage, isProtectedPage } from "./helpers";
-import { getSupabaseUrl, getSupabaseAnonKey } from "../env-config";
+import { environment } from "@/services/environment";

 export async function updateSession(request: NextRequest) {
   let supabaseResponse = NextResponse.next({
     request,
   });

-  const supabaseUrl = getSupabaseUrl();
-  const supabaseKey = getSupabaseAnonKey();
+  const supabaseUrl = environment.getSupabaseUrl();
+  const supabaseKey = environment.getSupabaseAnonKey();
   const isAvailable = Boolean(supabaseUrl && supabaseKey);

   if (!isAvailable) {
@@ -1,6 +1,6 @@
 import { createServerClient, type CookieOptions } from "@supabase/ssr";
 import { getCookieSettings } from "../helpers";
-import { getSupabaseUrl, getSupabaseAnonKey } from "../../env-config";
+import { environment } from "@/services/environment";

 type Cookies = { name: string; value: string; options?: CookieOptions }[];

@@ -12,8 +12,8 @@ export async function getServerSupabase() {

   try {
     const supabase = createServerClient(
-      getSupabaseUrl(),
-      getSupabaseAnonKey(),
+      environment.getSupabaseUrl(),
+      environment.getSupabaseAnonKey(),
       {
         cookies: {
           getAll() {
@@ -1,35 +1,27 @@
 /**
  * Utility functions for working with Cloudflare Turnstile
  */
-import { BehaveAs, getBehaveAs } from "@/lib/utils";
-import { getAgptServerApiUrl } from "@/lib/env-config";
+import { environment } from "@/services/environment";

 export async function verifyTurnstileToken(
   token: string,
   action?: string,
 ): Promise<boolean> {
-  // Skip verification unless explicitly enabled via environment variable
-  if (process.env.NEXT_PUBLIC_TURNSTILE !== "enabled") {
-    return true;
-  }
-
-  // Skip verification in local development
-  const behaveAs = getBehaveAs();
-  if (behaveAs !== BehaveAs.CLOUD) {
+  if (!environment.isCAPTCHAEnabled() || environment.isLocal()) {
     return true;
   }

   try {
-    const response = await fetch(`${getAgptServerApiUrl()}/turnstile/verify`, {
-      method: "POST",
-      headers: {
-        "Content-Type": "application/json",
-      },
-      body: JSON.stringify({
-        token,
-        action,
-      }),
-    });
+    const response = await fetch(
+      `${environment.getAGPTServerApiUrl()}/turnstile/verify`,
+      {
+        method: "POST",
+        headers: {
+          "Content-Type": "application/json",
+        },
+        body: JSON.stringify({
+          token,
+          action,
+        }),
+      },
+    );

     if (!response.ok) {
       console.error("Turnstile verification failed:", await response.text());
@@ -228,36 +228,6 @@ export function getPrimaryCategoryColor(categories: Category[]): string {
   );
 }

-export enum BehaveAs {
-  CLOUD = "CLOUD",
-  LOCAL = "LOCAL",
-}
-
-export function getBehaveAs(): BehaveAs {
-  return process.env.NEXT_PUBLIC_BEHAVE_AS === "CLOUD"
-    ? BehaveAs.CLOUD
-    : BehaveAs.LOCAL;
-}
-
-export enum AppEnv {
-  LOCAL = "local",
-  DEV = "dev",
-  PROD = "prod",
-}
-
-export function getAppEnv(): AppEnv {
-  const env = process.env.NEXT_PUBLIC_APP_ENV;
-  if (env === "dev") return AppEnv.DEV;
-  if (env === "prod") return AppEnv.PROD;
-  // Some places use prod and others production
-  if (env === "production") return AppEnv.PROD;
-  return AppEnv.LOCAL;
-}
-
-export function getEnvironmentStr(): string {
-  return `app:${getAppEnv().toLowerCase()}-behave:${getBehaveAs().toLowerCase()}`;
-}
-
 function rectanglesOverlap(
   rect1: { x: number; y: number; width: number; height?: number },
   rect2: { x: number; y: number; width: number; height?: number },
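The removed `getAppEnv` helper normalized two spellings of the production flag. A minimal standalone sketch of that mapping (the argument-passing style is mine; the project read `process.env.NEXT_PUBLIC_APP_ENV` directly):

```typescript
enum AppEnv {
  LOCAL = "local",
  DEV = "dev",
  PROD = "prod",
}

// Some deployments set "prod" and others "production"; both map to PROD,
// and anything unrecognized falls back to LOCAL.
function getAppEnv(raw: string | undefined): AppEnv {
  if (raw === "dev") return AppEnv.DEV;
  if (raw === "prod" || raw === "production") return AppEnv.PROD;
  return AppEnv.LOCAL;
}

console.log(getAppEnv("production")); // → prod
```

Normalizing at the single read site means every caller compares against one canonical enum value instead of re-checking both spellings.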
@@ -1,3 +0,0 @@ (file deleted)
export const isServerSide = (): boolean => {
  return typeof window === "undefined";
};
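The deleted one-line helper hinged on the absence of the `window` global; presumably the new `environment.isServerSide()` performs the same check (an assumption — the environment service's source is not part of this diff). A portable sketch:

```typescript
// "Server side" here means: no window global, i.e. running under Node
// rather than in a browser. Reading via globalThis keeps the check
// valid even where no DOM typings declare `window`.
const isServerSide = (): boolean =>
  typeof (globalThis as any).window === "undefined";
```

The `typeof` check is the idiomatic form because it never throws, even when the identifier was never declared.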
@@ -1,68 +0,0 @@
-/**
- * Modified copy of ga.tsx from @next/third-parties/google, with modified gtag.js source URL.
- * Original source file: https://github.com/vercel/next.js/blob/b304b45e3a6e3e79338568d76e28805e77c03ec9/packages/third-parties/src/google/ga.tsx
- */
-
-"use client";
-
-import type { GAParams } from "@/types/google";
-import Script from "next/script";
-import { useEffect } from "react";
-
-let currDataLayerName: string | undefined = undefined;
-
-export function GoogleAnalytics(props: GAParams) {
-  const { gaId, debugMode, dataLayerName = "dataLayer", nonce } = props;
-
-  if (currDataLayerName === undefined) {
-    currDataLayerName = dataLayerName;
-  }
-
-  useEffect(() => {
-    // Feature usage signal (same as original implementation)
-    performance.mark("mark_feature_usage", {
-      detail: {
-        feature: "custom-ga",
-      },
-    });
-  }, []);
-
-  return (
-    <>
-      <Script
-        id="_custom-ga-init"
-        // Using "afterInteractive" to avoid blocking the initial page rendering
-        strategy="afterInteractive"
-        dangerouslySetInnerHTML={{
-          __html: `
-          window['${dataLayerName}'] = window['${dataLayerName}'] || [];
-          function gtag(){window['${dataLayerName}'].push(arguments);}
-          gtag('js', new Date());
-          gtag('config', '${gaId}' ${debugMode ? ",{ 'debug_mode': true }" : ""});
-          `,
-        }}
-        nonce={nonce}
-      />
-      <Script
-        id="_custom-ga"
-        strategy="afterInteractive"
-        src="/gtag.js"
-        nonce={nonce}
-      />
-    </>
-  );
-}
-
-export function sendGAEvent(...args: any[]) {
-  if (currDataLayerName === undefined) {
-    console.warn(`Custom GA: GA has not been initialized`);
-    return;
-  }
-
-  const dataLayer = (window as any)[currDataLayerName];
-  if (dataLayer) {
-    dataLayer.push(...args);
-  } else {
-    console.warn(`Custom GA: dataLayer ${currDataLayerName} does not exist`);
-  }
-}
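The inline bootstrap script in the deleted file (and its replacement below) defines `gtag` as nothing more than a queue onto the named dataLayer, which `gtag.js` drains once it loads. A minimal sketch of that pattern, with `createGtag` as an illustrative name (the real snippet pushes the `arguments` object rather than an array):

```typescript
// Sketch of the inlined gtag bootstrap: gtag() only queues its arguments
// onto the dataLayer; the loaded gtag.js script processes them later.
// `createGtag` is an illustrative helper, not part of the PR's code.
function createGtag(dataLayer: unknown[]) {
  return function gtag(...args: unknown[]) {
    dataLayer.push(args); // real snippet pushes `arguments` instead of an array
  };
}

const layer: unknown[] = [];
const gtag = createGtag(layer);
gtag("js", new Date());
gtag("config", "G-XXXX"); // hypothetical measurement ID
console.log(layer.length); // → 2
```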
autogpt_platform/frontend/src/services/analytics/index.tsx (new file, 114 lines)
@@ -0,0 +1,114 @@
+/**
+ * Modified copy of ga.tsx from @next/third-parties/google, with modified gtag.js source URL.
+ * Original source file: https://github.com/vercel/next.js/blob/b304b45e3a6e3e79338568d76e28805e77c03ec9/packages/third-parties/src/google/ga.tsx
+ */
+
+"use client";
+
+import type { GAParams } from "@/types/google";
+import Script from "next/script";
+import { useEffect } from "react";
+import { environment } from "../environment";
+
+declare global {
+  interface Window {
+    datafast: (name: string, metadata: Record<string, unknown>) => void;
+    [key: string]: unknown[] | ((...args: unknown[]) => void) | unknown;
+  }
+}
+
+let currDataLayerName: string | undefined = undefined;
+
+type SetupProps = {
+  ga: GAParams;
+  host: string;
+};
+
+export function SetupAnalytics(props: SetupProps) {
+  const { ga, host } = props;
+  const { gaId, debugMode, dataLayerName = "dataLayer", nonce } = ga;
+  const isProductionDomain = host.includes("platform.agpt.co");
+
+  // Datafa.st journey analytics only on production
+  const dataFastEnabled = isProductionDomain;
+  // We collect analytics too for open source developers running the platform locally
+  const googleAnalyticsEnabled = environment.isLocal() || isProductionDomain;
+
+  if (currDataLayerName === undefined) {
+    currDataLayerName = dataLayerName;
+  }
+
+  useEffect(() => {
+    if (!googleAnalyticsEnabled) return;
+
+    // Google Analytics: feature usage signal (same as original implementation)
+    performance.mark("mark_feature_usage", {
+      detail: {
+        feature: "custom-ga",
+      },
+    });
+  }, [googleAnalyticsEnabled]);
+
+  return (
+    <>
+      {/* Google Analytics */}
+      {googleAnalyticsEnabled ? (
+        <>
+          <Script
+            id="_custom-ga-init"
+            strategy="afterInteractive"
+            dangerouslySetInnerHTML={{
+              __html: `
+              window['${dataLayerName}'] = window['${dataLayerName}'] || [];
+              function gtag(){window['${dataLayerName}'].push(arguments);}
+              gtag('js', new Date());
+              gtag('config', '${gaId}' ${debugMode ? ",{ 'debug_mode': true }" : ""});
+              `,
+            }}
+            nonce={nonce}
+          />
+          {/* Google Tag Manager */}
+          <Script
+            id="_custom-ga"
+            strategy="afterInteractive"
+            src="/gtag.js"
+            nonce={nonce}
+          />
+        </>
+      ) : null}
+      {/* Datafa.st */}
+      {dataFastEnabled ? (
+        <Script
+          strategy="afterInteractive"
+          data-website-id="dfid_g5wtBIiHUwSkWKcGz80lu"
+          data-domain="agpt.co"
+          src="https://datafa.st/js/script.js"
+        />
+      ) : null}
+    </>
+  );
+}
+
+export const analytics = {
+  sendGAEvent,
+  sendDatafastEvent,
+};
+
+function sendGAEvent(...args: unknown[]) {
+  if (typeof window === "undefined") return;
+  if (currDataLayerName === undefined) return;
+
+  const dataLayer = window[currDataLayerName];
+  if (!dataLayer) return;
+
+  if (Array.isArray(dataLayer)) {
+    dataLayer.push(...args);
+  } else {
+    console.warn(`Custom GA: dataLayer ${currDataLayerName} does not exist`);
+  }
+}
+
+function sendDatafastEvent(name: string, metadata: Record<string, unknown>) {
+  if (typeof window === "undefined" || !window.datafast) return;
+  window.datafast(name, metadata);
+}
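The new `sendGAEvent` guards every push: it bails out on the server, when no dataLayer name was ever registered, and when the named dataLayer is missing or not an array. That guard logic can be sketched free of the DOM; `pushGAEvent` and its `scope` parameter (standing in for `window`) are assumptions of this sketch, not the PR's API:

```typescript
// Sketch of the guarded dataLayer push in sendGAEvent. `scope` stands in
// for `window`; function name and signature are illustrative assumptions.
type Scope = Record<string, unknown>;

function pushGAEvent(scope: Scope, dataLayerName: string, ...args: unknown[]): boolean {
  const dataLayer = scope[dataLayerName];
  // Bail out silently when the dataLayer was never initialized
  if (!Array.isArray(dataLayer)) return false;
  dataLayer.push(...args);
  return true;
}

const scope: Scope = { dataLayer: [] };
pushGAEvent(scope, "dataLayer", "event", "sign_up");
console.log((scope.dataLayer as unknown[]).length); // → 2
```

Returning a boolean (instead of logging a warning, as the real code does) keeps the sketch easy to exercise in isolation.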
autogpt_platform/frontend/src/services/environment/index.ts (new file, 115 lines)
@@ -0,0 +1,115 @@
+export enum BehaveAs {
+  CLOUD = "CLOUD",
+  LOCAL = "LOCAL",
+}
+
+function getBehaveAs(): BehaveAs {
+  return process.env.NEXT_PUBLIC_BEHAVE_AS === "CLOUD"
+    ? BehaveAs.CLOUD
+    : BehaveAs.LOCAL;
+}
+
+export enum AppEnv {
+  LOCAL = "local",
+  DEV = "dev",
+  PROD = "prod",
+}
+
+function getAppEnv(): AppEnv {
+  const env = process.env.NEXT_PUBLIC_APP_ENV;
+  if (env === "dev") return AppEnv.DEV;
+  if (env === "prod") return AppEnv.PROD;
+  // Some places use prod and others production
+  if (env === "production") return AppEnv.PROD;
+  return AppEnv.LOCAL;
+}
+
+function getAGPTServerApiUrl() {
+  if (environment.isServerSide() && process.env.AGPT_SERVER_URL) {
+    return process.env.AGPT_SERVER_URL;
+  }
+
+  return process.env.NEXT_PUBLIC_AGPT_SERVER_URL || "http://localhost:8006/api";
+}
+
+function getAGPTServerBaseUrl() {
+  return getAGPTServerApiUrl().replace("/api", "");
+}
+
+function getAGPTWsServerUrl() {
+  if (environment.isServerSide() && process.env.AGPT_WS_SERVER_URL) {
+    return process.env.AGPT_WS_SERVER_URL;
+  }
+
+  return process.env.NEXT_PUBLIC_AGPT_WS_SERVER_URL || "ws://localhost:8001/ws";
+}
+
+function getSupabaseUrl() {
+  if (environment.isServerSide() && process.env.SUPABASE_URL) {
+    return process.env.SUPABASE_URL;
+  }
+
+  return process.env.NEXT_PUBLIC_SUPABASE_URL || "http://localhost:8000";
+}
+
+function getSupabaseAnonKey() {
+  return process.env.NEXT_PUBLIC_SUPABASE_ANON_KEY || "";
+}
+
+function getEnvironmentStr() {
+  return `app:${getAppEnv().toLowerCase()}-behave:${getBehaveAs().toLowerCase()}`;
+}
+
+function isProd() {
+  return process.env.NODE_ENV === "production";
+}
+
+function isDev() {
+  return process.env.NODE_ENV === "development";
+}
+
+function isCloud() {
+  return getBehaveAs() === BehaveAs.CLOUD;
+}
+
+function isLocal() {
+  return getBehaveAs() === BehaveAs.LOCAL;
+}
+
+function isServerSide() {
+  return typeof window === "undefined";
+}
+
+function isClientSide() {
+  return typeof window !== "undefined";
+}
+
+function isCAPTCHAEnabled() {
+  return process.env.NEXT_PUBLIC_TURNSTILE === "enabled";
+}
+
+function areFeatureFlagsEnabled() {
+  return process.env.NEXT_PUBLIC_LAUNCHDARKLY_ENABLED === "enabled";
+}
+
+export const environment = {
+  // Generic
+  getEnvironmentStr,
+  // Get environment variables config
+  getBehaveAs,
+  getAppEnv,
+  getAGPTServerApiUrl,
+  getAGPTServerBaseUrl,
+  getAGPTWsServerUrl,
+  getSupabaseUrl,
+  getSupabaseAnonKey,
+  // Assertions
+  isServerSide,
+  isClientSide,
+  isProd,
+  isDev,
+  isCloud,
+  isLocal,
+  isCAPTCHAEnabled,
+  areFeatureFlagsEnabled,
+};
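The `getAppEnv` fallback chain above is the subtle part of the new service: "prod" and "production" both map to `AppEnv.PROD`, and anything unrecognized falls back to `AppEnv.LOCAL`. A minimal sketch, with the env string passed explicitly so the chain can be exercised without touching `process.env` (an assumption of this sketch, not how the service reads it):

```typescript
// Sketch of the getAppEnv fallback chain from the environment service.
enum AppEnv {
  LOCAL = "local",
  DEV = "dev",
  PROD = "prod",
}

function resolveAppEnv(env: string | undefined): AppEnv {
  if (env === "dev") return AppEnv.DEV;
  // Some places use "prod" and others "production"
  if (env === "prod" || env === "production") return AppEnv.PROD;
  return AppEnv.LOCAL;
}

console.log(resolveAppEnv("production")); // → "prod"
```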
@@ -4,15 +4,15 @@ import { LDProvider } from "launchdarkly-react-client-sdk";
 import type { ReactNode } from "react";
 import { useMemo } from "react";
 import { useSupabase } from "@/lib/supabase/hooks/useSupabase";
-import { BehaveAs, getBehaveAs } from "@/lib/utils";
 import * as Sentry from "@sentry/nextjs";
+import { environment } from "../environment";
 
 const clientId = process.env.NEXT_PUBLIC_LAUNCHDARKLY_CLIENT_ID;
 const envEnabled = process.env.NEXT_PUBLIC_LAUNCHDARKLY_ENABLED === "true";
 
 export function LaunchDarklyProvider({ children }: { children: ReactNode }) {
   const { user, isUserLoading } = useSupabase();
-  const isCloud = getBehaveAs() === BehaveAs.CLOUD;
+  const isCloud = environment.isCloud();
   const isLaunchDarklyConfigured = isCloud && envEnabled && clientId;
 
   const context = useMemo(() => {
@@ -1,8 +1,8 @@
 "use client";
 
 import { DEFAULT_SEARCH_TERMS } from "@/app/(platform)/marketplace/components/HeroSection/helpers";
-import { BehaveAs, getBehaveAs } from "@/lib/utils";
 import { useFlags } from "launchdarkly-react-client-sdk";
+import { environment } from "../environment";
 
 export enum Flag {
   BETA_BLOCKS = "beta-blocks",
@@ -51,7 +51,7 @@ const mockFlags = {
 export function useGetFlag<T extends Flag>(flag: T): FlagValues[T] | null {
   const currentFlags = useFlags<FlagValues>();
   const flagValue = currentFlags[flag];
-  const isCloud = getBehaveAs() === BehaveAs.CLOUD;
+  const isCloud = environment.isCloud();
 
   if ((isPwMockEnabled && !isCloud) || flagValue === undefined) {
     return mockFlags[flag];
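The hunk above only swaps how `isCloud` is computed, but the surrounding fallback rule is worth spelling out: a mock flag value wins either when Playwright mocking is enabled outside cloud, or when LaunchDarkly has not delivered a value yet. A sketch of that decision as a pure function (the parameter names are illustrative; the real hook reads these from hooks and module state):

```typescript
// Sketch of the mock-flag fallback in useGetFlag.
function resolveFlag<T>(
  flagValue: T | undefined,
  mockValue: T,
  isPwMockEnabled: boolean,
  isCloud: boolean,
): T {
  // Mocks win during Playwright runs outside cloud, or while LD has no value
  if ((isPwMockEnabled && !isCloud) || flagValue === undefined) return mockValue;
  return flagValue;
}

console.log(resolveFlag(true, false, false, true)); // → true
```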
@@ -1,5 +1,5 @@
-import { isServerSide } from "@/lib/utils/is-server-side";
 import * as Sentry from "@sentry/nextjs";
+import { environment } from "../environment";
 
 export enum Key {
   LOGOUT = "supabase-logout",
@@ -10,7 +10,7 @@ export enum Key {
 }
 
 function get(key: Key) {
-  if (isServerSide()) {
+  if (environment.isServerSide()) {
     Sentry.captureException(new Error("Local storage is not available"));
     return;
   }
@@ -23,7 +23,7 @@ function get(key: Key) {
 }
 
 function set(key: Key, value: string) {
-  if (isServerSide()) {
+  if (environment.isServerSide()) {
     Sentry.captureException(new Error("Local storage is not available"));
     return;
   }
@@ -31,7 +31,7 @@ function set(key: Key, value: string) {
 }
 
 function clean(key: Key) {
-  if (isServerSide()) {
+  if (environment.isServerSide()) {
     Sentry.captureException(new Error("Local storage is not available"));
     return;
   }
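All three helpers use the same guard: on the server there is no `window.localStorage`, so they report the error and return early instead of throwing. A sketch of that pattern with the storage and the reporter injected, purely so it is testable outside a browser (both parameters are assumptions of this sketch; the real code uses `localStorage` and `Sentry.captureException` directly):

```typescript
// Sketch of the server-side guard in the local-storage helpers.
type StorageLike = { getItem(key: string): string | null };

function safeGet(
  storage: StorageLike | undefined,
  key: string,
  report: (err: Error) => void,
): string | null | undefined {
  if (!storage) {
    // Mirrors Sentry.captureException(new Error("Local storage is not available"))
    report(new Error("Local storage is not available"));
    return;
  }
  return storage.getItem(key);
}
```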
@@ -7,7 +7,7 @@ A block that transcribes the audio content of a YouTube video into text.
 This block takes a YouTube video URL as input and produces a text transcript of the video's audio content. It also extracts and provides the unique video ID associated with the YouTube video.
 
 ### How it works
-The block first extracts the video ID from the provided YouTube URL. It then uses this ID to fetch the video's transcript. The transcript is processed and formatted into a readable text format. If any errors occur during this process, the block will capture and report them.
+The block first extracts the video ID from the provided YouTube URL. It then uses this ID to fetch the video's transcript, preferring English when available. If an English transcript is not available, the block will automatically use the first available transcript in any other language (prioritizing manually created transcripts over auto-generated ones). The transcript is processed and formatted into a readable text format. If any errors occur during this process, the block will capture and report them.
 
 ### Inputs
 | Input | Description |
@@ -22,5 +22,5 @@ The block first extracts the video ID from the provided YouTube URL. It then use
 | Error | Any error message that occurs if the transcription process fails. |
 
 ### Possible use case
-A content creator could use this block to automatically generate subtitles for their YouTube videos. They could also use it to create text-based summaries of video content for SEO purposes or to make their content more accessible to hearing-impaired viewers.
+A content creator could use this block to automatically generate subtitles for their YouTube videos. They could also use it to create text-based summaries of video content for SEO purposes or to make their content more accessible to hearing-impaired viewers. The automatic language fallback feature ensures that transcripts can be obtained even from videos that only have subtitles in non-English languages.
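The documented selection order (English first, then manually created transcripts, then whatever remains) can be sketched as a small pure function. This is an illustration of the documented behavior only, not the block's actual backend implementation, and the `TranscriptInfo` shape is an assumption:

```typescript
// Sketch of the documented transcript-selection order:
// 1. any English transcript, 2. any manually created transcript,
// 3. the first available transcript of any kind.
type TranscriptInfo = { language: string; isGenerated: boolean };

function pickTranscript(available: TranscriptInfo[]): TranscriptInfo | undefined {
  const english = available.find((t) => t.language.startsWith("en"));
  if (english) return english;
  // Prefer manually created transcripts over auto-generated ones
  const manual = available.find((t) => !t.isGenerated);
  return manual ?? available[0];
}
```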