# feat(platform): add feature request tools for CoPilot chat (#12102)
Users can now search for existing feature requests and submit new ones
directly through the CoPilot chat interface. Requests are tracked in
Linear with customer need attribution.

### Changes 🏗️

**Backend:**
- Added `SearchFeatureRequestsTool` and `CreateFeatureRequestTool` to
the CoPilot chat tools registry
- Integrated with Linear GraphQL API for searching issues in the feature requests project, creating new issues, upserting customers, and attaching customer needs (sketched below)
- Added `linear_api_key` secret to settings for system-level Linear API
access
- Added response models (`FeatureRequestSearchResponse`,
`FeatureRequestCreatedResponse`, `FeatureRequestInfo`) to the tools
models
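
For orientation, here is a rough TypeScript sketch of the kind of Linear GraphQL search call involved. This is illustrative only: the actual tool is implemented in the backend, and its query shape, project filter, and field names may differ.

```ts
// Illustrative sketch, not the PR's implementation. LINEAR_API_KEY is the
// system-level secret added by this PR; Linear accepts personal API keys
// directly in the Authorization header.
const LINEAR_GRAPHQL_URL = "https://api.linear.app/graphql";

const SEARCH_ISSUES_QUERY = /* GraphQL */ `
  query SearchFeatureRequests($term: String!) {
    searchIssues(term: $term, first: 10) {
      nodes {
        id
        identifier
        title
        url
      }
    }
  }
`;

interface FeatureRequestInfo {
  id: string;
  identifier: string;
  title: string;
  url: string;
}

export async function searchFeatureRequests(
  term: string,
): Promise<FeatureRequestInfo[]> {
  const response = await fetch(LINEAR_GRAPHQL_URL, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: process.env.LINEAR_API_KEY ?? "",
    },
    body: JSON.stringify({ query: SEARCH_ISSUES_QUERY, variables: { term } }),
  });

  if (!response.ok) {
    throw new Error(`Linear API request failed: ${response.status}`);
  }

  const { data } = await response.json();
  return data.searchIssues.nodes as FeatureRequestInfo[];
}
```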

**Frontend:**
- Added `SearchFeatureRequestsTool` and `CreateFeatureRequestTool` UI components with full streaming state handling (input-streaming, input-available, output-available, output-error); a simplified sketch follows this list
- Added helper utilities for output parsing, type guards, animation
text, and icon rendering
- Wired tools into `ChatMessagesContainer` for rendering in the chat
- Added styleguide examples covering all tool states
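
A simplified sketch of the streaming-state handling described above. The component name and props here are illustrative; the real components also parse outputs and render an accordion UI for results.

```tsx
// Simplified sketch; the real frontend components handle richer output
// parsing, icons, and accordion rendering.
type ToolPartState =
  | "input-streaming"
  | "input-available"
  | "output-available"
  | "output-error";

interface SearchFeatureRequestsToolProps {
  state: ToolPartState;
  output?: unknown;
  errorText?: string;
}

export function SearchFeatureRequestsToolView({
  state,
  output,
  errorText,
}: SearchFeatureRequestsToolProps) {
  switch (state) {
    case "input-streaming":
    case "input-available":
      // The tool call is still being prepared or executed.
      return <p>Searching feature requests…</p>;
    case "output-available":
      // The real component parses the output and renders results in an accordion.
      return <pre>{JSON.stringify(output, null, 2)}</pre>;
    case "output-error":
      return <p>Could not search feature requests: {errorText}</p>;
    default:
      return null;
  }
}
```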

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Verified search returns matching feature requests from Linear
  - [x] Verified creating a new feature request creates an issue and customer need in Linear
  - [x] Verified adding a need to an existing issue works via `existing_issue_id`
  - [x] Verified error states render correctly in the UI
  - [x] Verified styleguide page renders all tool states

#### For configuration changes:
- [x] `.env.default` is updated or already compatible with my changes
- [x] I have included a list of my configuration changes in the PR
description (under **Changes**)

New secret: `LINEAR_API_KEY` — required for system-level Linear API
operations (defaults to empty string).


<h2>Greptile Overview</h2>

<details><summary><h3>Greptile Summary</h3></summary>

Adds feature request search and creation tools to CoPilot chat,
integrating with Linear's GraphQL API to track user feedback. Users can
now search existing feature requests and submit new ones (or add their
need to existing issues) directly through conversation.

**Key changes:**
- Backend: `SearchFeatureRequestsTool` and `CreateFeatureRequestTool`
with Linear API integration via system-level `LINEAR_API_KEY`
- Frontend: React components with streaming state handling and accordion
UI for search results and creation confirmations
- Models: Added `FeatureRequestSearchResponse` and
`FeatureRequestCreatedResponse` to response types
- Customer need tracking: Upserts customers in Linear and attaches needs
to issues for better feedback attribution

**Issues found:**
- Missing `LINEAR_API_KEY` entry in `.env.default` (required per PR
description checklist)
- Hardcoded project/team IDs reduce maintainability
- Global singleton pattern could cause issues in async contexts
- Using `user_id` as customer name reduces readability in Linear
</details>


<details><summary><h3>Confidence Score: 4/5</h3></summary>

- Safe to merge with minor configuration fix required
- The implementation is well-structured with proper error handling, type
safety, and follows existing patterns in the codebase. The missing
`.env.default` entry is a straightforward configuration issue that must
be fixed before deployment but doesn't affect code quality. The other
findings are style improvements that don't impact functionality.
- Verify that `LINEAR_API_KEY` is added to `.env.default` before merging
</details>


<details><summary><h3>Sequence Diagram</h3></summary>

```mermaid
sequenceDiagram
    participant User
    participant CoPilot UI
    participant LLM
    participant FeatureRequestTool
    participant LinearClient
    participant Linear API

    User->>CoPilot UI: Request feature via chat
    CoPilot UI->>LLM: Send user message
    
    LLM->>FeatureRequestTool: search_feature_requests(query)
    FeatureRequestTool->>LinearClient: query(SEARCH_ISSUES_QUERY)
    LinearClient->>Linear API: POST /graphql (search)
    Linear API-->>LinearClient: searchIssues.nodes[]
    LinearClient-->>FeatureRequestTool: Feature request data
    FeatureRequestTool-->>LLM: FeatureRequestSearchResponse
    
    alt No existing requests found
        LLM->>FeatureRequestTool: create_feature_request(title, description)
        FeatureRequestTool->>LinearClient: mutate(CUSTOMER_UPSERT_MUTATION)
        LinearClient->>Linear API: POST /graphql (upsert customer)
        Linear API-->>LinearClient: customer {id, name}
        LinearClient-->>FeatureRequestTool: Customer data
        
        FeatureRequestTool->>LinearClient: mutate(ISSUE_CREATE_MUTATION)
        LinearClient->>Linear API: POST /graphql (create issue)
        Linear API-->>LinearClient: issue {id, identifier, url}
        LinearClient-->>FeatureRequestTool: Issue data
        
        FeatureRequestTool->>LinearClient: mutate(CUSTOMER_NEED_CREATE_MUTATION)
        LinearClient->>Linear API: POST /graphql (attach need)
        Linear API-->>LinearClient: need {id, issue}
        LinearClient-->>FeatureRequestTool: Need data
        FeatureRequestTool-->>LLM: FeatureRequestCreatedResponse
    else Existing request found
        LLM->>FeatureRequestTool: create_feature_request(title, description, existing_issue_id)
        FeatureRequestTool->>LinearClient: mutate(CUSTOMER_UPSERT_MUTATION)
        LinearClient->>Linear API: POST /graphql (upsert customer)
        Linear API-->>LinearClient: customer {id}
        LinearClient-->>FeatureRequestTool: Customer data
        
        FeatureRequestTool->>LinearClient: mutate(CUSTOMER_NEED_CREATE_MUTATION)
        LinearClient->>Linear API: POST /graphql (attach need to existing)
        Linear API-->>LinearClient: need {id, issue}
        LinearClient-->>FeatureRequestTool: Need data
        FeatureRequestTool-->>LLM: FeatureRequestCreatedResponse
    end
    
    LLM-->>CoPilot UI: Tool response + continuation
    CoPilot UI-->>User: Display result with accordion UI
```
</details>


<sub>Last reviewed commit: af2e093</sub>


This is the frontend for the next generation of AutoGPT.

## 🧢 Getting Started

This project uses pnpm as the package manager via corepack. Corepack is a Node.js tool that automatically manages package managers without requiring global installations.

For architecture, conventions, data fetching, feature flags, design system usage, state management, and PR process, see CONTRIBUTING.md. For Playwright and Storybook testing setup, see TESTING.md.

### Prerequisites

Make sure you have Node.js 16.10+ installed. Corepack is included with Node.js by default.

### Setup

1. Enable corepack (run this once on your system):

       corepack enable

   This enables corepack to automatically manage pnpm based on the `packageManager` field in `package.json`.

2. Install dependencies:

       pnpm i

3. Start the development server:

#### Running the Front-end & Back-end separately

We recommend this approach if you are doing active development on the project. First spin up the Back-end:

    # on `autogpt_platform`
    docker compose --profile local up deps_backend -d
    # on `autogpt_platform/backend`
    poetry run app

Then start the Front-end:

    # on `autogpt_platform/frontend`
    pnpm dev

Open http://localhost:3000 in your browser to see the result. If the server starts on http://localhost:3001 instead, the Front-end is already running via Docker; stop that container first, or run `docker compose down`.

You can start editing the page by modifying `app/page.tsx`. The page auto-updates as you edit the file.

#### Running both the Front-end and Back-end via Docker

If you run:

    # on `autogpt_platform`
    docker compose up -d

it will spin up both the Back-end and the Front-end via Docker. The Front-end will start on port 3000. This might not be what you want when actively contributing to the Front-end, as you won't have direct/easy access to the Next.js dev server.

### Subsequent Runs

For subsequent development sessions, you only need to run:

    pnpm dev

Whenever you or someone else adds a new Front-end dependency, run `pnpm i` again to install it.

### Available Scripts

- `pnpm dev` - Start development server
- `pnpm build` - Build for production
- `pnpm start` - Start production server
- `pnpm lint` - Run ESLint and Prettier checks
- `pnpm format` - Format code with Prettier
- `pnpm types` - Run TypeScript type checking
- `pnpm test` - Run Playwright tests
- `pnpm test-ui` - Run Playwright tests with UI
- `pnpm fetch:openapi` - Fetch OpenAPI spec from backend
- `pnpm generate:api-client` - Generate API client from OpenAPI spec
- `pnpm generate:api` - Fetch OpenAPI spec and generate API client

This project uses next/font to automatically optimize and load Inter, a custom Google Font.
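
For reference, a minimal sketch of how `next/font` is typically wired in an App Router layout. The file path and markup below are illustrative; the actual layout in this repo may differ.

```tsx
// app/layout.tsx (illustrative sketch, not necessarily this repo's exact layout)
import { Inter } from "next/font/google";
import type { ReactNode } from "react";

// next/font downloads and self-hosts the font at build time.
const inter = Inter({ subsets: ["latin"] });

export default function RootLayout({ children }: { children: ReactNode }) {
  return (
    <html lang="en" className={inter.className}>
      <body>{children}</body>
    </html>
  );
}
```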

## 🔄 Data Fetching

See CONTRIBUTING.md for guidance on generated API hooks, SSR + hydration patterns, and usage examples. You generally do not need to run OpenAPI commands unless adding/modifying backend endpoints.
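
As a rough illustration of that pattern, a generated query hook is consumed like any other React Query hook. The hook name and import path below are hypothetical; the real ones are produced by `pnpm generate:api` and documented in CONTRIBUTING.md.

```tsx
"use client";

// Hypothetical hook and import path; the real generated hooks come from the
// backend's OpenAPI spec via `pnpm generate:api`.
import { useListExampleResources } from "@/lib/api/generated/example";

export function ExampleResourceList() {
  const { data, isLoading, error } = useListExampleResources();

  if (isLoading) return <p>Loading...</p>;
  if (error) return <p>Failed to load resources.</p>;

  return (
    <ul>
      {data?.map((resource: { id: string; name: string }) => (
        <li key={resource.id}>{resource.name}</li>
      ))}
    </ul>
  );
}
```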

## 🚩 Feature Flags

See CONTRIBUTING.md for feature flag usage patterns, local development with mocks, and how to add new flags.

## 🚚 Deploy

TODO

## 📙 Storybook

Storybook is a powerful development environment for UI components. It allows you to build UI components in isolation, making it easier to develop, test, and document your components independently from your main application.

### Purpose in the Development Process

  1. Component Development: Develop and test UI components in isolation.
  2. Visual Testing: Easily spot visual regressions.
  3. Documentation: Automatically document components and their props.
  4. Collaboration: Share components with your team or stakeholders for feedback.

### How to Use Storybook

  1. Start Storybook: Run the following command to start the Storybook development server:

    pnpm storybook
    

    This will start Storybook on port 6006. Open http://localhost:6006 in your browser to view your component library.

  2. Build Storybook: To build a static version of Storybook for deployment, use:

    pnpm build-storybook
    
  3. Running Storybook Tests: Storybook tests can be run using:

    pnpm test-storybook
    
  4. Writing Stories: Create .stories.tsx files alongside your components to define different states and variations of your components.
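
For example, a minimal story file might look like this. The component, title, and imports are illustrative; follow the existing stories in this repo for the real conventions.

```tsx
// Button.stories.tsx (illustrative example)
import type { Meta, StoryObj } from "@storybook/react";

import { Button } from "./Button";

const meta: Meta<typeof Button> = {
  title: "Example/Button",
  component: Button,
};

export default meta;
type Story = StoryObj<typeof Button>;

// One named export per state or variation you want to showcase.
export const Primary: Story = {
  args: { children: "Click me" },
};
```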

By integrating Storybook into our development workflow, we can streamline UI development, improve component reusability, and maintain a consistent design system across the project.

## 🔭 Tech Stack

### Core Framework & Language

- Next.js - React framework with App Router
- React - UI library for building user interfaces
- TypeScript - Typed JavaScript for better developer experience

### Styling & UI Components

### Development & Testing

- Playwright - End-to-end testing
- Storybook - Isolated UI component development and testing
- ESLint & Prettier - Linting and code formatting

### Backend & Services

- Supabase - Backend-as-a-Service (database, auth, storage)
- Sentry - Error monitoring and performance tracking

### Package Management

- pnpm - Fast, disk space efficient package manager
- Corepack - Node.js package manager management

### Additional Libraries

### Development Tools

- `NEXT_PUBLIC_REACT_QUERY_DEVTOOL` - Enable React Query DevTools. Set to `true` to enable (see the sketch below).
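
A sketch of how a flag like this is typically consumed. The provider file and naming are illustrative, not this repo's actual code.

```tsx
"use client";

// Illustrative sketch of wiring NEXT_PUBLIC_REACT_QUERY_DEVTOOL; the repo's
// actual provider setup may differ.
import { QueryClient, QueryClientProvider } from "@tanstack/react-query";
import { ReactQueryDevtools } from "@tanstack/react-query-devtools";
import { useState, type ReactNode } from "react";

export function Providers({ children }: { children: ReactNode }) {
  // Create the client once per mount so it is not recreated on re-renders.
  const [queryClient] = useState(() => new QueryClient());

  return (
    <QueryClientProvider client={queryClient}>
      {children}
      {/* Only mount the devtools when the env flag is explicitly set to "true". */}
      {process.env.NEXT_PUBLIC_REACT_QUERY_DEVTOOL === "true" && (
        <ReactQueryDevtools initialIsOpen={false} />
      )}
    </QueryClientProvider>
  );
}
```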