mirror of
https://github.com/simstudioai/sim.git
synced 2026-01-10 07:27:57 -05:00
feat(copilot): implement copilot (#850)
* Fix autolayout v1 * Nuke tool call history * Modal v1 * Preview v2 * more updates * Checkpoint * Checkpoint * Better preview * Big stuff * Chat ui * Ui * Update * Changes * Increase token limit for copilot * Preview layout fixes * Checkpoint * sse checkpoint * Checkpoint * Continuation logic * Checkpoint * Updates * UPdates * Updateds * Cleanup * Update * Diff checker * test * Checkpoint * Checkpoint * Checkpoint again * checkpoint again * Chat box output format * Auto open diff canvas * Checkpoint * Checkpoit * Cleaning? * Diff checkpoint * Checkpoint * Persist changes * Autolayout fixces * Color diff fixes * Opusss * Works?? * getting there * Fixes? * Fixes? * Chat fixes * Autolayout fixes * Subblock update diffs * Handle delete diffs * Edge diffs v1 * Deletion edge diff * It works, kinda * Fixes * Update * Docs changes * Prompt fixes and new tool * Persist color changes * Add serper to copilot tools * Get env vars copilot tool * Set env vars copilot tool * Copilot console tool * Promtps * Targeted v1 * Targeted v2 * Targeted updates better * Target fixes * diff works?? 
* Diff goes away when switching workflows * Fixes * Edge fixes * Remove brief error * Lint * Minor fix * Add abort controller * Prompting * Lint * Fix test * Fix lint * Update csp * Add route to send info to sim agent * Consolidated copilot * First refactor complete * Get user workflow now returns yaml * Checkpoint * Checkpoitn * Works * It works * Hi * Cumulative target edit * Checkpont * Checkpoint * Store updates * Checkpoitn * Smart title generation * Handle copilot aborts * Clean up old copilot code * Refactor tool names * Checkpoint * Remove old route * Fix chat loading * Scope copilot chat to workflow id * New chat fixes * Fix chat loading * Update get all blocks and tools * Make metadata better * Update docs * Conditional update * Yaml refactor * Set up yaml service client * Yaml service migration * Migrate diff engine to sim agent * Refactor diff engine * Fix autolayout problems and clean up code some more * improvement: copilot panel ui/ux * Lint * Cleaning * Fix autolayout * Fix chat output selector * Updated copilot * Small update * Checkpoint system v1 * Checkpoint ui * Copilot ui * Copilot ui * Checkpoint * Accept/reject state * Proper tool call display name updates * Fix tool call display updates * Streaming quality updates? * Abort on refresh/chat switch/workflow switch * Abort v2g * Perf updates * Ui updates * Smoother streaming * Handle edge diffs from sim agent properly * Enable text box while streaming, but keep send disabled * Fix set env var route * Add interrupt param * Add interrupt handler * Interrupt v1 * Interrupt ui update * Update icons * Ui fixes? * Simplify get user workflow tool * Run workflow v1 * Fix edit workflow * Run workflow e2e v1 * Run workflow v2 * Run workflow v3 * v1 of v3, bad code, needs refactro * Lint * Cleaning v1? 
* Checkpoitn * Fix backend processing of status * State updates * Closer * Icon update * More updates * Checkpoitn * Updates * Chat stays when tabbign * Fix get console logss * Markdown copilot * Fix tool naming for build workflow * Streaming improvements * Use logs for copilot run workflow * Progress * it works? * Fix chat loading * Getting there * Autoscroll to bottom of chat * Fix icons * Lint * Confirm route polling * Cleaning * Lint * Remove migrations * Remove journal entry 64 * Resolve merge conflicts * Db migration * Fix test and lint * Fix tsvector lint error * Fixes --------- Co-authored-by: Emir Karabeg <emirkarabeg@berkeley.edu>
This commit is contained in:
committed by GitHub
parent 9c3bcbabf9
commit ab85c1a215
@@ -33,30 +33,54 @@ properties:
     enum: [GET, POST, PUT, DELETE, PATCH]
     description: HTTP method for the request
     default: GET
-  queryParams:
+  params:
     type: array
-    description: Query parameters as key-value pairs
+    description: Query parameters as table entries
     items:
       type: object
+      required:
+        - id
+        - cells
       properties:
-        key:
+        id:
           type: string
-          description: Parameter name
-        value:
-          type: string
-          description: Parameter value
+          description: Unique identifier for the parameter entry
+        cells:
+          type: object
+          required:
+            - Key
+            - Value
+          properties:
+            Key:
+              type: string
+              description: Parameter name
+            Value:
+              type: string
+              description: Parameter value
   headers:
     type: array
-    description: HTTP headers as key-value pairs
+    description: HTTP headers as table entries
     items:
       type: object
+      required:
+        - id
+        - cells
       properties:
-        key:
+        id:
           type: string
-          description: Header name
-        value:
-          type: string
-          description: Header value
+          description: Unique identifier for the header entry
+        cells:
+          type: object
+          required:
+            - Key
+            - Value
+          properties:
+            Key:
+              type: string
+              description: Header name
+            Value:
+              type: string
+              description: Header value
   body:
     type: string
     description: Request body for POST/PUT/PATCH methods
@@ -99,15 +123,21 @@ user-api:
     url: "https://api.example.com/users/123"
     method: GET
     headers:
-      - key: "Authorization"
-        value: "Bearer {{API_TOKEN}}"
-      - key: "Content-Type"
-        value: "application/json"
+      - id: header-1-uuid-here
+        cells:
+          Key: "Authorization"
+          Value: "Bearer {{API_TOKEN}}"
+      - id: header-2-uuid-here
+        cells:
+          Key: "Content-Type"
+          Value: "application/json"
   connections:
     success: process-user-data
     error: handle-api-error
 ```

 ### POST Request with Body

 ```yaml
@@ -118,10 +148,14 @@ create-ticket:
     url: "https://api.support.com/tickets"
     method: POST
     headers:
-      - key: "Authorization"
-        value: "Bearer {{SUPPORT_API_KEY}}"
-      - key: "Content-Type"
-        value: "application/json"
+      - id: auth-header-uuid
+        cells:
+          Key: "Authorization"
+          Value: "Bearer {{SUPPORT_API_KEY}}"
+      - id: content-type-uuid
+        cells:
+          Key: "Content-Type"
+          Value: "application/json"
     body: |
       {
         "title": "<agent.title>",
@@ -142,32 +176,249 @@ search-api:
   inputs:
     url: "https://api.store.com/products"
     method: GET
-    queryParams:
-      - key: "q"
-        value: <start.searchTerm>
-      - key: "limit"
-        value: "10"
-      - key: "category"
-        value: <filter.category>
+    params:
+      - id: search-param-uuid
+        cells:
+          Key: "q"
+          Value: <start.searchTerm>
+      - id: limit-param-uuid
+        cells:
+          Key: "limit"
+          Value: "10"
+      - id: category-param-uuid
+        cells:
+          Key: "category"
+          Value: <filter.category>
     headers:
-      - key: "Authorization"
-        value: "Bearer {{STORE_API_KEY}}"
+      - id: auth-header-uuid
+        cells:
+          Key: "Authorization"
+          Value: "Bearer {{STORE_API_KEY}}"
   connections:
     success: display-results
 ```
## Parameter Format

Headers and params (query parameters) use the table format with the following structure:

```yaml
# In subsequent blocks
next-block:
  headers:
    - id: unique-identifier-here
      cells:
        Key: "Content-Type"
        Value: "application/json"
    - id: another-unique-identifier
      cells:
        Key: "Authorization"
        Value: "Bearer {{API_TOKEN}}"

  params:
    - id: param-identifier-here
      cells:
        Key: "limit"
        Value: "10"
```

**Structure Details:**
- `id`: Unique identifier for tracking the table row
- `cells.Key`: The parameter/header name
- `cells.Value`: The parameter/header value
- This format allows for proper table management and UI state preservation

## Output References

After an API block executes, you can reference its outputs in subsequent blocks. The API block provides three main outputs:

### Available Outputs

| Output | Type | Description |
|--------|------|-------------|
| `data` | any | The response body/payload from the API |
| `status` | number | HTTP status code (200, 404, 500, etc.) |
| `headers` | object | Response headers returned by the server |

### Usage Examples

```yaml
# Reference API response data
process-data:
  type: function
  name: "Process API Data"
  inputs:
    data: <api-block-name.output> # Response data
    status: <api-block-name.status> # HTTP status code
    headers: <api-block-name.headers> # Response headers
    error: <api-block-name.error> # Error details (if any)
    code: |
      const responseData = <fetchuserdata.data>;
      const statusCode = <fetchuserdata.status>;
      const responseHeaders = <fetchuserdata.headers>;

      if (statusCode === 200) {
        return {
          success: true,
          user: responseData,
          contentType: responseHeaders['content-type']
        };
      } else {
        return {
          success: false,
          error: `API call failed with status ${statusCode}`
        };
      }

# Use API data in an agent block
analyze-response:
  type: agent
  name: "Analyze Response"
  inputs:
    userPrompt: |
      Analyze this API response:

      Status: <fetchuserdata.status>
      Data: <fetchuserdata.data>

      Provide insights about the response.

# Conditional logic based on status
check-status:
  type: condition
  name: "Check API Status"
  inputs:
    condition: <fetchuserdata.status> === 200
  connections:
    true: success-handler
    false: error-handler
```

### Practical Example

```yaml
user-api:
  type: api
  name: "Fetch User Data"
  inputs:
    url: "https://api.example.com/users/123"
    method: GET
  connections:
    success: process-response

process-response:
  type: function
  name: "Process Response"
  inputs:
    code: |
      const user = <fetchuserdata.data>;
      const status = <fetchuserdata.status>;

      console.log(`API returned status: ${status}`);
      console.log(`User data:`, user);

      return {
        userId: user.id,
        email: user.email,
        isActive: status === 200
      };
```

### Error Handling

```yaml
api-with-error-handling:
  type: api
  name: "API Call"
  inputs:
    url: "https://api.example.com/data"
    method: GET
  connections:
    success: check-response
    error: handle-error

check-response:
  type: condition
  name: "Check Response Status"
  inputs:
    condition: <apicall.status> >= 200 && <apicall.status> < 300
  connections:
    true: process-success
    false: handle-api-error

process-success:
  type: function
  name: "Process Success"
  inputs:
    code: |
      return {
        success: true,
        data: <apicall.data>,
        message: "API call successful"
      };

handle-api-error:
  type: function
  name: "Handle API Error"
  inputs:
    code: |
      return {
        success: false,
        status: <apicall.status>,
        error: "API call failed",
        data: <apicall.data>
      };
```

## YAML String Escaping

When writing YAML, certain strings must be quoted to be properly parsed:

### Strings That Must Be Quoted

```yaml
# URLs with hyphens, colons, special characters
url: "https://api.example.com/users/123"
url: "https://my-api.example.com/data"

# Header values with hyphens or special characters
headers:
  - id: header-uuid
    cells:
      Key: "User-Agent"
      Value: "My-Application/1.0"
  - id: auth-uuid
    cells:
      Key: "Authorization"
      Value: "Bearer my-token-123"

# Parameter values with hyphens
params:
  - id: param-uuid
    cells:
      Key: "sort-by"
      Value: "created-at"
```

### When to Use Quotes

- ✅ **Always quote**: URLs, tokens, values with hyphens, colons, or special characters
- ✅ **Always quote**: Values that start with numbers but should be strings
- ✅ **Always quote**: Boolean-looking strings that should remain as strings
- ❌ **Don't quote**: Simple alphanumeric strings without special characters

### Examples

```yaml
# ✅ Correct
url: "https://api.stripe.com/v1/charges"
headers:
  - id: auth-header
    cells:
      Key: "Authorization"
      Value: "Bearer sk-test-123456789"

# ❌ Incorrect (may cause parsing errors)
url: https://api.stripe.com/v1/charges
headers:
  - id: auth-header
    cells:
      Key: Authorization
      Value: Bearer sk-test-123456789
```

## Best Practices
@@ -176,4 +427,5 @@ next-block:
- Include error handling with error connections
- Set appropriate timeouts for your use case
- Validate response status codes in subsequent blocks
- Use meaningful block names for easier reference
- **Always quote strings with special characters, URLs, and tokens**
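The simplified and complete table formats described above differ only in metadata. As a sketch of that equivalence, a hypothetical helper (the names `SimpleEntry`, `TableEntry`, and `toTableFormat` are illustrative, not part of Sim's API) can upgrade manual key/value entries into the id/cells form the UI generates:

```typescript
import { randomUUID } from 'crypto'

interface SimpleEntry { key: string; value: string }
interface TableEntry { id: string; cells: { Key: string; Value: string } }

// Convert simplified manual-YAML entries into the complete table format
// (id + cells) used by the UI table interface. The generated id is a
// fresh UUID, matching the "unique identifier per row" convention above.
function toTableFormat(entries: SimpleEntry[]): TableEntry[] {
  return entries.map((e) => ({
    id: randomUUID(),
    cells: { Key: e.key, Value: e.value },
  }))
}

const headers = toTableFormat([
  { key: 'Content-Type', value: 'application/json' },
])
console.log(headers[0].cells.Key) // "Content-Type"
```

Because both forms carry the same Key/Value data, a transformation like this is lossless except for the generated row ids.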
@@ -59,9 +59,6 @@ properties:
   end:
     type: string
     description: Target block ID for loop completion (optional)
-  success:
-    type: string
-    description: Target block ID after loop completion (alternative format)
   error:
     type: string
     description: Target block ID for error handling
@@ -79,13 +76,6 @@ connections:
   error: <string> # Target block ID for error handling (optional)
 ```

-Alternative format (legacy):
-```yaml
-connections:
-  success: <string> # Target block ID after loop completion
-  error: <string> # Target block ID for error handling (optional)
-```
-
 ## Child Block Configuration

 Blocks inside a loop must have their `parentId` set to the loop block ID:
@@ -59,9 +59,6 @@ properties:
   end:
     type: string
     description: Target block ID after all parallel instances complete (optional)
-  success:
-    type: string
-    description: Target block ID after all instances complete (alternative format)
   error:
     type: string
     description: Target block ID for error handling
@@ -79,13 +76,6 @@ connections:
   error: <string> # Target block ID for error handling (optional)
 ```

-Alternative format (legacy):
-```yaml
-connections:
-  success: <string> # Target block ID after all instances complete
-  error: <string> # Target block ID for error handling (optional)
-```
-
 ## Child Block Configuration

 Blocks inside a parallel block must have their `parentId` set to the parallel block ID:
@@ -40,16 +40,29 @@ properties:
     maximum: 599
   headers:
     type: array
-    description: Response headers as key-value pairs
+    description: Response headers as table entries
     items:
       type: object
       properties:
+        id:
+          type: string
+          description: Unique identifier for the header entry
         key:
           type: string
           description: Header name
         value:
           type: string
           description: Header value
+        cells:
+          type: object
+          description: Cell display values for the table interface
+          properties:
+            Key:
+              type: string
+              description: Display value for the key column
+            Value:
+              type: string
+              description: Display value for the value column
 ```

 ## Connection Configuration
@@ -97,6 +110,40 @@ success-response:
        value: "workflow-engine"
```

### Response with Complete Table Header Format

When headers are created through the UI table interface, the YAML includes additional metadata:

```yaml
api-response:
  type: response
  name: "API Response"
  inputs:
    data:
      message: "Request processed successfully"
      id: <agent.request_id>
    status: 200
    headers:
      - id: header-1-uuid-here
        key: "Content-Type"
        value: "application/json"
        cells:
          Key: "Content-Type"
          Value: "application/json"
      - id: header-2-uuid-here
        key: "Cache-Control"
        value: "no-cache"
        cells:
          Key: "Cache-Control"
          Value: "no-cache"
      - id: header-3-uuid-here
        key: "X-API-Version"
        value: "2.1"
        cells:
          Key: "X-API-Version"
          Value: "2.1"
```

### Error Response

```yaml
@@ -137,4 +184,55 @@ paginated-response:
        value: "public, max-age=300"
      - key: "Content-Type"
        value: "application/json"
```

## Table Parameter Formats

The Response block supports two formats for headers:

### Simplified Format (Manual YAML)

When writing YAML manually, you can use the simplified format:

```yaml
headers:
  - key: "Content-Type"
    value: "application/json"
  - key: "Cache-Control"
    value: "no-cache"
```

### Complete Table Format (UI Generated)

When headers are created through the UI table interface, the YAML includes additional metadata:

```yaml
headers:
  - id: unique-identifier-here
    key: "Content-Type"
    value: "application/json"
    cells:
      Key: "Content-Type"
      Value: "application/json"
```

**Key Differences:**
- `id`: Unique identifier for tracking the table row
- `cells`: Display values used by the UI table interface
- Both formats are functionally equivalent for workflow execution
- The complete format preserves UI state when importing/exporting workflows

**Important:** Always quote header names and values that contain special characters:

```yaml
headers:
  - id: content-type-uuid
    cells:
      Key: "Content-Type"
      Value: "application/json"
  - id: cache-control-uuid
    cells:
      Key: "Cache-Control"
      Value: "no-cache"
```
@@ -34,16 +34,29 @@ properties:
     description: Secret key for webhook verification
   headers:
     type: array
-    description: Expected headers for validation
+    description: Expected headers for validation as table entries
     items:
       type: object
       properties:
+        id:
+          type: string
+          description: Unique identifier for the header entry
         key:
           type: string
           description: Header name
         value:
           type: string
           description: Expected header value
+        cells:
+          type: object
+          description: Cell display values for the table interface
+          properties:
+            Key:
+              type: string
+              description: Display value for the key column
+            Value:
+              type: string
+              description: Display value for the value column
   methods:
     type: array
     description: Allowed HTTP methods
@@ -63,16 +76,29 @@ properties:
     maximum: 599
   headers:
     type: array
-    description: Response headers
+    description: Response headers as table entries
     items:
       type: object
       properties:
+        id:
+          type: string
+          description: Unique identifier for the header entry
         key:
           type: string
           description: Header name
         value:
           type: string
           description: Header value
+        cells:
+          type: object
+          description: Cell display values for the table interface
+          properties:
+            Key:
+              type: string
+              description: Display value for the key column
+            Value:
+              type: string
+              description: Display value for the value column
   body:
     type: string
     description: Response body content
@@ -180,6 +206,55 @@ stripe-webhook:
    error: payment-webhook-error
```

### Webhook with Complete Table Header Format

When headers are created through the UI table interface, the YAML includes additional metadata:

```yaml
api-webhook-complete:
  type: webhook
  name: "API Webhook with Table Headers"
  inputs:
    webhookConfig:
      enabled: true
      methods: [POST]
      headers:
        - id: header-1-uuid-here
          key: "Authorization"
          value: "Bearer {{WEBHOOK_API_KEY}}"
          cells:
            Key: "Authorization"
            Value: "Bearer {{WEBHOOK_API_KEY}}"
        - id: header-2-uuid-here
          key: "Content-Type"
          value: "application/json"
          cells:
            Key: "Content-Type"
            Value: "application/json"
    responseConfig:
      status: 200
      headers:
        - id: response-header-1-uuid
          key: "Content-Type"
          value: "application/json"
          cells:
            Key: "Content-Type"
            Value: "application/json"
        - id: response-header-2-uuid
          key: "X-Webhook-Response"
          value: "processed"
          cells:
            Key: "X-Webhook-Response"
            Value: "processed"
      body: |
        {
          "status": "received",
          "timestamp": "{{new Date().toISOString()}}"
        }
  connections:
    success: process-webhook-complete
```

### Generic API Webhook

```yaml
@@ -240,6 +315,56 @@ crud-webhook:
    success: route-by-method
```

## Table Parameter Formats

The Webhook block supports two formats for headers (both validation headers and response headers):

### Simplified Format (Manual YAML)

When writing YAML manually, you can use the simplified format:

```yaml
headers:
  - key: "Authorization"
    value: "Bearer {{API_TOKEN}}"
  - key: "Content-Type"
    value: "application/json"
```

### Complete Table Format (UI Generated)

When headers are created through the UI table interface, the YAML includes additional metadata:

```yaml
headers:
  - id: unique-identifier-here
    key: "Authorization"
    value: "Bearer {{API_TOKEN}}"
    cells:
      Key: "Authorization"
      Value: "Bearer {{API_TOKEN}}"
```

**Key Differences:**
- `id`: Unique identifier for tracking the table row
- `cells`: Display values used by the UI table interface
- Both formats are functionally equivalent for webhook processing
- The complete format preserves UI state when importing/exporting workflows

**Important:** Always quote header names and values that contain special characters:

```yaml
headers:
  - id: auth-header-uuid
    cells:
      Key: "Authorization"
      Value: "Bearer {{WEBHOOK_API_KEY}}"
  - id: content-type-uuid
    cells:
      Key: "Content-Type"
      Value: "application/json"
```

## Webhook Variables

Inside webhook-triggered workflows, these special variables are available:
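The validation headers configured above are checked against incoming webhook requests. As a minimal sketch of that idea (the `HeaderRule` type and `headersMatch` function are hypothetical illustrations, not Sim's actual implementation), a check over table-format rules might look like:

```typescript
interface HeaderRule { id: string; cells: { Key: string; Value: string } }

// Return true when every configured header rule matches the incoming
// request headers. Header names are compared case-insensitively, as
// HTTP header names are case-insensitive; values must match exactly.
function headersMatch(rules: HeaderRule[], incoming: Record<string, string>): boolean {
  const lower: Record<string, string> = {}
  for (const [k, v] of Object.entries(incoming)) lower[k.toLowerCase()] = v
  return rules.every((r) => lower[r.cells.Key.toLowerCase()] === r.cells.Value)
}

const rules = [{ id: 'auth-uuid', cells: { Key: 'Authorization', Value: 'Bearer token' } }]
console.log(headersMatch(rules, { authorization: 'Bearer token' })) // true
```

Only the `cells` data participates in matching here, which is consistent with the note above that the `id` field exists for table-row tracking rather than validation.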
apps/sim/app/api/copilot/chat/route.ts (new file, 721 lines)
@@ -0,0 +1,721 @@
```ts
import { and, desc, eq } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { getSession } from '@/lib/auth'
import { getCopilotModel } from '@/lib/copilot/config'
import { TITLE_GENERATION_SYSTEM_PROMPT, TITLE_GENERATION_USER_PROMPT } from '@/lib/copilot/prompts'
import { createLogger } from '@/lib/logs/console/logger'
import { db } from '@/db'
import { apiKey as apiKeyTable, copilotChats } from '@/db/schema'
import { executeProviderRequest } from '@/providers'

const logger = createLogger('CopilotChatAPI')

// Schema for chat messages
const ChatMessageSchema = z.object({
  message: z.string().min(1, 'Message is required'),
  userMessageId: z.string().optional(), // ID from frontend for the user message
  chatId: z.string().optional(),
  workflowId: z.string().min(1, 'Workflow ID is required'),
  mode: z.enum(['ask', 'agent']).optional().default('agent'),
  createNewChat: z.boolean().optional().default(false),
  stream: z.boolean().optional().default(true),
  implicitFeedback: z.string().optional(),
})

// Sim Agent API configuration
const SIM_AGENT_API_URL = process.env.SIM_AGENT_API_URL || 'http://localhost:8000'
const SIM_AGENT_API_KEY = process.env.SIM_AGENT_API_KEY

/**
 * Generate a chat title using LLM
 */
async function generateChatTitle(userMessage: string): Promise<string> {
  try {
    const { provider, model } = getCopilotModel('title')

    // Get the appropriate API key for the provider
    let apiKey: string | undefined
    if (provider === 'anthropic') {
      // Use rotating API key for Anthropic
      const { getRotatingApiKey } = require('@/lib/utils')
      try {
        apiKey = getRotatingApiKey('anthropic')
        logger.debug(`Using rotating API key for Anthropic title generation`)
      } catch (e) {
        // If rotation fails, let the provider handle it
        logger.warn(`Failed to get rotating API key for Anthropic:`, e)
      }
    }

    const response = await executeProviderRequest(provider, {
      model,
      systemPrompt: TITLE_GENERATION_SYSTEM_PROMPT,
      context: TITLE_GENERATION_USER_PROMPT(userMessage),
      temperature: 0.3,
      maxTokens: 50,
      apiKey: apiKey || '',
      stream: false,
    })

    if (typeof response === 'object' && 'content' in response) {
      return response.content?.trim() || 'New Chat'
    }

    return 'New Chat'
  } catch (error) {
    logger.error('Failed to generate chat title:', error)
    return 'New Chat'
  }
}

/**
 * Generate chat title asynchronously and update the database
 */
async function generateChatTitleAsync(
  chatId: string,
  userMessage: string,
  requestId: string,
  streamController?: ReadableStreamDefaultController<Uint8Array>
): Promise<void> {
  try {
    logger.info(`[${requestId}] Starting async title generation for chat ${chatId}`)

    const title = await generateChatTitle(userMessage)

    // Update the chat with the generated title
    await db
      .update(copilotChats)
      .set({
        title,
        updatedAt: new Date(),
      })
      .where(eq(copilotChats.id, chatId))

    // Send title_updated event to client if streaming
    if (streamController) {
      const encoder = new TextEncoder()
      const titleEvent = `data: ${JSON.stringify({
        type: 'title_updated',
        title: title,
      })}\n\n`
      streamController.enqueue(encoder.encode(titleEvent))
      logger.debug(`[${requestId}] Sent title_updated event to client: "${title}"`)
    }

    logger.info(`[${requestId}] Generated title for chat ${chatId}: "${title}"`)
  } catch (error) {
    logger.error(`[${requestId}] Failed to generate title for chat ${chatId}:`, error)
    // Don't throw - this is a background operation
  }
}

/**
 * POST /api/copilot/chat
 * Send messages to sim agent and handle chat persistence
 */
export async function POST(req: NextRequest) {
  const requestId = crypto.randomUUID()
  const startTime = Date.now()

  try {
    // Authenticate user
    const session = await getSession()
    let authenticatedUserId: string | null = session?.user?.id || null

    // If no session, check for API key auth
    if (!authenticatedUserId) {
      const apiKeyHeader = req.headers.get('x-api-key')
      if (apiKeyHeader) {
        // Verify API key
        const [apiKeyRecord] = await db
          .select({ userId: apiKeyTable.userId })
          .from(apiKeyTable)
          .where(eq(apiKeyTable.key, apiKeyHeader))
          .limit(1)

        if (apiKeyRecord) {
          authenticatedUserId = apiKeyRecord.userId
        }
      }
    }

    if (!authenticatedUserId) {
      return NextResponse.json({ error: 'Unauthorized' }, { status: 401 })
    }

    const body = await req.json()
    const {
      message,
      userMessageId,
      chatId,
      workflowId,
      mode,
      createNewChat,
      stream,
      implicitFeedback,
    } = ChatMessageSchema.parse(body)

    logger.info(`[${requestId}] Processing copilot chat request`, {
      userId: authenticatedUserId,
      workflowId,
      chatId,
      mode,
      stream,
      createNewChat,
      messageLength: message.length,
      hasImplicitFeedback: !!implicitFeedback,
    })

    // Handle chat context
    let currentChat: any = null
    let conversationHistory: any[] = []
    let actualChatId = chatId

    if (chatId) {
      // Load existing chat
      const [chat] = await db
        .select()
        .from(copilotChats)
        .where(and(eq(copilotChats.id, chatId), eq(copilotChats.userId, authenticatedUserId)))
        .limit(1)

      if (chat) {
        currentChat = chat
        conversationHistory = Array.isArray(chat.messages) ? chat.messages : []
      }
    } else if (createNewChat && workflowId) {
      // Create new chat
      const { provider, model } = getCopilotModel('chat')
      const [newChat] = await db
        .insert(copilotChats)
        .values({
          userId: authenticatedUserId,
          workflowId,
          title: null,
          model,
          messages: [],
        })
        .returning()

      if (newChat) {
        currentChat = newChat
        actualChatId = newChat.id
      }
    }

    // Build messages array for sim agent with conversation history
    const messages = []

    // Add conversation history
    for (const msg of conversationHistory) {
      messages.push({
        role: msg.role,
        content: msg.content,
      })
    }

    // Add implicit feedback if provided
    if (implicitFeedback) {
      messages.push({
        role: 'system',
        content: implicitFeedback,
      })
    }

    // Add current user message
    messages.push({
      role: 'user',
      content: message,
    })

    // Start title generation in parallel if this is a new chat with first message
    if (actualChatId && !currentChat?.title && conversationHistory.length === 0) {
      logger.info(`[${requestId}] Will start parallel title generation inside stream`)
    }

    // Forward to sim agent API
    logger.info(`[${requestId}] Sending request to sim agent API`, {
      messageCount: messages.length,
      endpoint: `${SIM_AGENT_API_URL}/api/chat-completion-streaming`,
    })

    const simAgentResponse = await fetch(`${SIM_AGENT_API_URL}/api/chat-completion-streaming`, {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        ...(SIM_AGENT_API_KEY && { 'x-api-key': SIM_AGENT_API_KEY }),
      },
      body: JSON.stringify({
        messages,
        workflowId,
        userId: authenticatedUserId,
        stream: stream,
        streamToolCalls: true,
        mode: mode,
      }),
    })

    if (!simAgentResponse.ok) {
      const errorText = await simAgentResponse.text()
      logger.error(`[${requestId}] Sim agent API error:`, {
        status: simAgentResponse.status,
        error: errorText,
      })
      return NextResponse.json(
        { error: `Sim agent API error: ${simAgentResponse.statusText}` },
        { status: simAgentResponse.status }
      )
    }

    // If streaming is requested, forward the stream and update chat later
    if (stream && simAgentResponse.body) {
      logger.info(`[${requestId}] Streaming response from sim agent`)

      // Create user message to save
      const userMessage = {
        id: userMessageId || crypto.randomUUID(), // Use frontend ID if provided
        role: 'user',
        content: message,
        timestamp: new Date().toISOString(),
      }

      // Create a pass-through stream that captures the response
      const transformedStream = new ReadableStream({
        async start(controller) {
          const encoder = new TextEncoder()
          let assistantContent = ''
          const toolCalls: any[] = []
          let buffer = ''
          let isFirstDone = true

          // Send chatId as first event
          if (actualChatId) {
            const chatIdEvent = `data: ${JSON.stringify({
              type: 'chat_id',
              chatId: actualChatId,
```
|
||||
})}\n\n`
|
||||
controller.enqueue(encoder.encode(chatIdEvent))
|
||||
logger.debug(`[${requestId}] Sent initial chatId event to client`)
|
||||
}
|
||||
|
||||
// Start title generation in parallel if needed
|
||||
if (actualChatId && !currentChat?.title && conversationHistory.length === 0) {
|
||||
logger.info(`[${requestId}] Starting title generation with stream updates`, {
|
||||
chatId: actualChatId,
|
||||
hasTitle: !!currentChat?.title,
|
||||
conversationLength: conversationHistory.length,
|
||||
message: message.substring(0, 100) + (message.length > 100 ? '...' : ''),
|
||||
})
|
||||
generateChatTitleAsync(actualChatId, message, requestId, controller).catch((error) => {
|
||||
logger.error(`[${requestId}] Title generation failed:`, error)
|
||||
})
|
||||
} else {
|
||||
logger.debug(`[${requestId}] Skipping title generation`, {
|
||||
chatId: actualChatId,
|
||||
hasTitle: !!currentChat?.title,
|
||||
conversationLength: conversationHistory.length,
|
||||
reason: !actualChatId
|
||||
? 'no chatId'
|
||||
: currentChat?.title
|
||||
? 'already has title'
|
||||
: conversationHistory.length > 0
|
||||
? 'not first message'
|
||||
: 'unknown',
|
||||
})
|
||||
}
|
||||
|
||||
// Forward the sim agent stream and capture assistant response
|
||||
const reader = simAgentResponse.body!.getReader()
|
||||
const decoder = new TextDecoder()
|
||||
|
||||
try {
|
||||
while (true) {
|
||||
const { done, value } = await reader.read()
|
||||
if (done) {
|
||||
logger.info(`[${requestId}] Stream reading completed`)
|
||||
break
|
||||
}
|
||||
|
||||
// Check if client disconnected before processing chunk
|
||||
try {
|
||||
// Forward the chunk to client immediately
|
||||
controller.enqueue(value)
|
||||
} catch (error) {
|
||||
// Client disconnected - stop reading from sim agent
|
||||
logger.info(`[${requestId}] Client disconnected, stopping stream processing`)
|
||||
reader.cancel() // Stop reading from sim agent
|
||||
break
|
||||
}
|
||||
const chunkSize = value.byteLength
|
||||
|
||||
// Decode and parse SSE events for logging and capturing content
|
||||
const decodedChunk = decoder.decode(value, { stream: true })
|
||||
buffer += decodedChunk
|
||||
|
||||
const lines = buffer.split('\n')
|
||||
buffer = lines.pop() || '' // Keep incomplete line in buffer
|
||||
|
||||
for (const line of lines) {
|
||||
if (line.trim() === '') continue // Skip empty lines
|
||||
|
||||
if (line.startsWith('data: ') && line.length > 6) {
|
||||
try {
|
||||
const jsonStr = line.slice(6)
|
||||
|
||||
// Check if the JSON string is unusually large (potential streaming issue)
|
||||
if (jsonStr.length > 50000) {
|
||||
// 50KB limit
|
||||
logger.warn(`[${requestId}] Large SSE event detected`, {
|
||||
size: jsonStr.length,
|
||||
preview: `${jsonStr.substring(0, 100)}...`,
|
||||
})
|
||||
}
|
||||
|
||||
const event = JSON.parse(jsonStr)
|
||||
|
||||
// Log different event types comprehensively
|
||||
switch (event.type) {
|
||||
case 'content':
|
||||
if (event.data) {
|
||||
assistantContent += event.data
|
||||
}
|
||||
break
|
||||
|
||||
case 'tool_call':
|
||||
logger.info(
|
||||
`[${requestId}] Tool call ${event.data?.partial ? '(partial)' : '(complete)'}:`,
|
||||
{
|
||||
id: event.data?.id,
|
||||
name: event.data?.name,
|
||||
arguments: event.data?.arguments,
|
||||
blockIndex: event.data?._blockIndex,
|
||||
}
|
||||
)
|
||||
if (!event.data?.partial) {
|
||||
toolCalls.push(event.data)
|
||||
}
|
||||
break
|
||||
|
||||
case 'tool_execution':
|
||||
logger.info(`[${requestId}] Tool execution started:`, {
|
||||
toolCallId: event.toolCallId,
|
||||
toolName: event.toolName,
|
||||
status: event.status,
|
||||
})
|
||||
break
|
||||
|
||||
case 'tool_result':
|
||||
logger.info(`[${requestId}] Tool result received:`, {
|
||||
toolCallId: event.toolCallId,
|
||||
toolName: event.toolName,
|
||||
success: event.success,
|
||||
result: `${JSON.stringify(event.result).substring(0, 200)}...`,
|
||||
resultSize: JSON.stringify(event.result).length,
|
||||
})
|
||||
break
|
||||
|
||||
case 'tool_error':
|
||||
logger.error(`[${requestId}] Tool error:`, {
|
||||
toolCallId: event.toolCallId,
|
||||
toolName: event.toolName,
|
||||
error: event.error,
|
||||
success: event.success,
|
||||
})
|
||||
break
|
||||
|
||||
case 'done':
|
||||
if (isFirstDone) {
|
||||
logger.info(
|
||||
`[${requestId}] Initial AI response complete, tool count: ${toolCalls.length}`
|
||||
)
|
||||
isFirstDone = false
|
||||
} else {
|
||||
logger.info(`[${requestId}] Conversation round complete`)
|
||||
}
|
||||
break
|
||||
|
||||
case 'error':
|
||||
logger.error(`[${requestId}] Stream error event:`, event.error)
|
||||
break
|
||||
|
||||
default:
|
||||
logger.debug(`[${requestId}] Unknown event type: ${event.type}`, event)
|
||||
}
|
||||
} catch (e) {
|
||||
// Enhanced error handling for large payloads and parsing issues
|
||||
const lineLength = line.length
|
||||
const isLargePayload = lineLength > 10000
|
||||
|
||||
if (isLargePayload) {
|
||||
logger.error(
|
||||
`[${requestId}] Failed to parse large SSE event (${lineLength} chars)`,
|
||||
{
|
||||
error: e,
|
||||
preview: `${line.substring(0, 200)}...`,
|
||||
size: lineLength,
|
||||
}
|
||||
)
|
||||
} else {
|
||||
logger.warn(
|
||||
`[${requestId}] Failed to parse SSE event: "${line.substring(0, 200)}..."`,
|
||||
e
|
||||
)
|
||||
}
|
||||
}
|
||||
} else if (line.trim() && line !== 'data: [DONE]') {
|
||||
logger.debug(`[${requestId}] Non-SSE line from sim agent: "${line}"`)
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// Process any remaining buffer
|
||||
if (buffer.trim()) {
|
||||
logger.debug(`[${requestId}] Processing remaining buffer: "${buffer}"`)
|
||||
if (buffer.startsWith('data: ')) {
|
||||
try {
|
||||
const event = JSON.parse(buffer.slice(6))
|
||||
if (event.type === 'content' && event.data) {
|
||||
assistantContent += event.data
|
||||
}
|
||||
} catch (e) {
|
||||
logger.warn(`[${requestId}] Failed to parse final buffer: "${buffer}"`)
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// Log final streaming summary
|
||||
logger.info(`[${requestId}] Streaming complete summary:`, {
|
||||
totalContentLength: assistantContent.length,
|
||||
toolCallsCount: toolCalls.length,
|
||||
hasContent: assistantContent.length > 0,
|
||||
toolNames: toolCalls.map((tc) => tc?.name).filter(Boolean),
|
||||
})
|
||||
|
||||
// Save messages to database after streaming completes (including aborted messages)
|
||||
if (currentChat) {
|
||||
const updatedMessages = [...conversationHistory, userMessage]
|
||||
|
||||
// Save assistant message if there's any content or tool calls (even partial from abort)
|
||||
if (assistantContent.trim() || toolCalls.length > 0) {
|
||||
const assistantMessage = {
|
||||
id: crypto.randomUUID(),
|
||||
role: 'assistant',
|
||||
content: assistantContent,
|
||||
timestamp: new Date().toISOString(),
|
||||
...(toolCalls.length > 0 && { toolCalls }),
|
||||
}
|
||||
updatedMessages.push(assistantMessage)
|
||||
logger.info(
|
||||
`[${requestId}] Saving assistant message with content (${assistantContent.length} chars) and ${toolCalls.length} tool calls`
|
||||
)
|
||||
} else {
|
||||
logger.info(
|
||||
`[${requestId}] No assistant content or tool calls to save (aborted before response)`
|
||||
)
|
||||
}
|
||||
|
||||
// Update chat in database immediately (without title)
|
||||
await db
|
||||
.update(copilotChats)
|
||||
.set({
|
||||
messages: updatedMessages,
|
||||
updatedAt: new Date(),
|
||||
})
|
||||
.where(eq(copilotChats.id, actualChatId!))
|
||||
|
||||
logger.info(`[${requestId}] Updated chat ${actualChatId} with new messages`, {
|
||||
messageCount: updatedMessages.length,
|
||||
savedUserMessage: true,
|
||||
savedAssistantMessage: assistantContent.trim().length > 0,
|
||||
})
|
||||
}
|
||||
} catch (error) {
|
||||
logger.error(`[${requestId}] Error processing stream:`, error)
|
||||
controller.error(error)
|
||||
} finally {
|
||||
controller.close()
|
||||
}
|
||||
},
|
||||
})
|
||||
|
||||
const response = new Response(transformedStream, {
|
||||
headers: {
|
||||
'Content-Type': 'text/event-stream',
|
||||
'Cache-Control': 'no-cache',
|
||||
Connection: 'keep-alive',
|
||||
'X-Accel-Buffering': 'no',
|
||||
},
|
||||
})
|
||||
|
||||
logger.info(`[${requestId}] Returning streaming response to client`, {
|
||||
duration: Date.now() - startTime,
|
||||
chatId: actualChatId,
|
||||
headers: {
|
||||
'Content-Type': 'text/event-stream',
|
||||
'Cache-Control': 'no-cache',
|
||||
Connection: 'keep-alive',
|
||||
},
|
||||
})
|
||||
|
||||
return response
|
||||
}
|
||||
|
||||
// For non-streaming responses
|
||||
const responseData = await simAgentResponse.json()
|
||||
logger.info(`[${requestId}] Non-streaming response from sim agent:`, {
|
||||
hasContent: !!responseData.content,
|
||||
contentLength: responseData.content?.length || 0,
|
||||
model: responseData.model,
|
||||
provider: responseData.provider,
|
||||
toolCallsCount: responseData.toolCalls?.length || 0,
|
||||
hasTokens: !!responseData.tokens,
|
||||
})
|
||||
|
||||
// Log tool calls if present
|
||||
if (responseData.toolCalls?.length > 0) {
|
||||
responseData.toolCalls.forEach((toolCall: any) => {
|
||||
logger.info(`[${requestId}] Tool call in response:`, {
|
||||
id: toolCall.id,
|
||||
name: toolCall.name,
|
||||
success: toolCall.success,
|
||||
result: `${JSON.stringify(toolCall.result).substring(0, 200)}...`,
|
||||
})
|
||||
})
|
||||
}
|
||||
|
||||
// Save messages if we have a chat
|
||||
if (currentChat && responseData.content) {
|
||||
const userMessage = {
|
||||
id: userMessageId || crypto.randomUUID(), // Use frontend ID if provided
|
||||
role: 'user',
|
||||
content: message,
|
||||
timestamp: new Date().toISOString(),
|
||||
}
|
||||
|
||||
const assistantMessage = {
|
||||
id: crypto.randomUUID(),
|
||||
role: 'assistant',
|
||||
content: responseData.content,
|
||||
timestamp: new Date().toISOString(),
|
||||
}
|
||||
|
||||
const updatedMessages = [...conversationHistory, userMessage, assistantMessage]
|
||||
|
||||
// Start title generation in parallel if this is first message (non-streaming)
|
||||
if (actualChatId && !currentChat.title && conversationHistory.length === 0) {
|
||||
logger.info(`[${requestId}] Starting title generation for non-streaming response`)
|
||||
generateChatTitleAsync(actualChatId, message, requestId).catch((error) => {
|
||||
logger.error(`[${requestId}] Title generation failed:`, error)
|
||||
})
|
||||
}
|
||||
|
||||
// Update chat in database immediately (without blocking for title)
|
||||
await db
|
||||
.update(copilotChats)
|
||||
.set({
|
||||
messages: updatedMessages,
|
||||
updatedAt: new Date(),
|
||||
})
|
||||
.where(eq(copilotChats.id, actualChatId!))
|
||||
}
|
||||
|
||||
logger.info(`[${requestId}] Returning non-streaming response`, {
|
||||
duration: Date.now() - startTime,
|
||||
chatId: actualChatId,
|
||||
responseLength: responseData.content?.length || 0,
|
||||
})
|
||||
|
||||
return NextResponse.json({
|
||||
success: true,
|
||||
response: responseData,
|
||||
chatId: actualChatId,
|
||||
metadata: {
|
||||
requestId,
|
||||
message,
|
||||
duration: Date.now() - startTime,
|
||||
},
|
||||
})
|
||||
} catch (error) {
|
||||
const duration = Date.now() - startTime
|
||||
|
||||
if (error instanceof z.ZodError) {
|
||||
logger.error(`[${requestId}] Validation error:`, {
|
||||
duration,
|
||||
errors: error.errors,
|
||||
})
|
||||
return NextResponse.json(
|
||||
{ error: 'Invalid request data', details: error.errors },
|
||||
{ status: 400 }
|
||||
)
|
||||
}
|
||||
|
||||
logger.error(`[${requestId}] Error handling copilot chat:`, {
|
||||
duration,
|
||||
error: error instanceof Error ? error.message : 'Unknown error',
|
||||
stack: error instanceof Error ? error.stack : undefined,
|
||||
})
|
||||
|
||||
return NextResponse.json(
|
||||
{ error: error instanceof Error ? error.message : 'Internal server error' },
|
||||
{ status: 500 }
|
||||
)
|
||||
}
|
||||
}
|
||||
|
||||
export async function GET(req: NextRequest) {
|
||||
try {
|
||||
const { searchParams } = new URL(req.url)
|
||||
const workflowId = searchParams.get('workflowId')
|
||||
|
||||
if (!workflowId) {
|
||||
return NextResponse.json({ error: 'workflowId is required' }, { status: 400 })
|
||||
}
|
||||
|
||||
// Get authenticated user
|
||||
const session = await getSession()
|
||||
if (!session?.user?.id) {
|
||||
return NextResponse.json({ error: 'Unauthorized' }, { status: 401 })
|
||||
}
|
||||
|
||||
const authenticatedUserId = session.user.id
|
||||
|
||||
// Fetch chats for this user and workflow
|
||||
const chats = await db
|
||||
.select({
|
||||
id: copilotChats.id,
|
||||
title: copilotChats.title,
|
||||
model: copilotChats.model,
|
||||
messages: copilotChats.messages,
|
||||
createdAt: copilotChats.createdAt,
|
||||
updatedAt: copilotChats.updatedAt,
|
||||
})
|
||||
.from(copilotChats)
|
||||
.where(
|
||||
and(eq(copilotChats.userId, authenticatedUserId), eq(copilotChats.workflowId, workflowId))
|
||||
)
|
||||
.orderBy(desc(copilotChats.updatedAt))
|
||||
|
||||
// Transform the data to include message count
|
||||
const transformedChats = chats.map((chat) => ({
|
||||
id: chat.id,
|
||||
title: chat.title,
|
||||
model: chat.model,
|
||||
messages: Array.isArray(chat.messages) ? chat.messages : [],
|
||||
messageCount: Array.isArray(chat.messages) ? chat.messages.length : 0,
|
||||
previewYaml: null, // Not needed for chat list
|
||||
createdAt: chat.createdAt,
|
||||
updatedAt: chat.updatedAt,
|
||||
}))
|
||||
|
||||
logger.info(`Retrieved ${transformedChats.length} chats for workflow ${workflowId}`)
|
||||
|
||||
return NextResponse.json({
|
||||
success: true,
|
||||
chats: transformedChats,
|
||||
})
|
||||
} catch (error) {
|
||||
logger.error('Error fetching copilot chats:', error)
|
||||
return NextResponse.json({ error: 'Failed to fetch chats' }, { status: 500 })
|
||||
}
|
||||
}
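The streaming handler above buffers partial SSE lines because a network chunk can cut a `data: {...}` line in half; everything past the last newline is carried over until the next chunk completes it. A minimal standalone sketch of that technique (the helper name `extractSseEvents` is ours, not part of the PR):

```typescript
// Reassemble complete SSE `data:` lines across arbitrarily split chunks.
// `carry` is the incomplete tail left over from the previous chunk.
function extractSseEvents(carry: string, chunk: string): { events: any[]; carry: string } {
  const lines = (carry + chunk).split('\n')
  const rest = lines.pop() || '' // incomplete line stays buffered for the next chunk
  const events: any[] = []
  for (const line of lines) {
    if (!line.startsWith('data: ') || line === 'data: [DONE]') continue
    try {
      events.push(JSON.parse(line.slice(6)))
    } catch {
      // malformed events are skipped, as the route does (it only logs them)
    }
  }
  return { events, carry: rest }
}
```

Feeding a payload split mid-JSON across two chunks yields no event on the first call and the reassembled event on the second, which is exactly why the route keeps `buffer = lines.pop() || ''` between reads.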
76
apps/sim/app/api/copilot/chat/update-messages/route.ts
Normal file
@@ -0,0 +1,76 @@
import { and, eq } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { getSession } from '@/lib/auth'
import { createLogger } from '@/lib/logs/console/logger'
import { db } from '@/db'
import { copilotChats } from '@/db/schema'

const logger = createLogger('CopilotChatUpdateAPI')

const UpdateMessagesSchema = z.object({
  chatId: z.string(),
  messages: z.array(
    z.object({
      id: z.string(),
      role: z.enum(['user', 'assistant']),
      content: z.string(),
      timestamp: z.string(),
      toolCalls: z.array(z.any()).optional(),
      contentBlocks: z.array(z.any()).optional(),
    })
  ),
})

export async function POST(req: NextRequest) {
  const requestId = crypto.randomUUID().slice(0, 8)

  try {
    const session = await getSession()
    if (!session?.user?.id) {
      return NextResponse.json({ error: 'Unauthorized' }, { status: 401 })
    }

    const body = await req.json()
    const { chatId, messages } = UpdateMessagesSchema.parse(body)

    logger.info(`[${requestId}] Updating chat messages`, {
      userId: session.user.id,
      chatId,
      messageCount: messages.length,
    })

    // Verify that the chat belongs to the user
    const [chat] = await db
      .select()
      .from(copilotChats)
      .where(and(eq(copilotChats.id, chatId), eq(copilotChats.userId, session.user.id)))
      .limit(1)

    if (!chat) {
      return NextResponse.json({ error: 'Chat not found or unauthorized' }, { status: 404 })
    }

    // Update chat with new messages
    await db
      .update(copilotChats)
      .set({
        messages: messages,
        updatedAt: new Date(),
      })
      .where(eq(copilotChats.id, chatId))

    logger.info(`[${requestId}] Successfully updated chat messages`, {
      chatId,
      newMessageCount: messages.length,
    })

    return NextResponse.json({
      success: true,
      messageCount: messages.length,
    })
  } catch (error) {
    logger.error(`[${requestId}] Error updating chat messages:`, error)
    return NextResponse.json({ error: 'Failed to update chat messages' }, { status: 500 })
  }
}
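The chat route earlier in this diff assembles the array it forwards to the sim agent in a fixed order: stored history first, an optional system-role implicit-feedback entry, then the new user message. A standalone sketch of that ordering (the `buildAgentMessages` helper is hypothetical, not in the PR):

```typescript
type AgentMessage = { role: string; content: string }

// Mirror the chat route's message assembly: history, optional system
// feedback, then the current user message, in that order.
function buildAgentMessages(
  history: AgentMessage[],
  implicitFeedback: string | undefined,
  userMessage: string
): AgentMessage[] {
  const messages: AgentMessage[] = history.map((m) => ({ role: m.role, content: m.content }))
  if (implicitFeedback) {
    messages.push({ role: 'system', content: implicitFeedback })
  }
  messages.push({ role: 'user', content: userMessage })
  return messages
}
```

Placing the feedback after the history but before the new message means it reads as fresh guidance for the turn being answered rather than as part of the stored transcript.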
@@ -1,138 +0,0 @@
import { and, eq } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { getSession } from '@/lib/auth'
import { createLogger } from '@/lib/logs/console/logger'
import { db } from '@/db'
import { copilotCheckpoints, workflow as workflowTable } from '@/db/schema'

const logger = createLogger('RevertCheckpointAPI')

/**
 * POST /api/copilot/checkpoints/[id]/revert
 * Revert workflow to a specific checkpoint
 */
export async function POST(request: NextRequest, { params }: { params: Promise<{ id: string }> }) {
  const requestId = crypto.randomUUID().slice(0, 8)
  const checkpointId = (await params).id

  try {
    const session = await getSession()
    if (!session?.user?.id) {
      return NextResponse.json({ error: 'Unauthorized' }, { status: 401 })
    }

    logger.info(`[${requestId}] Reverting to checkpoint: ${checkpointId}`, {
      userId: session.user.id,
    })

    // Get the checkpoint
    const checkpoint = await db
      .select()
      .from(copilotCheckpoints)
      .where(
        and(eq(copilotCheckpoints.id, checkpointId), eq(copilotCheckpoints.userId, session.user.id))
      )
      .limit(1)

    if (!checkpoint.length) {
      return NextResponse.json({ error: 'Checkpoint not found' }, { status: 404 })
    }

    const checkpointData = checkpoint[0]
    const { workflowId, yaml: yamlContent } = checkpointData

    logger.info(`[${requestId}] Processing checkpoint revert`, {
      workflowId,
      yamlLength: yamlContent.length,
    })

    // Use the consolidated YAML endpoint instead of duplicating the processing logic
    const yamlEndpointUrl = `${process.env.NEXT_PUBLIC_BASE_URL || 'http://localhost:3000'}/api/workflows/${workflowId}/yaml`

    const yamlResponse = await fetch(yamlEndpointUrl, {
      method: 'PUT',
      headers: {
        'Content-Type': 'application/json',
        // Forward auth cookies from the original request
        Cookie: request.headers.get('Cookie') || '',
      },
      body: JSON.stringify({
        yamlContent,
        description: `Reverted to checkpoint from ${new Date(checkpointData.createdAt).toLocaleString()}`,
        source: 'checkpoint_revert',
        applyAutoLayout: true,
        createCheckpoint: false, // Don't create a checkpoint when reverting to one
      }),
    })

    if (!yamlResponse.ok) {
      const errorData = await yamlResponse.json()
      logger.error(`[${requestId}] Consolidated YAML endpoint failed:`, errorData)
      return NextResponse.json(
        {
          success: false,
          error: 'Failed to revert checkpoint via YAML endpoint',
          details: errorData.errors || [errorData.error || 'Unknown error'],
        },
        { status: yamlResponse.status }
      )
    }

    const yamlResult = await yamlResponse.json()

    if (!yamlResult.success) {
      logger.error(`[${requestId}] YAML endpoint returned failure:`, yamlResult)
      return NextResponse.json(
        {
          success: false,
          error: 'Failed to process checkpoint YAML',
          details: yamlResult.errors || ['Unknown error'],
        },
        { status: 400 }
      )
    }

    // Update workflow's lastSynced timestamp
    await db
      .update(workflowTable)
      .set({
        lastSynced: new Date(),
        updatedAt: new Date(),
      })
      .where(eq(workflowTable.id, workflowId))

    // Notify the socket server to tell clients to rehydrate stores from database
    try {
      const socketUrl = process.env.SOCKET_URL || 'http://localhost:3002'
      await fetch(`${socketUrl}/api/copilot-workflow-edit`, {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({
          workflowId,
          description: `Reverted to checkpoint from ${new Date(checkpointData.createdAt).toLocaleString()}`,
        }),
      })
      logger.info(`[${requestId}] Notified socket server of checkpoint revert`)
    } catch (socketError) {
      logger.warn(`[${requestId}] Failed to notify socket server:`, socketError)
    }

    logger.info(`[${requestId}] Successfully reverted to checkpoint`)

    return NextResponse.json({
      success: true,
      message: `Successfully reverted to checkpoint from ${new Date(checkpointData.createdAt).toLocaleString()}`,
      summary: yamlResult.summary || `Restored workflow from checkpoint.`,
      warnings: yamlResult.warnings || [],
      data: yamlResult.data,
    })
  } catch (error) {
    logger.error(`[${requestId}] Error reverting checkpoint:`, error)
    return NextResponse.json(
      {
        error: `Failed to revert checkpoint: ${error instanceof Error ? error.message : 'Unknown error'}`,
      },
      { status: 500 }
    )
  }
}
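The deleted handler above delegates the actual revert to the consolidated YAML endpoint and, notably, passes `createCheckpoint: false` so that the act of reverting does not itself produce another checkpoint. A small sketch of that request body (the `buildRevertBody` helper is ours, purely illustrative; the field names mirror the route above):

```typescript
// Build the PUT body the old revert handler sent to the consolidated
// /api/workflows/[id]/yaml endpoint when restoring a checkpoint.
function buildRevertBody(yamlContent: string, checkpointCreatedAt: Date) {
  return {
    yamlContent,
    description: `Reverted to checkpoint from ${checkpointCreatedAt.toLocaleString()}`,
    source: 'checkpoint_revert',
    applyAutoLayout: true, // re-run autolayout on the restored graph
    createCheckpoint: false, // don't checkpoint the act of reverting
  }
}
```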
133
apps/sim/app/api/copilot/checkpoints/revert/route.ts
Normal file
@@ -0,0 +1,133 @@
import { and, eq } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { getSession } from '@/lib/auth'
import { createLogger } from '@/lib/logs/console/logger'
import { db } from '@/db'
import { workflowCheckpoints, workflow as workflowTable } from '@/db/schema'

const logger = createLogger('CheckpointRevertAPI')

const RevertCheckpointSchema = z.object({
  checkpointId: z.string().min(1),
})

/**
 * POST /api/copilot/checkpoints/revert
 * Revert workflow to a specific checkpoint state
 */
export async function POST(request: NextRequest) {
  const requestId = crypto.randomUUID().slice(0, 8)

  try {
    const session = await getSession()
    if (!session?.user?.id) {
      return NextResponse.json({ error: 'Unauthorized' }, { status: 401 })
    }

    const body = await request.json()
    const { checkpointId } = RevertCheckpointSchema.parse(body)

    logger.info(`[${requestId}] Reverting to checkpoint ${checkpointId}`)

    // Get the checkpoint and verify ownership
    const checkpoint = await db
      .select()
      .from(workflowCheckpoints)
      .where(
        and(
          eq(workflowCheckpoints.id, checkpointId),
          eq(workflowCheckpoints.userId, session.user.id)
        )
      )
      .then((rows) => rows[0])

    if (!checkpoint) {
      return NextResponse.json({ error: 'Checkpoint not found or access denied' }, { status: 404 })
    }

    // Verify user still has access to the workflow
    const workflowData = await db
      .select()
      .from(workflowTable)
      .where(eq(workflowTable.id, checkpoint.workflowId))
      .then((rows) => rows[0])

    if (!workflowData) {
      return NextResponse.json({ error: 'Workflow not found' }, { status: 404 })
    }

    if (workflowData.userId !== session.user.id) {
      return NextResponse.json({ error: 'Access denied to workflow' }, { status: 403 })
    }

    // Apply the checkpoint state to the workflow using the existing state endpoint
    const checkpointState = checkpoint.workflowState as any // Cast to any for property access

    // Clean the checkpoint state to remove any null/undefined values that could cause validation errors
    const cleanedState = {
      blocks: checkpointState?.blocks || {},
      edges: checkpointState?.edges || [],
      loops: checkpointState?.loops || {},
      parallels: checkpointState?.parallels || {},
      isDeployed: checkpointState?.isDeployed || false,
      deploymentStatuses: checkpointState?.deploymentStatuses || {},
      hasActiveWebhook: checkpointState?.hasActiveWebhook || false,
      lastSaved: Date.now(),
      // Only include deployedAt if it's a valid date string that can be converted
      ...(checkpointState?.deployedAt &&
      checkpointState.deployedAt !== null &&
      checkpointState.deployedAt !== undefined &&
      !Number.isNaN(new Date(checkpointState.deployedAt).getTime())
        ? { deployedAt: new Date(checkpointState.deployedAt) }
        : {}),
    }

    logger.info(`[${requestId}] Applying cleaned checkpoint state`, {
      blocksCount: Object.keys(cleanedState.blocks).length,
      edgesCount: cleanedState.edges.length,
      hasDeployedAt: !!cleanedState.deployedAt,
      isDeployed: cleanedState.isDeployed,
    })

    const stateResponse = await fetch(
      `${request.nextUrl.origin}/api/workflows/${checkpoint.workflowId}/state`,
      {
        method: 'PUT',
        headers: {
          'Content-Type': 'application/json',
          Cookie: request.headers.get('Cookie') || '', // Forward auth cookies
        },
        body: JSON.stringify(cleanedState),
      }
    )

    if (!stateResponse.ok) {
      const errorData = await stateResponse.text()
      logger.error(`[${requestId}] Failed to apply checkpoint state: ${errorData}`)
      return NextResponse.json(
        { error: 'Failed to revert workflow to checkpoint' },
        { status: 500 }
      )
    }

    const result = await stateResponse.json()
    logger.info(
      `[${requestId}] Successfully reverted workflow ${checkpoint.workflowId} to checkpoint ${checkpointId}`
    )

    return NextResponse.json({
      success: true,
      workflowId: checkpoint.workflowId,
      checkpointId,
      revertedAt: new Date().toISOString(),
      checkpoint: {
        id: checkpoint.id,
        workflowState: cleanedState, // Return the reverted state for frontend use
      },
    })
  } catch (error) {
    logger.error(`[${requestId}] Error reverting to checkpoint:`, error)
    return NextResponse.json({ error: 'Failed to revert to checkpoint' }, { status: 500 })
  }
}
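The `cleanedState` construction in the revert route defaults every collection, forces the booleans, and only includes `deployedAt` when it parses to a valid date, so a stale checkpoint can never push `null` into the state endpoint's validation. A standalone sketch of that normalization (the `cleanCheckpointState` helper name is ours; the field defaults mirror the route above):

```typescript
// Normalize a raw checkpoint state before PUTting it back to the
// workflow state endpoint: default collections, coerce booleans, and
// drop deployedAt unless it converts to a valid Date.
function cleanCheckpointState(raw: any): Record<string, unknown> {
  const deployedAt = raw?.deployedAt
  const validDeployedAt = deployedAt != null && !Number.isNaN(new Date(deployedAt).getTime())
  return {
    blocks: raw?.blocks || {},
    edges: raw?.edges || [],
    loops: raw?.loops || {},
    parallels: raw?.parallels || {},
    isDeployed: raw?.isDeployed || false,
    deploymentStatuses: raw?.deploymentStatuses || {},
    hasActiveWebhook: raw?.hasActiveWebhook || false,
    lastSaved: Date.now(),
    ...(validDeployedAt ? { deployedAt: new Date(deployedAt) } : {}),
  }
}
```

Spreading a conditional object (`...(cond ? { deployedAt } : {})`) omits the key entirely instead of sending `deployedAt: undefined`, which matters for strict schema validation on the receiving end.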
|
||||
@@ -1,18 +1,26 @@
|
||||
import { and, desc, eq } from 'drizzle-orm'
|
||||
import { type NextRequest, NextResponse } from 'next/server'
|
||||
import { z } from 'zod'
|
||||
import { getSession } from '@/lib/auth'
|
||||
import { createLogger } from '@/lib/logs/console/logger'
|
||||
import { db } from '@/db'
|
||||
import { copilotCheckpoints } from '@/db/schema'
|
||||
import { copilotChats, workflowCheckpoints } from '@/db/schema'
|
||||
|
||||
const logger = createLogger('CopilotCheckpointsAPI')
|
||||
const logger = createLogger('WorkflowCheckpointsAPI')
|
||||
|
||||
const CreateCheckpointSchema = z.object({
|
||||
workflowId: z.string(),
|
||||
chatId: z.string(),
|
||||
messageId: z.string().optional(), // ID of the user message that triggered this checkpoint
|
||||
workflowState: z.string(), // JSON stringified workflow state
|
||||
})
|
||||
|
||||
/**
|
||||
* GET /api/copilot/checkpoints
|
||||
* List checkpoints for a specific chat
|
||||
* POST /api/copilot/checkpoints
|
||||
* Create a new checkpoint with JSON workflow state
|
||||
*/
|
||||
export async function GET(request: NextRequest) {
|
||||
const requestId = crypto.randomUUID()
|
||||
export async function POST(req: NextRequest) {
|
||||
  const requestId = crypto.randomUUID().slice(0, 8)

  try {
    const session = await getSession()
    if (!session?.user?.id) {
      return NextResponse.json({ error: 'Unauthorized' }, { status: 401 })
    }

    const body = await req.json()
    const { workflowId, chatId, messageId, workflowState } = CreateCheckpointSchema.parse(body)

    logger.info(`[${requestId}] Creating workflow checkpoint`, {
      userId: session.user.id,
      workflowId,
      chatId,
      messageId,
    })

    // Verify that the chat belongs to the user
    const [chat] = await db
      .select()
      .from(copilotChats)
      .where(and(eq(copilotChats.id, chatId), eq(copilotChats.userId, session.user.id)))
      .limit(1)

    if (!chat) {
      return NextResponse.json({ error: 'Chat not found or unauthorized' }, { status: 404 })
    }

    // Parse the workflow state to validate that it is valid JSON
    let parsedWorkflowState
    try {
      parsedWorkflowState = JSON.parse(workflowState)
    } catch (error) {
      return NextResponse.json({ error: 'Invalid workflow state JSON' }, { status: 400 })
    }

    // Create the checkpoint with the JSON workflow state
    const [checkpoint] = await db
      .insert(workflowCheckpoints)
      .values({
        userId: session.user.id,
        workflowId,
        chatId,
        messageId,
        workflowState: parsedWorkflowState, // Store as a JSON object
      })
      .returning()

    logger.info(`[${requestId}] Workflow checkpoint created successfully`, {
      checkpointId: checkpoint.id,
      workflowId: checkpoint.workflowId,
      chatId: checkpoint.chatId,
      messageId: checkpoint.messageId,
      createdAt: checkpoint.createdAt,
    })

    return NextResponse.json({
      success: true,
      checkpoint: {
        id: checkpoint.id,
        userId: checkpoint.userId,
        workflowId: checkpoint.workflowId,
        chatId: checkpoint.chatId,
        messageId: checkpoint.messageId,
        createdAt: checkpoint.createdAt,
        updatedAt: checkpoint.updatedAt,
      },
    })
  } catch (error) {
    logger.error(`[${requestId}] Failed to create workflow checkpoint:`, error)
    return NextResponse.json({ error: 'Failed to create checkpoint' }, { status: 500 })
  }
}

/**
 * GET /api/copilot/checkpoints?chatId=xxx
 * Retrieve workflow checkpoints for a chat
 */
export async function GET(req: NextRequest) {
  const requestId = crypto.randomUUID().slice(0, 8)

  try {
    const session = await getSession()
    if (!session?.user?.id) {
      return NextResponse.json({ error: 'Unauthorized' }, { status: 401 })
    }

    const { searchParams } = new URL(req.url)
    const chatId = searchParams.get('chatId')
    const limit = Number(searchParams.get('limit')) || 10
    const offset = Number(searchParams.get('offset')) || 0

    if (!chatId) {
      return NextResponse.json({ error: 'chatId is required' }, { status: 400 })
    }

    logger.info(`[${requestId}] Fetching workflow checkpoints for chat`, {
      userId: session.user.id,
      chatId,
      limit,
      offset,
    })

    // Fetch checkpoints for this user and chat
    const checkpoints = await db
      .select({
        id: workflowCheckpoints.id,
        userId: workflowCheckpoints.userId,
        workflowId: workflowCheckpoints.workflowId,
        chatId: workflowCheckpoints.chatId,
        messageId: workflowCheckpoints.messageId,
        createdAt: workflowCheckpoints.createdAt,
        updatedAt: workflowCheckpoints.updatedAt,
      })
      .from(workflowCheckpoints)
      .where(
        and(eq(workflowCheckpoints.chatId, chatId), eq(workflowCheckpoints.userId, session.user.id))
      )
      .orderBy(desc(workflowCheckpoints.createdAt))
      .limit(limit)
      .offset(offset)

    logger.info(`[${requestId}] Retrieved ${checkpoints.length} workflow checkpoints`)

    return NextResponse.json({
      success: true,
      checkpoints,
    })
  } catch (error) {
    logger.error(`[${requestId}] Failed to fetch workflow checkpoints:`, error)
    return NextResponse.json({ error: 'Failed to fetch checkpoints' }, { status: 500 })
  }
}
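An earlier revision of this route mapped each row through a formatter that converted `createdAt`/`updatedAt` to ISO-8601 strings for consistent timezone handling; the final version returns the rows as-is. A minimal sketch of that normalization (hypothetical helper names, not code from the PR):

```typescript
// Assumed row shape: Drizzle returns Date objects for timestamp columns.
interface CheckpointRow {
  id: string
  createdAt: Date
  updatedAt: Date
}

// Convert Date fields to ISO-8601 strings so clients receive
// timezone-unambiguous values regardless of server locale.
function toWireFormat(rows: CheckpointRow[]) {
  return rows.map((row) => ({
    ...row,
    createdAt: row.createdAt.toISOString(),
    updatedAt: row.updatedAt.toISOString(),
  }))
}
```

Returning raw `Date` objects relies on `JSON.stringify`'s implicit ISO serialization, which happens to produce the same strings; the explicit mapping just makes the wire contract visible.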

apps/sim/app/api/copilot/confirm/route.ts (new file, 183 lines)
@@ -0,0 +1,183 @@
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { getSession } from '@/lib/auth'
import { createLogger } from '@/lib/logs/console/logger'
import { getRedisClient } from '@/lib/redis'
import type { NotificationStatus } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/copilot/lib/tools/types'

const logger = createLogger('CopilotConfirmAPI')

// Schema for confirmation requests
const ConfirmationSchema = z.object({
  toolCallId: z.string().min(1, 'Tool call ID is required'),
  status: z.enum(['success', 'error', 'accepted', 'rejected', 'background'] as const, {
    errorMap: () => ({ message: 'Invalid notification status' }),
  }),
  message: z.string().optional(), // Optional message for background moves or additional context
})

/**
 * Update a tool call's status in Redis. Polls until the key exists
 * (the methods route may not have written it yet), then overwrites it.
 */
async function updateToolCallStatus(
  toolCallId: string,
  status: NotificationStatus,
  message?: string
): Promise<boolean> {
  const redis = getRedisClient()
  if (!redis) {
    logger.warn('updateToolCallStatus: Redis client not available')
    return false
  }

  try {
    const key = `tool_call:${toolCallId}`
    const timeout = 60000 // 1 minute timeout
    const pollInterval = 100 // Poll every 100ms
    const startTime = Date.now()

    logger.info('Polling for tool call in Redis', { toolCallId, key, timeout })

    // Poll until the key exists or we time out
    while (Date.now() - startTime < timeout) {
      const exists = await redis.exists(key)
      if (exists) {
        logger.info('Tool call found in Redis, updating status', {
          toolCallId,
          key,
          pollDuration: Date.now() - startTime,
        })
        break
      }

      // Wait before the next poll
      await new Promise((resolve) => setTimeout(resolve, pollInterval))
    }

    // Final check whether the key exists after polling
    const exists = await redis.exists(key)
    if (!exists) {
      logger.warn('Tool call not found in Redis after polling timeout', {
        toolCallId,
        key,
        timeout,
        pollDuration: Date.now() - startTime,
      })
      return false
    }

    // Store status and message together as JSON
    const toolCallData = {
      status,
      message: message || null,
      timestamp: new Date().toISOString(),
    }
    await redis.set(key, JSON.stringify(toolCallData), 'EX', 86400) // Keep 24 hour expiry

    logger.info('Tool call status updated in Redis', {
      toolCallId,
      key,
      status,
      message,
      pollDuration: Date.now() - startTime,
    })
    return true
  } catch (error) {
    logger.error('Failed to update tool call status in Redis', {
      toolCallId,
      status,
      message,
      error: error instanceof Error ? error.message : 'Unknown error',
    })
    return false
  }
}

/**
 * POST /api/copilot/confirm
 * Update a tool call's status (accept/reject)
 */
export async function POST(req: NextRequest) {
  const requestId = crypto.randomUUID()
  const startTime = Date.now()

  try {
    // Authenticate the user (same pattern as copilot chat)
    const session = await getSession()
    const authenticatedUserId: string | null = session?.user?.id || null

    if (!authenticatedUserId) {
      return NextResponse.json({ error: 'Unauthorized' }, { status: 401 })
    }

    const body = await req.json()
    const { toolCallId, status, message } = ConfirmationSchema.parse(body)

    logger.info(`[${requestId}] Tool call confirmation request`, {
      userId: authenticatedUserId,
      toolCallId,
      status,
      message,
    })

    // Update the tool call status in Redis
    const updated = await updateToolCallStatus(toolCallId, status, message)

    if (!updated) {
      logger.error(`[${requestId}] Failed to update tool call status`, {
        userId: authenticatedUserId,
        toolCallId,
        status,
        message,
      })
      return NextResponse.json(
        { success: false, error: 'Failed to update tool call status or tool call not found' },
        { status: 400 }
      )
    }

    const duration = Date.now() - startTime
    logger.info(`[${requestId}] Tool call confirmation completed`, {
      userId: authenticatedUserId,
      toolCallId,
      status,
      duration,
    })

    return NextResponse.json({
      success: true,
      message: message || `Tool call ${toolCallId} has been marked as ${status}`,
      toolCallId,
      status,
    })
  } catch (error) {
    const duration = Date.now() - startTime

    if (error instanceof z.ZodError) {
      logger.error(`[${requestId}] Request validation error:`, {
        duration,
        errors: error.errors,
      })
      return NextResponse.json(
        {
          success: false,
          error: `Invalid request data: ${error.errors.map((e) => e.message).join(', ')}`,
        },
        { status: 400 }
      )
    }

    logger.error(`[${requestId}] Unexpected error:`, {
      duration,
      error: error instanceof Error ? error.message : 'Unknown error',
      stack: error instanceof Error ? error.stack : undefined,
    })

    return NextResponse.json(
      { success: false, error: error instanceof Error ? error.message : 'Internal server error' },
      { status: 500 }
    )
  }
}
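The wait-for-key loop above is an instance of a generic poll-with-timeout pattern. A self-contained sketch (hypothetical helper, not code from the PR) that captures it, parameterized over any async source:

```typescript
// Poll an async source until it yields a non-null value or the deadline passes.
// Returns the value, or null on timeout. The source is assumed to be cheap
// enough to call at the given interval (e.g. a Redis EXISTS/GET).
async function pollUntil<T>(
  fetchValue: () => Promise<T | null>,
  opts: { timeoutMs: number; intervalMs: number }
): Promise<T | null> {
  const start = Date.now()
  while (Date.now() - start < opts.timeoutMs) {
    const value = await fetchValue()
    if (value !== null) return value
    // Sleep before the next attempt
    await new Promise((resolve) => setTimeout(resolve, opts.intervalMs))
  }
  return null // timed out
}
```

With this shape, the route's exists-check would be `pollUntil(async () => ((await redis.exists(key)) ? true : null), { timeoutMs: 60000, intervalMs: 100 })`, and the status wait in the methods route is the same helper with a longer deadline.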

apps/sim/app/api/copilot/methods/route.ts (new file, 416 lines)
@@ -0,0 +1,416 @@
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { createLogger } from '@/lib/logs/console/logger'
import { getRedisClient } from '@/lib/redis'
import { copilotToolRegistry } from '../tools/registry'
import { createErrorResponse } from './utils'

const logger = createLogger('CopilotMethodsAPI')

// Tool call status types - should match NotificationStatus on the frontend
type ToolCallStatus = 'pending' | 'accepted' | 'rejected' | 'error' | 'success' | 'background'

/**
 * Add a tool call to Redis with 'pending' status
 */
async function addToolToRedis(toolCallId: string): Promise<void> {
  if (!toolCallId) {
    logger.warn('addToolToRedis: No tool call ID provided')
    return
  }

  const redis = getRedisClient()
  if (!redis) {
    logger.warn('addToolToRedis: Redis client not available')
    return
  }

  try {
    const key = `tool_call:${toolCallId}`
    const status: ToolCallStatus = 'pending'

    // Store as a JSON object for consistency with the confirm API
    const toolCallData = {
      status,
      message: null,
      timestamp: new Date().toISOString(),
    }

    // Set with 24 hour expiry (86400 seconds)
    await redis.set(key, JSON.stringify(toolCallData), 'EX', 86400)

    logger.info('Tool call added to Redis', {
      toolCallId,
      key,
      status,
    })
  } catch (error) {
    logger.error('Failed to add tool call to Redis', {
      toolCallId,
      error: error instanceof Error ? error.message : 'Unknown error',
    })
  }
}

/**
 * Poll Redis for tool call status updates.
 * Returns once the status leaves 'pending', or null after a 5 minute timeout.
 */
async function pollRedisForTool(
  toolCallId: string
): Promise<{ status: ToolCallStatus; message?: string } | null> {
  const redis = getRedisClient()
  if (!redis) {
    logger.warn('pollRedisForTool: Redis client not available')
    return null
  }

  const key = `tool_call:${toolCallId}`
  const timeout = 300000 // 5 minutes
  const pollInterval = 1000 // 1 second
  const startTime = Date.now()

  logger.info('Starting to poll Redis for tool call status', {
    toolCallId,
    timeout,
    pollInterval,
  })

  while (Date.now() - startTime < timeout) {
    try {
      const redisValue = await redis.get(key)
      if (!redisValue) {
        // Wait before the next poll
        await new Promise((resolve) => setTimeout(resolve, pollInterval))
        continue
      }

      let status: ToolCallStatus | null = null
      let message: string | undefined

      // Try to parse as JSON (new format), falling back to a bare status string (old format)
      try {
        const parsedData = JSON.parse(redisValue)
        status = parsedData.status as ToolCallStatus
        message = parsedData.message || undefined
      } catch {
        status = redisValue as ToolCallStatus
      }

      if (status !== 'pending') {
        logger.info('Tool call status resolved', {
          toolCallId,
          status,
          message,
          duration: Date.now() - startTime,
        })
        return { status, message }
      }

      // Wait before the next poll
      await new Promise((resolve) => setTimeout(resolve, pollInterval))
    } catch (error) {
      logger.error('Error polling Redis for tool call status', {
        toolCallId,
        error: error instanceof Error ? error.message : 'Unknown error',
      })
      return null
    }
  }

  logger.warn('Tool call polling timed out', {
    toolCallId,
    timeout,
  })
  return null
}
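The JSON-with-legacy-fallback parse above is worth isolating, since both the confirm route and this polling loop have to agree on the stored value's shape. A self-contained sketch (hypothetical helper name, not code from the PR):

```typescript
type ToolCallStatus = 'pending' | 'accepted' | 'rejected' | 'error' | 'success' | 'background'

// Decode a Redis tool-call value. New format: JSON {status, message, timestamp}.
// Old format: the raw status string itself, which fails JSON.parse and falls
// through to the catch branch.
function parseToolCallValue(raw: string): { status: ToolCallStatus; message?: string } {
  try {
    const parsed = JSON.parse(raw)
    return { status: parsed.status as ToolCallStatus, message: parsed.message ?? undefined }
  } catch {
    // Legacy format: the value is a bare status string
    return { status: raw as ToolCallStatus }
  }
}
```

Note the quirk this relies on: a bare word like `rejected` is not valid JSON, so `JSON.parse` throws and the fallback fires, whereas the new format always parses.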

/**
 * Handle tool calls that require user interruption/approval.
 * Returns { approved, rejected, error?, message? } to distinguish between
 * approval, rejection, timeout, and error.
 */
async function interruptHandler(
  toolCallId: string
): Promise<{ approved: boolean; rejected: boolean; error?: boolean; message?: string }> {
  if (!toolCallId) {
    logger.error('interruptHandler: No tool call ID provided')
    return { approved: false, rejected: false, error: true, message: 'No tool call ID provided' }
  }

  logger.info('Starting interrupt handler for tool call', { toolCallId })

  try {
    // Step 1: Add the tool call to Redis with 'pending' status
    await addToolToRedis(toolCallId)

    // Step 2: Poll Redis for a status update
    const result = await pollRedisForTool(toolCallId)

    if (!result) {
      logger.error('Failed to get tool call status or timed out', { toolCallId })
      return { approved: false, rejected: false }
    }

    const { status, message } = result

    if (status === 'rejected') {
      logger.info('Tool execution rejected by user', { toolCallId, message })
      return { approved: false, rejected: true, message }
    }

    if (status === 'accepted') {
      logger.info('Tool execution approved by user', { toolCallId, message })
      return { approved: true, rejected: false, message }
    }

    if (status === 'error') {
      logger.error('Tool execution failed with error', { toolCallId, message })
      return { approved: false, rejected: false, error: true, message }
    }

    if (status === 'background') {
      logger.info('Tool execution moved to background', { toolCallId, message })
      return { approved: true, rejected: false, message }
    }

    if (status === 'success') {
      logger.info('Tool execution completed successfully', { toolCallId, message })
      return { approved: true, rejected: false, message }
    }

    logger.warn('Unexpected tool call status', { toolCallId, status, message })
    return {
      approved: false,
      rejected: false,
      error: true,
      message: `Unexpected tool call status: ${status}`,
    }
  } catch (error) {
    const errorMessage = error instanceof Error ? error.message : 'Unknown error'
    logger.error('Error in interrupt handler', {
      toolCallId,
      error: errorMessage,
    })
    return {
      approved: false,
      rejected: false,
      error: true,
      message: `Interrupt handler error: ${errorMessage}`,
    }
  }
}

// Schema for method execution
const MethodExecutionSchema = z.object({
  methodId: z.string().min(1, 'Method ID is required'),
  params: z.record(z.any()).optional().default({}),
  toolCallId: z.string().nullable().optional().default(null),
})

// Simple internal API key authentication
function checkInternalApiKey(req: NextRequest) {
  const apiKey = req.headers.get('x-api-key')
  const expectedApiKey = process.env.INTERNAL_API_SECRET

  if (!expectedApiKey) {
    return { success: false, error: 'Internal API key not configured' }
  }

  if (!apiKey) {
    return { success: false, error: 'API key required' }
  }

  if (apiKey !== expectedApiKey) {
    return { success: false, error: 'Invalid API key' }
  }

  return { success: true }
}
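One caveat with the plain `!==` comparison above: string comparison short-circuits at the first differing character, which can leak how much of a guessed key matches. A hardening sketch (an assumption on my part, not what the PR ships) using Node's constant-time comparison:

```typescript
import { timingSafeEqual } from 'node:crypto'

// Compare two API keys in constant time so the comparison's duration does not
// reveal how long a matching prefix is. timingSafeEqual requires equal-length
// buffers, so a length mismatch is simply reported as "not equal".
function safeKeyCompare(provided: string, expected: string): boolean {
  const a = Buffer.from(provided)
  const b = Buffer.from(expected)
  if (a.length !== b.length) return false
  return timingSafeEqual(a, b)
}
```

For an internal, network-restricted endpoint this may be overkill, but it is a cheap swap for the `apiKey !== expectedApiKey` check.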

/**
 * POST /api/copilot/methods
 * Execute a registered tool by methodId, with internal API key auth
 */
export async function POST(req: NextRequest) {
  const requestId = crypto.randomUUID()
  const startTime = Date.now()

  try {
    // Check authentication (internal API key)
    const authResult = checkInternalApiKey(req)
    if (!authResult.success) {
      return NextResponse.json(createErrorResponse(authResult.error || 'Authentication failed'), {
        status: 401,
      })
    }

    const body = await req.json()
    const { methodId, params, toolCallId } = MethodExecutionSchema.parse(body)

    logger.info(`[${requestId}] Method execution request: ${methodId}`, {
      methodId,
      toolCallId,
      hasParams: !!params && Object.keys(params).length > 0,
    })

    // Check that the tool exists in the registry
    if (!copilotToolRegistry.has(methodId)) {
      logger.error(`[${requestId}] Tool not found in registry: ${methodId}`, {
        methodId,
        toolCallId,
        availableTools: copilotToolRegistry.getAvailableIds(),
      })
      return NextResponse.json(
        createErrorResponse(
          `Unknown method: ${methodId}. Available methods: ${copilotToolRegistry.getAvailableIds().join(', ')}`
        ),
        { status: 400 }
      )
    }

    // Check whether the tool requires interrupt/approval
    const tool = copilotToolRegistry.get(methodId)
    if (tool?.requiresInterrupt) {
      if (!toolCallId) {
        logger.warn(`[${requestId}] Tool requires interrupt but no toolCallId provided`, {
          methodId,
        })
        return NextResponse.json(
          createErrorResponse('This tool requires approval but no tool call ID was provided'),
          { status: 400 }
        )
      }

      logger.info(`[${requestId}] Tool requires interrupt, starting approval process`, {
        methodId,
        toolCallId,
      })

      // Handle the interrupt flow
      const { approved, rejected, error, message } = await interruptHandler(toolCallId)

      if (rejected) {
        logger.info(`[${requestId}] Tool execution rejected by user`, {
          methodId,
          toolCallId,
          message,
        })
        return NextResponse.json(
          createErrorResponse(
            'The user decided to skip running this tool. This was a user decision.'
          ),
          { status: 200 } // User rejection is a valid response, not a server error
        )
      }

      if (error) {
        logger.error(`[${requestId}] Tool execution failed with error`, {
          methodId,
          toolCallId,
          message,
        })
        return NextResponse.json(
          createErrorResponse(message || 'Tool execution failed with unknown error'),
          { status: 500 }
        )
      }

      if (!approved) {
        logger.warn(`[${requestId}] Tool execution timed out`, {
          methodId,
          toolCallId,
        })
        return NextResponse.json(createErrorResponse('Tool execution request timed out'), {
          status: 408, // Request Timeout
        })
      }

      logger.info(`[${requestId}] Tool execution approved by user`, {
        methodId,
        toolCallId,
        message,
      })

      // For the no_op tool, pass the confirmation message through as a parameter
      if (methodId === 'no_op' && message) {
        params.confirmationMessage = message
      }
    }

    // Execute the tool directly via the registry
    const result = await copilotToolRegistry.execute(methodId, params)

    const duration = Date.now() - startTime
    logger.info(`[${requestId}] Method execution completed: ${methodId}`, {
      methodId,
      toolCallId,
      duration,
      success: result.success,
      hasData: !!result.data,
      hasError: !!result.error,
    })

    return NextResponse.json(result)
  } catch (error) {
    const duration = Date.now() - startTime

    if (error instanceof z.ZodError) {
      logger.error(`[${requestId}] Request validation error:`, {
        duration,
        errors: error.errors,
      })
      return NextResponse.json(
        createErrorResponse(
          `Invalid request data: ${error.errors.map((e) => e.message).join(', ')}`
        ),
        { status: 400 }
      )
    }

    logger.error(`[${requestId}] Unexpected error:`, {
      duration,
      error: error instanceof Error ? error.message : 'Unknown error',
      stack: error instanceof Error ? error.stack : undefined,
    })

    return NextResponse.json(
      createErrorResponse(error instanceof Error ? error.message : 'Internal server error'),
      { status: 500 }
    )
  }
}

apps/sim/app/api/copilot/methods/utils.ts (new file, 24 lines)
@@ -0,0 +1,24 @@
import { createLogger } from '@/lib/logs/console/logger'
import type { CopilotToolResponse } from '../tools/base'

const logger = createLogger('CopilotMethodsUtils')

/**
 * Create a standardized error response
 */
export function createErrorResponse(error: string): CopilotToolResponse {
  return {
    success: false,
    error,
  }
}

/**
 * Create a standardized success response
 */
export function createSuccessResponse(data: any): CopilotToolResponse {
  return {
    success: true,
    data,
  }
}

@@ -1,429 +0,0 @@
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { getSession } from '@/lib/auth'
import {
  createChat,
  deleteChat,
  generateChatTitle,
  getChat,
  listChats,
  sendMessage,
  updateChat,
} from '@/lib/copilot/service'
import { createLogger } from '@/lib/logs/console/logger'

const logger = createLogger('CopilotAPI')

// Interface for StreamingExecution responses
interface StreamingExecution {
  stream: ReadableStream
  execution: Promise<any>
}

// Schema for sending messages
const SendMessageSchema = z.object({
  message: z.string().min(1, 'Message is required'),
  chatId: z.string().optional(),
  workflowId: z.string().optional(),
  mode: z.enum(['ask', 'agent']).optional().default('ask'),
  createNewChat: z.boolean().optional().default(false),
  stream: z.boolean().optional().default(false),
})

// Schema for docs queries
const DocsQuerySchema = z.object({
  query: z.string().min(1, 'Query is required'),
  topK: z.number().min(1).max(20).default(5),
  provider: z.string().optional(),
  model: z.string().optional(),
  stream: z.boolean().optional().default(false),
  chatId: z.string().optional(),
  workflowId: z.string().optional(),
  createNewChat: z.boolean().optional().default(false),
})

// Schema for creating chats
const CreateChatSchema = z.object({
  workflowId: z.string().min(1, 'Workflow ID is required'),
  title: z.string().optional(),
  initialMessage: z.string().optional(),
})

// Schema for updating chats
const UpdateChatSchema = z.object({
  chatId: z.string().min(1, 'Chat ID is required'),
  messages: z
    .array(
      z.object({
        id: z.string(),
        role: z.enum(['user', 'assistant', 'system']),
        content: z.string(),
        timestamp: z.string(),
        citations: z
          .array(
            z.object({
              id: z.number(),
              title: z.string(),
              url: z.string(),
              similarity: z.number().optional(),
            })
          )
          .optional(),
      })
    )
    .optional(),
  title: z.string().optional(),
})

// Schema for listing chats
const ListChatsSchema = z.object({
  workflowId: z.string().min(1, 'Workflow ID is required'),
  limit: z.number().min(1).max(100).optional().default(50),
  offset: z.number().min(0).optional().default(0),
})

/**
 * POST /api/copilot
 * Send a message to the copilot
 */
export async function POST(req: NextRequest) {
  const requestId = crypto.randomUUID()

  try {
    const body = await req.json()
    const { message, chatId, workflowId, mode, createNewChat, stream } =
      SendMessageSchema.parse(body)

    const session = await getSession()
    if (!session?.user?.id) {
      return NextResponse.json({ error: 'Unauthorized' }, { status: 401 })
    }

    logger.info(`[${requestId}] Copilot message: "${message}"`, {
      chatId,
      workflowId,
      mode,
      createNewChat,
      stream,
      userId: session.user.id,
    })

    // Send the message using the service
    const result = await sendMessage({
      message,
      chatId,
      workflowId,
      mode,
      createNewChat,
      stream,
      userId: session.user.id,
    })

    // Handle streaming responses (ReadableStream or StreamingExecution)
    let streamToRead: ReadableStream | null = null

    // Debug logging to see what we actually got
    logger.info(`[${requestId}] Response type analysis:`, {
      responseType: typeof result.response,
      isReadableStream: result.response instanceof ReadableStream,
      hasStreamProperty:
        typeof result.response === 'object' && result.response && 'stream' in result.response,
      hasExecutionProperty:
        typeof result.response === 'object' && result.response && 'execution' in result.response,
      responseKeys:
        typeof result.response === 'object' && result.response ? Object.keys(result.response) : [],
    })

    if (result.response instanceof ReadableStream) {
      logger.info(`[${requestId}] Direct ReadableStream detected`)
      streamToRead = result.response
    } else if (
      typeof result.response === 'object' &&
      result.response &&
      'stream' in result.response &&
      'execution' in result.response
    ) {
      // Handle StreamingExecution (from providers with tool calls)
      logger.info(`[${requestId}] StreamingExecution detected`)
      const streamingExecution = result.response as StreamingExecution
      streamToRead = streamingExecution.stream

      // No need to extract citations - the LLM generates direct markdown links
    }

    if (streamToRead) {
      logger.info(`[${requestId}] Returning streaming response`)

      const encoder = new TextEncoder()

      return new Response(
        new ReadableStream({
          async start(controller) {
            const reader = streamToRead!.getReader()
            let accumulatedResponse = ''

            // Send initial metadata
            const metadata = {
              type: 'metadata',
              chatId: result.chatId,
              metadata: {
                requestId,
                message,
              },
            }
            controller.enqueue(encoder.encode(`data: ${JSON.stringify(metadata)}\n\n`))

            try {
              while (true) {
                const { done, value } = await reader.read()
                if (done) break

                const chunkText = new TextDecoder().decode(value)
                accumulatedResponse += chunkText

                const contentChunk = {
                  type: 'content',
                  content: chunkText,
                }
                controller.enqueue(encoder.encode(`data: ${JSON.stringify(contentChunk)}\n\n`))
              }

              // Send the completion signal
              const completion = {
                type: 'complete',
                finalContent: accumulatedResponse,
              }
              controller.enqueue(encoder.encode(`data: ${JSON.stringify(completion)}\n\n`))
              controller.close()
            } catch (error) {
              logger.error(`[${requestId}] Streaming error:`, error)
              const errorChunk = {
                type: 'error',
                error: 'Streaming failed',
              }
              controller.enqueue(encoder.encode(`data: ${JSON.stringify(errorChunk)}\n\n`))
              controller.close()
            }
          },
        }),
        {
          headers: {
            'Content-Type': 'text/event-stream',
            'Cache-Control': 'no-cache',
            Connection: 'keep-alive',
          },
        }
      )
    }
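The stream above emits newline-delimited SSE frames of the form `data: {json}\n\n`, with event types `metadata`, `content`, `complete`, and `error`. A minimal client-side sketch (hypothetical helper, not part of the PR) that splits a received text buffer into those events:

```typescript
interface CopilotEvent {
  type: 'metadata' | 'content' | 'complete' | 'error'
  [key: string]: unknown
}

// Split an SSE text buffer into parsed copilot events.
// Frames are separated by a blank line and prefixed with "data: ",
// matching what the route's encoder writes.
function parseSseBuffer(buffer: string): CopilotEvent[] {
  return buffer
    .split('\n\n')
    .map((frame) => frame.trim())
    .filter((frame) => frame.startsWith('data: '))
    .map((frame) => JSON.parse(frame.slice('data: '.length)) as CopilotEvent)
}
```

A real client would feed decoder output into this incrementally and keep any trailing partial frame in the buffer until the next chunk arrives.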
|
||||
|
||||
// Handle non-streaming response
|
||||
logger.info(`[${requestId}] Chat response generated successfully`)
|
||||
|
||||
return NextResponse.json({
|
||||
success: true,
|
||||
response: result.response,
|
||||
chatId: result.chatId,
|
||||
metadata: {
|
||||
requestId,
|
||||
message,
|
||||
},
|
||||
})
|
||||
} catch (error) {
|
||||
if (error instanceof z.ZodError) {
|
||||
return NextResponse.json(
|
||||
{ error: 'Invalid request data', details: error.errors },
|
||||
{ status: 400 }
|
||||
)
|
||||
}
|
||||
|
||||
logger.error(`[${requestId}] Copilot error:`, error)
|
||||
return NextResponse.json({ error: 'Internal server error' }, { status: 500 })
|
||||
}
|
||||
}
|
||||
|
||||
/**
 * GET /api/copilot
 * List chats or get a specific chat
 */
export async function GET(req: NextRequest) {
  try {
    const session = await getSession()
    if (!session?.user?.id) {
      return NextResponse.json({ error: 'Unauthorized' }, { status: 401 })
    }

    const { searchParams } = new URL(req.url)
    const chatId = searchParams.get('chatId')

    // If chatId is provided, get the specific chat
    if (chatId) {
      const chat = await getChat(chatId, session.user.id)
      if (!chat) {
        return NextResponse.json({ error: 'Chat not found' }, { status: 404 })
      }

      return NextResponse.json({
        success: true,
        chat,
      })
    }

    // Otherwise, list chats
    const workflowId = searchParams.get('workflowId')
    const limit = Number.parseInt(searchParams.get('limit') || '50', 10)
    const offset = Number.parseInt(searchParams.get('offset') || '0', 10)

    if (!workflowId) {
      return NextResponse.json(
        { error: 'workflowId is required for listing chats' },
        { status: 400 }
      )
    }

    const chats = await listChats(session.user.id, workflowId, { limit, offset })

    return NextResponse.json({
      success: true,
      chats,
    })
  } catch (error) {
    logger.error('Failed to handle GET request:', error)
    return NextResponse.json({ error: 'Internal server error' }, { status: 500 })
  }
}
/**
 * PUT /api/copilot
 * Create a new chat
 */
export async function PUT(req: NextRequest) {
  try {
    const session = await getSession()
    if (!session?.user?.id) {
      return NextResponse.json({ error: 'Unauthorized' }, { status: 401 })
    }

    const body = await req.json()
    const { workflowId, title, initialMessage } = CreateChatSchema.parse(body)

    logger.info(`Creating new chat for user ${session.user.id}, workflow ${workflowId}`)

    const chat = await createChat(session.user.id, workflowId, {
      title,
      initialMessage,
    })

    logger.info(`Created chat ${chat.id} for user ${session.user.id}`)

    return NextResponse.json({
      success: true,
      chat,
    })
  } catch (error) {
    if (error instanceof z.ZodError) {
      return NextResponse.json(
        { error: 'Invalid request data', details: error.errors },
        { status: 400 }
      )
    }

    logger.error('Failed to create chat:', error)
    return NextResponse.json({ error: 'Internal server error' }, { status: 500 })
  }
}
/**
 * PATCH /api/copilot
 * Update a chat with new messages
 */
export async function PATCH(req: NextRequest) {
  try {
    const session = await getSession()
    if (!session?.user?.id) {
      return NextResponse.json({ error: 'Unauthorized' }, { status: 401 })
    }

    const body = await req.json()
    const { chatId, messages, title } = UpdateChatSchema.parse(body)

    logger.info(`Updating chat ${chatId} for user ${session.user.id}`)

    // Get the current chat to check if it has a title
    const existingChat = await getChat(chatId, session.user.id)

    let titleToUse = title

    // Generate a title if the chat doesn't have one and we have messages
    if (!titleToUse && existingChat && !existingChat.title && messages && messages.length > 0) {
      const firstUserMessage = messages.find((msg) => msg.role === 'user')
      if (firstUserMessage) {
        logger.info('Generating LLM-based title for chat without title')
        try {
          titleToUse = await generateChatTitle(firstUserMessage.content)
          logger.info(`Generated title: ${titleToUse}`)
        } catch (error) {
          logger.error('Failed to generate chat title:', error)
          titleToUse = 'New Chat'
        }
      }
    }

    const chat = await updateChat(chatId, session.user.id, {
      messages,
      title: titleToUse,
    })

    if (!chat) {
      return NextResponse.json({ error: 'Chat not found or access denied' }, { status: 404 })
    }

    return NextResponse.json({
      success: true,
      chat,
    })
  } catch (error) {
    if (error instanceof z.ZodError) {
      return NextResponse.json(
        { error: 'Invalid request data', details: error.errors },
        { status: 400 }
      )
    }

    logger.error('Failed to update chat:', error)
    return NextResponse.json({ error: 'Internal server error' }, { status: 500 })
  }
}
/**
 * DELETE /api/copilot
 * Delete a chat
 */
export async function DELETE(req: NextRequest) {
  try {
    const session = await getSession()
    if (!session?.user?.id) {
      return NextResponse.json({ error: 'Unauthorized' }, { status: 401 })
    }

    const { searchParams } = new URL(req.url)
    const chatId = searchParams.get('chatId')

    if (!chatId) {
      return NextResponse.json({ error: 'chatId is required' }, { status: 400 })
    }

    const success = await deleteChat(chatId, session.user.id)

    if (!success) {
      return NextResponse.json({ error: 'Chat not found or access denied' }, { status: 404 })
    }

    return NextResponse.json({
      success: true,
      message: 'Chat deleted successfully',
    })
  } catch (error) {
    logger.error('Failed to delete chat:', error)
    return NextResponse.json({ error: 'Internal server error' }, { status: 500 })
  }
}
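Together, the GET/PUT/PATCH/DELETE handlers above form a small CRUD surface over copilot chats. A hypothetical client-side sketch of how query strings and bodies line up with those handlers (the helper names are illustrative and not part of this diff):

```typescript
// Request descriptors matching the /api/copilot route above (helper names assumed)
type CopilotRequest = { method: string; url: string; body?: unknown }

function listChatsRequest(workflowId: string, limit = 50, offset = 0): CopilotRequest {
  // Mirrors GET: workflowId is required, limit/offset default to 50/0
  const params = new URLSearchParams({ workflowId, limit: String(limit), offset: String(offset) })
  return { method: 'GET', url: `/api/copilot?${params.toString()}` }
}

function createChatRequest(workflowId: string, title?: string): CopilotRequest {
  // Mirrors PUT: body validated server-side by CreateChatSchema
  return { method: 'PUT', url: '/api/copilot', body: { workflowId, title } }
}

function deleteChatRequest(chatId: string): CopilotRequest {
  // Mirrors DELETE: chatId travels in the query string
  return { method: 'DELETE', url: `/api/copilot?chatId=${encodeURIComponent(chatId)}` }
}

console.log(listChatsRequest('wf_123').url)
// → /api/copilot?workflowId=wf_123&limit=50&offset=0
```

Each descriptor can be passed straight to `fetch(url, { method, body: JSON.stringify(body) })`; the server replies with `{ success: true, … }` or an error status as shown in the handlers.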
83
apps/sim/app/api/copilot/tools/base.ts
Normal file
@@ -0,0 +1,83 @@
import { createLogger } from '@/lib/logs/console/logger'

// Base tool response interface
export interface CopilotToolResponse<T = any> {
  success: boolean
  data?: T
  error?: string
}

// Base tool interface that all copilot tools must implement
export interface CopilotTool<TParams = any, TResult = any> {
  readonly id: string
  readonly displayName: string
  readonly requiresInterrupt: boolean
  execute(params: TParams): Promise<CopilotToolResponse<TResult>>
}

// Abstract base class for copilot tools
export abstract class BaseCopilotTool<TParams = any, TResult = any>
  implements CopilotTool<TParams, TResult>
{
  abstract readonly id: string
  abstract readonly displayName: string
  readonly requiresInterrupt: boolean = false

  private _logger?: ReturnType<typeof createLogger>

  protected get logger() {
    if (!this._logger) {
      this._logger = createLogger(`CopilotTool:${this.id}`)
    }
    return this._logger
  }

  /**
   * Execute the tool with error handling
   */
  async execute(params: TParams): Promise<CopilotToolResponse<TResult>> {
    const startTime = Date.now()

    try {
      this.logger.info(`Executing tool: ${this.id}`, {
        toolId: this.id,
        paramsKeys: Object.keys(params || {}),
      })

      // Execute the tool logic
      const result = await this.executeImpl(params)

      const duration = Date.now() - startTime
      this.logger.info(`Tool execution completed: ${this.id}`, {
        toolId: this.id,
        duration,
        hasResult: !!result,
      })

      return {
        success: true,
        data: result,
      }
    } catch (error) {
      const duration = Date.now() - startTime
      const errorMessage = error instanceof Error ? error.message : 'Unknown error'

      this.logger.error(`Tool execution failed: ${this.id}`, {
        toolId: this.id,
        duration,
        error: errorMessage,
        stack: error instanceof Error ? error.stack : undefined,
      })

      return {
        success: false,
        error: errorMessage,
      }
    }
  }

  /**
   * Abstract method that each tool must implement with their specific logic
   */
  protected abstract executeImpl(params: TParams): Promise<TResult>
}
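A concrete tool built on this base class only needs an `id`, a `displayName`, and an `executeImpl`; the base `execute` wraps it with logging and error handling. A minimal self-contained sketch of the same pattern (logging omitted, `EchoTool` is a made-up example, not a tool from this diff):

```typescript
// Simplified shape of CopilotToolResponse / BaseCopilotTool from the diff
interface ToolResponse<T = unknown> {
  success: boolean
  data?: T
  error?: string
}

abstract class BaseTool<P, R> {
  abstract readonly id: string

  // Wraps executeImpl so callers always get a { success, data | error } envelope
  async execute(params: P): Promise<ToolResponse<R>> {
    try {
      return { success: true, data: await this.executeImpl(params) }
    } catch (error) {
      return { success: false, error: error instanceof Error ? error.message : 'Unknown error' }
    }
  }

  protected abstract executeImpl(params: P): Promise<R>
}

// Hypothetical tool: throws on bad input, otherwise transforms it
class EchoTool extends BaseTool<{ text: string }, string> {
  readonly id = 'echo'
  protected async executeImpl(params: { text: string }): Promise<string> {
    if (!params.text) throw new Error('text is required')
    return params.text.toUpperCase()
  }
}

new EchoTool().execute({ text: 'hi' }).then((r) => console.log(r))
// → { success: true, data: 'HI' }
```

Note that `executeImpl` signals failure by throwing; the envelope conversion happens in one place, which is why the real tools below never build error objects themselves.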
@@ -0,0 +1,90 @@
import { createLogger } from '@/lib/logs/console/logger'
import { registry as blockRegistry } from '@/blocks/registry'
import { tools as toolsRegistry } from '@/tools/registry'
import { BaseCopilotTool } from '../base'

type GetBlocksAndToolsParams = Record<string, never>

interface BlockInfo {
  block_name: string
  tool_names: string[]
}

class GetBlocksAndToolsTool extends BaseCopilotTool<
  GetBlocksAndToolsParams,
  Record<string, BlockInfo>
> {
  readonly id = 'get_blocks_and_tools'
  readonly displayName = 'Getting block information'

  protected async executeImpl(params: GetBlocksAndToolsParams): Promise<Record<string, BlockInfo>> {
    return getBlocksAndTools()
  }
}

// Export the tool instance
export const getBlocksAndToolsTool = new GetBlocksAndToolsTool()

// Implementation function
async function getBlocksAndTools(): Promise<Record<string, BlockInfo>> {
  const logger = createLogger('GetBlocksAndTools')

  logger.info('Getting all blocks and tools')

  // Create mapping of block_id -> { block_name, tool_names }
  const blockToToolsMapping: Record<string, BlockInfo> = {}

  // Process blocks - filter out hidden blocks and map to their tools
  Object.entries(blockRegistry)
    .filter(([blockType, blockConfig]) => {
      // Filter out hidden blocks
      if (blockConfig.hideFromToolbar) return false
      return true
    })
    .forEach(([blockType, blockConfig]) => {
      // Get the tools for this block
      const blockToolIds = blockConfig.tools?.access || []

      // Map tool IDs to tool names
      const toolNames = blockToolIds.map((toolId) => {
        const toolConfig = toolsRegistry[toolId]
        return toolConfig ? toolConfig.name : toolId // Fallback to ID if name not found
      })

      blockToToolsMapping[blockType] = {
        block_name: blockConfig.name || blockType,
        tool_names: toolNames,
      }
    })

  // Add special blocks that aren't in the standard registry
  const specialBlocks = {
    loop: {
      name: 'Loop',
      tools: [], // Loop blocks don't use standard tools
    },
    parallel: {
      name: 'Parallel',
      tools: [], // Parallel blocks don't use standard tools
    },
  }

  // Add special blocks
  Object.entries(specialBlocks).forEach(([blockType, blockInfo]) => {
    blockToToolsMapping[blockType] = {
      block_name: blockInfo.name,
      tool_names: blockInfo.tools,
    }
  })

  const totalBlocks = Object.keys(blockRegistry).length + Object.keys(specialBlocks).length
  const includedBlocks = Object.keys(blockToToolsMapping).length

  logger.info(`Successfully mapped ${includedBlocks} blocks to their tools`, {
    totalBlocks,
    includedBlocks,
    outputMapping: blockToToolsMapping,
  })

  return blockToToolsMapping
}
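The filter-then-map above reduces to a plain registry → `{ block_name, tool_names }` transform. A self-contained sketch of the same shape, using toy stand-ins for the real registries (the entries here are illustrative, not the real block/tool registry contents):

```typescript
// Toy stand-ins for blockRegistry / toolsRegistry (illustrative only)
const blocks: Record<string, { name: string; hideFromToolbar?: boolean; tools?: { access: string[] } }> = {
  agent: { name: 'Agent', tools: { access: ['serper_search'] } },
  secret: { name: 'Secret', hideFromToolbar: true },
}
const tools: Record<string, { name: string }> = { serper_search: { name: 'Serper Search' } }

const mapping: Record<string, { block_name: string; tool_names: string[] }> = {}
for (const [type, cfg] of Object.entries(blocks)) {
  if (cfg.hideFromToolbar) continue // hidden blocks are skipped, as in the diff
  const ids = cfg.tools?.access ?? []
  mapping[type] = {
    block_name: cfg.name || type,
    tool_names: ids.map((id) => tools[id]?.name ?? id), // fall back to the raw tool ID
  }
}

console.log(mapping)
// → { agent: { block_name: 'Agent', tool_names: [ 'Serper Search' ] } }
```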
236
apps/sim/app/api/copilot/tools/blocks/get-blocks-metadata.ts
Normal file
@@ -0,0 +1,236 @@
import { existsSync, readFileSync } from 'fs'
import { join } from 'path'
import { createLogger } from '@/lib/logs/console/logger'
import { registry as blockRegistry } from '@/blocks/registry'
import { tools as toolsRegistry } from '@/tools/registry'
import { BaseCopilotTool } from '../base'

const logger = createLogger('GetBlockMetadataAPI')

interface GetBlocksMetadataParams {
  blockIds: string[]
}

interface BlocksMetadataResult {
  success: boolean
  data?: Record<string, any>
  error?: string
}

class GetBlocksMetadataTool extends BaseCopilotTool<GetBlocksMetadataParams, BlocksMetadataResult> {
  readonly id = 'get_blocks_metadata'
  readonly displayName = 'Getting block metadata'

  protected async executeImpl(params: GetBlocksMetadataParams): Promise<BlocksMetadataResult> {
    return getBlocksMetadata(params)
  }
}

// Export the tool instance
export const getBlocksMetadataTool = new GetBlocksMetadataTool()

// Implementation function
export async function getBlocksMetadata(
  params: GetBlocksMetadataParams
): Promise<BlocksMetadataResult> {
  const { blockIds } = params

  if (!blockIds || !Array.isArray(blockIds)) {
    return {
      success: false,
      error: 'blockIds must be an array of block IDs',
    }
  }

  logger.info('Getting block metadata', {
    blockIds,
    blockCount: blockIds.length,
    requestedBlocks: blockIds.join(', '),
  })

  try {
    // Create result object
    const result: Record<string, any> = {}

    logger.info('=== GET BLOCKS METADATA DEBUG ===')
    logger.info('Requested block IDs:', blockIds)

    // Process each requested block ID
    for (const blockId of blockIds) {
      logger.info(`\n--- Processing block: ${blockId} ---`)
      let metadata: any = {}

      // Check if it's a special block first
      if (SPECIAL_BLOCKS_METADATA[blockId]) {
        logger.info(`✓ Found ${blockId} in SPECIAL_BLOCKS_METADATA`)
        // Start with the special block metadata
        metadata = { ...SPECIAL_BLOCKS_METADATA[blockId] }
        // Normalize tools structure to match regular blocks
        metadata.tools = metadata.tools?.access || []
        logger.info(`Initial metadata keys for ${blockId}:`, Object.keys(metadata))
      } else {
        // Check if the block exists in the registry
        const blockConfig = blockRegistry[blockId]
        if (!blockConfig) {
          logger.warn(`Block not found in registry: ${blockId}`)
          continue
        }

        metadata = {
          id: blockId,
          name: blockConfig.name || blockId,
          description: blockConfig.description || '',
          inputs: blockConfig.inputs || {},
          outputs: blockConfig.outputs || {},
          tools: blockConfig.tools?.access || [],
        }
      }

      // Read YAML schema from documentation if available (for both regular and special blocks)
      const docFileName = DOCS_FILE_MAPPING[blockId] || blockId
      logger.info(
        `Checking if ${blockId} is in CORE_BLOCKS_WITH_DOCS:`,
        CORE_BLOCKS_WITH_DOCS.includes(blockId)
      )

      if (CORE_BLOCKS_WITH_DOCS.includes(blockId)) {
        try {
          // Updated path to point to the actual YAML documentation location
          // Handle both monorepo root and apps/sim as working directory
          const workingDir = process.cwd()
          const isInAppsSim = workingDir.endsWith('/apps/sim') || workingDir.endsWith('\\apps\\sim')
          const basePath = isInAppsSim ? join(workingDir, '..', '..') : workingDir
          const docPath = join(
            basePath,
            'apps',
            'docs',
            'content',
            'docs',
            'yaml',
            'blocks',
            `${docFileName}.mdx`
          )
          logger.info(`Looking for docs at: ${docPath}`)
          logger.info(`File exists: ${existsSync(docPath)}`)

          if (existsSync(docPath)) {
            const docContent = readFileSync(docPath, 'utf-8')
            logger.info(`Doc content length: ${docContent.length}`)

            // Include the entire YAML documentation content
            metadata.yamlDocumentation = docContent
            logger.info(`✓ Added full YAML documentation for ${blockId}`)
          } else {
            logger.warn(`Documentation file not found for ${blockId}`)
          }
        } catch (error) {
          logger.warn(`Failed to read documentation for ${blockId}:`, error)
        }
      } else {
        logger.info(`${blockId} is NOT in CORE_BLOCKS_WITH_DOCS`)
      }

      // Add tool metadata if requested
      if (metadata.tools && metadata.tools.length > 0) {
        metadata.toolDetails = {}
        for (const toolId of metadata.tools) {
          const tool = toolsRegistry[toolId]
          if (tool) {
            metadata.toolDetails[toolId] = {
              name: tool.name,
              description: tool.description,
            }
          }
        }
      }

      logger.info(`Final metadata keys for ${blockId}:`, Object.keys(metadata))
      logger.info(`Has YAML documentation: ${!!metadata.yamlDocumentation}`)

      result[blockId] = metadata
    }

    logger.info('\n=== FINAL RESULT ===')
    logger.info(`Successfully retrieved metadata for ${Object.keys(result).length} blocks`)
    logger.info('Result keys:', Object.keys(result))

    // Log the full result for parallel block if it's included
    if (result.parallel) {
      logger.info('\nParallel block metadata keys:', Object.keys(result.parallel))
      if (result.parallel.yamlDocumentation) {
        logger.info('YAML documentation length:', result.parallel.yamlDocumentation.length)
      }
    }

    return {
      success: true,
      data: result,
    }
  } catch (error) {
    logger.error('Get block metadata failed', error)
    return {
      success: false,
      error: `Failed to get block metadata: ${error instanceof Error ? error.message : 'Unknown error'}`,
    }
  }
}

// Core blocks that have documentation with YAML schemas
const CORE_BLOCKS_WITH_DOCS = [
  'agent',
  'function',
  'api',
  'condition',
  'loop',
  'parallel',
  'response',
  'router',
  'evaluator',
  'webhook',
]

// Mapping for blocks that have different doc file names
const DOCS_FILE_MAPPING: Record<string, string> = {
  // All core blocks use their registry ID as the doc filename
  // e.g., 'api' block -> 'api.mdx', 'agent' block -> 'agent.mdx'
}

// Special blocks that aren't in the standard registry but need metadata
const SPECIAL_BLOCKS_METADATA: Record<string, any> = {
  loop: {
    type: 'loop',
    name: 'Loop',
    description: 'Control flow block for iterating over collections or repeating actions',
    inputs: {
      loopType: { type: 'string', required: true, enum: ['for', 'forEach'] },
      iterations: { type: 'number', required: false, minimum: 1, maximum: 1000 },
      collection: { type: 'string', required: false },
      maxConcurrency: { type: 'number', required: false, default: 1, minimum: 1, maximum: 10 },
    },
    outputs: {
      results: 'array',
      currentIndex: 'number',
      currentItem: 'any',
      totalIterations: 'number',
    },
    tools: { access: [] },
  },
  parallel: {
    type: 'parallel',
    name: 'Parallel',
    description: 'Control flow block for executing multiple branches simultaneously',
    inputs: {
      parallelType: { type: 'string', required: true, enum: ['count', 'collection'] },
      count: { type: 'number', required: false, minimum: 1, maximum: 100 },
      collection: { type: 'string', required: false },
      maxConcurrency: { type: 'number', required: false, default: 10, minimum: 1, maximum: 50 },
    },
    outputs: {
      results: 'array',
      branchId: 'number',
      branchItem: 'any',
      totalBranches: 'number',
    },
    tools: { access: [] },
  },
}
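The docs-path logic above normalizes the working directory so the lookup works whether the process starts at the monorepo root or inside `apps/sim`. An isolated sketch of just that resolution (POSIX paths; `resolveDocPath` is a name introduced here for illustration):

```typescript
import { join } from 'path'

// Resolve the YAML docs path for a block, handling both monorepo root
// and apps/sim as the working directory, as in the diff above.
function resolveDocPath(workingDir: string, docFileName: string): string {
  const isInAppsSim = workingDir.endsWith('/apps/sim') || workingDir.endsWith('\\apps\\sim')
  const basePath = isInAppsSim ? join(workingDir, '..', '..') : workingDir
  return join(basePath, 'apps', 'docs', 'content', 'docs', 'yaml', 'blocks', `${docFileName}.mdx`)
}

console.log(resolveDocPath('/repo/apps/sim', 'agent'))
// → /repo/apps/docs/content/docs/yaml/blocks/agent.mdx
```

The key property is that both working directories collapse to the same base, so the same relative docs layout is found either way.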
122
apps/sim/app/api/copilot/tools/docs/search-docs.ts
Normal file
@@ -0,0 +1,122 @@
import { sql } from 'drizzle-orm'
import { getCopilotConfig } from '@/lib/copilot/config'
import { createLogger } from '@/lib/logs/console/logger'
import { db } from '@/db'
import { docsEmbeddings } from '@/db/schema'
import { BaseCopilotTool } from '../base'

interface DocsSearchParams {
  query: string
  topK?: number
  threshold?: number
}

interface DocumentationSearchResult {
  id: number
  title: string
  url: string
  content: string
  similarity: number
}

interface DocsSearchResult {
  results: DocumentationSearchResult[]
  query: string
  totalResults: number
}

class SearchDocsTool extends BaseCopilotTool<DocsSearchParams, DocsSearchResult> {
  readonly id = 'search_documentation'
  readonly displayName = 'Searching documentation'

  protected async executeImpl(params: DocsSearchParams): Promise<DocsSearchResult> {
    return searchDocs(params)
  }
}

// Export the tool instance
export const searchDocsTool = new SearchDocsTool()

// Implementation function
async function searchDocs(params: DocsSearchParams): Promise<DocsSearchResult> {
  const logger = createLogger('DocsSearch')
  const { query, topK = 10, threshold } = params

  logger.info('Executing docs search for copilot', {
    query,
    topK,
  })

  try {
    const config = getCopilotConfig()
    const similarityThreshold = threshold ?? config.rag.similarityThreshold

    // Generate embedding for the query
    const { generateEmbeddings } = await import('@/app/api/knowledge/utils')

    logger.info('About to generate embeddings for query', { query, queryLength: query.length })

    const embeddings = await generateEmbeddings([query])
    const queryEmbedding = embeddings[0]

    if (!queryEmbedding || queryEmbedding.length === 0) {
      logger.warn('Failed to generate query embedding')
      return {
        results: [],
        query,
        totalResults: 0,
      }
    }

    logger.info('Successfully generated query embedding', {
      embeddingLength: queryEmbedding.length,
    })

    // Search docs embeddings using vector similarity
    const results = await db
      .select({
        chunkId: docsEmbeddings.chunkId,
        chunkText: docsEmbeddings.chunkText,
        sourceDocument: docsEmbeddings.sourceDocument,
        sourceLink: docsEmbeddings.sourceLink,
        headerText: docsEmbeddings.headerText,
        headerLevel: docsEmbeddings.headerLevel,
        similarity: sql<number>`1 - (${docsEmbeddings.embedding} <=> ${JSON.stringify(queryEmbedding)}::vector)`,
      })
      .from(docsEmbeddings)
      .orderBy(sql`${docsEmbeddings.embedding} <=> ${JSON.stringify(queryEmbedding)}::vector`)
      .limit(topK)

    // Filter by similarity threshold
    const filteredResults = results.filter((result) => result.similarity >= similarityThreshold)

    const documentationResults: DocumentationSearchResult[] = filteredResults.map(
      (result, index) => ({
        id: index + 1,
        title: String(result.headerText || 'Untitled Section'),
        url: String(result.sourceLink || '#'),
        content: String(result.chunkText || ''),
        similarity: result.similarity,
      })
    )

    logger.info(`Found ${documentationResults.length} documentation results`, { query })

    return {
      results: documentationResults,
      query,
      totalResults: documentationResults.length,
    }
  } catch (error) {
    logger.error('Documentation search failed with detailed error:', {
      error: error instanceof Error ? error.message : 'Unknown error',
      stack: error instanceof Error ? error.stack : undefined,
      query,
      errorType: error?.constructor?.name,
      status: (error as any)?.status,
    })
    throw new Error(
      `Documentation search failed: ${error instanceof Error ? error.message : 'Unknown error'}`
    )
  }
}
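The SQL above ranks rows by pgvector's cosine-distance operator (`<=>`) and reports similarity as `1 - distance`, then filters by a threshold. A plain TypeScript sketch of the same math, to make the threshold semantics concrete (toy vectors, not real embeddings):

```typescript
// Cosine similarity: dot(a, b) / (|a| * |b|); pgvector's `<=>` returns 1 minus this
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0
  let normA = 0
  let normB = 0
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i]
    normA += a[i] * a[i]
    normB += b[i] * b[i]
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB))
}

// Threshold filtering, mirroring `filteredResults` above
const similarities = [0.91, 0.62, 0.38]
const aboveThreshold = similarities.filter((s) => s >= 0.5)

console.log(cosineSimilarity([1, 0], [1, 0]), aboveThreshold)
// → 1 [ 0.91, 0.62 ]
```

Identical vectors score 1, orthogonal vectors score 0, so a threshold like 0.5 keeps only chunks whose embeddings point in roughly the same direction as the query embedding.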
50
apps/sim/app/api/copilot/tools/other/no-op.ts
Normal file
@@ -0,0 +1,50 @@
import { createLogger } from '@/lib/logs/console/logger'
import { BaseCopilotTool } from '../base'

const logger = createLogger('NoOpTool')

// Params carry only an optional confirmation message
interface NoOpParams {
  confirmationMessage?: string
}

interface NoOpResult {
  message: string
  status: string
}

class NoOpTool extends BaseCopilotTool<NoOpParams, NoOpResult> {
  readonly id = 'no_op'
  readonly displayName = 'No operation (requires confirmation)'
  readonly requiresInterrupt = true

  protected async executeImpl(params: NoOpParams): Promise<NoOpResult> {
    const message = params.confirmationMessage
      ? `No-op tool executed successfully. ${params.confirmationMessage}`
      : 'No-op tool executed successfully'

    const result = {
      message,
      status: 'success',
    }

    // Log the noop tool response for debugging
    logger.info('NoOp tool executed', {
      result,
      confirmationMessage: params.confirmationMessage,
      hasConfirmationMessage: !!params.confirmationMessage,
    })

    // Log what we're about to return
    logger.info('NoOp tool returning result', {
      result,
      resultType: typeof result,
      resultKeys: Object.keys(result),
    })

    return result
  }
}

// Export the tool instance
export const noOpTool = new NoOpTool()
68
apps/sim/app/api/copilot/tools/other/online-search.ts
Normal file
@@ -0,0 +1,68 @@
import { createLogger } from '@/lib/logs/console/logger'
import { executeTool } from '@/tools'
import { BaseCopilotTool } from '../base'

interface OnlineSearchParams {
  query: string
  num?: number
  type?: string
  gl?: string
  hl?: string
}

interface OnlineSearchResult {
  results: any[]
  query: string
  type: string
  totalResults: number
}

class OnlineSearchTool extends BaseCopilotTool<OnlineSearchParams, OnlineSearchResult> {
  readonly id = 'search_online'
  readonly displayName = 'Searching online'

  protected async executeImpl(params: OnlineSearchParams): Promise<OnlineSearchResult> {
    return onlineSearch(params)
  }
}

// Export the tool instance
export const onlineSearchTool = new OnlineSearchTool()

// Implementation function
async function onlineSearch(params: OnlineSearchParams): Promise<OnlineSearchResult> {
  const logger = createLogger('OnlineSearch')
  const { query, num = 10, type = 'search', gl, hl } = params

  logger.info('Performing online search', {
    query,
    num,
    type,
    gl,
    hl,
  })

  // Execute the serper_search tool
  const toolParams = {
    query,
    num,
    type,
    gl,
    hl,
    apiKey: process.env.SERPER_API_KEY || '',
  }

  const result = await executeTool('serper_search', toolParams)

  if (!result.success) {
    throw new Error(result.error || 'Search failed')
  }

  // The serper tool already formats the results properly
  return {
    results: result.output.searchResults || [],
    query,
    type,
    totalResults: result.output.searchResults?.length || 0,
  }
}
104
apps/sim/app/api/copilot/tools/registry.ts
Normal file
@@ -0,0 +1,104 @@
import { COPILOT_TOOL_DISPLAY_NAMES } from '@/stores/constants'
import type { CopilotTool } from './base'
// Import all tools to register them
import { getBlocksAndToolsTool } from './blocks/get-blocks-and-tools'
import { getBlocksMetadataTool } from './blocks/get-blocks-metadata'
import { searchDocsTool } from './docs/search-docs'
import { noOpTool } from './other/no-op'
import { onlineSearchTool } from './other/online-search'
import { getEnvironmentVariablesTool } from './user/get-environment-variables'
import { setEnvironmentVariablesTool } from './user/set-environment-variables'
import { buildWorkflowTool } from './workflow/build-workflow'
import { editWorkflowTool } from './workflow/edit-workflow'
import { getUserWorkflowTool } from './workflow/get-user-workflow'
import { getWorkflowConsoleTool } from './workflow/get-workflow-console'

// Registry of all copilot tools
export class CopilotToolRegistry {
  private tools = new Map<string, CopilotTool>()

  /**
   * Register a tool in the registry
   */
  register(tool: CopilotTool): void {
    if (this.tools.has(tool.id)) {
      throw new Error(`Tool with id '${tool.id}' is already registered`)
    }
    this.tools.set(tool.id, tool)
  }

  /**
   * Get a tool by its ID
   */
  get(id: string): CopilotTool | undefined {
    return this.tools.get(id)
  }

  /**
   * Check if a tool exists
   */
  has(id: string): boolean {
    return this.tools.has(id)
  }

  /**
   * Get all available tool IDs
   */
  getAvailableIds(): string[] {
    return Array.from(this.tools.keys())
  }

  /**
   * Get all tools
   */
  getAll(): CopilotTool[] {
    return Array.from(this.tools.values())
  }

  /**
   * Execute a tool by ID with parameters
   */
  async execute(toolId: string, params: any): Promise<any> {
    const tool = this.get(toolId)
    if (!tool) {
      throw new Error(`Tool not found: ${toolId}`)
    }
    return tool.execute(params)
  }

  /**
   * Get display name for a tool ID
   */
  getDisplayName(toolId: string): string {
    return COPILOT_TOOL_DISPLAY_NAMES[toolId] || toolId
  }

  /**
   * Get all tool display names as a record
   */
  getAllDisplayNames(): Record<string, string> {
    return COPILOT_TOOL_DISPLAY_NAMES
  }
}

// Global registry instance
export const copilotToolRegistry = new CopilotToolRegistry()

// Register all tools
copilotToolRegistry.register(getBlocksAndToolsTool)
copilotToolRegistry.register(getBlocksMetadataTool)
copilotToolRegistry.register(searchDocsTool)
copilotToolRegistry.register(noOpTool)
copilotToolRegistry.register(onlineSearchTool)
copilotToolRegistry.register(getEnvironmentVariablesTool)
copilotToolRegistry.register(setEnvironmentVariablesTool)
copilotToolRegistry.register(getUserWorkflowTool)
copilotToolRegistry.register(buildWorkflowTool)
copilotToolRegistry.register(getWorkflowConsoleTool)
copilotToolRegistry.register(editWorkflowTool)

// Dynamically generated constants - single source of truth
export const COPILOT_TOOL_IDS = copilotToolRegistry.getAvailableIds()

// Export the type from shared constants
export type { CopilotToolId } from '@/stores/constants'
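The registry's one notable invariant is that tool IDs are unique: a second `register` with the same `id` throws, which catches copy-paste mistakes at module load time. A self-contained sketch of that guard (toy types, not the real `CopilotTool` interface):

```typescript
// Minimal sketch of the duplicate-ID guard in CopilotToolRegistry
class ToyRegistry {
  private tools = new Map<string, { id: string }>()

  register(tool: { id: string }): void {
    if (this.tools.has(tool.id)) {
      throw new Error(`Tool with id '${tool.id}' is already registered`)
    }
    this.tools.set(tool.id, tool)
  }

  getAvailableIds(): string[] {
    return Array.from(this.tools.keys())
  }
}

const reg = new ToyRegistry()
reg.register({ id: 'no_op' })
reg.register({ id: 'search_online' })
console.log(reg.getAvailableIds()) // → [ 'no_op', 'search_online' ]
```

Because `COPILOT_TOOL_IDS` is derived from `getAvailableIds()` after all `register` calls, the module-level registration order above is also the single source of truth for which IDs exist.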
@@ -0,0 +1,72 @@
import { getEnvironmentVariableKeys } from '@/lib/environment/utils'
import { createLogger } from '@/lib/logs/console/logger'
import { getUserId } from '@/app/api/auth/oauth/utils'
import { BaseCopilotTool } from '../base'

interface GetEnvironmentVariablesParams {
  userId?: string
  workflowId?: string
}

interface EnvironmentVariablesResult {
  variableNames: string[]
  count: number
}

class GetEnvironmentVariablesTool extends BaseCopilotTool<
  GetEnvironmentVariablesParams,
  EnvironmentVariablesResult
> {
  readonly id = 'get_environment_variables'
  readonly displayName = 'Getting environment variables'

  protected async executeImpl(
    params: GetEnvironmentVariablesParams
  ): Promise<EnvironmentVariablesResult> {
    return getEnvironmentVariables(params)
  }
}

// Export the tool instance
export const getEnvironmentVariablesTool = new GetEnvironmentVariablesTool()

// Implementation function
async function getEnvironmentVariables(
  params: GetEnvironmentVariablesParams
): Promise<EnvironmentVariablesResult> {
  const logger = createLogger('GetEnvironmentVariables')
  const { userId: directUserId, workflowId } = params

  logger.info('Getting environment variables for copilot', {
    hasUserId: !!directUserId,
    hasWorkflowId: !!workflowId,
  })

  // Resolve userId from workflowId if needed
  const userId =
    directUserId || (workflowId ? await getUserId('copilot-env-vars', workflowId) : undefined)

  logger.info('Resolved userId', {
    directUserId,
    workflowId,
    resolvedUserId: userId,
  })

  if (!userId) {
    logger.warn('No userId could be determined', { directUserId, workflowId })
    throw new Error('Either userId or workflowId is required')
  }

  // Get environment variable keys directly
  const result = await getEnvironmentVariableKeys(userId)

  logger.info('Environment variable keys retrieved', {
    userId,
    variableCount: result.count,
  })

  return {
    variableNames: result.variableNames,
    count: result.count,
  }
}
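The userId fallback chain in the tool above is worth isolating. A synchronous sketch (the real resolver, `getUserId`, is async; `resolveFromWorkflow` here is a hypothetical stand-in):

```typescript
// Sketch of the fallback: prefer the explicit userId, else resolve the workflow's
// owner via the workflowId; fail loudly when neither path produces a user.
function resolveUserId(
  directUserId: string | undefined,
  workflowId: string | undefined,
  resolveFromWorkflow: (id: string) => string | undefined
): string {
  const userId =
    directUserId ?? (workflowId ? resolveFromWorkflow(workflowId) : undefined)
  if (!userId) throw new Error('Either userId or workflowId is required')
  return userId
}
```

Note the tool returns only variable names and a count, never values, which keeps secrets out of the copilot transcript.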
@@ -0,0 +1,72 @@
import { createLogger } from '@/lib/logs/console/logger'
import { BaseCopilotTool } from '../base'

interface SetEnvironmentVariablesParams {
  variables: Record<string, any>
  workflowId?: string
}

interface SetEnvironmentVariablesResult {
  message: string
  updatedVariables: string[]
  count: number
}

class SetEnvironmentVariablesTool extends BaseCopilotTool<
  SetEnvironmentVariablesParams,
  SetEnvironmentVariablesResult
> {
  readonly id = 'set_environment_variables'
  readonly displayName = 'Setting environment variables'
  readonly requiresInterrupt = true

  protected async executeImpl(
    params: SetEnvironmentVariablesParams
  ): Promise<SetEnvironmentVariablesResult> {
    return setEnvironmentVariables(params)
  }
}

// Export the tool instance
export const setEnvironmentVariablesTool = new SetEnvironmentVariablesTool()

// Implementation function
async function setEnvironmentVariables(
  params: SetEnvironmentVariablesParams
): Promise<SetEnvironmentVariablesResult> {
  const logger = createLogger('SetEnvironmentVariables')
  const { variables, workflowId } = params

  logger.info('Setting environment variables for copilot', {
    variableCount: Object.keys(variables).length,
    variableNames: Object.keys(variables),
    hasWorkflowId: !!workflowId,
  })

  // Forward the request to the existing environment variables endpoint
  const envUrl = `${process.env.NEXTAUTH_URL || 'http://localhost:3000'}/api/environment/variables`

  const response = await fetch(envUrl, {
    method: 'PUT',
    headers: {
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({ variables, workflowId }),
  })

  if (!response.ok) {
    logger.error('Set environment variables API failed', {
      status: response.status,
      statusText: response.statusText,
    })
    throw new Error('Failed to set environment variables')
  }

  await response.json()

  return {
    message: 'Environment variables updated successfully',
    updatedVariables: Object.keys(variables),
    count: Object.keys(variables).length,
  }
}
176 apps/sim/app/api/copilot/tools/workflow/build-workflow.ts  Normal file
@@ -0,0 +1,176 @@
import { createLogger } from '@/lib/logs/console/logger'
import { getAllBlocks } from '@/blocks/registry'
import type { BlockConfig } from '@/blocks/types'
import { resolveOutputType } from '@/blocks/utils'
import { generateLoopBlocks, generateParallelBlocks } from '@/stores/workflows/workflow/utils'
import { BaseCopilotTool } from '../base'

// Sim Agent API configuration
const SIM_AGENT_API_URL = process.env.SIM_AGENT_API_URL || 'http://localhost:8000'
const SIM_AGENT_API_KEY = process.env.SIM_AGENT_API_KEY

interface BuildWorkflowParams {
  yamlContent: string
  description?: string
}

interface BuildWorkflowResult {
  yamlContent: string
  description?: string
  success: boolean
  message: string
  workflowState?: any
  data?: {
    blocksCount: number
    edgesCount: number
  }
}

class BuildWorkflowTool extends BaseCopilotTool<BuildWorkflowParams, BuildWorkflowResult> {
  readonly id = 'build_workflow'
  readonly displayName = 'Building workflow'

  protected async executeImpl(params: BuildWorkflowParams): Promise<BuildWorkflowResult> {
    return buildWorkflow(params)
  }
}

// Export the tool instance
export const buildWorkflowTool = new BuildWorkflowTool()

// Implementation function that builds a workflow from YAML
async function buildWorkflow(params: BuildWorkflowParams): Promise<BuildWorkflowResult> {
  const logger = createLogger('BuildWorkflow')
  const { yamlContent, description } = params

  logger.info('Building workflow for copilot', {
    yamlLength: yamlContent.length,
    description,
  })

  try {
    // Convert YAML by calling sim-agent directly.
    // Gather the block registry and utilities
    const blocks = getAllBlocks()
    const blockRegistry = blocks.reduce(
      (acc, block) => {
        const blockType = block.type
        acc[blockType] = {
          ...block,
          id: blockType,
          subBlocks: block.subBlocks || [],
          outputs: block.outputs || {},
        } as any
        return acc
      },
      {} as Record<string, BlockConfig>
    )

    const response = await fetch(`${SIM_AGENT_API_URL}/api/yaml/to-workflow`, {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        ...(SIM_AGENT_API_KEY && { 'x-api-key': SIM_AGENT_API_KEY }),
      },
      body: JSON.stringify({
        yamlContent,
        blockRegistry,
        utilities: {
          generateLoopBlocks: generateLoopBlocks.toString(),
          generateParallelBlocks: generateParallelBlocks.toString(),
          resolveOutputType: resolveOutputType.toString(),
        },
        options: {
          generateNewIds: true,
          preservePositions: false,
        },
      }),
    })

    if (!response.ok) {
      const errorText = await response.text()
      throw new Error(`Sim agent API error: ${response.statusText} - ${errorText}`)
    }

    const conversionResult = await response.json()

    if (!conversionResult.success || !conversionResult.workflowState) {
      logger.error('YAML conversion failed', {
        errors: conversionResult.errors,
        warnings: conversionResult.warnings,
      })
      return {
        success: false,
        message: `Failed to convert YAML workflow: ${(conversionResult.errors || []).join(', ')}`,
        yamlContent,
        description,
      }
    }

    const { workflowState, idMapping } = conversionResult

    // Create a basic workflow state structure for preview
    const previewWorkflowState = {
      blocks: {} as Record<string, any>,
      edges: [] as any[],
      loops: {} as Record<string, any>,
      parallels: {} as Record<string, any>,
      lastSaved: Date.now(),
      isDeployed: false,
    }

    // Process blocks with preview IDs
    const blockIdMapping = new Map<string, string>()

    Object.keys(workflowState.blocks).forEach((blockId) => {
      const previewId = `preview-${Date.now()}-${Math.random().toString(36).substring(2, 7)}`
      blockIdMapping.set(blockId, previewId)
    })

    // Add blocks to the preview workflow state
    for (const [originalId, block] of Object.entries(workflowState.blocks)) {
      const previewBlockId = blockIdMapping.get(originalId)!
      const typedBlock = block as any

      previewWorkflowState.blocks[previewBlockId] = {
        ...typedBlock,
        id: previewBlockId,
        position: typedBlock.position || { x: 0, y: 0 },
        enabled: true,
      }
    }

    // Process edges with the updated block IDs
    previewWorkflowState.edges = workflowState.edges.map((edge: any) => ({
      ...edge,
      id: `edge-${Date.now()}-${Math.random().toString(36).substring(2, 7)}`,
      source: blockIdMapping.get(edge.source) || edge.source,
      target: blockIdMapping.get(edge.target) || edge.target,
    }))

    const blocksCount = Object.keys(previewWorkflowState.blocks).length
    const edgesCount = previewWorkflowState.edges.length

    logger.info('Workflow built successfully', { blocksCount, edgesCount })

    return {
      success: true,
      message: `Successfully built workflow with ${blocksCount} blocks and ${edgesCount} connections`,
      yamlContent,
      description: description || 'Built workflow',
      workflowState: previewWorkflowState,
      data: {
        blocksCount,
        edgesCount,
      },
    }
  } catch (error) {
    logger.error('Failed to build workflow:', error)
    return {
      success: false,
      message: `Workflow build failed: ${error instanceof Error ? error.message : 'Unknown error'}`,
      yamlContent,
      description,
    }
  }
}
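The preview-ID remapping step in `buildWorkflow` can be isolated into a small pure function. A sketch (illustrative names, deterministic IDs instead of the timestamp-plus-random scheme above):

```typescript
// Sketch of preview remapping: assign every block a fresh preview ID, then rewrite
// each edge's source/target through the same old-id -> new-id mapping so the
// preview graph stays consistent.
interface Edge {
  id: string
  source: string
  target: string
}

function remapForPreview(
  blocks: Record<string, { id: string }>,
  edges: Edge[],
  makeId: (n: number) => string
) {
  const mapping = new Map<string, string>()
  Object.keys(blocks).forEach((id, i) => mapping.set(id, makeId(i)))

  const previewBlocks: Record<string, { id: string }> = {}
  for (const [oldId, block] of Object.entries(blocks)) {
    const newId = mapping.get(oldId)!
    previewBlocks[newId] = { ...block, id: newId }
  }

  // Unmapped endpoints (e.g. dangling references) fall through unchanged.
  const previewEdges = edges.map((e, i) => ({
    ...e,
    id: `edge-${i}`,
    source: mapping.get(e.source) ?? e.source,
    target: mapping.get(e.target) ?? e.target,
  }))

  return { previewBlocks, previewEdges }
}

// Usage: two blocks connected by one edge; both endpoints get remapped.
const demo = remapForPreview(
  { a: { id: 'a' }, b: { id: 'b' } },
  [{ id: 'e1', source: 'a', target: 'b' }],
  (n) => `preview-${n}`
)
```

Building the full mapping before touching any edge is what makes the rewrite safe: an edge can reference a block that appears later in iteration order.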
426 apps/sim/app/api/copilot/tools/workflow/edit-workflow.ts  Normal file
@@ -0,0 +1,426 @@
import { createLogger } from '@/lib/logs/console/logger'
import { getAllBlocks } from '@/blocks/registry'
import type { BlockConfig } from '@/blocks/types'
import { resolveOutputType } from '@/blocks/utils'
import { generateLoopBlocks, generateParallelBlocks } from '@/stores/workflows/workflow/utils'
import { BaseCopilotTool } from '../base'

const logger = createLogger('EditWorkflowAPI')

// Sim Agent API configuration
const SIM_AGENT_API_URL = process.env.SIM_AGENT_API_URL || 'http://localhost:8000'
const SIM_AGENT_API_KEY = process.env.SIM_AGENT_API_KEY

// Types for operations
interface EditWorkflowOperation {
  operation_type: 'add' | 'edit' | 'delete'
  block_id: string
  params?: Record<string, any>
}

/**
 * Apply operations to a YAML workflow
 */
async function applyOperationsToYaml(
  currentYaml: string,
  operations: EditWorkflowOperation[]
): Promise<string> {
  // Parse the current YAML by calling sim-agent directly.
  // Gather the block registry and utilities
  const blocks = getAllBlocks()
  const blockRegistry = blocks.reduce(
    (acc, block) => {
      const blockType = block.type
      acc[blockType] = {
        ...block,
        id: blockType,
        subBlocks: block.subBlocks || [],
        outputs: block.outputs || {},
      } as any
      return acc
    },
    {} as Record<string, BlockConfig>
  )

  const response = await fetch(`${SIM_AGENT_API_URL}/api/yaml/parse`, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      ...(SIM_AGENT_API_KEY && { 'x-api-key': SIM_AGENT_API_KEY }),
    },
    body: JSON.stringify({
      yamlContent: currentYaml,
      blockRegistry,
      utilities: {
        generateLoopBlocks: generateLoopBlocks.toString(),
        generateParallelBlocks: generateParallelBlocks.toString(),
        resolveOutputType: resolveOutputType.toString(),
      },
    }),
  })

  if (!response.ok) {
    throw new Error(`Sim agent API error: ${response.statusText}`)
  }

  const parseResult = await response.json()

  if (!parseResult.success || !parseResult.data || parseResult.errors?.length > 0) {
    throw new Error(`Invalid YAML format: ${parseResult.errors?.join(', ') || 'Unknown error'}`)
  }

  const workflowData = parseResult.data

  // Apply operations to the parsed YAML data (preserving all existing fields)
  logger.info('Starting YAML operations', {
    initialBlockCount: Object.keys(workflowData.blocks).length,
    version: workflowData.version,
    operationCount: operations.length,
  })

  for (const operation of operations) {
    const { operation_type, block_id, params } = operation

    logger.info(`Processing operation: ${operation_type} for block ${block_id}`, { params })

    switch (operation_type) {
      case 'delete':
        if (workflowData.blocks[block_id]) {
          // First, find child blocks that reference this block as parent (before deleting the parent)
          const childBlocksToRemove: string[] = []
          Object.entries(workflowData.blocks).forEach(
            ([childBlockId, childBlock]: [string, any]) => {
              if (childBlock.parentId === block_id) {
                logger.info(
                  `Found child block ${childBlockId} with parentId ${block_id}, marking for deletion`
                )
                childBlocksToRemove.push(childBlockId)
              }
            }
          )

          // Delete the main block
          delete workflowData.blocks[block_id]
          logger.info(`Deleted block ${block_id}`)

          // Remove child blocks
          childBlocksToRemove.forEach((childBlockId) => {
            if (workflowData.blocks[childBlockId]) {
              delete workflowData.blocks[childBlockId]
              logger.info(`Deleted child block ${childBlockId}`)
            }
          })

          // Remove connections mentioning this block or any of its children
          const allDeletedBlocks = [block_id, ...childBlocksToRemove]
          Object.values(workflowData.blocks).forEach((block: any) => {
            if (block.connections) {
              Object.keys(block.connections).forEach((key) => {
                const connectionValue = block.connections[key]

                if (typeof connectionValue === 'string') {
                  // Simple format: connections: { default: "block2" }
                  if (allDeletedBlocks.includes(connectionValue)) {
                    delete block.connections[key]
                    logger.info(`Removed connection ${key} to deleted block ${connectionValue}`)
                  }
                } else if (Array.isArray(connectionValue)) {
                  // Array format: connections: { default: ["block2", "block3"] }
                  block.connections[key] = connectionValue.filter((item: any) => {
                    if (typeof item === 'string') {
                      return !allDeletedBlocks.includes(item)
                    }
                    if (typeof item === 'object' && item.block) {
                      return !allDeletedBlocks.includes(item.block)
                    }
                    return true
                  })

                  // If the array is empty after filtering, remove the connection
                  if (block.connections[key].length === 0) {
                    delete block.connections[key]
                  }
                } else if (typeof connectionValue === 'object' && connectionValue.block) {
                  // Object format: connections: { success: { block: "block2", input: "data" } }
                  if (allDeletedBlocks.includes(connectionValue.block)) {
                    delete block.connections[key]
                    logger.info(
                      `Removed object connection ${key} to deleted block ${connectionValue.block}`
                    )
                  }
                }
              })
            }
          })
        } else {
          logger.warn(`Block ${block_id} not found for deletion`)
        }
        break

      case 'edit':
        if (workflowData.blocks[block_id]) {
          const block = workflowData.blocks[block_id]

          // Update inputs (preserve existing inputs, only overwrite specified ones)
          if (params?.inputs) {
            if (!block.inputs) block.inputs = {}
            Object.assign(block.inputs, params.inputs)
            logger.info(`Updated inputs for block ${block_id}`, { inputs: block.inputs })
          }

          // Update connections (preserve existing connections, only overwrite specified ones)
          if (params?.connections) {
            if (!block.connections) block.connections = {}

            // Handle edge removals - if a connection is explicitly set to null, remove it
            Object.entries(params.connections).forEach(([key, value]) => {
              if (value === null) {
                delete (block.connections as any)[key]
                logger.info(`Removed connection ${key} from block ${block_id}`)
              } else {
                ;(block.connections as any)[key] = value
              }
            })

            logger.info(`Updated connections for block ${block_id}`, {
              connections: block.connections,
            })
          }

          // Handle edge removals when specified in params
          if (params?.removeEdges && Array.isArray(params.removeEdges)) {
            params.removeEdges.forEach(
              (edgeToRemove: {
                targetBlockId: string
                sourceHandle?: string
                targetHandle?: string
              }) => {
                if (!block.connections) return

                const { targetBlockId, sourceHandle = 'default' } = edgeToRemove

                // Handle the different connection formats
                const connectionValue = (block.connections as any)[sourceHandle]

                if (typeof connectionValue === 'string') {
                  // Simple format: connections: { default: "block2" }
                  if (connectionValue === targetBlockId) {
                    delete (block.connections as any)[sourceHandle]
                    logger.info(`Removed edge from ${block_id}:${sourceHandle} to ${targetBlockId}`)
                  }
                } else if (Array.isArray(connectionValue)) {
                  // Array format: connections: { default: ["block2", "block3"] }
                  ;(block.connections as any)[sourceHandle] = connectionValue.filter(
                    (item: any) => {
                      if (typeof item === 'string') {
                        return item !== targetBlockId
                      }
                      if (typeof item === 'object' && item.block) {
                        return item.block !== targetBlockId
                      }
                      return true
                    }
                  )

                  // If the array is empty after filtering, remove the connection
                  if ((block.connections as any)[sourceHandle].length === 0) {
                    delete (block.connections as any)[sourceHandle]
                  }

                  logger.info(`Updated array connection for ${block_id}:${sourceHandle}`)
                } else if (typeof connectionValue === 'object' && connectionValue.block) {
                  // Object format: connections: { success: { block: "block2", input: "data" } }
                  if (connectionValue.block === targetBlockId) {
                    delete (block.connections as any)[sourceHandle]
                    logger.info(
                      `Removed object connection from ${block_id}:${sourceHandle} to ${targetBlockId}`
                    )
                  }
                }
              }
            )
          }
        } else {
          logger.warn(`Block ${block_id} not found for editing`)
        }
        break

      case 'add':
        if (params?.type && params?.name) {
          workflowData.blocks[block_id] = {
            type: params.type,
            name: params.name,
            inputs: params.inputs || {},
            connections: params.connections || {},
          }
          logger.info(`Added block ${block_id}`, { type: params.type, name: params.name })
        } else {
          logger.warn(`Invalid add operation for block ${block_id} - missing type or name`)
        }
        break

      default:
        logger.warn(`Unknown operation type: ${operation_type}`)
    }
  }

  logger.info('Completed YAML operations', {
    finalBlockCount: Object.keys(workflowData.blocks).length,
  })

  // Convert the complete workflow data back to YAML (preserving version and all other fields)
  const { dump: yamlDump } = await import('js-yaml')
  return yamlDump(workflowData)
}

interface EditWorkflowParams {
  operations: EditWorkflowOperation[]
  workflowId: string
}

interface EditWorkflowResult {
  yamlContent: string
  operations: Array<{ type: string; blockId: string }>
}

class EditWorkflowTool extends BaseCopilotTool<EditWorkflowParams, EditWorkflowResult> {
  readonly id = 'edit_workflow'
  readonly displayName = 'Updating workflow'

  protected async executeImpl(params: EditWorkflowParams): Promise<EditWorkflowResult> {
    return editWorkflow(params)
  }
}

// Export the tool instance
export const editWorkflowTool = new EditWorkflowTool()

// Implementation function
async function editWorkflow(params: EditWorkflowParams): Promise<EditWorkflowResult> {
  const { operations, workflowId } = params

  logger.info('Processing targeted update request', {
    workflowId,
    operationCount: operations.length,
  })

  // Get the current workflow state as JSON
  const { getUserWorkflowTool } = await import('./get-user-workflow')

  const getUserWorkflowResult = await getUserWorkflowTool.execute({
    workflowId: workflowId,
    includeMetadata: false,
  })

  if (!getUserWorkflowResult.success || !getUserWorkflowResult.data) {
    throw new Error('Failed to get current workflow state')
  }

  const workflowStateJson = getUserWorkflowResult.data

  logger.info('Retrieved current workflow state', {
    jsonLength: workflowStateJson.length,
    jsonPreview: workflowStateJson.substring(0, 200),
  })

  // Parse the JSON to get the workflow state object
  const workflowState = JSON.parse(workflowStateJson)

  // Extract subblock values from the workflow state (same logic as get-user-workflow.ts)
  const subBlockValues: Record<string, Record<string, any>> = {}
  Object.entries(workflowState.blocks || {}).forEach(([blockId, block]) => {
    subBlockValues[blockId] = {}
    Object.entries((block as any).subBlocks || {}).forEach(([subBlockId, subBlock]) => {
      if ((subBlock as any).value !== undefined) {
        subBlockValues[blockId][subBlockId] = (subBlock as any).value
      }
    })
  })

  logger.info('Extracted subblock values', {
    blockCount: Object.keys(subBlockValues).length,
    totalSubblocks: Object.values(subBlockValues).reduce(
      (sum, blockValues) => sum + Object.keys(blockValues).length,
      0
    ),
  })

  // Convert the workflow state to YAML format using the same endpoint as the UI
  const blocks = getAllBlocks()
  const blockRegistry = blocks.reduce(
    (acc, block) => {
      const blockType = block.type
      acc[blockType] = {
        ...block,
        id: blockType,
        subBlocks: block.subBlocks || [],
        outputs: block.outputs || {},
      } as any
      return acc
    },
    {} as Record<string, BlockConfig>
  )

  // Convert to YAML using sim-agent
  const yamlResponse = await fetch(`${SIM_AGENT_API_URL}/api/workflow/to-yaml`, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      ...(SIM_AGENT_API_KEY && { 'x-api-key': SIM_AGENT_API_KEY }),
    },
    body: JSON.stringify({
      workflowState,
      subBlockValues, // The properly extracted subblock values
      blockRegistry,
      utilities: {
        generateLoopBlocks: generateLoopBlocks.toString(),
        generateParallelBlocks: generateParallelBlocks.toString(),
        resolveOutputType: resolveOutputType.toString(),
      },
    }),
  })

  if (!yamlResponse.ok) {
    const errorText = await yamlResponse.text()
    throw new Error(`Sim agent API error: ${yamlResponse.statusText} - ${errorText}`)
  }

  const yamlResult = await yamlResponse.json()

  if (!yamlResult.success || !yamlResult.yaml) {
    throw new Error(yamlResult.error || 'Failed to generate YAML')
  }

  const currentYaml = yamlResult.yaml

  if (!currentYaml || currentYaml.trim() === '') {
    throw new Error('Generated YAML is empty')
  }

  logger.info('Successfully converted workflow to YAML', {
    workflowId,
    blockCount: Object.keys(workflowState.blocks).length,
    yamlLength: currentYaml.length,
  })

  // Apply the operations to generate the modified YAML
  const modifiedYaml = await applyOperationsToYaml(currentYaml, operations)

  logger.info('Applied operations to YAML', {
    operationCount: operations.length,
    currentYamlLength: currentYaml.length,
    modifiedYamlLength: modifiedYaml.length,
    operations: operations.map((op) => ({ type: op.operation_type, blockId: op.block_id })),
  })

  logger.info(
    `Successfully generated modified YAML for ${operations.length} targeted update operations`
  )

  // Return the modified YAML directly - the UI will handle preview generation via updateDiffStore()
  return {
    yamlContent: modifiedYaml,
    operations: operations.map((op) => ({ type: op.operation_type, blockId: op.block_id })),
  }
}
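The delete branch above prunes stale connections across three formats (string, array, object). That logic can be sketched as a pure function (illustrative names, not the Sim code; the real version mutates in place and also handles the child-block cascade):

```typescript
// Sketch of connection pruning after a delete: a connection value may be a plain
// target id, an array of ids/objects, or an object with a `block` field. Any entry
// pointing at a deleted block is dropped; empty arrays remove the key entirely.
type Connection = string | Array<string | { block: string }> | { block: string }

function pruneConnections(
  connections: Record<string, Connection>,
  deletedIds: Set<string>
): Record<string, Connection> {
  const result: Record<string, Connection> = {}
  for (const [key, value] of Object.entries(connections)) {
    if (typeof value === 'string') {
      // Simple format: { default: "block2" }
      if (!deletedIds.has(value)) result[key] = value
    } else if (Array.isArray(value)) {
      // Array format: { default: ["block2", "block3"] }
      const kept = value.filter((item) =>
        typeof item === 'string' ? !deletedIds.has(item) : !deletedIds.has(item.block)
      )
      if (kept.length > 0) result[key] = kept
    } else if (!deletedIds.has(value.block)) {
      // Object format: { success: { block: "block2" } }
      result[key] = value
    }
  }
  return result
}

// Usage: deleting b2 trims the array, drops the object connection, keeps the rest.
const pruned = pruneConnections(
  { default: ['b2', 'b3'], success: { block: 'b2' }, error: 'b4' },
  new Set(['b2'])
)
```

Supporting all three shapes in one pass is what lets the edit tool accept whatever connection format the YAML author used.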
94 apps/sim/app/api/copilot/tools/workflow/get-user-workflow.ts  Normal file
@@ -0,0 +1,94 @@
import { eq } from 'drizzle-orm'
import { createLogger } from '@/lib/logs/console/logger'
import { loadWorkflowFromNormalizedTables } from '@/lib/workflows/db-helpers'
import { db } from '@/db'
import { workflow as workflowTable } from '@/db/schema'
import { BaseCopilotTool } from '../base'

// Sim Agent API configuration
const SIM_AGENT_API_URL = process.env.SIM_AGENT_API_URL || 'http://localhost:8000'
const SIM_AGENT_API_KEY = process.env.SIM_AGENT_API_KEY

interface GetUserWorkflowParams {
  workflowId: string
  includeMetadata?: boolean
}

class GetUserWorkflowTool extends BaseCopilotTool<GetUserWorkflowParams, string> {
  readonly id = 'get_user_workflow'
  readonly displayName = 'Analyzing your workflow'

  protected async executeImpl(params: GetUserWorkflowParams): Promise<string> {
    return getUserWorkflow(params)
  }
}

// Export the tool instance
export const getUserWorkflowTool = new GetUserWorkflowTool()

// Implementation function
async function getUserWorkflow(params: GetUserWorkflowParams): Promise<string> {
  const logger = createLogger('GetUserWorkflow')
  const { workflowId, includeMetadata = false } = params

  logger.info('Fetching user workflow', { workflowId })

  // Fetch the workflow from the database
  const [workflowRecord] = await db
    .select()
    .from(workflowTable)
    .where(eq(workflowTable.id, workflowId))
    .limit(1)

  if (!workflowRecord) {
    throw new Error(`Workflow ${workflowId} not found`)
  }

  // Try to load from the normalized tables first, falling back to the JSON blob
  let workflowState: any = null
  const subBlockValues: Record<string, Record<string, any>> = {}

  const normalizedData = await loadWorkflowFromNormalizedTables(workflowId)
  if (normalizedData) {
    workflowState = {
      blocks: normalizedData.blocks,
      edges: normalizedData.edges,
      loops: normalizedData.loops,
      parallels: normalizedData.parallels,
    }

    // Extract subblock values from the normalized data
    Object.entries(normalizedData.blocks).forEach(([blockId, block]) => {
      subBlockValues[blockId] = {}
      Object.entries((block as any).subBlocks || {}).forEach(([subBlockId, subBlock]) => {
        if ((subBlock as any).value !== undefined) {
          subBlockValues[blockId][subBlockId] = (subBlock as any).value
        }
      })
    })
  } else if (workflowRecord.state) {
    // Fallback to the JSON blob
    workflowState = workflowRecord.state as any
    // For the JSON blob, subblock values are embedded in the block state
    Object.entries((workflowState.blocks as any) || {}).forEach(([blockId, block]) => {
      subBlockValues[blockId] = {}
      Object.entries((block as any).subBlocks || {}).forEach(([subBlockId, subBlock]) => {
        if ((subBlock as any).value !== undefined) {
          subBlockValues[blockId][subBlockId] = (subBlock as any).value
        }
      })
    })
  }

  if (!workflowState || !workflowState.blocks) {
    throw new Error('Workflow state is empty or invalid')
  }

  logger.info('Successfully fetched user workflow as JSON', {
    workflowId,
    blockCount: Object.keys(workflowState.blocks).length,
  })

  // Return the raw JSON workflow state
  return JSON.stringify(workflowState, null, 2)
}
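The subblock-value extraction that appears in both `get-user-workflow.ts` and `edit-workflow.ts` is the same loop each time; a factored-out sketch (hypothetical helper name, not extracted in the actual PR):

```typescript
// Sketch of subblock-value extraction: walk every block's subBlocks and collect
// each defined `value` into a { blockId: { subBlockId: value } } map, skipping
// subblocks that have no value set.
function extractSubBlockValues(
  blocks: Record<string, { subBlocks?: Record<string, { value?: unknown }> }>
): Record<string, Record<string, unknown>> {
  const values: Record<string, Record<string, unknown>> = {}
  for (const [blockId, block] of Object.entries(blocks)) {
    values[blockId] = {}
    for (const [subBlockId, subBlock] of Object.entries(block.subBlocks ?? {})) {
      if (subBlock.value !== undefined) {
        values[blockId][subBlockId] = subBlock.value
      }
    }
  }
  return values
}

// Usage: one block with a set value and an empty subblock, one block with none.
const extracted = extractSubBlockValues({
  b1: { subBlocks: { model: { value: 'gpt-4o' }, prompt: {} } },
  b2: {},
})
```

Since the loop is duplicated across two tools, hoisting it into a shared helper would keep the "same logic as get-user-workflow.ts" comment from drifting out of date.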
208
apps/sim/app/api/copilot/tools/workflow/get-workflow-console.ts
Normal file
208
apps/sim/app/api/copilot/tools/workflow/get-workflow-console.ts
Normal file
@@ -0,0 +1,208 @@
import { desc, eq } from 'drizzle-orm'
import { createLogger } from '@/lib/logs/console/logger'
import { db } from '@/db'
import { workflowExecutionLogs } from '@/db/schema'
import { BaseCopilotTool } from '../base'

interface GetWorkflowConsoleParams {
  workflowId: string
  limit?: number
  includeDetails?: boolean
}

interface BlockExecution {
  id: string
  blockId: string
  blockName: string
  blockType: string
  startedAt: string
  endedAt: string
  durationMs: number
  status: 'success' | 'error' | 'skipped'
  errorMessage?: string
  inputData: any
  outputData: any
  cost?: {
    total: number
    input: number
    output: number
    model?: string
    tokens?: {
      total: number
      prompt: number
      completion: number
    }
  }
}

interface ExecutionEntry {
  id: string
  executionId: string
  level: string
  message: string
  trigger: string
  startedAt: string
  endedAt: string | null
  durationMs: number | null
  blockCount: number
  successCount: number
  errorCount: number
  skippedCount: number
  totalCost: number | null
  totalTokens: number | null
  blockExecutions: BlockExecution[]
  output?: any // Final workflow output
}

interface WorkflowConsoleResult {
  entries: ExecutionEntry[]
  totalEntries: number
  workflowId: string
  retrievedAt: string
  hasBlockDetails: boolean
}

// Helper function to extract block executions from trace spans
function extractBlockExecutionsFromTraceSpans(traceSpans: any[]): BlockExecution[] {
  const blockExecutions: BlockExecution[] = []

  function processSpan(span: any) {
    if (span.blockId) {
      blockExecutions.push({
        id: span.id,
        blockId: span.blockId,
        blockName: span.name || '',
        blockType: span.type,
        startedAt: span.startTime,
        endedAt: span.endTime,
        durationMs: span.duration || 0,
        status: span.status || 'success',
        errorMessage: span.output?.error || undefined,
        inputData: span.input || {},
        outputData: span.output || {},
        cost: span.cost || undefined,
      })
    }

    // Process children recursively
    if (span.children && Array.isArray(span.children)) {
      span.children.forEach(processSpan)
    }
  }

  traceSpans.forEach(processSpan)
  return blockExecutions
}
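The depth-first flattening performed by `extractBlockExecutionsFromTraceSpans` can be sketched in isolation. The `Span` shape below is a simplified stand-in for the real trace format, not the project's actual types:

```typescript
// Minimal sketch of the depth-first span flattening: spans with a blockId
// are collected, and children are visited recursively in order.
interface Span {
  blockId?: string
  children?: Span[]
}

function flattenSpans(spans: Span[]): string[] {
  const out: string[] = []
  function walk(span: Span) {
    if (span.blockId) out.push(span.blockId)
    if (Array.isArray(span.children)) span.children.forEach(walk)
  }
  spans.forEach(walk)
  return out
}

const tree: Span[] = [
  { blockId: 'a', children: [{ blockId: 'b' }, { children: [{ blockId: 'c' }] }] },
]
console.log(flattenSpans(tree)) // → ['a', 'b', 'c']
```

Note that containers without a `blockId` still contribute their descendants, which matches how wrapper spans are handled above.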
class GetWorkflowConsoleTool extends BaseCopilotTool<
  GetWorkflowConsoleParams,
  WorkflowConsoleResult
> {
  readonly id = 'get_workflow_console'
  readonly displayName = 'Getting workflow console'

  protected async executeImpl(params: GetWorkflowConsoleParams): Promise<WorkflowConsoleResult> {
    return getWorkflowConsole(params)
  }
}

// Export the tool instance
export const getWorkflowConsoleTool = new GetWorkflowConsoleTool()

// Implementation function
async function getWorkflowConsole(
  params: GetWorkflowConsoleParams
): Promise<WorkflowConsoleResult> {
  const logger = createLogger('GetWorkflowConsole')
  const { workflowId, limit = 3, includeDetails = true } = params // Default to 3 executions and include details

  logger.info('Fetching workflow console logs', { workflowId, limit, includeDetails })

  // Get recent execution logs for the workflow (past 3 executions by default)
  const executionLogs = await db
    .select({
      id: workflowExecutionLogs.id,
      executionId: workflowExecutionLogs.executionId,
      level: workflowExecutionLogs.level,
      message: workflowExecutionLogs.message,
      trigger: workflowExecutionLogs.trigger,
      startedAt: workflowExecutionLogs.startedAt,
      endedAt: workflowExecutionLogs.endedAt,
      totalDurationMs: workflowExecutionLogs.totalDurationMs,
      blockCount: workflowExecutionLogs.blockCount,
      successCount: workflowExecutionLogs.successCount,
      errorCount: workflowExecutionLogs.errorCount,
      skippedCount: workflowExecutionLogs.skippedCount,
      totalCost: workflowExecutionLogs.totalCost,
      totalTokens: workflowExecutionLogs.totalTokens,
      metadata: workflowExecutionLogs.metadata,
    })
    .from(workflowExecutionLogs)
    .where(eq(workflowExecutionLogs.workflowId, workflowId))
    .orderBy(desc(workflowExecutionLogs.startedAt))
    .limit(limit)

  // Format the response with detailed block execution data
  const formattedEntries: ExecutionEntry[] = executionLogs.map((log) => {
    // Extract trace spans from metadata
    const metadata = log.metadata as any
    const traceSpans = metadata?.traceSpans || []
    const blockExecutions = extractBlockExecutionsFromTraceSpans(traceSpans)

    // Try to find the final output from the last executed block
    let finalOutput: any
    if (blockExecutions.length > 0) {
      // Look for blocks that typically provide final output (sorted by end time)
      const sortedBlocks = [...blockExecutions].sort(
        (a, b) => new Date(b.endedAt).getTime() - new Date(a.endedAt).getTime()
      )

      // Find the last successful block that has meaningful output
      const outputBlock = sortedBlocks.find(
        (block) =>
          block.status === 'success' && block.outputData && Object.keys(block.outputData).length > 0
      )

      if (outputBlock) {
        finalOutput = outputBlock.outputData
      }
    }

    const entry: ExecutionEntry = {
      id: log.id,
      executionId: log.executionId,
      level: log.level,
      message: log.message,
      trigger: log.trigger,
      startedAt: log.startedAt.toISOString(),
      endedAt: log.endedAt?.toISOString() || null,
      durationMs: log.totalDurationMs,
      blockCount: log.blockCount,
      successCount: log.successCount,
      errorCount: log.errorCount,
      skippedCount: log.skippedCount || 0,
      totalCost: log.totalCost ? Number.parseFloat(log.totalCost.toString()) : null,
      totalTokens: log.totalTokens,
      blockExecutions: includeDetails ? blockExecutions : [],
      output: finalOutput,
    }

    return entry
  })

  // Log the result size for monitoring
  const resultSize = JSON.stringify(formattedEntries).length
  logger.info('Workflow console result prepared', {
    entryCount: formattedEntries.length,
    resultSizeKB: Math.round(resultSize / 1024),
    hasBlockDetails: includeDetails,
  })

  return {
    entries: formattedEntries,
    totalEntries: formattedEntries.length,
    workflowId,
    retrievedAt: new Date().toISOString(),
    hasBlockDetails: includeDetails,
  }
}
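The final-output heuristic used in `getWorkflowConsole` can be sketched on its own: sort block results newest-first by end time, then take the first successful one with non-empty output. The `BlockResult` record is a simplified stand-in for `BlockExecution`:

```typescript
// Sketch of the final-output selection heuristic used above.
interface BlockResult {
  endedAt: string
  status: 'success' | 'error' | 'skipped'
  outputData: Record<string, unknown>
}

function pickFinalOutput(results: BlockResult[]): Record<string, unknown> | undefined {
  // Newest end time first
  const sorted = [...results].sort(
    (a, b) => new Date(b.endedAt).getTime() - new Date(a.endedAt).getTime()
  )
  // First successful block with meaningful (non-empty) output
  return sorted.find((r) => r.status === 'success' && Object.keys(r.outputData).length > 0)
    ?.outputData
}

const runs: BlockResult[] = [
  { endedAt: '2024-01-01T00:00:01Z', status: 'success', outputData: { a: 1 } },
  { endedAt: '2024-01-01T00:00:03Z', status: 'error', outputData: { err: true } },
  { endedAt: '2024-01-01T00:00:02Z', status: 'success', outputData: { b: 2 } },
]
console.log(pickFinalOutput(runs)) // → { b: 2 }
```

The most recent span here is an error, so the heuristic falls back to the newest *successful* block rather than returning the failure payload.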
@@ -1,76 +0,0 @@
import { type NextRequest, NextResponse } from 'next/server'
import { searchDocumentation } from '@/lib/copilot/service'
import { createLogger } from '@/lib/logs/console/logger'

const logger = createLogger('DocsSearchAPI')

// Request and response type definitions
interface DocsSearchRequest {
  query: string
  topK?: number
}

interface DocsSearchResult {
  id: number
  title: string
  url: string
  content: string
  similarity: number
}

interface DocsSearchSuccessResponse {
  success: true
  results: DocsSearchResult[]
  query: string
  totalResults: number
  searchTime?: number
}

interface DocsSearchErrorResponse {
  success: false
  error: string
}

export async function POST(
  request: NextRequest
): Promise<NextResponse<DocsSearchSuccessResponse | DocsSearchErrorResponse>> {
  try {
    const requestBody: DocsSearchRequest = await request.json()
    const { query, topK = 10 } = requestBody

    if (!query) {
      const errorResponse: DocsSearchErrorResponse = {
        success: false,
        error: 'Query is required',
      }
      return NextResponse.json(errorResponse, { status: 400 })
    }

    logger.info('Executing documentation search', { query, topK })

    const startTime = Date.now()
    const results = await searchDocumentation(query, { topK })
    const searchTime = Date.now() - startTime

    logger.info(`Found ${results.length} documentation results`, { query })

    const successResponse: DocsSearchSuccessResponse = {
      success: true,
      results,
      query,
      totalResults: results.length,
      searchTime,
    }

    return NextResponse.json(successResponse)
  } catch (error) {
    logger.error('Documentation search API failed', error)

    const errorResponse: DocsSearchErrorResponse = {
      success: false,
      error: `Documentation search failed: ${error instanceof Error ? error.message : 'Unknown error'}`,
    }

    return NextResponse.json(errorResponse, { status: 500 })
  }
}
223
apps/sim/app/api/environment/variables/route.ts
Normal file
@@ -0,0 +1,223 @@
import { eq } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { getEnvironmentVariableKeys } from '@/lib/environment/utils'
import { createLogger } from '@/lib/logs/console/logger'
import { decryptSecret, encryptSecret } from '@/lib/utils'
import { getUserId } from '@/app/api/auth/oauth/utils'
import { db } from '@/db'
import { environment } from '@/db/schema'

const logger = createLogger('EnvironmentVariablesAPI')

// Schema for environment variable updates
const EnvVarSchema = z.object({
  variables: z.record(z.string()),
})

export async function GET(request: NextRequest) {
  const requestId = crypto.randomUUID().slice(0, 8)

  try {
    // For GET requests, check for workflowId in query params
    const { searchParams } = new URL(request.url)
    const workflowId = searchParams.get('workflowId')

    // Use dual authentication pattern like other copilot tools
    const userId = await getUserId(requestId, workflowId || undefined)

    if (!userId) {
      logger.warn(`[${requestId}] Unauthorized environment variables access attempt`)
      return NextResponse.json({ error: 'Unauthorized' }, { status: 401 })
    }

    // Get only the variable names (keys), not values
    const result = await getEnvironmentVariableKeys(userId)

    return NextResponse.json(
      {
        success: true,
        output: result,
      },
      { status: 200 }
    )
  } catch (error: any) {
    logger.error(`[${requestId}] Environment variables fetch error`, error)
    return NextResponse.json(
      {
        success: false,
        error: error.message || 'Failed to get environment variables',
      },
      { status: 500 }
    )
  }
}

export async function PUT(request: NextRequest) {
  const requestId = crypto.randomUUID().slice(0, 8)

  try {
    const body = await request.json()
    const { workflowId, variables } = body

    // Use dual authentication pattern like other copilot tools
    const userId = await getUserId(requestId, workflowId)

    if (!userId) {
      logger.warn(`[${requestId}] Unauthorized environment variables set attempt`)
      return NextResponse.json({ error: 'Unauthorized' }, { status: 401 })
    }

    try {
      const { variables: validatedVariables } = EnvVarSchema.parse({ variables })

      // Get existing environment variables for this user
      const existingData = await db
        .select()
        .from(environment)
        .where(eq(environment.userId, userId))
        .limit(1)

      // Start with existing encrypted variables or empty object
      const existingEncryptedVariables =
        (existingData[0]?.variables as Record<string, string>) || {}

      // Determine which variables are new or changed by comparing with decrypted existing values
      const variablesToEncrypt: Record<string, string> = {}
      const addedVariables: string[] = []
      const updatedVariables: string[] = []

      for (const [key, newValue] of Object.entries(validatedVariables)) {
        if (!(key in existingEncryptedVariables)) {
          // New variable
          variablesToEncrypt[key] = newValue
          addedVariables.push(key)
        } else {
          // Check if the value has actually changed by decrypting the existing value
          try {
            const { decrypted: existingValue } = await decryptSecret(
              existingEncryptedVariables[key]
            )

            if (existingValue !== newValue) {
              // Value changed, needs re-encryption
              variablesToEncrypt[key] = newValue
              updatedVariables.push(key)
            }
            // If values are the same, keep the existing encrypted value
          } catch (decryptError) {
            // If we can't decrypt the existing value, treat as changed and re-encrypt
            logger.warn(
              `[${requestId}] Could not decrypt existing variable ${key}, re-encrypting`,
              { error: decryptError }
            )
            variablesToEncrypt[key] = newValue
            updatedVariables.push(key)
          }
        }
      }

      // Only encrypt the variables that are new or changed
      const newlyEncryptedVariables = await Object.entries(variablesToEncrypt).reduce(
        async (accPromise, [key, value]) => {
          const acc = await accPromise
          const { encrypted } = await encryptSecret(value)
          return { ...acc, [key]: encrypted }
        },
        Promise.resolve({})
      )
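The awaited-reduce above builds an object from async per-entry work while processing entries strictly in sequence. A standalone sketch, with `encrypt` as a stand-in for the real `encryptSecret` helper:

```typescript
// Sketch of the awaited-reduce pattern: each step awaits the accumulator
// promise before doing its own async work, so entries run one at a time.
async function encrypt(value: string): Promise<string> {
  return `enc(${value})` // stand-in; the real helper returns ciphertext
}

async function encryptAll(vars: Record<string, string>): Promise<Record<string, string>> {
  return Object.entries(vars).reduce(
    async (accPromise, [key, value]) => {
      const acc = await accPromise
      return { ...acc, [key]: await encrypt(value) }
    },
    Promise.resolve({} as Record<string, string>)
  )
}

encryptAll({ A: '1', B: '2' }).then(console.log) // → { A: 'enc(1)', B: 'enc(2)' }
```

Compared with `Promise.all`, this trades parallelism for predictable, ordered execution, which is a reasonable choice when each step touches a shared secret store.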

      // Merge existing encrypted variables with newly encrypted ones
      const finalEncryptedVariables = { ...existingEncryptedVariables, ...newlyEncryptedVariables }

      // Update or insert environment variables for user
      await db
        .insert(environment)
        .values({
          id: crypto.randomUUID(),
          userId: userId,
          variables: finalEncryptedVariables,
          updatedAt: new Date(),
        })
        .onConflictDoUpdate({
          target: [environment.userId],
          set: {
            variables: finalEncryptedVariables,
            updatedAt: new Date(),
          },
        })

      return NextResponse.json(
        {
          success: true,
          output: {
            message: `Successfully processed ${Object.keys(validatedVariables).length} environment variable(s): ${addedVariables.length} added, ${updatedVariables.length} updated`,
            variableCount: Object.keys(validatedVariables).length,
            variableNames: Object.keys(validatedVariables),
            totalVariableCount: Object.keys(finalEncryptedVariables).length,
            addedVariables,
            updatedVariables,
          },
        },
        { status: 200 }
      )
    } catch (validationError) {
      if (validationError instanceof z.ZodError) {
        logger.warn(`[${requestId}] Invalid environment variables data`, {
          errors: validationError.errors,
        })
        return NextResponse.json(
          { error: 'Invalid request data', details: validationError.errors },
          { status: 400 }
        )
      }
      throw validationError
    }
  } catch (error: any) {
    logger.error(`[${requestId}] Environment variables set error`, error)
    return NextResponse.json(
      {
        success: false,
        error: error.message || 'Failed to set environment variables',
      },
      { status: 500 }
    )
  }
}

export async function POST(request: NextRequest) {
  const requestId = crypto.randomUUID().slice(0, 8)

  try {
    const body = await request.json()
    const { workflowId } = body

    // Use dual authentication pattern like other copilot tools
    const userId = await getUserId(requestId, workflowId)

    if (!userId) {
      logger.warn(`[${requestId}] Unauthorized environment variables access attempt`)
      return NextResponse.json({ error: 'Unauthorized' }, { status: 401 })
    }

    // Get only the variable names (keys), not values
    const result = await getEnvironmentVariableKeys(userId)

    return NextResponse.json(
      {
        success: true,
        output: result,
      },
      { status: 200 }
    )
  } catch (error: any) {
    logger.error(`[${requestId}] Environment variables fetch error`, error)
    return NextResponse.json(
      {
        success: false,
        error: error.message || 'Failed to get environment variables',
      },
      { status: 500 }
    )
  }
}
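The merge step relies on spread ordering: keys that were not re-encrypted keep their existing ciphertext, while new or changed keys are overwritten because the later spread wins. A tiny sketch with placeholder ciphertext strings:

```typescript
// Later spread wins: unchanged keys keep their existing encrypted value,
// freshly encrypted keys replace the old ones.
const existing = { API_KEY: 'enc-old', DB_URL: 'enc-db' }
const fresh = { API_KEY: 'enc-new' }
const merged = { ...existing, ...fresh }
console.log(merged) // → { API_KEY: 'enc-new', DB_URL: 'enc-db' }
```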
@@ -1,415 +0,0 @@
import { eq } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { autoLayoutWorkflow } from '@/lib/autolayout/service'
import { createLogger } from '@/lib/logs/console/logger'
import {
  loadWorkflowFromNormalizedTables,
  saveWorkflowToNormalizedTables,
} from '@/lib/workflows/db-helpers'
import { generateWorkflowYaml } from '@/lib/workflows/yaml-generator'
import { getUserId } from '@/app/api/auth/oauth/utils'
import { getBlock } from '@/blocks'
import { db } from '@/db'
import { copilotCheckpoints, workflow as workflowTable } from '@/db/schema'
import { generateLoopBlocks, generateParallelBlocks } from '@/stores/workflows/workflow/utils'
import { convertYamlToWorkflow, parseWorkflowYaml } from '@/stores/workflows/yaml/importer'

export const dynamic = 'force-dynamic'

const logger = createLogger('EditWorkflowAPI')

export async function POST(request: NextRequest) {
  const requestId = crypto.randomUUID().slice(0, 8)

  try {
    const body = await request.json()
    const { yamlContent, workflowId, description, chatId } = body

    if (!yamlContent) {
      return NextResponse.json(
        { success: false, error: 'yamlContent is required' },
        { status: 400 }
      )
    }

    if (!workflowId) {
      return NextResponse.json({ success: false, error: 'workflowId is required' }, { status: 400 })
    }

    logger.info(`[${requestId}] Processing workflow edit request`, {
      workflowId,
      yamlLength: yamlContent.length,
      hasDescription: !!description,
      hasChatId: !!chatId,
    })

    // Log the full YAML content for debugging
    logger.info(`[${requestId}] Full YAML content from copilot:`)
    logger.info('='.repeat(80))
    logger.info(yamlContent)
    logger.info('='.repeat(80))

    // Get the user ID for checkpoint creation
    const userId = await getUserId(requestId, workflowId)
    if (!userId) {
      return NextResponse.json({ success: false, error: 'User not found' }, { status: 404 })
    }

    // Create checkpoint before making changes (only if chatId is provided)
    if (chatId) {
      try {
        logger.info(`[${requestId}] Creating checkpoint before workflow edit`)

        // Get current workflow state
        const currentWorkflowData = await loadWorkflowFromNormalizedTables(workflowId)

        if (currentWorkflowData) {
          // Generate YAML from current state
          const currentYaml = generateWorkflowYaml(currentWorkflowData)

          // Create checkpoint
          await db.insert(copilotCheckpoints).values({
            userId,
            workflowId,
            chatId,
            yaml: currentYaml,
          })

          logger.info(`[${requestId}] Checkpoint created successfully`)
        } else {
          logger.warn(`[${requestId}] Could not load current workflow state for checkpoint`)
        }
      } catch (checkpointError) {
        logger.error(`[${requestId}] Failed to create checkpoint:`, checkpointError)
        // Continue with workflow edit even if checkpoint fails
      }
    }

    // Parse YAML content server-side
    const { data: yamlWorkflow, errors: parseErrors } = parseWorkflowYaml(yamlContent)

    if (!yamlWorkflow || parseErrors.length > 0) {
      logger.error('[edit-workflow] YAML parsing failed', { parseErrors })
      return NextResponse.json({
        success: true,
        data: {
          success: false,
          message: 'Failed to parse YAML workflow',
          errors: parseErrors,
          warnings: [],
        },
      })
    }

    // Convert YAML to workflow format
    const { blocks, edges, errors: convertErrors, warnings } = convertYamlToWorkflow(yamlWorkflow)

    if (convertErrors.length > 0) {
      logger.error('[edit-workflow] YAML conversion failed', { convertErrors })
      return NextResponse.json({
        success: true,
        data: {
          success: false,
          message: 'Failed to convert YAML to workflow',
          errors: convertErrors,
          warnings,
        },
      })
    }

    // Create workflow state (same format as applyWorkflowDiff)
    const newWorkflowState: any = {
      blocks: {} as Record<string, any>,
      edges: [] as any[],
      loops: {} as Record<string, any>,
      parallels: {} as Record<string, any>,
      lastSaved: Date.now(),
      isDeployed: false,
      deployedAt: undefined,
      deploymentStatuses: {} as Record<string, any>,
      hasActiveSchedule: false,
      hasActiveWebhook: false,
    }

    // Process blocks and assign new IDs (complete replacement)
    const blockIdMapping = new Map<string, string>()

    for (const block of blocks) {
      const newId = crypto.randomUUID()
      blockIdMapping.set(block.id, newId)

      // Get block configuration to set proper defaults
      const blockConfig = getBlock(block.type)
      const subBlocks: Record<string, any> = {}
      const outputs: Record<string, any> = {}

      // Set up subBlocks from block configuration
      if (blockConfig?.subBlocks) {
        blockConfig.subBlocks.forEach((subBlock) => {
          subBlocks[subBlock.id] = {
            id: subBlock.id,
            type: subBlock.type,
            value: null,
          }
        })
      }

      // Set up outputs from block configuration
      if (blockConfig?.outputs) {
        if (Array.isArray(blockConfig.outputs)) {
          blockConfig.outputs.forEach((output) => {
            outputs[output.id] = { type: output.type }
          })
        } else if (typeof blockConfig.outputs === 'object') {
          Object.assign(outputs, blockConfig.outputs)
        }
      }

      newWorkflowState.blocks[newId] = {
        id: newId,
        type: block.type,
        name: block.name,
        position: block.position,
        subBlocks,
        outputs,
        enabled: true,
        horizontalHandles: true,
        isWide: false,
        advancedMode: false,
        height: 0,
        data: block.data || {},
      }

      // Set input values as subblock values with block reference mapping
      if (block.inputs && typeof block.inputs === 'object') {
        Object.entries(block.inputs).forEach(([key, value]) => {
          if (newWorkflowState.blocks[newId].subBlocks[key]) {
            // Update block references in values to use new mapped IDs
            let processedValue = value
            if (typeof value === 'string' && value.includes('<') && value.includes('>')) {
              // Update block references to use new mapped IDs
              const blockMatches = value.match(/<([^>]+)>/g)
              if (blockMatches) {
                for (const match of blockMatches) {
                  const path = match.slice(1, -1)
                  const [blockRef] = path.split('.')

                  // Skip system references (start, loop, parallel, variable)
                  if (['start', 'loop', 'parallel', 'variable'].includes(blockRef.toLowerCase())) {
                    continue
                  }

                  // Check if this references an old block ID that needs mapping
                  const newMappedId = blockIdMapping.get(blockRef)
                  if (newMappedId) {
                    logger.info(
                      `[${requestId}] Updating block reference: ${blockRef} -> ${newMappedId}`
                    )
                    processedValue = processedValue.replace(
                      new RegExp(`<${blockRef}\\.`, 'g'),
                      `<${newMappedId}.`
                    )
                    processedValue = processedValue.replace(
                      new RegExp(`<${blockRef}>`, 'g'),
                      `<${newMappedId}>`
                    )
                  }
                }
              }
            }
            newWorkflowState.blocks[newId].subBlocks[key].value = processedValue
          }
        })
      }
    }
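The `<blockRef.path>` rewrite above can be sketched standalone: old block IDs embedded in string values are swapped for their newly generated IDs, while system references such as `<start.input>` are left untouched. The function name and sample IDs below are illustrative:

```typescript
// Sketch of the block-reference remapping: scan <...> tokens, skip system
// references, and rewrite any token whose prefix appears in the ID mapping.
function remapReferences(value: string, mapping: Map<string, string>): string {
  let out = value
  for (const match of value.match(/<([^>]+)>/g) ?? []) {
    const [ref] = match.slice(1, -1).split('.')
    if (['start', 'loop', 'parallel', 'variable'].includes(ref.toLowerCase())) continue
    const mapped = mapping.get(ref)
    if (mapped) {
      out = out.replace(new RegExp(`<${ref}\\.`, 'g'), `<${mapped}.`)
      out = out.replace(new RegExp(`<${ref}>`, 'g'), `<${mapped}>`)
    }
  }
  return out
}

const mapping = new Map([['old-1', 'new-1']])
console.log(remapReferences('<old-1.output> and <start.input>', mapping))
// → '<new-1.output> and <start.input>'
```

One caveat this sketch shares with the original: block IDs are interpolated into a `RegExp`, so IDs containing regex metacharacters would need escaping; UUIDs are safe.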
    // Update parent-child relationships with mapped IDs
    logger.info(`[${requestId}] Block ID mapping:`, Object.fromEntries(blockIdMapping))
    for (const [newId, blockData] of Object.entries(newWorkflowState.blocks)) {
      const block = blockData as any
      if (block.data?.parentId) {
        logger.info(
          `[${requestId}] Found child block ${block.name} with parentId: ${block.data.parentId}`
        )
        const mappedParentId = blockIdMapping.get(block.data.parentId)
        if (mappedParentId) {
          logger.info(
            `[${requestId}] Updating parent reference: ${block.data.parentId} -> ${mappedParentId}`
          )
          block.data.parentId = mappedParentId
          // Ensure extent is set for child blocks
          if (!block.data.extent) {
            block.data.extent = 'parent'
          }
        } else {
          logger.error(
            `[${requestId}] ❌ Parent block not found for mapping: ${block.data.parentId}`
          )
          logger.error(`[${requestId}] Available mappings:`, Array.from(blockIdMapping.keys()))
          // Remove invalid parent reference
          block.data.parentId = undefined
          block.data.extent = undefined
        }
      }
    }

    // Process edges with mapped IDs
    for (const edge of edges) {
      const sourceId = blockIdMapping.get(edge.source)
      const targetId = blockIdMapping.get(edge.target)

      if (sourceId && targetId) {
        newWorkflowState.edges.push({
          id: crypto.randomUUID(),
          source: sourceId,
          target: targetId,
          sourceHandle: edge.sourceHandle,
          targetHandle: edge.targetHandle,
          type: edge.type || 'default',
        })
      }
    }

    // Generate loop and parallel configurations from the imported blocks
    const loops = generateLoopBlocks(newWorkflowState.blocks)
    const parallels = generateParallelBlocks(newWorkflowState.blocks)

    // Update workflow state with generated configurations
    newWorkflowState.loops = loops
    newWorkflowState.parallels = parallels

    logger.info(`[${requestId}] Generated loop and parallel configurations`, {
      loopsCount: Object.keys(loops).length,
      parallelsCount: Object.keys(parallels).length,
      loopIds: Object.keys(loops),
      parallelIds: Object.keys(parallels),
    })

    // Apply intelligent autolayout to optimize block positions
    try {
      logger.info(
        `[${requestId}] Applying autolayout to ${Object.keys(newWorkflowState.blocks).length} blocks`
      )

      const layoutedBlocks = await autoLayoutWorkflow(
        newWorkflowState.blocks,
        newWorkflowState.edges,
        {
          strategy: 'smart',
          direction: 'auto',
          spacing: {
            horizontal: 400,
            vertical: 200,
            layer: 600,
          },
          alignment: 'center',
          padding: {
            x: 200,
            y: 200,
          },
        }
      )

      // Update workflow state with optimized positions
      newWorkflowState.blocks = layoutedBlocks

      logger.info(`[${requestId}] Autolayout completed successfully`)
    } catch (layoutError) {
      // Log the error but don't fail the entire workflow save
      logger.warn(`[${requestId}] Autolayout failed, using original positions:`, layoutError)
    }

    // Save directly to database using the same function as the workflow state API
    const saveResult = await saveWorkflowToNormalizedTables(workflowId, newWorkflowState)

    if (!saveResult.success) {
      logger.error('[edit-workflow] Failed to save workflow state:', saveResult.error)
      return NextResponse.json({
        success: true,
        data: {
          success: false,
          message: `Database save failed: ${saveResult.error || 'Unknown error'}`,
          errors: [saveResult.error || 'Database save failed'],
          warnings,
        },
      })
    }

    // Update workflow's lastSynced timestamp
    await db
      .update(workflowTable)
      .set({
        lastSynced: new Date(),
        updatedAt: new Date(),
        state: saveResult.jsonBlob, // Also update JSON blob for backward compatibility
      })
      .where(eq(workflowTable.id, workflowId))

    // Notify the socket server to tell clients to rehydrate stores from database
    try {
      const socketUrl = process.env.SOCKET_URL || 'http://localhost:3002'
      await fetch(`${socketUrl}/api/copilot-workflow-edit`, {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({
          workflowId,
          description: description || 'Copilot edited workflow',
        }),
      })
      logger.info('[edit-workflow] Notified socket server to rehydrate client stores from database')
    } catch (socketError) {
      // Don't fail the main request if socket notification fails
      logger.warn('[edit-workflow] Failed to notify socket server:', socketError)
    }

    // Calculate summary with loop/parallel information
    const loopBlocksCount = Object.values(newWorkflowState.blocks).filter(
      (b: any) => b.type === 'loop'
    ).length
    const parallelBlocksCount = Object.values(newWorkflowState.blocks).filter(
      (b: any) => b.type === 'parallel'
    ).length

    let summaryDetails = `Successfully created workflow with ${blocks.length} blocks and ${edges.length} connections.`

    if (loopBlocksCount > 0 || parallelBlocksCount > 0) {
      summaryDetails += ` Generated ${Object.keys(loops).length} loop configurations and ${Object.keys(parallels).length} parallel configurations.`
    }

    const result = {
      success: true,
      errors: [],
      warnings,
      summary: summaryDetails,
    }

    logger.info('[edit-workflow] Import result', {
      success: result.success,
      errorCount: result.errors.length,
      warningCount: result.warnings.length,
      summary: result.summary,
    })

    return NextResponse.json({
      success: true,
      data: {
        success: result.success,
        message: result.success
          ? `Workflow updated successfully${description ? `: ${description}` : ''}`
          : 'Failed to update workflow',
        summary: result.summary,
        errors: result.errors,
        warnings: result.warnings,
      },
    })
  } catch (error) {
    logger.error('[edit-workflow] Error:', error)
    return NextResponse.json(
      {
        success: false,
        error: `Failed to edit workflow: ${error instanceof Error ? error.message : 'Unknown error'}`,
      },
      { status: 500 }
    )
  }
}
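The edge-remapping step above keeps an edge only when both endpoints received new IDs; edges referencing unknown blocks are dropped silently. A minimal sketch, with a simplified `Edge` shape:

```typescript
// Sketch of edge remapping: translate both endpoints through the ID
// mapping and discard any edge with a dangling endpoint.
interface Edge {
  source: string
  target: string
}

function remapEdges(edges: Edge[], mapping: Map<string, string>): Edge[] {
  const out: Edge[] = []
  for (const edge of edges) {
    const source = mapping.get(edge.source)
    const target = mapping.get(edge.target)
    if (source && target) out.push({ source, target })
  }
  return out
}

const idMap = new Map([
  ['a', 'A'],
  ['b', 'B'],
])
console.log(remapEdges([{ source: 'a', target: 'b' }, { source: 'a', target: 'x' }], idMap))
// → [ { source: 'A', target: 'B' } ]
```

Dropping rather than erroring on dangling edges matches the tolerant behavior of the route, which prefers a partially connected workflow over a failed import.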
@@ -1,66 +0,0 @@
|
||||
import { type NextRequest, NextResponse } from 'next/server'
|
import { type NextRequest, NextResponse } from 'next/server'
import { createLogger } from '@/lib/logs/console/logger'
import { registry as blockRegistry } from '@/blocks/registry'

const logger = createLogger('GetAllBlocksAPI')

export async function POST(request: NextRequest) {
  try {
    const body = await request.json()
    const { includeDetails = false, filterCategory } = body

    logger.info('Getting all blocks and tools', { includeDetails, filterCategory })

    // Create mapping of block_id -> [tool_ids]
    const blockToToolsMapping: Record<string, string[]> = {}

    // Process blocks - filter out hidden blocks and map each to its tools
    Object.entries(blockRegistry)
      .filter(([blockType, blockConfig]) => {
        // Filter out hidden blocks
        if (blockConfig.hideFromToolbar) return false

        // Apply category filter if specified
        if (filterCategory && blockConfig.category !== filterCategory) return false

        return true
      })
      .forEach(([blockType, blockConfig]) => {
        // Get the tools for this block
        const blockTools = blockConfig.tools?.access || []
        blockToToolsMapping[blockType] = blockTools
      })

    const totalBlocks = Object.keys(blockRegistry).length
    const includedBlocks = Object.keys(blockToToolsMapping).length
    const filteredBlocksCount = totalBlocks - includedBlocks

    // Log the block-to-tools mapping for debugging
    const blockToolsInfo = Object.entries(blockToToolsMapping)
      .map(([blockType, tools]) => `${blockType}: [${tools.join(', ')}]`)
      .sort()

    logger.info(`Successfully mapped ${includedBlocks} blocks to their tools`, {
      totalBlocks,
      includedBlocks,
      filteredBlocks: filteredBlocksCount,
      filterCategory,
      blockToolsMapping: blockToolsInfo,
      outputMapping: blockToToolsMapping,
    })

    return NextResponse.json({
      success: true,
      data: blockToToolsMapping,
    })
  } catch (error) {
    logger.error('Get all blocks failed', error)
    return NextResponse.json(
      {
        success: false,
        error: `Failed to get blocks and tools: ${error instanceof Error ? error.message : 'Unknown error'}`,
      },
      { status: 500 }
    )
  }
}
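The route above boils down to reducing the registry to a block-to-tools map. A minimal standalone sketch of that logic, assuming a simplified `BlockEntry` shape and an invented demo registry (the real `blockRegistry` types are richer):

```typescript
// Only the fields the route actually reads, as a trimmed stand-in shape.
interface BlockEntry {
  category?: string
  hideFromToolbar?: boolean
  tools?: { access?: string[] }
}

// Pure version of the filter + forEach above: skip hidden blocks, optionally
// filter by category, and map each remaining block type to its tool ids.
function mapBlocksToTools(
  registry: Record<string, BlockEntry>,
  filterCategory?: string
): Record<string, string[]> {
  const mapping: Record<string, string[]> = {}
  for (const [blockType, config] of Object.entries(registry)) {
    if (config.hideFromToolbar) continue
    if (filterCategory && config.category !== filterCategory) continue
    mapping[blockType] = config.tools?.access || []
  }
  return mapping
}

// Illustrative data, not the real registry.
const demoRegistry: Record<string, BlockEntry> = {
  agent: { category: 'blocks', tools: { access: ['openai_chat'] } },
  secret: { category: 'blocks', hideFromToolbar: true },
  slack: { category: 'tools', tools: { access: ['slack_message'] } },
}

const mapping = mapBlocksToTools(demoRegistry, 'tools')
```

With the demo data, `mapping` contains only `slack`, since `secret` is hidden and `agent` fails the category filter.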
@@ -1,239 +0,0 @@
import { existsSync, readFileSync } from 'fs'
import { join } from 'path'
import { type NextRequest, NextResponse } from 'next/server'
import { createLogger } from '@/lib/logs/console/logger'
import { registry as blockRegistry } from '@/blocks/registry'
import { tools as toolsRegistry } from '@/tools/registry'

const logger = createLogger('GetBlockMetadataAPI')

// Core blocks that have documentation with YAML schemas
const CORE_BLOCKS_WITH_DOCS = [
  'agent',
  'function',
  'api',
  'condition',
  'loop',
  'parallel',
  'response',
  'router',
  'evaluator',
  'webhook',
]

// Mapping for blocks whose doc files have different names
const DOCS_FILE_MAPPING: Record<string, string> = {
  webhook: 'webhook_trigger',
}

// Helper function to read the YAML schema from the dedicated YAML documentation files
function getYamlSchemaFromDocs(blockType: string): string | null {
  try {
    const docFileName = DOCS_FILE_MAPPING[blockType] || blockType
    // Read from the new YAML documentation structure
    const yamlDocsPath = join(
      process.cwd(),
      '..',
      'docs/content/docs/yaml/blocks',
      `${docFileName}.mdx`
    )

    if (!existsSync(yamlDocsPath)) {
      logger.warn(`YAML schema file not found for ${blockType} at ${yamlDocsPath}`)
      return null
    }

    const content = readFileSync(yamlDocsPath, 'utf-8')

    // Remove the frontmatter and return the content after the title
    const contentWithoutFrontmatter = content.replace(/^---[\s\S]*?---\s*/, '')
    return contentWithoutFrontmatter.trim()
  } catch (error) {
    logger.warn(`Failed to read YAML schema for ${blockType}:`, error)
    return null
  }
}

export async function POST(request: NextRequest) {
  try {
    const body = await request.json()
    const { blockIds } = body

    if (!blockIds || !Array.isArray(blockIds)) {
      return NextResponse.json(
        {
          success: false,
          error: 'blockIds must be an array of block IDs',
        },
        { status: 400 }
      )
    }

    logger.info('Getting block metadata', {
      blockIds,
      blockCount: blockIds.length,
      requestedBlocks: blockIds.join(', '),
    })

    // Create result object
    const result: Record<string, any> = {}

    for (const blockId of blockIds) {
      const blockConfig = blockRegistry[blockId]

      if (!blockConfig) {
        logger.warn(`Block not found: ${blockId}`)
        continue
      }

      // Always include code schemas from the block configuration
      const codeSchemas = {
        inputs: blockConfig.inputs,
        outputs: blockConfig.outputs,
        subBlocks: blockConfig.subBlocks,
      }

      // Check if this is a core block with YAML documentation
      if (CORE_BLOCKS_WITH_DOCS.includes(blockId)) {
        // For core blocks, return both the YAML schema from documentation AND code schemas
        const yamlSchema = getYamlSchemaFromDocs(blockId)

        if (yamlSchema) {
          result[blockId] = {
            type: 'block',
            description: blockConfig.description || '',
            longDescription: blockConfig.longDescription,
            category: blockConfig.category || '',
            yamlSchema: yamlSchema,
            docsLink: blockConfig.docsLink,
            // Include actual schemas from code
            codeSchemas: codeSchemas,
          }
        } else {
          // Fall back to regular metadata if the YAML schema is not found
          result[blockId] = {
            type: 'block',
            description: blockConfig.description || '',
            longDescription: blockConfig.longDescription,
            category: blockConfig.category || '',
            inputs: blockConfig.inputs,
            outputs: blockConfig.outputs,
            subBlocks: blockConfig.subBlocks,
            // Include actual schemas from code
            codeSchemas: codeSchemas,
          }
        }
      } else {
        // For tool blocks, return tool schema information AND code schemas
        const blockTools = blockConfig.tools?.access || []
        const toolSchemas: Record<string, any> = {}

        for (const toolId of blockTools) {
          const toolConfig = toolsRegistry[toolId]
          if (toolConfig) {
            toolSchemas[toolId] = {
              id: toolConfig.id,
              name: toolConfig.name,
              description: toolConfig.description || '',
              version: toolConfig.version,
              params: toolConfig.params,
              request: toolConfig.request
                ? {
                    method: toolConfig.request.method,
                    url: toolConfig.request.url,
                    headers:
                      typeof toolConfig.request.headers === 'function'
                        ? 'function'
                        : toolConfig.request.headers,
                    isInternalRoute: toolConfig.request.isInternalRoute,
                  }
                : undefined,
            }
          } else {
            logger.warn(`Tool not found: ${toolId} for block: ${blockId}`)
            toolSchemas[toolId] = {
              id: toolId,
              description: 'Tool not found',
            }
          }
        }

        result[blockId] = {
          type: 'tool',
          description: blockConfig.description || '',
          longDescription: blockConfig.longDescription,
          category: blockConfig.category || '',
          inputs: blockConfig.inputs,
          outputs: blockConfig.outputs,
          subBlocks: blockConfig.subBlocks,
          toolSchemas: toolSchemas,
          // Include actual schemas from code
          codeSchemas: codeSchemas,
        }
      }
    }

    const processedBlocks = Object.keys(result).length
    const requestedBlocks = blockIds.length
    const notFoundBlocks = requestedBlocks - processedBlocks

    // Log detailed output for debugging
    Object.entries(result).forEach(([blockId, blockData]) => {
      if (blockData.type === 'block' && blockData.yamlSchema) {
        logger.info(`Retrieved YAML schema + code schemas for core block: ${blockId}`, {
          blockId,
          type: blockData.type,
          description: blockData.description,
          yamlSchemaLength: blockData.yamlSchema.length,
          yamlSchemaPreview: `${blockData.yamlSchema.substring(0, 200)}...`,
          hasCodeSchemas: !!blockData.codeSchemas,
          codeSubBlocksCount: blockData.codeSchemas?.subBlocks?.length || 0,
        })
      } else if (blockData.type === 'tool' && blockData.toolSchemas) {
        const toolIds = Object.keys(blockData.toolSchemas)
        logger.info(`Retrieved tool schemas + code schemas for tool block: ${blockId}`, {
          blockId,
          type: blockData.type,
          description: blockData.description,
          toolCount: toolIds.length,
          toolIds: toolIds,
          hasCodeSchemas: !!blockData.codeSchemas,
          codeSubBlocksCount: blockData.codeSchemas?.subBlocks?.length || 0,
        })
      } else {
        logger.info(`Retrieved metadata + code schemas for block: ${blockId}`, {
          blockId,
          type: blockData.type,
          description: blockData.description,
          hasInputs: !!blockData.inputs,
          hasOutputs: !!blockData.outputs,
          hasSubBlocks: !!blockData.subBlocks,
          hasCodeSchemas: !!blockData.codeSchemas,
          codeSubBlocksCount: blockData.codeSchemas?.subBlocks?.length || 0,
        })
      }
    })

    logger.info(`Successfully processed metadata for ${processedBlocks} blocks`, {
      requestedBlocks,
      processedBlocks,
      notFoundBlocks,
      coreBlocks: blockIds.filter((id) => CORE_BLOCKS_WITH_DOCS.includes(id)),
      toolBlocks: blockIds.filter((id) => !CORE_BLOCKS_WITH_DOCS.includes(id)),
    })

    return NextResponse.json({
      success: true,
      data: result,
    })
  } catch (error) {
    logger.error('Get block metadata failed', error)
    return NextResponse.json(
      {
        success: false,
        error: `Failed to get block metadata: ${error instanceof Error ? error.message : 'Unknown error'}`,
      },
      { status: 500 }
    )
  }
}
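The frontmatter stripping in `getYamlSchemaFromDocs` can be exercised in isolation. A sketch using the same regex on an invented MDX snippet (the file contents here are illustrative, not real docs):

```typescript
// Standalone version of the frontmatter stripping above: remove a leading
// `--- ... ---` block plus trailing whitespace, then trim.
function stripFrontmatter(content: string): string {
  return content.replace(/^---[\s\S]*?---\s*/, '').trim()
}

// Hypothetical MDX file layout for demonstration.
const mdx = `---
title: Agent
description: Agent block YAML schema
---

## Schema

blocks:
  - type: agent`

const docBody = stripFrontmatter(mdx)
```

Note the regex is anchored to the start of the string, so a file without frontmatter passes through unchanged (apart from trimming).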
@@ -1,213 +0,0 @@
import { eq } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { createLogger } from '@/lib/logs/console/logger'
import { loadWorkflowFromNormalizedTables } from '@/lib/workflows/db-helpers'
import { generateWorkflowYaml } from '@/lib/workflows/yaml-generator'
import { getBlock } from '@/blocks'
import { db } from '@/db'
import { workflow as workflowTable } from '@/db/schema'

const logger = createLogger('GetUserWorkflowAPI')

export async function POST(request: NextRequest) {
  try {
    const body = await request.json()
    const { workflowId, includeMetadata = false } = body

    if (!workflowId) {
      return NextResponse.json(
        { success: false, error: 'Workflow ID is required' },
        { status: 400 }
      )
    }

    logger.info('Fetching user workflow', { workflowId })

    // Fetch workflow from database
    const [workflowRecord] = await db
      .select()
      .from(workflowTable)
      .where(eq(workflowTable.id, workflowId))
      .limit(1)

    if (!workflowRecord) {
      return NextResponse.json(
        { success: false, error: `Workflow ${workflowId} not found` },
        { status: 404 }
      )
    }

    // Try to load from normalized tables first, fall back to the JSON blob
    let workflowState: any = null
    const subBlockValues: Record<string, Record<string, any>> = {}

    const normalizedData = await loadWorkflowFromNormalizedTables(workflowId)
    if (normalizedData) {
      workflowState = {
        blocks: normalizedData.blocks,
        edges: normalizedData.edges,
        loops: normalizedData.loops,
        parallels: normalizedData.parallels,
      }

      // Extract subblock values from normalized data
      Object.entries(normalizedData.blocks).forEach(([blockId, block]) => {
        subBlockValues[blockId] = {}
        Object.entries((block as any).subBlocks || {}).forEach(([subBlockId, subBlock]) => {
          if ((subBlock as any).value !== undefined) {
            subBlockValues[blockId][subBlockId] = (subBlock as any).value
          }
        })
      })
    } else if (workflowRecord.state) {
      // Fall back to the JSON blob
      workflowState = workflowRecord.state as any
      // For the JSON blob, subblock values are embedded in the block state
      Object.entries((workflowState.blocks as any) || {}).forEach(([blockId, block]) => {
        subBlockValues[blockId] = {}
        Object.entries((block as any).subBlocks || {}).forEach(([subBlockId, subBlock]) => {
          if ((subBlock as any).value !== undefined) {
            subBlockValues[blockId][subBlockId] = (subBlock as any).value
          }
        })
      })
    }

    if (!workflowState || !workflowState.blocks) {
      return NextResponse.json(
        { success: false, error: 'Workflow state is empty or invalid' },
        { status: 400 }
      )
    }

    // Generate YAML using the server-side function
    const yaml = generateWorkflowYaml(workflowState, subBlockValues)

    if (!yaml || yaml.trim() === '') {
      return NextResponse.json(
        { success: false, error: 'Generated YAML is empty' },
        { status: 400 }
      )
    }

    // Generate detailed block information with schemas
    const blockSchemas: Record<string, any> = {}
    Object.entries(workflowState.blocks).forEach(([blockId, blockState]) => {
      const block = blockState as any
      const blockConfig = getBlock(block.type)

      if (blockConfig) {
        blockSchemas[blockId] = {
          type: block.type,
          name: block.name,
          description: blockConfig.description,
          longDescription: blockConfig.longDescription,
          category: blockConfig.category,
          docsLink: blockConfig.docsLink,
          inputs: {},
          inputRequirements: blockConfig.inputs || {},
          outputs: blockConfig.outputs || {},
          tools: blockConfig.tools,
        }

        // Add the input schema from the subBlocks configuration
        if (blockConfig.subBlocks) {
          blockConfig.subBlocks.forEach((subBlock) => {
            blockSchemas[blockId].inputs[subBlock.id] = {
              type: subBlock.type,
              title: subBlock.title,
              description: subBlock.description || '',
              layout: subBlock.layout,
              ...(subBlock.options && { options: subBlock.options }),
              ...(subBlock.placeholder && { placeholder: subBlock.placeholder }),
              ...(subBlock.min !== undefined && { min: subBlock.min }),
              ...(subBlock.max !== undefined && { max: subBlock.max }),
              ...(subBlock.columns && { columns: subBlock.columns }),
              ...(subBlock.hidden !== undefined && { hidden: subBlock.hidden }),
              ...(subBlock.condition && { condition: subBlock.condition }),
            }
          })
        }
      } else {
        // Handle special block types like loops and parallels
        blockSchemas[blockId] = {
          type: block.type,
          name: block.name,
          description: `${block.type.charAt(0).toUpperCase() + block.type.slice(1)} container block`,
          category: 'Control Flow',
          inputs: {},
          outputs: {},
        }
      }
    })

    // Generate workflow summary
    const blockTypes = Object.values(workflowState.blocks).reduce(
      (acc: Record<string, number>, block: any) => {
        acc[block.type] = (acc[block.type] || 0) + 1
        return acc
      },
      {}
    )

    const categories = Object.values(blockSchemas).reduce(
      (acc: Record<string, number>, schema: any) => {
        if (schema.category) {
          acc[schema.category] = (acc[schema.category] || 0) + 1
        }
        return acc
      },
      {}
    )

    // Prepare response with clear context markers
    const response: any = {
      workflowContext: 'USER_SPECIFIC_WORKFLOW', // Clear marker for the LLM
      note: 'This data represents only the blocks and configurations that the user has actually built in their current workflow, not all available Sim capabilities.',
      yaml,
      format: 'yaml',
      summary: {
        workflowName: workflowRecord.name,
        blockCount: Object.keys(workflowState.blocks).length,
        edgeCount: (workflowState.edges || []).length,
        blockTypes,
        categories,
        hasLoops: Object.keys(workflowState.loops || {}).length > 0,
        hasParallels: Object.keys(workflowState.parallels || {}).length > 0,
      },
      userBuiltBlocks: blockSchemas, // Renamed to be clearer
    }

    // Add metadata if requested
    if (includeMetadata) {
      response.metadata = {
        workflowId: workflowRecord.id,
        name: workflowRecord.name,
        description: workflowRecord.description,
        workspaceId: workflowRecord.workspaceId,
        createdAt: workflowRecord.createdAt,
        updatedAt: workflowRecord.updatedAt,
      }
    }

    logger.info('Successfully fetched user workflow YAML', {
      workflowId,
      blockCount: response.summary.blockCount,
      yamlLength: yaml.length,
    })

    return NextResponse.json({
      success: true,
      output: response,
    })
  } catch (error) {
    logger.error('Failed to get workflow YAML:', error)
    return NextResponse.json(
      {
        success: false,
        error: `Failed to get workflow YAML: ${error instanceof Error ? error.message : 'Unknown error'}`,
      },
      { status: 500 }
    )
  }
}
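The `blockTypes` summary above is a plain frequency count over block states. A minimal sketch, with illustrative block data standing in for the real workflow state:

```typescript
// Trimmed stand-in for a block state entry (only the field the count reads).
interface SimpleBlock {
  type: string
}

// Same reduce shape as the summary code above: count occurrences per type.
function countBlockTypes(blocks: Record<string, SimpleBlock>): Record<string, number> {
  return Object.values(blocks).reduce((acc: Record<string, number>, block) => {
    acc[block.type] = (acc[block.type] || 0) + 1
    return acc
  }, {})
}

// Illustrative workflow with two agent blocks and one function block.
const counts = countBlockTypes({
  b1: { type: 'agent' },
  b2: { type: 'function' },
  b3: { type: 'agent' },
})
// counts is { agent: 2, function: 1 }
```

The `categories` count in the route follows the same pattern, just keyed on `schema.category` and skipping entries without one.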
@@ -1,27 +0,0 @@
import { type NextRequest, NextResponse } from 'next/server'
import { getYamlWorkflowPrompt } from '@/lib/copilot/prompts'

export const dynamic = 'force-dynamic'

export async function POST(request: NextRequest) {
  try {
    console.log('[get-yaml-structure] API endpoint called')

    return NextResponse.json({
      success: true,
      data: {
        guide: getYamlWorkflowPrompt(),
        message: 'Complete YAML workflow syntax guide with examples and best practices',
      },
    })
  } catch (error) {
    console.error('[get-yaml-structure] Error:', error)
    return NextResponse.json(
      {
        success: false,
        error: 'Failed to get YAML structure',
      },
      { status: 500 }
    )
  }
}
@@ -2,20 +2,27 @@ import { eq } from 'drizzle-orm'
 import { type NextRequest, NextResponse } from 'next/server'
 import { z } from 'zod'
 import { getSession } from '@/lib/auth'
-import { autoLayoutWorkflow } from '@/lib/autolayout/service'
 import { createLogger } from '@/lib/logs/console/logger'
 import { getUserEntityPermissions } from '@/lib/permissions/utils'
-import {
-  loadWorkflowFromNormalizedTables,
-  saveWorkflowToNormalizedTables,
-} from '@/lib/workflows/db-helpers'
+import { simAgentClient } from '@/lib/sim-agent'
+import { loadWorkflowFromNormalizedTables } from '@/lib/workflows/db-helpers'
+import { getAllBlocks } from '@/blocks/registry'
+import type { BlockConfig } from '@/blocks/types'
+import { resolveOutputType } from '@/blocks/utils'
 import { db } from '@/db'
 import { workflow as workflowTable } from '@/db/schema'
+import { generateLoopBlocks, generateParallelBlocks } from '@/stores/workflows/workflow/utils'
 
 export const dynamic = 'force-dynamic'
 
 const logger = createLogger('AutoLayoutAPI')
 
+// Check API key configuration at module level
+const SIM_AGENT_API_KEY = process.env.SIM_AGENT_API_KEY
+if (!SIM_AGENT_API_KEY) {
+  logger.warn('SIM_AGENT_API_KEY not configured - autolayout requests will fail')
+}
+
 const AutoLayoutRequestSchema = z.object({
   strategy: z
     .enum(['smart', 'hierarchical', 'layered', 'force-directed'])
@@ -120,67 +127,119 @@ export async function POST(request: NextRequest, { params }: { params: Promise<{
-      // Apply autolayout
       logger.info(
-        `[${requestId}] Applying autolayout to ${Object.keys(currentWorkflowData.blocks).length} blocks`
-      )
-
-      const layoutedBlocks = await autoLayoutWorkflow(
-        currentWorkflowData.blocks,
-        currentWorkflowData.edges,
+        `[${requestId}] Applying autolayout to ${Object.keys(currentWorkflowData.blocks).length} blocks`,
         {
           strategy: layoutOptions.strategy,
           direction: layoutOptions.direction,
           spacing: {
             horizontal: layoutOptions.spacing?.horizontal || 400,
             vertical: layoutOptions.spacing?.vertical || 200,
             layer: layoutOptions.spacing?.layer || 600,
           },
           alignment: layoutOptions.alignment,
           padding: {
             x: layoutOptions.padding?.x || 200,
             y: layoutOptions.padding?.y || 200,
           },
+          hasApiKey: !!SIM_AGENT_API_KEY,
+          simAgentUrl: process.env.SIM_AGENT_API_URL || 'http://localhost:8000',
         }
       )
 
-      // Create updated workflow state
-      const updatedWorkflowState = {
-        ...currentWorkflowData,
-        blocks: layoutedBlocks,
-        lastSaved: Date.now(),
+      // Create workflow state for autolayout
+      const workflowState = {
+        blocks: currentWorkflowData.blocks,
+        edges: currentWorkflowData.edges,
+        loops: currentWorkflowData.loops || {},
+        parallels: currentWorkflowData.parallels || {},
       }
 
-      // Save to database
-      const saveResult = await saveWorkflowToNormalizedTables(workflowId, updatedWorkflowState)
+      const autoLayoutOptions = {
+        strategy: layoutOptions.strategy,
+        direction: layoutOptions.direction,
+        spacing: {
+          horizontal: layoutOptions.spacing?.horizontal || 500,
+          vertical: layoutOptions.spacing?.vertical || 400,
+          layer: layoutOptions.spacing?.layer || 700,
+        },
+        alignment: layoutOptions.alignment,
+        padding: {
+          x: layoutOptions.padding?.x || 250,
+          y: layoutOptions.padding?.y || 250,
+        },
+      }
+
+      // Gather block registry and utilities for sim-agent
+      const blocks = getAllBlocks()
+      const blockRegistry = blocks.reduce(
+        (acc, block) => {
+          const blockType = block.type
+          acc[blockType] = {
+            ...block,
+            id: blockType,
+            subBlocks: block.subBlocks || [],
+            outputs: block.outputs || {},
+          } as any
+          return acc
+        },
+        {} as Record<string, BlockConfig>
+      )
+
+      const autoLayoutResult = await simAgentClient.makeRequest('/api/yaml/autolayout', {
+        body: {
+          workflowState,
+          options: autoLayoutOptions,
+          blockRegistry,
+          utilities: {
+            generateLoopBlocks: generateLoopBlocks.toString(),
+            generateParallelBlocks: generateParallelBlocks.toString(),
+            resolveOutputType: resolveOutputType.toString(),
+          },
+        },
+        apiKey: SIM_AGENT_API_KEY,
+      })
+
+      // Log the full response for debugging
+      logger.info(`[${requestId}] Sim-agent autolayout response:`, {
+        success: autoLayoutResult.success,
+        status: autoLayoutResult.status,
+        error: autoLayoutResult.error,
+        hasData: !!autoLayoutResult.data,
+        hasWorkflowState: !!autoLayoutResult.data?.workflowState,
+        hasBlocks: !!autoLayoutResult.data?.blocks,
+        dataKeys: autoLayoutResult.data ? Object.keys(autoLayoutResult.data) : [],
+      })
+
+      if (
+        !autoLayoutResult.success ||
+        (!autoLayoutResult.data?.workflowState && !autoLayoutResult.data?.blocks)
+      ) {
+        logger.error(`[${requestId}] Auto layout failed:`, {
+          success: autoLayoutResult.success,
+          error: autoLayoutResult.error,
+          status: autoLayoutResult.status,
+          fullResponse: autoLayoutResult,
+        })
+        const errorMessage =
+          autoLayoutResult.error ||
+          (autoLayoutResult.status === 401
+            ? 'Unauthorized - check API key'
+            : autoLayoutResult.status === 404
+              ? 'Sim-agent service not found'
+              : `HTTP ${autoLayoutResult.status}`)
 
-      if (!saveResult.success) {
-        logger.error(`[${requestId}] Failed to save autolayout results:`, saveResult.error)
         return NextResponse.json(
-          { error: 'Failed to save autolayout results', details: saveResult.error },
+          {
+            error: 'Auto layout failed',
+            details: errorMessage,
+          },
           { status: 500 }
        )
      }
 
-      // Update workflow's lastSynced timestamp
-      await db
-        .update(workflowTable)
-        .set({
-          lastSynced: new Date(),
-          updatedAt: new Date(),
-          state: saveResult.jsonBlob,
-        })
-        .where(eq(workflowTable.id, workflowId))
+      // Handle both response formats from sim-agent
+      const layoutedBlocks =
+        autoLayoutResult.data?.workflowState?.blocks || autoLayoutResult.data?.blocks
 
-      // Notify the socket server to tell clients about the autolayout update
-      try {
-        const socketUrl = process.env.SOCKET_URL || 'http://localhost:3002'
-        await fetch(`${socketUrl}/api/workflow-updated`, {
-          method: 'POST',
-          headers: { 'Content-Type': 'application/json' },
-          body: JSON.stringify({ workflowId }),
+      if (!layoutedBlocks) {
+        logger.error(`[${requestId}] No blocks returned from sim-agent:`, {
+          responseData: autoLayoutResult.data,
         })
-        logger.info(`[${requestId}] Notified socket server of autolayout update`)
-      } catch (socketError) {
-        logger.warn(`[${requestId}] Failed to notify socket server:`, socketError)
+        return NextResponse.json(
+          {
+            error: 'Auto layout failed',
+            details: 'No blocks returned from sim-agent',
+          },
+          { status: 500 }
+        )
       }
 
       const elapsed = Date.now() - startTime
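Several hunks in this change reshape the `getAllBlocks()` array into a registry keyed by block type before shipping it to sim-agent. A standalone sketch of that reduce, with a trimmed config shape (`MinimalBlockConfig` and `RegistryEntry` are simplifications, not the real `BlockConfig`):

```typescript
// Trimmed stand-in for a block config (only the fields the reduce touches).
interface MinimalBlockConfig {
  type: string
  subBlocks?: unknown[]
  outputs?: Record<string, unknown>
}

type RegistryEntry = MinimalBlockConfig & {
  id: string
  subBlocks: unknown[]
  outputs: Record<string, unknown>
}

// Same shape as the reduce above: key by type, set id = type, and backfill
// empty subBlocks / outputs so downstream code never sees undefined.
function buildBlockRegistry(blocks: MinimalBlockConfig[]): Record<string, RegistryEntry> {
  return blocks.reduce(
    (acc, block) => {
      acc[block.type] = {
        ...block,
        id: block.type,
        subBlocks: block.subBlocks || [],
        outputs: block.outputs || {},
      }
      return acc
    },
    {} as Record<string, RegistryEntry>
  )
}

// Illustrative input data.
const registry = buildBlockRegistry([{ type: 'agent' }, { type: 'api', subBlocks: ['x'] }])
```

Packaging the registry this way (plus stringified helper functions in `utilities`) lets the sim-agent service lay out or convert workflows without importing the app's block definitions directly.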
@@ -192,6 +251,7 @@ export async function POST(request: NextRequest, { params }: { params: Promise<{
       workflowId,
     })
 
+    // Return the layouted blocks to the frontend - let the store handle saving
     return NextResponse.json({
       success: true,
       message: `Autolayout applied successfully to ${blockCount} blocks`,
@@ -200,6 +260,7 @@ export async function POST(request: NextRequest, { params }: { params: Promise<{
       direction: layoutOptions.direction,
       blockCount,
       elapsed: `${elapsed}ms`,
+      layoutedBlocks: layoutedBlocks,
     },
   })
 } catch (error) {
 
@@ -8,7 +8,7 @@ import { createLogger } from '@/lib/logs/console/logger'
 import { getUserEntityPermissions, hasAdminPermission } from '@/lib/permissions/utils'
 import { loadWorkflowFromNormalizedTables } from '@/lib/workflows/db-helpers'
 import { db } from '@/db'
-import { workflow } from '@/db/schema'
+import { apiKey as apiKeyTable, workflow } from '@/db/schema'
 
 const logger = createLogger('WorkflowByIdAPI')
 
@@ -47,13 +47,33 @@ export async function GET(request: NextRequest, { params }: { params: Promise<{
       // For internal calls, we'll skip user-specific access checks
       logger.info(`[${requestId}] Internal API call for workflow ${workflowId}`)
     } else {
-      // Get the session for regular user calls
+      // Try session auth first (for web UI)
       const session = await getSession()
-      if (!session?.user?.id) {
+      let authenticatedUserId: string | null = session?.user?.id || null
+
+      // If no session, check for API key auth
+      if (!authenticatedUserId) {
+        const apiKeyHeader = request.headers.get('x-api-key')
+        if (apiKeyHeader) {
+          // Verify API key
+          const [apiKeyRecord] = await db
+            .select({ userId: apiKeyTable.userId })
+            .from(apiKeyTable)
+            .where(eq(apiKeyTable.key, apiKeyHeader))
+            .limit(1)
+
+          if (apiKeyRecord) {
+            authenticatedUserId = apiKeyRecord.userId
+          }
+        }
+      }
+
+      if (!authenticatedUserId) {
         logger.warn(`[${requestId}] Unauthorized access attempt for workflow ${workflowId}`)
         return NextResponse.json({ error: 'Unauthorized' }, { status: 401 })
       }
-      userId = session.user.id
+
+      userId = authenticatedUserId
     }
 
     // Fetch the workflow
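The session-then-API-key fallback in the hunk above amounts to taking the first auth source that yields a user id. A synchronous sketch with hypothetical lookups (the real route awaits a Drizzle query against `apiKeyTable` instead of an in-memory map):

```typescript
// A resolver returns a user id, or null if this auth method doesn't apply.
type Resolver = () => string | null

// Walk the resolvers in priority order; the first hit wins.
function resolveUserId(resolvers: Resolver[]): string | null {
  for (const resolve of resolvers) {
    const userId = resolve()
    if (userId) return userId
  }
  return null
}

// Hypothetical request context: no web session, but a valid x-api-key header.
const sessionUserId: string | null = null
const apiKeyLookup: Record<string, string> = { 'sk-test': 'user_123' } // key -> userId, invented for the example
const apiKeyHeader = 'sk-test'

const userId = resolveUserId([
  () => sessionUserId,
  () => apiKeyLookup[apiKeyHeader] ?? null,
])
```

If every resolver returns null, the route's equivalent of this sketch responds with 401 Unauthorized.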
@@ -173,8 +173,30 @@ export async function PUT(request: NextRequest, { params }: { params: Promise<{
 
     // Save to normalized tables
-    // Ensure all required fields are present for WorkflowState type
+    // Filter out blocks without type or name before saving
+    const filteredBlocks = Object.entries(state.blocks).reduce(
+      (acc, [blockId, block]) => {
+        if (block.type && block.name) {
+          // Ensure all required fields are present
+          acc[blockId] = {
+            ...block,
+            enabled: block.enabled !== undefined ? block.enabled : true,
+            horizontalHandles:
+              block.horizontalHandles !== undefined ? block.horizontalHandles : true,
+            isWide: block.isWide !== undefined ? block.isWide : false,
+            height: block.height !== undefined ? block.height : 0,
+            subBlocks: block.subBlocks || {},
+            outputs: block.outputs || {},
+            data: block.data || {},
+          }
+        }
+        return acc
+      },
+      {} as typeof state.blocks
+    )
+
     const workflowState = {
-      blocks: state.blocks,
+      blocks: filteredBlocks,
       edges: state.edges,
       loops: state.loops || {},
       parallels: state.parallels || {},
@@ -211,7 +233,7 @@ export async function PUT(request: NextRequest, { params }: { params: Promise<{
     return NextResponse.json(
       {
         success: true,
-        blocksCount: Object.keys(state.blocks).length,
+        blocksCount: Object.keys(filteredBlocks).length,
         edgesCount: state.edges.length,
       },
       { status: 200 }
 
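The `filteredBlocks` reduce above drops malformed blocks and backfills required defaults before persisting. A trimmed sketch (the `RawBlock` shape keeps only a couple of the real fields):

```typescript
// Simplified block shape for illustration; the real WorkflowState block has
// more required fields (horizontalHandles, isWide, subBlocks, and so on).
interface RawBlock {
  type?: string
  name?: string
  enabled?: boolean
  height?: number
}

// Same pattern as the reduce above: skip blocks missing type or name, and
// fill in defaults for fields the saved state requires.
function sanitizeBlocks(blocks: Record<string, RawBlock>): Record<string, RawBlock> {
  return Object.entries(blocks).reduce(
    (acc, [blockId, block]) => {
      if (block.type && block.name) {
        acc[blockId] = {
          ...block,
          enabled: block.enabled !== undefined ? block.enabled : true,
          height: block.height !== undefined ? block.height : 0,
        }
      }
      return acc
    },
    {} as Record<string, RawBlock>
  )
}

// Illustrative input: one valid block, one missing its type.
const cleaned = sanitizeBlocks({
  a: { type: 'agent', name: 'Agent 1' },
  b: { name: 'missing type' },
})
```

Checking `!== undefined` rather than truthiness matters here: it preserves explicit `false` and `0` values instead of clobbering them with the defaults.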
@@ -1,21 +1,25 @@
 import { eq } from 'drizzle-orm'
 import { type NextRequest, NextResponse } from 'next/server'
 import { z } from 'zod'
-import { autoLayoutWorkflow } from '@/lib/autolayout/service'
 import { createLogger } from '@/lib/logs/console/logger'
 import { getUserEntityPermissions } from '@/lib/permissions/utils'
+import { simAgentClient } from '@/lib/sim-agent'
 import {
   loadWorkflowFromNormalizedTables,
   saveWorkflowToNormalizedTables,
 } from '@/lib/workflows/db-helpers'
-import { generateWorkflowYaml } from '@/lib/workflows/yaml-generator'
 import { getUserId as getOAuthUserId } from '@/app/api/auth/oauth/utils'
-import { getBlock } from '@/blocks'
+import { getAllBlocks } from '@/blocks/registry'
+import type { BlockConfig } from '@/blocks/types'
+import { resolveOutputType } from '@/blocks/utils'
 import { db } from '@/db'
-import { copilotCheckpoints, workflow as workflowTable } from '@/db/schema'
+import { workflowCheckpoints, workflow as workflowTable } from '@/db/schema'
+import { generateLoopBlocks, generateParallelBlocks } from '@/stores/workflows/workflow/utils'
 import { convertYamlToWorkflow, parseWorkflowYaml } from '@/stores/workflows/yaml/importer'
 
+// Sim Agent API configuration
+const SIM_AGENT_API_URL = process.env.SIM_AGENT_API_URL || 'http://localhost:8000'
+const SIM_AGENT_API_KEY = process.env.SIM_AGENT_API_KEY
+
 export const dynamic = 'force-dynamic'
@@ -50,14 +54,56 @@ async function createWorkflowCheckpoint(
  if (currentWorkflowData) {
    // Generate YAML from current state
    const currentYaml = generateWorkflowYaml(currentWorkflowData)
    // Gather block registry and utilities for sim-agent
    const allBlockConfigs = getAllBlocks()
    const blockRegistry = allBlockConfigs.reduce(
      (acc, block) => {
        const blockType = block.type
        acc[blockType] = {
          ...block,
          id: blockType,
          subBlocks: block.subBlocks || [],
          outputs: block.outputs || {},
        } as any
        return acc
      },
      {} as Record<string, BlockConfig>
    )

    // Create checkpoint
    await db.insert(copilotCheckpoints).values({
    const generateResponse = await fetch(`${SIM_AGENT_API_URL}/api/workflow/to-yaml`, {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        ...(SIM_AGENT_API_KEY && { 'x-api-key': SIM_AGENT_API_KEY }),
      },
      body: JSON.stringify({
        workflowState: currentWorkflowData,
        blockRegistry,
        utilities: {
          generateLoopBlocks: generateLoopBlocks.toString(),
          generateParallelBlocks: generateParallelBlocks.toString(),
          resolveOutputType: resolveOutputType.toString(),
        },
      }),
    })

    if (!generateResponse.ok) {
      const errorText = await generateResponse.text()
      throw new Error(`Failed to generate YAML: ${errorText}`)
    }

    const generateResult = await generateResponse.json()
    if (!generateResult.success || !generateResult.yaml) {
      throw new Error(generateResult.error || 'Failed to generate YAML')
    }
    const currentYaml = generateResult.yaml

    // Create checkpoint using new workflow_checkpoints table
    await db.insert(workflowCheckpoints).values({
      userId,
      workflowId,
      chatId,
      yaml: currentYaml,
      workflowState: currentWorkflowData, // Store JSON workflow state
    })

    logger.info(`[${requestId}] Checkpoint created successfully`)
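The registry-building reduce above recurs verbatim in every route in this commit; a minimal sketch of factoring it into a shared helper (the name `buildBlockRegistry` and the local `BlockConfigLike` type are stand-ins for illustration — the real routes import `BlockConfig` from `@/blocks/types` and call `getAllBlocks()`):

```typescript
// Hypothetical subset of the BlockConfig shape the routes pass to sim-agent.
interface BlockConfigLike {
  type: string
  id?: string
  subBlocks?: unknown[]
  outputs?: Record<string, unknown>
}

// Mirrors the reduce used in each route: key every block config by its type
// and normalize the optional fields sim-agent expects.
function buildBlockRegistry(blocks: BlockConfigLike[]): Record<string, BlockConfigLike> {
  return blocks.reduce<Record<string, BlockConfigLike>>((acc, block) => {
    acc[block.type] = {
      ...block,
      id: block.type,
      subBlocks: block.subBlocks || [],
      outputs: block.outputs || {},
    }
    return acc
  }, {})
}
```

Each route could then call `buildBlockRegistry(getAllBlocks())` instead of repeating the reduce inline.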
@@ -221,32 +267,108 @@ export async function PUT(request: NextRequest, { params }: { params: Promise<{
    await createWorkflowCheckpoint(userId, workflowId, chatId, requestId)
    }

    // Parse YAML content
    const { data: yamlWorkflow, errors: parseErrors } = parseWorkflowYaml(yamlContent)
    // Convert YAML to workflow state by calling sim-agent directly
    // Gather block registry and utilities for sim-agent
    const allBlockTypes = getAllBlocks()
    const blockRegistry = allBlockTypes.reduce(
      (acc, block) => {
        const blockType = block.type
        acc[blockType] = {
          ...block,
          id: blockType,
          subBlocks: block.subBlocks || [],
          outputs: block.outputs || {},
        } as any
        return acc
      },
      {} as Record<string, BlockConfig>
    )

    if (!yamlWorkflow || parseErrors.length > 0) {
      logger.error(`[${requestId}] YAML parsing failed`, { parseErrors })
    const conversionResponse = await fetch(`${SIM_AGENT_API_URL}/api/yaml/to-workflow`, {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        ...(SIM_AGENT_API_KEY && { 'x-api-key': SIM_AGENT_API_KEY }),
      },
      body: JSON.stringify({
        yamlContent,
        blockRegistry,
        utilities: {
          generateLoopBlocks: generateLoopBlocks.toString(),
          generateParallelBlocks: generateParallelBlocks.toString(),
          resolveOutputType: resolveOutputType.toString(),
        },
        options: {
          generateNewIds: false, // We'll handle ID generation manually for now
          preservePositions: true,
        },
      }),
    })

    if (!conversionResponse.ok) {
      const errorText = await conversionResponse.text()
      logger.error(`[${requestId}] Sim agent API error:`, {
        status: conversionResponse.status,
        error: errorText,
      })
      return NextResponse.json({
        success: false,
        message: 'Failed to parse YAML workflow',
        errors: parseErrors,
        message: 'Failed to convert YAML to workflow',
        errors: [`Sim agent API error: ${conversionResponse.statusText}`],
        warnings: [],
      })
    }

    // Convert YAML to workflow format
    const { blocks, edges, errors: convertErrors, warnings } = convertYamlToWorkflow(yamlWorkflow)
    const conversionResult = await conversionResponse.json()

    if (convertErrors.length > 0) {
      logger.error(`[${requestId}] YAML conversion failed`, { convertErrors })
    if (!conversionResult.success || !conversionResult.workflowState) {
      logger.error(`[${requestId}] YAML conversion failed`, {
        errors: conversionResult.errors,
        warnings: conversionResult.warnings,
      })
      return NextResponse.json({
        success: false,
        message: 'Failed to convert YAML to workflow',
        errors: convertErrors,
        warnings,
        errors: conversionResult.errors,
        warnings: conversionResult.warnings || [],
      })
    }

    const { workflowState } = conversionResult

    // Ensure all blocks have required fields
    Object.values(workflowState.blocks).forEach((block: any) => {
      if (block.enabled === undefined) {
        block.enabled = true
      }
      if (block.horizontalHandles === undefined) {
        block.horizontalHandles = true
      }
      if (block.isWide === undefined) {
        block.isWide = false
      }
      if (block.height === undefined) {
        block.height = 0
      }
      if (!block.subBlocks) {
        block.subBlocks = {}
      }
      if (!block.outputs) {
        block.outputs = {}
      }
    })
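The chain of `=== undefined` checks above can also be written as a single defaults merge; a sketch (helper name hypothetical, and it assumes missing fields are absent rather than explicitly set to `undefined`):

```typescript
// Fields the PUT handler guarantees on every block before persisting.
interface BlockDefaults {
  enabled: boolean
  horizontalHandles: boolean
  isWide: boolean
  height: number
  subBlocks: Record<string, unknown>
  outputs: Record<string, unknown>
}

// Fill the same defaults the route applies, leaving existing values untouched.
function applyBlockDefaults<T extends Partial<BlockDefaults>>(block: T): T & BlockDefaults {
  return {
    enabled: true,
    horizontalHandles: true,
    isWide: false,
    height: 0,
    subBlocks: {},
    outputs: {},
    ...block, // explicit values win over the defaults
  } as T & BlockDefaults
}
```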

    const blocks = Object.values(workflowState.blocks) as Array<{
      id: string
      type: string
      name: string
      position: { x: number; y: number }
      subBlocks?: Record<string, any>
      data?: Record<string, any>
    }>
    const edges = workflowState.edges
    const warnings = conversionResult.warnings || []

    // Create workflow state
    const newWorkflowState: any = {
      blocks: {} as Record<string, any>,
@@ -301,17 +423,19 @@ export async function PUT(request: NextRequest, { params }: { params: Promise<{
        }
      })

      // Also ensure we have subBlocks for any YAML inputs that might not be in the config
      // Also ensure we have subBlocks for any existing subBlocks from conversion
      // This handles cases where hidden fields or dynamic configurations exist
      Object.keys(block.inputs).forEach((inputKey) => {
        if (!subBlocks[inputKey]) {
          subBlocks[inputKey] = {
            id: inputKey,
            type: 'short-input', // Default type for dynamic inputs
            value: null,
      if (block.subBlocks) {
        Object.keys(block.subBlocks).forEach((subBlockKey) => {
          if (!subBlocks[subBlockKey]) {
            subBlocks[subBlockKey] = {
              id: subBlockKey,
              type: block.subBlocks![subBlockKey].type || 'short-input',
              value: block.subBlocks![subBlockKey].value || null,
            }
          }
        })
      }

      // Set up outputs from block configuration
      const outputs = resolveOutputType(blockConfig.outputs)
@@ -337,16 +461,16 @@ export async function PUT(request: NextRequest, { params }: { params: Promise<{
      }
    }

    // Set input values as subblock values with block reference mapping
    // Set subblock values with block reference mapping
    for (const block of blocks) {
      const newId = blockIdMapping.get(block.id)
      if (!newId || !newWorkflowState.blocks[newId]) continue

      if (block.inputs && typeof block.inputs === 'object') {
        Object.entries(block.inputs).forEach(([key, value]) => {
          if (newWorkflowState.blocks[newId].subBlocks[key]) {
      if (block.subBlocks && typeof block.subBlocks === 'object') {
        Object.entries(block.subBlocks).forEach(([key, subBlock]: [string, any]) => {
          if (newWorkflowState.blocks[newId].subBlocks[key] && subBlock.value !== undefined) {
            // Update block references in values to use new mapped IDs
            const processedValue = updateBlockReferences(value, blockIdMapping, requestId)
            const processedValue = updateBlockReferences(subBlock.value, blockIdMapping, requestId)
            newWorkflowState.blocks[newId].subBlocks[key].value = processedValue
          }
        })
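`updateBlockReferences` itself is not shown in this diff. One plausible sketch of what such a remapping could look like — purely illustrative, since the real implementation may use a different reference syntax — assuming subblock values embed references like `<blockId.field>`:

```typescript
// Illustrative only: rewrite block IDs embedded in string subblock values,
// e.g. "<oldId.response>", using the old-ID -> new-ID mapping built earlier.
function remapReferences(value: unknown, idMapping: Map<string, string>): unknown {
  if (typeof value !== 'string') return value
  let result = value
  for (const [oldId, newId] of idMapping) {
    // Replace every occurrence of the old ID inside angle-bracket references.
    result = result.split(`<${oldId}.`).join(`<${newId}.`)
  }
  return result
}
```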
@@ -423,26 +547,66 @@ export async function PUT(request: NextRequest, { params }: { params: Promise<{
    try {
      logger.info(`[${requestId}] Applying autolayout`)

      const layoutedBlocks = await autoLayoutWorkflow(
        newWorkflowState.blocks,
        newWorkflowState.edges,
        {
          strategy: 'smart',
          direction: 'auto',
          spacing: {
            horizontal: 400,
            vertical: 200,
            layer: 600,
          },
          alignment: 'center',
          padding: {
            x: 200,
            y: 200,
          },
        }
      // Create workflow state for autolayout
      const workflowStateForLayout = {
        blocks: newWorkflowState.blocks,
        edges: newWorkflowState.edges,
        loops: newWorkflowState.loops || {},
        parallels: newWorkflowState.parallels || {},
      }

      const autoLayoutOptions = {
        strategy: 'smart' as const,
        direction: 'auto' as const,
        spacing: {
          horizontal: 500,
          vertical: 400,
          layer: 700,
        },
        alignment: 'center' as const,
        padding: {
          x: 250,
          y: 250,
        },
      }

      // Gather block registry and utilities for sim-agent
      const blocks = getAllBlocks()
      const blockRegistry = blocks.reduce(
        (acc, block) => {
          const blockType = block.type
          acc[blockType] = {
            ...block,
            id: blockType,
            subBlocks: block.subBlocks || [],
            outputs: block.outputs || {},
          } as any
          return acc
        },
        {} as Record<string, BlockConfig>
      )

      newWorkflowState.blocks = layoutedBlocks
      const autoLayoutResult = await simAgentClient.makeRequest('/api/yaml/autolayout', {
        body: {
          workflowState: workflowStateForLayout,
          options: autoLayoutOptions,
          blockRegistry,
          utilities: {
            generateLoopBlocks: generateLoopBlocks.toString(),
            generateParallelBlocks: generateParallelBlocks.toString(),
            resolveOutputType: resolveOutputType.toString(),
          },
        },
      })

      if (autoLayoutResult.success && autoLayoutResult.data?.workflowState) {
        newWorkflowState.blocks = autoLayoutResult.data.workflowState.blocks
      } else {
        logger.warn(
          `[${requestId}] Auto layout failed, using original positions:`,
          autoLayoutResult.error
        )
      }
      logger.info(`[${requestId}] Autolayout completed successfully`)
    } catch (layoutError) {
      logger.warn(`[${requestId}] Autolayout failed, using original positions:`, layoutError)

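The pattern above — attempt remote layout, keep original positions on either an unsuccessful response or a thrown error — can be isolated into one function; a sketch with the layout call injected (names hypothetical, result shape modeled on `simAgentClient.makeRequest` as used here):

```typescript
// Result shape modeled on the autolayout responses seen in this route.
interface LayoutResult {
  success: boolean
  data?: { workflowState: { blocks: Record<string, unknown> } }
  error?: string
}

// Try the remote layout; on any failure (error result or thrown exception),
// fall back to the original blocks, mirroring the route's else/catch branches.
async function layoutOrKeep(
  blocks: Record<string, unknown>,
  runLayout: () => Promise<LayoutResult>
): Promise<Record<string, unknown>> {
  try {
    const result = await runLayout()
    if (result.success && result.data?.workflowState) {
      return result.data.workflowState.blocks
    }
    return blocks
  } catch {
    return blocks
  }
}
```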
@@ -1,9 +1,16 @@
import { type NextRequest, NextResponse } from 'next/server'
import { createLogger } from '@/lib/logs/console/logger'
import { generateWorkflowYaml } from '@/lib/workflows/yaml-generator'
import { simAgentClient } from '@/lib/sim-agent'
import { getAllBlocks } from '@/blocks/registry'
import type { BlockConfig } from '@/blocks/types'
import { resolveOutputType } from '@/blocks/utils'
import { generateLoopBlocks, generateParallelBlocks } from '@/stores/workflows/workflow/utils'

const logger = createLogger('WorkflowYamlAPI')

// Get API key at module level like working routes
const SIM_AGENT_API_KEY = process.env.SIM_AGENT_API_KEY

export async function POST(request: NextRequest) {
  const requestId = crypto.randomUUID().slice(0, 8)

@@ -20,16 +27,54 @@ export async function POST(request: NextRequest) {
      )
    }

    // Generate YAML using the shared utility
    const yamlContent = generateWorkflowYaml(workflowState, subBlockValues)
    // Gather block registry and utilities for sim-agent
    const blocks = getAllBlocks()
    const blockRegistry = blocks.reduce(
      (acc, block) => {
        const blockType = block.type
        acc[blockType] = {
          ...block,
          id: blockType,
          subBlocks: block.subBlocks || [],
          outputs: block.outputs || {},
        } as any
        return acc
      },
      {} as Record<string, BlockConfig>
    )

    // Call sim-agent directly
    const result = await simAgentClient.makeRequest('/api/workflow/to-yaml', {
      body: {
        workflowState,
        subBlockValues,
        blockRegistry,
        utilities: {
          generateLoopBlocks: generateLoopBlocks.toString(),
          generateParallelBlocks: generateParallelBlocks.toString(),
          resolveOutputType: resolveOutputType.toString(),
        },
      },
      apiKey: SIM_AGENT_API_KEY,
    })

    if (!result.success || !result.data?.yaml) {
      return NextResponse.json(
        {
          success: false,
          error: result.error || 'Failed to generate YAML',
        },
        { status: result.status || 500 }
      )
    }

    logger.info(`[${requestId}] Successfully generated YAML`, {
      yamlLength: yamlContent.length,
      yamlLength: result.data.yaml.length,
    })

    return NextResponse.json({
      success: true,
      yaml: yamlContent,
      yaml: result.data.yaml,
    })
  } catch (error) {
    logger.error(`[${requestId}] YAML generation failed`, error)

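`simAgentClient.makeRequest` is imported from `@/lib/sim-agent` but its implementation is not part of this diff. A plausible sketch of such a wrapper — illustrative only, the real client may differ — given how the routes call it (`makeRequest(endpoint, { body, apiKey })` returning `{ success, status, data, error }`), with `fetch` injectable for testing:

```typescript
// Normalized result shape, modeled on how the routes consume the client.
interface AgentResponse<T = any> {
  success: boolean
  status: number
  data?: T
  error?: string
}

// Minimal client wrapper: POST JSON to the agent, attach the API key header
// when configured, and fold success/error into one result shape.
async function makeAgentRequest<T = any>(
  baseUrl: string,
  endpoint: string,
  options: { body: unknown; apiKey?: string },
  fetchImpl: typeof fetch = fetch
): Promise<AgentResponse<T>> {
  try {
    const response = await fetchImpl(`${baseUrl}${endpoint}`, {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        ...(options.apiKey && { 'x-api-key': options.apiKey }),
      },
      body: JSON.stringify(options.body),
    })
    if (!response.ok) {
      return { success: false, status: response.status, error: await response.text() }
    }
    return { success: true, status: response.status, data: (await response.json()) as T }
  } catch (error) {
    return {
      success: false,
      status: 500,
      error: error instanceof Error ? error.message : 'Unknown error',
    }
  }
}
```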
apps/sim/app/api/yaml/autolayout/route.ts (new file, 218 lines)
@@ -0,0 +1,218 @@
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { createLogger } from '@/lib/logs/console/logger'
import { getAllBlocks } from '@/blocks/registry'
import type { BlockConfig } from '@/blocks/types'
import { resolveOutputType } from '@/blocks/utils'
import {
  convertLoopBlockToLoop,
  convertParallelBlockToParallel,
  findAllDescendantNodes,
  findChildNodes,
  generateLoopBlocks,
  generateParallelBlocks,
} from '@/stores/workflows/workflow/utils'

const logger = createLogger('YamlAutoLayoutAPI')

// Sim Agent API configuration
const SIM_AGENT_API_URL = process.env.SIM_AGENT_API_URL || 'http://localhost:8000'
const SIM_AGENT_API_KEY = process.env.SIM_AGENT_API_KEY

const AutoLayoutRequestSchema = z.object({
  workflowState: z.object({
    blocks: z.record(z.any()),
    edges: z.array(z.any()),
    loops: z.record(z.any()).optional().default({}),
    parallels: z.record(z.any()).optional().default({}),
  }),
  options: z
    .object({
      strategy: z.enum(['smart', 'hierarchical', 'layered', 'force-directed']).optional(),
      direction: z.enum(['horizontal', 'vertical', 'auto']).optional(),
      spacing: z
        .object({
          horizontal: z.number().optional(),
          vertical: z.number().optional(),
          layer: z.number().optional(),
        })
        .optional(),
      alignment: z.enum(['start', 'center', 'end']).optional(),
      padding: z
        .object({
          x: z.number().optional(),
          y: z.number().optional(),
        })
        .optional(),
    })
    .optional(),
})

export async function POST(request: NextRequest) {
  const requestId = crypto.randomUUID().slice(0, 8)

  try {
    const body = await request.json()
    const { workflowState, options } = AutoLayoutRequestSchema.parse(body)

    logger.info(`[${requestId}] Applying auto layout`, {
      blockCount: Object.keys(workflowState.blocks).length,
      edgeCount: workflowState.edges.length,
      hasApiKey: !!SIM_AGENT_API_KEY,
      strategy: options?.strategy || 'smart',
      simAgentUrl: SIM_AGENT_API_URL,
    })

    // Gather block registry and utilities
    const blocks = getAllBlocks()
    const blockRegistry = blocks.reduce(
      (acc, block) => {
        const blockType = block.type
        acc[blockType] = {
          ...block,
          id: blockType,
          subBlocks: block.subBlocks || [],
          outputs: block.outputs || {},
        } as any
        return acc
      },
      {} as Record<string, BlockConfig>
    )

    // Log sample block data for debugging
    const sampleBlockId = Object.keys(workflowState.blocks)[0]
    if (sampleBlockId) {
      logger.info(`[${requestId}] Sample block data:`, {
        blockId: sampleBlockId,
        blockType: workflowState.blocks[sampleBlockId].type,
        hasPosition: !!workflowState.blocks[sampleBlockId].position,
        position: workflowState.blocks[sampleBlockId].position,
      })
    }

    logger.info(`[${requestId}] Calling sim-agent autolayout with strategy:`, {
      strategy: options?.strategy || 'smart (default)',
      direction: options?.direction || 'auto (default)',
      spacing: options?.spacing,
      alignment: options?.alignment || 'center (default)',
    })

    // Call sim-agent API
    const response = await fetch(`${SIM_AGENT_API_URL}/api/yaml/autolayout`, {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        ...(SIM_AGENT_API_KEY && { 'x-api-key': SIM_AGENT_API_KEY }),
      },
      body: JSON.stringify({
        workflowState: {
          blocks: workflowState.blocks,
          edges: workflowState.edges,
          loops: workflowState.loops || {},
          parallels: workflowState.parallels || {},
        },
        options: {
          strategy: 'smart',
          direction: 'auto',
          spacing: {
            horizontal: 500,
            vertical: 400,
            layer: 700,
          },
          alignment: 'center',
          padding: {
            x: 250,
            y: 250,
          },
          ...options, // Allow override of defaults
        },
        blockRegistry,

        utilities: {
          generateLoopBlocks: generateLoopBlocks.toString(),
          generateParallelBlocks: generateParallelBlocks.toString(),
          resolveOutputType: resolveOutputType.toString(),
          convertLoopBlockToLoop: convertLoopBlockToLoop.toString(),
          convertParallelBlockToParallel: convertParallelBlockToParallel.toString(),
          findChildNodes: findChildNodes.toString(),
          findAllDescendantNodes: findAllDescendantNodes.toString(),
        },
      }),
    })

    if (!response.ok) {
      const errorText = await response.text()

      // Try to parse the error as JSON for better error messages
      let errorMessage = `Sim agent API error: ${response.statusText}`

      // Check if it's a 404 error
      if (response.status === 404) {
        errorMessage =
          'Auto-layout endpoint not found on sim agent. Please ensure the /api/yaml/autolayout endpoint is implemented in the sim agent service.'
      } else {
        try {
          const errorJson = JSON.parse(errorText)
          if (errorJson.errors && Array.isArray(errorJson.errors)) {
            errorMessage = errorJson.errors.join(', ')
          } else if (errorJson.error) {
            errorMessage = errorJson.error
          }
        } catch (e) {
          // If not JSON, use the raw text
          errorMessage = errorText || errorMessage
        }
      }

      logger.error(`[${requestId}] Sim agent API error:`, {
        status: response.status,
        error: errorText,
        parsedError: errorMessage,
      })

      return NextResponse.json(
        { success: false, errors: [errorMessage] },
        { status: response.status }
      )
    }
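The error-unwrapping logic in this branch can be captured as a pure helper; a sketch (function name hypothetical) preserving the same precedence — hard-coded 404 message, then the agent's structured `errors` array, then its `error` field, then raw text, then `statusText`:

```typescript
// Same precedence as the route's error handling for sim-agent responses.
function parseAgentError(status: number, statusText: string, bodyText: string): string {
  let message = `Sim agent API error: ${statusText}`
  if (status === 404) {
    return 'Auto-layout endpoint not found on sim agent. Please ensure the /api/yaml/autolayout endpoint is implemented in the sim agent service.'
  }
  try {
    const parsed = JSON.parse(bodyText)
    if (parsed && Array.isArray(parsed.errors)) {
      message = parsed.errors.join(', ')
    } else if (parsed && parsed.error) {
      message = parsed.error
    }
  } catch {
    // Not JSON: use the raw text when present.
    message = bodyText || message
  }
  return message
}
```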

    const result = await response.json()

    logger.info(`[${requestId}] Sim agent response summary:`, {
      success: result.success,
      hasBlocks: !!result.blocks,
      blockCount: result.blocks ? Object.keys(result.blocks).length : 0,
      responseKeys: Object.keys(result),
    })

    // Transform the response to match the expected format
    const transformedResponse = {
      success: result.success,
      workflowState: {
        blocks: result.blocks || {},
        edges: workflowState.edges || [],
        loops: workflowState.loops || {},
        parallels: workflowState.parallels || {},
      },
      errors: result.errors,
    }

    logger.info(`[${requestId}] Transformed response:`, {
      success: transformedResponse.success,
      blockCount: Object.keys(transformedResponse.workflowState.blocks).length,
      hasWorkflowState: true,
    })

    return NextResponse.json(transformedResponse)
  } catch (error) {
    logger.error(`[${requestId}] Auto layout failed:`, error)

    return NextResponse.json(
      {
        success: false,
        errors: [error instanceof Error ? error.message : 'Unknown auto layout error'],
      },
      { status: 500 }
    )
  }
}
apps/sim/app/api/yaml/diff/create/route.ts (new file, 202 lines)
@@ -0,0 +1,202 @@
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { createLogger } from '@/lib/logs/console/logger'
import { getAllBlocks } from '@/blocks/registry'
import type { BlockConfig } from '@/blocks/types'
import { resolveOutputType } from '@/blocks/utils'
import {
  convertLoopBlockToLoop,
  convertParallelBlockToParallel,
  findAllDescendantNodes,
  findChildNodes,
  generateLoopBlocks,
  generateParallelBlocks,
} from '@/stores/workflows/workflow/utils'

const logger = createLogger('YamlDiffCreateAPI')

// Sim Agent API configuration
const SIM_AGENT_API_URL = process.env.SIM_AGENT_API_URL || 'http://localhost:8000'
const SIM_AGENT_API_KEY = process.env.SIM_AGENT_API_KEY

const CreateDiffRequestSchema = z.object({
  yamlContent: z.string().min(1),
  diffAnalysis: z
    .object({
      new_blocks: z.array(z.string()),
      edited_blocks: z.array(z.string()),
      deleted_blocks: z.array(z.string()),
      field_diffs: z
        .record(
          z.object({
            changed_fields: z.array(z.string()),
            unchanged_fields: z.array(z.string()),
          })
        )
        .optional(),
      edge_diff: z
        .object({
          new_edges: z.array(z.string()),
          deleted_edges: z.array(z.string()),
          unchanged_edges: z.array(z.string()),
        })
        .optional(),
    })
    .optional(),
  options: z
    .object({
      applyAutoLayout: z.boolean().optional(),
      layoutOptions: z.any().optional(),
    })
    .optional(),
  currentWorkflowState: z
    .object({
      blocks: z.record(z.any()),
      edges: z.array(z.any()),
      loops: z.record(z.any()).optional(),
      parallels: z.record(z.any()).optional(),
    })
    .optional(),
})

export async function POST(request: NextRequest) {
  const requestId = crypto.randomUUID().slice(0, 8)

  try {
    const body = await request.json()
    const { yamlContent, diffAnalysis, options } = CreateDiffRequestSchema.parse(body)

    // Get current workflow state for comparison
    // Note: This endpoint is stateless, so we need to get this from the request
    const currentWorkflowState = (body as any).currentWorkflowState

    logger.info(`[${requestId}] Creating diff from YAML`, {
      contentLength: yamlContent.length,
      hasDiffAnalysis: !!diffAnalysis,
      hasOptions: !!options,
      options: options,
      hasApiKey: !!SIM_AGENT_API_KEY,
      hasCurrentWorkflowState: !!currentWorkflowState,
      currentBlockCount: currentWorkflowState
        ? Object.keys(currentWorkflowState.blocks || {}).length
        : 0,
    })

    // Gather block registry
    const blocks = getAllBlocks()
    const blockRegistry = blocks.reduce(
      (acc, block) => {
        const blockType = block.type
        acc[blockType] = {
          ...block,
          id: blockType,
          subBlocks: block.subBlocks || [],
          outputs: block.outputs || {},
        } as any
        return acc
      },
      {} as Record<string, BlockConfig>
    )

    // Call sim-agent API
    const response = await fetch(`${SIM_AGENT_API_URL}/api/yaml/diff/create`, {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        ...(SIM_AGENT_API_KEY && { 'x-api-key': SIM_AGENT_API_KEY }),
      },
      body: JSON.stringify({
        yamlContent,
        diffAnalysis,
        blockRegistry,
        currentWorkflowState, // Pass current state for comparison

        utilities: {
          generateLoopBlocks: generateLoopBlocks.toString(),
          generateParallelBlocks: generateParallelBlocks.toString(),
          resolveOutputType: resolveOutputType.toString(),
          convertLoopBlockToLoop: convertLoopBlockToLoop.toString(),
          convertParallelBlockToParallel: convertParallelBlockToParallel.toString(),
          findChildNodes: findChildNodes.toString(),
          findAllDescendantNodes: findAllDescendantNodes.toString(),
        },
        options,
      }),
    })

    if (!response.ok) {
      const errorText = await response.text()
      logger.error(`[${requestId}] Sim agent API error:`, {
        status: response.status,
        error: errorText,
      })
      return NextResponse.json(
        { success: false, errors: [`Sim agent API error: ${response.statusText}`] },
        { status: response.status }
      )
    }

    const result = await response.json()

    // Log the full response to see if auto-layout is happening
    logger.info(`[${requestId}] Full sim agent response:`, JSON.stringify(result, null, 2))

    // Log diff analysis specifically
    if (result.diff?.diffAnalysis) {
      logger.info(`[${requestId}] Diff analysis received:`, {
        new_blocks: result.diff.diffAnalysis.new_blocks || [],
        edited_blocks: result.diff.diffAnalysis.edited_blocks || [],
        deleted_blocks: result.diff.diffAnalysis.deleted_blocks || [],
        has_field_diffs: !!result.diff.diffAnalysis.field_diffs,
        has_edge_diff: !!result.diff.diffAnalysis.edge_diff,
      })
    } else {
      logger.warn(`[${requestId}] No diff analysis in response!`)
    }

    // If the sim agent returned blocks directly (when auto-layout is applied),
    // transform it to the expected diff format
    if (result.success && result.blocks && !result.diff) {
      logger.info(`[${requestId}] Transforming sim agent blocks response to diff format`)

      const transformedResult = {
        success: result.success,
        diff: {
          proposedState: {
            blocks: result.blocks,
            edges: result.edges || [],
            loops: result.loops || {},
            parallels: result.parallels || {},
          },
          diffAnalysis: diffAnalysis,
          metadata: result.metadata || {
            source: 'sim-agent',
            timestamp: Date.now(),
          },
        },
        errors: result.errors || [],
      }

      return NextResponse.json(transformedResult)
    }

    return NextResponse.json(result)
  } catch (error) {
    logger.error(`[${requestId}] Diff creation failed:`, error)

    if (error instanceof z.ZodError) {
      return NextResponse.json(
        { success: false, errors: error.errors.map((e) => e.message) },
        { status: 400 }
      )
    }

    return NextResponse.json(
      {
        success: false,
        errors: [error instanceof Error ? error.message : 'Unknown error'],
      },
      { status: 500 }
    )
  }
}
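The blocks-to-diff transformation appears in both diff routes (create and merge); a sketch of it as a standalone function (hypothetical name, minimal local types modeled on the responses seen in these routes):

```typescript
// Shape modeled on the sim-agent responses these routes handle.
interface AgentResult {
  success: boolean
  blocks?: Record<string, unknown>
  edges?: unknown[]
  loops?: Record<string, unknown>
  parallels?: Record<string, unknown>
  diff?: unknown
  metadata?: { source: string; timestamp: number }
  errors?: string[]
}

// When the agent returns a flat block map (auto-layout applied) instead of a
// diff object, wrap it in the diff envelope the client expects; otherwise
// pass the result through untouched.
function toDiffEnvelope(result: AgentResult, diffAnalysis?: unknown) {
  if (!(result.success && result.blocks && !result.diff)) return result
  return {
    success: result.success,
    diff: {
      proposedState: {
        blocks: result.blocks,
        edges: result.edges || [],
        loops: result.loops || {},
        parallels: result.parallels || {},
      },
      diffAnalysis,
      metadata: result.metadata || { source: 'sim-agent', timestamp: Date.now() },
    },
    errors: result.errors || [],
  }
}
```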
apps/sim/app/api/yaml/diff/merge/route.ts (new file, 166 lines)
@@ -0,0 +1,166 @@
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { createLogger } from '@/lib/logs/console/logger'
import { getAllBlocks } from '@/blocks/registry'
import type { BlockConfig } from '@/blocks/types'
import { resolveOutputType } from '@/blocks/utils'
import {
  convertLoopBlockToLoop,
  convertParallelBlockToParallel,
  findAllDescendantNodes,
  findChildNodes,
  generateLoopBlocks,
  generateParallelBlocks,
} from '@/stores/workflows/workflow/utils'

const logger = createLogger('YamlDiffMergeAPI')

// Sim Agent API configuration
const SIM_AGENT_API_URL = process.env.SIM_AGENT_API_URL || 'http://localhost:8000'
const SIM_AGENT_API_KEY = process.env.SIM_AGENT_API_KEY

const MergeDiffRequestSchema = z.object({
  existingDiff: z.object({
    proposedState: z.object({
      blocks: z.record(z.any()),
      edges: z.array(z.any()),
      loops: z.record(z.any()),
      parallels: z.record(z.any()),
    }),
    diffAnalysis: z.any().optional(),
    metadata: z.object({
      source: z.string(),
      timestamp: z.number(),
    }),
  }),
  yamlContent: z.string().min(1),
  diffAnalysis: z.any().optional(),
  options: z
    .object({
      applyAutoLayout: z.boolean().optional(),
      layoutOptions: z.any().optional(),
    })
    .optional(),
})

export async function POST(request: NextRequest) {
  const requestId = crypto.randomUUID().slice(0, 8)

  try {
    const body = await request.json()
    const { existingDiff, yamlContent, diffAnalysis, options } = MergeDiffRequestSchema.parse(body)

    logger.info(`[${requestId}] Merging diff from YAML`, {
      contentLength: yamlContent.length,
      existingBlockCount: Object.keys(existingDiff.proposedState.blocks).length,
      hasDiffAnalysis: !!diffAnalysis,
      hasOptions: !!options,
      options: options,
      hasApiKey: !!SIM_AGENT_API_KEY,
    })

    // Gather block registry
    const blocks = getAllBlocks()
    const blockRegistry = blocks.reduce(
      (acc, block) => {
        const blockType = block.type
        acc[blockType] = {
          ...block,
          id: blockType,
          subBlocks: block.subBlocks || [],
          outputs: block.outputs || {},
        } as any
        return acc
      },
      {} as Record<string, BlockConfig>
    )

    // Call sim-agent API
    const response = await fetch(`${SIM_AGENT_API_URL}/api/yaml/diff/merge`, {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        ...(SIM_AGENT_API_KEY && { 'x-api-key': SIM_AGENT_API_KEY }),
      },
      body: JSON.stringify({
        existingDiff,
        yamlContent,
        diffAnalysis,
        blockRegistry,

        utilities: {
          generateLoopBlocks: generateLoopBlocks.toString(),
          generateParallelBlocks: generateParallelBlocks.toString(),
          resolveOutputType: resolveOutputType.toString(),
          convertLoopBlockToLoop: convertLoopBlockToLoop.toString(),
          convertParallelBlockToParallel: convertParallelBlockToParallel.toString(),
          findChildNodes: findChildNodes.toString(),
          findAllDescendantNodes: findAllDescendantNodes.toString(),
        },
        options,
      }),
    })

    if (!response.ok) {
      const errorText = await response.text()
      logger.error(`[${requestId}] Sim agent API error:`, {
        status: response.status,
        error: errorText,
      })
      return NextResponse.json(
        { success: false, errors: [`Sim agent API error: ${response.statusText}`] },
        { status: response.status }
      )
    }

    const result = await response.json()

    // Log the full response to see if auto-layout is happening
    logger.info(`[${requestId}] Full sim agent response:`, JSON.stringify(result, null, 2))

    // If the sim agent returned blocks directly (when auto-layout is applied),
    // transform it to the expected diff format
    if (result.success && result.blocks && !result.diff) {
      logger.info(`[${requestId}] Transforming sim agent blocks response to diff format`)

      const transformedResult = {
        success: result.success,
        diff: {
          proposedState: {
            blocks: result.blocks,
            edges: result.edges || existingDiff.proposedState.edges || [],
            loops: result.loops || existingDiff.proposedState.loops || {},
            parallels: result.parallels || existingDiff.proposedState.parallels || {},
          },
          diffAnalysis: diffAnalysis,
          metadata: result.metadata || {
            source: 'sim-agent',
            timestamp: Date.now(),
          },
        },
        errors: result.errors || [],
      }

      return NextResponse.json(transformedResult)
    }

    return NextResponse.json(result)
  } catch (error) {
    logger.error(`[${requestId}] Diff merge failed:`, error)

    if (error instanceof z.ZodError) {
      return NextResponse.json(
        { success: false, errors: error.errors.map((e) => e.message) },
        { status: 400 }
      )
    }

    return NextResponse.json(
      {
        success: false,
        errors: [error instanceof Error ? error.message : 'Unknown error'],
      },
      { status: 500 }
    )
  }
}
apps/sim/app/api/yaml/generate/route.ts (new file, 100 lines)
@@ -0,0 +1,100 @@
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { createLogger } from '@/lib/logs/console/logger'
import { getAllBlocks } from '@/blocks/registry'
import type { BlockConfig } from '@/blocks/types'
import { resolveOutputType } from '@/blocks/utils'
import { generateLoopBlocks, generateParallelBlocks } from '@/stores/workflows/workflow/utils'

const logger = createLogger('YamlGenerateAPI')

// Sim Agent API configuration
const SIM_AGENT_API_URL = process.env.SIM_AGENT_API_URL || 'http://localhost:8000'
const SIM_AGENT_API_KEY = process.env.SIM_AGENT_API_KEY

const GenerateRequestSchema = z.object({
  workflowState: z.any(), // Let the yaml service handle validation
  subBlockValues: z.record(z.record(z.any())).optional(),
})

export async function POST(request: NextRequest) {
  const requestId = crypto.randomUUID().slice(0, 8)

  try {
    const body = await request.json()
    const { workflowState, subBlockValues } = GenerateRequestSchema.parse(body)

    logger.info(`[${requestId}] Generating YAML from workflow`, {
      blocksCount: workflowState.blocks ? Object.keys(workflowState.blocks).length : 0,
      edgesCount: workflowState.edges ? workflowState.edges.length : 0,
      hasApiKey: !!SIM_AGENT_API_KEY,
    })

    // Gather block registry and utilities
    const blocks = getAllBlocks()
    const blockRegistry = blocks.reduce(
      (acc, block) => {
        const blockType = block.type
        acc[blockType] = {
          ...block,
          id: blockType,
          subBlocks: block.subBlocks || [],
          outputs: block.outputs || {},
        } as any
        return acc
      },
      {} as Record<string, BlockConfig>
    )

    // Call sim-agent API
    const response = await fetch(`${SIM_AGENT_API_URL}/api/workflow/to-yaml`, {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        ...(SIM_AGENT_API_KEY && { 'x-api-key': SIM_AGENT_API_KEY }),
      },
      body: JSON.stringify({
        workflowState,
        subBlockValues,
        blockRegistry,
        utilities: {
          generateLoopBlocks: generateLoopBlocks.toString(),
          generateParallelBlocks: generateParallelBlocks.toString(),
          resolveOutputType: resolveOutputType.toString(),
        },
      }),
    })

    if (!response.ok) {
      const errorText = await response.text()
      logger.error(`[${requestId}] Sim agent API error:`, {
        status: response.status,
        error: errorText,
      })
      return NextResponse.json(
        { success: false, error: `Sim agent API error: ${response.statusText}` },
        { status: response.status }
      )
    }

    const result = await response.json()
    return NextResponse.json(result)
  } catch (error) {
    logger.error(`[${requestId}] YAML generation failed:`, error)

    if (error instanceof z.ZodError) {
      return NextResponse.json(
        { success: false, error: error.errors.map((e) => e.message).join(', ') },
        { status: 400 }
      )
    }

    return NextResponse.json(
      {
        success: false,
        error: error instanceof Error ? error.message : 'Unknown error',
      },
      { status: 500 }
    )
  }
}
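Each of these routes rebuilds the same block registry inline before calling the sim agent. As a minimal sketch of that reduce (the `buildBlockRegistry` helper and the simplified `MinimalBlock` shape are hypothetical, not part of the codebase):

```typescript
// Simplified block shape for illustration; the real BlockConfig has more fields.
interface MinimalBlock {
  type: string
  subBlocks?: unknown[]
  outputs?: Record<string, unknown>
}

// Key each entry by its block type, and guarantee the optional fields exist.
function buildBlockRegistry(
  blocks: MinimalBlock[]
): Record<string, MinimalBlock & { id: string }> {
  return blocks.reduce(
    (acc, block) => {
      acc[block.type] = {
        ...block,
        id: block.type, // the registry key doubles as the block id
        subBlocks: block.subBlocks ?? [],
        outputs: block.outputs ?? {},
      }
      return acc
    },
    {} as Record<string, MinimalBlock & { id: string }>
  )
}

const registry = buildBlockRegistry([{ type: 'agent' }, { type: 'api', outputs: { data: {} } }])
console.log(Object.keys(registry)) // ['agent', 'api']
```

Since the same reduce appears in generate, parse, and to-workflow, factoring it out like this would also keep the three routes from drifting apart.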
apps/sim/app/api/yaml/health/route.ts (new file, 47 lines)
@@ -0,0 +1,47 @@
import { NextResponse } from 'next/server'
import { createLogger } from '@/lib/logs/console/logger'

const logger = createLogger('YamlHealthAPI')

// Sim Agent API configuration
const SIM_AGENT_API_URL = process.env.SIM_AGENT_API_URL || 'http://localhost:8000'
const SIM_AGENT_API_KEY = process.env.SIM_AGENT_API_KEY

export async function GET() {
  const requestId = crypto.randomUUID().slice(0, 8)

  try {
    logger.info(`[${requestId}] Checking YAML service health`, {
      hasApiKey: !!SIM_AGENT_API_KEY,
    })

    // Check sim-agent health
    const response = await fetch(`${SIM_AGENT_API_URL}/health`, {
      method: 'GET',
      headers: {
        'Content-Type': 'application/json',
        ...(SIM_AGENT_API_KEY && { 'x-api-key': SIM_AGENT_API_KEY }),
      },
    })

    const isHealthy = response.ok

    return NextResponse.json({
      success: true,
      healthy: isHealthy,
      service: 'yaml',
    })
  } catch (error) {
    logger.error(`[${requestId}] YAML health check failed:`, error)

    return NextResponse.json(
      {
        success: false,
        healthy: false,
        service: 'yaml',
        error: error instanceof Error ? error.message : 'Unknown error',
      },
      { status: 500 }
    )
  }
}
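All of these proxy routes share the same conditional-header pattern: `x-api-key` is spread into the headers only when the key is configured. A small sketch of that idiom (`agentHeaders` is a hypothetical helper name used for illustration):

```typescript
// Build the headers for a sim-agent request; the API key is optional.
function agentHeaders(apiKey?: string): Record<string, string> {
  return {
    'Content-Type': 'application/json',
    // Spreading a conditional object adds the entry only when apiKey is truthy.
    ...(apiKey ? { 'x-api-key': apiKey } : {}),
  }
}

const withKey = agentHeaders('secret')
const withoutKey = agentHeaders()
```

This keeps the header object free of `undefined` values, which matters because `fetch` would otherwise send a literal `"undefined"` string for the key.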
apps/sim/app/api/yaml/parse/route.ts (new file, 97 lines)
@@ -0,0 +1,97 @@
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { createLogger } from '@/lib/logs/console/logger'
import { getAllBlocks } from '@/blocks/registry'
import type { BlockConfig } from '@/blocks/types'
import { resolveOutputType } from '@/blocks/utils'
import { generateLoopBlocks, generateParallelBlocks } from '@/stores/workflows/workflow/utils'

const logger = createLogger('YamlParseAPI')

// Sim Agent API configuration
const SIM_AGENT_API_URL = process.env.SIM_AGENT_API_URL || 'http://localhost:8000'
const SIM_AGENT_API_KEY = process.env.SIM_AGENT_API_KEY

const ParseRequestSchema = z.object({
  yamlContent: z.string().min(1),
})

export async function POST(request: NextRequest) {
  const requestId = crypto.randomUUID().slice(0, 8)

  try {
    const body = await request.json()
    const { yamlContent } = ParseRequestSchema.parse(body)

    logger.info(`[${requestId}] Parsing YAML`, {
      contentLength: yamlContent.length,
      hasApiKey: !!SIM_AGENT_API_KEY,
    })

    // Gather block registry and utilities
    const blocks = getAllBlocks()
    const blockRegistry = blocks.reduce(
      (acc, block) => {
        const blockType = block.type
        acc[blockType] = {
          ...block,
          id: blockType,
          subBlocks: block.subBlocks || [],
          outputs: block.outputs || {},
        } as any
        return acc
      },
      {} as Record<string, BlockConfig>
    )

    // Call sim-agent API
    const response = await fetch(`${SIM_AGENT_API_URL}/api/yaml/parse`, {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        ...(SIM_AGENT_API_KEY && { 'x-api-key': SIM_AGENT_API_KEY }),
      },
      body: JSON.stringify({
        yamlContent,
        blockRegistry,
        utilities: {
          generateLoopBlocks: generateLoopBlocks.toString(),
          generateParallelBlocks: generateParallelBlocks.toString(),
          resolveOutputType: resolveOutputType.toString(),
        },
      }),
    })

    if (!response.ok) {
      const errorText = await response.text()
      logger.error(`[${requestId}] Sim agent API error:`, {
        status: response.status,
        error: errorText,
      })
      return NextResponse.json(
        { success: false, errors: [`Sim agent API error: ${response.statusText}`] },
        { status: response.status }
      )
    }

    const result = await response.json()
    return NextResponse.json(result)
  } catch (error) {
    logger.error(`[${requestId}] YAML parse failed:`, error)

    if (error instanceof z.ZodError) {
      return NextResponse.json(
        { success: false, errors: error.errors.map((e) => e.message) },
        { status: 400 }
      )
    }

    return NextResponse.json(
      {
        success: false,
        errors: [error instanceof Error ? error.message : 'Unknown error'],
      },
      { status: 500 }
    )
  }
}
apps/sim/app/api/yaml/to-workflow/route.ts (new file, 107 lines)
@@ -0,0 +1,107 @@
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { createLogger } from '@/lib/logs/console/logger'
import { getAllBlocks } from '@/blocks/registry'
import type { BlockConfig } from '@/blocks/types'
import { resolveOutputType } from '@/blocks/utils'
import { generateLoopBlocks, generateParallelBlocks } from '@/stores/workflows/workflow/utils'

const logger = createLogger('YamlToWorkflowAPI')

// Sim Agent API configuration
const SIM_AGENT_API_URL = process.env.SIM_AGENT_API_URL || 'http://localhost:8000'
const SIM_AGENT_API_KEY = process.env.SIM_AGENT_API_KEY

const ConvertRequestSchema = z.object({
  yamlContent: z.string().min(1),
  options: z
    .object({
      generateNewIds: z.boolean().optional(),
      preservePositions: z.boolean().optional(),
      existingBlocks: z.record(z.any()).optional(),
    })
    .optional(),
})

export async function POST(request: NextRequest) {
  const requestId = crypto.randomUUID().slice(0, 8)

  try {
    const body = await request.json()
    const { yamlContent, options } = ConvertRequestSchema.parse(body)

    logger.info(`[${requestId}] Converting YAML to workflow`, {
      contentLength: yamlContent.length,
      hasOptions: !!options,
      hasApiKey: !!SIM_AGENT_API_KEY,
    })

    // Gather block registry and utilities
    const blocks = getAllBlocks()
    const blockRegistry = blocks.reduce(
      (acc, block) => {
        const blockType = block.type
        acc[blockType] = {
          ...block,
          id: blockType,
          subBlocks: block.subBlocks || [],
          outputs: block.outputs || {},
        } as any
        return acc
      },
      {} as Record<string, BlockConfig>
    )

    // Call sim-agent API
    const response = await fetch(`${SIM_AGENT_API_URL}/api/yaml/to-workflow`, {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        ...(SIM_AGENT_API_KEY && { 'x-api-key': SIM_AGENT_API_KEY }),
      },
      body: JSON.stringify({
        yamlContent,
        blockRegistry,
        utilities: {
          generateLoopBlocks: generateLoopBlocks.toString(),
          generateParallelBlocks: generateParallelBlocks.toString(),
          resolveOutputType: resolveOutputType.toString(),
        },
        options,
      }),
    })

    if (!response.ok) {
      const errorText = await response.text()
      logger.error(`[${requestId}] Sim agent API error:`, {
        status: response.status,
        error: errorText,
      })
      return NextResponse.json(
        { success: false, errors: [`Sim agent API error: ${response.statusText}`], warnings: [] },
        { status: response.status }
      )
    }

    const result = await response.json()
    return NextResponse.json(result)
  } catch (error) {
    logger.error(`[${requestId}] YAML to workflow conversion failed:`, error)

    if (error instanceof z.ZodError) {
      return NextResponse.json(
        { success: false, errors: error.errors.map((e) => e.message), warnings: [] },
        { status: 400 }
      )
    }

    return NextResponse.json(
      {
        success: false,
        errors: [error instanceof Error ? error.message : 'Unknown error'],
        warnings: [],
      },
      { status: 500 }
    )
  }
}
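Note that the three YAML routes return slightly different failure envelopes: generate uses a single `error` string, parse uses an `errors` array, and to-workflow adds a `warnings` array. A sketch capturing the three shapes (the `failureEnvelope` helper is hypothetical, written only to make the difference explicit):

```typescript
// Union of the failure shapes observed across the three routes.
type FailureEnvelope =
  | { success: false; error: string }
  | { success: false; errors: string[]; warnings?: string[] }

function failureEnvelope(
  route: 'generate' | 'parse' | 'to-workflow',
  messages: string[]
): FailureEnvelope {
  if (route === 'generate') return { success: false, error: messages.join(', ') }
  if (route === 'parse') return { success: false, errors: messages }
  // to-workflow always includes a warnings array, even on failure
  return { success: false, errors: messages, warnings: [] }
}

const g = failureEnvelope('generate', ['bad yaml'])
const w = failureEnvelope('to-workflow', ['bad yaml'])
```

Callers of these routes have to branch on `error` vs `errors`, so converging on one envelope would simplify the client side.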
@@ -3,10 +3,8 @@
import { memo, useMemo, useState } from 'react'
import { Check, Copy } from 'lucide-react'
import { Button } from '@/components/ui/button'
import { ToolCallCompletion, ToolCallExecution } from '@/components/ui/tool-call'
import { Tooltip, TooltipContent, TooltipProvider, TooltipTrigger } from '@/components/ui/tooltip'
import { parseMessageContent, stripToolCallIndicators } from '@/lib/tool-call-parser'
import MarkdownRenderer from '@/app/chat/[subdomain]/components/message/components/markdown-renderer'
import MarkdownRenderer from './components/markdown-renderer'

export interface ChatMessage {
  id: string
@@ -33,21 +31,9 @@ export const ClientChatMessage = memo(
      return typeof message.content === 'object' && message.content !== null
    }, [message.content])

    // Parse message content to separate text and tool calls (only for assistant messages)
    const parsedContent = useMemo(() => {
      if (message.type === 'assistant' && typeof message.content === 'string') {
        return parseMessageContent(message.content)
      }
      return null
    }, [message.type, message.content])

    // Get clean text content without tool call indicators
    const cleanTextContent = useMemo(() => {
      if (message.type === 'assistant' && typeof message.content === 'string') {
        return stripToolCallIndicators(message.content)
      }
      return message.content
    }, [message.type, message.content])
    // Since tool calls are now handled via SSE events and stored in message.toolCalls,
    // we can use the content directly without parsing
    const cleanTextContent = message.content

    // For user messages (on the right)
    if (message.type === 'user') {
@@ -75,57 +61,18 @@ export const ClientChatMessage = memo(
      <div className='px-4 pt-5 pb-2' data-message-id={message.id}>
        <div className='mx-auto max-w-3xl'>
          <div className='flex flex-col space-y-3'>
            {/* Inline content rendering - tool calls and text in order */}
            {parsedContent?.inlineContent && parsedContent.inlineContent.length > 0 ? (
              <div className='space-y-2'>
                {parsedContent.inlineContent.map((item, index) => {
                  if (item.type === 'tool_call' && item.toolCall) {
                    const toolCall = item.toolCall
                    return (
                      <div key={`${toolCall.id}-${index}`}>
                        {toolCall.state === 'detecting' && (
                          <div className='flex items-center gap-2 rounded-lg border border-blue-200 bg-blue-50 px-3 py-2 text-sm dark:border-blue-800 dark:bg-blue-950'>
                            <div className='h-4 w-4 animate-spin rounded-full border-2 border-blue-600 border-t-transparent dark:border-blue-400' />
                            <span className='text-blue-800 dark:text-blue-200'>
                              Detecting {toolCall.displayName || toolCall.name}...
                            </span>
                          </div>
                        )}
                        {toolCall.state === 'executing' && (
                          <ToolCallExecution toolCall={toolCall} isCompact={true} />
                        )}
                        {(toolCall.state === 'completed' || toolCall.state === 'error') && (
                          <ToolCallCompletion toolCall={toolCall} isCompact={true} />
                        )}
                      </div>
                    )
                  }
                  if (item.type === 'text' && item.content.trim()) {
                    return (
                      <div key={`text-${index}`}>
                        <div className='break-words text-lg'>
                          <EnhancedMarkdownRenderer content={item.content} />
                        </div>
                      </div>
                    )
                  }
                  return null
                })}
            {/* Direct content rendering - tool calls are now handled via SSE events */}
            <div>
              <div className='break-words text-base'>
                {isJsonObject ? (
                  <pre className='text-gray-800 dark:text-gray-100'>
                    {JSON.stringify(cleanTextContent, null, 2)}
                  </pre>
                ) : (
                  <EnhancedMarkdownRenderer content={cleanTextContent as string} />
                )}
              </div>
            ) : (
              /* Fallback for empty content or no inline content */
              <div>
                <div className='break-words text-lg'>
                  {isJsonObject ? (
                    <pre className='text-gray-800 dark:text-gray-100'>
                      {JSON.stringify(cleanTextContent, null, 2)}
                    </pre>
                  ) : (
                    <EnhancedMarkdownRenderer content={cleanTextContent as string} />
                  )}
                </div>
              </div>
            )}
          </div>
          {message.type === 'assistant' && !isJsonObject && !message.isInitialMessage && (
            <div className='flex items-center justify-start space-x-2'>
              {/* Copy Button - Only show when not streaming */}
@@ -2,7 +2,6 @@ import { Analytics } from '@vercel/analytics/next'
import { SpeedInsights } from '@vercel/speed-insights/next'
import type { Metadata, Viewport } from 'next'
import { PublicEnvScript } from 'next-runtime-env'
import { env } from '@/lib/env'
import { isHosted } from '@/lib/environment'
import { createLogger } from '@/lib/logs/console/logger'
import { getAssetUrl } from '@/lib/utils'
@@ -226,13 +225,13 @@ export default function RootLayout({ children }: { children: React.ReactNode })
        <PublicEnvScript />

        {/* RB2B Script - Only load on hosted version */}
        {isHosted && env.NEXT_PUBLIC_RB2B_KEY && (
        {/* {isHosted && env.NEXT_PUBLIC_RB2B_KEY && (
          <script
            dangerouslySetInnerHTML={{
              __html: `!function () {var reb2b = window.reb2b = window.reb2b || [];if (reb2b.invoked) return;reb2b.invoked = true;reb2b.methods = ["identify", "collect"];reb2b.factory = function (method) {return function () {var args = Array.prototype.slice.call(arguments);args.unshift(method);reb2b.push(args);return reb2b;};};for (var i = 0; i < reb2b.methods.length; i++) {var key = reb2b.methods[i];reb2b[key] = reb2b.factory(key);}reb2b.load = function (key) {var script = document.createElement("script");script.type = "text/javascript";script.async = true;script.src = "https://b2bjsstore.s3.us-west-2.amazonaws.com/b/" + key + "/${env.NEXT_PUBLIC_RB2B_KEY}.js.gz";var first = document.getElementsByTagName("script")[0];first.parentNode.insertBefore(script, first);};reb2b.SNIPPET_VERSION = "1.0.1";reb2b.load("${env.NEXT_PUBLIC_RB2B_KEY}");}();`,
            }}
          />
        )}
        )} */}
      </head>
      <body suppressHydrationWarning>
        <ZoomPrevention />
@@ -28,7 +28,7 @@ import { SubdomainInput } from '@/app/workspace/[workspaceId]/w/[workflowId]/com
import { SuccessView } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/control-bar/components/deploy-modal/components/chat-deploy/components/success-view'
import { useChatDeployment } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/control-bar/components/deploy-modal/components/chat-deploy/hooks/use-chat-deployment'
import { useChatForm } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/control-bar/components/deploy-modal/components/chat-deploy/hooks/use-chat-form'
import { OutputSelect } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/chat/components'
import { OutputSelect } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/chat/components/output-select/output-select'

const logger = createLogger('ChatDeploy')
@@ -112,6 +112,7 @@ export function ControlBar({ hasValidationErrors = false }: ControlBarProps) {
  const [, forceUpdate] = useState({})
  const [isExpanded, setIsExpanded] = useState(false)
  const [isTemplateModalOpen, setIsTemplateModalOpen] = useState(false)
  const [isAutoLayouting, setIsAutoLayouting] = useState(false)

  // Deployed state management
  const [deployedState, setDeployedState] = useState<WorkflowState | null>(null)
@@ -547,21 +548,40 @@ export function ControlBar({ hasValidationErrors = false }: ControlBarProps) {
   * Render auto-layout button
   */
  const renderAutoLayoutButton = () => {
    const handleAutoLayoutClick = () => {
      if (isExecuting || isDebugging || !userPermissions.canEdit) {
    const handleAutoLayoutClick = async () => {
      if (isExecuting || isDebugging || !userPermissions.canEdit || isAutoLayouting) {
        return
      }

      window.dispatchEvent(new CustomEvent('trigger-auto-layout'))
      setIsAutoLayouting(true)
      try {
        // Use the shared auto layout utility for immediate frontend updates
        const { applyAutoLayoutAndUpdateStore } = await import('../../utils/auto-layout')

        const result = await applyAutoLayoutAndUpdateStore(activeWorkflowId!)

        if (result.success) {
          logger.info('Auto layout completed successfully')
        } else {
          logger.error('Auto layout failed:', result.error)
          // You could add a toast notification here if available
        }
      } catch (error) {
        logger.error('Auto layout error:', error)
        // You could add a toast notification here if available
      } finally {
        setIsAutoLayouting(false)
      }
    }

    const canEdit = userPermissions.canEdit
    const isDisabled = isExecuting || isDebugging || !canEdit
    const isDisabled = isExecuting || isDebugging || !canEdit || isAutoLayouting

    const getTooltipText = () => {
      if (!canEdit) return 'Admin permission required to use auto-layout'
      if (isDebugging) return 'Cannot auto-layout while debugging'
      if (isExecuting) return 'Cannot auto-layout while workflow is running'
      if (isAutoLayouting) return 'Applying auto-layout...'
      return 'Auto layout'
    }

@@ -570,15 +590,24 @@ export function ControlBar({ hasValidationErrors = false }: ControlBarProps) {
          <TooltipTrigger asChild>
            {isDisabled ? (
              <div className='inline-flex h-12 w-12 cursor-not-allowed items-center justify-center gap-2 whitespace-nowrap rounded-[11px] border bg-card font-medium text-card-foreground text-sm opacity-50 ring-offset-background transition-colors [&_svg]:pointer-events-none [&_svg]:size-4 [&_svg]:shrink-0'>
                <Layers className='h-5 w-5' />
                {isAutoLayouting ? (
                  <RefreshCw className='h-5 w-5 animate-spin' />
                ) : (
                  <Layers className='h-5 w-5' />
                )}
              </div>
            ) : (
              <Button
                variant='outline'
                onClick={handleAutoLayoutClick}
                className='h-12 w-12 rounded-[11px] border bg-card text-card-foreground shadow-xs hover:bg-secondary'
                disabled={isAutoLayouting}
              >
                <Layers className='h-5 w-5' />
                {isAutoLayouting ? (
                  <RefreshCw className='h-5 w-5 animate-spin' />
                ) : (
                  <Layers className='h-5 w-5' />
                )}
                <span className='sr-only'>Auto Layout</span>
              </Button>
            )}
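The `isAutoLayouting` flag in the control bar acts as a re-entrancy guard: clicks that arrive while a layout run is in flight are ignored. A minimal synchronous sketch of that pattern (names like `makeGuardedHandler` are hypothetical; in the component the flag is state and is cleared in a `finally` after the awaited layout work):

```typescript
// Re-entrancy guard sketch: while `busy` is set, click() reports 'ignored'.
function makeGuardedHandler(run: () => void) {
  let busy = false
  return {
    click(): 'started' | 'ignored' {
      if (busy) return 'ignored'
      busy = true
      try {
        run() // synchronous here; the real handler awaits async layout work
      } finally {
        busy = false // always clear the flag, even if run() throws
      }
      return 'started'
    },
  }
}

const results: string[] = []
const handler = makeGuardedHandler(() => {
  // Simulate a second click arriving while the first is still running.
  results.push(`inner:${handler.click()}`)
})
results.push(`outer:${handler.click()}`)
// results: ['inner:ignored', 'outer:started']
```

Clearing the flag in `finally` matters: without it, a thrown error would leave the button permanently disabled.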
@@ -0,0 +1,292 @@
|
||||
import { Check, Eye, X } from 'lucide-react'
|
||||
import { Button } from '@/components/ui/button'
|
||||
import { createLogger } from '@/lib/logs/console/logger'
|
||||
import { useCopilotStore } from '@/stores/copilot/store'
|
||||
import { useWorkflowDiffStore } from '@/stores/workflow-diff'
|
||||
import { useWorkflowRegistry } from '@/stores/workflows/registry/store'
|
||||
import { mergeSubblockState } from '@/stores/workflows/utils'
|
||||
import { useWorkflowStore } from '@/stores/workflows/workflow/store'
|
||||
|
||||
const logger = createLogger('DiffControls')
|
||||
|
||||
export function DiffControls() {
|
||||
const {
|
||||
isShowingDiff,
|
||||
isDiffReady,
|
||||
diffWorkflow,
|
||||
toggleDiffView,
|
||||
acceptChanges,
|
||||
rejectChanges,
|
||||
diffMetadata,
|
||||
} = useWorkflowDiffStore()
|
||||
|
||||
const { updatePreviewToolCallState, clearPreviewYaml, currentChat, messages } = useCopilotStore()
|
||||
const { activeWorkflowId } = useWorkflowRegistry()
|
||||
|
||||
// Don't show anything if no diff is available or diff is not ready
|
||||
if (!diffWorkflow || !isDiffReady) {
|
||||
return null
|
||||
}
|
||||
|
||||
const handleToggleDiff = () => {
|
||||
logger.info('Toggling diff view', { currentState: isShowingDiff })
|
||||
toggleDiffView()
|
||||
}
|
||||
|
||||
const createCheckpoint = async () => {
|
||||
if (!activeWorkflowId || !currentChat?.id) {
|
||||
logger.warn('Cannot create checkpoint: missing workflowId or chatId', {
|
||||
workflowId: activeWorkflowId,
|
||||
chatId: currentChat?.id,
|
||||
})
|
||||
return false
|
||||
}
|
||||
|
||||
try {
|
||||
logger.info('Creating checkpoint before accepting changes')
|
||||
|
||||
// Get current workflow state from the store and ensure it's complete
|
||||
const rawState = useWorkflowStore.getState().getWorkflowState()
|
||||
|
||||
// Merge subblock values from the SubBlockStore to get complete state
|
||||
// This ensures all user inputs and subblock data are captured
|
||||
const blocksWithSubblockValues = mergeSubblockState(rawState.blocks, activeWorkflowId)
|
||||
|
||||
// Filter and complete blocks to ensure all required fields are present
|
||||
// This matches the validation logic from /api/workflows/[id]/state
|
||||
const filteredBlocks = Object.entries(blocksWithSubblockValues).reduce(
|
||||
(acc, [blockId, block]) => {
|
||||
if (block.type && block.name) {
|
||||
// Ensure all required fields are present
|
||||
acc[blockId] = {
|
||||
...block,
|
||||
id: block.id || blockId, // Ensure id field is set
|
||||
enabled: block.enabled !== undefined ? block.enabled : true,
|
||||
horizontalHandles:
|
||||
block.horizontalHandles !== undefined ? block.horizontalHandles : true,
|
||||
isWide: block.isWide !== undefined ? block.isWide : false,
|
||||
height: block.height !== undefined ? block.height : 90,
|
||||
subBlocks: block.subBlocks || {},
|
||||
outputs: block.outputs || {},
|
||||
data: block.data || {},
|
||||
position: block.position || { x: 0, y: 0 }, // Ensure position exists
|
||||
}
|
||||
}
|
||||
return acc
|
||||
},
|
||||
{} as typeof rawState.blocks
|
||||
)
|
||||
|
||||
// Clean the workflow state - only include valid fields, exclude null/undefined values
|
||||
const workflowState = {
|
||||
blocks: filteredBlocks,
|
||||
edges: rawState.edges || [],
|
||||
loops: rawState.loops || {},
|
||||
parallels: rawState.parallels || {},
|
||||
lastSaved: rawState.lastSaved || Date.now(),
|
||||
isDeployed: rawState.isDeployed || false,
|
||||
deploymentStatuses: rawState.deploymentStatuses || {},
|
||||
hasActiveWebhook: rawState.hasActiveWebhook || false,
|
||||
// Only include deployedAt if it's a valid date, never include null/undefined
|
||||
...(rawState.deployedAt && rawState.deployedAt instanceof Date
|
||||
? { deployedAt: rawState.deployedAt }
|
||||
: {}),
|
||||
}
|
||||
|
||||
logger.info('Prepared complete workflow state for checkpoint', {
|
||||
blocksCount: Object.keys(workflowState.blocks).length,
|
||||
edgesCount: workflowState.edges.length,
|
||||
loopsCount: Object.keys(workflowState.loops).length,
|
||||
parallelsCount: Object.keys(workflowState.parallels).length,
|
||||
hasRequiredFields: Object.values(workflowState.blocks).every(
|
||||
(block) => block.id && block.type && block.name && block.position
|
||||
),
|
||||
hasSubblockValues: Object.values(workflowState.blocks).some((block) =>
|
||||
Object.values(block.subBlocks || {}).some(
|
||||
(subblock) => subblock.value !== null && subblock.value !== undefined
|
||||
)
|
||||
),
|
||||
sampleBlock: Object.values(workflowState.blocks)[0],
|
||||
})
|
||||
|
||||
// Find the most recent user message ID from the current chat
|
||||
const userMessages = messages.filter((msg) => msg.role === 'user')
|
||||
const lastUserMessage = userMessages[userMessages.length - 1]
|
||||
const messageId = lastUserMessage?.id
|
||||
|
||||
logger.info('Creating checkpoint with message association', {
|
||||
totalMessages: messages.length,
|
||||
userMessageCount: userMessages.length,
|
||||
lastUserMessageId: messageId,
|
||||
chatId: currentChat.id,
|
||||
entireMessageArray: messages,
|
||||
allMessageIds: messages.map((m) => ({
|
||||
id: m.id,
|
||||
role: m.role,
|
||||
content: m.content.substring(0, 50),
|
||||
})),
|
||||
selectedUserMessages: userMessages.map((m) => ({
|
||||
id: m.id,
|
||||
content: m.content.substring(0, 100),
|
||||
})),
|
||||
allRawMessageIds: messages.map((m) => m.id),
|
||||
userMessageIds: userMessages.map((m) => m.id),
|
||||
checkpointData: {
|
||||
workflowId: activeWorkflowId,
|
||||
chatId: currentChat.id,
|
||||
messageId: messageId,
|
||||
messageFound: !!lastUserMessage,
|
||||
},
|
||||
})
|
||||
|
||||
const response = await fetch('/api/copilot/checkpoints', {
|
||||
method: 'POST',
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
body: JSON.stringify({
|
||||
workflowId: activeWorkflowId,
|
||||
chatId: currentChat.id,
|
||||
messageId,
|
||||
workflowState: JSON.stringify(workflowState),
|
||||
}),
|
||||
})
|
||||
|
||||
if (!response.ok) {
|
||||
throw new Error(`Failed to create checkpoint: ${response.statusText}`)
|
||||
}
|
||||
|
||||
const result = await response.json()
|
||||
const newCheckpoint = result.checkpoint
|
||||
|
||||
logger.info('Checkpoint created successfully', {
|
||||
messageId,
|
||||
chatId: currentChat.id,
|
||||
checkpointId: newCheckpoint?.id,
|
||||
})
|
||||
|
||||
// Update the copilot store immediately to show the checkpoint icon
|
||||
if (newCheckpoint && messageId) {
|
||||
const { messageCheckpoints: currentCheckpoints } = useCopilotStore.getState()
|
||||
const existingCheckpoints = currentCheckpoints[messageId] || []
|
||||
|
||||
const updatedCheckpoints = {
|
||||
...currentCheckpoints,
|
||||
[messageId]: [newCheckpoint, ...existingCheckpoints],
|
||||
}
|
||||
|
||||
useCopilotStore.setState({ messageCheckpoints: updatedCheckpoints })
|
||||
logger.info('Updated copilot store with new checkpoint', {
|
||||
messageId,
|
||||
checkpointId: newCheckpoint.id,
|
||||
})
|
||||
}
|
||||
|
||||
return true
|
||||
} catch (error) {
|
||||
logger.error('Failed to create checkpoint:', error)
|
||||
return false
|
||||
}
|
||||
}
|
||||
|
||||
const handleAccept = () => {
|
||||
logger.info('Accepting proposed changes (optimistic)')
|
||||
|
||||
// Create checkpoint in the background (don't await to avoid blocking)
|
||||
createCheckpoint()
|
||||
.then((checkpointCreated) => {
|
||||
if (!checkpointCreated) {
|
||||
logger.warn('Checkpoint creation failed, but proceeding with accept')
|
||||
} else {
|
||||
logger.info('Checkpoint created successfully before accept')
|
||||
}
|
||||
})
|
||||
.catch((error) => {
|
||||
logger.error('Checkpoint creation failed:', error)
|
||||
})
|
||||
|
||||
// Clear preview YAML immediately
|
||||
clearPreviewYaml().catch((error) => {
|
||||
logger.warn('Failed to clear preview YAML:', error)
|
||||
})
|
||||
|
||||
// Start background save without awaiting
|
||||
acceptChanges().catch((error) => {
|
||||
logger.error('Failed to accept changes in background:', error)
|
||||
// TODO: Consider showing a toast notification for save failures
|
||||
// For now, the optimistic update stands since the UI state is already correct
|
||||
})
|
||||
|
||||
logger.info('Optimistically applied changes, saving in background')
|
||||
}
|
||||
|
||||
const handleReject = () => {
|
||||
logger.info('Rejecting proposed changes (optimistic)')
|
||||
|
||||
// Clear preview YAML immediately
|
||||
clearPreviewYaml().catch((error) => {
|
||||
logger.warn('Failed to clear preview YAML:', error)
|
||||
})
|
||||
|
||||
// Reject is immediate (no server save needed)
|
||||
rejectChanges()
|
||||
|
||||
logger.info('Successfully rejected proposed changes')
|
||||
}
|
||||
|
||||
return (
|
||||
<div className='-translate-x-1/2 fixed bottom-20 left-1/2 z-30'>
|
||||
<div className='rounded-lg border bg-background/95 p-4 shadow-lg backdrop-blur-sm'>
|
||||
<div className='flex items-center gap-4'>
|
||||
{/* Info section */}
|
||||
<div className='flex items-center gap-2'>
|
||||
<div className='flex h-8 w-8 items-center justify-center rounded-full bg-purple-100 dark:bg-purple-900'>
|
||||
<Eye className='h-4 w-4 text-purple-600 dark:text-purple-400' />
|
||||
</div>
|
||||
<div className='flex flex-col'>
|
||||
<span className='font-medium text-sm'>
|
||||
{isShowingDiff ? 'Viewing Proposed Changes' : 'Copilot has proposed changes'}
|
||||
</span>
|
||||
{diffMetadata && (
|
||||
<span className='text-muted-foreground text-xs'>
|
||||
Source: {diffMetadata.source} •{' '}
|
||||
{new Date(diffMetadata.timestamp).toLocaleTimeString()}
|
||||
</span>
|
||||
)}
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{/* Controls */}
|
||||
<div className='flex items-center gap-2'>
|
||||
{/* Toggle View Button */}
|
||||
<Button
|
||||
variant={isShowingDiff ? 'default' : 'outline'}
|
||||
size='sm'
|
||||
onClick={handleToggleDiff}
|
||||
className='h-8'
|
||||
>
|
||||
{isShowingDiff ? 'View Original' : 'Preview Changes'}
|
||||
</Button>
|
||||
|
||||
{/* Accept/Reject buttons - only show when viewing diff */}
|
||||
{isShowingDiff && (
|
||||
<>
|
||||
<Button
|
||||
variant='default'
|
||||
size='sm'
|
||||
onClick={handleAccept}
|
||||
className='h-8 bg-green-600 px-3 hover:bg-green-700'
|
||||
>
|
||||
<Check className='mr-1 h-3 w-3' />
|
||||
Accept
|
||||
</Button>
|
||||
<Button variant='destructive' size='sm' onClick={handleReject} className='h-8 px-3'>
|
||||
<X className='mr-1 h-3 w-3' />
|
||||
Reject
|
||||
</Button>
|
||||
</>
|
||||
)}
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
)
|
||||
}
|
||||
@@ -48,9 +48,14 @@ vi.mock('@/components/ui/card', () => ({
   Card: ({ children, ...props }: any) => ({ children, ...props }),
 }))
 
-vi.mock('@/components/icons', () => ({
-  StartIcon: ({ className }: any) => ({ className }),
-}))
+vi.mock('@/components/icons', async (importOriginal) => {
+  const actual = (await importOriginal()) as any
+  return {
+    ...actual,
+    // Override specific icons if needed for testing
+    StartIcon: ({ className }: any) => ({ className }),
+  }
+})
 
 vi.mock('@/lib/utils', () => ({
   cn: (...classes: any[]) => classes.filter(Boolean).join(' '),
@@ -6,8 +6,9 @@ import { StartIcon } from '@/components/icons'
 import { Button } from '@/components/ui/button'
 import { Card } from '@/components/ui/card'
 import { cn } from '@/lib/utils'
+import { LoopBadges } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/loop-node/components/loop-badges'
 import { useCollaborativeWorkflow } from '@/hooks/use-collaborative-workflow'
+import { useCurrentWorkflow } from '../../hooks'
-import { LoopBadges } from './components/loop-badges'
 
 // Add these styles to your existing global CSS file or create a separate CSS module
 const LoopNodeStyles: React.FC = () => {
@@ -72,6 +73,12 @@ export const LoopNodeComponent = memo(({ data, selected, id }: NodeProps) => {
   const { collaborativeRemoveBlock } = useCollaborativeWorkflow()
   const blockRef = useRef<HTMLDivElement>(null)
 
+  // Use the clean abstraction for current workflow state
+  const currentWorkflow = useCurrentWorkflow()
+  const currentBlock = currentWorkflow.getBlockById(id)
+  const diffStatus =
+    currentWorkflow.isDiffMode && currentBlock ? (currentBlock as any).is_diff : undefined
+
   // Check if this is preview mode
   const isPreview = data?.isPreview || false
 
@@ -124,7 +131,11 @@ export const LoopNodeComponent = memo(({ data, selected, id }: NodeProps) => {
           data?.state === 'valid',
         nestingLevel > 0 &&
           `border border-[0.5px] ${nestingLevel % 2 === 0 ? 'border-slate-300/60' : 'border-slate-400/60'}`,
-        data?.hasNestedError && 'border-2 border-red-500 bg-red-50/50'
+        data?.hasNestedError && 'border-2 border-red-500 bg-red-50/50',
+        // Diff highlighting
+        diffStatus === 'new' && 'bg-green-50/50 ring-2 ring-green-500 dark:bg-green-900/10',
+        diffStatus === 'edited' &&
+          'bg-orange-50/50 ring-2 ring-orange-500 dark:bg-orange-900/10'
       )}
       style={{
         width: data.width || 500,
@@ -11,10 +11,8 @@ import {
   extractPathFromOutputId,
   parseOutputContentSafely,
 } from '@/lib/response-format'
-import {
-  ChatMessage,
-  OutputSelect,
-} from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/chat/components'
+import { ChatMessage } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/chat/components/chat-message/chat-message'
+import { OutputSelect } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/chat/components/output-select/output-select'
 import { useWorkflowExecution } from '@/app/workspace/[workspaceId]/w/[workflowId]/hooks/use-workflow-execution'
 import type { BlockLog, ExecutionResult } from '@/executor/types'
 import { useExecutionStore } from '@/stores/execution/store'
@@ -32,6 +30,7 @@ interface ChatProps {
 
 export function Chat({ panelWidth, chatMessage, setChatMessage }: ChatProps) {
+  const { activeWorkflowId } = useWorkflowRegistry()
 
   const {
     messages,
     addMessage,
@@ -1,254 +0,0 @@
-'use client'
-
-import { type KeyboardEvent, useEffect, useMemo, useRef } from 'react'
-import { ArrowUp, X } from 'lucide-react'
-import { Button } from '@/components/ui/button'
-import { Input } from '@/components/ui/input'
-import { JSONView } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/console/components'
-import { useWorkflowExecution } from '@/app/workspace/[workspaceId]/w/[workflowId]/hooks/use-workflow-execution'
-import { useExecutionStore } from '@/stores/execution/store'
-import { useChatStore } from '@/stores/panel/chat/store'
-import type { ChatMessage as ChatMessageType } from '@/stores/panel/chat/types'
-import { useWorkflowRegistry } from '@/stores/workflows/registry/store'
-
-interface ChatMessageProps {
-  message: ChatMessageType
-}
-
-// ChatGPT-style message component specifically for modal
-function ModalChatMessage({ message }: ChatMessageProps) {
-  // Check if content is a JSON object
-  const isJsonObject = useMemo(() => {
-    return typeof message.content === 'object' && message.content !== null
-  }, [message.content])
-
-  // For user messages (on the right)
-  if (message.type === 'user') {
-    return (
-      <div className='px-4 py-5'>
-        <div className='mx-auto max-w-3xl'>
-          <div className='flex justify-end'>
-            <div className='max-w-[80%] rounded-3xl bg-[#F4F4F4] px-4 py-3 shadow-sm dark:bg-primary/10'>
-              <div className='whitespace-pre-wrap break-words text-[#0D0D0D] text-base leading-relaxed dark:text-white'>
-                {isJsonObject ? (
-                  <JSONView data={message.content} />
-                ) : (
-                  <span>{message.content}</span>
-                )}
-              </div>
-            </div>
-          </div>
-        </div>
-      </div>
-    )
-  }
-
-  // For assistant messages (on the left)
-  return (
-    <div className='px-4 py-5'>
-      <div className='mx-auto max-w-3xl'>
-        <div className='flex'>
-          <div className='max-w-[80%]'>
-            <div className='whitespace-pre-wrap break-words text-base leading-relaxed'>
-              {isJsonObject ? <JSONView data={message.content} /> : <span>{message.content}</span>}
-            </div>
-          </div>
-        </div>
-      </div>
-    </div>
-  )
-}
-
-interface ChatModalProps {
-  open: boolean
-  onOpenChange: (open: boolean) => void
-  chatMessage: string
-  setChatMessage: (message: string) => void
-}
-
-export function ChatModal({ open, onOpenChange, chatMessage, setChatMessage }: ChatModalProps) {
-  const messagesEndRef = useRef<HTMLDivElement>(null)
-  const messagesContainerRef = useRef<HTMLDivElement>(null)
-  const inputRef = useRef<HTMLInputElement>(null)
-
-  const { activeWorkflowId } = useWorkflowRegistry()
-  const { messages, addMessage, getConversationId } = useChatStore()
-
-  // Use the execution store state to track if a workflow is executing
-  const { isExecuting } = useExecutionStore()
-
-  // Get workflow execution functionality
-  const { handleRunWorkflow } = useWorkflowExecution()
-
-  // Get filtered messages for current workflow
-  const workflowMessages = useMemo(() => {
-    if (!activeWorkflowId) return []
-    return messages
-      .filter((msg) => msg.workflowId === activeWorkflowId)
-      .sort((a, b) => new Date(a.timestamp).getTime() - new Date(b.timestamp).getTime())
-  }, [messages, activeWorkflowId])
-
-  // Auto-scroll to bottom when new messages are added
-  useEffect(() => {
-    if (messagesEndRef.current) {
-      messagesEndRef.current.scrollIntoView({ behavior: 'smooth' })
-    }
-  }, [workflowMessages])
-
-  // Focus input when modal opens
-  useEffect(() => {
-    if (open && inputRef.current) {
-      inputRef.current.focus()
-    }
-  }, [open])
-
-  // Handle send message
-  const handleSendMessage = async () => {
-    if (!chatMessage.trim() || !activeWorkflowId || isExecuting) return
-
-    // Store the message being sent for reference
-    const sentMessage = chatMessage.trim()
-
-    // Get the conversationId for this workflow before adding the message
-    const conversationId = getConversationId(activeWorkflowId)
-
-    // Add user message
-    addMessage({
-      content: sentMessage,
-      workflowId: activeWorkflowId,
-      type: 'user',
-    })
-
-    // Clear input
-    setChatMessage('')
-
-    // Ensure input stays focused
-    if (inputRef.current) {
-      inputRef.current.focus()
-    }
-
-    // Execute the workflow to generate a response
-    await handleRunWorkflow({
-      input: sentMessage,
-      conversationId: conversationId,
-    })
-
-    // Ensure input stays focused even after response
-    if (inputRef.current) {
-      inputRef.current.focus()
-    }
-  }
-
-  // Handle key press
-  const handleKeyPress = (e: KeyboardEvent<HTMLInputElement>) => {
-    if (e.key === 'Enter' && !e.shiftKey) {
-      e.preventDefault()
-      handleSendMessage()
-    }
-  }
-
-  if (!open) return null
-
-  return (
-    <div className='fixed inset-0 z-[100] flex flex-col bg-background'>
-      <style jsx>{`
-        @keyframes growShrink {
-          0%,
-          100% {
-            transform: scale(0.9);
-          }
-          50% {
-            transform: scale(1.1);
-          }
-        }
-        .loading-dot {
-          animation: growShrink 1.5s infinite ease-in-out;
-        }
-      `}</style>
-
-      {/* Header with title and close button */}
-      <div className='flex items-center justify-between px-4 py-3'>
-        <h2 className='font-medium text-lg'>Chat</h2>
-        <Button
-          variant='ghost'
-          size='icon'
-          className='h-8 w-8 rounded-md hover:bg-accent/50'
-          onClick={() => onOpenChange(false)}
-        >
-          <X className='h-4 w-4' />
-          <span className='sr-only'>Close</span>
-        </Button>
-      </div>
-
-      {/* Messages container */}
-      <div ref={messagesContainerRef} className='flex-1 overflow-y-auto'>
-        <div className='mx-auto max-w-3xl'>
-          {workflowMessages.length === 0 ? (
-            <div className='flex h-full flex-col items-center justify-center px-4 py-10'>
-              <div className='space-y-2 text-center'>
-                <h3 className='font-medium text-lg'>How can I help you today?</h3>
-                <p className='text-muted-foreground text-sm'>
-                  Ask me anything about your workflow.
-                </p>
-              </div>
-            </div>
-          ) : (
-            workflowMessages.map((message) => (
-              <ModalChatMessage key={message.id} message={message} />
-            ))
-          )}
-
-          {/* Loading indicator (shows only when executing) */}
-          {isExecuting && (
-            <div className='px-4 py-5'>
-              <div className='mx-auto max-w-3xl'>
-                <div className='flex'>
-                  <div className='max-w-[80%]'>
-                    <div className='flex h-6 items-center'>
-                      <div className='loading-dot h-3 w-3 rounded-full bg-black dark:bg-black' />
-                    </div>
-                  </div>
-                </div>
-              </div>
-            </div>
-          )}
-
-          <div ref={messagesEndRef} className='h-1' />
-        </div>
-      </div>
-
-      {/* Input area (fixed at bottom) */}
-      <div className='bg-background p-4'>
-        <div className='mx-auto max-w-3xl'>
-          <div className='relative rounded-2xl border bg-background shadow-sm'>
-            <Input
-              ref={inputRef}
-              value={chatMessage}
-              onChange={(e) => setChatMessage(e.target.value)}
-              onKeyDown={handleKeyPress}
-              placeholder='Message...'
-              className='min-h-[50px] flex-1 rounded-2xl border-0 bg-transparent py-7 pr-16 pl-6 text-base focus-visible:ring-0 focus-visible:ring-offset-0'
-              disabled={!activeWorkflowId}
-            />
-            <Button
-              onClick={handleSendMessage}
-              size='icon'
-              disabled={!chatMessage.trim() || !activeWorkflowId || isExecuting}
-              className='-translate-y-1/2 absolute top-1/2 right-3 h-10 w-10 rounded-xl bg-black p-0 text-white hover:bg-gray-800 dark:bg-primary dark:hover:bg-primary/80'
-            >
-              <ArrowUp className='h-4 w-4 dark:text-black' />
-            </Button>
-          </div>
-
-          <div className='mt-2 text-center text-muted-foreground text-xs'>
-            <p>
-              {activeWorkflowId
-                ? 'Your messages will be processed by the active workflow'
-                : 'Select a workflow to start chatting'}
-            </p>
-          </div>
-        </div>
-      </div>
-    </div>
-  )
-}
@@ -1,3 +0,0 @@
-export { ChatMessage } from './chat-message/chat-message'
-export { ChatModal } from './chat-modal/chat-modal'
-export { OutputSelect } from './output-select/output-select'
@@ -3,6 +3,7 @@ import { Check, ChevronDown } from 'lucide-react'
 import { extractFieldsFromSchema, parseResponseFormatSafely } from '@/lib/response-format'
 import { cn } from '@/lib/utils'
 import { getBlock } from '@/blocks'
+import { useWorkflowDiffStore } from '@/stores/workflow-diff/store'
 import { useSubBlockStore } from '@/stores/workflows/subblock/store'
 import { useWorkflowStore } from '@/stores/workflows/workflow/store'
 
@@ -24,6 +25,15 @@ export function OutputSelect({
   const [isOutputDropdownOpen, setIsOutputDropdownOpen] = useState(false)
   const dropdownRef = useRef<HTMLDivElement>(null)
   const blocks = useWorkflowStore((state) => state.blocks)
+  const { isShowingDiff, isDiffReady, diffWorkflow } = useWorkflowDiffStore()
+
+  // Track subblock store state to ensure proper reactivity
+  const subBlockValues = useSubBlockStore((state) =>
+    workflowId ? state.workflowValues[workflowId] : null
+  )
+
+  // Use diff blocks when in diff mode AND diff is ready, otherwise use main blocks
+  const workflowBlocks = isShowingDiff && isDiffReady && diffWorkflow ? diffWorkflow.blocks : blocks
 
   // Get workflow outputs for the dropdown
   const workflowOutputs = useMemo(() => {
@@ -38,19 +48,42 @@ export function OutputSelect({
 
     if (!workflowId) return outputs
 
+    // Check if workflowBlocks is defined
+    if (!workflowBlocks || typeof workflowBlocks !== 'object') {
+      return outputs
+    }
+
+    // Check if we actually have blocks to process
+    const blockArray = Object.values(workflowBlocks)
+    if (blockArray.length === 0) {
+      return outputs
+    }
+
     // Process blocks to extract outputs
-    Object.values(blocks).forEach((block) => {
+    blockArray.forEach((block) => {
       // Skip starter/start blocks
       if (block.type === 'starter') return
 
+      // Add defensive check to ensure block exists and has required properties
+      if (!block || !block.id || !block.type) {
+        return
+      }
+
+      // Add defensive check to ensure block.name exists and is a string
+      const blockName =
+        block.name && typeof block.name === 'string'
+          ? block.name.replace(/\s+/g, '').toLowerCase()
+          : `block-${block.id}`
+
+      // Get block configuration from registry to get outputs
+      const blockConfig = getBlock(block.type)
+
-      // Check for custom response format first
-      const responseFormatValue = useSubBlockStore.getState().getValue(block.id, 'responseFormat')
+      // In diff mode, get value from diff blocks; otherwise use store
+      const responseFormatValue =
+        isShowingDiff && isDiffReady && diffWorkflow
+          ? diffWorkflow.blocks[block.id]?.subBlocks?.responseFormat?.value
+          : subBlockValues?.[block.id]?.responseFormat
       const responseFormat = parseResponseFormatSafely(responseFormatValue, block.id)
 
       let outputsToProcess: Record<string, any> = {}
@@ -64,12 +97,12 @@ export function OutputSelect({
             outputsToProcess[field.name] = { type: field.type }
           })
         } else {
-          // Fallback to default outputs if schema extraction failed
-          outputsToProcess = block.outputs || {}
+          // Fallback to block config outputs if schema extraction failed
+          outputsToProcess = blockConfig?.outputs || {}
         }
       } else {
-        // Use default block outputs
-        outputsToProcess = block.outputs || {}
+        // Use block config outputs instead of block.outputs
+        outputsToProcess = blockConfig?.outputs || {}
       }
 
       // Add response outputs
@@ -131,7 +164,7 @@ export function OutputSelect({
     })
 
     return outputs
-  }, [blocks, workflowId])
+  }, [workflowBlocks, workflowId, isShowingDiff, isDiffReady, diffWorkflow, blocks, subBlockValues])
 
   // Get selected outputs display text
   const selectedOutputsDisplayText = useMemo(() => {
@@ -1,322 +0,0 @@
-'use client'
-
-import { useEffect, useRef, useState } from 'react'
-import {
-  Bot,
-  ChevronDown,
-  History,
-  MessageSquarePlus,
-  MoreHorizontal,
-  Trash2,
-  X,
-} from 'lucide-react'
-import { Button } from '@/components/ui/button'
-import {
-  DropdownMenu,
-  DropdownMenuContent,
-  DropdownMenuItem,
-  DropdownMenuTrigger,
-} from '@/components/ui/dropdown-menu'
-import type { CopilotChat } from '@/lib/copilot/api'
-import { createLogger } from '@/lib/logs/console/logger'
-import {
-  CheckpointPanel,
-  CopilotWelcome,
-  ProfessionalInput,
-  ProfessionalMessage,
-} from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/copilot/components'
-import type { CopilotMessage } from '@/stores/copilot/types'
-
-const logger = createLogger('CopilotModal')
-
-interface CopilotModalProps {
-  open: boolean
-  onOpenChange: (open: boolean) => void
-  copilotMessage: string
-  setCopilotMessage: (message: string) => void
-  messages: CopilotMessage[]
-  onSendMessage: (message: string) => Promise<void>
-  isLoading: boolean
-  isLoadingChats: boolean
-  // Chat management props
-  chats: CopilotChat[]
-  currentChat: CopilotChat | null
-  onSelectChat: (chat: CopilotChat) => void
-  onStartNewChat: () => void
-  onDeleteChat: (chatId: string) => void
-  // Mode props
-  mode: 'ask' | 'agent'
-  onModeChange: (mode: 'ask' | 'agent') => void
-}
-
-export function CopilotModal({
-  open,
-  onOpenChange,
-  copilotMessage,
-  setCopilotMessage,
-  messages,
-  onSendMessage,
-  isLoading,
-  isLoadingChats,
-  chats,
-  currentChat,
-  onSelectChat,
-  onStartNewChat,
-  onDeleteChat,
-  mode,
-  onModeChange,
-}: CopilotModalProps) {
-  const messagesEndRef = useRef<HTMLDivElement>(null)
-  const messagesContainerRef = useRef<HTMLDivElement>(null)
-  const [isDropdownOpen, setIsDropdownOpen] = useState(false)
-  const [showCheckpoints, setShowCheckpoints] = useState(false)
-
-  // Fixed sidebar width for copilot modal positioning
-  const sidebarWidth = 240 // w-60 (sidebar width from staging)
-
-  // Auto-scroll to bottom when new messages are added
-  useEffect(() => {
-    if (messagesEndRef.current) {
-      messagesEndRef.current.scrollIntoView({ behavior: 'smooth' })
-    }
-  }, [messages])
-
-  if (!open) return null
-
-  return (
-    <div
-      className='fixed inset-y-0 right-0 z-[100] flex flex-col bg-background'
-      style={{ left: `${sidebarWidth}px` }}
-    >
-      <style jsx>{`
-        @keyframes growShrink {
-          0%,
-          100% {
-            transform: scale(0.9);
-          }
-          50% {
-            transform: scale(1.1);
-          }
-        }
-        .loading-dot {
-          animation: growShrink 1.5s infinite ease-in-out;
-        }
-      `}</style>
-
-      {/* Show loading state with centered pulsing agent icon */}
-      {isLoadingChats || isLoading ? (
-        <div className='flex h-full items-center justify-center'>
-          <div className='flex items-center justify-center'>
-            <Bot className='h-16 w-16 animate-pulse text-muted-foreground' />
-          </div>
-        </div>
-      ) : (
-        <>
-          {/* Close button in top right corner */}
-          <Button
-            variant='ghost'
-            size='icon'
-            className='absolute top-3 right-4 z-10 h-8 w-8 rounded-md hover:bg-accent/50'
-            onClick={() => onOpenChange(false)}
-          >
-            <X className='h-4 w-4' />
-            <span className='sr-only'>Close</span>
-          </Button>
-
-          {/* Header with chat title and management */}
-          <div className='border-b py-3'>
-            <div className='mx-auto flex w-full max-w-3xl items-center justify-between px-4'>
-              {/* Chat Title Dropdown */}
-              <DropdownMenu open={isDropdownOpen} onOpenChange={setIsDropdownOpen}>
-                <DropdownMenuTrigger asChild>
-                  <Button
-                    variant='ghost'
-                    className='h-8 max-w-[300px] justify-start px-3 hover:bg-accent/50'
-                  >
-                    <span className='truncate'>{currentChat?.title || 'New Chat'}</span>
-                    <ChevronDown className='ml-2 h-4 w-4 shrink-0' />
-                  </Button>
-                </DropdownMenuTrigger>
-                <DropdownMenuContent
-                  align='start'
-                  className='z-[110] w-72 border-border/50 bg-background/95 shadow-lg backdrop-blur-sm'
-                  sideOffset={8}
-                  onMouseLeave={() => setIsDropdownOpen(false)}
-                >
-                  {isLoadingChats ? (
-                    <div className='px-4 py-3 text-muted-foreground text-sm'>Loading chats...</div>
-                  ) : chats.length === 0 ? (
-                    <div className='px-4 py-3 text-muted-foreground text-sm'>No chats yet</div>
-                  ) : (
-                    // Sort chats by updated date (most recent first) for display
-                    [...chats]
-                      .sort(
-                        (a, b) => new Date(b.updatedAt).getTime() - new Date(a.updatedAt).getTime()
-                      )
-                      .map((chat) => (
-                        <div key={chat.id} className='group flex items-center gap-2 px-2 py-1'>
-                          <DropdownMenuItem asChild>
-                            <div
-                              onClick={() => {
-                                onSelectChat(chat)
-                                setIsDropdownOpen(false)
-                              }}
-                              className={`min-w-0 flex-1 cursor-pointer rounded-lg px-3 py-2.5 transition-all ${
-                                currentChat?.id === chat.id
-                                  ? 'bg-accent/80 text-accent-foreground'
-                                  : 'hover:bg-accent/40'
-                              }`}
-                            >
-                              <div className='min-w-0'>
-                                <div className='truncate font-medium text-sm leading-tight'>
-                                  {chat.title || 'Untitled Chat'}
-                                </div>
-                                <div className='mt-0.5 truncate text-muted-foreground text-xs'>
-                                  {new Date(chat.updatedAt).toLocaleDateString()} at{' '}
-                                  {new Date(chat.updatedAt).toLocaleTimeString([], {
-                                    hour: '2-digit',
-                                    minute: '2-digit',
-                                  })}{' '}
-                                  • {chat.messageCount}
-                                </div>
-                              </div>
-                            </div>
-                          </DropdownMenuItem>
-                          <DropdownMenu>
-                            <DropdownMenuTrigger asChild>
-                              <Button
-                                variant='ghost'
-                                size='sm'
-                                className='h-7 w-7 shrink-0 p-0 hover:bg-accent/60'
-                              >
-                                <MoreHorizontal className='h-3.5 w-3.5' />
-                              </Button>
-                            </DropdownMenuTrigger>
-                            <DropdownMenuContent
-                              align='end'
-                              className='z-[120] border-border/50 bg-background/95 shadow-lg backdrop-blur-sm'
-                            >
-                              <DropdownMenuItem
-                                onClick={() => onDeleteChat(chat.id)}
-                                className='cursor-pointer text-destructive hover:bg-destructive/10 hover:text-destructive focus:bg-destructive/10 focus:text-destructive'
-                              >
-                                <Trash2 className='mr-2 h-3.5 w-3.5' />
-                                Delete
-                              </DropdownMenuItem>
-                            </DropdownMenuContent>
-                          </DropdownMenu>
-                        </div>
-                      ))
-                  )}
-                </DropdownMenuContent>
-              </DropdownMenu>
-
-              {/* Right side action buttons */}
-              <div className='flex items-center gap-2'>
-                {/* Checkpoint Toggle Button */}
-                <Button
-                  variant='ghost'
-                  size='sm'
-                  onClick={() => setShowCheckpoints(!showCheckpoints)}
-                  className={`h-8 w-8 p-0 ${
-                    showCheckpoints
-                      ? 'bg-[#802FFF]/20 text-[#802FFF] hover:bg-[#802FFF]/30'
-                      : 'hover:bg-accent/50'
-                  }`}
-                  title='View Checkpoints'
-                >
-                  <History className='h-4 w-4' />
-                </Button>
-
-                {/* New Chat Button */}
-                <Button
-                  variant='ghost'
-                  size='sm'
-                  onClick={onStartNewChat}
-                  className='h-8 w-8 p-0'
-                  title='New Chat'
-                >
-                  <MessageSquarePlus className='h-4 w-4' />
-                </Button>
-              </div>
-            </div>
-          </div>
-
-          {/* Messages container or Checkpoint Panel */}
-          {showCheckpoints ? (
-            <div className='flex-1 overflow-hidden'>
-              <CheckpointPanel />
-            </div>
-          ) : (
-            <div ref={messagesContainerRef} className='flex-1 overflow-y-auto'>
-              <div className='mx-auto max-w-3xl'>
-                {messages.length === 0 ? (
-                  <CopilotWelcome onQuestionClick={onSendMessage} mode={mode} />
-                ) : (
-                  messages.map((message) => (
-                    <ProfessionalMessage
-                      key={message.id}
-                      message={message}
-                      isStreaming={isLoading && message.id === messages[messages.length - 1]?.id}
-                    />
-                  ))
-                )}
-
-                <div ref={messagesEndRef} className='h-1' />
-              </div>
-            </div>
-          )}
-
-          {/* Mode Selector and Input */}
-          {!showCheckpoints && (
-            <>
-              {/* Mode Selector */}
-              <div className='pt-6'>
-                <div className='mx-auto max-w-3xl px-4'>
-                  <div className='flex items-center gap-1 rounded-md border bg-muted/30 p-0.5'>
-                    <Button
-                      variant='ghost'
-                      size='sm'
-                      onClick={() => onModeChange('ask')}
-                      className={`h-6 flex-1 font-medium text-xs ${
-                        mode === 'ask'
-                          ? 'bg-[#802FFF]/20 text-[#802FFF] hover:bg-[#802FFF]/30'
-                          : 'hover:bg-muted/50'
-                      }`}
-                      title='Ask questions and get answers. Cannot edit workflows.'
-                    >
-                      Ask
-                    </Button>
-                    <Button
-                      variant='ghost'
-                      size='sm'
-                      onClick={() => onModeChange('agent')}
-                      className={`h-6 flex-1 font-medium text-xs ${
-                        mode === 'agent'
-                          ? 'bg-[#802FFF]/20 text-[#802FFF] hover:bg-[#802FFF]/30'
-                          : 'hover:bg-muted/50'
-                      }`}
-                      title='Full agent with workflow editing capabilities.'
-                    >
-                      Agent
-                    </Button>
-                  </div>
-                </div>
-              </div>
-
-              {/* Input area */}
-              <ProfessionalInput
-                onSubmit={async (message) => {
-                  await onSendMessage(message)
-                  setCopilotMessage('')
-                }}
-                disabled={false}
-                isLoading={isLoading}
-              />
-            </>
-          )}
-        </>
-      )}
-    </div>
-  )
-}
@@ -1,5 +0,0 @@
-export { CheckpointPanel } from './checkpoint-panel'
-export { CopilotModal } from './copilot-modal/copilot-modal'
-export { ProfessionalInput } from './professional-input/professional-input'
-export { ProfessionalMessage } from './professional-message/professional-message'
-export { CopilotWelcome } from './welcome/welcome'
@@ -0,0 +1,234 @@
+import React, { type HTMLAttributes, type ReactNode } from 'react'
+import { Copy } from 'lucide-react'
+import ReactMarkdown from 'react-markdown'
+import remarkGfm from 'remark-gfm'
+import { Button } from '@/components/ui/button'
+import { Tooltip, TooltipContent, TooltipTrigger } from '@/components/ui/tooltip'
+
+export function LinkWithPreview({ href, children }: { href: string; children: React.ReactNode }) {
+  return (
+    <Tooltip delayDuration={300}>
+      <TooltipTrigger asChild>
+        <a
+          href={href}
+          className='text-blue-600 hover:underline dark:text-blue-400'
+          target='_blank'
+          rel='noopener noreferrer'
+        >
+          {children}
+        </a>
+      </TooltipTrigger>
+      <TooltipContent side='top' align='center' sideOffset={5} className='max-w-sm p-3'>
+        <span className='truncate font-medium text-xs'>{href}</span>
+      </TooltipContent>
+    </Tooltip>
+  )
+}
+
+export default function CopilotMarkdownRenderer({
+  content,
+  customLinkComponent,
+}: {
+  content: string
+  customLinkComponent?: typeof LinkWithPreview
+}) {
+  const LinkComponent = customLinkComponent || LinkWithPreview
+
+  const customComponents = {
+    // Paragraph
+    p: ({ children }: React.HTMLAttributes<HTMLParagraphElement>) => (
+      <p className='mb-1 font-geist-sans text-base text-gray-800 leading-relaxed last:mb-0 dark:text-gray-200'>
+        {children}
+      </p>
+    ),
+
+    // Headings
+    h1: ({ children }: React.HTMLAttributes<HTMLHeadingElement>) => (
+      <h1 className='mt-10 mb-5 font-geist-sans font-semibold text-2xl text-gray-900 dark:text-gray-100'>
+        {children}
+      </h1>
+    ),
+    h2: ({ children }: React.HTMLAttributes<HTMLHeadingElement>) => (
+      <h2 className='mt-8 mb-4 font-geist-sans font-semibold text-gray-900 text-xl dark:text-gray-100'>
+        {children}
+      </h2>
+    ),
+    h3: ({ children }: React.HTMLAttributes<HTMLHeadingElement>) => (
+      <h3 className='mt-7 mb-3 font-geist-sans font-semibold text-gray-900 text-lg dark:text-gray-100'>
+        {children}
+      </h3>
+    ),
+    h4: ({ children }: React.HTMLAttributes<HTMLHeadingElement>) => (
+      <h4 className='mt-5 mb-2 font-geist-sans font-semibold text-base text-gray-900 dark:text-gray-100'>
+        {children}
+      </h4>
+    ),
+
+    // Lists
+    ul: ({ children }: React.HTMLAttributes<HTMLUListElement>) => (
+      <ul
+        className='mt-1 mb-1 space-y-1 pl-6 font-geist-sans text-gray-800 dark:text-gray-200'
+        style={{ listStyleType: 'disc' }}
+      >
+        {children}
+      </ul>
+    ),
+    ol: ({ children }: React.HTMLAttributes<HTMLOListElement>) => (
+      <ol
+        className='mt-1 mb-1 space-y-1 pl-6 font-geist-sans text-gray-800 dark:text-gray-200'
+        style={{ listStyleType: 'decimal' }}
+      >
+        {children}
+      </ol>
+    ),
+    li: ({
+      children,
+      ordered,
+      ...props
+    }: React.LiHTMLAttributes<HTMLLIElement> & { ordered?: boolean }) => (
+      <li
+        className='font-geist-sans text-gray-800 dark:text-gray-200'
+        style={{ display: 'list-item' }}
+      >
+        {children}
+      </li>
+    ),
+
+    // Code blocks
+    pre: ({ children }: HTMLAttributes<HTMLPreElement>) => {
+      let codeProps: HTMLAttributes<HTMLElement> = {}
+      let codeContent: ReactNode = children
+      let language = 'code'
+
+      if (
+        React.isValidElement<{ className?: string; children?: ReactNode }>(children) &&
+        children.type === 'code'
+      ) {
+        const childElement = children as React.ReactElement<{
+          className?: string
+          children?: ReactNode
+        }>
+        codeProps = { className: childElement.props.className }
+        codeContent = childElement.props.children
+        language = childElement.props.className?.replace('language-', '') || 'code'
+      }
+
+      return (
+        <div className='my-6 rounded-md bg-gray-900 text-sm dark:bg-black'>
+          <div className='flex items-center justify-between border-gray-700 border-b px-4 py-1.5 dark:border-gray-800'>
+            <span className='font-geist-sans text-gray-400 text-xs'>{language}</span>
+            <Button
+              variant='ghost'
+              size='sm'
+              className='h-4 w-4 p-0 opacity-70 hover:opacity-100'
+              onClick={() => {
+                if (typeof codeContent === 'string') {
+                  navigator.clipboard.writeText(codeContent)
+                }
+              }}
+            >
+              <Copy className='h-3 w-3 text-gray-400' />
+            </Button>
+          </div>
+          <pre className='overflow-x-auto p-4 font-mono text-gray-200 dark:text-gray-100'>
+            {codeContent}
+          </pre>
+        </div>
+      )
+    },
+
+    // Inline code
+    code: ({
+      inline,
+      className,
+      children,
+      ...props
+    }: React.HTMLAttributes<HTMLElement> & { className?: string; inline?: boolean }) => {
+      if (inline) {
+        return (
+          <code
+            className='rounded bg-gray-200 px-1 py-0.5 font-mono text-[0.9em] text-gray-800 dark:bg-gray-700 dark:text-gray-200'
|
||||
{...props}
|
||||
>
|
||||
{children}
|
||||
</code>
|
||||
)
|
||||
}
|
||||
return (
|
||||
<code className={className} {...props}>
|
||||
{children}
|
||||
</code>
|
||||
)
|
||||
},
|
||||
|
||||
// Blockquotes
|
||||
blockquote: ({ children }: React.HTMLAttributes<HTMLQuoteElement>) => (
|
||||
<blockquote className='my-4 border-gray-300 border-l-4 py-1 pl-4 font-geist-sans text-gray-700 italic dark:border-gray-600 dark:text-gray-300'>
|
||||
{children}
|
||||
</blockquote>
|
||||
),
|
||||
|
||||
// Horizontal rule
|
||||
hr: () => <hr className='my-8 border-gray-500/[.07] border-t dark:border-gray-400/[.07]' />,
|
||||
|
||||
// Links
|
||||
a: ({ href, children, ...props }: React.AnchorHTMLAttributes<HTMLAnchorElement>) => (
|
||||
<LinkComponent href={href || '#'} {...props}>
|
||||
{children}
|
||||
</LinkComponent>
|
||||
),
|
||||
|
||||
// Tables
|
||||
table: ({ children }: React.TableHTMLAttributes<HTMLTableElement>) => (
|
||||
<div className='my-4 w-full overflow-x-auto'>
|
||||
<table className='min-w-full table-auto border border-gray-300 font-geist-sans text-sm dark:border-gray-700'>
|
||||
{children}
|
||||
</table>
|
||||
</div>
|
||||
),
|
||||
thead: ({ children }: React.HTMLAttributes<HTMLTableSectionElement>) => (
|
||||
<thead className='bg-gray-100 text-left dark:bg-gray-800'>{children}</thead>
|
||||
),
|
||||
tbody: ({ children }: React.HTMLAttributes<HTMLTableSectionElement>) => (
|
||||
<tbody className='divide-y divide-gray-200 bg-white dark:divide-gray-700 dark:bg-gray-900'>
|
||||
{children}
|
||||
</tbody>
|
||||
),
|
||||
tr: ({ children }: React.HTMLAttributes<HTMLTableRowElement>) => (
|
||||
<tr className='border-gray-200 border-b transition-colors hover:bg-gray-50 dark:border-gray-700 dark:hover:bg-gray-800/60'>
|
||||
{children}
|
||||
</tr>
|
||||
),
|
||||
th: ({ children }: React.ThHTMLAttributes<HTMLTableCellElement>) => (
|
||||
<th className='border-gray-300 border-r px-4 py-2 font-medium text-gray-700 last:border-r-0 dark:border-gray-700 dark:text-gray-300'>
|
||||
{children}
|
||||
</th>
|
||||
),
|
||||
td: ({ children }: React.TdHTMLAttributes<HTMLTableCellElement>) => (
|
||||
<td className='break-words border-gray-300 border-r px-4 py-2 text-gray-800 last:border-r-0 dark:border-gray-700 dark:text-gray-200'>
|
||||
{children}
|
||||
</td>
|
||||
),
|
||||
|
||||
// Images
|
||||
img: ({ src, alt, ...props }: React.ImgHTMLAttributes<HTMLImageElement>) => (
|
||||
<img
|
||||
src={src}
|
||||
alt={alt || 'Image'}
|
||||
className='my-3 h-auto max-w-full rounded-md'
|
||||
{...props}
|
||||
/>
|
||||
),
|
||||
}
|
||||
|
||||
// Pre-process content to fix common issues
|
||||
const processedContent = content.trim()
|
||||
|
||||
return (
|
||||
<div className='space-y-4 break-words font-geist-sans text-[#0D0D0D] text-base leading-relaxed dark:text-gray-100'>
|
||||
<ReactMarkdown remarkPlugins={[remarkGfm]} components={customComponents}>
|
||||
{processedContent}
|
||||
</ReactMarkdown>
|
||||
</div>
|
||||
)
|
||||
}
|
||||
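The `pre` renderer above labels each code block by stripping the `language-` prefix that react-markdown puts on the child `<code>` element's className. That rule can be isolated as a tiny helper (a sketch for illustration; `extractLanguage` is not a name used in this PR):

```typescript
// Same fallback rule as the renderer: an empty or missing className
// yields the generic label 'code'.
function extractLanguage(className?: string): string {
  return className?.replace('language-', '') || 'code'
}
```

Using `||` rather than `??` means an empty-string className also falls back to `'code'`, matching the renderer's behavior.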
@@ -1,39 +1,71 @@
'use client'

import { type FC, type KeyboardEvent, useRef, useState } from 'react'
import { ArrowUp, Loader2 } from 'lucide-react'
import { type FC, type KeyboardEvent, useEffect, useRef, useState } from 'react'
import { ArrowUp, Loader2, MessageCircle, Package, X } from 'lucide-react'
import { Button } from '@/components/ui/button'
import { Textarea } from '@/components/ui/textarea'
import { cn } from '@/lib/utils'

interface ProfessionalInputProps {
  onSubmit: (message: string) => void
  onAbort?: () => void
  disabled?: boolean
  isLoading?: boolean
  isAborting?: boolean
  placeholder?: string
  className?: string
  mode?: 'ask' | 'agent'
  onModeChange?: (mode: 'ask' | 'agent') => void
  value?: string // Controlled value from outside
  onChange?: (value: string) => void // Callback when value changes
}

const ProfessionalInput: FC<ProfessionalInputProps> = ({
  onSubmit,
  onAbort,
  disabled = false,
  isLoading = false,
  isAborting = false,
  placeholder = 'How can I help you today?',
  className,
  mode = 'agent',
  onModeChange,
  value: controlledValue,
  onChange: onControlledChange,
}) => {
  const [message, setMessage] = useState('')
  const [internalMessage, setInternalMessage] = useState('')
  const textareaRef = useRef<HTMLTextAreaElement>(null)

  // Use controlled value if provided, otherwise use internal state
  const message = controlledValue !== undefined ? controlledValue : internalMessage
  const setMessage =
    controlledValue !== undefined ? onControlledChange || (() => {}) : setInternalMessage

  // Auto-resize textarea
  useEffect(() => {
    const textarea = textareaRef.current
    if (textarea) {
      textarea.style.height = 'auto'
      textarea.style.height = `${Math.min(textarea.scrollHeight, 120)}px` // Max height of 120px
    }
  }, [message])

  const handleSubmit = () => {
    const trimmedMessage = message.trim()
    if (!trimmedMessage || disabled || isLoading) return

    onSubmit(trimmedMessage)
    setMessage('')
    // Clear the message after submit
    if (controlledValue !== undefined) {
      onControlledChange?.('')
    } else {
      setInternalMessage('')
    }
  }

  // Reset textarea height
  if (textareaRef.current) {
    textareaRef.current.style.height = 'auto'
  const handleAbort = () => {
    if (onAbort && isLoading) {
      onAbort()
    }
  }

@@ -45,50 +77,90 @@ const ProfessionalInput: FC<ProfessionalInputProps> = ({
  }

  const handleInputChange = (e: React.ChangeEvent<HTMLTextAreaElement>) => {
    setMessage(e.target.value)

    // Auto-resize textarea
    if (textareaRef.current) {
      textareaRef.current.style.height = 'auto'
      textareaRef.current.style.height = `${Math.min(textareaRef.current.scrollHeight, 120)}px`
    const newValue = e.target.value
    if (controlledValue !== undefined) {
      onControlledChange?.(newValue)
    } else {
      setInternalMessage(newValue)
    }
  }

  const canSubmit = message.trim().length > 0 && !disabled && !isLoading
  const showAbortButton = isLoading && onAbort

  const handleModeToggle = () => {
    if (onModeChange) {
      onModeChange(mode === 'ask' ? 'agent' : 'ask')
    }
  }

  const getModeIcon = () => {
    return mode === 'ask' ? (
      <MessageCircle className='h-3 w-3 text-muted-foreground' />
    ) : (
      <Package className='h-3 w-3 text-muted-foreground' />
    )
  }

  return (
    <div className={cn('w-full max-w-full overflow-hidden bg-background p-4', className)}>
      <div className='mx-auto w-full max-w-3xl'>
        <div className='relative w-full max-w-full'>
          <div className='relative flex w-full max-w-full items-end rounded-2xl border border-border bg-background shadow-sm transition-all focus-within:border-primary focus-within:ring-1 focus-within:ring-primary'>
            <Textarea
              ref={textareaRef}
              value={message}
              onChange={handleInputChange}
              onKeyDown={handleKeyDown}
              placeholder={placeholder}
              disabled={disabled || isLoading}
              className='max-h-[120px] min-h-[50px] w-full max-w-full resize-none border-0 bg-transparent px-4 py-3 pr-12 text-sm placeholder:text-muted-foreground focus-visible:ring-0 focus-visible:ring-offset-0'
              rows={1}
            />
    <div className={cn('relative flex-none pb-4', className)}>
      <div className='rounded-[8px] border border-[#E5E5E5] bg-[#FFFFFF] p-2 shadow-xs dark:border-[#414141] dark:bg-[#202020]'>
        {/* Textarea Field */}
        <Textarea
          ref={textareaRef}
          value={message}
          onChange={handleInputChange}
          onKeyDown={handleKeyDown}
          placeholder={placeholder}
          disabled={disabled}
          rows={1}
          className='mb-2 min-h-[32px] w-full resize-none overflow-hidden border-0 bg-transparent px-[2px] py-1 text-muted-foreground focus-visible:ring-0 focus-visible:ring-offset-0'
          style={{ height: 'auto' }}
        />

        {/* Bottom Row: Mode Selector + Send Button */}
        <div className='flex items-center justify-between'>
          {/* Mode Selector Tag */}
          <Button
            variant='ghost'
            size='sm'
            onClick={handleModeToggle}
            disabled={!onModeChange}
            className='flex h-6 items-center gap-1.5 rounded-full bg-secondary px-2 py-1 font-medium text-secondary-foreground text-xs hover:bg-secondary/80'
          >
            {getModeIcon()}
            <span className='capitalize'>{mode}</span>
          </Button>

          {/* Send Button */}
          {showAbortButton ? (
            <Button
              onClick={handleAbort}
              disabled={isAborting}
              size='icon'
              className='h-6 w-6 rounded-full bg-red-500 text-white transition-all duration-200 hover:bg-red-600'
              title='Stop generation'
            >
              {isAborting ? (
                <Loader2 className='h-3 w-3 animate-spin' />
              ) : (
                <X className='h-3 w-3' />
              )}
            </Button>
          ) : (
            <Button
              onClick={handleSubmit}
              disabled={!canSubmit}
              size='icon'
              className={cn(
                'absolute right-2 bottom-2 h-8 w-8 rounded-xl transition-all',
                canSubmit
                  ? 'bg-[#802FFF] text-white shadow-sm hover:bg-[#7028E6]'
                  : 'cursor-not-allowed bg-muted text-muted-foreground'
              )}
              className='h-6 w-6 rounded-full bg-[#802FFF] text-white shadow-[0_0_0_0_#802FFF] transition-all duration-200 hover:bg-[#7028E6] hover:shadow-[0_0_0_4px_rgba(127,47,255,0.15)]'
            >
              {isLoading ? (
                <Loader2 className='h-4 w-4 animate-spin' />
                <Loader2 className='h-3 w-3 animate-spin' />
              ) : (
                <ArrowUp className='h-4 w-4' />
                <ArrowUp className='h-3 w-3' />
              )}
            </Button>
          </div>
        )}
      </div>
    </div>
  </div>
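The rewritten `ProfessionalInput` above adopts a controlled-with-fallback value: the `value` prop wins whenever the parent supplies one, otherwise the component's internal state is used. Reduced to a plain function for illustration (the names `ValueState` and `currentMessage` are hypothetical, not from this PR), the read side of the pattern is:

```typescript
// Hypothetical helper mirroring the diff's rule:
// `controlledValue !== undefined ? controlledValue : internalMessage`.
interface ValueState {
  controlledValue?: string // set only when the parent controls the input
  internal: string // component-local fallback state
}

function currentMessage(s: ValueState): string {
  return s.controlledValue !== undefined ? s.controlledValue : s.internal
}
```

Note that an explicit empty string from the parent still counts as controlled; only `undefined` falls back to internal state.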
File diff suppressed because it is too large
@@ -1,322 +1,227 @@
'use client'

import { forwardRef, useCallback, useEffect, useImperativeHandle, useRef, useState } from 'react'
import { Bot, ChevronDown, History, MessageSquarePlus, MoreHorizontal, Trash2 } from 'lucide-react'
import {
  Button,
  DropdownMenu,
  DropdownMenuContent,
  DropdownMenuItem,
  DropdownMenuTrigger,
  ScrollArea,
} from '@/components/ui'
import { LoadingAgent } from '@/components/ui/loading-agent'
import { ScrollArea } from '@/components/ui/scroll-area'
import { createLogger } from '@/lib/logs/console/logger'
import {
  CheckpointPanel,
  CopilotModal,
  CopilotWelcome,
  ProfessionalInput,
  ProfessionalMessage,
} from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/copilot/components'
import { COPILOT_TOOL_IDS } from '@/stores/copilot/constants'
import { usePreviewStore } from '@/stores/copilot/preview-store'
import { useCopilotStore } from '@/stores/copilot/store'
import { useWorkflowRegistry } from '@/stores/workflows/registry/store'
import { CheckpointPanel } from './components/checkpoint-panel'
import { ProfessionalInput } from './components/professional-input/professional-input'
import { ProfessionalMessage } from './components/professional-message/professional-message'
import { CopilotWelcome } from './components/welcome/welcome'

const logger = createLogger('Copilot')

interface CopilotProps {
  panelWidth: number
  isFullscreen?: boolean
  onFullscreenToggle?: (fullscreen: boolean) => void
  fullscreenInput?: string
  onFullscreenInputChange?: (input: string) => void
}

interface CopilotRef {
  clearMessages: () => void
  startNewChat: () => void
  createNewChat: () => void
}

export const Copilot = forwardRef<CopilotRef, CopilotProps>(
  (
    {
      panelWidth,
      isFullscreen = false,
      onFullscreenToggle,
      fullscreenInput = '',
      onFullscreenInputChange,
export const Copilot = forwardRef<CopilotRef, CopilotProps>(({ panelWidth }, ref) => {
  const scrollAreaRef = useRef<HTMLDivElement>(null)
  const [showCheckpoints, setShowCheckpoints] = useState(false)
  const scannedChatRef = useRef<string | null>(null)
  const [isInitialized, setIsInitialized] = useState(false)
  const lastWorkflowIdRef = useRef<string | null>(null)
  const hasMountedRef = useRef(false)

  const { activeWorkflowId } = useWorkflowRegistry()

  // Use preview store to track seen previews
  const { scanAndMarkExistingPreviews, isToolCallSeen, markToolCallAsSeen } = usePreviewStore()

  // Use the new copilot store
  const {
    messages,
    isLoading,
    isLoadingChats,
    isSendingMessage,
    isAborting,
    mode,
    inputValue,
    sendMessage,
    abortMessage,
    createNewChat,
    clearMessages,
    setMode,
    setInputValue,
    chatsLoadedForWorkflow,
    setWorkflowId: setCopilotWorkflowId,
    loadChats,
  } = useCopilotStore()

  // Force fresh initialization on mount (handles hot reload)
  useEffect(() => {
    if (activeWorkflowId && !hasMountedRef.current) {
      hasMountedRef.current = true
      // Reset state to ensure fresh load, especially important for hot reload
      setIsInitialized(false)
      lastWorkflowIdRef.current = null

      // Force reload chats for current workflow
      setCopilotWorkflowId(activeWorkflowId)
      loadChats(true) // Force refresh
    }
  }, [activeWorkflowId, setCopilotWorkflowId, loadChats])

  // Initialize the component - only on mount and genuine workflow changes
  useEffect(() => {
    // If workflow actually changed (not initial mount), reset initialization
    if (
      activeWorkflowId &&
      activeWorkflowId !== lastWorkflowIdRef.current &&
      hasMountedRef.current
    ) {
      setIsInitialized(false)
      lastWorkflowIdRef.current = activeWorkflowId
    }

    // Set as initialized once we have the workflow and chats are ready
    if (
      activeWorkflowId &&
      !isLoadingChats &&
      chatsLoadedForWorkflow === activeWorkflowId &&
      !isInitialized
    ) {
      setIsInitialized(true)
    }
  }, [activeWorkflowId, isLoadingChats, chatsLoadedForWorkflow, isInitialized])

  // Clear any existing preview when component mounts or workflow changes
  useEffect(() => {
    // Preview clearing is now handled automatically by the copilot store
  }, [activeWorkflowId])

  // Auto-scroll to bottom when new messages are added
  useEffect(() => {
    if (scrollAreaRef.current) {
      const scrollContainer = scrollAreaRef.current.querySelector(
        '[data-radix-scroll-area-viewport]'
      )
      if (scrollContainer) {
        scrollContainer.scrollTop = scrollContainer.scrollHeight
      }
    }
  }, [messages])

  // Auto-scroll to bottom when chat loads in
  useEffect(() => {
    if (isInitialized && messages.length > 0 && scrollAreaRef.current) {
      const scrollContainer = scrollAreaRef.current.querySelector(
        '[data-radix-scroll-area-viewport]'
      )
      if (scrollContainer) {
        scrollContainer.scrollTop = scrollContainer.scrollHeight
      }
    }
  }, [isInitialized, messages.length])

  // Cleanup on component unmount (page refresh, navigation, etc.)
  useEffect(() => {
    return () => {
      // Abort any active message streaming and terminate active tools
      if (isSendingMessage) {
        abortMessage()
        logger.info('Aborted active message streaming due to component unmount')
      }
    }
  }, [isSendingMessage, abortMessage])

  // Watch for completed preview_workflow tool calls in the new format
  useEffect(() => {
    if (!messages.length) return

    const lastMessage = messages[messages.length - 1]
    if (lastMessage.role !== 'assistant' || !lastMessage.toolCalls) return

    // Check for completed preview_workflow tool calls
    const previewToolCall = lastMessage.toolCalls.find(
      (tc) =>
        tc.name === COPILOT_TOOL_IDS.BUILD_WORKFLOW &&
        tc.state === 'completed' &&
        !isToolCallSeen(tc.id)
    )

    if (previewToolCall?.result) {
      logger.info('Preview workflow completed via native SSE - handling result')
      // Mark as seen to prevent duplicate processing
      markToolCallAsSeen(previewToolCall.id)
      // Tool call handling logic would go here if needed
    }
  }, [messages, isToolCallSeen, markToolCallAsSeen])

  // Handle new chat creation
  const handleStartNewChat = useCallback(() => {
    // Preview clearing is now handled automatically by the copilot store
    createNewChat()
    logger.info('Started new chat')
  }, [createNewChat])

  // Expose functions to parent
  useImperativeHandle(
    ref,
    () => ({
      createNewChat: handleStartNewChat,
    }),
    [handleStartNewChat]
  )

  // Handle message submission
  const handleSubmit = useCallback(
    async (query: string) => {
      if (!query || isSendingMessage || !activeWorkflowId) return

      try {
        await sendMessage(query, { stream: true })
        logger.info('Sent message:', query)
      } catch (error) {
        logger.error('Failed to send message:', error)
      }
    },
    ref
  ) => {
    const scrollAreaRef = useRef<HTMLDivElement>(null)
    const [isDropdownOpen, setIsDropdownOpen] = useState(false)
    const [showCheckpoints, setShowCheckpoints] = useState(false)
    [isSendingMessage, activeWorkflowId, sendMessage]
  )

  const { activeWorkflowId } = useWorkflowRegistry()
  // Handle modal message sending
  const handleModalSendMessage = useCallback(
    async (message: string) => {
      await handleSubmit(message)
    },
    [handleSubmit]
  )

  // Use the new copilot store
  const {
    currentChat,
    chats,
    messages,
    isLoading,
    isLoadingChats,
    isSendingMessage,
    error,
    workflowId,
    mode,
    setWorkflowId,
    validateCurrentChat,
    selectChat,
    createNewChat,
    deleteChat,
    sendMessage,
    clearMessages,
    clearError,
    setMode,
  } = useCopilotStore()

  // Sync workflow ID with store
  useEffect(() => {
    if (activeWorkflowId !== workflowId) {
      setWorkflowId(activeWorkflowId)
    }
  }, [activeWorkflowId, workflowId, setWorkflowId])

  // Safety check: Clear any chat that doesn't belong to current workflow
  useEffect(() => {
    if (activeWorkflowId && workflowId === activeWorkflowId) {
      // Validate that current chat belongs to this workflow
      validateCurrentChat()
    }
  }, [currentChat, chats, activeWorkflowId, workflowId, validateCurrentChat])

  // Auto-scroll to bottom when new messages are added
  useEffect(() => {
    if (scrollAreaRef.current) {
      const scrollContainer = scrollAreaRef.current.querySelector(
        '[data-radix-scroll-area-viewport]'
      )
      if (scrollContainer) {
        scrollContainer.scrollTop = scrollContainer.scrollHeight
      }
    }
  }, [messages])

  // Handle chat deletion
  const handleDeleteChat = useCallback(
    async (chatId: string) => {
      try {
        await deleteChat(chatId)
        logger.info('Chat deleted successfully')
      } catch (error) {
        logger.error('Error deleting chat:', error)
      }
    },
    [deleteChat]
  )

  // Handle new chat creation
  const handleStartNewChat = useCallback(() => {
    clearMessages()
    logger.info('Started new chat')
  }, [clearMessages])

  // Expose functions to parent
  useImperativeHandle(
    ref,
    () => ({
      clearMessages: handleStartNewChat,
      startNewChat: handleStartNewChat,
    }),
    [handleStartNewChat]
  )

  // Handle message submission
  const handleSubmit = useCallback(
    async (query: string) => {
      if (!query || isSendingMessage || !activeWorkflowId) return

      try {
        await sendMessage(query, { stream: true })
        logger.info('Sent message:', query)
      } catch (error) {
        logger.error('Failed to send message:', error)
      }
    },
    [isSendingMessage, activeWorkflowId, sendMessage]
  )

  // Handle modal message sending
  const handleModalSendMessage = useCallback(
    async (message: string) => {
      await handleSubmit(message)
    },
    [handleSubmit]
  )

  return (
    <>
      <div
        className='flex h-full max-w-full flex-col overflow-hidden'
        style={{ width: `${panelWidth}px`, maxWidth: `${panelWidth}px` }}
      >
        {/* Show loading state with centered pulsing agent icon */}
        {isLoadingChats || isLoading ? (
          <div className='flex h-full items-center justify-center'>
            <div className='flex items-center justify-center'>
              <Bot className='h-16 w-16 animate-pulse text-muted-foreground' />
            </div>
  return (
    <>
      <div className='flex h-full flex-col overflow-hidden'>
        {/* Show loading state until fully initialized */}
        {!isInitialized ? (
          <div className='flex h-full w-full items-center justify-center'>
            <div className='flex flex-col items-center gap-3'>
              <LoadingAgent size='md' />
              <p className='text-muted-foreground text-sm'>Loading chat history...</p>
            </div>
          ) : (
            <>
              {/* Header with Chat Title and Management */}
              <div className='border-b p-4'>
                <div className='flex items-center justify-between'>
                  {/* Chat Title Dropdown */}
                  <DropdownMenu open={isDropdownOpen} onOpenChange={setIsDropdownOpen}>
                    <DropdownMenuTrigger asChild>
                      <Button
                        variant='ghost'
                        className='h-8 min-w-0 flex-1 justify-start px-3 hover:bg-accent/50'
                      >
                        <span className='truncate'>
                          {/* Only show chat title if we have verified workflow match */}
                          {currentChat &&
                          workflowId === activeWorkflowId &&
                          chats.some((chat) => chat.id === currentChat.id)
                            ? currentChat.title || 'New Chat'
                            : 'New Chat'}
                        </span>
                        <ChevronDown className='ml-2 h-4 w-4 shrink-0' />
                      </Button>
                    </DropdownMenuTrigger>
                    <DropdownMenuContent
                      align='start'
                      className='z-[110] w-72 border-border/50 bg-background/95 shadow-lg backdrop-blur-sm'
                      sideOffset={8}
                      onMouseLeave={() => setIsDropdownOpen(false)}
                    >
                      {isLoadingChats ? (
                        <div className='px-4 py-3 text-muted-foreground text-sm'>
                          Loading chats...
                        </div>
                      ) : chats.length === 0 ? (
                        <div className='px-4 py-3 text-muted-foreground text-sm'>No chats yet</div>
                      ) : (
                        // Sort chats by updated date (most recent first) for display
                        [...chats]
                          .sort(
                            (a, b) =>
                              new Date(b.updatedAt).getTime() - new Date(a.updatedAt).getTime()
                          )
                          .map((chat) => (
                            <div key={chat.id} className='group flex items-center gap-2 px-2 py-1'>
                              <DropdownMenuItem asChild>
                                <div
                                  onClick={() => {
                                    selectChat(chat)
                                    setIsDropdownOpen(false)
                                  }}
                                  className={`min-w-0 flex-1 cursor-pointer rounded-lg px-3 py-2.5 transition-all ${
                                    currentChat?.id === chat.id
                                      ? 'bg-accent/80 text-accent-foreground'
                                      : 'hover:bg-accent/40'
                                  }`}
                                >
                                  <div className='min-w-0'>
                                    <div className='truncate font-medium text-sm leading-tight'>
                                      {chat.title || 'Untitled Chat'}
                                    </div>
                                    <div className='mt-0.5 truncate text-muted-foreground text-xs'>
                                      {new Date(chat.updatedAt).toLocaleDateString()} at{' '}
                                      {new Date(chat.updatedAt).toLocaleTimeString([], {
                                        hour: '2-digit',
                                        minute: '2-digit',
                                      })}{' '}
                                      • {chat.messageCount}
                                    </div>
                                  </div>
                                </div>
                              </DropdownMenuItem>
                              <DropdownMenu>
                                <DropdownMenuTrigger asChild>
                                  <Button
                                    variant='ghost'
                                    size='sm'
                                    className='h-7 w-7 shrink-0 p-0 hover:bg-accent/60'
                                  >
                                    <MoreHorizontal className='h-3.5 w-3.5' />
                                  </Button>
                                </DropdownMenuTrigger>
                                <DropdownMenuContent
                                  align='end'
                                  className='z-[120] border-border/50 bg-background/95 shadow-lg backdrop-blur-sm'
                                >
                                  <DropdownMenuItem
                                    onClick={() => handleDeleteChat(chat.id)}
                                    className='cursor-pointer text-destructive hover:bg-destructive/10 hover:text-destructive focus:bg-destructive/10 focus:text-destructive'
                                  >
                                    <Trash2 className='mr-2 h-3.5 w-3.5' />
                                    Delete
                                  </DropdownMenuItem>
                                </DropdownMenuContent>
                              </DropdownMenu>
                            </div>
                          ))
                      )}
                    </DropdownMenuContent>
                  </DropdownMenu>

                  {/* Checkpoint Toggle Button */}
                  <Button
                    variant='ghost'
                    size='sm'
                    onClick={() => setShowCheckpoints(!showCheckpoints)}
                    className={`h-8 w-8 p-0 ${
                      showCheckpoints
                        ? 'bg-[#802FFF]/20 text-[#802FFF] hover:bg-[#802FFF]/30'
                        : 'hover:bg-accent/50'
                    }`}
                    title='View Checkpoints'
                  >
                    <History className='h-4 w-4' />
                  </Button>

                  {/* New Chat Button */}
                  <Button
                    variant='ghost'
                    size='sm'
                    onClick={handleStartNewChat}
                    className='h-8 w-8 p-0'
                    title='New Chat'
                  >
                    <MessageSquarePlus className='h-4 w-4' />
                  </Button>
                </div>

                {/* Error display */}
                {error && (
                  <div className='mt-2 rounded-md bg-destructive/10 p-2 text-destructive text-sm'>
                    {error}
                    <Button
                      variant='ghost'
                      size='sm'
                      onClick={clearError}
                      className='ml-2 h-auto p-1 text-destructive'
                    >
                      Dismiss
                    </Button>
                  </div>
                )}
              </div>

              {/* Messages area or Checkpoint Panel */}
              {showCheckpoints ? (
                <CheckpointPanel />
              ) : (
                <ScrollArea ref={scrollAreaRef} className='max-w-full flex-1 overflow-hidden'>
          </div>
        ) : (
          <>
            {/* Messages area or Checkpoint Panel */}
            {showCheckpoints ? (
              <CheckpointPanel />
            ) : (
              <ScrollArea
                ref={scrollAreaRef}
                className='flex-1 overflow-hidden'
                hideScrollbar={true}
              >
                <div className='space-y-1'>
                  {messages.length === 0 ? (
                    <CopilotWelcome onQuestionClick={handleSubmit} mode={mode} />
                    <div className='flex h-full items-center justify-center p-4'>
                      <CopilotWelcome onQuestionClick={handleSubmit} mode={mode} />
                    </div>
                  ) : (
                    messages.map((message) => (
                      <ProfessionalMessage
@@ -328,77 +233,29 @@ export const Copilot = forwardRef<CopilotRef, CopilotProps>(
                      />
                    ))
                  )}
                </ScrollArea>
              )}
                </div>
              </ScrollArea>
            )}

            {/* Mode Selector and Input */}
            {!showCheckpoints && (
              <>
                {/* Mode Selector */}
                <div className='border-t px-4 pt-2 pb-1'>
                  <div className='flex items-center gap-1 rounded-md border bg-muted/30 p-0.5'>
                    <Button
                      variant='ghost'
                      size='sm'
                      onClick={() => setMode('ask')}
                      className={`h-6 flex-1 font-medium text-xs ${
                        mode === 'ask'
                          ? 'bg-[#802FFF]/20 text-[#802FFF] hover:bg-[#802FFF]/30'
                          : 'hover:bg-muted/50'
                      }`}
                      title='Ask questions and get answers. Cannot edit workflows.'
                    >
                      Ask
                    </Button>
                    <Button
                      variant='ghost'
                      size='sm'
                      onClick={() => setMode('agent')}
                      className={`h-6 flex-1 font-medium text-xs ${
                        mode === 'agent'
                          ? 'bg-[#802FFF]/20 text-[#802FFF] hover:bg-[#802FFF]/30'
                          : 'hover:bg-muted/50'
                      }`}
                      title='Full agent with workflow editing capabilities.'
                    >
                      Agent
                    </Button>
                  </div>
                </div>

                {/* Input area */}
                <ProfessionalInput
                  onSubmit={handleSubmit}
                  disabled={!activeWorkflowId}
                  isLoading={isSendingMessage}
                />
              </>
            )}
          </>
        )}
      </div>

      {/* Fullscreen Modal */}
      <CopilotModal
        open={isFullscreen}
        onOpenChange={(open) => onFullscreenToggle?.(open)}
        copilotMessage={fullscreenInput}
        setCopilotMessage={(message) => onFullscreenInputChange?.(message)}
        messages={messages}
        onSendMessage={handleModalSendMessage}
        isLoading={isSendingMessage}
        isLoadingChats={isLoadingChats}
        chats={chats}
        currentChat={currentChat}
        onSelectChat={selectChat}
        onStartNewChat={handleStartNewChat}
        onDeleteChat={handleDeleteChat}
        mode={mode}
        onModeChange={setMode}
      />
    </>
  )
}
)
            {/* Input area with integrated mode selector */}
            {!showCheckpoints && (
              <ProfessionalInput
                onSubmit={handleSubmit}
                onAbort={abortMessage}
                disabled={!activeWorkflowId}
                isLoading={isSendingMessage}
                isAborting={isAborting}
                mode={mode}
                onModeChange={setMode}
                value={inputValue}
                onChange={setInputValue}
              />
            )}
          </>
        )}
      </div>
    </>
  )
})

Copilot.displayName = 'Copilot'

@@ -0,0 +1,144 @@
|
||||
/**
|
||||
* Base class for all copilot tools
|
||||
*/
|
||||
|
||||
import type {
|
||||
CopilotToolCall,
|
||||
Tool,
|
||||
ToolConfirmResponse,
|
||||
ToolExecuteResult,
|
||||
ToolExecutionOptions,
|
||||
ToolMetadata,
|
||||
ToolState,
|
||||
} from './types'
|
||||
|
||||
export abstract class BaseTool implements Tool {
|
||||
// Static property for tool ID - must be overridden by each tool
|
||||
static readonly id: string
|
||||
|
||||
// Instance property for metadata
|
||||
abstract metadata: ToolMetadata
|
||||
|
||||
/**
|
||||
* Notify the backend about the tool state change
|
||||
*/
|
||||
protected async notify(
|
||||
toolCallId: string,
|
||||
state: ToolState,
|
||||
message?: string
|
||||
): Promise<ToolConfirmResponse> {
|
||||
try {
|
||||
// Map ToolState to NotificationStatus for API
|
||||
const notificationStatus = state === 'errored' ? 'error' : state
|
||||
|
||||
const response = await fetch('/api/copilot/confirm', {
|
||||
method: 'POST',
|
||||
headers: {
|
||||
'Content-Type': 'application/json',
|
||||
},
|
||||
body: JSON.stringify({
|
||||
toolCallId,
|
||||
status: notificationStatus,
|
||||
message,
|
||||
}),
|
||||
})
|
||||
|
||||
if (!response.ok) {
|
||||
const error = await response.json()
|
||||
console.error(`Failed to confirm tool ${toolCallId}:`, error)
|
||||
return { success: false, message: error.error || 'Failed to confirm tool' }
|
||||
}
|
||||
|
||||
const result = await response.json()
|
||||
return { success: true, message: result.message }
|
||||
} catch (error) {
|
||||
console.error('Error confirming tool:', error)
|
||||
return { success: false, message: error instanceof Error ? error.message : 'Unknown error' }
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Execute the tool - must be implemented by each tool
|
||||
*/
|
||||
abstract execute(
|
||||
toolCall: CopilotToolCall,
|
||||
options?: ToolExecutionOptions
|
||||
): Promise<ToolExecuteResult>
|
||||
|
||||
/**
|
||||
* Get the display name for the current state
|
||||
*/
|
||||
getDisplayName(toolCall: CopilotToolCall): string {
|
||||
const { state, parameters = {} } = toolCall
|
||||
const { displayConfig } = this.metadata
|
||||
|
||||
// First try dynamic display name if available
|
||||
if (displayConfig.getDynamicDisplayName) {
|
||||
const dynamicName = displayConfig.getDynamicDisplayName(state, parameters)
|
||||
if (dynamicName) return dynamicName
|
||||
}
|
||||
|
||||
// Then try state-specific display name
|
||||
const stateConfig = displayConfig.states[state]
|
||||
if (stateConfig?.displayName) {
|
||||
return stateConfig.displayName
|
||||
}
|
||||
|
||||
// Fallback to a generic state name
|
||||
return `${this.metadata.id} (${state})`
|
||||
}
|
||||
|
||||
/**
|
||||
* Get the icon for the current state
|
||||
*/
|
||||
getIcon(toolCall: CopilotToolCall): string {
|
||||
const { state } = toolCall
|
||||
const stateConfig = this.metadata.displayConfig.states[state]
|
||||
|
||||
// Return state-specific icon or default
|
||||
return stateConfig?.icon || 'default'
|
||||
}
|
||||
|
||||
/**
|
||||
* Check if tool requires confirmation in current state
|
||||
*/
|
||||
requiresConfirmation(toolCall: CopilotToolCall): boolean {
|
||||
// Only show confirmation UI if tool requires interrupt and is in pending state
|
||||
return this.metadata.requiresInterrupt && toolCall.state === 'pending'
|
||||
}
|
||||
|
||||
/**
|
||||
* Handle user action (run/skip/background)
|
||||
*/
|
||||
async handleUserAction(
|
||||
toolCall: CopilotToolCall,
|
||||
action: 'run' | 'skip' | 'background',
|
||||
options?: ToolExecutionOptions
|
||||
): Promise<void> {
|
||||
// Map actions to states
|
||||
const actionToState: Record<string, ToolState> = {
|
||||
run: 'executing', // run maps directly to executing (no intermediate 'accepted' state)
|
||||
skip: 'rejected',
|
||||
background: 'background',
|
||||
}
|
||||
|
||||
const newState = actionToState[action]
|
||||
|
||||
// Update state locally
|
||||
options?.onStateChange?.(newState)
|
||||
|
||||
// Special handling for run action
|
||||
if (action === 'run') {
|
||||
// Directly call execute method - no wrapper
|
||||
await this.execute(toolCall, options)
|
||||
} else {
|
||||
// For skip/background, just notify
|
||||
const message =
|
||||
action === 'skip'
|
||||
? this.getDisplayName({ ...toolCall, state: 'rejected' })
|
||||
: 'The user moved execution to the background'
|
||||
|
||||
await this.notify(toolCall.id, newState, message)
|
||||
}
|
||||
}
|
||||
}
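The display-name resolution order that `getDisplayName` implements (dynamic name, then state-specific config, then a generic fallback) can be sketched standalone. The type stubs and the `example_tool` config below are hypothetical illustrations, not part of the codebase:

```typescript
// Minimal local stubs standing in for './types' (assumptions for this sketch)
type ToolState = 'pending' | 'executing' | 'success' | 'errored'

interface StateDisplay {
  displayName: string
  icon: string
}

interface DisplayConfig {
  states: Partial<Record<ToolState, StateDisplay>>
  getDynamicDisplayName?: (state: ToolState, params: Record<string, unknown>) => string | null
}

// Mirrors BaseTool.getDisplayName: dynamic name first, then the
// state-specific entry, then a generic "<id> (<state>)" fallback.
function resolveDisplayName(
  id: string,
  config: DisplayConfig,
  state: ToolState,
  params: Record<string, unknown> = {}
): string {
  const dynamic = config.getDynamicDisplayName?.(state, params)
  if (dynamic) return dynamic
  return config.states[state]?.displayName ?? `${id} (${state})`
}

const exampleConfig: DisplayConfig = {
  states: {
    pending: { displayName: 'Run example?', icon: 'play' },
    executing: { displayName: 'Running example', icon: 'spinner' },
  },
}

console.log(resolveDisplayName('example_tool', exampleConfig, 'pending')) // "Run example?"
console.log(resolveDisplayName('example_tool', exampleConfig, 'success')) // "example_tool (success)"
```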
|
||||
@@ -0,0 +1,197 @@
|
||||
/**
|
||||
* Run Workflow Tool
|
||||
*/
|
||||
|
||||
import { executeWorkflowWithFullLogging } from '@/app/workspace/[workspaceId]/w/[workflowId]/lib/workflow-execution-utils'
|
||||
import { useExecutionStore } from '@/stores/execution/store'
|
||||
import { useWorkflowRegistry } from '@/stores/workflows/registry/store'
|
||||
import { BaseTool } from '../base-tool'
|
||||
import type {
|
||||
CopilotToolCall,
|
||||
ToolExecuteResult,
|
||||
ToolExecutionOptions,
|
||||
ToolMetadata,
|
||||
} from '../types'
|
||||
|
||||
interface RunWorkflowParams {
|
||||
workflowId?: string
|
||||
description?: string
|
||||
workflow_input?: string
|
||||
}
|
||||
|
||||
export class RunWorkflowTool extends BaseTool {
|
||||
static readonly id = 'run_workflow'
|
||||
|
||||
metadata: ToolMetadata = {
|
||||
id: RunWorkflowTool.id,
|
||||
displayConfig: {
|
||||
states: {
|
||||
pending: {
|
||||
displayName: 'Run workflow?',
|
||||
icon: 'play',
|
||||
},
|
||||
executing: {
|
||||
displayName: 'Running workflow',
|
||||
icon: 'spinner',
|
||||
},
|
||||
accepted: {
|
||||
displayName: 'Running workflow',
|
||||
icon: 'spinner',
|
||||
},
|
||||
success: {
|
||||
displayName: 'Executed workflow',
|
||||
icon: 'play',
|
||||
},
|
||||
rejected: {
|
||||
displayName: 'Skipped workflow execution',
|
||||
icon: 'skip',
|
||||
},
|
||||
errored: {
|
||||
displayName: 'Failed to execute workflow',
|
||||
icon: 'error',
|
||||
},
|
||||
background: {
|
||||
displayName: 'Workflow execution moved to background',
|
||||
icon: 'play',
|
||||
},
|
||||
aborted: {
|
||||
displayName: 'Aborted stream',
|
||||
icon: 'abort',
|
||||
},
|
||||
},
|
||||
},
|
||||
schema: {
|
||||
name: RunWorkflowTool.id,
|
||||
description: 'Execute a workflow with optional input',
|
||||
parameters: {
|
||||
type: 'object',
|
||||
properties: {
|
||||
workflowId: {
|
||||
type: 'string',
|
||||
description: 'The ID of the workflow to run',
|
||||
},
|
||||
description: {
|
||||
type: 'string',
|
||||
description: 'Description of what the workflow does',
|
||||
},
|
||||
workflow_input: {
|
||||
type: 'string',
|
||||
description: 'Input text to pass to the workflow chat',
|
||||
},
|
||||
},
|
||||
required: [],
|
||||
},
|
||||
},
|
||||
requiresInterrupt: true,
|
||||
allowBackgroundExecution: true,
|
||||
stateMessages: {
|
||||
success: 'Workflow successfully executed',
|
||||
background:
|
||||
'User moved workflow execution to background. The workflow execution is not complete, but will continue to run in the background.',
|
||||
error: 'Error during workflow execution',
|
||||
rejected: 'The user chose to skip the workflow execution',
|
||||
},
|
||||
}
|
||||
|
||||
/**
|
||||
* Execute the tool - run the workflow
|
||||
* This includes showing a background prompt and handling background vs foreground execution
|
||||
*/
|
||||
async execute(
|
||||
toolCall: CopilotToolCall,
|
||||
options?: ToolExecutionOptions
|
||||
): Promise<ToolExecuteResult> {
|
||||
try {
|
||||
// Parse parameters from either toolCall.parameters or toolCall.input
|
||||
const rawParams = toolCall.parameters || toolCall.input || {}
|
||||
const params = rawParams as RunWorkflowParams
|
||||
|
||||
// Check if workflow is already executing
|
||||
const { isExecuting } = useExecutionStore.getState()
|
||||
if (isExecuting) {
|
||||
options?.onStateChange?.('errored')
|
||||
return {
|
||||
success: false,
|
||||
error: 'The workflow is already in the middle of an execution. Try again later',
|
||||
}
|
||||
}
|
||||
|
||||
// Get current workflow and execution context
|
||||
const { activeWorkflowId } = useWorkflowRegistry.getState()
|
||||
if (!activeWorkflowId) {
|
||||
options?.onStateChange?.('errored')
|
||||
return {
|
||||
success: false,
|
||||
error: 'No active workflow found',
|
||||
}
|
||||
}
|
||||
|
||||
// Prepare workflow input - if workflow_input is provided, pass it to the execution
|
||||
const workflowInput = params.workflow_input
|
||||
? {
|
||||
input: params.workflow_input,
|
||||
}
|
||||
: undefined
|
||||
|
||||
// Set execution state
|
||||
const { setIsExecuting } = useExecutionStore.getState()
|
||||
setIsExecuting(true)
|
||||
|
||||
// Note: toolCall.state is already set to 'executing' by clientAcceptTool
|
||||
|
||||
// Use the standalone execution utility with full logging support
|
||||
// This works for both deployed and non-deployed workflows
|
||||
const result = await executeWorkflowWithFullLogging({
|
||||
workflowInput,
|
||||
executionId: toolCall.id, // Use tool call ID as execution ID
|
||||
})
|
||||
|
||||
// Reset execution state
|
||||
setIsExecuting(false)
|
||||
|
||||
// Check if execution was successful
|
||||
if (result && (!('success' in result) || result.success !== false)) {
|
||||
// Notify server of success
|
||||
await this.notify(toolCall.id, 'success', 'Workflow execution completed successfully')
|
||||
|
||||
options?.onStateChange?.('success')
|
||||
|
||||
return {
|
||||
success: true,
|
||||
data: {
|
||||
workflowId: params.workflowId || activeWorkflowId,
|
||||
description: params.description,
|
||||
message: 'Workflow execution finished successfully',
|
||||
},
|
||||
}
|
||||
}
|
||||
// Execution failed
|
||||
const errorMessage = (result as any)?.error || 'Workflow execution failed'
|
||||
|
||||
// Notify server of error
|
||||
await this.notify(toolCall.id, 'errored', `Workflow execution failed: ${errorMessage}`)
|
||||
|
||||
options?.onStateChange?.('errored')
|
||||
|
||||
return {
|
||||
success: false,
|
||||
error: errorMessage,
|
||||
}
|
||||
} catch (error: any) {
|
||||
// Reset execution state in case of error
|
||||
const { setIsExecuting } = useExecutionStore.getState()
|
||||
setIsExecuting(false)
|
||||
|
||||
const errorMessage = error?.message || 'An unknown error occurred'
|
||||
|
||||
await this.notify(toolCall.id, 'errored', `Workflow execution failed: ${errorMessage}`)
|
||||
|
||||
options?.onStateChange?.('errored')
|
||||
|
||||
return {
|
||||
success: false,
|
||||
error: errorMessage,
|
||||
}
|
||||
}
|
||||
}
|
||||
}
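The guard/reset flow in `execute` above — refuse re-entry while a run is active, then always clear the executing flag — can be sketched with a plain in-memory stand-in for the execution store. The `executionStore` object and `runWithExecutionGuard` helper are hypothetical, shown only to illustrate the pattern:

```typescript
// Hypothetical in-memory stand-in for the Zustand execution store
const executionStore = { isExecuting: false }

interface RunResult {
  success: boolean
  error?: string
}

// Refuse re-entry while a run is active; always reset the flag afterwards,
// whether the run resolves or throws (the finally block covers both).
async function runWithExecutionGuard(run: () => Promise<RunResult>): Promise<RunResult> {
  if (executionStore.isExecuting) {
    return {
      success: false,
      error: 'The workflow is already in the middle of an execution. Try again later',
    }
  }
  executionStore.isExecuting = true
  try {
    return await run()
  } catch (err) {
    return { success: false, error: err instanceof Error ? err.message : 'An unknown error occurred' }
  } finally {
    executionStore.isExecuting = false
  }
}
```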
|
||||
@@ -0,0 +1,42 @@
|
||||
/**
|
||||
* Copilot Tools Library
|
||||
* Export the public API for the tools system
|
||||
*/
|
||||
|
||||
// Base classes
|
||||
export { BaseTool } from './base-tool'
|
||||
// Client tool implementations
|
||||
export { RunWorkflowTool } from './client-tools/run-workflow'
|
||||
export { InlineToolCall } from './inline-tool-call'
|
||||
// Registry
|
||||
export { ToolRegistry, toolRegistry } from './registry'
|
||||
export type { ServerToolId } from './server-tools/definitions'
|
||||
// Server tool definitions
|
||||
export { SERVER_TOOL_IDS, SERVER_TOOL_METADATA } from './server-tools/definitions'
|
||||
// React components
|
||||
export { ToolConfirmation } from './tool-confirmation'
|
||||
// Core types and interfaces
|
||||
export type {
|
||||
CopilotToolCall,
|
||||
StateDisplayConfig,
|
||||
Tool,
|
||||
ToolConfirmResponse,
|
||||
ToolDisplayConfig,
|
||||
ToolExecuteResult,
|
||||
ToolExecutionOptions,
|
||||
ToolMetadata,
|
||||
ToolSchema,
|
||||
ToolState,
|
||||
} from './types'
|
||||
// Utilities
|
||||
export {
|
||||
createToolActionButton,
|
||||
executeToolWithStateManagement,
|
||||
getToolDisplayName,
|
||||
getToolIcon,
|
||||
getToolStateClasses,
|
||||
renderToolStateIcon,
|
||||
type ToolConfirmationProps,
|
||||
toolRequiresConfirmation,
|
||||
toolRequiresInterrupt,
|
||||
} from './utils'
|
||||
@@ -0,0 +1,284 @@
|
||||
'use client'
|
||||
|
||||
/**
|
||||
* Inline Tool Call Component
|
||||
* Displays a tool call with its current state and optional confirmation UI
|
||||
*/
|
||||
|
||||
import { useState } from 'react'
|
||||
import { Loader2 } from 'lucide-react'
|
||||
import { Button } from '@/components/ui/button'
|
||||
import { useCopilotStore } from '@/stores/copilot/store'
|
||||
import type { CopilotToolCall } from '@/stores/copilot/types'
|
||||
import { notifyServerTool } from './notification-utils'
|
||||
import { toolRegistry } from './registry'
|
||||
import { renderToolStateIcon, toolRequiresInterrupt } from './utils'
|
||||
|
||||
interface InlineToolCallProps {
|
||||
toolCall: CopilotToolCall
|
||||
onStateChange?: (state: any) => void
|
||||
context?: Record<string, any>
|
||||
}
|
||||
|
||||
// Simple function to check if tool call should show run/skip buttons
|
||||
function shouldShowRunSkipButtons(toolCall: CopilotToolCall): boolean {
|
||||
// Check if tool requires interrupt and is in pending state
|
||||
return toolRequiresInterrupt(toolCall.name) && toolCall.state === 'pending'
|
||||
}
|
||||
|
||||
// Function to accept a server tool (interrupt required)
|
||||
async function serverAcceptTool(
|
||||
toolCall: CopilotToolCall,
|
||||
setToolCallState: (toolCall: any, state: string, options?: any) => void
|
||||
): Promise<void> {
|
||||
// Set state directly to executing (skip accepted state)
|
||||
setToolCallState(toolCall, 'executing')
|
||||
|
||||
try {
|
||||
// Notify server of acceptance - execution happens elsewhere via SSE
|
||||
await notifyServerTool(toolCall.id, toolCall.name, 'accepted')
|
||||
} catch (error) {
|
||||
console.error('Failed to notify server of tool acceptance:', error)
|
||||
setToolCallState(toolCall, 'errored', { error: 'Failed to notify server' })
|
||||
}
|
||||
}
|
||||
|
||||
// Function to accept a client tool
|
||||
async function clientAcceptTool(
|
||||
toolCall: CopilotToolCall,
|
||||
setToolCallState: (toolCall: any, state: string, options?: any) => void,
|
||||
onStateChange?: (state: any) => void,
|
||||
context?: Record<string, any>
|
||||
): Promise<void> {
|
||||
setToolCallState(toolCall, 'executing')
|
||||
|
||||
// Trigger UI update immediately with explicit state
|
||||
onStateChange?.('executing')
|
||||
|
||||
try {
|
||||
// Get the tool and execute it directly
|
||||
const tool = toolRegistry.getTool(toolCall.name)
|
||||
if (tool) {
|
||||
await tool.execute(toolCall, {
|
||||
onStateChange: (state: any) => {
|
||||
setToolCallState(toolCall, state)
|
||||
},
|
||||
context,
|
||||
})
|
||||
} else {
|
||||
throw new Error(`Tool not found: ${toolCall.name}`)
|
||||
}
|
||||
} catch (error) {
|
||||
console.error('Error executing client tool:', error)
|
||||
setToolCallState(toolCall, 'errored', {
|
||||
error: error instanceof Error ? error.message : 'Tool execution failed',
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
// Function to reject any tool
|
||||
async function rejectTool(
|
||||
toolCall: CopilotToolCall,
|
||||
setToolCallState: (toolCall: any, state: string, options?: any) => void
|
||||
): Promise<void> {
|
||||
// Use centralized state management
|
||||
setToolCallState(toolCall, 'rejected')
|
||||
|
||||
try {
|
||||
// Notify server for both client and server tools
|
||||
await notifyServerTool(toolCall.id, toolCall.name, 'rejected')
|
||||
} catch (error) {
|
||||
console.error('Failed to notify server of tool rejection:', error)
|
||||
}
|
||||
}
|
||||
|
||||
// Function to get tool display name based on state
|
||||
function getToolDisplayNameByState(toolCall: CopilotToolCall): string {
|
||||
const toolName = toolCall.name
|
||||
const state = toolCall.state
|
||||
|
||||
// Check if it's a client tool
|
||||
const clientTool = toolRegistry.getTool(toolName)
|
||||
if (clientTool) {
|
||||
// Use client tool's display name logic
|
||||
return clientTool.getDisplayName(toolCall)
|
||||
}
|
||||
|
||||
// For server tools, use server tool metadata
|
||||
const serverToolMetadata = toolRegistry.getServerToolMetadata(toolName)
|
||||
if (serverToolMetadata) {
|
||||
// Check if there's a dynamic display name function
|
||||
if (serverToolMetadata.displayConfig.getDynamicDisplayName) {
|
||||
const dynamicName = serverToolMetadata.displayConfig.getDynamicDisplayName(
|
||||
state,
|
||||
toolCall.input || toolCall.parameters || {}
|
||||
)
|
||||
if (dynamicName) return dynamicName
|
||||
}
|
||||
|
||||
// Use state-specific display config
|
||||
const stateConfig = serverToolMetadata.displayConfig.states[state]
|
||||
if (stateConfig) {
|
||||
return stateConfig.displayName
|
||||
}
|
||||
}
|
||||
|
||||
// Fallback to tool name if no specific display logic found
|
||||
return toolName
|
||||
}
|
||||
|
||||
// Simple run/skip buttons component
|
||||
function RunSkipButtons({
|
||||
toolCall,
|
||||
onStateChange,
|
||||
context,
|
||||
}: {
|
||||
toolCall: CopilotToolCall
|
||||
onStateChange?: (state: any) => void
|
||||
context?: Record<string, any>
|
||||
}) {
|
||||
const [isProcessing, setIsProcessing] = useState(false)
|
||||
const [buttonsHidden, setButtonsHidden] = useState(false)
|
||||
const { setToolCallState } = useCopilotStore()
|
||||
|
||||
const handleRun = async () => {
|
||||
setIsProcessing(true)
|
||||
setButtonsHidden(true) // Hide run/skip buttons immediately
|
||||
|
||||
try {
|
||||
// Check if it's a client tool or server tool
|
||||
const clientTool = toolRegistry.getTool(toolCall.name)
|
||||
|
||||
if (clientTool) {
|
||||
// Client tool - execute immediately
|
||||
await clientAcceptTool(toolCall, setToolCallState, onStateChange, context)
|
||||
// Trigger re-render after tool execution completes
|
||||
onStateChange?.(toolCall.state)
|
||||
} else {
|
||||
// Server tool
|
||||
await serverAcceptTool(toolCall, setToolCallState)
|
||||
// Trigger re-render by calling onStateChange if provided
|
||||
onStateChange?.(toolCall.state)
|
||||
}
|
||||
} catch (error) {
|
||||
console.error('Error handling run action:', error)
|
||||
} finally {
|
||||
setIsProcessing(false)
|
||||
}
|
||||
}
|
||||
|
||||
const handleSkip = async () => {
|
||||
setIsProcessing(true)
|
||||
setButtonsHidden(true) // Hide run/skip buttons immediately
|
||||
|
||||
try {
|
||||
await rejectTool(toolCall, setToolCallState)
|
||||
|
||||
// Trigger re-render by calling onStateChange if provided
|
||||
onStateChange?.(toolCall.state)
|
||||
} catch (error) {
|
||||
console.error('Error handling skip action:', error)
|
||||
} finally {
|
||||
setIsProcessing(false)
|
||||
}
|
||||
}
|
||||
|
||||
// If buttons are hidden, show nothing
|
||||
if (buttonsHidden) {
|
||||
return null
|
||||
}
|
||||
|
||||
// Default run/skip buttons
|
||||
return (
|
||||
<div className='flex items-center gap-1.5'>
|
||||
<Button
|
||||
onClick={handleRun}
|
||||
disabled={isProcessing}
|
||||
size='sm'
|
||||
className='h-6 bg-gray-900 px-2 font-medium text-white text-xs hover:bg-gray-800 disabled:opacity-50 dark:bg-gray-100 dark:text-gray-900 dark:hover:bg-gray-200'
|
||||
>
|
||||
{isProcessing ? <Loader2 className='mr-1 h-3 w-3 animate-spin' /> : null}
|
||||
Run
|
||||
</Button>
|
||||
<Button
|
||||
onClick={handleSkip}
|
||||
disabled={isProcessing}
|
||||
size='sm'
|
||||
className='h-6 bg-gray-200 px-2 font-medium text-gray-700 text-xs hover:bg-gray-300 disabled:opacity-50 dark:bg-gray-700 dark:text-gray-300 dark:hover:bg-gray-600'
|
||||
>
|
||||
Skip
|
||||
</Button>
|
||||
</div>
|
||||
)
|
||||
}
|
||||
|
||||
export function InlineToolCall({ toolCall, onStateChange, context }: InlineToolCallProps) {
|
||||
const [, forceUpdate] = useState({})
|
||||
const { setToolCallState } = useCopilotStore()
|
||||
|
||||
if (!toolCall) {
|
||||
return null
|
||||
}
|
||||
|
||||
const showButtons = shouldShowRunSkipButtons(toolCall)
|
||||
|
||||
// Check if we should show background button (when in executing state)
|
||||
const clientTool = toolRegistry.getTool(toolCall.name)
|
||||
const allowsBackground = clientTool?.metadata?.allowBackgroundExecution || false
|
||||
const showBackgroundButton = allowsBackground && toolCall.state === 'executing' && !showButtons
|
||||
|
||||
const handleStateChange = (state: any) => {
|
||||
// Force component re-render
|
||||
forceUpdate({})
|
||||
// Call parent onStateChange if provided
|
||||
onStateChange?.(state)
|
||||
}
|
||||
|
||||
const displayName = getToolDisplayNameByState(toolCall)
|
||||
|
||||
return (
|
||||
<div className='flex items-center justify-between gap-2 py-1'>
|
||||
<div className='flex items-center gap-2 text-muted-foreground'>
|
||||
<div className='flex-shrink-0'>{renderToolStateIcon(toolCall, 'h-3 w-3')}</div>
|
||||
<span className='text-base'>{displayName}</span>
|
||||
</div>
|
||||
|
||||
{showButtons && (
|
||||
<RunSkipButtons toolCall={toolCall} onStateChange={handleStateChange} context={context} />
|
||||
)}
|
||||
|
||||
{showBackgroundButton && (
|
||||
<div className='flex items-center gap-1.5'>
|
||||
<Button
|
||||
onClick={async () => {
|
||||
try {
|
||||
// Set tool state to background
|
||||
setToolCallState(toolCall, 'background')
|
||||
|
||||
// Notify the backend about background state
|
||||
await notifyServerTool(toolCall.id, toolCall.name, 'background')
|
||||
|
||||
// Track that this tool was moved to background
|
||||
if (context) {
|
||||
if (!context.movedToBackgroundToolIds) {
|
||||
context.movedToBackgroundToolIds = new Set()
|
||||
}
|
||||
context.movedToBackgroundToolIds.add(toolCall.id)
|
||||
}
|
||||
|
||||
// Trigger re-render
|
||||
onStateChange?.(toolCall.state)
|
||||
} catch (error) {
|
||||
console.error('Error moving to background:', error)
|
||||
}
|
||||
}}
|
||||
size='sm'
|
||||
className='h-6 bg-blue-600 px-2 font-medium text-white text-xs hover:bg-blue-700'
|
||||
>
|
||||
Move to Background
|
||||
</Button>
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
)
|
||||
}
|
||||
@@ -0,0 +1,76 @@
|
||||
/**
|
||||
* Tool Notification Utilities
|
||||
* Handles notifications and state messages for tools
|
||||
*/
|
||||
|
||||
import { toolRegistry } from './registry'
|
||||
import type { NotificationStatus, ToolState } from './types'
|
||||
|
||||
/**
 * Maps tool states to notification statuses
 */
const STATE_MAPPINGS: Partial<Record<ToolState, NotificationStatus>> = {
  success: 'success',
  errored: 'error',
  accepted: 'accepted',
  rejected: 'rejected',
  background: 'background',
}

const SERVER_TOOL_MAPPINGS: Partial<Record<ToolState, NotificationStatus>> = {
  accepted: 'accepted',
  rejected: 'rejected',
  background: 'background',
}

/**
 * Send a notification for a tool state change
 * @param toolId - The unique identifier for the tool call
 * @param toolName - The name of the tool (e.g., 'set_environment_variables')
 * @param toolState - The current state of the tool
 */
export async function notifyServerTool(
  toolId: string,
  toolName: string,
  toolState: ToolState
|
||||
): Promise<void> {
|
||||
const notificationStatus = SERVER_TOOL_MAPPINGS[toolState]
|
||||
if (!notificationStatus) {
|
||||
throw new Error(`Invalid tool state: ${toolState}`)
|
||||
}
|
||||
await notify(toolId, toolName, toolState)
|
||||
}
|
||||
|
||||
export async function notify(
|
||||
toolId: string,
|
||||
toolName: string,
|
||||
toolState: ToolState
|
||||
): Promise<void> {
|
||||
// toolState must be in STATE_MAPPINGS
|
||||
const notificationStatus = STATE_MAPPINGS[toolState]
|
||||
if (!notificationStatus) {
|
||||
throw new Error(`Invalid tool state: ${toolState}`)
|
||||
}
|
||||
|
||||
// Get the state message from tool metadata
|
||||
const metadata = toolRegistry.getToolMetadata(toolName)
|
||||
const stateMessage = metadata?.stateMessages?.[notificationStatus] ?? ''
|
||||
|
||||
// Call backend confirm route
|
||||
await fetch('/api/copilot/confirm', {
|
||||
method: 'POST',
|
||||
headers: {
|
||||
'Content-Type': 'application/json',
|
||||
},
|
||||
body: JSON.stringify({
|
||||
toolCallId: toolId,
|
||||
status: notificationStatus,
|
||||
toolName,
|
||||
toolState,
|
||||
stateMessage,
|
||||
}),
|
||||
})
|
||||
}
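The state-to-status validation both helpers perform before posting to `/api/copilot/confirm` can be sketched in isolation. The local type aliases below are assumptions mirroring `./types`:

```typescript
// Assumed shapes of the ToolState / NotificationStatus unions from './types'
type ToolState =
  | 'pending'
  | 'executing'
  | 'success'
  | 'errored'
  | 'accepted'
  | 'rejected'
  | 'background'
  | 'aborted'
type NotificationStatus = 'success' | 'error' | 'accepted' | 'rejected' | 'background'

const STATE_TO_STATUS: Partial<Record<ToolState, NotificationStatus>> = {
  success: 'success',
  errored: 'error',
  accepted: 'accepted',
  rejected: 'rejected',
  background: 'background',
}

// Resolve the wire status for a state, rejecting states (like 'pending'
// or 'executing') that should never be reported to the confirm endpoint.
function toNotificationStatus(state: ToolState): NotificationStatus {
  const status = STATE_TO_STATUS[state]
  if (!status) throw new Error(`Invalid tool state: ${state}`)
  return status
}
```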
|
||||
@@ -0,0 +1,121 @@
|
||||
/**
|
||||
* Tool Registry - Central management for client-side copilot tools
|
||||
*
|
||||
* This registry manages tools that:
|
||||
* - Require user interrupts/confirmation (requiresInterrupt: true)
|
||||
* - Execute client-side logic
|
||||
*
|
||||
* It also provides metadata for server-side tools for display purposes
|
||||
*/
|
||||
|
||||
// Import client tool implementations
|
||||
import { RunWorkflowTool } from './client-tools/run-workflow'
|
||||
// Import server tool definitions
|
||||
import { SERVER_TOOL_METADATA } from './server-tools/definitions'
|
||||
import type { Tool, ToolMetadata } from './types'
|
||||
|
||||
/**
|
||||
* Tool Registry class that manages all available tools
|
||||
*/
|
||||
export class ToolRegistry {
|
||||
private static instance: ToolRegistry
|
||||
private tools: Map<string, Tool> = new Map()
|
||||
|
||||
private constructor() {
|
||||
// Register all tools on initialization
|
||||
this.registerDefaultTools()
|
||||
}
|
||||
|
||||
/**
|
||||
* Get singleton instance
|
||||
*/
|
||||
static getInstance(): ToolRegistry {
|
||||
if (!ToolRegistry.instance) {
|
||||
ToolRegistry.instance = new ToolRegistry()
|
||||
}
|
||||
return ToolRegistry.instance
|
||||
}
|
||||
|
||||
/**
|
||||
* Register a tool
|
||||
*/
|
||||
register(tool: Tool): void {
|
||||
this.tools.set(tool.metadata.id, tool)
|
||||
}
|
||||
|
||||
/**
|
||||
* Get a tool by ID
|
||||
*/
|
||||
getTool(toolId: string): Tool | undefined {
|
||||
return this.tools.get(toolId)
|
||||
}
|
||||
|
||||
/**
|
||||
* Get tool metadata by ID
|
||||
*/
|
||||
getToolMetadata(toolId: string): ToolMetadata | undefined {
|
||||
const tool = this.tools.get(toolId)
|
||||
return tool?.metadata
|
||||
}
|
||||
|
||||
/**
|
||||
* Get all registered tools
|
||||
*/
|
||||
getAllTools(): Tool[] {
|
||||
return Array.from(this.tools.values())
|
||||
}
|
||||
|
||||
/**
|
||||
* Get all tool IDs
|
||||
*/
|
||||
getToolIds(): string[] {
|
||||
return Array.from(this.tools.keys())
|
||||
}
|
||||
|
||||
/**
|
||||
* Get all tool IDs as an object for easy access
|
||||
*/
|
||||
getToolIdsObject(): Record<string, string> {
|
||||
const ids: Record<string, string> = {}
|
||||
|
||||
this.tools.forEach((tool, id) => {
|
||||
const key = id.toUpperCase()
|
||||
ids[key] = id
|
||||
})
|
||||
|
||||
return ids
|
||||
}
|
||||
|
||||
/**
|
||||
* Check if a tool requires interrupt
|
||||
*/
|
||||
requiresInterrupt(toolId: string): boolean {
|
||||
// Check client tools first
|
||||
const tool = this.getTool(toolId)
|
||||
if (tool) {
|
||||
return tool.metadata.requiresInterrupt ?? false
|
||||
}
|
||||
|
||||
// Check server tools
|
||||
const serverToolMetadata = SERVER_TOOL_METADATA[toolId as keyof typeof SERVER_TOOL_METADATA]
|
||||
return serverToolMetadata?.requiresInterrupt ?? false
|
||||
}
|
||||
|
||||
/**
|
||||
* Get server tool metadata by ID
|
||||
*/
|
||||
getServerToolMetadata(toolId: string): ToolMetadata | undefined {
|
||||
return SERVER_TOOL_METADATA[toolId as keyof typeof SERVER_TOOL_METADATA]
|
||||
}
|
||||
|
||||
/**
|
||||
* Register default client tools
|
||||
*/
|
||||
private registerDefaultTools(): void {
|
||||
// Register actual client tool implementations
|
||||
this.register(new RunWorkflowTool())
|
||||
}
|
||||
}
|
||||
|
||||
// Export singleton instance
|
||||
export const toolRegistry = ToolRegistry.getInstance()
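The lazy-singleton pattern `ToolRegistry` uses (private constructor, static `getInstance`, a `Map` keyed by tool id) can be reduced to a minimal generic sketch. The `Registry` class here is illustrative, not the real API:

```typescript
// Minimal generic version of the lazy-singleton pattern used by ToolRegistry.
class Registry<T extends { id: string }> {
  private static instance: Registry<any>
  private items = new Map<string, T>()

  // Private constructor forces all access through getInstance()
  private constructor() {}

  static getInstance<U extends { id: string }>(): Registry<U> {
    if (!Registry.instance) {
      Registry.instance = new Registry<U>()
    }
    return Registry.instance
  }

  register(item: T): void {
    this.items.set(item.id, item)
  }

  get(id: string): T | undefined {
    return this.items.get(id)
  }
}
```

Because the instance is created on first access and every later `getInstance()` returns the same object, registrations made anywhere in the app are visible everywhere else.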
|
||||
@@ -0,0 +1,283 @@
|
||||
/**
|
||||
* Server-side tool definitions
|
||||
* These tools execute on the server and their results are displayed in the UI
|
||||
*/
|
||||
|
||||
import type { ToolMetadata } from '../types'
|
||||
|
||||
// Tool IDs for server tools
|
||||
export const SERVER_TOOL_IDS = {
|
||||
SEARCH_DOCUMENTATION: 'search_documentation',
|
||||
GET_USER_WORKFLOW: 'get_user_workflow',
|
||||
BUILD_WORKFLOW: 'build_workflow',
|
||||
EDIT_WORKFLOW: 'edit_workflow',
|
||||
GET_BLOCKS_AND_TOOLS: 'get_blocks_and_tools',
|
||||
GET_BLOCKS_METADATA: 'get_blocks_metadata',
|
||||
GET_YAML_STRUCTURE: 'get_yaml_structure',
|
||||
GET_EDIT_WORKFLOW_EXAMPLES: 'get_edit_workflow_examples',
|
||||
GET_BUILD_WORKFLOW_EXAMPLES: 'get_build_workflow_examples',
|
||||
GET_ENVIRONMENT_VARIABLES: 'get_environment_variables',
|
||||
SET_ENVIRONMENT_VARIABLES: 'set_environment_variables',
|
||||
GET_WORKFLOW_CONSOLE: 'get_workflow_console',
|
||||
SEARCH_ONLINE: 'search_online',
|
||||
} as const
|
||||
|
||||
export type ServerToolId = (typeof SERVER_TOOL_IDS)[keyof typeof SERVER_TOOL_IDS]
|
||||
|
||||
/**
|
||||
* Server tool metadata definitions
|
||||
* These define how server tools are displayed in different states
|
||||
*/
|
||||
export const SERVER_TOOL_METADATA: Record<ServerToolId, ToolMetadata> = {
|
||||
[SERVER_TOOL_IDS.SEARCH_DOCUMENTATION]: {
|
||||
id: SERVER_TOOL_IDS.SEARCH_DOCUMENTATION,
|
||||
displayConfig: {
|
||||
states: {
|
||||
executing: { displayName: 'Searching documentation', icon: 'spinner' },
|
||||
success: { displayName: 'Searched documentation', icon: 'file' },
|
||||
rejected: { displayName: 'Skipped documentation search', icon: 'skip' },
|
        errored: { displayName: 'Failed to search documentation', icon: 'error' },
        aborted: { displayName: 'Documentation search aborted', icon: 'x' },
      },
    },
    schema: {
      name: SERVER_TOOL_IDS.SEARCH_DOCUMENTATION,
      description: 'Search through documentation',
    },
    requiresInterrupt: false,
  },

  [SERVER_TOOL_IDS.GET_USER_WORKFLOW]: {
    id: SERVER_TOOL_IDS.GET_USER_WORKFLOW,
    displayConfig: {
      states: {
        executing: { displayName: 'Analyzing workflow', icon: 'spinner' },
        success: { displayName: 'Analyzed workflow', icon: 'workflow' },
        rejected: { displayName: 'Skipped workflow analysis', icon: 'skip' },
        errored: { displayName: 'Failed to analyze workflow', icon: 'error' },
        aborted: { displayName: 'Workflow analysis aborted', icon: 'x' },
      },
    },
    schema: {
      name: SERVER_TOOL_IDS.GET_USER_WORKFLOW,
      description: 'Get current workflow details',
    },
    requiresInterrupt: false,
  },

  [SERVER_TOOL_IDS.BUILD_WORKFLOW]: {
    id: SERVER_TOOL_IDS.BUILD_WORKFLOW,
    displayConfig: {
      states: {
        ready_for_review: { displayName: 'Workflow ready for review', icon: 'network' },
        executing: { displayName: 'Building workflow', icon: 'spinner' },
        success: { displayName: 'Built workflow', icon: 'network' },
        rejected: { displayName: 'Rejected workflow changes', icon: 'skip' },
        errored: { displayName: 'Failed to build workflow', icon: 'error' },
        aborted: { displayName: 'Workflow build aborted', icon: 'x' },
        accepted: { displayName: 'Built workflow', icon: 'network' },
      },
    },
    schema: {
      name: SERVER_TOOL_IDS.BUILD_WORKFLOW,
      description: 'Build a new workflow',
    },
    requiresInterrupt: false,
  },

  [SERVER_TOOL_IDS.EDIT_WORKFLOW]: {
    id: SERVER_TOOL_IDS.EDIT_WORKFLOW,
    displayConfig: {
      states: {
        ready_for_review: { displayName: 'Workflow changes ready for review', icon: 'network' },
        executing: { displayName: 'Editing workflow', icon: 'spinner' },
        success: { displayName: 'Edited workflow', icon: 'network' },
        rejected: { displayName: 'Rejected workflow changes', icon: 'skip' },
        errored: { displayName: 'Failed to edit workflow', icon: 'error' },
        aborted: { displayName: 'Workflow edit aborted', icon: 'x' },
        accepted: { displayName: 'Edited workflow', icon: 'network' },
      },
    },
    schema: {
      name: SERVER_TOOL_IDS.EDIT_WORKFLOW,
      description: 'Edit the current workflow',
    },
    requiresInterrupt: false,
  },

  [SERVER_TOOL_IDS.GET_BLOCKS_AND_TOOLS]: {
    id: SERVER_TOOL_IDS.GET_BLOCKS_AND_TOOLS,
    displayConfig: {
      states: {
        executing: { displayName: 'Getting block information', icon: 'spinner' },
        success: { displayName: 'Retrieved block information', icon: 'blocks' },
        rejected: { displayName: 'Skipped getting block information', icon: 'skip' },
        errored: { displayName: 'Failed to get block information', icon: 'error' },
        aborted: { displayName: 'Block information retrieval aborted', icon: 'x' },
      },
    },
    schema: {
      name: SERVER_TOOL_IDS.GET_BLOCKS_AND_TOOLS,
      description: 'Get available blocks and tools',
    },
    requiresInterrupt: false,
  },

  [SERVER_TOOL_IDS.GET_BLOCKS_METADATA]: {
    id: SERVER_TOOL_IDS.GET_BLOCKS_METADATA,
    displayConfig: {
      states: {
        executing: { displayName: 'Getting block metadata', icon: 'spinner' },
        success: { displayName: 'Retrieved block metadata', icon: 'blocks' },
        rejected: { displayName: 'Skipped getting block metadata', icon: 'skip' },
        errored: { displayName: 'Failed to get block metadata', icon: 'error' },
        aborted: { displayName: 'Block metadata retrieval aborted', icon: 'x' },
      },
    },
    schema: {
      name: SERVER_TOOL_IDS.GET_BLOCKS_METADATA,
      description: 'Get metadata for blocks',
    },
    requiresInterrupt: false,
  },

  [SERVER_TOOL_IDS.GET_YAML_STRUCTURE]: {
    id: SERVER_TOOL_IDS.GET_YAML_STRUCTURE,
    displayConfig: {
      states: {
        executing: { displayName: 'Analyzing workflow structure', icon: 'spinner' },
        success: { displayName: 'Analyzed workflow structure', icon: 'tree' },
        rejected: { displayName: 'Skipped workflow structure analysis', icon: 'skip' },
        errored: { displayName: 'Failed to analyze workflow structure', icon: 'error' },
        aborted: { displayName: 'Workflow structure analysis aborted', icon: 'x' },
      },
    },
    schema: {
      name: SERVER_TOOL_IDS.GET_YAML_STRUCTURE,
      description: 'Get workflow YAML structure',
    },
    requiresInterrupt: false,
  },

  [SERVER_TOOL_IDS.GET_EDIT_WORKFLOW_EXAMPLES]: {
    id: SERVER_TOOL_IDS.GET_EDIT_WORKFLOW_EXAMPLES,
    displayConfig: {
      states: {
        executing: { displayName: 'Viewing workflow examples', icon: 'spinner' },
        success: { displayName: 'Viewed workflow examples', icon: 'gitbranch' },
        rejected: { displayName: 'Skipped workflow examples', icon: 'skip' },
        errored: { displayName: 'Failed to view workflow examples', icon: 'error' },
        aborted: { displayName: 'Workflow examples viewing aborted', icon: 'x' },
      },
    },
    schema: {
      name: SERVER_TOOL_IDS.GET_EDIT_WORKFLOW_EXAMPLES,
      description: 'Get workflow examples',
    },
    requiresInterrupt: false,
  },

  [SERVER_TOOL_IDS.GET_BUILD_WORKFLOW_EXAMPLES]: {
    id: SERVER_TOOL_IDS.GET_BUILD_WORKFLOW_EXAMPLES,
    displayConfig: {
      states: {
        executing: { displayName: 'Viewing workflow examples', icon: 'spinner' },
        success: { displayName: 'Viewed workflow examples', icon: 'gitbranch' },
        rejected: { displayName: 'Skipped workflow examples', icon: 'skip' },
        errored: { displayName: 'Failed to view workflow examples', icon: 'error' },
        aborted: { displayName: 'Workflow examples viewing aborted', icon: 'x' },
      },
    },
    schema: {
      name: SERVER_TOOL_IDS.GET_BUILD_WORKFLOW_EXAMPLES,
      description: 'Get workflow examples',
    },
    requiresInterrupt: false,
  },

  [SERVER_TOOL_IDS.GET_ENVIRONMENT_VARIABLES]: {
    id: SERVER_TOOL_IDS.GET_ENVIRONMENT_VARIABLES,
    displayConfig: {
      states: {
        executing: { displayName: 'Viewing environment variables', icon: 'spinner' },
        success: { displayName: 'Found environment variables', icon: 'wrench' },
        rejected: { displayName: 'Skipped viewing environment variables', icon: 'skip' },
        errored: { displayName: 'Failed to get environment variables', icon: 'error' },
        aborted: { displayName: 'Environment variables viewing aborted', icon: 'x' },
      },
    },
    schema: {
      name: SERVER_TOOL_IDS.GET_ENVIRONMENT_VARIABLES,
      description: 'Get environment variables',
    },
    requiresInterrupt: false,
  },

  [SERVER_TOOL_IDS.SET_ENVIRONMENT_VARIABLES]: {
    id: SERVER_TOOL_IDS.SET_ENVIRONMENT_VARIABLES,
    displayConfig: {
      states: {
        pending: { displayName: 'Set environment variables', icon: 'edit' },
        executing: { displayName: 'Setting environment variables', icon: 'spinner' },
        success: { displayName: 'Set environment variables', icon: 'wrench' },
        rejected: { displayName: 'Skipped setting environment variables', icon: 'skip' },
        errored: { displayName: 'Failed to set environment variables', icon: 'error' },
        aborted: { displayName: 'Environment variables setting aborted', icon: 'x' },
      },
    },
    schema: {
      name: SERVER_TOOL_IDS.SET_ENVIRONMENT_VARIABLES,
      description: 'Set environment variables for the workflow',
      parameters: {
        type: 'object',
        properties: {
          variables: {
            type: 'object',
            description: 'Key-value pairs of environment variables to set',
            additionalProperties: {
              type: 'string',
            },
          },
        },
        required: ['variables'],
      },
    },
    requiresInterrupt: true,
  },

  [SERVER_TOOL_IDS.GET_WORKFLOW_CONSOLE]: {
    id: SERVER_TOOL_IDS.GET_WORKFLOW_CONSOLE,
    displayConfig: {
      states: {
        executing: { displayName: 'Reading workflow console', icon: 'spinner' },
        success: { displayName: 'Read workflow console', icon: 'squareTerminal' },
        rejected: { displayName: 'Skipped reading workflow console', icon: 'skip' },
        errored: { displayName: 'Failed to read workflow console', icon: 'error' },
        aborted: { displayName: 'Workflow console reading aborted', icon: 'x' },
      },
    },
    schema: {
      name: SERVER_TOOL_IDS.GET_WORKFLOW_CONSOLE,
      description: 'Get workflow console output',
    },
    requiresInterrupt: false,
  },

  [SERVER_TOOL_IDS.SEARCH_ONLINE]: {
    id: SERVER_TOOL_IDS.SEARCH_ONLINE,
    displayConfig: {
      states: {
        executing: { displayName: 'Searching online', icon: 'spinner' },
        success: { displayName: 'Searched online', icon: 'globe' },
        rejected: { displayName: 'Skipped online search', icon: 'skip' },
        errored: { displayName: 'Failed to search online', icon: 'error' },
        aborted: { displayName: 'Online search aborted', icon: 'x' },
      },
    },
    schema: {
      name: SERVER_TOOL_IDS.SEARCH_ONLINE,
      description: 'Search online for information',
    },
    requiresInterrupt: false,
  },
}
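The registry above pairs each server tool id with per-state display metadata. As a minimal standalone sketch (hypothetical names, not the app's actual `toolRegistry` API), resolving a display name from such a map with a fallback to the raw tool id looks like:

```typescript
// Illustrative shape of one registry entry; the map keys and display strings
// below mirror the BUILD_WORKFLOW entry in the registry above.
interface StateDisplay {
  displayName: string
  icon: string
}

interface ServerToolMeta {
  id: string
  displayConfig: { states: Record<string, StateDisplay> }
}

const metadata: Record<string, ServerToolMeta> = {
  build_workflow: {
    id: 'build_workflow',
    displayConfig: {
      states: {
        executing: { displayName: 'Building workflow', icon: 'spinner' },
        success: { displayName: 'Built workflow', icon: 'network' },
      },
    },
  },
}

function displayNameFor(toolId: string, state: string): string {
  // Fall back to the raw tool id when the tool or state has no display config.
  return metadata[toolId]?.displayConfig.states[state]?.displayName ?? toolId
}
```

The same lookup-with-fallback pattern appears later in `getToolIcon`, which falls back to a default icon instead of the id.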
@@ -0,0 +1,125 @@
'use client'

/**
 * Tool Confirmation Component
 * Renders Run/Skip buttons for tools requiring user confirmation
 */

import { useState } from 'react'
import { Loader2 } from 'lucide-react'
import { Button } from '@/components/ui/button'
import { notifyServerTool } from './notification-utils'
import { toolRegistry } from './registry'
import type { CopilotToolCall } from './types'
import { executeToolWithStateManagement } from './utils'

interface ToolConfirmationProps {
  toolCall: CopilotToolCall
  onStateChange: (state: any) => void
  context?: Record<string, any>
  onConfirm?: () => void
  showBackground?: boolean
}

export function ToolConfirmation({
  toolCall,
  onStateChange,
  context,
  onConfirm,
  showBackground = false,
}: ToolConfirmationProps) {
  const [isProcessing, setIsProcessing] = useState(false)
  const [buttonsHidden, setButtonsHidden] = useState(false)
  const [isMovingToBackground, setIsMovingToBackground] = useState(false)

  const handleAction = async (action: 'run' | 'skip' | 'background') => {
    if (isProcessing) return

    // Hide buttons immediately
    setButtonsHidden(true)

    if (action === 'background') {
      setIsMovingToBackground(true)
    } else {
      setIsProcessing(true)
    }

    try {
      // Call the confirmation callback if provided
      if (onConfirm) {
        onConfirm()
      }

      // Check if this is a server tool or client tool
      const isClientTool = toolRegistry.getTool(toolCall.name) !== undefined

      if (isClientTool) {
        // For client tools, use the existing state management system
        await executeToolWithStateManagement(toolCall, action, {
          onStateChange,
          context,
        })
      } else {
        // For server tools, use the notification system
        const toolState = action === 'run' ? 'accepted' : 'rejected'
        const uiState = action === 'run' ? 'accepted' : 'rejected'

        // Update UI state
        onStateChange(uiState)

        try {
          await notifyServerTool(toolCall.id, toolCall.name, toolState)
        } catch (error) {
          console.error(`Failed to notify server tool ${toolCall.id}:`, error)
          // Don't throw error for rejections - user explicitly chose to reject
          if (action === 'skip') {
            return
          }
          throw error
        }
      }
    } finally {
      setIsProcessing(false)
      setIsMovingToBackground(false)
    }
  }

  // Don't show buttons if already hidden
  if (buttonsHidden) {
    return null
  }

  return (
    <div className='flex items-center gap-1.5'>
      <Button
        onClick={() => handleAction('run')}
        disabled={isProcessing}
        size='sm'
        className='h-6 bg-gray-900 px-2 font-medium text-white text-xs hover:bg-gray-800 disabled:opacity-50 dark:bg-gray-100 dark:text-gray-900 dark:hover:bg-gray-200'
      >
        {isProcessing ? <Loader2 className='mr-1 h-3 w-3 animate-spin' /> : null}
        Run
      </Button>
      <Button
        onClick={() => handleAction('skip')}
        disabled={isProcessing}
        size='sm'
        className='h-6 bg-gray-200 px-2 font-medium text-gray-700 text-xs hover:bg-gray-300 disabled:opacity-50 dark:bg-gray-700 dark:text-gray-300 dark:hover:bg-gray-600'
      >
        Skip
      </Button>

      {showBackground && (
        <Button
          onClick={() => handleAction('background')}
          disabled={isMovingToBackground}
          size='sm'
          className='h-6 bg-gray-100 px-2 font-medium text-gray-600 text-xs hover:bg-gray-200 disabled:opacity-50 dark:bg-gray-800 dark:text-gray-400 dark:hover:bg-gray-700'
        >
          {isMovingToBackground ? <Loader2 className='mr-1 h-3 w-3 animate-spin' /> : null}
          Move to background
        </Button>
      )}
    </div>
  )
}
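The run/skip/background handling above implies a small lifecycle for tool calls. The transition table below is an illustrative assumption inferred from the states used in this PR (it omits `ready_for_review`, and is not an API the PR defines):

```typescript
// Assumed lifecycle: interrupt tools wait in 'pending' until the user acts;
// execution then ends in one of the terminal states.
type LifecycleState =
  | 'pending'
  | 'accepted'
  | 'rejected'
  | 'executing'
  | 'success'
  | 'errored'
  | 'aborted'
  | 'background'

const TRANSITIONS: Record<LifecycleState, LifecycleState[]> = {
  pending: ['accepted', 'rejected', 'background'],
  accepted: ['executing'],
  rejected: [], // user skipped; terminal
  executing: ['success', 'errored', 'aborted'],
  success: [],
  errored: [],
  aborted: [],
  background: ['executing'],
}

function canTransition(from: LifecycleState, to: LifecycleState): boolean {
  return TRANSITIONS[from].includes(to)
}
```

A guard like this would let the UI reject stale state updates (e.g. a late `executing` notification after the user already skipped).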
@@ -0,0 +1,106 @@
/**
 * Copilot Tools Type Definitions
 * Clean architecture for client-side tool management
 */

import type { CopilotToolCall, ToolState } from '@/stores/copilot/types'

export type NotificationStatus = 'success' | 'error' | 'accepted' | 'rejected' | 'background'

// Export the consolidated types
export type { CopilotToolCall, ToolState }

// Display configuration for different states
export interface StateDisplayConfig {
  // Display name for this state (e.g., "Setting environment variables" for executing)
  displayName: string

  // Icon identifier for this state
  icon?: string

  // CSS classes or style hints
  className?: string
}

// Complete display configuration for a tool
export interface ToolDisplayConfig {
  // Display configurations for each state
  states: {
    [K in ToolState]?: StateDisplayConfig
  }

  // Optional function to generate dynamic display names based on parameters
  getDynamicDisplayName?: (state: ToolState, params: Record<string, any>) => string | null
}

// Schema for tool parameters (OpenAI function calling format)
export interface ToolSchema {
  name: string
  description: string
  parameters?: {
    type: 'object'
    properties: Record<string, any>
    required?: string[]
  }
}

// Tool metadata - all the static configuration
export interface ToolMetadata {
  id: string
  displayConfig: ToolDisplayConfig
  schema: ToolSchema
  requiresInterrupt: boolean
  allowBackgroundExecution?: boolean
  stateMessages?: Partial<Record<NotificationStatus, string>>
}

// Result from executing a tool
export interface ToolExecuteResult {
  success: boolean
  data?: any
  error?: string
}

// Response from the confirmation API
export interface ToolConfirmResponse {
  success: boolean
  message?: string
}

// Options for tool execution
export interface ToolExecutionOptions {
  // Callback when state changes
  onStateChange?: (state: ToolState) => void

  // For tools that need special handling (like run_workflow)
  beforeExecute?: () => Promise<boolean>
  afterExecute?: (result: ToolExecuteResult) => Promise<void>

  // Custom context for execution
  context?: Record<string, any>
}

// The main tool interface that all tools must implement
export interface Tool {
  // Tool metadata
  metadata: ToolMetadata

  // Execute the tool
  execute(toolCall: CopilotToolCall, options?: ToolExecutionOptions): Promise<ToolExecuteResult>

  // Get the display name for the current state
  getDisplayName(toolCall: CopilotToolCall): string

  // Get the icon for the current state
  getIcon(toolCall: CopilotToolCall): string

  // Handle user action (run/skip)
  handleUserAction(
    toolCall: CopilotToolCall,
    action: 'run' | 'skip' | 'background',
    options?: ToolExecutionOptions
  ): Promise<void>

  // Check if tool shows confirmation UI for current state
  requiresConfirmation(toolCall: CopilotToolCall): boolean
}
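Since `ToolSchema` follows the OpenAI function-calling shape, handing it to a model request is mechanical. A hedged sketch (the `set_environment_variables` schema below mirrors the registry entry above; `toOpenAITool` is a hypothetical helper, not part of this PR):

```typescript
// Local copy of the ToolSchema shape from the types file above.
interface ToolSchema {
  name: string
  description: string
  parameters?: { type: 'object'; properties: Record<string, any>; required?: string[] }
}

const setEnvVarsSchema: ToolSchema = {
  name: 'set_environment_variables',
  description: 'Set environment variables for the workflow',
  parameters: {
    type: 'object',
    properties: {
      variables: {
        type: 'object',
        description: 'Key-value pairs of environment variables to set',
        additionalProperties: { type: 'string' },
      },
    },
    required: ['variables'],
  },
}

// Wrap a ToolSchema as one entry of an OpenAI-style `tools` array.
function toOpenAITool(schema: ToolSchema) {
  return { type: 'function' as const, function: schema }
}
```

The full `tools` array for a request would just be the registry's schemas mapped through `toOpenAITool`.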
@@ -0,0 +1,273 @@
/**
 * Copilot Tools Utilities
 * Handles all tool display logic and UI components
 */

import React from 'react'
import {
  Blocks,
  Check,
  CheckCircle,
  Code,
  Database,
  Edit,
  Eye,
  FileText,
  GitBranch,
  Globe,
  Info,
  Lightbulb,
  Loader2,
  type LucideIcon,
  Minus,
  Network,
  Play,
  Search,
  Settings,
  SquareTerminal,
  Terminal,
  TreePalm,
  Variable,
  Workflow,
  Wrench,
  X,
  XCircle,
  Zap,
} from 'lucide-react'
import { toolRegistry } from './registry'
import type { CopilotToolCall, ToolState } from './types'

/**
 * Map icon identifiers to Lucide icon components
 */
const ICON_MAP: Record<string, LucideIcon> = {
  // Tool-specific icons
  edit: Edit,
  loader: Loader2,
  spinner: Loader2, // Standard spinner icon
  check: Check,
  checkCircle: CheckCircle,
  skip: Minus,
  error: XCircle,
  background: Eye,
  play: Play,
  wrench: Wrench,

  // Generic icons for tools
  search: Search,
  code: Code,
  file: FileText,
  database: Database,
  globe: Globe,
  zap: Zap,
  lightbulb: Lightbulb,
  eye: Eye,
  x: X,
  blocks: Blocks, // Blocks icon with missing corner
  info: Info,
  terminal: Terminal,
  squareTerminal: SquareTerminal,
  tree: TreePalm,
  variable: Variable,
  template: FileText, // Using FileText for templates
  settings: Settings, // Gear/cog icon for configuration
  workflow: Workflow, // Flowchart icon with boxes and connecting lines
  network: Network, // Complex network icon with multiple interconnected nodes
  gitbranch: GitBranch, // Git branching icon showing workflow paths

  // Default
  default: Lightbulb,
}

/**
 * Get the React icon component for a tool state
 */
export function getToolIcon(toolCall: CopilotToolCall): LucideIcon {
  // Check if it's a client tool
  const clientTool = toolRegistry.getTool(toolCall.name)
  if (clientTool) {
    const iconName = clientTool.getIcon(toolCall)
    return ICON_MAP[iconName] || ICON_MAP.default
  }

  // For server tools, use server tool metadata
  const serverToolMetadata = toolRegistry.getServerToolMetadata(toolCall.name)
  if (serverToolMetadata) {
    const stateConfig =
      serverToolMetadata.displayConfig.states[
        toolCall.state as keyof typeof serverToolMetadata.displayConfig.states
      ]
    if (stateConfig?.icon) {
      return ICON_MAP[stateConfig.icon] || ICON_MAP.default
    }
  }

  // Fallback to default icon
  return ICON_MAP.default
}

/**
 * Get the display name for a tool in its current state
 */
export function getToolDisplayName(toolCall: CopilotToolCall): string {
  const tool = toolRegistry.getTool(toolCall.name)
  if (!tool) return toolCall.name

  return tool.getDisplayName(toolCall)
}

/**
 * Check if a tool requires user confirmation in its current state
 */
export function toolRequiresConfirmation(toolCall: CopilotToolCall): boolean {
  const tool = toolRegistry.getTool(toolCall.name)
  if (tool) {
    // Client-side tool
    return tool.requiresConfirmation(toolCall)
  }

  // Server-side tool - check if it requires interrupt and is in pending state
  const requiresInterrupt = toolRegistry.requiresInterrupt(toolCall.name)
  return requiresInterrupt && toolCall.state === 'pending'
}

/**
 * Check if a tool requires user confirmation by tool name (for pending state)
 */
export function toolRequiresInterrupt(toolName: string): boolean {
  return toolRegistry.requiresInterrupt(toolName)
}

/**
 * Get CSS classes for tool state
 */
export function getToolStateClasses(state: ToolState): string {
  switch (state) {
    case 'pending':
      return 'text-muted-foreground'
    case 'executing':
      return 'text-yellow-600'
    case 'success':
      return 'text-green-600'
    case 'accepted':
      return 'text-blue-600'
    case 'rejected':
      return 'text-gray-500'
    case 'errored':
      return 'text-red-500'
    case 'background':
      return 'text-muted-foreground'
    default:
      return 'text-muted-foreground'
  }
}

/**
 * Render the appropriate icon for a tool state
 */
export function renderToolStateIcon(
  toolCall: CopilotToolCall,
  className = 'h-3 w-3'
): React.ReactElement {
  const Icon = getToolIcon(toolCall)
  const stateClasses = getToolStateClasses(toolCall.state)

  // Special rendering for certain states
  if (toolCall.state === 'executing') {
    return React.createElement(Icon, { className: `${className} animate-spin ${stateClasses}` })
  }

  if (toolCall.state === 'rejected') {
    // Special "skipped" icon style
    return React.createElement(
      'div',
      {
        className: `flex ${className} items-center justify-center rounded-full border border-gray-400`,
      },
      React.createElement(Minus, { className: 'h-2 w-2 text-gray-500' })
    )
  }

  return React.createElement(Icon, { className: `${className} ${stateClasses}` })
}

/**
 * Handle tool execution with proper state management
 */
export async function executeToolWithStateManagement(
  toolCall: CopilotToolCall,
  action: 'run' | 'skip' | 'background',
  options: {
    onStateChange: (state: ToolState) => void
    context?: Record<string, any>
  }
): Promise<void> {
  const tool = toolRegistry.getTool(toolCall.name)
  if (!tool) {
    console.error(`Tool not found: ${toolCall.name}`)
    return
  }

  await tool.handleUserAction(toolCall, action, {
    onStateChange: options.onStateChange,
    context: options.context,
  })
}

/**
 * Props for the tool confirmation component
 */
export interface ToolConfirmationProps {
  toolCall: CopilotToolCall
  onAction: (action: 'run' | 'skip' | 'background') => void
  isProcessing?: boolean
  showBackground?: boolean
}

/**
 * Tool action button props
 */
interface ToolActionButtonProps {
  label: string
  onClick: () => void
  disabled?: boolean
  loading?: boolean
  variant: 'primary' | 'secondary' | 'tertiary'
  size?: 'sm' | 'md'
}

/**
 * Create a tool action button with consistent styling
 */
export function createToolActionButton({
  label,
  onClick,
  disabled = false,
  loading = false,
  variant,
  size = 'sm',
}: ToolActionButtonProps): React.ReactElement {
  const baseClasses = 'font-medium transition-colors disabled:opacity-50'

  const sizeClasses = size === 'sm' ? 'h-6 px-2 text-xs' : 'h-8 px-3 text-sm'

  const variantClasses = {
    primary:
      'bg-gray-900 text-white hover:bg-gray-800 dark:bg-gray-100 dark:text-gray-900 dark:hover:bg-gray-200',
    secondary:
      'bg-gray-200 text-gray-700 hover:bg-gray-300 dark:bg-gray-700 dark:text-gray-300 dark:hover:bg-gray-600',
    tertiary:
      'bg-gray-100 text-gray-600 hover:bg-gray-200 dark:bg-gray-800 dark:text-gray-400 dark:hover:bg-gray-700',
  }

  return React.createElement(
    'button',
    {
      onClick,
      disabled,
      className: `${baseClasses} ${sizeClasses} ${variantClasses[variant]}`,
    },
    loading && React.createElement(Loader2, { className: 'mr-1 h-3 w-3 animate-spin' }),
    label
  )
}
@@ -1,29 +1,35 @@
'use client'

import { useCallback, useEffect, useRef, useState } from 'react'
import { ArrowDownToLine, CircleSlash, X } from 'lucide-react'
import { Tooltip, TooltipContent, TooltipTrigger } from '@/components/ui'
import { useCallback, useEffect, useMemo, useRef, useState } from 'react'
import { ArrowDownToLine, CircleSlash, History, Plus, X } from 'lucide-react'
import {
  Chat,
  Console,
  Copilot,
  Variables,
} from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components'
import { ChatModal } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/chat/components'
  DropdownMenu,
  DropdownMenuContent,
  DropdownMenuTrigger,
} from '@/components/ui/dropdown-menu'
import { ScrollArea } from '@/components/ui/scroll-area'
import { Tooltip, TooltipContent, TooltipTrigger } from '@/components/ui/tooltip'
import { useCopilotStore } from '@/stores/copilot/store'
import { useChatStore } from '@/stores/panel/chat/store'
import { useConsoleStore } from '@/stores/panel/console/store'
import { usePanelStore } from '@/stores/panel/store'
import { useWorkflowRegistry } from '@/stores/workflows/registry/store'
import { Chat } from './components/chat/chat'
import { Console } from './components/console/console'
import { Copilot } from './components/copilot/copilot'
import { Variables } from './components/variables/variables'

export function Panel() {
  const [chatMessage, setChatMessage] = useState<string>('')
  const [copilotMessage, setCopilotMessage] = useState<string>('')
  const [isChatModalOpen, setIsChatModalOpen] = useState(false)
  const [isCopilotModalOpen, setIsCopilotModalOpen] = useState(false)
  const [isHistoryDropdownOpen, setIsHistoryDropdownOpen] = useState(false)

  const [isResizing, setIsResizing] = useState(false)
  const [resizeStartX, setResizeStartX] = useState(0)
  const [resizeStartWidth, setResizeStartWidth] = useState(0)
  const copilotRef = useRef<{ clearMessages: () => void; startNewChat: () => void }>(null)
  const copilotRef = useRef<{
    createNewChat: () => void
  }>(null)
  const lastLoadedWorkflowRef = useRef<string | null>(null)

  const isOpen = usePanelStore((state) => state.isOpen)
  const togglePanel = usePanelStore((state) => state.togglePanel)
@@ -38,16 +44,172 @@ export function Panel() {
  const exportChatCSV = useChatStore((state) => state.exportChatCSV)
  const { activeWorkflowId } = useWorkflowRegistry()

  const handleTabClick = (tab: 'chat' | 'console' | 'variables' | 'copilot') => {
    // Redirect copilot tab clicks to console since copilot is hidden
    if (tab === 'copilot') {
      setActiveTab('console')
    } else {
      setActiveTab(tab)
  // Copilot store for chat management
  const {
    chats,
    isLoadingChats,
    isSendingMessage,
    selectChat,
    currentChat,
    error: copilotError,
    clearError: clearCopilotError,
    deleteChat,
    workflowId: copilotWorkflowId,
    setWorkflowId: setCopilotWorkflowId,
    loadChats,
    validateCurrentChat,
    areChatsFresh,
  } = useCopilotStore()

  // Handle chat deletion
  const handleDeleteChat = useCallback(
    async (chatId: string) => {
      try {
        await deleteChat(chatId)
      } catch (error) {
        console.error('Error deleting chat:', error)
      }
    },
    [deleteChat]
  )

  // Ensure copilot data is loaded before performing actions
  const ensureCopilotDataLoaded = useCallback(
    async (forceRefresh = false) => {
      try {
        // Don't load if already loading, unless force refresh is requested
        if (isLoadingChats && !forceRefresh) {
          return
        }

        // Sync workflow ID if needed
        if (activeWorkflowId !== copilotWorkflowId) {
          await setCopilotWorkflowId(activeWorkflowId)
        }

        // Load chats for the current workflow - let the store handle caching
        if (activeWorkflowId) {
          await loadChats(forceRefresh)
          validateCurrentChat()

          // Mark this workflow as loaded for the legacy ref
          lastLoadedWorkflowRef.current = activeWorkflowId
        }
      } catch (error) {
        console.error('Failed to load copilot data:', error)
      }
    },
    [
      activeWorkflowId,
      copilotWorkflowId,
      setCopilotWorkflowId,
      loadChats,
      validateCurrentChat,
      isLoadingChats,
    ]
  )

  // Handle new chat creation with data loading
  const handleNewChat = useCallback(async () => {
    await ensureCopilotDataLoaded()
    copilotRef.current?.createNewChat()
  }, [ensureCopilotDataLoaded])

  // Handle history dropdown opening - use smart caching instead of force refresh
  const handleHistoryDropdownOpen = useCallback(
    async (open: boolean) => {
      // Open dropdown immediately for better UX
      setIsHistoryDropdownOpen(open)

      // If opening, ensure data is loaded but don't force refresh unless needed
      if (open && activeWorkflowId) {
        // Only load if we don't have fresh chats for this workflow
        if (!areChatsFresh(activeWorkflowId)) {
          // Don't await - let it load in background while dropdown is already open
          ensureCopilotDataLoaded(false).catch((error) => {
            console.error('Failed to load chat history:', error)
          })
        }
      }
    },
    [ensureCopilotDataLoaded, activeWorkflowId, areChatsFresh]
  )

  // Group chats by day
  const groupedChats = useMemo(() => {
    // Only process chats if we have the right workflow ID and chats exist
    if (!activeWorkflowId || copilotWorkflowId !== activeWorkflowId || chats.length === 0) {
      return []
    }

    // Chats are already filtered by workflow from the API and ordered by updatedAt desc
    const filteredChats = chats

    if (filteredChats.length === 0) {
      return []
    }

    const now = new Date()
    const today = new Date(now.getFullYear(), now.getMonth(), now.getDate())
    const yesterday = new Date(today.getTime() - 24 * 60 * 60 * 1000)
    const thisWeekStart = new Date(today.getTime() - today.getDay() * 24 * 60 * 60 * 1000)
    const lastWeekStart = new Date(thisWeekStart.getTime() - 7 * 24 * 60 * 60 * 1000)

    const groups: Record<string, typeof filteredChats> = {
      Today: [],
      Yesterday: [],
      'This Week': [],
      'Last Week': [],
      Older: [],
    }

    // Chats are already sorted by updatedAt desc from the API, so we don't need to sort again
    filteredChats.forEach((chat) => {
      const chatDate = new Date(chat.updatedAt)
      const chatDay = new Date(chatDate.getFullYear(), chatDate.getMonth(), chatDate.getDate())

      if (chatDay.getTime() === today.getTime()) {
        groups.Today.push(chat)
      } else if (chatDay.getTime() === yesterday.getTime()) {
        groups.Yesterday.push(chat)
      } else if (chatDay.getTime() >= thisWeekStart.getTime()) {
        groups['This Week'].push(chat)
      } else if (chatDay.getTime() >= lastWeekStart.getTime()) {
        groups['Last Week'].push(chat)
      } else {
        groups.Older.push(chat)
      }
    })

    // Filter out empty groups
    return Object.entries(groups).filter(([, chats]) => chats.length > 0)
  }, [chats, activeWorkflowId, copilotWorkflowId])
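The day-bucketing inside `groupedChats` above can be sketched as a standalone, testable function. This version takes `now` as a parameter instead of calling `new Date()` so it is deterministic; the function name and `ChatLike` shape are illustrative, not the app's types:

```typescript
interface ChatLike {
  id: string
  updatedAt: string // ISO timestamp, as returned by the chats API
}

// Bucket chats into Today / Yesterday / This Week / Last Week / Older,
// using the same midnight-and-week-start arithmetic as groupedChats above.
function groupChatsByDay(chats: ChatLike[], now: Date): Array<[string, ChatLike[]]> {
  const today = new Date(now.getFullYear(), now.getMonth(), now.getDate())
  const yesterday = new Date(today.getTime() - 24 * 60 * 60 * 1000)
  const thisWeekStart = new Date(today.getTime() - today.getDay() * 24 * 60 * 60 * 1000)
  const lastWeekStart = new Date(thisWeekStart.getTime() - 7 * 24 * 60 * 60 * 1000)

  const groups: Record<string, ChatLike[]> = {
    Today: [],
    Yesterday: [],
    'This Week': [],
    'Last Week': [],
    Older: [],
  }

  for (const chat of chats) {
    const d = new Date(chat.updatedAt)
    const day = new Date(d.getFullYear(), d.getMonth(), d.getDate())
    if (day.getTime() === today.getTime()) groups.Today.push(chat)
    else if (day.getTime() === yesterday.getTime()) groups.Yesterday.push(chat)
    else if (day.getTime() >= thisWeekStart.getTime()) groups['This Week'].push(chat)
    else if (day.getTime() >= lastWeekStart.getTime()) groups.Older.length >= 0 && groups['Last Week'].push(chat)
    else groups.Older.push(chat)
  }

  // Keep only non-empty groups, preserving insertion order.
  return Object.entries(groups).filter(([, list]) => list.length > 0)
}
```

Note the DST caveat shared with the original: week boundaries are computed with fixed 24-hour arithmetic, so a daylight-saving shift inside the window can move a boundary by an hour.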
|
||||
// Skeleton loading component for chat history
const ChatHistorySkeleton = () => (
<div className='px-1 py-1'>
{/* Group header skeleton */}
<div className='border-[#E5E5E5] border-t-0 px-1 pt-1 pb-0.5 dark:border-[#414141]'>
<div className='h-3 w-12 animate-pulse rounded bg-muted/40' />
</div>
{/* Chat item skeletons */}
<div className='mt-1 flex flex-col gap-1'>
{[1, 2, 3].map((i) => (
<div key={i} className='mx-1 flex h-8 items-center rounded-lg px-2 py-1.5'>
<div className='h-3 w-full animate-pulse rounded bg-muted/40' />
</div>
))}
</div>
</div>
)

// Handle tab clicks - no loading, just switch tabs
const handleTabClick = async (tab: 'chat' | 'console' | 'variables' | 'copilot') => {
setActiveTab(tab)
if (!isOpen) {
togglePanel()
}
// Removed copilot data loading - store should persist across tab switches
}

const handleClosePanel = () => {
@@ -97,6 +259,19 @@ export function Panel() {
}
}, [isResizing, handleResize, handleResizeEnd])

// Only auto-load copilot data when workflow changes, not when switching tabs
useEffect(() => {
// Only load when the active workflow changes, not when switching panel tabs
if (activeWorkflowId && activeWorkflowId !== lastLoadedWorkflowRef.current) {
// This is a real workflow change, not just a tab switch
if (copilotWorkflowId !== activeWorkflowId || !copilotWorkflowId) {
ensureCopilotDataLoaded().catch((error) => {
console.error('Failed to auto-load copilot data on workflow change:', error)
})
}
}
}, [activeWorkflowId, copilotWorkflowId, ensureCopilotDataLoaded])

return (
<>
{/* Tab Selector - Always visible */}
@@ -117,15 +292,14 @@
>
Console
</button>
{/* Temporarily hiding copilot tab */}
{/* <button
<button
onClick={() => handleTabClick('copilot')}
className={`panel-tab-base inline-flex flex-1 cursor-pointer items-center justify-center rounded-[10px] border border-transparent py-1 font-[450] text-sm outline-none transition-colors duration-200 ${
isOpen && activeTab === 'copilot' ? 'panel-tab-active' : 'panel-tab-inactive'
}`}
>
Copilot
</button> */}
</button>
<button
onClick={() => handleTabClick('variables')}
className={`panel-tab-base inline-flex flex-1 cursor-pointer items-center justify-center rounded-[10px] border border-transparent py-1 font-[450] text-sm outline-none transition-colors duration-200 ${
@@ -157,7 +331,7 @@
<TooltipTrigger asChild>
<button
onClick={() => activeWorkflowId && exportConsoleCSV(activeWorkflowId)}
className='font-medium text-md leading-normal transition-all hover:brightness-75 dark:hover:brightness-125'
className='font-medium text-md leading-normal transition-[filter] hover:brightness-75 focus:outline-none focus-visible:outline-none active:outline-none dark:hover:brightness-125'
style={{ color: 'var(--base-muted-foreground)' }}
>
<ArrowDownToLine className='h-4 w-4' strokeWidth={2} />
@@ -171,7 +345,7 @@
<TooltipTrigger asChild>
<button
onClick={() => activeWorkflowId && exportChatCSV(activeWorkflowId)}
className='font-medium text-md leading-normal transition-all hover:brightness-75 dark:hover:brightness-125'
className='font-medium text-md leading-normal transition-[filter] hover:brightness-75 focus:outline-none focus-visible:outline-none active:outline-none dark:hover:brightness-125'
style={{ color: 'var(--base-muted-foreground)' }}
>
<ArrowDownToLine className='h-4 w-4' strokeWidth={2} />
@@ -180,7 +354,95 @@
<TooltipContent side='bottom'>Export chat data</TooltipContent>
</Tooltip>
)}
{(activeTab === 'console' || activeTab === 'chat' || activeTab === 'copilot') && (
{activeTab === 'copilot' && (
<>
{/* New Chat Button */}
<Tooltip>
<TooltipTrigger asChild>
<button
onClick={handleNewChat}
className='font-medium text-md leading-normal transition-[filter] hover:brightness-75 focus:outline-none focus-visible:outline-none active:outline-none dark:hover:brightness-125'
style={{ color: 'var(--base-muted-foreground)' }}
>
<Plus className='h-4 w-4' strokeWidth={2} />
</button>
</TooltipTrigger>
<TooltipContent side='bottom'>New chat</TooltipContent>
</Tooltip>

{/* History Dropdown */}
<DropdownMenu
open={isHistoryDropdownOpen}
onOpenChange={handleHistoryDropdownOpen}
>
<DropdownMenuTrigger asChild>
<button
className='font-medium text-md leading-normal transition-[filter] hover:brightness-75 focus:outline-none focus-visible:outline-none active:outline-none dark:hover:brightness-125'
style={{ color: 'var(--base-muted-foreground)' }}
title='Chat history'
>
<History className='h-4 w-4' strokeWidth={2} />
</button>
</DropdownMenuTrigger>
<DropdownMenuContent
align='end'
className='z-[200] w-48 rounded-lg border-[#E5E5E5] bg-[#FFFFFF] shadow-xs dark:border-[#414141] dark:bg-[#202020]'
sideOffset={8}
side='bottom'
avoidCollisions={true}
collisionPadding={8}
>
{isLoadingChats ? (
<ScrollArea className='h-[200px]' hideScrollbar={true}>
<ChatHistorySkeleton />
</ScrollArea>
) : groupedChats.length === 0 ? (
<div className='px-3 py-2 text-muted-foreground text-sm'>No chats yet</div>
) : (
<ScrollArea className='h-[200px]' hideScrollbar={true}>
{groupedChats.map(([groupName, chats], groupIndex) => (
<div key={groupName}>
<div
className={`border-[#E5E5E5] border-t px-1 pt-1 pb-0.5 font-normal text-muted-foreground text-xs dark:border-[#414141] ${groupIndex === 0 ? 'border-t-0' : ''}`}
>
{groupName}
</div>
<div className='flex flex-col gap-1'>
{chats.map((chat) => (
<div
key={chat.id}
onClick={() => {
selectChat(chat)
setIsHistoryDropdownOpen(false)
}}
className={`group mx-1 flex h-8 cursor-pointer items-center rounded-lg px-2 py-1.5 text-left transition-colors ${
currentChat?.id === chat.id
? 'bg-accent'
: 'hover:bg-accent/50'
}`}
style={{ width: '176px', maxWidth: '176px' }}
>
<span
className={`min-w-0 flex-1 truncate font-medium text-sm ${
currentChat?.id === chat.id
? 'text-foreground'
: 'text-muted-foreground'
}`}
>
{chat.title || 'Untitled Chat'}
</span>
</div>
))}
</div>
</div>
))}
</ScrollArea>
)}
</DropdownMenuContent>
</DropdownMenu>
</>
)}
{(activeTab === 'console' || activeTab === 'chat') && (
<Tooltip>
<TooltipTrigger asChild>
<button
@@ -189,11 +451,9 @@ export function Panel() {
clearConsole(activeWorkflowId)
} else if (activeTab === 'chat') {
clearChat(activeWorkflowId)
} else if (activeTab === 'copilot') {
copilotRef.current?.clearMessages()
}
}}
className='font-medium text-md leading-normal transition-all hover:brightness-75 dark:hover:brightness-125'
className='font-medium text-md leading-normal transition-[filter] hover:brightness-75 focus:outline-none focus-visible:outline-none active:outline-none dark:hover:brightness-125'
style={{ color: 'var(--base-muted-foreground)' }}
>
<CircleSlash className='h-4 w-4' strokeWidth={2} />
@@ -204,7 +464,7 @@
)}
<button
onClick={handleClosePanel}
className='font-medium text-md leading-normal transition-all hover:brightness-75 dark:hover:brightness-125'
className='font-medium text-md leading-normal transition-[filter] hover:brightness-75 focus:outline-none focus-visible:outline-none active:outline-none dark:hover:brightness-125'
style={{ color: 'var(--base-muted-foreground)' }}
>
<X className='h-4 w-4' strokeWidth={2} />
@@ -214,37 +474,26 @@

{/* Panel Content Area - Resizable */}
<div className='flex-1 overflow-hidden px-3'>
{activeTab === 'chat' ? (
{/* Keep all tabs mounted but hidden to preserve state and animations */}
<div style={{ display: activeTab === 'chat' ? 'block' : 'none', height: '100%' }}>
<Chat
panelWidth={panelWidth}
chatMessage={chatMessage}
setChatMessage={setChatMessage}
/>
) : activeTab === 'console' ? (
</div>
<div style={{ display: activeTab === 'console' ? 'block' : 'none', height: '100%' }}>
<Console panelWidth={panelWidth} />
) : activeTab === 'copilot' ? (
<Copilot
ref={copilotRef}
panelWidth={panelWidth}
isFullscreen={isCopilotModalOpen}
onFullscreenToggle={setIsCopilotModalOpen}
fullscreenInput={copilotMessage}
onFullscreenInputChange={setCopilotMessage}
/>
) : (
</div>
<div style={{ display: activeTab === 'copilot' ? 'block' : 'none', height: '100%' }}>
<Copilot ref={copilotRef} panelWidth={panelWidth} />
</div>
<div style={{ display: activeTab === 'variables' ? 'block' : 'none', height: '100%' }}>
<Variables />
)}
</div>
</div>
</div>
)}

{/* Fullscreen Chat Modal */}
<ChatModal
open={isChatModalOpen}
onOpenChange={setIsChatModalOpen}
chatMessage={chatMessage}
setChatMessage={setChatMessage}
/>
</>
)
}

@@ -6,8 +6,9 @@ import { StartIcon } from '@/components/icons'
import { Button } from '@/components/ui/button'
import { Card } from '@/components/ui/card'
import { cn } from '@/lib/utils'
import { ParallelBadges } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/parallel-node/components/parallel-badges'
import { useCollaborativeWorkflow } from '@/hooks/use-collaborative-workflow'
import { useCurrentWorkflow } from '../../hooks'
import { ParallelBadges } from './components/parallel-badges'

const ParallelNodeStyles: React.FC = () => {
return (
@@ -89,6 +90,12 @@ export const ParallelNodeComponent = memo(({ data, selected, id }: NodeProps) =>
const { collaborativeRemoveBlock } = useCollaborativeWorkflow()
const blockRef = useRef<HTMLDivElement>(null)

// Use the clean abstraction for current workflow state
const currentWorkflow = useCurrentWorkflow()
const currentBlock = currentWorkflow.getBlockById(id)
const diffStatus =
currentWorkflow.isDiffMode && currentBlock ? (currentBlock as any).is_diff : undefined

// Check if this is preview mode
const isPreview = data?.isPreview || false

@@ -142,7 +149,11 @@ export const ParallelNodeComponent = memo(({ data, selected, id }: NodeProps) =>
data?.state === 'valid',
nestingLevel > 0 &&
`border border-[0.5px] ${nestingLevel % 2 === 0 ? 'border-slate-300/60' : 'border-slate-400/60'}`,
data?.hasNestedError && 'border-2 border-red-500 bg-red-50/50'
data?.hasNestedError && 'border-2 border-red-500 bg-red-50/50',
// Diff highlighting
diffStatus === 'new' && 'bg-green-50/50 ring-2 ring-green-500 dark:bg-green-900/10',
diffStatus === 'edited' &&
'bg-orange-50/50 ring-2 ring-orange-500 dark:bg-orange-900/10'
)}
style={{
width: data.width || 500,

@@ -1,3 +1,4 @@
import { useMemo } from 'react'
import { Plus, Trash } from 'lucide-react'
import { Button } from '@/components/ui/button'
import { Input } from '@/components/ui/input'
@@ -24,12 +25,12 @@ interface EvalInputProps {
}

// Default values
const DEFAULT_METRIC: EvalMetric = {
const createDefaultMetric = (): EvalMetric => ({
id: crypto.randomUUID(),
name: '',
description: '',
range: { min: 0, max: 1 },
}
})

export function EvalInput({
blockId,
@@ -43,17 +44,15 @@ export function EvalInput({
// Use preview value when in preview mode, otherwise use store value
const value = isPreview ? previewValue : storeValue

// State hooks
const metrics: EvalMetric[] = value || [DEFAULT_METRIC]
// State hooks - memoize default metric to prevent key changes
const defaultMetric = useMemo(() => createDefaultMetric(), [])
const metrics: EvalMetric[] = value || [defaultMetric]

// Metric operations
const addMetric = () => {
if (isPreview || disabled) return

const newMetric: EvalMetric = {
...DEFAULT_METRIC,
id: crypto.randomUUID(),
}
const newMetric: EvalMetric = createDefaultMetric()
setStoreValue([...metrics, newMetric])
}

@@ -112,7 +111,7 @@ export function EvalInput({
<div className='flex h-10 items-center justify-between rounded-t-lg border-b bg-card px-3'>
<span className='font-medium text-sm'>Metric {index + 1}</span>
<div className='flex items-center gap-1'>
<Tooltip>
<Tooltip key={`add-${metric.id}`}>
<TooltipTrigger asChild>
<Button
variant='ghost'
@@ -128,7 +127,7 @@
<TooltipContent>Add Metric</TooltipContent>
</Tooltip>

<Tooltip>
<Tooltip key={`remove-${metric.id}`}>
<TooltipTrigger asChild>
<Button
variant='ghost'
@@ -159,7 +158,7 @@
{renderMetricHeader(metric, index)}

<div className='space-y-2 px-3 pt-2 pb-3'>
<div className='space-y-1'>
<div key={`name-${metric.id}`} className='space-y-1'>
<Label>Name</Label>
<Input
name='name'
@@ -171,7 +170,7 @@
/>
</div>

<div className='space-y-1'>
<div key={`description-${metric.id}`} className='space-y-1'>
<Label>Description</Label>
<Input
value={metric.description}
@@ -182,7 +181,7 @@
/>
</div>

<div className='grid grid-cols-2 gap-4'>
<div key={`range-${metric.id}`} className='grid grid-cols-2 gap-4'>
<div className='space-y-1'>
<Label>Min Value</Label>
<Input

@@ -3,6 +3,7 @@ import { isEqual } from 'lodash'
import { createLogger } from '@/lib/logs/console/logger'
import { useCollaborativeWorkflow } from '@/hooks/use-collaborative-workflow'
import { getProviderFromModel } from '@/providers/utils'
import { useWorkflowDiffStore } from '@/stores/workflow-diff/store'
import { useWorkflowRegistry } from '@/stores/workflows/registry/store'
import { useSubBlockStore } from '@/stores/workflows/subblock/store'
import { useWorkflowStore } from '@/stores/workflows/workflow/store'
@@ -60,6 +61,13 @@ export function useSubBlockValue<T = any>(
useCallback((state) => state.getValue(blockId, subBlockId), [blockId, subBlockId])
)

// Check if we're in diff mode and get diff value if available
const { isShowingDiff, diffWorkflow } = useWorkflowDiffStore()
const diffValue =
isShowingDiff && diffWorkflow
? (diffWorkflow.blocks?.[blockId]?.subBlocks?.[subBlockId]?.value ?? null)
: null

// Check if this is an API key field that could be auto-filled
const isApiKey =
subBlockId === 'apiKey' || (subBlockId?.toLowerCase().includes('apikey') ?? false)
@@ -100,6 +108,12 @@
// Hook to set a value in the subblock store
const setValue = useCallback(
(newValue: T) => {
// Don't allow updates when in diff mode (readonly preview)
if (isShowingDiff) {
logger.debug('Ignoring setValue in diff mode', { blockId, subBlockId })
return
}

// Use deep comparison to avoid unnecessary updates for complex objects
if (!isEqual(valueRef.current, newValue)) {
valueRef.current = newValue
@@ -175,23 +189,32 @@
modelValue,
isStreaming,
emitValue,
isShowingDiff,
]
)

// Determine the effective value: diff value takes precedence if in diff mode
const effectiveValue =
isShowingDiff && diffValue !== null
? diffValue
: storeValue !== undefined
? storeValue
: initialValue

// Initialize valueRef on first render
useEffect(() => {
valueRef.current = storeValue !== undefined ? storeValue : initialValue
valueRef.current = effectiveValue
}, [])

// Update the ref if the store value changes
// Update the ref if the effective value changes
// This ensures we're always working with the latest value
useEffect(() => {
// Use deep comparison for objects to prevent unnecessary updates
if (!isEqual(valueRef.current, storeValue)) {
valueRef.current = storeValue !== undefined ? storeValue : initialValue
if (!isEqual(valueRef.current, effectiveValue)) {
valueRef.current = effectiveValue
}
}, [storeValue, initialValue])
}, [effectiveValue])

// Return appropriate tuple based on whether options were provided
return [storeValue !== undefined ? storeValue : initialValue, setValue] as const
return [effectiveValue, setValue] as const
}

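The hook above resolves a single "effective value" from three sources. The precedence rule can be isolated as a pure function. This is a sketch for illustration only: the function name and signature are assumptions, but the precedence order (diff value when the diff view is showing, then store value, then initial value) follows the hunk.

```typescript
// Sketch of the value-precedence rule from useSubBlockValue (names are
// illustrative, not the hook's real API): a non-null diff value wins while
// the diff view is showing, otherwise the store value, otherwise the initial.
function resolveEffectiveValue<T>(
  isShowingDiff: boolean,
  diffValue: T | null,
  storeValue: T | undefined,
  initialValue: T
): T {
  if (isShowingDiff && diffValue !== null) return diffValue
  return storeValue !== undefined ? storeValue : initialValue
}
```

Keeping this rule in one expression (as the hunk does) matters because three call sites consume it: the ref initializer, the ref-sync effect, and the returned tuple; diverging copies of the fallback chain was what the replaced lines were cleaning up.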
@@ -1,6 +1,8 @@
import { useState } from 'react'
import type React from 'react'
import { useEffect, useState } from 'react'
import { AlertTriangle, Info } from 'lucide-react'
import { Label, Tooltip, TooltipContent, TooltipTrigger } from '@/components/ui'
import { cn } from '@/lib/utils'
import {
ChannelSelectorInput,
CheckboxList,
@@ -40,6 +42,7 @@ interface SubBlockProps {
isPreview?: boolean
subBlockValues?: Record<string, any>
disabled?: boolean
fieldDiffStatus?: 'changed' | 'unchanged'
}

export function SubBlock({
@@ -49,9 +52,17 @@
isPreview = false,
subBlockValues,
disabled = false,
fieldDiffStatus,
}: SubBlockProps) {
const [isValidJson, setIsValidJson] = useState(true)

// Debug field diff status
useEffect(() => {
if (fieldDiffStatus) {
console.log(`[SubBlock ${config.id}] fieldDiffStatus:`, fieldDiffStatus)
}
}, [fieldDiffStatus, config.id])

const handleMouseDown = (e: React.MouseEvent) => {
e.stopPropagation()
}
@@ -430,7 +441,15 @@
const required = isFieldRequired()

return (
<div className='space-y-[6px] pt-[2px]' onMouseDown={handleMouseDown}>
<div
className={cn(
'space-y-[6px] pt-[2px]',
// Field-level diff highlighting - make it more prominent for testing
fieldDiffStatus === 'changed' &&
'-m-1 rounded-lg border border-orange-200 bg-orange-100 p-3 ring-2 ring-orange-500 dark:border-orange-800 dark:bg-orange-900/40'
)}
onMouseDown={handleMouseDown}
>
{config.type !== 'switch' && (
<Label className='flex items-center gap-1'>
{config.title}

@@ -2,20 +2,25 @@ import { useEffect, useRef, useState } from 'react'
import { BookOpen, Code, Info, RectangleHorizontal, RectangleVertical } from 'lucide-react'
import { useParams } from 'next/navigation'
import { Handle, type NodeProps, Position, useUpdateNodeInternals } from 'reactflow'
import { Badge, Button, Card, Tooltip, TooltipContent, TooltipTrigger } from '@/components/ui'
import { Badge } from '@/components/ui/badge'
import { Button } from '@/components/ui/button'
import { Card } from '@/components/ui/card'
import { Tooltip, TooltipContent, TooltipTrigger } from '@/components/ui/tooltip'
import { parseCronToHumanReadable } from '@/lib/schedules/utils'
import { cn, validateName } from '@/lib/utils'
import { useUserPermissionsContext } from '@/app/workspace/[workspaceId]/components/providers/workspace-permissions-provider'
import { ActionBar } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/workflow-block/components/action-bar/action-bar'
import { ConnectionBlocks } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/workflow-block/components/connection-blocks/connection-blocks'
import { SubBlock } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/workflow-block/components/sub-block/sub-block'
import type { BlockConfig, SubBlockConfig } from '@/blocks/types'
import { useCollaborativeWorkflow } from '@/hooks/use-collaborative-workflow'
import { useExecutionStore } from '@/stores/execution/store'
import { useWorkflowDiffStore } from '@/stores/workflow-diff'
import { useWorkflowRegistry } from '@/stores/workflows/registry/store'
import { useSubBlockStore } from '@/stores/workflows/subblock/store'
import { mergeSubblockState } from '@/stores/workflows/utils'
import { useWorkflowStore } from '@/stores/workflows/workflow/store'
import { useCurrentWorkflow } from '../../hooks'
import { ActionBar } from './components/action-bar/action-bar'
import { ConnectionBlocks } from './components/connection-blocks/connection-blocks'
import { SubBlock } from './components/sub-block/sub-block'

interface WorkflowBlockProps {
type: string
@@ -25,6 +30,7 @@ interface WorkflowBlockProps {
isPending?: boolean
isPreview?: boolean
subBlockValues?: Record<string, any>
blockState?: any // Block state data passed in preview mode
}

// Combine both interfaces into a single component
@@ -59,10 +65,73 @@ export function WorkflowBlock({ id, data }: NodeProps<WorkflowBlockProps>) {

// Workflow store selectors
const lastUpdate = useWorkflowStore((state) => state.lastUpdate)
const isEnabled = useWorkflowStore((state) => state.blocks[id]?.enabled ?? true)
const horizontalHandles = useWorkflowStore(
(state) => state.blocks[id]?.horizontalHandles ?? false
)

// Use the clean abstraction for current workflow state
const currentWorkflow = useCurrentWorkflow()
const currentBlock = currentWorkflow.getBlockById(id)

const isEnabled = currentBlock?.enabled ?? true

// Get diff status from the block itself (set by diff engine)
const diffStatus =
currentWorkflow.isDiffMode && currentBlock ? (currentBlock as any).is_diff : undefined

// Get field-level diff information
const fieldDiff =
currentWorkflow.isDiffMode && currentBlock ? (currentBlock as any).field_diffs : undefined

// Debug: Log diff status for this block
useEffect(() => {
if (currentWorkflow.isDiffMode) {
console.log(`[WorkflowBlock ${id}] Diff status:`, {
blockId: id,
blockName: currentBlock?.name,
isDiffMode: currentWorkflow.isDiffMode,
diffStatus,
hasFieldDiff: !!fieldDiff,
timestamp: Date.now(),
})
}
}, [id, currentWorkflow.isDiffMode, diffStatus, fieldDiff, currentBlock?.name])

// Check if this block is marked for deletion (in original workflow, not diff)
const diffAnalysis = useWorkflowDiffStore((state) => state.diffAnalysis)
const isShowingDiff = useWorkflowDiffStore((state) => state.isShowingDiff)
const isDeletedBlock = !isShowingDiff && diffAnalysis?.deleted_blocks?.includes(id)

// Debug: Log when in diff mode or when blocks are marked for deletion
useEffect(() => {
if (currentWorkflow.isDiffMode) {
console.log(
`[WorkflowBlock ${id}] Diff mode active, block exists: ${!!currentBlock}, diff status: ${diffStatus}`
)
if (fieldDiff) {
console.log(`[WorkflowBlock ${id}] Field diff:`, fieldDiff)
}
}
if (diffAnalysis && !isShowingDiff) {
console.log(`[WorkflowBlock ${id}] Diff analysis available in original workflow:`, {
deleted_blocks: diffAnalysis.deleted_blocks,
isDeletedBlock,
isShowingDiff,
})
}
if (isDeletedBlock) {
console.log(`[WorkflowBlock ${id}] Block marked for deletion in original workflow`)
}
}, [
currentWorkflow.isDiffMode,
currentBlock,
diffStatus,
fieldDiff || null,
isDeletedBlock,
diffAnalysis,
isShowingDiff,
id,
])
const horizontalHandles = data.isPreview
? (data.blockState?.horizontalHandles ?? true) // In preview mode, use blockState and default to horizontal
: useWorkflowStore((state) => state.blocks[id]?.horizontalHandles ?? true) // Changed default to true for consistency
const isWide = useWorkflowStore((state) => state.blocks[id]?.isWide ?? false)
const blockHeight = useWorkflowStore((state) => state.blocks[id]?.height ?? 0)
// Get per-block webhook status by checking if webhook is configured
@@ -310,6 +379,9 @@
if (data.isPreview && data.subBlockValues) {
// In preview mode, use the preview values
stateToUse = data.subBlockValues
} else if (currentWorkflow.isDiffMode && currentBlock) {
// In diff mode, use the diff workflow's subblock values
stateToUse = currentBlock.subBlocks || {}
} else {
// In normal mode, use merged state
const blocks = useWorkflowStore.getState().blocks
@@ -455,6 +527,11 @@
!isEnabled && 'shadow-sm',
isActive && 'animate-pulse-ring ring-2 ring-blue-500',
isPending && 'ring-2 ring-amber-500',
// Diff highlighting
diffStatus === 'new' && 'bg-green-50/50 ring-2 ring-green-500 dark:bg-green-900/10',
diffStatus === 'edited' && 'bg-orange-50/50 ring-2 ring-orange-500 dark:bg-orange-900/10',
// Deleted block highlighting (in original workflow)
isDeletedBlock && 'bg-red-50/50 ring-2 ring-red-500 dark:bg-red-900/10',
'z-[20]'
)}
>
@@ -795,6 +872,13 @@
isPreview={data.isPreview}
subBlockValues={data.subBlockValues}
disabled={!userPermissions.canEdit}
fieldDiffStatus={
fieldDiff?.changed_fields?.includes(subBlock.id)
? 'changed'
: fieldDiff?.unchanged_fields?.includes(subBlock.id)
? 'unchanged'
: undefined
}
/>
</div>
))}

@@ -1,5 +1,13 @@
|
||||
import { useEffect } from 'react'
|
||||
import { X } from 'lucide-react'
|
||||
import { BaseEdge, EdgeLabelRenderer, type EdgeProps, getSmoothStepPath } from 'reactflow'
|
||||
import { useWorkflowDiffStore } from '@/stores/workflow-diff'
|
||||
import { useCurrentWorkflow } from '../../hooks'
|
||||
|
||||
interface WorkflowEdgeProps extends EdgeProps {
|
||||
sourceHandle?: string | null
|
||||
targetHandle?: string | null
|
||||
}
|
||||
|
||||
export const WorkflowEdge = ({
|
||||
id,
|
||||
@@ -11,7 +19,11 @@ export const WorkflowEdge = ({
|
||||
targetPosition,
|
||||
data,
|
||||
style,
|
||||
}: EdgeProps) => {
|
||||
source,
|
||||
target,
|
||||
sourceHandle,
|
||||
targetHandle,
|
||||
}: WorkflowEdgeProps) => {
|
||||
const isHorizontal = sourcePosition === 'right' || sourcePosition === 'left'
|
||||
|
||||
const [edgePath, labelX, labelY] = getSmoothStepPath({
|
||||
@@ -30,11 +42,110 @@ export const WorkflowEdge = ({
|
||||
const isInsideLoop = data?.isInsideLoop ?? false
|
||||
const parentLoopId = data?.parentLoopId
|
||||
|
||||
// Merge any style props passed from parent
|
||||
// Get edge diff status
|
||||
const diffAnalysis = useWorkflowDiffStore((state) => state.diffAnalysis)
|
||||
const isShowingDiff = useWorkflowDiffStore((state) => state.isShowingDiff)
|
||||
const isDiffReady = useWorkflowDiffStore((state) => state.isDiffReady)
|
||||
const currentWorkflow = useCurrentWorkflow()
|
||||
|
||||
// Generate edge identifier using block IDs to match diff analysis from sim agent
|
||||
// This must exactly match the logic used by the sim agent diff analysis
|
||||
const generateEdgeIdentity = (
|
||||
sourceId: string,
|
||||
targetId: string,
|
||||
sourceHandle?: string | null,
|
||||
targetHandle?: string | null
|
||||
): string => {
|
||||
// The sim agent generates edge identifiers in the format: sourceId-source-targetId-target
|
||||
return `${sourceId}-source-${targetId}-target`
|
||||
}
|
||||
|
||||
// Generate edge identifier using the exact same logic as the sim agent
|
||||
const edgeIdentifier = generateEdgeIdentity(source, target, sourceHandle, targetHandle)
|
||||
|
||||
// Debug logging to understand what's happening
|
||||
useEffect(() => {
|
||||
if (edgeIdentifier && diffAnalysis?.edge_diff) {
|
||||
console.log(`[Edge Debug] Edge ${id}:`, {
|
||||
edgeIdentifier,
|
||||
sourceHandle,
|
||||
targetHandle,
|
||||
sourceBlockId: source,
|
||||
targetBlockId: target,
|
||||
isShowingDiff,
|
        isDiffMode: currentWorkflow.isDiffMode,
        edgeDiffAnalysis: diffAnalysis.edge_diff,
        // Show actual array contents to see why matching fails
        newEdgesArray: diffAnalysis.edge_diff.new_edges,
        deletedEdgesArray: diffAnalysis.edge_diff.deleted_edges,
        unchangedEdgesArray: diffAnalysis.edge_diff.unchanged_edges,
        // Check if this edge matches any in the diff analysis
        matchesNew: diffAnalysis.edge_diff.new_edges.includes(edgeIdentifier),
        matchesDeleted: diffAnalysis.edge_diff.deleted_edges.includes(edgeIdentifier),
        matchesUnchanged: diffAnalysis.edge_diff.unchanged_edges.includes(edgeIdentifier),
      })
    }
  }, [
    edgeIdentifier,
    diffAnalysis,
    isShowingDiff,
    id,
    sourceHandle,
    targetHandle,
    source,
    target,
    currentWorkflow.isDiffMode,
  ])

  // One-time debug log of full diff analysis
  useEffect(() => {
    if (diffAnalysis && id === Object.keys(currentWorkflow.blocks)[0]) {
      // Only log once per diff
      console.log('[Full Diff Analysis]:', {
        edge_diff: diffAnalysis.edge_diff,
        new_blocks: diffAnalysis.new_blocks,
        edited_blocks: diffAnalysis.edited_blocks,
        deleted_blocks: diffAnalysis.deleted_blocks,
        isShowingDiff,
        currentWorkflowEdgeCount: currentWorkflow.edges.length,
        currentWorkflowBlockCount: Object.keys(currentWorkflow.blocks).length,
      })
    }
  }, [diffAnalysis, id, currentWorkflow.blocks, currentWorkflow.edges, isShowingDiff])

  // Determine edge diff status
  let edgeDiffStatus: 'new' | 'deleted' | 'unchanged' | null = null

  // Only attempt to determine diff status if all required data is available
  if (diffAnalysis?.edge_diff && edgeIdentifier && isDiffReady) {
    if (isShowingDiff) {
      // In diff view, show new edges
      if (diffAnalysis.edge_diff.new_edges.includes(edgeIdentifier)) {
        edgeDiffStatus = 'new'
      } else if (diffAnalysis.edge_diff.unchanged_edges.includes(edgeIdentifier)) {
        edgeDiffStatus = 'unchanged'
      }
    } else {
      // In original workflow, show deleted edges
      if (diffAnalysis.edge_diff.deleted_edges.includes(edgeIdentifier)) {
        edgeDiffStatus = 'deleted'
      }
    }
  }

  // Merge any style props passed from parent with diff highlighting
  const getEdgeColor = () => {
    if (edgeDiffStatus === 'new') return '#22c55e' // Green for new edges
    if (edgeDiffStatus === 'deleted') return '#ef4444' // Red for deleted edges
    if (isSelected) return '#475569'
    return '#94a3b8'
  }

  const edgeStyle = {
    strokeWidth: isSelected ? 2.5 : 2,
    stroke: isSelected ? '#475569' : '#94a3b8',
    strokeDasharray: '5,5',
    strokeWidth: edgeDiffStatus ? 3 : isSelected ? 2.5 : 2,
    stroke: getEdgeColor(),
    strokeDasharray: edgeDiffStatus === 'deleted' ? '10,5' : '5,5', // Longer dashes for deleted
    opacity: edgeDiffStatus === 'deleted' ? 0.7 : 1,
    ...style,
  }
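The status lookup above can be factored into a small pure helper, which makes the matching logic easy to unit-test in isolation. This is an illustrative sketch, not code from the PR; `EdgeDiff` mirrors the `edge_diff` shape referenced above, and `getEdgeDiffStatus` is a hypothetical name:

```typescript
// Sketch of the edge diff status branching, mirroring the component logic above.
interface EdgeDiff {
  new_edges: string[]
  deleted_edges: string[]
  unchanged_edges: string[]
}

type EdgeDiffStatus = 'new' | 'deleted' | 'unchanged' | null

function getEdgeDiffStatus(
  diff: EdgeDiff,
  edgeIdentifier: string,
  isShowingDiff: boolean
): EdgeDiffStatus {
  if (isShowingDiff) {
    // In the diff view, highlight new edges; unchanged edges keep default styling
    if (diff.new_edges.includes(edgeIdentifier)) return 'new'
    if (diff.unchanged_edges.includes(edgeIdentifier)) return 'unchanged'
    return null
  }
  // In the original workflow view, only deleted edges are highlighted
  return diff.deleted_edges.includes(edgeIdentifier) ? 'deleted' : null
}
```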
@@ -50,11 +161,12 @@ export const WorkflowEdge = ({
        data-is-selected={isSelected ? 'true' : 'false'}
        data-is-inside-loop={isInsideLoop ? 'true' : 'false'}
      />
      {/* Animate dash offset for edge movement effect */}
      <animate
        attributeName='stroke-dashoffset'
        from='10'
        from={edgeDiffStatus === 'deleted' ? '15' : '10'}
        to='0'
        dur='1s'
        dur={edgeDiffStatus === 'deleted' ? '2s' : '1s'}
        repeatCount='indefinite'
      />
@@ -79,8 +79,8 @@ export async function applyWorkflowDiff(
      warnings: result.warnings || [],
    })

    // Trigger auto layout after successful save
    window.dispatchEvent(new CustomEvent('trigger-auto-layout'))
    // Auto layout is now handled automatically by the backend system
    // when applyAutoLayout is true in the request

    // Calculate applied operations (blocks + edges)
    const appliedOperations = (result.data?.blocksCount || 0) + (result.data?.edgesCount || 0)

@@ -151,8 +151,8 @@ export async function applyWorkflowDiff(
      edgesCount: result.edgesCount,
    })

    // Trigger auto layout after successful save
    window.dispatchEvent(new CustomEvent('trigger-auto-layout'))
    // Auto layout would need to be called separately for JSON format if needed
    // JSON format doesn't automatically apply auto layout like YAML does

    // Calculate applied operations
    const appliedOperations = (result.blocksCount || 0) + (result.edgesCount || 0)
@@ -1,10 +1,9 @@
import { dump as yamlDump, load as yamlLoad } from 'js-yaml'
import { dump as yamlDump } from 'js-yaml'
import { createLogger } from '@/lib/logs/console/logger'
import { generateWorkflowYaml } from '@/lib/workflows/yaml-generator'
import type { EditorFormat } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/workflow-text-editor/workflow-text-editor'
import { useWorkflowRegistry } from '@/stores/workflows/registry/store'
import { useSubBlockStore } from '@/stores/workflows/subblock/store'
import { useWorkflowStore } from '@/stores/workflows/workflow/store'
import type { EditorFormat } from './workflow-text-editor'

const logger = createLogger('WorkflowExporter')
@@ -69,13 +68,36 @@ export function generateFullWorkflowData() {
/**
 * Export workflow in the specified format
 */
export function exportWorkflow(format: EditorFormat): string {
export async function exportWorkflow(format: EditorFormat): Promise<string> {
  try {
    if (format === 'yaml') {
      // Use the existing YAML generator for condensed format
      // Use the YAML service for conversion
      const workflowState = useWorkflowStore.getState()
      const subBlockValues = getSubBlockValues()
      return generateWorkflowYaml(workflowState, subBlockValues)

      // Call the API route to generate YAML (server has access to API key)
      const response = await fetch('/api/workflows/yaml/convert', {
        method: 'POST',
        headers: {
          'Content-Type': 'application/json',
        },
        body: JSON.stringify({
          workflowState,
          subBlockValues,
        }),
      })

      if (!response.ok) {
        const errorData = await response.json().catch(() => null)
        throw new Error(errorData?.error || `Failed to generate YAML: ${response.statusText}`)
      }

      const result = await response.json()

      if (!result.success || !result.yaml) {
        throw new Error(result.error || 'Failed to generate YAML')
      }
      return result.yaml
    }
    // Generate full JSON format
    const fullData = generateFullWorkflowData()
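Because `exportWorkflow` goes from synchronous to async in this hunk, every call site needs either `await` or a promise chain. A hypothetical call-site sketch (with `fakeExportWorkflow` standing in for the real function) of the migration pattern, including the fallback content the modal uses on failure:

```typescript
// Stand-in for the now-async exportWorkflow; in the real code the YAML branch
// round-trips through the /api/workflows/yaml/convert route.
async function fakeExportWorkflow(format: 'yaml' | 'json'): Promise<string> {
  if (format === 'json') return JSON.stringify({ blocks: {} })
  return 'blocks: {}'
}

async function loadEditorContent(format: 'yaml' | 'json'): Promise<string> {
  try {
    return await fakeExportWorkflow(format)
  } catch {
    // Mirror the modal's fallback content on failure
    return '# Error loading workflow content'
  }
}
```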
@@ -89,9 +111,11 @@ export function exportWorkflow(format: EditorFormat): string {
/**
 * Parse workflow content based on format
 */
export function parseWorkflowContent(content: string, format: EditorFormat): any {
export async function parseWorkflowContent(content: string, format: EditorFormat): Promise<any> {
  if (format === 'yaml') {
    return yamlLoad(content)
    // For now, we'll parse YAML on the server when it's being saved
    // The workflow-text-editor should handle the actual conversion
    throw new Error('YAML parsing should be handled by the server when saving the workflow')
  }
  return JSON.parse(content)
}
@@ -44,15 +44,17 @@ export function WorkflowTextEditorModal({
  useEffect(() => {
    if (isOpen && activeWorkflowId) {
      setIsLoading(true)
      try {
        const content = exportWorkflow(format)
        setInitialContent(content)
      } catch (error) {
        logger.error('Failed to export workflow:', error)
        setInitialContent('# Error loading workflow content')
      } finally {
        setIsLoading(false)
      }
      exportWorkflow(format)
        .then((content) => {
          setInitialContent(content)
        })
        .catch((error) => {
          logger.error('Failed to export workflow:', error)
          setInitialContent('# Error loading workflow content')
        })
        .finally(() => {
          setIsLoading(false)
        })
    }
  }, [isOpen, format, activeWorkflowId])
@@ -91,7 +93,7 @@ export function WorkflowTextEditorModal({

      // Update initial content to reflect current state
      try {
        const updatedContent = exportWorkflow(contentFormat)
        const updatedContent = await exportWorkflow(contentFormat)
        setInitialContent(updatedContent)
      } catch (error) {
        logger.error('Failed to refresh content after save:', error)
@@ -1,7 +1,7 @@
'use client'

import { useCallback, useEffect, useState } from 'react'
import { dump as yamlDump, load as yamlParse } from 'js-yaml'
import { dump as yamlDump, load as yamlLoad } from 'js-yaml'
import { AlertCircle, Check, FileCode, Save } from 'lucide-react'
import { Alert, AlertDescription } from '@/components/ui/alert'
import { Button } from '@/components/ui/button'

@@ -9,7 +9,7 @@ import { Tabs, TabsList, TabsTrigger } from '@/components/ui/tabs'
import { Tooltip, TooltipContent, TooltipTrigger } from '@/components/ui/tooltip'
import { createLogger } from '@/lib/logs/console/logger'
import { cn } from '@/lib/utils'
import { CodeEditor } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/workflow-block/components/sub-block/components/tool-input/components/code-editor/code-editor'
import { CodeEditor } from '../workflow-block/components/sub-block/components/tool-input/components/code-editor/code-editor'

const logger = createLogger('WorkflowTextEditor')
@@ -53,7 +53,7 @@ export function WorkflowTextEditor({
  const [hasUnsavedChanges, setHasUnsavedChanges] = useState(false)

  // Validate content based on format
  const validateContent = useCallback((text: string, fmt: EditorFormat): ValidationError[] => {
  const validateSyntax = useCallback((text: string, fmt: EditorFormat): ValidationError[] => {
    const errors: ValidationError[] = []

    if (!text.trim()) {

@@ -62,11 +62,12 @@ export function WorkflowTextEditor({

    try {
      if (fmt === 'yaml') {
        yamlParse(text)
        // Basic YAML syntax validation using js-yaml
        yamlLoad(text)
      } else if (fmt === 'json') {
        JSON.parse(text)
      }
    } catch (error) {
    } catch (error: any) {
      const errorMessage = error instanceof Error ? error.message : 'Parse error'

      // Extract line/column info if available

@@ -94,7 +95,8 @@ export function WorkflowTextEditor({
      let parsed: any

      if (fromFormat === 'yaml') {
        parsed = yamlParse(text)
        // Use basic YAML parsing for synchronous conversion
        parsed = yamlLoad(text)
      } else {
        parsed = JSON.parse(text)
      }

@@ -122,13 +124,13 @@ export function WorkflowTextEditor({
      setHasUnsavedChanges(newContent !== initialValue)

      // Validate on change
      const errors = validateContent(newContent, currentFormat)
      const errors = validateSyntax(newContent, currentFormat)
      setValidationErrors(errors)

      // Clear save result when editing
      setSaveResult(null)
    },
    [initialValue, currentFormat, validateContent]
    [initialValue, currentFormat, validateSyntax]
  )

  // Handle format changes

@@ -143,13 +145,13 @@ export function WorkflowTextEditor({
      setContent(convertedContent)

      // Validate converted content
      const errors = validateContent(convertedContent, newFormat)
      const errors = validateSyntax(convertedContent, newFormat)
      setValidationErrors(errors)

      // Notify parent
      onFormatChange?.(newFormat)
    },
    [content, currentFormat, convertFormat, validateContent, onFormatChange]
    [content, currentFormat, convertFormat, validateSyntax, onFormatChange]
  )

  // Handle save
@@ -0,0 +1,5 @@
|
||||
// Export the current workflow abstraction
|
||||
|
||||
export { type CurrentWorkflow, useCurrentWorkflow } from './use-current-workflow'
|
||||
// Export other workflow-related hooks
|
||||
export { useWorkflowExecution } from './use-workflow-execution'
|
||||
@@ -0,0 +1,86 @@
|
||||
import { useMemo } from 'react'
|
||||
import type { Edge } from 'reactflow'
|
||||
import { useWorkflowDiffStore } from '@/stores/workflow-diff/store'
|
||||
import type { DeploymentStatus } from '@/stores/workflows/registry/types'
|
||||
import { useWorkflowStore } from '@/stores/workflows/workflow/store'
|
||||
import type { BlockState, Loop, Parallel, WorkflowState } from '@/stores/workflows/workflow/types'
|
||||
|
||||
/**
|
||||
* Interface for the current workflow abstraction
|
||||
*/
|
||||
export interface CurrentWorkflow {
|
||||
// Current workflow state properties
|
||||
blocks: Record<string, BlockState>
|
||||
edges: Edge[]
|
||||
loops: Record<string, Loop>
|
||||
parallels: Record<string, Parallel>
|
||||
lastSaved?: number
|
||||
isDeployed?: boolean
|
||||
deployedAt?: Date
|
||||
deploymentStatuses?: Record<string, DeploymentStatus>
|
||||
needsRedeployment?: boolean
|
||||
hasActiveWebhook?: boolean
|
||||
|
||||
// Mode information
|
||||
isDiffMode: boolean
|
||||
isNormalMode: boolean
|
||||
|
||||
// Full workflow state (for cases that need the complete object)
|
||||
workflowState: WorkflowState
|
||||
|
||||
// Helper methods
|
||||
getBlockById: (blockId: string) => BlockState | undefined
|
||||
getBlockCount: () => number
|
||||
getEdgeCount: () => number
|
||||
hasBlocks: () => boolean
|
||||
hasEdges: () => boolean
|
||||
}
|
||||
|
||||
/**
|
||||
* Clean abstraction for accessing the current workflow state.
|
||||
* Automatically handles diff vs normal mode without exposing the complexity to consumers.
|
||||
*/
|
||||
export function useCurrentWorkflow(): CurrentWorkflow {
|
||||
// Get normal workflow state
|
||||
const normalWorkflow = useWorkflowStore((state) => state.getWorkflowState())
|
||||
|
||||
// Get diff state - now including isDiffReady
|
||||
const { isShowingDiff, isDiffReady, diffWorkflow } = useWorkflowDiffStore()
|
||||
|
||||
// Create the abstracted interface
|
||||
const currentWorkflow = useMemo((): CurrentWorkflow => {
|
||||
// Determine which workflow to use - only use diff if it's ready
|
||||
const shouldUseDiff = isShowingDiff && isDiffReady && !!diffWorkflow
|
||||
const activeWorkflow = shouldUseDiff ? diffWorkflow : normalWorkflow
|
||||
|
||||
return {
|
||||
// Current workflow state
|
||||
blocks: activeWorkflow.blocks,
|
||||
edges: activeWorkflow.edges,
|
||||
loops: activeWorkflow.loops || {},
|
||||
parallels: activeWorkflow.parallels || {},
|
||||
lastSaved: activeWorkflow.lastSaved,
|
||||
isDeployed: activeWorkflow.isDeployed,
|
||||
deployedAt: activeWorkflow.deployedAt,
|
||||
deploymentStatuses: activeWorkflow.deploymentStatuses,
|
||||
needsRedeployment: activeWorkflow.needsRedeployment,
|
||||
hasActiveWebhook: activeWorkflow.hasActiveWebhook,
|
||||
|
||||
// Mode information - update to reflect ready state
|
||||
isDiffMode: shouldUseDiff,
|
||||
isNormalMode: !shouldUseDiff,
|
||||
|
||||
// Full workflow state (for cases that need the complete object)
|
||||
workflowState: activeWorkflow,
|
||||
|
||||
// Helper methods
|
||||
getBlockById: (blockId: string) => activeWorkflow.blocks[blockId],
|
||||
getBlockCount: () => Object.keys(activeWorkflow.blocks).length,
|
||||
getEdgeCount: () => activeWorkflow.edges.length,
|
||||
hasBlocks: () => Object.keys(activeWorkflow.blocks).length > 0,
|
||||
hasEdges: () => activeWorkflow.edges.length > 0,
|
||||
}
|
||||
}, [normalWorkflow, isShowingDiff, isDiffReady, diffWorkflow])
|
||||
|
||||
return currentWorkflow
|
||||
}
|
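The heart of the hook is the diff-vs-normal selection rule, which reduces to a small pure function. A testable sketch with a pared-down state shape (the `MiniWorkflow` type and `selectActiveWorkflow` name are illustrative, not part of the PR):

```typescript
// Simplified sketch of the hook's selection rule: only use the diff workflow
// when the diff is showing, fully ready, and actually present.
interface MiniWorkflow {
  blocks: Record<string, unknown>
}

function selectActiveWorkflow(
  normal: MiniWorkflow,
  diff: MiniWorkflow | null,
  isShowingDiff: boolean,
  isDiffReady: boolean
): { workflow: MiniWorkflow; isDiffMode: boolean } {
  const shouldUseDiff = isShowingDiff && isDiffReady && !!diff
  return { workflow: shouldUseDiff ? diff! : normal, isDiffMode: shouldUseDiff }
}
```

Gating on `isDiffReady` is what prevents consumers from rendering a half-streamed diff while the copilot is still applying edits.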
@@ -16,7 +16,7 @@ import { useEnvironmentStore } from '@/stores/settings/environment/store'
import { useGeneralStore } from '@/stores/settings/general/store'
import { useWorkflowRegistry } from '@/stores/workflows/registry/store'
import { mergeSubblockState } from '@/stores/workflows/utils'
import { useWorkflowStore } from '@/stores/workflows/workflow/store'
import { useCurrentWorkflow } from './use-current-workflow'

const logger = createLogger('useWorkflowExecution')

@@ -43,7 +43,7 @@ interface DebugValidationResult {
}

export function useWorkflowExecution() {
  const { blocks, edges, loops, parallels } = useWorkflowStore()
  const currentWorkflow = useCurrentWorkflow()
  const { activeWorkflowId } = useWorkflowRegistry()
  const { toggleConsole } = useConsoleStore()
  const { getAllVariables } = useEnvironmentStore()
@@ -323,32 +323,15 @@ export function useWorkflowExecution() {

          await Promise.all(streamReadingPromises)

          // Handle both ExecutionResult and StreamingExecution types - process regardless of success status
          if (result) {
            // Handle both ExecutionResult and StreamingExecution types
            const executionResult =
              result && typeof result === 'object' && 'execution' in result
                ? (result.execution as ExecutionResult)
                : (result as ExecutionResult)

            if (!executionResult.metadata) {
              executionResult.metadata = { duration: 0, startTime: new Date().toISOString() }
          if (result && 'success' in result) {
            if (!result.metadata) {
              result.metadata = { duration: 0, startTime: new Date().toISOString() }
            }
            ;(executionResult.metadata as any).source = 'chat'

            // Update streamed content and apply tokenization - process logs regardless of success status
            if (executionResult.logs) {
              // Add newlines between different agent outputs for better readability
              const processedOutputs = new Set<string>()
              executionResult.logs.forEach((log: BlockLog) => {
            ;(result.metadata as any).source = 'chat'
            // Update streamed content and apply tokenization
            if (result.logs) {
              result.logs.forEach((log: BlockLog) => {
                if (streamedContent.has(log.blockId)) {
                  const content = streamedContent.get(log.blockId)
                  if (log.output && content) {
                    const separator = processedOutputs.size > 0 ? '\n\n' : ''
                    log.output.content = separator + content
                    processedOutputs.add(log.blockId)
                  }

                  // For console display, show the actual structured block output instead of formatted streaming content
                  // This ensures console logs match the block state structure
                  // Use replaceOutput to completely replace the output instead of merging

@@ -365,19 +348,14 @@ export function useWorkflowExecution() {
              })

              // Process all logs for streaming tokenization
              const processedCount = processStreamingBlockLogs(
                executionResult.logs,
                streamedContent
              )
              const processedCount = processStreamingBlockLogs(result.logs, streamedContent)
              logger.info(`Processed ${processedCount} blocks for streaming tokenization`)
            }

            controller.enqueue(
              encoder.encode(
                `data: ${JSON.stringify({ event: 'final', data: executionResult })}\n\n`
              )
              encoder.encode(`data: ${JSON.stringify({ event: 'final', data: result })}\n\n`)
            )
            persistLogs(executionId, executionResult).catch((err) =>
            persistLogs(executionId, result).catch((err) =>
              logger.error('Error persisting logs:', err)
            )
          }
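The `controller.enqueue(...)` call above emits server-sent-events framing: each event is a `data: <json>` line terminated by a blank line, which is what EventSource-style parsers on the client split on. A minimal sketch of that framing (the helper name is illustrative):

```typescript
// Sketch of the SSE framing used above: one `data:` line plus the blank-line
// terminator that marks the end of an event.
function frameSseEvent(event: string, data: unknown): string {
  return `data: ${JSON.stringify({ event, data })}\n\n`
}
```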
@@ -433,10 +411,7 @@ export function useWorkflowExecution() {
|
||||
},
|
||||
[
|
||||
activeWorkflowId,
|
||||
blocks,
|
||||
edges,
|
||||
loops,
|
||||
parallels,
|
||||
currentWorkflow,
|
||||
toggleConsole,
|
||||
getAllVariables,
|
||||
getVariablesByWorkflowId,
|
@@ -455,12 +430,62 @@ export function useWorkflowExecution() {
      onStream?: (se: StreamingExecution) => Promise<void>,
      executionId?: string
    ): Promise<ExecutionResult | StreamingExecution> => {
      // Use the mergeSubblockState utility to get all block states
      const mergedStates = mergeSubblockState(blocks)
      // Use currentWorkflow but check if we're in diff mode
      const {
        blocks: workflowBlocks,
        edges: workflowEdges,
        loops: workflowLoops,
        parallels: workflowParallels,
      } = currentWorkflow

      // Filter out blocks without type (these are layout-only blocks)
      const validBlocks = Object.entries(workflowBlocks).reduce(
        (acc, [blockId, block]) => {
          if (block?.type) {
            acc[blockId] = block
          }
          return acc
        },
        {} as typeof workflowBlocks
      )

      const isExecutingFromChat =
        workflowInput && typeof workflowInput === 'object' && 'input' in workflowInput

      logger.info('Executing workflow', {
        isDiffMode: currentWorkflow.isDiffMode,
        isExecutingFromChat,
        totalBlocksCount: Object.keys(workflowBlocks).length,
        validBlocksCount: Object.keys(validBlocks).length,
        edgesCount: workflowEdges.length,
      })

      // Debug: Check for blocks with undefined types before merging
      Object.entries(workflowBlocks).forEach(([blockId, block]) => {
        if (!block || !block.type) {
          logger.error('Found block with undefined type before merging:', { blockId, block })
        }
      })

      // Merge subblock states from the appropriate store
      const mergedStates = mergeSubblockState(validBlocks)

      // Debug: Check for blocks with undefined types after merging
      Object.entries(mergedStates).forEach(([blockId, block]) => {
        if (!block || !block.type) {
          logger.error('Found block with undefined type after merging:', { blockId, block })
        }
      })

      // Filter out trigger blocks for manual execution
      const filteredStates = Object.entries(mergedStates).reduce(
        (acc, [id, block]) => {
          // Skip blocks with undefined type
          if (!block || !block.type) {
            logger.warn(`Skipping block with undefined type: ${id}`, block)
            return acc
          }

          const blockConfig = getBlock(block.type)
          const isTriggerBlock = blockConfig?.category === 'triggers'

@@ -513,7 +538,7 @@ export function useWorkflowExecution() {
        return blockConfig?.category === 'triggers'
      })

      const filteredEdges = edges.filter(
      const filteredEdges = workflowEdges.filter(
        (edge) => !triggerBlockIds.includes(edge.source) && !triggerBlockIds.includes(edge.target)
      )
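The "drop blocks without a type" step above is a plain `reduce` over the block map, keeping only typed (non-layout) blocks before subblock merging. A self-contained sketch with a pared-down block shape (`BlockLike` and `filterValidBlocks` are illustrative names, not code from the PR):

```typescript
// Sketch of the pre-serialization filter: layout-only blocks have no `type`
// and must not reach the serializer.
interface BlockLike {
  type?: string
}

function filterValidBlocks(blocks: Record<string, BlockLike>): Record<string, BlockLike> {
  return Object.entries(blocks).reduce(
    (acc, [blockId, block]) => {
      if (block?.type) acc[blockId] = block // keep only typed blocks
      return acc
    },
    {} as Record<string, BlockLike>
  )
}
```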
@@ -521,18 +546,13 @@ export function useWorkflowExecution() {
      const workflow = new Serializer().serializeWorkflow(
        filteredStates,
        filteredEdges,
        loops,
        parallels,
        true // Enable validation during execution
        workflowLoops,
        workflowParallels
      )

      // Determine if this is a chat execution
      const isChatExecution =
        workflowInput && typeof workflowInput === 'object' && 'input' in workflowInput

      // If this is a chat execution, get the selected outputs
      let selectedOutputIds: string[] | undefined
      if (isChatExecution && activeWorkflowId) {
      if (isExecutingFromChat && activeWorkflowId) {
        // Get selected outputs from chat store
        const chatStore = await import('@/stores/panel/chat/store').then((mod) => mod.useChatStore)
        selectedOutputIds = chatStore.getState().getSelectedWorkflowOutput(activeWorkflowId)

@@ -546,7 +566,7 @@ export function useWorkflowExecution() {
        workflowInput,
        workflowVariables,
        contextExtensions: {
          stream: isChatExecution,
          stream: isExecutingFromChat,
          selectedOutputIds,
          edges: workflow.connections.map((conn) => ({
            source: conn.source,

@@ -610,54 +630,6 @@ export function useWorkflowExecution() {
      setIsDebugging(false)
      setActiveBlocks(new Set())

      // Add the error to the console so users can see what went wrong
      // This ensures serialization errors appear in the console just like execution errors
      if (activeWorkflowId) {
        const consoleStore = useConsoleStore.getState()

        // Try to extract block information from the error message
        // Serialization errors typically have format: "BlockName is missing required fields: FieldName"
        let blockName = 'Workflow Execution'
        let blockId = 'workflow-error'
        let blockType = 'workflow'

        const blockErrorMatch = errorMessage.match(/^(.+?)\s+is missing required fields/)
        if (blockErrorMatch) {
          const failedBlockName = blockErrorMatch[1]
          blockName = failedBlockName

          // Try to find the actual block in the current workflow to get its ID and type
          const allBlocks = Object.values(blocks)
          const failedBlock = allBlocks.find((block) => block.name === failedBlockName)
          if (failedBlock) {
            blockId = failedBlock.id
            blockType = failedBlock.type
          } else {
            // Fallback: use the block name as ID if we can't find the actual block
            blockId = failedBlockName.toLowerCase().replace(/\s+/g, '-')
            blockType = 'unknown'
          }
        }

        consoleStore.addConsole({
          workflowId: activeWorkflowId,
          blockId: blockId,
          blockName: blockName,
          blockType: blockType,
          success: false,
          error: errorMessage,
          output: {},
          startedAt: new Date().toISOString(),
          endedAt: new Date().toISOString(),
          durationMs: 0,
        })

        // Auto-open the console so users can see the error (only if it's not already open)
        if (!consoleStore.isOpen) {
          toggleConsole()
        }
      }

      let notificationMessage = 'Workflow execution failed'
      if (error?.request?.url) {
      if (error.request.url && error.request.url.trim() !== '') {
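The error-reporting branch above leans on the regex `/^(.+?)\s+is missing required fields/` to recover the failing block's name from a serialization error message. The extraction can be isolated and tested on its own (the `extractFailedBlockName` wrapper is an illustrative name; the regex is the one used above):

```typescript
// Sketch of the serialization-error parsing: pull the block name out of
// messages shaped like "BlockName is missing required fields: FieldName".
function extractFailedBlockName(errorMessage: string): string | null {
  const match = errorMessage.match(/^(.+?)\s+is missing required fields/)
  return match ? match[1] : null
}
```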
@@ -0,0 +1,349 @@
/**
 * Standalone workflow execution utilities
 * This allows workflow execution with proper logging from both React hooks and tools
 */

import { v4 as uuidv4 } from 'uuid'
import { createLogger } from '@/lib/logs/console/logger'
import { buildTraceSpans } from '@/lib/logs/execution/trace-spans/trace-spans'
import { getBlock } from '@/blocks'
import type { BlockOutput } from '@/blocks/types'
import { Executor } from '@/executor'
import type { ExecutionResult, StreamingExecution } from '@/executor/types'
import { Serializer } from '@/serializer'
import type { SerializedWorkflow } from '@/serializer/types'
import { useExecutionStore } from '@/stores/execution/store'
import { useVariablesStore } from '@/stores/panel/variables/store'
import { useEnvironmentStore } from '@/stores/settings/environment/store'
import { useWorkflowDiffStore } from '@/stores/workflow-diff/store'
import { useWorkflowRegistry } from '@/stores/workflows/registry/store'
import { mergeSubblockState } from '@/stores/workflows/utils'
import { useWorkflowStore } from '@/stores/workflows/workflow/store'

const logger = createLogger('WorkflowExecutionUtils')

// Interface for executor options (copied from useWorkflowExecution)
interface ExecutorOptions {
  workflow: SerializedWorkflow
  currentBlockStates?: Record<string, BlockOutput>
  envVarValues?: Record<string, string>
  workflowInput?: any
  workflowVariables?: Record<string, any>
  contextExtensions?: {
    stream?: boolean
    selectedOutputIds?: string[]
    edges?: Array<{ source: string; target: string }>
    onStream?: (streamingExecution: StreamingExecution) => Promise<void>
    executionId?: string
  }
}

export interface WorkflowExecutionOptions {
  workflowInput?: any
  executionId?: string
  onStream?: (se: StreamingExecution) => Promise<void>
}

export interface WorkflowExecutionContext {
  activeWorkflowId: string
  currentWorkflow: any
  getAllVariables: () => any
  getVariablesByWorkflowId: (workflowId: string) => any[]
  setExecutor: (executor: Executor) => void
}

/**
 * Get the current workflow execution context from stores
 */
export function getWorkflowExecutionContext(): WorkflowExecutionContext {
  const { activeWorkflowId } = useWorkflowRegistry.getState()
  if (!activeWorkflowId) {
    throw new Error('No active workflow found')
  }

  const workflowState = useWorkflowStore.getState().getWorkflowState()
  const { isShowingDiff, isDiffReady, diffWorkflow } = useWorkflowDiffStore.getState()

  // Determine which workflow to use - same logic as useCurrentWorkflow
  const shouldUseDiff = isShowingDiff && isDiffReady && !!diffWorkflow
  const currentWorkflow = shouldUseDiff ? diffWorkflow : workflowState

  const { getAllVariables } = useEnvironmentStore.getState()
  const { getVariablesByWorkflowId } = useVariablesStore.getState()
  const { setExecutor } = useExecutionStore.getState()

  return {
    activeWorkflowId,
    currentWorkflow,
    getAllVariables,
    getVariablesByWorkflowId,
    setExecutor,
  }
}

/**
 * Execute a workflow with proper state management and logging
 * This is the core execution logic extracted from useWorkflowExecution
 */
export async function executeWorkflowWithLogging(
  context: WorkflowExecutionContext,
  options: WorkflowExecutionOptions = {}
): Promise<ExecutionResult | StreamingExecution> {
  const {
    activeWorkflowId,
    currentWorkflow,
    getAllVariables,
    getVariablesByWorkflowId,
    setExecutor,
  } = context
  const { workflowInput, onStream, executionId } = options

  const {
    blocks: workflowBlocks,
    edges: workflowEdges,
    loops: workflowLoops,
    parallels: workflowParallels,
  } = currentWorkflow

  // Filter out blocks without type (these are layout-only blocks)
  const validBlocks = Object.entries(workflowBlocks).reduce(
    (acc, [blockId, block]) => {
      if (block && typeof block === 'object' && 'type' in block && block.type) {
        acc[blockId] = block
      }
      return acc
    },
    {} as typeof workflowBlocks
  )

  const isExecutingFromChat =
    workflowInput && typeof workflowInput === 'object' && 'input' in workflowInput

  logger.info('Executing workflow', {
    isDiffMode: (currentWorkflow as any).isDiffMode,
    isExecutingFromChat,
    totalBlocksCount: Object.keys(workflowBlocks).length,
    validBlocksCount: Object.keys(validBlocks).length,
    edgesCount: workflowEdges.length,
  })

  // Merge subblock states from the appropriate store
  const mergedStates = mergeSubblockState(validBlocks)

  // Filter out trigger blocks for manual execution
  const filteredStates = Object.entries(mergedStates).reduce(
    (acc, [id, block]) => {
      // Skip blocks with undefined type
      if (!block || !block.type) {
        logger.warn(`Skipping block with undefined type: ${id}`, block)
        return acc
      }

      const blockConfig = getBlock(block.type)
      const isTriggerBlock = blockConfig?.category === 'triggers'

      // Skip trigger blocks during manual execution
      if (!isTriggerBlock) {
        acc[id] = block
      }
      return acc
    },
    {} as typeof mergedStates
  )

  const currentBlockStates = Object.entries(filteredStates).reduce(
    (acc, [id, block]) => {
      acc[id] = Object.entries(block.subBlocks).reduce(
        (subAcc, [key, subBlock]) => {
          subAcc[key] = subBlock.value
          return subAcc
        },
        {} as Record<string, any>
      )
      return acc
    },
    {} as Record<string, Record<string, any>>
  )

  // Get environment variables
  const envVars = getAllVariables()
  const envVarValues = Object.entries(envVars).reduce(
    (acc, [key, variable]: [string, any]) => {
      acc[key] = variable.value
      return acc
    },
    {} as Record<string, string>
  )

  // Get workflow variables
  const workflowVars = getVariablesByWorkflowId(activeWorkflowId)
  const workflowVariables = workflowVars.reduce(
    (acc, variable: any) => {
      acc[variable.id] = variable
      return acc
    },
    {} as Record<string, any>
  )

  // Filter edges to exclude connections to/from trigger blocks
  const triggerBlockIds = Object.keys(mergedStates).filter((id) => {
    const blockConfig = getBlock(mergedStates[id].type)
    return blockConfig?.category === 'triggers'
  })

  const filteredEdges = workflowEdges.filter(
    (edge: any) => !triggerBlockIds.includes(edge.source) && !triggerBlockIds.includes(edge.target)
  )

  // Create serialized workflow with filtered blocks and edges
  const workflow = new Serializer().serializeWorkflow(
    filteredStates,
    filteredEdges,
    workflowLoops,
    workflowParallels
  )

  // If this is a chat execution, get the selected outputs
  let selectedOutputIds: string[] | undefined
  if (isExecutingFromChat) {
    // Get selected outputs from chat store
    const chatStore = await import('@/stores/panel/chat/store').then((mod) => mod.useChatStore)
    selectedOutputIds = chatStore.getState().getSelectedWorkflowOutput(activeWorkflowId)
  }

  // Create executor options
  const executorOptions: ExecutorOptions = {
    workflow,
    currentBlockStates,
    envVarValues,
    workflowInput,
    workflowVariables,
    contextExtensions: {
      stream: isExecutingFromChat,
      selectedOutputIds,
      edges: workflow.connections.map((conn) => ({
        source: conn.source,
        target: conn.target,
      })),
      onStream,
      executionId,
    },
  }

  // Create executor and store in global state
  const newExecutor = new Executor(executorOptions)
  setExecutor(newExecutor)

  // Execute workflow
  return newExecutor.execute(activeWorkflowId)
}

/**
 * Persist execution logs to the backend
 */
export async function persistExecutionLogs(
  activeWorkflowId: string,
  executionId: string,
  result: ExecutionResult,
  streamContent?: string
): Promise<string> {
  try {
    // Build trace spans from execution logs
    const { traceSpans, totalDuration } = buildTraceSpans(result)

    // Add trace spans to the execution result
    const enrichedResult = {
      ...result,
      traceSpans,
      totalDuration,
    }

    // If this was a streaming response and we have the final content, update it
    if (streamContent && result.output && typeof streamContent === 'string') {
      // Update the content with the final streaming content
|
||||
enrichedResult.output.content = streamContent
|
||||
|
||||
// Also update any block logs to include the content where appropriate
|
||||
if (enrichedResult.logs) {
|
||||
// Get the streaming block ID from metadata if available
|
||||
const streamingBlockId = (result.metadata as any)?.streamingBlockId || null
|
||||
|
||||
for (const log of enrichedResult.logs) {
|
||||
// Only update the specific LLM block (agent/router) that was streamed
|
||||
const isStreamingBlock = streamingBlockId && log.blockId === streamingBlockId
|
||||
if (
|
||||
isStreamingBlock &&
|
||||
(log.blockType === 'agent' || log.blockType === 'router') &&
|
||||
log.output
|
||||
) {
|
||||
log.output.content = streamContent
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
const response = await fetch(`/api/workflows/${activeWorkflowId}/log`, {
|
||||
method: 'POST',
|
||||
headers: {
|
||||
'Content-Type': 'application/json',
|
||||
},
|
||||
body: JSON.stringify({
|
||||
executionId,
|
||||
result: enrichedResult,
|
||||
}),
|
||||
})
|
||||
|
||||
if (!response.ok) {
|
||||
throw new Error('Failed to persist logs')
|
||||
}
|
||||
|
||||
return executionId
|
||||
} catch (error) {
|
||||
logger.error('Error persisting logs:', error)
|
||||
return executionId
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Execute workflow with full logging support
|
||||
* This combines execution + log persistence in a single function
|
||||
*/
|
||||
export async function executeWorkflowWithFullLogging(
|
||||
options: WorkflowExecutionOptions = {}
|
||||
): Promise<ExecutionResult | StreamingExecution> {
|
||||
const context = getWorkflowExecutionContext()
|
||||
const executionId = options.executionId || uuidv4()
|
||||
|
||||
try {
|
||||
const result = await executeWorkflowWithLogging(context, {
|
||||
...options,
|
||||
executionId,
|
||||
})
|
||||
|
||||
// For ExecutionResult (not streaming), persist logs
|
||||
if (result && 'success' in result) {
|
||||
// Don't await log persistence to avoid blocking the UI
|
||||
persistExecutionLogs(context.activeWorkflowId, executionId, result as ExecutionResult).catch(
|
||||
(err) => {
|
||||
logger.error('Error persisting logs:', { error: err })
|
||||
}
|
||||
)
|
||||
}
|
||||
|
||||
return result
|
||||
} catch (error: any) {
|
||||
// Create error result and persist it
|
||||
const errorResult: ExecutionResult = {
|
||||
success: false,
|
||||
output: { error: error?.message || 'Unknown error' },
|
||||
logs: [],
|
||||
metadata: { duration: 0, startTime: new Date().toISOString() },
|
||||
}
|
||||
|
||||
persistExecutionLogs(context.activeWorkflowId, executionId, errorResult).catch((err) => {
|
||||
logger.error('Error persisting logs:', { error: err })
|
||||
})
|
||||
|
||||
throw error
|
||||
}
|
||||
}
|
||||
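`executeWorkflowWithFullLogging` deliberately does not await `persistExecutionLogs`: the caller gets the execution result immediately, and persistence failures are routed into the logger instead of the UI. A minimal self-contained sketch of that fire-and-forget pattern, using hypothetical stand-in names (`fakePersist`, `runWorkflow`) rather than the Sim APIs:

```typescript
// Fire-and-forget persistence: schedule the write, attach a .catch so a
// rejection never becomes an unhandled error, and return without awaiting.
type Result = { success: boolean; output: unknown }

const persisted: string[] = []

async function fakePersist(executionId: string, _result: Result): Promise<string> {
  persisted.push(executionId) // stand-in for the POST to the log endpoint
  return executionId
}

async function runWorkflow(executionId: string): Promise<Result> {
  const result: Result = { success: true, output: { value: 42 } }
  // Not awaited: the caller is unblocked; errors go to a logger instead
  // of propagating to the UI.
  fakePersist(executionId, result).catch((err) => console.error('persist failed', err))
  return result
}

runWorkflow('exec-1').then((r) => console.log(r.success))
```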
@@ -456,43 +456,6 @@ export const resizeLoopNodes = (
  })
}

export interface LayoutOptions {
  horizontalSpacing?: number
  verticalSpacing?: number
  startX?: number
  startY?: number
  alignByLayer?: boolean
  handleOrientation?: 'auto' | 'horizontal' | 'vertical'
}

/**
 * Detects the predominant handle orientation in the workflow
 * @param blocks Block states from workflow store
 * @returns 'horizontal' if most blocks use horizontal handles, 'vertical' otherwise
 */
export const detectHandleOrientation = (blocks: Record<string, any>): 'horizontal' | 'vertical' => {
  const topLevelBlocks = Object.values(blocks).filter((block) => !block.data?.parentId)

  if (topLevelBlocks.length === 0) {
    return 'horizontal'
  }

  let horizontalCount = 0
  let verticalCount = 0

  topLevelBlocks.forEach((block) => {
    if (block.horizontalHandles === true) {
      horizontalCount++
    } else if (block.horizontalHandles === false) {
      verticalCount++
    } else {
      horizontalCount++
    }
  })

  return horizontalCount >= verticalCount ? 'horizontal' : 'vertical'
}
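`detectHandleOrientation` is a simple majority vote over top-level blocks, with horizontal as both the default for unflagged blocks and the tie-breaker. A standalone sketch of that logic (the `BlockLike` shape is an assumption, trimmed down from the store's block state):

```typescript
// Majority-vote orientation detection: blocks without an explicit
// horizontalHandles flag count as horizontal, and child blocks
// (those with a parentId) are ignored.
type BlockLike = { horizontalHandles?: boolean; data?: { parentId?: string } }

function detectOrientation(blocks: Record<string, BlockLike>): 'horizontal' | 'vertical' {
  const topLevel = Object.values(blocks).filter((b) => !b.data?.parentId)
  if (topLevel.length === 0) return 'horizontal'

  let horizontal = 0
  let vertical = 0
  for (const block of topLevel) {
    if (block.horizontalHandles === false) vertical++
    else horizontal++ // true and undefined both count as horizontal
  }
  // Ties resolve to horizontal, matching the >= comparison above.
  return horizontal >= vertical ? 'horizontal' : 'vertical'
}

console.log(detectOrientation({ a: {}, b: { horizontalHandles: false } })) // 'horizontal' (tie)
```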

/**
 * Analyzes the workflow graph using topological sort to determine execution layers
 * and properly handle parallel execution paths, disabled blocks, and terminal blocks
@@ -669,488 +632,45 @@ export const analyzeWorkflowGraph = (
  }
}

/**
 * Calculates auto-layout positions for workflow blocks with improved spacing and alignment
 * Enhanced to handle both horizontal and vertical handle orientations
 * @param blocks Block states from workflow store
 * @param edges Edge connections from workflow store
 * @param options Layout configuration options
 * @returns Map of block IDs to their new positions
 */
export const calculateAutoLayout = (
  blocks: Record<string, any>,
  edges: any[],
  options: LayoutOptions = {}
): Map<string, { x: number; y: number }> => {
  const {
    horizontalSpacing = 600,
    verticalSpacing = 200,
    startX = 300,
    startY = 300,
    alignByLayer = true,
    handleOrientation = 'auto',
  } = options

  const newPositions = new Map<string, { x: number; y: number }>()

  const topLevelBlocks = Object.fromEntries(
    Object.entries(blocks).filter(([_, block]) => !block.data?.parentId)
  )

  if (Object.keys(topLevelBlocks).length === 0) {
    return newPositions
  }

  let actualOrientation: 'horizontal' | 'vertical'
  if (handleOrientation === 'auto') {
    actualOrientation = detectHandleOrientation(blocks)
  } else {
    actualOrientation = handleOrientation
  }

  if (alignByLayer) {
    const { parallelGroups, maxLayer, disabledBlocks, terminalBlocks, orphanedBlocks } =
      analyzeWorkflowGraph(topLevelBlocks, edges)

    if (actualOrientation === 'horizontal') {
      const calculateLayerSpacing = (currentLayer: number, nextLayer: number): number => {
        const currentLayerGroups = parallelGroups.get(currentLayer) || []
        const nextLayerGroups = parallelGroups.get(nextLayer) || []

        let maxCurrentWidth = 0
        currentLayerGroups.forEach((group: string[]) => {
          group.forEach((nodeId: string) => {
            maxCurrentWidth = Math.max(maxCurrentWidth, getBlockWidth(blocks, nodeId))
          })
        })

        let maxNextWidth = 0
        nextLayerGroups.forEach((group: string[]) => {
          group.forEach((nodeId: string) => {
            maxNextWidth = Math.max(maxNextWidth, getBlockWidth(blocks, nodeId))
          })
        })

        const baseSpacing = horizontalSpacing
        const widthAdjustment = Math.max(maxCurrentWidth, maxNextWidth) - 350 // 350 is standard width
        const connectionTagSpace = 100

        const isOrphanedLayer =
          currentLayer > maxLayer - 2 &&
          (currentLayerGroups.some((group) => group.some((nodeId) => orphanedBlocks.has(nodeId))) ||
            nextLayerGroups.some((group) => group.some((nodeId) => orphanedBlocks.has(nodeId))))
        const orphanedSpacing = isOrphanedLayer ? 200 : 0

        return baseSpacing + widthAdjustment + connectionTagSpace + orphanedSpacing
      }

      let currentLayerX = startX

      for (let layer = 0; layer <= maxLayer; layer++) {
        const groups = parallelGroups.get(layer) || []

        let totalHeight = 0
        const groupHeights: number[] = []

        groups.forEach((group) => {
          const groupHeight = calculateGroupDimensions(
            group,
            orphanedBlocks,
            disabledBlocks,
            terminalBlocks,
            blocks,
            verticalSpacing,
            getBlockHeight
          )
          groupHeights.push(groupHeight)
          totalHeight += groupHeight
        })

        if (groups.length > 1) {
          totalHeight += (groups.length - 1) * verticalSpacing
        }

        let currentY = startY - totalHeight / 2

        groups.forEach((group, groupIndex) => {
          const sortedGroup = sortBlocksByPriority(
            group,
            orphanedBlocks,
            disabledBlocks,
            terminalBlocks
          )

          sortedGroup.forEach((nodeId, nodeIndex) => {
            const blockHeight = getBlockHeight(blocks, nodeId)

            let positionY = currentY
            if (isContainerBlock(blocks, nodeId)) {
              positionY = currentY
            }

            newPositions.set(nodeId, {
              x: currentLayerX,
              y: positionY,
            })

            currentY += blockHeight

            if (nodeIndex < sortedGroup.length - 1) {
              const currentBlockType = getBlockType(
                nodeId,
                orphanedBlocks,
                disabledBlocks,
                terminalBlocks
              )
              const nextBlockType = getBlockType(
                sortedGroup[nodeIndex + 1],
                orphanedBlocks,
                disabledBlocks,
                terminalBlocks
              )

              const extraSpacing = calculateExtraSpacing(
                currentBlockType,
                nextBlockType,
                verticalSpacing
              )
              currentY += verticalSpacing * 0.5 + extraSpacing
            }
          })

          if (groupIndex < groups.length - 1) {
            currentY += verticalSpacing
          }
        })

        if (layer < maxLayer) {
          const dynamicSpacing = calculateLayerSpacing(layer, layer + 1)
          currentLayerX += dynamicSpacing
        }
      }
    } else {
      const calculateLayerSpacing = (currentLayer: number, nextLayer: number): number => {
        const currentLayerGroups = parallelGroups.get(currentLayer) || []
        const nextLayerGroups = parallelGroups.get(nextLayer) || []

        let maxCurrentHeight = 0
        currentLayerGroups.forEach((group: string[]) => {
          group.forEach((nodeId: string) => {
            maxCurrentHeight = Math.max(maxCurrentHeight, getBlockHeight(blocks, nodeId))
          })
        })

        let maxNextHeight = 0
        nextLayerGroups.forEach((group: string[]) => {
          group.forEach((nodeId: string) => {
            maxNextHeight = Math.max(maxNextHeight, getBlockHeight(blocks, nodeId))
          })
        })

        const baseSpacing = verticalSpacing
        const heightAdjustment = Math.max(maxCurrentHeight, maxNextHeight) - 150 // 150 is standard height
        const connectionTagSpace = 50

        const isOrphanedLayer =
          currentLayer > maxLayer - 2 &&
          (currentLayerGroups.some((group) => group.some((nodeId) => orphanedBlocks.has(nodeId))) ||
            nextLayerGroups.some((group) => group.some((nodeId) => orphanedBlocks.has(nodeId))))
        const orphanedSpacing = isOrphanedLayer ? 150 : 0

        return baseSpacing + heightAdjustment + connectionTagSpace + orphanedSpacing
      }

      let currentLayerY = startY

      for (let layer = 0; layer <= maxLayer; layer++) {
        const groups = parallelGroups.get(layer) || []

        let totalWidth = 0
        const groupWidths: number[] = []

        groups.forEach((group) => {
          const groupWidth = calculateGroupDimensions(
            group,
            orphanedBlocks,
            disabledBlocks,
            terminalBlocks,
            blocks,
            horizontalSpacing,
            getBlockWidth
          )
          groupWidths.push(groupWidth)
          totalWidth += groupWidth
        })

        if (groups.length > 1) {
          totalWidth += (groups.length - 1) * horizontalSpacing
        }

        let currentX = startX - totalWidth / 2

        groups.forEach((group, groupIndex) => {
          const sortedGroup = sortBlocksByPriority(
            group,
            orphanedBlocks,
            disabledBlocks,
            terminalBlocks
          )

          sortedGroup.forEach((nodeId, nodeIndex) => {
            const blockWidth = getBlockWidth(blocks, nodeId)

            let positionX = currentX
            if (isContainerBlock(blocks, nodeId)) {
              positionX = currentX
            }

            newPositions.set(nodeId, {
              x: positionX,
              y: currentLayerY,
            })

            currentX += blockWidth

            if (nodeIndex < sortedGroup.length - 1) {
              const currentBlockType = getBlockType(
                nodeId,
                orphanedBlocks,
                disabledBlocks,
                terminalBlocks
              )
              const nextBlockType = getBlockType(
                sortedGroup[nodeIndex + 1],
                orphanedBlocks,
                disabledBlocks,
                terminalBlocks
              )

              const extraSpacing = calculateExtraSpacing(
                currentBlockType,
                nextBlockType,
                horizontalSpacing
              )
              currentX += horizontalSpacing * 0.5 + extraSpacing
            }
          })

          if (groupIndex < groups.length - 1) {
            currentX += horizontalSpacing
          }
        })

        if (layer < maxLayer) {
          const dynamicSpacing = calculateLayerSpacing(layer, layer + 1)
          currentLayerY += dynamicSpacing
        }
      }
    }
  } else {
    const blockIds = Object.keys(topLevelBlocks)

    if (actualOrientation === 'horizontal') {
      let currentX = startX

      blockIds.forEach((blockId, index) => {
        newPositions.set(blockId, { x: currentX, y: startY })

        if (index < blockIds.length - 1) {
          const blockWidth = getBlockWidth(blocks, blockId)
          const nextBlockWidth = getBlockWidth(blocks, blockIds[index + 1])
          const spacing = horizontalSpacing + Math.max(blockWidth, nextBlockWidth) - 350
          currentX += spacing
        }
      })
    } else {
      let currentY = startY

      blockIds.forEach((blockId, index) => {
        newPositions.set(blockId, { x: startX, y: currentY })

        if (index < blockIds.length - 1) {
          const blockHeight = getBlockHeight(blocks, blockId)
          const nextBlockHeight = getBlockHeight(blocks, blockIds[index + 1])
          const spacing = verticalSpacing + Math.max(blockHeight, nextBlockHeight) - 150
          currentY += spacing
        }
      })
    }
  }

  return newPositions
}
/**
 * Enhanced auto-layout function with smooth animations
 * @param blocks Block states from workflow store
 * @param edges Edge connections from workflow store
 * @param updateBlockPosition Function to update block positions
 * @param fitView Function to fit the view
 * @param resizeLoopNodes Function to resize loop nodes
 * @param options Layout configuration options
 */
export const applyAutoLayoutSmooth = (
  blocks: Record<string, any>,
  edges: any[],
  updateBlockPosition: (id: string, position: { x: number; y: number }) => void,
  fitView: (options?: { padding?: number; duration?: number }) => void,
  resizeLoopNodes: () => void,
  options: LayoutOptions & {
    animationDuration?: number
    isSidebarCollapsed?: boolean
    onComplete?: (finalPositions: Map<string, { x: number; y: number }>) => void
  } = {}
): void => {
  const {
    animationDuration = 500,
    isSidebarCollapsed = false,
    onComplete,
    ...layoutOptions
  } = options

  if (!layoutOptions.handleOrientation || layoutOptions.handleOrientation === 'auto') {
    layoutOptions.handleOrientation = detectHandleOrientation(blocks)
  }

  const topLevelPositions = calculateAutoLayout(blocks, edges, layoutOptions)

  const childPositions = new Map<string, { x: number; y: number }>()

  const containerBlocks = Object.entries(blocks).filter(
    ([_, block]) => isContainerType(block.type) && !block.data?.parentId
  )

  containerBlocks.forEach(([containerId]) => {
    const childBlocks = Object.fromEntries(
      Object.entries(blocks).filter(([_, block]) => block.data?.parentId === containerId)
    )

    if (Object.keys(childBlocks).length === 0) return

    const childEdges = edges.filter((edge) => childBlocks[edge.source] && childBlocks[edge.target])

    const childLayoutOptions: LayoutOptions = {
      horizontalSpacing: Math.min(300, layoutOptions.horizontalSpacing || 300),
      verticalSpacing: Math.min(150, layoutOptions.verticalSpacing || 150),
      startX: 50,
      startY: 80,
      alignByLayer: true,
      handleOrientation: layoutOptions.handleOrientation,
    }

    const childPositionsForContainer = calculateAutoLayout(
      childBlocks,
      childEdges,
      childLayoutOptions
    )

    childPositionsForContainer.forEach((position, blockId) => {
      childPositions.set(blockId, position)
    })
  })

  const allPositions = new Map([...topLevelPositions, ...childPositions])

  if (allPositions.size === 0) return

  const currentPositions = new Map<string, { x: number; y: number }>()
  allPositions.forEach((_, blockId) => {
    const block = blocks[blockId]
    if (block) {
      currentPositions.set(blockId, { x: block.position.x, y: block.position.y })
    }
  })

  const startTime = Date.now()
  const easeOutCubic = (t: number): number => 1 - (1 - t) ** 3

  const animate = async () => {
    const elapsed = Date.now() - startTime
    const progress = Math.min(elapsed / animationDuration, 1)
    const easedProgress = easeOutCubic(progress)

    allPositions.forEach((targetPosition, blockId) => {
      const currentPosition = currentPositions.get(blockId)
      if (!currentPosition) return

      const newPosition = {
        x: currentPosition.x + (targetPosition.x - currentPosition.x) * easedProgress,
        y: currentPosition.y + (targetPosition.y - currentPosition.y) * easedProgress,
      }

      updateBlockPosition(blockId, newPosition)
    })

    if (progress < 1) {
      requestAnimationFrame(animate)
    } else {
      resizeLoopNodes()

      const padding = isSidebarCollapsed ? 0.35 : 0.55
      fitView({
        padding,
        duration: 400,
      })

      // Call completion callback with final positions
      if (onComplete) {
        onComplete(allPositions)
      }
    }
  }

  animate()
}
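The animation loop in `applyAutoLayoutSmooth` moves each block toward its target every frame, shaping progress with an ease-out cubic so motion is fast at first and settles gently. A self-contained sketch of that interpolation step:

```typescript
// Eased linear interpolation between two positions. `progress` is the
// clamped 0..1 fraction of the animation duration that has elapsed.
const easeOutCubic = (t: number): number => 1 - (1 - t) ** 3

function lerpPosition(
  from: { x: number; y: number },
  to: { x: number; y: number },
  progress: number
): { x: number; y: number } {
  const eased = easeOutCubic(progress)
  return {
    x: from.x + (to.x - from.x) * eased,
    y: from.y + (to.y - from.y) * eased,
  }
}

// At progress 1 the block lands exactly on its target.
console.log(lerpPosition({ x: 0, y: 0 }, { x: 100, y: 50 }, 1)) // { x: 100, y: 50 }
```

Because the curve is ease-out, more than half of the distance is covered in the first half of the animation, which is what makes the layout feel responsive.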
/**
 * Original auto-layout function (for backward compatibility)
 * @param blocks Block states from workflow store
 * @param edges Edge connections from workflow store
 * @param updateBlockPosition Function to update block positions
 * @param options Layout configuration options
 * NOTE: Auto layout functions have been moved to the centralized backend system
 * at apps/sim/lib/autolayout/ - all auto layout now goes through the API routes
 * and uses the same improved algorithm for consistent results.
 */
export const applyAutoLayout = (
  blocks: Record<string, any>,
  edges: any[],
  updateBlockPosition: (id: string, position: { x: number; y: number }) => void,
  options: LayoutOptions = {}
): void => {
  if (!options.handleOrientation || options.handleOrientation === 'auto') {
    options.handleOrientation = detectHandleOrientation(blocks)
  }

  const topLevelPositions = calculateAutoLayout(blocks, edges, options)

  topLevelPositions.forEach((position, blockId) => {
    updateBlockPosition(blockId, position)
  })

  const containerBlocks = Object.entries(blocks).filter(
    ([_, block]) => isContainerType(block.type) && !block.data?.parentId
  )

  containerBlocks.forEach(([containerId]) => {
    const childBlocks = Object.fromEntries(
      Object.entries(blocks).filter(([_, block]) => block.data?.parentId === containerId)
    )

    if (Object.keys(childBlocks).length === 0) return

    const childEdges = edges.filter((edge) => childBlocks[edge.source] && childBlocks[edge.target])

    const childLayoutOptions: LayoutOptions = {
      horizontalSpacing: Math.min(300, options.horizontalSpacing || 300),
      verticalSpacing: Math.min(150, options.verticalSpacing || 150),
      startX: 50,
      startY: 80,
      alignByLayer: true,
      handleOrientation: options.handleOrientation,
    }

    const childPositions = calculateAutoLayout(childBlocks, childEdges, childLayoutOptions)

    childPositions.forEach((position, blockId) => {
      updateBlockPosition(blockId, position)
    })
  })
}
@@ -0,0 +1,269 @@
import { createLogger } from '@/lib/logs/console/logger'

const logger = createLogger('AutoLayoutUtils')

/**
 * Auto layout options interface
 */
export interface AutoLayoutOptions {
  strategy?: 'smart' | 'hierarchical' | 'layered' | 'force-directed'
  direction?: 'horizontal' | 'vertical' | 'auto'
  spacing?: {
    horizontal?: number
    vertical?: number
    layer?: number
  }
  alignment?: 'start' | 'center' | 'end'
  padding?: {
    x?: number
    y?: number
  }
}

/**
 * Default auto layout options
 */
const DEFAULT_AUTO_LAYOUT_OPTIONS: AutoLayoutOptions = {
  strategy: 'smart',
  direction: 'auto',
  spacing: {
    horizontal: 500,
    vertical: 400,
    layer: 700,
  },
  alignment: 'center',
  padding: {
    x: 250,
    y: 250,
  },
}

/**
 * Apply auto layout to workflow blocks and update the store
 */
export async function applyAutoLayoutToWorkflow(
  workflowId: string,
  blocks: Record<string, any>,
  edges: any[],
  loops: Record<string, any> = {},
  parallels: Record<string, any> = {},
  options: AutoLayoutOptions = {}
): Promise<{
  success: boolean
  layoutedBlocks?: Record<string, any>
  error?: string
}> {
  try {
    logger.info('Applying auto layout to workflow', {
      workflowId,
      blockCount: Object.keys(blocks).length,
      edgeCount: edges.length,
    })

    // Merge with default options and ensure all required properties are present
    const layoutOptions = {
      strategy: options.strategy || DEFAULT_AUTO_LAYOUT_OPTIONS.strategy!,
      direction: options.direction || DEFAULT_AUTO_LAYOUT_OPTIONS.direction!,
      spacing: {
        horizontal: options.spacing?.horizontal || DEFAULT_AUTO_LAYOUT_OPTIONS.spacing!.horizontal!,
        vertical: options.spacing?.vertical || DEFAULT_AUTO_LAYOUT_OPTIONS.spacing!.vertical!,
        layer: options.spacing?.layer || DEFAULT_AUTO_LAYOUT_OPTIONS.spacing!.layer!,
      },
      alignment: options.alignment || DEFAULT_AUTO_LAYOUT_OPTIONS.alignment!,
      padding: {
        x: options.padding?.x || DEFAULT_AUTO_LAYOUT_OPTIONS.padding!.x!,
        y: options.padding?.y || DEFAULT_AUTO_LAYOUT_OPTIONS.padding!.y!,
      },
    }

    // Call the autolayout API route instead of sim-agent directly; the
    // route has access to the server-side API key
    const response = await fetch(`/api/workflows/${workflowId}/autolayout`, {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
      },
      body: JSON.stringify(layoutOptions),
    })

    if (!response.ok) {
      const errorData = await response.json().catch(() => null)
      const errorMessage = errorData?.error || `Auto layout failed: ${response.statusText}`
      logger.error('Auto layout API call failed:', {
        status: response.status,
        error: errorMessage,
      })
      return {
        success: false,
        error: errorMessage,
      }
    }

    const result = await response.json()

    if (!result.success) {
      const errorMessage = result.error || 'Auto layout failed'
      logger.error('Auto layout failed:', {
        error: errorMessage,
      })
      return {
        success: false,
        error: errorMessage,
      }
    }

    logger.info('Successfully applied auto layout', {
      workflowId,
      originalBlockCount: Object.keys(blocks).length,
      layoutedBlockCount: result.data?.layoutedBlocks
        ? Object.keys(result.data.layoutedBlocks).length
        : 0,
    })

    // Return the layouted blocks from the API response
    return {
      success: true,
      layoutedBlocks: result.data?.layoutedBlocks || blocks,
    }
  } catch (error) {
    const errorMessage = error instanceof Error ? error.message : 'Unknown auto layout error'
    logger.error('Auto layout failed:', { workflowId, error: errorMessage })

    return {
      success: false,
      error: errorMessage,
    }
  }
}
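`applyAutoLayoutToWorkflow` merges caller options over `DEFAULT_AUTO_LAYOUT_OPTIONS` field by field using `||`. A standalone sketch of that per-field merge with hypothetical, trimmed-down names; note the sketch uses `??` instead, which preserves an explicit `0` where `||` would replace it with the default:

```typescript
// Per-field defaults merge. Interfaces and names here are illustrative
// stand-ins, not the real AutoLayoutOptions shape.
interface Spacing {
  horizontal?: number
  vertical?: number
}
interface Options {
  strategy?: 'smart' | 'layered'
  spacing?: Spacing
}

const DEFAULTS = {
  strategy: 'smart' as const,
  spacing: { horizontal: 500, vertical: 400 },
}

function mergeOptions(options: Options) {
  return {
    strategy: options.strategy ?? DEFAULTS.strategy,
    spacing: {
      // ?? keeps an explicit 0; || would fall back to the default
      horizontal: options.spacing?.horizontal ?? DEFAULTS.spacing.horizontal,
      vertical: options.spacing?.vertical ?? DEFAULTS.spacing.vertical,
    },
  }
}

console.log(mergeOptions({ spacing: { horizontal: 0 } }).spacing.horizontal) // 0
```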
/**
 * Apply auto layout and update the workflow store immediately
 */
export async function applyAutoLayoutAndUpdateStore(
  workflowId: string,
  options: AutoLayoutOptions = {}
): Promise<{
  success: boolean
  error?: string
}> {
  try {
    // Import workflow store
    const { useWorkflowStore } = await import('@/stores/workflows/workflow/store')

    const workflowStore = useWorkflowStore.getState()
    const { blocks, edges, loops = {}, parallels = {} } = workflowStore

    logger.info('Auto layout store data:', {
      workflowId,
      blockCount: Object.keys(blocks).length,
      edgeCount: edges.length,
      loopCount: Object.keys(loops).length,
      parallelCount: Object.keys(parallels).length,
    })

    if (Object.keys(blocks).length === 0) {
      logger.warn('No blocks to layout', { workflowId })
      return { success: false, error: 'No blocks to layout' }
    }

    // Apply auto layout
    const result = await applyAutoLayoutToWorkflow(
      workflowId,
      blocks,
      edges,
      loops,
      parallels,
      options
    )

    if (!result.success || !result.layoutedBlocks) {
      return { success: false, error: result.error }
    }

    // Update workflow store immediately with new positions
    const newWorkflowState = {
      ...workflowStore.getWorkflowState(),
      blocks: result.layoutedBlocks,
      lastSaved: Date.now(),
    }

    useWorkflowStore.setState(newWorkflowState)

    logger.info('Successfully updated workflow store with auto layout', { workflowId })

    // Persist the changes to the database optimistically
    try {
      // Update the lastSaved timestamp in the store
      useWorkflowStore.getState().updateLastSaved()

      // Clean up the workflow state for API validation
      const cleanedWorkflowState = {
        ...newWorkflowState,
        // Convert null dates to undefined (since they're optional)
        deployedAt: newWorkflowState.deployedAt ? new Date(newWorkflowState.deployedAt) : undefined,
        // Ensure other optional fields are properly handled
        loops: newWorkflowState.loops || {},
        parallels: newWorkflowState.parallels || {},
        deploymentStatuses: newWorkflowState.deploymentStatuses || {},
      }

      // Save the updated workflow state to the database
      const response = await fetch(`/api/workflows/${workflowId}/state`, {
        method: 'PUT',
        headers: {
          'Content-Type': 'application/json',
        },
        body: JSON.stringify(cleanedWorkflowState),
      })

      if (!response.ok) {
        const errorData = await response.json()
        throw new Error(errorData.error || `HTTP ${response.status}: ${response.statusText}`)
      }

      logger.info('Auto layout successfully persisted to database', { workflowId })
      return { success: true }
    } catch (saveError) {
      logger.error('Failed to save auto layout to database, reverting store changes:', {
        workflowId,
        error: saveError,
      })

      // Revert the store changes since database save failed
      useWorkflowStore.setState({
        ...workflowStore.getWorkflowState(),
        blocks: blocks, // Revert to original blocks
        lastSaved: workflowStore.lastSaved, // Revert lastSaved
      })

      return {
        success: false,
        error: `Failed to save positions to database: ${saveError instanceof Error ? saveError.message : 'Unknown error'}`,
      }
    }
  } catch (error) {
    const errorMessage = error instanceof Error ? error.message : 'Unknown store update error'
    logger.error('Failed to update store with auto layout:', { workflowId, error: errorMessage })

    return {
      success: false,
      error: errorMessage,
    }
  }
}

/**
 * Apply auto layout to a specific set of blocks (used by copilot preview)
 */
export async function applyAutoLayoutToBlocks(
  blocks: Record<string, any>,
  edges: any[],
  options: AutoLayoutOptions = {}
): Promise<{
  success: boolean
  layoutedBlocks?: Record<string, any>
  error?: string
}> {
  return applyAutoLayoutToWorkflow('preview', blocks, edges, {}, {}, options)
}
@@ -5,6 +5,7 @@ import { useParams, useRouter } from 'next/navigation'
import ReactFlow, {
  Background,
  ConnectionLineType,
  type Edge,
  type EdgeTypes,
  type NodeTypes,
  ReactFlowProvider,
@@ -13,33 +14,34 @@ import ReactFlow, {
import 'reactflow/dist/style.css'
import { createLogger } from '@/lib/logs/console/logger'
import { useUserPermissionsContext } from '@/app/workspace/[workspaceId]/components/providers/workspace-permissions-provider'
import { ControlBar } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/control-bar/control-bar'
import { DiffControls } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/diff-controls'
import { ErrorBoundary } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/error/index'
import { LoopNodeComponent } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/loop-node/loop-node'
import { Panel } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/panel'
import { ParallelNodeComponent } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/parallel-node/parallel-node'
import { getBlock } from '@/blocks'
import { useCollaborativeWorkflow } from '@/hooks/use-collaborative-workflow'
import { useStreamCleanup } from '@/hooks/use-stream-cleanup'
import { useWorkspacePermissions } from '@/hooks/use-workspace-permissions'
import { useCopilotStore } from '@/stores/copilot/store'
import { useExecutionStore } from '@/stores/execution/store'
import { useVariablesStore } from '@/stores/panel/variables/store'
import { useGeneralStore } from '@/stores/settings/general/store'
import { useWorkflowDiffStore } from '@/stores/workflow-diff/store'
import { useWorkflowRegistry } from '@/stores/workflows/registry/store'
import { useWorkflowStore } from '@/stores/workflows/workflow/store'
import { WorkflowBlock } from './components/workflow-block/workflow-block'
import { WorkflowEdge } from './components/workflow-edge/workflow-edge'
import { useCurrentWorkflow } from './hooks'
import {
  ControlBar,
  ErrorBoundary,
  LoopNodeComponent,
  Panel,
  ParallelNodeComponent,
  WorkflowBlock,
  WorkflowEdge,
} from '@/app/workspace/[workspaceId]/w/[workflowId]/components'
import {
  applyAutoLayoutSmooth,
  detectHandleOrientation,
  getNodeAbsolutePosition,
  getNodeDepth,
  getNodeHierarchy,
  isPointInLoopNode,
  resizeLoopNodes,
  updateNodeParent as updateNodeParentUtil,
} from '@/app/workspace/[workspaceId]/w/[workflowId]/utils'
import { getBlock } from '@/blocks'
import { useCollaborativeWorkflow } from '@/hooks/use-collaborative-workflow'
import { useWorkspacePermissions } from '@/hooks/use-workspace-permissions'
import { useExecutionStore } from '@/stores/execution/store'
import { useVariablesStore } from '@/stores/panel/variables/store'
import { useGeneralStore } from '@/stores/settings/general/store'
import { useWorkflowRegistry } from '@/stores/workflows/registry/store'
import { useWorkflowStore } from '@/stores/workflows/workflow/store'
} from './utils'

const logger = createLogger('Workflow')

@@ -87,16 +89,82 @@ const WorkflowContent = React.memo(() => {
  const { workflows, activeWorkflowId, isLoading, setActiveWorkflow, createWorkflow } =
    useWorkflowRegistry()

  const {
    blocks,
    edges,
    updateNodeDimensions,
    updateBlockPosition: storeUpdateBlockPosition,
  } = useWorkflowStore()
  // Use the clean abstraction for current workflow state
  const currentWorkflow = useCurrentWorkflow()

  const { updateNodeDimensions, updateBlockPosition: storeUpdateBlockPosition } = useWorkflowStore()

  // Get copilot cleanup function
  const copilotCleanup = useCopilotStore((state) => state.cleanup)

  // Handle copilot stream cleanup on page unload and component unmount
  useStreamCleanup(copilotCleanup)

  // Extract workflow data from the abstraction
  const { blocks, edges, loops, parallels, isDiffMode } = currentWorkflow

  // Get diff analysis for edge reconstruction
  const { diffAnalysis, isShowingDiff, isDiffReady } = useWorkflowDiffStore()

  // Reconstruct deleted edges when viewing original workflow
  const edgesForDisplay = useMemo(() => {
    // If we're not in diff mode and we have diff analysis with deleted edges,
    // we need to reconstruct those deleted edges and add them to the display
    // Only do this if diff is ready to prevent race conditions
    if (!isShowingDiff && isDiffReady && diffAnalysis?.edge_diff?.deleted_edges) {
      const reconstructedEdges: Edge[] = []

      // Parse deleted edge identifiers to reconstruct edges
      diffAnalysis.edge_diff.deleted_edges.forEach((edgeIdentifier) => {
        // Edge identifier format: "sourceId-source-targetId-target"
        // Parse this to extract the components
        const match = edgeIdentifier.match(/^([^-]+)-source-([^-]+)-target$/)
        if (match) {
          const [, sourceId, targetId] = match

          // Only reconstruct if both blocks still exist
          if (blocks[sourceId] && blocks[targetId]) {
            // Generate a unique edge ID
            const edgeId = `deleted-edge-${sourceId}-${targetId}`

            reconstructedEdges.push({
              id: edgeId,
              source: sourceId,
              target: targetId,
              sourceHandle: null, // Default handle
              targetHandle: null, // Default handle
              type: 'workflowEdge',
            })
          }
        }
      })

      // Combine existing edges with reconstructed deleted edges
      return [...edges, ...reconstructedEdges]
    }

    // Otherwise, just use the edges as-is
    return edges
  }, [edges, isShowingDiff, isDiffReady, diffAnalysis, blocks])

  // User permissions - get current user's specific permissions from context
  const userPermissions = useUserPermissionsContext()

  // Create diff-aware permissions that disable editing when in diff mode
  const effectivePermissions = useMemo(() => {
    if (isDiffMode) {
      // In diff mode, disable all editing regardless of user permissions
      return {
        ...userPermissions,
        canEdit: false,
        canAdmin: false,
        // Keep canRead true so users can still view content
        canRead: userPermissions.canRead,
      }
    }
    return userPermissions
  }, [userPermissions, isDiffMode])

  // Workspace permissions - get all users and their permissions for this workspace
  const { permissions: workspacePermissions, error: permissionsError } = useWorkspacePermissions(
    workspaceId || null
@@ -215,73 +283,25 @@ const WorkflowContent = React.memo(() => {
    [getNodes]
  )

  // Helper function to get orientation config
  const getOrientationConfig = useCallback((orientation: string) => {
    return orientation === 'vertical'
      ? {
          // Vertical handles: optimize for top-to-bottom flow
          horizontalSpacing: 400,
          verticalSpacing: 300,
          startX: 200,
          startY: 200,
        }
      : {
          // Horizontal handles: optimize for left-to-right flow
          horizontalSpacing: 600,
          verticalSpacing: 200,
          startX: 150,
          startY: 300,
        }
  }, [])

  // Auto-layout handler
  const handleAutoLayout = useCallback(() => {
  // Auto-layout handler - now uses frontend auto layout for immediate updates
  const handleAutoLayout = useCallback(async () => {
    if (Object.keys(blocks).length === 0) return

    // Detect the predominant handle orientation in the workflow
    const detectedOrientation = detectHandleOrientation(blocks)
    try {
      // Use the shared auto layout utility for immediate frontend updates
      const { applyAutoLayoutAndUpdateStore } = await import('./utils/auto-layout')

    // Get spacing configuration based on handle orientation
    const orientationConfig = getOrientationConfig(detectedOrientation)
      const result = await applyAutoLayoutAndUpdateStore(activeWorkflowId!)

    applyAutoLayoutSmooth(
      blocks,
      edges,
      storeUpdateBlockPosition,
      fitView,
      resizeLoopNodesWrapper,
      {
        ...orientationConfig,
        alignByLayer: true,
        animationDuration: 500, // Smooth 500ms animation
        handleOrientation: detectedOrientation, // Explicitly set the detected orientation
        onComplete: (finalPositions) => {
          // Emit collaborative updates for final positions after animation completes
          finalPositions.forEach((position, blockId) => {
            collaborativeUpdateBlockPosition(blockId, position)
          })
        },
      if (result.success) {
        logger.info('Auto layout completed successfully')
      } else {
        logger.error('Auto layout failed:', result.error)
      }
    )

    const orientationMessage =
      detectedOrientation === 'vertical'
        ? 'Auto-layout applied with vertical flow (top-to-bottom)'
        : 'Auto-layout applied with horizontal flow (left-to-right)'

    logger.info(orientationMessage, {
      orientation: detectedOrientation,
      blockCount: Object.keys(blocks).length,
    })
  }, [
    blocks,
    edges,
    storeUpdateBlockPosition,
    collaborativeUpdateBlockPosition,
    fitView,
    resizeLoopNodesWrapper,
    getOrientationConfig,
  ])
    } catch (error) {
      logger.error('Auto layout error:', error)
    }
  }, [activeWorkflowId, blocks])

  const debouncedAutoLayout = useCallback(() => {
    const debounceTimer = setTimeout(() => {
@@ -323,32 +343,6 @@ const WorkflowContent = React.memo(() => {
    }
  }, [debouncedAutoLayout])

  useEffect(() => {
    let cleanup: (() => void) | null = null

    const handleAutoLayoutEvent = () => {
      if (cleanup) cleanup()

      // Call auto layout directly without debounce for copilot events
      handleAutoLayout()

      // Also set up debounced version as backup
      cleanup = debouncedAutoLayout()
    }

    window.addEventListener('trigger-auto-layout', handleAutoLayoutEvent)

    return () => {
      window.removeEventListener('trigger-auto-layout', handleAutoLayoutEvent)
      if (cleanup) cleanup()
    }
  }, [debouncedAutoLayout])

  // Note: Workflow room joining is now handled automatically by socket connect event based on URL
  // This eliminates the need for manual joining when active workflow changes

  // Note: Workflow initialization now handled by Socket.IO system

  // Handle drops
  const findClosestOutput = useCallback(
    (newNodePosition: { x: number; y: number }): BlockData | null => {
@@ -403,7 +397,7 @@ const WorkflowContent = React.memo(() => {
  useEffect(() => {
    const handleAddBlockFromToolbar = (event: CustomEvent) => {
      // Check if user has permission to interact with blocks
      if (!userPermissions.canEdit) {
      if (!effectivePermissions.canEdit) {
        return
      }

@@ -526,7 +520,7 @@ const WorkflowContent = React.memo(() => {
    addEdge,
    findClosestOutput,
    determineSourceHandle,
    userPermissions.canEdit,
    effectivePermissions.canEdit,
  ])

  // Update the onDrop handler
@@ -1424,7 +1418,7 @@ const WorkflowContent = React.memo(() => {
  )

  // Transform edges to include improved selection state
  const edgesWithSelection = edges.map((edge) => {
  const edgesWithSelection = edgesForDisplay.map((edge) => {
    // Check if this edge connects nodes inside a loop
    const sourceNode = getNodes().find((n) => n.id === edge.source)
    const targetNode = getNodes().find((n) => n.id === edge.target)
@@ -1534,11 +1528,11 @@ const WorkflowContent = React.memo(() => {
        edges={edgesWithSelection}
        onNodesChange={onNodesChange}
        onEdgesChange={onEdgesChange}
        onConnect={userPermissions.canEdit ? onConnect : undefined}
        onConnect={effectivePermissions.canEdit ? onConnect : undefined}
        nodeTypes={nodeTypes}
        edgeTypes={edgeTypes}
        onDrop={userPermissions.canEdit ? onDrop : undefined}
        onDragOver={userPermissions.canEdit ? onDragOver : undefined}
        onDrop={effectivePermissions.canEdit ? onDrop : undefined}
        onDragOver={effectivePermissions.canEdit ? onDragOver : undefined}
        fitView
        minZoom={0.1}
        maxZoom={1.3}
@@ -1558,22 +1552,22 @@ const WorkflowContent = React.memo(() => {
        onEdgeClick={onEdgeClick}
        elementsSelectable={true}
        selectNodesOnDrag={false}
        nodesConnectable={userPermissions.canEdit}
        nodesDraggable={userPermissions.canEdit}
        nodesConnectable={effectivePermissions.canEdit}
        nodesDraggable={effectivePermissions.canEdit}
        draggable={false}
        noWheelClassName='allow-scroll'
        edgesFocusable={true}
        edgesUpdatable={userPermissions.canEdit}
        edgesUpdatable={effectivePermissions.canEdit}
        className='workflow-container h-full'
        onNodeDrag={userPermissions.canEdit ? onNodeDrag : undefined}
        onNodeDragStop={userPermissions.canEdit ? onNodeDragStop : undefined}
        onNodeDragStart={userPermissions.canEdit ? onNodeDragStart : undefined}
        onNodeDrag={effectivePermissions.canEdit ? onNodeDrag : undefined}
        onNodeDragStop={effectivePermissions.canEdit ? onNodeDragStop : undefined}
        onNodeDragStart={effectivePermissions.canEdit ? onNodeDragStart : undefined}
        snapToGrid={false}
        snapGrid={[20, 20]}
        elevateEdgesOnSelect={true}
        elevateNodesOnSelect={true}
        autoPanOnConnect={userPermissions.canEdit}
        autoPanOnNodeDrag={userPermissions.canEdit}
        autoPanOnConnect={effectivePermissions.canEdit}
        autoPanOnNodeDrag={effectivePermissions.canEdit}
      >
        <Background
          color='hsl(var(--workflow-dots))'
@@ -1582,6 +1576,9 @@ const WorkflowContent = React.memo(() => {
          style={{ backgroundColor: 'hsl(var(--workflow-background))' }}
        />
      </ReactFlow>

      {/* Show DiffControls if diff is available (regardless of current view mode) */}
      <DiffControls />
    </div>
  </div>
)
@@ -3,8 +3,12 @@
import { forwardRef, useImperativeHandle, useRef, useState } from 'react'
import { useParams, useRouter } from 'next/navigation'
import { createLogger } from '@/lib/logs/console/logger'
import { simAgentClient } from '@/lib/sim-agent'
import { getAllBlocks } from '@/blocks/registry'
import type { BlockConfig } from '@/blocks/types'
import { resolveOutputType } from '@/blocks/utils'
import { useWorkflowRegistry } from '@/stores/workflows/registry/store'
import { parseWorkflowYaml } from '@/stores/workflows/yaml/importer'
import { generateLoopBlocks, generateParallelBlocks } from '@/stores/workflows/workflow/utils'

const logger = createLogger('ImportControls')

@@ -86,17 +90,45 @@ export const ImportControls = forwardRef<ImportControlsRef, ImportControlsProps>

    try {
      // First validate the YAML without importing
      const { data: yamlWorkflow, errors: parseErrors } = parseWorkflowYaml(content)
      // Gather block registry and utilities for sim-agent
      const blocks = getAllBlocks()
      const blockRegistry = blocks.reduce(
        (acc, block) => {
          const blockType = block.type
          acc[blockType] = {
            ...block,
            id: blockType,
            subBlocks: block.subBlocks || [],
            outputs: block.outputs || {},
          } as any
          return acc
        },
        {} as Record<string, BlockConfig>
      )

      if (!yamlWorkflow || parseErrors.length > 0) {
      const parseResult = await simAgentClient.makeRequest('/api/yaml/parse', {
        body: {
          yamlContent: content,
          blockRegistry,
          utilities: {
            generateLoopBlocks: generateLoopBlocks.toString(),
            generateParallelBlocks: generateParallelBlocks.toString(),
            resolveOutputType: resolveOutputType.toString(),
          },
        },
      })

      if (!parseResult.success || !parseResult.data?.data) {
        setImportResult({
          success: false,
          errors: parseErrors,
          errors: parseResult.data?.errors || [parseResult.error || 'Failed to parse YAML'],
          warnings: [],
        })
        return
      }

      const yamlWorkflow = parseResult.data.data

      // Create a new workflow
      const newWorkflowId = await createWorkflow({
        name: `Imported Workflow - ${new Date().toLocaleString()}`,

@@ -200,10 +200,10 @@ export const TagDropdown: React.FC<TagDropdownProps> = ({
        blockTags = schemaFields.map((field) => `${normalizedBlockName}.${field.name}`)
      } else {
        // Fallback to default if schema extraction failed
        const outputPaths = generateOutputPaths(blockConfig.outputs)
        const outputPaths = generateOutputPaths(blockConfig.outputs || {})
        blockTags = outputPaths.map((path) => `${normalizedBlockName}.${path}`)
      }
    } else if (Object.keys(blockConfig.outputs).length === 0) {
    } else if (!blockConfig.outputs || Object.keys(blockConfig.outputs).length === 0) {
      // Handle blocks with no outputs (like starter) - check for custom input fields
      if (sourceBlock.type === 'starter') {
        // Check what start workflow mode is selected
@@ -240,7 +240,7 @@ export const TagDropdown: React.FC<TagDropdownProps> = ({
      }
    } else {
      // Use default block outputs
      const outputPaths = generateOutputPaths(blockConfig.outputs)
      const outputPaths = generateOutputPaths(blockConfig.outputs || {})
      blockTags = outputPaths.map((path) => `${normalizedBlockName}.${path}`)
    }

@@ -475,10 +475,10 @@ export const TagDropdown: React.FC<TagDropdownProps> = ({
        blockTags = schemaFields.map((field) => `${normalizedBlockName}.${field.name}`)
      } else {
        // Fallback to default if schema extraction failed
        const outputPaths = generateOutputPaths(blockConfig.outputs)
        const outputPaths = generateOutputPaths(blockConfig.outputs || {})
        blockTags = outputPaths.map((path) => `${normalizedBlockName}.${path}`)
      }
    } else if (Object.keys(blockConfig.outputs).length === 0) {
    } else if (!blockConfig.outputs || Object.keys(blockConfig.outputs).length === 0) {
      // Handle blocks with no outputs (like starter) - check for custom input fields
      if (accessibleBlock.type === 'starter') {
        // Check what start workflow mode is selected
@@ -515,7 +515,7 @@ export const TagDropdown: React.FC<TagDropdownProps> = ({
      }
    } else {
      // Use default block outputs
      const outputPaths = generateOutputPaths(blockConfig.outputs)
      const outputPaths = generateOutputPaths(blockConfig.outputs || {})
      blockTags = outputPaths.map((path) => `${normalizedBlockName}.${path}`)
    }
@@ -5,7 +5,6 @@ import { CheckCircle, ChevronDown, ChevronRight, Loader2, Settings, XCircle } fr
import { Badge } from '@/components/ui/badge'
import { Button } from '@/components/ui/button'
import { Collapsible, CollapsibleContent, CollapsibleTrigger } from '@/components/ui/collapsible'
import { getToolDisplayName } from '@/lib/tool-call-parser'
import { cn } from '@/lib/utils'
import type { ToolCallGroup, ToolCallState } from '@/types/tool-call'

@@ -96,6 +95,7 @@ export function ToolCallCompletion({ toolCall, isCompact = false }: ToolCallProp
  const [isExpanded, setIsExpanded] = useState(false)
  const isSuccess = toolCall.state === 'completed'
  const isError = toolCall.state === 'error'
  const isAborted = toolCall.state === 'aborted'

  const formatDuration = (duration?: number) => {
    if (!duration) return ''
@@ -107,7 +107,8 @@ export function ToolCallCompletion({ toolCall, isCompact = false }: ToolCallProp
      className={cn(
        'min-w-0 rounded-lg border',
        isSuccess && 'border-green-200 bg-green-50 dark:border-green-800 dark:bg-green-950',
        isError && 'border-red-200 bg-red-50 dark:border-red-800 dark:bg-red-950'
        isError && 'border-red-200 bg-red-50 dark:border-red-800 dark:bg-red-950',
        isAborted && 'border-orange-200 bg-orange-50 dark:border-orange-800 dark:bg-orange-950'
      )}
    >
      <Collapsible open={isExpanded} onOpenChange={setIsExpanded}>
@@ -117,7 +118,8 @@ export function ToolCallCompletion({ toolCall, isCompact = false }: ToolCallProp
            className={cn(
              'w-full min-w-0 justify-between px-3 py-4',
              isSuccess && 'hover:bg-green-100 dark:hover:bg-green-900',
              isError && 'hover:bg-red-100 dark:hover:bg-red-900'
              isError && 'hover:bg-red-100 dark:hover:bg-red-900',
              isAborted && 'hover:bg-orange-100 dark:hover:bg-orange-900'
            )}
          >
            <div className='flex min-w-0 items-center gap-2 overflow-hidden'>
@@ -125,14 +127,18 @@ export function ToolCallCompletion({ toolCall, isCompact = false }: ToolCallProp
                <CheckCircle className='h-4 w-4 shrink-0 text-green-600 dark:text-green-400' />
              )}
              {isError && <XCircle className='h-4 w-4 shrink-0 text-red-600 dark:text-red-400' />}
              {isAborted && (
                <XCircle className='h-4 w-4 shrink-0 text-orange-600 dark:text-orange-400' />
              )}
              <span
                className={cn(
                  'min-w-0 truncate font-mono text-xs',
                  isSuccess && 'text-green-800 dark:text-green-200',
                  isError && 'text-red-800 dark:text-red-200'
                  isError && 'text-red-800 dark:text-red-200',
                  isAborted && 'text-orange-800 dark:text-orange-200'
                )}
              >
                {getToolDisplayName(toolCall.name, true)}
                {toolCall.displayName || toolCall.name}
              </span>
              {toolCall.duration && (
                <Badge
@@ -140,7 +146,8 @@ export function ToolCallCompletion({ toolCall, isCompact = false }: ToolCallProp
                  className={cn(
                    'shrink-0 text-xs',
                    isSuccess && 'text-green-700 dark:text-green-300',
                    isError && 'text-red-700 dark:text-red-300'
                    isError && 'text-red-700 dark:text-red-300',
                    isAborted && 'text-orange-700 dark:text-orange-300'
                  )}
                  style={{ fontSize: '0.625rem' }}
                >
24
apps/sim/db/migrations/0065_solid_newton_destine.sql
Normal file
@@ -0,0 +1,24 @@
CREATE TABLE "workflow_checkpoints" (
  "id" uuid PRIMARY KEY DEFAULT gen_random_uuid() NOT NULL,
  "user_id" text NOT NULL,
  "workflow_id" text NOT NULL,
  "chat_id" uuid NOT NULL,
  "message_id" text,
  "workflow_state" json NOT NULL,
  "created_at" timestamp DEFAULT now() NOT NULL,
  "updated_at" timestamp DEFAULT now() NOT NULL
);
--> statement-breakpoint
DROP TABLE "copilot_checkpoints" CASCADE;--> statement-breakpoint
ALTER TABLE "copilot_chats" ADD COLUMN "preview_yaml" text;--> statement-breakpoint
ALTER TABLE "workflow_checkpoints" ADD CONSTRAINT "workflow_checkpoints_user_id_user_id_fk" FOREIGN KEY ("user_id") REFERENCES "public"."user"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "workflow_checkpoints" ADD CONSTRAINT "workflow_checkpoints_workflow_id_workflow_id_fk" FOREIGN KEY ("workflow_id") REFERENCES "public"."workflow"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "workflow_checkpoints" ADD CONSTRAINT "workflow_checkpoints_chat_id_copilot_chats_id_fk" FOREIGN KEY ("chat_id") REFERENCES "public"."copilot_chats"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
CREATE INDEX "workflow_checkpoints_user_id_idx" ON "workflow_checkpoints" USING btree ("user_id");--> statement-breakpoint
CREATE INDEX "workflow_checkpoints_workflow_id_idx" ON "workflow_checkpoints" USING btree ("workflow_id");--> statement-breakpoint
CREATE INDEX "workflow_checkpoints_chat_id_idx" ON "workflow_checkpoints" USING btree ("chat_id");--> statement-breakpoint
CREATE INDEX "workflow_checkpoints_message_id_idx" ON "workflow_checkpoints" USING btree ("message_id");--> statement-breakpoint
CREATE INDEX "workflow_checkpoints_user_workflow_idx" ON "workflow_checkpoints" USING btree ("user_id","workflow_id");--> statement-breakpoint
CREATE INDEX "workflow_checkpoints_workflow_chat_idx" ON "workflow_checkpoints" USING btree ("workflow_id","chat_id");--> statement-breakpoint
CREATE INDEX "workflow_checkpoints_created_at_idx" ON "workflow_checkpoints" USING btree ("created_at");--> statement-breakpoint
CREATE INDEX "workflow_checkpoints_chat_created_at_idx" ON "workflow_checkpoints" USING btree ("chat_id","created_at");
5652
apps/sim/db/migrations/meta/0065_snapshot.json
Normal file
File diff suppressed because it is too large
@@ -449,6 +449,13 @@
      "when": 1754088313157,
      "tag": "0064_elite_hedge_knight",
      "breakpoints": true
    },
    {
      "idx": 65,
      "version": "7",
      "when": 1754171385971,
      "tag": "0065_solid_newton_destine",
      "breakpoints": true
    }
  ]
}
431
apps/sim/db/migrations/relations.ts
Normal file
@@ -0,0 +1,431 @@
import { relations } from 'drizzle-orm/relations'
import {
  account,
  apiKey,
  chat,
  copilotChats,
  copilotCheckpoints,
  customTools,
  document,
  embedding,
  environment,
  invitation,
  knowledgeBase,
  marketplace,
  member,
  memory,
  organization,
  permissions,
  session,
  settings,
  templateStars,
  templates,
  user,
  userRateLimits,
  userStats,
  webhook,
  workflow,
  workflowBlocks,
  workflowEdges,
  workflowExecutionBlocks,
  workflowExecutionLogs,
  workflowExecutionSnapshots,
  workflowFolder,
  workflowLogs,
  workflowSchedule,
  workflowSubflows,
  workspace,
  workspaceInvitation,
} from './schema'

export const accountRelations = relations(account, ({ one }) => ({
  user: one(user, {
    fields: [account.userId],
    references: [user.id],
  }),
}))

export const userRelations = relations(user, ({ many }) => ({
  accounts: many(account),
  environments: many(environment),
  apiKeys: many(apiKey),
  marketplaces: many(marketplace),
  customTools: many(customTools),
  sessions: many(session),
  invitations: many(invitation),
  members: many(member),
  chats: many(chat),
  workspaces: many(workspace),
  knowledgeBases: many(knowledgeBase),
  workflows: many(workflow),
  workflowFolders: many(workflowFolder),
  workspaceInvitations: many(workspaceInvitation),
  permissions: many(permissions),
  userStats: many(userStats),
  copilotChats: many(copilotChats),
  templateStars: many(templateStars),
  templates: many(templates),
  settings: many(settings),
  userRateLimits: many(userRateLimits),
  copilotCheckpoints: many(copilotCheckpoints),
}))

export const environmentRelations = relations(environment, ({ one }) => ({
  user: one(user, {
    fields: [environment.userId],
    references: [user.id],
  }),
}))

export const workflowLogsRelations = relations(workflowLogs, ({ one }) => ({
  workflow: one(workflow, {
    fields: [workflowLogs.workflowId],
    references: [workflow.id],
  }),
}))

export const workflowRelations = relations(workflow, ({ one, many }) => ({
  workflowLogs: many(workflowLogs),
  marketplaces: many(marketplace),
  chats: many(chat),
  memories: many(memory),
  user: one(user, {
    fields: [workflow.userId],
    references: [user.id],
  }),
  workspace: one(workspace, {
    fields: [workflow.workspaceId],
    references: [workspace.id],
  }),
  workflowFolder: one(workflowFolder, {
    fields: [workflow.folderId],
    references: [workflowFolder.id],
  }),
  workflowEdges: many(workflowEdges),
  workflowSubflows: many(workflowSubflows),
  workflowBlocks: many(workflowBlocks),
  workflowExecutionBlocks: many(workflowExecutionBlocks),
  workflowExecutionLogs: many(workflowExecutionLogs),
  workflowExecutionSnapshots: many(workflowExecutionSnapshots),
  copilotChats: many(copilotChats),
  templates: many(templates),
  webhooks: many(webhook),
  workflowSchedules: many(workflowSchedule),
  copilotCheckpoints: many(copilotCheckpoints),
}))

export const apiKeyRelations = relations(apiKey, ({ one }) => ({
  user: one(user, {
    fields: [apiKey.userId],
    references: [user.id],
  }),
}))

export const marketplaceRelations = relations(marketplace, ({ one }) => ({
  workflow: one(workflow, {
    fields: [marketplace.workflowId],
    references: [workflow.id],
  }),
  user: one(user, {
    fields: [marketplace.authorId],
    references: [user.id],
  }),
}))

export const customToolsRelations = relations(customTools, ({ one }) => ({
  user: one(user, {
    fields: [customTools.userId],
    references: [user.id],
  }),
}))

export const sessionRelations = relations(session, ({ one }) => ({
  user: one(user, {
    fields: [session.userId],
    references: [user.id],
  }),
  organization: one(organization, {
    fields: [session.activeOrganizationId],
    references: [organization.id],
  }),
}))

export const organizationRelations = relations(organization, ({ many }) => ({
  sessions: many(session),
  invitations: many(invitation),
  members: many(member),
}))

export const invitationRelations = relations(invitation, ({ one }) => ({
  user: one(user, {
    fields: [invitation.inviterId],
    references: [user.id],
  }),
  organization: one(organization, {
    fields: [invitation.organizationId],
    references: [organization.id],
  }),
}))

export const memberRelations = relations(member, ({ one }) => ({
  user: one(user, {
    fields: [member.userId],
    references: [user.id],
  }),
  organization: one(organization, {
    fields: [member.organizationId],
    references: [organization.id],
  }),
}))

export const chatRelations = relations(chat, ({ one }) => ({
  workflow: one(workflow, {
    fields: [chat.workflowId],
    references: [workflow.id],
  }),
  user: one(user, {
    fields: [chat.userId],
    references: [user.id],
  }),
}))

export const workspaceRelations = relations(workspace, ({ one, many }) => ({
  user: one(user, {
    fields: [workspace.ownerId],
    references: [user.id],
  }),
  knowledgeBases: many(knowledgeBase),
  workflows: many(workflow),
  workflowFolders: many(workflowFolder),
  workspaceInvitations: many(workspaceInvitation),
}))

export const memoryRelations = relations(memory, ({ one }) => ({
  workflow: one(workflow, {
    fields: [memory.workflowId],
    references: [workflow.id],
  }),
}))

export const knowledgeBaseRelations = relations(knowledgeBase, ({ one, many }) => ({
  user: one(user, {
    fields: [knowledgeBase.userId],
    references: [user.id],
  }),
  workspace: one(workspace, {
    fields: [knowledgeBase.workspaceId],
    references: [workspace.id],
  }),
  documents: many(document),
  embeddings: many(embedding),
}))

export const workflowFolderRelations = relations(workflowFolder, ({ one, many }) => ({
  workflows: many(workflow),
  user: one(user, {
    fields: [workflowFolder.userId],
    references: [user.id],
  }),
  workspace: one(workspace, {
    fields: [workflowFolder.workspaceId],
    references: [workspace.id],
  }),
}))

export const workflowEdgesRelations = relations(workflowEdges, ({ one }) => ({
  workflow: one(workflow, {
    fields: [workflowEdges.workflowId],
    references: [workflow.id],
  }),
  workflowBlock_sourceBlockId: one(workflowBlocks, {
|
||||
fields: [workflowEdges.sourceBlockId],
|
||||
references: [workflowBlocks.id],
|
||||
relationName: 'workflowEdges_sourceBlockId_workflowBlocks_id',
|
||||
}),
|
||||
workflowBlock_targetBlockId: one(workflowBlocks, {
|
||||
fields: [workflowEdges.targetBlockId],
|
||||
references: [workflowBlocks.id],
|
||||
relationName: 'workflowEdges_targetBlockId_workflowBlocks_id',
|
||||
}),
|
||||
}))
|
||||
|
||||
export const workflowBlocksRelations = relations(workflowBlocks, ({ one, many }) => ({
|
||||
workflowEdges_sourceBlockId: many(workflowEdges, {
|
||||
relationName: 'workflowEdges_sourceBlockId_workflowBlocks_id',
|
||||
}),
|
||||
workflowEdges_targetBlockId: many(workflowEdges, {
|
||||
relationName: 'workflowEdges_targetBlockId_workflowBlocks_id',
|
||||
}),
|
||||
workflow: one(workflow, {
|
||||
fields: [workflowBlocks.workflowId],
|
||||
references: [workflow.id],
|
||||
}),
|
||||
webhooks: many(webhook),
|
||||
workflowSchedules: many(workflowSchedule),
|
||||
}))
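Because `workflowEdges` references `workflowBlocks` twice (once per endpoint), both sides of each pairing carry the same explicit `relationName` so the ORM can tell the two joins apart. The underlying idea can be sketched in plain TypeScript, independent of Drizzle (the `Block`/`Edge` shapes below are illustrative stand-ins, not the real schema types):

```typescript
// Why two named relations: a single foreign table (blocks) is referenced
// by two different columns of the same edge row.
interface Block { id: string; name: string }
interface Edge { id: string; sourceBlockId: string; targetBlockId: string }

const blocks: Block[] = [
  { id: 'b1', name: 'Start' },
  { id: 'b2', name: 'Agent' },
]
const edges: Edge[] = [{ id: 'e1', sourceBlockId: 'b1', targetBlockId: 'b2' }]

// Resolving an edge must pick the correct join column per named relation,
// mirroring relationName: 'workflowEdges_sourceBlockId_workflowBlocks_id'
// versus 'workflowEdges_targetBlockId_workflowBlocks_id' above.
function resolveEdge(edge: Edge) {
  return {
    sourceBlock: blocks.find((b) => b.id === edge.sourceBlockId),
    targetBlock: blocks.find((b) => b.id === edge.targetBlockId),
  }
}

const resolved = resolveEdge(edges[0])
console.log(resolved.sourceBlock?.name, resolved.targetBlock?.name) // Start Agent
```

Without the distinct names, a query resolver would have no way to know which of the two block references a given `one()`/`many()` pair describes.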

export const workflowSubflowsRelations = relations(workflowSubflows, ({ one }) => ({
  workflow: one(workflow, {
    fields: [workflowSubflows.workflowId],
    references: [workflow.id],
  }),
}))

export const workspaceInvitationRelations = relations(workspaceInvitation, ({ one }) => ({
  workspace: one(workspace, {
    fields: [workspaceInvitation.workspaceId],
    references: [workspace.id],
  }),
  user: one(user, {
    fields: [workspaceInvitation.inviterId],
    references: [user.id],
  }),
}))

export const permissionsRelations = relations(permissions, ({ one }) => ({
  user: one(user, {
    fields: [permissions.userId],
    references: [user.id],
  }),
}))

export const userStatsRelations = relations(userStats, ({ one }) => ({
  user: one(user, {
    fields: [userStats.userId],
    references: [user.id],
  }),
}))

export const workflowExecutionBlocksRelations = relations(workflowExecutionBlocks, ({ one }) => ({
  workflow: one(workflow, {
    fields: [workflowExecutionBlocks.workflowId],
    references: [workflow.id],
  }),
}))

export const workflowExecutionLogsRelations = relations(workflowExecutionLogs, ({ one }) => ({
  workflow: one(workflow, {
    fields: [workflowExecutionLogs.workflowId],
    references: [workflow.id],
  }),
  workflowExecutionSnapshot: one(workflowExecutionSnapshots, {
    fields: [workflowExecutionLogs.stateSnapshotId],
    references: [workflowExecutionSnapshots.id],
  }),
}))

export const workflowExecutionSnapshotsRelations = relations(
  workflowExecutionSnapshots,
  ({ one, many }) => ({
    workflowExecutionLogs: many(workflowExecutionLogs),
    workflow: one(workflow, {
      fields: [workflowExecutionSnapshots.workflowId],
      references: [workflow.id],
    }),
  })
)

export const copilotChatsRelations = relations(copilotChats, ({ one, many }) => ({
  user: one(user, {
    fields: [copilotChats.userId],
    references: [user.id],
  }),
  workflow: one(workflow, {
    fields: [copilotChats.workflowId],
    references: [workflow.id],
  }),
  copilotCheckpoints: many(copilotCheckpoints),
}))

export const documentRelations = relations(document, ({ one, many }) => ({
  knowledgeBase: one(knowledgeBase, {
    fields: [document.knowledgeBaseId],
    references: [knowledgeBase.id],
  }),
  embeddings: many(embedding),
}))

export const embeddingRelations = relations(embedding, ({ one }) => ({
  knowledgeBase: one(knowledgeBase, {
    fields: [embedding.knowledgeBaseId],
    references: [knowledgeBase.id],
  }),
  document: one(document, {
    fields: [embedding.documentId],
    references: [document.id],
  }),
}))

export const templateStarsRelations = relations(templateStars, ({ one }) => ({
  user: one(user, {
    fields: [templateStars.userId],
    references: [user.id],
  }),
  template: one(templates, {
    fields: [templateStars.templateId],
    references: [templates.id],
  }),
}))

export const templatesRelations = relations(templates, ({ one, many }) => ({
  templateStars: many(templateStars),
  workflow: one(workflow, {
    fields: [templates.workflowId],
    references: [workflow.id],
  }),
  user: one(user, {
    fields: [templates.userId],
    references: [user.id],
  }),
}))

export const settingsRelations = relations(settings, ({ one }) => ({
  user: one(user, {
    fields: [settings.userId],
    references: [user.id],
  }),
}))

export const userRateLimitsRelations = relations(userRateLimits, ({ one }) => ({
  user: one(user, {
    fields: [userRateLimits.userId],
    references: [user.id],
  }),
}))

export const webhookRelations = relations(webhook, ({ one }) => ({
  workflow: one(workflow, {
    fields: [webhook.workflowId],
    references: [workflow.id],
  }),
  workflowBlock: one(workflowBlocks, {
    fields: [webhook.blockId],
    references: [workflowBlocks.id],
  }),
}))

export const workflowScheduleRelations = relations(workflowSchedule, ({ one }) => ({
  workflow: one(workflow, {
    fields: [workflowSchedule.workflowId],
    references: [workflow.id],
  }),
  workflowBlock: one(workflowBlocks, {
    fields: [workflowSchedule.blockId],
    references: [workflowBlocks.id],
  }),
}))

export const copilotCheckpointsRelations = relations(copilotCheckpoints, ({ one }) => ({
  user: one(user, {
    fields: [copilotCheckpoints.userId],
    references: [user.id],
  }),
  workflow: one(workflow, {
    fields: [copilotCheckpoints.workflowId],
    references: [workflow.id],
  }),
  copilotChat: one(copilotChats, {
    fields: [copilotCheckpoints.chatId],
    references: [copilotChats.id],
  }),
}))
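The new copilot tables follow the same shape as the rest of the schema: a `copilotChats` row belongs to one user and one workflow and owns many `copilotCheckpoints`. Resolving a `many()` side amounts to a filtered lookup over the child table by foreign key; a minimal in-memory sketch (the data and `withCheckpoints` helper are hypothetical, not part of the schema):

```typescript
// Sketch of resolving a many() relation: gather every child row whose
// foreign key points back at the parent (checkpoints per copilot chat).
interface Chat { id: string; title: string }
interface Checkpoint { id: string; chatId: string }

const chats: Chat[] = [{ id: 'c1', title: 'Build agent workflow' }]
const checkpoints: Checkpoint[] = [
  { id: 'k1', chatId: 'c1' },
  { id: 'k2', chatId: 'c1' },
]

// Analogous to copilotCheckpoints: many(copilotCheckpoints) on the chat side,
// paired with copilotChat: one(copilotChats, ...) on the checkpoint side.
function withCheckpoints(chat: Chat) {
  return {
    ...chat,
    copilotCheckpoints: checkpoints.filter((k) => k.chatId === chat.id),
  }
}

console.log(withCheckpoints(chats[0]).copilotCheckpoints.length) // 2
```

Declaring both directions, as the schema above does, is what lets a relational query walk from a chat to its checkpoints or from a checkpoint back to its chat.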
1472
apps/sim/db/migrations/schema.ts
Normal file