Compare commits

...

18 Commits

Author SHA1 Message Date
waleed
f3d7673331 ack pr comment 2026-02-03 20:21:02 -08:00
waleed
40983b4b99 update other loc to match 2026-02-03 20:15:33 -08:00
Cursor Agent
68bdf50684 feat(action-bar): change Run button label and hover tooltip
- Changed 'Run from block' to 'Run' in tooltip
- Updated disabled state hover to show 'Disabled: Run Blocks Before'

Co-authored-by: Emir Karabeg <emir-karabeg@users.noreply.github.com>
2026-02-04 03:48:13 +00:00
Waleed
4db6e556b7 feat(canvas): added the ability to lock blocks (#3102)
* feat(canvas): added the ability to lock blocks

* unlock duplicates of locked blocks

* fix(duplicate): place duplicate outside locked container

When duplicating a block that's inside a locked loop/parallel,
the duplicate is now placed outside the container since nothing
should be added to a locked container.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

* fix(duplicate): unlock all blocks when duplicating workflow

- Server-side workflow duplication now sets locked: false for all blocks
- regenerateWorkflowStateIds also unlocks blocks for templates
- Client-side regenerateBlockIds already handled this (for paste/import)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

* fix code block disabled state, allow unlock from editor

* fix(lock): address code review feedback

- Fix toggle enabled using first toggleable block, not first block
- Delete button now checks isParentLocked
- Lock button now has disabled state
- Editor lock icon distinguishes block vs parent lock state

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

* fix(lock): prevent unlocking blocks inside locked containers

- Editor: can't unlock block if parent container is locked
- Action bar: can't unlock block if parent container is locked
- Shows "Parent container is locked" tooltip in both cases

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

* fix(lock): ensure consistent behavior across all UIs

Block Menu, Editor, Action Bar now all have identical behavior:
- Enable/Disable: disabled when locked OR parent locked
- Flip Handles: disabled when locked OR parent locked
- Delete: disabled when locked OR parent locked
- Remove from Subflow: disabled when locked OR parent locked
- Lock: always available for admins
- Unlock: disabled when parent is locked (unlock parent first)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

* fix(enable): consistent behavior - can't enable if parent disabled

Same pattern as lock: must enable parent container first before
enabling children inside it.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

* docs(quick-reference): add lock block action

Added documentation for the lock/unlock block feature (admin only).
Note: Image placeholder added, pending actual screenshot.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

* remove prefix square brackets in error notif

* add lock block image

* fix(block-menu): paste should not be disabled for locked selection

Paste creates new blocks, doesn't modify selected ones. Changed from
disableEdit (includes lock state) to !userCanEdit (permission only),
matching the Duplicate action behavior.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

* refactor(workflow): extract block deletion protection into shared utility

Extract duplicated block protection logic from workflow.tsx into
a reusable filterProtectedBlocks helper in utils/block-protection-utils.ts.
This ensures consistent behavior between context menu delete and
keyboard delete operations.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
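
A minimal sketch of what such a shared helper could look like (illustrative only; apart from `filterProtectedBlocks` and the "locked or inside a locked container" rule named in these commits, the field and type names are assumptions, not the actual implementation):

```ts
// Sketch: a block is "protected" when it is locked itself or sits inside a locked container.
// Assumes blocks carry an optional `locked` flag and an optional `parentId`.
interface BlockState {
  id: string
  locked?: boolean
  parentId?: string
}

export function isBlockProtected(
  blockId: string,
  blocks: Record<string, BlockState>
): boolean {
  const block = blocks[blockId]
  if (!block) return false
  if (block.locked) return true
  // Walk up parent containers (loop/parallel); a locked ancestor protects the child.
  let parentId = block.parentId
  while (parentId) {
    const parent = blocks[parentId]
    if (!parent) break
    if (parent.locked) return true
    parentId = parent.parentId
  }
  return false
}

// Keep only the block ids that are safe to delete.
export function filterProtectedBlocks(
  blockIds: string[],
  blocks: Record<string, BlockState>
): string[] {
  return blockIds.filter((id) => !isBlockProtected(id, blocks))
}
```

With a single filter like this, the context-menu delete and the keyboard delete path can share one code path, which is the consistency this commit is after.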

* refactor(workflow): extend block protection utilities for edge protection

Add isEdgeProtected, filterUnprotectedEdges, and hasProtectedBlocks
utilities. Refactor workflow.tsx to use these helpers for:
- onEdgesChange edge removal filtering
- onConnect connection prevention
- onNodeDragStart drag prevention
- Keyboard edge deletion
- Block menu disableEdit calculation

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

* fix(lock): address review comments for lock feature

1. Store batchToggleEnabled now uses continue to skip locked blocks
   entirely, matching database operation behavior

2. Copilot add operation now checks if parent container is locked
   before adding nested nodes (defensive check for consistency)

3. Remove unused filterUnprotectedEdges function

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

* fix(copilot): add lock checks for insert and extract operations

- insert_into_subflow: Check if existing block being moved is locked
- extract_from_subflow: Check if block or parent subflow is locked

These operations now match the UI behavior where locked blocks
cannot be moved into/out of containers.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

* fix(lock): prevent duplicates inside locked containers via regenerateBlockIds

1. regenerateBlockIds now checks if existing parent is locked before
   keeping the block inside it. If parent is locked, the duplicate
   is placed outside (parentId cleared) instead of creating an
   inconsistent state.

2. Remove unnecessary effectivePermissions.canAdmin and potentialParentId
   from onNodeDragStart dependency array.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
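
A rough sketch of the duplication behaviour described above (the reparenting rule and the "duplicates are unlocked" rule come from these commit messages; the function and type names are illustrative):

```ts
// Sketch: when regenerating ids for a duplicated block, drop the parent reference
// if that parent container is locked, so the copy lands outside the container.
interface DuplicatedBlock {
  id: string
  parentId?: string
  locked?: boolean
}

function reparentDuplicate(
  block: DuplicatedBlock,
  existingBlocks: Record<string, DuplicatedBlock>
): DuplicatedBlock {
  const parent = block.parentId ? existingBlocks[block.parentId] : undefined
  if (parent?.locked) {
    // Nothing may be added to a locked container, so clear the parent
    // instead of creating an inconsistent nested state.
    return { ...block, parentId: undefined, locked: false }
  }
  // Duplicates are always created unlocked.
  return { ...block, locked: false }
}
```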

* fix(lock): fix toggle locked target state and draggable check

1. BATCH_TOGGLE_LOCKED now uses first block from blocksToToggle set
   instead of blockIds[0], matching BATCH_TOGGLE_ENABLED pattern.
   Also added early exit if blocksToToggle is empty.

2. Blocks inside locked containers are now properly non-draggable.
   Changed draggable check from !block.locked to use isBlockProtected()
   which checks both block lock and parent container lock.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

* fix(copilot): check parent lock in edit and delete operations

Both edit and delete operations now check if the block's parent
container is locked, not just if the block itself is locked. This
ensures consistent behavior with the UI which uses isBlockProtected
utility that checks both direct lock and parent lock.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

* fix(socket): add server-side lock validation and admin-only permissions

1. BATCH_TOGGLE_LOCKED now requires admin role - non-admin users with
   write role can no longer bypass UI restriction via direct socket
   messages

2. BATCH_REMOVE_BLOCKS now validates lock status server-side - filters
   out protected blocks (locked or inside locked parent) before deletion

3. Remove duplicate/outdated comment in regenerateBlockIds

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

* test(socket): update permission test for admin-only lock toggle

batch-toggle-locked is now admin-only, so write role should be denied.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

* fix(undo-redo): use consistent target state for toggle redo

The redo logic for BATCH_TOGGLE_ENABLED and BATCH_TOGGLE_LOCKED was
incorrectly computing each block's new state as !previousStates[blockId].
However, the store's batchToggleEnabled/batchToggleLocked set ALL blocks
to the SAME target state based on the first block's previous state.

Now redo computes targetState = !previousStates[firstBlockId] and applies
it to all blocks, matching the store's behavior.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

* fix(socket): add comprehensive lock validation across operations

Based on audit findings, adds lock validation to multiple operations:

1. BATCH_TOGGLE_HANDLES - now skips locked/protected blocks at:
   - Store layer (batchToggleHandles)
   - Collaborative hook (collaborativeBatchToggleBlockHandles)
   - Server socket handler

2. BATCH_ADD_BLOCKS - server now filters blocks being added to
   locked parent containers

3. BATCH_UPDATE_PARENT - server now:
   - Skips protected blocks (locked or inside locked container)
   - Prevents moving blocks into locked containers

All validations use consistent isProtected() helper that checks both
direct lock and parent container lock.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

* refactor(workflow): use pre-computed lock state from contextMenuBlocks

contextMenuBlocks already has locked and isParentLocked properties
computed in use-canvas-context-menu.ts, so there's no need to look
up blocks again via hasProtectedBlocks.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

* fix(lock): add lock validation to block rename operations

Defense-in-depth: although the UI disables rename for locked blocks,
the collaborative layer and server now also validate locks.

- collaborativeUpdateBlockName: checks if block is locked or inside
  locked container before attempting rename
- UPDATE_NAME server handler: checks lock status and parent lock
  before performing database update

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

* added defense in depth for renaming locked blocks

* fix(socket): add server-side lock validation for edges and subblocks

Defense-in-depth: adds lock checks to server-side handlers that were
previously relying only on client-side validation.

Edge operations (ADD, REMOVE, BATCH_ADD, BATCH_REMOVE):
- Check if source or target blocks are protected before modifying edges

Subblock updates:
- Check if parent block is protected before updating subblock values

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

* fix(lock): fetch parent blocks for edge protection checks and consistent tooltip

- Fixed edge operations to fetch parent blocks before checking lock status
  - Previously, isBlockProtected checked if parent was locked, but the parent
    wasn't in blocksById because only source/target blocks were fetched
  - Now fetches parent blocks for all four edge operations: ADD, REMOVE,
    BATCH_ADD_EDGES, BATCH_REMOVE_EDGES
- Fixed tooltip inconsistency: changed "Run previous blocks first" to
  "Run upstream blocks first" in action-bar to match workflow.tsx

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

* updated tooltip text for run from block

* fix(lock): add lock check to duplicate button and clean up drag handler

- Added lock check to duplicate button in action bar to prevent
  duplicating locked blocks (consistent with other edit operations)
- Removed ineffective early return in onNodeDragStart since the
  `draggable` property on nodes already prevents dragging protected
  blocks - the early return was misleading as it couldn't actually
  stop a drag operation

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

* fix(lock): use disableEdit for duplicate in block menu

Changed duplicate menu item to use disableEdit (which includes lock
check) instead of !userCanEdit for consistency with action bar and
other edit operations.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-03 19:15:04 -08:00
Waleed
4ba22527b6 improvement(tag-dropdown): removed custom styling on tag dropdown popover, fixed execution ordering in terminal and loops entries (#3126)
* improvement(tag-dropdown): removed custom styling on tag dropdown popover, fixed execution ordering in terminal and loops entries

* ack pr comments

* handle old records
2026-02-03 18:32:40 -08:00
Waleed
c51f266ad7 fix(logs): use formatDuration utility and align file cards styling (#3125) 2026-02-03 15:26:09 -08:00
Waleed
4ca00810b2 fix(http): serialize nested objects in form-urlencoded body (#3124) 2026-02-03 10:58:01 -08:00
Waleed
710bf75bca fix(sidebar): right-click replaces selection, reset popover hover state (#3123)
* fix(sidebar): right-click replaces selection, reset popover hover state

* fix(queries): add userId to superuser query key for cache isolation
2026-02-03 10:09:03 -08:00
Waleed
f21fe2309c fix(formatting): consolidate duration formatting into shared utility (#3118)
* fix(formatting): consolidate duration formatting into shared utility

* fix(formatting): preserve original precision and rounding behavior

* fix(logs): add precision to logs list duration formatting

* fix(formatting): use parseFloat to preserve fractional milliseconds

* feat(ee): add enterprise modules (#3121)

* fix(formatting): return null for missing values, strip trailing zeros
2026-02-02 23:57:08 -08:00
Waleed
9c3fd1f7af feat(ee): add enterprise modules (#3121) 2026-02-02 23:40:18 -08:00
Waleed
a9b7d75d87 feat(editor): added docs link to editor (#3116) 2026-02-02 12:22:08 -08:00
Vikhyath Mondreti
0449804ffb improvement(billing): duplicate checks for bypasses, logger billing actor consistency, run from block (#3107)
* improvement(billing): improve against direct subscription creation bypasses

* more usage of block/unblock helpers

* address bugbot comments

* fail closed

* only run dup check for orgs
2026-02-02 10:52:08 -08:00
Vikhyath Mondreti
c286f3ed24 fix(mcp): child workflow with response block returns error (#3114) 2026-02-02 09:30:35 -08:00
Vikhyath Mondreti
b738550815 fix(cleanup-cron): stale execution cleanup integer overflow (#3113) 2026-02-02 09:03:56 -08:00
Waleed
c6357f7438 feat(tools): added enrich so (#3103)
* feat(tools): added enrich so

* updated docs and types
2026-01-31 21:18:41 -08:00
Waleed
b1118935f7 fix(workflow): optimize loop/parallel regeneration and prevent duplicate agent tools (#3100)
* fix(workflow): optimize loop/parallel regeneration and prevent duplicate agent tools

* refactor(workflow): remove addBlock in favor of batchAddBlocks

- Migrated undo-redo to use batchAddBlocks instead of addBlock loop
- Removed addBlock method from workflow store (now unused)
- Updated tests to use helper function wrapping batchAddBlocks
- This fixes the cursor bot comments about inconsistent parent checking
2026-01-31 17:55:32 -08:00
Waleed
3e18b4186c fix(mcp): pass timeout to SDK callTool to override 60s default (#3101) 2026-01-31 17:44:49 -08:00
Vikhyath Mondreti
e1ac201936 improvement(ratelimits, sockets): increase across all plans, reconnecting notif for sockets (#3096)
* improvement(rate-limits): increase across all plans

* improve sockets with reconnecting

* address bugbot comment

* fix typing
2026-01-31 16:48:57 -08:00
192 changed files with 21183 additions and 1458 deletions

View File

@@ -5421,3 +5421,18 @@ z'
</svg>
)
}
export function EnrichSoIcon(props: SVGProps<SVGSVGElement>) {
return (
<svg {...props} xmlns='http://www.w3.org/2000/svg' viewBox='0 0 398 394' fill='none'>
<path
fill='#5A52F4'
d='M129.705566,319.705719 C127.553314,322.684906 125.651512,325.414673 123.657059,328.277466 C113.748466,318.440308 105.605003,310.395905 97.510834,302.302216 C93.625801,298.417419 89.990181,294.269318 85.949242,290.558868 C82.857994,287.720428 82.464081,285.757660 85.772888,282.551880 C104.068108,264.826202 122.146088,246.876312 140.285110,228.989670 C141.183945,228.103317 141.957443,227.089844 143.588837,225.218384 C140.691605,225.066116 138.820053,224.882874 136.948410,224.881958 C102.798264,224.865326 68.647453,224.765244 34.498699,224.983612 C29.315699,225.016739 27.990419,223.343155 28.090912,218.397430 C28.381887,204.076935 28.189890,189.746719 28.195684,175.420319 C28.198524,168.398178 28.319166,168.279541 35.590389,168.278687 C69.074188,168.274780 102.557991,168.281174 136.041794,168.266083 C137.968231,168.265213 139.894608,168.107101 141.821030,168.022171 C142.137955,167.513992 142.454895,167.005829 142.771820,166.497650 C122.842415,146.495621 102.913002,126.493591 83.261360,106.770348 C96.563828,93.471756 109.448814,80.590523 122.656265,67.386925 C123.522743,68.161835 124.785545,69.187096 125.930321,70.330513 C144.551819,88.930206 163.103683,107.600082 181.805267,126.118790 C186.713593,130.979126 189.085648,136.448059 189.055374,143.437057 C188.899490,179.418961 188.911179,215.402191 189.046661,251.384262 C189.072296,258.190796 186.742920,263.653717 181.982727,268.323273 C164.624405,285.351227 147.295807,302.409485 129.705566,319.705719z'
/>
<path
fill='#5A52F4'
d='M276.070923,246.906128 C288.284363,258.985870 300.156097,270.902100 312.235931,282.603485 C315.158752,285.434784 315.417542,287.246246 312.383484,290.248932 C301.143494,301.372498 290.168549,312.763733 279.075592,324.036255 C278.168030,324.958496 277.121307,325.743835 275.898315,326.801086 C274.628357,325.711792 273.460663,324.822968 272.422150,323.802673 C253.888397,305.594757 235.418701,287.321289 216.818268,269.181854 C211.508789,264.003937 208.872726,258.136688 208.914001,250.565842 C209.108337,214.917786 209.084808,179.267715 208.928864,143.619293 C208.898407,136.654907 211.130066,131.122162 216.052216,126.246094 C234.867538,107.606842 253.537521,88.820908 272.274780,70.102730 C273.313202,69.065353 274.468597,68.145027 275.264038,67.440727 C288.353516,80.579514 301.213470,93.487869 314.597534,106.922356 C295.163391,126.421753 275.214752,146.437363 255.266113,166.452972 C255.540176,166.940353 255.814240,167.427734 256.088318,167.915100 C257.983887,168.035736 259.879425,168.260345 261.775085,168.261551 C295.425201,168.282852 329.075287,168.273544 362.725403,168.279831 C369.598907,168.281113 369.776215,168.463593 369.778931,175.252213 C369.784882,189.911667 369.646088,204.573074 369.861206,219.229355 C369.925110,223.585022 368.554596,224.976288 364.148865,224.956406 C329.833130,224.801605 295.516388,224.869598 261.199951,224.868744 C259.297974,224.868698 257.396027,224.868744 254.866638,224.868744 C262.350708,232.658707 269.078217,239.661194 276.070923,246.906128z'
/>
</svg>
)
}

View File

@@ -29,6 +29,7 @@ import {
DynamoDBIcon,
ElasticsearchIcon,
ElevenLabsIcon,
EnrichSoIcon,
ExaAIIcon,
EyeIcon,
FirecrawlIcon,
@@ -160,6 +161,7 @@ export const blockTypeToIconMap: Record<string, IconComponent> = {
dynamodb: DynamoDBIcon,
elasticsearch: ElasticsearchIcon,
elevenlabs: ElevenLabsIcon,
enrich: EnrichSoIcon,
exa: ExaAIIcon,
file_v2: DocumentIcon,
firecrawl: FirecrawlIcon,

View File

@@ -27,16 +27,16 @@ All API responses include information about your workflow execution limits and u
"limits": {
"workflowExecutionRateLimit": {
"sync": {
"requestsPerMinute": 60, // Sustained rate limit per minute
"maxBurst": 120, // Maximum burst capacity
"remaining": 118, // Current tokens available (up to maxBurst)
"resetAt": "..." // When tokens next refill
"requestsPerMinute": 150, // Sustained rate limit per minute
"maxBurst": 300, // Maximum burst capacity
"remaining": 298, // Current tokens available (up to maxBurst)
"resetAt": "..." // When tokens next refill
},
"async": {
"requestsPerMinute": 200, // Sustained rate limit per minute
"maxBurst": 400, // Maximum burst capacity
"remaining": 398, // Current tokens available
"resetAt": "..." // When tokens next refill
"requestsPerMinute": 1000, // Sustained rate limit per minute
"maxBurst": 2000, // Maximum burst capacity
"remaining": 1998, // Current tokens available
"resetAt": "..." // When tokens next refill
}
},
"usage": {
@@ -107,28 +107,28 @@ Query workflow execution logs with extensive filtering options.
}
],
"nextCursor": "eyJzIjoiMjAyNS0wMS0wMVQxMjozNDo1Ni43ODlaIiwiaWQiOiJsb2dfYWJjMTIzIn0",
"limits": {
"workflowExecutionRateLimit": {
"sync": {
"requestsPerMinute": 60,
"maxBurst": 120,
"remaining": 118,
"resetAt": "2025-01-01T12:35:56.789Z"
"limits": {
"workflowExecutionRateLimit": {
"sync": {
"requestsPerMinute": 150,
"maxBurst": 300,
"remaining": 298,
"resetAt": "2025-01-01T12:35:56.789Z"
},
"async": {
"requestsPerMinute": 1000,
"maxBurst": 2000,
"remaining": 1998,
"resetAt": "2025-01-01T12:35:56.789Z"
}
},
"async": {
"requestsPerMinute": 200,
"maxBurst": 400,
"remaining": 398,
"resetAt": "2025-01-01T12:35:56.789Z"
"usage": {
"currentPeriodCost": 1.234,
"limit": 10,
"plan": "pro",
"isExceeded": false
}
},
"usage": {
"currentPeriodCost": 1.234,
"limit": 10,
"plan": "pro",
"isExceeded": false
}
}
}
```
</Tab>
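
The `limits` block shown above can drive client-side pacing. A hedged sketch follows: the field names come from the documented response, but the helper itself is illustrative and not part of the API.

```ts
// Sketch: decide how long to wait before the next sync execution,
// based on the documented rate-limit fields in an API response.
interface RateLimitWindow {
  requestsPerMinute: number
  maxBurst: number
  remaining: number
  resetAt: string // ISO timestamp for the next token refill
}

interface ApiResponseLimits {
  workflowExecutionRateLimit: { sync: RateLimitWindow; async: RateLimitWindow }
}

function msUntilNextSyncSlot(limits: ApiResponseLimits, now = new Date()): number {
  const sync = limits.workflowExecutionRateLimit.sync
  if (sync.remaining > 0) return 0 // tokens available, fire immediately
  const resetMs = new Date(sync.resetAt).getTime() - now.getTime()
  return Math.max(resetMs, 0) // otherwise wait for the refill
}
```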
@@ -188,15 +188,15 @@ Retrieve detailed information about a specific log entry.
"limits": {
"workflowExecutionRateLimit": {
"sync": {
"requestsPerMinute": 60,
"maxBurst": 120,
"remaining": 118,
"requestsPerMinute": 150,
"maxBurst": 300,
"remaining": 298,
"resetAt": "2025-01-01T12:35:56.789Z"
},
"async": {
"requestsPerMinute": 200,
"maxBurst": 400,
"remaining": 398,
"requestsPerMinute": 1000,
"maxBurst": 2000,
"remaining": 1998,
"resetAt": "2025-01-01T12:35:56.789Z"
}
},
@@ -477,10 +477,10 @@ The API uses a **token bucket algorithm** for rate limiting, providing fair usag
| Plan | Requests/Minute | Burst Capacity |
|------|-----------------|----------------|
- | Free | 10 | 20 |
- | Pro | 30 | 60 |
- | Team | 60 | 120 |
- | Enterprise | 120 | 240 |
+ | Free | 30 | 60 |
+ | Pro | 100 | 200 |
+ | Team | 200 | 400 |
+ | Enterprise | 500 | 1000 |
**How it works:**
- Tokens refill at `requestsPerMinute` rate
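
A compact sketch of the token-bucket behaviour this section describes, using the Pro plan's updated numbers (100 requests/minute sustained, 200 burst) purely as an example. This is an illustration of the algorithm, not the service's actual implementation.

```ts
// Sketch of a token bucket: capacity = maxBurst, refill rate = requestsPerMinute.
class TokenBucket {
  private tokens: number
  private lastRefill: number

  constructor(
    private requestsPerMinute: number,
    private maxBurst: number
  ) {
    this.tokens = maxBurst // a fresh bucket starts full
    this.lastRefill = Date.now()
  }

  tryConsume(now = Date.now()): boolean {
    const elapsedMinutes = (now - this.lastRefill) / 60_000
    // Tokens refill continuously at the sustained rate, capped at the burst capacity.
    this.tokens = Math.min(this.maxBurst, this.tokens + elapsedMinutes * this.requestsPerMinute)
    this.lastRefill = now
    if (this.tokens < 1) return false // rate limited
    this.tokens -= 1
    return true
  }
}

// Example: Pro plan sync limit of 100 req/min with bursts up to 200.
const proSyncBucket = new TokenBucket(100, 200)
```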

View File

@@ -170,16 +170,16 @@ curl -X GET -H "X-API-Key: YOUR_API_KEY" -H "Content-Type: application/json" htt
"rateLimit": {
"sync": {
"isLimited": false,
"requestsPerMinute": 25,
"maxBurst": 50,
"remaining": 50,
"requestsPerMinute": 150,
"maxBurst": 300,
"remaining": 300,
"resetAt": "2025-09-08T22:51:55.999Z"
},
"async": {
"isLimited": false,
"requestsPerMinute": 200,
"maxBurst": 400,
"remaining": 400,
"requestsPerMinute": 1000,
"maxBurst": 2000,
"remaining": 2000,
"resetAt": "2025-09-08T22:51:56.155Z"
},
"authType": "api"
@@ -206,11 +206,11 @@ curl -X GET -H "X-API-Key: YOUR_API_KEY" -H "Content-Type: application/json" htt
Different subscription plans have different usage limits:
- | Plan | Monthly Usage Limit | Rate Limits (per minute) |
- |------|-------------------|-------------------------|
- | **Free** | $20 | 5 sync, 10 async |
- | **Pro** | $100 | 10 sync, 50 async |
- | **Team** | $500 (pooled) | 50 sync, 100 async |
+ | Plan | Monthly Usage Included | Rate Limits (per minute) |
+ |------|------------------------|-------------------------|
+ | **Free** | $20 | 50 sync, 200 async |
+ | **Pro** | $20 (adjustable) | 150 sync, 1,000 async |
+ | **Team** | $40/seat (pooled, adjustable) | 300 sync, 2,500 async |
| **Enterprise** | Custom | Custom |
## Billing Model

View File

@@ -180,6 +180,11 @@ A quick lookup for everyday actions in the Sim workflow editor. For keyboard sho
<td>Right-click → **Enable/Disable**</td>
<td><ActionImage src="/static/quick-reference/disable-block.png" alt="Disable block" /></td>
</tr>
<tr>
<td>Lock/Unlock a block</td>
<td>Hover block → Click lock icon (Admin only)</td>
<td><ActionImage src="/static/quick-reference/lock-block.png" alt="Lock block" /></td>
</tr>
<tr>
<td>Toggle handle orientation</td>
<td>Right-click → **Toggle Handles**</td>

View File

@@ -0,0 +1,930 @@
---
title: Enrich
description: B2B data enrichment and LinkedIn intelligence with Enrich.so
---
import { BlockInfoCard } from "@/components/ui/block-info-card"
<BlockInfoCard
type="enrich"
color="#E5E5E6"
/>
{/* MANUAL-CONTENT-START:intro */}
[Enrich.so](https://enrich.so/) delivers real-time, precision B2B data enrichment and LinkedIn intelligence. Its platform provides dynamic access to public and structured company, contact, and professional information, enabling teams to build richer profiles, improve lead quality, and drive more effective outreach.
With Enrich.so, you can:
- **Enrich contact and company profiles**: Instantly discover key data points for leads, prospects, and businesses using just an email or LinkedIn profile.
- **Verify email deliverability**: Check if emails are valid, deliverable, and safe to contact before sending.
- **Find work & personal emails**: Identify missing business emails from a LinkedIn profile or personal emails to expand your reach.
- **Reveal phone numbers and social profiles**: Surface additional communication channels for contacts through enrichment tools.
- **Analyze LinkedIn posts and engagement**: Extract insights on post reach, reactions, and audience from public LinkedIn content.
- **Conduct advanced people and company search**: Enable your agents to locate companies and professionals based on deep filters and real-time intelligence.
The Sim integration with Enrich.so empowers your agents and automations to instantly query, enrich, and validate B2B data, boosting productivity in workflows like sales prospecting, recruiting, marketing operations, and more. Combining Sim's orchestration capabilities with Enrich.so unlocks smarter, data-driven automation strategies powered by best-in-class B2B intelligence.
{/* MANUAL-CONTENT-END */}
## Usage Instructions
Access real-time B2B data intelligence with Enrich.so. Enrich profiles from email addresses, find work emails from LinkedIn, verify email deliverability, search for people and companies, and analyze LinkedIn post engagement.
## Tools
### `enrich_check_credits`
Check your Enrich API credit usage and remaining balance.
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Enrich API key |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `totalCredits` | number | Total credits allocated to the account |
| `creditsUsed` | number | Credits consumed so far |
| `creditsRemaining` | number | Available credits remaining |
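For reference, the shapes implied by the tables above, written as TypeScript types (a sketch derived directly from the tables, not taken from an Enrich.so SDK):
```ts
// Shapes implied by the enrich_check_credits input/output tables.
interface EnrichCheckCreditsInput {
  apiKey: string // Enrich API key
}

interface EnrichCheckCreditsOutput {
  totalCredits: number // total credits allocated to the account
  creditsUsed: number // credits consumed so far
  creditsRemaining: number // available credits remaining
}
```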
### `enrich_email_to_profile`
Retrieve detailed LinkedIn profile information using an email address including work history, education, and skills.
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Enrich API key |
| `email` | string | Yes | Email address to look up \(e.g., john.doe@company.com\) |
| `inRealtime` | boolean | No | Set to true to retrieve fresh data, bypassing cached information |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `displayName` | string | Full display name |
| `firstName` | string | First name |
| `lastName` | string | Last name |
| `headline` | string | Professional headline |
| `occupation` | string | Current occupation |
| `summary` | string | Profile summary |
| `location` | string | Location |
| `country` | string | Country |
| `linkedInUrl` | string | LinkedIn profile URL |
| `photoUrl` | string | Profile photo URL |
| `connectionCount` | number | Number of connections |
| `isConnectionCountObfuscated` | boolean | Whether connection count is obfuscated \(500+\) |
| `positionHistory` | array | Work experience history |
| ↳ `title` | string | Job title |
| ↳ `company` | string | Company name |
| ↳ `startDate` | string | Start date |
| ↳ `endDate` | string | End date |
| ↳ `location` | string | Location |
| `education` | array | Education history |
| ↳ `school` | string | School name |
| ↳ `degree` | string | Degree |
| ↳ `fieldOfStudy` | string | Field of study |
| ↳ `startDate` | string | Start date |
| ↳ `endDate` | string | End date |
| `certifications` | array | Professional certifications |
| ↳ `name` | string | Certification name |
| ↳ `authority` | string | Issuing authority |
| ↳ `url` | string | Certification URL |
| `skills` | array | List of skills |
| `languages` | array | List of languages |
| `locale` | string | Profile locale \(e.g., en_US\) |
| `version` | number | Profile version number |
### `enrich_email_to_person_lite`
Retrieve basic LinkedIn profile information from an email address. A lighter version with essential data only.
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Enrich API key |
| `email` | string | Yes | Email address to look up \(e.g., john.doe@company.com\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `name` | string | Full name |
| `firstName` | string | First name |
| `lastName` | string | Last name |
| `email` | string | Email address |
| `title` | string | Job title |
| `location` | string | Location |
| `company` | string | Current company |
| `companyLocation` | string | Company location |
| `companyLinkedIn` | string | Company LinkedIn URL |
| `profileId` | string | LinkedIn profile ID |
| `schoolName` | string | School name |
| `schoolUrl` | string | School URL |
| `linkedInUrl` | string | LinkedIn profile URL |
| `photoUrl` | string | Profile photo URL |
| `followerCount` | number | Number of followers |
| `connectionCount` | number | Number of connections |
| `languages` | array | Languages spoken |
| `projects` | array | Projects |
| `certifications` | array | Certifications |
| `volunteerExperience` | array | Volunteer experience |
### `enrich_linkedin_profile`
Enrich a LinkedIn profile URL with detailed information including positions, education, and social metrics.
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Enrich API key |
| `url` | string | Yes | LinkedIn profile URL \(e.g., linkedin.com/in/williamhgates\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `profileId` | string | LinkedIn profile ID |
| `firstName` | string | First name |
| `lastName` | string | Last name |
| `subTitle` | string | Profile subtitle/headline |
| `profilePicture` | string | Profile picture URL |
| `backgroundImage` | string | Background image URL |
| `industry` | string | Industry |
| `location` | string | Location |
| `followersCount` | number | Number of followers |
| `connectionsCount` | number | Number of connections |
| `premium` | boolean | Whether the account is premium |
| `influencer` | boolean | Whether the account is an influencer |
| `positions` | array | Work positions |
| ↳ `title` | string | Job title |
| ↳ `company` | string | Company name |
| ↳ `companyLogo` | string | Company logo URL |
| ↳ `startDate` | string | Start date |
| ↳ `endDate` | string | End date |
| ↳ `location` | string | Location |
| `education` | array | Education history |
| ↳ `school` | string | School name |
| ↳ `degree` | string | Degree |
| ↳ `fieldOfStudy` | string | Field of study |
| ↳ `startDate` | string | Start date |
| ↳ `endDate` | string | End date |
| `websites` | array | Personal websites |
### `enrich_find_email`
Find a person's email address using their full name and company domain.
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Enrich API key |
| `fullName` | string | Yes | Person's full name \(e.g., John Doe\) |
| `companyDomain` | string | Yes | Company domain \(e.g., example.com\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `email` | string | Found email address |
| `firstName` | string | First name |
| `lastName` | string | Last name |
| `domain` | string | Company domain |
| `found` | boolean | Whether an email was found |
| `acceptAll` | boolean | Whether the domain accepts all emails |
### `enrich_linkedin_to_work_email`
Find a work email address from a LinkedIn profile URL.
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Enrich API key |
| `linkedinProfile` | string | Yes | LinkedIn profile URL \(e.g., https://www.linkedin.com/in/williamhgates\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `email` | string | Found work email address |
| `found` | boolean | Whether an email was found |
| `status` | string | Request status \(in_progress or completed\) |
### `enrich_linkedin_to_personal_email`
Find personal email address from a LinkedIn profile URL.
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Enrich API key |
| `linkedinProfile` | string | Yes | LinkedIn profile URL \(e.g., linkedin.com/in/username\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `email` | string | Personal email address |
| `found` | boolean | Whether an email was found |
| `status` | string | Request status |
### `enrich_phone_finder`
Find a phone number from a LinkedIn profile URL.
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Enrich API key |
| `linkedinProfile` | string | Yes | LinkedIn profile URL \(e.g., linkedin.com/in/williamhgates\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `profileUrl` | string | LinkedIn profile URL |
| `mobileNumber` | string | Found mobile phone number |
| `found` | boolean | Whether a phone number was found |
| `status` | string | Request status \(in_progress or completed\) |
### `enrich_email_to_phone`
Find a phone number associated with an email address.
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Enrich API key |
| `email` | string | Yes | Email address to look up \(e.g., john.doe@example.com\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `email` | string | Email address looked up |
| `mobileNumber` | string | Found mobile phone number |
| `found` | boolean | Whether a phone number was found |
| `status` | string | Request status \(in_progress or completed\) |
### `enrich_verify_email`
Verify an email address for deliverability, including catch-all detection and provider identification.
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Enrich API key |
| `email` | string | Yes | Email address to verify \(e.g., john.doe@example.com\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `email` | string | Email address verified |
| `status` | string | Verification status |
| `result` | string | Deliverability result \(deliverable, undeliverable, etc.\) |
| `confidenceScore` | number | Confidence score \(0-100\) |
| `smtpProvider` | string | Email service provider \(e.g., Google, Microsoft\) |
| `mailDisposable` | boolean | Whether the email is from a disposable provider |
| `mailAcceptAll` | boolean | Whether the domain is a catch-all domain |
| `free` | boolean | Whether the email uses a free email service |
### `enrich_disposable_email_check`
Check if an email address is from a disposable or temporary email provider. Returns a score and validation details.
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Enrich API key |
| `email` | string | Yes | Email address to check \(e.g., john.doe@example.com\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `email` | string | Email address checked |
| `score` | number | Validation score \(0-100\) |
| `testsPassed` | string | Number of tests passed \(e.g., "3/3"\) |
| `passed` | boolean | Whether the email passed all validation tests |
| `reason` | string | Reason for failure if email did not pass |
| `mailServerIp` | string | Mail server IP address |
| `mxRecords` | array | MX records for the domain |
| ↳ `host` | string | MX record host |
| ↳ `pref` | number | MX record preference |
### `enrich_email_to_ip`
Discover an IP address associated with an email address.
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Enrich API key |
| `email` | string | Yes | Email address to look up \(e.g., john.doe@example.com\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `email` | string | Email address looked up |
| `ip` | string | Associated IP address |
| `found` | boolean | Whether an IP address was found |
### `enrich_ip_to_company`
Identify a company from an IP address with detailed firmographic information.
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Enrich API key |
| `ip` | string | Yes | IP address to look up \(e.g., 86.92.60.221\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `name` | string | Company name |
| `legalName` | string | Legal company name |
| `domain` | string | Primary domain |
| `domainAliases` | array | Domain aliases |
| `sector` | string | Business sector |
| `industry` | string | Industry |
| `phone` | string | Phone number |
| `employees` | number | Number of employees |
| `revenue` | string | Estimated revenue |
| `location` | json | Company location |
| ↳ `city` | string | City |
| ↳ `state` | string | State |
| ↳ `country` | string | Country |
| ↳ `timezone` | string | Timezone |
| `linkedInUrl` | string | LinkedIn company URL |
| `twitterUrl` | string | Twitter URL |
| `facebookUrl` | string | Facebook URL |
### `enrich_company_lookup`
Look up comprehensive company information by name or domain including funding, location, and social profiles.
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Enrich API key |
| `name` | string | No | Company name \(e.g., Google\) |
| `domain` | string | No | Company domain \(e.g., google.com\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `name` | string | Company name |
| `universalName` | string | Universal company name |
| `companyId` | string | Company ID |
| `description` | string | Company description |
| `phone` | string | Phone number |
| `linkedInUrl` | string | LinkedIn company URL |
| `websiteUrl` | string | Company website |
| `followers` | number | Number of LinkedIn followers |
| `staffCount` | number | Number of employees |
| `foundedDate` | string | Date founded |
| `type` | string | Company type |
| `industries` | array | Industries |
| `specialties` | array | Company specialties |
| `headquarters` | json | Headquarters location |
| ↳ `city` | string | City |
| ↳ `country` | string | Country |
| ↳ `postalCode` | string | Postal code |
| ↳ `line1` | string | Address line 1 |
| `logo` | string | Company logo URL |
| `coverImage` | string | Cover image URL |
| `fundingRounds` | array | Funding history |
| ↳ `roundType` | string | Funding round type |
| ↳ `amount` | number | Amount raised |
| ↳ `currency` | string | Currency |
| ↳ `investors` | array | Investors |
### `enrich_company_funding`
Retrieve company funding history, traffic metrics, and executive information by domain.
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Enrich API key |
| `domain` | string | Yes | Company domain \(e.g., example.com\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `legalName` | string | Legal company name |
| `employeeCount` | number | Number of employees |
| `headquarters` | string | Headquarters location |
| `industry` | string | Industry |
| `totalFundingRaised` | number | Total funding raised |
| `fundingRounds` | array | Funding rounds |
| ↳ `roundType` | string | Round type |
| ↳ `amount` | number | Amount raised |
| ↳ `date` | string | Date |
| ↳ `investors` | array | Investors |
| `monthlyVisits` | number | Monthly website visits |
| `trafficChange` | number | Traffic change percentage |
| `itSpending` | number | Estimated IT spending in USD |
| `executives` | array | Executive team |
| ↳ `name` | string | Name |
| ↳ `title` | string | Title |
### `enrich_company_revenue`
Retrieve company revenue data, CEO information, and competitive analysis by domain.
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Enrich API key |
| `domain` | string | Yes | Company domain \(e.g., clay.io\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `companyName` | string | Company name |
| `shortDescription` | string | Short company description |
| `fullSummary` | string | Full company summary |
| `revenue` | string | Company revenue |
| `revenueMin` | number | Minimum revenue estimate |
| `revenueMax` | number | Maximum revenue estimate |
| `employeeCount` | number | Number of employees |
| `founded` | string | Year founded |
| `ownership` | string | Ownership type |
| `status` | string | Company status \(e.g., Active\) |
| `website` | string | Company website URL |
| `ceo` | json | CEO information |
| ↳ `name` | string | CEO name |
| ↳ `designation` | string | CEO designation/title |
| ↳ `rating` | number | CEO rating |
| `socialLinks` | json | Social media links |
| ↳ `linkedIn` | string | LinkedIn URL |
| ↳ `twitter` | string | Twitter URL |
| ↳ `facebook` | string | Facebook URL |
| `totalFunding` | string | Total funding raised |
| `fundingRounds` | number | Number of funding rounds |
| `competitors` | array | Competitors |
| ↳ `name` | string | Competitor name |
| ↳ `revenue` | string | Revenue |
| ↳ `employeeCount` | number | Employee count |
| ↳ `headquarters` | string | Headquarters |
### `enrich_search_people`
Search for professionals by various criteria including name, title, skills, education, and company.
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Enrich API key |
| `firstName` | string | No | First name |
| `lastName` | string | No | Last name |
| `summary` | string | No | Professional summary keywords |
| `subTitle` | string | No | Job title/subtitle |
| `locationCountry` | string | No | Country |
| `locationCity` | string | No | City |
| `locationState` | string | No | State/province |
| `influencer` | boolean | No | Filter for influencers only |
| `premium` | boolean | No | Filter for premium accounts only |
| `language` | string | No | Primary language |
| `industry` | string | No | Industry |
| `currentJobTitles` | json | No | Current job titles \(array\) |
| `pastJobTitles` | json | No | Past job titles \(array\) |
| `skills` | json | No | Skills to search for \(array\) |
| `schoolNames` | json | No | School names \(array\) |
| `certifications` | json | No | Certifications to filter by \(array\) |
| `degreeNames` | json | No | Degree names to filter by \(array\) |
| `studyFields` | json | No | Fields of study to filter by \(array\) |
| `currentCompanies` | json | No | Current company IDs to filter by \(array of numbers\) |
| `pastCompanies` | json | No | Past company IDs to filter by \(array of numbers\) |
| `currentPage` | number | No | Page number \(default: 1\) |
| `pageSize` | number | No | Results per page \(default: 20\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `currentPage` | number | Current page number |
| `totalPage` | number | Total number of pages |
| `pageSize` | number | Results per page |
| `profiles` | array | Search results |
| ↳ `profileIdentifier` | string | Profile ID |
| ↳ `givenName` | string | First name |
| ↳ `familyName` | string | Last name |
| ↳ `currentPosition` | string | Current job title |
| ↳ `profileImage` | string | Profile image URL |
| ↳ `externalProfileUrl` | string | LinkedIn URL |
| ↳ `city` | string | City |
| ↳ `country` | string | Country |
| ↳ `expertSkills` | array | Skills |
### `enrich_search_company`
Search for companies by various criteria including name, industry, location, and size.
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Enrich API key |
| `name` | string | No | Company name |
| `website` | string | No | Company website URL |
| `tagline` | string | No | Company tagline |
| `type` | string | No | Company type \(e.g., Private, Public\) |
| `description` | string | No | Company description keywords |
| `industries` | json | No | Industries to filter by \(array\) |
| `locationCountry` | string | No | Country |
| `locationCity` | string | No | City |
| `postalCode` | string | No | Postal code |
| `locationCountryList` | json | No | Multiple countries to filter by \(array\) |
| `locationCityList` | json | No | Multiple cities to filter by \(array\) |
| `specialities` | json | No | Company specialties \(array\) |
| `followers` | number | No | Minimum number of followers |
| `staffCount` | number | No | Maximum staff count |
| `staffCountMin` | number | No | Minimum staff count |
| `staffCountMax` | number | No | Maximum staff count |
| `currentPage` | number | No | Page number \(default: 1\) |
| `pageSize` | number | No | Results per page \(default: 20\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `currentPage` | number | Current page number |
| `totalPage` | number | Total number of pages |
| `pageSize` | number | Results per page |
| `companies` | array | Search results |
| ↳ `companyName` | string | Company name |
| ↳ `tagline` | string | Company tagline |
| ↳ `webAddress` | string | Website URL |
| ↳ `industries` | array | Industries |
| ↳ `teamSize` | number | Team size |
| ↳ `linkedInProfile` | string | LinkedIn URL |
### `enrich_search_company_employees`
Search for employees within specific companies by location and job title.
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Enrich API key |
| `companyIds` | json | No | Array of company IDs to search within |
| `country` | string | No | Country filter \(e.g., United States\) |
| `city` | string | No | City filter \(e.g., San Francisco\) |
| `state` | string | No | State filter \(e.g., California\) |
| `jobTitles` | json | No | Job titles to filter by \(array\) |
| `page` | number | No | Page number \(default: 1\) |
| `pageSize` | number | No | Results per page \(default: 10\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `currentPage` | number | Current page number |
| `totalPage` | number | Total number of pages |
| `pageSize` | number | Number of results per page |
| `profiles` | array | Employee profiles |
| ↳ `profileIdentifier` | string | Profile ID |
| ↳ `givenName` | string | First name |
| ↳ `familyName` | string | Last name |
| ↳ `currentPosition` | string | Current job title |
| ↳ `profileImage` | string | Profile image URL |
| ↳ `externalProfileUrl` | string | LinkedIn URL |
| ↳ `city` | string | City |
| ↳ `country` | string | Country |
| ↳ `expertSkills` | array | Skills |
### `enrich_search_similar_companies`
Find companies similar to a given company by LinkedIn URL with filters for location and size.
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Enrich API key |
| `url` | string | Yes | LinkedIn company URL \(e.g., linkedin.com/company/google\) |
| `accountLocation` | json | No | Filter by locations \(array of country names\) |
| `employeeSizeType` | string | No | Employee size filter type \(e.g., RANGE\) |
| `employeeSizeRange` | json | No | Employee size ranges \(array of \{start, end\} objects\) |
| `page` | number | No | Page number \(default: 1\) |
| `num` | number | No | Number of results per page |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `companies` | array | Similar companies |
| ↳ `url` | string | LinkedIn URL |
| ↳ `name` | string | Company name |
| ↳ `universalName` | string | Universal name |
| ↳ `type` | string | Company type |
| ↳ `description` | string | Description |
| ↳ `phone` | string | Phone number |
| ↳ `website` | string | Website URL |
| ↳ `logo` | string | Logo URL |
| ↳ `foundedYear` | number | Year founded |
| ↳ `staffTotal` | number | Total staff |
| ↳ `industries` | array | Industries |
| ↳ `relevancyScore` | number | Relevancy score |
| ↳ `relevancyValue` | string | Relevancy value |
### `enrich_sales_pointer_people`
Advanced people search with complex filters for location, company size, seniority, experience, and more.
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Enrich API key |
| `page` | number | Yes | Page number \(starts at 1\) |
| `filters` | json | Yes | Array of filter objects. Each filter has type \(e.g., POSTAL_CODE, COMPANY_HEADCOUNT\), values \(array with id, text, selectionType: INCLUDED/EXCLUDED\), and optional selectedSubFilter |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `data` | array | People results |
| ↳ `name` | string | Full name |
| ↳ `summary` | string | Professional summary |
| ↳ `location` | string | Location |
| ↳ `profilePicture` | string | Profile picture URL |
| ↳ `linkedInUrn` | string | LinkedIn URN |
| ↳ `positions` | array | Work positions |
| ↳ `education` | array | Education |
| `pagination` | json | Pagination info |
| ↳ `totalCount` | number | Total results |
| ↳ `returnedCount` | number | Returned count |
| ↳ `start` | number | Start position |
| ↳ `limit` | number | Limit |
### `enrich_search_posts`
Search LinkedIn posts by keywords with date filtering.
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Enrich API key |
| `keywords` | string | Yes | Search keywords \(e.g., "AI automation"\) |
| `datePosted` | string | No | Time filter \(e.g., past_week, past_month\) |
| `page` | number | No | Page number \(default: 1\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `count` | number | Total number of results |
| `posts` | array | Search results |
| ↳ `url` | string | Post URL |
| ↳ `postId` | string | Post ID |
| ↳ `author` | object | Author information |
| ↳ `name` | string | Author name |
| ↳ `headline` | string | Author headline |
| ↳ `linkedInUrl` | string | Author LinkedIn URL |
| ↳ `profileImage` | string | Author profile image |
| ↳ `timestamp` | string | Post timestamp |
| ↳ `textContent` | string | Post text content |
| ↳ `hashtags` | array | Hashtags |
| ↳ `mediaUrls` | array | Media URLs |
| ↳ `reactions` | number | Number of reactions |
| ↳ `commentsCount` | number | Number of comments |
### `enrich_get_post_details`
Get detailed information about a LinkedIn post by URL.
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Enrich API key |
| `url` | string | Yes | LinkedIn post URL |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `postId` | string | Post ID |
| `author` | json | Author information |
| ↳ `name` | string | Author name |
| ↳ `headline` | string | Author headline |
| ↳ `linkedInUrl` | string | Author LinkedIn URL |
| ↳ `profileImage` | string | Author profile image |
| `timestamp` | string | Post timestamp |
| `textContent` | string | Post text content |
| `hashtags` | array | Hashtags |
| `mediaUrls` | array | Media URLs |
| `reactions` | number | Number of reactions |
| `commentsCount` | number | Number of comments |
### `enrich_search_post_reactions`
Get reactions on a LinkedIn post with filtering by reaction type.
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Enrich API key |
| `postUrn` | string | Yes | LinkedIn activity URN \(e.g., urn:li:activity:7231931952839196672\) |
| `reactionType` | string | Yes | Reaction type filter: all, like, love, celebrate, insightful, or funny \(default: all\) |
| `page` | number | Yes | Page number \(starts at 1\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `page` | number | Current page number |
| `totalPage` | number | Total number of pages |
| `count` | number | Number of reactions returned |
| `reactions` | array | Reactions |
| ↳ `reactionType` | string | Type of reaction |
| ↳ `reactor` | object | Person who reacted |
| ↳ `name` | string | Name |
| ↳ `subTitle` | string | Job title |
| ↳ `profileId` | string | Profile ID |
| ↳ `profilePicture` | string | Profile picture URL |
| ↳ `linkedInUrl` | string | LinkedIn URL |
### `enrich_search_post_comments`
Get comments on a LinkedIn post.
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Enrich API key |
| `postUrn` | string | Yes | LinkedIn activity URN \(e.g., urn:li:activity:7191163324208705536\) |
| `page` | number | No | Page number \(starts at 1, default: 1\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `page` | number | Current page number |
| `totalPage` | number | Total number of pages |
| `count` | number | Number of comments returned |
| `comments` | array | Comments |
| ↳ `activityId` | string | Comment activity ID |
| ↳ `commentary` | string | Comment text |
| ↳ `linkedInUrl` | string | Link to comment |
| ↳ `commenter` | object | Commenter info |
| ↳ `profileId` | string | Profile ID |
| ↳ `firstName` | string | First name |
| ↳ `lastName` | string | Last name |
| ↳ `subTitle` | string | Subtitle/headline |
| ↳ `profilePicture` | string | Profile picture URL |
| ↳ `backgroundImage` | string | Background image URL |
| ↳ `entityUrn` | string | Entity URN |
| ↳ `objectUrn` | string | Object URN |
| ↳ `profileType` | string | Profile type |
| ↳ `reactionBreakdown` | object | Reactions on the comment |
| ↳ `likes` | number | Number of likes |
| ↳ `empathy` | number | Number of empathy reactions |
| ↳ `other` | number | Number of other reactions |
### `enrich_search_people_activities`
Get a person's LinkedIn activities (posts, comments, or articles) by profile ID.
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Enrich API key |
| `profileId` | string | Yes | LinkedIn profile ID |
| `activityType` | string | Yes | Activity type: posts, comments, or articles |
| `paginationToken` | string | No | Pagination token for next page of results |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `paginationToken` | string | Token for fetching next page |
| `activityType` | string | Type of activities returned |
| `activities` | array | Activities |
| ↳ `activityId` | string | Activity ID |
| ↳ `commentary` | string | Activity text content |
| ↳ `linkedInUrl` | string | Link to activity |
| ↳ `timeElapsed` | string | Time elapsed since activity |
| ↳ `numReactions` | number | Total number of reactions |
| ↳ `author` | object | Activity author info |
| ↳ `name` | string | Author name |
| ↳ `profileId` | string | Profile ID |
| ↳ `profilePicture` | string | Profile picture URL |
| ↳ `reactionBreakdown` | object | Reactions |
| ↳ `likes` | number | Likes |
| ↳ `empathy` | number | Empathy reactions |
| ↳ `other` | number | Other reactions |
| ↳ `attachments` | array | Attachment URLs |
### `enrich_search_company_activities`
Get a company's LinkedIn activities (posts, comments, or articles) by company ID.
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Enrich API key |
| `companyId` | string | Yes | LinkedIn company ID |
| `activityType` | string | Yes | Activity type: posts, comments, or articles |
| `paginationToken` | string | No | Pagination token for next page of results |
| `offset` | number | No | Number of records to skip \(default: 0\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `paginationToken` | string | Token for fetching next page |
| `activityType` | string | Type of activities returned |
| `activities` | array | Activities |
| ↳ `activityId` | string | Activity ID |
| ↳ `commentary` | string | Activity text content |
| ↳ `linkedInUrl` | string | Link to activity |
| ↳ `timeElapsed` | string | Time elapsed since activity |
| ↳ `numReactions` | number | Total number of reactions |
| ↳ `author` | object | Activity author info |
| ↳ `name` | string | Author name |
| ↳ `profileId` | string | Profile ID |
| ↳ `profilePicture` | string | Profile picture URL |
| ↳ `reactionBreakdown` | object | Reactions |
| ↳ `likes` | number | Likes |
| ↳ `empathy` | number | Empathy reactions |
| ↳ `other` | number | Other reactions |
| ↳ `attachments` | array | Attachments |
### `enrich_reverse_hash_lookup`
Convert an MD5 email hash back to the original email address and display name.
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Enrich API key |
| `hash` | string | Yes | MD5 hash value to look up |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `hash` | string | MD5 hash that was looked up |
| `email` | string | Original email address |
| `displayName` | string | Display name associated with the email |
| `found` | boolean | Whether an email was found for the hash |
### `enrich_search_logo`
Get a company logo image URL by domain.
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Enrich API key |
| `url` | string | Yes | Company domain (e.g., google.com) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `logoUrl` | string | URL to fetch the company logo |
| `domain` | string | Domain that was looked up |

View File

@@ -10,6 +10,23 @@ import { BlockInfoCard } from "@/components/ui/block-info-card"
color="#181C1E"
/>
{/* MANUAL-CONTENT-START:intro */}
[GitHub](https://github.com/) is the world's leading platform for hosting, collaborating on, and managing source code. GitHub offers powerful tools for version control, code review, branching strategies, and team collaboration within the rich Git ecosystem, underpinning both open-source and enterprise development worldwide.
The GitHub integration in Sim allows your agents to seamlessly automate, interact with, and orchestrate workflows across your repositories. Using this integration, agents can perform an extended set of code and collaboration operations, enabling:
- **Fetch pull request details:** Retrieve a full overview of any pull request, including file diffs, branch information, metadata, approvals, and a summary of changes, for automation or review workflows.
- **Create pull request comments:** Automatically generate or post comments on PRs—such as reviews, suggestions, or status updates—enabling speedy feedback, documentation, or policy enforcement.
- **Get repository information:** Access comprehensive repository metadata, including descriptions, visibility, topics, default branches, and contributors. This supports intelligent project analysis, dynamic workflow routing, and organizational reporting.
- **Fetch the latest commit:** Quickly obtain details from the newest commit on any branch, including hashes, messages, authors, and timestamps. This is useful for monitoring development velocity, triggering downstream actions, or enforcing quality checks.
- **Trigger workflows from GitHub events:** Set up Sim workflows to start automatically from key GitHub events, including pull request creation, review comments, or when new commits are pushed, through easy webhook integration. Automate actions such as deployments, notifications, compliance checks, or documentation updates in real time.
- **Monitor and manage repository activity:** Programmatically track contributions, manage PR review states, analyze branch histories, and audit code changes. Empower agents to enforce requirements, coordinate releases, and respond dynamically to development patterns.
- **Support for advanced automations:** Combine these operations—for example, fetch PR data, leave context-aware comments, and kick off multi-step Sim workflows on code pushes or PR merges—to automate your team's engineering processes from end to end.
By leveraging all of these capabilities, the Sim GitHub integration enables agents to engage deeply in the development lifecycle. Automate code reviews, streamline team feedback, synchronize project artifacts, accelerate CI/CD, and enforce best practices with ease. Bring security, speed, and reliability to your workflows—directly within your Sim-powered automation environment, with full integration into your organization's GitHub strategy.
{/* MANUAL-CONTENT-END */}
## Usage Instructions
Integrate GitHub into the workflow. Can get PR details, create PR comments, get repository info, and get the latest commit. Can be used in trigger mode to trigger a workflow when a PR is created, commented on, or a commit is pushed.
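For orientation, these operations correspond to standard GitHub REST endpoints; the Sim block handles authentication and parameter wiring for you. A direct-API sketch of two of them (fetching PR details and posting a PR comment), with placeholder owner, repo, and PR number values:

```typescript
// Direct GitHub REST calls corresponding to two of the block's operations.
// Token handling and the owner/repo/number values are placeholders for illustration.
const GITHUB_API = 'https://api.github.com'
const token = process.env.GITHUB_TOKEN! // assumed to be set in the environment

async function getPullRequest(owner: string, repo: string, number: number) {
  const res = await fetch(`${GITHUB_API}/repos/${owner}/${repo}/pulls/${number}`, {
    headers: { Authorization: `Bearer ${token}`, Accept: 'application/vnd.github+json' },
  })
  if (!res.ok) throw new Error(`GitHub API error: ${res.status}`)
  return res.json() // title, head/base branches, merge state, etc.
}

async function createPrComment(owner: string, repo: string, number: number, body: string) {
  // PR conversation comments are created through the Issues API.
  const res = await fetch(`${GITHUB_API}/repos/${owner}/${repo}/issues/${number}/comments`, {
    method: 'POST',
    headers: { Authorization: `Bearer ${token}`, Accept: 'application/vnd.github+json' },
    body: JSON.stringify({ body }),
  })
  if (!res.ok) throw new Error(`GitHub API error: ${res.status}`)
  return res.json()
}
```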

View File

@@ -11,55 +11,17 @@ import { BlockInfoCard } from "@/components/ui/block-info-card"
/>
{/* MANUAL-CONTENT-START:intro */}
[Google Docs](https://docs.google.com) is a powerful cloud-based document creation and editing service that allows users to create, edit, and collaborate on documents in real-time. As part of Google's productivity suite, Google Docs offers a versatile platform for text documents with robust formatting, commenting, and sharing capabilities.
[Google Docs](https://docs.google.com) is Google's collaborative, cloud-based document service, enabling users to create, edit, and share documents in real time. As an integral part of Google Workspace, Docs offers rich formatting tools, commenting, version history, and seamless integration with other Google productivity tools.
Learn how to integrate the Google Docs "Read" tool in Sim to effortlessly fetch data from your docs and to integrate into your workflows. This tutorial walks you through connecting Google Docs, setting up data reads, and using that information to automate processes in real-time. Perfect for syncing live data with your agents.
Google Docs empowers individuals and teams to:
<iframe
width="100%"
height="400"
src="https://www.youtube.com/embed/f41gy9rBHhE"
title="Use the Google Docs Read tool in Sim"
frameBorder="0"
allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture"
allowFullScreen
></iframe>
- **Create and format documents:** Develop rich text documents with advanced formatting, images, and tables.
- **Collaborate and comment:** Multiple users can edit and comment with suggestions instantly.
- **Track changes and version history:** Review, revert, and manage revisions over time.
- **Access from any device:** Work on documents from web, mobile, or desktop with full cloud synchronization.
- **Integrate across Google services:** Connect Docs with Drive, Sheets, Slides, and external platforms for powerful workflows.
Learn how to integrate the Google Docs "Update" tool in Sim to effortlessly add content in your docs through your workflows. This tutorial walks you through connecting Google Docs, configuring data writes, and using that information to automate document updates seamlessly. Perfect for maintaining dynamic, real-time documentation with minimal effort.
<iframe
width="100%"
height="400"
src="https://www.youtube.com/embed/L64ROHS2ivA"
title="Use the Google Docs Update tool in Sim"
frameBorder="0"
allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture"
allowFullScreen
></iframe>
Learn how to integrate the Google Docs "Create" tool in Sim to effortlessly generate new documents through your workflows. This tutorial walks you through connecting Google Docs, setting up document creation, and using workflow data to populate content automatically. Perfect for streamlining document generation and enhancing productivity.
<iframe
width="100%"
height="400"
src="https://www.youtube.com/embed/lWpHH4qddWk"
title="Use the Google Docs Create tool in Sim"
frameBorder="0"
allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture"
allowFullScreen
></iframe>
With Google Docs, you can:
- **Create and edit documents**: Develop text documents with comprehensive formatting options
- **Collaborate in real-time**: Work simultaneously with multiple users on the same document
- **Track changes**: View revision history and restore previous versions
- **Comment and suggest**: Provide feedback and propose edits without changing the original content
- **Access anywhere**: Use Google Docs across devices with automatic cloud synchronization
- **Work offline**: Continue working without internet connection with changes syncing when back online
- **Integrate with other services**: Connect with Google Drive, Sheets, Slides, and third-party applications
In Sim, the Google Docs integration enables your agents to interact directly with document content programmatically. This allows for powerful automation scenarios such as document creation, content extraction, collaborative editing, and document management. Your agents can read existing documents to extract information, write to documents to update content, and create new documents from scratch. This integration bridges the gap between your AI workflows and document management, enabling seamless interaction with one of the world's most widely used document platforms. By connecting Sim with Google Docs, you can automate document workflows, generate reports, extract insights from documents, and maintain documentation - all through your intelligent agents.
In Sim, the Google Docs integration allows your agents to read document content, write new content, and create documents programmatically as part of automated workflows. This integration unlocks automation such as document generation, report writing, content extraction, and collaborative editing—bridging the gap between AI-driven workflows and document management in your organization.
{/* MANUAL-CONTENT-END */}

View File

@@ -11,30 +11,18 @@ import { BlockInfoCard } from "@/components/ui/block-info-card"
/>
{/* MANUAL-CONTENT-START:intro */}
[Google Drive](https://drive.google.com) is Google's cloud storage and file synchronization service that allows users to store files, synchronize files across devices, and share files with others. As a core component of Google's productivity ecosystem, Google Drive offers robust storage, organization, and collaboration capabilities.
[Google Drive](https://drive.google.com) is Google's cloud-based file storage and synchronization service, making it easy to store, manage, share, and access files securely across devices and platforms. As a core element of Google Workspace, Google Drive offers robust tools for file organization, collaboration, and seamless integration with the broader productivity suite.
Learn how to integrate the Google Drive tool in Sim to effortlessly pull information from your Drive through your workflows. This tutorial walks you through connecting Google Drive, setting up data retrieval, and using stored documents and files to enhance automation. Perfect for syncing important data with your agents in real-time.
Google Drive enables individuals and teams to:
<iframe
width="100%"
height="400"
src="https://www.youtube.com/embed/cRoRr4b-EAs"
title="Use the Google Drive tool in Sim"
frameBorder="0"
allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture"
allowFullScreen
></iframe>
- **Store files in the cloud:** Access documents, images, videos, and more from anywhere with internet connectivity.
- **Organize and manage content:** Create and arrange folders, use naming conventions, and leverage search for fast retrieval.
- **Share and collaborate:** Control file and folder permissions, share with individuals or groups, and collaborate in real time.
- **Leverage powerful search:** Quickly locate files using Google's search technology.
- **Access across devices:** Work with your files on desktop, mobile, or web with full synchronization.
- **Integrate deeply across Google services:** Connect with Google Docs, Sheets, Slides, and partner applications in your workflows.
With Google Drive, you can:
- **Store files in the cloud**: Upload and access your files from anywhere with internet access
- **Organize content**: Create folders, use color coding, and implement naming conventions
- **Share and collaborate**: Control access permissions and work simultaneously on files
- **Search efficiently**: Find files quickly with Google's powerful search technology
- **Access across devices**: Use Google Drive on desktop, mobile, and web platforms
- **Integrate with other services**: Connect with Google Docs, Sheets, Slides, and third-party applications
In Sim, the Google Drive integration enables your agents to interact directly with your cloud storage programmatically. This allows for powerful automation scenarios such as file management, content organization, and document workflows. Your agents can upload new files to specific folders, download existing files to process their contents, and list folder contents to navigate your storage structure. This integration bridges the gap between your AI workflows and your document management system, enabling seamless file operations without manual intervention. By connecting Sim with Google Drive, you can automate file-based workflows, manage documents intelligently, and incorporate cloud storage operations into your agent's capabilities.
In Sim, the Google Drive integration allows your agents to read, upload, download, list, and organize your Drive files programmatically. Agents can automate file management, streamline content workflows, and enable no-code automation around document storage and retrieval. By connecting Sim with Google Drive, you empower your agents to incorporate cloud file operations directly into intelligent business processes.
{/* MANUAL-CONTENT-END */}

View File

@@ -11,29 +11,9 @@ import { BlockInfoCard } from "@/components/ui/block-info-card"
/>
{/* MANUAL-CONTENT-START:intro */}
[Google Search](https://www.google.com) is the world's most widely used search engine, providing access to billions of web pages and information sources. Google Search uses sophisticated algorithms to deliver relevant search results based on user queries, making it an essential tool for finding information on the internet.
[Google Search](https://www.google.com) is the world's most widely used web search engine, making it easy to find information, discover new content, and answer questions in real time. With advanced search algorithms, Google Search helps you quickly locate web pages, images, news, and more using simple or complex queries.
Learn how to integrate the Google Search tool in Sim to effortlessly fetch real-time search results through your workflows. This tutorial walks you through connecting Google Search, configuring search queries, and using live data to enhance automation. Perfect for powering your agents with up-to-date information and smarter decision-making.
<iframe
width="100%"
height="400"
src="https://www.youtube.com/embed/1B7hV9b5UMQ"
title="Use the Google Search tool in Sim"
frameBorder="0"
allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture"
allowFullScreen
></iframe>
With Google Search, you can:
- **Find relevant information**: Access billions of web pages with Google's powerful search algorithms
- **Get specific results**: Use search operators to refine and target your queries
- **Discover diverse content**: Find text, images, videos, news, and other content types
- **Access knowledge graphs**: Get structured information about people, places, and things
- **Utilize search features**: Take advantage of specialized search tools like calculators, unit converters, and more
In Sim, the Google Search integration enables your agents to search the web programmatically and incorporate search results into their workflows. This allows for powerful automation scenarios such as research, fact-checking, data gathering, and information synthesis. Your agents can formulate search queries, retrieve relevant results, and extract information from those results to make decisions or generate insights. This integration bridges the gap between your AI workflows and the vast information available on the web, enabling your agents to access up-to-date information from across the internet. By connecting Sim with Google Search, you can create agents that stay informed with the latest information, verify facts, conduct research, and provide users with relevant web content - all without leaving your workflow.
In Sim, the Google Search integration allows your agents to search the web and retrieve live information as part of automated workflows. This enables powerful use cases such as automated research, fact-checking, knowledge synthesis, and dynamic content discovery. By connecting Sim with Google Search, your agents can perform queries, process and analyze web results, and incorporate the latest information into their decisions—without manual effort. Enhance your workflows with always up-to-date knowledge from across the internet.
{/* MANUAL-CONTENT-END */}

View File

@@ -10,6 +10,20 @@ import { BlockInfoCard } from "@/components/ui/block-info-card"
color="#F64F9E"
/>
{/* MANUAL-CONTENT-START:intro */}
The Memory tool enables your agents to store, retrieve, and manage conversation memories across workflows. It acts as a persistent memory store that agents can access to maintain conversation context, recall facts, or track actions over time.
With the Memory tool, you can:
- **Add new memories**: Store relevant information, events, or conversation history by saving agent or user messages into a structured memory database
- **Retrieve memories**: Fetch specific memories or all memories tied to a conversation, helping agents recall previous interactions or facts
- **Delete memories**: Remove outdated or incorrect memories from the database to maintain accurate context
- **Append to existing conversations**: Update or expand on existing memory threads by appending new messages with the same conversation identifier
Sim's Memory block is especially useful for building agents that require persistent state—helping them remember what was said earlier in a conversation, persist facts between tasks, or apply long-term history in decision-making. By integrating Memory, you enable richer, more contextual, and more dynamic workflows for your agents.
{/* MANUAL-CONTENT-END */}
## Usage Instructions
Integrate Memory into the workflow. Can add a memory, get a memory, get all memories, and delete memories.
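Conceptually, the Memory block behaves like a store of message threads keyed by a conversation identifier. The sketch below is a hypothetical model of the operations described above (add/append, get one, get all, delete); it is not Sim's actual implementation or API:

```typescript
// Hypothetical model of the Memory block's operations; not Sim's actual API.
type MemoryMessage = { role: 'user' | 'agent'; content: string; at: Date }

class ConversationMemory {
  private store = new Map<string, MemoryMessage[]>()

  /** Add a memory; appending with the same identifier extends the thread. */
  add(conversationId: string, message: MemoryMessage): void {
    const thread = this.store.get(conversationId) ?? []
    thread.push(message)
    this.store.set(conversationId, thread)
  }

  /** Get a single conversation's memories. */
  get(conversationId: string): MemoryMessage[] {
    return this.store.get(conversationId) ?? []
  }

  /** Get all memories across conversations. */
  getAll(): Record<string, MemoryMessage[]> {
    return Object.fromEntries(this.store)
  }

  /** Delete a conversation's memories, e.g. to drop outdated context. */
  delete(conversationId: string): boolean {
    return this.store.delete(conversationId)
  }
}
```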

View File

@@ -24,6 +24,7 @@
"dynamodb",
"elasticsearch",
"elevenlabs",
"enrich",
"exa",
"file",
"firecrawl",

View File

@@ -10,6 +10,21 @@ import { BlockInfoCard } from "@/components/ui/block-info-card"
color="#181C1E"
/>
{/* MANUAL-CONTENT-START:intro */}
The Notion tool integration enables your agents to read, create, and manage Notion pages and databases directly within your workflows. This allows you to automate the retrieval and updating of structured content, notes, documents, and more from your Notion workspace.
With the Notion tool, you can:
- **Read pages or databases**: Extract rich content or metadata from specified Notion pages or entire databases
- **Create new content**: Programmatically create new pages or databases for dynamic content generation
- **Append content**: Add new blocks or properties to existing pages and databases
- **Query databases**: Run advanced filters and searches on structured Notion data for custom workflows
- **Search your workspace**: Locate pages and databases across your Notion workspace automatically
This tool is ideal for scenarios where agents need to synchronize information, generate reports, or maintain structured notes within Notion. By bringing Notion's capabilities into automated workflows, you empower your agents to interface with knowledge, documentation, and project management data programmatically and seamlessly.
{/* MANUAL-CONTENT-END */}
## Usage Instructions
Integrate Notion into the workflow. Can read page, read database, create page, create database, append content, query database, and search workspace.
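These operations map onto standard Notion API endpoints; the block manages OAuth and request construction for you. For orientation only, querying a database and searching a workspace directly look roughly like this (the token, database ID, and filter are placeholders):

```typescript
// Direct Notion API calls for two of the operations above; the Sim block
// normally handles auth and request construction. IDs and filters are placeholders.
const NOTION_API = 'https://api.notion.com/v1'
const headers = {
  Authorization: `Bearer ${process.env.NOTION_TOKEN}`, // assumed to be set
  'Notion-Version': '2022-06-28',
  'Content-Type': 'application/json',
}

async function queryDatabase(databaseId: string) {
  const res = await fetch(`${NOTION_API}/databases/${databaseId}/query`, {
    method: 'POST',
    headers,
    body: JSON.stringify({
      filter: { property: 'Status', status: { equals: 'Done' } }, // example filter
      page_size: 25,
    }),
  })
  if (!res.ok) throw new Error(`Notion API error: ${res.status}`)
  return res.json()
}

async function searchWorkspace(query: string) {
  const res = await fetch(`${NOTION_API}/search`, {
    method: 'POST',
    headers,
    body: JSON.stringify({ query, page_size: 10 }),
  })
  if (!res.ok) throw new Error(`Notion API error: ${res.status}`)
  return res.json()
}
```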

View File

@@ -11,7 +11,7 @@ import { BlockInfoCard } from "@/components/ui/block-info-card"
/>
{/* MANUAL-CONTENT-START:intro */}
The [Pulse](https://www.pulseapi.com/) tool enables seamless extraction of text and structured content from a wide variety of documents—including PDFs, images, and Office files—using state-of-the-art OCR (Optical Character Recognition) powered by Pulse. Designed for automated agentic workflows, Pulse Parser makes it easy to unlock valuable information trapped in unstructured documents and integrate the extracted content directly into your workflow.
The [Pulse](https://www.runpulse.com) tool enables seamless extraction of text and structured content from a wide variety of documents—including PDFs, images, and Office files—using state-of-the-art OCR (Optical Character Recognition) powered by Pulse. Designed for automated agentic workflows, Pulse Parser makes it easy to unlock valuable information trapped in unstructured documents and integrate the extracted content directly into your workflow.
With Pulse, you can:

View File

@@ -13,16 +13,6 @@ import { BlockInfoCard } from "@/components/ui/block-info-card"
{/* MANUAL-CONTENT-START:intro */}
[Slack](https://www.slack.com/) is a business communication platform that offers teams a unified place for messaging, tools, and files.
<iframe
width="100%"
height="400"
src="https://www.youtube.com/embed/J5jz3UaWmE8"
title="Slack Integration with Sim"
frameBorder="0"
allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture"
allowFullScreen
></iframe>
With Slack, you can:
- **Automate agent notifications**: Send real-time updates from your Sim agents to any Slack channel

Binary file not shown.


View File

@@ -1,6 +1,6 @@
import { redirect } from 'next/navigation'
import { getEnv, isTruthy } from '@/lib/core/config/env'
import SSOForm from '@/app/(auth)/sso/sso-form'
import SSOForm from '@/ee/sso/components/sso-form'
export const dynamic = 'force-dynamic'

View File

@@ -8,6 +8,7 @@ import { verifyCronAuth } from '@/lib/auth/internal'
const logger = createLogger('CleanupStaleExecutions')
const STALE_THRESHOLD_MINUTES = 30
const MAX_INT32 = 2_147_483_647
export async function GET(request: NextRequest) {
try {
@@ -45,13 +46,14 @@ export async function GET(request: NextRequest) {
try {
const staleDurationMs = Date.now() - new Date(execution.startedAt).getTime()
const staleDurationMinutes = Math.round(staleDurationMs / 60000)
const totalDurationMs = Math.min(staleDurationMs, MAX_INT32)
await db
.update(workflowExecutionLogs)
.set({
status: 'failed',
endedAt: new Date(),
totalDurationMs: staleDurationMs,
totalDurationMs,
executionData: sql`jsonb_set(
COALESCE(execution_data, '{}'::jsonb),
ARRAY['error'],

View File

@@ -284,7 +284,7 @@ async function handleToolsCall(
content: [
{ type: 'text', text: JSON.stringify(executeResult.output || executeResult, null, 2) },
],
isError: !executeResult.success,
isError: executeResult.success === false,
}
return NextResponse.json(createResponse(id, result))

View File

@@ -20,6 +20,7 @@ import { z } from 'zod'
import { getEmailSubject, renderInvitationEmail } from '@/components/emails'
import { getSession } from '@/lib/auth'
import { hasAccessControlAccess } from '@/lib/billing'
import { syncUsageLimitsFromSubscription } from '@/lib/billing/core/usage'
import { requireStripeClient } from '@/lib/billing/stripe-client'
import { getBaseUrl } from '@/lib/core/utils/urls'
import { sendEmail } from '@/lib/messaging/email/mailer'
@@ -501,6 +502,18 @@ export async function PUT(
}
}
if (status === 'accepted') {
try {
await syncUsageLimitsFromSubscription(session.user.id)
} catch (syncError) {
logger.error('Failed to sync usage limits after joining org', {
userId: session.user.id,
organizationId,
error: syncError,
})
}
}
logger.info(`Organization invitation ${status}`, {
organizationId,
invitationId,

View File

@@ -29,7 +29,7 @@ import { hasWorkspaceAdminAccess } from '@/lib/workspaces/permissions/utils'
import {
InvitationsNotAllowedError,
validateInvitationsAllowed,
} from '@/executor/utils/permission-check'
} from '@/ee/access-control/utils/permission-check'
const logger = createLogger('OrganizationInvitations')

View File

@@ -5,6 +5,7 @@ import { and, eq } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { getSession } from '@/lib/auth'
import { hasActiveSubscription } from '@/lib/billing'
const logger = createLogger('SubscriptionTransferAPI')
@@ -88,6 +89,14 @@ export async function POST(request: NextRequest, { params }: { params: Promise<{
)
}
// Check if org already has an active subscription (prevent duplicates)
if (await hasActiveSubscription(organizationId)) {
return NextResponse.json(
{ error: 'Organization already has an active subscription' },
{ status: 409 }
)
}
await db
.update(subscription)
.set({ referenceId: organizationId })

View File

@@ -203,6 +203,10 @@ export const PATCH = withAdminAuthParams<RouteParams>(async (request, context) =
}
updateData.billingBlocked = body.billingBlocked
// Clear the reason when unblocking
if (body.billingBlocked === false) {
updateData.billingBlockedReason = null
}
updated.push('billingBlocked')
}

View File

@@ -1,6 +1,4 @@
import { db, workflow as workflowTable } from '@sim/db'
import { createLogger } from '@sim/logger'
import { eq } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { v4 as uuidv4 } from 'uuid'
import { z } from 'zod'
@@ -8,6 +6,7 @@ import { checkHybridAuth } from '@/lib/auth/hybrid'
import { generateRequestId } from '@/lib/core/utils/request'
import { SSE_HEADERS } from '@/lib/core/utils/sse'
import { markExecutionCancelled } from '@/lib/execution/cancellation'
import { preprocessExecution } from '@/lib/execution/preprocessing'
import { LoggingSession } from '@/lib/logs/execution/logging-session'
import { executeWorkflowCore } from '@/lib/workflows/executor/execution-core'
import { createSSECallbacks } from '@/lib/workflows/executor/execution-events'
@@ -75,12 +74,31 @@ export async function POST(req: NextRequest, { params }: { params: Promise<{ id:
const { startBlockId, sourceSnapshot, input } = validation.data
const executionId = uuidv4()
const [workflowRecord] = await db
.select({ workspaceId: workflowTable.workspaceId, userId: workflowTable.userId })
.from(workflowTable)
.where(eq(workflowTable.id, workflowId))
.limit(1)
// Run preprocessing checks (billing, rate limits, usage limits)
const preprocessResult = await preprocessExecution({
workflowId,
userId,
triggerType: 'manual',
executionId,
requestId,
checkRateLimit: false, // Manual executions don't rate limit
checkDeployment: false, // Run-from-block doesn't require deployment
})
if (!preprocessResult.success) {
const { error } = preprocessResult
logger.warn(`[${requestId}] Preprocessing failed for run-from-block`, {
workflowId,
error: error?.message,
statusCode: error?.statusCode,
})
return NextResponse.json(
{ error: error?.message || 'Execution blocked' },
{ status: error?.statusCode || 500 }
)
}
const workflowRecord = preprocessResult.workflowRecord
if (!workflowRecord?.workspaceId) {
return NextResponse.json({ error: 'Workflow not found or has no workspace' }, { status: 404 })
}
@@ -92,6 +110,7 @@ export async function POST(req: NextRequest, { params }: { params: Promise<{ id:
workflowId,
startBlockId,
executedBlocksCount: sourceSnapshot.executedBlocks.length,
billingActorUserId: preprocessResult.actorUserId,
})
const loggingSession = new LoggingSession(workflowId, executionId, 'manual', requestId)

View File

@@ -567,6 +567,7 @@ export async function POST(req: NextRequest, { params }: { params: Promise<{ id:
blockId: string,
blockName: string,
blockType: string,
executionOrder: number,
iterationContext?: IterationContext
) => {
logger.info(`[${requestId}] 🔷 onBlockStart called:`, { blockId, blockName, blockType })
@@ -579,6 +580,7 @@ export async function POST(req: NextRequest, { params }: { params: Promise<{ id:
blockId,
blockName,
blockType,
executionOrder,
...(iterationContext && {
iterationCurrent: iterationContext.iterationCurrent,
iterationTotal: iterationContext.iterationTotal,
@@ -617,6 +619,7 @@ export async function POST(req: NextRequest, { params }: { params: Promise<{ id:
error: callbackData.output.error,
durationMs: callbackData.executionTime || 0,
startedAt: callbackData.startedAt,
executionOrder: callbackData.executionOrder,
endedAt: callbackData.endedAt,
...(iterationContext && {
iterationCurrent: iterationContext.iterationCurrent,
@@ -644,6 +647,7 @@ export async function POST(req: NextRequest, { params }: { params: Promise<{ id:
output: callbackData.output,
durationMs: callbackData.executionTime || 0,
startedAt: callbackData.startedAt,
executionOrder: callbackData.executionOrder,
endedAt: callbackData.endedAt,
...(iterationContext && {
iterationCurrent: iterationContext.iterationCurrent,

View File

@@ -102,7 +102,7 @@ describe('Workspace Invitations API Route', () => {
inArray: vi.fn().mockImplementation((field, values) => ({ type: 'inArray', field, values })),
}))
vi.doMock('@/executor/utils/permission-check', () => ({
vi.doMock('@/ee/access-control/utils/permission-check', () => ({
validateInvitationsAllowed: vi.fn().mockResolvedValue(undefined),
InvitationsNotAllowedError: class InvitationsNotAllowedError extends Error {
constructor() {

View File

@@ -21,7 +21,7 @@ import { getFromEmailAddress } from '@/lib/messaging/email/utils'
import {
InvitationsNotAllowedError,
validateInvitationsAllowed,
} from '@/executor/utils/permission-check'
} from '@/ee/access-control/utils/permission-check'
export const dynamic = 'force-dynamic'
@@ -38,7 +38,6 @@ export async function GET(req: NextRequest) {
}
try {
// Get all workspaces where the user has permissions
const userWorkspaces = await db
.select({ id: workspace.id })
.from(workspace)
@@ -55,10 +54,8 @@ export async function GET(req: NextRequest) {
return NextResponse.json({ invitations: [] })
}
// Get all workspaceIds where the user is a member
const workspaceIds = userWorkspaces.map((w) => w.id)
// Find all invitations for those workspaces
const invitations = await db
.select()
.from(workspaceInvitation)

View File

@@ -14,11 +14,11 @@ import {
ChatMessageContainer,
EmailAuth,
PasswordAuth,
SSOAuth,
VoiceInterface,
} from '@/app/chat/components'
import { CHAT_ERROR_MESSAGES, CHAT_REQUEST_TIMEOUT_MS } from '@/app/chat/constants'
import { useAudioStreaming, useChatStreaming } from '@/app/chat/hooks'
import SSOAuth from '@/ee/sso/components/sso-auth'
const logger = createLogger('ChatClient')

View File

@@ -1,6 +1,5 @@
export { default as EmailAuth } from './auth/email/email-auth'
export { default as PasswordAuth } from './auth/password/password-auth'
export { default as SSOAuth } from './auth/sso/sso-auth'
export { ChatErrorState } from './error-state/error-state'
export { ChatHeader } from './header/header'
export { ChatInput } from './input/input'

View File

@@ -1,7 +1,7 @@
import { redirect } from 'next/navigation'
import { getSession } from '@/lib/auth'
import { verifyWorkspaceMembership } from '@/app/api/workflows/utils'
import { getUserPermissionConfig } from '@/executor/utils/permission-check'
import { getUserPermissionConfig } from '@/ee/access-control/utils/permission-check'
import { Knowledge } from './knowledge'
interface KnowledgePageProps {
@@ -23,7 +23,6 @@ export default async function KnowledgePage({ params }: KnowledgePageProps) {
redirect('/')
}
// Check permission group restrictions
const permissionConfig = await getUserPermissionConfig(session.user.id)
if (permissionConfig?.hideKnowledgeBaseTab) {
redirect(`/workspace/${workspaceId}`)

View File

@@ -104,14 +104,12 @@ function FileCard({ file, isExecutionFile = false, workspaceId }: FileCardProps)
}
return (
<div className='flex flex-col gap-[8px] rounded-[6px] bg-[var(--surface-1)] px-[10px] py-[8px]'>
<div className='flex items-center justify-between'>
<div className='flex items-center gap-[8px]'>
<span className='truncate font-medium text-[12px] text-[var(--text-secondary)]'>
{file.name}
</span>
</div>
<span className='font-medium text-[12px] text-[var(--text-tertiary)]'>
<div className='flex flex-col gap-[4px] rounded-[6px] bg-[var(--surface-1)] px-[8px] py-[6px]'>
<div className='flex min-w-0 items-center justify-between gap-[8px]'>
<span className='min-w-0 flex-1 truncate font-medium text-[12px] text-[var(--text-secondary)]'>
{file.name}
</span>
<span className='flex-shrink-0 font-medium text-[12px] text-[var(--text-tertiary)]'>
{formatFileSize(file.size)}
</span>
</div>
@@ -142,20 +140,18 @@ export function FileCards({ files, isExecutionFile = false, workspaceId }: FileC
}
return (
<div className='flex w-full flex-col gap-[6px] rounded-[6px] bg-[var(--surface-2)] px-[10px] py-[8px]'>
<div className='mt-[4px] flex flex-col gap-[6px] rounded-[6px] border border-[var(--border)] bg-[var(--surface-2)] px-[10px] py-[8px] dark:bg-transparent'>
<span className='font-medium text-[12px] text-[var(--text-tertiary)]'>
Files ({files.length})
</span>
<div className='flex flex-col gap-[8px]'>
{files.map((file, index) => (
<FileCard
key={file.id || `file-${index}`}
file={file}
isExecutionFile={isExecutionFile}
workspaceId={workspaceId}
/>
))}
</div>
{files.map((file, index) => (
<FileCard
key={file.id || `file-${index}`}
file={file}
isExecutionFile={isExecutionFile}
workspaceId={workspaceId}
/>
))}
</div>
)
}

View File

@@ -18,6 +18,7 @@ import {
import { ScrollArea } from '@/components/ui/scroll-area'
import { BASE_EXECUTION_CHARGE } from '@/lib/billing/constants'
import { cn } from '@/lib/core/utils/cn'
import { formatDuration } from '@/lib/core/utils/formatting'
import { filterHiddenOutputKeys } from '@/lib/logs/execution/trace-spans/trace-spans'
import {
ExecutionSnapshot,
@@ -453,7 +454,7 @@ export const LogDetails = memo(function LogDetails({
Duration
</span>
<span className='font-medium text-[13px] text-[var(--text-secondary)]'>
{log.duration || '—'}
{formatDuration(log.duration, { precision: 2 }) || '—'}
</span>
</div>

View File

@@ -6,11 +6,11 @@ import Link from 'next/link'
import { List, type RowComponentProps, useListRef } from 'react-window'
import { Badge, buttonVariants } from '@/components/emcn'
import { cn } from '@/lib/core/utils/cn'
import { formatDuration } from '@/lib/core/utils/formatting'
import {
DELETED_WORKFLOW_COLOR,
DELETED_WORKFLOW_LABEL,
formatDate,
formatDuration,
getDisplayStatus,
LOG_COLUMNS,
StatusBadge,
@@ -113,7 +113,7 @@ const LogRow = memo(
<div className={`${LOG_COLUMNS.duration.width} ${LOG_COLUMNS.duration.minWidth}`}>
<Badge variant='default' className='rounded-[6px] px-[9px] py-[2px] text-[12px]'>
{formatDuration(log.duration) || '—'}
{formatDuration(log.duration, { precision: 2 }) || '—'}
</Badge>
</div>
</div>

View File

@@ -1,6 +1,7 @@
import React from 'react'
import { format } from 'date-fns'
import { Badge } from '@/components/emcn'
import { formatDuration } from '@/lib/core/utils/formatting'
import { getIntegrationMetadata } from '@/lib/logs/get-trigger-options'
import { getBlock } from '@/blocks/registry'
import { CORE_TRIGGER_TYPES } from '@/stores/logs/filters/types'
@@ -362,47 +363,14 @@ export function mapToExecutionLogAlt(log: RawLogResponse): ExecutionLog {
}
}
/**
* Format duration for display in logs UI
* If duration is under 1 second, displays as milliseconds (e.g., "500ms")
* If duration is 1 second or more, displays as seconds (e.g., "1.23s")
* @param duration - Duration string (e.g., "500ms") or null
* @returns Formatted duration string or null
*/
export function formatDuration(duration: string | null): string | null {
if (!duration) return null
// Extract numeric value from duration string (e.g., "500ms" -> 500)
const ms = Number.parseInt(duration.replace(/[^0-9]/g, ''), 10)
if (!Number.isFinite(ms)) return duration
if (ms < 1000) {
return `${ms}ms`
}
// Convert to seconds with up to 2 decimal places
const seconds = ms / 1000
return `${seconds.toFixed(2).replace(/\.?0+$/, '')}s`
}
/**
* Format latency value for display in dashboard UI
* If latency is under 1 second, displays as milliseconds (e.g., "500ms")
* If latency is 1 second or more, displays as seconds (e.g., "1.23s")
* @param ms - Latency in milliseconds (number)
* @returns Formatted latency string
*/
export function formatLatency(ms: number): string {
if (!Number.isFinite(ms) || ms <= 0) return '—'
if (ms < 1000) {
return `${Math.round(ms)}ms`
}
// Convert to seconds with up to 2 decimal places
const seconds = ms / 1000
return `${seconds.toFixed(2).replace(/\.?0+$/, '')}s`
return formatDuration(ms, { precision: 2 }) ?? '—'
}
export const formatDate = (dateString: string) => {

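The two removed helpers above are consolidated into a single `formatDuration` imported from `@/lib/core/utils/formatting`. Its implementation is not shown in this diff; a sketch consistent with the removed behavior and the visible call sites (string or numeric input, optional `precision`) might look like:

```typescript
// Sketch of a consolidated duration formatter consistent with the removed
// helpers and the call sites in this diff; the real implementation in
// '@/lib/core/utils/formatting' may differ.
interface FormatDurationOptions {
  /** Max decimal places when rendering seconds (default 2). */
  precision?: number
}

export function formatDuration(
  duration: string | number | null | undefined,
  { precision = 2 }: FormatDurationOptions = {}
): string | null {
  if (duration === null || duration === undefined || duration === '') return null

  // Accept either a raw millisecond number or a string like "500ms".
  const ms =
    typeof duration === 'number'
      ? duration
      : Number.parseInt(duration.replace(/[^0-9]/g, ''), 10)

  if (!Number.isFinite(ms)) return typeof duration === 'string' ? duration : null
  if (ms < 1000) return `${Math.round(ms)}ms`

  // 1s and above: seconds with up to `precision` decimals, trailing zeros trimmed.
  const seconds = (ms / 1000).toFixed(precision).replace(/\.?0+$/, '')
  return `${seconds}s`
}
```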
View File

@@ -1,10 +1,11 @@
'use client'
import type React from 'react'
import { createContext, useCallback, useContext, useEffect, useMemo, useState } from 'react'
import { createContext, useCallback, useContext, useEffect, useMemo, useRef, useState } from 'react'
import { createLogger } from '@sim/logger'
import { useQueryClient } from '@tanstack/react-query'
import { useParams } from 'next/navigation'
import { useSocket } from '@/app/workspace/providers/socket-provider'
import {
useWorkspacePermissionsQuery,
type WorkspacePermissions,
@@ -57,14 +58,42 @@ export function WorkspacePermissionsProvider({ children }: WorkspacePermissionsP
const [hasShownOfflineNotification, setHasShownOfflineNotification] = useState(false)
const hasOperationError = useOperationQueueStore((state) => state.hasOperationError)
const addNotification = useNotificationStore((state) => state.addNotification)
const removeNotification = useNotificationStore((state) => state.removeNotification)
const { isReconnecting } = useSocket()
const reconnectingNotificationIdRef = useRef<string | null>(null)
const isOfflineMode = hasOperationError
useEffect(() => {
if (isReconnecting && !reconnectingNotificationIdRef.current && !isOfflineMode) {
const id = addNotification({
level: 'error',
message: 'Reconnecting...',
})
reconnectingNotificationIdRef.current = id
} else if (!isReconnecting && reconnectingNotificationIdRef.current) {
removeNotification(reconnectingNotificationIdRef.current)
reconnectingNotificationIdRef.current = null
}
return () => {
if (reconnectingNotificationIdRef.current) {
removeNotification(reconnectingNotificationIdRef.current)
reconnectingNotificationIdRef.current = null
}
}
}, [isReconnecting, isOfflineMode, addNotification, removeNotification])
useEffect(() => {
if (!isOfflineMode || hasShownOfflineNotification) {
return
}
if (reconnectingNotificationIdRef.current) {
removeNotification(reconnectingNotificationIdRef.current)
reconnectingNotificationIdRef.current = null
}
try {
addNotification({
level: 'error',
@@ -78,7 +107,7 @@ export function WorkspacePermissionsProvider({ children }: WorkspacePermissionsP
} catch (error) {
logger.error('Failed to add offline notification', { error })
}
}, [addNotification, hasShownOfflineNotification, isOfflineMode])
}, [addNotification, removeNotification, hasShownOfflineNotification, isOfflineMode])
const {
data: workspacePermissions,

View File

@@ -6,7 +6,7 @@ import { getSession } from '@/lib/auth'
import { verifyWorkspaceMembership } from '@/app/api/workflows/utils'
import type { Template as WorkspaceTemplate } from '@/app/workspace/[workspaceId]/templates/templates'
import Templates from '@/app/workspace/[workspaceId]/templates/templates'
import { getUserPermissionConfig } from '@/executor/utils/permission-check'
import { getUserPermissionConfig } from '@/ee/access-control/utils/permission-check'
interface TemplatesPageProps {
params: Promise<{

View File

@@ -1,5 +1,5 @@
import { memo, useCallback } from 'react'
import { ArrowLeftRight, ArrowUpDown, Circle, CircleOff, LogOut } from 'lucide-react'
import { ArrowLeftRight, ArrowUpDown, Circle, CircleOff, Lock, LogOut, Unlock } from 'lucide-react'
import { Button, Copy, PlayOutline, Tooltip, Trash2 } from '@/components/emcn'
import { cn } from '@/lib/core/utils/cn'
import { isInputDefinitionTrigger } from '@/lib/workflows/triggers/input-definition-triggers'
@@ -49,6 +49,7 @@ export const ActionBar = memo(
collaborativeBatchRemoveBlocks,
collaborativeBatchToggleBlockEnabled,
collaborativeBatchToggleBlockHandles,
collaborativeBatchToggleLocked,
} = useCollaborativeWorkflow()
const { setPendingSelection } = useWorkflowRegistry()
const { handleRunFromBlock } = useWorkflowExecution()
@@ -84,16 +85,28 @@ export const ActionBar = memo(
)
}, [blockId, addNotification, collaborativeBatchAddBlocks, setPendingSelection])
const { isEnabled, horizontalHandles, parentId, parentType } = useWorkflowStore(
const {
isEnabled,
horizontalHandles,
parentId,
parentType,
isLocked,
isParentLocked,
isParentDisabled,
} = useWorkflowStore(
useCallback(
(state) => {
const block = state.blocks[blockId]
const parentId = block?.data?.parentId
const parentBlock = parentId ? state.blocks[parentId] : undefined
return {
isEnabled: block?.enabled ?? true,
horizontalHandles: block?.horizontalHandles ?? false,
parentId,
parentType: parentId ? state.blocks[parentId]?.type : undefined,
parentType: parentBlock?.type,
isLocked: block?.locked ?? false,
isParentLocked: parentBlock?.locked ?? false,
isParentDisabled: parentBlock ? !parentBlock.enabled : false,
}
},
[blockId]
@@ -161,26 +174,28 @@ export const ActionBar = memo(
{!isNoteBlock && !isInsideSubflow && (
<Tooltip.Root>
<Tooltip.Trigger asChild>
<Button
variant='ghost'
onClick={(e) => {
e.stopPropagation()
if (canRunFromBlock && !disabled) {
handleRunFromBlockClick()
}
}}
className={ACTION_BUTTON_STYLES}
disabled={disabled || !canRunFromBlock}
>
<PlayOutline className={ICON_SIZE} />
</Button>
<span className='inline-flex'>
<Button
variant='ghost'
onClick={(e) => {
e.stopPropagation()
if (canRunFromBlock && !disabled) {
handleRunFromBlockClick()
}
}}
className={ACTION_BUTTON_STYLES}
disabled={disabled || !canRunFromBlock}
>
<PlayOutline className={ICON_SIZE} />
</Button>
</span>
</Tooltip.Trigger>
<Tooltip.Content side='top'>
{(() => {
if (disabled) return getTooltipMessage('Run from block')
if (disabled) return getTooltipMessage('Run')
if (isExecuting) return 'Execution in progress'
if (!dependenciesSatisfied) return 'Run upstream blocks first'
return 'Run from block'
if (!dependenciesSatisfied) return 'Disabled: Run Blocks Before'
return 'Run'
})()}
</Tooltip.Content>
</Tooltip.Root>
@@ -193,18 +208,54 @@ export const ActionBar = memo(
variant='ghost'
onClick={(e) => {
e.stopPropagation()
if (!disabled) {
// Can't enable if parent is disabled (must enable parent first)
const cantEnable = !isEnabled && isParentDisabled
if (!disabled && !isLocked && !isParentLocked && !cantEnable) {
collaborativeBatchToggleBlockEnabled([blockId])
}
}}
className={ACTION_BUTTON_STYLES}
disabled={disabled}
disabled={
disabled || isLocked || isParentLocked || (!isEnabled && isParentDisabled)
}
>
{isEnabled ? <Circle className={ICON_SIZE} /> : <CircleOff className={ICON_SIZE} />}
</Button>
</Tooltip.Trigger>
<Tooltip.Content side='top'>
{getTooltipMessage(isEnabled ? 'Disable Block' : 'Enable Block')}
{isLocked || isParentLocked
? 'Block is locked'
: !isEnabled && isParentDisabled
? 'Parent container is disabled'
: getTooltipMessage(isEnabled ? 'Disable Block' : 'Enable Block')}
</Tooltip.Content>
</Tooltip.Root>
)}
{userPermissions.canAdmin && (
<Tooltip.Root>
<Tooltip.Trigger asChild>
<Button
variant='ghost'
onClick={(e) => {
e.stopPropagation()
// Can't unlock a block if its parent container is locked
if (!disabled && !(isLocked && isParentLocked)) {
collaborativeBatchToggleLocked([blockId])
}
}}
className={ACTION_BUTTON_STYLES}
disabled={disabled || (isLocked && isParentLocked)}
>
{isLocked ? <Unlock className={ICON_SIZE} /> : <Lock className={ICON_SIZE} />}
</Button>
</Tooltip.Trigger>
<Tooltip.Content side='top'>
{isLocked && isParentLocked
? 'Parent container is locked'
: isLocked
? 'Unlock Block'
: 'Lock Block'}
</Tooltip.Content>
</Tooltip.Root>
)}
@@ -216,17 +267,21 @@ export const ActionBar = memo(
variant='ghost'
onClick={(e) => {
e.stopPropagation()
if (!disabled) {
if (!disabled && !isLocked && !isParentLocked) {
handleDuplicateBlock()
}
}}
className={ACTION_BUTTON_STYLES}
disabled={disabled}
disabled={disabled || isLocked || isParentLocked}
>
<Copy className={ICON_SIZE} />
</Button>
</Tooltip.Trigger>
<Tooltip.Content side='top'>{getTooltipMessage('Duplicate Block')}</Tooltip.Content>
<Tooltip.Content side='top'>
{isLocked || isParentLocked
? 'Block is locked'
: getTooltipMessage('Duplicate Block')}
</Tooltip.Content>
</Tooltip.Root>
)}
@@ -237,12 +292,12 @@ export const ActionBar = memo(
variant='ghost'
onClick={(e) => {
e.stopPropagation()
if (!disabled) {
if (!disabled && !isLocked && !isParentLocked) {
collaborativeBatchToggleBlockHandles([blockId])
}
}}
className={ACTION_BUTTON_STYLES}
disabled={disabled}
disabled={disabled || isLocked || isParentLocked}
>
{horizontalHandles ? (
<ArrowLeftRight className={ICON_SIZE} />
@@ -252,7 +307,9 @@ export const ActionBar = memo(
</Button>
</Tooltip.Trigger>
<Tooltip.Content side='top'>
{getTooltipMessage(horizontalHandles ? 'Vertical Ports' : 'Horizontal Ports')}
{isLocked || isParentLocked
? 'Block is locked'
: getTooltipMessage(horizontalHandles ? 'Vertical Ports' : 'Horizontal Ports')}
</Tooltip.Content>
</Tooltip.Root>
)}
@@ -264,19 +321,23 @@ export const ActionBar = memo(
variant='ghost'
onClick={(e) => {
e.stopPropagation()
if (!disabled && userPermissions.canEdit) {
if (!disabled && userPermissions.canEdit && !isLocked && !isParentLocked) {
window.dispatchEvent(
new CustomEvent('remove-from-subflow', { detail: { blockIds: [blockId] } })
)
}
}}
className={ACTION_BUTTON_STYLES}
disabled={disabled || !userPermissions.canEdit}
disabled={disabled || !userPermissions.canEdit || isLocked || isParentLocked}
>
<LogOut className={ICON_SIZE} />
</Button>
</Tooltip.Trigger>
<Tooltip.Content side='top'>{getTooltipMessage('Remove from Subflow')}</Tooltip.Content>
<Tooltip.Content side='top'>
{isLocked || isParentLocked
? 'Block is locked'
: getTooltipMessage('Remove from Subflow')}
</Tooltip.Content>
</Tooltip.Root>
)}
@@ -286,17 +347,19 @@ export const ActionBar = memo(
variant='ghost'
onClick={(e) => {
e.stopPropagation()
if (!disabled) {
if (!disabled && !isLocked && !isParentLocked) {
collaborativeBatchRemoveBlocks([blockId])
}
}}
className={ACTION_BUTTON_STYLES}
disabled={disabled}
disabled={disabled || isLocked || isParentLocked}
>
<Trash2 className={ICON_SIZE} />
</Button>
</Tooltip.Trigger>
<Tooltip.Content side='top'>{getTooltipMessage('Delete Block')}</Tooltip.Content>
<Tooltip.Content side='top'>
{isLocked || isParentLocked ? 'Block is locked' : getTooltipMessage('Delete Block')}
</Tooltip.Content>
</Tooltip.Root>
</div>
)

View File

@@ -20,6 +20,9 @@ export interface BlockInfo {
horizontalHandles: boolean
parentId?: string
parentType?: string
locked?: boolean
isParentLocked?: boolean
isParentDisabled?: boolean
}
/**
@@ -46,10 +49,17 @@ export interface BlockMenuProps {
showRemoveFromSubflow?: boolean
/** Whether run from block is available (has snapshot, was executed, not inside subflow) */
canRunFromBlock?: boolean
/** Whether to disable edit actions (user can't edit OR blocks are locked) */
disableEdit?: boolean
/** Whether the user has edit permission (ignoring locked state) */
userCanEdit?: boolean
isExecuting?: boolean
/** Whether the selected block is a trigger (has no incoming edges) */
isPositionalTrigger?: boolean
/** Callback to toggle locked state of selected blocks */
onToggleLocked?: () => void
/** Whether the user has admin permissions */
canAdmin?: boolean
}
/**
@@ -78,13 +88,22 @@ export function BlockMenu({
showRemoveFromSubflow = false,
canRunFromBlock = false,
disableEdit = false,
userCanEdit = true,
isExecuting = false,
isPositionalTrigger = false,
onToggleLocked,
canAdmin = false,
}: BlockMenuProps) {
const isSingleBlock = selectedBlocks.length === 1
const allEnabled = selectedBlocks.every((b) => b.enabled)
const allDisabled = selectedBlocks.every((b) => !b.enabled)
const allLocked = selectedBlocks.every((b) => b.locked)
const allUnlocked = selectedBlocks.every((b) => !b.locked)
// Can't unlock blocks that have locked parents
const hasBlockWithLockedParent = selectedBlocks.some((b) => b.locked && b.isParentLocked)
// Can't enable blocks that have disabled parents
const hasBlockWithDisabledParent = selectedBlocks.some((b) => !b.enabled && b.isParentDisabled)
const hasSingletonBlock = selectedBlocks.some(
(b) =>
@@ -108,6 +127,12 @@ export function BlockMenu({
return 'Toggle Enabled'
}
const getToggleLockedLabel = () => {
if (allLocked) return 'Unlock'
if (allUnlocked) return 'Lock'
return 'Toggle Lock'
}
return (
<Popover
open={isOpen}
@@ -139,7 +164,7 @@ export function BlockMenu({
</PopoverItem>
<PopoverItem
className='group'
disabled={disableEdit || !hasClipboard}
disabled={!userCanEdit || !hasClipboard}
onClick={() => {
onPaste()
onClose()
@@ -164,13 +189,15 @@ export function BlockMenu({
{!allNoteBlocks && <PopoverDivider />}
{!allNoteBlocks && (
<PopoverItem
disabled={disableEdit}
disabled={disableEdit || hasBlockWithDisabledParent}
onClick={() => {
onToggleEnabled()
onClose()
if (!disableEdit && !hasBlockWithDisabledParent) {
onToggleEnabled()
onClose()
}
}}
>
{getToggleEnabledLabel()}
{hasBlockWithDisabledParent ? 'Parent is disabled' : getToggleEnabledLabel()}
</PopoverItem>
)}
{!allNoteBlocks && !isSubflow && (
@@ -195,6 +222,19 @@ export function BlockMenu({
Remove from Subflow
</PopoverItem>
)}
{canAdmin && onToggleLocked && (
<PopoverItem
disabled={hasBlockWithLockedParent}
onClick={() => {
if (!hasBlockWithLockedParent) {
onToggleLocked()
onClose()
}
}}
>
{hasBlockWithLockedParent ? 'Parent is locked' : getToggleLockedLabel()}
</PopoverItem>
)}
{/* Single block actions */}
{isSingleBlock && <PopoverDivider />}
@@ -233,7 +273,7 @@ export function BlockMenu({
}
}}
>
Run from block
Run
</PopoverItem>
{/* Hide "Run until" for triggers - they're always at the start */}
{!hasTriggerBlock && (

View File

@@ -34,6 +34,8 @@ export interface CanvasMenuProps {
canUndo?: boolean
canRedo?: boolean
isInvitationsDisabled?: boolean
/** Whether the workflow has locked blocks (disables auto-layout) */
hasLockedBlocks?: boolean
}
/**
@@ -60,6 +62,7 @@ export function CanvasMenu({
disableEdit = false,
canUndo = false,
canRedo = false,
hasLockedBlocks = false,
}: CanvasMenuProps) {
return (
<Popover
@@ -129,11 +132,12 @@ export function CanvasMenu({
</PopoverItem>
<PopoverItem
className='group'
disabled={disableEdit}
disabled={disableEdit || hasLockedBlocks}
onClick={() => {
onAutoLayout()
onClose()
}}
title={hasLockedBlocks ? 'Unlock blocks to use auto-layout' : undefined}
>
<span>Auto-layout</span>
<span className='ml-auto opacity-70 group-hover:opacity-100'>L</span>

View File

@@ -3,6 +3,7 @@
import { memo, useEffect, useMemo, useRef, useState } from 'react'
import clsx from 'clsx'
import { ChevronUp } from 'lucide-react'
import { formatDuration } from '@/lib/core/utils/formatting'
import { CopilotMarkdownRenderer } from '../markdown-renderer'
/** Removes thinking tags (raw or escaped) and special tags from streamed content */
@@ -241,15 +242,11 @@ export function ThinkingBlock({
return () => window.clearInterval(intervalId)
}, [isStreaming, isExpanded, userHasScrolledAway])
/** Formats duration in milliseconds to seconds (minimum 1s) */
const formatDuration = (ms: number) => {
const seconds = Math.max(1, Math.round(ms / 1000))
return `${seconds}s`
}
const hasContent = cleanContent.length > 0
const isThinkingDone = !isStreaming || hasFollowingContent || hasSpecialTags
const durationText = `${label} for ${formatDuration(duration)}`
// Round to nearest second (minimum 1s) to match original behavior
const roundedMs = Math.max(1000, Math.round(duration / 1000) * 1000)
const durationText = `${label} for ${formatDuration(roundedMs)}`
const getStreamingLabel = (lbl: string) => {
if (lbl === 'Thought') return 'Thinking'

View File

@@ -15,6 +15,7 @@ import {
hasInterrupt as hasInterruptFromConfig,
isSpecialTool as isSpecialToolFromConfig,
} from '@/lib/copilot/tools/client/ui-config'
import { formatDuration } from '@/lib/core/utils/formatting'
import { CopilotMarkdownRenderer } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/copilot/components/copilot-message/components/markdown-renderer'
import { SmoothStreamingText } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/copilot/components/copilot-message/components/smooth-streaming'
import { ThinkingBlock } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/copilot/components/copilot-message/components/thinking-block'
@@ -848,13 +849,10 @@ const SubagentContentRenderer = memo(function SubagentContentRenderer({
(allParsed.options && Object.keys(allParsed.options).length > 0)
)
const formatDuration = (ms: number) => {
const seconds = Math.max(1, Math.round(ms / 1000))
return `${seconds}s`
}
const outerLabel = getSubagentCompletionLabel(toolCall.name)
const durationText = `${outerLabel} for ${formatDuration(duration)}`
// Round to nearest second (minimum 1s) to match original behavior
const roundedMs = Math.max(1000, Math.round(duration / 1000) * 1000)
const durationText = `${outerLabel} for ${formatDuration(roundedMs)}`
const renderCollapsibleContent = () => (
<>

View File

@@ -45,7 +45,7 @@ export function CredentialSelector({
previewValue,
}: CredentialSelectorProps) {
const [showOAuthModal, setShowOAuthModal] = useState(false)
const [inputValue, setInputValue] = useState('')
const [editingValue, setEditingValue] = useState('')
const [isEditing, setIsEditing] = useState(false)
const { activeWorkflowId } = useWorkflowRegistry()
const [storeValue, setStoreValue] = useSubBlockValue<string | null>(blockId, subBlock.id)
@@ -128,11 +128,7 @@ export function CredentialSelector({
return ''
}, [selectedCredentialSet, isForeignCredentialSet, selectedCredential, isForeign])
useEffect(() => {
if (!isEditing) {
setInputValue(resolvedLabel)
}
}, [resolvedLabel, isEditing])
const displayValue = isEditing ? editingValue : resolvedLabel
const invalidSelection =
!isPreview &&
@@ -295,7 +291,7 @@ export function CredentialSelector({
const selectedCredentialProvider = selectedCredential?.provider ?? provider
const overlayContent = useMemo(() => {
if (!inputValue) return null
if (!displayValue) return null
if (isCredentialSetSelected && selectedCredentialSet) {
return (
@@ -303,7 +299,7 @@ export function CredentialSelector({
<div className='mr-2 flex-shrink-0 opacity-90'>
<Users className='h-3 w-3' />
</div>
<span className='truncate'>{inputValue}</span>
<span className='truncate'>{displayValue}</span>
</div>
)
}
@@ -313,12 +309,12 @@ export function CredentialSelector({
<div className='mr-2 flex-shrink-0 opacity-90'>
{getProviderIcon(selectedCredentialProvider)}
</div>
<span className='truncate'>{inputValue}</span>
<span className='truncate'>{displayValue}</span>
</div>
)
}, [
getProviderIcon,
inputValue,
displayValue,
selectedCredentialProvider,
isCredentialSetSelected,
selectedCredentialSet,
@@ -335,7 +331,6 @@ export function CredentialSelector({
const credentialSetId = value.slice(CREDENTIAL_SET.PREFIX.length)
const matchedSet = credentialSets.find((cs) => cs.id === credentialSetId)
if (matchedSet) {
setInputValue(matchedSet.name)
handleCredentialSetSelect(credentialSetId)
return
}
@@ -343,13 +338,12 @@ export function CredentialSelector({
const matchedCred = credentials.find((c) => c.id === value)
if (matchedCred) {
setInputValue(matchedCred.name)
handleSelect(value)
return
}
setIsEditing(true)
setInputValue(value)
setEditingValue(value)
},
[credentials, credentialSets, handleAddCredential, handleSelect, handleCredentialSetSelect]
)
@@ -359,7 +353,7 @@ export function CredentialSelector({
<Combobox
options={comboboxOptions}
groups={comboboxGroups}
value={inputValue}
value={displayValue}
selectedValue={rawSelectedId}
onChange={handleComboboxChange}
onOpenChange={handleOpenChange}

View File

@@ -908,8 +908,10 @@ const PopoverContextCapture: React.FC<{
* When in nested folders, goes back one level at a time.
* At the root folder level, closes the folder.
*/
const TagDropdownBackButton: React.FC = () => {
const { isInFolder, closeFolder, colorScheme, size } = usePopoverContext()
const TagDropdownBackButton: React.FC<{ setSelectedIndex: (index: number) => void }> = ({
setSelectedIndex,
}) => {
const { isInFolder, closeFolder, size, isKeyboardNav, setKeyboardNav } = usePopoverContext()
const nestedNav = useNestedNavigation()
if (!isInFolder) return null
@@ -922,28 +924,31 @@ const TagDropdownBackButton: React.FC = () => {
closeFolder()
}
const handleMouseEnter = () => {
if (isKeyboardNav) return
setKeyboardNav(false)
setSelectedIndex(-1)
}
return (
<div
className={cn(
'flex min-w-0 cursor-pointer items-center gap-[8px] rounded-[6px] px-[6px] font-base',
size === 'sm' ? 'h-[22px] text-[11px]' : 'h-[26px] text-[13px]',
colorScheme === 'inverted'
? 'text-white hover:bg-[#363636] hover:text-white dark:text-[var(--text-primary)] dark:hover:bg-[var(--surface-5)]'
: 'text-[var(--text-primary)] hover:bg-[var(--border-1)]'
)}
role='button'
onClick={handleBackClick}
<PopoverItem
onMouseDown={(e) => {
e.preventDefault()
e.stopPropagation()
handleBackClick(e)
}}
onMouseEnter={handleMouseEnter}
>
<svg
className={size === 'sm' ? 'h-3 w-3' : 'h-3.5 w-3.5'}
className={cn('shrink-0', size === 'sm' ? 'h-3 w-3' : 'h-3.5 w-3.5')}
fill='none'
viewBox='0 0 24 24'
stroke='currentColor'
>
<path strokeLinecap='round' strokeLinejoin='round' strokeWidth={2} d='M15 19l-7-7 7-7' />
</svg>
<span>Back</span>
</div>
<span className='shrink-0'>Back</span>
</PopoverItem>
)
}
@@ -1961,8 +1966,8 @@ export const TagDropdown: React.FC<TagDropdownProps> = ({
onOpenAutoFocus={(e) => e.preventDefault()}
onCloseAutoFocus={(e) => e.preventDefault()}
>
<TagDropdownBackButton />
<PopoverScrollArea ref={scrollAreaRef}>
<TagDropdownBackButton setSelectedIndex={setSelectedIndex} />
{flatTagList.length === 0 ? (
<div className='px-[6px] py-[8px] text-[12px] text-[var(--white)]/60'>
No matching tags found

View File

@@ -0,0 +1,443 @@
/**
* @vitest-environment node
*/
import { describe, expect, it } from 'vitest'
interface StoredTool {
type: string
title?: string
toolId?: string
params?: Record<string, string>
customToolId?: string
schema?: any
code?: string
operation?: string
usageControl?: 'auto' | 'force' | 'none'
}
const isMcpToolAlreadySelected = (selectedTools: StoredTool[], mcpToolId: string): boolean => {
return selectedTools.some((tool) => tool.type === 'mcp' && tool.toolId === mcpToolId)
}
const isCustomToolAlreadySelected = (
selectedTools: StoredTool[],
customToolId: string
): boolean => {
return selectedTools.some(
(tool) => tool.type === 'custom-tool' && tool.customToolId === customToolId
)
}
const isWorkflowAlreadySelected = (selectedTools: StoredTool[], workflowId: string): boolean => {
return selectedTools.some(
(tool) => tool.type === 'workflow_input' && tool.params?.workflowId === workflowId
)
}
describe('isMcpToolAlreadySelected', () => {
describe('basic functionality', () => {
it.concurrent('returns false when selectedTools is empty', () => {
expect(isMcpToolAlreadySelected([], 'mcp-tool-123')).toBe(false)
})
it.concurrent('returns false when MCP tool is not in selectedTools', () => {
const selectedTools: StoredTool[] = [
{ type: 'mcp', toolId: 'different-mcp-tool', title: 'Different Tool' },
]
expect(isMcpToolAlreadySelected(selectedTools, 'mcp-tool-123')).toBe(false)
})
it.concurrent('returns true when MCP tool is already selected', () => {
const selectedTools: StoredTool[] = [
{ type: 'mcp', toolId: 'mcp-tool-123', title: 'My MCP Tool' },
]
expect(isMcpToolAlreadySelected(selectedTools, 'mcp-tool-123')).toBe(true)
})
it.concurrent('returns true when MCP tool is one of many selected tools', () => {
const selectedTools: StoredTool[] = [
{ type: 'custom-tool', customToolId: 'custom-1' },
{ type: 'mcp', toolId: 'mcp-tool-123', title: 'My MCP Tool' },
{ type: 'workflow_input', toolId: 'workflow_executor' },
]
expect(isMcpToolAlreadySelected(selectedTools, 'mcp-tool-123')).toBe(true)
})
})
describe('type discrimination', () => {
it.concurrent('does not match non-MCP tools with same toolId', () => {
const selectedTools: StoredTool[] = [{ type: 'http_request', toolId: 'mcp-tool-123' }]
expect(isMcpToolAlreadySelected(selectedTools, 'mcp-tool-123')).toBe(false)
})
it.concurrent('does not match custom tools even with toolId set', () => {
const selectedTools: StoredTool[] = [
{ type: 'custom-tool', toolId: 'custom-mcp-tool-123', customToolId: 'db-id' },
]
expect(isMcpToolAlreadySelected(selectedTools, 'mcp-tool-123')).toBe(false)
})
})
describe('multiple MCP tools', () => {
it.concurrent('correctly identifies first of multiple MCP tools', () => {
const selectedTools: StoredTool[] = [
{ type: 'mcp', toolId: 'mcp-tool-1', title: 'Tool 1' },
{ type: 'mcp', toolId: 'mcp-tool-2', title: 'Tool 2' },
{ type: 'mcp', toolId: 'mcp-tool-3', title: 'Tool 3' },
]
expect(isMcpToolAlreadySelected(selectedTools, 'mcp-tool-1')).toBe(true)
})
it.concurrent('correctly identifies middle MCP tool', () => {
const selectedTools: StoredTool[] = [
{ type: 'mcp', toolId: 'mcp-tool-1', title: 'Tool 1' },
{ type: 'mcp', toolId: 'mcp-tool-2', title: 'Tool 2' },
{ type: 'mcp', toolId: 'mcp-tool-3', title: 'Tool 3' },
]
expect(isMcpToolAlreadySelected(selectedTools, 'mcp-tool-2')).toBe(true)
})
it.concurrent('correctly identifies last MCP tool', () => {
const selectedTools: StoredTool[] = [
{ type: 'mcp', toolId: 'mcp-tool-1', title: 'Tool 1' },
{ type: 'mcp', toolId: 'mcp-tool-2', title: 'Tool 2' },
{ type: 'mcp', toolId: 'mcp-tool-3', title: 'Tool 3' },
]
expect(isMcpToolAlreadySelected(selectedTools, 'mcp-tool-3')).toBe(true)
})
it.concurrent('returns false for non-existent MCP tool among many', () => {
const selectedTools: StoredTool[] = [
{ type: 'mcp', toolId: 'mcp-tool-1', title: 'Tool 1' },
{ type: 'mcp', toolId: 'mcp-tool-2', title: 'Tool 2' },
]
expect(isMcpToolAlreadySelected(selectedTools, 'mcp-tool-999')).toBe(false)
})
})
})
describe('isCustomToolAlreadySelected', () => {
describe('basic functionality', () => {
it.concurrent('returns false when selectedTools is empty', () => {
expect(isCustomToolAlreadySelected([], 'custom-tool-123')).toBe(false)
})
it.concurrent('returns false when custom tool is not in selectedTools', () => {
const selectedTools: StoredTool[] = [
{ type: 'custom-tool', customToolId: 'different-custom-tool' },
]
expect(isCustomToolAlreadySelected(selectedTools, 'custom-tool-123')).toBe(false)
})
it.concurrent('returns true when custom tool is already selected', () => {
const selectedTools: StoredTool[] = [{ type: 'custom-tool', customToolId: 'custom-tool-123' }]
expect(isCustomToolAlreadySelected(selectedTools, 'custom-tool-123')).toBe(true)
})
it.concurrent('returns true when custom tool is one of many selected tools', () => {
const selectedTools: StoredTool[] = [
{ type: 'mcp', toolId: 'mcp-1', title: 'MCP Tool' },
{ type: 'custom-tool', customToolId: 'custom-tool-123' },
{ type: 'http_request', toolId: 'http_request_tool' },
]
expect(isCustomToolAlreadySelected(selectedTools, 'custom-tool-123')).toBe(true)
})
})
describe('type discrimination', () => {
it.concurrent('does not match non-custom tools with similar IDs', () => {
const selectedTools: StoredTool[] = [
{ type: 'mcp', toolId: 'custom-tool-123', title: 'MCP with similar ID' },
]
expect(isCustomToolAlreadySelected(selectedTools, 'custom-tool-123')).toBe(false)
})
it.concurrent('does not match MCP tools even if customToolId happens to match', () => {
const selectedTools: StoredTool[] = [
{ type: 'mcp', toolId: 'mcp-id', customToolId: 'custom-tool-123' } as StoredTool,
]
expect(isCustomToolAlreadySelected(selectedTools, 'custom-tool-123')).toBe(false)
})
})
describe('legacy inline custom tools', () => {
it.concurrent('does not match legacy inline tools without customToolId', () => {
const selectedTools: StoredTool[] = [
{
type: 'custom-tool',
title: 'Legacy Tool',
toolId: 'custom-myFunction',
schema: { function: { name: 'myFunction' } },
code: 'return true',
},
]
expect(isCustomToolAlreadySelected(selectedTools, 'custom-tool-123')).toBe(false)
})
it.concurrent('does not false-positive on legacy tools when checking for database tool', () => {
const selectedTools: StoredTool[] = [
{
type: 'custom-tool',
title: 'Legacy Tool',
schema: { function: { name: 'sameName' } },
code: 'return true',
},
]
expect(isCustomToolAlreadySelected(selectedTools, 'db-tool-1')).toBe(false)
})
})
describe('multiple custom tools', () => {
it.concurrent('correctly identifies first of multiple custom tools', () => {
const selectedTools: StoredTool[] = [
{ type: 'custom-tool', customToolId: 'custom-1' },
{ type: 'custom-tool', customToolId: 'custom-2' },
{ type: 'custom-tool', customToolId: 'custom-3' },
]
expect(isCustomToolAlreadySelected(selectedTools, 'custom-1')).toBe(true)
})
it.concurrent('correctly identifies middle custom tool', () => {
const selectedTools: StoredTool[] = [
{ type: 'custom-tool', customToolId: 'custom-1' },
{ type: 'custom-tool', customToolId: 'custom-2' },
{ type: 'custom-tool', customToolId: 'custom-3' },
]
expect(isCustomToolAlreadySelected(selectedTools, 'custom-2')).toBe(true)
})
it.concurrent('correctly identifies last custom tool', () => {
const selectedTools: StoredTool[] = [
{ type: 'custom-tool', customToolId: 'custom-1' },
{ type: 'custom-tool', customToolId: 'custom-2' },
{ type: 'custom-tool', customToolId: 'custom-3' },
]
expect(isCustomToolAlreadySelected(selectedTools, 'custom-3')).toBe(true)
})
it.concurrent('returns false for non-existent custom tool among many', () => {
const selectedTools: StoredTool[] = [
{ type: 'custom-tool', customToolId: 'custom-1' },
{ type: 'custom-tool', customToolId: 'custom-2' },
]
expect(isCustomToolAlreadySelected(selectedTools, 'custom-999')).toBe(false)
})
})
describe('mixed tool types', () => {
it.concurrent('correctly identifies custom tool in mixed list', () => {
const selectedTools: StoredTool[] = [
{ type: 'mcp', toolId: 'mcp-tool-1', title: 'MCP Tool' },
{ type: 'custom-tool', customToolId: 'custom-tool-123' },
{ type: 'http_request', toolId: 'http_request' },
{ type: 'workflow_input', toolId: 'workflow_executor' },
{ type: 'custom-tool', title: 'Legacy', schema: {}, code: '' },
]
expect(isCustomToolAlreadySelected(selectedTools, 'custom-tool-123')).toBe(true)
})
it.concurrent('does not confuse MCP toolId with custom customToolId', () => {
const selectedTools: StoredTool[] = [
{ type: 'mcp', toolId: 'shared-id-123', title: 'MCP Tool' },
{ type: 'custom-tool', customToolId: 'different-id' },
]
expect(isCustomToolAlreadySelected(selectedTools, 'shared-id-123')).toBe(false)
})
})
})
describe('isWorkflowAlreadySelected', () => {
describe('basic functionality', () => {
it.concurrent('returns false when selectedTools is empty', () => {
expect(isWorkflowAlreadySelected([], 'workflow-123')).toBe(false)
})
it.concurrent('returns false when workflow is not in selectedTools', () => {
const selectedTools: StoredTool[] = [
{
type: 'workflow_input',
toolId: 'workflow_executor',
params: { workflowId: 'different-workflow' },
},
]
expect(isWorkflowAlreadySelected(selectedTools, 'workflow-123')).toBe(false)
})
it.concurrent('returns true when workflow is already selected', () => {
const selectedTools: StoredTool[] = [
{
type: 'workflow_input',
toolId: 'workflow_executor',
params: { workflowId: 'workflow-123' },
},
]
expect(isWorkflowAlreadySelected(selectedTools, 'workflow-123')).toBe(true)
})
it.concurrent('returns true when workflow is one of many selected tools', () => {
const selectedTools: StoredTool[] = [
{ type: 'mcp', toolId: 'mcp-1', title: 'MCP Tool' },
{
type: 'workflow_input',
toolId: 'workflow_executor',
params: { workflowId: 'workflow-123' },
},
{ type: 'custom-tool', customToolId: 'custom-1' },
]
expect(isWorkflowAlreadySelected(selectedTools, 'workflow-123')).toBe(true)
})
})
describe('type discrimination', () => {
it.concurrent('does not match non-workflow_input tools', () => {
const selectedTools: StoredTool[] = [
{ type: 'mcp', toolId: 'workflow-123', params: { workflowId: 'workflow-123' } },
]
expect(isWorkflowAlreadySelected(selectedTools, 'workflow-123')).toBe(false)
})
it.concurrent('does not match workflow_input without params', () => {
const selectedTools: StoredTool[] = [{ type: 'workflow_input', toolId: 'workflow_executor' }]
expect(isWorkflowAlreadySelected(selectedTools, 'workflow-123')).toBe(false)
})
it.concurrent('does not match workflow_input with different workflowId in params', () => {
const selectedTools: StoredTool[] = [
{
type: 'workflow_input',
toolId: 'workflow_executor',
params: { workflowId: 'other-workflow' },
},
]
expect(isWorkflowAlreadySelected(selectedTools, 'workflow-123')).toBe(false)
})
})
describe('multiple workflows', () => {
it.concurrent('allows different workflows to be selected', () => {
const selectedTools: StoredTool[] = [
{
type: 'workflow_input',
toolId: 'workflow_executor',
params: { workflowId: 'workflow-a' },
},
{
type: 'workflow_input',
toolId: 'workflow_executor',
params: { workflowId: 'workflow-b' },
},
]
expect(isWorkflowAlreadySelected(selectedTools, 'workflow-a')).toBe(true)
expect(isWorkflowAlreadySelected(selectedTools, 'workflow-b')).toBe(true)
expect(isWorkflowAlreadySelected(selectedTools, 'workflow-c')).toBe(false)
})
it.concurrent('correctly identifies specific workflow among many', () => {
const selectedTools: StoredTool[] = [
{
type: 'workflow_input',
toolId: 'workflow_executor',
params: { workflowId: 'workflow-1' },
},
{
type: 'workflow_input',
toolId: 'workflow_executor',
params: { workflowId: 'workflow-2' },
},
{
type: 'workflow_input',
toolId: 'workflow_executor',
params: { workflowId: 'workflow-3' },
},
]
expect(isWorkflowAlreadySelected(selectedTools, 'workflow-2')).toBe(true)
expect(isWorkflowAlreadySelected(selectedTools, 'workflow-999')).toBe(false)
})
})
})
describe('duplicate prevention integration scenarios', () => {
describe('add then try to re-add', () => {
it.concurrent('prevents re-adding the same MCP tool', () => {
const selectedTools: StoredTool[] = [
{
type: 'mcp',
toolId: 'planetscale-query',
title: 'PlanetScale Query',
params: { serverId: 'server-1' },
},
]
expect(isMcpToolAlreadySelected(selectedTools, 'planetscale-query')).toBe(true)
})
it.concurrent('prevents re-adding the same custom tool', () => {
const selectedTools: StoredTool[] = [
{
type: 'custom-tool',
customToolId: 'my-custom-tool-uuid',
usageControl: 'auto',
},
]
expect(isCustomToolAlreadySelected(selectedTools, 'my-custom-tool-uuid')).toBe(true)
})
it.concurrent('prevents re-adding the same workflow', () => {
const selectedTools: StoredTool[] = [
{
type: 'workflow_input',
toolId: 'workflow_executor',
params: { workflowId: 'my-workflow-uuid' },
},
]
expect(isWorkflowAlreadySelected(selectedTools, 'my-workflow-uuid')).toBe(true)
})
})
describe('remove then re-add', () => {
it.concurrent('allows re-adding MCP tool after removal', () => {
const selectedToolsAfterRemoval: StoredTool[] = []
expect(isMcpToolAlreadySelected(selectedToolsAfterRemoval, 'planetscale-query')).toBe(false)
})
it.concurrent('allows re-adding custom tool after removal', () => {
const selectedToolsAfterRemoval: StoredTool[] = [
{ type: 'mcp', toolId: 'some-other-tool', title: 'Other' },
]
expect(isCustomToolAlreadySelected(selectedToolsAfterRemoval, 'my-custom-tool-uuid')).toBe(
false
)
})
it.concurrent('allows re-adding workflow after removal', () => {
const selectedToolsAfterRemoval: StoredTool[] = [
{ type: 'mcp', toolId: 'some-tool', title: 'Other' },
]
expect(isWorkflowAlreadySelected(selectedToolsAfterRemoval, 'my-workflow-uuid')).toBe(false)
})
})
describe('different tools with similar names', () => {
it.concurrent('allows adding different MCP tools from same server', () => {
const selectedTools: StoredTool[] = [
{ type: 'mcp', toolId: 'server1-tool-a', title: 'Tool A', params: { serverId: 'server1' } },
]
expect(isMcpToolAlreadySelected(selectedTools, 'server1-tool-b')).toBe(false)
})
it.concurrent('allows adding different custom tools', () => {
const selectedTools: StoredTool[] = [{ type: 'custom-tool', customToolId: 'custom-a' }]
expect(isCustomToolAlreadySelected(selectedTools, 'custom-b')).toBe(false)
})
it.concurrent('allows adding different workflows', () => {
const selectedTools: StoredTool[] = [
{
type: 'workflow_input',
toolId: 'workflow_executor',
params: { workflowId: 'workflow-a' },
},
]
expect(isWorkflowAlreadySelected(selectedTools, 'workflow-b')).toBe(false)
})
})
})
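For illustration, the three predicates exercised above could be combined into a single guard before adding a tool. This wrapper is hypothetical (it is not part of the test file or the production change), but it only uses the StoredTool shape and predicates defined above:

// Hypothetical convenience wrapper over the predicates exercised above.
function isDuplicateSelection(selectedTools: StoredTool[], candidate: StoredTool): boolean {
  if (candidate.type === 'mcp' && candidate.toolId) {
    return isMcpToolAlreadySelected(selectedTools, candidate.toolId)
  }
  if (candidate.type === 'custom-tool' && candidate.customToolId) {
    return isCustomToolAlreadySelected(selectedTools, candidate.customToolId)
  }
  const workflowId = candidate.params?.workflowId
  if (candidate.type === 'workflow_input' && workflowId) {
    return isWorkflowAlreadySelected(selectedTools, workflowId)
  }
  return false
}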

View File

@@ -1226,6 +1226,40 @@ export const ToolInput = memo(function ToolInput({
return selectedTools.some((tool) => tool.toolId === toolId)
}
/**
* Checks if an MCP tool is already selected.
*
* @param mcpToolId - The MCP tool identifier to check
* @returns `true` if the MCP tool is already selected
*/
const isMcpToolAlreadySelected = (mcpToolId: string): boolean => {
return selectedTools.some((tool) => tool.type === 'mcp' && tool.toolId === mcpToolId)
}
/**
* Checks if a custom tool is already selected.
*
* @param customToolId - The custom tool identifier to check
* @returns `true` if the custom tool is already selected
*/
const isCustomToolAlreadySelected = (customToolId: string): boolean => {
return selectedTools.some(
(tool) => tool.type === 'custom-tool' && tool.customToolId === customToolId
)
}
/**
* Checks if a workflow is already selected.
*
* @param workflowId - The workflow identifier to check
* @returns `true` if the workflow is already selected
*/
const isWorkflowAlreadySelected = (workflowId: string): boolean => {
return selectedTools.some(
(tool) => tool.type === 'workflow_input' && tool.params?.workflowId === workflowId
)
}
/**
* Checks if a block supports multiple operations.
*
@@ -1745,24 +1779,29 @@ export const ToolInput = memo(function ToolInput({
if (!permissionConfig.disableCustomTools && customTools.length > 0) {
groups.push({
section: 'Custom Tools',
items: customTools.map((customTool) => ({
label: customTool.title,
value: `custom-${customTool.id}`,
iconElement: createToolIcon('#3B82F6', WrenchIcon),
onSelect: () => {
const newTool: StoredTool = {
type: 'custom-tool',
customToolId: customTool.id,
usageControl: 'auto',
isExpanded: true,
}
setStoreValue([
...selectedTools.map((tool) => ({ ...tool, isExpanded: false })),
newTool,
])
setOpen(false)
},
})),
items: customTools.map((customTool) => {
const alreadySelected = isCustomToolAlreadySelected(customTool.id)
return {
label: customTool.title,
value: `custom-${customTool.id}`,
iconElement: createToolIcon('#3B82F6', WrenchIcon),
disabled: isPreview || alreadySelected,
onSelect: () => {
if (alreadySelected) return
const newTool: StoredTool = {
type: 'custom-tool',
customToolId: customTool.id,
usageControl: 'auto',
isExpanded: true,
}
setStoreValue([
...selectedTools.map((tool) => ({ ...tool, isExpanded: false })),
newTool,
])
setOpen(false)
},
}
}),
})
}
@@ -1772,11 +1811,13 @@ export const ToolInput = memo(function ToolInput({
section: 'MCP Tools',
items: availableMcpTools.map((mcpTool) => {
const server = mcpServers.find((s) => s.id === mcpTool.serverId)
const alreadySelected = isMcpToolAlreadySelected(mcpTool.id)
return {
label: mcpTool.name,
value: `mcp-${mcpTool.id}`,
iconElement: createToolIcon(mcpTool.bgColor || '#6366F1', mcpTool.icon || McpIcon),
onSelect: () => {
if (alreadySelected) return
const newTool: StoredTool = {
type: 'mcp',
title: mcpTool.name,
@@ -1796,7 +1837,7 @@ export const ToolInput = memo(function ToolInput({
}
handleMcpToolSelect(newTool, true)
},
disabled: isPreview || disabled,
disabled: isPreview || disabled || alreadySelected,
}
}),
})
@@ -1810,12 +1851,17 @@ export const ToolInput = memo(function ToolInput({
if (builtInTools.length > 0) {
groups.push({
section: 'Built-in Tools',
items: builtInTools.map((block) => ({
label: block.name,
value: `builtin-${block.type}`,
iconElement: createToolIcon(block.bgColor, block.icon),
onSelect: () => handleSelectTool(block),
})),
items: builtInTools.map((block) => {
const toolId = getToolIdForOperation(block.type, undefined)
const alreadySelected = toolId ? isToolAlreadySelected(toolId, block.type) : false
return {
label: block.name,
value: `builtin-${block.type}`,
iconElement: createToolIcon(block.bgColor, block.icon),
disabled: isPreview || alreadySelected,
onSelect: () => handleSelectTool(block),
}
}),
})
}
@@ -1823,12 +1869,17 @@ export const ToolInput = memo(function ToolInput({
if (integrations.length > 0) {
groups.push({
section: 'Integrations',
items: integrations.map((block) => ({
label: block.name,
value: `builtin-${block.type}`,
iconElement: createToolIcon(block.bgColor, block.icon),
onSelect: () => handleSelectTool(block),
})),
items: integrations.map((block) => {
const toolId = getToolIdForOperation(block.type, undefined)
const alreadySelected = toolId ? isToolAlreadySelected(toolId, block.type) : false
return {
label: block.name,
value: `builtin-${block.type}`,
iconElement: createToolIcon(block.bgColor, block.icon),
disabled: isPreview || alreadySelected,
onSelect: () => handleSelectTool(block),
}
}),
})
}
@@ -1836,29 +1887,33 @@ export const ToolInput = memo(function ToolInput({
if (availableWorkflows.length > 0) {
groups.push({
section: 'Workflows',
items: availableWorkflows.map((workflow) => ({
label: workflow.name,
value: `workflow-${workflow.id}`,
iconElement: createToolIcon('#6366F1', WorkflowIcon),
onSelect: () => {
const newTool: StoredTool = {
type: 'workflow_input',
title: 'Workflow',
toolId: 'workflow_executor',
params: {
workflowId: workflow.id,
},
isExpanded: true,
usageControl: 'auto',
}
setStoreValue([
...selectedTools.map((tool) => ({ ...tool, isExpanded: false })),
newTool,
])
setOpen(false)
},
disabled: isPreview || disabled,
})),
items: availableWorkflows.map((workflow) => {
const alreadySelected = isWorkflowAlreadySelected(workflow.id)
return {
label: workflow.name,
value: `workflow-${workflow.id}`,
iconElement: createToolIcon('#6366F1', WorkflowIcon),
onSelect: () => {
if (alreadySelected) return
const newTool: StoredTool = {
type: 'workflow_input',
title: 'Workflow',
toolId: 'workflow_executor',
params: {
workflowId: workflow.id,
},
isExpanded: true,
usageControl: 'auto',
}
setStoreValue([
...selectedTools.map((tool) => ({ ...tool, isExpanded: false })),
newTool,
])
setOpen(false)
},
disabled: isPreview || disabled || alreadySelected,
}
}),
})
}
@@ -1877,6 +1932,11 @@ export const ToolInput = memo(function ToolInput({
permissionConfig.disableCustomTools,
permissionConfig.disableMcpTools,
availableWorkflows,
getToolIdForOperation,
isToolAlreadySelected,
isMcpToolAlreadySelected,
isCustomToolAlreadySelected,
isWorkflowAlreadySelected,
])
const toolRequiresOAuth = (toolId: string): boolean => {

View File

@@ -9,7 +9,9 @@ import {
ChevronUp,
ExternalLink,
Loader2,
Lock,
Pencil,
Unlock,
} from 'lucide-react'
import { useParams } from 'next/navigation'
import { useShallow } from 'zustand/react/shallow'
@@ -46,10 +48,17 @@ import { useCollaborativeWorkflow } from '@/hooks/use-collaborative-workflow'
import { usePanelEditorStore } from '@/stores/panel'
import { useWorkflowRegistry } from '@/stores/workflows/registry/store'
import { useSubBlockStore } from '@/stores/workflows/subblock/store'
import { useWorkflowStore } from '@/stores/workflows/workflow/store'
/** Stable empty object to avoid creating new references */
const EMPTY_SUBBLOCK_VALUES = {} as Record<string, any>
/** Shared style for dashed divider lines */
const DASHED_DIVIDER_STYLE = {
backgroundImage:
'repeating-linear-gradient(to right, var(--border) 0px, var(--border) 6px, transparent 6px, transparent 12px)',
} as const
/**
* Icon component for rendering block icons.
*
@@ -89,31 +98,31 @@ export function Editor() {
const blockConfig = currentBlock ? getBlock(currentBlock.type) : null
const title = currentBlock?.name || 'Editor'
// Check if selected block is a subflow (loop or parallel)
const isSubflow =
currentBlock && (currentBlock.type === 'loop' || currentBlock.type === 'parallel')
// Get subflow display properties from configs
const subflowConfig = isSubflow ? (currentBlock.type === 'loop' ? LoopTool : ParallelTool) : null
// Check if selected block is a workflow block
const isWorkflowBlock =
currentBlock && (currentBlock.type === 'workflow' || currentBlock.type === 'workflow_input')
// Get workspace ID from params
const params = useParams()
const workspaceId = params.workspaceId as string
// Refs for resize functionality
const subBlocksRef = useRef<HTMLDivElement>(null)
// Get user permissions
const userPermissions = useUserPermissionsContext()
// Get active workflow ID
// Check if block is locked (or inside a locked container) and compute edit permission
// Locked blocks cannot be edited by anyone (admins can only lock/unlock)
const blocks = useWorkflowStore((state) => state.blocks)
const parentId = currentBlock?.data?.parentId as string | undefined
const isParentLocked = parentId ? (blocks[parentId]?.locked ?? false) : false
const isLocked = (currentBlock?.locked ?? false) || isParentLocked
const canEditBlock = userPermissions.canEdit && !isLocked
const activeWorkflowId = useWorkflowRegistry((state) => state.activeWorkflowId)
// Get block properties (advanced/trigger modes)
const { advancedMode, triggerMode } = useEditorBlockProperties(
currentBlockId,
currentWorkflow.isSnapshotView
@@ -145,22 +154,17 @@ export function Editor() {
[subBlocksForCanonical]
)
const canonicalModeOverrides = currentBlock?.data?.canonicalModes
const advancedValuesPresent = hasAdvancedValues(
subBlocksForCanonical,
blockSubBlockValues,
canonicalIndex
const advancedValuesPresent = useMemo(
() => hasAdvancedValues(subBlocksForCanonical, blockSubBlockValues, canonicalIndex),
[subBlocksForCanonical, blockSubBlockValues, canonicalIndex]
)
const displayAdvancedOptions = userPermissions.canEdit
? advancedMode
: advancedMode || advancedValuesPresent
const displayAdvancedOptions = canEditBlock ? advancedMode : advancedMode || advancedValuesPresent
const hasAdvancedOnlyFields = useMemo(() => {
for (const subBlock of subBlocksForCanonical) {
// Must be standalone advanced (mode: 'advanced' without canonicalParamId)
if (subBlock.mode !== 'advanced') continue
if (canonicalIndex.canonicalIdBySubBlockId[subBlock.id]) continue
// Check condition - skip if condition not met for current values
if (
subBlock.condition &&
!evaluateSubBlockCondition(subBlock.condition, blockSubBlockValues)
@@ -173,7 +177,6 @@ export function Editor() {
return false
}, [subBlocksForCanonical, canonicalIndex.canonicalIdBySubBlockId, blockSubBlockValues])
// Get subblock layout using custom hook
const { subBlocks, stateToUse: subBlockState } = useEditorSubblockLayout(
blockConfig || ({} as any),
currentBlockId || '',
@@ -206,40 +209,44 @@ export function Editor() {
return { regularSubBlocks: regular, advancedOnlySubBlocks: advancedOnly }
}, [subBlocks, canonicalIndex.canonicalIdBySubBlockId])
// Get block connections
const { incomingConnections, hasIncomingConnections } = useBlockConnections(currentBlockId || '')
// Connections resize hook
const { handleMouseDown: handleConnectionsResizeMouseDown, isResizing } = useConnectionsResize({
subBlocksRef,
})
// Collaborative actions
const {
collaborativeSetBlockCanonicalMode,
collaborativeUpdateBlockName,
collaborativeToggleBlockAdvancedMode,
collaborativeBatchToggleLocked,
} = useCollaborativeWorkflow()
// Advanced mode toggle handler
const handleToggleAdvancedMode = useCallback(() => {
if (!currentBlockId || !userPermissions.canEdit) return
if (!currentBlockId || !canEditBlock) return
collaborativeToggleBlockAdvancedMode(currentBlockId)
}, [currentBlockId, userPermissions.canEdit, collaborativeToggleBlockAdvancedMode])
}, [currentBlockId, canEditBlock, collaborativeToggleBlockAdvancedMode])
// Rename state
const [isRenaming, setIsRenaming] = useState(false)
const [editedName, setEditedName] = useState('')
const nameInputRef = useRef<HTMLInputElement>(null)
/**
* Ref callback that auto-selects the input text when mounted.
*/
const nameInputRefCallback = useCallback((element: HTMLInputElement | null) => {
if (element) {
element.select()
}
}, [])
/**
* Handles starting the rename process.
*/
const handleStartRename = useCallback(() => {
if (!userPermissions.canEdit || !currentBlock) return
if (!canEditBlock || !currentBlock) return
setEditedName(currentBlock.name || '')
setIsRenaming(true)
}, [userPermissions.canEdit, currentBlock])
}, [canEditBlock, currentBlock])
/**
* Handles saving the renamed block.
@@ -251,7 +258,6 @@ export function Editor() {
if (trimmedName && trimmedName !== currentBlock?.name) {
const result = collaborativeUpdateBlockName(currentBlockId, trimmedName)
if (!result.success) {
// Keep rename mode open on error so user can correct the name
return
}
}
@@ -266,14 +272,6 @@ export function Editor() {
setEditedName('')
}, [])
// Focus input when entering rename mode
useEffect(() => {
if (isRenaming && nameInputRef.current) {
nameInputRef.current.select()
}
}, [isRenaming])
// Trigger rename mode when signaled from context menu
useEffect(() => {
if (shouldFocusRename && currentBlock) {
handleStartRename()
@@ -284,17 +282,13 @@ export function Editor() {
/**
* Handles opening documentation link in a new secure tab.
*/
const handleOpenDocs = () => {
const handleOpenDocs = useCallback(() => {
const docsLink = isSubflow ? subflowConfig?.docsLink : blockConfig?.docsLink
if (docsLink) {
window.open(docsLink, '_blank', 'noopener,noreferrer')
}
}
window.open(docsLink || 'https://docs.sim.ai/quick-reference', '_blank', 'noopener,noreferrer')
}, [isSubflow, subflowConfig?.docsLink, blockConfig?.docsLink])
// Get child workflow ID for workflow blocks
const childWorkflowId = isWorkflowBlock ? blockSubBlockValues?.workflowId : null
// Fetch child workflow state for preview (only for workflow blocks with a selected workflow)
const { data: childWorkflowState, isLoading: isLoadingChildWorkflow } =
useWorkflowState(childWorkflowId)
@@ -307,7 +301,6 @@ export function Editor() {
}
}, [childWorkflowId, workspaceId])
// Determine if connections are at minimum height (collapsed state)
const isConnectionsAtMinHeight = connectionsHeight <= 35
return (
@@ -328,7 +321,7 @@ export function Editor() {
)}
{isRenaming ? (
<input
ref={nameInputRef}
ref={nameInputRefCallback}
type='text'
value={editedName}
onChange={(e) => setEditedName(e.target.value)}
@@ -358,6 +351,36 @@ export function Editor() {
)}
</div>
<div className='flex shrink-0 items-center gap-[8px]'>
{/* Locked indicator - clickable to unlock if user has admin permissions, block is locked, and parent is not locked */}
{isLocked && currentBlock && (
<Tooltip.Root>
<Tooltip.Trigger asChild>
{userPermissions.canAdmin && currentBlock.locked && !isParentLocked ? (
<Button
variant='ghost'
className='p-0'
onClick={() => collaborativeBatchToggleLocked([currentBlockId!])}
aria-label='Unlock block'
>
<Unlock className='h-[14px] w-[14px] text-[var(--text-secondary)]' />
</Button>
) : (
<div className='flex items-center justify-center'>
<Lock className='h-[14px] w-[14px] text-[var(--text-secondary)]' />
</div>
)}
</Tooltip.Trigger>
<Tooltip.Content side='top'>
<p>
{isParentLocked
? 'Parent container is locked'
: userPermissions.canAdmin && currentBlock.locked
? 'Unlock block'
: 'Block is locked'}
</p>
</Tooltip.Content>
</Tooltip.Root>
)}
{/* Rename button */}
{currentBlock && (
<Tooltip.Root>
@@ -366,7 +389,7 @@ export function Editor() {
variant='ghost'
className='p-0'
onClick={isRenaming ? handleSaveRename : handleStartRename}
disabled={!userPermissions.canEdit}
disabled={!canEditBlock}
aria-label={isRenaming ? 'Save name' : 'Rename block'}
>
{isRenaming ? (
@@ -399,23 +422,21 @@ export function Editor() {
</Tooltip.Content>
</Tooltip.Root>
)} */}
{currentBlock && (isSubflow ? subflowConfig?.docsLink : blockConfig?.docsLink) && (
<Tooltip.Root>
<Tooltip.Trigger asChild>
<Button
variant='ghost'
className='p-0'
onClick={handleOpenDocs}
aria-label='Open documentation'
>
<BookOpen className='h-[14px] w-[14px]' />
</Button>
</Tooltip.Trigger>
<Tooltip.Content side='top'>
<p>Open docs</p>
</Tooltip.Content>
</Tooltip.Root>
)}
<Tooltip.Root>
<Tooltip.Trigger asChild>
<Button
variant='ghost'
className='p-0'
onClick={handleOpenDocs}
aria-label='Open documentation'
>
<BookOpen className='h-[14px] w-[14px]' />
</Button>
</Tooltip.Trigger>
<Tooltip.Content side='top'>
<p>Open docs</p>
</Tooltip.Content>
</Tooltip.Root>
</div>
</div>
@@ -434,7 +455,7 @@ export function Editor() {
incomingConnections={incomingConnections}
handleConnectionsResizeMouseDown={handleConnectionsResizeMouseDown}
toggleConnectionsCollapsed={toggleConnectionsCollapsed}
userCanEdit={userPermissions.canEdit}
userCanEdit={canEditBlock}
isConnectionsAtMinHeight={isConnectionsAtMinHeight}
/>
) : (
@@ -495,13 +516,7 @@ export function Editor() {
</div>
</div>
<div className='subblock-divider px-[2px] pt-[16px] pb-[13px]'>
<div
className='h-[1.25px]'
style={{
backgroundImage:
'repeating-linear-gradient(to right, var(--border) 0px, var(--border) 6px, transparent 6px, transparent 12px)',
}}
/>
<div className='h-[1.25px]' style={DASHED_DIVIDER_STYLE} />
</div>
</>
)}
@@ -542,14 +557,14 @@ export function Editor() {
config={subBlock}
isPreview={false}
subBlockValues={subBlockState}
disabled={!userPermissions.canEdit}
disabled={!canEditBlock}
fieldDiffStatus={undefined}
allowExpandInPreview={false}
canonicalToggle={
isCanonicalSwap && canonicalMode && canonicalId
? {
mode: canonicalMode,
disabled: !userPermissions.canEdit,
disabled: !canEditBlock,
onToggle: () => {
if (!currentBlockId) return
const nextMode =
@@ -566,28 +581,16 @@ export function Editor() {
/>
{showDivider && (
<div className='subblock-divider px-[2px] pt-[16px] pb-[13px]'>
<div
className='h-[1.25px]'
style={{
backgroundImage:
'repeating-linear-gradient(to right, var(--border) 0px, var(--border) 6px, transparent 6px, transparent 12px)',
}}
/>
<div className='h-[1.25px]' style={DASHED_DIVIDER_STYLE} />
</div>
)}
</div>
)
})}
{hasAdvancedOnlyFields && userPermissions.canEdit && (
{hasAdvancedOnlyFields && canEditBlock && (
<div className='flex items-center gap-[10px] px-[2px] pt-[14px] pb-[12px]'>
<div
className='h-[1.25px] flex-1'
style={{
backgroundImage:
'repeating-linear-gradient(to right, var(--border) 0px, var(--border) 6px, transparent 6px, transparent 12px)',
}}
/>
<div className='h-[1.25px] flex-1' style={DASHED_DIVIDER_STYLE} />
<button
type='button'
onClick={handleToggleAdvancedMode}
@@ -600,13 +603,7 @@ export function Editor() {
className={`h-[14px] w-[14px] transition-transform duration-200 ${displayAdvancedOptions ? 'rotate-180' : ''}`}
/>
</button>
<div
className='h-[1.25px] flex-1'
style={{
backgroundImage:
'repeating-linear-gradient(to right, var(--border) 0px, var(--border) 6px, transparent 6px, transparent 12px)',
}}
/>
<div className='h-[1.25px] flex-1' style={DASHED_DIVIDER_STYLE} />
</div>
)}
@@ -624,19 +621,13 @@ export function Editor() {
config={subBlock}
isPreview={false}
subBlockValues={subBlockState}
disabled={!userPermissions.canEdit}
disabled={!canEditBlock}
fieldDiffStatus={undefined}
allowExpandInPreview={false}
/>
{index < advancedOnlySubBlocks.length - 1 && (
<div className='subblock-divider px-[2px] pt-[16px] pb-[13px]'>
<div
className='h-[1.25px]'
style={{
backgroundImage:
'repeating-linear-gradient(to right, var(--border) 0px, var(--border) 6px, transparent 6px, transparent 12px)',
}}
/>
<div className='h-[1.25px]' style={DASHED_DIVIDER_STYLE} />
</div>
)}
</div>

View File

@@ -45,11 +45,13 @@ import { useWorkflowExecution } from '@/app/workspace/[workspaceId]/w/[workflowI
import { useDeleteWorkflow, useImportWorkflow } from '@/app/workspace/[workspaceId]/w/hooks'
import { usePermissionConfig } from '@/hooks/use-permission-config'
import { useChatStore } from '@/stores/chat/store'
import { useNotificationStore } from '@/stores/notifications/store'
import type { PanelTab } from '@/stores/panel'
import { usePanelStore, useVariablesStore as usePanelVariablesStore } from '@/stores/panel'
import { useVariablesStore } from '@/stores/variables/store'
import { getWorkflowWithValues } from '@/stores/workflows'
import { useWorkflowRegistry } from '@/stores/workflows/registry/store'
import { useWorkflowStore } from '@/stores/workflows/workflow/store'
const logger = createLogger('Panel')
/**
@@ -119,6 +121,11 @@ export const Panel = memo(function Panel() {
hydration.phase === 'state-loading'
const { handleAutoLayout: autoLayoutWithFitView } = useAutoLayout(activeWorkflowId || null)
// Check for locked blocks (disables auto-layout)
const hasLockedBlocks = useWorkflowStore((state) =>
Object.values(state.blocks).some((block) => block.locked)
)
// Delete workflow hook
const { isDeleting, handleDeleteWorkflow } = useDeleteWorkflow({
workspaceId,
@@ -230,11 +237,24 @@ export const Panel = memo(function Panel() {
setIsAutoLayouting(true)
try {
await autoLayoutWithFitView()
const result = await autoLayoutWithFitView()
if (!result.success && result.error) {
useNotificationStore.getState().addNotification({
level: 'info',
message: result.error,
workflowId: activeWorkflowId || undefined,
})
}
} finally {
setIsAutoLayouting(false)
}
}, [isExecuting, userPermissions.canEdit, isAutoLayouting, autoLayoutWithFitView])
}, [
isExecuting,
userPermissions.canEdit,
isAutoLayouting,
autoLayoutWithFitView,
activeWorkflowId,
])
/**
* Handles exporting workflow as JSON
@@ -404,7 +424,10 @@ export const Panel = memo(function Panel() {
<PopoverContent align='start' side='bottom' sideOffset={8}>
<PopoverItem
onClick={handleAutoLayout}
disabled={isExecuting || !userPermissions.canEdit || isAutoLayouting}
disabled={
isExecuting || !userPermissions.canEdit || isAutoLayouting || hasLockedBlocks
}
title={hasLockedBlocks ? 'Unlock blocks to use auto-layout' : undefined}
>
<Layout className='h-3 w-3' animate={isAutoLayouting} variant='clockwise' />
<span>Auto layout</span>

View File

@@ -80,6 +80,7 @@ export const SubflowNodeComponent = memo(({ data, id, selected }: NodeProps<Subf
: undefined
const isEnabled = currentBlock?.enabled ?? true
const isLocked = currentBlock?.locked ?? false
const isPreview = data?.isPreview || false
// Focus state
@@ -200,7 +201,10 @@ export const SubflowNodeComponent = memo(({ data, id, selected }: NodeProps<Subf
{blockName}
</span>
</div>
{!isEnabled && <Badge variant='gray-secondary'>disabled</Badge>}
<div className='flex items-center gap-1'>
{!isEnabled && <Badge variant='gray-secondary'>disabled</Badge>}
{isLocked && <Badge variant='gray-secondary'>locked</Badge>}
</div>
</div>
{!isPreview && (

View File

@@ -105,11 +105,9 @@ export function useTerminalFilters() {
})
}
// Apply sorting by timestamp
// Sort by executionOrder (monotonically increasing integer from server)
result = [...result].sort((a, b) => {
const timeA = new Date(a.timestamp).getTime()
const timeB = new Date(b.timestamp).getTime()
const comparison = timeA - timeB
const comparison = a.executionOrder - b.executionOrder
return sortConfig.direction === 'asc' ? comparison : -comparison
})
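A minimal sketch of why the integer executionOrder gives a stable ordering where timestamps can tie (the entry shape is trimmed to the two fields used here; values are illustrative):

interface OrderedEntry {
  timestamp: string
  executionOrder: number
}

// Two blocks finishing within the same millisecond sort arbitrarily by timestamp,
// but deterministically by the server-assigned executionOrder.
const entries: OrderedEntry[] = [
  { timestamp: '2026-02-04T03:48:13.000Z', executionOrder: 2 },
  { timestamp: '2026-02-04T03:48:13.000Z', executionOrder: 1 },
]

const direction: 'asc' | 'desc' = 'asc'
const sorted = [...entries].sort((a, b) => {
  const comparison = a.executionOrder - b.executionOrder
  return direction === 'asc' ? comparison : -comparison
})
// sorted[0].executionOrder === 1 regardless of the identical timestamps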

View File

@@ -24,6 +24,7 @@ import {
Tooltip,
} from '@/components/emcn'
import { getEnv, isTruthy } from '@/lib/core/config/env'
import { formatDuration } from '@/lib/core/utils/formatting'
import { useRegisterGlobalCommands } from '@/app/workspace/[workspaceId]/providers/global-commands-provider'
import { createCommands } from '@/app/workspace/[workspaceId]/utils/commands-utils'
import {
@@ -43,7 +44,6 @@ import {
type EntryNode,
type ExecutionGroup,
flattenBlockEntriesOnly,
formatDuration,
getBlockColor,
getBlockIcon,
groupEntriesByExecution,
@@ -128,7 +128,7 @@ const BlockRow = memo(function BlockRow({
<StatusDisplay
isRunning={isRunning}
isCanceled={isCanceled}
formattedDuration={formatDuration(entry.durationMs)}
formattedDuration={formatDuration(entry.durationMs, { precision: 2 }) ?? '-'}
/>
</span>
</div>
@@ -201,7 +201,7 @@ const IterationNodeRow = memo(function IterationNodeRow({
<StatusDisplay
isRunning={hasRunningChild}
isCanceled={hasCanceledChild}
formattedDuration={formatDuration(entry.durationMs)}
formattedDuration={formatDuration(entry.durationMs, { precision: 2 }) ?? '-'}
/>
</span>
</div>
@@ -314,7 +314,7 @@ const SubflowNodeRow = memo(function SubflowNodeRow({
<StatusDisplay
isRunning={hasRunningDescendant}
isCanceled={hasCanceledDescendant}
formattedDuration={formatDuration(entry.durationMs)}
formattedDuration={formatDuration(entry.durationMs, { precision: 2 }) ?? '-'}
/>
</span>
</div>
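These call sites assume the shared formatDuration helper accepts a precision option and returns undefined when no duration is available, hence the `?? '-'` fallback. A hedged sketch of a compatible implementation, mirroring the local helper removed in the next file; the real signature in '@/lib/core/utils/formatting' may differ:

// Illustrative only – the actual shared helper may have a different signature.
interface FormatDurationOptions {
  precision?: number
}

function formatDuration(ms?: number, options: FormatDurationOptions = {}): string | undefined {
  if (ms === undefined || ms === null) return undefined
  if (ms < 1000) return `${Math.round(ms)}ms`
  return `${(ms / 1000).toFixed(options.precision ?? 2)}s`
}

formatDuration(250) // '250ms'
formatDuration(1234, { precision: 2 }) // '1.23s'
formatDuration(undefined) ?? '-' // '-'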

View File

@@ -53,17 +53,6 @@ export function getBlockColor(blockType: string): string {
return '#6b7280'
}
/**
* Formats duration from milliseconds to readable format
*/
export function formatDuration(ms?: number): string {
if (ms === undefined || ms === null) return '-'
if (ms < 1000) {
return `${Math.round(ms)}ms`
}
return `${(ms / 1000).toFixed(2)}s`
}
/**
* Determines if a keyboard event originated from a text-editable element
*/
@@ -195,13 +184,9 @@ function buildEntryTree(entries: ConsoleEntry[]): EntryNode[] {
group.blocks.push(entry)
}
// Sort blocks within each iteration by start time ascending (oldest first, top-down)
// Sort blocks within each iteration by executionOrder ascending (oldest first, top-down)
for (const group of iterationGroupsMap.values()) {
group.blocks.sort((a, b) => {
const aStart = new Date(a.startedAt || a.timestamp).getTime()
const bStart = new Date(b.startedAt || b.timestamp).getTime()
return aStart - bStart
})
group.blocks.sort((a, b) => a.executionOrder - b.executionOrder)
}
// Group iterations by iterationType to create subflow parents
@@ -236,6 +221,8 @@ function buildEntryTree(entries: ConsoleEntry[]): EntryNode[] {
const totalDuration = allBlocks.reduce((sum, b) => sum + (b.durationMs || 0), 0)
// Create synthetic subflow parent entry
// Use the minimum executionOrder from all child blocks for proper ordering
const subflowExecutionOrder = Math.min(...allBlocks.map((b) => b.executionOrder))
const syntheticSubflow: ConsoleEntry = {
id: `subflow-${iterationType}-${firstIteration.blocks[0]?.executionId || 'unknown'}`,
timestamp: new Date(subflowStartMs).toISOString(),
@@ -245,6 +232,7 @@ function buildEntryTree(entries: ConsoleEntry[]): EntryNode[] {
blockType: iterationType,
executionId: firstIteration.blocks[0]?.executionId,
startedAt: new Date(subflowStartMs).toISOString(),
executionOrder: subflowExecutionOrder,
endedAt: new Date(subflowEndMs).toISOString(),
durationMs: totalDuration,
success: !allBlocks.some((b) => b.error),
@@ -262,6 +250,8 @@ function buildEntryTree(entries: ConsoleEntry[]): EntryNode[] {
)
const iterDuration = iterBlocks.reduce((sum, b) => sum + (b.durationMs || 0), 0)
// Use the minimum executionOrder from blocks in this iteration
const iterExecutionOrder = Math.min(...iterBlocks.map((b) => b.executionOrder))
const syntheticIteration: ConsoleEntry = {
id: `iteration-${iterationType}-${iterGroup.iterationCurrent}-${iterBlocks[0]?.executionId || 'unknown'}`,
timestamp: new Date(iterStartMs).toISOString(),
@@ -271,6 +261,7 @@ function buildEntryTree(entries: ConsoleEntry[]): EntryNode[] {
blockType: iterationType,
executionId: iterBlocks[0]?.executionId,
startedAt: new Date(iterStartMs).toISOString(),
executionOrder: iterExecutionOrder,
endedAt: new Date(iterEndMs).toISOString(),
durationMs: iterDuration,
success: !iterBlocks.some((b) => b.error),
@@ -311,14 +302,9 @@ function buildEntryTree(entries: ConsoleEntry[]): EntryNode[] {
nodeType: 'block' as const,
}))
// Combine all nodes and sort by start time ascending (oldest first, top-down)
// Combine all nodes and sort by executionOrder ascending (oldest first, top-down)
const allNodes = [...subflowNodes, ...regularNodes]
allNodes.sort((a, b) => {
const aStart = new Date(a.entry.startedAt || a.entry.timestamp).getTime()
const bStart = new Date(b.entry.startedAt || b.entry.timestamp).getTime()
return aStart - bStart
})
allNodes.sort((a, b) => a.entry.executionOrder - b.entry.executionOrder)
return allNodes
}

View File

@@ -30,6 +30,7 @@ import {
Textarea,
} from '@/components/emcn'
import { cn } from '@/lib/core/utils/cn'
import { formatDuration } from '@/lib/core/utils/formatting'
import { sanitizeForCopilot } from '@/lib/workflows/sanitization/json-sanitizer'
import { formatEditSequence } from '@/lib/workflows/training/compute-edit-sequence'
import { useCurrentWorkflow } from '@/app/workspace/[workspaceId]/w/[workflowId]/hooks/use-current-workflow'
@@ -575,7 +576,9 @@ export function TrainingModal() {
<span className='text-[var(--text-muted)]'>Duration:</span>{' '}
<span className='text-[var(--text-secondary)]'>
{dataset.metadata?.duration
? `${(dataset.metadata.duration / 1000).toFixed(1)}s`
? formatDuration(dataset.metadata.duration, {
precision: 1,
})
: 'N/A'}
</span>
</div>

View File

@@ -18,6 +18,8 @@ export interface UseBlockStateReturn {
diffStatus: DiffStatus
/** Whether this is a deleted block in diff mode */
isDeletedBlock: boolean
/** Whether the block is locked */
isLocked: boolean
}
/**
@@ -40,6 +42,11 @@ export function useBlockState(
? (data.blockState?.enabled ?? true)
: (currentBlock?.enabled ?? true)
// Determine if block is locked
const isLocked = data.isPreview
? (data.blockState?.locked ?? false)
: (currentBlock?.locked ?? false)
// Get diff status
const diffStatus: DiffStatus =
currentWorkflow.isDiffMode && currentBlock && hasDiffStatus(currentBlock)
@@ -68,5 +75,6 @@ export function useBlockState(
isActive,
diffStatus,
isDeletedBlock: isDeletedBlock ?? false,
isLocked,
}
}

View File

@@ -672,6 +672,7 @@ export const WorkflowBlock = memo(function WorkflowBlock({
currentWorkflow,
activeWorkflowId,
isEnabled,
isLocked,
handleClick,
hasRing,
ringStyles,
@@ -1100,7 +1101,7 @@ export const WorkflowBlock = memo(function WorkflowBlock({
{name}
</span>
</div>
<div className='relative z-10 flex flex-shrink-0 items-center gap-2'>
<div className='relative z-10 flex flex-shrink-0 items-center gap-1'>
{isWorkflowSelector &&
childWorkflowId &&
typeof childIsDeployed === 'boolean' &&
@@ -1133,6 +1134,7 @@ export const WorkflowBlock = memo(function WorkflowBlock({
</Tooltip.Root>
)}
{!isEnabled && <Badge variant='gray-secondary'>disabled</Badge>}
{isLocked && <Badge variant='gray-secondary'>locked</Badge>}
{type === 'schedule' && shouldShowScheduleBadge && scheduleInfo?.isDisabled && (
<Tooltip.Root>

View File

@@ -47,6 +47,7 @@ export function useBlockVisual({
isActive: isExecuting,
diffStatus,
isDeletedBlock,
isLocked,
} = useBlockState(blockId, currentWorkflow, data)
const currentBlockId = usePanelEditorStore((state) => state.currentBlockId)
@@ -103,6 +104,7 @@ export function useBlockVisual({
currentWorkflow,
activeWorkflowId,
isEnabled,
isLocked,
handleClick,
hasRing,
ringStyles,

View File

@@ -31,7 +31,8 @@ export function useCanvasContextMenu({ blocks, getNodes, setNodes }: UseCanvasCo
nodes.map((n) => {
const block = blocks[n.id]
const parentId = block?.data?.parentId
const parentType = parentId ? blocks[parentId]?.type : undefined
const parentBlock = parentId ? blocks[parentId] : undefined
const parentType = parentBlock?.type
return {
id: n.id,
type: block?.type || '',
@@ -39,6 +40,9 @@ export function useCanvasContextMenu({ blocks, getNodes, setNodes }: UseCanvasCo
horizontalHandles: block?.horizontalHandles ?? false,
parentId,
parentType,
locked: block?.locked ?? false,
isParentLocked: parentBlock?.locked ?? false,
isParentDisabled: parentBlock ? !parentBlock.enabled : false,
}
}),
[blocks]

View File

@@ -926,6 +926,7 @@ export function useWorkflowExecution() {
})
// Add entry to terminal immediately with isRunning=true
// Use server-provided executionOrder to ensure correct sort order
const startedAt = new Date().toISOString()
addConsole({
input: {},
@@ -933,6 +934,7 @@ export function useWorkflowExecution() {
success: undefined,
durationMs: undefined,
startedAt,
executionOrder: data.executionOrder,
endedAt: undefined,
workflowId: activeWorkflowId,
blockId: data.blockId,
@@ -948,8 +950,6 @@ export function useWorkflowExecution() {
},
onBlockCompleted: (data) => {
logger.info('onBlockCompleted received:', { data })
activeBlocksSet.delete(data.blockId)
setActiveBlocks(new Set(activeBlocksSet))
setBlockRunStatus(data.blockId, 'success')
@@ -976,6 +976,7 @@ export function useWorkflowExecution() {
success: true,
durationMs: data.durationMs,
startedAt,
executionOrder: data.executionOrder,
endedAt,
})
@@ -987,6 +988,7 @@ export function useWorkflowExecution() {
replaceOutput: data.output,
success: true,
durationMs: data.durationMs,
startedAt,
endedAt,
isRunning: false,
// Pass through iteration context for subflow grouping
@@ -1027,6 +1029,7 @@ export function useWorkflowExecution() {
error: data.error,
durationMs: data.durationMs,
startedAt,
executionOrder: data.executionOrder,
endedAt,
})
@@ -1039,6 +1042,7 @@ export function useWorkflowExecution() {
success: false,
error: data.error,
durationMs: data.durationMs,
startedAt,
endedAt,
isRunning: false,
// Pass through iteration context for subflow grouping
@@ -1163,6 +1167,7 @@ export function useWorkflowExecution() {
if (existingLogs.length === 0) {
// No blocks executed yet - this is a pre-execution error
// Use 0 for executionOrder so validation errors appear first
addConsole({
input: {},
output: {},
@@ -1170,6 +1175,7 @@ export function useWorkflowExecution() {
error: data.error,
durationMs: data.duration || 0,
startedAt: new Date(Date.now() - (data.duration || 0)).toISOString(),
executionOrder: 0,
endedAt: new Date().toISOString(),
workflowId: activeWorkflowId,
blockId: 'validation',
@@ -1237,6 +1243,7 @@ export function useWorkflowExecution() {
blockType = error.blockType || blockType
}
// Use MAX_SAFE_INTEGER so execution errors appear at the end of the log
useTerminalConsoleStore.getState().addConsole({
input: {},
output: {},
@@ -1244,6 +1251,7 @@ export function useWorkflowExecution() {
error: normalizedMessage,
durationMs: 0,
startedAt: new Date().toISOString(),
executionOrder: Number.MAX_SAFE_INTEGER,
endedAt: new Date().toISOString(),
workflowId: activeWorkflowId || '',
blockId,
@@ -1615,6 +1623,7 @@ export function useWorkflowExecution() {
success: true,
durationMs: data.durationMs,
startedAt,
executionOrder: data.executionOrder,
endedAt,
})
@@ -1624,6 +1633,7 @@ export function useWorkflowExecution() {
success: true,
durationMs: data.durationMs,
startedAt,
executionOrder: data.executionOrder,
endedAt,
workflowId,
blockId: data.blockId,
@@ -1653,6 +1663,7 @@ export function useWorkflowExecution() {
output: {},
success: false,
error: data.error,
executionOrder: data.executionOrder,
durationMs: data.durationMs,
startedAt,
endedAt,
@@ -1665,6 +1676,7 @@ export function useWorkflowExecution() {
error: data.error,
durationMs: data.durationMs,
startedAt,
executionOrder: data.executionOrder,
endedAt,
workflowId,
blockId: data.blockId,

View File

@@ -52,6 +52,16 @@ export async function applyAutoLayoutAndUpdateStore(
return { success: false, error: 'No blocks to layout' }
}
// Check for locked blocks - auto-layout is disabled when blocks are locked
const hasLockedBlocks = Object.values(blocks).some((block) => block.locked)
if (hasLockedBlocks) {
logger.info('Auto layout skipped: workflow contains locked blocks', { workflowId })
return {
success: false,
error: 'Auto-layout is disabled when blocks are locked. Unlock blocks to use auto-layout.',
}
}
// Merge with default options
const layoutOptions = {
spacing: {

View File

@@ -0,0 +1,72 @@
import type { BlockState } from '@/stores/workflows/workflow/types'
/**
* Result of filtering protected blocks from a deletion operation
*/
export interface FilterProtectedBlocksResult {
/** Block IDs that can be deleted (not protected) */
deletableIds: string[]
/** Block IDs that are protected and cannot be deleted */
protectedIds: string[]
/** Whether all blocks are protected (deletion should be cancelled entirely) */
allProtected: boolean
}
/**
* Checks if a block is protected from editing/deletion.
* A block is protected if it is locked or if its parent container is locked.
*
* @param blockId - The ID of the block to check
* @param blocks - Record of all blocks in the workflow
* @returns True if the block is protected
*/
export function isBlockProtected(blockId: string, blocks: Record<string, BlockState>): boolean {
const block = blocks[blockId]
if (!block) return false
// Block is locked directly
if (block.locked) return true
// Block is inside a locked container
const parentId = block.data?.parentId
if (parentId && blocks[parentId]?.locked) return true
return false
}
/**
* Checks if an edge is protected from modification.
* An edge is protected if either its source or target block is protected.
*
* @param edge - The edge to check (must have source and target)
* @param blocks - Record of all blocks in the workflow
* @returns True if the edge is protected
*/
export function isEdgeProtected(
edge: { source: string; target: string },
blocks: Record<string, BlockState>
): boolean {
return isBlockProtected(edge.source, blocks) || isBlockProtected(edge.target, blocks)
}
/**
* Filters out protected blocks from a list of block IDs for deletion.
* Protected blocks are those that are locked or inside a locked container.
*
* @param blockIds - Array of block IDs to filter
* @param blocks - Record of all blocks in the workflow
* @returns Result containing deletable IDs, protected IDs, and whether all are protected
*/
export function filterProtectedBlocks(
blockIds: string[],
blocks: Record<string, BlockState>
): FilterProtectedBlocksResult {
const protectedIds = blockIds.filter((id) => isBlockProtected(id, blocks))
const deletableIds = blockIds.filter((id) => !protectedIds.includes(id))
return {
deletableIds,
protectedIds,
allProtected: protectedIds.length === blockIds.length && blockIds.length > 0,
}
}
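A minimal usage sketch of the helpers above, using a hypothetical two-block workflow (block ids and the import path are illustrative, and BlockState fields are cast for brevity):

import type { BlockState } from '@/stores/workflows/workflow/types'
import { filterProtectedBlocks, isBlockProtected, isEdgeProtected } from './block-protection-utils'

// Hypothetical workflow state: a locked loop container with one child, plus a free-standing block.
const blocks = {
  'loop-1': { locked: true, data: {} },
  'child-1': { locked: false, data: { parentId: 'loop-1' } },
  'agent-1': { locked: false, data: {} },
} as unknown as Record<string, BlockState>

isBlockProtected('loop-1', blocks) // true – locked directly
isBlockProtected('child-1', blocks) // true – parent container is locked
isBlockProtected('agent-1', blocks) // false

isEdgeProtected({ source: 'agent-1', target: 'child-1' }, blocks) // true – target is protected

const { deletableIds, protectedIds, allProtected } = filterProtectedBlocks(
  ['loop-1', 'child-1', 'agent-1'],
  blocks
)
// deletableIds === ['agent-1'], protectedIds === ['loop-1', 'child-1'], allProtected === false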

View File

@@ -1,4 +1,5 @@
export * from './auto-layout-utils'
export * from './block-protection-utils'
export * from './block-ring-utils'
export * from './node-position-utils'
export * from './workflow-canvas-helpers'

View File

@@ -111,6 +111,7 @@ export async function executeWorkflowWithFullLogging(
success: true,
durationMs: event.data.durationMs,
startedAt: new Date(Date.now() - event.data.durationMs).toISOString(),
executionOrder: event.data.executionOrder,
endedAt: new Date().toISOString(),
workflowId: activeWorkflowId,
blockId: event.data.blockId,
@@ -140,6 +141,7 @@ export async function executeWorkflowWithFullLogging(
error: event.data.error,
durationMs: event.data.durationMs,
startedAt: new Date(Date.now() - event.data.durationMs).toISOString(),
executionOrder: event.data.executionOrder,
endedAt: new Date().toISOString(),
workflowId: activeWorkflowId,
blockId: event.data.blockId,

View File

@@ -55,7 +55,10 @@ import {
clearDragHighlights,
computeClampedPositionUpdates,
estimateBlockDimensions,
filterProtectedBlocks,
getClampedPositionForNode,
isBlockProtected,
isEdgeProtected,
isInEditableElement,
resolveParentChildSelectionConflicts,
validateTriggerPaste,
@@ -543,6 +546,7 @@ const WorkflowContent = React.memo(() => {
collaborativeBatchRemoveBlocks,
collaborativeBatchToggleBlockEnabled,
collaborativeBatchToggleBlockHandles,
collaborativeBatchToggleLocked,
undo,
redo,
} = useCollaborativeWorkflow()
@@ -1069,8 +1073,27 @@ const WorkflowContent = React.memo(() => {
const handleContextDelete = useCallback(() => {
const blockIds = contextMenuBlocks.map((b) => b.id)
collaborativeBatchRemoveBlocks(blockIds)
}, [contextMenuBlocks, collaborativeBatchRemoveBlocks])
const { deletableIds, protectedIds, allProtected } = filterProtectedBlocks(blockIds, blocks)
if (protectedIds.length > 0) {
if (allProtected) {
addNotification({
level: 'info',
message: 'Cannot delete locked blocks or blocks inside locked containers',
workflowId: activeWorkflowId || undefined,
})
return
}
addNotification({
level: 'info',
message: `Skipped ${protectedIds.length} protected block(s)`,
workflowId: activeWorkflowId || undefined,
})
}
if (deletableIds.length > 0) {
collaborativeBatchRemoveBlocks(deletableIds)
}
}, [contextMenuBlocks, collaborativeBatchRemoveBlocks, addNotification, activeWorkflowId, blocks])
const handleContextToggleEnabled = useCallback(() => {
const blockIds = contextMenuBlocks.map((block) => block.id)
@@ -1082,6 +1105,11 @@ const WorkflowContent = React.memo(() => {
collaborativeBatchToggleBlockHandles(blockIds)
}, [contextMenuBlocks, collaborativeBatchToggleBlockHandles])
const handleContextToggleLocked = useCallback(() => {
const blockIds = contextMenuBlocks.map((block) => block.id)
collaborativeBatchToggleLocked(blockIds)
}, [contextMenuBlocks, collaborativeBatchToggleLocked])
const handleContextRemoveFromSubflow = useCallback(() => {
const blocksToRemove = contextMenuBlocks.filter(
(block) => block.parentId && (block.parentType === 'loop' || block.parentType === 'parallel')
@@ -1145,7 +1173,7 @@ const WorkflowContent = React.memo(() => {
block.parentId && (block.parentType === 'loop' || block.parentType === 'parallel')
if (isInsideSubflow) return { canRun: false, reason: 'Cannot run from inside subflow' }
if (!dependenciesSatisfied) return { canRun: false, reason: 'Run upstream blocks first' }
if (!dependenciesSatisfied) return { canRun: false, reason: 'Disabled: Run Blocks Before' }
if (isNoteBlock) return { canRun: false, reason: undefined }
if (isExecuting) return { canRun: false, reason: undefined }
@@ -1951,7 +1979,6 @@ const WorkflowContent = React.memo(() => {
const loadingWorkflowRef = useRef<string | null>(null)
const currentWorkflowExists = Boolean(workflows[workflowIdParam])
/** Initializes workflow when it exists in registry and needs hydration. */
useEffect(() => {
const currentId = workflowIdParam
const currentWorkspaceHydration = hydration.workspaceId
@@ -2128,6 +2155,7 @@ const WorkflowContent = React.memo(() => {
parentId: block.data?.parentId,
extent: block.data?.extent || undefined,
dragHandle: '.workflow-drag-handle',
draggable: !isBlockProtected(block.id, blocks),
data: {
...block.data,
name: block.name,
@@ -2163,6 +2191,7 @@ const WorkflowContent = React.memo(() => {
position,
parentId: block.data?.parentId,
dragHandle,
draggable: !isBlockProtected(block.id, blocks),
extent: (() => {
// Clamp children to subflow body (exclude header)
const parentId = block.data?.parentId as string | undefined
@@ -2491,12 +2520,18 @@ const WorkflowContent = React.memo(() => {
const edgeIdsToRemove = changes
.filter((change: any) => change.type === 'remove')
.map((change: any) => change.id)
.filter((edgeId: string) => {
// Prevent removing edges connected to protected blocks
const edge = edges.find((e) => e.id === edgeId)
if (!edge) return true
return !isEdgeProtected(edge, blocks)
})
if (edgeIdsToRemove.length > 0) {
collaborativeBatchRemoveEdges(edgeIdsToRemove)
}
},
[collaborativeBatchRemoveEdges]
[collaborativeBatchRemoveEdges, edges, blocks]
)
/**
@@ -2558,6 +2593,16 @@ const WorkflowContent = React.memo(() => {
if (!sourceNode || !targetNode) return
// Prevent connections to/from protected blocks
if (isEdgeProtected(connection, blocks)) {
addNotification({
level: 'info',
message: 'Cannot connect to locked blocks or blocks inside locked containers',
workflowId: activeWorkflowId || undefined,
})
return
}
// Get parent information (handle container start node case)
const sourceParentId =
blocks[sourceNode.id]?.data?.parentId ||
@@ -2620,7 +2665,7 @@ const WorkflowContent = React.memo(() => {
connectionCompletedRef.current = true
}
},
[addEdge, getNodes, blocks]
[addEdge, getNodes, blocks, addNotification, activeWorkflowId]
)
/**
@@ -2715,6 +2760,9 @@ const WorkflowContent = React.memo(() => {
// Only consider container nodes that aren't the dragged node
if (n.type !== 'subflowNode' || n.id === node.id) return false
// Don't allow dropping into locked containers
if (blocks[n.id]?.locked) return false
// Get the container's absolute position
const containerAbsolutePos = getNodeAbsolutePosition(n.id)
@@ -2807,6 +2855,8 @@ const WorkflowContent = React.memo(() => {
/** Captures initial parent ID and position when drag starts. */
const onNodeDragStart = useCallback(
(_event: React.MouseEvent, node: any) => {
// Note: Protected blocks are already non-draggable via the `draggable` node property
// Store the original parent ID when starting to drag
const currentParentId = blocks[node.id]?.data?.parentId || null
setDragStartParentId(currentParentId)
@@ -2835,7 +2885,7 @@ const WorkflowContent = React.memo(() => {
}
})
},
[blocks, setDragStartPosition, getNodes, potentialParentId, setPotentialParentId]
[blocks, setDragStartPosition, getNodes, setPotentialParentId]
)
/** Handles node drag stop to establish parent-child relationships. */
@@ -2897,6 +2947,18 @@ const WorkflowContent = React.memo(() => {
// Don't process parent changes if the node hasn't actually changed parent or is being moved within same parent
if (potentialParentId === dragStartParentId) return
// Prevent moving locked blocks out of locked containers
// Unlocked blocks (e.g., duplicates) can be moved out freely
if (dragStartParentId && blocks[dragStartParentId]?.locked && blocks[node.id]?.locked) {
addNotification({
level: 'info',
message: 'Cannot move locked blocks out of locked containers',
workflowId: activeWorkflowId || undefined,
})
setPotentialParentId(dragStartParentId) // Reset to original parent
return
}
// Check if this is a starter block - starter blocks should never be in containers
const isStarterBlock = node.data?.type === 'starter'
if (isStarterBlock) {
@@ -3293,6 +3355,16 @@ const WorkflowContent = React.memo(() => {
/** Stable delete handler to avoid creating new function references per edge. */
const handleEdgeDelete = useCallback(
(edgeId: string) => {
// Prevent removing edges connected to protected blocks
const edge = edges.find((e) => e.id === edgeId)
if (edge && isEdgeProtected(edge, blocks)) {
addNotification({
level: 'info',
message: 'Cannot remove connections from locked blocks',
workflowId: activeWorkflowId || undefined,
})
return
}
removeEdge(edgeId)
// Remove this edge from selection (find by edge ID value)
setSelectedEdges((prev) => {
@@ -3305,7 +3377,7 @@ const WorkflowContent = React.memo(() => {
return next
})
},
[removeEdge]
[removeEdge, edges, blocks, addNotification, activeWorkflowId]
)
/** Transforms edges to include selection state and delete handlers. Memoized to prevent re-renders. */
@@ -3346,9 +3418,15 @@ const WorkflowContent = React.memo(() => {
// Handle edge deletion first (edges take priority if selected)
if (selectedEdges.size > 0) {
// Get all selected edge IDs and batch delete them
const edgeIds = Array.from(selectedEdges.values())
collaborativeBatchRemoveEdges(edgeIds)
// Get all selected edge IDs and filter out edges connected to protected blocks
const edgeIds = Array.from(selectedEdges.values()).filter((edgeId) => {
const edge = edges.find((e) => e.id === edgeId)
if (!edge) return true
return !isEdgeProtected(edge, blocks)
})
if (edgeIds.length > 0) {
collaborativeBatchRemoveEdges(edgeIds)
}
setSelectedEdges(new Map())
return
}
@@ -3365,7 +3443,29 @@ const WorkflowContent = React.memo(() => {
event.preventDefault()
const selectedIds = selectedNodes.map((node) => node.id)
collaborativeBatchRemoveBlocks(selectedIds)
const { deletableIds, protectedIds, allProtected } = filterProtectedBlocks(
selectedIds,
blocks
)
if (protectedIds.length > 0) {
if (allProtected) {
addNotification({
level: 'info',
message: 'Cannot delete locked blocks or blocks inside locked containers',
workflowId: activeWorkflowId || undefined,
})
return
}
addNotification({
level: 'info',
message: `Skipped ${protectedIds.length} protected block(s)`,
workflowId: activeWorkflowId || undefined,
})
}
if (deletableIds.length > 0) {
collaborativeBatchRemoveBlocks(deletableIds)
}
}
window.addEventListener('keydown', handleKeyDown)
@@ -3376,6 +3476,10 @@ const WorkflowContent = React.memo(() => {
getNodes,
collaborativeBatchRemoveBlocks,
effectivePermissions.canEdit,
blocks,
edges,
addNotification,
activeWorkflowId,
])
return (
@@ -3496,12 +3600,18 @@ const WorkflowContent = React.memo(() => {
(b) => b.parentId && (b.parentType === 'loop' || b.parentType === 'parallel')
)}
canRunFromBlock={runFromBlockState.canRun}
disableEdit={!effectivePermissions.canEdit}
disableEdit={
!effectivePermissions.canEdit ||
contextMenuBlocks.some((b) => b.locked || b.isParentLocked)
}
userCanEdit={effectivePermissions.canEdit}
isExecuting={isExecuting}
isPositionalTrigger={
contextMenuBlocks.length === 1 &&
edges.filter((e) => e.target === contextMenuBlocks[0]?.id).length === 0
}
onToggleLocked={handleContextToggleLocked}
canAdmin={effectivePermissions.canAdmin}
/>
<CanvasMenu
@@ -3524,6 +3634,7 @@ const WorkflowContent = React.memo(() => {
disableEdit={!effectivePermissions.canEdit}
canUndo={canUndo}
canRedo={canRedo}
hasLockedBlocks={Object.values(blocks).some((b) => b.locked)}
/>
</>
)}

View File

@@ -246,7 +246,6 @@ export function CredentialSets() {
setNewSetDescription('')
setNewSetProvider('google-email')
// Open detail view for the newly created group
if (result?.credentialSet) {
setViewingSet(result.credentialSet)
}
@@ -336,7 +335,6 @@ export function CredentialSets() {
email,
})
// Start 60s cooldown
setResendCooldowns((prev) => ({ ...prev, [invitationId]: 60 }))
const interval = setInterval(() => {
setResendCooldowns((prev) => {
@@ -393,7 +391,6 @@ export function CredentialSets() {
return <GmailIcon className='h-4 w-4' />
}
// All hooks must be called before any early returns
const activeMemberships = useMemo(
() => memberships.filter((m) => m.status === 'active'),
[memberships]
@@ -447,7 +444,6 @@ export function CredentialSets() {
<div className='flex h-full flex-col gap-[16px]'>
<div className='min-h-0 flex-1 overflow-y-auto'>
<div className='flex flex-col gap-[16px]'>
{/* Group Info */}
<div className='flex items-center gap-[16px]'>
<div className='flex items-center gap-[8px]'>
<span className='font-medium text-[13px] text-[var(--text-primary)]'>
@@ -471,7 +467,6 @@ export function CredentialSets() {
</div>
</div>
{/* Invite Section - Email Tags Input */}
<div className='flex flex-col gap-[4px]'>
<div className='flex items-center gap-[8px]'>
<TagInput
@@ -495,7 +490,6 @@ export function CredentialSets() {
{emailError && <p className='text-[12px] text-[var(--text-error)]'>{emailError}</p>}
</div>
{/* Members List - styled like team members */}
<div className='flex flex-col gap-[16px]'>
<h4 className='font-medium text-[14px] text-[var(--text-primary)]'>Members</h4>
@@ -519,7 +513,6 @@ export function CredentialSets() {
</p>
) : (
<div className='flex flex-col gap-[16px]'>
{/* Active Members */}
{activeMembers.map((member) => {
const name = member.userName || 'Unknown'
const avatarInitial = name.charAt(0).toUpperCase()
@@ -572,7 +565,6 @@ export function CredentialSets() {
)
})}
{/* Pending Invitations */}
{pendingInvitations.map((invitation) => {
const email = invitation.email || 'Unknown'
const emailPrefix = email.split('@')[0]
@@ -641,7 +633,6 @@ export function CredentialSets() {
</div>
</div>
{/* Footer Actions */}
<div className='mt-auto flex items-center justify-end'>
<Button onClick={handleBackToList} variant='tertiary'>
Back
@@ -822,7 +813,6 @@ export function CredentialSets() {
</div>
</div>
{/* Create Polling Group Modal */}
<Modal open={showCreateModal} onOpenChange={handleCloseCreateModal}>
<ModalContent size='sm'>
<ModalHeader>Create Polling Group</ModalHeader>
@@ -895,7 +885,6 @@ export function CredentialSets() {
</ModalContent>
</Modal>
{/* Leave Confirmation Modal */}
<Modal open={!!leavingMembership} onOpenChange={() => setLeavingMembership(null)}>
<ModalContent size='sm'>
<ModalHeader>Leave Polling Group</ModalHeader>
@@ -923,7 +912,6 @@ export function CredentialSets() {
</ModalContent>
</Modal>
{/* Delete Confirmation Modal */}
<Modal open={!!deletingSet} onOpenChange={() => setDeletingSet(null)}>
<ModalContent size='sm'>
<ModalHeader>Delete Polling Group</ModalHeader>

View File

@@ -1,4 +1,3 @@
export { AccessControl } from './access-control/access-control'
export { ApiKeys } from './api-keys/api-keys'
export { BYOK } from './byok/byok'
export { Copilot } from './copilot/copilot'
@@ -10,7 +9,6 @@ export { Files as FileUploads } from './files/files'
export { General } from './general/general'
export { Integrations } from './integrations/integrations'
export { MCP } from './mcp/mcp'
export { SSO } from './sso/sso'
export { Subscription } from './subscription/subscription'
export { TeamManagement } from './team-management/team-management'
export { WorkflowMcpServers } from './workflow-mcp-servers/workflow-mcp-servers'

View File

@@ -407,14 +407,12 @@ export function MCP({ initialServerId }: MCPProps) {
const [urlScrollLeft, setUrlScrollLeft] = useState(0)
const [headerScrollLeft, setHeaderScrollLeft] = useState<Record<string, number>>({})
// Auto-select server when initialServerId is provided
useEffect(() => {
if (initialServerId && servers.some((s) => s.id === initialServerId)) {
setSelectedServerId(initialServerId)
}
}, [initialServerId, servers])
// Force refresh tools when entering server detail view to detect stale schemas
useEffect(() => {
if (selectedServerId) {
forceRefreshTools(workspaceId)
@@ -675,6 +673,7 @@ export function MCP({ initialServerId }: MCPProps) {
/**
* Opens the detail view for a specific server.
* Note: Tool refresh is handled by the useEffect that watches selectedServerId
*/
const handleViewDetails = useCallback((serverId: string) => {
setSelectedServerId(serverId)
@@ -717,7 +716,6 @@ export function MCP({ initialServerId }: MCPProps) {
`Refreshed MCP server: ${serverId}, workflows updated: ${result.workflowsUpdated}`
)
// If the active workflow was updated, reload its subblock values from DB
const activeWorkflowId = useWorkflowRegistry.getState().activeWorkflowId
if (activeWorkflowId && result.updatedWorkflowIds?.includes(activeWorkflowId)) {
logger.info(`Active workflow ${activeWorkflowId} was updated, reloading subblock values`)

View File

@@ -13,8 +13,8 @@ import { SlackMonoIcon } from '@/components/icons'
import type { PlanFeature } from '@/app/workspace/[workspaceId]/w/components/sidebar/components/settings-modal/components/subscription/components/plan-card'
export const PRO_PLAN_FEATURES: PlanFeature[] = [
{ icon: Zap, text: '25 runs per minute (sync)' },
{ icon: Clock, text: '200 runs per minute (async)' },
{ icon: Zap, text: '150 runs per minute (sync)' },
{ icon: Clock, text: '1,000 runs per minute (async)' },
{ icon: HardDrive, text: '50GB file storage' },
{ icon: Building2, text: 'Unlimited workspaces' },
{ icon: Users, text: 'Unlimited invites' },
@@ -22,8 +22,8 @@ export const PRO_PLAN_FEATURES: PlanFeature[] = [
]
export const TEAM_PLAN_FEATURES: PlanFeature[] = [
{ icon: Zap, text: '75 runs per minute (sync)' },
{ icon: Clock, text: '500 runs per minute (async)' },
{ icon: Zap, text: '300 runs per minute (sync)' },
{ icon: Clock, text: '2,500 runs per minute (async)' },
{ icon: HardDrive, text: '500GB file storage (pooled)' },
{ icon: Building2, text: 'Unlimited workspaces' },
{ icon: Users, text: 'Unlimited invites' },

View File

@@ -41,7 +41,6 @@ import { getEnv, isTruthy } from '@/lib/core/config/env'
import { isHosted } from '@/lib/core/config/feature-flags'
import { getUserRole } from '@/lib/workspaces/organization'
import {
AccessControl,
ApiKeys,
BYOK,
Copilot,
@@ -53,16 +52,18 @@ import {
General,
Integrations,
MCP,
SSO,
Subscription,
TeamManagement,
WorkflowMcpServers,
} from '@/app/workspace/[workspaceId]/w/components/sidebar/components/settings-modal/components'
import { TemplateProfile } from '@/app/workspace/[workspaceId]/w/components/sidebar/components/settings-modal/components/template-profile/template-profile'
import { AccessControl } from '@/ee/access-control/components/access-control'
import { SSO } from '@/ee/sso/components/sso-settings'
import { ssoKeys, useSSOProviders } from '@/ee/sso/hooks/sso'
import { generalSettingsKeys, useGeneralSettings } from '@/hooks/queries/general-settings'
import { organizationKeys, useOrganizations } from '@/hooks/queries/organization'
import { ssoKeys, useSSOProviders } from '@/hooks/queries/sso'
import { subscriptionKeys, useSubscriptionData } from '@/hooks/queries/subscription'
import { useSuperUserStatus } from '@/hooks/queries/user-profile'
import { usePermissionConfig } from '@/hooks/use-permission-config'
import { useSettingsModalStore } from '@/stores/modals/settings/store'
@@ -204,13 +205,13 @@ export function SettingsModal({ open, onOpenChange }: SettingsModalProps) {
const [activeSection, setActiveSection] = useState<SettingsSection>('general')
const { initialSection, mcpServerId, clearInitialState } = useSettingsModalStore()
const [pendingMcpServerId, setPendingMcpServerId] = useState<string | null>(null)
const [isSuperUser, setIsSuperUser] = useState(false)
const { data: session } = useSession()
const queryClient = useQueryClient()
const { data: organizationsData } = useOrganizations()
const { data: generalSettings } = useGeneralSettings()
const { data: subscriptionData } = useSubscriptionData({ enabled: isBillingEnabled })
const { data: ssoProvidersData, isLoading: isLoadingSSO } = useSSOProviders()
const { data: superUserData } = useSuperUserStatus(session?.user?.id)
const activeOrganization = organizationsData?.activeOrganization
const { config: permissionConfig } = usePermissionConfig()
@@ -229,22 +230,7 @@ export function SettingsModal({ open, onOpenChange }: SettingsModalProps) {
const hasEnterprisePlan = subscriptionStatus.isEnterprise
const hasOrganization = !!activeOrganization?.id
// Fetch superuser status
useEffect(() => {
const fetchSuperUserStatus = async () => {
if (!userId) return
try {
const response = await fetch('/api/user/super-user')
if (response.ok) {
const data = await response.json()
setIsSuperUser(data.isSuperUser)
}
} catch {
setIsSuperUser(false)
}
}
fetchSuperUserStatus()
}, [userId])
const isSuperUser = superUserData?.isSuperUser ?? false
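For reference, a rough sketch of how the React Query hook that replaces the removed fetch effect could look, assuming it wraps the same /api/user/super-user endpoint (the real hook in @/hooks/queries/user-profile is not part of this diff; the query key and options below are assumptions):
import { useQuery } from '@tanstack/react-query'
// Hypothetical sketch of useSuperUserStatus.
export function useSuperUserStatus(userId?: string) {
  return useQuery({
    queryKey: ['user-profile', 'super-user', userId],
    queryFn: async () => {
      const response = await fetch('/api/user/super-user')
      if (!response.ok) return { isSuperUser: false }
      return (await response.json()) as { isSuperUser: boolean }
    },
    enabled: Boolean(userId), // only fetch once a session user id is available
  })
}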
// Memoize SSO provider ownership check
const isSSOProviderOwner = useMemo(() => {
@@ -328,7 +314,13 @@ export function SettingsModal({ open, onOpenChange }: SettingsModalProps) {
generalSettings?.superUserModeEnabled,
])
// Memoized callbacks to prevent infinite loops in child components
const effectiveActiveSection = useMemo(() => {
if (!isBillingEnabled && (activeSection === 'subscription' || activeSection === 'team')) {
return 'general'
}
return activeSection
}, [activeSection])
const registerEnvironmentBeforeLeaveHandler = useCallback(
(handler: (onProceed: () => void) => void) => {
environmentBeforeLeaveHandler.current = handler
@@ -342,19 +334,18 @@ export function SettingsModal({ open, onOpenChange }: SettingsModalProps) {
const handleSectionChange = useCallback(
(sectionId: SettingsSection) => {
if (sectionId === activeSection) return
if (sectionId === effectiveActiveSection) return
if (activeSection === 'environment' && environmentBeforeLeaveHandler.current) {
if (effectiveActiveSection === 'environment' && environmentBeforeLeaveHandler.current) {
environmentBeforeLeaveHandler.current(() => setActiveSection(sectionId))
return
}
setActiveSection(sectionId)
},
[activeSection]
[effectiveActiveSection]
)
// Apply initial section from store when modal opens
useEffect(() => {
if (open && initialSection) {
setActiveSection(initialSection)
@@ -365,7 +356,6 @@ export function SettingsModal({ open, onOpenChange }: SettingsModalProps) {
}
}, [open, initialSection, mcpServerId, clearInitialState])
// Clear pending server ID when section changes away from MCP
useEffect(() => {
if (activeSection !== 'mcp') {
setPendingMcpServerId(null)
@@ -391,14 +381,6 @@ export function SettingsModal({ open, onOpenChange }: SettingsModalProps) {
}
}, [onOpenChange])
// Redirect away from billing tabs if billing is disabled
useEffect(() => {
if (!isBillingEnabled && (activeSection === 'subscription' || activeSection === 'team')) {
setActiveSection('general')
}
}, [activeSection])
// Prefetch functions for React Query
const prefetchGeneral = () => {
queryClient.prefetchQuery({
queryKey: generalSettingsKeys.settings(),
@@ -489,9 +471,17 @@ export function SettingsModal({ open, onOpenChange }: SettingsModalProps) {
// Handle dialog close - delegate to environment component if it's active
const handleDialogOpenChange = (newOpen: boolean) => {
if (!newOpen && activeSection === 'environment' && environmentBeforeLeaveHandler.current) {
if (
!newOpen &&
effectiveActiveSection === 'environment' &&
environmentBeforeLeaveHandler.current
) {
environmentBeforeLeaveHandler.current(() => onOpenChange(false))
} else if (!newOpen && activeSection === 'integrations' && integrationsCloseHandler.current) {
} else if (
!newOpen &&
effectiveActiveSection === 'integrations' &&
integrationsCloseHandler.current
) {
integrationsCloseHandler.current(newOpen)
} else {
onOpenChange(newOpen)
@@ -522,7 +512,7 @@ export function SettingsModal({ open, onOpenChange }: SettingsModalProps) {
{sectionItems.map((item) => (
<SModalSidebarItem
key={item.id}
active={activeSection === item.id}
active={effectiveActiveSection === item.id}
icon={<item.icon />}
onMouseEnter={() => handlePrefetch(item.id)}
onClick={() => handleSectionChange(item.id)}
@@ -538,35 +528,36 @@ export function SettingsModal({ open, onOpenChange }: SettingsModalProps) {
<SModalMain>
<SModalMainHeader>
{navigationItems.find((item) => item.id === activeSection)?.label || activeSection}
{navigationItems.find((item) => item.id === effectiveActiveSection)?.label ||
effectiveActiveSection}
</SModalMainHeader>
<SModalMainBody>
{activeSection === 'general' && <General onOpenChange={onOpenChange} />}
{activeSection === 'environment' && (
{effectiveActiveSection === 'general' && <General onOpenChange={onOpenChange} />}
{effectiveActiveSection === 'environment' && (
<EnvironmentVariables
registerBeforeLeaveHandler={registerEnvironmentBeforeLeaveHandler}
/>
)}
{activeSection === 'template-profile' && <TemplateProfile />}
{activeSection === 'integrations' && (
{effectiveActiveSection === 'template-profile' && <TemplateProfile />}
{effectiveActiveSection === 'integrations' && (
<Integrations
onOpenChange={onOpenChange}
registerCloseHandler={registerIntegrationsCloseHandler}
/>
)}
{activeSection === 'credential-sets' && <CredentialSets />}
{activeSection === 'access-control' && <AccessControl />}
{activeSection === 'apikeys' && <ApiKeys onOpenChange={onOpenChange} />}
{activeSection === 'files' && <FileUploads />}
{isBillingEnabled && activeSection === 'subscription' && <Subscription />}
{isBillingEnabled && activeSection === 'team' && <TeamManagement />}
{activeSection === 'sso' && <SSO />}
{activeSection === 'byok' && <BYOK />}
{activeSection === 'copilot' && <Copilot />}
{activeSection === 'mcp' && <MCP initialServerId={pendingMcpServerId} />}
{activeSection === 'custom-tools' && <CustomTools />}
{activeSection === 'workflow-mcp-servers' && <WorkflowMcpServers />}
{activeSection === 'debug' && <Debug />}
{effectiveActiveSection === 'credential-sets' && <CredentialSets />}
{effectiveActiveSection === 'access-control' && <AccessControl />}
{effectiveActiveSection === 'apikeys' && <ApiKeys onOpenChange={onOpenChange} />}
{effectiveActiveSection === 'files' && <FileUploads />}
{isBillingEnabled && effectiveActiveSection === 'subscription' && <Subscription />}
{isBillingEnabled && effectiveActiveSection === 'team' && <TeamManagement />}
{effectiveActiveSection === 'sso' && <SSO />}
{effectiveActiveSection === 'byok' && <BYOK />}
{effectiveActiveSection === 'copilot' && <Copilot />}
{effectiveActiveSection === 'mcp' && <MCP initialServerId={pendingMcpServerId} />}
{effectiveActiveSection === 'custom-tools' && <CustomTools />}
{effectiveActiveSection === 'workflow-mcp-servers' && <WorkflowMcpServers />}
{effectiveActiveSection === 'debug' && <Debug />}
</SModalMainBody>
</SModalMain>
</SModalContent>

View File

@@ -231,6 +231,8 @@ export function FolderItem({
const isFolderSelected = store.selectedFolders.has(folder.id)
if (!isFolderSelected) {
// Replace selection with just this folder (Finder/Explorer pattern)
store.clearAllSelection()
store.selectFolder(folder.id)
}

View File

@@ -189,6 +189,9 @@ export function WorkflowItem({
const isCurrentlySelected = store.selectedWorkflows.has(workflow.id)
if (!isCurrentlySelected) {
// Replace selection with just this item (Finder/Explorer pattern)
// This clears both workflow and folder selections
store.clearAllSelection()
store.selectWorkflow(workflow.id)
}

View File

@@ -49,6 +49,7 @@ interface SocketContextType {
socket: Socket | null
isConnected: boolean
isConnecting: boolean
isReconnecting: boolean
authFailed: boolean
currentWorkflowId: string | null
currentSocketId: string | null
@@ -66,9 +67,16 @@ interface SocketContextType {
blockId: string,
subblockId: string,
value: any,
operationId?: string
operationId: string | undefined,
workflowId: string
) => void
emitVariableUpdate: (
variableId: string,
field: string,
value: any,
operationId: string | undefined,
workflowId: string
) => void
emitVariableUpdate: (variableId: string, field: string, value: any, operationId?: string) => void
emitCursorUpdate: (cursor: { x: number; y: number } | null) => void
emitSelectionUpdate: (selection: { type: 'block' | 'edge' | 'none'; id?: string }) => void
@@ -88,6 +96,7 @@ const SocketContext = createContext<SocketContextType>({
socket: null,
isConnected: false,
isConnecting: false,
isReconnecting: false,
authFailed: false,
currentWorkflowId: null,
currentSocketId: null,
@@ -122,6 +131,7 @@ export function SocketProvider({ children, user }: SocketProviderProps) {
const [socket, setSocket] = useState<Socket | null>(null)
const [isConnected, setIsConnected] = useState(false)
const [isConnecting, setIsConnecting] = useState(false)
const [isReconnecting, setIsReconnecting] = useState(false)
const [currentWorkflowId, setCurrentWorkflowId] = useState<string | null>(null)
const [currentSocketId, setCurrentSocketId] = useState<string | null>(null)
const [presenceUsers, setPresenceUsers] = useState<PresenceUser[]>([])
@@ -236,20 +246,19 @@ export function SocketProvider({ children, user }: SocketProviderProps) {
setCurrentWorkflowId(null)
setPresenceUsers([])
logger.info('Socket disconnected', {
reason,
})
// socket.active indicates if auto-reconnect will happen
if (socketInstance.active) {
setIsReconnecting(true)
logger.info('Socket disconnected, will auto-reconnect', { reason })
} else {
setIsReconnecting(false)
logger.info('Socket disconnected, no auto-reconnect', { reason })
}
})
socketInstance.on('connect_error', (error: any) => {
socketInstance.on('connect_error', (error: Error) => {
setIsConnecting(false)
logger.error('Socket connection error:', {
message: error.message,
stack: error.stack,
description: error.description,
type: error.type,
transport: error.transport,
})
logger.error('Socket connection error:', { message: error.message })
// Check if this is an authentication failure
const isAuthError =
@@ -261,43 +270,41 @@ export function SocketProvider({ children, user }: SocketProviderProps) {
logger.warn(
'Authentication failed - stopping reconnection attempts. User may need to refresh/re-login.'
)
// Stop reconnection attempts to prevent infinite loop
socketInstance.disconnect()
// Reset state to allow re-initialization when session is restored
setSocket(null)
setAuthFailed(true)
setIsReconnecting(false)
initializedRef.current = false
} else if (socketInstance.active) {
// Temporary failure, will auto-reconnect
setIsReconnecting(true)
}
})
socketInstance.on('reconnect', (attemptNumber) => {
// Reconnection events are on the Manager (socket.io), not the socket itself
socketInstance.io.on('reconnect', (attemptNumber) => {
setIsConnected(true)
setIsReconnecting(false)
setCurrentSocketId(socketInstance.id ?? null)
logger.info('Socket reconnected successfully', {
attemptNumber,
socketId: socketInstance.id,
transport: socketInstance.io.engine?.transport?.name,
})
// Note: join-workflow is handled by the useEffect watching isConnected
})
socketInstance.on('reconnect_attempt', (attemptNumber) => {
logger.info('Socket reconnection attempt (fresh token will be generated)', {
attemptNumber,
timestamp: new Date().toISOString(),
})
socketInstance.io.on('reconnect_attempt', (attemptNumber) => {
setIsReconnecting(true)
logger.info('Socket reconnection attempt', { attemptNumber })
})
socketInstance.on('reconnect_error', (error: any) => {
logger.error('Socket reconnection error:', {
message: error.message,
attemptNumber: error.attemptNumber,
type: error.type,
})
socketInstance.io.on('reconnect_error', (error: Error) => {
logger.error('Socket reconnection error:', { message: error.message })
})
socketInstance.on('reconnect_failed', () => {
socketInstance.io.on('reconnect_failed', () => {
logger.error('Socket reconnection failed - all attempts exhausted')
setIsReconnecting(false)
setIsConnecting(false)
})
@@ -629,6 +636,7 @@ export function SocketProvider({ children, user }: SocketProviderProps) {
if (commit) {
socket.emit('workflow-operation', {
workflowId: currentWorkflowId,
operation,
target,
payload,
@@ -645,6 +653,7 @@ export function SocketProvider({ children, user }: SocketProviderProps) {
}
pendingPositionUpdates.current.set(blockId, {
workflowId: currentWorkflowId,
operation,
target,
payload,
@@ -666,6 +675,7 @@ export function SocketProvider({ children, user }: SocketProviderProps) {
}
} else {
socket.emit('workflow-operation', {
workflowId: currentWorkflowId,
operation,
target,
payload,
@@ -678,47 +688,51 @@ export function SocketProvider({ children, user }: SocketProviderProps) {
)
const emitSubblockUpdate = useCallback(
(blockId: string, subblockId: string, value: any, operationId?: string) => {
if (socket && currentWorkflowId) {
socket.emit('subblock-update', {
blockId,
subblockId,
value,
timestamp: Date.now(),
operationId,
})
} else {
logger.warn('Cannot emit subblock update: no socket connection or workflow room', {
hasSocket: !!socket,
currentWorkflowId,
blockId,
subblockId,
})
(
blockId: string,
subblockId: string,
value: any,
operationId: string | undefined,
workflowId: string
) => {
if (!socket) {
logger.warn('Cannot emit subblock update: no socket connection', { workflowId, blockId })
return
}
socket.emit('subblock-update', {
workflowId,
blockId,
subblockId,
value,
timestamp: Date.now(),
operationId,
})
},
[socket, currentWorkflowId]
[socket]
)
const emitVariableUpdate = useCallback(
(variableId: string, field: string, value: any, operationId?: string) => {
if (socket && currentWorkflowId) {
socket.emit('variable-update', {
variableId,
field,
value,
timestamp: Date.now(),
operationId,
})
} else {
logger.warn('Cannot emit variable update: no socket connection or workflow room', {
hasSocket: !!socket,
currentWorkflowId,
variableId,
field,
})
(
variableId: string,
field: string,
value: any,
operationId: string | undefined,
workflowId: string
) => {
if (!socket) {
logger.warn('Cannot emit variable update: no socket connection', { workflowId, variableId })
return
}
socket.emit('variable-update', {
workflowId,
variableId,
field,
value,
timestamp: Date.now(),
operationId,
})
},
[socket, currentWorkflowId]
[socket]
)
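Callers now pass the workflow id explicitly instead of relying on the provider's currentWorkflowId, presumably so updates are attributed to the correct workflow even while room state is catching up after navigation or a reconnect. Illustrative call sites only (the subblock and field names below are made up, not taken from this diff):
// Illustrative usage of the new signatures.
emitSubblockUpdate(blockId, 'systemPrompt', nextValue, undefined, activeWorkflowId)
emitVariableUpdate(variableId, 'value', nextValue, operationId, activeWorkflowId)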
const lastCursorEmit = useRef(0)
@@ -794,6 +808,7 @@ export function SocketProvider({ children, user }: SocketProviderProps) {
socket,
isConnected,
isConnecting,
isReconnecting,
authFailed,
currentWorkflowId,
currentSocketId,
@@ -820,6 +835,7 @@ export function SocketProvider({ children, user }: SocketProviderProps) {
socket,
isConnected,
isConnecting,
isReconnecting,
authFailed,
currentWorkflowId,
currentSocketId,
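A consumer could surface the new isReconnecting flag roughly as follows, assuming the context is read through a useSocket()-style accessor (such a hook is not shown in this diff):
// Illustrative only — render a soft "reconnecting" state instead of a hard error.
const { isConnected, isReconnecting } = useSocket()
const showReconnectingBanner = !isConnected && isReconnecting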

View File

@@ -19,6 +19,7 @@ import { checkUsageStatus } from '@/lib/billing/calculations/usage-monitor'
import { getHighestPrioritySubscription } from '@/lib/billing/core/subscription'
import { RateLimiter } from '@/lib/core/rate-limiter'
import { decryptSecret } from '@/lib/core/security/encryption'
import { formatDuration } from '@/lib/core/utils/formatting'
import { getBaseUrl } from '@/lib/core/utils/urls'
import type { TraceSpan, WorkflowExecutionLog } from '@/lib/logs/types'
import { sendEmail } from '@/lib/messaging/email/mailer'
@@ -227,12 +228,6 @@ async function deliverWebhook(
}
}
function formatDuration(ms: number): string {
if (ms < 1000) return `${ms}ms`
if (ms < 60000) return `${(ms / 1000).toFixed(1)}s`
return `${(ms / 60000).toFixed(1)}m`
}
function formatCost(cost?: Record<string, unknown>): string {
if (!cost?.total) return 'N/A'
const total = cost.total as number
@@ -302,7 +297,7 @@ async function deliverEmail(
workflowName: payload.data.workflowName || 'Unknown Workflow',
status: payload.data.status,
trigger: payload.data.trigger,
duration: formatDuration(payload.data.totalDurationMs),
duration: formatDuration(payload.data.totalDurationMs, { precision: 1 }) ?? '-',
cost: formatCost(payload.data.cost),
logUrl,
alertReason,
@@ -315,7 +310,7 @@ async function deliverEmail(
to: subscription.emailRecipients,
subject,
html,
text: `${subject}\n${alertReason ? `\nReason: ${alertReason}\n` : ''}\nWorkflow: ${payload.data.workflowName}\nStatus: ${statusText}\nTrigger: ${payload.data.trigger}\nDuration: ${formatDuration(payload.data.totalDurationMs)}\nCost: ${formatCost(payload.data.cost)}\n\nView Log: ${logUrl}${includedDataText}`,
text: `${subject}\n${alertReason ? `\nReason: ${alertReason}\n` : ''}\nWorkflow: ${payload.data.workflowName}\nStatus: ${statusText}\nTrigger: ${payload.data.trigger}\nDuration: ${formatDuration(payload.data.totalDurationMs, { precision: 1 }) ?? '-'}\nCost: ${formatCost(payload.data.cost)}\n\nView Log: ${logUrl}${includedDataText}`,
emailType: 'notifications',
})
@@ -373,7 +368,10 @@ async function deliverSlack(
fields: [
{ type: 'mrkdwn', text: `*Status:*\n${payload.data.status}` },
{ type: 'mrkdwn', text: `*Trigger:*\n${payload.data.trigger}` },
{ type: 'mrkdwn', text: `*Duration:*\n${formatDuration(payload.data.totalDurationMs)}` },
{
type: 'mrkdwn',
text: `*Duration:*\n${formatDuration(payload.data.totalDurationMs, { precision: 1 }) ?? '-'}`,
},
{ type: 'mrkdwn', text: `*Cost:*\n${formatCost(payload.data.cost)}` },
],
},
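For reference, a minimal sketch consistent with how the shared formatDuration helper is called here: it mirrors the inline helper it replaces, adds a precision option, and returns undefined for missing input since the call sites fall back to '-' (hypothetical; the real implementation in @/lib/core/utils/formatting is not shown):
// Hypothetical sketch of the shared duration formatter.
export function formatDuration(ms?: number, options?: { precision?: number }): string | undefined {
  if (ms == null || Number.isNaN(ms)) return undefined
  const precision = options?.precision ?? 0
  if (ms < 1000) return `${ms}ms`
  if (ms < 60_000) return `${(ms / 1000).toFixed(precision)}s`
  return `${(ms / 60_000).toFixed(precision)}m`
}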

View File

@@ -0,0 +1,625 @@
import { EnrichSoIcon } from '@/components/icons'
import type { BlockConfig } from '@/blocks/types'
import { AuthMode } from '@/blocks/types'
export const EnrichBlock: BlockConfig = {
type: 'enrich',
name: 'Enrich',
description: 'B2B data enrichment and LinkedIn intelligence with Enrich.so',
authMode: AuthMode.ApiKey,
longDescription:
'Access real-time B2B data intelligence with Enrich.so. Enrich profiles from email addresses, find work emails from LinkedIn, verify email deliverability, search for people and companies, and analyze LinkedIn post engagement.',
docsLink: 'https://docs.enrich.so/',
category: 'tools',
bgColor: '#E5E5E6',
icon: EnrichSoIcon,
subBlocks: [
{
id: 'operation',
title: 'Operation',
type: 'dropdown',
options: [
// Person/Profile Enrichment
{ label: 'Email to Profile', id: 'email_to_profile' },
{ label: 'Email to Person (Lite)', id: 'email_to_person_lite' },
{ label: 'LinkedIn Profile Enrichment', id: 'linkedin_profile' },
// Email Finding
{ label: 'Find Email', id: 'find_email' },
{ label: 'LinkedIn to Work Email', id: 'linkedin_to_work_email' },
{ label: 'LinkedIn to Personal Email', id: 'linkedin_to_personal_email' },
// Phone Finding
{ label: 'Phone Finder (LinkedIn)', id: 'phone_finder' },
{ label: 'Email to Phone', id: 'email_to_phone' },
// Email Verification
{ label: 'Verify Email', id: 'verify_email' },
{ label: 'Disposable Email Check', id: 'disposable_email_check' },
// IP/Company Lookup
{ label: 'Email to IP', id: 'email_to_ip' },
{ label: 'IP to Company', id: 'ip_to_company' },
// Company Enrichment
{ label: 'Company Lookup', id: 'company_lookup' },
{ label: 'Company Funding & Traffic', id: 'company_funding' },
{ label: 'Company Revenue', id: 'company_revenue' },
// Search
{ label: 'Search People', id: 'search_people' },
{ label: 'Search Company', id: 'search_company' },
{ label: 'Search Company Employees', id: 'search_company_employees' },
{ label: 'Search Similar Companies', id: 'search_similar_companies' },
{ label: 'Sales Pointer (People)', id: 'sales_pointer_people' },
// LinkedIn Posts/Activities
{ label: 'Search Posts', id: 'search_posts' },
{ label: 'Get Post Details', id: 'get_post_details' },
{ label: 'Search Post Reactions', id: 'search_post_reactions' },
{ label: 'Search Post Comments', id: 'search_post_comments' },
{ label: 'Search People Activities', id: 'search_people_activities' },
{ label: 'Search Company Activities', id: 'search_company_activities' },
// Other
{ label: 'Reverse Hash Lookup', id: 'reverse_hash_lookup' },
{ label: 'Search Logo', id: 'search_logo' },
{ label: 'Check Credits', id: 'check_credits' },
],
value: () => 'email_to_profile',
},
{
id: 'apiKey',
title: 'Enrich API Key',
type: 'short-input',
placeholder: 'Enter your Enrich.so API key',
password: true,
required: true,
},
{
id: 'email',
title: 'Email Address',
type: 'short-input',
placeholder: 'john.doe@company.com',
condition: {
field: 'operation',
value: [
'email_to_profile',
'email_to_person_lite',
'email_to_phone',
'verify_email',
'disposable_email_check',
'email_to_ip',
],
},
required: {
field: 'operation',
value: [
'email_to_profile',
'email_to_person_lite',
'email_to_phone',
'verify_email',
'disposable_email_check',
'email_to_ip',
],
},
},
{
id: 'inRealtime',
title: 'Fetch Fresh Data',
type: 'switch',
condition: { field: 'operation', value: 'email_to_profile' },
mode: 'advanced',
},
{
id: 'linkedinUrl',
title: 'LinkedIn Profile URL',
type: 'short-input',
placeholder: 'linkedin.com/in/williamhgates',
condition: {
field: 'operation',
value: [
'linkedin_profile',
'linkedin_to_work_email',
'linkedin_to_personal_email',
'phone_finder',
],
},
required: {
field: 'operation',
value: [
'linkedin_profile',
'linkedin_to_work_email',
'linkedin_to_personal_email',
'phone_finder',
],
},
},
{
id: 'fullName',
title: 'Full Name',
type: 'short-input',
placeholder: 'John Doe',
condition: { field: 'operation', value: 'find_email' },
required: { field: 'operation', value: 'find_email' },
},
{
id: 'companyDomain',
title: 'Company Domain',
type: 'short-input',
placeholder: 'example.com',
condition: { field: 'operation', value: 'find_email' },
required: { field: 'operation', value: 'find_email' },
},
{
id: 'ip',
title: 'IP Address',
type: 'short-input',
placeholder: '86.92.60.221',
condition: { field: 'operation', value: 'ip_to_company' },
required: { field: 'operation', value: 'ip_to_company' },
},
{
id: 'companyName',
title: 'Company Name',
type: 'short-input',
placeholder: 'Google',
condition: { field: 'operation', value: 'company_lookup' },
},
{
id: 'domain',
title: 'Domain',
type: 'short-input',
placeholder: 'google.com',
condition: {
field: 'operation',
value: ['company_lookup', 'company_funding', 'company_revenue', 'search_logo'],
},
required: {
field: 'operation',
value: ['company_funding', 'company_revenue', 'search_logo'],
},
},
{
id: 'firstName',
title: 'First Name',
type: 'short-input',
placeholder: 'John',
condition: { field: 'operation', value: 'search_people' },
},
{
id: 'lastName',
title: 'Last Name',
type: 'short-input',
placeholder: 'Doe',
condition: { field: 'operation', value: 'search_people' },
},
{
id: 'subTitle',
title: 'Job Title',
type: 'short-input',
placeholder: 'Software Engineer',
condition: { field: 'operation', value: 'search_people' },
},
{
id: 'locationCountry',
title: 'Country',
type: 'short-input',
placeholder: 'United States',
condition: { field: 'operation', value: ['search_people', 'search_company'] },
},
{
id: 'locationCity',
title: 'City',
type: 'short-input',
placeholder: 'San Francisco',
condition: { field: 'operation', value: ['search_people', 'search_company'] },
},
{
id: 'industry',
title: 'Industry',
type: 'short-input',
placeholder: 'Technology',
condition: { field: 'operation', value: 'search_people' },
},
{
id: 'currentJobTitles',
title: 'Current Job Titles (JSON)',
type: 'code',
placeholder: '["CEO", "CTO", "VP Engineering"]',
condition: { field: 'operation', value: 'search_people' },
},
{
id: 'skills',
title: 'Skills (JSON)',
type: 'code',
placeholder: '["Python", "Machine Learning"]',
condition: { field: 'operation', value: 'search_people' },
},
{
id: 'searchCompanyName',
title: 'Company Name',
type: 'short-input',
placeholder: 'Google',
condition: { field: 'operation', value: 'search_company' },
},
{
id: 'industries',
title: 'Industries (JSON)',
type: 'code',
placeholder: '["Technology", "Software"]',
condition: { field: 'operation', value: 'search_company' },
},
{
id: 'staffCountMin',
title: 'Min Employees',
type: 'short-input',
placeholder: '50',
condition: { field: 'operation', value: 'search_company' },
},
{
id: 'staffCountMax',
title: 'Max Employees',
type: 'short-input',
placeholder: '500',
condition: { field: 'operation', value: 'search_company' },
},
{
id: 'companyIds',
title: 'Company IDs (JSON)',
type: 'code',
placeholder: '[12345, 67890]',
condition: { field: 'operation', value: 'search_company_employees' },
},
{
id: 'country',
title: 'Country',
type: 'short-input',
placeholder: 'United States',
condition: { field: 'operation', value: 'search_company_employees' },
},
{
id: 'city',
title: 'City',
type: 'short-input',
placeholder: 'San Francisco',
condition: { field: 'operation', value: 'search_company_employees' },
},
{
id: 'jobTitles',
title: 'Job Titles (JSON)',
type: 'code',
placeholder: '["Software Engineer", "Product Manager"]',
condition: { field: 'operation', value: 'search_company_employees' },
},
{
id: 'linkedinCompanyUrl',
title: 'LinkedIn Company URL',
type: 'short-input',
placeholder: 'linkedin.com/company/google',
condition: { field: 'operation', value: 'search_similar_companies' },
required: { field: 'operation', value: 'search_similar_companies' },
},
{
id: 'accountLocation',
title: 'Locations (JSON)',
type: 'code',
placeholder: '["germany", "france"]',
condition: { field: 'operation', value: 'search_similar_companies' },
},
{
id: 'employeeSizeType',
title: 'Employee Size Filter Type',
type: 'dropdown',
options: [
{ label: 'Range', id: 'RANGE' },
{ label: 'Exact', id: 'EXACT' },
],
condition: { field: 'operation', value: 'search_similar_companies' },
mode: 'advanced',
},
{
id: 'employeeSizeRange',
title: 'Employee Size Range (JSON)',
type: 'code',
placeholder: '[{"start": 50, "end": 200}]',
condition: { field: 'operation', value: 'search_similar_companies' },
},
{
id: 'num',
title: 'Results Per Page',
type: 'short-input',
placeholder: '10',
condition: { field: 'operation', value: 'search_similar_companies' },
},
{
id: 'filters',
title: 'Filters (JSON)',
type: 'code',
placeholder:
'[{"type": "POSTAL_CODE", "values": [{"id": "101041448", "text": "San Francisco", "selectionType": "INCLUDED"}]}]',
condition: { field: 'operation', value: 'sales_pointer_people' },
required: { field: 'operation', value: 'sales_pointer_people' },
},
{
id: 'keywords',
title: 'Keywords',
type: 'short-input',
placeholder: 'AI automation',
condition: { field: 'operation', value: 'search_posts' },
required: { field: 'operation', value: 'search_posts' },
},
{
id: 'datePosted',
title: 'Date Posted',
type: 'dropdown',
options: [
{ label: 'Any time', id: '' },
{ label: 'Past 24 hours', id: 'past_24_hours' },
{ label: 'Past week', id: 'past_week' },
{ label: 'Past month', id: 'past_month' },
],
condition: { field: 'operation', value: 'search_posts' },
},
{
id: 'postUrl',
title: 'LinkedIn Post URL',
type: 'short-input',
placeholder: 'https://www.linkedin.com/posts/...',
condition: { field: 'operation', value: 'get_post_details' },
required: { field: 'operation', value: 'get_post_details' },
},
{
id: 'postUrn',
title: 'Post URN',
type: 'short-input',
placeholder: 'urn:li:activity:7231931952839196672',
condition: {
field: 'operation',
value: ['search_post_reactions', 'search_post_comments'],
},
required: {
field: 'operation',
value: ['search_post_reactions', 'search_post_comments'],
},
},
{
id: 'reactionType',
title: 'Reaction Type',
type: 'dropdown',
options: [
{ label: 'All', id: 'all' },
{ label: 'Like', id: 'like' },
{ label: 'Love', id: 'love' },
{ label: 'Celebrate', id: 'celebrate' },
{ label: 'Insightful', id: 'insightful' },
{ label: 'Funny', id: 'funny' },
],
condition: { field: 'operation', value: 'search_post_reactions' },
},
{
id: 'profileId',
title: 'Profile ID',
type: 'short-input',
placeholder: 'ACoAAC1wha0BhoDIRAHrP5rgzVDyzmSdnl-KuEk',
condition: { field: 'operation', value: 'search_people_activities' },
required: { field: 'operation', value: 'search_people_activities' },
},
{
id: 'activityType',
title: 'Activity Type',
type: 'dropdown',
options: [
{ label: 'Posts', id: 'posts' },
{ label: 'Comments', id: 'comments' },
{ label: 'Articles', id: 'articles' },
],
condition: {
field: 'operation',
value: ['search_people_activities', 'search_company_activities'],
},
},
{
id: 'companyId',
title: 'Company ID',
type: 'short-input',
placeholder: '100746430',
condition: { field: 'operation', value: 'search_company_activities' },
required: { field: 'operation', value: 'search_company_activities' },
},
{
id: 'offset',
title: 'Offset',
type: 'short-input',
placeholder: '0',
condition: { field: 'operation', value: 'search_company_activities' },
mode: 'advanced',
},
{
id: 'hash',
title: 'MD5 Hash',
type: 'short-input',
placeholder: '5f0efb20de5ecfedbe0bf5e7c12353fe',
condition: { field: 'operation', value: 'reverse_hash_lookup' },
required: { field: 'operation', value: 'reverse_hash_lookup' },
},
{
id: 'page',
title: 'Page Number',
type: 'short-input',
placeholder: '1',
condition: {
field: 'operation',
value: [
'search_people',
'search_company',
'search_company_employees',
'search_similar_companies',
'sales_pointer_people',
'search_posts',
'search_post_reactions',
'search_post_comments',
],
},
required: { field: 'operation', value: 'sales_pointer_people' },
},
{
id: 'pageSize',
title: 'Results Per Page',
type: 'short-input',
placeholder: '20',
condition: {
field: 'operation',
value: ['search_people', 'search_company', 'search_company_employees'],
},
},
{
id: 'paginationToken',
title: 'Pagination Token',
type: 'short-input',
placeholder: 'Token from previous response',
condition: {
field: 'operation',
value: ['search_people_activities', 'search_company_activities'],
},
mode: 'advanced',
},
],
tools: {
access: [
'enrich_check_credits',
'enrich_email_to_profile',
'enrich_email_to_person_lite',
'enrich_linkedin_profile',
'enrich_find_email',
'enrich_linkedin_to_work_email',
'enrich_linkedin_to_personal_email',
'enrich_phone_finder',
'enrich_email_to_phone',
'enrich_verify_email',
'enrich_disposable_email_check',
'enrich_email_to_ip',
'enrich_ip_to_company',
'enrich_company_lookup',
'enrich_company_funding',
'enrich_company_revenue',
'enrich_search_people',
'enrich_search_company',
'enrich_search_company_employees',
'enrich_search_similar_companies',
'enrich_sales_pointer_people',
'enrich_search_posts',
'enrich_get_post_details',
'enrich_search_post_reactions',
'enrich_search_post_comments',
'enrich_search_people_activities',
'enrich_search_company_activities',
'enrich_reverse_hash_lookup',
'enrich_search_logo',
],
config: {
tool: (params) => `enrich_${params.operation}`,
params: (params) => {
const { operation, ...rest } = params
const parsedParams: Record<string, any> = { ...rest }
try {
if (rest.currentJobTitles && typeof rest.currentJobTitles === 'string') {
parsedParams.currentJobTitles = JSON.parse(rest.currentJobTitles)
}
if (rest.skills && typeof rest.skills === 'string') {
parsedParams.skills = JSON.parse(rest.skills)
}
if (rest.industries && typeof rest.industries === 'string') {
parsedParams.industries = JSON.parse(rest.industries)
}
if (rest.companyIds && typeof rest.companyIds === 'string') {
parsedParams.companyIds = JSON.parse(rest.companyIds)
}
if (rest.jobTitles && typeof rest.jobTitles === 'string') {
parsedParams.jobTitles = JSON.parse(rest.jobTitles)
}
if (rest.accountLocation && typeof rest.accountLocation === 'string') {
parsedParams.accountLocation = JSON.parse(rest.accountLocation)
}
if (rest.employeeSizeRange && typeof rest.employeeSizeRange === 'string') {
parsedParams.employeeSizeRange = JSON.parse(rest.employeeSizeRange)
}
if (rest.filters && typeof rest.filters === 'string') {
parsedParams.filters = JSON.parse(rest.filters)
}
} catch (error: any) {
throw new Error(`Invalid JSON input: ${error.message}`)
}
if (operation === 'linkedin_profile') {
parsedParams.url = rest.linkedinUrl
parsedParams.linkedinUrl = undefined
}
if (
operation === 'linkedin_to_work_email' ||
operation === 'linkedin_to_personal_email' ||
operation === 'phone_finder'
) {
parsedParams.linkedinProfile = rest.linkedinUrl
parsedParams.linkedinUrl = undefined
}
if (operation === 'company_lookup') {
parsedParams.name = rest.companyName
parsedParams.companyName = undefined
}
if (operation === 'search_company') {
parsedParams.name = rest.searchCompanyName
parsedParams.searchCompanyName = undefined
}
if (operation === 'search_similar_companies') {
parsedParams.url = rest.linkedinCompanyUrl
parsedParams.linkedinCompanyUrl = undefined
}
if (operation === 'get_post_details') {
parsedParams.url = rest.postUrl
parsedParams.postUrl = undefined
}
if (operation === 'search_logo') {
parsedParams.url = rest.domain
}
if (parsedParams.page) {
const pageNum = Number(parsedParams.page)
if (operation === 'search_people' || operation === 'search_company') {
parsedParams.currentPage = pageNum
parsedParams.page = undefined
} else {
parsedParams.page = pageNum
}
}
if (parsedParams.pageSize) parsedParams.pageSize = Number(parsedParams.pageSize)
if (parsedParams.num) parsedParams.num = Number(parsedParams.num)
if (parsedParams.offset) parsedParams.offset = Number(parsedParams.offset)
if (parsedParams.staffCountMin)
parsedParams.staffCountMin = Number(parsedParams.staffCountMin)
if (parsedParams.staffCountMax)
parsedParams.staffCountMax = Number(parsedParams.staffCountMax)
return parsedParams
},
},
},
inputs: {
operation: { type: 'string', description: 'Enrich operation to perform' },
},
outputs: {
success: { type: 'boolean', description: 'Whether the operation was successful' },
output: { type: 'json', description: 'Output data from the Enrich operation' },
},
}
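As an illustration of the params mapping above: for a search_people call, JSON string inputs are parsed into arrays and page is renamed to currentPage (the field values below are made up):
// Illustrative trace of the config's tool/params mapping — not executable on its own.
const input = { operation: 'search_people', firstName: 'Ada', skills: '["Python"]', page: '2' }
// tool:   'enrich_search_people'
// params: { firstName: 'Ada', skills: ['Python'], currentPage: 2 }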

View File

@@ -26,6 +26,7 @@ import { DuckDuckGoBlock } from '@/blocks/blocks/duckduckgo'
import { DynamoDBBlock } from '@/blocks/blocks/dynamodb'
import { ElasticsearchBlock } from '@/blocks/blocks/elasticsearch'
import { ElevenLabsBlock } from '@/blocks/blocks/elevenlabs'
import { EnrichBlock } from '@/blocks/blocks/enrich'
import { EvaluatorBlock } from '@/blocks/blocks/evaluator'
import { ExaBlock } from '@/blocks/blocks/exa'
import { FileBlock, FileV2Block } from '@/blocks/blocks/file'
@@ -188,6 +189,7 @@ export const registry: Record<string, BlockConfig> = {
dynamodb: DynamoDBBlock,
elasticsearch: ElasticsearchBlock,
elevenlabs: ElevenLabsBlock,
enrich: EnrichBlock,
evaluator: EvaluatorBlock,
exa: ExaBlock,
file: FileBlock,

View File

@@ -13,8 +13,8 @@ interface FreeTierUpgradeEmailProps {
const proFeatures = [
{ label: '$20/month', desc: 'in credits included' },
{ label: '25 runs/min', desc: 'sync executions' },
{ label: '200 runs/min', desc: 'async executions' },
{ label: '150 runs/min', desc: 'sync executions' },
{ label: '1,000 runs/min', desc: 'async executions' },
{ label: '50GB storage', desc: 'for files & assets' },
{ label: 'Unlimited', desc: 'workspaces & invites' },
]

View File

@@ -458,8 +458,8 @@ export function getCodeEditorProps(options?: {
'caret-[var(--text-primary)] dark:caret-white',
// Font smoothing
'[-webkit-font-smoothing:antialiased] [-moz-osx-font-smoothing:grayscale]',
// Disable interaction for streaming/preview
(isStreaming || isPreview) && 'pointer-events-none'
// Disable interaction for streaming/preview/disabled
(isStreaming || isPreview || disabled) && 'pointer-events-none'
),
}
}

View File

@@ -260,6 +260,9 @@ const Popover: React.FC<PopoverProps> = ({
setIsKeyboardNav(false)
setSelectedIndex(-1)
registeredItemsRef.current = []
} else {
// Reset hover state when opening to prevent stale submenu from previous menu
setLastHoveredItem(null)
}
}, [open])

View File

@@ -5421,3 +5421,18 @@ z'
</svg>
)
}
export function EnrichSoIcon(props: SVGProps<SVGSVGElement>) {
return (
<svg {...props} xmlns='http://www.w3.org/2000/svg' viewBox='0 0 398 394' fill='none'>
<path
fill='#5A52F4'
d='M129.705566,319.705719 C127.553314,322.684906 125.651512,325.414673 123.657059,328.277466 C113.748466,318.440308 105.605003,310.395905 97.510834,302.302216 C93.625801,298.417419 89.990181,294.269318 85.949242,290.558868 C82.857994,287.720428 82.464081,285.757660 85.772888,282.551880 C104.068108,264.826202 122.146088,246.876312 140.285110,228.989670 C141.183945,228.103317 141.957443,227.089844 143.588837,225.218384 C140.691605,225.066116 138.820053,224.882874 136.948410,224.881958 C102.798264,224.865326 68.647453,224.765244 34.498699,224.983612 C29.315699,225.016739 27.990419,223.343155 28.090912,218.397430 C28.381887,204.076935 28.189890,189.746719 28.195684,175.420319 C28.198524,168.398178 28.319166,168.279541 35.590389,168.278687 C69.074188,168.274780 102.557991,168.281174 136.041794,168.266083 C137.968231,168.265213 139.894608,168.107101 141.821030,168.022171 C142.137955,167.513992 142.454895,167.005829 142.771820,166.497650 C122.842415,146.495621 102.913002,126.493591 83.261360,106.770348 C96.563828,93.471756 109.448814,80.590523 122.656265,67.386925 C123.522743,68.161835 124.785545,69.187096 125.930321,70.330513 C144.551819,88.930206 163.103683,107.600082 181.805267,126.118790 C186.713593,130.979126 189.085648,136.448059 189.055374,143.437057 C188.899490,179.418961 188.911179,215.402191 189.046661,251.384262 C189.072296,258.190796 186.742920,263.653717 181.982727,268.323273 C164.624405,285.351227 147.295807,302.409485 129.705566,319.705719z'
/>
<path
fill='#5A52F4'
d='M276.070923,246.906128 C288.284363,258.985870 300.156097,270.902100 312.235931,282.603485 C315.158752,285.434784 315.417542,287.246246 312.383484,290.248932 C301.143494,301.372498 290.168549,312.763733 279.075592,324.036255 C278.168030,324.958496 277.121307,325.743835 275.898315,326.801086 C274.628357,325.711792 273.460663,324.822968 272.422150,323.802673 C253.888397,305.594757 235.418701,287.321289 216.818268,269.181854 C211.508789,264.003937 208.872726,258.136688 208.914001,250.565842 C209.108337,214.917786 209.084808,179.267715 208.928864,143.619293 C208.898407,136.654907 211.130066,131.122162 216.052216,126.246094 C234.867538,107.606842 253.537521,88.820908 272.274780,70.102730 C273.313202,69.065353 274.468597,68.145027 275.264038,67.440727 C288.353516,80.579514 301.213470,93.487869 314.597534,106.922356 C295.163391,126.421753 275.214752,146.437363 255.266113,166.452972 C255.540176,166.940353 255.814240,167.427734 256.088318,167.915100 C257.983887,168.035736 259.879425,168.260345 261.775085,168.261551 C295.425201,168.282852 329.075287,168.273544 362.725403,168.279831 C369.598907,168.281113 369.776215,168.463593 369.778931,175.252213 C369.784882,189.911667 369.646088,204.573074 369.861206,219.229355 C369.925110,223.585022 368.554596,224.976288 364.148865,224.956406 C329.833130,224.801605 295.516388,224.869598 261.199951,224.868744 C259.297974,224.868698 257.396027,224.868744 254.866638,224.868744 C262.350708,232.658707 269.078217,239.661194 276.070923,246.906128z'
/>
</svg>
)
}

View File

@@ -7,6 +7,7 @@ import { Button } from '@/components/ui/button'
import { Collapsible, CollapsibleContent, CollapsibleTrigger } from '@/components/ui/collapsible'
import type { ToolCallGroup, ToolCallState } from '@/lib/copilot/types'
import { cn } from '@/lib/core/utils/cn'
import { formatDuration } from '@/lib/core/utils/formatting'
interface ToolCallProps {
toolCall: ToolCallState
@@ -225,11 +226,6 @@ export function ToolCallCompletion({ toolCall, isCompact = false }: ToolCallProp
const isError = toolCall.state === 'error'
const isAborted = toolCall.state === 'aborted'
const formatDuration = (duration?: number) => {
if (!duration) return ''
return duration < 1000 ? `${duration}ms` : `${(duration / 1000).toFixed(1)}s`
}
return (
<div
className={cn(
@@ -279,7 +275,7 @@ export function ToolCallCompletion({ toolCall, isCompact = false }: ToolCallProp
)}
style={{ fontSize: '0.625rem' }}
>
{formatDuration(toolCall.duration)}
{toolCall.duration ? formatDuration(toolCall.duration, { precision: 1 }) : ''}
</Badge>
)}
</div>

apps/sim/ee/LICENSE (new file, 43 lines)
View File

@@ -0,0 +1,43 @@
Sim Enterprise License
Copyright (c) 2025-present Sim Studio, Inc.
This software and associated documentation files (the "Software") are licensed
under the following terms:
1. LICENSE GRANT
Subject to the terms of this license, Sim Studio, Inc. grants you a limited,
non-exclusive, non-transferable license to use the Software for:
- Development, testing, and evaluation purposes
- Internal non-production use
Production use of the Software requires a valid Sim Enterprise subscription.
2. RESTRICTIONS
You may not:
- Use the Software in production without a valid Enterprise subscription
- Modify, adapt, or create derivative works of the Software
- Redistribute, sublicense, or transfer the Software
- Remove or alter any proprietary notices in the Software
3. ENTERPRISE SUBSCRIPTION
Production deployment of enterprise features requires an active Sim Enterprise
subscription. Contact sales@simstudio.ai for licensing information.
4. DISCLAIMER
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
5. LIMITATION OF LIABILITY
IN NO EVENT SHALL SIM STUDIO, INC. BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY ARISING FROM THE USE OF THE SOFTWARE.
For questions about enterprise licensing, contact: sales@simstudio.ai

apps/sim/ee/README.md (new file, 21 lines)
View File

@@ -0,0 +1,21 @@
# Sim Enterprise Edition
This directory contains enterprise features that require a Sim Enterprise subscription
for production use.
## Features
- **SSO (Single Sign-On)**: OIDC and SAML authentication integration
- **Access Control**: Permission groups for fine-grained user access management
- **Credential Sets**: Shared credential pools for email polling workflows
## Licensing
See [LICENSE](./LICENSE) for terms. Development and testing use is permitted.
Production deployment requires an active Enterprise subscription.
## Architecture
Enterprise features are imported directly throughout the codebase. The `ee/` directory
is required at build time. Feature visibility is controlled at runtime via environment
variables (e.g., `NEXT_PUBLIC_ACCESS_CONTROL_ENABLED`, `NEXT_PUBLIC_SSO_ENABLED`).
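A minimal sketch of the runtime visibility check the paragraph above describes, assuming the getEnv/isTruthy helpers imported elsewhere in this diff (the exact gating code is not shown here):
// Hypothetical sketch — the ee/ code is always compiled in; visibility is a runtime flag check.
import { getEnv, isTruthy } from '@/lib/core/config/env'
const accessControlEnabled = isTruthy(getEnv('NEXT_PUBLIC_ACCESS_CONTROL_ENABLED'))
const ssoEnabled = isTruthy(getEnv('NEXT_PUBLIC_SSO_ENABLED'))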

View File

@@ -29,7 +29,6 @@ import type { PermissionGroupConfig } from '@/lib/permission-groups/types'
import { getUserColor } from '@/lib/workspaces/colors'
import { getUserRole } from '@/lib/workspaces/organization'
import { getAllBlocks } from '@/blocks'
import { useOrganization, useOrganizations } from '@/hooks/queries/organization'
import {
type PermissionGroup,
useBulkAddPermissionGroupMembers,
@@ -39,7 +38,8 @@ import {
usePermissionGroups,
useRemovePermissionGroupMember,
useUpdatePermissionGroup,
} from '@/hooks/queries/permission-groups'
} from '@/ee/access-control/hooks/permission-groups'
import { useOrganization, useOrganizations } from '@/hooks/queries/organization'
import { useSubscriptionData } from '@/hooks/queries/subscription'
import { PROVIDER_DEFINITIONS } from '@/providers/models'
import { getAllProviderIds } from '@/providers/utils'
@@ -255,7 +255,6 @@ export function AccessControl() {
queryEnabled
)
// Show loading while dependencies load, or while permission groups query is pending
const isLoading = orgsLoading || subLoading || (queryEnabled && groupsLoading)
const { data: organization } = useOrganization(activeOrganization?.id || '')
@@ -410,10 +409,8 @@ export function AccessControl() {
}, [viewingGroup, editingConfig])
const allBlocks = useMemo(() => {
// Filter out hidden blocks and start_trigger (which should never be disabled)
const blocks = getAllBlocks().filter((b) => !b.hideFromToolbar && b.type !== 'start_trigger')
return blocks.sort((a, b) => {
// Group by category: triggers first, then blocks, then tools
const categoryOrder = { triggers: 0, blocks: 1, tools: 2 }
const catA = categoryOrder[a.category] ?? 3
const catB = categoryOrder[b.category] ?? 3
@@ -555,10 +552,9 @@ export function AccessControl() {
}, [viewingGroup, editingConfig, activeOrganization?.id, updatePermissionGroup])
const handleOpenAddMembersModal = useCallback(() => {
const existingMemberUserIds = new Set(members.map((m) => m.userId))
setSelectedMemberIds(new Set())
setShowAddMembersModal(true)
}, [members])
}, [])
const handleAddSelectedMembers = useCallback(async () => {
if (!viewingGroup || selectedMemberIds.size === 0) return
@@ -891,7 +887,6 @@ export function AccessControl() {
prev
? {
...prev,
// When deselecting all, keep start_trigger allowed (it should never be disabled)
allowedIntegrations: allAllowed ? ['start_trigger'] : null,
}
: prev

View File

@@ -1,3 +1,5 @@
'use client'
import { useMutation, useQuery, useQueryClient } from '@tanstack/react-query'
import type { PermissionGroupConfig } from '@/lib/permission-groups/types'
import { fetchJson } from '@/hooks/selectors/helpers'

View File

@@ -11,55 +11,13 @@ import { isBillingEnabled } from '@/lib/core/config/feature-flags'
import { cn } from '@/lib/core/utils/cn'
import { getBaseUrl } from '@/lib/core/utils/urls'
import { getUserRole } from '@/lib/workspaces/organization/utils'
import { SSO_TRUSTED_PROVIDERS } from '@/ee/sso/constants'
import { useConfigureSSO, useSSOProviders } from '@/ee/sso/hooks/sso'
import { useOrganizations } from '@/hooks/queries/organization'
import { useConfigureSSO, useSSOProviders } from '@/hooks/queries/sso'
import { useSubscriptionData } from '@/hooks/queries/subscription'
const logger = createLogger('SSO')
const TRUSTED_SSO_PROVIDERS = [
'okta',
'okta-saml',
'okta-prod',
'okta-dev',
'okta-staging',
'okta-test',
'azure-ad',
'azure-active-directory',
'azure-corp',
'azure-enterprise',
'adfs',
'adfs-company',
'adfs-corp',
'adfs-enterprise',
'auth0',
'auth0-prod',
'auth0-dev',
'auth0-staging',
'onelogin',
'onelogin-prod',
'onelogin-corp',
'jumpcloud',
'jumpcloud-prod',
'jumpcloud-corp',
'ping-identity',
'ping-federate',
'pingone',
'shibboleth',
'shibboleth-idp',
'google-workspace',
'google-sso',
'saml',
'saml2',
'saml-sso',
'oidc',
'oidc-sso',
'openid-connect',
'custom-sso',
'enterprise-sso',
'company-sso',
]
interface SSOProvider {
id: string
providerId: string
@@ -565,7 +523,7 @@ export function SSO() {
<Combobox
value={formData.providerId}
onChange={(value: string) => handleInputChange('providerId', value)}
options={TRUSTED_SSO_PROVIDERS.map((id) => ({
options={SSO_TRUSTED_PROVIDERS.map((id) => ({
label: id,
value: id,
}))}

View File

@@ -1,3 +1,7 @@
/**
* List of trusted SSO provider identifiers.
* Used for validation and autocomplete in SSO configuration.
*/
export const SSO_TRUSTED_PROVIDERS = [
'okta',
'okta-saml',

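Since the trusted-provider list now lives in @/ee/sso/constants and is documented as serving both validation and autocomplete, a minimal validation sketch on top of it might look as follows (the isTrustedSSOProvider helper is an assumption for illustration and is not part of this diff):

import { SSO_TRUSTED_PROVIDERS } from '@/ee/sso/constants'

// Illustrative helper (not in this PR): check a user-supplied provider id
// against the shared trusted list before submitting an SSO configuration.
export function isTrustedSSOProvider(providerId: string): boolean {
  return SSO_TRUSTED_PROVIDERS.includes(providerId)
}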
View File

@@ -1,3 +1,5 @@
'use client'
import { keepPreviousData, useMutation, useQuery, useQueryClient } from '@tanstack/react-query'
import { organizationKeys } from '@/hooks/queries/organization'
@@ -75,39 +77,3 @@ export function useConfigureSSO() {
},
})
}
/**
* Delete SSO provider mutation
*/
interface DeleteSSOParams {
providerId: string
orgId?: string
}
export function useDeleteSSO() {
const queryClient = useQueryClient()
return useMutation({
mutationFn: async ({ providerId }: DeleteSSOParams) => {
const response = await fetch(`/api/auth/sso/providers/${providerId}`, {
method: 'DELETE',
})
if (!response.ok) {
const error = await response.json()
throw new Error(error.message || 'Failed to delete SSO provider')
}
return response.json()
},
onSuccess: (_data, variables) => {
queryClient.invalidateQueries({ queryKey: ssoKeys.providers() })
if (variables.orgId) {
queryClient.invalidateQueries({
queryKey: organizationKeys.detail(variables.orgId),
})
}
},
})
}

View File

@@ -5,6 +5,7 @@ import {
hydrateUserFilesWithBase64,
} from '@/lib/uploads/utils/user-file-base64.server'
import { sanitizeInputFormat, sanitizeTools } from '@/lib/workflows/comparison/normalize'
import { validateBlockType } from '@/ee/access-control/utils/permission-check'
import {
BlockType,
buildResumeApiUrl,
@@ -20,18 +21,18 @@ import {
generatePauseContextId,
mapNodeMetadataToPauseScopes,
} from '@/executor/human-in-the-loop/utils.ts'
import type {
BlockHandler,
BlockLog,
BlockState,
ExecutionContext,
NormalizedBlockOutput,
import {
type BlockHandler,
type BlockLog,
type BlockState,
type ExecutionContext,
getNextExecutionOrder,
type NormalizedBlockOutput,
} from '@/executor/types'
import { streamingResponseFormatProcessor } from '@/executor/utils'
import { buildBlockExecutionError, normalizeError } from '@/executor/utils/errors'
import { isJSONString } from '@/executor/utils/json'
import { filterOutputForLog } from '@/executor/utils/output-filter'
import { validateBlockType } from '@/executor/utils/permission-check'
import type { VariableResolver } from '@/executor/variables/resolver'
import type { SerializedBlock } from '@/serializer/types'
import type { SubflowType } from '@/stores/workflows/workflow/types'
@@ -68,7 +69,7 @@ export class BlockExecutor {
if (!isSentinel) {
blockLog = this.createBlockLog(ctx, node.id, block, node)
ctx.blockLogs.push(blockLog)
this.callOnBlockStart(ctx, node, block)
this.callOnBlockStart(ctx, node, block, blockLog.executionOrder)
}
const startTime = performance.now()
@@ -159,7 +160,7 @@ export class BlockExecutor {
this.state.setBlockOutput(node.id, normalizedOutput, duration)
if (!isSentinel) {
if (!isSentinel && blockLog) {
const displayOutput = filterOutputForLog(block.metadata?.id || '', normalizedOutput, {
block,
})
@@ -170,8 +171,9 @@ export class BlockExecutor {
this.sanitizeInputsForLog(resolvedInputs),
displayOutput,
duration,
blockLog!.startedAt,
blockLog!.endedAt
blockLog.startedAt,
blockLog.executionOrder,
blockLog.endedAt
)
}
@@ -268,7 +270,7 @@ export class BlockExecutor {
}
)
if (!isSentinel) {
if (!isSentinel && blockLog) {
const displayOutput = filterOutputForLog(block.metadata?.id || '', errorOutput, { block })
this.callOnBlockComplete(
ctx,
@@ -277,8 +279,9 @@ export class BlockExecutor {
this.sanitizeInputsForLog(input),
displayOutput,
duration,
blockLog!.startedAt,
blockLog!.endedAt
blockLog.startedAt,
blockLog.executionOrder,
blockLog.endedAt
)
}
@@ -346,6 +349,7 @@ export class BlockExecutor {
blockName,
blockType: block.metadata?.id ?? DEFAULTS.BLOCK_TYPE,
startedAt: new Date().toISOString(),
executionOrder: getNextExecutionOrder(ctx),
endedAt: '',
durationMs: 0,
success: false,
@@ -409,7 +413,12 @@ export class BlockExecutor {
return result
}
private callOnBlockStart(ctx: ExecutionContext, node: DAGNode, block: SerializedBlock): void {
private callOnBlockStart(
ctx: ExecutionContext,
node: DAGNode,
block: SerializedBlock,
executionOrder: number
): void {
const blockId = node.id
const blockName = block.metadata?.name ?? blockId
const blockType = block.metadata?.id ?? DEFAULTS.BLOCK_TYPE
@@ -417,7 +426,13 @@ export class BlockExecutor {
const iterationContext = this.getIterationContext(ctx, node)
if (this.contextExtensions.onBlockStart) {
this.contextExtensions.onBlockStart(blockId, blockName, blockType, iterationContext)
this.contextExtensions.onBlockStart(
blockId,
blockName,
blockType,
executionOrder,
iterationContext
)
}
}
@@ -429,6 +444,7 @@ export class BlockExecutor {
output: NormalizedBlockOutput,
duration: number,
startedAt: string,
executionOrder: number,
endedAt: string
): void {
const blockId = node.id
@@ -447,6 +463,7 @@ export class BlockExecutor {
output,
executionTime: duration,
startedAt,
executionOrder,
endedAt,
},
iterationContext

View File

@@ -55,7 +55,13 @@ export interface IterationContext {
export interface ExecutionCallbacks {
onStream?: (streamingExec: any) => Promise<void>
onBlockStart?: (blockId: string, blockName: string, blockType: string) => Promise<void>
onBlockStart?: (
blockId: string,
blockName: string,
blockType: string,
executionOrder: number,
iterationContext?: IterationContext
) => Promise<void>
onBlockComplete?: (
blockId: string,
blockName: string,
@@ -97,6 +103,7 @@ export interface ContextExtensions {
blockId: string,
blockName: string,
blockType: string,
executionOrder: number,
iterationContext?: IterationContext
) => Promise<void>
onBlockComplete?: (
@@ -108,6 +115,7 @@ export interface ContextExtensions {
output: NormalizedBlockOutput
executionTime: number
startedAt: string
executionOrder: number
endedAt: string
},
iterationContext?: IterationContext

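With executionOrder threaded through these signatures, a caller implementing onBlockStart receives the counter alongside the block identity. A minimal sketch of such a callback under the updated signature (the console logging and the import path for IterationContext are assumptions for illustration):

import type { IterationContext } from '@/executor/execution/types'

// Illustrative callback (not part of this diff): records blocks in the order
// the executor started them, using the new executionOrder parameter.
const onBlockStart = async (
  blockId: string,
  blockName: string,
  blockType: string,
  executionOrder: number,
  iterationContext?: IterationContext
): Promise<void> => {
  console.log(`#${executionOrder} started ${blockName} (${blockType}) [${blockId}]`, iterationContext ?? {})
}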
View File

@@ -6,6 +6,12 @@ import { createMcpToolId } from '@/lib/mcp/utils'
import { refreshTokenIfNeeded } from '@/app/api/auth/oauth/utils'
import { getAllBlocks } from '@/blocks'
import type { BlockOutput } from '@/blocks/types'
import {
validateBlockType,
validateCustomToolsAllowed,
validateMcpToolsAllowed,
validateModelProvider,
} from '@/ee/access-control/utils/permission-check'
import { AGENT, BlockType, DEFAULTS, REFERENCE, stripCustomToolPrefix } from '@/executor/constants'
import { memoryService } from '@/executor/handlers/agent/memory'
import type {
@@ -18,12 +24,6 @@ import type { BlockHandler, ExecutionContext, StreamingExecution } from '@/execu
import { collectBlockData } from '@/executor/utils/block-data'
import { buildAPIUrl, buildAuthHeaders } from '@/executor/utils/http'
import { stringifyJSON } from '@/executor/utils/json'
import {
validateBlockType,
validateCustomToolsAllowed,
validateMcpToolsAllowed,
validateModelProvider,
} from '@/executor/utils/permission-check'
import { executeProviderRequest } from '@/providers'
import { getProviderFromModel, transformBlockTool } from '@/providers/utils'
import type { SerializedBlock } from '@/serializer/types'

View File

@@ -4,11 +4,11 @@ import { createLogger } from '@sim/logger'
import { eq } from 'drizzle-orm'
import { refreshTokenIfNeeded } from '@/app/api/auth/oauth/utils'
import type { BlockOutput } from '@/blocks/types'
import { validateModelProvider } from '@/ee/access-control/utils/permission-check'
import { BlockType, DEFAULTS, EVALUATOR } from '@/executor/constants'
import type { BlockHandler, ExecutionContext } from '@/executor/types'
import { buildAPIUrl, buildAuthHeaders, extractAPIErrorMessage } from '@/executor/utils/http'
import { isJSONString, parseJSON, stringifyJSON } from '@/executor/utils/json'
import { validateModelProvider } from '@/executor/utils/permission-check'
import { calculateCost, getProviderFromModel } from '@/providers/utils'
import type { SerializedBlock } from '@/serializer/types'

View File

@@ -6,6 +6,7 @@ import { getBaseUrl } from '@/lib/core/utils/urls'
import { refreshTokenIfNeeded } from '@/app/api/auth/oauth/utils'
import { generateRouterPrompt, generateRouterV2Prompt } from '@/blocks/blocks/router'
import type { BlockOutput } from '@/blocks/types'
import { validateModelProvider } from '@/ee/access-control/utils/permission-check'
import {
BlockType,
DEFAULTS,
@@ -15,7 +16,6 @@ import {
} from '@/executor/constants'
import type { BlockHandler, ExecutionContext } from '@/executor/types'
import { buildAuthHeaders } from '@/executor/utils/http'
import { validateModelProvider } from '@/executor/utils/permission-check'
import { calculateCost, getProviderFromModel } from '@/providers/utils'
import type { SerializedBlock } from '@/serializer/types'

View File

@@ -212,11 +212,11 @@ export class WorkflowBlockHandler implements BlockHandler {
/**
* Parses a potentially nested workflow error message to extract:
* - The chain of workflow names
* - The actual root error message (preserving the block prefix for the failing block)
* - The actual root error message (preserving the block name prefix for the failing block)
*
* Handles formats like:
* - "workflow-name" failed: error
* - [block_type] Block Name: "workflow-name" failed: error
* - Block Name: "workflow-name" failed: error
* - Workflow chain: A → B | error
*/
private parseNestedWorkflowError(message: string): { chain: string[]; rootError: string } {
@@ -234,8 +234,8 @@ export class WorkflowBlockHandler implements BlockHandler {
// Extract workflow names from patterns like:
// - "workflow-name" failed:
// - [block_type] Block Name: "workflow-name" failed:
const workflowPattern = /(?:\[[^\]]+\]\s*[^:]+:\s*)?"([^"]+)"\s*failed:\s*/g
// - Block Name: "workflow-name" failed:
const workflowPattern = /(?:\[[^\]]+\]\s*)?(?:[^:]+:\s*)?"([^"]+)"\s*failed:\s*/g
let match: RegExpExecArray | null
let lastIndex = 0
@@ -247,7 +247,7 @@ export class WorkflowBlockHandler implements BlockHandler {
}
// The root error is everything after the last match
// Keep the block prefix (e.g., [function] Function 1:) so we know which block failed
// Keep the block name prefix (e.g., Function 1:) so we know which block failed
const rootError = lastIndex > 0 ? remaining.slice(lastIndex) : remaining
return { chain, rootError: rootError.trim() || 'Unknown error' }

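The widened workflowPattern now tolerates a bracketed [block_type] prefix, a plain "Block Name:" prefix, or neither, before each quoted workflow name. A brief hedged walkthrough of the pattern (the sample message and values are invented for illustration):

// Sample nested message (invented for illustration):
const message = 'Sub A: "child-workflow" failed: Function 1: Something went wrong'

const workflowPattern = /(?:\[[^\]]+\]\s*)?(?:[^:]+:\s*)?"([^"]+)"\s*failed:\s*/g

// Each '"name" failed:' segment contributes to the chain; the text after the
// last match is treated as the root error.
const chain: string[] = []
let match: RegExpExecArray | null
let lastIndex = 0
while ((match = workflowPattern.exec(message)) !== null) {
  chain.push(match[1])
  lastIndex = workflowPattern.lastIndex
}
const rootError = message.slice(lastIndex).trim()
// chain -> ['child-workflow']; rootError -> 'Function 1: Something went wrong'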
View File

@@ -7,7 +7,11 @@ import type { DAG } from '@/executor/dag/builder'
import type { EdgeManager } from '@/executor/execution/edge-manager'
import type { LoopScope } from '@/executor/execution/state'
import type { BlockStateController, ContextExtensions } from '@/executor/execution/types'
import type { ExecutionContext, NormalizedBlockOutput } from '@/executor/types'
import {
type ExecutionContext,
getNextExecutionOrder,
type NormalizedBlockOutput,
} from '@/executor/types'
import type { LoopConfigWithNodes } from '@/executor/types/loop'
import { replaceValidReferences } from '@/executor/utils/reference-validation'
import {
@@ -286,6 +290,7 @@ export class LoopOrchestrator {
output,
executionTime: DEFAULTS.EXECUTION_TIME,
startedAt: now,
executionOrder: getNextExecutionOrder(ctx),
endedAt: now,
})
}

View File

@@ -3,7 +3,11 @@ import { DEFAULTS } from '@/executor/constants'
import type { DAG } from '@/executor/dag/builder'
import type { ParallelScope } from '@/executor/execution/state'
import type { BlockStateWriter, ContextExtensions } from '@/executor/execution/types'
import type { ExecutionContext, NormalizedBlockOutput } from '@/executor/types'
import {
type ExecutionContext,
getNextExecutionOrder,
type NormalizedBlockOutput,
} from '@/executor/types'
import type { ParallelConfigWithNodes } from '@/executor/types/parallel'
import { ParallelExpander } from '@/executor/utils/parallel-expansion'
import {
@@ -270,6 +274,7 @@ export class ParallelOrchestrator {
output,
executionTime: 0,
startedAt: now,
executionOrder: getNextExecutionOrder(ctx),
endedAt: now,
})
}

View File

@@ -114,6 +114,11 @@ export interface BlockLog {
loopId?: string
parallelId?: string
iterationIndex?: number
/**
* Monotonically increasing integer (1, 2, 3, ...) for accurate block ordering.
* Generated via getNextExecutionOrder() to ensure deterministic sorting.
*/
executionOrder: number
/**
* Child workflow trace spans for nested workflow execution.
* Stored separately from output to keep output clean for display
@@ -227,7 +232,12 @@ export interface ExecutionContext {
edges?: Array<{ source: string; target: string }>
onStream?: (streamingExecution: StreamingExecution) => Promise<void>
onBlockStart?: (blockId: string, blockName: string, blockType: string) => Promise<void>
onBlockStart?: (
blockId: string,
blockName: string,
blockType: string,
executionOrder: number
) => Promise<void>
onBlockComplete?: (
blockId: string,
blockName: string,
@@ -268,6 +278,23 @@ export interface ExecutionContext {
* Stop execution after this block completes. Used for "run until block" feature.
*/
stopAfterBlockId?: string
/**
* Counter for generating monotonically increasing execution order values.
* Starts at 0 and increments for each block. Use getNextExecutionOrder() to access.
*/
executionOrderCounter?: { value: number }
}
/**
* Gets the next execution order value for a block.
* Returns a simple incrementing integer (1, 2, 3, ...) for clear ordering.
*/
export function getNextExecutionOrder(ctx: ExecutionContext): number {
if (!ctx.executionOrderCounter) {
ctx.executionOrderCounter = { value: 0 }
}
return ++ctx.executionOrderCounter.value
}
export interface ExecutionResult {

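Because getNextExecutionOrder lazily creates the counter on the context and then returns 1, 2, 3, ..., block logs can be ordered deterministically even when two blocks share a timestamp. A brief sketch under that assumption (the bare context literal and the sortByExecutionOrder helper are for illustration only):

import { type BlockLog, type ExecutionContext, getNextExecutionOrder } from '@/executor/types'

// Illustration only: a minimal context; real contexts carry many more fields.
const ctx = {} as ExecutionContext
getNextExecutionOrder(ctx) // 1 (counter is created lazily on first use)
getNextExecutionOrder(ctx) // 2
getNextExecutionOrder(ctx) // 3

// Hypothetical helper: sorting logs by executionOrder stays stable even if
// two blocks report identical startedAt timestamps.
function sortByExecutionOrder(logs: BlockLog[]): BlockLog[] {
  return [...logs].sort((a, b) => a.executionOrder - b.executionOrder)
}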
View File

@@ -47,7 +47,7 @@ export function buildBlockExecutionError(details: BlockExecutionErrorDetails): E
const blockName = details.block.metadata?.name || details.block.id
const blockType = details.block.metadata?.id || 'unknown'
const error = new Error(`[${blockType}] ${blockName}: ${errorMessage}`)
const error = new Error(`${blockName}: ${errorMessage}`)
Object.assign(error, {
blockId: details.block.id,

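With the [blockType] prefix dropped, downstream error messages now carry only the block name. For example, with invented values:

// Illustration with invented values: the same inputs now produce a shorter message.
const blockName = 'Function 1'              // details.block.metadata?.name
const errorMessage = 'Something went wrong'
const error = new Error(`${blockName}: ${errorMessage}`)
// error.message -> 'Function 1: Something went wrong'
// (previously: '[function] Function 1: Something went wrong')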
View File

@@ -1,7 +1,7 @@
import { createLogger } from '@sim/logger'
import { LOOP, PARALLEL, PARSING, REFERENCE } from '@/executor/constants'
import type { ContextExtensions } from '@/executor/execution/types'
import type { BlockLog, ExecutionContext } from '@/executor/types'
import { type BlockLog, type ExecutionContext, getNextExecutionOrder } from '@/executor/types'
import type { VariableResolver } from '@/executor/variables/resolver'
const logger = createLogger('SubflowUtils')
@@ -208,6 +208,7 @@ export function addSubflowErrorLog(
contextExtensions: ContextExtensions | null
): void {
const now = new Date().toISOString()
const execOrder = getNextExecutionOrder(ctx)
const block = ctx.workflow?.blocks?.find((b) => b.id === blockId)
const blockName = block?.metadata?.name || (blockType === 'loop' ? 'Loop' : 'Parallel')
@@ -217,6 +218,7 @@ export function addSubflowErrorLog(
blockName,
blockType,
startedAt: now,
executionOrder: execOrder,
endedAt: now,
durationMs: 0,
success: false,
@@ -233,6 +235,7 @@ export function addSubflowErrorLog(
output: { error: errorMessage },
executionTime: 0,
startedAt: now,
executionOrder: execOrder,
endedAt: now,
})
}

View File

@@ -1,3 +1,5 @@
'use client'
import { useMutation, useQuery, useQueryClient } from '@tanstack/react-query'
import { fetchJson } from '@/hooks/selectors/helpers'

Some files were not shown because too many files have changed in this diff.