Compare commits


14 Commits

Author SHA1 Message Date
Waleed
f1ec5fe824 v0.5.104: memory improvements, nested subflows, careers page redirect, brandfetch, google meet 2026-03-03 23:45:29 -08:00
Waleed
efc1aeed70 fix(subflows): fix pointer events for nested subflow interaction (#3409)
* fix(subflows): fix pointer events for nested subflow interaction

* fix(subflows): use Tailwind class for pointer-events-none
2026-03-03 23:28:51 -08:00
Waleed
46065983f6 fix(editor): restore cursor position after tag/env-var completion in code editors (#3406)
* fix(editor): restore cursor position after tag/env-var completion in code editors

* lint

* refactor(editor): extract restoreCursorAfterInsertion helper, fix weak fallbacks

* updated

* fix(editor): replace useEffect with direct ref assignment for editorValueRef

* fix(editor): guard cursor restoration behind preview/readOnly check

Move restoreCursorAfterInsertion inside the !isPreview && !readOnly guard
so cursor position isn't computed against newValue when the textarea still
holds liveValue. Add comment documenting the cross-string index invariant
in the shared helper.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(editor): escape blockId in CSS selector with CSS.escape()

Prevents potential SyntaxError if blockId ever contains CSS special
characters when querying the textarea for cursor restoration.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* perf(editor): use ref for cursor fallback to stabilize useCallback

Replace cursorPosition state in handleSubflowTagSelect's dependency
array with a cursorPositionRef. This avoids recreating the callback
on every keystroke since cursorPosition is only used as a fallback
when textareaRef.current is null.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* refactor(editor): pass cursor position explicitly from dropdowns

Instead of inferring cursor position by searching for delimiters in the
output string (which could match unrelated < or {{ in code), compute
the exact cursor position in TagDropdown and EnvVarDropdown where the
insertion range is definitively known, and pass it through onSelect.

This follows the same pattern used by CodeMirror, Monaco, and
ProseMirror: the insertion source always knows the range, so cursor
position is computed at the source rather than inferred by the consumer.

- TagDropdown/EnvVarDropdown: compute newCursorPosition, pass as 2nd arg
- restoreCursorAfterInsertion: simplified to just (textarea, position)
- code.tsx, condition-input.tsx, use-subflow-editor.ts: accept position
- Removed editorValueRef and cursorPositionRef from use-subflow-editor
  (no longer needed since dropdown computes position)
- Other consumers (native inputs) unaffected due to TS callback compat

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* docs(editor): fix JSDoc terminology — macrotask not microtask

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 22:10:00 -08:00
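The "pass cursor position explicitly from dropdowns" change above can be sketched as a pure function; the helper name and shapes here are illustrative, not the repo's actual code:

```typescript
// Illustrative sketch of the "source knows the range" pattern: the dropdown
// that performs the insertion computes the caret position, instead of the
// consumer inferring it later by searching the output string for delimiters.
interface InsertionResult {
  newValue: string
  newCursorPosition: number
}

// Hypothetical helper: replace the range [start, end) in `value` with `tag`.
function insertTag(value: string, start: number, end: number, tag: string): InsertionResult {
  const newValue = value.slice(0, start) + tag + value.slice(end)
  // The insertion site knows the exact range, so the caret lands right after the tag.
  return { newValue, newCursorPosition: start + tag.length }
}

const result = insertTag('const x = ', 10, 10, '<loop1.index>')
// result.newValue is 'const x = <loop1.index>', caret at index 23
```

Computing the position here, rather than searching the output for `<` or `{{` (which could match unrelated characters in code), is the same division of labor the commit attributes to CodeMirror, Monaco, and ProseMirror.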
Waleed
2c79d0249f improvement(executor): support nested loops/parallels (#3398)
* feat(executor): support nested loop DAG construction and edge wiring

Wire inner loop sentinel nodes into outer loop sentinel chains so that
nested loops execute correctly. Resolves boundary-node detection to use
effective sentinel IDs for nested loops, handles loop-exit edges from
inner sentinel-end to outer sentinel-end, and recursively clears
execution state for all nested loop scopes between iterations.

NOTE: loop-in-loop nesting only; parallel nesting is not yet supported.
Made-with: Cursor

* feat(executor): add nested loop iteration context and named loop variable resolution

Introduce ParentIteration to track ancestor loop state, build a
loopParentMap during DAG construction, and propagate parent iterations
through block execution and child workflow contexts.

Extend LoopResolver to support named loop references (e.g. <loop1.index>)
and add output property resolution (<loop1.result>). Named references
use the block's display name normalized to a tag-safe identifier,
enabling blocks inside nested loops to reference any ancestor loop's
iteration state.

NOTE: loop-in-loop nesting only; parallel nesting is not yet supported.
Made-with: Cursor
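A minimal sketch of the display-name normalization that named references like `<loop1.index>` rely on; the function name and exact normalization rules are assumptions, not the repo's implementation:

```typescript
// Hypothetical normalizer: turn a block's display name into a tag-safe
// identifier, so a loop block named "Loop 1" is referenceable as <loop1.index>
// from any block nested inside it.
function toTagName(displayName: string): string {
  return displayName.toLowerCase().replace(/[^a-z0-9]/g, '')
}

// toTagName('Loop 1')  -> 'loop1'
// toTagName('My Loop') -> 'myloop'
```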

* feat(terminal): propagate parent iteration context through SSE events and terminal display

Thread parentIterations through SSE block-started, block-completed, and
block-error events so the terminal can reconstruct nested loop
hierarchies. Update the entry tree builder to recursively nest inner
loop subflow nodes inside their parent iteration rows, using
parentIterations depth-stripping to support arbitrary nesting depth.

Display the block's store name for subflow container rows instead of
the generic "Loop" / "Parallel" label.

Made-with: Cursor

* feat(canvas): allow nesting subflow containers and prevent cycles

Remove the restriction that prevented subflow nodes from being dragged
into other subflow containers, enabling loop-in-loop nesting on the
canvas. Add cycle detection (isDescendantOf) to prevent a container
from being placed inside one of its own descendants.

Resize all ancestor containers when a nested child moves, collect
descendant blocks when removing from a subflow so boundary edges are
attributed correctly, and surface all ancestor loop tags in the tag
dropdown for blocks inside nested loops.

Made-with: Cursor
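The cycle detection mentioned above (isDescendantOf) can be sketched with a parent map; the map shape and extra visited-set guard are assumptions for illustration:

```typescript
// Hedged sketch of the cycle guard: before dropping container `dragged` into
// container `target`, reject the move if `target` is a descendant of `dragged`.
// `parentOf` maps a node id to its parent container id (undefined at top level).
function isDescendantOf(
  nodeId: string,
  ancestorId: string,
  parentOf: Record<string, string | undefined>
): boolean {
  let current = parentOf[nodeId]
  const visited = new Set<string>() // extra guard so a corrupted map cannot loop forever
  while (current !== undefined && !visited.has(current)) {
    if (current === ancestorId) return true
    visited.add(current)
    current = parentOf[current]
  }
  return false
}

// A drop is allowed only when !isDescendantOf(targetId, draggedId, parentOf).
```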

* feat(agent): add MCP server discovery mode for agent tool input (#3353)

* feat(agent): add MCP server discovery mode for agent tool input

* fix(tool-input): use type variant for MCP server tool count badge

* fix(mcp-dynamic-args): align label styling with standard subblock labels

* standardized input format UI

* feat(tool-input): replace MCP server inline expand with drill-down navigation

* feat(tool-input): add chevron affordance and keyboard nav for MCP server drill-down

* fix(tool-input): handle mcp-server type in refresh, validation, badges, and usage control

* refactor(tool-validation): extract getMcpServerIssue, remove fake tool hack

* lint

* reorder dropdown

* perf(agent): parallelize MCP server tool creation with Promise.all

* fix(combobox): preserve cursor movement in search input, reset query on drilldown

* fix(combobox): route ArrowRight through handleSelect, remove redundant type guards

* fix(agent): rename mcpServers to mcpServerSelections to avoid shadowing DB import, route ArrowRight through handleSelect

* docs: update google integration docs

* fix(tool-input): reset drilldown state on tool selection to prevent stale view

* perf(agent): parallelize MCP server discovery across multiple servers

* improvement(tests): speed up unit tests by eliminating vi.resetModules anti-pattern (#3357)

* improvement(tests): speed up unit tests by eliminating vi.resetModules anti-pattern

- convert 51 test files from vi.resetModules/vi.doMock/dynamic import to vi.hoisted/vi.mock/static import
- add global @sim/db mock to vitest.setup.ts
- switch 4 test files from jsdom to node environment
- remove all vi.importActual calls that loaded heavy modules (200+ block files)
- remove slow mockConsoleLogger/mockAuth/setupCommonApiMocks helpers
- reduce real setTimeout delays in engine tests
- mock heavy transitive deps in diff-engine test

test execution time: 34s -> 9s (3.9x faster)
environment time: 2.5s -> 0.6s (4x faster)
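The pattern these conversions standardize on looks roughly like this; the module paths and mock shapes are illustrative, not files from the repo:

```typescript
// Before: vi.resetModules + vi.doMock + dynamic import re-evaluated the module
// graph in every test. After: vi.hoisted + vi.mock + static import lets vitest
// hoist the mock once and reuse the transformed modules across tests.
import { describe, expect, it, vi } from 'vitest'
import { runWorkflow } from '@/executor' // illustrative import path

const mocks = vi.hoisted(() => ({
  getSession: vi.fn().mockResolvedValue({ user: { id: 'user-1' } }),
}))

// vi.mock calls are hoisted above imports, so the factory must only reference
// values from vi.hoisted, never top-level variables.
vi.mock('@/lib/auth', () => ({
  getSession: mocks.getSession,
}))

describe('runWorkflow', () => {
  it('uses the mocked session', async () => {
    await runWorkflow('workflow-1')
    expect(mocks.getSession).toHaveBeenCalled()
  })
})
```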

* docs(testing): update testing best practices with performance rules

- document vi.hoisted + vi.mock + static import as the standard pattern
- explicitly ban vi.resetModules, vi.doMock, vi.importActual, mockAuth, setupCommonApiMocks
- document global mocks from vitest.setup.ts
- add mock pattern reference for auth, hybrid auth, and database chains
- add performance rules section covering heavy deps, jsdom vs node, real timers

* fix(tests): fix 4 failing test files with missing mocks

- socket/middleware/permissions: add vi.mock for @/lib/auth to prevent transitive getBaseUrl() call
- workflow-handler: add vi.mock for @/executor/utils/http matching executor mock pattern
- evaluator-handler: add db.query.account mock structure before vi.spyOn
- router-handler: same db.query.account fix as evaluator

* fix(tests): replace banned Function type with explicit callback signature

* feat(databricks): add Databricks integration with 8 tools (#3361)

* feat(databricks): add Databricks integration with 8 tools

Add complete Databricks integration supporting SQL execution, job management,
run monitoring, and cluster listing via Personal Access Token authentication.

Tools: execute_sql, list_jobs, run_job, get_run, list_runs, cancel_run,
get_run_output, list_clusters

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(databricks): throw on invalid JSON params, fix boolean coercion, add expandTasks field

- Throw errors on invalid JSON in jobParameters/notebookParams instead of silently defaulting to {}
- Always set boolean params explicitly to prevent string 'false' being truthy
- Add missing expandTasks dropdown UI field for list_jobs operation

* fix(databricks): align tool inputs/outputs with official API spec

- execute_sql: fix wait_timeout default description (50s, not 10s)
- get_run: add queueDuration field, update lifecycle/result state enums
- get_run_output: fix notebook output size (5 MB not 1 MB), add logsTruncated field
- list_runs: add userCancelledOrTimedout to state, fix limit range (1-24), update state enums
- list_jobs: fix name filter description to "exact case-insensitive"
- list_clusters: add PIPELINE_MAINTENANCE to ClusterSource enum

* fix(databricks): regenerate docs to reflect API spec fixes

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>

* feat(luma): add Luma integration for event and guest management (#3364)

* feat(luma): add Luma integration for event and guest management

Add complete Luma (lu.ma) integration with 6 tools: get event, create event,
update event, list calendar events, get guests, and add guests. Includes block
configuration with wandConfig for timestamps/timezones/durations, advanced mode
for optional fields, and generated documentation.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(luma): address PR review feedback

- Remove hosts field from list_events transformResponse (not in LumaEventEntry type)
- Fix truncated add_guests description by removing quotes that broke docs generator

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(luma): fix update_event field name and add_guests response parsing

- Use 'id' instead of 'event_id' in update_event request body per API spec
- Fix add_guests to parse entries[].guest response structure instead of flat guests array

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>

* feat(gamma): add gamma integration for AI-powered content generation (#3358)

* feat(gamma): add gamma integration for AI-powered content generation

* fix(gamma): address PR review comments

- Make credits/error conditionally included in check_status response to avoid always-truthy objects
- Replace full wordmark SVG with square "G" letterform for proper rendering in icon slots

* fix(gamma): remove imageSource from generate_from_template endpoint

The from-template API only accepts imageOptions.model and imageOptions.style,
not imageOptions.source (image source is inherited from the template).

* fix(gamma): use typed output in check_status transformResponse

* regen docs

* feat(greenhouse): add greenhouse integration for managing candidates, jobs, and applications (#3363)

* feat(ashby): add ashby integration for candidate, job, and application management (#3362)

* feat(ashby): add ashby integration for candidate, job, and application management

* fix(ashby): auto-fix lint formatting in docs files

* improvement(oauth): reordered oauth modal (#3368)

* feat(loops): add Loops email platform integration (#3359)

* feat(loops): add Loops email platform integration

Add complete Loops integration with 10 tools covering all API endpoints:
- Contact management: create, update, find, delete
- Email: send transactional emails with attachments
- Events: trigger automated email sequences
- Lists: list mailing lists and transactional email templates
- Properties: create and list contact properties

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* ran lint

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>

* feat(resend): expand integration with contacts, domains, and enhanced email ops (#3366)

* improvement(blocks): update luma styling and linkup field modes (#3370)

* improvement(blocks): update luma styling and linkup field modes

* improvement(fireflies): move optional fields to advanced mode

* improvement(blocks): move optional fields to advanced mode for 10 integrations

* improvement(blocks): move optional fields to advanced mode for 6 more integrations

* feat(x): add 28 new X API v2 tool integrations and expand OAuth scopes (#3365)

* feat(x): add 28 new X API v2 tool integrations and expand OAuth scopes

* fix(x): add missing nextToken param to search tweets and fix XCreateTweetParams type

* fix(x): correct API spec issues in retweeted_by, quote_tweets, personalized_trends, and usage tools

* fix(x): add missing newestId and oldestId to error meta in get_liked_tweets and get_quote_tweets

* fix(x): add missing newestId/oldestId to get_liked_tweets success branch and includes to XTweetListResponse

* fix(x): add error handling to create_tweet and delete_tweet transformResponse

* fix(x): add error handling and logger to all X tools

* fix(x): revert block requiredScopes to match current operations

* feat(x): update block to support all 28 new X API v2 tools

* fix(x): add missing text output and fix hiddenResult output key mismatch

* docs(x): regenerate docs for all 28 new X API v2 tools

* improvement(docs): audit and standardize tool description sections, update developer count to 70k (#3371)

* improvement(x): align OAuth scopes, add scope descriptions, and set optional fields to advanced mode (#3372)

* improvement(x): align OAuth scopes, add scope descriptions, and set optional fields to advanced mode

* improvement(skills): add typed JSON outputs guidance to add-tools, add-block, and add-integration skills

* improvement(skills): add final validation steps to add-tools, add-block, and add-integration skills

* fix(skills): correct misleading JSON array comment in wandConfig example

* feat(skills): add validate-integration skill for auditing tools, blocks, and registry against API docs

* improvement(skills): expand validate-integration with full block-tool alignment, OAuth scopes, pagination, and error handling checks

* improvement(ci): add sticky disk caches and bump runner for faster builds (#3373)

* improvement(selectors): make selectorKeys declarative (#3374)

* fix(webflow): resolution for selectors

* remove unnecessary fallback

* fix teams selector resolution

* make selector keys declarative

* selectors fixes

* improvement(selectors): consolidate selector input logic (#3375)

* feat(google-contacts): add google contacts integration (#3340)

* feat(google-contacts): add google contacts integration

* fix(google-contacts): throw error when no update fields provided

* lint

* update icon

* improvement(google-contacts): add advanced mode, error handling, and input trimming

- Set mode: 'advanced' on optional fields (emailType, phoneType, notes, pageSize, pageToken, sortOrder)
- Add createLogger and response.ok error handling to all 6 tools
- Add .trim() on resourceName in get, update, delete URL builders

* improvement(mcp): add all MCP server tools individually instead of as single server entry (#3376)

* improvement(mcp): add all MCP server tools individually instead of as single server entry

* fix(mcp): prevent remove popover from opening inadvertently

* fix(sse): fix memory leaks in SSE stream cleanup and add memory telemetry (#3378)

* fix(sse): fix memory leaks in SSE stream cleanup and add memory telemetry

* improvement(monitoring): add SSE metering to wand, execution-stream, and a2a-message endpoints

* fix(workflow-execute): remove abort from cancel() to preserve run-on-leave behavior

* improvement(monitoring): use stable process.getActiveResourcesInfo() API

* refactor(a2a): hoist resubscribe cleanup to eliminate duplication between start() and cancel()

* style(a2a): format import line

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(wand): set guard flag on early-return decrement for consistency

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>

* improvement(ashby): validate ashby integration and update skill files (#3381)

* improvement(luma): expand host response fields and harden event ID inputs (#3383)

* improvement(resend): add error handling, authMode, and naming consistency (#3382)

* fix(chat-deploy): fix launch chat popup and auth persistence, clean up React anti-patterns (#3380)

* fix(chat-deploy): fix launch chat popup and auth persistence, clean up React anti-patterns

* lint

* fix(greenhouse): fix email_address query param, add .trim() to ID paths, revert onValidationChange to useEffect

* fix(chat-deploy): fix stale AuthSelector state, stabilize refetch ref, clean up copy timeout

* fix(chat-deploy): reset chatSuccess on modal open to prevent stuck state

* improvement(loops): validate loops integration and update skill files (#3384)

* improvement(loops): validate loops integration and update skill files

* loops icon color

* update databricks icon

* fix(monitoring): set MemoryTelemetry logger to INFO level for production visibility (#3386)

Production defaults to ERROR-only logging. Without this override,
memory snapshots would be silently suppressed.

* feat(integrations): add amplitude, google pagespeed insights, and pagerduty integrations (#3385)

* feat(integrations): add amplitude and google pagespeed insights integrations

* verified and regen docs

* fix icons

* fix(integrations): add pagerduty to tool and block registries

Re-add registry entries that were reverted after initial commit.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* more updates

* ack comments

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>

* feat(docs): add API reference with OpenAPI spec and auto-generated endpoint pages (#3388)

* feat(docs): add API reference with OpenAPI spec and auto-generated endpoint pages

* multiline curl

* random improvements

* cleanup

* update docs copy

* fix build

* cast

* fix build

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
Co-authored-by: Lakee Sivaraya <71339072+lakeesiv@users.noreply.github.com>
Co-authored-by: Vikhyath Mondreti <vikhyath@simstudio.ai>
Co-authored-by: Vikhyath Mondreti <vikhyathvikku@gmail.com>

* fix(icons): fix pagerduty icon (#3392)

* improvement(executor): audit and harden nested loop/parallel implementation

* improvement(executor): audit and harden nested loop/parallel implementation

- Replace unsafe _childWorkflowInstanceId cast with typeof type guard
- Reuse WorkflowNodeMetadata interface instead of inline type duplication
- Rename _executeCore to executeCore (private, no underscore needed)
- Add log warning when SSE callbacks are dropped beyond MAX_SSE_CHILD_DEPTH
- Remove unnecessary onStream type assertion, use StreamingExecution type
- Convert OUTPUT_PROPERTIES/KNOWN_PROPERTIES from arrays to Sets for O(1) lookup
- Add type guard in loop resolver resolveOutput before casting
- Add TSDoc to edgeCrossesLoopBoundary explaining original-ID usage
- Add TSDoc to MAX_SSE_CHILD_DEPTH constant
- Update ParentIteration TSDoc to reflect parallel nesting support
- Type usageControl as union 'auto'|'force'|'none' in buildMcpTool
- Replace (t: any) casts with typed objects in agent-handler tests
- Add type guard in builder-data convertArrayItem
- Make ctx required in clearLoopExecutionState (only caller always passes it)
- Replace Math.random() with deterministic counter in terminal tests
- Fix isWorkflowBlockType mock to actually check block types
- Add loop-in-loop and workflow block tree tests

* improvement(executor): audit fixes for nested subflow implementation

- Fix findInnermostLoopForBlock/ParallelForBlock to return deepest nested
  container instead of first Object.keys() match
- Fix isBlockInLoopOrDescendant returning false when directLoopId equals
  target (should return true)
- Add isBlockInParallelOrDescendant with recursive nested parallel checking
  to match loop resolver behavior
- Extract duplicated ~20-line iteration context building from loop/parallel
  orchestrators into shared buildContainerIterationContext utility
- Remove inline import() type references in orchestrators
- Remove dead executionOrder field from WorkflowNodeMetadata
- Remove redundant double-normalization in findParallelBoundaryNodes
- Consolidate 3 identical tree-walk helpers into generic hasMatchInTree
- Add empty-array guards for Math.min/Math.max in terminal utils
- Make KNOWN_PROPERTIES a Set in parallel resolver for consistency
- Remove no-op handleDragEnd callback from toolbar
- Remove dead result/results entries from KNOWN_PROPERTIES in loop resolver
- Add tests for buildContainerIterationContext

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* finished

* improvement(airtable): added more tools (#3396)

* fix(layout): polyfill crypto.randomUUID for non-secure HTTP contexts (#3397)

* feat(integrations): add dub.co integration (#3400)

* feat(integrations): add dub.co integration

* improvement(dub): add manual docs description and lint formatting fixes

* lint

* fix(dub): remove unsupported optional property from block outputs

* fix(memory): fix O(n²) string concatenation and unconsumed fetch response leaks (#3399)

* fix(monitoring): set MemoryTelemetry logger to INFO level for production visibility

Production defaults to ERROR-only logging. Without this override,
memory snapshots would be silently suppressed.

* fix(memory): fix O(n²) string concatenation and unconsumed fetch response leaks

* fix(tests): add text() mock to workflow-handler test fetch responses

* fix(memory): remove unused O(n²) join in onStreamChunk callback

* chore(careers): remove careers page, redirect to Ashby jobs portal (#3401)

* chore(careers): remove careers page, redirect to Ashby jobs portal

* lint

* feat(integrations): add google meet integration (#3403)

* feat(integrations): add google meet integration

* lint

* ack comments

* ack comments

* fix(terminal): deduplicate nested container entries in buildEntryTree

Filter out container-typed block rows when matching nested subflow
nodes exist, preventing nested loops/parallels from appearing twice
(once as a flat block and once as an expandable subflow).

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* improvement(executor): clean up nested subflow implementation

- Fix wireSentinelEdges to use LOOP_EXIT handle for nested loop terminals
- Extract buildExecutionPipeline to deduplicate orchestrator wiring
- Replace two-phase init with constructor injection for Loop/ParallelOrchestrator
- Remove dead code: shouldExecuteLoopNode, resolveForEachItems, isLoopNode, isParallelNode, isSubflowBlockType
- Deduplicate currentItem resolution in ParallelResolver via resolveCurrentItem
- Type getDistributionItems param as SerializedParallel instead of any
- Demote verbose per-reference logger.info to logger.debug in evaluateWhileCondition
- Add loop-in-parallel wiring test in edges.test.ts

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(test): update parallel resolver test to use distribution instead of distributionItems

The distributionItems fallback was never part of SerializedParallel — it
only worked through any typing. Updated the test to use the real
distribution property.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(executor): skip loop back-edges in parallel boundary detection and update test

findParallelBoundaryNodes now skips LOOP_CONTINUE back-edges when
detecting terminal nodes, matching findLoopBoundaryNodes behavior.
Without this, a nested loop's back-edge was incorrectly counted as a
forward edge within the parallel, preventing terminal detection.

Also updated parallel resolver test to use the real distribution
property instead of the non-existent distributionItems fallback.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(executor): clean up cloned loop scopes in deleteParallelScopeAndClones

When a parallel contains a nested loop, cloned loop scopes (__obranch-N)
created by expandParallel were not being deleted, causing stale scopes to
persist across outer loop iterations.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(executor): remove dead fallbacks, fix nested loop boundary detection, restore executionOrder

- Remove unreachable `?? candidateIds[0]` fallbacks in loop/parallel resolvers
- Remove arbitrary first-match fallback scan in findEffectiveContainerId
- Fix edgeCrossesLoopBoundary to use innermost loop detection for nested loops
- Add warning log for missing branch outputs in parallel aggregation
- Restore executionOrder on WorkflowNodeMetadata and pipe through child workflow notification
- Remove dead sim-drag-subflow classList.remove call
- Clean up cloned loop subflowParentMap entries in deleteParallelScopeAndClones

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* leftover

* upgrade turborepo

* update stagehand icon

* fix(tag-dropdown): show contextual loop/parallel tags for deeply nested blocks

findAncestorLoops only checked direct loop membership, missing blocks nested
inside parallels within loops (and vice versa). Refactored to walk through
both loop and parallel containers recursively, so a block inside a parallel
inside a loop correctly sees the loop's contextual tags (index, currentItem)
instead of the loop's output tags (results).

Also fixed parallel ancestor detection to handle nested parallel-in-loop and
loop-in-parallel scenarios, collecting all ancestor parallels instead of just
the immediate containing one.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* testing

* fixed dedicated logs

* fix

* fix(subflows): enable nested subflow interaction and execution highlighting

Remove !important z-index overrides that prevented nested subflows from
being grabbed/dragged independently. Z-index is now managed by ReactFlow's
elevateNodesOnSelect and per-node zIndex: depth props. Also adds execution
status highlighting for nested subflows in both canvas and snapshot preview.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(preview): add cycle guard to recursive subflow status derivation

Prevents infinite recursion if subflowChildrenMap contains circular
references by tracking visited nodes during traversal.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
Co-authored-by: Lakee Sivaraya <71339072+lakeesiv@users.noreply.github.com>
Co-authored-by: Vikhyath Mondreti <vikhyath@simstudio.ai>
Co-authored-by: Vikhyath Mondreti <vikhyathvikku@gmail.com>
Co-authored-by: Vasyl Abramovych <vasyl.abramovych@gmail.com>
2026-03-03 19:21:52 -08:00
Waleed
1cf7fdfc8c fix(logs): add status field to log detail API for polling (#3405) 2026-03-03 18:00:21 -08:00
Waleed
37bdffeda0 fix(socket): persist outbound edges from locked blocks (#3404)
* fix(socket): persist outbound edges from locked blocks

* fix(socket): align edge remove protection check with client-side behavior

* fix(socket): align batch edge protection checks with target-only model

* fix(socket): update stale comments for edge protection checks
2026-03-03 12:54:07 -08:00
Waleed
6fa4b9b410 feat(integrations): add brandfetch integration (#3402)
* feat(integrations): add brandfetch integration

* lint

* ack comments
2026-03-02 22:10:38 -08:00
Waleed
f0ee492ada feat(integrations): add google meet integration (#3403)
* feat(integrations): add google meet integration

* lint

* ack comments
2026-03-02 21:59:09 -08:00
Waleed
a8e0203a92 chore(careers): remove careers page, redirect to Ashby jobs portal (#3401)
* chore(careers): remove careers page, redirect to Ashby jobs portal

* lint
2026-03-02 14:12:03 -08:00
Waleed
ebb9a2bdd3 fix(memory): fix O(n²) string concatenation and unconsumed fetch response leaks (#3399)
* fix(monitoring): set MemoryTelemetry logger to INFO level for production visibility

Production defaults to ERROR-only logging. Without this override,
memory snapshots would be silently suppressed.

* fix(memory): fix O(n²) string concatenation and unconsumed fetch response leaks

* fix(tests): add text() mock to workflow-handler test fetch responses

* fix(memory): remove unused O(n²) join in onStreamChunk callback
2026-03-02 13:58:03 -08:00
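The concatenation fix above follows the standard accumulate-then-join pattern; this is a minimal sketch, not the repo's streaming code:

```typescript
// `result += chunk` re-copies the whole accumulated string on every chunk,
// O(n²) total work across a long stream. Pushing chunks into an array and
// joining once at the end is linear.
function collectChunks(chunks: Iterable<string>): string {
  const parts: string[] = []
  for (const chunk of chunks) {
    parts.push(chunk) // O(1) amortized per chunk
  }
  return parts.join('') // single O(n) concatenation
}
```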
Waleed
61a447aba5 feat(integrations): add dub.co integration (#3400)
* feat(integrations): add dub.co integration

* improvement(dub): add manual docs description and lint formatting fixes

* lint

* fix(dub): remove unsupported optional property from block outputs
2026-03-02 13:45:09 -08:00
Waleed
e91ab6260a fix(layout): polyfill crypto.randomUUID for non-secure HTTP contexts (#3397) 2026-03-02 11:57:31 -08:00
Waleed
afaa361801 improvement(airtable): added more tools (#3396) 2026-03-02 10:58:21 -08:00
Waleed
cd88706ea4 fix(icons): fix pagerduty icon (#3392) 2026-03-01 23:43:09 -08:00
134 changed files with 9937 additions and 2779 deletions

View File

@@ -1711,167 +1711,42 @@ export function StagehandIcon(props: SVGProps<SVGSVGElement>) {
   return (
     <svg
       {...props}
-      width='108'
-      height='159'
-      viewBox='0 0 108 159'
-      fill='none'
-      xmlns='http://www.w3.org/2000/svg'
+      width='256'
+      height='352'
+      viewBox='0 0 256 352'
+      fill='none'
     >
-      <path d='M15 26C22.8234 31.822 …' fill='#FDFDFD' />
-      <path d='M91 0.999996C94.8466 2.96604 …' fill='#FDFDFD' />
+      <path d='M 242.29,45.79 C 242.29,28.88 226.69,13.76 …' fill='black' />

[StagehandIcon redrawn as a single path; full <path> data elided, as the diff is truncated in the source.]
0.999996ZM3.99999 30C1.56925 34.8615 3.215 40.9393 4.24218 46.043C4.37061 46.6927 4.49905 47.3424 4.63137 48.0118C5.03968 50.0717 5.45687 52.1296 5.87499 54.1875C11.1768 80.6177 11.1768 80.6177 11.4375 93.375C11.7542 120.78 11.7542 120.78 23.5625 144.375C28.5565 149.002 33.5798 151.815 40 154C40.6922 154.244 41.3844 154.487 42.0977 154.738C55.6463 158.576 72.4909 156.79 84.8086 150.316C87.0103 148.994 89.0458 147.669 91 146C91 145.34 91 144.68 91 144C91.66 144 92.32 144 93 144C97.1202 138.98 99.3206 133.053 101.25 126.937C101.505 126.174 101.76 125.41 102.023 124.623C104.94 115.65 107.293 104.629 103.625 95.625C96.3369 88.3369 86.5231 83.6919 76.1988 83.6088C74.9905 83.6226 74.9905 83.6226 73.7578 83.6367C72.9082 83.6362 72.0586 83.6357 71.1833 83.6352C69.4034 83.6375 67.6235 83.6472 65.8437 83.6638C63.1117 83.6876 60.3806 83.6843 57.6484 83.6777C55.9141 83.6833 54.1797 83.6904 52.4453 83.6992C51.6277 83.6983 50.81 83.6974 49.9676 83.6964C45.5122 83.571 45.5122 83.571 42 86C41.517 90.1855 41.733 92.4858 43.6875 96.25C46.4096 99.4871 48.6807 101.674 53.0105 102.282C55.3425 102.411 57.6645 102.473 60 102.5C69.8847 102.612 69.8847 102.612 74 106C74.8125 108.687 74.8125 108.688 75 111C74.34 111 73.68 111 73 111C72.8969 110.216 72.7937 109.432 72.6875 108.625C72.224 105.67 72.224 105.67 69 104C65.2788 103.745 61.5953 103.634 57.8672 103.609C51.1596 103.409 46.859 101.691 41.875 97C41.2562 96.34 40.6375 95.68 40 95C39.175 94.4637 38.35 93.9275 37.5 93.375C30.9449 87.1477 30.3616 77.9789 29.4922 69.418C29.1557 66.1103 29.1557 66.1103 28.0352 63.625C26.5234 59.7915 26.1286 55.8785 25.5625 51.8125C23.9233 38.3 23.9233 38.3 17 27C11.7018 24.3509 7.9915 26.1225 3.99999 30Z'
fill='#1F1F1F'
d='M 224.94,46.23 C 224.94,36.76 215.91,28.66 205.91,28.66 C 196.75,28.66 189.9,36.11 189.9,45.14 V 152.72 C 202.88,153.38 214.08,155.96 224.94,166.19 V 78.79 L 224.94,46.23 Z'
fill='white'
/>
<path
d='M89.0976 2.53906C91 3 91 3 93.4375 5.3125C96.1586 9.99276 96.178 14.1126 96.2461 19.3828C96.2778 21.1137 96.3098 22.8446 96.342 24.5754C96.3574 25.4822 96.3728 26.3889 96.3887 27.3232C96.6322 41.3036 96.9728 55.2117 98.3396 69.1353C98.9824 75.7746 99.0977 82.3308 99 89C96.5041 88.0049 94.0126 87.0053 91.5351 85.9648C90.3112 85.4563 90.3112 85.4563 89.0625 84.9375C87.8424 84.4251 87.8424 84.4251 86.5976 83.9023C83.7463 82.9119 80.9774 82.4654 78 82C76.7702 65.9379 75.7895 49.8907 75.7004 33.7775C75.6919 32.3138 75.6783 30.8501 75.6594 29.3865C75.5553 20.4082 75.6056 12.1544 80.6875 4.4375C83.6031 2.62508 85.7 2.37456 89.0976 2.53906Z'
fill='#FBFBFB'
d='M 157.21,113.21 C 146.12,113.21 137.93,122.02 137.93,131.76 V 154.62 C 142.24,153.05 145.95,152.61 149.83,152.61 H 174.71 V 131.76 C 174.71,122.35 166.73,113.21 157.21,113.21 Z'
fill='white'
/>
<path
d='M97 13C97.99 13.495 97.99 13.495 99 14C99.0297 15.8781 99.0297 15.8781 99.0601 17.7942C99.4473 46.9184 99.4473 46.9184 100.937 76C101.012 77.0574 101.087 78.1149 101.164 79.2043C101.646 85.1082 102.203 90.3434 105.602 95.3672C107.492 98.9262 107.45 102.194 107.375 106.125C107.366 106.881 107.356 107.638 107.346 108.417C107.18 114.639 106.185 120.152 104 126C103.636 126.996 103.273 127.993 102.898 129.02C98.2182 141.022 92.6784 149.349 80.7891 155.062C67.479 160.366 49.4234 159.559 36 155C32.4272 153.286 29.2162 151.308 26 149C24.6078 148.041 24.6078 148.041 23.1875 147.062C13.5484 137.974 10.832 124.805 9.99999 112C9.91815 101.992 10.4358 91.9898 11 82C11.33 82 11.66 82 12 82C12.0146 82.6118 12.0292 83.2236 12.0442 83.854C11.5946 115.845 11.5946 115.845 24.0625 143.875C28.854 148.273 33.89 150.868 40 153C40.6935 153.245 41.387 153.49 42.1016 153.742C56.9033 157.914 73.8284 155.325 87 148C88.3301 147.327 89.6624 146.658 91 146C91 145.34 91 144.68 91 144C91.66 144 92.32 144 93 144C100.044 130.286 105.786 114.602 104 99C102.157 94.9722 100.121 93.0631 96.3125 90.875C95.5042 90.398 94.696 89.9211 93.8633 89.4297C85.199 85.1035 78.1558 84.4842 68.5 84.3125C67.2006 84.2783 65.9012 84.2442 64.5625 84.209C61.3751 84.127 58.1879 84.0577 55 84C55 83.67 55 83.34 55 83C58.9087 82.7294 62.8179 82.4974 66.7309 82.2981C68.7007 82.1902 70.6688 82.0535 72.6367 81.916C82.854 81.4233 90.4653 83.3102 99 89C98.8637 87.6094 98.8637 87.6094 98.7246 86.1907C96.96 67.8915 95.697 49.7051 95.75 31.3125C95.751 30.5016 95.7521 29.6908 95.7532 28.8554C95.7901 15.4198 95.7901 15.4198 97 13Z'
fill='#262114'
d='M 100.06,111.75 C 89.19,111.75 81.85,121.06 81.85,130.31 V 157.86 C 81.85,167.71 89.72,175.38 99.24,175.38 C 109.71,175.38 118.39,166.91 118.39,157.39 V 130.31 C 118.39,120.79 110.03,111.75 100.06,111.75 Z'
fill='white'
/>
<path
d='M68 51C72.86 54.06 74.644 56.5072 76 62C76.249 65.2763 76.2347 68.5285 76.1875 71.8125C76.1868 72.6833 76.1862 73.554 76.1855 74.4512C76.1406 80.8594 76.1406 80.8594 75 82C73.5113 82.0867 72.0185 82.107 70.5273 82.0976C69.6282 82.0944 68.7291 82.0912 67.8027 82.0879C66.8572 82.0795 65.9117 82.0711 64.9375 82.0625C63.9881 82.058 63.0387 82.0535 62.0605 82.0488C59.707 82.037 57.3535 82.0205 55 82C53.6352 77.2188 53.738 72.5029 53.6875 67.5625C53.6585 66.6208 53.6295 65.6792 53.5996 64.709C53.5591 60.2932 53.5488 57.7378 55.8945 53.9023C59.5767 50.5754 63.1766 50.211 68 51Z'
fill='#F8F8F8'
d='M 192.04,168.87 H 150.16 C 140.19,168.87 133.34,175.39 133.34,183.86 C 133.34,192.9 140.19,199.75 148.66,199.75 H 182.52 C 188.01,199.75 189.63,204.81 189.63,207.49 C 189.63,211.91 186.37,214.64 181.09,215.51 C 162.96,218.66 137.71,229.13 137.71,259.68 C 137.71,265.07 133.67,267.42 130.29,267.42 C 126.09,267.42 122.38,264.74 122.38,260.12 C 122.38,241.15 129.02,228.17 143.26,214.81 C 131.01,212.02 119.21,202.99 117.75,186.43 C 111.93,189.81 107.2,191.15 100.18,191.15 C 82.11,191.15 66.68,176.58 66.68,158.29 V 80.71 C 66.68,71.24 57.16,63.5 49.18,63.5 C 38.71,63.5 29.89,72.42 29.89,80.27 V 217.19 C 29.89,266.48 68.71,322.19 124.88,322.19 C 185.91,322.19 223.91,282.15 223.91,207.16 C 223.91,187.19 214.28,168.87 192.04,168.87 Z'
fill='white'
/>
</svg>
)
}
export function BrandfetchIcon(props: SVGProps<SVGSVGElement>) {
return (
<svg {...props} viewBox='0 0 29 31' fill='none' xmlns='http://www.w3.org/2000/svg'>
<path
d='M46 55C48.7557 57.1816 50.4359 58.8718 52 62C52.0837 63.5215 52.1073 65.0466 52.0977 66.5703C52.0944 67.4662 52.0912 68.3621 52.0879 69.2852C52.0795 70.2223 52.0711 71.1595 52.0625 72.125C52.058 73.0699 52.0535 74.0148 52.0488 74.9883C52.037 77.3256 52.0206 79.6628 52 82C50.9346 82.1992 50.9346 82.1992 49.8477 82.4023C48.9286 82.5789 48.0094 82.7555 47.0625 82.9375C46.146 83.1115 45.2294 83.2855 44.2852 83.4648C42.0471 83.7771 42.0471 83.7771 41 85C40.7692 86.3475 40.5885 87.7038 40.4375 89.0625C40.2931 90.3619 40.1487 91.6613 40 93C37 92 37 92 35.8672 90.1094C35.5398 89.3308 35.2123 88.5522 34.875 87.75C34.5424 86.9817 34.2098 86.2134 33.8672 85.4219C31.9715 80.1277 31.7884 75.065 31.75 69.5C31.7294 68.7536 31.7087 68.0073 31.6875 67.2383C31.6551 62.6607 32.0474 59.7266 35 56C38.4726 54.2637 42.2119 54.3981 46 55Z'
fill='#FAFAFA'
/>
<path
d='M97 13C97.66 13.33 98.32 13.66 99 14C99.0297 15.8781 99.0297 15.8781 99.0601 17.7942C99.4473 46.9184 99.4473 46.9184 100.937 76C101.012 77.0574 101.087 78.1149 101.164 79.2043C101.566 84.1265 102.275 88.3364 104 93C103.625 95.375 103.625 95.375 103 97C102.361 96.2781 101.721 95.5563 101.062 94.8125C94.4402 88.1902 85.5236 84.8401 76.2891 84.5859C75.0451 84.5473 73.8012 84.5086 72.5195 84.4688C71.2343 84.4378 69.9491 84.4069 68.625 84.375C66.6624 84.317 66.6624 84.317 64.6601 84.2578C61.4402 84.1638 58.2203 84.0781 55 84C55 83.67 55 83.34 55 83C58.9087 82.7294 62.8179 82.4974 66.7309 82.2981C68.7007 82.1902 70.6688 82.0535 72.6367 81.916C82.854 81.4233 90.4653 83.3102 99 89C98.9091 88.0729 98.8182 87.1458 98.7246 86.1907C96.96 67.8915 95.697 49.7051 95.75 31.3125C95.751 30.5016 95.7521 29.6908 95.7532 28.8554C95.7901 15.4198 95.7901 15.4198 97 13Z'
fill='#423B28'
/>
<path
d='M91 0.999996C94.3999 3.06951 96.8587 5.11957 98 9C97.625 12.25 97.625 12.25 97 15C95.804 12.6081 94.6146 10.2139 93.4375 7.8125C92.265 5.16236 92.265 5.16236 91 4C88.074 3.7122 85.8483 3.51695 83 4C79.1128 7.37574 78.178 11.0991 77 16C76.8329 18.5621 76.7615 21.1317 76.7695 23.6992C76.77 24.4155 76.7704 25.1318 76.7709 25.8698C76.7739 27.3783 76.7817 28.8868 76.7942 30.3953C76.8123 32.664 76.8147 34.9324 76.8144 37.2012C76.8329 44.6001 77.0765 51.888 77.7795 59.259C78.1413 63.7564 78.1068 68.2413 78.0625 72.75C78.058 73.6498 78.0535 74.5495 78.0488 75.4766C78.0373 77.6511 78.0193 79.8255 78 82C78.99 82.495 78.99 82.495 80 83C68.78 83.33 57.56 83.66 46 84C46.495 83.01 46.495 83.01 47 82C52.9349 80.7196 58.8909 80.8838 64.9375 80.9375C65.9075 80.942 66.8775 80.9465 67.8769 80.9512C70.2514 80.9629 72.6256 80.9793 75 81C75.0544 77.9997 75.0939 75.0005 75.125 72C75.1418 71.1608 75.1585 70.3216 75.1758 69.457C75.2185 63.9475 74.555 59.2895 73 54C73.66 54 74.32 54 75 54C74.9314 53.2211 74.8629 52.4422 74.7922 51.6396C74.1158 43.5036 73.7568 35.4131 73.6875 27.25C73.644 25.5194 73.644 25.5194 73.5996 23.7539C73.5376 15.3866 74.6189 8.85069 80.25 2.4375C83.9433 0.506911 86.9162 0.173322 91 0.999996Z'
fill='#131311'
/>
<path
d='M15 24C20.2332 26.3601 22.1726 29.3732 24.1875 34.5195C26.8667 42.6988 27.2651 50.4282 27 59C26.67 59 26.34 59 26 59C25.8945 58.436 25.7891 57.8721 25.6804 57.291C25.1901 54.6926 24.6889 52.0963 24.1875 49.5C24.0218 48.6131 23.8562 47.7262 23.6855 46.8125C21.7568 35.5689 21.7568 35.5689 15 27C12.0431 26.2498 12.0431 26.2498 8.99999 27C5.97965 28.9369 5.97965 28.9369 3.99999 32C3.67226 36.9682 4.31774 41.4911 5.27733 46.3594C5.40814 47.0304 5.53894 47.7015 5.67371 48.3929C5.94892 49.7985 6.22723 51.2035 6.50854 52.6079C6.93887 54.7569 7.35989 56.9075 7.77929 59.0586C9.09359 66.104 9.09359 66.104 11 73C11.0836 75.2109 11.1073 77.4243 11.0976 79.6367C11.0944 80.9354 11.0912 82.2342 11.0879 83.5723C11.0795 84.944 11.0711 86.3158 11.0625 87.6875C11.0575 89.071 11.0529 90.4544 11.0488 91.8379C11.037 95.2253 11.0206 98.6126 11 102C8.54975 99.5498 8.73228 98.8194 8.65624 95.4492C8.62812 94.53 8.60001 93.6108 8.57104 92.6638C8.54759 91.6816 8.52415 90.6994 8.49999 89.6875C8.20265 81.3063 7.58164 73.2485 5.99999 65C5.67135 63.2175 5.34327 61.435 5.01562 59.6523C4.31985 55.9098 3.62013 52.1681 2.90233 48.4297C2.75272 47.6484 2.60311 46.867 2.44897 46.062C1.99909 43.8187 1.99909 43.8187 0.999995 41C0.505898 36.899 0.0476353 32.7768 2.04687 29.0469C6.06003 24.1839 8.81126 23.4843 15 24Z'
fill='#2A2311'
/>
<path
d='M11 82C11.33 82 11.66 82 12 82C12.0146 82.6118 12.0292 83.2236 12.0442 83.854C11.5946 115.845 11.5946 115.845 24.0625 143.875C30.0569 149.404 36.9894 152.617 45 154C42 156 42 156 39.4375 156C29.964 153.244 20.8381 146.677 16 138C8.26993 120.062 9.92611 101.014 11 82Z'
fill='#272214'
/>
<path
d='M68 49C70.3478 50.1116 71.9703 51.3346 74 53C73.34 53.66 72.68 54.32 72 55C71.505 54.505 71.01 54.01 70.5 53.5C67.6718 51.8031 65.3662 51.5622 62.0976 51.4062C58.4026 52.4521 57.1992 53.8264 55 57C54.3826 61.2861 54.5302 65.4938 54.6875 69.8125C54.7101 70.9823 54.7326 72.1521 54.7559 73.3574C54.8147 76.2396 54.8968 79.1191 55 82C54.01 82 53.02 82 52 82C51.9854 81.4203 51.9708 80.8407 51.9558 80.2434C51.881 77.5991 51.7845 74.9561 51.6875 72.3125C51.6649 71.4005 51.6424 70.4885 51.6191 69.5488C51.4223 64.6292 51.2621 60.9548 48 57C45.6603 55.8302 44.1661 55.8339 41.5625 55.8125C40.78 55.7983 39.9976 55.7841 39.1914 55.7695C36.7079 55.8591 36.7079 55.8591 34 58C32.7955 60.5518 32.7955 60.5518 32 63C31.34 63 30.68 63 30 63C30.2839 59.6879 31.0332 57.9518 32.9375 55.1875C36.7018 52.4987 38.9555 52.3484 43.4844 52.5586C47.3251 53.2325 49.8148 54.7842 53 57C53.0928 56.1338 53.0928 56.1338 53.1875 55.25C55.6091 48.544 61.7788 47.8649 68 49Z'
fill='#1F1A0F'
/>
<path
d='M99 60C99.33 60 99.66 60 100 60C100.05 60.7865 100.1 61.573 100.152 62.3833C100.385 65.9645 100.63 69.5447 100.875 73.125C100.954 74.3625 101.032 75.6 101.113 76.875C101.197 78.0738 101.281 79.2727 101.367 80.5078C101.44 81.6075 101.514 82.7073 101.589 83.8403C102.013 87.1 102.94 89.8988 104 93C103.625 95.375 103.625 95.375 103 97C102.361 96.2781 101.721 95.5563 101.062 94.8125C94.4402 88.1902 85.5236 84.8401 76.2891 84.5859C74.4231 84.5279 74.4231 84.5279 72.5195 84.4688C71.2343 84.4378 69.9491 84.4069 68.625 84.375C67.3166 84.3363 66.0082 84.2977 64.6601 84.2578C61.4402 84.1638 58.2203 84.0781 55 84C55 83.67 55 83.34 55 83C58.9087 82.7294 62.8179 82.4974 66.7309 82.2981C68.7007 82.1902 70.6688 82.0535 72.6367 81.916C82.854 81.4233 90.4653 83.3102 99 89C98.9162 87.912 98.8324 86.8241 98.7461 85.7031C98.1266 77.012 97.9127 68.6814 99 60Z'
fill='#332E22'
/>
<path
d='M15 24C20.2332 26.3601 22.1726 29.3732 24.1875 34.5195C26.8667 42.6988 27.2651 50.4282 27 59C26.67 59 26.34 59 26 59C25.8945 58.436 25.7891 57.8721 25.6804 57.291C25.1901 54.6926 24.6889 52.0963 24.1875 49.5C24.0218 48.6131 23.8562 47.7262 23.6855 46.8125C21.7568 35.5689 21.7568 35.5689 15 27C12.0431 26.2498 12.0431 26.2498 8.99999 27C5.2818 29.7267 4.15499 31.2727 3.18749 35.8125C3.12562 36.8644 3.06374 37.9163 2.99999 39C2.33999 39 1.67999 39 0.999992 39C0.330349 31.2321 0.330349 31.2321 3.37499 27.5625C7.31431 23.717 9.51597 23.543 15 24Z'
fill='#1D180A'
/>
<path
d='M91 0.999996C94.3999 3.06951 96.8587 5.11957 98 9C97.625 12.25 97.625 12.25 97 15C95.804 12.6081 94.6146 10.2139 93.4375 7.8125C92.265 5.16236 92.265 5.16236 91 4C85.4345 3.33492 85.4345 3.33491 80.6875 5.75C78.5543 9.85841 77.6475 13.9354 76.7109 18.4531C76.4763 19.2936 76.2417 20.1341 76 21C75.34 21.33 74.68 21.66 74 22C73.5207 15.4102 74.5846 10.6998 78 5C81.755 0.723465 85.5463 -0.103998 91 0.999996Z'
fill='#16130D'
/>
<path
d='M42 93C42.5569 93.7631 43.1137 94.5263 43.6875 95.3125C46.4238 98.4926 48.7165 100.679 53.0105 101.282C55.3425 101.411 57.6646 101.473 60 101.5C70.6207 101.621 70.6207 101.621 75 106C75.0406 107.666 75.0427 109.334 75 111C74.34 111 73.68 111 73 111C72.7112 110.196 72.4225 109.391 72.125 108.562C71.2674 105.867 71.2674 105.867 69 105C65.3044 104.833 61.615 104.703 57.916 104.658C52.1631 104.454 48.7484 103.292 44 100C41.5625 97.25 41.5625 97.25 40 95C40.66 95 41.32 95 42 95C42 94.34 42 93.68 42 93Z'
fill='#2B2B2B'
/>
<path
d='M11 82C11.33 82 11.66 82 12 82C12.1682 86.6079 12.3287 91.216 12.4822 95.8245C12.5354 97.3909 12.5907 98.9574 12.6482 100.524C12.7306 102.78 12.8055 105.036 12.8789 107.293C12.9059 107.989 12.933 108.685 12.9608 109.402C13.0731 113.092 12.9015 116.415 12 120C11.67 120 11.34 120 11 120C9.63778 112.17 10.1119 104.4 10.4375 96.5C10.4908 95.0912 10.5436 93.6823 10.5957 92.2734C10.7247 88.8487 10.8596 85.4243 11 82Z'
fill='#4D483B'
/>
<path
d='M43.4844 52.5586C47.3251 53.2325 49.8148 54.7842 53 57C52 59 52 59 50 60C49.5256 59.34 49.0512 58.68 48.5625 58C45.2656 55.4268 43.184 55.5955 39.1211 55.6641C36.7043 55.8955 36.7043 55.8955 34 58C32.7955 60.5518 32.7955 60.5518 32 63C31.34 63 30.68 63 30 63C30.2839 59.6879 31.0332 57.9518 32.9375 55.1875C36.7018 52.4987 38.9555 52.3484 43.4844 52.5586Z'
fill='#221F16'
/>
<path
d='M76 73C76.33 73 76.66 73 77 73C77 75.97 77 78.94 77 82C78.485 82.495 78.485 82.495 80 83C68.78 83.33 57.56 83.66 46 84C46.33 83.34 46.66 82.68 47 82C52.9349 80.7196 58.8909 80.8838 64.9375 80.9375C65.9075 80.942 66.8775 80.9465 67.8769 80.9512C70.2514 80.9629 72.6256 80.9793 75 81C75.33 78.36 75.66 75.72 76 73Z'
fill='#040404'
/>
<path
d='M27 54C27.33 54 27.66 54 28 54C28.33 56.97 28.66 59.94 29 63C29.99 63 30.98 63 32 63C32 66.96 32 70.92 32 75C31.01 74.67 30.02 74.34 29 74C28.8672 73.2523 28.7344 72.5047 28.5977 71.7344C28.421 70.7495 28.2444 69.7647 28.0625 68.75C27.8885 67.7755 27.7144 66.8009 27.5352 65.7969C27.0533 63.087 27.0533 63.087 26.4062 60.8125C25.8547 58.3515 26.3956 56.4176 27 54Z'
fill='#434039'
/>
<path
d='M78 5C78.99 5.33 79.98 5.66 81 6C80.3194 6.92812 80.3194 6.92812 79.625 7.875C77.7233 11.532 77.1555 14.8461 76.5273 18.8906C76.3533 19.5867 76.1793 20.2828 76 21C75.34 21.33 74.68 21.66 74 22C73.5126 15.2987 74.9229 10.9344 78 5Z'
fill='#2A2313'
/>
<path
d='M12 115C12.99 115.495 12.99 115.495 14 116C14.5334 118.483 14.9326 120.864 15.25 123.375C15.3531 124.061 15.4562 124.747 15.5625 125.453C16.0763 129.337 16.2441 130.634 14 134C12.6761 127.57 11.752 121.571 12 115Z'
fill='#2F2C22'
/>
<path
d='M104 95C107 98 107 98 107.363 101.031C107.347 102.176 107.33 103.321 107.312 104.5C107.309 105.645 107.305 106.789 107.301 107.969C107 111 107 111 105 114C104.67 107.73 104.34 101.46 104 95Z'
fill='#120F05'
/>
<path
d='M56 103C58.6048 102.919 61.2071 102.86 63.8125 102.812C64.5505 102.787 65.2885 102.762 66.0488 102.736C71.4975 102.662 71.4975 102.662 74 104.344C75.374 106.619 75.2112 108.396 75 111C74.34 111 73.68 111 73 111C72.7112 110.196 72.4225 109.391 72.125 108.562C71.2674 105.867 71.2674 105.867 69 105C66.7956 104.77 64.5861 104.589 62.375 104.438C61.1865 104.354 59.998 104.27 58.7734 104.184C57.4006 104.093 57.4006 104.093 56 104C56 103.67 56 103.34 56 103Z'
fill='#101010'
/>
<path
d='M23 40C23.66 40 24.32 40 25 40C27.3084 46.3482 27.1982 52.2948 27 59C26.67 59 26.34 59 26 59C25.01 52.73 24.02 46.46 23 40Z'
fill='#191409'
/>
<path
d='M47 83C46.3606 83.3094 45.7212 83.6187 45.0625 83.9375C41.9023 87.0977 42.181 90.6833 42 95C41.01 94.67 40.02 94.34 39 94C39.3463 85.7409 39.3463 85.7409 41.875 82.875C44 82 44 82 47 83Z'
fill='#171717'
/>
<path
d='M53 61C53.33 61 53.66 61 54 61C54.33 67.93 54.66 74.86 55 82C54.01 82 53.02 82 52 82C52.33 75.07 52.66 68.14 53 61Z'
fill='#444444'
/>
<path
d='M81 154C78.6696 156.33 77.8129 156.39 74.625 156.75C73.4687 156.897 73.4687 156.897 72.2891 157.047C69.6838 156.994 68.2195 156.317 66 155C67.7478 154.635 69.4984 154.284 71.25 153.938C72.7118 153.642 72.7118 153.642 74.2031 153.34C76.8681 153.016 78.4887 153.145 81 154Z'
fill='#332F23'
/>
<path
d='M19 28C19.66 28 20.32 28 21 28C21.6735 29.4343 22.3386 30.8726 23 32.3125C23.5569 33.5133 23.5569 33.5133 24.125 34.7383C25 37 25 37 25 40C22 39 22 39 21.0508 37.2578C20.8071 36.554 20.5635 35.8502 20.3125 35.125C20.0611 34.4263 19.8098 33.7277 19.5508 33.0078C19 31 19 31 19 28Z'
fill='#282213'
/>
<path
d='M102 87C104.429 93.2857 104.429 93.2857 103 97C100.437 94.75 100.437 94.75 98 92C98.0625 89.75 98.0625 89.75 99 88C101 87 101 87 102 87Z'
fill='#37301F'
/>
<path
d='M53 56C53.33 56 53.66 56 54 56C53.67 62.27 53.34 68.54 53 75C52.67 75 52.34 75 52 75C51.7788 72.2088 51.5726 69.4179 51.375 66.625C51.3105 65.8309 51.2461 65.0369 51.1797 64.2188C51.0394 62.1497 51.0124 60.0737 51 58C51.66 57.34 52.32 56.68 53 56Z'
fill='#030303'
/>
<path
d='M100 129C100.33 129 100.66 129 101 129C100.532 133.776 99.7567 137.045 97 141C96.34 140.67 95.68 140.34 95 140C96.65 136.37 98.3 132.74 100 129Z'
fill='#1E1A12'
/>
<path
d='M15 131C17.7061 132.353 17.9618 133.81 19.125 136.562C19.4782 137.389 19.8314 138.215 20.1953 139.066C20.4609 139.704 20.7264 140.343 21 141C20.01 141 19.02 141 18 141C15.9656 137.27 15 135.331 15 131Z'
fill='#1C1912'
/>
<path
d='M63 49C69.4 49.4923 69.4 49.4923 72.4375 52.0625C73.2109 53.0216 73.2109 53.0216 74 54C70.8039 54 69.5828 53.4533 66.8125 52C66.0971 51.6288 65.3816 51.2575 64.6445 50.875C64.1018 50.5863 63.5591 50.2975 63 50C63 49.67 63 49.34 63 49Z'
fill='#13110C'
/>
<path
d='M0.999992 39C1.98999 39 2.97999 39 3.99999 39C5.24999 46.625 5.24999 46.625 2.99999 50C2.33999 46.37 1.67999 42.74 0.999992 39Z'
fill='#312C1E'
/>
<path
d='M94 5C94.66 5 95.32 5 96 5C97.8041 7.75924 98.0127 8.88972 97.625 12.25C97.4187 13.1575 97.2125 14.065 97 15C95.1161 11.7345 94.5071 8.71888 94 5Z'
fill='#292417'
/>
<path
d='M20 141C23.3672 142.393 24.9859 143.979 27 147C24.625 146.812 24.625 146.812 22 146C20.6875 143.438 20.6875 143.438 20 141Z'
fill='#373328'
/>
<path
d='M86 83C86.33 83.99 86.66 84.98 87 86C83.37 85.34 79.74 84.68 76 84C80.3553 81.8223 81.4663 81.9696 86 83Z'
fill='#2F2F2F'
/>
<path
d='M42 93C46 97.625 46 97.625 46 101C44.02 99.35 42.04 97.7 40 96C40.66 95.67 41.32 95.34 42 95C42 94.34 42 93.68 42 93Z'
fill='#232323'
/>
<path
d='M34 55C34.66 55.33 35.32 55.66 36 56C35.5256 56.7838 35.0512 57.5675 34.5625 58.375C33.661 59.8895 32.7882 61.4236 32 63C31.34 63 30.68 63 30 63C30.4983 59.3125 31.1007 57.3951 34 55Z'
fill='#110F0A'
d='M29 7.54605C29 9.47222 28.316 11.1378 26.9481 12.5428C25.5802 13.9251 23.5852 14.9222 20.9634 15.534C22.377 15.9192 23.4484 16.5537 24.1781 17.4375C24.9077 18.2987 25.2724 19.2956 25.2724 20.4287C25.2724 22.2189 24.7025 23.7713 23.5625 25.0855C22.4454 26.3998 20.8039 27.4195 18.638 28.1447C16.4721 28.8472 13.8616 29.1985 10.8066 29.1985C9.66666 29.1985 8.75472 29.1645 8.07075 29.0965C8.04796 29.7309 7.77438 30.2068 7.25 30.5241C6.72562 30.8414 6.05307 31 5.23231 31C4.41156 31 3.84159 30.8187 3.52241 30.4561C3.22603 30.0936 3.10062 29.561 3.14623 28.8586C3.35141 25.686 3.75039 22.3662 4.34316 18.8991C4.93593 15.4094 5.68829 12.0442 6.60024 8.80373C6.75982 8.23721 7.07901 7.84064 7.55778 7.61404C8.03656 7.38743 8.66353 7.27412 9.43868 7.27412C10.8294 7.27412 11.5248 7.65936 11.5248 8.42983C11.5248 8.74708 11.4564 9.10965 11.3196 9.51754C10.7268 11.2851 10.134 13.6871 9.54127 16.7237C8.9485 19.7375 8.52674 22.6156 8.27594 25.3575C9.37028 25.448 10.2594 25.4934 10.9434 25.4934C14.1352 25.4934 16.4721 25.0401 17.954 24.1338C19.4587 23.2046 20.2111 22.0263 20.2111 20.5987C20.2111 19.6016 19.778 18.7632 18.9116 18.0833C18.0681 17.4035 16.6431 17.0296 14.6368 16.9616C14.1808 16.939 13.8616 16.8257 13.6792 16.6217C13.4968 16.4178 13.4057 16.0892 13.4057 15.636C13.4057 14.9788 13.5425 14.4463 13.816 14.0384C14.0896 13.6305 14.5912 13.4152 15.3208 13.3925C16.9395 13.3472 18.3986 13.1093 19.6981 12.6787C21.0204 12.2482 22.0578 11.6477 22.8101 10.8772C23.5625 10.0841 23.9387 9.1663 23.9387 8.1239C23.9387 6.80958 23.2889 5.77851 21.9894 5.0307C20.6899 4.26024 18.6949 3.875 16.0047 3.875C13.5652 3.875 11.2056 4.19226 8.92571 4.82676C6.64584 5.4386 4.70793 6.2204 3.11203 7.17215C2.38246 7.6027 1.7669 7.81798 1.26533 7.81798C0.854953 7.81798 0.53577 7.68202 0.307783 7.41009C0.102594 7.1155 0 6.75292 0 6.32237C0 5.75585 0.113994 5.26864 0.341981 4.86075C0.592768 4.45285 1.17414 3.98831 2.08608 3.46711C4.00118 2.37939 6.24685 1.52961 8.82311 0.917763C11.3994 0.305921 14.0326 0 
16.7229 0C20.8494 0 23.9272 0.691156 25.9564 2.07347C27.9855 3.45577 29 5.27998 29 7.54605Z'
fill='currentColor'
/>
</svg>
)
@@ -2467,7 +2342,7 @@ export function PagerDutyIcon(props: SVGProps<SVGSVGElement>) {
<svg {...props} xmlns='http://www.w3.org/2000/svg' viewBox='0 0 64 64' fill='none'>
<path
d='M6.704 59.217H0v-33.65c0-3.455 1.418-5.544 2.604-6.704 2.63-2.58 6.2-2.656 6.782-2.656h10.546c3.765 0 5.93 1.52 7.117 2.8 2.346 2.553 2.372 5.853 2.32 6.73v12.687c0 3.662-1.496 5.828-2.733 6.988-2.553 2.398-5.93 2.45-6.73 2.424H6.704zm13.46-18.102c.36 0 1.367-.103 1.908-.62.413-.387.62-1.083.62-2.1v-13.02c0-.36-.077-1.315-.593-1.857-.5-.516-1.444-.62-2.166-.62h-10.6c-2.63 0-2.63 1.985-2.63 2.656v15.55zM57.296 4.783H64V38.46c0 3.455-1.418 5.544-2.604 6.704-2.63 2.58-6.2 2.656-6.782 2.656H44.068c-3.765 0-5.93-1.52-7.117-2.8-2.346-2.553-2.372-5.853-2.32-6.73V25.62c0-3.662 1.496-5.828 2.733-6.988 2.553-2.398 5.93-2.45 6.73-2.424h13.202zM43.836 22.9c-.36 0-1.367.103-1.908.62-.413.387-.62 1.083-.62 2.1v13.02c0 .36.077 1.315.593 1.857.5.516 1.444.62 2.166.62h10.598c2.656-.026 2.656-2 2.656-2.682V22.9z'
fill='#06AC38'
fill='#FFFFFF'
/>
</svg>
)
@@ -4796,6 +4671,22 @@ export function GoogleGroupsIcon(props: SVGProps<SVGSVGElement>) {
)
}
export function GoogleMeetIcon(props: SVGProps<SVGSVGElement>) {
return (
<svg {...props} xmlns='http://www.w3.org/2000/svg' fill='none' viewBox='0 0 87.5 72'>
<path fill='#00832d' d='M49.5 36l8.53 9.75 11.47 7.33 2-17.02-2-16.64-11.69 6.44z' />
<path fill='#0066da' d='M0 51.5V66c0 3.315 2.685 6 6 6h14.5l3-10.96-3-9.54-9.95-3z' />
<path fill='#e94235' d='M20.5 0L0 20.5l10.55 3 9.95-3 2.95-9.41z' />
<path fill='#2684fc' d='M20.5 20.5H0v31h20.5z' />
<path
fill='#00ac47'
d='M82.6 8.68L69.5 19.42v33.66l13.16 10.79c1.97 1.54 4.85.135 4.85-2.37V11c0-2.535-2.945-3.925-4.91-2.32zM49.5 36v15.5h-29V72h43c3.315 0 6-2.685 6-6V53.08z'
/>
<path fill='#ffba00' d='M63.5 0h-43v20.5h29V36l20-16.57V6c0-3.315-2.685-6-6-6z' />
</svg>
)
}
export function CursorIcon(props: SVGProps<SVGSVGElement>) {
return (
<svg {...props} xmlns='http://www.w3.org/2000/svg' viewBox='0 0 546 546' fill='currentColor'>
@@ -4804,6 +4695,19 @@ export function CursorIcon(props: SVGProps<SVGSVGElement>) {
)
}
export function DubIcon(props: SVGProps<SVGSVGElement>) {
return (
<svg {...props} viewBox='0 0 64 64' fill='none' xmlns='http://www.w3.org/2000/svg'>
<path
fillRule='evenodd'
clipRule='evenodd'
d='M32 64c17.673 0 32-14.327 32-32 0-11.844-6.435-22.186-16-27.719V48h-8v-2.14A15.9 15.9 0 0 1 32 48c-8.837 0-16-7.163-16-16s7.163-16 16-16c2.914 0 5.647.78 8 2.14V1.008A32 32 0 0 0 32 0C14.327 0 0 14.327 0 32s14.327 32 32 32'
fill='currentColor'
/>
</svg>
)
}
export function DuckDuckGoIcon(props: SVGProps<SVGSVGElement>) {
return (
<svg {...props} xmlns='http://www.w3.org/2000/svg' viewBox='-108 -108 216 216'>


@@ -17,6 +17,7 @@ import {
AshbyIcon,
AttioIcon,
BrainIcon,
BrandfetchIcon,
BrowserUseIcon,
CalComIcon,
CalendlyIcon,
@@ -33,6 +34,7 @@ import {
DocumentIcon,
DropboxIcon,
DsPyIcon,
DubIcon,
DuckDuckGoIcon,
DynamoDBIcon,
ElasticsearchIcon,
@@ -57,6 +59,7 @@ import {
GoogleGroupsIcon,
GoogleIcon,
GoogleMapsIcon,
GoogleMeetIcon,
GooglePagespeedIcon,
GoogleSheetsIcon,
GoogleSlidesIcon,
@@ -177,6 +180,7 @@ export const blockTypeToIconMap: Record<string, IconComponent> = {
asana: AsanaIcon,
ashby: AshbyIcon,
attio: AttioIcon,
brandfetch: BrandfetchIcon,
browser_use: BrowserUseIcon,
calcom: CalComIcon,
calendly: CalendlyIcon,
@@ -192,6 +196,7 @@ export const blockTypeToIconMap: Record<string, IconComponent> = {
discord: DiscordIcon,
dropbox: DropboxIcon,
dspy: DsPyIcon,
dub: DubIcon,
duckduckgo: DuckDuckGoIcon,
dynamodb: DynamoDBIcon,
elasticsearch: ElasticsearchIcon,
@@ -215,6 +220,7 @@ export const blockTypeToIconMap: Record<string, IconComponent> = {
google_forms: GoogleFormsIcon,
google_groups: GoogleGroupsIcon,
google_maps: GoogleMapsIcon,
google_meet: GoogleMeetIcon,
google_pagespeed: GooglePagespeedIcon,
google_search: GoogleIcon,
google_sheets_v2: GoogleSheetsIcon,


@@ -26,12 +26,63 @@ In Sim, the Airtable integration enables your agents to interact with your Airta
## Usage Instructions
Integrates Airtable into the workflow. Can create, get, list, or update Airtable records. Can be used in trigger mode to trigger a workflow when an update is made to an Airtable table.
Integrates Airtable into the workflow. Can list bases, list tables (with schema), and create, get, list, or update records. Can also be used in trigger mode to trigger a workflow when an update is made to an Airtable table.
## Tools
### `airtable_list_bases`
List all Airtable bases the user has access to
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `offset` | string | No | Pagination offset for retrieving additional bases |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `bases` | array | List of Airtable bases |
| ↳ `id` | string | Base ID \(starts with "app"\) |
| ↳ `name` | string | Base name |
| ↳ `permissionLevel` | string | Permission level \(none, read, comment, edit, create\) |
| `metadata` | json | Pagination and count metadata |
| ↳ `offset` | string | Offset for next page of results |
| ↳ `totalBases` | number | Number of bases returned |
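For reference, a successful `airtable_list_bases` response follows the shape documented above. The payload below is purely illustrative — the IDs, names, and offset value are hypothetical, not taken from the source:

```json
{
  "bases": [
    {
      "id": "appXXXXXXXXXXXXXX",
      "name": "Product Catalog",
      "permissionLevel": "create"
    }
  ],
  "metadata": {
    "offset": "itrXXXXXXXXXXXXXX",
    "totalBases": 1
  }
}
```

When `metadata.offset` is present, pass it back as the `offset` input parameter to retrieve the next page of bases.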
### `airtable_list_tables`
List all tables and their schema in an Airtable base
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `baseId` | string | Yes | Airtable base ID \(starts with "app", e.g., "appXXXXXXXXXXXXXX"\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `tables` | array | List of tables in the base with their schema |
| ↳ `id` | string | Table ID \(starts with "tbl"\) |
| ↳ `name` | string | Table name |
| ↳ `description` | string | Table description |
| ↳ `primaryFieldId` | string | ID of the primary field |
| ↳ `fields` | array | List of fields in the table |
| ↳ `id` | string | Field ID \(starts with "fld"\) |
| ↳ `name` | string | Field name |
| ↳ `type` | string | Field type \(singleLineText, multilineText, number, checkbox, singleSelect, multipleSelects, date, dateTime, attachment, linkedRecord, etc.\) |
| ↳ `description` | string | Field description |
| ↳ `options` | json | Field-specific options \(choices, etc.\) |
| `metadata` | json | Base info and count metadata |
| ↳ `baseId` | string | The base ID queried |
| ↳ `totalTables` | number | Number of tables in the base |
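A minimal sketch of an `airtable_list_tables` response, matching the schema above (table, field, and base IDs are hypothetical placeholders, not from the source):

```json
{
  "tables": [
    {
      "id": "tblXXXXXXXXXXXXXX",
      "name": "Tasks",
      "description": "Team task tracker",
      "primaryFieldId": "fldXXXXXXXXXXXXXX",
      "fields": [
        {
          "id": "fldXXXXXXXXXXXXXX",
          "name": "Name",
          "type": "singleLineText",
          "description": "Task title",
          "options": null
        }
      ]
    }
  ],
  "metadata": {
    "baseId": "appXXXXXXXXXXXXXX",
    "totalTables": 1
  }
}
```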
### `airtable_list_records`
Read records from an Airtable table
@@ -49,8 +100,13 @@ Read records from an Airtable table
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `records` | json | Array of retrieved Airtable records |
| `records` | array | Array of retrieved Airtable records |
| ↳ `id` | string | Record ID |
| ↳ `createdTime` | string | Record creation timestamp |
| ↳ `fields` | json | Record field values |
| `metadata` | json | Operation metadata including pagination offset and total records count |
| ↳ `offset` | string | Pagination offset for next page |
| ↳ `totalRecords` | number | Number of records returned |
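An illustrative `airtable_list_records` response with the output shape above (the record ID, timestamp, and field values are hypothetical examples, not from the source):

```json
{
  "records": [
    {
      "id": "recXXXXXXXXXXXXXX",
      "createdTime": "2026-03-01T12:00:00.000Z",
      "fields": {
        "Name": "Write release notes",
        "Done": false
      }
    }
  ],
  "metadata": {
    "offset": "itrXXXXXXXXXXXXXX",
    "totalRecords": 1
  }
}
```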
### `airtable_get_record`
@@ -68,8 +124,12 @@ Retrieve a single record from an Airtable table by its ID
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `record` | json | Retrieved Airtable record with id, createdTime, and fields |
| `metadata` | json | Operation metadata including record count |
| `record` | json | Retrieved Airtable record |
| ↳ `id` | string | Record ID |
| ↳ `createdTime` | string | Record creation timestamp |
| ↳ `fields` | json | Record field values |
| `metadata` | json | Operation metadata |
| ↳ `recordCount` | number | Number of records returned \(always 1\) |
### `airtable_create_records`
@@ -88,8 +148,12 @@ Write new records to an Airtable table
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `records` | json | Array of created Airtable records |
| `records` | array | Array of created Airtable records |
| ↳ `id` | string | Record ID |
| ↳ `createdTime` | string | Record creation timestamp |
| ↳ `fields` | json | Record field values |
| `metadata` | json | Operation metadata |
| ↳ `recordCount` | number | Number of records created |
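Airtable's create endpoint expects records wrapped in a `{ records: [{ fields: … }] }` envelope, ten records per request at most. A minimal sketch of building that payload (the helper name is illustrative):

```typescript
// Arbitrary field values keyed by field name, as in the `fields` output above.
interface AirtableFields {
  [field: string]: unknown
}

// Wraps plain row objects into Airtable's create-records request body,
// enforcing the API's 10-records-per-request limit.
function buildCreatePayload(rows: AirtableFields[]): { records: { fields: AirtableFields }[] } {
  if (rows.length === 0 || rows.length > 10) {
    throw new Error('Airtable accepts between 1 and 10 records per create request')
  }
  return { records: rows.map((fields) => ({ fields })) }
}
```

Larger batches should be chunked into successive calls of ten.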
### `airtable_update_record`
@@ -108,8 +172,13 @@ Update an existing record in an Airtable table by ID
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `record` | json | Updated Airtable record with id, createdTime, and fields |
| `metadata` | json | Operation metadata including record count and updated field names |
| `record` | json | Updated Airtable record |
| ↳ `id` | string | Record ID |
| ↳ `createdTime` | string | Record creation timestamp |
| ↳ `fields` | json | Record field values |
| `metadata` | json | Operation metadata |
| ↳ `recordCount` | number | Number of records updated \(always 1\) |
| ↳ `updatedFields` | array | List of field names that were updated |
### `airtable_update_multiple_records`
@@ -127,7 +196,12 @@ Update multiple existing records in an Airtable table
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `records` | json | Array of updated Airtable records |
| `metadata` | json | Operation metadata including record count and updated record IDs |
| `records` | array | Array of updated Airtable records |
| ↳ `id` | string | Record ID |
| ↳ `createdTime` | string | Record creation timestamp |
| ↳ `fields` | json | Record field values |
| `metadata` | json | Operation metadata |
| ↳ `recordCount` | number | Number of records updated |
| ↳ `updatedRecordIds` | array | List of updated record IDs |


@@ -0,0 +1,83 @@
---
title: Brandfetch
description: Look up brand assets, logos, colors, and company info
---
import { BlockInfoCard } from "@/components/ui/block-info-card"
<BlockInfoCard
type="brandfetch"
color="#000000"
/>
## Usage Instructions
Integrate Brandfetch into your workflow. Retrieve brand logos, colors, fonts, and company data by domain, ticker, or name search.
## Tools
### `brandfetch_get_brand`
Retrieve brand assets including logos, colors, fonts, and company info by domain, ticker, ISIN, or crypto symbol
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Brandfetch API key |
| `identifier` | string | Yes | Brand identifier: domain \(nike.com\), stock ticker \(NKE\), ISIN \(US6541061031\), or crypto symbol \(BTC\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `id` | string | Unique brand identifier |
| `name` | string | Brand name |
| `domain` | string | Brand domain |
| `claimed` | boolean | Whether the brand profile is claimed |
| `description` | string | Short brand description |
| `longDescription` | string | Detailed brand description |
| `links` | array | Social media and website links |
| ↳ `name` | string | Link name \(e.g., twitter, linkedin\) |
| ↳ `url` | string | Link URL |
| `logos` | array | Brand logos with formats and themes |
| ↳ `type` | string | Logo type \(logo, icon, symbol, other\) |
| ↳ `theme` | string | Logo theme \(light, dark\) |
| ↳ `formats` | array | Available formats with src URL, format, width, and height |
| `colors` | array | Brand colors with hex values and types |
| ↳ `hex` | string | Hex color code |
| ↳ `type` | string | Color type \(accent, dark, light, brand\) |
| ↳ `brightness` | number | Brightness value |
| `fonts` | array | Brand fonts with names and types |
| ↳ `name` | string | Font name |
| ↳ `type` | string | Font type \(title, body\) |
| ↳ `origin` | string | Font origin \(google, custom, system\) |
| `company` | json | Company firmographic data including employees, location, and industries |
| `qualityScore` | number | Data quality score from 0 to 1 |
| `isNsfw` | boolean | Whether the brand contains adult content |
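A common follow-up is picking one usable asset out of the `logos` array. The field names below mirror the output table; the selection strategy (prefer SVG, fall back to the first format) is just one reasonable choice:

```typescript
// One entry of `logos[].formats` from the output above.
interface LogoFormat { src: string; format: string; width?: number; height?: number }
// One entry of the `logos` array.
interface BrandLogo { type: string; theme: string; formats: LogoFormat[] }

// Returns the src URL of the first logo matching the requested type and theme,
// preferring an SVG format when one is available.
function pickLogo(logos: BrandLogo[], type: string, theme: string): string | undefined {
  const match = logos.find((l) => l.type === type && l.theme === theme)
  const fmt = match?.formats.find((f) => f.format === 'svg') ?? match?.formats[0]
  return fmt?.src
}
```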
### `brandfetch_search`
Search for brands by name and find their domains and logos
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Brandfetch API key |
| `name` | string | Yes | Company or brand name to search for |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `results` | array | List of matching brands |
| ↳ `brandId` | string | Unique brand identifier |
| ↳ `name` | string | Brand name |
| ↳ `domain` | string | Brand domain |
| ↳ `claimed` | boolean | Whether the brand profile is claimed |
| ↳ `icon` | string | Brand icon URL |
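For reference, the search presumably maps onto Brandfetch's public name-search endpoint, which takes the query as a path segment; a minimal URL builder under that assumption:

```typescript
// Builds the Brandfetch brand-search URL (GET /v2/search/{name}),
// URL-encoding the query so names with spaces or symbols stay valid.
function buildSearchUrl(name: string): string {
  return `https://api.brandfetch.io/v2/search/${encodeURIComponent(name.trim())}`
}
```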


@@ -0,0 +1,318 @@
---
title: Dub
description: Link management with Dub
---
import { BlockInfoCard } from "@/components/ui/block-info-card"
<BlockInfoCard
type="dub"
color="#181C1E"
/>
{/* MANUAL-CONTENT-START:intro */}
[Dub](https://dub.co/) is an open-source link management platform for modern marketing teams. It provides powerful short link creation, analytics, and tracking capabilities with enterprise-grade infrastructure.
With the Dub integration in Sim, you can:
- **Create short links**: Generate branded short links with custom domains, slugs, and UTM parameters
- **Upsert links**: Create or update links idempotently by destination URL
- **Retrieve link info**: Look up link details by ID, external ID, or domain + key combination
- **Update links**: Modify destination URLs, metadata, UTM parameters, and link settings
- **Delete links**: Remove short links by ID or external ID
- **List links**: Search and filter links with pagination, sorting, and tag filtering
- **Get analytics**: Retrieve click, lead, and sales analytics with grouping by time, geography, device, browser, referer, and more
In Sim, the Dub integration enables your agents to manage short links and track their performance programmatically. Use it to create trackable links as part of marketing workflows, monitor link engagement, and make data-driven decisions based on click analytics and conversion metrics.
{/* MANUAL-CONTENT-END */}
## Usage Instructions
Create, manage, and track short links with Dub. Supports custom domains, UTM parameters, link analytics, and more.
## Tools
### `dub_create_link`
Create a new short link with Dub. Supports custom domains, slugs, UTM parameters, and more.
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Dub API key |
| `url` | string | Yes | The destination URL of the short link |
| `domain` | string | No | Custom domain for the short link \(defaults to dub.sh\) |
| `key` | string | No | Custom slug for the short link \(randomly generated if not provided\) |
| `externalId` | string | No | External ID for the link in your database |
| `tagIds` | string | No | Comma-separated tag IDs to assign to the link |
| `comments` | string | No | Comments for the short link |
| `expiresAt` | string | No | Expiration date in ISO 8601 format |
| `password` | string | No | Password to protect the short link |
| `rewrite` | boolean | No | Whether to enable link cloaking |
| `archived` | boolean | No | Whether to archive the link |
| `title` | string | No | Custom OG title for the link preview |
| `description` | string | No | Custom OG description for the link preview |
| `utm_source` | string | No | UTM source parameter |
| `utm_medium` | string | No | UTM medium parameter |
| `utm_campaign` | string | No | UTM campaign parameter |
| `utm_term` | string | No | UTM term parameter |
| `utm_content` | string | No | UTM content parameter |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `id` | string | Unique ID of the created link |
| `domain` | string | Domain of the short link |
| `key` | string | Slug of the short link |
| `url` | string | Destination URL |
| `shortLink` | string | Full short link URL |
| `qrCode` | string | QR code URL for the short link |
| `archived` | boolean | Whether the link is archived |
| `externalId` | string | External ID |
| `title` | string | OG title |
| `description` | string | OG description |
| `tags` | json | Tags assigned to the link \(id, name, color\) |
| `clicks` | number | Number of clicks |
| `leads` | number | Number of leads |
| `sales` | number | Number of sales |
| `saleAmount` | number | Total sale amount in cents |
| `lastClicked` | string | Last clicked timestamp |
| `createdAt` | string | Creation timestamp |
| `updatedAt` | string | Last update timestamp |
| `utm_source` | string | UTM source parameter |
| `utm_medium` | string | UTM medium parameter |
| `utm_campaign` | string | UTM campaign parameter |
| `utm_term` | string | UTM term parameter |
| `utm_content` | string | UTM content parameter |
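Note that `tagIds` is a comma-separated string here, while Dub's REST API takes an array. A sketch of mapping the tool's flat inputs onto the JSON body of Dub's `POST /links` call (only a few fields shown; the rest map one-to-one):

```typescript
// Subset of the tool inputs above; the remaining optional fields
// (comments, expiresAt, password, utm_*, ...) follow the same pattern.
interface CreateLinkInput {
  url: string
  domain?: string
  key?: string
  tagIds?: string
}

// Converts flat tool inputs to a Dub create-link request body,
// splitting the comma-separated tagIds into the array the API expects.
function toDubBody(input: CreateLinkInput): Record<string, unknown> {
  const body: Record<string, unknown> = { url: input.url }
  if (input.domain) body.domain = input.domain
  if (input.key) body.key = input.key
  if (input.tagIds) body.tagIds = input.tagIds.split(',').map((t) => t.trim())
  return body
}
```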
### `dub_upsert_link`
Create or update a short link by its URL. If a link with the same URL already exists, update it. Otherwise, create a new link.
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Dub API key |
| `url` | string | Yes | The destination URL of the short link |
| `domain` | string | No | Custom domain for the short link \(defaults to dub.sh\) |
| `key` | string | No | Custom slug for the short link \(randomly generated if not provided\) |
| `externalId` | string | No | External ID for the link in your database |
| `tagIds` | string | No | Comma-separated tag IDs to assign to the link |
| `comments` | string | No | Comments for the short link |
| `expiresAt` | string | No | Expiration date in ISO 8601 format |
| `password` | string | No | Password to protect the short link |
| `rewrite` | boolean | No | Whether to enable link cloaking |
| `archived` | boolean | No | Whether to archive the link |
| `title` | string | No | Custom OG title for the link preview |
| `description` | string | No | Custom OG description for the link preview |
| `utm_source` | string | No | UTM source parameter |
| `utm_medium` | string | No | UTM medium parameter |
| `utm_campaign` | string | No | UTM campaign parameter |
| `utm_term` | string | No | UTM term parameter |
| `utm_content` | string | No | UTM content parameter |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `id` | string | Unique ID of the link |
| `domain` | string | Domain of the short link |
| `key` | string | Slug of the short link |
| `url` | string | Destination URL |
| `shortLink` | string | Full short link URL |
| `qrCode` | string | QR code URL for the short link |
| `archived` | boolean | Whether the link is archived |
| `externalId` | string | External ID |
| `title` | string | OG title |
| `description` | string | OG description |
| `tags` | json | Tags assigned to the link \(id, name, color\) |
| `clicks` | number | Number of clicks |
| `leads` | number | Number of leads |
| `sales` | number | Number of sales |
| `saleAmount` | number | Total sale amount in cents |
| `lastClicked` | string | Last clicked timestamp |
| `createdAt` | string | Creation timestamp |
| `updatedAt` | string | Last update timestamp |
| `utm_source` | string | UTM source parameter |
| `utm_medium` | string | UTM medium parameter |
| `utm_campaign` | string | UTM campaign parameter |
| `utm_term` | string | UTM term parameter |
| `utm_content` | string | UTM content parameter |
### `dub_get_link`
Retrieve information about a short link by its link ID, external ID, or domain + key combination.
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Dub API key |
| `linkId` | string | No | The unique ID of the short link |
| `externalId` | string | No | The external ID of the link in your database |
| `domain` | string | No | The domain of the link \(use with key\) |
| `key` | string | No | The slug of the link \(use with domain\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `id` | string | Unique ID of the link |
| `domain` | string | Domain of the short link |
| `key` | string | Slug of the short link |
| `url` | string | Destination URL |
| `shortLink` | string | Full short link URL |
| `qrCode` | string | QR code URL for the short link |
| `archived` | boolean | Whether the link is archived |
| `externalId` | string | External ID |
| `title` | string | OG title |
| `description` | string | OG description |
| `tags` | json | Tags assigned to the link \(id, name, color\) |
| `clicks` | number | Number of clicks |
| `leads` | number | Number of leads |
| `sales` | number | Number of sales |
| `saleAmount` | number | Total sale amount in cents |
| `lastClicked` | string | Last clicked timestamp |
| `createdAt` | string | Creation timestamp |
| `updatedAt` | string | Last update timestamp |
| `utm_source` | string | UTM source parameter |
| `utm_medium` | string | UTM medium parameter |
| `utm_campaign` | string | UTM campaign parameter |
| `utm_term` | string | UTM term parameter |
| `utm_content` | string | UTM content parameter |
### `dub_update_link`
Update an existing short link. You can modify the destination URL, slug, metadata, UTM parameters, and more.
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Dub API key |
| `linkId` | string | Yes | The link ID or external ID prefixed with ext_ |
| `url` | string | No | New destination URL |
| `domain` | string | No | New custom domain |
| `key` | string | No | New custom slug |
| `title` | string | No | Custom OG title |
| `description` | string | No | Custom OG description |
| `externalId` | string | No | External ID for the link |
| `tagIds` | string | No | Comma-separated tag IDs |
| `comments` | string | No | Comments for the short link |
| `expiresAt` | string | No | Expiration date in ISO 8601 format |
| `password` | string | No | Password to protect the link |
| `rewrite` | boolean | No | Whether to enable link cloaking |
| `archived` | boolean | No | Whether to archive the link |
| `utm_source` | string | No | UTM source parameter |
| `utm_medium` | string | No | UTM medium parameter |
| `utm_campaign` | string | No | UTM campaign parameter |
| `utm_term` | string | No | UTM term parameter |
| `utm_content` | string | No | UTM content parameter |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `id` | string | Unique ID of the updated link |
| `domain` | string | Domain of the short link |
| `key` | string | Slug of the short link |
| `url` | string | Destination URL |
| `shortLink` | string | Full short link URL |
| `qrCode` | string | QR code URL for the short link |
| `archived` | boolean | Whether the link is archived |
| `externalId` | string | External ID |
| `title` | string | OG title |
| `description` | string | OG description |
| `tags` | json | Tags assigned to the link \(id, name, color\) |
| `clicks` | number | Number of clicks |
| `leads` | number | Number of leads |
| `sales` | number | Number of sales |
| `saleAmount` | number | Total sale amount in cents |
| `lastClicked` | string | Last clicked timestamp |
| `createdAt` | string | Creation timestamp |
| `updatedAt` | string | Last update timestamp |
| `utm_source` | string | UTM source parameter |
| `utm_medium` | string | UTM medium parameter |
| `utm_campaign` | string | UTM campaign parameter |
| `utm_term` | string | UTM term parameter |
| `utm_content` | string | UTM content parameter |
### `dub_delete_link`
Delete a short link by its link ID or external ID (prefixed with ext_).
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Dub API key |
| `linkId` | string | Yes | The link ID or external ID prefixed with ext_ |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `id` | string | ID of the deleted link |
### `dub_list_links`
Retrieve a paginated list of short links for the authenticated workspace. Supports filtering by domain, search query, tags, and sorting.
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Dub API key |
| `domain` | string | No | Filter by domain |
| `search` | string | No | Search query matched against the short link slug and destination URL |
| `tagIds` | string | No | Comma-separated tag IDs to filter by |
| `showArchived` | boolean | No | Whether to include archived links \(defaults to false\) |
| `sortBy` | string | No | Sort by field: createdAt, clicks, saleAmount, or lastClicked |
| `sortOrder` | string | No | Sort order: asc or desc |
| `page` | number | No | Page number \(default: 1\) |
| `pageSize` | number | No | Number of links per page \(default: 100, max: 100\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `links` | json | Array of link objects \(id, domain, key, url, shortLink, clicks, tags, createdAt\) |
| `count` | number | Number of links returned |
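Because pages are capped at 100 links, collecting a full workspace means paging until a short page comes back. A hypothetical pager, where `fetchPage` stands in for whatever executes this tool for a given page:

```typescript
// Shape of one dub_list_links result; only `id` is needed for the sketch.
interface LinkPage { links: { id: string }[]; count: number }

// Walks pages starting at 1 and stops when a page returns fewer
// links than pageSize, which signals the last page.
async function collectAllLinks(
  fetchPage: (page: number, pageSize: number) => Promise<LinkPage>,
  pageSize = 100
): Promise<{ id: string }[]> {
  const all: { id: string }[] = []
  for (let page = 1; ; page++) {
    const { links } = await fetchPage(page, pageSize)
    all.push(...links)
    if (links.length < pageSize) break
  }
  return all
}
```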
### `dub_get_analytics`
Retrieve analytics for links including clicks, leads, and sales. Supports filtering by link, time range, and grouping by various dimensions.
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Dub API key |
| `event` | string | No | Event type: clicks \(default\), leads, sales, or composite |
| `groupBy` | string | No | Group results by: count \(default\), timeseries, countries, cities, devices, browsers, os, referers, top_links, top_urls |
| `linkId` | string | No | Filter by link ID |
| `externalId` | string | No | Filter by external ID \(prefix with ext_\) |
| `domain` | string | No | Filter by domain |
| `interval` | string | No | Time interval: 24h \(default\), 7d, 30d, 90d, 1y, mtd, qtd, ytd, or all |
| `start` | string | No | Start date/time in ISO 8601 format \(overrides interval\) |
| `end` | string | No | End date/time in ISO 8601 format \(defaults to now\) |
| `country` | string | No | Filter by country \(ISO 3166-1 alpha-2 code\) |
| `timezone` | string | No | IANA timezone for timeseries data \(defaults to UTC\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `clicks` | number | Total number of clicks |
| `leads` | number | Total number of leads |
| `sales` | number | Total number of sales |
| `saleAmount` | number | Total sale amount in cents |
| `data` | json | Grouped analytics data \(timeseries, countries, devices, etc.\) |
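The analytics call is a GET with all of the above as query parameters; a small helper for serializing only the parameters that were actually set (field names mirror the input table):

```typescript
// Serializes set analytics options into a query string for Dub's
// GET /analytics endpoint; undefined options are skipped.
function buildAnalyticsQuery(opts: {
  event?: string
  groupBy?: string
  interval?: string
  linkId?: string
}): string {
  const params = new URLSearchParams()
  for (const [k, v] of Object.entries(opts)) {
    if (v) params.set(k, v)
  }
  return params.toString()
}
```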


@@ -0,0 +1,156 @@
---
title: Google Meet
description: Create and manage Google Meet meetings
---
import { BlockInfoCard } from "@/components/ui/block-info-card"
<BlockInfoCard
type="google_meet"
color="#E0E0E0"
/>
{/* MANUAL-CONTENT-START:intro */}
[Google Meet](https://meet.google.com) is Google's video conferencing and online meeting platform, providing secure, high-quality video calls for individuals and teams. As a core component of Google Workspace, Google Meet enables real-time collaboration through video meetings, screen sharing, and integrated chat.
The Google Meet REST API (v2) allows programmatic management of meeting spaces and conference records, enabling automated workflows to create meetings, track participation, and manage active conferences without manual intervention.
Key features of the Google Meet API include:
- **Meeting Space Management**: Create, retrieve, and configure meeting spaces with customizable access controls.
- **Conference Records**: Access historical conference data including start/end times and associated spaces.
- **Participant Tracking**: View participant details for any conference including join/leave times and user types.
- **Access Controls**: Configure who can join meetings (open, trusted, or restricted) and which entry points are allowed.
- **Active Conference Management**: Programmatically end active conferences in meeting spaces.
In Sim, the Google Meet integration allows your agents to create meeting spaces on demand, monitor conference activity, track participation across meetings, and manage active conferences as part of automated workflows. This enables scenarios such as automatically provisioning meeting rooms for scheduled events, generating attendance reports, ending stale conferences, and building meeting analytics dashboards.
{/* MANUAL-CONTENT-END */}
## Usage Instructions
Integrate Google Meet into your workflow. Create meeting spaces, get space details, end conferences, list conference records, and view participants.
## Tools
### `google_meet_create_space`
Create a new Google Meet meeting space
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `accessType` | string | No | Who can join the meeting without knocking: OPEN \(anyone with link\), TRUSTED \(org members\), RESTRICTED \(only invited\) |
| `entryPointAccess` | string | No | Entry points allowed: ALL \(all entry points\) or CREATOR_APP_ONLY \(only via app\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `name` | string | Resource name of the space \(e.g., spaces/abc123\) |
| `meetingUri` | string | Meeting URL \(e.g., https://meet.google.com/abc-defg-hij\) |
| `meetingCode` | string | Meeting code \(e.g., abc-defg-hij\) |
| `accessType` | string | Access type configuration |
| `entryPointAccess` | string | Entry point access configuration |
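Under the hood this presumably maps to the Meet REST API's `spaces.create` call (`POST https://meet.googleapis.com/v2/spaces`), where both options live under a `config` object; a sketch of building that body:

```typescript
// Builds the spaces.create request body following the Meet REST API v2 shape:
// optional settings nest under `config`, and an empty body means API defaults.
function buildSpaceBody(accessType?: string, entryPointAccess?: string): Record<string, unknown> {
  const config: Record<string, string> = {}
  if (accessType) config.accessType = accessType
  if (entryPointAccess) config.entryPointAccess = entryPointAccess
  return Object.keys(config).length > 0 ? { config } : {}
}
```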
### `google_meet_get_space`
Get details of a Google Meet meeting space by name or meeting code
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `spaceName` | string | Yes | Space resource name \(spaces/abc123\) or meeting code \(abc-defg-hij\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `name` | string | Resource name of the space |
| `meetingUri` | string | Meeting URL |
| `meetingCode` | string | Meeting code |
| `accessType` | string | Access type configuration |
| `entryPointAccess` | string | Entry point access configuration |
| `activeConference` | string | Active conference record name |
### `google_meet_end_conference`
End the active conference in a Google Meet space
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `spaceName` | string | Yes | Space resource name \(e.g., spaces/abc123\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `ended` | boolean | Whether the conference was ended successfully |
### `google_meet_list_conference_records`
List conference records for meetings you organized
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `filter` | string | No | Filter by space name \(e.g., space.name = "spaces/abc123"\) or time range \(e.g., start_time &gt; "2024-01-01T00:00:00Z"\) |
| `pageSize` | number | No | Maximum number of conference records to return \(max 100\) |
| `pageToken` | string | No | Page token from a previous list request |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `conferenceRecords` | json | List of conference records with name, start/end times, and space |
| `nextPageToken` | string | Token for next page of results |
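The `filter` strings shown in the input table can be composed programmatically; in the Meet API's filter grammar, multiple clauses are joined with `AND`. A small illustrative helper:

```typescript
// Composes a conference-record filter string from optional pieces,
// e.g. `space.name = "spaces/abc123" AND start_time > "2024-01-01T00:00:00Z"`.
function buildConferenceFilter(spaceName?: string, startAfter?: string): string {
  const clauses: string[] = []
  if (spaceName) clauses.push(`space.name = "${spaceName}"`)
  if (startAfter) clauses.push(`start_time > "${startAfter}"`)
  return clauses.join(' AND ')
}
```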
### `google_meet_get_conference_record`
Get details of a specific conference record
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `conferenceName` | string | Yes | Conference record resource name \(e.g., conferenceRecords/abc123\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `name` | string | Conference record resource name |
| `startTime` | string | Conference start time |
| `endTime` | string | Conference end time |
| `expireTime` | string | Conference record expiration time |
| `space` | string | Associated space resource name |
### `google_meet_list_participants`
List participants of a conference record
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `conferenceName` | string | Yes | Conference record resource name \(e.g., conferenceRecords/abc123\) |
| `filter` | string | No | Filter participants \(e.g., earliest_start_time &gt; "2024-01-01T00:00:00Z"\) |
| `pageSize` | number | No | Maximum number of participants to return \(default 100, max 250\) |
| `pageToken` | string | No | Page token from a previous list request |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `participants` | json | List of participants with name, times, display name, and user type |
| `nextPageToken` | string | Token for next page of results |
| `totalSize` | number | Total number of participants |
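Attendance reports typically derive each participant's time in the meeting from their session timestamps. The participant shape below is a made-up subset based on the output above:

```typescript
// Minimal participant shape: earliest join and latest leave as ISO timestamps.
interface Participant { displayName?: string; earliestStartTime: string; latestEndTime: string }

// Rounds the span between earliest join and latest leave to whole minutes.
function minutesInConference(p: Participant): number {
  const ms = Date.parse(p.latestEndTime) - Date.parse(p.earliestStartTime)
  return Math.round(ms / 60000)
}
```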


@@ -13,6 +13,7 @@
"asana",
"ashby",
"attio",
"brandfetch",
"browser_use",
"calcom",
"calendly",
@@ -28,6 +29,7 @@
"discord",
"dropbox",
"dspy",
"dub",
"duckduckgo",
"dynamodb",
"elasticsearch",
@@ -51,6 +53,7 @@
"google_forms",
"google_groups",
"google_maps",
"google_meet",
"google_pagespeed",
"google_search",
"google_sheets",


@@ -1,534 +0,0 @@
'use client'
import { useRef, useState } from 'react'
import { createLogger } from '@sim/logger'
import { X } from 'lucide-react'
import { Textarea } from '@/components/emcn'
import { Input } from '@/components/ui/input'
import { Label } from '@/components/ui/label'
import {
Select,
SelectContent,
SelectItem,
SelectTrigger,
SelectValue,
} from '@/components/ui/select'
import { isHosted } from '@/lib/core/config/feature-flags'
import { cn } from '@/lib/core/utils/cn'
import { quickValidateEmail } from '@/lib/messaging/email/validation'
import { soehne } from '@/app/_styles/fonts/soehne/soehne'
import { BrandedButton } from '@/app/(auth)/components/branded-button'
import Footer from '@/app/(landing)/components/footer/footer'
import Nav from '@/app/(landing)/components/nav/nav'
const logger = createLogger('CareersPage')
const validateName = (name: string): string[] => {
const errors: string[] = []
if (!name || name.trim().length < 2) {
errors.push('Name must be at least 2 characters')
}
return errors
}
const validateEmail = (email: string): string[] => {
const errors: string[] = []
if (!email || !email.trim()) {
errors.push('Email is required')
return errors
}
const validation = quickValidateEmail(email.trim().toLowerCase())
if (!validation.isValid) {
errors.push(validation.reason || 'Please enter a valid email address')
}
return errors
}
const validatePosition = (position: string): string[] => {
const errors: string[] = []
if (!position || position.trim().length < 2) {
errors.push('Please specify the position you are interested in')
}
return errors
}
const validateLinkedIn = (url: string): string[] => {
if (!url || url.trim() === '') return []
const errors: string[] = []
try {
new URL(url)
} catch {
errors.push('Please enter a valid LinkedIn URL')
}
return errors
}
const validatePortfolio = (url: string): string[] => {
if (!url || url.trim() === '') return []
const errors: string[] = []
try {
new URL(url)
} catch {
errors.push('Please enter a valid portfolio URL')
}
return errors
}
const validateLocation = (location: string): string[] => {
const errors: string[] = []
if (!location || location.trim().length < 2) {
errors.push('Please enter your location')
}
return errors
}
const validateMessage = (message: string): string[] => {
const errors: string[] = []
if (!message || message.trim().length < 50) {
errors.push('Please tell us more about yourself (at least 50 characters)')
}
return errors
}
export default function CareersPage() {
const [isSubmitting, setIsSubmitting] = useState(false)
const [submitStatus, setSubmitStatus] = useState<'idle' | 'success' | 'error'>('idle')
const [showErrors, setShowErrors] = useState(false)
// Form fields
const [name, setName] = useState('')
const [email, setEmail] = useState('')
const [phone, setPhone] = useState('')
const [position, setPosition] = useState('')
const [linkedin, setLinkedin] = useState('')
const [portfolio, setPortfolio] = useState('')
const [experience, setExperience] = useState('')
const [location, setLocation] = useState('')
const [message, setMessage] = useState('')
const [resume, setResume] = useState<File | null>(null)
const fileInputRef = useRef<HTMLInputElement>(null)
// Field errors
const [nameErrors, setNameErrors] = useState<string[]>([])
const [emailErrors, setEmailErrors] = useState<string[]>([])
const [positionErrors, setPositionErrors] = useState<string[]>([])
const [linkedinErrors, setLinkedinErrors] = useState<string[]>([])
const [portfolioErrors, setPortfolioErrors] = useState<string[]>([])
const [experienceErrors, setExperienceErrors] = useState<string[]>([])
const [locationErrors, setLocationErrors] = useState<string[]>([])
const [messageErrors, setMessageErrors] = useState<string[]>([])
const [resumeErrors, setResumeErrors] = useState<string[]>([])
const handleFileChange = (e: React.ChangeEvent<HTMLInputElement>) => {
const file = e.target.files?.[0] || null
setResume(file)
if (file) {
setResumeErrors([])
}
}
async function onSubmit(e: React.FormEvent<HTMLFormElement>) {
e.preventDefault()
setShowErrors(true)
// Validate all fields
const nameErrs = validateName(name)
const emailErrs = validateEmail(email)
const positionErrs = validatePosition(position)
const linkedinErrs = validateLinkedIn(linkedin)
const portfolioErrs = validatePortfolio(portfolio)
const experienceErrs = experience ? [] : ['Please select your years of experience']
const locationErrs = validateLocation(location)
const messageErrs = validateMessage(message)
const resumeErrs = resume ? [] : ['Resume is required']
setNameErrors(nameErrs)
setEmailErrors(emailErrs)
setPositionErrors(positionErrs)
setLinkedinErrors(linkedinErrs)
setPortfolioErrors(portfolioErrs)
setExperienceErrors(experienceErrs)
setLocationErrors(locationErrs)
setMessageErrors(messageErrs)
setResumeErrors(resumeErrs)
if (
nameErrs.length > 0 ||
emailErrs.length > 0 ||
positionErrs.length > 0 ||
linkedinErrs.length > 0 ||
portfolioErrs.length > 0 ||
experienceErrs.length > 0 ||
locationErrs.length > 0 ||
messageErrs.length > 0 ||
resumeErrs.length > 0
) {
return
}
setIsSubmitting(true)
setSubmitStatus('idle')
try {
const formData = new FormData()
formData.append('name', name)
formData.append('email', email)
formData.append('phone', phone || '')
formData.append('position', position)
formData.append('linkedin', linkedin || '')
formData.append('portfolio', portfolio || '')
formData.append('experience', experience)
formData.append('location', location)
formData.append('message', message)
if (resume) formData.append('resume', resume)
const response = await fetch('/api/careers/submit', {
method: 'POST',
body: formData,
})
if (!response.ok) {
throw new Error('Failed to submit application')
}
setSubmitStatus('success')
} catch (error) {
logger.error('Error submitting application:', error)
setSubmitStatus('error')
} finally {
setIsSubmitting(false)
}
}
return (
<main className={`${soehne.className} min-h-screen bg-white text-gray-900`}>
<Nav variant='landing' />
{/* Content */}
<div className='px-4 pt-[60px] pb-[80px] sm:px-8 md:px-[44px]'>
<h1 className='mb-10 text-center font-bold text-4xl text-gray-900 md:text-5xl'>
Join Our Team
</h1>
<div className='mx-auto max-w-4xl'>
{/* Form Section */}
<section className='rounded-2xl border border-gray-200 bg-white p-6 shadow-sm sm:p-10'>
<form onSubmit={onSubmit} className='space-y-5'>
{/* Name and Email */}
<div className='grid grid-cols-1 gap-4 sm:gap-6 md:grid-cols-2'>
<div className='space-y-2'>
<Label htmlFor='name' className='font-medium text-sm'>
Full Name *
</Label>
<Input
id='name'
placeholder='John Doe'
value={name}
onChange={(e) => setName(e.target.value)}
className={cn(
showErrors &&
nameErrors.length > 0 &&
'border-red-500 focus:border-red-500 focus:ring-red-100 focus-visible:ring-red-500'
)}
/>
{showErrors && nameErrors.length > 0 && (
<div className='mt-1 space-y-1 text-red-400 text-xs'>
{nameErrors.map((error, index) => (
<p key={index}>{error}</p>
))}
</div>
)}
</div>
<div className='space-y-2'>
<Label htmlFor='email' className='font-medium text-sm'>
Email *
</Label>
<Input
id='email'
type='email'
placeholder='john@example.com'
value={email}
onChange={(e) => setEmail(e.target.value)}
className={cn(
showErrors &&
emailErrors.length > 0 &&
'border-red-500 focus:border-red-500 focus:ring-red-100 focus-visible:ring-red-500'
)}
/>
{showErrors && emailErrors.length > 0 && (
<div className='mt-1 space-y-1 text-red-400 text-xs'>
{emailErrors.map((error, index) => (
<p key={index}>{error}</p>
))}
</div>
)}
</div>
</div>
{/* Phone and Position */}
<div className='grid grid-cols-1 gap-4 sm:gap-6 md:grid-cols-2'>
<div className='space-y-2'>
<Label htmlFor='phone' className='font-medium text-sm'>
Phone Number
</Label>
<Input
id='phone'
type='tel'
placeholder='+1 (555) 123-4567'
value={phone}
onChange={(e) => setPhone(e.target.value)}
/>
</div>
<div className='space-y-2'>
<Label htmlFor='position' className='font-medium text-sm'>
Position of Interest *
</Label>
<Input
id='position'
placeholder='e.g. Full Stack Engineer, Product Designer'
value={position}
onChange={(e) => setPosition(e.target.value)}
className={cn(
showErrors &&
positionErrors.length > 0 &&
'border-red-500 focus:border-red-500 focus:ring-red-100 focus-visible:ring-red-500'
)}
/>
{showErrors && positionErrors.length > 0 && (
<div className='mt-1 space-y-1 text-red-400 text-xs'>
{positionErrors.map((error, index) => (
<p key={index}>{error}</p>
))}
</div>
)}
</div>
</div>
{/* LinkedIn and Portfolio */}
<div className='grid grid-cols-1 gap-4 sm:gap-6 md:grid-cols-2'>
<div className='space-y-2'>
<Label htmlFor='linkedin' className='font-medium text-sm'>
LinkedIn Profile
</Label>
<Input
id='linkedin'
placeholder='https://linkedin.com/in/yourprofile'
value={linkedin}
onChange={(e) => setLinkedin(e.target.value)}
className={cn(
showErrors &&
linkedinErrors.length > 0 &&
'border-red-500 focus:border-red-500 focus:ring-red-100 focus-visible:ring-red-500'
)}
/>
{showErrors && linkedinErrors.length > 0 && (
<div className='mt-1 space-y-1 text-red-400 text-xs'>
{linkedinErrors.map((error, index) => (
<p key={index}>{error}</p>
))}
</div>
)}
</div>
<div className='space-y-2'>
<Label htmlFor='portfolio' className='font-medium text-sm'>
Portfolio / Website
</Label>
<Input
id='portfolio'
placeholder='https://yourportfolio.com'
value={portfolio}
onChange={(e) => setPortfolio(e.target.value)}
className={cn(
showErrors &&
portfolioErrors.length > 0 &&
'border-red-500 focus:border-red-500 focus:ring-red-100 focus-visible:ring-red-500'
)}
/>
{showErrors && portfolioErrors.length > 0 && (
<div className='mt-1 space-y-1 text-red-400 text-xs'>
{portfolioErrors.map((error, index) => (
<p key={index}>{error}</p>
))}
</div>
)}
</div>
</div>
{/* Experience and Location */}
<div className='grid grid-cols-1 gap-4 sm:gap-6 md:grid-cols-2'>
<div className='space-y-2'>
<Label htmlFor='experience' className='font-medium text-sm'>
Years of Experience *
</Label>
<Select value={experience} onValueChange={setExperience}>
<SelectTrigger
className={cn(
showErrors &&
experienceErrors.length > 0 &&
'border-red-500 focus:border-red-500 focus:ring-red-100 focus-visible:ring-red-500'
)}
>
<SelectValue placeholder='Select experience level' />
</SelectTrigger>
<SelectContent>
<SelectItem value='0-1'>0-1 years</SelectItem>
<SelectItem value='1-3'>1-3 years</SelectItem>
<SelectItem value='3-5'>3-5 years</SelectItem>
<SelectItem value='5-10'>5-10 years</SelectItem>
<SelectItem value='10+'>10+ years</SelectItem>
</SelectContent>
</Select>
{showErrors && experienceErrors.length > 0 && (
<div className='mt-1 space-y-1 text-red-400 text-xs'>
{experienceErrors.map((error, index) => (
<p key={index}>{error}</p>
))}
</div>
)}
</div>
<div className='space-y-2'>
<Label htmlFor='location' className='font-medium text-sm'>
Location *
</Label>
<Input
id='location'
placeholder='e.g. San Francisco, CA'
value={location}
onChange={(e) => setLocation(e.target.value)}
className={cn(
showErrors &&
locationErrors.length > 0 &&
'border-red-500 focus:border-red-500 focus:ring-red-100 focus-visible:ring-red-500'
)}
/>
{showErrors && locationErrors.length > 0 && (
<div className='mt-1 space-y-1 text-red-400 text-xs'>
{locationErrors.map((error, index) => (
<p key={index}>{error}</p>
))}
</div>
)}
</div>
</div>
{/* Message */}
<div className='space-y-2'>
<Label htmlFor='message' className='font-medium text-sm'>
Tell us about yourself *
</Label>
<Textarea
id='message'
placeholder='Tell us about your experience, what excites you about Sim, and why you would be a great fit for this role...'
className={cn(
'min-h-[140px]',
showErrors &&
messageErrors.length > 0 &&
'border-red-500 focus:border-red-500 focus:ring-red-100 focus-visible:ring-red-500'
)}
value={message}
onChange={(e) => setMessage(e.target.value)}
/>
<p className='mt-1.5 text-gray-500 text-xs'>Minimum 50 characters</p>
{showErrors && messageErrors.length > 0 && (
<div className='mt-1 space-y-1 text-red-400 text-xs'>
{messageErrors.map((error, index) => (
<p key={index}>{error}</p>
))}
</div>
)}
</div>
{/* Resume Upload */}
<div className='space-y-2'>
<Label htmlFor='resume' className='font-medium text-sm'>
Resume *
</Label>
<div className='relative'>
{resume ? (
<div className='flex items-center gap-2 rounded-md border border-input bg-background px-3 py-2'>
<span className='flex-1 truncate text-sm'>{resume.name}</span>
<button
type='button'
onClick={(e) => {
e.preventDefault()
setResume(null)
if (fileInputRef.current) {
fileInputRef.current.value = ''
}
}}
className='flex-shrink-0 text-muted-foreground transition-colors hover:text-foreground'
aria-label='Remove file'
>
<X className='h-4 w-4' />
</button>
</div>
) : (
<Input
id='resume'
type='file'
accept='.pdf,.doc,.docx'
onChange={handleFileChange}
ref={fileInputRef}
className={cn(
showErrors &&
resumeErrors.length > 0 &&
'border-red-500 focus:border-red-500 focus:ring-red-100 focus-visible:ring-red-500'
)}
/>
)}
</div>
<p className='mt-1.5 text-gray-500 text-xs'>PDF or Word document, max 10MB</p>
{showErrors && resumeErrors.length > 0 && (
<div className='mt-1 space-y-1 text-red-400 text-xs'>
{resumeErrors.map((error, index) => (
<p key={index}>{error}</p>
))}
</div>
)}
</div>
{/* Submit Button */}
<div className='flex justify-end pt-2'>
<BrandedButton
type='submit'
disabled={isSubmitting || submitStatus === 'success'}
loading={isSubmitting}
loadingText='Submitting'
showArrow={false}
fullWidth={false}
className='min-w-[200px]'
>
{submitStatus === 'success' ? 'Submitted' : 'Submit Application'}
</BrandedButton>
</div>
</form>
</section>
{/* Additional Info */}
<section className='mt-6 text-center text-gray-600 text-sm'>
<p>
Questions? Email us at{' '}
<a
href='mailto:careers@sim.ai'
className='font-medium text-gray-900 underline transition-colors hover:text-gray-700'
>
careers@sim.ai
</a>
</p>
</section>
</div>
</div>
{/* Footer - Only for hosted instances */}
{isHosted && (
<div className='relative z-20'>
<Footer fullWidth={true} />
</div>
)}
</main>
)
}

View File

@@ -77,12 +77,14 @@ export default function Footer({ fullWidth = false }: FooterProps) {
>
Status
</Link>
-<Link
-href='/careers'
+<a
+href='https://jobs.ashbyhq.com/sim'
+target='_blank'
+rel='noopener noreferrer'
className='text-[14px] text-muted-foreground transition-colors hover:text-foreground'
>
Careers
-</Link>
+</a>
<Link
href='/privacy'
target='_blank'

View File

@@ -91,12 +91,14 @@ export default function Nav({ hideAuthButtons = false, variant = 'landing' }: Na
</button>
</li>
<li>
-<Link
-href='/careers'
+<a
+href='https://jobs.ashbyhq.com/sim'
+target='_blank'
+rel='noopener noreferrer'
className='text-[16px] text-muted-foreground transition-colors hover:text-foreground'
>
Careers
-</Link>
+</a>
</li>
<li>
<a

View File

@@ -18,7 +18,6 @@ export function ThemeProvider({ children, ...props }: ThemeProviderProps) {
pathname.startsWith('/privacy') ||
pathname.startsWith('/invite') ||
pathname.startsWith('/verify') ||
-pathname.startsWith('/careers') ||
pathname.startsWith('/changelog') ||
pathname.startsWith('/chat') ||
pathname.startsWith('/studio') ||

View File

@@ -833,15 +833,7 @@ input[type="search"]::-ms-clear {
animation: growShrink 1.5s infinite ease-in-out;
}
-/* Subflow node z-index and drag-over styles */
-.workflow-container .react-flow__node-subflowNode {
-z-index: -1 !important;
-}
-.workflow-container .react-flow__node-subflowNode:has([data-subflow-selected="true"]) {
-z-index: 10 !important;
-}
-/* Subflow node drag-over styles */
.loop-node-drag-over,
.parallel-node-drag-over {
box-shadow: 0 0 0 1.75px var(--brand-secondary) !important;

View File

@@ -711,7 +711,7 @@ async function handleMessageStream(
if (response.body && isStreamingResponse) {
const reader = response.body.getReader()
const decoder = new TextDecoder()
-let accumulatedContent = ''
+const contentChunks: string[] = []
let finalContent: string | undefined
while (true) {
@@ -722,7 +722,7 @@ async function handleMessageStream(
const parsed = parseWorkflowSSEChunk(rawChunk)
if (parsed.content) {
-accumulatedContent += parsed.content
+contentChunks.push(parsed.content)
sendEvent('message', {
kind: 'message',
taskId,
@@ -738,6 +738,7 @@ async function handleMessageStream(
}
}
+const accumulatedContent = contentChunks.join('')
const messageContent =
(finalContent !== undefined && finalContent.length > 0
? finalContent

View File
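The hunk above replaces in-loop string concatenation with an array of chunks joined once after the stream ends, which avoids repeatedly reallocating a growing string when many small SSE chunks arrive. A minimal sketch of that accumulation pattern (names are illustrative, not from the codebase):

```typescript
// Collect streamed chunks in an array and join once at the end, instead of
// building the result with `accumulated += chunk` on every iteration.
function accumulate(chunks: Iterable<string>): string {
  const parts: string[] = []
  for (const chunk of chunks) {
    parts.push(chunk)
  }
  return parts.join('')
}

console.log(accumulate(['Hel', 'lo, ', 'world'])) // → Hello, world
```

Each chunk can still be forwarded to the client as it arrives; only the final joined value is needed once the stream closes.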

@@ -1,192 +0,0 @@
import { render } from '@react-email/components'
import { createLogger } from '@sim/logger'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { CareersConfirmationEmail, CareersSubmissionEmail } from '@/components/emails'
import { generateRequestId } from '@/lib/core/utils/request'
import { sendEmail } from '@/lib/messaging/email/mailer'
export const dynamic = 'force-dynamic'
const logger = createLogger('CareersAPI')
const MAX_FILE_SIZE = 10 * 1024 * 1024
const ALLOWED_FILE_TYPES = [
'application/pdf',
'application/msword',
'application/vnd.openxmlformats-officedocument.wordprocessingml.document',
]
const CareersSubmissionSchema = z.object({
name: z.string().min(2, 'Name must be at least 2 characters'),
email: z.string().email('Please enter a valid email address'),
phone: z.string().optional(),
position: z.string().min(2, 'Please specify the position you are interested in'),
linkedin: z.string().url('Please enter a valid LinkedIn URL').optional().or(z.literal('')),
portfolio: z.string().url('Please enter a valid portfolio URL').optional().or(z.literal('')),
experience: z.enum(['0-1', '1-3', '3-5', '5-10', '10+']),
location: z.string().min(2, 'Please enter your location'),
message: z.string().min(50, 'Please tell us more about yourself (at least 50 characters)'),
})
export async function POST(request: NextRequest) {
const requestId = generateRequestId()
try {
const formData = await request.formData()
const data = {
name: formData.get('name') as string,
email: formData.get('email') as string,
phone: formData.get('phone') as string,
position: formData.get('position') as string,
linkedin: formData.get('linkedin') as string,
portfolio: formData.get('portfolio') as string,
experience: formData.get('experience') as string,
location: formData.get('location') as string,
message: formData.get('message') as string,
}
const resumeFile = formData.get('resume') as File | null
if (!resumeFile) {
return NextResponse.json(
{
success: false,
message: 'Resume is required',
errors: [{ path: ['resume'], message: 'Resume is required' }],
},
{ status: 400 }
)
}
if (resumeFile.size > MAX_FILE_SIZE) {
return NextResponse.json(
{
success: false,
message: 'Resume file size must be less than 10MB',
errors: [{ path: ['resume'], message: 'File size must be less than 10MB' }],
},
{ status: 400 }
)
}
if (!ALLOWED_FILE_TYPES.includes(resumeFile.type)) {
return NextResponse.json(
{
success: false,
message: 'Resume must be a PDF or Word document',
errors: [{ path: ['resume'], message: 'File must be PDF or Word document' }],
},
{ status: 400 }
)
}
const resumeBuffer = await resumeFile.arrayBuffer()
const resumeBase64 = Buffer.from(resumeBuffer).toString('base64')
const validatedData = CareersSubmissionSchema.parse(data)
logger.info(`[${requestId}] Processing career application`, {
name: validatedData.name,
email: validatedData.email,
position: validatedData.position,
resumeSize: resumeFile.size,
resumeType: resumeFile.type,
})
const submittedDate = new Date()
const careersEmailHtml = await render(
CareersSubmissionEmail({
name: validatedData.name,
email: validatedData.email,
phone: validatedData.phone,
position: validatedData.position,
linkedin: validatedData.linkedin,
portfolio: validatedData.portfolio,
experience: validatedData.experience,
location: validatedData.location,
message: validatedData.message,
submittedDate,
})
)
const confirmationEmailHtml = await render(
CareersConfirmationEmail({
name: validatedData.name,
position: validatedData.position,
submittedDate,
})
)
const careersEmailResult = await sendEmail({
to: 'careers@sim.ai',
subject: `New Career Application: ${validatedData.name} - ${validatedData.position}`,
html: careersEmailHtml,
emailType: 'transactional',
replyTo: validatedData.email,
attachments: [
{
filename: resumeFile.name,
content: resumeBase64,
contentType: resumeFile.type,
},
],
})
if (!careersEmailResult.success) {
logger.error(`[${requestId}] Failed to send email to careers@sim.ai`, {
error: careersEmailResult.message,
})
throw new Error('Failed to submit application')
}
const confirmationResult = await sendEmail({
to: validatedData.email,
subject: `Your Application to Sim - ${validatedData.position}`,
html: confirmationEmailHtml,
emailType: 'transactional',
replyTo: validatedData.email,
})
if (!confirmationResult.success) {
logger.warn(`[${requestId}] Failed to send confirmation email to applicant`, {
email: validatedData.email,
error: confirmationResult.message,
})
}
logger.info(`[${requestId}] Career application submitted successfully`, {
careersEmailSent: careersEmailResult.success,
confirmationEmailSent: confirmationResult.success,
})
return NextResponse.json({
success: true,
message: 'Application submitted successfully',
})
} catch (error) {
if (error instanceof z.ZodError) {
logger.warn(`[${requestId}] Invalid application data`, { errors: error.errors })
return NextResponse.json(
{
success: false,
message: 'Invalid application data',
errors: error.errors,
},
{ status: 400 }
)
}
logger.error(`[${requestId}] Error processing career application:`, error)
return NextResponse.json(
{
success: false,
message:
'Failed to submit application. Please try again or email us directly at careers@sim.ai',
},
{ status: 500 }
)
}
}

View File

@@ -405,13 +405,17 @@ export async function POST(req: NextRequest) {
},
})
} finally {
-controller.close()
+try {
+controller.close()
+} catch {
+// controller may already be closed by cancel()
+}
}
},
async cancel() {
clientDisconnected = true
if (eventWriter) {
await eventWriter.flush()
await eventWriter.close().catch(() => {})
}
},
})

View File
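The fix above guards `controller.close()` because closing a stream controller that the consumer has already cancelled throws, and an unguarded throw in a `finally` block would mask the real outcome. A standalone sketch of the guard, using a stand-in controller class rather than a real `ReadableStream` so the double-close behavior is explicit:

```typescript
// Stand-in for a stream controller: close() throws if called after the
// stream is already closed, mirroring ReadableStreamDefaultController.
class Controller {
  private closed = false
  close(): void {
    if (this.closed) throw new TypeError('Controller is already closed')
    this.closed = true
  }
}

// The guard from the diff: swallow the "already closed" failure so cleanup
// in a finally block never masks the real result of the stream.
function safeClose(controller: Controller): void {
  try {
    controller.close()
  } catch {
    // controller may already be closed by cancel(); nothing left to do
  }
}

const c = new Controller()
safeClose(c) // closes normally
safeClose(c) // second close is swallowed instead of throwing
```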

@@ -2,8 +2,6 @@ import type { NextRequest } from 'next/server'
import { NextResponse } from 'next/server'
import {
renderBatchInvitationEmail,
-renderCareersConfirmationEmail,
-renderCareersSubmissionEmail,
renderCreditPurchaseEmail,
renderEnterpriseSubscriptionEmail,
renderFreeTierUpgradeEmail,
@@ -94,22 +92,6 @@ const emailTemplates = {
failureReason: 'Card declined',
}),
-// Careers emails
-'careers-confirmation': () => renderCareersConfirmationEmail('John Doe', 'Senior Engineer'),
-'careers-submission': () =>
-renderCareersSubmissionEmail({
-name: 'John Doe',
-email: 'john@example.com',
-phone: '+1 (555) 123-4567',
-position: 'Senior Engineer',
-linkedin: 'https://linkedin.com/in/johndoe',
-portfolio: 'https://johndoe.dev',
-experience: '5-10',
-location: 'San Francisco, CA',
-message:
-'I have 10 years of experience building scalable distributed systems. Most recently, I led a team at a Series B startup where we scaled from 100K to 10M users.',
-}),
// Notification emails
'workflow-notification-success': () =>
renderWorkflowNotificationEmail({
@@ -176,7 +158,6 @@ export async function GET(request: NextRequest) {
'credit-purchase',
'payment-failed',
],
-Careers: ['careers-confirmation', 'careers-submission'],
Notifications: [
'workflow-notification-success',
'workflow-notification-error',

View File

@@ -36,6 +36,7 @@ export async function GET(_request: NextRequest, { params }: { params: Promise<{
stateSnapshotId: workflowExecutionLogs.stateSnapshotId,
deploymentVersionId: workflowExecutionLogs.deploymentVersionId,
level: workflowExecutionLogs.level,
status: workflowExecutionLogs.status,
trigger: workflowExecutionLogs.trigger,
startedAt: workflowExecutionLogs.startedAt,
endedAt: workflowExecutionLogs.endedAt,
@@ -99,6 +100,7 @@ export async function GET(_request: NextRequest, { params }: { params: Promise<{
deploymentVersion: log.deploymentVersion ?? null,
deploymentVersionName: log.deploymentVersionName ?? null,
level: log.level,
status: log.status,
duration: log.totalDurationMs ? `${log.totalDurationMs}ms` : null,
trigger: log.trigger,
createdAt: log.startedAt.toISOString(),

View File

@@ -342,6 +342,7 @@ async function generateWithRunway(
})
if (!statusResponse.ok) {
await statusResponse.text().catch(() => {})
throw new Error(`Runway status check failed: ${statusResponse.status}`)
}
@@ -352,6 +353,7 @@ async function generateWithRunway(
const videoResponse = await fetch(statusData.output[0])
if (!videoResponse.ok) {
await videoResponse.text().catch(() => {})
throw new Error(`Failed to download video: ${videoResponse.status}`)
}
@@ -448,6 +450,7 @@ async function generateWithVeo(
)
if (!statusResponse.ok) {
await statusResponse.text().catch(() => {})
throw new Error(`Veo status check failed: ${statusResponse.status}`)
}
@@ -472,6 +475,7 @@ async function generateWithVeo(
})
if (!videoResponse.ok) {
await videoResponse.text().catch(() => {})
throw new Error(`Failed to download video: ${videoResponse.status}`)
}
@@ -561,6 +565,7 @@ async function generateWithLuma(
)
if (!statusResponse.ok) {
await statusResponse.text().catch(() => {})
throw new Error(`Luma status check failed: ${statusResponse.status}`)
}
@@ -576,6 +581,7 @@ async function generateWithLuma(
const videoResponse = await fetch(videoUrl)
if (!videoResponse.ok) {
await videoResponse.text().catch(() => {})
throw new Error(`Failed to download video: ${videoResponse.status}`)
}
@@ -679,6 +685,7 @@ async function generateWithMiniMax(
)
if (!statusResponse.ok) {
await statusResponse.text().catch(() => {})
throw new Error(`MiniMax status check failed: ${statusResponse.status}`)
}
@@ -712,6 +719,7 @@ async function generateWithMiniMax(
)
if (!fileResponse.ok) {
await fileResponse.text().catch(() => {})
throw new Error(`Failed to download video: ${fileResponse.status}`)
}
@@ -725,6 +733,7 @@ async function generateWithMiniMax(
// Download the actual video file
const videoResponse = await fetch(videoUrl)
if (!videoResponse.ok) {
await videoResponse.text().catch(() => {})
throw new Error(`Failed to download video from URL: ${videoResponse.status}`)
}
@@ -881,6 +890,7 @@ async function generateWithFalAI(
)
if (!statusResponse.ok) {
await statusResponse.text().catch(() => {})
throw new Error(`Fal.ai status check failed: ${statusResponse.status}`)
}
@@ -899,6 +909,7 @@ async function generateWithFalAI(
)
if (!resultResponse.ok) {
await resultResponse.text().catch(() => {})
throw new Error(`Failed to fetch result: ${resultResponse.status}`)
}
@@ -911,6 +922,7 @@ async function generateWithFalAI(
const videoResponse = await fetch(videoUrl)
if (!videoResponse.ok) {
await videoResponse.text().catch(() => {})
throw new Error(`Failed to download video: ${videoResponse.status}`)
}

View File

@@ -828,6 +828,9 @@ export async function POST(req: NextRequest, { params }: { params: Promise<{ id:
iterationTotal: iterationContext.iterationTotal,
iterationType: iterationContext.iterationType,
iterationContainerId: iterationContext.iterationContainerId,
...(iterationContext.parentIterations?.length && {
parentIterations: iterationContext.parentIterations,
}),
}),
...(childWorkflowContext && {
childWorkflowBlockId: childWorkflowContext.parentBlockId,
@@ -884,6 +887,9 @@ export async function POST(req: NextRequest, { params }: { params: Promise<{ id:
iterationTotal: iterationContext.iterationTotal,
iterationType: iterationContext.iterationType,
iterationContainerId: iterationContext.iterationContainerId,
...(iterationContext.parentIterations?.length && {
parentIterations: iterationContext.parentIterations,
}),
}),
...childWorkflowData,
...instanceData,
@@ -915,6 +921,9 @@ export async function POST(req: NextRequest, { params }: { params: Promise<{ id:
iterationTotal: iterationContext.iterationTotal,
iterationType: iterationContext.iterationType,
iterationContainerId: iterationContext.iterationContainerId,
...(iterationContext.parentIterations?.length && {
parentIterations: iterationContext.parentIterations,
}),
}),
...childWorkflowData,
...instanceData,

View File

@@ -38,6 +38,24 @@ export default function RootLayout({ children }: { children: React.ReactNode })
return (
<html lang='en' suppressHydrationWarning>
<head>
{/* Polyfill crypto.randomUUID for non-secure contexts (HTTP on non-localhost) */}
<script
id='crypto-randomuuid-polyfill'
dangerouslySetInnerHTML={{
__html: `
if (typeof crypto !== 'undefined' && typeof crypto.randomUUID !== 'function' && typeof crypto.getRandomValues === 'function') {
crypto.randomUUID = function() {
var a = new Uint8Array(16);
crypto.getRandomValues(a);
a[6] = (a[6] & 0x0f) | 0x40;
a[8] = (a[8] & 0x3f) | 0x80;
var h = Array.prototype.map.call(a, function(b) { return ('0' + b.toString(16)).slice(-2); }).join('');
return h.slice(0,8) + '-' + h.slice(8,12) + '-' + h.slice(12,16) + '-' + h.slice(16,20) + '-' + h.slice(20);
};
}
`,
}}
/>
{isReactScanEnabled && (
<Script
src='https://unpkg.com/react-scan/dist/auto.global.js'

View File
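The polyfill above fills 16 random bytes and then forces the RFC 4122 version bits (byte 6 becomes `0x4?`) and variant bits (byte 8 becomes `0x8?`..`0xb?`) before formatting the hex groups. A small shape check for the IDs it produces (hypothetical helper, not part of the codebase):

```typescript
// True iff `id` matches the RFC 4122 version-4 UUID shape: the third group
// starts with '4' (version) and the fourth starts with 8, 9, a, or b (variant).
function isUuidV4(id: string): boolean {
  return /^[0-9a-f]{8}-[0-9a-f]{4}-4[0-9a-f]{3}-[89ab][0-9a-f]{3}-[0-9a-f]{12}$/.test(id)
}

console.log(isUuidV4('123e4567-e89b-42d3-a456-426614174000')) // → true
console.log(isUuidV4('123e4567-e89b-12d3-a456-426614174000')) // → false (version 1)
```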

@@ -55,7 +55,7 @@ Sim provides a visual drag-and-drop interface for building and deploying AI agen
## Optional
-- [Careers](${baseUrl}/careers): Join the Sim team
+- [Careers](https://jobs.ashbyhq.com/sim): Join the Sim team
- [Terms of Service](${baseUrl}/terms): Legal terms
- [Privacy Policy](${baseUrl}/privacy): Data handling practices
`

View File

@@ -28,10 +28,6 @@ export default async function sitemap(): Promise<MetadataRoute.Sitemap> {
url: `${baseUrl}/changelog`,
lastModified: now,
},
-{
-url: `${baseUrl}/careers`,
-lastModified: new Date('2024-10-06'),
-},
{
url: `${baseUrl}/terms`,
lastModified: new Date('2024-10-14'),

View File

@@ -164,7 +164,7 @@ export const ActionBar = memo(
return (
<div
className={cn(
-'-top-[46px] absolute right-0',
+'-top-[46px] pointer-events-auto absolute right-0',
'flex flex-row items-center',
'opacity-0 transition-opacity duration-200 group-hover:opacity-100',
'gap-[5px] rounded-[10px] p-[5px]',

View File

@@ -31,6 +31,7 @@ import {
} from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/components/tag-dropdown/tag-dropdown'
import { useSubBlockValue } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/hooks/use-sub-block-value'
import type { WandControlHandlers } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/sub-block'
import { restoreCursorAfterInsertion } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/utils'
import { WandPromptBar } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/wand-prompt-bar/wand-prompt-bar'
import { useAccessibleReferencePrefixes } from '@/app/workspace/[workspaceId]/w/[workflowId]/hooks/use-accessible-reference-prefixes'
import { useWand } from '@/app/workspace/[workspaceId]/w/[workflowId]/hooks/use-wand'
@@ -537,36 +538,40 @@ export const Code = memo(function Code({
/**
* Handles selection of a tag from the tag dropdown.
* @param newValue - The new code value with the selected tag inserted
* @param newCursorPosition - The cursor position after the inserted tag
*/
-const handleTagSelect = (newValue: string) => {
+const handleTagSelect = (newValue: string, newCursorPosition: number) => {
+const textarea = editorRef.current?.querySelector('textarea') as HTMLTextAreaElement | null
if (!isPreview && !readOnly) {
setCode(newValue)
emitTagSelection(newValue)
recordChange(newValue)
+restoreCursorAfterInsertion(textarea, newCursorPosition)
+} else {
+setTimeout(() => textarea?.focus(), 0)
}
setShowTags(false)
setActiveSourceBlockId(null)
-setTimeout(() => {
-editorRef.current?.querySelector('textarea')?.focus()
-}, 0)
}
/**
* Handles selection of an environment variable from the dropdown.
* @param newValue - The new code value with the selected env var inserted
* @param newCursorPosition - The cursor position after the inserted env var
*/
-const handleEnvVarSelect = (newValue: string) => {
+const handleEnvVarSelect = (newValue: string, newCursorPosition: number) => {
+const textarea = editorRef.current?.querySelector('textarea') as HTMLTextAreaElement | null
if (!isPreview && !readOnly) {
setCode(newValue)
emitTagSelection(newValue)
recordChange(newValue)
+restoreCursorAfterInsertion(textarea, newCursorPosition)
+} else {
+setTimeout(() => textarea?.focus(), 0)
}
setShowEnvVars(false)
-setTimeout(() => {
-editorRef.current?.querySelector('textarea')?.focus()
-}, 0)
}
/**

View File

@@ -31,6 +31,7 @@ import {
TagDropdown,
} from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/components/tag-dropdown/tag-dropdown'
import { useSubBlockValue } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/hooks/use-sub-block-value'
import { restoreCursorAfterInsertion } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/utils'
import { useAccessibleReferencePrefixes } from '@/app/workspace/[workspaceId]/w/[workflowId]/hooks/use-accessible-reference-prefixes'
import { normalizeName } from '@/executor/constants'
import { createEnvVarPattern, createReferencePattern } from '@/executor/utils/reference-validation'
@@ -554,9 +555,17 @@ export function ConditionInput({
)
}
-const handleTagSelectImmediate = (blockId: string, newValue: string) => {
+const handleTagSelectImmediate = (
+blockId: string,
+newValue: string,
+newCursorPosition: number
+) => {
if (isPreview || disabled) return
+const textarea = containerRef.current?.querySelector(
+`[data-block-id="${CSS.escape(blockId)}"] textarea`
+) as HTMLTextAreaElement | null
shouldPersistRef.current = true
setConditionalBlocks((blocks) =>
blocks.map((block) =>
@@ -582,11 +591,21 @@ export function ConditionInput({
: block
)
emitTagSelection(JSON.stringify(updatedBlocks))
+restoreCursorAfterInsertion(textarea, newCursorPosition)
}
-const handleEnvVarSelectImmediate = (blockId: string, newValue: string) => {
+const handleEnvVarSelectImmediate = (
+blockId: string,
+newValue: string,
+newCursorPosition: number
+) => {
if (isPreview || disabled) return
+const textarea = containerRef.current?.querySelector(
+`[data-block-id="${CSS.escape(blockId)}"] textarea`
+) as HTMLTextAreaElement | null
shouldPersistRef.current = true
setConditionalBlocks((blocks) =>
blocks.map((block) =>
@@ -612,6 +631,8 @@ export function ConditionInput({
: block
)
emitTagSelection(JSON.stringify(updatedBlocks))
+restoreCursorAfterInsertion(textarea, newCursorPosition)
}
/**
@@ -999,7 +1020,9 @@ export function ConditionInput({
{block.showEnvVars && (
<EnvVarDropdown
visible={block.showEnvVars}
-onSelect={(newValue) => handleEnvVarSelectImmediate(block.id, newValue)}
+onSelect={(newValue, newCursorPosition) =>
+handleEnvVarSelectImmediate(block.id, newValue, newCursorPosition)
+}
searchTerm={block.searchTerm}
inputValue={block.value}
cursorPosition={block.cursorPosition}
@@ -1023,7 +1046,9 @@ export function ConditionInput({
{block.showTags && (
<TagDropdown
visible={block.showTags}
-onSelect={(newValue) => handleTagSelectImmediate(block.id, newValue)}
+onSelect={(newValue, newCursorPosition) =>
+handleTagSelectImmediate(block.id, newValue, newCursorPosition)
+}
blockId={blockId}
activeSourceBlockId={block.activeSourceBlockId}
inputValue={block.value}
@@ -1207,7 +1232,9 @@ export function ConditionInput({
{block.showEnvVars && (
<EnvVarDropdown
visible={block.showEnvVars}
-onSelect={(newValue) => handleEnvVarSelectImmediate(block.id, newValue)}
+onSelect={(newValue, newCursorPosition) =>
+handleEnvVarSelectImmediate(block.id, newValue, newCursorPosition)
+}
searchTerm={block.searchTerm}
inputValue={block.value}
cursorPosition={block.cursorPosition}
@@ -1225,7 +1252,9 @@ export function ConditionInput({
{block.showTags && (
<TagDropdown
visible={block.showTags}
-onSelect={(newValue) => handleTagSelectImmediate(block.id, newValue)}
+onSelect={(newValue, newCursorPosition) =>
+handleTagSelectImmediate(block.id, newValue, newCursorPosition)
+}
blockId={blockId}
activeSourceBlockId={block.activeSourceBlockId}
inputValue={block.value}

View File

@@ -55,6 +55,10 @@ const SCOPE_DESCRIPTIONS: Record<string, string> = {
'https://www.googleapis.com/auth/admin.directory.group.readonly': 'View Google Workspace groups',
'https://www.googleapis.com/auth/admin.directory.group.member.readonly':
'View Google Workspace group memberships',
'https://www.googleapis.com/auth/meetings.space.created':
'Create and manage Google Meet meeting spaces',
'https://www.googleapis.com/auth/meetings.space.readonly':
'View Google Meet meeting space details',
'https://www.googleapis.com/auth/cloud-platform':
'Full access to Google Cloud resources for Vertex AI',
'read:confluence-content.all': 'Read all Confluence content',
@@ -119,6 +123,7 @@ const SCOPE_DESCRIPTIONS: Record<string, string> = {
'offline.access': 'Access account when not using the application',
'data.records:read': 'Read records',
'data.records:write': 'Write to records',
'schema.bases:read': 'View bases and tables',
'webhook:manage': 'Manage webhooks',
'page.read': 'Read Notion pages',
'page.write': 'Write to Notion pages',

View File

@@ -23,7 +23,7 @@ interface EnvVarDropdownProps {
/** Whether the dropdown is visible */
visible: boolean
/** Callback when an environment variable is selected */
-onSelect: (newValue: string) => void
+onSelect: (newValue: string, newCursorPosition: number) => void
/** Search term to filter environment variables */
searchTerm?: string
/** Additional CSS class names */
@@ -189,6 +189,8 @@ export const EnvVarDropdown: React.FC<EnvVarDropdownProps> = ({
const isStandardEnvVarContext = lastOpenBraces !== -1
+const tagLength = 2 + envVar.length + 2
if (isStandardEnvVarContext) {
const startText = textBeforeCursor.slice(0, lastOpenBraces)
@@ -196,13 +198,10 @@ export const EnvVarDropdown: React.FC<EnvVarDropdownProps> = ({
const endText = closeIndex !== -1 ? textAfterCursor.slice(closeIndex + 2) : textAfterCursor
const newValue = `${startText}{{${envVar}}}${endText}`
-onSelect(newValue)
+onSelect(newValue, lastOpenBraces + tagLength)
} else {
-if (inputValue.trim() !== '') {
-onSelect(`{{${envVar}}}`)
-} else {
-onSelect(`{{${envVar}}}`)
-}
+const newValue = `{{${envVar}}}`
+onSelect(newValue, tagLength)
}
onClose?.()

View File
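The dropdown change above passes the caret position to `onSelect` instead of letting callers search the output string for delimiters: the inserted tag is `{{name}}`, so the caret belongs at the insertion start plus `2 + name.length + 2`. A hypothetical helper showing the same arithmetic:

```typescript
// Build the new value with a `{{envVar}}` tag inserted between `before` and
// `after`, and compute where the caret should land (just past the closing `}}`).
function insertEnvVarTag(
  before: string,
  after: string,
  envVar: string
): { value: string; cursor: number } {
  const tag = `{{${envVar}}}` // length is 2 + envVar.length + 2
  return { value: before + tag + after, cursor: before.length + tag.length }
}

const result = insertEnvVarTag('const key = ', ';', 'API_KEY')
console.log(result.value) // → const key = {{API_KEY}};
console.log(result.cursor) // → 23
```

Computing the position at insertion time avoids re-scanning the result for `{{`, which could match an unrelated delimiter elsewhere in the code.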

@@ -70,7 +70,7 @@ interface TagDropdownProps {
/** Whether the dropdown is visible */
visible: boolean
/** Callback when a tag is selected */
-onSelect: (newValue: string) => void
+onSelect: (newValue: string, newCursorPosition: number) => void
/** ID of the block that owns the input field */
blockId: string
/** ID of the specific source block being referenced, if any */
@@ -1167,21 +1167,56 @@ export const TagDropdown: React.FC<TagDropdownProps> = ({
{} as Record<string, { type: string; id: string }>
)
-let loopBlockGroup: BlockTagGroup | null = null
+const loopBlockGroups: BlockTagGroup[] = []
+const ancestorLoopIds = new Set<string>()
+const visitedContainerIds = new Set<string>()
+const findAncestorContainers = (targetId: string) => {
+if (visitedContainerIds.has(targetId)) return
+visitedContainerIds.add(targetId)
+// Check if targetId is directly inside any loop
+for (const [loopId, loop] of Object.entries(loops)) {
+if (loop.nodes.includes(targetId) && !ancestorLoopIds.has(loopId)) {
+ancestorLoopIds.add(loopId)
+const loopBlock = blocks[loopId]
+if (loopBlock) {
+const loopType = loop.loopType || 'for'
+const loopBlockName = loopBlock.name || loopBlock.type
+const normalizedLoopName = normalizeName(loopBlockName)
+const contextualTags: string[] = [`${normalizedLoopName}.index`]
+if (loopType === 'forEach') {
+contextualTags.push(`${normalizedLoopName}.currentItem`)
+contextualTags.push(`${normalizedLoopName}.items`)
+}
+loopBlockGroups.push({
+blockName: loopBlockName,
+blockId: loopId,
+blockType: 'loop',
+tags: contextualTags,
+distance: 0,
+isContextual: true,
+})
+}
+findAncestorContainers(loopId)
+}
+}
+// Also walk through containing parallels so we find loops that contain
+// the parallel (e.g. block inside parallel inside loop)
+for (const [parallelId, parallel] of Object.entries(parallels || {})) {
+if (parallel.nodes.includes(targetId)) {
+findAncestorContainers(parallelId)
+}
+}
+}
const isLoopBlock = blocks[blockId]?.type === 'loop'
const currentLoop = isLoopBlock ? loops[blockId] : null
const containingLoop = Object.entries(loops).find(([_, loop]) => loop.nodes.includes(blockId))
let containingLoopBlockId: string | null = null
if (currentLoop && isLoopBlock) {
containingLoopBlockId = blockId
const loopType = currentLoop.loopType || 'for'
if (isLoopBlock && loops[blockId]) {
const loop = loops[blockId]
ancestorLoopIds.add(blockId)
const loopBlock = blocks[blockId]
if (loopBlock) {
const loopType = loop.loopType || 'for'
const loopBlockName = loopBlock.name || loopBlock.type
const normalizedLoopName = normalizeName(loopBlockName)
const contextualTags: string[] = [`${normalizedLoopName}.index`]
@@ -1189,71 +1224,65 @@ export const TagDropdown: React.FC<TagDropdownProps> = ({
contextualTags.push(`${normalizedLoopName}.currentItem`)
contextualTags.push(`${normalizedLoopName}.items`)
}
loopBlockGroup = {
loopBlockGroups.push({
blockName: loopBlockName,
blockId: blockId,
blockType: 'loop',
tags: contextualTags,
distance: 0,
isContextual: true,
}
})
}
} else if (containingLoop) {
const [loopId, loop] = containingLoop
containingLoopBlockId = loopId
const loopType = loop.loopType || 'for'
findAncestorContainers(blockId)
} else {
findAncestorContainers(blockId)
}
const containingLoopBlock = blocks[loopId]
if (containingLoopBlock) {
const loopBlockName = containingLoopBlock.name || containingLoopBlock.type
const normalizedLoopName = normalizeName(loopBlockName)
const contextualTags: string[] = [`${normalizedLoopName}.index`]
if (loopType === 'forEach') {
contextualTags.push(`${normalizedLoopName}.currentItem`)
contextualTags.push(`${normalizedLoopName}.items`)
}
const parallelBlockGroups: BlockTagGroup[] = []
const ancestorParallelIds = new Set<string>()
const visitedParallelTargets = new Set<string>()
loopBlockGroup = {
blockName: loopBlockName,
blockId: loopId,
blockType: 'loop',
tags: contextualTags,
distance: 0,
isContextual: true,
const findAncestorParallels = (targetId: string) => {
if (visitedParallelTargets.has(targetId)) return
visitedParallelTargets.add(targetId)
for (const [parallelId, parallel] of Object.entries(parallels || {})) {
if (parallel.nodes.includes(targetId) && !ancestorParallelIds.has(parallelId)) {
ancestorParallelIds.add(parallelId)
const parallelBlock = blocks[parallelId]
if (parallelBlock) {
const parallelType = parallel.parallelType || 'count'
const parallelBlockName = parallelBlock.name || parallelBlock.type
const normalizedParallelName = normalizeName(parallelBlockName)
const contextualTags: string[] = [`${normalizedParallelName}.index`]
if (parallelType === 'collection') {
contextualTags.push(`${normalizedParallelName}.currentItem`)
contextualTags.push(`${normalizedParallelName}.items`)
}
parallelBlockGroups.push({
blockName: parallelBlockName,
blockId: parallelId,
blockType: 'parallel',
tags: contextualTags,
distance: 0,
isContextual: true,
})
}
// Walk up through containing loops and parallels
for (const [loopId, loop] of Object.entries(loops)) {
if (loop.nodes.includes(parallelId)) {
findAncestorParallels(loopId)
}
}
findAncestorParallels(parallelId)
}
}
}
let parallelBlockGroup: BlockTagGroup | null = null
const containingParallel = Object.entries(parallels || {}).find(([_, parallel]) =>
parallel.nodes.includes(blockId)
)
let containingParallelBlockId: string | null = null
if (containingParallel) {
const [parallelId, parallel] = containingParallel
containingParallelBlockId = parallelId
const parallelType = parallel.parallelType || 'count'
const containingParallelBlock = blocks[parallelId]
if (containingParallelBlock) {
const parallelBlockName = containingParallelBlock.name || containingParallelBlock.type
const normalizedParallelName = normalizeName(parallelBlockName)
const contextualTags: string[] = [`${normalizedParallelName}.index`]
if (parallelType === 'collection') {
contextualTags.push(`${normalizedParallelName}.currentItem`)
contextualTags.push(`${normalizedParallelName}.items`)
}
parallelBlockGroup = {
blockName: parallelBlockName,
blockId: parallelId,
blockType: 'parallel',
tags: contextualTags,
distance: 0,
isContextual: true,
}
}
findAncestorParallels(blockId)
// Also check through ancestor loops (a block in a loop that's in a parallel)
for (const loopId of ancestorLoopIds) {
findAncestorParallels(loopId)
}
const blockTagGroups: BlockTagGroup[] = []
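The recursive ancestor walk introduced above (`findAncestorContainers` / `findAncestorParallels`) can be sketched as one pure function. Shapes and the helper name are simplified assumptions, not the store's real types; the point is the transitive walk through both container kinds with a visited-set cycle guard.

```typescript
// Sketch: starting from a block, collect every loop and parallel container
// that transitively contains it (e.g. block inside parallel inside loop).
interface Container {
  nodes: string[]
}

function collectAncestors(
  startId: string,
  loops: Record<string, Container>,
  parallels: Record<string, Container>
): { loopIds: Set<string>; parallelIds: Set<string> } {
  const loopIds = new Set<string>()
  const parallelIds = new Set<string>()
  const visited = new Set<string>()
  const walk = (targetId: string) => {
    if (visited.has(targetId)) return // cycle guard, mirrors visitedContainerIds
    visited.add(targetId)
    for (const [id, c] of Object.entries(loops)) {
      if (c.nodes.includes(targetId) && !loopIds.has(id)) {
        loopIds.add(id)
        walk(id) // the loop itself may sit inside another container
      }
    }
    for (const [id, c] of Object.entries(parallels)) {
      if (c.nodes.includes(targetId) && !parallelIds.has(id)) {
        parallelIds.add(id)
        walk(id)
      }
    }
  }
  walk(startId)
  return { loopIds, parallelIds }
}
```

A block nested as loop → parallel → block therefore yields both container IDs, which is what lets the dropdown emit contextual tags for every ancestor rather than only the nearest one.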
@@ -1275,8 +1304,8 @@ export const TagDropdown: React.FC<TagDropdownProps> = ({
if (!blockConfig) {
if (accessibleBlock.type === 'loop' || accessibleBlock.type === 'parallel') {
if (
accessibleBlockId === containingLoopBlockId ||
accessibleBlockId === containingParallelBlockId
ancestorLoopIds.has(accessibleBlockId) ||
ancestorParallelIds.has(accessibleBlockId)
) {
continue
}
@@ -1366,12 +1395,8 @@ export const TagDropdown: React.FC<TagDropdownProps> = ({
}
const finalBlockTagGroups: BlockTagGroup[] = []
if (loopBlockGroup) {
finalBlockTagGroups.push(loopBlockGroup)
}
if (parallelBlockGroup) {
finalBlockTagGroups.push(parallelBlockGroup)
}
finalBlockTagGroups.push(...loopBlockGroups)
finalBlockTagGroups.push(...parallelBlockGroups)
blockTagGroups.sort((a, b) => a.distance - b.distance)
finalBlockTagGroups.push(...blockTagGroups)
@@ -1570,28 +1595,15 @@ export const TagDropdown: React.FC<TagDropdownProps> = ({
if (variableObj) {
processedTag = tag
}
} else if (
blockGroup?.isContextual &&
(blockGroup.blockType === 'loop' || blockGroup.blockType === 'parallel')
) {
const tagParts = tag.split('.')
if (tagParts.length === 1) {
processedTag = blockGroup.blockType
} else {
const lastPart = tagParts[tagParts.length - 1]
if (['index', 'currentItem', 'items'].includes(lastPart)) {
processedTag = `${blockGroup.blockType}.${lastPart}`
} else {
processedTag = tag
}
}
}
let newValue: string
let insertStart: number
if (lastOpenBracket === -1) {
// No '<' found - insert the full tag at cursor position
newValue = `${textBeforeCursor}<${processedTag}>${textAfterCursor}`
insertStart = liveCursor
} else {
// '<' found - replace from '<' to cursor (and consume trailing '>' if present)
const nextCloseBracket = textAfterCursor.indexOf('>')
@@ -1605,9 +1617,11 @@ export const TagDropdown: React.FC<TagDropdownProps> = ({
}
newValue = `${textBeforeCursor.slice(0, lastOpenBracket)}<${processedTag}>${remainingTextAfterCursor}`
insertStart = lastOpenBracket
}
onSelect(newValue)
const newCursorPos = insertStart + 1 + processedTag.length + 1
onSelect(newValue, newCursorPos)
onClose?.()
},
[workflowVariables, onSelect, onClose, getMergedSubBlocks]
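The explicit cursor computation above (`insertStart + 1 + processedTag.length + 1`) replaces the old approach of searching the output string for delimiters. A standalone sketch of the two branches, with an illustrative helper name:

```typescript
// Sketch of the tag-insertion math above: the caret lands one past the
// closing '>' of the inserted `<tag>`.
function computeTagInsertion(
  textBeforeCursor: string,
  textAfterCursor: string,
  processedTag: string
): { newValue: string; newCursorPos: number } {
  const lastOpenBracket = textBeforeCursor.lastIndexOf('<')
  let newValue: string
  let insertStart: number
  if (lastOpenBracket === -1) {
    // No '<' found: insert the full tag at the cursor position.
    newValue = `${textBeforeCursor}<${processedTag}>${textAfterCursor}`
    insertStart = textBeforeCursor.length
  } else {
    // '<' found: replace from '<' to cursor, consuming a trailing '>' if present.
    const nextCloseBracket = textAfterCursor.indexOf('>')
    const remainingTextAfterCursor =
      nextCloseBracket !== -1 ? textAfterCursor.slice(nextCloseBracket + 1) : textAfterCursor
    newValue = `${textBeforeCursor.slice(0, lastOpenBracket)}<${processedTag}>${remainingTextAfterCursor}`
    insertStart = lastOpenBracket
  }
  // '<' + tag + '>' → one character each side of the tag text
  const newCursorPos = insertStart + 1 + processedTag.length + 1
  return { newValue, newCursorPos }
}
```

Because the position is computed from where the tag was actually inserted, it cannot be fooled by unrelated `<` characters earlier in a code value, which was the failure mode the commit describes.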


@@ -7,6 +7,7 @@ import {
splitReferenceSegment,
} from '@/lib/workflows/sanitization/references'
import { checkTagTrigger } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/components/tag-dropdown/tag-dropdown'
import { restoreCursorAfterInsertion } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/utils'
import { useAccessibleReferencePrefixes } from '@/app/workspace/[workspaceId]/w/[workflowId]/hooks/use-accessible-reference-prefixes'
import { normalizeName, REFERENCE } from '@/executor/constants'
import { createEnvVarPattern, createReferencePattern } from '@/executor/utils/reference-validation'
@@ -60,7 +61,6 @@ export function useSubflowEditor(currentBlock: BlockState | null, currentBlockId
const textareaRef = useRef<HTMLTextAreaElement | null>(null)
const editorContainerRef = useRef<HTMLDivElement>(null)
const [tempInputValue, setTempInputValue] = useState<string | null>(null)
const [showTagDropdown, setShowTagDropdown] = useState(false)
const [cursorPosition, setCursorPosition] = useState(0)
@@ -289,8 +289,9 @@ export function useSubflowEditor(currentBlock: BlockState | null, currentBlockId
* Handle tag selection from dropdown
*/
const handleSubflowTagSelect = useCallback(
(newValue: string) => {
(newValue: string, newCursorPosition: number) => {
if (!currentBlockId || !isSubflow || !currentBlock) return
collaborativeUpdateIterationCollection(
currentBlockId,
currentBlock.type as 'loop' | 'parallel',
@@ -298,12 +299,7 @@ export function useSubflowEditor(currentBlock: BlockState | null, currentBlockId
)
setShowTagDropdown(false)
setTimeout(() => {
const textarea = textareaRef.current
if (textarea) {
textarea.focus()
}
}, 0)
restoreCursorAfterInsertion(textareaRef.current, newCursorPosition)
},
[currentBlockId, isSubflow, currentBlock, collaborativeUpdateIterationCollection]
)


@@ -0,0 +1,20 @@
/**
* Restores the cursor position in a textarea after a dropdown insertion.
* Schedules a macrotask (via setTimeout) that runs after React's controlled-component commit
* so that the cursor position sticks.
*
* @param textarea - The textarea element to restore cursor in (may be null)
* @param newCursorPosition - The exact position to place the cursor at
*/
export function restoreCursorAfterInsertion(
textarea: HTMLTextAreaElement | null,
newCursorPosition: number
): void {
setTimeout(() => {
if (textarea) {
textarea.focus()
textarea.selectionStart = newCursorPosition
textarea.selectionEnd = newCursorPosition
}
}, 0)
}
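The macrotask is the essential part: assigning `selectionStart` synchronously would be overwritten when React commits the controlled value on the next render. A node-runnable sketch of the timing behavior, using a stubbed textarea object in place of a DOM element (the stub and its fields are illustrative):

```typescript
// Stub standing in for textareaRef.current in a non-DOM environment.
const fakeTextarea = {
  focused: false,
  selectionStart: 0,
  selectionEnd: 0,
  focus(): void {
    fakeTextarea.focused = true
  },
}

// Same shape as the helper above, typed against the stub.
function restoreCursorAfterInsertion(
  textarea: typeof fakeTextarea | null,
  newCursorPosition: number
): void {
  setTimeout(() => {
    if (textarea) {
      textarea.focus()
      textarea.selectionStart = newCursorPosition
      textarea.selectionEnd = newCursorPosition
    }
  }, 0)
}

restoreCursorAfterInsertion(fakeTextarea, 7)
// Nothing happens synchronously — the restore runs only after the current
// task (and React's commit, in the real app) has finished.
```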


@@ -67,9 +67,6 @@ const ToolbarItem = memo(function ToolbarItem({
const handleDragStart = useCallback(
(e: React.DragEvent<HTMLElement>) => {
if (!isTrigger && (item.type === 'loop' || item.type === 'parallel')) {
document.body.classList.add('sim-drag-subflow')
}
const iconElement = e.currentTarget.querySelector('.toolbar-item-icon')
onDragStart(e, item.type, isTriggerCapable, {
name: item.name,
@@ -80,12 +77,6 @@ const ToolbarItem = memo(function ToolbarItem({
[item.type, item.name, item.bgColor, isTriggerCapable, onDragStart, isTrigger]
)
const handleDragEnd = useCallback(() => {
if (!isTrigger) {
document.body.classList.remove('sim-drag-subflow')
}
}, [isTrigger])
const handleClick = useCallback(() => {
onClick(item.type, isTriggerCapable)
}, [item.type, isTriggerCapable, onClick])
@@ -114,7 +105,6 @@ const ToolbarItem = memo(function ToolbarItem({
tabIndex={-1}
draggable
onDragStart={handleDragStart}
onDragEnd={handleDragEnd}
onClick={handleClick}
onContextMenu={handleContextMenu}
className={clsx(


@@ -8,6 +8,7 @@ import { type DiffStatus, hasDiffStatus } from '@/lib/workflows/diff/types'
import { useUserPermissionsContext } from '@/app/workspace/[workspaceId]/providers/workspace-permissions-provider'
import { ActionBar } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/action-bar/action-bar'
import { useCurrentWorkflow } from '@/app/workspace/[workspaceId]/w/[workflowId]/hooks'
import { useLastRunPath } from '@/stores/execution'
import { usePanelEditorStore } from '@/stores/panel'
/**
@@ -23,6 +24,8 @@ export interface SubflowNodeData {
isPreviewSelected?: boolean
kind: 'loop' | 'parallel'
name?: string
/** Execution status passed by preview/snapshot views */
executionStatus?: 'success' | 'error' | 'not-executed'
}
/**
@@ -56,6 +59,15 @@ export const SubflowNodeComponent = memo(({ data, id, selected }: NodeProps<Subf
const isPreviewSelected = data?.isPreviewSelected || false
const lastRunPath = useLastRunPath()
const executionStatus = data.executionStatus
const runPathStatus: 'success' | 'error' | undefined =
executionStatus === 'success' || executionStatus === 'error'
? executionStatus
: isPreview
? undefined
: lastRunPath.get(id)
/**
* Calculate the nesting level of this subflow node based on its parent hierarchy.
* Used to apply appropriate styling for nested containers.
@@ -105,33 +117,57 @@ export const SubflowNodeComponent = memo(({ data, id, selected }: NodeProps<Subf
* Determine the ring styling based on subflow state priority:
* 1. Focused (selected in editor), selected (shift-click/box), or preview selected - blue ring
* 2. Diff status (version comparison) - green/orange ring
* 3. Run path status (execution result) - green/red ring
*/
const isSelected = !isPreview && selected
const hasRing =
isFocused || isSelected || isPreviewSelected || diffStatus === 'new' || diffStatus === 'edited'
const ringStyles = cn(
hasRing && 'ring-[1.75px]',
(isFocused || isSelected || isPreviewSelected) && 'ring-[var(--brand-secondary)]',
diffStatus === 'new' && 'ring-[var(--brand-tertiary-2)]',
diffStatus === 'edited' && 'ring-[var(--warning)]'
)
isFocused ||
isSelected ||
isPreviewSelected ||
diffStatus === 'new' ||
diffStatus === 'edited' ||
!!runPathStatus
/**
* Compute the outline color for the subflow ring.
* Uses CSS outline instead of box-shadow ring because in ReactFlow v11,
* child nodes are DOM children of parent nodes and paint over the parent's
* internal ring overlay. Outline renders on the element's own compositing
* layer, so it stays visible above nested child nodes.
*/
const outlineColor = hasRing
? isFocused || isSelected || isPreviewSelected
? 'var(--brand-secondary)'
: diffStatus === 'new'
? 'var(--brand-tertiary-2)'
: diffStatus === 'edited'
? 'var(--warning)'
: runPathStatus === 'success'
? executionStatus
? 'var(--brand-tertiary-2)'
: 'var(--border-success)'
: runPathStatus === 'error'
? 'var(--text-error)'
: undefined
: undefined
return (
<div className='group relative'>
<div className='group pointer-events-none relative'>
<div
ref={blockRef}
onClick={() => setCurrentBlockId(id)}
className={cn(
'workflow-drag-handle relative cursor-grab select-none rounded-[8px] border border-[var(--border-1)] [&:active]:cursor-grabbing',
'transition-block-bg transition-ring',
'z-[20]'
'relative select-none rounded-[8px] border border-[var(--border-1)]',
'transition-block-bg'
)}
style={{
width: data.width || 500,
height: data.height || 300,
position: 'relative',
overflow: 'visible',
pointerEvents: isPreview ? 'none' : 'all',
pointerEvents: 'none',
...(outlineColor && {
outline: `1.75px solid ${outlineColor}`,
outlineOffset: '-1px',
}),
}}
data-node-id={id}
data-type='subflowNode'
@@ -142,11 +178,13 @@ export const SubflowNodeComponent = memo(({ data, id, selected }: NodeProps<Subf
<ActionBar blockId={id} blockType={data.kind} disabled={!userPermissions.canEdit} />
)}
{/* Header Section */}
{/* Header Section — only interactive area for dragging */}
<div
onClick={() => setCurrentBlockId(id)}
className={cn(
'flex items-center justify-between rounded-t-[8px] border-[var(--border)] border-b bg-[var(--surface-2)] py-[8px] pr-[12px] pl-[8px]'
'workflow-drag-handle flex cursor-grab items-center justify-between rounded-t-[8px] border-[var(--border)] border-b bg-[var(--surface-2)] py-[8px] pr-[12px] pl-[8px] [&:active]:cursor-grabbing'
)}
style={{ pointerEvents: 'auto' }}
>
<div className='flex min-w-0 flex-1 items-center gap-[10px]'>
<div
@@ -183,7 +221,7 @@ export const SubflowNodeComponent = memo(({ data, id, selected }: NodeProps<Subf
data-dragarea='true'
style={{
position: 'relative',
pointerEvents: isPreview ? 'none' : 'auto',
pointerEvents: 'none',
}}
>
{/* Subflow Start */}
@@ -233,12 +271,6 @@ export const SubflowNodeComponent = memo(({ data, id, selected }: NodeProps<Subf
}}
id={endHandleId}
/>
{hasRing && (
<div
className={cn('pointer-events-none absolute inset-0 z-40 rounded-[8px]', ringStyles)}
/>
)}
</div>
</div>
)
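The nested ternary above encodes a strict priority: selection state beats diff status, which beats last-run status. A simplified pure-function sketch of that ordering (it omits the snapshot-view `executionStatus` special case that swaps the success color, and the function name is illustrative):

```typescript
type DiffStatus = 'new' | 'edited' | undefined
type RunPath = 'success' | 'error' | undefined

// Sketch of the ring-priority logic: first matching state wins.
function outlineColorFor(
  selectedish: boolean, // focused, selected, or preview-selected
  diffStatus: DiffStatus,
  runPathStatus: RunPath
): string | undefined {
  if (selectedish) return 'var(--brand-secondary)'
  if (diffStatus === 'new') return 'var(--brand-tertiary-2)'
  if (diffStatus === 'edited') return 'var(--warning)'
  if (runPathStatus === 'success') return 'var(--border-success)'
  if (runPathStatus === 'error') return 'var(--text-error)'
  return undefined // no ring at all
}
```

Writing it as early returns makes the precedence auditable at a glance, which is easy to lose in a chained ternary.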


@@ -60,6 +60,7 @@ import { openCopilotWithMessage } from '@/stores/notifications/utils'
import type { ConsoleEntry } from '@/stores/terminal'
import { useTerminalConsoleStore, useTerminalStore } from '@/stores/terminal'
import { useWorkflowRegistry } from '@/stores/workflows/registry/store'
import { useWorkflowStore } from '@/stores/workflows/workflow/store'
/**
* Terminal height configuration constants
@@ -68,20 +69,21 @@ const MIN_HEIGHT = TERMINAL_HEIGHT.MIN
const DEFAULT_EXPANDED_HEIGHT = TERMINAL_HEIGHT.DEFAULT
const MIN_OUTPUT_PANEL_WIDTH_PX = OUTPUT_PANEL_WIDTH.MIN
/** Returns true if any node in the subtree has an error */
function hasErrorInTree(nodes: EntryNode[]): boolean {
return nodes.some((n) => Boolean(n.entry.error) || hasErrorInTree(n.children))
const MAX_TREE_DEPTH = 50
function hasMatchInTree(
nodes: EntryNode[],
predicate: (e: ConsoleEntry) => boolean,
depth = 0
): boolean {
if (depth >= MAX_TREE_DEPTH) return false
return nodes.some((n) => predicate(n.entry) || hasMatchInTree(n.children, predicate, depth + 1))
}
/** Returns true if any node in the subtree is currently running */
function hasRunningInTree(nodes: EntryNode[]): boolean {
return nodes.some((n) => Boolean(n.entry.isRunning) || hasRunningInTree(n.children))
}
/** Returns true if any node in the subtree was canceled */
function hasCanceledInTree(nodes: EntryNode[]): boolean {
return nodes.some((n) => Boolean(n.entry.isCanceled) || hasCanceledInTree(n.children))
}
const hasErrorInTree = (nodes: EntryNode[]) => hasMatchInTree(nodes, (e) => Boolean(e.error))
const hasRunningInTree = (nodes: EntryNode[]) => hasMatchInTree(nodes, (e) => Boolean(e.isRunning))
const hasCanceledInTree = (nodes: EntryNode[]) =>
hasMatchInTree(nodes, (e) => Boolean(e.isCanceled))
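The refactor above collapses three hand-rolled recursions into one depth-guarded search. A self-contained sketch with a minimal node shape (`TreeNode`/`Entry` are simplified stand-ins for the store's `EntryNode`/`ConsoleEntry`):

```typescript
interface Entry {
  error?: string
  isRunning?: boolean
}
interface TreeNode {
  entry: Entry
  children: TreeNode[]
}

const MAX_TREE_DEPTH = 50

// Generic predicate search over the entry tree; bails out past
// MAX_TREE_DEPTH so pathological nesting cannot blow the stack.
function hasMatchInTree(
  nodes: TreeNode[],
  predicate: (e: Entry) => boolean,
  depth = 0
): boolean {
  if (depth >= MAX_TREE_DEPTH) return false
  return nodes.some((n) => predicate(n.entry) || hasMatchInTree(n.children, predicate, depth + 1))
}

const tree: TreeNode[] = [
  { entry: {}, children: [{ entry: { error: 'boom' }, children: [] }] },
]
```

Each specific check then becomes a one-line predicate, as in the diff: `hasMatchInTree(tree, (e) => Boolean(e.error))`.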
/**
* Block row component for displaying actual block entries
@@ -263,28 +265,21 @@ const SubflowNodeRow = memo(function SubflowNodeRow({
}) {
const { entry, children } = node
const BlockIcon = getBlockIcon(entry.blockType)
const hasError =
Boolean(entry.error) ||
children.some((c) => c.entry.error || c.children.some((gc) => gc.entry.error))
const hasError = Boolean(entry.error) || hasErrorInTree(children)
const bgColor = getBlockColor(entry.blockType)
const nodeId = entry.id
const isExpanded = expandedNodes.has(nodeId)
const hasChildren = children.length > 0
// Check if any nested block is running or canceled
const hasRunningDescendant = children.some(
(c) => c.entry.isRunning || c.children.some((gc) => gc.entry.isRunning)
)
const hasCanceledDescendant =
children.some((c) => c.entry.isCanceled || c.children.some((gc) => gc.entry.isCanceled)) &&
!hasRunningDescendant
// Check if any nested block is running or canceled (recursive for arbitrary nesting depth)
const hasRunningDescendant = hasRunningInTree(children)
const hasCanceledDescendant = hasCanceledInTree(children) && !hasRunningDescendant
const displayName =
entry.blockType === 'loop'
? 'Loop'
: entry.blockType === 'parallel'
? 'Parallel'
: entry.blockName
const containerId = entry.iterationContainerId
const storeBlockName = useWorkflowStore((state) =>
containerId ? state.blocks[containerId]?.name : undefined
)
const displayName = storeBlockName || entry.blockName
return (
<div className='flex min-w-0 flex-col'>


@@ -0,0 +1,478 @@
/**
* @vitest-environment node
*/
import { describe, expect, it, vi } from 'vitest'
vi.mock('@/blocks', () => ({
getBlock: vi.fn().mockReturnValue(null),
}))
vi.mock('@/executor/constants', () => ({
isWorkflowBlockType: vi.fn((blockType: string | undefined) => {
return blockType === 'workflow' || blockType === 'workflow_input'
}),
}))
vi.mock('@/stores/constants', () => ({
TERMINAL_BLOCK_COLUMN_WIDTH: { MIN: 120, DEFAULT: 200, MAX: 400 },
}))
import type { ConsoleEntry } from '@/stores/terminal'
import { buildEntryTree, type EntryNode, groupEntriesByExecution } from './utils'
let entryCounter = 0
function makeEntry(overrides: Partial<ConsoleEntry>): ConsoleEntry {
return {
id: overrides.id ?? `entry-${++entryCounter}`,
timestamp: overrides.timestamp ?? '2025-01-01T00:00:00Z',
workflowId: overrides.workflowId ?? 'wf-1',
blockId: overrides.blockId ?? 'block-1',
blockName: overrides.blockName ?? 'Block',
blockType: overrides.blockType ?? 'function',
executionId: overrides.executionId ?? 'exec-1',
startedAt: overrides.startedAt ?? '2025-01-01T00:00:00Z',
executionOrder: overrides.executionOrder ?? 0,
...overrides,
} as ConsoleEntry
}
/** Collect all nodes from a tree depth-first */
function collectAllNodes(nodes: EntryNode[]): EntryNode[] {
const result: EntryNode[] = []
for (const node of nodes) {
result.push(node)
result.push(...collectAllNodes(node.children))
}
return result
}
/**
* Creates entries for a parallel-in-loop scenario.
* All Function 1 entries are nestedIterationEntries (have parentIterations).
* No topLevelIterationEntries exist (sentinels don't emit SSE events).
*/
function makeParallelInLoopEntries(
loopIterations: number,
parallelBranches: number
): ConsoleEntry[] {
const entries: ConsoleEntry[] = []
let order = 1
for (let loopIter = 0; loopIter < loopIterations; loopIter++) {
for (let branch = 0; branch < parallelBranches; branch++) {
entries.push(
makeEntry({
blockId: 'function-1',
blockName: 'Function 1',
executionOrder: order++,
startedAt: new Date(Date.UTC(2025, 0, 1, 0, 0, loopIter * 10 + branch)).toISOString(),
endedAt: new Date(Date.UTC(2025, 0, 1, 0, 0, loopIter * 10 + branch + 1)).toISOString(),
durationMs: 50,
iterationType: 'parallel',
iterationCurrent: branch,
iterationTotal: parallelBranches,
iterationContainerId: 'parallel-1',
parentIterations: [
{
iterationType: 'loop',
iterationCurrent: loopIter,
iterationTotal: loopIterations,
iterationContainerId: 'loop-1',
},
],
})
)
}
}
return entries
}
describe('buildEntryTree', () => {
describe('simple loop (no nesting)', () => {
it('groups entries by loop iteration', () => {
const entries: ConsoleEntry[] = []
for (let iter = 0; iter < 3; iter++) {
entries.push(
makeEntry({
blockId: 'function-1',
blockName: 'Function 1',
executionOrder: iter + 1,
iterationType: 'loop',
iterationCurrent: iter,
iterationTotal: 3,
iterationContainerId: 'loop-1',
})
)
}
const tree = buildEntryTree(entries)
const subflows = tree.filter((n) => n.nodeType === 'subflow')
expect(subflows).toHaveLength(1)
expect(subflows[0].entry.blockType).toBe('loop')
expect(subflows[0].children).toHaveLength(3)
for (let i = 0; i < 3; i++) {
expect(subflows[0].children[i].iterationInfo?.current).toBe(i)
expect(subflows[0].children[i].children).toHaveLength(1)
expect(subflows[0].children[i].children[0].entry.blockId).toBe('function-1')
}
})
})
describe('simple parallel (no nesting)', () => {
it('groups entries by parallel branch', () => {
const entries: ConsoleEntry[] = []
for (let branch = 0; branch < 4; branch++) {
entries.push(
makeEntry({
blockId: 'function-1',
blockName: 'Function 1',
executionOrder: branch + 1,
iterationType: 'parallel',
iterationCurrent: branch,
iterationTotal: 4,
iterationContainerId: 'parallel-1',
})
)
}
const tree = buildEntryTree(entries)
const subflows = tree.filter((n) => n.nodeType === 'subflow')
expect(subflows).toHaveLength(1)
expect(subflows[0].entry.blockType).toBe('parallel')
expect(subflows[0].children).toHaveLength(4)
})
})
describe('parallel-in-loop', () => {
it('creates all loop iterations (5 loop × 5 parallel)', () => {
const entries = makeParallelInLoopEntries(5, 5)
expect(entries).toHaveLength(25)
const tree = buildEntryTree(entries)
// Top level: 1 subflow (Loop)
const subflows = tree.filter((n) => n.nodeType === 'subflow')
expect(subflows).toHaveLength(1)
expect(subflows[0].entry.blockType).toBe('loop')
// Loop has 5 iteration children
const loopIterations = subflows[0].children
expect(loopIterations).toHaveLength(5)
for (let loopIter = 0; loopIter < 5; loopIter++) {
const iterNode = loopIterations[loopIter]
expect(iterNode.nodeType).toBe('iteration')
expect(iterNode.iterationInfo?.current).toBe(loopIter)
expect(iterNode.iterationInfo?.total).toBe(5)
// Each loop iteration has 1 nested subflow (Parallel)
const parallelSubflows = iterNode.children.filter((n) => n.nodeType === 'subflow')
expect(parallelSubflows).toHaveLength(1)
expect(parallelSubflows[0].entry.blockType).toBe('parallel')
// Each parallel has 5 branch iterations
const branches = parallelSubflows[0].children
expect(branches).toHaveLength(5)
for (let branch = 0; branch < 5; branch++) {
expect(branches[branch].iterationInfo?.current).toBe(branch)
expect(branches[branch].children).toHaveLength(1)
expect(branches[branch].children[0].entry.blockId).toBe('function-1')
}
}
})
it('preserves all block entries in the tree (no silently dropped entries)', () => {
const entries = makeParallelInLoopEntries(5, 5)
const tree = buildEntryTree(entries)
const allNodes = collectAllNodes(tree)
const blocks = allNodes.filter(
(n) => n.nodeType === 'block' && n.entry.blockId === 'function-1'
)
expect(blocks).toHaveLength(25)
})
it('works with a regular block alongside', () => {
const entries = [
makeEntry({
blockId: 'start-1',
blockName: 'Start',
blockType: 'starter',
executionOrder: 0,
}),
...makeParallelInLoopEntries(3, 2),
]
const tree = buildEntryTree(entries)
const regularBlocks = tree.filter((n) => n.nodeType === 'block')
const subflows = tree.filter((n) => n.nodeType === 'subflow')
expect(regularBlocks).toHaveLength(1)
expect(regularBlocks[0].entry.blockId).toBe('start-1')
expect(subflows).toHaveLength(1)
expect(subflows[0].children).toHaveLength(3)
})
it('works when some iterations also have topLevelIterationEntries', () => {
const entries: ConsoleEntry[] = [
// Real top-level entry for loop iteration 0 (from a container event)
makeEntry({
blockId: 'parallel-container',
blockName: 'Parallel',
blockType: 'parallel',
executionOrder: 100,
iterationType: 'loop',
iterationCurrent: 0,
iterationTotal: 3,
iterationContainerId: 'loop-1',
}),
...makeParallelInLoopEntries(3, 2),
]
const tree = buildEntryTree(entries)
const subflows = tree.filter((n) => n.nodeType === 'subflow')
expect(subflows).toHaveLength(1)
// All 3 loop iterations must exist
expect(subflows[0].children).toHaveLength(3)
// All 6 Function 1 blocks should appear somewhere in the tree
const allNodes = collectAllNodes(tree)
const fnBlocks = allNodes.filter(
(n) => n.nodeType === 'block' && n.entry.blockId === 'function-1'
)
expect(fnBlocks).toHaveLength(6)
})
it('handles 2 loop × 3 parallel', () => {
const entries = makeParallelInLoopEntries(2, 3)
const tree = buildEntryTree(entries)
const subflows = tree.filter((n) => n.nodeType === 'subflow')
expect(subflows).toHaveLength(1)
expect(subflows[0].children).toHaveLength(2)
const allNodes = collectAllNodes(tree)
const blocks = allNodes.filter(
(n) => n.nodeType === 'block' && n.entry.blockId === 'function-1'
)
expect(blocks).toHaveLength(6)
})
})
describe('loop-in-parallel', () => {
it('creates all parallel branches with nested loop iterations', () => {
const entries: ConsoleEntry[] = []
let order = 1
for (let branch = 0; branch < 3; branch++) {
for (let loopIter = 0; loopIter < 2; loopIter++) {
entries.push(
makeEntry({
blockId: 'function-1',
blockName: 'Function 1',
executionOrder: order++,
iterationType: 'loop',
iterationCurrent: loopIter,
iterationTotal: 2,
iterationContainerId: 'loop-1',
parentIterations: [
{
iterationType: 'parallel',
iterationCurrent: branch,
iterationTotal: 3,
iterationContainerId: 'parallel-1',
},
],
})
)
}
}
const tree = buildEntryTree(entries)
const subflows = tree.filter((n) => n.nodeType === 'subflow')
expect(subflows).toHaveLength(1)
expect(subflows[0].entry.blockType).toBe('parallel')
// 3 parallel branches
const branches = subflows[0].children
expect(branches).toHaveLength(3)
for (let branch = 0; branch < 3; branch++) {
const branchNode = branches[branch]
expect(branchNode.iterationInfo?.current).toBe(branch)
// Each branch has a nested loop subflow
const nestedSubflows = branchNode.children.filter((n) => n.nodeType === 'subflow')
expect(nestedSubflows).toHaveLength(1)
expect(nestedSubflows[0].entry.blockType).toBe('loop')
// Each loop has 2 iterations
expect(nestedSubflows[0].children).toHaveLength(2)
}
})
})
describe('loop-in-loop', () => {
it('creates outer and inner loop iterations', () => {
const entries: ConsoleEntry[] = []
let order = 1
for (let outer = 0; outer < 2; outer++) {
for (let inner = 0; inner < 3; inner++) {
entries.push(
makeEntry({
blockId: 'function-1',
blockName: 'Function 1',
executionOrder: order++,
iterationType: 'loop',
iterationCurrent: inner,
iterationTotal: 3,
iterationContainerId: 'inner-loop',
parentIterations: [
{
iterationType: 'loop',
iterationCurrent: outer,
iterationTotal: 2,
iterationContainerId: 'outer-loop',
},
],
})
)
}
}
const tree = buildEntryTree(entries)
const subflows = tree.filter((n) => n.nodeType === 'subflow')
expect(subflows).toHaveLength(1)
expect(subflows[0].entry.blockType).toBe('loop')
// Outer loop: 2 iterations
expect(subflows[0].children).toHaveLength(2)
for (let outer = 0; outer < 2; outer++) {
const outerIter = subflows[0].children[outer]
expect(outerIter.iterationInfo?.current).toBe(outer)
// Each outer iteration has an inner loop
const innerSubflows = outerIter.children.filter((n) => n.nodeType === 'subflow')
expect(innerSubflows).toHaveLength(1)
expect(innerSubflows[0].children).toHaveLength(3)
}
// All 6 blocks present
const allNodes = collectAllNodes(tree)
const blocks = allNodes.filter((n) => n.nodeType === 'block')
expect(blocks).toHaveLength(6)
})
})
describe('parallel-in-parallel', () => {
it('creates outer and inner parallel branches', () => {
const entries: ConsoleEntry[] = []
let order = 1
for (let outer = 0; outer < 2; outer++) {
for (let inner = 0; inner < 3; inner++) {
entries.push(
makeEntry({
blockId: 'function-1',
blockName: 'Function 1',
executionOrder: order++,
iterationType: 'parallel',
iterationCurrent: inner,
iterationTotal: 3,
iterationContainerId: 'inner-parallel',
parentIterations: [
{
iterationType: 'parallel',
iterationCurrent: outer,
iterationTotal: 2,
iterationContainerId: 'outer-parallel',
},
],
})
)
}
}
const tree = buildEntryTree(entries)
const subflows = tree.filter((n) => n.nodeType === 'subflow')
expect(subflows).toHaveLength(1)
expect(subflows[0].entry.blockType).toBe('parallel')
// 2 outer branches
expect(subflows[0].children).toHaveLength(2)
for (let outer = 0; outer < 2; outer++) {
const outerBranch = subflows[0].children[outer]
const innerSubflows = outerBranch.children.filter((n) => n.nodeType === 'subflow')
expect(innerSubflows).toHaveLength(1)
expect(innerSubflows[0].children).toHaveLength(3)
}
const allNodes = collectAllNodes(tree)
const blocks = allNodes.filter((n) => n.nodeType === 'block')
expect(blocks).toHaveLength(6)
})
})
})
describe('groupEntriesByExecution', () => {
it('builds tree for parallel-in-loop via groupEntriesByExecution', () => {
const entries = makeParallelInLoopEntries(3, 2)
const groups = groupEntriesByExecution(entries)
expect(groups).toHaveLength(1)
const entryTree = groups[0].entryTree
const subflows = entryTree.filter((n) => n.nodeType === 'subflow')
expect(subflows).toHaveLength(1)
expect(subflows[0].children).toHaveLength(3)
})
it('handles workflow child entries alongside iteration entries', () => {
const entries: ConsoleEntry[] = [
makeEntry({
id: 'start-entry',
blockId: 'start',
blockName: 'Start',
blockType: 'start_trigger',
executionOrder: 0,
}),
makeEntry({
id: 'workflow-block',
blockId: 'wf-block-1',
blockName: 'My Sub-Workflow',
blockType: 'workflow',
executionOrder: 1,
}),
makeEntry({
id: 'child-block-1',
blockId: 'child-func',
blockName: 'Child Function',
blockType: 'function',
executionOrder: 2,
childWorkflowBlockId: 'wf-block-1',
childWorkflowName: 'Child Workflow',
childWorkflowInstanceId: 'instance-1',
}),
]
const groups = groupEntriesByExecution(entries)
expect(groups).toHaveLength(1)
const tree = groups[0].entryTree
expect(tree.length).toBeGreaterThanOrEqual(2)
const startNode = tree.find((n) => n.entry.blockType === 'start_trigger')
expect(startNode).toBeDefined()
// Child entry should be nested under workflow block, not at top level
const topLevelChild = tree.find((n) => n.entry.blockId === 'child-func')
expect(topLevelChild).toBeUndefined()
})
})

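The tests above rely on iteration entries being bucketed by a composite key of iteration type, container ID, and iteration index, as `buildEntryTree` does internally. A minimal standalone sketch of that grouping step (the names `iterationKey` and `groupByIteration` are illustrative, not from the source):

```typescript
// Hypothetical sketch of the grouping key used when bucketing iteration
// entries: (iterationType, iterationContainerId, iterationCurrent).
interface IterEntry {
  iterationType: string
  iterationContainerId?: string
  iterationCurrent: number
}

function iterationKey(e: IterEntry): string {
  // Fall back to 'unknown' when the container ID is missing, mirroring the diff.
  const containerId = e.iterationContainerId ?? 'unknown'
  return `${e.iterationType}-${containerId}-${e.iterationCurrent}`
}

function groupByIteration<T extends IterEntry>(entries: T[]): Map<string, T[]> {
  const groups = new Map<string, T[]>()
  for (const e of entries) {
    const key = iterationKey(e)
    const bucket = groups.get(key)
    if (bucket) bucket.push(e)
    else groups.set(key, [e])
  }
  return groups
}
```

Entries from the same iteration of the same container collapse into one bucket, which is what lets the tree render one "Iteration N" node per group.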
@@ -18,10 +18,9 @@ import type { ConsoleEntry } from '@/stores/terminal'
const SUBFLOW_COLORS = {
loop: '#2FB3FF',
parallel: '#FEE12B',
workflow: '#8b5cf6',
} as const
const WORKFLOW_COLOR = '#8b5cf6'
/**
* Special block type colors for errors and system messages
*/
@@ -86,7 +85,7 @@ export function getBlockColor(blockType: string): string {
return SUBFLOW_COLORS.parallel
}
if (blockType === 'workflow') {
return WORKFLOW_COLOR
return SUBFLOW_COLORS.workflow
}
// Special block types for errors and system messages
if (blockType === 'error') {
@@ -126,14 +125,6 @@ export function isEventFromEditableElement(e: KeyboardEvent): boolean {
return false
}
/**
* Checks if a block type is a subflow (loop or parallel)
*/
export function isSubflowBlockType(blockType: string): boolean {
const lower = blockType?.toLowerCase() || ''
return lower === 'loop' || lower === 'parallel'
}
/**
* Node type for the tree structure
*/
@@ -221,27 +212,26 @@ function collectWorkflowDescendants(
* that executed within each iteration.
 * Sorts by execution order to ensure chronological order.
*/
function buildEntryTree(entries: ConsoleEntry[]): EntryNode[] {
// Separate entries into three buckets:
// 1. Iteration entries (loop/parallel children)
// 2. Workflow child entries (blocks inside a child workflow)
// 3. Regular blocks
export function buildEntryTree(entries: ConsoleEntry[], idPrefix = ''): EntryNode[] {
const regularBlocks: ConsoleEntry[] = []
const iterationEntries: ConsoleEntry[] = []
const topLevelIterationEntries: ConsoleEntry[] = []
const nestedIterationEntries: ConsoleEntry[] = []
const workflowChildEntries: ConsoleEntry[] = []
for (const entry of entries) {
if (entry.childWorkflowBlockId) {
// Child workflow entries take priority over iteration classification
workflowChildEntries.push(entry)
} else if (entry.iterationType && entry.iterationCurrent !== undefined) {
iterationEntries.push(entry)
if (entry.parentIterations && entry.parentIterations.length > 0) {
nestedIterationEntries.push(entry)
} else {
topLevelIterationEntries.push(entry)
}
} else {
regularBlocks.push(entry)
}
}
// Group workflow child entries by the parent workflow block ID
const workflowChildGroups = new Map<string, ConsoleEntry[]>()
for (const entry of workflowChildEntries) {
const parentId = entry.childWorkflowBlockId!
@@ -253,9 +243,8 @@ function buildEntryTree(entries: ConsoleEntry[]): EntryNode[] {
}
}
// Group iteration entries by (iterationType, iterationContainerId, iterationCurrent)
const iterationGroupsMap = new Map<string, IterationGroup>()
for (const entry of iterationEntries) {
for (const entry of topLevelIterationEntries) {
const iterationContainerId = entry.iterationContainerId || 'unknown'
const key = `${entry.iterationType}-${iterationContainerId}-${entry.iterationCurrent}`
let group = iterationGroupsMap.get(key)
@@ -272,11 +261,9 @@ function buildEntryTree(entries: ConsoleEntry[]): EntryNode[] {
}
iterationGroupsMap.set(key, group)
} else {
// Update start time to earliest
if (entryStartMs < group.startTimeMs) {
group.startTimeMs = entryStartMs
}
// Update total if available
if (entry.iterationTotal !== undefined) {
group.iterationTotal = entry.iterationTotal
}
@@ -284,12 +271,10 @@ function buildEntryTree(entries: ConsoleEntry[]): EntryNode[] {
group.blocks.push(entry)
}
// Sort blocks within each iteration by executionOrder ascending (oldest first, top-down)
for (const group of iterationGroupsMap.values()) {
group.blocks.sort((a, b) => a.executionOrder - b.executionOrder)
}
// Group iterations by (iterationType, iterationContainerId) to create subflow parents
const subflowGroups = new Map<
string,
{ iterationType: string; iterationContainerId: string; groups: IterationGroup[] }
@@ -308,112 +293,218 @@ function buildEntryTree(entries: ConsoleEntry[]): EntryNode[] {
subflowGroup.groups.push(group)
}
// Sort iterations within each subflow by iteration number
for (const subflowGroup of subflowGroups.values()) {
subflowGroup.groups.sort((a, b) => a.iterationCurrent - b.iterationCurrent)
}
// Build subflow nodes with iteration children
// Create synthetic parent subflow groups for orphaned nested iteration entries.
// Nested subflow containers (e.g., inner parallel inside outer parallel) may not
// have store entries if no block:started event was emitted for them. Without a
// parent subflow group, their child entries would be silently dropped from the tree.
// Check at the iteration level (not container level) so that existing iterations
// from topLevelIterationEntries don't block synthetic creation for other iterations
// of the same container (e.g., loop iterations 1-4 when iteration 0 already exists).
const syntheticIterations = new Map<string, IterationGroup>()
for (const entry of nestedIterationEntries) {
const parent = entry.parentIterations?.[0]
if (!parent?.iterationContainerId) {
continue
}
// Only skip if this specific iteration already has a group from topLevelIterationEntries
const iterKey = `${parent.iterationType}-${parent.iterationContainerId}-${parent.iterationCurrent}`
if (iterationGroupsMap.has(iterKey)) {
continue
}
const entryMs = new Date(entry.startedAt || entry.timestamp).getTime()
if (!syntheticIterations.has(iterKey)) {
syntheticIterations.set(iterKey, {
iterationType: parent.iterationType!,
iterationContainerId: parent.iterationContainerId!,
iterationCurrent: parent.iterationCurrent!,
iterationTotal: parent.iterationTotal,
blocks: [],
startTimeMs: entryMs,
})
} else {
const existing = syntheticIterations.get(iterKey)!
if (entryMs < existing.startTimeMs) {
existing.startTimeMs = entryMs
}
}
}
const syntheticSubflows = new Map<
string,
{ iterationType: string; iterationContainerId: string; groups: IterationGroup[] }
>()
for (const iterGroup of syntheticIterations.values()) {
const subflowKey = `${iterGroup.iterationType}-${iterGroup.iterationContainerId}`
let subflow = syntheticSubflows.get(subflowKey)
if (!subflow) {
subflow = {
iterationType: iterGroup.iterationType,
iterationContainerId: iterGroup.iterationContainerId,
groups: [],
}
syntheticSubflows.set(subflowKey, subflow)
}
subflow.groups.push(iterGroup)
}
for (const subflow of syntheticSubflows.values()) {
const key = `${subflow.iterationType}-${subflow.iterationContainerId}`
const existing = subflowGroups.get(key)
if (existing) {
// Merge synthetic iteration groups into the existing subflow group
existing.groups.push(...subflow.groups)
existing.groups.sort((a, b) => a.iterationCurrent - b.iterationCurrent)
} else {
subflow.groups.sort((a, b) => a.iterationCurrent - b.iterationCurrent)
subflowGroups.set(key, subflow)
}
}
const subflowNodes: EntryNode[] = []
for (const subflowGroup of subflowGroups.values()) {
const { iterationType, iterationContainerId, groups: iterationGroups } = subflowGroup
// Calculate subflow timing from all its iterations
const firstIteration = iterationGroups[0]
const allBlocks = iterationGroups.flatMap((g) => g.blocks)
const subflowStartMs = Math.min(
...allBlocks.map((b) => new Date(b.startedAt || b.timestamp).getTime())
)
const nestedForThisSubflow = nestedIterationEntries.filter((e) => {
const parent = e.parentIterations?.[0]
return parent && parent.iterationContainerId === iterationContainerId
})
const allDirectBlocks = iterationGroups.flatMap((g) => g.blocks)
const allRelevantBlocks = [...allDirectBlocks, ...nestedForThisSubflow]
if (allRelevantBlocks.length === 0) continue
const timestamps = allRelevantBlocks.map((b) => new Date(b.startedAt || b.timestamp).getTime())
const subflowStartMs = Math.min(...timestamps)
const subflowEndMs = Math.max(
...allBlocks.map((b) => new Date(b.endedAt || b.timestamp).getTime())
...allRelevantBlocks.map((b) => new Date(b.endedAt || b.timestamp).getTime())
)
const totalDuration = allBlocks.reduce((sum, b) => sum + (b.durationMs || 0), 0)
// Parallel branches run concurrently — use wall-clock time. Loop iterations run serially — use sum.
const totalDuration = allRelevantBlocks.reduce((sum, b) => sum + (b.durationMs || 0), 0)
const subflowDuration =
iterationType === 'parallel' ? subflowEndMs - subflowStartMs : totalDuration
// Create synthetic subflow parent entry
// Use the minimum executionOrder from all child blocks for proper ordering
const subflowExecutionOrder = Math.min(...allBlocks.map((b) => b.executionOrder))
const subflowExecutionOrder = Math.min(...allRelevantBlocks.map((b) => b.executionOrder))
const metadataSource = allRelevantBlocks[0]
const syntheticSubflow: ConsoleEntry = {
id: `subflow-${iterationType}-${iterationContainerId}-${firstIteration.blocks[0]?.executionId || 'unknown'}`,
id: `${idPrefix}subflow-${iterationType}-${iterationContainerId}-${metadataSource.executionId || 'unknown'}`,
timestamp: new Date(subflowStartMs).toISOString(),
workflowId: firstIteration.blocks[0]?.workflowId || '',
workflowId: metadataSource.workflowId || '',
blockId: `${iterationType}-container-${iterationContainerId}`,
blockName: iterationType.charAt(0).toUpperCase() + iterationType.slice(1),
blockType: iterationType,
executionId: firstIteration.blocks[0]?.executionId,
executionId: metadataSource.executionId,
startedAt: new Date(subflowStartMs).toISOString(),
executionOrder: subflowExecutionOrder,
endedAt: new Date(subflowEndMs).toISOString(),
durationMs: subflowDuration,
success: !allBlocks.some((b) => b.error),
success: !allRelevantBlocks.some((b) => b.error),
iterationContainerId,
}
// Build iteration child nodes
const iterationNodes: EntryNode[] = iterationGroups.map((iterGroup) => {
// Create synthetic iteration entry
const iterBlocks = iterGroup.blocks
const iterStartMs = Math.min(
...iterBlocks.map((b) => new Date(b.startedAt || b.timestamp).getTime())
)
const iterEndMs = Math.max(
...iterBlocks.map((b) => new Date(b.endedAt || b.timestamp).getTime())
)
const iterDuration = iterBlocks.reduce((sum, b) => sum + (b.durationMs || 0), 0)
// Parallel branches run concurrently — use wall-clock time. Loop iterations run serially — use sum.
const iterDisplayDuration =
iterationType === 'parallel' ? iterEndMs - iterStartMs : iterDuration
const iterationNodes: EntryNode[] = iterationGroups
.map((iterGroup): EntryNode | null => {
const matchingNestedEntries = nestedForThisSubflow.filter((e) => {
const parent = e.parentIterations?.[0]
return parent?.iterationCurrent === iterGroup.iterationCurrent
})
// Use the minimum executionOrder from blocks in this iteration
const iterExecutionOrder = Math.min(...iterBlocks.map((b) => b.executionOrder))
const syntheticIteration: ConsoleEntry = {
id: `iteration-${iterationType}-${iterGroup.iterationContainerId}-${iterGroup.iterationCurrent}-${iterBlocks[0]?.executionId || 'unknown'}`,
timestamp: new Date(iterStartMs).toISOString(),
workflowId: iterBlocks[0]?.workflowId || '',
blockId: `iteration-${iterGroup.iterationContainerId}-${iterGroup.iterationCurrent}`,
blockName: `Iteration ${iterGroup.iterationCurrent}${iterGroup.iterationTotal !== undefined ? ` / ${iterGroup.iterationTotal}` : ''}`,
blockType: iterationType,
executionId: iterBlocks[0]?.executionId,
startedAt: new Date(iterStartMs).toISOString(),
executionOrder: iterExecutionOrder,
endedAt: new Date(iterEndMs).toISOString(),
durationMs: iterDisplayDuration,
success: !iterBlocks.some((b) => b.error),
iterationCurrent: iterGroup.iterationCurrent,
iterationTotal: iterGroup.iterationTotal,
iterationType: iterationType as 'loop' | 'parallel',
iterationContainerId: iterGroup.iterationContainerId,
}
const strippedNestedEntries: ConsoleEntry[] = matchingNestedEntries.map((e) => ({
...e,
parentIterations:
e.parentIterations && e.parentIterations.length > 1
? e.parentIterations.slice(1)
: undefined,
}))
// Block nodes within this iteration — workflow blocks get their full subtree
const blockNodes: EntryNode[] = iterBlocks.map((block) => {
if (isWorkflowBlockType(block.blockType)) {
const instanceKey = block.childWorkflowInstanceId ?? block.blockId
const allDescendants = collectWorkflowDescendants(instanceKey, workflowChildGroups)
const rawChildren = allDescendants.map((c) => ({
...c,
childWorkflowBlockId:
c.childWorkflowBlockId === instanceKey ? undefined : c.childWorkflowBlockId,
}))
return {
entry: block,
children: buildEntryTree(rawChildren),
nodeType: 'workflow' as const,
}
const iterBlocks = iterGroup.blocks
const allIterEntries = [...iterBlocks, ...strippedNestedEntries]
if (allIterEntries.length === 0) return null
const iterStartMs = Math.min(
...allIterEntries.map((b) => new Date(b.startedAt || b.timestamp).getTime())
)
const iterEndMs = Math.max(
...allIterEntries.map((b) => new Date(b.endedAt || b.timestamp).getTime())
)
const iterDuration = allIterEntries.reduce((sum, b) => sum + (b.durationMs || 0), 0)
const iterDisplayDuration =
iterationType === 'parallel' ? iterEndMs - iterStartMs : iterDuration
const iterExecutionOrder = Math.min(...allIterEntries.map((b) => b.executionOrder))
const iterMetadataSource = allIterEntries[0]
const syntheticIteration: ConsoleEntry = {
id: `${idPrefix}iteration-${iterationType}-${iterGroup.iterationContainerId}-${iterGroup.iterationCurrent}-${iterMetadataSource.executionId || 'unknown'}`,
timestamp: new Date(iterStartMs).toISOString(),
workflowId: iterMetadataSource.workflowId || '',
blockId: `iteration-${iterGroup.iterationContainerId}-${iterGroup.iterationCurrent}`,
blockName: `Iteration ${iterGroup.iterationCurrent}${iterGroup.iterationTotal !== undefined ? ` / ${iterGroup.iterationTotal}` : ''}`,
blockType: iterationType,
executionId: iterMetadataSource.executionId,
startedAt: new Date(iterStartMs).toISOString(),
executionOrder: iterExecutionOrder,
endedAt: new Date(iterEndMs).toISOString(),
durationMs: iterDisplayDuration,
success: !allIterEntries.some((b) => b.error),
iterationCurrent: iterGroup.iterationCurrent,
iterationTotal: iterGroup.iterationTotal,
iterationType: iterationType as 'loop' | 'parallel',
iterationContainerId: iterGroup.iterationContainerId,
}
return { entry: block, children: [], nodeType: 'block' as const }
})
return {
entry: syntheticIteration,
children: blockNodes,
nodeType: 'iteration' as const,
iterationInfo: {
current: iterGroup.iterationCurrent,
total: iterGroup.iterationTotal,
},
}
})
const childPrefix = `${idPrefix}${iterationContainerId}-${iterGroup.iterationCurrent}-`
const nestedSubflowNodes =
strippedNestedEntries.length > 0 ? buildEntryTree(strippedNestedEntries, childPrefix) : []
// Filter out container completion events when matching nested subflow nodes exist,
// to avoid duplicating them as both a flat block row and an expandable subflow.
const hasNestedSubflows = nestedSubflowNodes.length > 0
const blockNodes: EntryNode[] = iterBlocks
.filter((block) => {
if (
hasNestedSubflows &&
(block.blockType === 'loop' || block.blockType === 'parallel')
) {
return false
}
return true
})
.map((block) => {
if (isWorkflowBlockType(block.blockType)) {
const instanceKey = block.childWorkflowInstanceId ?? block.blockId
const allDescendants = collectWorkflowDescendants(instanceKey, workflowChildGroups)
const rawChildren = allDescendants.map((c) => ({
...c,
childWorkflowBlockId:
c.childWorkflowBlockId === instanceKey ? undefined : c.childWorkflowBlockId,
}))
return {
entry: block,
children: buildEntryTree(rawChildren),
nodeType: 'workflow' as const,
}
}
return { entry: block, children: [], nodeType: 'block' as const }
})
const allChildren = [...blockNodes, ...nestedSubflowNodes]
allChildren.sort((a, b) => a.entry.executionOrder - b.entry.executionOrder)
return {
entry: syntheticIteration,
children: allChildren,
nodeType: 'iteration' as const,
iterationInfo: {
current: iterGroup.iterationCurrent,
total: iterGroup.iterationTotal,
},
}
})
.filter((node): node is EntryNode => node !== null)
subflowNodes.push({
entry: syntheticSubflow,
@@ -422,7 +513,6 @@ function buildEntryTree(entries: ConsoleEntry[]): EntryNode[] {
})
}
// Build workflow nodes for regular blocks that are workflow block types
const workflowNodes: EntryNode[] = []
const remainingRegularBlocks: ConsoleEntry[] = []
@@ -442,16 +532,15 @@ function buildEntryTree(entries: ConsoleEntry[]): EntryNode[] {
}
}
// Build nodes for remaining regular blocks
const regularNodes: EntryNode[] = remainingRegularBlocks.map((entry) => ({
entry,
children: [],
nodeType: 'block' as const,
}))
// Combine all nodes and sort by executionOrder ascending (oldest first, top-down)
const allNodes = [...subflowNodes, ...workflowNodes, ...regularNodes]
allNodes.sort((a, b) => a.entry.executionOrder - b.entry.executionOrder)
return allNodes
}

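The duration rule applied twice in the diff (parallel branches use wall-clock span, loop iterations sum their durations) can be isolated as a small helper. This is a sketch under assumed types, not the source's API; `subflowDuration` and `TimedBlock` are hypothetical names:

```typescript
// Parallel branches run concurrently, so the meaningful display duration is
// the wall-clock span (max end - min start). Loop iterations run serially,
// so the display duration is the sum of per-block durations.
interface TimedBlock {
  startMs: number
  endMs: number
  durationMs: number
}

function subflowDuration(type: 'loop' | 'parallel', blocks: TimedBlock[]): number {
  if (blocks.length === 0) return 0
  if (type === 'parallel') {
    const start = Math.min(...blocks.map((b) => b.startMs))
    const end = Math.max(...blocks.map((b) => b.endMs))
    return end - start
  }
  return blocks.reduce((sum, b) => sum + b.durationMs, 0)
}
```

For two overlapping branches this yields the span of the longest branch, while the same blocks treated as loop iterations yield the total of both.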
@@ -81,16 +81,43 @@ export function useNodeUtilities(blocks: Record<string, any>) {
* @returns Array of node IDs representing the hierarchy path
*/
const getNodeHierarchy = useCallback(
(nodeId: string): string[] => {
(nodeId: string, maxDepth = 100): string[] => {
const node = getNodes().find((n) => n.id === nodeId)
if (!node) return [nodeId]
if (!node || maxDepth <= 0) return [nodeId]
const parentId = blocks?.[nodeId]?.data?.parentId
if (!parentId) return [nodeId]
return [...getNodeHierarchy(parentId), nodeId]
return [...getNodeHierarchy(parentId, maxDepth - 1), nodeId]
},
[getNodes, blocks]
)
/**
* Returns true if nodeId is in the subtree of ancestorId (i.e. walking from nodeId
* up the parentId chain we reach ancestorId). Used to reject parent assignments that
* would create a cycle (e.g. setting dragged node's parent to a container inside it).
*
* @param ancestorId - Node that might be an ancestor
* @param nodeId - Node to walk from (upward)
* @returns True if ancestorId appears in the parent chain of nodeId
*/
const isDescendantOf = useCallback(
(ancestorId: string, nodeId: string): boolean => {
const visited = new Set<string>()
const maxDepth = 100
let currentId: string | undefined = nodeId
let depth = 0
while (currentId && depth < maxDepth) {
if (currentId === ancestorId) return true
if (visited.has(currentId)) return false
visited.add(currentId)
currentId = blocks?.[currentId]?.data?.parentId
depth += 1
}
return false
},
[blocks]
)
/**
* Gets the absolute position of a node (accounting for nested parents).
* For nodes inside containers, accounts for header and padding offsets.
@@ -379,6 +406,7 @@ export function useNodeUtilities(blocks: Record<string, any>) {
return {
getNodeDepth,
getNodeHierarchy,
isDescendantOf,
getNodeAbsolutePosition,
calculateRelativePosition,
isPointInLoopNode,

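The `isDescendantOf` guard above walks the `parentId` chain with both a visited set and a depth cap. A self-contained version of the same walk, assuming parent links flattened into a plain record (a simplification of the store's `blocks[id].data.parentId` shape):

```typescript
// Standalone sketch of the cycle guard: walk upward from nodeId via parent
// links; if we reach ancestorId, nodeId is inside ancestorId's subtree.
function isDescendantOfSketch(
  parents: Record<string, string | undefined>,
  ancestorId: string,
  nodeId: string,
  maxDepth = 100
): boolean {
  const visited = new Set<string>()
  let current: string | undefined = nodeId
  let depth = 0
  while (current && depth < maxDepth) {
    if (current === ancestorId) return true
    // Defensive: bail out if the parent data itself already contains a cycle.
    if (visited.has(current)) return false
    visited.add(current)
    current = parents[current]
    depth += 1
  }
  return false
}
```

Rejecting a drop when `isDescendantOf(draggedNode.id, targetContainerId)` is true prevents a container from being re-parented into its own subtree.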
@@ -32,6 +32,7 @@ import type {
} from '@/executor/types'
import { hasExecutionResult } from '@/executor/utils/errors'
import { coerceValue } from '@/executor/utils/start-block'
import { stripCloneSuffixes } from '@/executor/utils/subflow-utils'
import { subscriptionKeys } from '@/hooks/queries/subscription'
import { useExecutionStream } from '@/hooks/use-execution-stream'
import { WorkflowValidationError } from '@/serializer'
@@ -347,6 +348,22 @@ export function useWorkflowExecution() {
return blockType === 'loop' || blockType === 'parallel'
}
/** Extracts iteration and child-workflow fields shared across console entry call sites. */
const extractIterationFields = (
data: BlockStartedData | BlockCompletedData | BlockErrorData
) => ({
iterationCurrent: data.iterationCurrent,
iterationTotal: data.iterationTotal,
iterationType: data.iterationType,
iterationContainerId: data.iterationContainerId,
parentIterations: data.parentIterations,
childWorkflowBlockId: data.childWorkflowBlockId,
childWorkflowName: data.childWorkflowName,
...('childWorkflowInstanceId' in data && {
childWorkflowInstanceId: data.childWorkflowInstanceId,
}),
})
const createBlockLogEntry = (
data: BlockCompletedData | BlockErrorData,
options: { success: boolean; output?: unknown; error?: string }
@@ -379,13 +396,7 @@ export function useWorkflowExecution() {
executionId: executionIdRef.current,
blockName: data.blockName || 'Unknown Block',
blockType: data.blockType || 'unknown',
iterationCurrent: data.iterationCurrent,
iterationTotal: data.iterationTotal,
iterationType: data.iterationType,
iterationContainerId: data.iterationContainerId,
childWorkflowBlockId: data.childWorkflowBlockId,
childWorkflowName: data.childWorkflowName,
childWorkflowInstanceId: data.childWorkflowInstanceId,
...extractIterationFields(data),
})
}
@@ -405,13 +416,7 @@ export function useWorkflowExecution() {
executionId: executionIdRef.current,
blockName: data.blockName || 'Unknown Block',
blockType: data.blockType || 'unknown',
iterationCurrent: data.iterationCurrent,
iterationTotal: data.iterationTotal,
iterationType: data.iterationType,
iterationContainerId: data.iterationContainerId,
childWorkflowBlockId: data.childWorkflowBlockId,
childWorkflowName: data.childWorkflowName,
childWorkflowInstanceId: data.childWorkflowInstanceId,
...extractIterationFields(data),
})
}
@@ -427,13 +432,7 @@ export function useWorkflowExecution() {
startedAt: data.startedAt,
endedAt: data.endedAt,
isRunning: false,
iterationCurrent: data.iterationCurrent,
iterationTotal: data.iterationTotal,
iterationType: data.iterationType,
iterationContainerId: data.iterationContainerId,
childWorkflowBlockId: data.childWorkflowBlockId,
childWorkflowName: data.childWorkflowName,
childWorkflowInstanceId: data.childWorkflowInstanceId,
...extractIterationFields(data),
},
executionIdRef.current
)
@@ -452,13 +451,7 @@ export function useWorkflowExecution() {
startedAt: data.startedAt,
endedAt: data.endedAt,
isRunning: false,
iterationCurrent: data.iterationCurrent,
iterationTotal: data.iterationTotal,
iterationType: data.iterationType,
iterationContainerId: data.iterationContainerId,
childWorkflowBlockId: data.childWorkflowBlockId,
childWorkflowName: data.childWorkflowName,
childWorkflowInstanceId: data.childWorkflowInstanceId,
...extractIterationFields(data),
},
executionIdRef.current
)
@@ -486,12 +479,7 @@ export function useWorkflowExecution() {
blockName: data.blockName || 'Unknown Block',
blockType: data.blockType || 'unknown',
isRunning: true,
iterationCurrent: data.iterationCurrent,
iterationTotal: data.iterationTotal,
iterationType: data.iterationType,
iterationContainerId: data.iterationContainerId,
childWorkflowBlockId: data.childWorkflowBlockId,
childWorkflowName: data.childWorkflowName,
...extractIterationFields(data),
})
}
@@ -499,7 +487,6 @@ export function useWorkflowExecution() {
if (isStaleExecution()) return
updateActiveBlocks(data.blockId, false)
if (workflowId) setBlockRunStatus(workflowId, data.blockId, 'success')
executedBlockIds.add(data.blockId)
accumulatedBlockStates.set(data.blockId, {
output: data.output,
@@ -507,7 +494,17 @@ export function useWorkflowExecution() {
executionTime: data.durationMs,
})
// For nested containers, the SSE blockId may be a cloned ID (e.g. P1__obranch-0).
// Also record the original workflow-level ID so the canvas can highlight it.
if (isContainerBlockType(data.blockType)) {
const originalId = stripCloneSuffixes(data.blockId)
if (originalId !== data.blockId) {
executedBlockIds.add(originalId)
if (workflowId) setBlockRunStatus(workflowId, originalId, 'success')
}
}
if (isContainerBlockType(data.blockType) && !data.iterationContainerId) {
return
}
@@ -538,6 +535,15 @@ export function useWorkflowExecution() {
executionTime: data.durationMs || 0,
})
// For nested containers, also record the original workflow-level ID
if (isContainerBlockType(data.blockType)) {
const originalId = stripCloneSuffixes(data.blockId)
if (originalId !== data.blockId) {
executedBlockIds.add(originalId)
if (workflowId) setBlockRunStatus(workflowId, originalId, 'error')
}
}
accumulatedBlockLogs.push(
createBlockLogEntry(data, { success: false, output: {}, error: data.error })
)
@@ -745,7 +751,7 @@ export function useWorkflowExecution() {
const stream = new ReadableStream({
async start(controller) {
const { encodeSSE } = await import('@/lib/core/utils/sse')
const streamedContent = new Map<string, string>()
const streamedChunks = new Map<string, string[]>()
const streamReadingPromises: Promise<void>[] = []
const safeEnqueue = (data: Uint8Array) => {
@@ -845,8 +851,8 @@ export function useWorkflowExecution() {
const reader = streamingExecution.stream.getReader()
const blockId = (streamingExecution.execution as any)?.blockId
if (blockId && !streamedContent.has(blockId)) {
streamedContent.set(blockId, '')
if (blockId && !streamedChunks.has(blockId)) {
streamedChunks.set(blockId, [])
}
try {
@@ -860,13 +866,13 @@ export function useWorkflowExecution() {
}
const chunk = new TextDecoder().decode(value)
if (blockId) {
streamedContent.set(blockId, (streamedContent.get(blockId) || '') + chunk)
streamedChunks.get(blockId)!.push(chunk)
}
let chunkToSend = chunk
if (blockId && !processedFirstChunk.has(blockId)) {
processedFirstChunk.add(blockId)
if (streamedContent.size > 1) {
if (streamedChunks.size > 1) {
chunkToSend = `\n\n${chunk}`
}
}
@@ -884,7 +890,7 @@ export function useWorkflowExecution() {
// Handle non-streaming blocks (like Function blocks)
const onBlockComplete = async (blockId: string, output: any) => {
// Skip if this block already had streaming content (avoid duplicates)
if (streamedContent.has(blockId)) {
if (streamedChunks.has(blockId)) {
logger.debug('[handleRunWorkflow] Skipping onBlockComplete for streaming block', {
blockId,
})
@@ -921,13 +927,13 @@ export function useWorkflowExecution() {
: JSON.stringify(outputValue, null, 2)
// Add separator if this isn't the first output
const separator = streamedContent.size > 0 ? '\n\n' : ''
const separator = streamedChunks.size > 0 ? '\n\n' : ''
// Send the non-streaming block output as a chunk
safeEnqueue(encodeSSE({ blockId, chunk: separator + formattedOutput }))
// Track that we've sent output for this block
streamedContent.set(blockId, formattedOutput)
streamedChunks.set(blockId, [formattedOutput])
}
}
}
@@ -969,6 +975,12 @@ export function useWorkflowExecution() {
})
}
// Resolve chunks to final strings for consumption
const streamedContent = new Map<string, string>()
for (const [id, chunks] of streamedChunks) {
streamedContent.set(id, chunks.join(''))
}
// Update streamed content and apply tokenization
if (result.logs) {
result.logs.forEach((log: BlockLog) => {
@@ -1316,7 +1328,7 @@ export function useWorkflowExecution() {
const activeBlocksSet = new Set<string>()
const activeBlockRefCounts = new Map<string, number>()
const streamedContent = new Map<string, string>()
const streamedChunks = new Map<string, string[]>()
const accumulatedBlockLogs: BlockLog[] = []
const accumulatedBlockStates = new Map<string, BlockState>()
const executedBlockIds = new Set<string>()
@@ -1374,8 +1386,10 @@ export function useWorkflowExecution() {
onBlockChildWorkflowStarted: blockHandlers.onBlockChildWorkflowStarted,
onStreamChunk: (data) => {
const existing = streamedContent.get(data.blockId) || ''
streamedContent.set(data.blockId, existing + data.chunk)
if (!streamedChunks.has(data.blockId)) {
streamedChunks.set(data.blockId, [])
}
streamedChunks.get(data.blockId)!.push(data.chunk)
// Call onStream callback if provided (create a fake StreamingExecution)
if (onStream && isExecutingFromChat) {
@@ -1390,7 +1404,7 @@ export function useWorkflowExecution() {
stream,
execution: {
success: true,
output: { content: existing + data.chunk },
output: { content: '' },
blockId: data.blockId,
} as any,
}

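The `streamedContent` → `streamedChunks` change above swaps per-chunk string concatenation (quadratic in total stream length, since each append copies the whole accumulated string) for an array of chunks joined once at the end. A minimal sketch of that pattern; `collectChunks` is an illustrative name, not an export of the source:

```typescript
// Accumulate streaming chunks per block ID in arrays, then join once when the
// final strings are needed. push() is O(1) amortized; resolve() is O(total).
function collectChunks() {
  const chunks = new Map<string, string[]>()
  return {
    push(id: string, chunk: string): void {
      if (!chunks.has(id)) chunks.set(id, [])
      chunks.get(id)!.push(chunk)
    },
    resolve(): Map<string, string> {
      const out = new Map<string, string>()
      for (const [id, parts] of chunks) out.set(id, parts.join(''))
      return out
    },
  }
}
```

This mirrors the diff's final step, where `streamedChunks` is resolved into a `streamedContent` map only after streaming completes.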
@@ -4,6 +4,39 @@ import { TriggerUtils } from '@/lib/workflows/triggers/triggers'
import { clampPositionToContainer } from '@/app/workspace/[workspaceId]/w/[workflowId]/utils/node-position-utils'
import type { BlockState } from '@/stores/workflows/workflow/types'
/**
* Collects all descendant block IDs for container blocks (loop/parallel) in the given set.
* Used to treat a nested subflow as one unit when computing boundary edges (e.g. remove-from-subflow).
*
* @param blockIds - Root block IDs (e.g. the blocks being removed from subflow)
* @param blocks - All workflow blocks
* @returns IDs of blocks that are descendants of any container in blockIds (excluding the roots)
*/
export function getDescendantBlockIds(
blockIds: string[],
blocks: Record<string, BlockState>
): string[] {
const current = new Set(blockIds)
const added: string[] = []
const toProcess = [...blockIds]
while (toProcess.length > 0) {
const id = toProcess.pop()!
const block = blocks[id]
if (block?.type !== 'loop' && block?.type !== 'parallel') continue
for (const [bid, b] of Object.entries(blocks)) {
if (b?.data?.parentId === id && !current.has(bid)) {
current.add(bid)
added.push(bid)
toProcess.push(bid)
}
}
}
return added
}
/**
* Checks if the currently focused element is an editable input.
* Returns true if the user is typing in an input, textarea, or contenteditable element.

@@ -57,6 +57,7 @@ import {
estimateBlockDimensions,
filterProtectedBlocks,
getClampedPositionForNode,
getDescendantBlockIds,
getWorkflowLockToggleIds,
isBlockProtected,
isEdgeProtected,
@@ -197,7 +198,7 @@ const defaultEdgeOptions = { type: 'custom' }
const reactFlowStyles = [
'bg-[var(--bg)]',
'[&_.react-flow__edges]:!z-0',
'[&_.react-flow__node]:!z-[21]',
'[&_.react-flow__node]:z-[21]',
'[&_.react-flow__handle]:!z-[30]',
'[&_.react-flow__edge-labels]:!z-[60]',
'[&_.react-flow__pane]:!bg-[var(--bg)]',
@@ -416,6 +417,7 @@ const WorkflowContent = React.memo(() => {
const {
getNodeDepth,
getNodeAbsolutePosition,
isDescendantOf,
calculateRelativePosition,
isPointInLoopNode,
resizeLoopNodes,
@@ -432,7 +434,6 @@ const WorkflowContent = React.memo(() => {
const canNodeEnterContainer = useCallback(
(node: Node): boolean => {
if (node.data?.type === 'starter') return false
if (node.type === 'subflowNode') return false
const block = blocks[node.id]
return !(block && TriggerUtils.isTriggerBlock(block))
},
@@ -681,10 +682,15 @@ const WorkflowContent = React.memo(() => {
if (nodesNeedingUpdate.length === 0) return
// Filter out nodes that cannot enter containers (when target is a container)
const validNodes = targetParentId
let validNodes = targetParentId
? nodesNeedingUpdate.filter(canNodeEnterContainer)
: nodesNeedingUpdate
// Exclude nodes that would create a cycle (moving a container into one of its descendants)
if (targetParentId) {
validNodes = validNodes.filter((n) => !isDescendantOf(n.id, targetParentId))
}
if (validNodes.length === 0) return
// Find boundary edges (edges that cross the container boundary)
@@ -744,6 +750,7 @@ const WorkflowContent = React.memo(() => {
blocks,
edgesForDisplay,
canNodeEnterContainer,
isDescendantOf,
calculateRelativePosition,
getNodeAbsolutePosition,
shiftUpdatesToContainerBounds,
@@ -1014,12 +1021,22 @@ const WorkflowContent = React.memo(() => {
return
}
// Check if any pasted block is a subflow - subflows cannot be nested
const hasSubflow = pastedBlocksArray.some((b) => b.type === 'loop' || b.type === 'parallel')
if (hasSubflow) {
// Prevent cycle: pasting a container that is the target container itself or one of its ancestors.
// Use original clipboard IDs since preparePasteData regenerates them via uuidv4().
const ancestorIds = new Set<string>()
let walkId: string | undefined = targetContainer.loopId
while (walkId && !ancestorIds.has(walkId)) {
ancestorIds.add(walkId)
walkId = blocks[walkId]?.data?.parentId as string | undefined
}
const originalClipboardBlocks = clipboard ? Object.values(clipboard.blocks) : []
const wouldCreateCycle = originalClipboardBlocks.some(
(b) => (b.type === 'loop' || b.type === 'parallel') && ancestorIds.has(b.id)
)
if (wouldCreateCycle) {
addNotification({
level: 'error',
message: 'Subflows cannot be nested inside other subflows.',
message: 'Cannot paste a subflow inside itself or its own descendant.',
workflowId: activeWorkflowId || undefined,
})
return
@@ -1702,31 +1719,75 @@ const WorkflowContent = React.memo(() => {
const containerInfo = isPointInLoopNode(position)
clearDragHighlights()
document.body.classList.remove('sim-drag-subflow')
if (data.type === 'loop' || data.type === 'parallel') {
const id = crypto.randomUUID()
const baseName = data.type === 'loop' ? 'Loop' : 'Parallel'
const name = getUniqueBlockName(baseName, blocks)
const autoConnectEdge = tryCreateAutoConnectEdge(position, id, {
targetParentId: null,
})
if (containerInfo) {
const rawPosition = {
x: position.x - containerInfo.loopPosition.x,
y: position.y - containerInfo.loopPosition.y,
}
addBlock(
id,
data.type,
name,
position,
{
width: CONTAINER_DIMENSIONS.DEFAULT_WIDTH,
height: CONTAINER_DIMENSIONS.DEFAULT_HEIGHT,
type: 'subflowNode',
},
undefined,
undefined,
autoConnectEdge
)
const relativePosition = clampPositionToContainer(
rawPosition,
containerInfo.dimensions,
{
width: CONTAINER_DIMENSIONS.DEFAULT_WIDTH,
height: CONTAINER_DIMENSIONS.DEFAULT_HEIGHT,
}
)
const existingChildBlocks = Object.values(blocks)
.filter((b) => b.data?.parentId === containerInfo.loopId)
.map((b) => ({ id: b.id, type: b.type, position: b.position }))
const autoConnectEdge = tryCreateAutoConnectEdge(relativePosition, id, {
targetParentId: containerInfo.loopId,
existingChildBlocks,
containerId: containerInfo.loopId,
})
addBlock(
id,
data.type,
name,
relativePosition,
{
width: CONTAINER_DIMENSIONS.DEFAULT_WIDTH,
height: CONTAINER_DIMENSIONS.DEFAULT_HEIGHT,
type: 'subflowNode',
parentId: containerInfo.loopId,
extent: 'parent',
},
containerInfo.loopId,
'parent',
autoConnectEdge
)
resizeLoopNodesWrapper()
} else {
const autoConnectEdge = tryCreateAutoConnectEdge(position, id, {
targetParentId: null,
})
addBlock(
id,
data.type,
name,
position,
{
width: CONTAINER_DIMENSIONS.DEFAULT_WIDTH,
height: CONTAINER_DIMENSIONS.DEFAULT_HEIGHT,
type: 'subflowNode',
},
undefined,
undefined,
autoConnectEdge
)
}
return
}
@@ -2113,11 +2174,9 @@ const WorkflowContent = React.memo(() => {
// Check if hovering over a container node
const containerInfo = isPointInLoopNode(position)
// Highlight container if hovering over it and not dragging a subflow
// Subflow drag is marked by body class flag set by toolbar
const isSubflowDrag = document.body.classList.contains('sim-drag-subflow')
// Highlight container if hovering over it
if (containerInfo && !isSubflowDrag) {
if (containerInfo) {
const containerNode = getNodes().find((n) => n.id === containerInfo.loopId)
if (containerNode?.type === 'subflowNode') {
const kind = (containerNode.data as SubflowNodeData)?.kind
@@ -2308,6 +2367,13 @@ const WorkflowContent = React.memo(() => {
// Handle container nodes differently
if (block.type === 'loop' || block.type === 'parallel') {
// Compute nesting depth so children always render above parents
let depth = 0
let pid = block.data?.parentId as string | undefined
while (pid && depth < 100) {
depth++
pid = blocks[pid]?.data?.parentId as string | undefined
}
nodeArray.push({
id: block.id,
type: 'subflowNode',
@@ -2316,6 +2382,8 @@ const WorkflowContent = React.memo(() => {
extent: block.data?.extent || undefined,
dragHandle: '.workflow-drag-handle',
draggable: !isBlockProtected(block.id, blocks),
zIndex: depth,
className: block.data?.parentId ? 'nested-subflow-node' : undefined,
data: {
...block.data,
name: block.name,
@@ -2476,15 +2544,35 @@ const WorkflowContent = React.memo(() => {
})
if (validBlockIds.length === 0) return
const movingNodeIds = new Set(validBlockIds)
const validBlockIdSet = new Set(validBlockIds)
const descendantIds = getDescendantBlockIds(validBlockIds, blocks)
const movingNodeIds = new Set([...validBlockIds, ...descendantIds])
// Find boundary edges (edges that cross the subflow boundary)
// Find boundary edges (one end inside the subtree, one end outside)
const boundaryEdges = edgesForDisplay.filter((e) => {
const sourceInSelection = movingNodeIds.has(e.source)
const targetInSelection = movingNodeIds.has(e.target)
return sourceInSelection !== targetInSelection
})
const boundaryEdgesByNode = mapEdgesByNode(boundaryEdges, movingNodeIds)
// Attribute each boundary edge to the validBlockId that is the ancestor of the moved endpoint
const boundaryEdgesByNode = new Map<string, Edge[]>()
for (const edge of boundaryEdges) {
const movedEnd = movingNodeIds.has(edge.source) ? edge.source : edge.target
let id: string | undefined = movedEnd
const seen = new Set<string>()
while (id) {
if (seen.has(id)) break
seen.add(id)
if (validBlockIdSet.has(id)) {
const list = boundaryEdgesByNode.get(id) ?? []
list.push(edge)
boundaryEdgesByNode.set(id, list)
break
}
id = blocks[id]?.data?.parentId
}
}
// Collect absolute positions BEFORE any mutations
const absolutePositions = new Map<string, { x: number; y: number }>()
@@ -2546,42 +2634,54 @@ const WorkflowContent = React.memo(() => {
/**
* Updates container dimensions in displayNodes during drag or keyboard movement.
* Resizes the moved node's immediate parent and all ancestor containers (for nested loops/parallels).
*/
const updateContainerDimensionsDuringMove = useCallback(
(movedNodeId: string, movedNodePosition: { x: number; y: number }) => {
const parentId = blocks[movedNodeId]?.data?.parentId
if (!parentId) return
const ancestorIds: string[] = []
const visited = new Set<string>()
let currentId = blocks[movedNodeId]?.data?.parentId
while (currentId && !visited.has(currentId)) {
visited.add(currentId)
ancestorIds.push(currentId)
currentId = blocks[currentId]?.data?.parentId
}
if (ancestorIds.length === 0) return
setDisplayNodes((currentNodes) => {
const childNodes = currentNodes.filter((n) => n.parentId === parentId)
if (childNodes.length === 0) return currentNodes
const computedDimensions = new Map<string, { width: number; height: number }>()
const childPositions = childNodes.map((node) => {
const nodePosition = node.id === movedNodeId ? movedNodePosition : node.position
const { width, height } = getBlockDimensions(node.id)
return { x: nodePosition.x, y: nodePosition.y, width, height }
})
for (const containerId of ancestorIds) {
const childNodes = currentNodes.filter((n) => n.parentId === containerId)
if (childNodes.length === 0) continue
const { width: newWidth, height: newHeight } = calculateContainerDimensions(childPositions)
const childPositions = childNodes.map((node) => {
const nodePosition = node.id === movedNodeId ? movedNodePosition : node.position
const dims = computedDimensions.get(node.id)
const width = dims?.width ?? node.data?.width ?? getBlockDimensions(node.id).width
const height = dims?.height ?? node.data?.height ?? getBlockDimensions(node.id).height
return { x: nodePosition.x, y: nodePosition.y, width, height }
})
computedDimensions.set(containerId, calculateContainerDimensions(childPositions))
}
return currentNodes.map((node) => {
if (node.id === parentId) {
const currentWidth = node.data?.width || CONTAINER_DIMENSIONS.DEFAULT_WIDTH
const currentHeight = node.data?.height || CONTAINER_DIMENSIONS.DEFAULT_HEIGHT
// Only update if dimensions changed
if (newWidth !== currentWidth || newHeight !== currentHeight) {
return {
...node,
data: {
...node.data,
width: newWidth,
height: newHeight,
},
}
}
const newDims = computedDimensions.get(node.id)
if (!newDims) return node
const currentWidth = node.data?.width ?? CONTAINER_DIMENSIONS.DEFAULT_WIDTH
const currentHeight = node.data?.height ?? CONTAINER_DIMENSIONS.DEFAULT_HEIGHT
if (newDims.width === currentWidth && newDims.height === currentHeight) {
return node
}
return {
...node,
data: {
...node.data,
width: newDims.width,
height: newDims.height,
},
}
return node
})
})
},
@@ -2914,16 +3014,6 @@ const WorkflowContent = React.memo(() => {
// Get the node's absolute position to properly calculate intersections
const nodeAbsolutePos = getNodeAbsolutePosition(node.id)
// Prevent subflows from being dragged into other subflows
if (node.type === 'subflowNode') {
// Clear any highlighting for subflow nodes
if (potentialParentId) {
clearDragHighlights()
setPotentialParentId(null)
}
return // Exit early - subflows cannot be placed inside other subflows
}
// Find intersections with container nodes using absolute coordinates
const intersectingNodes = getNodes()
.filter((n) => {
@@ -2993,15 +3083,25 @@ const WorkflowContent = React.memo(() => {
return a.size - b.size // Smaller container takes precedence
})
// Exclude containers that are inside the dragged node (would create a cycle)
const validContainers = sortedContainers.filter(
({ container }) => !isDescendantOf(node.id, container.id)
)
// Use the most appropriate container (deepest or smallest at same depth)
const bestContainerMatch = sortedContainers[0]
const bestContainerMatch = validContainers[0]
setPotentialParentId(bestContainerMatch.container.id)
if (bestContainerMatch) {
setPotentialParentId(bestContainerMatch.container.id)
// Add highlight class and change cursor
const kind = (bestContainerMatch.container.data as SubflowNodeData)?.kind
if (kind === 'loop' || kind === 'parallel') {
highlightContainerNode(bestContainerMatch.container.id, kind)
// Add highlight class and change cursor
const kind = (bestContainerMatch.container.data as SubflowNodeData)?.kind
if (kind === 'loop' || kind === 'parallel') {
highlightContainerNode(bestContainerMatch.container.id, kind)
}
} else {
clearDragHighlights()
setPotentialParentId(null)
}
} else {
// Remove highlighting if no longer over a container
@@ -3017,6 +3117,7 @@ const WorkflowContent = React.memo(() => {
blocks,
getNodeAbsolutePosition,
getNodeDepth,
isDescendantOf,
updateContainerDimensionsDuringMove,
highlightContainerNode,
]
@@ -3159,6 +3260,17 @@ const WorkflowContent = React.memo(() => {
}
}
// Prevent placing a container inside one of its own nested containers (would create cycle)
if (potentialParentId && isDescendantOf(node.id, potentialParentId)) {
addNotification({
level: 'info',
message: 'Cannot place a container inside one of its own nested containers',
workflowId: activeWorkflowId || undefined,
})
setPotentialParentId(null)
return
}
// Update the node's parent relationship
if (potentialParentId) {
// Remove existing edges before moving into container
@@ -3275,6 +3387,7 @@ const WorkflowContent = React.memo(() => {
getNodes,
dragStartParentId,
potentialParentId,
isDescendantOf,
updateNodeParent,
updateBlockPosition,
collaborativeBatchAddEdges,

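The cycle checks added throughout this file (`isDescendantOf(node.id, targetParentId)` in the drag, paste, and drop paths) all rely on the same technique: walk the `parentId` chain upward from the candidate container and see whether the moved node appears. A minimal standalone sketch, assuming the two-argument form means "is the second id inside the first" and a simplified block shape (the real store types are not shown in this diff):

```typescript
// Hypothetical block map shape; the real workflow store is richer.
type BlockMap = Record<string, { data?: { parentId?: string } }>

// Returns true if `candidateId` sits somewhere inside `nodeId`'s subtree.
// The visited set guards against malformed parentId cycles in the data.
function isDescendantOf(blocks: BlockMap, nodeId: string, candidateId: string): boolean {
  const visited = new Set<string>()
  let current: string | undefined = blocks[candidateId]?.data?.parentId
  while (current && !visited.has(current)) {
    if (current === nodeId) return true
    visited.add(current)
    current = blocks[current]?.data?.parentId
  }
  return false
}
```

Filtering with `!isDescendantOf(blocks, n.id, targetParentId)` then rejects exactly the moves that would nest a container inside its own descendant.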

@@ -289,21 +289,38 @@ export function PreviewWorkflow({
return map
}, [executedBlocks])
/** Derives subflow status from children. Error takes precedence. */
/** Derives subflow status from children. Recursively checks nested subflows. Error takes precedence. */
const getSubflowExecutionStatus = useMemo(() => {
return (subflowId: string): ExecutionStatus | undefined => {
const derive = (
subflowId: string,
visited: Set<string> = new Set()
): ExecutionStatus | undefined => {
if (visited.has(subflowId)) return undefined
visited.add(subflowId)
const childIds = subflowChildrenMap.get(subflowId)
if (!childIds?.length) return undefined
const executedChildren = childIds
.map((id) => blockExecutionMap.get(id))
.filter((status): status is { status: string } => Boolean(status))
const childStatuses: string[] = []
for (const childId of childIds) {
const direct = blockExecutionMap.get(childId)
if (direct) {
childStatuses.push(direct.status)
} else {
const childBlock = workflowState.blocks?.[childId]
if (childBlock?.type === 'loop' || childBlock?.type === 'parallel') {
const nested = derive(childId, visited)
if (nested) childStatuses.push(nested)
}
}
}
if (executedChildren.length === 0) return undefined
if (executedChildren.some((s) => s.status === 'error')) return 'error'
if (childStatuses.length === 0) return undefined
if (childStatuses.some((s) => s === 'error')) return 'error'
return 'success'
}
}, [subflowChildrenMap, blockExecutionMap])
return derive
}, [subflowChildrenMap, blockExecutionMap, workflowState.blocks])
/** Gets block status. Subflows derive status from children. */
const getBlockExecutionStatus = useMemo(() => {

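The recursive status derivation above can be sketched in isolation. This is a simplified model, not the component's real code: it uses a `children.has(id)` lookup to stand in for the `type === 'loop' || type === 'parallel'` check, and plain maps in place of the memoized React structures.

```typescript
type Status = 'error' | 'success'

// Derives a subflow's status from its children: error wins, otherwise success,
// undefined if nothing under it executed. Recurses into nested subflows and
// uses the shared `visited` set to terminate on accidental cycles.
function deriveStatus(
  children: Map<string, string[]>, // subflowId -> child block ids
  executed: Map<string, Status>, // directly executed blocks
  subflowId: string,
  visited: Set<string> = new Set()
): Status | undefined {
  if (visited.has(subflowId)) return undefined
  visited.add(subflowId)
  const childIds = children.get(subflowId)
  if (!childIds?.length) return undefined
  const statuses: Status[] = []
  for (const id of childIds) {
    const direct = executed.get(id)
    if (direct) {
      statuses.push(direct)
    } else if (children.has(id)) {
      // Stand-in for the loop/parallel type check in the real component.
      const nested = deriveStatus(children, executed, id, visited)
      if (nested) statuses.push(nested)
    }
  }
  if (statuses.length === 0) return undefined
  return statuses.some((s) => s === 'error') ? 'error' : 'success'
}
```

With a doubly nested subflow whose only leaf errored, the error propagates to the outermost container; an unexecuted subtree stays `undefined` rather than reporting a spurious success.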

@@ -10,7 +10,7 @@ export const AirtableBlock: BlockConfig<AirtableResponse> = {
description: 'Read, create, and update Airtable',
authMode: AuthMode.OAuth,
longDescription:
'Integrates Airtable into the workflow. Can create, get, list, or update Airtable records. Can be used in trigger mode to trigger a workflow when an update is made to an Airtable table.',
'Integrates Airtable into the workflow. Can list bases, list tables (with schema), and create, get, list, or update records. Can also be used in trigger mode to trigger a workflow when an update is made to an Airtable table.',
docsLink: 'https://docs.sim.ai/tools/airtable',
category: 'tools',
bgColor: '#E0E0E0',
@@ -21,10 +21,13 @@ export const AirtableBlock: BlockConfig<AirtableResponse> = {
title: 'Operation',
type: 'dropdown',
options: [
{ label: 'List Bases', id: 'listBases' },
{ label: 'List Tables', id: 'listTables' },
{ label: 'List Records', id: 'list' },
{ label: 'Get Record', id: 'get' },
{ label: 'Create Records', id: 'create' },
{ label: 'Update Record', id: 'update' },
{ label: 'Update Multiple Records', id: 'updateMultiple' },
],
value: () => 'list',
},
@@ -38,6 +41,7 @@ export const AirtableBlock: BlockConfig<AirtableResponse> = {
requiredScopes: [
'data.records:read',
'data.records:write',
'schema.bases:read',
'user.email:read',
'webhook:manage',
],
@@ -59,7 +63,8 @@ export const AirtableBlock: BlockConfig<AirtableResponse> = {
type: 'short-input',
placeholder: 'Enter your base ID (e.g., appXXXXXXXXXXXXXX)',
dependsOn: ['credential'],
required: true,
condition: { field: 'operation', value: 'listBases', not: true },
required: { field: 'operation', value: 'listBases', not: true },
},
{
id: 'tableId',
@@ -67,7 +72,8 @@ export const AirtableBlock: BlockConfig<AirtableResponse> = {
type: 'short-input',
placeholder: 'Enter table ID (e.g., tblXXXXXXXXXXXXXX)',
dependsOn: ['credential', 'baseId'],
required: true,
condition: { field: 'operation', value: ['listBases', 'listTables'], not: true },
required: { field: 'operation', value: ['listBases', 'listTables'], not: true },
},
{
id: 'recordId',
@@ -83,6 +89,7 @@ export const AirtableBlock: BlockConfig<AirtableResponse> = {
type: 'short-input',
placeholder: 'Maximum records to return',
condition: { field: 'operation', value: 'list' },
mode: 'advanced',
},
{
id: 'filterFormula',
@@ -90,6 +97,7 @@ export const AirtableBlock: BlockConfig<AirtableResponse> = {
type: 'long-input',
placeholder: 'Airtable formula to filter records (optional)',
condition: { field: 'operation', value: 'list' },
mode: 'advanced',
wandConfig: {
enabled: true,
prompt: `Generate an Airtable filter formula based on the user's description.
@@ -206,6 +214,8 @@ Return ONLY the valid JSON object - no explanations, no markdown.`,
],
tools: {
access: [
'airtable_list_bases',
'airtable_list_tables',
'airtable_list_records',
'airtable_get_record',
'airtable_create_records',
@@ -215,6 +225,10 @@ Return ONLY the valid JSON object - no explanations, no markdown.`,
config: {
tool: (params) => {
switch (params.operation) {
case 'listBases':
return 'airtable_list_bases'
case 'listTables':
return 'airtable_list_tables'
case 'list':
return 'airtable_list_records'
case 'get':
@@ -278,6 +292,11 @@ Return ONLY the valid JSON object - no explanations, no markdown.`,
},
// Output structure depends on the operation, covered by AirtableResponse union type
outputs: {
// List Bases output
bases: { type: 'json', description: 'List of accessible Airtable bases' },
// List Tables output
tables: { type: 'json', description: 'List of tables in the base with schema' },
// Record outputs
records: { type: 'json', description: 'Retrieved record data' }, // Optional: for list, create, updateMultiple
record: { type: 'json', description: 'Single record data' }, // Optional: for get, update single
metadata: { type: 'json', description: 'Operation metadata' }, // Required: present in all responses

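The Airtable changes above turn `required` from a boolean into the same conditional object shape already used by `condition` (match a field's value, optionally negated with `not`). A hypothetical evaluator for that object, inferred from the diff rather than from the real subblock API:

```typescript
// Shape inferred from the diff; the actual subblock types may differ.
interface FieldCondition {
  field: string
  value: string | string[]
  not?: boolean
}

// Evaluates a `required`/`condition` entry against the current field values.
// Booleans pass through; objects match the named field against one value or
// a list of values, inverted when `not` is set.
function evaluateCondition(
  cond: FieldCondition | boolean,
  values: Record<string, string>
): boolean {
  if (typeof cond === 'boolean') return cond
  const actual = values[cond.field]
  const match = Array.isArray(cond.value)
    ? cond.value.includes(actual)
    : cond.value === actual
  return cond.not ? !match : match
}
```

Under this reading, `required: { field: 'operation', value: ['listBases', 'listTables'], not: true }` makes `tableId` mandatory for every operation except the two schema listings, which is exactly when Airtable needs a table id.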

@@ -0,0 +1,97 @@
import { BrandfetchIcon } from '@/components/icons'
import type { BlockConfig } from '@/blocks/types'
import { AuthMode } from '@/blocks/types'
import type { BrandfetchGetBrandResponse, BrandfetchSearchResponse } from '@/tools/brandfetch/types'
export const BrandfetchBlock: BlockConfig<BrandfetchGetBrandResponse | BrandfetchSearchResponse> = {
type: 'brandfetch',
name: 'Brandfetch',
description: 'Look up brand assets, logos, colors, and company info',
longDescription:
'Integrate Brandfetch into your workflow. Retrieve brand logos, colors, fonts, and company data by domain, ticker, or name search.',
docsLink: 'https://docs.sim.ai/tools/brandfetch',
category: 'tools',
bgColor: '#000000',
icon: BrandfetchIcon,
authMode: AuthMode.ApiKey,
subBlocks: [
{
id: 'operation',
title: 'Operation',
type: 'dropdown',
options: [
{ label: 'Get Brand', id: 'get_brand' },
{ label: 'Search Brands', id: 'search' },
],
value: () => 'get_brand',
},
{
id: 'identifier',
title: 'Identifier',
type: 'short-input',
placeholder: 'e.g., nike.com, NKE, BTC',
required: { field: 'operation', value: 'get_brand' },
condition: { field: 'operation', value: 'get_brand' },
},
{
id: 'name',
title: 'Brand Name',
type: 'short-input',
placeholder: 'e.g., Nike',
required: { field: 'operation', value: 'search' },
condition: { field: 'operation', value: 'search' },
},
{
id: 'apiKey',
title: 'API Key',
type: 'short-input',
placeholder: 'Enter your Brandfetch API key',
required: true,
password: true,
},
],
tools: {
access: ['brandfetch_get_brand', 'brandfetch_search'],
config: {
tool: (params) => {
switch (params.operation) {
case 'get_brand':
return 'brandfetch_get_brand'
case 'search':
return 'brandfetch_search'
default:
return 'brandfetch_get_brand'
}
},
},
},
inputs: {
operation: { type: 'string', description: 'Operation to perform' },
identifier: {
type: 'string',
description: 'Brand identifier (domain, ticker, ISIN, or crypto symbol)',
},
name: { type: 'string', description: 'Brand name to search for' },
apiKey: { type: 'string', description: 'Brandfetch API key' },
},
outputs: {
id: { type: 'string', description: 'Unique brand identifier' },
name: { type: 'string', description: 'Brand name' },
domain: { type: 'string', description: 'Brand domain' },
claimed: { type: 'boolean', description: 'Whether the brand profile is claimed' },
description: { type: 'string', description: 'Short brand description' },
longDescription: { type: 'string', description: 'Detailed brand description' },
links: { type: 'array', description: 'Social media and website links' },
logos: { type: 'array', description: 'Brand logos with formats and themes' },
colors: { type: 'array', description: 'Brand colors with hex values' },
fonts: { type: 'array', description: 'Brand fonts' },
company: { type: 'json', description: 'Company firmographic data' },
qualityScore: { type: 'number', description: 'Data quality score (0-1)' },
isNsfw: { type: 'boolean', description: 'Adult content indicator' },
results: { type: 'array', description: 'Search results with brand name, domain, and icon' },
},
}


@@ -0,0 +1,491 @@
import { DubIcon } from '@/components/icons'
import type { BlockConfig } from '@/blocks/types'
import { AuthMode } from '@/blocks/types'
import type { DubResponse } from '@/tools/dub/types'
export const DubBlock: BlockConfig<DubResponse> = {
type: 'dub',
name: 'Dub',
description: 'Link management with Dub',
authMode: AuthMode.ApiKey,
longDescription:
'Create, manage, and track short links with Dub. Supports custom domains, UTM parameters, link analytics, and more.',
docsLink: 'https://docs.sim.ai/tools/dub',
category: 'tools',
bgColor: '#181C1E',
icon: DubIcon,
subBlocks: [
{
id: 'operation',
title: 'Operation',
type: 'dropdown',
options: [
{ label: 'Create Link', id: 'create_link' },
{ label: 'Upsert Link', id: 'upsert_link' },
{ label: 'Get Link', id: 'get_link' },
{ label: 'Update Link', id: 'update_link' },
{ label: 'Delete Link', id: 'delete_link' },
{ label: 'List Links', id: 'list_links' },
{ label: 'Get Analytics', id: 'get_analytics' },
],
value: () => 'create_link',
},
{
id: 'url',
title: 'Destination URL',
type: 'short-input',
placeholder: 'https://example.com',
condition: { field: 'operation', value: ['create_link', 'upsert_link'] },
required: { field: 'operation', value: ['create_link', 'upsert_link'] },
},
{
id: 'domain',
title: 'Domain',
type: 'short-input',
placeholder: 'dub.sh (default)',
condition: { field: 'operation', value: ['create_link', 'upsert_link', 'update_link'] },
mode: 'advanced',
},
{
id: 'key',
title: 'Custom Slug',
type: 'short-input',
placeholder: 'my-link (randomly generated if empty)',
condition: { field: 'operation', value: ['create_link', 'upsert_link', 'update_link'] },
mode: 'advanced',
},
{
id: 'title',
title: 'Title',
type: 'short-input',
placeholder: 'Custom OG title',
condition: { field: 'operation', value: ['create_link', 'upsert_link', 'update_link'] },
mode: 'advanced',
},
{
id: 'description',
title: 'Description',
type: 'short-input',
placeholder: 'Custom OG description',
condition: { field: 'operation', value: ['create_link', 'upsert_link', 'update_link'] },
mode: 'advanced',
},
{
id: 'externalId',
title: 'External ID',
type: 'short-input',
placeholder: 'Your database ID for this link',
condition: { field: 'operation', value: ['create_link', 'upsert_link', 'update_link'] },
mode: 'advanced',
},
{
id: 'tagIds',
title: 'Tag IDs',
type: 'short-input',
placeholder: 'Comma-separated tag IDs',
condition: { field: 'operation', value: ['create_link', 'upsert_link', 'update_link'] },
mode: 'advanced',
},
{
id: 'comments',
title: 'Comments',
type: 'short-input',
placeholder: 'Internal comments',
condition: { field: 'operation', value: ['create_link', 'upsert_link', 'update_link'] },
mode: 'advanced',
},
{
id: 'expiresAt',
title: 'Expires At',
type: 'short-input',
placeholder: 'ISO 8601 date (e.g., 2025-12-31T23:59:59Z)',
condition: { field: 'operation', value: ['create_link', 'upsert_link', 'update_link'] },
mode: 'advanced',
wandConfig: {
enabled: true,
prompt: `Generate an ISO 8601 timestamp based on the user's description. Return ONLY the timestamp string - no explanations, no extra text.`,
placeholder: 'Describe the expiration (e.g., "in 30 days", "end of year")...',
generationType: 'timestamp',
},
},
{
id: 'password',
title: 'Password',
type: 'short-input',
placeholder: 'Password to protect the link',
password: true,
condition: { field: 'operation', value: ['create_link', 'upsert_link', 'update_link'] },
mode: 'advanced',
},
{
id: 'utm_source',
title: 'UTM Source',
type: 'short-input',
placeholder: 'e.g., twitter',
condition: { field: 'operation', value: ['create_link', 'upsert_link', 'update_link'] },
mode: 'advanced',
},
{
id: 'utm_medium',
title: 'UTM Medium',
type: 'short-input',
placeholder: 'e.g., social',
condition: { field: 'operation', value: ['create_link', 'upsert_link', 'update_link'] },
mode: 'advanced',
},
{
id: 'utm_campaign',
title: 'UTM Campaign',
type: 'short-input',
placeholder: 'e.g., summer-sale',
condition: { field: 'operation', value: ['create_link', 'upsert_link', 'update_link'] },
mode: 'advanced',
},
{
id: 'utm_term',
title: 'UTM Term',
type: 'short-input',
placeholder: 'e.g., link-shortener',
condition: { field: 'operation', value: ['create_link', 'upsert_link', 'update_link'] },
mode: 'advanced',
},
{
id: 'utm_content',
title: 'UTM Content',
type: 'short-input',
placeholder: 'e.g., header-cta',
condition: { field: 'operation', value: ['create_link', 'upsert_link', 'update_link'] },
mode: 'advanced',
},
{
id: 'linkId',
title: 'Link ID',
type: 'short-input',
placeholder: 'Link ID or ext_<externalId>',
condition: { field: 'operation', value: ['get_link', 'update_link', 'delete_link'] },
required: { field: 'operation', value: ['update_link', 'delete_link'] },
},
{
id: 'getLinkExternalId',
title: 'External ID',
type: 'short-input',
placeholder: 'External ID from your database',
condition: { field: 'operation', value: 'get_link' },
mode: 'advanced',
},
{
id: 'getLinkDomain',
title: 'Domain',
type: 'short-input',
placeholder: 'dub.sh',
condition: { field: 'operation', value: 'get_link' },
mode: 'advanced',
},
{
id: 'getLinkKey',
title: 'Key',
type: 'short-input',
placeholder: 'Link slug',
condition: { field: 'operation', value: 'get_link' },
mode: 'advanced',
},
{
id: 'updateUrl',
title: 'New Destination URL',
type: 'short-input',
placeholder: 'https://example.com/new-page',
condition: { field: 'operation', value: 'update_link' },
},
{
id: 'listDomain',
title: 'Filter by Domain',
type: 'short-input',
placeholder: 'dub.sh',
condition: { field: 'operation', value: 'list_links' },
mode: 'advanced',
},
{
id: 'search',
title: 'Search',
type: 'short-input',
placeholder: 'Search links by slug or destination URL',
condition: { field: 'operation', value: 'list_links' },
},
{
id: 'listTagIds',
title: 'Filter by Tag IDs',
type: 'short-input',
placeholder: 'Comma-separated tag IDs',
condition: { field: 'operation', value: 'list_links' },
mode: 'advanced',
},
{
id: 'showArchived',
title: 'Show Archived',
type: 'dropdown',
options: [
{ label: 'No', id: 'false' },
{ label: 'Yes', id: 'true' },
],
value: () => 'false',
condition: { field: 'operation', value: 'list_links' },
mode: 'advanced',
},
{
id: 'sortBy',
title: 'Sort By',
type: 'dropdown',
options: [
{ label: 'Created At', id: 'createdAt' },
{ label: 'Clicks', id: 'clicks' },
{ label: 'Sale Amount', id: 'saleAmount' },
{ label: 'Last Clicked', id: 'lastClicked' },
],
value: () => 'createdAt',
condition: { field: 'operation', value: 'list_links' },
mode: 'advanced',
},
{
id: 'sortOrder',
title: 'Sort Order',
type: 'dropdown',
options: [
{ label: 'Descending', id: 'desc' },
{ label: 'Ascending', id: 'asc' },
],
value: () => 'desc',
condition: { field: 'operation', value: 'list_links' },
mode: 'advanced',
},
{
id: 'page',
title: 'Page',
type: 'short-input',
placeholder: '1',
condition: { field: 'operation', value: 'list_links' },
mode: 'advanced',
},
{
id: 'pageSize',
title: 'Page Size',
type: 'short-input',
placeholder: '100 (max: 100)',
condition: { field: 'operation', value: 'list_links' },
mode: 'advanced',
},
{
id: 'analyticsEvent',
title: 'Event Type',
type: 'dropdown',
options: [
{ label: 'Clicks', id: 'clicks' },
{ label: 'Leads', id: 'leads' },
{ label: 'Sales', id: 'sales' },
{ label: 'Composite', id: 'composite' },
],
value: () => 'clicks',
condition: { field: 'operation', value: 'get_analytics' },
},
{
id: 'analyticsGroupBy',
title: 'Group By',
type: 'dropdown',
options: [
{ label: 'Count (total)', id: 'count' },
{ label: 'Timeseries', id: 'timeseries' },
{ label: 'Countries', id: 'countries' },
{ label: 'Cities', id: 'cities' },
{ label: 'Devices', id: 'devices' },
{ label: 'Browsers', id: 'browsers' },
{ label: 'OS', id: 'os' },
{ label: 'Referers', id: 'referers' },
{ label: 'Top Links', id: 'top_links' },
{ label: 'Top URLs', id: 'top_urls' },
],
value: () => 'count',
condition: { field: 'operation', value: 'get_analytics' },
},
{
id: 'analyticsLinkId',
title: 'Link ID',
type: 'short-input',
placeholder: 'Filter analytics by link ID',
condition: { field: 'operation', value: 'get_analytics' },
mode: 'advanced',
},
{
id: 'analyticsExternalId',
title: 'External ID',
type: 'short-input',
placeholder: 'Filter by external ID (prefix with ext_)',
condition: { field: 'operation', value: 'get_analytics' },
mode: 'advanced',
},
{
id: 'analyticsDomain',
title: 'Domain',
type: 'short-input',
placeholder: 'Filter by domain',
condition: { field: 'operation', value: 'get_analytics' },
mode: 'advanced',
},
{
id: 'analyticsInterval',
title: 'Interval',
type: 'dropdown',
options: [
{ label: '24 Hours', id: '24h' },
{ label: '7 Days', id: '7d' },
{ label: '30 Days', id: '30d' },
{ label: '90 Days', id: '90d' },
{ label: '1 Year', id: '1y' },
{ label: 'Month to Date', id: 'mtd' },
{ label: 'Quarter to Date', id: 'qtd' },
{ label: 'Year to Date', id: 'ytd' },
{ label: 'All Time', id: 'all' },
],
value: () => '24h',
condition: { field: 'operation', value: 'get_analytics' },
},
{
id: 'analyticsStart',
title: 'Start Date',
type: 'short-input',
placeholder: 'ISO 8601 date (overrides interval)',
condition: { field: 'operation', value: 'get_analytics' },
mode: 'advanced',
wandConfig: {
enabled: true,
prompt: `Generate an ISO 8601 timestamp based on the user's description. Return ONLY the timestamp string - no explanations, no extra text.`,
placeholder: 'Describe the start date (e.g., "7 days ago", "start of month")...',
generationType: 'timestamp',
},
},
{
id: 'analyticsEnd',
title: 'End Date',
type: 'short-input',
placeholder: 'ISO 8601 date (defaults to now)',
condition: { field: 'operation', value: 'get_analytics' },
mode: 'advanced',
wandConfig: {
enabled: true,
prompt: `Generate an ISO 8601 timestamp based on the user's description. Return ONLY the timestamp string - no explanations, no extra text.`,
placeholder: 'Describe the end date (e.g., "today", "end of last month")...',
generationType: 'timestamp',
},
},
{
id: 'analyticsCountry',
title: 'Country',
type: 'short-input',
placeholder: 'ISO 3166-1 alpha-2 code (e.g., US)',
condition: { field: 'operation', value: 'get_analytics' },
mode: 'advanced',
},
{
id: 'analyticsTimezone',
title: 'Timezone',
type: 'short-input',
placeholder: 'IANA timezone (e.g., America/New_York)',
condition: { field: 'operation', value: 'get_analytics' },
mode: 'advanced',
},
{
id: 'apiKey',
title: 'API Key',
type: 'short-input',
placeholder: 'Enter your Dub API key',
password: true,
required: true,
},
],
tools: {
access: [
'dub_create_link',
'dub_upsert_link',
'dub_get_link',
'dub_update_link',
'dub_delete_link',
'dub_list_links',
'dub_get_analytics',
],
config: {
tool: (params) => `dub_${params.operation}`,
params: (params) => {
const result: Record<string, unknown> = {}
if (params.operation === 'get_link') {
if (params.getLinkExternalId) result.externalId = params.getLinkExternalId
if (params.getLinkDomain) result.domain = params.getLinkDomain
if (params.getLinkKey) result.key = params.getLinkKey
}
if (params.operation === 'update_link' && params.updateUrl) {
result.url = params.updateUrl
}
if (params.operation === 'list_links') {
if (params.listDomain) result.domain = params.listDomain
if (params.listTagIds) result.tagIds = params.listTagIds
if (params.showArchived && params.showArchived !== 'false') result.showArchived = true
if (params.page) result.page = Number(params.page)
if (params.pageSize) result.pageSize = Number(params.pageSize)
}
if (params.operation === 'get_analytics') {
if (params.analyticsEvent) result.event = params.analyticsEvent
if (params.analyticsGroupBy) result.groupBy = params.analyticsGroupBy
if (params.analyticsLinkId) result.linkId = params.analyticsLinkId
if (params.analyticsExternalId) result.externalId = params.analyticsExternalId
if (params.analyticsDomain) result.domain = params.analyticsDomain
if (params.analyticsInterval) result.interval = params.analyticsInterval
if (params.analyticsStart) result.start = params.analyticsStart
if (params.analyticsEnd) result.end = params.analyticsEnd
if (params.analyticsCountry) result.country = params.analyticsCountry
if (params.analyticsTimezone) result.timezone = params.analyticsTimezone
}
return result
},
},
},
inputs: {
operation: { type: 'string', description: 'Operation to perform' },
apiKey: { type: 'string', description: 'Dub API key' },
url: { type: 'string', description: 'Destination URL for the short link' },
linkId: { type: 'string', description: 'Link ID for get/update/delete operations' },
domain: { type: 'string', description: 'Custom domain for the short link' },
key: { type: 'string', description: 'Custom slug for the short link' },
search: { type: 'string', description: 'Search query for listing links' },
},
outputs: {
id: { type: 'string', description: 'Link ID' },
domain: { type: 'string', description: 'Domain of the short link' },
key: { type: 'string', description: 'Slug of the short link' },
url: { type: 'string', description: 'Destination URL' },
shortLink: { type: 'string', description: 'Full short link URL' },
qrCode: { type: 'string', description: 'QR code URL' },
archived: { type: 'boolean', description: 'Whether the link is archived' },
externalId: { type: 'string', description: 'External ID' },
title: { type: 'string', description: 'OG title' },
description: { type: 'string', description: 'OG description' },
tags: { type: 'json', description: 'Tags assigned to the link (id, name, color)' },
clicks: { type: 'number', description: 'Number of clicks' },
leads: { type: 'number', description: 'Number of leads' },
sales: { type: 'number', description: 'Number of sales' },
saleAmount: { type: 'number', description: 'Total sale amount in cents' },
lastClicked: { type: 'string', description: 'Last clicked timestamp' },
createdAt: { type: 'string', description: 'Creation timestamp' },
updatedAt: { type: 'string', description: 'Last update timestamp' },
utm_source: { type: 'string', description: 'UTM source parameter' },
utm_medium: { type: 'string', description: 'UTM medium parameter' },
utm_campaign: { type: 'string', description: 'UTM campaign parameter' },
utm_term: { type: 'string', description: 'UTM term parameter' },
utm_content: { type: 'string', description: 'UTM content parameter' },
links: {
type: 'json',
description: 'Array of links (id, domain, key, url, shortLink, clicks, tags, createdAt)',
},
count: { type: 'number', description: 'Number of links returned (list operation)' },
data: {
type: 'json',
description: 'Grouped analytics data (timeseries, countries, devices, etc.)',
},
},
}
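The `params` mapper above renames operation-prefixed UI fields (`listDomain`, `analyticsEvent`, ...) to the flat names the Dub API expects, and coerces numeric strings with `Number()`. A minimal standalone sketch of that pattern, with illustrative names rather than the actual block code:

```typescript
// Sketch of the prefix-stripping param mapper (illustrative, not the real block).
// UI fields are namespaced per operation so one form can serve many endpoints;
// the mapper flattens them and converts numeric strings before the API call.
function mapListParams(params: Record<string, string | undefined>) {
  const result: Record<string, unknown> = {}
  if (params.listDomain) result.domain = params.listDomain
  if (params.page) result.page = Number(params.page)
  if (params.pageSize) result.pageSize = Number(params.pageSize)
  return result
}
```

Unset fields are simply omitted, so the downstream request only carries parameters the user actually filled in.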


@@ -0,0 +1,182 @@
import { GoogleMeetIcon } from '@/components/icons'
import type { BlockConfig } from '@/blocks/types'
import { AuthMode } from '@/blocks/types'
import type { GoogleMeetResponse } from '@/tools/google_meet/types'
export const GoogleMeetBlock: BlockConfig<GoogleMeetResponse> = {
type: 'google_meet',
name: 'Google Meet',
description: 'Create and manage Google Meet meetings',
longDescription:
'Integrate Google Meet into your workflow. Create meeting spaces, get space details, end conferences, list conference records, and view participants.',
docsLink: 'https://docs.sim.ai/tools/google_meet',
category: 'tools',
bgColor: '#E0E0E0',
icon: GoogleMeetIcon,
authMode: AuthMode.OAuth,
subBlocks: [
{
id: 'operation',
title: 'Operation',
type: 'dropdown',
options: [
{ label: 'Create Space', id: 'create_space' },
{ label: 'Get Space', id: 'get_space' },
{ label: 'End Conference', id: 'end_conference' },
{ label: 'List Conference Records', id: 'list_conference_records' },
{ label: 'Get Conference Record', id: 'get_conference_record' },
{ label: 'List Participants', id: 'list_participants' },
],
value: () => 'create_space',
},
{
id: 'credential',
title: 'Google Meet Account',
type: 'oauth-input',
canonicalParamId: 'oauthCredential',
mode: 'basic',
required: true,
serviceId: 'google-meet',
requiredScopes: [
'https://www.googleapis.com/auth/meetings.space.created',
'https://www.googleapis.com/auth/meetings.space.readonly',
],
placeholder: 'Select Google Meet account',
},
{
id: 'manualCredential',
title: 'Google Meet Account',
type: 'short-input',
canonicalParamId: 'oauthCredential',
mode: 'advanced',
placeholder: 'Enter credential ID',
required: true,
},
// Create Space Fields
{
id: 'accessType',
title: 'Access Type',
type: 'dropdown',
condition: { field: 'operation', value: 'create_space' },
options: [
{ label: 'Open (anyone with link)', id: 'OPEN' },
{ label: 'Trusted (organization members)', id: 'TRUSTED' },
{ label: 'Restricted (invited only)', id: 'RESTRICTED' },
],
},
{
id: 'entryPointAccess',
title: 'Entry Point Access',
type: 'dropdown',
condition: { field: 'operation', value: 'create_space' },
mode: 'advanced',
options: [
{ label: 'All entry points', id: 'ALL' },
{ label: 'Creator app only', id: 'CREATOR_APP_ONLY' },
],
},
// Get Space / End Conference Fields
{
id: 'spaceName',
title: 'Space Name or Meeting Code',
type: 'short-input',
placeholder: 'spaces/abc123 or abc-defg-hij',
condition: { field: 'operation', value: ['get_space', 'end_conference'] },
required: { field: 'operation', value: ['get_space', 'end_conference'] },
},
// Conference Record Fields
{
id: 'conferenceName',
title: 'Conference Record Name',
type: 'short-input',
placeholder: 'conferenceRecords/abc123',
condition: { field: 'operation', value: ['get_conference_record', 'list_participants'] },
required: { field: 'operation', value: ['get_conference_record', 'list_participants'] },
},
// List Conference Records Fields
{
id: 'filter',
title: 'Filter',
type: 'short-input',
placeholder: 'space.name = "spaces/abc123"',
condition: { field: 'operation', value: ['list_conference_records', 'list_participants'] },
mode: 'advanced',
},
{
id: 'pageSize',
title: 'Page Size',
type: 'short-input',
placeholder: '25',
condition: { field: 'operation', value: ['list_conference_records', 'list_participants'] },
mode: 'advanced',
},
{
id: 'pageToken',
title: 'Page Token',
type: 'short-input',
placeholder: 'Token from previous request',
condition: { field: 'operation', value: ['list_conference_records', 'list_participants'] },
mode: 'advanced',
},
],
tools: {
access: [
'google_meet_create_space',
'google_meet_get_space',
'google_meet_end_conference',
'google_meet_list_conference_records',
'google_meet_get_conference_record',
'google_meet_list_participants',
],
config: {
tool: (params) => `google_meet_${params.operation}`,
params: (params) => {
const { oauthCredential, operation, pageSize, ...rest } = params
const processedParams: Record<string, unknown> = { ...rest }
if (pageSize) {
processedParams.pageSize =
typeof pageSize === 'string' ? Number.parseInt(pageSize, 10) : pageSize
}
return {
oauthCredential,
...processedParams,
}
},
},
},
inputs: {
operation: { type: 'string', description: 'Operation to perform' },
oauthCredential: { type: 'string', description: 'Google Meet access token' },
accessType: { type: 'string', description: 'Access type for meeting space' },
entryPointAccess: { type: 'string', description: 'Entry point access setting' },
spaceName: { type: 'string', description: 'Space resource name or meeting code' },
conferenceName: { type: 'string', description: 'Conference record resource name' },
filter: { type: 'string', description: 'Filter expression' },
pageSize: { type: 'string', description: 'Maximum results per page' },
pageToken: { type: 'string', description: 'Pagination token' },
},
outputs: {
name: { type: 'string', description: 'Resource name' },
meetingUri: { type: 'string', description: 'Meeting URL' },
meetingCode: { type: 'string', description: 'Meeting code' },
accessType: { type: 'string', description: 'Access type' },
entryPointAccess: { type: 'string', description: 'Entry point access' },
activeConference: { type: 'string', description: 'Active conference record' },
ended: { type: 'boolean', description: 'Whether conference was ended' },
conferenceRecords: { type: 'json', description: 'List of conference records' },
startTime: { type: 'string', description: 'Conference start time' },
endTime: { type: 'string', description: 'Conference end time' },
expireTime: { type: 'string', description: 'Record expiration time' },
space: { type: 'string', description: 'Associated space name' },
participants: { type: 'json', description: 'List of participants' },
nextPageToken: { type: 'string', description: 'Next page token' },
totalSize: { type: 'number', description: 'Total participant count' },
},
}
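The tool config above accepts `pageSize` either as a string (from the short-input field) or as a number (when supplied programmatically), and drops empty values. A small hedged sketch of that coercion, extracted as a standalone helper for clarity:

```typescript
// Illustrative helper matching the pageSize handling in the tool config:
// string input is parsed base-10, numbers pass through, empty values are dropped.
function coercePageSize(pageSize?: string | number): number | undefined {
  if (!pageSize) return undefined
  return typeof pageSize === 'string' ? Number.parseInt(pageSize, 10) : pageSize
}
```

Keeping the coercion in the block's `params` function means individual tools can assume a numeric `pageSize` and never re-parse form input.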


@@ -13,6 +13,7 @@ import { ArxivBlock } from '@/blocks/blocks/arxiv'
import { AsanaBlock } from '@/blocks/blocks/asana'
import { AshbyBlock } from '@/blocks/blocks/ashby'
import { AttioBlock } from '@/blocks/blocks/attio'
import { BrandfetchBlock } from '@/blocks/blocks/brandfetch'
import { BrowserUseBlock } from '@/blocks/blocks/browser_use'
import { CalComBlock } from '@/blocks/blocks/calcom'
import { CalendlyBlock } from '@/blocks/blocks/calendly'
@@ -30,6 +31,7 @@ import { DevinBlock } from '@/blocks/blocks/devin'
import { DiscordBlock } from '@/blocks/blocks/discord'
import { DropboxBlock } from '@/blocks/blocks/dropbox'
import { DSPyBlock } from '@/blocks/blocks/dspy'
import { DubBlock } from '@/blocks/blocks/dub'
import { DuckDuckGoBlock } from '@/blocks/blocks/duckduckgo'
import { DynamoDBBlock } from '@/blocks/blocks/dynamodb'
import { ElasticsearchBlock } from '@/blocks/blocks/elasticsearch'
@@ -57,6 +59,7 @@ import { GoogleDriveBlock } from '@/blocks/blocks/google_drive'
import { GoogleFormsBlock } from '@/blocks/blocks/google_forms'
import { GoogleGroupsBlock } from '@/blocks/blocks/google_groups'
import { GoogleMapsBlock } from '@/blocks/blocks/google_maps'
import { GoogleMeetBlock } from '@/blocks/blocks/google_meet'
import { GooglePagespeedBlock } from '@/blocks/blocks/google_pagespeed'
import { GoogleSheetsBlock, GoogleSheetsV2Block } from '@/blocks/blocks/google_sheets'
import { GoogleSlidesBlock, GoogleSlidesV2Block } from '@/blocks/blocks/google_slides'
@@ -205,6 +208,7 @@ export const registry: Record<string, BlockConfig> = {
asana: AsanaBlock,
ashby: AshbyBlock,
attio: AttioBlock,
brandfetch: BrandfetchBlock,
browser_use: BrowserUseBlock,
calcom: CalComBlock,
calendly: CalendlyBlock,
@@ -224,6 +228,7 @@ export const registry: Record<string, BlockConfig> = {
discord: DiscordBlock,
dropbox: DropboxBlock,
dspy: DSPyBlock,
dub: DubBlock,
duckduckgo: DuckDuckGoBlock,
dynamodb: DynamoDBBlock,
elasticsearch: ElasticsearchBlock,
@@ -253,6 +258,7 @@ export const registry: Record<string, BlockConfig> = {
google_drive: GoogleDriveBlock,
google_forms: GoogleFormsBlock,
google_groups: GoogleGroupsBlock,
google_maps: GoogleMapsBlock,
google_meet: GoogleMeetBlock,
google_pagespeed: GooglePagespeedBlock,
google_tasks: GoogleTasksBlock,
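Registering `BrandfetchBlock`, `DubBlock`, and `GoogleMeetBlock` amounts to one import plus one keyed entry each; resolution at runtime is a plain object lookup. A simplified sketch (minimal types, not the real `BlockConfig`):

```typescript
// Simplified sketch of registry resolution (real entries are BlockConfig objects).
interface MiniBlock {
  type: string
  name: string
}

const miniRegistry: Record<string, MiniBlock> = {
  dub: { type: 'dub', name: 'Dub' },
  google_meet: { type: 'google_meet', name: 'Google Meet' },
}

// Unregistered types resolve to undefined rather than throwing,
// so callers can handle unknown block types gracefully.
function resolveBlock(type: string): MiniBlock | undefined {
  return miniRegistry[type]
}
```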


@@ -1,60 +0,0 @@
import { Text } from '@react-email/components'
import { format } from 'date-fns'
import { baseStyles } from '@/components/emails/_styles'
import { EmailLayout } from '@/components/emails/components'
import { getBaseUrl } from '@/lib/core/utils/urls'
import { getBrandConfig } from '@/ee/whitelabeling'
interface CareersConfirmationEmailProps {
name: string
position: string
submittedDate?: Date
}
export function CareersConfirmationEmail({
name,
position,
submittedDate = new Date(),
}: CareersConfirmationEmailProps) {
const brand = getBrandConfig()
const baseUrl = getBaseUrl()
return (
<EmailLayout
preview={`Your application to ${brand.name} has been received`}
showUnsubscribe={false}
>
<Text style={baseStyles.paragraph}>Hello {name},</Text>
<Text style={baseStyles.paragraph}>
We've received your application for <strong>{position}</strong>. Our team reviews every
application and will reach out if there's a match.
</Text>
<Text style={baseStyles.paragraph}>
In the meantime, explore our{' '}
<a
href='https://docs.sim.ai'
target='_blank'
rel='noopener noreferrer'
style={baseStyles.link}
>
docs
</a>{' '}
or{' '}
<a href={`${baseUrl}/studio`} style={baseStyles.link}>
blog
</a>{' '}
to learn more about what we're building.
</Text>
{/* Divider */}
<div style={baseStyles.divider} />
<Text style={{ ...baseStyles.footerText, textAlign: 'left' }}>
Submitted on {format(submittedDate, 'MMMM do, yyyy')}.
</Text>
</EmailLayout>
)
}
export default CareersConfirmationEmail


@@ -1,337 +0,0 @@
import { Section, Text } from '@react-email/components'
import { format } from 'date-fns'
import { baseStyles, colors } from '@/components/emails/_styles'
import { EmailLayout } from '@/components/emails/components'
interface CareersSubmissionEmailProps {
name: string
email: string
phone?: string
position: string
linkedin?: string
portfolio?: string
experience: string
location: string
message: string
submittedDate?: Date
}
const getExperienceLabel = (experience: string) => {
const labels: Record<string, string> = {
'0-1': '0-1 years',
'1-3': '1-3 years',
'3-5': '3-5 years',
'5-10': '5-10 years',
'10+': '10+ years',
}
return labels[experience] || experience
}
export function CareersSubmissionEmail({
name,
email,
phone,
position,
linkedin,
portfolio,
experience,
location,
message,
submittedDate = new Date(),
}: CareersSubmissionEmailProps) {
return (
<EmailLayout preview={`New Career Application from ${name}`} hideFooter showUnsubscribe={false}>
<Text
style={{
...baseStyles.paragraph,
fontSize: '18px',
fontWeight: 'bold',
color: colors.textPrimary,
}}
>
New Career Application
</Text>
<Text style={baseStyles.paragraph}>
A new career application has been submitted on {format(submittedDate, 'MMMM do, yyyy')} at{' '}
{format(submittedDate, 'h:mm a')}.
</Text>
{/* Applicant Information */}
<Section
style={{
marginTop: '24px',
marginBottom: '24px',
padding: '20px',
backgroundColor: colors.bgOuter,
borderRadius: '8px',
border: `1px solid ${colors.divider}`,
}}
>
<Text
style={{
margin: '0 0 16px 0',
fontSize: '16px',
fontWeight: 'bold',
color: colors.textPrimary,
fontFamily: baseStyles.fontFamily,
}}
>
Applicant Information
</Text>
<table style={{ width: '100%', borderCollapse: 'collapse' }}>
<tbody>
<tr>
<td
style={{
padding: '8px 0',
fontSize: '14px',
fontWeight: 'bold',
color: colors.textMuted,
width: '40%',
fontFamily: baseStyles.fontFamily,
}}
>
Name:
</td>
<td
style={{
padding: '8px 0',
fontSize: '14px',
color: colors.textPrimary,
fontFamily: baseStyles.fontFamily,
}}
>
{name}
</td>
</tr>
<tr>
<td
style={{
padding: '8px 0',
fontSize: '14px',
fontWeight: 'bold',
color: colors.textMuted,
fontFamily: baseStyles.fontFamily,
}}
>
Email:
</td>
<td
style={{
padding: '8px 0',
fontSize: '14px',
color: colors.textPrimary,
fontFamily: baseStyles.fontFamily,
}}
>
<a href={`mailto:${email}`} style={baseStyles.link}>
{email}
</a>
</td>
</tr>
{phone && (
<tr>
<td
style={{
padding: '8px 0',
fontSize: '14px',
fontWeight: 'bold',
color: colors.textMuted,
fontFamily: baseStyles.fontFamily,
}}
>
Phone:
</td>
<td
style={{
padding: '8px 0',
fontSize: '14px',
color: colors.textPrimary,
fontFamily: baseStyles.fontFamily,
}}
>
<a href={`tel:${phone}`} style={baseStyles.link}>
{phone}
</a>
</td>
</tr>
)}
<tr>
<td
style={{
padding: '8px 0',
fontSize: '14px',
fontWeight: 'bold',
color: colors.textMuted,
fontFamily: baseStyles.fontFamily,
}}
>
Position:
</td>
<td
style={{
padding: '8px 0',
fontSize: '14px',
color: colors.textPrimary,
fontFamily: baseStyles.fontFamily,
}}
>
{position}
</td>
</tr>
<tr>
<td
style={{
padding: '8px 0',
fontSize: '14px',
fontWeight: 'bold',
color: colors.textMuted,
fontFamily: baseStyles.fontFamily,
}}
>
Experience:
</td>
<td
style={{
padding: '8px 0',
fontSize: '14px',
color: colors.textPrimary,
fontFamily: baseStyles.fontFamily,
}}
>
{getExperienceLabel(experience)}
</td>
</tr>
<tr>
<td
style={{
padding: '8px 0',
fontSize: '14px',
fontWeight: 'bold',
color: colors.textMuted,
fontFamily: baseStyles.fontFamily,
}}
>
Location:
</td>
<td
style={{
padding: '8px 0',
fontSize: '14px',
color: colors.textPrimary,
fontFamily: baseStyles.fontFamily,
}}
>
{location}
</td>
</tr>
{linkedin && (
<tr>
<td
style={{
padding: '8px 0',
fontSize: '14px',
fontWeight: 'bold',
color: colors.textMuted,
fontFamily: baseStyles.fontFamily,
}}
>
LinkedIn:
</td>
<td
style={{
padding: '8px 0',
fontSize: '14px',
color: colors.textPrimary,
fontFamily: baseStyles.fontFamily,
}}
>
<a
href={linkedin}
target='_blank'
rel='noopener noreferrer'
style={baseStyles.link}
>
View Profile
</a>
</td>
</tr>
)}
{portfolio && (
<tr>
<td
style={{
padding: '8px 0',
fontSize: '14px',
fontWeight: 'bold',
color: colors.textMuted,
fontFamily: baseStyles.fontFamily,
}}
>
Portfolio:
</td>
<td
style={{
padding: '8px 0',
fontSize: '14px',
color: colors.textPrimary,
fontFamily: baseStyles.fontFamily,
}}
>
<a
href={portfolio}
target='_blank'
rel='noopener noreferrer'
style={baseStyles.link}
>
View Portfolio
</a>
</td>
</tr>
)}
</tbody>
</table>
</Section>
{/* Message */}
<Section
style={{
marginTop: '24px',
marginBottom: '24px',
padding: '20px',
backgroundColor: colors.bgOuter,
borderRadius: '8px',
border: `1px solid ${colors.divider}`,
}}
>
<Text
style={{
margin: '0 0 12px 0',
fontSize: '16px',
fontWeight: 'bold',
color: colors.textPrimary,
fontFamily: baseStyles.fontFamily,
}}
>
About Themselves
</Text>
<Text
style={{
margin: '0',
fontSize: '14px',
color: colors.textPrimary,
lineHeight: '1.6',
whiteSpace: 'pre-wrap',
fontFamily: baseStyles.fontFamily,
}}
>
{message}
</Text>
</Section>
</EmailLayout>
)
}
export default CareersSubmissionEmail


@@ -1,2 +0,0 @@
export { CareersConfirmationEmail } from './careers-confirmation-email'
export { CareersSubmissionEmail } from './careers-submission-email'


@@ -4,8 +4,6 @@ export * from './_styles'
export * from './auth'
// Billing emails
export * from './billing'
// Careers emails
export * from './careers'
// Shared components
export * from './components'
// Invitation emails


@@ -8,7 +8,6 @@ import {
PlanWelcomeEmail,
UsageThresholdEmail,
} from '@/components/emails/billing'
import { CareersConfirmationEmail, CareersSubmissionEmail } from '@/components/emails/careers'
import {
BatchInvitationEmail,
InvitationEmail,
@@ -225,44 +224,6 @@ export async function renderPaymentFailedEmail(params: {
)
}
export async function renderCareersConfirmationEmail(
name: string,
position: string
): Promise<string> {
return await render(
CareersConfirmationEmail({
name,
position,
})
)
}
export async function renderCareersSubmissionEmail(params: {
name: string
email: string
phone?: string
position: string
linkedin?: string
portfolio?: string
experience: string
location: string
message: string
}): Promise<string> {
return await render(
CareersSubmissionEmail({
name: params.name,
email: params.email,
phone: params.phone,
position: params.position,
linkedin: params.linkedin,
portfolio: params.portfolio,
experience: params.experience,
location: params.location,
message: params.message,
})
)
}
export async function renderWorkflowNotificationEmail(
params: WorkflowNotificationEmailProps
): Promise<string> {


@@ -1711,167 +1711,42 @@ export function StagehandIcon(props: SVGProps<SVGSVGElement>) {
return (
<svg
{...props}
width='108'
height='159'
viewBox='0 0 108 159'
fill='none'
xmlns='http://www.w3.org/2000/svg'
width='256'
height='352'
viewBox='0 0 256 352'
fill='none'
>
<path
d='M15 26C22.8234 31.822 23.619 41.405 25.3125 50.3867C25.8461 53.1914 26.4211 55.9689 27.0625 58.75C27.7987 61.9868 28.4177 65.2319 29 68.5C29.332 70.3336 29.6653 72.1669 30 74C30.1418 74.7863 30.2836 75.5727 30.4297 76.3828C31.8011 83.2882 33.3851 90.5397 39.4375 94.75C40.3405 95.3069 40.3405 95.3069 41.2617 95.875C43.8517 97.5512 45.826 99.826 48 102C50.6705 102.89 52.3407 103.143 55.0898 103.211C55.8742 103.239 56.6586 103.268 57.4668 103.297C59.1098 103.349 60.7531 103.393 62.3965 103.43C65.8896 103.567 68.4123 103.705 71.5664 105.289C73 107 73 107 73 111C73.66 111 74.32 111 75 111C74.0759 106.912 74.0759 106.912 71.4766 103.828C67.0509 102.348 62.3634 102.64 57.7305 102.609C52.3632 102.449 49.2783 101.537 45 98C41.8212 94.0795 41.5303 90.9791 42 86C44.9846 83.0154 48.2994 83.6556 52.3047 83.6289C53.139 83.6199 53.9734 83.6108 54.833 83.6015C56.6067 83.587 58.3805 83.5782 60.1543 83.5745C62.8304 83.5627 65.5041 83.5137 68.1797 83.4629C81.1788 83.34 91.8042 85.3227 102 94C106.37 100.042 105.483 106.273 104.754 113.406C103.821 119.026 101.968 124.375 100.125 129.75C99.8806 130.471 99.6361 131.193 99.3843 131.936C97.7783 136.447 95.9466 140.206 93 144C92.34 144 91.68 144 91 144C91 144.66 91 145.32 91 146C79.0816 156.115 63.9798 156.979 49 156C36.6394 154.226 26.7567 148.879 19 139C11.0548 125.712 11.6846 105.465 11.3782 90.4719C11.0579 77.4745 8.03411 64.8142 5.4536 52.1135C5.04373 50.0912 4.64233 48.0673 4.24218 46.043C4.00354 44.8573 3.7649 43.6716 3.51903 42.45C2.14425 33.3121 2.14425 33.3121 4.87499 29.125C8.18297 25.817 10.3605 25.4542 15 26Z'
fill='#FDFDFD'
d='M 242.29,45.79 C 242.29,28.88 226.69,13.76 206.61,13.76 C 188.59,13.76 174.82,28.66 174.82,45.85 V 101.97 C 168.89,98.09 163.18,96.76 157.14,96.76 C 145.94,96.76 137.02,101.49 128.83,110.17 C 121.81,101.01 112.07,95.73 100.72,95.73 C 93.97,95.73 87.82,98.09 82.11,100.9 V 80.05 C 82.11,64.08 66.14,47.28 48.74,47.28 C 31.12,47.28 14.54,62.71 14.54,78.79 V 219.4 C 14.54,273.71 56.99,337.89 125.23,337.89 C 197.41,337.89 242.29,289.05 242.29,186.01 V 78.9 L 242.29,45.79 Z'
fill='black'
/>
<path
d='M91 0.999996C94.8466 2.96604 96.2332 5.08365 97.6091 9.03564C99.203 14.0664 99.4412 18.7459 99.4414 23.9922C99.4538 24.9285 99.4663 25.8647 99.4791 26.8294C99.5049 28.8198 99.5247 30.8103 99.539 32.8008C99.5785 37.9693 99.6682 43.1369 99.7578 48.3047C99.7747 49.3188 99.7917 50.3328 99.8091 51.3776C99.9603 59.6066 100.323 67.7921 100.937 76C101.012 77.0582 101.087 78.1163 101.164 79.2065C101.646 85.1097 102.203 90.3442 105.602 95.3672C107.492 98.9262 107.45 102.194 107.375 106.125C107.366 106.881 107.356 107.638 107.346 108.417C107.18 114.639 106.185 120.152 104 126C103.636 126.996 103.273 127.993 102.898 129.02C98.2182 141.022 92.6784 149.349 80.7891 155.062C67.479 160.366 49.4234 159.559 36 155C32.4272 153.286 29.2162 151.308 26 149C25.0719 148.361 24.1437 147.721 23.1875 147.062C8.32968 133.054 9.60387 109.231 8.73413 90.3208C8.32766 81.776 7.51814 73.4295 5.99999 65C5.82831 64.0338 5.65662 63.0675 5.47973 62.072C4.98196 59.3363 4.46395 56.6053 3.93749 53.875C3.76412 52.9572 3.59074 52.0394 3.4121 51.0938C2.75101 47.6388 2.11387 44.3416 0.999995 41C0.505898 36.899 0.0476353 32.7768 2.04687 29.0469C4.91881 25.5668 6.78357 24.117 11.25 23.6875C15.8364 24.0697 17.5999 24.9021 21 28C24.7763 34.3881 26.047 41.2626 27.1875 48.5C27.5111 50.4693 27.8377 52.4381 28.168 54.4062C28.3733 55.695 28.3733 55.695 28.5828 57.0098C28.8087 58.991 28.8087 58.991 30 60C30.3171 59.4947 30.6342 58.9894 30.9609 58.4688C33.1122 55.4736 34.7097 53.3284 38.3789 52.3945C44.352 52.203 48.1389 53.6183 53 57C53.0928 56.1338 53.0928 56.1338 53.1875 55.25C54.4089 51.8676 55.9015 50.8075 59 49C63.8651 48.104 66.9348 48.3122 71.1487 51.0332C72.0896 51.6822 73.0305 52.3313 74 53C73.9686 51.2986 73.9686 51.2986 73.9365 49.5627C73.8636 45.3192 73.818 41.0758 73.7803 36.8318C73.7603 35.0016 73.733 33.1715 73.6982 31.3415C73.6492 28.6976 73.6269 26.0545 73.6094 23.4102C73.5887 22.6035 73.5681 21.7969 73.5468 20.9658C73.5441 13.8444 75.5121 7.83341 80.25 2.4375C83.9645 0.495841 86.8954 0.209055 91 
0.999996ZM3.99999 30C1.56925 34.8615 3.215 40.9393 4.24218 46.043C4.37061 46.6927 4.49905 47.3424 4.63137 48.0118C5.03968 50.0717 5.45687 52.1296 5.87499 54.1875C11.1768 80.6177 11.1768 80.6177 11.4375 93.375C11.7542 120.78 11.7542 120.78 23.5625 144.375C28.5565 149.002 33.5798 151.815 40 154C40.6922 154.244 41.3844 154.487 42.0977 154.738C55.6463 158.576 72.4909 156.79 84.8086 150.316C87.0103 148.994 89.0458 147.669 91 146C91 145.34 91 144.68 91 144C91.66 144 92.32 144 93 144C97.1202 138.98 99.3206 133.053 101.25 126.937C101.505 126.174 101.76 125.41 102.023 124.623C104.94 115.65 107.293 104.629 103.625 95.625C96.3369 88.3369 86.5231 83.6919 76.1988 83.6088C74.9905 83.6226 74.9905 83.6226 73.7578 83.6367C72.9082 83.6362 72.0586 83.6357 71.1833 83.6352C69.4034 83.6375 67.6235 83.6472 65.8437 83.6638C63.1117 83.6876 60.3806 83.6843 57.6484 83.6777C55.9141 83.6833 54.1797 83.6904 52.4453 83.6992C51.6277 83.6983 50.81 83.6974 49.9676 83.6964C45.5122 83.571 45.5122 83.571 42 86C41.517 90.1855 41.733 92.4858 43.6875 96.25C46.4096 99.4871 48.6807 101.674 53.0105 102.282C55.3425 102.411 57.6645 102.473 60 102.5C69.8847 102.612 69.8847 102.612 74 106C74.8125 108.687 74.8125 108.688 75 111C74.34 111 73.68 111 73 111C72.8969 110.216 72.7937 109.432 72.6875 108.625C72.224 105.67 72.224 105.67 69 104C65.2788 103.745 61.5953 103.634 57.8672 103.609C51.1596 103.409 46.859 101.691 41.875 97C41.2562 96.34 40.6375 95.68 40 95C39.175 94.4637 38.35 93.9275 37.5 93.375C30.9449 87.1477 30.3616 77.9789 29.4922 69.418C29.1557 66.1103 29.1557 66.1103 28.0352 63.625C26.5234 59.7915 26.1286 55.8785 25.5625 51.8125C23.9233 38.3 23.9233 38.3 17 27C11.7018 24.3509 7.9915 26.1225 3.99999 30Z'
fill='#1F1F1F'
d='M 224.94,46.23 C 224.94,36.76 215.91,28.66 205.91,28.66 C 196.75,28.66 189.9,36.11 189.9,45.14 V 152.72 C 202.88,153.38 214.08,155.96 224.94,166.19 V 78.79 L 224.94,46.23 Z'
fill='white'
/>
<path
d='M89.0976 2.53906C91 3 91 3 93.4375 5.3125C96.1586 9.99276 96.178 14.1126 96.2461 19.3828C96.2778 21.1137 96.3098 22.8446 96.342 24.5754C96.3574 25.4822 96.3728 26.3889 96.3887 27.3232C96.6322 41.3036 96.9728 55.2117 98.3396 69.1353C98.9824 75.7746 99.0977 82.3308 99 89C96.5041 88.0049 94.0126 87.0053 91.5351 85.9648C90.3112 85.4563 90.3112 85.4563 89.0625 84.9375C87.8424 84.4251 87.8424 84.4251 86.5976 83.9023C83.7463 82.9119 80.9774 82.4654 78 82C76.7702 65.9379 75.7895 49.8907 75.7004 33.7775C75.6919 32.3138 75.6783 30.8501 75.6594 29.3865C75.5553 20.4082 75.6056 12.1544 80.6875 4.4375C83.6031 2.62508 85.7 2.37456 89.0976 2.53906Z'
fill='#FBFBFB'
d='M 157.21,113.21 C 146.12,113.21 137.93,122.02 137.93,131.76 V 154.62 C 142.24,153.05 145.95,152.61 149.83,152.61 H 174.71 V 131.76 C 174.71,122.35 166.73,113.21 157.21,113.21 Z'
fill='white'
/>
<path
d='M97 13C97.99 13.495 97.99 13.495 99 14C99.0297 15.8781 99.0297 15.8781 99.0601 17.7942C99.4473 46.9184 99.4473 46.9184 100.937 76C101.012 77.0574 101.087 78.1149 101.164 79.2043C101.646 85.1082 102.203 90.3434 105.602 95.3672C107.492 98.9262 107.45 102.194 107.375 106.125C107.366 106.881 107.356 107.638 107.346 108.417C107.18 114.639 106.185 120.152 104 126C103.636 126.996 103.273 127.993 102.898 129.02C98.2182 141.022 92.6784 149.349 80.7891 155.062C67.479 160.366 49.4234 159.559 36 155C32.4272 153.286 29.2162 151.308 26 149C24.6078 148.041 24.6078 148.041 23.1875 147.062C13.5484 137.974 10.832 124.805 9.99999 112C9.91815 101.992 10.4358 91.9898 11 82C11.33 82 11.66 82 12 82C12.0146 82.6118 12.0292 83.2236 12.0442 83.854C11.5946 115.845 11.5946 115.845 24.0625 143.875C28.854 148.273 33.89 150.868 40 153C40.6935 153.245 41.387 153.49 42.1016 153.742C56.9033 157.914 73.8284 155.325 87 148C88.3301 147.327 89.6624 146.658 91 146C91 145.34 91 144.68 91 144C91.66 144 92.32 144 93 144C100.044 130.286 105.786 114.602 104 99C102.157 94.9722 100.121 93.0631 96.3125 90.875C95.5042 90.398 94.696 89.9211 93.8633 89.4297C85.199 85.1035 78.1558 84.4842 68.5 84.3125C67.2006 84.2783 65.9012 84.2442 64.5625 84.209C61.3751 84.127 58.1879 84.0577 55 84C55 83.67 55 83.34 55 83C58.9087 82.7294 62.8179 82.4974 66.7309 82.2981C68.7007 82.1902 70.6688 82.0535 72.6367 81.916C82.854 81.4233 90.4653 83.3102 99 89C98.8637 87.6094 98.8637 87.6094 98.7246 86.1907C96.96 67.8915 95.697 49.7051 95.75 31.3125C95.751 30.5016 95.7521 29.6908 95.7532 28.8554C95.7901 15.4198 95.7901 15.4198 97 13Z'
fill='#262114'
d='M 100.06,111.75 C 89.19,111.75 81.85,121.06 81.85,130.31 V 157.86 C 81.85,167.71 89.72,175.38 99.24,175.38 C 109.71,175.38 118.39,166.91 118.39,157.39 V 130.31 C 118.39,120.79 110.03,111.75 100.06,111.75 Z'
fill='white'
/>
<path
d='M68 51C72.86 54.06 74.644 56.5072 76 62C76.249 65.2763 76.2347 68.5285 76.1875 71.8125C76.1868 72.6833 76.1862 73.554 76.1855 74.4512C76.1406 80.8594 76.1406 80.8594 75 82C73.5113 82.0867 72.0185 82.107 70.5273 82.0976C69.6282 82.0944 68.7291 82.0912 67.8027 82.0879C66.8572 82.0795 65.9117 82.0711 64.9375 82.0625C63.9881 82.058 63.0387 82.0535 62.0605 82.0488C59.707 82.037 57.3535 82.0205 55 82C53.6352 77.2188 53.738 72.5029 53.6875 67.5625C53.6585 66.6208 53.6295 65.6792 53.5996 64.709C53.5591 60.2932 53.5488 57.7378 55.8945 53.9023C59.5767 50.5754 63.1766 50.211 68 51Z'
fill='#F8F8F8'
d='M 192.04,168.87 H 150.16 C 140.19,168.87 133.34,175.39 133.34,183.86 C 133.34,192.9 140.19,199.75 148.66,199.75 H 182.52 C 188.01,199.75 189.63,204.81 189.63,207.49 C 189.63,211.91 186.37,214.64 181.09,215.51 C 162.96,218.66 137.71,229.13 137.71,259.68 C 137.71,265.07 133.67,267.42 130.29,267.42 C 126.09,267.42 122.38,264.74 122.38,260.12 C 122.38,241.15 129.02,228.17 143.26,214.81 C 131.01,212.02 119.21,202.99 117.75,186.43 C 111.93,189.81 107.2,191.15 100.18,191.15 C 82.11,191.15 66.68,176.58 66.68,158.29 V 80.71 C 66.68,71.24 57.16,63.5 49.18,63.5 C 38.71,63.5 29.89,72.42 29.89,80.27 V 217.19 C 29.89,266.48 68.71,322.19 124.88,322.19 C 185.91,322.19 223.91,282.15 223.91,207.16 C 223.91,187.19 214.28,168.87 192.04,168.87 Z'
fill='white'
/>
</svg>
)
}
export function BrandfetchIcon(props: SVGProps<SVGSVGElement>) {
return (
<svg {...props} viewBox='0 0 29 31' fill='none' xmlns='http://www.w3.org/2000/svg'>
<path
d='M46 55C48.7557 57.1816 50.4359 58.8718 52 62C52.0837 63.5215 52.1073 65.0466 52.0977 66.5703C52.0944 67.4662 52.0912 68.3621 52.0879 69.2852C52.0795 70.2223 52.0711 71.1595 52.0625 72.125C52.058 73.0699 52.0535 74.0148 52.0488 74.9883C52.037 77.3256 52.0206 79.6628 52 82C50.9346 82.1992 50.9346 82.1992 49.8477 82.4023C48.9286 82.5789 48.0094 82.7555 47.0625 82.9375C46.146 83.1115 45.2294 83.2855 44.2852 83.4648C42.0471 83.7771 42.0471 83.7771 41 85C40.7692 86.3475 40.5885 87.7038 40.4375 89.0625C40.2931 90.3619 40.1487 91.6613 40 93C37 92 37 92 35.8672 90.1094C35.5398 89.3308 35.2123 88.5522 34.875 87.75C34.5424 86.9817 34.2098 86.2134 33.8672 85.4219C31.9715 80.1277 31.7884 75.065 31.75 69.5C31.7294 68.7536 31.7087 68.0073 31.6875 67.2383C31.6551 62.6607 32.0474 59.7266 35 56C38.4726 54.2637 42.2119 54.3981 46 55Z'
fill='#FAFAFA'
/>
<path
d='M97 13C97.66 13.33 98.32 13.66 99 14C99.0297 15.8781 99.0297 15.8781 99.0601 17.7942C99.4473 46.9184 99.4473 46.9184 100.937 76C101.012 77.0574 101.087 78.1149 101.164 79.2043C101.566 84.1265 102.275 88.3364 104 93C103.625 95.375 103.625 95.375 103 97C102.361 96.2781 101.721 95.5563 101.062 94.8125C94.4402 88.1902 85.5236 84.8401 76.2891 84.5859C75.0451 84.5473 73.8012 84.5086 72.5195 84.4688C71.2343 84.4378 69.9491 84.4069 68.625 84.375C66.6624 84.317 66.6624 84.317 64.6601 84.2578C61.4402 84.1638 58.2203 84.0781 55 84C55 83.67 55 83.34 55 83C58.9087 82.7294 62.8179 82.4974 66.7309 82.2981C68.7007 82.1902 70.6688 82.0535 72.6367 81.916C82.854 81.4233 90.4653 83.3102 99 89C98.9091 88.0729 98.8182 87.1458 98.7246 86.1907C96.96 67.8915 95.697 49.7051 95.75 31.3125C95.751 30.5016 95.7521 29.6908 95.7532 28.8554C95.7901 15.4198 95.7901 15.4198 97 13Z'
fill='#423B28'
/>
<path
d='M91 0.999996C94.3999 3.06951 96.8587 5.11957 98 9C97.625 12.25 97.625 12.25 97 15C95.804 12.6081 94.6146 10.2139 93.4375 7.8125C92.265 5.16236 92.265 5.16236 91 4C88.074 3.7122 85.8483 3.51695 83 4C79.1128 7.37574 78.178 11.0991 77 16C76.8329 18.5621 76.7615 21.1317 76.7695 23.6992C76.77 24.4155 76.7704 25.1318 76.7709 25.8698C76.7739 27.3783 76.7817 28.8868 76.7942 30.3953C76.8123 32.664 76.8147 34.9324 76.8144 37.2012C76.8329 44.6001 77.0765 51.888 77.7795 59.259C78.1413 63.7564 78.1068 68.2413 78.0625 72.75C78.058 73.6498 78.0535 74.5495 78.0488 75.4766C78.0373 77.6511 78.0193 79.8255 78 82C78.99 82.495 78.99 82.495 80 83C68.78 83.33 57.56 83.66 46 84C46.495 83.01 46.495 83.01 47 82C52.9349 80.7196 58.8909 80.8838 64.9375 80.9375C65.9075 80.942 66.8775 80.9465 67.8769 80.9512C70.2514 80.9629 72.6256 80.9793 75 81C75.0544 77.9997 75.0939 75.0005 75.125 72C75.1418 71.1608 75.1585 70.3216 75.1758 69.457C75.2185 63.9475 74.555 59.2895 73 54C73.66 54 74.32 54 75 54C74.9314 53.2211 74.8629 52.4422 74.7922 51.6396C74.1158 43.5036 73.7568 35.4131 73.6875 27.25C73.644 25.5194 73.644 25.5194 73.5996 23.7539C73.5376 15.3866 74.6189 8.85069 80.25 2.4375C83.9433 0.506911 86.9162 0.173322 91 0.999996Z'
fill='#131311'
/>
<path
d='M15 24C20.2332 26.3601 22.1726 29.3732 24.1875 34.5195C26.8667 42.6988 27.2651 50.4282 27 59C26.67 59 26.34 59 26 59C25.8945 58.436 25.7891 57.8721 25.6804 57.291C25.1901 54.6926 24.6889 52.0963 24.1875 49.5C24.0218 48.6131 23.8562 47.7262 23.6855 46.8125C21.7568 35.5689 21.7568 35.5689 15 27C12.0431 26.2498 12.0431 26.2498 8.99999 27C5.97965 28.9369 5.97965 28.9369 3.99999 32C3.67226 36.9682 4.31774 41.4911 5.27733 46.3594C5.40814 47.0304 5.53894 47.7015 5.67371 48.3929C5.94892 49.7985 6.22723 51.2035 6.50854 52.6079C6.93887 54.7569 7.35989 56.9075 7.77929 59.0586C9.09359 66.104 9.09359 66.104 11 73C11.0836 75.2109 11.1073 77.4243 11.0976 79.6367C11.0944 80.9354 11.0912 82.2342 11.0879 83.5723C11.0795 84.944 11.0711 86.3158 11.0625 87.6875C11.0575 89.071 11.0529 90.4544 11.0488 91.8379C11.037 95.2253 11.0206 98.6126 11 102C8.54975 99.5498 8.73228 98.8194 8.65624 95.4492C8.62812 94.53 8.60001 93.6108 8.57104 92.6638C8.54759 91.6816 8.52415 90.6994 8.49999 89.6875C8.20265 81.3063 7.58164 73.2485 5.99999 65C5.67135 63.2175 5.34327 61.435 5.01562 59.6523C4.31985 55.9098 3.62013 52.1681 2.90233 48.4297C2.75272 47.6484 2.60311 46.867 2.44897 46.062C1.99909 43.8187 1.99909 43.8187 0.999995 41C0.505898 36.899 0.0476353 32.7768 2.04687 29.0469C6.06003 24.1839 8.81126 23.4843 15 24Z'
fill='#2A2311'
/>
<path
d='M11 82C11.33 82 11.66 82 12 82C12.0146 82.6118 12.0292 83.2236 12.0442 83.854C11.5946 115.845 11.5946 115.845 24.0625 143.875C30.0569 149.404 36.9894 152.617 45 154C42 156 42 156 39.4375 156C29.964 153.244 20.8381 146.677 16 138C8.26993 120.062 9.92611 101.014 11 82Z'
fill='#272214'
/>
<path
d='M68 49C70.3478 50.1116 71.9703 51.3346 74 53C73.34 53.66 72.68 54.32 72 55C71.505 54.505 71.01 54.01 70.5 53.5C67.6718 51.8031 65.3662 51.5622 62.0976 51.4062C58.4026 52.4521 57.1992 53.8264 55 57C54.3826 61.2861 54.5302 65.4938 54.6875 69.8125C54.7101 70.9823 54.7326 72.1521 54.7559 73.3574C54.8147 76.2396 54.8968 79.1191 55 82C54.01 82 53.02 82 52 82C51.9854 81.4203 51.9708 80.8407 51.9558 80.2434C51.881 77.5991 51.7845 74.9561 51.6875 72.3125C51.6649 71.4005 51.6424 70.4885 51.6191 69.5488C51.4223 64.6292 51.2621 60.9548 48 57C45.6603 55.8302 44.1661 55.8339 41.5625 55.8125C40.78 55.7983 39.9976 55.7841 39.1914 55.7695C36.7079 55.8591 36.7079 55.8591 34 58C32.7955 60.5518 32.7955 60.5518 32 63C31.34 63 30.68 63 30 63C30.2839 59.6879 31.0332 57.9518 32.9375 55.1875C36.7018 52.4987 38.9555 52.3484 43.4844 52.5586C47.3251 53.2325 49.8148 54.7842 53 57C53.0928 56.1338 53.0928 56.1338 53.1875 55.25C55.6091 48.544 61.7788 47.8649 68 49Z'
fill='#1F1A0F'
/>
<path
d='M99 60C99.33 60 99.66 60 100 60C100.05 60.7865 100.1 61.573 100.152 62.3833C100.385 65.9645 100.63 69.5447 100.875 73.125C100.954 74.3625 101.032 75.6 101.113 76.875C101.197 78.0738 101.281 79.2727 101.367 80.5078C101.44 81.6075 101.514 82.7073 101.589 83.8403C102.013 87.1 102.94 89.8988 104 93C103.625 95.375 103.625 95.375 103 97C102.361 96.2781 101.721 95.5563 101.062 94.8125C94.4402 88.1902 85.5236 84.8401 76.2891 84.5859C74.4231 84.5279 74.4231 84.5279 72.5195 84.4688C71.2343 84.4378 69.9491 84.4069 68.625 84.375C67.3166 84.3363 66.0082 84.2977 64.6601 84.2578C61.4402 84.1638 58.2203 84.0781 55 84C55 83.67 55 83.34 55 83C58.9087 82.7294 62.8179 82.4974 66.7309 82.2981C68.7007 82.1902 70.6688 82.0535 72.6367 81.916C82.854 81.4233 90.4653 83.3102 99 89C98.9162 87.912 98.8324 86.8241 98.7461 85.7031C98.1266 77.012 97.9127 68.6814 99 60Z'
fill='#332E22'
/>
<path
d='M15 24C20.2332 26.3601 22.1726 29.3732 24.1875 34.5195C26.8667 42.6988 27.2651 50.4282 27 59C26.67 59 26.34 59 26 59C25.8945 58.436 25.7891 57.8721 25.6804 57.291C25.1901 54.6926 24.6889 52.0963 24.1875 49.5C24.0218 48.6131 23.8562 47.7262 23.6855 46.8125C21.7568 35.5689 21.7568 35.5689 15 27C12.0431 26.2498 12.0431 26.2498 8.99999 27C5.2818 29.7267 4.15499 31.2727 3.18749 35.8125C3.12562 36.8644 3.06374 37.9163 2.99999 39C2.33999 39 1.67999 39 0.999992 39C0.330349 31.2321 0.330349 31.2321 3.37499 27.5625C7.31431 23.717 9.51597 23.543 15 24Z'
fill='#1D180A'
/>
<path
d='M91 0.999996C94.3999 3.06951 96.8587 5.11957 98 9C97.625 12.25 97.625 12.25 97 15C95.804 12.6081 94.6146 10.2139 93.4375 7.8125C92.265 5.16236 92.265 5.16236 91 4C85.4345 3.33492 85.4345 3.33491 80.6875 5.75C78.5543 9.85841 77.6475 13.9354 76.7109 18.4531C76.4763 19.2936 76.2417 20.1341 76 21C75.34 21.33 74.68 21.66 74 22C73.5207 15.4102 74.5846 10.6998 78 5C81.755 0.723465 85.5463 -0.103998 91 0.999996Z'
fill='#16130D'
/>
<path
d='M42 93C42.5569 93.7631 43.1137 94.5263 43.6875 95.3125C46.4238 98.4926 48.7165 100.679 53.0105 101.282C55.3425 101.411 57.6646 101.473 60 101.5C70.6207 101.621 70.6207 101.621 75 106C75.0406 107.666 75.0427 109.334 75 111C74.34 111 73.68 111 73 111C72.7112 110.196 72.4225 109.391 72.125 108.562C71.2674 105.867 71.2674 105.867 69 105C65.3044 104.833 61.615 104.703 57.916 104.658C52.1631 104.454 48.7484 103.292 44 100C41.5625 97.25 41.5625 97.25 40 95C40.66 95 41.32 95 42 95C42 94.34 42 93.68 42 93Z'
fill='#2B2B2B'
/>
<path
d='M11 82C11.33 82 11.66 82 12 82C12.1682 86.6079 12.3287 91.216 12.4822 95.8245C12.5354 97.3909 12.5907 98.9574 12.6482 100.524C12.7306 102.78 12.8055 105.036 12.8789 107.293C12.9059 107.989 12.933 108.685 12.9608 109.402C13.0731 113.092 12.9015 116.415 12 120C11.67 120 11.34 120 11 120C9.63778 112.17 10.1119 104.4 10.4375 96.5C10.4908 95.0912 10.5436 93.6823 10.5957 92.2734C10.7247 88.8487 10.8596 85.4243 11 82Z'
fill='#4D483B'
/>
<path
d='M43.4844 52.5586C47.3251 53.2325 49.8148 54.7842 53 57C52 59 52 59 50 60C49.5256 59.34 49.0512 58.68 48.5625 58C45.2656 55.4268 43.184 55.5955 39.1211 55.6641C36.7043 55.8955 36.7043 55.8955 34 58C32.7955 60.5518 32.7955 60.5518 32 63C31.34 63 30.68 63 30 63C30.2839 59.6879 31.0332 57.9518 32.9375 55.1875C36.7018 52.4987 38.9555 52.3484 43.4844 52.5586Z'
fill='#221F16'
/>
<path
d='M76 73C76.33 73 76.66 73 77 73C77 75.97 77 78.94 77 82C78.485 82.495 78.485 82.495 80 83C68.78 83.33 57.56 83.66 46 84C46.33 83.34 46.66 82.68 47 82C52.9349 80.7196 58.8909 80.8838 64.9375 80.9375C65.9075 80.942 66.8775 80.9465 67.8769 80.9512C70.2514 80.9629 72.6256 80.9793 75 81C75.33 78.36 75.66 75.72 76 73Z'
fill='#040404'
/>
<path
d='M27 54C27.33 54 27.66 54 28 54C28.33 56.97 28.66 59.94 29 63C29.99 63 30.98 63 32 63C32 66.96 32 70.92 32 75C31.01 74.67 30.02 74.34 29 74C28.8672 73.2523 28.7344 72.5047 28.5977 71.7344C28.421 70.7495 28.2444 69.7647 28.0625 68.75C27.8885 67.7755 27.7144 66.8009 27.5352 65.7969C27.0533 63.087 27.0533 63.087 26.4062 60.8125C25.8547 58.3515 26.3956 56.4176 27 54Z'
fill='#434039'
/>
<path
d='M78 5C78.99 5.33 79.98 5.66 81 6C80.3194 6.92812 80.3194 6.92812 79.625 7.875C77.7233 11.532 77.1555 14.8461 76.5273 18.8906C76.3533 19.5867 76.1793 20.2828 76 21C75.34 21.33 74.68 21.66 74 22C73.5126 15.2987 74.9229 10.9344 78 5Z'
fill='#2A2313'
/>
<path
d='M12 115C12.99 115.495 12.99 115.495 14 116C14.5334 118.483 14.9326 120.864 15.25 123.375C15.3531 124.061 15.4562 124.747 15.5625 125.453C16.0763 129.337 16.2441 130.634 14 134C12.6761 127.57 11.752 121.571 12 115Z'
fill='#2F2C22'
/>
<path
d='M104 95C107 98 107 98 107.363 101.031C107.347 102.176 107.33 103.321 107.312 104.5C107.309 105.645 107.305 106.789 107.301 107.969C107 111 107 111 105 114C104.67 107.73 104.34 101.46 104 95Z'
fill='#120F05'
/>
<path
d='M56 103C58.6048 102.919 61.2071 102.86 63.8125 102.812C64.5505 102.787 65.2885 102.762 66.0488 102.736C71.4975 102.662 71.4975 102.662 74 104.344C75.374 106.619 75.2112 108.396 75 111C74.34 111 73.68 111 73 111C72.7112 110.196 72.4225 109.391 72.125 108.562C71.2674 105.867 71.2674 105.867 69 105C66.7956 104.77 64.5861 104.589 62.375 104.438C61.1865 104.354 59.998 104.27 58.7734 104.184C57.4006 104.093 57.4006 104.093 56 104C56 103.67 56 103.34 56 103Z'
fill='#101010'
/>
<path
d='M23 40C23.66 40 24.32 40 25 40C27.3084 46.3482 27.1982 52.2948 27 59C26.67 59 26.34 59 26 59C25.01 52.73 24.02 46.46 23 40Z'
fill='#191409'
/>
<path
d='M47 83C46.3606 83.3094 45.7212 83.6187 45.0625 83.9375C41.9023 87.0977 42.181 90.6833 42 95C41.01 94.67 40.02 94.34 39 94C39.3463 85.7409 39.3463 85.7409 41.875 82.875C44 82 44 82 47 83Z'
fill='#171717'
/>
<path
d='M53 61C53.33 61 53.66 61 54 61C54.33 67.93 54.66 74.86 55 82C54.01 82 53.02 82 52 82C52.33 75.07 52.66 68.14 53 61Z'
fill='#444444'
/>
<path
d='M81 154C78.6696 156.33 77.8129 156.39 74.625 156.75C73.4687 156.897 73.4687 156.897 72.2891 157.047C69.6838 156.994 68.2195 156.317 66 155C67.7478 154.635 69.4984 154.284 71.25 153.938C72.7118 153.642 72.7118 153.642 74.2031 153.34C76.8681 153.016 78.4887 153.145 81 154Z'
fill='#332F23'
/>
<path
d='M19 28C19.66 28 20.32 28 21 28C21.6735 29.4343 22.3386 30.8726 23 32.3125C23.5569 33.5133 23.5569 33.5133 24.125 34.7383C25 37 25 37 25 40C22 39 22 39 21.0508 37.2578C20.8071 36.554 20.5635 35.8502 20.3125 35.125C20.0611 34.4263 19.8098 33.7277 19.5508 33.0078C19 31 19 31 19 28Z'
fill='#282213'
/>
<path
d='M102 87C104.429 93.2857 104.429 93.2857 103 97C100.437 94.75 100.437 94.75 98 92C98.0625 89.75 98.0625 89.75 99 88C101 87 101 87 102 87Z'
fill='#37301F'
/>
<path
d='M53 56C53.33 56 53.66 56 54 56C53.67 62.27 53.34 68.54 53 75C52.67 75 52.34 75 52 75C51.7788 72.2088 51.5726 69.4179 51.375 66.625C51.3105 65.8309 51.2461 65.0369 51.1797 64.2188C51.0394 62.1497 51.0124 60.0737 51 58C51.66 57.34 52.32 56.68 53 56Z'
fill='#030303'
/>
<path
d='M100 129C100.33 129 100.66 129 101 129C100.532 133.776 99.7567 137.045 97 141C96.34 140.67 95.68 140.34 95 140C96.65 136.37 98.3 132.74 100 129Z'
fill='#1E1A12'
/>
<path
d='M15 131C17.7061 132.353 17.9618 133.81 19.125 136.562C19.4782 137.389 19.8314 138.215 20.1953 139.066C20.4609 139.704 20.7264 140.343 21 141C20.01 141 19.02 141 18 141C15.9656 137.27 15 135.331 15 131Z'
fill='#1C1912'
/>
<path
d='M63 49C69.4 49.4923 69.4 49.4923 72.4375 52.0625C73.2109 53.0216 73.2109 53.0216 74 54C70.8039 54 69.5828 53.4533 66.8125 52C66.0971 51.6288 65.3816 51.2575 64.6445 50.875C64.1018 50.5863 63.5591 50.2975 63 50C63 49.67 63 49.34 63 49Z'
fill='#13110C'
/>
<path
d='M0.999992 39C1.98999 39 2.97999 39 3.99999 39C5.24999 46.625 5.24999 46.625 2.99999 50C2.33999 46.37 1.67999 42.74 0.999992 39Z'
fill='#312C1E'
/>
<path
d='M94 5C94.66 5 95.32 5 96 5C97.8041 7.75924 98.0127 8.88972 97.625 12.25C97.4187 13.1575 97.2125 14.065 97 15C95.1161 11.7345 94.5071 8.71888 94 5Z'
fill='#292417'
/>
<path
d='M20 141C23.3672 142.393 24.9859 143.979 27 147C24.625 146.812 24.625 146.812 22 146C20.6875 143.438 20.6875 143.438 20 141Z'
fill='#373328'
/>
<path
d='M86 83C86.33 83.99 86.66 84.98 87 86C83.37 85.34 79.74 84.68 76 84C80.3553 81.8223 81.4663 81.9696 86 83Z'
fill='#2F2F2F'
/>
<path
d='M42 93C46 97.625 46 97.625 46 101C44.02 99.35 42.04 97.7 40 96C40.66 95.67 41.32 95.34 42 95C42 94.34 42 93.68 42 93Z'
fill='#232323'
/>
<path
d='M34 55C34.66 55.33 35.32 55.66 36 56C35.5256 56.7838 35.0512 57.5675 34.5625 58.375C33.661 59.8895 32.7882 61.4236 32 63C31.34 63 30.68 63 30 63C30.4983 59.3125 31.1007 57.3951 34 55Z'
fill='#110F0A'
/>
<path
d='M29 7.54605C29 9.47222 28.316 11.1378 26.9481 12.5428C25.5802 13.9251 23.5852 14.9222 20.9634 15.534C22.377 15.9192 23.4484 16.5537 24.1781 17.4375C24.9077 18.2987 25.2724 19.2956 25.2724 20.4287C25.2724 22.2189 24.7025 23.7713 23.5625 25.0855C22.4454 26.3998 20.8039 27.4195 18.638 28.1447C16.4721 28.8472 13.8616 29.1985 10.8066 29.1985C9.66666 29.1985 8.75472 29.1645 8.07075 29.0965C8.04796 29.7309 7.77438 30.2068 7.25 30.5241C6.72562 30.8414 6.05307 31 5.23231 31C4.41156 31 3.84159 30.8187 3.52241 30.4561C3.22603 30.0936 3.10062 29.561 3.14623 28.8586C3.35141 25.686 3.75039 22.3662 4.34316 18.8991C4.93593 15.4094 5.68829 12.0442 6.60024 8.80373C6.75982 8.23721 7.07901 7.84064 7.55778 7.61404C8.03656 7.38743 8.66353 7.27412 9.43868 7.27412C10.8294 7.27412 11.5248 7.65936 11.5248 8.42983C11.5248 8.74708 11.4564 9.10965 11.3196 9.51754C10.7268 11.2851 10.134 13.6871 9.54127 16.7237C8.9485 19.7375 8.52674 22.6156 8.27594 25.3575C9.37028 25.448 10.2594 25.4934 10.9434 25.4934C14.1352 25.4934 16.4721 25.0401 17.954 24.1338C19.4587 23.2046 20.2111 22.0263 20.2111 20.5987C20.2111 19.6016 19.778 18.7632 18.9116 18.0833C18.0681 17.4035 16.6431 17.0296 14.6368 16.9616C14.1808 16.939 13.8616 16.8257 13.6792 16.6217C13.4968 16.4178 13.4057 16.0892 13.4057 15.636C13.4057 14.9788 13.5425 14.4463 13.816 14.0384C14.0896 13.6305 14.5912 13.4152 15.3208 13.3925C16.9395 13.3472 18.3986 13.1093 19.6981 12.6787C21.0204 12.2482 22.0578 11.6477 22.8101 10.8772C23.5625 10.0841 23.9387 9.1663 23.9387 8.1239C23.9387 6.80958 23.2889 5.77851 21.9894 5.0307C20.6899 4.26024 18.6949 3.875 16.0047 3.875C13.5652 3.875 11.2056 4.19226 8.92571 4.82676C6.64584 5.4386 4.70793 6.2204 3.11203 7.17215C2.38246 7.6027 1.7669 7.81798 1.26533 7.81798C0.854953 7.81798 0.53577 7.68202 0.307783 7.41009C0.102594 7.1155 0 6.75292 0 6.32237C0 5.75585 0.113994 5.26864 0.341981 4.86075C0.592768 4.45285 1.17414 3.98831 2.08608 3.46711C4.00118 2.37939 6.24685 1.52961 8.82311 0.917763C11.3994 0.305921 14.0326 0 16.7229 0C20.8494 0 23.9272 0.691156 25.9564 2.07347C27.9855 3.45577 29 5.27998 29 7.54605Z'
fill='currentColor'
/>
</svg>
)
@@ -2467,7 +2342,7 @@ export function PagerDutyIcon(props: SVGProps<SVGSVGElement>) {
<svg {...props} xmlns='http://www.w3.org/2000/svg' viewBox='0 0 64 64' fill='none'>
<path
d='M6.704 59.217H0v-33.65c0-3.455 1.418-5.544 2.604-6.704 2.63-2.58 6.2-2.656 6.782-2.656h10.546c3.765 0 5.93 1.52 7.117 2.8 2.346 2.553 2.372 5.853 2.32 6.73v12.687c0 3.662-1.496 5.828-2.733 6.988-2.553 2.398-5.93 2.45-6.73 2.424H6.704zm13.46-18.102c.36 0 1.367-.103 1.908-.62.413-.387.62-1.083.62-2.1v-13.02c0-.36-.077-1.315-.593-1.857-.5-.516-1.444-.62-2.166-.62h-10.6c-2.63 0-2.63 1.985-2.63 2.656v15.55zM57.296 4.783H64V38.46c0 3.455-1.418 5.544-2.604 6.704-2.63 2.58-6.2 2.656-6.782 2.656H44.068c-3.765 0-5.93-1.52-7.117-2.8-2.346-2.553-2.372-5.853-2.32-6.73V25.62c0-3.662 1.496-5.828 2.733-6.988 2.553-2.398 5.93-2.45 6.73-2.424h13.202zM43.836 22.9c-.36 0-1.367.103-1.908.62-.413.387-.62 1.083-.62 2.1v13.02c0 .36.077 1.315.593 1.857.5.516 1.444.62 2.166.62h10.598c2.656-.026 2.656-2 2.656-2.682V22.9z'
fill='#06AC38'
fill='#FFFFFF'
/>
</svg>
)
@@ -4796,6 +4671,22 @@ export function GoogleGroupsIcon(props: SVGProps<SVGSVGElement>) {
)
}
export function GoogleMeetIcon(props: SVGProps<SVGSVGElement>) {
return (
<svg {...props} xmlns='http://www.w3.org/2000/svg' fill='none' viewBox='0 0 87.5 72'>
<path fill='#00832d' d='M49.5 36l8.53 9.75 11.47 7.33 2-17.02-2-16.64-11.69 6.44z' />
<path fill='#0066da' d='M0 51.5V66c0 3.315 2.685 6 6 6h14.5l3-10.96-3-9.54-9.95-3z' />
<path fill='#e94235' d='M20.5 0L0 20.5l10.55 3 9.95-3 2.95-9.41z' />
<path fill='#2684fc' d='M20.5 20.5H0v31h20.5z' />
<path
fill='#00ac47'
d='M82.6 8.68L69.5 19.42v33.66l13.16 10.79c1.97 1.54 4.85.135 4.85-2.37V11c0-2.535-2.945-3.925-4.91-2.32zM49.5 36v15.5h-29V72h43c3.315 0 6-2.685 6-6V53.08z'
/>
<path fill='#ffba00' d='M63.5 0h-43v20.5h29V36l20-16.57V6c0-3.315-2.685-6-6-6z' />
</svg>
)
}
export function CursorIcon(props: SVGProps<SVGSVGElement>) {
return (
<svg {...props} xmlns='http://www.w3.org/2000/svg' viewBox='0 0 546 546' fill='currentColor'>
@@ -4804,6 +4695,19 @@ export function CursorIcon(props: SVGProps<SVGSVGElement>) {
)
}
export function DubIcon(props: SVGProps<SVGSVGElement>) {
return (
<svg {...props} viewBox='0 0 64 64' fill='none' xmlns='http://www.w3.org/2000/svg'>
<path
fillRule='evenodd'
clipRule='evenodd'
d='M32 64c17.673 0 32-14.327 32-32 0-11.844-6.435-22.186-16-27.719V48h-8v-2.14A15.9 15.9 0 0 1 32 48c-8.837 0-16-7.163-16-16s7.163-16 16-16c2.914 0 5.647.78 8 2.14V1.008A32 32 0 0 0 32 0C14.327 0 0 14.327 0 32s14.327 32 32 32'
fill='currentColor'
/>
</svg>
)
}
export function DuckDuckGoIcon(props: SVGProps<SVGSVGElement>) {
return (
<svg {...props} xmlns='http://www.w3.org/2000/svg' viewBox='-108 -108 216 216'>

View File

@@ -158,6 +158,8 @@ export const DEFAULTS = {
MAX_LOOP_ITERATIONS: 1000,
MAX_FOREACH_ITEMS: 1000,
MAX_PARALLEL_BRANCHES: 20,
MAX_NESTING_DEPTH: 10,
/** Maximum child workflow depth for propagating SSE callbacks (block:started, block:completed). */
MAX_SSE_CHILD_DEPTH: 3,
EXECUTION_TIME: 0,
TOKENS: {

View File

@@ -2,6 +2,11 @@ import { loggerMock } from '@sim/testing'
import { describe, expect, it, vi } from 'vitest'
import { BlockType } from '@/executor/constants'
import { DAGBuilder } from '@/executor/dag/builder'
import {
buildBranchNodeId,
buildParallelSentinelEndId,
buildParallelSentinelStartId,
} from '@/executor/utils/subflow-utils'
import type { SerializedBlock, SerializedWorkflow } from '@/serializer/types'
vi.mock('@sim/logger', () => loggerMock)
@@ -89,6 +94,96 @@ describe('DAGBuilder disabled subflow validation', () => {
})
})
describe('DAGBuilder nested parallel support', () => {
it('builds DAG for parallel-in-parallel with correct sentinel wiring', () => {
const outerParallelId = 'outer-parallel'
const innerParallelId = 'inner-parallel'
const functionId = 'func-1'
const workflow: SerializedWorkflow = {
version: '1',
blocks: [
createBlock('start', BlockType.STARTER),
createBlock(outerParallelId, BlockType.PARALLEL),
createBlock(innerParallelId, BlockType.PARALLEL),
createBlock(functionId, BlockType.FUNCTION),
],
connections: [
{ source: 'start', target: outerParallelId },
{
source: outerParallelId,
target: innerParallelId,
sourceHandle: 'parallel-start-source',
},
{
source: innerParallelId,
target: functionId,
sourceHandle: 'parallel-start-source',
},
],
loops: {},
parallels: {
[innerParallelId]: {
id: innerParallelId,
nodes: [functionId],
count: 5,
parallelType: 'count',
},
[outerParallelId]: {
id: outerParallelId,
nodes: [innerParallelId],
count: 5,
parallelType: 'count',
},
},
}
const builder = new DAGBuilder()
const dag = builder.build(workflow)
// Outer parallel sentinel pair exists
const outerStartId = buildParallelSentinelStartId(outerParallelId)
const outerEndId = buildParallelSentinelEndId(outerParallelId)
expect(dag.nodes.has(outerStartId)).toBe(true)
expect(dag.nodes.has(outerEndId)).toBe(true)
// Inner parallel sentinel pair exists
const innerStartId = buildParallelSentinelStartId(innerParallelId)
const innerEndId = buildParallelSentinelEndId(innerParallelId)
expect(dag.nodes.has(innerStartId)).toBe(true)
expect(dag.nodes.has(innerEndId)).toBe(true)
// Function 1 branch template node exists
const funcTemplateId = buildBranchNodeId(functionId, 0)
expect(dag.nodes.has(funcTemplateId)).toBe(true)
// Start → outer-sentinel-start
const startNode = dag.nodes.get('start')!
const startTargets = Array.from(startNode.outgoingEdges.values()).map((e) => e.target)
expect(startTargets).toContain(outerStartId)
// Outer-sentinel-start → inner-sentinel-start
const outerStart = dag.nodes.get(outerStartId)!
const outerStartTargets = Array.from(outerStart.outgoingEdges.values()).map((e) => e.target)
expect(outerStartTargets).toContain(innerStartId)
// Inner-sentinel-start → function branch template
const innerStart = dag.nodes.get(innerStartId)!
const innerStartTargets = Array.from(innerStart.outgoingEdges.values()).map((e) => e.target)
expect(innerStartTargets).toContain(funcTemplateId)
// Function branch template → inner-sentinel-end
const funcTemplate = dag.nodes.get(funcTemplateId)!
const funcTargets = Array.from(funcTemplate.outgoingEdges.values()).map((e) => e.target)
expect(funcTargets).toContain(innerEndId)
// Inner-sentinel-end → outer-sentinel-end
const innerEnd = dag.nodes.get(innerEndId)!
const innerEndTargets = Array.from(innerEnd.outgoingEdges.values()).map((e) => e.target)
expect(innerEndTargets).toContain(outerEndId)
})
})
describe('DAGBuilder human-in-the-loop transformation', () => {
it('creates trigger nodes and rewires edges for pause blocks', () => {
const workflow: SerializedWorkflow = {

View File

@@ -8,7 +8,7 @@ import type { DAGEdge, NodeMetadata } from '@/executor/dag/types'
import {
buildParallelSentinelStartId,
buildSentinelStartId,
extractBaseBlockId,
normalizeNodeId,
} from '@/executor/utils/subflow-utils'
import type {
SerializedBlock,
@@ -156,7 +156,7 @@ export class DAGBuilder {
}
const hasConnections = Array.from(sentinelStartNode.outgoingEdges.values()).some((edge) =>
nodes.includes(extractBaseBlockId(edge.target))
nodes.includes(normalizeNodeId(edge.target))
)
if (!hasConnections) {

View File

@@ -1102,4 +1102,595 @@ describe('EdgeConstructor', () => {
})
})
})
describe('Nested loop wiring', () => {
it('should wire inner loop sentinels into outer loop sentinel chain', () => {
const outerLoopId = 'outer-loop'
const innerLoopId = 'inner-loop'
const functionId = 'func-1'
const innerFunctionId = 'func-2'
const outerSentinelStart = `loop-${outerLoopId}-sentinel-start`
const outerSentinelEnd = `loop-${outerLoopId}-sentinel-end`
const innerSentinelStart = `loop-${innerLoopId}-sentinel-start`
const innerSentinelEnd = `loop-${innerLoopId}-sentinel-end`
const outerLoop: SerializedLoop = {
id: outerLoopId,
nodes: [functionId, innerLoopId],
iterations: 5,
loopType: 'for',
}
const innerLoop: SerializedLoop = {
id: innerLoopId,
nodes: [innerFunctionId],
iterations: 3,
loopType: 'for',
}
const dag = createMockDAG([
functionId,
innerFunctionId,
outerSentinelStart,
outerSentinelEnd,
innerSentinelStart,
innerSentinelEnd,
])
dag.loopConfigs.set(outerLoopId, outerLoop)
dag.loopConfigs.set(innerLoopId, innerLoop)
const workflow = createMockWorkflow(
[
createMockBlock(functionId),
createMockBlock(innerFunctionId),
createMockBlock(innerLoopId, 'loop'),
],
[{ source: functionId, target: innerLoopId }],
{ [outerLoopId]: outerLoop, [innerLoopId]: innerLoop }
)
edgeConstructor.execute(
workflow,
dag,
new Set(),
new Set([functionId, innerLoopId, innerFunctionId]),
new Set([
functionId,
innerFunctionId,
innerLoopId,
outerSentinelStart,
outerSentinelEnd,
innerSentinelStart,
innerSentinelEnd,
]),
new Map()
)
const outerStartNode = dag.nodes.get(outerSentinelStart)!
const outerStartTargets = Array.from(outerStartNode.outgoingEdges.values()).map(
(e) => e.target
)
expect(outerStartTargets).toContain(functionId)
const funcNode = dag.nodes.get(functionId)!
const funcTargets = Array.from(funcNode.outgoingEdges.values()).map((e) => e.target)
expect(funcTargets).toContain(innerSentinelStart)
const innerEndNode = dag.nodes.get(innerSentinelEnd)!
const innerEndEdges = Array.from(innerEndNode.outgoingEdges.values())
const exitEdge = innerEndEdges.find((e) => e.target === outerSentinelEnd)
expect(exitEdge).toBeDefined()
expect(exitEdge!.sourceHandle).toBe('loop_exit')
const backEdge = innerEndEdges.find((e) => e.target === innerSentinelStart)
expect(backEdge).toBeDefined()
expect(backEdge!.sourceHandle).toBe('loop_continue')
const outerEndNode = dag.nodes.get(outerSentinelEnd)!
const outerBackEdge = Array.from(outerEndNode.outgoingEdges.values()).find(
(e) => e.target === outerSentinelStart
)
expect(outerBackEdge).toBeDefined()
expect(outerBackEdge!.sourceHandle).toBe('loop_continue')
})
it('should correctly identify boundary nodes when inner loop is the only node', () => {
const outerLoopId = 'outer-loop'
const innerLoopId = 'inner-loop'
const innerFunctionId = 'func-inner'
const outerSentinelStart = `loop-${outerLoopId}-sentinel-start`
const outerSentinelEnd = `loop-${outerLoopId}-sentinel-end`
const innerSentinelStart = `loop-${innerLoopId}-sentinel-start`
const innerSentinelEnd = `loop-${innerLoopId}-sentinel-end`
const outerLoop: SerializedLoop = {
id: outerLoopId,
nodes: [innerLoopId],
iterations: 2,
loopType: 'for',
}
const innerLoop: SerializedLoop = {
id: innerLoopId,
nodes: [innerFunctionId],
iterations: 3,
loopType: 'for',
}
const dag = createMockDAG([
innerFunctionId,
outerSentinelStart,
outerSentinelEnd,
innerSentinelStart,
innerSentinelEnd,
])
dag.loopConfigs.set(outerLoopId, outerLoop)
dag.loopConfigs.set(innerLoopId, innerLoop)
const workflow = createMockWorkflow(
[createMockBlock(innerFunctionId), createMockBlock(innerLoopId, 'loop')],
[],
{ [outerLoopId]: outerLoop, [innerLoopId]: innerLoop }
)
edgeConstructor.execute(
workflow,
dag,
new Set(),
new Set([innerLoopId, innerFunctionId]),
new Set([
innerFunctionId,
innerLoopId,
outerSentinelStart,
outerSentinelEnd,
innerSentinelStart,
innerSentinelEnd,
]),
new Map()
)
const outerStartNode = dag.nodes.get(outerSentinelStart)!
const outerStartTargets = Array.from(outerStartNode.outgoingEdges.values()).map(
(e) => e.target
)
expect(outerStartTargets).toContain(innerSentinelStart)
const innerEndNode = dag.nodes.get(innerSentinelEnd)!
const exitEdge = Array.from(innerEndNode.outgoingEdges.values()).find(
(e) => e.target === outerSentinelEnd
)
expect(exitEdge).toBeDefined()
expect(exitEdge!.sourceHandle).toBe('loop_exit')
})
it('should not drop intra-loop edges when target is a nested loop block', () => {
const outerLoopId = 'outer-loop'
const innerLoopId = 'inner-loop'
const functionId = 'func-1'
const innerFunctionId = 'func-2'
const outerSentinelStart = `loop-${outerLoopId}-sentinel-start`
const outerSentinelEnd = `loop-${outerLoopId}-sentinel-end`
const innerSentinelStart = `loop-${innerLoopId}-sentinel-start`
const innerSentinelEnd = `loop-${innerLoopId}-sentinel-end`
const outerLoop: SerializedLoop = {
id: outerLoopId,
nodes: [functionId, innerLoopId],
iterations: 5,
loopType: 'for',
}
const innerLoop: SerializedLoop = {
id: innerLoopId,
nodes: [innerFunctionId],
iterations: 3,
loopType: 'for',
}
const dag = createMockDAG([
functionId,
innerFunctionId,
outerSentinelStart,
outerSentinelEnd,
innerSentinelStart,
innerSentinelEnd,
])
dag.loopConfigs.set(outerLoopId, outerLoop)
dag.loopConfigs.set(innerLoopId, innerLoop)
const workflow = createMockWorkflow(
[
createMockBlock(functionId),
createMockBlock(innerFunctionId),
createMockBlock(innerLoopId, 'loop'),
],
[{ source: functionId, target: innerLoopId }],
{ [outerLoopId]: outerLoop, [innerLoopId]: innerLoop }
)
edgeConstructor.execute(
workflow,
dag,
new Set(),
new Set([functionId, innerLoopId, innerFunctionId]),
new Set([
functionId,
innerFunctionId,
innerLoopId,
outerSentinelStart,
outerSentinelEnd,
innerSentinelStart,
innerSentinelEnd,
]),
new Map()
)
const funcNode = dag.nodes.get(functionId)!
const edgeToInnerStart = Array.from(funcNode.outgoingEdges.values()).find(
(e) => e.target === innerSentinelStart
)
expect(edgeToInnerStart).toBeDefined()
const innerStartNode = dag.nodes.get(innerSentinelStart)!
expect(innerStartNode.incomingEdges.has(functionId)).toBe(true)
})
})
describe('Nested parallel wiring', () => {
it('should wire inner parallel sentinels into outer parallel sentinel chain', () => {
const outerParallelId = 'outer-parallel'
const innerParallelId = 'inner-parallel'
const functionId = 'func-1'
const outerSentinelStart = `parallel-${outerParallelId}-sentinel-start`
const outerSentinelEnd = `parallel-${outerParallelId}-sentinel-end`
const innerSentinelStart = `parallel-${innerParallelId}-sentinel-start`
const innerSentinelEnd = `parallel-${innerParallelId}-sentinel-end`
const funcTemplate = `${functionId}₍0₎`
const dag = createMockDAG([
outerSentinelStart,
outerSentinelEnd,
innerSentinelStart,
innerSentinelEnd,
funcTemplate,
])
// Set up sentinel metadata
dag.nodes.get(outerSentinelStart)!.metadata = {
isSentinel: true,
isParallelSentinel: true,
sentinelType: 'start',
parallelId: outerParallelId,
}
dag.nodes.get(outerSentinelEnd)!.metadata = {
isSentinel: true,
isParallelSentinel: true,
sentinelType: 'end',
parallelId: outerParallelId,
}
dag.nodes.get(innerSentinelStart)!.metadata = {
isSentinel: true,
isParallelSentinel: true,
sentinelType: 'start',
parallelId: innerParallelId,
}
dag.nodes.get(innerSentinelEnd)!.metadata = {
isSentinel: true,
isParallelSentinel: true,
sentinelType: 'end',
parallelId: innerParallelId,
}
dag.nodes.get(funcTemplate)!.metadata = {
isParallelBranch: true,
parallelId: innerParallelId,
branchIndex: 0,
branchTotal: 1,
originalBlockId: functionId,
}
dag.parallelConfigs.set(outerParallelId, {
id: outerParallelId,
nodes: [innerParallelId],
count: 3,
parallelType: 'count',
})
dag.parallelConfigs.set(innerParallelId, {
id: innerParallelId,
nodes: [functionId],
count: 2,
parallelType: 'count',
})
const workflow = createMockWorkflow(
[createMockBlock(functionId)],
[
// Outer parallel start → inner parallel (intra-parallel, skipped by wireRegularEdges)
{
source: outerParallelId,
target: innerParallelId,
sourceHandle: 'parallel-start-source',
},
// Inner parallel start → function (intra-parallel, skipped by wireRegularEdges)
{
source: innerParallelId,
target: functionId,
sourceHandle: 'parallel-start-source',
},
]
)
const edgeConstructor = new EdgeConstructor()
edgeConstructor.execute(
workflow,
dag,
new Set([innerParallelId, functionId]),
new Set(),
new Set([outerParallelId, innerParallelId, functionId]),
new Map()
)
// Outer sentinel-start → inner sentinel-start
const outerStartNode = dag.nodes.get(outerSentinelStart)!
const edgeToInnerStart = Array.from(outerStartNode.outgoingEdges.values()).find(
(e) => e.target === innerSentinelStart
)
expect(edgeToInnerStart).toBeDefined()
// Inner sentinel-end → outer sentinel-end
const innerEndNode = dag.nodes.get(innerSentinelEnd)!
const edgeToOuterEnd = Array.from(innerEndNode.outgoingEdges.values()).find(
(e) => e.target === outerSentinelEnd
)
expect(edgeToOuterEnd).toBeDefined()
// Inner sentinel-start → func template
const innerStartNode = dag.nodes.get(innerSentinelStart)!
const edgeToFunc = Array.from(innerStartNode.outgoingEdges.values()).find(
(e) => e.target === funcTemplate
)
expect(edgeToFunc).toBeDefined()
// Func template → inner sentinel-end
const funcNode = dag.nodes.get(funcTemplate)!
const edgeToInnerEnd = Array.from(funcNode.outgoingEdges.values()).find(
(e) => e.target === innerSentinelEnd
)
expect(edgeToInnerEnd).toBeDefined()
})
it('should wire parallel-in-loop sentinels correctly', () => {
const loopId = 'outer-loop'
const innerParallelId = 'inner-parallel'
const functionId = 'func-1'
const loopSentinelStart = `loop-${loopId}-sentinel-start`
const loopSentinelEnd = `loop-${loopId}-sentinel-end`
const parallelSentinelStart = `parallel-${innerParallelId}-sentinel-start`
const parallelSentinelEnd = `parallel-${innerParallelId}-sentinel-end`
const funcTemplate = `${functionId}₍0₎`
const dag = createMockDAG([
loopSentinelStart,
loopSentinelEnd,
parallelSentinelStart,
parallelSentinelEnd,
funcTemplate,
])
dag.nodes.get(loopSentinelStart)!.metadata = {
isSentinel: true,
sentinelType: 'start',
loopId,
}
dag.nodes.get(loopSentinelEnd)!.metadata = {
isSentinel: true,
sentinelType: 'end',
loopId,
}
dag.nodes.get(parallelSentinelStart)!.metadata = {
isSentinel: true,
isParallelSentinel: true,
sentinelType: 'start',
parallelId: innerParallelId,
}
dag.nodes.get(parallelSentinelEnd)!.metadata = {
isSentinel: true,
isParallelSentinel: true,
sentinelType: 'end',
parallelId: innerParallelId,
}
dag.nodes.get(funcTemplate)!.metadata = {
isParallelBranch: true,
parallelId: innerParallelId,
branchIndex: 0,
branchTotal: 1,
originalBlockId: functionId,
}
const outerLoop: SerializedLoop = {
id: loopId,
nodes: [innerParallelId],
iterations: 5,
loopType: 'for',
}
dag.loopConfigs.set(loopId, outerLoop)
dag.parallelConfigs.set(innerParallelId, {
id: innerParallelId,
nodes: [functionId],
count: 2,
parallelType: 'count',
})
const workflow = createMockWorkflow(
[createMockBlock(functionId)],
[
{
source: loopId,
target: innerParallelId,
sourceHandle: 'loop-start-source',
},
{
source: innerParallelId,
target: functionId,
sourceHandle: 'parallel-start-source',
},
]
)
const edgeConstructor = new EdgeConstructor()
edgeConstructor.execute(
workflow,
dag,
new Set([functionId]),
new Set([innerParallelId]),
new Set([loopId, innerParallelId, functionId]),
new Map()
)
// Loop sentinel-start → parallel sentinel-start
const loopStartNode = dag.nodes.get(loopSentinelStart)!
const edgeToParallelStart = Array.from(loopStartNode.outgoingEdges.values()).find(
(e) => e.target === parallelSentinelStart
)
expect(edgeToParallelStart).toBeDefined()
// Parallel sentinel-end → loop sentinel-end
const parallelEndNode = dag.nodes.get(parallelSentinelEnd)!
const edgeToLoopEnd = Array.from(parallelEndNode.outgoingEdges.values()).find(
(e) => e.target === loopSentinelEnd
)
expect(edgeToLoopEnd).toBeDefined()
// Inner parallel wiring: sentinel-start → func, func → sentinel-end
const parallelStartNode = dag.nodes.get(parallelSentinelStart)!
expect(
Array.from(parallelStartNode.outgoingEdges.values()).some((e) => e.target === funcTemplate)
).toBe(true)
const funcNode = dag.nodes.get(funcTemplate)!
expect(
Array.from(funcNode.outgoingEdges.values()).some((e) => e.target === parallelSentinelEnd)
).toBe(true)
})
it('should wire loop-in-parallel with correct exit handles', () => {
const outerParallelId = 'outer-parallel'
const innerLoopId = 'inner-loop'
const functionId = 'func-1'
const outerSentinelStart = `parallel-${outerParallelId}-sentinel-start`
const outerSentinelEnd = `parallel-${outerParallelId}-sentinel-end`
const innerSentinelStart = `loop-${innerLoopId}-sentinel-start`
const innerSentinelEnd = `loop-${innerLoopId}-sentinel-end`
const dag = createMockDAG([
outerSentinelStart,
outerSentinelEnd,
innerSentinelStart,
innerSentinelEnd,
functionId,
])
dag.nodes.get(outerSentinelStart)!.metadata = {
isSentinel: true,
isParallelSentinel: true,
sentinelType: 'start',
parallelId: outerParallelId,
}
dag.nodes.get(outerSentinelEnd)!.metadata = {
isSentinel: true,
isParallelSentinel: true,
sentinelType: 'end',
parallelId: outerParallelId,
}
dag.nodes.get(innerSentinelStart)!.metadata = {
isSentinel: true,
sentinelType: 'start',
loopId: innerLoopId,
}
dag.nodes.get(innerSentinelEnd)!.metadata = {
isSentinel: true,
sentinelType: 'end',
loopId: innerLoopId,
}
const innerLoop: SerializedLoop = {
id: innerLoopId,
nodes: [functionId],
iterations: 3,
loopType: 'for',
}
dag.loopConfigs.set(innerLoopId, innerLoop)
dag.parallelConfigs.set(outerParallelId, {
id: outerParallelId,
nodes: [innerLoopId],
count: 2,
parallelType: 'count',
})
const workflow = createMockWorkflow(
[createMockBlock(functionId), createMockBlock(innerLoopId, 'loop')],
[
{
source: outerParallelId,
target: innerLoopId,
sourceHandle: 'parallel-start-source',
},
{
source: innerLoopId,
target: functionId,
sourceHandle: 'loop-start-source',
},
],
{ [innerLoopId]: innerLoop }
)
const edgeConstructor = new EdgeConstructor()
edgeConstructor.execute(
workflow,
dag,
new Set([innerLoopId]),
new Set([functionId]),
new Set([outerParallelId, innerLoopId, functionId]),
new Map()
)
// Outer sentinel-start → inner loop sentinel-start
const outerStartNode = dag.nodes.get(outerSentinelStart)!
const edgeToInnerStart = Array.from(outerStartNode.outgoingEdges.values()).find(
(e) => e.target === innerSentinelStart
)
expect(edgeToInnerStart).toBeDefined()
// Inner loop sentinel-end → outer parallel sentinel-end with loop_exit handle
const innerEndNode = dag.nodes.get(innerSentinelEnd)!
const edgeToOuterEnd = Array.from(innerEndNode.outgoingEdges.values()).find(
(e) => e.target === outerSentinelEnd
)
expect(edgeToOuterEnd).toBeDefined()
expect(edgeToOuterEnd!.sourceHandle).toBe('loop_exit')
// Inner loop back-edge: sentinel-end → sentinel-start with loop_continue handle
const backEdge = Array.from(innerEndNode.outgoingEdges.values()).find(
(e) => e.target === innerSentinelStart
)
expect(backEdge).toBeDefined()
expect(backEdge!.sourceHandle).toBe('loop_continue')
// Inner loop wiring: sentinel-start → function
const innerStartNode = dag.nodes.get(innerSentinelStart)!
expect(
Array.from(innerStartNode.outgoingEdges.values()).some((e) => e.target === functionId)
).toBe(true)
// Function → inner loop sentinel-end
const funcNode = dag.nodes.get(functionId)!
expect(
Array.from(funcNode.outgoingEdges.values()).some((e) => e.target === innerSentinelEnd)
).toBe(true)
})
})
})

View File

@@ -5,14 +5,14 @@ import {
isRouterBlockType,
isRouterV2BlockType,
} from '@/executor/constants'
import type { DAG } from '@/executor/dag/builder'
import type { DAG, DAGNode } from '@/executor/dag/builder'
import {
buildBranchNodeId,
buildParallelSentinelEndId,
buildParallelSentinelStartId,
buildSentinelEndId,
buildSentinelStartId,
extractBaseBlockId,
normalizeNodeId,
} from '@/executor/utils/subflow-utils'
import type { SerializedWorkflow } from '@/serializer/types'
@@ -62,7 +62,7 @@ export class EdgeConstructor {
pauseTriggerMapping
)
this.wireLoopSentinels(dag, reachableBlocks)
this.wireLoopSentinels(dag)
this.wireParallelSentinels(dag)
}
@@ -242,6 +242,11 @@ export class EdgeConstructor {
}
if (sourceIsParallelBlock) {
// Skip intra-parallel edges (start → child); handled by wireParallelSentinels
const sourceParallelNodes = dag.parallelConfigs.get(originalSource)?.nodes
if (sourceParallelNodes?.includes(originalTarget)) {
continue
}
const sentinelEndId = buildParallelSentinelEndId(originalSource)
if (!dag.nodes.has(sentinelEndId)) {
continue
@@ -258,11 +263,12 @@ export class EdgeConstructor {
target = sentinelStartId
}
if (this.edgeCrossesLoopBoundary(source, target, blocksInLoops, dag)) {
if (this.edgeCrossesLoopBoundary(originalSource, originalTarget, blocksInLoops, dag)) {
continue
}
if (loopSentinelStartId && !blocksInLoops.has(originalTarget)) {
const sourceLoopNodes = dag.loopConfigs.get(originalSource)?.nodes
if (loopSentinelStartId && !sourceLoopNodes?.includes(originalTarget)) {
this.addEdge(dag, loopSentinelStartId, target, EDGE.LOOP_EXIT, targetHandle)
}
@@ -288,7 +294,7 @@ export class EdgeConstructor {
}
}
private wireLoopSentinels(dag: DAG, reachableBlocks: Set<string>): void {
private wireLoopSentinels(dag: DAG): void {
for (const [loopId, loopConfig] of dag.loopConfigs) {
const nodes = loopConfig.nodes
@@ -301,14 +307,27 @@ export class EdgeConstructor {
continue
}
const { startNodes, terminalNodes } = this.findLoopBoundaryNodes(nodes, dag, reachableBlocks)
const { startNodes, terminalNodes } = this.findLoopBoundaryNodes(nodes, dag)
for (const startNodeId of startNodes) {
this.addEdge(dag, sentinelStartId, startNodeId)
const resolvedId = this.resolveLoopBlockToSentinelStart(startNodeId, dag)
this.addEdge(dag, sentinelStartId, resolvedId)
}
for (const terminalNodeId of terminalNodes) {
this.addEdge(dag, terminalNodeId, sentinelEndId)
const resolvedId = this.resolveLoopBlockToSentinelEnd(terminalNodeId, dag)
if (resolvedId !== terminalNodeId) {
// Use the sourceHandle that matches the nested subflow's exit route.
// Parallel sentinel-end outputs selectedRoute "parallel_exit",
// loop sentinel-end outputs "loop_exit". The edge manager only activates
// edges whose sourceHandle matches the source node's selectedRoute.
const handle = dag.parallelConfigs.has(terminalNodeId)
? EDGE.PARALLEL_EXIT
: EDGE.LOOP_EXIT
this.addEdge(dag, resolvedId, sentinelEndId, handle)
} else {
this.addEdge(dag, resolvedId, sentinelEndId)
}
}
this.addEdge(dag, sentinelEndId, sentinelStartId, EDGE.LOOP_CONTINUE, undefined, true)
@@ -331,21 +350,60 @@ export class EdgeConstructor {
const { entryNodes, terminalNodes } = this.findParallelBoundaryNodes(nodes, dag)
for (const entryNodeId of entryNodes) {
const templateNodeId = buildBranchNodeId(entryNodeId, 0)
if (dag.nodes.has(templateNodeId)) {
this.addEdge(dag, sentinelStartId, templateNodeId)
const targetId = this.resolveSubflowToSentinelStart(entryNodeId, dag)
if (dag.nodes.has(targetId)) {
this.addEdge(dag, sentinelStartId, targetId)
}
}
for (const terminalNodeId of terminalNodes) {
const templateNodeId = buildBranchNodeId(terminalNodeId, 0)
if (dag.nodes.has(templateNodeId)) {
this.addEdge(dag, templateNodeId, sentinelEndId)
const sourceId = this.resolveSubflowToSentinelEnd(terminalNodeId, dag)
if (dag.nodes.has(sourceId)) {
// Use the sourceHandle that matches the nested subflow's exit route.
// A nested loop sentinel-end outputs "loop_exit", not "parallel_exit".
const handle = dag.loopConfigs.has(terminalNodeId) ? EDGE.LOOP_EXIT : EDGE.PARALLEL_EXIT
this.addEdge(dag, sourceId, sentinelEndId, handle)
}
}
}
}
/**
* Resolves a node ID to the appropriate entry point for sentinel wiring.
* Nested parallels → their sentinel-start, nested loops → their sentinel-start,
* regular blocks → their branch template node.
*/
private resolveSubflowToSentinelStart(nodeId: string, dag: DAG): string {
if (dag.parallelConfigs.has(nodeId)) {
return buildParallelSentinelStartId(nodeId)
}
if (dag.loopConfigs.has(nodeId)) {
return buildSentinelStartId(nodeId)
}
return buildBranchNodeId(nodeId, 0)
}
/**
* Resolves a node ID to the appropriate exit point for sentinel wiring.
* Nested parallels → their sentinel-end, nested loops → their sentinel-end,
* regular blocks → their branch template node.
*/
private resolveSubflowToSentinelEnd(nodeId: string, dag: DAG): string {
if (dag.parallelConfigs.has(nodeId)) {
return buildParallelSentinelEndId(nodeId)
}
if (dag.loopConfigs.has(nodeId)) {
return buildSentinelEndId(nodeId)
}
return buildBranchNodeId(nodeId, 0)
}
/**
* Checks whether an edge crosses a loop boundary (source and target are in
* different loops, or one is inside a loop and the other is not). Uses the
* original block IDs (pre-sentinel-remapping) because `blocksInLoops` and
* `loopConfigs.nodes` reference original block IDs from the serialized workflow.
*/
private edgeCrossesLoopBoundary(
source: string,
target: string,
@@ -363,22 +421,37 @@ export class EdgeConstructor {
return false
}
let sourceLoopId: string | undefined
let targetLoopId: string | undefined
for (const [loopId, loopConfig] of dag.loopConfigs) {
if (loopConfig.nodes.includes(source)) {
sourceLoopId = loopId
}
if (loopConfig.nodes.includes(target)) {
targetLoopId = loopId
}
}
// Find the innermost loop for each block. In nested loops a block appears
// in multiple loop configs; we need the most deeply nested one.
const sourceLoopId = this.findInnermostLoop(source, dag)
const targetLoopId = this.findInnermostLoop(target, dag)
return sourceLoopId !== targetLoopId
}
/**
* Finds the innermost loop containing a block. When a block is in nested
* loops (A contains B, both list the block), returns B (the one that
* doesn't contain any other candidate loop).
*/
private findInnermostLoop(blockId: string, dag: DAG): string | undefined {
const candidates: string[] = []
for (const [loopId, loopConfig] of dag.loopConfigs) {
if (loopConfig.nodes.includes(blockId)) {
candidates.push(loopId)
}
}
if (candidates.length <= 1) return candidates[0]
return candidates.find((candidateId) =>
candidates.every((otherId) => {
if (otherId === candidateId) return true
const candidateConfig = dag.loopConfigs.get(candidateId)
return !candidateConfig?.nodes.includes(otherId)
})
)
}
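The innermost-loop selection above can be exercised in isolation. The sketch below mirrors the helper against a plain `Map` of loop configs; the loop and block names are invented for the example, and `LoopConfig` is reduced to just the `nodes` field the logic needs:

```typescript
// Sketch of innermost-loop selection: a block nested in loop B inside loop A
// appears in both configs; the innermost candidate is the one whose node list
// does not contain any other candidate loop.
type LoopConfig = { nodes: string[] }

function findInnermostLoop(
  blockId: string,
  loopConfigs: Map<string, LoopConfig>
): string | undefined {
  const candidates: string[] = []
  for (const [loopId, config] of loopConfigs) {
    if (config.nodes.includes(blockId)) candidates.push(loopId)
  }
  if (candidates.length <= 1) return candidates[0]
  // Keep the candidate whose node list contains no other candidate,
  // i.e. the loop that is not an ancestor of any other match.
  return candidates.find((candidateId) =>
    candidates.every((otherId) => {
      if (otherId === candidateId) return true
      return !loopConfigs.get(candidateId)?.nodes.includes(otherId)
    })
  )
}

// Outer loop A contains loop B, and block-x is listed in both configs.
const configs = new Map<string, LoopConfig>([
  ['loop-A', { nodes: ['loop-B', 'block-x'] }],
  ['loop-B', { nodes: ['block-x'] }],
])
console.log(findInnermostLoop('block-x', configs)) // 'loop-B'
```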
private isEdgeReachable(
source: string,
target: string,
@@ -406,24 +479,75 @@ export class EdgeConstructor {
this.addEdge(dag, sourceNodeId, targetNodeId, sourceHandle, targetHandle)
}
/**
* Resolves the DAG node to inspect for a given loop child.
* If the child is a nested subflow (loop or parallel), returns its sentinel node;
* otherwise returns the regular DAG node.
*/
private resolveLoopChildNode(
nodeId: string,
dag: DAG,
sentinel: 'start' | 'end'
): { resolvedId: string; node: DAGNode | undefined } {
if (dag.loopConfigs.has(nodeId)) {
const resolvedId =
sentinel === 'start' ? buildSentinelStartId(nodeId) : buildSentinelEndId(nodeId)
return { resolvedId, node: dag.nodes.get(resolvedId) }
}
if (dag.parallelConfigs.has(nodeId)) {
const resolvedId =
sentinel === 'start'
? buildParallelSentinelStartId(nodeId)
: buildParallelSentinelEndId(nodeId)
return { resolvedId, node: dag.nodes.get(resolvedId) }
}
return { resolvedId: nodeId, node: dag.nodes.get(nodeId) }
}
private resolveLoopBlockToSentinelStart(nodeId: string, dag: DAG): string {
return this.resolveLoopChildNode(nodeId, dag, 'start').resolvedId
}
private resolveLoopBlockToSentinelEnd(nodeId: string, dag: DAG): string {
return this.resolveLoopChildNode(nodeId, dag, 'end').resolvedId
}
/**
* Builds the set of effective DAG node IDs for a loop's children,
* mapping nested subflow block IDs (loops and parallels) to their sentinel IDs.
*/
private buildEffectiveNodeSet(nodes: string[], dag: DAG): Set<string> {
const effective = new Set<string>()
for (const nodeId of nodes) {
if (dag.loopConfigs.has(nodeId)) {
effective.add(buildSentinelStartId(nodeId))
effective.add(buildSentinelEndId(nodeId))
} else if (dag.parallelConfigs.has(nodeId)) {
effective.add(buildParallelSentinelStartId(nodeId))
effective.add(buildParallelSentinelEndId(nodeId))
} else {
effective.add(nodeId)
}
}
return effective
}
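The effective-node-set mapping can be sketched standalone. This version substitutes plain ID sets for the `loopConfigs`/`parallelConfigs` maps and inlines the sentinel ID formats that the tests earlier in this diff use (`loop-<id>-sentinel-start`, `parallel-<id>-sentinel-start`); treat the helper name and signature as illustrative:

```typescript
// Sketch: mapping a loop's children to their effective DAG node IDs.
// Nested subflows are represented by their sentinel pair; plain blocks
// by their own ID.
function effectiveNodeSet(
  nodes: string[],
  loopIds: Set<string>,
  parallelIds: Set<string>
): Set<string> {
  const effective = new Set<string>()
  for (const id of nodes) {
    if (loopIds.has(id)) {
      effective.add(`loop-${id}-sentinel-start`)
      effective.add(`loop-${id}-sentinel-end`)
    } else if (parallelIds.has(id)) {
      effective.add(`parallel-${id}-sentinel-start`)
      effective.add(`parallel-${id}-sentinel-end`)
    } else {
      effective.add(id)
    }
  }
  return effective
}

// A loop containing a plain block and a nested loop yields three IDs:
// the block itself plus the nested loop's two sentinels.
const set = effectiveNodeSet(['func-1', 'inner-loop'], new Set(['inner-loop']), new Set())
```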
private findLoopBoundaryNodes(
nodes: string[],
dag: DAG,
reachableBlocks: Set<string>
dag: DAG
): { startNodes: string[]; terminalNodes: string[] } {
const nodesSet = new Set(nodes)
const effectiveNodeSet = this.buildEffectiveNodeSet(nodes, dag)
const startNodesSet = new Set<string>()
const terminalNodesSet = new Set<string>()
for (const nodeId of nodes) {
const node = dag.nodes.get(nodeId)
const { node } = this.resolveLoopChildNode(nodeId, dag, 'start')
if (!node) continue
let hasIncomingFromLoop = false
for (const incomingNodeId of node.incomingEdges) {
if (nodesSet.has(incomingNodeId)) {
if (effectiveNodeSet.has(incomingNodeId)) {
hasIncomingFromLoop = true
break
}
@@ -435,14 +559,17 @@ export class EdgeConstructor {
}
for (const nodeId of nodes) {
const node = dag.nodes.get(nodeId)
const { node } = this.resolveLoopChildNode(nodeId, dag, 'end')
if (!node) continue
let hasOutgoingToLoop = false
for (const [, edge] of node.outgoingEdges) {
const isBackEdge =
edge.sourceHandle === EDGE.LOOP_CONTINUE || edge.sourceHandle === EDGE.LOOP_CONTINUE_ALT
if (isBackEdge) continue
for (const [_, edge] of node.outgoingEdges) {
if (nodesSet.has(edge.target)) {
if (effectiveNodeSet.has(edge.target)) {
hasOutgoingToLoop = true
break
}
@@ -468,39 +595,77 @@ export class EdgeConstructor {
const terminalNodes: string[] = []
for (const nodeId of nodes) {
const templateId = buildBranchNodeId(nodeId, 0)
const templateNode = dag.nodes.get(templateId)
// For nested subflow containers, use their sentinel nodes for boundary detection
const { startNode, endNode } = this.resolveParallelChildNodes(nodeId, dag)
if (!templateNode) continue
if (!startNode && !endNode) continue
let hasIncomingFromParallel = false
for (const incomingNodeId of templateNode.incomingEdges) {
const originalNodeId = extractBaseBlockId(incomingNodeId)
if (nodesSet.has(originalNodeId)) {
hasIncomingFromParallel = true
break
// Entry detection: check if the start-facing node has incoming edges from within the parallel
if (startNode) {
let hasIncomingFromParallel = false
for (const incomingNodeId of startNode.incomingEdges) {
const originalNodeId = normalizeNodeId(incomingNodeId)
if (nodesSet.has(originalNodeId)) {
hasIncomingFromParallel = true
break
}
}
if (!hasIncomingFromParallel) {
entryNodes.push(nodeId)
}
}
if (!hasIncomingFromParallel) {
entryNodes.push(nodeId)
}
let hasOutgoingToParallel = false
for (const [, edge] of templateNode.outgoingEdges) {
const originalTargetId = extractBaseBlockId(edge.target)
if (nodesSet.has(originalTargetId)) {
hasOutgoingToParallel = true
break
// Terminal detection: check if the end-facing node has outgoing edges to within the parallel
if (endNode) {
let hasOutgoingToParallel = false
for (const [, edge] of endNode.outgoingEdges) {
// Skip loop back-edges — they don't count as forward edges within the parallel
const isBackEdge =
edge.sourceHandle === EDGE.LOOP_CONTINUE || edge.sourceHandle === EDGE.LOOP_CONTINUE_ALT
if (isBackEdge) continue
const originalTargetId = normalizeNodeId(edge.target)
if (nodesSet.has(originalTargetId)) {
hasOutgoingToParallel = true
break
}
}
if (!hasOutgoingToParallel) {
terminalNodes.push(nodeId)
}
}
if (!hasOutgoingToParallel) {
terminalNodes.push(nodeId)
}
}
return { entryNodes, terminalNodes }
}
/**
* Resolves a child node inside a parallel to the correct DAG nodes for boundary detection.
* For regular blocks, returns the branch template node for both start and end.
* For nested parallels, returns the inner parallel's sentinel-start and sentinel-end.
* For nested loops, returns the inner loop's sentinel-start and sentinel-end.
*/
private resolveParallelChildNodes(
nodeId: string,
dag: DAG
): { startNode: DAGNode | undefined; endNode: DAGNode | undefined } {
if (dag.parallelConfigs.has(nodeId)) {
return {
startNode: dag.nodes.get(buildParallelSentinelStartId(nodeId)),
endNode: dag.nodes.get(buildParallelSentinelEndId(nodeId)),
}
}
if (dag.loopConfigs.has(nodeId)) {
return {
startNode: dag.nodes.get(buildSentinelStartId(nodeId)),
endNode: dag.nodes.get(buildSentinelEndId(nodeId)),
}
}
// Regular block — use branch template node for both
const templateNode = dag.nodes.get(buildBranchNodeId(nodeId, 0))
return { startNode: templateNode, endNode: templateNode }
}
private getParallelId(blockId: string, dag: DAG): string | null {
for (const [parallelId, parallelConfig] of dag.parallelConfigs) {
if (parallelConfig.nodes.includes(blockId)) {

View File

@@ -20,7 +20,7 @@ import { ChildWorkflowError } from '@/executor/errors/child-workflow-error'
import type {
BlockStateWriter,
ContextExtensions,
IterationContext,
WorkflowNodeMetadata,
} from '@/executor/execution/types'
import {
generatePauseContextId,
@@ -36,11 +36,14 @@ import {
} from '@/executor/types'
import { streamingResponseFormatProcessor } from '@/executor/utils'
import { buildBlockExecutionError, normalizeError } from '@/executor/utils/errors'
import {
buildUnifiedParentIterations,
getIterationContext,
} from '@/executor/utils/iteration-context'
import { isJSONString } from '@/executor/utils/json'
import { filterOutputForLog } from '@/executor/utils/output-filter'
import type { VariableResolver } from '@/executor/variables/resolver'
import type { SerializedBlock } from '@/serializer/types'
import type { SubflowType } from '@/stores/workflows/workflow/types'
import { SYSTEM_SUBBLOCK_IDS } from '@/triggers/constants'
const logger = createLogger('BlockExecutor')
@@ -169,9 +172,10 @@ export class BlockExecutor {
this.state.setBlockOutput(node.id, normalizedOutput, duration)
if (!isSentinel && blockLog) {
const childWorkflowInstanceId = normalizedOutput._childWorkflowInstanceId as
| string
| undefined
const childWorkflowInstanceId =
typeof normalizedOutput._childWorkflowInstanceId === 'string'
? normalizedOutput._childWorkflowInstanceId
: undefined
const displayOutput = filterOutputForLog(block.metadata?.id || '', normalizedOutput, {
block,
})
@@ -205,15 +209,7 @@ export class BlockExecutor {
}
}
private buildNodeMetadata(node: DAGNode): {
nodeId: string
loopId?: string
parallelId?: string
branchIndex?: number
branchTotal?: number
originalBlockId?: string
isLoopNode?: boolean
} {
private buildNodeMetadata(node: DAGNode): WorkflowNodeMetadata {
const metadata = node?.metadata ?? {}
return {
nodeId: node.id,
@@ -367,6 +363,11 @@ export class BlockExecutor {
}
}
const containerId = parallelId ?? loopId
const parentIterations = containerId
? buildUnifiedParentIterations(ctx, containerId)
: undefined
return {
blockId,
blockName,
@@ -379,6 +380,7 @@ export class BlockExecutor {
loopId,
parallelId,
iterationIndex,
...(parentIterations?.length && { parentIterations }),
}
}
@@ -447,7 +449,7 @@ export class BlockExecutor {
const blockName = block.metadata?.name ?? blockId
const blockType = block.metadata?.id ?? DEFAULTS.BLOCK_TYPE
const iterationContext = this.getIterationContext(ctx, node)
const iterationContext = getIterationContext(ctx, node?.metadata)
if (this.contextExtensions.onBlockStart) {
this.contextExtensions.onBlockStart(
@@ -477,7 +479,7 @@ export class BlockExecutor {
const blockName = block.metadata?.name ?? blockId
const blockType = block.metadata?.id ?? DEFAULTS.BLOCK_TYPE
const iterationContext = this.getIterationContext(ctx, node)
const iterationContext = getIterationContext(ctx, node?.metadata)
if (this.contextExtensions.onBlockComplete) {
this.contextExtensions.onBlockComplete(
@@ -499,47 +501,6 @@ export class BlockExecutor {
}
}
private createIterationContext(
iterationCurrent: number,
iterationType: SubflowType,
iterationContainerId?: string,
iterationTotal?: number
): IterationContext {
return {
iterationCurrent,
iterationTotal,
iterationType,
iterationContainerId,
}
}
private getIterationContext(ctx: ExecutionContext, node: DAGNode): IterationContext | undefined {
if (!node?.metadata) return undefined
if (node.metadata.branchIndex !== undefined && node.metadata.branchTotal !== undefined) {
return this.createIterationContext(
node.metadata.branchIndex,
'parallel',
node.metadata.parallelId,
node.metadata.branchTotal
)
}
if (node.metadata.isLoopNode && node.metadata.loopId) {
const loopScope = ctx.loopExecutions?.get(node.metadata.loopId)
if (loopScope && loopScope.iteration !== undefined) {
return this.createIterationContext(
loopScope.iteration,
'loop',
node.metadata.loopId,
loopScope.maxIterations
)
}
}
return undefined
}
private preparePauseResumeSelfReference(
ctx: ExecutionContext,
node: DAGNode,

View File

@@ -1,5 +1,6 @@
import { createLogger } from '@sim/logger'
import { StartBlockPath } from '@/lib/workflows/triggers/triggers'
import type { DAG } from '@/executor/dag/builder'
import { DAGBuilder } from '@/executor/dag/builder'
import { BlockExecutor } from '@/executor/execution/block-executor'
import { EdgeManager } from '@/executor/execution/edge-manager'
@@ -32,6 +33,7 @@ import {
} from '@/executor/utils/subflow-utils'
import { VariableResolver } from '@/executor/variables/resolver'
import type { SerializedWorkflow } from '@/serializer/types'
import type { SubflowType } from '@/stores/workflows/workflow/types'
const logger = createLogger('DAGExecutor')
@@ -67,25 +69,9 @@ export class DAGExecutor {
savedIncomingEdges,
})
const { context, state } = this.createExecutionContext(workflowId, triggerBlockId)
context.subflowParentMap = this.buildSubflowParentMap(dag)
const resolver = new VariableResolver(this.workflow, this.workflowVariables, state)
const loopOrchestrator = new LoopOrchestrator(dag, state, resolver)
loopOrchestrator.setContextExtensions(this.contextExtensions)
const parallelOrchestrator = new ParallelOrchestrator(dag, state)
parallelOrchestrator.setResolver(resolver)
parallelOrchestrator.setContextExtensions(this.contextExtensions)
const allHandlers = createBlockHandlers()
const blockExecutor = new BlockExecutor(allHandlers, resolver, this.contextExtensions, state)
const edgeManager = new EdgeManager(dag)
loopOrchestrator.setEdgeManager(edgeManager)
const nodeOrchestrator = new NodeExecutionOrchestrator(
dag,
state,
blockExecutor,
loopOrchestrator,
parallelOrchestrator
)
const engine = new ExecutionEngine(context, dag, edgeManager, nodeOrchestrator)
const engine = this.buildExecutionPipeline(context, dag, state)
return await engine.run(triggerBlockId)
}
@@ -208,17 +194,30 @@ export class DAGExecutor {
snapshotState: filteredSnapshot,
runFromBlockContext,
})
context.subflowParentMap = this.buildSubflowParentMap(dag)
const engine = this.buildExecutionPipeline(context, dag, state)
return await engine.run()
}
private buildExecutionPipeline(context: ExecutionContext, dag: DAG, state: ExecutionState) {
const resolver = new VariableResolver(this.workflow, this.workflowVariables, state)
const loopOrchestrator = new LoopOrchestrator(dag, state, resolver)
loopOrchestrator.setContextExtensions(this.contextExtensions)
const parallelOrchestrator = new ParallelOrchestrator(dag, state)
parallelOrchestrator.setResolver(resolver)
parallelOrchestrator.setContextExtensions(this.contextExtensions)
const allHandlers = createBlockHandlers()
const blockExecutor = new BlockExecutor(allHandlers, resolver, this.contextExtensions, state)
const edgeManager = new EdgeManager(dag)
loopOrchestrator.setEdgeManager(edgeManager)
const loopOrchestrator = new LoopOrchestrator(
dag,
state,
resolver,
this.contextExtensions,
edgeManager
)
const parallelOrchestrator = new ParallelOrchestrator(
dag,
state,
resolver,
this.contextExtensions
)
const nodeOrchestrator = new NodeExecutionOrchestrator(
dag,
state,
@@ -226,9 +225,7 @@ export class DAGExecutor {
loopOrchestrator,
parallelOrchestrator
)
const engine = new ExecutionEngine(context, dag, edgeManager, nodeOrchestrator)
return await engine.run()
return new ExecutionEngine(context, dag, edgeManager, nodeOrchestrator)
}
private createExecutionContext(
@@ -371,6 +368,37 @@ export class DAGExecutor {
return { context, state }
}
/**
* Builds a unified child-subflow → parent-subflow mapping that covers all nesting
* combinations: loop-in-loop, parallel-in-parallel, loop-in-parallel, parallel-in-loop.
* Used by the iteration context builder to walk the full ancestor chain for SSE events.
*/
private buildSubflowParentMap(
dag: DAG
): Map<string, { parentId: string; parentType: SubflowType }> {
const parentMap = new Map<string, { parentId: string; parentType: SubflowType }>()
// Scan loop configs: children can be loops or parallels
for (const [loopId, config] of dag.loopConfigs) {
for (const nodeId of config.nodes) {
if (dag.loopConfigs.has(nodeId) || dag.parallelConfigs.has(nodeId)) {
parentMap.set(nodeId, { parentId: loopId, parentType: 'loop' })
}
}
}
// Scan parallel configs: children can be parallels or loops
for (const [parallelId, config] of dag.parallelConfigs) {
for (const nodeId of config.nodes ?? []) {
if (dag.parallelConfigs.has(nodeId) || dag.loopConfigs.has(nodeId)) {
parentMap.set(nodeId, { parentId: parallelId, parentType: 'parallel' })
}
}
}
return parentMap
}
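The parent map built above is consumed by walking child entries up to the root. A minimal sketch of that ancestor walk, assuming the same `{ parentId, parentType }` entry shape (the walking function itself is hypothetical, not part of this diff):

```typescript
// Sketch: walking the ancestor chain produced by a subflow parent map.
// Given child -> { parentId, parentType } entries, collect every enclosing
// subflow by following parentId links until no entry remains.
type SubflowType = 'loop' | 'parallel'
type ParentEntry = { parentId: string; parentType: SubflowType }

function ancestorChain(
  containerId: string,
  parentMap: Map<string, ParentEntry>
): ParentEntry[] {
  const chain: ParentEntry[] = []
  let current = parentMap.get(containerId)
  while (current) {
    chain.push(current)
    current = parentMap.get(current.parentId)
  }
  return chain
}

// loop-in-parallel-in-loop nesting (IDs invented for the example):
const parents = new Map<string, ParentEntry>([
  ['inner-loop', { parentId: 'mid-parallel', parentType: 'parallel' }],
  ['mid-parallel', { parentId: 'outer-loop', parentType: 'loop' }],
])
console.log(ancestorChain('inner-loop', parents).map((p) => p.parentId))
// ['mid-parallel', 'outer-loop']
```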
private initializeStarterBlock(
context: ExecutionContext,
state: ExecutionState,

View File

@@ -1,5 +1,11 @@
import type { Edge } from 'reactflow'
import type { BlockLog, BlockState, NormalizedBlockOutput } from '@/executor/types'
import type { NodeMetadata } from '@/executor/dag/types'
import type {
BlockLog,
BlockState,
NormalizedBlockOutput,
StreamingExecution,
} from '@/executor/types'
import type { RunFromBlockContext } from '@/executor/utils/run-from-block'
import type { SubflowType } from '@/stores/workflows/workflow/types'
@@ -49,11 +55,45 @@ export interface SerializableExecutionState {
completedPauseContexts?: string[]
}
/**
* Represents the iteration state of an ancestor subflow in a nested chain.
* Used to propagate parent iteration context through SSE events for both
* loop-in-loop and parallel-in-parallel nesting hierarchies.
*/
export interface ParentIteration {
iterationCurrent: number
iterationTotal?: number
iterationType: SubflowType
iterationContainerId: string
}
export interface IterationContext {
iterationCurrent: number
iterationTotal?: number
iterationType: SubflowType
/**
* Block ID of the loop or parallel container owning this iteration.
* Optional because generic `<loop.index>` references may resolve before
* the container ID is known (e.g., via `context.loopScope` fallback).
* Always present on {@link ParentIteration} entries since those are built
* from fully resolved ancestor loops.
*/
iterationContainerId?: string
parentIterations?: ParentIteration[]
}
/**
* Metadata passed to block handlers that execute within subflow contexts
* (loops, parallels, child workflows). Extends the DAG node metadata with
* runtime identifiers needed for execution tracking.
*/
export interface WorkflowNodeMetadata
extends Pick<
NodeMetadata,
'loopId' | 'parallelId' | 'branchIndex' | 'branchTotal' | 'originalBlockId' | 'isLoopNode'
> {
nodeId: string
executionOrder?: number
}
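The types above compose as nested data. A hypothetical `IterationContext` for a block on iteration 1 of a loop that sits inside a parallel branch might look like this (all IDs and counts are invented; the interfaces are restated locally so the snippet is self-contained):

```typescript
// Illustrative IterationContext for loop-in-parallel nesting.
type SubflowType = 'loop' | 'parallel'

interface ParentIteration {
  iterationCurrent: number
  iterationTotal?: number
  iterationType: SubflowType
  iterationContainerId: string
}

interface IterationContext {
  iterationCurrent: number
  iterationTotal?: number
  iterationType: SubflowType
  iterationContainerId?: string
  parentIterations?: ParentIteration[]
}

const ctx: IterationContext = {
  iterationCurrent: 1,
  iterationTotal: 3,
  iterationType: 'loop',
  iterationContainerId: 'inner-loop',
  // The enclosing parallel's iteration state, carried alongside the
  // block's own loop iteration for SSE consumers.
  parentIterations: [
    {
      iterationCurrent: 0,
      iterationTotal: 2,
      iterationType: 'parallel',
      iterationContainerId: 'outer-parallel',
    },
  ],
}
console.log(ctx.parentIterations?.length) // 1
```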
export interface ChildWorkflowContext {
@@ -68,7 +108,7 @@ export interface ChildWorkflowContext {
}
export interface ExecutionCallbacks {
onStream?: (streamingExec: any) => Promise<void>
onStream?: (streamingExec: StreamingExecution) => Promise<void>
onBlockStart?: (
blockId: string,
blockName: string,
@@ -122,7 +162,7 @@ export interface ContextExtensions {
abortSignal?: AbortSignal
includeFileBase64?: boolean
base64MaxBytes?: number
onStream?: (streamingExecution: unknown) => Promise<void>
onStream?: (streamingExecution: StreamingExecution) => Promise<void>
onBlockStart?: (
blockId: string,
blockName: string,

View File

@@ -186,7 +186,7 @@ describe('AgentBlockHandler', () => {
})
})
mockTransformBlockTool.mockImplementation((tool: any) => ({
mockTransformBlockTool.mockImplementation((tool: { id?: string; operation?: string }) => ({
id: `transformed_${tool.id}`,
name: `${tool.id}_${tool.operation}`,
description: 'Transformed tool',
@@ -341,9 +341,15 @@ describe('AgentBlockHandler', () => {
expect(tools.length).toBe(2)
const autoTool = tools.find((t: any) => t.name === 'auto_tool')
const forceTool = tools.find((t: any) => t.name === 'force_tool')
const noneTool = tools.find((t: any) => t.name === 'none_tool')
const autoTool = tools.find(
(t: { name?: string; id?: string; usageControl?: string }) => t.name === 'auto_tool'
)
const forceTool = tools.find(
(t: { name?: string; id?: string; usageControl?: string }) => t.name === 'force_tool'
)
const noneTool = tools.find(
(t: { name?: string; id?: string; usageControl?: string }) => t.name === 'none_tool'
)
expect(autoTool).toBeDefined()
expect(forceTool).toBeDefined()
@@ -392,7 +398,9 @@ describe('AgentBlockHandler', () => {
expect(requestBody.tools.length).toBe(2)
const toolIds = requestBody.tools.map((t: any) => t.id)
const toolIds = requestBody.tools.map(
(t: { name?: string; id?: string; usageControl?: string }) => t.id
)
expect(toolIds).toContain('transformed_tool_1')
expect(toolIds).toContain('transformed_tool_3')
expect(toolIds).not.toContain('transformed_tool_2')
@@ -421,7 +429,7 @@ describe('AgentBlockHandler', () => {
],
}
mockTransformBlockTool.mockImplementation((tool: any) => ({
mockTransformBlockTool.mockImplementation((tool: { id?: string; operation?: string }) => ({
id: `transformed_${tool.id}`,
name: `${tool.id}_${tool.operation}`,
description: 'Transformed tool',
@@ -502,13 +510,19 @@ describe('AgentBlockHandler', () => {
expect(requestBody.tools.length).toBe(2)
const toolNames = requestBody.tools.map((t: any) => t.name)
const toolNames = requestBody.tools.map(
(t: { name?: string; id?: string; usageControl?: string }) => t.name
)
expect(toolNames).toContain('custom_tool_auto')
expect(toolNames).toContain('custom_tool_force')
expect(toolNames).not.toContain('custom_tool_none')
const autoTool = requestBody.tools.find((t: any) => t.name === 'custom_tool_auto')
const forceTool = requestBody.tools.find((t: any) => t.name === 'custom_tool_force')
const autoTool = requestBody.tools.find(
(t: { name?: string; id?: string; usageControl?: string }) => t.name === 'custom_tool_auto'
)
const forceTool = requestBody.tools.find(
(t: { name?: string; id?: string; usageControl?: string }) => t.name === 'custom_tool_force'
)
expect(autoTool.usageControl).toBe('auto')
expect(forceTool.usageControl).toBe('force')
@@ -1473,7 +1487,7 @@ describe('AgentBlockHandler', () => {
timing: { total: 200 },
})
mockTransformBlockTool.mockImplementation((tool: any) => ({
mockTransformBlockTool.mockImplementation((tool: { id?: string; operation?: string }) => ({
id: tool.schema?.function?.name || `mcp-${tool.title.toLowerCase().replace(' ', '-')}`,
name: tool.schema?.function?.name || tool.title,
description: tool.schema?.function?.description || `MCP tool: ${tool.title}`,

View File

@@ -357,6 +357,7 @@ export class AgentBlockHandler implements BlockHandler {
})
if (!response.ok) {
await response.text().catch(() => {})
logger.error(`Failed to fetch custom tools: ${response.status}`)
return null
}
@@ -590,12 +591,15 @@ export class AgentBlockHandler implements BlockHandler {
serverId: string
toolName: string
description: string
schema: any
userProvidedParams: Record<string, any>
usageControl?: string
}): Promise<any> {
schema: Record<string, unknown>
userProvidedParams: Record<string, unknown>
usageControl?: 'auto' | 'force' | 'none'
}) {
const { filterSchemaForLLM } = await import('@/tools/params')
const filteredSchema = filterSchemaForLLM(config.schema, config.userProvidedParams)
const filteredSchema = filterSchemaForLLM(
config.schema as unknown as Parameters<typeof filterSchemaForLLM>[0],
config.userProvidedParams as Record<string, unknown>
)
const toolId = createMcpToolId(config.serverId, config.toolName)
return {

View File

@@ -116,21 +116,22 @@ export class Memory {
ctx: ExecutionContext,
inputs: AgentInputs
): ReadableStream<Uint8Array> {
let accumulatedContent = ''
const chunks: string[] = []
const decoder = new TextDecoder()
const transformStream = new TransformStream<Uint8Array, Uint8Array>({
transform: (chunk, controller) => {
controller.enqueue(chunk)
const decoded = decoder.decode(chunk, { stream: true })
accumulatedContent += decoded
chunks.push(decoded)
},
flush: () => {
if (accumulatedContent.trim()) {
const content = chunks.join('')
if (content.trim()) {
this.appendToMemory(ctx, inputs, {
role: 'assistant',
content: accumulatedContent,
content,
}).catch((error) => logger.error('Failed to persist streaming response:', error))
}
},
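The Memory change above swaps per-chunk string concatenation for a `chunks` array joined once at flush, avoiding repeated full-string copies on long streams. The pattern can be sketched as a standalone pass-through transform (the `onComplete` callback stands in for `appendToMemory` and is an assumption of this sketch):

```typescript
// Sketch of the chunk-accumulation pattern: forward streamed bytes unchanged
// while collecting decoded text, joining once at flush instead of building
// the string incrementally.
function accumulatingTransform(
  onComplete: (content: string) => void
): TransformStream<Uint8Array, Uint8Array> {
  const chunks: string[] = []
  const decoder = new TextDecoder()
  return new TransformStream<Uint8Array, Uint8Array>({
    transform(chunk, controller) {
      controller.enqueue(chunk) // pass the bytes through untouched
      chunks.push(decoder.decode(chunk, { stream: true }))
    },
    flush() {
      const content = chunks.join('')
      if (content.trim()) onComplete(content)
    },
  })
}
```

A caller would insert it with `readable.pipeThrough(accumulatingTransform(cb))`; downstream consumers see the original byte stream while the joined text is delivered exactly once when the stream closes.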

View File

@@ -142,6 +142,7 @@ describe('WorkflowBlockHandler', () => {
ok: false,
status: 404,
statusText: 'Not Found',
text: () => Promise.resolve(''),
})
await expect(handler.execute(mockContext, mockBlock, inputs)).rejects.toThrow(
@@ -168,6 +169,7 @@ describe('WorkflowBlockHandler', () => {
ok: false,
status: 404,
statusText: 'Not Found',
text: () => Promise.resolve(''),
})
const result = await (handler as any).loadChildWorkflow(workflowId)

View File

@@ -7,7 +7,7 @@ import type { BlockOutput } from '@/blocks/types'
import { Executor } from '@/executor'
import { BlockType, DEFAULTS, HTTP } from '@/executor/constants'
import { ChildWorkflowError } from '@/executor/errors/child-workflow-error'
import type { IterationContext } from '@/executor/execution/types'
import type { WorkflowNodeMetadata } from '@/executor/execution/types'
import type {
BlockHandler,
ExecutionContext,
@@ -16,6 +16,7 @@ import type {
} from '@/executor/types'
import { hasExecutionResult } from '@/executor/utils/errors'
import { buildAPIUrl, buildAuthHeaders } from '@/executor/utils/http'
import { getIterationContext } from '@/executor/utils/iteration-context'
import { parseJSON } from '@/executor/utils/json'
import { lazyCleanupInputMapping } from '@/executor/utils/lazy-cleanup'
import { Serializer } from '@/serializer'
@@ -47,41 +48,23 @@ export class WorkflowBlockHandler implements BlockHandler {
block: SerializedBlock,
inputs: Record<string, any>
): Promise<BlockOutput | StreamingExecution> {
return this._executeCore(ctx, block, inputs)
return this.executeCore(ctx, block, inputs)
}
async executeWithNode(
ctx: ExecutionContext,
block: SerializedBlock,
inputs: Record<string, any>,
nodeMetadata: {
nodeId: string
loopId?: string
parallelId?: string
branchIndex?: number
branchTotal?: number
originalBlockId?: string
isLoopNode?: boolean
executionOrder?: number
}
nodeMetadata: WorkflowNodeMetadata
): Promise<BlockOutput | StreamingExecution> {
return this._executeCore(ctx, block, inputs, nodeMetadata)
return this.executeCore(ctx, block, inputs, nodeMetadata)
}
private async _executeCore(
private async executeCore(
ctx: ExecutionContext,
block: SerializedBlock,
inputs: Record<string, any>,
nodeMetadata?: {
nodeId: string
loopId?: string
parallelId?: string
branchIndex?: number
branchTotal?: number
originalBlockId?: string
isLoopNode?: boolean
executionOrder?: number
}
nodeMetadata?: WorkflowNodeMetadata
): Promise<BlockOutput | StreamingExecution> {
logger.info(`Executing workflow block: ${block.id}`)
@@ -164,13 +147,19 @@ export class WorkflowBlockHandler implements BlockHandler {
const childDepth = (ctx.childWorkflowContext?.depth ?? 0) + 1
const shouldPropagateCallbacks = childDepth <= DEFAULTS.MAX_SSE_CHILD_DEPTH
if (!shouldPropagateCallbacks) {
logger.info('Dropping SSE callbacks beyond max child depth', {
childDepth,
maxDepth: DEFAULTS.MAX_SSE_CHILD_DEPTH,
childWorkflowName,
})
}
if (shouldPropagateCallbacks) {
const effectiveBlockId = nodeMetadata
? (nodeMetadata.originalBlockId ?? nodeMetadata.nodeId)
: block.id
const iterationContext = nodeMetadata
? this.getIterationContext(ctx, nodeMetadata)
: undefined
const iterationContext = nodeMetadata ? getIterationContext(ctx, nodeMetadata) : undefined
ctx.onChildWorkflowInstanceReady?.(
effectiveBlockId,
instanceId,
@@ -196,7 +185,7 @@ export class WorkflowBlockHandler implements BlockHandler {
...(shouldPropagateCallbacks && {
onBlockStart: ctx.onBlockStart,
onBlockComplete: ctx.onBlockComplete,
onStream: ctx.onStream as ((streamingExecution: unknown) => Promise<void>) | undefined,
onStream: ctx.onStream,
onChildWorkflowInstanceReady: ctx.onChildWorkflowInstanceReady,
childWorkflowContext: {
parentBlockId: instanceId,
@@ -268,40 +257,6 @@ export class WorkflowBlockHandler implements BlockHandler {
}
}
private getIterationContext(
ctx: ExecutionContext,
nodeMetadata: {
loopId?: string
parallelId?: string
branchIndex?: number
branchTotal?: number
isLoopNode?: boolean
}
): IterationContext | undefined {
if (nodeMetadata.branchIndex !== undefined && nodeMetadata.branchTotal !== undefined) {
return {
iterationCurrent: nodeMetadata.branchIndex,
iterationTotal: nodeMetadata.branchTotal,
iterationType: 'parallel',
iterationContainerId: nodeMetadata.parallelId,
}
}
if (nodeMetadata.isLoopNode && nodeMetadata.loopId) {
const loopScope = ctx.loopExecutions?.get(nodeMetadata.loopId)
if (loopScope && loopScope.iteration !== undefined) {
return {
iterationCurrent: loopScope.iteration,
iterationTotal: loopScope.maxIterations,
iterationType: 'loop',
iterationContainerId: nodeMetadata.loopId,
}
}
}
return undefined
}
/**
* Builds a cleaner error message for nested workflow errors.
* Parses nested error messages to extract workflow chain and root error.
@@ -375,6 +330,7 @@ export class WorkflowBlockHandler implements BlockHandler {
const response = await fetch(url.toString(), { headers })
if (!response.ok) {
await response.text().catch(() => {})
if (response.status === HTTP.STATUS.NOT_FOUND) {
logger.warn(`Child workflow ${workflowId} not found`)
return null
@@ -626,6 +582,6 @@ export class WorkflowBlockHandler implements BlockHandler {
result,
childTraceSpans: childTraceSpans || [],
_childWorkflowInstanceId: instanceId,
} as Record<string, any>
} as unknown as BlockOutput
}
}
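The diff above replaces the handler's private `getIterationContext` with a shared helper in `@/executor/utils/iteration-context`. A trimmed, self-contained sketch of that logic (the `Ctx` and `NodeMeta` shapes here are simplified stand-ins for the real `ExecutionContext` and `WorkflowNodeMetadata` types):

```typescript
interface IterationContext {
  iterationCurrent: number
  iterationTotal: number
  iterationType: 'loop' | 'parallel'
  iterationContainerId?: string
}

interface NodeMeta {
  loopId?: string
  parallelId?: string
  branchIndex?: number
  branchTotal?: number
  isLoopNode?: boolean
}

interface Ctx {
  loopExecutions?: Map<string, { iteration?: number; maxIterations: number }>
}

function getIterationContext(ctx: Ctx, meta?: NodeMeta): IterationContext | undefined {
  if (!meta) return undefined
  // Parallel branches carry their index/total directly on the node metadata.
  if (meta.branchIndex !== undefined && meta.branchTotal !== undefined) {
    return {
      iterationCurrent: meta.branchIndex,
      iterationTotal: meta.branchTotal,
      iterationType: 'parallel',
      iterationContainerId: meta.parallelId,
    }
  }
  // Loop nodes look up the live loop scope for the current iteration counter.
  if (meta.isLoopNode && meta.loopId) {
    const scope = ctx.loopExecutions?.get(meta.loopId)
    if (scope && scope.iteration !== undefined) {
      return {
        iterationCurrent: scope.iteration,
        iterationTotal: scope.maxIterations,
        iterationType: 'loop',
        iterationContainerId: meta.loopId,
      }
    }
  }
  return undefined
}
```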

View File

@@ -1,12 +1,10 @@
import { PARALLEL } from '@/executor/constants'
import type { NodeMetadata } from '@/executor/dag/types'
import type { ExecutionContext, LoopPauseScope, ParallelPauseScope } from '@/executor/types'
interface NodeMetadataLike {
interface NodeMetadataLike
extends Pick<NodeMetadata, 'loopId' | 'parallelId' | 'branchIndex' | 'branchTotal'> {
nodeId: string
loopId?: string
parallelId?: string
branchIndex?: number
branchTotal?: number
}
export function generatePauseContextId(

View File

@@ -2,7 +2,7 @@ import { createLogger } from '@sim/logger'
import { generateRequestId } from '@/lib/core/utils/request'
import { isExecutionCancelled, isRedisCancellationEnabled } from '@/lib/execution/cancellation'
import { executeInIsolatedVM } from '@/lib/execution/isolated-vm'
import { buildLoopIndexCondition, DEFAULTS, EDGE } from '@/executor/constants'
import { buildLoopIndexCondition, DEFAULTS, EDGE, PARALLEL } from '@/executor/constants'
import type { DAG } from '@/executor/dag/builder'
import type { EdgeManager } from '@/executor/execution/edge-manager'
import type { LoopScope } from '@/executor/execution/state'
@@ -13,9 +13,12 @@ import {
type NormalizedBlockOutput,
} from '@/executor/types'
import type { LoopConfigWithNodes } from '@/executor/types/loop'
import { buildContainerIterationContext } from '@/executor/utils/iteration-context'
import { replaceValidReferences } from '@/executor/utils/reference-validation'
import {
addSubflowErrorLog,
buildParallelSentinelEndId,
buildParallelSentinelStartId,
buildSentinelEndId,
buildSentinelStartId,
extractBaseBlockId,
@@ -39,23 +42,14 @@ export interface LoopContinuationResult {
}
export class LoopOrchestrator {
private edgeManager: EdgeManager | null = null
private contextExtensions: ContextExtensions | null = null
constructor(
private dag: DAG,
private state: BlockStateController,
private resolver: VariableResolver
private resolver: VariableResolver,
private contextExtensions: ContextExtensions | null = null,
private edgeManager: EdgeManager | null = null
) {}
setContextExtensions(contextExtensions: ContextExtensions): void {
this.contextExtensions = contextExtensions
}
setEdgeManager(edgeManager: EdgeManager): void {
this.edgeManager = edgeManager
}
initializeLoopScope(ctx: ExecutionContext, loopId: string): LoopScope {
const loopConfig = this.dag.loopConfigs.get(loopId) as SerializedLoop | undefined
if (!loopConfig) {
@@ -100,7 +94,7 @@ export class LoopOrchestrator {
scope.loopType = 'forEach'
let items: any[]
try {
items = this.resolveForEachItems(ctx, loopConfig.forEachItems)
items = resolveArrayInput(ctx, loopConfig.forEachItems, this.resolver)
} catch (error) {
const errorMessage = `ForEach loop resolution failed: ${error instanceof Error ? error.message : String(error)}`
logger.error(errorMessage, { loopId, forEachItems: loopConfig.forEachItems })
@@ -283,16 +277,23 @@ export class LoopOrchestrator {
const output = { results }
this.state.setBlockOutput(loopId, output, DEFAULTS.EXECUTION_TIME)
// Emit onBlockComplete for the loop container so the UI can track it
if (this.contextExtensions?.onBlockComplete) {
const now = new Date().toISOString()
this.contextExtensions.onBlockComplete(loopId, 'Loop', 'loop', {
output,
executionTime: DEFAULTS.EXECUTION_TIME,
startedAt: now,
executionOrder: getNextExecutionOrder(ctx),
endedAt: now,
})
const iterationContext = buildContainerIterationContext(ctx, loopId)
this.contextExtensions.onBlockComplete(
loopId,
'Loop',
'loop',
{
output,
executionTime: DEFAULTS.EXECUTION_TIME,
startedAt: now,
executionOrder: getNextExecutionOrder(ctx),
endedAt: now,
},
iterationContext
)
}
return {
@@ -327,21 +328,211 @@ export class LoopOrchestrator {
return result
}
clearLoopExecutionState(loopId: string): void {
const loopConfig = this.dag.loopConfigs.get(loopId) as LoopConfigWithNodes | undefined
if (!loopConfig) {
logger.warn('Loop config not found for state clearing', { loopId })
return
clearLoopExecutionState(loopId: string, ctx: ExecutionContext): void {
const allNodeIds = this.collectAllLoopNodeIds(loopId)
for (const nodeId of allNodeIds) {
this.state.unmarkExecuted(nodeId)
}
this.resetNestedLoopScopes(loopId, ctx)
this.resetNestedParallelScopes(loopId, ctx)
}
/**
* Deletes loop scopes for any nested loops so they re-initialize
* on the next outer iteration.
*/
private resetNestedLoopScopes(loopId: string, ctx: ExecutionContext): void {
const loopConfig = this.dag.loopConfigs.get(loopId) as LoopConfigWithNodes | undefined
if (!loopConfig) return
for (const nodeId of loopConfig.nodes) {
if (this.dag.loopConfigs.has(nodeId)) {
ctx.loopExecutions?.delete(nodeId)
// Delete cloned loop variants (__obranch-N and __clone*), but keep the original's
// subflowParentMap entries, which are needed for SSE iteration context.
if (ctx.loopExecutions) {
const obranchPrefix = `${nodeId}__obranch-`
const cloneSeqPrefix = `${nodeId}__clone`
for (const key of ctx.loopExecutions.keys()) {
if (key.startsWith(obranchPrefix) || key.startsWith(cloneSeqPrefix)) {
ctx.loopExecutions.delete(key)
ctx.subflowParentMap?.delete(key)
}
}
}
this.resetNestedLoopScopes(nodeId, ctx)
}
}
}
/**
* Deletes parallel scopes for any nested parallels (including cloned
* subflows with `__obranch-N` suffixes) so they re-initialize on the
* next outer loop iteration.
*/
private resetNestedParallelScopes(loopId: string, ctx: ExecutionContext): void {
const loopConfig = this.dag.loopConfigs.get(loopId) as LoopConfigWithNodes | undefined
if (!loopConfig) return
for (const nodeId of loopConfig.nodes) {
if (this.dag.parallelConfigs.has(nodeId)) {
this.deleteParallelScopeAndClones(nodeId, ctx)
} else if (this.dag.loopConfigs.has(nodeId)) {
this.resetNestedParallelScopes(nodeId, ctx)
}
}
}
/**
* Deletes a parallel scope and any cloned variants (`__obranch-N`),
* recursively handling nested subflows within the parallel.
*/
private deleteParallelScopeAndClones(parallelId: string, ctx: ExecutionContext): void {
ctx.parallelExecutions?.delete(parallelId)
// Delete cloned scopes (__obranch-N and __clone*), but keep the original's subflowParentMap entries
if (ctx.parallelExecutions) {
const obranchPrefix = `${parallelId}__obranch-`
const clonePrefix = `${parallelId}__clone`
for (const key of ctx.parallelExecutions.keys()) {
if (key.startsWith(obranchPrefix) || key.startsWith(clonePrefix)) {
ctx.parallelExecutions.delete(key)
ctx.subflowParentMap?.delete(key)
}
}
}
const parallelConfig = this.dag.parallelConfigs.get(parallelId)
if (parallelConfig?.nodes) {
for (const nodeId of parallelConfig.nodes) {
if (this.dag.parallelConfigs.has(nodeId)) {
this.deleteParallelScopeAndClones(nodeId, ctx)
} else if (this.dag.loopConfigs.has(nodeId)) {
ctx.loopExecutions?.delete(nodeId)
// Also delete cloned loop scopes (__obranch-N and __clone*) created by expandParallel
if (ctx.loopExecutions) {
const obranchPrefix = `${nodeId}__obranch-`
const cloneSeqPrefix = `${nodeId}__clone`
for (const key of ctx.loopExecutions.keys()) {
if (key.startsWith(obranchPrefix) || key.startsWith(cloneSeqPrefix)) {
ctx.loopExecutions.delete(key)
ctx.subflowParentMap?.delete(key)
}
}
}
this.resetNestedParallelScopes(nodeId, ctx)
}
}
}
}
/**
* Collects all effective DAG node IDs for a loop, recursively including
* sentinel IDs for any nested subflow blocks (loops and parallels).
*/
private collectAllLoopNodeIds(loopId: string, visited = new Set<string>()): Set<string> {
if (visited.has(loopId)) return new Set()
visited.add(loopId)
const loopConfig = this.dag.loopConfigs.get(loopId) as LoopConfigWithNodes | undefined
if (!loopConfig) return new Set()
const sentinelStartId = buildSentinelStartId(loopId)
const sentinelEndId = buildSentinelEndId(loopId)
const loopNodes = loopConfig.nodes
const result = new Set([sentinelStartId, sentinelEndId])
this.state.unmarkExecuted(sentinelStartId)
this.state.unmarkExecuted(sentinelEndId)
for (const loopNodeId of loopNodes) {
this.state.unmarkExecuted(loopNodeId)
for (const nodeId of loopConfig.nodes) {
if (this.dag.loopConfigs.has(nodeId)) {
for (const id of this.collectAllLoopNodeIds(nodeId, visited)) {
result.add(id)
}
this.collectClonedSubflowNodes(nodeId, result, visited)
} else if (this.dag.parallelConfigs.has(nodeId)) {
for (const id of this.collectAllParallelNodeIds(nodeId, visited)) {
result.add(id)
}
this.collectClonedSubflowNodes(nodeId, result, visited)
} else {
result.add(nodeId)
}
}
return result
}
/**
* Collects all effective DAG node IDs for a parallel, including
* sentinel IDs and branch template nodes, recursively handling nested subflows.
*/
private collectAllParallelNodeIds(parallelId: string, visited = new Set<string>()): Set<string> {
if (visited.has(parallelId)) return new Set()
visited.add(parallelId)
const parallelConfig = this.dag.parallelConfigs.get(parallelId)
if (!parallelConfig) return new Set()
const sentinelStartId = buildParallelSentinelStartId(parallelId)
const sentinelEndId = buildParallelSentinelEndId(parallelId)
const result = new Set([sentinelStartId, sentinelEndId])
for (const nodeId of parallelConfig.nodes) {
if (this.dag.loopConfigs.has(nodeId)) {
for (const id of this.collectAllLoopNodeIds(nodeId, visited)) {
result.add(id)
}
this.collectClonedSubflowNodes(nodeId, result, visited)
} else if (this.dag.parallelConfigs.has(nodeId)) {
for (const id of this.collectAllParallelNodeIds(nodeId, visited)) {
result.add(id)
}
this.collectClonedSubflowNodes(nodeId, result, visited)
} else {
result.add(nodeId)
this.collectAllBranchNodes(nodeId, result)
}
}
return result
}
/**
* Collects all branch nodes for a given base block ID by scanning the DAG.
* This captures dynamically created branches (1, 2, ...) beyond the template (0).
*/
private collectAllBranchNodes(baseNodeId: string, result: Set<string>): void {
const prefix = `${baseNodeId}${PARALLEL.BRANCH.PREFIX}`
for (const dagNodeId of this.dag.nodes.keys()) {
if (dagNodeId.startsWith(prefix)) {
result.add(dagNodeId)
}
}
}
/**
* Collects all cloned subflow variants (e.g., loop-1__obranch-N) and their
* descendant nodes by scanning the DAG configs.
*/
private collectClonedSubflowNodes(
originalId: string,
result: Set<string>,
visited: Set<string>
): void {
const obranchPrefix = `${originalId}__obranch-`
const clonePrefix = `${originalId}__clone`
for (const loopId of this.dag.loopConfigs.keys()) {
if (loopId.startsWith(obranchPrefix) || loopId.startsWith(clonePrefix)) {
for (const id of this.collectAllLoopNodeIds(loopId, visited)) {
result.add(id)
}
}
}
for (const parallelId of this.dag.parallelConfigs.keys()) {
if (parallelId.startsWith(obranchPrefix) || parallelId.startsWith(clonePrefix)) {
for (const id of this.collectAllParallelNodeIds(parallelId, visited)) {
result.add(id)
}
}
}
}
@@ -352,10 +543,7 @@ export class LoopOrchestrator {
return
}
const sentinelStartId = buildSentinelStartId(loopId)
const sentinelEndId = buildSentinelEndId(loopId)
const loopNodes = loopConfig.nodes
const allLoopNodeIds = new Set([sentinelStartId, sentinelEndId, ...loopNodes])
const allLoopNodeIds = this.collectAllLoopNodeIds(loopId)
if (this.edgeManager) {
this.edgeManager.clearDeactivatedEdgesForNodes(allLoopNodeIds)
@@ -365,8 +553,9 @@ export class LoopOrchestrator {
const nodeToRestore = this.dag.nodes.get(nodeId)
if (!nodeToRestore) continue
for (const [potentialSourceId, potentialSourceNode] of this.dag.nodes) {
if (!allLoopNodeIds.has(potentialSourceId)) continue
for (const potentialSourceId of allLoopNodeIds) {
const potentialSourceNode = this.dag.nodes.get(potentialSourceId)
if (!potentialSourceNode) continue
for (const [, edge] of potentialSourceNode.outgoingEdges) {
if (edge.target === nodeId) {
@@ -412,23 +601,19 @@ export class LoopOrchestrator {
return true
}
// for: skip if maxIterations is 0
if (scope.loopType === 'for') {
if (scope.maxIterations === 0) {
logger.info('For loop has 0 iterations, skipping loop body', { loopId })
// Set empty output for the loop
this.state.setBlockOutput(loopId, { results: [] }, DEFAULTS.EXECUTION_TIME)
return false
}
return true
}
// doWhile: always execute at least once
if (scope.loopType === 'doWhile') {
return true
}
// while: check condition before first iteration
if (scope.loopType === 'while') {
if (!scope.condition) {
logger.warn('No condition defined for while loop', { loopId })
@@ -448,20 +633,6 @@ export class LoopOrchestrator {
return true
}
shouldExecuteLoopNode(_ctx: ExecutionContext, _nodeId: string, _loopId: string): boolean {
return true
}
private findLoopForNode(nodeId: string): string | undefined {
for (const [loopId, config] of this.dag.loopConfigs) {
const nodes = (config as any).nodes || []
if (nodes.includes(nodeId)) {
return loopId
}
}
return undefined
}
private async evaluateWhileCondition(
ctx: ExecutionContext,
condition: string,
@@ -480,10 +651,9 @@ export class LoopOrchestrator {
const evaluatedCondition = replaceValidReferences(condition, (match) => {
const resolved = this.resolver.resolveSingleReference(ctx, '', match, scope)
logger.info('Resolved variable reference in loop condition', {
logger.debug('Resolved variable reference in loop condition', {
reference: match,
resolvedValue: resolved,
resolvedType: typeof resolved,
})
if (resolved !== undefined) {
if (typeof resolved === 'boolean' || typeof resolved === 'number') {
@@ -538,8 +708,4 @@ export class LoopOrchestrator {
return false
}
}
private resolveForEachItems(ctx: ExecutionContext, items: any): any[] {
return resolveArrayInput(ctx, items, this.resolver)
}
}
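The reset methods above all share one pattern: delete a subflow's own scope plus any cloned variants whose keys start with the `__obranch-` or `__clone` prefixes, while leaving unrelated entries intact. A minimal sketch of that pattern (`deleteClonedScopes` is a hypothetical name):

```typescript
// Scope maps accumulate keys like `loop-1__obranch-2` (per-branch clones) and
// `loop-1__clone...` (sequential clones). On reset, the original scope and
// every prefixed clone are deleted so they re-initialize on the next iteration.
function deleteClonedScopes(scopes: Map<string, unknown>, originalId: string): void {
  const obranchPrefix = `${originalId}__obranch-`
  const clonePrefix = `${originalId}__clone`
  for (const key of [...scopes.keys()]) {
    if (key.startsWith(obranchPrefix) || key.startsWith(clonePrefix)) {
      scopes.delete(key)
    }
  }
  scopes.delete(originalId) // the original scope itself also re-initializes
}
```

Because the prefixes include the trailing `__obranch-` / `__clone` separator, a sibling subflow such as `loop-10` is never matched by a scan for `loop-1`'s clones.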

View File

@@ -56,14 +56,6 @@ export class NodeExecutionOrchestrator {
this.loopOrchestrator.initializeLoopScope(ctx, loopId)
}
if (loopId && !this.loopOrchestrator.shouldExecuteLoopNode(ctx, nodeId, loopId)) {
return {
nodeId,
output: {},
isFinalOutput: false,
}
}
const parallelId = node.metadata.parallelId
if (parallelId && !this.parallelOrchestrator.getParallelScope(ctx, parallelId)) {
const parallelConfig = this.dag.parallelConfigs.get(parallelId)
@@ -276,7 +268,7 @@ export class NodeExecutionOrchestrator {
) {
const loopId = node.metadata.loopId
if (loopId) {
this.loopOrchestrator.clearLoopExecutionState(loopId)
this.loopOrchestrator.clearLoopExecutionState(loopId, ctx)
this.loopOrchestrator.restoreLoopEdges(loopId)
}
}

View File

@@ -9,6 +9,7 @@ import {
type NormalizedBlockOutput,
} from '@/executor/types'
import type { ParallelConfigWithNodes } from '@/executor/types/parallel'
import { buildContainerIterationContext } from '@/executor/utils/iteration-context'
import { ParallelExpander } from '@/executor/utils/parallel-expansion'
import {
addSubflowErrorLog,
@@ -36,23 +37,15 @@ export interface ParallelAggregationResult {
}
export class ParallelOrchestrator {
private resolver: VariableResolver | null = null
private contextExtensions: ContextExtensions | null = null
private expander = new ParallelExpander()
constructor(
private dag: DAG,
private state: BlockStateWriter
private state: BlockStateWriter,
private resolver: VariableResolver | null = null,
private contextExtensions: ContextExtensions | null = null
) {}
setResolver(resolver: VariableResolver): void {
this.resolver = resolver
}
setContextExtensions(contextExtensions: ContextExtensions): void {
this.contextExtensions = contextExtensions
}
initializeParallelScope(
ctx: ExecutionContext,
parallelId: string,
@@ -97,7 +90,6 @@ export class ParallelOrchestrator {
throw new Error(branchError)
}
// Handle empty distribution - skip parallel body
if (isEmpty || branchCount === 0) {
const scope: ParallelScope = {
parallelId,
@@ -114,7 +106,6 @@ export class ParallelOrchestrator {
}
ctx.parallelExecutions.set(parallelId, scope)
// Set empty output for the parallel
this.state.setBlockOutput(parallelId, { results: [] })
logger.info('Parallel scope initialized with empty distribution, skipping body', {
@@ -125,7 +116,56 @@ export class ParallelOrchestrator {
return scope
}
const { entryNodes } = this.expander.expandParallel(this.dag, parallelId, branchCount, items)
const { entryNodes, clonedSubflows } = this.expander.expandParallel(
this.dag,
parallelId,
branchCount,
items
)
// Register cloned subflows in the parent map so iteration context resolves correctly.
// Build a per-branch clone map so nested clones point to the cloned parent, not the original.
if (clonedSubflows.length > 0 && ctx.subflowParentMap) {
const branchCloneMaps = new Map<number, Map<string, string>>()
for (const clone of clonedSubflows) {
let map = branchCloneMaps.get(clone.outerBranchIndex)
if (!map) {
map = new Map()
branchCloneMaps.set(clone.outerBranchIndex, map)
}
map.set(clone.originalId, clone.clonedId)
}
for (const clone of clonedSubflows) {
const originalEntry = ctx.subflowParentMap.get(clone.originalId)
if (originalEntry) {
const cloneMap = branchCloneMaps.get(clone.outerBranchIndex)
const clonedParentId = cloneMap?.get(originalEntry.parentId)
if (clonedParentId) {
// Parent was also cloned — this is the original (branch 0) inside the cloned parent
ctx.subflowParentMap.set(clone.clonedId, {
parentId: clonedParentId,
parentType: originalEntry.parentType,
branchIndex: 0,
})
} else {
// Parent was not cloned — direct child of the expanding parallel
ctx.subflowParentMap.set(clone.clonedId, {
parentId: parallelId,
parentType: 'parallel',
branchIndex: clone.outerBranchIndex,
})
}
} else {
// Not in parent map — direct child of the expanding parallel
ctx.subflowParentMap.set(clone.clonedId, {
parentId: parallelId,
parentType: 'parallel',
branchIndex: clone.outerBranchIndex,
})
}
}
}
const scope: ParallelScope = {
parallelId,
@@ -210,10 +250,6 @@ export class ParallelOrchestrator {
}
private resolveDistributionItems(ctx: ExecutionContext, config: SerializedParallel): any[] {
if (config.parallelType === 'count') {
return []
}
if (
config.distribution === undefined ||
config.distribution === null ||
@@ -261,22 +297,35 @@ export class ParallelOrchestrator {
const results: NormalizedBlockOutput[][] = []
for (let i = 0; i < scope.totalBranches; i++) {
const branchOutputs = scope.branchOutputs.get(i) || []
results.push(branchOutputs)
const branchOutputs = scope.branchOutputs.get(i)
if (!branchOutputs) {
logger.warn('Missing branch output during parallel aggregation', { parallelId, branch: i })
}
results.push(branchOutputs ?? [])
}
const output = { results }
this.state.setBlockOutput(parallelId, output)
// Emit onBlockComplete for the parallel container so the UI can track it
// Emit onBlockComplete for the parallel container so the UI can track it.
// When this parallel is nested inside a parent subflow (parallel or loop), emit
// iteration context so the terminal can group this event under the parent container.
if (this.contextExtensions?.onBlockComplete) {
const now = new Date().toISOString()
this.contextExtensions.onBlockComplete(parallelId, 'Parallel', 'parallel', {
output,
executionTime: 0,
startedAt: now,
executionOrder: getNextExecutionOrder(ctx),
endedAt: now,
})
const iterationContext = buildContainerIterationContext(ctx, parallelId)
this.contextExtensions.onBlockComplete(
parallelId,
'Parallel',
'parallel',
{
output,
executionTime: 0,
startedAt: now,
executionOrder: getNextExecutionOrder(ctx),
endedAt: now,
},
iterationContext
)
}
return {

View File
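The `subflowParentMap` registration above can be sketched in isolation (`registerClones` is a hypothetical name; `ParentEntry` mirrors the map's value shape). The key decision: if a clone's original parent was itself cloned in the same outer branch, the clone points at the cloned parent (as branch 0 within it); otherwise it is a direct child of the expanding parallel.

```typescript
type ParentEntry = { parentId: string; parentType: 'loop' | 'parallel'; branchIndex?: number }

interface Clone {
  originalId: string
  clonedId: string
  outerBranchIndex: number
}

function registerClones(
  parentMap: Map<string, ParentEntry>,
  clones: Clone[],
  parallelId: string
): void {
  // Per-branch lookup: originalId -> clonedId within that outer branch.
  const branchCloneMaps = new Map<number, Map<string, string>>()
  for (const c of clones) {
    let m = branchCloneMaps.get(c.outerBranchIndex)
    if (!m) branchCloneMaps.set(c.outerBranchIndex, (m = new Map()))
    m.set(c.originalId, c.clonedId)
  }
  for (const c of clones) {
    const originalEntry = parentMap.get(c.originalId)
    const clonedParentId =
      originalEntry && branchCloneMaps.get(c.outerBranchIndex)?.get(originalEntry.parentId)
    if (originalEntry && clonedParentId) {
      // Parent was cloned too: this node is branch 0 inside the cloned parent.
      parentMap.set(c.clonedId, {
        parentId: clonedParentId,
        parentType: originalEntry.parentType,
        branchIndex: 0,
      })
    } else {
      // Direct child of the expanding parallel.
      parentMap.set(c.clonedId, {
        parentId: parallelId,
        parentType: 'parallel',
        branchIndex: c.outerBranchIndex,
      })
    }
  }
}
```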

@@ -4,10 +4,12 @@ import type { BlockOutput } from '@/blocks/types'
import type {
ChildWorkflowContext,
IterationContext,
ParentIteration,
SerializableExecutionState,
} from '@/executor/execution/types'
import type { RunFromBlockContext } from '@/executor/utils/run-from-block'
import type { SerializedBlock, SerializedWorkflow } from '@/serializer/types'
import type { SubflowType } from '@/stores/workflows/workflow/types'
export interface UserFile {
id: string
@@ -121,6 +123,8 @@ export interface BlockLog {
loopId?: string
parallelId?: string
iterationIndex?: number
/** Full ancestor iteration chain for nested subflows (outermost → innermost). */
parentIterations?: ParentIteration[]
/**
* Monotonically increasing integer (1, 2, 3, ...) for accurate block ordering.
* Generated via getNextExecutionOrder() to ensure deterministic sorting.
@@ -192,6 +196,17 @@ export interface ExecutionContext {
completedLoops: Set<string>
/**
* Unified parent map for subflow nesting (loop-in-loop, parallel-in-parallel,
* loop-in-parallel, parallel-in-loop). Maps any child subflow ID to its parent
* subflow ID and type, enabling the iteration context builder to walk the full
* ancestor chain regardless of subflow type.
*/
subflowParentMap?: Map<
string,
{ parentId: string; parentType: SubflowType; branchIndex?: number }
>
loopExecutions?: Map<
string,
{

View File

@@ -4,19 +4,19 @@ export interface JSONProperty {
id: string
name: string
type: string
value: any
value: unknown
collapsed?: boolean
}
/**
* Converts builder data (structured JSON properties) into a plain JSON object.
*/
export function convertBuilderDataToJson(builderData: JSONProperty[]): any {
export function convertBuilderDataToJson(builderData: JSONProperty[]): Record<string, unknown> {
if (!Array.isArray(builderData)) {
return {}
}
const result: any = {}
const result: Record<string, unknown> = {}
for (const prop of builderData) {
if (!prop.name || !prop.name.trim()) {
@@ -38,7 +38,7 @@ export function convertBuilderDataToJsonString(builderData: JSONProperty[]): str
return '{\n \n}'
}
const result: any = {}
const result: Record<string, unknown> = {}
for (const prop of builderData) {
if (!prop.name || !prop.name.trim()) {
@@ -55,7 +55,7 @@ export function convertBuilderDataToJsonString(builderData: JSONProperty[]): str
return jsonString
}
export function convertPropertyValue(prop: JSONProperty): any {
export function convertPropertyValue(prop: JSONProperty): unknown {
switch (prop.type) {
case 'object':
return convertObjectValue(prop.value)
@@ -72,9 +72,9 @@ export function convertPropertyValue(prop: JSONProperty): any {
}
}
function convertObjectValue(value: any): any {
function convertObjectValue(value: unknown): unknown {
if (Array.isArray(value)) {
return convertBuilderDataToJson(value)
return convertBuilderDataToJson(value as JSONProperty[])
}
if (typeof value === 'string' && !isVariableReference(value)) {
@@ -84,9 +84,9 @@ function convertObjectValue(value: any): any {
return value
}
function convertArrayValue(value: any): any {
function convertArrayValue(value: unknown): unknown {
if (Array.isArray(value)) {
return value.map((item: any) => convertArrayItem(item))
return value.map((item: unknown) => convertArrayItem(item))
}
if (typeof value === 'string' && !isVariableReference(value)) {
@@ -97,25 +97,34 @@ function convertArrayValue(value: any): any {
return value
}
function convertArrayItem(item: any): any {
if (typeof item !== 'object' || !item.type) {
function convertArrayItem(item: unknown): unknown {
if (typeof item !== 'object' || item === null || !('type' in item)) {
return item
}
if (item.type === 'object' && Array.isArray(item.value)) {
return convertBuilderDataToJson(item.value)
const record = item as Record<string, unknown>
if (typeof record.type !== 'string') {
return item
}
if (item.type === 'array' && Array.isArray(item.value)) {
return item.value.map((subItem: any) =>
typeof subItem === 'object' && subItem.type ? subItem.value : subItem
const typed = record as { type: string; value: unknown }
if (typed.type === 'object' && Array.isArray(typed.value)) {
return convertBuilderDataToJson(typed.value as JSONProperty[])
}
if (typed.type === 'array' && Array.isArray(typed.value)) {
return (typed.value as unknown[]).map((subItem: unknown) =>
typeof subItem === 'object' && subItem !== null && 'value' in subItem
? (subItem as { value: unknown }).value
: subItem
)
}
return item.value
return typed.value
}
function convertNumberValue(value: any): any {
function convertNumberValue(value: unknown): unknown {
if (isVariableReference(value)) {
return value
}
@@ -124,7 +133,7 @@ function convertNumberValue(value: any): any {
return Number.isNaN(numValue) ? value : numValue
}
function convertBooleanValue(value: any): any {
function convertBooleanValue(value: unknown): unknown {
if (isVariableReference(value)) {
return value
}
@@ -132,7 +141,7 @@ function convertBooleanValue(value: any): any {
return value === 'true' || value === true
}
function tryParseJson(jsonString: string, fallback: any): any {
function tryParseJson(jsonString: string, fallback: unknown): unknown {
try {
return JSON.parse(jsonString)
} catch {
@@ -140,7 +149,7 @@ function tryParseJson(jsonString: string, fallback: any): any {
}
}
function isVariableReference(value: any): boolean {
function isVariableReference(value: unknown): boolean {
return (
typeof value === 'string' &&
value.trim().startsWith(REFERENCE.START) &&

View File
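The converters in the JSON-builder file above can be exercised with a trimmed, self-contained sketch (assumptions: `REFERENCE.START`/`END` are the `{{`/`}}` delimiters used elsewhere in the editor; `toJson` and `convertValue` are hypothetical condensations of `convertBuilderDataToJson` and `convertPropertyValue`):

```typescript
interface JSONProperty { id: string; name: string; type: string; value: unknown }

// Strings that look like {{variable}} references are left untouched
// so they can be resolved at runtime rather than coerced here.
function isVariableReference(value: unknown): boolean {
  return typeof value === 'string' && value.trim().startsWith('{{') && value.trim().endsWith('}}')
}

function convertValue(prop: JSONProperty): unknown {
  if (isVariableReference(prop.value)) return prop.value // defer to runtime resolution
  switch (prop.type) {
    case 'number': {
      const n = Number(prop.value)
      return Number.isNaN(n) ? prop.value : n // keep the raw value if not numeric
    }
    case 'boolean':
      return prop.value === 'true' || prop.value === true
    default:
      return prop.value
  }
}

function toJson(props: JSONProperty[]): Record<string, unknown> {
  const result: Record<string, unknown> = {}
  for (const p of props) {
    if (p.name?.trim()) result[p.name] = convertValue(p) // skip unnamed rows
  }
  return result
}
```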

@@ -0,0 +1,585 @@
/**
* @vitest-environment node
*/
import { describe, expect, it } from 'vitest'
import type { ExecutionContext } from '@/executor/types'
import {
buildContainerIterationContext,
buildUnifiedParentIterations,
getIterationContext,
type IterationNodeMetadata,
} from './iteration-context'
function makeCtx(overrides: Partial<ExecutionContext> = {}): ExecutionContext {
return {
workflowId: 'wf-1',
executionId: 'exec-1',
workspaceId: 'ws-1',
userId: 'user-1',
blockStates: {},
blockLogs: [],
executedBlocks: [],
environmentVariables: {},
decisions: { router: new Map(), condition: new Map() },
completedLoops: new Set(),
activeExecutionPath: [],
executionOrder: 0,
...overrides,
} as unknown as ExecutionContext
}
describe('getIterationContext', () => {
it('returns undefined for undefined metadata', () => {
const ctx = makeCtx()
expect(getIterationContext(ctx, undefined)).toBeUndefined()
})
it('resolves parallel branch metadata', () => {
const ctx = makeCtx({
parallelExecutions: new Map([
[
'p1',
{
parallelId: 'p1',
totalBranches: 3,
branchOutputs: new Map(),
completedCount: 0,
totalExpectedNodes: 3,
},
],
]),
})
const metadata: IterationNodeMetadata = {
branchIndex: 1,
branchTotal: 3,
parallelId: 'p1',
}
const result = getIterationContext(ctx, metadata)
expect(result).toEqual({
iterationCurrent: 1,
iterationTotal: 3,
iterationType: 'parallel',
iterationContainerId: 'p1',
})
})
it('resolves loop node metadata', () => {
const ctx = makeCtx({
loopExecutions: new Map([
[
'l1',
{
iteration: 2,
maxIterations: 5,
currentIterationOutputs: new Map(),
allIterationOutputs: [],
},
],
]),
})
const metadata: IterationNodeMetadata = {
isLoopNode: true,
loopId: 'l1',
}
const result = getIterationContext(ctx, metadata)
expect(result).toEqual({
iterationCurrent: 2,
iterationTotal: 5,
iterationType: 'loop',
iterationContainerId: 'l1',
})
})
})
describe('buildUnifiedParentIterations', () => {
it('returns empty array when no parent maps exist', () => {
const ctx = makeCtx()
expect(buildUnifiedParentIterations(ctx, 'some-id')).toEqual([])
})
it('resolves loop-in-loop parent chain', () => {
const ctx = makeCtx({
subflowParentMap: new Map([['inner-loop', { parentId: 'outer-loop', parentType: 'loop' }]]),
loopExecutions: new Map([
[
'outer-loop',
{
iteration: 1,
maxIterations: 3,
currentIterationOutputs: new Map(),
allIterationOutputs: [],
},
],
]),
})
const result = buildUnifiedParentIterations(ctx, 'inner-loop')
expect(result).toEqual([
{
iterationCurrent: 1,
iterationTotal: 3,
iterationType: 'loop',
iterationContainerId: 'outer-loop',
},
])
})
it('resolves parallel-in-parallel parent chain', () => {
const ctx = makeCtx({
subflowParentMap: new Map([
['inner-p__obranch-2', { parentId: 'outer-p', parentType: 'parallel' }],
]),
parallelExecutions: new Map([
[
'outer-p',
{
parallelId: 'outer-p',
totalBranches: 4,
branchOutputs: new Map(),
completedCount: 0,
totalExpectedNodes: 4,
},
],
]),
})
const result = buildUnifiedParentIterations(ctx, 'inner-p__obranch-2')
expect(result).toEqual([
{
iterationCurrent: 2,
iterationTotal: 4,
iterationType: 'parallel',
iterationContainerId: 'outer-p',
},
])
})
it('resolves loop-in-parallel (cross-type nesting)', () => {
const ctx = makeCtx({
subflowParentMap: new Map([
['loop-1__obranch-1', { parentId: 'parallel-1', parentType: 'parallel' }],
]),
parallelExecutions: new Map([
[
'parallel-1',
{
parallelId: 'parallel-1',
totalBranches: 5,
branchOutputs: new Map(),
completedCount: 0,
totalExpectedNodes: 5,
},
],
]),
})
const result = buildUnifiedParentIterations(ctx, 'loop-1__obranch-1')
expect(result).toEqual([
{
iterationCurrent: 1,
iterationTotal: 5,
iterationType: 'parallel',
iterationContainerId: 'parallel-1',
},
])
})
it('resolves parallel-in-loop (cross-type nesting)', () => {
const ctx = makeCtx({
subflowParentMap: new Map([['parallel-1', { parentId: 'loop-1', parentType: 'loop' }]]),
loopExecutions: new Map([
[
'loop-1',
{
iteration: 3,
maxIterations: 5,
currentIterationOutputs: new Map(),
allIterationOutputs: [],
},
],
]),
})
const result = buildUnifiedParentIterations(ctx, 'parallel-1')
expect(result).toEqual([
{
iterationCurrent: 3,
iterationTotal: 5,
iterationType: 'loop',
iterationContainerId: 'loop-1',
},
])
})
it('resolves deep cross-type nesting: parallel → loop → parallel', () => {
const ctx = makeCtx({
subflowParentMap: new Map([
['inner-p', { parentId: 'mid-loop', parentType: 'loop' }],
['mid-loop', { parentId: 'outer-p', parentType: 'parallel' }],
['mid-loop__obranch-2', { parentId: 'outer-p', parentType: 'parallel' }],
]),
loopExecutions: new Map([
[
'mid-loop',
{
iteration: 1,
maxIterations: 4,
currentIterationOutputs: new Map(),
allIterationOutputs: [],
},
],
]),
parallelExecutions: new Map([
[
'outer-p',
{
parallelId: 'outer-p',
totalBranches: 3,
branchOutputs: new Map(),
completedCount: 0,
totalExpectedNodes: 3,
},
],
]),
})
const result = buildUnifiedParentIterations(ctx, 'inner-p')
expect(result).toEqual([
{
iterationCurrent: 0,
iterationTotal: 3,
iterationType: 'parallel',
iterationContainerId: 'outer-p',
},
{
iterationCurrent: 1,
iterationTotal: 4,
iterationType: 'loop',
iterationContainerId: 'mid-loop',
},
])
})
it('resolves 3-level parallel nesting with branchIndex entries', () => {
// P1 → P2 → P3, with P2__obranch-1 and P3__clone0__obranch-1
const ctx = makeCtx({
subflowParentMap: new Map([
['P2', { parentId: 'P1', parentType: 'parallel' }],
['P3', { parentId: 'P2', parentType: 'parallel' }],
['P2__obranch-1', { parentId: 'P1', parentType: 'parallel', branchIndex: 1 }],
[
'P3__clone0__obranch-1',
{ parentId: 'P2__obranch-1', parentType: 'parallel', branchIndex: 0 },
],
['P3__obranch-1', { parentId: 'P2', parentType: 'parallel', branchIndex: 1 }],
]),
parallelExecutions: new Map([
[
'P1',
{
parallelId: 'P1',
totalBranches: 2,
branchOutputs: new Map(),
completedCount: 0,
totalExpectedNodes: 2,
},
],
[
'P2',
{
parallelId: 'P2',
totalBranches: 2,
branchOutputs: new Map(),
completedCount: 0,
totalExpectedNodes: 2,
},
],
[
'P2__obranch-1',
{
parallelId: 'P2__obranch-1',
totalBranches: 2,
branchOutputs: new Map(),
completedCount: 0,
totalExpectedNodes: 2,
},
],
]),
})
// P3 (original): inside P2 branch 0, inside P1 branch 0
expect(buildUnifiedParentIterations(ctx, 'P3')).toEqual([
{
iterationCurrent: 0,
iterationTotal: 2,
iterationType: 'parallel',
iterationContainerId: 'P1',
},
{
iterationCurrent: 0,
iterationTotal: 2,
iterationType: 'parallel',
iterationContainerId: 'P2',
},
])
// P3__obranch-1 (runtime clone): inside P2 branch 1, inside P1 branch 0
expect(buildUnifiedParentIterations(ctx, 'P3__obranch-1')).toEqual([
{
iterationCurrent: 0,
iterationTotal: 2,
iterationType: 'parallel',
iterationContainerId: 'P1',
},
{
iterationCurrent: 1,
iterationTotal: 2,
iterationType: 'parallel',
iterationContainerId: 'P2',
},
])
// P3__clone0__obranch-1 (pre-expansion clone): inside P2__obranch-1 branch 0, inside P1 branch 1
expect(buildUnifiedParentIterations(ctx, 'P3__clone0__obranch-1')).toEqual([
{
iterationCurrent: 1,
iterationTotal: 2,
iterationType: 'parallel',
iterationContainerId: 'P1',
},
{
iterationCurrent: 0,
iterationTotal: 2,
iterationType: 'parallel',
iterationContainerId: 'P2__obranch-1',
},
])
})
it('includes parent iterations in getIterationContext for loop-in-parallel', () => {
const ctx = makeCtx({
subflowParentMap: new Map([
['loop-1__obranch-2', { parentId: 'parallel-1', parentType: 'parallel' }],
]),
parallelExecutions: new Map([
[
'parallel-1',
{
parallelId: 'parallel-1',
totalBranches: 5,
branchOutputs: new Map(),
completedCount: 0,
totalExpectedNodes: 5,
},
],
]),
loopExecutions: new Map([
[
'loop-1__obranch-2',
{
iteration: 3,
maxIterations: 5,
currentIterationOutputs: new Map(),
allIterationOutputs: [],
},
],
]),
})
const metadata: IterationNodeMetadata = {
isLoopNode: true,
loopId: 'loop-1__obranch-2',
}
const result = getIterationContext(ctx, metadata)
expect(result).toEqual({
iterationCurrent: 3,
iterationTotal: 5,
iterationType: 'loop',
iterationContainerId: 'loop-1__obranch-2',
parentIterations: [
{
iterationCurrent: 2,
iterationTotal: 5,
iterationType: 'parallel',
iterationContainerId: 'parallel-1',
},
],
})
})
it('includes parent iterations in getIterationContext for parallel-in-loop', () => {
const ctx = makeCtx({
subflowParentMap: new Map([['parallel-1', { parentId: 'loop-1', parentType: 'loop' }]]),
loopExecutions: new Map([
[
'loop-1',
{
iteration: 2,
maxIterations: 5,
currentIterationOutputs: new Map(),
allIterationOutputs: [],
},
],
]),
parallelExecutions: new Map([
[
'parallel-1',
{
parallelId: 'parallel-1',
totalBranches: 3,
branchOutputs: new Map(),
completedCount: 0,
totalExpectedNodes: 3,
},
],
]),
})
const metadata: IterationNodeMetadata = {
branchIndex: 1,
branchTotal: 3,
parallelId: 'parallel-1',
}
const result = getIterationContext(ctx, metadata)
expect(result).toEqual({
iterationCurrent: 1,
iterationTotal: 3,
iterationType: 'parallel',
iterationContainerId: 'parallel-1',
parentIterations: [
{
iterationCurrent: 2,
iterationTotal: 5,
iterationType: 'loop',
iterationContainerId: 'loop-1',
},
],
})
})
})
describe('buildContainerIterationContext', () => {
it('returns undefined when no parent map exists', () => {
const ctx = makeCtx()
expect(buildContainerIterationContext(ctx, 'loop-1')).toBeUndefined()
})
it('returns undefined when container is not in parent map', () => {
const ctx = makeCtx({
subflowParentMap: new Map(),
})
expect(buildContainerIterationContext(ctx, 'loop-1')).toBeUndefined()
})
it('resolves loop nested inside parallel', () => {
const ctx = makeCtx({
subflowParentMap: new Map([
['loop-1__obranch-2', { parentId: 'parallel-1', parentType: 'parallel' }],
]),
parallelExecutions: new Map([
[
'parallel-1',
{
parallelId: 'parallel-1',
totalBranches: 5,
branchOutputs: new Map(),
completedCount: 0,
totalExpectedNodes: 5,
},
],
]),
})
const result = buildContainerIterationContext(ctx, 'loop-1__obranch-2')
expect(result).toEqual({
iterationCurrent: 2,
iterationTotal: 5,
iterationType: 'parallel',
iterationContainerId: 'parallel-1',
})
})
it('resolves parallel nested inside loop', () => {
const ctx = makeCtx({
subflowParentMap: new Map([['parallel-1', { parentId: 'loop-1', parentType: 'loop' }]]),
loopExecutions: new Map([
[
'loop-1',
{
iteration: 3,
maxIterations: 10,
currentIterationOutputs: new Map(),
allIterationOutputs: [],
},
],
]),
})
const result = buildContainerIterationContext(ctx, 'parallel-1')
expect(result).toEqual({
iterationCurrent: 3,
iterationTotal: 10,
iterationType: 'loop',
iterationContainerId: 'loop-1',
})
})
it('returns undefined when parent scope is missing', () => {
const ctx = makeCtx({
subflowParentMap: new Map([['loop-1', { parentId: 'parallel-1', parentType: 'parallel' }]]),
parallelExecutions: new Map(),
})
expect(buildContainerIterationContext(ctx, 'loop-1')).toBeUndefined()
})
it('resolves pre-expansion clone with explicit branchIndex', () => {
// P1 → P2 → P3: P3__clone0__obranch-1 is pre-cloned inside P2__obranch-1
const ctx = makeCtx({
subflowParentMap: new Map([
[
'P3__clone0__obranch-1',
{ parentId: 'P2__obranch-1', parentType: 'parallel', branchIndex: 0 },
],
]),
parallelExecutions: new Map([
[
'P2__obranch-1',
{
parallelId: 'P2__obranch-1',
totalBranches: 5,
branchOutputs: new Map(),
completedCount: 0,
totalExpectedNodes: 5,
},
],
]),
})
const result = buildContainerIterationContext(ctx, 'P3__clone0__obranch-1')
expect(result).toEqual({
iterationCurrent: 0,
iterationTotal: 5,
iterationType: 'parallel',
iterationContainerId: 'P2__obranch-1',
})
})
it('uses branch index 0 for non-cloned container in parallel parent', () => {
const ctx = makeCtx({
subflowParentMap: new Map([
['inner-loop', { parentId: 'outer-parallel', parentType: 'parallel' }],
]),
parallelExecutions: new Map([
[
'outer-parallel',
{
parallelId: 'outer-parallel',
totalBranches: 3,
branchOutputs: new Map(),
completedCount: 0,
totalExpectedNodes: 3,
},
],
]),
})
const result = buildContainerIterationContext(ctx, 'inner-loop')
expect(result).toEqual({
iterationCurrent: 0,
iterationTotal: 3,
iterationType: 'parallel',
iterationContainerId: 'outer-parallel',
})
})
})


@@ -0,0 +1,179 @@
import { DEFAULTS } from '@/executor/constants'
import type { NodeMetadata } from '@/executor/dag/types'
import type { IterationContext, ParentIteration } from '@/executor/execution/types'
import type { ExecutionContext } from '@/executor/types'
import { extractOuterBranchIndex, findEffectiveContainerId } from '@/executor/utils/subflow-utils'
/** Maximum ancestor depth to prevent runaway traversal in deeply nested subflows. */
const MAX_PARENT_DEPTH = DEFAULTS.MAX_NESTING_DEPTH
/**
* Subset of {@link NodeMetadata} needed for iteration context resolution.
* Compatible with both DAGNode.metadata and inline metadata objects.
*/
export type IterationNodeMetadata = Pick<
NodeMetadata,
'loopId' | 'parallelId' | 'branchIndex' | 'branchTotal' | 'isLoopNode'
>
/**
* Resolves the iteration context for a node based on its metadata and execution state.
* Handles both parallel (branch) and loop iteration contexts, including cross-type
* nesting (loop-in-parallel, parallel-in-loop) via the unified subflow parent map.
*/
export function getIterationContext(
ctx: ExecutionContext,
metadata: IterationNodeMetadata | undefined
): IterationContext | undefined {
if (!metadata) return undefined
if (metadata.branchIndex !== undefined && metadata.branchTotal !== undefined) {
const parentIterations = metadata.parallelId
? buildUnifiedParentIterations(ctx, metadata.parallelId)
: []
return {
iterationCurrent: metadata.branchIndex,
iterationTotal: metadata.branchTotal,
iterationType: 'parallel',
iterationContainerId: metadata.parallelId,
...(parentIterations.length > 0 && { parentIterations }),
}
}
if (metadata.isLoopNode && metadata.loopId) {
const loopScope = ctx.loopExecutions?.get(metadata.loopId)
if (loopScope && loopScope.iteration !== undefined) {
const parentIterations = buildUnifiedParentIterations(ctx, metadata.loopId)
return {
iterationCurrent: loopScope.iteration,
iterationTotal: loopScope.maxIterations,
iterationType: 'loop',
iterationContainerId: metadata.loopId,
...(parentIterations.length > 0 && { parentIterations }),
}
}
}
return undefined
}
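// Worked example (illustrative, mirroring the loop-in-parallel unit test): given
// metadata { isLoopNode: true, loopId: 'loop-1__obranch-2' }, with that loop at
// iteration 3 of 5 and a parent 'parallel-1' scope of 5 branches (branch index 2),
// getIterationContext returns { iterationCurrent: 3, iterationTotal: 5,
// iterationType: 'loop', iterationContainerId: 'loop-1__obranch-2',
// parentIterations: [{ iterationCurrent: 2, iterationTotal: 5,
// iterationType: 'parallel', iterationContainerId: 'parallel-1' }] }.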
/**
* Builds a single-level iteration context for a container (loop/parallel) that is
* nested inside a parent subflow. Used by orchestrators when emitting onBlockComplete
* for container sentinel nodes.
*/
export function buildContainerIterationContext(
ctx: ExecutionContext,
containerId: string
): IterationContext | undefined {
const parentEntry = ctx.subflowParentMap?.get(containerId)
if (!parentEntry) return undefined
if (parentEntry.parentType === 'parallel') {
// Use stored parentId directly when branchIndex is available (set during expansion),
// otherwise fall back to findEffectiveContainerId for backward compatibility.
const hasBranchIndex = parentEntry.branchIndex !== undefined
const effectiveParentId = hasBranchIndex
? parentEntry.parentId
: ctx.parallelExecutions
? findEffectiveContainerId(parentEntry.parentId, containerId, ctx.parallelExecutions)
: parentEntry.parentId
const parentScope = ctx.parallelExecutions?.get(effectiveParentId)
if (parentScope) {
return {
iterationCurrent: hasBranchIndex
? parentEntry.branchIndex!
: (extractOuterBranchIndex(containerId) ?? 0),
iterationTotal: parentScope.totalBranches,
iterationType: 'parallel',
iterationContainerId: effectiveParentId,
}
}
} else if (parentEntry.parentType === 'loop') {
const effectiveParentId = ctx.loopExecutions
? findEffectiveContainerId(parentEntry.parentId, containerId, ctx.loopExecutions)
: parentEntry.parentId
const parentScope = ctx.loopExecutions?.get(effectiveParentId)
if (parentScope && parentScope.iteration !== undefined) {
return {
iterationCurrent: parentScope.iteration,
iterationTotal: parentScope.maxIterations,
iterationType: 'loop',
iterationContainerId: effectiveParentId,
}
}
}
return undefined
}
/**
* Walks the unified subflow parent map to build the full ancestor iteration chain,
* handling all nesting combinations (loop-in-loop, parallel-in-parallel,
* loop-in-parallel, parallel-in-loop).
*
* Returns an array of parent iteration contexts, ordered from outermost to innermost.
*/
export function buildUnifiedParentIterations(
ctx: ExecutionContext,
subflowId: string
): ParentIteration[] {
if (!ctx.subflowParentMap) {
return []
}
const parents: ParentIteration[] = []
const visited = new Set<string>()
let currentId = subflowId
while (
ctx.subflowParentMap.has(currentId) &&
!visited.has(currentId) &&
visited.size < MAX_PARENT_DEPTH
) {
visited.add(currentId)
const entry = ctx.subflowParentMap.get(currentId)!
const { parentId, parentType } = entry
if (parentType === 'loop') {
// Resolve the effective (possibly cloned) loop ID — at runtime the scope
// may live under a cloned ID like `mid-loop__obranch-2` rather than `mid-loop`
const effectiveParentId = ctx.loopExecutions
? findEffectiveContainerId(parentId, currentId, ctx.loopExecutions)
: parentId
const parentScope = ctx.loopExecutions?.get(effectiveParentId)
if (parentScope && parentScope.iteration !== undefined) {
parents.unshift({
iterationCurrent: parentScope.iteration,
iterationTotal: parentScope.maxIterations,
iterationType: 'loop',
iterationContainerId: effectiveParentId,
})
}
} else {
// Use stored parentId directly when branchIndex is available (set during expansion),
// otherwise fall back to findEffectiveContainerId for backward compatibility.
const hasBranchIndex = entry.branchIndex !== undefined
const effectiveParentId = hasBranchIndex
? parentId
: ctx.parallelExecutions
? findEffectiveContainerId(parentId, currentId, ctx.parallelExecutions)
: parentId
const parentScope = ctx.parallelExecutions?.get(effectiveParentId)
if (parentScope) {
const outerBranchIndex = hasBranchIndex
? entry.branchIndex!
: (extractOuterBranchIndex(currentId) ?? 0)
parents.unshift({
iterationCurrent: outerBranchIndex,
iterationTotal: parentScope.totalBranches,
iterationType: 'parallel',
iterationContainerId: effectiveParentId,
})
}
}
currentId = parentId
}
return parents
}
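// Worked example (illustrative, matching the deep cross-type nesting unit test):
// with subflowParentMap entries 'inner-p' → ('mid-loop', loop) and
// 'mid-loop' → ('outer-p', parallel), where 'mid-loop' is at iteration 1 of 4
// and 'outer-p' has 3 branches, buildUnifiedParentIterations(ctx, 'inner-p')
// walks inner-p → mid-loop and unshifts each resolved parent, returning the
// chain outermost-first:
//   [{ iterationCurrent: 0, iterationTotal: 3, iterationType: 'parallel',
//      iterationContainerId: 'outer-p' },
//    { iterationCurrent: 1, iterationTotal: 4, iterationType: 'loop',
//      iterationContainerId: 'mid-loop' }]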


@@ -0,0 +1,316 @@
/**
* @vitest-environment node
*/
import { loggerMock } from '@sim/testing'
import { describe, expect, it, vi } from 'vitest'
import { BlockType } from '@/executor/constants'
import { DAGBuilder } from '@/executor/dag/builder'
import { EdgeManager } from '@/executor/execution/edge-manager'
import { ParallelExpander } from '@/executor/utils/parallel-expansion'
import {
buildBranchNodeId,
buildParallelSentinelEndId,
buildParallelSentinelStartId,
stripCloneSuffixes,
} from '@/executor/utils/subflow-utils'
import type { SerializedBlock, SerializedWorkflow } from '@/serializer/types'
vi.mock('@sim/logger', () => loggerMock)
function createBlock(id: string, metadataId: string): SerializedBlock {
return {
id,
position: { x: 0, y: 0 },
config: { tool: 'noop', params: {} },
inputs: {},
outputs: {},
metadata: { id: metadataId, name: id },
enabled: true,
}
}
describe('Nested parallel expansion + edge resolution', () => {
it('outer parallel expansion clones inner subflow per branch and edge manager resolves correctly', () => {
const outerParallelId = 'outer-parallel'
const innerParallelId = 'inner-parallel'
const functionId = 'func-1'
const workflow: SerializedWorkflow = {
version: '1',
blocks: [
createBlock('start', BlockType.STARTER),
createBlock(outerParallelId, BlockType.PARALLEL),
createBlock(innerParallelId, BlockType.PARALLEL),
createBlock(functionId, BlockType.FUNCTION),
],
connections: [
{ source: 'start', target: outerParallelId },
{
source: outerParallelId,
target: innerParallelId,
sourceHandle: 'parallel-start-source',
},
{
source: innerParallelId,
target: functionId,
sourceHandle: 'parallel-start-source',
},
],
loops: {},
parallels: {
[innerParallelId]: {
id: innerParallelId,
nodes: [functionId],
count: 3,
parallelType: 'count',
},
[outerParallelId]: {
id: outerParallelId,
nodes: [innerParallelId],
count: 2,
parallelType: 'count',
},
},
}
// Step 1: Build the DAG
const builder = new DAGBuilder()
const dag = builder.build(workflow)
const outerStartId = buildParallelSentinelStartId(outerParallelId)
const outerEndId = buildParallelSentinelEndId(outerParallelId)
const innerStartId = buildParallelSentinelStartId(innerParallelId)
const innerEndId = buildParallelSentinelEndId(innerParallelId)
// Verify DAG construction: start → outer-sentinel-start
const startNode = dag.nodes.get('start')!
const startTargets = Array.from(startNode.outgoingEdges.values()).map((e) => e.target)
expect(startTargets).toContain(outerStartId)
// Step 2: Simulate runtime expansion of outer parallel (count=2)
const expander = new ParallelExpander()
const outerResult = expander.expandParallel(dag, outerParallelId, 2)
// After expansion, outer-sentinel-start should point to 2 entry nodes:
// branch 0 uses original inner-sentinel-start, branch 1 uses cloned sentinel
const outerStart = dag.nodes.get(outerStartId)!
const outerStartTargets = Array.from(outerStart.outgoingEdges.values()).map((e) => e.target)
expect(outerStartTargets).toHaveLength(2)
expect(outerStartTargets).toContain(innerStartId) // branch 0
// Verify cloned subflow info
expect(outerResult.clonedSubflows).toHaveLength(1)
expect(outerResult.clonedSubflows[0].originalId).toBe(innerParallelId)
expect(outerResult.clonedSubflows[0].outerBranchIndex).toBe(1)
const clonedInnerParallelId = outerResult.clonedSubflows[0].clonedId
const clonedInnerStartId = buildParallelSentinelStartId(clonedInnerParallelId)
const clonedInnerEndId = buildParallelSentinelEndId(clonedInnerParallelId)
expect(outerStartTargets).toContain(clonedInnerStartId) // branch 1
// Verify cloned parallel config was registered
expect(dag.parallelConfigs.has(clonedInnerParallelId)).toBe(true)
const clonedConfig = dag.parallelConfigs.get(clonedInnerParallelId)!
expect(clonedConfig.count).toBe(3)
expect(clonedConfig.nodes).toHaveLength(1)
// inner-sentinel-end → outer-sentinel-end (branch 0)
const innerEnd = dag.nodes.get(innerEndId)!
const innerEndTargets = Array.from(innerEnd.outgoingEdges.values()).map((e) => e.target)
expect(innerEndTargets).toContain(outerEndId)
// cloned inner sentinel-end → outer-sentinel-end (branch 1)
const clonedInnerEnd = dag.nodes.get(clonedInnerEndId)!
const clonedInnerEndTargets = Array.from(clonedInnerEnd.outgoingEdges.values()).map(
(e) => e.target
)
expect(clonedInnerEndTargets).toContain(outerEndId)
// Entry/terminal nodes from expansion
expect(outerResult.entryNodes).toContain(innerStartId)
expect(outerResult.entryNodes).toContain(clonedInnerStartId)
expect(outerResult.terminalNodes).toContain(innerEndId)
expect(outerResult.terminalNodes).toContain(clonedInnerEndId)
// Step 3: Verify edge manager resolves ready nodes after outer-sentinel-start completes
const edgeManager = new EdgeManager(dag)
const readyAfterOuterStart = edgeManager.processOutgoingEdges(
outerStart,
{ sentinelStart: true },
false
)
expect(readyAfterOuterStart).toContain(innerStartId)
expect(readyAfterOuterStart).toContain(clonedInnerStartId)
// Step 4: Expand inner parallel (branch 0's inner) with count=3
expander.expandParallel(dag, innerParallelId, 3)
// Inner sentinel-start should now point to 3 branch nodes
const innerStart = dag.nodes.get(innerStartId)!
const innerStartTargets = Array.from(innerStart.outgoingEdges.values()).map((e) => e.target)
expect(innerStartTargets).toHaveLength(3)
const branch0 = buildBranchNodeId(functionId, 0)
const branch1 = buildBranchNodeId(functionId, 1)
const branch2 = buildBranchNodeId(functionId, 2)
expect(innerStartTargets).toContain(branch0)
expect(innerStartTargets).toContain(branch1)
expect(innerStartTargets).toContain(branch2)
// Step 5: Verify edge manager resolves branch nodes after inner-sentinel-start
const readyAfterInnerStart = edgeManager.processOutgoingEdges(
innerStart,
{ sentinelStart: true },
false
)
expect(readyAfterInnerStart).toContain(branch0)
expect(readyAfterInnerStart).toContain(branch1)
expect(readyAfterInnerStart).toContain(branch2)
// Step 6: Simulate branch completions → inner-sentinel-end becomes ready
const branch0Node = dag.nodes.get(branch0)!
const branch1Node = dag.nodes.get(branch1)!
const branch2Node = dag.nodes.get(branch2)!
edgeManager.processOutgoingEdges(branch0Node, {}, false)
edgeManager.processOutgoingEdges(branch1Node, {}, false)
const readyAfterBranch2 = edgeManager.processOutgoingEdges(branch2Node, {}, false)
expect(readyAfterBranch2).toContain(innerEndId)
// Step 7: inner-sentinel-end completes → outer-sentinel-end becomes ready
// (only if both branches are done — cloned branch must also complete)
const readyAfterInnerEnd = edgeManager.processOutgoingEdges(
innerEnd,
{ sentinelEnd: true, selectedRoute: 'parallel_exit' },
false
)
// outer-sentinel-end has 2 incoming (innerEnd + clonedInnerEnd), not ready yet
expect(readyAfterInnerEnd).not.toContain(outerEndId)
// Expand and complete cloned inner parallel (branch 1's inner)
const clonedBlockId = clonedConfig.nodes![0]
expander.expandParallel(dag, clonedInnerParallelId, 3)
const clonedInnerStart = dag.nodes.get(clonedInnerStartId)!
const clonedBranch0 = buildBranchNodeId(clonedBlockId, 0)
const clonedBranch1 = buildBranchNodeId(clonedBlockId, 1)
const clonedBranch2 = buildBranchNodeId(clonedBlockId, 2)
edgeManager.processOutgoingEdges(clonedInnerStart, { sentinelStart: true }, false)
edgeManager.processOutgoingEdges(dag.nodes.get(clonedBranch0)!, {}, false)
edgeManager.processOutgoingEdges(dag.nodes.get(clonedBranch1)!, {}, false)
edgeManager.processOutgoingEdges(dag.nodes.get(clonedBranch2)!, {}, false)
const readyAfterClonedInnerEnd = edgeManager.processOutgoingEdges(
clonedInnerEnd,
{ sentinelEnd: true, selectedRoute: 'parallel_exit' },
false
)
// Now both branches done → outer-sentinel-end becomes ready
expect(readyAfterClonedInnerEnd).toContain(outerEndId)
})
it('3-level nesting: pre-expansion clone IDs do not collide with runtime expansion', () => {
const p1 = 'p1'
const p2 = 'p2'
const p3 = 'p3'
const leafBlock = 'leaf'
const workflow: SerializedWorkflow = {
version: '1',
blocks: [
createBlock('start', BlockType.STARTER),
createBlock(p1, BlockType.PARALLEL),
createBlock(p2, BlockType.PARALLEL),
createBlock(p3, BlockType.PARALLEL),
createBlock(leafBlock, BlockType.FUNCTION),
],
connections: [
{ source: 'start', target: p1 },
{ source: p1, target: p2, sourceHandle: 'parallel-start-source' },
{ source: p2, target: p3, sourceHandle: 'parallel-start-source' },
{ source: p3, target: leafBlock, sourceHandle: 'parallel-start-source' },
],
loops: {},
parallels: {
[p3]: { id: p3, nodes: [leafBlock], count: 2, parallelType: 'count' },
[p2]: { id: p2, nodes: [p3], count: 2, parallelType: 'count' },
[p1]: { id: p1, nodes: [p2], count: 2, parallelType: 'count' },
},
}
const builder = new DAGBuilder()
const dag = builder.build(workflow)
const expander = new ParallelExpander()
// Step 1: Expand P1 (outermost) — this pre-clones P2 and recursively P3
const p1Result = expander.expandParallel(dag, p1, 2)
// P1 should have cloned P2 (and recursively P3 inside it)
const p2Clone = p1Result.clonedSubflows.find((c) => c.originalId === p2)!
expect(p2Clone).toBeDefined()
expect(p2Clone.clonedId).toBe('p2__obranch-1')
// P3 should also be cloned (inside P2__obranch-1) with a __clone prefix
const p3Clone = p1Result.clonedSubflows.find((c) => c.originalId === p3)!
expect(p3Clone).toBeDefined()
expect(p3Clone.clonedId).toMatch(/^p3__clone\d+__obranch-1$/)
expect(stripCloneSuffixes(p3Clone.clonedId)).toBe('p3')
// Step 2: Expand P2 (original, branch 0 of P1) — this creates P3__obranch-1 at runtime
const p2Result = expander.expandParallel(dag, p2, 2)
// P2 should clone P3 as P3__obranch-1 (standard runtime naming)
const p3RuntimeClone = p2Result.clonedSubflows.find((c) => c.originalId === p3)!
expect(p3RuntimeClone).toBeDefined()
expect(p3RuntimeClone.clonedId).toBe('p3__obranch-1')
// Key assertion: P3__obranch-1 (runtime) !== P3__clone*__obranch-1 (pre-expansion)
expect(p3RuntimeClone.clonedId).not.toBe(p3Clone.clonedId)
// Both P3 configs should exist independently in the DAG
expect(dag.parallelConfigs.has(p3RuntimeClone.clonedId)).toBe(true)
expect(dag.parallelConfigs.has(p3Clone.clonedId)).toBe(true)
// Step 3: Expand P2__obranch-1 (cloned, branch 1 of P1)
// Its inner P3 is the pre-cloned variant P3__clone*__obranch-1
const p2ClonedConfig = dag.parallelConfigs.get(p2Clone.clonedId)!
const p3InsideP2Clone = p2ClonedConfig.nodes![0]
expect(p3InsideP2Clone).toBe(p3Clone.clonedId)
const p2CloneResult = expander.expandParallel(dag, p2Clone.clonedId, 2)
// P2__obranch-1 should clone its P3 (the pre-cloned variant) with __obranch-1 suffix
const p3DeepClone = p2CloneResult.clonedSubflows.find((c) => c.originalId === p3Clone.clonedId)!
expect(p3DeepClone).toBeDefined()
// This ID should be unique (no collision with any earlier P3 clone)
expect(dag.parallelConfigs.has(p3DeepClone.clonedId)).toBe(true)
// Step 4: Expand all P3 variants and verify no node collisions
const allP3Variants = [p3, p3RuntimeClone.clonedId, p3Clone.clonedId, p3DeepClone.clonedId]
const allLeafNodes = new Set<string>()
for (const p3Id of allP3Variants) {
const p3Config = dag.parallelConfigs.get(p3Id)!
const leafId = p3Config.nodes![0]
expander.expandParallel(dag, p3Id, 2)
// Each expansion creates branch nodes — verify they're unique
const branch0 = buildBranchNodeId(leafId, 0)
const branch1 = buildBranchNodeId(leafId, 1)
expect(dag.nodes.has(branch0)).toBe(true)
expect(dag.nodes.has(branch1)).toBe(true)
// No duplicate node IDs across all expansions
expect(allLeafNodes.has(branch0)).toBe(false)
expect(allLeafNodes.has(branch1)).toBe(false)
allLeafNodes.add(branch0)
allLeafNodes.add(branch1)
}
// 4 P3 variants × 2 branches each = 8 unique leaf nodes
expect(allLeafNodes.size).toBe(8)
})
})


@@ -1,22 +1,37 @@
import { createLogger } from '@sim/logger'
import { EDGE } from '@/executor/constants'
import type { DAG, DAGNode } from '@/executor/dag/builder'
import type { SerializedBlock } from '@/serializer/types'
import {
buildBranchNodeId,
buildClonedSubflowId,
buildParallelSentinelEndId,
buildParallelSentinelStartId,
buildSentinelEndId,
buildSentinelStartId,
extractBaseBlockId,
isLoopSentinelNodeId,
} from './subflow-utils'
const logger = createLogger('ParallelExpansion')
export interface ClonedSubflowInfo {
clonedId: string
originalId: string
outerBranchIndex: number
}
export interface ExpansionResult {
entryNodes: string[]
terminalNodes: string[]
allBranchNodes: string[]
clonedSubflows: ClonedSubflowInfo[]
}
export class ParallelExpander {
/** Monotonically increasing counter for generating unique pre-expansion clone IDs. */
private cloneSeq = 0
expandParallel(
dag: DAG,
parallelId: string,
@@ -30,13 +45,27 @@ export class ParallelExpander {
const blocksInParallel = config.nodes || []
if (blocksInParallel.length === 0) {
- return { entryNodes: [], terminalNodes: [], allBranchNodes: [] }
+ return { entryNodes: [], terminalNodes: [], allBranchNodes: [], clonedSubflows: [] }
}
const blocksSet = new Set(blocksInParallel)
const allBranchNodes: string[] = []
// Separate nested subflow containers from regular expandable blocks.
// Nested parallels/loops have sentinel nodes instead of branch template nodes,
// so they cannot be cloned per-branch like regular blocks.
const regularBlocks: string[] = []
const nestedSubflows: string[] = []
for (const blockId of blocksInParallel) {
if (dag.parallelConfigs.has(blockId) || dag.loopConfigs.has(blockId)) {
nestedSubflows.push(blockId)
} else {
regularBlocks.push(blockId)
}
}
const regularSet = new Set(regularBlocks)
const allBranchNodes: string[] = []
for (const blockId of regularBlocks) {
const templateId = buildBranchNodeId(blockId, 0)
const templateNode = dag.nodes.get(templateId)
@@ -65,14 +94,43 @@ export class ParallelExpander {
}
}
- this.wireInternalEdges(dag, blocksInParallel, blocksSet, branchCount)
+ this.wireInternalEdges(dag, regularBlocks, regularSet, branchCount)
- const { entryNodes, terminalNodes } = this.identifyBoundaryNodes(
-   dag,
-   blocksInParallel,
-   blocksSet,
-   branchCount
- )
+ const { entryNodes, terminalNodes } =
+   regularBlocks.length > 0
+     ? this.identifyBoundaryNodes(dag, regularBlocks, regularSet, branchCount)
+     : { entryNodes: [] as string[], terminalNodes: [] as string[] }
// Clone nested subflow graphs per outer branch so each branch runs independently.
// Branch 0 uses the original sentinel/template nodes; branches 1..N get full clones.
const clonedSubflows: ClonedSubflowInfo[] = []
for (const subflowId of nestedSubflows) {
const isParallel = dag.parallelConfigs.has(subflowId)
const startId = isParallel
? buildParallelSentinelStartId(subflowId)
: buildSentinelStartId(subflowId)
const endId = isParallel
? buildParallelSentinelEndId(subflowId)
: buildSentinelEndId(subflowId)
// Branch 0 uses original nodes
if (dag.nodes.has(startId)) entryNodes.push(startId)
if (dag.nodes.has(endId)) terminalNodes.push(endId)
// Branches 1..N clone the entire subflow graph (recursively for deep nesting)
for (let i = 1; i < branchCount; i++) {
const cloned = this.cloneNestedSubflow(dag, subflowId, i, clonedSubflows)
entryNodes.push(cloned.startId)
terminalNodes.push(cloned.endId)
clonedSubflows.push({
clonedId: cloned.clonedId,
originalId: subflowId,
outerBranchIndex: i,
})
}
}
this.wireSentinelEdges(dag, parallelId, entryNodes, terminalNodes, branchCount)
@@ -80,10 +138,11 @@ export class ParallelExpander {
parallelId,
branchCount,
blocksCount: blocksInParallel.length,
nestedSubflows: nestedSubflows.length,
totalNodes: allBranchNodes.length,
})
- return { entryNodes, terminalNodes, allBranchNodes }
+ return { entryNodes, terminalNodes, allBranchNodes, clonedSubflows }
}
private updateBranchMetadata(
@@ -216,6 +275,207 @@ export class ParallelExpander {
return false
}
/**
* Generates a unique clone ID for pre-expansion cloning.
*
* Pre-expansion clones use `{originalId}__clone{N}__obranch-{branchIndex}` instead
* of the plain `{originalId}__obranch-{branchIndex}` used by runtime expansion.
* The `__clone{N}` segment (from a monotonic counter) prevents naming collisions
* when the original (branch-0) subflow later expands at runtime and creates
* `{child}__obranch-{branchIndex}`.
*/
private buildPreCloneId(originalId: string, outerBranchIndex: number): string {
return `${originalId}__clone${this.cloneSeq++}__obranch-${outerBranchIndex}`
}
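// Illustration (assumes a fresh counter, matching the 3-level nesting test):
// the first pre-clone of 'p3' into outer branch 1 yields 'p3__clone0__obranch-1',
// which cannot collide with the runtime ID 'p3__obranch-1' produced when the
// branch-0 copy of 'p3' is later expanded by expandParallel.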
/**
* Clones an entire nested subflow graph for a specific outer branch.
*
* The top-level subflow gets a standard `__obranch-{N}` clone ID (needed by
* `findEffectiveContainerId` at runtime). All deeper children — both containers
* and regular blocks — receive unique `__clone{N}__obranch-{M}` IDs via
* {@link buildPreCloneId} to avoid collisions with runtime expansion.
*/
private cloneNestedSubflow(
dag: DAG,
subflowId: string,
outerBranchIndex: number,
clonedSubflows: ClonedSubflowInfo[]
): { startId: string; endId: string; clonedId: string; idMap: Map<string, string> } {
const clonedId = buildClonedSubflowId(subflowId, outerBranchIndex)
const { startId, endId, idMap } = this.cloneSubflowGraph(
dag,
subflowId,
clonedId,
outerBranchIndex,
clonedSubflows
)
return { startId, endId, clonedId, idMap }
}
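The two clone-ID formats described in the doc comments above can be sketched in isolation. This is a hedged sketch, not the expander itself: `cloneSeq` stands in for the class's monotonic counter, and the helper names mirror the diff.

```typescript
// Monotonic counter standing in for the expander's `this.cloneSeq`.
let cloneSeq = 0

// Top-level subflow clones: `{originalId}__obranch-{branchIndex}`,
// the runtime-compatible format needed by findEffectiveContainerId.
function buildClonedSubflowId(originalId: string, branchIndex: number): string {
  return `${originalId}__obranch-${branchIndex}`
}

// Deeper children get an extra `__clone{N}` segment so pre-expansion IDs
// cannot collide with IDs minted when branch 0 expands at runtime.
function buildPreCloneId(originalId: string, outerBranchIndex: number): string {
  return `${originalId}__clone${cloneSeq++}__obranch-${outerBranchIndex}`
}

console.log(buildClonedSubflowId('loop-1', 2)) // loop-1__obranch-2
console.log(buildPreCloneId('block-a', 2)) // block-a__clone0__obranch-2
console.log(buildPreCloneId('block-b', 2)) // block-b__clone1__obranch-2
```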
/**
* Core recursive cloning: duplicates a subflow's sentinels, config, child blocks,
* and DAG nodes under the given `clonedId`. Nested containers are recursively
* cloned with unique pre-clone IDs.
*/
private cloneSubflowGraph(
dag: DAG,
originalId: string,
clonedId: string,
outerBranchIndex: number,
clonedSubflows: ClonedSubflowInfo[]
): { startId: string; endId: string; idMap: Map<string, string> } {
const isParallel = dag.parallelConfigs.has(originalId)
const config = isParallel
? dag.parallelConfigs.get(originalId)!
: dag.loopConfigs.get(originalId)!
const blockIds = config.nodes || []
const idMap = new Map<string, string>()
// Map sentinel nodes
const origStartId = isParallel
? buildParallelSentinelStartId(originalId)
: buildSentinelStartId(originalId)
const origEndId = isParallel
? buildParallelSentinelEndId(originalId)
: buildSentinelEndId(originalId)
const clonedStartId = isParallel
? buildParallelSentinelStartId(clonedId)
: buildSentinelStartId(clonedId)
const clonedEndId = isParallel
? buildParallelSentinelEndId(clonedId)
: buildSentinelEndId(clonedId)
idMap.set(origStartId, clonedStartId)
idMap.set(origEndId, clonedEndId)
// Process child blocks — recurse into nested containers, remap regular blocks
const clonedBlockIds: string[] = []
for (const blockId of blockIds) {
const isNestedParallel = dag.parallelConfigs.has(blockId)
const isNestedLoop = dag.loopConfigs.has(blockId)
if (isNestedParallel || isNestedLoop) {
const nestedClonedId = this.buildPreCloneId(blockId, outerBranchIndex)
clonedBlockIds.push(nestedClonedId)
const innerResult = this.cloneSubflowGraph(
dag,
blockId,
nestedClonedId,
outerBranchIndex,
clonedSubflows
)
for (const [k, v] of innerResult.idMap) {
idMap.set(k, v)
}
clonedSubflows.push({
clonedId: nestedClonedId,
originalId: blockId,
outerBranchIndex,
})
} else {
const clonedBlockId = this.buildPreCloneId(blockId, outerBranchIndex)
clonedBlockIds.push(clonedBlockId)
if (isParallel) {
idMap.set(buildBranchNodeId(blockId, 0), buildBranchNodeId(clonedBlockId, 0))
} else {
idMap.set(blockId, clonedBlockId)
}
}
}
// Register cloned config
if (isParallel) {
dag.parallelConfigs.set(clonedId, {
...dag.parallelConfigs.get(originalId)!,
id: clonedId,
nodes: clonedBlockIds,
})
} else {
dag.loopConfigs.set(clonedId, {
...dag.loopConfigs.get(originalId)!,
id: clonedId,
nodes: clonedBlockIds,
})
}
// Clone DAG nodes (sentinels + regular blocks) with remapped edges
const origNodeIds = [origStartId, origEndId]
for (const blockId of blockIds) {
if (dag.parallelConfigs.has(blockId) || dag.loopConfigs.has(blockId)) continue
if (isParallel) {
origNodeIds.push(buildBranchNodeId(blockId, 0))
} else {
origNodeIds.push(blockId)
}
}
for (const origId of origNodeIds) {
const origNode = dag.nodes.get(origId)
if (!origNode) continue
const clonedNodeId = idMap.get(origId)!
this.cloneDAGNode(dag, origNode, clonedNodeId, clonedId, isParallel, idMap)
}
return { startId: clonedStartId, endId: clonedEndId, idMap }
}
/**
* Clones a single DAG node with remapped edges and updated metadata.
*/
private cloneDAGNode(
dag: DAG,
origNode: DAGNode,
clonedNodeId: string,
parentClonedId: string,
parentIsParallel: boolean,
idMap: Map<string, string>
): void {
const clonedOutgoing = new Map<
string,
{ target: string; sourceHandle?: string; targetHandle?: string }
>()
for (const [, edge] of origNode.outgoingEdges) {
const clonedTarget = idMap.get(edge.target) ?? edge.target
const edgeId = edge.sourceHandle
? `${clonedNodeId}${clonedTarget}-${edge.sourceHandle}`
: `${clonedNodeId}${clonedTarget}`
clonedOutgoing.set(edgeId, {
target: clonedTarget,
sourceHandle: edge.sourceHandle,
targetHandle: edge.targetHandle,
})
}
const clonedIncoming = new Set<string>()
for (const incomingId of origNode.incomingEdges) {
clonedIncoming.add(idMap.get(incomingId) ?? incomingId)
}
const metadataOverride = parentIsParallel
? { parallelId: parentClonedId }
: { loopId: parentClonedId }
dag.nodes.set(clonedNodeId, {
id: clonedNodeId,
block: { ...origNode.block, id: clonedNodeId },
incomingEdges: clonedIncoming,
outgoingEdges: clonedOutgoing,
metadata: {
...origNode.metadata,
...metadataOverride,
...(origNode.metadata.originalBlockId && {
originalBlockId: origNode.metadata.originalBlockId,
}),
},
})
}
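The edge remapping inside `cloneDAGNode` reduces to one lookup: each edge target is resolved through `idMap`, falling back to the original ID when the target lives outside the clone. A minimal standalone sketch (the `Edge` shape here is a simplification of the DAG's edge records):

```typescript
interface Edge {
  target: string
  sourceHandle?: string
}

// Maps original node IDs to their cloned counterparts.
const idMap = new Map<string, string>([['block-b', 'block-b__clone0__obranch-1']])

const outgoing: Edge[] = [
  { target: 'block-b', sourceHandle: 'source' },
  { target: 'external-block' }, // outside the clone, left as-is
]

// Remap each target through idMap, keeping the original ID when not cloned.
const remapped = outgoing.map((edge) => ({
  ...edge,
  target: idMap.get(edge.target) ?? edge.target,
}))

console.log(remapped.map((e) => e.target)) // [ 'block-b__clone0__obranch-1', 'external-block' ]
```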
private wireSentinelEdges(
dag: DAG,
parallelId: string,
@@ -247,8 +507,12 @@ export class ParallelExpander {
const terminalNode = dag.nodes.get(terminalNodeId)
if (!terminalNode) continue
const edgeId = `${terminalNodeId}${sentinelEndId}`
terminalNode.outgoingEdges.set(edgeId, { target: sentinelEndId })
const handle = isLoopSentinelNodeId(terminalNodeId) ? EDGE.LOOP_EXIT : EDGE.PARALLEL_EXIT
const edgeId = `${terminalNodeId}${sentinelEndId}-${handle}`
terminalNode.outgoingEdges.set(edgeId, {
target: sentinelEndId,
sourceHandle: handle,
})
sentinelEnd.incomingEdges.add(terminalNodeId)
}
}


@@ -1,11 +1,8 @@
import { createLogger } from '@sim/logger'
import { LOOP, PARALLEL, PARSING, REFERENCE } from '@/executor/constants'
import { LOOP, PARALLEL, REFERENCE } from '@/executor/constants'
import type { ContextExtensions } from '@/executor/execution/types'
import { type BlockLog, type ExecutionContext, getNextExecutionOrder } from '@/executor/types'
import type { VariableResolver } from '@/executor/variables/resolver'
const logger = createLogger('SubflowUtils')
const BRANCH_PATTERN = new RegExp(`${PARALLEL.BRANCH.PREFIX}\\d+${PARALLEL.BRANCH.SUFFIX}$`)
const BRANCH_INDEX_PATTERN = new RegExp(`${PARALLEL.BRANCH.PREFIX}(\\d+)${PARALLEL.BRANCH.SUFFIX}$`)
const LOOP_SENTINEL_START_PATTERN = new RegExp(
@@ -38,12 +35,17 @@ export function buildParallelSentinelEndId(parallelId: string): string {
}
export function isLoopSentinelNodeId(nodeId: string): boolean {
return nodeId.includes(LOOP.SENTINEL.START_SUFFIX) || nodeId.includes(LOOP.SENTINEL.END_SUFFIX)
return (
nodeId.startsWith(LOOP.SENTINEL.PREFIX) &&
(nodeId.endsWith(LOOP.SENTINEL.START_SUFFIX) || nodeId.endsWith(LOOP.SENTINEL.END_SUFFIX))
)
}
export function isParallelSentinelNodeId(nodeId: string): boolean {
return (
nodeId.includes(PARALLEL.SENTINEL.START_SUFFIX) || nodeId.includes(PARALLEL.SENTINEL.END_SUFFIX)
nodeId.startsWith(PARALLEL.SENTINEL.PREFIX) &&
(nodeId.endsWith(PARALLEL.SENTINEL.START_SUFFIX) ||
nodeId.endsWith(PARALLEL.SENTINEL.END_SUFFIX))
)
}
@@ -80,19 +82,82 @@ export function extractBaseBlockId(branchNodeId: string): string {
export function extractBranchIndex(branchNodeId: string): number | null {
const match = branchNodeId.match(BRANCH_INDEX_PATTERN)
return match ? Number.parseInt(match[1], PARSING.JSON_RADIX) : null
return match ? Number.parseInt(match[1], 10) : null
}
export function isBranchNodeId(nodeId: string): boolean {
return BRANCH_PATTERN.test(nodeId)
}
export function isLoopNode(nodeId: string): boolean {
return isLoopSentinelNodeId(nodeId) || nodeId.startsWith(LOOP.SENTINEL.PREFIX)
const OUTER_BRANCH_PATTERN = /__obranch-(\d+)/
const OUTER_BRANCH_STRIP_PATTERN = /__obranch-\d+/g
const CLONE_SEQ_STRIP_PATTERN = /__clone\d+/g
/**
* Extracts the outer branch index from a cloned subflow ID.
* Cloned IDs follow the pattern `{originalId}__obranch-{index}`.
* Returns undefined if the ID is not a clone.
*/
export function extractOuterBranchIndex(clonedId: string): number | undefined {
const match = clonedId.match(OUTER_BRANCH_PATTERN)
return match ? Number.parseInt(match[1], 10) : undefined
}
export function isParallelNode(nodeId: string): boolean {
return isBranchNodeId(nodeId) || isParallelSentinelNodeId(nodeId)
/**
* Strips all clone suffixes (`__obranch-N`, `__cloneN`) and branch subscripts (`₍N₎`)
* from a node ID, returning the original workflow-level block ID.
*/
export function stripCloneSuffixes(nodeId: string): string {
return extractBaseBlockId(
nodeId.replace(OUTER_BRANCH_STRIP_PATTERN, '').replace(CLONE_SEQ_STRIP_PATTERN, '')
)
}
/**
* Builds a cloned subflow ID from an original ID and outer branch index.
*/
export function buildClonedSubflowId(originalId: string, branchIndex: number): string {
return `${originalId}__obranch-${branchIndex}`
}
/**
* Strips outer-branch clone suffixes (`__obranch-N`) and clone-sequence segments
* (`__cloneN`) from an ID, returning the original workflow-level subflow ID.
*/
export function stripOuterBranchSuffix(id: string): string {
return id.replace(OUTER_BRANCH_STRIP_PATTERN, '').replace(CLONE_SEQ_STRIP_PATTERN, '')
}
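A self-contained sketch of the two strip helpers. The `__obranch`/`__clone` patterns match the diff; the `₍N₎` branch-subscript delimiters are an assumption taken from the test fixtures (e.g. `'block-1₍0₎'`), not a confirmed constant.

```typescript
const OUTER_BRANCH_STRIP_PATTERN = /__obranch-\d+/g
const CLONE_SEQ_STRIP_PATTERN = /__clone\d+/g
// Assumed branch-subscript format, inferred from test IDs like 'block-1₍0₎'.
const BRANCH_SUBSCRIPT_PATTERN = /₍\d+₎$/

function stripOuterBranchSuffix(id: string): string {
  return id.replace(OUTER_BRANCH_STRIP_PATTERN, '').replace(CLONE_SEQ_STRIP_PATTERN, '')
}

function stripCloneSuffixes(nodeId: string): string {
  // Also drops the trailing branch subscript to recover the workflow-level block ID.
  return stripOuterBranchSuffix(nodeId).replace(BRANCH_SUBSCRIPT_PATTERN, '')
}

console.log(stripOuterBranchSuffix('loop-1__obranch-2')) // loop-1
console.log(stripCloneSuffixes('block-a__clone3__obranch-1₍0₎')) // block-a
```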
/**
* Finds the effective (possibly cloned) container ID for a subflow,
* given the current node's ID and an execution map (loopExecutions or parallelExecutions).
*
* When inside a cloned subflow (e.g., loop-1__obranch-2), the execution scope is
* stored under the cloned ID, not the original. This function extracts the `__obranch-N`
* suffix from the current node ID, constructs the candidate cloned container ID, and
* checks if it exists in the execution map.
*
* Returns the effective ID (cloned or original) that exists in the map.
*/
export function findEffectiveContainerId(
originalId: string,
currentNodeId: string,
executionMap: Map<string, unknown>
): string {
// Prefer the cloned variant when currentNodeId carries an __obranch-N suffix.
// During concurrent parallel-in-loop execution both the original (branch 0)
// and cloned variants coexist in the map; the clone is the correct scope.
const match = currentNodeId.match(OUTER_BRANCH_PATTERN)
if (match) {
const candidateId = buildClonedSubflowId(originalId, Number.parseInt(match[1], 10))
if (executionMap.has(candidateId)) {
return candidateId
}
}
// Return original ID — for branch-0 (non-cloned) or when scope is missing.
// Callers handle the missing-scope case gracefully.
return originalId
}
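The lookup described in the doc comment above can be exercised standalone. This sketch reproduces `findEffectiveContainerId` from the diff; the map values stand in for real `LoopScope`/`ParallelScope` objects.

```typescript
const OUTER_BRANCH_PATTERN = /__obranch-(\d+)/

function buildClonedSubflowId(originalId: string, branchIndex: number): string {
  return `${originalId}__obranch-${branchIndex}`
}

function findEffectiveContainerId(
  originalId: string,
  currentNodeId: string,
  executionMap: Map<string, unknown>
): string {
  // Prefer the cloned variant when currentNodeId carries an __obranch-N suffix.
  const match = currentNodeId.match(OUTER_BRANCH_PATTERN)
  if (match) {
    const candidateId = buildClonedSubflowId(originalId, Number.parseInt(match[1], 10))
    if (executionMap.has(candidateId)) return candidateId
  }
  // Branch 0 (non-cloned) or missing scope: fall back to the original ID.
  return originalId
}

const scopes = new Map<string, unknown>([
  ['loop-1', { iteration: 0 }],
  ['loop-1__obranch-2', { iteration: 0 }],
])
console.log(findEffectiveContainerId('loop-1', 'block-a__obranch-2', scopes)) // loop-1__obranch-2
console.log(findEffectiveContainerId('loop-1', 'block-a', scopes)) // loop-1
```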
export function normalizeNodeId(nodeId: string): string {


@@ -7,33 +7,46 @@ import type { ResolutionContext } from './reference'
vi.mock('@sim/logger', () => loggerMock)
/**
* Creates a minimal workflow for testing.
*/
function createTestWorkflow(
loops: Record<string, { nodes: string[]; id?: string; iterations?: number }> = {}
) {
// Ensure each loop has required fields
interface LoopDef {
nodes: string[]
id?: string
iterations?: number
loopType?: 'for' | 'forEach'
}
interface BlockDef {
id: string
name: string
}
function createTestWorkflow(loops: Record<string, LoopDef> = {}, blockDefs: BlockDef[] = []) {
const normalizedLoops: Record<string, { id: string; nodes: string[]; iterations: number }> = {}
for (const [key, loop] of Object.entries(loops)) {
normalizedLoops[key] = {
id: loop.id ?? key,
nodes: loop.nodes,
iterations: loop.iterations ?? 1,
...(loop.loopType && { loopType: loop.loopType }),
}
}
const blocks = blockDefs.map((b) => ({
id: b.id,
position: { x: 0, y: 0 },
config: { tool: 'test', params: {} },
inputs: {},
outputs: {},
metadata: { id: 'function', name: b.name },
enabled: true,
}))
return {
version: '1.0',
blocks: [],
blocks,
connections: [],
loops: normalizedLoops,
parallels: {},
}
}
/**
* Creates a test loop scope.
*/
function createLoopScope(overrides: Partial<LoopScope> = {}): LoopScope {
return {
iteration: 0,
@@ -43,19 +56,19 @@ function createLoopScope(overrides: Partial<LoopScope> = {}): LoopScope {
}
}
/**
* Creates a minimal ResolutionContext for testing.
*/
function createTestContext(
currentNodeId: string,
loopScope?: LoopScope,
loopExecutions?: Map<string, LoopScope>
loopExecutions?: Map<string, LoopScope>,
blockOutputs?: Record<string, any>
): ResolutionContext {
return {
executionContext: {
loopExecutions: loopExecutions ?? new Map(),
},
executionState: {},
executionState: {
getBlockOutput: (id: string) => blockOutputs?.[id],
},
currentNodeId,
loopScope,
} as ResolutionContext
@@ -304,4 +317,127 @@ describe('LoopResolver', () => {
expect(resolver.resolve('<loop.iteration>', ctx)).toBe(2)
})
})
describe('named loop references', () => {
it.concurrent('should resolve named loop by block name', () => {
const workflow = createTestWorkflow({ 'loop-1': { nodes: ['block-1'] } }, [
{ id: 'loop-1', name: 'Loop 1' },
])
const resolver = new LoopResolver(workflow)
expect(resolver.canResolve('<loop1.index>')).toBe(true)
})
it.concurrent('should resolve index via named reference for block inside the loop', () => {
const workflow = createTestWorkflow({ 'loop-1': { nodes: ['block-1'] } }, [
{ id: 'loop-1', name: 'Loop 1' },
])
const resolver = new LoopResolver(workflow)
const loopScope = createLoopScope({ iteration: 3 })
const loopExecutions = new Map([['loop-1', loopScope]])
const ctx = createTestContext('block-1', undefined, loopExecutions)
expect(resolver.resolve('<loop1.index>', ctx)).toBe(3)
})
it.concurrent('should resolve index for block in a nested descendant loop', () => {
const workflow = createTestWorkflow(
{
'loop-outer': { nodes: ['loop-inner', 'block-a'] },
'loop-inner': { nodes: ['block-b'] },
},
[
{ id: 'loop-outer', name: 'Loop 1' },
{ id: 'loop-inner', name: 'Loop 2' },
]
)
const resolver = new LoopResolver(workflow)
const outerScope = createLoopScope({ iteration: 2 })
const innerScope = createLoopScope({ iteration: 4 })
const loopExecutions = new Map<string, LoopScope>([
['loop-outer', outerScope],
['loop-inner', innerScope],
])
const ctx = createTestContext('block-b', undefined, loopExecutions)
expect(resolver.resolve('<loop1.index>', ctx)).toBe(2)
expect(resolver.resolve('<loop2.index>', ctx)).toBe(4)
expect(resolver.resolve('<loop.index>', ctx)).toBe(4)
})
it.concurrent('should return undefined for index when block is outside the loop', () => {
const workflow = createTestWorkflow({ 'loop-1': { nodes: ['block-1'] } }, [
{ id: 'loop-1', name: 'Loop 1' },
])
const resolver = new LoopResolver(workflow)
const loopScope = createLoopScope({ iteration: 3 })
const loopExecutions = new Map([['loop-1', loopScope]])
const ctx = createTestContext('block-outside', undefined, loopExecutions)
expect(resolver.resolve('<loop1.index>', ctx)).toBeUndefined()
})
it.concurrent('should resolve result from anywhere after loop completes', () => {
const workflow = createTestWorkflow({ 'loop-1': { nodes: ['block-1'] } }, [
{ id: 'loop-1', name: 'Loop 1' },
])
const resolver = new LoopResolver(workflow)
const results = [[{ response: 'a' }], [{ response: 'b' }]]
const ctx = createTestContext('block-outside', undefined, new Map(), {
'loop-1': { results },
})
expect(resolver.resolve('<loop1.result>', ctx)).toEqual(results)
expect(resolver.resolve('<loop1.results>', ctx)).toEqual(results)
})
it.concurrent('should resolve result with nested path', () => {
const workflow = createTestWorkflow({ 'loop-1': { nodes: ['block-1'] } }, [
{ id: 'loop-1', name: 'Loop 1' },
])
const resolver = new LoopResolver(workflow)
const results = [[{ response: 'a' }], [{ response: 'b' }]]
const ctx = createTestContext('block-outside', undefined, new Map(), {
'loop-1': { results },
})
expect(resolver.resolve('<loop1.result.0>', ctx)).toEqual([{ response: 'a' }])
expect(resolver.resolve('<loop1.result.1.0.response>', ctx)).toBe('b')
})
it.concurrent('should resolve forEach properties via named reference', () => {
const workflow = createTestWorkflow(
{ 'loop-1': { nodes: ['block-1'], loopType: 'forEach' } },
[{ id: 'loop-1', name: 'Loop 1' }]
)
const resolver = new LoopResolver(workflow)
const items = ['x', 'y', 'z']
const loopScope = createLoopScope({ iteration: 1, item: 'y', items })
const loopExecutions = new Map([['loop-1', loopScope]])
const ctx = createTestContext('block-1', undefined, loopExecutions)
expect(resolver.resolve('<loop1.index>', ctx)).toBe(1)
expect(resolver.resolve('<loop1.currentItem>', ctx)).toBe('y')
expect(resolver.resolve('<loop1.items>', ctx)).toEqual(items)
})
it.concurrent('should throw InvalidFieldError for unknown property on named ref', () => {
const workflow = createTestWorkflow({ 'loop-1': { nodes: ['block-1'] } }, [
{ id: 'loop-1', name: 'Loop 1' },
])
const resolver = new LoopResolver(workflow)
const loopScope = createLoopScope({ iteration: 0 })
const loopExecutions = new Map([['loop-1', loopScope]])
const ctx = createTestContext('block-1', undefined, loopExecutions)
expect(() => resolver.resolve('<loop1.unknownProp>', ctx)).toThrow(InvalidFieldError)
})
it.concurrent('should not resolve named ref when no matching block exists', () => {
const workflow = createTestWorkflow({ 'loop-1': { nodes: ['block-1'] } }, [
{ id: 'loop-1', name: 'Loop 1' },
])
const resolver = new LoopResolver(workflow)
expect(resolver.canResolve('<loop99.index>')).toBe(false)
})
})
})


@@ -1,7 +1,11 @@
import { createLogger } from '@sim/logger'
import { isReference, parseReferencePath, REFERENCE } from '@/executor/constants'
import { isReference, normalizeName, parseReferencePath, REFERENCE } from '@/executor/constants'
import { InvalidFieldError } from '@/executor/utils/block-reference'
import { extractBaseBlockId } from '@/executor/utils/subflow-utils'
import {
findEffectiveContainerId,
stripCloneSuffixes,
stripOuterBranchSuffix,
} from '@/executor/utils/subflow-utils'
import {
navigatePath,
type ResolutionContext,
@@ -12,9 +16,19 @@ import type { SerializedWorkflow } from '@/serializer/types'
const logger = createLogger('LoopResolver')
export class LoopResolver implements Resolver {
constructor(private workflow: SerializedWorkflow) {}
private loopNameToId: Map<string, string>
private static KNOWN_PROPERTIES = ['iteration', 'index', 'item', 'currentItem', 'items']
constructor(private workflow: SerializedWorkflow) {
this.loopNameToId = new Map()
for (const block of workflow.blocks) {
if (workflow.loops[block.id] && block.metadata?.name) {
this.loopNameToId.set(normalizeName(block.metadata.name), block.id)
}
}
}
private static OUTPUT_PROPERTIES = new Set(['result', 'results'])
private static KNOWN_PROPERTIES = new Set(['iteration', 'index', 'item', 'currentItem', 'items'])
canResolve(reference: string): boolean {
if (!isReference(reference)) {
@@ -25,7 +39,7 @@ export class LoopResolver implements Resolver {
return false
}
const [type] = parts
return type === REFERENCE.PREFIX.LOOP
return type === REFERENCE.PREFIX.LOOP || this.loopNameToId.has(type)
}
resolve(reference: string, context: ResolutionContext): any {
@@ -35,14 +49,67 @@ export class LoopResolver implements Resolver {
return undefined
}
const loopId = this.findLoopForBlock(context.currentNodeId)
let loopScope = context.loopScope
const [firstPart, ...rest] = parts
const isGenericRef = firstPart === REFERENCE.PREFIX.LOOP
if (!loopScope) {
if (!loopId) {
let targetLoopId: string | undefined
if (isGenericRef) {
targetLoopId = this.findInnermostLoopForBlock(context.currentNodeId)
if (!targetLoopId && !context.loopScope) {
return undefined
}
loopScope = context.executionContext.loopExecutions?.get(loopId)
} else {
targetLoopId = this.loopNameToId.get(firstPart)
if (!targetLoopId) {
return undefined
}
}
// Resolve the effective (possibly cloned) loop ID for scope/output lookups
if (targetLoopId && context.executionContext.loopExecutions) {
targetLoopId = findEffectiveContainerId(
targetLoopId,
context.currentNodeId,
context.executionContext.loopExecutions
)
}
if (rest.length > 0) {
const property = rest[0]
if (LoopResolver.OUTPUT_PROPERTIES.has(property)) {
if (!targetLoopId) {
return undefined
}
return this.resolveOutput(targetLoopId, rest.slice(1), context)
}
if (!LoopResolver.KNOWN_PROPERTIES.has(property)) {
const isForEach = targetLoopId
? this.isForEachLoop(targetLoopId)
: context.loopScope?.items !== undefined
const availableFields = isForEach
? ['index', 'currentItem', 'items', 'result']
: ['index', 'result']
throw new InvalidFieldError(firstPart, property, availableFields)
}
if (!isGenericRef && targetLoopId) {
if (!this.isBlockInLoopOrDescendant(context.currentNodeId, targetLoopId)) {
logger.warn('Block is not inside the referenced loop', {
reference,
blockId: context.currentNodeId,
loopId: targetLoopId,
})
return undefined
}
}
}
let loopScope = isGenericRef ? context.loopScope : undefined
if (!loopScope && targetLoopId) {
loopScope = context.executionContext.loopExecutions?.get(targetLoopId)
}
if (!loopScope) {
@@ -50,26 +117,20 @@ export class LoopResolver implements Resolver {
return undefined
}
const isForEach = loopId ? this.isForEachLoop(loopId) : loopScope.items !== undefined
if (parts.length === 1) {
const result: Record<string, any> = {
if (rest.length === 0) {
const obj: Record<string, any> = {
index: loopScope.iteration,
}
if (loopScope.item !== undefined) {
result.currentItem = loopScope.item
obj.currentItem = loopScope.item
}
if (loopScope.items !== undefined) {
result.items = loopScope.items
obj.items = loopScope.items
}
return result
return obj
}
const [_, property, ...pathParts] = parts
if (!LoopResolver.KNOWN_PROPERTIES.includes(property)) {
const availableFields = isForEach ? ['index', 'currentItem', 'items'] : ['index']
throw new InvalidFieldError('loop', property, availableFields)
}
const [property, ...pathParts] = rest
let value: any
switch (property) {
@@ -93,20 +154,84 @@ export class LoopResolver implements Resolver {
return value
}
private findLoopForBlock(blockId: string): string | undefined {
const baseId = extractBaseBlockId(blockId)
for (const loopId of Object.keys(this.workflow.loops || {})) {
const loopConfig = this.workflow.loops[loopId]
if (loopConfig.nodes.includes(baseId)) {
return loopId
private resolveOutput(loopId: string, pathParts: string[], context: ResolutionContext): unknown {
const output = context.executionState.getBlockOutput(loopId)
if (!output || typeof output !== 'object') {
return undefined
}
const value = (output as Record<string, unknown>).results
if (pathParts.length > 0) {
return navigatePath(value, pathParts)
}
return value
}
private findInnermostLoopForBlock(blockId: string): string | undefined {
const baseId = stripCloneSuffixes(blockId)
const loops = this.workflow.loops || {}
const candidateLoopIds = Object.keys(loops).filter((loopId) =>
loops[loopId].nodes.includes(baseId)
)
if (candidateLoopIds.length === 0) return undefined
if (candidateLoopIds.length === 1) return candidateLoopIds[0]
// Return the innermost: the loop that is not an ancestor of any other candidate.
// In a valid DAG, exactly one candidate will satisfy this (circular containment is impossible).
return candidateLoopIds.find((candidateId) =>
candidateLoopIds.every(
(otherId) => otherId === candidateId || !loops[candidateId].nodes.includes(otherId)
)
)
}
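The innermost-candidate selection above can be sketched on plain loop configs. Note one assumption: a block appearing in both an outer and an inner loop's `nodes` array is used here only to exercise the multi-candidate path; whether the serializer ever produces that shape is not confirmed by this diff.

```typescript
type Loops = Record<string, { nodes: string[] }>

function findInnermostLoop(loops: Loops, baseId: string): string | undefined {
  const candidates = Object.keys(loops).filter((id) => loops[id].nodes.includes(baseId))
  if (candidates.length <= 1) return candidates[0]
  // The innermost loop is the candidate that does not contain any other candidate.
  return candidates.find((id) =>
    candidates.every((other) => other === id || !loops[id].nodes.includes(other))
  )
}

const loops: Loops = {
  'loop-outer': { nodes: ['loop-inner', 'block-b'] },
  'loop-inner': { nodes: ['block-b'] },
}
console.log(findInnermostLoop(loops, 'block-b')) // loop-inner
console.log(findInnermostLoop(loops, 'block-x')) // undefined
```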
private isBlockInLoopOrDescendant(blockId: string, targetLoopId: string): boolean {
const baseId = stripCloneSuffixes(blockId)
const originalLoopId = stripOuterBranchSuffix(targetLoopId)
const targetLoop = this.workflow.loops?.[originalLoopId]
if (!targetLoop) {
return false
}
if (targetLoop.nodes.includes(baseId)) {
return true
}
const directLoopId = this.findInnermostLoopForBlock(blockId)
if (!directLoopId) {
return false
}
if (directLoopId === originalLoopId) {
return true
}
return this.isLoopNestedInside(directLoopId, originalLoopId)
}
private isLoopNestedInside(
childLoopId: string,
ancestorLoopId: string,
visited = new Set<string>()
): boolean {
if (visited.has(ancestorLoopId)) return false
visited.add(ancestorLoopId)
const ancestorLoop = this.workflow.loops?.[ancestorLoopId]
if (!ancestorLoop) {
return false
}
if (ancestorLoop.nodes.includes(childLoopId)) {
return true
}
for (const nodeId of ancestorLoop.nodes) {
if (this.workflow.loops[nodeId]) {
if (this.isLoopNestedInside(childLoopId, nodeId, visited)) {
return true
}
}
}
return undefined
return false
}
private isForEachLoop(loopId: string): boolean {
const loopConfig = this.workflow.loops?.[loopId]
const originalId = stripOuterBranchSuffix(loopId)
const loopConfig = this.workflow.loops?.[originalId]
return loopConfig?.loopType === 'forEach'
}
}


@@ -16,19 +16,16 @@ function createTestWorkflow(
nodes: string[]
id?: string
distribution?: any
distributionItems?: any
parallelType?: 'count' | 'collection'
}
> = {}
) {
// Ensure each parallel has required fields
const normalizedParallels: Record<
string,
{
id: string
nodes: string[]
distribution?: any
distributionItems?: any
parallelType?: 'count' | 'collection'
}
> = {}
@@ -37,7 +34,6 @@ function createTestWorkflow(
id: parallel.id ?? key,
nodes: parallel.nodes,
distribution: parallel.distribution,
distributionItems: parallel.distributionItems,
parallelType: parallel.parallelType,
}
}
@@ -366,9 +362,9 @@ describe('ParallelResolver', () => {
expect(resolver.resolve('<parallel.items>', ctx)).toEqual([])
})
it.concurrent('should handle distributionItems property as fallback', () => {
it.concurrent('should resolve distribution items from distribution property', () => {
const workflow = createTestWorkflow({
'parallel-1': { nodes: ['block-1'], distributionItems: ['fallback1', 'fallback2'] },
'parallel-1': { nodes: ['block-1'], distribution: ['fallback1', 'fallback2'] },
})
const resolver = new ParallelResolver(workflow)
const ctx = createTestContext('block-1₍0₎')


@@ -1,20 +1,34 @@
import { createLogger } from '@sim/logger'
import { isReference, parseReferencePath, REFERENCE } from '@/executor/constants'
import { isReference, normalizeName, parseReferencePath, REFERENCE } from '@/executor/constants'
import { InvalidFieldError } from '@/executor/utils/block-reference'
import { extractBaseBlockId, extractBranchIndex } from '@/executor/utils/subflow-utils'
import {
extractBranchIndex,
findEffectiveContainerId,
stripCloneSuffixes,
stripOuterBranchSuffix,
} from '@/executor/utils/subflow-utils'
import {
navigatePath,
type ResolutionContext,
type Resolver,
} from '@/executor/variables/resolvers/reference'
import type { SerializedWorkflow } from '@/serializer/types'
import type { SerializedParallel, SerializedWorkflow } from '@/serializer/types'
const logger = createLogger('ParallelResolver')
export class ParallelResolver implements Resolver {
constructor(private workflow: SerializedWorkflow) {}
private parallelNameToId: Map<string, string>
private static KNOWN_PROPERTIES = ['index', 'currentItem', 'items']
constructor(private workflow: SerializedWorkflow) {
this.parallelNameToId = new Map()
for (const block of workflow.blocks) {
if (workflow.parallels?.[block.id] && block.metadata?.name) {
this.parallelNameToId.set(normalizeName(block.metadata.name), block.id)
}
}
}
private static KNOWN_PROPERTIES = new Set(['index', 'currentItem', 'items'])
canResolve(reference: string): boolean {
if (!isReference(reference)) {
@@ -25,7 +39,7 @@ export class ParallelResolver implements Resolver {
return false
}
const [type] = parts
return type === REFERENCE.PREFIX.PARALLEL
return type === REFERENCE.PREFIX.PARALLEL || this.parallelNameToId.has(type)
}
resolve(reference: string, context: ResolutionContext): any {
@@ -35,64 +49,85 @@ export class ParallelResolver implements Resolver {
return undefined
}
const parallelId = this.findParallelForBlock(context.currentNodeId)
if (!parallelId) {
const [firstPart, ...rest] = parts
const isGenericRef = firstPart === REFERENCE.PREFIX.PARALLEL
// For named references, resolve to the specific parallel ID
let targetParallelId: string | undefined
if (isGenericRef) {
targetParallelId = this.findInnermostParallelForBlock(context.currentNodeId)
} else {
targetParallelId = this.parallelNameToId.get(firstPart)
}
if (!targetParallelId) {
return undefined
}
const parallelConfig = this.workflow.parallels?.[parallelId]
// Resolve the effective (possibly cloned) parallel ID for scope lookups
if (context.executionContext.parallelExecutions) {
targetParallelId = findEffectiveContainerId(
targetParallelId,
context.currentNodeId,
context.executionContext.parallelExecutions
)
}
// Look up config using the original (non-cloned) ID
const originalParallelId = stripOuterBranchSuffix(targetParallelId)
const parallelConfig = this.workflow.parallels?.[originalParallelId]
if (!parallelConfig) {
logger.warn('Parallel config not found', { parallelId })
logger.warn('Parallel config not found', { parallelId: targetParallelId })
return undefined
}
if (!isGenericRef) {
if (!this.isBlockInParallelOrDescendant(context.currentNodeId, originalParallelId)) {
logger.warn('Block is not inside the referenced parallel', {
reference,
blockId: context.currentNodeId,
parallelId: targetParallelId,
})
return undefined
}
}
const branchIndex = extractBranchIndex(context.currentNodeId)
if (branchIndex === null) {
return undefined
}
const parallelScope = context.executionContext.parallelExecutions?.get(parallelId)
const parallelScope = context.executionContext.parallelExecutions?.get(targetParallelId)
const distributionItems = parallelScope?.items ?? this.getDistributionItems(parallelConfig)
if (parts.length === 1) {
const result: Record<string, any> = {
index: branchIndex,
}
const currentItem = this.resolveCurrentItem(distributionItems, branchIndex)
if (rest.length === 0) {
const result: Record<string, any> = { index: branchIndex }
if (distributionItems !== undefined) {
result.items = distributionItems
if (Array.isArray(distributionItems)) {
result.currentItem = distributionItems[branchIndex]
} else if (typeof distributionItems === 'object' && distributionItems !== null) {
const keys = Object.keys(distributionItems)
const key = keys[branchIndex]
result.currentItem = key !== undefined ? distributionItems[key] : undefined
}
result.currentItem = currentItem
}
return result
}
const [_, property, ...pathParts] = parts
if (!ParallelResolver.KNOWN_PROPERTIES.includes(property)) {
const property = rest[0]
const pathParts = rest.slice(1)
if (!ParallelResolver.KNOWN_PROPERTIES.has(property)) {
const isCollection = parallelConfig.parallelType === 'collection'
const availableFields = isCollection ? ['index', 'currentItem', 'items'] : ['index']
throw new InvalidFieldError('parallel', property, availableFields)
throw new InvalidFieldError(firstPart, property, availableFields)
}
let value: any
let value: unknown
switch (property) {
case 'index':
value = branchIndex
break
case 'currentItem':
if (Array.isArray(distributionItems)) {
value = distributionItems[branchIndex]
} else if (typeof distributionItems === 'object' && distributionItems !== null) {
const keys = Object.keys(distributionItems)
const key = keys[branchIndex]
value = key !== undefined ? distributionItems[key] : undefined
} else {
return undefined
}
value = currentItem
if (value === undefined) return undefined
break
case 'items':
value = distributionItems
@@ -106,23 +141,83 @@ export class ParallelResolver implements Resolver {
return value
}
private findParallelForBlock(blockId: string): string | undefined {
const baseId = extractBaseBlockId(blockId)
if (!this.workflow.parallels) {
return undefined
}
for (const parallelId of Object.keys(this.workflow.parallels)) {
const parallelConfig = this.workflow.parallels[parallelId]
if (parallelConfig?.nodes.includes(baseId)) {
return parallelId
private findInnermostParallelForBlock(blockId: string): string | undefined {
const baseId = stripCloneSuffixes(blockId)
const parallels = this.workflow.parallels
if (!parallels) return undefined
const candidateIds = Object.keys(parallels).filter((parallelId) =>
parallels[parallelId]?.nodes.includes(baseId)
)
if (candidateIds.length === 0) return undefined
if (candidateIds.length === 1) return candidateIds[0]
// Return the innermost: the parallel that is not an ancestor of any other candidate.
// In a valid DAG, exactly one candidate will satisfy this (circular containment is impossible).
return candidateIds.find((candidateId) =>
candidateIds.every(
(otherId) => otherId === candidateId || !parallels[candidateId]?.nodes.includes(otherId)
)
)
}
private isBlockInParallelOrDescendant(blockId: string, targetParallelId: string): boolean {
const baseId = stripCloneSuffixes(blockId)
const parallels = this.workflow.parallels
if (!parallels) return false
const targetConfig = parallels[targetParallelId]
if (!targetConfig) return false
if (targetConfig.nodes.includes(baseId)) return true
const directParallelId = this.findInnermostParallelForBlock(blockId)
if (!directParallelId) return false
if (directParallelId === targetParallelId) return true
return this.isParallelNestedInside(directParallelId, targetParallelId)
}
private isParallelNestedInside(
childParallelId: string,
ancestorParallelId: string,
visited = new Set<string>()
): boolean {
if (visited.has(ancestorParallelId)) return false
visited.add(ancestorParallelId)
const ancestorConfig = this.workflow.parallels?.[ancestorParallelId]
if (!ancestorConfig) return false
if (ancestorConfig.nodes.includes(childParallelId)) return true
for (const nodeId of ancestorConfig.nodes) {
if (this.workflow.parallels?.[nodeId]) {
if (this.isParallelNestedInside(childParallelId, nodeId, visited)) {
return true
}
}
}
return false
}
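The `visited` set in `isParallelNestedInside` guards against malformed (cyclic) containment data. A standalone sketch of the same recursion, with assumed shapes and hypothetical names:

```typescript
// Recursive ancestry check: does ancestorId (transitively) contain childId?
// The visited set makes cyclic inputs terminate instead of recursing forever.
type PMap = Record<string, { nodes: string[] }>

function isNestedInside(
  parallels: PMap,
  childId: string,
  ancestorId: string,
  visited = new Set<string>()
): boolean {
  if (visited.has(ancestorId)) return false
  visited.add(ancestorId)
  const ancestor = parallels[ancestorId]
  if (!ancestor) return false
  if (ancestor.nodes.includes(childId)) return true
  // Recurse only into member nodes that are themselves parallels.
  return ancestor.nodes.some(
    (nodeId) => parallels[nodeId] !== undefined && isNestedInside(parallels, childId, nodeId, visited)
  )
}

// p1 → p2 → p3: p3 is transitively nested inside p1.
const chain: PMap = {
  p1: { nodes: ['p2'] },
  p2: { nodes: ['p3'] },
  p3: { nodes: ['leaf'] },
}
console.log(isNestedInside(chain, 'p3', 'p1')) // → true
```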
private resolveCurrentItem(
distributionItems: unknown[] | undefined,
branchIndex: number
): unknown {
if (Array.isArray(distributionItems)) {
return distributionItems[branchIndex]
}
if (typeof distributionItems === 'object' && distributionItems !== null) {
const keys = Object.keys(distributionItems)
const key = keys[branchIndex]
return key !== undefined ? (distributionItems as Record<string, unknown>)[key] : undefined
}
return undefined
}
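The branch-index resolution above handles three distribution shapes. A self-contained sketch of the same logic (note the `Array.isArray` check must come first, since arrays are also objects):

```typescript
// Arrays index directly by branch; plain objects map branch index → nth key's
// value; anything else (numbers, null, undefined) yields undefined.
function resolveCurrentItem(items: unknown, branchIndex: number): unknown {
  if (Array.isArray(items)) return items[branchIndex]
  if (typeof items === 'object' && items !== null) {
    const keys = Object.keys(items)
    const key = keys[branchIndex]
    return key !== undefined ? (items as Record<string, unknown>)[key] : undefined
  }
  return undefined
}

console.log(resolveCurrentItem(['a', 'b'], 1)) // → 'b'
console.log(resolveCurrentItem({ x: 1, y: 2 }, 0)) // → 1
console.log(resolveCurrentItem(42, 0)) // → undefined
```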
private getDistributionItems(parallelConfig: any): any[] {
const rawItems = parallelConfig.distributionItems || parallelConfig.distribution || []
private getDistributionItems(parallelConfig: SerializedParallel): unknown[] {
const rawItems = parallelConfig.distribution ?? []
// Already an array - return as-is
if (Array.isArray(rawItems)) {



@@ -77,6 +77,7 @@ export async function deliverPushNotification(taskId: string, state: TaskState):
})
if (!response.ok) {
await response.text().catch(() => {})
logger.error('Push notification delivery failed', {
taskId,
url: config.url,


@@ -488,6 +488,7 @@ export const auth = betterAuth({
'google-bigquery',
'google-vault',
'google-groups',
'google-meet',
'google-tasks',
'vertex-ai',
'github-repo',
@@ -773,6 +774,7 @@ export const auth = betterAuth({
})
if (!profileResponse.ok) {
await profileResponse.text().catch(() => {})
logger.error('Failed to fetch GitHub profile', {
status: profileResponse.status,
statusText: profileResponse.statusText,
@@ -850,6 +852,7 @@ export const auth = betterAuth({
headers: { Authorization: `Bearer ${tokens.accessToken}` },
})
if (!response.ok) {
await response.text().catch(() => {})
logger.error('Failed to fetch Google user info', { status: response.status })
throw new Error(`Failed to fetch Google user info: ${response.statusText}`)
}
@@ -889,6 +892,7 @@ export const auth = betterAuth({
headers: { Authorization: `Bearer ${tokens.accessToken}` },
})
if (!response.ok) {
await response.text().catch(() => {})
logger.error('Failed to fetch Google user info', { status: response.status })
throw new Error(`Failed to fetch Google user info: ${response.statusText}`)
}
@@ -929,6 +933,7 @@ export const auth = betterAuth({
headers: { Authorization: `Bearer ${tokens.accessToken}` },
})
if (!response.ok) {
await response.text().catch(() => {})
logger.error('Failed to fetch Google user info', { status: response.status })
throw new Error(`Failed to fetch Google user info: ${response.statusText}`)
}
@@ -969,6 +974,7 @@ export const auth = betterAuth({
headers: { Authorization: `Bearer ${tokens.accessToken}` },
})
if (!response.ok) {
await response.text().catch(() => {})
logger.error('Failed to fetch Google user info', { status: response.status })
throw new Error(`Failed to fetch Google user info: ${response.statusText}`)
}
@@ -1009,6 +1015,7 @@ export const auth = betterAuth({
headers: { Authorization: `Bearer ${tokens.accessToken}` },
})
if (!response.ok) {
await response.text().catch(() => {})
logger.error('Failed to fetch Google user info', { status: response.status })
throw new Error(`Failed to fetch Google user info: ${response.statusText}`)
}
@@ -1049,6 +1056,7 @@ export const auth = betterAuth({
headers: { Authorization: `Bearer ${tokens.accessToken}` },
})
if (!response.ok) {
await response.text().catch(() => {})
logger.error('Failed to fetch Google user info', { status: response.status })
throw new Error(`Failed to fetch Google user info: ${response.statusText}`)
}
@@ -1090,6 +1098,7 @@ export const auth = betterAuth({
headers: { Authorization: `Bearer ${tokens.accessToken}` },
})
if (!response.ok) {
await response.text().catch(() => {})
logger.error('Failed to fetch Google user info', { status: response.status })
throw new Error(`Failed to fetch Google user info: ${response.statusText}`)
}
@@ -1129,6 +1138,7 @@ export const auth = betterAuth({
headers: { Authorization: `Bearer ${tokens.accessToken}` },
})
if (!response.ok) {
await response.text().catch(() => {})
logger.error('Failed to fetch Google user info', { status: response.status })
throw new Error(`Failed to fetch Google user info: ${response.statusText}`)
}
@@ -1170,6 +1180,7 @@ export const auth = betterAuth({
headers: { Authorization: `Bearer ${tokens.accessToken}` },
})
if (!response.ok) {
await response.text().catch(() => {})
logger.error('Failed to fetch Google user info', { status: response.status })
throw new Error(`Failed to fetch Google user info: ${response.statusText}`)
}
@@ -1211,6 +1222,7 @@ export const auth = betterAuth({
headers: { Authorization: `Bearer ${tokens.accessToken}` },
})
if (!response.ok) {
await response.text().catch(() => {})
logger.error('Failed to fetch Google user info', { status: response.status })
throw new Error(`Failed to fetch Google user info: ${response.statusText}`)
}
@@ -1232,6 +1244,47 @@ export const auth = betterAuth({
},
},
{
providerId: 'google-meet',
clientId: env.GOOGLE_CLIENT_ID as string,
clientSecret: env.GOOGLE_CLIENT_SECRET as string,
discoveryUrl: 'https://accounts.google.com/.well-known/openid-configuration',
accessType: 'offline',
scopes: [
'https://www.googleapis.com/auth/userinfo.email',
'https://www.googleapis.com/auth/userinfo.profile',
'https://www.googleapis.com/auth/meetings.space.created',
'https://www.googleapis.com/auth/meetings.space.readonly',
],
prompt: 'consent',
redirectURI: `${getBaseUrl()}/api/auth/oauth2/callback/google-meet`,
getUserInfo: async (tokens) => {
try {
const response = await fetch('https://openidconnect.googleapis.com/v1/userinfo', {
headers: { Authorization: `Bearer ${tokens.accessToken}` },
})
if (!response.ok) {
await response.text().catch(() => {})
logger.error('Failed to fetch Google user info', { status: response.status })
throw new Error(`Failed to fetch Google user info: ${response.statusText}`)
}
const profile = await response.json()
const now = new Date()
return {
id: `${profile.sub}-${crypto.randomUUID()}`,
name: profile.name || 'Google User',
email: profile.email,
image: profile.picture || undefined,
emailVerified: profile.email_verified || false,
createdAt: now,
updatedAt: now,
}
} catch (error) {
logger.error('Error in Google getUserInfo', { error })
throw error
}
},
},
{
providerId: 'google-tasks',
clientId: env.GOOGLE_CLIENT_ID as string,
@@ -1251,6 +1304,7 @@ export const auth = betterAuth({
headers: { Authorization: `Bearer ${tokens.accessToken}` },
})
if (!response.ok) {
await response.text().catch(() => {})
logger.error('Failed to fetch Google user info', { status: response.status })
throw new Error(`Failed to fetch Google user info: ${response.statusText}`)
}
@@ -1291,6 +1345,7 @@ export const auth = betterAuth({
headers: { Authorization: `Bearer ${tokens.accessToken}` },
})
if (!response.ok) {
await response.text().catch(() => {})
logger.error('Failed to fetch Google user info', { status: response.status })
throw new Error(`Failed to fetch Google user info: ${response.statusText}`)
}
@@ -1352,6 +1407,7 @@ export const auth = betterAuth({
headers: { Authorization: `Bearer ${tokens.accessToken}` },
})
if (!response.ok) {
await response.text().catch(() => {})
logger.error('Failed to fetch Microsoft user info', { status: response.status })
throw new Error(`Failed to fetch Microsoft user info: ${response.statusText}`)
}
@@ -1391,6 +1447,7 @@ export const auth = betterAuth({
headers: { Authorization: `Bearer ${tokens.accessToken}` },
})
if (!response.ok) {
await response.text().catch(() => {})
logger.error('Failed to fetch Microsoft user info', { status: response.status })
throw new Error(`Failed to fetch Microsoft user info: ${response.statusText}`)
}
@@ -1485,6 +1542,7 @@ export const auth = betterAuth({
headers: { Authorization: `Bearer ${tokens.accessToken}` },
})
if (!response.ok) {
await response.text().catch(() => {})
logger.error('Failed to fetch Microsoft user info', { status: response.status })
throw new Error(`Failed to fetch Microsoft user info: ${response.statusText}`)
}
@@ -1533,6 +1591,7 @@ export const auth = betterAuth({
headers: { Authorization: `Bearer ${tokens.accessToken}` },
})
if (!response.ok) {
await response.text().catch(() => {})
logger.error('Failed to fetch Microsoft user info', { status: response.status })
throw new Error(`Failed to fetch Microsoft user info: ${response.statusText}`)
}
@@ -1572,6 +1631,7 @@ export const auth = betterAuth({
headers: { Authorization: `Bearer ${tokens.accessToken}` },
})
if (!response.ok) {
await response.text().catch(() => {})
logger.error('Failed to fetch Microsoft user info', { status: response.status })
throw new Error(`Failed to fetch Microsoft user info: ${response.statusText}`)
}
@@ -1619,6 +1679,7 @@ export const auth = betterAuth({
headers: { Authorization: `Bearer ${tokens.accessToken}` },
})
if (!response.ok) {
await response.text().catch(() => {})
logger.error('Failed to fetch Microsoft user info', { status: response.status })
throw new Error(`Failed to fetch Microsoft user info: ${response.statusText}`)
}
@@ -1701,6 +1762,7 @@ export const auth = betterAuth({
})
if (!response.ok) {
await response.text().catch(() => {})
logger.error('Failed to fetch Pipedrive user info', {
status: response.status,
})
@@ -1848,6 +1910,7 @@ export const auth = betterAuth({
)
if (!response.ok) {
await response.text().catch(() => {})
logger.error('Failed to fetch Salesforce user info', {
status: response.status,
})
@@ -1915,6 +1978,7 @@ export const auth = betterAuth({
)
if (!response.ok) {
await response.text().catch(() => {})
logger.error('Error fetching X user info:', {
status: response.status,
statusText: response.statusText,
@@ -2009,6 +2073,7 @@ export const auth = betterAuth({
})
if (!response.ok) {
await response.text().catch(() => {})
logger.error('Error fetching Confluence user info:', {
status: response.status,
statusText: response.statusText,
@@ -2120,6 +2185,7 @@ export const auth = betterAuth({
})
if (!response.ok) {
await response.text().catch(() => {})
logger.error('Error fetching Jira user info:', {
status: response.status,
statusText: response.statusText,
@@ -2155,7 +2221,13 @@ export const auth = betterAuth({
authorizationUrl: 'https://airtable.com/oauth2/v1/authorize',
tokenUrl: 'https://airtable.com/oauth2/v1/token',
userInfoUrl: 'https://api.airtable.com/v0/meta/whoami',
scopes: ['data.records:read', 'data.records:write', 'user.email:read', 'webhook:manage'],
scopes: [
'data.records:read',
'data.records:write',
'schema.bases:read',
'user.email:read',
'webhook:manage',
],
responseType: 'code',
pkce: true,
accessType: 'offline',
@@ -2171,6 +2243,7 @@ export const auth = betterAuth({
})
if (!response.ok) {
await response.text().catch(() => {})
logger.error('Error fetching Airtable user info:', {
status: response.status,
statusText: response.statusText,
@@ -2220,6 +2293,7 @@ export const auth = betterAuth({
})
if (!response.ok) {
await response.text().catch(() => {})
logger.error('Error fetching Notion user info:', {
status: response.status,
statusText: response.statusText,
@@ -2287,6 +2361,7 @@ export const auth = betterAuth({
})
if (!response.ok) {
await response.text().catch(() => {})
logger.error('Error fetching Reddit user info:', {
status: response.status,
statusText: response.statusText,
@@ -2534,6 +2609,7 @@ export const auth = betterAuth({
})
if (!response.ok) {
await response.text().catch(() => {})
logger.error('Error fetching Asana user info:', {
status: response.status,
statusText: response.statusText,
@@ -2600,6 +2676,7 @@ export const auth = betterAuth({
})
if (!response.ok) {
await response.text().catch(() => {})
logger.error('Slack auth.test failed', {
status: response.status,
statusText: response.statusText,
@@ -2659,6 +2736,7 @@ export const auth = betterAuth({
})
if (!response.ok) {
await response.text().catch(() => {})
logger.error('Error fetching Webflow user info:', {
status: response.status,
statusText: response.statusText,
@@ -2710,6 +2788,7 @@ export const auth = betterAuth({
})
if (!response.ok) {
await response.text().catch(() => {})
logger.error('Failed to fetch LinkedIn user info', {
status: response.status,
statusText: response.statusText,
@@ -2772,6 +2851,7 @@ export const auth = betterAuth({
})
if (!response.ok) {
await response.text().catch(() => {})
logger.error('Failed to fetch Zoom user info', {
status: response.status,
statusText: response.statusText,
@@ -2839,6 +2919,7 @@ export const auth = betterAuth({
})
if (!response.ok) {
await response.text().catch(() => {})
logger.error('Failed to fetch Spotify user info', {
status: response.status,
statusText: response.statusText,
@@ -2887,6 +2968,7 @@ export const auth = betterAuth({
})
if (!response.ok) {
await response.text().catch(() => {})
logger.error('Failed to fetch WordPress.com user info', {
status: response.status,
statusText: response.statusText,
@@ -2936,6 +3018,7 @@ export const auth = betterAuth({
})
if (!response.ok) {
await response.text().catch(() => {})
logger.error('Failed to fetch Cal.com user info', {
status: response.status,
statusText: response.statusText,


@@ -43,6 +43,7 @@ export function useSubscriptionUpgrade() {
try {
const orgsResponse = await fetch('/api/organizations')
if (!orgsResponse.ok) {
await orgsResponse.text().catch(() => {})
throw new Error('Failed to check organization status')
}


@@ -71,6 +71,7 @@ export async function saveMessageCheckpoint(
})
if (!response.ok) {
await response.text().catch(() => {})
throw new Error(`Failed to create checkpoint: ${response.statusText}`)
}


@@ -11,6 +11,7 @@ export async function fetchPersonalEnvironment(): Promise<Record<string, Environ
const response = await fetch(API_ENDPOINTS.ENVIRONMENT)
if (!response.ok) {
await response.text().catch(() => {})
throw new Error(`Failed to load environment variables: ${response.statusText}`)
}
@@ -29,6 +30,7 @@ export async function fetchWorkspaceEnvironment(
const response = await fetch(API_ENDPOINTS.WORKSPACE_ENVIRONMENT(workspaceId))
if (!response.ok) {
await response.text().catch(() => {})
throw new Error(`Failed to load workspace environment: ${response.statusText}`)
}


@@ -1531,3 +1531,443 @@ describe('stripCustomToolPrefix', () => {
expect(stripCustomToolPrefix('regular_tool')).toBe('regular_tool')
})
})
describe('nested subflow grouping via parentIterations', () => {
it.concurrent('parallel-in-parallel (P1 → P2 → leaf) with only leaf BlockLogs', () => {
// Sentinel blocks do NOT produce BlockLogs. Only leaf blocks have logs.
// Each leaf has parentIterations = full ancestor chain (outermost → innermost).
const result: ExecutionResult = {
success: true,
output: { content: 'done' },
metadata: { duration: 4000, startTime: '2024-01-01T10:00:00.000Z' },
logs: [
// P1 iter 0, P2 iter 0
{
blockId: 'func-1__obranch-0__obranch-0',
blockName: 'Func (iteration 0)',
blockType: 'function',
startedAt: '2024-01-01T10:00:00.000Z',
endedAt: '2024-01-01T10:00:01.000Z',
durationMs: 1000,
success: true,
parallelId: 'p2',
iterationIndex: 0,
executionOrder: 1,
parentIterations: [
{
iterationCurrent: 0,
iterationTotal: 2,
iterationType: 'parallel',
iterationContainerId: 'p1',
},
],
},
// P1 iter 0, P2 iter 1
{
blockId: 'func-1__obranch-1__obranch-0',
blockName: 'Func (iteration 1)',
blockType: 'function',
startedAt: '2024-01-01T10:00:01.000Z',
endedAt: '2024-01-01T10:00:02.000Z',
durationMs: 1000,
success: true,
parallelId: 'p2',
iterationIndex: 1,
executionOrder: 2,
parentIterations: [
{
iterationCurrent: 0,
iterationTotal: 2,
iterationType: 'parallel',
iterationContainerId: 'p1',
},
],
},
// P1 iter 1, P2 iter 0
{
blockId: 'func-1__obranch-0__obranch-1',
blockName: 'Func (iteration 0)',
blockType: 'function',
startedAt: '2024-01-01T10:00:02.000Z',
endedAt: '2024-01-01T10:00:03.000Z',
durationMs: 1000,
success: true,
parallelId: 'p2__obranch-1',
iterationIndex: 0,
executionOrder: 3,
parentIterations: [
{
iterationCurrent: 1,
iterationTotal: 2,
iterationType: 'parallel',
iterationContainerId: 'p1',
},
],
},
// P1 iter 1, P2 iter 1
{
blockId: 'func-1__obranch-1__obranch-1',
blockName: 'Func (iteration 1)',
blockType: 'function',
startedAt: '2024-01-01T10:00:03.000Z',
endedAt: '2024-01-01T10:00:04.000Z',
durationMs: 1000,
success: true,
parallelId: 'p2__obranch-1',
iterationIndex: 1,
executionOrder: 4,
parentIterations: [
{
iterationCurrent: 1,
iterationTotal: 2,
iterationType: 'parallel',
iterationContainerId: 'p1',
},
],
},
],
}
const { traceSpans } = buildTraceSpans(result)
const workflow = traceSpans[0]
expect(workflow.name).toBe('Workflow Execution')
// Should have one top-level parallel container (P1)
const p1 = workflow.children!.find((s) => s.type === 'parallel')!
expect(p1).toBeDefined()
expect(p1.children).toHaveLength(2) // 2 iterations of P1
// P1 iteration 0 → nested P2 container
const p1Iter0 = p1.children![0]
expect(p1Iter0.name).toBe('Iteration 0')
const p2InIter0 = p1Iter0.children!.find((s) => s.type === 'parallel')
expect(p2InIter0).toBeDefined()
expect(p2InIter0!.children).toHaveLength(2) // 2 iterations of P2
// P1 iteration 1 → nested P2 container
const p1Iter1 = p1.children![1]
expect(p1Iter1.name).toBe('Iteration 1')
const p2InIter1 = p1Iter1.children!.find((s) => s.type === 'parallel')
expect(p2InIter1).toBeDefined()
expect(p2InIter1!.children).toHaveLength(2)
// Leaf spans inside P2 iterations
expect(p2InIter0!.children![0].children![0].name).toBe('Func')
})
it.concurrent('loop-in-loop nests correctly with parentIterations', () => {
// Only leaf blocks produce BlockLogs in loops too
const result: ExecutionResult = {
success: true,
output: { content: 'done' },
metadata: { duration: 3000, startTime: '2024-01-01T10:00:00.000Z' },
logs: [
// Outer iter 0, inner iter 0
{
blockId: 'agent-1',
blockName: 'Agent (iteration 0)',
blockType: 'agent',
startedAt: '2024-01-01T10:00:00.000Z',
endedAt: '2024-01-01T10:00:01.000Z',
durationMs: 1000,
success: true,
loopId: 'inner-loop',
iterationIndex: 0,
executionOrder: 1,
parentIterations: [
{
iterationCurrent: 0,
iterationTotal: 2,
iterationType: 'loop',
iterationContainerId: 'outer-loop',
},
],
},
// Outer iter 0, inner iter 1
{
blockId: 'agent-1',
blockName: 'Agent (iteration 1)',
blockType: 'agent',
startedAt: '2024-01-01T10:00:01.000Z',
endedAt: '2024-01-01T10:00:02.000Z',
durationMs: 1000,
success: true,
loopId: 'inner-loop',
iterationIndex: 1,
executionOrder: 2,
parentIterations: [
{
iterationCurrent: 0,
iterationTotal: 2,
iterationType: 'loop',
iterationContainerId: 'outer-loop',
},
],
},
// Outer iter 1, inner iter 0
{
blockId: 'agent-1',
blockName: 'Agent (iteration 0)',
blockType: 'agent',
startedAt: '2024-01-01T10:00:02.000Z',
endedAt: '2024-01-01T10:00:03.000Z',
durationMs: 1000,
success: true,
loopId: 'inner-loop',
iterationIndex: 0,
executionOrder: 3,
parentIterations: [
{
iterationCurrent: 1,
iterationTotal: 2,
iterationType: 'loop',
iterationContainerId: 'outer-loop',
},
],
},
],
}
const { traceSpans } = buildTraceSpans(result)
const workflow = traceSpans[0]
const outerLoop = workflow.children!.find((s) => s.type === 'loop')!
expect(outerLoop).toBeDefined()
expect(outerLoop.children).toHaveLength(2) // 2 outer iterations
// Outer iteration 0 → inner-loop container with 2 iterations
const outerIter0 = outerLoop.children![0]
const innerLoop0 = outerIter0.children!.find((s) => s.type === 'loop')
expect(innerLoop0).toBeDefined()
expect(innerLoop0!.children).toHaveLength(2)
// Outer iteration 1 → inner-loop container with 1 iteration
const outerIter1 = outerLoop.children![1]
const innerLoop1 = outerIter1.children!.find((s) => s.type === 'loop')
expect(innerLoop1).toBeDefined()
expect(innerLoop1!.children).toHaveLength(1)
})
it.concurrent('3-level nesting (P1 → P2 → P3 → leaf) groups recursively', () => {
const result: ExecutionResult = {
success: true,
output: { content: 'done' },
metadata: { duration: 2000, startTime: '2024-01-01T10:00:00.000Z' },
logs: [
// Leaf: parallelId=p3, parentIterations=[p1:0, p2:0]
{
blockId: 'func-1__obranch-0__obranch-0__obranch-0',
blockName: 'Func (iteration 0)',
blockType: 'function',
startedAt: '2024-01-01T10:00:00.000Z',
endedAt: '2024-01-01T10:00:01.000Z',
durationMs: 1000,
success: true,
parallelId: 'p3',
iterationIndex: 0,
executionOrder: 1,
parentIterations: [
{
iterationCurrent: 0,
iterationTotal: 2,
iterationType: 'parallel',
iterationContainerId: 'p1',
},
{
iterationCurrent: 0,
iterationTotal: 2,
iterationType: 'parallel',
iterationContainerId: 'p2',
},
],
},
{
blockId: 'func-1__obranch-1__obranch-0__obranch-0',
blockName: 'Func (iteration 1)',
blockType: 'function',
startedAt: '2024-01-01T10:00:01.000Z',
endedAt: '2024-01-01T10:00:02.000Z',
durationMs: 1000,
success: true,
parallelId: 'p3',
iterationIndex: 1,
executionOrder: 2,
parentIterations: [
{
iterationCurrent: 0,
iterationTotal: 2,
iterationType: 'parallel',
iterationContainerId: 'p1',
},
{
iterationCurrent: 0,
iterationTotal: 2,
iterationType: 'parallel',
iterationContainerId: 'p2',
},
],
},
],
}
const { traceSpans } = buildTraceSpans(result)
const workflow = traceSpans[0]
// P1 container
const p1 = workflow.children!.find((s) => s.type === 'parallel')!
expect(p1).toBeDefined()
expect(p1.children).toHaveLength(1) // 1 iteration of P1
// P1 → Iteration 0 → P2
const p1Iter0 = p1.children![0]
const p2 = p1Iter0.children!.find((s) => s.type === 'parallel')
expect(p2).toBeDefined()
expect(p2!.children).toHaveLength(1) // 1 iteration of P2
// P2 → Iteration 0 → P3
const p2Iter0 = p2!.children![0]
const p3 = p2Iter0.children!.find((s) => s.type === 'parallel')
expect(p3).toBeDefined()
expect(p3!.children).toHaveLength(2) // 2 iterations of P3
// P3 leaf spans
expect(p3!.children![0].children![0].name).toBe('Func')
expect(p3!.children![1].children![0].name).toBe('Func')
})
it.concurrent('backward compatibility: spans without parentIterations group flat', () => {
const result: ExecutionResult = {
success: true,
output: { content: 'done' },
metadata: { duration: 2000, startTime: '2024-01-01T10:00:00.000Z' },
logs: [
{
blockId: 'api-1__obranch-0',
blockName: 'API (iteration 0)',
blockType: 'api',
startedAt: '2024-01-01T10:00:00.000Z',
endedAt: '2024-01-01T10:00:01.000Z',
durationMs: 1000,
success: true,
parallelId: 'p1',
iterationIndex: 0,
executionOrder: 1,
},
{
blockId: 'api-1__obranch-1',
blockName: 'API (iteration 1)',
blockType: 'api',
startedAt: '2024-01-01T10:00:01.000Z',
endedAt: '2024-01-01T10:00:02.000Z',
durationMs: 1000,
success: true,
parallelId: 'p1',
iterationIndex: 1,
executionOrder: 2,
},
],
}
const { traceSpans } = buildTraceSpans(result)
const workflow = traceSpans[0]
// Should group into a flat parallel container with 2 iterations
const parallel = workflow.children!.find((s) => s.type === 'parallel')!
expect(parallel).toBeDefined()
expect(parallel.children).toHaveLength(2)
expect(parallel.children![0].name).toBe('Iteration 0')
expect(parallel.children![1].name).toBe('Iteration 1')
// No nested containers — leaf spans are directly inside iteration
expect(parallel.children![0].children![0].name).toBe('API')
expect(parallel.children![0].children![0].type).toBe('api')
})
it.concurrent('mixed: flat loop + nested parallel-in-parallel in same execution', () => {
const result: ExecutionResult = {
success: true,
output: { content: 'done' },
metadata: { duration: 5000, startTime: '2024-01-01T10:00:00.000Z' },
logs: [
// Flat loop iterations (no parentIterations)
{
blockId: 'agent-1',
blockName: 'Agent (iteration 0)',
blockType: 'agent',
startedAt: '2024-01-01T10:00:00.000Z',
endedAt: '2024-01-01T10:00:01.000Z',
durationMs: 1000,
success: true,
loopId: 'loop-1',
iterationIndex: 0,
executionOrder: 1,
},
{
blockId: 'agent-1',
blockName: 'Agent (iteration 1)',
blockType: 'agent',
startedAt: '2024-01-01T10:00:01.000Z',
endedAt: '2024-01-01T10:00:02.000Z',
durationMs: 1000,
success: true,
loopId: 'loop-1',
iterationIndex: 1,
executionOrder: 2,
},
// Nested P1 → P2 leaf (only leaf, no sentinel logs)
{
blockId: 'func-1__obranch-0__obranch-0',
blockName: 'Func (iteration 0)',
blockType: 'function',
startedAt: '2024-01-01T10:00:02.000Z',
endedAt: '2024-01-01T10:00:03.000Z',
durationMs: 1000,
success: true,
parallelId: 'p2',
iterationIndex: 0,
executionOrder: 3,
parentIterations: [
{
iterationCurrent: 0,
iterationTotal: 2,
iterationType: 'parallel',
iterationContainerId: 'p1',
},
],
},
// Non-iteration span
{
blockId: 'starter',
blockName: 'Starter',
blockType: 'starter',
startedAt: '2024-01-01T10:00:04.000Z',
endedAt: '2024-01-01T10:00:05.000Z',
durationMs: 1000,
success: true,
executionOrder: 5,
},
],
}
const { traceSpans } = buildTraceSpans(result)
const workflow = traceSpans[0]
const children = workflow.children!
const loop = children.find((s) => s.type === 'loop')
const parallel = children.find((s) => s.type === 'parallel')
const starter = children.find((s) => s.name === 'Starter')
expect(loop).toBeDefined()
expect(parallel).toBeDefined()
expect(starter).toBeDefined()
// Loop should have 2 flat iterations
expect(loop!.children).toHaveLength(2)
// P1 should have 1 iteration with nested P2
expect(parallel!.children).toHaveLength(1)
const p1Iter0 = parallel!.children![0]
const nestedP2 = p1Iter0.children!.find((s) => s.type === 'parallel')
expect(nestedP2).toBeDefined()
expect(nestedP2!.children).toHaveLength(1)
})
})


@@ -2,6 +2,7 @@ import { createLogger } from '@sim/logger'
import type { ToolCall, TraceSpan } from '@/lib/logs/types'
import { isWorkflowBlockType, stripCustomToolPrefix } from '@/executor/constants'
import type { ExecutionResult } from '@/executor/types'
import { stripCloneSuffixes } from '@/executor/utils/subflow-utils'
const logger = createLogger('TraceSpans')
@@ -160,6 +161,7 @@ export function buildTraceSpans(result: ExecutionResult): {
...(log.loopId && { loopId: log.loopId }),
...(log.parallelId && { parallelId: log.parallelId }),
...(log.iterationIndex !== undefined && { iterationIndex: log.iterationIndex }),
...(log.parentIterations?.length && { parentIterations: log.parentIterations }),
}
if (log.output?.providerTiming) {
@@ -525,292 +527,320 @@ export function buildTraceSpans(result: ExecutionResult): {
return { traceSpans: groupedRootSpans, totalDuration }
}
/**
* Builds a container-level TraceSpan (iteration wrapper or top-level container)
* from its source spans and resolved children.
*/
function buildContainerSpan(opts: {
id: string
name: string
type: string
sourceSpans: TraceSpan[]
children: TraceSpan[]
}): TraceSpan {
const startTimes = opts.sourceSpans.map((s) => new Date(s.startTime).getTime())
const endTimes = opts.sourceSpans.map((s) => new Date(s.endTime).getTime())
const earliestStart = Math.min(...startTimes)
const latestEnd = Math.max(...endTimes)
const hasErrors = opts.sourceSpans.some((s) => s.status === 'error')
const allErrorsHandled =
hasErrors && opts.children.every((s) => s.status !== 'error' || s.errorHandled)
return {
id: opts.id,
name: opts.name,
type: opts.type,
duration: latestEnd - earliestStart,
startTime: new Date(earliestStart).toISOString(),
endTime: new Date(latestEnd).toISOString(),
status: hasErrors ? 'error' : 'success',
...(allErrorsHandled && { errorHandled: true }),
children: opts.children,
}
}
/** Counter state for generating sequential container names. */
interface ContainerNameCounters {
loopNumbers: Map<string, number>
parallelNumbers: Map<string, number>
loopCounter: number
parallelCounter: number
}
/**
* Resolves a container name from normal (non-iteration) spans or assigns a sequential number.
* Strips clone suffixes so all clones of the same container share one name/number.
*/
function resolveContainerName(
containerId: string,
containerType: 'parallel' | 'loop',
normalSpans: TraceSpan[],
counters: ContainerNameCounters
): string {
const originalId = stripCloneSuffixes(containerId)
const matchingBlock = normalSpans.find(
(s) => s.blockId === originalId && s.type === containerType
)
if (matchingBlock?.name) return matchingBlock.name
if (containerType === 'parallel') {
if (!counters.parallelNumbers.has(originalId)) {
counters.parallelNumbers.set(originalId, counters.parallelCounter++)
}
return `Parallel ${counters.parallelNumbers.get(originalId)}`
}
if (!counters.loopNumbers.has(originalId)) {
counters.loopNumbers.set(originalId, counters.loopCounter++)
}
return `Loop ${counters.loopNumbers.get(originalId)}`
}
/**
* Classifies a span's immediate container ID and type from its metadata.
* Returns undefined for non-iteration spans.
*/
function classifySpanContainer(
span: TraceSpan
): { containerId: string; containerType: 'parallel' | 'loop' } | undefined {
if (span.parallelId) {
return { containerId: span.parallelId, containerType: 'parallel' }
}
if (span.loopId) {
return { containerId: span.loopId, containerType: 'loop' }
}
// Fallback: parse from blockId for legacy data
if (span.blockId?.includes('_parallel_')) {
const match = span.blockId.match(/_parallel_([^_]+)_iteration_/)
if (match) {
return { containerId: match[1], containerType: 'parallel' }
}
}
return undefined
}
/**
* Finds the outermost container for a span. For nested spans, this is parentIterations[0].
* For flat spans, this is the span's own immediate container.
*/
function getOutermostContainer(
span: TraceSpan
): { containerId: string; containerType: 'parallel' | 'loop' } | undefined {
if (span.parentIterations && span.parentIterations.length > 0) {
const outermost = span.parentIterations[0]
return {
containerId: outermost.iterationContainerId,
containerType: outermost.iterationType as 'parallel' | 'loop',
}
}
return classifySpanContainer(span)
}
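The outermost-container lookup above is the pivot of the whole grouping pass: nested spans group under `parentIterations[0]`, flat spans under their own container. A minimal sketch with assumed span shapes:

```typescript
// The outermost container for a nested span is the first entry of its
// parentIterations chain; flat spans fall back to parallelId/loopId.
interface ParentIteration {
  iterationContainerId: string
  iterationType: 'parallel' | 'loop'
}
interface Span {
  parallelId?: string
  loopId?: string
  parentIterations?: ParentIteration[]
}

function outermostContainer(span: Span): { id: string; type: 'parallel' | 'loop' } | undefined {
  const first = span.parentIterations?.[0]
  if (first) return { id: first.iterationContainerId, type: first.iterationType }
  if (span.parallelId) return { id: span.parallelId, type: 'parallel' }
  if (span.loopId) return { id: span.loopId, type: 'loop' }
  return undefined
}

// A leaf inside p2 whose ancestor chain starts at p1 groups under p1, not p2.
const nestedLeaf: Span = {
  parallelId: 'p2',
  parentIterations: [{ iterationContainerId: 'p1', iterationType: 'parallel' }],
}
console.log(outermostContainer(nestedLeaf)) // → { id: 'p1', type: 'parallel' }
```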
/**
* Builds the iteration-level hierarchy for a container, recursively nesting
* any deeper subflows. Works with both:
* - Direct spans (spans whose immediate container matches)
* - Nested spans (spans with parentIterations pointing through this container)
*/
function buildContainerChildren(
containerType: 'parallel' | 'loop',
containerId: string,
spans: TraceSpan[],
normalSpans: TraceSpan[],
counters: ContainerNameCounters
): TraceSpan[] {
const iterationType = containerType === 'parallel' ? 'parallel-iteration' : 'loop-iteration'
// Group spans by iteration index at this level.
// Each span's iteration index at this level comes from:
// - parentIterations[0].iterationCurrent if parentIterations[0].iterationContainerId === containerId
// - span.iterationIndex if the span's immediate container === containerId
const iterationGroups = new Map<number, TraceSpan[]>()
for (const span of spans) {
let iterIdx: number | undefined
if (
span.parentIterations &&
span.parentIterations.length > 0 &&
span.parentIterations[0].iterationContainerId === containerId
) {
iterIdx = span.parentIterations[0].iterationCurrent
} else {
// The span's immediate container is this container
iterIdx = span.iterationIndex
}
if (iterIdx === undefined) continue
if (!iterationGroups.has(iterIdx)) iterationGroups.set(iterIdx, [])
iterationGroups.get(iterIdx)!.push(span)
}
const iterationChildren: TraceSpan[] = []
const sortedIterations = Array.from(iterationGroups.entries()).sort(([a], [b]) => a - b)
for (const [iterationIndex, iterSpans] of sortedIterations) {
// For each span in this iteration, strip one level of ancestry and determine
// whether it belongs to this container directly or to a deeper subflow
const directLeaves: TraceSpan[] = []
const deeperSpans: TraceSpan[] = []
for (const span of iterSpans) {
if (
span.parentIterations &&
span.parentIterations.length > 0 &&
span.parentIterations[0].iterationContainerId === containerId
) {
// Strip the outermost parentIteration (this container level)
deeperSpans.push({
...span,
parentIterations: span.parentIterations.slice(1),
})
} else {
// This span's immediate container IS this container — it's a direct leaf
directLeaves.push({
...span,
name: span.name.replace(/ \(iteration \d+\)$/, ''),
})
}
}
// Recursively group the deeper spans (they'll form nested containers)
const nestedResult = groupIterationBlocksRecursive(
[...directLeaves, ...deeperSpans],
normalSpans,
counters
)
iterationChildren.push(
buildContainerSpan({
id: `${containerId}-iteration-${iterationIndex}`,
name: `Iteration ${iterationIndex}`,
type: iterationType,
sourceSpans: iterSpans,
children: nestedResult,
})
)
}
return iterationChildren
}
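The iteration-bucketing step inside `buildContainerChildren` can be sketched in isolation. This is a minimal, self-contained version of the same Map-grouping pattern, using a hypothetical `MiniSpan` shape in place of the real `TraceSpan` interface:

```typescript
// Minimal sketch of bucketing spans by iteration index, as done above.
// MiniSpan is a hypothetical stand-in for TraceSpan.
interface MiniSpan {
  name: string
  iterationIndex?: number
}

function bucketByIteration(spans: MiniSpan[]): Map<number, MiniSpan[]> {
  const groups = new Map<number, MiniSpan[]>()
  for (const span of spans) {
    const idx = span.iterationIndex
    // Spans with no resolvable iteration index are skipped, mirroring the
    // `if (iterIdx === undefined) continue` guard above.
    if (idx === undefined) continue
    if (!groups.has(idx)) groups.set(idx, [])
    groups.get(idx)!.push(span)
  }
  return groups
}

const buckets = bucketByIteration([
  { name: 'A (iteration 0)', iterationIndex: 0 },
  { name: 'B (iteration 1)', iterationIndex: 1 },
  { name: 'A (iteration 1)', iterationIndex: 1 },
  { name: 'sentinel' }, // no iterationIndex, so it is dropped
])
```

Grouping into a `Map<number, …>` first (rather than sorting spans directly) lets the caller sort iterations once by key, as the sorted `Array.from(iterationGroups.entries())` pass does.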
/**
* Core recursive algorithm for grouping iteration blocks.
*
* Handles two cases:
* 1. **Flat** (backward compat): spans have loopId/parallelId + iterationIndex but no
* parentIterations. Grouped by immediate container → iteration → leaf.
* 2. **Nested** (new): spans have parentIterations chains. The outermost ancestor in the
* chain determines the top-level container. Iteration spans are peeled one level at a
* time and recursed.
*
* Sentinel blocks (parallel/loop containers) do NOT produce BlockLogs, so there are no
* sentinel spans to anchor grouping. Containers are synthesized from the iteration data.
*/
function groupIterationBlocksRecursive(
spans: TraceSpan[],
normalSpans: TraceSpan[],
counters: ContainerNameCounters
): TraceSpan[] {
const result: TraceSpan[] = []
const iterationSpans: TraceSpan[] = []
const nonIterationSpans: TraceSpan[] = []
for (const span of spans) {
if (
span.name.match(/^(.+) \(iteration (\d+)\)$/) ||
(span.parentIterations && span.parentIterations.length > 0)
) {
iterationSpans.push(span)
} else {
nonIterationSpans.push(span)
}
}
// Non-iteration spans that aren't consumed container sentinels go straight to result
const nonContainerSpans = nonIterationSpans.filter(
(span) => (span.type !== 'parallel' && span.type !== 'loop') || span.status === 'error'
)
if (iterationSpans.length === 0) {
result.push(...nonContainerSpans)
result.sort((a, b) => new Date(a.startTime).getTime() - new Date(b.startTime).getTime())
return result
}
// Group iteration spans by outermost container
const containerGroups = new Map<
string,
{ type: 'parallel' | 'loop'; containerId: string; containerName: string; spans: TraceSpan[] }
>()
for (const span of iterationSpans) {
const outermost = getOutermostContainer(span)
if (!outermost) continue
const { containerId, containerType } = outermost
const groupKey = `${containerType}_${containerId}`
if (!containerGroups.has(groupKey)) {
const containerName = resolveContainerName(containerId, containerType, normalSpans, counters)
containerGroups.set(groupKey, {
type: containerType,
containerId,
containerName,
spans: [],
})
}
containerGroups.get(groupKey)!.spans.push(span)
}
// Build each container with recursive nesting
for (const [, group] of containerGroups) {
const { type, containerId, containerName, spans: containerSpans } = group
const iterationChildren = buildContainerChildren(
type,
containerId,
containerSpans,
normalSpans,
counters
)
result.push(
buildContainerSpan({
id: `${type === 'parallel' ? 'parallel' : 'loop'}-execution-${containerId}`,
name: containerName,
type,
sourceSpans: containerSpans,
children: iterationChildren,
})
)
}
result.push(...nonContainerSpans)
result.sort((a, b) => new Date(a.startTime).getTime() - new Date(b.startTime).getTime())
return result
}
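The "peel one ancestry level per recursion step" idea above can be shown on its own. This sketch uses a hypothetical `ParentIter` shape and `peelOuterLevel` helper (not part of the codebase) to illustrate how the outermost `parentIterations` entry is consumed when it belongs to the current container:

```typescript
// Hypothetical sketch of stripping the outermost ancestry level, as the
// recursion above does with parentIterations.slice(1).
interface ParentIter {
  iterationContainerId: string
  iterationCurrent: number
}

function peelOuterLevel(
  parentIterations: ParentIter[],
  containerId: string
): { iteration?: number; rest: ParentIter[] } {
  const [outer, ...rest] = parentIterations
  if (!outer || outer.iterationContainerId !== containerId) {
    // The span does not pass through this container at its outermost level.
    return { rest: parentIterations }
  }
  return { iteration: outer.iterationCurrent, rest }
}

const peeled = peelOuterLevel(
  [
    { iterationContainerId: 'outer-loop', iterationCurrent: 2 },
    { iterationContainerId: 'inner-loop', iterationCurrent: 0 },
  ],
  'outer-loop'
)
// peeled.rest now describes only the inner loop, so the next recursion
// level sees the span as belonging to 'inner-loop'.
```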
/**
* Groups iteration-based blocks (parallel and loop) by organizing their iteration spans
* into a hierarchical structure with proper parent-child relationships.
* Supports recursive nesting via parentIterations (e.g., parallel-in-parallel, loop-in-loop).
*
* @param spans - Array of root spans to process
* @returns Array of spans with iteration blocks properly grouped
*/
function groupIterationBlocks(spans: TraceSpan[]): TraceSpan[] {
const normalSpans = spans.filter((s) => !s.name.match(/^(.+) \(iteration (\d+)\)$/))
const counters: ContainerNameCounters = {
loopNumbers: new Map<string, number>(),
parallelNumbers: new Map<string, number>(),
loopCounter: 1,
parallelCounter: 1,
}
return groupIterationBlocksRecursive(spans, normalSpans, counters)
}
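The sequential-naming fallback used when a loop or parallel block carries no custom name ("Loop 1", "Parallel 2", assigned in first-seen order and stable per container id) can be sketched with a hypothetical `makeNamer` factory:

```typescript
// Sketch of the sequential-naming fallback: each distinct container id
// gets a stable "Loop N" / "Parallel N" label in first-seen order.
// makeNamer is illustrative, not a function from the codebase.
function makeNamer(prefix: string): (id: string) => string {
  const numbers = new Map<string, number>()
  let counter = 1
  return (id: string) => {
    if (!numbers.has(id)) numbers.set(id, counter++)
    return `${prefix} ${numbers.get(id)}`
  }
}

const nameLoop = makeNamer('Loop')
const first = nameLoop('abc') // first loop seen
const second = nameLoop('def') // second loop seen
const again = nameLoop('abc') // repeat lookup stays stable
```

Keeping the counters in a shared structure (as `ContainerNameCounters` does) matters for the recursive grouping: nested levels must not restart numbering from 1.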

View File

@@ -1,5 +1,5 @@
import type { Edge } from 'reactflow'
import type { SerializableExecutionState } from '@/executor/execution/types'
import type { ParentIteration, SerializableExecutionState } from '@/executor/execution/types'
import type { BlockLog, NormalizedBlockOutput } from '@/executor/types'
import type { DeploymentStatus } from '@/stores/workflows/registry/types'
import type { Loop, Parallel, WorkflowState } from '@/stores/workflows/workflow/types'
@@ -194,6 +194,7 @@ export interface TraceSpan {
loopId?: string
parallelId?: string
iterationIndex?: number
parentIterations?: ParentIteration[]
}
export interface WorkflowExecutionSummary {

View File

@@ -16,6 +16,7 @@ import {
GoogleFormsIcon,
GoogleGroupsIcon,
GoogleIcon,
GoogleMeetIcon,
GoogleSheetsIcon,
GoogleTasksIcon,
HubspotIcon,
@@ -168,6 +169,17 @@ export const OAUTH_PROVIDERS: Record<string, OAuthProviderConfig> = {
'https://www.googleapis.com/auth/admin.directory.group.member',
],
},
'google-meet': {
name: 'Google Meet',
description: 'Create and manage Google Meet meeting spaces and conferences.',
providerId: 'google-meet',
icon: GoogleMeetIcon,
baseProviderIcon: GoogleIcon,
scopes: [
'https://www.googleapis.com/auth/meetings.space.created',
'https://www.googleapis.com/auth/meetings.space.readonly',
],
},
'vertex-ai': {
name: 'Vertex AI',
description: 'Access Google Cloud Vertex AI for Gemini models with OAuth.',
@@ -486,7 +498,13 @@ export const OAUTH_PROVIDERS: Record<string, OAuthProviderConfig> = {
providerId: 'airtable',
icon: AirtableIcon,
baseProviderIcon: AirtableIcon,
scopes: ['data.records:read', 'data.records:write', 'user.email:read', 'webhook:manage'],
scopes: [
'data.records:read',
'data.records:write',
'schema.bases:read',
'user.email:read',
'webhook:manage',
],
},
},
defaultService: 'airtable',

View File

@@ -13,6 +13,7 @@ export type OAuthProvider =
| 'google-vault'
| 'google-forms'
| 'google-groups'
| 'google-meet'
| 'vertex-ai'
| 'github'
| 'github-repo'
@@ -61,6 +62,7 @@ export type OAuthService =
| 'google-vault'
| 'google-forms'
| 'google-groups'
| 'google-meet'
| 'vertex-ai'
| 'github'
| 'x'

View File

@@ -278,6 +278,7 @@ async function fetchNewRssItems(
})
if (!response.ok) {
await response.text().catch(() => {})
throw new Error(`Failed to fetch RSS feed: ${response.status} ${response.statusText}`)
}
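Several hunks in this diff add `await response.text().catch(() => {})` before bailing out on a non-OK response: the body is read and discarded so the underlying connection can be released. A minimal sketch of the pattern, with a hypothetical `ensureOk` helper and a stubbed response so it runs without network access:

```typescript
// Sketch of the response-drain pattern added across the fetch call sites
// in this diff. ResponseLike and ensureOk are illustrative names.
interface ResponseLike {
  ok: boolean
  status: number
  statusText: string
  text(): Promise<string>
}

async function ensureOk(response: ResponseLike): Promise<ResponseLike> {
  if (!response.ok) {
    // Consume the body; ignore read errors since we are failing anyway.
    await response.text().catch(() => {})
    throw new Error(`Request failed: ${response.status} ${response.statusText}`)
  }
  return response
}

// A stubbed failing response exercises the drain path deterministically.
let drained = false
const failing: ResponseLike = {
  ok: false,
  status: 503,
  statusText: 'Service Unavailable',
  text: async () => {
    drained = true
    return 'error body'
  },
}
const failureMessage: Promise<string> = ensureOk(failing).then(
  () => 'unexpectedly ok',
  (err: Error) => err.message
)
```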

View File

@@ -1,4 +1,8 @@
import type { ChildWorkflowContext, IterationContext } from '@/executor/execution/types'
import type {
ChildWorkflowContext,
IterationContext,
ParentIteration,
} from '@/executor/execution/types'
import type { SubflowType } from '@/stores/workflows/workflow/types'
export type ExecutionEventType =
@@ -83,6 +87,7 @@ export interface BlockStartedEvent extends BaseExecutionEvent {
iterationTotal?: number
iterationType?: SubflowType
iterationContainerId?: string
parentIterations?: ParentIteration[]
childWorkflowBlockId?: string
childWorkflowName?: string
}
@@ -108,6 +113,7 @@ export interface BlockCompletedEvent extends BaseExecutionEvent {
iterationTotal?: number
iterationType?: SubflowType
iterationContainerId?: string
parentIterations?: ParentIteration[]
childWorkflowBlockId?: string
childWorkflowName?: string
/** Per-invocation unique ID for correlating child block events with this workflow block. */
@@ -135,6 +141,7 @@ export interface BlockErrorEvent extends BaseExecutionEvent {
iterationTotal?: number
iterationType?: SubflowType
iterationContainerId?: string
parentIterations?: ParentIteration[]
childWorkflowBlockId?: string
childWorkflowName?: string
/** Per-invocation unique ID for correlating child block events with this workflow block. */
@@ -271,6 +278,9 @@ export function createSSECallbacks(options: SSECallbackOptions) {
iterationTotal: iterationContext.iterationTotal,
iterationType: iterationContext.iterationType,
iterationContainerId: iterationContext.iterationContainerId,
...(iterationContext.parentIterations?.length && {
parentIterations: iterationContext.parentIterations,
}),
}),
...(childWorkflowContext && {
childWorkflowBlockId: childWorkflowContext.parentBlockId,
@@ -303,6 +313,9 @@ export function createSSECallbacks(options: SSECallbackOptions) {
iterationTotal: iterationContext.iterationTotal,
iterationType: iterationContext.iterationType,
iterationContainerId: iterationContext.iterationContainerId,
...(iterationContext.parentIterations?.length && {
parentIterations: iterationContext.parentIterations,
}),
}
: {}
const childWorkflowData = childWorkflowContext
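The SSE payloads above include `parentIterations` via conditional object spread: spreading a falsy value (`0`, `false`, `undefined`) into an object literal adds no keys, so `...(arr?.length && { parentIterations: arr })` emits the field only when the array exists and is non-empty. A sketch with a hypothetical `buildPayload`:

```typescript
// Sketch of the conditional-spread pattern used in the SSE payloads above.
// buildPayload is illustrative, not a function from the codebase.
function buildPayload(parentIterations?: string[]): Record<string, unknown> {
  return {
    type: 'block_started',
    // Falsy length (0 or undefined) spreads to nothing, omitting the key.
    ...(parentIterations?.length && { parentIterations }),
  }
}

const withParents = buildPayload(['outer:2'])
const withoutParents = buildPayload([])
```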

View File

@@ -52,11 +52,19 @@ export interface StreamingResponseOptions {
}
interface StreamingState {
streamedContent: Map<string, string>
streamedChunks: Map<string, string[]>
processedOutputs: Set<string>
streamCompletionTimes: Map<string, number>
}
function resolveStreamedContent(state: StreamingState): Map<string, string> {
const result = new Map<string, string>()
for (const [blockId, chunks] of state.streamedChunks) {
result.set(blockId, chunks.join(''))
}
return result
}
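The hunk above replaces per-chunk string concatenation with an array of chunks joined once at the end, avoiding repeated reallocation of an ever-growing string on every streamed chunk. A minimal standalone sketch of the same accumulate-then-join pattern:

```typescript
// Sketch of the chunk-accumulation pattern from the diff above: append
// chunks to a per-block array, then join once when the stream settles.
const streamedChunks = new Map<string, string[]>()

function appendChunk(blockId: string, chunk: string): void {
  if (!streamedChunks.has(blockId)) streamedChunks.set(blockId, [])
  streamedChunks.get(blockId)!.push(chunk)
}

function resolveContent(): Map<string, string> {
  const result = new Map<string, string>()
  for (const [blockId, chunks] of streamedChunks) {
    result.set(blockId, chunks.join(''))
  }
  return result
}

appendChunk('agent-1', 'Hel')
appendChunk('agent-1', 'lo')
appendChunk('agent-2', 'ok')
const content = resolveContent()
```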
function extractOutputValue(output: unknown, path: string): unknown {
return traverseObjectPath(output, path)
}
@@ -125,17 +133,21 @@ async function buildMinimalResult(
return minimalResult
}
function updateLogsWithStreamedContent(logs: BlockLog[], state: StreamingState): BlockLog[] {
function updateLogsWithStreamedContent(
logs: BlockLog[],
streamedContent: Map<string, string>,
streamCompletionTimes: Map<string, number>
): BlockLog[] {
return logs.map((log: BlockLog) => {
if (!state.streamedContent.has(log.blockId)) {
if (!streamedContent.has(log.blockId)) {
return log
}
const content = state.streamedContent.get(log.blockId)
const content = streamedContent.get(log.blockId)
const updatedLog = { ...log }
if (state.streamCompletionTimes.has(log.blockId)) {
const completionTime = state.streamCompletionTimes.get(log.blockId)!
if (streamCompletionTimes.has(log.blockId)) {
const completionTime = streamCompletionTimes.get(log.blockId)!
const startTime = new Date(log.startedAt).getTime()
updatedLog.endedAt = new Date(completionTime).toISOString()
updatedLog.durationMs = completionTime - startTime
@@ -176,7 +188,7 @@ export async function createStreamingResponse(
return new ReadableStream({
async start(controller) {
const state: StreamingState = {
streamedContent: new Map(),
streamedChunks: new Map(),
processedOutputs: new Set(),
streamCompletionTimes: new Map(),
}
@@ -210,10 +222,10 @@ export async function createStreamingResponse(
}
const textChunk = decoder.decode(value, { stream: true })
state.streamedContent.set(
blockId,
(state.streamedContent.get(blockId) || '') + textChunk
)
if (!state.streamedChunks.has(blockId)) {
state.streamedChunks.set(blockId, [])
}
state.streamedChunks.get(blockId)!.push(textChunk)
if (isFirstChunk) {
sendChunk(blockId, textChunk)
@@ -242,7 +254,7 @@ export async function createStreamingResponse(
return
}
if (state.streamedContent.has(blockId)) {
if (state.streamedChunks.has(blockId)) {
return
}
@@ -292,9 +304,16 @@ export async function createStreamingResponse(
executionId
)
if (result.logs && state.streamedContent.size > 0) {
result.logs = updateLogsWithStreamedContent(result.logs, state)
processStreamingBlockLogs(result.logs, state.streamedContent)
const streamedContent =
state.streamedChunks.size > 0 ? resolveStreamedContent(state) : new Map<string, string>()
if (result.logs && streamedContent.size > 0) {
result.logs = updateLogsWithStreamedContent(
result.logs,
streamedContent,
state.streamCompletionTimes
)
processStreamingBlockLogs(result.logs, streamedContent)
}
if (
@@ -316,7 +335,7 @@ export async function createStreamingResponse(
const minimalResult = await buildMinimalResult(
result,
streamConfig.selectedOutputs,
state.streamedContent,
streamedContent,
requestId,
streamConfig.includeFileBase64 ?? true,
streamConfig.base64MaxBytes

View File

@@ -328,6 +328,11 @@ const nextConfig: NextConfig = {
source: '/team',
destination: 'https://cal.com/emirkarabeg/sim-team',
permanent: false,
},
{
source: '/careers',
destination: 'https://jobs.ashbyhq.com/sim',
permanent: true,
}
)

View File

@@ -37,6 +37,7 @@ export const ollamaProvider: ProviderConfig = {
try {
const response = await fetch(`${OLLAMA_HOST}/api/tags`)
if (!response.ok) {
await response.text().catch(() => {})
useProvidersStore.getState().setProviderModels('ollama', [])
logger.warn('Ollama service is not available. The provider will be disabled.')
return

View File

@@ -29,6 +29,7 @@ async function fetchModelCapabilities(): Promise<Map<string, ModelCapabilities>>
})
if (!response.ok) {
await response.text().catch(() => {})
logger.warn('Failed to fetch OpenRouter model capabilities', {
status: response.status,
})

View File

@@ -73,6 +73,7 @@ async function fetchWorkflowMetadata(
const response = await fetch(url.toString(), { headers })
if (!response.ok) {
await response.text().catch(() => {})
logger.warn(`Failed to fetch workflow metadata for ${workflowId}`)
return null
}

View File

@@ -57,6 +57,7 @@ export const vllmProvider: ProviderConfig = {
const response = await fetch(`${baseUrl}/v1/models`, { headers })
if (!response.ok) {
await response.text().catch(() => {})
useProvidersStore.getState().setProviderModels('vllm', [])
logger.warn('vLLM service is not available. The provider will be disabled.')
return

View File

@@ -1246,8 +1246,8 @@ async function handleEdgeOperationTx(tx: any, workflowId: string, operation: str
return false
}
if (isBlockProtected(payload.source) || isBlockProtected(payload.target)) {
logger.info(`Skipping edge add - source or target block is protected`)
if (isBlockProtected(payload.target)) {
logger.info(`Skipping edge add - target block is protected`)
break
}
@@ -1269,7 +1269,7 @@ async function handleEdgeOperationTx(tx: any, workflowId: string, operation: str
throw new Error('Missing edge ID for remove operation')
}
// Get the edge to check if connected blocks are protected
// Get the edge to check if target block is protected
const [edgeToRemove] = await tx
.select({
sourceBlockId: workflowEdges.sourceBlockId,
@@ -1283,7 +1283,7 @@ async function handleEdgeOperationTx(tx: any, workflowId: string, operation: str
throw new Error(`Edge ${payload.id} not found in workflow ${workflowId}`)
}
// Check if source or target blocks are protected
// Check if target block is protected
const connectedBlocks = await tx
.select({
id: workflowBlocks.id,
@@ -1345,11 +1345,8 @@ async function handleEdgeOperationTx(tx: any, workflowId: string, operation: str
return false
}
if (
isBlockProtected(edgeToRemove.sourceBlockId) ||
isBlockProtected(edgeToRemove.targetBlockId)
) {
logger.info(`Skipping edge remove - source or target block is protected`)
if (isBlockProtected(edgeToRemove.targetBlockId)) {
logger.info(`Skipping edge remove - target block is protected`)
break
}
@@ -1470,14 +1467,11 @@ async function handleEdgesOperationTx(
}
const safeEdgeIds = edgesToRemove
.filter(
(e: EdgeToRemove) =>
!isBlockProtected(e.sourceBlockId) && !isBlockProtected(e.targetBlockId)
)
.filter((e: EdgeToRemove) => !isBlockProtected(e.targetBlockId))
.map((e: EdgeToRemove) => e.id)
if (safeEdgeIds.length === 0) {
logger.info('All edges are connected to protected blocks, skipping removal')
logger.info('All edges target protected blocks, skipping removal')
return
}
@@ -1569,13 +1563,13 @@ async function handleEdgesOperationTx(
return false
}
// Filter edges - only add edges where neither block is protected
// Filter edges - only add edges where target block is not protected
const safeEdges = (edges as Array<Record<string, unknown>>).filter(
(e) => !isBlockProtected(e.source as string) && !isBlockProtected(e.target as string)
(e) => !isBlockProtected(e.target as string)
)
if (safeEdges.length === 0) {
logger.info('All edges connect to protected blocks, skipping add')
logger.info('All edges target protected blocks, skipping add')
return
}
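The edge-protection hunks above relax the rule from "either endpoint protected" to "target protected": edges may now be drawn out of protected blocks, but not into them. A sketch of the resulting filter, with illustrative names and ids:

```typescript
// Sketch of the relaxed edge-protection rule from the diff above: only a
// protected *target* blocks the operation. EdgeLike and the ids are
// illustrative, not from the codebase.
interface EdgeLike {
  id: string
  source: string
  target: string
}

function filterSafeEdges(
  edges: EdgeLike[],
  isBlockProtected: (id: string) => boolean
): EdgeLike[] {
  return edges.filter((e) => !isBlockProtected(e.target))
}

const protectedIds = new Set(['start'])
const safe = filterSafeEdges(
  [
    { id: 'e1', source: 'start', target: 'agent' }, // protected source: allowed
    { id: 'e2', source: 'agent', target: 'start' }, // protected target: dropped
  ],
  (id) => protectedIds.has(id)
)
```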

View File

@@ -427,6 +427,10 @@ export const useTerminalConsoleStore = create<ConsoleStore>()(
updatedEntry.iterationContainerId = update.iterationContainerId
}
if (update.parentIterations !== undefined) {
updatedEntry.parentIterations = update.parentIterations
}
if (update.childWorkflowBlockId !== undefined) {
updatedEntry.childWorkflowBlockId = update.childWorkflowBlockId
}
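The console-store hunk above follows a partial-update merge convention: only fields explicitly present on the update (not `undefined`) overwrite the existing entry, so unrelated fields survive successive streaming updates. A sketch with hypothetical `EntryLike`/`UpdateLike` shapes:

```typescript
// Sketch of the partial-update merge used by the console store above.
// EntryLike, UpdateLike, and mergeUpdate are illustrative names.
interface EntryLike {
  id: string
  parentIterations?: string[]
  childWorkflowBlockId?: string
}
interface UpdateLike {
  parentIterations?: string[]
  childWorkflowBlockId?: string
}

function mergeUpdate(entry: EntryLike, update: UpdateLike): EntryLike {
  const updated = { ...entry }
  // Only defined fields overwrite; omitted fields are left untouched.
  if (update.parentIterations !== undefined) {
    updated.parentIterations = update.parentIterations
  }
  if (update.childWorkflowBlockId !== undefined) {
    updated.childWorkflowBlockId = update.childWorkflowBlockId
  }
  return updated
}

const merged = mergeUpdate(
  { id: 'e1', childWorkflowBlockId: 'wf-1' },
  { parentIterations: ['loop:0'] } // childWorkflowBlockId omitted: preserved
)
```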

View File

@@ -1,3 +1,4 @@
import type { ParentIteration } from '@/executor/execution/types'
import type { NormalizedBlockOutput } from '@/executor/types'
import type { SubflowType } from '@/stores/workflows/workflow/types'
@@ -22,6 +23,7 @@ export interface ConsoleEntry {
iterationTotal?: number
iterationType?: SubflowType
iterationContainerId?: string
parentIterations?: ParentIteration[]
isRunning?: boolean
isCanceled?: boolean
/** ID of the workflow block in the parent execution that spawned this child block */
@@ -50,6 +52,7 @@ export interface ConsoleUpdate {
iterationTotal?: number
iterationType?: SubflowType
iterationContainerId?: string
parentIterations?: ParentIteration[]
childWorkflowBlockId?: string
childWorkflowName?: string
childWorkflowInstanceId?: string

Some files were not shown because too many files have changed in this diff.