Compare commits


36 Commits

Author SHA1 Message Date
Vikhyath Mondreti
fc5df60d8f remove dead code 2026-03-06 18:37:32 -08:00
Vikhyath Mondreti
adea9db89d another workflowid pass through 2026-03-06 18:25:36 -08:00
Vikhyath Mondreti
94abc424be fix resolve values fallback 2026-03-06 18:18:25 -08:00
Vikhyath Mondreti
c1c6ed66d1 improvement(selectors): simplify selectorContext + add tests 2026-03-06 18:03:40 -08:00
Waleed
a71304200e improvement(oauth): centralize scopes and remove dead scope evaluation code (#3449)
* improvement(oauth): centralize scopes and remove dead scope evaluation code

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(oauth): fix stale scope-descriptions.ts references and add test coverage

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-06 17:08:25 -08:00
Vikhyath Mondreti
a4d581c76f improvement(canonical): backfill for canonical modes on config changes (#3447)
* improvement(canonical): backfill for canonical modes on config changes

* persist data changes to db
2026-03-06 16:17:14 -08:00
Waleed
f1efc598d1 fix(selectors): resolve env var references at design time for selector context (#3446)
* fix(selectors): resolve env var references at design time for selector context

Selectors now resolve {{ENV_VAR}} references before building context and
returning dependency values to consumers, enabling env-var-based credentials
(e.g. {{SLACK_BOT_TOKEN}}) to work with selector dropdowns.
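The resolution described above can be sketched minimally. This is a hypothetical helper (the function name and store shape are assumptions, not the PR's actual code): a `{{ENV_VAR}}` template is looked up in the env store before context is built, and a missing variable yields `undefined` rather than the raw template string.

```typescript
// Hypothetical sketch: resolve a {{ENV_VAR}} reference against an env store.
// A missing variable returns undefined so downstream null-checks discard it
// instead of passing the raw template string through.
function resolveEnvVars(value: string, env: Record<string, string>): string | undefined {
  const match = value.match(/^\{\{([A-Z0-9_]+)\}\}$/)
  if (!match) return value // not a template, pass through unchanged
  return env[match[1]]     // undefined when the env var is not in the store
}
```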

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(selectors): prevent unresolved env var templates from leaking into context

- Fall back to undefined instead of raw template string when env var is
  missing from store, so the null-check in the context loop discards it
- Use resolvedDetailId in query cache key so React Query refetches when
  the underlying env var value changes

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(selectors): use || for consistent empty-string env var handling

Align use-selector-setup.ts with use-selector-query.ts by using || instead
of ?? so empty-string env var values are treated as unset.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-06 15:53:00 -08:00
Waleed
244cf4ff7e feat(selectors): add dropdown selectors for 14 integrations (#3433)
* feat(selectors): add dropdown selectors for 14 integrations

* fix(selectors): secure OAuth tokens in JSM and Confluence selector routes

Convert JSM selector-servicedesks, selector-requesttypes, and Confluence
selector-spaces routes from GET (with access token in URL query params) to
POST with authorizeCredentialUse + refreshAccessTokenIfNeeded pattern. Also
adds missing ensureCredential guard to microsoft.planner.plans registry entry.

* fix(selectors): use sanitized serviceDeskId and encode SharePoint siteId

Use serviceDeskIdValidation.sanitized instead of raw serviceDeskId in JSM
request types URL. Add encodeURIComponent to SharePoint siteId to prevent
URL path injection.

* lint

* fix(selectors): revert encodeURIComponent on SharePoint siteId

SharePoint site IDs use the format "hostname,guid,guid" with commas that
must remain unencoded for the Microsoft Graph API. The encodeURIComponent
call would convert commas to %2C and break the API call.
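The revert rationale can be demonstrated directly (the site ID below is hypothetical, in the "hostname,guid,guid" shape): `encodeURIComponent` percent-encodes the commas the Graph API expects literally.

```typescript
// Hypothetical composite SharePoint site ID. encodeURIComponent leaves
// letters, digits, '.', and '-' alone but converts ',' to %2C, which
// breaks the Microsoft Graph API call.
const siteId = 'contoso.sharepoint.com,11111111-aaaa,22222222-bbbb'
const encoded = encodeURIComponent(siteId)
const commasSurvived = encoded.includes(',')
```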

* fix(selectors): use sanitized cloudId in Confluence and JSM route URLs

Use cloudIdValidation.sanitized instead of raw cloudId in URL construction
for consistency with the validation pattern, even though the current
validator returns the input unchanged.

* fix(selectors): add missing context fields to resolution, ensureCredential to sharepoint.lists, and siteId validation

- Add baseId, datasetId, serviceDeskId to SelectorResolutionArgs,
  ExtendedSelectorContext, extractExtendedContext, useSelectorDisplayName,
  and resolveSelectorForSubBlock so cascading selectors resolve correctly
  through the resolution path.
- Add ensureCredential guard to sharepoint.lists registry entry.
- Add regex validation for SharePoint siteId format (hostname,GUID,GUID).

* fix(selectors): rename advanced subBlock IDs to avoid canonicalParamId clashes

Rename all advanced-mode subBlock IDs that matched their canonicalParamId
to use a `manual*` prefix, following the established convention
(e.g., manualSiteId, manualCredential). This prevents ambiguity between
subBlock IDs and canonical parameter names in the serialization layer.

25 renames across 14 blocks: baseId→manualBaseId, tableId→manualTableId,
workspace→manualWorkspace, objectType→manualObjectType, etc.

* Revert "fix(selectors): rename advanced subBlock IDs to avoid canonicalParamId clashes"

This reverts commit 4e30161c68.

* fix(selectors): rename canonicalParamIds to avoid subBlock ID clashes

Prefix all clashing canonicalParamId values with `selected_` so they
don't match any subBlock ID. Update each block's `inputs` section and
`tools.config.params` function to destructure the new canonical names
and remap them to the original tool param names. SubBlock IDs and tool
definitions remain unchanged for backwards compatibility.

Affected: 25 canonical params across 14 blocks (airtable, asana, attio,
calcom, confluence, google_bigquery, google_tasks, jsm, microsoft_planner,
notion, pipedrive, sharepoint, trello, zoom).

* fix(selectors): rename pre-existing driveId and files canonicalParamIds in SharePoint

Apply the same selected_ prefix convention to the pre-existing SharePoint
driveId and files canonical params that clashed with their subBlock IDs.

* style: format long lines in calcom, pipedrive, and sharepoint blocks

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(selectors): resolve cascading context for selected_ canonical params and normalize Asana response

Strip `selected_` prefix from canonical param IDs when mapping to
SelectorContext fields so cascading selectors (Airtable base→table,
BigQuery dataset→table, JSM serviceDesk→requestType) correctly
propagate parent values.

Normalize Asana workspaces route to return `{ id, name }` instead of
`{ gid, name }` for consistency with all other selector routes.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(selectors): replace hacky prefix stripping with explicit CANONICAL_TO_CONTEXT mapping

Replace CONTEXT_FIELD_SET (Record<string, true>) with CANONICAL_TO_CONTEXT
(Record<string, keyof SelectorContext>) that explicitly maps canonical
param IDs to their SelectorContext field names.

This properly handles the selected_ prefix aliases (e.g. selected_baseId
→ baseId) without string manipulation, and removes the unsafe
Record<string, unknown> cast.
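The shape of that mapping, sketched under assumptions (the context fields and entries here are illustrative, not the PR's full table): typing the record values as `keyof SelectorContext` lets the compiler reject entries that point at nonexistent context fields, which the old `Record<string, true>` set could not do.

```typescript
// Hypothetical slice of the explicit canonical-ID -> context-field map.
interface SelectorContext {
  baseId?: string
  datasetId?: string
}

const CANONICAL_TO_CONTEXT: Record<string, keyof SelectorContext> = {
  selected_baseId: 'baseId', // alias form maps without string manipulation
  baseId: 'baseId',          // plain form maps to itself
  selected_datasetId: 'datasetId',
}

function toContextField(canonicalId: string): keyof SelectorContext | undefined {
  return CANONICAL_TO_CONTEXT[canonicalId]
}
```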

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* refactor(selectors): remove unnecessary selected_ prefix from canonicalParamIds

The selected_ prefix was added to avoid a perceived clash between
canonicalParamId and subBlock id values, but this clash does not
actually cause any issues — pre-existing blocks on main (Google Sheets,
Webflow, SharePoint) already use matching values successfully.

Remove the prefix from all 14 blocks, revert use-selector-setup.ts to
the simple CONTEXT_FIELD_SET pattern, and simplify tools.config.params
functions that were only remapping the prefix back.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(selectors): add spaceId selector pair to Confluence V2 block

The V2 block was missing the spaceSelector basic-mode selector that the
V1 (Legacy) block already had.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* refactor(selectors): revert V1 block changes, add selectors to Notion V1 for V2 inheritance

Confluence V1: reverted to main state (V2 has its own subBlocks).
Notion V1: added selector pairs per-operation since V2 inherits
subBlocks, inputs, and params from V1.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(selectors): audit fixes for auth patterns, registry gaps, and display name resolution

- Convert Microsoft Planner plans/tasks routes from GET+getSession to POST+authorizeCredentialUse
- Add fetchById to microsoft.planner (tasks) and sharepoint.sites registry entries
- Add ensureCredential to sharepoint.sites and microsoft.planner registry fetchList
- Update microsoft.planner.plans registry to use POST method
- Add siteId, collectionId, spreadsheetId, fileId to SelectorDisplayNameArgs and caller
- Add fileId to SelectorResolutionArgs and resolution context
- Fix Zoom topicUpdate visibility in basic mode (remove mode:'advanced')
- Change Zoom meetings selector to fetch upcoming_meetings instead of only scheduled

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* style: lint formatting fixes

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(selectors): consolidate Notion canonical param pairs into array conditions

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(selectors): add missing selectorKey to Confluence V1 page selector

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(selectors): use sanitized IDs in URLs, convert SharePoint routes to POST+authorizeCredentialUse

- Use planIdValidation.sanitized in MS Planner tasks fetch URL
- Convert sharepoint/lists and sharepoint/sites from GET+getSession to POST+authorizeCredentialUse
- Update registry entries to match POST pattern

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(selectors): revert Zoom meetings type to scheduled for broader compatibility

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(selectors): add SharePoint site ID validator, fix cascading selector display name fallbacks

- Add validateSharePointSiteId to input-validation.ts
- Use validation util in SharePoint lists route instead of inline regex
- Add || fallback to selector IDs in workflow-block.tsx so cascading
  display names resolve in basic mode (baseSelector, planSelector, etc.)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(selectors): hoist requestId before try block in all selector routes

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(selectors): hoist requestId before try block in Trello boards route

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
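The hoisting pattern in the two commits above, as a sketch (the route and ID-generation details are hypothetical): declaring `requestId` before the `try` keeps it in scope inside `catch`, so error logs can still be correlated with the request.

```typescript
// If requestId were declared inside the try block, the catch handler could
// not reference it. Hoisting it first keeps it available on both paths.
function handleSelectorRoute(): string {
  const requestId = Math.random().toString(36).slice(2) // hoisted before try
  try {
    throw new Error('upstream API failed') // simulate a fetch failure
  } catch (error) {
    return `[${requestId}] ${(error as Error).message}` // requestId still in scope
  }
}
```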

* fix(selectors): guard selector queries against unresolved variable references

Skip fetchById and context population when values are design-time
placeholders (<Block.output> or {{ENV_VAR}}) rather than real IDs.
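A minimal version of that guard (the helper name and exact patterns are assumptions): detect values that are still design-time references so `fetchById` is skipped for them.

```typescript
// Hypothetical guard: true when the value is a <Block.output> reference or
// contains a {{ENV_VAR}} template rather than a real resolved ID.
function isDesignTimePlaceholder(value: string): boolean {
  const blockRef = /^<[^<>]+>$/ // e.g. <Block.output>
  const envRef = /\{\{[^{}]+\}\}/ // e.g. {{SLACK_BOT_TOKEN}}
  return blockRef.test(value.trim()) || envRef.test(value)
}
```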

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* refactor(selectors): replace hardcoded display name fallbacks with canonical-aware resolution

Use resolveDependencyValue to resolve context values for
useSelectorDisplayName, eliminating manual || getStringValue('*Selector')
fallbacks that required updating for each new selector pair.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(selectors): tighten SharePoint site ID validation to exclude underscores

SharePoint composite site IDs use hostname,guid,guid format where only
alphanumerics, periods, hyphens, and commas are valid characters.
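A validator matching the character set described above, as a sketch (the function name is hypothetical and this is only the character-class check, not the full format validation):

```typescript
// Hypothetical tightened validator: alphanumerics, periods, hyphens, and
// commas only. Underscores and other characters are rejected.
function isValidSharePointSiteId(siteId: string): boolean {
  return /^[A-Za-z0-9.,-]+$/.test(siteId)
}
```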

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(selectors): ensure string IDs in Pipedrive/Cal.com routes, fix Trello closed board filter

Pipedrive pipelines and Cal.com event-types/schedules routes now
consistently return string IDs via String() conversion.

Trello boards route no longer filters out closed boards, preserving
them for fetchById lookups. The closed filter is applied only in the
registry's fetchList so archived boards don't appear in dropdowns
but can still be resolved by ID for display names.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(selectors): convert Zoom meeting IDs to strings for consistency

Zoom API returns numeric meeting IDs. Convert with String() to match
the string ID convention used by all other selector routes.
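The coercion in question, sketched with assumed shapes (the interface and field names are illustrative, not the actual route code):

```typescript
// Zoom returns numeric meeting IDs; coerce to string so every selector
// route hands the dropdown the same { id: string, name: string } shape.
interface SelectorOption {
  id: string
  name: string
}

function toSelectorOption(meeting: { id: number; topic: string }): SelectorOption {
  return { id: String(meeting.id), name: meeting.topic }
}
```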

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(selectors): align registry types with route string ID returns

Routes already convert numeric IDs to strings via String(), so update
the registry types (CalcomEventType, CalcomSchedule, PipedrivePipeline,
ZoomMeeting) from id: number to id: string and remove the now-redundant
String() coercions in fetchList/fetchById.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-06 12:34:28 -08:00
Waleed
ae887185a1 fix(memory): upgrade bun from 1.3.9 to 1.3.10 (#3441) 2026-03-06 11:35:46 -08:00
Waleed
06c88441f8 fix(tool-input): restore workflow input mapper visibility (#3438) 2026-03-06 05:51:27 -08:00
Waleed
127968d467 feat(slack): add views.open, views.update, views.push, views.publish tools (#3436)
* feat(slack): add views.open, views.update, views.push, views.publish tools

* feat(slack): wire view tools into slack block definition
2026-03-05 22:10:02 -08:00
Waleed
2722f0efbf feat(reddit): add 5 new tools, fix bugs, and audit all endpoints against API docs (#3434)
* feat(reddit): add 5 new tools, fix bugs, and audit all endpoints against API docs

* fix(reddit): add optional chaining, pagination wiring, and trim safety

- Add optional chaining on children?.[0] in get_posts, get_controversial,
  search, and get_comments to prevent TypeError on unexpected API responses
- Wire after/before pagination params to get_messages block operation
- Use ?? instead of || for get_comments limit to handle 0 correctly
- Add .trim() on postId in get_comments URL path
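The `??` fix in the list above, as a minimal sketch (the function name and default are hypothetical): `||` would discard a caller-supplied limit of `0`, while `??` only substitutes the default for `null`/`undefined`.

```typescript
// `limit || 100` would turn a legitimate limit of 0 into 100;
// `limit ?? 100` preserves 0 and only defaults on null/undefined.
function resolveCommentLimit(limit: number | undefined, defaultLimit = 100): number {
  return limit ?? defaultLimit
}
```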

* chore(reddit): remove unused output property constants from types.ts

* fix(reddit): add HTTP error handling to GET tools

Add !response.ok guards to get_me, get_user, get_subreddit_info,
and get_messages to return success: false on non-2xx responses
instead of silently returning empty data with success: true.

* fix(reddit): add input validation and HTTP error guards

- Add validateEnum/validatePathSegment to prevent URL path traversal
- Add !response.ok guards to send_message and reply tools
- Centralize subreddit validation in normalizeSubreddit
2026-03-05 20:07:29 -08:00
Vikhyath Mondreti
4f45f705a5 improvement(snapshot): exclude sentinel in client side activation detection (#3432) 2026-03-05 17:26:09 -08:00
Vikhyath Mondreti
d640fa0852 fix(condition): execution with subflow sentinels follow-on, snapshot highlighting, duplicate terminal logs (#3429)
* fix(condition): consecutive error logging + execution dequeuing

* fix snapshot highlighting

* address minor gaps

* fix incomplete case

* remove activatedEdges path

* cleanup tests

* address greptile comments

* update tests
2026-03-05 17:03:02 -08:00
Vikhyath Mondreti
28f8e0fd97 fix(kbs): legacy subblock id migration + CI check (#3425)
* fix(kbs): legacy subblock id migration + CI check

* cleanup migration code

* address regex inaccuracy
2026-03-05 12:38:12 -08:00
Waleed
cc38ecaf12 feat(models): add gpt-5.4 and gpt-5.4-pro model definitions (#3424)
* feat(models): add gpt-5.4 and gpt-5.4-pro model definitions

* fix(providers): update test for gpt-5.4-pro missing verbosity support
2026-03-05 11:59:52 -08:00
Waleed
0a6a2ee694 feat(slack): add new tools and user selectors (#3420)
* feat(slack): add new tools and user selectors

* fix(slack): fix download fileName param and canvas error handling

* fix(slack): use markdown format for canvas rename title_content

* fix(slack): rename channel output to channelInfo and document presence API limitation

* lint

* fix(chat): use explicit trigger type check instead of heuristic for chat guard (#3419)

* fix(chat): use explicit trigger type check instead of heuristic for chat guard

* fix(chat): remove heuristic fallback from isExecutingFromChat

Use only overrideTriggerType === 'chat' instead of also checking
for 'input' in workflowInput, which can false-positive on manual
executions with workflow input.

* fix(chat): use isExecutingFromChat variable consistently in callbacks

Replace inline overrideTriggerType !== 'chat' checks with
!isExecutingFromChat to stay consistent with the rest of the function.

* fix(slack): add missing fields to SlackChannel interface

* fix(slack): fix canvas transformResponse type mismatch

Provide required output fields on error path to match SlackCanvasResponse type.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(slack): move error field to top level in canvas transformResponse

The error field belongs on ToolResponse, not inside the output object.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 22:28:10 -08:00
Waleed
8579beb199 fix(chat): use explicit trigger type check instead of heuristic for chat guard (#3419)
* fix(chat): use explicit trigger type check instead of heuristic for chat guard

* fix(chat): remove heuristic fallback from isExecutingFromChat

Use only overrideTriggerType === 'chat' instead of also checking
for 'input' in workflowInput, which can false-positive on manual
executions with workflow input.

* fix(chat): use isExecutingFromChat variable consistently in callbacks

Replace inline overrideTriggerType !== 'chat' checks with
!isExecutingFromChat to stay consistent with the rest of the function.
2026-03-04 19:05:45 -08:00
Waleed
115b4581a5 fix(editor): pass workspaceId to useCredentialName in block preview (#3418) 2026-03-04 18:15:27 -08:00
Waleed
fcdcaed00d fix(memory): add Bun.gc, stream cancellation, and unconsumed fetch drains (#3416)
* fix(memory): add Bun.gc, stream cancellation, and unconsumed fetch drains

* fix(memory): await reader.cancel() and use non-blocking Bun.gc

* fix(memory): update Bun.gc comment to match non-blocking call

* fix(memory): use response.body.cancel() instead of response.text() for drains

* fix(executor): flush TextDecoder after streaming loop for multi-byte chars
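The decoder-flush fix above addresses a real UTF-8 hazard: a multi-byte character split across two stream chunks. A sketch of the pattern (the function is illustrative, not the executor's actual code): decode each chunk with `stream: true`, then call `decode()` once more with no arguments to flush any buffered partial sequence.

```typescript
// Without the final decoder.decode() flush, a multi-byte character whose
// bytes straddle a chunk boundary would be silently dropped or corrupted.
function decodeChunks(chunks: Uint8Array[]): string {
  const decoder = new TextDecoder()
  let out = ''
  for (const chunk of chunks) {
    out += decoder.decode(chunk, { stream: true }) // buffers incomplete sequences
  }
  out += decoder.decode() // final flush emits anything still buffered
  return out
}
```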

* fix(memory): use text() drain for SecureFetchResponse which lacks body property

* fix(chat): prevent premature isExecuting=false from killing chat stream

The onExecutionCompleted/Error/Cancelled callbacks were setting
isExecuting=false as soon as the server-side SSE stream completed.
For chat executions, this triggered a useEffect in chat.tsx that
cancelled the client-side stream reader before it finished consuming
buffered data — causing empty or partial chat responses.

Skip the isExecuting=false in these callbacks for chat executions
since the chat's own finally block handles cleanup after the stream
is fully consumed.

* fix(chat): remove useEffect anti-pattern that killed chat stream on state change

The effect reacted to isExecuting becoming false to clean up streams,
but this is an anti-pattern per React guidelines — using state changes
as a proxy for events. All cleanup cases are already handled by proper
event paths: stream done (processStreamingResponse), user cancel
(handleStopStreaming), component unmount (cleanup effect), and
abort/error (catch block).

* fix(servicenow): remove invalid string comparison on numeric offset param

* upgrade turborepo
2026-03-04 17:46:20 -08:00
Waleed
04fa31864b feat(servicenow): add offset and display value params to read records (#3415)
* feat(servicenow): add offset and display value params to read records

* fix(servicenow): address greptile review feedback for offset and displayValue

* fix(servicenow): handle offset=0 correctly in pagination

* fix(servicenow): guard offset against empty string in URL builder
2026-03-04 17:01:31 -08:00
Waleed
6b355e9b54 fix(subflows): recurse into all descendants for lock, enable, and protection checks (#3412)
* fix(subflows): recurse into all descendants for lock, enable, and protection checks

* fix(subflows): prevent container resize on initial render and clean up code

- Add canvasReadyRef to skip container dimension recalculation during
  ReactFlow init — position changes from extent clamping fired before
  block heights are measured, causing containers to resize on page load
- Resolve globals.css merge conflict, remove global z-index overrides
  (handled via ReactFlow zIndex prop instead)
- Clean up subflow-node: hoist static helpers to module scope, remove
  unused ref, fix nested ternary readability, rename outlineColor→ringColor

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(subflows): use full ancestor-chain protection for descendant enable-toggle

The enable-toggle for descendants was checking only direct `locked` status
instead of walking the full ancestor chain via `isBlockProtected`. This meant
a block nested 2+ levels inside a locked subflow could still be toggled.
Also added TSDoc clarifying why boxShadow works for subflow ring indicators.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* revert(subflows): remove canvasReadyRef height-gating approach

The canvasReadyRef gating in onNodesChange didn't fully fix the
container resize-on-load issue. Reverting to address properly later.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix: remove unintentional edge-interaction CSS from globals

Leftover from merge conflict resolution — not part of this PR's changes.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(editor): correct isAncestorLocked when block and ancestor both locked, restore fade-in transition

isAncestorLocked was derived from isBlockProtected which short-circuits
on block.locked, so a self-locked block inside a locked ancestor showed
"Unlock block" instead of "Ancestor container is locked". Now walks the
ancestor chain independently.

Also restores the accidentally removed transition-opacity duration-150
class on the ReactFlow container.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(subflows): use full ancestor-chain protection for top-level enable-toggle, restore edge-label z-index

The top-level block check in batchToggleEnabled used block.locked (self
only) while descendants used isBlockProtected (full ancestor chain). A
block inside a locked ancestor but not itself locked would bypass the
check. Now all three layers (store, collaborative hook, DB operations)
consistently use isBlockProtected/isDbBlockProtected at both levels.

Also restores the accidentally removed edge-labels z-index rule, bumped
from 60 to 1001 so labels render above child nodes (zIndex: 1000).

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(subflows): extract isAncestorProtected utility, add cycle detection to all traversals

- Extract isAncestorProtected from utils.ts so editor.tsx doesn't
  duplicate the ancestor-chain walk. isBlockProtected now delegates to it.
- Add visited-set cycle detection to all ancestor walks
  (isBlockProtected, isAncestorProtected, isDbBlockProtected) and
  descendant searches (findAllDescendantNodes, findDbDescendants) to
  guard against corrupt parentId references.
- Document why click-catching div has no event bubbling concern
  (ReactFlow renders children as viewport siblings, not DOM children).
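The visited-set walk described in the second bullet, as a sketch under assumed shapes (the `Block` interface and function signature are illustrative): the set guarantees termination even if corrupt `parentId` references form a cycle.

```typescript
// Walk the ancestor chain looking for a locked container. The visited set
// breaks out of any parentId cycle instead of looping forever.
interface Block {
  id: string
  parentId?: string
  locked?: boolean
}

function isAncestorProtected(id: string, blocks: Map<string, Block>): boolean {
  const visited = new Set<string>()
  let current = blocks.get(id)?.parentId
  while (current && !visited.has(current)) {
    visited.add(current)
    const parent = blocks.get(current)
    if (parent?.locked) return true // any locked ancestor protects the block
    current = parent?.parentId
  }
  return false // chain exhausted (or cycle detected) with no locked ancestor
}
```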

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 15:51:32 -08:00
Waleed
127994f077 feat(slack): add remove reaction tool (#3414)
* feat(slack): add remove reaction tool

* lint
2026-03-04 15:28:41 -08:00
Waleed
efc1aeed70 fix(subflows): fix pointer events for nested subflow interaction (#3409)
* fix(subflows): fix pointer events for nested subflow interaction

* fix(subflows): use Tailwind class for pointer-events-none
2026-03-03 23:28:51 -08:00
Waleed
46065983f6 fix(editor): restore cursor position after tag/env-var completion in code editors (#3406)
* fix(editor): restore cursor position after tag/env-var completion in code editors

* lint

* refactor(editor): extract restoreCursorAfterInsertion helper, fix weak fallbacks

* updated

* fix(editor): replace useEffect with direct ref assignment for editorValueRef

* fix(editor): guard cursor restoration behind preview/readOnly check

Move restoreCursorAfterInsertion inside the !isPreview && !readOnly guard
so cursor position isn't computed against newValue when the textarea still
holds liveValue. Add comment documenting the cross-string index invariant
in the shared helper.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(editor): escape blockId in CSS selector with CSS.escape()

Prevents potential SyntaxError if blockId ever contains CSS special
characters when querying the textarea for cursor restoration.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* perf(editor): use ref for cursor fallback to stabilize useCallback

Replace cursorPosition state in handleSubflowTagSelect's dependency
array with a cursorPositionRef. This avoids recreating the callback
on every keystroke since cursorPosition is only used as a fallback
when textareaRef.current is null.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* refactor(editor): pass cursor position explicitly from dropdowns

Instead of inferring cursor position by searching for delimiters in the
output string (which could match unrelated < or {{ in code), compute
the exact cursor position in TagDropdown and EnvVarDropdown where the
insertion range is definitively known, and pass it through onSelect.

This follows the same pattern used by CodeMirror, Monaco, and
ProseMirror: the insertion source always knows the range, so cursor
position is computed at the source rather than inferred by the consumer.

- TagDropdown/EnvVarDropdown: compute newCursorPosition, pass as 2nd arg
- restoreCursorAfterInsertion: simplified to just (textarea, position)
- code.tsx, condition-input.tsx, use-subflow-editor.ts: accept position
- Removed editorValueRef and cursorPositionRef from use-subflow-editor
  (no longer needed since dropdown computes position)
- Other consumers (native inputs) unaffected due to TS callback compat

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* docs(editor): fix JSDoc terminology — macrotask not microtask

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 22:10:00 -08:00
Waleed
2c79d0249f improvement(executor): support nested loops/parallels (#3398)
* feat(executor): support nested loop DAG construction and edge wiring

Wire inner loop sentinel nodes into outer loop sentinel chains so that
nested loops execute correctly. Resolves boundary-node detection to use
effective sentinel IDs for nested loops, handles loop-exit edges from
inner sentinel-end to outer sentinel-end, and recursively clears
execution state for all nested loop scopes between iterations.

NOTE: loop-in-loop nesting only; parallel nesting is not yet supported.
Made-with: Cursor

* feat(executor): add nested loop iteration context and named loop variable resolution

Introduce ParentIteration to track ancestor loop state, build a
loopParentMap during DAG construction, and propagate parent iterations
through block execution and child workflow contexts.

Extend LoopResolver to support named loop references (e.g. <loop1.index>)
and add output property resolution (<loop1.result>). Named references
use the block's display name normalized to a tag-safe identifier,
enabling blocks inside nested loops to reference any ancestor loop's
iteration state.
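A plausible shape for that normalization, hedged heavily (the exact rules the executor uses are not shown in this log; this sketch just lowercases and strips non-alphanumerics so a display name like "Loop 1" yields a tag-safe `loop1`):

```typescript
// Hypothetical normalization: display name -> tag-safe identifier, so a
// nested block can reference an ancestor loop as e.g. <loop1.index>.
function toTagSafeName(displayName: string): string {
  return displayName.trim().toLowerCase().replace(/[^a-z0-9]+/g, '')
}
```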

NOTE: loop-in-loop nesting only; parallel nesting is not yet supported.
Made-with: Cursor

* feat(terminal): propagate parent iteration context through SSE events and terminal display

Thread parentIterations through SSE block-started, block-completed, and
block-error events so the terminal can reconstruct nested loop
hierarchies. Update the entry tree builder to recursively nest inner
loop subflow nodes inside their parent iteration rows, using
parentIterations depth-stripping to support arbitrary nesting depth.

Display the block's store name for subflow container rows instead of
the generic "Loop" / "Parallel" label.

Made-with: Cursor

* feat(canvas): allow nesting subflow containers and prevent cycles

Remove the restriction that prevented subflow nodes from being dragged
into other subflow containers, enabling loop-in-loop nesting on the
canvas. Add cycle detection (isDescendantOf) to prevent a container
from being placed inside one of its own descendants.

Resize all ancestor containers when a nested child moves, collect
descendant blocks when removing from a subflow so boundary edges are
attributed correctly, and surface all ancestor loop tags in the tag
dropdown for blocks inside nested loops.

Made-with: Cursor

* feat(agent): add MCP server discovery mode for agent tool input (#3353)

* feat(agent): add MCP server discovery mode for agent tool input

* fix(tool-input): use type variant for MCP server tool count badge

* fix(mcp-dynamic-args): align label styling with standard subblock labels

* standardized input format UI

* feat(tool-input): replace MCP server inline expand with drill-down navigation

* feat(tool-input): add chevron affordance and keyboard nav for MCP server drill-down

* fix(tool-input): handle mcp-server type in refresh, validation, badges, and usage control

* refactor(tool-validation): extract getMcpServerIssue, remove fake tool hack

* lint

* reorder dropdown

* perf(agent): parallelize MCP server tool creation with Promise.all

* fix(combobox): preserve cursor movement in search input, reset query on drilldown

* fix(combobox): route ArrowRight through handleSelect, remove redundant type guards

* fix(agent): rename mcpServers to mcpServerSelections to avoid shadowing DB import, route ArrowRight through handleSelect

* docs: update google integration docs

* fix(tool-input): reset drilldown state on tool selection to prevent stale view

* perf(agent): parallelize MCP server discovery across multiple servers

* improvement(tests): speed up unit tests by eliminating vi.resetModules anti-pattern (#3357)

* improvement(tests): speed up unit tests by eliminating vi.resetModules anti-pattern

- convert 51 test files from vi.resetModules/vi.doMock/dynamic import to vi.hoisted/vi.mock/static import
- add global @sim/db mock to vitest.setup.ts
- switch 4 test files from jsdom to node environment
- remove all vi.importActual calls that loaded heavy modules (200+ block files)
- remove slow mockConsoleLogger/mockAuth/setupCommonApiMocks helpers
- reduce real setTimeout delays in engine tests
- mock heavy transitive deps in diff-engine test

test execution time: 34s -> 9s (3.9x faster)
environment time: 2.5s -> 0.6s (4x faster)

* docs(testing): update testing best practices with performance rules

- document vi.hoisted + vi.mock + static import as the standard pattern
- explicitly ban vi.resetModules, vi.doMock, vi.importActual, mockAuth, setupCommonApiMocks
- document global mocks from vitest.setup.ts
- add mock pattern reference for auth, hybrid auth, and database chains
- add performance rules section covering heavy deps, jsdom vs node, real timers

* fix(tests): fix 4 failing test files with missing mocks

- socket/middleware/permissions: add vi.mock for @/lib/auth to prevent transitive getBaseUrl() call
- workflow-handler: add vi.mock for @/executor/utils/http matching executor mock pattern
- evaluator-handler: add db.query.account mock structure before vi.spyOn
- router-handler: same db.query.account fix as evaluator

* fix(tests): replace banned Function type with explicit callback signature

* feat(databricks): add Databricks integration with 8 tools (#3361)

* feat(databricks): add Databricks integration with 8 tools

Add complete Databricks integration supporting SQL execution, job management,
run monitoring, and cluster listing via Personal Access Token authentication.

Tools: execute_sql, list_jobs, run_job, get_run, list_runs, cancel_run,
get_run_output, list_clusters

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(databricks): throw on invalid JSON params, fix boolean coercion, add expandTasks field

- Throw errors on invalid JSON in jobParameters/notebookParams instead of silently defaulting to {}
- Always set boolean params explicitly to prevent string 'false' being truthy
- Add missing expandTasks dropdown UI field for list_jobs operation
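The boolean-coercion pitfall fixed above can be sketched as follows. This is a minimal illustration of why string inputs must be compared explicitly, not the actual Databricks tool code; the helper name is hypothetical.

```typescript
// Every non-empty string is truthy in JS, so `Boolean('false')` is true.
// Passing a raw string param straight into a boolean field silently
// enables the flag. Coerce explicitly instead.
function coerceBoolean(value: unknown): boolean {
  if (typeof value === 'boolean') return value
  if (typeof value === 'string') return value.toLowerCase() === 'true'
  return false
}

const raw = 'false'
const broken = Boolean(raw) // true: non-empty string is truthy
const fixed = coerceBoolean(raw) // false: explicit comparison
```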

* fix(databricks): align tool inputs/outputs with official API spec

- execute_sql: fix wait_timeout default description (50s, not 10s)
- get_run: add queueDuration field, update lifecycle/result state enums
- get_run_output: fix notebook output size (5 MB not 1 MB), add logsTruncated field
- list_runs: add userCancelledOrTimedout to state, fix limit range (1-24), update state enums
- list_jobs: fix name filter description to "exact case-insensitive"
- list_clusters: add PIPELINE_MAINTENANCE to ClusterSource enum

* fix(databricks): regenerate docs to reflect API spec fixes

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>

* feat(luma): add Luma integration for event and guest management (#3364)

* feat(luma): add Luma integration for event and guest management

Add complete Luma (lu.ma) integration with 6 tools: get event, create event,
update event, list calendar events, get guests, and add guests. Includes block
configuration with wandConfig for timestamps/timezones/durations, advanced mode
for optional fields, and generated documentation.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(luma): address PR review feedback

- Remove hosts field from list_events transformResponse (not in LumaEventEntry type)
- Fix truncated add_guests description by removing quotes that broke docs generator

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(luma): fix update_event field name and add_guests response parsing

- Use 'id' instead of 'event_id' in update_event request body per API spec
- Fix add_guests to parse entries[].guest response structure instead of flat guests array

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>

* feat(gamma): add gamma integration for AI-powered content generation (#3358)

* feat(gamma): add gamma integration for AI-powered content generation

* fix(gamma): address PR review comments

- Make credits/error conditionally included in check_status response to avoid always-truthy objects
- Replace full wordmark SVG with square "G" letterform for proper rendering in icon slots

* fix(gamma): remove imageSource from generate_from_template endpoint

The from-template API only accepts imageOptions.model and imageOptions.style,
not imageOptions.source (image source is inherited from the template).

* fix(gamma): use typed output in check_status transformResponse

* regen docs

* feat(greenhouse): add greenhouse integration for managing candidates, jobs, and applications (#3363)

* feat(ashby): add ashby integration for candidate, job, and application management (#3362)

* feat(ashby): add ashby integration for candidate, job, and application management

* fix(ashby): auto-fix lint formatting in docs files

* improvement(oauth): reordered oauth modal (#3368)

* feat(loops): add Loops email platform integration (#3359)

* feat(loops): add Loops email platform integration

Add complete Loops integration with 10 tools covering all API endpoints:
- Contact management: create, update, find, delete
- Email: send transactional emails with attachments
- Events: trigger automated email sequences
- Lists: list mailing lists and transactional email templates
- Properties: create and list contact properties

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* ran lint

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>

* feat(resend): expand integration with contacts, domains, and enhanced email ops (#3366)

* improvement(blocks): update luma styling and linkup field modes (#3370)

* improvement(blocks): update luma styling and linkup field modes

* improvement(fireflies): move optional fields to advanced mode

* improvement(blocks): move optional fields to advanced mode for 10 integrations

* improvement(blocks): move optional fields to advanced mode for 6 more integrations

* feat(x): add 28 new X API v2 tool integrations and expand OAuth scopes (#3365)

* feat(x): add 28 new X API v2 tool integrations and expand OAuth scopes

* fix(x): add missing nextToken param to search tweets and fix XCreateTweetParams type

* fix(x): correct API spec issues in retweeted_by, quote_tweets, personalized_trends, and usage tools

* fix(x): add missing newestId and oldestId to error meta in get_liked_tweets and get_quote_tweets

* fix(x): add missing newestId/oldestId to get_liked_tweets success branch and includes to XTweetListResponse

* fix(x): add error handling to create_tweet and delete_tweet transformResponse

* fix(x): add error handling and logger to all X tools

* fix(x): revert block requiredScopes to match current operations

* feat(x): update block to support all 28 new X API v2 tools

* fix(x): add missing text output and fix hiddenResult output key mismatch

* docs(x): regenerate docs for all 28 new X API v2 tools

* improvement(docs): audit and standardize tool description sections, update developer count to 70k (#3371)

* improvement(x): align OAuth scopes, add scope descriptions, and set optional fields to advanced mode (#3372)

* improvement(x): align OAuth scopes, add scope descriptions, and set optional fields to advanced mode

* improvement(skills): add typed JSON outputs guidance to add-tools, add-block, and add-integration skills

* improvement(skills): add final validation steps to add-tools, add-block, and add-integration skills

* fix(skills): correct misleading JSON array comment in wandConfig example

* feat(skills): add validate-integration skill for auditing tools, blocks, and registry against API docs

* improvement(skills): expand validate-integration with full block-tool alignment, OAuth scopes, pagination, and error handling checks

* improvement(ci): add sticky disk caches and bump runner for faster builds (#3373)

* improvement(selectors): make selectorKeys declarative (#3374)

* fix(webflow): resolution for selectors

* remove unnecessary fallback

* fix teams selector resolution

* make selector keys declarative

* selectors fixes

* improvement(selectors): consolidate selector input logic (#3375)

* feat(google-contacts): add google contacts integration (#3340)

* feat(google-contacts): add google contacts integration

* fix(google-contacts): throw error when no update fields provided

* lint

* update icon

* improvement(google-contacts): add advanced mode, error handling, and input trimming

- Set mode: 'advanced' on optional fields (emailType, phoneType, notes, pageSize, pageToken, sortOrder)
- Add createLogger and response.ok error handling to all 6 tools
- Add .trim() on resourceName in get, update, delete URL builders

* improvement(mcp): add all MCP server tools individually instead of as single server entry (#3376)

* improvement(mcp): add all MCP server tools individually instead of as single server entry

* fix(mcp): prevent remove popover from opening inadvertently

* fix(sse): fix memory leaks in SSE stream cleanup and add memory telemetry (#3378)

* fix(sse): fix memory leaks in SSE stream cleanup and add memory telemetry

* improvement(monitoring): add SSE metering to wand, execution-stream, and a2a-message endpoints

* fix(workflow-execute): remove abort from cancel() to preserve run-on-leave behavior

* improvement(monitoring): use stable process.getActiveResourcesInfo() API

* refactor(a2a): hoist resubscribe cleanup to eliminate duplication between start() and cancel()

* style(a2a): format import line

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(wand): set guard flag on early-return decrement for consistency

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>

* improvement(ashby): validate ashby integration and update skill files (#3381)

* improvement(luma): expand host response fields and harden event ID inputs (#3383)

* improvement(resend): add error handling, authMode, and naming consistency (#3382)

* fix(chat-deploy): fix launch chat popup and auth persistence, clean up React anti-patterns (#3380)

* fix(chat-deploy): fix launch chat popup and auth persistence, clean up React anti-patterns

* lint

* fix(greenhouse): fix email_address query param, add .trim() to ID paths, revert onValidationChange to useEffect

* fix(chat-deploy): fix stale AuthSelector state, stabilize refetch ref, clean up copy timeout

* fix(chat-deploy): reset chatSuccess on modal open to prevent stuck state

* improvement(loops): validate loops integration and update skill files (#3384)

* improvement(loops): validate loops integration and update skill files

* loops icon color

* update databricks icon

* fix(monitoring): set MemoryTelemetry logger to INFO level for production visibility (#3386)

Production defaults to ERROR-only logging. Without this override,
memory snapshots would be silently suppressed.

* feat(integrations): add amplitude, google pagespeed insights, and pagerduty integrations (#3385)

* feat(integrations): add amplitude and google pagespeed insights integrations

* verified and regen docs

* fix icons

* fix(integrations): add pagerduty to tool and block registries

Re-add registry entries that were reverted after initial commit.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* more updates

* ack comments

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>

* feat(docs): add API reference with OpenAPI spec and auto-generated endpoint pages (#3388)

* feat(docs): add API reference with OpenAPI spec and auto-generated endpoint pages

* multiline curl

* random improvements

* cleanup

* update docs copy

* fix build

* cast

* fix build

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
Co-authored-by: Lakee Sivaraya <71339072+lakeesiv@users.noreply.github.com>
Co-authored-by: Vikhyath Mondreti <vikhyath@simstudio.ai>
Co-authored-by: Vikhyath Mondreti <vikhyathvikku@gmail.com>

* fix(icons): fix pagerduty icon (#3392)

* improvement(executor): audit and harden nested loop/parallel implementation

* improvement(executor): audit and harden nested loop/parallel implementation

- Replace unsafe _childWorkflowInstanceId cast with typeof type guard
- Reuse WorkflowNodeMetadata interface instead of inline type duplication
- Rename _executeCore to executeCore (private, no underscore needed)
- Add log warning when SSE callbacks are dropped beyond MAX_SSE_CHILD_DEPTH
- Remove unnecessary onStream type assertion, use StreamingExecution type
- Convert OUTPUT_PROPERTIES/KNOWN_PROPERTIES from arrays to Sets for O(1) lookup
- Add type guard in loop resolver resolveOutput before casting
- Add TSDoc to edgeCrossesLoopBoundary explaining original-ID usage
- Add TSDoc to MAX_SSE_CHILD_DEPTH constant
- Update ParentIteration TSDoc to reflect parallel nesting support
- Type usageControl as union 'auto'|'force'|'none' in buildMcpTool
- Replace (t: any) casts with typed objects in agent-handler tests
- Add type guard in builder-data convertArrayItem
- Make ctx required in clearLoopExecutionState (only caller always passes it)
- Replace Math.random() with deterministic counter in terminal tests
- Fix isWorkflowBlockType mock to actually check block types
- Add loop-in-loop and workflow block tree tests

* improvement(executor): audit fixes for nested subflow implementation

- Fix findInnermostLoopForBlock/ParallelForBlock to return deepest nested
  container instead of first Object.keys() match
- Fix isBlockInLoopOrDescendant returning false when directLoopId equals
  target (should return true)
- Add isBlockInParallelOrDescendant with recursive nested parallel checking
  to match loop resolver behavior
- Extract duplicated ~20-line iteration context building from loop/parallel
  orchestrators into shared buildContainerIterationContext utility
- Remove inline import() type references in orchestrators
- Remove dead executionOrder field from WorkflowNodeMetadata
- Remove redundant double-normalization in findParallelBoundaryNodes
- Consolidate 3 identical tree-walk helpers into generic hasMatchInTree
- Add empty-array guards for Math.min/Math.max in terminal utils
- Make KNOWN_PROPERTIES a Set in parallel resolver for consistency
- Remove no-op handleDragEnd callback from toolbar
- Remove dead result/results entries from KNOWN_PROPERTIES in loop resolver
- Add tests for buildContainerIterationContext

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* finished

* improvement(airtable): added more tools (#3396)

* fix(layout): polyfill crypto.randomUUID for non-secure HTTP contexts (#3397)
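`crypto.randomUUID` is only defined in secure contexts (HTTPS or localhost), so self-hosted deployments served over plain HTTP need a fallback. A sketch of such a polyfill is below; this is illustrative and not necessarily the exact implementation in the commit.

```typescript
// Falls back to building an RFC 4122 v4 UUID from crypto.getRandomValues
// (available even in non-secure contexts) when crypto.randomUUID is absent.
function randomUUID(): string {
  if (typeof crypto !== 'undefined' && typeof crypto.randomUUID === 'function') {
    return crypto.randomUUID()
  }
  const bytes = new Uint8Array(16)
  if (typeof crypto !== 'undefined' && crypto.getRandomValues) {
    crypto.getRandomValues(bytes)
  } else {
    // Last-resort non-cryptographic fallback
    for (let i = 0; i < 16; i++) bytes[i] = Math.floor(Math.random() * 256)
  }
  bytes[6] = (bytes[6] & 0x0f) | 0x40 // version 4
  bytes[8] = (bytes[8] & 0x3f) | 0x80 // variant 10xx
  const hex = Array.from(bytes, (b) => b.toString(16).padStart(2, '0')).join('')
  return `${hex.slice(0, 8)}-${hex.slice(8, 12)}-${hex.slice(12, 16)}-${hex.slice(16, 20)}-${hex.slice(20)}`
}
```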

* feat(integrations): add dub.co integration (#3400)

* feat(integrations): add dub.co integration

* improvement(dub): add manual docs description and lint formatting fixes

* lint

* fix(dub): remove unsupported optional property from block outputs

* fix(memory): fix O(n²) string concatenation and unconsumed fetch response leaks (#3399)

* fix(monitoring): set MemoryTelemetry logger to INFO level for production visibility

Production defaults to ERROR-only logging. Without this override,
memory snapshots would be silently suppressed.

* fix(memory): fix O(n²) string concatenation and unconsumed fetch response leaks

* fix(tests): add text() mock to workflow-handler test fetch responses

* fix(memory): remove unused O(n²) join in onStreamChunk callback
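The O(n²) concatenation pattern fixed in this commit can be sketched as follows: repeated `+=` on a growing string re-copies the accumulated buffer on every append, while collecting chunks in an array and joining once is linear. The function names here are illustrative.

```typescript
// O(n²): each `result += chunk` copies everything accumulated so far.
function concatQuadratic(chunks: string[]): string {
  let result = ''
  for (const chunk of chunks) result += chunk
  return result
}

// O(n): buffer the chunks and allocate the final string once.
function concatLinear(chunks: string[]): string {
  const parts: string[] = []
  for (const chunk of chunks) parts.push(chunk)
  return parts.join('')
}
```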

* chore(careers): remove careers page, redirect to Ashby jobs portal (#3401)

* chore(careers): remove careers page, redirect to Ashby jobs portal

* lint

* feat(integrations): add google meet integration (#3403)

* feat(integrations): add google meet integration

* lint

* ack comments

* ack comments

* fix(terminal): deduplicate nested container entries in buildEntryTree

Filter out container-typed block rows when matching nested subflow
nodes exist, preventing nested loops/parallels from appearing twice
(once as a flat block and once as an expandable subflow).

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* improvement(executor): clean up nested subflow implementation

- Fix wireSentinelEdges to use LOOP_EXIT handle for nested loop terminals
- Extract buildExecutionPipeline to deduplicate orchestrator wiring
- Replace two-phase init with constructor injection for Loop/ParallelOrchestrator
- Remove dead code: shouldExecuteLoopNode, resolveForEachItems, isLoopNode, isParallelNode, isSubflowBlockType
- Deduplicate currentItem resolution in ParallelResolver via resolveCurrentItem
- Type getDistributionItems param as SerializedParallel instead of any
- Demote verbose per-reference logger.info to logger.debug in evaluateWhileCondition
- Add loop-in-parallel wiring test in edges.test.ts

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(test): update parallel resolver test to use distribution instead of distributionItems

The distributionItems fallback was never part of SerializedParallel — it
only worked through any typing. Updated the test to use the real
distribution property.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(executor): skip loop back-edges in parallel boundary detection and update test

findParallelBoundaryNodes now skips LOOP_CONTINUE back-edges when
detecting terminal nodes, matching findLoopBoundaryNodes behavior.
Without this, a nested loop's back-edge was incorrectly counted as a
forward edge within the parallel, preventing terminal detection.

Also updated parallel resolver test to use the real distribution
property instead of the non-existent distributionItems fallback.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(executor): clean up cloned loop scopes in deleteParallelScopeAndClones

When a parallel contains a nested loop, cloned loop scopes (__obranch-N)
created by expandParallel were not being deleted, causing stale scopes to
persist across outer loop iterations.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(executor): remove dead fallbacks, fix nested loop boundary detection, restore executionOrder

- Remove unreachable `?? candidateIds[0]` fallbacks in loop/parallel resolvers
- Remove arbitrary first-match fallback scan in findEffectiveContainerId
- Fix edgeCrossesLoopBoundary to use innermost loop detection for nested loops
- Add warning log for missing branch outputs in parallel aggregation
- Restore executionOrder on WorkflowNodeMetadata and pipe through child workflow notification
- Remove dead sim-drag-subflow classList.remove call
- Clean up cloned loop subflowParentMap entries in deleteParallelScopeAndClones

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* leftover

* upgrade turborepo

* update stagehand icon

* fix(tag-dropdown): show contextual loop/parallel tags for deeply nested blocks

findAncestorLoops only checked direct loop membership, missing blocks nested
inside parallels within loops (and vice versa). Refactored to walk through
both loop and parallel containers recursively, so a block inside a parallel
inside a loop correctly sees the loop's contextual tags (index, currentItem)
instead of the loop's output tags (results).

Also fixed parallel ancestor detection to handle nested parallel-in-loop and
loop-in-parallel scenarios, collecting all ancestor parallels instead of just
the immediate containing one.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* testing

* fixed dedicated logs

* fix

* fix(subflows): enable nested subflow interaction and execution highlighting

Remove !important z-index overrides that prevented nested subflows from
being grabbed/dragged independently. Z-index is now managed by ReactFlow's
elevateNodesOnSelect and per-node zIndex: depth props. Also adds execution
status highlighting for nested subflows in both canvas and snapshot preview.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(preview): add cycle guard to recursive subflow status derivation

Prevents infinite recursion if subflowChildrenMap contains circular
references by tracking visited nodes during traversal.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
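The visited-set cycle guard described in the commit message looks roughly like this. The map shape and function name are illustrative, not the app's real `subflowChildrenMap` types.

```typescript
type ChildrenMap = Map<string, string[]>

// Tracks visited node IDs so a circular parent-to-children map
// cannot recurse forever during descendant traversal.
function collectDescendants(
  map: ChildrenMap,
  nodeId: string,
  visited: Set<string> = new Set()
): string[] {
  if (visited.has(nodeId)) return [] // already seen: break the cycle
  visited.add(nodeId)
  const out: string[] = []
  for (const child of map.get(nodeId) ?? []) {
    out.push(child, ...collectDescendants(map, child, visited))
  }
  return out
}
```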

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
Co-authored-by: Lakee Sivaraya <71339072+lakeesiv@users.noreply.github.com>
Co-authored-by: Vikhyath Mondreti <vikhyath@simstudio.ai>
Co-authored-by: Vikhyath Mondreti <vikhyathvikku@gmail.com>
Co-authored-by: Vasyl Abramovych <vasyl.abramovych@gmail.com>
2026-03-03 19:21:52 -08:00
Waleed
1cf7fdfc8c fix(logs): add status field to log detail API for polling (#3405) 2026-03-03 18:00:21 -08:00
Waleed
37bdffeda0 fix(socket): persist outbound edges from locked blocks (#3404)
* fix(socket): persist outbound edges from locked blocks

* fix(socket): align edge remove protection check with client-side behavior

* fix(socket): align batch edge protection checks with target-only model

* fix(socket): update stale comments for edge protection checks
2026-03-03 12:54:07 -08:00
Waleed
6fa4b9b410 feat(integrations): add brandfetch integration (#3402)
* feat(integrations): add brandfetch integration

* lint

* ack comments
2026-03-02 22:10:38 -08:00
Waleed
f0ee492ada feat(integrations): add google meet integration (#3403)
* feat(integrations): add google meet integration

* lint

* ack comments
2026-03-02 21:59:09 -08:00
Waleed
a8e0203a92 chore(careers): remove careers page, redirect to Ashby jobs portal (#3401)
* chore(careers): remove careers page, redirect to Ashby jobs portal

* lint
2026-03-02 14:12:03 -08:00
Waleed
ebb9a2bdd3 fix(memory): fix O(n²) string concatenation and unconsumed fetch response leaks (#3399)
* fix(monitoring): set MemoryTelemetry logger to INFO level for production visibility

Production defaults to ERROR-only logging. Without this override,
memory snapshots would be silently suppressed.

* fix(memory): fix O(n²) string concatenation and unconsumed fetch response leaks

* fix(tests): add text() mock to workflow-handler test fetch responses

* fix(memory): remove unused O(n²) join in onStreamChunk callback
2026-03-02 13:58:03 -08:00
Waleed
61a447aba5 feat(integrations): add dub.co integration (#3400)
* feat(integrations): add dub.co integration

* improvement(dub): add manual docs description and lint formatting fixes

* lint

* fix(dub): remove unsupported optional property from block outputs
2026-03-02 13:45:09 -08:00
Waleed
e91ab6260a fix(layout): polyfill crypto.randomUUID for non-secure HTTP contexts (#3397) 2026-03-02 11:57:31 -08:00
Waleed
afaa361801 improvement(airtable): added more tools (#3396) 2026-03-02 10:58:21 -08:00
Waleed
cd88706ea4 fix(icons): fix pagerduty icon (#3392) 2026-03-01 23:43:09 -08:00
293 changed files with 20380 additions and 5748 deletions

View File

@@ -20,6 +20,7 @@ When the user asks you to create a block:
import { {ServiceName}Icon } from '@/components/icons'
import type { BlockConfig } from '@/blocks/types'
import { AuthMode } from '@/blocks/types'
import { getScopesForService } from '@/lib/oauth/utils'
export const {ServiceName}Block: BlockConfig = {
type: '{service}', // snake_case identifier
@@ -115,12 +116,17 @@ export const {ServiceName}Block: BlockConfig = {
id: 'credential',
title: 'Account',
type: 'oauth-input',
serviceId: '{service}', // Must match OAuth provider
serviceId: '{service}', // Must match OAuth provider service key
requiredScopes: getScopesForService('{service}'), // Import from @/lib/oauth/utils
placeholder: 'Select account',
required: true,
}
```
**Scopes:** Always use `getScopesForService(serviceId)` from `@/lib/oauth/utils` for `requiredScopes`. Never hardcode scope arrays — the single source of truth is `OAUTH_PROVIDERS` in `lib/oauth/oauth.ts`.
**Scope descriptions:** When adding a new OAuth provider, also add human-readable descriptions for all scopes in `SCOPE_DESCRIPTIONS` within `lib/oauth/utils.ts`.
### Selectors (with dynamic options)
```typescript
// Channel selector (Slack, Discord, etc.)
@@ -624,6 +630,7 @@ export const registry: Record<string, BlockConfig> = {
import { ServiceIcon } from '@/components/icons'
import type { BlockConfig } from '@/blocks/types'
import { AuthMode } from '@/blocks/types'
import { getScopesForService } from '@/lib/oauth/utils'
export const ServiceBlock: BlockConfig = {
type: 'service',
@@ -654,6 +661,7 @@ export const ServiceBlock: BlockConfig = {
title: 'Service Account',
type: 'oauth-input',
serviceId: 'service',
requiredScopes: getScopesForService('service'),
placeholder: 'Select account',
required: true,
},
@@ -792,7 +800,8 @@ All tool IDs referenced in `tools.access` and returned by `tools.config.tool` MU
- [ ] Conditions use correct syntax (field, value, not, and)
- [ ] DependsOn set for fields that need other values
- [ ] Required fields marked correctly (boolean or condition)
- [ ] OAuth inputs have correct `serviceId`
- [ ] OAuth inputs have correct `serviceId` and `requiredScopes: getScopesForService(serviceId)`
- [ ] Scope descriptions added to `SCOPE_DESCRIPTIONS` in `lib/oauth/utils.ts` for any new scopes
- [ ] Tools.access lists all tool IDs (snake_case)
- [ ] Tools.config.tool returns correct tool ID (snake_case)
- [ ] Outputs match tool outputs

View File

@@ -114,6 +114,7 @@ export const {service}{Action}Tool: ToolConfig<Params, Response> = {
import { {Service}Icon } from '@/components/icons'
import type { BlockConfig } from '@/blocks/types'
import { AuthMode } from '@/blocks/types'
import { getScopesForService } from '@/lib/oauth/utils'
export const {Service}Block: BlockConfig = {
type: '{service}',
@@ -144,6 +145,7 @@ export const {Service}Block: BlockConfig = {
title: '{Service} Account',
type: 'oauth-input',
serviceId: '{service}',
requiredScopes: getScopesForService('{service}'),
required: true,
},
// Conditional fields per operation
@@ -409,7 +411,7 @@ If creating V2 versions (API-aligned outputs):
### Block
- [ ] Created `blocks/blocks/{service}.ts`
- [ ] Defined operation dropdown with all operations
- [ ] Added credential field (oauth-input or short-input)
- [ ] Added credential field with `requiredScopes: getScopesForService('{service}')`
- [ ] Added conditional fields per operation
- [ ] Set up dependsOn for cascading selectors
- [ ] Configured tools.access with all tool IDs
@@ -419,6 +421,12 @@ If creating V2 versions (API-aligned outputs):
- [ ] If triggers: set `triggers.enabled` and `triggers.available`
- [ ] If triggers: spread trigger subBlocks with `getTrigger()`
### OAuth Scopes (if OAuth service)
- [ ] Defined scopes in `lib/oauth/oauth.ts` under `OAUTH_PROVIDERS`
- [ ] Added scope descriptions in `SCOPE_DESCRIPTIONS` within `lib/oauth/utils.ts`
- [ ] Used `getCanonicalScopesForProvider()` in `auth.ts` (never hardcode)
- [ ] Used `getScopesForService()` in block `requiredScopes` (never hardcode)
### Icon
- [ ] Asked user to provide SVG
- [ ] Added icon to `components/icons.tsx`
@@ -717,6 +725,25 @@ Use `wandConfig` for fields that are hard to fill out manually:
}
```
### OAuth Scopes (Centralized System)
Scopes are maintained in a single source of truth and reused everywhere:
1. **Define scopes** in `lib/oauth/oauth.ts` under `OAUTH_PROVIDERS[provider].services[service].scopes`
2. **Add descriptions** in `SCOPE_DESCRIPTIONS` within `lib/oauth/utils.ts` for the OAuth modal UI
3. **Reference in auth.ts** using `getCanonicalScopesForProvider(providerId)` from `@/lib/oauth/utils`
4. **Reference in blocks** using `getScopesForService(serviceId)` from `@/lib/oauth/utils`
**Never hardcode scope arrays** in `auth.ts` or block `requiredScopes`. Always import from the centralized source.
```typescript
// In auth.ts (Better Auth config)
scopes: getCanonicalScopesForProvider('{service}'),
// In block credential sub-block
requiredScopes: getScopesForService('{service}'),
```
### Common Gotchas
1. **OAuth serviceId must match** - The `serviceId` in oauth-input must match the OAuth provider configuration
@@ -729,3 +756,5 @@ Use `wandConfig` for fields that are hard to fill out manually:
8. **Always handle legacy file params** - Keep hidden `fileContent` params for backwards compatibility
9. **Optional fields use advanced mode** - Set `mode: 'advanced'` on rarely-used optional fields
10. **Complex inputs need wandConfig** - Timestamps, JSON arrays, and other hard-to-type values should have `wandConfig` enabled
11. **Never hardcode scopes** - Use `getScopesForService()` in blocks and `getCanonicalScopesForProvider()` in auth.ts
12. **Always add scope descriptions** - New scopes must have entries in `SCOPE_DESCRIPTIONS` within `lib/oauth/utils.ts`

View File

@@ -26,8 +26,9 @@ apps/sim/blocks/blocks/{service}.ts # Block definition
apps/sim/tools/registry.ts # Tool registry entries for this service
apps/sim/blocks/registry.ts # Block registry entry for this service
apps/sim/components/icons.tsx # Icon definition
apps/sim/lib/auth/auth.ts # OAuth scopes (if OAuth service)
apps/sim/lib/oauth/oauth.ts # OAuth provider config (if OAuth service)
apps/sim/lib/auth/auth.ts # OAuth config — should use getCanonicalScopesForProvider()
apps/sim/lib/oauth/oauth.ts # OAuth provider config — single source of truth for scopes
apps/sim/lib/oauth/utils.ts # Scope utilities, SCOPE_DESCRIPTIONS for modal UI
```
## Step 2: Pull API Documentation
@@ -199,11 +200,14 @@ For **each tool** in `tools.access`:
## Step 5: Validate OAuth Scopes (if OAuth service)
- [ ] `auth.ts` scopes include ALL scopes needed by ALL tools in the integration
- [ ] `oauth.ts` provider config scopes match `auth.ts` scopes
- [ ] Block `requiredScopes` (if defined) matches `auth.ts` scopes
Scopes are centralized — the single source of truth is `OAUTH_PROVIDERS` in `lib/oauth/oauth.ts`.
- [ ] Scopes defined in `lib/oauth/oauth.ts` under `OAUTH_PROVIDERS[provider].services[service].scopes`
- [ ] `auth.ts` uses `getCanonicalScopesForProvider(providerId)` — NOT a hardcoded array
- [ ] Block `requiredScopes` uses `getScopesForService(serviceId)` — NOT a hardcoded array
- [ ] No hardcoded scope arrays in `auth.ts` or block files (should all use utility functions)
- [ ] Each scope has a human-readable description in `SCOPE_DESCRIPTIONS` within `lib/oauth/utils.ts`
- [ ] No excess scopes that aren't needed by any tool
- [ ] Each scope has a human-readable description in `oauth-required-modal.tsx`'s `SCOPE_DESCRIPTIONS`
## Step 6: Validate Pagination Consistency
@@ -244,7 +248,8 @@ Group findings by severity:
- Missing `.trim()` on ID fields in request URLs
- Missing `?? null` on nullable response fields
- Block condition array missing an operation that uses that field
- Missing scope description in `oauth-required-modal.tsx`
- Hardcoded scope arrays instead of using `getScopesForService()` / `getCanonicalScopesForProvider()`
- Missing scope description in `SCOPE_DESCRIPTIONS` within `lib/oauth/utils.ts`
**Suggestion** (minor improvements):
- Better description text
@@ -273,7 +278,8 @@ After fixing, confirm:
- [ ] Validated wandConfig on timestamps and complex inputs
- [ ] Validated tools.config mapping, tool selector, and type coercions
- [ ] Validated block outputs match what tools return, with typed JSON where possible
- [ ] Validated OAuth scopes alignment across auth.ts, oauth.ts, block, and modal (if OAuth)
- [ ] Validated OAuth scopes use centralized utilities (getScopesForService, getCanonicalScopesForProvider) — no hardcoded arrays
- [ ] Validated scope descriptions exist in `SCOPE_DESCRIPTIONS` within `lib/oauth/utils.ts` for all scopes
- [ ] Validated pagination consistency across tools and block
- [ ] Validated error handling (error checks, meaningful messages)
- [ ] Validated registry entries (tools and block, alphabetical, correct imports)

View File

@@ -1,4 +1,4 @@
FROM oven/bun:1.3.9-alpine
FROM oven/bun:1.3.10-alpine
# Install necessary packages for development
RUN apk add --no-cache \

View File

@@ -20,7 +20,7 @@ jobs:
- name: Setup Bun
uses: oven-sh/setup-bun@v2
with:
bun-version: 1.3.9
bun-version: 1.3.10
- name: Setup Node
uses: actions/setup-node@v4

View File

@@ -23,7 +23,7 @@ jobs:
- name: Setup Bun
uses: oven-sh/setup-bun@v2
with:
bun-version: 1.3.9
bun-version: 1.3.10
- name: Cache Bun dependencies
uses: actions/cache@v4
@@ -122,7 +122,7 @@ jobs:
- name: Setup Bun
uses: oven-sh/setup-bun@v2
with:
bun-version: 1.3.9
bun-version: 1.3.10
- name: Cache Bun dependencies
uses: actions/cache@v4

View File

@@ -19,7 +19,7 @@ jobs:
- name: Setup Bun
uses: oven-sh/setup-bun@v2
with:
bun-version: 1.3.9
bun-version: 1.3.10
- name: Cache Bun dependencies
uses: actions/cache@v4

View File

@@ -19,7 +19,7 @@ jobs:
- name: Setup Bun
uses: oven-sh/setup-bun@v2
with:
bun-version: 1.3.9
bun-version: 1.3.10
- name: Setup Node.js for npm publishing
uses: actions/setup-node@v4

View File

@@ -19,7 +19,7 @@ jobs:
- name: Setup Bun
uses: oven-sh/setup-bun@v2
with:
bun-version: 1.3.9
bun-version: 1.3.10
- name: Setup Node.js for npm publishing
uses: actions/setup-node@v4

View File

@@ -19,7 +19,7 @@ jobs:
- name: Setup Bun
uses: oven-sh/setup-bun@v2
with:
bun-version: 1.3.9
bun-version: 1.3.10
- name: Setup Node
uses: actions/setup-node@v4
@@ -90,6 +90,16 @@ jobs:
echo "✅ All feature flags are properly configured"
- name: Check subblock ID stability
run: |
if [ "${{ github.event_name }}" = "pull_request" ]; then
BASE_REF="origin/${{ github.base_ref }}"
git fetch --depth=1 origin "${{ github.base_ref }}" 2>/dev/null || true
else
BASE_REF="HEAD~1"
fi
bun run apps/sim/scripts/check-subblock-id-stability.ts "$BASE_REF"
- name: Lint code
run: bun run lint:check

View File

@@ -1711,167 +1711,42 @@ export function StagehandIcon(props: SVGProps<SVGSVGElement>) {
return (
<svg
{...props}
width='108'
height='159'
viewBox='0 0 108 159'
fill='none'
xmlns='http://www.w3.org/2000/svg'
width='256'
height='352'
viewBox='0 0 256 352'
fill='none'
>
<path
d='M15 26C22.8234 31.822 23.619 41.405 25.3125 50.3867C25.8461 53.1914 26.4211 55.9689 27.0625 58.75C27.7987 61.9868 28.4177 65.2319 29 68.5C29.332 70.3336 29.6653 72.1669 30 74C30.1418 74.7863 30.2836 75.5727 30.4297 76.3828C31.8011 83.2882 33.3851 90.5397 39.4375 94.75C40.3405 95.3069 40.3405 95.3069 41.2617 95.875C43.8517 97.5512 45.826 99.826 48 102C50.6705 102.89 52.3407 103.143 55.0898 103.211C55.8742 103.239 56.6586 103.268 57.4668 103.297C59.1098 103.349 60.7531 103.393 62.3965 103.43C65.8896 103.567 68.4123 103.705 71.5664 105.289C73 107 73 107 73 111C73.66 111 74.32 111 75 111C74.0759 106.912 74.0759 106.912 71.4766 103.828C67.0509 102.348 62.3634 102.64 57.7305 102.609C52.3632 102.449 49.2783 101.537 45 98C41.8212 94.0795 41.5303 90.9791 42 86C44.9846 83.0154 48.2994 83.6556 52.3047 83.6289C53.139 83.6199 53.9734 83.6108 54.833 83.6015C56.6067 83.587 58.3805 83.5782 60.1543 83.5745C62.8304 83.5627 65.5041 83.5137 68.1797 83.4629C81.1788 83.34 91.8042 85.3227 102 94C106.37 100.042 105.483 106.273 104.754 113.406C103.821 119.026 101.968 124.375 100.125 129.75C99.8806 130.471 99.6361 131.193 99.3843 131.936C97.7783 136.447 95.9466 140.206 93 144C92.34 144 91.68 144 91 144C91 144.66 91 145.32 91 146C79.0816 156.115 63.9798 156.979 49 156C36.6394 154.226 26.7567 148.879 19 139C11.0548 125.712 11.6846 105.465 11.3782 90.4719C11.0579 77.4745 8.03411 64.8142 5.4536 52.1135C5.04373 50.0912 4.64233 48.0673 4.24218 46.043C4.00354 44.8573 3.7649 43.6716 3.51903 42.45C2.14425 33.3121 2.14425 33.3121 4.87499 29.125C8.18297 25.817 10.3605 25.4542 15 26Z'
fill='#FDFDFD'
d='M 242.29,45.79 C 242.29,28.88 226.69,13.76 206.61,13.76 C 188.59,13.76 174.82,28.66 174.82,45.85 V 101.97 C 168.89,98.09 163.18,96.76 157.14,96.76 C 145.94,96.76 137.02,101.49 128.83,110.17 C 121.81,101.01 112.07,95.73 100.72,95.73 C 93.97,95.73 87.82,98.09 82.11,100.9 V 80.05 C 82.11,64.08 66.14,47.28 48.74,47.28 C 31.12,47.28 14.54,62.71 14.54,78.79 V 219.4 C 14.54,273.71 56.99,337.89 125.23,337.89 C 197.41,337.89 242.29,289.05 242.29,186.01 V 78.9 L 242.29,45.79 Z'
fill='black'
/>
<path
d='M91 0.999996C94.8466 2.96604 96.2332 5.08365 97.6091 9.03564C99.203 14.0664 99.4412 18.7459 99.4414 23.9922C99.4538 24.9285 99.4663 25.8647 99.4791 26.8294C99.5049 28.8198 99.5247 30.8103 99.539 32.8008C99.5785 37.9693 99.6682 43.1369 99.7578 48.3047C99.7747 49.3188 99.7917 50.3328 99.8091 51.3776C99.9603 59.6066 100.323 67.7921 100.937 76C101.012 77.0582 101.087 78.1163 101.164 79.2065C101.646 85.1097 102.203 90.3442 105.602 95.3672C107.492 98.9262 107.45 102.194 107.375 106.125C107.366 106.881 107.356 107.638 107.346 108.417C107.18 114.639 106.185 120.152 104 126C103.636 126.996 103.273 127.993 102.898 129.02C98.2182 141.022 92.6784 149.349 80.7891 155.062C67.479 160.366 49.4234 159.559 36 155C32.4272 153.286 29.2162 151.308 26 149C25.0719 148.361 24.1437 147.721 23.1875 147.062C8.32968 133.054 9.60387 109.231 8.73413 90.3208C8.32766 81.776 7.51814 73.4295 5.99999 65C5.82831 64.0338 5.65662 63.0675 5.47973 62.072C4.98196 59.3363 4.46395 56.6053 3.93749 53.875C3.76412 52.9572 3.59074 52.0394 3.4121 51.0938C2.75101 47.6388 2.11387 44.3416 0.999995 41C0.505898 36.899 0.0476353 32.7768 2.04687 29.0469C4.91881 25.5668 6.78357 24.117 11.25 23.6875C15.8364 24.0697 17.5999 24.9021 21 28C24.7763 34.3881 26.047 41.2626 27.1875 48.5C27.5111 50.4693 27.8377 52.4381 28.168 54.4062C28.3733 55.695 28.3733 55.695 28.5828 57.0098C28.8087 58.991 28.8087 58.991 30 60C30.3171 59.4947 30.6342 58.9894 30.9609 58.4688C33.1122 55.4736 34.7097 53.3284 38.3789 52.3945C44.352 52.203 48.1389 53.6183 53 57C53.0928 56.1338 53.0928 56.1338 53.1875 55.25C54.4089 51.8676 55.9015 50.8075 59 49C63.8651 48.104 66.9348 48.3122 71.1487 51.0332C72.0896 51.6822 73.0305 52.3313 74 53C73.9686 51.2986 73.9686 51.2986 73.9365 49.5627C73.8636 45.3192 73.818 41.0758 73.7803 36.8318C73.7603 35.0016 73.733 33.1715 73.6982 31.3415C73.6492 28.6976 73.6269 26.0545 73.6094 23.4102C73.5887 22.6035 73.5681 21.7969 73.5468 20.9658C73.5441 13.8444 75.5121 7.83341 80.25 2.4375C83.9645 0.495841 86.8954 0.209055 91 
0.999996ZM3.99999 30C1.56925 34.8615 3.215 40.9393 4.24218 46.043C4.37061 46.6927 4.49905 47.3424 4.63137 48.0118C5.03968 50.0717 5.45687 52.1296 5.87499 54.1875C11.1768 80.6177 11.1768 80.6177 11.4375 93.375C11.7542 120.78 11.7542 120.78 23.5625 144.375C28.5565 149.002 33.5798 151.815 40 154C40.6922 154.244 41.3844 154.487 42.0977 154.738C55.6463 158.576 72.4909 156.79 84.8086 150.316C87.0103 148.994 89.0458 147.669 91 146C91 145.34 91 144.68 91 144C91.66 144 92.32 144 93 144C97.1202 138.98 99.3206 133.053 101.25 126.937C101.505 126.174 101.76 125.41 102.023 124.623C104.94 115.65 107.293 104.629 103.625 95.625C96.3369 88.3369 86.5231 83.6919 76.1988 83.6088C74.9905 83.6226 74.9905 83.6226 73.7578 83.6367C72.9082 83.6362 72.0586 83.6357 71.1833 83.6352C69.4034 83.6375 67.6235 83.6472 65.8437 83.6638C63.1117 83.6876 60.3806 83.6843 57.6484 83.6777C55.9141 83.6833 54.1797 83.6904 52.4453 83.6992C51.6277 83.6983 50.81 83.6974 49.9676 83.6964C45.5122 83.571 45.5122 83.571 42 86C41.517 90.1855 41.733 92.4858 43.6875 96.25C46.4096 99.4871 48.6807 101.674 53.0105 102.282C55.3425 102.411 57.6645 102.473 60 102.5C69.8847 102.612 69.8847 102.612 74 106C74.8125 108.687 74.8125 108.688 75 111C74.34 111 73.68 111 73 111C72.8969 110.216 72.7937 109.432 72.6875 108.625C72.224 105.67 72.224 105.67 69 104C65.2788 103.745 61.5953 103.634 57.8672 103.609C51.1596 103.409 46.859 101.691 41.875 97C41.2562 96.34 40.6375 95.68 40 95C39.175 94.4637 38.35 93.9275 37.5 93.375C30.9449 87.1477 30.3616 77.9789 29.4922 69.418C29.1557 66.1103 29.1557 66.1103 28.0352 63.625C26.5234 59.7915 26.1286 55.8785 25.5625 51.8125C23.9233 38.3 23.9233 38.3 17 27C11.7018 24.3509 7.9915 26.1225 3.99999 30Z'
fill='#1F1F1F'
d='M 224.94,46.23 C 224.94,36.76 215.91,28.66 205.91,28.66 C 196.75,28.66 189.9,36.11 189.9,45.14 V 152.72 C 202.88,153.38 214.08,155.96 224.94,166.19 V 78.79 L 224.94,46.23 Z'
fill='white'
/>
<path
d='M89.0976 2.53906C91 3 91 3 93.4375 5.3125C96.1586 9.99276 96.178 14.1126 96.2461 19.3828C96.2778 21.1137 96.3098 22.8446 96.342 24.5754C96.3574 25.4822 96.3728 26.3889 96.3887 27.3232C96.6322 41.3036 96.9728 55.2117 98.3396 69.1353C98.9824 75.7746 99.0977 82.3308 99 89C96.5041 88.0049 94.0126 87.0053 91.5351 85.9648C90.3112 85.4563 90.3112 85.4563 89.0625 84.9375C87.8424 84.4251 87.8424 84.4251 86.5976 83.9023C83.7463 82.9119 80.9774 82.4654 78 82C76.7702 65.9379 75.7895 49.8907 75.7004 33.7775C75.6919 32.3138 75.6783 30.8501 75.6594 29.3865C75.5553 20.4082 75.6056 12.1544 80.6875 4.4375C83.6031 2.62508 85.7 2.37456 89.0976 2.53906Z'
fill='#FBFBFB'
d='M 157.21,113.21 C 146.12,113.21 137.93,122.02 137.93,131.76 V 154.62 C 142.24,153.05 145.95,152.61 149.83,152.61 H 174.71 V 131.76 C 174.71,122.35 166.73,113.21 157.21,113.21 Z'
fill='white'
/>
<path
d='M97 13C97.99 13.495 97.99 13.495 99 14C99.0297 15.8781 99.0297 15.8781 99.0601 17.7942C99.4473 46.9184 99.4473 46.9184 100.937 76C101.012 77.0574 101.087 78.1149 101.164 79.2043C101.646 85.1082 102.203 90.3434 105.602 95.3672C107.492 98.9262 107.45 102.194 107.375 106.125C107.366 106.881 107.356 107.638 107.346 108.417C107.18 114.639 106.185 120.152 104 126C103.636 126.996 103.273 127.993 102.898 129.02C98.2182 141.022 92.6784 149.349 80.7891 155.062C67.479 160.366 49.4234 159.559 36 155C32.4272 153.286 29.2162 151.308 26 149C24.6078 148.041 24.6078 148.041 23.1875 147.062C13.5484 137.974 10.832 124.805 9.99999 112C9.91815 101.992 10.4358 91.9898 11 82C11.33 82 11.66 82 12 82C12.0146 82.6118 12.0292 83.2236 12.0442 83.854C11.5946 115.845 11.5946 115.845 24.0625 143.875C28.854 148.273 33.89 150.868 40 153C40.6935 153.245 41.387 153.49 42.1016 153.742C56.9033 157.914 73.8284 155.325 87 148C88.3301 147.327 89.6624 146.658 91 146C91 145.34 91 144.68 91 144C91.66 144 92.32 144 93 144C100.044 130.286 105.786 114.602 104 99C102.157 94.9722 100.121 93.0631 96.3125 90.875C95.5042 90.398 94.696 89.9211 93.8633 89.4297C85.199 85.1035 78.1558 84.4842 68.5 84.3125C67.2006 84.2783 65.9012 84.2442 64.5625 84.209C61.3751 84.127 58.1879 84.0577 55 84C55 83.67 55 83.34 55 83C58.9087 82.7294 62.8179 82.4974 66.7309 82.2981C68.7007 82.1902 70.6688 82.0535 72.6367 81.916C82.854 81.4233 90.4653 83.3102 99 89C98.8637 87.6094 98.8637 87.6094 98.7246 86.1907C96.96 67.8915 95.697 49.7051 95.75 31.3125C95.751 30.5016 95.7521 29.6908 95.7532 28.8554C95.7901 15.4198 95.7901 15.4198 97 13Z'
fill='#262114'
d='M 100.06,111.75 C 89.19,111.75 81.85,121.06 81.85,130.31 V 157.86 C 81.85,167.71 89.72,175.38 99.24,175.38 C 109.71,175.38 118.39,166.91 118.39,157.39 V 130.31 C 118.39,120.79 110.03,111.75 100.06,111.75 Z'
fill='white'
/>
<path
d='M68 51C72.86 54.06 74.644 56.5072 76 62C76.249 65.2763 76.2347 68.5285 76.1875 71.8125C76.1868 72.6833 76.1862 73.554 76.1855 74.4512C76.1406 80.8594 76.1406 80.8594 75 82C73.5113 82.0867 72.0185 82.107 70.5273 82.0976C69.6282 82.0944 68.7291 82.0912 67.8027 82.0879C66.8572 82.0795 65.9117 82.0711 64.9375 82.0625C63.9881 82.058 63.0387 82.0535 62.0605 82.0488C59.707 82.037 57.3535 82.0205 55 82C53.6352 77.2188 53.738 72.5029 53.6875 67.5625C53.6585 66.6208 53.6295 65.6792 53.5996 64.709C53.5591 60.2932 53.5488 57.7378 55.8945 53.9023C59.5767 50.5754 63.1766 50.211 68 51Z'
fill='#F8F8F8'
d='M 192.04,168.87 H 150.16 C 140.19,168.87 133.34,175.39 133.34,183.86 C 133.34,192.9 140.19,199.75 148.66,199.75 H 182.52 C 188.01,199.75 189.63,204.81 189.63,207.49 C 189.63,211.91 186.37,214.64 181.09,215.51 C 162.96,218.66 137.71,229.13 137.71,259.68 C 137.71,265.07 133.67,267.42 130.29,267.42 C 126.09,267.42 122.38,264.74 122.38,260.12 C 122.38,241.15 129.02,228.17 143.26,214.81 C 131.01,212.02 119.21,202.99 117.75,186.43 C 111.93,189.81 107.2,191.15 100.18,191.15 C 82.11,191.15 66.68,176.58 66.68,158.29 V 80.71 C 66.68,71.24 57.16,63.5 49.18,63.5 C 38.71,63.5 29.89,72.42 29.89,80.27 V 217.19 C 29.89,266.48 68.71,322.19 124.88,322.19 C 185.91,322.19 223.91,282.15 223.91,207.16 C 223.91,187.19 214.28,168.87 192.04,168.87 Z'
fill='white'
/>
</svg>
)
}
export function BrandfetchIcon(props: SVGProps<SVGSVGElement>) {
return (
<svg {...props} viewBox='0 0 29 31' fill='none' xmlns='http://www.w3.org/2000/svg'>
<path
d='M46 55C48.7557 57.1816 50.4359 58.8718 52 62C52.0837 63.5215 52.1073 65.0466 52.0977 66.5703C52.0944 67.4662 52.0912 68.3621 52.0879 69.2852C52.0795 70.2223 52.0711 71.1595 52.0625 72.125C52.058 73.0699 52.0535 74.0148 52.0488 74.9883C52.037 77.3256 52.0206 79.6628 52 82C50.9346 82.1992 50.9346 82.1992 49.8477 82.4023C48.9286 82.5789 48.0094 82.7555 47.0625 82.9375C46.146 83.1115 45.2294 83.2855 44.2852 83.4648C42.0471 83.7771 42.0471 83.7771 41 85C40.7692 86.3475 40.5885 87.7038 40.4375 89.0625C40.2931 90.3619 40.1487 91.6613 40 93C37 92 37 92 35.8672 90.1094C35.5398 89.3308 35.2123 88.5522 34.875 87.75C34.5424 86.9817 34.2098 86.2134 33.8672 85.4219C31.9715 80.1277 31.7884 75.065 31.75 69.5C31.7294 68.7536 31.7087 68.0073 31.6875 67.2383C31.6551 62.6607 32.0474 59.7266 35 56C38.4726 54.2637 42.2119 54.3981 46 55Z'
fill='#FAFAFA'
/>
<path
d='M97 13C97.66 13.33 98.32 13.66 99 14C99.0297 15.8781 99.0297 15.8781 99.0601 17.7942C99.4473 46.9184 99.4473 46.9184 100.937 76C101.012 77.0574 101.087 78.1149 101.164 79.2043C101.566 84.1265 102.275 88.3364 104 93C103.625 95.375 103.625 95.375 103 97C102.361 96.2781 101.721 95.5563 101.062 94.8125C94.4402 88.1902 85.5236 84.8401 76.2891 84.5859C75.0451 84.5473 73.8012 84.5086 72.5195 84.4688C71.2343 84.4378 69.9491 84.4069 68.625 84.375C66.6624 84.317 66.6624 84.317 64.6601 84.2578C61.4402 84.1638 58.2203 84.0781 55 84C55 83.67 55 83.34 55 83C58.9087 82.7294 62.8179 82.4974 66.7309 82.2981C68.7007 82.1902 70.6688 82.0535 72.6367 81.916C82.854 81.4233 90.4653 83.3102 99 89C98.9091 88.0729 98.8182 87.1458 98.7246 86.1907C96.96 67.8915 95.697 49.7051 95.75 31.3125C95.751 30.5016 95.7521 29.6908 95.7532 28.8554C95.7901 15.4198 95.7901 15.4198 97 13Z'
fill='#423B28'
/>
<path
d='M91 0.999996C94.3999 3.06951 96.8587 5.11957 98 9C97.625 12.25 97.625 12.25 97 15C95.804 12.6081 94.6146 10.2139 93.4375 7.8125C92.265 5.16236 92.265 5.16236 91 4C88.074 3.7122 85.8483 3.51695 83 4C79.1128 7.37574 78.178 11.0991 77 16C76.8329 18.5621 76.7615 21.1317 76.7695 23.6992C76.77 24.4155 76.7704 25.1318 76.7709 25.8698C76.7739 27.3783 76.7817 28.8868 76.7942 30.3953C76.8123 32.664 76.8147 34.9324 76.8144 37.2012C76.8329 44.6001 77.0765 51.888 77.7795 59.259C78.1413 63.7564 78.1068 68.2413 78.0625 72.75C78.058 73.6498 78.0535 74.5495 78.0488 75.4766C78.0373 77.6511 78.0193 79.8255 78 82C78.99 82.495 78.99 82.495 80 83C68.78 83.33 57.56 83.66 46 84C46.495 83.01 46.495 83.01 47 82C52.9349 80.7196 58.8909 80.8838 64.9375 80.9375C65.9075 80.942 66.8775 80.9465 67.8769 80.9512C70.2514 80.9629 72.6256 80.9793 75 81C75.0544 77.9997 75.0939 75.0005 75.125 72C75.1418 71.1608 75.1585 70.3216 75.1758 69.457C75.2185 63.9475 74.555 59.2895 73 54C73.66 54 74.32 54 75 54C74.9314 53.2211 74.8629 52.4422 74.7922 51.6396C74.1158 43.5036 73.7568 35.4131 73.6875 27.25C73.644 25.5194 73.644 25.5194 73.5996 23.7539C73.5376 15.3866 74.6189 8.85069 80.25 2.4375C83.9433 0.506911 86.9162 0.173322 91 0.999996Z'
fill='#131311'
/>
<path
d='M15 24C20.2332 26.3601 22.1726 29.3732 24.1875 34.5195C26.8667 42.6988 27.2651 50.4282 27 59C26.67 59 26.34 59 26 59C25.8945 58.436 25.7891 57.8721 25.6804 57.291C25.1901 54.6926 24.6889 52.0963 24.1875 49.5C24.0218 48.6131 23.8562 47.7262 23.6855 46.8125C21.7568 35.5689 21.7568 35.5689 15 27C12.0431 26.2498 12.0431 26.2498 8.99999 27C5.97965 28.9369 5.97965 28.9369 3.99999 32C3.67226 36.9682 4.31774 41.4911 5.27733 46.3594C5.40814 47.0304 5.53894 47.7015 5.67371 48.3929C5.94892 49.7985 6.22723 51.2035 6.50854 52.6079C6.93887 54.7569 7.35989 56.9075 7.77929 59.0586C9.09359 66.104 9.09359 66.104 11 73C11.0836 75.2109 11.1073 77.4243 11.0976 79.6367C11.0944 80.9354 11.0912 82.2342 11.0879 83.5723C11.0795 84.944 11.0711 86.3158 11.0625 87.6875C11.0575 89.071 11.0529 90.4544 11.0488 91.8379C11.037 95.2253 11.0206 98.6126 11 102C8.54975 99.5498 8.73228 98.8194 8.65624 95.4492C8.62812 94.53 8.60001 93.6108 8.57104 92.6638C8.54759 91.6816 8.52415 90.6994 8.49999 89.6875C8.20265 81.3063 7.58164 73.2485 5.99999 65C5.67135 63.2175 5.34327 61.435 5.01562 59.6523C4.31985 55.9098 3.62013 52.1681 2.90233 48.4297C2.75272 47.6484 2.60311 46.867 2.44897 46.062C1.99909 43.8187 1.99909 43.8187 0.999995 41C0.505898 36.899 0.0476353 32.7768 2.04687 29.0469C6.06003 24.1839 8.81126 23.4843 15 24Z'
fill='#2A2311'
/>
<path
d='M11 82C11.33 82 11.66 82 12 82C12.0146 82.6118 12.0292 83.2236 12.0442 83.854C11.5946 115.845 11.5946 115.845 24.0625 143.875C30.0569 149.404 36.9894 152.617 45 154C42 156 42 156 39.4375 156C29.964 153.244 20.8381 146.677 16 138C8.26993 120.062 9.92611 101.014 11 82Z'
fill='#272214'
/>
<path
d='M68 49C70.3478 50.1116 71.9703 51.3346 74 53C73.34 53.66 72.68 54.32 72 55C71.505 54.505 71.01 54.01 70.5 53.5C67.6718 51.8031 65.3662 51.5622 62.0976 51.4062C58.4026 52.4521 57.1992 53.8264 55 57C54.3826 61.2861 54.5302 65.4938 54.6875 69.8125C54.7101 70.9823 54.7326 72.1521 54.7559 73.3574C54.8147 76.2396 54.8968 79.1191 55 82C54.01 82 53.02 82 52 82C51.9854 81.4203 51.9708 80.8407 51.9558 80.2434C51.881 77.5991 51.7845 74.9561 51.6875 72.3125C51.6649 71.4005 51.6424 70.4885 51.6191 69.5488C51.4223 64.6292 51.2621 60.9548 48 57C45.6603 55.8302 44.1661 55.8339 41.5625 55.8125C40.78 55.7983 39.9976 55.7841 39.1914 55.7695C36.7079 55.8591 36.7079 55.8591 34 58C32.7955 60.5518 32.7955 60.5518 32 63C31.34 63 30.68 63 30 63C30.2839 59.6879 31.0332 57.9518 32.9375 55.1875C36.7018 52.4987 38.9555 52.3484 43.4844 52.5586C47.3251 53.2325 49.8148 54.7842 53 57C53.0928 56.1338 53.0928 56.1338 53.1875 55.25C55.6091 48.544 61.7788 47.8649 68 49Z'
fill='#1F1A0F'
/>
<path
d='M99 60C99.33 60 99.66 60 100 60C100.05 60.7865 100.1 61.573 100.152 62.3833C100.385 65.9645 100.63 69.5447 100.875 73.125C100.954 74.3625 101.032 75.6 101.113 76.875C101.197 78.0738 101.281 79.2727 101.367 80.5078C101.44 81.6075 101.514 82.7073 101.589 83.8403C102.013 87.1 102.94 89.8988 104 93C103.625 95.375 103.625 95.375 103 97C102.361 96.2781 101.721 95.5563 101.062 94.8125C94.4402 88.1902 85.5236 84.8401 76.2891 84.5859C74.4231 84.5279 74.4231 84.5279 72.5195 84.4688C71.2343 84.4378 69.9491 84.4069 68.625 84.375C67.3166 84.3363 66.0082 84.2977 64.6601 84.2578C61.4402 84.1638 58.2203 84.0781 55 84C55 83.67 55 83.34 55 83C58.9087 82.7294 62.8179 82.4974 66.7309 82.2981C68.7007 82.1902 70.6688 82.0535 72.6367 81.916C82.854 81.4233 90.4653 83.3102 99 89C98.9162 87.912 98.8324 86.8241 98.7461 85.7031C98.1266 77.012 97.9127 68.6814 99 60Z'
fill='#332E22'
/>
<path
d='M15 24C20.2332 26.3601 22.1726 29.3732 24.1875 34.5195C26.8667 42.6988 27.2651 50.4282 27 59C26.67 59 26.34 59 26 59C25.8945 58.436 25.7891 57.8721 25.6804 57.291C25.1901 54.6926 24.6889 52.0963 24.1875 49.5C24.0218 48.6131 23.8562 47.7262 23.6855 46.8125C21.7568 35.5689 21.7568 35.5689 15 27C12.0431 26.2498 12.0431 26.2498 8.99999 27C5.2818 29.7267 4.15499 31.2727 3.18749 35.8125C3.12562 36.8644 3.06374 37.9163 2.99999 39C2.33999 39 1.67999 39 0.999992 39C0.330349 31.2321 0.330349 31.2321 3.37499 27.5625C7.31431 23.717 9.51597 23.543 15 24Z'
fill='#1D180A'
/>
<path
d='M91 0.999996C94.3999 3.06951 96.8587 5.11957 98 9C97.625 12.25 97.625 12.25 97 15C95.804 12.6081 94.6146 10.2139 93.4375 7.8125C92.265 5.16236 92.265 5.16236 91 4C85.4345 3.33492 85.4345 3.33491 80.6875 5.75C78.5543 9.85841 77.6475 13.9354 76.7109 18.4531C76.4763 19.2936 76.2417 20.1341 76 21C75.34 21.33 74.68 21.66 74 22C73.5207 15.4102 74.5846 10.6998 78 5C81.755 0.723465 85.5463 -0.103998 91 0.999996Z'
fill='#16130D'
/>
<path
d='M42 93C42.5569 93.7631 43.1137 94.5263 43.6875 95.3125C46.4238 98.4926 48.7165 100.679 53.0105 101.282C55.3425 101.411 57.6646 101.473 60 101.5C70.6207 101.621 70.6207 101.621 75 106C75.0406 107.666 75.0427 109.334 75 111C74.34 111 73.68 111 73 111C72.7112 110.196 72.4225 109.391 72.125 108.562C71.2674 105.867 71.2674 105.867 69 105C65.3044 104.833 61.615 104.703 57.916 104.658C52.1631 104.454 48.7484 103.292 44 100C41.5625 97.25 41.5625 97.25 40 95C40.66 95 41.32 95 42 95C42 94.34 42 93.68 42 93Z'
fill='#2B2B2B'
/>
<path
d='M11 82C11.33 82 11.66 82 12 82C12.1682 86.6079 12.3287 91.216 12.4822 95.8245C12.5354 97.3909 12.5907 98.9574 12.6482 100.524C12.7306 102.78 12.8055 105.036 12.8789 107.293C12.9059 107.989 12.933 108.685 12.9608 109.402C13.0731 113.092 12.9015 116.415 12 120C11.67 120 11.34 120 11 120C9.63778 112.17 10.1119 104.4 10.4375 96.5C10.4908 95.0912 10.5436 93.6823 10.5957 92.2734C10.7247 88.8487 10.8596 85.4243 11 82Z'
fill='#4D483B'
/>
<path
d='M43.4844 52.5586C47.3251 53.2325 49.8148 54.7842 53 57C52 59 52 59 50 60C49.5256 59.34 49.0512 58.68 48.5625 58C45.2656 55.4268 43.184 55.5955 39.1211 55.6641C36.7043 55.8955 36.7043 55.8955 34 58C32.7955 60.5518 32.7955 60.5518 32 63C31.34 63 30.68 63 30 63C30.2839 59.6879 31.0332 57.9518 32.9375 55.1875C36.7018 52.4987 38.9555 52.3484 43.4844 52.5586Z'
fill='#221F16'
/>
<path
d='M76 73C76.33 73 76.66 73 77 73C77 75.97 77 78.94 77 82C78.485 82.495 78.485 82.495 80 83C68.78 83.33 57.56 83.66 46 84C46.33 83.34 46.66 82.68 47 82C52.9349 80.7196 58.8909 80.8838 64.9375 80.9375C65.9075 80.942 66.8775 80.9465 67.8769 80.9512C70.2514 80.9629 72.6256 80.9793 75 81C75.33 78.36 75.66 75.72 76 73Z'
fill='#040404'
/>
<path
d='M27 54C27.33 54 27.66 54 28 54C28.33 56.97 28.66 59.94 29 63C29.99 63 30.98 63 32 63C32 66.96 32 70.92 32 75C31.01 74.67 30.02 74.34 29 74C28.8672 73.2523 28.7344 72.5047 28.5977 71.7344C28.421 70.7495 28.2444 69.7647 28.0625 68.75C27.8885 67.7755 27.7144 66.8009 27.5352 65.7969C27.0533 63.087 27.0533 63.087 26.4062 60.8125C25.8547 58.3515 26.3956 56.4176 27 54Z'
fill='#434039'
/>
<path
d='M78 5C78.99 5.33 79.98 5.66 81 6C80.3194 6.92812 80.3194 6.92812 79.625 7.875C77.7233 11.532 77.1555 14.8461 76.5273 18.8906C76.3533 19.5867 76.1793 20.2828 76 21C75.34 21.33 74.68 21.66 74 22C73.5126 15.2987 74.9229 10.9344 78 5Z'
fill='#2A2313'
/>
<path
d='M12 115C12.99 115.495 12.99 115.495 14 116C14.5334 118.483 14.9326 120.864 15.25 123.375C15.3531 124.061 15.4562 124.747 15.5625 125.453C16.0763 129.337 16.2441 130.634 14 134C12.6761 127.57 11.752 121.571 12 115Z'
fill='#2F2C22'
/>
<path
d='M104 95C107 98 107 98 107.363 101.031C107.347 102.176 107.33 103.321 107.312 104.5C107.309 105.645 107.305 106.789 107.301 107.969C107 111 107 111 105 114C104.67 107.73 104.34 101.46 104 95Z'
fill='#120F05'
/>
<path
d='M56 103C58.6048 102.919 61.2071 102.86 63.8125 102.812C64.5505 102.787 65.2885 102.762 66.0488 102.736C71.4975 102.662 71.4975 102.662 74 104.344C75.374 106.619 75.2112 108.396 75 111C74.34 111 73.68 111 73 111C72.7112 110.196 72.4225 109.391 72.125 108.562C71.2674 105.867 71.2674 105.867 69 105C66.7956 104.77 64.5861 104.589 62.375 104.438C61.1865 104.354 59.998 104.27 58.7734 104.184C57.4006 104.093 57.4006 104.093 56 104C56 103.67 56 103.34 56 103Z'
fill='#101010'
/>
<path
d='M23 40C23.66 40 24.32 40 25 40C27.3084 46.3482 27.1982 52.2948 27 59C26.67 59 26.34 59 26 59C25.01 52.73 24.02 46.46 23 40Z'
fill='#191409'
/>
<path
d='M47 83C46.3606 83.3094 45.7212 83.6187 45.0625 83.9375C41.9023 87.0977 42.181 90.6833 42 95C41.01 94.67 40.02 94.34 39 94C39.3463 85.7409 39.3463 85.7409 41.875 82.875C44 82 44 82 47 83Z'
fill='#171717'
/>
<path
d='M53 61C53.33 61 53.66 61 54 61C54.33 67.93 54.66 74.86 55 82C54.01 82 53.02 82 52 82C52.33 75.07 52.66 68.14 53 61Z'
fill='#444444'
/>
<path
d='M81 154C78.6696 156.33 77.8129 156.39 74.625 156.75C73.4687 156.897 73.4687 156.897 72.2891 157.047C69.6838 156.994 68.2195 156.317 66 155C67.7478 154.635 69.4984 154.284 71.25 153.938C72.7118 153.642 72.7118 153.642 74.2031 153.34C76.8681 153.016 78.4887 153.145 81 154Z'
fill='#332F23'
/>
<path
d='M19 28C19.66 28 20.32 28 21 28C21.6735 29.4343 22.3386 30.8726 23 32.3125C23.5569 33.5133 23.5569 33.5133 24.125 34.7383C25 37 25 37 25 40C22 39 22 39 21.0508 37.2578C20.8071 36.554 20.5635 35.8502 20.3125 35.125C20.0611 34.4263 19.8098 33.7277 19.5508 33.0078C19 31 19 31 19 28Z'
fill='#282213'
/>
<path
d='M102 87C104.429 93.2857 104.429 93.2857 103 97C100.437 94.75 100.437 94.75 98 92C98.0625 89.75 98.0625 89.75 99 88C101 87 101 87 102 87Z'
fill='#37301F'
/>
<path
d='M53 56C53.33 56 53.66 56 54 56C53.67 62.27 53.34 68.54 53 75C52.67 75 52.34 75 52 75C51.7788 72.2088 51.5726 69.4179 51.375 66.625C51.3105 65.8309 51.2461 65.0369 51.1797 64.2188C51.0394 62.1497 51.0124 60.0737 51 58C51.66 57.34 52.32 56.68 53 56Z'
fill='#030303'
/>
<path
d='M100 129C100.33 129 100.66 129 101 129C100.532 133.776 99.7567 137.045 97 141C96.34 140.67 95.68 140.34 95 140C96.65 136.37 98.3 132.74 100 129Z'
fill='#1E1A12'
/>
<path
d='M15 131C17.7061 132.353 17.9618 133.81 19.125 136.562C19.4782 137.389 19.8314 138.215 20.1953 139.066C20.4609 139.704 20.7264 140.343 21 141C20.01 141 19.02 141 18 141C15.9656 137.27 15 135.331 15 131Z'
fill='#1C1912'
/>
<path
d='M63 49C69.4 49.4923 69.4 49.4923 72.4375 52.0625C73.2109 53.0216 73.2109 53.0216 74 54C70.8039 54 69.5828 53.4533 66.8125 52C66.0971 51.6288 65.3816 51.2575 64.6445 50.875C64.1018 50.5863 63.5591 50.2975 63 50C63 49.67 63 49.34 63 49Z'
fill='#13110C'
/>
<path
d='M0.999992 39C1.98999 39 2.97999 39 3.99999 39C5.24999 46.625 5.24999 46.625 2.99999 50C2.33999 46.37 1.67999 42.74 0.999992 39Z'
fill='#312C1E'
/>
<path
d='M94 5C94.66 5 95.32 5 96 5C97.8041 7.75924 98.0127 8.88972 97.625 12.25C97.4187 13.1575 97.2125 14.065 97 15C95.1161 11.7345 94.5071 8.71888 94 5Z'
fill='#292417'
/>
<path
d='M20 141C23.3672 142.393 24.9859 143.979 27 147C24.625 146.812 24.625 146.812 22 146C20.6875 143.438 20.6875 143.438 20 141Z'
fill='#373328'
/>
<path
d='M86 83C86.33 83.99 86.66 84.98 87 86C83.37 85.34 79.74 84.68 76 84C80.3553 81.8223 81.4663 81.9696 86 83Z'
fill='#2F2F2F'
/>
<path
d='M42 93C46 97.625 46 97.625 46 101C44.02 99.35 42.04 97.7 40 96C40.66 95.67 41.32 95.34 42 95C42 94.34 42 93.68 42 93Z'
fill='#232323'
/>
<path
d='M34 55C34.66 55.33 35.32 55.66 36 56C35.5256 56.7838 35.0512 57.5675 34.5625 58.375C33.661 59.8895 32.7882 61.4236 32 63C31.34 63 30.68 63 30 63C30.4983 59.3125 31.1007 57.3951 34 55Z'
fill='#110F0A'
d='M29 7.54605C29 9.47222 28.316 11.1378 26.9481 12.5428C25.5802 13.9251 23.5852 14.9222 20.9634 15.534C22.377 15.9192 23.4484 16.5537 24.1781 17.4375C24.9077 18.2987 25.2724 19.2956 25.2724 20.4287C25.2724 22.2189 24.7025 23.7713 23.5625 25.0855C22.4454 26.3998 20.8039 27.4195 18.638 28.1447C16.4721 28.8472 13.8616 29.1985 10.8066 29.1985C9.66666 29.1985 8.75472 29.1645 8.07075 29.0965C8.04796 29.7309 7.77438 30.2068 7.25 30.5241C6.72562 30.8414 6.05307 31 5.23231 31C4.41156 31 3.84159 30.8187 3.52241 30.4561C3.22603 30.0936 3.10062 29.561 3.14623 28.8586C3.35141 25.686 3.75039 22.3662 4.34316 18.8991C4.93593 15.4094 5.68829 12.0442 6.60024 8.80373C6.75982 8.23721 7.07901 7.84064 7.55778 7.61404C8.03656 7.38743 8.66353 7.27412 9.43868 7.27412C10.8294 7.27412 11.5248 7.65936 11.5248 8.42983C11.5248 8.74708 11.4564 9.10965 11.3196 9.51754C10.7268 11.2851 10.134 13.6871 9.54127 16.7237C8.9485 19.7375 8.52674 22.6156 8.27594 25.3575C9.37028 25.448 10.2594 25.4934 10.9434 25.4934C14.1352 25.4934 16.4721 25.0401 17.954 24.1338C19.4587 23.2046 20.2111 22.0263 20.2111 20.5987C20.2111 19.6016 19.778 18.7632 18.9116 18.0833C18.0681 17.4035 16.6431 17.0296 14.6368 16.9616C14.1808 16.939 13.8616 16.8257 13.6792 16.6217C13.4968 16.4178 13.4057 16.0892 13.4057 15.636C13.4057 14.9788 13.5425 14.4463 13.816 14.0384C14.0896 13.6305 14.5912 13.4152 15.3208 13.3925C16.9395 13.3472 18.3986 13.1093 19.6981 12.6787C21.0204 12.2482 22.0578 11.6477 22.8101 10.8772C23.5625 10.0841 23.9387 9.1663 23.9387 8.1239C23.9387 6.80958 23.2889 5.77851 21.9894 5.0307C20.6899 4.26024 18.6949 3.875 16.0047 3.875C13.5652 3.875 11.2056 4.19226 8.92571 4.82676C6.64584 5.4386 4.70793 6.2204 3.11203 7.17215C2.38246 7.6027 1.7669 7.81798 1.26533 7.81798C0.854953 7.81798 0.53577 7.68202 0.307783 7.41009C0.102594 7.1155 0 6.75292 0 6.32237C0 5.75585 0.113994 5.26864 0.341981 4.86075C0.592768 4.45285 1.17414 3.98831 2.08608 3.46711C4.00118 2.37939 6.24685 1.52961 8.82311 0.917763C11.3994 0.305921 14.0326 0 
16.7229 0C20.8494 0 23.9272 0.691156 25.9564 2.07347C27.9855 3.45577 29 5.27998 29 7.54605Z'
fill='currentColor'
/>
</svg>
)
@@ -2467,7 +2342,7 @@ export function PagerDutyIcon(props: SVGProps<SVGSVGElement>) {
<svg {...props} xmlns='http://www.w3.org/2000/svg' viewBox='0 0 64 64' fill='none'>
<path
d='M6.704 59.217H0v-33.65c0-3.455 1.418-5.544 2.604-6.704 2.63-2.58 6.2-2.656 6.782-2.656h10.546c3.765 0 5.93 1.52 7.117 2.8 2.346 2.553 2.372 5.853 2.32 6.73v12.687c0 3.662-1.496 5.828-2.733 6.988-2.553 2.398-5.93 2.45-6.73 2.424H6.704zm13.46-18.102c.36 0 1.367-.103 1.908-.62.413-.387.62-1.083.62-2.1v-13.02c0-.36-.077-1.315-.593-1.857-.5-.516-1.444-.62-2.166-.62h-10.6c-2.63 0-2.63 1.985-2.63 2.656v15.55zM57.296 4.783H64V38.46c0 3.455-1.418 5.544-2.604 6.704-2.63 2.58-6.2 2.656-6.782 2.656H44.068c-3.765 0-5.93-1.52-7.117-2.8-2.346-2.553-2.372-5.853-2.32-6.73V25.62c0-3.662 1.496-5.828 2.733-6.988 2.553-2.398 5.93-2.45 6.73-2.424h13.202zM43.836 22.9c-.36 0-1.367.103-1.908.62-.413.387-.62 1.083-.62 2.1v13.02c0 .36.077 1.315.593 1.857.5.516 1.444.62 2.166.62h10.598c2.656-.026 2.656-2 2.656-2.682V22.9z'
fill='#06AC38'
fill='#FFFFFF'
/>
</svg>
)
@@ -4796,6 +4671,22 @@ export function GoogleGroupsIcon(props: SVGProps<SVGSVGElement>) {
)
}
export function GoogleMeetIcon(props: SVGProps<SVGSVGElement>) {
return (
<svg {...props} xmlns='http://www.w3.org/2000/svg' fill='none' viewBox='0 0 87.5 72'>
<path fill='#00832d' d='M49.5 36l8.53 9.75 11.47 7.33 2-17.02-2-16.64-11.69 6.44z' />
<path fill='#0066da' d='M0 51.5V66c0 3.315 2.685 6 6 6h14.5l3-10.96-3-9.54-9.95-3z' />
<path fill='#e94235' d='M20.5 0L0 20.5l10.55 3 9.95-3 2.95-9.41z' />
<path fill='#2684fc' d='M20.5 20.5H0v31h20.5z' />
<path
fill='#00ac47'
d='M82.6 8.68L69.5 19.42v33.66l13.16 10.79c1.97 1.54 4.85.135 4.85-2.37V11c0-2.535-2.945-3.925-4.91-2.32zM49.5 36v15.5h-29V72h43c3.315 0 6-2.685 6-6V53.08z'
/>
<path fill='#ffba00' d='M63.5 0h-43v20.5h29V36l20-16.57V6c0-3.315-2.685-6-6-6z' />
</svg>
)
}
export function CursorIcon(props: SVGProps<SVGSVGElement>) {
return (
<svg {...props} xmlns='http://www.w3.org/2000/svg' viewBox='0 0 546 546' fill='currentColor'>
@@ -4804,6 +4695,19 @@ export function CursorIcon(props: SVGProps<SVGSVGElement>) {
)
}
export function DubIcon(props: SVGProps<SVGSVGElement>) {
return (
<svg {...props} viewBox='0 0 64 64' fill='none' xmlns='http://www.w3.org/2000/svg'>
<path
fillRule='evenodd'
clipRule='evenodd'
d='M32 64c17.673 0 32-14.327 32-32 0-11.844-6.435-22.186-16-27.719V48h-8v-2.14A15.9 15.9 0 0 1 32 48c-8.837 0-16-7.163-16-16s7.163-16 16-16c2.914 0 5.647.78 8 2.14V1.008A32 32 0 0 0 32 0C14.327 0 0 14.327 0 32s14.327 32 32 32'
fill='currentColor'
/>
</svg>
)
}
export function DuckDuckGoIcon(props: SVGProps<SVGSVGElement>) {
return (
<svg {...props} xmlns='http://www.w3.org/2000/svg' viewBox='-108 -108 216 216'>


@@ -17,6 +17,7 @@ import {
AshbyIcon,
AttioIcon,
BrainIcon,
BrandfetchIcon,
BrowserUseIcon,
CalComIcon,
CalendlyIcon,
@@ -33,6 +34,7 @@ import {
DocumentIcon,
DropboxIcon,
DsPyIcon,
DubIcon,
DuckDuckGoIcon,
DynamoDBIcon,
ElasticsearchIcon,
@@ -57,6 +59,7 @@ import {
GoogleGroupsIcon,
GoogleIcon,
GoogleMapsIcon,
GoogleMeetIcon,
GooglePagespeedIcon,
GoogleSheetsIcon,
GoogleSlidesIcon,
@@ -177,6 +180,7 @@ export const blockTypeToIconMap: Record<string, IconComponent> = {
asana: AsanaIcon,
ashby: AshbyIcon,
attio: AttioIcon,
brandfetch: BrandfetchIcon,
browser_use: BrowserUseIcon,
calcom: CalComIcon,
calendly: CalendlyIcon,
@@ -192,6 +196,7 @@ export const blockTypeToIconMap: Record<string, IconComponent> = {
discord: DiscordIcon,
dropbox: DropboxIcon,
dspy: DsPyIcon,
dub: DubIcon,
duckduckgo: DuckDuckGoIcon,
dynamodb: DynamoDBIcon,
elasticsearch: ElasticsearchIcon,
@@ -215,6 +220,7 @@ export const blockTypeToIconMap: Record<string, IconComponent> = {
google_forms: GoogleFormsIcon,
google_groups: GoogleGroupsIcon,
google_maps: GoogleMapsIcon,
google_meet: GoogleMeetIcon,
google_pagespeed: GooglePagespeedIcon,
google_search: GoogleIcon,
google_sheets_v2: GoogleSheetsIcon,
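Lookups in `blockTypeToIconMap` are plain key accesses on a `Record<string, IconComponent>`, so consumers typically guard against unregistered block types. A minimal sketch of that pattern — the `fallback` argument here is hypothetical, not part of the map module itself:

```typescript
// Resolve an icon for a block type, falling back for unregistered types.
// Mirrors how a Record<string, T> map like blockTypeToIconMap is consumed;
// the fallback argument is a hypothetical default, not part of the module.
function resolveIcon<T>(map: Record<string, T>, blockType: string, fallback: T): T {
  return map[blockType] ?? fallback
}
```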


@@ -26,12 +26,63 @@ In Sim, the Airtable integration enables your agents to interact with your Airta
## Usage Instructions
Integrates Airtable into the workflow. Can create, get, list, or update Airtable records. Can be used in trigger mode to trigger a workflow when an update is made to an Airtable table.
Integrates Airtable into the workflow. Can list bases, list tables (with schema), and create, get, list, or update records. Can also be used in trigger mode to run a workflow when an Airtable table is updated.
## Tools
### `airtable_list_bases`
List all Airtable bases the user has access to
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `offset` | string | No | Pagination offset for retrieving additional bases |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `bases` | array | List of Airtable bases |
| ↳ `id` | string | Base ID \(starts with "app"\) |
| ↳ `name` | string | Base name |
| ↳ `permissionLevel` | string | Permission level \(none, read, comment, edit, create\) |
| `metadata` | json | Pagination and count metadata |
| ↳ `offset` | string | Offset for next page of results |
| ↳ `totalBases` | number | Number of bases returned |
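The `offset` parameter drives pagination: keep passing the returned `metadata.offset` back in until the response omits it. A minimal sketch of that loop, where `fetchPage` is a hypothetical stand-in for invoking the `airtable_list_bases` tool:

```typescript
// Shape of a single airtable_list_bases response, per the tables above.
interface ListBasesPage {
  bases: { id: string; name: string; permissionLevel: string }[]
  metadata: { offset?: string; totalBases: number }
}

// Collect every base by following metadata.offset until it disappears.
// `fetchPage` is a hypothetical stand-in for the airtable_list_bases call.
async function listAllBases(
  fetchPage: (offset?: string) => Promise<ListBasesPage>
): Promise<{ id: string; name: string }[]> {
  const all: { id: string; name: string }[] = []
  let offset: string | undefined
  do {
    const page = await fetchPage(offset)
    all.push(...page.bases.map(({ id, name }) => ({ id, name })))
    offset = page.metadata.offset
  } while (offset)
  return all
}
```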
### `airtable_list_tables`
List all tables and their schema in an Airtable base
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `baseId` | string | Yes | Airtable base ID \(starts with "app", e.g., "appXXXXXXXXXXXXXX"\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `tables` | array | List of tables in the base with their schema |
| ↳ `id` | string | Table ID \(starts with "tbl"\) |
| ↳ `name` | string | Table name |
| ↳ `description` | string | Table description |
| ↳ `primaryFieldId` | string | ID of the primary field |
| ↳ `fields` | array | List of fields in the table |
| ↳ `id` | string | Field ID \(starts with "fld"\) |
| ↳ `name` | string | Field name |
| ↳ `type` | string | Field type \(singleLineText, multilineText, number, checkbox, singleSelect, multipleSelects, date, dateTime, attachment, linkedRecord, etc.\) |
| ↳ `description` | string | Field description |
| ↳ `options` | json | Field-specific options \(choices, etc.\) |
| `metadata` | json | Base info and count metadata |
| ↳ `baseId` | string | The base ID queried |
| ↳ `totalTables` | number | Number of tables in the base |
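A common use of the schema returned here is validating record payloads before a create or update call. A minimal sketch, assuming only the documented output shape (the `fieldTypesByTable` helper is hypothetical):

```typescript
// Shape of the airtable_list_tables output, trimmed to the fields used here.
interface TableSchema {
  id: string
  name: string
  fields: { id: string; name: string; type: string }[]
}

// Build a lookup from table name to a field-name -> field-type map,
// e.g. to check record fields before calling airtable_create_records.
function fieldTypesByTable(tables: TableSchema[]): Record<string, Record<string, string>> {
  const out: Record<string, Record<string, string>> = {}
  for (const table of tables) {
    out[table.name] = Object.fromEntries(table.fields.map((f) => [f.name, f.type]))
  }
  return out
}
```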
### `airtable_list_records`
Read records from an Airtable table
@@ -49,8 +100,13 @@ Read records from an Airtable table
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `records` | json | Array of retrieved Airtable records |
| `records` | array | Array of retrieved Airtable records |
| ↳ `id` | string | Record ID |
| ↳ `createdTime` | string | Record creation timestamp |
| ↳ `fields` | json | Record field values |
| `metadata` | json | Operation metadata including pagination offset and total records count |
| ↳ `offset` | string | Pagination offset for next page |
| ↳ `totalRecords` | number | Number of records returned |
### `airtable_get_record`
@@ -68,8 +124,12 @@ Retrieve a single record from an Airtable table by its ID
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `record` | json | Retrieved Airtable record |
| ↳ `id` | string | Record ID |
| ↳ `createdTime` | string | Record creation timestamp |
| ↳ `fields` | json | Record field values |
| `metadata` | json | Operation metadata |
| ↳ `recordCount` | number | Number of records returned \(always 1\) |
### `airtable_create_records`
@@ -88,8 +148,12 @@ Write new records to an Airtable table
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `records` | array | Array of created Airtable records |
| ↳ `id` | string | Record ID |
| ↳ `createdTime` | string | Record creation timestamp |
| ↳ `fields` | json | Record field values |
| `metadata` | json | Operation metadata |
| ↳ `recordCount` | number | Number of records created |
### `airtable_update_record`
@@ -108,8 +172,13 @@ Update an existing record in an Airtable table by ID
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `record` | json | Updated Airtable record |
| ↳ `id` | string | Record ID |
| ↳ `createdTime` | string | Record creation timestamp |
| ↳ `fields` | json | Record field values |
| `metadata` | json | Operation metadata |
| ↳ `recordCount` | number | Number of records updated \(always 1\) |
| ↳ `updatedFields` | array | List of field names that were updated |
### `airtable_update_multiple_records`
@@ -127,7 +196,12 @@ Update multiple existing records in an Airtable table
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `records` | array | Array of updated Airtable records |
| ↳ `id` | string | Record ID |
| ↳ `createdTime` | string | Record creation timestamp |
| ↳ `fields` | json | Record field values |
| `metadata` | json | Operation metadata |
| ↳ `recordCount` | number | Number of records updated |
| ↳ `updatedRecordIds` | array | List of updated record IDs |
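Airtable's batch endpoints cap each request at 10 records, so larger updates need chunking before repeated calls to `airtable_update_multiple_records`. A small helper sketch:

```typescript
// Split a record array into batches of at most `size` (Airtable's
// batch limit is 10 records per request).
function chunk<T>(items: T[], size = 10): T[][] {
  const out: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    out.push(items.slice(i, i + size));
  }
  return out;
}
```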


@@ -0,0 +1,83 @@
---
title: Brandfetch
description: Look up brand assets, logos, colors, and company info
---
import { BlockInfoCard } from "@/components/ui/block-info-card"
<BlockInfoCard
type="brandfetch"
color="#000000"
/>
## Usage Instructions
Integrate Brandfetch into your workflow. Retrieve brand logos, colors, fonts, and company data by domain, ticker, or name search.
## Tools
### `brandfetch_get_brand`
Retrieve brand assets including logos, colors, fonts, and company info by domain, ticker, ISIN, or crypto symbol
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Brandfetch API key |
| `identifier` | string | Yes | Brand identifier: domain \(nike.com\), stock ticker \(NKE\), ISIN \(US6541061031\), or crypto symbol \(BTC\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `id` | string | Unique brand identifier |
| `name` | string | Brand name |
| `domain` | string | Brand domain |
| `claimed` | boolean | Whether the brand profile is claimed |
| `description` | string | Short brand description |
| `longDescription` | string | Detailed brand description |
| `links` | array | Social media and website links |
| ↳ `name` | string | Link name \(e.g., twitter, linkedin\) |
| ↳ `url` | string | Link URL |
| `logos` | array | Brand logos with formats and themes |
| ↳ `type` | string | Logo type \(logo, icon, symbol, other\) |
| ↳ `theme` | string | Logo theme \(light, dark\) |
| ↳ `formats` | array | Available formats with src URL, format, width, and height |
| `colors` | array | Brand colors with hex values and types |
| ↳ `hex` | string | Hex color code |
| ↳ `type` | string | Color type \(accent, dark, light, brand\) |
| ↳ `brightness` | number | Brightness value |
| `fonts` | array | Brand fonts with names and types |
| ↳ `name` | string | Font name |
| ↳ `type` | string | Font type \(title, body\) |
| ↳ `origin` | string | Font origin \(google, custom, system\) |
| `company` | json | Company firmographic data including employees, location, and industries |
| `qualityScore` | number | Data quality score from 0 to 1 |
| `isNsfw` | boolean | Whether the brand contains adult content |
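Consumers of `brandfetch_get_brand` usually want one image, not the whole `logos` array. A hedged sketch of picking a source URL by theme, assuming the logo shape shown above:

```typescript
// Assumed shape from the brandfetch_get_brand output table.
interface BrandLogo {
  type: string;  // logo, icon, symbol, other
  theme: string; // light, dark
  formats: { src: string; format: string }[];
}

// Prefer an icon matching the requested theme, then any logo for that
// theme, then whatever is first.
function pickLogoSrc(logos: BrandLogo[], theme: "light" | "dark"): string | undefined {
  const best =
    logos.find((l) => l.type === "icon" && l.theme === theme) ??
    logos.find((l) => l.theme === theme) ??
    logos[0];
  return best?.formats[0]?.src;
}
```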
### `brandfetch_search`
Search for brands by name and find their domains and logos
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Brandfetch API key |
| `name` | string | Yes | Company or brand name to search for |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `results` | array | List of matching brands |
| ↳ `brandId` | string | Unique brand identifier |
| ↳ `name` | string | Brand name |
| ↳ `domain` | string | Brand domain |
| ↳ `claimed` | boolean | Whether the brand profile is claimed |
| ↳ `icon` | string | Brand icon URL |


@@ -0,0 +1,318 @@
---
title: Dub
description: Link management with Dub
---
import { BlockInfoCard } from "@/components/ui/block-info-card"
<BlockInfoCard
type="dub"
color="#181C1E"
/>
{/* MANUAL-CONTENT-START:intro */}
[Dub](https://dub.co/) is an open-source link management platform for modern marketing teams. It provides powerful short link creation, analytics, and tracking capabilities with enterprise-grade infrastructure.
With the Dub integration in Sim, you can:
- **Create short links**: Generate branded short links with custom domains, slugs, and UTM parameters
- **Upsert links**: Create or update links idempotently by destination URL
- **Retrieve link info**: Look up link details by ID, external ID, or domain + key combination
- **Update links**: Modify destination URLs, metadata, UTM parameters, and link settings
- **Delete links**: Remove short links by ID or external ID
- **List links**: Search and filter links with pagination, sorting, and tag filtering
- **Get analytics**: Retrieve click, lead, and sales analytics with grouping by time, geography, device, browser, referer, and more
In Sim, the Dub integration enables your agents to manage short links and track their performance programmatically. Use it to create trackable links as part of marketing workflows, monitor link engagement, and make data-driven decisions based on click analytics and conversion metrics.
{/* MANUAL-CONTENT-END */}
## Usage Instructions
Create, manage, and track short links with Dub. Supports custom domains, UTM parameters, link analytics, and more.
## Tools
### `dub_create_link`
Create a new short link with Dub. Supports custom domains, slugs, UTM parameters, and more.
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Dub API key |
| `url` | string | Yes | The destination URL of the short link |
| `domain` | string | No | Custom domain for the short link \(defaults to dub.sh\) |
| `key` | string | No | Custom slug for the short link \(randomly generated if not provided\) |
| `externalId` | string | No | External ID for the link in your database |
| `tagIds` | string | No | Comma-separated tag IDs to assign to the link |
| `comments` | string | No | Comments for the short link |
| `expiresAt` | string | No | Expiration date in ISO 8601 format |
| `password` | string | No | Password to protect the short link |
| `rewrite` | boolean | No | Whether to enable link cloaking |
| `archived` | boolean | No | Whether to archive the link |
| `title` | string | No | Custom OG title for the link preview |
| `description` | string | No | Custom OG description for the link preview |
| `utm_source` | string | No | UTM source parameter |
| `utm_medium` | string | No | UTM medium parameter |
| `utm_campaign` | string | No | UTM campaign parameter |
| `utm_term` | string | No | UTM term parameter |
| `utm_content` | string | No | UTM content parameter |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `id` | string | Unique ID of the created link |
| `domain` | string | Domain of the short link |
| `key` | string | Slug of the short link |
| `url` | string | Destination URL |
| `shortLink` | string | Full short link URL |
| `qrCode` | string | QR code URL for the short link |
| `archived` | boolean | Whether the link is archived |
| `externalId` | string | External ID |
| `title` | string | OG title |
| `description` | string | OG description |
| `tags` | json | Tags assigned to the link \(id, name, color\) |
| `clicks` | number | Number of clicks |
| `leads` | number | Number of leads |
| `sales` | number | Number of sales |
| `saleAmount` | number | Total sale amount in cents |
| `lastClicked` | string | Last clicked timestamp |
| `createdAt` | string | Creation timestamp |
| `updatedAt` | string | Last update timestamp |
| `utm_source` | string | UTM source parameter |
| `utm_medium` | string | UTM medium parameter |
| `utm_campaign` | string | UTM campaign parameter |
| `utm_term` | string | UTM term parameter |
| `utm_content` | string | UTM content parameter |
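The flat `utm_*` inputs above map directly onto the request body for `dub_create_link`. A minimal payload-assembly sketch (the parameter names mirror the input table; dropping empty values is an assumption about how optional fields should be handled):

```typescript
// Input names mirror the dub_create_link input table above.
interface CreateLinkInput {
  url: string;
  domain?: string;
  key?: string;
  utm_source?: string;
  utm_medium?: string;
  utm_campaign?: string;
}

// Build the request body, keeping only non-empty string fields.
function buildCreateLinkBody(input: CreateLinkInput): Record<string, string> {
  const body: Record<string, string> = { url: input.url };
  for (const [k, v] of Object.entries(input)) {
    if (k !== "url" && typeof v === "string" && v.length > 0) body[k] = v;
  }
  return body;
}
```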
### `dub_upsert_link`
Create or update a short link by its URL. If a link with the same URL already exists, update it. Otherwise, create a new link.
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Dub API key |
| `url` | string | Yes | The destination URL of the short link |
| `domain` | string | No | Custom domain for the short link \(defaults to dub.sh\) |
| `key` | string | No | Custom slug for the short link \(randomly generated if not provided\) |
| `externalId` | string | No | External ID for the link in your database |
| `tagIds` | string | No | Comma-separated tag IDs to assign to the link |
| `comments` | string | No | Comments for the short link |
| `expiresAt` | string | No | Expiration date in ISO 8601 format |
| `password` | string | No | Password to protect the short link |
| `rewrite` | boolean | No | Whether to enable link cloaking |
| `archived` | boolean | No | Whether to archive the link |
| `title` | string | No | Custom OG title for the link preview |
| `description` | string | No | Custom OG description for the link preview |
| `utm_source` | string | No | UTM source parameter |
| `utm_medium` | string | No | UTM medium parameter |
| `utm_campaign` | string | No | UTM campaign parameter |
| `utm_term` | string | No | UTM term parameter |
| `utm_content` | string | No | UTM content parameter |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `id` | string | Unique ID of the link |
| `domain` | string | Domain of the short link |
| `key` | string | Slug of the short link |
| `url` | string | Destination URL |
| `shortLink` | string | Full short link URL |
| `qrCode` | string | QR code URL for the short link |
| `archived` | boolean | Whether the link is archived |
| `externalId` | string | External ID |
| `title` | string | OG title |
| `description` | string | OG description |
| `tags` | json | Tags assigned to the link \(id, name, color\) |
| `clicks` | number | Number of clicks |
| `leads` | number | Number of leads |
| `sales` | number | Number of sales |
| `saleAmount` | number | Total sale amount in cents |
| `lastClicked` | string | Last clicked timestamp |
| `createdAt` | string | Creation timestamp |
| `updatedAt` | string | Last update timestamp |
| `utm_source` | string | UTM source parameter |
| `utm_medium` | string | UTM medium parameter |
| `utm_campaign` | string | UTM campaign parameter |
| `utm_term` | string | UTM term parameter |
| `utm_content` | string | UTM content parameter |
### `dub_get_link`
Retrieve information about a short link by its link ID, external ID, or domain + key combination.
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Dub API key |
| `linkId` | string | No | The unique ID of the short link |
| `externalId` | string | No | The external ID of the link in your database |
| `domain` | string | No | The domain of the link \(use with key\) |
| `key` | string | No | The slug of the link \(use with domain\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `id` | string | Unique ID of the link |
| `domain` | string | Domain of the short link |
| `key` | string | Slug of the short link |
| `url` | string | Destination URL |
| `shortLink` | string | Full short link URL |
| `qrCode` | string | QR code URL for the short link |
| `archived` | boolean | Whether the link is archived |
| `externalId` | string | External ID |
| `title` | string | OG title |
| `description` | string | OG description |
| `tags` | json | Tags assigned to the link \(id, name, color\) |
| `clicks` | number | Number of clicks |
| `leads` | number | Number of leads |
| `sales` | number | Number of sales |
| `saleAmount` | number | Total sale amount in cents |
| `lastClicked` | string | Last clicked timestamp |
| `createdAt` | string | Creation timestamp |
| `updatedAt` | string | Last update timestamp |
| `utm_source` | string | UTM source parameter |
| `utm_medium` | string | UTM medium parameter |
| `utm_campaign` | string | UTM campaign parameter |
| `utm_term` | string | UTM term parameter |
| `utm_content` | string | UTM content parameter |
### `dub_update_link`
Update an existing short link. You can modify the destination URL, slug, metadata, UTM parameters, and more.
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Dub API key |
| `linkId` | string | Yes | The link ID or external ID prefixed with ext_ |
| `url` | string | No | New destination URL |
| `domain` | string | No | New custom domain |
| `key` | string | No | New custom slug |
| `title` | string | No | Custom OG title |
| `description` | string | No | Custom OG description |
| `externalId` | string | No | External ID for the link |
| `tagIds` | string | No | Comma-separated tag IDs |
| `comments` | string | No | Comments for the short link |
| `expiresAt` | string | No | Expiration date in ISO 8601 format |
| `password` | string | No | Password to protect the link |
| `rewrite` | boolean | No | Whether to enable link cloaking |
| `archived` | boolean | No | Whether to archive the link |
| `utm_source` | string | No | UTM source parameter |
| `utm_medium` | string | No | UTM medium parameter |
| `utm_campaign` | string | No | UTM campaign parameter |
| `utm_term` | string | No | UTM term parameter |
| `utm_content` | string | No | UTM content parameter |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `id` | string | Unique ID of the updated link |
| `domain` | string | Domain of the short link |
| `key` | string | Slug of the short link |
| `url` | string | Destination URL |
| `shortLink` | string | Full short link URL |
| `qrCode` | string | QR code URL for the short link |
| `archived` | boolean | Whether the link is archived |
| `externalId` | string | External ID |
| `title` | string | OG title |
| `description` | string | OG description |
| `tags` | json | Tags assigned to the link \(id, name, color\) |
| `clicks` | number | Number of clicks |
| `leads` | number | Number of leads |
| `sales` | number | Number of sales |
| `saleAmount` | number | Total sale amount in cents |
| `lastClicked` | string | Last clicked timestamp |
| `createdAt` | string | Creation timestamp |
| `updatedAt` | string | Last update timestamp |
| `utm_source` | string | UTM source parameter |
| `utm_medium` | string | UTM medium parameter |
| `utm_campaign` | string | UTM campaign parameter |
| `utm_term` | string | UTM term parameter |
| `utm_content` | string | UTM content parameter |
### `dub_delete_link`
Delete a short link by its link ID or external ID (prefixed with ext_).
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Dub API key |
| `linkId` | string | Yes | The link ID or external ID prefixed with ext_ |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `id` | string | ID of the deleted link |
### `dub_list_links`
Retrieve a paginated list of short links for the authenticated workspace. Supports filtering by domain, search query, tags, and sorting.
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Dub API key |
| `domain` | string | No | Filter by domain |
| `search` | string | No | Search query matched against the short link slug and destination URL |
| `tagIds` | string | No | Comma-separated tag IDs to filter by |
| `showArchived` | boolean | No | Whether to include archived links \(defaults to false\) |
| `sortBy` | string | No | Sort by field: createdAt, clicks, saleAmount, or lastClicked |
| `sortOrder` | string | No | Sort order: asc or desc |
| `page` | number | No | Page number \(default: 1\) |
| `pageSize` | number | No | Number of links per page \(default: 100, max: 100\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `links` | json | Array of link objects \(id, domain, key, url, shortLink, clicks, tags, createdAt\) |
| `count` | number | Number of links returned |
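The list filters above serialize to a plain query string. A hedged sketch that also clamps `pageSize` to the documented maximum of 100:

```typescript
// Serialize dub_list_links filters; defaults (page 1, pageSize 100)
// are taken from the input table above.
function buildListLinksQuery(opts: {
  domain?: string;
  search?: string;
  page?: number;
  pageSize?: number;
}): string {
  const params = new URLSearchParams();
  if (opts.domain) params.set("domain", opts.domain);
  if (opts.search) params.set("search", opts.search);
  params.set("page", String(opts.page ?? 1));
  params.set("pageSize", String(Math.min(opts.pageSize ?? 100, 100))); // max 100
  return params.toString();
}
```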
### `dub_get_analytics`
Retrieve analytics for links including clicks, leads, and sales. Supports filtering by link, time range, and grouping by various dimensions.
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Dub API key |
| `event` | string | No | Event type: clicks \(default\), leads, sales, or composite |
| `groupBy` | string | No | Group results by: count \(default\), timeseries, countries, cities, devices, browsers, os, referers, top_links, top_urls |
| `linkId` | string | No | Filter by link ID |
| `externalId` | string | No | Filter by external ID \(prefix with ext_\) |
| `domain` | string | No | Filter by domain |
| `interval` | string | No | Time interval: 24h \(default\), 7d, 30d, 90d, 1y, mtd, qtd, ytd, or all |
| `start` | string | No | Start date/time in ISO 8601 format \(overrides interval\) |
| `end` | string | No | End date/time in ISO 8601 format \(defaults to now\) |
| `country` | string | No | Filter by country \(ISO 3166-1 alpha-2 code\) |
| `timezone` | string | No | IANA timezone for timeseries data \(defaults to UTC\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `clicks` | number | Total number of clicks |
| `leads` | number | Total number of leads |
| `sales` | number | Total number of sales |
| `saleAmount` | number | Total sale amount in cents |
| `data` | json | Grouped analytics data \(timeseries, countries, devices, etc.\) |


@@ -0,0 +1,156 @@
---
title: Google Meet
description: Create and manage Google Meet meetings
---
import { BlockInfoCard } from "@/components/ui/block-info-card"
<BlockInfoCard
type="google_meet"
color="#E0E0E0"
/>
{/* MANUAL-CONTENT-START:intro */}
[Google Meet](https://meet.google.com) is Google's video conferencing and online meeting platform, providing secure, high-quality video calls for individuals and teams. As a core component of Google Workspace, Google Meet enables real-time collaboration through video meetings, screen sharing, and integrated chat.
The Google Meet REST API (v2) allows programmatic management of meeting spaces and conference records, enabling automated workflows to create meetings, track participation, and manage active conferences without manual intervention.
Key features of the Google Meet API include:
- **Meeting Space Management**: Create, retrieve, and configure meeting spaces with customizable access controls.
- **Conference Records**: Access historical conference data including start/end times and associated spaces.
- **Participant Tracking**: View participant details for any conference including join/leave times and user types.
- **Access Controls**: Configure who can join meetings (open, trusted, or restricted) and which entry points are allowed.
- **Active Conference Management**: Programmatically end active conferences in meeting spaces.
In Sim, the Google Meet integration allows your agents to create meeting spaces on demand, monitor conference activity, track participation across meetings, and manage active conferences as part of automated workflows. This enables scenarios such as automatically provisioning meeting rooms for scheduled events, generating attendance reports, ending stale conferences, and building meeting analytics dashboards.
{/* MANUAL-CONTENT-END */}
## Usage Instructions
Integrate Google Meet into your workflow. Create meeting spaces, get space details, end conferences, list conference records, and view participants.
## Tools
### `google_meet_create_space`
Create a new Google Meet meeting space
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `accessType` | string | No | Who can join the meeting without knocking: OPEN \(anyone with link\), TRUSTED \(org members\), RESTRICTED \(only invited\) |
| `entryPointAccess` | string | No | Entry points allowed: ALL \(all entry points\) or CREATOR_APP_ONLY \(only via app\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `name` | string | Resource name of the space \(e.g., spaces/abc123\) |
| `meetingUri` | string | Meeting URL \(e.g., https://meet.google.com/abc-defg-hij\) |
| `meetingCode` | string | Meeting code \(e.g., abc-defg-hij\) |
| `accessType` | string | Access type configuration |
| `entryPointAccess` | string | Entry point access configuration |
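Per the examples above, `meetingCode` is just the last path segment of `meetingUri`, so it can be recovered when only the URL is stored. A small helper sketch:

```typescript
// Extract the meeting code ("abc-defg-hij") from a Meet URL like
// "https://meet.google.com/abc-defg-hij".
function meetingCodeFromUri(meetingUri: string): string {
  const path = new URL(meetingUri).pathname; // e.g. "/abc-defg-hij"
  return path.replace(/^\//, "");
}
```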
### `google_meet_get_space`
Get details of a Google Meet meeting space by name or meeting code
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `spaceName` | string | Yes | Space resource name \(spaces/abc123\) or meeting code \(abc-defg-hij\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `name` | string | Resource name of the space |
| `meetingUri` | string | Meeting URL |
| `meetingCode` | string | Meeting code |
| `accessType` | string | Access type configuration |
| `entryPointAccess` | string | Entry point access configuration |
| `activeConference` | string | Active conference record name |
### `google_meet_end_conference`
End the active conference in a Google Meet space
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `spaceName` | string | Yes | Space resource name \(e.g., spaces/abc123\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `ended` | boolean | Whether the conference was ended successfully |
### `google_meet_list_conference_records`
List conference records for meetings you organized
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `filter` | string | No | Filter by space name \(e.g., space.name = "spaces/abc123"\) or time range \(e.g., start_time > "2024-01-01T00:00:00Z"\) |
| `pageSize` | number | No | Maximum number of conference records to return \(max 100\) |
| `pageToken` | string | No | Page token from a previous list request |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `conferenceRecords` | json | List of conference records with name, start/end times, and space |
| `nextPageToken` | string | Token for next page of results |
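The `filter` input is a plain string in the two forms shown above. A sketch of composing it; combining clauses with `AND` is an assumption here, not something the table states:

```typescript
// Compose a conference-records filter string from the documented forms.
// NOTE: joining multiple clauses with " AND " is an assumption.
function conferenceFilter(opts: { spaceName?: string; startAfter?: string }): string {
  const parts: string[] = [];
  if (opts.spaceName) parts.push(`space.name = "${opts.spaceName}"`);
  if (opts.startAfter) parts.push(`start_time > "${opts.startAfter}"`);
  return parts.join(" AND ");
}
```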
### `google_meet_get_conference_record`
Get details of a specific conference record
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `conferenceName` | string | Yes | Conference record resource name \(e.g., conferenceRecords/abc123\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `name` | string | Conference record resource name |
| `startTime` | string | Conference start time |
| `endTime` | string | Conference end time |
| `expireTime` | string | Conference record expiration time |
| `space` | string | Associated space resource name |
### `google_meet_list_participants`
List participants of a conference record
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `conferenceName` | string | Yes | Conference record resource name \(e.g., conferenceRecords/abc123\) |
| `filter` | string | No | Filter participants \(e.g., earliest_start_time > "2024-01-01T00:00:00Z"\) |
| `pageSize` | number | No | Maximum number of participants to return \(default 100, max 250\) |
| `pageToken` | string | No | Page token from a previous list request |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `participants` | json | List of participants with name, times, display name, and user type |
| `nextPageToken` | string | Token for next page of results |
| `totalSize` | number | Total number of participants |


@@ -13,6 +13,7 @@
"asana",
"ashby",
"attio",
"brandfetch",
"browser_use",
"calcom",
"calendly",
@@ -28,6 +29,7 @@
"discord",
"dropbox",
"dspy",
"dub",
"duckduckgo",
"dynamodb",
"elasticsearch",
@@ -51,6 +53,7 @@
"google_forms",
"google_groups",
"google_maps",
"google_meet",
"google_pagespeed",
"google_search",
"google_sheets",


@@ -24,7 +24,7 @@ These operations let your agents access and analyze Reddit content as part of yo
## Usage Instructions
Integrate Reddit into workflows. Read posts, comments, and search content. Submit posts, vote, reply, edit, manage messages, and access user and subreddit info.
@@ -39,14 +39,15 @@ Fetch posts from a subreddit with different sorting options
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `subreddit` | string | Yes | The subreddit to fetch posts from \(e.g., "technology", "news"\) |
| `sort` | string | No | Sort method for posts \(e.g., "hot", "new", "top", "rising", "controversial"\). Default: "hot" |
| `limit` | number | No | Maximum number of posts to return \(e.g., 25\). Default: 10, max: 100 |
| `time` | string | No | Time filter for "top" sorted posts: "day", "week", "month", "year", or "all" \(default: "all"\) |
| `after` | string | No | Fullname of a thing to fetch items after \(for pagination\) |
| `before` | string | No | Fullname of a thing to fetch items before \(for pagination\) |
| `count` | number | No | A count of items already seen in the listing \(used for numbering\) |
| `show` | string | No | Show items that would normally be filtered \(e.g., "all"\) |
| `sr_detail` | boolean | No | Expand subreddit details in the response |
| `g` | string | No | Geo filter for posts \(e.g., "GLOBAL", "US", "AR", etc.\) |
#### Output
@@ -55,6 +56,7 @@ Fetch posts from a subreddit with different sorting options
| `subreddit` | string | Name of the subreddit where posts were fetched from |
| `posts` | array | Array of posts with title, author, URL, score, comments count, and metadata |
| ↳ `id` | string | Post ID |
| ↳ `name` | string | Thing fullname \(t3_xxxxx\) |
| ↳ `title` | string | Post title |
| ↳ `author` | string | Author username |
| ↳ `url` | string | Post URL |
@@ -66,6 +68,8 @@ Fetch posts from a subreddit with different sorting options
| ↳ `selftext` | string | Text content for self posts |
| ↳ `thumbnail` | string | Thumbnail URL |
| ↳ `subreddit` | string | Subreddit name |
| `after` | string | Fullname of the last item for forward pagination |
| `before` | string | Fullname of the first item for backward pagination |
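The new `after` output is a thing fullname (`t3_xxxxx`); passing it back as the `after` input fetches the next page. A hedged sketch, where `fetchPosts` stands in for `reddit_get_posts`:

```typescript
// Assumed page shape from the reddit_get_posts output table above.
interface PostPage { posts: { name: string }[]; after: string | null }

// Walk forward through a listing, up to maxPages pages.
async function collectPosts(
  fetchPosts: (after?: string) => Promise<PostPage>,
  maxPages = 3
): Promise<string[]> {
  const names: string[] = [];
  let after: string | undefined;
  for (let i = 0; i < maxPages; i++) {
    const page = await fetchPosts(after);
    names.push(...page.posts.map((p) => p.name));
    if (!page.after) break; // null/empty after means the listing is exhausted
    after = page.after;
  }
  return names;
}
```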
### `reddit_get_comments`
@@ -83,12 +87,9 @@ Fetch comments from a specific Reddit post
| `context` | number | No | Number of parent comments to include |
| `showedits` | boolean | No | Show edit information for comments |
| `showmore` | boolean | No | Include "load more comments" elements in the response |
| `showtitle` | boolean | No | Include submission title in the response |
| `threaded` | boolean | No | Return comments in threaded/nested format |
| `truncate` | number | No | Integer to truncate comment depth |
| `after` | string | No | Fullname of a thing to fetch items after \(for pagination\) |
| `before` | string | No | Fullname of a thing to fetch items before \(for pagination\) |
| `count` | number | No | A count of items already seen in the listing \(used for numbering\) |
| `comment` | string | No | ID36 of a comment to focus on \(returns that comment thread\) |
#### Output
@@ -96,6 +97,7 @@ Fetch comments from a specific Reddit post
| --------- | ---- | ----------- |
| `post` | object | Post information including ID, title, author, content, and metadata |
| ↳ `id` | string | Post ID |
| ↳ `name` | string | Thing fullname \(t3_xxxxx\) |
| ↳ `title` | string | Post title |
| ↳ `author` | string | Post author |
| ↳ `selftext` | string | Post text content |
@@ -104,6 +106,7 @@ Fetch comments from a specific Reddit post
| ↳ `permalink` | string | Reddit permalink |
| `comments` | array | Nested comments with author, body, score, timestamps, and replies |
| ↳ `id` | string | Comment ID |
| ↳ `name` | string | Thing fullname \(t1_xxxxx\) |
| ↳ `author` | string | Comment author |
| ↳ `body` | string | Comment text |
| ↳ `score` | number | Comment score |
@@ -135,6 +138,7 @@ Fetch controversial posts from a subreddit
| `subreddit` | string | Name of the subreddit where posts were fetched from |
| `posts` | array | Array of controversial posts with title, author, URL, score, comments count, and metadata |
| ↳ `id` | string | Post ID |
| ↳ `name` | string | Thing fullname \(t3_xxxxx\) |
| ↳ `title` | string | Post title |
| ↳ `author` | string | Author username |
| ↳ `url` | string | Post URL |
@@ -146,6 +150,8 @@ Fetch controversial posts from a subreddit
| ↳ `selftext` | string | Text content for self posts |
| ↳ `thumbnail` | string | Thumbnail URL |
| ↳ `subreddit` | string | Subreddit name |
| `after` | string | Fullname of the last item for forward pagination |
| `before` | string | Fullname of the first item for backward pagination |
### `reddit_search`
@@ -165,6 +171,8 @@ Search for posts within a subreddit
| `before` | string | No | Fullname of a thing to fetch items before \(for pagination\) |
| `count` | number | No | A count of items already seen in the listing \(used for numbering\) |
| `show` | string | No | Show items that would normally be filtered \(e.g., "all"\) |
| `type` | string | No | Type of search results: "link" \(posts\), "sr" \(subreddits\), or "user" \(users\). Default: "link" |
| `sr_detail` | boolean | No | Expand subreddit details in the response |
#### Output
@@ -173,6 +181,7 @@ Search for posts within a subreddit
| `subreddit` | string | Name of the subreddit where search was performed |
| `posts` | array | Array of search result posts with title, author, URL, score, comments count, and metadata |
| ↳ `id` | string | Post ID |
| ↳ `name` | string | Thing fullname \(t3_xxxxx\) |
| ↳ `title` | string | Post title |
| ↳ `author` | string | Author username |
| ↳ `url` | string | Post URL |
@@ -184,6 +193,8 @@ Search for posts within a subreddit
| ↳ `selftext` | string | Text content for self posts |
| ↳ `thumbnail` | string | Thumbnail URL |
| ↳ `subreddit` | string | Subreddit name |
| `after` | string | Fullname of the last item for forward pagination |
| `before` | string | Fullname of the first item for backward pagination |
### `reddit_submit_post`
@@ -200,6 +211,9 @@ Submit a new post to a subreddit (text or link)
| `nsfw` | boolean | No | Mark post as NSFW |
| `spoiler` | boolean | No | Mark post as spoiler |
| `send_replies` | boolean | No | Send reply notifications to inbox \(default: true\) |
| `flair_id` | string | No | Flair template UUID for the post \(max 36 characters\) |
| `flair_text` | string | No | Flair text to display on the post \(max 64 characters\) |
| `collection_id` | string | No | Collection UUID to add the post to |
#### Output
@@ -264,6 +278,21 @@ Save a Reddit post or comment to your saved items
| `posts` | json | Posts data |
| `post` | json | Single post data |
| `comments` | json | Comments data |
| `success` | boolean | Operation success status |
| `message` | string | Result message |
| `data` | json | Response data |
| `after` | string | Pagination cursor \(next page\) |
| `before` | string | Pagination cursor \(previous page\) |
| `id` | string | Entity ID |
| `name` | string | Entity fullname |
| `messages` | json | Messages data |
| `display_name` | string | Subreddit display name |
| `subscribers` | number | Subscriber count |
| `description` | string | Description text |
| `link_karma` | number | Link karma |
| `comment_karma` | number | Comment karma |
| `total_karma` | number | Total karma |
| `icon_img` | string | Icon image URL |
### `reddit_reply`
@@ -275,6 +304,7 @@ Add a comment reply to a Reddit post or comment
| --------- | ---- | -------- | ----------- |
| `parent_id` | string | Yes | Thing fullname to reply to \(e.g., "t3_abc123" for post, "t1_def456" for comment\) |
| `text` | string | Yes | Comment text in markdown format \(e.g., "Great post! Here is my **reply**"\) |
| `return_rtjson` | boolean | No | Return response in Rich Text JSON format |
#### Output
@@ -345,4 +375,138 @@ Subscribe or unsubscribe from a subreddit
| `success` | boolean | Whether the subscription action was successful |
| `message` | string | Success or error message |
### `reddit_get_me`
Get information about the authenticated Reddit user
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `id` | string | User ID |
| `name` | string | Username |
| `created_utc` | number | Account creation time in UTC epoch seconds |
| `link_karma` | number | Total link karma |
| `comment_karma` | number | Total comment karma |
| `total_karma` | number | Combined total karma |
| `is_gold` | boolean | Whether user has Reddit Premium |
| `is_mod` | boolean | Whether user is a moderator |
| `has_verified_email` | boolean | Whether email is verified |
| `icon_img` | string | User avatar/icon URL |
### `reddit_get_user`
Get public profile information about any Reddit user by username
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `username` | string | Yes | Reddit username to look up \(e.g., "spez", "example_user"\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `id` | string | User ID |
| `name` | string | Username |
| `created_utc` | number | Account creation time in UTC epoch seconds |
| `link_karma` | number | Total link karma |
| `comment_karma` | number | Total comment karma |
| `total_karma` | number | Combined total karma |
| `is_gold` | boolean | Whether user has Reddit Premium |
| `is_mod` | boolean | Whether user is a moderator |
| `has_verified_email` | boolean | Whether email is verified |
| `icon_img` | string | User avatar/icon URL |
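The fields above mirror Reddit's `/user/{username}/about` endpoint, which wraps its payload in `{ kind: 't2', data: {...} }`. A minimal sketch (the normalizer name and defaults are illustrative, not from the tool source) of mapping that raw response into the documented output shape:

```typescript
// Shape of the documented reddit_get_user output (from the table above).
interface RedditUserInfo {
  id: string
  name: string
  created_utc: number
  link_karma: number
  comment_karma: number
  total_karma: number
  is_gold: boolean
  is_mod: boolean
  has_verified_email: boolean
  icon_img: string
}

// Normalize a raw /user/{username}/about response body into the output shape.
function toUserInfo(raw: { data: Record<string, unknown> }): RedditUserInfo {
  const d = raw.data
  return {
    id: String(d.id ?? ''),
    name: String(d.name ?? ''),
    created_utc: Number(d.created_utc ?? 0),
    link_karma: Number(d.link_karma ?? 0),
    comment_karma: Number(d.comment_karma ?? 0),
    total_karma: Number(d.total_karma ?? 0),
    is_gold: Boolean(d.is_gold),
    is_mod: Boolean(d.is_mod),
    has_verified_email: Boolean(d.has_verified_email),
    icon_img: String(d.icon_img ?? ''),
  }
}
```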
### `reddit_send_message`
Send a private message to a Reddit user
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `to` | string | Yes | Recipient username \(e.g., "example_user"\) or subreddit \(e.g., "/r/subreddit"\) |
| `subject` | string | Yes | Message subject \(max 100 characters\) |
| `text` | string | Yes | Message body in markdown format |
| `from_sr` | string | No | Subreddit name to send the message from \(requires moderator mail permission\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `success` | boolean | Whether the message was sent successfully |
| `message` | string | Success or error message |
### `reddit_get_messages`
Retrieve private messages from your Reddit inbox
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `where` | string | No | Message folder to retrieve: "inbox" \(all\), "unread", "sent", "messages" \(direct messages only\), "comments" \(comment replies\), "selfreply" \(self-post replies\), or "mentions" \(username mentions\). Default: "inbox" |
| `limit` | number | No | Maximum number of messages to return \(e.g., 25\). Default: 25, max: 100 |
| `after` | string | No | Fullname of a thing to fetch items after \(for pagination\) |
| `before` | string | No | Fullname of a thing to fetch items before \(for pagination\) |
| `mark` | boolean | No | Whether to mark fetched messages as read |
| `count` | number | No | A count of items already seen in the listing \(used for numbering\) |
| `show` | string | No | Show items that would normally be filtered \(e.g., "all"\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `messages` | array | Array of messages with sender, recipient, subject, body, and metadata |
| ↳ `id` | string | Message ID |
| ↳ `name` | string | Thing fullname \(t4_xxxxx\) |
| ↳ `author` | string | Sender username |
| ↳ `dest` | string | Recipient username |
| ↳ `subject` | string | Message subject |
| ↳ `body` | string | Message body text |
| ↳ `created_utc` | number | Creation time in UTC epoch seconds |
| ↳ `new` | boolean | Whether the message is unread |
| ↳ `was_comment` | boolean | Whether the message is a comment reply |
| ↳ `context` | string | Context URL for comment replies |
| ↳ `distinguished` | string | Distinction: null/"moderator"/"admin" |
| `after` | string | Fullname of the last item for forward pagination |
| `before` | string | Fullname of the first item for backward pagination |
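The `after`/`before` cursors follow Reddit's standard listing pagination: pass the `after` fullname from one response to fetch the next page, or `before` to page backwards. A sketch (helper name assumed) of threading the cursor into a query string:

```typescript
// Build query parameters for a Reddit listing request, threading the
// pagination cursor from a previous response. Supply only one of
// after/before per request; after takes precedence here.
function buildListingParams(opts: {
  limit?: number
  after?: string
  before?: string
}): string {
  const params = new URLSearchParams()
  params.set('limit', String(opts.limit ?? 25))
  if (opts.after) params.set('after', opts.after)
  else if (opts.before) params.set('before', opts.before)
  return params.toString()
}
```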
### `reddit_get_subreddit_info`
Get metadata and information about a subreddit
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `subreddit` | string | Yes | The subreddit to get info about \(e.g., "technology", "programming", "news"\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `id` | string | Subreddit ID |
| `name` | string | Subreddit fullname \(t5_xxxxx\) |
| `display_name` | string | Subreddit name without prefix |
| `title` | string | Subreddit title |
| `description` | string | Full subreddit description \(markdown\) |
| `public_description` | string | Short public description |
| `subscribers` | number | Number of subscribers |
| `accounts_active` | number | Number of currently active users |
| `created_utc` | number | Creation time in UTC epoch seconds |
| `over18` | boolean | Whether the subreddit is NSFW |
| `lang` | string | Primary language of the subreddit |
| `subreddit_type` | string | Subreddit type: public, private, restricted, etc. |
| `url` | string | Subreddit URL path \(e.g., /r/technology/\) |
| `icon_img` | string | Subreddit icon URL |
| `banner_img` | string | Subreddit banner URL |
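The `subreddit` parameter corresponds to Reddit's `/r/{subreddit}/about` endpoint. As a sketch (URL construction assumed, not taken from the tool source), user input like `r/technology` or `/r/technology` should be normalized before building the request URL:

```typescript
// Construct the about-endpoint URL for a subreddit, stripping any
// user-supplied "r/" or leading-slash prefix first.
function subredditAboutUrl(subreddit: string): string {
  const name = subreddit.trim().replace(/^\/?(r\/)?/, '')
  return `https://oauth.reddit.com/r/${encodeURIComponent(name)}/about`
}
```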

View File

@@ -69,7 +69,9 @@ Read records from a ServiceNow table
| `number` | string | No | Record number \(e.g., INC0010001\) |
| `query` | string | No | Encoded query string \(e.g., "active=true^priority=1"\) |
| `limit` | number | No | Maximum number of records to return \(e.g., 10, 50, 100\) |
| `offset` | number | No | Number of records to skip for pagination \(e.g., 0, 10, 20\) |
| `fields` | string | No | Comma-separated list of fields to return \(e.g., sys_id,number,short_description,state\) |
| `displayValue` | string | No | Return display values for reference fields: "true" \(display only\), "false" \(sys_id only\), or "all" \(both\) |
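ServiceNow encoded queries chain conditions with `^` (logical AND), as in the `"active=true^priority=1"` example above. A small helper (hypothetical, for illustration) that assembles one:

```typescript
// Assemble a ServiceNow encoded query string from field/operator/value
// triples, joined with '^' (logical AND).
function encodedQuery(conds: Array<[field: string, op: string, value: string]>): string {
  return conds.map(([f, op, v]) => `${f}${op}${v}`).join('^')
}
```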
#### Output

View File

@@ -1,6 +1,6 @@
---
title: Slack
description: Send, update, delete messages, send ephemeral messages, add reactions in Slack or trigger workflows from Slack events
description: Send, update, delete messages, manage views and modals, add or remove reactions, manage canvases, get channel info and user presence in Slack
---
import { BlockInfoCard } from "@/components/ui/block-info-card"
@@ -39,7 +39,7 @@ If you encounter issues with the Slack integration, contact us at [help@sim.ai](
## Usage Instructions
Integrate Slack into the workflow. Can send, update, and delete messages, send ephemeral messages visible only to a specific user, create canvases, read messages, and add reactions. Requires Bot Token instead of OAuth in advanced mode. Can be used in trigger mode to trigger a workflow when a message is sent to a channel.
Integrate Slack into the workflow. Can send, update, and delete messages, send ephemeral messages visible only to a specific user, open/update/push modal views, publish Home tab views, create canvases, read messages, and add or remove reactions. Requires Bot Token instead of OAuth in advanced mode. Can be used in trigger mode to trigger a workflow when a message is sent to a channel.
@@ -799,4 +799,313 @@ Add an emoji reaction to a Slack message
| ↳ `timestamp` | string | Message timestamp |
| ↳ `reaction` | string | Emoji reaction name |
### `slack_remove_reaction`
Remove an emoji reaction from a Slack message
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `authMethod` | string | No | Authentication method: oauth or bot_token |
| `botToken` | string | No | Bot token for Custom Bot |
| `channel` | string | Yes | Channel ID where the message was posted \(e.g., C1234567890\) |
| `timestamp` | string | Yes | Timestamp of the message to remove reaction from \(e.g., 1405894322.002768\) |
| `name` | string | Yes | Name of the emoji reaction to remove \(without colons, e.g., thumbsup, heart, eyes\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `content` | string | Success message |
| `metadata` | object | Reaction metadata |
| ↳ `channel` | string | Channel ID |
| ↳ `timestamp` | string | Message timestamp |
| ↳ `reaction` | string | Emoji reaction name |
### `slack_get_channel_info`
Get detailed information about a Slack channel by its ID
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `authMethod` | string | No | Authentication method: oauth or bot_token |
| `botToken` | string | No | Bot token for Custom Bot |
| `channel` | string | Yes | Channel ID to get information about \(e.g., C1234567890\) |
| `includeNumMembers` | boolean | No | Whether to include the member count in the response |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `channelInfo` | object | Detailed channel information |
| ↳ `id` | string | Channel ID \(e.g., C1234567890\) |
| ↳ `name` | string | Channel name without # prefix |
| ↳ `is_channel` | boolean | Whether this is a channel |
| ↳ `is_private` | boolean | Whether channel is private |
| ↳ `is_archived` | boolean | Whether channel is archived |
| ↳ `is_general` | boolean | Whether this is the general channel |
| ↳ `is_member` | boolean | Whether the bot/user is a member |
| ↳ `is_shared` | boolean | Whether channel is shared across workspaces |
| ↳ `is_ext_shared` | boolean | Whether channel is externally shared |
| ↳ `is_org_shared` | boolean | Whether channel is org-wide shared |
| ↳ `num_members` | number | Number of members in the channel |
| ↳ `topic` | string | Channel topic |
| ↳ `purpose` | string | Channel purpose/description |
| ↳ `created` | number | Unix timestamp when channel was created |
| ↳ `creator` | string | User ID of channel creator |
| ↳ `updated` | number | Unix timestamp of last update |
### `slack_get_user_presence`
Check whether a Slack user is currently active or away
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `authMethod` | string | No | Authentication method: oauth or bot_token |
| `botToken` | string | No | Bot token for Custom Bot |
| `userId` | string | Yes | User ID to check presence for \(e.g., U1234567890\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `presence` | string | User presence status: "active" or "away" |
| `online` | boolean | Whether user has an active client connection \(only available when checking own presence\) |
| `autoAway` | boolean | Whether user was automatically set to away due to inactivity \(only available when checking own presence\) |
| `manualAway` | boolean | Whether user manually set themselves as away \(only available when checking own presence\) |
| `connectionCount` | number | Total number of active connections for the user \(only available when checking own presence\) |
| `lastActivity` | number | Unix timestamp of last detected activity \(only available when checking own presence\) |
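The own-presence-only fields correspond to the extra data Slack's `users.getPresence` returns when called for the authenticated user. A sketch (normalizer name assumed) of mapping the raw snake_case response onto the documented camelCase output, leaving the detailed fields undefined when absent:

```typescript
interface PresenceResult {
  presence: string
  online?: boolean
  autoAway?: boolean
  manualAway?: boolean
  connectionCount?: number
  lastActivity?: number
}

// Map a raw users.getPresence response into the documented output.
// The detailed fields only appear when checking your own presence.
function toPresence(raw: Record<string, unknown>): PresenceResult {
  const out: PresenceResult = { presence: String(raw.presence ?? 'away') }
  if (typeof raw.online === 'boolean') out.online = raw.online
  if (typeof raw.auto_away === 'boolean') out.autoAway = raw.auto_away
  if (typeof raw.manual_away === 'boolean') out.manualAway = raw.manual_away
  if (typeof raw.connection_count === 'number') out.connectionCount = raw.connection_count
  if (typeof raw.last_activity === 'number') out.lastActivity = raw.last_activity
  return out
}
```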
### `slack_edit_canvas`
Edit an existing Slack canvas by inserting, replacing, or deleting content
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `authMethod` | string | No | Authentication method: oauth or bot_token |
| `botToken` | string | No | Bot token for Custom Bot |
| `canvasId` | string | Yes | Canvas ID to edit \(e.g., F1234ABCD\) |
| `operation` | string | Yes | Edit operation: insert_at_start, insert_at_end, insert_after, insert_before, replace, delete, or rename |
| `content` | string | No | Markdown content for the operation \(required for insert/replace operations\) |
| `sectionId` | string | No | Section ID to target \(required for insert_after, insert_before, replace, and delete\) |
| `title` | string | No | New title for the canvas \(only used with rename operation\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `content` | string | Success message |
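Slack's `canvases.edit` API takes a `changes` array; each change carries the operation plus a markdown `document_content` body. A sketch (assuming the requirement rules stated in the parameter table above) of building and validating one change object:

```typescript
type CanvasOp =
  | 'insert_at_start'
  | 'insert_at_end'
  | 'insert_after'
  | 'insert_before'
  | 'replace'
  | 'delete'

// Build a single canvases.edit change, enforcing the table's rules:
// content is required for insert/replace operations, and section_id is
// required for operations that target an existing section.
function buildCanvasChange(op: CanvasOp, content?: string, sectionId?: string) {
  const needsSection = ['insert_after', 'insert_before', 'replace', 'delete'].includes(op)
  const needsContent = op !== 'delete'
  if (needsSection && !sectionId) throw new Error(`${op} requires sectionId`)
  if (needsContent && !content) throw new Error(`${op} requires content`)
  return {
    operation: op,
    ...(sectionId ? { section_id: sectionId } : {}),
    ...(content ? { document_content: { type: 'markdown', markdown: content } } : {}),
  }
}
```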
### `slack_create_channel_canvas`
Create a canvas pinned to a Slack channel as its resource hub
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `authMethod` | string | No | Authentication method: oauth or bot_token |
| `botToken` | string | No | Bot token for Custom Bot |
| `channel` | string | Yes | Channel ID to create the canvas in \(e.g., C1234567890\) |
| `title` | string | No | Title for the channel canvas |
| `content` | string | No | Canvas content in markdown format |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `canvas_id` | string | ID of the created channel canvas |
### `slack_open_view`
Open a modal view in Slack using a trigger_id from an interaction payload. Used to display forms, confirmations, and other interactive modals.
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `authMethod` | string | No | Authentication method: oauth or bot_token |
| `botToken` | string | No | Bot token for Custom Bot |
| `triggerId` | string | Yes | Trigger ID obtained from an interaction payload \(e.g., slash command, button click\); must be used shortly after the interaction |
| `interactivityPointer` | string | No | Alternative to trigger_id for posting to user |
| `view` | json | Yes | A view payload object defining the modal. Must include type \("modal"\), title, and blocks array |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `view` | object | The opened modal view object |
| ↳ `id` | string | Unique view identifier |
| ↳ `team_id` | string | Workspace/team ID |
| ↳ `type` | string | View type \(e.g., "modal"\) |
| ↳ `title` | json | Plain text title object with type and text fields |
| ↳ `type` | string | Text object type \(plain_text\) |
| ↳ `text` | string | Title text content |
| ↳ `submit` | json | Plain text submit button object |
| ↳ `type` | string | Text object type \(plain_text\) |
| ↳ `text` | string | Submit button text |
| ↳ `close` | json | Plain text close button object |
| ↳ `type` | string | Text object type \(plain_text\) |
| ↳ `text` | string | Close button text |
| ↳ `blocks` | array | Block Kit blocks in the view |
| ↳ `type` | string | Block type \(section, divider, image, actions, etc.\) |
| ↳ `block_id` | string | Unique block identifier |
| ↳ `private_metadata` | string | Private metadata string passed with the view |
| ↳ `callback_id` | string | Custom identifier for the view |
| ↳ `external_id` | string | Custom external identifier \(max 255 chars, unique per workspace\) |
| ↳ `state` | json | Current state of the view with input values |
| ↳ `hash` | string | View version hash for updates |
| ↳ `clear_on_close` | boolean | Whether to clear all views in the stack when this view is closed |
| ↳ `notify_on_close` | boolean | Whether to send a view_closed event when this view is closed |
| ↳ `root_view_id` | string | ID of the root view in the view stack |
| ↳ `previous_view_id` | string | ID of the previous view in the view stack |
| ↳ `app_id` | string | Application identifier |
| ↳ `bot_id` | string | Bot identifier |
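A minimal `view` payload satisfying the requirements above (type `"modal"`, title, and a blocks array); the block contents are illustrative:

```typescript
// Minimal modal view payload for views.open. Slack caps modal titles at
// 24 characters, so the helper truncates defensively.
function minimalModal(titleText: string) {
  return {
    type: 'modal',
    title: { type: 'plain_text', text: titleText.slice(0, 24) },
    blocks: [
      { type: 'section', text: { type: 'mrkdwn', text: 'Hello from the workflow!' } },
    ],
  }
}
```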
### `slack_update_view`
Update an existing modal view in Slack. Identify the view by view_id or external_id, and provide the updated view payload.
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `authMethod` | string | No | Authentication method: oauth or bot_token |
| `botToken` | string | No | Bot token for Custom Bot |
| `viewId` | string | No | Unique identifier of the view to update. Either viewId or externalId is required |
| `externalId` | string | No | Developer-set unique identifier of the view to update \(max 255 chars\). Either viewId or externalId is required |
| `hash` | string | No | View state hash to protect against race conditions. Obtained from a previous views response |
| `view` | json | Yes | A view payload object defining the updated modal. Must include type \("modal"\), title, and blocks array. Use identical block_id and action_id values to preserve input data |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `view` | object | The updated modal view object |
| ↳ `id` | string | Unique view identifier |
| ↳ `team_id` | string | Workspace/team ID |
| ↳ `type` | string | View type \(e.g., "modal"\) |
| ↳ `title` | json | Plain text title object with type and text fields |
| ↳ `type` | string | Text object type \(plain_text\) |
| ↳ `text` | string | Title text content |
| ↳ `submit` | json | Plain text submit button object |
| ↳ `type` | string | Text object type \(plain_text\) |
| ↳ `text` | string | Submit button text |
| ↳ `close` | json | Plain text close button object |
| ↳ `type` | string | Text object type \(plain_text\) |
| ↳ `text` | string | Close button text |
| ↳ `blocks` | array | Block Kit blocks in the view |
| ↳ `type` | string | Block type \(section, divider, image, actions, etc.\) |
| ↳ `block_id` | string | Unique block identifier |
| ↳ `private_metadata` | string | Private metadata string passed with the view |
| ↳ `callback_id` | string | Custom identifier for the view |
| ↳ `external_id` | string | Custom external identifier \(max 255 chars, unique per workspace\) |
| ↳ `state` | json | Current state of the view with input values |
| ↳ `hash` | string | View version hash for updates |
| ↳ `clear_on_close` | boolean | Whether to clear all views in the stack when this view is closed |
| ↳ `notify_on_close` | boolean | Whether to send a view_closed event when this view is closed |
| ↳ `root_view_id` | string | ID of the root view in the view stack |
| ↳ `previous_view_id` | string | ID of the previous view in the view stack |
| ↳ `app_id` | string | Application identifier |
| ↳ `bot_id` | string | Bot identifier |
### `slack_push_view`
Push a new view onto an existing modal stack in Slack. Limited to 2 additional views after the initial modal is opened.
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `authMethod` | string | No | Authentication method: oauth or bot_token |
| `botToken` | string | No | Bot token for Custom Bot |
| `triggerId` | string | Yes | Trigger ID obtained from an interaction payload \(e.g., button click within an existing modal\); must be used shortly after the interaction |
| `interactivityPointer` | string | No | Alternative to trigger_id for posting to user |
| `view` | json | Yes | A view payload object defining the modal to push. Must include type \("modal"\), title, and blocks array |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `view` | object | The pushed modal view object |
| ↳ `id` | string | Unique view identifier |
| ↳ `team_id` | string | Workspace/team ID |
| ↳ `type` | string | View type \(e.g., "modal"\) |
| ↳ `title` | json | Plain text title object with type and text fields |
| ↳ `type` | string | Text object type \(plain_text\) |
| ↳ `text` | string | Title text content |
| ↳ `submit` | json | Plain text submit button object |
| ↳ `type` | string | Text object type \(plain_text\) |
| ↳ `text` | string | Submit button text |
| ↳ `close` | json | Plain text close button object |
| ↳ `type` | string | Text object type \(plain_text\) |
| ↳ `text` | string | Close button text |
| ↳ `blocks` | array | Block Kit blocks in the view |
| ↳ `type` | string | Block type \(section, divider, image, actions, etc.\) |
| ↳ `block_id` | string | Unique block identifier |
| ↳ `private_metadata` | string | Private metadata string passed with the view |
| ↳ `callback_id` | string | Custom identifier for the view |
| ↳ `external_id` | string | Custom external identifier \(max 255 chars, unique per workspace\) |
| ↳ `state` | json | Current state of the view with input values |
| ↳ `hash` | string | View version hash for updates |
| ↳ `clear_on_close` | boolean | Whether to clear all views in the stack when this view is closed |
| ↳ `notify_on_close` | boolean | Whether to send a view_closed event when this view is closed |
| ↳ `root_view_id` | string | ID of the root view in the view stack |
| ↳ `previous_view_id` | string | ID of the previous view in the view stack |
| ↳ `app_id` | string | Application identifier |
| ↳ `bot_id` | string | Bot identifier |
### `slack_publish_view`
Publish a static Home tab view for a user
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `authMethod` | string | No | Authentication method: oauth or bot_token |
| `botToken` | string | No | Bot token for Custom Bot |
| `userId` | string | Yes | The user ID to publish the Home tab view to \(e.g., U0BPQUNTA\) |
| `hash` | string | No | View state hash to protect against race conditions. Obtained from a previous views response |
| `view` | json | Yes | A view payload object defining the Home tab. Must include type \("home"\) and blocks array |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `view` | object | The published Home tab view object |
| ↳ `id` | string | Unique view identifier |
| ↳ `team_id` | string | Workspace/team ID |
| ↳ `type` | string | View type \("home" for published Home tab views\) |
| ↳ `title` | json | Plain text title object with type and text fields |
| ↳ `type` | string | Text object type \(plain_text\) |
| ↳ `text` | string | Title text content |
| ↳ `submit` | json | Plain text submit button object |
| ↳ `type` | string | Text object type \(plain_text\) |
| ↳ `text` | string | Submit button text |
| ↳ `close` | json | Plain text close button object |
| ↳ `type` | string | Text object type \(plain_text\) |
| ↳ `text` | string | Close button text |
| ↳ `blocks` | array | Block Kit blocks in the view |
| ↳ `type` | string | Block type \(section, divider, image, actions, etc.\) |
| ↳ `block_id` | string | Unique block identifier |
| ↳ `private_metadata` | string | Private metadata string passed with the view |
| ↳ `callback_id` | string | Custom identifier for the view |
| ↳ `external_id` | string | Custom external identifier \(max 255 chars, unique per workspace\) |
| ↳ `state` | json | Current state of the view with input values |
| ↳ `hash` | string | View version hash for updates |
| ↳ `clear_on_close` | boolean | Whether to clear all views in the stack when this view is closed |
| ↳ `notify_on_close` | boolean | Whether to send a view_closed event when this view is closed |
| ↳ `root_view_id` | string | ID of the root view in the view stack |
| ↳ `previous_view_id` | string | ID of the previous view in the view stack |
| ↳ `app_id` | string | Application identifier |
| ↳ `bot_id` | string | Bot identifier |

View File

@@ -1,534 +0,0 @@
'use client'
import { useRef, useState } from 'react'
import { createLogger } from '@sim/logger'
import { X } from 'lucide-react'
import { Textarea } from '@/components/emcn'
import { Input } from '@/components/ui/input'
import { Label } from '@/components/ui/label'
import {
  Select,
  SelectContent,
  SelectItem,
  SelectTrigger,
  SelectValue,
} from '@/components/ui/select'
import { isHosted } from '@/lib/core/config/feature-flags'
import { cn } from '@/lib/core/utils/cn'
import { quickValidateEmail } from '@/lib/messaging/email/validation'
import { soehne } from '@/app/_styles/fonts/soehne/soehne'
import { BrandedButton } from '@/app/(auth)/components/branded-button'
import Footer from '@/app/(landing)/components/footer/footer'
import Nav from '@/app/(landing)/components/nav/nav'
const logger = createLogger('CareersPage')
const validateName = (name: string): string[] => {
  const errors: string[] = []
  if (!name || name.trim().length < 2) {
    errors.push('Name must be at least 2 characters')
  }
  return errors
}
const validateEmail = (email: string): string[] => {
  const errors: string[] = []
  if (!email || !email.trim()) {
    errors.push('Email is required')
    return errors
  }
  const validation = quickValidateEmail(email.trim().toLowerCase())
  if (!validation.isValid) {
    errors.push(validation.reason || 'Please enter a valid email address')
  }
  return errors
}
const validatePosition = (position: string): string[] => {
  const errors: string[] = []
  if (!position || position.trim().length < 2) {
    errors.push('Please specify the position you are interested in')
  }
  return errors
}
const validateLinkedIn = (url: string): string[] => {
  if (!url || url.trim() === '') return []
  const errors: string[] = []
  try {
    new URL(url)
  } catch {
    errors.push('Please enter a valid LinkedIn URL')
  }
  return errors
}
const validatePortfolio = (url: string): string[] => {
  if (!url || url.trim() === '') return []
  const errors: string[] = []
  try {
    new URL(url)
  } catch {
    errors.push('Please enter a valid portfolio URL')
  }
  return errors
}
const validateLocation = (location: string): string[] => {
  const errors: string[] = []
  if (!location || location.trim().length < 2) {
    errors.push('Please enter your location')
  }
  return errors
}
const validateMessage = (message: string): string[] => {
  const errors: string[] = []
  if (!message || message.trim().length < 50) {
    errors.push('Please tell us more about yourself (at least 50 characters)')
  }
  return errors
}
export default function CareersPage() {
  const [isSubmitting, setIsSubmitting] = useState(false)
  const [submitStatus, setSubmitStatus] = useState<'idle' | 'success' | 'error'>('idle')
  const [showErrors, setShowErrors] = useState(false)
  // Form fields
  const [name, setName] = useState('')
  const [email, setEmail] = useState('')
  const [phone, setPhone] = useState('')
  const [position, setPosition] = useState('')
  const [linkedin, setLinkedin] = useState('')
  const [portfolio, setPortfolio] = useState('')
  const [experience, setExperience] = useState('')
  const [location, setLocation] = useState('')
  const [message, setMessage] = useState('')
  const [resume, setResume] = useState<File | null>(null)
  const fileInputRef = useRef<HTMLInputElement>(null)
  // Field errors
  const [nameErrors, setNameErrors] = useState<string[]>([])
  const [emailErrors, setEmailErrors] = useState<string[]>([])
  const [positionErrors, setPositionErrors] = useState<string[]>([])
  const [linkedinErrors, setLinkedinErrors] = useState<string[]>([])
  const [portfolioErrors, setPortfolioErrors] = useState<string[]>([])
  const [experienceErrors, setExperienceErrors] = useState<string[]>([])
  const [locationErrors, setLocationErrors] = useState<string[]>([])
  const [messageErrors, setMessageErrors] = useState<string[]>([])
  const [resumeErrors, setResumeErrors] = useState<string[]>([])
  const handleFileChange = (e: React.ChangeEvent<HTMLInputElement>) => {
    const file = e.target.files?.[0] || null
    setResume(file)
    if (file) {
      setResumeErrors([])
    }
  }
  async function onSubmit(e: React.FormEvent<HTMLFormElement>) {
    e.preventDefault()
    setShowErrors(true)
    // Validate all fields
    const nameErrs = validateName(name)
    const emailErrs = validateEmail(email)
    const positionErrs = validatePosition(position)
    const linkedinErrs = validateLinkedIn(linkedin)
    const portfolioErrs = validatePortfolio(portfolio)
    const experienceErrs = experience ? [] : ['Please select your years of experience']
    const locationErrs = validateLocation(location)
    const messageErrs = validateMessage(message)
    const resumeErrs = resume ? [] : ['Resume is required']
    setNameErrors(nameErrs)
    setEmailErrors(emailErrs)
    setPositionErrors(positionErrs)
    setLinkedinErrors(linkedinErrs)
    setPortfolioErrors(portfolioErrs)
    setExperienceErrors(experienceErrs)
    setLocationErrors(locationErrs)
    setMessageErrors(messageErrs)
    setResumeErrors(resumeErrs)
    if (
      nameErrs.length > 0 ||
      emailErrs.length > 0 ||
      positionErrs.length > 0 ||
      linkedinErrs.length > 0 ||
      portfolioErrs.length > 0 ||
      experienceErrs.length > 0 ||
      locationErrs.length > 0 ||
      messageErrs.length > 0 ||
      resumeErrs.length > 0
    ) {
      return
    }
    setIsSubmitting(true)
    setSubmitStatus('idle')
    try {
      const formData = new FormData()
      formData.append('name', name)
      formData.append('email', email)
      formData.append('phone', phone || '')
      formData.append('position', position)
      formData.append('linkedin', linkedin || '')
      formData.append('portfolio', portfolio || '')
      formData.append('experience', experience)
      formData.append('location', location)
      formData.append('message', message)
      if (resume) formData.append('resume', resume)
      const response = await fetch('/api/careers/submit', {
        method: 'POST',
        body: formData,
      })
      if (!response.ok) {
        throw new Error('Failed to submit application')
      }
      setSubmitStatus('success')
    } catch (error) {
      logger.error('Error submitting application:', error)
      setSubmitStatus('error')
    } finally {
      setIsSubmitting(false)
    }
  }
  return (
    <main className={`${soehne.className} min-h-screen bg-white text-gray-900`}>
      <Nav variant='landing' />
      {/* Content */}
      <div className='px-4 pt-[60px] pb-[80px] sm:px-8 md:px-[44px]'>
        <h1 className='mb-10 text-center font-bold text-4xl text-gray-900 md:text-5xl'>
          Join Our Team
        </h1>
        <div className='mx-auto max-w-4xl'>
          {/* Form Section */}
          <section className='rounded-2xl border border-gray-200 bg-white p-6 shadow-sm sm:p-10'>
            <form onSubmit={onSubmit} className='space-y-5'>
              {/* Name and Email */}
              <div className='grid grid-cols-1 gap-4 sm:gap-6 md:grid-cols-2'>
                <div className='space-y-2'>
                  <Label htmlFor='name' className='font-medium text-sm'>
                    Full Name *
                  </Label>
                  <Input
                    id='name'
                    placeholder='John Doe'
                    value={name}
                    onChange={(e) => setName(e.target.value)}
                    className={cn(
                      showErrors &&
                        nameErrors.length > 0 &&
                        'border-red-500 focus:border-red-500 focus:ring-red-100 focus-visible:ring-red-500'
                    )}
                  />
                  {showErrors && nameErrors.length > 0 && (
                    <div className='mt-1 space-y-1 text-red-400 text-xs'>
                      {nameErrors.map((error, index) => (
                        <p key={index}>{error}</p>
                      ))}
                    </div>
                  )}
                </div>
                <div className='space-y-2'>
                  <Label htmlFor='email' className='font-medium text-sm'>
                    Email *
                  </Label>
                  <Input
                    id='email'
                    type='email'
                    placeholder='john@example.com'
                    value={email}
                    onChange={(e) => setEmail(e.target.value)}
                    className={cn(
                      showErrors &&
                        emailErrors.length > 0 &&
                        'border-red-500 focus:border-red-500 focus:ring-red-100 focus-visible:ring-red-500'
                    )}
                  />
                  {showErrors && emailErrors.length > 0 && (
                    <div className='mt-1 space-y-1 text-red-400 text-xs'>
                      {emailErrors.map((error, index) => (
                        <p key={index}>{error}</p>
                      ))}
                    </div>
                  )}
                </div>
              </div>
              {/* Phone and Position */}
              <div className='grid grid-cols-1 gap-4 sm:gap-6 md:grid-cols-2'>
                <div className='space-y-2'>
                  <Label htmlFor='phone' className='font-medium text-sm'>
                    Phone Number
                  </Label>
                  <Input
                    id='phone'
                    type='tel'
                    placeholder='+1 (555) 123-4567'
                    value={phone}
                    onChange={(e) => setPhone(e.target.value)}
                  />
                </div>
                <div className='space-y-2'>
                  <Label htmlFor='position' className='font-medium text-sm'>
                    Position of Interest *
                  </Label>
                  <Input
                    id='position'
                    placeholder='e.g. Full Stack Engineer, Product Designer'
                    value={position}
                    onChange={(e) => setPosition(e.target.value)}
                    className={cn(
                      showErrors &&
                        positionErrors.length > 0 &&
                        'border-red-500 focus:border-red-500 focus:ring-red-100 focus-visible:ring-red-500'
                    )}
                  />
                  {showErrors && positionErrors.length > 0 && (
                    <div className='mt-1 space-y-1 text-red-400 text-xs'>
                      {positionErrors.map((error, index) => (
                        <p key={index}>{error}</p>
                      ))}
                    </div>
                  )}
                </div>
              </div>
              {/* LinkedIn and Portfolio */}
              <div className='grid grid-cols-1 gap-4 sm:gap-6 md:grid-cols-2'>
                <div className='space-y-2'>
                  <Label htmlFor='linkedin' className='font-medium text-sm'>
                    LinkedIn Profile
                  </Label>
                  <Input
                    id='linkedin'
                    placeholder='https://linkedin.com/in/yourprofile'
                    value={linkedin}
                    onChange={(e) => setLinkedin(e.target.value)}
                    className={cn(
                      showErrors &&
                        linkedinErrors.length > 0 &&
                        'border-red-500 focus:border-red-500 focus:ring-red-100 focus-visible:ring-red-500'
                    )}
                  />
                  {showErrors && linkedinErrors.length > 0 && (
                    <div className='mt-1 space-y-1 text-red-400 text-xs'>
{linkedinErrors.map((error, index) => (
<p key={index}>{error}</p>
))}
</div>
)}
</div>
<div className='space-y-2'>
<Label htmlFor='portfolio' className='font-medium text-sm'>
Portfolio / Website
</Label>
<Input
id='portfolio'
placeholder='https://yourportfolio.com'
value={portfolio}
onChange={(e) => setPortfolio(e.target.value)}
className={cn(
showErrors &&
portfolioErrors.length > 0 &&
'border-red-500 focus:border-red-500 focus:ring-red-100 focus-visible:ring-red-500'
)}
/>
{showErrors && portfolioErrors.length > 0 && (
<div className='mt-1 space-y-1 text-red-400 text-xs'>
{portfolioErrors.map((error, index) => (
<p key={index}>{error}</p>
))}
</div>
)}
</div>
</div>
{/* Experience and Location */}
<div className='grid grid-cols-1 gap-4 sm:gap-6 md:grid-cols-2'>
<div className='space-y-2'>
<Label htmlFor='experience' className='font-medium text-sm'>
Years of Experience *
</Label>
<Select value={experience} onValueChange={setExperience}>
<SelectTrigger
className={cn(
showErrors &&
experienceErrors.length > 0 &&
'border-red-500 focus:border-red-500 focus:ring-red-100 focus-visible:ring-red-500'
)}
>
<SelectValue placeholder='Select experience level' />
</SelectTrigger>
<SelectContent>
<SelectItem value='0-1'>0-1 years</SelectItem>
<SelectItem value='1-3'>1-3 years</SelectItem>
<SelectItem value='3-5'>3-5 years</SelectItem>
<SelectItem value='5-10'>5-10 years</SelectItem>
<SelectItem value='10+'>10+ years</SelectItem>
</SelectContent>
</Select>
{showErrors && experienceErrors.length > 0 && (
<div className='mt-1 space-y-1 text-red-400 text-xs'>
{experienceErrors.map((error, index) => (
<p key={index}>{error}</p>
))}
</div>
)}
</div>
<div className='space-y-2'>
<Label htmlFor='location' className='font-medium text-sm'>
Location *
</Label>
<Input
id='location'
placeholder='e.g. San Francisco, CA'
value={location}
onChange={(e) => setLocation(e.target.value)}
className={cn(
showErrors &&
locationErrors.length > 0 &&
'border-red-500 focus:border-red-500 focus:ring-red-100 focus-visible:ring-red-500'
)}
/>
{showErrors && locationErrors.length > 0 && (
<div className='mt-1 space-y-1 text-red-400 text-xs'>
{locationErrors.map((error, index) => (
<p key={index}>{error}</p>
))}
</div>
)}
</div>
</div>
{/* Message */}
<div className='space-y-2'>
<Label htmlFor='message' className='font-medium text-sm'>
Tell us about yourself *
</Label>
<Textarea
id='message'
placeholder='Tell us about your experience, what excites you about Sim, and why you would be a great fit for this role...'
className={cn(
'min-h-[140px]',
showErrors &&
messageErrors.length > 0 &&
'border-red-500 focus:border-red-500 focus:ring-red-100 focus-visible:ring-red-500'
)}
value={message}
onChange={(e) => setMessage(e.target.value)}
/>
<p className='mt-1.5 text-gray-500 text-xs'>Minimum 50 characters</p>
{showErrors && messageErrors.length > 0 && (
<div className='mt-1 space-y-1 text-red-400 text-xs'>
{messageErrors.map((error, index) => (
<p key={index}>{error}</p>
))}
</div>
)}
</div>
{/* Resume Upload */}
<div className='space-y-2'>
<Label htmlFor='resume' className='font-medium text-sm'>
Resume *
</Label>
<div className='relative'>
{resume ? (
<div className='flex items-center gap-2 rounded-md border border-input bg-background px-3 py-2'>
<span className='flex-1 truncate text-sm'>{resume.name}</span>
<button
type='button'
onClick={(e) => {
e.preventDefault()
setResume(null)
if (fileInputRef.current) {
fileInputRef.current.value = ''
}
}}
className='flex-shrink-0 text-muted-foreground transition-colors hover:text-foreground'
aria-label='Remove file'
>
<X className='h-4 w-4' />
</button>
</div>
) : (
<Input
id='resume'
type='file'
accept='.pdf,.doc,.docx'
onChange={handleFileChange}
ref={fileInputRef}
className={cn(
showErrors &&
resumeErrors.length > 0 &&
'border-red-500 focus:border-red-500 focus:ring-red-100 focus-visible:ring-red-500'
)}
/>
)}
</div>
<p className='mt-1.5 text-gray-500 text-xs'>PDF or Word document, max 10MB</p>
{showErrors && resumeErrors.length > 0 && (
<div className='mt-1 space-y-1 text-red-400 text-xs'>
{resumeErrors.map((error, index) => (
<p key={index}>{error}</p>
))}
</div>
)}
</div>
{/* Submit Button */}
<div className='flex justify-end pt-2'>
<BrandedButton
type='submit'
disabled={isSubmitting || submitStatus === 'success'}
loading={isSubmitting}
loadingText='Submitting'
showArrow={false}
fullWidth={false}
className='min-w-[200px]'
>
{submitStatus === 'success' ? 'Submitted' : 'Submit Application'}
</BrandedButton>
</div>
</form>
</section>
{/* Additional Info */}
<section className='mt-6 text-center text-gray-600 text-sm'>
<p>
Questions? Email us at{' '}
<a
href='mailto:careers@sim.ai'
className='font-medium text-gray-900 underline transition-colors hover:text-gray-700'
>
careers@sim.ai
</a>
</p>
</section>
</div>
</div>
{/* Footer - Only for hosted instances */}
{isHosted && (
<div className='relative z-20'>
<Footer fullWidth={true} />
</div>
)}
</main>
)
}

View File

@@ -77,12 +77,14 @@ export default function Footer({ fullWidth = false }: FooterProps) {
>
Status
</Link>
<Link
href='/careers'
<a
href='https://jobs.ashbyhq.com/sim'
target='_blank'
rel='noopener noreferrer'
className='text-[14px] text-muted-foreground transition-colors hover:text-foreground'
>
Careers
</Link>
</a>
<Link
href='/privacy'
target='_blank'

View File

@@ -91,12 +91,14 @@ export default function Nav({ hideAuthButtons = false, variant = 'landing' }: Na
</button>
</li>
<li>
<Link
href='/careers'
<a
href='https://jobs.ashbyhq.com/sim'
target='_blank'
rel='noopener noreferrer'
className='text-[16px] text-muted-foreground transition-colors hover:text-foreground'
>
Careers
</Link>
</a>
</li>
<li>
<a

View File

@@ -18,7 +18,6 @@ export function ThemeProvider({ children, ...props }: ThemeProviderProps) {
pathname.startsWith('/privacy') ||
pathname.startsWith('/invite') ||
pathname.startsWith('/verify') ||
pathname.startsWith('/careers') ||
pathname.startsWith('/changelog') ||
pathname.startsWith('/chat') ||
pathname.startsWith('/studio') ||

View File

@@ -833,15 +833,7 @@ input[type="search"]::-ms-clear {
animation: growShrink 1.5s infinite ease-in-out;
}
/* Subflow node z-index and drag-over styles */
.workflow-container .react-flow__node-subflowNode {
z-index: -1 !important;
}
.workflow-container .react-flow__node-subflowNode:has([data-subflow-selected="true"]) {
z-index: 10 !important;
}
/* Subflow node drag-over styles */
.loop-node-drag-over,
.parallel-node-drag-over {
box-shadow: 0 0 0 1.75px var(--brand-secondary) !important;

View File

@@ -711,7 +711,7 @@ async function handleMessageStream(
if (response.body && isStreamingResponse) {
const reader = response.body.getReader()
const decoder = new TextDecoder()
let accumulatedContent = ''
const contentChunks: string[] = []
let finalContent: string | undefined
while (true) {
@@ -722,7 +722,7 @@ async function handleMessageStream(
const parsed = parseWorkflowSSEChunk(rawChunk)
if (parsed.content) {
accumulatedContent += parsed.content
contentChunks.push(parsed.content)
sendEvent('message', {
kind: 'message',
taskId,
@@ -738,6 +738,7 @@ async function handleMessageStream(
}
}
const accumulatedContent = contentChunks.join('')
const messageContent =
(finalContent !== undefined && finalContent.length > 0
? finalContent
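The accumulation change in the diff above — collecting SSE content chunks in an array and joining once at the end, instead of repeated `accumulatedContent += chunk` string concatenation — can be sketched in isolation. The event objects here are stand-ins, not the app's actual `parseWorkflowSSEChunk` output:

```typescript
// Collect chunks in an array; join once after the stream ends.
const contentChunks: string[] = []

function onChunk(content: string | undefined): void {
  // Mirrors the diff: only push when the parsed chunk carried content.
  if (content) contentChunks.push(content)
}

// Simulated stream of parsed SSE chunks (one has no content field).
const events: Array<{ content?: string }> = [{ content: 'Hel' }, { content: 'lo' }, {}, { content: '!' }]
for (const parsed of events) {
  onChunk(parsed.content)
}

const accumulatedContent = contentChunks.join('')
```

A single `join('')` avoids building a new intermediate string per chunk, which matters when a response streams many small deltas.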

View File

@@ -6,40 +6,33 @@
import { createMockRequest } from '@sim/testing'
import { beforeEach, describe, expect, it, vi } from 'vitest'
const {
mockGetSession,
mockDb,
mockLogger,
mockParseProvider,
mockEvaluateScopeCoverage,
mockJwtDecode,
mockEq,
} = vi.hoisted(() => {
const db = {
select: vi.fn().mockReturnThis(),
from: vi.fn().mockReturnThis(),
where: vi.fn().mockReturnThis(),
limit: vi.fn(),
const { mockGetSession, mockDb, mockLogger, mockParseProvider, mockJwtDecode, mockEq } = vi.hoisted(
() => {
const db = {
select: vi.fn().mockReturnThis(),
from: vi.fn().mockReturnThis(),
where: vi.fn().mockReturnThis(),
limit: vi.fn(),
}
const logger = {
info: vi.fn(),
warn: vi.fn(),
error: vi.fn(),
debug: vi.fn(),
trace: vi.fn(),
fatal: vi.fn(),
child: vi.fn(),
}
return {
mockGetSession: vi.fn(),
mockDb: db,
mockLogger: logger,
mockParseProvider: vi.fn(),
mockJwtDecode: vi.fn(),
mockEq: vi.fn((field: unknown, value: unknown) => ({ field, value, type: 'eq' })),
}
}
const logger = {
info: vi.fn(),
warn: vi.fn(),
error: vi.fn(),
debug: vi.fn(),
trace: vi.fn(),
fatal: vi.fn(),
child: vi.fn(),
}
return {
mockGetSession: vi.fn(),
mockDb: db,
mockLogger: logger,
mockParseProvider: vi.fn(),
mockEvaluateScopeCoverage: vi.fn(),
mockJwtDecode: vi.fn(),
mockEq: vi.fn((field: unknown, value: unknown) => ({ field, value, type: 'eq' })),
}
})
)
vi.mock('@/lib/auth', () => ({
getSession: mockGetSession,
@@ -66,7 +59,6 @@ vi.mock('@sim/logger', () => ({
vi.mock('@/lib/oauth/utils', () => ({
parseProvider: mockParseProvider,
evaluateScopeCoverage: mockEvaluateScopeCoverage,
}))
import { GET } from '@/app/api/auth/oauth/connections/route'
@@ -83,16 +75,6 @@ describe('OAuth Connections API Route', () => {
baseProvider: providerId.split('-')[0] || providerId,
featureType: providerId.split('-')[1] || 'default',
}))
mockEvaluateScopeCoverage.mockImplementation(
(_providerId: string, _grantedScopes: string[]) => ({
canonicalScopes: ['email', 'profile'],
grantedScopes: ['email', 'profile'],
missingScopes: [],
extraScopes: [],
requiresReauthorization: false,
})
)
})
it('should return connections successfully', async () => {

View File

@@ -6,7 +6,7 @@ import { type NextRequest, NextResponse } from 'next/server'
import { getSession } from '@/lib/auth'
import { generateRequestId } from '@/lib/core/utils/request'
import type { OAuthProvider } from '@/lib/oauth'
import { evaluateScopeCoverage, parseProvider } from '@/lib/oauth'
import { parseProvider } from '@/lib/oauth'
const logger = createLogger('OAuthConnectionsAPI')
@@ -49,8 +49,7 @@ export async function GET(request: NextRequest) {
for (const acc of accounts) {
const { baseProvider, featureType } = parseProvider(acc.providerId as OAuthProvider)
const grantedScopes = acc.scope ? acc.scope.split(/\s+/).filter(Boolean) : []
const scopeEvaluation = evaluateScopeCoverage(acc.providerId, grantedScopes)
const scopes = acc.scope ? acc.scope.split(/\s+/).filter(Boolean) : []
if (baseProvider) {
// Try multiple methods to get a user-friendly display name
@@ -96,10 +95,6 @@ export async function GET(request: NextRequest) {
const accountSummary = {
id: acc.id,
name: displayName,
scopes: scopeEvaluation.grantedScopes,
missingScopes: scopeEvaluation.missingScopes,
extraScopes: scopeEvaluation.extraScopes,
requiresReauthorization: scopeEvaluation.requiresReauthorization,
}
if (existingConnection) {
@@ -108,20 +103,8 @@ export async function GET(request: NextRequest) {
existingConnection.accounts.push(accountSummary)
existingConnection.scopes = Array.from(
new Set([...(existingConnection.scopes || []), ...scopeEvaluation.grantedScopes])
new Set([...(existingConnection.scopes || []), ...scopes])
)
existingConnection.missingScopes = Array.from(
new Set([...(existingConnection.missingScopes || []), ...scopeEvaluation.missingScopes])
)
existingConnection.extraScopes = Array.from(
new Set([...(existingConnection.extraScopes || []), ...scopeEvaluation.extraScopes])
)
existingConnection.canonicalScopes =
existingConnection.canonicalScopes && existingConnection.canonicalScopes.length > 0
? existingConnection.canonicalScopes
: scopeEvaluation.canonicalScopes
existingConnection.requiresReauthorization =
existingConnection.requiresReauthorization || scopeEvaluation.requiresReauthorization
const existingTimestamp = existingConnection.lastConnected
? new Date(existingConnection.lastConnected).getTime()
@@ -138,11 +121,7 @@ export async function GET(request: NextRequest) {
baseProvider,
featureType,
isConnected: true,
scopes: scopeEvaluation.grantedScopes,
canonicalScopes: scopeEvaluation.canonicalScopes,
missingScopes: scopeEvaluation.missingScopes,
extraScopes: scopeEvaluation.extraScopes,
requiresReauthorization: scopeEvaluation.requiresReauthorization,
scopes,
lastConnected: acc.updatedAt.toISOString(),
accounts: [accountSummary],
})

View File

@@ -7,7 +7,7 @@
import { NextRequest } from 'next/server'
import { beforeEach, describe, expect, it, vi } from 'vitest'
const { mockCheckSessionOrInternalAuth, mockEvaluateScopeCoverage, mockLogger } = vi.hoisted(() => {
const { mockCheckSessionOrInternalAuth, mockLogger } = vi.hoisted(() => {
const logger = {
info: vi.fn(),
warn: vi.fn(),
@@ -19,7 +19,6 @@ const { mockCheckSessionOrInternalAuth, mockEvaluateScopeCoverage, mockLogger }
}
return {
mockCheckSessionOrInternalAuth: vi.fn(),
mockEvaluateScopeCoverage: vi.fn(),
mockLogger: logger,
}
})
@@ -28,10 +27,6 @@ vi.mock('@/lib/auth/hybrid', () => ({
checkSessionOrInternalAuth: mockCheckSessionOrInternalAuth,
}))
vi.mock('@/lib/oauth', () => ({
evaluateScopeCoverage: mockEvaluateScopeCoverage,
}))
vi.mock('@/lib/core/utils/request', () => ({
generateRequestId: vi.fn().mockReturnValue('mock-request-id'),
}))
@@ -87,16 +82,6 @@ describe('OAuth Credentials API Route', () => {
beforeEach(() => {
vi.clearAllMocks()
mockEvaluateScopeCoverage.mockImplementation(
(_providerId: string, grantedScopes: string[]) => ({
canonicalScopes: grantedScopes,
grantedScopes,
missingScopes: [],
extraScopes: [],
requiresReauthorization: false,
})
)
})
it('should handle unauthenticated user', async () => {

View File

@@ -7,7 +7,6 @@ import { z } from 'zod'
import { checkSessionOrInternalAuth } from '@/lib/auth/hybrid'
import { generateRequestId } from '@/lib/core/utils/request'
import { syncWorkspaceOAuthCredentialsForUser } from '@/lib/credentials/oauth'
import { evaluateScopeCoverage } from '@/lib/oauth'
import { authorizeWorkflowByWorkspacePermission } from '@/lib/workflows/utils'
import { checkWorkspaceAccess } from '@/lib/workspaces/permissions/utils'
@@ -39,8 +38,7 @@ function toCredentialResponse(
scope: string | null
) {
const storedScope = scope?.trim()
const grantedScopes = storedScope ? storedScope.split(/[\s,]+/).filter(Boolean) : []
const scopeEvaluation = evaluateScopeCoverage(providerId, grantedScopes)
const scopes = storedScope ? storedScope.split(/[\s,]+/).filter(Boolean) : []
const [_, featureType = 'default'] = providerId.split('-')
return {
@@ -49,11 +47,7 @@ function toCredentialResponse(
provider: providerId,
lastUsed: updatedAt.toISOString(),
isDefault: featureType === 'default',
scopes: scopeEvaluation.grantedScopes,
canonicalScopes: scopeEvaluation.canonicalScopes,
missingScopes: scopeEvaluation.missingScopes,
extraScopes: scopeEvaluation.extraScopes,
requiresReauthorization: scopeEvaluation.requiresReauthorization,
scopes,
}
}

View File

@@ -1,192 +0,0 @@
import { render } from '@react-email/components'
import { createLogger } from '@sim/logger'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { CareersConfirmationEmail, CareersSubmissionEmail } from '@/components/emails'
import { generateRequestId } from '@/lib/core/utils/request'
import { sendEmail } from '@/lib/messaging/email/mailer'
export const dynamic = 'force-dynamic'
const logger = createLogger('CareersAPI')
const MAX_FILE_SIZE = 10 * 1024 * 1024
const ALLOWED_FILE_TYPES = [
'application/pdf',
'application/msword',
'application/vnd.openxmlformats-officedocument.wordprocessingml.document',
]
const CareersSubmissionSchema = z.object({
name: z.string().min(2, 'Name must be at least 2 characters'),
email: z.string().email('Please enter a valid email address'),
phone: z.string().optional(),
position: z.string().min(2, 'Please specify the position you are interested in'),
linkedin: z.string().url('Please enter a valid LinkedIn URL').optional().or(z.literal('')),
portfolio: z.string().url('Please enter a valid portfolio URL').optional().or(z.literal('')),
experience: z.enum(['0-1', '1-3', '3-5', '5-10', '10+']),
location: z.string().min(2, 'Please enter your location'),
message: z.string().min(50, 'Please tell us more about yourself (at least 50 characters)'),
})
export async function POST(request: NextRequest) {
const requestId = generateRequestId()
try {
const formData = await request.formData()
const data = {
name: formData.get('name') as string,
email: formData.get('email') as string,
phone: formData.get('phone') as string,
position: formData.get('position') as string,
linkedin: formData.get('linkedin') as string,
portfolio: formData.get('portfolio') as string,
experience: formData.get('experience') as string,
location: formData.get('location') as string,
message: formData.get('message') as string,
}
const resumeFile = formData.get('resume') as File | null
if (!resumeFile) {
return NextResponse.json(
{
success: false,
message: 'Resume is required',
errors: [{ path: ['resume'], message: 'Resume is required' }],
},
{ status: 400 }
)
}
if (resumeFile.size > MAX_FILE_SIZE) {
return NextResponse.json(
{
success: false,
message: 'Resume file size must be less than 10MB',
errors: [{ path: ['resume'], message: 'File size must be less than 10MB' }],
},
{ status: 400 }
)
}
if (!ALLOWED_FILE_TYPES.includes(resumeFile.type)) {
return NextResponse.json(
{
success: false,
message: 'Resume must be a PDF or Word document',
errors: [{ path: ['resume'], message: 'File must be PDF or Word document' }],
},
{ status: 400 }
)
}
const resumeBuffer = await resumeFile.arrayBuffer()
const resumeBase64 = Buffer.from(resumeBuffer).toString('base64')
const validatedData = CareersSubmissionSchema.parse(data)
logger.info(`[${requestId}] Processing career application`, {
name: validatedData.name,
email: validatedData.email,
position: validatedData.position,
resumeSize: resumeFile.size,
resumeType: resumeFile.type,
})
const submittedDate = new Date()
const careersEmailHtml = await render(
CareersSubmissionEmail({
name: validatedData.name,
email: validatedData.email,
phone: validatedData.phone,
position: validatedData.position,
linkedin: validatedData.linkedin,
portfolio: validatedData.portfolio,
experience: validatedData.experience,
location: validatedData.location,
message: validatedData.message,
submittedDate,
})
)
const confirmationEmailHtml = await render(
CareersConfirmationEmail({
name: validatedData.name,
position: validatedData.position,
submittedDate,
})
)
const careersEmailResult = await sendEmail({
to: 'careers@sim.ai',
subject: `New Career Application: ${validatedData.name} - ${validatedData.position}`,
html: careersEmailHtml,
emailType: 'transactional',
replyTo: validatedData.email,
attachments: [
{
filename: resumeFile.name,
content: resumeBase64,
contentType: resumeFile.type,
},
],
})
if (!careersEmailResult.success) {
logger.error(`[${requestId}] Failed to send email to careers@sim.ai`, {
error: careersEmailResult.message,
})
throw new Error('Failed to submit application')
}
const confirmationResult = await sendEmail({
to: validatedData.email,
subject: `Your Application to Sim - ${validatedData.position}`,
html: confirmationEmailHtml,
emailType: 'transactional',
replyTo: validatedData.email,
})
if (!confirmationResult.success) {
logger.warn(`[${requestId}] Failed to send confirmation email to applicant`, {
email: validatedData.email,
error: confirmationResult.message,
})
}
logger.info(`[${requestId}] Career application submitted successfully`, {
careersEmailSent: careersEmailResult.success,
confirmationEmailSent: confirmationResult.success,
})
return NextResponse.json({
success: true,
message: 'Application submitted successfully',
})
} catch (error) {
if (error instanceof z.ZodError) {
logger.warn(`[${requestId}] Invalid application data`, { errors: error.errors })
return NextResponse.json(
{
success: false,
message: 'Invalid application data',
errors: error.errors,
},
{ status: 400 }
)
}
logger.error(`[${requestId}] Error processing career application:`, error)
return NextResponse.json(
{
success: false,
message:
'Failed to submit application. Please try again or email us directly at careers@sim.ai',
},
{ status: 500 }
)
}
}

View File

@@ -405,13 +405,17 @@ export async function POST(req: NextRequest) {
},
})
} finally {
controller.close()
try {
controller.close()
} catch {
// controller may already be closed by cancel()
}
}
},
async cancel() {
clientDisconnected = true
if (eventWriter) {
await eventWriter.flush()
await eventWriter.close().catch(() => {})
}
},
})
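The guarded `controller.close()` in the diff above exists because `cancel()` fires when the client disconnects, after which closing the controller throws. A minimal self-contained sketch of the same shape — the stream body here is illustrative, not the route's actual event loop:

```typescript
let clientDisconnected = false

const stream = new ReadableStream({
  start(controller) {
    try {
      controller.enqueue('event: done\n\n')
    } finally {
      try {
        controller.close()
      } catch {
        // controller may already be closed by cancel()
      }
    }
  },
  cancel() {
    // Invoked when the reader/client disconnects; the controller is
    // closed by the runtime at this point.
    clientDisconnected = true
  },
})
```

Without the inner try/catch, a disconnect racing the `finally` block would surface as an unhandled `TypeError` from the already-closed controller.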

View File

@@ -2,8 +2,6 @@ import type { NextRequest } from 'next/server'
import { NextResponse } from 'next/server'
import {
renderBatchInvitationEmail,
renderCareersConfirmationEmail,
renderCareersSubmissionEmail,
renderCreditPurchaseEmail,
renderEnterpriseSubscriptionEmail,
renderFreeTierUpgradeEmail,
@@ -94,22 +92,6 @@ const emailTemplates = {
failureReason: 'Card declined',
}),
// Careers emails
'careers-confirmation': () => renderCareersConfirmationEmail('John Doe', 'Senior Engineer'),
'careers-submission': () =>
renderCareersSubmissionEmail({
name: 'John Doe',
email: 'john@example.com',
phone: '+1 (555) 123-4567',
position: 'Senior Engineer',
linkedin: 'https://linkedin.com/in/johndoe',
portfolio: 'https://johndoe.dev',
experience: '5-10',
location: 'San Francisco, CA',
message:
'I have 10 years of experience building scalable distributed systems. Most recently, I led a team at a Series B startup where we scaled from 100K to 10M users.',
}),
// Notification emails
'workflow-notification-success': () =>
renderWorkflowNotificationEmail({
@@ -176,7 +158,6 @@ export async function GET(request: NextRequest) {
'credit-purchase',
'payment-failed',
],
Careers: ['careers-confirmation', 'careers-submission'],
Notifications: [
'workflow-notification-success',
'workflow-notification-error',

View File

@@ -36,6 +36,7 @@ export async function GET(_request: NextRequest, { params }: { params: Promise<{
stateSnapshotId: workflowExecutionLogs.stateSnapshotId,
deploymentVersionId: workflowExecutionLogs.deploymentVersionId,
level: workflowExecutionLogs.level,
status: workflowExecutionLogs.status,
trigger: workflowExecutionLogs.trigger,
startedAt: workflowExecutionLogs.startedAt,
endedAt: workflowExecutionLogs.endedAt,
@@ -99,6 +100,7 @@ export async function GET(_request: NextRequest, { params }: { params: Promise<{
deploymentVersion: log.deploymentVersion ?? null,
deploymentVersionName: log.deploymentVersionName ?? null,
level: log.level,
status: log.status,
duration: log.totalDurationMs ? `${log.totalDurationMs}ms` : null,
trigger: log.trigger,
createdAt: log.startedAt.toISOString(),

View File

@@ -0,0 +1,79 @@
import { createLogger } from '@sim/logger'
import { NextResponse } from 'next/server'
import { authorizeCredentialUse } from '@/lib/auth/credential-access'
import { generateRequestId } from '@/lib/core/utils/request'
import { refreshAccessTokenIfNeeded } from '@/app/api/auth/oauth/utils'
const logger = createLogger('AirtableBasesAPI')
export const dynamic = 'force-dynamic'
export async function POST(request: Request) {
const requestId = generateRequestId()
try {
const body = await request.json()
const { credential, workflowId } = body
if (!credential) {
logger.error('Missing credential in request')
return NextResponse.json({ error: 'Credential is required' }, { status: 400 })
}
const authz = await authorizeCredentialUse(request as any, {
credentialId: credential,
workflowId,
})
if (!authz.ok || !authz.credentialOwnerUserId) {
return NextResponse.json({ error: authz.error || 'Unauthorized' }, { status: 403 })
}
const accessToken = await refreshAccessTokenIfNeeded(
credential,
authz.credentialOwnerUserId,
requestId
)
if (!accessToken) {
logger.error('Failed to get access token', {
credentialId: credential,
userId: authz.credentialOwnerUserId,
})
return NextResponse.json(
{ error: 'Could not retrieve access token', authRequired: true },
{ status: 401 }
)
}
const response = await fetch('https://api.airtable.com/v0/meta/bases', {
headers: {
Authorization: `Bearer ${accessToken}`,
'Content-Type': 'application/json',
},
})
if (!response.ok) {
const errorData = await response.json().catch(() => ({}))
logger.error('Failed to fetch Airtable bases', {
status: response.status,
error: errorData,
})
return NextResponse.json(
{ error: 'Failed to fetch Airtable bases', details: errorData },
{ status: response.status }
)
}
const data = await response.json()
const bases = (data.bases || []).map((base: { id: string; name: string }) => ({
id: base.id,
name: base.name,
}))
return NextResponse.json({ bases })
} catch (error) {
logger.error('Error processing Airtable bases request:', error)
return NextResponse.json(
{ error: 'Failed to retrieve Airtable bases', details: (error as Error).message },
{ status: 500 }
)
}
}

View File

@@ -0,0 +1,95 @@
import { createLogger } from '@sim/logger'
import { NextResponse } from 'next/server'
import { authorizeCredentialUse } from '@/lib/auth/credential-access'
import { validateAirtableId } from '@/lib/core/security/input-validation'
import { generateRequestId } from '@/lib/core/utils/request'
import { refreshAccessTokenIfNeeded } from '@/app/api/auth/oauth/utils'
const logger = createLogger('AirtableTablesAPI')
export const dynamic = 'force-dynamic'
export async function POST(request: Request) {
const requestId = generateRequestId()
try {
const body = await request.json()
const { credential, workflowId, baseId } = body
if (!credential) {
logger.error('Missing credential in request')
return NextResponse.json({ error: 'Credential is required' }, { status: 400 })
}
if (!baseId) {
logger.error('Missing baseId in request')
return NextResponse.json({ error: 'Base ID is required' }, { status: 400 })
}
const baseIdValidation = validateAirtableId(baseId, 'app', 'baseId')
if (!baseIdValidation.isValid) {
logger.error('Invalid baseId', { error: baseIdValidation.error })
return NextResponse.json({ error: baseIdValidation.error }, { status: 400 })
}
const authz = await authorizeCredentialUse(request as any, {
credentialId: credential,
workflowId,
})
if (!authz.ok || !authz.credentialOwnerUserId) {
return NextResponse.json({ error: authz.error || 'Unauthorized' }, { status: 403 })
}
const accessToken = await refreshAccessTokenIfNeeded(
credential,
authz.credentialOwnerUserId,
requestId
)
if (!accessToken) {
logger.error('Failed to get access token', {
credentialId: credential,
userId: authz.credentialOwnerUserId,
})
return NextResponse.json(
{ error: 'Could not retrieve access token', authRequired: true },
{ status: 401 }
)
}
const response = await fetch(
`https://api.airtable.com/v0/meta/bases/${baseIdValidation.sanitized}/tables`,
{
headers: {
Authorization: `Bearer ${accessToken}`,
'Content-Type': 'application/json',
},
}
)
if (!response.ok) {
const errorData = await response.json().catch(() => ({}))
logger.error('Failed to fetch Airtable tables', {
status: response.status,
error: errorData,
baseId,
})
return NextResponse.json(
{ error: 'Failed to fetch Airtable tables', details: errorData },
{ status: response.status }
)
}
const data = await response.json()
const tables = (data.tables || []).map((table: { id: string; name: string }) => ({
id: table.id,
name: table.name,
}))
return NextResponse.json({ tables })
} catch (error) {
logger.error('Error processing Airtable tables request:', error)
return NextResponse.json(
{ error: 'Failed to retrieve Airtable tables', details: (error as Error).message },
{ status: 500 }
)
}
}

View File

@@ -0,0 +1,79 @@
import { createLogger } from '@sim/logger'
import { NextResponse } from 'next/server'
import { authorizeCredentialUse } from '@/lib/auth/credential-access'
import { generateRequestId } from '@/lib/core/utils/request'
import { refreshAccessTokenIfNeeded } from '@/app/api/auth/oauth/utils'
const logger = createLogger('AsanaWorkspacesAPI')
export const dynamic = 'force-dynamic'
export async function POST(request: Request) {
const requestId = generateRequestId()
try {
const body = await request.json()
const { credential, workflowId } = body
if (!credential) {
logger.error('Missing credential in request')
return NextResponse.json({ error: 'Credential is required' }, { status: 400 })
}
const authz = await authorizeCredentialUse(request as any, {
credentialId: credential,
workflowId,
})
if (!authz.ok || !authz.credentialOwnerUserId) {
return NextResponse.json({ error: authz.error || 'Unauthorized' }, { status: 403 })
}
const accessToken = await refreshAccessTokenIfNeeded(
credential,
authz.credentialOwnerUserId,
requestId
)
if (!accessToken) {
logger.error('Failed to get access token', {
credentialId: credential,
userId: authz.credentialOwnerUserId,
})
return NextResponse.json(
{ error: 'Could not retrieve access token', authRequired: true },
{ status: 401 }
)
}
const response = await fetch('https://app.asana.com/api/1.0/workspaces', {
headers: {
Authorization: `Bearer ${accessToken}`,
Accept: 'application/json',
},
})
if (!response.ok) {
const errorData = await response.json().catch(() => ({}))
logger.error('Failed to fetch Asana workspaces', {
status: response.status,
error: errorData,
})
return NextResponse.json(
{ error: 'Failed to fetch Asana workspaces', details: errorData },
{ status: response.status }
)
}
const data = await response.json()
const workspaces = (data.data || []).map((workspace: { gid: string; name: string }) => ({
id: workspace.gid,
name: workspace.name,
}))
return NextResponse.json({ workspaces })
} catch (error) {
logger.error('Error processing Asana workspaces request:', error)
return NextResponse.json(
{ error: 'Failed to retrieve Asana workspaces', details: (error as Error).message },
{ status: 500 }
)
}
}

View File

@@ -0,0 +1,79 @@
import { createLogger } from '@sim/logger'
import { NextResponse } from 'next/server'
import { authorizeCredentialUse } from '@/lib/auth/credential-access'
import { generateRequestId } from '@/lib/core/utils/request'
import { refreshAccessTokenIfNeeded } from '@/app/api/auth/oauth/utils'
const logger = createLogger('AttioListsAPI')
export const dynamic = 'force-dynamic'
export async function POST(request: Request) {
const requestId = generateRequestId()
try {
const body = await request.json()
const { credential, workflowId } = body
if (!credential) {
logger.error('Missing credential in request')
return NextResponse.json({ error: 'Credential is required' }, { status: 400 })
}
const authz = await authorizeCredentialUse(request as any, {
credentialId: credential,
workflowId,
})
if (!authz.ok || !authz.credentialOwnerUserId) {
return NextResponse.json({ error: authz.error || 'Unauthorized' }, { status: 403 })
}
const accessToken = await refreshAccessTokenIfNeeded(
credential,
authz.credentialOwnerUserId,
requestId
)
if (!accessToken) {
logger.error('Failed to get access token', {
credentialId: credential,
userId: authz.credentialOwnerUserId,
})
return NextResponse.json(
{ error: 'Could not retrieve access token', authRequired: true },
{ status: 401 }
)
}
const response = await fetch('https://api.attio.com/v2/lists', {
headers: {
Authorization: `Bearer ${accessToken}`,
Accept: 'application/json',
},
})
if (!response.ok) {
const errorData = await response.json().catch(() => ({}))
logger.error('Failed to fetch Attio lists', {
status: response.status,
error: errorData,
})
return NextResponse.json(
{ error: 'Failed to fetch Attio lists', details: errorData },
{ status: response.status }
)
}
const data = await response.json()
const lists = (data.data || []).map((list: { api_slug: string; name: string }) => ({
id: list.api_slug,
name: list.name,
}))
return NextResponse.json({ lists })
} catch (error) {
logger.error('Error processing Attio lists request:', error)
return NextResponse.json(
{ error: 'Failed to retrieve Attio lists', details: (error as Error).message },
{ status: 500 }
)
}
}

@@ -0,0 +1,79 @@
import { createLogger } from '@sim/logger'
import { NextResponse } from 'next/server'
import { authorizeCredentialUse } from '@/lib/auth/credential-access'
import { generateRequestId } from '@/lib/core/utils/request'
import { refreshAccessTokenIfNeeded } from '@/app/api/auth/oauth/utils'
const logger = createLogger('AttioObjectsAPI')
export const dynamic = 'force-dynamic'
export async function POST(request: Request) {
const requestId = generateRequestId()
try {
const body = await request.json()
const { credential, workflowId } = body
if (!credential) {
logger.error('Missing credential in request')
return NextResponse.json({ error: 'Credential is required' }, { status: 400 })
}
const authz = await authorizeCredentialUse(request as any, {
credentialId: credential,
workflowId,
})
if (!authz.ok || !authz.credentialOwnerUserId) {
return NextResponse.json({ error: authz.error || 'Unauthorized' }, { status: 403 })
}
const accessToken = await refreshAccessTokenIfNeeded(
credential,
authz.credentialOwnerUserId,
requestId
)
if (!accessToken) {
logger.error('Failed to get access token', {
credentialId: credential,
userId: authz.credentialOwnerUserId,
})
return NextResponse.json(
{ error: 'Could not retrieve access token', authRequired: true },
{ status: 401 }
)
}
const response = await fetch('https://api.attio.com/v2/objects', {
headers: {
Authorization: `Bearer ${accessToken}`,
Accept: 'application/json',
},
})
if (!response.ok) {
const errorData = await response.json().catch(() => ({}))
logger.error('Failed to fetch Attio objects', {
status: response.status,
error: errorData,
})
return NextResponse.json(
{ error: 'Failed to fetch Attio objects', details: errorData },
{ status: response.status }
)
}
const data = await response.json()
const objects = (data.data || []).map((obj: { api_slug: string; singular_noun: string }) => ({
id: obj.api_slug,
name: obj.singular_noun,
}))
return NextResponse.json({ objects })
} catch (error) {
logger.error('Error processing Attio objects request:', error)
return NextResponse.json(
{ error: 'Failed to retrieve Attio objects', details: (error as Error).message },
{ status: 500 }
)
}
}

@@ -0,0 +1,83 @@
import { createLogger } from '@sim/logger'
import { NextResponse } from 'next/server'
import { authorizeCredentialUse } from '@/lib/auth/credential-access'
import { generateRequestId } from '@/lib/core/utils/request'
import { refreshAccessTokenIfNeeded } from '@/app/api/auth/oauth/utils'
const logger = createLogger('CalcomEventTypesAPI')
export const dynamic = 'force-dynamic'
export async function POST(request: Request) {
const requestId = generateRequestId()
try {
const body = await request.json()
const { credential, workflowId } = body
if (!credential) {
logger.error('Missing credential in request')
return NextResponse.json({ error: 'Credential is required' }, { status: 400 })
}
const authz = await authorizeCredentialUse(request as any, {
credentialId: credential,
workflowId,
})
if (!authz.ok || !authz.credentialOwnerUserId) {
return NextResponse.json({ error: authz.error || 'Unauthorized' }, { status: 403 })
}
const accessToken = await refreshAccessTokenIfNeeded(
credential,
authz.credentialOwnerUserId,
requestId
)
if (!accessToken) {
logger.error('Failed to get access token', {
credentialId: credential,
userId: authz.credentialOwnerUserId,
})
return NextResponse.json(
{ error: 'Could not retrieve access token', authRequired: true },
{ status: 401 }
)
}
const response = await fetch('https://api.cal.com/v2/event-types', {
headers: {
Authorization: `Bearer ${accessToken}`,
'Content-Type': 'application/json',
'cal-api-version': '2024-06-14',
},
})
if (!response.ok) {
const errorData = await response.json().catch(() => ({}))
logger.error('Failed to fetch Cal.com event types', {
status: response.status,
error: errorData,
})
return NextResponse.json(
{ error: 'Failed to fetch Cal.com event types', details: errorData },
{ status: response.status }
)
}
const data = await response.json()
const eventTypes = (data.data || []).map(
(eventType: { id: number; title: string; slug: string }) => ({
id: String(eventType.id),
title: eventType.title,
slug: eventType.slug,
})
)
return NextResponse.json({ eventTypes })
} catch (error) {
logger.error('Error processing Cal.com event types request:', error)
return NextResponse.json(
{ error: 'Failed to retrieve Cal.com event types', details: (error as Error).message },
{ status: 500 }
)
}
}

@@ -0,0 +1,80 @@
import { createLogger } from '@sim/logger'
import { NextResponse } from 'next/server'
import { authorizeCredentialUse } from '@/lib/auth/credential-access'
import { generateRequestId } from '@/lib/core/utils/request'
import { refreshAccessTokenIfNeeded } from '@/app/api/auth/oauth/utils'
const logger = createLogger('CalcomSchedulesAPI')
export const dynamic = 'force-dynamic'
export async function POST(request: Request) {
const requestId = generateRequestId()
try {
const body = await request.json()
const { credential, workflowId } = body
if (!credential) {
logger.error('Missing credential in request')
return NextResponse.json({ error: 'Credential is required' }, { status: 400 })
}
const authz = await authorizeCredentialUse(request as any, {
credentialId: credential,
workflowId,
})
if (!authz.ok || !authz.credentialOwnerUserId) {
return NextResponse.json({ error: authz.error || 'Unauthorized' }, { status: 403 })
}
const accessToken = await refreshAccessTokenIfNeeded(
credential,
authz.credentialOwnerUserId,
requestId
)
if (!accessToken) {
logger.error('Failed to get access token', {
credentialId: credential,
userId: authz.credentialOwnerUserId,
})
return NextResponse.json(
{ error: 'Could not retrieve access token', authRequired: true },
{ status: 401 }
)
}
const response = await fetch('https://api.cal.com/v2/schedules', {
headers: {
Authorization: `Bearer ${accessToken}`,
'Content-Type': 'application/json',
'cal-api-version': '2024-06-11',
},
})
if (!response.ok) {
const errorData = await response.json().catch(() => ({}))
logger.error('Failed to fetch Cal.com schedules', {
status: response.status,
error: errorData,
})
return NextResponse.json(
{ error: 'Failed to fetch Cal.com schedules', details: errorData },
{ status: response.status }
)
}
const data = await response.json()
const schedules = (data.data || []).map((schedule: { id: number; name: string }) => ({
id: String(schedule.id),
name: schedule.name,
}))
return NextResponse.json({ schedules })
} catch (error) {
logger.error('Error processing Cal.com schedules request:', error)
return NextResponse.json(
{ error: 'Failed to retrieve Cal.com schedules', details: (error as Error).message },
{ status: 500 }
)
}
}

@@ -0,0 +1,96 @@
import { createLogger } from '@sim/logger'
import { NextResponse } from 'next/server'
import { authorizeCredentialUse } from '@/lib/auth/credential-access'
import { validateJiraCloudId } from '@/lib/core/security/input-validation'
import { generateRequestId } from '@/lib/core/utils/request'
import { refreshAccessTokenIfNeeded } from '@/app/api/auth/oauth/utils'
import { getConfluenceCloudId } from '@/tools/confluence/utils'
const logger = createLogger('ConfluenceSelectorSpacesAPI')
export const dynamic = 'force-dynamic'
export async function POST(request: Request) {
const requestId = generateRequestId()
try {
const body = await request.json()
const { credential, workflowId, domain } = body
if (!credential) {
logger.error('Missing credential in request')
return NextResponse.json({ error: 'Credential is required' }, { status: 400 })
}
if (!domain) {
return NextResponse.json({ error: 'Domain is required' }, { status: 400 })
}
const authz = await authorizeCredentialUse(request as any, {
credentialId: credential,
workflowId,
})
if (!authz.ok || !authz.credentialOwnerUserId) {
return NextResponse.json({ error: authz.error || 'Unauthorized' }, { status: 403 })
}
const accessToken = await refreshAccessTokenIfNeeded(
credential,
authz.credentialOwnerUserId,
requestId
)
if (!accessToken) {
logger.error('Failed to get access token', {
credentialId: credential,
userId: authz.credentialOwnerUserId,
})
return NextResponse.json(
{ error: 'Could not retrieve access token', authRequired: true },
{ status: 401 }
)
}
const cloudId = await getConfluenceCloudId(domain, accessToken)
const cloudIdValidation = validateJiraCloudId(cloudId, 'cloudId')
if (!cloudIdValidation.isValid) {
return NextResponse.json({ error: cloudIdValidation.error }, { status: 400 })
}
const url = `https://api.atlassian.com/ex/confluence/${cloudIdValidation.sanitized}/wiki/api/v2/spaces?limit=250`
const response = await fetch(url, {
method: 'GET',
headers: {
Accept: 'application/json',
Authorization: `Bearer ${accessToken}`,
},
})
if (!response.ok) {
const errorData = await response.json().catch(() => null)
logger.error('Confluence API error:', {
status: response.status,
statusText: response.statusText,
error: errorData,
})
const errorMessage =
errorData?.message || `Failed to list Confluence spaces (${response.status})`
return NextResponse.json({ error: errorMessage }, { status: response.status })
}
const data = await response.json()
const spaces = (data.results || []).map((space: { id: string; name: string; key: string }) => ({
id: space.id,
name: space.name,
key: space.key,
}))
return NextResponse.json({ spaces })
} catch (error) {
logger.error('Error listing Confluence spaces:', error)
return NextResponse.json(
{ error: (error as Error).message || 'Internal server error' },
{ status: 500 }
)
}
}

@@ -0,0 +1,100 @@
import { createLogger } from '@sim/logger'
import { NextResponse } from 'next/server'
import { authorizeCredentialUse } from '@/lib/auth/credential-access'
import { generateRequestId } from '@/lib/core/utils/request'
import { refreshAccessTokenIfNeeded } from '@/app/api/auth/oauth/utils'
const logger = createLogger('GoogleBigQueryDatasetsAPI')
export const dynamic = 'force-dynamic'
/**
* POST /api/tools/google_bigquery/datasets
*
* Fetches the list of BigQuery datasets for a given project using the caller's OAuth credential.
*
* @param request - Incoming request containing `credential`, `workflowId`, and `projectId` in the JSON body
* @returns JSON response with a `datasets` array, each entry containing `datasetReference` and optional `friendlyName`
*/
export async function POST(request: Request) {
const requestId = generateRequestId()
try {
const body = await request.json()
const { credential, workflowId, projectId } = body
if (!credential) {
logger.error('Missing credential in request')
return NextResponse.json({ error: 'Credential is required' }, { status: 400 })
}
if (!projectId) {
logger.error('Missing project ID in request')
return NextResponse.json({ error: 'Project ID is required' }, { status: 400 })
}
const authz = await authorizeCredentialUse(request as any, {
credentialId: credential,
workflowId,
})
if (!authz.ok || !authz.credentialOwnerUserId) {
return NextResponse.json({ error: authz.error || 'Unauthorized' }, { status: 403 })
}
const accessToken = await refreshAccessTokenIfNeeded(
credential,
authz.credentialOwnerUserId,
requestId
)
if (!accessToken) {
logger.error('Failed to get access token', {
credentialId: credential,
userId: authz.credentialOwnerUserId,
})
return NextResponse.json(
{ error: 'Could not retrieve access token', authRequired: true },
{ status: 401 }
)
}
const response = await fetch(
`https://bigquery.googleapis.com/bigquery/v2/projects/${encodeURIComponent(projectId)}/datasets?maxResults=200`,
{
headers: {
Authorization: `Bearer ${accessToken}`,
'Content-Type': 'application/json',
},
}
)
if (!response.ok) {
const errorData = await response.json().catch(() => ({}))
logger.error('Failed to fetch BigQuery datasets', {
status: response.status,
error: errorData,
})
return NextResponse.json(
{ error: 'Failed to fetch BigQuery datasets', details: errorData },
{ status: response.status }
)
}
const data = await response.json()
const datasets = (data.datasets || []).map(
(ds: {
datasetReference: { datasetId: string; projectId: string }
friendlyName?: string
}) => ({
datasetReference: ds.datasetReference,
friendlyName: ds.friendlyName,
})
)
return NextResponse.json({ datasets })
} catch (error) {
logger.error('Error processing BigQuery datasets request:', error)
return NextResponse.json(
{ error: 'Failed to retrieve BigQuery datasets', details: (error as Error).message },
{ status: 500 }
)
}
}
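Because `projectId` comes from the request body, the route above percent-encodes it before interpolating it into the BigQuery REST URL. A minimal sketch of that construction, with an illustrative helper name that is not part of the PR:

```typescript
// Illustrative helper: build the BigQuery v2 datasets listing URL,
// percent-encoding the caller-supplied project ID as the route above does.
function bigQueryDatasetsUrl(projectId: string): string {
  const encoded = encodeURIComponent(projectId)
  return `https://bigquery.googleapis.com/bigquery/v2/projects/${encoded}/datasets?maxResults=200`
}
```

Encoding matters here: a hostile `projectId` like `a/b?x=1` would otherwise rewrite the path and query string of the outbound request.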

@@ -0,0 +1,94 @@
import { createLogger } from '@sim/logger'
import { NextResponse } from 'next/server'
import { authorizeCredentialUse } from '@/lib/auth/credential-access'
import { generateRequestId } from '@/lib/core/utils/request'
import { refreshAccessTokenIfNeeded } from '@/app/api/auth/oauth/utils'
const logger = createLogger('GoogleBigQueryTablesAPI')
export const dynamic = 'force-dynamic'
export async function POST(request: Request) {
const requestId = generateRequestId()
try {
const body = await request.json()
const { credential, workflowId, projectId, datasetId } = body
if (!credential) {
logger.error('Missing credential in request')
return NextResponse.json({ error: 'Credential is required' }, { status: 400 })
}
if (!projectId) {
logger.error('Missing project ID in request')
return NextResponse.json({ error: 'Project ID is required' }, { status: 400 })
}
if (!datasetId) {
logger.error('Missing dataset ID in request')
return NextResponse.json({ error: 'Dataset ID is required' }, { status: 400 })
}
const authz = await authorizeCredentialUse(request as any, {
credentialId: credential,
workflowId,
})
if (!authz.ok || !authz.credentialOwnerUserId) {
return NextResponse.json({ error: authz.error || 'Unauthorized' }, { status: 403 })
}
const accessToken = await refreshAccessTokenIfNeeded(
credential,
authz.credentialOwnerUserId,
requestId
)
if (!accessToken) {
logger.error('Failed to get access token', {
credentialId: credential,
userId: authz.credentialOwnerUserId,
})
return NextResponse.json(
{ error: 'Could not retrieve access token', authRequired: true },
{ status: 401 }
)
}
const response = await fetch(
`https://bigquery.googleapis.com/bigquery/v2/projects/${encodeURIComponent(projectId)}/datasets/${encodeURIComponent(datasetId)}/tables?maxResults=200`,
{
headers: {
Authorization: `Bearer ${accessToken}`,
'Content-Type': 'application/json',
},
}
)
if (!response.ok) {
const errorData = await response.json().catch(() => ({}))
logger.error('Failed to fetch BigQuery tables', {
status: response.status,
error: errorData,
})
return NextResponse.json(
{ error: 'Failed to fetch BigQuery tables', details: errorData },
{ status: response.status }
)
}
const data = await response.json()
const tables = (data.tables || []).map(
(t: { tableReference: { tableId: string }; friendlyName?: string }) => ({
tableReference: t.tableReference,
friendlyName: t.friendlyName,
})
)
return NextResponse.json({ tables })
} catch (error) {
logger.error('Error processing BigQuery tables request:', error)
return NextResponse.json(
{ error: 'Failed to retrieve BigQuery tables', details: (error as Error).message },
{ status: 500 }
)
}
}

@@ -0,0 +1,79 @@
import { createLogger } from '@sim/logger'
import { NextResponse } from 'next/server'
import { authorizeCredentialUse } from '@/lib/auth/credential-access'
import { generateRequestId } from '@/lib/core/utils/request'
import { refreshAccessTokenIfNeeded } from '@/app/api/auth/oauth/utils'
const logger = createLogger('GoogleTasksTaskListsAPI')
export const dynamic = 'force-dynamic'
export async function POST(request: Request) {
const requestId = generateRequestId()
try {
const body = await request.json()
const { credential, workflowId } = body
if (!credential) {
logger.error('Missing credential in request')
return NextResponse.json({ error: 'Credential is required' }, { status: 400 })
}
const authz = await authorizeCredentialUse(request as any, {
credentialId: credential,
workflowId,
})
if (!authz.ok || !authz.credentialOwnerUserId) {
return NextResponse.json({ error: authz.error || 'Unauthorized' }, { status: 403 })
}
const accessToken = await refreshAccessTokenIfNeeded(
credential,
authz.credentialOwnerUserId,
requestId
)
if (!accessToken) {
logger.error('Failed to get access token', {
credentialId: credential,
userId: authz.credentialOwnerUserId,
})
return NextResponse.json(
{ error: 'Could not retrieve access token', authRequired: true },
{ status: 401 }
)
}
const response = await fetch('https://tasks.googleapis.com/tasks/v1/users/@me/lists', {
headers: {
Authorization: `Bearer ${accessToken}`,
'Content-Type': 'application/json',
},
})
if (!response.ok) {
const errorData = await response.json().catch(() => ({}))
logger.error('Failed to fetch Google Tasks task lists', {
status: response.status,
error: errorData,
})
return NextResponse.json(
{ error: 'Failed to fetch Google Tasks task lists', details: errorData },
{ status: response.status }
)
}
const data = await response.json()
const taskLists = (data.items || []).map((list: { id: string; title: string }) => ({
id: list.id,
title: list.title,
}))
return NextResponse.json({ taskLists })
} catch (error) {
logger.error('Error processing Google Tasks task lists request:', error)
return NextResponse.json(
{ error: 'Failed to retrieve Google Tasks task lists', details: (error as Error).message },
{ status: 500 }
)
}
}

@@ -0,0 +1,103 @@
import { createLogger } from '@sim/logger'
import { NextResponse } from 'next/server'
import { authorizeCredentialUse } from '@/lib/auth/credential-access'
import { validateAlphanumericId, validateJiraCloudId } from '@/lib/core/security/input-validation'
import { generateRequestId } from '@/lib/core/utils/request'
import { refreshAccessTokenIfNeeded } from '@/app/api/auth/oauth/utils'
import { getJiraCloudId, getJsmApiBaseUrl, getJsmHeaders } from '@/tools/jsm/utils'
const logger = createLogger('JsmSelectorRequestTypesAPI')
export const dynamic = 'force-dynamic'
export async function POST(request: Request) {
const requestId = generateRequestId()
try {
const body = await request.json()
const { credential, workflowId, domain, serviceDeskId } = body
if (!credential) {
logger.error('Missing credential in request')
return NextResponse.json({ error: 'Credential is required' }, { status: 400 })
}
if (!domain) {
return NextResponse.json({ error: 'Domain is required' }, { status: 400 })
}
if (!serviceDeskId) {
return NextResponse.json({ error: 'Service Desk ID is required' }, { status: 400 })
}
const serviceDeskIdValidation = validateAlphanumericId(serviceDeskId, 'serviceDeskId')
if (!serviceDeskIdValidation.isValid) {
return NextResponse.json({ error: serviceDeskIdValidation.error }, { status: 400 })
}
const authz = await authorizeCredentialUse(request as any, {
credentialId: credential,
workflowId,
})
if (!authz.ok || !authz.credentialOwnerUserId) {
return NextResponse.json({ error: authz.error || 'Unauthorized' }, { status: 403 })
}
const accessToken = await refreshAccessTokenIfNeeded(
credential,
authz.credentialOwnerUserId,
requestId
)
if (!accessToken) {
logger.error('Failed to get access token', {
credentialId: credential,
userId: authz.credentialOwnerUserId,
})
return NextResponse.json(
{ error: 'Could not retrieve access token', authRequired: true },
{ status: 401 }
)
}
const cloudId = await getJiraCloudId(domain, accessToken)
const cloudIdValidation = validateJiraCloudId(cloudId, 'cloudId')
if (!cloudIdValidation.isValid) {
return NextResponse.json({ error: cloudIdValidation.error }, { status: 400 })
}
const baseUrl = getJsmApiBaseUrl(cloudIdValidation.sanitized!)
const url = `${baseUrl}/servicedesk/${serviceDeskIdValidation.sanitized}/requesttype?limit=100`
const response = await fetch(url, {
method: 'GET',
headers: getJsmHeaders(accessToken),
})
if (!response.ok) {
const errorText = await response.text()
logger.error('JSM API error:', {
status: response.status,
statusText: response.statusText,
error: errorText,
})
return NextResponse.json(
{ error: `JSM API error: ${response.status} ${response.statusText}` },
{ status: response.status }
)
}
const data = await response.json()
const requestTypes = (data.values || []).map((rt: { id: string; name: string }) => ({
id: rt.id,
name: rt.name,
}))
return NextResponse.json({ requestTypes })
} catch (error) {
logger.error('Error listing JSM request types:', error)
return NextResponse.json(
{ error: (error as Error).message || 'Internal server error' },
{ status: 500 }
)
}
}

@@ -0,0 +1,94 @@
import { createLogger } from '@sim/logger'
import { NextResponse } from 'next/server'
import { authorizeCredentialUse } from '@/lib/auth/credential-access'
import { validateJiraCloudId } from '@/lib/core/security/input-validation'
import { generateRequestId } from '@/lib/core/utils/request'
import { refreshAccessTokenIfNeeded } from '@/app/api/auth/oauth/utils'
import { getJiraCloudId, getJsmApiBaseUrl, getJsmHeaders } from '@/tools/jsm/utils'
const logger = createLogger('JsmSelectorServiceDesksAPI')
export const dynamic = 'force-dynamic'
export async function POST(request: Request) {
const requestId = generateRequestId()
try {
const body = await request.json()
const { credential, workflowId, domain } = body
if (!credential) {
logger.error('Missing credential in request')
return NextResponse.json({ error: 'Credential is required' }, { status: 400 })
}
if (!domain) {
return NextResponse.json({ error: 'Domain is required' }, { status: 400 })
}
const authz = await authorizeCredentialUse(request as any, {
credentialId: credential,
workflowId,
})
if (!authz.ok || !authz.credentialOwnerUserId) {
return NextResponse.json({ error: authz.error || 'Unauthorized' }, { status: 403 })
}
const accessToken = await refreshAccessTokenIfNeeded(
credential,
authz.credentialOwnerUserId,
requestId
)
if (!accessToken) {
logger.error('Failed to get access token', {
credentialId: credential,
userId: authz.credentialOwnerUserId,
})
return NextResponse.json(
{ error: 'Could not retrieve access token', authRequired: true },
{ status: 401 }
)
}
const cloudId = await getJiraCloudId(domain, accessToken)
const cloudIdValidation = validateJiraCloudId(cloudId, 'cloudId')
if (!cloudIdValidation.isValid) {
return NextResponse.json({ error: cloudIdValidation.error }, { status: 400 })
}
const baseUrl = getJsmApiBaseUrl(cloudIdValidation.sanitized!)
const url = `${baseUrl}/servicedesk?limit=100`
const response = await fetch(url, {
method: 'GET',
headers: getJsmHeaders(accessToken),
})
if (!response.ok) {
const errorText = await response.text()
logger.error('JSM API error:', {
status: response.status,
statusText: response.statusText,
error: errorText,
})
return NextResponse.json(
{ error: `JSM API error: ${response.status} ${response.statusText}` },
{ status: response.status }
)
}
const data = await response.json()
const serviceDesks = (data.values || []).map((sd: { id: string; projectName: string }) => ({
id: sd.id,
name: sd.projectName,
}))
return NextResponse.json({ serviceDesks })
} catch (error) {
logger.error('Error listing JSM service desks:', error)
return NextResponse.json(
{ error: (error as Error).message || 'Internal server error' },
{ status: 500 }
)
}
}

@@ -0,0 +1,72 @@
import { createLogger } from '@sim/logger'
import { NextResponse } from 'next/server'
import { authorizeCredentialUse } from '@/lib/auth/credential-access'
import { generateRequestId } from '@/lib/core/utils/request'
import { refreshAccessTokenIfNeeded } from '@/app/api/auth/oauth/utils'
const logger = createLogger('MicrosoftPlannerPlansAPI')
export const dynamic = 'force-dynamic'
export async function POST(request: Request) {
const requestId = generateRequestId()
try {
const body = await request.json()
const { credential, workflowId } = body
if (!credential) {
logger.error(`[${requestId}] Missing credential in request`)
return NextResponse.json({ error: 'Credential is required' }, { status: 400 })
}
const authz = await authorizeCredentialUse(request as any, {
credentialId: credential,
workflowId,
})
if (!authz.ok || !authz.credentialOwnerUserId) {
return NextResponse.json({ error: authz.error || 'Unauthorized' }, { status: 403 })
}
const accessToken = await refreshAccessTokenIfNeeded(
credential,
authz.credentialOwnerUserId,
requestId
)
if (!accessToken) {
logger.error(`[${requestId}] Failed to obtain valid access token`)
return NextResponse.json(
{ error: 'Failed to obtain valid access token', authRequired: true },
{ status: 401 }
)
}
const response = await fetch('https://graph.microsoft.com/v1.0/me/planner/plans', {
headers: {
Authorization: `Bearer ${accessToken}`,
},
})
if (!response.ok) {
const errorText = await response.text()
logger.error(`[${requestId}] Microsoft Graph API error:`, errorText)
return NextResponse.json(
{ error: 'Failed to fetch plans from Microsoft Graph' },
{ status: response.status }
)
}
const data = await response.json()
const plans = data.value || []
const filteredPlans = plans.map((plan: { id: string; title: string }) => ({
id: plan.id,
title: plan.title,
}))
return NextResponse.json({ plans: filteredPlans })
} catch (error) {
logger.error(`[${requestId}] Error fetching Microsoft Planner plans:`, error)
return NextResponse.json({ error: 'Failed to fetch plans' }, { status: 500 })
}
}

@@ -1,38 +1,29 @@
-import { randomUUID } from 'crypto'
-import { db } from '@sim/db'
-import { account } from '@sim/db/schema'
 import { createLogger } from '@sim/logger'
-import { eq } from 'drizzle-orm'
-import { type NextRequest, NextResponse } from 'next/server'
-import { getSession } from '@/lib/auth'
+import { NextResponse } from 'next/server'
+import { authorizeCredentialUse } from '@/lib/auth/credential-access'
 import { validateMicrosoftGraphId } from '@/lib/core/security/input-validation'
-import { refreshAccessTokenIfNeeded, resolveOAuthAccountId } from '@/app/api/auth/oauth/utils'
+import { generateRequestId } from '@/lib/core/utils/request'
+import { refreshAccessTokenIfNeeded } from '@/app/api/auth/oauth/utils'
 import type { PlannerTask } from '@/tools/microsoft_planner/types'
 const logger = createLogger('MicrosoftPlannerTasksAPI')
-export async function GET(request: NextRequest) {
-  const requestId = randomUUID().slice(0, 8)
+export const dynamic = 'force-dynamic'
+export async function POST(request: Request) {
+  const requestId = generateRequestId()
   try {
-    const session = await getSession()
+    const body = await request.json()
+    const { credential, workflowId, planId } = body
-    if (!session?.user?.id) {
-      logger.warn(`[${requestId}] Unauthenticated request rejected`)
-      return NextResponse.json({ error: 'User not authenticated' }, { status: 401 })
-    }
-    const { searchParams } = new URL(request.url)
-    const credentialId = searchParams.get('credentialId')
-    const planId = searchParams.get('planId')
-    if (!credentialId) {
-      logger.error(`[${requestId}] Missing credentialId parameter`)
-      return NextResponse.json({ error: 'Credential ID is required' }, { status: 400 })
+    if (!credential) {
+      logger.error(`[${requestId}] Missing credential in request`)
+      return NextResponse.json({ error: 'Credential is required' }, { status: 400 })
     }
     if (!planId) {
-      logger.error(`[${requestId}] Missing planId parameter`)
+      logger.error(`[${requestId}] Missing planId in request`)
       return NextResponse.json({ error: 'Plan ID is required' }, { status: 400 })
     }
@@ -42,52 +33,35 @@ export async function GET(request: NextRequest) {
       return NextResponse.json({ error: planIdValidation.error }, { status: 400 })
     }
-    const resolved = await resolveOAuthAccountId(credentialId)
-    if (!resolved) {
-      return NextResponse.json({ error: 'Credential not found' }, { status: 404 })
+    const authz = await authorizeCredentialUse(request as any, {
+      credentialId: credential,
+      workflowId,
+    })
+    if (!authz.ok || !authz.credentialOwnerUserId) {
+      return NextResponse.json({ error: authz.error || 'Unauthorized' }, { status: 403 })
     }
-    if (resolved.workspaceId) {
-      const { getUserEntityPermissions } = await import('@/lib/workspaces/permissions/utils')
-      const perm = await getUserEntityPermissions(
-        session.user.id,
-        'workspace',
-        resolved.workspaceId
-      )
-      if (perm === null) {
-        return NextResponse.json({ error: 'Forbidden' }, { status: 403 })
-      }
-    }
-    const credentials = await db
-      .select()
-      .from(account)
-      .where(eq(account.id, resolved.accountId))
-      .limit(1)
-    if (!credentials.length) {
-      logger.warn(`[${requestId}] Credential not found`, { credentialId })
-      return NextResponse.json({ error: 'Credential not found' }, { status: 404 })
-    }
-    const accountRow = credentials[0]
     const accessToken = await refreshAccessTokenIfNeeded(
-      resolved.accountId,
-      accountRow.userId,
+      credential,
+      authz.credentialOwnerUserId,
       requestId
     )
     if (!accessToken) {
       logger.error(`[${requestId}] Failed to obtain valid access token`)
-      return NextResponse.json({ error: 'Failed to obtain valid access token' }, { status: 401 })
+      return NextResponse.json(
+        { error: 'Failed to obtain valid access token', authRequired: true },
+        { status: 401 }
+      )
     }
-    const response = await fetch(`https://graph.microsoft.com/v1.0/planner/plans/${planId}/tasks`, {
-      headers: {
-        Authorization: `Bearer ${accessToken}`,
-      },
-    })
+    const response = await fetch(
+      `https://graph.microsoft.com/v1.0/planner/plans/${planIdValidation.sanitized}/tasks`,
+      {
+        headers: {
+          Authorization: `Bearer ${accessToken}`,
+        },
+      }
+    )
     if (!response.ok) {
       const errorText = await response.text()

@@ -0,0 +1,86 @@
import { createLogger } from '@sim/logger'
import { NextResponse } from 'next/server'
import { authorizeCredentialUse } from '@/lib/auth/credential-access'
import { generateRequestId } from '@/lib/core/utils/request'
import { refreshAccessTokenIfNeeded } from '@/app/api/auth/oauth/utils'
import { extractTitleFromItem } from '@/tools/notion/utils'
const logger = createLogger('NotionDatabasesAPI')
export const dynamic = 'force-dynamic'
export async function POST(request: Request) {
const requestId = generateRequestId()
try {
const body = await request.json()
const { credential, workflowId } = body
if (!credential) {
logger.error('Missing credential in request')
return NextResponse.json({ error: 'Credential is required' }, { status: 400 })
}
const authz = await authorizeCredentialUse(request as any, {
credentialId: credential,
workflowId,
})
if (!authz.ok || !authz.credentialOwnerUserId) {
return NextResponse.json({ error: authz.error || 'Unauthorized' }, { status: 403 })
}
const accessToken = await refreshAccessTokenIfNeeded(
credential,
authz.credentialOwnerUserId,
requestId
)
if (!accessToken) {
logger.error('Failed to get access token', {
credentialId: credential,
userId: authz.credentialOwnerUserId,
})
return NextResponse.json(
{ error: 'Could not retrieve access token', authRequired: true },
{ status: 401 }
)
}
const response = await fetch('https://api.notion.com/v1/search', {
method: 'POST',
headers: {
Authorization: `Bearer ${accessToken}`,
'Content-Type': 'application/json',
'Notion-Version': '2022-06-28',
},
body: JSON.stringify({
filter: { value: 'database', property: 'object' },
page_size: 100,
}),
})
if (!response.ok) {
const errorData = await response.json().catch(() => ({}))
logger.error('Failed to fetch Notion databases', {
status: response.status,
error: errorData,
})
return NextResponse.json(
{ error: 'Failed to fetch Notion databases', details: errorData },
{ status: response.status }
)
}
const data = await response.json()
const databases = (data.results || []).map((db: Record<string, unknown>) => ({
id: db.id as string,
name: extractTitleFromItem(db),
}))
return NextResponse.json({ databases })
} catch (error) {
logger.error('Error processing Notion databases request:', error)
return NextResponse.json(
{ error: 'Failed to retrieve Notion databases', details: (error as Error).message },
{ status: 500 }
)
}
}
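The mapping step above leans on `extractTitleFromItem` to turn each Notion search result into a display name. A minimal sketch of that kind of helper, assuming the standard Notion shape where database objects carry a `title` rich-text array (the real implementation in `@/tools/notion/utils` may differ):

```typescript
// Hypothetical sketch, not the repo's implementation: join the plain_text
// fragments of a Notion object's `title` array, falling back to 'Untitled'.
interface NotionRichText {
  plain_text: string
}

function extractTitleSketch(item: Record<string, unknown>): string {
  const title = item.title
  if (Array.isArray(title)) {
    return (title as NotionRichText[]).map((t) => t.plain_text).join('') || 'Untitled'
  }
  return 'Untitled'
}

const db = { object: 'database', id: 'abc', title: [{ plain_text: 'Tasks ' }, { plain_text: 'DB' }] }
const name = extractTitleSketch(db)
```

Page objects nest their title under `properties`, so a production helper has to handle both shapes; the sketch covers only the database case.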

View File

@@ -0,0 +1,86 @@
import { createLogger } from '@sim/logger'
import { NextResponse } from 'next/server'
import { authorizeCredentialUse } from '@/lib/auth/credential-access'
import { generateRequestId } from '@/lib/core/utils/request'
import { refreshAccessTokenIfNeeded } from '@/app/api/auth/oauth/utils'
import { extractTitleFromItem } from '@/tools/notion/utils'
const logger = createLogger('NotionPagesAPI')
export const dynamic = 'force-dynamic'
export async function POST(request: Request) {
const requestId = generateRequestId()
try {
const body = await request.json()
const { credential, workflowId } = body
if (!credential) {
logger.error('Missing credential in request')
return NextResponse.json({ error: 'Credential is required' }, { status: 400 })
}
const authz = await authorizeCredentialUse(request as any, {
credentialId: credential,
workflowId,
})
if (!authz.ok || !authz.credentialOwnerUserId) {
return NextResponse.json({ error: authz.error || 'Unauthorized' }, { status: 403 })
}
const accessToken = await refreshAccessTokenIfNeeded(
credential,
authz.credentialOwnerUserId,
requestId
)
if (!accessToken) {
logger.error('Failed to get access token', {
credentialId: credential,
userId: authz.credentialOwnerUserId,
})
return NextResponse.json(
{ error: 'Could not retrieve access token', authRequired: true },
{ status: 401 }
)
}
const response = await fetch('https://api.notion.com/v1/search', {
method: 'POST',
headers: {
Authorization: `Bearer ${accessToken}`,
'Content-Type': 'application/json',
'Notion-Version': '2022-06-28',
},
body: JSON.stringify({
filter: { value: 'page', property: 'object' },
page_size: 100,
}),
})
if (!response.ok) {
const errorData = await response.json().catch(() => ({}))
logger.error('Failed to fetch Notion pages', {
status: response.status,
error: errorData,
})
return NextResponse.json(
{ error: 'Failed to fetch Notion pages', details: errorData },
{ status: response.status }
)
}
const data = await response.json()
const pages = (data.results || []).map((page: Record<string, unknown>) => ({
id: page.id as string,
name: extractTitleFromItem(page),
}))
return NextResponse.json({ pages })
} catch (error) {
logger.error('Error processing Notion pages request:', error)
return NextResponse.json(
{ error: 'Failed to retrieve Notion pages', details: (error as Error).message },
{ status: 500 }
)
}
}

View File

@@ -0,0 +1,79 @@
import { createLogger } from '@sim/logger'
import { NextResponse } from 'next/server'
import { authorizeCredentialUse } from '@/lib/auth/credential-access'
import { generateRequestId } from '@/lib/core/utils/request'
import { refreshAccessTokenIfNeeded } from '@/app/api/auth/oauth/utils'
const logger = createLogger('PipedrivePipelinesAPI')
export const dynamic = 'force-dynamic'
export async function POST(request: Request) {
const requestId = generateRequestId()
try {
const body = await request.json()
const { credential, workflowId } = body
if (!credential) {
logger.error('Missing credential in request')
return NextResponse.json({ error: 'Credential is required' }, { status: 400 })
}
const authz = await authorizeCredentialUse(request as any, {
credentialId: credential,
workflowId,
})
if (!authz.ok || !authz.credentialOwnerUserId) {
return NextResponse.json({ error: authz.error || 'Unauthorized' }, { status: 403 })
}
const accessToken = await refreshAccessTokenIfNeeded(
credential,
authz.credentialOwnerUserId,
requestId
)
if (!accessToken) {
logger.error('Failed to get access token', {
credentialId: credential,
userId: authz.credentialOwnerUserId,
})
return NextResponse.json(
{ error: 'Could not retrieve access token', authRequired: true },
{ status: 401 }
)
}
const response = await fetch('https://api.pipedrive.com/v1/pipelines', {
headers: {
Authorization: `Bearer ${accessToken}`,
'Content-Type': 'application/json',
},
})
if (!response.ok) {
const errorData = await response.json().catch(() => ({}))
logger.error('Failed to fetch Pipedrive pipelines', {
status: response.status,
error: errorData,
})
return NextResponse.json(
{ error: 'Failed to fetch Pipedrive pipelines', details: errorData },
{ status: response.status }
)
}
const data = await response.json()
const pipelines = (data.data || []).map((pipeline: { id: number; name: string }) => ({
id: String(pipeline.id),
name: pipeline.name,
}))
return NextResponse.json({ pipelines })
} catch (error) {
logger.error('Error processing Pipedrive pipelines request:', error)
return NextResponse.json(
{ error: 'Failed to retrieve Pipedrive pipelines', details: (error as Error).message },
{ status: 500 }
)
}
}

View File

@@ -0,0 +1,91 @@
import { createLogger } from '@sim/logger'
import { NextResponse } from 'next/server'
import { authorizeCredentialUse } from '@/lib/auth/credential-access'
import { validateSharePointSiteId } from '@/lib/core/security/input-validation'
import { generateRequestId } from '@/lib/core/utils/request'
import { refreshAccessTokenIfNeeded } from '@/app/api/auth/oauth/utils'
export const dynamic = 'force-dynamic'
const logger = createLogger('SharePointListsAPI')
interface SharePointList {
id: string
displayName: string
description?: string
webUrl?: string
list?: {
hidden?: boolean
}
}
export async function POST(request: Request) {
const requestId = generateRequestId()
try {
const body = await request.json()
const { credential, workflowId, siteId } = body
if (!credential) {
logger.error(`[${requestId}] Missing credential in request`)
return NextResponse.json({ error: 'Credential is required' }, { status: 400 })
}
const siteIdValidation = validateSharePointSiteId(siteId)
if (!siteIdValidation.isValid) {
logger.error(`[${requestId}] Invalid siteId: ${siteIdValidation.error}`)
return NextResponse.json({ error: siteIdValidation.error }, { status: 400 })
}
const authz = await authorizeCredentialUse(request as any, {
credentialId: credential,
workflowId,
})
if (!authz.ok || !authz.credentialOwnerUserId) {
return NextResponse.json({ error: authz.error || 'Unauthorized' }, { status: 403 })
}
const accessToken = await refreshAccessTokenIfNeeded(
credential,
authz.credentialOwnerUserId,
requestId
)
if (!accessToken) {
logger.error(`[${requestId}] Failed to obtain valid access token`)
return NextResponse.json(
{ error: 'Failed to obtain valid access token', authRequired: true },
{ status: 401 }
)
}
const url = `https://graph.microsoft.com/v1.0/sites/${siteIdValidation.sanitized}/lists?$select=id,displayName,description,webUrl&$expand=list($select=hidden)&$top=100`
const response = await fetch(url, {
headers: {
Authorization: `Bearer ${accessToken}`,
},
})
if (!response.ok) {
const errorData = await response.json().catch(() => ({ error: { message: 'Unknown error' } }))
return NextResponse.json(
{ error: errorData.error?.message || 'Failed to fetch lists from SharePoint' },
{ status: response.status }
)
}
const data = await response.json()
const lists = (data.value || [])
.filter((list: SharePointList) => list.list?.hidden !== true)
.map((list: SharePointList) => ({
id: list.id,
displayName: list.displayName,
}))
logger.info(`[${requestId}] Successfully fetched ${lists.length} SharePoint lists`)
return NextResponse.json({ lists }, { status: 200 })
} catch (error) {
logger.error(`[${requestId}] Error fetching lists from SharePoint`, error)
return NextResponse.json({ error: 'Internal server error' }, { status: 500 })
}
}

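The filter above keeps SharePoint lists whose `list.hidden` flag is false or absent; the comparison against `!== true` matters because `$expand=list($select=hidden)` may omit the field entirely. The same step as a standalone sketch:

```typescript
// Sketch of the hidden-list filter/map from the route above as a pure function.
interface SharePointList {
  id: string
  displayName: string
  list?: { hidden?: boolean }
}

function visibleLists(value: SharePointList[]): Array<{ id: string; displayName: string }> {
  return value
    .filter((list) => list.list?.hidden !== true) // keep lists where hidden is false or absent
    .map(({ id, displayName }) => ({ id, displayName }))
}

const result = visibleLists([
  { id: '1', displayName: 'Hidden', list: { hidden: true } },
  { id: '2', displayName: 'Docs' },
  { id: '3', displayName: 'Tasks', list: { hidden: false } },
])
```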
View File

@@ -1,79 +1,45 @@
import { createLogger } from '@sim/logger'
import { NextResponse } from 'next/server'
import { authorizeCredentialUse } from '@/lib/auth/credential-access'
import { generateRequestId } from '@/lib/core/utils/request'
import { refreshAccessTokenIfNeeded } from '@/app/api/auth/oauth/utils'
import type { SharepointSite } from '@/tools/sharepoint/types'
export const dynamic = 'force-dynamic'
const logger = createLogger('SharePointSitesAPI')
/**
* Get SharePoint sites from Microsoft Graph API
*/
export async function POST(request: Request) {
const requestId = generateRequestId()
try {
const body = await request.json()
const { credential, workflowId, query } = body
if (!credential) {
logger.error(`[${requestId}] Missing credential in request`)
return NextResponse.json({ error: 'Credential is required' }, { status: 400 })
}
const authz = await authorizeCredentialUse(request as any, {
credentialId: credential,
workflowId,
})
if (!authz.ok || !authz.credentialOwnerUserId) {
return NextResponse.json({ error: authz.error || 'Unauthorized' }, { status: 403 })
}
const accessToken = await refreshAccessTokenIfNeeded(
credential,
authz.credentialOwnerUserId,
requestId
)
if (!accessToken) {
logger.error(`[${requestId}] Failed to obtain valid access token`)
return NextResponse.json(
{ error: 'Failed to obtain valid access token', authRequired: true },
{ status: 401 }
)
}
const searchQuery = query || '*'

View File

@@ -0,0 +1,87 @@
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { checkInternalAuth } from '@/lib/auth/hybrid'
export const dynamic = 'force-dynamic'
const SlackRemoveReactionSchema = z.object({
accessToken: z.string().min(1, 'Access token is required'),
channel: z.string().min(1, 'Channel is required'),
timestamp: z.string().min(1, 'Message timestamp is required'),
name: z.string().min(1, 'Emoji name is required'),
})
export async function POST(request: NextRequest) {
try {
const authResult = await checkInternalAuth(request, { requireWorkflowId: false })
if (!authResult.success) {
return NextResponse.json(
{
success: false,
error: authResult.error || 'Authentication required',
},
{ status: 401 }
)
}
const body = await request.json()
const validatedData = SlackRemoveReactionSchema.parse(body)
const slackResponse = await fetch('https://slack.com/api/reactions.remove', {
method: 'POST',
headers: {
'Content-Type': 'application/json',
Authorization: `Bearer ${validatedData.accessToken}`,
},
body: JSON.stringify({
channel: validatedData.channel,
timestamp: validatedData.timestamp,
name: validatedData.name,
}),
})
const data = await slackResponse.json()
if (!data.ok) {
return NextResponse.json(
{
success: false,
error: data.error || 'Failed to remove reaction',
},
{ status: slackResponse.status }
)
}
return NextResponse.json({
success: true,
output: {
content: `Successfully removed :${validatedData.name}: reaction`,
metadata: {
channel: validatedData.channel,
timestamp: validatedData.timestamp,
reaction: validatedData.name,
},
},
})
} catch (error) {
if (error instanceof z.ZodError) {
return NextResponse.json(
{
success: false,
error: 'Invalid request data',
details: error.errors,
},
{ status: 400 }
)
}
return NextResponse.json(
{
success: false,
error: error instanceof Error ? error.message : 'Unknown error occurred',
},
{ status: 500 }
)
}
}
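The Zod schema above rejects any missing or empty field before the Slack call is made, so the handler never forwards a malformed payload. The same guard sketched without the library (illustrative only; the route itself uses Zod and returns `error.errors` as `details`):

```typescript
// Plain-TypeScript equivalent of the SlackRemoveReactionSchema guard:
// every field must be a non-empty string.
interface RemoveReactionBody {
  accessToken: string
  channel: string
  timestamp: string
  name: string
}

type ValidationResult = { ok: true; data: RemoveReactionBody } | { ok: false; error: string }

function validateBody(body: Partial<RemoveReactionBody>): ValidationResult {
  for (const field of ['accessToken', 'channel', 'timestamp', 'name'] as const) {
    const value = body[field]
    if (typeof value !== 'string' || value.length === 0) {
      return { ok: false, error: `${field} is required` }
    }
  }
  return { ok: true, data: body as RemoveReactionBody }
}
```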

View File

@@ -150,6 +150,7 @@ export async function POST(request: NextRequest) {
method: 'GET',
})
if (!response.ok) {
await response.text().catch(() => {})
throw new Error(`Failed to download audio from URL: ${response.statusText}`)
}

View File

@@ -135,6 +135,7 @@ async function fetchDocumentBytes(url: string): Promise<{ bytes: string; content
method: 'GET',
})
if (!response.ok) {
await response.text().catch(() => {})
throw new Error(`Failed to fetch document: ${response.statusText}`)
}

View File

@@ -0,0 +1,87 @@
import { createLogger } from '@sim/logger'
import { NextResponse } from 'next/server'
import { authorizeCredentialUse } from '@/lib/auth/credential-access'
import { generateRequestId } from '@/lib/core/utils/request'
import { refreshAccessTokenIfNeeded } from '@/app/api/auth/oauth/utils'
const logger = createLogger('TrelloBoardsAPI')
export const dynamic = 'force-dynamic'
export async function POST(request: Request) {
const requestId = generateRequestId()
try {
const apiKey = process.env.TRELLO_API_KEY
if (!apiKey) {
logger.error('Trello API key not configured')
return NextResponse.json({ error: 'Trello API key not configured' }, { status: 500 })
}
const body = await request.json()
const { credential, workflowId } = body
if (!credential) {
logger.error('Missing credential in request')
return NextResponse.json({ error: 'Credential is required' }, { status: 400 })
}
const authz = await authorizeCredentialUse(request as any, {
credentialId: credential,
workflowId,
})
if (!authz.ok || !authz.credentialOwnerUserId) {
return NextResponse.json({ error: authz.error || 'Unauthorized' }, { status: 403 })
}
const accessToken = await refreshAccessTokenIfNeeded(
credential,
authz.credentialOwnerUserId,
requestId
)
if (!accessToken) {
logger.error('Failed to get access token', {
credentialId: credential,
userId: authz.credentialOwnerUserId,
})
return NextResponse.json(
{ error: 'Could not retrieve access token', authRequired: true },
{ status: 401 }
)
}
const response = await fetch(
`https://api.trello.com/1/members/me/boards?key=${apiKey}&token=${accessToken}&fields=id,name,closed`,
{
headers: {
Accept: 'application/json',
},
}
)
if (!response.ok) {
const errorData = await response.json().catch(() => ({}))
logger.error('Failed to fetch Trello boards', {
status: response.status,
error: errorData,
})
return NextResponse.json(
{ error: 'Failed to fetch Trello boards', details: errorData },
{ status: response.status }
)
}
const data = await response.json()
const boards = (data || []).map((board: { id: string; name: string; closed: boolean }) => ({
id: board.id,
name: board.name,
closed: board.closed,
}))
return NextResponse.json({ boards })
} catch (error) {
logger.error('Error processing Trello boards request:', error)
return NextResponse.json(
{ error: 'Failed to retrieve Trello boards', details: (error as Error).message },
{ status: 500 }
)
}
}

View File

@@ -65,6 +65,7 @@ export async function POST(request: NextRequest) {
})
if (!response.ok) {
await response.body?.cancel().catch(() => {})
logger.error(`Failed to generate TTS: ${response.status} ${response.statusText}`)
return NextResponse.json(
{ error: `Failed to generate TTS: ${response.status} ${response.statusText}` },

View File

@@ -342,6 +342,7 @@ async function generateWithRunway(
})
if (!statusResponse.ok) {
await statusResponse.text().catch(() => {})
throw new Error(`Runway status check failed: ${statusResponse.status}`)
}
@@ -352,6 +353,7 @@ async function generateWithRunway(
const videoResponse = await fetch(statusData.output[0])
if (!videoResponse.ok) {
await videoResponse.text().catch(() => {})
throw new Error(`Failed to download video: ${videoResponse.status}`)
}
@@ -448,6 +450,7 @@ async function generateWithVeo(
)
if (!statusResponse.ok) {
await statusResponse.text().catch(() => {})
throw new Error(`Veo status check failed: ${statusResponse.status}`)
}
@@ -472,6 +475,7 @@ async function generateWithVeo(
})
if (!videoResponse.ok) {
await videoResponse.text().catch(() => {})
throw new Error(`Failed to download video: ${videoResponse.status}`)
}
@@ -561,6 +565,7 @@ async function generateWithLuma(
)
if (!statusResponse.ok) {
await statusResponse.text().catch(() => {})
throw new Error(`Luma status check failed: ${statusResponse.status}`)
}
@@ -576,6 +581,7 @@ async function generateWithLuma(
const videoResponse = await fetch(videoUrl)
if (!videoResponse.ok) {
await videoResponse.text().catch(() => {})
throw new Error(`Failed to download video: ${videoResponse.status}`)
}
@@ -679,6 +685,7 @@ async function generateWithMiniMax(
)
if (!statusResponse.ok) {
await statusResponse.text().catch(() => {})
throw new Error(`MiniMax status check failed: ${statusResponse.status}`)
}
@@ -712,6 +719,7 @@ async function generateWithMiniMax(
)
if (!fileResponse.ok) {
await fileResponse.text().catch(() => {})
throw new Error(`Failed to download video: ${fileResponse.status}`)
}
@@ -725,6 +733,7 @@ async function generateWithMiniMax(
// Download the actual video file
const videoResponse = await fetch(videoUrl)
if (!videoResponse.ok) {
await videoResponse.text().catch(() => {})
throw new Error(`Failed to download video from URL: ${videoResponse.status}`)
}
@@ -881,6 +890,7 @@ async function generateWithFalAI(
)
if (!statusResponse.ok) {
await statusResponse.text().catch(() => {})
throw new Error(`Fal.ai status check failed: ${statusResponse.status}`)
}
@@ -899,6 +909,7 @@ async function generateWithFalAI(
)
if (!resultResponse.ok) {
await resultResponse.text().catch(() => {})
throw new Error(`Failed to fetch result: ${resultResponse.status}`)
}
@@ -911,6 +922,7 @@ async function generateWithFalAI(
const videoResponse = await fetch(videoUrl)
if (!videoResponse.ok) {
await videoResponse.text().catch(() => {})
throw new Error(`Failed to download video: ${videoResponse.status}`)
}
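Each of the hunks above adds `await response.text().catch(() => {})` (or `body?.cancel()`) before throwing, so the unread body is drained and the runtime can release the underlying connection. A sketch of a helper that centralizes the pattern (the helper name is illustrative, not from the repo):

```typescript
// Sketch: drain the body of a failed response before throwing, swallowing
// any read error since only the status code matters to the caller.
async function fetchOrThrow(url: string, init?: RequestInit): Promise<Response> {
  const response = await fetch(url, init)
  if (!response.ok) {
    await response.text().catch(() => {})
    throw new Error(`Request failed: ${response.status}`)
  }
  return response
}
```

Without the drain, an abandoned response body can keep its socket tied up in connection-pooled runtimes.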

View File

@@ -184,6 +184,7 @@ export async function POST(request: NextRequest) {
method: 'GET',
})
if (!response.ok) {
await response.text().catch(() => {})
return NextResponse.json(
{ success: false, error: 'Failed to fetch image for Gemini' },
{ status: 400 }

View File

@@ -0,0 +1,82 @@
import { createLogger } from '@sim/logger'
import { NextResponse } from 'next/server'
import { authorizeCredentialUse } from '@/lib/auth/credential-access'
import { generateRequestId } from '@/lib/core/utils/request'
import { refreshAccessTokenIfNeeded } from '@/app/api/auth/oauth/utils'
const logger = createLogger('ZoomMeetingsAPI')
export const dynamic = 'force-dynamic'
export async function POST(request: Request) {
const requestId = generateRequestId()
try {
const body = await request.json()
const { credential, workflowId } = body
if (!credential) {
logger.error('Missing credential in request')
return NextResponse.json({ error: 'Credential is required' }, { status: 400 })
}
const authz = await authorizeCredentialUse(request as any, {
credentialId: credential,
workflowId,
})
if (!authz.ok || !authz.credentialOwnerUserId) {
return NextResponse.json({ error: authz.error || 'Unauthorized' }, { status: 403 })
}
const accessToken = await refreshAccessTokenIfNeeded(
credential,
authz.credentialOwnerUserId,
requestId
)
if (!accessToken) {
logger.error('Failed to get access token', {
credentialId: credential,
userId: authz.credentialOwnerUserId,
})
return NextResponse.json(
{ error: 'Could not retrieve access token', authRequired: true },
{ status: 401 }
)
}
const response = await fetch(
'https://api.zoom.us/v2/users/me/meetings?page_size=300&type=scheduled',
{
headers: {
Authorization: `Bearer ${accessToken}`,
'Content-Type': 'application/json',
},
}
)
if (!response.ok) {
const errorData = await response.json().catch(() => ({}))
logger.error('Failed to fetch Zoom meetings', {
status: response.status,
error: errorData,
})
return NextResponse.json(
{ error: 'Failed to fetch Zoom meetings', details: errorData },
{ status: response.status }
)
}
const data = await response.json()
const meetings = (data.meetings || []).map((meeting: { id: number; topic: string }) => ({
id: String(meeting.id),
name: meeting.topic,
}))
return NextResponse.json({ meetings })
} catch (error) {
logger.error('Error processing Zoom meetings request:', error)
return NextResponse.json(
{ error: 'Failed to retrieve Zoom meetings', details: (error as Error).message },
{ status: 500 }
)
}
}

View File

@@ -828,6 +828,9 @@ export async function POST(req: NextRequest, { params }: { params: Promise<{ id:
iterationTotal: iterationContext.iterationTotal,
iterationType: iterationContext.iterationType,
iterationContainerId: iterationContext.iterationContainerId,
...(iterationContext.parentIterations?.length && {
parentIterations: iterationContext.parentIterations,
}),
}),
...(childWorkflowContext && {
childWorkflowBlockId: childWorkflowContext.parentBlockId,
@@ -884,6 +887,9 @@ export async function POST(req: NextRequest, { params }: { params: Promise<{ id:
iterationTotal: iterationContext.iterationTotal,
iterationType: iterationContext.iterationType,
iterationContainerId: iterationContext.iterationContainerId,
...(iterationContext.parentIterations?.length && {
parentIterations: iterationContext.parentIterations,
}),
}),
...childWorkflowData,
...instanceData,
@@ -915,6 +921,9 @@ export async function POST(req: NextRequest, { params }: { params: Promise<{ id:
iterationTotal: iterationContext.iterationTotal,
iterationType: iterationContext.iterationType,
iterationContainerId: iterationContext.iterationContainerId,
...(iterationContext.parentIterations?.length && {
parentIterations: iterationContext.parentIterations,
}),
}),
...childWorkflowData,
...instanceData,
@@ -955,7 +964,7 @@ export async function POST(req: NextRequest, { params }: { params: Promise<{ id:
logger.error(`[${requestId}] Error streaming block content:`, error)
} finally {
try {
await reader.cancel().catch(() => {})
} catch {}
}
}
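The `finally` change above replaces `reader.releaseLock()` with `await reader.cancel().catch(() => {})`. The distinction matters: releasing only detaches the reader and leaves the underlying source running, while cancelling also signals the source to clean up. A minimal sketch (assumes WHATWG streams, available in Node 18+ and browsers):

```typescript
// cancel() invokes the underlying source's cancel() hook and releases the
// lock; releaseLock() alone would leave the source untouched.
let sourceCancelled = false
const stream = new ReadableStream({
  start(controller) {
    controller.enqueue('chunk')
  },
  cancel() {
    sourceCancelled = true
  },
})
const reader = stream.getReader()
const { value } = await reader.read()
await reader.cancel().catch(() => {})
```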

View File

@@ -38,6 +38,24 @@ export default function RootLayout({ children }: { children: React.ReactNode })
return (
<html lang='en' suppressHydrationWarning>
<head>
{/* Polyfill crypto.randomUUID for non-secure contexts (HTTP on non-localhost) */}
<script
id='crypto-randomuuid-polyfill'
dangerouslySetInnerHTML={{
__html: `
if (typeof crypto !== 'undefined' && typeof crypto.randomUUID !== 'function' && typeof crypto.getRandomValues === 'function') {
crypto.randomUUID = function() {
var a = new Uint8Array(16);
crypto.getRandomValues(a);
a[6] = (a[6] & 0x0f) | 0x40;
a[8] = (a[8] & 0x3f) | 0x80;
var h = Array.prototype.map.call(a, function(b) { return ('0' + b.toString(16)).slice(-2); }).join('');
return h.slice(0,8) + '-' + h.slice(8,12) + '-' + h.slice(12,16) + '-' + h.slice(16,20) + '-' + h.slice(20);
};
}
`,
}}
/>
{isReactScanEnabled && (
<Script
src='https://unpkg.com/react-scan/dist/auto.global.js'

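The inline polyfill above builds an RFC 4122 version-4 UUID by hand: fill 16 random bytes, force the version nibble to `4` and the variant bits to `10xx`, then hex-format with dashes. The same logic as a standalone, testable function (a sketch using Node's webcrypto, not the browser script itself):

```typescript
// Standalone sketch of the fallback; assumes getRandomValues is available,
// which is exactly the guard the inline polyfill checks before installing.
import { webcrypto } from 'node:crypto'

function uuidv4Fallback(): string {
  const a = new Uint8Array(16)
  webcrypto.getRandomValues(a)
  a[6] = (a[6] & 0x0f) | 0x40 // version nibble -> 4
  a[8] = (a[8] & 0x3f) | 0x80 // variant bits -> 10xx (RFC 4122)
  const h = Array.from(a, (b) => b.toString(16).padStart(2, '0')).join('')
  return `${h.slice(0, 8)}-${h.slice(8, 12)}-${h.slice(12, 16)}-${h.slice(16, 20)}-${h.slice(20)}`
}

const id = uuidv4Fallback()
```

The polyfill is needed because browsers expose `crypto.randomUUID` only in secure contexts (HTTPS or localhost), while `getRandomValues` works everywhere.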
View File

@@ -55,7 +55,7 @@ Sim provides a visual drag-and-drop interface for building and deploying AI agen
## Optional
- [Careers](https://jobs.ashbyhq.com/sim): Join the Sim team
- [Terms of Service](${baseUrl}/terms): Legal terms
- [Privacy Policy](${baseUrl}/privacy): Data handling practices
`

View File

@@ -28,10 +28,6 @@ export default async function sitemap(): Promise<MetadataRoute.Sitemap> {
url: `${baseUrl}/changelog`,
lastModified: now,
},
{
url: `${baseUrl}/careers`,
lastModified: new Date('2024-10-06'),
},
{
url: `${baseUrl}/terms`,
lastModified: new Date('2024-10-14'),

View File

@@ -164,7 +164,7 @@ export const ActionBar = memo(
return (
<div
className={cn(
'-top-[46px] pointer-events-auto absolute right-0',
'flex flex-row items-center',
'opacity-0 transition-opacity duration-200 group-hover:opacity-100',
'gap-[5px] rounded-[10px] p-[5px]',

View File

@@ -501,17 +501,6 @@ export function Chat() {
}
}, [])
useEffect(() => {
if (!isExecuting && isStreaming) {
const lastMessage = workflowMessages[workflowMessages.length - 1]
if (lastMessage?.isStreaming) {
streamReaderRef.current?.cancel()
streamReaderRef.current = null
finalizeMessageStream(lastMessage.id)
}
}
}, [isExecuting, isStreaming, workflowMessages, finalizeMessageStream])
const handleStopStreaming = useCallback(() => {
streamReaderRef.current?.cancel()
streamReaderRef.current = null

View File

@@ -31,6 +31,7 @@ import {
} from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/components/tag-dropdown/tag-dropdown'
import { useSubBlockValue } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/hooks/use-sub-block-value'
import type { WandControlHandlers } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/sub-block'
import { restoreCursorAfterInsertion } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/utils'
import { WandPromptBar } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/wand-prompt-bar/wand-prompt-bar'
import { useAccessibleReferencePrefixes } from '@/app/workspace/[workspaceId]/w/[workflowId]/hooks/use-accessible-reference-prefixes'
import { useWand } from '@/app/workspace/[workspaceId]/w/[workflowId]/hooks/use-wand'
@@ -537,36 +538,40 @@ export const Code = memo(function Code({
/**
* Handles selection of a tag from the tag dropdown.
* @param newValue - The new code value with the selected tag inserted
* @param newCursorPosition - The cursor position after the inserted tag
*/
const handleTagSelect = (newValue: string, newCursorPosition: number) => {
const textarea = editorRef.current?.querySelector('textarea') as HTMLTextAreaElement | null
if (!isPreview && !readOnly) {
setCode(newValue)
emitTagSelection(newValue)
recordChange(newValue)
restoreCursorAfterInsertion(textarea, newCursorPosition)
} else {
setTimeout(() => textarea?.focus(), 0)
}
setShowTags(false)
setActiveSourceBlockId(null)
}
/**
* Handles selection of an environment variable from the dropdown.
* @param newValue - The new code value with the selected env var inserted
* @param newCursorPosition - The cursor position after the inserted env var
*/
const handleEnvVarSelect = (newValue: string, newCursorPosition: number) => {
const textarea = editorRef.current?.querySelector('textarea') as HTMLTextAreaElement | null
if (!isPreview && !readOnly) {
setCode(newValue)
emitTagSelection(newValue)
recordChange(newValue)
restoreCursorAfterInsertion(textarea, newCursorPosition)
} else {
setTimeout(() => textarea?.focus(), 0)
}
setShowEnvVars(false)
}
/**

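Both handlers above now receive a `newCursorPosition` from the dropdown so `restoreCursorAfterInsertion` can put the caret back after the inserted tag. The position the dropdown would hand back can be sketched as a pure function (illustrative, not the repo's code):

```typescript
// Sketch: insert a tag at the caret and report where the caret should land.
function insertAtCursor(
  value: string,
  cursor: number,
  tag: string
): { newValue: string; newCursorPosition: number } {
  const newValue = value.slice(0, cursor) + tag + value.slice(cursor)
  // Caret goes immediately after the inserted text.
  return { newValue, newCursorPosition: cursor + tag.length }
}

const r = insertAtCursor('ab', 1, '<tag>')
```

The caller then applies the position with `textarea.setSelectionRange(pos, pos)` once focus is restored; without it, React re-rendering the controlled textarea would jump the caret to the end.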
View File

@@ -31,6 +31,7 @@ import {
TagDropdown,
} from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/components/tag-dropdown/tag-dropdown'
import { useSubBlockValue } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/hooks/use-sub-block-value'
import { restoreCursorAfterInsertion } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/utils'
import { useAccessibleReferencePrefixes } from '@/app/workspace/[workspaceId]/w/[workflowId]/hooks/use-accessible-reference-prefixes'
import { normalizeName } from '@/executor/constants'
import { createEnvVarPattern, createReferencePattern } from '@/executor/utils/reference-validation'
@@ -554,9 +555,17 @@ export function ConditionInput({
)
}
const handleTagSelectImmediate = (
blockId: string,
newValue: string,
newCursorPosition: number
) => {
if (isPreview || disabled) return
const textarea = containerRef.current?.querySelector(
`[data-block-id="${CSS.escape(blockId)}"] textarea`
) as HTMLTextAreaElement | null
shouldPersistRef.current = true
setConditionalBlocks((blocks) =>
blocks.map((block) =>
@@ -582,11 +591,21 @@ export function ConditionInput({
: block
)
emitTagSelection(JSON.stringify(updatedBlocks))
restoreCursorAfterInsertion(textarea, newCursorPosition)
}
const handleEnvVarSelectImmediate = (
blockId: string,
newValue: string,
newCursorPosition: number
) => {
if (isPreview || disabled) return
const textarea = containerRef.current?.querySelector(
`[data-block-id="${CSS.escape(blockId)}"] textarea`
) as HTMLTextAreaElement | null
shouldPersistRef.current = true
setConditionalBlocks((blocks) =>
blocks.map((block) =>
@@ -612,6 +631,8 @@ export function ConditionInput({
: block
)
emitTagSelection(JSON.stringify(updatedBlocks))
restoreCursorAfterInsertion(textarea, newCursorPosition)
}
/**
@@ -999,7 +1020,9 @@ export function ConditionInput({
{block.showEnvVars && (
<EnvVarDropdown
visible={block.showEnvVars}
onSelect={(newValue, newCursorPosition) =>
handleEnvVarSelectImmediate(block.id, newValue, newCursorPosition)
}
searchTerm={block.searchTerm}
inputValue={block.value}
cursorPosition={block.cursorPosition}
@@ -1023,7 +1046,9 @@ export function ConditionInput({
{block.showTags && (
<TagDropdown
visible={block.showTags}
onSelect={(newValue, newCursorPosition) =>
handleTagSelectImmediate(block.id, newValue, newCursorPosition)
}
blockId={blockId}
activeSourceBlockId={block.activeSourceBlockId}
inputValue={block.value}
@@ -1207,7 +1232,9 @@ export function ConditionInput({
{block.showEnvVars && (
<EnvVarDropdown
visible={block.showEnvVars}
onSelect={(newValue, newCursorPosition) =>
handleEnvVarSelectImmediate(block.id, newValue, newCursorPosition)
}
searchTerm={block.searchTerm}
inputValue={block.value}
cursorPosition={block.cursorPosition}
@@ -1225,7 +1252,9 @@ export function ConditionInput({
{block.showTags && (
<TagDropdown
visible={block.showTags}
onSelect={(newValue) => handleTagSelectImmediate(block.id, newValue)}
onSelect={(newValue, newCursorPosition) =>
handleTagSelectImmediate(block.id, newValue, newCursorPosition)
}
blockId={blockId}
activeSourceBlockId={block.activeSourceBlockId}
inputValue={block.value}
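The signature change above threads a `newCursorPosition` from the dropdowns back to the input so the caret can be restored after insertion. The position arithmetic can be sketched as a pure helper (a minimal sketch for illustration; `insertTag` is a hypothetical name, not a function in this repository):

```typescript
// Sketch: insert a <tag> at the cursor and compute where the caret should land.
// Mirrors the idea in the diff: the caret goes just past the closing '>'.
function insertTag(
  value: string,
  cursor: number,
  tag: string
): { newValue: string; newCursorPosition: number } {
  const before = value.slice(0, cursor)
  const after = value.slice(cursor)
  const newValue = `${before}<${tag}>${after}`
  // '<' (1 char) + tag + '>' (1 char) after the original cursor position
  const newCursorPosition = cursor + 1 + tag.length + 1
  return { newValue, newCursorPosition }
}
```

Passing this position through `onSelect` lets the caller call `setSelectionRange` after React re-renders, instead of letting the caret jump to the end of the textarea.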


@@ -15,6 +15,7 @@ import {
import { client } from '@/lib/auth/auth-client'
import {
getProviderIdFromServiceId,
getScopeDescription,
OAUTH_PROVIDERS,
type OAuthProvider,
parseProvider,
@@ -33,313 +34,6 @@ export interface OAuthRequiredModalProps {
onConnect?: () => Promise<void> | void
}
const SCOPE_DESCRIPTIONS: Record<string, string> = {
'https://www.googleapis.com/auth/gmail.send': 'Send emails',
'https://www.googleapis.com/auth/gmail.labels': 'View and manage email labels',
'https://www.googleapis.com/auth/gmail.modify': 'View and manage email messages',
'https://www.googleapis.com/auth/drive.file': 'View and manage Google Drive files',
'https://www.googleapis.com/auth/drive': 'Access all Google Drive files',
'https://www.googleapis.com/auth/calendar': 'View and manage calendar',
'https://www.googleapis.com/auth/contacts': 'View and manage Google Contacts',
'https://www.googleapis.com/auth/tasks': 'Create, read, update, and delete Google Tasks',
'https://www.googleapis.com/auth/userinfo.email': 'View email address',
'https://www.googleapis.com/auth/userinfo.profile': 'View basic profile info',
'https://www.googleapis.com/auth/forms.body': 'View and manage Google Forms',
'https://www.googleapis.com/auth/forms.responses.readonly': 'View responses to Google Forms',
'https://www.googleapis.com/auth/bigquery': 'View and manage data in Google BigQuery',
'https://www.googleapis.com/auth/ediscovery': 'Access Google Vault for eDiscovery',
'https://www.googleapis.com/auth/devstorage.read_only': 'Read files from Google Cloud Storage',
'https://www.googleapis.com/auth/admin.directory.group': 'Manage Google Workspace groups',
'https://www.googleapis.com/auth/admin.directory.group.member':
'Manage Google Workspace group memberships',
'https://www.googleapis.com/auth/admin.directory.group.readonly': 'View Google Workspace groups',
'https://www.googleapis.com/auth/admin.directory.group.member.readonly':
'View Google Workspace group memberships',
'https://www.googleapis.com/auth/cloud-platform':
'Full access to Google Cloud resources for Vertex AI',
'read:confluence-content.all': 'Read all Confluence content',
'read:confluence-space.summary': 'Read Confluence space information',
'read:space:confluence': 'View Confluence spaces',
'read:space-details:confluence': 'View detailed Confluence space information',
'write:confluence-content': 'Create and edit Confluence pages',
'write:confluence-space': 'Manage Confluence spaces',
'write:confluence-file': 'Upload files to Confluence',
'read:content:confluence': 'Read Confluence content',
'read:page:confluence': 'View Confluence pages',
'write:page:confluence': 'Create and update Confluence pages',
'read:comment:confluence': 'View comments on Confluence pages',
'write:comment:confluence': 'Create and update comments',
'delete:comment:confluence': 'Delete comments from Confluence pages',
'read:attachment:confluence': 'View attachments on Confluence pages',
'write:attachment:confluence': 'Upload and manage attachments',
'delete:attachment:confluence': 'Delete attachments from Confluence pages',
'delete:page:confluence': 'Delete Confluence pages',
'read:label:confluence': 'View labels on Confluence content',
'write:label:confluence': 'Add and remove labels',
'search:confluence': 'Search Confluence content',
'readonly:content.attachment:confluence': 'View attachments',
'read:blogpost:confluence': 'View Confluence blog posts',
'write:blogpost:confluence': 'Create and update Confluence blog posts',
'read:content.property:confluence': 'View properties on Confluence content',
'write:content.property:confluence': 'Create and manage content properties',
'read:hierarchical-content:confluence': 'View page hierarchy (children and ancestors)',
'read:content.metadata:confluence': 'View content metadata (required for ancestors)',
'read:user:confluence': 'View Confluence user profiles',
'read:task:confluence': 'View Confluence inline tasks',
'write:task:confluence': 'Update Confluence inline tasks',
'delete:blogpost:confluence': 'Delete Confluence blog posts',
'write:space:confluence': 'Create and update Confluence spaces',
'delete:space:confluence': 'Delete Confluence spaces',
'read:space.property:confluence': 'View Confluence space properties',
'write:space.property:confluence': 'Create and manage space properties',
'read:space.permission:confluence': 'View Confluence space permissions',
'read:me': 'Read profile information',
'database.read': 'Read database',
'database.write': 'Write to database',
'projects.read': 'Read projects',
offline_access: 'Access account when not using the application',
repo: 'Access repositories',
workflow: 'Manage repository workflows',
'read:user': 'Read public user information',
'user:email': 'Access email address',
'tweet.read': 'Read tweets and timeline',
'tweet.write': 'Post and delete tweets',
'tweet.moderate.write': 'Hide and unhide replies to tweets',
'users.read': 'Read user profiles and account information',
'follows.read': 'View followers and following lists',
'follows.write': 'Follow and unfollow users',
'bookmark.read': 'View bookmarked tweets',
'bookmark.write': 'Add and remove bookmarks',
'like.read': 'View liked tweets and liking users',
'like.write': 'Like and unlike tweets',
'block.read': 'View blocked users',
'block.write': 'Block and unblock users',
'mute.read': 'View muted users',
'mute.write': 'Mute and unmute users',
'offline.access': 'Access account when not using the application',
'data.records:read': 'Read records',
'data.records:write': 'Write to records',
'webhook:manage': 'Manage webhooks',
'page.read': 'Read Notion pages',
'page.write': 'Write to Notion pages',
'workspace.content': 'Read Notion content',
'workspace.name': 'Read Notion workspace name',
'workspace.read': 'Read Notion workspace',
'workspace.write': 'Write to Notion workspace',
'user.email:read': 'Read email address',
'read:jira-user': 'Read Jira user',
'read:jira-work': 'Read Jira work',
'write:jira-work': 'Write to Jira work',
'manage:jira-webhook': 'Register and manage Jira webhooks',
'read:webhook:jira': 'View Jira webhooks',
'write:webhook:jira': 'Create and update Jira webhooks',
'delete:webhook:jira': 'Delete Jira webhooks',
'read:issue-event:jira': 'Read Jira issue events',
'write:issue:jira': 'Write to Jira issues',
'read:project:jira': 'Read Jira projects',
'read:issue-type:jira': 'Read Jira issue types',
'read:issue-meta:jira': 'Read Jira issue meta',
'read:issue-security-level:jira': 'Read Jira issue security level',
'read:issue.vote:jira': 'Read Jira issue votes',
'read:issue.changelog:jira': 'Read Jira issue changelog',
'read:avatar:jira': 'Read Jira avatar',
'read:issue:jira': 'Read Jira issues',
'read:status:jira': 'Read Jira status',
'read:user:jira': 'Read Jira user',
'read:field-configuration:jira': 'Read Jira field configuration',
'read:issue-details:jira': 'Read Jira issue details',
'read:field:jira': 'Read Jira field configurations',
'read:jql:jira': 'Use JQL to filter Jira issues',
'read:comment.property:jira': 'Read Jira comment properties',
'read:issue.property:jira': 'Read Jira issue properties',
'delete:issue:jira': 'Delete Jira issues',
'write:comment:jira': 'Add and update comments on Jira issues',
'read:comment:jira': 'Read comments on Jira issues',
'delete:comment:jira': 'Delete comments from Jira issues',
'read:attachment:jira': 'Read attachments from Jira issues',
'delete:attachment:jira': 'Delete attachments from Jira issues',
'write:issue-worklog:jira': 'Add and update worklog entries on Jira issues',
'read:issue-worklog:jira': 'Read worklog entries from Jira issues',
'delete:issue-worklog:jira': 'Delete worklog entries from Jira issues',
'write:issue-link:jira': 'Create links between Jira issues',
'delete:issue-link:jira': 'Delete links between Jira issues',
'User.Read': 'Read Microsoft user',
'Chat.Read': 'Read Microsoft chats',
'Chat.ReadWrite': 'Write to Microsoft chats',
'Chat.ReadBasic': 'Read Microsoft chats',
'ChatMessage.Send': 'Send chat messages',
'Channel.ReadBasic.All': 'Read Microsoft channels',
'ChannelMessage.Send': 'Write to Microsoft channels',
'ChannelMessage.Read.All': 'Read Microsoft channels',
'ChannelMessage.ReadWrite': 'Read and write to Microsoft channels',
'ChannelMember.Read.All': 'Read team channel members',
'Group.Read.All': 'Read Microsoft groups',
'Group.ReadWrite.All': 'Write to Microsoft groups',
'Team.ReadBasic.All': 'Read Microsoft teams',
'TeamMember.Read.All': 'Read team members',
'Mail.ReadWrite': 'Write to Microsoft emails',
'Mail.ReadBasic': 'Read Microsoft emails',
'Mail.Read': 'Read Microsoft emails',
'Mail.Send': 'Send emails',
'Files.Read': 'Read OneDrive files',
'Files.ReadWrite': 'Read and write OneDrive files',
'Tasks.ReadWrite': 'Read and manage Planner tasks',
'Sites.Read.All': 'Read Sharepoint sites',
'Sites.ReadWrite.All': 'Read and write Sharepoint sites',
'Sites.Manage.All': 'Manage Sharepoint sites',
openid: 'Standard authentication',
profile: 'Access profile information',
email: 'Access email address',
identify: 'Read Discord user',
bot: 'Read Discord bot',
'messages.read': 'Read Discord messages',
guilds: 'Read Discord guilds',
'guilds.members.read': 'Read Discord guild members',
identity: 'Access Reddit identity',
submit: 'Submit posts and comments',
vote: 'Vote on posts and comments',
save: 'Save and unsave posts and comments',
edit: 'Edit posts and comments',
subscribe: 'Subscribe and unsubscribe from subreddits',
history: 'Access Reddit history',
privatemessages: 'Access inbox and send private messages',
account: 'Update account preferences and settings',
mysubreddits: 'Access subscribed and moderated subreddits',
flair: 'Manage user and post flair',
report: 'Report posts and comments for rule violations',
modposts: 'Approve, remove, and moderate posts in moderated subreddits',
modflair: 'Manage flair in moderated subreddits',
modmail: 'Access and respond to moderator mail',
login: 'Access Wealthbox account',
data: 'Access Wealthbox data',
read: 'Read access to workspace',
write: 'Write access to Linear workspace',
'channels:read': 'View public channels',
'channels:history': 'Read channel messages',
'groups:read': 'View private channels',
'groups:history': 'Read private messages',
'chat:write': 'Send messages',
'chat:write.public': 'Post to public channels',
'im:write': 'Send direct messages',
'im:history': 'Read direct message history',
'im:read': 'View direct message channels',
'users:read': 'View workspace users',
'files:write': 'Upload files',
'files:read': 'Download and read files',
'canvases:write': 'Create canvas documents',
'reactions:write': 'Add emoji reactions to messages',
'sites:read': 'View Webflow sites',
'sites:write': 'Manage webhooks and site settings',
'cms:read': 'View CMS content',
'cms:write': 'Manage CMS content',
'crm.objects.contacts.read': 'Read HubSpot contacts',
'crm.objects.contacts.write': 'Create and update HubSpot contacts',
'crm.objects.companies.read': 'Read HubSpot companies',
'crm.objects.companies.write': 'Create and update HubSpot companies',
'crm.objects.deals.read': 'Read HubSpot deals',
'crm.objects.deals.write': 'Create and update HubSpot deals',
'crm.objects.owners.read': 'Read HubSpot object owners',
'crm.objects.users.read': 'Read HubSpot users',
'crm.objects.users.write': 'Create and update HubSpot users',
'crm.objects.marketing_events.read': 'Read HubSpot marketing events',
'crm.objects.marketing_events.write': 'Create and update HubSpot marketing events',
'crm.objects.line_items.read': 'Read HubSpot line items',
'crm.objects.line_items.write': 'Create and update HubSpot line items',
'crm.objects.quotes.read': 'Read HubSpot quotes',
'crm.objects.quotes.write': 'Create and update HubSpot quotes',
'crm.objects.appointments.read': 'Read HubSpot appointments',
'crm.objects.appointments.write': 'Create and update HubSpot appointments',
'crm.objects.carts.read': 'Read HubSpot shopping carts',
'crm.objects.carts.write': 'Create and update HubSpot shopping carts',
'crm.import': 'Import data into HubSpot',
'crm.lists.read': 'Read HubSpot lists',
'crm.lists.write': 'Create and update HubSpot lists',
tickets: 'Manage HubSpot tickets',
api: 'Access Salesforce API',
refresh_token: 'Maintain long-term access to Salesforce account',
default: 'Access Asana workspace',
base: 'Basic access to Pipedrive account',
'deals:read': 'Read Pipedrive deals',
'deals:full': 'Full access to manage Pipedrive deals',
'contacts:read': 'Read Pipedrive contacts',
'contacts:full': 'Full access to manage Pipedrive contacts',
'leads:read': 'Read Pipedrive leads',
'leads:full': 'Full access to manage Pipedrive leads',
'activities:read': 'Read Pipedrive activities',
'activities:full': 'Full access to manage Pipedrive activities',
'mail:read': 'Read Pipedrive emails',
'mail:full': 'Full access to manage Pipedrive emails',
'projects:read': 'Read Pipedrive projects',
'projects:full': 'Full access to manage Pipedrive projects',
'webhooks:read': 'Read Pipedrive webhooks',
'webhooks:full': 'Full access to manage Pipedrive webhooks',
w_member_social: 'Access LinkedIn profile',
// Box scopes
root_readwrite: 'Read and write all files and folders in Box account',
root_readonly: 'Read all files and folders in Box account',
// Shopify scopes (write_* implicitly includes read access)
write_products: 'Read and manage Shopify products',
write_orders: 'Read and manage Shopify orders',
write_customers: 'Read and manage Shopify customers',
write_inventory: 'Read and manage Shopify inventory levels',
read_locations: 'View store locations',
write_merchant_managed_fulfillment_orders: 'Create fulfillments for orders',
// Zoom scopes
'user:read:user': 'View Zoom profile information',
'meeting:write:meeting': 'Create Zoom meetings',
'meeting:read:meeting': 'View Zoom meeting details',
'meeting:read:list_meetings': 'List Zoom meetings',
'meeting:update:meeting': 'Update Zoom meetings',
'meeting:delete:meeting': 'Delete Zoom meetings',
'meeting:read:invitation': 'View Zoom meeting invitations',
'meeting:read:list_past_participants': 'View past meeting participants',
'cloud_recording:read:list_user_recordings': 'List Zoom cloud recordings',
'cloud_recording:read:list_recording_files': 'View recording files',
'cloud_recording:delete:recording_file': 'Delete cloud recordings',
// Dropbox scopes
'account_info.read': 'View Dropbox account information',
'files.metadata.read': 'View file and folder names, sizes, and dates',
'files.metadata.write': 'Modify file and folder metadata',
'files.content.read': 'Download and read Dropbox files',
'files.content.write': 'Upload, copy, move, and delete files in Dropbox',
'sharing.read': 'View shared files and folders',
'sharing.write': 'Share files and folders with others',
// WordPress.com scopes
global: 'Full access to manage WordPress.com sites, posts, pages, media, and settings',
// Spotify scopes
'user-read-private': 'View Spotify account details',
'user-read-email': 'View email address on Spotify',
'user-library-read': 'View saved tracks and albums',
'user-library-modify': 'Save and remove tracks and albums from library',
'playlist-read-private': 'View private playlists',
'playlist-read-collaborative': 'View collaborative playlists',
'playlist-modify-public': 'Create and manage public playlists',
'playlist-modify-private': 'Create and manage private playlists',
'user-read-playback-state': 'View current playback state',
'user-modify-playback-state': 'Control playback on Spotify devices',
'user-read-currently-playing': 'View currently playing track',
'user-read-recently-played': 'View recently played tracks',
'user-top-read': 'View top artists and tracks',
'user-follow-read': 'View followed artists and users',
'user-follow-modify': 'Follow and unfollow artists and users',
'user-read-playback-position': 'View playback position in podcasts',
'ugc-image-upload': 'Upload images to Spotify playlists',
// Attio
'record_permission:read-write': 'Read and write CRM records',
'object_configuration:read-write': 'Read and manage object schemas',
'list_configuration:read-write': 'Read and manage list configurations',
'list_entry:read-write': 'Read and write list entries',
'note:read-write': 'Read and write notes',
'task:read-write': 'Read and write tasks',
'comment:read-write': 'Read and write comments and threads',
'user_management:read': 'View workspace members',
'webhook:read-write': 'Manage webhooks',
}
function getScopeDescription(scope: string): string {
return SCOPE_DESCRIPTIONS[scope] || scope
}
export function OAuthRequiredModal({
isOpen,
onClose,
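The hunk above deletes the modal's local `SCOPE_DESCRIPTIONS` map in favor of the shared `getScopeDescription` now imported from `@/lib/oauth`. The centralized lookup can be sketched like this (a trimmed-down sketch with two sample entries; the real map lives in the OAuth library):

```typescript
// Sketch of the centralized lookup the PR moves into '@/lib/oauth'.
// Unknown scopes fall back to the raw scope string so the consent UI
// never renders an empty description.
const SCOPE_DESCRIPTIONS: Record<string, string> = {
  'https://www.googleapis.com/auth/gmail.send': 'Send emails',
  repo: 'Access repositories',
}

function getScopeDescription(scope: string): string {
  return SCOPE_DESCRIPTIONS[scope] ?? scope
}
```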


@@ -15,6 +15,7 @@ import {
type OAuthProvider,
parseProvider,
} from '@/lib/oauth'
import { getMissingRequiredScopes } from '@/lib/oauth/utils'
import { OAuthRequiredModal } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/components/credential-selector/components/oauth-required-modal'
import { useDependsOnGate } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/hooks/use-depends-on-gate'
import { useSubBlockValue } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/hooks/use-sub-block-value'
@@ -25,7 +26,6 @@ import { useOAuthCredentials } from '@/hooks/queries/oauth-credentials'
import { useOrganizations } from '@/hooks/queries/organization'
import { useSubscriptionData } from '@/hooks/queries/subscription'
import { useCredentialRefreshTriggers } from '@/hooks/use-credential-refresh-triggers'
import { getMissingRequiredScopes } from '@/hooks/use-oauth-scope-status'
import { useWorkflowRegistry } from '@/stores/workflows/registry/store'
const isBillingEnabled = isTruthy(getEnv('NEXT_PUBLIC_BILLING_ENABLED'))


@@ -23,7 +23,7 @@ interface EnvVarDropdownProps {
/** Whether the dropdown is visible */
visible: boolean
/** Callback when an environment variable is selected */
onSelect: (newValue: string) => void
onSelect: (newValue: string, newCursorPosition: number) => void
/** Search term to filter environment variables */
searchTerm?: string
/** Additional CSS class names */
@@ -189,6 +189,8 @@ export const EnvVarDropdown: React.FC<EnvVarDropdownProps> = ({
const isStandardEnvVarContext = lastOpenBraces !== -1
const tagLength = 2 + envVar.length + 2
if (isStandardEnvVarContext) {
const startText = textBeforeCursor.slice(0, lastOpenBraces)
@@ -196,13 +198,10 @@ export const EnvVarDropdown: React.FC<EnvVarDropdownProps> = ({
const endText = closeIndex !== -1 ? textAfterCursor.slice(closeIndex + 2) : textAfterCursor
const newValue = `${startText}{{${envVar}}}${endText}`
onSelect(newValue)
onSelect(newValue, lastOpenBraces + tagLength)
} else {
if (inputValue.trim() !== '') {
onSelect(`{{${envVar}}}`)
} else {
onSelect(`{{${envVar}}}`)
}
const newValue = `{{${envVar}}}`
onSelect(newValue, tagLength)
}
onClose?.()


@@ -70,7 +70,7 @@ interface TagDropdownProps {
/** Whether the dropdown is visible */
visible: boolean
/** Callback when a tag is selected */
onSelect: (newValue: string) => void
onSelect: (newValue: string, newCursorPosition: number) => void
/** ID of the block that owns the input field */
blockId: string
/** ID of the specific source block being referenced, if any */
@@ -1167,21 +1167,56 @@ export const TagDropdown: React.FC<TagDropdownProps> = ({
{} as Record<string, { type: string; id: string }>
)
let loopBlockGroup: BlockTagGroup | null = null
const loopBlockGroups: BlockTagGroup[] = []
const ancestorLoopIds = new Set<string>()
const visitedContainerIds = new Set<string>()
const findAncestorContainers = (targetId: string) => {
if (visitedContainerIds.has(targetId)) return
visitedContainerIds.add(targetId)
// Check if targetId is directly inside any loop
for (const [loopId, loop] of Object.entries(loops)) {
if (loop.nodes.includes(targetId) && !ancestorLoopIds.has(loopId)) {
ancestorLoopIds.add(loopId)
const loopBlock = blocks[loopId]
if (loopBlock) {
const loopType = loop.loopType || 'for'
const loopBlockName = loopBlock.name || loopBlock.type
const normalizedLoopName = normalizeName(loopBlockName)
const contextualTags: string[] = [`${normalizedLoopName}.index`]
if (loopType === 'forEach') {
contextualTags.push(`${normalizedLoopName}.currentItem`)
contextualTags.push(`${normalizedLoopName}.items`)
}
loopBlockGroups.push({
blockName: loopBlockName,
blockId: loopId,
blockType: 'loop',
tags: contextualTags,
distance: 0,
isContextual: true,
})
}
findAncestorContainers(loopId)
}
}
// Also walk through containing parallels so we find loops that contain
// the parallel (e.g. block inside parallel inside loop)
for (const [parallelId, parallel] of Object.entries(parallels || {})) {
if (parallel.nodes.includes(targetId)) {
findAncestorContainers(parallelId)
}
}
}
const isLoopBlock = blocks[blockId]?.type === 'loop'
const currentLoop = isLoopBlock ? loops[blockId] : null
const containingLoop = Object.entries(loops).find(([_, loop]) => loop.nodes.includes(blockId))
let containingLoopBlockId: string | null = null
if (currentLoop && isLoopBlock) {
containingLoopBlockId = blockId
const loopType = currentLoop.loopType || 'for'
if (isLoopBlock && loops[blockId]) {
const loop = loops[blockId]
ancestorLoopIds.add(blockId)
const loopBlock = blocks[blockId]
if (loopBlock) {
const loopType = loop.loopType || 'for'
const loopBlockName = loopBlock.name || loopBlock.type
const normalizedLoopName = normalizeName(loopBlockName)
const contextualTags: string[] = [`${normalizedLoopName}.index`]
@@ -1189,71 +1224,65 @@ export const TagDropdown: React.FC<TagDropdownProps> = ({
contextualTags.push(`${normalizedLoopName}.currentItem`)
contextualTags.push(`${normalizedLoopName}.items`)
}
loopBlockGroup = {
loopBlockGroups.push({
blockName: loopBlockName,
blockId: blockId,
blockType: 'loop',
tags: contextualTags,
distance: 0,
isContextual: true,
}
})
}
} else if (containingLoop) {
const [loopId, loop] = containingLoop
containingLoopBlockId = loopId
const loopType = loop.loopType || 'for'
findAncestorContainers(blockId)
} else {
findAncestorContainers(blockId)
}
const containingLoopBlock = blocks[loopId]
if (containingLoopBlock) {
const loopBlockName = containingLoopBlock.name || containingLoopBlock.type
const normalizedLoopName = normalizeName(loopBlockName)
const contextualTags: string[] = [`${normalizedLoopName}.index`]
if (loopType === 'forEach') {
contextualTags.push(`${normalizedLoopName}.currentItem`)
contextualTags.push(`${normalizedLoopName}.items`)
}
const parallelBlockGroups: BlockTagGroup[] = []
const ancestorParallelIds = new Set<string>()
const visitedParallelTargets = new Set<string>()
loopBlockGroup = {
blockName: loopBlockName,
blockId: loopId,
blockType: 'loop',
tags: contextualTags,
distance: 0,
isContextual: true,
const findAncestorParallels = (targetId: string) => {
if (visitedParallelTargets.has(targetId)) return
visitedParallelTargets.add(targetId)
for (const [parallelId, parallel] of Object.entries(parallels || {})) {
if (parallel.nodes.includes(targetId) && !ancestorParallelIds.has(parallelId)) {
ancestorParallelIds.add(parallelId)
const parallelBlock = blocks[parallelId]
if (parallelBlock) {
const parallelType = parallel.parallelType || 'count'
const parallelBlockName = parallelBlock.name || parallelBlock.type
const normalizedParallelName = normalizeName(parallelBlockName)
const contextualTags: string[] = [`${normalizedParallelName}.index`]
if (parallelType === 'collection') {
contextualTags.push(`${normalizedParallelName}.currentItem`)
contextualTags.push(`${normalizedParallelName}.items`)
}
parallelBlockGroups.push({
blockName: parallelBlockName,
blockId: parallelId,
blockType: 'parallel',
tags: contextualTags,
distance: 0,
isContextual: true,
})
}
// Walk up through containing loops and parallels
for (const [loopId, loop] of Object.entries(loops)) {
if (loop.nodes.includes(parallelId)) {
findAncestorParallels(loopId)
}
}
findAncestorParallels(parallelId)
}
}
}
let parallelBlockGroup: BlockTagGroup | null = null
const containingParallel = Object.entries(parallels || {}).find(([_, parallel]) =>
parallel.nodes.includes(blockId)
)
let containingParallelBlockId: string | null = null
if (containingParallel) {
const [parallelId, parallel] = containingParallel
containingParallelBlockId = parallelId
const parallelType = parallel.parallelType || 'count'
const containingParallelBlock = blocks[parallelId]
if (containingParallelBlock) {
const parallelBlockName = containingParallelBlock.name || containingParallelBlock.type
const normalizedParallelName = normalizeName(parallelBlockName)
const contextualTags: string[] = [`${normalizedParallelName}.index`]
if (parallelType === 'collection') {
contextualTags.push(`${normalizedParallelName}.currentItem`)
contextualTags.push(`${normalizedParallelName}.items`)
}
parallelBlockGroup = {
blockName: parallelBlockName,
blockId: parallelId,
blockType: 'parallel',
tags: contextualTags,
distance: 0,
isContextual: true,
}
}
findAncestorParallels(blockId)
// Also check through ancestor loops (a block in a loop that's in a parallel)
for (const loopId of ancestorLoopIds) {
findAncestorParallels(loopId)
}
const blockTagGroups: BlockTagGroup[] = []
@@ -1275,8 +1304,8 @@ export const TagDropdown: React.FC<TagDropdownProps> = ({
if (!blockConfig) {
if (accessibleBlock.type === 'loop' || accessibleBlock.type === 'parallel') {
if (
accessibleBlockId === containingLoopBlockId ||
accessibleBlockId === containingParallelBlockId
ancestorLoopIds.has(accessibleBlockId) ||
ancestorParallelIds.has(accessibleBlockId)
) {
continue
}
@@ -1366,12 +1395,8 @@ export const TagDropdown: React.FC<TagDropdownProps> = ({
}
const finalBlockTagGroups: BlockTagGroup[] = []
if (loopBlockGroup) {
finalBlockTagGroups.push(loopBlockGroup)
}
if (parallelBlockGroup) {
finalBlockTagGroups.push(parallelBlockGroup)
}
finalBlockTagGroups.push(...loopBlockGroups)
finalBlockTagGroups.push(...parallelBlockGroups)
blockTagGroups.sort((a, b) => a.distance - b.distance)
finalBlockTagGroups.push(...blockTagGroups)
@@ -1570,28 +1595,15 @@ export const TagDropdown: React.FC<TagDropdownProps> = ({
if (variableObj) {
processedTag = tag
}
} else if (
blockGroup?.isContextual &&
(blockGroup.blockType === 'loop' || blockGroup.blockType === 'parallel')
) {
const tagParts = tag.split('.')
if (tagParts.length === 1) {
processedTag = blockGroup.blockType
} else {
const lastPart = tagParts[tagParts.length - 1]
if (['index', 'currentItem', 'items'].includes(lastPart)) {
processedTag = `${blockGroup.blockType}.${lastPart}`
} else {
processedTag = tag
}
}
}
let newValue: string
let insertStart: number
if (lastOpenBracket === -1) {
// No '<' found - insert the full tag at cursor position
newValue = `${textBeforeCursor}<${processedTag}>${textAfterCursor}`
insertStart = liveCursor
} else {
// '<' found - replace from '<' to cursor (and consume trailing '>' if present)
const nextCloseBracket = textAfterCursor.indexOf('>')
@@ -1605,9 +1617,11 @@ export const TagDropdown: React.FC<TagDropdownProps> = ({
}
newValue = `${textBeforeCursor.slice(0, lastOpenBracket)}<${processedTag}>${remainingTextAfterCursor}`
insertStart = lastOpenBracket
}
onSelect(newValue)
const newCursorPos = insertStart + 1 + processedTag.length + 1
onSelect(newValue, newCursorPos)
onClose?.()
},
[workflowVariables, onSelect, onClose, getMergedSubBlocks]
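The recursive `findAncestorContainers` / `findAncestorParallels` walk above replaces the old single-level `containingLoop` / `containingParallel` lookups, so a block nested as loop → parallel → block now sees the outer loop's contextual tags. The core traversal can be sketched in isolation (a simplified sketch; `Container` and `collectAncestorLoops` are illustrative names, not the repository's types):

```typescript
// Sketch of the ancestor walk: collect every loop that transitively
// contains a block, walking up through parallels as well.
interface Container {
  nodes: string[]
}

function collectAncestorLoops(
  blockId: string,
  loops: Record<string, Container>,
  parallels: Record<string, Container>
): Set<string> {
  const ancestors = new Set<string>()
  const visited = new Set<string>() // guards against cycles and re-visits
  const walk = (targetId: string) => {
    if (visited.has(targetId)) return
    visited.add(targetId)
    for (const [loopId, loop] of Object.entries(loops)) {
      if (loop.nodes.includes(targetId)) {
        ancestors.add(loopId)
        walk(loopId) // a loop may itself sit inside another container
      }
    }
    for (const [parallelId, parallel] of Object.entries(parallels)) {
      if (parallel.nodes.includes(targetId)) {
        walk(parallelId) // pass through parallels to reach enclosing loops
      }
    }
  }
  walk(blockId)
  return ancestors
}
```

For a block `b` inside parallel `p` inside loop `l`, the walk visits `b → p → l` and returns `{ l }`, which the single-level lookup would have missed.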


@@ -12,10 +12,10 @@ import {
type OAuthService,
parseProvider,
} from '@/lib/oauth'
import { getMissingRequiredScopes } from '@/lib/oauth/utils'
import { OAuthRequiredModal } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/components/credential-selector/components/oauth-required-modal'
import { useOAuthCredentials } from '@/hooks/queries/oauth-credentials'
import { useCredentialRefreshTriggers } from '@/hooks/use-credential-refresh-triggers'
import { getMissingRequiredScopes } from '@/hooks/use-oauth-scope-status'
import { useWorkflowRegistry } from '@/stores/workflows/registry/store'
const getProviderIcon = (providerName: OAuthProvider) => {


@@ -1969,9 +1969,8 @@ export const ToolInput = memo(function ToolInput({
}
if (useSubBlocks && displaySubBlocks.length > 0) {
const allBlockSubBlocks = toolBlock?.subBlocks || []
const coveredParamIds = new Set(
allBlockSubBlocks.flatMap((sb) => {
displaySubBlocks.flatMap((sb) => {
const ids = [sb.id]
if (sb.canonicalParamId) ids.push(sb.canonicalParamId)
const cId = toolCanonicalIndex?.canonicalIdBySubBlockId[sb.id]


@@ -2,8 +2,11 @@
import { useMemo } from 'react'
import { useParams } from 'next/navigation'
import { SELECTOR_CONTEXT_FIELDS } from '@/lib/workflows/subblocks/context'
import type { SubBlockConfig } from '@/blocks/types'
import { extractEnvVarName, isEnvVarReference, isReference } from '@/executor/constants'
import type { SelectorContext, SelectorKey } from '@/hooks/selectors/types'
import { useEnvironmentStore } from '@/stores/settings/environment'
import { useWorkflowRegistry } from '@/stores/workflows/registry/store'
import { useDependsOnGate } from './use-depends-on-gate'
@@ -12,8 +15,7 @@ import { useDependsOnGate } from './use-depends-on-gate'
*
* Builds a `SelectorContext` by mapping each `dependsOn` entry through the
* canonical index to its `canonicalParamId`, which maps directly to
* `SelectorContext` field names (e.g. `siteId`, `teamId`, `collectionId`).
* The one special case is `oauthCredential` which maps to `credentialId`.
* `SelectorContext` field names (e.g. `siteId`, `teamId`, `oauthCredential`).
*
* @param blockId - The block containing the selector sub-block
* @param subBlock - The sub-block config (must have `selectorKey` set)
@@ -29,53 +31,58 @@ export function useSelectorSetup(
const activeWorkflowId = useWorkflowRegistry((s) => s.activeWorkflowId)
const workflowId = (params?.workflowId as string) || activeWorkflowId || ''
const envVariables = useEnvironmentStore((s) => s.variables)
const { finalDisabled, dependencyValues, canonicalIndex } = useDependsOnGate(
blockId,
subBlock,
opts
)
const resolvedDependencyValues = useMemo(() => {
const resolved: Record<string, unknown> = {}
for (const [key, value] of Object.entries(dependencyValues)) {
if (value === null || value === undefined) {
resolved[key] = value
continue
}
const str = String(value)
if (isEnvVarReference(str)) {
const varName = extractEnvVarName(str)
resolved[key] = envVariables[varName]?.value || undefined
} else {
resolved[key] = value
}
}
return resolved
}, [dependencyValues, envVariables])
const selectorContext = useMemo<SelectorContext>(() => {
const context: SelectorContext = {
workflowId,
mimeType: subBlock.mimeType,
}
for (const [depKey, value] of Object.entries(dependencyValues)) {
for (const [depKey, value] of Object.entries(resolvedDependencyValues)) {
if (value === null || value === undefined) continue
const strValue = String(value)
if (!strValue) continue
if (isReference(strValue)) continue
const canonicalParamId = canonicalIndex.canonicalIdBySubBlockId[depKey] ?? depKey
if (canonicalParamId === 'oauthCredential') {
context.credentialId = strValue
} else if (canonicalParamId in CONTEXT_FIELD_SET) {
;(context as Record<string, unknown>)[canonicalParamId] = strValue
if (SELECTOR_CONTEXT_FIELDS.has(canonicalParamId as keyof SelectorContext)) {
context[canonicalParamId as keyof SelectorContext] = strValue
}
}
return context
}, [dependencyValues, canonicalIndex, workflowId, subBlock.mimeType])
}, [resolvedDependencyValues, canonicalIndex, workflowId, subBlock.mimeType])
return {
selectorKey: (subBlock.selectorKey ?? null) as SelectorKey | null,
selectorContext,
allowSearch: subBlock.selectorAllowSearch ?? true,
disabled: finalDisabled || !subBlock.selectorKey,
dependencyValues,
dependencyValues: resolvedDependencyValues,
}
}
const CONTEXT_FIELD_SET: Record<string, true> = {
credentialId: true,
domain: true,
teamId: true,
projectId: true,
knowledgeBaseId: true,
planId: true,
siteId: true,
collectionId: true,
spreadsheetId: true,
fileId: true,
}
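For context, the env-var resolution added in the hunk above can be sketched as a standalone function. The `{{VAR}}` pattern and the `isEnvVarReference`/`extractEnvVarName` helper signatures are assumptions inferred from the diff, not the repository's actual implementations; the key behavior is the deliberate `||` (rather than `??`) so that a missing *or empty-string* env var falls back to `undefined` and gets discarded by the downstream null-check:

```typescript
// Hypothetical minimal shape of the environment store's variables map.
type EnvVars = Record<string, { value: string } | undefined>

// Assumed reference syntax: the whole value is a single {{ENV_VAR}} template.
const ENV_VAR_PATTERN = /^\{\{([A-Z0-9_]+)\}\}$/

function isEnvVarReference(value: string): boolean {
  return ENV_VAR_PATTERN.test(value)
}

function extractEnvVarName(value: string): string {
  return value.match(ENV_VAR_PATTERN)?.[1] ?? ''
}

function resolveDependencyValues(
  deps: Record<string, unknown>,
  envVariables: EnvVars
): Record<string, unknown> {
  const resolved: Record<string, unknown> = {}
  for (const [key, value] of Object.entries(deps)) {
    if (value === null || value === undefined) {
      // Preserve null/undefined unchanged; the consumer skips them.
      resolved[key] = value
      continue
    }
    const str = String(value)
    // `||` so an empty-string env value also collapses to undefined,
    // preventing an unresolved/blank credential from leaking into context.
    resolved[key] = isEnvVarReference(str)
      ? envVariables[extractEnvVarName(str)]?.value || undefined
      : value
  }
  return resolved
}
```

With this, `{{SLACK_BOT_TOKEN}}` resolves to the stored token, plain values pass through untouched, and references to missing or empty variables resolve to `undefined` rather than the raw template string.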


@@ -57,9 +57,9 @@ import { useWebhookManagement } from '@/hooks/use-webhook-management'
const SLACK_OVERRIDES: SelectorOverrides = {
transformContext: (context, deps) => {
const authMethod = deps.authMethod as string
const credentialId =
const oauthCredential =
authMethod === 'bot_token' ? String(deps.botToken ?? '') : String(deps.credential ?? '')
return { ...context, credentialId }
return { ...context, oauthCredential }
},
}


@@ -40,6 +40,10 @@ import { LoopTool } from '@/app/workspace/[workspaceId]/w/[workflowId]/component
import { ParallelTool } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/subflows/parallel/parallel-config'
import { getSubBlockStableKey } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/workflow-block/utils'
import { useCurrentWorkflow } from '@/app/workspace/[workspaceId]/w/[workflowId]/hooks'
import {
isAncestorProtected,
isBlockProtected,
} from '@/app/workspace/[workspaceId]/w/[workflowId]/utils/block-protection-utils'
import { PreviewWorkflow } from '@/app/workspace/[workspaceId]/w/components/preview'
import { getBlock } from '@/blocks/registry'
import type { SubBlockType } from '@/blocks/types'
@@ -107,12 +111,11 @@ export function Editor() {
const userPermissions = useUserPermissionsContext()
// Check if block is locked (or inside a locked container) and compute edit permission
// Check if block is locked (or inside a locked ancestor) and compute edit permission
// Locked blocks cannot be edited by anyone (admins can only lock/unlock)
const blocks = useWorkflowStore((state) => state.blocks)
const parentId = currentBlock?.data?.parentId as string | undefined
const isParentLocked = parentId ? (blocks[parentId]?.locked ?? false) : false
const isLocked = (currentBlock?.locked ?? false) || isParentLocked
const isLocked = currentBlockId ? isBlockProtected(currentBlockId, blocks) : false
const isAncestorLocked = currentBlockId ? isAncestorProtected(currentBlockId, blocks) : false
const canEditBlock = userPermissions.canEdit && !isLocked
const activeWorkflowId = useWorkflowRegistry((state) => state.activeWorkflowId)
@@ -247,10 +250,7 @@ export function Editor() {
const block = blocks[blockId]
if (!block) return
const parentId = block.data?.parentId as string | undefined
const isParentLocked = parentId ? (blocks[parentId]?.locked ?? false) : false
const isLocked = (block.locked ?? false) || isParentLocked
if (!userPermissions.canEdit || isLocked) return
if (!userPermissions.canEdit || isBlockProtected(blockId, blocks)) return
renamingBlockIdRef.current = blockId
setEditedName(block.name || '')
@@ -364,11 +364,11 @@ export function Editor() {
)}
</div>
<div className='flex shrink-0 items-center gap-[8px]'>
{/* Locked indicator - clickable to unlock if user has admin permissions, block is locked, and parent is not locked */}
{/* Locked indicator - clickable to unlock if user has admin permissions, block is locked directly, and not locked by an ancestor */}
{isLocked && currentBlock && (
<Tooltip.Root>
<Tooltip.Trigger asChild>
{userPermissions.canAdmin && currentBlock.locked && !isParentLocked ? (
{userPermissions.canAdmin && currentBlock.locked && !isAncestorLocked ? (
<Button
variant='ghost'
className='p-0'
@@ -385,8 +385,8 @@ export function Editor() {
</Tooltip.Trigger>
<Tooltip.Content side='top'>
<p>
{isParentLocked
? 'Parent container is locked'
{isAncestorLocked
? 'Ancestor container is locked'
: userPermissions.canAdmin && currentBlock.locked
? 'Unlock block'
: 'Block is locked'}


@@ -7,6 +7,7 @@ import {
splitReferenceSegment,
} from '@/lib/workflows/sanitization/references'
import { checkTagTrigger } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/components/tag-dropdown/tag-dropdown'
import { restoreCursorAfterInsertion } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/utils'
import { useAccessibleReferencePrefixes } from '@/app/workspace/[workspaceId]/w/[workflowId]/hooks/use-accessible-reference-prefixes'
import { normalizeName, REFERENCE } from '@/executor/constants'
import { createEnvVarPattern, createReferencePattern } from '@/executor/utils/reference-validation'
@@ -60,7 +61,6 @@ export function useSubflowEditor(currentBlock: BlockState | null, currentBlockId
const textareaRef = useRef<HTMLTextAreaElement | null>(null)
const editorContainerRef = useRef<HTMLDivElement>(null)
const [tempInputValue, setTempInputValue] = useState<string | null>(null)
const [showTagDropdown, setShowTagDropdown] = useState(false)
const [cursorPosition, setCursorPosition] = useState(0)
@@ -289,8 +289,9 @@ export function useSubflowEditor(currentBlock: BlockState | null, currentBlockId
* Handle tag selection from dropdown
*/
const handleSubflowTagSelect = useCallback(
(newValue: string) => {
(newValue: string, newCursorPosition: number) => {
if (!currentBlockId || !isSubflow || !currentBlock) return
collaborativeUpdateIterationCollection(
currentBlockId,
currentBlock.type as 'loop' | 'parallel',
@@ -298,12 +299,7 @@ export function useSubflowEditor(currentBlock: BlockState | null, currentBlockId
)
setShowTagDropdown(false)
setTimeout(() => {
const textarea = textareaRef.current
if (textarea) {
textarea.focus()
}
}, 0)
restoreCursorAfterInsertion(textareaRef.current, newCursorPosition)
},
[currentBlockId, isSubflow, currentBlock, collaborativeUpdateIterationCollection]
)


@@ -0,0 +1,20 @@
/**
* Restores the cursor position in a textarea after a dropdown insertion.
* Schedules a macrotask (via setTimeout) that runs after React's controlled-component commit
* so that the cursor position sticks.
*
* @param textarea - The textarea element to restore cursor in (may be null)
* @param newCursorPosition - The exact position to place the cursor at
*/
export function restoreCursorAfterInsertion(
textarea: HTMLTextAreaElement | null,
newCursorPosition: number
): void {
setTimeout(() => {
if (textarea) {
textarea.focus()
textarea.selectionStart = newCursorPosition
textarea.selectionEnd = newCursorPosition
}
}, 0)
}


@@ -67,9 +67,6 @@ const ToolbarItem = memo(function ToolbarItem({
const handleDragStart = useCallback(
(e: React.DragEvent<HTMLElement>) => {
if (!isTrigger && (item.type === 'loop' || item.type === 'parallel')) {
document.body.classList.add('sim-drag-subflow')
}
const iconElement = e.currentTarget.querySelector('.toolbar-item-icon')
onDragStart(e, item.type, isTriggerCapable, {
name: item.name,
@@ -80,12 +77,6 @@ const ToolbarItem = memo(function ToolbarItem({
[item.type, item.name, item.bgColor, isTriggerCapable, onDragStart, isTrigger]
)
const handleDragEnd = useCallback(() => {
if (!isTrigger) {
document.body.classList.remove('sim-drag-subflow')
}
}, [isTrigger])
const handleClick = useCallback(() => {
onClick(item.type, isTriggerCapable)
}, [item.type, isTriggerCapable, onClick])
@@ -114,7 +105,6 @@ const ToolbarItem = memo(function ToolbarItem({
tabIndex={-1}
draggable
onDragStart={handleDragStart}
onDragEnd={handleDragEnd}
onClick={handleClick}
onContextMenu={handleContextMenu}
className={clsx(


@@ -1,4 +1,4 @@
import { memo, useMemo, useRef } from 'react'
import { memo, useMemo } from 'react'
import { RepeatIcon, SplitIcon } from 'lucide-react'
import { Handle, type NodeProps, Position, useReactFlow } from 'reactflow'
import { Badge } from '@/components/emcn'
@@ -8,6 +8,7 @@ import { type DiffStatus, hasDiffStatus } from '@/lib/workflows/diff/types'
import { useUserPermissionsContext } from '@/app/workspace/[workspaceId]/providers/workspace-permissions-provider'
import { ActionBar } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/action-bar/action-bar'
import { useCurrentWorkflow } from '@/app/workspace/[workspaceId]/w/[workflowId]/hooks'
import { useLastRunPath } from '@/stores/execution'
import { usePanelEditorStore } from '@/stores/panel'
/**
@@ -23,6 +24,30 @@ export interface SubflowNodeData {
isPreviewSelected?: boolean
kind: 'loop' | 'parallel'
name?: string
/** Execution status passed by preview/snapshot views */
executionStatus?: 'success' | 'error' | 'not-executed'
}
const HANDLE_STYLE = {
top: `${HANDLE_POSITIONS.DEFAULT_Y_OFFSET}px`,
transform: 'translateY(-50%)',
} as const
/**
* Reusable class names for Handle components.
* Matches the styling pattern from workflow-block.tsx.
*/
const getHandleClasses = (position: 'left' | 'right') => {
const baseClasses = '!z-[10] !cursor-crosshair !border-none !transition-[colors] !duration-150'
const colorClasses = '!bg-[var(--workflow-edge)]'
const positionClasses = {
left: '!left-[-8px] !h-5 !w-[7px] !rounded-l-[2px] !rounded-r-none hover:!left-[-11px] hover:!w-[10px] hover:!rounded-l-full',
right:
'!right-[-8px] !h-5 !w-[7px] !rounded-r-[2px] !rounded-l-none hover:!right-[-11px] hover:!w-[10px] hover:!rounded-r-full',
}
return cn(baseClasses, colorClasses, positionClasses[position])
}
/**
@@ -35,7 +60,6 @@ export interface SubflowNodeData {
*/
export const SubflowNodeComponent = memo(({ data, id, selected }: NodeProps<SubflowNodeData>) => {
const { getNodes } = useReactFlow()
const blockRef = useRef<HTMLDivElement>(null)
const userPermissions = useUserPermissionsContext()
const currentWorkflow = useCurrentWorkflow()
@@ -49,13 +73,21 @@ export const SubflowNodeComponent = memo(({ data, id, selected }: NodeProps<Subf
const isLocked = currentBlock?.locked ?? false
const isPreview = data?.isPreview || false
// Focus state
const setCurrentBlockId = usePanelEditorStore((state) => state.setCurrentBlockId)
const currentBlockId = usePanelEditorStore((state) => state.currentBlockId)
const isFocused = currentBlockId === id
const isPreviewSelected = data?.isPreviewSelected || false
const lastRunPath = useLastRunPath()
const executionStatus = data.executionStatus
const runPathStatus: 'success' | 'error' | undefined =
executionStatus === 'success' || executionStatus === 'error'
? executionStatus
: isPreview
? undefined
: lastRunPath.get(id)
/**
* Calculate the nesting level of this subflow node based on its parent hierarchy.
* Used to apply appropriate styling for nested containers.
@@ -72,7 +104,7 @@ export const SubflowNodeComponent = memo(({ data, id, selected }: NodeProps<Subf
}
return level
}, [id, data?.parentId, getNodes])
}, [data?.parentId, getNodes])
const startHandleId = data.kind === 'loop' ? 'loop-start-source' : 'parallel-start-source'
const endHandleId = data.kind === 'loop' ? 'loop-end-source' : 'parallel-end-source'
@@ -80,58 +112,52 @@ export const SubflowNodeComponent = memo(({ data, id, selected }: NodeProps<Subf
const blockIconBg = data.kind === 'loop' ? '#2FB3FF' : '#FEE12B'
const blockName = data.name || (data.kind === 'loop' ? 'Loop' : 'Parallel')
/**
* Reusable styles and positioning for Handle components.
* Matches the styling pattern from workflow-block.tsx.
*/
const getHandleClasses = (position: 'left' | 'right') => {
const baseClasses = '!z-[10] !cursor-crosshair !border-none !transition-[colors] !duration-150'
const colorClasses = '!bg-[var(--workflow-edge)]'
const positionClasses = {
left: '!left-[-8px] !h-5 !w-[7px] !rounded-l-[2px] !rounded-r-none hover:!left-[-11px] hover:!w-[10px] hover:!rounded-l-full',
right:
'!right-[-8px] !h-5 !w-[7px] !rounded-r-[2px] !rounded-l-none hover:!right-[-11px] hover:!w-[10px] hover:!rounded-r-full',
}
return cn(baseClasses, colorClasses, positionClasses[position])
}
const getHandleStyle = () => {
return { top: `${HANDLE_POSITIONS.DEFAULT_Y_OFFSET}px`, transform: 'translateY(-50%)' }
}
/**
* Determine the ring styling based on subflow state priority:
* 1. Focused (selected in editor), selected (shift-click/box), or preview selected - blue ring
* 2. Diff status (version comparison) - green/orange ring
* 3. Run path status (execution result) - green/red ring
*/
const isSelected = !isPreview && selected
const hasRing =
isFocused || isSelected || isPreviewSelected || diffStatus === 'new' || diffStatus === 'edited'
const ringStyles = cn(
hasRing && 'ring-[1.75px]',
(isFocused || isSelected || isPreviewSelected) && 'ring-[var(--brand-secondary)]',
diffStatus === 'new' && 'ring-[var(--brand-tertiary-2)]',
diffStatus === 'edited' && 'ring-[var(--warning)]'
)
isFocused ||
isSelected ||
isPreviewSelected ||
diffStatus === 'new' ||
diffStatus === 'edited' ||
!!runPathStatus
/**
* Compute the ring color for the subflow selection indicator.
* Uses boxShadow (not CSS outline) to match the ring styling of regular workflow blocks.
* This works because ReactFlow renders child nodes as sibling divs at the viewport level
* (not as DOM children), so children at zIndex 1000 don't clip the parent's boxShadow.
*/
const getRingColor = (): string | undefined => {
if (!hasRing) return undefined
if (isFocused || isSelected || isPreviewSelected) return 'var(--brand-secondary)'
if (diffStatus === 'new') return 'var(--brand-tertiary-2)'
if (diffStatus === 'edited') return 'var(--warning)'
if (runPathStatus === 'success') {
return executionStatus ? 'var(--brand-tertiary-2)' : 'var(--border-success)'
}
if (runPathStatus === 'error') return 'var(--text-error)'
return undefined
}
const ringColor = getRingColor()
return (
<div className='group relative'>
<div className='group pointer-events-none relative'>
<div
ref={blockRef}
onClick={() => setCurrentBlockId(id)}
className={cn(
'workflow-drag-handle relative cursor-grab select-none rounded-[8px] border border-[var(--border-1)] [&:active]:cursor-grabbing',
'transition-block-bg transition-ring',
'z-[20]'
)}
className='relative select-none rounded-[8px] border border-[var(--border-1)] transition-block-bg'
style={{
width: data.width || 500,
height: data.height || 300,
position: 'relative',
overflow: 'visible',
pointerEvents: isPreview ? 'none' : 'all',
pointerEvents: 'none',
...(ringColor && {
boxShadow: `0 0 0 1.75px ${ringColor}`,
}),
}}
data-node-id={id}
data-type='subflowNode'
@@ -142,11 +168,11 @@ export const SubflowNodeComponent = memo(({ data, id, selected }: NodeProps<Subf
<ActionBar blockId={id} blockType={data.kind} disabled={!userPermissions.canEdit} />
)}
{/* Header Section */}
{/* Header Section — only interactive area for dragging */}
<div
className={cn(
'flex items-center justify-between rounded-t-[8px] border-[var(--border)] border-b bg-[var(--surface-2)] py-[8px] pr-[12px] pl-[8px]'
)}
onClick={() => setCurrentBlockId(id)}
className='workflow-drag-handle flex cursor-grab items-center justify-between rounded-t-[8px] border-[var(--border)] border-b bg-[var(--surface-2)] py-[8px] pr-[12px] pl-[8px] [&:active]:cursor-grabbing'
style={{ pointerEvents: 'auto' }}
>
<div className='flex min-w-0 flex-1 items-center gap-[10px]'>
<div
@@ -171,6 +197,17 @@ export const SubflowNodeComponent = memo(({ data, id, selected }: NodeProps<Subf
</div>
</div>
{/*
* Click-catching background — selects this subflow when the body area is clicked.
* No event bubbling concern: ReactFlow renders child nodes as viewport-level siblings,
* not as DOM children of this component, so child clicks never reach this div.
*/}
<div
className='absolute inset-0 top-[44px] rounded-b-[8px]'
style={{ pointerEvents: isPreview ? 'none' : 'auto' }}
onClick={() => setCurrentBlockId(id)}
/>
{!isPreview && (
<div
className='absolute right-[8px] bottom-[8px] z-20 flex h-[32px] w-[32px] cursor-se-resize items-center justify-center text-muted-foreground'
@@ -179,12 +216,9 @@ export const SubflowNodeComponent = memo(({ data, id, selected }: NodeProps<Subf
)}
<div
className='h-[calc(100%-50px)] pt-[16px] pr-[80px] pb-[16px] pl-[16px]'
className='relative h-[calc(100%-50px)] pt-[16px] pr-[80px] pb-[16px] pl-[16px]'
data-dragarea='true'
style={{
position: 'relative',
pointerEvents: isPreview ? 'none' : 'auto',
}}
style={{ pointerEvents: 'none' }}
>
{/* Subflow Start */}
<div
@@ -217,7 +251,7 @@ export const SubflowNodeComponent = memo(({ data, id, selected }: NodeProps<Subf
position={Position.Left}
className={getHandleClasses('left')}
style={{
...getHandleStyle(),
...HANDLE_STYLE,
pointerEvents: 'auto',
}}
/>
@@ -228,17 +262,11 @@ export const SubflowNodeComponent = memo(({ data, id, selected }: NodeProps<Subf
position={Position.Right}
className={getHandleClasses('right')}
style={{
...getHandleStyle(),
...HANDLE_STYLE,
pointerEvents: 'auto',
}}
id={endHandleId}
/>
{hasRing && (
<div
className={cn('pointer-events-none absolute inset-0 z-40 rounded-[8px]', ringStyles)}
/>
)}
</div>
</div>
)


@@ -60,6 +60,7 @@ import { openCopilotWithMessage } from '@/stores/notifications/utils'
import type { ConsoleEntry } from '@/stores/terminal'
import { useTerminalConsoleStore, useTerminalStore } from '@/stores/terminal'
import { useWorkflowRegistry } from '@/stores/workflows/registry/store'
import { useWorkflowStore } from '@/stores/workflows/workflow/store'
/**
* Terminal height configuration constants
@@ -68,20 +69,21 @@ const MIN_HEIGHT = TERMINAL_HEIGHT.MIN
const DEFAULT_EXPANDED_HEIGHT = TERMINAL_HEIGHT.DEFAULT
const MIN_OUTPUT_PANEL_WIDTH_PX = OUTPUT_PANEL_WIDTH.MIN
/** Returns true if any node in the subtree has an error */
function hasErrorInTree(nodes: EntryNode[]): boolean {
return nodes.some((n) => Boolean(n.entry.error) || hasErrorInTree(n.children))
const MAX_TREE_DEPTH = 50
function hasMatchInTree(
nodes: EntryNode[],
predicate: (e: ConsoleEntry) => boolean,
depth = 0
): boolean {
if (depth >= MAX_TREE_DEPTH) return false
return nodes.some((n) => predicate(n.entry) || hasMatchInTree(n.children, predicate, depth + 1))
}
/** Returns true if any node in the subtree is currently running */
function hasRunningInTree(nodes: EntryNode[]): boolean {
return nodes.some((n) => Boolean(n.entry.isRunning) || hasRunningInTree(n.children))
}
/** Returns true if any node in the subtree was canceled */
function hasCanceledInTree(nodes: EntryNode[]): boolean {
return nodes.some((n) => Boolean(n.entry.isCanceled) || hasCanceledInTree(n.children))
}
const hasErrorInTree = (nodes: EntryNode[]) => hasMatchInTree(nodes, (e) => Boolean(e.error))
const hasRunningInTree = (nodes: EntryNode[]) => hasMatchInTree(nodes, (e) => Boolean(e.isRunning))
const hasCanceledInTree = (nodes: EntryNode[]) =>
hasMatchInTree(nodes, (e) => Boolean(e.isCanceled))
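The refactor above collapses three near-identical recursive walks into one depth-capped generic search. A self-contained sketch of the same pattern (the `EntryNode` shape is reduced here to just the fields the predicates touch, which is an assumption for illustration):

```typescript
// Minimal stand-ins for the real ConsoleEntry/EntryNode types.
interface Entry {
  error?: string
  isRunning?: boolean
  isCanceled?: boolean
}
interface EntryNode {
  entry: Entry
  children: EntryNode[]
}

// Guard against pathological nesting (or an accidental cycle) blowing the stack.
const MAX_TREE_DEPTH = 50

// Returns true if any node within MAX_TREE_DEPTH levels satisfies the predicate.
function hasMatchInTree(
  nodes: EntryNode[],
  predicate: (e: Entry) => boolean,
  depth = 0
): boolean {
  if (depth >= MAX_TREE_DEPTH) return false
  return nodes.some((n) => predicate(n.entry) || hasMatchInTree(n.children, predicate, depth + 1))
}

// The three original helpers become one-line specializations:
const hasErrorInTree = (nodes: EntryNode[]) => hasMatchInTree(nodes, (e) => Boolean(e.error))
const hasRunningInTree = (nodes: EntryNode[]) => hasMatchInTree(nodes, (e) => Boolean(e.isRunning))
```

The depth cap trades exhaustive correctness for safety: a match buried deeper than 50 levels is reported as absent, which is acceptable for UI status badges where such trees should never occur.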
/**
* Block row component for displaying actual block entries
@@ -263,28 +265,21 @@ const SubflowNodeRow = memo(function SubflowNodeRow({
}) {
const { entry, children } = node
const BlockIcon = getBlockIcon(entry.blockType)
const hasError =
Boolean(entry.error) ||
children.some((c) => c.entry.error || c.children.some((gc) => gc.entry.error))
const hasError = Boolean(entry.error) || hasErrorInTree(children)
const bgColor = getBlockColor(entry.blockType)
const nodeId = entry.id
const isExpanded = expandedNodes.has(nodeId)
const hasChildren = children.length > 0
// Check if any nested block is running or canceled
const hasRunningDescendant = children.some(
(c) => c.entry.isRunning || c.children.some((gc) => gc.entry.isRunning)
)
const hasCanceledDescendant =
children.some((c) => c.entry.isCanceled || c.children.some((gc) => gc.entry.isCanceled)) &&
!hasRunningDescendant
// Check if any nested block is running or canceled (recursive for arbitrary nesting depth)
const hasRunningDescendant = hasRunningInTree(children)
const hasCanceledDescendant = hasCanceledInTree(children) && !hasRunningDescendant
const displayName =
entry.blockType === 'loop'
? 'Loop'
: entry.blockType === 'parallel'
? 'Parallel'
: entry.blockName
const containerId = entry.iterationContainerId
const storeBlockName = useWorkflowStore((state) =>
containerId ? state.blocks[containerId]?.name : undefined
)
const displayName = storeBlockName || entry.blockName
return (
<div className='flex min-w-0 flex-col'>


@@ -0,0 +1,478 @@
/**
* @vitest-environment node
*/
import { describe, expect, it, vi } from 'vitest'
vi.mock('@/blocks', () => ({
getBlock: vi.fn().mockReturnValue(null),
}))
vi.mock('@/executor/constants', () => ({
isWorkflowBlockType: vi.fn((blockType: string | undefined) => {
return blockType === 'workflow' || blockType === 'workflow_input'
}),
}))
vi.mock('@/stores/constants', () => ({
TERMINAL_BLOCK_COLUMN_WIDTH: { MIN: 120, DEFAULT: 200, MAX: 400 },
}))
import type { ConsoleEntry } from '@/stores/terminal'
import { buildEntryTree, type EntryNode, groupEntriesByExecution } from './utils'
let entryCounter = 0
function makeEntry(overrides: Partial<ConsoleEntry>): ConsoleEntry {
return {
id: overrides.id ?? `entry-${++entryCounter}`,
timestamp: overrides.timestamp ?? '2025-01-01T00:00:00Z',
workflowId: overrides.workflowId ?? 'wf-1',
blockId: overrides.blockId ?? 'block-1',
blockName: overrides.blockName ?? 'Block',
blockType: overrides.blockType ?? 'function',
executionId: overrides.executionId ?? 'exec-1',
startedAt: overrides.startedAt ?? '2025-01-01T00:00:00Z',
executionOrder: overrides.executionOrder ?? 0,
...overrides,
} as ConsoleEntry
}
/** Collect all nodes from a tree depth-first */
function collectAllNodes(nodes: EntryNode[]): EntryNode[] {
const result: EntryNode[] = []
for (const node of nodes) {
result.push(node)
result.push(...collectAllNodes(node.children))
}
return result
}
/**
* Creates entries for a parallel-in-loop scenario.
* All Function 1 entries are nestedIterationEntries (have parentIterations).
* No topLevelIterationEntries exist (sentinels don't emit SSE events).
*/
function makeParallelInLoopEntries(
loopIterations: number,
parallelBranches: number
): ConsoleEntry[] {
const entries: ConsoleEntry[] = []
let order = 1
for (let loopIter = 0; loopIter < loopIterations; loopIter++) {
for (let branch = 0; branch < parallelBranches; branch++) {
entries.push(
makeEntry({
blockId: 'function-1',
blockName: 'Function 1',
executionOrder: order++,
startedAt: new Date(Date.UTC(2025, 0, 1, 0, 0, loopIter * 10 + branch)).toISOString(),
endedAt: new Date(Date.UTC(2025, 0, 1, 0, 0, loopIter * 10 + branch + 1)).toISOString(),
durationMs: 50,
iterationType: 'parallel',
iterationCurrent: branch,
iterationTotal: parallelBranches,
iterationContainerId: 'parallel-1',
parentIterations: [
{
iterationType: 'loop',
iterationCurrent: loopIter,
iterationTotal: loopIterations,
iterationContainerId: 'loop-1',
},
],
})
)
}
}
return entries
}
describe('buildEntryTree', () => {
describe('simple loop (no nesting)', () => {
it('groups entries by loop iteration', () => {
const entries: ConsoleEntry[] = []
for (let iter = 0; iter < 3; iter++) {
entries.push(
makeEntry({
blockId: 'function-1',
blockName: 'Function 1',
executionOrder: iter + 1,
iterationType: 'loop',
iterationCurrent: iter,
iterationTotal: 3,
iterationContainerId: 'loop-1',
})
)
}
const tree = buildEntryTree(entries)
const subflows = tree.filter((n) => n.nodeType === 'subflow')
expect(subflows).toHaveLength(1)
expect(subflows[0].entry.blockType).toBe('loop')
expect(subflows[0].children).toHaveLength(3)
for (let i = 0; i < 3; i++) {
expect(subflows[0].children[i].iterationInfo?.current).toBe(i)
expect(subflows[0].children[i].children).toHaveLength(1)
expect(subflows[0].children[i].children[0].entry.blockId).toBe('function-1')
}
})
})
describe('simple parallel (no nesting)', () => {
it('groups entries by parallel branch', () => {
const entries: ConsoleEntry[] = []
for (let branch = 0; branch < 4; branch++) {
entries.push(
makeEntry({
blockId: 'function-1',
blockName: 'Function 1',
executionOrder: branch + 1,
iterationType: 'parallel',
iterationCurrent: branch,
iterationTotal: 4,
iterationContainerId: 'parallel-1',
})
)
}
const tree = buildEntryTree(entries)
const subflows = tree.filter((n) => n.nodeType === 'subflow')
expect(subflows).toHaveLength(1)
expect(subflows[0].entry.blockType).toBe('parallel')
expect(subflows[0].children).toHaveLength(4)
})
})
describe('parallel-in-loop', () => {
it('creates all loop iterations (5 loop × 5 parallel)', () => {
const entries = makeParallelInLoopEntries(5, 5)
expect(entries).toHaveLength(25)
const tree = buildEntryTree(entries)
// Top level: 1 subflow (Loop)
const subflows = tree.filter((n) => n.nodeType === 'subflow')
expect(subflows).toHaveLength(1)
expect(subflows[0].entry.blockType).toBe('loop')
// Loop has 5 iteration children
const loopIterations = subflows[0].children
expect(loopIterations).toHaveLength(5)
for (let loopIter = 0; loopIter < 5; loopIter++) {
const iterNode = loopIterations[loopIter]
expect(iterNode.nodeType).toBe('iteration')
expect(iterNode.iterationInfo?.current).toBe(loopIter)
expect(iterNode.iterationInfo?.total).toBe(5)
// Each loop iteration has 1 nested subflow (Parallel)
const parallelSubflows = iterNode.children.filter((n) => n.nodeType === 'subflow')
expect(parallelSubflows).toHaveLength(1)
expect(parallelSubflows[0].entry.blockType).toBe('parallel')
// Each parallel has 5 branch iterations
const branches = parallelSubflows[0].children
expect(branches).toHaveLength(5)
for (let branch = 0; branch < 5; branch++) {
expect(branches[branch].iterationInfo?.current).toBe(branch)
expect(branches[branch].children).toHaveLength(1)
expect(branches[branch].children[0].entry.blockId).toBe('function-1')
}
}
})
it('preserves all block entries in the tree (no silently dropped entries)', () => {
const entries = makeParallelInLoopEntries(5, 5)
const tree = buildEntryTree(entries)
const allNodes = collectAllNodes(tree)
const blocks = allNodes.filter(
(n) => n.nodeType === 'block' && n.entry.blockId === 'function-1'
)
expect(blocks).toHaveLength(25)
})
it('works with a regular block alongside', () => {
const entries = [
makeEntry({
blockId: 'start-1',
blockName: 'Start',
blockType: 'starter',
executionOrder: 0,
}),
...makeParallelInLoopEntries(3, 2),
]
const tree = buildEntryTree(entries)
const regularBlocks = tree.filter((n) => n.nodeType === 'block')
const subflows = tree.filter((n) => n.nodeType === 'subflow')
expect(regularBlocks).toHaveLength(1)
expect(regularBlocks[0].entry.blockId).toBe('start-1')
expect(subflows).toHaveLength(1)
expect(subflows[0].children).toHaveLength(3)
})
it('works when some iterations also have topLevelIterationEntries', () => {
const entries: ConsoleEntry[] = [
// Real top-level entry for loop iteration 0 (from a container event)
makeEntry({
blockId: 'parallel-container',
blockName: 'Parallel',
blockType: 'parallel',
executionOrder: 100,
iterationType: 'loop',
iterationCurrent: 0,
iterationTotal: 3,
iterationContainerId: 'loop-1',
}),
...makeParallelInLoopEntries(3, 2),
]
const tree = buildEntryTree(entries)
const subflows = tree.filter((n) => n.nodeType === 'subflow')
expect(subflows).toHaveLength(1)
// All 3 loop iterations must exist
expect(subflows[0].children).toHaveLength(3)
// All 6 Function 1 blocks should appear somewhere in the tree
const allNodes = collectAllNodes(tree)
const fnBlocks = allNodes.filter(
(n) => n.nodeType === 'block' && n.entry.blockId === 'function-1'
)
expect(fnBlocks).toHaveLength(6)
})
it('handles 2 loop × 3 parallel', () => {
const entries = makeParallelInLoopEntries(2, 3)
const tree = buildEntryTree(entries)
const subflows = tree.filter((n) => n.nodeType === 'subflow')
expect(subflows).toHaveLength(1)
expect(subflows[0].children).toHaveLength(2)
const allNodes = collectAllNodes(tree)
const blocks = allNodes.filter(
(n) => n.nodeType === 'block' && n.entry.blockId === 'function-1'
)
expect(blocks).toHaveLength(6)
})
})
describe('loop-in-parallel', () => {
it('creates all parallel branches with nested loop iterations', () => {
const entries: ConsoleEntry[] = []
let order = 1
for (let branch = 0; branch < 3; branch++) {
for (let loopIter = 0; loopIter < 2; loopIter++) {
entries.push(
makeEntry({
blockId: 'function-1',
blockName: 'Function 1',
executionOrder: order++,
iterationType: 'loop',
iterationCurrent: loopIter,
iterationTotal: 2,
iterationContainerId: 'loop-1',
parentIterations: [
{
iterationType: 'parallel',
iterationCurrent: branch,
iterationTotal: 3,
iterationContainerId: 'parallel-1',
},
],
})
)
}
}
const tree = buildEntryTree(entries)
const subflows = tree.filter((n) => n.nodeType === 'subflow')
expect(subflows).toHaveLength(1)
expect(subflows[0].entry.blockType).toBe('parallel')
// 3 parallel branches
const branches = subflows[0].children
expect(branches).toHaveLength(3)
for (let branch = 0; branch < 3; branch++) {
const branchNode = branches[branch]
expect(branchNode.iterationInfo?.current).toBe(branch)
// Each branch has a nested loop subflow
const nestedSubflows = branchNode.children.filter((n) => n.nodeType === 'subflow')
expect(nestedSubflows).toHaveLength(1)
expect(nestedSubflows[0].entry.blockType).toBe('loop')
// Each loop has 2 iterations
expect(nestedSubflows[0].children).toHaveLength(2)
}
})
})
describe('loop-in-loop', () => {
it('creates outer and inner loop iterations', () => {
const entries: ConsoleEntry[] = []
let order = 1
for (let outer = 0; outer < 2; outer++) {
for (let inner = 0; inner < 3; inner++) {
entries.push(
makeEntry({
blockId: 'function-1',
blockName: 'Function 1',
executionOrder: order++,
iterationType: 'loop',
iterationCurrent: inner,
iterationTotal: 3,
iterationContainerId: 'inner-loop',
parentIterations: [
{
iterationType: 'loop',
iterationCurrent: outer,
iterationTotal: 2,
iterationContainerId: 'outer-loop',
},
],
})
)
}
}
const tree = buildEntryTree(entries)
const subflows = tree.filter((n) => n.nodeType === 'subflow')
expect(subflows).toHaveLength(1)
expect(subflows[0].entry.blockType).toBe('loop')
// Outer loop: 2 iterations
expect(subflows[0].children).toHaveLength(2)
for (let outer = 0; outer < 2; outer++) {
const outerIter = subflows[0].children[outer]
expect(outerIter.iterationInfo?.current).toBe(outer)
// Each outer iteration has an inner loop
const innerSubflows = outerIter.children.filter((n) => n.nodeType === 'subflow')
expect(innerSubflows).toHaveLength(1)
expect(innerSubflows[0].children).toHaveLength(3)
}
// All 6 blocks present
const allNodes = collectAllNodes(tree)
const blocks = allNodes.filter((n) => n.nodeType === 'block')
expect(blocks).toHaveLength(6)
})
})
describe('parallel-in-parallel', () => {
it('creates outer and inner parallel branches', () => {
const entries: ConsoleEntry[] = []
let order = 1
for (let outer = 0; outer < 2; outer++) {
for (let inner = 0; inner < 3; inner++) {
entries.push(
makeEntry({
blockId: 'function-1',
blockName: 'Function 1',
executionOrder: order++,
iterationType: 'parallel',
iterationCurrent: inner,
iterationTotal: 3,
iterationContainerId: 'inner-parallel',
parentIterations: [
{
iterationType: 'parallel',
iterationCurrent: outer,
iterationTotal: 2,
iterationContainerId: 'outer-parallel',
},
],
})
)
}
}
const tree = buildEntryTree(entries)
const subflows = tree.filter((n) => n.nodeType === 'subflow')
expect(subflows).toHaveLength(1)
expect(subflows[0].entry.blockType).toBe('parallel')
// 2 outer branches
expect(subflows[0].children).toHaveLength(2)
for (let outer = 0; outer < 2; outer++) {
const outerBranch = subflows[0].children[outer]
const innerSubflows = outerBranch.children.filter((n) => n.nodeType === 'subflow')
expect(innerSubflows).toHaveLength(1)
expect(innerSubflows[0].children).toHaveLength(3)
}
const allNodes = collectAllNodes(tree)
const blocks = allNodes.filter((n) => n.nodeType === 'block')
expect(blocks).toHaveLength(6)
})
})
})
describe('groupEntriesByExecution', () => {
it('builds tree for parallel-in-loop via groupEntriesByExecution', () => {
const entries = makeParallelInLoopEntries(3, 2)
const groups = groupEntriesByExecution(entries)
expect(groups).toHaveLength(1)
const entryTree = groups[0].entryTree
const subflows = entryTree.filter((n) => n.nodeType === 'subflow')
expect(subflows).toHaveLength(1)
expect(subflows[0].children).toHaveLength(3)
})
it('handles workflow child entries alongside iteration entries', () => {
const entries: ConsoleEntry[] = [
makeEntry({
id: 'start-entry',
blockId: 'start',
blockName: 'Start',
blockType: 'start_trigger',
executionOrder: 0,
}),
makeEntry({
id: 'workflow-block',
blockId: 'wf-block-1',
blockName: 'My Sub-Workflow',
blockType: 'workflow',
executionOrder: 1,
}),
makeEntry({
id: 'child-block-1',
blockId: 'child-func',
blockName: 'Child Function',
blockType: 'function',
executionOrder: 2,
childWorkflowBlockId: 'wf-block-1',
childWorkflowName: 'Child Workflow',
childWorkflowInstanceId: 'instance-1',
}),
]
const groups = groupEntriesByExecution(entries)
expect(groups).toHaveLength(1)
const tree = groups[0].entryTree
expect(tree.length).toBeGreaterThanOrEqual(2)
const startNode = tree.find((n) => n.entry.blockType === 'start_trigger')
expect(startNode).toBeDefined()
// Child entry should be nested under workflow block, not at top level
const topLevelChild = tree.find((n) => n.entry.blockId === 'child-func')
expect(topLevelChild).toBeUndefined()
})
})
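The fixtures above encode nesting through `parentIterations`: an entry inside an inner parallel carries its own iteration fields plus one ancestor record per enclosing container. A minimal standalone sketch of that shape (field names taken from the fixtures; the types and the `nestingDepth` helper are illustrative, not the real `ConsoleEntry`):

```typescript
// Illustrative shapes mirroring the test fixtures above.
interface IterationRef {
  iterationType: 'loop' | 'parallel'
  iterationCurrent: number
  iterationTotal?: number
  iterationContainerId: string
}

interface EntrySketch {
  blockId: string
  iterationType?: 'loop' | 'parallel'
  iterationCurrent?: number
  iterationContainerId?: string
  parentIterations?: IterationRef[]
}

// Depth = one level for the entry's own iteration (if any),
// plus one level per ancestor container in parentIterations.
function nestingDepth(e: EntrySketch): number {
  const own = e.iterationType !== undefined ? 1 : 0
  return own + (e.parentIterations?.length ?? 0)
}

const inner: EntrySketch = {
  blockId: 'function-1',
  iterationType: 'parallel',
  iterationCurrent: 2,
  iterationContainerId: 'inner-parallel',
  parentIterations: [
    {
      iterationType: 'parallel',
      iterationCurrent: 0,
      iterationTotal: 2,
      iterationContainerId: 'outer-parallel',
    },
  ],
}

nestingDepth(inner) // 2: the inner parallel plus the outer one
```

This is why the parallel-in-parallel test emits `parentIterations` with exactly one element: the tree builder only needs the immediate ancestor chain to place the entry.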

View File

@@ -18,10 +18,9 @@ import type { ConsoleEntry } from '@/stores/terminal'
const SUBFLOW_COLORS = {
loop: '#2FB3FF',
parallel: '#FEE12B',
workflow: '#8b5cf6',
} as const
const WORKFLOW_COLOR = '#8b5cf6'
/**
* Special block type colors for errors and system messages
*/
@@ -86,7 +85,7 @@ export function getBlockColor(blockType: string): string {
return SUBFLOW_COLORS.parallel
}
if (blockType === 'workflow') {
return WORKFLOW_COLOR
return SUBFLOW_COLORS.workflow
}
// Special block types for errors and system messages
if (blockType === 'error') {
@@ -126,14 +125,6 @@ export function isEventFromEditableElement(e: KeyboardEvent): boolean {
return false
}
/**
* Checks if a block type is a subflow (loop or parallel)
*/
export function isSubflowBlockType(blockType: string): boolean {
const lower = blockType?.toLowerCase() || ''
return lower === 'loop' || lower === 'parallel'
}
/**
* Node type for the tree structure
*/
@@ -221,27 +212,26 @@ function collectWorkflowDescendants(
* that executed within each iteration.
* Sorts by start time to ensure chronological order.
*/
function buildEntryTree(entries: ConsoleEntry[]): EntryNode[] {
// Separate entries into three buckets:
// 1. Iteration entries (loop/parallel children)
// 2. Workflow child entries (blocks inside a child workflow)
// 3. Regular blocks
export function buildEntryTree(entries: ConsoleEntry[], idPrefix = ''): EntryNode[] {
const regularBlocks: ConsoleEntry[] = []
const iterationEntries: ConsoleEntry[] = []
const topLevelIterationEntries: ConsoleEntry[] = []
const nestedIterationEntries: ConsoleEntry[] = []
const workflowChildEntries: ConsoleEntry[] = []
for (const entry of entries) {
if (entry.childWorkflowBlockId) {
// Child workflow entries take priority over iteration classification
workflowChildEntries.push(entry)
} else if (entry.iterationType && entry.iterationCurrent !== undefined) {
iterationEntries.push(entry)
if (entry.parentIterations && entry.parentIterations.length > 0) {
nestedIterationEntries.push(entry)
} else {
topLevelIterationEntries.push(entry)
}
} else {
regularBlocks.push(entry)
}
}
// Group workflow child entries by the parent workflow block ID
const workflowChildGroups = new Map<string, ConsoleEntry[]>()
for (const entry of workflowChildEntries) {
const parentId = entry.childWorkflowBlockId!
@@ -253,9 +243,8 @@ function buildEntryTree(entries: ConsoleEntry[]): EntryNode[] {
}
}
// Group iteration entries by (iterationType, iterationContainerId, iterationCurrent)
const iterationGroupsMap = new Map<string, IterationGroup>()
for (const entry of iterationEntries) {
for (const entry of topLevelIterationEntries) {
const iterationContainerId = entry.iterationContainerId || 'unknown'
const key = `${entry.iterationType}-${iterationContainerId}-${entry.iterationCurrent}`
let group = iterationGroupsMap.get(key)
@@ -272,11 +261,9 @@ function buildEntryTree(entries: ConsoleEntry[]): EntryNode[] {
}
iterationGroupsMap.set(key, group)
} else {
// Update start time to earliest
if (entryStartMs < group.startTimeMs) {
group.startTimeMs = entryStartMs
}
// Update total if available
if (entry.iterationTotal !== undefined) {
group.iterationTotal = entry.iterationTotal
}
@@ -284,12 +271,10 @@ function buildEntryTree(entries: ConsoleEntry[]): EntryNode[] {
group.blocks.push(entry)
}
// Sort blocks within each iteration by executionOrder ascending (oldest first, top-down)
for (const group of iterationGroupsMap.values()) {
group.blocks.sort((a, b) => a.executionOrder - b.executionOrder)
}
// Group iterations by (iterationType, iterationContainerId) to create subflow parents
const subflowGroups = new Map<
string,
{ iterationType: string; iterationContainerId: string; groups: IterationGroup[] }
@@ -308,112 +293,218 @@ function buildEntryTree(entries: ConsoleEntry[]): EntryNode[] {
subflowGroup.groups.push(group)
}
// Sort iterations within each subflow by iteration number
for (const subflowGroup of subflowGroups.values()) {
subflowGroup.groups.sort((a, b) => a.iterationCurrent - b.iterationCurrent)
}
// Build subflow nodes with iteration children
// Create synthetic parent subflow groups for orphaned nested iteration entries.
// Nested subflow containers (e.g., inner parallel inside outer parallel) may not
// have store entries if no block:started event was emitted for them. Without a
// parent subflow group, their child entries would be silently dropped from the tree.
// Check at the iteration level (not container level) so that existing iterations
// from topLevelIterationEntries don't block synthetic creation for other iterations
// of the same container (e.g., loop iterations 1-4 when iteration 0 already exists).
const syntheticIterations = new Map<string, IterationGroup>()
for (const entry of nestedIterationEntries) {
const parent = entry.parentIterations?.[0]
if (!parent?.iterationContainerId) {
continue
}
// Only skip if this specific iteration already has a group from topLevelIterationEntries
const iterKey = `${parent.iterationType}-${parent.iterationContainerId}-${parent.iterationCurrent}`
if (iterationGroupsMap.has(iterKey)) {
continue
}
const entryMs = new Date(entry.startedAt || entry.timestamp).getTime()
if (!syntheticIterations.has(iterKey)) {
syntheticIterations.set(iterKey, {
iterationType: parent.iterationType!,
iterationContainerId: parent.iterationContainerId!,
iterationCurrent: parent.iterationCurrent!,
iterationTotal: parent.iterationTotal,
blocks: [],
startTimeMs: entryMs,
})
} else {
const existing = syntheticIterations.get(iterKey)!
if (entryMs < existing.startTimeMs) {
existing.startTimeMs = entryMs
}
}
}
const syntheticSubflows = new Map<
string,
{ iterationType: string; iterationContainerId: string; groups: IterationGroup[] }
>()
for (const iterGroup of syntheticIterations.values()) {
const subflowKey = `${iterGroup.iterationType}-${iterGroup.iterationContainerId}`
let subflow = syntheticSubflows.get(subflowKey)
if (!subflow) {
subflow = {
iterationType: iterGroup.iterationType,
iterationContainerId: iterGroup.iterationContainerId,
groups: [],
}
syntheticSubflows.set(subflowKey, subflow)
}
subflow.groups.push(iterGroup)
}
for (const subflow of syntheticSubflows.values()) {
const key = `${subflow.iterationType}-${subflow.iterationContainerId}`
const existing = subflowGroups.get(key)
if (existing) {
// Merge synthetic iteration groups into the existing subflow group
existing.groups.push(...subflow.groups)
existing.groups.sort((a, b) => a.iterationCurrent - b.iterationCurrent)
} else {
subflow.groups.sort((a, b) => a.iterationCurrent - b.iterationCurrent)
subflowGroups.set(key, subflow)
}
}
const subflowNodes: EntryNode[] = []
for (const subflowGroup of subflowGroups.values()) {
const { iterationType, iterationContainerId, groups: iterationGroups } = subflowGroup
// Calculate subflow timing from all its iterations
const firstIteration = iterationGroups[0]
const allBlocks = iterationGroups.flatMap((g) => g.blocks)
const subflowStartMs = Math.min(
...allBlocks.map((b) => new Date(b.startedAt || b.timestamp).getTime())
)
const nestedForThisSubflow = nestedIterationEntries.filter((e) => {
const parent = e.parentIterations?.[0]
return parent && parent.iterationContainerId === iterationContainerId
})
const allDirectBlocks = iterationGroups.flatMap((g) => g.blocks)
const allRelevantBlocks = [...allDirectBlocks, ...nestedForThisSubflow]
if (allRelevantBlocks.length === 0) continue
const timestamps = allRelevantBlocks.map((b) => new Date(b.startedAt || b.timestamp).getTime())
const subflowStartMs = Math.min(...timestamps)
const subflowEndMs = Math.max(
...allBlocks.map((b) => new Date(b.endedAt || b.timestamp).getTime())
...allRelevantBlocks.map((b) => new Date(b.endedAt || b.timestamp).getTime())
)
const totalDuration = allBlocks.reduce((sum, b) => sum + (b.durationMs || 0), 0)
// Parallel branches run concurrently — use wall-clock time. Loop iterations run serially — use sum.
const totalDuration = allRelevantBlocks.reduce((sum, b) => sum + (b.durationMs || 0), 0)
const subflowDuration =
iterationType === 'parallel' ? subflowEndMs - subflowStartMs : totalDuration
// Create synthetic subflow parent entry
// Use the minimum executionOrder from all child blocks for proper ordering
const subflowExecutionOrder = Math.min(...allBlocks.map((b) => b.executionOrder))
const subflowExecutionOrder = Math.min(...allRelevantBlocks.map((b) => b.executionOrder))
const metadataSource = allRelevantBlocks[0]
const syntheticSubflow: ConsoleEntry = {
id: `subflow-${iterationType}-${iterationContainerId}-${firstIteration.blocks[0]?.executionId || 'unknown'}`,
id: `${idPrefix}subflow-${iterationType}-${iterationContainerId}-${metadataSource.executionId || 'unknown'}`,
timestamp: new Date(subflowStartMs).toISOString(),
workflowId: firstIteration.blocks[0]?.workflowId || '',
workflowId: metadataSource.workflowId || '',
blockId: `${iterationType}-container-${iterationContainerId}`,
blockName: iterationType.charAt(0).toUpperCase() + iterationType.slice(1),
blockType: iterationType,
executionId: firstIteration.blocks[0]?.executionId,
executionId: metadataSource.executionId,
startedAt: new Date(subflowStartMs).toISOString(),
executionOrder: subflowExecutionOrder,
endedAt: new Date(subflowEndMs).toISOString(),
durationMs: subflowDuration,
success: !allBlocks.some((b) => b.error),
success: !allRelevantBlocks.some((b) => b.error),
iterationContainerId,
}
// Build iteration child nodes
const iterationNodes: EntryNode[] = iterationGroups.map((iterGroup) => {
// Create synthetic iteration entry
const iterBlocks = iterGroup.blocks
const iterStartMs = Math.min(
...iterBlocks.map((b) => new Date(b.startedAt || b.timestamp).getTime())
)
const iterEndMs = Math.max(
...iterBlocks.map((b) => new Date(b.endedAt || b.timestamp).getTime())
)
const iterDuration = iterBlocks.reduce((sum, b) => sum + (b.durationMs || 0), 0)
// Parallel branches run concurrently — use wall-clock time. Loop iterations run serially — use sum.
const iterDisplayDuration =
iterationType === 'parallel' ? iterEndMs - iterStartMs : iterDuration
const iterationNodes: EntryNode[] = iterationGroups
.map((iterGroup): EntryNode | null => {
const matchingNestedEntries = nestedForThisSubflow.filter((e) => {
const parent = e.parentIterations?.[0]
return parent?.iterationCurrent === iterGroup.iterationCurrent
})
// Use the minimum executionOrder from blocks in this iteration
const iterExecutionOrder = Math.min(...iterBlocks.map((b) => b.executionOrder))
const syntheticIteration: ConsoleEntry = {
id: `iteration-${iterationType}-${iterGroup.iterationContainerId}-${iterGroup.iterationCurrent}-${iterBlocks[0]?.executionId || 'unknown'}`,
timestamp: new Date(iterStartMs).toISOString(),
workflowId: iterBlocks[0]?.workflowId || '',
blockId: `iteration-${iterGroup.iterationContainerId}-${iterGroup.iterationCurrent}`,
blockName: `Iteration ${iterGroup.iterationCurrent}${iterGroup.iterationTotal !== undefined ? ` / ${iterGroup.iterationTotal}` : ''}`,
blockType: iterationType,
executionId: iterBlocks[0]?.executionId,
startedAt: new Date(iterStartMs).toISOString(),
executionOrder: iterExecutionOrder,
endedAt: new Date(iterEndMs).toISOString(),
durationMs: iterDisplayDuration,
success: !iterBlocks.some((b) => b.error),
iterationCurrent: iterGroup.iterationCurrent,
iterationTotal: iterGroup.iterationTotal,
iterationType: iterationType as 'loop' | 'parallel',
iterationContainerId: iterGroup.iterationContainerId,
}
const strippedNestedEntries: ConsoleEntry[] = matchingNestedEntries.map((e) => ({
...e,
parentIterations:
e.parentIterations && e.parentIterations.length > 1
? e.parentIterations.slice(1)
: undefined,
}))
// Block nodes within this iteration — workflow blocks get their full subtree
const blockNodes: EntryNode[] = iterBlocks.map((block) => {
if (isWorkflowBlockType(block.blockType)) {
const instanceKey = block.childWorkflowInstanceId ?? block.blockId
const allDescendants = collectWorkflowDescendants(instanceKey, workflowChildGroups)
const rawChildren = allDescendants.map((c) => ({
...c,
childWorkflowBlockId:
c.childWorkflowBlockId === instanceKey ? undefined : c.childWorkflowBlockId,
}))
return {
entry: block,
children: buildEntryTree(rawChildren),
nodeType: 'workflow' as const,
}
const iterBlocks = iterGroup.blocks
const allIterEntries = [...iterBlocks, ...strippedNestedEntries]
if (allIterEntries.length === 0) return null
const iterStartMs = Math.min(
...allIterEntries.map((b) => new Date(b.startedAt || b.timestamp).getTime())
)
const iterEndMs = Math.max(
...allIterEntries.map((b) => new Date(b.endedAt || b.timestamp).getTime())
)
const iterDuration = allIterEntries.reduce((sum, b) => sum + (b.durationMs || 0), 0)
const iterDisplayDuration =
iterationType === 'parallel' ? iterEndMs - iterStartMs : iterDuration
const iterExecutionOrder = Math.min(...allIterEntries.map((b) => b.executionOrder))
const iterMetadataSource = allIterEntries[0]
const syntheticIteration: ConsoleEntry = {
id: `${idPrefix}iteration-${iterationType}-${iterGroup.iterationContainerId}-${iterGroup.iterationCurrent}-${iterMetadataSource.executionId || 'unknown'}`,
timestamp: new Date(iterStartMs).toISOString(),
workflowId: iterMetadataSource.workflowId || '',
blockId: `iteration-${iterGroup.iterationContainerId}-${iterGroup.iterationCurrent}`,
blockName: `Iteration ${iterGroup.iterationCurrent}${iterGroup.iterationTotal !== undefined ? ` / ${iterGroup.iterationTotal}` : ''}`,
blockType: iterationType,
executionId: iterMetadataSource.executionId,
startedAt: new Date(iterStartMs).toISOString(),
executionOrder: iterExecutionOrder,
endedAt: new Date(iterEndMs).toISOString(),
durationMs: iterDisplayDuration,
success: !allIterEntries.some((b) => b.error),
iterationCurrent: iterGroup.iterationCurrent,
iterationTotal: iterGroup.iterationTotal,
iterationType: iterationType as 'loop' | 'parallel',
iterationContainerId: iterGroup.iterationContainerId,
}
return { entry: block, children: [], nodeType: 'block' as const }
})
return {
entry: syntheticIteration,
children: blockNodes,
nodeType: 'iteration' as const,
iterationInfo: {
current: iterGroup.iterationCurrent,
total: iterGroup.iterationTotal,
},
}
})
const childPrefix = `${idPrefix}${iterationContainerId}-${iterGroup.iterationCurrent}-`
const nestedSubflowNodes =
strippedNestedEntries.length > 0 ? buildEntryTree(strippedNestedEntries, childPrefix) : []
// Filter out container completion events when matching nested subflow nodes exist,
// to avoid duplicating them as both a flat block row and an expandable subflow.
const hasNestedSubflows = nestedSubflowNodes.length > 0
const blockNodes: EntryNode[] = iterBlocks
.filter((block) => {
if (
hasNestedSubflows &&
(block.blockType === 'loop' || block.blockType === 'parallel')
) {
return false
}
return true
})
.map((block) => {
if (isWorkflowBlockType(block.blockType)) {
const instanceKey = block.childWorkflowInstanceId ?? block.blockId
const allDescendants = collectWorkflowDescendants(instanceKey, workflowChildGroups)
const rawChildren = allDescendants.map((c) => ({
...c,
childWorkflowBlockId:
c.childWorkflowBlockId === instanceKey ? undefined : c.childWorkflowBlockId,
}))
return {
entry: block,
children: buildEntryTree(rawChildren),
nodeType: 'workflow' as const,
}
}
return { entry: block, children: [], nodeType: 'block' as const }
})
const allChildren = [...blockNodes, ...nestedSubflowNodes]
allChildren.sort((a, b) => a.entry.executionOrder - b.entry.executionOrder)
return {
entry: syntheticIteration,
children: allChildren,
nodeType: 'iteration' as const,
iterationInfo: {
current: iterGroup.iterationCurrent,
total: iterGroup.iterationTotal,
},
}
})
.filter((node): node is EntryNode => node !== null)
subflowNodes.push({
entry: syntheticSubflow,
@@ -422,7 +513,6 @@ function buildEntryTree(entries: ConsoleEntry[]): EntryNode[] {
})
}
// Build workflow nodes for regular blocks that are workflow block types
const workflowNodes: EntryNode[] = []
const remainingRegularBlocks: ConsoleEntry[] = []
@@ -442,16 +532,15 @@ function buildEntryTree(entries: ConsoleEntry[]): EntryNode[] {
}
}
// Build nodes for remaining regular blocks
const regularNodes: EntryNode[] = remainingRegularBlocks.map((entry) => ({
entry,
children: [],
nodeType: 'block' as const,
}))
// Combine all nodes and sort by executionOrder ascending (oldest first, top-down)
const allNodes = [...subflowNodes, ...workflowNodes, ...regularNodes]
allNodes.sort((a, b) => a.entry.executionOrder - b.entry.executionOrder)
return allNodes
}
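The grouping that `buildEntryTree` performs above is two-level: entries are first bucketed per iteration with a `type-container-iteration` key, then iterations are bucketed per container to form subflow parents. A simplified sketch of the first level, under the assumption that only the key format and the per-iteration `executionOrder` sort match the real code (the data shape is reduced to the fields involved):

```typescript
// Reduced entry shape: just the fields the grouping key and sort use.
interface IterEntry {
  iterationType: string
  iterationCurrent: number
  iterationContainerId: string
  executionOrder: number
}

// Bucket entries per iteration, keyed the same way as the code above.
function groupIterations(entries: IterEntry[]): Map<string, IterEntry[]> {
  const groups = new Map<string, IterEntry[]>()
  for (const e of entries) {
    const key = `${e.iterationType}-${e.iterationContainerId}-${e.iterationCurrent}`
    const bucket = groups.get(key) ?? []
    bucket.push(e)
    groups.set(key, bucket)
  }
  // Sort blocks within each iteration oldest-first, as the real code does.
  for (const bucket of groups.values()) {
    bucket.sort((a, b) => a.executionOrder - b.executionOrder)
  }
  return groups
}

const groups = groupIterations([
  { iterationType: 'loop', iterationCurrent: 0, iterationContainerId: 'c1', executionOrder: 2 },
  { iterationType: 'loop', iterationCurrent: 0, iterationContainerId: 'c1', executionOrder: 1 },
  { iterationType: 'loop', iterationCurrent: 1, iterationContainerId: 'c1', executionOrder: 3 },
])
groups.size // 2: two iterations of container c1
```

The second level then strips `iterationCurrent` from the key (`type-container`), which is also the key the synthetic-subflow merge step uses when folding orphaned nested iterations into existing subflow groups.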

View File

@@ -527,7 +527,8 @@ const SubBlockRow = memo(function SubBlockRow({
const { displayName: credentialName } = useCredentialName(
credentialSourceId,
credentialProviderId,
workflowId
workflowId,
workspaceId
)
const credentialId = dependencyValues.credential
@@ -548,21 +549,48 @@ const SubBlockRow = memo(function SubBlockRow({
return typeof option === 'string' ? option : option.label
}, [subBlock, rawValue])
const domainValue = getStringValue('domain')
const teamIdValue = getStringValue('teamId')
const projectIdValue = getStringValue('projectId')
const planIdValue = getStringValue('planId')
const resolveContextValue = useCallback(
(key: string): string | undefined => {
const resolved = resolveDependencyValue(
key,
rawValues,
canonicalIndex || buildCanonicalIndex([]),
canonicalModeOverrides
)
return typeof resolved === 'string' && resolved.length > 0 ? resolved : undefined
},
[rawValues, canonicalIndex, canonicalModeOverrides]
)
const domainValue = resolveContextValue('domain')
const teamIdValue = resolveContextValue('teamId')
const projectIdValue = resolveContextValue('projectId')
const planIdValue = resolveContextValue('planId')
const baseIdValue = resolveContextValue('baseId')
const datasetIdValue = resolveContextValue('datasetId')
const serviceDeskIdValue = resolveContextValue('serviceDeskId')
const siteIdValue = resolveContextValue('siteId')
const collectionIdValue = resolveContextValue('collectionId')
const spreadsheetIdValue = resolveContextValue('spreadsheetId')
const fileIdValue = resolveContextValue('fileId')
const { displayName: selectorDisplayName } = useSelectorDisplayName({
subBlock,
value: rawValue,
workflowId,
credentialId: typeof credentialId === 'string' ? credentialId : undefined,
oauthCredential: typeof credentialId === 'string' ? credentialId : undefined,
knowledgeBaseId: typeof knowledgeBaseId === 'string' ? knowledgeBaseId : undefined,
domain: domainValue,
teamId: teamIdValue,
projectId: projectIdValue,
planId: planIdValue,
baseId: baseIdValue,
datasetId: datasetIdValue,
serviceDeskId: serviceDeskIdValue,
siteId: siteIdValue,
collectionId: collectionIdValue,
spreadsheetId: spreadsheetIdValue,
fileId: fileIdValue,
})
const { knowledgeBase: kbForDisplayName } = useKnowledgeBase(

View File

@@ -81,16 +81,43 @@ export function useNodeUtilities(blocks: Record<string, any>) {
* @returns Array of node IDs representing the hierarchy path
*/
const getNodeHierarchy = useCallback(
(nodeId: string): string[] => {
(nodeId: string, maxDepth = 100): string[] => {
const node = getNodes().find((n) => n.id === nodeId)
if (!node) return [nodeId]
if (!node || maxDepth <= 0) return [nodeId]
const parentId = blocks?.[nodeId]?.data?.parentId
if (!parentId) return [nodeId]
return [...getNodeHierarchy(parentId), nodeId]
return [...getNodeHierarchy(parentId, maxDepth - 1), nodeId]
},
[getNodes, blocks]
)
/**
* Returns true if nodeId is in the subtree of ancestorId (i.e. walking from nodeId
* up the parentId chain we reach ancestorId). Used to reject parent assignments that
* would create a cycle (e.g. setting dragged node's parent to a container inside it).
*
* @param ancestorId - Node that might be an ancestor
* @param nodeId - Node to walk from (upward)
* @returns True if ancestorId appears in the parent chain of nodeId
*/
const isDescendantOf = useCallback(
(ancestorId: string, nodeId: string): boolean => {
const visited = new Set<string>()
const maxDepth = 100
let currentId: string | undefined = nodeId
let depth = 0
while (currentId && depth < maxDepth) {
if (currentId === ancestorId) return true
if (visited.has(currentId)) return false
visited.add(currentId)
currentId = blocks?.[currentId]?.data?.parentId
depth += 1
}
return false
},
[blocks]
)
/**
* Gets the absolute position of a node (accounting for nested parents).
* For nodes inside containers, accounts for header and padding offsets.
@@ -379,6 +406,7 @@ export function useNodeUtilities(blocks: Record<string, any>) {
return {
getNodeDepth,
getNodeHierarchy,
isDescendantOf,
getNodeAbsolutePosition,
calculateRelativePosition,
isPointInLoopNode,
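The cycle guard added above walks the `parentId` chain upward with both a visited set and a depth cap, so it terminates even on stale data where the chain already contains a loop. A self-contained sketch of that walk (the `blocks` shape of `id -> { data: { parentId? } }` is assumed from the hook):

```typescript
// Assumed store shape: block id -> node data carrying an optional parentId.
type Blocks = Record<string, { data?: { parentId?: string } }>

// True if ancestorId appears anywhere in nodeId's parent chain.
function isDescendantOf(blocks: Blocks, ancestorId: string, nodeId: string): boolean {
  const visited = new Set<string>()
  let currentId: string | undefined = nodeId
  let depth = 0
  while (currentId && depth < 100) {
    if (currentId === ancestorId) return true
    if (visited.has(currentId)) return false // parentId cycle in stale data
    visited.add(currentId)
    currentId = blocks[currentId]?.data?.parentId
    depth += 1
  }
  return false
}

const blocks: Blocks = {
  root: { data: {} },
  loop: { data: { parentId: 'root' } },
  fn: { data: { parentId: 'loop' } },
}
isDescendantOf(blocks, 'root', 'fn') // true: root is an ancestor of fn
isDescendantOf(blocks, 'fn', 'root') // false: root has no parent chain to fn
```

Used as a pre-check before reparenting, a drag that would set a container's parent to one of its own descendants is rejected rather than producing an unreachable cycle.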

View File

@@ -20,7 +20,10 @@ import {
TriggerUtils,
} from '@/lib/workflows/triggers/triggers'
import { useCurrentWorkflow } from '@/app/workspace/[workspaceId]/w/[workflowId]/hooks/use-current-workflow'
import { updateActiveBlockRefCount } from '@/app/workspace/[workspaceId]/w/[workflowId]/utils/workflow-execution-utils'
import {
markOutgoingEdgesFromOutput,
updateActiveBlockRefCount,
} from '@/app/workspace/[workspaceId]/w/[workflowId]/utils/workflow-execution-utils'
import { getBlock } from '@/blocks'
import type { SerializableExecutionState } from '@/executor/execution/types'
import type {
@@ -32,6 +35,7 @@ import type {
} from '@/executor/types'
import { hasExecutionResult } from '@/executor/utils/errors'
import { coerceValue } from '@/executor/utils/start-block'
import { stripCloneSuffixes } from '@/executor/utils/subflow-utils'
import { subscriptionKeys } from '@/hooks/queries/subscription'
import { useExecutionStream } from '@/hooks/use-execution-stream'
import { WorkflowValidationError } from '@/serializer'
@@ -62,7 +66,7 @@ interface DebugValidationResult {
interface BlockEventHandlerConfig {
workflowId?: string
executionIdRef: { current: string }
workflowEdges: Array<{ id: string; target: string; sourceHandle?: string | null }>
workflowEdges: Array<{ id: string; source: string; target: string; sourceHandle?: string | null }>
activeBlocksSet: Set<string>
activeBlockRefCounts: Map<string, number>
accumulatedBlockLogs: BlockLog[]
@@ -334,19 +338,31 @@ export function useWorkflowExecution() {
setActiveBlocks(workflowId, new Set(activeBlocksSet))
}
const markIncomingEdges = (blockId: string) => {
const markOutgoingEdges = (blockId: string, output: Record<string, any> | undefined) => {
if (!workflowId) return
const incomingEdges = workflowEdges.filter((edge) => edge.target === blockId)
incomingEdges.forEach((edge) => {
const status = edge.sourceHandle === 'error' ? 'error' : 'success'
setEdgeRunStatus(workflowId, edge.id, status)
})
markOutgoingEdgesFromOutput(blockId, output, workflowEdges, workflowId, setEdgeRunStatus)
}
const isContainerBlockType = (blockType?: string) => {
return blockType === 'loop' || blockType === 'parallel'
}
/** Extracts iteration and child-workflow fields shared across console entry call sites. */
const extractIterationFields = (
data: BlockStartedData | BlockCompletedData | BlockErrorData
) => ({
iterationCurrent: data.iterationCurrent,
iterationTotal: data.iterationTotal,
iterationType: data.iterationType,
iterationContainerId: data.iterationContainerId,
parentIterations: data.parentIterations,
childWorkflowBlockId: data.childWorkflowBlockId,
childWorkflowName: data.childWorkflowName,
...('childWorkflowInstanceId' in data && {
childWorkflowInstanceId: data.childWorkflowInstanceId,
}),
})
const createBlockLogEntry = (
data: BlockCompletedData | BlockErrorData,
options: { success: boolean; output?: unknown; error?: string }
@@ -379,13 +395,7 @@ export function useWorkflowExecution() {
executionId: executionIdRef.current,
blockName: data.blockName || 'Unknown Block',
blockType: data.blockType || 'unknown',
iterationCurrent: data.iterationCurrent,
iterationTotal: data.iterationTotal,
iterationType: data.iterationType,
iterationContainerId: data.iterationContainerId,
childWorkflowBlockId: data.childWorkflowBlockId,
childWorkflowName: data.childWorkflowName,
childWorkflowInstanceId: data.childWorkflowInstanceId,
...extractIterationFields(data),
})
}
@@ -405,13 +415,7 @@ export function useWorkflowExecution() {
executionId: executionIdRef.current,
blockName: data.blockName || 'Unknown Block',
blockType: data.blockType || 'unknown',
iterationCurrent: data.iterationCurrent,
iterationTotal: data.iterationTotal,
iterationType: data.iterationType,
iterationContainerId: data.iterationContainerId,
childWorkflowBlockId: data.childWorkflowBlockId,
childWorkflowName: data.childWorkflowName,
childWorkflowInstanceId: data.childWorkflowInstanceId,
...extractIterationFields(data),
})
}
@@ -427,13 +431,7 @@ export function useWorkflowExecution() {
startedAt: data.startedAt,
endedAt: data.endedAt,
isRunning: false,
iterationCurrent: data.iterationCurrent,
iterationTotal: data.iterationTotal,
iterationType: data.iterationType,
iterationContainerId: data.iterationContainerId,
childWorkflowBlockId: data.childWorkflowBlockId,
childWorkflowName: data.childWorkflowName,
childWorkflowInstanceId: data.childWorkflowInstanceId,
...extractIterationFields(data),
},
executionIdRef.current
)
@@ -452,13 +450,7 @@ export function useWorkflowExecution() {
startedAt: data.startedAt,
endedAt: data.endedAt,
isRunning: false,
iterationCurrent: data.iterationCurrent,
iterationTotal: data.iterationTotal,
iterationType: data.iterationType,
iterationContainerId: data.iterationContainerId,
childWorkflowBlockId: data.childWorkflowBlockId,
childWorkflowName: data.childWorkflowName,
childWorkflowInstanceId: data.childWorkflowInstanceId,
...extractIterationFields(data),
},
executionIdRef.current
)
@@ -467,7 +459,6 @@ export function useWorkflowExecution() {
const onBlockStarted = (data: BlockStartedData) => {
if (isStaleExecution()) return
updateActiveBlocks(data.blockId, true)
markIncomingEdges(data.blockId)
if (!includeStartConsoleEntry || !workflowId) return
@@ -486,12 +477,7 @@ export function useWorkflowExecution() {
blockName: data.blockName || 'Unknown Block',
blockType: data.blockType || 'unknown',
isRunning: true,
iterationCurrent: data.iterationCurrent,
iterationTotal: data.iterationTotal,
iterationType: data.iterationType,
iterationContainerId: data.iterationContainerId,
childWorkflowBlockId: data.childWorkflowBlockId,
childWorkflowName: data.childWorkflowName,
...extractIterationFields(data),
})
}
@@ -499,7 +485,7 @@ export function useWorkflowExecution() {
if (isStaleExecution()) return
updateActiveBlocks(data.blockId, false)
if (workflowId) setBlockRunStatus(workflowId, data.blockId, 'success')
markOutgoingEdges(data.blockId, data.output as Record<string, any> | undefined)
executedBlockIds.add(data.blockId)
accumulatedBlockStates.set(data.blockId, {
output: data.output,
@@ -507,8 +493,20 @@ export function useWorkflowExecution() {
executionTime: data.durationMs,
})
// For nested containers, the SSE blockId may be a cloned ID (e.g. P1__obranch-0).
// Also record the original workflow-level ID so the canvas can highlight it.
if (isContainerBlockType(data.blockType)) {
return
const originalId = stripCloneSuffixes(data.blockId)
if (originalId !== data.blockId) {
executedBlockIds.add(originalId)
if (workflowId) setBlockRunStatus(workflowId, originalId, 'success')
}
}
if (isContainerBlockType(data.blockType) && !data.iterationContainerId) {
const output = data.output as Record<string, any> | undefined
const isEmptySubflow = Array.isArray(output?.results) && output.results.length === 0
if (!isEmptySubflow) return
}
accumulatedBlockLogs.push(createBlockLogEntry(data, { success: true, output: data.output }))
@@ -530,6 +528,7 @@ export function useWorkflowExecution() {
if (isStaleExecution()) return
updateActiveBlocks(data.blockId, false)
if (workflowId) setBlockRunStatus(workflowId, data.blockId, 'error')
markOutgoingEdges(data.blockId, { error: data.error })
executedBlockIds.add(data.blockId)
accumulatedBlockStates.set(data.blockId, {
@@ -538,6 +537,15 @@ export function useWorkflowExecution() {
executionTime: data.durationMs || 0,
})
// For nested containers, also record the original workflow-level ID
if (isContainerBlockType(data.blockType)) {
const originalId = stripCloneSuffixes(data.blockId)
if (originalId !== data.blockId) {
executedBlockIds.add(originalId)
if (workflowId) setBlockRunStatus(workflowId, originalId, 'error')
}
}
accumulatedBlockLogs.push(
createBlockLogEntry(data, { success: false, output: {}, error: data.error })
)
@@ -745,7 +753,7 @@ export function useWorkflowExecution() {
const stream = new ReadableStream({
async start(controller) {
const { encodeSSE } = await import('@/lib/core/utils/sse')
const streamedContent = new Map<string, string>()
const streamedChunks = new Map<string, string[]>()
const streamReadingPromises: Promise<void>[] = []
const safeEnqueue = (data: Uint8Array) => {
@@ -845,8 +853,8 @@ export function useWorkflowExecution() {
const reader = streamingExecution.stream.getReader()
const blockId = (streamingExecution.execution as any)?.blockId
if (blockId && !streamedContent.has(blockId)) {
streamedContent.set(blockId, '')
if (blockId && !streamedChunks.has(blockId)) {
streamedChunks.set(blockId, [])
}
try {
@@ -860,13 +868,13 @@ export function useWorkflowExecution() {
}
const chunk = new TextDecoder().decode(value)
if (blockId) {
streamedContent.set(blockId, (streamedContent.get(blockId) || '') + chunk)
streamedChunks.get(blockId)!.push(chunk)
}
let chunkToSend = chunk
if (blockId && !processedFirstChunk.has(blockId)) {
processedFirstChunk.add(blockId)
if (streamedContent.size > 1) {
if (streamedChunks.size > 1) {
chunkToSend = `\n\n${chunk}`
}
}
@@ -884,7 +892,7 @@ export function useWorkflowExecution() {
// Handle non-streaming blocks (like Function blocks)
const onBlockComplete = async (blockId: string, output: any) => {
// Skip if this block already had streaming content (avoid duplicates)
if (streamedContent.has(blockId)) {
if (streamedChunks.has(blockId)) {
logger.debug('[handleRunWorkflow] Skipping onBlockComplete for streaming block', {
blockId,
})
@@ -921,13 +929,13 @@ export function useWorkflowExecution() {
: JSON.stringify(outputValue, null, 2)
// Add separator if this isn't the first output
const separator = streamedContent.size > 0 ? '\n\n' : ''
const separator = streamedChunks.size > 0 ? '\n\n' : ''
// Send the non-streaming block output as a chunk
safeEnqueue(encodeSSE({ blockId, chunk: separator + formattedOutput }))
// Track that we've sent output for this block
streamedContent.set(blockId, formattedOutput)
streamedChunks.set(blockId, [formattedOutput])
}
}
}
@@ -969,6 +977,12 @@ export function useWorkflowExecution() {
})
}
// Resolve chunks to final strings for consumption
const streamedContent = new Map<string, string>()
for (const [id, chunks] of streamedChunks) {
streamedContent.set(id, chunks.join(''))
}
// Update streamed content and apply tokenization
if (result.logs) {
result.logs.forEach((log: BlockLog) => {
@@ -1112,9 +1126,7 @@ export function useWorkflowExecution() {
{} as typeof workflowBlocks
)
const isExecutingFromChat =
overrideTriggerType === 'chat' ||
(workflowInput && typeof workflowInput === 'object' && 'input' in workflowInput)
const isExecutingFromChat = overrideTriggerType === 'chat'
logger.info('Executing workflow', {
isDiffMode: currentWorkflow.isDiffMode,
@@ -1316,7 +1328,7 @@ export function useWorkflowExecution() {
const activeBlocksSet = new Set<string>()
const activeBlockRefCounts = new Map<string, number>()
const streamedContent = new Map<string, string>()
const streamedChunks = new Map<string, string[]>()
const accumulatedBlockLogs: BlockLog[] = []
const accumulatedBlockStates = new Map<string, BlockState>()
const executedBlockIds = new Set<string>()
@@ -1374,8 +1386,10 @@ export function useWorkflowExecution() {
onBlockChildWorkflowStarted: blockHandlers.onBlockChildWorkflowStarted,
onStreamChunk: (data) => {
const existing = streamedContent.get(data.blockId) || ''
streamedContent.set(data.blockId, existing + data.chunk)
if (!streamedChunks.has(data.blockId)) {
streamedChunks.set(data.blockId, [])
}
streamedChunks.get(data.blockId)!.push(data.chunk)
// Call onStream callback if provided (create a fake StreamingExecution)
if (onStream && isExecutingFromChat) {
@@ -1390,7 +1404,7 @@ export function useWorkflowExecution() {
stream,
execution: {
success: true,
output: { content: existing + data.chunk },
output: { content: '' },
blockId: data.blockId,
} as any,
}
@@ -1481,8 +1495,13 @@ export function useWorkflowExecution() {
: null
if (activeWorkflowId && !workflowExecState?.isDebugging) {
setExecutionResult(executionResult)
setIsExecuting(activeWorkflowId, false)
setActiveBlocks(activeWorkflowId, new Set())
// For chat executions, don't set isExecuting=false here — the chat's
// client-side stream wrapper still has buffered data to deliver.
// The chat's finally block handles cleanup after the stream is fully consumed.
if (!isExecutingFromChat) {
setIsExecuting(activeWorkflowId, false)
setActiveBlocks(activeWorkflowId, new Set())
}
setTimeout(() => {
queryClient.invalidateQueries({ queryKey: subscriptionKeys.all })
}, 1000)
@@ -1522,7 +1541,7 @@ export function useWorkflowExecution() {
isPreExecutionError,
})
if (activeWorkflowId) {
if (activeWorkflowId && !isExecutingFromChat) {
setIsExecuting(activeWorkflowId, false)
setIsDebugging(activeWorkflowId, false)
setActiveBlocks(activeWorkflowId, new Set())
@@ -1548,7 +1567,7 @@ export function useWorkflowExecution() {
durationMs: data?.duration,
})
if (activeWorkflowId) {
if (activeWorkflowId && !isExecutingFromChat) {
setIsExecuting(activeWorkflowId, false)
setIsDebugging(activeWorkflowId, false)
setActiveBlocks(activeWorkflowId, new Set())

View File

@@ -1,4 +1,7 @@
import type { BlockState } from '@/stores/workflows/workflow/types'
import { isAncestorProtected, isBlockProtected } from '@/stores/workflows/workflow/utils'
export { isAncestorProtected, isBlockProtected }
/**
* Result of filtering protected blocks from a deletion operation
@@ -12,28 +15,6 @@ export interface FilterProtectedBlocksResult {
allProtected: boolean
}
/**
* Checks if a block is protected from editing/deletion.
* A block is protected if it is locked or if its parent container is locked.
*
* @param blockId - The ID of the block to check
* @param blocks - Record of all blocks in the workflow
* @returns True if the block is protected
*/
export function isBlockProtected(blockId: string, blocks: Record<string, BlockState>): boolean {
const block = blocks[blockId]
if (!block) return false
// Block is locked directly
if (block.locked) return true
// Block is inside a locked container
const parentId = block.data?.parentId
if (parentId && blocks[parentId]?.locked) return true
return false
}
/**
* Checks if an edge is protected from modification.
* An edge is protected only if its target block is protected.

View File

@@ -4,6 +4,39 @@ import { TriggerUtils } from '@/lib/workflows/triggers/triggers'
import { clampPositionToContainer } from '@/app/workspace/[workspaceId]/w/[workflowId]/utils/node-position-utils'
import type { BlockState } from '@/stores/workflows/workflow/types'
/**
* Collects all descendant block IDs for container blocks (loop/parallel) in the given set.
* Used to treat a nested subflow as one unit when computing boundary edges (e.g. remove-from-subflow).
*
* @param blockIds - Root block IDs (e.g. the blocks being removed from subflow)
* @param blocks - All workflow blocks
* @returns IDs of blocks that are descendants of any container in blockIds (excluding the roots)
*/
export function getDescendantBlockIds(
blockIds: string[],
blocks: Record<string, BlockState>
): string[] {
const current = new Set(blockIds)
const added: string[] = []
const toProcess = [...blockIds]
while (toProcess.length > 0) {
const id = toProcess.pop()!
const block = blocks[id]
if (block?.type !== 'loop' && block?.type !== 'parallel') continue
for (const [bid, b] of Object.entries(blocks)) {
if (b?.data?.parentId === id && !current.has(bid)) {
current.add(bid)
added.push(bid)
toProcess.push(bid)
}
}
}
return added
}
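Assuming a minimal block shape (only `type` and `data.parentId`), the traversal above behaves as in this trimmed, self-contained restatement; the sample IDs are hypothetical:

```typescript
// Trimmed restatement of getDescendantBlockIds: walk container blocks
// (loop/parallel) and collect every nested child, recursively.
interface MiniBlock {
  type: string
  data?: { parentId?: string }
}

function getDescendantBlockIds(
  blockIds: string[],
  blocks: Record<string, MiniBlock>
): string[] {
  const current = new Set(blockIds)
  const added: string[] = []
  const toProcess = [...blockIds]
  while (toProcess.length > 0) {
    const id = toProcess.pop()!
    const block = blocks[id]
    // Only containers can have descendants
    if (block?.type !== 'loop' && block?.type !== 'parallel') continue
    for (const [bid, b] of Object.entries(blocks)) {
      if (b?.data?.parentId === id && !current.has(bid)) {
        current.add(bid)
        added.push(bid)
        toProcess.push(bid)
      }
    }
  }
  return added
}
```

Note the returned list excludes the root IDs themselves, matching the doc comment: only newly discovered descendants are reported.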
/**
* Checks if the currently focused element is an editable input.
* Returns true if the user is typing in an input, textarea, or contenteditable element.

View File

@@ -29,6 +29,62 @@ export function updateActiveBlockRefCount(
}
}
/**
* Determines if a workflow edge should be marked as active based on its handle and the block output.
* Mirrors the executor's EdgeManager.shouldActivateEdge logic on the client side.
 * Sentinel loop/parallel handles (e.g. loop-start-source) are always treated as active.
*/
function shouldActivateEdgeClient(
handle: string | null | undefined,
output: Record<string, any> | undefined
): boolean {
if (!handle) return true
if (handle.startsWith('condition-')) {
return output?.selectedOption === handle.substring('condition-'.length)
}
if (handle.startsWith('router-')) {
return output?.selectedRoute === handle.substring('router-'.length)
}
switch (handle) {
case 'error':
return !!output?.error
case 'source':
return !output?.error
case 'loop-start-source':
case 'loop-end-source':
case 'parallel-start-source':
case 'parallel-end-source':
return true
default:
return true
}
}
export function markOutgoingEdgesFromOutput(
blockId: string,
output: Record<string, any> | undefined,
workflowEdges: Array<{
id: string
source: string
target: string
sourceHandle?: string | null
}>,
workflowId: string,
setEdgeRunStatus: (wfId: string, edgeId: string, status: 'success' | 'error') => void
): void {
const outgoing = workflowEdges.filter((edge) => edge.source === blockId)
for (const edge of outgoing) {
const handle = edge.sourceHandle
if (shouldActivateEdgeClient(handle, output)) {
const status = handle === 'error' ? 'error' : output?.error ? 'error' : 'success'
setEdgeRunStatus(workflowId, edge.id, status)
}
}
}
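The handle dispatch in `shouldActivateEdgeClient` can be restated standalone and exercised directly. This is the same branch logic, trimmed so the explicit sentinel cases collapse into the default branch (they return `true` either way):

```typescript
// Standalone restatement of the client-side edge-activation dispatch.
function shouldActivateEdge(
  handle: string | null | undefined,
  output: Record<string, any> | undefined
): boolean {
  if (!handle) return true
  if (handle.startsWith('condition-')) {
    return output?.selectedOption === handle.substring('condition-'.length)
  }
  if (handle.startsWith('router-')) {
    return output?.selectedRoute === handle.substring('router-'.length)
  }
  switch (handle) {
    case 'error':
      return !!output?.error
    case 'source':
      return !output?.error
    default:
      // Sentinel loop/parallel handles and unknown handles always activate
      return true
  }
}
```

The key invariant: `error` and `source` handles are mutually exclusive on a given output, so exactly one of an error/source edge pair lights up per block run.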
export interface WorkflowExecutionOptions {
workflowInput?: any
onStream?: (se: StreamingExecution) => Promise<void>
@@ -135,13 +191,6 @@ export async function executeWorkflowWithFullLogging(
true
)
setActiveBlocks(wfId, new Set(activeBlocksSet))
const incomingEdges = workflowEdges.filter(
(edge) => edge.target === event.data.blockId
)
incomingEdges.forEach((edge) => {
setEdgeRunStatus(wfId, edge.id, 'success')
})
break
}
@@ -155,6 +204,13 @@ export async function executeWorkflowWithFullLogging(
setActiveBlocks(wfId, new Set(activeBlocksSet))
setBlockRunStatus(wfId, event.data.blockId, 'success')
markOutgoingEdgesFromOutput(
event.data.blockId,
event.data.output,
workflowEdges,
wfId,
setEdgeRunStatus
)
addConsole({
input: event.data.input || {},
@@ -194,6 +250,13 @@ export async function executeWorkflowWithFullLogging(
setActiveBlocks(wfId, new Set(activeBlocksSet))
setBlockRunStatus(wfId, event.data.blockId, 'error')
markOutgoingEdgesFromOutput(
event.data.blockId,
{ error: event.data.error },
workflowEdges,
wfId,
setEdgeRunStatus
)
addConsole({
input: event.data.input || {},

View File

@@ -57,6 +57,7 @@ import {
estimateBlockDimensions,
filterProtectedBlocks,
getClampedPositionForNode,
getDescendantBlockIds,
getWorkflowLockToggleIds,
isBlockProtected,
isEdgeProtected,
@@ -195,17 +196,14 @@ const edgeTypes: EdgeTypes = {
const defaultEdgeOptions = { type: 'custom' }
const reactFlowStyles = [
'bg-[var(--bg)]',
'[&_.react-flow__edges]:!z-0',
'[&_.react-flow__node]:!z-[21]',
'[&_.react-flow__node]:z-[21]',
'[&_.react-flow__handle]:!z-[30]',
'[&_.react-flow__edge-labels]:!z-[60]',
'[&_.react-flow__pane]:!bg-[var(--bg)]',
'[&_.react-flow__edge-labels]:!z-[1001]',
'[&_.react-flow__pane]:select-none',
'[&_.react-flow__selectionpane]:select-none',
'[&_.react-flow__renderer]:!bg-[var(--bg)]',
'[&_.react-flow__viewport]:!bg-[var(--bg)]',
'[&_.react-flow__background]:hidden',
'[&_.react-flow__node-subflowNode.selected]:!shadow-none',
].join(' ')
const reactFlowFitViewOptions = { padding: 0.6, maxZoom: 1.0 } as const
const reactFlowProOptions = { hideAttribution: true } as const
@@ -416,6 +414,7 @@ const WorkflowContent = React.memo(() => {
const {
getNodeDepth,
getNodeAbsolutePosition,
isDescendantOf,
calculateRelativePosition,
isPointInLoopNode,
resizeLoopNodes,
@@ -432,7 +431,6 @@ const WorkflowContent = React.memo(() => {
const canNodeEnterContainer = useCallback(
(node: Node): boolean => {
if (node.data?.type === 'starter') return false
if (node.type === 'subflowNode') return false
const block = blocks[node.id]
return !(block && TriggerUtils.isTriggerBlock(block))
},
@@ -681,10 +679,15 @@ const WorkflowContent = React.memo(() => {
if (nodesNeedingUpdate.length === 0) return
// Filter out nodes that cannot enter containers (when target is a container)
const validNodes = targetParentId
let validNodes = targetParentId
? nodesNeedingUpdate.filter(canNodeEnterContainer)
: nodesNeedingUpdate
// Exclude nodes that would create a cycle (moving a container into one of its descendants)
if (targetParentId) {
validNodes = validNodes.filter((n) => !isDescendantOf(n.id, targetParentId))
}
if (validNodes.length === 0) return
// Find boundary edges (edges that cross the container boundary)
@@ -744,6 +747,7 @@ const WorkflowContent = React.memo(() => {
blocks,
edgesForDisplay,
canNodeEnterContainer,
isDescendantOf,
calculateRelativePosition,
getNodeAbsolutePosition,
shiftUpdatesToContainerBounds,
@@ -1014,12 +1018,22 @@ const WorkflowContent = React.memo(() => {
return
}
// Check if any pasted block is a subflow - subflows cannot be nested
const hasSubflow = pastedBlocksArray.some((b) => b.type === 'loop' || b.type === 'parallel')
if (hasSubflow) {
// Prevent cycle: pasting a container that is the target container itself or one of its ancestors.
// Use original clipboard IDs since preparePasteData regenerates them via uuidv4().
const ancestorIds = new Set<string>()
let walkId: string | undefined = targetContainer.loopId
while (walkId && !ancestorIds.has(walkId)) {
ancestorIds.add(walkId)
walkId = blocks[walkId]?.data?.parentId as string | undefined
}
const originalClipboardBlocks = clipboard ? Object.values(clipboard.blocks) : []
const wouldCreateCycle = originalClipboardBlocks.some(
(b) => (b.type === 'loop' || b.type === 'parallel') && ancestorIds.has(b.id)
)
if (wouldCreateCycle) {
addNotification({
level: 'error',
message: 'Subflows cannot be nested inside other subflows.',
message: 'Cannot paste a subflow inside itself or its own descendant.',
workflowId: activeWorkflowId || undefined,
})
return
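The paste-time cycle check above can be sketched as a standalone predicate: collect the target container's ancestor chain via `parentId` links, then reject the paste if any clipboard container appears in that chain. The shapes below are simplified for illustration:

```typescript
// Sketch of the paste cycle check: a paste creates a cycle when a clipboard
// container is the target container itself or one of its ancestors.
interface PBlk {
  type: string
  data?: { parentId?: string }
}

function wouldPasteCreateCycle(
  targetContainerId: string,
  clipboardBlocks: Array<{ id: string; type: string }>,
  blocks: Record<string, PBlk>
): boolean {
  const ancestorIds = new Set<string>()
  let walkId: string | undefined = targetContainerId
  // The visited check guards against malformed parentId cycles
  while (walkId && !ancestorIds.has(walkId)) {
    ancestorIds.add(walkId)
    walkId = blocks[walkId]?.data?.parentId
  }
  return clipboardBlocks.some(
    (b) => (b.type === 'loop' || b.type === 'parallel') && ancestorIds.has(b.id)
  )
}
```

Checking against the original clipboard IDs matters because `preparePasteData` regenerates IDs, so the pasted copies could never be matched against the ancestor chain.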
@@ -1702,31 +1716,75 @@ const WorkflowContent = React.memo(() => {
const containerInfo = isPointInLoopNode(position)
clearDragHighlights()
document.body.classList.remove('sim-drag-subflow')
if (data.type === 'loop' || data.type === 'parallel') {
const id = crypto.randomUUID()
const baseName = data.type === 'loop' ? 'Loop' : 'Parallel'
const name = getUniqueBlockName(baseName, blocks)
const autoConnectEdge = tryCreateAutoConnectEdge(position, id, {
targetParentId: null,
})
if (containerInfo) {
const rawPosition = {
x: position.x - containerInfo.loopPosition.x,
y: position.y - containerInfo.loopPosition.y,
}
addBlock(
id,
data.type,
name,
position,
{
width: CONTAINER_DIMENSIONS.DEFAULT_WIDTH,
height: CONTAINER_DIMENSIONS.DEFAULT_HEIGHT,
type: 'subflowNode',
},
undefined,
undefined,
autoConnectEdge
)
const relativePosition = clampPositionToContainer(
rawPosition,
containerInfo.dimensions,
{
width: CONTAINER_DIMENSIONS.DEFAULT_WIDTH,
height: CONTAINER_DIMENSIONS.DEFAULT_HEIGHT,
}
)
const existingChildBlocks = Object.values(blocks)
.filter((b) => b.data?.parentId === containerInfo.loopId)
.map((b) => ({ id: b.id, type: b.type, position: b.position }))
const autoConnectEdge = tryCreateAutoConnectEdge(relativePosition, id, {
targetParentId: containerInfo.loopId,
existingChildBlocks,
containerId: containerInfo.loopId,
})
addBlock(
id,
data.type,
name,
relativePosition,
{
width: CONTAINER_DIMENSIONS.DEFAULT_WIDTH,
height: CONTAINER_DIMENSIONS.DEFAULT_HEIGHT,
type: 'subflowNode',
parentId: containerInfo.loopId,
extent: 'parent',
},
containerInfo.loopId,
'parent',
autoConnectEdge
)
resizeLoopNodesWrapper()
} else {
const autoConnectEdge = tryCreateAutoConnectEdge(position, id, {
targetParentId: null,
})
addBlock(
id,
data.type,
name,
position,
{
width: CONTAINER_DIMENSIONS.DEFAULT_WIDTH,
height: CONTAINER_DIMENSIONS.DEFAULT_HEIGHT,
type: 'subflowNode',
},
undefined,
undefined,
autoConnectEdge
)
}
return
}
@@ -2113,11 +2171,9 @@ const WorkflowContent = React.memo(() => {
// Check if hovering over a container node
const containerInfo = isPointInLoopNode(position)
// Highlight container if hovering over it and not dragging a subflow
// Subflow drag is marked by body class flag set by toolbar
const isSubflowDrag = document.body.classList.contains('sim-drag-subflow')
// Highlight container if hovering over it
if (containerInfo && !isSubflowDrag) {
if (containerInfo) {
const containerNode = getNodes().find((n) => n.id === containerInfo.loopId)
if (containerNode?.type === 'subflowNode') {
const kind = (containerNode.data as SubflowNodeData)?.kind
@@ -2308,6 +2364,13 @@ const WorkflowContent = React.memo(() => {
// Handle container nodes differently
if (block.type === 'loop' || block.type === 'parallel') {
// Compute nesting depth so children always render above parents
let depth = 0
let pid = block.data?.parentId as string | undefined
while (pid && depth < 100) {
depth++
pid = blocks[pid]?.data?.parentId as string | undefined
}
nodeArray.push({
id: block.id,
type: 'subflowNode',
@@ -2316,6 +2379,8 @@ const WorkflowContent = React.memo(() => {
extent: block.data?.extent || undefined,
dragHandle: '.workflow-drag-handle',
draggable: !isBlockProtected(block.id, blocks),
zIndex: depth,
className: block.data?.parentId ? 'nested-subflow-node' : undefined,
data: {
...block.data,
name: block.name,
@@ -2344,6 +2409,12 @@ const WorkflowContent = React.memo(() => {
const nodeType = block.type === 'note' ? 'noteBlock' : 'workflowBlock'
const dragHandle = block.type === 'note' ? '.note-drag-handle' : '.workflow-drag-handle'
// Compute zIndex for blocks inside containers so they render above the
// parent subflow's interactive body area (which needs pointer-events for
// click-to-select). Container nodes use zIndex: depth (0, 1, 2...),
// so child blocks use a baseline that is always above any container.
const childZIndex = block.data?.parentId ? 1000 : undefined
// Create stable node object - React Flow will handle shallow comparison
nodeArray.push({
id: block.id,
@@ -2352,6 +2423,7 @@ const WorkflowContent = React.memo(() => {
parentId: block.data?.parentId,
dragHandle,
draggable: !isBlockProtected(block.id, blocks),
...(childZIndex !== undefined && { zIndex: childZIndex }),
extent: (() => {
// Clamp children to subflow body (exclude header)
const parentId = block.data?.parentId as string | undefined
@@ -2476,15 +2548,35 @@ const WorkflowContent = React.memo(() => {
})
if (validBlockIds.length === 0) return
const movingNodeIds = new Set(validBlockIds)
const validBlockIdSet = new Set(validBlockIds)
const descendantIds = getDescendantBlockIds(validBlockIds, blocks)
const movingNodeIds = new Set([...validBlockIds, ...descendantIds])
// Find boundary edges (edges that cross the subflow boundary)
// Find boundary edges (one end inside the subtree, one end outside)
const boundaryEdges = edgesForDisplay.filter((e) => {
const sourceInSelection = movingNodeIds.has(e.source)
const targetInSelection = movingNodeIds.has(e.target)
return sourceInSelection !== targetInSelection
})
const boundaryEdgesByNode = mapEdgesByNode(boundaryEdges, movingNodeIds)
// Attribute each boundary edge to the validBlockId that is the ancestor of the moved endpoint
const boundaryEdgesByNode = new Map<string, Edge[]>()
for (const edge of boundaryEdges) {
const movedEnd = movingNodeIds.has(edge.source) ? edge.source : edge.target
let id: string | undefined = movedEnd
const seen = new Set<string>()
while (id) {
if (seen.has(id)) break
seen.add(id)
if (validBlockIdSet.has(id)) {
const list = boundaryEdgesByNode.get(id) ?? []
list.push(edge)
boundaryEdgesByNode.set(id, list)
break
}
id = blocks[id]?.data?.parentId
}
}
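The attribution loop above can be sketched in isolation: each boundary edge is credited to the selected block that is the moved endpoint itself or its nearest selected ancestor, found by walking `parentId` links upward. Block and edge shapes are simplified for illustration:

```typescript
// Sketch of attributing boundary edges to their nearest selected ancestor.
interface Blk {
  data?: { parentId?: string }
}
interface Edg {
  id: string
  source: string
  target: string
}

function attributeBoundaryEdges(
  boundaryEdges: Edg[],
  movingNodeIds: Set<string>,
  validBlockIdSet: Set<string>,
  blocks: Record<string, Blk>
): Map<string, Edg[]> {
  const byNode = new Map<string, Edg[]>()
  for (const edge of boundaryEdges) {
    // Exactly one endpoint of a boundary edge is inside the moving subtree
    const movedEnd = movingNodeIds.has(edge.source) ? edge.source : edge.target
    let id: string | undefined = movedEnd
    const seen = new Set<string>()
    while (id) {
      if (seen.has(id)) break // guard against parentId cycles
      seen.add(id)
      if (validBlockIdSet.has(id)) {
        const list = byNode.get(id) ?? []
        list.push(edge)
        byNode.set(id, list)
        break
      }
      id = blocks[id]?.data?.parentId
    }
  }
  return byNode
}
```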
// Collect absolute positions BEFORE any mutations
const absolutePositions = new Map<string, { x: number; y: number }>()
@@ -2546,42 +2638,54 @@ const WorkflowContent = React.memo(() => {
/**
* Updates container dimensions in displayNodes during drag or keyboard movement.
* Resizes the moved node's immediate parent and all ancestor containers (for nested loops/parallels).
*/
const updateContainerDimensionsDuringMove = useCallback(
(movedNodeId: string, movedNodePosition: { x: number; y: number }) => {
const parentId = blocks[movedNodeId]?.data?.parentId
if (!parentId) return
const ancestorIds: string[] = []
const visited = new Set<string>()
let currentId = blocks[movedNodeId]?.data?.parentId
while (currentId && !visited.has(currentId)) {
visited.add(currentId)
ancestorIds.push(currentId)
currentId = blocks[currentId]?.data?.parentId
}
if (ancestorIds.length === 0) return
setDisplayNodes((currentNodes) => {
const childNodes = currentNodes.filter((n) => n.parentId === parentId)
if (childNodes.length === 0) return currentNodes
const computedDimensions = new Map<string, { width: number; height: number }>()
const childPositions = childNodes.map((node) => {
const nodePosition = node.id === movedNodeId ? movedNodePosition : node.position
const { width, height } = getBlockDimensions(node.id)
return { x: nodePosition.x, y: nodePosition.y, width, height }
})
for (const containerId of ancestorIds) {
const childNodes = currentNodes.filter((n) => n.parentId === containerId)
if (childNodes.length === 0) continue
const { width: newWidth, height: newHeight } = calculateContainerDimensions(childPositions)
const childPositions = childNodes.map((node) => {
const nodePosition = node.id === movedNodeId ? movedNodePosition : node.position
const dims = computedDimensions.get(node.id)
const width = dims?.width ?? node.data?.width ?? getBlockDimensions(node.id).width
const height = dims?.height ?? node.data?.height ?? getBlockDimensions(node.id).height
return { x: nodePosition.x, y: nodePosition.y, width, height }
})
computedDimensions.set(containerId, calculateContainerDimensions(childPositions))
}
return currentNodes.map((node) => {
if (node.id === parentId) {
const currentWidth = node.data?.width || CONTAINER_DIMENSIONS.DEFAULT_WIDTH
const currentHeight = node.data?.height || CONTAINER_DIMENSIONS.DEFAULT_HEIGHT
// Only update if dimensions changed
if (newWidth !== currentWidth || newHeight !== currentHeight) {
return {
...node,
data: {
...node.data,
width: newWidth,
height: newHeight,
},
}
}
const newDims = computedDimensions.get(node.id)
if (!newDims) return node
const currentWidth = node.data?.width ?? CONTAINER_DIMENSIONS.DEFAULT_WIDTH
const currentHeight = node.data?.height ?? CONTAINER_DIMENSIONS.DEFAULT_HEIGHT
if (newDims.width === currentWidth && newDims.height === currentHeight) {
return node
}
return {
...node,
data: {
...node.data,
width: newDims.width,
height: newDims.height,
},
}
return node
})
})
},
@@ -2914,16 +3018,6 @@ const WorkflowContent = React.memo(() => {
// Get the node's absolute position to properly calculate intersections
const nodeAbsolutePos = getNodeAbsolutePosition(node.id)
// Prevent subflows from being dragged into other subflows
if (node.type === 'subflowNode') {
// Clear any highlighting for subflow nodes
if (potentialParentId) {
clearDragHighlights()
setPotentialParentId(null)
}
return // Exit early - subflows cannot be placed inside other subflows
}
// Find intersections with container nodes using absolute coordinates
const intersectingNodes = getNodes()
.filter((n) => {
@@ -2993,15 +3087,25 @@ const WorkflowContent = React.memo(() => {
return a.size - b.size // Smaller container takes precedence
})
// Exclude containers that are inside the dragged node (would create a cycle)
const validContainers = sortedContainers.filter(
({ container }) => !isDescendantOf(node.id, container.id)
)
// Use the most appropriate container (deepest or smallest at same depth)
const bestContainerMatch = sortedContainers[0]
const bestContainerMatch = validContainers[0]
setPotentialParentId(bestContainerMatch.container.id)
if (bestContainerMatch) {
setPotentialParentId(bestContainerMatch.container.id)
// Add highlight class and change cursor
const kind = (bestContainerMatch.container.data as SubflowNodeData)?.kind
if (kind === 'loop' || kind === 'parallel') {
highlightContainerNode(bestContainerMatch.container.id, kind)
// Add highlight class and change cursor
const kind = (bestContainerMatch.container.data as SubflowNodeData)?.kind
if (kind === 'loop' || kind === 'parallel') {
highlightContainerNode(bestContainerMatch.container.id, kind)
}
} else {
clearDragHighlights()
setPotentialParentId(null)
}
} else {
// Remove highlighting if no longer over a container
@@ -3017,6 +3121,7 @@ const WorkflowContent = React.memo(() => {
blocks,
getNodeAbsolutePosition,
getNodeDepth,
isDescendantOf,
updateContainerDimensionsDuringMove,
highlightContainerNode,
]
@@ -3159,6 +3264,17 @@ const WorkflowContent = React.memo(() => {
}
}
// Prevent placing a container inside one of its own nested containers (would create cycle)
if (potentialParentId && isDescendantOf(node.id, potentialParentId)) {
addNotification({
level: 'info',
message: 'Cannot place a container inside one of its own nested containers',
workflowId: activeWorkflowId || undefined,
})
setPotentialParentId(null)
return
}
// Update the node's parent relationship
if (potentialParentId) {
// Remove existing edges before moving into container
@@ -3275,6 +3391,7 @@ const WorkflowContent = React.memo(() => {
getNodes,
dragStartParentId,
potentialParentId,
isDescendantOf,
updateNodeParent,
updateBlockPosition,
collaborativeBatchAddEdges,
@@ -3655,21 +3772,20 @@ const WorkflowContent = React.memo(() => {
return (
<div className='flex h-full w-full flex-col overflow-hidden'>
<div className='relative h-full w-full flex-1'>
{/* Loading spinner - always mounted, animation paused when hidden to avoid overhead */}
<div
className={`absolute inset-0 z-[5] flex items-center justify-center bg-[var(--bg)] transition-opacity duration-150 ${isWorkflowReady ? 'pointer-events-none opacity-0' : 'opacity-100'}`}
>
<div
className={`h-[18px] w-[18px] rounded-full ${isWorkflowReady ? '' : 'animate-spin'}`}
style={{
background:
'conic-gradient(from 0deg, hsl(var(--muted-foreground)) 0deg 120deg, transparent 120deg 180deg, hsl(var(--muted-foreground)) 180deg 300deg, transparent 300deg 360deg)',
mask: 'radial-gradient(farthest-side, transparent calc(100% - 1.5px), black calc(100% - 1.5px))',
WebkitMask:
'radial-gradient(farthest-side, transparent calc(100% - 1.5px), black calc(100% - 1.5px))',
}}
/>
</div>
{!isWorkflowReady && (
<div className='absolute inset-0 z-[5] flex items-center justify-center bg-[var(--bg)]'>
<div
className='h-[18px] w-[18px] animate-spin rounded-full'
style={{
background:
'conic-gradient(from 0deg, hsl(var(--muted-foreground)) 0deg 120deg, transparent 120deg 180deg, hsl(var(--muted-foreground)) 180deg 300deg, transparent 300deg 360deg)',
mask: 'radial-gradient(farthest-side, transparent calc(100% - 1.5px), black calc(100% - 1.5px))',
WebkitMask:
'radial-gradient(farthest-side, transparent calc(100% - 1.5px), black calc(100% - 1.5px))',
}}
/>
</div>
)}
{isWorkflowReady && (
<>
@@ -3722,7 +3838,7 @@ const WorkflowContent = React.memo(() => {
noWheelClassName='allow-scroll'
edgesFocusable={true}
edgesUpdatable={effectivePermissions.canEdit}
className={`workflow-container h-full transition-opacity duration-150 ${reactFlowStyles} ${isCanvasReady ? 'opacity-100' : 'opacity-0'} ${isHandMode ? 'canvas-mode-hand' : 'canvas-mode-cursor'}`}
className={`workflow-container h-full bg-[var(--bg)] transition-opacity duration-150 ${reactFlowStyles} ${isCanvasReady ? 'opacity-100' : 'opacity-0'} ${isHandMode ? 'canvas-mode-hand' : 'canvas-mode-cursor'}`}
onNodeDrag={effectivePermissions.canEdit ? onNodeDrag : undefined}
onNodeDragStop={effectivePermissions.canEdit ? onNodeDragStop : undefined}
onSelectionDragStart={effectivePermissions.canEdit ? onSelectionDragStart : undefined}
@@ -3734,7 +3850,7 @@ const WorkflowContent = React.memo(() => {
elevateEdgesOnSelect={true}
onlyRenderVisibleElements={false}
deleteKeyCode={null}
elevateNodesOnSelect={true}
elevateNodesOnSelect={false}
autoPanOnConnect={effectivePermissions.canEdit}
autoPanOnNodeDrag={effectivePermissions.canEdit}
/>

View File

@@ -145,7 +145,7 @@ interface PreviewWorkflowProps {
/** Cursor style to show when hovering the canvas */
cursorStyle?: 'default' | 'pointer' | 'grab'
/** Map of executed block IDs to their status for highlighting the execution path */
executedBlocks?: Record<string, { status: string }>
executedBlocks?: Record<string, { status: string; output?: unknown }>
/** Currently selected block ID for highlighting */
selectedBlockId?: string | null
/** Skips expensive subblock computations for thumbnails/template previews */
@@ -274,9 +274,9 @@ export function PreviewWorkflow({
/** Maps base block IDs to execution data, handling parallel iteration variants (blockId₍n₎). */
const blockExecutionMap = useMemo(() => {
if (!executedBlocks) return new Map<string, { status: string }>()
if (!executedBlocks) return new Map<string, { status: string; output?: unknown }>()
const map = new Map<string, { status: string }>()
const map = new Map<string, { status: string; output?: unknown }>()
for (const [key, value] of Object.entries(executedBlocks)) {
// Extract base ID (remove iteration suffix like ₍0₎)
const baseId = key.includes('₍') ? key.split('₍')[0] : key
@@ -289,21 +289,38 @@ export function PreviewWorkflow({
return map
}, [executedBlocks])
/** Derives subflow status from children. Error takes precedence. */
/** Derives subflow status from children. Recursively checks nested subflows. Error takes precedence. */
const getSubflowExecutionStatus = useMemo(() => {
return (subflowId: string): ExecutionStatus | undefined => {
const derive = (
subflowId: string,
visited: Set<string> = new Set()
): ExecutionStatus | undefined => {
if (visited.has(subflowId)) return undefined
visited.add(subflowId)
const childIds = subflowChildrenMap.get(subflowId)
if (!childIds?.length) return undefined
const executedChildren = childIds
.map((id) => blockExecutionMap.get(id))
.filter((status): status is { status: string } => Boolean(status))
const childStatuses: string[] = []
for (const childId of childIds) {
const direct = blockExecutionMap.get(childId)
if (direct) {
childStatuses.push(direct.status)
} else {
const childBlock = workflowState.blocks?.[childId]
if (childBlock?.type === 'loop' || childBlock?.type === 'parallel') {
const nested = derive(childId, visited)
if (nested) childStatuses.push(nested)
}
}
}
if (executedChildren.length === 0) return undefined
if (executedChildren.some((s) => s.status === 'error')) return 'error'
if (childStatuses.length === 0) return undefined
if (childStatuses.some((s) => s === 'error')) return 'error'
return 'success'
}
}, [subflowChildrenMap, blockExecutionMap])
return derive
}, [subflowChildrenMap, blockExecutionMap, workflowState.blocks])
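The recursive derivation above can be sketched with plain maps in place of the memoized React state: error wins, success if any child executed, `undefined` if none did, with a visited set preventing infinite recursion on malformed nesting. Names and shapes below are illustrative:

```typescript
// Sketch of recursively deriving a subflow's execution status from children.
type Status = 'success' | 'error'
interface PBlock {
  type: string
}

function deriveSubflowStatus(
  subflowId: string,
  children: Map<string, string[]>, // subflowId -> child block IDs
  executed: Map<string, { status: string }>, // directly executed blocks
  blocks: Record<string, PBlock>,
  visited: Set<string> = new Set()
): Status | undefined {
  if (visited.has(subflowId)) return undefined
  visited.add(subflowId)
  const childIds = children.get(subflowId)
  if (!childIds?.length) return undefined
  const childStatuses: string[] = []
  for (const childId of childIds) {
    const direct = executed.get(childId)
    if (direct) {
      childStatuses.push(direct.status)
    } else if (blocks[childId]?.type === 'loop' || blocks[childId]?.type === 'parallel') {
      // Un-executed container: recurse to derive from its own children
      const nested = deriveSubflowStatus(childId, children, executed, blocks, visited)
      if (nested) childStatuses.push(nested)
    }
  }
  if (childStatuses.length === 0) return undefined
  return childStatuses.some((s) => s === 'error') ? 'error' : 'success'
}
```

The recursion is what lets an outer subflow report an error when the failing block sits two levels down inside a nested loop.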
/** Gets block status. Subflows derive status from children. */
const getBlockExecutionStatus = useMemo(() => {
@@ -434,7 +451,6 @@ export function PreviewWorkflow({
const edges: Edge[] = useMemo(() => {
if (!isValidWorkflowState) return []
/** Edge is green if target executed and source condition met by edge type. */
const getEdgeExecutionStatus = (edge: {
source: string
target: string
@@ -446,17 +462,40 @@ export function PreviewWorkflow({
if (!targetStatus?.executed) return 'not-executed'
const sourceStatus = getBlockExecutionStatus(edge.source)
const { sourceHandle } = edge
if (!sourceStatus?.executed) return 'not-executed'
if (sourceHandle === 'error') {
return sourceStatus?.status === 'error' ? 'success' : 'not-executed'
const handle = edge.sourceHandle
if (!handle) {
return sourceStatus.status === 'success' ? 'success' : 'not-executed'
}
if (sourceHandle === 'loop-start-source' || sourceHandle === 'parallel-start-source') {
return 'success'
const sourceOutput = blockExecutionMap.get(edge.source)?.output as
| Record<string, any>
| undefined
if (handle.startsWith('condition-')) {
const conditionValue = handle.substring('condition-'.length)
return sourceOutput?.selectedOption === conditionValue ? 'success' : 'not-executed'
}
return sourceStatus?.status === 'success' ? 'success' : 'not-executed'
if (handle.startsWith('router-')) {
const routeId = handle.substring('router-'.length)
return sourceOutput?.selectedRoute === routeId ? 'success' : 'not-executed'
}
switch (handle) {
case 'error':
return sourceStatus.status === 'error' ? 'error' : 'not-executed'
case 'source':
return sourceStatus.status === 'success' ? 'success' : 'not-executed'
case 'loop-start-source':
case 'loop-end-source':
case 'parallel-start-source':
case 'parallel-end-source':
return 'success'
default:
return sourceStatus.status === 'success' ? 'success' : 'not-executed'
}
}
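The preview edge-status decision above can be restated as a pure function over the pieces it inspects. This is a sketch under simplified inputs: the explicit `source` case collapses into the default branch (they behave identically), and the status lookups are passed in directly rather than read from the maps:

```typescript
// Sketch of the preview edge-status decision: an edge renders as executed
// only when its target ran and the source's output actually selected it.
type EdgeStatus = 'success' | 'error' | 'not-executed'

function previewEdgeStatus(
  handle: string | null | undefined,
  sourceStatus: { executed: boolean; status: string } | undefined,
  targetExecuted: boolean,
  sourceOutput: Record<string, any> | undefined
): EdgeStatus {
  if (!targetExecuted) return 'not-executed'
  if (!sourceStatus?.executed) return 'not-executed'
  if (!handle) return sourceStatus.status === 'success' ? 'success' : 'not-executed'
  if (handle.startsWith('condition-')) {
    return sourceOutput?.selectedOption === handle.substring('condition-'.length)
      ? 'success'
      : 'not-executed'
  }
  if (handle.startsWith('router-')) {
    return sourceOutput?.selectedRoute === handle.substring('router-'.length)
      ? 'success'
      : 'not-executed'
  }
  switch (handle) {
    case 'error':
      return sourceStatus.status === 'error' ? 'error' : 'not-executed'
    case 'loop-start-source':
    case 'loop-end-source':
    case 'parallel-start-source':
    case 'parallel-end-source':
      return 'success'
    default:
      return sourceStatus.status === 'success' ? 'success' : 'not-executed'
  }
}
```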
return (workflowState.edges || []).map((edge) => {

View File

@@ -667,15 +667,18 @@ describe.concurrent('Blocks Module', () => {
const errors: string[] = []
for (const block of blocks) {
const allSubBlockIds = new Set(block.subBlocks.map((sb) => sb.id))
// Exclude trigger-mode subBlocks — they operate in a separate rendering context
// and their IDs don't participate in canonical param resolution
const nonTriggerSubBlocks = block.subBlocks.filter((sb) => sb.mode !== 'trigger')
const allSubBlockIds = new Set(nonTriggerSubBlocks.map((sb) => sb.id))
const canonicalParamIds = new Set(
block.subBlocks.filter((sb) => sb.canonicalParamId).map((sb) => sb.canonicalParamId)
nonTriggerSubBlocks.filter((sb) => sb.canonicalParamId).map((sb) => sb.canonicalParamId)
)
for (const canonicalId of canonicalParamIds) {
if (allSubBlockIds.has(canonicalId!)) {
// Check if the matching subBlock also has a canonicalParamId pointing to itself
const matchingSubBlock = block.subBlocks.find(
const matchingSubBlock = nonTriggerSubBlocks.find(
(sb) => sb.id === canonicalId && !sb.canonicalParamId
)
if (matchingSubBlock) {
@@ -857,6 +860,10 @@ describe.concurrent('Blocks Module', () => {
if (typeof subBlock.condition === 'function') {
continue
}
// Skip trigger-mode subBlocks — they operate in a separate rendering context
if (subBlock.mode === 'trigger') {
continue
}
const conditionKey = serializeCondition(subBlock.condition)
if (!canonicalByCondition.has(subBlock.canonicalParamId)) {
canonicalByCondition.set(subBlock.canonicalParamId, new Set())

View File

@@ -1,5 +1,6 @@
import { createLogger } from '@sim/logger'
import { AgentIcon } from '@/components/icons'
import { getScopesForService } from '@/lib/oauth/utils'
import type { BlockConfig } from '@/blocks/types'
import { AuthMode } from '@/blocks/types'
import { getApiKeyCondition, getModelOptions } from '@/blocks/utils'
@@ -128,7 +129,7 @@ Return ONLY the JSON array.`,
serviceId: 'vertex-ai',
canonicalParamId: 'oauthCredential',
mode: 'basic',
requiredScopes: ['https://www.googleapis.com/auth/cloud-platform'],
requiredScopes: getScopesForService('vertex-ai'),
placeholder: 'Select Google Cloud account',
required: true,
condition: {

View File

@@ -1,4 +1,5 @@
import { AirtableIcon } from '@/components/icons'
import { getScopesForService } from '@/lib/oauth/utils'
import type { BlockConfig } from '@/blocks/types'
import { AuthMode } from '@/blocks/types'
import type { AirtableResponse } from '@/tools/airtable/types'
@@ -10,7 +11,7 @@ export const AirtableBlock: BlockConfig<AirtableResponse> = {
description: 'Read, create, and update Airtable',
authMode: AuthMode.OAuth,
longDescription:
'Integrates Airtable into the workflow. Can list bases, list tables (with schema), and create, get, list, or update records. Can also be used in trigger mode to trigger a workflow when an update is made to an Airtable table.',
docsLink: 'https://docs.sim.ai/tools/airtable',
category: 'tools',
bgColor: '#E0E0E0',
@@ -21,10 +22,13 @@ export const AirtableBlock: BlockConfig<AirtableResponse> = {
title: 'Operation',
type: 'dropdown',
options: [
{ label: 'List Bases', id: 'listBases' },
{ label: 'List Tables', id: 'listTables' },
{ label: 'List Records', id: 'list' },
{ label: 'Get Record', id: 'get' },
{ label: 'Create Records', id: 'create' },
{ label: 'Update Record', id: 'update' },
{ label: 'Update Multiple Records', id: 'updateMultiple' },
],
value: () => 'list',
},
@@ -35,12 +39,7 @@ export const AirtableBlock: BlockConfig<AirtableResponse> = {
canonicalParamId: 'oauthCredential',
mode: 'basic',
serviceId: 'airtable',
requiredScopes: getScopesForService('airtable'),
placeholder: 'Select Airtable account',
required: true,
},
@@ -53,21 +52,53 @@ export const AirtableBlock: BlockConfig<AirtableResponse> = {
placeholder: 'Enter credential ID',
required: true,
},
{
id: 'baseSelector',
title: 'Base',
type: 'project-selector',
canonicalParamId: 'baseId',
serviceId: 'airtable',
selectorKey: 'airtable.bases',
selectorAllowSearch: false,
placeholder: 'Select Airtable base',
dependsOn: ['credential'],
mode: 'basic',
condition: { field: 'operation', value: 'listBases', not: true },
required: { field: 'operation', value: 'listBases', not: true },
},
{
id: 'baseId',
title: 'Base ID',
type: 'short-input',
canonicalParamId: 'baseId',
placeholder: 'Enter your base ID (e.g., appXXXXXXXXXXXXXX)',
dependsOn: ['credential'],
mode: 'advanced',
condition: { field: 'operation', value: 'listBases', not: true },
required: { field: 'operation', value: 'listBases', not: true },
},
{
id: 'tableSelector',
title: 'Table',
type: 'file-selector',
canonicalParamId: 'tableId',
serviceId: 'airtable',
selectorKey: 'airtable.tables',
selectorAllowSearch: false,
placeholder: 'Select Airtable table',
dependsOn: ['credential', 'baseSelector'],
mode: 'basic',
condition: { field: 'operation', value: ['listBases', 'listTables'], not: true },
required: { field: 'operation', value: ['listBases', 'listTables'], not: true },
},
{
id: 'tableId',
title: 'Table ID',
type: 'short-input',
canonicalParamId: 'tableId',
placeholder: 'Enter table ID (e.g., tblXXXXXXXXXXXXXX)',
dependsOn: ['credential', 'baseId'],
mode: 'advanced',
condition: { field: 'operation', value: ['listBases', 'listTables'], not: true },
required: { field: 'operation', value: ['listBases', 'listTables'], not: true },
},
{
id: 'recordId',
@@ -83,6 +114,7 @@ export const AirtableBlock: BlockConfig<AirtableResponse> = {
type: 'short-input',
placeholder: 'Maximum records to return',
condition: { field: 'operation', value: 'list' },
mode: 'advanced',
},
{
id: 'filterFormula',
@@ -90,6 +122,7 @@ export const AirtableBlock: BlockConfig<AirtableResponse> = {
type: 'long-input',
placeholder: 'Airtable formula to filter records (optional)',
condition: { field: 'operation', value: 'list' },
mode: 'advanced',
wandConfig: {
enabled: true,
prompt: `Generate an Airtable filter formula based on the user's description.
@@ -206,6 +239,8 @@ Return ONLY the valid JSON object - no explanations, no markdown.`,
],
tools: {
access: [
'airtable_list_bases',
'airtable_list_tables',
'airtable_list_records',
'airtable_get_record',
'airtable_create_records',
@@ -215,6 +250,10 @@ Return ONLY the valid JSON object - no explanations, no markdown.`,
config: {
tool: (params) => {
switch (params.operation) {
case 'listBases':
return 'airtable_list_bases'
case 'listTables':
return 'airtable_list_tables'
case 'list':
return 'airtable_list_records'
case 'get':
@@ -278,6 +317,11 @@ Return ONLY the valid JSON object - no explanations, no markdown.`,
},
// Output structure depends on the operation, covered by AirtableResponse union type
outputs: {
// List Bases output
bases: { type: 'json', description: 'List of accessible Airtable bases' },
// List Tables output
tables: { type: 'json', description: 'List of tables in the base with schema' },
// Record outputs
records: { type: 'json', description: 'Retrieved record data' }, // Optional: for list, create, updateMultiple
record: { type: 'json', description: 'Single record data' }, // Optional: for get, update single
metadata: { type: 'json', description: 'Operation metadata' }, // Required: present in all responses
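The `condition` and conditional `required` objects used throughout this block (e.g. `{ field: 'operation', value: 'listBases', not: true }`) evaluate against the block's current param values. A minimal sketch of such an evaluator, with assumed names (the real evaluation logic is not part of this diff):

```typescript
// Assumed shape of the { field, value, not } condition objects seen above.
interface FieldCondition {
  field: string
  value: string | string[]
  not?: boolean
}

// Returns true when the condition holds for the given params: the field's
// value must match one of the listed values, inverted when `not` is set.
function evaluateCondition(cond: FieldCondition, params: Record<string, string>): boolean {
  const actual = params[cond.field]
  const values = Array.isArray(cond.value) ? cond.value : [cond.value]
  const matches = values.includes(actual)
  return cond.not ? !matches : matches
}
```

Under this reading, `tableSelector` is shown and required for every operation except `listBases` and `listTables`, which is exactly what the Airtable subBlocks above express.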

View File

@@ -1,4 +1,5 @@
import { AsanaIcon } from '@/components/icons'
import { getScopesForService } from '@/lib/oauth/utils'
import type { BlockConfig } from '@/blocks/types'
import { AuthMode } from '@/blocks/types'
import type { AsanaResponse } from '@/tools/asana/types'
@@ -36,7 +37,7 @@ export const AsanaBlock: BlockConfig<AsanaResponse> = {
mode: 'basic',
required: true,
serviceId: 'asana',
requiredScopes: getScopesForService('asana'),
placeholder: 'Select Asana account',
},
{
@@ -48,12 +49,31 @@ export const AsanaBlock: BlockConfig<AsanaResponse> = {
placeholder: 'Enter credential ID',
required: true,
},
{
id: 'workspaceSelector',
title: 'Workspace',
type: 'project-selector',
canonicalParamId: 'workspace',
serviceId: 'asana',
selectorKey: 'asana.workspaces',
selectorAllowSearch: false,
placeholder: 'Select Asana workspace',
dependsOn: ['credential'],
mode: 'basic',
condition: {
field: 'operation',
value: ['create_task', 'get_projects', 'search_tasks'],
},
required: true,
},
{
id: 'workspace',
title: 'Workspace GID',
type: 'short-input',
canonicalParamId: 'workspace',
required: true,
placeholder: 'Enter Asana workspace GID',
mode: 'advanced',
condition: {
field: 'operation',
value: ['create_task', 'get_projects', 'search_tasks'],
@@ -81,11 +101,29 @@ export const AsanaBlock: BlockConfig<AsanaResponse> = {
value: ['update_task', 'add_comment'],
},
},
{
id: 'getTasksWorkspaceSelector',
title: 'Workspace',
type: 'project-selector',
canonicalParamId: 'getTasks_workspace',
serviceId: 'asana',
selectorKey: 'asana.workspaces',
selectorAllowSearch: false,
placeholder: 'Select Asana workspace',
dependsOn: ['credential'],
mode: 'basic',
condition: {
field: 'operation',
value: ['get_task'],
},
},
{
id: 'getTasks_workspace',
title: 'Workspace GID',
type: 'short-input',
canonicalParamId: 'getTasks_workspace',
placeholder: 'Enter workspace GID',
mode: 'advanced',
condition: {
field: 'operation',
value: ['get_task'],

View File

@@ -86,11 +86,47 @@ export const AttioBlock: BlockConfig<AttioResponse> = {
},
// Record fields
{
id: 'objectTypeSelector',
title: 'Object Type',
type: 'project-selector',
canonicalParamId: 'objectType',
serviceId: 'attio',
selectorKey: 'attio.objects',
selectorAllowSearch: false,
placeholder: 'Select object type',
dependsOn: ['credential'],
mode: 'basic',
condition: {
field: 'operation',
value: [
'list_records',
'get_record',
'create_record',
'update_record',
'delete_record',
'assert_record',
],
},
required: {
field: 'operation',
value: [
'list_records',
'get_record',
'create_record',
'update_record',
'delete_record',
'assert_record',
],
},
},
{
id: 'objectType',
title: 'Object Type',
type: 'short-input',
canonicalParamId: 'objectType',
placeholder: 'e.g. people, companies',
mode: 'advanced',
condition: {
field: 'operation',
value: [
@@ -524,11 +560,49 @@ Return ONLY the JSON array. No explanations, no markdown, no extra text.
},
// List fields
{
id: 'listSelector',
title: 'List',
type: 'project-selector',
canonicalParamId: 'listIdOrSlug',
serviceId: 'attio',
selectorKey: 'attio.lists',
selectorAllowSearch: false,
placeholder: 'Select Attio list',
dependsOn: ['credential'],
mode: 'basic',
condition: {
field: 'operation',
value: [
'get_list',
'update_list',
'query_list_entries',
'get_list_entry',
'create_list_entry',
'update_list_entry',
'delete_list_entry',
],
},
required: {
field: 'operation',
value: [
'get_list',
'update_list',
'query_list_entries',
'get_list_entry',
'create_list_entry',
'update_list_entry',
'delete_list_entry',
],
},
},
{
id: 'listIdOrSlug',
title: 'List ID or Slug',
type: 'short-input',
canonicalParamId: 'listIdOrSlug',
placeholder: 'Enter the list ID or slug',
mode: 'advanced',
condition: {
field: 'operation',
value: [

View File

@@ -0,0 +1,97 @@
import { BrandfetchIcon } from '@/components/icons'
import type { BlockConfig } from '@/blocks/types'
import { AuthMode } from '@/blocks/types'
import type { BrandfetchGetBrandResponse, BrandfetchSearchResponse } from '@/tools/brandfetch/types'
export const BrandfetchBlock: BlockConfig<BrandfetchGetBrandResponse | BrandfetchSearchResponse> = {
type: 'brandfetch',
name: 'Brandfetch',
description: 'Look up brand assets, logos, colors, and company info',
longDescription:
'Integrate Brandfetch into your workflow. Retrieve brand logos, colors, fonts, and company data by domain, ticker, or name search.',
docsLink: 'https://docs.sim.ai/tools/brandfetch',
category: 'tools',
bgColor: '#000000',
icon: BrandfetchIcon,
authMode: AuthMode.ApiKey,
subBlocks: [
{
id: 'operation',
title: 'Operation',
type: 'dropdown',
options: [
{ label: 'Get Brand', id: 'get_brand' },
{ label: 'Search Brands', id: 'search' },
],
value: () => 'get_brand',
},
{
id: 'identifier',
title: 'Identifier',
type: 'short-input',
placeholder: 'e.g., nike.com, NKE, BTC',
required: { field: 'operation', value: 'get_brand' },
condition: { field: 'operation', value: 'get_brand' },
},
{
id: 'name',
title: 'Brand Name',
type: 'short-input',
placeholder: 'e.g., Nike',
required: { field: 'operation', value: 'search' },
condition: { field: 'operation', value: 'search' },
},
{
id: 'apiKey',
title: 'API Key',
type: 'short-input',
placeholder: 'Enter your Brandfetch API key',
required: true,
password: true,
},
],
tools: {
access: ['brandfetch_get_brand', 'brandfetch_search'],
config: {
tool: (params) => {
switch (params.operation) {
case 'get_brand':
return 'brandfetch_get_brand'
case 'search':
return 'brandfetch_search'
default:
return 'brandfetch_get_brand'
}
},
},
},
inputs: {
operation: { type: 'string', description: 'Operation to perform' },
identifier: {
type: 'string',
description: 'Brand identifier (domain, ticker, ISIN, or crypto symbol)',
},
name: { type: 'string', description: 'Brand name to search for' },
apiKey: { type: 'string', description: 'Brandfetch API key' },
},
outputs: {
id: { type: 'string', description: 'Unique brand identifier' },
name: { type: 'string', description: 'Brand name' },
domain: { type: 'string', description: 'Brand domain' },
claimed: { type: 'boolean', description: 'Whether the brand profile is claimed' },
description: { type: 'string', description: 'Short brand description' },
longDescription: { type: 'string', description: 'Detailed brand description' },
links: { type: 'array', description: 'Social media and website links' },
logos: { type: 'array', description: 'Brand logos with formats and themes' },
colors: { type: 'array', description: 'Brand colors with hex values' },
fonts: { type: 'array', description: 'Brand fonts' },
company: { type: 'json', description: 'Company firmographic data' },
qualityScore: { type: 'number', description: 'Data quality score (0-1)' },
isNsfw: { type: 'boolean', description: 'Adult content indicator' },
results: { type: 'array', description: 'Search results with brand name, domain, and icon' },
},
}
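The `tools.config.tool` dispatcher can be exercised on its own; this mirrors the switch from the Brandfetch block above as a standalone function (the standalone name is ours, not the repo's):

```typescript
// Mirrors the operation-to-tool dispatch in BrandfetchBlock.tools.config.tool:
// unknown or missing operations fall back to the default 'get_brand' tool.
function resolveBrandfetchTool(params: { operation?: string }): string {
  switch (params.operation) {
    case 'get_brand':
      return 'brandfetch_get_brand'
    case 'search':
      return 'brandfetch_search'
    default:
      return 'brandfetch_get_brand'
  }
}
```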

Some files were not shown because too many files have changed in this diff.