Compare commits


49 Commits

Author SHA1 Message Date
Waleed
ff7b5b528c v0.6.4: subflows, docusign, ashby new tools, box, workday, billing bug fixes 2026-03-18 23:12:36 -07:00
Waleed
cef321bda2 feat(box): add Box and Box Sign integrations (#3660)
* feat(box): add Box and Box Sign integrations

Add complete Box integration with file management (upload, download, get info, list folders, create/delete folders, copy, search, update metadata) and Box Sign e-signature support (create/get/list/cancel/resend sign requests). Includes OAuth provider setup, internal upload API route following the Dropbox pattern, block configurations, icon, and generated docs.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(box): address PR review comments

- Fix docsLink for Box Sign: use underscore (box_sign) to match docs URL
- Move normalizeFileInput from tool() to params() in Box block config to match Dropbox pattern
- Throw error on invalid additionalSigners JSON instead of silently dropping signers

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(box): remove unsupported reason param from cancel sign request

The Box Sign cancel endpoint (POST /sign_requests/{id}/cancel) does not
accept a request body per the API specification. Remove the misleading
reason parameter from the tool, types, block config, and docs.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(box): use canonical param ID for file normalization in params()

The params function must reference canonical IDs (params.file), not raw
subBlock IDs (uploadFile/fileRef) which are deleted after canonical
transformation. Matches the Dropbox block pattern.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(box): use generic output descriptions for shared file properties

Rename "Uploaded file ID/name" to "File ID/name" in
UPLOAD_FILE_OUTPUT_PROPERTIES since the constant is shared by both
upload and copy operations.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(box): rename items output to entries for list_folder_items

Rename the output field from "items" to "entries" to match Box API
naming and avoid collision with JSON schema "items" keyword that
prevented the docs generator from rendering the nested array structure.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(box): filter empty file IDs from sourceFileIds input

Add .filter(Boolean) after splitting sourceFileIds to prevent empty
strings from trailing/double commas being sent as invalid file IDs
to the Box Sign API.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
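The `.filter(Boolean)` fix described above can be sketched as follows (the helper name and exact trimming behavior are illustrative, not the actual Sim code):

```typescript
// Hypothetical helper: split a comma-separated sourceFileIds string and
// drop the empty entries produced by trailing or double commas, so no
// invalid (empty) file IDs reach the Box Sign API.
function parseSourceFileIds(input: string): string[] {
  return input
    .split(',')
    .map((id) => id.trim())
    .filter(Boolean) // removes '' left by ",," or a trailing comma
}
```

For example, `parseSourceFileIds('a,,b,')` yields `['a', 'b']` instead of sending empty IDs downstream.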

* refactor(box): merge Box Sign into single Box block

Combine Box and Box Sign into one unified block with all 15 operations
accessible via a single dropdown, removing the separate box_sign block.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(box): filter empty strings from tags array in update_file

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* style(docs): apply lint formatting to icon-mapping and meta.json

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* style(box): format chained method calls per linter rules

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* style(box,docusign): set block bgColor to white and regenerate docs

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* style(docs): apply lint formatting

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(box): populate OAuth scopes for Box since token response omits them

Box's OAuth2 token endpoint does not return a scope field in the
response, so Better Auth stores nothing in the DB. This causes the
credential selector to always show "Additional permissions required".
Fix by populating the scope from the requested scopes in the
account.create.before hook.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
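The scope backfill described above can be sketched as a pure function (a minimal sketch with assumed names — `resolveStoredScope` and the scope list are illustrative; the real fix lives in Better Auth's `account.create.before` hook):

```typescript
// Assumed scopes requested for Box in this integration (per the commits above).
const BOX_REQUESTED_SCOPES = ['root_readwrite', 'sign_requests.readwrite']

// When the provider's token response omits `scope`, fall back to the scopes
// we requested, so the stored credential reflects the granted permissions
// and the credential selector stops prompting for "Additional permissions".
function resolveStoredScope(
  providerId: string,
  tokenScope: string | null | undefined
): string | null {
  if (tokenScope) return tokenScope // provider reported scopes: trust them
  if (providerId === 'box') return BOX_REQUESTED_SCOPES.join(' ')
  return null // other providers: leave absent scopes absent
}
```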

* fix(box): add sign_requests.readwrite scope for Box Sign operations

Box Sign API requires the sign_requests.readwrite scope in addition
to root_readwrite. Without it, sign requests fail with "The request
requires higher privileges than provided by the access token."

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* update docs

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-18 23:06:08 -07:00
Vikhyath Mondreti
1809b3801b improvement(billing): immediately charge for billing upgrades (#3664)
* improvement(billing): immediately charge for billing upgrades

* block on payment failures even for upgrades

* address bugbot comments
2026-03-18 22:47:31 -07:00
Vikhyath Mondreti
bc111a6d5c feat(workday): block + tools (#3663)
* checkpoint workday block

* add icon svg

* fix workday to use soap api

* fix SOAP API

* address comments

* fix

* more type fixes

* address more comments

* fix files

* fix file editor useEffect

* fix build issue

* fix typing

* fix test
2026-03-18 22:26:10 -07:00
Waleed
12908c14be feat(ashby): add 15 new tools and fix existing tool accuracy (#3662)
* feat(ashby): add 15 new tools and fix existing tool accuracy

* fix(ashby): fix response field mappings for changeStage and createNote

* fix(ashby): fix websiteUrl field name in candidate.update request

* fix(ashby): revert body field names to candidateId and jobId for info endpoints

* fix(ashby): add subblock ID migrations for removed emailType and phoneType

* fix(ashby): map removed emailType/phoneType to dummy keys to avoid data corruption
2026-03-18 22:12:16 -07:00
Waleed
638063cac1 feat(docusign): add docusign integration (#3661)
* feat(docusign): add DocuSign e-signature integration

* fix(docusign): add base_uri null check and move file normalization to params

* fix(docusign): use canonical param documentFile instead of raw subBlock IDs

* fix(docusign): validate document file is present before sending envelope

* fix(docusign): rename tool files from kebab-case to snake_case for docs generation

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-18 22:07:51 -07:00
Vikhyath Mondreti
5f7a980c5f fix(schedules): deployment bug (#3666)
* fix(schedules): deployment bug

* fix
2026-03-18 21:39:13 -07:00
Vikhyath Mondreti
a2c08e19a8 fix(subflows): subflow-child selection issues, subflow error logs (#3656)
* fix(subflows): subflow-child selection issues, subflow error logs

* address comments

* make selection context more reliable

* fix more multiselect issues

* fix shift selection ordering to work correctly

* fix more comments

* address more comments

* reuse helper
2026-03-18 19:08:14 -07:00
Waleed
30f2d1a0fc v0.6.3: hubspot integration, kb block improvements 2026-03-18 11:19:55 -07:00
Theodore Li
5332614a19 fix(mothership): mothership-ran workflows show workflow validation errors (#3634)
* fix(mothership): mothership-ran workflows show workflow validation errors

* Distinguish errors from 5xxs

* Unify workflow event handling

* Fix run from block

* Fix lint

---------

Co-authored-by: Theodore Li <theo@sim.ai>
2026-03-18 13:55:59 -04:00
Waleed
ff5d90e0c0 fix(knowledge): infer MIME type from file extension in create/upsert tools (#3651)
* fix(knowledge): infer MIME type from file extension in create/upsert tools

Both create_document and upsert_document forced .txt extension and
text/plain MIME type regardless of the document name. Now the tools
infer the correct MIME type from the file extension (html, md, csv,
json, yaml, xml) and only default to .txt when no extension is given.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* refactor(knowledge): reuse existing getMimeTypeFromExtension from uploads

Replace duplicate EXTENSION_MIME_MAP and getMimeTypeFromExtension with
the existing, more comprehensive version from lib/uploads/utils/file-utils.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(knowledge): fix btoa stack overflow and duplicate encoding in create_document

Same fixes as upsert_document: use loop-based String.fromCharCode
instead of spread, consolidate duplicate TextEncoder calls, and
check byte length instead of character length for 1MB limit.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
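The two fixes mentioned above — avoiding the spread-based `String.fromCharCode(...bytes)` call (which can overflow the call stack on large inputs) and enforcing the 1MB limit on UTF-8 byte length rather than character count — can be sketched like this (names and the exact limit handling are assumptions, not the actual Sim implementation):

```typescript
const MAX_BYTES = 1024 * 1024 // 1MB limit, measured in UTF-8 bytes

// Base64-encode document content safely for large and multi-byte strings.
function encodeContent(content: string): string {
  const bytes = new TextEncoder().encode(content)
  // Byte-length check: a CJK/emoji-heavy string can exceed 1MB of bytes
  // while its character count stays well under the limit.
  if (bytes.length > MAX_BYTES) {
    throw new Error('Content exceeds 1MB limit')
  }
  // Loop instead of String.fromCharCode(...bytes): spreading a huge array
  // as arguments can blow the call stack.
  let binary = ''
  for (let i = 0; i < bytes.length; i++) {
    binary += String.fromCharCode(bytes[i])
  }
  return btoa(binary)
}
```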

* fix(knowledge): allowlist text-compatible MIME types in inferDocumentFileInfo

Use an explicit allowlist instead of only checking for octet-stream,
preventing binary MIME types (image/jpeg, audio/mpeg, etc.) from
leaking through when a user names a document with a binary extension.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(knowledge): remove pdf/rtf from allowlist, normalize unrecognized extensions

- Remove application/pdf and application/rtf from TEXT_COMPATIBLE_MIME_TYPES
  since these tools pass plain text content, not binary
- Normalize unrecognized extensions (e.g. report.v2) to .txt instead of
  preserving the original extension with text/plain MIME type

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(knowledge): handle dotfile names to avoid empty base in filename

Dotfiles like .env would produce an empty base, resulting in '.txt'.
Now falls back to the original name so .env becomes .env.txt.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
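Putting the commits above together, the inference behavior can be sketched as follows (the extension map and `inferDocumentFileInfo` shape are illustrative — the real code reuses `getMimeTypeFromExtension` from lib/uploads, which is more comprehensive):

```typescript
// Illustrative allowlist of text-compatible extensions (per the commits above).
const EXTENSION_MIME: Record<string, string> = {
  html: 'text/html',
  md: 'text/markdown',
  csv: 'text/csv',
  json: 'application/json',
  yaml: 'application/x-yaml',
  xml: 'application/xml',
  txt: 'text/plain',
}

function inferDocumentFileInfo(name: string): { fileName: string; mimeType: string } {
  const ext = name.includes('.') ? name.split('.').pop()!.toLowerCase() : ''
  const mime = EXTENSION_MIME[ext]
  if (mime) return { fileName: name, mimeType: mime }
  // Unrecognized or missing extension: normalize to .txt / text/plain.
  // Appending to the full original name means a dotfile like `.env`
  // becomes `.env.txt` rather than producing an empty base.
  return { fileName: `${name}.txt`, mimeType: 'text/plain' }
}
```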

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-18 10:49:18 -07:00
Waleed
8b245693e2 fix(hubspot): add missing tickets and oauth scopes to OAuth config (#3653) 2026-03-18 10:34:49 -07:00
Waleed
4bd0731871 v0.6.2: mothership stability, chat iframe embedding, KB upserts, new blog post 2026-03-18 03:29:39 -07:00
Waleed
60bb9422ca feat(blog): add v0.6 blog post and email broadcast (#3636)
* chore(blog): add v0.6 blog post and email broadcast scaffolding

* mothership blog

* turned on mothership blog

* small change

---------

Co-authored-by: Emir Karabeg <emirkarabeg@berkeley.edu>
2026-03-18 03:25:59 -07:00
Waleed
8a4c161ec4 feat(home): resizable chat/resource panel divider (#3648)
* feat(home): resizable chat/resource panel divider

* fix(home): address PR review comments

- Remove aria-hidden from resize handle outer div so separator role is visible to AT
- Add viewport-resize re-clamping in useMothershipResize to prevent panel exceeding max % after browser window narrows
- Change default MothershipView width from 60% to 50%

* refactor(home): eradicate useEffect anti-patterns per you-might-not-need-an-effect

- use-chat: remove messageQueue→ref sync Effect; inline assignment like other refs
- use-chat: replace activeResourceId selection Effect with useMemo (derived value, avoids
  extra re-render cycle; activeResourceIdRef now tracks effective value for API payloads)
- use-chat: replace 3x useLayoutEffect ref-sync (processSSEStream, finalize, sendMessage)
  with direct render-phase assignment — consistent with existing resourcesRef pattern
- user-input: fold onEditValueConsumed callback into existing render-phase guard; remove Effect
- home: move isResourceAnimatingIn 400ms timer into expandResource/handleResourceEvent event
  handlers where setIsResourceAnimatingIn(true) is called; remove reactive Effect watcher

* fix(home): revert default width to 60%, reduce max resize to 63%

* improvement(react): replace useEffect anti-patterns with better React primitives

* improvement(react): replace useEffect anti-patterns with better React primitives

* improvement(home): use pointer events for resize handle (touch + mouse support)

* fix(autosave): store idle-reset timer ref to prevent status corruption on rapid saves

* fix(home): move onEditValueConsumed call out of render phase into useEffect

* fix(home): add pointercancel handler; fix(settings): sync name on profile refetch

* fix(home): restore cleanupRef assignment dropped during AbortController refactor
2026-03-18 02:57:44 -07:00
Waleed
b84f30e9e7 fix(db): reduce connection pool sizes to prevent exhaustion (#3649) 2026-03-18 02:20:55 -07:00
Waleed
28de28899a improvement(landing): added enterprise section (#3637)
* improvement(landing): added enterprise section

* make components interactive

* added more things to pricing sheet

* remove debug log

* fix(landing): remove dead DotGrid component and fix enterprise CTA to use Link

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

---------

Co-authored-by: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-17 20:38:16 -07:00
Siddharth Ganesan
168cd585cb feat(mothership): request ids (#3645)
* Include rid

* Persist rid

* fix ui

* address comments

* update types

---------

Co-authored-by: Vikhyath Mondreti <vikhyath@simstudio.ai>
2026-03-17 18:24:27 -07:00
Waleed
5f89c7140c feat(knowledge): add upsert document operation (#3644)
* feat(knowledge): add upsert document operation to Knowledge block

Add a "Create or Update" (upsert) document capability that finds an
existing document by ID or filename, deletes it if found, then creates
a new document and queues re-processing. Includes new tool, API route,
block wiring, and typed interfaces.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(knowledge): address review comments on upsert document

- Reorder create-then-delete to prevent data loss if creation fails
- Move Zod validation before workflow authorization for validated input
- Fix btoa stack overflow for large content using loop-based encoding

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(knowledge): guard against empty createDocumentRecords result

Add safety check before accessing firstDocument to prevent TypeError
and data loss if createDocumentRecords unexpectedly returns empty.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(knowledge): prevent documentId fallthrough and use byte-count limit

- Use if/else so filename lookup only runs when no documentId is provided,
  preventing stale IDs from silently replacing unrelated documents
- Check utf8 byte length instead of character count for 1MB size limit,
  correctly handling multi-byte characters (CJK, emoji)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(knowledge): rollback on delete failure, deduplicate sub-block IDs

- Add compensating rollback: if deleteDocument throws after create
  succeeds, clean up the new record to prevent orphaned pending docs
- Merge duplicate name/content sub-blocks into single entries with
  array conditions, matching the documentTags pattern

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
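The create-then-delete ordering with compensating rollback described above can be sketched as follows (all function names are hypothetical; the real route works against the knowledge-base document store):

```typescript
// Upsert sketch: create the new document FIRST so a creation failure loses
// nothing, then delete the old one; if the delete throws, roll back the
// freshly created record to avoid orphaned pending documents.
async function upsertDocument(
  createDocument: () => Promise<{ id: string }>,
  deleteDocument: (id: string) => Promise<void>,
  rollback: (id: string) => Promise<void>,
  existingId: string | null
): Promise<{ id: string }> {
  const created = await createDocument()
  if (existingId) {
    try {
      await deleteDocument(existingId)
    } catch (err) {
      await rollback(created.id) // compensating action
      throw err
    }
  }
  return created
}
```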

* lint

* lint

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-17 18:14:29 -07:00
Waleed
2bc11a70ba waleedlatif1/hangzhou v2 (#3647)
* feat(admin): add user search by email and ID, remove table border

- Replace Load Users button with a live search input; query fires on any input
- Email search uses listUsers with contains operator
- User ID search (UUID format) uses admin.getUser directly for exact lookup
- Remove outer border on user table that rendered white in dark mode
- Reset pagination to page 0 on new search

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* fix(admin): replace live search with explicit search button

- Split searchInput (controlled input) from searchQuery (committed value)
  so the hook only fires on Search click or Enter, not every keystroke
- Gate table render on searchQuery.length > 0 to prevent stale results
  showing after input is cleared

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
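The split between the live input and the committed query can be sketched without React (a minimal model of the state machine described above; the class and field names are illustrative):

```typescript
// searchInput changes on every keystroke; searchQuery only changes when the
// user commits (Search click or Enter), so the data hook never fires per key.
class SearchState {
  searchInput = ''  // controlled input value
  searchQuery = ''  // committed value that drives fetching

  type(value: string) {
    this.searchInput = value // no fetch triggered here
  }

  commit() {
    this.searchQuery = this.searchInput
  }

  // Gate rendering on the committed query so stale results disappear
  // once the user clears the input and commits.
  get shouldRenderTable(): boolean {
    return this.searchQuery.length > 0
  }
}
```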

---------

Co-authored-by: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-17 18:00:55 -07:00
PlaneInABottle
67478bbc80 fix(logs): add durable execution diagnostics foundation (#3564)
* fix(logs): persist execution diagnostics markers

Store last-started and last-completed block markers with finalization metadata so later read surfaces can explain how a run ended without reconstructing executor state.

* fix(executor): preserve durable diagnostics ordering

Await only the persistence needed to keep diagnostics durable before terminal completion while keeping callback failures from changing execution behavior.

* fix(logs): preserve fallback diagnostics semantics

Keep successful fallback output and accumulated cost intact while tightening progress-write draining and deduplicating trace span counting for diagnostics helpers.

* fix(api): restore async execute route test mock

Add the missing AuthType export to the hybrid auth mock so the async execution route test exercises the 202 queueing path instead of crashing with a 500 in CI.

* fix(executor): align async block error handling

* fix(logs): tighten marker ordering scope

Allow same-millisecond marker writes to replace prior markers and drop the unused diagnostics read helper so this PR stays focused on persistence rather than unread foundation code.

* fix(logs): remove unused finalization type guard

Drop the unused helper so this PR only ships the persistence-side status types it actually uses.

* fix(executor): await subflow diagnostics callbacks

Ensure empty-subflow and subflow-error lifecycle callbacks participate in progress-write draining before terminal finalization while still swallowing callback failures.

---------

Co-authored-by: test <test@example.com>
Co-authored-by: Vikhyath Mondreti <vikhyath@simstudio.ai>
2026-03-17 17:24:40 -07:00
Waleed
c9f082da1a feat(csp): allow chat UI to be embedded in iframes (#3643)
* feat(csp): allow chat UI to be embedded in iframes

Mirror the existing form embed CSP pattern for chat pages: add
getChatEmbedCSPPolicy() with frame-ancestors *, configure /chat/:path*
headers in next.config.ts without X-Frame-Options, and early-return in
proxy.ts so chat routes skip the strict runtime CSP.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* refactor(csp): extract shared getEmbedCSPPolicy helper

Deduplicate getChatEmbedCSPPolicy and getFormEmbedCSPPolicy into a
shared private helper to prevent future divergence.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
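A minimal sketch of the shared embed-CSP shape described above (function names follow the commit messages; the exact directive set and header wiring are assumptions):

```typescript
// Shared private helper deduplicating chat and form embed policies.
// frame-ancestors * lets any origin iframe the page; the corresponding
// X-Frame-Options header must be omitted, since it would override this.
function getEmbedCSPPolicy(): string {
  return 'frame-ancestors *'
}

const getChatEmbedCSPPolicy = getEmbedCSPPolicy
const getFormEmbedCSPPolicy = getEmbedCSPPolicy

// Illustrative next.config.ts-style header entry for chat pages:
const chatHeaders = {
  source: '/chat/:path*',
  headers: [{ key: 'Content-Security-Policy', value: getChatEmbedCSPPolicy() }],
}
```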

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-17 17:12:34 -07:00
Waleed
75a3e2c3a8 fix(workspace): prevent stale placeholder data from corrupting workflow registry on switch 2026-03-17 16:20:06 -07:00
Siddharth Ganesan
cdd0f75cd5 fix(mothership): fix mothership file uploads (#3640)
* Fix files

* Fix

* Fix
2026-03-17 16:19:47 -07:00
Waleed
4f3bc37fe4 v0.6.1: added better auth admin plugin 2026-03-17 15:16:16 -07:00
Waleed
84d6fdc423 v0.6: mothership, tables, connectors 2026-03-17 12:21:15 -07:00
Vikhyath Mondreti
4c12914d35 v0.5.113: jira, ashby, google ads, grain updates 2026-03-12 22:54:25 -07:00
Waleed
e9bdc57616 v0.5.112: trace spans improvements, fathom integration, jira fixes, canvas navigation updates 2026-03-12 13:30:20 -07:00
Vikhyath Mondreti
36612ae42a v0.5.111: non-polling webhook execs off trigger.dev, gmail subject headers, webhook trigger configs (#3530) 2026-03-11 17:47:28 -07:00
Waleed
1c2c2c65d4 v0.5.110: webhook execution speedups, SSRF patches 2026-03-11 15:00:24 -07:00
Waleed
ecd3536a72 v0.5.109: obsidian and evernote integrations, slack fixes, remove memory instrumentation 2026-03-09 10:40:37 -07:00
Vikhyath Mondreti
8c0a2e04b1 v0.5.108: workflow input params in agent tools, bun upgrade, dropdown selectors for 14 blocks 2026-03-06 21:02:25 -08:00
Waleed
6586c5ce40 v0.5.107: new reddit, slack tools 2026-03-05 22:48:20 -08:00
Vikhyath Mondreti
3ce947566d v0.5.106: condition block and legacy kbs fixes, GPT 5.4 2026-03-05 17:30:05 -08:00
Waleed
70c36cb7aa v0.5.105: slack remove reaction, nested subflow locks fix, servicenow pagination, memory improvements 2026-03-04 22:38:26 -08:00
Waleed
f1ec5fe824 v0.5.104: memory improvements, nested subflows, careers page redirect, brandfetch, google meet 2026-03-03 23:45:29 -08:00
Waleed
e07e3c34cc v0.5.103: memory util instrumentation, API docs, amplitude, google pagespeed insights, pagerduty 2026-03-01 23:27:02 -08:00
Waleed
0d2e6ff31d v0.5.102: new integrations, new tools, ci speedups, memory leak instrumentation 2026-02-28 12:48:10 -08:00
Waleed
4fd0989264 v0.5.101: circular dependency mitigation, confluence enhancements, google tasks and bigquery integrations, workflow lock 2026-02-26 15:04:53 -08:00
Waleed
67f8a687f6 v0.5.100: multiple credentials, 40% speedup, gong, attio, audit log improvements 2026-02-25 00:28:25 -08:00
Waleed
af592349d3 v0.5.99: local dev improvements, live workflow logs in terminal 2026-02-23 00:24:49 -08:00
Waleed
0d86ea01f0 v0.5.98: change detection improvements, rate limit and code execution fixes, removed retired models, hex integration 2026-02-21 18:07:40 -08:00
Waleed
115f04e989 v0.5.97: oidc discovery for copilot mcp 2026-02-21 02:06:25 -08:00
Waleed
34d92fae89 v0.5.96: sim oauth provider, slack ephemeral message tool and blockkit support 2026-02-20 18:22:20 -08:00
Waleed
67aa4bb332 v0.5.95: gemini 3.1 pro, cloudflare, dataverse, revenuecat, redis, upstash, algolia tools; isolated-vm robustness improvements, tables backend (#3271)
* feat(tools): advanced fields for youtube, vercel; added cloudflare and dataverse tools (#3257)

* refactor(vercel): mark optional fields as advanced mode

Move optional/power-user fields behind the advanced toggle:
- List Deployments: project filter, target, state
- Create Deployment: project ID override, redeploy from, target
- List Projects: search
- Create/Update Project: framework, build/output/install commands
- Env Vars: variable type
- Webhooks: project IDs filter
- Checks: path, details URL
- Team Members: role filter
- All operations: team ID scope

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* style(youtube): mark optional params as advanced mode

Hide pagination, sort order, and filter fields behind the advanced
toggle for a cleaner default UX across all YouTube operations.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* added advanced fields for vercel and youtube, added cloudflare and dataverse block

* added desc for dataverse

* add more tools

* ack comment

* more

* ops

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>

* feat(tables): added tables (#2867)

* updates

* required

* trashy table viewer

* updates

* updates

* filtering ui

* updates

* updates

* updates

* one input mode

* format

* fix lints

* improved errors

* updates

* updates

* changes

* doc strings

* breaking down file

* update comments with ai

* updates

* comments

* changes

* revert

* updates

* dedupe

* updates

* updates

* updates

* refactoring

* renames & refactors

* refactoring

* updates

* undo

* update db

* wand

* updates

* fix comments

* fixes

* simplify comments

* updates

* renames

* better comments

* validation

* updates

* updates

* updates

* fix sorting

* fix appearance

* updating prompt to make it user sort

* rm

* updates

* rename

* comments

* clean comments

* simplification

* updates

* updates

* refactor

* reduced type confusion

* undo

* rename

* undo changes

* undo

* simplify

* updates

* updates

* revert

* updates

* db updates

* type fix

* fix

* fix error handling

* updates

* docs

* docs

* updates

* rename

* dedupe

* revert

* uncook

* updates

* fix

* fix

* fix

* fix

* prepare merge

* readd migrations

* add back missed code

* migrate enrichment logic to general abstraction

* address bugbot concerns

* adhere to size limits for tables

* remove conflicting migration

* add back migrations

* fix tables auth

* fix permissive auth

* fix lint

* reran migrations

* migrate to use tanstack query for all server state

* update table-selector

* update names

* added tables to permission groups, updated subblock types

---------

Co-authored-by: Vikhyath Mondreti <vikhyath@simstudio.ai>
Co-authored-by: waleed <walif6@gmail.com>

* fix(snapshot): changed insert to upsert when concurrent identical child workflows are running (#3259)

* fix(snapshot): changed insert to upsert when concurrent identical child workflows are running

* fixed ci tests failing

* fix(workflows): disallow duplicate workflow names at the same folder level (#3260)

* feat(tools): added redis, upstash, algolia, and revenuecat (#3261)

* feat(tools): added redis, upstash, algolia, and revenuecat

* ack comment

* feat(models): add gemini-3.1-pro-preview and update gemini-3-pro thinking levels (#3263)

* fix(audit-log): lazily resolve actor name/email when missing (#3262)

* fix(blocks): move type coercions from tools.config.tool to tools.config.params (#3264)

* fix(blocks): move type coercions from tools.config.tool to tools.config.params

Number() coercions in tools.config.tool ran at serialization time before
variable resolution, destroying dynamic references like <block.result.count>
by converting them to NaN/null. Moved all coercions to tools.config.params
which runs at execution time after variables are resolved.

Fixed in 15 blocks: exa, arxiv, sentry, incidentio, wikipedia, ahrefs,
posthog, elasticsearch, dropbox, hunter, lemlist, spotify, youtube, grafana,
parallel. Also added mode: 'advanced' to optional exa fields.

Closes #3258
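The timing bug above can be reproduced in miniature (a sketch with assumed names — `coerce` stands in for a `tools.config.tool` coercion, `resolveVariables` for the executor's variable resolution):

```typescript
// Coercion applied at serialization time (tool()) runs BEFORE variable
// resolution, so a dynamic reference string becomes NaN. Applied at
// execution time (params()), it runs AFTER resolution and works.
function coerce(params: { count: unknown }) {
  return { ...params, count: Number(params.count) }
}

// Stand-in for the resolver replacing <block.result.count> with its value.
function resolveVariables(params: { count: unknown }) {
  return params.count === '<block.result.count>' ? { count: '42' } : params
}

// Wrong order: coerce first, resolve second — the reference is destroyed.
const broken = resolveVariables(coerce({ count: '<block.result.count>' }))
// Right order: resolve first, coerce second — the reference survives.
const fixed = coerce(resolveVariables({ count: '<block.result.count>' }))
```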

* fix(blocks): address PR review — move remaining param mutations from tool() to params()

- Moved field mappings from tool() to params() in grafana, posthog,
  lemlist, spotify, dropbox (same dynamic reference bug)
- Fixed parallel.ts excerpts/full_content boolean logic
- Fixed parallel.ts search_queries empty case (must set undefined)
- Fixed elasticsearch.ts timeout not included when already ends with 's'
- Restored dropbox.ts tool() switch for proper default fallback

* fix(blocks): restore field renames to tool() for serialization-time validation

Field renames (e.g. personalApiKey→apiKey) must be in tool() because
validateRequiredFieldsBeforeExecution calls selectToolId()→tool() then
checks renamed field names on params. Only type coercions (Number(),
boolean) stay in params() to avoid destroying dynamic variable references.

* improvement(resolver): resolved empty sentinel to not pass through unexecuted valid refs to text inputs (#3266)

* fix(blocks): add required constraint for serviceDeskId in JSM block (#3268)

* fix(blocks): add required constraint for serviceDeskId in JSM block

* fix(blocks): rename custom field values to request field values in JSM create request

* fix(trigger): add isolated-vm support to trigger.dev container builds (#3269)

Scheduled workflow executions running in trigger.dev containers were
failing to spawn isolated-vm workers because the native module wasn't
available in the container. This caused loop condition evaluation to
silently fail and exit after one iteration.

- Add isolated-vm to build.external and additionalPackages in trigger config
- Include isolated-vm-worker.cjs via additionalFiles for child process spawning
- Add fallback path resolution for worker file in trigger.dev environment

* fix(tables): hide tables from sidebar and block registry (#3270)

* fix(tables): hide tables from sidebar and block registry

* fix(trigger): add isolated-vm support to trigger.dev container builds (#3269)

Scheduled workflow executions running in trigger.dev containers were
failing to spawn isolated-vm workers because the native module wasn't
available in the container. This caused loop condition evaluation to
silently fail and exit after one iteration.

- Add isolated-vm to build.external and additionalPackages in trigger config
- Include isolated-vm-worker.cjs via additionalFiles for child process spawning
- Add fallback path resolution for worker file in trigger.dev environment

* lint

* fix(trigger): update node version to align with main app (#3272)

* fix(build): fix corrupted sticky disk cache on blacksmith (#3273)

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
Co-authored-by: Lakee Sivaraya <71339072+lakeesiv@users.noreply.github.com>
Co-authored-by: Vikhyath Mondreti <vikhyath@simstudio.ai>
Co-authored-by: Vikhyath Mondreti <vikhyathvikku@gmail.com>
2026-02-20 13:43:07 -08:00
Waleed
15ace5e63f v0.5.94: vercel integration, folder insertion, migrated tracking redirects to rewrites 2026-02-18 16:53:34 -08:00
Waleed
fdca73679d v0.5.93: NextJS config changes, MCP and Blocks whitelisting, copilot keyboard shortcuts, audit logs 2026-02-18 12:10:05 -08:00
Waleed
da46a387c9 v0.5.92: shortlinks, copilot scrolling stickiness, pagination 2026-02-17 15:13:21 -08:00
Waleed
b7e377ec4b v0.5.91: docs i18n, turborepo upgrade 2026-02-16 00:36:05 -08:00
224 changed files with 14388 additions and 1549 deletions

View File

@@ -1146,6 +1146,25 @@ export function DevinIcon(props: SVGProps<SVGSVGElement>) {
)
}
export function DocuSignIcon(props: SVGProps<SVGSVGElement>) {
return (
<svg {...props} viewBox='0 0 1547 1549' xmlns='http://www.w3.org/2000/svg'>
<path
d='m1113.4 1114.9v395.6c0 20.8-16.7 37.6-37.5 37.6h-1038.4c-20.7 0-37.5-16.8-37.5-37.6v-1039c0-20.7 16.8-37.5 37.5-37.5h394.3v643.4c0 20.7 16.8 37.5 37.5 37.5z'
fill='#4c00ff'
/>
<path
d='m1546 557.1c0 332.4-193.9 557-432.6 557.8v-418.8c0-12-4.8-24-13.5-31.9l-217.1-217.4c-8.8-8.8-20-13.6-32-13.6h-418.2v-394.8c0-20.8 16.8-37.6 37.5-37.6h585.1c277.7-0.8 490.8 223 490.8 556.3z'
fill='#ff5252'
/>
<path
d='m1099.9 663.4c8.7 8.7 13.5 19.9 13.5 31.9v418.8h-643.3c-20.7 0-37.5-16.8-37.5-37.5v-643.4h418.2c12 0 24 4.8 32 13.6z'
fill='#000000'
/>
</svg>
)
}
export function DiscordIcon(props: SVGProps<SVGSVGElement>) {
return (
<svg
@@ -1390,7 +1409,7 @@ export function AmplitudeIcon(props: SVGProps<SVGSVGElement>) {
return (
<svg {...props} xmlns='http://www.w3.org/2000/svg' viewBox='0 0 49 49'>
<path
fill='#FFFFFF'
fill='currentColor'
d='M23.4,15.3c0.6,1.8,1.2,4.1,1.9,6.7c-2.6,0-5.3-0.1-7.8-0.1h-1.3c1.5-5.7,3.2-10.1,4.6-11.1 c0.1-0.1,0.2-0.1,0.4-0.1c0.2,0,0.3,0.1,0.5,0.3C21.9,11.5,22.5,12.7,23.4,15.3z M49,24.5C49,38,38,49,24.5,49S0,38,0,24.5 S11,0,24.5,0S49,11,49,24.5z M42.7,23.9c0-0.6-0.4-1.2-1-1.3l0,0l0,0l0,0c-0.1,0-0.1,0-0.2,0h-0.2c-4.1-0.3-8.4-0.4-12.4-0.5l0,0 C27,14.8,24.5,7.4,21.3,7.4c-3,0-5.8,4.9-8.2,14.5c-1.7,0-3.2,0-4.6-0.1c-0.1,0-0.2,0-0.2,0c-0.3,0-0.5,0-0.5,0 c-0.8,0.1-1.4,0.9-1.4,1.7c0,0.8,0.6,1.6,1.5,1.7l0,0h4.6c-0.4,1.9-0.8,3.8-1.1,5.6l-0.1,0.8l0,0c0,0.6,0.5,1.1,1.1,1.1 c0.4,0,0.8-0.2,1-0.5l0,0l2.2-7.1h10.7c0.8,3.1,1.7,6.3,2.8,9.3c0.6,1.6,2,5.4,4.4,5.4l0,0c3.6,0,5-5.8,5.9-9.6 c0.2-0.8,0.4-1.5,0.5-2.1l0.1-0.2l0,0c0-0.1,0-0.2,0-0.3c-0.1-0.2-0.2-0.3-0.4-0.4c-0.3-0.1-0.5,0.1-0.6,0.4l0,0l-0.1,0.2 c-0.3,0.8-0.6,1.6-0.8,2.3v0.1c-1.6,4.4-2.3,6.4-3.7,6.4l0,0l0,0l0,0c-1.8,0-3.5-7.3-4.1-10.1c-0.1-0.5-0.2-0.9-0.3-1.3h11.7 c0.2,0,0.4-0.1,0.6-0.1l0,0c0,0,0,0,0.1,0c0,0,0,0,0.1,0l0,0c0,0,0.1,0,0.1-0.1l0,0C42.5,24.6,42.7,24.3,42.7,23.9z'
/>
</svg>
@@ -4569,11 +4588,17 @@ export function ShopifyIcon(props: SVGProps<SVGSVGElement>) {
export function BoxCompanyIcon(props: SVGProps<SVGSVGElement>) {
return (
<svg {...props} xmlns='http://www.w3.org/2000/svg' viewBox='0 0 41 22'>
<path
d='M39.7 19.2c.5.7.4 1.6-.2 2.1-.7.5-1.7.4-2.2-.2l-3.5-4.5-3.4 4.4c-.5.7-1.5.7-2.2.2-.7-.5-.8-1.4-.3-2.1l4-5.2-4-5.2c-.5-.7-.3-1.7.3-2.2.7-.5 1.7-.3 2.2.3l3.4 4.5L37.3 7c.5-.7 1.4-.8 2.2-.3.7.5.7 1.5.2 2.2L35.8 14l3.9 5.2zm-18.2-.6c-2.6 0-4.7-2-4.7-4.6 0-2.5 2.1-4.6 4.7-4.6s4.7 2.1 4.7 4.6c-.1 2.6-2.2 4.6-4.7 4.6zm-13.8 0c-2.6 0-4.7-2-4.7-4.6 0-2.5 2.1-4.6 4.7-4.6s4.7 2.1 4.7 4.6c0 2.6-2.1 4.6-4.7 4.6zM21.5 6.4c-2.9 0-5.5 1.6-6.8 4-1.3-2.4-3.9-4-6.9-4-1.8 0-3.4.6-4.7 1.5V1.5C3.1.7 2.4 0 1.6 0 .7 0 0 .7 0 1.5v12.6c.1 4.2 3.5 7.5 7.7 7.5 3 0 5.6-1.7 6.9-4.1 1.3 2.4 3.9 4.1 6.8 4.1 4.3 0 7.8-3.4 7.8-7.7.1-4.1-3.4-7.5-7.7-7.5z'
fill='currentColor'
/>
<svg
{...props}
xmlns='http://www.w3.org/2000/svg'
width='2500'
height='1379'
viewBox='0 0 444.893 245.414'
>
<g fill='#0075C9'>
<path d='M239.038 72.43c-33.081 0-61.806 18.6-76.322 45.904-14.516-27.305-43.24-45.902-76.32-45.902-19.443 0-37.385 6.424-51.821 17.266V16.925h-.008C34.365 7.547 26.713 0 17.286 0 7.858 0 .208 7.547.008 16.925H0v143.333h.036c.768 47.051 39.125 84.967 86.359 84.967 33.08 0 61.805-18.603 76.32-45.908 14.517 27.307 43.241 45.906 76.321 45.906 47.715 0 86.396-38.684 86.396-86.396.001-47.718-38.682-86.397-86.394-86.397zM86.395 210.648c-28.621 0-51.821-23.201-51.821-51.82 0-28.623 23.201-51.823 51.821-51.823 28.621 0 51.822 23.2 51.822 51.823 0 28.619-23.201 51.82-51.822 51.82zm152.643 0c-28.622 0-51.821-23.201-51.821-51.822 0-28.623 23.2-51.821 51.821-51.821 28.619 0 51.822 23.198 51.822 51.821-.001 28.621-23.203 51.822-51.822 51.822z' />
<path d='M441.651 218.033l-44.246-59.143 44.246-59.144-.008-.007c5.473-7.62 3.887-18.249-3.652-23.913-7.537-5.658-18.187-4.221-23.98 3.157l-.004-.002-38.188 51.047-38.188-51.047-.006.009c-5.793-7.385-16.441-8.822-23.981-3.16-7.539 5.664-9.125 16.293-3.649 23.911l-.008.005 44.245 59.144-44.245 59.143.008.005c-5.477 7.62-3.89 18.247 3.649 23.909 7.54 5.664 18.188 4.225 23.981-3.155l.006.007 38.188-51.049 38.188 51.049.004-.002c5.794 7.377 16.443 8.814 23.98 3.154 7.539-5.662 9.125-16.291 3.652-23.91l.008-.008z' />
</g>
</svg>
)
}


@@ -16,6 +16,7 @@ import {
AsanaIcon,
AshbyIcon,
AttioIcon,
BoxCompanyIcon,
BrainIcon,
BrandfetchIcon,
BrowserUseIcon,
@@ -32,6 +33,7 @@ import {
DevinIcon,
DiscordIcon,
DocumentIcon,
DocuSignIcon,
DropboxIcon,
DsPyIcon,
DubIcon,
@@ -184,6 +186,7 @@ export const blockTypeToIconMap: Record<string, IconComponent> = {
asana: AsanaIcon,
ashby: AshbyIcon,
attio: AttioIcon,
box: BoxCompanyIcon,
brandfetch: BrandfetchIcon,
browser_use: BrowserUseIcon,
calcom: CalComIcon,
@@ -198,6 +201,7 @@ export const blockTypeToIconMap: Record<string, IconComponent> = {
datadog: DatadogIcon,
devin: DevinIcon,
discord: DiscordIcon,
docusign: DocuSignIcon,
dropbox: DropboxIcon,
dspy: DsPyIcon,
dub: DubIcon,


@@ -0,0 +1,440 @@
---
title: Box
description: Manage files, folders, and e-signatures with Box
---
import { BlockInfoCard } from "@/components/ui/block-info-card"
<BlockInfoCard
type="box"
color="#FFFFFF"
/>
{/* MANUAL-CONTENT-START:intro */}
[Box](https://www.box.com/) is a leading cloud content management and file sharing platform trusted by enterprises worldwide to securely store, manage, and collaborate on files. Box provides robust APIs for automating file operations and integrating with business workflows, including [Box Sign](https://www.box.com/esignature) for native e-signatures.
With the Box integration in Sim, you can:
- **Upload files**: Upload documents, images, and other files to any Box folder
- **Download files**: Retrieve file content from Box for processing in your workflows
- **Get file info**: Access detailed metadata including size, owner, timestamps, tags, and shared links
- **List folder contents**: Browse files and folders with sorting and pagination support
- **Create folders**: Organize your Box storage by creating new folders programmatically
- **Delete files and folders**: Remove content with optional recursive deletion for folders
- **Copy files**: Duplicate files across folders with optional renaming
- **Search**: Find files and folders by name, content, extension, or location
- **Update file metadata**: Rename, move, add descriptions, or tag files
- **Create sign requests**: Send documents for e-signature with one or more signers
- **Track signing status**: Monitor the progress of sign requests
- **List sign requests**: View all sign requests with marker-based pagination
- **Cancel sign requests**: Cancel pending sign requests that are no longer needed
- **Resend sign reminders**: Send reminder notifications to signers who haven't completed signing
These capabilities allow your Sim agents to automate Box operations directly within your workflows — from organizing documents and distributing content to processing uploaded files, managing e-signature workflows for offer letters and contracts, and maintaining structured cloud storage as part of your business processes.
{/* MANUAL-CONTENT-END */}
## Usage Instructions
Integrate Box into your workflow to manage files, folders, and e-signatures. Upload and download files, search content, create folders, send documents for e-signature, track signing status, and more.
## Tools
### `box_upload_file`
Upload a file to a Box folder
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `parentFolderId` | string | Yes | The ID of the folder to upload the file to \(use "0" for root\) |
| `file` | file | No | The file to upload \(UserFile object\) |
| `fileContent` | string | No | Legacy: base64-encoded file content |
| `fileName` | string | No | Optional filename override |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `id` | string | File ID |
| `name` | string | File name |
| `size` | number | File size in bytes |
| `sha1` | string | SHA1 hash of file content |
| `createdAt` | string | Creation timestamp |
| `modifiedAt` | string | Last modified timestamp |
| `parentId` | string | Parent folder ID |
| `parentName` | string | Parent folder name |
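Box file uploads go through Box's multipart upload endpoint, `POST https://upload.box.com/api/2.0/files/content`. The sketch below is illustrative (the helper names are not the integration's actual code); it shows how the tool's parameters could map onto that request. Box expects an `attributes` form field holding JSON metadata alongside the binary `file` field.

```typescript
// Illustrative sketch of mapping box_upload_file params onto Box's
// multipart upload endpoint. Helper names are hypothetical.
interface UploadParams {
  parentFolderId: string // "0" targets the root folder
  fileName: string
}

// Box expects an `attributes` form field containing JSON metadata
// next to the binary `file` field.
function buildUploadAttributes({ parentFolderId, fileName }: UploadParams): string {
  return JSON.stringify({ name: fileName, parent: { id: parentFolderId } })
}

async function uploadToBox(token: string, params: UploadParams, blob: Blob) {
  const form = new FormData()
  form.append('attributes', buildUploadAttributes(params))
  form.append('file', blob, params.fileName)
  const res = await fetch('https://upload.box.com/api/2.0/files/content', {
    method: 'POST',
    headers: { Authorization: `Bearer ${token}` },
    body: form,
  })
  if (!res.ok) throw new Error(`Box upload failed: ${res.status}`)
  // Box wraps the created file in an `entries` array:
  // { entries: [{ id, name, size, sha1, ... }] }
  return res.json()
}
```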
### `box_download_file`
Download a file from Box
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `fileId` | string | Yes | The ID of the file to download |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `file` | file | Downloaded file stored in execution files |
| `content` | string | Base64-encoded file content |
### `box_get_file_info`
Get detailed information about a file in Box
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `fileId` | string | Yes | The ID of the file to get information about |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `id` | string | File ID |
| `name` | string | File name |
| `description` | string | File description |
| `size` | number | File size in bytes |
| `sha1` | string | SHA1 hash of file content |
| `createdAt` | string | Creation timestamp |
| `modifiedAt` | string | Last modified timestamp |
| `createdBy` | object | User who created the file |
| `modifiedBy` | object | User who last modified the file |
| `ownedBy` | object | User who owns the file |
| `parentId` | string | Parent folder ID |
| `parentName` | string | Parent folder name |
| `sharedLink` | json | Shared link details |
| `tags` | array | File tags |
| `commentCount` | number | Number of comments |
### `box_list_folder_items`
List files and folders in a Box folder
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `folderId` | string | Yes | The ID of the folder to list items from \(use "0" for root\) |
| `limit` | number | No | Maximum number of items to return per page |
| `offset` | number | No | The offset for pagination |
| `sort` | string | No | Sort field: id, name, date, or size |
| `direction` | string | No | Sort direction: ASC or DESC |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `entries` | array | List of items in the folder |
| ↳ `type` | string | Item type \(file, folder, web_link\) |
| ↳ `id` | string | Item ID |
| ↳ `name` | string | Item name |
| ↳ `size` | number | Item size in bytes |
| ↳ `createdAt` | string | Creation timestamp |
| ↳ `modifiedAt` | string | Last modified timestamp |
| `totalCount` | number | Total number of items in the folder |
| `offset` | number | Current pagination offset |
| `limit` | number | Current pagination limit |
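The `offset`, `limit`, and `totalCount` fields together support offset-based paging over large folders. A minimal sketch of the continuation logic (the helper name is hypothetical, assuming the output shape above):

```typescript
// Compute the offset for the next box_list_folder_items call,
// or null once every item has been fetched.
function nextOffset(page: { offset: number; limit: number; totalCount: number }): number | null {
  const consumed = page.offset + page.limit
  return consumed < page.totalCount ? consumed : null
}
```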
### `box_create_folder`
Create a new folder in Box
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `name` | string | Yes | Name for the new folder |
| `parentFolderId` | string | Yes | The ID of the parent folder \(use "0" for root\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `id` | string | Folder ID |
| `name` | string | Folder name |
| `createdAt` | string | Creation timestamp |
| `modifiedAt` | string | Last modified timestamp |
| `parentId` | string | Parent folder ID |
| `parentName` | string | Parent folder name |
### `box_delete_file`
Delete a file from Box
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `fileId` | string | Yes | The ID of the file to delete |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `deleted` | boolean | Whether the file was successfully deleted |
| `message` | string | Success confirmation message |
### `box_delete_folder`
Delete a folder from Box
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `folderId` | string | Yes | The ID of the folder to delete |
| `recursive` | boolean | No | Delete folder and all its contents recursively |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `deleted` | boolean | Whether the folder was successfully deleted |
| `message` | string | Success confirmation message |
### `box_copy_file`
Copy a file to another folder in Box
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `fileId` | string | Yes | The ID of the file to copy |
| `parentFolderId` | string | Yes | The ID of the destination folder |
| `name` | string | No | Optional new name for the copied file |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `id` | string | File ID |
| `name` | string | File name |
| `size` | number | File size in bytes |
| `sha1` | string | SHA1 hash of file content |
| `createdAt` | string | Creation timestamp |
| `modifiedAt` | string | Last modified timestamp |
| `parentId` | string | Parent folder ID |
| `parentName` | string | Parent folder name |
### `box_search`
Search for files and folders in Box
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `query` | string | Yes | The search query string |
| `limit` | number | No | Maximum number of results to return |
| `offset` | number | No | The offset for pagination |
| `ancestorFolderId` | string | No | Restrict search to a specific folder and its subfolders |
| `fileExtensions` | string | No | Comma-separated file extensions to filter by \(e.g., pdf,docx\) |
| `type` | string | No | Restrict to a specific content type: file, folder, or web_link |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `results` | array | Search results |
| ↳ `type` | string | Item type \(file, folder, web_link\) |
| ↳ `id` | string | Item ID |
| ↳ `name` | string | Item name |
| ↳ `size` | number | Item size in bytes |
| ↳ `createdAt` | string | Creation timestamp |
| ↳ `modifiedAt` | string | Last modified timestamp |
| ↳ `parentId` | string | Parent folder ID |
| ↳ `parentName` | string | Parent folder name |
| `totalCount` | number | Total number of matching results |
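These inputs correspond to query-string parameters on Box's `GET /2.0/search` endpoint (`ancestor_folder_ids`, `file_extensions`, `type`). A sketch of that mapping, with an illustrative helper name:

```typescript
// Illustrative translation of box_search params into the Box search URL.
function buildSearchQuery(params: {
  query: string
  limit?: number
  offset?: number
  ancestorFolderId?: string
  fileExtensions?: string // comma-separated, e.g. "pdf,docx"
  type?: 'file' | 'folder' | 'web_link'
}): string {
  const qs = new URLSearchParams({ query: params.query })
  if (params.limit !== undefined) qs.set('limit', String(params.limit))
  if (params.offset !== undefined) qs.set('offset', String(params.offset))
  if (params.ancestorFolderId) qs.set('ancestor_folder_ids', params.ancestorFolderId)
  if (params.fileExtensions) qs.set('file_extensions', params.fileExtensions)
  if (params.type) qs.set('type', params.type)
  return `https://api.box.com/2.0/search?${qs.toString()}`
}
```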
### `box_update_file`
Update file info in Box (rename, move, change description, add tags)
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `fileId` | string | Yes | The ID of the file to update |
| `name` | string | No | New name for the file |
| `description` | string | No | New description for the file \(max 256 characters\) |
| `parentFolderId` | string | No | Move the file to a different folder by specifying the folder ID |
| `tags` | string | No | Comma-separated tags to set on the file |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `id` | string | File ID |
| `name` | string | File name |
| `description` | string | File description |
| `size` | number | File size in bytes |
| `sha1` | string | SHA1 hash of file content |
| `createdAt` | string | Creation timestamp |
| `modifiedAt` | string | Last modified timestamp |
| `createdBy` | object | User who created the file |
| `modifiedBy` | object | User who last modified the file |
| `ownedBy` | object | User who owns the file |
| `parentId` | string | Parent folder ID |
| `parentName` | string | Parent folder name |
| `sharedLink` | json | Shared link details |
| `tags` | array | File tags |
| `commentCount` | number | Number of comments |
### `box_sign_create_request`
Create a new Box Sign request to send documents for e-signature
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `sourceFileIds` | string | Yes | Comma-separated Box file IDs to send for signing |
| `signerEmail` | string | Yes | Primary signer email address |
| `signerRole` | string | No | Primary signer role: signer, approver, or final_copy_reader \(default: signer\) |
| `additionalSigners` | string | No | JSON array of additional signers, e.g. \[\{"email":"user@example.com","role":"signer"\}\] |
| `parentFolderId` | string | No | Box folder ID where signed documents will be stored \(default: user root\) |
| `emailSubject` | string | No | Custom subject line for the signing email |
| `emailMessage` | string | No | Custom message in the signing email body |
| `name` | string | No | Name for the sign request |
| `daysValid` | number | No | Number of days before the request expires \(0-730\) |
| `areRemindersEnabled` | boolean | No | Whether to send automatic signing reminders |
| `areTextSignaturesEnabled` | boolean | No | Whether to allow typed \(text\) signatures |
| `signatureColor` | string | No | Signature color: blue, black, or red |
| `redirectUrl` | string | No | URL to redirect signers to after signing |
| `declinedRedirectUrl` | string | No | URL to redirect signers to after declining |
| `isDocumentPreparationNeeded` | boolean | No | Whether document preparation is needed before sending |
| `externalId` | string | No | External system reference ID |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `id` | string | Sign request ID |
| `status` | string | Request status \(converting, created, sent, viewed, signed, cancelled, declined, expired, error_converting, error_sending, finalizing, error_finalizing\) |
| `name` | string | Sign request name |
| `shortId` | string | Human-readable short ID |
| `signers` | array | List of signers |
| `sourceFiles` | array | Source files for signing |
| `emailSubject` | string | Custom email subject line |
| `emailMessage` | string | Custom email message body |
| `daysValid` | number | Number of days the request is valid |
| `createdAt` | string | Creation timestamp |
| `autoExpireAt` | string | Auto-expiration timestamp |
| `prepareUrl` | string | URL for document preparation \(if preparation is needed\) |
| `senderEmail` | string | Email of the sender |
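The Box Sign API (`POST /2.0/sign_requests`) takes a `signers` array, `source_files`, and an optional `parent_folder`. The sketch below shows roughly how the tool's inputs could assemble into that body; the helper and type names are illustrative, and malformed `additionalSigners` JSON is rejected loudly rather than silently dropped.

```typescript
// Hedged sketch of a Box Sign request body. Only the API field names
// (signers, source_files, parent_folder) are Box's; helpers are hypothetical.
type SignerRole = 'signer' | 'approver' | 'final_copy_reader'

interface SignRequestBody {
  signers: Array<{ email: string; role: SignerRole }>
  source_files: Array<{ type: 'file'; id: string }>
  parent_folder?: { type: 'folder'; id: string }
}

function buildSignRequestBody(params: {
  sourceFileIds: string // comma-separated Box file IDs
  signerEmail: string
  signerRole?: SignerRole
  additionalSigners?: string // JSON array string
  parentFolderId?: string
}): SignRequestBody {
  const signers: SignRequestBody['signers'] = [
    { email: params.signerEmail, role: params.signerRole ?? 'signer' },
  ]
  if (params.additionalSigners) {
    // Throw on invalid JSON instead of silently dropping signers.
    let extra: unknown
    try {
      extra = JSON.parse(params.additionalSigners)
    } catch {
      throw new Error('additionalSigners must be valid JSON')
    }
    if (!Array.isArray(extra)) throw new Error('additionalSigners must be a JSON array')
    signers.push(...extra)
  }
  const body: SignRequestBody = {
    signers,
    source_files: params.sourceFileIds
      .split(',')
      .map((id) => ({ type: 'file' as const, id: id.trim() })),
  }
  if (params.parentFolderId) {
    body.parent_folder = { type: 'folder', id: params.parentFolderId }
  }
  return body
}
```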
### `box_sign_get_request`
Get the details and status of a Box Sign request
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `signRequestId` | string | Yes | The ID of the sign request to retrieve |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `id` | string | Sign request ID |
| `status` | string | Request status \(converting, created, sent, viewed, signed, cancelled, declined, expired, error_converting, error_sending, finalizing, error_finalizing\) |
| `name` | string | Sign request name |
| `shortId` | string | Human-readable short ID |
| `signers` | array | List of signers |
| `sourceFiles` | array | Source files for signing |
| `emailSubject` | string | Custom email subject line |
| `emailMessage` | string | Custom email message body |
| `daysValid` | number | Number of days the request is valid |
| `createdAt` | string | Creation timestamp |
| `autoExpireAt` | string | Auto-expiration timestamp |
| `prepareUrl` | string | URL for document preparation \(if preparation is needed\) |
| `senderEmail` | string | Email of the sender |
### `box_sign_list_requests`
List all Box Sign requests
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `limit` | number | No | Maximum number of sign requests to return \(max 1000\) |
| `marker` | string | No | Pagination marker from a previous response |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `signRequests` | array | List of sign requests |
| ↳ `id` | string | Sign request ID |
| ↳ `status` | string | Request status \(converting, created, sent, viewed, signed, cancelled, declined, expired, error_converting, error_sending, finalizing, error_finalizing\) |
| ↳ `name` | string | Sign request name |
| ↳ `shortId` | string | Human-readable short ID |
| ↳ `signers` | array | List of signers |
| ↳ `sourceFiles` | array | Source files for signing |
| ↳ `emailSubject` | string | Custom email subject line |
| ↳ `emailMessage` | string | Custom email message body |
| ↳ `daysValid` | number | Number of days the request is valid |
| ↳ `createdAt` | string | Creation timestamp |
| ↳ `autoExpireAt` | string | Auto-expiration timestamp |
| ↳ `prepareUrl` | string | URL for document preparation \(if preparation is needed\) |
| ↳ `senderEmail` | string | Email of the sender |
| `count` | number | Number of sign requests returned in this page |
| `nextMarker` | string | Marker for next page of results |
### `box_sign_cancel_request`
Cancel a pending Box Sign request
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `signRequestId` | string | Yes | The ID of the sign request to cancel |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `id` | string | Sign request ID |
| `status` | string | Request status \(converting, created, sent, viewed, signed, cancelled, declined, expired, error_converting, error_sending, finalizing, error_finalizing\) |
| `name` | string | Sign request name |
| `shortId` | string | Human-readable short ID |
| `signers` | array | List of signers |
| `sourceFiles` | array | Source files for signing |
| `emailSubject` | string | Custom email subject line |
| `emailMessage` | string | Custom email message body |
| `daysValid` | number | Number of days the request is valid |
| `createdAt` | string | Creation timestamp |
| `autoExpireAt` | string | Auto-expiration timestamp |
| `prepareUrl` | string | URL for document preparation \(if preparation is needed\) |
| `senderEmail` | string | Email of the sender |
### `box_sign_resend_request`
Resend a Box Sign request to signers who have not yet signed
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `signRequestId` | string | Yes | The ID of the sign request to resend |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `message` | string | Success confirmation message |


@@ -0,0 +1,230 @@
---
title: DocuSign
description: Send documents for e-signature via DocuSign
---
import { BlockInfoCard } from "@/components/ui/block-info-card"
<BlockInfoCard
type="docusign"
color="#FFFFFF"
/>
{/* MANUAL-CONTENT-START:intro */}
[DocuSign](https://www.docusign.com) is the world's leading e-signature platform, enabling businesses to send, sign, and manage agreements digitally. With its powerful eSignature REST API, DocuSign supports the full document lifecycle from creation through completion.
With the DocuSign integration in Sim, you can:
- **Send envelopes**: Create and send documents for e-signature with custom recipients and signing tabs
- **Use templates**: Send envelopes from pre-configured DocuSign templates with role assignments
- **Track status**: Get envelope details including signing progress, timestamps, and recipient status
- **List envelopes**: Search and filter envelopes by date range, status, and text
- **Download documents**: Retrieve signed documents as base64-encoded files
- **Manage recipients**: View signer and CC recipient details and signing status
- **Void envelopes**: Cancel in-progress envelopes with a reason
In Sim, the DocuSign integration enables your agents to automate document workflows end-to-end. Agents can generate agreements, send them for signature, monitor completion, and retrieve signed copies—powering contract management, HR onboarding, sales closings, and compliance processes.
{/* MANUAL-CONTENT-END */}
## Usage Instructions
Create and send envelopes for e-signature, use templates, check signing status, download signed documents, and manage recipients with DocuSign.
## Tools
### `docusign_send_envelope`
Create and send a DocuSign envelope with a document for e-signature
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `emailSubject` | string | Yes | Email subject for the envelope |
| `emailBody` | string | No | Email body message |
| `signerEmail` | string | Yes | Email address of the signer |
| `signerName` | string | Yes | Full name of the signer |
| `ccEmail` | string | No | Email address of carbon copy recipient |
| `ccName` | string | No | Full name of carbon copy recipient |
| `file` | file | No | Document file to send for signature |
| `status` | string | No | Envelope status: "sent" to send immediately, "created" for draft \(default: "sent"\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `envelopeId` | string | Created envelope ID |
| `status` | string | Envelope status |
| `statusDateTime` | string | Status change datetime |
| `uri` | string | Envelope URI |
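Behind this tool sits DocuSign's envelope-creation endpoint, `POST /v2.1/accounts/{accountId}/envelopes`, which takes documents as base64 plus a `recipients` object of signers and carbon copies. The sketch below is illustrative; only the API field names (`emailSubject`, `emailBlurb`, `documents`, `recipients`, `status`) are DocuSign's, and the helper is hypothetical.

```typescript
// Hedged sketch of an envelope definition for docusign_send_envelope.
interface EnvelopeDefinition {
  emailSubject: string
  emailBlurb?: string
  documents: Array<{ documentBase64: string; name: string; fileExtension: string; documentId: string }>
  recipients: {
    signers: Array<{ email: string; name: string; recipientId: string; routingOrder: string }>
    carbonCopies?: Array<{ email: string; name: string; recipientId: string; routingOrder: string }>
  }
  status: 'sent' | 'created' // "created" keeps the envelope as a draft
}

function buildEnvelope(params: {
  emailSubject: string
  emailBody?: string
  signerEmail: string
  signerName: string
  ccEmail?: string
  ccName?: string
  documentBase64: string
  documentName: string
  status?: 'sent' | 'created'
}): EnvelopeDefinition {
  const envelope: EnvelopeDefinition = {
    emailSubject: params.emailSubject,
    documents: [
      {
        documentBase64: params.documentBase64,
        name: params.documentName,
        fileExtension: params.documentName.split('.').pop() ?? 'pdf',
        documentId: '1',
      },
    ],
    recipients: {
      signers: [
        { email: params.signerEmail, name: params.signerName, recipientId: '1', routingOrder: '1' },
      ],
    },
    status: params.status ?? 'sent',
  }
  if (params.emailBody) envelope.emailBlurb = params.emailBody
  if (params.ccEmail && params.ccName) {
    envelope.recipients.carbonCopies = [
      { email: params.ccEmail, name: params.ccName, recipientId: '2', routingOrder: '2' },
    ]
  }
  return envelope
}
```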
### `docusign_create_from_template`
Create and send a DocuSign envelope using a pre-built template
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `templateId` | string | Yes | DocuSign template ID to use |
| `emailSubject` | string | No | Override email subject \(uses template default if not set\) |
| `emailBody` | string | No | Override email body message |
| `templateRoles` | string | Yes | JSON array of template roles, e.g. \[\{"roleName":"Signer","name":"John","email":"john@example.com"\}\] |
| `status` | string | No | Envelope status: "sent" to send immediately, "created" for draft \(default: "sent"\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `envelopeId` | string | Created envelope ID |
| `status` | string | Envelope status |
| `statusDateTime` | string | Status change datetime |
| `uri` | string | Envelope URI |
### `docusign_get_envelope`
Get the details and status of a DocuSign envelope
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `envelopeId` | string | Yes | The envelope ID to retrieve |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `envelopeId` | string | Envelope ID |
| `status` | string | Envelope status \(created, sent, delivered, completed, declined, voided\) |
| `emailSubject` | string | Email subject line |
| `sentDateTime` | string | When the envelope was sent |
| `completedDateTime` | string | When all recipients completed signing |
| `createdDateTime` | string | When the envelope was created |
| `statusChangedDateTime` | string | When the status last changed |
| `voidedReason` | string | Reason the envelope was voided |
| `signerCount` | number | Number of signers |
| `documentCount` | number | Number of documents |
### `docusign_list_envelopes`
List envelopes from your DocuSign account with optional filters
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `fromDate` | string | No | Start date filter \(ISO 8601\). Defaults to 30 days ago |
| `toDate` | string | No | End date filter \(ISO 8601\) |
| `envelopeStatus` | string | No | Filter by status: created, sent, delivered, completed, declined, voided |
| `searchText` | string | No | Search text to filter envelopes |
| `count` | string | No | Maximum number of envelopes to return \(default: 25\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `envelopes` | array | Array of DocuSign envelopes |
| ↳ `envelopeId` | string | Unique envelope identifier |
| ↳ `status` | string | Envelope status \(created, sent, delivered, completed, declined, voided\) |
| ↳ `emailSubject` | string | Email subject line |
| ↳ `sentDateTime` | string | ISO 8601 datetime when envelope was sent |
| ↳ `completedDateTime` | string | ISO 8601 datetime when envelope was completed |
| ↳ `createdDateTime` | string | ISO 8601 datetime when envelope was created |
| ↳ `statusChangedDateTime` | string | ISO 8601 datetime of last status change |
| `totalSetSize` | number | Total number of matching envelopes |
| `resultSetSize` | number | Number of envelopes returned in this response |
### `docusign_void_envelope`
Void (cancel) a sent DocuSign envelope that has not yet been completed
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `envelopeId` | string | Yes | The envelope ID to void |
| `voidedReason` | string | Yes | Reason for voiding the envelope |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `envelopeId` | string | Voided envelope ID |
| `status` | string | Envelope status \(voided\) |
### `docusign_download_document`
Download a signed document from a completed DocuSign envelope
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `envelopeId` | string | Yes | The envelope ID containing the document |
| `documentId` | string | No | Specific document ID to download, or "combined" for all documents merged \(default: "combined"\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `base64Content` | string | Base64-encoded document content |
| `mimeType` | string | MIME type of the document |
| `fileName` | string | Original file name |
### `docusign_list_templates`
List available templates in your DocuSign account
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `searchText` | string | No | Search text to filter templates by name |
| `count` | string | No | Maximum number of templates to return |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `templates` | array | Array of DocuSign templates |
| ↳ `templateId` | string | Template identifier |
| ↳ `name` | string | Template name |
| ↳ `description` | string | Template description |
| ↳ `shared` | boolean | Whether template is shared |
| ↳ `created` | string | ISO 8601 creation date |
| ↳ `lastModified` | string | ISO 8601 last modified date |
| `totalSetSize` | number | Total number of matching templates |
| `resultSetSize` | number | Number of templates returned in this response |
### `docusign_list_recipients`
Get the recipient status details for a DocuSign envelope
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `envelopeId` | string | Yes | The envelope ID to get recipients for |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `signers` | array | Array of DocuSign recipients |
| ↳ `recipientId` | string | Recipient identifier |
| ↳ `name` | string | Recipient name |
| ↳ `email` | string | Recipient email address |
| ↳ `status` | string | Recipient signing status \(sent, delivered, completed, declined\) |
| ↳ `signedDateTime` | string | ISO 8601 datetime when recipient signed |
| ↳ `deliveredDateTime` | string | ISO 8601 datetime when delivered to recipient |
| `carbonCopies` | array | Array of carbon copy recipients |
| ↳ `recipientId` | string | Recipient ID |
| ↳ `name` | string | Recipient name |
| ↳ `email` | string | Recipient email |
| ↳ `status` | string | Recipient status |


@@ -53,6 +53,9 @@ Extract structured content from web pages with comprehensive metadata support. C
| `url` | string | Yes | The URL to scrape content from \(e.g., "https://example.com/page"\) |
| `scrapeOptions` | json | No | Options for content scraping |
| `apiKey` | string | Yes | Firecrawl API key |
| `pricing` | custom | No | No description |
| `metadata` | string | No | No description |
| `rateLimit` | string | No | No description |
#### Output
@@ -86,6 +89,9 @@ Search for information on the web using Firecrawl
| --------- | ---- | -------- | ----------- |
| `query` | string | Yes | The search query to use |
| `apiKey` | string | Yes | Firecrawl API key |
| `pricing` | custom | No | No description |
| `metadata` | string | No | No description |
| `rateLimit` | string | No | No description |
#### Output
@@ -123,6 +129,9 @@ Crawl entire websites and extract structured content from all accessible pages
| `includePaths` | json | No | URL paths to include in crawling \(e.g., \["/docs/*", "/api/*"\]\). Only these paths will be crawled |
| `onlyMainContent` | boolean | No | Extract only main content from pages |
| `apiKey` | string | Yes | Firecrawl API Key |
| `pricing` | custom | No | No description |
| `metadata` | string | No | No description |
| `rateLimit` | string | No | No description |
#### Output
@@ -142,7 +151,6 @@ Crawl entire websites and extract structured content from all accessible pages
| ↳ `statusCode` | number | HTTP status code |
| ↳ `ogLocaleAlternate` | array | Alternate locale versions |
| `total` | number | Total number of pages found during crawl |
| `creditsUsed` | number | Number of credits consumed by the crawl operation |
### `firecrawl_map`
@@ -161,6 +169,9 @@ Get a complete list of URLs from any website quickly and reliably. Useful for di
| `timeout` | number | No | Request timeout in milliseconds |
| `location` | json | No | Geographic context for proxying \(country, languages\) |
| `apiKey` | string | Yes | Firecrawl API key |
| `pricing` | custom | No | No description |
| `metadata` | string | No | No description |
| `rateLimit` | string | No | No description |
#### Output
@@ -187,6 +198,9 @@ Extract structured data from entire webpages using natural language prompts and
| `ignoreInvalidURLs` | boolean | No | Skip invalid URLs in the array \(default: true\) |
| `scrapeOptions` | json | No | Advanced scraping configuration options |
| `apiKey` | string | Yes | Firecrawl API key |
| `pricing` | custom | No | No description |
| `metadata` | string | No | No description |
| `rateLimit` | string | No | No description |
#### Output
@@ -217,7 +231,6 @@ Autonomous web data extraction agent. Searches and gathers information based on
| `success` | boolean | Whether the agent operation was successful |
| `status` | string | Current status of the agent job \(processing, completed, failed\) |
| `data` | object | Extracted data from the agent |
| `creditsUsed` | number | Number of credits consumed by this agent task |
| `expiresAt` | string | Timestamp when the results expire \(24 hours\) |
| `sources` | object | Array of source URLs used by the agent |


@@ -46,6 +46,8 @@ Search for books using the Google Books API
| `startIndex` | number | No | Index of the first result to return \(for pagination\) |
| `maxResults` | number | No | Maximum number of results to return \(1-40\) |
| `langRestrict` | string | No | Restrict results to a specific language \(ISO 639-1 code\) |
| `pricing` | per_request | No | No description |
| `rateLimit` | string | No | No description |
#### Output
@@ -82,6 +84,8 @@ Get detailed information about a specific book volume
| `apiKey` | string | Yes | Google Books API key |
| `volumeId` | string | Yes | The ID of the volume to retrieve |
| `projection` | string | No | Projection level \(full, lite\) |
| `pricing` | per_request | No | No description |
| `rateLimit` | string | No | No description |
#### Output


@@ -50,6 +50,8 @@ Get current air quality data for a location
| `lat` | number | Yes | Latitude coordinate |
| `lng` | number | Yes | Longitude coordinate |
| `languageCode` | string | No | Language code for the response \(e.g., "en", "es"\) |
| `pricing` | per_request | No | No description |
| `rateLimit` | string | No | No description |
#### Output
@@ -91,6 +93,8 @@ Get directions and route information between two locations
| `waypoints` | json | No | Array of intermediate waypoints |
| `units` | string | No | Unit system: metric or imperial |
| `language` | string | No | Language code for results \(e.g., en, es, fr\) |
| `pricing` | per_request | No | No description |
| `rateLimit` | string | No | No description |
#### Output
@@ -135,6 +139,8 @@ Calculate travel distance and time between multiple origins and destinations
| `avoid` | string | No | Features to avoid: tolls, highways, or ferries |
| `units` | string | No | Unit system: metric or imperial |
| `language` | string | No | Language code for results \(e.g., en, es, fr\) |
| `pricing` | per_request | No | No description |
| `rateLimit` | string | No | No description |
#### Output
@@ -163,6 +169,8 @@ Get elevation data for a location
| `apiKey` | string | Yes | Google Maps API key |
| `lat` | number | Yes | Latitude coordinate |
| `lng` | number | Yes | Longitude coordinate |
| `pricing` | per_request | No | No description |
| `rateLimit` | string | No | No description |
#### Output
@@ -185,6 +193,8 @@ Convert an address into geographic coordinates (latitude and longitude)
| `address` | string | Yes | The address to geocode |
| `language` | string | No | Language code for results \(e.g., en, es, fr\) |
| `region` | string | No | Region bias as a ccTLD code \(e.g., us, uk\) |
| `pricing` | per_request | No | No description |
| `rateLimit` | string | No | No description |
#### Output
@@ -217,6 +227,8 @@ Geolocate a device using WiFi access points, cell towers, or IP address
| `considerIp` | boolean | No | Whether to use IP address for geolocation \(default: true\) |
| `cellTowers` | array | No | Array of cell tower objects with cellId, locationAreaCode, mobileCountryCode, mobileNetworkCode |
| `wifiAccessPoints` | array | No | Array of WiFi access point objects with macAddress \(required\), signalStrength, etc. |
| `pricing` | per_request | No | No description |
| `rateLimit` | string | No | No description |
#### Output
@@ -238,6 +250,8 @@ Get detailed information about a specific place
| `placeId` | string | Yes | Google Place ID |
| `fields` | string | No | Comma-separated list of fields to return |
| `language` | string | No | Language code for results \(e.g., en, es, fr\) |
| `pricing` | per_request | No | No description |
| `rateLimit` | string | No | No description |
#### Output
@@ -290,6 +304,8 @@ Search for places using a text query
| `type` | string | No | Place type filter \(e.g., restaurant, cafe, hotel\) |
| `language` | string | No | Language code for results \(e.g., en, es, fr\) |
| `region` | string | No | Region bias as a ccTLD code \(e.g., us, uk\) |
| `pricing` | per_request | No | No description |
| `rateLimit` | string | No | No description |
#### Output
@@ -322,6 +338,8 @@ Convert geographic coordinates (latitude and longitude) into a human-readable ad
| `lat` | number | Yes | Latitude coordinate |
| `lng` | number | Yes | Longitude coordinate |
| `language` | string | No | Language code for results \(e.g., en, es, fr\) |
| `pricing` | per_request | No | No description |
| `rateLimit` | string | No | No description |
#### Output
@@ -346,6 +364,8 @@ Snap GPS coordinates to the nearest road segment
| `apiKey` | string | Yes | Google Maps API key with Roads API enabled |
| `path` | string | Yes | Pipe-separated list of lat,lng coordinates \(e.g., "60.170880,24.942795\|60.170879,24.942796"\) |
| `interpolate` | boolean | No | Whether to interpolate additional points along the road |
| `pricing` | per_request | No | No description |
| `rateLimit` | string | No | No description |
#### Output
@@ -399,6 +419,8 @@ Get timezone information for a location
| `lng` | number | Yes | Longitude coordinate |
| `timestamp` | number | No | Unix timestamp to determine DST offset \(defaults to current time\) |
| `language` | string | No | Language code for timezone name \(e.g., en, es, fr\) |
| `pricing` | per_request | No | No description |
| `rateLimit` | string | No | No description |
#### Output
@@ -424,6 +446,8 @@ Validate and standardize a postal address
| `regionCode` | string | No | ISO 3166-1 alpha-2 country code \(e.g., "US", "CA"\) |
| `locality` | string | No | City or locality name |
| `enableUspsCass` | boolean | No | Enable USPS CASS validation for US addresses |
| `pricing` | per_request | No | No description |
| `rateLimit` | string | No | No description |
#### Output

View File

@@ -55,6 +55,8 @@ Analyze a webpage for performance, accessibility, SEO, and best practices using
| `category` | string | No | Lighthouse categories to analyze \(comma-separated\): performance, accessibility, best-practices, seo |
| `strategy` | string | No | Analysis strategy: desktop or mobile |
| `locale` | string | No | Locale for results \(e.g., en, fr, de\) |
| `pricing` | per_request | No | No description |
| `rateLimit` | string | No | No description |
#### Output

View File

@@ -43,6 +43,9 @@ Translate text between languages using the Google Cloud Translation API. Support
| `target` | string | Yes | Target language code \(e.g., "es", "fr", "de", "ja"\) |
| `source` | string | No | Source language code. If omitted, the API will auto-detect the source language. |
| `format` | string | No | Format of the text: "text" for plain text, "html" for HTML content |
| `pricing` | custom | No | No description |
| `metadata` | string | No | No description |
| `rateLimit` | string | No | No description |
#### Output
@@ -61,6 +64,9 @@ Detect the language of text using the Google Cloud Translation API.
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Google Cloud API key with Cloud Translation API enabled |
| `text` | string | Yes | The text to detect the language of |
| `pricing` | custom | No | No description |
| `metadata` | string | No | No description |
| `rateLimit` | string | No | No description |
#### Output

View File

@@ -138,6 +138,26 @@ Get the full transcript of a recording
| ↳ `end` | number | End timestamp in ms |
| ↳ `text` | string | Transcript text |
### `grain_list_views`
List available Grain views for webhook subscriptions
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Grain API key \(Personal Access Token\) |
| `typeFilter` | string | No | Optional view type filter: recordings, highlights, or stories |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `views` | array | Array of Grain views |
| ↳ `id` | string | View UUID |
| ↳ `name` | string | View name |
| ↳ `type` | string | View type: recordings, highlights, or stories |
### `grain_list_teams`
List all teams in the workspace
@@ -185,15 +205,9 @@ Create a webhook to receive recording events
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Grain API key \(Personal Access Token\) |
| `hookUrl` | string | Yes | Webhook endpoint URL \(e.g., "https://example.com/webhooks/grain"\) |
| `hookType` | string | Yes | Type of webhook: "recording_added" or "upload_status" |
| `filterBeforeDatetime` | string | No | Filter: recordings before this ISO8601 date \(e.g., "2024-01-15T00:00:00Z"\) |
| `filterAfterDatetime` | string | No | Filter: recordings after this ISO8601 date \(e.g., "2024-01-01T00:00:00Z"\) |
| `filterParticipantScope` | string | No | Filter: "internal" or "external" |
| `filterTeamId` | string | No | Filter: specific team UUID \(e.g., "a1b2c3d4-e5f6-7890-abcd-ef1234567890"\) |
| `filterMeetingTypeId` | string | No | Filter: specific meeting type UUID \(e.g., "a1b2c3d4-e5f6-7890-abcd-ef1234567890"\) |
| `includeHighlights` | boolean | No | Include highlights in webhook payload |
| `includeParticipants` | boolean | No | Include participants in webhook payload |
| `includeAiSummary` | boolean | No | Include AI summary in webhook payload |
| `viewId` | string | Yes | Grain view ID from GET /_/public-api/views |
| `actions` | array | No | Optional list of actions to subscribe to: added, updated, removed |
| `items` | string | No | No description |
#### Output
@@ -202,9 +216,8 @@ Create a webhook to receive recording events
| `id` | string | Hook UUID |
| `enabled` | boolean | Whether hook is active |
| `hook_url` | string | The webhook URL |
| `hook_type` | string | Type of hook: recording_added or upload_status |
| `filter` | object | Applied filters |
| `include` | object | Included fields |
| `view_id` | string | Grain view ID for the webhook |
| `actions` | array | Configured actions for the webhook |
| `inserted_at` | string | ISO8601 creation timestamp |
### `grain_list_hooks`
@@ -225,9 +238,8 @@ List all webhooks for the account
| ↳ `id` | string | Hook UUID |
| ↳ `enabled` | boolean | Whether hook is active |
| ↳ `hook_url` | string | Webhook URL |
| ↳ `hook_type` | string | Type: recording_added or upload_status |
| ↳ `filter` | object | Applied filters |
| ↳ `include` | object | Included fields |
| ↳ `view_id` | string | Grain view ID |
| ↳ `actions` | array | Configured actions |
| ↳ `inserted_at` | string | Creation timestamp |
### `grain_delete_hook`

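The hook creation diff above replaces the old `hook_type`/`filter_*` parameters with a required `viewId` and an optional `actions` array. A minimal sketch of building the new payload, assuming the field names from the docs table and assuming (hypothetically) that omitting `actions` subscribes to all three events; names here are illustrative, not the actual Sim implementation:

```typescript
// Allowed webhook actions per the grain_create_hook docs above.
const VALID_ACTIONS = ['added', 'updated', 'removed'] as const
type HookAction = (typeof VALID_ACTIONS)[number]

interface CreateHookPayload {
  hook_url: string
  view_id: string
  actions: HookAction[]
}

// Build the request body for the view-based hook API. The all-actions
// default when `actions` is omitted is an assumption for illustration.
function buildCreateHookPayload(
  hookUrl: string,
  viewId: string,
  actions?: readonly string[]
): CreateHookPayload {
  const chosen = actions ?? VALID_ACTIONS
  for (const a of chosen) {
    if (!(VALID_ACTIONS as readonly string[]).includes(a)) {
      throw new Error(`Unsupported action: ${a}`)
    }
  }
  return {
    hook_url: hookUrl,
    view_id: viewId,
    actions: [...chosen] as HookAction[],
  }
}
```

The `viewId` comes from `grain_list_views` (GET /_/public-api/views), which is why that tool is added in the same diff.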
View File

@@ -64,6 +64,7 @@ Extract and process web content into clean, LLM-friendly text using Jina AI Read
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `content` | string | The extracted content from the URL, processed into clean, LLM-friendly text |
| `tokensUsed` | number | Number of Jina tokens consumed by this request |
### `jina_search`
@@ -97,5 +98,6 @@ Search the web and return top 5 results with LLM-friendly content. Each result i
| ↳ `content` | string | LLM-friendly extracted content |
| ↳ `usage` | object | Token usage information |
| ↳ `tokens` | number | Number of tokens consumed by this request |
| `tokensUsed` | number | Number of Jina tokens consumed by this request |

View File

@@ -122,6 +122,37 @@ Create a new document in a knowledge base
| `message` | string | Success or error message describing the operation result |
| `documentId` | string | ID of the created document |
### `knowledge_upsert_document`
Create or update a document in a knowledge base. If a document with the given ID or filename already exists, it will be replaced with the new content.
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `knowledgeBaseId` | string | Yes | ID of the knowledge base containing the document |
| `documentId` | string | No | Optional ID of an existing document to update. If not provided, lookup is done by filename. |
| `name` | string | Yes | Name of the document |
| `content` | string | Yes | Content of the document |
| `documentTags` | json | No | Document tags |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `data` | object | Information about the upserted document |
| ↳ `documentId` | string | Document ID |
| ↳ `documentName` | string | Document name |
| ↳ `type` | string | Document type |
| ↳ `enabled` | boolean | Whether the document is enabled |
| ↳ `isUpdate` | boolean | Whether an existing document was replaced |
| ↳ `previousDocumentId` | string | ID of the document that was replaced, if any |
| ↳ `createdAt` | string | Creation timestamp |
| ↳ `updatedAt` | string | Last update timestamp |
| `message` | string | Success or error message describing the operation result |
| `documentId` | string | ID of the upserted document |
### `knowledge_list_tags`
List all tag definitions for a knowledge base

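The upsert semantics described above (match by `documentId` when given, otherwise by filename; replace on match, create otherwise) can be sketched as a small pure function. This is an illustrative model against an in-memory store, not the actual Sim implementation; all names and the ID scheme are assumptions:

```typescript
interface Doc {
  documentId: string
  name: string
  content: string
}

interface UpsertResult {
  doc: Doc
  isUpdate: boolean
  previousDocumentId?: string
}

// Model of knowledge_upsert_document lookup: ID takes precedence,
// filename is the fallback, and a miss creates a new document.
function upsertDocument(
  store: Doc[],
  input: { documentId?: string; name: string; content: string }
): UpsertResult {
  const existing = input.documentId
    ? store.find((d) => d.documentId === input.documentId)
    : store.find((d) => d.name === input.name)

  if (existing) {
    // Replace the matched document with the new content.
    const doc: Doc = { ...existing, name: input.name, content: input.content }
    store[store.indexOf(existing)] = doc
    return { doc, isUpdate: true, previousDocumentId: existing.documentId }
  }

  // No match: create a new document (hypothetical ID scheme).
  const doc: Doc = {
    documentId: `doc_${store.length + 1}`,
    name: input.name,
    content: input.content,
  }
  store.push(doc)
  return { doc, isUpdate: false }
}
```

This mirrors the output fields in the table: `isUpdate` and `previousDocumentId` report whether an existing document was replaced.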
View File

@@ -51,6 +51,9 @@ Search the web for information using Linkup
| `includeDomains` | string | No | Comma-separated list of domain names to restrict search results to |
| `includeInlineCitations` | boolean | No | Add inline citations to answers \(only applies when outputType is "sourcedAnswer"\) |
| `includeSources` | boolean | No | Include sources in response |
| `pricing` | custom | No | No description |
| `metadata` | string | No | No description |
| `rateLimit` | string | No | No description |
#### Output

View File

@@ -13,6 +13,7 @@
"asana",
"ashby",
"attio",
"box",
"brandfetch",
"browser_use",
"calcom",
@@ -27,6 +28,7 @@
"datadog",
"devin",
"discord",
"docusign",
"dropbox",
"dspy",
"dub",

View File

@@ -49,6 +49,9 @@ Generate completions using Perplexity AI chat models
| `max_tokens` | number | No | Maximum number of tokens to generate \(e.g., 1024, 2048, 4096\) |
| `temperature` | number | No | Sampling temperature between 0 and 1 \(e.g., 0.0 for deterministic, 0.7 for creative\) |
| `apiKey` | string | Yes | Perplexity API key |
| `pricing` | custom | No | No description |
| `metadata` | string | No | No description |
| `rateLimit` | string | No | No description |
#### Output
@@ -78,6 +81,8 @@ Get ranked search results from Perplexity
| `search_after_date` | string | No | Include only content published after this date \(format: MM/DD/YYYY\) |
| `search_before_date` | string | No | Include only content published before this date \(format: MM/DD/YYYY\) |
| `apiKey` | string | Yes | Perplexity API key |
| `pricing` | per_request | No | No description |
| `rateLimit` | string | No | No description |
#### Output

View File

@@ -47,6 +47,9 @@ A powerful web search tool that provides access to Google search results through
| `hl` | string | No | Language code for search results \(e.g., "en", "es", "de", "fr"\) |
| `type` | string | No | Type of search to perform \(e.g., "search", "news", "images", "videos", "places", "shopping"\) |
| `apiKey` | string | Yes | Serper API Key |
| `pricing` | custom | No | No description |
| `metadata` | string | No | No description |
| `rateLimit` | string | No | No description |
#### Output

View File

@@ -4,14 +4,484 @@
* SEO:
* - `<section id="enterprise" aria-labelledby="enterprise-heading">`.
* - `<h2 id="enterprise-heading">` for the section title.
* - Compliance certs (SOC2, HIPAA) as visible `<strong>` text.
* - Compliance certs (SOC 2, HIPAA) as visible `<strong>` text.
* - Enterprise CTA links to contact form via `<a>` with `rel="noopener noreferrer"`.
*
* GEO:
* - Entity-rich: "Sim is SOC2 and HIPAA compliant" — not "We are compliant."
* - Entity-rich: "Sim is SOC 2 and HIPAA compliant" — not "We are compliant."
* - `<ul>` checklist of features (SSO, RBAC, audit logs, SLA, on-premise deployment)
* as an atomic answer block for "What enterprise features does Sim offer?".
*/
export default function Enterprise() {
return null
'use client'
import { useEffect, useRef, useState } from 'react'
import { AnimatePresence, motion } from 'framer-motion'
import Image from 'next/image'
import Link from 'next/link'
import { Badge, ChevronDown } from '@/components/emcn'
import { Lock } from '@/components/emcn/icons'
import { GithubIcon } from '@/components/icons'
/** Consistent color per actor — same pattern as Collaboration section cursors. */
const ACTOR_COLORS: Record<string, string> = {
'Sarah K.': '#2ABBF8',
'Sid G.': '#33C482',
'Theo L.': '#FA4EDF',
'Abhay K.': '#FFCC02',
'Danny S.': '#FF6B35',
}
/** Left accent bar opacity by recency — newest is brightest. */
const ACCENT_OPACITIES = [0.75, 0.45, 0.28, 0.15, 0.07] as const
/** Human-readable label per resource type. */
const RESOURCE_TYPE_LABEL: Record<string, string> = {
workflow: 'Workflow',
member: 'Member',
byok_key: 'BYOK Key',
api_key: 'API Key',
permission_group: 'Permission Group',
credential_set: 'Credential Set',
knowledge_base: 'Knowledge Base',
environment: 'Environment',
mcp_server: 'MCP Server',
file: 'File',
webhook: 'Webhook',
chat: 'Chat',
table: 'Table',
folder: 'Folder',
document: 'Document',
}
interface LogEntry {
id: number
actor: string
/** Matches the `description` field stored by recordAudit() */
description: string
resourceType: string
/** Unix ms timestamp of when this entry was "received" */
insertedAt: number
}
function formatTimeAgo(insertedAt: number): string {
const elapsed = Date.now() - insertedAt
if (elapsed < 8_000) return 'just now'
if (elapsed < 60_000) return `${Math.floor(elapsed / 1000)}s ago`
return `${Math.floor(elapsed / 60_000)}m ago`
}
/**
* Entry templates using real description strings from the actual recordAudit()
* calls across the codebase (e.g. `Added BYOK key for openai`,
* `Invited alex@acme.com to workspace as member`).
*/
const ENTRY_TEMPLATES: Omit<LogEntry, 'id' | 'insertedAt'>[] = [
{ actor: 'Sarah K.', description: 'Deployed workflow "Email Triage"', resourceType: 'workflow' },
{
actor: 'Sid G.',
description: 'Invited alex@acme.com to workspace as member',
resourceType: 'member',
},
{ actor: 'Theo L.', description: 'Added BYOK key for openai', resourceType: 'byok_key' },
{ actor: 'Sarah K.', description: 'Created workflow "Invoice Parser"', resourceType: 'workflow' },
{
actor: 'Abhay K.',
description: 'Created permission group "Engineering"',
resourceType: 'permission_group',
},
{ actor: 'Danny S.', description: 'Created API key "Production Key"', resourceType: 'api_key' },
{
actor: 'Theo L.',
description: 'Changed permissions for sam@acme.com to editor',
resourceType: 'member',
},
{ actor: 'Sarah K.', description: 'Uploaded file "Q3_Report.pdf"', resourceType: 'file' },
{
actor: 'Sid G.',
description: 'Created credential set "Prod Keys"',
resourceType: 'credential_set',
},
{
actor: 'Abhay K.',
description: 'Created knowledge base "Internal Docs"',
resourceType: 'knowledge_base',
},
{ actor: 'Danny S.', description: 'Updated environment variables', resourceType: 'environment' },
{
actor: 'Sarah K.',
description: 'Added tool "search_web" to MCP server',
resourceType: 'mcp_server',
},
{ actor: 'Sid G.', description: 'Created webhook "Stripe Payment"', resourceType: 'webhook' },
{ actor: 'Theo L.', description: 'Deployed chat "Support Assistant"', resourceType: 'chat' },
{ actor: 'Abhay K.', description: 'Created table "Lead Tracker"', resourceType: 'table' },
{ actor: 'Danny S.', description: 'Revoked API key "Staging Key"', resourceType: 'api_key' },
{
actor: 'Sarah K.',
description: 'Duplicated workflow "Data Enrichment"',
resourceType: 'workflow',
},
{
actor: 'Sid G.',
description: 'Removed member theo@acme.com from workspace',
resourceType: 'member',
},
{
actor: 'Theo L.',
description: 'Updated knowledge base "Product Docs"',
resourceType: 'knowledge_base',
},
{ actor: 'Abhay K.', description: 'Created folder "Finance Workflows"', resourceType: 'folder' },
{
actor: 'Danny S.',
description: 'Uploaded document "onboarding-guide.pdf"',
resourceType: 'document',
},
{
actor: 'Sarah K.',
description: 'Updated credential set "Prod Keys"',
resourceType: 'credential_set',
},
{
actor: 'Sid G.',
description: 'Added member abhay@acme.com to permission group "Engineering"',
resourceType: 'permission_group',
},
{ actor: 'Theo L.', description: 'Locked workflow "Customer Sync"', resourceType: 'workflow' },
]
const INITIAL_OFFSETS_MS = [0, 20_000, 75_000, 240_000, 540_000]
const MARQUEE_KEYFRAMES = `
@keyframes marquee {
0% { transform: translateX(0); }
100% { transform: translateX(-25%); }
}
@media (prefers-reduced-motion: reduce) {
@keyframes marquee { 0%, 100% { transform: none; } }
}
`
const FEATURE_TAGS = [
'Access Control',
'Self-Hosting',
'Bring Your Own Key',
'Credential Sharing',
'Custom Limits',
'Admin API',
'White Labeling',
'Dedicated Support',
'99.99% Uptime SLA',
'Workflow Versioning',
'On-Premise',
'Organizations',
'Workspace Export',
'Audit Logs',
] as const
interface AuditRowProps {
entry: LogEntry
index: number
}
function AuditRow({ entry, index }: AuditRowProps) {
const color = ACTOR_COLORS[entry.actor] ?? '#F6F6F6'
const accentOpacity = ACCENT_OPACITIES[index] ?? 0.04
const timeAgo = formatTimeAgo(entry.insertedAt)
const resourceLabel = RESOURCE_TYPE_LABEL[entry.resourceType]
return (
<div className='group relative overflow-hidden border-[#2A2A2A] border-b bg-[#191919] transition-colors duration-150 last:border-b-0 hover:bg-[#212121]'>
{/* Left accent bar — brightness encodes recency */}
<div
aria-hidden='true'
className='absolute top-0 bottom-0 left-0 w-[2px] transition-opacity duration-150 group-hover:opacity-100'
style={{ backgroundColor: color, opacity: accentOpacity }}
/>
{/* Row content */}
<div className='flex min-w-0 items-center gap-3 py-[10px] pr-4 pl-5'>
{/* Actor avatar */}
<div
className='flex h-[22px] w-[22px] shrink-0 items-center justify-center rounded-full'
style={{ backgroundColor: `${color}20` }}
>
<span className='font-[500] font-season text-[9px] leading-none' style={{ color }}>
{entry.actor[0]}
</span>
</div>
{/* Time */}
<span className='w-[56px] shrink-0 font-[430] font-season text-[#F6F6F6]/30 text-[11px] leading-none tracking-[0.02em]'>
{timeAgo}
</span>
{/* Description — hidden on mobile to avoid truncation */}
<span className='min-w-0 truncate font-[430] font-season text-[12px] leading-none tracking-[0.02em]'>
<span className='text-[#F6F6F6]/80'>{entry.actor}</span>
<span className='hidden sm:inline'>
<span className='text-[#F6F6F6]/40'> · </span>
<span className='text-[#F6F6F6]/55'>{entry.description}</span>
</span>
</span>
{/* Resource type label — formatted name, neutral so it doesn't compete with actor colors */}
{resourceLabel && (
<span className='ml-auto shrink-0 rounded border border-[#2A2A2A] px-[7px] py-[3px] font-[430] font-season text-[#F6F6F6]/25 text-[10px] leading-none tracking-[0.04em]'>
{resourceLabel}
</span>
)}
</div>
</div>
)
}
function AuditLogPreview() {
const counterRef = useRef(ENTRY_TEMPLATES.length)
const templateIndexRef = useRef(5 % ENTRY_TEMPLATES.length)
const now = Date.now()
const [entries, setEntries] = useState<LogEntry[]>(() =>
ENTRY_TEMPLATES.slice(0, 5).map((t, i) => ({
...t,
id: i,
insertedAt: now - INITIAL_OFFSETS_MS[i],
}))
)
const [, tick] = useState(0)
useEffect(() => {
const addInterval = setInterval(() => {
const template = ENTRY_TEMPLATES[templateIndexRef.current]
templateIndexRef.current = (templateIndexRef.current + 1) % ENTRY_TEMPLATES.length
setEntries((prev) => [
{ ...template, id: counterRef.current++, insertedAt: Date.now() },
...prev.slice(0, 4),
])
}, 2600)
// Refresh time labels every 5s so "just now" ages to "Xs ago"
const tickInterval = setInterval(() => tick((n) => n + 1), 5_000)
return () => {
clearInterval(addInterval)
clearInterval(tickInterval)
}
}, [])
return (
<div className='mx-6 mt-6 overflow-hidden rounded-[8px] border border-[#2A2A2A] md:mx-8 md:mt-8'>
{/* Header */}
<div className='flex items-center justify-between border-[#2A2A2A] border-b bg-[#161616] px-4 py-[10px]'>
<div className='flex items-center gap-2'>
{/* Pulsing live indicator */}
<span className='relative flex h-[8px] w-[8px]'>
<span
className='absolute inline-flex h-full w-full animate-ping rounded-full opacity-50'
style={{ backgroundColor: '#33C482' }}
/>
<span
className='relative inline-flex h-[8px] w-[8px] rounded-full'
style={{ backgroundColor: '#33C482' }}
/>
</span>
<span className='font-[430] font-season text-[#F6F6F6]/40 text-[11px] uppercase tracking-[0.08em]'>
Audit Log
</span>
</div>
<div className='flex items-center gap-2'>
<span className='rounded border border-[#2A2A2A] px-[8px] py-[3px] font-[430] font-season text-[#F6F6F6]/20 text-[11px] tracking-[0.02em]'>
Export
</span>
<span className='rounded border border-[#2A2A2A] px-[8px] py-[3px] font-[430] font-season text-[#F6F6F6]/20 text-[11px] tracking-[0.02em]'>
Filter
</span>
</div>
</div>
{/* Log entries — new items push existing ones down */}
<div className='overflow-hidden'>
<AnimatePresence mode='popLayout' initial={false}>
{entries.map((entry, index) => (
<motion.div
key={entry.id}
layout
initial={{ y: -48, opacity: 0 }}
animate={{ y: 0, opacity: 1 }}
exit={{ opacity: 0 }}
transition={{
layout: {
type: 'spring',
stiffness: 380,
damping: 38,
mass: 0.8,
},
y: { duration: 0.32, ease: [0.25, 0.46, 0.45, 0.94] },
opacity: { duration: 0.25 },
}}
>
<AuditRow entry={entry} index={index} />
</motion.div>
))}
</AnimatePresence>
</div>
</div>
)
}
function TrustStrip() {
return (
<div className='mx-6 mt-4 grid grid-cols-1 overflow-hidden rounded-[8px] border border-[#2A2A2A] sm:grid-cols-3 md:mx-8'>
{/* SOC 2 + HIPAA combined */}
<Link
href='https://trust.delve.co/sim-studio'
target='_blank'
rel='noopener noreferrer'
className='group flex items-center gap-3 border-[#2A2A2A] border-b px-4 py-[14px] transition-colors hover:bg-[#212121] sm:border-r sm:border-b-0'
>
<Image
src='/footer/soc2.png'
alt='SOC 2 Type II'
width={22}
height={22}
className='shrink-0 object-contain'
/>
<div className='flex flex-col gap-[3px]'>
<strong className='font-[430] font-season text-[13px] text-white leading-none'>
SOC 2 & HIPAA
</strong>
<span className='font-[430] font-season text-[#F6F6F6]/30 text-[11px] leading-none tracking-[0.02em] transition-colors group-hover:text-[#F6F6F6]/55'>
Type II · PHI protected
</span>
</div>
</Link>
{/* Open Source — center */}
<Link
href='https://github.com/simstudioai/sim'
target='_blank'
rel='noopener noreferrer'
className='group flex items-center gap-3 border-[#2A2A2A] border-b px-4 py-[14px] transition-colors hover:bg-[#212121] sm:border-r sm:border-b-0'
>
<div className='flex h-[22px] w-[22px] shrink-0 items-center justify-center rounded-full bg-[#FFCC02]/10'>
<GithubIcon width={11} height={11} className='text-[#FFCC02]/75' />
</div>
<div className='flex flex-col gap-[3px]'>
<strong className='font-[430] font-season text-[13px] text-white leading-none'>
Open Source
</strong>
<span className='font-[430] font-season text-[#F6F6F6]/30 text-[11px] leading-none tracking-[0.02em] transition-colors group-hover:text-[#F6F6F6]/55'>
View on GitHub
</span>
</div>
</Link>
{/* SSO */}
<div className='flex items-center gap-3 px-4 py-[14px]'>
<div className='flex h-[22px] w-[22px] shrink-0 items-center justify-center rounded-full bg-[#2ABBF8]/10'>
<Lock className='h-[14px] w-[14px] text-[#2ABBF8]/75' />
</div>
<div className='flex flex-col gap-[3px]'>
<strong className='font-[430] font-season text-[13px] text-white leading-none'>
SSO & SCIM
</strong>
<span className='font-[430] font-season text-[#F6F6F6]/30 text-[11px] leading-none tracking-[0.02em]'>
Okta, Azure AD, Google
</span>
</div>
</div>
</div>
)
}
export default function Enterprise() {
return (
<section id='enterprise' aria-labelledby='enterprise-heading' className='bg-[#F6F6F6]'>
<div className='px-4 pt-[60px] pb-[40px] sm:px-8 sm:pt-[80px] sm:pb-0 md:px-[80px] md:pt-[100px]'>
<div className='flex flex-col items-start gap-3 sm:gap-4 md:gap-[20px]'>
<Badge
variant='blue'
size='md'
dot
className='bg-[#FFCC02]/10 font-season text-[#FFCC02] uppercase tracking-[0.02em]'
>
Enterprise
</Badge>
<h2
id='enterprise-heading'
className='max-w-[600px] font-[430] font-season text-[#1C1C1C] text-[32px] leading-[100%] tracking-[-0.02em] sm:text-[36px] md:text-[40px]'
>
Enterprise features for
<br />
fast, scalable workflows
</h2>
</div>
<div className='mt-8 overflow-hidden rounded-[12px] bg-[#1C1C1C] sm:mt-10 md:mt-12'>
<AuditLogPreview />
<TrustStrip />
{/* Scrolling feature ticker */}
<div className='relative mt-6 overflow-hidden border-[#2A2A2A] border-t'>
<style dangerouslySetInnerHTML={{ __html: MARQUEE_KEYFRAMES }} />
{/* Fade edges */}
<div
aria-hidden='true'
className='pointer-events-none absolute top-0 bottom-0 left-0 z-10 w-16'
style={{ background: 'linear-gradient(to right, #1C1C1C, transparent)' }}
/>
<div
aria-hidden='true'
className='pointer-events-none absolute top-0 right-0 bottom-0 z-10 w-16'
style={{ background: 'linear-gradient(to left, #1C1C1C, transparent)' }}
/>
{/* Duplicate tags for seamless loop */}
<div className='flex w-max' style={{ animation: 'marquee 30s linear infinite' }}>
{[...FEATURE_TAGS, ...FEATURE_TAGS, ...FEATURE_TAGS, ...FEATURE_TAGS].map(
(tag, i) => (
<span
key={i}
className='whitespace-nowrap border-[#2A2A2A] border-r px-5 py-4 font-[430] font-season text-[#F6F6F6]/40 text-[13px] leading-none tracking-[0.02em]'
>
{tag}
</span>
)
)}
</div>
</div>
<div className='flex items-center justify-between border-[#2A2A2A] border-t px-6 py-5 md:px-8 md:py-6'>
<p className='font-[430] font-season text-[#F6F6F6]/40 text-[15px] leading-[150%] tracking-[0.02em]'>
Ready for growth?
</p>
<Link
href='/contact'
className='group/cta inline-flex h-[32px] items-center gap-[6px] rounded-[5px] border border-white bg-white px-[10px] font-[430] font-season text-[14px] text-black transition-colors hover:border-[#E0E0E0] hover:bg-[#E0E0E0]'
>
Book a demo
<span className='relative h-[10px] w-[10px] shrink-0'>
<ChevronDown className='-rotate-90 absolute inset-0 h-[10px] w-[10px] transition-opacity duration-150 group-hover/cta:opacity-0' />
<svg
className='absolute inset-0 h-[10px] w-[10px] opacity-0 transition-opacity duration-150 group-hover/cta:opacity-100'
viewBox='0 0 10 10'
fill='none'
>
<path
d='M1 5H8M5.5 2L8.5 5L5.5 8'
stroke='currentColor'
strokeWidth='1.33'
strokeLinecap='square'
strokeLinejoin='miter'
fill='none'
/>
</svg>
</span>
</Link>
</div>
</div>
</div>
</section>
)
}

View File

@@ -2,6 +2,8 @@
import { type SVGProps, useEffect, useRef, useState } from 'react'
import { AnimatePresence, motion, useInView } from 'framer-motion'
import ReactMarkdown, { type Components } from 'react-markdown'
import remarkGfm from 'remark-gfm'
import { ChevronDown } from '@/components/emcn'
import { Database, File, Library, Table } from '@/components/emcn/icons'
import {
@@ -16,6 +18,7 @@ import {
xAIIcon,
} from '@/components/icons'
import { CsvIcon, JsonIcon, MarkdownIcon, PdfIcon } from '@/components/icons/document-icons'
import { cn } from '@/lib/core/utils/cn'
interface FeaturesPreviewProps {
activeTab: number
@@ -124,7 +127,7 @@ const EXPAND_TARGETS: Record<number, { row: number; col: number }> = {
}
const EXPAND_ROW_COUNTS: Record<number, number> = {
1: 10,
1: 8,
2: 10,
3: 10,
4: 7,
@@ -603,7 +606,28 @@ const MOCK_KB_DATA = [
['metrics.csv', '1.4 MB', '5.8k', '38', 'enabled'],
] as const
const MD_COMPONENTS: Components = {
h1: ({ children }) => (
<h1 className='mb-4 border-[#E5E5E5] border-b pb-2 font-semibold text-[#1C1C1C] text-[20px]'>
{children}
</h1>
),
h2: ({ children }) => (
<h2 className='mt-5 mb-3 border-[#E5E5E5] border-b pb-1.5 font-semibold text-[#1C1C1C] text-[16px]'>
{children}
</h2>
),
ul: ({ children }) => <ul className='mb-3 list-disc pl-[24px]'>{children}</ul>,
ol: ({ children }) => <ol className='mb-3 list-decimal pl-[24px]'>{children}</ol>,
li: ({ children }) => (
<li className='mb-1 text-[#1C1C1C] text-[14px] leading-[1.6]'>{children}</li>
),
p: ({ children }) => <p className='mb-3 text-[#1C1C1C] text-[14px] leading-[1.6]'>{children}</p>,
}
function MockFullFiles() {
const [source, setSource] = useState(MOCK_MD_SOURCE)
return (
<div className='flex h-full flex-col'>
<div className='flex h-[44px] shrink-0 items-center border-[#E5E5E5] border-b px-[24px]'>
@@ -622,9 +646,13 @@ function MockFullFiles() {
animate={{ opacity: 1 }}
transition={{ duration: 0.4, delay: 0.3 }}
>
<pre className='h-full overflow-auto whitespace-pre-wrap p-[24px] font-[300] font-mono text-[#1C1C1C] text-[12px] leading-[1.7]'>
{MOCK_MD_SOURCE}
</pre>
<textarea
value={source}
onChange={(e) => setSource(e.target.value)}
spellCheck={false}
autoCorrect='off'
className='h-full w-full resize-none overflow-auto whitespace-pre-wrap bg-transparent p-[24px] font-[300] font-mono text-[#1C1C1C] text-[12px] leading-[1.7] outline-none'
/>
</motion.div>
<div className='h-full w-px shrink-0 bg-[#E5E5E5]' />
@@ -636,47 +664,9 @@ function MockFullFiles() {
transition={{ duration: 0.4, delay: 0.5 }}
>
<div className='h-full overflow-auto p-[24px]'>
<h1 className='mb-4 border-[#E5E5E5] border-b pb-2 font-semibold text-[#1C1C1C] text-[20px]'>
Meeting Notes
</h1>
<h2 className='mt-5 mb-3 border-[#E5E5E5] border-b pb-1.5 font-semibold text-[#1C1C1C] text-[16px]'>
Action Items
</h2>
<ul className='mb-3 list-disc pl-[24px]'>
<li className='mb-1 text-[#1C1C1C] text-[14px] leading-[1.6]'>
Review Q1 metrics with Sarah
</li>
<li className='mb-1 text-[#1C1C1C] text-[14px] leading-[1.6]'>
Update API documentation
</li>
<li className='mb-1 text-[#1C1C1C] text-[14px] leading-[1.6]'>
Schedule design review for v2.0
</li>
</ul>
<h2 className='mt-5 mb-3 border-[#E5E5E5] border-b pb-1.5 font-semibold text-[#1C1C1C] text-[16px]'>
Discussion Points
</h2>
<p className='mb-3 text-[#1C1C1C] text-[14px] leading-[1.6]'>
The team agreed to prioritize the new onboarding flow. Key decisions:
</p>
<ol className='mb-3 list-decimal pl-[24px]'>
<li className='mb-1 text-[#1C1C1C] text-[14px] leading-[1.6]'>
Migrate to the new auth provider by end of March
</li>
<li className='mb-1 text-[#1C1C1C] text-[14px] leading-[1.6]'>
Ship the dashboard redesign in two phases
</li>
<li className='mb-1 text-[#1C1C1C] text-[14px] leading-[1.6]'>
Add automated testing for all critical paths
</li>
</ol>
<h2 className='mt-5 mb-3 border-[#E5E5E5] border-b pb-1.5 font-semibold text-[#1C1C1C] text-[16px]'>
Next Steps
</h2>
<p className='mb-3 text-[#1C1C1C] text-[14px] leading-[1.6]'>
Follow up with engineering on the timeline for the API v2 migration. Draft the
proposal for the board meeting next week.
</p>
<ReactMarkdown remarkPlugins={[remarkGfm]} components={MD_COMPONENTS}>
{source}
</ReactMarkdown>
</div>
</motion.div>
</div>
@@ -809,8 +799,79 @@ const LOG_STATUS_STYLES: Record<string, { bg: string; text: string; label: strin
error: { bg: '#FEE2E2', text: '#991B1B', label: 'Error' },
}
interface MockLogDetail {
output: string
spans: { name: string; ms: number; depth: number }[]
}
const MOCK_LOG_DETAILS: MockLogDetail[] = [
{
output: '{\n "result": "processed",\n "emails": 3,\n "status": "complete"\n}',
spans: [
{ name: 'Agent Block', ms: 800, depth: 0 },
{ name: 'search_web', ms: 210, depth: 1 },
{ name: 'Function Block', ms: 180, depth: 0 },
],
},
{
output: '{\n "score": 87,\n "label": "high",\n "confidence": 0.94\n}',
spans: [
{ name: 'Agent Block', ms: 2100, depth: 0 },
{ name: 'hubspot_get_contact', ms: 340, depth: 1 },
{ name: 'Function Block', ms: 180, depth: 0 },
{ name: 'Condition', ms: 50, depth: 0 },
],
},
{
output: '{\n "error": "timeout",\n "message": "LLM request exceeded limit"\n}',
spans: [
{ name: 'Agent Block', ms: 650, depth: 0 },
{ name: 'search_kb', ms: 120, depth: 1 },
],
},
{
output: '{\n "user": "james@globex.io",\n "steps_completed": 4,\n "status": "sent"\n}',
spans: [
{ name: 'Agent Block', ms: 980, depth: 0 },
{ name: 'send_email', ms: 290, depth: 1 },
{ name: 'Function Block', ms: 210, depth: 0 },
{ name: 'Agent Block', ms: 420, depth: 0 },
],
},
{
output: '{\n "records_processed": 142,\n "inserted": 138,\n "errors": 4\n}',
spans: [
{ name: 'Agent Block', ms: 1800, depth: 0 },
{ name: 'salesforce_query', ms: 820, depth: 1 },
{ name: 'Function Block', ms: 340, depth: 0 },
{ name: 'Agent Block', ms: 1200, depth: 0 },
{ name: 'insert_rows', ms: 610, depth: 1 },
],
},
{
output: '{\n "result": "processed",\n "emails": 1,\n "status": "complete"\n}',
spans: [
{ name: 'Agent Block', ms: 720, depth: 0 },
{ name: 'gmail_read', ms: 190, depth: 1 },
{ name: 'Function Block', ms: 160, depth: 0 },
],
},
{
output: '{\n "ticket_id": "TKT-4291",\n "priority": "medium",\n "assigned": "support"\n}',
spans: [
{ name: 'Agent Block', ms: 1400, depth: 0 },
{ name: 'classify_intent', ms: 380, depth: 1 },
{ name: 'Function Block', ms: 220, depth: 0 },
{ name: 'Agent Block', ms: 780, depth: 0 },
],
},
]
const MOCK_LOG_DETAIL_MAX_MS = MOCK_LOG_DETAILS.map((d) => Math.max(...d.spans.map((s) => s.ms)))
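The sidebar renders each trace span as a horizontal bar scaled against the slowest span in that log entry, which is why the max is precomputed per entry above. The normalization in isolation (a sketch; the function name is illustrative, not from the source):

```typescript
interface Span {
  name: string
  ms: number
}

// Width of each span bar as a percentage of the slowest span in the group.
// Mirrors MOCK_LOG_DETAIL_MAX_MS: take the max once per log entry, then
// divide each duration by it at render time.
function spanWidths(spans: Span[]): number[] {
  const maxMs = Math.max(...spans.map((s) => s.ms))
  return spans.map((s) => (s.ms / maxMs) * 100)
}
```

The longest span always maps to 100%, so bars are comparable within one log entry but intentionally not across entries.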
function MockFullLogs({ revealedRows }: { revealedRows: number }) {
const [showSidebar, setShowSidebar] = useState(false)
const [selectedRow, setSelectedRow] = useState(0)
useEffect(() => {
if (revealedRows < MOCK_LOG_DATA.length) return
@@ -818,8 +879,6 @@ function MockFullLogs({ revealedRows }: { revealedRows: number }) {
return () => clearTimeout(timer)
}, [revealedRows])
const selectedRow = 0
return (
<div className='relative flex h-full'>
<div className='flex min-w-0 flex-1 flex-col'>
@@ -856,7 +915,11 @@ function MockFullLogs({ revealedRows }: { revealedRows: number }) {
initial={{ opacity: 0, y: 4 }}
animate={{ opacity: 1, y: 0 }}
transition={{ duration: 0.2, ease: 'easeOut' }}
className={isSelected ? 'bg-[#F5F5F5]' : 'hover:bg-[#FAFAFA]'}
className={cn(
'cursor-pointer',
isSelected ? 'bg-[#F5F5F5]' : 'hover:bg-[#FAFAFA]'
)}
onClick={() => setSelectedRow(i)}
>
<td className='px-[24px] py-[10px] align-middle'>
<span className='flex items-center gap-[12px] font-medium text-[#1C1C1C] text-[14px]'>
@@ -908,24 +971,59 @@ function MockFullLogs({ revealedRows }: { revealedRows: number }) {
transition={{ duration: 0.25, ease: [0.4, 0, 0.2, 1] }}
style={{ width: '45%' }}
>
<MockLogDetailsSidebar />
<MockLogDetailsSidebar
selectedRow={selectedRow}
onPrev={() => setSelectedRow((r) => Math.max(0, r - 1))}
onNext={() => setSelectedRow((r) => Math.min(MOCK_LOG_DATA.length - 1, r + 1))}
/>
</motion.div>
</div>
)
}
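The prev/next handlers above clamp the selected index to the valid range rather than wrapping around, which pairs with disabling the buttons at the boundaries. The pattern extracted as a sketch (`rowCount` stands in for `MOCK_LOG_DATA.length`):

```typescript
// Clamp-based navigation: stepping past either end is a no-op,
// so the boundary states are safe even if the disabled attribute is missed.
function prevIndex(current: number): number {
  return Math.max(0, current - 1)
}

function nextIndex(current: number, rowCount: number): number {
  return Math.min(rowCount - 1, current + 1)
}
```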
function MockLogDetailsSidebar() {
interface MockLogDetailsSidebarProps {
selectedRow: number
onPrev: () => void
onNext: () => void
}
function MockLogDetailsSidebar({ selectedRow, onPrev, onNext }: MockLogDetailsSidebarProps) {
const row = MOCK_LOG_DATA[selectedRow]
const detail = MOCK_LOG_DETAILS[selectedRow]
const statusStyle = LOG_STATUS_STYLES[row[2]] ?? LOG_STATUS_STYLES.success
const [date, time] = row[1].split(', ')
const color = MOCK_LOG_COLORS[selectedRow]
const maxMs = MOCK_LOG_DETAIL_MAX_MS[selectedRow]
const isPrevDisabled = selectedRow === 0
const isNextDisabled = selectedRow === MOCK_LOG_DATA.length - 1
return (
<div className='flex h-full flex-col px-[14px] pt-[12px]'>
<div className='flex h-full flex-col overflow-y-auto px-[14px] pt-[12px]'>
<div className='flex items-center justify-between'>
<span className='font-medium text-[#1C1C1C] text-[14px]'>Log Details</span>
<div className='flex items-center gap-[1px]'>
<div className='flex h-[24px] w-[24px] items-center justify-center rounded-[4px] text-[#999] hover:bg-[#F5F5F5]'>
<button
type='button'
onClick={onPrev}
disabled={isPrevDisabled}
className={cn(
'flex h-[24px] w-[24px] items-center justify-center rounded-[4px] text-[#999]',
isPrevDisabled ? 'cursor-not-allowed opacity-40' : 'hover:bg-[#F5F5F5]'
)}
>
<ChevronDown className='h-[14px] w-[14px] rotate-180' />
</div>
<div className='flex h-[24px] w-[24px] items-center justify-center rounded-[4px] text-[#999] hover:bg-[#F5F5F5]'>
</button>
<button
type='button'
onClick={onNext}
disabled={isNextDisabled}
className={cn(
'flex h-[24px] w-[24px] items-center justify-center rounded-[4px] text-[#999]',
isNextDisabled ? 'cursor-not-allowed opacity-40' : 'hover:bg-[#F5F5F5]'
)}
>
<ChevronDown className='h-[14px] w-[14px]' />
</div>
</button>
</div>
</div>
@@ -934,8 +1032,8 @@ function MockLogDetailsSidebar() {
<div className='flex w-[120px] shrink-0 flex-col gap-[8px]'>
<span className='font-medium text-[#999] text-[12px]'>Timestamp</span>
<div className='flex items-center gap-[6px]'>
<span className='font-medium text-[#666] text-[13px]'>Mar 17</span>
<span className='font-medium text-[#666] text-[13px]'>2:14 PM</span>
<span className='font-medium text-[#666] text-[13px]'>{date}</span>
<span className='font-medium text-[#666] text-[13px]'>{time}</span>
</div>
</div>
<div className='flex min-w-0 flex-1 flex-col gap-[8px]'>
@@ -944,12 +1042,12 @@ function MockLogDetailsSidebar() {
<div
className='h-[10px] w-[10px] shrink-0 rounded-[3px] border-[1.5px]'
style={{
backgroundColor: '#7C3AED',
borderColor: '#7C3AED60',
backgroundColor: color,
borderColor: `${color}60`,
backgroundClip: 'padding-box',
}}
/>
<span className='truncate font-medium text-[#666] text-[13px]'>Email Bot</span>
<span className='truncate font-medium text-[#666] text-[13px]'>{row[0]}</span>
</div>
</div>
</div>
@@ -957,26 +1055,52 @@ function MockLogDetailsSidebar() {
<div className='flex flex-col'>
<div className='flex h-[42px] items-center justify-between border-[#E5E5E5] border-b px-[8px]'>
<span className='font-medium text-[#999] text-[12px]'>Level</span>
<span className='inline-flex items-center rounded-full bg-[#DCFCE7] px-[8px] py-[2px] font-medium text-[#166534] text-[11px]'>
Success
<span
className='inline-flex items-center rounded-full px-[8px] py-[2px] font-medium text-[11px]'
style={{ backgroundColor: statusStyle.bg, color: statusStyle.text }}
>
{statusStyle.label}
</span>
</div>
<div className='flex h-[42px] items-center justify-between border-[#E5E5E5] border-b px-[8px]'>
<span className='font-medium text-[#999] text-[12px]'>Trigger</span>
<span className='rounded-[4px] bg-[#F5F5F5] px-[6px] py-[2px] text-[#666] text-[11px]'>
API
{row[4]}
</span>
</div>
<div className='flex h-[42px] items-center justify-between px-[8px]'>
<span className='font-medium text-[#999] text-[12px]'>Duration</span>
<span className='font-medium text-[#666] text-[13px]'>1.2s</span>
<span className='font-medium text-[#666] text-[13px]'>{row[5]}</span>
</div>
</div>
<div className='flex flex-col gap-[6px] rounded-[6px] border border-[#E5E5E5] bg-[#FAFAFA] px-[10px] py-[8px]'>
<span className='font-medium text-[#999] text-[12px]'>Workflow Output</span>
<div className='rounded-[6px] bg-[#F0F0F0] p-[10px] font-mono text-[#555] text-[11px] leading-[1.5]'>
{'{\n "result": "processed",\n "emails": 3,\n "status": "complete"\n}'}
{detail.output}
</div>
</div>
<div className='flex flex-col gap-[6px] rounded-[6px] border border-[#E5E5E5] bg-[#FAFAFA] px-[10px] py-[8px]'>
<span className='font-medium text-[#999] text-[12px]'>Trace Spans</span>
<div className='flex flex-col gap-[6px]'>
{detail.spans.map((span, i) => (
<div
key={i}
className={cn('flex flex-col gap-[3px]', span.depth === 1 && 'ml-[12px]')}
>
<div className='flex items-center justify-between'>
<span className='font-mono text-[#555] text-[11px]'>{span.name}</span>
<span className='font-medium text-[#999] text-[11px]'>{span.ms}ms</span>
</div>
<div className='h-[4px] w-full overflow-hidden rounded-full bg-[#F0F0F0]'>
<div
className='h-full rounded-full bg-[#2F6FED]'
style={{ width: `${(span.ms / maxMs) * 100}%` }}
/>
</div>
</div>
))}
</div>
</div>
</div>
@@ -985,6 +1109,8 @@ function MockLogDetailsSidebar() {
}
function MockFullTable({ revealedRows }: { revealedRows: number }) {
const [selectedRow, setSelectedRow] = useState<number | null>(null)
return (
<div className='flex h-full flex-col'>
<div className='flex h-[44px] shrink-0 items-center border-[#E5E5E5] border-b px-[24px]'>
@@ -1037,26 +1163,48 @@ function MockFullTable({ revealedRows }: { revealedRows: number }) {
</tr>
</thead>
<tbody>
{MOCK_TABLE_DATA.slice(0, revealedRows).map((row, i) => (
<motion.tr
key={i}
initial={{ opacity: 0, y: 4 }}
animate={{ opacity: 1, y: 0 }}
transition={{ duration: 0.2, ease: 'easeOut' }}
>
<td className='border-[#E5E5E5] border-r border-b px-[4px] py-[7px] text-center align-middle'>
<span className='text-[#999] text-[11px] tabular-nums'>{i + 1}</span>
</td>
{row.map((cell, j) => (
{MOCK_TABLE_DATA.slice(0, revealedRows).map((row, i) => {
const isSelected = selectedRow === i
return (
<motion.tr
key={i}
initial={{ opacity: 0, y: 4 }}
animate={{ opacity: 1, y: 0 }}
transition={{ duration: 0.2, ease: 'easeOut' }}
className='cursor-pointer'
onClick={() => setSelectedRow(i)}
>
<td
key={j}
className='border-[#E5E5E5] border-r border-b px-[8px] py-[7px] align-middle'
className={cn(
'border-[#E5E5E5] border-r border-b px-[4px] py-[7px] text-center align-middle',
isSelected ? 'bg-[rgba(37,99,235,0.06)]' : 'hover:bg-[#FAFAFA]'
)}
>
<span className='block truncate text-[#1C1C1C] text-[13px]'>{cell}</span>
<span className='text-[#999] text-[11px] tabular-nums'>{i + 1}</span>
</td>
))}
</motion.tr>
))}
{row.map((cell, j) => (
<td
key={j}
className={cn(
'relative border-[#E5E5E5] border-r border-b px-[8px] py-[7px] align-middle',
isSelected ? 'bg-[rgba(37,99,235,0.06)]' : 'hover:bg-[#FAFAFA]'
)}
>
{isSelected && (
<div
className={cn(
'-bottom-px -top-px pointer-events-none absolute left-0 z-[5] border-[#1a5cf6] border-t border-b',
j === 0 && 'border-l',
j === row.length - 1 ? '-right-px border-r' : 'right-0'
)}
/>
)}
<span className='block truncate text-[#1C1C1C] text-[13px]'>{cell}</span>
</td>
))}
</motion.tr>
)
})}
</tbody>
</table>
</div>

View File

@@ -22,9 +22,10 @@ const PRICING_TIERS: PricingTier[] = [
features: [
'1,000 credits (trial)',
'5GB file storage',
'3 tables · 1,000 rows each',
'5 min execution limit',
'Limited log retention',
'CLI/SDK Access',
'7-day log retention',
'CLI/SDK/MCP Access',
],
cta: { label: 'Get started', href: '/signup' },
},
@@ -36,11 +37,12 @@ const PRICING_TIERS: PricingTier[] = [
billingPeriod: 'per month',
color: '#00F701',
features: [
'6,000 credits/mo',
'+50 daily refresh credits',
'150 runs/min (sync)',
'50 min sync execution limit',
'6,000 credits/mo · +50/day',
'50GB file storage',
'25 tables · 5,000 rows each',
'50 min execution · 150 runs/min',
'Unlimited log retention',
'CLI/SDK/MCP Access',
],
cta: { label: 'Get started', href: '/signup' },
},
@@ -52,11 +54,12 @@ const PRICING_TIERS: PricingTier[] = [
billingPeriod: 'per month',
color: '#FA4EDF',
features: [
'25,000 credits/mo',
'+200 daily refresh credits',
'300 runs/min (sync)',
'50 min sync execution limit',
'25,000 credits/mo · +200/day',
'500GB file storage',
'25 tables · 5,000 rows each',
'50 min execution · 300 runs/min',
'Unlimited log retention',
'CLI/SDK/MCP Access',
],
cta: { label: 'Get started', href: '/signup' },
},
@@ -66,7 +69,15 @@ const PRICING_TIERS: PricingTier[] = [
description: 'For organizations needing security and scale',
price: 'Custom',
color: '#FFCC02',
features: ['Custom infra limits', 'SSO', 'SOC2', 'Self hosting', 'Dedicated support'],
features: [
'Custom credits & infra limits',
'Custom file storage',
'10,000 tables · 1M rows each',
'Custom execution limits',
'Unlimited log retention',
'SSO & SCIM · SOC2 & HIPAA',
'Self hosting · Dedicated support',
],
cta: { label: 'Book a demo', href: '/contact' },
},
]
@@ -114,12 +125,12 @@ function PricingCard({ tier }: PricingCardProps) {
</p>
<div className='mt-4'>
{isEnterprise ? (
<a
<Link
href={tier.cta.href}
className='flex h-[32px] w-full items-center justify-center rounded-[5px] border border-[#E5E5E5] px-[10px] font-[430] font-season text-[#1C1C1C] text-[14px] transition-colors hover:bg-[#F0F0F0]'
>
{tier.cta.label}
</a>
</Link>
) : isPro ? (
<Link
href={tier.cta.href}

View File

@@ -28,8 +28,8 @@ import {
* for immediate availability to AI crawlers.
* - Section `id` attributes serve as fragment anchors for precise AI citations.
* - Content ordering prioritizes answer-first patterns: definition (Hero) ->
* examples (Templates) -> capabilities (Features) -> social proof (Collaboration, Testimonials) ->
* pricing (Pricing) -> enterprise (Enterprise).
* examples (Templates) -> capabilities (Features) -> social proof (Collaboration) ->
* enterprise (Enterprise) -> pricing (Pricing) -> testimonials (Testimonials).
*/
export default async function Landing() {
return (
@@ -43,8 +43,8 @@ export default async function Landing() {
<Templates />
<Features />
<Collaboration />
<Pricing />
<Enterprise />
<Pricing />
<Testimonials />
</main>
<Footer />

View File

@@ -142,7 +142,7 @@ export async function POST(request: NextRequest) {
quantity: currentQuantity,
},
],
proration_behavior: 'create_prorations',
proration_behavior: 'always_invoice',
})
}
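Both billing routes switch from `create_prorations` (Stripe's default, which only records proration line items to be applied on the next invoice) to `always_invoice`, which generates and charges an invoice for the proration immediately. The changed parameter as a config fragment (the subscription item ID is a placeholder):

```typescript
// Hypothetical item ID; proration_behavior is the only value that changed.
const updateParams = {
  items: [{ id: 'si_placeholder', quantity: 5 }],
  // 'create_prorations' defers the charge/credit to the next invoice;
  // 'always_invoice' bills the proration right away.
  proration_behavior: 'always_invoice' as const,
}
```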

View File

@@ -102,7 +102,7 @@ async function handleLocalFile(filename: string, userId: string): Promise<NextRe
throw new FileNotFoundError(`File not found: ${filename}`)
}
const filePath = findLocalFile(filename)
const filePath = await findLocalFile(filename)
if (!filePath) {
throw new FileNotFoundError(`File not found: ${filename}`)
@@ -228,7 +228,7 @@ async function handleCloudProxyPublic(
async function handleLocalFilePublic(filename: string): Promise<NextResponse> {
try {
const filePath = findLocalFile(filename)
const filePath = await findLocalFile(filename)
if (!filePath) {
throw new FileNotFoundError(`File not found: ${filename}`)

View File

@@ -75,7 +75,7 @@ export async function POST(request: NextRequest) {
const uploadResults = []
for (const file of files) {
const originalName = file.name || 'untitled'
const originalName = file.name || 'untitled.md'
if (!validateFileExtension(originalName)) {
const extension = originalName.split('.').pop()?.toLowerCase() || 'unknown'

View File

@@ -331,7 +331,7 @@ describe('extractFilename', () => {
describe('findLocalFile - Path Traversal Security Tests', () => {
describe('path traversal attack prevention', () => {
it.concurrent('should reject classic path traversal attacks', () => {
it.concurrent('should reject classic path traversal attacks', async () => {
const maliciousInputs = [
'../../../etc/passwd',
'..\\..\\..\\windows\\system32\\config\\sam',
@@ -340,35 +340,35 @@ describe('findLocalFile - Path Traversal Security Tests', () => {
'..\\config.ini',
]
maliciousInputs.forEach((input) => {
const result = findLocalFile(input)
for (const input of maliciousInputs) {
const result = await findLocalFile(input)
expect(result).toBeNull()
})
}
})
it.concurrent('should reject encoded path traversal attempts', () => {
it.concurrent('should reject encoded path traversal attempts', async () => {
const encodedInputs = [
'%2e%2e%2f%2e%2e%2f%65%74%63%2f%70%61%73%73%77%64', // ../../../etc/passwd
'..%2f..%2fetc%2fpasswd',
'..%5c..%5cconfig.ini',
]
encodedInputs.forEach((input) => {
const result = findLocalFile(input)
for (const input of encodedInputs) {
const result = await findLocalFile(input)
expect(result).toBeNull()
})
}
})
it.concurrent('should reject mixed path separators', () => {
it.concurrent('should reject mixed path separators', async () => {
const mixedInputs = ['../..\\config.txt', '..\\../secret.ini', '/..\\..\\system32']
mixedInputs.forEach((input) => {
const result = findLocalFile(input)
for (const input of mixedInputs) {
const result = await findLocalFile(input)
expect(result).toBeNull()
})
}
})
it.concurrent('should reject filenames with dangerous characters', () => {
it.concurrent('should reject filenames with dangerous characters', async () => {
const dangerousInputs = [
'file:with:colons.txt',
'file|with|pipes.txt',
@@ -376,43 +376,45 @@ describe('findLocalFile - Path Traversal Security Tests', () => {
'file*with*asterisks.txt',
]
dangerousInputs.forEach((input) => {
const result = findLocalFile(input)
for (const input of dangerousInputs) {
const result = await findLocalFile(input)
expect(result).toBeNull()
})
}
})
it.concurrent('should reject null and empty inputs', () => {
expect(findLocalFile('')).toBeNull()
expect(findLocalFile(' ')).toBeNull()
expect(findLocalFile('\t\n')).toBeNull()
it.concurrent('should reject null and empty inputs', async () => {
expect(await findLocalFile('')).toBeNull()
expect(await findLocalFile(' ')).toBeNull()
expect(await findLocalFile('\t\n')).toBeNull()
})
it.concurrent('should reject filenames that become empty after sanitization', () => {
it.concurrent('should reject filenames that become empty after sanitization', async () => {
const emptyAfterSanitization = ['../..', '..\\..\\', '////', '....', '..']
emptyAfterSanitization.forEach((input) => {
const result = findLocalFile(input)
for (const input of emptyAfterSanitization) {
const result = await findLocalFile(input)
expect(result).toBeNull()
})
}
})
})
describe('security validation passes for legitimate files', () => {
it.concurrent('should accept properly formatted filenames without throwing errors', () => {
const legitimateInputs = [
'document.pdf',
'image.png',
'data.csv',
'report-2024.doc',
'file_with_underscores.txt',
'file-with-dashes.json',
]
it.concurrent(
'should accept properly formatted filenames without throwing errors',
async () => {
const legitimateInputs = [
'document.pdf',
'image.png',
'data.csv',
'report-2024.doc',
'file_with_underscores.txt',
'file-with-dashes.json',
]
legitimateInputs.forEach((input) => {
// Should not throw security errors for legitimate filenames
expect(() => findLocalFile(input)).not.toThrow()
})
})
for (const input of legitimateInputs) {
await expect(findLocalFile(input)).resolves.toBeDefined()
}
}
)
})
})
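The tests above are rewritten from `forEach` callbacks to `for...of` loops because `findLocalFile` became async: `Array.prototype.forEach` does not await its callback, so assertions inside an async callback would run after the test body had already returned. A minimal illustration of the difference (the `check` helper is a hypothetical stand-in for an async call like `findLocalFile`):

```typescript
// A stand-in async check; resolves on a later microtask like real I/O.
async function check(n: number): Promise<number> {
  return Promise.resolve(n * 2)
}

// BROKEN: forEach fires the async callbacks and returns immediately,
// so results is still empty at the point of return.
function collectWithForEach(inputs: number[]): number[] {
  const results: number[] = []
  inputs.forEach(async (n) => {
    results.push(await check(n)) // deferred to a later microtask
  })
  return results
}

// CORRECT: for...of awaits each call before moving on.
async function collectWithForOf(inputs: number[]): Promise<number[]> {
  const results: number[] = []
  for (const n of inputs) {
    results.push(await check(n))
  }
  return results
}
```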

View File

@@ -1,8 +1,5 @@
import { existsSync } from 'fs'
import path from 'path'
import { createLogger } from '@sim/logger'
import { NextResponse } from 'next/server'
import { UPLOAD_DIR } from '@/lib/uploads/config'
import { sanitizeFileKey } from '@/lib/uploads/utils/file-utils'
const logger = createLogger('FilesUtils')
@@ -123,76 +120,29 @@ export function extractFilename(path: string): string {
return filename
}
function sanitizeFilename(filename: string): string {
if (!filename || typeof filename !== 'string') {
throw new Error('Invalid filename provided')
}
if (!filename.includes('/')) {
throw new Error('File key must include a context prefix (e.g., kb/, workspace/, execution/)')
}
const segments = filename.split('/')
const sanitizedSegments = segments.map((segment) => {
if (segment === '..' || segment === '.') {
throw new Error('Path traversal detected')
}
const sanitized = segment.replace(/\.\./g, '').replace(/[\\]/g, '').replace(/^\./g, '').trim()
if (!sanitized) {
throw new Error('Invalid or empty path segment after sanitization')
}
if (
sanitized.includes(':') ||
sanitized.includes('|') ||
sanitized.includes('?') ||
sanitized.includes('*') ||
sanitized.includes('\x00') ||
/[\x00-\x1F\x7F]/.test(sanitized)
) {
throw new Error('Path segment contains invalid characters')
}
return sanitized
})
return sanitizedSegments.join(path.sep)
}
export function findLocalFile(filename: string): string | null {
export async function findLocalFile(filename: string): Promise<string | null> {
try {
const sanitizedFilename = sanitizeFileKey(filename)
// Reject if sanitized filename is empty or only contains path separators/dots
if (!sanitizedFilename || !sanitizedFilename.trim() || /^[/\\.\s]+$/.test(sanitizedFilename)) {
return null
}
const possiblePaths = [
path.join(UPLOAD_DIR, sanitizedFilename),
path.join(process.cwd(), 'uploads', sanitizedFilename),
]
const { existsSync } = await import('fs')
const path = await import('path')
const { UPLOAD_DIR_SERVER } = await import('@/lib/uploads/core/setup.server')
for (const filePath of possiblePaths) {
const resolvedPath = path.resolve(filePath)
const allowedDirs = [path.resolve(UPLOAD_DIR), path.resolve(process.cwd(), 'uploads')]
const resolvedPath = path.join(UPLOAD_DIR_SERVER, sanitizedFilename)
// Must be within allowed directory but NOT the directory itself
const isWithinAllowedDir = allowedDirs.some(
(allowedDir) =>
resolvedPath.startsWith(allowedDir + path.sep) && resolvedPath !== allowedDir
)
if (
!resolvedPath.startsWith(UPLOAD_DIR_SERVER + path.sep) ||
resolvedPath === UPLOAD_DIR_SERVER
) {
return null
}
if (!isWithinAllowedDir) {
continue
}
if (existsSync(resolvedPath)) {
return resolvedPath
}
if (existsSync(resolvedPath)) {
return resolvedPath
}
return null
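The rewritten `findLocalFile` collapses the old multi-directory search into a single resolve-and-prefix check: join the sanitized key onto the upload root, then require the result to stay strictly inside that root. The core containment test as a standalone sketch (the function name and `uploadRoot` parameter are illustrative):

```typescript
import path from 'node:path'

// Returns the resolved path if `key` stays strictly inside `uploadRoot`,
// otherwise null. The `+ path.sep` guard rejects sibling directories that
// share a prefix (e.g. /srv/uploads-old vs /srv/uploads), and the equality
// check rejects the root directory itself.
function resolveWithinRoot(uploadRoot: string, key: string): string | null {
  const root = path.resolve(uploadRoot)
  const resolved = path.resolve(root, key)
  if (!resolved.startsWith(root + path.sep) || resolved === root) {
    return null
  }
  return resolved
}
```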

View File

@@ -0,0 +1,248 @@
import { randomUUID } from 'crypto'
import { db } from '@sim/db'
import { document } from '@sim/db/schema'
import { createLogger } from '@sim/logger'
import { and, eq, isNull } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { checkSessionOrInternalAuth } from '@/lib/auth/hybrid'
import {
createDocumentRecords,
deleteDocument,
getProcessingConfig,
processDocumentsWithQueue,
} from '@/lib/knowledge/documents/service'
import { authorizeWorkflowByWorkspacePermission } from '@/lib/workflows/utils'
import { checkKnowledgeBaseWriteAccess } from '@/app/api/knowledge/utils'
const logger = createLogger('DocumentUpsertAPI')
const UpsertDocumentSchema = z.object({
documentId: z.string().optional(),
filename: z.string().min(1, 'Filename is required'),
fileUrl: z.string().min(1, 'File URL is required'),
fileSize: z.number().min(1, 'File size must be greater than 0'),
mimeType: z.string().min(1, 'MIME type is required'),
documentTagsData: z.string().optional(),
processingOptions: z.object({
chunkSize: z.number().min(100).max(4000),
minCharactersPerChunk: z.number().min(1).max(2000),
recipe: z.string(),
lang: z.string(),
chunkOverlap: z.number().min(0).max(500),
}),
workflowId: z.string().optional(),
})
export async function POST(req: NextRequest, { params }: { params: Promise<{ id: string }> }) {
const requestId = randomUUID().slice(0, 8)
const { id: knowledgeBaseId } = await params
try {
const body = await req.json()
logger.info(`[${requestId}] Knowledge base document upsert request`, {
knowledgeBaseId,
hasDocumentId: !!body.documentId,
filename: body.filename,
})
const auth = await checkSessionOrInternalAuth(req, { requireWorkflowId: false })
if (!auth.success || !auth.userId) {
logger.warn(`[${requestId}] Authentication failed: ${auth.error || 'Unauthorized'}`)
return NextResponse.json({ error: 'Unauthorized' }, { status: 401 })
}
const userId = auth.userId
const validatedData = UpsertDocumentSchema.parse(body)
if (validatedData.workflowId) {
const authorization = await authorizeWorkflowByWorkspacePermission({
workflowId: validatedData.workflowId,
userId,
action: 'write',
})
if (!authorization.allowed) {
return NextResponse.json(
{ error: authorization.message || 'Access denied' },
{ status: authorization.status }
)
}
}
const accessCheck = await checkKnowledgeBaseWriteAccess(knowledgeBaseId, userId)
if (!accessCheck.hasAccess) {
if ('notFound' in accessCheck && accessCheck.notFound) {
logger.warn(`[${requestId}] Knowledge base not found: ${knowledgeBaseId}`)
return NextResponse.json({ error: 'Knowledge base not found' }, { status: 404 })
}
logger.warn(
`[${requestId}] User ${userId} attempted to upsert document in unauthorized knowledge base ${knowledgeBaseId}`
)
return NextResponse.json({ error: 'Unauthorized' }, { status: 401 })
}
let existingDocumentId: string | null = null
let isUpdate = false
if (validatedData.documentId) {
const existingDoc = await db
.select({ id: document.id })
.from(document)
.where(
and(
eq(document.id, validatedData.documentId),
eq(document.knowledgeBaseId, knowledgeBaseId),
isNull(document.deletedAt)
)
)
.limit(1)
if (existingDoc.length > 0) {
existingDocumentId = existingDoc[0].id
}
} else {
const docsByFilename = await db
.select({ id: document.id })
.from(document)
.where(
and(
eq(document.filename, validatedData.filename),
eq(document.knowledgeBaseId, knowledgeBaseId),
isNull(document.deletedAt)
)
)
.limit(1)
if (docsByFilename.length > 0) {
existingDocumentId = docsByFilename[0].id
}
}
if (existingDocumentId) {
isUpdate = true
logger.info(
`[${requestId}] Found existing document ${existingDocumentId}, creating replacement before deleting old`
)
}
const createdDocuments = await createDocumentRecords(
[
{
filename: validatedData.filename,
fileUrl: validatedData.fileUrl,
fileSize: validatedData.fileSize,
mimeType: validatedData.mimeType,
...(validatedData.documentTagsData && {
documentTagsData: validatedData.documentTagsData,
}),
},
],
knowledgeBaseId,
requestId
)
const firstDocument = createdDocuments[0]
if (!firstDocument) {
logger.error(`[${requestId}] createDocumentRecords returned empty array unexpectedly`)
return NextResponse.json({ error: 'Failed to create document record' }, { status: 500 })
}
if (existingDocumentId) {
try {
await deleteDocument(existingDocumentId, requestId)
} catch (deleteError) {
logger.error(
`[${requestId}] Failed to delete old document ${existingDocumentId}, rolling back new record`,
deleteError
)
await deleteDocument(firstDocument.documentId, requestId).catch(() => {})
return NextResponse.json({ error: 'Failed to replace existing document' }, { status: 500 })
}
}
processDocumentsWithQueue(
createdDocuments,
knowledgeBaseId,
validatedData.processingOptions,
requestId
).catch((error: unknown) => {
logger.error(`[${requestId}] Critical error in document processing pipeline:`, error)
})
try {
const { PlatformEvents } = await import('@/lib/core/telemetry')
PlatformEvents.knowledgeBaseDocumentsUploaded({
knowledgeBaseId,
documentsCount: 1,
uploadType: 'single',
chunkSize: validatedData.processingOptions.chunkSize,
recipe: validatedData.processingOptions.recipe,
})
} catch (_e) {
// Silently fail
}
recordAudit({
workspaceId: accessCheck.knowledgeBase?.workspaceId ?? null,
actorId: userId,
actorName: auth.userName,
actorEmail: auth.userEmail,
action: isUpdate ? AuditAction.DOCUMENT_UPDATED : AuditAction.DOCUMENT_UPLOADED,
resourceType: AuditResourceType.DOCUMENT,
resourceId: knowledgeBaseId,
resourceName: validatedData.filename,
description: isUpdate
? `Upserted (replaced) document "${validatedData.filename}" in knowledge base "${knowledgeBaseId}"`
: `Upserted (created) document "${validatedData.filename}" in knowledge base "${knowledgeBaseId}"`,
metadata: {
fileName: validatedData.filename,
previousDocumentId: existingDocumentId,
isUpdate,
},
request: req,
})
return NextResponse.json({
success: true,
data: {
documentsCreated: [
{
documentId: firstDocument.documentId,
filename: firstDocument.filename,
status: 'pending',
},
],
isUpdate,
previousDocumentId: existingDocumentId,
processingMethod: 'background',
processingConfig: {
maxConcurrentDocuments: getProcessingConfig().maxConcurrentDocuments,
batchSize: getProcessingConfig().batchSize,
},
},
})
} catch (error) {
if (error instanceof z.ZodError) {
logger.warn(`[${requestId}] Invalid upsert request data`, { errors: error.errors })
return NextResponse.json(
{ error: 'Invalid request data', details: error.errors },
{ status: 400 }
)
}
logger.error(`[${requestId}] Error upserting document`, error)
const errorMessage = error instanceof Error ? error.message : 'Failed to upsert document'
const isStorageLimitError =
errorMessage.includes('Storage limit exceeded') || errorMessage.includes('storage limit')
const isMissingKnowledgeBase = errorMessage === 'Knowledge base not found'
return NextResponse.json(
{ error: errorMessage },
{ status: isMissingKnowledgeBase ? 404 : isStorageLimitError ? 413 : 500 }
)
}
}
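The upsert route deliberately creates the replacement record before deleting the old one, and rolls the new record back if the delete fails, so the knowledge base never ends up with zero copies of the document. The ordering as a standalone sketch (the `create`/`remove` callbacks are hypothetical stand-ins for `createDocumentRecords`/`deleteDocument`):

```typescript
interface ReplaceResult {
  newId: string
  replaced: boolean
}

// Create-then-delete replacement: on delete failure, best-effort rollback
// of the just-created record, then surface the original error.
async function replaceDocument(
  existingId: string | null,
  create: () => Promise<string>,
  remove: (id: string) => Promise<void>
): Promise<ReplaceResult> {
  const newId = await create()
  if (existingId) {
    try {
      await remove(existingId)
    } catch (err) {
      await remove(newId).catch(() => {}) // rollback; ignore secondary failure
      throw err
    }
  }
  return { newId, replaced: existingId !== null }
}
```

The inverse order (delete, then create) would lose the document entirely if the create step failed.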

View File

@@ -279,6 +279,7 @@ export async function POST(req: NextRequest) {
role: 'assistant' as const,
content: result.content,
timestamp: new Date().toISOString(),
...(result.requestId ? { requestId: result.requestId } : {}),
}
if (result.toolCalls.length > 0) {
assistantMessage.toolCalls = result.toolCalls

View File

@@ -161,7 +161,7 @@ export async function PUT(request: NextRequest, { params }: { params: Promise<{
quantity: newSeatCount,
},
],
proration_behavior: 'create_prorations', // Stripe's default - charge/credit immediately
proration_behavior: 'always_invoice',
}
)
@@ -213,7 +213,7 @@ export async function PUT(request: NextRequest, { params }: { params: Promise<{
oldSeats: currentSeats,
newSeats: newSeatCount,
updatedBy: session.user.id,
prorationBehavior: 'create_prorations',
prorationBehavior: 'always_invoice',
})
return NextResponse.json({

View File

@@ -0,0 +1,140 @@
import { createLogger } from '@sim/logger'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { checkInternalAuth } from '@/lib/auth/hybrid'
import { generateRequestId } from '@/lib/core/utils/request'
import { FileInputSchema } from '@/lib/uploads/utils/file-schemas'
import { processFilesToUserFiles, type RawFileInput } from '@/lib/uploads/utils/file-utils'
import { downloadFileFromStorage } from '@/lib/uploads/utils/file-utils.server'
export const dynamic = 'force-dynamic'
const logger = createLogger('BoxUploadAPI')
const BoxUploadSchema = z.object({
accessToken: z.string().min(1, 'Access token is required'),
parentFolderId: z.string().min(1, 'Parent folder ID is required'),
file: FileInputSchema.optional().nullable(),
fileContent: z.string().optional().nullable(),
fileName: z.string().optional().nullable(),
})
export async function POST(request: NextRequest) {
const requestId = generateRequestId()
try {
const authResult = await checkInternalAuth(request, { requireWorkflowId: false })
if (!authResult.success) {
logger.warn(`[${requestId}] Unauthorized Box upload attempt: ${authResult.error}`)
return NextResponse.json(
{ success: false, error: authResult.error || 'Authentication required' },
{ status: 401 }
)
}
logger.info(`[${requestId}] Authenticated Box upload request via ${authResult.authType}`)
const body = await request.json()
const validatedData = BoxUploadSchema.parse(body)
let fileBuffer: Buffer
let fileName: string
if (validatedData.file) {
const userFiles = processFilesToUserFiles(
[validatedData.file as RawFileInput],
requestId,
logger
)
if (userFiles.length === 0) {
return NextResponse.json({ success: false, error: 'Invalid file input' }, { status: 400 })
}
const userFile = userFiles[0]
logger.info(`[${requestId}] Downloading file: ${userFile.name} (${userFile.size} bytes)`)
fileBuffer = await downloadFileFromStorage(userFile, requestId, logger)
fileName = validatedData.fileName || userFile.name
} else if (validatedData.fileContent) {
logger.info(`[${requestId}] Using legacy base64 content input`)
fileBuffer = Buffer.from(validatedData.fileContent, 'base64')
fileName = validatedData.fileName || 'file'
} else {
return NextResponse.json({ success: false, error: 'File is required' }, { status: 400 })
}
logger.info(
`[${requestId}] Uploading to Box folder ${validatedData.parentFolderId}: ${fileName} (${fileBuffer.length} bytes)`
)
const attributes = JSON.stringify({
name: fileName,
parent: { id: validatedData.parentFolderId },
})
const formData = new FormData()
formData.append('attributes', attributes)
formData.append(
'file',
new Blob([new Uint8Array(fileBuffer)], { type: 'application/octet-stream' }),
fileName
)
const response = await fetch('https://upload.box.com/api/2.0/files/content', {
method: 'POST',
headers: {
Authorization: `Bearer ${validatedData.accessToken}`,
},
body: formData,
})
const data = await response.json()
if (!response.ok) {
const errorMessage = data.message || 'Failed to upload file'
logger.error(`[${requestId}] Box API error:`, { status: response.status, data })
return NextResponse.json({ success: false, error: errorMessage }, { status: response.status })
}
const file = data.entries?.[0]
if (!file) {
return NextResponse.json(
{ success: false, error: 'No file returned in upload response' },
{ status: 500 }
)
}
logger.info(`[${requestId}] File uploaded successfully: ${file.name} (ID: ${file.id})`)
return NextResponse.json({
success: true,
output: {
id: file.id ?? '',
name: file.name ?? '',
size: file.size ?? 0,
sha1: file.sha1 ?? null,
createdAt: file.created_at ?? null,
modifiedAt: file.modified_at ?? null,
parentId: file.parent?.id ?? null,
parentName: file.parent?.name ?? null,
},
})
} catch (error) {
if (error instanceof z.ZodError) {
logger.warn(`[${requestId}] Validation error:`, error.errors)
return NextResponse.json(
{ success: false, error: error.errors[0]?.message || 'Validation failed' },
{ status: 400 }
)
}
logger.error(`[${requestId}] Unexpected error:`, error)
return NextResponse.json(
{ success: false, error: error instanceof Error ? error.message : 'Unknown error' },
{ status: 500 }
)
}
}
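The route above accepts either a structured `file` input (downloaded from storage) or a legacy base64 `fileContent` string, preferring the former. A minimal sketch of that precedence as a pure helper — `resolveUploadInput` and the simplified input type are illustrative, not part of the PR, and the storage download is elided:

```typescript
// Illustrative only: mirrors the file / fileContent branching in the Box upload route.
interface UploadInput {
  file?: { name: string } | null   // structured file reference (storage download elided here)
  fileContent?: string | null      // legacy base64 payload
  fileName?: string | null         // optional explicit name override
}

function resolveUploadInput(input: UploadInput): { buffer: Buffer; fileName: string } {
  if (input.file) {
    // The real route calls downloadFileFromStorage here; we only model the naming rule.
    return { buffer: Buffer.alloc(0), fileName: input.fileName || input.file.name }
  }
  if (input.fileContent) {
    // Legacy path: decode inline base64 and fall back to a generic name.
    return { buffer: Buffer.from(input.fileContent, 'base64'), fileName: input.fileName || 'file' }
  }
  throw new Error('File is required')
}
```

Note that `fileName` overrides the stored file's own name in both branches, matching the route's `validatedData.fileName || userFile.name` logic.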

View File

@@ -0,0 +1,466 @@
import { createLogger } from '@sim/logger'
import { type NextRequest, NextResponse } from 'next/server'
import { checkInternalAuth } from '@/lib/auth/hybrid'
import { FileInputSchema } from '@/lib/uploads/utils/file-schemas'
import { processFilesToUserFiles, type RawFileInput } from '@/lib/uploads/utils/file-utils'
import { downloadFileFromStorage } from '@/lib/uploads/utils/file-utils.server'
const logger = createLogger('DocuSignAPI')
interface DocuSignAccountInfo {
accountId: string
baseUri: string
}
/**
* Resolves the user's DocuSign account info from their access token
* by calling the DocuSign userinfo endpoint.
*/
async function resolveAccount(accessToken: string): Promise<DocuSignAccountInfo> {
const response = await fetch('https://account-d.docusign.com/oauth/userinfo', {
headers: { Authorization: `Bearer ${accessToken}` },
})
if (!response.ok) {
const errorText = await response.text()
logger.error('Failed to resolve DocuSign account', {
status: response.status,
error: errorText,
})
throw new Error(`Failed to resolve DocuSign account: ${response.status}`)
}
const data = await response.json()
const accounts = data.accounts ?? []
const defaultAccount = accounts.find((a: { is_default: boolean }) => a.is_default) ?? accounts[0]
if (!defaultAccount) {
throw new Error('No DocuSign accounts found for this user')
}
const baseUri = defaultAccount.base_uri
if (!baseUri) {
throw new Error('DocuSign account is missing base_uri')
}
return {
accountId: defaultAccount.account_id,
baseUri,
}
}
export async function POST(request: NextRequest) {
const authResult = await checkInternalAuth(request, { requireWorkflowId: false })
if (!authResult.success) {
return NextResponse.json({ success: false, error: 'Unauthorized' }, { status: 401 })
}
const body = await request.json()
const { accessToken, operation, ...params } = body
if (!accessToken) {
return NextResponse.json({ success: false, error: 'Access token is required' }, { status: 400 })
}
if (!operation) {
return NextResponse.json({ success: false, error: 'Operation is required' }, { status: 400 })
}
try {
const account = await resolveAccount(accessToken)
const apiBase = `${account.baseUri}/restapi/v2.1/accounts/${account.accountId}`
const headers: Record<string, string> = {
Authorization: `Bearer ${accessToken}`,
'Content-Type': 'application/json',
}
switch (operation) {
case 'send_envelope':
return await handleSendEnvelope(apiBase, headers, params)
case 'create_from_template':
return await handleCreateFromTemplate(apiBase, headers, params)
case 'get_envelope':
return await handleGetEnvelope(apiBase, headers, params)
case 'list_envelopes':
return await handleListEnvelopes(apiBase, headers, params)
case 'void_envelope':
return await handleVoidEnvelope(apiBase, headers, params)
case 'download_document':
return await handleDownloadDocument(apiBase, headers, params)
case 'list_templates':
return await handleListTemplates(apiBase, headers, params)
case 'list_recipients':
return await handleListRecipients(apiBase, headers, params)
default:
return NextResponse.json(
{ success: false, error: `Unknown operation: ${operation}` },
{ status: 400 }
)
}
} catch (error) {
logger.error('DocuSign API error', { operation, error })
const message = error instanceof Error ? error.message : 'Internal server error'
return NextResponse.json({ success: false, error: message }, { status: 500 })
}
}
async function handleSendEnvelope(
apiBase: string,
headers: Record<string, string>,
params: Record<string, unknown>
) {
const { signerEmail, signerName, emailSubject, emailBody, ccEmail, ccName, file, status } = params
if (!signerEmail || !signerName || !emailSubject) {
return NextResponse.json(
{ success: false, error: 'signerEmail, signerName, and emailSubject are required' },
{ status: 400 }
)
}
let documentBase64 = ''
let documentName = 'document.pdf'
if (file) {
try {
const parsed = FileInputSchema.parse(file)
const userFiles = processFilesToUserFiles([parsed as RawFileInput], 'docusign-send', logger)
if (userFiles.length > 0) {
const userFile = userFiles[0]
const buffer = await downloadFileFromStorage(userFile, 'docusign-send', logger)
documentBase64 = buffer.toString('base64')
documentName = userFile.name
}
} catch (fileError) {
logger.error('Failed to process file for DocuSign envelope', { fileError })
return NextResponse.json(
{ success: false, error: 'Failed to process uploaded file' },
{ status: 400 }
)
}
}
const envelopeBody: Record<string, unknown> = {
emailSubject,
status: (status as string) || 'sent',
recipients: {
signers: [
{
email: signerEmail,
name: signerName,
recipientId: '1',
routingOrder: '1',
tabs: {
signHereTabs: [
{
anchorString: '/sig1/',
anchorUnits: 'pixels',
anchorXOffset: '0',
anchorYOffset: '0',
},
],
dateSignedTabs: [
{
anchorString: '/date1/',
anchorUnits: 'pixels',
anchorXOffset: '0',
anchorYOffset: '0',
},
],
},
},
],
carbonCopies: ccEmail
? [
{
email: ccEmail,
name: ccName || (ccEmail as string),
recipientId: '2',
routingOrder: '2',
},
]
: [],
},
}
if (emailBody) {
envelopeBody.emailBlurb = emailBody
}
if (documentBase64) {
envelopeBody.documents = [
{
documentBase64,
name: documentName,
fileExtension: documentName.split('.').pop() || 'pdf',
documentId: '1',
},
]
} else if (((status as string) || 'sent') === 'sent') {
return NextResponse.json(
{ success: false, error: 'A document file is required to send an envelope' },
{ status: 400 }
)
}
const response = await fetch(`${apiBase}/envelopes`, {
method: 'POST',
headers,
body: JSON.stringify(envelopeBody),
})
const data = await response.json()
if (!response.ok) {
logger.error('DocuSign send envelope failed', { data, status: response.status })
return NextResponse.json(
{ success: false, error: data.message || data.errorCode || 'Failed to send envelope' },
{ status: response.status }
)
}
return NextResponse.json(data)
}
async function handleCreateFromTemplate(
apiBase: string,
headers: Record<string, string>,
params: Record<string, unknown>
) {
const { templateId, emailSubject, emailBody, templateRoles, status } = params
if (!templateId) {
return NextResponse.json({ success: false, error: 'templateId is required' }, { status: 400 })
}
let parsedRoles: unknown[] = []
if (templateRoles) {
if (typeof templateRoles === 'string') {
try {
parsedRoles = JSON.parse(templateRoles)
} catch {
return NextResponse.json(
{ success: false, error: 'Invalid JSON for templateRoles' },
{ status: 400 }
)
}
} else if (Array.isArray(templateRoles)) {
parsedRoles = templateRoles
}
}
const envelopeBody: Record<string, unknown> = {
templateId,
status: (status as string) || 'sent',
templateRoles: parsedRoles,
}
if (emailSubject) envelopeBody.emailSubject = emailSubject
if (emailBody) envelopeBody.emailBlurb = emailBody
const response = await fetch(`${apiBase}/envelopes`, {
method: 'POST',
headers,
body: JSON.stringify(envelopeBody),
})
const data = await response.json()
if (!response.ok) {
logger.error('DocuSign create from template failed', { data, status: response.status })
return NextResponse.json(
{
success: false,
error: data.message || data.errorCode || 'Failed to create envelope from template',
},
{ status: response.status }
)
}
return NextResponse.json(data)
}
async function handleGetEnvelope(
apiBase: string,
headers: Record<string, string>,
params: Record<string, unknown>
) {
const { envelopeId } = params
if (!envelopeId) {
return NextResponse.json({ success: false, error: 'envelopeId is required' }, { status: 400 })
}
const response = await fetch(
`${apiBase}/envelopes/${(envelopeId as string).trim()}?include=recipients,documents`,
{ headers }
)
const data = await response.json()
if (!response.ok) {
return NextResponse.json(
{ success: false, error: data.message || data.errorCode || 'Failed to get envelope' },
{ status: response.status }
)
}
return NextResponse.json(data)
}
async function handleListEnvelopes(
apiBase: string,
headers: Record<string, string>,
params: Record<string, unknown>
) {
const queryParams = new URLSearchParams()
const fromDate = params.fromDate as string | undefined
if (fromDate) {
queryParams.append('from_date', fromDate)
} else {
const thirtyDaysAgo = new Date()
thirtyDaysAgo.setDate(thirtyDaysAgo.getDate() - 30)
queryParams.append('from_date', thirtyDaysAgo.toISOString())
}
if (params.toDate) queryParams.append('to_date', params.toDate as string)
if (params.envelopeStatus) queryParams.append('status', params.envelopeStatus as string)
if (params.searchText) queryParams.append('search_text', params.searchText as string)
if (params.count) queryParams.append('count', params.count as string)
const response = await fetch(`${apiBase}/envelopes?${queryParams}`, { headers })
const data = await response.json()
if (!response.ok) {
return NextResponse.json(
{ success: false, error: data.message || data.errorCode || 'Failed to list envelopes' },
{ status: response.status }
)
}
return NextResponse.json(data)
}
async function handleVoidEnvelope(
apiBase: string,
headers: Record<string, string>,
params: Record<string, unknown>
) {
const { envelopeId, voidedReason } = params
if (!envelopeId) {
return NextResponse.json({ success: false, error: 'envelopeId is required' }, { status: 400 })
}
if (!voidedReason) {
return NextResponse.json({ success: false, error: 'voidedReason is required' }, { status: 400 })
}
const response = await fetch(`${apiBase}/envelopes/${(envelopeId as string).trim()}`, {
method: 'PUT',
headers,
body: JSON.stringify({ status: 'voided', voidedReason }),
})
const data = await response.json()
if (!response.ok) {
return NextResponse.json(
{ success: false, error: data.message || data.errorCode || 'Failed to void envelope' },
{ status: response.status }
)
}
return NextResponse.json({ envelopeId, status: 'voided' })
}
async function handleDownloadDocument(
apiBase: string,
headers: Record<string, string>,
params: Record<string, unknown>
) {
const { envelopeId, documentId } = params
if (!envelopeId) {
return NextResponse.json({ success: false, error: 'envelopeId is required' }, { status: 400 })
}
const docId = (documentId as string) || 'combined'
const response = await fetch(
`${apiBase}/envelopes/${(envelopeId as string).trim()}/documents/${docId}`,
{
headers: { Authorization: headers.Authorization },
}
)
if (!response.ok) {
let errorText = ''
try {
errorText = await response.text()
} catch {
// ignore
}
return NextResponse.json(
{ success: false, error: `Failed to download document: ${response.status} ${errorText}` },
{ status: response.status }
)
}
const contentType = response.headers.get('content-type') || 'application/pdf'
const contentDisposition = response.headers.get('content-disposition') || ''
let fileName = `document-${docId}.pdf`
const filenameMatch = contentDisposition.match(/filename[^;=\n]*=((['"]).*?\2|[^;\n]*)/)
if (filenameMatch) {
fileName = filenameMatch[1].replace(/['"]/g, '')
}
const buffer = Buffer.from(await response.arrayBuffer())
const base64Content = buffer.toString('base64')
return NextResponse.json({ base64Content, mimeType: contentType, fileName })
}
async function handleListTemplates(
apiBase: string,
headers: Record<string, string>,
params: Record<string, unknown>
) {
const queryParams = new URLSearchParams()
if (params.searchText) queryParams.append('search_text', params.searchText as string)
if (params.count) queryParams.append('count', params.count as string)
const queryString = queryParams.toString()
const url = queryString ? `${apiBase}/templates?${queryString}` : `${apiBase}/templates`
const response = await fetch(url, { headers })
const data = await response.json()
if (!response.ok) {
return NextResponse.json(
{ success: false, error: data.message || data.errorCode || 'Failed to list templates' },
{ status: response.status }
)
}
return NextResponse.json(data)
}
async function handleListRecipients(
apiBase: string,
headers: Record<string, string>,
params: Record<string, unknown>
) {
const { envelopeId } = params
if (!envelopeId) {
return NextResponse.json({ success: false, error: 'envelopeId is required' }, { status: 400 })
}
const response = await fetch(`${apiBase}/envelopes/${(envelopeId as string).trim()}/recipients`, {
headers,
})
const data = await response.json()
if (!response.ok) {
return NextResponse.json(
{ success: false, error: data.message || data.errorCode || 'Failed to list recipients' },
{ status: response.status }
)
}
return NextResponse.json(data)
}
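`handleDownloadDocument` above recovers a file name from the `Content-Disposition` header, falling back to a generated name when the header is absent or unparseable. That parsing logic, extracted as a standalone sketch (the helper name is illustrative):

```typescript
// Illustrative only: replicates the Content-Disposition filename parsing
// used by handleDownloadDocument. Handles both quoted and unquoted values.
function parseAttachmentFileName(contentDisposition: string, fallback: string): string {
  const match = contentDisposition.match(/filename[^;=\n]*=((['"]).*?\2|[^;\n]*)/)
  // Strip surrounding quotes from the captured value, if any.
  return match ? match[1].replace(/['"]/g, '') : fallback
}
```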

View File

@@ -0,0 +1,67 @@
import { createLogger } from '@sim/logger'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { checkInternalAuth } from '@/lib/auth/hybrid'
import { generateRequestId } from '@/lib/core/utils/request'
import { createWorkdaySoapClient, extractRefId, wdRef } from '@/tools/workday/soap'
export const dynamic = 'force-dynamic'
const logger = createLogger('WorkdayAssignOnboardingAPI')
const RequestSchema = z.object({
tenantUrl: z.string().min(1),
tenant: z.string().min(1),
username: z.string().min(1),
password: z.string().min(1),
workerId: z.string().min(1),
onboardingPlanId: z.string().min(1),
actionEventId: z.string().min(1),
})
export async function POST(request: NextRequest) {
const requestId = generateRequestId()
try {
const authResult = await checkInternalAuth(request, { requireWorkflowId: false })
if (!authResult.success) {
return NextResponse.json({ success: false, error: 'Unauthorized' }, { status: 401 })
}
const body = await request.json()
const data = RequestSchema.parse(body)
const client = await createWorkdaySoapClient(
data.tenantUrl,
data.tenant,
'humanResources',
data.username,
data.password
)
const [result] = await client.Put_Onboarding_Plan_AssignmentAsync({
Onboarding_Plan_Assignment_Data: {
Onboarding_Plan_Reference: wdRef('Onboarding_Plan_ID', data.onboardingPlanId),
Person_Reference: wdRef('WID', data.workerId),
Action_Event_Reference: wdRef('Background_Check_ID', data.actionEventId),
Assignment_Effective_Moment: new Date().toISOString(),
Active: true,
},
})
return NextResponse.json({
success: true,
output: {
assignmentId: extractRefId(result?.Onboarding_Plan_Assignment_Reference),
workerId: data.workerId,
planId: data.onboardingPlanId,
},
})
} catch (error) {
logger.error(`[${requestId}] Workday assign onboarding failed`, { error })
return NextResponse.json(
{ success: false, error: error instanceof Error ? error.message : 'Unknown error' },
{ status: 500 }
)
}
}

View File

@@ -0,0 +1,94 @@
import { createLogger } from '@sim/logger'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { checkInternalAuth } from '@/lib/auth/hybrid'
import { generateRequestId } from '@/lib/core/utils/request'
import { createWorkdaySoapClient, extractRefId, wdRef } from '@/tools/workday/soap'
export const dynamic = 'force-dynamic'
const logger = createLogger('WorkdayChangeJobAPI')
const RequestSchema = z.object({
tenantUrl: z.string().min(1),
tenant: z.string().min(1),
username: z.string().min(1),
password: z.string().min(1),
workerId: z.string().min(1),
effectiveDate: z.string().min(1),
newPositionId: z.string().optional(),
newJobProfileId: z.string().optional(),
newLocationId: z.string().optional(),
newSupervisoryOrgId: z.string().optional(),
reason: z.string().min(1, 'Reason is required for job changes'),
})
export async function POST(request: NextRequest) {
const requestId = generateRequestId()
try {
const authResult = await checkInternalAuth(request, { requireWorkflowId: false })
if (!authResult.success) {
return NextResponse.json({ success: false, error: 'Unauthorized' }, { status: 401 })
}
const body = await request.json()
const data = RequestSchema.parse(body)
const changeJobDetailData: Record<string, unknown> = {
Reason_Reference: wdRef('Change_Job_Subcategory_ID', data.reason),
}
if (data.newPositionId) {
changeJobDetailData.Position_Reference = wdRef('Position_ID', data.newPositionId)
}
if (data.newJobProfileId) {
changeJobDetailData.Job_Profile_Reference = wdRef('Job_Profile_ID', data.newJobProfileId)
}
if (data.newLocationId) {
changeJobDetailData.Location_Reference = wdRef('Location_ID', data.newLocationId)
}
if (data.newSupervisoryOrgId) {
changeJobDetailData.Supervisory_Organization_Reference = wdRef(
'Supervisory_Organization_ID',
data.newSupervisoryOrgId
)
}
const client = await createWorkdaySoapClient(
data.tenantUrl,
data.tenant,
'staffing',
data.username,
data.password
)
const [result] = await client.Change_JobAsync({
Business_Process_Parameters: {
Auto_Complete: true,
Run_Now: true,
},
Change_Job_Data: {
Worker_Reference: wdRef('Employee_ID', data.workerId),
Effective_Date: data.effectiveDate,
Change_Job_Detail_Data: changeJobDetailData,
},
})
const eventRef = result?.Event_Reference
return NextResponse.json({
success: true,
output: {
eventId: extractRefId(eventRef),
workerId: data.workerId,
effectiveDate: data.effectiveDate,
},
})
} catch (error) {
logger.error(`[${requestId}] Workday change job failed`, { error })
return NextResponse.json(
{ success: false, error: error instanceof Error ? error.message : 'Unknown error' },
{ status: 500 }
)
}
}

View File

@@ -0,0 +1,134 @@
import { createLogger } from '@sim/logger'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { checkInternalAuth } from '@/lib/auth/hybrid'
import { generateRequestId } from '@/lib/core/utils/request'
import { createWorkdaySoapClient, extractRefId, wdRef } from '@/tools/workday/soap'
export const dynamic = 'force-dynamic'
const logger = createLogger('WorkdayCreatePrehireAPI')
const RequestSchema = z.object({
tenantUrl: z.string().min(1),
tenant: z.string().min(1),
username: z.string().min(1),
password: z.string().min(1),
legalName: z.string().min(1),
email: z.string().optional(),
phoneNumber: z.string().optional(),
address: z.string().optional(),
countryCode: z.string().optional(),
})
export async function POST(request: NextRequest) {
const requestId = generateRequestId()
try {
const authResult = await checkInternalAuth(request, { requireWorkflowId: false })
if (!authResult.success) {
return NextResponse.json({ success: false, error: 'Unauthorized' }, { status: 401 })
}
const body = await request.json()
const data = RequestSchema.parse(body)
if (!data.email && !data.phoneNumber && !data.address) {
return NextResponse.json(
{
success: false,
error: 'At least one contact method (email, phone, or address) is required',
},
{ status: 400 }
)
}
const parts = data.legalName.trim().split(/\s+/)
const firstName = parts[0] ?? ''
const lastName = parts.length > 1 ? parts.slice(1).join(' ') : ''
if (!lastName) {
return NextResponse.json(
{ success: false, error: 'Legal name must include both a first name and last name' },
{ status: 400 }
)
}
const client = await createWorkdaySoapClient(
data.tenantUrl,
data.tenant,
'staffing',
data.username,
data.password
)
const contactData: Record<string, unknown> = {}
if (data.email) {
contactData.Email_Address_Data = [
{
Email_Address: data.email,
Usage_Data: {
Type_Data: { Type_Reference: wdRef('Communication_Usage_Type_ID', 'WORK') },
Public: true,
},
},
]
}
if (data.phoneNumber) {
contactData.Phone_Data = [
{
Phone_Number: data.phoneNumber,
Phone_Device_Type_Reference: wdRef('Phone_Device_Type_ID', 'Landline'),
Usage_Data: {
Type_Data: { Type_Reference: wdRef('Communication_Usage_Type_ID', 'WORK') },
Public: true,
},
},
]
}
if (data.address) {
contactData.Address_Data = [
{
Formatted_Address: data.address,
Usage_Data: {
Type_Data: { Type_Reference: wdRef('Communication_Usage_Type_ID', 'WORK') },
Public: true,
},
},
]
}
const [result] = await client.Put_ApplicantAsync({
Applicant_Data: {
Personal_Data: {
Name_Data: {
Legal_Name_Data: {
Name_Detail_Data: {
Country_Reference: wdRef('ISO_3166-1_Alpha-2_Code', data.countryCode ?? 'US'),
First_Name: firstName,
Last_Name: lastName,
},
},
},
Contact_Information_Data: contactData,
},
},
})
const applicantRef = result?.Applicant_Reference
return NextResponse.json({
success: true,
output: {
preHireId: extractRefId(applicantRef),
descriptor: applicantRef?.attributes?.Descriptor ?? null,
},
})
} catch (error) {
logger.error(`[${requestId}] Workday create prehire failed`, { error })
return NextResponse.json(
{ success: false, error: error instanceof Error ? error.message : 'Unknown error' },
{ status: 500 }
)
}
}
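The create-prehire route splits a single `legalName` field into the first and last names Workday's `Name_Detail_Data` requires: the first whitespace-delimited token becomes the first name, and everything after it becomes the last name. The same rule as a pure sketch (the helper name is illustrative):

```typescript
// Illustrative only: mirrors the legal-name splitting in the create-prehire route.
function splitLegalName(legalName: string): { firstName: string; lastName: string } {
  const parts = legalName.trim().split(/\s+/)
  const firstName = parts[0] ?? ''
  // Multi-word surnames are preserved by rejoining everything after the first token.
  const lastName = parts.length > 1 ? parts.slice(1).join(' ') : ''
  if (!lastName) {
    throw new Error('Legal name must include both a first name and last name')
  }
  return { firstName, lastName }
}
```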

View File

@@ -0,0 +1,101 @@
import { createLogger } from '@sim/logger'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { checkInternalAuth } from '@/lib/auth/hybrid'
import { generateRequestId } from '@/lib/core/utils/request'
import {
createWorkdaySoapClient,
extractRefId,
normalizeSoapArray,
type WorkdayCompensationDataSoap,
type WorkdayCompensationPlanSoap,
type WorkdayWorkerSoap,
} from '@/tools/workday/soap'
export const dynamic = 'force-dynamic'
const logger = createLogger('WorkdayGetCompensationAPI')
const RequestSchema = z.object({
tenantUrl: z.string().min(1),
tenant: z.string().min(1),
username: z.string().min(1),
password: z.string().min(1),
workerId: z.string().min(1),
})
export async function POST(request: NextRequest) {
const requestId = generateRequestId()
try {
const authResult = await checkInternalAuth(request, { requireWorkflowId: false })
if (!authResult.success) {
return NextResponse.json({ success: false, error: 'Unauthorized' }, { status: 401 })
}
const body = await request.json()
const data = RequestSchema.parse(body)
const client = await createWorkdaySoapClient(
data.tenantUrl,
data.tenant,
'humanResources',
data.username,
data.password
)
const [result] = await client.Get_WorkersAsync({
Request_References: {
Worker_Reference: {
ID: { attributes: { 'wd:type': 'Employee_ID' }, $value: data.workerId },
},
},
Response_Group: {
Include_Reference: true,
Include_Compensation: true,
},
})
const worker =
normalizeSoapArray(
result?.Response_Data?.Worker as WorkdayWorkerSoap | WorkdayWorkerSoap[] | undefined
)[0] ?? null
const compensationData = worker?.Worker_Data?.Compensation_Data
const mapPlan = (p: WorkdayCompensationPlanSoap) => ({
id: extractRefId(p.Compensation_Plan_Reference) ?? null,
planName: p.Compensation_Plan_Reference?.attributes?.Descriptor ?? null,
amount: p.Amount ?? p.Per_Unit_Amount ?? p.Individual_Target_Amount ?? null,
currency: extractRefId(p.Currency_Reference) ?? null,
frequency: extractRefId(p.Frequency_Reference) ?? null,
})
const planTypeKeys: (keyof WorkdayCompensationDataSoap)[] = [
'Employee_Base_Pay_Plan_Assignment_Data',
'Employee_Salary_Unit_Plan_Assignment_Data',
'Employee_Bonus_Plan_Assignment_Data',
'Employee_Allowance_Plan_Assignment_Data',
'Employee_Commission_Plan_Assignment_Data',
'Employee_Stock_Plan_Assignment_Data',
'Employee_Period_Salary_Plan_Assignment_Data',
]
const compensationPlans: ReturnType<typeof mapPlan>[] = []
for (const key of planTypeKeys) {
for (const plan of normalizeSoapArray(compensationData?.[key])) {
compensationPlans.push(mapPlan(plan))
}
}
return NextResponse.json({
success: true,
output: { compensationPlans },
})
} catch (error) {
logger.error(`[${requestId}] Workday get compensation failed`, { error })
return NextResponse.json(
{ success: false, error: error instanceof Error ? error.message : 'Unknown error' },
{ status: 500 }
)
}
}

View File

@@ -0,0 +1,94 @@
import { createLogger } from '@sim/logger'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { checkInternalAuth } from '@/lib/auth/hybrid'
import { generateRequestId } from '@/lib/core/utils/request'
import {
createWorkdaySoapClient,
extractRefId,
normalizeSoapArray,
type WorkdayOrganizationSoap,
} from '@/tools/workday/soap'
export const dynamic = 'force-dynamic'
const logger = createLogger('WorkdayGetOrganizationsAPI')
const RequestSchema = z.object({
tenantUrl: z.string().min(1),
tenant: z.string().min(1),
username: z.string().min(1),
password: z.string().min(1),
type: z.string().optional(),
limit: z.number().optional(),
offset: z.number().optional(),
})
export async function POST(request: NextRequest) {
const requestId = generateRequestId()
try {
const authResult = await checkInternalAuth(request, { requireWorkflowId: false })
if (!authResult.success) {
return NextResponse.json({ success: false, error: 'Unauthorized' }, { status: 401 })
}
const body = await request.json()
const data = RequestSchema.parse(body)
const client = await createWorkdaySoapClient(
data.tenantUrl,
data.tenant,
'humanResources',
data.username,
data.password
)
const limit = data.limit ?? 20
const offset = data.offset ?? 0
const page = offset > 0 ? Math.floor(offset / limit) + 1 : 1
const [result] = await client.Get_OrganizationsAsync({
Response_Filter: { Page: page, Count: limit },
Request_Criteria: data.type
? {
Organization_Type_Reference: {
ID: {
attributes: { 'wd:type': 'Organization_Type_ID' },
$value: data.type,
},
},
}
: undefined,
Response_Group: { Include_Hierarchy_Data: true },
})
const orgsArray = normalizeSoapArray(
result?.Response_Data?.Organization as
| WorkdayOrganizationSoap
| WorkdayOrganizationSoap[]
| undefined
)
const organizations = orgsArray.map((o) => ({
id: extractRefId(o.Organization_Reference) ?? null,
descriptor: o.Organization_Descriptor ?? null,
type: extractRefId(o.Organization_Data?.Organization_Type_Reference) ?? null,
subtype: extractRefId(o.Organization_Data?.Organization_Subtype_Reference) ?? null,
isActive: o.Organization_Data?.Inactive != null ? !o.Organization_Data.Inactive : null,
}))
const total = result?.Response_Results?.Total_Results ?? organizations.length
return NextResponse.json({
success: true,
output: { organizations, total },
})
} catch (error) {
logger.error(`[${requestId}] Workday get organizations failed`, { error })
return NextResponse.json(
{ success: false, error: error instanceof Error ? error.message : 'Unknown error' },
{ status: 500 }
)
}
}
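Both this route and the list-workers route translate the caller's `offset`/`limit` into Workday's page-based `Response_Filter`. A sketch of that conversion (the function name is illustrative):

```typescript
// Illustrative only: mirrors the offset-to-page conversion in the Workday list routes.
// Workday paginates with a 1-based Page and a Count per page, so an offset is
// mapped to the page that contains it.
function offsetToPage(offset: number, limit: number): number {
  return offset > 0 ? Math.floor(offset / limit) + 1 : 1
}
```

One design consequence worth noting: offsets that are not exact multiples of `limit` are floored to the containing page, so `offset: 25, limit: 20` starts from page 2 (records 21-40), not record 26.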

View File

@@ -0,0 +1,87 @@
import { createLogger } from '@sim/logger'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { checkInternalAuth } from '@/lib/auth/hybrid'
import { generateRequestId } from '@/lib/core/utils/request'
import {
createWorkdaySoapClient,
extractRefId,
normalizeSoapArray,
type WorkdayWorkerSoap,
} from '@/tools/workday/soap'
export const dynamic = 'force-dynamic'
const logger = createLogger('WorkdayGetWorkerAPI')
const RequestSchema = z.object({
tenantUrl: z.string().min(1),
tenant: z.string().min(1),
username: z.string().min(1),
password: z.string().min(1),
workerId: z.string().min(1),
})
export async function POST(request: NextRequest) {
const requestId = generateRequestId()
try {
const authResult = await checkInternalAuth(request, { requireWorkflowId: false })
if (!authResult.success) {
return NextResponse.json({ success: false, error: 'Unauthorized' }, { status: 401 })
}
const body = await request.json()
const data = RequestSchema.parse(body)
const client = await createWorkdaySoapClient(
data.tenantUrl,
data.tenant,
'humanResources',
data.username,
data.password
)
const [result] = await client.Get_WorkersAsync({
Request_References: {
Worker_Reference: {
ID: { attributes: { 'wd:type': 'Employee_ID' }, $value: data.workerId },
},
},
Response_Group: {
Include_Reference: true,
Include_Personal_Information: true,
Include_Employment_Information: true,
Include_Compensation: true,
Include_Organizations: true,
},
})
const worker =
normalizeSoapArray(
result?.Response_Data?.Worker as WorkdayWorkerSoap | WorkdayWorkerSoap[] | undefined
)[0] ?? null
return NextResponse.json({
success: true,
output: {
worker: worker
? {
id: extractRefId(worker.Worker_Reference) ?? null,
descriptor: worker.Worker_Descriptor ?? null,
personalData: worker.Worker_Data?.Personal_Data ?? null,
employmentData: worker.Worker_Data?.Employment_Data ?? null,
compensationData: worker.Worker_Data?.Compensation_Data ?? null,
organizationData: worker.Worker_Data?.Organization_Data ?? null,
}
: null,
},
})
} catch (error) {
logger.error(`[${requestId}] Workday get worker failed`, { error })
return NextResponse.json(
{ success: false, error: error instanceof Error ? error.message : 'Unknown error' },
{ status: 500 }
)
}
}

View File

@@ -0,0 +1,78 @@
import { createLogger } from '@sim/logger'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { checkInternalAuth } from '@/lib/auth/hybrid'
import { generateRequestId } from '@/lib/core/utils/request'
import { createWorkdaySoapClient, extractRefId, wdRef } from '@/tools/workday/soap'
export const dynamic = 'force-dynamic'
const logger = createLogger('WorkdayHireAPI')
const RequestSchema = z.object({
tenantUrl: z.string().min(1),
tenant: z.string().min(1),
username: z.string().min(1),
password: z.string().min(1),
preHireId: z.string().min(1),
positionId: z.string().min(1),
hireDate: z.string().min(1),
employeeType: z.string().optional(),
})
export async function POST(request: NextRequest) {
const requestId = generateRequestId()
try {
const authResult = await checkInternalAuth(request, { requireWorkflowId: false })
if (!authResult.success) {
return NextResponse.json({ success: false, error: 'Unauthorized' }, { status: 401 })
}
const body = await request.json()
const data = RequestSchema.parse(body)
const client = await createWorkdaySoapClient(
data.tenantUrl,
data.tenant,
'staffing',
data.username,
data.password
)
const [result] = await client.Hire_EmployeeAsync({
Business_Process_Parameters: {
Auto_Complete: true,
Run_Now: true,
},
Hire_Employee_Data: {
Applicant_Reference: wdRef('Applicant_ID', data.preHireId),
Position_Reference: wdRef('Position_ID', data.positionId),
Hire_Date: data.hireDate,
Hire_Employee_Event_Data: {
Employee_Type_Reference: wdRef('Employee_Type_ID', data.employeeType ?? 'Regular'),
First_Day_of_Work: data.hireDate,
},
},
})
const employeeRef = result?.Employee_Reference
const eventRef = result?.Event_Reference
return NextResponse.json({
success: true,
output: {
workerId: extractRefId(employeeRef),
employeeId: extractRefId(employeeRef),
eventId: extractRefId(eventRef),
hireDate: data.hireDate,
},
})
} catch (error) {
logger.error(`[${requestId}] Workday hire employee failed`, { error })
return NextResponse.json(
{ success: false, error: error instanceof Error ? error.message : 'Unknown error' },
{ status: 500 }
)
}
}
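
The hire route above leans on `wdRef` and `extractRefId` from `@/tools/workday/soap`, which are not part of this diff. A minimal sketch of what those helpers plausibly look like, assuming the common Workday SOAP reference shape of a typed ID node (the exact field names are an assumption, not the real module):

```typescript
// Hypothetical sketch — the real helpers live in @/tools/workday/soap and are
// not shown in this diff. Workday SOAP references pair an ID type attribute
// with an ID value; these helpers build and unwrap that shape.
interface WdIdNode {
  attributes: { 'wd:type': string }
  $value: string
}
interface WdReference {
  ID: WdIdNode[]
}

/** Build a Workday reference payload, e.g. wdRef('Employee_ID', '123'). */
function wdRef(type: string, id: string): WdReference {
  return { ID: [{ attributes: { 'wd:type': type }, $value: id }] }
}

/** Pull the ID value back out of a reference returned by the SOAP client. */
function extractRefId(ref: WdReference | undefined): string | null {
  return ref?.ID?.[0]?.$value ?? null
}
```

This matches how the route calls them: `wdRef('Applicant_ID', data.preHireId)` to build request references, and `extractRefId(result?.Employee_Reference)` to flatten the response.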

View File

@@ -0,0 +1,83 @@
import { createLogger } from '@sim/logger'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { checkInternalAuth } from '@/lib/auth/hybrid'
import { generateRequestId } from '@/lib/core/utils/request'
import {
createWorkdaySoapClient,
extractRefId,
normalizeSoapArray,
type WorkdayWorkerSoap,
} from '@/tools/workday/soap'
export const dynamic = 'force-dynamic'
const logger = createLogger('WorkdayListWorkersAPI')
const RequestSchema = z.object({
tenantUrl: z.string().min(1),
tenant: z.string().min(1),
username: z.string().min(1),
password: z.string().min(1),
limit: z.number().optional(),
offset: z.number().optional(),
})
export async function POST(request: NextRequest) {
const requestId = generateRequestId()
try {
const authResult = await checkInternalAuth(request, { requireWorkflowId: false })
if (!authResult.success) {
return NextResponse.json({ success: false, error: 'Unauthorized' }, { status: 401 })
}
const body = await request.json()
const data = RequestSchema.parse(body)
const client = await createWorkdaySoapClient(
data.tenantUrl,
data.tenant,
'humanResources',
data.username,
data.password
)
const limit = data.limit ?? 20
const offset = data.offset ?? 0
const page = offset > 0 ? Math.floor(offset / limit) + 1 : 1
const [result] = await client.Get_WorkersAsync({
Response_Filter: { Page: page, Count: limit },
Response_Group: {
Include_Reference: true,
Include_Personal_Information: true,
Include_Employment_Information: true,
},
})
const workersArray = normalizeSoapArray(
result?.Response_Data?.Worker as WorkdayWorkerSoap | WorkdayWorkerSoap[] | undefined
)
const workers = workersArray.map((w) => ({
id: extractRefId(w.Worker_Reference) ?? null,
descriptor: w.Worker_Descriptor ?? null,
personalData: w.Worker_Data?.Personal_Data ?? null,
employmentData: w.Worker_Data?.Employment_Data ?? null,
}))
const total = result?.Response_Results?.Total_Results ?? workers.length
return NextResponse.json({
success: true,
output: { workers, total },
})
} catch (error) {
logger.error(`[${requestId}] Workday list workers failed`, { error })
return NextResponse.json(
{ success: false, error: error instanceof Error ? error.message : 'Unknown error' },
{ status: 500 }
)
}
}
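
The list-workers route translates the caller's offset/limit into Workday's page-based `Response_Filter`. Worth noting: the conversion is only exact when `offset` is a multiple of `limit`; any other offset is rounded down to the start of its page. A standalone version of the same arithmetic:

```typescript
// Standalone version of the offset-to-page conversion used above. Workday's
// Response_Filter is page-based, so an arbitrary offset only lands exactly
// when it falls on a page boundary (offset % limit === 0); otherwise the
// request starts at the beginning of the containing page.
function toPage(offset: number, limit: number): number {
  return offset > 0 ? Math.floor(offset / limit) + 1 : 1
}
```

For example, with `limit = 20`, an offset of 20 requests page 2 (items 21–40), while an offset of 30 also resolves to page 2, so the response starts at item 21 rather than 31.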

View File

@@ -0,0 +1,77 @@
import { createLogger } from '@sim/logger'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { checkInternalAuth } from '@/lib/auth/hybrid'
import { generateRequestId } from '@/lib/core/utils/request'
import { createWorkdaySoapClient, extractRefId, wdRef } from '@/tools/workday/soap'
export const dynamic = 'force-dynamic'
const logger = createLogger('WorkdayTerminateAPI')
const RequestSchema = z.object({
tenantUrl: z.string().min(1),
tenant: z.string().min(1),
username: z.string().min(1),
password: z.string().min(1),
workerId: z.string().min(1),
terminationDate: z.string().min(1),
reason: z.string().min(1),
notificationDate: z.string().optional(),
lastDayOfWork: z.string().optional(),
})
export async function POST(request: NextRequest) {
const requestId = generateRequestId()
try {
const authResult = await checkInternalAuth(request, { requireWorkflowId: false })
if (!authResult.success) {
return NextResponse.json({ success: false, error: 'Unauthorized' }, { status: 401 })
}
const body = await request.json()
const data = RequestSchema.parse(body)
const client = await createWorkdaySoapClient(
data.tenantUrl,
data.tenant,
'staffing',
data.username,
data.password
)
const [result] = await client.Terminate_EmployeeAsync({
Business_Process_Parameters: {
Auto_Complete: true,
Run_Now: true,
},
Terminate_Employee_Data: {
Employee_Reference: wdRef('Employee_ID', data.workerId),
Termination_Date: data.terminationDate,
Terminate_Event_Data: {
Primary_Reason_Reference: wdRef('Termination_Subcategory_ID', data.reason),
Last_Day_of_Work: data.lastDayOfWork ?? data.terminationDate,
Notification_Date: data.notificationDate ?? data.terminationDate,
},
},
})
const eventRef = result?.Event_Reference
return NextResponse.json({
success: true,
output: {
eventId: extractRefId(eventRef),
workerId: data.workerId,
terminationDate: data.terminationDate,
},
})
} catch (error) {
logger.error(`[${requestId}] Workday terminate employee failed`, { error })
return NextResponse.json(
{ success: false, error: error instanceof Error ? error.message : 'Unknown error' },
{ status: 500 }
)
}
}

View File

@@ -0,0 +1,66 @@
import { createLogger } from '@sim/logger'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { checkInternalAuth } from '@/lib/auth/hybrid'
import { generateRequestId } from '@/lib/core/utils/request'
import { createWorkdaySoapClient, extractRefId, wdRef } from '@/tools/workday/soap'
export const dynamic = 'force-dynamic'
const logger = createLogger('WorkdayUpdateWorkerAPI')
const RequestSchema = z.object({
tenantUrl: z.string().min(1),
tenant: z.string().min(1),
username: z.string().min(1),
password: z.string().min(1),
workerId: z.string().min(1),
fields: z.record(z.unknown()),
})
export async function POST(request: NextRequest) {
const requestId = generateRequestId()
try {
const authResult = await checkInternalAuth(request, { requireWorkflowId: false })
if (!authResult.success) {
return NextResponse.json({ success: false, error: 'Unauthorized' }, { status: 401 })
}
const body = await request.json()
const data = RequestSchema.parse(body)
const client = await createWorkdaySoapClient(
data.tenantUrl,
data.tenant,
'humanResources',
data.username,
data.password
)
const [result] = await client.Change_Personal_InformationAsync({
Business_Process_Parameters: {
Auto_Complete: true,
Run_Now: true,
},
Change_Personal_Information_Data: {
Person_Reference: wdRef('Employee_ID', data.workerId),
Personal_Information_Data: data.fields,
},
})
return NextResponse.json({
success: true,
output: {
eventId: extractRefId(result?.Personal_Information_Change_Event_Reference),
workerId: data.workerId,
},
})
} catch (error) {
logger.error(`[${requestId}] Workday update worker failed`, { error })
return NextResponse.json(
{ success: false, error: error instanceof Error ? error.message : 'Unknown error' },
{ status: 500 }
)
}
}

View File

@@ -93,7 +93,7 @@ export async function POST(request: NextRequest, { params }: { params: Promise<{
return NextResponse.json({ error: 'No file provided' }, { status: 400 })
}
const fileName = rawFile.name || 'untitled'
const fileName = rawFile.name || 'untitled.md'
const maxSize = 100 * 1024 * 1024
if (rawFile.size > maxSize) {

View File

@@ -1,5 +1,6 @@
export { ErrorState, type ErrorStateProps } from './error'
export { InlineRenameInput } from './inline-rename-input'
export { MessageActions } from './message-actions'
export { ownerCell } from './resource/components/owner-cell/owner-cell'
export type {
BreadcrumbEditing,

View File

@@ -0,0 +1 @@
export { MessageActions } from './message-actions'

View File

@@ -0,0 +1,84 @@
'use client'
import { useCallback, useEffect, useRef, useState } from 'react'
import { Check, Copy, Ellipsis, Hash } from 'lucide-react'
import {
DropdownMenu,
DropdownMenuContent,
DropdownMenuItem,
DropdownMenuTrigger,
} from '@/components/emcn'
interface MessageActionsProps {
content: string
requestId?: string
}
export function MessageActions({ content, requestId }: MessageActionsProps) {
const [copied, setCopied] = useState<'message' | 'request' | null>(null)
const resetTimeoutRef = useRef<number | null>(null)
useEffect(() => {
return () => {
if (resetTimeoutRef.current !== null) {
window.clearTimeout(resetTimeoutRef.current)
}
}
}, [])
const copyToClipboard = useCallback(async (text: string, type: 'message' | 'request') => {
try {
await navigator.clipboard.writeText(text)
setCopied(type)
if (resetTimeoutRef.current !== null) {
window.clearTimeout(resetTimeoutRef.current)
}
resetTimeoutRef.current = window.setTimeout(() => setCopied(null), 1500)
} catch {
return
}
}, [])
if (!content && !requestId) {
return null
}
return (
<DropdownMenu modal={false}>
<DropdownMenuTrigger asChild>
<button
type='button'
aria-label='More options'
className='flex h-5 w-5 items-center justify-center rounded-sm text-[var(--text-icon)] opacity-0 transition-colors transition-opacity hover:bg-[var(--surface-3)] hover:text-[var(--text-primary)] focus-visible:opacity-100 focus-visible:outline-none group-hover/msg:opacity-100 data-[state=open]:opacity-100'
onClick={(event) => event.stopPropagation()}
>
<Ellipsis className='h-3 w-3' strokeWidth={2} />
</button>
</DropdownMenuTrigger>
<DropdownMenuContent align='end' side='top' sideOffset={4}>
<DropdownMenuItem
disabled={!content}
onSelect={(event) => {
event.stopPropagation()
void copyToClipboard(content, 'message')
}}
>
{copied === 'message' ? <Check /> : <Copy />}
<span>Copy Message</span>
</DropdownMenuItem>
<DropdownMenuItem
disabled={!requestId}
onSelect={(event) => {
event.stopPropagation()
if (requestId) {
void copyToClipboard(requestId, 'request')
}
}}
>
{copied === 'request' ? <Check /> : <Hash />}
<span>Copy Request ID</span>
</DropdownMenuItem>
</DropdownMenuContent>
</DropdownMenu>
)
}

View File

@@ -215,16 +215,13 @@ function TextEditor({
onSaveStatusChange?.(saveStatus)
}, [saveStatus, onSaveStatusChange])
useEffect(() => {
if (saveRef) {
saveRef.current = saveImmediately
}
return () => {
if (saveRef) {
saveRef.current = null
}
}
}, [saveRef, saveImmediately])
if (saveRef) saveRef.current = saveImmediately
useEffect(
() => () => {
if (saveRef) saveRef.current = null
},
[saveRef]
)
useEffect(() => {
if (!isResizing) return

View File

@@ -151,6 +151,8 @@ export function Files() {
}
const justCreatedFileIdRef = useRef<string | null>(null)
const filesRef = useRef(files)
filesRef.current = files
const [uploading, setUploading] = useState(false)
const [uploadProgress, setUploadProgress] = useState({ completed: 0, total: 0 })
@@ -483,11 +485,11 @@ export function Files() {
if (isJustCreated) {
setPreviewMode('editor')
} else {
const file = selectedFileId ? files.find((f) => f.id === selectedFileId) : null
const file = selectedFileId ? filesRef.current.find((f) => f.id === selectedFileId) : null
const canPreview = file ? isPreviewable(file) : false
setPreviewMode(canPreview ? 'preview' : 'editor')
}
}, [selectedFileId, files])
}, [selectedFileId])
useEffect(() => {
if (!selectedFile) return

View File

@@ -160,8 +160,8 @@ export function EmbeddedWorkflowActions({ workspaceId, workflowId }: EmbeddedWor
])
const handleOpenWorkflow = useCallback(() => {
router.push(`/workspace/${workspaceId}/w/${workflowId}`)
}, [router, workspaceId, workflowId])
window.open(`/workspace/${workspaceId}/w/${workflowId}`, '_blank')
}, [workspaceId, workflowId])
return (
<>

View File

@@ -1,6 +1,6 @@
'use client'
import { memo, useCallback, useEffect, useState } from 'react'
import { forwardRef, memo, useCallback, useState } from 'react'
import { cn } from '@/lib/core/utils/cn'
import { getFileExtension } from '@/lib/uploads/utils/file-utils'
import type { PreviewMode } from '@/app/workspace/[workspaceId]/files/components/file-viewer'
@@ -31,68 +31,79 @@ interface MothershipViewProps {
className?: string
}
export const MothershipView = memo(function MothershipView({
workspaceId,
chatId,
resources,
activeResourceId,
onSelectResource,
onAddResource,
onRemoveResource,
onReorderResources,
onCollapse,
isCollapsed,
className,
}: MothershipViewProps) {
const active = resources.find((r) => r.id === activeResourceId) ?? resources[0] ?? null
export const MothershipView = memo(
forwardRef<HTMLDivElement, MothershipViewProps>(function MothershipView(
{
workspaceId,
chatId,
resources,
activeResourceId,
onSelectResource,
onAddResource,
onRemoveResource,
onReorderResources,
onCollapse,
isCollapsed,
className,
}: MothershipViewProps,
ref
) {
const active = resources.find((r) => r.id === activeResourceId) ?? resources[0] ?? null
const [previewMode, setPreviewMode] = useState<PreviewMode>('preview')
const handleCyclePreview = useCallback(() => setPreviewMode((m) => PREVIEW_CYCLE[m]), [])
const [previewMode, setPreviewMode] = useState<PreviewMode>('preview')
const [prevActiveId, setPrevActiveId] = useState<string | null | undefined>(active?.id)
const handleCyclePreview = useCallback(() => setPreviewMode((m) => PREVIEW_CYCLE[m]), [])
useEffect(() => {
setPreviewMode('preview')
}, [active?.id])
// Reset preview mode to default when the active resource changes (guarded render-phase update)
if (active?.id !== prevActiveId) {
setPrevActiveId(active?.id)
setPreviewMode('preview')
}
const isActivePreviewable =
active?.type === 'file' && RICH_PREVIEWABLE_EXTENSIONS.has(getFileExtension(active.title))
const isActivePreviewable =
active?.type === 'file' && RICH_PREVIEWABLE_EXTENSIONS.has(getFileExtension(active.title))
return (
<div
className={cn(
'relative z-10 flex h-full flex-col overflow-hidden border-[var(--border)] bg-[var(--bg)] transition-[width,min-width,border-width] duration-300 ease-out',
isCollapsed ? 'w-0 min-w-0 border-l-0' : 'w-[60%] border-l',
className
)}
>
<div className='flex min-h-0 flex-1 flex-col'>
<ResourceTabs
workspaceId={workspaceId}
chatId={chatId}
resources={resources}
activeId={active?.id ?? null}
onSelect={onSelectResource}
onAddResource={onAddResource}
onRemoveResource={onRemoveResource}
onReorderResources={onReorderResources}
onCollapse={onCollapse}
actions={active ? <ResourceActions workspaceId={workspaceId} resource={active} /> : null}
previewMode={isActivePreviewable ? previewMode : undefined}
onCyclePreviewMode={isActivePreviewable ? handleCyclePreview : undefined}
/>
<div className='min-h-0 flex-1 overflow-hidden'>
{active ? (
<ResourceContent
workspaceId={workspaceId}
resource={active}
previewMode={isActivePreviewable ? previewMode : undefined}
/>
) : (
<div className='flex h-full items-center justify-center text-[14px] text-[var(--text-muted)]'>
Click "+" above to add a resource
</div>
)}
return (
<div
ref={ref}
className={cn(
'relative z-10 flex h-full flex-col overflow-hidden border-[var(--border)] bg-[var(--bg)] transition-[width,min-width,border-width] duration-300 ease-out',
isCollapsed ? 'w-0 min-w-0 border-l-0' : 'w-[60%] border-l',
className
)}
>
<div className='flex min-h-0 flex-1 flex-col'>
<ResourceTabs
workspaceId={workspaceId}
chatId={chatId}
resources={resources}
activeId={active?.id ?? null}
onSelect={onSelectResource}
onAddResource={onAddResource}
onRemoveResource={onRemoveResource}
onReorderResources={onReorderResources}
onCollapse={onCollapse}
actions={
active ? <ResourceActions workspaceId={workspaceId} resource={active} /> : null
}
previewMode={isActivePreviewable ? previewMode : undefined}
onCyclePreviewMode={isActivePreviewable ? handleCyclePreview : undefined}
/>
<div className='min-h-0 flex-1 overflow-hidden'>
{active ? (
<ResourceContent
workspaceId={workspaceId}
resource={active}
previewMode={isActivePreviewable ? previewMode : undefined}
/>
) : (
<div className='flex h-full items-center justify-center text-[14px] text-[var(--text-muted)]'>
Click "+" above to add a resource
</div>
)}
</div>
</div>
</div>
</div>
)
})
)
})
)
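
The `prevActiveId` change in this diff swaps a `useEffect`-based reset for a guarded render-phase update: the component compares the incoming active ID with the one from the previous render and resets derived state synchronously, before commit, instead of a frame later. A framework-free model of the guard (names are illustrative):

```typescript
// Model of the guarded render-phase update above: derived state resets only
// on the render where the tracked id actually changes, and is otherwise
// left alone so user toggles persist.
type ViewState = { prevActiveId: string | null; previewMode: 'preview' | 'editor' }

function onRender(state: ViewState, activeId: string | null): ViewState {
  if (activeId !== state.prevActiveId) {
    // The id changed since last render: record it and reset the preview mode.
    return { prevActiveId: activeId, previewMode: 'preview' }
  }
  return state // same id: keep whatever mode the user selected
}
```

Compared to the removed `useEffect`, the guard avoids one wasted render with stale state, at the cost of needing the explicit `prevActiveId` bookkeeping.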

View File

@@ -202,9 +202,7 @@ export function UserInput({
}
useEffect(() => {
if (editValue) {
onEditValueConsumed?.()
}
if (editValue) onEditValueConsumed?.()
}, [editValue, onEditValueConsumed])
const animatedPlaceholder = useAnimatedPlaceholder(isInitialView)

View File

@@ -13,6 +13,7 @@ import {
LandingWorkflowSeedStorage,
} from '@/lib/core/utils/browser-storage'
import { persistImportedWorkflow } from '@/lib/workflows/operations/import-export'
import { MessageActions } from '@/app/workspace/[workspaceId]/components'
import { useChatHistory, useMarkTaskRead } from '@/hooks/queries/tasks'
import type { ChatContext } from '@/stores/panel'
import { useSidebarStore } from '@/stores/sidebar/store'
@@ -25,7 +26,7 @@ import {
UserMessageContent,
} from './components'
import { PendingTagIndicator } from './components/message-content/components/special-tags'
import { useAutoScroll, useChat } from './hooks'
import { useAutoScroll, useChat, useMothershipResize } from './hooks'
import type { FileAttachmentForApi, MothershipResource, MothershipResourceType } from './types'
const logger = createLogger('Home')
@@ -137,26 +138,41 @@ export function Home({ chatId }: HomeProps = {}) {
useChatHistory(chatId)
const { mutate: markRead } = useMarkTaskRead(workspaceId)
const { mothershipRef, handleResizePointerDown, clearWidth } = useMothershipResize()
const [isResourceCollapsed, setIsResourceCollapsed] = useState(true)
const [isResourceAnimatingIn, setIsResourceAnimatingIn] = useState(false)
const [skipResourceTransition, setSkipResourceTransition] = useState(false)
const isResourceCollapsedRef = useRef(isResourceCollapsed)
isResourceCollapsedRef.current = isResourceCollapsed
const collapseResource = useCallback(() => setIsResourceCollapsed(true), [])
const collapseResource = useCallback(() => {
clearWidth()
setIsResourceCollapsed(true)
}, [clearWidth])
const animatingInTimerRef = useRef<ReturnType<typeof setTimeout> | null>(null)
const startAnimatingIn = useCallback(() => {
if (animatingInTimerRef.current) clearTimeout(animatingInTimerRef.current)
setIsResourceAnimatingIn(true)
animatingInTimerRef.current = setTimeout(() => {
setIsResourceAnimatingIn(false)
animatingInTimerRef.current = null
}, 400)
}, [])
const expandResource = useCallback(() => {
setIsResourceCollapsed(false)
setIsResourceAnimatingIn(true)
}, [])
startAnimatingIn()
}, [startAnimatingIn])
const handleResourceEvent = useCallback(() => {
if (isResourceCollapsedRef.current) {
const { isCollapsed, toggleCollapsed } = useSidebarStore.getState()
if (!isCollapsed) toggleCollapsed()
setIsResourceCollapsed(false)
setIsResourceAnimatingIn(true)
startAnimatingIn()
}
}, [])
}, [startAnimatingIn])
const {
messages,
@@ -177,8 +193,15 @@ export function Home({ chatId }: HomeProps = {}) {
} = useChat(workspaceId, chatId, { onResourceEvent: handleResourceEvent })
const [editingInputValue, setEditingInputValue] = useState('')
const [prevChatId, setPrevChatId] = useState(chatId)
const clearEditingValue = useCallback(() => setEditingInputValue(''), [])
// Clear editing value when navigating to a different chat (guarded render-phase update)
if (chatId !== prevChatId) {
setPrevChatId(chatId)
setEditingInputValue('')
}
const handleEditQueuedMessage = useCallback(
(id: string) => {
const msg = editQueuedMessage(id)
@@ -189,10 +212,6 @@ export function Home({ chatId }: HomeProps = {}) {
[editQueuedMessage]
)
useEffect(() => {
setEditingInputValue('')
}, [chatId])
useEffect(() => {
wasSendingRef.current = false
if (resolvedChatId) markRead(resolvedChatId)
@@ -206,23 +225,12 @@ export function Home({ chatId }: HomeProps = {}) {
}, [isSending, resolvedChatId, markRead])
useEffect(() => {
if (!isResourceAnimatingIn) return
const timer = setTimeout(() => setIsResourceAnimatingIn(false), 400)
return () => clearTimeout(timer)
}, [isResourceAnimatingIn])
useEffect(() => {
if (resources.length > 0 && isResourceCollapsedRef.current) {
setSkipResourceTransition(true)
setIsResourceCollapsed(false)
}
}, [resources])
useEffect(() => {
if (!skipResourceTransition) return
if (!(resources.length > 0 && isResourceCollapsedRef.current)) return
setIsResourceCollapsed(false)
setSkipResourceTransition(true)
const id = requestAnimationFrame(() => setSkipResourceTransition(false))
return () => cancelAnimationFrame(id)
}, [skipResourceTransition])
}, [resources])
const handleSubmit = useCallback(
(text: string, fileAttachments?: FileAttachmentForApi[], contexts?: ChatContext[]) => {
@@ -358,7 +366,7 @@ export function Home({ chatId }: HomeProps = {}) {
return (
<div className='relative flex h-full bg-[var(--bg)]'>
<div className='flex h-full min-w-0 flex-1 flex-col'>
<div className='flex h-full min-w-[320px] flex-1 flex-col'>
<div
ref={scrollContainerRef}
className='min-h-0 flex-1 overflow-y-auto overflow-x-hidden px-6 pt-4 pb-8 [scrollbar-gutter:stable]'
@@ -414,7 +422,12 @@ export function Home({ chatId }: HomeProps = {}) {
const isLastMessage = index === messages.length - 1
return (
<div key={msg.id} className='pb-4'>
<div key={msg.id} className='group/msg relative pb-5'>
{!isThisStreaming && (msg.content || msg.contentBlocks?.length) && (
<div className='absolute right-0 bottom-0 z-10'>
<MessageActions content={msg.content} requestId={msg.requestId} />
</div>
)}
<MessageContent
blocks={msg.contentBlocks || []}
fallbackContent={msg.content}
@@ -452,7 +465,21 @@ export function Home({ chatId }: HomeProps = {}) {
</div>
</div>
{/* Resize handle — zero-width flex child whose absolute child straddles the border */}
{!isResourceCollapsed && (
<div className='relative z-20 w-0 flex-none'>
<div
className='absolute inset-y-0 left-[-4px] w-[8px] cursor-ew-resize'
role='separator'
aria-orientation='vertical'
aria-label='Resize resource panel'
onPointerDown={handleResizePointerDown}
/>
</div>
)}
<MothershipView
ref={mothershipRef}
workspaceId={workspaceId}
chatId={resolvedChatId}
resources={resources}

View File

@@ -2,4 +2,5 @@ export { useAnimatedPlaceholder } from './use-animated-placeholder'
export { useAutoScroll } from './use-auto-scroll'
export type { UseChatReturn } from './use-chat'
export { useChat } from './use-chat'
export { useMothershipResize } from './use-mothership-resize'
export { useStreamingReveal } from './use-streaming-reveal'

View File

@@ -1,4 +1,4 @@
import { useCallback, useEffect, useLayoutEffect, useRef, useState } from 'react'
import { useCallback, useEffect, useMemo, useRef, useState } from 'react'
import { createLogger } from '@sim/logger'
import { useQueryClient } from '@tanstack/react-query'
import { usePathname } from 'next/navigation'
@@ -132,7 +132,7 @@ function toDisplayAttachment(f: TaskStoredFileAttachment): ChatMessageAttachment
media_type: f.media_type,
size: f.size,
previewUrl: f.media_type.startsWith('image/')
? `/api/files/serve/${encodeURIComponent(f.key)}?context=copilot`
? `/api/files/serve/${encodeURIComponent(f.key)}?context=mothership`
: undefined,
}
}
@@ -142,6 +142,7 @@ function mapStoredMessage(msg: TaskStoredMessage): ChatMessage {
id: msg.id,
role: msg.role,
content: msg.content,
...(msg.requestId ? { requestId: msg.requestId } : {}),
}
const hasContentBlocks = Array.isArray(msg.contentBlocks) && msg.contentBlocks.length > 0
@@ -268,14 +269,22 @@ export function useChat(
onResourceEventRef.current = options?.onResourceEvent
const resourcesRef = useRef(resources)
resourcesRef.current = resources
const activeResourceIdRef = useRef(activeResourceId)
activeResourceIdRef.current = activeResourceId
// Derive the effective active resource ID — auto-selects the last resource when the stored ID is
// absent or no longer in the list, avoiding a separate Effect-based state correction loop.
const effectiveActiveResourceId = useMemo(() => {
if (resources.length === 0) return null
if (activeResourceId && resources.some((r) => r.id === activeResourceId))
return activeResourceId
return resources[resources.length - 1].id
}, [resources, activeResourceId])
const activeResourceIdRef = useRef(effectiveActiveResourceId)
activeResourceIdRef.current = effectiveActiveResourceId
const [messageQueue, setMessageQueue] = useState<QueuedMessage[]>([])
const messageQueueRef = useRef<QueuedMessage[]>([])
useEffect(() => {
messageQueueRef.current = messageQueue
}, [messageQueue])
messageQueueRef.current = messageQueue
const sendMessageRef = useRef<UseChatReturn['sendMessage']>(async () => {})
const processSSEStreamRef = useRef<
@@ -481,19 +490,6 @@ export function useChat(
}
}, [chatHistory, workspaceId, queryClient])
useEffect(() => {
if (resources.length === 0) {
if (activeResourceId !== null) {
setActiveResourceId(null)
}
return
}
if (!activeResourceId || !resources.some((resource) => resource.id === activeResourceId)) {
setActiveResourceId(resources[resources.length - 1].id)
}
}, [activeResourceId, resources])
const processSSEStream = useCallback(
async (
reader: ReadableStreamDefaultReader<Uint8Array>,
@@ -509,6 +505,7 @@ export function useChat(
let activeSubagent: string | undefined
let runningText = ''
let lastContentSource: 'main' | 'subagent' | null = null
let streamRequestId: string | undefined
streamingContentRef.current = ''
streamingBlocksRef.current = []
@@ -526,14 +523,21 @@ export function useChat(
const flush = () => {
if (isStale()) return
streamingBlocksRef.current = [...blocks]
const snapshot = { content: runningText, contentBlocks: [...blocks] }
const snapshot: Partial<ChatMessage> = {
content: runningText,
contentBlocks: [...blocks],
}
if (streamRequestId) snapshot.requestId = streamRequestId
setMessages((prev) => {
if (expectedGen !== undefined && streamGenRef.current !== expectedGen) return prev
const idx = prev.findIndex((m) => m.id === assistantId)
if (idx >= 0) {
return prev.map((m) => (m.id === assistantId ? { ...m, ...snapshot } : m))
}
return [...prev, { id: assistantId, role: 'assistant' as const, ...snapshot }]
return [
...prev,
{ id: assistantId, role: 'assistant' as const, content: '', ...snapshot },
]
})
}
@@ -597,6 +601,14 @@ export function useChat(
}
break
}
case 'request_id': {
const rid = typeof parsed.data === 'string' ? parsed.data : undefined
if (rid) {
streamRequestId = rid
flush()
}
break
}
case 'content': {
const chunk = typeof parsed.data === 'string' ? parsed.data : (parsed.content ?? '')
if (chunk) {
@@ -854,9 +866,7 @@ export function useChat(
},
[workspaceId, queryClient, addResource, removeResource]
)
useLayoutEffect(() => {
processSSEStreamRef.current = processSSEStream
})
processSSEStreamRef.current = processSSEStream
const persistPartialResponse = useCallback(async () => {
const chatId = chatIdRef.current
@@ -945,9 +955,7 @@ export function useChat(
},
[invalidateChatQueries]
)
useLayoutEffect(() => {
finalizeRef.current = finalize
})
finalizeRef.current = finalize
const sendMessage = useCallback(
async (message: string, fileAttachments?: FileAttachmentForApi[], contexts?: ChatContext[]) => {
@@ -1083,9 +1091,7 @@ export function useChat(
},
[workspaceId, queryClient, processSSEStream, finalize]
)
useLayoutEffect(() => {
sendMessageRef.current = sendMessage
})
sendMessageRef.current = sendMessage
const stopGeneration = useCallback(async () => {
if (sendingRef.current && !chatIdRef.current) {
@@ -1223,7 +1229,7 @@ export function useChat(
sendMessage,
stopGeneration,
resources,
activeResourceId,
activeResourceId: effectiveActiveResourceId,
setActiveResourceId,
addResource,
removeResource,
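
The `effectiveActiveResourceId` memo introduced in this diff replaces an Effect-based state-correction loop with a pure derivation. The selection rule it encodes can be lifted into a standalone function:

```typescript
// Framework-free version of the effectiveActiveResourceId memo above: prefer
// the stored id while it still exists in the list, otherwise fall back to the
// newest resource, or null when the list is empty.
interface Resource {
  id: string
}

function selectActiveId(resources: Resource[], storedId: string | null): string | null {
  if (resources.length === 0) return null
  if (storedId && resources.some((r) => r.id === storedId)) return storedId
  return resources[resources.length - 1].id
}
```

Because the value is derived on every render, removing the active resource immediately selects the last remaining one, with no intermediate render where `activeResourceId` points at a deleted entry.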

View File

@@ -0,0 +1,101 @@
import { useCallback, useEffect, useRef } from 'react'
import { MOTHERSHIP_WIDTH } from '@/stores/constants'
/**
* Hook for managing resize of the MothershipView resource panel.
*
* Uses imperative DOM manipulation (zero React re-renders during drag) with
* Pointer Events + setPointerCapture for unified mouse/touch/stylus support.
* Attach `mothershipRef` to the MothershipView root div and bind
* `handleResizePointerDown` to the drag handle's onPointerDown.
* Call `clearWidth` when the panel collapses so the CSS class retakes control.
*/
export function useMothershipResize() {
const mothershipRef = useRef<HTMLDivElement | null>(null)
// Stored so the useEffect cleanup can tear down listeners if the component unmounts mid-drag
const cleanupRef = useRef<(() => void) | null>(null)
const handleResizePointerDown = useCallback((e: React.PointerEvent) => {
e.preventDefault()
const el = mothershipRef.current
if (!el) return
const handle = e.currentTarget as HTMLElement
handle.setPointerCapture(e.pointerId)
// Pin to current rendered width so drag starts from the visual position
el.style.width = `${el.getBoundingClientRect().width}px`
// Disable CSS transition to prevent animation lag during drag
const prevTransition = el.style.transition
el.style.transition = 'none'
document.body.style.cursor = 'ew-resize'
document.body.style.userSelect = 'none'
// AbortController removes all listeners at once on cleanup/cancel/unmount
const ac = new AbortController()
const { signal } = ac
const cleanup = () => {
ac.abort()
el.style.transition = prevTransition
document.body.style.cursor = ''
document.body.style.userSelect = ''
cleanupRef.current = null
}
cleanupRef.current = cleanup
handle.addEventListener(
'pointermove',
(moveEvent: PointerEvent) => {
const newWidth = window.innerWidth - moveEvent.clientX
const maxWidth = window.innerWidth * MOTHERSHIP_WIDTH.MAX_PERCENTAGE
el.style.width = `${Math.min(Math.max(newWidth, MOTHERSHIP_WIDTH.MIN), maxWidth)}px`
},
{ signal }
)
handle.addEventListener(
'pointerup',
(upEvent: PointerEvent) => {
handle.releasePointerCapture(upEvent.pointerId)
cleanup()
},
{ signal }
)
// Browser fires pointercancel when it reclaims the gesture (scroll, palm rejection, etc.)
// Without this, body cursor/userSelect and transition would be permanently stuck
handle.addEventListener('pointercancel', cleanup, { signal })
}, [])
// Tear down any active drag if the component unmounts mid-drag
useEffect(() => {
return () => {
cleanupRef.current?.()
}
}, [])
// Re-clamp panel width when the viewport is resized (inline px width can exceed max after narrowing)
useEffect(() => {
const handleWindowResize = () => {
const el = mothershipRef.current
if (!el || !el.style.width) return
const maxWidth = window.innerWidth * MOTHERSHIP_WIDTH.MAX_PERCENTAGE
const current = el.getBoundingClientRect().width
if (current > maxWidth) {
el.style.width = `${maxWidth}px`
}
}
window.addEventListener('resize', handleWindowResize)
return () => window.removeEventListener('resize', handleWindowResize)
}, [])
/** Remove inline width so the collapse CSS class retakes control */
const clearWidth = useCallback(() => {
mothershipRef.current?.style.removeProperty('width')
}, [])
return { mothershipRef, handleResizePointerDown, clearWidth }
}
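
The pointermove handler above clamps the panel between a fixed minimum and a viewport-relative maximum. The arithmetic isolated, with placeholder constants (the real values of `MOTHERSHIP_WIDTH` live in `@/stores/constants` and are an assumption here):

```typescript
// Width-clamping logic from the drag handler above. The MOTHERSHIP_WIDTH
// values are assumptions for illustration; the real constants are imported
// from @/stores/constants.
const MOTHERSHIP_WIDTH = { MIN: 320, MAX_PERCENTAGE: 0.7 }

function clampPanelWidth(pointerX: number, viewportWidth: number): number {
  // The panel is right-anchored, so its width is the distance from the
  // pointer to the right edge of the viewport.
  const requested = viewportWidth - pointerX
  const max = viewportWidth * MOTHERSHIP_WIDTH.MAX_PERCENTAGE
  return Math.min(Math.max(requested, MOTHERSHIP_WIDTH.MIN), max)
}
```

The separate window-resize effect in the hook exists because this clamp only runs during a drag: after the viewport narrows, a previously valid inline pixel width can exceed the new maximum and must be re-clamped once.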

View File

@@ -33,6 +33,7 @@ export interface QueuedMessage {
*/
export type SSEEventType =
| 'chat_id'
| 'request_id'
| 'title_updated'
| 'content'
| 'reasoning' // openai reasoning - render as thinking text
@@ -199,6 +200,7 @@ export interface ChatMessage {
contentBlocks?: ContentBlock[]
attachments?: ChatMessageAttachment[]
contexts?: ChatMessageContext[]
requestId?: string
}
export const SUBAGENT_LABELS: Record<SubagentName, string> = {

View File

@@ -169,16 +169,13 @@ export function ChunkEditor({
const saveFunction = isCreateMode ? handleSave : saveImmediately
useEffect(() => {
if (saveRef) {
saveRef.current = saveFunction
}
return () => {
if (saveRef) {
saveRef.current = null
}
}
}, [saveRef, saveFunction])
if (saveRef) saveRef.current = saveFunction
useEffect(
() => () => {
if (saveRef) saveRef.current = null
},
[saveRef]
)
const tokenStrings = useMemo(() => {
if (!tokenizerOn || !editedContent) return []

View File

@@ -274,9 +274,7 @@ export function KnowledgeBase({
const { data: connectors = [], isLoading: isLoadingConnectors } = useConnectorList(id)
const hasSyncingConnectors = connectors.some((c) => c.status === 'syncing')
const hasSyncingConnectorsRef = useRef(hasSyncingConnectors)
useEffect(() => {
hasSyncingConnectorsRef.current = hasSyncingConnectors
}, [hasSyncingConnectors])
hasSyncingConnectorsRef.current = hasSyncingConnectors
const {
documents,
@@ -752,11 +750,9 @@ export function KnowledgeBase({
const prevKnowledgeBaseIdRef = useRef<string>(id)
const isNavigatingToNewKB = prevKnowledgeBaseIdRef.current !== id
useEffect(() => {
if (knowledgeBase && knowledgeBase.id === id) {
prevKnowledgeBaseIdRef.current = id
}
}, [knowledgeBase, id])
if (knowledgeBase && knowledgeBase.id === id) {
prevKnowledgeBaseIdRef.current = id
}
const isInitialLoad = isLoadingKnowledgeBase && !knowledgeBase
const isFetchingNewKB = isNavigatingToNewKB && isFetchingDocuments

View File

@@ -220,10 +220,7 @@ function DashboardInner({ stats, isLoading, error }: DashboardProps) {
return result
}, [rawExecutions])
useEffect(() => {
prevExecutionsRef.current = executions
}, [executions])
prevExecutionsRef.current = executions
const lastExecutionByWorkflow = useMemo(() => {
const map = new Map<string, number>()

View File

@@ -31,7 +31,8 @@ export function Admin() {
const [workflowId, setWorkflowId] = useState('')
const [usersOffset, setUsersOffset] = useState(0)
const [usersEnabled, setUsersEnabled] = useState(false)
const [searchInput, setSearchInput] = useState('')
const [searchQuery, setSearchQuery] = useState('')
const [banUserId, setBanUserId] = useState<string | null>(null)
const [banReason, setBanReason] = useState('')
@@ -39,8 +40,12 @@ export function Admin() {
data: usersData,
isLoading: usersLoading,
error: usersError,
refetch: refetchUsers,
} = useAdminUsers(usersOffset, PAGE_SIZE, usersEnabled)
} = useAdminUsers(usersOffset, PAGE_SIZE, searchQuery)
const handleSearch = () => {
setUsersOffset(0)
setSearchQuery(searchInput.trim())
}
const totalPages = useMemo(
() => Math.ceil((usersData?.total ?? 0) / PAGE_SIZE),
@@ -62,14 +67,6 @@ export function Admin() {
)
}
const handleLoadUsers = () => {
if (usersEnabled) {
refetchUsers()
} else {
setUsersEnabled(true)
}
}
const pendingUserIds = useMemo(() => {
const ids = new Set<string>()
if (setUserRole.isPending && (setUserRole.variables as { userId?: string })?.userId)
@@ -136,10 +133,16 @@ export function Admin() {
<div className='h-px bg-[var(--border-secondary)]' />
<div className='flex flex-col gap-[12px]'>
<div className='flex items-center justify-between'>
<p className='font-medium text-[14px] text-[var(--text-primary)]'>User Management</p>
<Button variant='active' onClick={handleLoadUsers} disabled={usersLoading}>
{usersLoading ? 'Loading...' : usersEnabled ? 'Refresh' : 'Load Users'}
<p className='font-medium text-[14px] text-[var(--text-primary)]'>User Management</p>
<div className='flex gap-[8px]'>
<EmcnInput
value={searchInput}
onChange={(e) => setSearchInput(e.target.value)}
onKeyDown={(e) => e.key === 'Enter' && handleSearch()}
placeholder='Search by email or paste a user ID...'
/>
<Button variant='primary' onClick={handleSearch} disabled={usersLoading}>
{usersLoading ? 'Searching...' : 'Search'}
</Button>
</div>
@@ -164,9 +167,9 @@ export function Admin() {
</div>
)}
{usersData && (
{searchQuery.length > 0 && usersData && (
<>
<div className='flex flex-col gap-[2px] rounded-[8px] border border-[var(--border-secondary)]'>
<div className='flex flex-col gap-[2px]'>
<div className='flex items-center gap-[12px] border-[var(--border-secondary)] border-b px-[12px] py-[8px] text-[12px] text-[var(--text-tertiary)]'>
<span className='w-[200px]'>Name</span>
<span className='flex-1'>Email</span>
@@ -176,7 +179,7 @@ export function Admin() {
</div>
{usersData.users.length === 0 && (
<div className='px-[12px] py-[16px] text-center text-[13px] text-[var(--text-tertiary)]'>
<div className='py-[16px] text-center text-[13px] text-[var(--text-tertiary)]'>
No users found.
</div>
)}

View File
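The Admin rewrite above replaces the load/refresh button with a search flow: submitting trims the input and resets the offset to the first page, and the page count derives from `Math.ceil(total / PAGE_SIZE)`. That math can be sketched as pure helpers; the `PAGE_SIZE` value here is an assumption for illustration:

```typescript
const PAGE_SIZE = 25 // assumed; the real constant lives in the Admin module

// Mirrors the useMemo in the hunk: missing totals count as zero rows.
function totalPages(total: number | undefined): number {
  return Math.ceil((total ?? 0) / PAGE_SIZE)
}

// Submitting a search always restarts from page one with a trimmed query.
function submitSearch(rawInput: string): { offset: number; query: string } {
  return { offset: 0, query: rawInput.trim() }
}
```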

@@ -1,6 +1,6 @@
'use client'
import { useEffect, useMemo, useRef, useState } from 'react'
import { useCallback, useMemo, useRef, useState } from 'react'
import { createLogger } from '@sim/logger'
import { Info, Plus, Search } from 'lucide-react'
import { useParams } from 'next/navigation'
@@ -64,13 +64,19 @@ export function ApiKeys() {
const [deleteKey, setDeleteKey] = useState<ApiKey | null>(null)
const [showDeleteDialog, setShowDeleteDialog] = useState(false)
const [searchTerm, setSearchTerm] = useState('')
const [shouldScrollToBottom, setShouldScrollToBottom] = useState(false)
const defaultKeyType = allowPersonalApiKeys ? 'personal' : 'workspace'
const createButtonDisabled = isLoading || (!allowPersonalApiKeys && !canManageWorkspaceKeys)
const scrollContainerRef = useRef<HTMLDivElement>(null)
const scrollToBottom = useCallback(() => {
scrollContainerRef.current?.scrollTo({
top: scrollContainerRef.current.scrollHeight,
behavior: 'smooth',
})
}, [])
const filteredWorkspaceKeys = useMemo(() => {
if (!searchTerm.trim()) {
return workspaceKeys.map((key, index) => ({ key, originalIndex: index }))
@@ -111,16 +117,6 @@ export function ApiKeys() {
}
}
useEffect(() => {
if (shouldScrollToBottom && scrollContainerRef.current) {
scrollContainerRef.current.scrollTo({
top: scrollContainerRef.current.scrollHeight,
behavior: 'smooth',
})
setShouldScrollToBottom(false)
}
}, [shouldScrollToBottom])
const formatLastUsed = (dateString?: string) => {
if (!dateString) return 'Never'
return formatDate(new Date(dateString))

View File
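The ApiKeys hunk above pairs each key with its `originalIndex` before filtering, so handlers on the filtered list can still address the unfiltered array. A self-contained sketch of that shape; matching on a `name` field is an assumption for illustration:

```typescript
interface ApiKeyLike {
  name: string
}

interface IndexedKey {
  key: ApiKeyLike
  originalIndex: number
}

// Index first, then filter, so originalIndex survives the filter.
function filterKeys(keys: ApiKeyLike[], searchTerm: string): IndexedKey[] {
  const indexed = keys.map((key, index) => ({ key, originalIndex: index }))
  const term = searchTerm.trim().toLowerCase()
  if (!term) return indexed
  return indexed.filter(({ key }) => key.name.toLowerCase().includes(term))
}
```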

@@ -316,6 +316,9 @@ export function CredentialsManager() {
// --- Detail view state ---
const [selectedCredentialId, setSelectedCredentialId] = useState<string | null>(null)
const [prevSelectedCredentialId, setPrevSelectedCredentialId] = useState<
string | null | undefined
>(undefined)
const [selectedDisplayNameDraft, setSelectedDisplayNameDraft] = useState('')
const [selectedDescriptionDraft, setSelectedDescriptionDraft] = useState('')
const [copyIdSuccess, setCopyIdSuccess] = useState(false)
@@ -347,6 +350,19 @@ export function CredentialsManager() {
[envCredentials, selectedCredentialId]
)
if (selectedCredential?.id !== prevSelectedCredentialId) {
setPrevSelectedCredentialId(selectedCredential?.id ?? null)
if (!selectedCredential) {
setSelectedDescriptionDraft('')
setSelectedDisplayNameDraft('')
setDetailsError(null)
} else {
setDetailsError(null)
setSelectedDescriptionDraft(selectedCredential.description || '')
setSelectedDisplayNameDraft(selectedCredential.displayName)
}
}
// --- Detail view hooks ---
const { data: members = [], isPending: membersLoading } = useWorkspaceCredentialMembers(
selectedCredential?.id
@@ -458,12 +474,10 @@ export function CredentialsManager() {
return personalInvalid || workspaceInvalid
}, [envVars, newWorkspaceRows])
// --- Effects ---
useEffect(() => {
hasChangesRef.current = hasChanges
shouldBlockNavRef.current = hasChanges || isDetailsDirty
}, [hasChanges, isDetailsDirty])
hasChangesRef.current = hasChanges
shouldBlockNavRef.current = hasChanges || isDetailsDirty
// --- Effects ---
useEffect(() => {
if (hasSavedRef.current) return
@@ -549,19 +563,6 @@ export function CredentialsManager() {
}
}, [])
// --- Detail view: sync drafts when credential changes ---
useEffect(() => {
if (!selectedCredential) {
setSelectedDescriptionDraft('')
setSelectedDisplayNameDraft('')
return
}
setDetailsError(null)
setSelectedDescriptionDraft(selectedCredential.description || '')
setSelectedDisplayNameDraft(selectedCredential.displayName)
}, [selectedCredential])
// --- Pending credential create request ---
const applyPendingCredentialCreateRequest = useCallback(
(request: PendingCredentialCreateRequest) => {

View File
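Several hunks in this release (CredentialsManager above, and General, PlanModeSection, and TodoList below) replace a "sync on prop change" effect with a comparison done during render, in the style of React's "adjusting state when a prop changes" guidance. The decision logic is pure and can be sketched without React; the types are illustrative stand-ins:

```typescript
interface Credential {
  id: string
  displayName: string
  description?: string
}

interface DraftState {
  prevId: string | null | undefined // undefined = never synced yet
  displayName: string
  description: string
}

// Mirrors the render-time block above: drafts are reseeded only when the
// selected credential's id differs from the remembered one, so in-flight
// edits survive unrelated re-renders.
function syncDrafts(selected: Credential | null, state: DraftState): DraftState {
  if ((selected?.id ?? null) === state.prevId) return state
  return {
    prevId: selected?.id ?? null,
    displayName: selected?.displayName ?? '',
    description: selected?.description ?? '',
  }
}
```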

@@ -68,6 +68,12 @@ export function General() {
const [name, setName] = useState(profile?.name || '')
const [isEditingName, setIsEditingName] = useState(false)
const inputRef = useRef<HTMLInputElement>(null)
const [prevProfileName, setPrevProfileName] = useState(profile?.name)
if (profile?.name && profile.name !== prevProfileName) {
setPrevProfileName(profile.name)
setName(profile.name)
}
const [showResetPasswordModal, setShowResetPasswordModal] = useState(false)
const resetPassword = useResetPassword()
@@ -76,12 +82,6 @@ export function General() {
const snapToGridValue = settings?.snapToGridSize ?? 0
useEffect(() => {
if (profile?.name) {
setName(profile.name)
}
}, [profile?.name])
const {
previewUrl: profilePictureUrl,
fileInputRef: profilePictureInputRef,

View File

@@ -3,6 +3,7 @@
import { type FC, memo, useCallback, useMemo, useRef, useState } from 'react'
import { RotateCcw } from 'lucide-react'
import { Button } from '@/components/emcn'
import { MessageActions } from '@/app/workspace/[workspaceId]/components'
import {
OptionsSelector,
parseSpecialTags,
@@ -409,10 +410,15 @@ const CopilotMessage: FC<CopilotMessageProps> = memo(
if (isAssistant) {
return (
<div
className={`w-full max-w-full flex-none overflow-hidden [max-width:var(--panel-max-width)] ${isDimmed ? 'opacity-40' : 'opacity-100'}`}
className={`group/msg relative w-full max-w-full flex-none overflow-hidden [max-width:var(--panel-max-width)] ${isDimmed ? 'opacity-40' : 'opacity-100'}`}
style={{ '--panel-max-width': `${panelWidth - 16}px` } as React.CSSProperties}
>
<div className='max-w-full space-y-[4px] px-[2px] pb-[4px]'>
{!isStreaming && (message.content || message.contentBlocks?.length) && (
<div className='absolute right-0 bottom-0 z-10'>
<MessageActions content={message.content} requestId={message.requestId} />
</div>
)}
<div className='max-w-full space-y-[4px] px-[2px] pb-5'>
{/* Content blocks in chronological order */}
{memoizedContentBlocks || (isStreaming && <div className='min-h-0' />)}

View File

@@ -97,16 +97,14 @@ const PlanModeSection: React.FC<PlanModeSectionProps> = ({
const [isResizing, setIsResizing] = React.useState(false)
const [isEditing, setIsEditing] = React.useState(false)
const [editedContent, setEditedContent] = React.useState(content)
const [prevContent, setPrevContent] = React.useState(content)
if (!isEditing && content !== prevContent) {
setPrevContent(content)
setEditedContent(content)
}
const resizeStartRef = React.useRef({ y: 0, startHeight: 0 })
const textareaRef = React.useRef<HTMLTextAreaElement>(null)
// Update edited content when content prop changes
React.useEffect(() => {
if (!isEditing) {
setEditedContent(content)
}
}, [content, isEditing])
const handleResizeStart = React.useCallback(
(e: React.MouseEvent) => {
e.preventDefault()

View File

@@ -1,6 +1,6 @@
'use client'
import { memo, useEffect, useState } from 'react'
import { memo, useState } from 'react'
import { Check, ChevronDown, ChevronRight, Loader2, X } from 'lucide-react'
import { Button } from '@/components/emcn'
import { cn } from '@/lib/core/utils/cn'
@@ -47,13 +47,11 @@ export const TodoList = memo(function TodoList({
className,
}: TodoListProps) {
const [isCollapsed, setIsCollapsed] = useState(collapsed)
/**
* Sync collapsed prop with internal state
*/
useEffect(() => {
const [prevCollapsed, setPrevCollapsed] = useState(collapsed)
if (collapsed !== prevCollapsed) {
setPrevCollapsed(collapsed)
setIsCollapsed(collapsed)
}, [collapsed])
}
if (!todos || todos.length === 0) {
return null

View File

@@ -1,4 +1,4 @@
import { useCallback, useEffect, useRef, useState } from 'react'
import { useCallback, useEffect, useState } from 'react'
import {
escapeRegex,
filterOutContext,
@@ -22,15 +22,6 @@ interface UseContextManagementProps {
*/
export function useContextManagement({ message, initialContexts }: UseContextManagementProps) {
const [selectedContexts, setSelectedContexts] = useState<ChatContext[]>(initialContexts ?? [])
const initializedRef = useRef(false)
// Initialize with initial contexts when they're first provided (for edit mode)
useEffect(() => {
if (initialContexts && initialContexts.length > 0 && !initializedRef.current) {
setSelectedContexts(initialContexts)
initializedRef.current = true
}
}, [initialContexts])
/**
* Adds a context to the selected contexts list, avoiding duplicates

View File

@@ -1,6 +1,6 @@
'use client'
import { useCallback, useEffect, useMemo, useState } from 'react'
import { useCallback, useMemo, useState } from 'react'
import { createLogger } from '@sim/logger'
import {
Button,
@@ -49,7 +49,10 @@ export function GeneralDeploy({
onLoadDeploymentComplete,
}: GeneralDeployProps) {
const [selectedVersion, setSelectedVersion] = useState<number | null>(null)
const [previewMode, setPreviewMode] = useState<PreviewMode>('active')
const [showActiveDespiteSelection, setShowActiveDespiteSelection] = useState(false)
// Derived — no useEffect needed
const previewMode: PreviewMode =
selectedVersion !== null && !showActiveDespiteSelection ? 'selected' : 'active'
const [showLoadDialog, setShowLoadDialog] = useState(false)
const [showPromoteDialog, setShowPromoteDialog] = useState(false)
const [showExpandedPreview, setShowExpandedPreview] = useState(false)
@@ -64,16 +67,9 @@ export function GeneralDeploy({
const revertMutation = useRevertToVersion()
useEffect(() => {
if (selectedVersion !== null) {
setPreviewMode('selected')
} else {
setPreviewMode('active')
}
}, [selectedVersion])
const handleSelectVersion = useCallback((version: number | null) => {
setSelectedVersion(version)
setShowActiveDespiteSelection(false)
}, [])
const handleLoadDeployment = useCallback((version: number) => {
@@ -164,7 +160,9 @@ export function GeneralDeploy({
>
<ButtonGroup
value={previewMode}
onValueChange={(val) => setPreviewMode(val as PreviewMode)}
onValueChange={(val) =>
setShowActiveDespiteSelection((val as PreviewMode) === 'active')
}
>
<ButtonGroupItem value='active'>Live</ButtonGroupItem>
<ButtonGroupItem value='selected' className='truncate'>

View File
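The GeneralDeploy hunk above derives `previewMode` from the two real state values instead of mirroring it in its own state, which is why the sync effect could be deleted. The derivation is a pure function:

```typescript
type PreviewMode = 'active' | 'selected'

// previewMode is fully determined by its two inputs, so there is
// nothing to keep in sync via an effect.
function derivePreviewMode(
  selectedVersion: number | null,
  showActiveDespiteSelection: boolean
): PreviewMode {
  return selectedVersion !== null && !showActiveDespiteSelection ? 'selected' : 'active'
}
```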

@@ -227,12 +227,39 @@ export function DeployModal({
getApiKeyLabel,
])
const selectedStreamingOutputsRef = useRef(selectedStreamingOutputs)
selectedStreamingOutputsRef.current = selectedStreamingOutputs
useEffect(() => {
if (open && workflowId) {
setActiveTab('general')
setDeployError(null)
setDeployWarnings([])
setChatSuccess(false)
const currentOutputs = selectedStreamingOutputsRef.current
if (currentOutputs.length > 0) {
const blocks = Object.values(useWorkflowStore.getState().blocks)
const validOutputs = currentOutputs.filter((outputId) => {
if (startsWithUuid(outputId)) {
const underscoreIndex = outputId.indexOf('_')
if (underscoreIndex === -1) return false
const blockId = outputId.substring(0, underscoreIndex)
return blocks.some((b) => b.id === blockId)
}
const parts = outputId.split('.')
if (parts.length >= 2) {
const blockName = parts[0]
return blocks.some(
(b) => b.name?.toLowerCase().replace(/\s+/g, '') === blockName.toLowerCase()
)
}
return true
})
if (validOutputs.length !== currentOutputs.length) {
setSelectedStreamingOutputs(validOutputs)
}
}
}
return () => {
if (chatSuccessTimeoutRef.current) {
@@ -241,38 +268,6 @@ export function DeployModal({
}
}, [open, workflowId])
useEffect(() => {
if (!open || selectedStreamingOutputs.length === 0) return
const blocks = Object.values(useWorkflowStore.getState().blocks)
const validOutputs = selectedStreamingOutputs.filter((outputId) => {
if (startsWithUuid(outputId)) {
const underscoreIndex = outputId.indexOf('_')
if (underscoreIndex === -1) return false
const blockId = outputId.substring(0, underscoreIndex)
const block = blocks.find((b) => b.id === blockId)
return !!block
}
const parts = outputId.split('.')
if (parts.length >= 2) {
const blockName = parts[0]
const block = blocks.find(
(b) => b.name?.toLowerCase().replace(/\s+/g, '') === blockName.toLowerCase()
)
return !!block
}
return true
})
if (validOutputs.length !== selectedStreamingOutputs.length) {
setSelectedStreamingOutputs(validOutputs)
}
}, [open, selectedStreamingOutputs, setSelectedStreamingOutputs])
useEffect(() => {
const handleOpenDeployModal = (event: Event) => {
const customEvent = event as CustomEvent<{ tab?: TabView }>

View File
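The DeployModal hunk above validates saved streaming-output IDs against the current blocks: UUID-prefixed IDs must resolve to a block by ID, dotted IDs by whitespace-normalized block name, and anything else passes through. A self-contained sketch of that filter, with a simplified stand-in for the app's `startsWithUuid` helper:

```typescript
interface BlockLike {
  id: string
  name?: string
}

// Simplified stand-in for the app's startsWithUuid helper.
const UUID_RE = /^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}/i

function startsWithUuid(id: string): boolean {
  return UUID_RE.test(id)
}

function filterValidOutputs(outputIds: string[], blocks: BlockLike[]): string[] {
  return outputIds.filter((outputId) => {
    if (startsWithUuid(outputId)) {
      // "<blockId>_<path>" form: the block must still exist by ID.
      const underscoreIndex = outputId.indexOf('_')
      if (underscoreIndex === -1) return false
      const blockId = outputId.substring(0, underscoreIndex)
      return blocks.some((b) => b.id === blockId)
    }
    // "<blockname>.<path>" form: match on the normalized block name.
    const parts = outputId.split('.')
    if (parts.length >= 2) {
      const blockName = parts[0]
      return blocks.some(
        (b) => b.name?.toLowerCase().replace(/\s+/g, '') === blockName.toLowerCase()
      )
    }
    return true
  })
}
```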

@@ -120,7 +120,6 @@ export const ComboBox = memo(function ComboBox({
)
// State management
const [storeInitialized, setStoreInitialized] = useState(false)
const [fetchedOptions, setFetchedOptions] = useState<Array<{ label: string; id: string }>>([])
const [isLoadingOptions, setIsLoadingOptions] = useState(false)
const [fetchError, setFetchError] = useState<string | null>(null)
@@ -280,27 +279,22 @@ export const ComboBox = memo(function ComboBox({
}, [value, evaluatedOptions])
const [inputValue, setInputValue] = useState(displayValue)
useEffect(() => {
const [prevDisplayValue, setPrevDisplayValue] = useState(displayValue)
if (displayValue !== prevDisplayValue) {
setPrevDisplayValue(displayValue)
setInputValue(displayValue)
}, [displayValue])
}
// Mark store as initialized on first render
useEffect(() => {
setStoreInitialized(true)
}, [])
// Set default value once store is initialized and permissions are loaded
// Set default value once permissions are loaded
useEffect(() => {
if (isPermissionLoading) return
if (!storeInitialized) return
if (defaultOptionValue === undefined) return
// Only set default when no value exists (initial block add)
if (value === null || value === undefined) {
setStoreValue(defaultOptionValue)
}
}, [storeInitialized, value, defaultOptionValue, setStoreValue, isPermissionLoading])
}, [value, defaultOptionValue, setStoreValue, isPermissionLoading])
// Clear fetched options and hydrated option when dependencies change
useEffect(() => {

View File

@@ -124,7 +124,6 @@ export const Dropdown = memo(function Dropdown({
isEqual
)
const [storeInitialized, setStoreInitialized] = useState(false)
const [fetchedOptions, setFetchedOptions] = useState<Array<{ label: string; id: string }>>([])
const [isLoadingOptions, setIsLoadingOptions] = useState(false)
const [fetchError, setFetchError] = useState<string | null>(null)
@@ -242,17 +241,13 @@ export const Dropdown = memo(function Dropdown({
}, [defaultValue, comboboxOptions, multiSelect])
useEffect(() => {
setStoreInitialized(true)
}, [])
useEffect(() => {
if (multiSelect || !storeInitialized || defaultOptionValue === undefined) {
if (multiSelect || defaultOptionValue === undefined) {
return
}
if (storeValue === null || storeValue === undefined || storeValue === '') {
setStoreValue(defaultOptionValue)
}
}, [storeInitialized, storeValue, defaultOptionValue, setStoreValue, multiSelect])
}, [storeValue, defaultOptionValue, setStoreValue, multiSelect])
/**
* Normalizes variable references in JSON strings by wrapping them in quotes

View File
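Both the ComboBox and Dropdown hunks above drop the `storeInitialized` flag: the default is now applied whenever the stored value is empty and permissions allow it. Dropdown's emptiness guard (which also treats `''` as empty) can be sketched as a pure predicate:

```typescript
// Apply the default only when nothing is stored yet; multi-select
// dropdowns and missing defaults never trigger it.
function shouldApplyDefault(
  storeValue: unknown,
  defaultOptionValue: unknown,
  multiSelect: boolean
): boolean {
  if (multiSelect || defaultOptionValue === undefined) return false
  return storeValue === null || storeValue === undefined || storeValue === ''
}
```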

@@ -122,11 +122,9 @@ export function LongInput({
isStreaming: wandHook.isStreaming,
})
useEffect(() => {
persistSubBlockValueRef.current = (value: string) => {
setSubBlockValue(value)
}
}, [setSubBlockValue])
persistSubBlockValueRef.current = (value: string) => {
setSubBlockValue(value)
}
// Check if wand is actually enabled
const isWandEnabled = config.wandConfig?.enabled ?? false
@@ -193,12 +191,12 @@ export function LongInput({
// Sync local content with base value when not streaming
useEffect(() => {
if (!wandHook.isStreaming) {
const baseValueString = baseValue?.toString() ?? ''
if (baseValueString !== localContent) {
setLocalContent(baseValueString)
}
setLocalContent((prev) => {
const baseValueString = baseValue?.toString() ?? ''
return baseValueString !== prev ? baseValueString : prev
})
}
}, [baseValue, wandHook.isStreaming]) // Removed localContent to prevent infinite loop
}, [baseValue, wandHook.isStreaming])
// Update height when rows prop changes
useLayoutEffect(() => {

View File

@@ -109,11 +109,9 @@ export const ShortInput = memo(function ShortInput({
isStreaming: wandHook.isStreaming,
})
useEffect(() => {
persistSubBlockValueRef.current = (value: string) => {
setSubBlockValue(value)
}
}, [setSubBlockValue])
persistSubBlockValueRef.current = (value: string) => {
setSubBlockValue(value)
}
const isWandEnabled = config.wandConfig?.enabled ?? false
@@ -214,12 +212,12 @@ export const ShortInput = memo(function ShortInput({
useEffect(() => {
if (!wandHook.isStreaming) {
const baseValueString = baseValue?.toString() ?? ''
if (baseValueString !== localContent) {
setLocalContent(baseValueString)
}
setLocalContent((prev) => {
const baseValueString = baseValue?.toString() ?? ''
return baseValueString !== prev ? baseValueString : prev
})
}
}, [baseValue, wandHook.isStreaming, localContent])
}, [baseValue, wandHook.isStreaming])
const handleScroll = useCallback((e: React.UIEvent<HTMLInputElement>) => {
if (overlayRef.current) {

View File
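The LongInput and ShortInput hunks above switch to a functional `setState` updater that returns the previous value unchanged when the base value already matches, which lets `localContent` drop out of the dependency array without risking a loop. The updater itself is pure:

```typescript
// Returns the previous reference when the base value already matches,
// so a state setter receiving this updater can bail out of a re-render.
function syncUpdater(
  baseValue: string | number | null | undefined
): (prev: string) => string {
  return (prev) => {
    const baseValueString = baseValue?.toString() ?? ''
    return baseValueString !== prev ? baseValueString : prev
  }
}
```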

@@ -310,6 +310,14 @@ export const Toolbar = memo(
// Search state
const [isSearchActive, setIsSearchActive] = useState(false)
const [searchQuery, setSearchQuery] = useState('')
const [prevIsActive, setPrevIsActive] = useState(isActive)
if (isActive !== prevIsActive) {
setPrevIsActive(isActive)
if (!isActive) {
setIsSearchActive(false)
setSearchQuery('')
}
}
// Toggle animation state
const [isToggling, setIsToggling] = useState(false)
@@ -350,14 +358,8 @@ export const Toolbar = memo(
const isTriggersAtMinimum = toolbarTriggersHeight <= TRIGGERS_MIN_THRESHOLD
/**
* Clear search when tab becomes inactive
* Filter items based on search query
*/
useEffect(() => {
if (!isActive) {
setIsSearchActive(false)
setSearchQuery('')
}
}, [isActive])
/**
* Filter items based on search query

View File

@@ -168,7 +168,7 @@ export const SubflowNodeComponent = memo(({ data, id, selected }: NodeProps<Subf
<ActionBar blockId={id} blockType={data.kind} disabled={!userPermissions.canEdit} />
)}
{/* Header Section — only interactive area for dragging */}
{/* Header Section */}
<div
onClick={() => setCurrentBlockId(id)}
className='workflow-drag-handle flex cursor-grab items-center justify-between rounded-t-[8px] border-[var(--border)] border-b bg-[var(--surface-2)] py-[8px] pr-[12px] pl-[8px] [&:active]:cursor-grabbing'
@@ -198,14 +198,15 @@ export const SubflowNodeComponent = memo(({ data, id, selected }: NodeProps<Subf
</div>
{/*
* Subflow body background. Uses pointer-events: none so that edges rendered
* inside the subflow remain clickable. The subflow node wrapper also has
* pointer-events: none (set in workflow.tsx), so body-area clicks pass
* through to the pane. Subflow selection is done via the header above.
* Subflow body background. Captures clicks to select the subflow in the
* panel editor, matching the header click behavior. Child nodes and edges
* are rendered as sibling divs at the viewport level by ReactFlow (not as
* DOM children), so enabling pointer events here doesn't block them.
*/}
<div
className='absolute inset-0 top-[44px] rounded-b-[8px]'
style={{ pointerEvents: 'none' }}
className='workflow-drag-handle absolute inset-0 top-[44px] cursor-grab rounded-b-[8px] [&:active]:cursor-grabbing'
style={{ pointerEvents: isPreview ? 'none' : 'auto' }}
onClick={() => setCurrentBlockId(id)}
/>
{!isPreview && (

View File

@@ -604,11 +604,13 @@ export const Terminal = memo(function Terminal() {
const [autoSelectEnabled, setAutoSelectEnabled] = useState(true)
const [mainOptionsOpen, setMainOptionsOpen] = useState(false)
const [isTrainingEnvEnabled, setIsTrainingEnvEnabled] = useState(false)
const [isTrainingEnvEnabled] = useState(() =>
isTruthy(getEnv('NEXT_PUBLIC_COPILOT_TRAINING_ENABLED'))
)
const showTrainingControls = useShowTrainingControls()
const { isTraining, toggleModal: toggleTrainingModal, stopTraining } = useCopilotTrainingStore()
const [isPlaygroundEnabled, setIsPlaygroundEnabled] = useState(false)
const [isPlaygroundEnabled] = useState(() => isTruthy(getEnv('NEXT_PUBLIC_ENABLE_PLAYGROUND')))
const { handleMouseDown } = useTerminalResize()
const { handleMouseDown: handleOutputPanelResizeMouseDown } = useOutputPanelResize()
@@ -709,21 +711,21 @@ export const Terminal = memo(function Terminal() {
}, [outputData])
// Keep refs in sync for keyboard handler
useEffect(() => {
selectedEntryRef.current = selectedEntry
navigableEntriesRef.current = navigableEntries
showInputRef.current = showInput
hasInputDataRef.current = hasInputData
isExpandedRef.current = isExpanded
}, [selectedEntry, navigableEntries, showInput, hasInputData, isExpanded])
selectedEntryRef.current = selectedEntry
navigableEntriesRef.current = navigableEntries
showInputRef.current = showInput
hasInputDataRef.current = hasInputData
isExpandedRef.current = isExpanded
/**
* Reset entry tracking when switching workflows to ensure auto-open
* works correctly for each workflow independently.
*/
useEffect(() => {
const prevActiveWorkflowIdRef = useRef(activeWorkflowId)
if (prevActiveWorkflowIdRef.current !== activeWorkflowId) {
prevActiveWorkflowIdRef.current = activeWorkflowId
hasInitializedEntriesRef.current = false
}, [activeWorkflowId])
}
/**
* Auto-open the terminal on new entries when "Open on run" is enabled.
@@ -961,11 +963,6 @@ export const Terminal = memo(function Terminal() {
return unsub
}, [])
useEffect(() => {
setIsTrainingEnvEnabled(isTruthy(getEnv('NEXT_PUBLIC_COPILOT_TRAINING_ENABLED')))
setIsPlaygroundEnabled(isTruthy(getEnv('NEXT_PUBLIC_ENABLE_PLAYGROUND')))
}, [])
useEffect(() => {
if (!selectedEntry) {
setShowInput(false)

View File
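The Terminal hunk above reads env flags once via a lazy `useState(() => ...)` initializer instead of a mount effect. A hedged, framework-free sketch of the pattern: `isTruthy` here is a hypothetical stand-in whose accepted spellings are an assumption (the app's real helper may differ), and `once` models the run-once semantics of a lazy initializer:

```typescript
// Hypothetical stand-in for the app's isTruthy env helper.
function isTruthy(value: string | undefined): boolean {
  if (!value) return false
  return ['1', 'true', 'yes', 'on'].includes(value.trim().toLowerCase())
}

// A lazy useState initializer runs once, on first use, not per render;
// a tiny model of that behavior:
function once<T>(init: () => T): () => T {
  let cached: { value: T } | null = null
  return () => {
    if (cached === null) cached = { value: init() }
    return cached.value
  }
}
```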

@@ -29,6 +29,11 @@ export function WandPromptBar({
}: WandPromptBarProps) {
const promptBarRef = useRef<HTMLDivElement>(null)
const [isExiting, setIsExiting] = useState(false)
const [prevIsVisible, setPrevIsVisible] = useState(isVisible)
if (isVisible !== prevIsVisible) {
setPrevIsVisible(isVisible)
if (isVisible) setIsExiting(false)
}
// Handle the fade-out animation
const handleCancel = () => {
@@ -66,13 +71,6 @@ export function WandPromptBar({
}
}, [isVisible, isStreaming, isLoading, isExiting, onCancel])
// Reset the exit state when visibility changes
useEffect(() => {
if (isVisible) {
setIsExiting(false)
}
}, [isVisible])
if (!isVisible && !isStreaming && !isExiting) {
return null
}

View File

@@ -1,6 +1,6 @@
'use client'
import { useCallback, useEffect, useRef, useState } from 'react'
import { useCallback, useEffect, useLayoutEffect, useRef, useState } from 'react'
const AUTO_SCROLL_GRACE_MS = 120
@@ -38,6 +38,13 @@ export function useScrollManagement(
) {
const scrollAreaRef = useRef<HTMLDivElement>(null)
const [userHasScrolledAway, setUserHasScrolledAway] = useState(false)
const [prevIsSendingMessage, setPrevIsSendingMessage] = useState(isSendingMessage)
if (prevIsSendingMessage !== isSendingMessage) {
setPrevIsSendingMessage(isSendingMessage)
if (!isSendingMessage) {
setUserHasScrolledAway(false)
}
}
const programmaticUntilRef = useRef(0)
const lastScrollTopRef = useRef(0)
@@ -138,12 +145,6 @@ export function useScrollManagement(
}
}, [messages, userHasScrolledAway, scrollToBottom])
useEffect(() => {
if (!isSendingMessage) {
setUserHasScrolledAway(false)
}
}, [isSendingMessage])
useEffect(() => {
if (!isSendingMessage || userHasScrolledAway) return
@@ -167,7 +168,7 @@ export function useScrollManagement(
// overflow-anchor: none during streaming prevents the browser from
// fighting our programmatic scrollToBottom calls (Chromium/Firefox only;
// Safari does not support this property).
useEffect(() => {
useLayoutEffect(() => {
const container = scrollAreaRef.current
if (!container) return

View File

@@ -16,8 +16,12 @@ import {
} from '@/lib/workflows/triggers/triggers'
import { useCurrentWorkflow } from '@/app/workspace/[workspaceId]/w/[workflowId]/hooks/use-current-workflow'
import {
addHttpErrorConsoleEntry,
type BlockEventHandlerConfig,
createBlockEventHandlers,
addExecutionErrorConsoleEntry as sharedAddExecutionErrorConsoleEntry,
handleExecutionCancelledConsole as sharedHandleExecutionCancelledConsole,
handleExecutionErrorConsole as sharedHandleExecutionErrorConsole,
} from '@/app/workspace/[workspaceId]/w/[workflowId]/utils/workflow-execution-utils'
import { getBlock } from '@/blocks'
import type { SerializableExecutionState } from '@/executor/execution/types'
@@ -159,99 +163,6 @@ export function useWorkflowExecution() {
setActiveBlocks,
])
/**
* Builds timing fields for execution-level console entries.
*/
const buildExecutionTiming = useCallback((durationMs?: number) => {
const normalizedDuration = durationMs || 0
return {
durationMs: normalizedDuration,
startedAt: new Date(Date.now() - normalizedDuration).toISOString(),
endedAt: new Date().toISOString(),
}
}, [])
/**
* Adds an execution-level error entry to the console when appropriate.
*/
const addExecutionErrorConsoleEntry = useCallback(
(params: {
workflowId?: string
executionId?: string
error?: string
durationMs?: number
blockLogs: BlockLog[]
isPreExecutionError?: boolean
}) => {
if (!params.workflowId) return
const hasBlockError = params.blockLogs.some((log) => log.error)
const isPreExecutionError = params.isPreExecutionError ?? false
if (!isPreExecutionError && hasBlockError) {
return
}
const errorMessage = params.error || 'Execution failed'
const isTimeout = errorMessage.toLowerCase().includes('timed out')
const timing = buildExecutionTiming(params.durationMs)
addConsole({
input: {},
output: {},
success: false,
error: errorMessage,
durationMs: timing.durationMs,
startedAt: timing.startedAt,
executionOrder: isPreExecutionError ? 0 : Number.MAX_SAFE_INTEGER,
endedAt: timing.endedAt,
workflowId: params.workflowId,
blockId: isPreExecutionError
? 'validation'
: isTimeout
? 'timeout-error'
: 'execution-error',
executionId: params.executionId,
blockName: isPreExecutionError
? 'Workflow Validation'
: isTimeout
? 'Timeout Error'
: 'Execution Error',
blockType: isPreExecutionError ? 'validation' : 'error',
})
},
[addConsole, buildExecutionTiming]
)
/**
* Adds an execution-level cancellation entry to the console.
*/
const addExecutionCancelledConsoleEntry = useCallback(
(params: { workflowId?: string; executionId?: string; durationMs?: number }) => {
if (!params.workflowId) return
const timing = buildExecutionTiming(params.durationMs)
addConsole({
input: {},
output: {},
success: false,
error: 'Execution was cancelled',
durationMs: timing.durationMs,
startedAt: timing.startedAt,
executionOrder: Number.MAX_SAFE_INTEGER,
endedAt: timing.endedAt,
workflowId: params.workflowId,
blockId: 'cancelled',
executionId: params.executionId,
blockName: 'Execution Cancelled',
blockType: 'cancelled',
})
},
[addConsole, buildExecutionTiming]
)
/**
* Handles workflow-level execution errors for console output.
*/
const handleExecutionErrorConsole = useCallback(
(params: {
workflowId?: string
@@ -261,25 +172,24 @@ export function useWorkflowExecution() {
blockLogs: BlockLog[]
isPreExecutionError?: boolean
}) => {
if (params.workflowId) {
cancelRunningEntries(params.workflowId)
}
addExecutionErrorConsoleEntry(params)
if (!params.workflowId) return
sharedHandleExecutionErrorConsole(addConsole, cancelRunningEntries, {
...params,
workflowId: params.workflowId,
})
},
[addExecutionErrorConsoleEntry, cancelRunningEntries]
[addConsole, cancelRunningEntries]
)
/**
* Handles workflow-level execution cancellations for console output.
*/
const handleExecutionCancelledConsole = useCallback(
(params: { workflowId?: string; executionId?: string; durationMs?: number }) => {
if (params.workflowId) {
cancelRunningEntries(params.workflowId)
}
addExecutionCancelledConsoleEntry(params)
if (!params.workflowId) return
sharedHandleExecutionCancelledConsole(addConsole, cancelRunningEntries, {
...params,
workflowId: params.workflowId,
})
},
[addExecutionCancelledConsoleEntry, cancelRunningEntries]
[addConsole, cancelRunningEntries]
)
const buildBlockEventHandlers = useCallback(
@@ -1319,31 +1229,42 @@ export function useWorkflowExecution() {
} else {
if (!executor) {
try {
let blockId = 'serialization'
let blockName = 'Workflow'
let blockType = 'serializer'
if (error instanceof WorkflowValidationError) {
blockId = error.blockId || blockId
blockName = error.blockName || blockName
blockType = error.blockType || blockType
}
const httpStatus =
isRecord(error) && typeof error.httpStatus === 'number' ? error.httpStatus : undefined
const storeAddConsole = useTerminalConsoleStore.getState().addConsole
// Use MAX_SAFE_INTEGER so execution errors appear at the end of the log
useTerminalConsoleStore.getState().addConsole({
input: {},
output: {},
success: false,
error: normalizedMessage,
durationMs: 0,
startedAt: new Date().toISOString(),
executionOrder: Number.MAX_SAFE_INTEGER,
endedAt: new Date().toISOString(),
workflowId: activeWorkflowId || '',
blockId,
executionId: options?.executionId,
blockName,
blockType,
})
if (httpStatus && activeWorkflowId) {
addHttpErrorConsoleEntry(storeAddConsole, {
workflowId: activeWorkflowId,
executionId: options?.executionId,
error: normalizedMessage,
httpStatus,
})
} else if (error instanceof WorkflowValidationError) {
storeAddConsole({
input: {},
output: {},
success: false,
error: normalizedMessage,
durationMs: 0,
startedAt: new Date().toISOString(),
executionOrder: Number.MAX_SAFE_INTEGER,
endedAt: new Date().toISOString(),
workflowId: activeWorkflowId || '',
blockId: error.blockId || 'serialization',
executionId: options?.executionId,
blockName: error.blockName || 'Workflow',
blockType: error.blockType || 'serializer',
})
} else {
sharedAddExecutionErrorConsoleEntry(storeAddConsole, {
workflowId: activeWorkflowId || '',
executionId: options?.executionId,
error: normalizedMessage,
blockLogs: [],
isPreExecutionError: true,
})
}
} catch {}
}
@@ -1681,8 +1602,8 @@ export function useWorkflowExecution() {
accumulatedBlockLogs,
accumulatedBlockStates,
executedBlockIds,
consoleMode: 'add',
includeStartConsoleEntry: false,
consoleMode: 'update',
includeStartConsoleEntry: true,
})
await executionStream.executeFromBlock({


@@ -221,3 +221,68 @@ export function resolveParentChildSelectionConflicts(
return hasConflict ? resolved : nodes
}
export function getNodeSelectionContextId(
node: Pick<Node, 'id' | 'parentId'>,
blocks: Record<string, { data?: { parentId?: string } }>
): string | null {
return node.parentId || blocks[node.id]?.data?.parentId || null
}
export function getEdgeSelectionContextId(
edge: Pick<Edge, 'source' | 'target'>,
nodes: Array<Pick<Node, 'id' | 'parentId'>>,
blocks: Record<string, { data?: { parentId?: string } }>
): string | null {
const sourceNode = nodes.find((node) => node.id === edge.source)
const targetNode = nodes.find((node) => node.id === edge.target)
const sourceContextId = sourceNode ? getNodeSelectionContextId(sourceNode, blocks) : null
const targetContextId = targetNode ? getNodeSelectionContextId(targetNode, blocks) : null
if (sourceContextId) return sourceContextId
if (targetContextId) return targetContextId
return null
}
export function resolveSelectionContextConflicts(
nodes: Node[],
blocks: Record<string, { data?: { parentId?: string } }>,
preferredContextId?: string | null
): Node[] {
const selectedNodes = nodes.filter((node) => node.selected)
if (selectedNodes.length <= 1) return nodes
const allowedContextId =
preferredContextId !== undefined
? preferredContextId
: getNodeSelectionContextId(selectedNodes[0], blocks)
let hasConflict = false
const resolved = nodes.map((node) => {
if (!node.selected) return node
const contextId = getNodeSelectionContextId(node, blocks)
if (contextId !== allowedContextId) {
hasConflict = true
return { ...node, selected: false }
}
return node
})
return hasConflict ? resolved : nodes
}
export function resolveSelectionConflicts(
nodes: Node[],
blocks: Record<string, { data?: { parentId?: string } }>,
preferredNodeId?: string
): Node[] {
const afterParentChild = resolveParentChildSelectionConflicts(nodes, blocks)
const preferredContextId =
preferredNodeId !== undefined
? afterParentChild.find((n) => n.id === preferredNodeId && n.selected)
? getNodeSelectionContextId(afterParentChild.find((n) => n.id === preferredNodeId)!, blocks)
: undefined
: undefined
return resolveSelectionContextConflicts(afterParentChild, blocks, preferredContextId)
}
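The selection-context rules above can be exercised in isolation. Below is a minimal sketch using a stripped-down node shape — the real ReactFlow `Node` carries many more fields, and `MiniNode`/`getContextId` are illustrative stand-ins rather than the exported helpers:

```typescript
// A node's selection context is its container: node.parentId, falling back to
// the block registry's parentId, else null (top level).
type MiniNode = { id: string; parentId?: string; selected?: boolean }
type MiniBlocks = Record<string, { data?: { parentId?: string } }>

function getContextId(node: MiniNode, blocks: MiniBlocks): string | null {
  return node.parentId || blocks[node.id]?.data?.parentId || null
}

const blocks: MiniBlocks = { a: {}, b: { data: { parentId: 'loop1' } } }
const nodes: MiniNode[] = [
  { id: 'a', selected: true }, // top-level context (null)
  { id: 'b', selected: true }, // inside subflow 'loop1'
]

// As in resolveSelectionContextConflicts: the first selected node's context
// wins, and selected nodes in any other context are deselected.
const allowed = getContextId(nodes[0], blocks)
const resolved = nodes.map((n) =>
  n.selected && getContextId(n, blocks) !== allowed ? { ...n, selected: false } : n
)
```

Under these rules a marquee selection spanning a subflow and the top level collapses to whichever context the anchor node lives in.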


@@ -13,6 +13,7 @@ import type {
StreamingExecution,
} from '@/executor/types'
import { stripCloneSuffixes } from '@/executor/utils/subflow-utils'
import { processSSEStream } from '@/hooks/use-execution-stream'
const logger = createLogger('workflow-execution-utils')
@@ -406,6 +407,161 @@ export function createBlockEventHandlers(
return { onBlockStarted, onBlockCompleted, onBlockError, onBlockChildWorkflowStarted }
}
type AddConsoleFn = (entry: Omit<ConsoleEntry, 'id' | 'timestamp'>) => ConsoleEntry
type CancelRunningEntriesFn = (workflowId: string) => void
export interface ExecutionTimingFields {
durationMs: number
startedAt: string
endedAt: string
}
/**
* Builds timing fields for an execution-level console entry.
*/
export function buildExecutionTiming(durationMs?: number): ExecutionTimingFields {
const normalizedDuration = durationMs || 0
return {
durationMs: normalizedDuration,
startedAt: new Date(Date.now() - normalizedDuration).toISOString(),
endedAt: new Date().toISOString(),
}
}
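A quick runnable check of the helper above (the function body is reproduced from the diff so the sketch is self-contained): the key property is that `startedAt` is back-dated by the duration, so the two timestamps bracket exactly `durationMs` plus whatever wall time elapses between the two `Date.now()` calls.

```typescript
// Reproduced from buildExecutionTiming above for a self-contained sketch.
function buildExecutionTiming(durationMs?: number) {
  const normalizedDuration = durationMs || 0
  return {
    durationMs: normalizedDuration,
    startedAt: new Date(Date.now() - normalizedDuration).toISOString(),
    endedAt: new Date().toISOString(),
  }
}

const timing = buildExecutionTiming(1500)
// endedAt - startedAt >= 1500 ms (equal when both Date.now() calls land on
// the same millisecond).
const deltaMs = Date.parse(timing.endedAt) - Date.parse(timing.startedAt)
```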
export interface ExecutionErrorConsoleParams {
workflowId: string
executionId?: string
error?: string
durationMs?: number
blockLogs: BlockLog[]
isPreExecutionError?: boolean
}
/**
* Adds an execution-level error entry to the console when no block-level error already covers it.
* Shared between direct user execution and mothership-initiated execution.
*/
export function addExecutionErrorConsoleEntry(
addConsole: AddConsoleFn,
params: ExecutionErrorConsoleParams
): void {
const hasBlockError = params.blockLogs.some((log) => log.error)
const isPreExecutionError = params.isPreExecutionError ?? false
if (!isPreExecutionError && hasBlockError) return
const errorMessage = params.error || 'Execution failed'
const isTimeout = errorMessage.toLowerCase().includes('timed out')
const timing = buildExecutionTiming(params.durationMs)
addConsole({
input: {},
output: {},
success: false,
error: errorMessage,
durationMs: timing.durationMs,
startedAt: timing.startedAt,
executionOrder: isPreExecutionError ? 0 : Number.MAX_SAFE_INTEGER,
endedAt: timing.endedAt,
workflowId: params.workflowId,
blockId: isPreExecutionError ? 'validation' : isTimeout ? 'timeout-error' : 'execution-error',
executionId: params.executionId,
blockName: isPreExecutionError
? 'Workflow Validation'
: isTimeout
? 'Timeout Error'
: 'Execution Error',
blockType: isPreExecutionError ? 'validation' : 'error',
})
}
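The blockId/blockName branching above reduces to a small decision table. A sketch of just that decision, distilled for illustration — `classifyExecutionErrorBlockId` is a hypothetical helper, not a function in the codebase:

```typescript
// Pre-execution errors win over timeouts; otherwise a message containing
// "timed out" (case-insensitive) is classified as a timeout.
function classifyExecutionErrorBlockId(
  errorMessage: string,
  isPreExecutionError: boolean
): 'validation' | 'timeout-error' | 'execution-error' {
  if (isPreExecutionError) return 'validation'
  const isTimeout = errorMessage.toLowerCase().includes('timed out')
  return isTimeout ? 'timeout-error' : 'execution-error'
}
```

The same substring check drives the "Timeout Error" block name in the entry itself.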
/**
* Cancels running entries and adds an execution-level error console entry.
*/
export function handleExecutionErrorConsole(
addConsole: AddConsoleFn,
cancelRunningEntries: CancelRunningEntriesFn,
params: ExecutionErrorConsoleParams
): void {
cancelRunningEntries(params.workflowId)
addExecutionErrorConsoleEntry(addConsole, params)
}
export interface HttpErrorConsoleParams {
workflowId: string
executionId?: string
error: string
httpStatus: number
}
/**
* Adds a console entry for HTTP-level execution errors (non-OK response before SSE streaming).
*/
export function addHttpErrorConsoleEntry(
addConsole: AddConsoleFn,
params: HttpErrorConsoleParams
): void {
const isValidationError = params.httpStatus >= 400 && params.httpStatus < 500
const now = new Date().toISOString()
addConsole({
input: {},
output: {},
success: false,
error: params.error,
durationMs: 0,
startedAt: now,
executionOrder: 0,
endedAt: now,
workflowId: params.workflowId,
blockId: isValidationError ? 'validation' : 'execution-error',
executionId: params.executionId,
blockName: isValidationError ? 'Workflow Validation' : 'Execution Error',
blockType: isValidationError ? 'validation' : 'error',
})
}
export interface CancelledConsoleParams {
workflowId: string
executionId?: string
durationMs?: number
}
/**
* Adds a console entry for execution cancellation.
*/
export function addCancelledConsoleEntry(
addConsole: AddConsoleFn,
params: CancelledConsoleParams
): void {
const timing = buildExecutionTiming(params.durationMs)
addConsole({
input: {},
output: {},
success: false,
error: 'Execution was cancelled',
durationMs: timing.durationMs,
startedAt: timing.startedAt,
executionOrder: Number.MAX_SAFE_INTEGER,
endedAt: timing.endedAt,
workflowId: params.workflowId,
blockId: 'cancelled',
executionId: params.executionId,
blockName: 'Execution Cancelled',
blockType: 'cancelled',
})
}
/**
* Cancels running entries and adds a cancelled console entry.
*/
export function handleExecutionCancelledConsole(
addConsole: AddConsoleFn,
cancelRunningEntries: CancelRunningEntriesFn,
params: CancelledConsoleParams
): void {
cancelRunningEntries(params.workflowId)
addCancelledConsoleEntry(addConsole, params)
}
export interface WorkflowExecutionOptions {
workflowId?: string
workflowInput?: any
@@ -436,7 +592,7 @@ export async function executeWorkflowWithFullLogging(
}
const executionId = options.executionId || uuidv4()
const { addConsole, updateConsole } = useTerminalConsoleStore.getState()
const { addConsole, updateConsole, cancelRunningEntries } = useTerminalConsoleStore.getState()
const { setActiveBlocks, setBlockRunStatus, setEdgeRunStatus, setCurrentExecutionId } =
useExecutionStore.getState()
const wfId = targetWorkflowId
@@ -445,6 +601,7 @@ export async function executeWorkflowWithFullLogging(
const activeBlocksSet = new Set<string>()
const activeBlockRefCounts = new Map<string, number>()
const executionIdRef = { current: executionId }
const accumulatedBlockLogs: BlockLog[] = []
const blockHandlers = createBlockEventHandlers(
{
@@ -453,7 +610,7 @@ export async function executeWorkflowWithFullLogging(
workflowEdges,
activeBlocksSet,
activeBlockRefCounts,
accumulatedBlockLogs: [],
accumulatedBlockLogs,
accumulatedBlockStates: new Map(),
executedBlockIds: new Set(),
consoleMode: 'update',
@@ -490,16 +647,26 @@ export async function executeWorkflowWithFullLogging(
if (!response.ok) {
const error = await response.json()
throw new Error(error.error || 'Workflow execution failed')
const errorMessage = error.error || 'Workflow execution failed'
addHttpErrorConsoleEntry(addConsole, {
workflowId: wfId,
executionId,
error: errorMessage,
httpStatus: response.status,
})
throw new Error(errorMessage)
}
if (!response.body) {
throw new Error('No response body')
}
const reader = response.body.getReader()
const decoder = new TextDecoder()
let buffer = ''
const serverExecutionId = response.headers.get('X-Execution-Id')
if (serverExecutionId) {
executionIdRef.current = serverExecutionId
setCurrentExecutionId(wfId, serverExecutionId)
}
let executionResult: ExecutionResult = {
success: false,
output: {},
@@ -507,89 +674,67 @@ export async function executeWorkflowWithFullLogging(
}
try {
while (true) {
const { done, value } = await reader.read()
if (done) break
await processSSEStream(
response.body.getReader(),
{
onExecutionStarted: (data) => {
logger.info('Execution started', { startTime: data.startTime })
},
buffer += decoder.decode(value, { stream: true })
const lines = buffer.split('\n\n')
buffer = lines.pop() || ''
onBlockStarted: blockHandlers.onBlockStarted,
onBlockCompleted: blockHandlers.onBlockCompleted,
onBlockError: blockHandlers.onBlockError,
onBlockChildWorkflowStarted: blockHandlers.onBlockChildWorkflowStarted,
for (const line of lines) {
if (!line.trim() || !line.startsWith('data: ')) continue
onExecutionCompleted: (data) => {
setCurrentExecutionId(wfId, null)
executionResult = {
success: data.success,
output: data.output,
logs: accumulatedBlockLogs,
metadata: {
duration: data.duration,
startTime: data.startTime,
endTime: data.endTime,
},
}
},
const data = line.substring(6).trim()
if (data === '[DONE]') continue
onExecutionCancelled: () => {
setCurrentExecutionId(wfId, null)
executionResult = {
success: false,
output: {},
error: 'Execution was cancelled',
logs: accumulatedBlockLogs,
}
},
let event: any
try {
event = JSON.parse(data)
} catch {
continue
}
switch (event.type) {
case 'execution:started': {
setCurrentExecutionId(wfId, event.executionId)
executionIdRef.current = event.executionId || executionId
break
onExecutionError: (data) => {
setCurrentExecutionId(wfId, null)
const errorMessage = data.error || 'Execution failed'
executionResult = {
success: false,
output: {},
error: errorMessage,
logs: accumulatedBlockLogs,
metadata: { duration: data.duration },
}
case 'block:started':
blockHandlers.onBlockStarted(event.data)
break
case 'block:completed':
blockHandlers.onBlockCompleted(event.data)
break
case 'block:error':
blockHandlers.onBlockError(event.data)
break
case 'block:childWorkflowStarted':
blockHandlers.onBlockChildWorkflowStarted(event.data)
break
case 'execution:completed':
setCurrentExecutionId(wfId, null)
executionResult = {
success: event.data.success,
output: event.data.output,
logs: [],
metadata: {
duration: event.data.duration,
startTime: event.data.startTime,
endTime: event.data.endTime,
},
}
break
case 'execution:cancelled':
setCurrentExecutionId(wfId, null)
executionResult = {
success: false,
output: {},
error: 'Execution was cancelled',
logs: [],
}
break
case 'execution:error':
setCurrentExecutionId(wfId, null)
executionResult = {
success: false,
output: {},
error: event.data.error || 'Execution failed',
logs: [],
}
break
}
}
}
handleExecutionErrorConsole(addConsole, cancelRunningEntries, {
workflowId: wfId,
executionId: executionIdRef.current,
error: errorMessage,
durationMs: data.duration || 0,
blockLogs: accumulatedBlockLogs,
isPreExecutionError: accumulatedBlockLogs.length === 0,
})
},
},
'CopilotExecution'
)
} finally {
setCurrentExecutionId(wfId, null)
reader.releaseLock()
setActiveBlocks(wfId, new Set())
}
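The hunk above replaces the inline SSE loop with `processSSEStream` callbacks. For reference, the framing the old loop handled looks like this — a simplified stand-in for illustration only, not the `processSSEStream` implementation:

```typescript
// SSE frames are separated by blank lines; each payload sits on a "data: "
// line. A trailing partial frame is returned so the caller can prepend it to
// the next network chunk, exactly as the removed `buffer` logic did.
function parseSSEChunk(buffer: string): { events: any[]; rest: string } {
  const frames = buffer.split('\n\n')
  const rest = frames.pop() || '' // incomplete frame, carry over
  const events: any[] = []
  for (const frame of frames) {
    if (!frame.startsWith('data: ')) continue
    const payload = frame.slice(6).trim()
    if (payload === '[DONE]') continue // stream sentinel, not an event
    try {
      events.push(JSON.parse(payload))
    } catch {
      // ignore malformed frames, as the original loop did
    }
  }
  return { events, rest }
}
```

Centralizing this in `processSSEStream` means the copilot path and the direct execution path share one parser instead of two hand-rolled loops.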


@@ -59,11 +59,13 @@ import {
filterProtectedBlocks,
getClampedPositionForNode,
getDescendantBlockIds,
getEdgeSelectionContextId,
getNodeSelectionContextId,
getWorkflowLockToggleIds,
isBlockProtected,
isEdgeProtected,
isInEditableElement,
resolveParentChildSelectionConflicts,
resolveSelectionConflicts,
validateTriggerPaste,
} from '@/app/workspace/[workspaceId]/w/[workflowId]/utils'
import { useSocket } from '@/app/workspace/providers/socket-provider'
@@ -168,16 +170,17 @@ function mapEdgesByNode(edges: Edge[], nodeIds: Set<string>): Map<string, Edge[]
/**
* Syncs the panel editor with the current selection state.
* Shows block details when exactly one block is selected, clears otherwise.
* Shows the last selected block in the panel. Clears when nothing is selected.
*/
function syncPanelWithSelection(selectedIds: string[]) {
const { currentBlockId, clearCurrentBlock, setCurrentBlockId } = usePanelEditorStore.getState()
if (selectedIds.length === 1 && selectedIds[0] !== currentBlockId) {
setCurrentBlockId(selectedIds[0])
} else if (selectedIds.length === 0 && currentBlockId) {
clearCurrentBlock()
} else if (selectedIds.length > 1 && currentBlockId) {
clearCurrentBlock()
if (selectedIds.length === 0) {
if (currentBlockId) clearCurrentBlock()
} else {
const lastSelectedId = selectedIds[selectedIds.length - 1]
if (lastSelectedId !== currentBlockId) {
setCurrentBlockId(lastSelectedId)
}
}
}
@@ -246,7 +249,6 @@ const WorkflowContent = React.memo(
const [selectedEdges, setSelectedEdges] = useState<SelectedEdgesMap>(new Map())
const [isErrorConnectionDrag, setIsErrorConnectionDrag] = useState(false)
const canvasContainerRef = useRef<HTMLDivElement>(null)
const selectedIdsRef = useRef<string[] | null>(null)
const embeddedFitFrameRef = useRef<number | null>(null)
const hasCompletedInitialEmbeddedFitRef = useRef(false)
const canvasMode = useCanvasModeStore((state) => state.mode)
@@ -336,9 +338,7 @@ const WorkflowContent = React.memo(
const isAutoConnectEnabled = useAutoConnect()
const autoConnectRef = useRef(isAutoConnectEnabled)
useEffect(() => {
autoConnectRef.current = isAutoConnectEnabled
}, [isAutoConnectEnabled])
autoConnectRef.current = isAutoConnectEnabled
// Panel open states for context menu
const isVariablesOpen = useVariablesStore((state) => state.isOpen)
@@ -2479,6 +2479,16 @@ const WorkflowContent = React.memo(
// Local state for nodes - allows smooth drag without store updates on every frame
const [displayNodes, setDisplayNodes] = useState<Node[]>([])
const selectedNodeIds = useMemo(
() => displayNodes.filter((node) => node.selected).map((node) => node.id),
[displayNodes]
)
const selectedNodeIdsKey = selectedNodeIds.join(',')
useEffect(() => {
syncPanelWithSelection(selectedNodeIds)
}, [selectedNodeIdsKey])
useEffect(() => {
// Check for pending selection (from paste/duplicate), otherwise preserve existing selection
if (pendingSelection && pendingSelection.length > 0) {
@@ -2490,10 +2500,8 @@ const WorkflowContent = React.memo(
...node,
selected: pendingSet.has(node.id),
}))
const resolved = resolveParentChildSelectionConflicts(withSelection, blocks)
const resolved = resolveSelectionConflicts(withSelection, blocks)
setDisplayNodes(resolved)
const selectedIds = resolved.filter((node) => node.selected).map((node) => node.id)
syncPanelWithSelection(selectedIds)
return
}
@@ -2711,19 +2719,20 @@ const WorkflowContent = React.memo(
/** Handles node changes - applies changes and resolves parent-child selection conflicts. */
const onNodesChange = useCallback(
(changes: NodeChange[]) => {
selectedIdsRef.current = null
setDisplayNodes((nds) => {
const updated = applyNodeChanges(changes, nds)
const hasSelectionChange = changes.some((c) => c.type === 'select')
const hasSelectionChange = changes.some((c) => c.type === 'select')
setDisplayNodes((currentNodes) => {
const updated = applyNodeChanges(changes, currentNodes)
if (!hasSelectionChange) return updated
const resolved = resolveParentChildSelectionConflicts(updated, blocks)
selectedIdsRef.current = resolved.filter((node) => node.selected).map((node) => node.id)
return resolved
const preferredNodeId = [...changes]
.reverse()
.find(
(change): change is NodeChange & { id: string; selected: boolean } =>
change.type === 'select' && 'selected' in change && change.selected === true
)?.id
return resolveSelectionConflicts(updated, blocks, preferredNodeId)
})
const selectedIds = selectedIdsRef.current as string[] | null
if (selectedIds !== null) {
syncPanelWithSelection(selectedIds)
}
// Handle position changes (e.g., from keyboard arrow key movement)
// Update container dimensions when child nodes are moved and persist to backend
@@ -3162,7 +3171,10 @@ const WorkflowContent = React.memo(
parentId: currentParentId,
})
// Capture all selected nodes' positions for multi-node undo/redo
// Capture all selected nodes' positions for multi-node undo/redo.
// Also include the dragged node itself — during shift+click+drag, ReactFlow
// may have toggled (deselected) the node before drag starts, so it might not
// appear in the selected set yet.
const allNodes = getNodes()
const selectedNodes = allNodes.filter((n) => n.selected)
multiNodeDragStartRef.current.clear()
@@ -3176,6 +3188,33 @@ const WorkflowContent = React.memo(
})
}
})
if (!multiNodeDragStartRef.current.has(node.id)) {
multiNodeDragStartRef.current.set(node.id, {
x: node.position.x,
y: node.position.y,
parentId: currentParentId ?? undefined,
})
}
// When shift+clicking an already-selected node, ReactFlow toggles (deselects)
// it via onNodesChange before drag starts. Re-select the dragged node so all
// previously selected nodes move together as a group — but only if the
// deselection wasn't from a parent-child conflict (e.g. dragging a child
// when its parent subflow is selected).
const draggedNodeInSelected = allNodes.find((n) => n.id === node.id)
if (draggedNodeInSelected && !draggedNodeInSelected.selected && selectedNodes.length > 0) {
const draggedParentId = blocks[node.id]?.data?.parentId
const parentIsSelected =
draggedParentId && selectedNodes.some((n) => n.id === draggedParentId)
const contextMismatch =
getNodeSelectionContextId(draggedNodeInSelected, blocks) !==
getNodeSelectionContextId(selectedNodes[0], blocks)
if (!parentIsSelected && !contextMismatch) {
setDisplayNodes((currentNodes) =>
currentNodes.map((n) => (n.id === node.id ? { ...n, selected: true } : n))
)
}
}
},
[blocks, setDragStartPosition, getNodes, setPotentialParentId]
)
@@ -3455,7 +3494,7 @@ const WorkflowContent = React.memo(
})
// Apply visual deselection of children
setDisplayNodes((allNodes) => resolveParentChildSelectionConflicts(allNodes, blocks))
setDisplayNodes((allNodes) => resolveSelectionConflicts(allNodes, blocks))
},
[blocks]
)
@@ -3606,19 +3645,25 @@ const WorkflowContent = React.memo(
/**
* Handles node click to select the node in ReactFlow.
* Parent-child conflict resolution happens automatically in onNodesChange.
* Uses the controlled display node state so parent-child conflicts are resolved
* consistently for click, shift-click, and marquee selection.
*/
const handleNodeClick = useCallback(
(event: React.MouseEvent, node: Node) => {
const isMultiSelect = event.shiftKey || event.metaKey || event.ctrlKey
setNodes((nodes) =>
nodes.map((n) => ({
...n,
selected: isMultiSelect ? (n.id === node.id ? true : n.selected) : n.id === node.id,
setDisplayNodes((currentNodes) => {
const updated = currentNodes.map((currentNode) => ({
...currentNode,
selected: isMultiSelect
? currentNode.id === node.id
? true
: currentNode.selected
: currentNode.id === node.id,
}))
)
return resolveSelectionConflicts(updated, blocks, isMultiSelect ? node.id : undefined)
})
},
[setNodes]
[blocks]
)
/** Handles edge selection with container context tracking and Shift-click multi-selection. */
@@ -3626,16 +3671,10 @@ const WorkflowContent = React.memo(
(event: React.MouseEvent, edge: any) => {
event.stopPropagation() // Prevent bubbling
// Determine if edge is inside a loop by checking its source/target nodes
const sourceNode = getNodes().find((n) => n.id === edge.source)
const targetNode = getNodes().find((n) => n.id === edge.target)
// An edge is inside a loop if either source or target has a parent
// If source and target have different parents, prioritize source's parent
const parentLoopId = sourceNode?.parentId || targetNode?.parentId
// Create a unique identifier that combines edge ID and parent context
const contextId = `${edge.id}${parentLoopId ? `-${parentLoopId}` : ''}`
const contextId = `${edge.id}${(() => {
const selectionContextId = getEdgeSelectionContextId(edge, getNodes(), blocks)
return selectionContextId ? `-${selectionContextId}` : ''
})()}`
if (event.shiftKey) {
// Shift-click: toggle edge in selection
@@ -3653,7 +3692,7 @@ const WorkflowContent = React.memo(
setSelectedEdges(new Map([[contextId, edge.id]]))
}
},
[getNodes]
[blocks, getNodes]
)
/** Stable delete handler to avoid creating new function references per edge. */


@@ -277,16 +277,20 @@ function ConnectionsSection({
onResizeMouseDown,
onToggleCollapsed,
}: ConnectionsSectionProps) {
const [expandedBlocks, setExpandedBlocks] = useState<Set<string>>(() => new Set())
/** Stable string of connection IDs to prevent guard from running on every render */
const connectionIds = useMemo(() => connections.map((c) => c.blockId).join(','), [connections])
const [expandedBlocks, setExpandedBlocks] = useState<Set<string>>(
() => new Set(connectionIds.split(',').filter(Boolean))
)
const [expandedVariables, setExpandedVariables] = useState(true)
const [expandedEnvVars, setExpandedEnvVars] = useState(true)
/** Stable string of connection IDs to prevent effect from running on every render */
const connectionIds = useMemo(() => connections.map((c) => c.blockId).join(','), [connections])
useEffect(() => {
const [prevConnectionIds, setPrevConnectionIds] = useState(connectionIds)
if (connectionIds !== prevConnectionIds) {
setPrevConnectionIds(connectionIds)
setExpandedBlocks(new Set(connectionIds.split(',').filter(Boolean)))
}, [connectionIds])
}
const hasContent = connections.length > 0 || workflowVars.length > 0 || envVars.length > 0
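This diff (and the Preview and SearchModal diffs below it) swaps a `useEffect` reset for React's "adjust state during render" pattern: compare the incoming value against a stored previous value and reset synchronously, saving the extra render pass an effect would cost. A framework-free sketch of the idea, with all names illustrative:

```typescript
// makeResettable models a state slot keyed to a dependency: the state
// survives while the dependency is unchanged and is re-initialized the
// moment it changes, with no deferred effect pass.
function makeResettable<D, S>(computeInitial: (dep: D) => S) {
  let hasPrev = false
  let prevDep: D | undefined
  let state!: S
  return (dep: D): S => {
    if (!hasPrev || dep !== prevDep) {
      hasPrev = true
      prevDep = dep
      state = computeInitial(dep) // reset on dependency change
    }
    return state
  }
}

const render = makeResettable<number, { count: number }>(() => ({ count: 0 }))
const a = render(1)
a.count = 5
const same = render(1) // same dependency: state survives
const fresh = render(2) // dependency changed: state resets
```

In the component versions, `useState(prev)` plays the role of the captured `prevDep`, and the comparison runs at the top of the render body.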


@@ -1,6 +1,6 @@
'use client'
import { useCallback, useEffect, useMemo, useState } from 'react'
import { useCallback, useMemo, useState } from 'react'
import { ArrowLeft } from 'lucide-react'
import { Button, Tooltip } from '@/components/emcn'
import { redactApiKeys } from '@/lib/core/security/redaction'
@@ -161,6 +161,11 @@ export function Preview({
})
const [workflowStack, setWorkflowStack] = useState<WorkflowStackEntry[]>([])
const [prevRootState, setPrevRootState] = useState(rootWorkflowState)
if (rootWorkflowState !== prevRootState) {
setPrevRootState(rootWorkflowState)
setWorkflowStack([])
}
const rootBlockExecutions = useMemo(() => {
if (providedBlockExecutions) return providedBlockExecutions
@@ -227,10 +232,6 @@ export function Preview({
setPinnedBlockId(null)
}, [])
useEffect(() => {
setWorkflowStack([])
}, [rootWorkflowState])
const isNested = workflowStack.length > 0
const currentWorkflowName = isNested ? workflowStack[workflowStack.length - 1].workflowName : null


@@ -175,24 +175,26 @@ export function SearchModal({
]
)
const [search, setSearch] = useState('')
const [prevOpen, setPrevOpen] = useState(open)
if (open !== prevOpen) {
setPrevOpen(open)
if (open) setSearch('')
}
useEffect(() => {
if (open) {
setSearch('')
if (inputRef.current) {
const nativeInputValueSetter = Object.getOwnPropertyDescriptor(
window.HTMLInputElement.prototype,
'value'
)?.set
if (nativeInputValueSetter) {
nativeInputValueSetter.call(inputRef.current, '')
inputRef.current.dispatchEvent(new Event('input', { bubbles: true }))
}
inputRef.current.focus()
}
if (!open || !inputRef.current) return
const nativeInputValueSetter = Object.getOwnPropertyDescriptor(
window.HTMLInputElement.prototype,
'value'
)?.set
if (nativeInputValueSetter) {
nativeInputValueSetter.call(inputRef.current, '')
inputRef.current.dispatchEvent(new Event('input', { bubbles: true }))
}
inputRef.current.focus()
}, [open])
const [search, setSearch] = useState('')
const deferredSearch = useDeferredValue(search)
const handleSearchChange = useCallback((value: string) => {


@@ -124,13 +124,6 @@ export function useDragDrop(options: UseDragDropOptions = {}) {
}
}, [hoverFolderId, isDragging, expandedFolders, setExpanded])
useEffect(() => {
if (!isDragging) {
setHoverFolderId(null)
setDropIndicator(null)
}
}, [isDragging])
const calculateDropPosition = useCallback(
(e: React.DragEvent, element: HTMLElement): 'before' | 'after' => {
const rect = element.getBoundingClientRect()


@@ -42,13 +42,6 @@ export function useItemRename({ initialName, onSave, itemType, itemId }: UseItem
const [isRenaming, setIsRenaming] = useState(false)
const inputRef = useRef<HTMLInputElement>(null)
/**
* Update edit value when initial name changes
*/
useEffect(() => {
setEditValue(initialName)
}, [initialName])
/**
* Focus and select input when entering edit mode
*/


@@ -7,7 +7,7 @@ export const AshbyBlock: BlockConfig = {
name: 'Ashby',
description: 'Manage candidates, jobs, and applications in Ashby',
longDescription:
'Integrate Ashby into the workflow. Can list, search, create, and update candidates, list and get job details, create notes, list notes, list and get applications, create applications, and list offers.',
'Integrate Ashby into the workflow. Manage candidates (list, get, create, update, search, tag), applications (list, get, create, change stage), jobs (list, get), job postings (list, get), offers (list, get), notes (list, create), interviews (list), and reference data (sources, tags, archive reasons, custom fields, departments, locations, openings, users).',
docsLink: 'https://docs.sim.ai/tools/ashby',
category: 'tools',
bgColor: '#5D4ED6',
@@ -45,6 +45,21 @@ export const AshbyBlock: BlockConfig = {
{ label: 'Get Application', id: 'get_application' },
{ label: 'Create Application', id: 'create_application' },
{ label: 'List Offers', id: 'list_offers' },
{ label: 'Change Application Stage', id: 'change_application_stage' },
{ label: 'Add Candidate Tag', id: 'add_candidate_tag' },
{ label: 'Remove Candidate Tag', id: 'remove_candidate_tag' },
{ label: 'Get Offer', id: 'get_offer' },
{ label: 'List Sources', id: 'list_sources' },
{ label: 'List Candidate Tags', id: 'list_candidate_tags' },
{ label: 'List Archive Reasons', id: 'list_archive_reasons' },
{ label: 'List Custom Fields', id: 'list_custom_fields' },
{ label: 'List Departments', id: 'list_departments' },
{ label: 'List Locations', id: 'list_locations' },
{ label: 'List Job Postings', id: 'list_job_postings' },
{ label: 'Get Job Posting', id: 'get_job_posting' },
{ label: 'List Openings', id: 'list_openings' },
{ label: 'List Users', id: 'list_users' },
{ label: 'List Interviews', id: 'list_interviews' },
],
value: () => 'list_candidates',
},
@@ -56,24 +71,34 @@ export const AshbyBlock: BlockConfig = {
placeholder: 'Enter your Ashby API key',
password: true,
},
// Get Candidate / Create Note / List Notes / Update Candidate - candidateId
{
id: 'candidateId',
title: 'Candidate ID',
type: 'short-input',
required: {
field: 'operation',
value: ['get_candidate', 'create_note', 'list_notes', 'update_candidate'],
value: [
'get_candidate',
'create_note',
'list_notes',
'update_candidate',
'add_candidate_tag',
'remove_candidate_tag',
],
},
placeholder: 'Enter candidate UUID',
condition: {
field: 'operation',
value: ['get_candidate', 'create_note', 'list_notes', 'update_candidate'],
value: [
'get_candidate',
'create_note',
'list_notes',
'update_candidate',
'add_candidate_tag',
'remove_candidate_tag',
],
},
},
// Create Candidate fields
{
id: 'name',
title: 'Name',
@@ -86,22 +111,10 @@ export const AshbyBlock: BlockConfig = {
id: 'email',
title: 'Email',
type: 'short-input',
required: { field: 'operation', value: 'create_candidate' },
placeholder: 'Email address',
condition: { field: 'operation', value: ['create_candidate', 'update_candidate'] },
},
{
id: 'emailType',
title: 'Email Type',
type: 'dropdown',
options: [
{ label: 'Work', id: 'Work' },
{ label: 'Personal', id: 'Personal' },
{ label: 'Other', id: 'Other' },
],
value: () => 'Work',
condition: { field: 'operation', value: ['create_candidate', 'update_candidate'] },
mode: 'advanced',
},
{
id: 'phoneNumber',
title: 'Phone Number',
@@ -110,19 +123,6 @@ export const AshbyBlock: BlockConfig = {
condition: { field: 'operation', value: ['create_candidate', 'update_candidate'] },
mode: 'advanced',
},
{
id: 'phoneType',
title: 'Phone Type',
type: 'dropdown',
options: [
{ label: 'Work', id: 'Work' },
{ label: 'Personal', id: 'Personal' },
{ label: 'Other', id: 'Other' },
],
value: () => 'Work',
condition: { field: 'operation', value: ['create_candidate', 'update_candidate'] },
mode: 'advanced',
},
{
id: 'linkedInUrl',
title: 'LinkedIn URL',
@@ -150,8 +150,6 @@ export const AshbyBlock: BlockConfig = {
},
mode: 'advanced',
},
// Update Candidate fields
{
id: 'updateName',
title: 'Name',
@@ -168,8 +166,6 @@ export const AshbyBlock: BlockConfig = {
condition: { field: 'operation', value: 'update_candidate' },
mode: 'advanced',
},
// Search Candidates fields
{
id: 'searchName',
title: 'Name',
@@ -184,8 +180,6 @@ export const AshbyBlock: BlockConfig = {
placeholder: 'Search by candidate email',
condition: { field: 'operation', value: 'search_candidates' },
},
// Get Job fields
{
id: 'jobId',
title: 'Job ID',
@@ -194,18 +188,20 @@ export const AshbyBlock: BlockConfig = {
placeholder: 'Enter job UUID',
condition: { field: 'operation', value: ['get_job', 'create_application'] },
},
// Get Application fields
{
id: 'applicationId',
title: 'Application ID',
type: 'short-input',
required: { field: 'operation', value: 'get_application' },
required: {
field: 'operation',
value: ['get_application', 'change_application_stage'],
},
placeholder: 'Enter application UUID',
condition: { field: 'operation', value: 'get_application' },
condition: {
field: 'operation',
value: ['get_application', 'change_application_stage', 'list_interviews'],
},
},
// Create Application fields
{
id: 'appCandidateId',
title: 'Candidate ID',
@@ -226,9 +222,12 @@ export const AshbyBlock: BlockConfig = {
id: 'interviewStageId',
title: 'Interview Stage ID',
type: 'short-input',
placeholder: 'Interview stage UUID (defaults to first Lead stage)',
condition: { field: 'operation', value: 'create_application' },
mode: 'advanced',
required: { field: 'operation', value: 'change_application_stage' },
placeholder: 'Interview stage UUID',
condition: {
field: 'operation',
value: ['create_application', 'change_application_stage', 'list_interviews'],
},
},
{
id: 'creditedToUserId',
@@ -257,8 +256,6 @@ Output only the ISO 8601 timestamp string, nothing else.`,
generationType: 'timestamp',
},
},
// Create Note fields
{
id: 'note',
title: 'Note',
@@ -286,8 +283,6 @@ Output only the ISO 8601 timestamp string, nothing else.`,
condition: { field: 'operation', value: 'create_note' },
mode: 'advanced',
},
// List Applications filter fields
{
id: 'filterStatus',
title: 'Status Filter',
@@ -338,8 +333,6 @@ Output only the ISO 8601 timestamp string, nothing else.`,
generationType: 'timestamp',
},
},
// List Jobs status filter
{
id: 'jobStatus',
title: 'Status Filter',
@@ -355,8 +348,6 @@ Output only the ISO 8601 timestamp string, nothing else.`,
condition: { field: 'operation', value: 'list_jobs' },
mode: 'advanced',
},
// Pagination fields for list operations
{
id: 'cursor',
title: 'Cursor',
@@ -364,7 +355,16 @@ Output only the ISO 8601 timestamp string, nothing else.`,
placeholder: 'Pagination cursor from previous response',
condition: {
field: 'operation',
value: [
'list_candidates',
'list_jobs',
'list_applications',
'list_notes',
'list_offers',
'list_openings',
'list_users',
'list_interviews',
],
},
mode: 'advanced',
},
@@ -375,12 +375,57 @@ Output only the ISO 8601 timestamp string, nothing else.`,
placeholder: 'Results per page (default 100)',
condition: {
field: 'operation',
value: [
'list_candidates',
'list_jobs',
'list_applications',
'list_notes',
'list_offers',
'list_openings',
'list_users',
'list_interviews',
],
},
mode: 'advanced',
},
// Trigger subBlocks
{
id: 'tagId',
title: 'Tag ID',
type: 'short-input',
required: {
field: 'operation',
value: ['add_candidate_tag', 'remove_candidate_tag'],
},
placeholder: 'Enter tag UUID',
condition: {
field: 'operation',
value: ['add_candidate_tag', 'remove_candidate_tag'],
},
},
{
id: 'archiveReasonId',
title: 'Archive Reason ID',
type: 'short-input',
placeholder: 'Archive reason UUID (required for Archived stages)',
condition: { field: 'operation', value: 'change_application_stage' },
mode: 'advanced',
},
{
id: 'offerId',
title: 'Offer ID',
type: 'short-input',
required: { field: 'operation', value: 'get_offer' },
placeholder: 'Enter offer UUID',
condition: { field: 'operation', value: 'get_offer' },
},
{
id: 'jobPostingId',
title: 'Job Posting ID',
type: 'short-input',
required: { field: 'operation', value: 'get_job_posting' },
placeholder: 'Enter job posting UUID',
condition: { field: 'operation', value: 'get_job_posting' },
},
...getTrigger('ashby_application_submit').subBlocks,
...getTrigger('ashby_candidate_stage_change').subBlocks,
...getTrigger('ashby_candidate_hire').subBlocks,
@@ -391,17 +436,32 @@ Output only the ISO 8601 timestamp string, nothing else.`,
tools: {
access: [
'ashby_add_candidate_tag',
'ashby_change_application_stage',
'ashby_create_application',
'ashby_create_candidate',
'ashby_create_note',
'ashby_get_application',
'ashby_get_candidate',
'ashby_get_job',
'ashby_get_job_posting',
'ashby_get_offer',
'ashby_list_applications',
'ashby_list_archive_reasons',
'ashby_list_candidate_tags',
'ashby_list_candidates',
'ashby_list_custom_fields',
'ashby_list_departments',
'ashby_list_interviews',
'ashby_list_job_postings',
'ashby_list_jobs',
'ashby_list_locations',
'ashby_list_notes',
'ashby_list_offers',
'ashby_list_openings',
'ashby_list_sources',
'ashby_list_users',
'ashby_remove_candidate_tag',
'ashby_search_candidates',
'ashby_update_candidate',
],
@@ -419,10 +479,8 @@ Output only the ISO 8601 timestamp string, nothing else.`,
if (params.sendNotifications === 'true' || params.sendNotifications === true) {
result.sendNotifications = true
}
// Create Application params
if (params.appCandidateId) result.candidateId = params.appCandidateId
if (params.appCreatedAt) result.createdAt = params.appCreatedAt
// Update Candidate params
if (params.updateName) result.name = params.updateName
return result
},
@@ -435,9 +493,7 @@ Output only the ISO 8601 timestamp string, nothing else.`,
candidateId: { type: 'string', description: 'Candidate UUID' },
name: { type: 'string', description: 'Candidate full name' },
email: { type: 'string', description: 'Email address' },
emailType: { type: 'string', description: 'Email type (Personal, Work, Other)' },
phoneNumber: { type: 'string', description: 'Phone number' },
phoneType: { type: 'string', description: 'Phone type (Personal, Work, Other)' },
linkedInUrl: { type: 'string', description: 'LinkedIn profile URL' },
githubUrl: { type: 'string', description: 'GitHub profile URL' },
websiteUrl: { type: 'string', description: 'Personal website URL' },
@@ -462,6 +518,10 @@ Output only the ISO 8601 timestamp string, nothing else.`,
jobStatus: { type: 'string', description: 'Job status filter' },
cursor: { type: 'string', description: 'Pagination cursor' },
perPage: { type: 'number', description: 'Results per page' },
tagId: { type: 'string', description: 'Tag UUID' },
offerId: { type: 'string', description: 'Offer UUID' },
jobPostingId: { type: 'string', description: 'Job posting UUID' },
archiveReasonId: { type: 'string', description: 'Archive reason UUID' },
},
outputs: {
@@ -486,12 +546,73 @@ Output only the ISO 8601 timestamp string, nothing else.`,
},
offers: {
type: 'json',
description:
'List of offers (id, offerStatus, acceptanceStatus, applicationId, startDate, salary, openingId)',
},
archiveReasons: {
type: 'json',
description: 'List of archive reasons (id, text, reasonType, isArchived)',
},
sources: {
type: 'json',
description: 'List of sources (id, title, isArchived)',
},
customFields: {
type: 'json',
description: 'List of custom fields (id, title, fieldType, objectType, isArchived)',
},
departments: {
type: 'json',
description: 'List of departments (id, name, isArchived, parentId)',
},
locations: {
type: 'json',
description: 'List of locations (id, name, isArchived, isRemote, address)',
},
jobPostings: {
type: 'json',
description:
'List of job postings (id, title, jobId, locationName, departmentName, employmentType, isListed, publishedDate)',
},
openings: {
type: 'json',
description: 'List of openings (id, openingState, isArchived, openedAt, closedAt)',
},
users: {
type: 'json',
description: 'List of users (id, firstName, lastName, email, isEnabled, globalRole)',
},
interviewSchedules: {
type: 'json',
description:
'List of interview schedules (id, applicationId, interviewStageId, status, createdAt)',
},
tags: {
type: 'json',
description: 'List of candidate tags (id, title, isArchived)',
},
stageId: { type: 'string', description: 'Interview stage UUID after stage change' },
success: { type: 'boolean', description: 'Whether the operation succeeded' },
offerStatus: {
type: 'string',
description: 'Offer status (e.g. WaitingOnCandidateResponse, CandidateAccepted)',
},
acceptanceStatus: {
type: 'string',
description: 'Acceptance status (e.g. Accepted, Declined, Pending)',
},
applicationId: { type: 'string', description: 'Associated application UUID' },
openingId: { type: 'string', description: 'Opening UUID associated with the offer' },
salary: {
type: 'json',
description: 'Salary details from latest version (currencyCode, value)',
},
startDate: { type: 'string', description: 'Offer start date from latest version' },
id: { type: 'string', description: 'Resource UUID' },
name: { type: 'string', description: 'Resource name' },
title: { type: 'string', description: 'Job title' },
status: { type: 'string', description: 'Status' },
noteId: { type: 'string', description: 'Created note UUID' },
content: { type: 'string', description: 'Note content' },
moreDataAvailable: { type: 'boolean', description: 'Whether more pages exist' },
nextCursor: { type: 'string', description: 'Pagination cursor for next page' },
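The Ashby changes above widen several `condition` and `required` specs from a single operation string to an array of operations. A minimal sketch of how such a `{ field, value }` spec can be evaluated against the current block values (the helper name `matchesCondition` is illustrative, not Sim's actual renderer logic):

```typescript
// Illustrative only: Sim's real subBlock visibility/required logic lives in
// the block renderer. This shows the shape of a { field, value } check where
// value may be a single id or an array of ids.
type FieldCondition = { field: string; value: string | string[] }

function matchesCondition(
  cond: FieldCondition | undefined,
  values: Record<string, string>
): boolean {
  if (!cond) return true // no condition: always shown / always required
  const current = values[cond.field]
  return Array.isArray(cond.value)
    ? cond.value.includes(current)
    : cond.value === current
}

// Example: applicationId is now shown for three operations instead of one
const applicationIdCondition: FieldCondition = {
  field: 'operation',
  value: ['get_application', 'change_application_stage', 'list_interviews'],
}

console.log(matchesCondition(applicationIdCondition, { operation: 'get_application' })) // true
console.log(matchesCondition(applicationIdCondition, { operation: 'list_jobs' })) // false
```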

View File

@@ -0,0 +1,599 @@
import { BoxCompanyIcon } from '@/components/icons'
import { getScopesForService } from '@/lib/oauth/utils'
import type { BlockConfig } from '@/blocks/types'
import { AuthMode } from '@/blocks/types'
import { normalizeFileInput } from '@/blocks/utils'
export const BoxBlock: BlockConfig = {
type: 'box',
name: 'Box',
description: 'Manage files, folders, and e-signatures with Box',
longDescription:
'Integrate Box into your workflow to manage files, folders, and e-signatures. Upload and download files, search content, create folders, send documents for e-signature, track signing status, and more.',
docsLink: 'https://docs.sim.ai/tools/box',
category: 'tools',
bgColor: '#FFFFFF',
icon: BoxCompanyIcon,
authMode: AuthMode.OAuth,
subBlocks: [
{
id: 'operation',
title: 'Operation',
type: 'dropdown',
options: [
{ label: 'Upload File', id: 'upload_file' },
{ label: 'Download File', id: 'download_file' },
{ label: 'Get File Info', id: 'get_file_info' },
{ label: 'List Folder Items', id: 'list_folder_items' },
{ label: 'Create Folder', id: 'create_folder' },
{ label: 'Delete File', id: 'delete_file' },
{ label: 'Delete Folder', id: 'delete_folder' },
{ label: 'Copy File', id: 'copy_file' },
{ label: 'Search', id: 'search' },
{ label: 'Update File', id: 'update_file' },
{ label: 'Create Sign Request', id: 'sign_create_request' },
{ label: 'Get Sign Request', id: 'sign_get_request' },
{ label: 'List Sign Requests', id: 'sign_list_requests' },
{ label: 'Cancel Sign Request', id: 'sign_cancel_request' },
{ label: 'Resend Sign Request', id: 'sign_resend_request' },
],
value: () => 'upload_file',
},
{
id: 'credential',
title: 'Box Account',
type: 'oauth-input',
serviceId: 'box',
requiredScopes: getScopesForService('box'),
placeholder: 'Select Box account',
required: true,
},
// Upload File fields
{
id: 'uploadFile',
title: 'File',
type: 'file-upload',
canonicalParamId: 'file',
placeholder: 'Upload file to send to Box',
mode: 'basic',
multiple: false,
required: { field: 'operation', value: 'upload_file' },
condition: { field: 'operation', value: 'upload_file' },
},
{
id: 'fileRef',
title: 'File',
type: 'short-input',
canonicalParamId: 'file',
placeholder: 'Reference file from previous blocks',
mode: 'advanced',
required: { field: 'operation', value: 'upload_file' },
condition: { field: 'operation', value: 'upload_file' },
},
{
id: 'parentFolderId',
title: 'Parent Folder ID',
type: 'short-input',
placeholder: 'Folder ID (use "0" for root)',
required: { field: 'operation', value: ['upload_file', 'create_folder', 'copy_file'] },
condition: { field: 'operation', value: ['upload_file', 'create_folder', 'copy_file'] },
},
{
id: 'uploadFileName',
title: 'File Name',
type: 'short-input',
placeholder: 'Optional filename override',
condition: { field: 'operation', value: 'upload_file' },
mode: 'advanced',
},
// File ID field (shared by download, get info, delete, copy, update)
{
id: 'fileId',
title: 'File ID',
type: 'short-input',
placeholder: 'Box file ID',
required: {
field: 'operation',
value: ['download_file', 'get_file_info', 'delete_file', 'copy_file', 'update_file'],
},
condition: {
field: 'operation',
value: ['download_file', 'get_file_info', 'delete_file', 'copy_file', 'update_file'],
},
},
// Folder ID field (shared by list, delete folder)
{
id: 'folderId',
title: 'Folder ID',
type: 'short-input',
placeholder: 'Box folder ID (use "0" for root)',
required: { field: 'operation', value: ['list_folder_items', 'delete_folder'] },
condition: { field: 'operation', value: ['list_folder_items', 'delete_folder'] },
},
// Create Folder fields
{
id: 'folderName',
title: 'Folder Name',
type: 'short-input',
placeholder: 'Name for the new folder',
required: { field: 'operation', value: 'create_folder' },
condition: { field: 'operation', value: 'create_folder' },
},
// Copy File fields
{
id: 'copyName',
title: 'New Name',
type: 'short-input',
placeholder: 'Optional name for the copy',
condition: { field: 'operation', value: 'copy_file' },
},
// Search fields
{
id: 'query',
title: 'Search Query',
type: 'short-input',
placeholder: 'Search query string',
required: { field: 'operation', value: 'search' },
condition: { field: 'operation', value: 'search' },
},
{
id: 'ancestorFolderId',
title: 'Ancestor Folder ID',
type: 'short-input',
placeholder: 'Restrict search to a folder',
condition: { field: 'operation', value: 'search' },
mode: 'advanced',
},
{
id: 'fileExtensions',
title: 'File Extensions',
type: 'short-input',
placeholder: 'e.g., pdf,docx,xlsx',
condition: { field: 'operation', value: 'search' },
mode: 'advanced',
},
{
id: 'contentType',
title: 'Content Type',
type: 'dropdown',
options: [
{ label: 'All', id: '' },
{ label: 'Files', id: 'file' },
{ label: 'Folders', id: 'folder' },
{ label: 'Web Links', id: 'web_link' },
],
value: () => '',
condition: { field: 'operation', value: 'search' },
mode: 'advanced',
},
// Update File fields
{
id: 'newName',
title: 'New Name',
type: 'short-input',
placeholder: 'Rename the file',
condition: { field: 'operation', value: 'update_file' },
},
{
id: 'description',
title: 'Description',
type: 'short-input',
placeholder: 'File description (max 256 chars)',
condition: { field: 'operation', value: 'update_file' },
},
{
id: 'moveToFolderId',
title: 'Move to Folder ID',
type: 'short-input',
placeholder: 'Move file to this folder',
condition: { field: 'operation', value: 'update_file' },
mode: 'advanced',
},
{
id: 'tags',
title: 'Tags',
type: 'short-input',
placeholder: 'Comma-separated tags',
condition: { field: 'operation', value: 'update_file' },
mode: 'advanced',
},
// Delete Folder options
{
id: 'recursive',
title: 'Delete Recursively',
type: 'switch',
condition: { field: 'operation', value: 'delete_folder' },
},
// Shared pagination fields (file operations)
{
id: 'limit',
title: 'Limit',
type: 'short-input',
placeholder: 'Max results per page',
condition: {
field: 'operation',
value: ['list_folder_items', 'search', 'sign_list_requests'],
},
mode: 'advanced',
},
{
id: 'offset',
title: 'Offset',
type: 'short-input',
placeholder: 'Pagination offset',
condition: { field: 'operation', value: ['list_folder_items', 'search'] },
mode: 'advanced',
},
// List Folder sort options
{
id: 'sort',
title: 'Sort By',
type: 'dropdown',
options: [
{ label: 'Default', id: '' },
{ label: 'ID', id: 'id' },
{ label: 'Name', id: 'name' },
{ label: 'Date', id: 'date' },
{ label: 'Size', id: 'size' },
],
value: () => '',
condition: { field: 'operation', value: 'list_folder_items' },
mode: 'advanced',
},
{
id: 'direction',
title: 'Sort Direction',
type: 'dropdown',
options: [
{ label: 'Ascending', id: 'ASC' },
{ label: 'Descending', id: 'DESC' },
],
value: () => 'ASC',
condition: { field: 'operation', value: 'list_folder_items' },
mode: 'advanced',
},
// Sign Request fields
{
id: 'sourceFileIds',
title: 'Source File IDs',
type: 'short-input',
placeholder: 'Comma-separated Box file IDs (e.g., 12345,67890)',
required: { field: 'operation', value: 'sign_create_request' },
condition: { field: 'operation', value: 'sign_create_request' },
},
{
id: 'signerEmail',
title: 'Signer Email',
type: 'short-input',
placeholder: 'Primary signer email address',
required: { field: 'operation', value: 'sign_create_request' },
condition: { field: 'operation', value: 'sign_create_request' },
},
{
id: 'signerRole',
title: 'Signer Role',
type: 'dropdown',
options: [
{ label: 'Signer', id: 'signer' },
{ label: 'Approver', id: 'approver' },
{ label: 'Final Copy Reader', id: 'final_copy_reader' },
],
value: () => 'signer',
condition: { field: 'operation', value: 'sign_create_request' },
},
{
id: 'emailSubject',
title: 'Email Subject',
type: 'short-input',
placeholder: 'Custom email subject line',
condition: { field: 'operation', value: 'sign_create_request' },
},
{
id: 'emailMessage',
title: 'Email Message',
type: 'long-input',
placeholder: 'Custom message in the signing email',
condition: { field: 'operation', value: 'sign_create_request' },
},
{
id: 'signRequestName',
title: 'Request Name',
type: 'short-input',
placeholder: 'Name for this sign request',
condition: { field: 'operation', value: 'sign_create_request' },
},
{
id: 'additionalSigners',
title: 'Additional Signers',
type: 'long-input',
placeholder: '[{"email":"user@example.com","role":"signer"}]',
condition: { field: 'operation', value: 'sign_create_request' },
mode: 'advanced',
},
{
id: 'signParentFolderId',
title: 'Destination Folder ID',
type: 'short-input',
placeholder: 'Box folder ID for signed documents',
condition: { field: 'operation', value: 'sign_create_request' },
mode: 'advanced',
},
{
id: 'daysValid',
title: 'Days Valid',
type: 'short-input',
placeholder: 'Number of days before expiry (0-730)',
condition: { field: 'operation', value: 'sign_create_request' },
mode: 'advanced',
},
{
id: 'areRemindersEnabled',
title: 'Enable Reminders',
type: 'switch',
condition: { field: 'operation', value: 'sign_create_request' },
mode: 'advanced',
},
{
id: 'areTextSignaturesEnabled',
title: 'Allow Text Signatures',
type: 'switch',
condition: { field: 'operation', value: 'sign_create_request' },
mode: 'advanced',
},
{
id: 'signatureColor',
title: 'Signature Color',
type: 'dropdown',
options: [
{ label: 'Blue', id: 'blue' },
{ label: 'Black', id: 'black' },
{ label: 'Red', id: 'red' },
],
value: () => 'blue',
condition: { field: 'operation', value: 'sign_create_request' },
mode: 'advanced',
},
{
id: 'redirectUrl',
title: 'Redirect URL',
type: 'short-input',
placeholder: 'URL to redirect after signing',
condition: { field: 'operation', value: 'sign_create_request' },
mode: 'advanced',
},
{
id: 'declinedRedirectUrl',
title: 'Declined Redirect URL',
type: 'short-input',
placeholder: 'URL to redirect after declining',
condition: { field: 'operation', value: 'sign_create_request' },
mode: 'advanced',
},
{
id: 'isDocumentPreparationNeeded',
title: 'Document Preparation Needed',
type: 'switch',
condition: { field: 'operation', value: 'sign_create_request' },
mode: 'advanced',
},
{
id: 'externalId',
title: 'External ID',
type: 'short-input',
placeholder: 'External system reference ID',
condition: { field: 'operation', value: 'sign_create_request' },
mode: 'advanced',
},
// Sign Request ID (shared by get, cancel, resend)
{
id: 'signRequestId',
title: 'Sign Request ID',
type: 'short-input',
placeholder: 'Box Sign request ID',
required: {
field: 'operation',
value: ['sign_get_request', 'sign_cancel_request', 'sign_resend_request'],
},
condition: {
field: 'operation',
value: ['sign_get_request', 'sign_cancel_request', 'sign_resend_request'],
},
},
// Sign list pagination marker
{
id: 'marker',
title: 'Pagination Marker',
type: 'short-input',
placeholder: 'Marker from previous response',
condition: { field: 'operation', value: 'sign_list_requests' },
mode: 'advanced',
},
],
tools: {
access: [
'box_upload_file',
'box_download_file',
'box_get_file_info',
'box_list_folder_items',
'box_create_folder',
'box_delete_file',
'box_delete_folder',
'box_copy_file',
'box_search',
'box_update_file',
'box_sign_create_request',
'box_sign_get_request',
'box_sign_list_requests',
'box_sign_cancel_request',
'box_sign_resend_request',
],
config: {
tool: (params) => `box_${params.operation}`,
params: (params) => {
const normalizedFile = normalizeFileInput(params.file, { single: true })
if (normalizedFile) {
params.file = normalizedFile
}
const { credential, operation, ...rest } = params
const baseParams: Record<string, unknown> = {
accessToken: credential,
}
switch (operation) {
case 'upload_file':
baseParams.parentFolderId = rest.parentFolderId
baseParams.file = rest.file
if (rest.uploadFileName) baseParams.fileName = rest.uploadFileName
break
case 'download_file':
case 'get_file_info':
case 'delete_file':
baseParams.fileId = rest.fileId
break
case 'list_folder_items':
baseParams.folderId = rest.folderId
if (rest.limit) baseParams.limit = Number(rest.limit)
if (rest.offset) baseParams.offset = Number(rest.offset)
if (rest.sort) baseParams.sort = rest.sort
if (rest.direction) baseParams.direction = rest.direction
break
case 'create_folder':
baseParams.name = rest.folderName
baseParams.parentFolderId = rest.parentFolderId
break
case 'delete_folder':
baseParams.folderId = rest.folderId
if (rest.recursive !== undefined) baseParams.recursive = rest.recursive
break
case 'copy_file':
baseParams.fileId = rest.fileId
baseParams.parentFolderId = rest.parentFolderId
if (rest.copyName) baseParams.name = rest.copyName
break
case 'search':
baseParams.query = rest.query
if (rest.limit) baseParams.limit = Number(rest.limit)
if (rest.offset) baseParams.offset = Number(rest.offset)
if (rest.ancestorFolderId) baseParams.ancestorFolderId = rest.ancestorFolderId
if (rest.fileExtensions) baseParams.fileExtensions = rest.fileExtensions
if (rest.contentType) baseParams.type = rest.contentType
break
case 'update_file':
baseParams.fileId = rest.fileId
if (rest.newName) baseParams.name = rest.newName
if (rest.description !== undefined) baseParams.description = rest.description
if (rest.moveToFolderId) baseParams.parentFolderId = rest.moveToFolderId
if (rest.tags) baseParams.tags = rest.tags
break
case 'sign_create_request':
baseParams.sourceFileIds = rest.sourceFileIds
baseParams.signerEmail = rest.signerEmail
if (rest.signerRole) baseParams.signerRole = rest.signerRole
if (rest.additionalSigners) baseParams.additionalSigners = rest.additionalSigners
if (rest.signParentFolderId) baseParams.parentFolderId = rest.signParentFolderId
if (rest.emailSubject) baseParams.emailSubject = rest.emailSubject
if (rest.emailMessage) baseParams.emailMessage = rest.emailMessage
if (rest.signRequestName) baseParams.name = rest.signRequestName
if (rest.daysValid) baseParams.daysValid = Number(rest.daysValid)
if (rest.areRemindersEnabled !== undefined)
baseParams.areRemindersEnabled = rest.areRemindersEnabled
if (rest.areTextSignaturesEnabled !== undefined)
baseParams.areTextSignaturesEnabled = rest.areTextSignaturesEnabled
if (rest.signatureColor) baseParams.signatureColor = rest.signatureColor
if (rest.redirectUrl) baseParams.redirectUrl = rest.redirectUrl
if (rest.declinedRedirectUrl) baseParams.declinedRedirectUrl = rest.declinedRedirectUrl
if (rest.isDocumentPreparationNeeded !== undefined)
baseParams.isDocumentPreparationNeeded = rest.isDocumentPreparationNeeded
if (rest.externalId) baseParams.externalId = rest.externalId
break
case 'sign_get_request':
case 'sign_cancel_request':
case 'sign_resend_request':
baseParams.signRequestId = rest.signRequestId
break
case 'sign_list_requests':
if (rest.limit) baseParams.limit = Number(rest.limit)
if (rest.marker) baseParams.marker = rest.marker
break
}
return baseParams
},
},
},
inputs: {
operation: { type: 'string', description: 'Operation to perform' },
credential: { type: 'string', description: 'Box OAuth credential' },
file: { type: 'json', description: 'File to upload (canonical param)' },
fileId: { type: 'string', description: 'Box file ID' },
folderId: { type: 'string', description: 'Box folder ID' },
parentFolderId: { type: 'string', description: 'Parent folder ID' },
query: { type: 'string', description: 'Search query' },
sourceFileIds: { type: 'string', description: 'Comma-separated Box file IDs' },
signerEmail: { type: 'string', description: 'Primary signer email address' },
signRequestId: { type: 'string', description: 'Sign request ID' },
},
outputs: {
id: 'string',
name: 'string',
description: 'string',
size: 'number',
sha1: 'string',
createdAt: 'string',
modifiedAt: 'string',
createdBy: 'json',
modifiedBy: 'json',
ownedBy: 'json',
parentId: 'string',
parentName: 'string',
sharedLink: 'json',
tags: 'json',
commentCount: 'number',
file: 'file',
content: 'string',
entries: 'json',
totalCount: 'number',
offset: 'number',
limit: 'number',
results: 'json',
deleted: 'boolean',
message: 'string',
status: 'string',
shortId: 'string',
signers: 'json',
sourceFiles: 'json',
emailSubject: 'string',
emailMessage: 'string',
daysValid: 'number',
autoExpireAt: 'string',
prepareUrl: 'string',
senderEmail: 'string',
signRequests: 'json',
count: 'number',
nextMarker: 'string',
},
}
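Per the commit notes, invalid `additionalSigners` JSON now throws instead of silently dropping signers. A hedged sketch of that validation (the function name and error messages are illustrative assumptions, not the actual `box_sign_create_request` tool source):

```typescript
// Illustrative only: mirrors the behavior described in the commit message
// ("Throw error on invalid additionalSigners JSON instead of silently
// dropping signers"), not the real tool implementation.
interface SignerInput {
  email: string
  role?: 'signer' | 'approver' | 'final_copy_reader'
}

function parseAdditionalSigners(raw: string | undefined): SignerInput[] {
  if (!raw) return [] // field is optional; absent means no extra signers
  let parsed: unknown
  try {
    parsed = JSON.parse(raw)
  } catch {
    // Fail loudly rather than dropping signers on malformed input
    throw new Error('additionalSigners must be valid JSON')
  }
  if (!Array.isArray(parsed)) {
    throw new Error('additionalSigners must be a JSON array')
  }
  return parsed as SignerInput[]
}

console.log(parseAdditionalSigners('[{"email":"user@example.com","role":"signer"}]').length) // 1
```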

View File

@@ -0,0 +1,372 @@
import { DocuSignIcon } from '@/components/icons'
import { getScopesForService } from '@/lib/oauth/utils'
import type { BlockConfig } from '@/blocks/types'
import { AuthMode } from '@/blocks/types'
import { normalizeFileInput } from '@/blocks/utils'
import type { DocuSignResponse } from '@/tools/docusign/types'
export const DocuSignBlock: BlockConfig<DocuSignResponse> = {
type: 'docusign',
name: 'DocuSign',
description: 'Send documents for e-signature via DocuSign',
longDescription:
'Create and send envelopes for e-signature, use templates, check signing status, download signed documents, and manage recipients with DocuSign.',
docsLink: 'https://docs.sim.ai/tools/docusign',
category: 'tools',
bgColor: '#FFFFFF',
icon: DocuSignIcon,
authMode: AuthMode.OAuth,
subBlocks: [
{
id: 'operation',
title: 'Operation',
type: 'dropdown',
options: [
{ label: 'Send Envelope', id: 'send_envelope' },
{ label: 'Send from Template', id: 'create_from_template' },
{ label: 'Get Envelope', id: 'get_envelope' },
{ label: 'List Envelopes', id: 'list_envelopes' },
{ label: 'Void Envelope', id: 'void_envelope' },
{ label: 'Download Document', id: 'download_document' },
{ label: 'List Templates', id: 'list_templates' },
{ label: 'List Recipients', id: 'list_recipients' },
],
value: () => 'send_envelope',
},
{
id: 'credential',
title: 'DocuSign Account',
type: 'oauth-input',
canonicalParamId: 'oauthCredential',
mode: 'basic',
serviceId: 'docusign',
requiredScopes: getScopesForService('docusign'),
placeholder: 'Select DocuSign account',
required: true,
},
{
id: 'manualCredential',
title: 'DocuSign Account',
type: 'short-input',
canonicalParamId: 'oauthCredential',
mode: 'advanced',
placeholder: 'Enter credential ID',
required: true,
},
// Send Envelope fields
{
id: 'emailSubject',
title: 'Email Subject',
type: 'short-input',
placeholder: 'Please sign this document',
condition: { field: 'operation', value: ['send_envelope', 'create_from_template'] },
required: { field: 'operation', value: 'send_envelope' },
},
{
id: 'emailBody',
title: 'Email Body',
type: 'long-input',
placeholder: 'Optional message to include in the email',
condition: { field: 'operation', value: ['send_envelope', 'create_from_template'] },
mode: 'advanced',
},
{
id: 'signerEmail',
title: 'Signer Email',
type: 'short-input',
placeholder: 'signer@example.com',
condition: { field: 'operation', value: 'send_envelope' },
required: { field: 'operation', value: 'send_envelope' },
},
{
id: 'signerName',
title: 'Signer Name',
type: 'short-input',
placeholder: 'John Doe',
condition: { field: 'operation', value: 'send_envelope' },
required: { field: 'operation', value: 'send_envelope' },
},
{
id: 'uploadDocument',
title: 'Document',
type: 'file-upload',
canonicalParamId: 'documentFile',
placeholder: 'Upload document for signature',
mode: 'basic',
multiple: false,
condition: { field: 'operation', value: 'send_envelope' },
},
{
id: 'documentRef',
title: 'Document',
type: 'short-input',
canonicalParamId: 'documentFile',
placeholder: 'Reference file from another block',
mode: 'advanced',
condition: { field: 'operation', value: 'send_envelope' },
},
{
id: 'ccEmail',
title: 'CC Email',
type: 'short-input',
placeholder: 'cc@example.com',
condition: { field: 'operation', value: 'send_envelope' },
mode: 'advanced',
},
{
id: 'ccName',
title: 'CC Name',
type: 'short-input',
placeholder: 'CC recipient name',
condition: { field: 'operation', value: 'send_envelope' },
mode: 'advanced',
},
{
id: 'envelopeStatus',
title: 'Status',
type: 'dropdown',
options: [
{ label: 'Send Immediately', id: 'sent' },
{ label: 'Save as Draft', id: 'created' },
],
value: () => 'sent',
condition: { field: 'operation', value: ['send_envelope', 'create_from_template'] },
mode: 'advanced',
},
// Send from Template fields
{
id: 'templateId',
title: 'Template ID',
type: 'short-input',
placeholder: 'DocuSign template ID',
condition: { field: 'operation', value: 'create_from_template' },
required: { field: 'operation', value: 'create_from_template' },
},
{
id: 'templateRoles',
title: 'Template Roles',
type: 'long-input',
placeholder: '[{"roleName":"Signer","name":"John Doe","email":"john@example.com"}]',
condition: { field: 'operation', value: 'create_from_template' },
required: { field: 'operation', value: 'create_from_template' },
wandConfig: {
enabled: true,
prompt:
'Generate a JSON array of DocuSign template role objects. Each role needs: roleName (must match the template role name), name (full name), email (email address). Return ONLY the JSON array.',
generationType: 'json-object',
},
},
// Envelope ID field (shared across multiple operations)
{
id: 'envelopeId',
title: 'Envelope ID',
type: 'short-input',
placeholder: 'DocuSign envelope ID',
condition: {
field: 'operation',
value: ['get_envelope', 'void_envelope', 'download_document', 'list_recipients'],
},
required: {
field: 'operation',
value: ['get_envelope', 'void_envelope', 'download_document', 'list_recipients'],
},
},
// Void Envelope fields
{
id: 'voidedReason',
title: 'Void Reason',
type: 'short-input',
placeholder: 'Reason for voiding this envelope',
condition: { field: 'operation', value: 'void_envelope' },
required: { field: 'operation', value: 'void_envelope' },
},
// Download Document fields
{
id: 'documentId',
title: 'Document ID',
type: 'short-input',
placeholder: '"combined" for all docs, or specific document ID',
condition: { field: 'operation', value: 'download_document' },
mode: 'advanced',
},
// List Envelopes filters
{
id: 'fromDate',
title: 'From Date',
type: 'short-input',
placeholder: 'ISO 8601 date (defaults to 30 days ago)',
condition: { field: 'operation', value: 'list_envelopes' },
mode: 'advanced',
wandConfig: {
enabled: true,
prompt: 'Generate an ISO 8601 timestamp. Return ONLY the timestamp string.',
generationType: 'timestamp',
},
},
{
id: 'toDate',
title: 'To Date',
type: 'short-input',
placeholder: 'ISO 8601 date',
condition: { field: 'operation', value: 'list_envelopes' },
mode: 'advanced',
wandConfig: {
enabled: true,
prompt: 'Generate an ISO 8601 timestamp. Return ONLY the timestamp string.',
generationType: 'timestamp',
},
},
{
id: 'listEnvelopeStatus',
title: 'Status Filter',
type: 'dropdown',
options: [
{ label: 'All', id: '' },
{ label: 'Created', id: 'created' },
{ label: 'Sent', id: 'sent' },
{ label: 'Delivered', id: 'delivered' },
{ label: 'Completed', id: 'completed' },
{ label: 'Declined', id: 'declined' },
{ label: 'Voided', id: 'voided' },
],
value: () => '',
condition: { field: 'operation', value: 'list_envelopes' },
mode: 'advanced',
},
{
id: 'searchText',
title: 'Search',
type: 'short-input',
placeholder: 'Search envelopes or templates',
condition: { field: 'operation', value: ['list_envelopes', 'list_templates'] },
mode: 'advanced',
},
{
id: 'count',
title: 'Max Results',
type: 'short-input',
placeholder: '25',
condition: { field: 'operation', value: ['list_envelopes', 'list_templates'] },
mode: 'advanced',
},
],
tools: {
access: [
'docusign_send_envelope',
'docusign_create_from_template',
'docusign_get_envelope',
'docusign_list_envelopes',
'docusign_void_envelope',
'docusign_download_document',
'docusign_list_templates',
'docusign_list_recipients',
],
config: {
tool: (params) => `docusign_${params.operation}`,
params: (params) => {
const { oauthCredential, operation, documentFile, listEnvelopeStatus, ...rest } = params
const cleanParams: Record<string, unknown> = {
oauthCredential,
}
const file = normalizeFileInput(documentFile, { single: true })
if (file) {
cleanParams.file = file
}
if (listEnvelopeStatus && operation === 'list_envelopes') {
cleanParams.envelopeStatus = listEnvelopeStatus
}
if (operation === 'send_envelope' || operation === 'create_from_template') {
cleanParams.status = rest.envelopeStatus
}
const excludeKeys = ['envelopeStatus']
for (const [key, value] of Object.entries(rest)) {
if (value !== undefined && value !== null && value !== '' && !excludeKeys.includes(key)) {
cleanParams[key] = value
}
}
return cleanParams
},
},
},
inputs: {
operation: { type: 'string', description: 'Operation to perform' },
oauthCredential: { type: 'string', description: 'DocuSign access token' },
emailSubject: { type: 'string', description: 'Email subject for the envelope' },
emailBody: { type: 'string', description: 'Email body message' },
signerEmail: { type: 'string', description: 'Signer email address' },
signerName: { type: 'string', description: 'Signer full name' },
documentFile: { type: 'string', description: 'Document file for signature' },
ccEmail: { type: 'string', description: 'CC recipient email' },
ccName: { type: 'string', description: 'CC recipient name' },
templateId: { type: 'string', description: 'DocuSign template ID' },
templateRoles: { type: 'string', description: 'JSON array of template roles' },
envelopeId: { type: 'string', description: 'Envelope ID' },
voidedReason: { type: 'string', description: 'Reason for voiding' },
documentId: { type: 'string', description: 'Document ID to download' },
fromDate: { type: 'string', description: 'Start date filter' },
toDate: { type: 'string', description: 'End date filter' },
searchText: { type: 'string', description: 'Search text filter' },
count: { type: 'string', description: 'Max results to return' },
},
outputs: {
envelopeId: { type: 'string', description: 'Envelope ID' },
status: {
type: 'string',
description: 'Envelope status (created, sent, delivered, completed, declined, voided)',
},
statusDateTime: { type: 'string', description: 'ISO 8601 datetime of status change' },
uri: { type: 'string', description: 'Envelope URI path' },
emailSubject: { type: 'string', description: 'Envelope email subject' },
sentDateTime: { type: 'string', description: 'ISO 8601 datetime when envelope was sent' },
completedDateTime: { type: 'string', description: 'ISO 8601 datetime when signing completed' },
createdDateTime: { type: 'string', description: 'ISO 8601 datetime when envelope was created' },
statusChangedDateTime: {
type: 'string',
description: 'ISO 8601 datetime of last status change',
},
voidedReason: { type: 'string', description: 'Reason the envelope was voided' },
signerCount: { type: 'number', description: 'Number of signers on the envelope' },
documentCount: { type: 'number', description: 'Number of documents in the envelope' },
envelopes: {
type: 'json',
description:
'Array of envelopes (envelopeId, status, emailSubject, sentDateTime, completedDateTime, createdDateTime, statusChangedDateTime)',
},
templates: {
type: 'json',
description:
'Array of templates (templateId, name, description, shared, created, lastModified)',
},
signers: {
type: 'json',
description:
'Array of signer recipients (recipientId, name, email, status, signedDateTime, deliveredDateTime)',
},
carbonCopies: {
type: 'json',
description: 'Array of CC recipients (recipientId, name, email, status)',
},
base64Content: { type: 'string', description: 'Base64-encoded document content' },
mimeType: { type: 'string', description: 'Document MIME type' },
fileName: { type: 'string', description: 'Document file name' },
totalSetSize: { type: 'number', description: 'Total matching results' },
resultSetSize: { type: 'number', description: 'Results returned in this response' },
},
}
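For illustration, the cleaner above can be exercised standalone. This is a simplified re-implementation for clarity: the `normalizeFileInput` file handling is omitted, and the block runtime wiring is assumed.

```typescript
type Params = Record<string, unknown>

// Simplified sketch of the DocuSign params() cleaner: keep the credential,
// remap the list-status filter, and drop empty values.
// documentFile handling (normalizeFileInput) is omitted in this sketch.
function cleanDocusignParams(params: Params): Params {
  const { oauthCredential, operation, documentFile, listEnvelopeStatus, ...rest } = params
  const clean: Params = { oauthCredential }
  if (listEnvelopeStatus && operation === 'list_envelopes') {
    clean.envelopeStatus = listEnvelopeStatus
  }
  if (operation === 'create_from_template' || operation === 'send_envelope') {
    clean.status = rest.envelopeStatus
  }
  for (const [key, value] of Object.entries(rest)) {
    if (value !== undefined && value !== null && value !== '' && key !== 'envelopeStatus') {
      clean[key] = value
    }
  }
  return clean
}

const out = cleanDocusignParams({
  oauthCredential: 'token',
  operation: 'list_envelopes',
  listEnvelopeStatus: 'sent',
  searchText: '', // empty advanced field: dropped
  count: '25',
})
// out: { oauthCredential: 'token', envelopeStatus: 'sent', count: '25' }
```

Note how `listEnvelopeStatus` is renamed to the tool-facing `envelopeStatus` only for the list operation, while the raw `envelopeStatus` sub-block value is excluded from the generic copy loop so it is never passed through twice.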


@@ -29,6 +29,7 @@ export const KnowledgeBlock: BlockConfig = {
{ label: 'List Documents', id: 'list_documents' },
{ label: 'Get Document', id: 'get_document' },
{ label: 'Create Document', id: 'create_document' },
{ label: 'Upsert Document', id: 'upsert_document' },
{ label: 'Delete Document', id: 'delete_document' },
{ label: 'List Chunks', id: 'list_chunks' },
{ label: 'Upload Chunk', id: 'upload_chunk' },
@@ -175,14 +176,14 @@ export const KnowledgeBlock: BlockConfig = {
condition: { field: 'operation', value: 'upload_chunk' },
},
-// --- Create Document ---
+// --- Create Document / Upsert Document ---
{
id: 'name',
title: 'Document Name',
type: 'short-input',
placeholder: 'Enter document name',
required: true,
-condition: { field: 'operation', value: 'create_document' },
+condition: { field: 'operation', value: ['create_document', 'upsert_document'] },
},
{
id: 'content',
@@ -191,14 +192,21 @@ export const KnowledgeBlock: BlockConfig = {
placeholder: 'Enter the document content',
rows: 6,
required: true,
-condition: { field: 'operation', value: 'create_document' },
+condition: { field: 'operation', value: ['create_document', 'upsert_document'] },
},
{
id: 'upsertDocumentId',
title: 'Document ID (Optional)',
type: 'short-input',
placeholder: 'Enter existing document ID to update (or leave empty to match by name)',
condition: { field: 'operation', value: 'upsert_document' },
},
{
id: 'documentTags',
title: 'Document Tags',
type: 'document-tag-entry',
dependsOn: ['knowledgeBaseSelector'],
-condition: { field: 'operation', value: 'create_document' },
+condition: { field: 'operation', value: ['create_document', 'upsert_document'] },
},
// --- Update Chunk / Delete Chunk ---
@@ -264,6 +272,7 @@ export const KnowledgeBlock: BlockConfig = {
'knowledge_search',
'knowledge_upload_chunk',
'knowledge_create_document',
'knowledge_upsert_document',
'knowledge_list_tags',
'knowledge_list_documents',
'knowledge_get_document',
@@ -284,6 +293,8 @@ export const KnowledgeBlock: BlockConfig = {
return 'knowledge_upload_chunk'
case 'create_document':
return 'knowledge_create_document'
case 'upsert_document':
return 'knowledge_upsert_document'
case 'list_tags':
return 'knowledge_list_tags'
case 'list_documents':
@@ -355,6 +366,11 @@ export const KnowledgeBlock: BlockConfig = {
if (params.chunkEnabledFilter) params.enabled = params.chunkEnabledFilter
}
// Map upsert sub-block field to tool param
if (params.operation === 'upsert_document' && params.upsertDocumentId) {
params.documentId = String(params.upsertDocumentId).trim()
}
// Convert enabled dropdown string to boolean for update_chunk
if (params.operation === 'update_chunk' && typeof params.enabled === 'string') {
params.enabled = params.enabled === 'true'
@@ -382,6 +398,7 @@ export const KnowledgeBlock: BlockConfig = {
documentTags: { type: 'string', description: 'Document tags' },
chunkSearch: { type: 'string', description: 'Search filter for chunks' },
chunkEnabledFilter: { type: 'string', description: 'Filter chunks by enabled status' },
upsertDocumentId: { type: 'string', description: 'Document ID for upsert operation' },
connectorId: { type: 'string', description: 'Connector identifier' },
},
outputs: {


@@ -0,0 +1,440 @@
import { WorkdayIcon } from '@/components/icons'
import type { BlockConfig } from '@/blocks/types'
export const WorkdayBlock: BlockConfig = {
type: 'workday',
name: 'Workday',
description: 'Manage workers, hiring, onboarding, and HR operations in Workday',
longDescription:
'Integrate Workday HRIS into your workflow. Create pre-hires, hire employees, manage worker profiles, assign onboarding plans, handle job changes, retrieve compensation data, and process terminations.',
docsLink: 'https://docs.sim.ai/tools/workday',
category: 'tools',
bgColor: '#F5F0EB',
icon: WorkdayIcon,
subBlocks: [
{
id: 'operation',
title: 'Operation',
type: 'dropdown',
options: [
{ label: 'Get Worker', id: 'get_worker' },
{ label: 'List Workers', id: 'list_workers' },
{ label: 'Create Pre-Hire', id: 'create_prehire' },
{ label: 'Hire Employee', id: 'hire_employee' },
{ label: 'Update Worker', id: 'update_worker' },
{ label: 'Assign Onboarding Plan', id: 'assign_onboarding' },
{ label: 'Get Organizations', id: 'get_organizations' },
{ label: 'Change Job', id: 'change_job' },
{ label: 'Get Compensation', id: 'get_compensation' },
{ label: 'Terminate Worker', id: 'terminate_worker' },
],
value: () => 'get_worker',
},
{
id: 'tenantUrl',
title: 'Tenant URL',
type: 'short-input',
placeholder: 'https://wd2-impl-services1.workday.com',
required: true,
description: 'Your Workday instance URL (e.g., https://wd2-impl-services1.workday.com)',
},
{
id: 'tenant',
title: 'Tenant Name',
type: 'short-input',
placeholder: 'mycompany',
required: true,
description: 'Workday tenant identifier',
},
{
id: 'username',
title: 'Username',
type: 'short-input',
placeholder: 'ISU username',
required: true,
description: 'Integration System User username',
},
{
id: 'password',
title: 'Password',
type: 'short-input',
placeholder: 'ISU password',
password: true,
required: true,
description: 'Integration System User password',
},
// Get Worker
{
id: 'workerId',
title: 'Worker ID',
type: 'short-input',
placeholder: 'e.g., 3aa5550b7fe348b98d7b5741afc65534',
condition: {
field: 'operation',
value: [
'get_worker',
'update_worker',
'assign_onboarding',
'change_job',
'get_compensation',
'terminate_worker',
],
},
required: {
field: 'operation',
value: [
'get_worker',
'update_worker',
'assign_onboarding',
'change_job',
'get_compensation',
'terminate_worker',
],
},
},
// List Workers
{
id: 'limit',
title: 'Limit',
type: 'short-input',
placeholder: '20',
condition: { field: 'operation', value: ['list_workers', 'get_organizations'] },
mode: 'advanced',
},
{
id: 'offset',
title: 'Offset',
type: 'short-input',
placeholder: '0',
condition: { field: 'operation', value: ['list_workers', 'get_organizations'] },
mode: 'advanced',
},
// Create Pre-Hire
{
id: 'legalName',
title: 'Legal Name',
type: 'short-input',
placeholder: 'e.g., Jane Doe',
condition: { field: 'operation', value: 'create_prehire' },
required: { field: 'operation', value: 'create_prehire' },
},
{
id: 'email',
title: 'Email',
type: 'short-input',
placeholder: 'jane.doe@company.com',
condition: { field: 'operation', value: 'create_prehire' },
},
{
id: 'phoneNumber',
title: 'Phone Number',
type: 'short-input',
placeholder: '+1-555-0100',
condition: { field: 'operation', value: 'create_prehire' },
mode: 'advanced',
},
{
id: 'address',
title: 'Address',
type: 'short-input',
placeholder: '123 Main St, City, State',
condition: { field: 'operation', value: 'create_prehire' },
mode: 'advanced',
},
{
id: 'countryCode',
title: 'Country Code',
type: 'short-input',
placeholder: 'US',
condition: { field: 'operation', value: 'create_prehire' },
mode: 'advanced',
description: 'ISO 3166-1 Alpha-2 country code (defaults to US)',
},
// Hire Employee
{
id: 'preHireId',
title: 'Pre-Hire ID',
type: 'short-input',
placeholder: 'Pre-hire record ID',
condition: { field: 'operation', value: 'hire_employee' },
required: { field: 'operation', value: 'hire_employee' },
},
{
id: 'positionId',
title: 'Position ID',
type: 'short-input',
placeholder: 'Position to assign',
condition: { field: 'operation', value: ['hire_employee', 'change_job'] },
required: { field: 'operation', value: ['hire_employee'] },
},
{
id: 'hireDate',
title: 'Hire Date',
type: 'short-input',
placeholder: 'YYYY-MM-DD',
condition: { field: 'operation', value: 'hire_employee' },
required: { field: 'operation', value: 'hire_employee' },
wandConfig: {
enabled: true,
prompt: 'Generate an ISO 8601 date (YYYY-MM-DD). Return ONLY the date string.',
generationType: 'timestamp',
},
},
{
id: 'jobProfileId',
title: 'Job Profile ID',
type: 'short-input',
placeholder: 'Job profile ID',
condition: { field: 'operation', value: 'change_job' },
mode: 'advanced',
},
{
id: 'locationId',
title: 'Location ID',
type: 'short-input',
placeholder: 'Work location ID',
condition: { field: 'operation', value: 'change_job' },
mode: 'advanced',
},
{
id: 'supervisoryOrgId',
title: 'Supervisory Organization ID',
type: 'short-input',
placeholder: 'Target supervisory organization ID',
condition: { field: 'operation', value: 'change_job' },
mode: 'advanced',
},
{
id: 'employeeType',
title: 'Employee Type',
type: 'dropdown',
options: [
{ label: 'Regular', id: 'Regular' },
{ label: 'Temporary', id: 'Temporary' },
{ label: 'Contractor', id: 'Contractor' },
],
value: () => 'Regular',
condition: { field: 'operation', value: 'hire_employee' },
mode: 'advanced',
},
// Update Worker
{
id: 'fields',
title: 'Fields (JSON)',
type: 'code',
language: 'json',
placeholder:
'{\n "businessTitle": "Senior Engineer",\n "primaryWorkEmail": "new@company.com"\n}',
condition: { field: 'operation', value: 'update_worker' },
required: { field: 'operation', value: 'update_worker' },
wandConfig: {
enabled: true,
maintainHistory: true,
prompt: `Generate a Workday worker update payload as JSON.
### COMMON FIELDS
- businessTitle: Job title string
- primaryWorkEmail: Work email address
- primaryWorkPhone: Work phone number
- managerReference: Manager worker ID
### RULES
- Output ONLY valid JSON starting with { and ending with }
- Include only fields that need updating
### EXAMPLE
User: "Update title to Senior Engineer"
Output: {"businessTitle": "Senior Engineer"}`,
generationType: 'json-object',
},
},
// Assign Onboarding
{
id: 'onboardingPlanId',
title: 'Onboarding Plan ID',
type: 'short-input',
placeholder: 'Plan ID to assign',
condition: { field: 'operation', value: 'assign_onboarding' },
required: { field: 'operation', value: 'assign_onboarding' },
},
{
id: 'actionEventId',
title: 'Action Event ID',
type: 'short-input',
placeholder: 'Hiring event ID that enables onboarding',
condition: { field: 'operation', value: 'assign_onboarding' },
required: { field: 'operation', value: 'assign_onboarding' },
},
// Get Organizations
{
id: 'orgType',
title: 'Organization Type',
type: 'dropdown',
options: [
{ label: 'All Types', id: '' },
{ label: 'Supervisory', id: 'Supervisory' },
{ label: 'Cost Center', id: 'Cost_Center' },
{ label: 'Company', id: 'Company' },
{ label: 'Region', id: 'Region' },
],
value: () => '',
condition: { field: 'operation', value: 'get_organizations' },
},
// Change Job
{
id: 'effectiveDate',
title: 'Effective Date',
type: 'short-input',
placeholder: 'YYYY-MM-DD',
condition: { field: 'operation', value: 'change_job' },
required: { field: 'operation', value: 'change_job' },
wandConfig: {
enabled: true,
prompt: 'Generate an ISO 8601 date (YYYY-MM-DD). Return ONLY the date string.',
generationType: 'timestamp',
},
},
{
id: 'reason',
title: 'Reason',
type: 'short-input',
placeholder: 'e.g., Promotion, Transfer',
condition: { field: 'operation', value: ['change_job', 'terminate_worker'] },
required: { field: 'operation', value: ['change_job', 'terminate_worker'] },
},
// Terminate Worker
{
id: 'terminationDate',
title: 'Termination Date',
type: 'short-input',
placeholder: 'YYYY-MM-DD',
condition: { field: 'operation', value: 'terminate_worker' },
required: { field: 'operation', value: 'terminate_worker' },
wandConfig: {
enabled: true,
prompt: 'Generate an ISO 8601 date (YYYY-MM-DD). Return ONLY the date string.',
generationType: 'timestamp',
},
},
{
id: 'notificationDate',
title: 'Notification Date',
type: 'short-input',
placeholder: 'YYYY-MM-DD',
condition: { field: 'operation', value: 'terminate_worker' },
mode: 'advanced',
},
{
id: 'lastDayOfWork',
title: 'Last Day of Work',
type: 'short-input',
placeholder: 'YYYY-MM-DD (defaults to termination date)',
condition: { field: 'operation', value: 'terminate_worker' },
mode: 'advanced',
},
],
tools: {
access: [
'workday_get_worker',
'workday_list_workers',
'workday_create_prehire',
'workday_hire_employee',
'workday_update_worker',
'workday_assign_onboarding',
'workday_get_organizations',
'workday_change_job',
'workday_get_compensation',
'workday_terminate_worker',
],
config: {
tool: (params) => `workday_${params.operation}`,
params: (params) => {
const { operation, orgType, fields, jobProfileId, locationId, supervisoryOrgId, ...rest } =
params
if (rest.limit != null && rest.limit !== '') rest.limit = Number(rest.limit)
if (rest.offset != null && rest.offset !== '') rest.offset = Number(rest.offset)
if (orgType) rest.type = orgType
if (operation === 'change_job') {
if (rest.positionId) {
rest.newPositionId = rest.positionId
rest.positionId = undefined
}
if (jobProfileId) rest.newJobProfileId = jobProfileId
if (locationId) rest.newLocationId = locationId
if (supervisoryOrgId) rest.newSupervisoryOrgId = supervisoryOrgId
}
if (fields && operation === 'update_worker') {
try {
const parsedFields = typeof fields === 'string' ? JSON.parse(fields) : fields
return { ...rest, fields: parsedFields }
} catch {
throw new Error('Invalid JSON in Fields block')
}
}
return rest
},
},
},
inputs: {
operation: { type: 'string', description: 'Workday operation to perform' },
tenantUrl: { type: 'string', description: 'Workday instance URL' },
tenant: { type: 'string', description: 'Workday tenant name' },
username: { type: 'string', description: 'ISU username' },
password: { type: 'string', description: 'ISU password' },
workerId: { type: 'string', description: 'Worker ID' },
limit: { type: 'number', description: 'Result limit' },
offset: { type: 'number', description: 'Pagination offset' },
legalName: { type: 'string', description: 'Legal name for pre-hire' },
email: { type: 'string', description: 'Email address' },
phoneNumber: { type: 'string', description: 'Phone number' },
address: { type: 'string', description: 'Address' },
countryCode: { type: 'string', description: 'ISO 3166-1 Alpha-2 country code' },
preHireId: { type: 'string', description: 'Pre-hire record ID' },
positionId: { type: 'string', description: 'Position ID' },
hireDate: { type: 'string', description: 'Hire date (YYYY-MM-DD)' },
jobProfileId: { type: 'string', description: 'Job profile ID' },
locationId: { type: 'string', description: 'Location ID' },
supervisoryOrgId: { type: 'string', description: 'Target supervisory organization ID' },
employeeType: { type: 'string', description: 'Employee type' },
fields: { type: 'json', description: 'Fields to update' },
onboardingPlanId: { type: 'string', description: 'Onboarding plan ID' },
actionEventId: { type: 'string', description: 'Action event ID for onboarding' },
orgType: { type: 'string', description: 'Organization type filter' },
effectiveDate: { type: 'string', description: 'Effective date (YYYY-MM-DD)' },
reason: { type: 'string', description: 'Reason for change or termination' },
terminationDate: { type: 'string', description: 'Termination date (YYYY-MM-DD)' },
notificationDate: { type: 'string', description: 'Notification date' },
lastDayOfWork: { type: 'string', description: 'Last day of work' },
},
outputs: {
worker: { type: 'json', description: 'Worker profile data' },
workers: { type: 'json', description: 'Array of worker profiles' },
total: { type: 'number', description: 'Total count of results' },
preHireId: { type: 'string', description: 'Created pre-hire ID' },
descriptor: { type: 'string', description: 'Display name of pre-hire' },
workerId: { type: 'string', description: 'Worker ID' },
employeeId: { type: 'string', description: 'Employee ID' },
hireDate: { type: 'string', description: 'Hire date' },
assignmentId: { type: 'string', description: 'Onboarding assignment ID' },
planId: { type: 'string', description: 'Onboarding plan ID' },
organizations: { type: 'json', description: 'Array of organizations' },
eventId: { type: 'string', description: 'Event ID for staffing changes' },
effectiveDate: { type: 'string', description: 'Effective date of change' },
compensationPlans: { type: 'json', description: 'Compensation plan details' },
terminationDate: { type: 'string', description: 'Termination date' },
},
}
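The change_job remapping in `params()` above can be illustrated in isolation. This is a standalone sketch of just that branch, not the full parameter pipeline:

```typescript
type Params = Record<string, unknown>

// Standalone sketch of the change_job branch: shared sub-block IDs are
// renamed to the new* parameter names the change_job tool expects.
function remapChangeJob(params: Params): Params {
  const { operation, jobProfileId, locationId, supervisoryOrgId, ...rest } = params
  if (operation === 'change_job') {
    if (rest.positionId) {
      rest.newPositionId = rest.positionId
      rest.positionId = undefined
    }
    if (jobProfileId) rest.newJobProfileId = jobProfileId
    if (locationId) rest.newLocationId = locationId
    if (supervisoryOrgId) rest.newSupervisoryOrgId = supervisoryOrgId
  }
  return rest
}

const out = remapChangeJob({
  operation: 'change_job',
  workerId: 'w1',
  positionId: 'p42',
  jobProfileId: 'jp7',
})
// out.newPositionId === 'p42', out.positionId === undefined,
// out.newJobProfileId === 'jp7', out.workerId === 'w1'
```

The rename is needed because `positionId` is a sub-block shared between hire_employee and change_job, while the change_job tool distinguishes the target position as `newPositionId`.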


@@ -13,6 +13,7 @@ import { ArxivBlock } from '@/blocks/blocks/arxiv'
import { AsanaBlock } from '@/blocks/blocks/asana'
import { AshbyBlock } from '@/blocks/blocks/ashby'
import { AttioBlock } from '@/blocks/blocks/attio'
import { BoxBlock } from '@/blocks/blocks/box'
import { BrandfetchBlock } from '@/blocks/blocks/brandfetch'
import { BrowserUseBlock } from '@/blocks/blocks/browser_use'
import { CalComBlock } from '@/blocks/blocks/calcom'
@@ -29,6 +30,7 @@ import { DatabricksBlock } from '@/blocks/blocks/databricks'
import { DatadogBlock } from '@/blocks/blocks/datadog'
import { DevinBlock } from '@/blocks/blocks/devin'
import { DiscordBlock } from '@/blocks/blocks/discord'
import { DocuSignBlock } from '@/blocks/blocks/docusign'
import { DropboxBlock } from '@/blocks/blocks/dropbox'
import { DSPyBlock } from '@/blocks/blocks/dspy'
import { DubBlock } from '@/blocks/blocks/dub'
@@ -188,6 +190,7 @@ import { WebhookRequestBlock } from '@/blocks/blocks/webhook_request'
import { WhatsAppBlock } from '@/blocks/blocks/whatsapp'
import { WikipediaBlock } from '@/blocks/blocks/wikipedia'
import { WordPressBlock } from '@/blocks/blocks/wordpress'
import { WorkdayBlock } from '@/blocks/blocks/workday'
import { WorkflowBlock } from '@/blocks/blocks/workflow'
import { WorkflowInputBlock } from '@/blocks/blocks/workflow_input'
import { XBlock } from '@/blocks/blocks/x'
@@ -215,6 +218,7 @@ export const registry: Record<string, BlockConfig> = {
ashby: AshbyBlock,
attio: AttioBlock,
brandfetch: BrandfetchBlock,
box: BoxBlock,
browser_use: BrowserUseBlock,
calcom: CalComBlock,
calendly: CalendlyBlock,
@@ -232,6 +236,7 @@ export const registry: Record<string, BlockConfig> = {
datadog: DatadogBlock,
devin: DevinBlock,
discord: DiscordBlock,
docusign: DocuSignBlock,
dropbox: DropboxBlock,
dspy: DSPyBlock,
dub: DubBlock,
@@ -408,6 +413,7 @@ export const registry: Record<string, BlockConfig> = {
whatsapp: WhatsAppBlock,
wikipedia: WikipediaBlock,
wordpress: WordPressBlock,
workday: WorkdayBlock,
workflow: WorkflowBlock,
workflow_input: WorkflowInputBlock,
x: XBlock,


@@ -124,6 +124,34 @@ export function NoteIcon(props: SVGProps<SVGSVGElement>) {
)
}
export function WorkdayIcon(props: SVGProps<SVGSVGElement>) {
const id = useId()
const clipId = `workday_clip_${id}`
return (
<svg {...props} viewBox='0 0 64 64' fill='none' xmlns='http://www.w3.org/2000/svg'>
<g clipPath={`url(#${clipId})`} transform='matrix(0.53333333,0,0,0.53333333,-124.63685,-16)'>
<path
fillRule='evenodd'
clipRule='evenodd'
d='m 251.21,88.7755 h 8.224 c 1.166,0 2.178,0.7836 2.444,1.8924 l 11.057,44.6751 c 0.152,0.002 12.182,-44.6393 12.182,-44.6393 0.306,-1.1361 1.36,-1.9282 2.566,-1.9282 h 12.74 c 1.144,0 2.144,0.7515 2.435,1.8296 l 12.118,44.9289 c 0.448,-0.282 11.147,-44.8661 11.147,-44.8661 0.267,-1.1088 1.279,-1.8924 2.444,-1.8924 h 8.219 c 1.649,0 2.854,1.5192 2.437,3.0742 l -15.08,56.3173 c -0.286,1.072 -1.272,1.823 -2.406,1.833 l -12.438,-0.019 c -1.142,-0.002 -2.137,-0.744 -2.429,-1.819 -2.126,-7.805 -12.605,-47.277 -12.605,-47.277 0,0 -11.008,39.471 -13.133,47.277 -0.293,1.075 -1.288,1.817 -2.429,1.819 L 266.264,150 c -1.133,-0.01 -2.119,-0.761 -2.406,-1.833 L 248.777,91.8438 c -0.416,-1.5524 0.786,-3.0683 2.433,-3.0683 z'
fill='#005cb9'
/>
<path
fillRule='evenodd'
clipRule='evenodd'
d='m 333.324,72.2449 c 0.531,0 1.071,-0.0723 1.608,-0.2234 3.18,-0.8968 5.039,-4.2303 4.153,-7.446 -0.129,-0.4673 -0.265,-0.9327 -0.408,-1.3936 C 332.529,43.3349 314.569,30 293.987,30 c -20.557,0 -38.51,13.3133 -44.673,33.1281 -0.136,0.4355 -0.267,0.8782 -0.391,1.3232 -0.902,3.2119 0.943,6.5541 4.12,7.4645 3.173,0.9112 6.48,-0.9547 7.381,-4.1666 0.094,-0.3322 0.19,-0.6616 0.292,-0.9892 4.591,-14.7582 17.961,-24.6707 33.271,-24.6707 15.329,0 28.704,9.9284 33.281,24.7063 0.105,0.3397 0.206,0.682 0.301,1.0263 0.737,2.6726 3.139,4.423 5.755,4.423 z'
fill='#f38b00'
/>
</g>
<defs>
<clipPath id={clipId}>
<path d='M 354,30 H 234 v 120 h 120 z' fill='#ffffff' />
</clipPath>
</defs>
</svg>
)
}
export function WorkflowIcon(props: SVGProps<SVGSVGElement>) {
return (
<svg
@@ -1146,6 +1174,25 @@ export function DevinIcon(props: SVGProps<SVGSVGElement>) {
)
}
export function DocuSignIcon(props: SVGProps<SVGSVGElement>) {
return (
<svg {...props} viewBox='0 0 1547 1549' xmlns='http://www.w3.org/2000/svg'>
<path
d='m1113.4 1114.9v395.6c0 20.8-16.7 37.6-37.5 37.6h-1038.4c-20.7 0-37.5-16.8-37.5-37.6v-1039c0-20.7 16.8-37.5 37.5-37.5h394.3v643.4c0 20.7 16.8 37.5 37.5 37.5z'
fill='#4c00ff'
/>
<path
d='m1546 557.1c0 332.4-193.9 557-432.6 557.8v-418.8c0-12-4.8-24-13.5-31.9l-217.1-217.4c-8.8-8.8-20-13.6-32-13.6h-418.2v-394.8c0-20.8 16.8-37.6 37.5-37.6h585.1c277.7-0.8 490.8 223 490.8 556.3z'
fill='#ff5252'
/>
<path
d='m1099.9 663.4c8.7 8.7 13.5 19.9 13.5 31.9v418.8h-643.3c-20.7 0-37.5-16.8-37.5-37.5v-643.4h418.2c12 0 24 4.8 32 13.6z'
fill='#000000'
/>
</svg>
)
}
export function DiscordIcon(props: SVGProps<SVGSVGElement>) {
return (
<svg
@@ -4569,11 +4616,17 @@ export function ShopifyIcon(props: SVGProps<SVGSVGElement>) {
export function BoxCompanyIcon(props: SVGProps<SVGSVGElement>) {
return (
-    <svg {...props} xmlns='http://www.w3.org/2000/svg' viewBox='0 0 41 22'>
-      <path
-        d='M39.7 19.2c.5.7.4 1.6-.2 2.1-.7.5-1.7.4-2.2-.2l-3.5-4.5-3.4 4.4c-.5.7-1.5.7-2.2.2-.7-.5-.8-1.4-.3-2.1l4-5.2-4-5.2c-.5-.7-.3-1.7.3-2.2.7-.5 1.7-.3 2.2.3l3.4 4.5L37.3 7c.5-.7 1.4-.8 2.2-.3.7.5.7 1.5.2 2.2L35.8 14l3.9 5.2zm-18.2-.6c-2.6 0-4.7-2-4.7-4.6 0-2.5 2.1-4.6 4.7-4.6s4.7 2.1 4.7 4.6c-.1 2.6-2.2 4.6-4.7 4.6zm-13.8 0c-2.6 0-4.7-2-4.7-4.6 0-2.5 2.1-4.6 4.7-4.6s4.7 2.1 4.7 4.6c0 2.6-2.1 4.6-4.7 4.6zM21.5 6.4c-2.9 0-5.5 1.6-6.8 4-1.3-2.4-3.9-4-6.9-4-1.8 0-3.4.6-4.7 1.5V1.5C3.1.7 2.4 0 1.6 0 .7 0 0 .7 0 1.5v12.6c.1 4.2 3.5 7.5 7.7 7.5 3 0 5.6-1.7 6.9-4.1 1.3 2.4 3.9 4.1 6.8 4.1 4.3 0 7.8-3.4 7.8-7.7.1-4.1-3.4-7.5-7.7-7.5z'
-        fill='currentColor'
-      />
+    <svg
+      {...props}
+      xmlns='http://www.w3.org/2000/svg'
+      width='2500'
+      height='1379'
+      viewBox='0 0 444.893 245.414'
+    >
+      <g fill='#0075C9'>
+        <path d='M239.038 72.43c-33.081 0-61.806 18.6-76.322 45.904-14.516-27.305-43.24-45.902-76.32-45.902-19.443 0-37.385 6.424-51.821 17.266V16.925h-.008C34.365 7.547 26.713 0 17.286 0 7.858 0 .208 7.547.008 16.925H0v143.333h.036c.768 47.051 39.125 84.967 86.359 84.967 33.08 0 61.805-18.603 76.32-45.908 14.517 27.307 43.241 45.906 76.321 45.906 47.715 0 86.396-38.684 86.396-86.396.001-47.718-38.682-86.397-86.394-86.397zM86.395 210.648c-28.621 0-51.821-23.201-51.821-51.82 0-28.623 23.201-51.823 51.821-51.823 28.621 0 51.822 23.2 51.822 51.823 0 28.619-23.201 51.82-51.822 51.82zm152.643 0c-28.622 0-51.821-23.201-51.821-51.822 0-28.623 23.2-51.821 51.821-51.821 28.619 0 51.822 23.198 51.822 51.821-.001 28.621-23.203 51.822-51.822 51.822z' />
+        <path d='M441.651 218.033l-44.246-59.143 44.246-59.144-.008-.007c5.473-7.62 3.887-18.249-3.652-23.913-7.537-5.658-18.187-4.221-23.98 3.157l-.004-.002-38.188 51.047-38.188-51.047-.006.009c-5.793-7.385-16.441-8.822-23.981-3.16-7.539 5.664-9.125 16.293-3.649 23.911l-.008.005 44.245 59.144-44.245 59.143.008.005c-5.477 7.62-3.89 18.247 3.649 23.909 7.54 5.664 18.188 4.225 23.981-3.155l.006.007 38.188-51.049 38.188 51.049.004-.002c5.794 7.377 16.443 8.814 23.98 3.154 7.539-5.662 9.125-16.291 3.652-23.91l.008-.008z' />
+      </g>
</svg>
)
}


@@ -0,0 +1,140 @@
---
slug: mothership
title: 'Introducing Mothership'
description: 'Sim v0.6 introduces Mothership—a central intelligence layer for orchestrating your AI agents—alongside Tables, Files, Knowledge Base Connectors, and Scheduled Tasks.'
date: 2026-03-17
updated: 2026-03-17
authors:
- emir
readingTime: 10
tags: [Release, Mothership, Tables, Knowledge Base, Connectors, RAG, Sim]
ogImage: /blog/mothership/cover.png
ogAlt: 'Sim v0.6 release announcement'
about: ['AI Agents', 'Workflow Automation', 'Developer Tools']
timeRequired: PT10M
canonical: https://sim.ai/blog/mothership
featured: true
draft: false
---
I often wonder why AI agents don't already do everything for me today. Why don't they take my meetings for me? Why can't they send follow-ups to customers? Why can't they track the progress of our product launches and just start solving problems for us?
It seems like this is an engineering challenge more than a scientific one. Models today are already remarkably capable at solving complex problems. Look at how far AI coding has come in just two years.
![SWE-bench Performance Over Time](/blog/mothership/swe-bench.png)
In 2023, the best models could solve around 5% of real-world GitHub issues. By early 2025, that number crossed 70%. Today we're approaching 80%. These aren't toy benchmarks—they're actual pull requests on open-source repositories that require multi-file reasoning, debugging, and testing.
If a model can navigate a codebase, diagnose a bug, and write a working fix, why can't it update my CRM after a sales call? Why can't it pull last week's metrics from three different tools and post a summary to Slack every Monday?
The answer isn't intelligence. It's infrastructure. Models need the right workspace around them—persistent data, access to your tools, knowledge from your documents, and the ability to act on a schedule without you in the loop.
That's what we built.
![Workspace](/blog/mothership/workspace.png)
---
## Mothership
![Mothership](/blog/mothership/mothership.png)
Mothership is the control plane for your entire workspace. It's not a chatbot—it's an orchestrator with full context over your workflows, tables, knowledge bases, files, and every integration you've connected.
Ask it to "create a CRM table, seed it with my existing leads data, build a self-healing sync workflow, and schedule it to run daily"—and it does exactly that. It creates the table, writes the rows, wires up the workflow blocks, configures the integrations, tests it, and deploys it. One prompt, end to end.
Mothership can call any of your 100+ integration tools directly—query a CRM, send a Slack message, search a knowledge base, create a Linear issue—all from a single conversation. It can read, write, and edit files in your workspace. It can build entire workflows from a description and deploy them as APIs, chat interfaces, or MCP tools.
When something breaks, it debugs. When you need a report, it writes one. When you describe a workflow in plain English, it builds it, tests it, and ships it.
![Email Inbox](/blog/mothership/email.png)
You can even email your workspace. Enable a Sim inbox, and Mothership reads incoming messages, executes tasks using your integrations and data, and replies. Your workspace becomes an email-driven automation layer—send it a request, and it handles the rest.
This is the difference between an assistant that answers questions and one that actually runs things.
---
## Tables
For agents to do real work, they need somewhere to put things. Not ephemeral context that disappears after a conversation—persistent, structured data they can read, write, and query over time.
Tables give your workspace a database. Create tables with typed columns—string, number, boolean, date, JSON—insert and update rows, query with filters and sorting, and use them as the state layer for any automation. 10+ operations are available as workflow blocks: query, insert, upsert, batch insert, update, delete, get row, get schema.
The Mothership can manage tables directly—creating schemas, seeding data, and querying results as part of a larger task. It's not just storage. It's memory.
A customer support workflow writes new tickets to a table. A lead enrichment pipeline upserts contact data after every sync. A reporting agent queries last week's rows and generates a summary. Tables are how your agents remember things and build on previous work.
Think about why this matters. An agent that forgets everything between runs can't track a product launch. An agent with a table can see what happened yesterday, what's overdue, and what needs attention—and act on it.
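The upsert operation is the core of this memory pattern: a single block call either inserts a new row or updates the existing one by key. Its semantics can be sketched in plain TypeScript (an illustration of the behavior, independent of Sim's actual block API):

```typescript
type Row = Record<string, unknown>

// Generic upsert: update the row whose `key` column matches, else insert.
function upsert(table: Row[], key: string, row: Row): Row[] {
  const i = table.findIndex((r) => r[key] === row[key])
  if (i === -1) return [...table, row] // no match: insert
  const next = [...table]
  next[i] = { ...next[i], ...row } // match: merge update
  return next
}

let leads: Row[] = [{ email: 'a@x.com', score: 10 }]
leads = upsert(leads, 'email', { email: 'a@x.com', score: 25 }) // updates existing row
leads = upsert(leads, 'email', { email: 'b@x.com', score: 5 })  // inserts new row
```

This is why a lead enrichment pipeline can run every night without creating duplicates: each sync converges on one row per contact.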
---
## Files
![Files](/blog/mothership/files.png)
Your workspace has a built-in file system. Upload documents, create new files, and use them across your workflows and the Mothership.
Upload PDFs, spreadsheets, Markdown, JSON, YAML, CSV, HTML, SVG, audio, video—up to 100MB per file. Each format gets a rich preview: Markdown renders with syntax highlighting, HTML and SVG render inline, CSVs display as scrollable data tables, and media files play with native controls.
Edit files directly in the browser with a split-view editor—write on the left, see the rendered preview on the right. The Mothership can also create and edit files programmatically. Ask it to "write a report summarizing last week's metrics" and it creates the Markdown file, populates it, and shows you the preview inline.
CSV files can be imported directly into Tables. Sim auto-detects delimiters, infers column types from the data, and batch-inserts up to 1,000 rows. One click from a raw export to a queryable, workflow-ready table.
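The auto-detection steps can be sketched roughly like this (a deliberately simplified illustration, not Sim's actual importer, which presumably samples many rows and handles quoting):

```typescript
// Pick the delimiter that splits the header into the most columns.
function detectDelimiter(header: string): string {
  const candidates = [',', ';', '\t', '|']
  return candidates.reduce((best, d) =>
    header.split(d).length > header.split(best).length ? d : best
  )
}

// Infer a coarse column type from sample values.
function inferType(values: string[]): 'number' | 'boolean' | 'string' {
  if (values.every((v) => v !== '' && !Number.isNaN(Number(v)))) return 'number'
  if (values.every((v) => v === 'true' || v === 'false')) return 'boolean'
  return 'string'
}

detectDelimiter('name;age;active') // ';'
inferType(['1', '2.5', '3'])       // 'number'
```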
Files give agents the ability to produce real artifacts. Not just chat responses—actual documents, reports, configs, and exports that you and your team can use.
---
## Knowledge Base
![Connectors](/blog/mothership/connectors.png)
Agents need access to what your team already knows—docs, wikis, meeting notes, support articles, design specs. But that knowledge lives scattered across dozens of tools. Without it, agents hallucinate or give generic answers. With it, they're grounded in your actual business context.
Knowledge Base Connectors bring it all together. Connect a source—Google Docs, Notion, Confluence, Slack, GitHub, Jira, Linear, HubSpot, Salesforce, Zendesk, Dropbox, OneDrive, Gmail, Discord, and 15+ more—and Sim handles the rest. Documents are fetched, chunked, embedded, and indexed automatically. Incremental sync keeps everything up to date on a schedule.
The pipeline handles PDFs with OCR, Markdown, JSON, YAML, and plain text. Chunking is configurable—size, overlap, minimum size. Embeddings use OpenAI's text-embedding-3-small with BYOK support. The result is a vector-searchable knowledge base that's instantly available to any workflow block or to Mothership directly.
This is RAG without the setup. Connect a source, and within minutes your agents can search your team's knowledge, retrieve relevant context, and ground their responses in real information. A support agent that can search your actual help docs. A sales assistant that pulls from your real product specs and competitive intel. A research workflow that synthesizes information from your actual documents.
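The chunking step in that pipeline can be sketched as fixed-size windows with overlap. Parameter names here (`chunkSize`, `overlap`) are assumptions for illustration, not Sim's configuration keys:

```typescript
// Minimal fixed-size chunking with overlap, as used in typical RAG pipelines.
// This is an illustrative sketch, not Sim's implementation.
function chunkText(text: string, chunkSize: number, overlap: number): string[] {
  if (overlap >= chunkSize) throw new Error("overlap must be smaller than chunkSize");
  const chunks: string[] = [];
  const step = chunkSize - overlap; // each chunk starts this far after the previous one
  for (let start = 0; start < text.length; start += step) {
    chunks.push(text.slice(start, start + chunkSize));
    if (start + chunkSize >= text.length) break; // last window reached the end
  }
  return chunks;
}
```

Each chunk would then be embedded and indexed; the overlap keeps sentences that straddle a boundary retrievable from either side.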
---
## Scheduled Tasks
Everything above means agents can orchestrate, store data, manage files, and access knowledge. The last piece is autonomy—the ability to act without you in the loop.
Scheduled Tasks let you tell the Mothership what you need done and when. Set up recurring or one-time jobs that execute on a cron schedule, with full access to your integrations, tables, and knowledge bases.
Two modes: **persistent** tasks run indefinitely on their schedule, while **until_complete** tasks poll until a success condition is met, then stop automatically. You also get configurable max runs, support for 40+ timezones, and automatic failure tracking that disables a task after 3 consecutive failures.
**"Every morning at 9am, check my Gmail for new support tickets and create rows in my Tickets table."** A persistent task that runs daily, searches Gmail, and writes structured data to a Table.
**"Every hour, poll the Stripe API for failed payments and send a Slack summary to #billing."** Recurring monitoring across two integrations in a single prompt.
**"Check if the Q1 report has been uploaded to Google Drive. When it appears, summarize it and email the team."** An until_complete task that polls until the file exists, then acts and stops.
**"Every Monday at 8am, pull this week's Linear issues, cross-reference with our product roadmap in Notion, and post a status update to Slack."** Multi-step reasoning across three integrations on a weekly schedule.
Each task keeps its execution history, so the Mothership has context about what it did in previous runs—making each run smarter.
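A sketch of that lifecycle—until_complete tasks stopping on success, any task being disabled after three consecutive failures—might look like this. The field names are illustrative, not Sim's schema:

```typescript
// Illustrative task-state bookkeeping; not Sim's actual scheduler code.
interface TaskState {
  mode: "persistent" | "until_complete";
  enabled: boolean;
  consecutiveFailures: number;
}

function recordRun(task: TaskState, result: "success" | "failure"): TaskState {
  const next = { ...task };
  if (result === "failure") {
    next.consecutiveFailures += 1;
    if (next.consecutiveFailures >= 3) next.enabled = false; // failure tracking kicks in
  } else {
    next.consecutiveFailures = 0; // a success resets the failure streak
    if (next.mode === "until_complete") next.enabled = false; // condition met: stop
  }
  return next;
}
```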
This is the part that closes the gap. You described a task. It runs on schedule. You're not in the loop. The agent is just doing its job.
---
## It All Connects
None of these features exist in isolation. They compose into something larger.
The Mothership queries your Tables, searches your Knowledge Bases, reads and writes your Files, calls your integrations, and runs on a schedule through Scheduled Tasks. A user emails your workspace and the Mothership reads the message, researches the answer across synced Notion docs, stores the result in a Table, and triggers a Slack notification workflow. One control plane, all your data, all your tools.
The reason AI agents can't run your life today isn't that they aren't smart enough. It's that they don't have the right workspace. They need persistent memory, access to your knowledge, the ability to act across your tools, and the autonomy to do it on a schedule.
That's what v0.6 is.
---
## Get Started
Sim v0.6 is available now at [sim.ai](https://sim.ai). Check out our [documentation](https://docs.sim.ai) for detailed guides on Mothership, Tables, Connectors, and more.
*Questions? [help@sim.ai](mailto:help@sim.ai) · [Discord](https://sim.ai/discord)*


@@ -0,0 +1,335 @@
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>What's New at Sim</title>
</head>
<body style="margin:0;padding:0;background-color:#ffffff;">
<table width="100%" cellspacing="0" cellpadding="0" border="0" role="presentation" style="background-color:#ffffff;">
<tr>
<td align="center" style="padding:0 16px;">
<table width="600" cellspacing="0" cellpadding="0" border="0" role="presentation" style="max-width:600px;width:100%;">
<!-- Logo -->
<tr>
<td align="center" style="padding-top:32px;padding-bottom:16px;">
<a href="https://sim.ai" style="color:#000;text-decoration:none;" target="_blank">
<img src="https://sim.ai/email/broadcast/v0.6/logo.png" width="79" alt="Sim Logo" style="display:block;width:79px;" border="0">
</a>
</td>
</tr>
<!-- Intro Paragraph -->
<tr>
<td align="left" style="color:#404040;font-family:-apple-system,'SF Pro Display','SF Pro Text','Helvetica',sans-serif;font-size:16px;line-height:24px;word-break:break-word;padding-bottom:20px;padding-top:8px;text-align:left;">
<p style="margin:0;">Introducing <strong>Sim v0.6</strong>—your workspace is now a living system. The <strong>Mothership</strong> is the control plane for everything you build: an AI that understands your workflows, your data, and your tools, and can act on all of them. Alongside it, we're shipping <strong>Tables</strong> for structured data and <strong>Knowledge Base Connectors</strong> that automatically sync from 30+ sources.</p>
</td>
</tr>
<!-- CTA Button -->
<tr>
<td align="center" style="padding-bottom:25px;padding-top:5px;">
<table cellspacing="0" cellpadding="0" border="0" role="presentation">
<tr>
<td align="center" style="background-color:#32bd7e;border-radius:5px;">
<a href="https://sim.ai" style="display:inline-block;font-weight:500;font-size:14px;padding:7px 16px;text-decoration:none;color:#ffffff;font-family:-apple-system,'SF Pro Display','SF Pro Text','Helvetica',sans-serif;" target="_blank">Try Sim</a>
</td>
</tr>
</table>
</td>
</tr>
<!-- Header Text -->
<tr>
<td align="left" valign="top" style="color:#2d2d2d;font-family:-apple-system,'SF Pro Display','SF Pro Text','Helvetica',sans-serif;word-break:break-word;padding-top:10px;padding-bottom:8px;text-align:left;">
<h2 style="margin:0;color:#2d2d2d;font-family:-apple-system,'SF Pro Display','SF Pro Text','Helvetica',sans-serif;font-size:28px;font-weight:600;word-break:break-word;">One control plane, <em>all your data</em></h2>
</td>
</tr>
<tr>
<td align="left" style="color:#404040;font-family:-apple-system,'SF Pro Display','SF Pro Text','Helvetica',sans-serif;font-size:16px;line-height:24px;word-break:break-word;padding-bottom:30px;text-align:left;">
<p style="margin:0;">The Mothership orchestrates your workflows, tables, knowledge bases, and integrations from a single conversation.</p>
</td>
</tr>
<!-- FEATURE 1: Mothership -->
<tr>
<td align="left" style="color:#2d2d2d;font-family:-apple-system,'SF Pro Display','SF Pro Text','Helvetica',sans-serif;font-size:18px;word-break:break-word;padding-top:10px;padding-bottom:12px;text-align:left;">
<strong>The Mothership</strong>
</td>
</tr>
<tr>
<td align="center" style="border-radius:8px;font-size:0;line-height:0;">
<a href="https://sim.ai" style="color:#000;text-decoration:none;" target="_blank">
<!-- TODO: Replace with actual mothership screenshot -->
<img src="https://sim.ai/email/broadcast/v0.6/mothership.jpg" width="570" alt="The Mothership" style="display:block;width:100%;border-radius:8px;" border="0">
</a>
</td>
</tr>
<tr>
<td align="left" style="color:#404040;font-family:-apple-system,'SF Pro Display','SF Pro Text','Helvetica',sans-serif;font-size:16px;line-height:24px;word-break:break-word;padding-bottom:25px;padding-top:12px;text-align:left;">
<!-- TODO: Write final copy -->
<p style="margin:0;">The Mothership is the central intelligence of your Sim workspace. It already knows your workflows, tables, knowledge bases, files, and credentials—no context needed. Ask it to query a CRM, search your synced documents, edit a file, update a table, or build an entire workflow from a single prompt. It opens resources in a split-view panel right next to the chat so you can see and edit your data live. Email your workspace and it reads the message, runs tasks, and replies. It's not an assistant—it's the operating system for your AI workspace.</p>
</td>
</tr>
<!-- FEATURE 2: Build from Chat -->
<tr>
<td align="left" style="color:#2d2d2d;font-family:-apple-system,'SF Pro Display','SF Pro Text','Helvetica',sans-serif;font-size:18px;word-break:break-word;padding-top:20px;padding-bottom:12px;text-align:left;">
<strong>Build, Test, and Deploy from Chat</strong>
</td>
</tr>
<tr>
<td align="center">
<a href="https://sim.ai" style="text-decoration:none;display:block;" target="_blank">
<table cellspacing="0" cellpadding="0" border="0" role="presentation" width="100%" style="background-color:#181C1E;border-radius:8px;">
<tr>
<td align="center" style="padding:40px 30px;">
<table cellspacing="0" cellpadding="0" border="0" role="presentation">
<tr>
<td align="center" style="padding-bottom:16px;">
<!-- TODO: Replace with build icon -->
<img src="https://sim.ai/email/broadcast/v0.6/build.png" width="48" height="48" alt="Build" style="display:block;" border="0">
</td>
</tr>
<tr>
<td align="center">
<p style="margin:0;color:#ffffff;font-family:-apple-system,'SF Pro Display','SF Pro Text','Helvetica',sans-serif;font-size:20px;font-weight:500;">Describe it in words, ship it in seconds</p>
</td>
</tr>
</table>
</td>
</tr>
</table>
</a>
</td>
</tr>
<tr>
<td align="left" style="color:#404040;font-family:-apple-system,'SF Pro Display','SF Pro Text','Helvetica',sans-serif;font-size:16px;line-height:24px;word-break:break-word;padding-bottom:25px;padding-top:12px;text-align:left;">
<!-- TODO: Write final copy -->
<p style="margin:0;">Tell the Mothership what you need: "Build a workflow that monitors GitHub issues, searches our docs, and posts a suggested fix to Slack." It plans the workflow, wires the blocks, configures your integrations, tests the result, and deploys it as an API, chat UI, or MCP tool—all from a single conversation. When something breaks, the debug agent diagnoses the issue and suggests a fix. No drag-and-drop required.</p>
</td>
</tr>
<!-- FEATURE 3: Tables & Files -->
<tr>
<td align="left" style="color:#2d2d2d;font-family:-apple-system,'SF Pro Display','SF Pro Text','Helvetica',sans-serif;font-size:18px;word-break:break-word;padding-top:20px;padding-bottom:12px;text-align:left;">
<strong>Tables & Files</strong>
</td>
</tr>
<tr>
<td align="center" style="border-radius:8px;font-size:0;line-height:0;">
<a href="https://sim.ai" style="color:#000;text-decoration:none;" target="_blank">
<!-- TODO: Replace with actual tables screenshot -->
<img src="https://sim.ai/email/broadcast/v0.6/tables.jpg" width="570" alt="Tables & Files" style="display:block;width:100%;border-radius:8px;" border="0">
</a>
</td>
</tr>
<tr>
<td align="left" style="color:#404040;font-family:-apple-system,'SF Pro Display','SF Pro Text','Helvetica',sans-serif;font-size:16px;line-height:24px;word-break:break-word;padding-bottom:25px;padding-top:12px;text-align:left;">
<!-- TODO: Write final copy -->
<p style="margin:0;">Tables give your workflows persistent, structured storage—no external database needed. Create typed columns, query with filters, insert and upsert rows, or batch operations. Import CSVs directly into Tables with automatic column type inference. Your workspace also has a built-in file system: upload PDFs, Markdown, JSON, HTML, SVGs, and more—preview and edit them in the browser with a split-view editor, or let the Mothership read and write them programmatically. Everything is accessible via REST API.</p>
</td>
</tr>
<!-- FEATURE 3: KB Connectors -->
<tr>
<td align="left" style="color:#2d2d2d;font-family:-apple-system,'SF Pro Display','SF Pro Text','Helvetica',sans-serif;font-size:18px;word-break:break-word;padding-top:20px;padding-bottom:12px;text-align:left;">
<strong>Knowledge Base Connectors</strong>
</td>
</tr>
<tr>
<td align="center">
<a href="https://sim.ai" style="text-decoration:none;display:block;" target="_blank">
<table cellspacing="0" cellpadding="0" border="0" role="presentation" width="100%" style="background-color:#181C1E;border-radius:8px;">
<tr>
<td align="center" style="padding:40px 30px;">
<table cellspacing="0" cellpadding="0" border="0" role="presentation">
<tr>
<td align="center" style="padding-bottom:16px;">
<!-- TODO: Replace with connector icon -->
<img src="https://sim.ai/email/broadcast/v0.6/connectors.png" width="48" height="48" alt="Connectors" style="display:block;" border="0">
</td>
</tr>
<tr>
<td align="center">
<p style="margin:0;color:#ffffff;font-family:-apple-system,'SF Pro Display','SF Pro Text','Helvetica',sans-serif;font-size:20px;font-weight:500;">30+ sources, automatic sync, instant RAG</p>
</td>
</tr>
</table>
</td>
</tr>
</table>
</a>
</td>
</tr>
<tr>
<td align="left" style="color:#404040;font-family:-apple-system,'SF Pro Display','SF Pro Text','Helvetica',sans-serif;font-size:16px;line-height:24px;word-break:break-word;padding-bottom:25px;padding-top:12px;text-align:left;">
<!-- TODO: Write final copy -->
<p style="margin:0;">Connect Google Docs, Notion, Confluence, Slack, GitHub, Jira, HubSpot, Salesforce, and 20+ more sources to your knowledge bases. We handle the rest—fetching documents on a schedule, chunking content, generating embeddings, and keeping everything up to date with incremental sync. Search semantically with tag filtering across any connected source, directly from your workflows or the Mothership.</p>
</td>
</tr>
<!-- FEATURE 4: Scheduled Tasks -->
<tr>
<td align="left" style="color:#2d2d2d;font-family:-apple-system,'SF Pro Display','SF Pro Text','Helvetica',sans-serif;font-size:18px;word-break:break-word;padding-top:20px;padding-bottom:12px;text-align:left;">
<strong>Scheduled Tasks</strong>
</td>
</tr>
<tr>
<td align="center" style="border-radius:8px;font-size:0;line-height:0;">
<a href="https://sim.ai" style="color:#000;text-decoration:none;" target="_blank">
<!-- TODO: Replace with actual scheduled tasks screenshot -->
<img src="https://sim.ai/email/broadcast/v0.6/scheduled-tasks.jpg" width="570" alt="Scheduled Tasks" style="display:block;width:100%;border-radius:8px;" border="0">
</a>
</td>
</tr>
<tr>
<td align="left" style="color:#404040;font-family:-apple-system,'SF Pro Display','SF Pro Text','Helvetica',sans-serif;font-size:16px;line-height:24px;word-break:break-word;padding-bottom:25px;padding-top:12px;text-align:left;">
<!-- TODO: Write final copy -->
<p style="margin:0;">Tell the Mothership what you need done and when. Schedule recurring jobs—"every morning, check Gmail for support tickets and add them to my table"—or one-time tasks that poll until a condition is met. The Mothership executes autonomously with full access to your integrations, tracks its own history across runs, and stops itself when the job is done.</p>
</td>
</tr>
<!-- Divider -->
<tr>
<td align="center" style="padding-bottom:10px;padding-top:10px;">
<table width="100%" cellspacing="0" cellpadding="0" border="0" role="presentation" height="1" style="border-top:1px solid #ededed;">
<tr>
<td height="0" style="font-size:0;line-height:0;">&#173;</td>
</tr>
</table>
</td>
</tr>
<!-- Final CTA -->
<tr>
<td align="left" style="color:#404040;font-family:-apple-system,'SF Pro Display','SF Pro Text','Helvetica',sans-serif;font-size:16px;line-height:24px;word-break:break-word;padding-bottom:15px;padding-top:15px;text-align:left;">
<p style="margin:0;">Ready to build? Tables, Connectors, Scheduled Tasks, and the Mothership are available now in Sim.</p>
</td>
</tr>
<tr>
<td align="center" style="padding-bottom:30px;padding-top:15px;">
<table cellspacing="0" cellpadding="0" border="0" role="presentation">
<tr>
<td align="center" style="background-color:#32bd7e;border-radius:5px;">
<a href="https://sim.ai" style="display:inline-block;font-weight:500;font-size:14px;padding:7px 16px;text-decoration:none;color:#ffffff;font-family:-apple-system,'SF Pro Display','SF Pro Text','Helvetica',sans-serif;" target="_blank">Get Started</a>
</td>
</tr>
</table>
</td>
</tr>
<!-- Footer Divider -->
<tr>
<td align="center" style="padding-top:20px;padding-bottom:20px;">
<table width="100%" cellspacing="0" cellpadding="0" border="0" role="presentation" height="1" style="border-top:1px solid #ededed;">
<tr>
<td height="0" style="font-size:0;line-height:0;">&#173;</td>
</tr>
</table>
</td>
</tr>
<!-- Social links row -->
<tr>
<td align="center">
<table cellspacing="0" cellpadding="0" border="0" role="presentation">
<tbody>
<tr>
<td align="center" style="padding:0 8px 0 0;">
<a href="https://sim.ai/x" rel="noopener noreferrer" target="_blank">
<img src="https://sim.ai/static/x-icon.png" width="20" height="20" alt="X" style="display:block;" border="0">
</a>
</td>
<td align="center" style="padding:0 8px;">
<a href="https://sim.ai/discord" rel="noopener noreferrer" target="_blank">
<img src="https://sim.ai/static/discord-icon.png" width="20" height="20" alt="Discord" style="display:block;" border="0">
</a>
</td>
<td align="center" style="padding:0 8px;">
<a href="https://sim.ai/github" rel="noopener noreferrer" target="_blank">
<img src="https://sim.ai/static/github-icon.png" width="20" height="20" alt="GitHub" style="display:block;" border="0">
</a>
</td>
</tr>
</tbody>
</table>
</td>
</tr>
<!-- Spacer -->
<tr>
<td height="16" style="font-size:1px;line-height:1px;">&nbsp;</td>
</tr>
<!-- Address row -->
<tr>
<td align="center" style="font-size:12px;line-height:20px;color:#737373;font-family:-apple-system,'SF Pro Display','SF Pro Text','Helvetica',sans-serif;margin:0;">
Sim, 80 Langton St, San Francisco, CA 94103, USA
</td>
</tr>
<!-- Spacer -->
<tr>
<td height="8" style="font-size:1px;line-height:1px;">&nbsp;</td>
</tr>
<!-- Contact row -->
<tr>
<td align="center" style="font-size:12px;line-height:20px;color:#737373;font-family:-apple-system,'SF Pro Display','SF Pro Text','Helvetica',sans-serif;margin:0;">
Questions? <a href="mailto:support@sim.ai" style="color:#737373;text-decoration:underline;font-weight:normal;">support@sim.ai</a>
</td>
</tr>
<!-- Spacer -->
<tr>
<td height="8" style="font-size:1px;line-height:1px;">&nbsp;</td>
</tr>
<!-- Links row -->
<tr>
<td align="center" style="font-size:12px;line-height:20px;color:#737373;font-family:-apple-system,'SF Pro Display','SF Pro Text','Helvetica',sans-serif;margin:0;">
<a href="https://sim.ai/privacy" style="color:#737373;text-decoration:underline;font-weight:normal;" rel="noopener noreferrer" target="_blank">Privacy Policy</a>
&nbsp;&bull;&nbsp;
<a href="https://sim.ai/terms" style="color:#737373;text-decoration:underline;font-weight:normal;" rel="noopener noreferrer" target="_blank">Terms of Service</a>
&nbsp;&bull;&nbsp;
<a href="{{{RESEND_UNSUBSCRIBE_URL}}}" style="color:#737373;text-decoration:underline;font-weight:normal;" rel="noopener noreferrer" target="_blank">Unsubscribe</a>
</td>
</tr>
<!-- Spacer -->
<tr>
<td height="16" style="font-size:1px;line-height:1px;">&nbsp;</td>
</tr>
<!-- Copyright row -->
<tr>
<td align="center" style="font-size:12px;line-height:20px;color:#737373;font-family:-apple-system,'SF Pro Display','SF Pro Text','Helvetica',sans-serif;margin:0;">
&copy; 2026 Sim, All Rights Reserved
</td>
</tr>
<!-- Bottom spacer -->
<tr>
<td height="32" style="font-size:1px;line-height:1px;">&nbsp;</td>
</tr>
</table>
</td>
</tr>
</table>
</body>
</html>


@@ -92,6 +92,57 @@ describe('DAGBuilder disabled subflow validation', () => {
// Should not throw - loop is effectively disabled since all inner blocks are disabled
expect(() => builder.build(workflow)).not.toThrow()
})
it('does not mutate serialized loop config nodes during DAG build', () => {
const workflow: SerializedWorkflow = {
version: '1',
blocks: [
createBlock('start', BlockType.STARTER),
createBlock('loop-1', BlockType.LOOP),
{ ...createBlock('inner-block', BlockType.FUNCTION), enabled: false },
],
connections: [{ source: 'start', target: 'loop-1' }],
loops: {
'loop-1': {
id: 'loop-1',
nodes: ['inner-block'],
iterations: 3,
},
},
parallels: {},
}
const builder = new DAGBuilder()
builder.build(workflow)
expect(workflow.loops?.['loop-1']?.nodes).toEqual(['inner-block'])
})
it('does not mutate serialized parallel config nodes during DAG build', () => {
const workflow: SerializedWorkflow = {
version: '1',
blocks: [
createBlock('start', BlockType.STARTER),
createBlock('parallel-1', BlockType.PARALLEL),
{ ...createBlock('inner-block', BlockType.FUNCTION), enabled: false },
],
connections: [{ source: 'start', target: 'parallel-1' }],
loops: {},
parallels: {
'parallel-1': {
id: 'parallel-1',
nodes: ['inner-block'],
count: 2,
parallelType: 'count',
},
},
}
const builder = new DAGBuilder()
builder.build(workflow)
expect(workflow.parallels?.['parallel-1']?.nodes).toEqual(['inner-block'])
})
})
describe('DAGBuilder nested parallel support', () => {


@@ -113,13 +113,19 @@ export class DAGBuilder {
private initializeConfigs(workflow: SerializedWorkflow, dag: DAG): void {
if (workflow.loops) {
for (const [loopId, loopConfig] of Object.entries(workflow.loops)) {
-        dag.loopConfigs.set(loopId, loopConfig)
+        dag.loopConfigs.set(loopId, {
+          ...loopConfig,
+          nodes: [...(loopConfig.nodes ?? [])],
+        })
}
}
if (workflow.parallels) {
for (const [parallelId, parallelConfig] of Object.entries(workflow.parallels)) {
-        dag.parallelConfigs.set(parallelId, parallelConfig)
+        dag.parallelConfigs.set(parallelId, {
+          ...parallelConfig,
+          nodes: [...(parallelConfig.nodes ?? [])],
+        })
}
}
}
@@ -150,9 +156,7 @@ export class DAGBuilder {
if (!sentinelStartNode) return
if (!nodes || nodes.length === 0) {
-      throw new Error(
-        `${type} has no blocks inside. Add at least one block to the ${type.toLowerCase()}.`
-      )
+      return
}
const hasConnections = Array.from(sentinelStartNode.outgoingEdges.values()).some((edge) =>


@@ -8,24 +8,21 @@ const logger = createLogger('LoopConstructor')
export class LoopConstructor {
execute(dag: DAG, reachableBlocks: Set<string>): void {
for (const [loopId, loopConfig] of dag.loopConfigs) {
-      const loopNodes = loopConfig.nodes
-      if (loopNodes.length === 0) {
+      if (!reachableBlocks.has(loopId)) {
         continue
       }
-      if (!this.hasReachableNodes(loopNodes, reachableBlocks)) {
-        continue
+      const loopNodes = loopConfig.nodes
+      const hasReachableChildren = loopNodes.some((nodeId) => reachableBlocks.has(nodeId))
+      if (!hasReachableChildren) {
+        loopConfig.nodes = []
       }
       this.createSentinelPair(dag, loopId)
     }
   }

-  private hasReachableNodes(loopNodes: string[], reachableBlocks: Set<string>): boolean {
-    return loopNodes.some((nodeId) => reachableBlocks.has(nodeId))
-  }
private createSentinelPair(dag: DAG, loopId: string): void {
const startId = buildSentinelStartId(loopId)
const endId = buildSentinelEndId(loopId)


@@ -11,24 +11,21 @@ const logger = createLogger('ParallelConstructor')
export class ParallelConstructor {
execute(dag: DAG, reachableBlocks: Set<string>): void {
for (const [parallelId, parallelConfig] of dag.parallelConfigs) {
-      const parallelNodes = parallelConfig.nodes
-      if (parallelNodes.length === 0) {
+      if (!reachableBlocks.has(parallelId)) {
         continue
       }
-      if (!this.hasReachableNodes(parallelNodes, reachableBlocks)) {
-        continue
+      const parallelNodes = parallelConfig.nodes
+      const hasReachableChildren = parallelNodes.some((nodeId) => reachableBlocks.has(nodeId))
+      if (!hasReachableChildren) {
+        parallelConfig.nodes = []
       }
       this.createSentinelPair(dag, parallelId)
     }
   }

-  private hasReachableNodes(parallelNodes: string[], reachableBlocks: Set<string>): boolean {
-    return parallelNodes.some((nodeId) => reachableBlocks.has(nodeId))
-  }
private createSentinelPair(dag: DAG, parallelId: string): void {
const startId = buildParallelSentinelStartId(parallelId)
const endId = buildParallelSentinelEndId(parallelId)


@@ -77,7 +77,7 @@ export class BlockExecutor {
if (!isSentinel) {
blockLog = this.createBlockLog(ctx, node.id, block, node)
ctx.blockLogs.push(blockLog)
-        this.callOnBlockStart(ctx, node, block, blockLog.executionOrder)
+        await this.callOnBlockStart(ctx, node, block, blockLog.executionOrder)
}
const startTime = performance.now()
@@ -105,7 +105,7 @@ export class BlockExecutor {
}
} catch (error) {
cleanupSelfReference?.()
-      return this.handleBlockError(
+      return await this.handleBlockError(
error,
ctx,
node,
@@ -179,7 +179,7 @@ export class BlockExecutor {
const displayOutput = filterOutputForLog(block.metadata?.id || '', normalizedOutput, {
block,
})
-      this.callOnBlockComplete(
+      await this.callOnBlockComplete(
ctx,
node,
block,
@@ -195,7 +195,7 @@ export class BlockExecutor {
return normalizedOutput
} catch (error) {
-      return this.handleBlockError(
+      return await this.handleBlockError(
error,
ctx,
node,
@@ -226,7 +226,7 @@ export class BlockExecutor {
return this.blockHandlers.find((h) => h.canHandle(block))
}
-  private handleBlockError(
+  private async handleBlockError(
error: unknown,
ctx: ExecutionContext,
node: DAGNode,
@@ -236,7 +236,7 @@ export class BlockExecutor {
resolvedInputs: Record<string, any>,
isSentinel: boolean,
phase: 'input_resolution' | 'execution'
-  ): NormalizedBlockOutput {
+  ): Promise<NormalizedBlockOutput> {
const duration = performance.now() - startTime
const errorMessage = normalizeError(error)
const hasResolvedInputs =
@@ -287,7 +287,7 @@ export class BlockExecutor {
? error.childWorkflowInstanceId
: undefined
const displayOutput = filterOutputForLog(block.metadata?.id || '', errorOutput, { block })
-    this.callOnBlockComplete(
+    await this.callOnBlockComplete(
ctx,
node,
block,
@@ -439,12 +439,12 @@ export class BlockExecutor {
return redactApiKeys(result)
}
-  private callOnBlockStart(
+  private async callOnBlockStart(
ctx: ExecutionContext,
node: DAGNode,
block: SerializedBlock,
executionOrder: number
-  ): void {
+  ): Promise<void> {
const blockId = node.metadata?.originalBlockId ?? node.id
const blockName = block.metadata?.name ?? blockId
const blockType = block.metadata?.id ?? DEFAULTS.BLOCK_TYPE
@@ -452,18 +452,26 @@ export class BlockExecutor {
const iterationContext = getIterationContext(ctx, node?.metadata)
if (this.contextExtensions.onBlockStart) {
-      this.contextExtensions.onBlockStart(
-        blockId,
-        blockName,
-        blockType,
-        executionOrder,
-        iterationContext,
-        ctx.childWorkflowContext
-      )
+      try {
+        await this.contextExtensions.onBlockStart(
+          blockId,
+          blockName,
+          blockType,
+          executionOrder,
+          iterationContext,
+          ctx.childWorkflowContext
+        )
+      } catch (error) {
+        logger.warn('Block start callback failed', {
+          blockId,
+          blockType,
+          error: error instanceof Error ? error.message : String(error),
+        })
+      }
}
}
-  private callOnBlockComplete(
+  private async callOnBlockComplete(
ctx: ExecutionContext,
node: DAGNode,
block: SerializedBlock,
@@ -474,7 +482,7 @@ export class BlockExecutor {
executionOrder: number,
endedAt: string,
childWorkflowInstanceId?: string
-  ): void {
+  ): Promise<void> {
const blockId = node.metadata?.originalBlockId ?? node.id
const blockName = block.metadata?.name ?? blockId
const blockType = block.metadata?.id ?? DEFAULTS.BLOCK_TYPE
@@ -482,22 +490,30 @@ export class BlockExecutor {
const iterationContext = getIterationContext(ctx, node?.metadata)
if (this.contextExtensions.onBlockComplete) {
-      this.contextExtensions.onBlockComplete(
-        blockId,
-        blockName,
-        blockType,
-        {
-          input,
-          output,
-          executionTime: duration,
-          startedAt,
-          executionOrder,
-          endedAt,
-          childWorkflowInstanceId,
-        },
-        iterationContext,
-        ctx.childWorkflowContext
-      )
+      try {
+        await this.contextExtensions.onBlockComplete(
+          blockId,
+          blockName,
+          blockType,
+          {
+            input,
+            output,
+            executionTime: duration,
+            startedAt,
+            executionOrder,
+            endedAt,
+            childWorkflowInstanceId,
+          },
+          iterationContext,
+          ctx.childWorkflowContext
+        )
+      } catch (error) {
+        logger.warn('Block completion callback failed', {
+          blockId,
+          blockType,
+          error: error instanceof Error ? error.message : String(error),
+        })
+      }
}
}


@@ -51,11 +51,34 @@ export class LoopOrchestrator {
private edgeManager: EdgeManager | null = null
) {}
-  initializeLoopScope(ctx: ExecutionContext, loopId: string): LoopScope {
+  async initializeLoopScope(ctx: ExecutionContext, loopId: string): Promise<LoopScope> {
const loopConfig = this.dag.loopConfigs.get(loopId) as SerializedLoop | undefined
if (!loopConfig) {
throw new Error(`Loop config not found: ${loopId}`)
}
if (loopConfig.nodes.length === 0) {
const errorMessage =
'Loop has no executable blocks inside. Add or enable at least one block in the loop.'
const loopType = loopConfig.loopType || 'for'
logger.error(errorMessage, { loopId })
await this.addLoopErrorLog(ctx, loopId, loopType, errorMessage, {})
const errorScope: LoopScope = {
iteration: 0,
maxIterations: 0,
loopType,
currentIterationOutputs: new Map(),
allIterationOutputs: [],
condition: 'false',
validationError: errorMessage,
}
if (!ctx.loopExecutions) {
ctx.loopExecutions = new Map()
}
ctx.loopExecutions.set(loopId, errorScope)
throw new Error(errorMessage)
}
const scope: LoopScope = {
iteration: 0,
currentIterationOutputs: new Map(),
@@ -76,7 +99,7 @@ export class LoopOrchestrator {
)
if (iterationError) {
logger.error(iterationError, { loopId, requestedIterations })
-      this.addLoopErrorLog(ctx, loopId, loopType, iterationError, {
+      await this.addLoopErrorLog(ctx, loopId, loopType, iterationError, {
iterations: requestedIterations,
})
scope.maxIterations = 0
@@ -93,13 +116,31 @@ export class LoopOrchestrator {
case 'forEach': {
scope.loopType = 'forEach'
if (
loopConfig.forEachItems === undefined ||
loopConfig.forEachItems === null ||
loopConfig.forEachItems === ''
) {
const errorMessage =
'ForEach loop collection is empty. Provide an array or a reference that resolves to a collection.'
logger.error(errorMessage, { loopId })
await this.addLoopErrorLog(ctx, loopId, loopType, errorMessage, {
forEachItems: loopConfig.forEachItems,
})
scope.items = []
scope.maxIterations = 0
scope.validationError = errorMessage
scope.condition = buildLoopIndexCondition(0)
ctx.loopExecutions?.set(loopId, scope)
throw new Error(errorMessage)
}
let items: any[]
try {
items = resolveArrayInput(ctx, loopConfig.forEachItems, this.resolver)
} catch (error) {
const errorMessage = `ForEach loop resolution failed: ${error instanceof Error ? error.message : String(error)}`
logger.error(errorMessage, { loopId, forEachItems: loopConfig.forEachItems })
-          this.addLoopErrorLog(ctx, loopId, loopType, errorMessage, {
+          await this.addLoopErrorLog(ctx, loopId, loopType, errorMessage, {
forEachItems: loopConfig.forEachItems,
})
scope.items = []
@@ -117,7 +158,7 @@ export class LoopOrchestrator {
)
if (sizeError) {
logger.error(sizeError, { loopId, collectionSize: items.length })
-          this.addLoopErrorLog(ctx, loopId, loopType, sizeError, {
+          await this.addLoopErrorLog(ctx, loopId, loopType, sizeError, {
forEachItems: loopConfig.forEachItems,
collectionSize: items.length,
})
@@ -155,7 +196,7 @@ export class LoopOrchestrator {
)
if (iterationError) {
logger.error(iterationError, { loopId, requestedIterations })
-      this.addLoopErrorLog(ctx, loopId, loopType, iterationError, {
+      await this.addLoopErrorLog(ctx, loopId, loopType, iterationError, {
iterations: requestedIterations,
})
scope.maxIterations = 0
@@ -182,14 +223,14 @@ export class LoopOrchestrator {
return scope
}
-  private addLoopErrorLog(
+  private async addLoopErrorLog(
ctx: ExecutionContext,
loopId: string,
loopType: string,
errorMessage: string,
inputData?: any
-  ): void {
-    addSubflowErrorLog(
+  ): Promise<void> {
+    await addSubflowErrorLog(
ctx,
loopId,
'loop',
@@ -238,7 +279,7 @@ export class LoopOrchestrator {
}
if (isCancelled) {
logger.info('Loop execution cancelled', { loopId, iteration: scope.iteration })
-      return this.createExitResult(ctx, loopId, scope)
+      return await this.createExitResult(ctx, loopId, scope)
}
const iterationResults: NormalizedBlockOutput[] = []
@@ -253,7 +294,7 @@ export class LoopOrchestrator {
scope.currentIterationOutputs.clear()
if (!(await this.evaluateCondition(ctx, scope, scope.iteration + 1))) {
-      return this.createExitResult(ctx, loopId, scope)
+      return await this.createExitResult(ctx, loopId, scope)
}
scope.iteration++
@@ -269,11 +310,11 @@ export class LoopOrchestrator {
}
}
-  private createExitResult(
+  private async createExitResult(
ctx: ExecutionContext,
loopId: string,
scope: LoopScope
-): LoopContinuationResult {
+): Promise<LoopContinuationResult> {
const results = scope.allIterationOutputs
const output = { results }
this.state.setBlockOutput(loopId, output, DEFAULTS.EXECUTION_TIME)
@@ -282,19 +323,26 @@ export class LoopOrchestrator {
const now = new Date().toISOString()
const iterationContext = buildContainerIterationContext(ctx, loopId)
-this.contextExtensions.onBlockComplete(
-  loopId,
-  'Loop',
-  'loop',
-  {
-    output,
-    executionTime: DEFAULTS.EXECUTION_TIME,
-    startedAt: now,
-    executionOrder: getNextExecutionOrder(ctx),
-    endedAt: now,
-  },
-  iterationContext
-)
+try {
+  await this.contextExtensions.onBlockComplete(
+    loopId,
+    'Loop',
+    'loop',
+    {
+      output,
+      executionTime: DEFAULTS.EXECUTION_TIME,
+      startedAt: now,
+      executionOrder: getNextExecutionOrder(ctx),
+      endedAt: now,
+    },
+    iterationContext
+  )
+} catch (error) {
+  logger.warn('Loop completion callback failed', {
+    loopId,
+    error: error instanceof Error ? error.message : String(error),
+  })
+}
}
return {
@@ -597,7 +645,7 @@ export class LoopOrchestrator {
if (!scope.items || scope.items.length === 0) {
logger.info('ForEach loop has empty collection, skipping loop body', { loopId })
this.state.setBlockOutput(loopId, { results: [] }, DEFAULTS.EXECUTION_TIME)
-emitEmptySubflowEvents(ctx, loopId, 'loop', this.contextExtensions)
+await emitEmptySubflowEvents(ctx, loopId, 'loop', this.contextExtensions)
return false
}
return true
@@ -607,7 +655,7 @@ export class LoopOrchestrator {
if (scope.maxIterations === 0) {
logger.info('For loop has 0 iterations, skipping loop body', { loopId })
this.state.setBlockOutput(loopId, { results: [] }, DEFAULTS.EXECUTION_TIME)
-emitEmptySubflowEvents(ctx, loopId, 'loop', this.contextExtensions)
+await emitEmptySubflowEvents(ctx, loopId, 'loop', this.contextExtensions)
return false
}
return true
@@ -621,7 +669,7 @@ export class LoopOrchestrator {
if (!scope.condition) {
logger.warn('No condition defined for while loop', { loopId })
this.state.setBlockOutput(loopId, { results: [] }, DEFAULTS.EXECUTION_TIME)
-emitEmptySubflowEvents(ctx, loopId, 'loop', this.contextExtensions)
+await emitEmptySubflowEvents(ctx, loopId, 'loop', this.contextExtensions)
return false
}
@@ -634,7 +682,7 @@ export class LoopOrchestrator {
if (!result) {
this.state.setBlockOutput(loopId, { results: [] }, DEFAULTS.EXECUTION_TIME)
-emitEmptySubflowEvents(ctx, loopId, 'loop', this.contextExtensions)
+await emitEmptySubflowEvents(ctx, loopId, 'loop', this.contextExtensions)
}
return result

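The pattern this diff applies can be summarized as: previously fire-and-forget `void` log helpers become `async` and are awaited by their callers, and the completion callback is awaited inside `try/catch` so a failing callback cannot abort the loop's exit path. Below is a minimal standalone sketch of that pattern; the class and method names (`MiniLoopOrchestrator`, `resolveForEach`, etc.) are hypothetical stand-ins, not the repository's actual API.

```typescript
// Minimal sketch (hypothetical names) of the async-conversion pattern in
// this diff: awaited error logging plus a guarded completion callback.

type LogEntry = { loopId: string; message: string }

class MiniLoopOrchestrator {
  errorLogs: LogEntry[] = []
  warnings: LogEntry[] = []
  completed: string[] = []

  // Previously this kind of helper was `void` and fire-and-forget; making it
  // async and awaiting it guarantees the log entry exists before execution
  // continues past the failure.
  private async addLoopErrorLog(loopId: string, message: string): Promise<void> {
    this.errorLogs.push({ loopId, message })
  }

  // Stand-in for contextExtensions.onBlockComplete, which may reject.
  private async onBlockComplete(loopId: string): Promise<void> {
    if (loopId === 'bad') throw new Error('callback failed')
    this.completed.push(loopId)
  }

  // Mirrors the ForEach hunk: on a bad collection, await the error log, then
  // fall back to an empty items list instead of throwing.
  async resolveForEach(loopId: string, items: unknown): Promise<unknown[]> {
    if (!Array.isArray(items)) {
      await this.addLoopErrorLog(loopId, 'ForEach loop resolution failed')
      return []
    }
    return items
  }

  // Mirrors createExitResult: await the callback, but only warn on failure
  // so the loop still returns its results.
  async createExitResult(loopId: string): Promise<{ results: unknown[] }> {
    try {
      await this.onBlockComplete(loopId)
    } catch (error) {
      this.warnings.push({
        loopId,
        message: error instanceof Error ? error.message : String(error),
      })
    }
    return { results: [] }
  }
}
```

The key design point is that `createExitResult` now awaits its callback (so callers observe completion in order) while still swallowing callback failures, because a broken observer should not prevent the loop from producing its results.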