Compare commits


77 Commits

Author SHA1 Message Date
Waleed
c400e59ea6 feat(sap_s4hana): add get_material_document and fix supplier invoice key order (#4317)
* fix(sap_s4hana): require non-empty items in create_purchase_order

Why: SAP A_PurchaseOrder POST silently fails or returns opaque errors
without to_PurchaseOrderItem entries. Block already required this body
but the tool marked it optional and didn't validate that items were
present — a contract mismatch with create_sales_order /
create_purchase_requisition.

Also clarifies the deliveryDocument placeholder to show both outbound
and inbound number ranges.
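The contract described above — fail fast client-side instead of letting SAP return an opaque error — can be sketched as a guard (illustrative only; the names and item shape are assumptions, not the block's actual code):

```typescript
// Illustrative sketch: enforce the non-empty to_PurchaseOrderItem
// contract before sending the SAP A_PurchaseOrder POST.
interface PurchaseOrderItem {
  Material: string;
  OrderQuantity: string;
}

function assertPurchaseOrderItems(
  items: PurchaseOrderItem[] | undefined
): asserts items is [PurchaseOrderItem, ...PurchaseOrderItem[]] {
  if (!items || items.length === 0) {
    throw new Error(
      "create_purchase_order requires at least one to_PurchaseOrderItem entry"
    );
  }
}
```

A matching guard in create_sales_order and create_purchase_requisition keeps the three create tools' contracts aligned.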

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

* feat(sap_s4hana): add get_material_document and fix supplier invoice key order

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

* fix(sap_s4hana): align material doc key order in description and require purchase order body type

---------

Co-authored-by: Claude Opus 4.7 <noreply@anthropic.com>
2026-04-27 23:41:56 -07:00
Vikhyath Mondreti
2e3de9ac8a feat(governance): external workspace users from outside org (#4313)
* feat(governance): external workspace users from outside org

* update docs

* address comments

* edge case improvements

* remove unused fallback

* address comments

* add outbox for seat reduction

* fix edge case with org join after invite

* add server side batch invites for workspace

* use zod schema for route
2026-04-27 22:07:41 -07:00
Theodore Li
154b9d0883 fix(vm): categorize user or server side errors (#4283)
* fix(vm): categorize user or server side errors

* recategorize function syntax errors as 4xx
2026-04-27 22:27:50 -04:00
Waleed
c95ac3bc23 improvement(browser-use,stagehand): expose live session URLs (#4314)
* improvement(browser-use,stagehand): expose live session URLs and align with latest API specs

- Browser Use: switch to v2 camelCase schema, fetch live URL from sessions endpoint, add startUrl/maxSteps/allowedDomains/vision/flashMode/thinking/systemPromptExtension/structuredOutput/metadata params, surface liveUrl/shareUrl/sessionId outputs
- Stagehand: fetch Browserbase debug URL, add mode/maxSteps params, surface liveViewUrl/sessionId outputs, bump @browserbasehq/stagehand to ^3.2.1, update to claude-sonnet-4-6

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

* fix(browser-use): respect API default for highlightElements

Only send highlightElements when user explicitly toggles it; previously defaulted to true which silently overrode the v2 API default of false.

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

* fix(browser-use,stagehand): address PR review feedback

- Browser Use: fetch liveUrl during polling once sessionId is known, instead of immediately after task creation. Handles tasks started without profile_id (where sessionId isn't returned in create response) and ensures session is active before fetching.
- Stagehand: coerce empty/whitespace maxSteps strings to undefined so they're dropped from the request body instead of failing zod validation as ''.

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

* fix(stagehand): preserve liveViewUrl and sessionId on agent error

If the agent throws after Browserbase session init succeeds, callers can still surface the live view / session ID for debugging.

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

* fix(browser-use): coerce empty maxSteps strings to undefined

Mirrors the Stagehand block's handling so a cleared field doesn't pass through as ''.
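The empty-string coercion shared by both blocks can be sketched as follows (a minimal sketch; the real blocks run this through zod rather than a bare function):

```typescript
// Hypothetical sketch of the coercion described above: a cleared numeric
// field arrives as "" or whitespace and must become undefined so it is
// dropped from the request body instead of failing validation as "".
function coerceMaxSteps(raw: string | number | undefined): number | undefined {
  if (raw === undefined) return undefined;
  const text = String(raw).trim();
  if (text === "") return undefined;
  const n = Number(text);
  return Number.isFinite(n) ? n : undefined;
}
```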

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

* fix(browser-use): skip metadata when empty

---------

Co-authored-by: Claude Opus 4.7 <noreply@anthropic.com>
2026-04-27 18:02:51 -07:00
Theodore Li
ca814f021d fix(ui): display file upload error messages (#4315)
* fix(ui): display file upload messages

* Address pr comments
2026-04-27 18:58:41 -04:00
Waleed
2502369122 feat(integrations): SAP S/4HANA (#4301)
* feat(integrations): SAP S/4HANA tools, block, and proxy with multi-deployment support

* fix(sap_s4hana): address PR review comments

- Validate baseUrl/tokenUrl in Zod schema and at runtime to prevent SSRF
  (https-only, deny loopback/link-local/cloud-metadata hosts)
- Cap proxy token cache at 500 entries with LRU eviction
- Add 30s timeout to outbound token, CSRF, and OData fetches
- Make parseJsonInput return T | undefined so missing input is type-safe
- Reset authType when deploymentType changes and surface OAuth fields
  whenever auth is not basic, so cloud_public users always see clientId/
  clientSecret after switching from a basic-auth private deployment
- Reject OData service names that are not uppercase identifiers and
  paths containing ".." or "." traversal segments
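The 500-entry LRU cap mentioned above can be sketched with a plain `Map`, which iterates in insertion order (an assumed shape; the proxy's actual cache implementation may differ):

```typescript
// Minimal LRU sketch using Map insertion order: re-inserting on write
// marks an entry as most recently used; the first key is the oldest.
function lruSet<K, V>(cache: Map<K, V>, key: K, value: V, max = 500): void {
  if (cache.has(key)) cache.delete(key); // re-insert to refresh recency
  cache.set(key, value);
  if (cache.size > max) {
    const oldest = cache.keys().next().value as K;
    cache.delete(oldest); // evict least recently inserted
  }
}
```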

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

* fix(sap_s4hana): allow versioned service names; tighten proxy SSRF defenses

- Permit ";v=NNNN" suffix on ServiceName regex so the four delivery tools
  (API_OUTBOUND_DELIVERY_SRV;v=0002, API_INBOUND_DELIVERY_SRV;v=0002) pass
  schema validation
- Restrict subdomain to RFC 1123 label characters and region to lowercase
  alphanumeric short codes; run the constructed cloud_public host through
  assertSafeExternalUrl so a crafted subdomain (e.g. "evil.com#") cannot
  redirect requests carrying SAP credentials
- Block RFC-1918 (10/8, 172.16/12, 192.168/16), 127/8, 169.254/16, and
  0.0.0.0 via isPrivateIPv4, plus IPv4-mapped IPv6 variants
  (::ffff:10.0.0.1, ::10.0.0.1) so private internal hosts cannot be
  reached from baseUrl, tokenUrl, or the resolved cloud_public URL

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

* fix(sap_s4hana): catch hex-form IPv4-mapped IPv6 in SSRF check

The WHATWG URL parser normalizes IPv4-mapped IPv6 addresses to hex form
(e.g. [::ffff:169.254.169.254] → [::ffff:a9fe:a9fe]), which slipped past
the dotted-decimal-only extractor. Decode the trailing two 16-bit hex
groups back into IPv4 octets and run them through isPrivateIPv4. Also
add isPrivateOrLoopbackIPv6 so pure IPv6 loopback (::, ::1), unique
local addresses (fc00::/7), and link-local (fe80::/10) cannot be reached.
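The hex-form decoding described above can be sketched like this (function names are assumptions; the block's real SSRF check may be structured differently):

```typescript
// Illustrative sketch of the private-range check for IPv4 octets.
function isPrivateIPv4(octets: number[]): boolean {
  const [a, b] = octets;
  return (
    a === 10 ||                          // 10.0.0.0/8
    (a === 172 && b >= 16 && b <= 31) || // 172.16.0.0/12
    (a === 192 && b === 168) ||          // 192.168.0.0/16
    a === 127 ||                         // loopback
    (a === 169 && b === 254) ||          // link-local / cloud metadata
    octets.every((o) => o === 0)         // 0.0.0.0
  );
}

// Recover the embedded IPv4 from both forms of an IPv4-mapped IPv6 host:
// dotted (::ffff:169.254.169.254) and the hex form the WHATWG URL parser
// normalizes to (::ffff:a9fe:a9fe).
function ipv4FromMappedIPv6(host: string): number[] | null {
  const dotted = host.match(/^::ffff:(\d+)\.(\d+)\.(\d+)\.(\d+)$/i);
  if (dotted) return dotted.slice(1, 5).map(Number);
  const hex = host.match(/^::ffff:([0-9a-f]{1,4}):([0-9a-f]{1,4})$/i);
  if (hex) {
    // Decode the trailing two 16-bit groups back into four IPv4 octets.
    const hi = parseInt(hex[1], 16);
    const lo = parseInt(hex[2], 16);
    return [hi >> 8, hi & 0xff, lo >> 8, lo & 0xff];
  }
  return null;
}
```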

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

* fix(sap_s4hana): scope CSRF metadata fetch and isolate token cache by secret

- buildOdataUrl skips request query params when called with an internal
  pathOverride so the /$metadata CSRF probe never carries user OData
  options ($filter, $top, $select), which were causing write operations
  through the generic odata_query tool to fail.
- tokenCacheKey now mixes a sha256 hash of clientSecret into the cache
  key so two tenants sharing the same tokenUrl + clientId but different
  secrets get isolated entries (no cross-tenant token leak).
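The cache-key scheme described above can be sketched as follows (delimiter and field order are assumptions; only the sha256-of-secret ingredient is from the commit):

```typescript
import { createHash } from "node:crypto";

// Sketch: mix a hash of the client secret into the token cache key so
// tenants sharing tokenUrl + clientId but holding different secrets get
// isolated cache entries, without storing the raw secret in the key.
function tokenCacheKey(
  tokenUrl: string,
  clientId: string,
  clientSecret: string
): string {
  const secretHash = createHash("sha256").update(clientSecret).digest("hex");
  return `${tokenUrl}|${clientId}|${secretHash}`;
}
```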

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

* fix(sap_s4hana): reject ?/# in service path; trim long update tool descriptions

- ServicePath validator now rejects "?" and "#" so a caller can't smuggle
  query options through the path field (e.g.,
  "/A_BusinessPartner?$format=atomsvc"); the Zod refine now reports
  ".." / "." segments, "?", and "#" together.
- Update Customer / Update Supplier / Update Purchase Requisition tool
  descriptions exceeded the docs generator's 600-char regex window, so
  they were rendering with empty descriptions on the integrations
  landing page. Trimmed them to fit while keeping the limited-fields
  note and the If-Match guidance, then regenerated integrations.json
  and tool docs.

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

* fix(sap_s4hana): reject percent-encoded path traversal; widen Set-Cookie split

- ServicePath now also rejects %2e/%2E, %2f/%2F, %5c/%5C, %3f/%3F, %23
  so a caller cannot smuggle ".." / "." / "/" / "\" / "?" / "#" past the
  validator and have SAP's ABAP/ICM gateway decode them server-side.
- joinSetCookies fallback regex now allows the ", " separator that's
  used when multiple Set-Cookie values are folded onto one header line
  (older runtimes without Headers.getSetCookie). Prevents CSRF cookies
  from being concatenated into a single value during write operations.
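The two rounds of path hardening above (literal `?`/`#`/traversal segments, then their percent-encoded forms) can be sketched as one validator (names assumed; the real code is a Zod refine with per-rule messages):

```typescript
// Illustrative combined validator for the service-path rules above.
function isSafeServicePath(path: string): boolean {
  // Literal query/fragment delimiters would smuggle OData options.
  if (path.includes("?") || path.includes("#")) return false;
  // Percent-encoded . / \ ? # would be decoded server-side by SAP's
  // ABAP/ICM gateway, bypassing the literal checks.
  if (/%2e|%2f|%5c|%3f|%23/i.test(path)) return false;
  // Plain traversal segments.
  const segments = path.split("/");
  if (segments.some((s) => s === ".." || s === ".")) return false;
  return true;
}
```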

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

* fix(sap_s4hana): preserve $ in OData query params; reject empty items array

- buildOdataUrl now constructs query strings manually with
  encodeURIComponent and restores literal "$" so OData system options
  ($filter, $top, $select, $expand, $orderby, $skip, $format) reach
  SAP and any intermediary proxies/WAFs as-is, not as "%24filter".
  URLSearchParams was percent-encoding "$" to "%24" which most ICMs
  decode but some intermediaries silently drop, returning unfiltered
  results.
- create_sales_order now rejects an empty items array (matches
  create_purchase_requisition) so callers get a clear client-side
  error instead of an opaque SAP validation failure on the deep-insert.
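The manual query construction described above can be sketched like this (a minimal sketch; the real buildOdataUrl handles more cases):

```typescript
// Encode keys and values with encodeURIComponent, then restore the
// literal "$" that URLSearchParams would have emitted as "%24", so
// OData system options reach SAP and intermediaries as-is.
function buildODataQuery(params: Record<string, string>): string {
  return Object.entries(params)
    .map(
      ([k, v]) =>
        `${encodeURIComponent(k).replace(/%24/g, "$")}=${encodeURIComponent(v)}`
    )
    .join("&");
}
```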

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

* fix(sap_s4hana): ignore baseUrl on cloud_public to prevent token redirection

Why: resolveHost previously preferred baseUrl unconditionally. A caller
sending deploymentType=cloud_public with a baseUrl pointing elsewhere
would obtain a real SAP UAA token, then forward it as Bearer to the
attacker host. Zod superRefine did not validate baseUrl for cloud_public.

Fix: resolveHost now constructs the SAP host from subdomain when
deploymentType is cloud_public and only uses baseUrl for cloud_private
and on_premise (where it is already SSRF-checked in superRefine).

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

* fix(icons): use useId for SapS4HanaIcon and PipedriveIcon gradients

Why: hardcoded SVG gradient/mask IDs collide when an icon renders more
than once on a page (e.g. integrations listing). All other icons in this
file use React's useId() — these were inconsistent.

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

* icons

* fix(icons): use useId for AWS-style icon gradients

Why: IAMIcon, IdentityCenterIcon, STSIcon, SESIcon, and SecretsManagerIcon
all used hardcoded `id='xxxGradient'` values that collide when an icon
renders more than once on a page (e.g. integrations listing).

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

* fix(sap_s4hana): ignore tokenUrl on cloud_public to prevent UAA redirection

Why: resolveTokenUrl previously honored caller-supplied tokenUrl
regardless of deploymentType, mirroring the same redirection class as
the prior baseUrl bug. A cloud_public caller could send tokenUrl to an
attacker host, causing the proxy to POST clientId:clientSecret as Basic
auth to it. superRefine for cloud_public did not validate tokenUrl.

Fix: derive UAA URL from subdomain+region for cloud_public; only honor
tokenUrl for cloud_private/on_premise (already SSRF-checked).

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

* fix(icons): remove unused mask in PipedriveIcon

Why: the <mask> element had no consumer (no mask='url(#...)' anywhere
in the SVG), so both it and the maskId variable were dead code.

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.7 <noreply@anthropic.com>
2026-04-27 15:07:00 -07:00
Waleed
8266f0afdb fix(security): rate limit chat OTP + validate mothership proxy endpoint (#4312)
* fix(security): rate limit chat OTP endpoint to prevent email bombing

* fix(security): validate mothership proxy endpoint to block path traversal

* fix(security): address greptile feedback on OTP rate limiting
2026-04-27 14:51:58 -07:00
Waleed
896a00ae31 fix(security): require internal API key for copilot training endpoints (#4311) 2026-04-27 14:30:47 -07:00
Waleed
74946fb162 improvement(docker): speed up app image build with cache mounts and parallel node-gyp (#4310) 2026-04-27 13:34:57 -07:00
Waleed
f62d274478 fix(mothership): stabilize task sidebar ordering on selection (#4309) 2026-04-27 12:42:20 -07:00
Theodore Li
65e17de065 fix(retention-job): add chunking strategy for cleanup (#4305)
* fix(retention-job): add chunking strategy for cleanup

* change stats to be per-job, not per-chunk
2026-04-27 15:13:00 -04:00
Vikhyath Mondreti
79ff5d80b3 improvement(slack): channel selector for list canvases (#4307) 2026-04-27 12:11:14 -07:00
Vikhyath Mondreti
2a52141d2f feat(slack): canvas related operations (#4306)
* feat(slack): canvas related operations

* extract shared code
2026-04-27 12:00:40 -07:00
Theodore Li
76ad59fd7d fix(stream): Avoid bun memory leak bug from TransformStream (#4255)
* Avoid bun memory leak bug from TransformStream

* fix(executor): skip content persistence when stream consumer exits early

Previously, if the onStream consumer caught an internal error without
re-throwing, the block-executor would treat the shortened accumulator
as the complete response, persist a truncated string to memory via
appendToMemory, and set it as executionOutput.content.

Track whether the source ReadableStream actually closed (done=true) in
the pull handler. If onStream returns before the source drains, skip
content persistence and log a warning — the old tee()-based flow was
immune to this because the executor branch drained independently of
the client branch.
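The drain-tracking fix described above can be sketched as follows (a hypothetical sketch; the executor's actual accumulator and logging differ):

```typescript
// The pull handler records whether the source actually closed. If the
// consumer returns before the source drains, skip persistence instead
// of writing a truncated accumulator.
async function runWithPersistence(
  source: ReadableStream<Uint8Array>,
  onStream: (stream: ReadableStream<Uint8Array>) => Promise<void>,
  persist: (content: string) => void
): Promise<void> {
  let sourceDone = false;
  let accumulated = "";
  const reader = source.getReader();
  const decoder = new TextDecoder();

  const forwarded = new ReadableStream<Uint8Array>({
    async pull(controller) {
      const { done, value } = await reader.read();
      if (done) {
        sourceDone = true; // source fully drained
        controller.close();
        return;
      }
      accumulated += decoder.decode(value, { stream: true });
      controller.enqueue(value);
    },
  });

  await onStream(forwarded);
  if (sourceDone) {
    persist(accumulated); // safe: accumulator holds the full response
  } else {
    console.warn("consumer exited before source drained; skipping persistence");
  }
}
```

Unlike the old tee()-based flow, there is only one consumer here, so the done flag is the sole signal that the accumulator is complete.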

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* fix lint

* fix(executor): early-return when no streamed content to make onFullContent symmetric

Previously, executionOutput.content was guarded by `if (fullContent)`
but `onFullContent` fired regardless. The agent-handler implementor
defensively bails on empty/whitespace content, but that's a callee
contract, not enforced at the call site — future implementors could
spuriously persist empty assistant turns to memory.

Hoist the `!fullContent` check to a single early return, so both the
output write and the callback share the same precondition.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-27 14:23:36 -04:00
Waleed
c32c1cb917 fix(security): patch copilot tool & multipart upload IDORs (#4304)
* fix(security): patch copilot tool & multipart upload IDORs

- multipart upload: bind upload session to (userId, workspaceId, key)
  via short-lived HMAC-signed token; require workspace write access at
  initiate; source key/uploadId/context from verified token (never
  client) at get-part-urls/complete/abort
- copilot knowledge-base tools: gate all 11 read/write/tag/connector
  ops with checkKnowledgeBaseAccess / checkKnowledgeBaseWriteAccess
- copilot user-table tools: add workspace-id check to get, get_schema,
  add/rename/delete/update_column to match existing op pattern
- copilot manage-credential: add full ownership/write-permission auth
  via getCredentialActorContext (previously had no auth)
- copilot restore-resource: verify workspace ownership and write
  permission for workflow, table, knowledgebase, file, and folder
  restores
- copilot folder rename/move: verify both folderId and parentId belong
  to the caller's workspace via new verifyFolderWorkspace helper
- copilot get-job-logs: verify schedule belongs to caller's workspace

* fix(security): address PR review — document IDOR, log count, token split

- knowledge-base delete_document/update_document: verify each document
  belongs to the claimed knowledgeBaseId via checkDocumentWriteAccess
  (was: trusted args.knowledgeBaseId without binding it to the document)
- multipart batch complete: log verifiedEntries.length instead of raw
  client-supplied data.uploads.length
- upload-token: reject tokens with !=2 dot-delimited segments

* fix(security): close folder workspace bypass when workspaceId is falsy
2026-04-27 11:05:22 -07:00
Vikhyath Mondreti
50e74f75ef fix(mothership): parallel subagent rendering, exec stream re-attach (#4299)
* fix(mothership): parallel subagent rendering

* fix rendering logic

* address comments

* cleanup dead code

* address mothership legacy key

* prevent id collision

* rollout stream edge case

* address bugbot comments

* remove unused fallbacks

* fix execution stream attach, cleanup comments

* remove debug logs

* cleanup failed reconnect
2026-04-27 00:40:39 -07:00
Waleed
60652e621c fix(security): credential-set invite email check + shopify authorize XSS (#4302) 2026-04-26 20:52:42 -07:00
Waleed
8863f1132a feat(models): add gpt-5.5 models (#4300)
* feat(models): add gpt-5.5 models

* fix(models): address gpt-5.5 review feedback

* fix(models): align gpt-5.5 pro controls
2026-04-25 19:13:29 -07:00
Vikhyath Mondreti
d93a6f57bc chore(guide): update contributing guide (#4296)
* chore(guide): update contributing guide

* fix md
2026-04-24 18:47:57 -07:00
Vikhyath Mondreti
df581c3efb fix(mothership): queue supersede crash (#4297)
* fix(mothership): queue supersede crash

* add test

* abort observed marker
2026-04-24 18:31:46 -07:00
Vikhyath Mondreti
f16d17ba49 improvement(mothership): do not silently re-route missing stream id (#4295) 2026-04-24 16:37:40 -07:00
Waleed
e6fefc863c fix(table-block): resolve canonical tableId in filter/sort builders (#4294)
The visual filter/sort builders read the selected tableId from subBlock id
'tableId', but the Table block stores it under 'tableSelector' (basic) or
'manualTableId' (advanced) via canonicalParamId. The lookup always returned
null, so useTable was disabled and the column picker always showed
"no options available".

Adds useCanonicalSubBlockValue that resolves by canonicalParamId through
the canonical index, mirroring the pattern used by dropdown dependsOn.
2026-04-24 16:24:49 -07:00
Waleed
5fba724818 chore(bun): bump bun to 1.3.13 (#4291) 2026-04-24 15:59:10 -07:00
Waleed
60b80ec172 improvement(tables): race-free row-count trigger + scoped tx timeouts (#4289)
* improvement(tables): race-free row-count trigger + scoped tx timeouts

* fix(tables): close upsert race + serialize replaceTableRows

Two concurrency bugs flagged by review:

1. `upsertRow` insert path: removing FOR UPDATE broke serialization between
   the initial existing-row SELECT and the INSERT. Two concurrent upserts
   on the same conflict target could both find no match, then both insert,
   producing a duplicate that bypasses the app-level unique check. Fixed
   by re-checking for the matching row *after* acquiring the per-table
   advisory lock; if a racing tx already inserted, switch to UPDATE.

2. `replaceTableRows`: under READ COMMITTED each tx's DELETE uses its own
   MVCC snapshot, so two concurrent replaces could each DELETE only the
   rows visible at their start, then both INSERT, leaving the table with
   the union of both row sets. Fixed by acquiring the per-table advisory
   lock at the start of the tx to serialize replaces against each other
   and against auto-position inserts.

Also updated a stale docstring on `upsertRow` that still referenced the
removed FOR UPDATE.
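The per-table advisory lock used in both fixes can be sketched as follows (the key-derivation scheme here is an assumption; the real code may fold the table id into the lock keyspace differently):

```typescript
// Hypothetical sketch: take a transaction-scoped advisory lock keyed by
// the table id before the DELETE + INSERT in replaceTableRows (and before
// the post-check in upsertRow's insert path), so concurrent writers
// serialize instead of racing on separate MVCC snapshots.
function perTableLockSql(tableId: string): string {
  // hashtext() folds the id into the 32-bit advisory-lock keyspace;
  // pg_advisory_xact_lock blocks until the lock is free and releases
  // automatically at commit or rollback.
  const escaped = tableId.replace(/'/g, "''");
  return `SELECT pg_advisory_xact_lock(hashtext('${escaped}'))`;
}
```

Because the lock is transaction-scoped, no explicit unlock is needed on either the commit or the error path.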

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

* fix(tables): serialize explicit-position inserts with advisory lock

The `(table_id, position)` index is non-unique. Concurrent explicit-
position inserts at the same slot can both observe the slot empty, both
skip the shift, then each INSERT — producing duplicate `(table_id,
position)` rows.

Take the per-table advisory lock in the explicit-position branches of
`insertRow` and `batchInsertRows` (the auto-position branches already do
this). Updated the test that asserted the explicit path skipped the lock,
and added coverage for `batchInsertRows` with explicit positions.

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

* refactor(tables): dedupe upsert UPDATE path + extract nextAutoPosition

Two pure cleanups on the round-1 changes:

1. Extract `nextAutoPosition(trx, tableId)` — the `SELECT coalesce(max(
   position), -1) + 1` pattern was repeated in three call sites
   (`insertRow` auto branch, `batchInsertRows` auto branch, `upsertRow`
   insert branch). One helper, same behavior.

2. Consolidate `upsertRow` update path. The initial-SELECT match and the
   post-lock re-select match previously had two literal duplicates of the
   same UPDATE + return block. Resolve `matchedRowId` first, then run one
   UPDATE branch. Lock is still only acquired when we don't find a match
   on the first pass.

No behavior change. 98/98 table unit tests unchanged.

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.7 <noreply@anthropic.com>
2026-04-24 15:33:17 -07:00
Vikhyath Mondreti
af4be770a1 improvement(mothership): treat error as terminal event (#4290) 2026-04-24 14:51:26 -07:00
Waleed
f330fe22a2 refactor(ashby): align tools, block, and triggers with Ashby API (#4288)
* refactor(ashby): align tools, block, and triggers with Ashby API

Audit-driven refactor to destructure rich fields per Ashby's API docs,
centralize output shapes via shared mappers in tools/ashby/utils.ts,
and align webhook provider handler with trigger IDs via a shared
action map. Removes stale block outputs left over from prior flat
response shapes.

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

* fix(ashby): remove stale noteId output and reject ping events

- Remove stale `noteId` output descriptor from block (create_note
  now returns `id` at the top level via the shared note mapper).
- Explicitly reject `ping` events in the webhook matchEvent before
  falling back to the generic triggerId check, so webhook records
  missing triggerId cannot execute workflows on Ashby ping probes.

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

* fix(ashby): trim optional ID params in create/update tools

Optional ID params in create_application, change_application_stage,
and update_candidate were passed through to the request body without
.trim(), unlike their required ID siblings. Normalize to prevent
copy-paste whitespace errors.

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

* fix(ashby): add subblock migration for removed filterCandidateId

Add SUBBLOCK_ID_MIGRATIONS entry so deployed workflows that previously
used the `filterCandidateId` subBlock on `list_applications` don't break
after the field was removed (Ashby's application.list doesn't filter by
candidateId). Also regenerate docs to sync noteId removal.

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

* fix(ashby): final API alignment from parallel validation

- create_candidate: email is optional per Ashby docs (only name is
  required); tool, types, and block all made non-required.
- list_applications: guard NaN when createdAfter can't be parsed so we
  don't send a bad value to Ashby's API.
- webhook provider: replace createHmacVerifier with explicit
  fail-closed verifyAuth that 401s when secretToken, signature header,
  or signature match is missing (was previously fail-open on missing
  secret).

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

* fix(ashby): preserve input.data path in webhook formatInput

Restore the explicit `data` key alongside the spread so deployed
workflows that reference `input.data.application.*`, `input.data.candidate.*`,
etc. keep working. The spread alone dropped those paths.

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

* refactor(ashby): drop legacy input.data key from webhook formatInput

Keep formatInput aligned with the advertised trigger outputs schema
(flat top-level entities) and drop the legacy input.data.* compat path.
Every field declared in each trigger's outputs is now populated 1:1 by
the data spread plus the explicit action key — no undeclared keys.

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

* fix(ashby): trim remaining ID body params for parity

Add .trim() on sourceId (create_candidate), jobId (list_applications),
applicationId and interviewStageId (list_interviews) to match the
trim-on-IDs pattern used across the rest of the Ashby tools and guard
against copy-paste whitespace.

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

* update docs

---------

Co-authored-by: Claude Opus 4.7 <noreply@anthropic.com>
2026-04-24 13:29:26 -07:00
d 🔹
efc868263a fix(copilot): replace crypto.randomUUID() with generateId() per project rule (#4268)
AGENTS.md / CLAUDE.md forbid crypto.randomUUID() (non-secure contexts throw
TypeError in browsers). Four copilot server-side files still violated this
rule, left over after PR #3397 polyfilled the client.

Routes through request lifecycle, OAuth draft insertion, persisted message
normalization, and table-row generation now use generateId from @sim/utils/id,
which is a drop-in UUID v4 producer that falls back to crypto.getRandomValues
when randomUUID is unavailable.

Refs #3393.
2026-04-24 11:59:48 -07:00
Vikhyath Mondreti
56044776d5 improvement(mothership): stream retry state machine, progressive re-rendering (#4287)
* improvement(mothership): stream retry state machine, progressive re-rendering

* address comments
2026-04-24 11:56:53 -07:00
Theodore Li
04f1d015f3 fix(mothership): Use heartbeat mechanism for chat locks (#4286) 2026-04-24 14:36:50 -04:00
Waleed
ccb5f1e690 fix(db): revert statement_timeout startup options breaking pooled connections (#4284) 2026-04-23 23:31:52 -07:00
Waleed
91ccbb9921 fix(agentphone): fix image (#4281) 2026-04-23 16:54:32 -07:00
Theodore Li
dcbe7c69b0 feat(ui): Show subagent logs in bounded vertical view (#4280)
* feat(ui): render subagent actions in bounded box.

* Add gradients and scroll bar

* fix lint
2026-04-23 18:41:53 -04:00
Theodore Li
c22ac38ab0 Set statement timeout of 90 seconds (#4276) 2026-04-23 18:27:15 -04:00
Waleed
cdde8cbd66 feat(agentphone): add AgentPhone integration (#4278)
* feat(agentphone): add AgentPhone integration

* fix(agentphone): validate numeric inputs and metadata JSON

* chore(agentphone): remove dead `from` fallback in get_number_messages

* fix(agentphone): drop empty-string updates in update_contact

* fix(agentphone): scope limit/offset to list ops and revert stray IdentityCenter change

* lint
2026-04-23 15:20:19 -07:00
Waleed
0ae19dab85 feat(files): default sort by updated and add updated sort option (#4279)
* feat(files): default sort by updated and add updated sort option

* feat(files): show Last Updated column

Matches the visible-column pattern already used on Knowledge and Tables
so users can see the value they're sorting by.

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.7 <noreply@anthropic.com>
2026-04-23 15:02:07 -07:00
Waleed
3b11c814f8 fix(tables): account for letter-spacing and displayed content in column auto-resize (#4277) 2026-04-23 14:41:40 -07:00
Theodore Li
b86ebb35fd fix(api): Pass archivedAt to list table response (#4275)
* fix(retention): switch data retention to be org-level

* fix lint

* cleanup mothership ran logs

* fix cleanup dispatcher

* fix ui flash for data retention settings

* fix lint

* remove raw sql string interpolation

* fix(api): return archivedAt for list tables route
2026-04-23 14:32:37 -04:00
Theodore Li
65972f2fa3 fix(retention): switch data retention to be org-level (#4270)
* fix(retention): switch data retention to be org-level

* fix lint

* cleanup mothership ran logs

* fix cleanup dispatcher

* fix ui flash for data retention settings

* fix lint

* remove raw sql string interpolation
2026-04-23 02:41:49 -04:00
Vikhyath Mondreti
5f0f0edd63 improvement(repo): separate realtime into separate app (#4262)
* improvement(repo): restructuring to make realtime image narrower scoped

* improvements

* chore(repo): rebase fixes and quality improvements for realtime split

Addresses merge-time issues and gaps from the realtime app split:
- Retarget stale vi.mock paths to @sim/workflow-persistence/subblocks
- Restore README branding, fix AGENTS.md script reference
- Restore TSDoc on workflow-persistence subblocks helpers
- Use toError() from @sim/utils/errors in save.ts
- Add vitest config + local mocks so @sim/audit tests run standalone
- Move socket.io-client to devDependencies in apps/realtime
- Add missing package COPY steps to docker/app.Dockerfile
- Add check:boundaries/check:realtime-prune scripts and wire into CI

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

* refactor(security): consolidate crypto primitives into @sim/security

Move general-purpose crypto primitives out of apps/sim into the
@sim/security package so both apps/sim and apps/realtime can share them.

@sim/security exports (all pure, dependency-free):
  ./compare    safeCompare (constant-time HMAC-wrapped equality)
  ./encryption encrypt/decrypt (AES-256-GCM, iv:cipher:tag format)
  ./hash       sha256Hex
  ./tokens     generateSecureToken (base64url)

Migrate apps/sim call sites to use these + @sim/utils helpers:
  crypto.randomUUID()            -> generateId() from @sim/utils/id
  createHash('sha256').digest    -> sha256Hex
  timingSafeEqual on hashed hex  -> safeCompare
  new Promise(setTimeout)        -> sleep from @sim/utils/helpers

No behavior change: encryption format, digest output, and token
length are preserved exactly.
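The constant-time comparison primitive listed above can be sketched as follows (a sketch of the technique; the package's actual implementation may differ in detail):

```typescript
import { createHmac, randomBytes, timingSafeEqual } from "node:crypto";

// HMAC both inputs with a fresh one-shot random key: this equalizes
// lengths so timingSafeEqual never throws on mismatched sizes, and the
// comparison leaks no timing information about the contents.
function safeCompare(a: string, b: string): boolean {
  const key = randomBytes(32);
  const ha = createHmac("sha256", key).update(a).digest();
  const hb = createHmac("sha256", key).update(b).digest();
  return timingSafeEqual(ha, hb);
}
```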

* refactor(copilot): use toError in remaining otel/finalize sites

Replace the last two `error instanceof Error ? error : new Error(String(error))`
patterns with toError from @sim/utils/errors. Completes the sweep of clean
candidates — no behavior change.

* refactor(security): consolidate HMAC-SHA256 primitives into @sim/security

Adds hmacSha256Hex and hmacSha256Base64 to @sim/security/hmac and migrates
15 webhook providers plus 5 other hot paths (deployment token signing,
outbound webhook requests, workspace notification delivery, notification
test route, Shopify OAuth callback) off bare `createHmac` calls. Secret
parameter accepts `string | Buffer` to cover base64-decoded Svix-style
secrets (Resend) and MS Teams' HMAC scheme. AWS SigV4 signing in S3 and
Textract tools intentionally retains direct `createHmac` usage — its
multi-step key derivation chain doesn't fit a generic helper.
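The helper shape described above can be sketched like this (assumed signatures consistent with the commit's description of `string | Buffer` secrets):

```typescript
import { createHmac } from "node:crypto";

// Secret accepts string | Buffer to cover base64-decoded Svix-style
// secrets (Resend) and MS Teams' HMAC scheme.
function hmacSha256Hex(secret: string | Buffer, payload: string): string {
  return createHmac("sha256", secret).update(payload).digest("hex");
}

function hmacSha256Base64(secret: string | Buffer, payload: string): string {
  return createHmac("sha256", secret).update(payload).digest("base64");
}
```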

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

* chore(packages): post-audit test + packaging polish

- Add safeCompare unit tests (identity, length mismatch, hex-nibble diff).
- Add Buffer-secret cases to hmac tests to lock in Svix/MS-Teams contract.
- Declare `reactflow` as a peerDependency on @sim/workflow-types — only used for type imports.
- Add a barrel export to @sim/workflow-persistence for consumers that prefer package-level imports; subpath exports retained.
- Document the data-field invariant in load.ts for loop/parallel subflow patching.

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

* chore(realtime): address PR review feedback

- Remove redundant SOCKET_PORT=3002 env from Dockerfile runner stage
  (env.PORT already defaults to 3002 via zod schema).
- Reorder PORT fallback so an explicitly-set SOCKET_PORT wins over
  the schema default for PORT; keeps SOCKET_PORT functional as an
  override instead of dead code.
- Add dedicated type-check CI step for @sim/realtime so TS errors
  surface pre-deploy (the Dockerfile runs source TS via Bun and has
  no implicit build-time type check).

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

* chore(realtime): remove unused SOCKET_PORT env var

SOCKET_PORT has lived in the socket server since the June 2025 refactor
but was never actually set in any deploy config — docker-compose.prod,
helm values/templates, .env.example, and docs all use PORT or the 3002
default exclusively. No self-hoster was ever pointed at SOCKET_PORT, so
removing it is safe.

Simplifies realtime port resolution to `env.PORT` (zod-validated with a
3002 default) and drops the orphaned sim-side schema entry.

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

---------

Co-authored-by: Waleed Latif <walif6@gmail.com>
Co-authored-by: Claude Opus 4.7 <noreply@anthropic.com>
2026-04-22 23:06:16 -07:00
Waleed
bed5e95742 fix(selectors): enable search on all picker and selector subBlocks (#4269) 2026-04-22 20:40:59 -07:00
Theodore Li
f7ab39984c feat(ui): add thinking ui to mothership (#4254)
* feat(ui): Add thinking ui

* fix tests

* Remove duplicate helper for block timing

* fix lint

* fix endedAt timestamp bug

* fix stuck subagent thinking
2026-04-22 19:52:56 -04:00
Theodore Li
8ce56fe1f2 fix(auth): add api key auth via sha256 hash lookup (#4266)
* fix(auth): add api key auth via sha256 hash lookup

* Remove promise all logic

* Restore feature flag

* fix feature flag

* Combine auth and hash gate
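The lookup pattern, sketched (function and store names are assumptions, not the repo's actual code): the presented key is hashed with SHA-256 so the DB query can hit an index on the hash column rather than comparing plaintext keys.

```typescript
import { createHash } from 'crypto'

// Deterministic hash of the presented API key for indexed lookup.
function hashApiKey(key: string): string {
  return createHash('sha256').update(key).digest('hex')
}

// Lookup shape: SELECT ... WHERE key_hash = $1 (findByHash is injected
// here purely for illustration).
async function authenticateApiKey(
  key: string,
  findByHash: (hash: string) => Promise<{ userId: string } | null>
): Promise<{ userId: string } | null> {
  return findByHash(hashApiKey(key))
}
```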
2026-04-22 18:30:37 -04:00
Vikhyath Mondreti
8c9ddefc53 fix(otel): chat root OTel span on all early-return paths (#4265) 2026-04-22 13:54:47 -07:00
Theodore Li
d927d8bdff fix(db): raise db pool size (#4263)
* fix(db): raise db pool size

* Raise socket connections

* bump up connection size even more
2026-04-22 14:25:13 -04:00
Siddharth Ganesan
0aeab026a8 feat(observability): add mothership tracing (#4253) 2026-04-22 09:06:01 -07:00
Vikhyath Mondreti
41a1b50ace improvement(migrations): log better errors (#4260) 2026-04-21 22:06:05 -07:00
Waleed
7941dcde98 fix(docs): update simstudio.ai URLs to sim.ai in SSO docs (#4257)
* fix(docs): update simstudio.ai URLs to sim.ai in SSO docs

* improvement(docs): remove plan defaults table from data retention docs

* improvement(docs): consolidate self-hosting info at bottom of enterprise docs

* improvement(docs): reduce callout and FAQ overuse in enterprise docs

* improvement(docs): restore FAQs and genuine-gotcha callouts
2026-04-21 21:18:32 -07:00
Vikhyath Mondreti
51ace655e4 fix(migration): permission group migration error (#4258) 2026-04-21 21:10:50 -07:00
Waleed
c2529c3f2c fix(settings): hide data-retention nav item when user lacks enterprise plan (#4256) 2026-04-21 20:19:04 -07:00
Waleed
45bf396968 fix(deps): bump drizzle-orm 0.45.2 + adopt MCP SDK 1.25.3 native types (#4252)
* fix(deps): bump drizzle-orm to 0.45.2 (GHSA-gpj5-g38j-94v9)

Resolves Dependabot alert #98. Drizzle ORM <0.45.2 improperly escaped
quoted SQL identifiers, allowing SQL injection via untrusted input
passed to APIs like sql.identifier() or .as().

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

* chore(mcp): adopt native SDK types after @modelcontextprotocol/sdk 1.25.3 bump

Replace hand-written schema/annotation shapes with the SDK's exported
Tool, JSONRPCResultResponse, and Tool['annotations'] types so changes
upstream flow through automatically.

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

* refactor(types): use drizzle $inferSelect for row types

Replace hand-written interfaces that duplicated schema shape with
typeof table.$inferSelect aliases for webhook, workflow, and
workspaceFiles rows. Also simplify metadata insert/update to use
.returning() instead of field-by-field copies.

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

* fix(uploads): fall through to INSERT if restore-deleted row races a hard delete

If a hard delete races between the initial SELECT and the restore UPDATE,
.returning() yields no row. Previously the function would return undefined
and silently violate the Promise<FileMetadataRecord> contract. Now the
function falls through to the INSERT path, which already handles
uniqueness races via the 23505 catch.
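The control flow described above, reduced to a sketch (shapes assumed; the real function operates on Drizzle queries):

```typescript
type Row = { id: string }

// If the restore UPDATE's .returning() yields no row (a hard delete
// raced between the SELECT and the UPDATE), fall through to INSERT
// instead of returning undefined. The INSERT path is assumed to handle
// its own uniqueness races via a 23505 catch.
async function restoreOrInsert(
  tryRestore: () => Promise<Row | undefined>,
  insert: () => Promise<Row>
): Promise<Row> {
  const restored = await tryRestore()
  if (restored) return restored
  return insert()
}
```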

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

* chore(uploads): align metadata.ts with global standards

Replace dynamic uuid import with generateId() per @sim/utils/id
convention, narrow the error catch off `any`, and convert the inline
comment to TSDoc.

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.7 <noreply@anthropic.com>
2026-04-21 19:52:15 -07:00
Waleed
193f06fcf5 fix(aws): add validateAwsRegion to all AWS route schemas to prevent SSRF (#4250)
* fix(aws): add validateAwsRegion to all AWS route schemas to prevent SSRF

* fix(validation): add mx and eu-isoe prefixes to validateAwsRegion regex

* test(validation): add mx-central-1, eu-isoe-west-1, and us-iso-west-1 region test cases

* fix(aws): eliminate double validateAwsRegion call and fix regex alternation order

- Replace double-call .refine() pattern with single-call + static message across all 61 AWS routes
- Reorder regex alternation to put longer prefixes first (eu-isoe before eu, us-isob/us-iso/us-gov before us) for engine-agnostic correctness
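Why ordering matters: regex alternation tries branches left to right, so with `eu` before `eu-isoe` the short branch matches first. A backtracking engine may recover by retrying the longer branch, but linear-time engines will not, so putting longer prefixes first is the engine-agnostic fix. An illustrative pattern (not the exact production regex):

```typescript
// Longer alternatives first: us-gov/us-isob/us-iso before us,
// eu-isoe before eu.
const regionPattern = /^(us-gov|us-isob|us-iso|us|eu-isoe|eu|mx|ap|ca|sa|af|me|il)-[a-z]+-\d+$/

function isValidAwsRegion(region: string): boolean {
  return regionPattern.test(region)
}
```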
2026-04-21 19:22:50 -07:00
Waleed
34cfc2689a improvement(contact): add Turnstile CAPTCHA, honeypot, and robustness fixes (#4248)
* improvement(contact): add Turnstile CAPTCHA, honeypot, and robustness fixes

- Add Cloudflare Turnstile with graceful degradation: when the widget
  fails to load (ad blockers, iOS privacy, corporate DNS), submissions
  fall through to a tighter rate-limit bucket rather than hard-blocking
- Add honeypot field to filter automated submissions without user impact
- Add separate CAPTCHA_UNAVAILABLE_RATE_LIMIT bucket (3/min) for the
  no-captcha path so spam via ad-blocker bypass remains expensive
- Pass expectedHostname to verifyTurnstileToken to close cross-site
  token reuse gap
- Add SITE_HOSTNAME as module-level constant (avoid URL parsing per req)
- Wire onExpire/onError/onUnsupported callbacks so token expiry during
  slow form-filling falls back gracefully instead of showing a captcha error
- Add getResponsePromise(30_000) timeout to prevent indefinite hang on
  network blips
- Add size: 'invisible' to Turnstile options (required for execute mode)
- Move turnstile.ts to lib/core/security/ alongside csp/encryption/input-validation
- Switch all CSS to --landing-* variables throughout contact form
- Move error display inline next to label with truncation in LandingField
- Add labelClassName prop to LandingField for context-specific overrides
- Simplify contact page to single-column max-w-[640px] layout
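The degradation policy in bucket form, sketched (the 3/min no-captcha limit is from this message; the with-captcha limit and names are assumptions):

```typescript
const RATE_LIMITS = {
  withCaptcha: { max: 10, windowMs: 60_000 }, // assumed normal bucket
  captchaUnavailable: { max: 3, windowMs: 60_000 }, // CAPTCHA_UNAVAILABLE_RATE_LIMIT
} as const

// A missing token (widget blocked by ad blockers, iOS privacy, corporate
// DNS) falls to the tighter bucket instead of hard-blocking the submit.
function pickRateLimit(captchaToken: string | null) {
  return captchaToken ? RATE_LIMITS.withCaptcha : RATE_LIMITS.captchaUnavailable
}
```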

* fix(contact): fall through to no-captcha rate limit on Cloudflare transport errors

* chore(contact): remove extraneous comments from route

* fix(contact): remove forced min-height on success state, let content flow naturally

* fix(contact): cast CONTACT_TOPIC_OPTIONS to satisfy Combobox mutable type

* fix(contact): disable submit during CAPTCHA resolution window, add relative to form
2026-04-21 18:25:48 -07:00
Waleed
2d94b3729d feat(integrations): AWS SES, IAM Identity Center, and enhanced IAM/STS/CloudWatch/DynamoDB (#4245)
* feat(integrations): add AWS SES, IAM Identity Center, and enhanced IAM/STS/CloudWatch/DynamoDB integrations

- Add AWS SES v2 integration with 9 operations (send email, templated, bulk, templates, account)
- Add AWS IAM Identity Center integration with 12 operations (account assignments, permission sets, users, groups)
- Add 3 new IAM tools: list-attached-role-policies, list-attached-user-policies, simulate-principal-policy
- Fix DynamoDB duplicate subBlock IDs, add operation-scoped field names, add subblock migrations
- Add authMode: AuthMode.ApiKey to DynamoDB block
- Fix CloudWatch routes: toError, client.destroy(), withRouteHandler, auth outside try
- Fix STS/DynamoDB/IAM routes: nullable Zod schemas, withRouteHandler adoption
- Fix Identity Center: list_instances pagination, list_groups instanceArn condition
- Add subblock migrations for renamed DynamoDB fields (key, filterExpression, etc.)
- Apply withRouteHandler to all new and existing AWS tool routes

* docs(ses): add manual intro section to SES docs

* fix(dynamodb): add legacy fallbacks in params for subblock migration compatibility

Workflows saved with the old shared IDs (key, filterExpression, etc.) that migrate
to get-scoped slots via subblock-migrations still work correctly on update/delete/scan/put
operations via fallback lookups in tools.config.params.

* feat(contact): add contact page, migrate help/demo forms to useMutation (#4242)

* feat(contact): add contact page, migrate help/demo forms to useMutation

* improvement(contact): address greptile review feedback

- Map contact topic to help email type for accurate confirmation emails
- Drop Zod schema details from 400 response on public /api/contact
- Wire aria-describedby + aria-invalid in LandingField for both forms
- Reset helpMutation on modal reopen to match demo-request pattern

* improvement(landing): extract shared LandingField component

* fix(landing): resolve error-page crash on invalid /models and /integrations routes (#4243)

* fix(layout): use plain inline script for PublicEnvScript to set env before chunks eval on error pages

* fix(landing): handle runtime env race on error-page renders

React skips SSR on unhandled server errors and re-renders on the client
(see vercel/next.js#63980, #82456). Root-layout scripts — including the
runtime env script that populates window.__ENV — are inserted but not
executed on that client re-render, so any client module that reads env
at module evaluation crashes the render into a blank "Application error"
overlay instead of rendering the styled 404.

This replaces the earlier PublicEnvScript tweak with the architectural
fix:

- auth-client.ts: fall back to window.location.origin when getBaseUrl()
  throws on the client. Auth endpoints are same-origin, so this is the
  correct baseURL on the client. Server-side we still throw on genuine
  misconfig.
- loading.tsx under /models/[provider], /models/[provider]/[model], and
  /integrations/[slug]: establishes a Suspense boundary below the root
  layout so a page-level notFound() no longer invalidates the layout's
  SSR output (the fix endorsed by Next.js maintainers in #63980).
- layout.tsx: revert disableNextScript — the research showed this
  doesn't actually fix error-page renders. The real fix is above.
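The auth-client fallback described above, sketched (getBaseUrl is the app's helper; the wrapper shape here is assumed):

```typescript
// When the runtime env script hasn't executed on an error-page client
// render, getBaseUrl() throws. Auth endpoints are same-origin, so the
// page origin is the correct client-side baseURL; server-side, a throw
// is a genuine misconfiguration and should surface.
function resolveAuthBaseUrl(getBaseUrl: () => string): string {
  try {
    return getBaseUrl()
  } catch (error) {
    if (typeof window !== 'undefined') {
      return window.location.origin
    }
    throw error
  }
}
```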

* improvement(landing): use emcn Loader in scoped loading.tsx, trim auth-client comment

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.7 <noreply@anthropic.com>

* fix(iam): correct MissingContextValues mapping in simulatePrincipalPolicy

* fix(aws): add conditionExpression migration fallback for DynamoDB delete, fix SES pageSize min

* fix(aws): deep validation fixes across SES, IAM, Identity Center, DynamoDB integrations

- IAM: replace non-existent StatementId with SourcePolicyType in simulatePrincipalPolicy
- IAM: add .int() constraint to list-users/roles/policies/groups Zod schemas
- IAM: remove redundant manual requestId from all 21 IAM route handlers
- SES: add .refine() body validation to create-template route
- SES: make bulk email destination templateData optional, only include ReplacementEmailContent when present
- SES: fix pageSize guard to if (pageSize != null) to correctly forward 0
- SES: add max(100) to list-templates pageSize, revert list-identities to min(0) per SDK
- STS: fix logger.error calls to use structured metadata pattern
- Identity Center: remove deprecated account.Status fallback, use account.State only
- DynamoDB: convert empty interface extends to type aliases, remove redundant error field, fix barrel to absolute imports
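The pageSize guard bug, isolated (illustrative shape, not the exact SES route): `0` is falsy, so a truthiness check silently drops an explicit pageSize of 0 instead of forwarding it to the SDK; `!= null` forwards 0 while still skipping undefined/null.

```typescript
function buildListParams(pageSize?: number): { PageSize?: number } {
  const params: { PageSize?: number } = {}
  // `if (pageSize)` would drop 0 here; `!= null` forwards it.
  if (pageSize != null) params.PageSize = pageSize
  return params
}
```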

* regen docs

* fix(iam): add .int() constraint to maxSessionDuration in create-role route

* fix(ses): forward pageSize=0 correctly in listIdentities util

* fix(aws): add gradient background to IdentityCenterIcon, fix listTemplates pageSize guard

---------

Co-authored-by: Claude Opus 4.7 <noreply@anthropic.com>
2026-04-21 16:30:43 -07:00
Vikhyath Mondreti
aee6189d14 improvement(access-control): migrate to workspace scope (#4244)
* improvement(access-control): migrate to workspace scope

* fix edge cases

* update docs

* prep merge

* regen migrations

* address comments

* add ws id, user constraint

* address more comments

* address ui comments

* address more comments
2026-04-21 15:53:17 -07:00
Emir Karabeg
3a0e7b89a4 seo(robots): disallow tag-filtered blog URLs from crawlers (#4247) 2026-04-21 15:13:08 -07:00
Waleed
d185953248 improvement(landing): scope navbar/footer to (shell) route group, align scoped 404s with root (#4246)
* improvement(landing): scope navbar/footer shell to (shell) route group, align scoped 404s with root

Move integrations and models page routes into a `(shell)` route group so the Navbar+Footer layout wraps pages but not `not-found.tsx`. This lets scoped 404s render the same `<AuthBackground>` + Navbar treatment as the root `/` 404, instead of appearing inside the landing CTA footer.

Extract the shared 404 markup into `<NotFoundView>` so root, integrations, and models 404s share a single source of truth. Route URLs are unchanged — route groups are URL-transparent.

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

* fix(landing): convert relative imports to absolute in integrations (shell) page

Build failed because the move into the (shell) route group invalidated relative `./components/...` and `./data/...` imports. CLAUDE.md mandates absolute imports throughout — switching these resolves the Turbopack build errors.

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.7 <noreply@anthropic.com>
2026-04-21 14:03:51 -07:00
Waleed
42ef2b1dbb fix(landing): resolve error-page crash on invalid /models and /integrations routes (#4243)
* fix(layout): use plain inline script for PublicEnvScript to set env before chunks eval on error pages

* fix(landing): handle runtime env race on error-page renders

React skips SSR on unhandled server errors and re-renders on the client
(see vercel/next.js#63980, #82456). Root-layout scripts — including the
runtime env script that populates window.__ENV — are inserted but not
executed on that client re-render, so any client module that reads env
at module evaluation crashes the render into a blank "Application error"
overlay instead of rendering the styled 404.

This replaces the earlier PublicEnvScript tweak with the architectural
fix:

- auth-client.ts: fall back to window.location.origin when getBaseUrl()
  throws on the client. Auth endpoints are same-origin, so this is the
  correct baseURL on the client. Server-side we still throw on genuine
  misconfig.
- loading.tsx under /models/[provider], /models/[provider]/[model], and
  /integrations/[slug]: establishes a Suspense boundary below the root
  layout so a page-level notFound() no longer invalidates the layout's
  SSR output (the fix endorsed by Next.js maintainers in #63980).
- layout.tsx: revert disableNextScript — the research showed this
  doesn't actually fix error-page renders. The real fix is above.

* improvement(landing): use emcn Loader in scoped loading.tsx, trim auth-client comment

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.7 <noreply@anthropic.com>
2026-04-21 12:43:43 -07:00
Waleed
2456128fb7 feat(contact): add contact page, migrate help/demo forms to useMutation (#4242)
* feat(contact): add contact page, migrate help/demo forms to useMutation

* improvement(contact): address greptile review feedback

- Map contact topic to help email type for accurate confirmation emails
- Drop Zod schema details from 400 response on public /api/contact
- Wire aria-describedby + aria-invalid in LandingField for both forms
- Reset helpMutation on modal reopen to match demo-request pattern

* improvement(landing): extract shared LandingField component
2026-04-21 12:20:09 -07:00
Theodore Li
699bbfd16f feat(log): Add wrapper function for standardized logging (#4061)
* feat(log): Add wrapper function for standardized logging

* Add all routes to wrapper, handle background execution

* fix lint

* fix test

* fix test missing url

* fix lint

* fix tests

* fix build

* fix(build): unmangle generic in admin outbox requeue route

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-21 13:21:13 -04:00
Waleed
0e1ff0a1ac improvement(enterprise): slack wizard UI, enterprise docs, data retention updates (#4241)
* improvement(enterprise): slack wizard UI, enterprise docs, data retention updates

* improvement(docs): add enterprise screenshots to sso, access-control, whitelabeling pages

* form

* fix(enterprise): address PR review — h-full for recently-deleted, shared SettingRow, toast UX, stale form fix, emcn tokens

* fix(whitelabeling): scope drop zone to thumbnail only, not full upload row

* fix(whitelabeling): remove drop image text from drag overlay

* fix(config): add DATA_RETENTION_ENABLED to env schema to fix build type error

* fix(testing): add isDataRetentionEnabled to feature flags mock

* improvement(docs): remove redundant requirements section from data-retention page

* improvement(docs): remove requirements sections from all enterprise doc pages

* improvement(docs): add screenshot to audit-logs page

* fix(data-retention): bypass enterprise gate when billing is disabled for self-hosted
2026-04-20 23:24:14 -07:00
Waleed
94202ed229 improvement(knowledge): show selector with saved option in connector edit modal (#4240)
* improvement(knowledge): show selector with saved option in connector edit modal

* fix(kb-connectors): clear canonical siblings when non-canonical dep changes; share selector field

* refactor(kb-connectors): extract canonical-field logic into useConnectorConfigFields hook

* fix(kb-connectors): only merge changed fields into sourceConfig on edit save

Avoids writing spurious empty-string keys for untouched optional fields when
another field triggers a save.

* refactor(kb-connectors): tighten state primitives in modals

- edit modal: replace useMemo([]) + eslint-disable with useState lazy
  initializer for initialSourceConfig — same mount-once semantics
  without the escape hatch.
- add modal: drop useCallback on handleConnectNewAccount (no observer
  saw the reference) and inline the one call site.
2026-04-20 20:49:58 -07:00
Theodore Li
802f4cf0fc feat(jobs): Add data retention jobs (#4128)
* feat(jobs): Add data retention jobs

Add 3 cron-triggered cleanup jobs dispatched via Trigger.dev (or inline fallback):
- cleanup-soft-deletes: hard-deletes soft-deleted workspace resources past retention
- cleanup-logs: deletes expired workflow execution logs + S3 files
- cleanup-tasks: deletes expired copilot chats, runs, feedback, inbox tasks

Enterprise admins can configure per-workspace retention via Settings > Data Retention.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

# Conflicts:
#	packages/db/migrations/meta/0192_snapshot.json
#	packages/db/migrations/meta/_journal.json
#	packages/db/schema.ts

* Clean up orphaned rows using ids, not timestamp sorting

* fix lint
2026-04-20 20:24:39 -04:00
Waleed
ac4ccfcac8 fix(billing): close TOCTOU race in subscription transfer, centralize stripe test mocks (#4239)
* fix(billing): close TOCTOU race in subscription transfer, centralize stripe test mocks

* more mocks

* fix(testing): provide complete Stripe.Event defaults in createMockStripeEvent

* fix(testing): make dbChainMock .for('update') chainable with .limit()

* fix(billing): gate subscription transfer noop behind membership check

Previously the 'already belongs to this organization' early return fired
before the org/member lookups, letting any authenticated caller probe
sub-to-org pairings without being a member of the target org. Move the
noop check after the admin/owner verification so unauthorized callers
hit the 403 first.
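The check ordering, sketched (names and return values are illustrative): authorization runs before the noop early return, so non-members can't distinguish "already attached" from "forbidden".

```typescript
function transferSubscription(
  callerRole: 'owner' | 'admin' | 'member' | null,
  currentOrgId: string,
  targetOrgId: string
): 'forbidden' | 'noop' | 'transferred' {
  // Admin/owner verification first: unauthorized callers always see 403
  // and cannot probe sub-to-org pairings via the noop response.
  if (callerRole !== 'owner' && callerRole !== 'admin') return 'forbidden'
  if (currentOrgId === targetOrgId) return 'noop'
  return 'transferred'
}
```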
2026-04-20 16:46:28 -07:00
Waleed
0cd14f4ac9 improvement(sso): fix provider lookup, migrate UI to emcn, add enterprise SSO docs (#4238)
* improvement(sso): fix provider lookup, migrate UI to emcn, add enterprise SSO docs

* fix(sso): add org membership guard on providers route, fix idpMetadata round-trip

* fix(sso): add org membership guard on register route, fix SP entityID, remove fullError leak

* fix(sso): fix SAML script callbackUrl and SP entityID to use app base URL

* fix(sso): correct SAML callback URL path in script header comment

* fix(sso): restrict SSO provider read/write to org owners and admins

* docs(sso): restructure page, fix provider guide accuracy, add external doc links

* fix(sso): correct SAML callback path and generate idpMetadata from cert+entryPoint

* fix(sso): always require NEXT_PUBLIC_APP_URL for SAML SP metadata entityID

* fix(sso): scope provider query to org only when organizationId is provided

* fix(sso): escape XML special chars in script idpMetadata generation

* fix(sso): final audit corrections — saml mapping, xml escaping, self-hosted org guard

* fix(sso): redact oidc client secret in providers response, add self-hosted org admin guard

* fix(sso): scope redacted-secret lookup to caller's org or userId

* fix(sso): null out oidcConfig on parse failure to prevent unredacted secret leak

* fix(sso): use issuer as entityID in auto-generated idp metadata xml
2026-04-20 16:45:37 -07:00
Vikhyath Mondreti
d9209f9588 improvement(governance): workspace-org invitation system consolidation (#4230)
* workspace re-org checkpoint

* admin route reconciliation

* checkpoint consistency fixes

* prep merge

* regen migration

* checkpoint

* code cleanup

* update docs

* add feature for owner to leave + admin route

* address comments

* fix new account race

* address comments
2026-04-20 14:45:07 -07:00
Theodore Li
2ae1ad293f feat(ui): Add slack manifest generator (#4237)
* feat(slack): add manifest copying

* Try using wizard modal

* clean up modal

* feat(ui): Add wizard emcn component

* Made modal subblock more generic

* Fix test

* Address greptile comments

* fix handle copy behavior

* Add secret input emcn type

* Lint code
2026-04-20 16:50:30 -04:00
Waleed
febc36ff9c fix(security): enforce URL validation across connectors, providers, and auth flows (SSRF + open-redirect hardening) (#4236)
* fix(workday): validate tenantUrl to prevent SSRF in SOAP client

* fix(workday): use validation.sanitized in buildWsdlUrl

* fix(security): enforce URL validation across connectors, providers, auth

- Azure OpenAI/Anthropic: validate user-supplied azureEndpoint with validateUrlWithDNS to block SSRF to private IPs, localhost (in hosted mode), and dangerous ports.
- ServiceNow connector: enforce ServiceNow domain allowlist via validateServiceNowInstanceUrl before calling the instance URL.
- Obsidian connector: validate vaultUrl with validateUrlWithDNS and reuse the resolved IP via secureFetchWithPinnedIPAndRetry to block DNS rebinding between validation and request.
- Signup + verify flows: pass redirect/callbackUrl/redirectAfter and stored inviteRedirectUrl through validateCallbackUrl; drop unsafe values and log a warning.
- lib/knowledge/documents/utils.ts: add secureFetchWithPinnedIPAndRetry wrapper around secureFetchWithPinnedIP (used by Obsidian).

* fix(obsidian): use isomorphic SSRF validation to unblock client build

The Obsidian connector is reachable from client bundles via `connectors/registry.ts` (the knowledge UI reads metadata like `.icon`/`.name`). Importing `validateUrlWithDNS` / `secureFetchWithPinnedIP` from `input-validation.server` pulled `dns/promises`, `http`, `https`, `net` into client chunks, breaking the Turbopack build:

  Module not found: Can't resolve 'dns/promises'
  ./apps/sim/lib/core/security/input-validation.server.ts [Client Component Browser]
  ./apps/sim/connectors/obsidian/obsidian.ts [Client Component Browser]
  ./apps/sim/connectors/registry.ts [Client Component Browser]

Once that file polluted a browser context, Turbopack also failed to resolve the Node builtins in its legitimate server-route imports, cascading the error across App Routes and Server Components.

Fix: switch the Obsidian connector to the isomorphic `validateExternalUrl` + `fetchWithRetry` helpers, matching the pattern used by every other connector in the registry. This keeps the core SSRF protections:
  - hosted Sim: blocks localhost, private IPs, HTTP (HTTPS enforced)
  - self-hosted Sim: allows localhost + HTTP, still blocks non-loopback private IPs and dangerous ports (22, 25, 3306, 5432, 6379, 27017, 9200)
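A simplified sketch of that policy (the real validateExternalUrl also resolves hostnames and rejects non-loopback private IP ranges, which is omitted here):

```typescript
const DANGEROUS_PORTS = new Set(['22', '25', '3306', '5432', '6379', '27017', '9200'])

// hosted: HTTPS only, no localhost. self-hosted: localhost + HTTP
// allowed (the Obsidian Local REST API case), dangerous ports still
// blocked in both modes.
function checkExternalUrl(raw: string, selfHosted: boolean): boolean {
  const url = new URL(raw)
  const isLocalhost = url.hostname === 'localhost' || url.hostname === '127.0.0.1'
  if (url.port && DANGEROUS_PORTS.has(url.port)) return false
  if (!selfHosted) {
    if (url.protocol !== 'https:') return false
    if (isLocalhost) return false
  }
  return true
}
```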

Drops the DNS-rebinding defense specifically (the IP-pinned fetch chain). The trade-off is acceptable because the vault URL is entered by the workspace admin — not arbitrary untrusted input — and hosted deployments already force the plugin to be exposed through a public URL (tunnel/port-forward), making rebinding a narrow threat.

Also reverts the `secureFetchWithPinnedIPAndRetry` wrapper in `lib/knowledge/documents/utils.ts` (no longer needed, and its `.server` import was the original source of the client-bundle pollution).

* fix(servicenow): propagate URL validation errors in getDocument

Match listDocuments behavior — invalid instance URL should surface as a
configuration error rather than being swallowed into a "document not found"
null response during sync.

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

* fix(obsidian): drop allowHttp to restore HTTPS enforcement in hosted mode

allowHttp: true permitted plaintext HTTP for all hosts in all deployment
modes, contradicting the documented policy. The default validateExternalUrl
behavior already allows http://localhost in self-hosted mode (the actual
Obsidian Local REST API use case) via the built-in carve-out, while correctly
rejecting HTTP for public hosts in hosted mode — which prevents leaking the
Bearer access token over plaintext.

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.7 <noreply@anthropic.com>
2026-04-20 10:54:03 -07:00
Waleed
5cf7e8d546 improvement(codebase): migrate tests to dbChainMock, extract react-query hooks (#4235)
* improvement(codebase): migrate tests to dbChainMock, extract react-query hooks

Migrate 97 test files to centralized dbChainMock/dbChainMockFns helpers from
@sim/testing — removes hoisted chain-wiring boilerplate.

Extend dbChainMock to cover insert/update/delete/transaction/execute patterns.
Extract useGitHubStars and useVoiceSettings react-query hooks from inline fetches.
Centralize additional mocks (authMockFns, hybridAuthMockFns) and update docs.

* fix(github-stars): centralize fallback via initialData, remove stale constants

Move the placeholder star count into useGitHubStars as initialData with
initialDataUpdatedAt: 0 so `data` is always a narrowed string while still
refetching on mount. Fixes two Bugbot issues: stale '25.8k' in chat.tsx
(vs '27.8k' in navbar) and empty-string return in fetchGitHubStars that
bypassed `??` fallbacks in consumers.

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

* fix(testing): wire dbChainMock.db to shared transaction and execute fns

dbChainMock.db.transaction was an inline vi.fn() separate from the exported
dbChainMockFns.transaction, so dbChainMockFns.transaction.mockResolvedValueOnce
and assertions silently targeted the wrong instance. dbChainMock.db also
omitted execute, so tests for any module that calls db.execute (logging-session,
table service, billing balance) would throw TypeError. Both mocks now reference
the module-level constants so overrides and resetDbChainMock affect the same fn.

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

* fix(chat,testing): memoize welcome message and add selectDistinct to dbChainMock.db

Why:
- Welcome ChatMessage was rebuilt inline each render, producing a fresh
  timestamp and new array identity — cascading to ChatMessageContainer
  and VoiceInterface props on every tick.
- dbChainMockFns exports selectDistinct/selectDistinctOn but the
  dbChainMock.db object omitted them, so tests that stub those builders
  hit undefined on the mocked module.

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

* fix(chat): re-attach scroll listener once container mounts

The scroll effect's empty dep array meant it ran only on the first
render, when `chatConfig` is still loading and the component returns
`<ChatLoadingState />` — so `messagesContainerRef.current` was null and
the listener was never attached. Depend on the gating conditions that
control which tree renders, so the effect re-runs once the real
container is in the DOM (and re-attaches when toggling in/out of voice
mode).

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

* fix(chat): reset chat state on identifier change via key prop

Keying `<ChatClient>` on `identifier` guarantees a full remount on
route transitions between chats, so `conversationId`, `messages`, and
every other piece of local state start fresh — no reset effect
required.

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.7 <noreply@anthropic.com>
2026-04-19 23:05:06 -07:00
Waleed
1ced54a77c fix(settings): restore paste-to-destructure for workspace secrets, cleanup hooks and design tokens (#4231)
* fix(settings): restore paste-to-destructure for workspace secrets, cleanup hooks and design tokens

Restores the env-var paste feature (KEY=VALUE → split rows, multi-line
→ multi rows) for workspace secrets that was lost when the unified
Credentials tab was split into Secrets and Integrations. Adds
`parseEnvVarLine`, `parseValidEnvVars`, and `handleWorkspacePaste` with
full support for export prefix, quoted values, inline comments, and
base64 false-positive guards. Also adds consistent value masking
(show on focus / mask on blur) to new workspace input rows.
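The line parsing described above, sketched (simplified relative to the real parseEnvVarLine; the base64 false-positive guard is omitted):

```typescript
// Parse one pasted line into a KEY/VALUE pair: optional `export `
// prefix, quoted values kept verbatim, inline comments stripped from
// unquoted values. Returns null for non-env-var lines.
function parseEnvVarLine(line: string): { key: string; value: string } | null {
  const trimmed = line.trim().replace(/^export\s+/, '')
  const eq = trimmed.indexOf('=')
  if (eq <= 0) return null
  const key = trimmed.slice(0, eq).trim()
  if (!/^[A-Za-z_][A-Za-z0-9_]*$/.test(key)) return null
  let value = trimmed.slice(eq + 1).trim()
  // length >= 2 guard: a lone quote like KEY=" must not strip to empty.
  if (value.length >= 2 && /^(["']).*\1$/.test(value)) {
    value = value.slice(1, -1)
  } else {
    value = value.replace(/\s+#.*$/, '')
  }
  return { key, value }
}
```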

Cleans up ~20 unnecessary `useCallback` wrappers, fixes a direct state
mutation in `handleSingleValuePaste`, moves `e.preventDefault()` inside
the `parsedVars.length > 0` guard, replaces all hardcoded hex colors
with CSS variable tokens, converts template-literal classNames to `cn()`,
and replaces raw `<button>` with emcn `Button`.

* fix(settings): fix handlePaste silent swallow, quote-strip bug, and credential sync efficiency

- Move e.preventDefault() inside parsedVars guard in handlePaste so KEY= lines
  don't silently discard input (mirrors handleWorkspacePaste fix from same PR)
- Add value.length >= 2 guard before quote-stripping to prevent single-char
  values like KEY=\" from being stripped to empty and silently dropped
- Introduce createWorkspaceEnvCredentials and deleteWorkspaceEnvCredentials
  for delta-aware credential sync (O(k) instead of O(n*m) for env var mutations)
- Fix createWorkspaceEnvCredentials early-return bug that skipped credential
  record creation when workspace had zero members
- Update credentials/[id] DELETE to use deleteWorkspaceEnvCredentials instead
  of full syncWorkspaceEnvCredentials
- Optimize syncWorkspaceEnvCredentials to fetch workspace+member IDs in parallel
  once instead of once per credential

* fix(settings): normalize Windows line endings in paste handlers

* fix(settings): eliminate double-parse in handlePaste by inlining handleKeyValuePaste
2026-04-18 21:22:29 -07:00
Waleed
951fbd4ded fix(landing): render proper 404 for invalid /models and /integrations routes (#4232) 2026-04-18 21:13:48 -07:00
Waleed
f91c1b614a chore(docker): add packages/utils to app and realtime Dockerfiles (#4229)
* chore(docker): add packages/utils to app and realtime Dockerfiles

* chore(docker): copy packages/utils in realtime runner stage
2026-04-18 18:00:44 -07:00
Waleed
b5674d9ed4 improvement(codebase): centralize test mocks, extract @sim/utils, remove dead code (#4228)
* improvement(codebase): centralize test mocks, extract @sim/utils, remove dead code

* improvement(codebase): apply @sim/utils conventions to staging-introduced files
2026-04-18 14:39:03 -07:00
Theodore Li
c19187257e fix(ui): stop scrolling on leaving workflow sidebar for drag-drop (#4139)
* fix(ui): stop scrolling on leaving workflow sidebar for drag-drop

* Address comments, fix hover state

* address comments
2026-04-18 15:45:25 -04:00
Vikhyath Mondreti
c246f5c660 improvement(billing): route scope by subscription referenceId, sync plan from Stripe, transfer storage on org join, outbox service (#4219)
* fix(billing): route scope by subscription referenceId, sync plan from Stripe, transfer storage on org join

Route every billing decision (usage limits, credits, storage, rate
limit, threshold billing, webhooks, UI permissions) through the
subscription's `referenceId` instead of plan-name heuristics. Fixes
the production state where a `pro_6000` subscription attached to an
organization was treated as personal Pro by display/edit code while
execution correctly enforced the org cap.

Scope
- Add `isOrgScopedSubscription(sub, userId)` (pure) and
  `isSubscriptionOrgScoped(sub)` (async DB-backed) helpers. One is
  used wherever a user perspective is available; the other in webhook
  handlers that only have a subscription row.
- Replace plan-name scope checks in ~20 files: usage/limit readers,
  credits balance + purchase, threshold billing, storage limits +
  tracking, rate limiter, invoice + subscription webhooks, seat
  management, membership join/leave, `switch-plan` admin gate,
  admin credits/billing routes, copilot 402 handler, UI subscription
  settings + permissions + sidebar indicator, React Query types.

Plan sync
- Add `syncSubscriptionPlan(subscriptionId, currentPlan, planFromStripe)`
  called from `onSubscriptionComplete` and `onSubscriptionUpdate` so
  the DB `plan` column heals on every Stripe event. Pro->Team upgrades
  previously updated price, seats, and referenceId but left `plan`
  stale — this is what produced the `pro_6000`-on-org row.

Priority + grace period
- `getHighestPrioritySubscription` now prefers org over personal
  within each tier (Enterprise > Team > Pro, org > personal at each).
  A user with a `cancelAtPeriodEnd` personal Pro who joins a paid org
  routes pooled resources to the org through the grace window.
- `calculateSubscriptionOverage` personal-Pro branch reads user_stats
  directly (bypassing priority) and bills only `proPeriodCostSnapshot`
  when the user joined a paid org mid-cycle, so post-join org usage
  isn't double-charged on the personal Pro's final invoice.
  `resetUsageForSubscription` mirrors this: preserves
  `currentPeriodCost` / `currentPeriodCopilotCost` when
  `proPeriodCostSnapshot > 0` so the org's next cycle-close captures
  post-join usage correctly.

Uniform base-price formula
- `basePrice × (seats ?? 1)` everywhere: `getOrgUsageLimit`,
  `updateOrganizationUsageLimit`, `setUsageLimitForCredits`,
  `calculateSubscriptionOverage`, threshold billing,
  `syncSubscriptionUsageLimits`, `getOrganizationBillingData`.
  Admin dashboard math now agrees with enforcement math.

Storage transfer on join
- Invitation-accept flow moves `user_stats.storageUsedBytes` into
  `organization.storageUsedBytes` inside the same transaction when
  the org is paid.
- `syncSubscriptionUsageLimits` runs a bulk-backfill version so
  members who joined before this fix, or orgs that upgraded from
  free to paid after members joined, get pulled into the org pool
  on the next subscription event. Idempotent.

UX polish
- Copilot 402 handler differentiates personal-scoped ("increase your
  usage limit") from org-scoped ("ask an owner or admin to raise the
  limit") while keeping the `increase_limit` action code the parser
  already understands.
- Duplicate-subscription error on team upgrade names the existing
  plan via `getDisplayPlanName`.
- Invitation-accept invalidates subscription + organization React
  Query caches before redirect so settings doesn't flash the user's
  pre-join personal view.

Dead code removal
- Remove unused `calculateUserOverage`, and the following fields on
  `SubscriptionBillingData` / `getSimplifiedBillingSummary` that no
  consumer in the monorepo read: `basePrice`, `overageAmount`,
  `totalProjected`, `tierCredits`, `basePriceCredits`,
  `currentUsageCredits`, `overageAmountCredits`, `totalProjectedCredits`,
  `usageLimitCredits`, `currentCredits`, `limitCredits`,
  `lastPeriodCostCredits`, `lastPeriodCopilotCostCredits`,
  `copilotCostCredits`, and the `organizationData` subobject. Add
  `metadata: unknown` to match what the server returns.

Notes for the triggering customer
- The `pro_6000`-on-org row self-heals on the next Stripe event via
  `syncSubscriptionPlan`. For the one known customer, a direct
  UPDATE is sufficient:
  `UPDATE subscription SET plan='team_6000' WHERE id='aq2...' AND plan='pro_6000'`.

Made-with: Cursor

* fix tests

* address more comments

* progress

* harden further

* outbox service

* address comments

* address comment on check

* simplify

* cleanup code

* minor improvement
2026-04-18 10:46:14 -07:00
Waleed
28b4c4cc67 fix(blocks): resolve variable display in mothership resource preview (#4226)
* fix(blocks): resolve variable display in mothership resource preview

Variables block showed empty assignments in the embedded workflow preview
because currentWorkflowId was read from URL params, which don't contain
workflowId in the mothership route. Fall back to activeWorkflowId from
the workflow registry.

* fix(blocks): narrow currentWorkflowId to string to satisfy strict null checks
2026-04-18 10:26:46 -07:00
Vikhyath Mondreti
32541e79d4 chore(readme): update tech stack section (#4227)
* chore(readme): update tech stack section

* fix
2026-04-18 10:02:54 -07:00
Waleed
a01f80c6a3 feat(tables): column selection, keyboard shortcuts, drag reorder, and undo improvements (#4222)
* feat(tables): add column selection, missing keyboard shortcuts, and Sheets-aligned operations

Click column headers to select entire columns, shift-click to extend to
a column range. Delete, cut, and copy operations work on column
selections with full undo/redo support. Adds Home, End, Ctrl+Home,
Ctrl+End, PageUp, PageDown, Ctrl+Space, and all Shift variants.
Changes Ctrl+A to select all cells instead of checkbox rows. Column
header dropdown menu now opens on right-click instead of left-click.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(tables): chevron opens dropdown, drag header to reorder columns

Split column header into label area (click to select, draggable for
reorder) and chevron button (click to open dropdown menu). Remove
the grip handle — dragging the header itself now reorders columns.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(tables): full-column highlight during drag reorder

Replace the thin 2px line drop indicator with a full-column highlight
that spans the entire table height, matching Google Sheets behavior.
The insertion line is still shown at the drop edge for precision.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(tables): handle drag reorder edge cases, dim source column

Suppress drop indicator when drag would result in no position change
(dragging onto self or adjacent no-op positions). Dim the source
column body cells during drag with a background overlay. Skip the
API call when the computed order is identical to the current order.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* feat(tables): add column reorder undo/redo, body drop targets, and escape cancel

Column drag-and-drop now supports dropping anywhere in a column (not just headers),
pressing Escape to cancel a drag, and full undo/redo integration for column reordering.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(tables): merge partial updates in updateRow to prevent column data loss

When Mothership called updateRow directly (bypassing the PATCH API route),
it passed only the changed fields — which were written as the entire row,
wiping all other columns. Move the merge logic into updateRow itself so
all callers get correct partial-update semantics, and remove the now-redundant
pre-merge from both PATCH routes.

* test(tables): add updateRow partial merge tests

Covers the bug where partial updates wiped unmentioned columns — verifies
that fields not in the update payload are preserved, nulling a field works,
full-row updates are idempotent, and missing rows throw correctly.

* feat(tables): add delete-column undo/redo, rename metadata sync, and comprehensive row ID patching

- Delete column now captures column definition, cell data, order, and width for full undo/redo
- Column rename undo/redo now properly syncs columnWidths and columnOrder metadata
- patchRedoRowId/patchUndoRowId extended to handle all action types containing row IDs

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(tables): remove source column dimming during drag reorder

Only show the insertion line at the drop position, matching Google Sheets
behavior. Remove dragSourceBounds memo and isDragging prop.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(tables): preserve selection on right-click, auto-resize on double-click, fix escape during drag

- Right-clicking within an existing selection now preserves it instead of
  resetting to a single cell, so context menu operations apply to the full range
- Double-clicking a column border auto-resizes the column to fit its content
- Escape during column drag now immediately clears refs before state update,
  preventing the dragend handler from executing the reorder

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(tables): add aria-hidden value and aria-label for column header accessibility

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(tables): tighten auto-resize padding to match Google Sheets

Reduce header padding from +48px to +36px (icon + cell padding) and cell
padding from +20px to +17px (cell padding + border) for a snug fit.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(tables): clean drag ghost and clear selection on drag start

- Create a minimal custom drag image showing only the column name instead
  of the browser's default ghost that includes adjacent columns/checkboxes
- Clear any existing cell/column selection when starting a column drag to
  prevent stale highlights from persisting during reorder

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* feat(tables): add Shift+Space row selection and Ctrl+D fill down

Shift+Space now selects the entire row (all columns) instead of toggling
a checkbox, matching Google Sheets behavior. Ctrl+D copies the top cell's
value down through the selected range with full undo/redo support.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(tables): show toast on incompatible column type change

The server validates type compatibility and returns a clear error message
(e.g. "3 row(s) have incompatible values"), but the client was silently
swallowing it. Now surfaces the error via toast notification. Also moved
the undo push to onSuccess so a failed type change doesn't pollute the
undo stack.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(tables): scroll-into-view for selection focus, Home/End origin, delete-column undo timing

- Scroll-into-view now tracks selectionFocus (not just anchor), so
  Shift+Arrow extending selection off-screen properly auto-scrolls
- Shift+Home/End now uses the current focus as origin (matching
  Shift+Arrow behavior) instead of always using anchor
- Delete column undo entry is now pushed in onSuccess, preventing
  a corrupted undo stack if the server rejects the deletion
- Dialog copy updated from "cannot be undone" to "You can undo this
  action" since undo/redo is supported

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix: resolve duplicate declarations from rebase against staging

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix file upload

* fix(tables): merge column widths on delete-column undo, try/finally for auto-resize

- Delete-column undo now reads current column widths via getColumnWidths
  callback and merges the restored column's width into the full map,
  preventing other columns' widths from being wiped
- Auto-resize measurement span is now wrapped in try/finally to ensure
  DOM cleanup if an exception occurs during measurement

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix: revert accidental home.tsx change from rebase conflict resolution

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(tables): clear isColumnSelection on double-click and right-click, skip scroll for column select

- Clear isColumnSelection when double-clicking a cell to edit, preventing
  the column selection effect from fighting with the editing state
- Clear isColumnSelection when right-clicking outside the current
  selection, preventing stale column selection from re-expanding
- Skip scroll-into-view when isColumnSelection is true, preventing
  the viewport from jumping to the bottom row when clicking a column header

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(tables): remove inline font override in auto-resize, guard undefined columnOrder

- Remove `font:inherit` from measurement span inline style so Tailwind
  classes (font-medium, text-small) control font properties for accurate
  column width measurement
- Only include columnOrder in metadata update when defined, preventing
  handleColumnRename from clearing a persisted column order when
  columnOrderRef is null

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(tables): capture columnRequired in delete-column undo for full restoration

The delete-column undo action captured columnUnique but not columnRequired,
so undoing a delete on a required column would silently drop the constraint.
Now captures and restores both constraints.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(tables): restore width independently of order on delete-column undo, batch fill-down

- Column width restoration in delete-column undo no longer requires
  previousOrder to be non-null — width is restored independently
- Ctrl+D fill-down now uses batchUpdateRef (single API call) instead
  of calling mutateRef per row in a loop

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(tables): multi-column delete, select-all cell model, cut flash, chevron alignment

- Multi-select delete: detect column selection range and delete all selected
  columns sequentially with individual undo entries
- Select all (header checkbox): use cell selection model instead of checkbox
  model for consistent highlighting
- Cut flash: batch cell clears into single mutation to prevent stale data
  flashing from multiple onSettled invalidations
- Chevron alignment: adjust right padding from pr-2 to pr-2.5

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(tables): restore column width locally on delete-column undo

Add onColumnWidthsChange callback to undo hook so restored column
widths update local component state, not just server metadata.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(tables): prevent Ctrl+D bookmark dialog, batch Delete/Backspace mutations

- Move e.preventDefault() before early returns in Ctrl+D handler so
  the browser bookmark dialog is always suppressed
- Replace per-row mutateRef calls with single batchUpdateRef call in
  both Delete/Backspace handlers (checked rows and cell selection),
  consistent with cut and fill-down

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(tables): adjust column positions for multi-column delete undo

Capture original schema positions upfront and adjust each by the
count of previously-deleted columns with lower positions, so undo
restores columns at correct server-side positions in LIFO order.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(tables): only multi-delete when clicked column is within selection

Check that the right-clicked column is within the selected column
range before using multi-column delete. If the click is outside the
selection, delete only the clicked column.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(tables): prevent duplicate undo entry on column drag-drop

Clear dragColumnNameRef immediately in handleColumnDragEnd so the
second invocation (from dragend after drop already fired) is a no-op.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(tables): clean up width on delete-column redo, suppress click during drag

- Redo path for delete-column now removes the column's width from
  metadata and local state, preventing stale width entries
- Add didDragRef to ColumnHeaderMenu to suppress the click event
  that fires after a drag operation, preventing selection flash

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(tables): remove unstable mutation object from useCallback deps

deleteTableMutation is not referentially stable — only .mutateAsync()
is. Including the mutation object causes unnecessary callback recreation
on every mutation state change.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(tables): fix auto-resize header padding, deduplicate rename metadata logic

Increase header text measurement padding from 36px to 57px to account
for the chevron dropdown button (pl-0.5 + 9px icon + pr-2.5) that
always occupies layout space. Prevents header text truncation on
auto-resize.

Deduplicate column rename metadata logic by having columnRename.onSave
call handleColumnRename instead of reimplementing the same width/order
transfer and metadata persist.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(tables): log error on cell data restoration failure during undo

Add onError handler to the batchUpdateRowsMutation inside
delete-column undo so failures are logged instead of silently
swallowed. The column schema restores first, and the cell data
restoration is a separate async call that the outer try/catch
cannot intercept.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(tables): address audit findings across table, undo hook, and store

- Add missing bounds check in handleCopy (c >= cols.length) matching
  handleCut for defensive consistency
- Clear lastCheckboxRowRef in Ctrl+Space and Shift+Space to prevent
  stale shift-click checkbox range after keyboard selection
- Fix stale snapshot race in patchRedoRowId/patchUndoRowId by reading
  state inside the set() updater instead of via get() outside it
- Add metadata cleanup to create-column undo so column width is removed
  from both local state and server, symmetric with delete-column redo
- Remove stale width key from columnWidths on column delete instead of
  persisting orphaned entries
- Normalize undefined vs null in handleInlineSave change detection to
  prevent unnecessary mutations when oldValue is undefined
- Use ghost.parentNode?.removeChild instead of document.body.removeChild
  in drag ghost cleanup to prevent throw on component unmount

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(tables): reset didDragRef in handleDragEnd to prevent stale flag

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-04-18 01:30:32 -07:00
1921 changed files with 233057 additions and 43840 deletions

View File

@@ -1,7 +1,10 @@
# Global Standards
## Logging
Import `createLogger` from `sim/logger`. Use `logger.info`, `logger.warn`, `logger.error` instead of `console.log`.
Import `createLogger` from `@sim/logger`. Use `logger.info`, `logger.warn`, `logger.error` instead of `console.log`. Inside API routes wrapped with `withRouteHandler`, loggers automatically include the request ID.
## API Route Handlers
All API route handlers must be wrapped with `withRouteHandler` from `@/lib/core/utils/with-route-handler`. Never export a bare `async function GET/POST/...` — always use `export const METHOD = withRouteHandler(...)`.
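The wrapping pattern can be sketched with a hypothetical stub (the real `withRouteHandler` in `@/lib/core/utils/with-route-handler` also wires up request-ID logging, as noted above; this stub only illustrates the export shape the rule requires):

```typescript
// Hypothetical stub for illustration — not the real withRouteHandler.
type RouteHandler = (req: Request) => Promise<Response>

function withRouteHandler(handler: RouteHandler): RouteHandler {
  return async (req) => {
    try {
      return await handler(req)
    } catch {
      // Centralized error handling lives in the wrapper, not each route.
      return new Response('Internal Server Error', { status: 500 })
    }
  }
}

// ✓ Good — wrapped export, never a bare `export async function GET`
export const GET = withRouteHandler(async () => Response.json({ ok: true }))
```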
## Comments
Use TSDoc for documentation. No `====` separators. No non-TSDoc comments.
@@ -10,7 +13,7 @@ Use TSDoc for documentation. No `====` separators. No non-TSDoc comments.
Never update global styles. Keep all styling local to components.
## ID Generation
Never use `crypto.randomUUID()`, `nanoid`, or the `uuid` package directly. Use the utilities from `@/lib/core/utils/uuid`:
Never use `crypto.randomUUID()`, `nanoid`, or the `uuid` package directly. Use the utilities from `@sim/utils/id`:
- `generateId()` — UUID v4, use by default
- `generateShortId(size?)` — short URL-safe ID (default 21 chars), for compact identifiers
@@ -24,14 +27,14 @@ import { v4 as uuidv4 } from 'uuid'
const id = crypto.randomUUID()
// ✓ Good
import { generateId, generateShortId } from '@/lib/core/utils/uuid'
import { generateId, generateShortId } from '@sim/utils/id'
const uuid = generateId()
const shortId = generateShortId()
const tiny = generateShortId(8)
```
## Common Utilities
Use shared helpers from `@/lib/core/utils/helpers` instead of writing inline implementations:
Use shared helpers from `@sim/utils` instead of writing inline implementations:
- `sleep(ms)` — async delay. Never write `new Promise(resolve => setTimeout(resolve, ms))`
- `toError(value)` — normalize unknown caught values to `Error`. Never write `e instanceof Error ? e : new Error(String(e))`
@@ -44,7 +47,8 @@ const msg = error instanceof Error ? error.message : String(error)
const err = error instanceof Error ? error : new Error(String(error))
// ✓ Good
import { sleep, toError } from '@/lib/core/utils/helpers'
import { sleep } from '@sim/utils/helpers'
import { toError } from '@sim/utils/errors'
await sleep(1000)
const msg = toError(error).message
const err = toError(error)

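As a rough sketch of the documented behavior (hypothetical one-liners, not the actual `@sim/utils` source), the two helpers amount to:

```typescript
// Hypothetical implementations matching the documented contracts —
// the real helpers live in @sim/utils/helpers and @sim/utils/errors.
const sleep = (ms: number): Promise<void> =>
  new Promise((resolve) => setTimeout(resolve, ms))

const toError = (value: unknown): Error =>
  value instanceof Error ? value : new Error(String(value))
```

Centralizing them keeps the inline `instanceof Error` ternary and ad-hoc `setTimeout` promises out of application code.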
View File

@@ -13,8 +13,12 @@ Use Vitest. Test files: `feature.ts` → `feature.test.ts`
These modules are mocked globally — do NOT re-mock them in test files unless you need to override behavior:
- `@sim/db` → `databaseMock`
- `@sim/db/schema` → `schemaMock`
- `drizzle-orm` → `drizzleOrmMock`
- `@sim/logger` → `loggerMock`
- `@/lib/auth` → `authMock`
- `@/lib/auth/hybrid` → `hybridAuthMock` (with default session-delegating behavior)
- `@/lib/core/utils/request` → `requestUtilsMock`
- `@/stores/console/store`, `@/stores/terminal`, `@/stores/execution/store`
- `@/blocks/registry`
- `@trigger.dev/sdk`
@@ -102,10 +106,6 @@ vi.mock('@/lib/workspaces/utils', () => ({
}))
```
### NEVER use `mockAuth()`, `mockConsoleLogger()`, or `setupCommonApiMocks()` from `@sim/testing`
These helpers internally use `vi.doMock()` which is slow. Use direct `vi.hoisted()` + `vi.mock()` instead.
### Mock heavy transitive dependencies
If a module under test imports `@/blocks` (200+ files), `@/tools/registry`, or other heavy modules, mock them:
@@ -135,83 +135,129 @@ await new Promise(r => setTimeout(r, 1))
vi.useFakeTimers()
```
## Mock Pattern Reference
## Centralized Mocks (prefer over local declarations)
`@sim/testing` exports ready-to-use mock modules for common dependencies. Import and pass directly to `vi.mock()` — no `vi.hoisted()` boilerplate needed. Each paired `*MockFns` object exposes the underlying `vi.fn()`s for per-test overrides.
| Module mocked | Import | Factory form |
|---|---|---|
| `@/app/api/auth/oauth/utils` | `authOAuthUtilsMock`, `authOAuthUtilsMockFns` | `vi.mock('@/app/api/auth/oauth/utils', () => authOAuthUtilsMock)` |
| `@/app/api/knowledge/utils` | `knowledgeApiUtilsMock`, `knowledgeApiUtilsMockFns` | `vi.mock('@/app/api/knowledge/utils', () => knowledgeApiUtilsMock)` |
| `@/app/api/workflows/utils` | `workflowsApiUtilsMock`, `workflowsApiUtilsMockFns` | `vi.mock('@/app/api/workflows/utils', () => workflowsApiUtilsMock)` |
| `@sim/audit` | `auditMock`, `auditMockFns` | `vi.mock('@sim/audit', () => auditMock)` |
| `@/lib/auth` | `authMock`, `authMockFns` | `vi.mock('@/lib/auth', () => authMock)` |
| `@/lib/auth/hybrid` | `hybridAuthMock`, `hybridAuthMockFns` | `vi.mock('@/lib/auth/hybrid', () => hybridAuthMock)` |
| `@/lib/copilot/request/http` | `copilotHttpMock`, `copilotHttpMockFns` | `vi.mock('@/lib/copilot/request/http', () => copilotHttpMock)` |
| `@/lib/core/config/env` | `envMock`, `createEnvMock(overrides)` | `vi.mock('@/lib/core/config/env', () => envMock)` |
| `@/lib/core/config/feature-flags` | `featureFlagsMock` | `vi.mock('@/lib/core/config/feature-flags', () => featureFlagsMock)` |
| `@/lib/core/config/redis` | `redisConfigMock`, `redisConfigMockFns` | `vi.mock('@/lib/core/config/redis', () => redisConfigMock)` |
| `@/lib/core/security/encryption` | `encryptionMock`, `encryptionMockFns` | `vi.mock('@/lib/core/security/encryption', () => encryptionMock)` |
| `@/lib/core/security/input-validation.server` | `inputValidationMock`, `inputValidationMockFns` | `vi.mock('@/lib/core/security/input-validation.server', () => inputValidationMock)` |
| `@/lib/core/utils/request` | `requestUtilsMock`, `requestUtilsMockFns` | `vi.mock('@/lib/core/utils/request', () => requestUtilsMock)` |
| `@/lib/core/utils/urls` | `urlsMock`, `urlsMockFns` | `vi.mock('@/lib/core/utils/urls', () => urlsMock)` |
| `@/lib/execution/preprocessing` | `executionPreprocessingMock`, `executionPreprocessingMockFns` | `vi.mock('@/lib/execution/preprocessing', () => executionPreprocessingMock)` |
| `@/lib/logs/execution/logging-session` | `loggingSessionMock`, `loggingSessionMockFns`, `LoggingSessionMock` | `vi.mock('@/lib/logs/execution/logging-session', () => loggingSessionMock)` |
| `@/lib/workflows/orchestration` | `workflowsOrchestrationMock`, `workflowsOrchestrationMockFns` | `vi.mock('@/lib/workflows/orchestration', () => workflowsOrchestrationMock)` |
| `@/lib/workflows/persistence/utils` | `workflowsPersistenceUtilsMock`, `workflowsPersistenceUtilsMockFns` | `vi.mock('@/lib/workflows/persistence/utils', () => workflowsPersistenceUtilsMock)` |
| `@/lib/workflows/utils` | `workflowsUtilsMock`, `workflowsUtilsMockFns` | `vi.mock('@/lib/workflows/utils', () => workflowsUtilsMock)` |
| `@/lib/workspaces/permissions/utils` | `permissionsMock`, `permissionsMockFns` | `vi.mock('@/lib/workspaces/permissions/utils', () => permissionsMock)` |
| `@sim/db/schema` | `schemaMock` | `vi.mock('@sim/db/schema', () => schemaMock)` |
### Auth mocking (API routes)
```typescript
const { mockGetSession } = vi.hoisted(() => ({
mockGetSession: vi.fn(),
}))
import { authMock, authMockFns } from '@sim/testing'
import { beforeEach, describe, expect, it, vi } from 'vitest'
vi.mock('@/lib/auth', () => ({
auth: { api: { getSession: vi.fn() } },
getSession: mockGetSession,
}))
vi.mock('@/lib/auth', () => authMock)
// In tests:
mockGetSession.mockResolvedValue({ user: { id: 'user-1', email: 'test@example.com' } })
mockGetSession.mockResolvedValue(null) // unauthenticated
import { GET } from '@/app/api/my-route/route'
beforeEach(() => {
vi.clearAllMocks()
authMockFns.mockGetSession.mockResolvedValue({ user: { id: 'user-1' } })
})
```
Only define a local `vi.mock('@/lib/auth', ...)` if the module under test consumes exports outside the centralized shape (e.g., `auth.api.verifyOneTimeToken`, `auth.api.resetPassword`).
### Hybrid auth mocking
```typescript
const { mockCheckSessionOrInternalAuth } = vi.hoisted(() => ({
mockCheckSessionOrInternalAuth: vi.fn(),
}))
import { hybridAuthMock, hybridAuthMockFns } from '@sim/testing'
vi.mock('@/lib/auth/hybrid', () => ({
checkSessionOrInternalAuth: mockCheckSessionOrInternalAuth,
}))
vi.mock('@/lib/auth/hybrid', () => hybridAuthMock)
// In tests:
mockCheckSessionOrInternalAuth.mockResolvedValue({
hybridAuthMockFns.mockCheckSessionOrInternalAuth.mockResolvedValue({
success: true, userId: 'user-1', authType: 'session',
})
```
### Database chain mocking
```typescript
const { mockSelect, mockFrom, mockWhere } = vi.hoisted(() => ({
mockSelect: vi.fn(),
mockFrom: vi.fn(),
mockWhere: vi.fn(),
}))
Use the centralized `dbChainMock` + `dbChainMockFns` helpers — no `vi.hoisted()` or chain-wiring boilerplate needed.
vi.mock('@sim/db', () => ({
db: { select: mockSelect },
}))
```typescript
import { dbChainMock, dbChainMockFns, resetDbChainMock } from '@sim/testing'
vi.mock('@sim/db', () => dbChainMock)
// Spread for custom exports: vi.mock('@sim/db', () => ({ ...dbChainMock, myTable: {...} }))
beforeEach(() => {
mockSelect.mockReturnValue({ from: mockFrom })
mockFrom.mockReturnValue({ where: mockWhere })
mockWhere.mockResolvedValue([{ id: '1', name: 'test' }])
vi.clearAllMocks()
resetDbChainMock() // only needed if tests use permanent (non-`Once`) overrides
})
it('reads a row', async () => {
dbChainMockFns.limit.mockResolvedValueOnce([{ id: '1', name: 'test' }])
// exercise code that hits db.select().from().where().limit()
expect(dbChainMockFns.where).toHaveBeenCalled()
})
```
**Default chains supported:**
- `select()/selectDistinct()/selectDistinctOn() → from() → where()/innerJoin()/leftJoin() → where() → limit()/orderBy()/returning()/groupBy()/for()`
- `insert() → values() → returning()/onConflictDoUpdate()/onConflictDoNothing()`
- `update() → set() → where() → limit()/orderBy()/returning()/for()`
- `delete() → where() → limit()/orderBy()/returning()/for()`
- `db.execute()` resolves `[]`
- `db.transaction(cb)` calls cb with `dbChainMock.db`
`.for('update')` (Postgres row-level locking) is supported on `where`
builders. It returns a thenable with `.limit` / `.orderBy` / `.returning` /
`.groupBy` attached, so both `await .where().for('update')` (terminal) and
`await .where().for('update').limit(1)` (chained) work. Override the terminal
result with `dbChainMockFns.for.mockResolvedValueOnce([...])`; for the chained
form, mock the downstream terminal (e.g. `dbChainMockFns.limit.mockResolvedValueOnce([...])`).
All terminals default to `Promise.resolve([])`. Override per-test with `dbChainMockFns.<terminal>.mockResolvedValueOnce(...)`.
Use `resetDbChainMock()` in `beforeEach` only when tests replace wiring with `.mockReturnValue` / `.mockResolvedValue` (permanent). Tests using only `...Once` variants don't need it.
## @sim/testing Package
Always prefer over local test data.
| Category | Utilities |
|----------|-----------|
| **Mocks** | `loggerMock`, `databaseMock`, `drizzleOrmMock`, `setupGlobalFetchMock()` |
| **Module mocks** | See "Centralized Mocks" table above |
| **Logger helpers** | `loggerMock`, `createMockLogger()`, `getLoggerCalls()`, `clearLoggerMocks()` |
| **Database helpers** | `databaseMock`, `drizzleOrmMock`, `createMockDb()`, `createMockSql()`, `createMockSqlOperators()` |
| **Fetch helpers** | `setupGlobalFetchMock()`, `createMockFetch()`, `createMockResponse()`, `mockFetchError()` |
| **Factories** | `createSession()`, `createWorkflowRecord()`, `createBlock()`, `createExecutionContext()` |
| **Builders** | `WorkflowBuilder`, `ExecutionContextBuilder` |
| **Assertions** | `expectWorkflowAccessGranted()`, `expectBlockExecuted()` |
| **Requests** | `createMockRequest()`, `createEnvMock()` |
| **Requests** | `createMockRequest()`, `createMockFormDataRequest()` |
## Rules Summary
1. `@vitest-environment node` unless DOM is required
2. `vi.hoisted()` + `vi.mock()` + static imports — never `vi.resetModules()` + `vi.doMock()` + dynamic imports
3. `vi.mock()` calls before importing mocked modules
4. `@sim/testing` utilities over local mocks
2. Prefer centralized mocks from `@sim/testing` (see table above) over local `vi.hoisted()` + `vi.mock()` boilerplate
3. `vi.hoisted()` + `vi.mock()` + static imports — never `vi.resetModules()` + `vi.doMock()` + dynamic imports
4. `vi.mock()` calls before importing mocked modules
5. `beforeEach(() => vi.clearAllMocks())` to reset state — no redundant `afterEach`
6. No `vi.importActual()` — mock everything explicitly
7. No `mockAuth()`, `mockConsoleLogger()`, `setupCommonApiMocks()` — use direct mocks
8. Mock heavy deps (`@/blocks`, `@/tools/registry`, `@/triggers`) in tests that don't need them
9. Use absolute imports in test files
10. Avoid real timers — use 1ms delays or `vi.useFakeTimers()`
7. Mock heavy deps (`@/blocks`, `@/tools/registry`, `@/triggers`) in tests that don't need them
8. Use absolute imports in test files
9. Avoid real timers — use 1ms delays or `vi.useFakeTimers()`

View File

@@ -17,7 +17,7 @@ Use TSDoc for documentation. No `====` separators. No non-TSDoc comments.
Never update global styles. Keep all styling local to components.
## ID Generation
Never use `crypto.randomUUID()`, `nanoid`, or the `uuid` package directly. Use the utilities from `@/lib/core/utils/uuid`:
Never use `crypto.randomUUID()`, `nanoid`, or the `uuid` package directly. Use the utilities from `@sim/utils/id`:
- `generateId()` — UUID v4, use by default
- `generateShortId(size?)` — short URL-safe ID (default 21 chars), for compact identifiers
@@ -31,14 +31,14 @@ import { v4 as uuidv4 } from 'uuid'
const id = crypto.randomUUID()
// ✓ Good
import { generateId, generateShortId } from '@/lib/core/utils/uuid'
import { generateId, generateShortId } from '@sim/utils/id'
const uuid = generateId()
const shortId = generateShortId()
const tiny = generateShortId(8)
```
## Common Utilities
Use shared helpers from `@/lib/core/utils/helpers` instead of writing inline implementations:
Use shared helpers from `@sim/utils` instead of writing inline implementations:
- `sleep(ms)` — async delay. Never write `new Promise(resolve => setTimeout(resolve, ms))`
- `toError(value)` — normalize unknown caught values to `Error`. Never write `e instanceof Error ? e : new Error(String(e))`
@@ -51,7 +51,8 @@ const msg = error instanceof Error ? error.message : String(error)
const err = error instanceof Error ? error : new Error(String(error))
// ✓ Good
import { sleep, toError } from '@/lib/core/utils/helpers'
import { sleep } from '@sim/utils/helpers'
import { toError } from '@sim/utils/errors'
await sleep(1000)
const msg = toError(error).message
const err = toError(error)

View File

@@ -3,6 +3,7 @@ description: Testing patterns with Vitest and @sim/testing
globs: ["apps/sim/**/*.test.ts", "apps/sim/**/*.test.tsx"]
---
# Testing Patterns
Use Vitest. Test files: `feature.ts` → `feature.test.ts`
@@ -12,8 +13,12 @@ Use Vitest. Test files: `feature.ts` → `feature.test.ts`
These modules are mocked globally — do NOT re-mock them in test files unless you need to override behavior:
- `@sim/db` → `databaseMock`
- `@sim/db/schema` → `schemaMock`
- `drizzle-orm` → `drizzleOrmMock`
- `@sim/logger` → `loggerMock`
- `@/lib/auth` → `authMock`
- `@/lib/auth/hybrid` → `hybridAuthMock` (with default session-delegating behavior)
- `@/lib/core/utils/request` → `requestUtilsMock`
- `@/stores/console/store`, `@/stores/terminal`, `@/stores/execution/store`
- `@/blocks/registry`
- `@trigger.dev/sdk`
@@ -101,10 +106,6 @@ vi.mock('@/lib/workspaces/utils', () => ({
}))
```
### NEVER use `mockAuth()`, `mockConsoleLogger()`, or `setupCommonApiMocks()` from `@sim/testing`
These helpers internally use `vi.doMock()` which is slow. Use direct `vi.hoisted()` + `vi.mock()` instead.
### Mock heavy transitive dependencies
If a module under test imports `@/blocks` (200+ files), `@/tools/registry`, or other heavy modules, mock them:
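The diff elides the example at this point; a minimal sketch of such a stub follows. The exported names (`getBlock`, `registry`, `tools`) are assumptions for illustration — match them to whatever the module under test actually consumes:

```typescript
// Hypothetical sketch: stub heavy registries so importing the module under
// test does not pull in 200+ block files. Export names are assumptions.
import { vi } from 'vitest'

vi.mock('@/blocks', () => ({
  getBlock: vi.fn(() => undefined),
  registry: {},
}))

vi.mock('@/tools/registry', () => ({
  tools: {},
}))
```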
@@ -134,38 +135,61 @@ await new Promise(r => setTimeout(r, 1))
vi.useFakeTimers()
```
## Mock Pattern Reference
## Centralized Mocks (prefer over local declarations)
`@sim/testing` exports ready-to-use mock modules for common dependencies. Import and pass directly to `vi.mock()` — no `vi.hoisted()` boilerplate needed. Each paired `*MockFns` object exposes the underlying `vi.fn()`s for per-test overrides.
| Module mocked | Import | Factory form |
|---|---|---|
| `@/app/api/auth/oauth/utils` | `authOAuthUtilsMock`, `authOAuthUtilsMockFns` | `vi.mock('@/app/api/auth/oauth/utils', () => authOAuthUtilsMock)` |
| `@/app/api/knowledge/utils` | `knowledgeApiUtilsMock`, `knowledgeApiUtilsMockFns` | `vi.mock('@/app/api/knowledge/utils', () => knowledgeApiUtilsMock)` |
| `@/app/api/workflows/utils` | `workflowsApiUtilsMock`, `workflowsApiUtilsMockFns` | `vi.mock('@/app/api/workflows/utils', () => workflowsApiUtilsMock)` |
| `@sim/audit` | `auditMock`, `auditMockFns` | `vi.mock('@sim/audit', () => auditMock)` |
| `@/lib/auth` | `authMock`, `authMockFns` | `vi.mock('@/lib/auth', () => authMock)` |
| `@/lib/auth/hybrid` | `hybridAuthMock`, `hybridAuthMockFns` | `vi.mock('@/lib/auth/hybrid', () => hybridAuthMock)` |
| `@/lib/copilot/request/http` | `copilotHttpMock`, `copilotHttpMockFns` | `vi.mock('@/lib/copilot/request/http', () => copilotHttpMock)` |
| `@/lib/core/config/env` | `envMock`, `createEnvMock(overrides)` | `vi.mock('@/lib/core/config/env', () => envMock)` |
| `@/lib/core/config/feature-flags` | `featureFlagsMock` | `vi.mock('@/lib/core/config/feature-flags', () => featureFlagsMock)` |
| `@/lib/core/config/redis` | `redisConfigMock`, `redisConfigMockFns` | `vi.mock('@/lib/core/config/redis', () => redisConfigMock)` |
| `@/lib/core/security/encryption` | `encryptionMock`, `encryptionMockFns` | `vi.mock('@/lib/core/security/encryption', () => encryptionMock)` |
| `@/lib/core/security/input-validation.server` | `inputValidationMock`, `inputValidationMockFns` | `vi.mock('@/lib/core/security/input-validation.server', () => inputValidationMock)` |
| `@/lib/core/utils/request` | `requestUtilsMock`, `requestUtilsMockFns` | `vi.mock('@/lib/core/utils/request', () => requestUtilsMock)` |
| `@/lib/core/utils/urls` | `urlsMock`, `urlsMockFns` | `vi.mock('@/lib/core/utils/urls', () => urlsMock)` |
| `@/lib/execution/preprocessing` | `executionPreprocessingMock`, `executionPreprocessingMockFns` | `vi.mock('@/lib/execution/preprocessing', () => executionPreprocessingMock)` |
| `@/lib/logs/execution/logging-session` | `loggingSessionMock`, `loggingSessionMockFns`, `LoggingSessionMock` | `vi.mock('@/lib/logs/execution/logging-session', () => loggingSessionMock)` |
| `@/lib/workflows/orchestration` | `workflowsOrchestrationMock`, `workflowsOrchestrationMockFns` | `vi.mock('@/lib/workflows/orchestration', () => workflowsOrchestrationMock)` |
| `@/lib/workflows/persistence/utils` | `workflowsPersistenceUtilsMock`, `workflowsPersistenceUtilsMockFns` | `vi.mock('@/lib/workflows/persistence/utils', () => workflowsPersistenceUtilsMock)` |
| `@/lib/workflows/utils` | `workflowsUtilsMock`, `workflowsUtilsMockFns` | `vi.mock('@/lib/workflows/utils', () => workflowsUtilsMock)` |
| `@/lib/workspaces/permissions/utils` | `permissionsMock`, `permissionsMockFns` | `vi.mock('@/lib/workspaces/permissions/utils', () => permissionsMock)` |
| `@sim/db/schema` | `schemaMock` | `vi.mock('@sim/db/schema', () => schemaMock)` |

### Auth mocking (API routes)
```typescript
const { mockGetSession } = vi.hoisted(() => ({
mockGetSession: vi.fn(),
}))
import { authMock, authMockFns } from '@sim/testing'
import { beforeEach, describe, expect, it, vi } from 'vitest'
vi.mock('@/lib/auth', () => ({
auth: { api: { getSession: vi.fn() } },
getSession: mockGetSession,
}))
vi.mock('@/lib/auth', () => authMock)
// In tests:
mockGetSession.mockResolvedValue({ user: { id: 'user-1', email: 'test@example.com' } })
mockGetSession.mockResolvedValue(null) // unauthenticated
import { GET } from '@/app/api/my-route/route'
beforeEach(() => {
vi.clearAllMocks()
authMockFns.mockGetSession.mockResolvedValue({ user: { id: 'user-1' } })
})
```
Only define a local `vi.mock('@/lib/auth', ...)` if the module under test consumes exports outside the centralized shape (e.g., `auth.api.verifyOneTimeToken`, `auth.api.resetPassword`).
### Hybrid auth mocking
```typescript
const { mockCheckSessionOrInternalAuth } = vi.hoisted(() => ({
mockCheckSessionOrInternalAuth: vi.fn(),
}))
import { hybridAuthMock, hybridAuthMockFns } from '@sim/testing'
vi.mock('@/lib/auth/hybrid', () => ({
checkSessionOrInternalAuth: mockCheckSessionOrInternalAuth,
}))
vi.mock('@/lib/auth/hybrid', () => hybridAuthMock)
// In tests:
mockCheckSessionOrInternalAuth.mockResolvedValue({
hybridAuthMockFns.mockCheckSessionOrInternalAuth.mockResolvedValue({
success: true, userId: 'user-1', authType: 'session',
})
```
@@ -196,21 +220,23 @@ Always prefer over local test data.
| Category | Utilities |
|----------|-----------|
| **Mocks** | `loggerMock`, `databaseMock`, `drizzleOrmMock`, `setupGlobalFetchMock()` |
| **Module mocks** | See "Centralized Mocks" table above |
| **Logger helpers** | `loggerMock`, `createMockLogger()`, `getLoggerCalls()`, `clearLoggerMocks()` |
| **Database helpers** | `databaseMock`, `drizzleOrmMock`, `createMockDb()`, `createMockSql()`, `createMockSqlOperators()` |
| **Fetch helpers** | `setupGlobalFetchMock()`, `createMockFetch()`, `createMockResponse()`, `mockFetchError()` |
| **Factories** | `createSession()`, `createWorkflowRecord()`, `createBlock()`, `createExecutionContext()` |
| **Builders** | `WorkflowBuilder`, `ExecutionContextBuilder` |
| **Assertions** | `expectWorkflowAccessGranted()`, `expectBlockExecuted()` |
| **Requests** | `createMockRequest()`, `createEnvMock()` |
| **Requests** | `createMockRequest()`, `createMockFormDataRequest()` |
## Rules Summary
1. `@vitest-environment node` unless DOM is required
2. `vi.hoisted()` + `vi.mock()` + static imports — never `vi.resetModules()` + `vi.doMock()` + dynamic imports
3. `vi.mock()` calls before importing mocked modules
4. `@sim/testing` utilities over local mocks
2. Prefer centralized mocks from `@sim/testing` (see table above) over local `vi.hoisted()` + `vi.mock()` boilerplate
3. `vi.hoisted()` + `vi.mock()` + static imports — never `vi.resetModules()` + `vi.doMock()` + dynamic imports
4. `vi.mock()` calls before importing mocked modules
5. `beforeEach(() => vi.clearAllMocks())` to reset state — no redundant `afterEach`
6. No `vi.importActual()` — mock everything explicitly
7. No `mockAuth()`, `mockConsoleLogger()`, `setupCommonApiMocks()` — use direct mocks
8. Mock heavy deps (`@/blocks`, `@/tools/registry`, `@/triggers`) in tests that don't need them
9. Use absolute imports in test files
10. Avoid real timers — use 1ms delays or `vi.useFakeTimers()`
7. Mock heavy deps (`@/blocks`, `@/tools/registry`, `@/triggers`) in tests that don't need them
8. Use absolute imports in test files
9. Avoid real timers — use 1ms delays or `vi.useFakeTimers()`

View File

@@ -1,4 +1,4 @@
FROM oven/bun:1.3.11-alpine
FROM oven/bun:1.3.13-alpine
# Install necessary packages for development
RUN apk add --no-cache \

View File

@@ -71,7 +71,7 @@ fi
# Set up environment variables if .env doesn't exist for the sim app
if [ ! -f "apps/sim/.env" ]; then
echo "📄 Creating .env file from template..."
echo "📄 Creating apps/sim/.env from template..."
if [ -f "apps/sim/.env.example" ]; then
cp apps/sim/.env.example apps/sim/.env
else
@@ -79,6 +79,18 @@ if [ ! -f "apps/sim/.env" ]; then
fi
fi
# Set up env for the realtime server (must match the shared values in apps/sim/.env)
if [ ! -f "apps/realtime/.env" ] && [ -f "apps/realtime/.env.example" ]; then
echo "📄 Creating apps/realtime/.env from template..."
cp apps/realtime/.env.example apps/realtime/.env
fi
# Set up packages/db/.env for drizzle-kit and migration scripts
if [ ! -f "packages/db/.env" ] && [ -f "packages/db/.env.example" ]; then
echo "📄 Creating packages/db/.env from template..."
cp packages/db/.env.example packages/db/.env
fi
# Generate schema and run database migrations
echo "🗃️ Running database schema generation and migrations..."
echo "Generating schema..."

View File

@@ -2,8 +2,15 @@
Thank you for your interest in contributing to Sim! Our goal is to provide developers with a powerful, user-friendly platform for building, testing, and optimizing agentic workflows. We welcome contributions in all forms—from bug fixes and design improvements to brand-new features.
> **Project Overview:**
> Sim is a monorepo using Turborepo, containing the main application (`apps/sim/`), documentation (`apps/docs/`), and shared packages (`packages/`). The main application is built with Next.js (app router), ReactFlow, Zustand, Shadcn, and Tailwind CSS. Please ensure your contributions follow our best practices for clarity, maintainability, and consistency.
> **Project Overview:**
> Sim is a Turborepo monorepo with two deployable apps and a set of shared packages:
>
> - `apps/sim/` — the main Next.js application (App Router, ReactFlow, Zustand, Shadcn, Tailwind CSS).
> - `apps/realtime/` — a small Bun + Socket.IO server that powers the collaborative canvas. Shares DB and Better Auth secrets with `apps/sim` via `@sim/*` packages.
> - `apps/docs/` — Fumadocs-based documentation site.
> - `packages/` — shared workspace packages (`@sim/db`, `@sim/auth`, `@sim/audit`, `@sim/workflow-types`, `@sim/workflow-persistence`, `@sim/workflow-authz`, `@sim/realtime-protocol`, `@sim/security`, `@sim/logger`, `@sim/utils`, `@sim/testing`, `@sim/tsconfig`).
>
> Strict one-way dependency flow: `apps/* → packages/*`. Packages never import from apps. Please ensure your contributions follow this and our best practices for clarity, maintainability, and consistency.
---
@@ -24,14 +31,17 @@ Thank you for your interest in contributing to Sim! Our goal is to provide devel
We strive to keep our workflow as simple as possible. To contribute:
1. **Fork the Repository**
1. **Fork the Repository**
Click the **Fork** button on GitHub to create your own copy of the project.
2. **Clone Your Fork**
```bash
git clone https://github.com/<your-username>/sim.git
cd sim
```
3. **Create a Feature Branch**
3. **Create a Feature Branch**
Create a new branch with a descriptive name:
```bash
@@ -40,21 +50,23 @@ We strive to keep our workflow as simple as possible. To contribute:
Use a clear naming convention to indicate the type of work (e.g., `feat/`, `fix/`, `docs/`).
4. **Make Your Changes**
4. **Make Your Changes**
Ensure your changes are small, focused, and adhere to our coding guidelines.
5. **Commit Your Changes**
5. **Commit Your Changes**
Write clear, descriptive commit messages that follow the [Conventional Commits](https://www.conventionalcommits.org/en/v1.0.0/#specification) specification. This allows us to maintain a coherent project history and generate changelogs automatically. For example:
- `feat(api): add new endpoint for user authentication`
- `fix(ui): resolve button alignment issue`
- `docs: update contribution guidelines`
6. **Push Your Branch**
```bash
git push origin feat/your-feature-name
```
7. **Create a Pull Request**
7. **Create a Pull Request**
Open a pull request against the `staging` branch on GitHub. Please provide a clear description of the changes and reference any relevant issues (e.g., `fixes #123`).
---
@@ -65,7 +77,7 @@ If you discover a bug or have a feature request, please open an issue in our Git
- Provide a clear, descriptive title.
- Include as many details as possible (steps to reproduce, screenshots, etc.).
- **Tag Your Issue Appropriately:**
- **Tag Your Issue Appropriately:**
Use the following labels to help us categorize your issue:
- **active:** Actively working on it right now.
- **bug:** Something isn't working.
@@ -82,12 +94,11 @@ If you discover a bug or have a feature request, please open an issue in our Git
Before creating a pull request:
- **Ensure Your Branch Is Up-to-Date:**
- **Ensure Your Branch Is Up-to-Date:**
Rebase your branch onto the latest `staging` branch to prevent merge conflicts.
- **Follow the Guidelines:**
- **Follow the Guidelines:**
Make sure your changes are well-tested, follow our coding standards, and include relevant documentation if necessary.
- **Reference Issues:**
- **Reference Issues:**
If your PR addresses an existing issue, include `refs #<issue-number>` or `fixes #<issue-number>` in your PR description.
Our maintainers will review your pull request and provide feedback. We aim to make the review process as smooth and timely as possible.
@@ -166,27 +177,27 @@ To use local models with Sim:
1. Install Ollama and pull models:
```bash
# Install Ollama (if not already installed)
curl -fsSL https://ollama.ai/install.sh | sh
```bash
# Install Ollama (if not already installed)
curl -fsSL https://ollama.ai/install.sh | sh
# Pull a model (e.g., gemma3:4b)
ollama pull gemma3:4b
```
# Pull a model (e.g., gemma3:4b)
ollama pull gemma3:4b
```
2. Start Sim with local model support:
```bash
# With NVIDIA GPU support
docker compose --profile local-gpu -f docker-compose.ollama.yml up -d
```bash
# With NVIDIA GPU support
docker compose --profile local-gpu -f docker-compose.ollama.yml up -d
# Without GPU (CPU only)
docker compose --profile local-cpu -f docker-compose.ollama.yml up -d
# Without GPU (CPU only)
docker compose --profile local-cpu -f docker-compose.ollama.yml up -d
# If hosting on a server, update the environment variables in the docker-compose.prod.yml file
# to include the server's public IP, then start again (e.g., set OLLAMA_URL to http://1.1.1.1:11434)
docker compose -f docker-compose.prod.yml up -d
```
# If hosting on a server, update the environment variables in the docker-compose.prod.yml file
# to include the server's public IP, then start again (e.g., set OLLAMA_URL to http://1.1.1.1:11434)
docker compose -f docker-compose.prod.yml up -d
```
### Option 3: Using VS Code / Cursor Dev Containers
@@ -201,61 +212,104 @@ Dev Containers provide a consistent and easy-to-use development environment:
2. **Setup Steps:**
- Clone the repository:
```bash
git clone https://github.com/<your-username>/sim.git
cd sim
```
- Open the project in VS Code/Cursor
- When prompted, click "Reopen in Container" (or press F1 and select "Remote-Containers: Reopen in Container")
- Wait for the container to build and initialize
- Open the project in VS Code/Cursor.
- When prompted, click "Reopen in Container" (or press F1 and select "Remote-Containers: Reopen in Container").
- Wait for the container to build and initialize.
3. **Start Developing:**
- Run `bun run dev:full` in the terminal or use the `sim-start` alias
- This starts both the main application and the realtime socket server
- All dependencies and configurations are automatically set up
- Your changes will be automatically hot-reloaded
- Run `bun run dev:full` in the terminal or use the `sim-start` alias.
- This starts both the main application and the realtime socket server.
- All dependencies and configurations are automatically set up.
- Your changes will be automatically hot-reloaded.
4. **GitHub Codespaces:**
- This setup also works with GitHub Codespaces if you prefer development in the browser
- Just click "Code" → "Codespaces" → "Create codespace on staging"
- This setup also works with GitHub Codespaces if you prefer development in the browser.
- Just click "Code" → "Codespaces" → "Create codespace on staging".
### Option 4: Manual Setup
If you prefer not to use Docker or Dev Containers:
If you prefer not to use Docker or Dev Containers, follow the steps below. **All commands run from the repository root unless explicitly noted.**
1. **Clone and Install:**
1. **Clone the Repository:**
```bash
git clone https://github.com/<your-username>/sim.git
cd sim
bun install
```
2. **Set Up Environment:**
Bun workspaces handle dependency resolution for all apps and packages from the root `bun install`.
- Navigate to the app directory:
```bash
cd apps/sim
```
- Copy `.env.example` to `.env`
- Configure required variables (DATABASE_URL, BETTER_AUTH_SECRET, BETTER_AUTH_URL)
2. **Set Up Environment Files:**
3. **Set Up Database:**
We use **per-app `.env` files** (the Turborepo-canonical pattern), not a single root `.env`. Three files are needed for local dev:
```bash
bunx drizzle-kit push
# Main app — large, app-specific (OAuth secrets, LLM keys, Stripe, etc.)
cp apps/sim/.env.example apps/sim/.env
# Realtime server — small, only the values shared with the main app
cp apps/realtime/.env.example apps/realtime/.env
# DB tooling (drizzle-kit, db:migrate)
cp packages/db/.env.example packages/db/.env
```
4. **Run the Development Server:**
At minimum, each `.env` needs `DATABASE_URL`. `apps/sim/.env` and `apps/realtime/.env` additionally need matching values for `BETTER_AUTH_URL`, `BETTER_AUTH_SECRET`, `INTERNAL_API_SECRET`, and `NEXT_PUBLIC_APP_URL`. `apps/sim/.env` also needs `ENCRYPTION_KEY` and `API_ENCRYPTION_KEY`. Generate random secrets with `openssl rand -hex 32` (32 random bytes, printed as 64 hex characters).
The same `BETTER_AUTH_SECRET`, `INTERNAL_API_SECRET`, and `DATABASE_URL` must appear in both `apps/sim/.env` and `apps/realtime/.env` so the two services share auth and DB. After editing `apps/sim/.env`, you can mirror the shared subset into the realtime env in one shot:
```bash
grep -E '^(DATABASE_URL|BETTER_AUTH_URL|BETTER_AUTH_SECRET|INTERNAL_API_SECRET|NEXT_PUBLIC_APP_URL|REDIS_URL)=' apps/sim/.env > apps/realtime/.env
grep -E '^DATABASE_URL=' apps/sim/.env > packages/db/.env
```
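As a quick sanity check that the shared keys really did stay in sync between the two files, you can diff the grepped subsets. A sketch (the key list is an assumption — extend it to whatever your apps actually share):

```bash
# Sketch: check that the shared auth/DB keys match between two env files.
# Returns non-zero if the grepped subsets differ.
shared_env_in_sync() {
  local keys='^(DATABASE_URL|BETTER_AUTH_SECRET|INTERNAL_API_SECRET)='
  diff <(grep -E "$keys" "$1" | sort) <(grep -E "$keys" "$2" | sort) > /dev/null
}

# Usage against the real files:
#   shared_env_in_sync apps/sim/.env apps/realtime/.env && echo "in sync"
```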
3. **Run Database Migrations:**
Migrations live in `packages/db/migrations/`. Run them via the dedicated workspace script:
```bash
cd packages/db && bun run db:migrate && cd ../..
```
For ad-hoc schema iteration during development you can also use `bun run db:push` from `packages/db`, but `db:migrate` is the canonical command for both local and CI/CD setups.
4. **Run the Development Servers:**
```bash
bun run dev:full
```
This command starts both the main application and the realtime socket server required for full functionality.
This launches both apps with coloured prefixes:
- `[App]` — Next.js on `http://localhost:3000`
- `[Realtime]` — Socket.IO on `http://localhost:3002`
Or run them separately:
```bash
bun run dev # Next.js app only
bun run dev:sockets # realtime server only
```
5. **Make Your Changes and Test Locally.**
Before opening a PR, run the same checks CI runs:
```bash
bun run type-check # TypeScript across every workspace
bun run lint:check # Biome lint across every workspace
bun run test # Vitest across every workspace
```
### Email Template Development
When working on email templates, you can preview them using a local email preview server:
@@ -263,18 +317,19 @@ When working on email templates, you can preview them using a local email previe
1. **Run the Email Preview Server:**
```bash
bun run email:dev
cd apps/sim && bun run email:dev
```
2. **Access the Preview:**
- Open `http://localhost:3000` in your browser
- You'll see a list of all email templates
- Click on any template to view and test it with various parameters
- Open `http://localhost:3000` in your browser.
- You'll see a list of all email templates.
- Click on any template to view and test it with various parameters.
3. **Templates Location:**
- Email templates are located in `sim/app/emails/`
- After making changes to templates, they will automatically update in the preview
- Email templates live in `apps/sim/components/emails/`.
- Changes hot-reload automatically in the preview.
---
@@ -282,28 +337,41 @@ When working on email templates, you can preview them using a local email previe
Sim is built in a modular fashion where blocks and tools extend the platform's functionality. To maintain consistency and quality, please follow the guidelines below when adding a new block or tool.
> **Use the skill guides for step-by-step recipes.** The repository ships opinionated, end-to-end guides under `.agents/skills/` that cover the exact file layout, conventions, registry wiring, and gotchas for each kind of contribution. Read the relevant SKILL.md before you start writing code:
>
> | Adding… | Read |
> | ------------------------------------------------------------------------------------------- | ----------------------------------------------------------------------------------- |
> | A new integration end-to-end (tools + block + icon + optional triggers + all registrations) | [`.agents/skills/add-integration/SKILL.md`](../.agents/skills/add-integration/SKILL.md) |
> | Just a block (or aligning an existing block with its tools) | [`.agents/skills/add-block/SKILL.md`](../.agents/skills/add-block/SKILL.md) |
> | Just tool configs for a service | [`.agents/skills/add-tools/SKILL.md`](../.agents/skills/add-tools/SKILL.md) |
> | A webhook trigger for a service | [`.agents/skills/add-trigger/SKILL.md`](../.agents/skills/add-trigger/SKILL.md) |
> | A knowledge-base connector (sync docs from an external source) | [`.agents/skills/add-connector/SKILL.md`](../.agents/skills/add-connector/SKILL.md) |
>
> The shorter overview below is a high-level reference; the SKILL.md files are the authoritative source of truth and stay in sync with the codebase.
### Where to Add Your Code
- **Blocks:** Create your new block file under the `/apps/sim/blocks/blocks` directory. The name of the file should match the provider name (e.g., `pinecone.ts`).
- **Tools:** Create a new directory under `/apps/sim/tools` with the same name as the provider (e.g., `/apps/sim/tools/pinecone`).
- **Blocks:** Create your new block file under the `apps/sim/blocks/blocks/` directory. The name of the file should match the provider name (e.g., `pinecone.ts`).
- **Tools:** Create a new directory under `apps/sim/tools/` with the same name as the provider (e.g., `apps/sim/tools/pinecone`).
In addition, you will need to update the registries:
- **Block Registry:** Update the blocks index (`/apps/sim/blocks/index.ts`) to include your new block.
- **Tool Registry:** Update the tools registry (`/apps/sim/tools/index.ts`) to add your new tool.
- **Block Registry:** Add your block to `apps/sim/blocks/registry.ts`. (`apps/sim/blocks/index.ts` re-exports lookups from the registry; you do not need to edit it.)
- **Tool Registry:** Add your tool to `apps/sim/tools/index.ts`.
### How to Create a New Block
1. **Create a New File:**
Create a file for your block named after the provider (e.g., `pinecone.ts`) in the `/apps/sim/blocks/blocks` directory.
1. **Create a New File:**
Create a file for your block named after the provider (e.g., `pinecone.ts`) in the `apps/sim/blocks/blocks/` directory.
2. **Create a New Icon:**
Create a new icon for your block in the `/apps/sim/components/icons.tsx` file. The icon should follow the same naming convention as the block (e.g., `PineconeIcon`).
Create a new icon for your block in `apps/sim/components/icons.tsx`. The icon should follow the same naming convention as the block (e.g., `PineconeIcon`).
3. **Define the Block Configuration:**
3. **Define the Block Configuration:**
Your block should export a constant of type `BlockConfig`. For example:
```typescript:/apps/sim/blocks/blocks/pinecone.ts
```typescript
// apps/sim/blocks/blocks/pinecone.ts
import { PineconeIcon } from '@/components/icons'
import type { BlockConfig } from '@/blocks/types'
import type { PineconeResponse } from '@/tools/pinecone/types'
@@ -321,7 +389,7 @@ In addition, you will need to update the registries:
{
id: 'operation',
title: 'Operation',
type: 'dropdown'
type: 'dropdown',
required: true,
options: [
{ label: 'Generate Embeddings', id: 'generate' },
@@ -332,7 +400,7 @@ In addition, you will need to update the registries:
{
id: 'apiKey',
title: 'API Key',
type: 'short-input'
type: 'short-input',
placeholder: 'Your Pinecone API key',
password: true,
required: true,
@@ -370,10 +438,11 @@ In addition, you will need to update the registries:
}
```
4. **Register Your Block:**
Add your block to the blocks registry (`/apps/sim/blocks/registry.ts`):
4. **Register Your Block:**
Add your block to the blocks registry (`apps/sim/blocks/registry.ts`):
```typescript:/apps/sim/blocks/registry.ts
```typescript
// apps/sim/blocks/registry.ts
import { PineconeBlock } from '@/blocks/blocks/pinecone'
// Registry of all available blocks
@@ -385,24 +454,25 @@ In addition, you will need to update the registries:
The block will be automatically available to the application through the registry.
5. **Test Your Block:**
5. **Test Your Block:**
Ensure that the block displays correctly in the UI and that its functionality works as expected.
### How to Create a New Tool
1. **Create a New Directory:**
Create a directory under `/apps/sim/tools` with the same name as the provider (e.g., `/apps/sim/tools/pinecone`).
1. **Create a New Directory:**
Create a directory under `apps/sim/tools/` with the same name as the provider (e.g., `apps/sim/tools/pinecone`).
2. **Create Tool Files:**
2. **Create Tool Files:**
Create separate files for each tool functionality with descriptive names (e.g., `fetch.ts`, `generate_embeddings.ts`, `search_text.ts`) in your tool directory.
3. **Create a Types File:**
3. **Create a Types File:**
Create a `types.ts` file in your tool directory to define and export all types related to your tools.
4. **Create an Index File:**
4. **Create an Index File:**
Create an `index.ts` file in your tool directory that imports and exports all tools:
```typescript:/apps/sim/tools/pinecone/index.ts
```typescript
// apps/sim/tools/pinecone/index.ts
import { fetchTool } from './fetch'
import { generateEmbeddingsTool } from './generate_embeddings'
import { searchTextTool } from './search_text'
@@ -410,10 +480,11 @@ In addition, you will need to update the registries:
export { fetchTool, generateEmbeddingsTool, searchTextTool }
```
5. **Define the Tool Configuration:**
5. **Define the Tool Configuration:**
Your tool should export a constant with a naming convention of `{toolName}Tool`. The tool ID should follow the format `{provider}_{tool_name}`. For example:
```typescript:/apps/sim/tools/pinecone/fetch.ts
```typescript
// apps/sim/tools/pinecone/fetch.ts
import { ToolConfig, ToolResponse } from '@/tools/types'
import { PineconeParams, PineconeResponse } from '@/tools/pinecone/types'
@@ -449,11 +520,12 @@ In addition, you will need to update the registries:
}
```
6. **Register Your Tool:**
Update the tools registry in `/apps/sim/tools/index.ts` to include your new tool:
6. **Register Your Tool:**
Update the tools registry in `apps/sim/tools/index.ts` to include your new tool:
```typescript:/apps/sim/tools/index.ts
import { fetchTool, generateEmbeddingsTool, searchTextTool } from '/@tools/pinecone'
```typescript
// apps/sim/tools/index.ts
import { fetchTool, generateEmbeddingsTool, searchTextTool } from '@/tools/pinecone'
// ... other imports
export const tools: Record<string, ToolConfig> = {
@@ -464,13 +536,14 @@ In addition, you will need to update the registries:
}
```
7. **Test Your Tool:**
7. **Test Your Tool:**
Ensure that your tool functions correctly by making test requests and verifying the responses.
8. **Generate Documentation:**
Run the documentation generator to create docs for your new tool:
8. **Generate Documentation:**
Run the documentation generator (from `apps/sim`) to create docs for your new tool:
```bash
./scripts/generate-docs.sh
cd apps/sim && bun run generate-docs
```
### Naming Conventions
@@ -480,7 +553,7 @@ Maintaining consistent naming across the codebase is critical for auto-generatio
- **Block Files:** Name should match the provider (e.g., `pinecone.ts`)
- **Block Export:** Should be named `{Provider}Block` (e.g., `PineconeBlock`)
- **Icons:** Should be named `{Provider}Icon` (e.g., `PineconeIcon`)
- **Tool Directories:** Should match the provider name (e.g., `/tools/pinecone/`)
- **Tool Directories:** Should match the provider name (e.g., `tools/pinecone/`)
- **Tool Files:** Should be named after their function (e.g., `fetch.ts`, `search_text.ts`)
- **Tool Exports:** Should be named `{toolName}Tool` (e.g., `fetchTool`)
- **Tool IDs:** Should follow the format `{provider}_{tool_name}` (e.g., `pinecone_fetch`)
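The tool ID convention above is mechanical enough to sanity-check in code. A small sketch — the regex is an illustration inferred from the examples, not an official validator:

```typescript
// Illustrative check for the `{provider}_{tool_name}` tool ID convention.
// The exact character rules (lowercase snake_case) are an assumption
// based on the examples in this guide.
const TOOL_ID_PATTERN = /^[a-z0-9]+_[a-z0-9_]+$/

function isValidToolId(id: string): boolean {
  return TOOL_ID_PATTERN.test(id)
}

console.log(isValidToolId('pinecone_fetch')) // true
console.log(isValidToolId('pinecone_search_text')) // true
console.log(isValidToolId('PineconeFetch')) // false: not lowercase snake_case
```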
@@ -489,12 +562,12 @@ Maintaining consistent naming across the codebase is critical for auto-generatio
Sim implements a sophisticated parameter visibility system that controls how parameters are exposed to users and LLMs in agent workflows. Each parameter can have one of four visibility levels:
| Visibility | User Sees | LLM Sees | How It Gets Set |
|-------------|-----------|----------|--------------------------------|
| `user-only` | ✅ Yes | ❌ No | User provides in UI |
| `user-or-llm` | ✅ Yes | ✅ Yes | User provides OR LLM generates |
| `llm-only` | ❌ No | ✅ Yes | LLM generates only |
| `hidden` | ❌ No | ❌ No | Application injects at runtime |
| Visibility | User Sees | LLM Sees | How It Gets Set |
| ------------- | --------- | -------- | ------------------------------ |
| `user-only` | ✅ Yes | ❌ No | User provides in UI |
| `user-or-llm` | ✅ Yes | ✅ Yes | User provides OR LLM generates |
| `llm-only` | ❌ No | ✅ Yes | LLM generates only |
| `hidden` | ❌ No | ❌ No | Application injects at runtime |
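In a tool's `params`, these levels typically surface as a `visibility` field on each parameter. A hypothetical, simplified sketch — the surrounding param shape is illustrative, not the full `ToolConfig` type:

```typescript
// Hypothetical, simplified param shape showing the four visibility levels
// from the table above. Everything except the four level names is
// illustrative.
type ParamVisibility = 'user-only' | 'user-or-llm' | 'llm-only' | 'hidden'

interface ParamSketch {
  description: string
  visibility: ParamVisibility
}

const params: Record<string, ParamSketch> = {
  // User provides in UI; never exposed to the LLM
  apiKey: { description: 'Pinecone API key', visibility: 'user-only' },
  // Either the user fills it in or the LLM generates it
  searchQuery: { description: 'Text to search for', visibility: 'user-or-llm' },
  // Only the LLM decides this at run time
  topK: { description: 'Number of matches to return', visibility: 'llm-only' },
  // Application injects at runtime; neither user nor LLM sees it
  workspaceId: { description: 'Current workspace', visibility: 'hidden' },
}

console.log(Object.keys(params).length) // 4
```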
#### Visibility Guidelines

View File

@@ -20,7 +20,7 @@ jobs:
- name: Setup Bun
uses: oven-sh/setup-bun@v2
with:
bun-version: 1.3.11
bun-version: 1.3.13
- name: Setup Node
uses: actions/setup-node@v4

View File

@@ -23,7 +23,7 @@ jobs:
- name: Setup Bun
uses: oven-sh/setup-bun@v2
with:
bun-version: 1.3.11
bun-version: 1.3.13
- name: Cache Bun dependencies
uses: actions/cache@v4
@@ -122,7 +122,7 @@ jobs:
- name: Setup Bun
uses: oven-sh/setup-bun@v2
with:
bun-version: 1.3.11
bun-version: 1.3.13
- name: Cache Bun dependencies
uses: actions/cache@v4

View File

@@ -19,7 +19,7 @@ jobs:
- name: Setup Bun
uses: oven-sh/setup-bun@v2
with:
bun-version: 1.3.11
bun-version: 1.3.13
- name: Cache Bun dependencies
uses: actions/cache@v4

View File

@@ -19,7 +19,7 @@ jobs:
- name: Setup Bun
uses: oven-sh/setup-bun@v2
with:
bun-version: 1.3.11
bun-version: 1.3.13
- name: Setup Node.js for npm publishing
uses: actions/setup-node@v4

View File

@@ -19,7 +19,7 @@ jobs:
- name: Setup Bun
uses: oven-sh/setup-bun@v2
with:
bun-version: 1.3.11
bun-version: 1.3.13
- name: Setup Node.js for npm publishing
uses: actions/setup-node@v4

View File

@@ -19,7 +19,7 @@ jobs:
- name: Setup Bun
uses: oven-sh/setup-bun@v2
with:
bun-version: 1.3.11
bun-version: 1.3.13
- name: Setup Node
uses: actions/setup-node@v4
@@ -103,6 +103,15 @@ jobs:
- name: Lint code
run: bun run lint:check
- name: Enforce monorepo boundaries
run: bun run check:boundaries
- name: Verify realtime prune graph
run: bun run check:realtime-prune
- name: Type-check realtime server
run: bunx turbo run type-check --filter=@sim/realtime
- name: Run tests with coverage
env:
NODE_OPTIONS: '--no-warnings --max-old-space-size=8192'

View File

@@ -7,7 +7,7 @@ You are a professional software engineer. All code must follow best practices: a
- **Logging**: Import `createLogger` from `@sim/logger`. Use `logger.info`, `logger.warn`, `logger.error` instead of `console.log`
- **Comments**: Use TSDoc for documentation. No `====` separators. No non-TSDoc comments
- **Styling**: Never update global styles. Keep all styling local to components
- **ID Generation**: Never use `crypto.randomUUID()`, `nanoid`, or `uuid` package. Use `generateId()` (UUID v4) or `generateShortId()` (compact) from `@/lib/core/utils/uuid`
- **ID Generation**: Never use `crypto.randomUUID()`, `nanoid`, or `uuid` package. Use `generateId()` (UUID v4) or `generateShortId()` (compact) from `@sim/utils/id`
- **Package Manager**: Use `bun` and `bunx`, not `npm` and `npx`
## Architecture
@@ -20,19 +20,42 @@ You are a professional software engineer. All code must follow best practices: a
### Root Structure
```
apps/sim/
├── app/ # Next.js app router (pages, API routes)
├── blocks/ # Block definitions and registry
├── components/ # Shared UI (emcn/, ui/)
├── executor/ # Workflow execution engine
├── hooks/ # Shared hooks (queries/, selectors/)
├── lib/ # App-wide utilities
├── providers/ # LLM provider integrations
├── stores/ # Zustand stores
├── tools/ # Tool definitions
└── triggers/ # Trigger definitions
apps/
├── sim/ # Next.js app (UI + API routes + workflow editor)
│ ├── app/ # Next.js app router (pages, API routes)
│ ├── blocks/ # Block definitions and registry
│ ├── components/ # Shared UI (emcn/, ui/)
│ ├── executor/ # Workflow execution engine
│ ├── hooks/ # Shared hooks (queries/, selectors/)
│ ├── lib/ # App-wide utilities
│ ├── providers/ # LLM provider integrations
│   ├── stores/          # Zustand stores
│ ├── tools/ # Tool definitions
│ └── triggers/ # Trigger definitions
└── realtime/ # Bun Socket.IO server (collaborative canvas)
└── src/ # auth, config, database, handlers, middleware,
# rooms, routes, internal/webhook-cleanup.ts
packages/
├── audit/ # @sim/audit — recordAudit + AuditAction + AuditResourceType
├── auth/ # @sim/auth — @sim/auth/verify (shared Better Auth verifier)
├── db/ # @sim/db — drizzle schema + client
├── logger/ # @sim/logger
├── realtime-protocol/ # @sim/realtime-protocol — socket operation constants + zod schemas
├── security/ # @sim/security — safeCompare
├── tsconfig/ # shared tsconfig presets
├── utils/ # @sim/utils
├── workflow-authz/ # @sim/workflow-authz — authorizeWorkflowByWorkspacePermission
├── workflow-persistence/ # @sim/workflow-persistence — raw load/save + subflow helpers
└── workflow-types/ # @sim/workflow-types — pure BlockState/Loop/Parallel/... types
```
### Package boundaries
- `apps/* → packages/*` only. Packages never import from `apps/*`.
- Each package has explicit subpath `exports` maps; no barrels that accidentally pull in heavy halves.
- `apps/realtime` intentionally avoids Next.js, React, the block/tool registry, provider SDKs, and the executor. CI enforces this via `scripts/check-monorepo-boundaries.ts` and `scripts/check-realtime-prune-graph.ts`.
- Auth is shared across services via the Better Auth "Shared Database Session" pattern: both apps read the same `BETTER_AUTH_SECRET` and point at the same DB via `@sim/db`.
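The dependency direction rule can be sketched as a toy check (the real `scripts/check-monorepo-boundaries.ts` is not shown here, so this is only an approximation of the idea, not its implementation):

```typescript
// Toy monorepo boundary rule: apps/* may import packages/*,
// but packages/* must never import from apps/*.
function violatesBoundary(importerPath: string, importedPath: string): boolean {
  const importerInPackages = importerPath.startsWith('packages/')
  const importedInApps = importedPath.startsWith('apps/')
  return importerInPackages && importedInApps
}
```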
### Naming Conventions
- Components: PascalCase (`WorkflowList`)
- Hooks: `use` prefix (`useWorkflowOperations`)

View File

@@ -4,11 +4,12 @@ You are a professional software engineer. All code must follow best practices: a
## Global Standards
- **Logging**: Import `createLogger` from `@sim/logger`. Use `logger.info`, `logger.warn`, `logger.error` instead of `console.log`
- **Logging**: Import `createLogger` from `@sim/logger`. Use `logger.info`, `logger.warn`, `logger.error` instead of `console.log`. Inside API routes wrapped with `withRouteHandler`, loggers automatically include the request ID — no manual `withMetadata({ requestId })` needed
- **API Route Handlers**: All API route handlers (`GET`, `POST`, `PUT`, `DELETE`, `PATCH`) must be wrapped with `withRouteHandler` from `@/lib/core/utils/with-route-handler`. This provides request ID tracking, automatic error logging for 4xx/5xx responses, and unhandled error catching. See "API Route Pattern" section below
- **Comments**: Use TSDoc for documentation. No `====` separators. No non-TSDoc comments
- **Styling**: Never update global styles. Keep all styling local to components
- **ID Generation**: Never use `crypto.randomUUID()`, `nanoid`, or `uuid` package. Use `generateId()` (UUID v4) or `generateShortId()` (compact) from `@/lib/core/utils/uuid`
- **Common Utilities**: Use shared helpers from `@/lib/core/utils/helpers` instead of inline implementations. `sleep(ms)` for delays, `toError(e)` to normalize caught values.
- **ID Generation**: Never use `crypto.randomUUID()`, `nanoid`, or `uuid` package. Use `generateId()` (UUID v4) or `generateShortId()` (compact) from `@sim/utils/id`
- **Common Utilities**: Use shared helpers from `@sim/utils` instead of inline implementations. `sleep(ms)` from `@sim/utils/helpers` for delays, `toError(e)` from `@sim/utils/errors` to normalize caught values.
- **Package Manager**: Use `bun` and `bunx`, not `npm` and `npx`
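For illustration, a retry helper using `sleep` and `toError` might read as follows. The two helpers are reimplemented inline here as minimal stand-ins so the sketch is self-contained; the actual `@sim/utils` implementations may differ.

```typescript
// Minimal stand-ins approximating the @sim/utils helpers (illustrative only)
const sleep = (ms: number) =>
  new Promise<void>((resolve) => setTimeout(resolve, ms))

const toError = (e: unknown): Error =>
  e instanceof Error ? e : new Error(typeof e === 'string' ? e : JSON.stringify(e))

// Example consumer: retry a flaky async call, normalizing whatever it throws
async function fetchWithRetry(fn: () => Promise<string>, retries = 2): Promise<string> {
  try {
    return await fn()
  } catch (e) {
    if (retries === 0) throw toError(e) // callers always catch a real Error
    await sleep(100) // back off briefly before retrying
    return fetchWithRetry(fn, retries - 1)
  }
}
```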
## Architecture
@@ -93,6 +94,41 @@ export function Component({ requiredProp, optionalProp = false }: ComponentProps
Extract when: 50+ lines, used in 2+ files, or has own state/logic. Keep inline when: < 10 lines, single use, purely presentational.
## API Route Pattern
Every API route handler must be wrapped with `withRouteHandler`. This sets up `AsyncLocalStorage`-based request context so all loggers in the request lifecycle automatically include the request ID.
```typescript
import { createLogger } from '@sim/logger'
import type { NextRequest } from 'next/server'
import { NextResponse } from 'next/server'
import { withRouteHandler } from '@/lib/core/utils/with-route-handler'
const logger = createLogger('MyAPI')
// Simple route
export const GET = withRouteHandler(async (request: NextRequest) => {
logger.info('Handling request') // automatically includes {requestId=...}
return NextResponse.json({ ok: true })
})
// Route with params
export const DELETE = withRouteHandler(async (
request: NextRequest,
{ params }: { params: Promise<{ id: string }> }
) => {
const { id } = await params
return NextResponse.json({ deleted: id })
})
// Composing with other middleware (withRouteHandler wraps the outermost layer)
export const POST = withRouteHandler(withAdminAuth(async (request) => {
return NextResponse.json({ ok: true })
}))
```
Never export a bare `async function GET/POST/...` — always use `export const METHOD = withRouteHandler(...)`.
## Hooks
```typescript

View File

@@ -142,13 +142,15 @@ See the [environment variables reference](https://docs.sim.ai/self-hosting/envir
- **Database**: PostgreSQL with [Drizzle ORM](https://orm.drizzle.team)
- **Authentication**: [Better Auth](https://better-auth.com)
- **UI**: [Shadcn](https://ui.shadcn.com/), [Tailwind CSS](https://tailwindcss.com)
- **State Management**: [Zustand](https://zustand-demo.pmnd.rs/)
- **Streaming Markdown**: [Streamdown](https://github.com/vercel/streamdown)
- **State Management**: [Zustand](https://zustand-demo.pmnd.rs/), [TanStack Query](https://tanstack.com/query)
- **Flow Editor**: [ReactFlow](https://reactflow.dev/)
- **Docs**: [Fumadocs](https://fumadocs.vercel.app/)
- **Monorepo**: [Turborepo](https://turborepo.org/)
- **Realtime**: [Socket.io](https://socket.io/)
- **Background Jobs**: [Trigger.dev](https://trigger.dev/)
- **Remote Code Execution**: [E2B](https://www.e2b.dev/)
- **Isolated Code Execution**: [isolated-vm](https://github.com/laverdet/isolated-vm)
## Contributing

View File

@@ -28,6 +28,36 @@ export function AgentMailIcon(props: SVGProps<SVGSVGElement>) {
)
}
export function AgentPhoneIcon(props: SVGProps<SVGSVGElement>) {
return (
<svg {...props} viewBox='0 0 150 150' xmlns='http://www.w3.org/2000/svg'>
<path
fill='#23AF58'
stroke='#007F3F'
strokeWidth='0.15'
strokeMiterlimit='10'
d='m139.6 53.3c-1.4-2.3-4.9-3.3-7.6-4.8-2.7-1.3-4.2-2.4-5.7-3.6-1.9-1-2.5-2.7-3.3-3.2s-2.7-1.4-4.5 1.3c-2 2.7-4.5 6.6-6.6 11.1-2.3 5.4-6.3 14.9-6.3 18.9 0.5 4.9 3.1 4.6 6.1 7.2 2.5 2.1 2.8 5.8 1.5 12.5-1.3 6.6-4 12.8-7.8 19.2-3.3 5.1-5.8 8.7-10 9.1-5.3 0.5-12.5-3.1-16.8-5.6-1-0.6-2.5-0.9-3.8-0.2-1.3 0.5-2.2 1.6-3.2 3.3-1.5 2.5-4.6 7.7-5.8 12.2-0.5 3 0 6.4 2.9 9 1.4 1.2 2.8 2.5 4.4 3.4 5 2.8 9.6 4.5 16.5 4.9 5.3 0.2 9.3-1 13.4-3.1 2.4-1.3 6.6-4.2 9.6-7.3l1.1-1.2c2.8-3.1 8.8-10 11.6-14.5 2.3-3.5 4.8-7.4 6.9-12.3 2.9-6.7 4.4-14 5-17.9 1.2-7 2.4-17.5 3.4-31.1 0.1-4.3-0.3-6.1-1-7.3zm-4.5 6.7c-0.5 9.5-1.9 23.3-3.1 30.1-0.9 4.5-2.4 9.6-3.8 13.4-1.1 2.6-3.1 7-5.6 10.8-3.4 5.3-8.4 11.6-12 15.8-6.4 6.6-10.2 9.6-14.2 10.8-2.2 0.9-3.8 1.2-7 1.2-3.4-0.1-8-0.7-11.3-2.2-3-1.2-7-4-6.9-6.8 0.4-3.2 3.3-9.6 5.2-11.9 0.2-0.3 0.5-0.3 0.7-0.2 2.5 1.1 6 3.2 9.6 4.5 2.4 0.9 4.8 1.4 7.3 1.4 3.9 0 6.7-1.2 9.5-3.2 5.6-4.6 9-10.8 12.1-17.5 2-4.3 4.1-11.6 4.4-18.3 0.1-4.9-1.1-8.9-4.5-12.2-1.1-0.7-3-2.1-3-2.8 0-4.2 3.9-13 8.9-22.9 0.2-0.7 0.5-1 1.1-0.7 1.1 0.6 3 1.4 4.6 2.4 2.1 1 5.4 2.4 7.1 3.9 0.9 0.4 1 3 0.9 4.4z'
/>
<path
fill='#23AF58'
d='m104.7 27.8c-1.3-1.5-3.3-1.3-6.2-1.5l-1.9 0.2-7-0.2-31.5 0.2 1.5-9.3c2-1.1 5.1-3.5 5.8-6.3 1-2.8 0.2-5.9-2-7.4-2.3-1.9-5.8-2.4-9.3-0.8-1.6 1-4.7 3.4-5.4 6.9-0.8 4.1 2.4 6.7 4.7 7.9l-1.5 9.1-17.2 0.9c-12.3 1.1-16.3 1.2-20.6 4.3-2 1.3-3 4.5-3.4 9.8-0.6 11.3-0.7 18.7-0.6 28.3 0.4 11.2 0 36.6 3 39.8l-1.2 0.3c-3.8 0.6-4 6.2-0.5 6.6l15.5-1 69.7-7.6c2.5-0.4 4.3-0.9 4.6-4.3l3.7-71.5c0-1.9 0.2-3.6-0.2-4.4zm-49.6-17.3c0.3-2.2 2.4-3 3.3-2.8 0.7 0.4 1 1.8 0 2.8-1.5 2-3.3 1.7-3.3 0zm40 90.2c-4 1-5.5 1.5-11.5 2.4-7.7 1-19.7 2.1-31.2 3.4l-33.8 2.9c-0.7 0.2-1-0.4-1-1-0.6-6.5-1.2-20.5-1.5-39.5l0.3-23.3c0.6-7.5 0.7-8.7 4.6-9.7 5.1-0.9 7.4-1.4 14.9-1.8l19.5-0.5 41.1-0.5c1.4 0 1.9 0.4 1.9 1.5l-3.3 66.1z'
/>
<path
fill='#23AF58'
d='m38.9 52.4c-1.8 0-4 1.1-4.5 3.3-1 3.9 1 7.6 4.5 7.7 3.8 0 5-3.8 4.7-6.3-0.2-2-2-4.7-4.7-4.7z'
/>
<path
fill='#23AF58'
d='m73.5 53.9c-1.8 0-4.3 1.5-4.4 4.5-0.1 3.2 2 5.3 4.3 5.3 2.5 0 4.2-1.7 4.2-4.8 0-3.2-1.7-4.8-4.1-5z'
/>
<path
fill='#23AF58'
d='m72.1 77.1c-2.7 3.4-7.2 7.4-14.7 8.3-7.3 0.3-13.9-2.9-20-8.5-3.5-3.4-8 0-6.2 2.7 1.7 2.5 6.4 6.6 10.4 8.8 3.5 2 7.3 3.3 13.8 3.5 4.7 0 9.2-0.8 12.7-2.4 2.9-1.1 5-2.8 6-3.8 2.3-2.1 3.8-4.1 3.5-7.3-0.9-2.5-3.6-2.8-5.5-1.3z'
/>
</svg>
)
}
export function CrowdStrikeIcon(props: SVGProps<SVGSVGElement>) {
return (
<svg {...props} viewBox='0 0 768 500' fill='none' xmlns='http://www.w3.org/2000/svg'>
@@ -4015,6 +4045,7 @@ export function AsanaIcon(props: SVGProps<SVGSVGElement>) {
}
export function PipedriveIcon(props: SVGProps<SVGSVGElement>) {
const pathId = useId()
return (
<svg
{...props}
@@ -4028,7 +4059,7 @@ export function PipedriveIcon(props: SVGProps<SVGSVGElement>) {
<defs>
<path
d='M59.6807,81.1772 C59.6807,101.5343 70.0078,123.4949 92.7336,123.4949 C109.5872,123.4949 126.6277,110.3374 126.6277,80.8785 C126.6277,55.0508 113.232,37.7119 93.2944,37.7119 C77.0483,37.7119 59.6807,49.1244 59.6807,81.1772 Z M101.3006,0 C142.0482,0 169.4469,32.2728 169.4469,80.3126 C169.4469,127.5978 140.584,160.60942 99.3224,160.60942 C79.6495,160.60942 67.0483,152.1836 60.4595,146.0843 C60.5063,147.5305 60.5374,149.1497 60.5374,150.8788 L60.5374,215 L18.32565,215 L18.32565,44.157 C18.32565,41.6732 17.53126,40.8873 15.07021,40.8873 L0.5531,40.8873 L0.5531,3.4741 L35.9736,3.4741 C52.282,3.4741 56.4564,11.7741 57.2508,18.1721 C63.8708,10.7524 77.5935,0 101.3006,0 Z'
id='path-1'
id={pathId}
/>
</defs>
<g
@@ -4039,10 +4070,7 @@ export function PipedriveIcon(props: SVGProps<SVGSVGElement>) {
fillRule='evenodd'
>
<g transform='translate(67.000000, 44.000000)'>
<mask id='mask-2' fill='white'>
<use href='#path-1' />
</mask>
<use id='Clip-5' fill='#FFFFFF' xlinkHref='#path-1' />
<use fill='#FFFFFF' xlinkHref={`#${pathId}`} />
</g>
</g>
</svg>
@@ -4068,6 +4096,40 @@ export function SalesforceIcon(props: SVGProps<SVGSVGElement>) {
)
}
export function SapS4HanaIcon(props: SVGProps<SVGSVGElement>) {
const id = useId()
return (
<svg {...props} xmlns='http://www.w3.org/2000/svg' viewBox='0 0 412.38 204'>
<defs>
<linearGradient
id={id}
x1='206.19'
y1='0'
x2='206.19'
y2='204'
gradientUnits='userSpaceOnUse'
>
<stop offset='0' stopColor='#00b1eb' />
<stop offset='.212' stopColor='#009ad9' />
<stop offset='.519' stopColor='#007fc4' />
<stop offset='.792' stopColor='#006eb8' />
<stop offset='1' stopColor='#0069b4' />
</linearGradient>
</defs>
<polyline
fill={`url(#${id})`}
fillRule='evenodd'
points='0 204 208.413 204 412.38 0 0 0 0 204'
/>
<path
fill='#fff'
fillRule='evenodd'
d='m244.727,38.359l-40.593-.025v96.518l-35.46-96.518h-35.16l-30.277,80.716c-3.224-20.352-24.277-27.38-40.84-32.649-10.937-3.512-22.541-8.678-22.434-14.387.091-4.687,6.225-9.04,18.377-8.385,8.17.433,15.373,1.092,29.71,8.006l14.102-24.557c-13.088-6.658-31.169-10.867-45.985-10.883h-.086c-17.277,0-31.677,5.598-40.602,14.824-6.221,6.443-9.572,14.626-9.712,23.679-.227,12.454,4.341,21.292,13.938,28.338,8.104,5.944,18.468,9.794,27.603,12.626,11.27,3.492,20.467,6.526,20.36,13.002-.083,2.355-.977,4.552-2.671,6.337-2.807,2.897-7.124,3.986-13.084,4.098-11.497.243-20.026-1.559-33.61-9.585l-12.536,24.903c13.546,7.705,29.586,12.223,45.952,12.223l2.106-.024c14.247-.256,25.745-4.316,34.929-11.712.527-.416,1.001-.845,1.488-1.277l-4.073,10.874h36.875l6.189-18.822c6.477,2.214,13.847,3.437,21.676,3.437,7.618,0,14.795-1.17,21.156-3.252l5.965,18.637h60.137v-38.969h13.113c31.706,0,50.456-16.147,50.456-43.202,0-30.139-18.219-43.969-57.011-43.969Zm-93.816,82.587c-4.737,0-9.177-.828-13.006-2.275l12.866-40.593h.244l12.643,40.708c-3.801,1.349-8.138,2.16-12.746,2.16Zm96.199-23.324h-8.941v-32.711h8.941c11.927,0,21.437,3.961,21.437,16.139,0,12.602-9.51,16.572-21.437,16.572'
/>
</svg>
)
}
export function ServiceNowIcon(props: SVGProps<SVGSVGElement>) {
return (
<svg {...props} xmlns='http://www.w3.org/2000/svg' viewBox='0 0 71.1 63.6'>
@@ -4664,15 +4726,16 @@ export function DynamoDBIcon(props: SVGProps<SVGSVGElement>) {
}
export function IAMIcon(props: SVGProps<SVGSVGElement>) {
const id = useId()
return (
<svg {...props} viewBox='0 0 80 80' xmlns='http://www.w3.org/2000/svg'>
<defs>
<linearGradient x1='0%' y1='100%' x2='100%' y2='0%' id='iamGradient'>
<linearGradient x1='0%' y1='100%' x2='100%' y2='0%' id={id}>
<stop stopColor='#BD0816' offset='0%' />
<stop stopColor='#FF5252' offset='100%' />
</linearGradient>
</defs>
<rect fill='url(#iamGradient)' width='80' height='80' />
<rect fill={`url(#${id})`} width='80' height='80' />
<path
d='M14,59 L66,59 L66,21 L14,21 L14,59 Z M68,20 L68,60 C68,60.552 67.553,61 67,61 L13,61 C12.447,61 12,60.552 12,60 L12,20 C12,19.448 12.447,19 13,19 L67,19 C67.553,19 68,19.448 68,20 L68,20 Z M44,48 L59,48 L59,46 L44,46 L44,48 Z M57,42 L62,42 L62,40 L57,40 L57,42 Z M44,42 L52,42 L52,40 L44,40 L44,42 Z M29,46 C29,45.449 28.552,45 28,45 C27.448,45 27,45.449 27,46 C27,46.551 27.448,47 28,47 C28.552,47 29,46.551 29,46 L29,46 Z M31,46 C31,47.302 30.161,48.401 29,48.816 L29,51 L27,51 L27,48.815 C25.839,48.401 25,47.302 25,46 C25,44.346 26.346,43 28,43 C29.654,43 31,44.346 31,46 L31,46 Z M19,53.993 L36.994,54 L36.996,50 L33,50 L33,48 L36.996,48 L36.998,45 L33,45 L33,43 L36.999,43 L37,40.007 L19.006,40 L19,53.993 Z M22,38.001 L34,38.006 L34,31 C34.001,28.697 31.197,26.677 28,26.675 L27.996,26.675 C24.804,26.675 22.004,28.696 22.002,31 L22,38.001 Z M17,54.992 L17.006,39 C17.006,38.734 17.111,38.48 17.299,38.292 C17.486,38.105 17.741,38 18.006,38 L20,38.001 L20.002,31 C20.004,27.512 23.59,24.675 27.996,24.675 L28,24.675 C32.412,24.677 36.001,27.515 36,31 L36,38.007 L38,38.008 C38.553,38.008 39,38.456 39,39.008 L38.994,55 C38.994,55.266 38.889,55.52 38.701,55.708 C38.514,55.895 38.259,56 37.994,56 L18,55.992 C17.447,55.992 17,55.544 17,54.992 L17,54.992 Z M60,36 L62,36 L62,34 L60,34 L60,36 Z M44,36 L55,36 L55,34 L44,34 L44,36 Z'
fill='#FFFFFF'
@@ -4681,16 +4744,36 @@ export function IAMIcon(props: SVGProps<SVGSVGElement>) {
)
}
export function IdentityCenterIcon(props: SVGProps<SVGSVGElement>) {
const id = useId()
return (
<svg {...props} viewBox='0 0 80 80' xmlns='http://www.w3.org/2000/svg'>
<defs>
<linearGradient x1='0%' y1='100%' x2='100%' y2='0%' id={id}>
<stop stopColor='#BD0816' offset='0%' />
<stop stopColor='#FF5252' offset='100%' />
</linearGradient>
</defs>
<rect fill={`url(#${id})`} width='80' height='80' />
<path
d='M46.694,46.8194562 C47.376,46.1374562 47.376,45.0294562 46.694,44.3474562 C46.353,44.0074562 45.906,43.8374562 45.459,43.8374562 C45.01,43.8374562 44.563,44.0074562 44.222,44.3474562 C43.542,45.0284562 43.542,46.1384562 44.222,46.8194562 C44.905,47.5014562 46.013,47.4994562 46.694,46.8194562 M47.718,47.1374562 L51.703,51.1204562 L50.996,51.8274562 L49.868,50.6994562 L48.793,51.7754562 L48.086,51.0684562 L49.161,49.9924562 L47.011,47.8444562 C46.545,48.1654562 46.003,48.3294562 45.458,48.3294562 C44.755,48.3294562 44.051,48.0624562 43.515,47.5264562 C42.445,46.4554562 42.445,44.7124562 43.515,43.6404562 C44.586,42.5714562 46.329,42.5694562 47.401,43.6404562 C48.351,44.5904562 48.455,46.0674562 47.718,47.1374562 M53,44.1014562 C53,46.1684562 51.505,47.0934562 50.023,47.0934562 L50.023,46.0934562 C50.487,46.0934562 52,45.9494562 52,44.1014562 C52,43.0044562 51.353,42.3894562 49.905,42.1084562 C49.68,42.0654562 49.514,41.8754562 49.501,41.6484562 C49.446,40.7444562 48.987,40.1124562 48.384,40.1124562 C48.084,40.1124562 47.854,40.2424562 47.616,40.5464562 C47.506,40.6884562 47.324,40.7594562 47.147,40.7324562 C46.968,40.7054562 46.818,40.5844562 46.755,40.4144562 C46.577,39.9434562 46.211,39.4334562 45.723,38.9774562 C45.231,38.5094562 43.883,37.5074562 41.972,38.2734562 C40.885,38.7054562 40.034,39.9494562 40.034,41.1074562 C40.034,41.2354562 40.043,41.3624562 40.058,41.4884562 C40.061,41.5094562 40.062,41.5304562 40.062,41.5514562 C40.062,41.7994562 39.882,42.0064562 39.645,42.0464562 C38.886,42.2394562 38,42.7454562 38,44.0554562 L38.005,44.2104562 C38.069,45.3254562 39.252,45.9954562 40.358,45.9984562 L41,45.9984562 L41,46.9984562 L40.357,46.9984562 C38.536,46.9944562 37.095,45.8194562 37.006,44.2644562 C37.003,44.1944562 37,44.1244562 37,44.0554562 C37,42.6944562 37.752,41.6484562 39.035,41.1884562 C39.034,41.1614562 39.034,41.1344562 39.034,41.1074562 C39.034,39.5434562 40.138,37.9254562 41.602,37.3434562 C43.298,36.6654562 45.095,37.0034562 46.409,38.2494562 
C46.706,38.5274562 47.076,38.9264562 47.372,39.4134562 C47.673,39.2124562 48.008,39.1124562 48.384,39.1124562 C49.257,39.1124562 50.231,39.7714562 50.458,41.2074562 C52.145,41.6324562 53,42.6054562 53,44.1014562 M27,53 L27,27 L53,27 L53,34 L51,34 L51,29 L29,29 L29,51 L51,51 L51,46 L53,46 L53,53 Z'
fill='#FFFFFF'
/>
</svg>
)
}
export function STSIcon(props: SVGProps<SVGSVGElement>) {
const id = useId()
return (
<svg {...props} viewBox='0 0 80 80' xmlns='http://www.w3.org/2000/svg'>
<defs>
<linearGradient x1='0%' y1='100%' x2='100%' y2='0%' id='stsGradient'>
<linearGradient x1='0%' y1='100%' x2='100%' y2='0%' id={id}>
<stop stopColor='#BD0816' offset='0%' />
<stop stopColor='#FF5252' offset='100%' />
</linearGradient>
</defs>
<rect fill='url(#stsGradient)' width='80' height='80' />
<rect fill={`url(#${id})`} width='80' height='80' />
<path
d='M14,59 L66,59 L66,21 L14,21 L14,59 Z M68,20 L68,60 C68,60.552 67.553,61 67,61 L13,61 C12.447,61 12,60.552 12,60 L12,20 C12,19.448 12.447,19 13,19 L67,19 C67.553,19 68,19.448 68,20 L68,20 Z M44,48 L59,48 L59,46 L44,46 L44,48 Z M57,42 L62,42 L62,40 L57,40 L57,42 Z M44,42 L52,42 L52,40 L44,40 L44,42 Z M29,46 C29,45.449 28.552,45 28,45 C27.448,45 27,45.449 27,46 C27,46.551 27.448,47 28,47 C28.552,47 29,46.551 29,46 L29,46 Z M31,46 C31,47.302 30.161,48.401 29,48.816 L29,51 L27,51 L27,48.815 C25.839,48.401 25,47.302 25,46 C25,44.346 26.346,43 28,43 C29.654,43 31,44.346 31,46 L31,46 Z M19,53.993 L36.994,54 L36.996,50 L33,50 L33,48 L36.996,48 L36.998,45 L33,45 L33,43 L36.999,43 L37,40.007 L19.006,40 L19,53.993 Z M22,38.001 L34,38.006 L34,31 C34.001,28.697 31.197,26.677 28,26.675 L27.996,26.675 C24.804,26.675 22.004,28.696 22.002,31 L22,38.001 Z M17,54.992 L17.006,39 C17.006,38.734 17.111,38.48 17.299,38.292 C17.486,38.105 17.741,38 18.006,38 L20,38.001 L20.002,31 C20.004,27.512 23.59,24.675 27.996,24.675 L28,24.675 C32.412,24.677 36.001,27.515 36,31 L36,38.007 L38,38.008 C38.553,38.008 39,38.456 39,39.008 L38.994,55 C38.994,55.266 38.889,55.52 38.701,55.708 C38.514,55.895 38.259,56 37.994,56 L18,55.992 C17.447,55.992 17,55.544 17,54.992 L17,54.992 Z M60,36 L62,36 L62,34 L60,34 L60,36 Z M44,36 L55,36 L55,34 L44,34 L44,36 Z'
fill='#FFFFFF'
@@ -4699,16 +4782,36 @@ export function STSIcon(props: SVGProps<SVGSVGElement>) {
)
}
export function SecretsManagerIcon(props: SVGProps<SVGSVGElement>) {
export function SESIcon(props: SVGProps<SVGSVGElement>) {
const id = useId()
return (
<svg {...props} viewBox='0 0 80 80' xmlns='http://www.w3.org/2000/svg'>
<defs>
<linearGradient x1='0%' y1='100%' x2='100%' y2='0%' id='secretsManagerGradient'>
<linearGradient x1='0%' y1='100%' x2='100%' y2='0%' id={id}>
<stop stopColor='#BD0816' offset='0%' />
<stop stopColor='#FF5252' offset='100%' />
</linearGradient>
</defs>
<rect fill='url(#secretsManagerGradient)' width='80' height='80' />
<rect fill={`url(#${id})`} width='80' height='80' />
<path
d='M57,60.999875 C57,59.373846 55.626,57.9998214 54,57.9998214 C52.374,57.9998214 51,59.373846 51,60.999875 C51,62.625904 52.374,63.9999286 54,63.9999286 C55.626,63.9999286 57,62.625904 57,60.999875 L57,60.999875 Z M40,59.9998571 C38.374,59.9998571 37,61.3738817 37,62.9999107 C37,64.6259397 38.374,65.9999643 40,65.9999643 C41.626,65.9999643 43,64.6259397 43,62.9999107 C43,61.3738817 41.626,59.9998571 40,59.9998571 L40,59.9998571 Z M26,57.9998214 C24.374,57.9998214 23,59.373846 23,60.999875 C23,62.625904 24.374,63.9999286 26,63.9999286 C27.626,63.9999286 29,62.625904 29,60.999875 C29,59.373846 27.626,57.9998214 26,57.9998214 L26,57.9998214 Z M28.605,42.9995536 L51.395,42.9995536 L43.739,36.1104305 L40.649,38.7584778 C40.463,38.9194807 40.23,38.9994821 39.999,38.9994821 C39.768,38.9994821 39.535,38.9194807 39.349,38.7584778 L36.26,36.1104305 L28.605,42.9995536 Z M27,28.1732888 L27,41.7545313 L34.729,34.7984071 L27,28.1732888 Z M51.297,26.9992678 L28.703,26.9992678 L39.999,36.6824408 L51.297,26.9992678 Z M53,41.7545313 L53,28.1732888 L45.271,34.7974071 L53,41.7545313 Z M59,60.999875 C59,63.7099234 56.71,65.9999643 54,65.9999643 C51.29,65.9999643 49,63.7099234 49,60.999875 C49,58.6308327 50.75,56.5837961 53,56.1057876 L53,52.9997321 L41,52.9997321 L41,58.1058233 C43.25,58.5838319 45,60.6308684 45,62.9999107 C45,65.7099591 42.71,68 40,68 C37.29,68 35,65.7099591 35,62.9999107 C35,60.6308684 36.75,58.5838319 39,58.1058233 L39,52.9997321 L27,52.9997321 L27,56.1057876 C29.25,56.5837961 31,58.6308327 31,60.999875 C31,63.7099234 28.71,65.9999643 26,65.9999643 C23.29,65.9999643 21,63.7099234 21,60.999875 C21,58.6308327 22.75,56.5837961 25,56.1057876 L25,51.9997143 C25,51.4477044 25.447,50.9996964 26,50.9996964 L39,50.9996964 L39,44.9995893 L26,44.9995893 C25.447,44.9995893 25,44.5515813 25,43.9995714 L25,25.99925 C25,25.4472401 25.447,24.9992321 26,24.9992321 L54,24.9992321 C54.553,24.9992321 55,25.4472401 55,25.99925 L55,43.9995714 C55,44.5515813 54.553,44.9995893 
54,44.9995893 L41,44.9995893 L41,50.9996964 L54,50.9996964 C54.553,50.9996964 55,51.4477044 55,51.9997143 L55,56.1057876 C57.25,56.5837961 59,58.6308327 59,60.999875 L59,60.999875 Z M68,39.9995 C68,45.9066055 66.177,51.5597064 62.727,56.3447919 L61.104,55.174771 C64.307,50.7316916 66,45.4845979 66,39.9995 C66,25.664244 54.337,14.0000357 40.001,14.0000357 C25.664,14.0000357 14,25.664244 14,39.9995 C14,45.4845979 15.693,50.7316916 18.896,55.174771 L17.273,56.3447919 C13.823,51.5597064 12,45.9066055 12,39.9995 C12,24.5612243 24.561,12 39.999,12 C55.438,12 68,24.5612243 68,39.9995 L68,39.9995 Z'
fill='#FFFFFF'
/>
</svg>
)
}
export function SecretsManagerIcon(props: SVGProps<SVGSVGElement>) {
const id = useId()
return (
<svg {...props} viewBox='0 0 80 80' xmlns='http://www.w3.org/2000/svg'>
<defs>
<linearGradient x1='0%' y1='100%' x2='100%' y2='0%' id={id}>
<stop stopColor='#BD0816' offset='0%' />
<stop stopColor='#FF5252' offset='100%' />
</linearGradient>
</defs>
<rect fill={`url(#${id})`} width='80' height='80' />
<path
d='M38.76,43.36 C38.76,44.044 39.317,44.6 40,44.6 C40.684,44.6 41.24,44.044 41.24,43.36 C41.24,42.676 40.684,42.12 40,42.12 C39.317,42.12 38.76,42.676 38.76,43.36 L38.76,43.36 Z M36.76,43.36 C36.76,41.573 38.213,40.12 40,40.12 C41.787,40.12 43.24,41.573 43.24,43.36 C43.24,44.796 42.296,46.002 41,46.426 L41,49 L39,49 L39,46.426 C37.704,46.002 36.76,44.796 36.76,43.36 L36.76,43.36 Z M49,38 L31,38 L31,51 L49,51 L49,48 L46,48 L46,46 L49,46 L49,43 L46,43 L46,41 L49,41 L49,38 Z M34,36 L45.999,36 L46,31 C46.001,28.384 43.143,26.002 40.004,26 L40.001,26 C38.472,26 36.928,26.574 35.763,27.575 C34.643,28.537 34,29.786 34,31.001 L34,36 Z M48,31.001 L47.999,36 L50,36 C50.553,36 51,36.448 51,37 L51,52 C51,52.552 50.553,53 50,53 L30,53 C29.447,53 29,52.552 29,52 L29,37 C29,36.448 29.447,36 30,36 L32,36 L32,31 C32.001,29.202 32.897,27.401 34.459,26.058 C35.982,24.75 38.001,24 40.001,24 L40.004,24 C44.265,24.002 48.001,27.273 48,31.001 L48,31.001 Z M19.207,55.049 L20.828,53.877 C18.093,50.097 16.581,45.662 16.396,41 L19,41 L19,39 L16.399,39 C16.598,34.366 18.108,29.957 20.828,26.198 L19.207,25.025 C16.239,29.128 14.599,33.942 14.399,39 L12,39 L12,41 L14.396,41 C14.582,46.086 16.224,50.926 19.207,55.049 L19.207,55.049 Z M53.838,59.208 C50.069,61.936 45.648,63.446 41,63.639 L41,61 L39,61 L39,63.639 C34.352,63.447 29.93,61.937 26.159,59.208 L24.988,60.828 C29.1,63.805 33.928,65.445 39,65.639 L39,68 L41,68 L41,65.639 C46.072,65.445 50.898,63.805 55.01,60.828 L53.838,59.208 Z M26.159,20.866 C29.93,18.138 34.352,16.628 39,16.436 L39,19 L41,19 L41,16.436 C45.648,16.628 50.069,18.138 53.838,20.866 L55.01,19.246 C50.898,16.27 46.072,14.63 41,14.436 L41,12 L39,12 L39,14.436 C33.928,14.629 29.1,16.269 24.988,19.246 L26.159,20.866 Z M65.599,39 C65.399,33.942 63.759,29.128 60.79,25.025 L59.169,26.198 C61.89,29.957 63.4,34.366 63.599,39 L61,39 L61,41 L63.602,41 C63.416,45.662 61.905,50.097 59.169,53.877 L60.79,55.049 C63.774,50.926 65.415,46.086 65.602,41 L68,41 L68,39 L65.599,39 Z 
M56.386,25.064 L64.226,17.224 L62.812,15.81 L54.972,23.65 L56.386,25.064 Z M23.612,55.01 L15.772,62.85 L17.186,64.264 L25.026,56.424 L23.612,55.01 Z M28.666,27.253 L13.825,12.413 L12.411,13.827 L27.252,28.667 L28.666,27.253 Z M54.193,52.78 L67.586,66.173 L66.172,67.587 L52.779,54.194 L54.193,52.78 Z'
fill='#FFFFFF'

View File

@@ -6,6 +6,7 @@ import type { ComponentType, SVGProps } from 'react'
import {
A2AIcon,
AgentMailIcon,
AgentPhoneIcon,
AgiloftIcon,
AhrefsIcon,
AirtableIcon,
@@ -91,6 +92,7 @@ import {
HuggingFaceIcon,
HunterIOIcon,
IAMIcon,
IdentityCenterIcon,
ImageIcon,
IncidentioIcon,
InfisicalIcon,
@@ -152,6 +154,8 @@ import {
RootlyIcon,
S3Icon,
SalesforceIcon,
SapS4HanaIcon,
SESIcon,
SearchIcon,
SecretsManagerIcon,
SendgridIcon,
@@ -202,6 +206,7 @@ type IconComponent = ComponentType<SVGProps<SVGSVGElement>>
export const blockTypeToIconMap: Record<string, IconComponent> = {
a2a: A2AIcon,
agentmail: AgentMailIcon,
agentphone: AgentPhoneIcon,
agiloft: AgiloftIcon,
ahrefs: AhrefsIcon,
airtable: AirtableIcon,
@@ -294,6 +299,7 @@ export const blockTypeToIconMap: Record<string, IconComponent> = {
huggingface: HuggingFaceIcon,
hunter: HunterIOIcon,
iam: IAMIcon,
identity_center: IdentityCenterIcon,
image_generator: ImageIcon,
imap: MailServerIcon,
incidentio: IncidentioIcon,
@@ -364,12 +370,14 @@ export const blockTypeToIconMap: Record<string, IconComponent> = {
rootly: RootlyIcon,
s3: S3Icon,
salesforce: SalesforceIcon,
sap_s4hana: SapS4HanaIcon,
search: SearchIcon,
secrets_manager: SecretsManagerIcon,
sendgrid: SendgridIcon,
sentry: SentryIcon,
serper: SerperIcon,
servicenow: ServiceNowIcon,
ses: SESIcon,
sftp: SftpIcon,
sharepoint: MicrosoftSharepointIcon,
shopify: ShopifyIcon,

View File

@@ -25,6 +25,8 @@ Secrets are organized into two sections:
- **Workspace** — shared with all members of your workspace
- **Personal** — private to you
External workspace members count as workspace members for workspace-scoped secrets. They can use workspace secrets according to their workspace permission level, even though they are not members of your organization.
### Adding a Secret
Type a key name (e.g. `OPENAI_API_KEY`) into the **Key** column and its value into the **Value** column in the last empty row. A new empty row appears automatically as you type. Existing values are masked by default.
@@ -89,7 +91,7 @@ Click **Save** to apply changes, or **Back** to return to the list.
| | Workspace | Personal |
|---|---|---|
| **Visibility** | All workspace members | Only you |
| **Visibility** | All workspace members, including external workspace members | Only you |
| **Use in workflows** | Any member can use | Only you can use |
| **Best for** | Production workflows, shared services | Testing, personal API keys |
| **Who can edit** | Workspace admins | Only you |

View File

@@ -0,0 +1,219 @@
---
title: Access Control
description: Restrict which models, blocks, and platform features each group of users can access
---
import { Callout } from 'fumadocs-ui/components/callout'
import { FAQ } from '@/components/ui/faq'
import { Image } from '@/components/ui/image'
Access Control lets workspace admins define permission groups that restrict what each set of workspace members can do — which AI model providers they can use, which workflow blocks they can place, and which platform features are visible to them. Permission groups are scoped to a single workspace: a user can be in different groups (or no group) in different workspaces. Restrictions are enforced both in the workflow executor and in Mothership, based on the workflow's workspace.
---
## How it works
Access control is built around **permission groups**. Each group belongs to a specific workspace and has a name, an optional description, and a configuration that defines what its members can and cannot do. A user can belong to at most one permission group **per workspace**, but can belong to different groups in different workspaces.
When a user runs a workflow or uses Mothership, Sim reads their group's configuration and applies it:
- **In the executor:** If a workflow uses a disallowed block type or model provider, execution halts immediately with an error. This applies to both manual runs and scheduled or API-triggered deployments.
- **In Mothership:** Disallowed blocks are filtered out of the block list so they cannot be added to a workflow. Disallowed tool types (MCP, custom tools, skills) are skipped if Mothership attempts to use them.
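The executor-side check described above can be sketched like this. The config shape is hypothetical (Sim's actual permission-group schema is not shown on this page); the sketch only captures the stated behavior, including the always-allowed `start_trigger` block:

```typescript
// Hypothetical permission-group config shape (illustrative only)
interface PermissionGroupConfig {
  allowedProviders: string[] | 'all'
  allowedBlocks: string[] | 'all'
}

function assertBlockAllowed(blockType: string, config: PermissionGroupConfig): void {
  // start_trigger is always allowed and cannot be restricted
  if (blockType === 'start_trigger') return
  if (config.allowedBlocks !== 'all' && !config.allowedBlocks.includes(blockType)) {
    // Execution halts immediately with an error
    throw new Error(`Block type "${blockType}" is not permitted by your permission group`)
  }
}
```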
---
## Setup
### 1. Open Access Control settings
Go to **Settings → Enterprise → Access Control** in the workspace you want to manage. Each workspace has its own set of permission groups.
<Image src="/static/enterprise/access-control-groups.png" alt="Access Control settings showing a list of permission groups: Contractors, Sales, Engineering, and Marketing, each with Details and Delete actions" width={900} height={500} />
### 2. Create a permission group
Click **+ Create** and enter a name (required) and optional description. You can also enable **Auto-add new members** — when active, any new member who joins this workspace is automatically added to this group. Only one group per workspace can have this setting enabled at a time.
### 3. Configure permissions
Click **Details** on a group, then open **Configure Permissions**. There are three tabs.
#### Model Providers
Controls which AI model providers members of this group can use.
<Image src="/static/enterprise/access-control-model-providers.png" alt="Model Providers tab showing a grid of AI providers including Ollama, vLLM, OpenAI, Anthropic, Google, Azure OpenAI, and others with checkboxes to allow or restrict access" width={900} height={500} /> The list shows all providers available in Sim.
- **All checked (default):** All providers are allowed.
- **Subset checked:** Only the selected providers are allowed. Any workflow block or agent using a provider not on the list will fail at execution time.
#### Blocks
Controls which workflow blocks members can place and execute.
<Image src="/static/enterprise/access-control-blocks.png" alt="Blocks tab showing Core Blocks (Agent, API, Condition, Function, Knowledge, etc.) and Tools (integrations like 1Password, A2A, Ahrefs, Airtable, and more) with checkboxes to allow or restrict each" width={900} height={500} /> Blocks are split into two sections: **Core Blocks** (Agent, API, Condition, Function, etc.) and **Tools** (all integration blocks).
- **All checked (default):** All blocks are allowed.
- **Subset checked:** Only the selected blocks are allowed. Workflows that already contain a disallowed block will fail when run — they are not automatically modified.
<Callout type="info">
The `start_trigger` block (the entry point of every workflow) is always allowed and cannot be restricted.
</Callout>
#### Platform
Controls visibility of platform features and modules.
<Image src="/static/enterprise/access-control-platform.png" alt="Platform tab showing feature toggles grouped by category: Sidebar (Knowledge Base, Tables, Templates), Workflow Panel (Copilot), Settings Tabs, Tools, Deploy Tabs, Features, Logs, and Collaboration" width={900} height={500} /> Each checkbox maps to a specific feature; checking it hides or disables that feature for group members.
**Sidebar**
| Feature | Effect when checked |
|---------|-------------------|
| Knowledge Base | Hides the Knowledge Base section from the sidebar |
| Tables | Hides the Tables section from the sidebar |
| Templates | Hides the Templates section from the sidebar |
**Workflow Panel**
| Feature | Effect when checked |
|---------|-------------------|
| Copilot | Hides the Copilot panel inside the workflow editor |
**Settings Tabs**
| Feature | Effect when checked |
|---------|-------------------|
| Integrations | Hides the Integrations tab in Settings |
| Secrets | Hides the Secrets tab in Settings |
| API Keys | Hides the Sim Keys tab in Settings |
| Files | Hides the Files tab in Settings |
**Tools**
| Feature | Effect when checked |
|---------|-------------------|
| MCP Tools | Disables the use of MCP tools in workflows and agents |
| Custom Tools | Disables the use of custom tools in workflows and agents |
| Skills | Disables the use of Sim Skills in workflows and agents |
**Deploy Tabs**
| Feature | Effect when checked |
|---------|-------------------|
| API | Hides the API deployment tab |
| MCP | Hides the MCP deployment tab |
| A2A | Hides the A2A deployment tab |
| Chat | Hides the Chat deployment tab |
| Template | Hides the Template deployment tab |
**Features**
| Feature | Effect when checked |
|---------|-------------------|
| Sim Mailer | Hides the Sim Mailer (Inbox) feature |
| Public API | Disables public API access for deployed workflows |
**Logs**
| Feature | Effect when checked |
|---------|-------------------|
| Trace Spans | Hides trace span details in execution logs |
**Collaboration**
| Feature | Effect when checked |
|---------|-------------------|
| Invitations | Disables the ability to invite new members to the workspace |
### 4. Add members
Open the group's **Details** view and add members by searching for users by name or email. Only users who already have workspace-level access can be added. A user can only belong to one group per workspace — adding a user to a new group within the same workspace removes them from their current group for that workspace.
External workspace members are treated like other workspace members for access-control purposes. They can be assigned to permission groups in any workspace they have access to, but they do not become organization members or appear in the organization roster.
---
## Enforcement
### Workflow execution
Restrictions are enforced at the point of execution, not at save time. If a group's configuration changes after a workflow is built:
- **Block restrictions:** Any workflow run that reaches a disallowed block halts immediately with an error. The workflow is not modified — only execution is blocked.
- **Model provider restrictions:** Any block or agent that uses a disallowed provider halts immediately with an error.
- **Tool restrictions (MCP, custom tools, skills):** Agents that use a disallowed tool type halt immediately with an error.
This applies regardless of how the workflow is triggered — manually, via API, via schedule, or via webhook.
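As a mental model, the executor-side check described above can be sketched as a small function. The configuration shape (`allowed_blocks`, the `None`-means-unrestricted convention) is an illustrative assumption, not Sim's actual internal schema:

```python
# Illustrative sketch of an execution-time block check, assuming a
# hypothetical group-config shape. Not Sim's actual implementation.

def check_block_allowed(group_config, block_type):
    """Return True if a member of this group may execute the block."""
    if group_config is None:            # no group assigned: no restrictions
        return True
    if block_type == "start_trigger":   # entry point is always allowed
        return True
    allowed = group_config.get("allowed_blocks")
    if allowed is None:                 # "all checked" default
        return True
    return block_type in allowed

config = {"allowed_blocks": {"agent", "function", "condition"}}
print(check_block_allowed(config, "agent"))   # True
print(check_block_allowed(config, "slack"))   # False -> run halts with an error
print(check_block_allowed(None, "slack"))     # True  -> no group, no restrictions
```

The same shape applies to provider and tool-type restrictions: the run halts at the first disallowed element it reaches.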
### Mothership
When a user opens Mothership, their permission group is read before any block or tool suggestions are made:
- Blocks not in the allowed list are filtered out of the block picker entirely — they do not appear as options.
- If Mothership generates a workflow step that would use a disallowed tool (MCP, custom, or skills), that step is skipped and the reason is noted.
---
## User membership rules
- A user can belong to **at most one** permission group **per workspace**, but may be in different groups across different workspaces.
- Moving a user to a new group within a workspace automatically removes them from their previous group in that workspace.
- Users not assigned to any group in a workspace have no restrictions applied in that workspace (all blocks, providers, and features are available to them there).
- If **Auto-add new members** is enabled on a group, new members of that workspace are automatically placed in the group. Only one group per workspace can have this setting active.
- External workspace members follow the same per-workspace permission group rules as internal members.
---
<FAQ items={[
{
question: "Who can create and manage permission groups?",
answer: "Any workspace admin on an Enterprise-entitled workspace can create, edit, and delete permission groups for that workspace. The workspace's billed account must be on the Enterprise plan."
},
{
question: "What happens to a workflow that was built before a block was restricted?",
answer: "The workflow is not modified — it still exists and can be edited. However, any run that reaches a disallowed block will halt immediately with an error. The block must be removed or the user's group configuration must be updated before the workflow can run successfully."
},
{
question: "Can a user be in multiple permission groups?",
answer: "A user can belong to at most one permission group per workspace, but can belong to different groups in different workspaces. Adding a user to a new group within the same workspace automatically removes them from their previous group in that workspace."
},
{
question: "What does a user see if they have no permission group assigned in a workspace?",
answer: "Users with no group in a given workspace have no restrictions in that workspace. All blocks, model providers, and platform features are fully available to them there. Restrictions only apply in the specific workspaces where they are assigned to a group."
},
{
question: "Does Mothership respect the same restrictions as the executor?",
answer: "Yes. Mothership reads the user's permission group for the active workspace before suggesting blocks or tools. Disallowed blocks are filtered out of the block picker, and disallowed tool types are skipped during workflow generation."
},
{
question: "Can I restrict access to specific workflows or workspaces?",
answer: "Access Control operates at the feature and block level within a workspace. To restrict who can access the workspace itself, use workspace invitations and permissions. To apply different restrictions to different workflows, put them in different workspaces with distinct permission groups."
},
{
question: "What is Auto-add new members?",
answer: "When a group has Auto-add new members enabled, any new member who joins the workspace is automatically added to that group. Only one group per workspace can have this setting enabled at a time."
}
]} />
---
## Self-hosted setup
Self-hosted deployments use environment variables instead of the billing/plan check.
### Environment variables
```bash
ACCESS_CONTROL_ENABLED=true
NEXT_PUBLIC_ACCESS_CONTROL_ENABLED=true
```
You can also set a server-level block allowlist using the `ALLOWED_INTEGRATIONS` environment variable. This is applied as an additional constraint on top of any permission group configuration — a block must be allowed by both the environment allowlist and the user's group to be usable.
```bash
# Only these block types are available across the entire instance
ALLOWED_INTEGRATIONS=slack,gmail,agent,function,condition
```
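The "allowed by both" rule can be sketched as a set intersection. The parsing and helper name below are illustrative assumptions, not taken from Sim's code:

```python
# Sketch of combining the ALLOWED_INTEGRATIONS env allowlist with a
# permission group's allowlist. Names are illustrative assumptions.

def effective_allowlist(env_value, group_allowed):
    """Return the set of usable block types, or None for 'no restriction'."""
    env_set = None
    if env_value:
        env_set = {b.strip() for b in env_value.split(",") if b.strip()}
    if env_set is None:
        return group_allowed
    if group_allowed is None:
        return env_set
    return env_set & group_allowed  # a block must be allowed by BOTH

env = "slack,gmail,agent,function,condition"
group = {"agent", "function", "slack", "airtable"}
print(sorted(effective_allowlist(env, group)))
# ['agent', 'function', 'slack'] -- airtable fails the env list; gmail and condition fail the group
```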
Once enabled, permission groups are managed through **Settings → Enterprise → Access Control** the same way as Sim Cloud.

---
title: Audit Logs
description: Track every action taken across your organization's workspaces
---
import { FAQ } from '@/components/ui/faq'
import { Image } from '@/components/ui/image'
Audit logs give your organization a tamper-evident record of every significant action taken across workspaces — who did what, when, and on which resource. Use them for security reviews, compliance investigations, and incident response.
---
## Viewing audit logs
### In the UI
Go to **Settings → Enterprise → Audit Logs** in your workspace. Logs are displayed in a table with the following columns:
<Image src="/static/enterprise/audit-logs.png" alt="Audit Logs settings showing a table of events with columns for Timestamp, Event, Description, and Actor, along with search and filter controls" width={900} height={500} />
| Column | Description |
|--------|-------------|
| **Timestamp** | When the action occurred. |
| **Event** | The action taken, e.g. `workflow.created`. |
| **Description** | A human-readable summary of the action. |
| **Actor** | The email address of the user who performed the action. |
Use the search bar, event type filter, and date range selector to narrow results.
### Via API
Audit logs are also accessible through the Sim API for integration with external SIEM or log management tools.
```http
GET /api/v1/audit-logs
Authorization: Bearer <api-key>
```
**Query parameters:**
| Parameter | Type | Description |
|-----------|------|-------------|
| `action` | string | Filter by event type (e.g. `workflow.created`) |
| `resourceType` | string | Filter by resource type (e.g. `workflow`) |
| `resourceId` | string | Filter by a specific resource ID |
| `workspaceId` | string | Filter by workspace |
| `actorId` | string | Filter by user ID. For organization-wide filters, the actor must be a current or former org member; workspace-scoped logs can also include external workspace members. |
| `startDate` | string | ISO 8601 date — return logs on or after this date |
| `endDate` | string | ISO 8601 date — return logs on or before this date |
| `includeDeparted` | boolean | Include logs from members who have since left the organization (default `false`) |
| `limit` | number | Results per page (1–100, default 50) |
| `cursor` | string | Opaque cursor for fetching the next page |
**Example response:**
```json
{
"data": [
{
"id": "abc123",
"action": "workflow.created",
"resourceType": "workflow",
"resourceId": "wf_xyz",
"resourceName": "Customer Onboarding",
"description": "Created workflow \"Customer Onboarding\"",
"actorId": "usr_abc",
"actorName": "Alice Smith",
"actorEmail": "alice@company.com",
"workspaceId": "ws_def",
"metadata": {},
"createdAt": "2026-04-20T21:16:00.000Z"
}
],
"nextCursor": "eyJpZCI6ImFiYzEyMyJ9"
}
```
Paginate by passing the `nextCursor` value as the `cursor` parameter in the next request. When `nextCursor` is absent, you have reached the last page.
The API accepts both personal and workspace-scoped API keys. Rate limits apply — the response includes `X-RateLimit-*` headers with your current limit and remaining quota.
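A minimal export loop that follows `nextCursor` might look like the sketch below; `fetch_page` stands in for the HTTP call, and the commented `requests` version uses a hypothetical instance URL and key:

```python
# Cursor-pagination sketch for GET /api/v1/audit-logs. The response
# shape matches the example above: {"data": [...], "nextCursor": ...}.

def fetch_all_logs(fetch_page):
    """Collect every entry by following nextCursor until it is absent."""
    entries, cursor = [], None
    while True:
        page = fetch_page(cursor)
        entries.extend(page["data"])
        cursor = page.get("nextCursor")
        if not cursor:
            break
    return entries

# A real fetch_page might look like this (hypothetical instance URL):
#
#   import requests
#   def fetch_page(cursor):
#       params = {"limit": 100}
#       if cursor:
#           params["cursor"] = cursor
#       r = requests.get("https://your-instance/api/v1/audit-logs",
#                        headers={"Authorization": "Bearer <api-key>"},
#                        params=params)
#       r.raise_for_status()
#       return r.json()
```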
---
## Event types
Audit log events follow a `resource.action` naming pattern. The table below lists the main categories.
| Category | Example events |
|----------|---------------|
| **Workflows** | `workflow.created`, `workflow.deleted`, `workflow.deployed`, `workflow.locked` |
| **Workspaces** | `workspace.created`, `workspace.updated`, `workspace.deleted` |
| **Members** | `member.invited`, `member.removed`, `member.role_changed` |
| **Permission groups** | `permission_group.created`, `permission_group.updated`, `permission_group.deleted` |
| **Environments** | `environment.updated`, `environment.deleted` |
| **Knowledge bases** | `knowledge_base.created`, `knowledge_base.deleted`, `connector.synced` |
| **Tables** | `table.created`, `table.updated`, `table.deleted` |
| **API keys** | `api_key.created`, `api_key.revoked` |
| **Credentials** | `credential.created`, `credential.deleted`, `oauth.disconnected` |
| **Organization** | `organization.updated`, `org_member.added`, `org_member.role_changed` |
Workspace invitation events record in their metadata whether the invite is for an internal organization member or an external workspace member. External workspace members can appear as actors on workspace-scoped events, but they are not organization members and do not appear in the organization roster.
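When forwarding events to a SIEM, the `resource.action` convention makes routing straightforward — split on the first dot (illustrative helper, not part of the Sim API):

```python
# Split a resource.action event name into its two parts.
def parse_event(event):
    resource, _, action = event.partition(".")
    return resource, action

print(parse_event("permission_group.updated"))  # ('permission_group', 'updated')
print(parse_event("member.role_changed"))       # ('member', 'role_changed')
```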
---
<FAQ items={[
{
question: "Who can view audit logs?",
answer: "Organization owners and admins can view audit logs. On Sim Cloud, you must be on the Enterprise plan."
},
{
question: "Are audit logs tamper-proof?",
answer: "Audit log entries are append-only and cannot be modified or deleted through the Sim interface or API. They represent a reliable record of actions taken in your organization."
},
{
question: "Can I export audit logs?",
answer: "Yes. Use the API to export logs programmatically. Paginate through all records using the cursor parameter and store them in your own data warehouse or SIEM."
},
{
question: "Are logs scoped to a single workspace or the whole organization?",
answer: "Audit logs are scoped to your organization and include activity across all workspaces within it. You can filter by workspaceId to narrow results to a specific workspace."
},
{
question: "What information is included in each log entry?",
answer: "Each entry includes the event type, a description, the actor's name and email, the affected resource, the workspace, and a timestamp. IP addresses and user agents are not exposed through the API."
},
{
question: "Can I filter logs by a specific user?",
answer: "Yes. Pass the actorId query parameter to filter logs by a specific user. Organization-wide actor filters require the actor to be a current or former member of your organization. Workspace-scoped logs may also include external workspace members who acted inside a workspace without joining the organization."
}
]} />
---
## Self-hosted setup
Self-hosted deployments use environment variables instead of the billing/plan check.
### Environment variables
```bash
AUDIT_LOGS_ENABLED=true
NEXT_PUBLIC_AUDIT_LOGS_ENABLED=true
```
Once enabled, audit logs are viewable in **Settings → Enterprise → Audit Logs** and accessible via the API.

---
title: Data Retention
description: Control how long execution logs, deleted resources, and copilot data are kept before permanent deletion
---
import { FAQ } from '@/components/ui/faq'
import { Image } from '@/components/ui/image'
Data Retention lets organization owners and admins on Enterprise plans configure how long three categories of data are kept before they are permanently deleted. The configuration applies to every workspace in the organization.
---
## Setup
Go to **Settings → Enterprise → Data Retention** in your workspace.
<Image src="/static/enterprise/data-retention.png" alt="Data Retention settings showing three dropdowns — Log retention, Soft deletion cleanup, and Task cleanup — each set to Forever" width={900} height={500} />
You will see three independent settings, each with the same set of options: **1 day, 3 days, 7 days, 14 days, 30 days, 60 days, 90 days, 180 days, 1 year, 5 years,** or **Forever**.
Setting a period to **Forever** means that category of data is never automatically deleted.
---
## Settings
### Log retention
Controls how long **workflow execution logs** are kept.
When the retention period expires, execution log records are permanently deleted, along with any files associated with those executions stored in cloud storage.
### Soft deletion cleanup
Controls how long **soft-deleted resources** remain recoverable before permanent removal.
When you delete a workflow, folder, knowledge base, table, or file, it is initially soft-deleted and can be recovered from Recently Deleted. Once the soft deletion cleanup period expires, those resources are permanently removed and cannot be recovered.
Resources covered:
- Workflows
- Workflow folders
- Knowledge bases
- Tables
- Files
- MCP server configurations
- Agent memory
### Task cleanup
Controls how long **Mothership data** is kept, including:
- Copilot chats and run history
- Run checkpoints and async tool calls
- Inbox tasks (Sim Mailer)
Each setting is independent. You can configure a short log retention period alongside a long soft deletion cleanup period, or any combination that fits your compliance requirements.
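Conceptually, each setting maps to an independent deletion cutoff computed by the cleanup job. The helper below is an illustrative sketch, including the `None`-means-Forever convention — not Sim's implementation:

```python
# Sketch of deriving a cleanup cutoff from a retention setting.
from datetime import datetime, timedelta, timezone

def retention_cutoff(retention_days, now):
    """Timestamp before which data is eligible for deletion.

    None means Forever / unconfigured: nothing is auto-deleted.
    """
    if retention_days is None:
        return None
    return now - timedelta(days=retention_days)

now = datetime(2026, 4, 27, tzinfo=timezone.utc)
print(retention_cutoff(30, now))    # 2026-03-28 00:00:00+00:00
print(retention_cutoff(None, now))  # None -> skip deletion entirely
```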
---
## Organization-wide configuration
Retention is configured at the **organization level**. A single configuration applies to every workspace in the organization — there are no per-workspace overrides.
---
## Defaults
By default, all three settings are unconfigured — no data is automatically deleted in any category until you configure it. Setting a period to **Forever** has the same effect as leaving it unconfigured, but makes the intent explicit and lets you change it later without starting from scratch.
---
<FAQ items={[
{
question: "Who can configure data retention settings?",
answer: "Only organization owners and admins can configure data retention settings. On Sim Cloud, the organization must be on an Enterprise plan."
},
{
question: "Is deletion immediate once the retention period expires?",
answer: "No. Deletion runs on a scheduled cleanup job. Data is deleted when the job next runs after the retention period has elapsed — not at the exact moment it expires."
},
{
question: "Can deleted data be recovered after the soft deletion cleanup period?",
answer: "No. Once the soft deletion cleanup period expires and the cleanup job runs, resources are permanently deleted and cannot be recovered."
},
{
question: "Does the retention period apply to all workspaces in my organization?",
answer: "Yes. Retention is configured once per organization and applies to every workspace in the organization."
},
{
question: "What happens if I shorten the retention period?",
answer: "The next cleanup job will delete any data that is older than the new, shorter period — including data that would have been kept under the previous setting. Shortening the period is irreversible for data that falls outside the new window."
},
{
question: "What is the minimum retention period?",
answer: "1 day (24 hours)."
},
{
question: "What is the maximum retention period?",
answer: "5 years is the longest finite period. Setting a category to Forever disables automatic deletion for it entirely."
}
]} />
---
## Self-hosted setup
### Environment variables
```bash
NEXT_PUBLIC_DATA_RETENTION_ENABLED=true
```
Once enabled, data retention settings are configurable through **Settings → Enterprise → Data Retention** the same way as Sim Cloud.

---
title: Enterprise
description: Enterprise features for business organizations
---
import { Callout } from 'fumadocs-ui/components/callout'
import { FAQ } from '@/components/ui/faq'
Sim Enterprise provides advanced features for organizations with enhanced security, compliance, and management requirements.
## Access Control
Define permission groups on a workspace to control what features and integrations its members can use. Permission groups are scoped to a single workspace — a user can belong to different groups (or no group) in different workspaces.
External workspace members can be assigned to permission groups just like internal organization members, but they remain outside the organization roster and do not consume seats.
### Features
### Setup
1. Navigate to **Settings** → **Access Control** in the workspace you want to manage
2. Create a permission group with your desired restrictions
3. Add workspace members to the permission group
Any workspace admin on an Enterprise-entitled workspace can manage permission groups. Users not assigned to any group have full access. Restrictions are enforced at both UI and execution time, based on the workflow's workspace.
See the [Access Control guide](/docs/enterprise/access-control) for full details.
---
## Single Sign-On (SSO)
Enterprise authentication with SAML 2.0 and OIDC support. Works with Okta, Azure AD (Entra ID), Google Workspace, ADFS, and any standard OIDC or SAML 2.0 provider.
See the [SSO setup guide](/docs/enterprise/sso) for step-by-step instructions and provider-specific configuration.
---
## Whitelabeling
Replace Sim's default branding — logos, product name, and favicons — with your own. See the [whitelabeling guide](/docs/enterprise/whitelabeling).
---
## Audit Logs
Track configuration and security-relevant actions across your organization for compliance and monitoring. See the [audit logs guide](/docs/enterprise/audit-logs).
---
## Data Retention
Configure how long execution logs, soft-deleted resources, and Mothership data are kept before permanent deletion. See the [data retention guide](/docs/enterprise/data-retention).
---
<FAQ items={[
{ question: "Who can manage Enterprise features?", answer: "Workspace admins on an Enterprise-entitled workspace. Access Control, SSO, whitelabeling, audit logs, and data retention are all configured per workspace under Settings → Enterprise." },
{ question: "Which SSO providers are supported?", answer: "Sim supports SAML 2.0 and OIDC, which works with virtually any enterprise identity provider including Okta, Azure AD (Entra ID), Google Workspace, ADFS, and OneLogin." },
{ question: "How do access control permission groups work?", answer: "Permission groups are created per workspace and let you restrict which AI providers, workflow blocks, and platform features are available to specific members of that workspace. Each user can belong to at most one group per workspace. Users not assigned to any group have full access. Restrictions are enforced at both the UI level and at execution time based on the workflow's workspace." },
]} />
---
## Self-hosted setup
Self-hosted deployments enable enterprise features via environment variables instead of billing.
| Variable | Description |
|----------|-------------|
| `ORGANIZATIONS_ENABLED`, `NEXT_PUBLIC_ORGANIZATIONS_ENABLED` | Team and organization management |
| `ACCESS_CONTROL_ENABLED`, `NEXT_PUBLIC_ACCESS_CONTROL_ENABLED` | Permission groups |
| `SSO_ENABLED`, `NEXT_PUBLIC_SSO_ENABLED` | SAML and OIDC sign-in |
| `WHITELABELING_ENABLED`, `NEXT_PUBLIC_WHITELABELING_ENABLED` | Custom branding |
| `AUDIT_LOGS_ENABLED`, `NEXT_PUBLIC_AUDIT_LOGS_ENABLED` | Audit logging |
| `NEXT_PUBLIC_DATA_RETENTION_ENABLED` | Data retention configuration |
| `CREDENTIAL_SETS_ENABLED`, `NEXT_PUBLIC_CREDENTIAL_SETS_ENABLED` | Polling groups for email triggers |
| `INBOX_ENABLED`, `NEXT_PUBLIC_INBOX_ENABLED` | Sim Mailer inbox |
| `DISABLE_INVITATIONS`, `NEXT_PUBLIC_DISABLE_INVITATIONS` | Disable invitations; manage membership via Admin API |
### Organization Management
When billing is disabled, use the Admin API to manage organizations:
```bash
# Create an organization
curl -X POST https://your-instance/api/v1/admin/organizations \
-H "x-admin-key: YOUR_ADMIN_API_KEY" \
-H "Content-Type: application/json" \
-d '{"name": "My Organization", "ownerId": "user-id-here"}'
# Add a member
curl -X POST https://your-instance/api/v1/admin/organizations/{orgId}/members \
-H "x-admin-key: YOUR_ADMIN_API_KEY" \
-H "Content-Type: application/json" \
-d '{"userId": "user-id-here", "role": "admin"}'
```
### Workspace Members
When invitations are disabled, use the Admin API to manage workspace memberships directly:
```bash
# Add a user to a workspace
curl -X POST https://your-instance/api/v1/admin/workspaces/{workspaceId}/members \
-H "x-admin-key: YOUR_ADMIN_API_KEY" \
-H "Content-Type: application/json" \
-d '{"userId": "user-id-here", "permissions": "write"}'
# Remove a user from a workspace
curl -X DELETE "https://your-instance/api/v1/admin/workspaces/{workspaceId}/members?userId=user-id-here" \
-H "x-admin-key: YOUR_ADMIN_API_KEY"
```
### Notes
- Enabling `ACCESS_CONTROL_ENABLED` automatically enables organizations, as access control requires organization membership.
- When `DISABLE_INVITATIONS` is set, users cannot send invitations. Use the Admin API to manage workspace and organization memberships instead.
Once enabled, each feature is configured through the same Settings UI as Sim Cloud. When invitations are disabled, use the Admin API (`x-admin-key` header) to manage organization membership and workspace access. Internal members join the organization; external workspace members only receive access to a specific workspace.

{
"title": "Enterprise",
"pages": ["index", "sso", "access-control", "whitelabeling", "audit-logs", "data-retention"],
"defaultOpen": false
}

---
title: Single Sign-On (SSO)
description: Configure SAML 2.0 or OIDC-based single sign-on for your organization
---
import { Callout } from 'fumadocs-ui/components/callout'
import { Tab, Tabs } from 'fumadocs-ui/components/tabs'
import { FAQ } from '@/components/ui/faq'
import { Image } from '@/components/ui/image'
Single Sign-On lets your team sign in to Sim through your company's identity provider instead of managing separate passwords. Sim supports both OIDC and SAML 2.0.
---
## Setup
### 1. Open SSO settings
Go to **Settings → Enterprise → Single Sign-On** in your workspace.
### 2. Choose a protocol
| Protocol | Use when |
|----------|----------|
| **OIDC** | Your IdP supports OpenID Connect — Okta, Microsoft Entra ID, Auth0, Google Workspace |
| **SAML 2.0** | Your IdP is SAML-only — ADFS, Shibboleth, or older enterprise IdPs |
### 3. Fill in the form
<Image src="/static/enterprise/sso-form.png" alt="Single Sign-On configuration form showing Provider Type (OIDC), Provider ID, Issuer URL, Domain, Client ID, Client Secret, Scopes, and Callback URL fields" width={900} height={500} />
**Fields required for both protocols:**
| Field | What to enter |
|-------|--------------|
| **Provider ID** | A short slug identifying this connection, e.g. `okta` or `azure-ad`. Letters, numbers, and dashes only. |
| **Issuer URL** | The identity provider's issuer URL. Must be HTTPS. |
| **Domain** | Your organization's email domain, e.g. `company.com`. Users with this domain will be routed through SSO at sign-in. |
**OIDC additional fields:**
| Field | What to enter |
|-------|--------------|
| **Client ID** | The application client ID from your IdP. |
| **Client Secret** | The client secret from your IdP. |
| **Scopes** | Comma-separated OIDC scopes. Default: `openid,profile,email`. |
<Callout type="info">
For OIDC, Sim automatically fetches endpoints (`authorization_endpoint`, `token_endpoint`, `userinfo_endpoint`, `jwks_uri`) from your issuer's `/.well-known/openid-configuration` discovery document. You only need to provide the issuer URL.
</Callout>
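To verify an issuer URL before saving, you can construct and fetch the discovery document yourself. A quick shell sketch, using the example Okta issuer from the provider guides (substitute your own):

```bash
# Build the discovery URL from the issuer (strip any trailing slash first)
issuer="https://dev-1234567.okta.com/oauth2/default"
discovery_url="${issuer%/}/.well-known/openid-configuration"
echo "$discovery_url"

# Uncomment to fetch it and confirm the endpoints Sim will discover:
# curl -s "$discovery_url" | jq -r '.authorization_endpoint, .token_endpoint, .jwks_uri'
```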
**SAML additional fields:**
| Field | What to enter |
|-------|--------------|
| **Entry Point URL** | The IdP's SSO service URL where Sim sends authentication requests. |
| **Identity Provider Certificate** | The Base-64 encoded X.509 certificate from your IdP for verifying assertions. |
### 4. Copy the Callback URL
The **Callback URL** shown in the form is the endpoint your identity provider must redirect users back to after authentication. Copy it and register it in your IdP before saving.
**OIDC providers** (Okta, Microsoft Entra ID, Google Workspace, Auth0):
```
https://sim.ai/api/auth/sso/callback/{provider-id}
```
**SAML providers** (ADFS, Shibboleth):
```
https://sim.ai/api/auth/sso/saml2/callback/{provider-id}
```
### 5. Save and test
Click **Save**. To test, sign out and use the **Sign in with SSO** button on the login page. Enter an email address at your configured domain — Sim will redirect you to your identity provider.
---
## Provider Guides
<Tabs items={['Okta', 'Microsoft Entra ID', 'Google Workspace', 'ADFS']}>
<Tab value="Okta">
### Okta (OIDC)
**In Okta** ([official docs](https://help.okta.com/en-us/content/topics/apps/apps_app_integration_wizard_oidc.htm)):
1. Go to **Applications → Create App Integration**
2. Select **OIDC - OpenID Connect**, then **Web Application**
3. Set the **Sign-in redirect URI** to your Sim callback URL:
```
https://sim.ai/api/auth/sso/callback/okta
```
4. Under **Assignments**, grant access to the relevant users or groups
5. Copy the **Client ID** and **Client Secret** from the app's **General** tab
6. Your Okta domain is the hostname of your admin console, e.g. `dev-1234567.okta.com`
**In Sim:**
| Field | Value |
|-------|-------|
| Provider Type | OIDC |
| Provider ID | `okta` |
| Issuer URL | `https://dev-1234567.okta.com/oauth2/default` |
| Domain | `company.com` |
| Client ID | From Okta app |
| Client Secret | From Okta app |
The issuer URL uses Okta's default authorization server, which is pre-configured on every Okta org. If you created a custom authorization server, replace `default` with your server name.
</Tab>
<Tab value="Microsoft Entra ID">
### Microsoft Entra ID (OIDC)
**In Azure** ([official docs](https://learn.microsoft.com/en-us/entra/identity-platform/quickstart-register-app)):
1. Go to **Microsoft Entra ID → App registrations → New registration**
2. Under **Redirect URI**, select **Web** and enter your Sim callback URL:
```
https://sim.ai/api/auth/sso/callback/azure-ad
```
3. After registration, go to **Certificates & secrets → New client secret** and copy the value immediately — it won't be shown again
4. Go to **Overview** and copy the **Application (client) ID** and **Directory (tenant) ID**
**In Sim:**
| Field | Value |
|-------|-------|
| Provider Type | OIDC |
| Provider ID | `azure-ad` |
| Issuer URL | `https://login.microsoftonline.com/{tenant-id}/v2.0` |
| Domain | `company.com` |
| Client ID | Application (client) ID |
| Client Secret | Secret value |
</Tab>
<Tab value="Google Workspace">
### Google Workspace (OIDC)
**In Google Cloud Console** ([official docs](https://developers.google.com/identity/openid-connect/openid-connect)):
1. Go to **APIs & Services → Credentials → Create Credentials → OAuth 2.0 Client ID**
2. Set the application type to **Web application**
3. Add your Sim callback URL to **Authorized redirect URIs**:
```
https://sim.ai/api/auth/sso/callback/google-workspace
```
4. Copy the **Client ID** and **Client Secret**
**In Sim:**
| Field | Value |
|-------|-------|
| Provider Type | OIDC |
| Provider ID | `google-workspace` |
| Issuer URL | `https://accounts.google.com` |
| Domain | `company.com` |
| Client ID | From Google Cloud Console |
| Client Secret | From Google Cloud Console |
<Callout type="info">
To restrict sign-in to your Google Workspace domain, configure the OAuth consent screen and set **User type** to **Internal** (Workspace users only), which limits access to users within your Google Workspace organization.
</Callout>
</Tab>
<Tab value="ADFS">
### ADFS (SAML 2.0)
**In ADFS** ([official docs](https://learn.microsoft.com/en-us/windows-server/identity/ad-fs/operations/create-a-relying-party-trust)):
1. Open **AD FS Management → Relying Party Trusts → Add Relying Party Trust**
2. Choose **Claims aware**, then **Enter data about the relying party manually**
3. Set the **Relying party identifier** (Entity ID) to your Sim base URL:
```
https://sim.ai
```
4. Add an endpoint: **SAML Assertion Consumer Service** (HTTP POST) with the URL:
```
https://sim.ai/api/auth/sso/saml2/callback/adfs
```
5. Export the **Token-signing certificate** from **Certificates**: right-click → **View Certificate → Details → Copy to File**, choose **Base-64 encoded X.509 (.CER)**. The `.cer` file is PEM-encoded — rename it to `.pem` before pasting its contents into Sim.
6. Note the **ADFS Federation Service endpoint URL** (e.g. `https://adfs.company.com/adfs/ls`)
**In Sim:**
| Field | Value |
|-------|-------|
| Provider Type | SAML |
| Provider ID | `adfs` |
| Issuer URL | `https://sim.ai` |
| Domain | `company.com` |
| Entry Point URL | `https://adfs.company.com/adfs/ls` |
| Certificate | Contents of the `.pem` file |
<Callout type="info">
For ADFS, the **Issuer URL** field is the SP entity ID — the identifier ADFS uses to identify Sim as a relying party. It must match the **Relying party identifier** you registered in ADFS.
</Callout>
</Tab>
</Tabs>
---
## How sign-in works after setup
Once SSO is configured, users with your domain (`company.com`) can sign in through your identity provider:
1. User goes to `sim.ai` and clicks **Sign in with SSO**
2. They enter their work email (e.g. `alice@company.com`)
3. Sim redirects them to your identity provider
4. After authenticating, they are returned to Sim and added to your organization automatically
5. They land in the workspace
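The routing in steps 2 and 3 keys off the domain part of the email address. A minimal illustration of that matching logic (the provider table here is a made-up example, not Sim's implementation):

```bash
# Illustration only: match the email's domain to a registered SSO provider.
email="alice@company.com"
domain="${email##*@}"                  # everything after the last '@'

case "$domain" in
  company.com) provider="okta" ;;      # registered SSO domain -> route to IdP
  *)           provider="password" ;;  # no SSO match -> standard login
esac
echo "$domain -> $provider"
```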
Users who sign in via SSO for the first time are automatically provisioned and added to your organization — no manual invite required.
SSO provisioning creates internal organization members. External workspace members are different: they are invited to a specific workspace without joining your organization or consuming one of your seats.
<Callout type="info">
Password-based login remains available. Forcing all organization members to use SSO exclusively is not yet supported.
</Callout>
---
<FAQ items={[
{
question: "Which SSO providers are supported?",
answer: "Any identity provider that supports OIDC or SAML 2.0. This includes Okta, Microsoft Entra ID (Azure AD), Google Workspace, Auth0, OneLogin, JumpCloud, Ping Identity, ADFS, Shibboleth, and more."
},
{
question: "What is the Domain field used for?",
answer: "The domain (e.g. company.com) is how Sim routes users to the right identity provider. When a user enters their email on the SSO sign-in page, Sim matches their email domain to a registered SSO provider and redirects them there."
},
{
question: "Do I need to provide OIDC endpoints manually?",
answer: "No. For OIDC providers, Sim automatically fetches the authorization, token, and JWKS endpoints from the discovery document at {issuer}/.well-known/openid-configuration. You only need to provide the issuer URL."
},
{
question: "What happens when a user signs in with SSO for the first time?",
answer: "Sim creates an account for them automatically and adds them to your organization. No manual invite is needed. They are assigned the member role by default. External workspace members are not provisioned through SSO into your organization; they are invited directly to a workspace and remain outside your org roster."
},
{
question: "Can I still use email/password login after enabling SSO?",
answer: "Yes. Enabling SSO does not disable password-based login. Users can still sign in with their email and password if they have one. Forced SSO (requiring all users on the domain to use SSO) is not yet supported."
},
{
question: "Who can configure SSO on Sim Cloud?",
answer: "Organization owners and admins can configure SSO. You must be on the Enterprise plan."
},
{
question: "What is the Callback URL?",
answer: "The Callback URL (also called Redirect URI or ACS URL) is the endpoint in Sim that receives the authentication response from your identity provider. For OIDC providers it follows the format: https://sim.ai/api/auth/sso/callback/{provider-id}. For SAML providers it is: https://sim.ai/api/auth/sso/saml2/callback/{provider-id}. You must register this URL in your identity provider before SSO will work."
},
{
question: "How do I update or replace an existing SSO configuration?",
answer: "Open Settings → Enterprise → Single Sign-On and click Edit. Update the fields and save. The existing provider configuration is replaced."
}
]} />
---
## Self-hosted setup
Self-hosted deployments use environment variables instead of the billing/plan check.
### Environment variables
```bash
# Required
SSO_ENABLED=true
NEXT_PUBLIC_SSO_ENABLED=true
# Required if you want users auto-added to your organization on first SSO sign-in
ORGANIZATIONS_ENABLED=true
NEXT_PUBLIC_ORGANIZATIONS_ENABLED=true
```
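An illustrative startup check that the SSO flags are actually set in the environment your deployment runs in (the export lines stand in for however your deployment injects env vars):

```bash
# Fail fast if a required SSO flag is missing or not "true".
export SSO_ENABLED=true
export NEXT_PUBLIC_SSO_ENABLED=true

missing=""
for v in SSO_ENABLED NEXT_PUBLIC_SSO_ENABLED; do
  eval "val=\${$v:-}"
  [ "$val" = "true" ] || missing="$missing $v"
done
if [ -n "$missing" ]; then
  echo "Not enabled:$missing"
  exit 1
fi
echo "SSO flags OK"
```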
You can register providers through the **Settings UI** (same as cloud) or by running the registration script directly against your database.
### Script-based registration
Use this when you need to register an SSO provider without going through the UI — for example, during initial deployment or CI/CD automation.
```bash
# OIDC example (Okta)
SSO_ENABLED=true \
NEXT_PUBLIC_APP_URL=https://your-instance.com \
SSO_PROVIDER_TYPE=oidc \
SSO_PROVIDER_ID=okta \
SSO_ISSUER=https://dev-1234567.okta.com/oauth2/default \
SSO_DOMAIN=company.com \
SSO_USER_EMAIL=admin@company.com \
SSO_OIDC_CLIENT_ID=your-client-id \
SSO_OIDC_CLIENT_SECRET=your-client-secret \
bun run packages/db/scripts/register-sso-provider.ts
```
```bash
# SAML example (ADFS)
SSO_ENABLED=true \
NEXT_PUBLIC_APP_URL=https://your-instance.com \
SSO_PROVIDER_TYPE=saml \
SSO_PROVIDER_ID=adfs \
SSO_ISSUER=https://your-instance.com \
SSO_DOMAIN=company.com \
SSO_USER_EMAIL=admin@company.com \
SSO_SAML_ENTRY_POINT=https://adfs.company.com/adfs/ls \
SSO_SAML_CERT="-----BEGIN CERTIFICATE-----
...
-----END CERTIFICATE-----" \
bun run packages/db/scripts/register-sso-provider.ts
```
When it completes, the script prints the callback URL to register in your IdP.
To remove a provider:
```bash
SSO_USER_EMAIL=admin@company.com \
bun run packages/db/scripts/deregister-sso-provider.ts
```


@@ -0,0 +1,103 @@
---
title: Whitelabeling
description: Replace Sim branding with your own logo, colors, and links
---
import { FAQ } from '@/components/ui/faq'
import { Image } from '@/components/ui/image'
Whitelabeling lets you replace Sim's default branding — logo, colors, and support links — with your own. Members of your organization see your brand instead of Sim's throughout the workspace.
---
## Setup
### 1. Open Whitelabeling settings
Go to **Settings → Enterprise → Whitelabeling** in your workspace.
<Image src="/static/enterprise/whitelabeling.png" alt="Whitelabeling settings showing brand identity fields (Logo, Wordmark, Brand name), color pickers for primary and accent colors, and link fields for support email and documentation URL" width={900} height={500} />
### 2. Configure brand identity
| Field | Description |
|-------|-------------|
| **Logo** | Shown in the collapsed sidebar. Square image (PNG, JPEG, SVG, or WebP). Max 5 MB. |
| **Wordmark** | Shown in the expanded sidebar. Wide image (PNG, JPEG, SVG, or WebP). Max 5 MB. |
| **Brand name** | Replaces "Sim" in the sidebar and select UI elements. Max 64 characters. |
### 3. Configure colors
All colors must be valid hex values (e.g. `#701ffc`).
| Field | Description |
|-------|-------------|
| **Primary color** | Main accent color used for buttons and active states. |
| **Primary hover color** | Color shown when hovering over primary elements. |
| **Accent color** | Secondary accent for highlights and secondary interactive elements. |
| **Accent hover color** | Color shown when hovering over accent elements. |
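A quick way to sanity-check a hex value before pasting it into the form (shell sketch; the color is the example from above):

```bash
# Accept exactly "#" followed by six hex digits, matching the settings form.
color="#701ffc"
if printf '%s' "$color" | grep -Eq '^#[0-9a-fA-F]{6}$'; then
  result="valid"
else
  result="invalid"
fi
echo "$color is $result"
```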
### 4. Configure links
Replace Sim's default support and legal links with your own.
| Field | Description |
|-------|-------------|
| **Support email** | Shown in help prompts. Must be a valid email address. |
| **Documentation URL** | Link to your internal documentation. Must be a valid URL. |
| **Terms of service URL** | Link to your terms page. Must be a valid URL. |
| **Privacy policy URL** | Link to your privacy page. Must be a valid URL. |
### 5. Save
Click **Save changes**. The new branding is applied immediately for all members of your organization.
---
## What gets replaced
Whitelabeling replaces the following visual elements:
- **Sidebar logo and wordmark** — your uploaded images replace the Sim logo
- **Brand name** — appears in the sidebar and select UI labels
- **Primary and accent colors** — applied to buttons, active states, and highlights
- **Support and legal links** — help prompts and footer links point to your URLs
Whitelabeling applies only to members of your organization. Public-facing pages (login, marketing) are not affected.
---
<FAQ items={[
{
question: "Who can configure whitelabeling?",
answer: "Organization owners and admins can configure whitelabeling. On Sim Cloud, you must be on the Enterprise plan."
},
{
question: "What image formats are supported?",
answer: "PNG, JPEG, SVG, and WebP. Maximum file size is 5 MB for both the logo and wordmark."
},
{
question: "What is the difference between the logo and the wordmark?",
answer: "The logo is a square image shown in the collapsed sidebar. The wordmark is a wide image shown in the expanded sidebar alongside member names and navigation items."
},
{
question: "Do members outside my organization see the custom branding?",
answer: "No. Custom branding is scoped to your organization. Members see your branding when signed in to your organization's workspace."
}
]} />
---
## Self-hosted setup
Self-hosted deployments use environment variables instead of the billing/plan check.
### Environment variables
```bash
WHITELABELING_ENABLED=true
NEXT_PUBLIC_WHITELABELING_ENABLED=true
```
Once enabled, configure branding through **Settings → Enterprise → Whitelabeling** the same way.


@@ -272,6 +272,8 @@ Sim has two paid plan tiers - **Pro** and **Max**. Either can be used individual
To use Pro or Max with a team, select **Get For Team** in subscription settings and choose the tier and number of seats. Credits are pooled across the organization at the per-seat rate (e.g. Max for Teams with 3 seats = 75,000 credits/mo pooled).
Internal organization members use seats and contribute to the team's pooled credit allocation. External workspace members do not join your organization, do not appear in the organization roster, and do not count toward your seat total.
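The pooled allocation in the example above is simple multiplication. A sketch, where the 25,000 per-seat figure is inferred from the 3-seat example rather than taken from an official rate card:

```bash
# Pooled credits = seats x per-seat allowance (figures from the example above)
seats=3
per_seat_credits=25000
pooled=$((seats * per_seat_credits))
echo "$pooled credits/mo pooled"
```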
### Daily Refresh Credits
Paid plans include a small daily credit allowance that does not count toward your plan limit. Each day, usage up to the daily refresh amount is excluded from billable usage. This allowance resets every 24 hours and does not carry over - use it or lose it.
@@ -308,6 +310,17 @@ By default, your usage is capped at the credits included in your plan. To allow
## Plan Limits
### Workspaces
| Plan | Personal Workspaces | Shared (Organization) Workspaces |
|------|---------------------|----------------------------------|
| **Free** | 1 | — |
| **Pro** | Up to 3 | — |
| **Max** | Up to 10 | — |
| **Team / Enterprise** | Unlimited | Unlimited |
Team and Enterprise plans unlock shared workspaces that belong to your organization. Internal members invited to a shared workspace join the organization and count toward your seat total. Existing Sim users who already belong to another organization can be added as external workspace members; they get workspace access without joining your organization or using one of your seats. When a Team or Enterprise subscription is cancelled or downgraded, existing shared workspaces remain accessible to current members but new invites are disabled until the organization is upgraded again.
### Rate Limits
| Plan | Sync (req/min) | Async (req/min) |
@@ -357,7 +370,8 @@ Sim uses a **base subscription + overage** billing model:
- Example: 7,000 credits used = $25 (subscription) + $5 (overage for 1,000 extra credits at $0.005/credit)
**Team Plans:**
- Usage is pooled across internal team members in the organization
- External workspace members keep their own organization or personal billing context for runs where they are the billing actor
- Overage is calculated from total team usage against the pooled limit
- Organization owner receives one bill
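The overage arithmetic in the example above can be sketched as follows; the 6,000-credit included figure is inferred from "1,000 extra credits" on 7,000 used, so check your plan for the actual allowance:

```bash
# Overage = (credits used - credits included) x per-credit rate
used=7000
included=6000
overage_dollars=$(awk -v u="$used" -v i="$included" \
  'BEGIN { printf "%.2f", (u - i) * 0.005 }')
echo "Overage: \$$overage_dollars"
```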


@@ -42,6 +42,8 @@ Only authorized senders can create tasks. Emails from anyone else are automatica
- **Workspace members** are allowed by default — no setup needed
- **External senders** can be added manually with an optional label for easy identification
External senders are email addresses that can create inbox tasks. They are not the same as external workspace members, who have workspace access in Sim without joining your organization.
Manage your allowed senders list in **Settings** → **Inbox** → **Allowed Senders**.
## Tracking Tasks


@@ -220,6 +220,6 @@ import { FAQ } from '@/components/ui/faq'
{ question: "Who can configure MCP servers in a workspace?", answer: "Users with Write permission can configure (add and update) MCP servers in workspace settings. Only Admin permission is required to delete MCP servers. Users with Read permission can view available MCP tools and execute them in agents and MCP Tool blocks. This means all workspace members with at least Read access can use MCP tools in their workflows." },
{ question: "Can I use MCP servers from multiple workspaces?", answer: "MCP servers are configured per workspace. Each workspace maintains its own set of MCP server connections. If you need the same MCP server in multiple workspaces, you need to configure it separately in each workspace's settings." },
{ question: "How do I update MCP tool schemas after a server changes its available tools?", answer: "Click the Refresh button on the MCP server in your workspace settings. This fetches the latest tool schemas from the server and automatically updates any agent blocks that use those tools with the new parameter definitions." },
{ question: "Can permission groups restrict access to MCP tools?", answer: "Yes. On Enterprise-entitled workspaces, any workspace admin can create a permission group that disables MCP tools for its members using the disableMcpTools option. When this is enabled, affected users will not be able to add or use MCP tools in workflows that belong to that workspace." },
{ question: "What happens if an MCP server goes offline during workflow execution?", answer: "If the MCP server is unreachable during execution, the tool call will fail and return an error. In an Agent block, the AI may attempt to handle the failure gracefully. In a standalone MCP Tool block, the workflow step will fail. Check MCP server logs and verify the server is running and accessible to troubleshoot connectivity issues." },
]} />


@@ -25,7 +25,7 @@
"execution",
"permissions",
"self-hosting",
"enterprise",
"./keyboard-shortcuts/index"
],
"defaultOpen": false


@@ -2,10 +2,31 @@
title: "Roles and Permissions"
---
import { Callout } from 'fumadocs-ui/components/callout'
import { Video } from '@/components/ui/video'
When you invite team members to your organization or workspace, you'll need to choose what level of access to give them. This guide explains team roles and what each permission level allows users to do.
## Workspaces and Organizations
Sim has two kinds of workspaces:
- **Personal workspaces** live under your individual account. The number you can create depends on your plan.
- **Shared (organization) workspaces** live under an organization and are available on Team and Enterprise plans. Any organization Owner or Admin can create them. Internal members invited to a shared workspace join the organization and count toward your seat total. Existing Sim users who already belong to another organization can be added as external workspace members instead, giving them access to the workspace without adding them to your organization roster or using one of your seats.
### Workspace Limits by Plan
| Plan | Personal Workspaces | Shared Workspaces |
|------|---------------------|-------------------|
| **Free** | 1 | — |
| **Pro** | Up to 3 | — |
| **Max** | Up to 10 | — |
| **Team / Enterprise** | Unlimited | Unlimited (seat-gated invites) |
<Callout type="info">
When a Team or Enterprise subscription is cancelled or downgraded, existing shared workspaces stay accessible to current members. New invitations are blocked until the organization is upgraded again.
</Callout>
## How to Invite Someone to a Workspace
<div className="mx-auto w-full overflow-hidden rounded-lg">
@@ -22,6 +43,15 @@ When inviting someone to a workspace, you can assign one of three permission lev
| **Write** | Create and edit workflows, run workflows, manage environment variables |
| **Admin** | Everything Write can do, plus invite/remove users and manage workspace settings |
## Internal Members vs External Workspace Members
Workspace permissions are separate from organization membership:
- **Internal organization members** belong to your organization, appear in the organization roster, and count toward your seat total. Invite new teammates this way when they should be part of your company or team in Sim.
- **External workspace members** have access only to the workspace they are invited to. They keep their own organization membership, do not appear in your organization roster, and do not count toward your organization's seats. Use external access for clients, partners, contractors, or collaborators who already use Sim in another organization.
External workspace members still receive a workspace permission level — Read, Write, or Admin — and that permission controls what they can do inside the workspace.
## What Each Permission Level Can Do
Here's a detailed breakdown of what users can do with each permission level:
@@ -88,6 +118,10 @@ Every workspace has one **Owner** (the person who created it) plus any number of
- Can do everything except delete the workspace or remove the owner
- Can be removed from the workspace by the owner or other admins
<Callout type="info">
For shared (organization) workspaces, the organization's Owner and Admins are treated as Admins of every workspace in the organization, even without an explicit per-workspace invite.
</Callout>
---
## Common Scenarios
@@ -101,7 +135,7 @@ Every workspace has one **Owner** (the person who created it) plus any number of
2. **Workspace level**: Give them **Admin** permission so they can manage the team and see everything
### Adding a Stakeholder or Client
1. **Organization level**: If they should not join your organization, add them as an **External workspace member**
2. **Workspace level**: Give them **Read** permission so they can see progress but not make changes
---
@@ -145,28 +179,41 @@ Periodically review who has access to what, especially when team members change
## Organization Roles
An organization has three roles: **Owner**, **Admin**, and **Member**.
### Organization Owner
**What they can do:**
- Everything an Admin can do
- Transfer organization ownership to another user
- Only one Owner exists per organization
### Organization Admin
**What they can do:**
- Invite and remove team members from the organization
- Create new shared workspaces under the organization
- Manage billing, seat count, and subscription settings
- Access all shared workspaces within the organization as a workspace Admin
- Promote members to Admin or demote Admins to Member
<Callout type="info">
Owners and Admins have the same day-to-day permissions. The only action reserved for the Owner is transferring ownership.
</Callout>
### Organization Member
**What they can do:**
- Access shared workspaces they've been specifically invited to
- View the list of organization members
- Cannot invite new people, create shared workspaces, or manage organization settings
import { FAQ } from '@/components/ui/faq'
<FAQ items={[
{ question: "What is the difference between organization roles and workspace permissions?", answer: "Organization roles (Owner, Admin, or Member) control who can manage the organization itself, including inviting people, creating shared workspaces, and handling billing. Workspace permissions (Read, Write, Admin) control what a user can do within a specific workspace, such as viewing, editing, or managing workflows. Internal members need both an organization role and a workspace permission to work within a shared workspace. External workspace members do not have an organization role in your org; they only have workspace-level access." },
{ question: "How many workspaces can I create?", answer: "Free users get 1 personal workspace. Pro users get up to 3 personal workspaces. Max users get up to 10 personal workspaces. Team and Enterprise plans support unlimited shared workspaces under the organization — new invites are gated by your seat count." },
{ question: "What happens to my shared workspaces if I cancel or downgrade my Team plan?", answer: "Existing shared workspaces remain accessible to current members, but new invitations are disabled until you upgrade back to a Team or Enterprise plan. No workspaces or members are deleted — the organization is simply dormant until billing is re-enabled." },
{ question: "Can I restrict which integrations or model providers a team member can use?", answer: "Yes, on Enterprise-entitled workspaces. Any workspace admin can create permission groups with fine-grained controls, including restricting allowed integrations and allowed model providers to specific lists. You can also disable access to MCP tools, custom tools, skills, and various platform features like the knowledge base, API keys, or Copilot on a per-group basis. Permission groups are scoped per workspace — a user can belong to different groups in different workspaces." },
{ question: "What happens when a personal environment variable has the same name as a workspace variable?", answer: "The personal environment variable takes priority. When a workflow runs, if both a personal and workspace variable share the same name, the personal value is used. This allows individual users to override shared workspace configuration when needed." },
{ question: "Can an Admin remove the workspace owner?", answer: "No. The workspace owner cannot be removed from the workspace by anyone. Only the workspace owner can delete the workspace or transfer ownership to another user. Admins can do everything else, including inviting and removing other users and managing workspace settings." },
{ question: "What are permission groups and how do they work?", answer: "Permission groups are an Enterprise access control feature that lets workspace admins define granular restrictions beyond the standard Read/Write/Admin roles. Groups are scoped to a single workspace: each user can be in at most one group per workspace, and a user can be in different groups across different workspaces. A permission group can hide UI sections (like trace spans, knowledge base, API keys, or deployment options), disable features (MCP tools, custom tools, skills, invitations), and restrict which integrations and model providers its members can access. Members can be assigned manually, and new members can be auto-added on join. Execution-time enforcement is based on the workflow's workspace, not the user's current UI context." },
{ question: "How should I set up permissions for a new team member?", answer: "Start with the lowest permission level they need. Invite teammates to the organization as Members, then add them to the relevant workspace with Read permission if they only need visibility, Write if they need to create and run workflows, or Admin if they need to manage the workspace and its users. For clients, partners, or users who already belong to another Sim organization, use external workspace access so they can collaborate without joining your organization or consuming a seat." },
]} />

View File

@@ -31,7 +31,7 @@ A quick lookup for everyday actions in the Sim workflow editor. For keyboard sho
</tr>
<tr>
<td>Invite team members</td>
<td>Sidebar → **Invite**</td>
<td>Sidebar → **Invite**. Internal invites join the organization; external workspace members get workspace access only.</td>
<td><ActionVideo src="quick-reference/invite.mp4" alt="Invite team members" /></td>
</tr>
<tr>

View File

@@ -140,7 +140,7 @@ import { FAQ } from '@/components/ui/faq'
{ question: "How does the agent decide when to load a skill?", answer: "The agent sees an available_skills section in its system prompt listing each skill's name and description. When the agent determines that a skill is relevant to the current task, it calls the load_skill tool with the skill name. The full skill content is then returned as a tool response. This is why writing a specific, keyword-rich description is critical -- it is the only thing the agent reads before deciding whether to activate a skill." },
{ question: "Do skills work with all LLM providers?", answer: "Yes. The load_skill mechanism uses standard tool-calling, which is supported by all LLM providers in Sim. No provider-specific configuration is needed. The skill system works the same way whether you are using Anthropic, OpenAI, Google, or any other supported provider." },
{ question: "When should I use skills vs. agent instructions?", answer: "Use skills for knowledge that applies across multiple workflows or changes frequently. Skills are reusable packages that can be attached to any agent. Use agent instructions for task-specific context that is unique to a single agent and workflow. If you find yourself copying the same instructions into multiple agents, that content should be a skill instead." },
{ question: "Can permission groups disable skills for certain users?", answer: "Yes. Organization admins can create permission groups with the disableSkills option enabled. When a user is assigned to such a permission group, the skills dropdown in agent blocks will be disabled and they will not be able to add or use skills in their workflows." },
{ question: "Can permission groups disable skills for certain users?", answer: "Yes. On Enterprise-entitled workspaces, any workspace admin can create a permission group with the disableSkills option enabled. When a user is assigned to such a group in a workspace, the skills dropdown in agent blocks is disabled and they cannot add or use skills in workflows belonging to that workspace." },
{ question: "What is the recommended maximum length for skill content?", answer: "Keep skills focused and under 500 lines. If a skill grows too large, split it into multiple specialized skills. Shorter, focused skills are more effective because the agent can load exactly what it needs. A broad skill with too much content can overwhelm the agent and reduce the quality of its responses." },
{ question: "Where do I create and manage skills?", answer: "Go to Settings and select Skills under the Tools section. From there you can add new skills with a name (kebab-case identifier, max 64 characters), description (max 1024 characters), and content (full instructions in markdown). You can also edit or delete existing skills from this page." },
]} />

View File

@@ -0,0 +1,629 @@
---
title: AgentPhone
description: Provision numbers, send SMS and iMessage, and place voice calls with AgentPhone
---
import { BlockInfoCard } from "@/components/ui/block-info-card"
<BlockInfoCard
type="agentphone"
color="linear-gradient(135deg, #1a1a1a 0%, #0a2a14 100%)"
/>
{/* MANUAL-CONTENT-START:intro */}
[AgentPhone](https://agentphone.to/) is an API-first voice and messaging platform built for AI agents. AgentPhone lets you provision real phone numbers, place outbound AI voice calls, send SMS and iMessage, manage conversations and contacts, and monitor usage — all through a simple REST API designed for programmatic access.
**Why AgentPhone?**
- **Agent-Native Telephony:** Purpose-built for AI agents — provision numbers, place calls, and send messages without carrier contracts or telephony plumbing.
- **Voice + Messaging in One API:** Drive outbound AI voice calls alongside SMS, MMS, and iMessage from the same account and phone numbers.
- **Conversation & Transcript Management:** Every call returns an ordered transcript; every message thread is tracked as a conversation with full history and metadata.
- **Contacts Built In:** Create, search, update, and delete contacts on the account so your agents can reference people by name instead of raw phone numbers.
- **Usage Visibility:** Inspect plan limits, current counts, and daily/monthly aggregates so workflows can stay inside guardrails.
**Using AgentPhone in Sim**
Sim's AgentPhone integration connects your agentic workflows directly to AgentPhone using an API key. With 22 operations spanning numbers, calls, conversations, contacts, and usage, you can build powerful voice and messaging automations without writing backend code.
**Key benefits of using AgentPhone in Sim:**
- **Dynamic number provisioning:** Reserve US or Canadian numbers on the fly — per agent, per customer, or per workflow — and release them when no longer needed.
- **Outbound AI voice calls:** Place calls from an agent with an optional greeting, voice override, or system prompt, and read the full transcript back as structured data once the call completes.
- **Two-way messaging:** Send SMS, MMS, or iMessage, fetch conversation history, and react to incoming iMessages — all from inside your workflow.
- **Contact and metadata management:** Keep an account-level contact list and attach custom JSON metadata to conversations so downstream blocks can branch on state.
- **Operational insight:** Pull current usage stats and daily/monthly breakdowns to monitor consumption and enforce plan limits before making the next call.
Whether you're building an outbound AI voice agent, running automated SMS follow-ups, managing two-way customer conversations, or monitoring phone usage across your organization, AgentPhone in Sim gives you direct, secure access to the full AgentPhone API — no middleware required. Simply configure your API key, select the operation you need, and let Sim handle the rest.
{/* MANUAL-CONTENT-END */}
## Usage Instructions
Give your workflow a phone. Provision SMS- and voice-enabled numbers, send messages and tapback reactions, place outbound voice calls, manage conversations and contacts, and track usage — all through a single AgentPhone API key.
## Tools
### `agentphone_create_call`
Initiate an outbound voice call from an AgentPhone agent
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | AgentPhone API key |
| `agentId` | string | Yes | Agent that will handle the call |
| `toNumber` | string | Yes | Phone number to call in E.164 format \(e.g. +14155551234\) |
| `fromNumberId` | string | No | Phone number ID to use as caller ID. Must belong to the agent. If omitted, the agent's first assigned number is used. |
| `initialGreeting` | string | No | Optional greeting spoken when the recipient answers |
| `voice` | string | No | Voice ID override for this call \(defaults to the agent's configured voice\) |
| `systemPrompt` | string | No | When provided, uses a built-in LLM for the conversation instead of forwarding to your webhook |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `id` | string | Unique call identifier |
| `agentId` | string | Agent handling the call |
| `status` | string | Initial call status |
| `toNumber` | string | Destination phone number |
| `fromNumber` | string | Caller ID used for the call |
| `phoneNumberId` | string | ID of the phone number used as caller ID |
| `direction` | string | Call direction \(outbound\) |
| `startedAt` | string | ISO 8601 timestamp |
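A minimal sketch of preparing a `agentphone_create_call` request body from the parameters above. The E.164 regex and the payload-builder helper are illustrative assumptions, not part of the AgentPhone SDK; only the field names come from the table.

```typescript
// Hypothetical helper: validate the destination number and build a JSON body
// containing only the optional fields that were actually set.
type CreateCallParams = {
  agentId: string;
  toNumber: string;
  fromNumberId?: string;
  initialGreeting?: string;
  voice?: string;
  systemPrompt?: string;
};

// E.164: a "+" followed by up to 15 digits, no leading zero.
const E164 = /^\+[1-9]\d{1,14}$/;

function buildCreateCallBody(params: CreateCallParams): Record<string, string> {
  if (!E164.test(params.toNumber)) {
    throw new Error(`toNumber must be E.164, e.g. +14155551234 (got "${params.toNumber}")`);
  }
  // Drop undefined optionals so the body only carries fields you set.
  return Object.fromEntries(
    Object.entries(params).filter(([, v]) => v !== undefined)
  ) as Record<string, string>;
}
```

Sending this body through the tool (or directly to the API) is then a plain authenticated POST; the validation step catches malformed numbers before any billing-relevant call is placed.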
### `agentphone_create_contact`
Create a new contact in AgentPhone
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | AgentPhone API key |
| `phoneNumber` | string | Yes | Phone number in E.164 format \(e.g. +14155551234\) |
| `name` | string | Yes | Contact's full name |
| `email` | string | No | Contact's email address |
| `notes` | string | No | Freeform notes stored on the contact |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `id` | string | Contact ID |
| `phoneNumber` | string | Phone number in E.164 format |
| `name` | string | Contact name |
| `email` | string | Contact email address |
| `notes` | string | Freeform notes |
| `createdAt` | string | ISO 8601 creation timestamp |
| `updatedAt` | string | ISO 8601 update timestamp |
### `agentphone_create_number`
Provision a new SMS- and voice-enabled phone number
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | AgentPhone API key |
| `country` | string | No | Two-letter country code \(e.g. US, CA\). Defaults to US. |
| `areaCode` | string | No | Preferred area code \(US/CA only, e.g. "415"\). Best-effort — may be ignored if unavailable. |
| `agentId` | string | No | Optionally attach the number to an agent immediately |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `id` | string | Unique phone number ID |
| `phoneNumber` | string | Provisioned phone number in E.164 format |
| `country` | string | Two-letter country code |
| `status` | string | Number status \(e.g. active\) |
| `type` | string | Number type \(e.g. sms\) |
| `agentId` | string | Agent the number is attached to |
| `createdAt` | string | ISO 8601 timestamp when the number was created |
### `agentphone_delete_contact`
Delete a contact by ID
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | AgentPhone API key |
| `contactId` | string | Yes | Contact ID |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `id` | string | ID of the deleted contact |
| `deleted` | boolean | Whether the contact was deleted successfully |
### `agentphone_get_call`
Fetch a call and its full transcript
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | AgentPhone API key |
| `callId` | string | Yes | ID of the call to retrieve |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `id` | string | Call ID |
| `agentId` | string | Agent that handled the call |
| `phoneNumberId` | string | Phone number ID |
| `phoneNumber` | string | Phone number used for the call |
| `fromNumber` | string | Caller phone number |
| `toNumber` | string | Recipient phone number |
| `direction` | string | inbound or outbound |
| `status` | string | Call status |
| `startedAt` | string | ISO 8601 timestamp |
| `endedAt` | string | ISO 8601 timestamp |
| `durationSeconds` | number | Call duration in seconds |
| `lastTranscriptSnippet` | string | Last transcript snippet |
| `recordingUrl` | string | Recording audio URL |
| `recordingAvailable` | boolean | Whether a recording is available |
| `transcripts` | array | Ordered transcript turns for the call |
| ↳ `id` | string | Transcript turn ID |
| ↳ `transcript` | string | User utterance |
| ↳ `confidence` | number | Speech recognition confidence |
| ↳ `response` | string | Agent response \(when available\) |
| ↳ `createdAt` | string | ISO 8601 timestamp |
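The `transcripts` array above interleaves a user utterance with an optional agent `response` per turn. A small sketch of flattening it into readable dialogue; the field names follow the output table, while the `caller:`/`agent:` formatting is an assumption for illustration.

```typescript
// Turn shape taken from the agentphone_get_call output table.
type TranscriptTurn = {
  id: string;
  transcript: string;   // user utterance
  confidence: number;   // speech recognition confidence
  response?: string;    // agent reply, when available
  createdAt: string;    // ISO 8601 timestamp
};

// Flatten ordered turns into one dialogue line per speaker.
function formatDialogue(turns: TranscriptTurn[]): string[] {
  const lines: string[] = [];
  for (const turn of turns) {
    lines.push(`caller: ${turn.transcript}`);
    if (turn.response) lines.push(`agent: ${turn.response}`);
  }
  return lines;
}
```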
### `agentphone_get_call_transcript`
Get the full ordered transcript for a call
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | AgentPhone API key |
| `callId` | string | Yes | ID of the call to retrieve the transcript for |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `callId` | string | Call ID |
| `transcript` | array | Ordered transcript turns for the call |
| ↳ `role` | string | Speaker role \(user or agent\) |
| ↳ `content` | string | Turn content |
| ↳ `createdAt` | string | ISO 8601 timestamp |
### `agentphone_get_contact`
Fetch a single contact by ID
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | AgentPhone API key |
| `contactId` | string | Yes | Contact ID |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `id` | string | Contact ID |
| `phoneNumber` | string | Phone number in E.164 format |
| `name` | string | Contact name |
| `email` | string | Contact email address |
| `notes` | string | Freeform notes |
| `createdAt` | string | ISO 8601 creation timestamp |
| `updatedAt` | string | ISO 8601 update timestamp |
### `agentphone_get_conversation`
Get a conversation along with its recent messages
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | AgentPhone API key |
| `conversationId` | string | Yes | Conversation ID |
| `messageLimit` | number | No | Number of recent messages to include \(default 50, max 100\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `id` | string | Conversation ID |
| `agentId` | string | Agent ID |
| `phoneNumberId` | string | Phone number ID |
| `phoneNumber` | string | Phone number |
| `participant` | string | External participant phone number |
| `lastMessageAt` | string | ISO 8601 timestamp |
| `messageCount` | number | Number of messages in the conversation |
| `metadata` | json | Custom metadata stored on the conversation |
| `createdAt` | string | ISO 8601 timestamp |
| `messages` | array | Recent messages in the conversation |
| ↳ `id` | string | Message ID |
| ↳ `body` | string | Message text |
| ↳ `fromNumber` | string | Sender phone number |
| ↳ `toNumber` | string | Recipient phone number |
| ↳ `direction` | string | inbound or outbound |
| ↳ `channel` | string | sms, mms, or imessage |
| ↳ `mediaUrl` | string | Attached media URL |
| ↳ `receivedAt` | string | ISO 8601 timestamp |
### `agentphone_get_conversation_messages`
Get paginated messages for a conversation
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | AgentPhone API key |
| `conversationId` | string | Yes | Conversation ID |
| `limit` | number | No | Number of messages to return \(default 50, max 200\) |
| `before` | string | No | Return messages received before this ISO 8601 timestamp |
| `after` | string | No | Return messages received after this ISO 8601 timestamp |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `data` | array | Messages in the conversation |
| ↳ `id` | string | Message ID |
| ↳ `body` | string | Message text |
| ↳ `fromNumber` | string | Sender phone number |
| ↳ `toNumber` | string | Recipient phone number |
| ↳ `direction` | string | inbound or outbound |
| ↳ `channel` | string | sms, mms, or imessage |
| ↳ `mediaUrl` | string | Attached media URL |
| ↳ `receivedAt` | string | ISO 8601 timestamp |
| `hasMore` | boolean | Whether more messages are available |
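Draining a full conversation history means walking `before` cursors until `hasMore` is false. A sketch of that loop, with `fetchPage` standing in for the actual tool call; the loop logic follows the `limit`/`before`/`hasMore` contract above, but using the oldest `receivedAt` as the next cursor is an assumption about how the timestamps page.

```typescript
type Message = { id: string; body: string; receivedAt: string };
type Page = { data: Message[]; hasMore: boolean };

// fetchPage abstracts the agentphone_get_conversation_messages call so the
// pagination logic can be shown (and tested) without network access.
async function fetchAllMessages(
  fetchPage: (before?: string) => Promise<Page>
): Promise<Message[]> {
  const all: Message[] = [];
  let before: string | undefined;
  while (true) {
    const page = await fetchPage(before);
    all.push(...page.data);
    if (!page.hasMore || page.data.length === 0) break;
    // Page backwards from the oldest message received so far.
    before = page.data[page.data.length - 1].receivedAt;
  }
  return all;
}
```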
### `agentphone_get_number_messages`
Fetch messages received on a specific phone number
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | AgentPhone API key |
| `numberId` | string | Yes | ID of the phone number |
| `limit` | number | No | Number of messages to return \(default 50, max 200\) |
| `before` | string | No | Return messages received before this ISO 8601 timestamp |
| `after` | string | No | Return messages received after this ISO 8601 timestamp |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `data` | array | Messages received on the number |
| ↳ `id` | string | Message ID |
| ↳ `from_` | string | Sender phone number \(E.164\) |
| ↳ `to` | string | Recipient phone number \(E.164\) |
| ↳ `body` | string | Message text |
| ↳ `direction` | string | inbound or outbound |
| ↳ `channel` | string | Channel \(sms, mms, etc.\) |
| ↳ `receivedAt` | string | ISO 8601 timestamp |
| `hasMore` | boolean | Whether more messages are available |
### `agentphone_get_usage`
Retrieve current usage statistics for the AgentPhone account
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | AgentPhone API key |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `plan` | json | Plan name and limits \(name, limits: numbers/messagesPerMonth/voiceMinutesPerMonth/maxCallDurationMinutes/concurrentCalls\) |
| `numbers` | json | Phone number usage \(used, limit, remaining\) |
| `stats` | json | Usage stats: totalMessages, messagesLast24h/7d/30d, totalCalls, callsLast24h/7d/30d, totalWebhookDeliveries, successfulWebhookDeliveries, failedWebhookDeliveries |
| `periodStart` | string | Billing period start |
| `periodEnd` | string | Billing period end |
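The intro mentions using usage stats to "enforce plan limits before making the next call." A sketch of one such guardrail over the `plan`/`stats`/`numbers` shapes above; the 80% threshold and the specific fields checked are illustrative choices, not AgentPhone concepts.

```typescript
// Shapes trimmed to the fields this check needs, per the output table.
type Usage = {
  plan: { name: string; limits: { messagesPerMonth: number; voiceMinutesPerMonth: number } };
  stats: { messagesLast30d: number; totalCalls: number };
  numbers: { used: number; limit: number; remaining: number };
};

// Allow more activity only while 30-day messaging sits under the threshold
// and at least one number slot remains on the plan.
function withinGuardrail(usage: Usage, threshold = 0.8): boolean {
  const messageRatio = usage.stats.messagesLast30d / usage.plan.limits.messagesPerMonth;
  return messageRatio < threshold && usage.numbers.remaining > 0;
}
```

A workflow would call `agentphone_get_usage` first, branch on this check, and skip (or queue) the send when the account is near its limits.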
### `agentphone_get_usage_daily`
Get a daily breakdown of usage (messages, calls, webhooks) for the last N days
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | AgentPhone API key |
| `days` | number | No | Number of days to return \(1-365, default 30\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `data` | array | Daily usage entries |
| ↳ `date` | string | Day \(YYYY-MM-DD\) |
| ↳ `messages` | number | Messages that day |
| ↳ `calls` | number | Calls that day |
| ↳ `webhooks` | number | Webhook deliveries that day |
| `days` | number | Number of days returned |
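Rolling the daily entries up into period totals is a common downstream step. A minimal sketch over the row shape in the table above:

```typescript
// Daily row shape from the agentphone_get_usage_daily output table.
type DailyUsage = { date: string; messages: number; calls: number; webhooks: number };

function sumUsage(entries: DailyUsage[]): { messages: number; calls: number; webhooks: number } {
  return entries.reduce(
    (acc, d) => ({
      messages: acc.messages + d.messages,
      calls: acc.calls + d.calls,
      webhooks: acc.webhooks + d.webhooks,
    }),
    { messages: 0, calls: 0, webhooks: 0 }
  );
}
```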
### `agentphone_get_usage_monthly`
Get monthly usage aggregation (messages, calls, webhooks) for the last N months
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | AgentPhone API key |
| `months` | number | No | Number of months to return \(1-24, default 6\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `data` | array | Monthly usage entries |
| ↳ `month` | string | Month \(YYYY-MM\) |
| ↳ `messages` | number | Messages that month |
| ↳ `calls` | number | Calls that month |
| ↳ `webhooks` | number | Webhook deliveries that month |
| `months` | number | Number of months returned |
### `agentphone_list_calls`
List voice calls for this AgentPhone account
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | AgentPhone API key |
| `limit` | number | No | Number of results to return \(default 20, max 100\) |
| `offset` | number | No | Number of results to skip \(min 0\) |
| `status` | string | No | Filter by status \(completed, in-progress, failed\) |
| `direction` | string | No | Filter by direction \(inbound, outbound\) |
| `type` | string | No | Filter by call type \(pstn, web\) |
| `search` | string | No | Search by phone number \(matches fromNumber or toNumber\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `data` | array | Calls |
| ↳ `id` | string | Call ID |
| ↳ `agentId` | string | Agent that handled the call |
| ↳ `phoneNumberId` | string | Phone number ID used for the call |
| ↳ `phoneNumber` | string | Phone number used for the call |
| ↳ `fromNumber` | string | Caller phone number |
| ↳ `toNumber` | string | Recipient phone number |
| ↳ `direction` | string | inbound or outbound |
| ↳ `status` | string | Call status |
| ↳ `startedAt` | string | ISO 8601 timestamp |
| ↳ `endedAt` | string | ISO 8601 timestamp |
| ↳ `durationSeconds` | number | Call duration in seconds |
| ↳ `lastTranscriptSnippet` | string | Last transcript snippet |
| ↳ `recordingUrl` | string | Recording audio URL |
| ↳ `recordingAvailable` | boolean | Whether a recording is available |
| `hasMore` | boolean | Whether more results are available |
| `total` | number | Total number of matching calls |
### `agentphone_list_contacts`
List contacts for this AgentPhone account
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | AgentPhone API key |
| `search` | string | No | Filter by name or phone number \(case-insensitive contains\) |
| `limit` | number | No | Number of results to return \(default 50\) |
| `offset` | number | No | Number of results to skip \(min 0\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `data` | array | Contacts |
| ↳ `id` | string | Contact ID |
| ↳ `phoneNumber` | string | Phone number in E.164 format |
| ↳ `name` | string | Contact name |
| ↳ `email` | string | Contact email address |
| ↳ `notes` | string | Freeform notes |
| ↳ `createdAt` | string | ISO 8601 creation timestamp |
| ↳ `updatedAt` | string | ISO 8601 update timestamp |
| `hasMore` | boolean | Whether more results are available |
| `total` | number | Total number of contacts |
### `agentphone_list_conversations`
List conversations (message threads) for this AgentPhone account
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | AgentPhone API key |
| `limit` | number | No | Number of results to return \(default 20, max 100\) |
| `offset` | number | No | Number of results to skip \(min 0\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `data` | array | Conversations |
| ↳ `id` | string | Conversation ID |
| ↳ `agentId` | string | Agent ID |
| ↳ `phoneNumberId` | string | Phone number ID |
| ↳ `phoneNumber` | string | Phone number |
| ↳ `participant` | string | External participant phone number |
| ↳ `lastMessageAt` | string | ISO 8601 timestamp |
| ↳ `lastMessagePreview` | string | Last message preview |
| ↳ `messageCount` | number | Number of messages in the conversation |
| ↳ `metadata` | json | Custom metadata stored on the conversation |
| ↳ `createdAt` | string | ISO 8601 timestamp |
| `hasMore` | boolean | Whether more results are available |
| `total` | number | Total number of conversations |
### `agentphone_list_numbers`
List all phone numbers provisioned for this AgentPhone account
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | AgentPhone API key |
| `limit` | number | No | Number of results to return \(default 20, max 100\) |
| `offset` | number | No | Number of results to skip \(min 0\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `data` | array | Phone numbers |
| ↳ `id` | string | Phone number ID |
| ↳ `phoneNumber` | string | Phone number in E.164 format |
| ↳ `country` | string | Two-letter country code |
| ↳ `status` | string | Number status |
| ↳ `type` | string | Number type \(e.g. sms\) |
| ↳ `agentId` | string | Attached agent ID |
| ↳ `createdAt` | string | ISO 8601 creation timestamp |
| `hasMore` | boolean | Whether more results are available |
| `total` | number | Total number of phone numbers |
### `agentphone_react_to_message`
Send an iMessage tapback reaction to a message (iMessage only)
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | AgentPhone API key |
| `messageId` | string | Yes | ID of the message to react to |
| `reaction` | string | Yes | Reaction type: love, like, dislike, laugh, emphasize, or question |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `id` | string | Reaction ID |
| `reactionType` | string | Reaction type applied |
| `messageId` | string | ID of the message that was reacted to |
| `channel` | string | Channel \(imessage\) |
### `agentphone_release_number`
Release (delete) a phone number. This action is irreversible.
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | AgentPhone API key |
| `numberId` | string | Yes | ID of the phone number to release |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `id` | string | ID of the released phone number |
| `released` | boolean | Whether the number was released successfully |
### `agentphone_send_message`
Send an outbound SMS or iMessage from an AgentPhone agent
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | AgentPhone API key |
| `agentId` | string | Yes | Agent sending the message |
| `toNumber` | string | Yes | Recipient phone number in E.164 format \(e.g. +14155551234\) |
| `body` | string | Yes | Message text to send |
| `mediaUrl` | string | No | Optional URL of an image, video, or file to attach |
| `numberId` | string | No | Phone number ID to send from. If omitted, the agent's first assigned number is used. |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `id` | string | Message ID |
| `status` | string | Delivery status |
| `channel` | string | sms, mms, or imessage |
| `fromNumber` | string | Sender phone number |
| `toNumber` | string | Recipient phone number |
### `agentphone_update_contact`
Update a contact
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | AgentPhone API key |
| `contactId` | string | Yes | Contact ID |
| `phoneNumber` | string | No | New phone number in E.164 format |
| `name` | string | No | New contact name |
| `email` | string | No | New email address |
| `notes` | string | No | New freeform notes |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `id` | string | Contact ID |
| `phoneNumber` | string | Phone number in E.164 format |
| `name` | string | Contact name |
| `email` | string | Contact email address |
| `notes` | string | Freeform notes |
| `createdAt` | string | ISO 8601 creation timestamp |
| `updatedAt` | string | ISO 8601 update timestamp |
### `agentphone_update_conversation`
Update conversation metadata (stored state). Pass null to clear existing metadata.
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | AgentPhone API key |
| `conversationId` | string | Yes | Conversation ID |
| `metadata` | json | No | Custom key-value metadata to store on the conversation. Pass null to clear existing metadata. |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `id` | string | Conversation ID |
| `agentId` | string | Agent ID |
| `phoneNumberId` | string | Phone number ID |
| `phoneNumber` | string | Phone number |
| `participant` | string | External participant phone number |
| `lastMessageAt` | string | ISO 8601 timestamp |
| `messageCount` | number | Number of messages |
| `metadata` | json | Custom metadata stored on the conversation |
| `createdAt` | string | ISO 8601 timestamp |
| `messages` | array | Messages in the conversation |
| ↳ `id` | string | Message ID |
| ↳ `body` | string | Message body |
| ↳ `fromNumber` | string | Sender phone number |
| ↳ `toNumber` | string | Recipient phone number |
| ↳ `direction` | string | inbound or outbound |
| ↳ `channel` | string | Channel \(sms, mms, etc.\) |
| ↳ `mediaUrl` | string | Media URL if any |
| ↳ `receivedAt` | string | ISO 8601 timestamp |
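The `metadata` parameter above has three distinct states: omitted (leave stored metadata unchanged), an object (replace it), and explicit `null` (clear it). A sketch of that three-way semantics as a pure helper; the function itself is illustrative, only the omit/object/null behavior comes from the description.

```typescript
// Resolve what the stored conversation metadata becomes after an update call.
function nextMetadata(
  current: Record<string, unknown> | null,
  update: Record<string, unknown> | null | undefined
): Record<string, unknown> | null {
  if (update === undefined) return current; // omitted: unchanged
  if (update === null) return null;         // explicit null: cleared
  return update;                            // object: replaced wholesale
}
```

Note that an object replaces the stored metadata rather than merging into it, so a workflow that wants to preserve existing keys should read the conversation first and spread the old metadata into the new object.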

View File

@@ -38,7 +38,7 @@ Integrate Ashby into the workflow. Manage candidates (list, get, create, update,
### `ashby_add_candidate_tag`
Adds a tag to a candidate in Ashby.
Adds a tag to a candidate in Ashby and returns the updated candidate.
#### Input
@@ -52,7 +52,37 @@ Adds a tag to a candidate in Ashby.
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `success` | boolean | Whether the tag was successfully added |
| `candidates` | json | List of candidates with rich fields \(id, name, primaryEmailAddress, primaryPhoneNumber, emailAddresses\[\], phoneNumbers\[\], socialLinks\[\], linkedInUrl, githubUrl, profileUrl, position, company, school, timezone, location with locationComponents\[\], tags\[\], applicationIds\[\], customFields\[\], resumeFileHandle, fileHandles\[\], source with sourceType, creditedToUser, fraudStatus, createdAt, updatedAt\) |
| `jobs` | json | List of jobs \(id, title, confidential, status, employmentType, locationId, departmentId, defaultInterviewPlanId, interviewPlanIds\[\], customFields\[\], jobPostingIds\[\], customRequisitionId, brandId, hiringTeam\[\], author, createdAt, updatedAt, openedAt, closedAt, location with address, openings\[\] with latestVersion, compensation with compensationTiers\[\]\) |
| `applications` | json | List of applications \(id, status, customFields\[\], candidate summary, currentInterviewStage, source with sourceType, archiveReason with customFields\[\], archivedAt, job summary, creditedToUser, hiringTeam\[\], appliedViaJobPostingId, submitterClientIp, submitterUserAgent, createdAt, updatedAt\) |
| `notes` | json | List of notes \(id, content, author, isPrivate, createdAt\) |
| `offers` | json | List of offers \(id, decidedAt, applicationId, acceptanceStatus, offerStatus, latestVersion with id/startDate/salary/createdAt/openingId/customFields\[\]/fileHandles\[\]/author/approvalStatus\) |
| `archiveReasons` | json | List of archive reasons \(id, text, reasonType \[RejectedByCandidate/RejectedByOrg/Other\], isArchived\) |
| `sources` | json | List of sources \(id, title, isArchived, sourceType \{id, title, isArchived\}\) |
| `customFields` | json | List of custom field definitions \(id, title, isPrivate, fieldType, objectType, isArchived, isRequired, selectableValues\[\] \{label, value, isArchived\}\) |
| `departments` | json | List of departments \(id, name, externalName, isArchived, parentId, createdAt, updatedAt\) |
| `locations` | json | List of locations \(id, name, externalName, isArchived, isRemote, workplaceType, parentLocationId, type, address with addressCountry/Region/Locality/postalCode/streetAddress\) |
| `jobPostings` | json | List of job postings \(id, title, jobId, departmentName, teamName, locationName, locationIds, workplaceType, employmentType, isListed, publishedDate, applicationDeadline, externalLink, applyLink, compensationTierSummary, shouldDisplayCompensationOnJobBoard, updatedAt\) |
| `openings` | json | List of openings \(id, openedAt, closedAt, isArchived, archivedAt, closeReasonId, openingState, latestVersion with identifier/description/authorId/createdAt/teamId/jobIds\[\]/targetHireDate/targetStartDate/isBackfill/employmentType/locationIds\[\]/hiringTeam\[\]/customFields\[\]\) |
| `users` | json | List of users \(id, firstName, lastName, email, globalRole, isEnabled, updatedAt, managerId\) |
| `interviewSchedules` | json | List of interview schedules \(id, applicationId, interviewStageId, interviewEvents\[\] with interviewerUserIds/startTime/endTime/feedbackLink/location/meetingLink/hasSubmittedFeedback, status, scheduledBy, createdAt, updatedAt\) |
| `tags` | json | List of candidate tags \(id, title, isArchived\) |
| `id` | string | Resource UUID |
| `name` | string | Resource name |
| `title` | string | Job title or job posting title |
| `status` | string | Status |
| `candidate` | json | Candidate details \(id, name, primaryEmailAddress, primaryPhoneNumber, emailAddresses\[\], phoneNumbers\[\], socialLinks\[\], customFields\[\], source, creditedToUser, createdAt, updatedAt\) |
| `job` | json | Job details \(id, title, status, employmentType, locationId, departmentId, hiringTeam\[\], author, location, openings\[\], compensation, createdAt, updatedAt\) |
| `application` | json | Application details \(id, status, customFields\[\], candidate, currentInterviewStage, source, archiveReason, job, hiringTeam\[\], createdAt, updatedAt\) |
| `offer` | json | Offer details \(id, decidedAt, applicationId, acceptanceStatus, offerStatus, latestVersion\) |
| `jobPosting` | json | Job posting details \(id, title, descriptionPlain, descriptionHtml, descriptionSocial, descriptionParts, departmentName, teamName, teamNameHierarchy\[\], jobId, locationName, locationIds, linkedData, address, isRemote, workplaceType, employmentType, isListed, publishedDate, applicationDeadline, externalLink, applyLink, compensation, updatedAt\) |
| `content` | string | Note content |
| `author` | json | Note author \(id, firstName, lastName, email, globalRole, isEnabled\) |
| `isPrivate` | boolean | Whether the note is private |
| `createdAt` | string | ISO 8601 creation timestamp |
| `moreDataAvailable` | boolean | Whether more pages exist |
| `nextCursor` | string | Pagination cursor for next page |
| `syncToken` | string | Sync token for incremental updates |
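The `moreDataAvailable` / `nextCursor` pair above drives cursor pagination. A minimal TypeScript sketch of draining all pages — the `fetchPage` callback and the `results` field name are illustrative assumptions, not part of the documented output:

```typescript
// Generic cursor-pagination loop over an Ashby-style list response.
// `Page` mirrors the pagination fields documented above.
type Page<T> = { results: T[]; moreDataAvailable: boolean; nextCursor?: string };

async function collectAll<T>(
  fetchPage: (cursor?: string) => Promise<Page<T>>,
): Promise<T[]> {
  const all: T[] = [];
  let cursor: string | undefined;
  do {
    const page = await fetchPage(cursor);
    all.push(...page.results);
    // Only follow the cursor while the API says more pages exist.
    cursor = page.moreDataAvailable ? page.nextCursor : undefined;
  } while (cursor !== undefined);
  return all;
}
```

The same loop works for any of the list-style tools here, since they share the `moreDataAvailable`/`nextCursor` contract.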
### `ashby_change_application_stage`
@@ -71,8 +101,37 @@ Moves an application to a different interview stage. Requires an archive reason
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `applicationId` | string | Application UUID |
| `stageId` | string | New interview stage UUID |
| `candidates` | json | List of candidates with rich fields \(id, name, primaryEmailAddress, primaryPhoneNumber, emailAddresses\[\], phoneNumbers\[\], socialLinks\[\], linkedInUrl, githubUrl, profileUrl, position, company, school, timezone, location with locationComponents\[\], tags\[\], applicationIds\[\], customFields\[\], resumeFileHandle, fileHandles\[\], source with sourceType, creditedToUser, fraudStatus, createdAt, updatedAt\) |
| `jobs` | json | List of jobs \(id, title, confidential, status, employmentType, locationId, departmentId, defaultInterviewPlanId, interviewPlanIds\[\], customFields\[\], jobPostingIds\[\], customRequisitionId, brandId, hiringTeam\[\], author, createdAt, updatedAt, openedAt, closedAt, location with address, openings\[\] with latestVersion, compensation with compensationTiers\[\]\) |
| `applications` | json | List of applications \(id, status, customFields\[\], candidate summary, currentInterviewStage, source with sourceType, archiveReason with customFields\[\], archivedAt, job summary, creditedToUser, hiringTeam\[\], appliedViaJobPostingId, submitterClientIp, submitterUserAgent, createdAt, updatedAt\) |
| `notes` | json | List of notes \(id, content, author, isPrivate, createdAt\) |
| `offers` | json | List of offers \(id, decidedAt, applicationId, acceptanceStatus, offerStatus, latestVersion with id/startDate/salary/createdAt/openingId/customFields\[\]/fileHandles\[\]/author/approvalStatus\) |
| `archiveReasons` | json | List of archive reasons \(id, text, reasonType \[RejectedByCandidate/RejectedByOrg/Other\], isArchived\) |
| `sources` | json | List of sources \(id, title, isArchived, sourceType \{id, title, isArchived\}\) |
| `customFields` | json | List of custom field definitions \(id, title, isPrivate, fieldType, objectType, isArchived, isRequired, selectableValues\[\] \{label, value, isArchived\}\) |
| `departments` | json | List of departments \(id, name, externalName, isArchived, parentId, createdAt, updatedAt\) |
| `locations` | json | List of locations \(id, name, externalName, isArchived, isRemote, workplaceType, parentLocationId, type, address with addressCountry/Region/Locality/postalCode/streetAddress\) |
| `jobPostings` | json | List of job postings \(id, title, jobId, departmentName, teamName, locationName, locationIds, workplaceType, employmentType, isListed, publishedDate, applicationDeadline, externalLink, applyLink, compensationTierSummary, shouldDisplayCompensationOnJobBoard, updatedAt\) |
| `openings` | json | List of openings \(id, openedAt, closedAt, isArchived, archivedAt, closeReasonId, openingState, latestVersion with identifier/description/authorId/createdAt/teamId/jobIds\[\]/targetHireDate/targetStartDate/isBackfill/employmentType/locationIds\[\]/hiringTeam\[\]/customFields\[\]\) |
| `users` | json | List of users \(id, firstName, lastName, email, globalRole, isEnabled, updatedAt, managerId\) |
| `interviewSchedules` | json | List of interview schedules \(id, applicationId, interviewStageId, interviewEvents\[\] with interviewerUserIds/startTime/endTime/feedbackLink/location/meetingLink/hasSubmittedFeedback, status, scheduledBy, createdAt, updatedAt\) |
| `tags` | json | List of candidate tags \(id, title, isArchived\) |
| `id` | string | Resource UUID |
| `name` | string | Resource name |
| `title` | string | Job title or job posting title |
| `status` | string | Status |
| `candidate` | json | Candidate details \(id, name, primaryEmailAddress, primaryPhoneNumber, emailAddresses\[\], phoneNumbers\[\], socialLinks\[\], customFields\[\], source, creditedToUser, createdAt, updatedAt\) |
| `job` | json | Job details \(id, title, status, employmentType, locationId, departmentId, hiringTeam\[\], author, location, openings\[\], compensation, createdAt, updatedAt\) |
| `application` | json | Application details \(id, status, customFields\[\], candidate, currentInterviewStage, source, archiveReason, job, hiringTeam\[\], createdAt, updatedAt\) |
| `offer` | json | Offer details \(id, decidedAt, applicationId, acceptanceStatus, offerStatus, latestVersion\) |
| `jobPosting` | json | Job posting details \(id, title, descriptionPlain, descriptionHtml, descriptionSocial, descriptionParts, departmentName, teamName, teamNameHierarchy\[\], jobId, locationName, locationIds, linkedData, address, isRemote, workplaceType, employmentType, isListed, publishedDate, applicationDeadline, externalLink, applyLink, compensation, updatedAt\) |
| `content` | string | Note content |
| `author` | json | Note author \(id, firstName, lastName, email, globalRole, isEnabled\) |
| `isPrivate` | boolean | Whether the note is private |
| `createdAt` | string | ISO 8601 creation timestamp |
| `moreDataAvailable` | boolean | Whether more pages exist |
| `nextCursor` | string | Pagination cursor for next page |
| `syncToken` | string | Sync token for incremental updates |
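A sketch of assembling the change-stage request from the two documented parameters. The endpoint path, the Basic-auth convention (API key as username, empty password), and the `interviewStageId` body key are assumptions to verify against the Ashby API reference:

```typescript
interface ChangeStageParams {
  apiKey: string;
  applicationId: string; // Application UUID (required)
  stageId: string;       // New interview stage UUID (required)
}

// Builds a framework-agnostic request description; performing the HTTP
// call is left to the caller.
function buildChangeStageRequest(p: ChangeStageParams) {
  if (!p.applicationId || !p.stageId) {
    throw new Error("applicationId and stageId are required");
  }
  return {
    url: "https://api.ashbyhq.com/application.changeStage", // assumed path
    auth: { username: p.apiKey, password: "" },             // assumed auth scheme
    body: JSON.stringify({
      applicationId: p.applicationId,
      interviewStageId: p.stageId, // assumed body key for `stageId`
    }),
  };
}
```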
### `ashby_create_application`
@@ -95,7 +154,37 @@ Creates a new application for a candidate on a job. Optionally specify interview
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `applicationId` | string | Created application UUID |
| `candidates` | json | List of candidates with rich fields \(id, name, primaryEmailAddress, primaryPhoneNumber, emailAddresses\[\], phoneNumbers\[\], socialLinks\[\], linkedInUrl, githubUrl, profileUrl, position, company, school, timezone, location with locationComponents\[\], tags\[\], applicationIds\[\], customFields\[\], resumeFileHandle, fileHandles\[\], source with sourceType, creditedToUser, fraudStatus, createdAt, updatedAt\) |
| `jobs` | json | List of jobs \(id, title, confidential, status, employmentType, locationId, departmentId, defaultInterviewPlanId, interviewPlanIds\[\], customFields\[\], jobPostingIds\[\], customRequisitionId, brandId, hiringTeam\[\], author, createdAt, updatedAt, openedAt, closedAt, location with address, openings\[\] with latestVersion, compensation with compensationTiers\[\]\) |
| `applications` | json | List of applications \(id, status, customFields\[\], candidate summary, currentInterviewStage, source with sourceType, archiveReason with customFields\[\], archivedAt, job summary, creditedToUser, hiringTeam\[\], appliedViaJobPostingId, submitterClientIp, submitterUserAgent, createdAt, updatedAt\) |
| `notes` | json | List of notes \(id, content, author, isPrivate, createdAt\) |
| `offers` | json | List of offers \(id, decidedAt, applicationId, acceptanceStatus, offerStatus, latestVersion with id/startDate/salary/createdAt/openingId/customFields\[\]/fileHandles\[\]/author/approvalStatus\) |
| `archiveReasons` | json | List of archive reasons \(id, text, reasonType \[RejectedByCandidate/RejectedByOrg/Other\], isArchived\) |
| `sources` | json | List of sources \(id, title, isArchived, sourceType \{id, title, isArchived\}\) |
| `customFields` | json | List of custom field definitions \(id, title, isPrivate, fieldType, objectType, isArchived, isRequired, selectableValues\[\] \{label, value, isArchived\}\) |
| `departments` | json | List of departments \(id, name, externalName, isArchived, parentId, createdAt, updatedAt\) |
| `locations` | json | List of locations \(id, name, externalName, isArchived, isRemote, workplaceType, parentLocationId, type, address with addressCountry/Region/Locality/postalCode/streetAddress\) |
| `jobPostings` | json | List of job postings \(id, title, jobId, departmentName, teamName, locationName, locationIds, workplaceType, employmentType, isListed, publishedDate, applicationDeadline, externalLink, applyLink, compensationTierSummary, shouldDisplayCompensationOnJobBoard, updatedAt\) |
| `openings` | json | List of openings \(id, openedAt, closedAt, isArchived, archivedAt, closeReasonId, openingState, latestVersion with identifier/description/authorId/createdAt/teamId/jobIds\[\]/targetHireDate/targetStartDate/isBackfill/employmentType/locationIds\[\]/hiringTeam\[\]/customFields\[\]\) |
| `users` | json | List of users \(id, firstName, lastName, email, globalRole, isEnabled, updatedAt, managerId\) |
| `interviewSchedules` | json | List of interview schedules \(id, applicationId, interviewStageId, interviewEvents\[\] with interviewerUserIds/startTime/endTime/feedbackLink/location/meetingLink/hasSubmittedFeedback, status, scheduledBy, createdAt, updatedAt\) |
| `tags` | json | List of candidate tags \(id, title, isArchived\) |
| `id` | string | Resource UUID |
| `name` | string | Resource name |
| `title` | string | Job title or job posting title |
| `status` | string | Status |
| `candidate` | json | Candidate details \(id, name, primaryEmailAddress, primaryPhoneNumber, emailAddresses\[\], phoneNumbers\[\], socialLinks\[\], customFields\[\], source, creditedToUser, createdAt, updatedAt\) |
| `job` | json | Job details \(id, title, status, employmentType, locationId, departmentId, hiringTeam\[\], author, location, openings\[\], compensation, createdAt, updatedAt\) |
| `application` | json | Application details \(id, status, customFields\[\], candidate, currentInterviewStage, source, archiveReason, job, hiringTeam\[\], createdAt, updatedAt\) |
| `offer` | json | Offer details \(id, decidedAt, applicationId, acceptanceStatus, offerStatus, latestVersion\) |
| `jobPosting` | json | Job posting details \(id, title, descriptionPlain, descriptionHtml, descriptionSocial, descriptionParts, departmentName, teamName, teamNameHierarchy\[\], jobId, locationName, locationIds, linkedData, address, isRemote, workplaceType, employmentType, isListed, publishedDate, applicationDeadline, externalLink, applyLink, compensation, updatedAt\) |
| `content` | string | Note content |
| `author` | json | Note author \(id, firstName, lastName, email, globalRole, isEnabled\) |
| `isPrivate` | boolean | Whether the note is private |
| `createdAt` | string | ISO 8601 creation timestamp |
| `moreDataAvailable` | boolean | Whether more pages exist |
| `nextCursor` | string | Pagination cursor for next page |
| `syncToken` | string | Sync token for incremental updates |
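The input parameters for this tool are not visible in this excerpt. The sketch below assumes `candidateId` and `jobId` are required (per the description, "an application for a candidate on a job") and that an interview stage and source are optional — the parameter names are hypothetical, so verify them before use:

```typescript
interface CreateApplicationParams {
  candidateId: string;       // assumed required
  jobId: string;             // assumed required
  interviewStageId?: string; // assumed optional ("Optionally specify interview…")
  sourceId?: string;         // assumed optional
}

// Builds the request body, omitting optional keys that were not supplied
// so the API sees only intentional fields.
function buildCreateApplicationBody(p: CreateApplicationParams) {
  if (!p.candidateId || !p.jobId) {
    throw new Error("candidateId and jobId are required");
  }
  const body: Record<string, string> = { candidateId: p.candidateId, jobId: p.jobId };
  if (p.interviewStageId) body.interviewStageId = p.interviewStageId;
  if (p.sourceId) body.sourceId = p.sourceId;
  return body;
}
```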
### `ashby_create_candidate`
@@ -107,7 +196,7 @@ Creates a new candidate record in Ashby.
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Ashby API Key |
| `name` | string | Yes | The candidate's full name |
| `email` | string | No | Primary email address for the candidate |
| `phoneNumber` | string | No | Primary phone number for the candidate |
| `linkedInUrl` | string | No | LinkedIn profile URL |
| `githubUrl` | string | No | GitHub profile URL |
@@ -117,17 +206,37 @@ Creates a new candidate record in Ashby.
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `id` | string | Created candidate UUID |
| `name` | string | Full name |
| `primaryEmailAddress` | object | Primary email contact info |
| ↳ `value` | string | Email address |
| ↳ `type` | string | Contact type \(Personal, Work, Other\) |
| ↳ `isPrimary` | boolean | Whether this is the primary email |
| `primaryPhoneNumber` | object | Primary phone contact info |
| ↳ `value` | string | Phone number |
| ↳ `type` | string | Contact type \(Personal, Work, Other\) |
| ↳ `isPrimary` | boolean | Whether this is the primary phone |
| `candidates` | json | List of candidates with rich fields \(id, name, primaryEmailAddress, primaryPhoneNumber, emailAddresses\[\], phoneNumbers\[\], socialLinks\[\], linkedInUrl, githubUrl, profileUrl, position, company, school, timezone, location with locationComponents\[\], tags\[\], applicationIds\[\], customFields\[\], resumeFileHandle, fileHandles\[\], source with sourceType, creditedToUser, fraudStatus, createdAt, updatedAt\) |
| `jobs` | json | List of jobs \(id, title, confidential, status, employmentType, locationId, departmentId, defaultInterviewPlanId, interviewPlanIds\[\], customFields\[\], jobPostingIds\[\], customRequisitionId, brandId, hiringTeam\[\], author, createdAt, updatedAt, openedAt, closedAt, location with address, openings\[\] with latestVersion, compensation with compensationTiers\[\]\) |
| `applications` | json | List of applications \(id, status, customFields\[\], candidate summary, currentInterviewStage, source with sourceType, archiveReason with customFields\[\], archivedAt, job summary, creditedToUser, hiringTeam\[\], appliedViaJobPostingId, submitterClientIp, submitterUserAgent, createdAt, updatedAt\) |
| `notes` | json | List of notes \(id, content, author, isPrivate, createdAt\) |
| `offers` | json | List of offers \(id, decidedAt, applicationId, acceptanceStatus, offerStatus, latestVersion with id/startDate/salary/createdAt/openingId/customFields\[\]/fileHandles\[\]/author/approvalStatus\) |
| `archiveReasons` | json | List of archive reasons \(id, text, reasonType \[RejectedByCandidate/RejectedByOrg/Other\], isArchived\) |
| `sources` | json | List of sources \(id, title, isArchived, sourceType \{id, title, isArchived\}\) |
| `customFields` | json | List of custom field definitions \(id, title, isPrivate, fieldType, objectType, isArchived, isRequired, selectableValues\[\] \{label, value, isArchived\}\) |
| `departments` | json | List of departments \(id, name, externalName, isArchived, parentId, createdAt, updatedAt\) |
| `locations` | json | List of locations \(id, name, externalName, isArchived, isRemote, workplaceType, parentLocationId, type, address with addressCountry/Region/Locality/postalCode/streetAddress\) |
| `jobPostings` | json | List of job postings \(id, title, jobId, departmentName, teamName, locationName, locationIds, workplaceType, employmentType, isListed, publishedDate, applicationDeadline, externalLink, applyLink, compensationTierSummary, shouldDisplayCompensationOnJobBoard, updatedAt\) |
| `openings` | json | List of openings \(id, openedAt, closedAt, isArchived, archivedAt, closeReasonId, openingState, latestVersion with identifier/description/authorId/createdAt/teamId/jobIds\[\]/targetHireDate/targetStartDate/isBackfill/employmentType/locationIds\[\]/hiringTeam\[\]/customFields\[\]\) |
| `users` | json | List of users \(id, firstName, lastName, email, globalRole, isEnabled, updatedAt, managerId\) |
| `interviewSchedules` | json | List of interview schedules \(id, applicationId, interviewStageId, interviewEvents\[\] with interviewerUserIds/startTime/endTime/feedbackLink/location/meetingLink/hasSubmittedFeedback, status, scheduledBy, createdAt, updatedAt\) |
| `tags` | json | List of candidate tags \(id, title, isArchived\) |
| `id` | string | Resource UUID |
| `name` | string | Resource name |
| `title` | string | Job title or job posting title |
| `status` | string | Status |
| `candidate` | json | Candidate details \(id, name, primaryEmailAddress, primaryPhoneNumber, emailAddresses\[\], phoneNumbers\[\], socialLinks\[\], customFields\[\], source, creditedToUser, createdAt, updatedAt\) |
| `job` | json | Job details \(id, title, status, employmentType, locationId, departmentId, hiringTeam\[\], author, location, openings\[\], compensation, createdAt, updatedAt\) |
| `application` | json | Application details \(id, status, customFields\[\], candidate, currentInterviewStage, source, archiveReason, job, hiringTeam\[\], createdAt, updatedAt\) |
| `offer` | json | Offer details \(id, decidedAt, applicationId, acceptanceStatus, offerStatus, latestVersion\) |
| `jobPosting` | json | Job posting details \(id, title, descriptionPlain, descriptionHtml, descriptionSocial, descriptionParts, departmentName, teamName, teamNameHierarchy\[\], jobId, locationName, locationIds, linkedData, address, isRemote, workplaceType, employmentType, isListed, publishedDate, applicationDeadline, externalLink, applyLink, compensation, updatedAt\) |
| `content` | string | Note content |
| `author` | json | Note author \(id, firstName, lastName, email, globalRole, isEnabled\) |
| `isPrivate` | boolean | Whether the note is private |
| `createdAt` | string | ISO 8601 creation timestamp |
| `moreDataAvailable` | boolean | Whether more pages exist |
| `nextCursor` | string | Pagination cursor for next page |
| `syncToken` | string | Sync token for incremental updates |
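The required/optional split in the parameter table maps directly to a small body builder. A sketch, assuming the request body keys match the tool parameter names (only `name` is required; unset optionals are dropped):

```typescript
interface CreateCandidateParams {
  name: string;         // required per the table
  email?: string;       // optional per the table
  phoneNumber?: string;
  linkedInUrl?: string;
  githubUrl?: string;
}

// Returns only the fields the caller actually provided.
function buildCreateCandidateBody(p: CreateCandidateParams): Record<string, string> {
  if (!p.name) throw new Error("name is required");
  const body: Record<string, string> = { name: p.name };
  if (p.email) body.email = p.email;
  if (p.phoneNumber) body.phoneNumber = p.phoneNumber;
  if (p.linkedInUrl) body.linkedInUrl = p.linkedInUrl;
  if (p.githubUrl) body.githubUrl = p.githubUrl;
  return body;
}
```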
### `ashby_create_note`
@@ -147,7 +256,15 @@ Creates a note on a candidate in Ashby. Supports plain text and HTML content (bo
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `id` | string | Created note UUID |
| `createdAt` | string | ISO 8601 creation timestamp |
| `isPrivate` | boolean | Whether the note is private |
| `content` | string | Note content |
| `author` | object | Author of the note |
| ↳ `id` | string | Author user UUID |
| ↳ `firstName` | string | Author first name |
| ↳ `lastName` | string | Author last name |
| ↳ `email` | string | Author email |
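Since the tool accepts both plain-text and HTML content, a request body might tag the content type explicitly. A sketch — `candidateId`, `content`, and the `note.value`/`note.type` body shape are assumptions, not documented in this excerpt:

```typescript
interface CreateNoteParams {
  candidateId: string; // assumed parameter name
  content: string;     // plain text or HTML
  isHtml?: boolean;
}

// Wraps the note content with an explicit MIME type so HTML is not
// rendered as literal text.
function buildCreateNoteBody(p: CreateNoteParams) {
  if (!p.candidateId || !p.content) {
    throw new Error("candidateId and content are required");
  }
  return {
    candidateId: p.candidateId,
    note: { value: p.content, type: p.isHtml ? "text/html" : "text/plain" },
  };
}
```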
### `ashby_get_application`
@@ -164,28 +281,37 @@ Retrieves full details about a single application by its ID.
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `id` | string | Application UUID |
| `status` | string | Application status \(Active, Hired, Archived, Lead\) |
| `candidate` | object | Associated candidate |
| ↳ `id` | string | Candidate UUID |
| ↳ `name` | string | Candidate name |
| `job` | object | Associated job |
| ↳ `id` | string | Job UUID |
| ↳ `title` | string | Job title |
| `currentInterviewStage` | object | Current interview stage |
| ↳ `id` | string | Stage UUID |
| ↳ `title` | string | Stage title |
| ↳ `type` | string | Stage type |
| `source` | object | Application source |
| ↳ `id` | string | Source UUID |
| ↳ `title` | string | Source title |
| `archiveReason` | object | Reason for archival |
| ↳ `id` | string | Reason UUID |
| ↳ `text` | string | Reason text |
| ↳ `reasonType` | string | Reason type |
| `archivedAt` | string | ISO 8601 archive timestamp |
| `candidates` | json | List of candidates with rich fields \(id, name, primaryEmailAddress, primaryPhoneNumber, emailAddresses\[\], phoneNumbers\[\], socialLinks\[\], linkedInUrl, githubUrl, profileUrl, position, company, school, timezone, location with locationComponents\[\], tags\[\], applicationIds\[\], customFields\[\], resumeFileHandle, fileHandles\[\], source with sourceType, creditedToUser, fraudStatus, createdAt, updatedAt\) |
| `jobs` | json | List of jobs \(id, title, confidential, status, employmentType, locationId, departmentId, defaultInterviewPlanId, interviewPlanIds\[\], customFields\[\], jobPostingIds\[\], customRequisitionId, brandId, hiringTeam\[\], author, createdAt, updatedAt, openedAt, closedAt, location with address, openings\[\] with latestVersion, compensation with compensationTiers\[\]\) |
| `applications` | json | List of applications \(id, status, customFields\[\], candidate summary, currentInterviewStage, source with sourceType, archiveReason with customFields\[\], archivedAt, job summary, creditedToUser, hiringTeam\[\], appliedViaJobPostingId, submitterClientIp, submitterUserAgent, createdAt, updatedAt\) |
| `notes` | json | List of notes \(id, content, author, isPrivate, createdAt\) |
| `offers` | json | List of offers \(id, decidedAt, applicationId, acceptanceStatus, offerStatus, latestVersion with id/startDate/salary/createdAt/openingId/customFields\[\]/fileHandles\[\]/author/approvalStatus\) |
| `archiveReasons` | json | List of archive reasons \(id, text, reasonType \[RejectedByCandidate/RejectedByOrg/Other\], isArchived\) |
| `sources` | json | List of sources \(id, title, isArchived, sourceType \{id, title, isArchived\}\) |
| `customFields` | json | List of custom field definitions \(id, title, isPrivate, fieldType, objectType, isArchived, isRequired, selectableValues\[\] \{label, value, isArchived\}\) |
| `departments` | json | List of departments \(id, name, externalName, isArchived, parentId, createdAt, updatedAt\) |
| `locations` | json | List of locations \(id, name, externalName, isArchived, isRemote, workplaceType, parentLocationId, type, address with addressCountry/Region/Locality/postalCode/streetAddress\) |
| `jobPostings` | json | List of job postings \(id, title, jobId, departmentName, teamName, locationName, locationIds, workplaceType, employmentType, isListed, publishedDate, applicationDeadline, externalLink, applyLink, compensationTierSummary, shouldDisplayCompensationOnJobBoard, updatedAt\) |
| `openings` | json | List of openings \(id, openedAt, closedAt, isArchived, archivedAt, closeReasonId, openingState, latestVersion with identifier/description/authorId/createdAt/teamId/jobIds\[\]/targetHireDate/targetStartDate/isBackfill/employmentType/locationIds\[\]/hiringTeam\[\]/customFields\[\]\) |
| `users` | json | List of users \(id, firstName, lastName, email, globalRole, isEnabled, updatedAt, managerId\) |
| `interviewSchedules` | json | List of interview schedules \(id, applicationId, interviewStageId, interviewEvents\[\] with interviewerUserIds/startTime/endTime/feedbackLink/location/meetingLink/hasSubmittedFeedback, status, scheduledBy, createdAt, updatedAt\) |
| `tags` | json | List of candidate tags \(id, title, isArchived\) |
| `id` | string | Resource UUID |
| `name` | string | Resource name |
| `title` | string | Job title or job posting title |
| `status` | string | Status |
| `candidate` | json | Candidate details \(id, name, primaryEmailAddress, primaryPhoneNumber, emailAddresses\[\], phoneNumbers\[\], socialLinks\[\], customFields\[\], source, creditedToUser, createdAt, updatedAt\) |
| `job` | json | Job details \(id, title, status, employmentType, locationId, departmentId, hiringTeam\[\], author, location, openings\[\], compensation, createdAt, updatedAt\) |
| `application` | json | Application details \(id, status, customFields\[\], candidate, currentInterviewStage, source, archiveReason, job, hiringTeam\[\], createdAt, updatedAt\) |
| `offer` | json | Offer details \(id, decidedAt, applicationId, acceptanceStatus, offerStatus, latestVersion\) |
| `jobPosting` | json | Job posting details \(id, title, descriptionPlain, descriptionHtml, descriptionSocial, descriptionParts, departmentName, teamName, teamNameHierarchy\[\], jobId, locationName, locationIds, linkedData, address, isRemote, workplaceType, employmentType, isListed, publishedDate, applicationDeadline, externalLink, applyLink, compensation, updatedAt\) |
| `content` | string | Note content |
| `author` | json | Note author \(id, firstName, lastName, email, globalRole, isEnabled\) |
| `isPrivate` | boolean | Whether the note is private |
| `createdAt` | string | ISO 8601 creation timestamp |
| `updatedAt` | string | ISO 8601 last update timestamp |
| `moreDataAvailable` | boolean | Whether more pages exist |
| `nextCursor` | string | Pagination cursor for next page |
| `syncToken` | string | Sync token for incremental updates |
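The output fields above can be narrowed to a small TypeScript type for downstream use. A sketch covering only the documented fields (the summary format is illustrative):

```typescript
// Shape of the get-application output, restricted to the fields the
// table above documents.
interface ApplicationInfo {
  id: string;
  status: "Active" | "Hired" | "Archived" | "Lead";
  candidate: { id: string; name: string };
  job: { id: string; title: string };
  currentInterviewStage?: { id: string; title: string; type: string };
  archivedAt?: string;
}

function summarize(app: ApplicationInfo): string {
  const stage = app.currentInterviewStage?.title ?? "no stage";
  return `${app.candidate.name} -> ${app.job.title} [${app.status}, ${stage}]`;
}
```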
### `ashby_get_candidate`
@@ -202,27 +328,37 @@ Retrieves full details about a single candidate by their ID.
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `id` | string | Candidate UUID |
| `name` | string | Full name |
| `primaryEmailAddress` | object | Primary email contact info |
| ↳ `value` | string | Email address |
| ↳ `type` | string | Contact type \(Personal, Work, Other\) |
| ↳ `isPrimary` | boolean | Whether this is the primary email |
| `primaryPhoneNumber` | object | Primary phone contact info |
| ↳ `value` | string | Phone number |
| ↳ `type` | string | Contact type \(Personal, Work, Other\) |
| ↳ `isPrimary` | boolean | Whether this is the primary phone |
| `profileUrl` | string | URL to the candidate's Ashby profile |
| `position` | string | Current position or title |
| `company` | string | Current company |
| `linkedInUrl` | string | LinkedIn profile URL |
| `githubUrl` | string | GitHub profile URL |
| `tags` | array | Tags applied to the candidate |
| ↳ `id` | string | Tag UUID |
| ↳ `title` | string | Tag title |
| `applicationIds` | array | IDs of associated applications |
| `candidates` | json | List of candidates with rich fields \(id, name, primaryEmailAddress, primaryPhoneNumber, emailAddresses\[\], phoneNumbers\[\], socialLinks\[\], linkedInUrl, githubUrl, profileUrl, position, company, school, timezone, location with locationComponents\[\], tags\[\], applicationIds\[\], customFields\[\], resumeFileHandle, fileHandles\[\], source with sourceType, creditedToUser, fraudStatus, createdAt, updatedAt\) |
| `jobs` | json | List of jobs \(id, title, confidential, status, employmentType, locationId, departmentId, defaultInterviewPlanId, interviewPlanIds\[\], customFields\[\], jobPostingIds\[\], customRequisitionId, brandId, hiringTeam\[\], author, createdAt, updatedAt, openedAt, closedAt, location with address, openings\[\] with latestVersion, compensation with compensationTiers\[\]\) |
| `applications` | json | List of applications \(id, status, customFields\[\], candidate summary, currentInterviewStage, source with sourceType, archiveReason with customFields\[\], archivedAt, job summary, creditedToUser, hiringTeam\[\], appliedViaJobPostingId, submitterClientIp, submitterUserAgent, createdAt, updatedAt\) |
| `notes` | json | List of notes \(id, content, author, isPrivate, createdAt\) |
| `offers` | json | List of offers \(id, decidedAt, applicationId, acceptanceStatus, offerStatus, latestVersion with id/startDate/salary/createdAt/openingId/customFields\[\]/fileHandles\[\]/author/approvalStatus\) |
| `archiveReasons` | json | List of archive reasons \(id, text, reasonType \[RejectedByCandidate/RejectedByOrg/Other\], isArchived\) |
| `sources` | json | List of sources \(id, title, isArchived, sourceType \{id, title, isArchived\}\) |
| `customFields` | json | List of custom field definitions \(id, title, isPrivate, fieldType, objectType, isArchived, isRequired, selectableValues\[\] \{label, value, isArchived\}\) |
| `departments` | json | List of departments \(id, name, externalName, isArchived, parentId, createdAt, updatedAt\) |
| `locations` | json | List of locations \(id, name, externalName, isArchived, isRemote, workplaceType, parentLocationId, type, address with addressCountry/Region/Locality/postalCode/streetAddress\) |
| `jobPostings` | json | List of job postings \(id, title, jobId, departmentName, teamName, locationName, locationIds, workplaceType, employmentType, isListed, publishedDate, applicationDeadline, externalLink, applyLink, compensationTierSummary, shouldDisplayCompensationOnJobBoard, updatedAt\) |
| `openings` | json | List of openings \(id, openedAt, closedAt, isArchived, archivedAt, closeReasonId, openingState, latestVersion with identifier/description/authorId/createdAt/teamId/jobIds\[\]/targetHireDate/targetStartDate/isBackfill/employmentType/locationIds\[\]/hiringTeam\[\]/customFields\[\]\) |
| `users` | json | List of users \(id, firstName, lastName, email, globalRole, isEnabled, updatedAt, managerId\) |
| `interviewSchedules` | json | List of interview schedules \(id, applicationId, interviewStageId, interviewEvents\[\] with interviewerUserIds/startTime/endTime/feedbackLink/location/meetingLink/hasSubmittedFeedback, status, scheduledBy, createdAt, updatedAt\) |
| `tags` | json | List of candidate tags \(id, title, isArchived\) |
| `id` | string | Resource UUID |
| `name` | string | Resource name |
| `title` | string | Job title or job posting title |
| `status` | string | Status |
| `candidate` | json | Candidate details \(id, name, primaryEmailAddress, primaryPhoneNumber, emailAddresses\[\], phoneNumbers\[\], socialLinks\[\], customFields\[\], source, creditedToUser, createdAt, updatedAt\) |
| `job` | json | Job details \(id, title, status, employmentType, locationId, departmentId, hiringTeam\[\], author, location, openings\[\], compensation, createdAt, updatedAt\) |
| `application` | json | Application details \(id, status, customFields\[\], candidate, currentInterviewStage, source, archiveReason, job, hiringTeam\[\], createdAt, updatedAt\) |
| `offer` | json | Offer details \(id, decidedAt, applicationId, acceptanceStatus, offerStatus, latestVersion\) |
| `jobPosting` | json | Job posting details \(id, title, descriptionPlain, descriptionHtml, descriptionSocial, descriptionParts, departmentName, teamName, teamNameHierarchy\[\], jobId, locationName, locationIds, linkedData, address, isRemote, workplaceType, employmentType, isListed, publishedDate, applicationDeadline, externalLink, applyLink, compensation, updatedAt\) |
| `content` | string | Note content |
| `author` | json | Note author \(id, firstName, lastName, email, globalRole, isEnabled\) |
| `isPrivate` | boolean | Whether the note is private |
| `createdAt` | string | ISO 8601 creation timestamp |
| `updatedAt` | string | ISO 8601 last update timestamp |
| `moreDataAvailable` | boolean | Whether more pages exist |
| `nextCursor` | string | Pagination cursor for next page |
| `syncToken` | string | Sync token for incremental updates |
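Callers often just want the candidate's primary email. A sketch using the documented `primaryEmailAddress` shape, with a fallback scan over `emailAddresses[]` (the fallback ordering is a design choice here, not documented behavior):

```typescript
// Contact info shape from the output table above.
interface ContactInfo { value: string; type: string; isPrimary: boolean }

interface CandidateInfo {
  id: string;
  name: string;
  primaryEmailAddress?: ContactInfo;
  emailAddresses?: ContactInfo[];
}

// Prefer the dedicated primary field, then any entry flagged primary,
// then the first listed address.
function primaryEmail(c: CandidateInfo): string | undefined {
  return (
    c.primaryEmailAddress?.value ??
    c.emailAddresses?.find((e) => e.isPrimary)?.value ??
    c.emailAddresses?.[0]?.value
  );
}
```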
### `ashby_get_job`
@@ -239,16 +375,37 @@ Retrieves full details about a single job by its ID.
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `id` | string | Job UUID |
| `title` | string | Job title |
| `status` | string | Job status \(Open, Closed, Draft, Archived\) |
| `employmentType` | string | Employment type \(FullTime, PartTime, Intern, Contract, Temporary\) |
| `departmentId` | string | Department UUID |
| `locationId` | string | Location UUID |
| `descriptionPlain` | string | Job description in plain text |
| `isArchived` | boolean | Whether the job is archived |
| `candidates` | json | List of candidates with rich fields \(id, name, primaryEmailAddress, primaryPhoneNumber, emailAddresses\[\], phoneNumbers\[\], socialLinks\[\], linkedInUrl, githubUrl, profileUrl, position, company, school, timezone, location with locationComponents\[\], tags\[\], applicationIds\[\], customFields\[\], resumeFileHandle, fileHandles\[\], source with sourceType, creditedToUser, fraudStatus, createdAt, updatedAt\) |
| `jobs` | json | List of jobs \(id, title, confidential, status, employmentType, locationId, departmentId, defaultInterviewPlanId, interviewPlanIds\[\], customFields\[\], jobPostingIds\[\], customRequisitionId, brandId, hiringTeam\[\], author, createdAt, updatedAt, openedAt, closedAt, location with address, openings\[\] with latestVersion, compensation with compensationTiers\[\]\) |
| `applications` | json | List of applications \(id, status, customFields\[\], candidate summary, currentInterviewStage, source with sourceType, archiveReason with customFields\[\], archivedAt, job summary, creditedToUser, hiringTeam\[\], appliedViaJobPostingId, submitterClientIp, submitterUserAgent, createdAt, updatedAt\) |
| `notes` | json | List of notes \(id, content, author, isPrivate, createdAt\) |
| `offers` | json | List of offers \(id, decidedAt, applicationId, acceptanceStatus, offerStatus, latestVersion with id/startDate/salary/createdAt/openingId/customFields\[\]/fileHandles\[\]/author/approvalStatus\) |
| `archiveReasons` | json | List of archive reasons \(id, text, reasonType \[RejectedByCandidate/RejectedByOrg/Other\], isArchived\) |
| `sources` | json | List of sources \(id, title, isArchived, sourceType \{id, title, isArchived\}\) |
| `customFields` | json | List of custom field definitions \(id, title, isPrivate, fieldType, objectType, isArchived, isRequired, selectableValues\[\] \{label, value, isArchived\}\) |
| `departments` | json | List of departments \(id, name, externalName, isArchived, parentId, createdAt, updatedAt\) |
| `locations` | json | List of locations \(id, name, externalName, isArchived, isRemote, workplaceType, parentLocationId, type, address with addressCountry/Region/Locality/postalCode/streetAddress\) |
| `jobPostings` | json | List of job postings \(id, title, jobId, departmentName, teamName, locationName, locationIds, workplaceType, employmentType, isListed, publishedDate, applicationDeadline, externalLink, applyLink, compensationTierSummary, shouldDisplayCompensationOnJobBoard, updatedAt\) |
| `openings` | json | List of openings \(id, openedAt, closedAt, isArchived, archivedAt, closeReasonId, openingState, latestVersion with identifier/description/authorId/createdAt/teamId/jobIds\[\]/targetHireDate/targetStartDate/isBackfill/employmentType/locationIds\[\]/hiringTeam\[\]/customFields\[\]\) |
| `users` | json | List of users \(id, firstName, lastName, email, globalRole, isEnabled, updatedAt, managerId\) |
| `interviewSchedules` | json | List of interview schedules \(id, applicationId, interviewStageId, interviewEvents\[\] with interviewerUserIds/startTime/endTime/feedbackLink/location/meetingLink/hasSubmittedFeedback, status, scheduledBy, createdAt, updatedAt\) |
| `tags` | json | List of candidate tags \(id, title, isArchived\) |
| `id` | string | Resource UUID |
| `name` | string | Resource name |
| `title` | string | Job title or job posting title |
| `status` | string | Status |
| `candidate` | json | Candidate details \(id, name, primaryEmailAddress, primaryPhoneNumber, emailAddresses\[\], phoneNumbers\[\], socialLinks\[\], customFields\[\], source, creditedToUser, createdAt, updatedAt\) |
| `job` | json | Job details \(id, title, status, employmentType, locationId, departmentId, hiringTeam\[\], author, location, openings\[\], compensation, createdAt, updatedAt\) |
| `application` | json | Application details \(id, status, customFields\[\], candidate, currentInterviewStage, source, archiveReason, job, hiringTeam\[\], createdAt, updatedAt\) |
| `offer` | json | Offer details \(id, decidedAt, applicationId, acceptanceStatus, offerStatus, latestVersion\) |
| `jobPosting` | json | Job posting details \(id, title, descriptionPlain, descriptionHtml, descriptionSocial, descriptionParts, departmentName, teamName, teamNameHierarchy\[\], jobId, locationName, locationIds, linkedData, address, isRemote, workplaceType, employmentType, isListed, publishedDate, applicationDeadline, externalLink, applyLink, compensation, updatedAt\) |
| `content` | string | Note content |
| `author` | json | Note author \(id, firstName, lastName, email, globalRole, isEnabled\) |
| `isPrivate` | boolean | Whether the note is private |
| `createdAt` | string | ISO 8601 creation timestamp |
| `updatedAt` | string | ISO 8601 last update timestamp |
| `moreDataAvailable` | boolean | Whether more pages exist |
| `nextCursor` | string | Pagination cursor for next page |
| `syncToken` | string | Sync token for incremental updates |
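All of these tools wrap Ashby's HTTP API, which is POST-based and authenticated with HTTP Basic auth using the API key as the username and an empty password. A minimal sketch of the request shape a tool like `ashby_get_job` would assemble — the endpoint name `job.info` and the exact layout here are assumptions for illustration, not taken from this document:

```python
import base64

def build_ashby_request(api_key: str, endpoint: str, body: dict) -> dict:
    """Assemble an Ashby-style API call (assumed shape: POST, Basic auth with
    the API key as username and empty password, JSON body)."""
    token = base64.b64encode(f"{api_key}:".encode()).decode()
    return {
        "method": "POST",
        "url": f"https://api.ashbyhq.com/{endpoint}",
        "headers": {
            "Authorization": f"Basic {token}",
            "Content-Type": "application/json",
        },
        "json": body,
    }

# Hypothetical call corresponding to ashby_get_job with a job UUID:
req = build_ashby_request("my-api-key", "job.info", {"id": "1234-uuid"})
```

The helper only builds the request description; sending it (e.g. via an HTTP client) is left out so the shape stays visible.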
### `ashby_get_job_posting`
@@ -260,6 +417,8 @@ Retrieves full details about a single job posting by its ID.
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Ashby API Key |
| `jobPostingId` | string | Yes | The UUID of the job posting to fetch |
| `expandApplicationFormDefinition` | boolean | No | Include application form definition in the response |
| `expandSurveyFormDefinitions` | boolean | No | Include survey form definitions in the response |
#### Output
@@ -267,14 +426,56 @@ Retrieves full details about a single job posting by its ID.
| --------- | ---- | ----------- |
| `id` | string | Job posting UUID |
| `title` | string | Job posting title |
| `descriptionPlain` | string | Full description in plain text |
| `descriptionHtml` | string | Full description in HTML |
| `descriptionSocial` | string | Shortened description for social sharing \(max 200 chars\) |
| `descriptionParts` | object | Description broken into opening, body, and closing sections |
| ↳ `descriptionOpening` | object | Opening \(from Job Boards theme settings\) |
| ↳ `html` | string | HTML content |
| ↳ `plain` | string | Plain text content |
| ↳ `descriptionBody` | object | Main description body |
| ↳ `html` | string | HTML content |
| ↳ `plain` | string | Plain text content |
| ↳ `descriptionClosing` | object | Closing \(from Job Boards theme settings\) |
| ↳ `html` | string | HTML content |
| ↳ `plain` | string | Plain text content |
| `departmentName` | string | Department name |
| `teamName` | string | Team name |
| `teamNameHierarchy` | array | Hierarchy of team names from root to team |
| `jobId` | string | Associated job UUID |
| `locationName` | string | Primary location name |
| `locationIds` | object | Primary and secondary location UUIDs |
| ↳ `primaryLocationId` | string | Primary location UUID |
| ↳ `secondaryLocationIds` | array | Secondary location UUIDs |
| `address` | object | Postal address of the posting location |
| ↳ `postalAddress` | object | Structured postal address |
| ↳ `addressCountry` | string | Country |
| ↳ `addressRegion` | string | State or region |
| ↳ `addressLocality` | string | City or locality |
| ↳ `postalCode` | string | Postal code |
| ↳ `streetAddress` | string | Street address |
| `isRemote` | boolean | Whether the posting is remote |
| `workplaceType` | string | Workplace type \(OnSite, Remote, Hybrid\) |
| `employmentType` | string | Employment type \(FullTime, PartTime, Intern, Contract, Temporary\) |
| `isListed` | boolean | Whether publicly listed on the job board |
| `suppressDescriptionOpening` | boolean | Whether the theme opening is hidden on this posting |
| `suppressDescriptionClosing` | boolean | Whether the theme closing is hidden on this posting |
| `publishedDate` | string | ISO 8601 published date |
| `applicationDeadline` | string | ISO 8601 application deadline |
| `externalLink` | string | External link to the job posting |
| `applyLink` | string | Direct apply link |
| `compensation` | object | Compensation details for the posting |
| ↳ `compensationTierSummary` | string | Human-readable tier summary |
| ↳ `summaryComponents` | array | Structured compensation components |
| ↳ `summary` | string | Component summary |
| ↳ `compensationTypeLabel` | string | Component type label \(Salary, Commission, Bonus, Equity, etc.\) |
| ↳ `interval` | string | Payment interval \(e.g. annual, hourly\) |
| ↳ `currencyCode` | string | ISO 4217 currency code |
| ↳ `minValue` | number | Minimum value |
| ↳ `maxValue` | number | Maximum value |
| ↳ `shouldDisplayCompensationOnJobBoard` | boolean | Whether compensation is shown on the job board |
| `applicationLimitCalloutHtml` | string | HTML callout shown when application limit is reached |
| `updatedAt` | string | ISO 8601 last update timestamp |
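The nested `compensation` object lends itself to a small formatting helper. A sketch using the field names from the table above; the sample values are invented:

```python
def summarize_compensation(compensation: dict) -> str:
    """Render a job posting's compensation summaryComponents as one line,
    falling back to the tier summary when no components are present."""
    parts = []
    for comp in compensation.get("summaryComponents", []):
        rng = f"{comp['minValue']:,.0f}-{comp['maxValue']:,.0f} {comp['currencyCode']}"
        parts.append(f"{comp['compensationTypeLabel']}: {rng} ({comp['interval']})")
    return "; ".join(parts) or compensation.get("compensationTierSummary", "")

sample = {
    "compensationTierSummary": "$120K - $150K",
    "summaryComponents": [
        {"compensationTypeLabel": "Salary", "interval": "annual",
         "currencyCode": "USD", "minValue": 120000, "maxValue": 150000},
    ],
}
print(summarize_compensation(sample))  # → "Salary: 120,000-150,000 USD (annual)"
```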
### `ashby_get_offer`
@@ -291,20 +492,41 @@ Retrieves full details about a single offer by its ID.
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `candidates` | json | List of candidates with rich fields \(id, name, primaryEmailAddress, primaryPhoneNumber, emailAddresses\[\], phoneNumbers\[\], socialLinks\[\], linkedInUrl, githubUrl, profileUrl, position, company, school, timezone, location with locationComponents\[\], tags\[\], applicationIds\[\], customFields\[\], resumeFileHandle, fileHandles\[\], source with sourceType, creditedToUser, fraudStatus, createdAt, updatedAt\) |
| `jobs` | json | List of jobs \(id, title, confidential, status, employmentType, locationId, departmentId, defaultInterviewPlanId, interviewPlanIds\[\], customFields\[\], jobPostingIds\[\], customRequisitionId, brandId, hiringTeam\[\], author, createdAt, updatedAt, openedAt, closedAt, location with address, openings\[\] with latestVersion, compensation with compensationTiers\[\]\) |
| `applications` | json | List of applications \(id, status, customFields\[\], candidate summary, currentInterviewStage, source with sourceType, archiveReason with customFields\[\], archivedAt, job summary, creditedToUser, hiringTeam\[\], appliedViaJobPostingId, submitterClientIp, submitterUserAgent, createdAt, updatedAt\) |
| `notes` | json | List of notes \(id, content, author, isPrivate, createdAt\) |
| `offers` | json | List of offers \(id, decidedAt, applicationId, acceptanceStatus, offerStatus, latestVersion with id/startDate/salary/createdAt/openingId/customFields\[\]/fileHandles\[\]/author/approvalStatus\) |
| `archiveReasons` | json | List of archive reasons \(id, text, reasonType \[RejectedByCandidate/RejectedByOrg/Other\], isArchived\) |
| `sources` | json | List of sources \(id, title, isArchived, sourceType \{id, title, isArchived\}\) |
| `customFields` | json | List of custom field definitions \(id, title, isPrivate, fieldType, objectType, isArchived, isRequired, selectableValues\[\] \{label, value, isArchived\}\) |
| `departments` | json | List of departments \(id, name, externalName, isArchived, parentId, createdAt, updatedAt\) |
| `locations` | json | List of locations \(id, name, externalName, isArchived, isRemote, workplaceType, parentLocationId, type, address with addressCountry/Region/Locality/postalCode/streetAddress\) |
| `jobPostings` | json | List of job postings \(id, title, jobId, departmentName, teamName, locationName, locationIds, workplaceType, employmentType, isListed, publishedDate, applicationDeadline, externalLink, applyLink, compensationTierSummary, shouldDisplayCompensationOnJobBoard, updatedAt\) |
| `openings` | json | List of openings \(id, openedAt, closedAt, isArchived, archivedAt, closeReasonId, openingState, latestVersion with identifier/description/authorId/createdAt/teamId/jobIds\[\]/targetHireDate/targetStartDate/isBackfill/employmentType/locationIds\[\]/hiringTeam\[\]/customFields\[\]\) |
| `users` | json | List of users \(id, firstName, lastName, email, globalRole, isEnabled, updatedAt, managerId\) |
| `interviewSchedules` | json | List of interview schedules \(id, applicationId, interviewStageId, interviewEvents\[\] with interviewerUserIds/startTime/endTime/feedbackLink/location/meetingLink/hasSubmittedFeedback, status, scheduledBy, createdAt, updatedAt\) |
| `tags` | json | List of candidate tags \(id, title, isArchived\) |
| `id` | string | Resource UUID |
| `name` | string | Resource name |
| `title` | string | Job title or job posting title |
| `status` | string | Status |
| `candidate` | json | Candidate details \(id, name, primaryEmailAddress, primaryPhoneNumber, emailAddresses\[\], phoneNumbers\[\], socialLinks\[\], customFields\[\], source, creditedToUser, createdAt, updatedAt\) |
| `job` | json | Job details \(id, title, status, employmentType, locationId, departmentId, hiringTeam\[\], author, location, openings\[\], compensation, createdAt, updatedAt\) |
| `application` | json | Application details \(id, status, customFields\[\], candidate, currentInterviewStage, source, archiveReason, job, hiringTeam\[\], createdAt, updatedAt\) |
| `offer` | json | Offer details \(id, decidedAt, applicationId, acceptanceStatus, offerStatus, latestVersion\) |
| `jobPosting` | json | Job posting details \(id, title, descriptionPlain, descriptionHtml, descriptionSocial, descriptionParts, departmentName, teamName, teamNameHierarchy\[\], jobId, locationName, locationIds, linkedData, address, isRemote, workplaceType, employmentType, isListed, publishedDate, applicationDeadline, externalLink, applyLink, compensation, updatedAt\) |
| `content` | string | Note content |
| `author` | json | Note author \(id, firstName, lastName, email, globalRole, isEnabled\) |
| `isPrivate` | boolean | Whether the note is private |
| `createdAt` | string | ISO 8601 creation timestamp |
| `moreDataAvailable` | boolean | Whether more pages exist |
| `nextCursor` | string | Pagination cursor for next page |
| `syncToken` | string | Sync token for incremental updates |
### `ashby_list_applications`
Lists all applications in an Ashby organization with pagination and optional filters for status, job, and creation date.
#### Input
@@ -315,7 +537,6 @@ Lists all applications in an Ashby organization with pagination and optional fil
| `perPage` | number | No | Number of results per page \(default 100\) |
| `status` | string | No | Filter by application status: Active, Hired, Archived, or Lead |
| `jobId` | string | No | Filter applications by a specific job UUID |
| `createdAfter` | string | No | Filter to applications created after this ISO 8601 timestamp \(e.g. 2024-01-01T00:00:00Z\) |
#### Output
@@ -323,23 +544,6 @@ Lists all applications in an Ashby organization with pagination and optional fil
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `applications` | array | List of applications |
| `moreDataAvailable` | boolean | Whether more pages of results exist |
| `nextCursor` | string | Opaque cursor for fetching the next page |
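The `moreDataAvailable` / `nextCursor` pair follows standard cursor pagination: pass the returned cursor back on the next call until `moreDataAvailable` is false. A minimal loop, with a stubbed fetcher standing in for the real `ashby_list_applications` call:

```python
def fetch_page(cursor=None):
    # Stub standing in for an ashby_list_applications call; returns canned
    # pages keyed by cursor purely to illustrate the loop shape.
    pages = {
        None: {"applications": [{"id": "a1"}, {"id": "a2"}],
               "moreDataAvailable": True, "nextCursor": "c1"},
        "c1": {"applications": [{"id": "a3"}],
               "moreDataAvailable": False, "nextCursor": None},
    }
    return pages[cursor]

def list_all_applications():
    """Drain every page by following nextCursor until moreDataAvailable is false."""
    results, cursor = [], None
    while True:
        page = fetch_page(cursor)
        results.extend(page["applications"])
        if not page["moreDataAvailable"]:
            return results
        cursor = page["nextCursor"]

apps = list_all_applications()
```

The same loop applies to every list tool here that returns `moreDataAvailable` and `nextCursor`.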
@@ -352,6 +556,7 @@ Lists all archive reasons configured in Ashby.
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Ashby API Key |
| `includeArchived` | boolean | No | Whether to include archived archive reasons in the response \(default false\) |
#### Output
@@ -360,7 +565,7 @@ Lists all archive reasons configured in Ashby.
| `archiveReasons` | array | List of archive reasons |
| ↳ `id` | string | Archive reason UUID |
| ↳ `text` | string | Archive reason text |
| ↳ `reasonType` | string | Reason type \(RejectedByCandidate, RejectedByOrg, Other\) |
| ↳ `isArchived` | boolean | Whether the reason is archived |
### `ashby_list_candidate_tags`
@@ -372,6 +577,10 @@ Lists all candidate tags configured in Ashby.
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Ashby API Key |
| `includeArchived` | boolean | No | Whether to include archived candidate tags \(default false\) |
| `cursor` | string | No | Opaque pagination cursor from a previous response nextCursor value |
| `syncToken` | string | No | Sync token from a previous response to fetch only changed results |
| `perPage` | number | No | Number of results per page \(default 100\) |
#### Output
@@ -381,6 +590,9 @@ Lists all candidate tags configured in Ashby.
| ↳ `id` | string | Tag UUID |
| ↳ `title` | string | Tag title |
| ↳ `isArchived` | boolean | Whether the tag is archived |
| `moreDataAvailable` | boolean | Whether more pages of results exist |
| `nextCursor` | string | Opaque cursor for fetching the next page |
| `syncToken` | string | Sync token to use for incremental updates in future requests |
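`syncToken` enables incremental sync: store the token returned by a full listing, then pass it on later calls to receive only records changed since it was issued. A client-side sketch with a stubbed endpoint — the response key `candidateTags` and the token values are assumptions for illustration; the real change-tracking semantics live on Ashby's side:

```python
class TagStore:
    """Client-side cache that refreshes incrementally via a sync token."""
    def __init__(self, fetch):
        self.fetch = fetch   # callable standing in for ashby_list_candidate_tags
        self.tags = {}       # tag id -> latest known record
        self.sync_token = None

    def refresh(self):
        page = self.fetch(sync_token=self.sync_token)
        for tag in page["candidateTags"]:
            self.tags[tag["id"]] = tag   # changed records overwrite cached ones
        self.sync_token = page["syncToken"]

def fake_fetch(sync_token=None):
    # Stub: first call is a full listing, later calls return only changes.
    if sync_token is None:
        return {"candidateTags": [{"id": "t1", "title": "Referral", "isArchived": False}],
                "syncToken": "s1"}
    return {"candidateTags": [{"id": "t1", "title": "Referral", "isArchived": True}],
            "syncToken": "s2"}

store = TagStore(fake_fetch)
store.refresh()   # full sync
store.refresh()   # incremental: t1 has since been archived
```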
### `ashby_list_candidates`
@@ -399,18 +611,6 @@ Lists all candidates in an Ashby organization with cursor-based pagination.
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `candidates` | array | List of candidates |
| `moreDataAvailable` | boolean | Whether more pages of results exist |
| `nextCursor` | string | Opaque cursor for fetching the next page |
@@ -431,9 +631,15 @@ Lists all custom field definitions configured in Ashby.
| `customFields` | array | List of custom field definitions |
| ↳ `id` | string | Custom field UUID |
| ↳ `title` | string | Custom field title |
| ↳ `isPrivate` | boolean | Whether the custom field is private |
| ↳ `fieldType` | string | Field data type \(MultiValueSelect, NumberRange, String, Date, ValueSelect, Number, Currency, Boolean, LongText, CompensationRange\) |
| ↳ `objectType` | string | Object type the field applies to \(Application, Candidate, Employee, Job, Offer, Opening, Talent_Project\) |
| ↳ `isArchived` | boolean | Whether the custom field is archived |
| ↳ `isRequired` | boolean | Whether a value is required |
| ↳ `selectableValues` | array | Selectable values for MultiValueSelect fields \(empty for other field types\) |
| ↳ `label` | string | Display label |
| ↳ `value` | string | Stored value |
| ↳ `isArchived` | boolean | Whether archived |
### `ashby_list_departments`
@@ -452,8 +658,11 @@ Lists all departments in Ashby.
| `departments` | array | List of departments |
| ↳ `id` | string | Department UUID |
| ↳ `name` | string | Department name |
| ↳ `externalName` | string | Candidate-facing name used on job boards |
| ↳ `isArchived` | boolean | Whether the department is archived |
| ↳ `parentId` | string | Parent department UUID |
| ↳ `createdAt` | string | ISO 8601 creation timestamp |
| ↳ `updatedAt` | string | ISO 8601 last update timestamp |
### `ashby_list_interviews`
@@ -475,10 +684,24 @@ Lists interview schedules in Ashby, optionally filtered by application or interv
| --------- | ---- | ----------- |
| `interviewSchedules` | array | List of interview schedules |
| ↳ `id` | string | Interview schedule UUID |
| ↳ `status` | string | Schedule status \(NeedsScheduling, WaitingOnCandidateBooking, Scheduled, Complete, Cancelled, OnHold, etc.\) |
| ↳ `applicationId` | string | Associated application UUID |
| ↳ `interviewStageId` | string | Interview stage UUID |
| ↳ `createdAt` | string | ISO 8601 creation timestamp |
| ↳ `updatedAt` | string | ISO 8601 last update timestamp |
| ↳ `interviewEvents` | array | Scheduled interview events on this schedule |
| ↳ `id` | string | Event UUID |
| ↳ `interviewId` | string | Interview template UUID |
| ↳ `interviewScheduleId` | string | Parent schedule UUID |
| ↳ `interviewerUserIds` | array | User UUIDs of interviewers assigned to the event |
| ↳ `createdAt` | string | Event creation timestamp |
| ↳ `updatedAt` | string | Event last updated timestamp |
| ↳ `startTime` | string | Event start time |
| ↳ `endTime` | string | Event end time |
| ↳ `feedbackLink` | string | URL to submit feedback for the event |
| ↳ `location` | string | Physical location |
| ↳ `meetingLink` | string | Virtual meeting URL |
| ↳ `hasSubmittedFeedback` | boolean | Whether any feedback has been submitted |
| `moreDataAvailable` | boolean | Whether more pages of results exist |
| `nextCursor` | string | Opaque cursor for fetching the next page |
@@ -500,11 +723,22 @@ Lists all job postings in Ashby.
| ↳ `id` | string | Job posting UUID |
| ↳ `title` | string | Job posting title |
| ↳ `jobId` | string | Associated job UUID |
| ↳ `departmentName` | string | Department name |
| ↳ `teamName` | string | Team name |
| ↳ `locationName` | string | Primary location display name |
| ↳ `locationIds` | object | Primary and secondary location UUIDs |
| ↳ `primaryLocationId` | string | Primary location UUID |
| ↳ `secondaryLocationIds` | array | Secondary location UUIDs |
| ↳ `workplaceType` | string | Workplace type \(OnSite, Remote, Hybrid\) |
| ↳ `employmentType` | string | Employment type \(FullTime, PartTime, Intern, Contract, Temporary\) |
| ↳ `isListed` | boolean | Whether the posting is publicly listed |
| ↳ `publishedDate` | string | ISO 8601 published date |
| ↳ `applicationDeadline` | string | ISO 8601 application deadline |
| ↳ `externalLink` | string | External link to the job posting |
| ↳ `applyLink` | string | Direct apply link for the job posting |
| ↳ `compensationTierSummary` | string | Compensation tier summary for job boards |
| ↳ `shouldDisplayCompensationOnJobBoard` | boolean | Whether compensation is shown on the job board |
| ↳ `updatedAt` | string | ISO 8601 last update timestamp |
### `ashby_list_jobs`
@@ -524,14 +758,6 @@ Lists all jobs in an Ashby organization. By default returns Open, Closed, and Ar
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `jobs` | array | List of jobs |
| `moreDataAvailable` | boolean | Whether more pages of results exist |
| `nextCursor` | string | Opaque cursor for fetching the next page |
@@ -552,12 +778,18 @@ Lists all locations configured in Ashby.
| `locations` | array | List of locations |
| ↳ `id` | string | Location UUID |
| ↳ `name` | string | Location name |
| ↳ `externalName` | string | Candidate-facing name used on job boards |
| ↳ `isArchived` | boolean | Whether the location is archived |
| ↳ `isRemote` | boolean | Whether the location is remote \(use workplaceType instead\) |
| ↳ `workplaceType` | string | Workplace type \(OnSite, Hybrid, Remote\) |
| ↳ `parentLocationId` | string | Parent location UUID |
| ↳ `type` | string | Location component type \(Location, LocationHierarchy\) |
| ↳ `address` | object | Location postal address |
| ↳ `addressCountry` | string | Country |
| ↳ `addressRegion` | string | State or region |
| ↳ `addressLocality` | string | City or locality |
| ↳ `postalCode` | string | Postal code |
| ↳ `streetAddress` | string | Street address |
### `ashby_list_notes`
@@ -579,6 +811,7 @@ Lists all notes on a candidate with pagination support.
| `notes` | array | List of notes on the candidate |
| ↳ `id` | string | Note UUID |
| ↳ `content` | string | Note content |
| ↳ `isPrivate` | boolean | Whether the note is private |
| ↳ `author` | object | Note author |
| ↳ `id` | string | Author user UUID |
| ↳ `firstName` | string | First name |
@@ -605,16 +838,6 @@ Lists all offers with their latest version in an Ashby organization.
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `offers` | array | List of offers |
| `moreDataAvailable` | boolean | Whether more pages of results exist |
| `nextCursor` | string | Opaque cursor for fetching the next page |
@@ -634,12 +857,6 @@ Lists all openings in Ashby with pagination.
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `openings` | array | List of openings |
| `moreDataAvailable` | boolean | Whether more pages of results exist |
| `nextCursor` | string | Opaque cursor for fetching the next page |
@@ -661,6 +878,10 @@ Lists all candidate sources configured in Ashby.
| ↳ `id` | string | Source UUID |
| ↳ `title` | string | Source title |
| ↳ `isArchived` | boolean | Whether the source is archived |
| ↳ `sourceType` | object | Source type grouping |
| ↳ `id` | string | Source type UUID |
| ↳ `title` | string | Source type title |
| ↳ `isArchived` | boolean | Whether archived |
### `ashby_list_users`
@@ -679,18 +900,12 @@ Lists all users in Ashby with pagination.
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `users` | array | List of users |
| `moreDataAvailable` | boolean | Whether more pages of results exist |
| `nextCursor` | string | Opaque cursor for fetching the next page |
### `ashby_remove_candidate_tag`
Removes a tag from a candidate in Ashby and returns the updated candidate.
#### Input
@@ -704,7 +919,37 @@ Removes a tag from a candidate in Ashby.
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `success` | boolean | Whether the tag was successfully removed |
| `candidates` | json | List of candidates with rich fields \(id, name, primaryEmailAddress, primaryPhoneNumber, emailAddresses\[\], phoneNumbers\[\], socialLinks\[\], linkedInUrl, githubUrl, profileUrl, position, company, school, timezone, location with locationComponents\[\], tags\[\], applicationIds\[\], customFields\[\], resumeFileHandle, fileHandles\[\], source with sourceType, creditedToUser, fraudStatus, createdAt, updatedAt\) |
| `jobs` | json | List of jobs \(id, title, confidential, status, employmentType, locationId, departmentId, defaultInterviewPlanId, interviewPlanIds\[\], customFields\[\], jobPostingIds\[\], customRequisitionId, brandId, hiringTeam\[\], author, createdAt, updatedAt, openedAt, closedAt, location with address, openings\[\] with latestVersion, compensation with compensationTiers\[\]\) |
| `applications` | json | List of applications \(id, status, customFields\[\], candidate summary, currentInterviewStage, source with sourceType, archiveReason with customFields\[\], archivedAt, job summary, creditedToUser, hiringTeam\[\], appliedViaJobPostingId, submitterClientIp, submitterUserAgent, createdAt, updatedAt\) |
| `notes` | json | List of notes \(id, content, author, isPrivate, createdAt\) |
| `offers` | json | List of offers \(id, decidedAt, applicationId, acceptanceStatus, offerStatus, latestVersion with id/startDate/salary/createdAt/openingId/customFields\[\]/fileHandles\[\]/author/approvalStatus\) |
| `archiveReasons` | json | List of archive reasons \(id, text, reasonType \[RejectedByCandidate/RejectedByOrg/Other\], isArchived\) |
| `sources` | json | List of sources \(id, title, isArchived, sourceType \{id, title, isArchived\}\) |
| `customFields` | json | List of custom field definitions \(id, title, isPrivate, fieldType, objectType, isArchived, isRequired, selectableValues\[\] \{label, value, isArchived\}\) |
| `departments` | json | List of departments \(id, name, externalName, isArchived, parentId, createdAt, updatedAt\) |
| `locations` | json | List of locations \(id, name, externalName, isArchived, isRemote, workplaceType, parentLocationId, type, address with addressCountry/Region/Locality/postalCode/streetAddress\) |
| `jobPostings` | json | List of job postings \(id, title, jobId, departmentName, teamName, locationName, locationIds, workplaceType, employmentType, isListed, publishedDate, applicationDeadline, externalLink, applyLink, compensationTierSummary, shouldDisplayCompensationOnJobBoard, updatedAt\) |
| `openings` | json | List of openings \(id, openedAt, closedAt, isArchived, archivedAt, closeReasonId, openingState, latestVersion with identifier/description/authorId/createdAt/teamId/jobIds\[\]/targetHireDate/targetStartDate/isBackfill/employmentType/locationIds\[\]/hiringTeam\[\]/customFields\[\]\) |
| `users` | json | List of users \(id, firstName, lastName, email, globalRole, isEnabled, updatedAt, managerId\) |
| `interviewSchedules` | json | List of interview schedules \(id, applicationId, interviewStageId, interviewEvents\[\] with interviewerUserIds/startTime/endTime/feedbackLink/location/meetingLink/hasSubmittedFeedback, status, scheduledBy, createdAt, updatedAt\) |
| `tags` | json | List of candidate tags \(id, title, isArchived\) |
| `id` | string | Resource UUID |
| `name` | string | Resource name |
| `title` | string | Job title or job posting title |
| `status` | string | Status |
| `candidate` | json | Candidate details \(id, name, primaryEmailAddress, primaryPhoneNumber, emailAddresses\[\], phoneNumbers\[\], socialLinks\[\], customFields\[\], source, creditedToUser, createdAt, updatedAt\) |
| `job` | json | Job details \(id, title, status, employmentType, locationId, departmentId, hiringTeam\[\], author, location, openings\[\], compensation, createdAt, updatedAt\) |
| `application` | json | Application details \(id, status, customFields\[\], candidate, currentInterviewStage, source, archiveReason, job, hiringTeam\[\], createdAt, updatedAt\) |
| `offer` | json | Offer details \(id, decidedAt, applicationId, acceptanceStatus, offerStatus, latestVersion\) |
| `jobPosting` | json | Job posting details \(id, title, descriptionPlain, descriptionHtml, descriptionSocial, descriptionParts, departmentName, teamName, teamNameHierarchy\[\], jobId, locationName, locationIds, linkedData, address, isRemote, workplaceType, employmentType, isListed, publishedDate, applicationDeadline, externalLink, applyLink, compensation, updatedAt\) |
| `content` | string | Note content |
| `author` | json | Note author \(id, firstName, lastName, email, globalRole, isEnabled\) |
| `isPrivate` | boolean | Whether the note is private |
| `createdAt` | string | ISO 8601 creation timestamp |
| `moreDataAvailable` | boolean | Whether more pages exist |
| `nextCursor` | string | Pagination cursor for next page |
| `syncToken` | string | Sync token for incremental updates |
### `ashby_search_candidates`
@@ -723,18 +968,6 @@ Searches for candidates by name and/or email with AND logic. Results are limited
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `candidates` | array | Matching candidates \(max 100 results\) |
| ↳ `id` | string | Candidate UUID |
| ↳ `name` | string | Full name |
| ↳ `primaryEmailAddress` | object | Primary email contact info |
| ↳ `value` | string | Email address |
| ↳ `type` | string | Contact type \(Personal, Work, Other\) |
| ↳ `isPrimary` | boolean | Whether this is the primary email |
| ↳ `primaryPhoneNumber` | object | Primary phone contact info |
| ↳ `value` | string | Phone number |
| ↳ `type` | string | Contact type \(Personal, Work, Other\) |
| ↳ `isPrimary` | boolean | Whether this is the primary phone |
| ↳ `createdAt` | string | ISO 8601 creation timestamp |
| ↳ `updatedAt` | string | ISO 8601 last update timestamp |
### `ashby_update_candidate`
@@ -758,26 +991,36 @@ Updates an existing candidate record in Ashby. Only provided fields are changed.
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `id` | string | Candidate UUID |
| `name` | string | Full name |
| `primaryEmailAddress` | object | Primary email contact info |
| ↳ `value` | string | Email address |
| ↳ `type` | string | Contact type \(Personal, Work, Other\) |
| ↳ `isPrimary` | boolean | Whether this is the primary email |
| `primaryPhoneNumber` | object | Primary phone contact info |
| ↳ `value` | string | Phone number |
| ↳ `type` | string | Contact type \(Personal, Work, Other\) |
| ↳ `isPrimary` | boolean | Whether this is the primary phone |
| `profileUrl` | string | URL to the candidate's Ashby profile |
| `position` | string | Current position or title |
| `company` | string | Current company |
| `linkedInUrl` | string | LinkedIn profile URL |
| `githubUrl` | string | GitHub profile URL |
| `tags` | array | Tags applied to the candidate |
| ↳ `id` | string | Tag UUID |
| ↳ `title` | string | Tag title |
| `applicationIds` | array | IDs of associated applications |
| `candidates` | json | List of candidates with rich fields \(id, name, primaryEmailAddress, primaryPhoneNumber, emailAddresses\[\], phoneNumbers\[\], socialLinks\[\], linkedInUrl, githubUrl, profileUrl, position, company, school, timezone, location with locationComponents\[\], tags\[\], applicationIds\[\], customFields\[\], resumeFileHandle, fileHandles\[\], source with sourceType, creditedToUser, fraudStatus, createdAt, updatedAt\) |
| `jobs` | json | List of jobs \(id, title, confidential, status, employmentType, locationId, departmentId, defaultInterviewPlanId, interviewPlanIds\[\], customFields\[\], jobPostingIds\[\], customRequisitionId, brandId, hiringTeam\[\], author, createdAt, updatedAt, openedAt, closedAt, location with address, openings\[\] with latestVersion, compensation with compensationTiers\[\]\) |
| `applications` | json | List of applications \(id, status, customFields\[\], candidate summary, currentInterviewStage, source with sourceType, archiveReason with customFields\[\], archivedAt, job summary, creditedToUser, hiringTeam\[\], appliedViaJobPostingId, submitterClientIp, submitterUserAgent, createdAt, updatedAt\) |
| `notes` | json | List of notes \(id, content, author, isPrivate, createdAt\) |
| `offers` | json | List of offers \(id, decidedAt, applicationId, acceptanceStatus, offerStatus, latestVersion with id/startDate/salary/createdAt/openingId/customFields\[\]/fileHandles\[\]/author/approvalStatus\) |
| `archiveReasons` | json | List of archive reasons \(id, text, reasonType \[RejectedByCandidate/RejectedByOrg/Other\], isArchived\) |
| `sources` | json | List of sources \(id, title, isArchived, sourceType \{id, title, isArchived\}\) |
| `customFields` | json | List of custom field definitions \(id, title, isPrivate, fieldType, objectType, isArchived, isRequired, selectableValues\[\] \{label, value, isArchived\}\) |
| `departments` | json | List of departments \(id, name, externalName, isArchived, parentId, createdAt, updatedAt\) |
| `locations` | json | List of locations \(id, name, externalName, isArchived, isRemote, workplaceType, parentLocationId, type, address with addressCountry/Region/Locality/postalCode/streetAddress\) |
| `jobPostings` | json | List of job postings \(id, title, jobId, departmentName, teamName, locationName, locationIds, workplaceType, employmentType, isListed, publishedDate, applicationDeadline, externalLink, applyLink, compensationTierSummary, shouldDisplayCompensationOnJobBoard, updatedAt\) |
| `openings` | json | List of openings \(id, openedAt, closedAt, isArchived, archivedAt, closeReasonId, openingState, latestVersion with identifier/description/authorId/createdAt/teamId/jobIds\[\]/targetHireDate/targetStartDate/isBackfill/employmentType/locationIds\[\]/hiringTeam\[\]/customFields\[\]\) |
| `users` | json | List of users \(id, firstName, lastName, email, globalRole, isEnabled, updatedAt, managerId\) |
| `interviewSchedules` | json | List of interview schedules \(id, applicationId, interviewStageId, interviewEvents\[\] with interviewerUserIds/startTime/endTime/feedbackLink/location/meetingLink/hasSubmittedFeedback, status, scheduledBy, createdAt, updatedAt\) |
| `tags` | json | List of candidate tags \(id, title, isArchived\) |
| `id` | string | Resource UUID |
| `name` | string | Resource name |
| `title` | string | Job title or job posting title |
| `status` | string | Status |
| `candidate` | json | Candidate details \(id, name, primaryEmailAddress, primaryPhoneNumber, emailAddresses\[\], phoneNumbers\[\], socialLinks\[\], customFields\[\], source, creditedToUser, createdAt, updatedAt\) |
| `job` | json | Job details \(id, title, status, employmentType, locationId, departmentId, hiringTeam\[\], author, location, openings\[\], compensation, createdAt, updatedAt\) |
| `application` | json | Application details \(id, status, customFields\[\], candidate, currentInterviewStage, source, archiveReason, job, hiringTeam\[\], createdAt, updatedAt\) |
| `offer` | json | Offer details \(id, decidedAt, applicationId, acceptanceStatus, offerStatus, latestVersion\) |
| `jobPosting` | json | Job posting details \(id, title, descriptionPlain, descriptionHtml, descriptionSocial, descriptionParts, departmentName, teamName, teamNameHierarchy\[\], jobId, locationName, locationIds, linkedData, address, isRemote, workplaceType, employmentType, isListed, publishedDate, applicationDeadline, externalLink, applyLink, compensation, updatedAt\) |
| `content` | string | Note content |
| `author` | json | Note author \(id, firstName, lastName, email, globalRole, isEnabled\) |
| `isPrivate` | boolean | Whether the note is private |
| `createdAt` | string | ISO 8601 creation timestamp |
| `updatedAt` | string | ISO 8601 last update timestamp |
| `moreDataAvailable` | boolean | Whether more pages exist |
| `nextCursor` | string | Pagination cursor for next page |
| `syncToken` | string | Sync token for incremental updates |


@@ -42,9 +42,18 @@ Runs a browser automation task using BrowserUse
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `task` | string | Yes | What should the browser agent do |
| `variables` | json | No | Optional variables to use as secrets \(format: \{key: value\}\) |
| `save_browser_data` | boolean | No | Whether to save browser data |
| `model` | string | No | LLM model to use \(default: gpt-4o\) |
| `startUrl` | string | No | Initial page URL to start the agent on \(reduces navigation steps\) |
| `variables` | json | No | Optional secrets injected into the task \(format: \{key: value\}\) |
| `allowedDomains` | string | No | Comma-separated list of domains the agent is allowed to visit |
| `maxSteps` | number | No | Maximum number of steps the agent may take \(default 100, max 10000\) |
| `flashMode` | boolean | No | Enable flash mode \(faster, less careful navigation\) |
| `thinking` | boolean | No | Enable extended reasoning mode |
| `vision` | string | No | Vision capability: "true", "false", or "auto" |
| `systemPromptExtension` | string | No | Optional text appended to the agent system prompt \(max 2000 chars\) |
| `structuredOutput` | string | No | Stringified JSON schema for the structured output |
| `highlightElements` | boolean | No | Highlight interactive elements on the page \(default true\) |
| `metadata` | json | No | Custom key-value metadata \(up to 10 pairs\) for tracking |
| `model` | string | No | LLM model identifier \(e.g. browser-use-2.0\) |
| `apiKey` | string | Yes | API key for BrowserUse API |
| `profile_id` | string | No | Browser profile ID for persistent sessions \(cookies, login state\) |
@@ -54,7 +63,18 @@ Runs a browser automation task using BrowserUse
| --------- | ---- | ----------- |
| `id` | string | Task execution identifier |
| `success` | boolean | Task completion status |
| `output` | json | Task output data |
| `steps` | json | Execution steps taken |
| `output` | json | Final task output \(string or structured\) |
| `steps` | array | Steps the agent executed \(number, memory, nextGoal, url, actions, duration\) |
| ↳ `number` | number | Sequential step number |
| ↳ `memory` | string | Agent memory at this step |
| ↳ `evaluationPreviousGoal` | string | Evaluation of previous goal completion |
| ↳ `nextGoal` | string | Goal for the next step |
| ↳ `url` | string | Current URL of the browser |
| ↳ `screenshotUrl` | string | Optional screenshot URL |
| ↳ `actions` | array | Stringified JSON actions performed |
| ↳ `duration` | number | Step duration in seconds |
| `liveUrl` | string | Embeddable live browser session URL \(active during execution\) |
| `shareUrl` | string | Public shareable URL for the recorded session \(post-run\) |
| `sessionId` | string | Browser Use session identifier |
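A minimal sketch of assembling the task parameters above, assuming the block forwards them to the Browser Use API as-is. Note the two normalization quirks the table calls out: `structuredOutput` must be a *stringified* JSON schema, and `allowedDomains` a comma-separated string. The helper name and defaults here are illustrative, not part of the API.

```python
import json

def build_browseruse_payload(task, structured_schema=None, allowed_domains=None,
                             max_steps=100, start_url=None):
    """Assemble a task payload matching the parameter table above.

    structuredOutput must be a stringified JSON schema, and
    allowedDomains a comma-separated string, so both are normalized here.
    """
    payload = {"task": task, "maxSteps": max_steps}
    if start_url:
        payload["startUrl"] = start_url
    if allowed_domains:
        # Accept a list or a pre-joined string; the parameter expects CSV.
        if isinstance(allowed_domains, (list, tuple)):
            allowed_domains = ",".join(allowed_domains)
        payload["allowedDomains"] = allowed_domains
    if structured_schema is not None:
        payload["structuredOutput"] = json.dumps(structured_schema)
    return payload

payload = build_browseruse_payload(
    "Find the pricing page and extract plan names",
    structured_schema={"type": "object", "properties": {"plans": {"type": "array"}}},
    allowed_domains=["example.com", "www.example.com"],
)
```

After the run, `liveUrl` is only populated while the session is active, while `shareUrl` points at the recorded session.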


@@ -57,9 +57,12 @@ Run a CloudWatch Log Insights query against one or more log groups
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `results` | array | Query result rows |
| `statistics` | object | Query statistics \(bytesScanned, recordsMatched, recordsScanned\) |
| `status` | string | Query completion status |
| `results` | array | Query result rows \(each row is a key/value map of field name to value\) |
| `statistics` | object | Query statistics |
| ↳ `bytesScanned` | number | Total bytes of log data scanned |
| ↳ `recordsMatched` | number | Number of log records that matched the query |
| ↳ `recordsScanned` | number | Total log records scanned |
| `status` | string | Query completion status \(Complete, Failed, Cancelled, or Timeout\) |
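Since each result row is a key/value map of field name to value, post-processing is a plain dictionary walk. A minimal sketch (helper name and sample fields are illustrative):

```python
def field_values(results, field):
    """Collect one field across Log Insights result rows.

    Assumes each row is already flattened into a {field: value} map,
    as the output table above describes; rows missing the field are skipped.
    """
    return [row[field] for row in results if field in row]

rows = [
    {"@timestamp": "2026-04-27 00:00:00", "@message": "error: timeout"},
    {"@timestamp": "2026-04-27 00:01:00", "@message": "ok"},
]
messages = field_values(rows, "@message")
```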
### `cloudwatch_describe_log_groups`
@@ -80,6 +83,11 @@ List available CloudWatch log groups
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `logGroups` | array | List of CloudWatch log groups with metadata |
| ↳ `logGroupName` | string | Log group name |
| ↳ `arn` | string | Log group ARN |
| ↳ `storedBytes` | number | Total stored bytes |
| ↳ `retentionInDays` | number | Retention period in days \(if set\) |
| ↳ `creationTime` | number | Creation time in epoch milliseconds |
### `cloudwatch_get_log_events`
@@ -103,6 +111,9 @@ Retrieve log events from a specific CloudWatch log stream
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `events` | array | Log events with timestamp, message, and ingestion time |
| ↳ `timestamp` | number | Event timestamp in epoch milliseconds |
| ↳ `message` | string | Log event message |
| ↳ `ingestionTime` | number | Ingestion time in epoch milliseconds |
### `cloudwatch_describe_log_streams`
@@ -123,7 +134,12 @@ List log streams within a CloudWatch log group
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `logStreams` | array | List of log streams with metadata |
| `logStreams` | array | List of log streams with metadata, sorted by last event time \(most recent first\) unless a prefix filter is applied |
| ↳ `logStreamName` | string | Log stream name |
| ↳ `lastEventTimestamp` | number | Timestamp of the last log event in epoch milliseconds |
| ↳ `firstEventTimestamp` | number | Timestamp of the first log event in epoch milliseconds |
| ↳ `creationTime` | number | Stream creation time in epoch milliseconds |
| ↳ `storedBytes` | number | Total stored bytes |
### `cloudwatch_list_metrics`
@@ -146,6 +162,9 @@ List available CloudWatch metrics
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `metrics` | array | List of metrics with namespace, name, and dimensions |
| ↳ `namespace` | string | Metric namespace \(e.g., AWS/EC2\) |
| ↳ `metricName` | string | Metric name \(e.g., CPUUtilization\) |
| ↳ `dimensions` | array | Array of name/value dimension pairs |
### `cloudwatch_get_metric_statistics`
@@ -170,8 +189,15 @@ Get statistics for a CloudWatch metric over a time range
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `label` | string | Metric label |
| `datapoints` | array | Datapoints with timestamp and statistics values |
| `label` | string | Metric label returned by CloudWatch |
| `datapoints` | array | Datapoints sorted by timestamp with statistics values |
| ↳ `timestamp` | number | Datapoint timestamp in epoch milliseconds |
| ↳ `average` | number | Average statistic value |
| ↳ `sum` | number | Sum statistic value |
| ↳ `minimum` | number | Minimum statistic value |
| ↳ `maximum` | number | Maximum statistic value |
| ↳ `sampleCount` | number | Sample count statistic value |
| ↳ `unit` | string | Unit of the metric |
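A short sketch of consuming the datapoint shape above, e.g. finding the peak of the `average` statistic. Only statistics you requested are present on each datapoint, so the code filters for the field first; the helper name and sample values are illustrative.

```python
def peak_average(datapoints):
    """Return (timestamp, average) of the highest Average datapoint.

    Field names follow the output table above; datapoints arrive sorted
    by timestamp, but peak detection does not rely on that order.
    """
    with_avg = [dp for dp in datapoints if "average" in dp]
    if not with_avg:
        return None
    best = max(with_avg, key=lambda dp: dp["average"])
    return best["timestamp"], best["average"]

points = [
    {"timestamp": 1714176000000, "average": 41.2, "unit": "Percent"},
    {"timestamp": 1714176300000, "average": 67.9, "unit": "Percent"},
    {"timestamp": 1714176600000, "average": 55.0, "unit": "Percent"},
]
```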
### `cloudwatch_put_metric_data`
@@ -222,5 +248,13 @@ List and filter CloudWatch alarms
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `alarms` | array | List of CloudWatch alarms with state and configuration |
| ↳ `alarmName` | string | Alarm name |
| ↳ `alarmArn` | string | Alarm ARN |
| ↳ `stateValue` | string | Current state \(OK, ALARM, INSUFFICIENT_DATA\) |
| ↳ `stateReason` | string | Human-readable reason for the state |
| ↳ `metricName` | string | Metric name \(MetricAlarm only\) |
| ↳ `namespace` | string | Metric namespace \(MetricAlarm only\) |
| ↳ `threshold` | number | Threshold value \(MetricAlarm only\) |
| ↳ `stateUpdatedTimestamp` | number | Timestamp of the last state change in epoch milliseconds |


@@ -1,6 +1,6 @@
---
title: Amazon DynamoDB
description: Connect to Amazon DynamoDB
description: Get, put, query, scan, update, and delete items in Amazon DynamoDB tables
---
import { BlockInfoCard } from "@/components/ui/block-info-card"
@@ -55,7 +55,7 @@ Get an item from a DynamoDB table by primary key
| `accessKeyId` | string | Yes | AWS access key ID |
| `secretAccessKey` | string | Yes | AWS secret access key |
| `tableName` | string | Yes | DynamoDB table name \(e.g., "Users", "Orders"\) |
| `key` | object | Yes | Primary key of the item to retrieve \(e.g., \{"pk": "USER#123"\} or \{"pk": "ORDER#456", "sk": "ITEM#789"\}\) |
| `key` | json | Yes | Primary key of the item to retrieve \(e.g., \{"pk": "USER#123"\} or \{"pk": "ORDER#456", "sk": "ITEM#789"\}\) |
| `consistentRead` | boolean | No | Use strongly consistent read |
#### Output
@@ -63,7 +63,7 @@ Get an item from a DynamoDB table by primary key
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `message` | string | Operation status message |
| `item` | object | Retrieved item |
| `item` | json | Retrieved item |
### `dynamodb_put`
@@ -77,14 +77,17 @@ Put an item into a DynamoDB table
| `accessKeyId` | string | Yes | AWS access key ID |
| `secretAccessKey` | string | Yes | AWS secret access key |
| `tableName` | string | Yes | DynamoDB table name \(e.g., "Users", "Orders"\) |
| `item` | object | Yes | Item to put into the table \(e.g., \{"pk": "USER#123", "name": "John", "email": "john@example.com"\}\) |
| `item` | json | Yes | Item to put into the table \(e.g., \{"pk": "USER#123", "name": "John", "email": "john@example.com"\}\) |
| `conditionExpression` | string | No | Condition that must be met for the put to succeed \(e.g., "attribute_not_exists\(pk\)" to prevent overwrites\) |
| `expressionAttributeNames` | json | No | Attribute name mappings for reserved words used in conditionExpression \(e.g., \{"#name": "name"\}\) |
| `expressionAttributeValues` | json | No | Expression attribute values used in conditionExpression \(e.g., \{":expected": "value"\}\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `message` | string | Operation status message |
| `item` | object | Created item |
| `item` | json | Created item |
### `dynamodb_query`
@@ -100,10 +103,12 @@ Query items from a DynamoDB table using key conditions
| `tableName` | string | Yes | DynamoDB table name \(e.g., "Users", "Orders"\) |
| `keyConditionExpression` | string | Yes | Key condition expression \(e.g., "pk = :pk" or "pk = :pk AND sk BEGINS_WITH :prefix"\) |
| `filterExpression` | string | No | Filter expression for results \(e.g., "age &gt; :minAge AND #status = :status"\) |
| `expressionAttributeNames` | object | No | Attribute name mappings for reserved words \(e.g., \{"#status": "status"\}\) |
| `expressionAttributeValues` | object | No | Expression attribute values \(e.g., \{":pk": "USER#123", ":minAge": 18\}\) |
| `expressionAttributeNames` | json | No | Attribute name mappings for reserved words \(e.g., \{"#status": "status"\}\) |
| `expressionAttributeValues` | json | No | Expression attribute values \(e.g., \{":pk": "USER#123", ":minAge": 18\}\) |
| `indexName` | string | No | Secondary index name to query \(e.g., "GSI1", "email-index"\) |
| `limit` | number | No | Maximum number of items to return \(e.g., 10, 50, 100\) |
| `exclusiveStartKey` | json | No | Pagination token from a previous query's lastEvaluatedKey to continue fetching results |
| `scanIndexForward` | boolean | No | Sort order for the sort key: true for ascending \(default\), false for descending |
#### Output
@@ -112,6 +117,7 @@ Query items from a DynamoDB table using key conditions
| `message` | string | Operation status message |
| `items` | array | Array of items returned |
| `count` | number | Number of items returned |
| `lastEvaluatedKey` | json | Pagination token to pass as exclusiveStartKey to fetch the next page of results |
### `dynamodb_scan`
@@ -127,9 +133,10 @@ Scan all items in a DynamoDB table
| `tableName` | string | Yes | DynamoDB table name \(e.g., "Users", "Orders"\) |
| `filterExpression` | string | No | Filter expression for results \(e.g., "age &gt; :minAge AND #status = :status"\) |
| `projectionExpression` | string | No | Attributes to retrieve \(e.g., "pk, sk, #name, email"\) |
| `expressionAttributeNames` | object | No | Attribute name mappings for reserved words \(e.g., \{"#name": "name", "#status": "status"\}\) |
| `expressionAttributeValues` | object | No | Expression attribute values \(e.g., \{":minAge": 18, ":status": "active"\}\) |
| `expressionAttributeNames` | json | No | Attribute name mappings for reserved words \(e.g., \{"#name": "name", "#status": "status"\}\) |
| `expressionAttributeValues` | json | No | Expression attribute values \(e.g., \{":minAge": 18, ":status": "active"\}\) |
| `limit` | number | No | Maximum number of items to return \(e.g., 10, 50, 100\) |
| `exclusiveStartKey` | json | No | Pagination token from a previous scan's lastEvaluatedKey to continue fetching results |
#### Output
@@ -138,6 +145,7 @@ Scan all items in a DynamoDB table
| `message` | string | Operation status message |
| `items` | array | Array of items returned |
| `count` | number | Number of items returned |
| `lastEvaluatedKey` | json | Pagination token to pass as exclusiveStartKey to fetch the next page of results |
### `dynamodb_update`
@@ -151,10 +159,10 @@ Update an item in a DynamoDB table
| `accessKeyId` | string | Yes | AWS access key ID |
| `secretAccessKey` | string | Yes | AWS secret access key |
| `tableName` | string | Yes | DynamoDB table name \(e.g., "Users", "Orders"\) |
| `key` | object | Yes | Primary key of the item to update \(e.g., \{"pk": "USER#123"\} or \{"pk": "ORDER#456", "sk": "ITEM#789"\}\) |
| `key` | json | Yes | Primary key of the item to update \(e.g., \{"pk": "USER#123"\} or \{"pk": "ORDER#456", "sk": "ITEM#789"\}\) |
| `updateExpression` | string | Yes | Update expression \(e.g., "SET #name = :name, age = :age" or "SET #count = #count + :inc"\) |
| `expressionAttributeNames` | object | No | Attribute name mappings for reserved words \(e.g., \{"#name": "name", "#count": "count"\}\) |
| `expressionAttributeValues` | object | No | Expression attribute values \(e.g., \{":name": "John", ":age": 30, ":inc": 1\}\) |
| `expressionAttributeNames` | json | No | Attribute name mappings for reserved words \(e.g., \{"#name": "name", "#count": "count"\}\) |
| `expressionAttributeValues` | json | No | Expression attribute values \(e.g., \{":name": "John", ":age": 30, ":inc": 1\}\) |
| `conditionExpression` | string | No | Condition that must be met for the update to succeed \(e.g., "attribute_exists\(pk\)" or "version = :expectedVersion"\) |
#### Output
@@ -162,7 +170,7 @@ Update an item in a DynamoDB table
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `message` | string | Operation status message |
| `item` | object | Updated item |
| `item` | json | Updated item with all attributes |
### `dynamodb_delete`
@@ -176,8 +184,10 @@ Delete an item from a DynamoDB table
| `accessKeyId` | string | Yes | AWS access key ID |
| `secretAccessKey` | string | Yes | AWS secret access key |
| `tableName` | string | Yes | DynamoDB table name \(e.g., "Users", "Orders"\) |
| `key` | object | Yes | Primary key of the item to delete \(e.g., \{"pk": "USER#123"\} or \{"pk": "ORDER#456", "sk": "ITEM#789"\}\) |
| `key` | json | Yes | Primary key of the item to delete \(e.g., \{"pk": "USER#123"\} or \{"pk": "ORDER#456", "sk": "ITEM#789"\}\) |
| `conditionExpression` | string | No | Condition that must be met for the delete to succeed \(e.g., "attribute_exists\(pk\)"\) |
| `expressionAttributeNames` | json | No | Attribute name mappings for reserved words used in conditionExpression \(e.g., \{"#status": "status"\}\) |
| `expressionAttributeValues` | json | No | Expression attribute values used in conditionExpression \(e.g., \{":status": "active"\}\) |
#### Output
@@ -204,6 +214,6 @@ Introspect DynamoDB to list tables or get detailed schema information for a spec
| --------- | ---- | ----------- |
| `message` | string | Operation status message |
| `tables` | array | List of table names in the region |
| `tableDetails` | object | Detailed schema information for a specific table |
| `tableDetails` | json | Detailed schema information for a specific table |


@@ -68,7 +68,7 @@ Get detailed information about an IAM user
| `region` | string | Yes | AWS region \(e.g., us-east-1\) |
| `accessKeyId` | string | Yes | AWS access key ID |
| `secretAccessKey` | string | Yes | AWS secret access key |
| `userName` | string | Yes | The name of the IAM user to retrieve |
| `userName` | string | No | The name of the IAM user to retrieve \(defaults to the caller if omitted\) |
#### Output
@@ -440,4 +440,80 @@ Remove an IAM user from a group
| --------- | ---- | ----------- |
| `message` | string | Operation status message |
### `iam_list_attached_role_policies`
List all managed policies attached to an IAM role
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `region` | string | Yes | AWS region \(e.g., us-east-1\) |
| `accessKeyId` | string | Yes | AWS access key ID |
| `secretAccessKey` | string | Yes | AWS secret access key |
| `roleName` | string | Yes | Name of the IAM role |
| `pathPrefix` | string | No | Path prefix to filter policies \(e.g., /application/\) |
| `maxItems` | number | No | Maximum number of policies to return \(1-1000\) |
| `marker` | string | No | Pagination marker from a previous request |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `attachedPolicies` | json | List of attached policies with policyName and policyArn |
| `isTruncated` | boolean | Whether there are more results available |
| `marker` | string | Pagination marker for the next page of results |
| `count` | number | Number of attached policies returned |
### `iam_list_attached_user_policies`
List all managed policies attached to an IAM user
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `region` | string | Yes | AWS region \(e.g., us-east-1\) |
| `accessKeyId` | string | Yes | AWS access key ID |
| `secretAccessKey` | string | Yes | AWS secret access key |
| `userName` | string | Yes | Name of the IAM user |
| `pathPrefix` | string | No | Path prefix to filter policies \(e.g., /application/\) |
| `maxItems` | number | No | Maximum number of policies to return \(1-1000\) |
| `marker` | string | No | Pagination marker from a previous request |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `attachedPolicies` | json | List of attached policies with policyName and policyArn |
| `isTruncated` | boolean | Whether there are more results available |
| `marker` | string | Pagination marker for the next page of results |
| `count` | number | Number of attached policies returned |
### `iam_simulate_principal_policy`
Simulate whether a user, role, or group is allowed to perform specific AWS actions — useful for pre-flight access checks
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `region` | string | Yes | AWS region \(e.g., us-east-1\) |
| `accessKeyId` | string | Yes | AWS access key ID |
| `secretAccessKey` | string | Yes | AWS secret access key |
| `policySourceArn` | string | Yes | ARN of the user, group, or role to simulate \(e.g., arn:aws:iam::123456789012:user/alice\) |
| `actionNames` | string | Yes | Comma-separated list of AWS actions to simulate \(e.g., s3:GetObject,ec2:DescribeInstances\) |
| `resourceArns` | string | No | Comma-separated list of resource ARNs to simulate against \(defaults to * if not provided\) |
| `maxResults` | number | No | Maximum number of simulation results to return \(1-1000\) |
| `marker` | string | No | Pagination marker from a previous request |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `evaluationResults` | json | Simulation results per action: evalActionName, evalResourceName, evalDecision \(allowed/explicitDeny/implicitDeny\), matchedStatements \(sourcePolicyId, sourcePolicyType\), missingContextValues |
| `isTruncated` | boolean | Whether there are more results available |
| `marker` | string | Pagination marker for the next page of results |
| `count` | number | Number of evaluation results returned |
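For a pre-flight gate, the usual check is whether *every* simulated action came back `allowed`. A minimal sketch over the `evaluationResults` shape above (helper name and sample data are illustrative):

```python
def denied_actions(evaluation_results):
    """List actions that did not evaluate to 'allowed'.

    evalDecision values follow the output table above:
    allowed, explicitDeny, or implicitDeny.
    """
    return [
        r["evalActionName"]
        for r in evaluation_results
        if r.get("evalDecision") != "allowed"
    ]

results = [
    {"evalActionName": "s3:GetObject", "evalDecision": "allowed"},
    {"evalActionName": "ec2:DescribeInstances", "evalDecision": "implicitDeny"},
]
```

An empty return list means the principal passed the pre-flight check for all requested actions.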


@@ -0,0 +1,340 @@
---
title: AWS Identity Center
description: Manage temporary elevated access in AWS IAM Identity Center
---
import { BlockInfoCard } from "@/components/ui/block-info-card"
<BlockInfoCard
type="identity_center"
color="linear-gradient(45deg, #BD0816 0%, #FF5252 100%)"
/>
{/* MANUAL-CONTENT-START:intro */}
[AWS IAM Identity Center](https://aws.amazon.com/iam/identity-center/) (formerly AWS Single Sign-On) is the recommended service for managing workforce access to multiple AWS accounts and applications. It provides a central place to assign users and groups temporary, permission-scoped access to AWS accounts using permission sets — without creating long-lived IAM credentials.
With AWS IAM Identity Center, you can:
- **Provision account assignments**: Grant a user or group access to a specific AWS account with a specific permission set — the core primitive of temporary elevated access
- **Revoke access on demand**: Delete account assignments to immediately remove elevated permissions when they are no longer needed
- **Look up users by email**: Resolve a federated identity (email address) to an Identity Store user ID for programmatic access provisioning
- **List permission sets**: Enumerate the available permission sets (e.g., ReadOnly, PowerUser, AdministratorAccess) defined in your Identity Center instance
- **Monitor assignment status**: Poll the provisioning status of create/delete operations, which are asynchronous in AWS
- **List accounts in your organization**: Enumerate all AWS accounts in your AWS Organizations structure to populate access request dropdowns
- **Manage groups**: List groups and resolve group IDs by display name for group-based access grants
In Sim, the AWS Identity Center integration is designed to power **TEAM (Temporary Elevated Access Management)** workflows — automated pipelines where users request elevated access, approvers approve or deny it, access is provisioned with a time limit, and auto-revocation removes it when the window expires. This replaces manual console-based access management with auditable, agent-driven workflows that integrate with Slack, email, ticketing systems, and CloudTrail for full traceability.
{/* MANUAL-CONTENT-END */}
## Usage Instructions
Provision and revoke temporary access to AWS accounts via IAM Identity Center (SSO). Assign permission sets to users or groups, look up users by email, and list accounts and permission sets for access request workflows.
## Tools
### `identity_center_list_instances`
List all AWS IAM Identity Center instances in your account
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `region` | string | Yes | AWS region \(e.g., us-east-1\) |
| `accessKeyId` | string | Yes | AWS access key ID |
| `secretAccessKey` | string | Yes | AWS secret access key |
| `maxResults` | number | No | Maximum number of instances to return \(1-100\) |
| `nextToken` | string | No | Pagination token from a previous request |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `instances` | json | List of Identity Center instances with instanceArn, identityStoreId, name, status, statusReason |
| `nextToken` | string | Pagination token for the next page of results |
| `count` | number | Number of instances returned |
### `identity_center_list_accounts`
List all AWS accounts in your organization
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `region` | string | Yes | AWS region \(e.g., us-east-1\) |
| `accessKeyId` | string | Yes | AWS access key ID |
| `secretAccessKey` | string | Yes | AWS secret access key |
| `maxResults` | number | No | Maximum number of accounts to return |
| `nextToken` | string | No | Pagination token from a previous request |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `accounts` | json | List of AWS accounts with id, arn, name, email, status |
| `nextToken` | string | Pagination token for the next page of results |
| `count` | number | Number of accounts returned |
### `identity_center_describe_account`
Retrieve details about a specific AWS account by its ID
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `region` | string | Yes | AWS region \(e.g., us-east-1\) |
| `accessKeyId` | string | Yes | AWS access key ID |
| `secretAccessKey` | string | Yes | AWS secret access key |
| `accountId` | string | Yes | AWS account ID to describe |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `id` | string | AWS account ID |
| `arn` | string | AWS account ARN |
| `name` | string | Account name |
| `email` | string | Root email address of the account |
| `status` | string | Account status \(ACTIVE, SUSPENDED, etc.\) |
| `joinedTimestamp` | string | Date the account joined the organization |
### `identity_center_list_permission_sets`
List all permission sets defined in an IAM Identity Center instance
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `region` | string | Yes | AWS region \(e.g., us-east-1\) |
| `accessKeyId` | string | Yes | AWS access key ID |
| `secretAccessKey` | string | Yes | AWS secret access key |
| `instanceArn` | string | Yes | ARN of the Identity Center instance |
| `maxResults` | number | No | Maximum number of permission sets to return |
| `nextToken` | string | No | Pagination token from a previous request |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `permissionSets` | json | List of permission sets with permissionSetArn, name, description, sessionDuration |
| `nextToken` | string | Pagination token for the next page of results |
| `count` | number | Number of permission sets returned |
### `identity_center_get_user`
Look up a user in the Identity Store by email address
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `region` | string | Yes | AWS region \(e.g., us-east-1\) |
| `accessKeyId` | string | Yes | AWS access key ID |
| `secretAccessKey` | string | Yes | AWS secret access key |
| `identityStoreId` | string | Yes | Identity Store ID \(from the Identity Center instance\) |
| `email` | string | Yes | Email address of the user to look up |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `userId` | string | Identity Store user ID \(use as principalId\) |
| `userName` | string | Username in the Identity Store |
| `displayName` | string | Display name of the user |
| `email` | string | Email address of the user |
### `identity_center_get_group`
Look up a group in the Identity Store by display name
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `region` | string | Yes | AWS region \(e.g., us-east-1\) |
| `accessKeyId` | string | Yes | AWS access key ID |
| `secretAccessKey` | string | Yes | AWS secret access key |
| `identityStoreId` | string | Yes | Identity Store ID \(from the Identity Center instance\) |
| `displayName` | string | Yes | Display name of the group to look up |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `groupId` | string | Identity Store group ID \(use as principalId\) |
| `displayName` | string | Display name of the group |
| `description` | string | Group description |
### `identity_center_list_groups`
List all groups in the Identity Store
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `region` | string | Yes | AWS region \(e.g., us-east-1\) |
| `accessKeyId` | string | Yes | AWS access key ID |
| `secretAccessKey` | string | Yes | AWS secret access key |
| `identityStoreId` | string | Yes | Identity Store ID \(from the Identity Center instance\) |
| `maxResults` | number | No | Maximum number of groups to return |
| `nextToken` | string | No | Pagination token from a previous request |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `groups` | json | List of groups with groupId, displayName, description |
| `nextToken` | string | Pagination token for the next page of results |
| `count` | number | Number of groups returned |
### `identity_center_create_account_assignment`
Grant a user or group access to an AWS account via a permission set (temporary elevated access)
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `region` | string | Yes | AWS region \(e.g., us-east-1\) |
| `accessKeyId` | string | Yes | AWS access key ID |
| `secretAccessKey` | string | Yes | AWS secret access key |
| `instanceArn` | string | Yes | ARN of the Identity Center instance |
| `accountId` | string | Yes | AWS account ID to grant access to |
| `permissionSetArn` | string | Yes | ARN of the permission set to assign |
| `principalType` | string | Yes | Type of principal: USER or GROUP |
| `principalId` | string | Yes | Identity Store ID of the user or group |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `message` | string | Status message |
| `status` | string | Provisioning status: IN_PROGRESS, FAILED, or SUCCEEDED |
| `requestId` | string | Request ID to use with Check Assignment Status |
| `accountId` | string | Target AWS account ID |
| `permissionSetArn` | string | Permission set ARN |
| `principalType` | string | Principal type \(USER or GROUP\) |
| `principalId` | string | Principal ID |
| `failureReason` | string | Reason for failure if status is FAILED |
| `createdDate` | string | Date the request was created |
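Because assignment creation returns `IN_PROGRESS` immediately, callers typically poll the matching check-status tool with the returned `requestId` until a terminal state. An illustrative polling loop, where `fake_check` is a stub for `identity_center_check_assignment_status` (names and timings are assumptions, not Sim's implementation):

```python
import time

def wait_for_assignment(check_status, request_id, poll_seconds=2, max_polls=30):
    """Poll until the assignment request reaches SUCCEEDED or FAILED."""
    for _ in range(max_polls):
        result = check_status(requestId=request_id)
        if result["status"] in ("SUCCEEDED", "FAILED"):
            return result
        time.sleep(poll_seconds)
    raise TimeoutError(f"assignment {request_id} still IN_PROGRESS")

# Stub: reports IN_PROGRESS once, then SUCCEEDED
_states = iter(["IN_PROGRESS", "SUCCEEDED"])
def fake_check(requestId):
    return {"requestId": requestId, "status": next(_states)}

final = wait_for_assignment(fake_check, "req-123", poll_seconds=0)
```

The same loop works for deletion requests against `identity_center_check_assignment_deletion_status`.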
### `identity_center_delete_account_assignment`
Revoke a user's or group's access to an AWS account by removing a permission set assignment
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `region` | string | Yes | AWS region \(e.g., us-east-1\) |
| `accessKeyId` | string | Yes | AWS access key ID |
| `secretAccessKey` | string | Yes | AWS secret access key |
| `instanceArn` | string | Yes | ARN of the Identity Center instance |
| `accountId` | string | Yes | AWS account ID to revoke access from |
| `permissionSetArn` | string | Yes | ARN of the permission set to remove |
| `principalType` | string | Yes | Type of principal: USER or GROUP |
| `principalId` | string | Yes | Identity Store ID of the user or group |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `message` | string | Status message |
| `status` | string | Deprovisioning status: IN_PROGRESS, FAILED, or SUCCEEDED |
| `requestId` | string | Request ID to use with Check Assignment Status |
| `accountId` | string | Target AWS account ID |
| `permissionSetArn` | string | Permission set ARN |
| `principalType` | string | Principal type \(USER or GROUP\) |
| `principalId` | string | Principal ID |
| `failureReason` | string | Reason for failure if status is FAILED |
| `createdDate` | string | Date the request was created |
### `identity_center_check_assignment_status`
Check the provisioning status of an account assignment creation request
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `region` | string | Yes | AWS region \(e.g., us-east-1\) |
| `accessKeyId` | string | Yes | AWS access key ID |
| `secretAccessKey` | string | Yes | AWS secret access key |
| `instanceArn` | string | Yes | ARN of the Identity Center instance |
| `requestId` | string | Yes | Request ID returned from Create or Delete Account Assignment |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `message` | string | Human-readable status message |
| `status` | string | Current status: IN_PROGRESS, FAILED, or SUCCEEDED |
| `requestId` | string | The request ID that was checked |
| `accountId` | string | Target AWS account ID |
| `permissionSetArn` | string | Permission set ARN |
| `principalType` | string | Principal type \(USER or GROUP\) |
| `principalId` | string | Principal ID |
| `failureReason` | string | Reason for failure if status is FAILED |
| `createdDate` | string | Date the request was created |
### `identity_center_check_assignment_deletion_status`
Check the deprovisioning status of an account assignment deletion request
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `region` | string | Yes | AWS region \(e.g., us-east-1\) |
| `accessKeyId` | string | Yes | AWS access key ID |
| `secretAccessKey` | string | Yes | AWS secret access key |
| `instanceArn` | string | Yes | ARN of the Identity Center instance |
| `requestId` | string | Yes | Request ID returned from Delete Account Assignment |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `message` | string | Human-readable status message |
| `status` | string | Current deletion status: IN_PROGRESS, FAILED, or SUCCEEDED |
| `requestId` | string | The deletion request ID that was checked |
| `accountId` | string | Target AWS account ID |
| `permissionSetArn` | string | Permission set ARN |
| `principalType` | string | Principal type \(USER or GROUP\) |
| `principalId` | string | Principal ID |
| `failureReason` | string | Reason for failure if status is FAILED |
| `createdDate` | string | Date the request was created |
### `identity_center_list_account_assignments`
List all account assignments for a specific user or group across all accounts
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `region` | string | Yes | AWS region \(e.g., us-east-1\) |
| `accessKeyId` | string | Yes | AWS access key ID |
| `secretAccessKey` | string | Yes | AWS secret access key |
| `instanceArn` | string | Yes | ARN of the Identity Center instance |
| `principalId` | string | Yes | Identity Store ID of the user or group |
| `principalType` | string | Yes | Type of principal: USER or GROUP |
| `maxResults` | number | No | Maximum number of assignments to return |
| `nextToken` | string | No | Pagination token from a previous request |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `assignments` | json | List of account assignments with accountId, permissionSetArn, principalType, principalId |
| `nextToken` | string | Pagination token for the next page of results |
| `count` | number | Number of assignments returned |


@@ -3,6 +3,7 @@
"index",
"a2a",
"agentmail",
"agentphone",
"agiloft",
"ahrefs",
"airtable",
@@ -86,6 +87,7 @@
"huggingface",
"hunter",
"iam",
"identity_center",
"image_generator",
"imap",
"incidentio",
@@ -148,12 +150,14 @@
"rootly",
"s3",
"salesforce",
"sap_s4hana",
"search",
"secrets_manager",
"sendgrid",
"sentry",
"serper",
"servicenow",
"ses",
"sftp",
"sharepoint",
"shopify",

File diff suppressed because it is too large


@@ -0,0 +1,241 @@
---
title: AWS SES
description: Send emails and manage templates with AWS Simple Email Service
---
import { BlockInfoCard } from "@/components/ui/block-info-card"
<BlockInfoCard
type="ses"
color="linear-gradient(45deg, #BD0816 0%, #FF5252 100%)"
/>
{/* MANUAL-CONTENT-START:intro */}
[Amazon Simple Email Service (SES)](https://aws.amazon.com/ses/) is a cloud-based email sending service designed for high-volume, transactional, and marketing email delivery. It provides a cost-effective, scalable way to send email without managing your own mail server infrastructure.
With AWS SES, you can:
- **Send simple emails**: Deliver one-off emails with plain text or HTML body content to individual recipients
- **Send templated emails**: Use pre-defined templates with variable substitution (e.g., `{{name}}`, `{{link}}`) for personalized emails at scale
- **Send bulk emails**: Deliver templated emails to large lists of recipients in a single API call, with per-destination data overrides
- **Manage email templates**: Create, retrieve, list, and delete reusable email templates for transactional and marketing campaigns
- **Monitor account health**: Retrieve your account's sending quota, send rate, and whether sending is currently enabled
In Sim, the AWS SES integration is designed for workflows that need reliable, programmatic email delivery — from access request notifications and approval alerts to bulk outreach and automated reporting. It pairs naturally with the IAM Identity Center integration for TEAM (Temporary Elevated Access Management) workflows, where email notifications are sent when access is provisioned, approved, or revoked.
{/* MANUAL-CONTENT-END */}
## Usage Instructions
Integrate AWS SES v2 into the workflow. Send simple, templated, and bulk emails. Manage email templates and retrieve account sending quota and verified identity information.
## Tools
### `ses_send_email`
Send an email via AWS SES using simple or HTML content
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `region` | string | Yes | AWS region \(e.g., us-east-1\) |
| `accessKeyId` | string | Yes | AWS access key ID |
| `secretAccessKey` | string | Yes | AWS secret access key |
| `fromAddress` | string | Yes | Verified sender email address |
| `toAddresses` | string | Yes | Comma-separated list of recipient email addresses |
| `subject` | string | Yes | Email subject line |
| `bodyText` | string | No | Plain text email body |
| `bodyHtml` | string | No | HTML email body |
| `ccAddresses` | string | No | Comma-separated list of CC email addresses |
| `bccAddresses` | string | No | Comma-separated list of BCC email addresses |
| `replyToAddresses` | string | No | Comma-separated list of reply-to email addresses |
| `configurationSetName` | string | No | SES configuration set name for tracking |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `messageId` | string | SES message ID for the sent email |
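The address parameters are comma-separated strings rather than arrays; a small sketch of the normalization a caller might perform before passing them in (illustrative only):

```python
def split_addresses(value):
    """Turn 'a@x.com, b@y.com' into a trimmed list with empties dropped."""
    if not value:
        return []
    return [addr.strip() for addr in value.split(",") if addr.strip()]

to = split_addresses("ops@example.com,  alerts@example.com, ")
```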
### `ses_send_templated_email`
Send an email using an SES email template with dynamic template data
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `region` | string | Yes | AWS region \(e.g., us-east-1\) |
| `accessKeyId` | string | Yes | AWS access key ID |
| `secretAccessKey` | string | Yes | AWS secret access key |
| `fromAddress` | string | Yes | Verified sender email address |
| `toAddresses` | string | Yes | Comma-separated list of recipient email addresses |
| `templateName` | string | Yes | Name of the SES email template to use |
| `templateData` | string | Yes | JSON string of key-value pairs for template variable substitution |
| `ccAddresses` | string | No | Comma-separated list of CC email addresses |
| `bccAddresses` | string | No | Comma-separated list of BCC email addresses |
| `configurationSetName` | string | No | SES configuration set name for tracking |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `messageId` | string | SES message ID for the sent email |
### `ses_send_bulk_email`
Send emails to multiple recipients using an SES template with per-recipient data
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `region` | string | Yes | AWS region \(e.g., us-east-1\) |
| `accessKeyId` | string | Yes | AWS access key ID |
| `secretAccessKey` | string | Yes | AWS secret access key |
| `fromAddress` | string | Yes | Verified sender email address |
| `templateName` | string | Yes | Name of the SES email template to use |
| `destinations` | string | Yes | JSON array of destination objects with toAddresses \(string\[\]\) and optional templateData \(JSON string\); falls back to defaultTemplateData when omitted |
| `defaultTemplateData` | string | No | Default JSON template data used when a destination does not specify its own |
| `configurationSetName` | string | No | SES configuration set name for tracking |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `results` | array | Per-destination send results with status and messageId |
| `successCount` | number | Number of successfully sent emails |
| `failureCount` | number | Number of failed email sends |
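The `destinations` parameter is a JSON array in which each entry may carry its own `templateData`; entries without one fall back to `defaultTemplateData`. A sketch of that resolution (field names follow the table above; the merge helper is illustrative, not the block's code):

```python
import json

def resolve_template_data(destinations_json, default_template_data=None):
    """Return (toAddresses, templateData) pairs with the default applied."""
    resolved = []
    for dest in json.loads(destinations_json):
        data = dest.get("templateData", default_template_data)
        resolved.append((dest["toAddresses"], data))
    return resolved

destinations = json.dumps([
    {"toAddresses": ["a@example.com"], "templateData": '{"name": "Ada"}'},
    {"toAddresses": ["b@example.com"]},  # no templateData -> default used
])
pairs = resolve_template_data(destinations, '{"name": "friend"}')
```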
### `ses_list_identities`
List all verified email identities (email addresses and domains) in your SES account
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `region` | string | Yes | AWS region \(e.g., us-east-1\) |
| `accessKeyId` | string | Yes | AWS access key ID |
| `secretAccessKey` | string | Yes | AWS secret access key |
| `pageSize` | number | No | Maximum number of identities to return \(1-1000\) |
| `nextToken` | string | No | Pagination token from a previous list response |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `identities` | array | List of email identities with name, type, sending status, and verification status |
| `nextToken` | string | Pagination token for the next page of results |
| `count` | number | Number of identities returned |
### `ses_get_account`
Get SES account sending quota and status information
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `region` | string | Yes | AWS region \(e.g., us-east-1\) |
| `accessKeyId` | string | Yes | AWS access key ID |
| `secretAccessKey` | string | Yes | AWS secret access key |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `sendingEnabled` | boolean | Whether email sending is enabled for the account |
| `max24HourSend` | number | Maximum emails allowed per 24-hour period |
| `maxSendRate` | number | Maximum emails allowed per second |
| `sentLast24Hours` | number | Number of emails sent in the last 24 hours |
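These fields make it easy to check remaining headroom before a large send; a minimal sketch (the account values are made up for illustration):

```python
def remaining_quota(account):
    """How many more emails fit in the current 24-hour window."""
    if not account["sendingEnabled"]:
        return 0
    return max(0, int(account["max24HourSend"] - account["sentLast24Hours"]))

acct = {"sendingEnabled": True, "max24HourSend": 50000,
        "maxSendRate": 14, "sentLast24Hours": 49900}
headroom = remaining_quota(acct)
```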
### `ses_create_template`
Create a new SES email template for use with templated email sending
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `region` | string | Yes | AWS region \(e.g., us-east-1\) |
| `accessKeyId` | string | Yes | AWS access key ID |
| `secretAccessKey` | string | Yes | AWS secret access key |
| `templateName` | string | Yes | Unique name for the email template |
| `subjectPart` | string | Yes | Subject line template \(supports \{\{variable\}\} substitution\) |
| `textPart` | string | No | Plain text version of the template body |
| `htmlPart` | string | No | HTML version of the template body |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `message` | string | Confirmation message for the created template |
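SES performs the `{{variable}}` substitution server-side when a templated email is sent; the following local sketch only illustrates the semantics (it is not what SES runs, and here unresolved placeholders are simply left as-is):

```python
import re

def render(template, data):
    """Replace {{key}} placeholders with values from data."""
    return re.sub(r"\{\{(\w+)\}\}",
                  lambda m: str(data.get(m.group(1), m.group(0))),
                  template)

subject = render("Access granted for {{name}}", {"name": "Ada"})
```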
### `ses_get_template`
Retrieve the content and details of an SES email template
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `region` | string | Yes | AWS region \(e.g., us-east-1\) |
| `accessKeyId` | string | Yes | AWS access key ID |
| `secretAccessKey` | string | Yes | AWS secret access key |
| `templateName` | string | Yes | Name of the template to retrieve |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `templateName` | string | Name of the template |
| `subjectPart` | string | Subject line of the template |
| `textPart` | string | Plain text body of the template |
| `htmlPart` | string | HTML body of the template |
### `ses_list_templates`
List all SES email templates in your account
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `region` | string | Yes | AWS region \(e.g., us-east-1\) |
| `accessKeyId` | string | Yes | AWS access key ID |
| `secretAccessKey` | string | Yes | AWS secret access key |
| `pageSize` | number | No | Maximum number of templates to return |
| `nextToken` | string | No | Pagination token from a previous list response |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `templates` | array | List of email templates with name and creation timestamp |
| `nextToken` | string | Pagination token for the next page of results |
| `count` | number | Number of templates returned |
### `ses_delete_template`
Delete an existing SES email template
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `region` | string | Yes | AWS region \(e.g., us-east-1\) |
| `accessKeyId` | string | Yes | AWS access key ID |
| `secretAccessKey` | string | Yes | AWS secret access key |
| `templateName` | string | Yes | Name of the template to delete |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `message` | string | Confirmation message for the deleted template |


@@ -925,6 +925,139 @@ Create a canvas pinned to a Slack channel as its resource hub
| --------- | ---- | ----------- |
| `canvas_id` | string | ID of the created channel canvas |
### `slack_get_canvas`
Get Slack canvas file metadata by canvas ID
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `authMethod` | string | No | Authentication method: oauth or bot_token |
| `botToken` | string | No | Bot token for Custom Bot |
| `canvasId` | string | Yes | Canvas file ID to retrieve \(e.g., F1234ABCD\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `canvas` | object | Canvas file information returned by Slack |
| ↳ `id` | string | Unique canvas file identifier |
| ↳ `created` | number | Unix timestamp when the canvas was created |
| ↳ `timestamp` | number | Unix timestamp associated with the canvas |
| ↳ `name` | string | Canvas file name |
| ↳ `title` | string | Canvas title |
| ↳ `mimetype` | string | MIME type of the canvas file |
| ↳ `filetype` | string | Slack file type for the canvas |
| ↳ `pretty_type` | string | Human-readable file type |
| ↳ `user` | string | User ID of the canvas creator |
| ↳ `editable` | boolean | Whether the canvas file is editable |
| ↳ `size` | number | Canvas file size in bytes |
| ↳ `mode` | string | File mode |
| ↳ `is_external` | boolean | Whether the canvas is externally hosted |
| ↳ `is_public` | boolean | Whether the canvas is public |
| ↳ `url_private` | string | Private URL for the canvas file |
| ↳ `url_private_download` | string | Private download URL for the canvas file |
| ↳ `permalink` | string | Permanent URL for the canvas |
| ↳ `channels` | array | Public channel IDs where the canvas appears |
| ↳ `groups` | array | Private channel IDs where the canvas appears |
| ↳ `ims` | array | Direct message IDs where the canvas appears |
| ↳ `canvas_readtime` | number | Approximate read time for canvas content |
| ↳ `is_channel_space` | boolean | Whether this canvas is linked to a channel |
| ↳ `linked_channel_id` | string | Channel ID linked to this canvas |
| ↳ `canvas_creator_id` | string | User ID of the canvas creator |
### `slack_list_canvases`
List Slack canvases available to the authenticated user or bot
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `authMethod` | string | No | Authentication method: oauth or bot_token |
| `botToken` | string | No | Bot token for Custom Bot |
| `channel` | string | No | Filter canvases appearing in a specific channel ID |
| `count` | number | No | Number of canvases to return per page |
| `page` | number | No | Page number to return |
| `user` | string | No | Filter canvases created by a single user ID |
| `tsFrom` | string | No | Filter canvases created after this Unix timestamp |
| `tsTo` | string | No | Filter canvases created before this Unix timestamp |
| `teamId` | string | No | Encoded team ID, required when using an org-level token |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `canvases` | array | Canvas file objects returned by Slack |
| ↳ `id` | string | Unique canvas file identifier |
| ↳ `created` | number | Unix timestamp when the canvas was created |
| ↳ `timestamp` | number | Unix timestamp associated with the canvas |
| ↳ `name` | string | Canvas file name |
| ↳ `title` | string | Canvas title |
| ↳ `mimetype` | string | MIME type of the canvas file |
| ↳ `filetype` | string | Slack file type for the canvas |
| ↳ `pretty_type` | string | Human-readable file type |
| ↳ `user` | string | User ID of the canvas creator |
| ↳ `editable` | boolean | Whether the canvas file is editable |
| ↳ `size` | number | Canvas file size in bytes |
| ↳ `mode` | string | File mode |
| ↳ `is_external` | boolean | Whether the canvas is externally hosted |
| ↳ `is_public` | boolean | Whether the canvas is public |
| ↳ `url_private` | string | Private URL for the canvas file |
| ↳ `url_private_download` | string | Private download URL for the canvas file |
| ↳ `permalink` | string | Permanent URL for the canvas |
| ↳ `channels` | array | Public channel IDs where the canvas appears |
| ↳ `groups` | array | Private channel IDs where the canvas appears |
| ↳ `ims` | array | Direct message IDs where the canvas appears |
| ↳ `canvas_readtime` | number | Approximate read time for canvas content |
| ↳ `is_channel_space` | boolean | Whether this canvas is linked to a channel |
| ↳ `linked_channel_id` | string | Channel ID linked to this canvas |
| ↳ `canvas_creator_id` | string | User ID of the canvas creator |
| `paging` | object | Pagination information from Slack |
| ↳ `count` | number | Number of items requested per page |
| ↳ `total` | number | Total number of matching files |
| ↳ `page` | number | Current page number |
| ↳ `pages` | number | Total number of pages |
### `slack_lookup_canvas_sections`
Find Slack canvas section IDs matching criteria for later edits
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `authMethod` | string | No | Authentication method: oauth or bot_token |
| `botToken` | string | No | Bot token for Custom Bot |
| `canvasId` | string | Yes | Canvas ID to search \(e.g., F1234ABCD\) |
| `criteria` | json | Yes | Section lookup criteria, such as \{"section_types":\["h1"\],"contains_text":"Roadmap"\} |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `sections` | array | Canvas sections matching the lookup criteria |
| ↳ `id` | string | Canvas section identifier |
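Slack evaluates the `criteria` object server-side; a local sketch of the matching semantics it describes (the section shape and matcher here are illustrative assumptions, not Slack's implementation):

```python
def match_sections(sections, criteria):
    """Filter canvas sections by section_types and contains_text."""
    types = criteria.get("section_types")
    needle = criteria.get("contains_text", "").lower()
    hits = []
    for s in sections:
        if types and s["type"] not in types:
            continue
        if needle and needle not in s["text"].lower():
            continue
        hits.append(s["id"])
    return hits

sections = [
    {"id": "temp:C:1", "type": "h1", "text": "Roadmap 2026"},
    {"id": "temp:C:2", "type": "p", "text": "Roadmap details"},
]
ids = match_sections(sections, {"section_types": ["h1"], "contains_text": "Roadmap"})
```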
### `slack_delete_canvas`
Delete a Slack canvas by its canvas ID
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `authMethod` | string | No | Authentication method: oauth or bot_token |
| `botToken` | string | No | Bot token for Custom Bot |
| `canvasId` | string | Yes | Canvas ID to delete \(e.g., F1234ABCD\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `ok` | boolean | Whether Slack deleted the canvas successfully |
### `slack_create_conversation`
Create a new public or private channel in a Slack workspace.


@@ -72,6 +72,8 @@ Run an autonomous web agent to complete tasks and extract structured data
| `provider` | string | No | AI provider to use: openai or anthropic |
| `apiKey` | string | Yes | API key for the selected provider |
| `outputSchema` | json | No | Optional JSON schema defining the structure of data the agent should return |
| `mode` | string | No | Agent tool mode: dom \(default\), hybrid, or cua |
| `maxSteps` | number | No | Maximum agent steps \(default 20, max 200\) |
#### Output
@@ -92,5 +94,7 @@ Run an autonomous web agent to complete tasks and extract structured data
| ↳ `timestamp` | number | Unix timestamp when the action was performed |
| ↳ `timeMs` | number | Time in milliseconds \(for wait actions\) |
| `structuredOutput` | object | Extracted data matching the provided output schema |
| `liveViewUrl` | string | Embeddable Browserbase live view URL \(active only while the session is running\) |
| `sessionId` | string | Browserbase session identifier |


@@ -46,6 +46,7 @@ Assume an IAM role and receive temporary security credentials
| `roleArn` | string | Yes | ARN of the IAM role to assume |
| `roleSessionName` | string | Yes | Identifier for the assumed role session |
| `durationSeconds` | number | No | Duration of the session in seconds \(900-43200, default 3600\) |
| `policy` | string | No | JSON IAM policy to further restrict session permissions \(max 2048 chars\) |
| `externalId` | string | No | External ID for cross-account access |
| `serialNumber` | string | No | MFA device serial number or ARN |
| `tokenCode` | string | No | MFA token code \(6 digits\) |
@@ -61,6 +62,7 @@ Assume an IAM role and receive temporary security credentials
| `assumedRoleArn` | string | ARN of the assumed role |
| `assumedRoleId` | string | Assumed role ID with session name |
| `packedPolicySize` | number | Percentage of allowed policy size used |
| `sourceIdentity` | string | Source identity set on the role session, if any |
### `sts_get_caller_identity`


@@ -97,6 +97,14 @@ Trigger workflow when a candidate is hired
| ↳ `job` | object | job output from the tool |
| ↳ `id` | string | Job UUID |
| ↳ `title` | string | Job title |
| `offer` | object | offer output from the tool |
| ↳ `id` | string | Accepted offer UUID |
| ↳ `applicationId` | string | Associated application UUID |
| ↳ `acceptanceStatus` | string | Offer acceptance status |
| ↳ `offerStatus` | string | Offer process status |
| ↳ `decidedAt` | string | Offer decision timestamp \(ISO 8601\) |
| ↳ `latestVersion` | object | latestVersion output from the tool |
| ↳ `id` | string | Latest offer version UUID |
---


@@ -29,6 +29,7 @@ Trigger workflow when a Fireflies meeting transcription is complete
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `meetingId` | string | The ID of the transcribed meeting |
| `eventType` | string | The type of event \(e.g. Transcription completed, meeting.transcribed\) |
| `clientReferenceId` | string | Custom reference ID if set during upload |
| `timestamp` | number | Unix timestamp in milliseconds when the event was fired \(V2 webhooks\) |


@@ -89,6 +89,8 @@ Polling Groups let you monitor multiple team members' Gmail or Outlook inboxes w
Invitees receive an email with a link to connect their account. Once connected, their inbox is automatically included in the polling group. Invitees don't need to be members of your Sim organization.
This is separate from external workspace membership: polling group invitees grant access to an inbox for a trigger, while external workspace members are collaborators with Read, Write, or Admin access to a workspace.
**Using in a Workflow**
When configuring an email trigger, select your polling group from the credentials dropdown instead of an individual account. The system creates webhooks for each member and routes all emails through your workflow.


@@ -304,7 +304,7 @@ Trigger workflow on any Jira Service Management webhook event
| ↳ `id` | string | Changelog ID |
| `comment` | object | comment output from the tool |
| ↳ `id` | string | Comment ID |
| ↳ `body` | json | Comment body in Atlassian Document Format \(ADF\). On Jira Server this may be a plain string. |
| ↳ `author` | object | author output from the tool |
| ↳ `displayName` | string | Comment author display name |
| ↳ `accountId` | string | Comment author account ID |
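Since the comment body arrives as an Atlassian Document Format (ADF) tree on Jira Cloud, downstream blocks that expect plain text need to flatten it. A minimal recursive sketch (it handles only `text` nodes; real ADF has many more node types):

```python
def adf_to_text(node):
    """Recursively collect the text nodes of an ADF tree."""
    if isinstance(node, str):  # Jira Server may deliver a plain string
        return node
    parts = []
    if node.get("type") == "text":
        parts.append(node.get("text", ""))
    for child in node.get("content", []):
        parts.append(adf_to_text(child))
    return "".join(parts)

adf = {"type": "doc", "version": 1, "content": [
    {"type": "paragraph", "content": [{"type": "text", "text": "Looks good"}]},
]}
plain = adf_to_text(adf)
```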


@@ -25,6 +25,7 @@ Trigger workflow from Slack events like mentions, messages, and reactions
| `signingSecret` | string | Yes | The signing secret from your Slack app to validate request authenticity. |
| `botToken` | string | No | The bot token from your Slack app. Required for downloading files attached to messages. |
| `includeFiles` | boolean | No | Download and include file attachments from messages. Requires a bot token with files:read scope. |
| `setupWizard` | modal | No | Walk through manifest creation, app install, and pasting credentials. |
#### Output


@@ -49,7 +49,7 @@ Environment variables store sensitive values like API keys, tokens, and configur
| Scope | Visibility | Use case |
|-------|-----------|----------|
| **Workspace** | All workspace members, including external workspace members | Shared API keys, team configuration |
| **Personal** | Only you | Your personal tokens, dev credentials |
When both a workspace and personal variable share the same key, the workspace value takes precedence.
@@ -84,7 +84,7 @@ If a workflow variable and a block output share the same name, Sim resolves the
<FAQ items={[
{ question: "What's the difference between workflow variables and environment variables?", answer: "Workflow variables store runtime data (text, numbers, objects, arrays) that blocks can read and modify during execution. They use <variable.name> syntax. Environment variables store sensitive configuration like API keys using {{KEY}} syntax. They never appear in logs and are managed at the workspace or personal level." },
{ question: "Can I use environment variables in the Function block?", answer: "Yes. Use the double curly brace syntax {{KEY}} directly in your code. The value is substituted before execution, so the actual secret never appears in logs or outputs." },
{ question: "How do I share an API key with my team?", answer: "Create a workspace-scoped environment variable in Settings → Secrets. All workspace members will be able to use it in their workflows via {{KEY}} syntax." },
{ question: "How do I share an API key with my team?", answer: "Create a workspace-scoped environment variable in Settings → Secrets. All workspace members, including external workspace members, will be able to use it in their workflows via {{KEY}} syntax." },
{ question: "What happens if a variable name has spaces or mixed case?", answer: "Variable resolution is case-insensitive and ignores spaces. A variable named 'My Counter' can be referenced as <variable.mycounter> or <variable.My Counter>. However, using consistent naming (like camelCase) is recommended." },
{ question: "Can I reference environment variables in the Agent system prompt?", answer: "Yes. You can use {{KEY}} syntax in any text field, including system prompts, to inject environment variable values." },
]} />
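The case-insensitive, space-ignoring matching described in the FAQ can be modeled as a normalization step (a sketch under that description; Sim's actual resolver may differ):

```typescript
// Normalize a variable name so 'My Counter' matches <variable.mycounter>.
function normalizeVariableName(name: string): string {
  return name.toLowerCase().replace(/\s+/g, '')
}
```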

View File

@@ -20,7 +20,7 @@
"@vercel/og": "^0.6.5",
"class-variance-authority": "^0.7.1",
"clsx": "^2.1.1",
"drizzle-orm": "^0.44.5",
"drizzle-orm": "^0.45.2",
"fumadocs-core": "16.6.7",
"fumadocs-mdx": "14.2.8",
"fumadocs-openapi": "10.3.13",

(8 new binary image files added, 105–504 KiB each; not shown)

View File

@@ -0,0 +1,30 @@
# Environment variables required by the @sim/realtime (Socket.IO) server.
# These MUST match the corresponding values in apps/sim/.env for auth to work.
# See apps/realtime/src/env.ts for the full zod schema.
# Core
NODE_ENV=development
PORT=3002
# Database — must point at the same Postgres as the main app
DATABASE_URL=postgresql://postgres:postgres@localhost:5432/simstudio
# Auth — shared with apps/sim (Better Auth "Shared Database Session" pattern)
BETTER_AUTH_URL=http://localhost:3000
BETTER_AUTH_SECRET=your_better_auth_secret_min_32_chars
# Internal RPC — shared with apps/sim
INTERNAL_API_SECRET=your_internal_api_secret_min_32_chars
# Public app URL — used for CORS allow-list and base URL resolution
NEXT_PUBLIC_APP_URL=http://localhost:3000
# Optional: Redis for cross-pod room management
# Leave unset for single-pod / in-memory rooms
# REDIS_URL=redis://localhost:6379
# Optional: extra Socket.IO CORS allow-list (comma-separated)
# ALLOWED_ORIGINS=https://embed.example.com,https://admin.example.com
# Optional: disable auth entirely for trusted private networks
# DISABLE_AUTH=true

View File

@@ -0,0 +1,48 @@
{
"name": "@sim/realtime",
"version": "0.1.0",
"private": true,
"license": "Apache-2.0",
"type": "module",
"engines": {
"bun": ">=1.2.13",
"node": ">=20.0.0"
},
"scripts": {
"dev": "bun --watch src/index.ts",
"start": "bun src/index.ts",
"type-check": "tsc --noEmit",
"lint": "biome check --write --unsafe .",
"lint:check": "biome check .",
"format": "biome format --write .",
"format:check": "biome format .",
"test": "vitest run",
"test:watch": "vitest"
},
"dependencies": {
"@sim/audit": "workspace:*",
"@sim/auth": "workspace:*",
"@sim/db": "workspace:*",
"@sim/logger": "workspace:*",
"@sim/realtime-protocol": "workspace:*",
"@sim/security": "workspace:*",
"@sim/utils": "workspace:*",
"@sim/workflow-authz": "workspace:*",
"@sim/workflow-persistence": "workspace:*",
"@sim/workflow-types": "workspace:*",
"@socket.io/redis-adapter": "8.3.0",
"drizzle-orm": "^0.45.2",
"postgres": "^3.4.5",
"redis": "5.10.0",
"socket.io": "^4.8.1",
"zod": "^3.24.2"
},
"devDependencies": {
"@sim/testing": "workspace:*",
"@sim/tsconfig": "workspace:*",
"@types/node": "24.2.1",
"socket.io-client": "4.8.1",
"typescript": "^5.7.3",
"vitest": "^3.0.8"
}
}

17
apps/realtime/src/auth.ts Normal file
View File

@@ -0,0 +1,17 @@
import { createVerifyAuth } from '@sim/auth/verify'
import { env } from '@/env'
export const ANONYMOUS_USER_ID = '00000000-0000-0000-0000-000000000000'
export const ANONYMOUS_USER = {
id: ANONYMOUS_USER_ID,
name: 'Anonymous',
email: 'anonymous@localhost',
emailVerified: true,
image: null,
} as const
export const auth = createVerifyAuth({
secret: env.BETTER_AUTH_SECRET,
baseURL: env.BETTER_AUTH_URL,
})

View File

@@ -3,9 +3,7 @@ import { createLogger } from '@sim/logger'
import { createAdapter } from '@socket.io/redis-adapter'
import { createClient, type RedisClientType } from 'redis'
import { Server } from 'socket.io'
import { env } from '@/lib/core/config/env'
import { isProd } from '@/lib/core/config/feature-flags'
import { getBaseUrl } from '@/lib/core/utils/urls'
import { env, getBaseUrl, isProd } from '@/env'
const logger = createLogger('SocketIOConfig')

View File

@@ -1,15 +1,7 @@
import { AuditAction, AuditResourceType, recordAudit } from '@sim/audit'
import * as schema from '@sim/db'
import { webhook, workflow, workflowBlocks, workflowEdges, workflowSubflows } from '@sim/db'
import { workflow, workflowBlocks, workflowEdges, workflowSubflows } from '@sim/db'
import { createLogger } from '@sim/logger'
import { and, eq, inArray, isNull, or, sql } from 'drizzle-orm'
import { drizzle } from 'drizzle-orm/postgres-js'
import postgres from 'postgres'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { env } from '@/lib/core/config/env'
import { cleanupExternalWebhook } from '@/lib/webhooks/provider-subscriptions'
import { getActiveWorkflowContext } from '@/lib/workflows/active-context'
import { loadWorkflowFromNormalizedTables } from '@/lib/workflows/persistence/utils'
import { mergeSubBlockValues } from '@/lib/workflows/subblocks'
import {
BLOCK_OPERATIONS,
BLOCKS_OPERATIONS,
@@ -19,7 +11,14 @@ import {
SUBFLOW_OPERATIONS,
VARIABLE_OPERATIONS,
WORKFLOW_OPERATIONS,
} from '@/socket/constants'
} from '@sim/realtime-protocol/constants'
import { getActiveWorkflowContext } from '@sim/workflow-authz'
import { loadWorkflowFromNormalizedTablesRaw } from '@sim/workflow-persistence/load'
import { mergeSubBlockValues } from '@sim/workflow-persistence/subblocks'
import { and, eq, inArray, isNull, or, sql } from 'drizzle-orm'
import { drizzle } from 'drizzle-orm/postgres-js'
import postgres from 'postgres'
import { env } from '@/env'
const logger = createLogger('SocketDatabase')
@@ -29,7 +28,7 @@ const socketDb = drizzle(
prepare: false,
idle_timeout: 10,
connect_timeout: 20,
max: 10,
max: 30,
onnotice: () => {},
}),
{ schema }
@@ -182,7 +181,7 @@ export async function getWorkflowState(workflowId: string) {
throw new Error(`Workflow ${workflowId} not found`)
}
const normalizedData = await loadWorkflowFromNormalizedTables(workflowId)
const normalizedData = await loadWorkflowFromNormalizedTablesRaw(workflowId)
if (normalizedData) {
const finalState = {
@@ -915,30 +914,10 @@ async function handleBlocksOperationTx(
}
}
// Clean up external webhooks
const webhooksToCleanup = await tx
.select({
webhook: webhook,
workflow: {
id: workflow.id,
userId: workflow.userId,
workspaceId: workflow.workspaceId,
},
})
.from(webhook)
.innerJoin(workflow, eq(webhook.workflowId, workflow.id))
.where(and(eq(webhook.workflowId, workflowId), inArray(webhook.blockId, blockIdsArray)))
if (webhooksToCleanup.length > 0) {
const requestId = `socket-batch-${workflowId}-${Date.now()}`
for (const { webhook: wh, workflow: wf } of webhooksToCleanup) {
try {
await cleanupExternalWebhook(wh, wf, requestId)
} catch (error) {
logger.error(`Failed to cleanup webhook ${wh.id}:`, error)
}
}
}
// Webhook rows are only created at deploy time (saveTriggerWebhooksForDeploy in
// lib/webhooks/deploy.ts) with deploymentVersionId set; their external-subscription
// lifecycle is managed by deploy.ts, lifecycle.ts, and the /api/webhooks/[id] route.
// Removing a trigger block from the draft canvas does not touch any webhook rows.
// Delete edges connected to any of the blocks
await tx

44
apps/realtime/src/env.ts Normal file
View File

@@ -0,0 +1,44 @@
import { z } from 'zod'
const EnvSchema = z.object({
NODE_ENV: z.enum(['development', 'test', 'production']).default('development'),
DATABASE_URL: z.string().url(),
REDIS_URL: z.string().url().optional(),
BETTER_AUTH_URL: z.string().url(),
BETTER_AUTH_SECRET: z.string().min(32),
INTERNAL_API_SECRET: z.string().min(32),
NEXT_PUBLIC_APP_URL: z.string().url(),
ALLOWED_ORIGINS: z.string().optional(),
PORT: z.coerce.number().int().positive().default(3002),
DISABLE_AUTH: z
.string()
.optional()
.transform((value) => value === 'true' || value === '1'),
})
function parseEnv() {
const parsed = EnvSchema.safeParse(process.env)
if (!parsed.success) {
const formatted = parsed.error.format()
throw new Error(`Invalid realtime server environment: ${JSON.stringify(formatted, null, 2)}`)
}
return parsed.data
}
export const env = parseEnv()
export const isProd = env.NODE_ENV === 'production'
export const isDev = env.NODE_ENV === 'development'
export const isTest = env.NODE_ENV === 'test'
let appHostname = ''
try {
appHostname = new URL(env.NEXT_PUBLIC_APP_URL).hostname
} catch {}
export const isHosted = appHostname === 'sim.ai' || appHostname.endsWith('.sim.ai')
export const isAuthDisabled = env.DISABLE_AUTH === true && !isHosted
export function getBaseUrl(): string {
return env.NEXT_PUBLIC_APP_URL
}

View File

@@ -1,8 +1,8 @@
import { createLogger } from '@sim/logger'
import { cleanupPendingSubblocksForSocket } from '@/socket/handlers/subblocks'
import { cleanupPendingVariablesForSocket } from '@/socket/handlers/variables'
import type { AuthenticatedSocket } from '@/socket/middleware/auth'
import type { IRoomManager } from '@/socket/rooms'
import { cleanupPendingSubblocksForSocket } from '@/handlers/subblocks'
import { cleanupPendingVariablesForSocket } from '@/handlers/variables'
import type { AuthenticatedSocket } from '@/middleware/auth'
import type { IRoomManager } from '@/rooms'
const logger = createLogger('ConnectionHandlers')

View File

@@ -0,0 +1,17 @@
import { setupConnectionHandlers } from '@/handlers/connection'
import { setupOperationsHandlers } from '@/handlers/operations'
import { setupPresenceHandlers } from '@/handlers/presence'
import { setupSubblocksHandlers } from '@/handlers/subblocks'
import { setupVariablesHandlers } from '@/handlers/variables'
import { setupWorkflowHandlers } from '@/handlers/workflow'
import type { AuthenticatedSocket } from '@/middleware/auth'
import type { IRoomManager } from '@/rooms'
export function setupAllHandlers(socket: AuthenticatedSocket, roomManager: IRoomManager) {
setupWorkflowHandlers(socket, roomManager)
setupOperationsHandlers(socket, roomManager)
setupSubblocksHandlers(socket, roomManager)
setupVariablesHandlers(socket, roomManager)
setupPresenceHandlers(socket, roomManager)
setupConnectionHandlers(socket, roomManager)
}

View File

@@ -1,6 +1,4 @@
import { createLogger } from '@sim/logger'
import { ZodError } from 'zod'
import { generateId } from '@/lib/core/utils/uuid'
import {
BLOCK_OPERATIONS,
BLOCKS_OPERATIONS,
@@ -9,12 +7,14 @@ import {
VARIABLE_OPERATIONS,
type VariableOperation,
WORKFLOW_OPERATIONS,
} from '@/socket/constants'
import { persistWorkflowOperation } from '@/socket/database/operations'
import type { AuthenticatedSocket } from '@/socket/middleware/auth'
import { checkRolePermission } from '@/socket/middleware/permissions'
import type { IRoomManager, UserSession } from '@/socket/rooms'
import { WorkflowOperationSchema } from '@/socket/validation/schemas'
} from '@sim/realtime-protocol/constants'
import { WorkflowOperationSchema } from '@sim/realtime-protocol/schemas'
import { generateId } from '@sim/utils/id'
import { ZodError } from 'zod'
import { persistWorkflowOperation } from '@/database/operations'
import type { AuthenticatedSocket } from '@/middleware/auth'
import { checkRolePermission } from '@/middleware/permissions'
import type { IRoomManager, UserSession } from '@/rooms'
const logger = createLogger('OperationsHandlers')

View File

@@ -1,6 +1,6 @@
import { createLogger } from '@sim/logger'
import type { AuthenticatedSocket } from '@/socket/middleware/auth'
import type { IRoomManager } from '@/socket/rooms'
import type { AuthenticatedSocket } from '@/middleware/auth'
import type { IRoomManager } from '@/rooms'
const logger = createLogger('PresenceHandlers')

View File

@@ -1,11 +1,11 @@
import { db } from '@sim/db'
import { workflow, workflowBlocks } from '@sim/db/schema'
import { createLogger } from '@sim/logger'
import { SUBBLOCK_OPERATIONS } from '@sim/realtime-protocol/constants'
import { and, eq } from 'drizzle-orm'
import { SUBBLOCK_OPERATIONS } from '@/socket/constants'
import type { AuthenticatedSocket } from '@/socket/middleware/auth'
import { checkRolePermission } from '@/socket/middleware/permissions'
import type { IRoomManager } from '@/socket/rooms'
import type { AuthenticatedSocket } from '@/middleware/auth'
import { checkRolePermission } from '@/middleware/permissions'
import type { IRoomManager } from '@/rooms'
const logger = createLogger('SubblocksHandlers')

View File

@@ -1,11 +1,11 @@
import { db } from '@sim/db'
import { workflow } from '@sim/db/schema'
import { createLogger } from '@sim/logger'
import { VARIABLE_OPERATIONS } from '@sim/realtime-protocol/constants'
import { eq } from 'drizzle-orm'
import { VARIABLE_OPERATIONS } from '@/socket/constants'
import type { AuthenticatedSocket } from '@/socket/middleware/auth'
import { checkRolePermission } from '@/socket/middleware/permissions'
import type { IRoomManager } from '@/socket/rooms'
import type { AuthenticatedSocket } from '@/middleware/auth'
import { checkRolePermission } from '@/middleware/permissions'
import type { IRoomManager } from '@/rooms'
const logger = createLogger('VariablesHandlers')

View File

@@ -2,36 +2,27 @@
* @vitest-environment node
*/
import { beforeEach, describe, expect, it, vi } from 'vitest'
import type { IRoomManager } from '@/socket/rooms'
import type { IRoomManager } from '@/rooms'
const { mockGetWorkflowState, mockVerifyWorkflowAccess } = vi.hoisted(() => ({
mockGetWorkflowState: vi.fn(),
mockVerifyWorkflowAccess: vi.fn(),
}))
vi.mock('@sim/logger', () => ({
createLogger: () => ({
info: vi.fn(),
warn: vi.fn(),
error: vi.fn(),
debug: vi.fn(),
}),
}))
vi.mock('@sim/db', () => ({
db: { select: vi.fn() },
user: { image: 'image' },
}))
vi.mock('@/socket/database/operations', () => ({
vi.mock('@/database/operations', () => ({
getWorkflowState: mockGetWorkflowState,
}))
vi.mock('@/socket/middleware/permissions', () => ({
vi.mock('@/middleware/permissions', () => ({
verifyWorkflowAccess: mockVerifyWorkflowAccess,
}))
import { setupWorkflowHandlers } from '@/socket/handlers/workflow'
import { setupWorkflowHandlers } from '@/handlers/workflow'
interface JoinWorkflowPayload {
workflowId: string

View File

@@ -1,10 +1,10 @@
import { db, user } from '@sim/db'
import { createLogger } from '@sim/logger'
import { eq } from 'drizzle-orm'
import { getWorkflowState } from '@/socket/database/operations'
import type { AuthenticatedSocket } from '@/socket/middleware/auth'
import { verifyWorkflowAccess } from '@/socket/middleware/permissions'
import type { IRoomManager, UserPresence } from '@/socket/rooms'
import { getWorkflowState } from '@/database/operations'
import type { AuthenticatedSocket } from '@/middleware/auth'
import { verifyWorkflowAccess } from '@/middleware/permissions'
import type { IRoomManager, UserPresence } from '@/rooms'
const logger = createLogger('WorkflowHandlers')

View File

@@ -4,23 +4,28 @@
* @vitest-environment node
*/
import { createServer, request as httpRequest } from 'http'
import { createEnvMock, createMockLogger, databaseMock } from '@sim/testing'
import { createMockLogger } from '@sim/testing'
import { afterEach, beforeAll, beforeEach, describe, expect, it, vi } from 'vitest'
import { createSocketIOServer } from '@/socket/config/socket'
import { MemoryRoomManager } from '@/socket/rooms'
import { createHttpHandler } from '@/socket/routes/http'
import { createSocketIOServer } from '@/config/socket'
import { MemoryRoomManager } from '@/rooms'
import { createHttpHandler } from '@/routes/http'
vi.mock('@/lib/auth', () => ({
vi.mock('@/auth', () => ({
auth: {
api: {
verifyOneTimeToken: vi.fn(),
},
},
ANONYMOUS_USER_ID: '00000000-0000-0000-0000-000000000000',
ANONYMOUS_USER: {
id: '00000000-0000-0000-0000-000000000000',
name: 'Anonymous',
email: 'anonymous@localhost',
emailVerified: true,
image: null,
},
}))
vi.mock('@sim/db', () => databaseMock)
// Mock redis package to prevent actual Redis connections
vi.mock('redis', () => ({
createClient: vi.fn(() => ({
on: vi.fn(),
@@ -30,15 +35,27 @@ vi.mock('redis', () => ({
})),
}))
vi.mock('@/lib/core/config/env', () =>
createEnvMock({
vi.mock('@/env', () => ({
env: {
DATABASE_URL: 'postgres://localhost/test',
NODE_ENV: 'test',
REDIS_URL: undefined,
})
)
BETTER_AUTH_URL: 'http://localhost:3000',
BETTER_AUTH_SECRET: 'test-better-auth-secret-at-least-32-chars',
INTERNAL_API_SECRET: 'test-internal-api-secret-at-least-32-chars',
NEXT_PUBLIC_APP_URL: 'http://localhost:3000',
PORT: 3002,
DISABLE_AUTH: false,
},
isProd: false,
isDev: false,
isTest: true,
isHosted: false,
isAuthDisabled: false,
getBaseUrl: () => 'http://localhost:3000',
}))
vi.mock('@/socket/middleware/auth', () => ({
vi.mock('@/middleware/auth', () => ({
authenticateSocket: vi.fn((socket, next) => {
socket.userId = 'test-user-id'
socket.userName = 'Test User'
@@ -47,7 +64,7 @@ vi.mock('@/socket/middleware/auth', () => ({
}),
}))
vi.mock('@/socket/middleware/permissions', () => ({
vi.mock('@/middleware/permissions', () => ({
verifyWorkflowAccess: vi.fn().mockResolvedValue({
hasAccess: true,
role: 'admin',
@@ -57,7 +74,7 @@ vi.mock('@/socket/middleware/permissions', () => ({
}),
}))
vi.mock('@/socket/database/operations', () => ({
vi.mock('@/database/operations', () => ({
getWorkflowState: vi.fn().mockResolvedValue({
id: 'test-workflow',
name: 'Test Workflow',
@@ -277,13 +294,13 @@ describe('Socket Server Index Integration', () => {
describe('Module Integration', () => {
it.concurrent('should properly import all extracted modules', async () => {
const { createSocketIOServer } = await import('@/socket/config/socket')
const { createHttpHandler } = await import('@/socket/routes/http')
const { MemoryRoomManager, RedisRoomManager } = await import('@/socket/rooms')
const { authenticateSocket } = await import('@/socket/middleware/auth')
const { verifyWorkflowAccess } = await import('@/socket/middleware/permissions')
const { getWorkflowState } = await import('@/socket/database/operations')
const { WorkflowOperationSchema } = await import('@/socket/validation/schemas')
const { createSocketIOServer } = await import('@/config/socket')
const { createHttpHandler } = await import('@/routes/http')
const { MemoryRoomManager, RedisRoomManager } = await import('@/rooms')
const { authenticateSocket } = await import('@/middleware/auth')
const { verifyWorkflowAccess } = await import('@/middleware/permissions')
const { getWorkflowState } = await import('@/database/operations')
const { WorkflowOperationSchema } = await import('@sim/realtime-protocol/schemas')
expect(createSocketIOServer).toBeTypeOf('function')
expect(createHttpHandler).toBeTypeOf('function')
@@ -334,7 +351,7 @@ describe('Socket Server Index Integration', () => {
describe('Validation and Utils', () => {
it.concurrent('should validate workflow operations', async () => {
const { WorkflowOperationSchema } = await import('@/socket/validation/schemas')
const { WorkflowOperationSchema } = await import('@sim/realtime-protocol/schemas')
const validOperation = {
operation: 'batch-add-blocks',
@@ -360,7 +377,7 @@ describe('Socket Server Index Integration', () => {
})
it.concurrent('should validate batch-add-blocks with edges', async () => {
const { WorkflowOperationSchema } = await import('@/socket/validation/schemas')
const { WorkflowOperationSchema } = await import('@sim/realtime-protocol/schemas')
const validOperationWithEdge = {
operation: 'batch-add-blocks',
@@ -395,7 +412,7 @@ describe('Socket Server Index Integration', () => {
})
it.concurrent('should validate edge operations', async () => {
const { WorkflowOperationSchema } = await import('@/socket/validation/schemas')
const { WorkflowOperationSchema } = await import('@sim/realtime-protocol/schemas')
const validEdgeOperation = {
operation: 'add',
@@ -412,7 +429,7 @@ describe('Socket Server Index Integration', () => {
})
it('should validate subflow operations', async () => {
const { WorkflowOperationSchema } = await import('@/socket/validation/schemas')
const { WorkflowOperationSchema } = await import('@sim/realtime-protocol/schemas')
const validSubflowOperation = {
operation: 'update',

View File

@@ -1,12 +1,12 @@
import { createServer } from 'http'
import { createLogger } from '@sim/logger'
import type { Server as SocketIOServer } from 'socket.io'
import { env } from '@/lib/core/config/env'
import { createSocketIOServer, shutdownSocketIOAdapter } from '@/socket/config/socket'
import { setupAllHandlers } from '@/socket/handlers'
import { type AuthenticatedSocket, authenticateSocket } from '@/socket/middleware/auth'
import { type IRoomManager, MemoryRoomManager, RedisRoomManager } from '@/socket/rooms'
import { createHttpHandler } from '@/socket/routes/http'
import { createSocketIOServer, shutdownSocketIOAdapter } from '@/config/socket'
import { env } from '@/env'
import { setupAllHandlers } from '@/handlers'
import { type AuthenticatedSocket, authenticateSocket } from '@/middleware/auth'
import { type IRoomManager, MemoryRoomManager, RedisRoomManager } from '@/rooms'
import { createHttpHandler } from '@/routes/http'
const logger = createLogger('CollaborativeSocketServer')
@@ -29,7 +29,7 @@ async function createRoomManager(io: SocketIOServer): Promise<IRoomManager> {
async function main() {
const httpServer = createServer()
const PORT = Number(env.PORT || env.SOCKET_PORT || 3002)
const PORT = env.PORT
logger.info('Starting Socket.IO server...', {
port: PORT,

View File

@@ -1,9 +1,8 @@
import { createLogger } from '@sim/logger'
import { toError } from '@sim/utils/errors'
import type { Socket } from 'socket.io'
import { auth } from '@/lib/auth'
import { ANONYMOUS_USER, ANONYMOUS_USER_ID } from '@/lib/auth/constants'
import { isAuthDisabled } from '@/lib/core/config/feature-flags'
import { toError } from '@/lib/core/utils/helpers'
import { ANONYMOUS_USER, ANONYMOUS_USER_ID, auth } from '@/auth'
import { isAuthDisabled } from '@/env'
const logger = createLogger('SocketAuth')

View File

@@ -13,14 +13,8 @@ import {
ROLE_ALLOWED_OPERATIONS,
SOCKET_OPERATIONS,
} from '@sim/testing'
import { describe, expect, it, vi } from 'vitest'
vi.mock('@/lib/auth', () => ({
auth: { api: { getSession: vi.fn() } },
getSession: vi.fn(),
}))
import { checkRolePermission } from '@/socket/middleware/permissions'
import { describe, expect, it } from 'vitest'
import { checkRolePermission } from '@/middleware/permissions'
describe('checkRolePermission', () => {
describe('admin role', () => {

View File

@@ -1,8 +1,6 @@
import { db } from '@sim/db'
import { workflow } from '@sim/db/schema'
import { createLogger } from '@sim/logger'
import { and, eq, isNull } from 'drizzle-orm'
import { authorizeWorkflowByWorkspacePermission } from '@/lib/workflows/utils'
import {
BLOCK_OPERATIONS,
BLOCKS_OPERATIONS,
@@ -12,7 +10,9 @@ import {
SUBFLOW_OPERATIONS,
VARIABLE_OPERATIONS,
WORKFLOW_OPERATIONS,
} from '@/socket/constants'
} from '@sim/realtime-protocol/constants'
import { authorizeWorkflowByWorkspacePermission } from '@sim/workflow-authz'
import { and, eq, isNull } from 'drizzle-orm'
const logger = createLogger('SocketPermissions')

View File

@@ -0,0 +1,3 @@
export { MemoryRoomManager } from '@/rooms/memory-manager'
export { RedisRoomManager } from '@/rooms/redis-manager'
export type { IRoomManager, UserPresence, UserSession, WorkflowRoom } from '@/rooms/types'

View File

@@ -1,6 +1,6 @@
import { createLogger } from '@sim/logger'
import type { Server } from 'socket.io'
import type { IRoomManager, UserPresence, UserSession, WorkflowRoom } from '@/socket/rooms/types'
import type { IRoomManager, UserPresence, UserSession, WorkflowRoom } from '@/rooms/types'
const logger = createLogger('MemoryRoomManager')

View File

@@ -1,7 +1,7 @@
import { createLogger } from '@sim/logger'
import { createClient, type RedisClientType } from 'redis'
import type { Server } from 'socket.io'
import type { IRoomManager, UserPresence, UserSession } from '@/socket/rooms/types'
import type { IRoomManager, UserPresence, UserSession } from '@/rooms/types'
const logger = createLogger('RedisRoomManager')

View File

@@ -1,7 +1,7 @@
import type { IncomingMessage, ServerResponse } from 'http'
import { env } from '@/lib/core/config/env'
import { safeCompare } from '@/lib/core/security/encryption'
import type { IRoomManager } from '@/socket/rooms'
import { safeCompare } from '@sim/security/compare'
import { env } from '@/env'
import type { IRoomManager } from '@/rooms'
interface Logger {
info: (message: string, ...args: unknown[]) => void

View File

@@ -0,0 +1,11 @@
{
"extends": "@sim/tsconfig/base.json",
"compilerOptions": {
"baseUrl": ".",
"paths": {
"@/*": ["src/*"]
}
},
"include": ["src/**/*"],
"exclude": ["node_modules", "dist"]
}

View File

@@ -0,0 +1,27 @@
import path from 'node:path'
import { defineConfig } from 'vitest/config'
export default defineConfig({
test: {
globals: true,
environment: 'node',
include: ['**/*.test.{ts,tsx}'],
exclude: ['**/node_modules/**', '**/dist/**'],
setupFiles: ['./vitest.setup.ts'],
pool: 'threads',
testTimeout: 10000,
},
resolve: {
alias: [
{
find: '@sim/db',
replacement: path.resolve(__dirname, '../../packages/db'),
},
{
find: '@sim/logger',
replacement: path.resolve(__dirname, '../../packages/logger/src'),
},
{ find: '@', replacement: path.resolve(__dirname, 'src') },
],
},
})

View File

@@ -0,0 +1,6 @@
process.env.DATABASE_URL ??= 'postgres://localhost/test'
process.env.NODE_ENV ??= 'test'
process.env.BETTER_AUTH_URL ??= 'http://localhost:3000'
process.env.BETTER_AUTH_SECRET ??= 'test-better-auth-secret-at-least-32-chars'
process.env.INTERNAL_API_SECRET ??= 'test-internal-api-secret-at-least-32-chars'
process.env.NEXT_PUBLIC_APP_URL ??= 'http://localhost:3000'

View File

@@ -26,6 +26,13 @@ apps/sim/
└── triggers/ # Trigger definitions
```
The Socket.IO collaborative-canvas server lives in a separate workspace at
`apps/realtime/`. It shares DB + auth with `apps/sim` via the `@sim/*`
packages. Do not add imports from `@/lib/webhooks/providers/*`, `@/executor/*`,
`@/blocks/*`, or `@/tools/*` to any package consumed by `apps/realtime`;
those heavyweight registries stay in this app. `apps/realtime` calls back
into this app only over internal HTTP with `INTERNAL_API_SECRET`.
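That callback path can be sketched as below. Only the shared `INTERNAL_API_SECRET` and `NEXT_PUBLIC_APP_URL` environment variables come from the source; the endpoint path, header name, and helper are hypothetical:

```typescript
// Hypothetical sketch of apps/realtime calling back into apps/sim over
// internal HTTP, authenticated with the shared INTERNAL_API_SECRET.
// Returns the URL and fetch init so the request shape is inspectable.
function buildInternalRequest(baseUrl: string, secret: string, path: string, body: unknown) {
  return {
    url: `${baseUrl}${path}`,
    init: {
      method: 'POST' as const,
      headers: {
        'content-type': 'application/json',
        // Header name is an assumption; the repo defines only the secret.
        authorization: `Bearer ${secret}`,
      },
      body: JSON.stringify(body),
    },
  }
}
```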
### Feature Organization
Features live under `app/workspace/[workspaceId]/`:

View File

@@ -10,6 +10,7 @@ import { usePostHog } from 'posthog-js/react'
import { Input, Label } from '@/components/emcn'
import { client, useSession } from '@/lib/auth/auth-client'
import { getEnv, isFalsy, isTruthy } from '@/lib/core/config/env'
import { validateCallbackUrl } from '@/lib/core/security/input-validation'
import { cn } from '@/lib/core/utils/cn'
import { quickValidateEmail } from '@/lib/messaging/email/validation'
import { captureClientEvent, captureEvent } from '@/lib/posthog/client'
@@ -102,10 +103,14 @@ function SignupFormContent({ githubAvailable, googleAvailable, isProduction }: S
useEffect(() => {
setTurnstileSiteKey(getEnv('NEXT_PUBLIC_TURNSTILE_SITE_KEY'))
}, [])
const redirectUrl = useMemo(
() => searchParams.get('redirect') || searchParams.get('callbackUrl') || '',
[searchParams]
)
const rawRedirectUrl = searchParams.get('redirect') || searchParams.get('callbackUrl') || ''
const isValidRedirectUrl = rawRedirectUrl ? validateCallbackUrl(rawRedirectUrl) : false
const invalidCallbackRef = useRef(false)
if (rawRedirectUrl && !isValidRedirectUrl && !invalidCallbackRef.current) {
invalidCallbackRef.current = true
logger.warn('Invalid callback URL detected and blocked:', { url: rawRedirectUrl })
}
const redirectUrl = isValidRedirectUrl ? rawRedirectUrl : ''
const isInviteFlow = useMemo(
() =>
searchParams.get('invite_flow') === 'true' ||

View File

@@ -4,6 +4,7 @@ import { useEffect, useState } from 'react'
import { createLogger } from '@sim/logger'
import { useRouter, useSearchParams } from 'next/navigation'
import { client, useSession } from '@/lib/auth/auth-client'
import { validateCallbackUrl } from '@/lib/core/security/input-validation'
const logger = createLogger('useVerification')
@@ -55,8 +56,11 @@ export function useVerification({
}
const storedRedirectUrl = sessionStorage.getItem('inviteRedirectUrl')
if (storedRedirectUrl) {
if (storedRedirectUrl && validateCallbackUrl(storedRedirectUrl)) {
setRedirectUrl(storedRedirectUrl)
} else if (storedRedirectUrl) {
logger.warn('Ignoring unsafe stored invite redirect URL', { url: storedRedirectUrl })
sessionStorage.removeItem('inviteRedirectUrl')
}
const storedIsInviteFlow = sessionStorage.getItem('isInviteFlow')
@@ -67,7 +71,11 @@ export function useVerification({
const redirectParam = searchParams.get('redirectAfter')
if (redirectParam) {
setRedirectUrl(redirectParam)
if (validateCallbackUrl(redirectParam)) {
setRedirectUrl(redirectParam)
} else {
logger.warn('Ignoring unsafe redirectAfter parameter', { url: redirectParam })
}
}
const inviteFlowParam = searchParams.get('invite_flow')

View File

@@ -1,26 +0,0 @@
import { createLogger } from '@sim/logger'
const DEFAULT_STARS = '19.4k'
const logger = createLogger('GitHubStars')
export async function getFormattedGitHubStars(): Promise<string> {
try {
const response = await fetch('/api/stars', {
headers: {
'Cache-Control': 'max-age=3600', // Cache for 1 hour
},
})
if (!response.ok) {
logger.warn('Failed to fetch GitHub stars from API')
return DEFAULT_STARS
}
const data = await response.json()
return data.stars || DEFAULT_STARS
} catch (error) {
logger.warn('Error fetching GitHub stars:', error)
return DEFAULT_STARS
}
}

View File

@@ -0,0 +1,82 @@
import { z } from 'zod'
import { NO_EMAIL_HEADER_CONTROL_CHARS_REGEX } from '@/lib/messaging/email/utils'
import { quickValidateEmail } from '@/lib/messaging/email/validation'
export const CONTACT_TOPIC_VALUES = [
'general',
'support',
'integration',
'feature_request',
'sales',
'partnership',
'billing',
'other',
] as const
export const CONTACT_TOPIC_OPTIONS = [
{ value: 'general', label: 'General question' },
{ value: 'support', label: 'Technical support' },
{ value: 'integration', label: 'Integration request' },
{ value: 'feature_request', label: 'Feature request' },
{ value: 'sales', label: 'Sales & pricing' },
{ value: 'partnership', label: 'Partnership' },
{ value: 'billing', label: 'Billing' },
{ value: 'other', label: 'Other' },
] as const
export const contactRequestSchema = z.object({
name: z
.string()
.trim()
.min(1, 'Name is required')
.max(120, 'Name must be 120 characters or less')
.regex(NO_EMAIL_HEADER_CONTROL_CHARS_REGEX, 'Invalid characters'),
email: z
.string()
.trim()
.min(1, 'Email is required')
.max(320)
.transform((value) => value.toLowerCase())
.refine((value) => quickValidateEmail(value).isValid, 'Enter a valid email'),
company: z
.string()
.trim()
.max(120, 'Company must be 120 characters or less')
.optional()
.transform((value) => (value && value.length > 0 ? value : undefined)),
topic: z.enum(CONTACT_TOPIC_VALUES, {
errorMap: () => ({ message: 'Please select a topic' }),
}),
subject: z
.string()
.trim()
.min(1, 'Subject is required')
.max(200, 'Subject must be 200 characters or less')
.regex(NO_EMAIL_HEADER_CONTROL_CHARS_REGEX, 'Invalid characters'),
message: z
.string()
.trim()
.min(1, 'Message is required')
.max(5000, 'Message must be 5,000 characters or less'),
})
export type ContactRequestPayload = z.infer<typeof contactRequestSchema>
export function getContactTopicLabel(value: ContactRequestPayload['topic']): string {
return CONTACT_TOPIC_OPTIONS.find((option) => option.value === value)?.label ?? value
}
export type HelpEmailType = 'bug' | 'feedback' | 'feature_request' | 'other'
export function mapContactTopicToHelpType(topic: ContactRequestPayload['topic']): HelpEmailType {
switch (topic) {
case 'feature_request':
return 'feature_request'
case 'support':
return 'bug'
case 'integration':
return 'feedback'
default:
return 'other'
}
}
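The topic-to-help-type collapse above is easy to sanity-check in isolation. A minimal standalone sketch (the name `mapTopicToHelpType` and the inlined union types are hypothetical stand-ins for the exported `mapContactTopicToHelpType` and its zod-derived types, so the mapping can run without the app's module graph):

```typescript
// Standalone mirror of the switch in mapContactTopicToHelpType above.
type ContactTopic =
  | 'general' | 'support' | 'integration' | 'feature_request'
  | 'sales' | 'partnership' | 'billing' | 'other'
type HelpEmailType = 'bug' | 'feedback' | 'feature_request' | 'other'

function mapTopicToHelpType(topic: ContactTopic): HelpEmailType {
  switch (topic) {
    case 'feature_request':
      return 'feature_request'
    case 'support':
      return 'bug'       // technical support requests file as bug reports
    case 'integration':
      return 'feedback'  // integration asks land as product feedback
    default:
      return 'other'     // sales, billing, partnership, etc. collapse here
  }
}
```

Note the default arm is what keeps the mapping total: adding a new value to `CONTACT_TOPIC_VALUES` never produces an unmapped help type.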


@@ -0,0 +1,354 @@
'use client'
import { useEffect, useRef, useState } from 'react'
import { Turnstile, type TurnstileInstance } from '@marsidev/react-turnstile'
import { toError } from '@sim/utils/errors'
import { useMutation } from '@tanstack/react-query'
import Link from 'next/link'
import { Combobox, type ComboboxOption, Input, Textarea } from '@/components/emcn'
import { Check } from '@/components/emcn/icons'
import { getEnv } from '@/lib/core/config/env'
import { captureClientEvent } from '@/lib/posthog/client'
import {
CONTACT_TOPIC_OPTIONS,
type ContactRequestPayload,
contactRequestSchema,
} from '@/app/(landing)/components/contact/consts'
import { LandingField } from '@/app/(landing)/components/forms/landing-field'
type ContactField = keyof ContactRequestPayload
type ContactErrors = Partial<Record<ContactField, string>>
interface ContactFormState {
name: string
email: string
company: string
topic: ContactRequestPayload['topic'] | ''
subject: string
message: string
}
const INITIAL_FORM_STATE: ContactFormState = {
name: '',
email: '',
company: '',
topic: '',
subject: '',
message: '',
}
const LANDING_INPUT =
'h-[40px] rounded-[5px] border border-[var(--landing-bg-elevated)] bg-[var(--landing-bg-surface)] px-3 font-[430] font-season text-[14px] text-[var(--landing-text)] outline-none transition-colors placeholder:text-[var(--landing-text-muted)] focus:border-[var(--landing-border-strong)]'
const LANDING_TEXTAREA =
'min-h-[140px] rounded-[5px] border border-[var(--landing-bg-elevated)] bg-[var(--landing-bg-surface)] px-3 py-2.5 font-[430] font-season text-[14px] text-[var(--landing-text)] outline-none transition-colors placeholder:text-[var(--landing-text-muted)] focus:border-[var(--landing-border-strong)]'
const LANDING_COMBOBOX =
'h-[40px] rounded-[5px] border border-[var(--landing-bg-elevated)] bg-[var(--landing-bg-surface)] px-3 font-[430] font-season text-[14px] text-[var(--landing-text)] hover:bg-[var(--landing-bg-surface)] focus-within:border-[var(--landing-border-strong)]'
const LANDING_SUBMIT =
'flex h-[40px] w-full items-center justify-center rounded-[5px] border border-[var(--landing-text-subtle)] bg-[var(--landing-text-subtle)] font-[430] font-season text-[14px] text-[var(--landing-text-dark)] transition-colors hover:border-[var(--landing-bg-hover)] hover:bg-[var(--landing-bg-hover)] disabled:cursor-not-allowed disabled:opacity-60'
const LANDING_LABEL =
'font-[500] font-season text-[13px] text-[var(--landing-text)] tracking-[0.02em]'
interface SubmitContactRequestInput extends ContactRequestPayload {
website: string
captchaToken?: string
captchaUnavailable?: boolean
}
async function submitContactRequest(payload: SubmitContactRequestInput) {
const response = await fetch('/api/contact', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify(payload),
})
const result = (await response.json().catch(() => null)) as {
error?: string
message?: string
} | null
if (!response.ok) {
throw new Error(result?.error || 'Failed to send message')
}
return result
}
export function ContactForm() {
const turnstileRef = useRef<TurnstileInstance>(null)
const contactMutation = useMutation({
mutationFn: submitContactRequest,
onSuccess: (_data, variables) => {
captureClientEvent('landing_contact_submitted', { topic: variables.topic })
setForm(INITIAL_FORM_STATE)
setErrors({})
setSubmitSuccess(true)
},
onError: () => {
turnstileRef.current?.reset()
},
})
const [form, setForm] = useState<ContactFormState>(INITIAL_FORM_STATE)
const [errors, setErrors] = useState<ContactErrors>({})
const [submitSuccess, setSubmitSuccess] = useState(false)
const [isSubmitting, setIsSubmitting] = useState(false)
const [website, setWebsite] = useState('')
const [widgetReady, setWidgetReady] = useState(false)
const [turnstileSiteKey, setTurnstileSiteKey] = useState<string | undefined>()
useEffect(() => {
setTurnstileSiteKey(getEnv('NEXT_PUBLIC_TURNSTILE_SITE_KEY'))
}, [])
function updateField<TField extends keyof ContactFormState>(
field: TField,
value: ContactFormState[TField]
) {
setForm((prev) => ({ ...prev, [field]: value }))
setErrors((prev) => {
if (!prev[field as ContactField]) {
return prev
}
const nextErrors = { ...prev }
delete nextErrors[field as ContactField]
return nextErrors
})
if (contactMutation.isError) {
contactMutation.reset()
}
}
async function handleSubmit(event: React.FormEvent<HTMLFormElement>) {
event.preventDefault()
if (contactMutation.isPending || isSubmitting) return
setIsSubmitting(true)
const parsed = contactRequestSchema.safeParse({
...form,
company: form.company || undefined,
})
if (!parsed.success) {
const fieldErrors = parsed.error.flatten().fieldErrors
setErrors({
name: fieldErrors.name?.[0],
email: fieldErrors.email?.[0],
company: fieldErrors.company?.[0],
topic: fieldErrors.topic?.[0],
subject: fieldErrors.subject?.[0],
message: fieldErrors.message?.[0],
})
setIsSubmitting(false)
return
}
let captchaToken: string | undefined
let captchaUnavailable: boolean | undefined
const widget = turnstileRef.current
if (turnstileSiteKey) {
if (widgetReady && widget) {
try {
widget.reset()
widget.execute()
captchaToken = await widget.getResponsePromise(30_000)
} catch {
captchaUnavailable = true
}
} else {
captchaUnavailable = true
}
}
contactMutation.mutate({ ...parsed.data, website, captchaToken, captchaUnavailable })
setIsSubmitting(false)
}
const isBusy = contactMutation.isPending || isSubmitting
const submitError = contactMutation.isError
? toError(contactMutation.error).message || 'Failed to send message. Please try again.'
: null
if (submitSuccess) {
return (
<div className='flex flex-col items-center px-8 py-16 text-center'>
<div className='flex h-16 w-16 items-center justify-center rounded-full border border-[var(--landing-bg-elevated)] bg-[var(--landing-bg-surface)] text-[var(--landing-text)]'>
<Check className='h-8 w-8' />
</div>
<h2 className='mt-6 font-[430] font-season text-[24px] text-[var(--landing-text)] leading-[1.2] tracking-[-0.02em]'>
Message received
</h2>
<p className='mt-3 max-w-sm font-season text-[14px] text-[var(--landing-text-body)] leading-[1.6]'>
Thanks for reaching out. We've sent a confirmation to your inbox and will get back to you
shortly.
</p>
<button
type='button'
onClick={() => setSubmitSuccess(false)}
className='mt-6 font-season text-[13px] text-[var(--landing-text)] underline underline-offset-2 transition-opacity hover:opacity-80'
>
Send another message
</button>
</div>
)
}
return (
<form onSubmit={handleSubmit} className='relative flex flex-col gap-5'>
{/* Honeypot */}
<div
aria-hidden='true'
className='pointer-events-none absolute left-[-9999px] h-px w-px overflow-hidden opacity-0'
>
<label htmlFor='contact-website'>Website</label>
<input
id='contact-website'
name='website'
type='text'
tabIndex={-1}
autoComplete='off'
value={website}
onChange={(event) => setWebsite(event.target.value)}
data-lpignore='true'
data-1p-ignore='true'
/>
</div>
<div className='grid gap-5 sm:grid-cols-2'>
<LandingField
htmlFor='contact-name'
label='Name'
error={errors.name}
labelClassName={LANDING_LABEL}
>
<Input
id='contact-name'
value={form.name}
onChange={(event) => updateField('name', event.target.value)}
placeholder='Your name'
className={LANDING_INPUT}
/>
</LandingField>
<LandingField
htmlFor='contact-email'
label='Email'
error={errors.email}
labelClassName={LANDING_LABEL}
>
<Input
id='contact-email'
type='email'
value={form.email}
onChange={(event) => updateField('email', event.target.value)}
placeholder='you@company.com'
className={LANDING_INPUT}
/>
</LandingField>
</div>
<div className='grid gap-5 sm:grid-cols-2'>
<LandingField
htmlFor='contact-company'
label='Company'
optional
error={errors.company}
labelClassName={LANDING_LABEL}
>
<Input
id='contact-company'
value={form.company}
onChange={(event) => updateField('company', event.target.value)}
placeholder='Company name'
className={LANDING_INPUT}
/>
</LandingField>
<LandingField
htmlFor='contact-topic'
label='Topic'
error={errors.topic}
labelClassName={LANDING_LABEL}
>
<Combobox
options={CONTACT_TOPIC_OPTIONS as unknown as ComboboxOption[]}
value={form.topic}
selectedValue={form.topic}
onChange={(value) => updateField('topic', value as ContactRequestPayload['topic'])}
placeholder='Select a topic'
editable={false}
filterOptions={false}
className={LANDING_COMBOBOX}
/>
</LandingField>
</div>
<LandingField
htmlFor='contact-subject'
label='Subject'
error={errors.subject}
labelClassName={LANDING_LABEL}
>
<Input
id='contact-subject'
value={form.subject}
onChange={(event) => updateField('subject', event.target.value)}
placeholder='How can we help?'
className={LANDING_INPUT}
/>
</LandingField>
<LandingField
htmlFor='contact-message'
label='Message'
error={errors.message}
labelClassName={LANDING_LABEL}
>
<Textarea
id='contact-message'
value={form.message}
onChange={(event) => updateField('message', event.target.value)}
placeholder='Share details so we can help as quickly as possible'
className={LANDING_TEXTAREA}
/>
</LandingField>
{turnstileSiteKey ? (
<Turnstile
ref={turnstileRef}
siteKey={turnstileSiteKey}
options={{ execution: 'execute', appearance: 'execute', size: 'invisible' }}
onWidgetLoad={() => setWidgetReady(true)}
onExpire={() => setWidgetReady(false)}
onError={() => setWidgetReady(false)}
onUnsupported={() => setWidgetReady(false)}
/>
) : null}
{submitError ? (
<p role='alert' className='font-season text-[13px] text-[var(--text-error)]'>
{submitError}
</p>
) : null}
<button type='submit' disabled={isBusy} className={LANDING_SUBMIT}>
{isBusy ? 'Sending...' : 'Send message'}
</button>
<p className='text-center font-season text-[12px] text-[var(--landing-text-muted)] leading-[1.6]'>
By submitting, you agree to our{' '}
<Link
href='/privacy'
className='text-[var(--landing-text)] underline underline-offset-2 transition-opacity hover:opacity-80'
>
Privacy Policy
</Link>
.
</p>
</form>
)
}
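The captcha branch in `handleSubmit` above encodes a graceful-degradation policy: if a site key is configured, try the widget, but any failure (widget never loaded, token timeout, error) sets `captchaUnavailable` instead of blocking the submission. A minimal sketch of that decision, pulled out into a pure async function (the name `resolveCaptcha` and the injected `getToken` callback are hypothetical; the real code drives the Turnstile ref directly):

```typescript
// Distilled captcha fallback: attach a token when possible, otherwise
// flag unavailability so the server can decide how to handle it.
interface CaptchaResult {
  captchaToken?: string
  captchaUnavailable?: boolean
}

async function resolveCaptcha(
  siteKey: string | undefined,
  widgetReady: boolean,
  getToken: () => Promise<string>
): Promise<CaptchaResult> {
  if (!siteKey) return {} // captcha not configured: nothing to attach
  if (!widgetReady) return { captchaUnavailable: true }
  try {
    return { captchaToken: await getToken() }
  } catch {
    return { captchaUnavailable: true } // timeout or widget error: flag, don't block
  }
}
```

Keeping the three outcomes mutually exclusive (token, unavailable flag, or neither) is what lets the API route distinguish "no captcha deployed" from "captcha deployed but failed".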


@@ -1,6 +1,7 @@
'use client'
import { useCallback, useState } from 'react'
import { useState } from 'react'
import { useMutation } from '@tanstack/react-query'
import {
Combobox,
Input,
@@ -19,6 +20,7 @@ import {
type DemoRequestPayload,
demoRequestSchema,
} from '@/app/(landing)/components/demo-request/consts'
import { LandingField } from '@/app/(landing)/components/forms/landing-field'
interface DemoRequestModalProps {
children: React.ReactNode
@@ -49,136 +51,104 @@ const INITIAL_FORM_STATE: DemoRequestFormState = {
details: '',
}
interface LandingFieldProps {
label: string
htmlFor: string
optional?: boolean
error?: string
children: React.ReactNode
}
function LandingField({ label, htmlFor, optional, error, children }: LandingFieldProps) {
return (
<div className='flex flex-col gap-1.5'>
<label
htmlFor={htmlFor}
className='font-[430] font-season text-[13px] text-[var(--text-secondary)] tracking-[0.02em]'
>
{label}
{optional ? <span className='ml-1 text-[var(--text-muted)]'>(optional)</span> : null}
</label>
{children}
{error ? <p className='text-[12px] text-[var(--text-error)]'>{error}</p> : null}
</div>
)
}
const LANDING_INPUT =
'h-[32px] rounded-[5px] border border-[var(--border-1)] bg-[var(--surface-5)] px-2.5 font-[430] font-season text-[13.5px] text-[var(--text-primary)] transition-colors placeholder:text-[var(--text-muted)] outline-none'
async function submitDemoRequest(payload: DemoRequestPayload) {
const response = await fetch('/api/demo-requests', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify(payload),
})
const result = (await response.json().catch(() => null)) as {
error?: string
message?: string
} | null
if (!response.ok) {
throw new Error(result?.error || 'Failed to submit demo request')
}
return result
}
export function DemoRequestModal({ children, theme = 'dark' }: DemoRequestModalProps) {
const [open, setOpen] = useState(false)
const [form, setForm] = useState<DemoRequestFormState>(INITIAL_FORM_STATE)
const [errors, setErrors] = useState<DemoRequestErrors>({})
const [isSubmitting, setIsSubmitting] = useState(false)
const [submitError, setSubmitError] = useState<string | null>(null)
const [submitSuccess, setSubmitSuccess] = useState(false)
const resetForm = useCallback(() => {
const demoMutation = useMutation({
mutationFn: submitDemoRequest,
onSuccess: (_data, variables) => {
captureClientEvent('landing_demo_request_submitted', {
company_size: variables.companySize,
})
setSubmitSuccess(true)
},
})
function resetForm() {
setForm(INITIAL_FORM_STATE)
setErrors({})
setIsSubmitting(false)
setSubmitError(null)
setSubmitSuccess(false)
}, [])
demoMutation.reset()
}
const handleOpenChange = useCallback(
(nextOpen: boolean) => {
setOpen(nextOpen)
resetForm()
},
[resetForm]
)
function handleOpenChange(nextOpen: boolean) {
setOpen(nextOpen)
resetForm()
}
const updateField = useCallback(
<TField extends keyof DemoRequestFormState>(
field: TField,
value: DemoRequestFormState[TField]
) => {
setForm((prev) => ({ ...prev, [field]: value }))
setErrors((prev) => {
if (!prev[field]) {
return prev
}
const nextErrors = { ...prev }
delete nextErrors[field]
return nextErrors
})
setSubmitError(null)
setSubmitSuccess(false)
},
[]
)
const handleSubmit = useCallback(
async (event: React.FormEvent<HTMLFormElement>) => {
event.preventDefault()
setSubmitError(null)
setSubmitSuccess(false)
const parsed = demoRequestSchema.safeParse({
...form,
phoneNumber: form.phoneNumber || undefined,
})
if (!parsed.success) {
const fieldErrors = parsed.error.flatten().fieldErrors
setErrors({
firstName: fieldErrors.firstName?.[0],
lastName: fieldErrors.lastName?.[0],
companyEmail: fieldErrors.companyEmail?.[0],
phoneNumber: fieldErrors.phoneNumber?.[0],
companySize: fieldErrors.companySize?.[0],
details: fieldErrors.details?.[0],
})
return
function updateField<TField extends keyof DemoRequestFormState>(
field: TField,
value: DemoRequestFormState[TField]
) {
setForm((prev) => ({ ...prev, [field]: value }))
setErrors((prev) => {
if (!prev[field]) {
return prev
}
const nextErrors = { ...prev }
delete nextErrors[field]
return nextErrors
})
if (demoMutation.isError) {
demoMutation.reset()
}
}
setIsSubmitting(true)
function handleSubmit(event: React.FormEvent<HTMLFormElement>) {
event.preventDefault()
if (demoMutation.isPending) return
try {
const response = await fetch('/api/demo-requests', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify(parsed.data),
})
const parsed = demoRequestSchema.safeParse({
...form,
phoneNumber: form.phoneNumber || undefined,
})
const result = (await response.json().catch(() => null)) as {
error?: string
message?: string
} | null
if (!parsed.success) {
const fieldErrors = parsed.error.flatten().fieldErrors
setErrors({
firstName: fieldErrors.firstName?.[0],
lastName: fieldErrors.lastName?.[0],
companyEmail: fieldErrors.companyEmail?.[0],
phoneNumber: fieldErrors.phoneNumber?.[0],
companySize: fieldErrors.companySize?.[0],
details: fieldErrors.details?.[0],
})
return
}
if (!response.ok) {
throw new Error(result?.error || 'Failed to submit demo request')
}
demoMutation.mutate(parsed.data)
}
setSubmitSuccess(true)
captureClientEvent('landing_demo_request_submitted', {
company_size: parsed.data.companySize,
})
} catch (error) {
setSubmitError(
error instanceof Error
? error.message
: 'Failed to submit demo request. Please try again.'
)
} finally {
setIsSubmitting(false)
}
},
[form, resetForm]
)
const submitError = demoMutation.isError
? demoMutation.error instanceof Error
? demoMutation.error.message
: 'Failed to submit demo request. Please try again.'
: null
return (
<Modal open={open} onOpenChange={handleOpenChange}>
@@ -284,14 +254,16 @@ export function DemoRequestModal({ children, theme = 'dark' }: DemoRequestModalP
<ModalFooter className='flex-col items-stretch gap-3 border-t-0 bg-transparent pt-0'>
{submitError && (
<p className='font-season text-[13px] text-[var(--text-error)]'>{submitError}</p>
<p role='alert' className='font-season text-[13px] text-[var(--text-error)]'>
{submitError}
</p>
)}
<button
type='submit'
disabled={isSubmitting}
disabled={demoMutation.isPending}
className='flex h-[32px] w-full items-center justify-center rounded-[5px] bg-[var(--text-primary)] font-[430] font-season text-[13.5px] text-[var(--bg)] transition-colors hover:opacity-90 disabled:cursor-not-allowed disabled:opacity-50'
>
{isSubmitting ? 'Submitting...' : 'Submit'}
{demoMutation.isPending ? 'Submitting...' : 'Submit'}
</button>
</ModalFooter>
</form>


@@ -31,6 +31,7 @@ const RESOURCES_LINKS: FooterItem[] = [
{ label: 'Partners', href: '/partners' },
{ label: 'Careers', href: 'https://jobs.ashbyhq.com/sim', external: true, externalArrow: true },
{ label: 'Changelog', href: '/changelog' },
{ label: 'Contact', href: '/contact' },
]
const BLOCK_LINKS: FooterItem[] = [


@@ -0,0 +1,49 @@
import { cloneElement, isValidElement } from 'react'
interface LandingFieldProps {
label: string
htmlFor: string
optional?: boolean
error?: string
children: React.ReactNode
/** Replaces the default label className. */
labelClassName?: string
}
const DEFAULT_LABEL_CLASSNAME =
'font-[430] font-season text-[13px] text-[var(--text-secondary)] tracking-[0.02em]'
export function LandingField({
label,
htmlFor,
optional,
error,
children,
labelClassName,
}: LandingFieldProps) {
const errorId = error ? `${htmlFor}-error` : undefined
const describedChild =
errorId && isValidElement<{ 'aria-describedby'?: string; 'aria-invalid'?: boolean }>(children)
? cloneElement(children, { 'aria-describedby': errorId, 'aria-invalid': true })
: children
return (
<div className='flex flex-col gap-1.5'>
<div className='flex min-h-[18px] items-baseline justify-between gap-3'>
<label htmlFor={htmlFor} className={labelClassName ?? DEFAULT_LABEL_CLASSNAME}>
{label}
{optional ? <span className='ml-1 text-[var(--text-muted)]'>(optional)</span> : null}
</label>
{error ? (
<span
id={errorId}
role='alert'
className='truncate font-season text-[12px] text-[var(--text-error)]'
>
{error}
</span>
) : null}
</div>
{describedChild}
</div>
)
}


@@ -1,13 +1,7 @@
'use client'
import { useEffect, useState } from 'react'
import { createLogger } from '@sim/logger'
import { GithubOutlineIcon } from '@/components/icons'
import { getFormattedGitHubStars } from '@/app/(landing)/actions/github'
const logger = createLogger('github-stars')
const INITIAL_STARS = '27.7k'
import { useGitHubStars } from '@/hooks/queries/github-stars'
/**
* Client component that displays GitHub stars count.
@@ -16,15 +10,7 @@ const INITIAL_STARS = '27.7k'
* a Server Component for optimal SEO/GEO crawlability.
*/
export function GitHubStars() {
const [stars, setStars] = useState(INITIAL_STARS)
useEffect(() => {
getFormattedGitHubStars()
.then(setStars)
.catch((error) => {
logger.warn('Failed to fetch GitHub stars', error)
})
}, [])
const { data: stars } = useGitHubStars()
return (
<a


@@ -0,0 +1,40 @@
import Link from 'next/link'
import { getNavBlogPosts } from '@/lib/blog/registry'
import AuthBackground from '@/app/(auth)/components/auth-background'
import { AUTH_PRIMARY_CTA_BASE } from '@/app/(auth)/components/auth-button-classes'
import Navbar from '@/app/(landing)/components/navbar/navbar'
/**
* Shared 404 view used by every `not-found.tsx` under the landing surface.
*
* Rendered outside the route-group `(shell)` layout so it owns the full
* viewport (Navbar + AuthBackground decoration, no Footer), matching the
* root `/` 404 treatment.
*/
export default async function NotFoundView() {
const blogPosts = await getNavBlogPosts()
return (
<AuthBackground className='dark font-[430] font-season'>
<main className='relative flex min-h-full flex-col text-[var(--landing-text)]'>
<header className='shrink-0 bg-[var(--landing-bg)]'>
<Navbar blogPosts={blogPosts} />
</header>
<div className='relative z-30 flex flex-1 items-center justify-center px-4 pb-24'>
<div className='flex flex-col items-center gap-3'>
<h1 className='text-balance font-[430] font-season text-[40px] text-white leading-[110%] tracking-[-0.02em]'>
Page not found
</h1>
<p className='font-[430] font-season text-[color-mix(in_srgb,var(--landing-text-subtle)_60%,transparent)] text-lg leading-[125%] tracking-[0.02em]'>
The page you&apos;re looking for doesn&apos;t exist or has been moved.
</p>
<div className='mt-3 flex items-center gap-2'>
<Link href='/' className={AUTH_PRIMARY_CTA_BASE}>
Return to Home
</Link>
</div>
</div>
</div>
</main>
</AuthBackground>
)
}

Some files were not shown because too many files have changed in this diff.