Compare commits


65 Commits

Author SHA1 Message Date
Waleed
7d0fdefb22 v0.6.18: file operations block, profound integration, edge connection improvements, copy logs, knowledgebase robustness 2026-03-30 21:35:41 -07:00
Vikhyath Mondreti
90f592797a fix(file): use file-upload subblock (#3862)
* fix(file): use file-upload subblock

* fix preview + logs url for notifs

* fix color for profound

* remove canonical param from desc
2026-03-30 21:29:01 -07:00
Waleed
d091441e39 feat(logs): add copy link and deep-link support for log entries (#3863)
* feat(logs): add copy link and deep link support for log entries

* fix(logs): move Link icon to emcn and handle clipboard rejections

* feat(notifications): use executionId deep-link for View Log URLs

Switch buildLogUrl from ?search= to ?executionId= so email and Slack
'View Log' buttons open the logs page with the specific execution
auto-selected and the details panel expanded.
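
The switch described above can be sketched as a small URL builder. The function name matches the commit message, but the signature and the `/logs` path are assumptions, not the real implementation:

```typescript
// Hedged sketch of buildLogUrl after the change: deep-link by executionId
// rather than a text search, so the logs page can auto-select the execution
// and expand its details panel. Path and signature are assumptions.
function buildLogUrl(baseUrl: string, executionId: string): string {
  const url = new URL('/logs', baseUrl)
  url.searchParams.set('executionId', executionId)
  return url.toString()
}
```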
2026-03-30 21:17:58 -07:00
Waleed
7d4dd26760 fix(knowledge): fix document processing stuck in processing state (#3857)
* fix(knowledge): fix document processing stuck in processing state

* fix(knowledge): use Promise.allSettled for document dispatch and fix Copilot OAuth context

- Change Promise.all to Promise.allSettled in processDocumentsWithQueue so
  one failed dispatch doesn't abort the entire batch
- Add writeOAuthReturnContext before showing LazyOAuthRequiredModal from
  Copilot tools so useOAuthReturnForWorkflow can handle the return
- Add consumeOAuthReturnContext on modal close to clean up stale context

* fix(knowledge): fix type error in useCredentialRefreshTriggers call

Pass empty string instead of undefined for connectorProviderId fallback
to match the hook's string parameter type.

* upgrade turbo

* fix(knowledge): fix type error in connectors-section useCredentialRefreshTriggers call

Same string narrowing fix as add-connector-modal — pass empty string
fallback for providerId.
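
The Promise.allSettled change described above can be sketched as follows; `dispatchDocument` and the result shape are illustrative stand-ins, not the real queue code:

```typescript
// One failed dispatch no longer aborts the whole batch: allSettled records
// each outcome instead of rejecting on the first error (as Promise.all does).
async function dispatchAll(
  docs: string[],
  dispatchDocument: (doc: string) => Promise<void>
): Promise<{ ok: string[]; failed: string[] }> {
  const results = await Promise.allSettled(docs.map((d) => dispatchDocument(d)))
  const ok: string[] = []
  const failed: string[] = []
  results.forEach((r, i) => {
    if (r.status === 'fulfilled') ok.push(docs[i])
    else failed.push(docs[i]) // failure is recorded, not rethrown
  })
  return { ok, failed }
}
```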
2026-03-30 20:35:08 -07:00
Vikhyath Mondreti
0abeac77e1 improvement(platform): standardize perms, audit logging, lifecycle across admin, copilot, ui actions (#3858)
* improvement(platform): standardize perms, audit logging, lifecycle mgmt across admin, copilot, ui actions

* address comments

* improve error codes

* address bugbot comments

* fix test
2026-03-30 20:25:38 -07:00
Waleed
e9c94fa462 feat(logs): add copy link and deep link support for log entries (#3855)
* feat(logs): add copy link and deep link support for log entries

* fix(logs): fetch next page when deep linked log is beyond initial page

* fix(logs): move Link icon to emcn and handle clipboard rejections

* fix(logs): track isFetching reactively and drop empty-list early-return

- Remove the empty-list early-return guard that prevented clearing the
  pending ref when filters return no results
- Use isFetching directly in the condition and add it to
  the effect deps so the effect re-triggers after a background refetch

* fix(logs): guard deep-link ref clear until query has succeeded

Only clear pendingExecutionIdRef when the query status is 'success',
preventing premature clearing before the initial fetch completes.
On mount, the query is disabled (isInitialized.current starts false),
so hasNextPage is false but no data has loaded yet — the ref was being
cleared in the same effect pass that set it.

* fix(logs): guard fetchNextPage call until query has succeeded

Add logsQuery.status === 'success' to the fetchNextPage branch so it
mirrors the clear branch. On mount the query is disabled (isFetching is
false, status is pending), causing the effect to call fetchNextPage()
before the query is initialized — now both branches require success.
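
The two guards can be condensed into a pure decision function; this is a hedged sketch of the effect's logic, with names and shape assumed rather than taken from the code:

```typescript
// Both the fetch-next-page branch and the clear branch require a successful
// query, so neither fires while the query is still disabled/pending on mount.
type QueryStatus = 'pending' | 'success' | 'error'

function deepLinkAction(opts: {
  status: QueryStatus
  hasNextPage: boolean
  found: boolean
}): 'select' | 'fetchNext' | 'clear' | 'wait' {
  if (opts.found) return 'select'              // target log is on a loaded page
  if (opts.status !== 'success') return 'wait' // query not initialized yet
  return opts.hasNextPage ? 'fetchNext' : 'clear'
}
```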
2026-03-30 18:42:27 -07:00
Waleed
72eea64bf6 improvement(tour): align product tour tooltip styling with emcn and fix spotlight overflow (#3854) 2026-03-30 16:59:53 -07:00
Waleed
27460f847c fix(atlassian): harden cloud ID resolution for Confluence and Jira (#3853) 2026-03-30 16:54:45 -07:00
Vikhyath Mondreti
c7643198dc fix(mothership): hang condition (#3852) 2026-03-30 16:47:54 -07:00
Waleed
e5aef6184a feat(profound): add Profound AI visibility and analytics integration (#3849)
* feat(profound): add Profound AI visibility and analytics integration

* fix(profound): fix import ordering and JSON formatting for CI lint

* fix(profound): gate metrics mapping on current operation to prevent stale overrides

* fix(profound): guard JSON.parse on filters, fix offset=0 falsy check, remove duplicate prompt_answers in FILTER_OPS

* lint

* fix(docs): fix import ordering and trailing newline for docs lint

* fix(scripts): sort generated imports to match Biome's organizeImports order

* fix(profound): use != null checks for limit param across all tools

* fix(profound): flatten block output type to 'json' to pass block validation test

* fix(profound): remove invalid 'required' field from block inputs (not part of ParamConfig)

* fix(profound): rename tool files from kebab-case to snake_case for docs generator compatibility

* lint

* fix(docs): let biome auto-fix import order, revert custom sort in generator

* fix(landing): fix import order in sim icon-mapping via biome

* fix(scripts): match Biome's exact import sort order in docs generator

* fix(generate-docs): produce Biome-compatible JSON output

The generator wrote multi-line arrays for short string arrays (like tags)
and omitted trailing newlines, causing Biome format check failures in CI.
Post-process integrations.json to collapse short arrays onto single lines
and add trailing newlines to both integrations.json and meta.json.
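
A minimal sketch of that post-processing step; the 60-character threshold and the regex are assumptions, not the generator's actual code:

```typescript
// Collapse short string arrays onto one line and ensure a trailing newline
// so a Biome-style formatter accepts the generated JSON.
function formatForBiome(json: string): string {
  const collapsed = json.replace(
    /\[\n(\s*"(?:[^"\\]|\\.)*",?\n)+\s*\]/g,
    (block) => {
      const items = block.match(/"(?:[^"\\]|\\.)*"/g) ?? []
      const oneLine = `[${items.join(', ')}]`
      return oneLine.length <= 60 ? oneLine : block // only collapse short arrays
    }
  )
  return collapsed.endsWith('\n') ? collapsed : collapsed + '\n'
}
```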

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-30 16:30:06 -07:00
Waleed
4ae5b1b620 improvement(workflow): use DOM hit-testing for edge drop-on-block detection (#3851) 2026-03-30 16:20:30 -07:00
Waleed
5c334874eb fix(auth): use standard 'Unauthorized' error in hybrid auth responses (#3850) 2026-03-30 16:11:04 -07:00
Theodore Li
d3d58a9615 Feat/improved logging (#3833)
* feat(logs): add additional metadata for workflow execution logs

* Revert "Feat(logs) upgrade mothership chat messages to error (#3772)"

This reverts commit 9d1b9763c5.

* Fix lint, address greptile comments

* improvement(sidebar): expand sidebar by hovering and clicking the edge (#3830)

* improvement(sidebar): expand sidebar by hovering and clicking the edge

* improvement(sidebar): add keyboard shortcuts for new workflow/task, center search modal, fix edge ARIA

* improvement(sidebar): use Tooltip.Shortcut for inline shortcut display

* fix(sidebar): change new workflow shortcut from Mod+Shift+W to Mod+Shift+P to avoid browser close-window conflict

* fix(hotkeys): fall back to event.code for international keyboard layout compatibility

* fix(sidebar): guard add-workflow shortcut with canEdit and isCreatingWorkflow checks

* feat(ui): handle image paste (#3826)

* feat(ui): handle image paste

* Fix lint

* Fix type error

---------

Co-authored-by: Theodore Li <theo@sim.ai>

* feat(files): interactive markdown checkbox toggling in preview (#3829)

* feat(files): interactive markdown checkbox toggling in preview

* fix(files): handle ordered-list checkboxes and fix index drift

* lint

* fix(files): remove counter offset that prevented checkbox toggling

* fix(files): apply task-list styling to ordered lists too

* fix(files): render single pass when interactive to avoid index drift

* fix(files): move useMemo above conditional return to fix Rules of Hooks

* fix(files): pass content directly to preview when not streaming to avoid stale frame

* improvement(home): position @ mention popup at caret and fix icon consistency (#3831)

* improvement(home): position @ mention popup at caret and fix icon consistency

* fix(home): pin mirror div to document origin and guard button anchor

* chore(auth): restore hybrid.ts to staging

* improvement(ui): sidebar (#3832)

* Fix logger tests

* Add metadata to mothership logs

---------

Co-authored-by: Theodore Li <theo@sim.ai>
Co-authored-by: Waleed <walif6@gmail.com>
Co-authored-by: Theodore Li <theo@sim.ai>
2026-03-30 19:02:17 -04:00
Waleed
1d59eca90a fix(analytics): use getBaseDomain for Profound host field (#3848)
request.url resolves to internal ALB hostname on ECS, not the public domain
2026-03-30 15:36:41 -07:00
Theodore Li
e1359b09d6 feat(block) add block write and append operations (#3665)
* Add file write and delete operations

* Add file block write operation

* Fix lint

* Allow loop-in-loop workflow edits

* Fix type error

* Remove file id input, output link correctly

* Add append tool

* fix lint

* Address feedback

* Handle writing to same file name gracefully

* Removed  mime type from append block

* Add lock for file append operation

---------

Co-authored-by: Theodore Li <theo@sim.ai>
2026-03-30 18:08:51 -04:00
Waleed
35b3646330 fix(sidebar): cmd+click opens in new tab, shift+click for range select (#3846)
* fix(sidebar): cmd+click opens in new tab, shift+click for range select

* comment cleanup

* fix(sidebar): drop stale metaKey param from workflow and task selection hooks
2026-03-30 11:02:33 -07:00
Waleed
73e00f53e1 v0.6.17: trigger.dev CI, workers FF 2026-03-30 09:33:30 -07:00
Waleed
5c47ea58f8 chore(trigger): update @trigger.dev/sdk and @trigger.dev/build to 4.4.3 (#3843)
* chore(trigger): update @trigger.dev/sdk and @trigger.dev/build to 4.4.3

* fix(webhooks): execute non-polling webhooks inline when BullMQ is disabled
2026-03-30 09:27:15 -07:00
Vikhyath Mondreti
1d7ae906bc v0.6.16: bullmq optionality 2026-03-30 00:12:21 -07:00
Vikhyath Mondreti
c4f4e6b48c fix(bullmq): disable temporarily (#3841) 2026-03-30 00:04:59 -07:00
Waleed
560fa75155 v0.6.15: workers, security hardening, sidebar improvements, chat fixes, profound 2026-03-29 23:02:19 -07:00
Waleed
1728c370de improvement(landing): lighthouse performance and accessibility fixes (#3837)
* improvement(landing): lighthouse performance and accessibility fixes

* improvement(landing): extract FeatureToggleItem to deduplicate accessibility logic

* lint

* fix(landing): ensure explicit delay prop takes precedence over transition spread
2026-03-29 22:33:34 -07:00
Waleed
82e58a5082 fix(academy): hide academy pages until content is ready (#3839)
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-29 22:18:13 -07:00
Waleed
336c065234 fix(viewer): image pan/zoom, sort fixes, sidebar dot fixes (#3836)
* feat(file-viewer): add pan and zoom to image preview

* fix(viewer): fix sort key mapping, disable load-more on sort, hide status dots when menu open

* fix(file-viewer): prevent scroll bleed and zoom button micro-pans

* fix(file-viewer): use exponential zoom formula to prevent zero/negative multiplier
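
The exponential zoom fix above can be sketched as a one-liner: multiplying by `Math.exp` keeps the scale strictly positive, whereas an additive step (`scale + delta`) can reach zero or flip negative. The sensitivity constant is an assumption:

```typescript
// Exponential zoom: scale stays > 0 for any number of zoom-out steps.
function zoom(scale: number, steps: number, sensitivity = 0.2): number {
  return scale * Math.exp(steps * sensitivity)
}
```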
2026-03-29 12:40:29 -07:00
Waleed
b3713642b2 feat(resources): add sort and filter to all resource list pages (#3834)
* improvement(tables): improve table filtering UX

- Replace popover filter with persistent inline panel below toolbar
- Add AND/OR toggle between filter rules (shown in Where label slot)
- Sync filter panel state from applied filter on open
- Show filter button active state when filter is applied or panel is open
- Use readable operator labels matching dropdown options
- Add Clear filters button (shown only when filter is active)
- Close filter panel when last rule is removed via X
- Fix empty gap rows appearing in filtered results by skipping position gap rendering when filter is active
- Add toggle mode to ResourceOptionsBar for inline panel pattern
- Memoize FilterRuleRow for perf, fix filterTags key collision, remove dead filterActiveCount prop

* fix(table-filter): use ref to stabilize handleRemove/handleApply callbacks

Reading rules via ref instead of closure eliminates rules from useCallback
dependency arrays, keeping callbacks stable across rule edits and preserving
the memo() benefit on FilterRuleRow.

* improvement(tables,kb): remove hacky patterns, fix KB filter popover width

- Remove non-TSDoc comment from table-filter (rulesRef pattern is self-evident)
- Simplify SearchSection: remove setState-during-render anti-pattern; controlled
  input binds directly to search.value/onChange (simpler and correct)
- Reduce KB filter popover from w-[320px] to w-[200px]; tag filter uses vertical
  layout so narrow width works; Status-only case is now appropriately compact

* feat(knowledge): add sort and filter to KB list page

Sort dropdown: name, documents, tokens, created, last updated — pre-sorted
externally before passing rows to Resource. Active sort highlights the Sort
button; clear resets to default (created desc).

Filter popover: filter by connector status (All / With connectors /
Without connectors). Active filter shown as a removable tag in the toolbar.

* feat(files): add sort and filter to files list page

* feat(scheduled-tasks): add sort and filter to scheduled tasks page

* fix(table-filter): use explicit close handler instead of toggle

* improvement(files,knowledge): replace manual debounce with useDebounce hook and use type guards for file filtering

* fix(resource): prevent popover from inheriting anchor min-width

* feat(tables): add sort to tables list page

* feat(knowledge): add content and owner filters to KB list

* feat(scheduled-tasks): add status and health filters

* feat(files): add size and uploaded-by filters to files list

* feat(tables): add row count, owner, and column type filters

* improvement(scheduled-tasks): use combobox filter panel matching logs UI style

* improvement(knowledge): use combobox filter panel matching logs UI style

* improvement(files): use combobox filter panel matching logs UI style

Replaces button-list filters with Combobox-based multi-select sections for file type, size, and uploaded-by filters, aligning the panel with the logs page filter UI.

* improvement(tables): use combobox filter panel matching logs UI style

* feat(settings): add sort to recently deleted page

Add a sort dropdown next to the search bar allowing users to sort by deletion date (default, newest first), name (A–Z), or type (A–Z).

* feat(logs): add sort to logs page

* improvement(knowledge): upgrade document list filter to combobox style

* fix(resources): fix missing imports, memoization, and stale refs across resource pages

* improvement(tables): remove column type filter

* fix(resources): fix filter/sort correctness issues from audit

* fix(chunks): add server-side sort to document chunks API

Chunk sort was previously done client-side on a single page of
server-paginated data, which only reordered the current page.
Now sort params (sortBy, sortOrder) flow through the full stack:
types → service → API route → query hook → useDocumentChunks → document.tsx.

* perf(resources): memoize filterContent JSX across all resource pages

Resource is wrapped in React.memo, so an unstable filterContent reference
on every parent re-render defeats the memo. Wrap filterContent in useMemo
with correct deps in all 6 pages (files, tables, scheduled-tasks, knowledge,
base, document).

* fix(resources): add missing sort options for all visible columns

Every column visible in a resource table should be sortable. Three pages
had visible columns with no sort support:
- files.tsx: add 'owner' sort (member name lookup)
- scheduled-tasks.tsx: add 'schedule' sort (localeCompare on description)
- knowledge.tsx: add 'connectors' (count) and 'owner' (member name) sorts

Also add 'members' to processedKBs deps in knowledge.tsx since owner
sort now reads member names inside the memo.

* whitelabeling updates, sidebar fixes, files bug

* increased type safety

* pr fixes
2026-03-28 23:31:54 -07:00
Waleed
b9b930bb63 feat(analytics): add Profound web traffic tracking (#3835)
* feat(analytics): add Profound web traffic tracking

* fix(analytics): address PR review — add endpoint check and document trade-offs

* chore(analytics): remove implementation comments

* fix(analytics): guard sendToProfound with try-catch and align check with isProfoundEnabled

* fix(analytics): strip sensitive query params and remove redundant guard

* chore(analytics): remove unnecessary query param filtering
2026-03-28 22:09:23 -07:00
Vikhyath Mondreti
f1ead2ed55 fix docker image build 2026-03-28 20:58:56 -07:00
Waleed
14089f7dbb v0.6.14: performance improvements, connectors UX, collapsed sidebar actions 2026-03-27 13:07:59 -07:00
Waleed
e615816dce v0.6.13: emcn standardization, granola and ketch integrations, security hardening, connectors improvements 2026-03-27 00:16:37 -07:00
Waleed
ca87d7ce29 v0.6.12: billing, blogs UI 2026-03-26 01:19:23 -07:00
Waleed
6bebbc5e29 v0.6.11: billing fixes, rippling, hubspot, UI improvements, demo modal 2026-03-25 22:54:56 -07:00
Waleed
7b572f1f61 v0.6.10: tour fix, connectors reliability improvements, tooltip gif fixes 2026-03-24 21:38:19 -07:00
Vikhyath Mondreti
ed9a71f0af v0.6.9: general ux improvements for tables, mothership 2026-03-24 17:03:24 -07:00
Siddharth Ganesan
c78c870fda v0.6.8: mothership tool loop
2026-03-24 04:06:19 -07:00
Waleed
19442f19e2 v0.6.7: kb improvements, edge z index fix, captcha, new trust center, block classifications 2026-03-21 12:43:33 -07:00
Waleed
1731a4d7f0 v0.6.6: landing improvements, styling consistency, mothership table renaming 2026-03-19 23:58:30 -07:00
Waleed
9fcd02fd3b v0.6.5: email validation, integrations page, mothership and custom tool fixes 2026-03-19 16:08:30 -07:00
Waleed
ff7b5b528c v0.6.4: subflows, docusign, ashby new tools, box, workday, billing bug fixes 2026-03-18 23:12:36 -07:00
Waleed
30f2d1a0fc v0.6.3: hubspot integration, kb block improvements 2026-03-18 11:19:55 -07:00
Waleed
4bd0731871 v0.6.2: mothership stability, chat iframe embedding, KB upserts, new blog post 2026-03-18 03:29:39 -07:00
Waleed
4f3bc37fe4 v0.6.1: added better auth admin plugin 2026-03-17 15:16:16 -07:00
Waleed
84d6fdc423 v0.6: mothership, tables, connectors 2026-03-17 12:21:15 -07:00
Vikhyath Mondreti
4c12914d35 v0.5.113: jira, ashby, google ads, grain updates 2026-03-12 22:54:25 -07:00
Waleed
e9bdc57616 v0.5.112: trace spans improvements, fathom integration, jira fixes, canvas navigation updates 2026-03-12 13:30:20 -07:00
Vikhyath Mondreti
36612ae42a v0.5.111: non-polling webhook execs off trigger.dev, gmail subject headers, webhook trigger configs (#3530) 2026-03-11 17:47:28 -07:00
Waleed
1c2c2c65d4 v0.5.110: webhook execution speedups, SSRF patches 2026-03-11 15:00:24 -07:00
Waleed
ecd3536a72 v0.5.109: obsidian and evernote integrations, slack fixes, remove memory instrumentation 2026-03-09 10:40:37 -07:00
Vikhyath Mondreti
8c0a2e04b1 v0.5.108: workflow input params in agent tools, bun upgrade, dropdown selectors for 14 blocks 2026-03-06 21:02:25 -08:00
Waleed
6586c5ce40 v0.5.107: new reddit, slack tools 2026-03-05 22:48:20 -08:00
Vikhyath Mondreti
3ce947566d v0.5.106: condition block and legacy kbs fixes, GPT 5.4 2026-03-05 17:30:05 -08:00
Waleed
70c36cb7aa v0.5.105: slack remove reaction, nested subflow locks fix, servicenow pagination, memory improvements 2026-03-04 22:38:26 -08:00
Waleed
f1ec5fe824 v0.5.104: memory improvements, nested subflows, careers page redirect, brandfetch, google meet 2026-03-03 23:45:29 -08:00
Waleed
e07e3c34cc v0.5.103: memory util instrumentation, API docs, amplitude, google pagespeed insights, pagerduty 2026-03-01 23:27:02 -08:00
Waleed
0d2e6ff31d v0.5.102: new integrations, new tools, ci speedups, memory leak instrumentation 2026-02-28 12:48:10 -08:00
Waleed
4fd0989264 v0.5.101: circular dependency mitigation, confluence enhancements, google tasks and bigquery integrations, workflow lock 2026-02-26 15:04:53 -08:00
Waleed
67f8a687f6 v0.5.100: multiple credentials, 40% speedup, gong, attio, audit log improvements 2026-02-25 00:28:25 -08:00
Waleed
af592349d3 v0.5.99: local dev improvements, live workflow logs in terminal 2026-02-23 00:24:49 -08:00
Waleed
0d86ea01f0 v0.5.98: change detection improvements, rate limit and code execution fixes, removed retired models, hex integration 2026-02-21 18:07:40 -08:00
Waleed
115f04e989 v0.5.97: oidc discovery for copilot mcp 2026-02-21 02:06:25 -08:00
Waleed
34d92fae89 v0.5.96: sim oauth provider, slack ephemeral message tool and blockkit support 2026-02-20 18:22:20 -08:00
Waleed
67aa4bb332 v0.5.95: gemini 3.1 pro, cloudflare, dataverse, revenuecat, redis, upstash, algolia tools; isolated-vm robustness improvements, tables backend (#3271)
* feat(tools): advanced fields for youtube, vercel; added cloudflare and dataverse tools (#3257)

* refactor(vercel): mark optional fields as advanced mode

Move optional/power-user fields behind the advanced toggle:
- List Deployments: project filter, target, state
- Create Deployment: project ID override, redeploy from, target
- List Projects: search
- Create/Update Project: framework, build/output/install commands
- Env Vars: variable type
- Webhooks: project IDs filter
- Checks: path, details URL
- Team Members: role filter
- All operations: team ID scope

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* style(youtube): mark optional params as advanced mode

Hide pagination, sort order, and filter fields behind the advanced
toggle for a cleaner default UX across all YouTube operations.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* added advanced fields for vercel and youtube, added cloudflare and dataverse block

* added desc for dataverse

* add more tools

* ack comment

* more

* ops

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>

* feat(tables): added tables (#2867)

* updates

* required

* trashy table viewer

* updates

* updates

* filtering ui

* updates

* updates

* updates

* one input mode

* format

* fix lints

* improved errors

* updates

* updates

* changes

* doc strings

* breaking down file

* update comments with ai

* updates

* comments

* changes

* revert

* updates

* dedupe

* updates

* updates

* updates

* refactoring

* renames & refactors

* refactoring

* updates

* undo

* update db

* wand

* updates

* fix comments

* fixes

* simplify comments

* updates

* renames

* better comments

* validation

* updates

* updates

* updates

* fix sorting

* fix appearance

* updating prompt to make it user sort

* rm

* updates

* rename

* comments

* clean comments

* simplification

* updates

* updates

* refactor

* reduced type confusion

* undo

* rename

* undo changes

* undo

* simplify

* updates

* updates

* revert

* updates

* db updates

* type fix

* fix

* fix error handling

* updates

* docs

* docs

* updates

* rename

* dedupe

* revert

* uncook

* updates

* fix

* fix

* fix

* fix

* prepare merge

* readd migrations

* add back missed code

* migrate enrichment logic to general abstraction

* address bugbot concerns

* adhere to size limits for tables

* remove conflicting migration

* add back migrations

* fix tables auth

* fix permissive auth

* fix lint

* reran migrations

* migrate to use tanstack query for all server state

* update table-selector

* update names

* added tables to permission groups, updated subblock types

---------

Co-authored-by: Vikhyath Mondreti <vikhyath@simstudio.ai>
Co-authored-by: waleed <walif6@gmail.com>

* fix(snapshot): changed insert to upsert when concurrent identical child workflows are running (#3259)

* fix(snapshot): changed insert to upsert when concurrent identical child workflows are running

* fixed ci tests failing

* fix(workflows): disallow duplicate workflow names at the same folder level (#3260)

* feat(tools): added redis, upstash, algolia, and revenuecat (#3261)

* feat(tools): added redis, upstash, algolia, and revenuecat

* ack comment

* feat(models): add gemini-3.1-pro-preview and update gemini-3-pro thinking levels (#3263)

* fix(audit-log): lazily resolve actor name/email when missing (#3262)

* fix(blocks): move type coercions from tools.config.tool to tools.config.params (#3264)

* fix(blocks): move type coercions from tools.config.tool to tools.config.params

Number() coercions in tools.config.tool ran at serialization time before
variable resolution, destroying dynamic references like <block.result.count>
by converting them to NaN/null. Moved all coercions to tools.config.params
which runs at execution time after variables are resolved.

Fixed in 15 blocks: exa, arxiv, sentry, incidentio, wikipedia, ahrefs,
posthog, elasticsearch, dropbox, hunter, lemlist, spotify, youtube, grafana,
parallel. Also added mode: 'advanced' to optional exa fields.

Closes #3258
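
The timing bug can be illustrated with a toy coercion helper (names are illustrative, not the real tool config): `Number()` applied at serialization time destroys an unresolved dynamic reference, while the same coercion applied at execution time sees the concrete value.

```typescript
// Coercion run after variable resolution sees the real value; run before,
// it turns the placeholder reference into NaN.
function coerceLimit(value: string | number): number | undefined {
  const n = Number(value)
  return Number.isNaN(n) ? undefined : n
}

const unresolved = '<block.result.count>'
console.log(Number(unresolved))  // NaN - reference destroyed at serialization
console.log(coerceLimit('25'))   // 25  - coerced after resolution
```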

* fix(blocks): address PR review — move remaining param mutations from tool() to params()

- Moved field mappings from tool() to params() in grafana, posthog,
  lemlist, spotify, dropbox (same dynamic reference bug)
- Fixed parallel.ts excerpts/full_content boolean logic
- Fixed parallel.ts search_queries empty case (must set undefined)
- Fixed elasticsearch.ts timeout not included when already ends with 's'
- Restored dropbox.ts tool() switch for proper default fallback

* fix(blocks): restore field renames to tool() for serialization-time validation

Field renames (e.g. personalApiKey→apiKey) must be in tool() because
validateRequiredFieldsBeforeExecution calls selectToolId()→tool() then
checks renamed field names on params. Only type coercions (Number(),
boolean) stay in params() to avoid destroying dynamic variable references.

* improvement(resolver): resolved empty sentinel so unexecuted valid refs are not passed through to text inputs (#3266)

* fix(blocks): add required constraint for serviceDeskId in JSM block (#3268)

* fix(blocks): add required constraint for serviceDeskId in JSM block

* fix(blocks): rename custom field values to request field values in JSM create request

* fix(trigger): add isolated-vm support to trigger.dev container builds (#3269)

Scheduled workflow executions running in trigger.dev containers were
failing to spawn isolated-vm workers because the native module wasn't
available in the container. This caused loop condition evaluation to
silently fail and exit after one iteration.

- Add isolated-vm to build.external and additionalPackages in trigger config
- Include isolated-vm-worker.cjs via additionalFiles for child process spawning
- Add fallback path resolution for worker file in trigger.dev environment

* fix(tables): hide tables from sidebar and block registry (#3270)

* fix(tables): hide tables from sidebar and block registry

* fix(trigger): add isolated-vm support to trigger.dev container builds (#3269)

Scheduled workflow executions running in trigger.dev containers were
failing to spawn isolated-vm workers because the native module wasn't
available in the container. This caused loop condition evaluation to
silently fail and exit after one iteration.

- Add isolated-vm to build.external and additionalPackages in trigger config
- Include isolated-vm-worker.cjs via additionalFiles for child process spawning
- Add fallback path resolution for worker file in trigger.dev environment

* lint

* fix(trigger): update node version to align with main app (#3272)

* fix(build): fix corrupted sticky disk cache on blacksmith (#3273)

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
Co-authored-by: Lakee Sivaraya <71339072+lakeesiv@users.noreply.github.com>
Co-authored-by: Vikhyath Mondreti <vikhyath@simstudio.ai>
Co-authored-by: Vikhyath Mondreti <vikhyathvikku@gmail.com>
2026-02-20 13:43:07 -08:00
Waleed
15ace5e63f v0.5.94: vercel integration, folder insertion, migrated tracking redirects to rewrites 2026-02-18 16:53:34 -08:00
Waleed
fdca73679d v0.5.93: NextJS config changes, MCP and Blocks whitelisting, copilot keyboard shortcuts, audit logs 2026-02-18 12:10:05 -08:00
Waleed
da46a387c9 v0.5.92: shortlinks, copilot scrolling stickiness, pagination 2026-02-17 15:13:21 -08:00
Waleed
b7e377ec4b v0.5.91: docs i18n, turborepo upgrade 2026-02-16 00:36:05 -08:00
195 changed files with 10328 additions and 4668 deletions

View File

@@ -1285,6 +1285,17 @@ export function StartIcon(props: SVGProps<SVGSVGElement>) {
)
}
export function ProfoundIcon(props: SVGProps<SVGSVGElement>) {
  return (
    <svg width='1em' height='1em' viewBox='0 0 55 55' xmlns='http://www.w3.org/2000/svg' {...props}>
      <path
        fill='currentColor'
        d='M0 36.685V21.349a7.017 7.017 0 0 1 2.906-5.69l19.742-14.25A7.443 7.443 0 0 1 27.004 0h.062c1.623 0 3.193.508 4.501 1.452l19.684 14.207a7.016 7.016 0 0 1 2.906 5.69v12.302a7.013 7.013 0 0 1-2.907 5.689L31.527 53.562A7.605 7.605 0 0 1 27.078 55a7.641 7.641 0 0 1-4.465-1.44c-2.581-1.859-6.732-4.855-6.732-4.855V29.777c0-.249.28-.393.482-.248l10.538 7.605c.106.077.249.077.355 0l13.005-9.386a.306.306 0 0 0 0-.496l-13.005-9.386a.303.303 0 0 0-.355 0L.482 36.933A.304.304 0 0 1 0 36.685Z'
      />
    </svg>
  )
}
export function PineconeIcon(props: SVGProps<SVGSVGElement>) {
return (
<svg

View File

@@ -126,6 +126,7 @@ import {
PolymarketIcon,
PostgresIcon,
PosthogIcon,
ProfoundIcon,
PulseIcon,
QdrantIcon,
QuiverIcon,
@@ -302,6 +303,7 @@ export const blockTypeToIconMap: Record<string, IconComponent> = {
polymarket: PolymarketIcon,
postgresql: PostgresIcon,
posthog: PosthogIcon,
profound: ProfoundIcon,
pulse_v2: PulseIcon,
qdrant: QdrantIcon,
quiver: QuiverIcon,

View File

@@ -1,6 +1,6 @@
---
title: File
- description: Read and parse multiple files
+ description: Read and write workspace files
---
import { BlockInfoCard } from "@/components/ui/block-info-card"
@@ -27,7 +27,7 @@ The File Parser tool is particularly useful for scenarios where your agents need
## Usage Instructions
- Upload files directly or import from external URLs to get UserFile objects for use in other blocks.
+ Read and parse files from uploads or URLs, write new workspace files, or append content to existing files.
@@ -52,4 +52,45 @@ Parse one or more uploaded files or files from URLs (text, PDF, CSV, images, etc
| `files` | file[] | Parsed files as UserFile objects |
| `combinedContent` | string | Combined content of all parsed files |
### `file_write`
Create a new workspace file. If a file with the same name already exists, a numeric suffix is added automatically.
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `fileName` | string | Yes | File name \(e.g., "data.csv"\). If a file with this name exists, a numeric suffix is added automatically. |
| `content` | string | Yes | The text content to write to the file. |
| `contentType` | string | No | MIME type for new files \(e.g., "text/plain"\). Auto-detected from file extension if omitted. |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `id` | string | File ID |
| `name` | string | File name |
| `size` | number | File size in bytes |
| `url` | string | URL to access the file |
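
The collision handling described for `file_write` can be sketched as below; the ` (n)` suffix format and helper name are assumptions, since the docs do not pin down the exact naming scheme:

```typescript
// If "data.csv" exists, try "data (1).csv", "data (2).csv", ... until free.
// The suffix format is an assumption for illustration only.
function uniqueFileName(name: string, existing: Set<string>): string {
  if (!existing.has(name)) return name
  const dot = name.lastIndexOf('.')
  const stem = dot > 0 ? name.slice(0, dot) : name
  const ext = dot > 0 ? name.slice(dot) : ''
  for (let n = 1; ; n++) {
    const candidate = `${stem} (${n})${ext}`
    if (!existing.has(candidate)) return candidate
  }
}
```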
### `file_append`
Append content to an existing workspace file. The file must already exist. Content is added to the end of the file.
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `fileName` | string | Yes | Name of an existing workspace file to append to. |
| `content` | string | Yes | The text content to append to the file. |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `id` | string | File ID |
| `name` | string | File name |
| `size` | number | File size in bytes |
| `url` | string | URL to access the file |

View File

@@ -121,6 +121,7 @@
"polymarket",
"postgresql",
"posthog",
"profound",
"pulse",
"qdrant",
"quiver",

View File

@@ -0,0 +1,626 @@
---
title: Profound
description: AI visibility and analytics with Profound
---
import { BlockInfoCard } from "@/components/ui/block-info-card"
<BlockInfoCard
type="profound"
color="#000000"
/>
{/* MANUAL-CONTENT-START:intro */}
[Profound](https://tryprofound.com/) is an AI visibility and analytics platform that helps brands understand how they appear across AI-powered search engines, chatbots, and assistants. It tracks mentions, citations, sentiment, bot traffic, and referral patterns across platforms like ChatGPT, Perplexity, Google AI Overviews, and more.
With the Profound integration in Sim, you can:
- **Monitor AI Visibility**: Track share of voice, visibility scores, and mention counts across AI platforms for your brand and competitors.
- **Analyze Sentiment**: Measure how positively or negatively your brand is discussed in AI-generated responses.
- **Track Citations**: See which URLs are being cited by AI models and your citation share relative to competitors.
- **Monitor Bot Traffic**: Analyze AI crawler activity on your domain, including GPTBot, ClaudeBot, and other AI agents, with hourly granularity.
- **Track Referral Traffic**: Monitor human visits arriving from AI platforms to your website.
- **Explore Prompt Data**: Access raw prompt-answer pairs, query fanouts, and prompt volume trends across AI platforms.
- **Optimize Content**: Get AEO (Answer Engine Optimization) scores and actionable recommendations to improve how AI models reference your content.
- **Manage Categories & Assets**: List and explore your tracked categories, assets (brands), topics, tags, personas, and regions.
These tools let your agents automate AI visibility monitoring, competitive intelligence, and content optimization workflows. To use the Profound integration, you'll need a Profound account with API access.
{/* MANUAL-CONTENT-END */}
## Usage Instructions
Track how your brand appears across AI platforms. Monitor visibility scores, sentiment, citations, bot traffic, referrals, content optimization, and prompt volumes with Profound.
## Tools
### `profound_list_categories`
List all organization categories in Profound
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Profound API Key |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `categories` | json | List of organization categories |
| ↳ `id` | string | Category ID |
| ↳ `name` | string | Category name |
### `profound_list_regions`
List all organization regions in Profound
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Profound API Key |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `regions` | json | List of organization regions |
| ↳ `id` | string | Region ID \(UUID\) |
| ↳ `name` | string | Region name |
### `profound_list_models`
List all AI models/platforms tracked in Profound
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Profound API Key |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `models` | json | List of AI models/platforms |
| ↳ `id` | string | Model ID \(UUID\) |
| ↳ `name` | string | Model/platform name |
### `profound_list_domains`
List all organization domains in Profound
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Profound API Key |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `domains` | json | List of organization domains |
| ↳ `id` | string | Domain ID \(UUID\) |
| ↳ `name` | string | Domain name |
| ↳ `createdAt` | string | When the domain was added |
### `profound_list_assets`
List all organization assets (companies/brands) across all categories in Profound
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Profound API Key |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `assets` | json | List of organization assets with category info |
| ↳ `id` | string | Asset ID |
| ↳ `name` | string | Asset/company name |
| ↳ `website` | string | Asset website URL |
| ↳ `alternateDomains` | json | Alternate domain names |
| ↳ `isOwned` | boolean | Whether this asset is owned by the organization |
| ↳ `createdAt` | string | When the asset was created |
| ↳ `logoUrl` | string | URL of the asset logo |
| ↳ `categoryId` | string | Category ID the asset belongs to |
| ↳ `categoryName` | string | Category name |
### `profound_list_personas`
List all organization personas across all categories in Profound
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Profound API Key |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `personas` | json | List of organization personas with profile details |
| ↳ `id` | string | Persona ID |
| ↳ `name` | string | Persona name |
| ↳ `categoryId` | string | Category ID |
| ↳ `categoryName` | string | Category name |
| ↳ `persona` | json | Persona profile with behavior, employment, and demographics |
### `profound_category_topics`
List topics for a specific category in Profound
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Profound API Key |
| `categoryId` | string | Yes | Category ID \(UUID\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `topics` | json | List of topics in the category |
| ↳ `id` | string | Topic ID \(UUID\) |
| ↳ `name` | string | Topic name |
### `profound_category_tags`
List tags for a specific category in Profound
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Profound API Key |
| `categoryId` | string | Yes | Category ID \(UUID\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `tags` | json | List of tags in the category |
| ↳ `id` | string | Tag ID \(UUID\) |
| ↳ `name` | string | Tag name |
### `profound_category_prompts`
List prompts for a specific category in Profound
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Profound API Key |
| `categoryId` | string | Yes | Category ID \(UUID\) |
| `limit` | number | No | Maximum number of results \(default 10000, max 10000\) |
| `cursor` | string | No | Pagination cursor from previous response |
| `orderDir` | string | No | Sort direction: asc or desc \(default desc\) |
| `promptType` | string | No | Comma-separated prompt types to filter: visibility, sentiment |
| `topicId` | string | No | Comma-separated topic IDs \(UUIDs\) to filter by |
| `tagId` | string | No | Comma-separated tag IDs \(UUIDs\) to filter by |
| `regionId` | string | No | Comma-separated region IDs \(UUIDs\) to filter by |
| `platformId` | string | No | Comma-separated platform IDs \(UUIDs\) to filter by |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `totalRows` | number | Total number of prompts |
| `nextCursor` | string | Cursor for next page of results |
| `prompts` | json | List of prompts |
| ↳ `id` | string | Prompt ID |
| ↳ `prompt` | string | Prompt text |
| ↳ `promptType` | string | Prompt type \(visibility or sentiment\) |
| ↳ `topicId` | string | Topic ID |
| ↳ `topicName` | string | Topic name |
| ↳ `tags` | json | Associated tags |
| ↳ `regions` | json | Associated regions |
| ↳ `platforms` | json | Associated platforms |
| ↳ `createdAt` | string | When the prompt was created |
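
The `cursor`/`nextCursor` pair above implements cursor-based pagination, which can be drained with a simple loop. In this sketch `fetch_page(cursor)` is a hypothetical stand-in for the actual `profound_category_prompts` call and is assumed to return a dict shaped like the output table.

```python
def fetch_all_prompts(fetch_page):
    """Drain a cursor-paginated endpoint such as `profound_category_prompts`.

    `fetch_page(cursor)` is a stand-in for the real API call; it is assumed
    to return {"prompts": [...], "nextCursor": <str or None>, ...} as in the
    output table above."""
    prompts, cursor = [], None
    while True:
        page = fetch_page(cursor)
        prompts.extend(page["prompts"])
        cursor = page.get("nextCursor")
        if not cursor:                    # no cursor means the last page
            return prompts
```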
### `profound_category_assets`
List assets (companies/brands) for a specific category in Profound
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Profound API Key |
| `categoryId` | string | Yes | Category ID \(UUID\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `assets` | json | List of assets in the category |
| ↳ `id` | string | Asset ID |
| ↳ `name` | string | Asset/company name |
| ↳ `website` | string | Website URL |
| ↳ `alternateDomains` | json | Alternate domain names |
| ↳ `isOwned` | boolean | Whether the asset is owned by the organization |
| ↳ `createdAt` | string | When the asset was created |
| ↳ `logoUrl` | string | URL of the asset logo |
### `profound_category_personas`
List personas for a specific category in Profound
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Profound API Key |
| `categoryId` | string | Yes | Category ID \(UUID\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `personas` | json | List of personas in the category |
| ↳ `id` | string | Persona ID |
| ↳ `name` | string | Persona name |
| ↳ `persona` | json | Persona profile with behavior, employment, and demographics |
### `profound_visibility_report`
Query AI visibility report for a category in Profound
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Profound API Key |
| `categoryId` | string | Yes | Category ID \(UUID\) |
| `startDate` | string | Yes | Start date \(YYYY-MM-DD or ISO 8601\) |
| `endDate` | string | Yes | End date \(YYYY-MM-DD or ISO 8601\) |
| `metrics` | string | Yes | Comma-separated metrics: share_of_voice, mentions_count, visibility_score, executions, average_position |
| `dimensions` | string | No | Comma-separated dimensions: date, region, topic, model, asset_name, prompt, tag, persona |
| `dateInterval` | string | No | Date interval: hour, day, week, month, year |
| `filters` | string | No | JSON array of filter objects, e.g. \[\{"field":"asset_name","operator":"is","value":"Company"\}\] |
| `limit` | number | No | Maximum number of results \(default 10000, max 50000\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `totalRows` | number | Total number of rows in the report |
| `data` | json | Report data rows with metrics and dimension values |
| ↳ `metrics` | json | Array of metric values matching requested metrics order |
| ↳ `dimensions` | json | Array of dimension values matching requested dimensions order |
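
All of the report tools take `metrics`/`dimensions` as comma-separated strings and `filters` as a JSON-encoded array, which is easy to get wrong by hand. A minimal helper, assuming only the parameter shapes shown in the input tables (the parameter names mirror the tables; how the tool serializes them on the wire is an assumption):

```python
import json

def build_report_params(metrics, dimensions=None, filters=None, **extra):
    """Assemble report-tool parameters: comma-joined metric/dimension lists
    and a JSON-encoded filters array, per the input tables above."""
    params = {"metrics": ",".join(metrics), **extra}
    if dimensions:
        params["dimensions"] = ",".join(dimensions)
    if filters:
        params["filters"] = json.dumps(filters)
    return params
```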
### `profound_sentiment_report`
Query sentiment report for a category in Profound
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Profound API Key |
| `categoryId` | string | Yes | Category ID \(UUID\) |
| `startDate` | string | Yes | Start date \(YYYY-MM-DD or ISO 8601\) |
| `endDate` | string | Yes | End date \(YYYY-MM-DD or ISO 8601\) |
| `metrics` | string | Yes | Comma-separated metrics: positive, negative, occurrences |
| `dimensions` | string | No | Comma-separated dimensions: theme, date, region, topic, model, asset_name, tag, prompt, sentiment_type, persona |
| `dateInterval` | string | No | Date interval: hour, day, week, month, year |
| `filters` | string | No | JSON array of filter objects, e.g. \[\{"field":"asset_name","operator":"is","value":"Company"\}\] |
| `limit` | number | No | Maximum number of results \(default 10000, max 50000\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `totalRows` | number | Total number of rows in the report |
| `data` | json | Report data rows with metrics and dimension values |
| ↳ `metrics` | json | Array of metric values matching requested metrics order |
| ↳ `dimensions` | json | Array of dimension values matching requested dimensions order |
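
Because each report row returns `metrics` and `dimensions` as positional arrays in the requested order, it is often convenient to re-key rows by name. A small sketch (the row shape is taken from the output tables above; nothing else is assumed):

```python
def rows_to_records(data, metrics, dimensions):
    """Zip positional report rows back into labeled dicts, matching the
    requested metric/dimension order described in the output tables."""
    return [
        {**dict(zip(dimensions, row["dimensions"])),
         **dict(zip(metrics, row["metrics"]))}
        for row in data
    ]
```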
### `profound_citations_report`
Query citations report for a category in Profound
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Profound API Key |
| `categoryId` | string | Yes | Category ID \(UUID\) |
| `startDate` | string | Yes | Start date \(YYYY-MM-DD or ISO 8601\) |
| `endDate` | string | Yes | End date \(YYYY-MM-DD or ISO 8601\) |
| `metrics` | string | Yes | Comma-separated metrics: count, citation_share |
| `dimensions` | string | No | Comma-separated dimensions: hostname, path, date, region, topic, model, tag, prompt, url, root_domain, persona, citation_category |
| `dateInterval` | string | No | Date interval: hour, day, week, month, year |
| `filters` | string | No | JSON array of filter objects, e.g. \[\{"field":"hostname","operator":"is","value":"example.com"\}\] |
| `limit` | number | No | Maximum number of results \(default 10000, max 50000\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `totalRows` | number | Total number of rows in the report |
| `data` | json | Report data rows with metrics and dimension values |
| ↳ `metrics` | json | Array of metric values matching requested metrics order |
| ↳ `dimensions` | json | Array of dimension values matching requested dimensions order |
### `profound_query_fanouts`
Query fanout report showing how AI models expand prompts into sub-queries in Profound
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Profound API Key |
| `categoryId` | string | Yes | Category ID \(UUID\) |
| `startDate` | string | Yes | Start date \(YYYY-MM-DD or ISO 8601\) |
| `endDate` | string | Yes | End date \(YYYY-MM-DD or ISO 8601\) |
| `metrics` | string | Yes | Comma-separated metrics: fanouts_per_execution, total_fanouts, share |
| `dimensions` | string | No | Comma-separated dimensions: prompt, query, model, region, date |
| `dateInterval` | string | No | Date interval: hour, day, week, month, year |
| `filters` | string | No | JSON array of filter objects |
| `limit` | number | No | Maximum number of results \(default 10000, max 50000\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `totalRows` | number | Total number of rows in the report |
| `data` | json | Report data rows with metrics and dimension values |
| ↳ `metrics` | json | Array of metric values matching requested metrics order |
| ↳ `dimensions` | json | Array of dimension values matching requested dimensions order |
### `profound_prompt_answers`
Get raw prompt answers data for a category in Profound
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Profound API Key |
| `categoryId` | string | Yes | Category ID \(UUID\) |
| `startDate` | string | Yes | Start date \(YYYY-MM-DD or ISO 8601\) |
| `endDate` | string | Yes | End date \(YYYY-MM-DD or ISO 8601\) |
| `filters` | string | No | JSON array of filter objects, e.g. \[\{"field":"prompt_type","operator":"is","value":"visibility"\}\] |
| `limit` | number | No | Maximum number of results \(default 10000, max 50000\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `totalRows` | number | Total number of answer rows |
| `data` | json | Raw prompt answer data |
| ↳ `prompt` | string | The prompt text |
| ↳ `promptType` | string | Prompt type \(visibility or sentiment\) |
| ↳ `response` | string | AI model response text |
| ↳ `mentions` | json | Companies/assets mentioned in the response |
| ↳ `citations` | json | URLs cited in the response |
| ↳ `topic` | string | Topic name |
| ↳ `region` | string | Region name |
| ↳ `model` | string | AI model/platform name |
| ↳ `asset` | string | Asset name |
| ↳ `createdAt` | string | Timestamp when the answer was collected |
### `profound_bots_report`
Query bot traffic report with hourly granularity for a domain in Profound
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Profound API Key |
| `domain` | string | Yes | Domain to query bot traffic for \(e.g. example.com\) |
| `startDate` | string | Yes | Start date \(YYYY-MM-DD or ISO 8601\) |
| `endDate` | string | No | End date \(YYYY-MM-DD or ISO 8601\). Defaults to now |
| `metrics` | string | Yes | Comma-separated metrics: count, citations, indexing, training, last_visit |
| `dimensions` | string | No | Comma-separated dimensions: date, hour, path, bot_name, bot_provider, bot_type |
| `dateInterval` | string | No | Date interval: hour, day, week, month, year |
| `filters` | string | No | JSON array of filter objects, e.g. \[\{"field":"bot_name","operator":"is","value":"GPTBot"\}\] |
| `limit` | number | No | Maximum number of results \(default 10000, max 50000\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `totalRows` | number | Total number of rows in the report |
| `data` | json | Report data rows with metrics and dimension values |
| ↳ `metrics` | json | Array of metric values matching requested metrics order |
| ↳ `dimensions` | json | Array of dimension values matching requested dimensions order |
### `profound_referrals_report`
Query human referral traffic report with hourly granularity for a domain in Profound
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Profound API Key |
| `domain` | string | Yes | Domain to query referral traffic for \(e.g. example.com\) |
| `startDate` | string | Yes | Start date \(YYYY-MM-DD or ISO 8601\) |
| `endDate` | string | No | End date \(YYYY-MM-DD or ISO 8601\). Defaults to now |
| `metrics` | string | Yes | Comma-separated metrics: visits, last_visit |
| `dimensions` | string | No | Comma-separated dimensions: date, hour, path, referral_source, referral_type |
| `dateInterval` | string | No | Date interval: hour, day, week, month, year |
| `filters` | string | No | JSON array of filter objects, e.g. \[\{"field":"referral_source","operator":"is","value":"openai"\}\] |
| `limit` | number | No | Maximum number of results \(default 10000, max 50000\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `totalRows` | number | Total number of rows in the report |
| `data` | json | Report data rows with metrics and dimension values |
| ↳ `metrics` | json | Array of metric values matching requested metrics order |
| ↳ `dimensions` | json | Array of dimension values matching requested dimensions order |
### `profound_raw_logs`
Get raw traffic logs with filters for a domain in Profound
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Profound API Key |
| `domain` | string | Yes | Domain to query logs for \(e.g. example.com\) |
| `startDate` | string | Yes | Start date \(YYYY-MM-DD or ISO 8601\) |
| `endDate` | string | No | End date \(YYYY-MM-DD or ISO 8601\). Defaults to now |
| `dimensions` | string | No | Comma-separated dimensions: timestamp, method, host, path, status_code, ip, user_agent, referer, bytes_sent, duration_ms, query_params |
| `filters` | string | No | JSON array of filter objects, e.g. \[\{"field":"path","operator":"contains","value":"/blog"\}\] |
| `limit` | number | No | Maximum number of results \(default 10000, max 50000\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `totalRows` | number | Total number of log entries |
| `data` | json | Log data rows with metrics and dimension values |
| ↳ `metrics` | json | Array of metric values \(count\) |
| ↳ `dimensions` | json | Array of dimension values matching requested dimensions order |
### `profound_bot_logs`
Get identified bot visit logs with filters for a domain in Profound
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Profound API Key |
| `domain` | string | Yes | Domain to query bot logs for \(e.g. example.com\) |
| `startDate` | string | Yes | Start date \(YYYY-MM-DD or ISO 8601\) |
| `endDate` | string | No | End date \(YYYY-MM-DD or ISO 8601\). Defaults to now |
| `dimensions` | string | No | Comma-separated dimensions: timestamp, method, host, path, status_code, ip, user_agent, referer, bytes_sent, duration_ms, query_params, bot_name, bot_provider, bot_types |
| `filters` | string | No | JSON array of filter objects, e.g. \[\{"field":"bot_name","operator":"is","value":"GPTBot"\}\] |
| `limit` | number | No | Maximum number of results \(default 10000, max 50000\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `totalRows` | number | Total number of bot log entries |
| `data` | json | Bot log data rows with metrics and dimension values |
| ↳ `metrics` | json | Array of metric values \(count\) |
| ↳ `dimensions` | json | Array of dimension values matching requested dimensions order |
### `profound_list_optimizations`
List content optimization entries for an asset in Profound
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Profound API Key |
| `assetId` | string | Yes | Asset ID \(UUID\) |
| `limit` | number | No | Maximum number of results \(default 10000, max 50000\) |
| `offset` | number | No | Offset for pagination \(default 0\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `totalRows` | number | Total number of optimization entries |
| `optimizations` | json | List of content optimization entries |
| ↳ `id` | string | Optimization ID \(UUID\) |
| ↳ `title` | string | Content title |
| ↳ `createdAt` | string | When the optimization was created |
| ↳ `extractedInput` | string | Extracted input text |
| ↳ `type` | string | Content type: file, text, or url |
| ↳ `status` | string | Optimization status |
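
Unlike the cursor-based prompt listing, this tool paginates with `limit`/`offset`. A sketch of draining it, where `fetch_page(limit, offset)` is a hypothetical stand-in for the actual `profound_list_optimizations` call, assumed to return `{"totalRows": N, "optimizations": [...]}` as in the table above:

```python
def fetch_all_optimizations(fetch_page, page_size=100):
    """Offset-based pagination sketch for `profound_list_optimizations`.

    `fetch_page(limit, offset)` is a stand-in for the real call and is
    assumed to return {"totalRows": N, "optimizations": [...]}."""
    items, offset = [], 0
    while True:
        page = fetch_page(page_size, offset)
        items.extend(page["optimizations"])
        offset += page_size
        # stop once we've covered totalRows or the server returns nothing
        if offset >= page["totalRows"] or not page["optimizations"]:
            return items
```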
### `profound_optimization_analysis`
Get detailed content optimization analysis for a specific content item in Profound
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Profound API Key |
| `assetId` | string | Yes | Asset ID \(UUID\) |
| `contentId` | string | Yes | Content/optimization ID \(UUID\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `content` | json | The analyzed content |
| ↳ `format` | string | Content format: markdown or html |
| ↳ `value` | string | Content text |
| `aeoContentScore` | json | AEO content score with target zone |
| ↳ `value` | number | AEO score value |
| ↳ `targetZone` | json | Target zone range |
| ↳ `low` | number | Low end of target range |
| ↳ `high` | number | High end of target range |
| `analysis` | json | Analysis breakdown by category |
| ↳ `breakdown` | json | Array of scoring breakdowns |
| ↳ `title` | string | Category title |
| ↳ `weight` | number | Category weight |
| ↳ `score` | number | Category score |
| `recommendations` | json | Content optimization recommendations |
| ↳ `title` | string | Recommendation title |
| ↳ `status` | string | Status: done or pending |
| ↳ `impact` | json | Impact details with section and score |
| ↳ `suggestion` | json | Suggestion text and rationale |
| ↳ `text` | string | Suggestion text |
| ↳ `rationale` | string | Why this recommendation matters |
### `profound_prompt_volume`
Query prompt volume data to understand search demand across AI platforms in Profound
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Profound API Key |
| `startDate` | string | Yes | Start date \(YYYY-MM-DD or ISO 8601\) |
| `endDate` | string | Yes | End date \(YYYY-MM-DD or ISO 8601\) |
| `metrics` | string | Yes | Comma-separated metrics: volume, change |
| `dimensions` | string | No | Comma-separated dimensions: keyword, date, platform, country_code, matching_type, frequency |
| `dateInterval` | string | No | Date interval: hour, day, week, month, year |
| `filters` | string | No | JSON array of filter objects, e.g. \[\{"field":"keyword","operator":"contains","value":"best"\}\] |
| `limit` | number | No | Maximum number of results \(default 10000, max 50000\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `totalRows` | number | Total number of rows in the report |
| `data` | json | Volume data rows with metrics and dimension values |
| ↳ `metrics` | json | Array of metric values matching requested metrics order |
| ↳ `dimensions` | json | Array of dimension values matching requested dimensions order |
### `profound_citation_prompts`
Get prompts that cite a specific domain across AI platforms in Profound
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Profound API Key |
| `inputDomain` | string | Yes | Domain to look up citations for \(e.g. ramp.com\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `data` | json | Citation prompt data for the queried domain |


@@ -1,3 +1,6 @@
/** Shared className for primary auth form submit buttons across all auth pages. */
export const AUTH_SUBMIT_BTN =
'inline-flex h-[32px] w-full items-center justify-center gap-2 rounded-[5px] border border-white bg-white px-2.5 font-[430] font-season text-black text-sm transition-colors hover:border-[var(--border-1)] hover:bg-[var(--border-1)] disabled:cursor-not-allowed disabled:opacity-50' as const
/** Shared className for primary auth/status CTA buttons on dark auth surfaces. */
export const AUTH_PRIMARY_CTA_BASE =
'inline-flex h-[32px] items-center justify-center gap-2 rounded-[5px] border border-[var(--auth-primary-btn-border)] bg-[var(--auth-primary-btn-bg)] px-2.5 font-[430] font-season text-[var(--auth-primary-btn-text)] text-sm transition-colors hover:border-[var(--auth-primary-btn-hover-border)] hover:bg-[var(--auth-primary-btn-hover-bg)] hover:text-[var(--auth-primary-btn-hover-text)] disabled:cursor-not-allowed disabled:opacity-50' as const
/** Full-width variant used for primary auth form submit buttons. */
export const AUTH_SUBMIT_BTN = `${AUTH_PRIMARY_CTA_BASE} w-full` as const


@@ -288,7 +288,6 @@ export default function Collaboration() {
width={876}
height={480}
className='h-full w-auto object-left md:min-w-[100vw]'
priority
/>
</div>
<div className='hidden lg:block'>


@@ -81,6 +81,56 @@ function ProviderPreviewIcon({ providerId }: { providerId?: string }) {
)
}
interface FeatureToggleItemProps {
feature: PermissionFeature
enabled: boolean
color: string
isInView: boolean
delay: number
textClassName: string
transition: Record<string, unknown>
onToggle: () => void
}
function FeatureToggleItem({
feature,
enabled,
color,
isInView,
delay,
textClassName,
transition,
onToggle,
}: FeatureToggleItemProps) {
return (
<motion.div
key={feature.key}
role='button'
tabIndex={0}
aria-label={`Toggle ${feature.name}`}
aria-pressed={enabled}
className='flex cursor-pointer items-center gap-2 rounded-[4px] py-0.5'
initial={{ opacity: 0, x: -6 }}
animate={isInView ? { opacity: 1, x: 0 } : {}}
transition={{ ...transition, delay }}
onClick={onToggle}
onKeyDown={(e) => {
if (e.key === 'Enter' || e.key === ' ') {
e.preventDefault()
onToggle()
}
}}
whileTap={{ scale: 0.98 }}
>
<CheckboxIcon checked={enabled} color={color} />
<ProviderPreviewIcon providerId={feature.providerId} />
<span className={textClassName} style={{ color: enabled ? '#F6F6F6AA' : '#F6F6F640' }}>
{feature.name}
</span>
</motion.div>
)
}
export function AccessControlPanel() {
const ref = useRef(null)
const isInView = useInView(ref, { once: true, margin: '-40px' })
@@ -97,39 +147,25 @@ export function AccessControlPanel() {
return (
<div key={category.label} className={catIdx > 0 ? 'mt-4' : ''}>
<span className='font-[430] font-season text-[#F6F6F6]/30 text-[10px] uppercase leading-none tracking-[0.08em]'>
<span className='font-[430] font-season text-[#F6F6F6]/55 text-[10px] uppercase leading-none tracking-[0.08em]'>
{category.label}
</span>
<div className='mt-2 grid grid-cols-2 gap-x-4 gap-y-2'>
{category.features.map((feature, featIdx) => {
const enabled = accessState[feature.key]
return (
<motion.div
key={feature.key}
className='flex cursor-pointer items-center gap-2 rounded-[4px] py-0.5'
initial={{ opacity: 0, x: -6 }}
animate={isInView ? { opacity: 1, x: 0 } : {}}
transition={{
delay: 0.05 + (offsetBefore + featIdx) * 0.04,
duration: 0.3,
}}
onClick={() =>
setAccessState((prev) => ({ ...prev, [feature.key]: !prev[feature.key] }))
}
whileTap={{ scale: 0.98 }}
>
<CheckboxIcon checked={enabled} color={category.color} />
<ProviderPreviewIcon providerId={feature.providerId} />
<span
className='truncate font-[430] font-season text-[13px] leading-none tracking-[0.02em]'
style={{ color: enabled ? '#F6F6F6AA' : '#F6F6F640' }}
>
{feature.name}
</span>
</motion.div>
)
})}
{category.features.map((feature, featIdx) => (
<FeatureToggleItem
key={feature.key}
feature={feature}
enabled={accessState[feature.key]}
color={category.color}
isInView={isInView}
delay={0.05 + (offsetBefore + featIdx) * 0.04}
textClassName='truncate font-[430] font-season text-[13px] leading-none tracking-[0.02em]'
transition={{ duration: 0.3 }}
onToggle={() =>
setAccessState((prev) => ({ ...prev, [feature.key]: !prev[feature.key] }))
}
/>
))}
</div>
</div>
)
@@ -140,12 +176,11 @@ export function AccessControlPanel() {
<div className='hidden lg:block'>
{PERMISSION_CATEGORIES.map((category, catIdx) => (
<div key={category.label} className={catIdx > 0 ? 'mt-4' : ''}>
<span className='font-[430] font-season text-[#F6F6F6]/30 text-[10px] uppercase leading-none tracking-[0.08em]'>
<span className='font-[430] font-season text-[#F6F6F6]/55 text-[10px] uppercase leading-none tracking-[0.08em]'>
{category.label}
</span>
<div className='mt-2 grid grid-cols-2 gap-x-4 gap-y-2'>
{category.features.map((feature, featIdx) => {
const enabled = accessState[feature.key]
const currentIndex =
PERMISSION_CATEGORIES.slice(0, catIdx).reduce(
(sum, c) => sum + c.features.length,
@@ -153,30 +188,19 @@ export function AccessControlPanel() {
) + featIdx
return (
<motion.div
<FeatureToggleItem
key={feature.key}
className='flex cursor-pointer items-center gap-2 rounded-[4px] py-0.5'
initial={{ opacity: 0, x: -6 }}
animate={isInView ? { opacity: 1, x: 0 } : {}}
transition={{
delay: 0.1 + currentIndex * 0.04,
duration: 0.3,
ease: [0.25, 0.46, 0.45, 0.94],
}}
onClick={() =>
feature={feature}
enabled={accessState[feature.key]}
color={category.color}
isInView={isInView}
delay={0.1 + currentIndex * 0.04}
textClassName='truncate font-[430] font-season text-[11px] leading-none tracking-[0.02em] transition-opacity duration-200'
transition={{ duration: 0.3, ease: [0.25, 0.46, 0.45, 0.94] }}
onToggle={() =>
setAccessState((prev) => ({ ...prev, [feature.key]: !prev[feature.key] }))
}
whileTap={{ scale: 0.98 }}
>
<CheckboxIcon checked={enabled} color={category.color} />
<ProviderPreviewIcon providerId={feature.providerId} />
<span
className='truncate font-[430] font-season text-[11px] leading-none tracking-[0.02em] transition-opacity duration-200'
style={{ color: enabled ? '#F6F6F6AA' : '#F6F6F640' }}
>
{feature.name}
</span>
</motion.div>
/>
)
})}
</div>


@@ -146,14 +146,14 @@ function AuditRow({ entry, index }: AuditRowProps) {
</div>
{/* Time */}
<span className='w-[56px] shrink-0 font-[430] font-season text-[#F6F6F6]/30 text-[11px] leading-none tracking-[0.02em]'>
<span className='w-[56px] shrink-0 font-[430] font-season text-[#F6F6F6]/55 text-[11px] leading-none tracking-[0.02em]'>
{timeAgo}
</span>
<span className='min-w-0 truncate font-[430] font-season text-[12px] leading-none tracking-[0.02em]'>
<span className='text-[#F6F6F6]/80'>{entry.actor}</span>
<span className='hidden sm:inline'>
<span className='text-[#F6F6F6]/40'> · </span>
<span className='text-[#F6F6F6]/60'> · </span>
<span className='text-[#F6F6F6]/55'>{entry.description}</span>
</span>
</span>


@@ -85,7 +85,7 @@ function TrustStrip() {
<strong className='font-[430] font-season text-small text-white leading-none'>
SOC 2 & HIPAA
</strong>
-          <span className='font-[430] font-season text-[color-mix(in_srgb,var(--landing-text-subtle)_30%,transparent)] text-xs leading-none tracking-[0.02em] transition-colors group-hover:text-[color-mix(in_srgb,var(--landing-text-subtle)_55%,transparent)]'>
+          <span className='font-[430] font-season text-[color-mix(in_srgb,var(--landing-text-subtle)_55%,transparent)] text-xs leading-none tracking-[0.02em] transition-colors group-hover:text-[color-mix(in_srgb,var(--landing-text-subtle)_75%,transparent)]'>
Type II · PHI protected
</span>
</div>
@@ -105,7 +105,7 @@ function TrustStrip() {
<strong className='font-[430] font-season text-small text-white leading-none'>
Open Source
</strong>
-          <span className='font-[430] font-season text-[color-mix(in_srgb,var(--landing-text-subtle)_30%,transparent)] text-xs leading-none tracking-[0.02em] transition-colors group-hover:text-[color-mix(in_srgb,var(--landing-text-subtle)_55%,transparent)]'>
+          <span className='font-[430] font-season text-[color-mix(in_srgb,var(--landing-text-subtle)_55%,transparent)] text-xs leading-none tracking-[0.02em] transition-colors group-hover:text-[color-mix(in_srgb,var(--landing-text-subtle)_75%,transparent)]'>
View on GitHub
</span>
</div>
@@ -120,7 +120,7 @@ function TrustStrip() {
<strong className='font-[430] font-season text-small text-white leading-none'>
SSO & SCIM
</strong>
-          <span className='font-[430] font-season text-[color-mix(in_srgb,var(--landing-text-subtle)_30%,transparent)] text-xs leading-none tracking-[0.02em]'>
+          <span className='font-[430] font-season text-[color-mix(in_srgb,var(--landing-text-subtle)_55%,transparent)] text-xs leading-none tracking-[0.02em]'>
Okta, Azure AD, Google
</span>
</div>
@@ -165,7 +165,7 @@ export default function Enterprise() {
<h3 className='font-[430] font-season text-[16px] text-white leading-[120%] tracking-[-0.01em]'>
Audit Trail
</h3>
-            <p className='mt-2 max-w-[480px] font-[430] font-season text-[#F6F6F6]/50 text-[14px] leading-[150%] tracking-[0.02em]'>
+            <p className='mt-2 max-w-[480px] font-[430] font-season text-[#F6F6F6]/70 text-[14px] leading-[150%] tracking-[0.02em]'>
Every action is captured with full actor attribution.
</p>
</div>
@@ -179,7 +179,7 @@ export default function Enterprise() {
<h3 className='font-[430] font-season text-[16px] text-white leading-[120%] tracking-[-0.01em]'>
Access Control
</h3>
-            <p className='mt-1.5 font-[430] font-season text-[#F6F6F6]/50 text-[14px] leading-[150%] tracking-[0.02em]'>
+            <p className='mt-1.5 font-[430] font-season text-[#F6F6F6]/70 text-[14px] leading-[150%] tracking-[0.02em]'>
Restrict providers, surfaces, and tools per group.
</p>
</div>
@@ -211,7 +211,7 @@ export default function Enterprise() {
(tag, i) => (
<span
key={i}
-                className='enterprise-feature-marquee-tag whitespace-nowrap border-[var(--landing-bg-elevated)] border-r px-5 py-4 font-[430] font-season text-[color-mix(in_srgb,var(--landing-text-subtle)_40%,transparent)] text-small leading-none tracking-[0.02em] hover:bg-white/[0.04] hover:text-[color-mix(in_srgb,var(--landing-text-subtle)_55%,transparent)]'
+                className='enterprise-feature-marquee-tag whitespace-nowrap border-[var(--landing-bg-elevated)] border-r px-5 py-4 font-[430] font-season text-[color-mix(in_srgb,var(--landing-text-subtle)_60%,transparent)] text-small leading-none tracking-[0.02em] hover:bg-white/[0.04] hover:text-[color-mix(in_srgb,var(--landing-text-subtle)_80%,transparent)]'
>
{tag}
</span>
@@ -221,7 +221,7 @@ export default function Enterprise() {
</div>
<div className='flex items-center justify-between border-[var(--landing-bg-elevated)] border-t px-6 py-5 md:px-8 md:py-6'>
-          <p className='font-[430] font-season text-[color-mix(in_srgb,var(--landing-text-subtle)_40%,transparent)] text-base leading-[150%] tracking-[0.02em]'>
+          <p className='font-[430] font-season text-[color-mix(in_srgb,var(--landing-text-subtle)_60%,transparent)] text-base leading-[150%] tracking-[0.02em]'>
Ready for growth?
</p>
<DemoRequestModal>

View File

@@ -190,7 +190,6 @@ export default function Features() {
width={1440}
height={366}
className='h-auto w-full'
-            priority
/>
</div>

View File

@@ -67,6 +67,7 @@ export function FooterCTA() {
type='button'
onClick={handleSubmit}
disabled={isEmpty}
+          aria-label='Submit message'
className='flex h-[28px] w-[28px] items-center justify-center rounded-full border-0 p-0 transition-colors'
style={{
background: isEmpty ? '#C0C0C0' : '#1C1C1C',

View File

@@ -26,7 +26,7 @@ const RESOURCES_LINKS: FooterItem[] = [
{ label: 'Blog', href: '/blog' },
// { label: 'Templates', href: '/templates' },
{ label: 'Docs', href: 'https://docs.sim.ai', external: true },
-  { label: 'Academy', href: '/academy' },
+  // { label: 'Academy', href: '/academy' },
{ label: 'Partners', href: '/partners' },
{ label: 'Careers', href: 'https://jobs.ashbyhq.com/sim', external: true },
{ label: 'Changelog', href: '/changelog' },

View File

@@ -126,6 +126,7 @@ import {
PolymarketIcon,
PostgresIcon,
PosthogIcon,
+  ProfoundIcon,
PulseIcon,
QdrantIcon,
QuiverIcon,
@@ -302,6 +303,7 @@ export const blockTypeToIconMap: Record<string, IconComponent> = {
polymarket: PolymarketIcon,
postgresql: PostgresIcon,
posthog: PosthogIcon,
+  profound: ProfoundIcon,
pulse_v2: PulseIcon,
qdrant: QdrantIcon,
quiver: QuiverIcon,

View File

@@ -2993,13 +2993,26 @@
"type": "file_v3",
"slug": "file",
"name": "File",
-    "description": "Read and parse multiple files",
-    "longDescription": "Upload files directly or import from external URLs to get UserFile objects for use in other blocks.",
+    "description": "Read and write workspace files",
+    "longDescription": "Read and parse files from uploads or URLs, write new workspace files, or append content to existing files.",
"bgColor": "#40916C",
"iconName": "DocumentIcon",
"docsUrl": "https://docs.sim.ai/tools/file",
-    "operations": [],
-    "operationCount": 0,
+    "operations": [
+      {
+        "name": "Read",
+        "description": "Parse one or more uploaded files or files from URLs (text, PDF, CSV, images, etc.)"
+      },
+      {
+        "name": "Write",
+        "description": "Create a new workspace file. If a file with the same name already exists, a numeric suffix is added (e.g., "
+      },
+      {
+        "name": "Append",
+        "description": "Append content to an existing workspace file. The file must already exist. Content is added to the end of the file."
+      }
+    ],
+    "operationCount": 3,
"triggers": [],
"triggerCount": 0,
"authType": "none",
@@ -8611,6 +8624,121 @@
"integrationType": "analytics",
"tags": ["data-analytics", "monitoring"]
},
{
"type": "profound",
"slug": "profound",
"name": "Profound",
"description": "AI visibility and analytics with Profound",
"longDescription": "Track how your brand appears across AI platforms. Monitor visibility scores, sentiment, citations, bot traffic, referrals, content optimization, and prompt volumes with Profound.",
"bgColor": "#000000",
"iconName": "ProfoundIcon",
"docsUrl": "https://docs.sim.ai/tools/profound",
"operations": [
{
"name": "List Categories",
"description": "List all organization categories in Profound"
},
{
"name": "List Regions",
"description": "List all organization regions in Profound"
},
{
"name": "List Models",
"description": "List all AI models/platforms tracked in Profound"
},
{
"name": "List Domains",
"description": "List all organization domains in Profound"
},
{
"name": "List Assets",
"description": "List all organization assets (companies/brands) across all categories in Profound"
},
{
"name": "List Personas",
"description": "List all organization personas across all categories in Profound"
},
{
"name": "Category Topics",
"description": "List topics for a specific category in Profound"
},
{
"name": "Category Tags",
"description": "List tags for a specific category in Profound"
},
{
"name": "Category Prompts",
"description": "List prompts for a specific category in Profound"
},
{
"name": "Category Assets",
"description": "List assets (companies/brands) for a specific category in Profound"
},
{
"name": "Category Personas",
"description": "List personas for a specific category in Profound"
},
{
"name": "Visibility Report",
"description": "Query AI visibility report for a category in Profound"
},
{
"name": "Sentiment Report",
"description": "Query sentiment report for a category in Profound"
},
{
"name": "Citations Report",
"description": "Query citations report for a category in Profound"
},
{
"name": "Query Fanouts",
"description": "Query fanout report showing how AI models expand prompts into sub-queries in Profound"
},
{
"name": "Prompt Answers",
"description": "Get raw prompt answers data for a category in Profound"
},
{
"name": "Bots Report",
"description": "Query bot traffic report with hourly granularity for a domain in Profound"
},
{
"name": "Referrals Report",
"description": "Query human referral traffic report with hourly granularity for a domain in Profound"
},
{
"name": "Raw Logs",
"description": "Get raw traffic logs with filters for a domain in Profound"
},
{
"name": "Bot Logs",
"description": "Get identified bot visit logs with filters for a domain in Profound"
},
{
"name": "List Optimizations",
"description": "List content optimization entries for an asset in Profound"
},
{
"name": "Optimization Analysis",
"description": "Get detailed content optimization analysis for a specific content item in Profound"
},
{
"name": "Prompt Volume",
"description": "Query prompt volume data to understand search demand across AI platforms in Profound"
},
{
"name": "Citation Prompts",
"description": "Get prompts that cite a specific domain across AI platforms in Profound"
}
],
"operationCount": 24,
"triggers": [],
"triggerCount": 0,
"authType": "api-key",
"category": "tools",
"integrationType": "analytics",
"tags": ["seo", "data-analytics"]
},
{
"type": "pulse_v2",
"slug": "pulse",

View File
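The File block's new Write operation above notes that name collisions get a numeric suffix. As a hedged illustration of that naming rule in TypeScript — the function name and the `(n)` suffix format are assumptions for this sketch, not Sim's actual implementation:

```typescript
// Return a filename that does not collide with any existing name by
// appending " (1)", " (2)", ... before the extension. The exact suffix
// format here is assumed; the registry description above is truncated.
function nextAvailableName(name: string, existing: Set<string>): string {
  if (!existing.has(name)) return name
  const dot = name.lastIndexOf('.')
  const base = dot > 0 ? name.slice(0, dot) : name
  const ext = dot > 0 ? name.slice(dot) : ''
  let i = 1
  while (existing.has(`${base} (${i})${ext}`)) i++
  return `${base} (${i})${ext}`
}
```

The loop scans upward from 1, so deleting an earlier suffixed file lets its slot be reused.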

@@ -1,5 +1,4 @@
import type { Metadata } from 'next'
- import Link from 'next/link'
import { getNavBlogPosts } from '@/lib/blog/registry'
import { martianMono } from '@/app/_styles/fonts/martian-mono/martian-mono'
import { season } from '@/app/_styles/fonts/season/season'
@@ -152,12 +151,13 @@ export default async function PartnersPage() {
recognition in the growing ecosystem of AI workflow builders.
</p>
<div className='flex items-center gap-4'>
-            <Link
+            {/* TODO: Uncomment when academy is public */}
+            {/* <Link
               href='/academy'
               className='inline-flex h-[44px] items-center rounded-[5px] bg-white px-6 text-[#1C1C1C] text-[15px] transition-colors hover:bg-[#E8E8E8]'
             >
               Start Sim Academy →
-            </Link>
+            </Link> */}
<a
href='#how-it-works'
className='inline-flex h-[44px] items-center rounded-[5px] border border-[#3A3A3A] px-6 text-[#ECECEC] text-[15px] transition-colors hover:border-[#4A4A4A]'
@@ -275,12 +275,13 @@ export default async function PartnersPage() {
Complete Sim Academy to earn your first certification and unlock partner benefits.
It's free to start — no credit card required.
</p>
-          <Link
+          {/* TODO: Uncomment when academy is public */}
+          {/* <Link
             href='/academy'
             className='inline-flex h-[48px] items-center rounded-[5px] bg-white px-8 font-[430] text-[#1C1C1C] text-[15px] transition-colors hover:bg-[#E8E8E8]'
           >
             Start Sim Academy
-          </Link>
+          </Link> */}
</div>
</section>
</main>

View File

@@ -15,6 +15,12 @@
--toolbar-triggers-height: 300px; /* TOOLBAR_TRIGGERS_HEIGHT.DEFAULT */
--editor-connections-height: 172px; /* EDITOR_CONNECTIONS_HEIGHT.DEFAULT */
--terminal-height: 206px; /* TERMINAL_HEIGHT.DEFAULT */
+  --auth-primary-btn-bg: #ffffff;
+  --auth-primary-btn-border: #ffffff;
+  --auth-primary-btn-text: #000000;
+  --auth-primary-btn-hover-bg: #e0e0e0;
+  --auth-primary-btn-hover-border: #e0e0e0;
+  --auth-primary-btn-hover-text: #000000;
/* z-index scale for layered UI
Popover must be above modal so dropdowns inside modals render correctly */

View File

@@ -1,5 +1,9 @@
import type React from 'react'
import type { Metadata } from 'next'
+ import { notFound } from 'next/navigation'
+ // TODO: Remove notFound() call to make academy pages public once content is ready
+ const ACADEMY_ENABLED = false
export const metadata: Metadata = {
title: {
@@ -17,6 +21,10 @@ export const metadata: Metadata = {
}
export default function AcademyLayout({ children }: { children: React.ReactNode }) {
+   if (!ACADEMY_ENABLED) {
+     notFound()
+   }
return (
<div className='min-h-screen bg-[#1C1C1C] font-[430] font-season text-[#ECECEC]'>
{children}

View File

@@ -15,12 +15,12 @@ const {
mockLimit,
mockUpdate,
mockSet,
-   mockDelete,
mockCreateSuccessResponse,
mockCreateErrorResponse,
mockEncryptSecret,
mockCheckChatAccess,
mockDeployWorkflow,
+   mockPerformChatUndeploy,
mockLogger,
} = vi.hoisted(() => {
const logger = {
@@ -40,12 +40,12 @@ const {
mockLimit: vi.fn(),
mockUpdate: vi.fn(),
mockSet: vi.fn(),
-     mockDelete: vi.fn(),
mockCreateSuccessResponse: vi.fn(),
mockCreateErrorResponse: vi.fn(),
mockEncryptSecret: vi.fn(),
mockCheckChatAccess: vi.fn(),
mockDeployWorkflow: vi.fn(),
+     mockPerformChatUndeploy: vi.fn(),
mockLogger: logger,
}
})
@@ -66,7 +66,6 @@ vi.mock('@sim/db', () => ({
db: {
select: mockSelect,
update: mockUpdate,
-     delete: mockDelete,
},
}))
vi.mock('@sim/db/schema', () => ({
@@ -88,6 +87,9 @@ vi.mock('@/app/api/chat/utils', () => ({
vi.mock('@/lib/workflows/persistence/utils', () => ({
deployWorkflow: mockDeployWorkflow,
}))
+ vi.mock('@/lib/workflows/orchestration', () => ({
+   performChatUndeploy: mockPerformChatUndeploy,
+ }))
vi.mock('drizzle-orm', () => ({
and: vi.fn((...conditions: unknown[]) => ({ type: 'and', conditions })),
eq: vi.fn((field: unknown, value: unknown) => ({ field, value, type: 'eq' })),
@@ -106,7 +108,7 @@ describe('Chat Edit API Route', () => {
mockWhere.mockReturnValue({ limit: mockLimit })
mockUpdate.mockReturnValue({ set: mockSet })
mockSet.mockReturnValue({ where: mockWhere })
-     mockDelete.mockReturnValue({ where: mockWhere })
+     mockPerformChatUndeploy.mockResolvedValue({ success: true })
mockCreateSuccessResponse.mockImplementation((data) => {
return new Response(JSON.stringify(data), {
@@ -428,7 +430,11 @@ describe('Chat Edit API Route', () => {
const response = await DELETE(req, { params: Promise.resolve({ id: 'chat-123' }) })
expect(response.status).toBe(200)
-     expect(mockDelete).toHaveBeenCalled()
+     expect(mockPerformChatUndeploy).toHaveBeenCalledWith({
+       chatId: 'chat-123',
+       userId: 'user-id',
+       workspaceId: 'workspace-123',
+     })
const data = await response.json()
expect(data.message).toBe('Chat deployment deleted successfully')
})
@@ -451,7 +457,11 @@ describe('Chat Edit API Route', () => {
expect(response.status).toBe(200)
expect(mockCheckChatAccess).toHaveBeenCalledWith('chat-123', 'admin-user-id')
-     expect(mockDelete).toHaveBeenCalled()
+     expect(mockPerformChatUndeploy).toHaveBeenCalledWith({
+       chatId: 'chat-123',
+       userId: 'admin-user-id',
+       workspaceId: 'workspace-123',
+     })
})
})
})

View File
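The tests above hand-wire drizzle-style fluent chains (`update().set().where()`) out of `vi.fn()` mocks. A minimal illustration of how such a chainable stub works, in plain TypeScript — hand-rolled recorders stand in for `vi.fn()` here, so this is a sketch of the pattern, not the vitest setup itself:

```typescript
// Record each method call and return the same chain object, so fluent
// calls like db.update(...).set(...).where(...) keep working.
type Call = { method: string; args: unknown[] }

function makeDbStub(calls: Call[]) {
  const record =
    (method: string) =>
    (...args: unknown[]) => {
      calls.push({ method, args }) // log the call for later assertions
      return chain // returning the chain enables fluent chaining
    }
  const chain = {
    update: record('update'),
    set: record('set'),
    where: record('where'),
  }
  return chain
}
```

The same idea underlies `mockUpdate.mockReturnValue({ set: mockSet })` in the tests: each mock returns an object exposing the next method in the chain.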

@@ -9,6 +9,7 @@ import { getSession } from '@/lib/auth'
import { isDev } from '@/lib/core/config/feature-flags'
import { encryptSecret } from '@/lib/core/security/encryption'
import { getEmailDomain } from '@/lib/core/utils/urls'
+ import { performChatUndeploy } from '@/lib/workflows/orchestration'
import { deployWorkflow } from '@/lib/workflows/persistence/utils'
import { checkChatAccess } from '@/app/api/chat/utils'
import { createErrorResponse, createSuccessResponse } from '@/app/api/workflows/utils'
@@ -270,33 +271,25 @@ export async function DELETE(
return createErrorResponse('Unauthorized', 401)
}
-    const {
-      hasAccess,
-      chat: chatRecord,
-      workspaceId: chatWorkspaceId,
-    } = await checkChatAccess(chatId, session.user.id)
+    const { hasAccess, workspaceId: chatWorkspaceId } = await checkChatAccess(
+      chatId,
+      session.user.id
+    )
     if (!hasAccess) {
       return createErrorResponse('Chat not found or access denied', 404)
     }
-    await db.delete(chat).where(eq(chat.id, chatId))
-    logger.info(`Chat "${chatId}" deleted successfully`)
-    recordAudit({
-      workspaceId: chatWorkspaceId || null,
-      actorId: session.user.id,
-      actorName: session.user.name,
-      actorEmail: session.user.email,
-      action: AuditAction.CHAT_DELETED,
-      resourceType: AuditResourceType.CHAT,
-      resourceId: chatId,
-      resourceName: chatRecord?.title || chatId,
-      description: `Deleted chat deployment "${chatRecord?.title || chatId}"`,
-      request: _request,
-    })
+    const result = await performChatUndeploy({
+      chatId,
+      userId: session.user.id,
+      workspaceId: chatWorkspaceId,
+    })
+    if (!result.success) {
+      return createErrorResponse(result.error || 'Failed to delete chat', 500)
+    }
return createSuccessResponse({
message: 'Chat deployment deleted successfully',
})

View File
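The DELETE handler above now delegates deletion, audit logging, and error reporting to a single `performChatUndeploy` helper. A minimal self-contained sketch of that shape — the in-memory store and audit array below are stand-ins for illustration, not Sim's actual internals:

```typescript
interface UndeployParams {
  chatId: string
  userId: string
  workspaceId: string | null
}

interface UndeployResult {
  success: boolean
  error?: string
}

// Stand-in dependencies so the sketch runs on its own.
const store = new Map<string, { title: string }>()
const auditLog: string[] = []

async function performChatUndeploy(params: UndeployParams): Promise<UndeployResult> {
  const existing = store.get(params.chatId)
  if (!existing) return { success: false, error: 'Chat not found' }
  store.delete(params.chatId)
  // Audit only after the delete succeeds, so the trail reflects reality.
  auditLog.push(`${params.userId} deleted chat "${existing.title}"`)
  return { success: true }
}
```

Centralizing this means the route handler only maps the result to an HTTP response, and every caller gets the same audit behavior for free.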

@@ -3,7 +3,7 @@
*
* @vitest-environment node
*/
- import { auditMock, createEnvMock } from '@sim/testing'
+ import { createEnvMock } from '@sim/testing'
import { NextRequest } from 'next/server'
import { afterEach, beforeEach, describe, expect, it, vi } from 'vitest'
@@ -12,66 +12,51 @@ const {
mockFrom,
mockWhere,
mockLimit,
-   mockInsert,
-   mockValues,
-   mockReturning,
    mockCreateSuccessResponse,
    mockCreateErrorResponse,
    mockEncryptSecret,
    mockCheckWorkflowAccessForChatCreation,
-   mockDeployWorkflow,
+   mockPerformChatDeploy,
mockGetSession,
mockUuidV4,
} = vi.hoisted(() => ({
mockSelect: vi.fn(),
mockFrom: vi.fn(),
mockWhere: vi.fn(),
mockLimit: vi.fn(),
-   mockInsert: vi.fn(),
-   mockValues: vi.fn(),
-   mockReturning: vi.fn(),
    mockCreateSuccessResponse: vi.fn(),
    mockCreateErrorResponse: vi.fn(),
    mockEncryptSecret: vi.fn(),
    mockCheckWorkflowAccessForChatCreation: vi.fn(),
-   mockDeployWorkflow: vi.fn(),
+   mockPerformChatDeploy: vi.fn(),
mockGetSession: vi.fn(),
mockUuidV4: vi.fn(),
}))
- vi.mock('@/lib/audit/log', () => auditMock)
vi.mock('@sim/db', () => ({
db: {
select: mockSelect,
-     insert: mockInsert,
},
}))
vi.mock('@sim/db/schema', () => ({
-   chat: { userId: 'userId', identifier: 'identifier' },
+   chat: { userId: 'userId', identifier: 'identifier', archivedAt: 'archivedAt' },
workflow: { id: 'id', userId: 'userId', isDeployed: 'isDeployed' },
}))
vi.mock('drizzle-orm', () => ({
and: vi.fn((...conditions: unknown[]) => ({ type: 'and', conditions })),
eq: vi.fn((field: unknown, value: unknown) => ({ field, value, type: 'eq' })),
+   isNull: vi.fn((field: unknown) => ({ type: 'isNull', field })),
}))
vi.mock('@/app/api/workflows/utils', () => ({
createSuccessResponse: mockCreateSuccessResponse,
createErrorResponse: mockCreateErrorResponse,
}))
vi.mock('@/lib/core/security/encryption', () => ({
encryptSecret: mockEncryptSecret,
}))
vi.mock('uuid', () => ({
v4: mockUuidV4,
}))
vi.mock('@/app/api/chat/utils', () => ({
checkWorkflowAccessForChatCreation: mockCheckWorkflowAccessForChatCreation,
}))
- vi.mock('@/lib/workflows/persistence/utils', () => ({
-   deployWorkflow: mockDeployWorkflow,
+ vi.mock('@/lib/workflows/orchestration', () => ({
+   performChatDeploy: mockPerformChatDeploy,
}))
vi.mock('@/lib/auth', () => ({
@@ -94,10 +79,6 @@ describe('Chat API Route', () => {
mockSelect.mockReturnValue({ from: mockFrom })
mockFrom.mockReturnValue({ where: mockWhere })
mockWhere.mockReturnValue({ limit: mockLimit })
-     mockInsert.mockReturnValue({ values: mockValues })
-     mockValues.mockReturnValue({ returning: mockReturning })
mockUuidV4.mockReturnValue('test-uuid')
mockCreateSuccessResponse.mockImplementation((data) => {
return new Response(JSON.stringify(data), {
@@ -113,12 +94,10 @@ describe('Chat API Route', () => {
})
})
mockEncryptSecret.mockResolvedValue({ encrypted: 'encrypted-password' })
-     mockDeployWorkflow.mockResolvedValue({
+     mockPerformChatDeploy.mockResolvedValue({
       success: true,
-       version: 1,
-       deployedAt: new Date(),
+       chatId: 'test-uuid',
+       chatUrl: 'http://localhost:3000/chat/test-chat',
})
})
@@ -277,7 +256,6 @@ describe('Chat API Route', () => {
hasAccess: true,
workflow: { userId: 'user-id', workspaceId: null, isDeployed: true },
})
-     mockReturning.mockResolvedValue([{ id: 'test-uuid' }])
const req = new NextRequest('http://localhost:3000/api/chat', {
method: 'POST',
@@ -287,6 +265,13 @@ describe('Chat API Route', () => {
expect(response.status).toBe(200)
expect(mockCheckWorkflowAccessForChatCreation).toHaveBeenCalledWith('workflow-123', 'user-id')
+     expect(mockPerformChatDeploy).toHaveBeenCalledWith(
+       expect.objectContaining({
+         workflowId: 'workflow-123',
+         userId: 'user-id',
+         identifier: 'test-chat',
+       })
+     )
})
it('should allow chat deployment when user has workspace admin permission', async () => {
@@ -309,7 +294,6 @@ describe('Chat API Route', () => {
hasAccess: true,
workflow: { userId: 'other-user-id', workspaceId: 'workspace-123', isDeployed: true },
})
-     mockReturning.mockResolvedValue([{ id: 'test-uuid' }])
const req = new NextRequest('http://localhost:3000/api/chat', {
method: 'POST',
@@ -319,6 +303,12 @@ describe('Chat API Route', () => {
expect(response.status).toBe(200)
expect(mockCheckWorkflowAccessForChatCreation).toHaveBeenCalledWith('workflow-123', 'user-id')
+     expect(mockPerformChatDeploy).toHaveBeenCalledWith(
+       expect.objectContaining({
+         workflowId: 'workflow-123',
+         workspaceId: 'workspace-123',
+       })
+     )
})
it('should reject when workflow is in workspace but user lacks admin permission', async () => {
@@ -383,7 +373,7 @@ describe('Chat API Route', () => {
expect(mockCheckWorkflowAccessForChatCreation).toHaveBeenCalledWith('workflow-123', 'user-id')
})
-   it('should auto-deploy workflow if not already deployed', async () => {
+   it('should call performChatDeploy for undeployed workflow', async () => {
mockGetSession.mockResolvedValue({
user: { id: 'user-id', email: 'user@example.com' },
})
@@ -403,7 +393,6 @@ describe('Chat API Route', () => {
hasAccess: true,
workflow: { userId: 'user-id', workspaceId: null, isDeployed: false },
})
-     mockReturning.mockResolvedValue([{ id: 'test-uuid' }])
const req = new NextRequest('http://localhost:3000/api/chat', {
method: 'POST',
@@ -412,10 +401,12 @@ describe('Chat API Route', () => {
const response = await POST(req)
expect(response.status).toBe(200)
-     expect(mockDeployWorkflow).toHaveBeenCalledWith({
-       workflowId: 'workflow-123',
-       deployedBy: 'user-id',
-     })
+     expect(mockPerformChatDeploy).toHaveBeenCalledWith(
+       expect.objectContaining({
+         workflowId: 'workflow-123',
+         userId: 'user-id',
+       })
+     )
})
})
})

View File

@@ -3,14 +3,9 @@ import { chat } from '@sim/db/schema'
import { createLogger } from '@sim/logger'
import { and, eq, isNull } from 'drizzle-orm'
import type { NextRequest } from 'next/server'
- import { v4 as uuidv4 } from 'uuid'
  import { z } from 'zod'
- import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
  import { getSession } from '@/lib/auth'
- import { isDev } from '@/lib/core/config/feature-flags'
- import { encryptSecret } from '@/lib/core/security/encryption'
- import { getBaseUrl } from '@/lib/core/utils/urls'
- import { deployWorkflow } from '@/lib/workflows/persistence/utils'
+ import { performChatDeploy } from '@/lib/workflows/orchestration'
import { checkWorkflowAccessForChatCreation } from '@/app/api/chat/utils'
import { createErrorResponse, createSuccessResponse } from '@/app/api/workflows/utils'
@@ -109,7 +104,6 @@ export async function POST(request: NextRequest) {
)
}
// Check identifier availability and workflow access in parallel
const [existingIdentifier, { hasAccess, workflow: workflowRecord }] = await Promise.all([
db
.select()
@@ -127,121 +121,27 @@ export async function POST(request: NextRequest) {
return createErrorResponse('Workflow not found or access denied', 404)
}
-    // Always deploy/redeploy the workflow to ensure latest version
-    const result = await deployWorkflow({
-      workflowId,
-      deployedBy: session.user.id,
-    })
-    if (!result.success) {
-      return createErrorResponse(result.error || 'Failed to deploy workflow', 500)
-    }
-    logger.info(
-      `${workflowRecord.isDeployed ? 'Redeployed' : 'Auto-deployed'} workflow ${workflowId} for chat (v${result.version})`
-    )
-    // Encrypt password if provided
-    let encryptedPassword = null
-    if (authType === 'password' && password) {
-      const { encrypted } = await encryptSecret(password)
-      encryptedPassword = encrypted
-    }
-    // Create the chat deployment
-    const id = uuidv4()
-    // Log the values we're inserting
-    logger.info('Creating chat deployment with values:', {
-      workflowId,
-      identifier,
-      title,
-      authType,
-      hasPassword: !!encryptedPassword,
-      emailCount: allowedEmails?.length || 0,
-      outputConfigsCount: outputConfigs.length,
-    })
-    // Merge customizations with the additional fields
-    const mergedCustomizations = {
-      ...(customizations || {}),
-      primaryColor: customizations?.primaryColor || 'var(--brand-hover)',
-      welcomeMessage: customizations?.welcomeMessage || 'Hi there! How can I help you today?',
-    }
-    await db.insert(chat).values({
-      id,
+    const result = await performChatDeploy({
       workflowId,
       userId: session.user.id,
       identifier,
       title,
-      description: description || null,
-      customizations: mergedCustomizations,
-      isActive: true,
+      description,
+      customizations,
       authType,
-      password: encryptedPassword,
-      allowedEmails: authType === 'email' || authType === 'sso' ? allowedEmails : [],
+      password,
+      allowedEmails,
       outputConfigs,
-      createdAt: new Date(),
-      updatedAt: new Date(),
-      workspaceId: workflowRecord.workspaceId,
     })
-    // Return successful response with chat URL
-    // Generate chat URL using path-based routing instead of subdomains
-    const baseUrl = getBaseUrl()
-    let chatUrl: string
-    try {
-      const url = new URL(baseUrl)
-      let host = url.host
-      if (host.startsWith('www.')) {
-        host = host.substring(4)
-      }
-      chatUrl = `${url.protocol}//${host}/chat/${identifier}`
-    } catch (error) {
-      logger.warn('Failed to parse baseUrl, falling back to defaults:', {
-        baseUrl,
-        error: error instanceof Error ? error.message : 'Unknown error',
-      })
-      // Fallback based on environment
-      if (isDev) {
-        chatUrl = `http://localhost:3000/chat/${identifier}`
-      } else {
-        chatUrl = `https://sim.ai/chat/${identifier}`
-      }
-    }
+    if (!result.success) {
+      return createErrorResponse(result.error || 'Failed to deploy chat', 500)
+    }
-    logger.info(`Chat "${title}" deployed successfully at ${chatUrl}`)
-    try {
-      const { PlatformEvents } = await import('@/lib/core/telemetry')
-      PlatformEvents.chatDeployed({
-        chatId: id,
-        workflowId,
-        authType,
-        hasOutputConfigs: outputConfigs.length > 0,
-      })
-    } catch (_e) {
-      // Silently fail
-    }
-    recordAudit({
-      workspaceId: workflowRecord.workspaceId || null,
-      actorId: session.user.id,
-      actorName: session.user.name,
-      actorEmail: session.user.email,
-      action: AuditAction.CHAT_DEPLOYED,
-      resourceType: AuditResourceType.CHAT,
-      resourceId: id,
-      resourceName: title,
-      description: `Deployed chat "${title}"`,
-      metadata: { workflowId, identifier, authType },
-      request,
-    })
     return createSuccessResponse({
-      id,
-      chatUrl,
+      id: result.chatId,
+      chatUrl: result.chatUrl,
       message: 'Chat deployment created successfully',
     })
} catch (validationError) {

View File
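The inline URL logic removed above (strip a leading `www.`, fall back per environment when the base URL fails to parse) presumably moves into the orchestration helper. The same normalization, extracted into a self-contained function mirroring the removed code:

```typescript
// Build a path-based chat URL from a base URL, dropping any "www." prefix.
// On an unparseable base URL, fall back to an environment-specific origin,
// as the removed route code did.
function buildChatUrl(baseUrl: string, identifier: string, isDev = false): string {
  try {
    const url = new URL(baseUrl)
    const host = url.host.startsWith('www.') ? url.host.slice(4) : url.host
    return `${url.protocol}//${host}/chat/${identifier}`
  } catch {
    return isDev
      ? `http://localhost:3000/chat/${identifier}`
      : `https://sim.ai/chat/${identifier}`
  }
}
```

Keeping this in one place means the `www.` normalization and the fallbacks cannot drift apart between routes.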

@@ -15,7 +15,6 @@ import {
requestChatTitle,
SSE_RESPONSE_HEADERS,
} from '@/lib/copilot/chat-streaming'
- import { appendCopilotLogContext } from '@/lib/copilot/logging'
import { COPILOT_REQUEST_MODES } from '@/lib/copilot/models'
import { orchestrateCopilotStream } from '@/lib/copilot/orchestrator'
import { getStreamMeta, readStreamEvents } from '@/lib/copilot/orchestrator/stream/buffer'
@@ -184,36 +183,31 @@ export async function POST(req: NextRequest) {
const wf = await getWorkflowById(workflowId)
resolvedWorkspaceId = wf?.workspaceId ?? undefined
} catch {
-      logger.warn(
-        appendCopilotLogContext('Failed to resolve workspaceId from workflow', {
-          requestId: tracker.requestId,
-          messageId: userMessageId,
-        })
-      )
+      logger
+        .withMetadata({ requestId: tracker.requestId, messageId: userMessageId })
+        .warn('Failed to resolve workspaceId from workflow')
}
const userMessageIdToUse = userMessageId || crypto.randomUUID()
+   const reqLogger = logger.withMetadata({
+     requestId: tracker.requestId,
+     messageId: userMessageIdToUse,
+   })
try {
-      logger.error(
-        appendCopilotLogContext('Received chat POST', {
-          requestId: tracker.requestId,
-          messageId: userMessageIdToUse,
-        }),
-        {
-          workflowId,
-          hasContexts: Array.isArray(normalizedContexts),
-          contextsCount: Array.isArray(normalizedContexts) ? normalizedContexts.length : 0,
-          contextsPreview: Array.isArray(normalizedContexts)
-            ? normalizedContexts.map((c: any) => ({
-                kind: c?.kind,
-                chatId: c?.chatId,
-                workflowId: c?.workflowId,
-                executionId: (c as any)?.executionId,
-                label: c?.label,
-              }))
-            : undefined,
-        }
-      )
+      reqLogger.info('Received chat POST', {
+        workflowId,
+        hasContexts: Array.isArray(normalizedContexts),
+        contextsCount: Array.isArray(normalizedContexts) ? normalizedContexts.length : 0,
+        contextsPreview: Array.isArray(normalizedContexts)
+          ? normalizedContexts.map((c: any) => ({
+              kind: c?.kind,
+              chatId: c?.chatId,
+              workflowId: c?.workflowId,
+              executionId: (c as any)?.executionId,
+              label: c?.label,
+            }))
+          : undefined,
+      })
} catch {}
let currentChat: any = null
@@ -251,40 +245,22 @@ export async function POST(req: NextRequest) {
actualChatId
)
agentContexts = processed
-          logger.error(
-            appendCopilotLogContext('Contexts processed for request', {
-              requestId: tracker.requestId,
-              messageId: userMessageIdToUse,
-            }),
-            {
-              processedCount: agentContexts.length,
-              kinds: agentContexts.map((c) => c.type),
-              lengthPreview: agentContexts.map((c) => c.content?.length ?? 0),
-            }
-          )
+          reqLogger.info('Contexts processed for request', {
+            processedCount: agentContexts.length,
+            kinds: agentContexts.map((c) => c.type),
+            lengthPreview: agentContexts.map((c) => c.content?.length ?? 0),
+          })
if (
Array.isArray(normalizedContexts) &&
normalizedContexts.length > 0 &&
agentContexts.length === 0
) {
-            logger.warn(
-              appendCopilotLogContext(
-                'Contexts provided but none processed. Check executionId for logs contexts.',
-                {
-                  requestId: tracker.requestId,
-                  messageId: userMessageIdToUse,
-                }
-              )
+            reqLogger.warn(
+              'Contexts provided but none processed. Check executionId for logs contexts.'
)
}
} catch (e) {
-          logger.error(
-            appendCopilotLogContext('Failed to process contexts', {
-              requestId: tracker.requestId,
-              messageId: userMessageIdToUse,
-            }),
-            e
-          )
+          reqLogger.error('Failed to process contexts', e)
}
}
@@ -313,13 +289,7 @@ export async function POST(req: NextRequest) {
if (result.status === 'fulfilled' && result.value) {
agentContexts.push(result.value)
} else if (result.status === 'rejected') {
-          logger.error(
-            appendCopilotLogContext('Failed to resolve resource attachment', {
-              requestId: tracker.requestId,
-              messageId: userMessageIdToUse,
-            }),
-            result.reason
-          )
+          reqLogger.error('Failed to resolve resource attachment', result.reason)
}
}
}
@@ -358,26 +328,20 @@ export async function POST(req: NextRequest) {
)
try {
-      logger.error(
-        appendCopilotLogContext('About to call Sim Agent', {
-          requestId: tracker.requestId,
-          messageId: userMessageIdToUse,
-        }),
-        {
-          hasContext: agentContexts.length > 0,
-          contextCount: agentContexts.length,
-          hasFileAttachments: Array.isArray(requestPayload.fileAttachments),
-          messageLength: message.length,
-          mode: effectiveMode,
-          hasTools: Array.isArray(requestPayload.tools),
-          toolCount: Array.isArray(requestPayload.tools) ? requestPayload.tools.length : 0,
-          hasBaseTools: Array.isArray(requestPayload.baseTools),
-          baseToolCount: Array.isArray(requestPayload.baseTools)
-            ? requestPayload.baseTools.length
-            : 0,
-          hasCredentials: !!requestPayload.credentials,
-        }
-      )
+      reqLogger.info('About to call Sim Agent', {
+        hasContext: agentContexts.length > 0,
+        contextCount: agentContexts.length,
+        hasFileAttachments: Array.isArray(requestPayload.fileAttachments),
+        messageLength: message.length,
+        mode: effectiveMode,
+        hasTools: Array.isArray(requestPayload.tools),
+        toolCount: Array.isArray(requestPayload.tools) ? requestPayload.tools.length : 0,
+        hasBaseTools: Array.isArray(requestPayload.baseTools),
+        baseToolCount: Array.isArray(requestPayload.baseTools)
+          ? requestPayload.baseTools.length
+          : 0,
+        hasCredentials: !!requestPayload.credentials,
+      })
} catch {}
if (stream && actualChatId) {
@@ -521,16 +485,10 @@ export async function POST(req: NextRequest) {
.where(eq(copilotChats.id, actualChatId))
}
} catch (error) {
-            logger.error(
-              appendCopilotLogContext('Failed to persist chat messages', {
-                requestId: tracker.requestId,
-                messageId: userMessageIdToUse,
-              }),
-              {
-                chatId: actualChatId,
-                error: error instanceof Error ? error.message : 'Unknown error',
-              }
-            )
+            reqLogger.error('Failed to persist chat messages', {
+              chatId: actualChatId,
+              error: error instanceof Error ? error.message : 'Unknown error',
+            })
}
},
},
@@ -572,19 +530,13 @@ export async function POST(req: NextRequest) {
provider: typeof requestPayload?.provider === 'string' ? requestPayload.provider : undefined,
}
-    logger.error(
-      appendCopilotLogContext('Non-streaming response from orchestrator', {
-        requestId: tracker.requestId,
-        messageId: userMessageIdToUse,
-      }),
-      {
-        hasContent: !!responseData.content,
-        contentLength: responseData.content?.length || 0,
-        model: responseData.model,
-        provider: responseData.provider,
-        toolCallsCount: responseData.toolCalls?.length || 0,
-      }
-    )
+    reqLogger.info('Non-streaming response from orchestrator', {
+      hasContent: !!responseData.content,
+      contentLength: responseData.content?.length || 0,
+      model: responseData.model,
+      provider: responseData.provider,
+      toolCallsCount: responseData.toolCalls?.length || 0,
+    })
// Save messages if we have a chat
if (currentChat && responseData.content) {
@@ -617,12 +569,7 @@ export async function POST(req: NextRequest) {
// Start title generation in parallel if this is first message (non-streaming)
if (actualChatId && !currentChat.title && conversationHistory.length === 0) {
logger.error(
appendCopilotLogContext('Starting title generation for non-streaming response', {
requestId: tracker.requestId,
messageId: userMessageIdToUse,
})
)
reqLogger.info('Starting title generation for non-streaming response')
requestChatTitle({ message, model: selectedModel, provider, messageId: userMessageIdToUse })
.then(async (title) => {
if (title) {
@@ -633,22 +580,11 @@ export async function POST(req: NextRequest) {
updatedAt: new Date(),
})
.where(eq(copilotChats.id, actualChatId!))
logger.error(
appendCopilotLogContext(`Generated and saved title: ${title}`, {
requestId: tracker.requestId,
messageId: userMessageIdToUse,
})
)
reqLogger.info(`Generated and saved title: ${title}`)
}
})
.catch((error) => {
logger.error(
appendCopilotLogContext('Title generation failed', {
requestId: tracker.requestId,
messageId: userMessageIdToUse,
}),
error
)
reqLogger.error('Title generation failed', error)
})
}
@@ -662,17 +598,11 @@ export async function POST(req: NextRequest) {
.where(eq(copilotChats.id, actualChatId!))
}
logger.error(
appendCopilotLogContext('Returning non-streaming response', {
requestId: tracker.requestId,
messageId: userMessageIdToUse,
}),
{
duration: tracker.getDuration(),
chatId: actualChatId,
responseLength: responseData.content?.length || 0,
}
)
reqLogger.info('Returning non-streaming response', {
duration: tracker.getDuration(),
chatId: actualChatId,
responseLength: responseData.content?.length || 0,
})
return NextResponse.json({
success: true,
@@ -696,33 +626,25 @@ export async function POST(req: NextRequest) {
const duration = tracker.getDuration()
if (error instanceof z.ZodError) {
logger.error(
appendCopilotLogContext('Validation error', {
requestId: tracker.requestId,
messageId: pendingChatStreamID ?? undefined,
}),
{
duration,
errors: error.errors,
}
)
logger
.withMetadata({ requestId: tracker.requestId, messageId: pendingChatStreamID ?? undefined })
.error('Validation error', {
duration,
errors: error.errors,
})
return NextResponse.json(
{ error: 'Invalid request data', details: error.errors },
{ status: 400 }
)
}
logger.error(
appendCopilotLogContext('Error handling copilot chat', {
requestId: tracker.requestId,
messageId: pendingChatStreamID ?? undefined,
}),
{
duration,
error: error instanceof Error ? error.message : 'Unknown error',
stack: error instanceof Error ? error.stack : undefined,
}
)
logger
.withMetadata({ requestId: tracker.requestId, messageId: pendingChatStreamID ?? undefined })
.error('Error handling copilot chat', {
duration,
error: error instanceof Error ? error.message : 'Unknown error',
stack: error instanceof Error ? error.stack : undefined,
})
return NextResponse.json(
{ error: error instanceof Error ? error.message : 'Internal server error' },
@@ -767,16 +689,13 @@ export async function GET(req: NextRequest) {
status: meta?.status || 'unknown',
}
} catch (err) {
logger.warn(
appendCopilotLogContext('Failed to read stream snapshot for chat', {
messageId: chat.conversationId || undefined,
}),
{
chatId,
conversationId: chat.conversationId,
error: err instanceof Error ? err.message : String(err),
}
)
logger
.withMetadata({ messageId: chat.conversationId || undefined })
.warn('Failed to read stream snapshot for chat', {
chatId,
conversationId: chat.conversationId,
error: err instanceof Error ? err.message : String(err),
})
}
}
@@ -795,11 +714,9 @@ export async function GET(req: NextRequest) {
...(streamSnapshot ? { streamSnapshot } : {}),
}
logger.error(
appendCopilotLogContext(`Retrieved chat ${chatId}`, {
messageId: chat.conversationId || undefined,
})
)
logger
.withMetadata({ messageId: chat.conversationId || undefined })
.info(`Retrieved chat ${chatId}`)
return NextResponse.json({ success: true, chat: transformedChat })
}
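The refactor running through these hunks replaces per-call `appendCopilotLogContext(...)` wrapping (and the misuse of `logger.error` for info-level messages) with a request-scoped child logger built once via `withMetadata`. A minimal sketch of how such an API can behave — hypothetical, not the actual `@sim/logger` implementation:

```typescript
type Meta = Record<string, unknown>

// Hypothetical metadata-carrying logger mirroring the reqLogger pattern
// in the diff; the real @sim/logger API may differ.
class Logger {
  constructor(
    private readonly name: string,
    private readonly meta: Meta = {}
  ) {}

  // Returns a child logger whose metadata is merged into every entry,
  // so requestId/messageId are attached once instead of at every call site.
  withMetadata(extra: Meta): Logger {
    return new Logger(this.name, { ...this.meta, ...extra })
  }

  info(message: string, fields: Meta = {}): void {
    console.log(JSON.stringify({ level: 'info', logger: this.name, message, ...this.meta, ...fields }))
  }

  error(message: string, fields: Meta = {}): void {
    console.error(JSON.stringify({ level: 'error', logger: this.name, message, ...this.meta, ...fields }))
  }
}

const logger = new Logger('CopilotChatAPI')
const reqLogger = logger.withMetadata({ requestId: 'req_1', messageId: 'msg_1' })
reqLogger.info('Retrieved chat', { chatId: 'chat_1' })
```

Each route handler builds `reqLogger` once after the IDs are known, so log levels can also be corrected in the same pass (`info` for progress, `error` only for failures).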


@@ -1,6 +1,5 @@
import { createLogger } from '@sim/logger'
import { type NextRequest, NextResponse } from 'next/server'
import { appendCopilotLogContext } from '@/lib/copilot/logging'
import {
getStreamMeta,
readStreamEvents,
@@ -36,24 +35,21 @@ export async function GET(request: NextRequest) {
const toParam = url.searchParams.get('to')
const toEventId = toParam ? Number(toParam) : undefined
logger.error(
appendCopilotLogContext('[Resume] Received resume request', {
messageId: streamId || undefined,
}),
{
streamId: streamId || undefined,
fromEventId,
toEventId,
batchMode,
}
)
const reqLogger = logger.withMetadata({ messageId: streamId || undefined })
reqLogger.info('[Resume] Received resume request', {
streamId: streamId || undefined,
fromEventId,
toEventId,
batchMode,
})
if (!streamId) {
return NextResponse.json({ error: 'streamId is required' }, { status: 400 })
}
const meta = (await getStreamMeta(streamId)) as StreamMeta | null
logger.error(appendCopilotLogContext('[Resume] Stream lookup', { messageId: streamId }), {
reqLogger.info('[Resume] Stream lookup', {
streamId,
fromEventId,
toEventId,
@@ -72,7 +68,7 @@ export async function GET(request: NextRequest) {
if (batchMode) {
const events = await readStreamEvents(streamId, fromEventId)
const filteredEvents = toEventId ? events.filter((e) => e.eventId <= toEventId) : events
logger.error(appendCopilotLogContext('[Resume] Batch response', { messageId: streamId }), {
reqLogger.info('[Resume] Batch response', {
streamId,
fromEventId,
toEventId,
@@ -124,14 +120,11 @@ export async function GET(request: NextRequest) {
const flushEvents = async () => {
const events = await readStreamEvents(streamId, lastEventId)
if (events.length > 0) {
logger.error(
appendCopilotLogContext('[Resume] Flushing events', { messageId: streamId }),
{
streamId,
fromEventId: lastEventId,
eventCount: events.length,
}
)
reqLogger.info('[Resume] Flushing events', {
streamId,
fromEventId: lastEventId,
eventCount: events.length,
})
}
for (const entry of events) {
lastEventId = entry.eventId
@@ -178,7 +171,7 @@ export async function GET(request: NextRequest) {
}
} catch (error) {
if (!controllerClosed && !request.signal.aborted) {
logger.warn(appendCopilotLogContext('Stream replay failed', { messageId: streamId }), {
reqLogger.warn('Stream replay failed', {
streamId,
error: error instanceof Error ? error.message : String(error),
})


@@ -100,7 +100,7 @@ const emailTemplates = {
trigger: 'api',
duration: '2.3s',
cost: '$0.0042',
logUrl: 'https://sim.ai/workspace/ws_123/logs?search=exec_abc123',
logUrl: 'https://sim.ai/workspace/ws_123/logs?executionId=exec_abc123',
}),
'workflow-notification-error': () =>
renderWorkflowNotificationEmail({
@@ -109,7 +109,7 @@ const emailTemplates = {
trigger: 'webhook',
duration: '1.1s',
cost: '$0.0021',
logUrl: 'https://sim.ai/workspace/ws_123/logs?search=exec_abc123',
logUrl: 'https://sim.ai/workspace/ws_123/logs?executionId=exec_abc123',
}),
'workflow-notification-alert': () =>
renderWorkflowNotificationEmail({
@@ -118,7 +118,7 @@ const emailTemplates = {
trigger: 'schedule',
duration: '45.2s',
cost: '$0.0156',
logUrl: 'https://sim.ai/workspace/ws_123/logs?search=exec_abc123',
logUrl: 'https://sim.ai/workspace/ws_123/logs?executionId=exec_abc123',
alertReason: '3 consecutive failures detected',
}),
'workflow-notification-full': () =>
@@ -128,7 +128,7 @@ const emailTemplates = {
trigger: 'api',
duration: '12.5s',
cost: '$0.0234',
logUrl: 'https://sim.ai/workspace/ws_123/logs?search=exec_abc123',
logUrl: 'https://sim.ai/workspace/ws_123/logs?executionId=exec_abc123',
finalOutput: { processed: 150, skipped: 3, status: 'completed' },
rateLimits: {
sync: { requestsPerMinute: 60, remaining: 45 },
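The template changes above switch notification links from `?search=` to `?executionId=`, so the logs page can auto-select the specific execution and expand its details panel. A sketch of a URL builder under that assumption (the `buildLogUrl` name comes from the commit message; the signature is hypothetical):

```typescript
// Hypothetical sketch of buildLogUrl: deep-link to one execution via
// ?executionId= rather than a free-text ?search= query.
function buildLogUrl(baseUrl: string, workspaceId: string, executionId: string): string {
  const url = new URL(`/workspace/${workspaceId}/logs`, baseUrl)
  url.searchParams.set('executionId', executionId)
  return url.toString()
}

console.log(buildLogUrl('https://sim.ai', 'ws_123', 'exec_abc123'))
// https://sim.ai/workspace/ws_123/logs?executionId=exec_abc123
```

Using `URLSearchParams` keeps the execution ID correctly percent-encoded, which a string-concatenated `?search=` link would not guarantee.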


@@ -6,7 +6,14 @@
import { auditMock, createMockRequest, type MockUser } from '@sim/testing'
import { beforeEach, describe, expect, it, vi } from 'vitest'
const { mockGetSession, mockGetUserEntityPermissions, mockLogger, mockDbRef } = vi.hoisted(() => {
const {
mockGetSession,
mockGetUserEntityPermissions,
mockLogger,
mockDbRef,
mockPerformDeleteFolder,
mockCheckForCircularReference,
} = vi.hoisted(() => {
const logger = {
info: vi.fn(),
warn: vi.fn(),
@@ -21,6 +28,8 @@ const { mockGetSession, mockGetUserEntityPermissions, mockLogger, mockDbRef } =
mockGetUserEntityPermissions: vi.fn(),
mockLogger: logger,
mockDbRef: { current: null as any },
mockPerformDeleteFolder: vi.fn(),
mockCheckForCircularReference: vi.fn(),
}
})
@@ -39,6 +48,12 @@ vi.mock('@sim/db', () => ({
return mockDbRef.current
},
}))
vi.mock('@/lib/workflows/orchestration', () => ({
performDeleteFolder: mockPerformDeleteFolder,
}))
vi.mock('@/lib/workflows/utils', () => ({
checkForCircularReference: mockCheckForCircularReference,
}))
import { DELETE, PUT } from '@/app/api/folders/[id]/route'
@@ -144,6 +159,11 @@ describe('Individual Folder API Route', () => {
mockGetUserEntityPermissions.mockResolvedValue('admin')
mockDbRef.current = createFolderDbMock()
mockPerformDeleteFolder.mockResolvedValue({
success: true,
deletedItems: { folders: 1, workflows: 0 },
})
mockCheckForCircularReference.mockResolvedValue(false)
})
describe('PUT /api/folders/[id]', () => {
@@ -369,13 +389,17 @@ describe('Individual Folder API Route', () => {
it('should prevent circular references when updating parent', async () => {
mockAuthenticatedUser()
const circularCheckResults = [{ parentId: 'folder-2' }, { parentId: 'folder-3' }]
mockDbRef.current = createFolderDbMock({
folderLookupResult: { id: 'folder-3', parentId: null, name: 'Folder 3' },
circularCheckResults,
folderLookupResult: {
id: 'folder-3',
parentId: null,
name: 'Folder 3',
workspaceId: 'workspace-123',
},
})
mockCheckForCircularReference.mockResolvedValue(true)
const req = createMockRequest('PUT', {
name: 'Updated Folder 3',
parentId: 'folder-1',
@@ -388,6 +412,7 @@ describe('Individual Folder API Route', () => {
const data = await response.json()
expect(data).toHaveProperty('error', 'Cannot create circular folder reference')
expect(mockCheckForCircularReference).toHaveBeenCalledWith('folder-3', 'folder-1')
})
})
@@ -409,6 +434,12 @@ describe('Individual Folder API Route', () => {
const data = await response.json()
expect(data).toHaveProperty('success', true)
expect(data).toHaveProperty('deletedItems')
expect(mockPerformDeleteFolder).toHaveBeenCalledWith({
folderId: 'folder-1',
workspaceId: 'workspace-123',
userId: TEST_USER.id,
folderName: 'Test Folder',
})
})
it('should return 401 for unauthenticated delete requests', async () => {
@@ -472,6 +503,7 @@ describe('Individual Folder API Route', () => {
const data = await response.json()
expect(data).toHaveProperty('success', true)
expect(mockPerformDeleteFolder).toHaveBeenCalled()
})
it('should handle database errors during deletion', async () => {


@@ -1,12 +1,12 @@
import { db } from '@sim/db'
import { workflow, workflowFolder } from '@sim/db/schema'
import { workflowFolder } from '@sim/db/schema'
import { createLogger } from '@sim/logger'
import { and, eq, isNull } from 'drizzle-orm'
import { eq } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { getSession } from '@/lib/auth'
import { archiveWorkflowsByIdsInWorkspace } from '@/lib/workflows/lifecycle'
import { performDeleteFolder } from '@/lib/workflows/orchestration'
import { checkForCircularReference } from '@/lib/workflows/utils'
import { getUserEntityPermissions } from '@/lib/workspaces/permissions/utils'
const logger = createLogger('FoldersIDAPI')
@@ -130,7 +130,6 @@ export async function DELETE(
return NextResponse.json({ error: 'Folder not found' }, { status: 404 })
}
// Check if user has admin permissions for the workspace (admin-only for deletions)
const workspacePermission = await getUserEntityPermissions(
session.user.id,
'workspace',
@@ -144,170 +143,25 @@ export async function DELETE(
)
}
// Check if deleting this folder would delete the last workflow(s) in the workspace
const workflowsInFolder = await countWorkflowsInFolderRecursively(
id,
existingFolder.workspaceId
)
const totalWorkflowsInWorkspace = await db
.select({ id: workflow.id })
.from(workflow)
.where(and(eq(workflow.workspaceId, existingFolder.workspaceId), isNull(workflow.archivedAt)))
if (workflowsInFolder > 0 && workflowsInFolder >= totalWorkflowsInWorkspace.length) {
return NextResponse.json(
{ error: 'Cannot delete folder containing the only workflow(s) in the workspace' },
{ status: 400 }
)
}
// Recursively delete folder and all its contents
const deletionStats = await deleteFolderRecursively(id, existingFolder.workspaceId)
logger.info('Deleted folder and all contents:', {
id,
deletionStats,
})
recordAudit({
workspaceId: existingFolder.workspaceId,
actorId: session.user.id,
actorName: session.user.name,
actorEmail: session.user.email,
action: AuditAction.FOLDER_DELETED,
resourceType: AuditResourceType.FOLDER,
resourceId: id,
resourceName: existingFolder.name,
description: `Deleted folder "${existingFolder.name}"`,
metadata: {
affected: {
workflows: deletionStats.workflows,
subfolders: deletionStats.folders - 1,
},
},
request,
})
const result = await performDeleteFolder({
folderId: id,
workspaceId: existingFolder.workspaceId,
userId: session.user.id,
folderName: existingFolder.name,
})
if (!result.success) {
const status =
result.errorCode === 'not_found' ? 404 : result.errorCode === 'validation' ? 400 : 500
return NextResponse.json({ error: result.error }, { status })
}
return NextResponse.json({
success: true,
deletedItems: deletionStats,
deletedItems: result.deletedItems,
})
} catch (error) {
logger.error('Error deleting folder:', { error })
return NextResponse.json({ error: 'Internal server error' }, { status: 500 })
}
}
// Helper function to recursively delete a folder and all its contents
async function deleteFolderRecursively(
folderId: string,
workspaceId: string
): Promise<{ folders: number; workflows: number }> {
const stats = { folders: 0, workflows: 0 }
// Get all child folders first (workspace-scoped, not user-scoped)
const childFolders = await db
.select({ id: workflowFolder.id })
.from(workflowFolder)
.where(and(eq(workflowFolder.parentId, folderId), eq(workflowFolder.workspaceId, workspaceId)))
// Recursively delete child folders
for (const childFolder of childFolders) {
const childStats = await deleteFolderRecursively(childFolder.id, workspaceId)
stats.folders += childStats.folders
stats.workflows += childStats.workflows
}
// Delete all workflows in this folder (workspace-scoped, not user-scoped)
// The database cascade will handle deleting related workflow_blocks, workflow_edges, workflow_subflows
const workflowsInFolder = await db
.select({ id: workflow.id })
.from(workflow)
.where(
and(
eq(workflow.folderId, folderId),
eq(workflow.workspaceId, workspaceId),
isNull(workflow.archivedAt)
)
)
if (workflowsInFolder.length > 0) {
await archiveWorkflowsByIdsInWorkspace(
workspaceId,
workflowsInFolder.map((entry) => entry.id),
{ requestId: `folder-${folderId}` }
)
stats.workflows += workflowsInFolder.length
}
// Delete this folder
await db.delete(workflowFolder).where(eq(workflowFolder.id, folderId))
stats.folders += 1
return stats
}
/**
* Counts the number of workflows in a folder and all its subfolders recursively.
*/
async function countWorkflowsInFolderRecursively(
folderId: string,
workspaceId: string
): Promise<number> {
let count = 0
const workflowsInFolder = await db
.select({ id: workflow.id })
.from(workflow)
.where(
and(
eq(workflow.folderId, folderId),
eq(workflow.workspaceId, workspaceId),
isNull(workflow.archivedAt)
)
)
count += workflowsInFolder.length
const childFolders = await db
.select({ id: workflowFolder.id })
.from(workflowFolder)
.where(and(eq(workflowFolder.parentId, folderId), eq(workflowFolder.workspaceId, workspaceId)))
for (const childFolder of childFolders) {
count += await countWorkflowsInFolderRecursively(childFolder.id, workspaceId)
}
return count
}
// Helper function to check for circular references
async function checkForCircularReference(folderId: string, parentId: string): Promise<boolean> {
let currentParentId: string | null = parentId
const visited = new Set<string>()
while (currentParentId) {
if (visited.has(currentParentId)) {
return true // Circular reference detected
}
if (currentParentId === folderId) {
return true // Would create a cycle
}
visited.add(currentParentId)
// Get the parent of the current parent
const parent: { parentId: string | null } | undefined = await db
.select({ parentId: workflowFolder.parentId })
.from(workflowFolder)
.where(eq(workflowFolder.id, currentParentId))
.then((rows) => rows[0])
currentParentId = parent?.parentId || null
}
return false
}
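The removed helper above (now imported from `@/lib/workflows/utils`) walks the parent chain upward, rejecting a reparent that would make a folder its own ancestor. The same logic against an in-memory parent map — an illustrative rewrite of the DB-backed version, not the actual utility:

```typescript
// In-memory re-implementation of the parent-chain walk for illustration;
// the real helper queries workflowFolder via drizzle.
function checkForCircularReference(
  parents: Map<string, string | null>,
  folderId: string,
  parentId: string
): boolean {
  let current: string | null = parentId
  const visited = new Set<string>()
  while (current) {
    // Revisiting a node means the existing tree already has a cycle;
    // reaching folderId means the proposed reparent would create one.
    if (visited.has(current) || current === folderId) return true
    visited.add(current)
    current = parents.get(current) ?? null
  }
  return false
}

// a is the root; b is under a; c is under b.
const parents = new Map<string, string | null>([
  ['a', null],
  ['b', 'a'],
  ['c', 'b'],
])
console.log(checkForCircularReference(parents, 'a', 'c')) // true: a is an ancestor of c
console.log(checkForCircularReference(parents, 'c', 'a')) // false: safe reparent
```

The `visited` set also terminates the walk if the stored hierarchy is already corrupt, so the check cannot loop forever on bad data.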


@@ -15,6 +15,8 @@ const GetChunksQuerySchema = z.object({
enabled: z.enum(['true', 'false', 'all']).optional().default('all'),
limit: z.coerce.number().min(1).max(100).optional().default(50),
offset: z.coerce.number().min(0).optional().default(0),
sortBy: z.enum(['chunkIndex', 'tokenCount', 'enabled']).optional().default('chunkIndex'),
sortOrder: z.enum(['asc', 'desc']).optional().default('asc'),
})
const CreateChunkSchema = z.object({
@@ -88,6 +90,8 @@ export async function GET(
enabled: searchParams.get('enabled') || undefined,
limit: searchParams.get('limit') || undefined,
offset: searchParams.get('offset') || undefined,
sortBy: searchParams.get('sortBy') || undefined,
sortOrder: searchParams.get('sortOrder') || undefined,
})
const result = await queryChunks(documentId, queryParams, requestId)
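The new `sortBy`/`sortOrder` query params get their defaults from zod enums (`chunkIndex` ascending when omitted). A hand-rolled equivalent of that fallback behavior, for illustration only — the route itself relies on zod, which rejects invalid enum values rather than silently defaulting as this lenient sketch does:

```typescript
type SortBy = 'chunkIndex' | 'tokenCount' | 'enabled'
type SortOrder = 'asc' | 'desc'

const SORT_FIELDS: readonly SortBy[] = ['chunkIndex', 'tokenCount', 'enabled']

// Missing params fall back to the schema defaults; unknown values are
// treated leniently here, whereas the zod schema would return a 400.
function parseSortParams(searchParams: URLSearchParams): { sortBy: SortBy; sortOrder: SortOrder } {
  const rawBy = searchParams.get('sortBy')
  const rawOrder = searchParams.get('sortOrder')
  const sortBy = SORT_FIELDS.includes(rawBy as SortBy) ? (rawBy as SortBy) : 'chunkIndex'
  const sortOrder: SortOrder = rawOrder === 'desc' ? 'desc' : 'asc'
  return { sortBy, sortOrder }
}

console.log(parseSortParams(new URLSearchParams('sortBy=tokenCount&sortOrder=desc')))
console.log(parseSortParams(new URLSearchParams(''))) // defaults: chunkIndex asc
```

Note the route passes `searchParams.get(...) || undefined` into the schema, which is what lets zod's `.default(...)` kick in for absent params.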


@@ -12,7 +12,6 @@ import {
createSSEStream,
SSE_RESPONSE_HEADERS,
} from '@/lib/copilot/chat-streaming'
import { appendCopilotLogContext } from '@/lib/copilot/logging'
import type { OrchestratorResult } from '@/lib/copilot/orchestrator/types'
import { processContextsServer, resolveActiveResourceContext } from '@/lib/copilot/process-contents'
import { createRequestTracker, createUnauthorizedResponse } from '@/lib/copilot/request-helpers'
@@ -112,27 +111,22 @@ export async function POST(req: NextRequest) {
const userMessageId = providedMessageId || crypto.randomUUID()
userMessageIdForLogs = userMessageId
const reqLogger = logger.withMetadata({
requestId: tracker.requestId,
messageId: userMessageId,
})
logger.error(
appendCopilotLogContext('Received mothership chat start request', {
requestId: tracker.requestId,
messageId: userMessageId,
}),
{
workspaceId,
chatId,
createNewChat,
hasContexts: Array.isArray(contexts) && contexts.length > 0,
contextsCount: Array.isArray(contexts) ? contexts.length : 0,
hasResourceAttachments:
Array.isArray(resourceAttachments) && resourceAttachments.length > 0,
resourceAttachmentCount: Array.isArray(resourceAttachments)
? resourceAttachments.length
: 0,
hasFileAttachments: Array.isArray(fileAttachments) && fileAttachments.length > 0,
fileAttachmentCount: Array.isArray(fileAttachments) ? fileAttachments.length : 0,
}
)
reqLogger.info('Received mothership chat start request', {
workspaceId,
chatId,
createNewChat,
hasContexts: Array.isArray(contexts) && contexts.length > 0,
contextsCount: Array.isArray(contexts) ? contexts.length : 0,
hasResourceAttachments: Array.isArray(resourceAttachments) && resourceAttachments.length > 0,
resourceAttachmentCount: Array.isArray(resourceAttachments) ? resourceAttachments.length : 0,
hasFileAttachments: Array.isArray(fileAttachments) && fileAttachments.length > 0,
fileAttachmentCount: Array.isArray(fileAttachments) ? fileAttachments.length : 0,
})
try {
await assertActiveWorkspaceAccess(workspaceId, authenticatedUserId)
@@ -174,13 +168,7 @@ export async function POST(req: NextRequest) {
actualChatId
)
} catch (e) {
logger.error(
appendCopilotLogContext('Failed to process contexts', {
requestId: tracker.requestId,
messageId: userMessageId,
}),
e
)
reqLogger.error('Failed to process contexts', e)
}
}
@@ -205,13 +193,7 @@ export async function POST(req: NextRequest) {
if (result.status === 'fulfilled' && result.value) {
agentContexts.push(result.value)
} else if (result.status === 'rejected') {
logger.error(
appendCopilotLogContext('Failed to resolve resource attachment', {
requestId: tracker.requestId,
messageId: userMessageId,
}),
result.reason
)
reqLogger.error('Failed to resolve resource attachment', result.reason)
}
}
}
@@ -399,16 +381,10 @@ export async function POST(req: NextRequest) {
})
}
} catch (error) {
logger.error(
appendCopilotLogContext('Failed to persist chat messages', {
requestId: tracker.requestId,
messageId: userMessageId,
}),
{
chatId: actualChatId,
error: error instanceof Error ? error.message : 'Unknown error',
}
)
reqLogger.error('Failed to persist chat messages', {
chatId: actualChatId,
error: error instanceof Error ? error.message : 'Unknown error',
})
}
},
},
@@ -423,15 +399,11 @@ export async function POST(req: NextRequest) {
)
}
logger.error(
appendCopilotLogContext('Error handling mothership chat', {
requestId: tracker.requestId,
messageId: userMessageIdForLogs,
}),
{
error: error instanceof Error ? error.message : 'Unknown error',
}
)
logger
.withMetadata({ requestId: tracker.requestId, messageId: userMessageIdForLogs })
.error('Error handling mothership chat', {
error: error instanceof Error ? error.message : 'Unknown error',
})
return NextResponse.json(
{ error: error instanceof Error ? error.message : 'Internal server error' },


@@ -5,6 +5,7 @@ import { and, eq, sql } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { getSession } from '@/lib/auth'
import { releasePendingChatStream } from '@/lib/copilot/chat-streaming'
import { taskPubSub } from '@/lib/copilot/task-events'
const logger = createLogger('MothershipChatStopAPI')
@@ -58,6 +59,8 @@ export async function POST(req: NextRequest) {
const { chatId, streamId, content, contentBlocks } = StopSchema.parse(await req.json())
await releasePendingChatStream(chatId, streamId)
const setClause: Record<string, unknown> = {
conversationId: null,
updatedAt: new Date(),


@@ -5,7 +5,6 @@ import { and, eq, sql } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { getAccessibleCopilotChat } from '@/lib/copilot/chat-lifecycle'
import { appendCopilotLogContext } from '@/lib/copilot/logging'
import { getStreamMeta, readStreamEvents } from '@/lib/copilot/orchestrator/stream/buffer'
import {
authenticateCopilotRequestSessionOnly,
@@ -63,16 +62,13 @@ export async function GET(
status: meta?.status || 'unknown',
}
} catch (error) {
logger.warn(
appendCopilotLogContext('Failed to read stream snapshot for mothership chat', {
messageId: chat.conversationId || undefined,
}),
{
chatId,
conversationId: chat.conversationId,
error: error instanceof Error ? error.message : String(error),
}
)
logger
.withMetadata({ messageId: chat.conversationId || undefined })
.warn('Failed to read stream snapshot for mothership chat', {
chatId,
conversationId: chat.conversationId,
error: error instanceof Error ? error.message : String(error),
})
}
}

View File

@@ -4,7 +4,6 @@ import { z } from 'zod'
import { checkInternalAuth } from '@/lib/auth/hybrid'
import { createRunSegment } from '@/lib/copilot/async-runs/repository'
import { buildIntegrationToolSchemas } from '@/lib/copilot/chat-payload'
import { appendCopilotLogContext } from '@/lib/copilot/logging'
import { orchestrateCopilotStream } from '@/lib/copilot/orchestrator'
import { generateWorkspaceContext } from '@/lib/copilot/workspace-context'
import {
@@ -53,6 +52,7 @@ export async function POST(req: NextRequest) {
const effectiveChatId = chatId || crypto.randomUUID()
messageId = crypto.randomUUID()
const reqLogger = logger.withMetadata({ messageId })
const [workspaceContext, integrationTools, userPermission] = await Promise.all([
generateWorkspaceContext(workspaceId, userId),
buildIntegrationToolSchemas(userId, messageId),
@@ -96,7 +96,7 @@ export async function POST(req: NextRequest) {
})
if (!result.success) {
logger.error(appendCopilotLogContext('Mothership execute failed', { messageId }), {
reqLogger.error('Mothership execute failed', {
error: result.error,
errors: result.errors,
})
@@ -135,7 +135,7 @@ export async function POST(req: NextRequest) {
)
}
logger.error(appendCopilotLogContext('Mothership execute error', { messageId }), {
logger.withMetadata({ messageId }).error('Mothership execute error', {
error: error instanceof Error ? error.message : 'Unknown error',
})


@@ -1,6 +1,7 @@
import { createLogger } from '@sim/logger'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { checkSessionOrInternalAuth } from '@/lib/auth/hybrid'
import { generateRequestId } from '@/lib/core/utils/request'
import { deleteSkill, listSkills, upsertSkills } from '@/lib/workflows/skills/operations'
@@ -96,6 +97,18 @@ export async function POST(req: NextRequest) {
requestId,
})
for (const skill of resultSkills) {
recordAudit({
workspaceId,
actorId: userId,
action: AuditAction.SKILL_CREATED,
resourceType: AuditResourceType.SKILL,
resourceId: skill.id,
resourceName: skill.name,
description: `Created/updated skill "${skill.name}"`,
})
}
return NextResponse.json({ success: true, data: resultSkills })
} catch (validationError) {
if (validationError instanceof z.ZodError) {
@@ -158,6 +171,15 @@ export async function DELETE(request: NextRequest) {
return NextResponse.json({ error: 'Skill not found' }, { status: 404 })
}
recordAudit({
workspaceId,
actorId: authResult.userId,
action: AuditAction.SKILL_DELETED,
resourceType: AuditResourceType.SKILL,
resourceId: skillId,
description: `Deleted skill`,
})
logger.info(`[${requestId}] Deleted skill: ${skillId}`)
return NextResponse.json({ success: true })
} catch (error) {


@@ -4,6 +4,7 @@ import { createLogger } from '@sim/logger'
import { and, desc, eq, isNull, or } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { checkSessionOrInternalAuth } from '@/lib/auth/hybrid'
import { generateRequestId } from '@/lib/core/utils/request'
import { upsertCustomTools } from '@/lib/workflows/custom-tools/operations'
@@ -166,6 +167,18 @@ export async function POST(req: NextRequest) {
requestId,
})
for (const tool of resultTools) {
recordAudit({
workspaceId,
actorId: userId,
action: AuditAction.CUSTOM_TOOL_CREATED,
resourceType: AuditResourceType.CUSTOM_TOOL,
resourceId: tool.id,
resourceName: tool.title,
description: `Created/updated custom tool "${tool.title}"`,
})
}
return NextResponse.json({ success: true, data: resultTools })
} catch (validationError) {
if (validationError instanceof z.ZodError) {
@@ -265,6 +278,15 @@ export async function DELETE(request: NextRequest) {
// Delete the tool
await db.delete(customTools).where(eq(customTools.id, toolId))
recordAudit({
workspaceId: tool.workspaceId || undefined,
actorId: userId,
action: AuditAction.CUSTOM_TOOL_DELETED,
resourceType: AuditResourceType.CUSTOM_TOOL,
resourceId: toolId,
description: `Deleted custom tool`,
})
logger.info(`[${requestId}] Deleted tool: ${toolId}`)
return NextResponse.json({ success: true })
} catch (error) {


@@ -0,0 +1,166 @@
import { createLogger } from '@sim/logger'
import { type NextRequest, NextResponse } from 'next/server'
import { checkInternalAuth } from '@/lib/auth/hybrid'
import { acquireLock, releaseLock } from '@/lib/core/config/redis'
import { ensureAbsoluteUrl } from '@/lib/core/utils/urls'
import {
downloadWorkspaceFile,
getWorkspaceFileByName,
updateWorkspaceFileContent,
uploadWorkspaceFile,
} from '@/lib/uploads/contexts/workspace/workspace-file-manager'
import { getFileExtension, getMimeTypeFromExtension } from '@/lib/uploads/utils/file-utils'
export const dynamic = 'force-dynamic'
const logger = createLogger('FileManageAPI')
export async function POST(request: NextRequest) {
const auth = await checkInternalAuth(request, { requireWorkflowId: false })
if (!auth.success) {
return NextResponse.json({ success: false, error: auth.error }, { status: 401 })
}
const { searchParams } = new URL(request.url)
const userId = auth.userId || searchParams.get('userId')
if (!userId) {
return NextResponse.json({ success: false, error: 'userId is required' }, { status: 400 })
}
let body: Record<string, unknown>
try {
body = await request.json()
} catch {
return NextResponse.json({ success: false, error: 'Invalid JSON body' }, { status: 400 })
}
const workspaceId = (body.workspaceId as string) || searchParams.get('workspaceId')
if (!workspaceId) {
return NextResponse.json({ success: false, error: 'workspaceId is required' }, { status: 400 })
}
const operation = body.operation as string
try {
switch (operation) {
case 'write': {
const fileName = body.fileName as string | undefined
const content = body.content as string | undefined
const contentType = body.contentType as string | undefined
if (!fileName) {
return NextResponse.json(
{ success: false, error: 'fileName is required for write operation' },
{ status: 400 }
)
}
if (!content && content !== '') {
return NextResponse.json(
{ success: false, error: 'content is required for write operation' },
{ status: 400 }
)
}
const mimeType = contentType || getMimeTypeFromExtension(getFileExtension(fileName))
const fileBuffer = Buffer.from(content ?? '', 'utf-8')
const result = await uploadWorkspaceFile(
workspaceId,
userId,
fileBuffer,
fileName,
mimeType
)
logger.info('File created', {
fileId: result.id,
name: fileName,
size: fileBuffer.length,
})
return NextResponse.json({
success: true,
data: {
id: result.id,
name: result.name,
size: fileBuffer.length,
url: ensureAbsoluteUrl(result.url),
},
})
}
case 'append': {
const fileName = body.fileName as string | undefined
const content = body.content as string | undefined
if (!fileName) {
return NextResponse.json(
{ success: false, error: 'fileName is required for append operation' },
{ status: 400 }
)
}
if (!content && content !== '') {
return NextResponse.json(
{ success: false, error: 'content is required for append operation' },
{ status: 400 }
)
}
const existing = await getWorkspaceFileByName(workspaceId, fileName)
if (!existing) {
return NextResponse.json(
{ success: false, error: `File not found: "${fileName}"` },
{ status: 404 }
)
}
const lockKey = `file-append:${workspaceId}:${existing.id}`
const lockValue = `${Date.now()}-${Math.random().toString(36).slice(2)}`
const acquired = await acquireLock(lockKey, lockValue, 30)
if (!acquired) {
return NextResponse.json(
{ success: false, error: 'File is busy, please retry' },
{ status: 409 }
)
}
try {
const existingBuffer = await downloadWorkspaceFile(existing)
const finalContent = existingBuffer.toString('utf-8') + content
const fileBuffer = Buffer.from(finalContent, 'utf-8')
await updateWorkspaceFileContent(workspaceId, existing.id, userId, fileBuffer)
logger.info('File appended', {
fileId: existing.id,
name: existing.name,
size: fileBuffer.length,
})
return NextResponse.json({
success: true,
data: {
id: existing.id,
name: existing.name,
size: fileBuffer.length,
url: ensureAbsoluteUrl(existing.path),
},
})
} finally {
await releaseLock(lockKey, lockValue)
}
}
default:
return NextResponse.json(
{ success: false, error: `Unknown operation: ${operation}. Supported: write, append` },
{ status: 400 }
)
}
} catch (error) {
const message = error instanceof Error ? error.message : 'Unknown error'
logger.error('File operation failed', { operation, error: message })
return NextResponse.json({ success: false, error: message }, { status: 500 })
}
}
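The append branch above guards the read-modify-write cycle with a value-tagged lock: `acquireLock(key, value, ttl)` must succeed before touching the file, and `releaseLock(key, value)` frees it only for the holder. A sketch of that contract against an in-memory store — the real helpers live in `@/lib/core/config/redis` and their Redis implementation is assumed, not shown in the diff:

```typescript
// In-memory stand-in for the Redis-backed lock helpers used by the route.
const locks = new Map<string, string>()

// SET NX semantics: fails if someone else already holds the key.
async function acquireLock(key: string, value: string, _ttlSeconds: number): Promise<boolean> {
  if (locks.has(key)) return false
  locks.set(key, value)
  return true
}

// Compare-and-delete: only the holder's value may free the lock, so a
// stale caller cannot release a lock re-acquired by another writer.
async function releaseLock(key: string, value: string): Promise<void> {
  if (locks.get(key) === value) locks.delete(key)
}

async function demo() {
  const key = 'file-append:ws_123:file_1'
  const a = await acquireLock(key, 'v1', 30) // true
  const b = await acquireLock(key, 'v2', 30) // false: busy, the route returns 409
  await releaseLock(key, 'v2') // no-op: wrong value
  await releaseLock(key, 'v1') // frees the lock
  console.log(a, b, locks.size)
}
demo()
```

In real Redis the compare-and-delete must be atomic (typically a small Lua script), and the TTL bounds how long a crashed writer can block appends.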


@@ -1,25 +1,7 @@
import { db, workflowDeploymentVersion } from '@sim/db'
import { createLogger } from '@sim/logger'
import { and, eq } from 'drizzle-orm'
import { generateRequestId } from '@/lib/core/utils/request'
import { removeMcpToolsForWorkflow, syncMcpToolsForWorkflow } from '@/lib/mcp/workflow-mcp-sync'
import {
cleanupWebhooksForWorkflow,
restorePreviousVersionWebhooks,
saveTriggerWebhooksForDeploy,
} from '@/lib/webhooks/deploy'
import { getActiveWorkflowRecord } from '@/lib/workflows/active-context'
import {
activateWorkflowVersionById,
deployWorkflow,
loadWorkflowFromNormalizedTables,
undeployWorkflow,
} from '@/lib/workflows/persistence/utils'
import {
cleanupDeploymentVersion,
createSchedulesForDeploy,
validateWorkflowSchedules,
} from '@/lib/workflows/schedules'
import { performFullDeploy, performFullUndeploy } from '@/lib/workflows/orchestration'
import { withAdminAuthParams } from '@/app/api/v1/admin/middleware'
import {
badRequestResponse,
@@ -31,12 +13,19 @@ import type { AdminDeployResult, AdminUndeployResult } from '@/app/api/v1/admin/
const logger = createLogger('AdminWorkflowDeployAPI')
const ADMIN_ACTOR_ID = 'admin-api'
interface RouteParams {
id: string
}
/**
* POST — Deploy a workflow via admin API.
*
* `userId` is set to the workflow owner so that webhook credential resolution
* (OAuth token lookups for providers like Airtable, Attio, etc.) uses a real
* user. `actorId` is set to `'admin-api'` so that the `deployedBy` field on
* the deployment version and audit log entries are correctly attributed to an
* admin action rather than the workflow owner.
*/
export const POST = withAdminAuthParams<RouteParams>(async (request, context) => {
const { id: workflowId } = await context.params
const requestId = generateRequestId()
@@ -48,140 +37,28 @@ export const POST = withAdminAuthParams<RouteParams>(async (request, context) =>
return notFoundResponse('Workflow')
}
const normalizedData = await loadWorkflowFromNormalizedTables(workflowId)
if (!normalizedData) {
return badRequestResponse('Workflow has no saved state')
}
const scheduleValidation = validateWorkflowSchedules(normalizedData.blocks)
if (!scheduleValidation.isValid) {
return badRequestResponse(`Invalid schedule configuration: ${scheduleValidation.error}`)
}
const [currentActiveVersion] = await db
.select({ id: workflowDeploymentVersion.id })
.from(workflowDeploymentVersion)
.where(
and(
eq(workflowDeploymentVersion.workflowId, workflowId),
eq(workflowDeploymentVersion.isActive, true)
)
)
.limit(1)
const previousVersionId = currentActiveVersion?.id
const rollbackDeployment = async () => {
if (previousVersionId) {
await restorePreviousVersionWebhooks({
request,
workflow: workflowData,
userId: workflowRecord.userId,
previousVersionId,
requestId,
})
const reactivateResult = await activateWorkflowVersionById({
workflowId,
deploymentVersionId: previousVersionId,
})
if (reactivateResult.success) {
return
}
}
await undeployWorkflow({ workflowId })
}
const deployResult = await deployWorkflow({
const result = await performFullDeploy({
workflowId,
deployedBy: ADMIN_ACTOR_ID,
workflowName: workflowRecord.name,
})
if (!deployResult.success) {
return internalErrorResponse(deployResult.error || 'Failed to deploy workflow')
}
if (!deployResult.deploymentVersionId) {
await undeployWorkflow({ workflowId })
return internalErrorResponse('Failed to resolve deployment version')
}
const workflowData = workflowRecord as Record<string, unknown>
const triggerSaveResult = await saveTriggerWebhooksForDeploy({
request,
workflowId,
workflow: workflowData,
userId: workflowRecord.userId,
blocks: normalizedData.blocks,
workflowName: workflowRecord.name,
requestId,
deploymentVersionId: deployResult.deploymentVersionId,
previousVersionId,
request,
actorId: 'admin-api',
})
if (!triggerSaveResult.success) {
await cleanupDeploymentVersion({
workflowId,
workflow: workflowData,
requestId,
deploymentVersionId: deployResult.deploymentVersionId,
})
await rollbackDeployment()
return internalErrorResponse(
triggerSaveResult.error?.message || 'Failed to sync trigger configuration'
)
if (!result.success) {
if (result.errorCode === 'not_found') return notFoundResponse('Workflow state')
if (result.errorCode === 'validation') return badRequestResponse(result.error!)
return internalErrorResponse(result.error || 'Failed to deploy workflow')
}
const scheduleResult = await createSchedulesForDeploy(
workflowId,
normalizedData.blocks,
db,
deployResult.deploymentVersionId
)
if (!scheduleResult.success) {
logger.error(
`[${requestId}] Admin API: Schedule creation failed for workflow ${workflowId}: ${scheduleResult.error}`
)
await cleanupDeploymentVersion({
workflowId,
workflow: workflowData,
requestId,
deploymentVersionId: deployResult.deploymentVersionId,
})
await rollbackDeployment()
return internalErrorResponse(scheduleResult.error || 'Failed to create schedule')
}
if (previousVersionId && previousVersionId !== deployResult.deploymentVersionId) {
try {
logger.info(`[${requestId}] Admin API: Cleaning up previous version ${previousVersionId}`)
await cleanupDeploymentVersion({
workflowId,
workflow: workflowData,
requestId,
deploymentVersionId: previousVersionId,
skipExternalCleanup: true,
})
} catch (cleanupError) {
logger.error(
`[${requestId}] Admin API: Failed to clean up previous version ${previousVersionId}`,
cleanupError
)
}
}
logger.info(
`[${requestId}] Admin API: Deployed workflow ${workflowId} as v${deployResult.version}`
)
// Sync MCP tools with the latest parameter schema
await syncMcpToolsForWorkflow({ workflowId, requestId, context: 'deploy' })
logger.info(`[${requestId}] Admin API: Deployed workflow ${workflowId} as v${result.version}`)
const response: AdminDeployResult = {
isDeployed: true,
version: deployResult.version!,
deployedAt: deployResult.deployedAt!.toISOString(),
warnings: triggerSaveResult.warnings,
version: result.version!,
deployedAt: result.deployedAt!.toISOString(),
warnings: result.warnings,
}
return singleResponse(response)
@@ -191,7 +68,7 @@ export const POST = withAdminAuthParams<RouteParams>(async (request, context) =>
}
})
export const DELETE = withAdminAuthParams<RouteParams>(async (request, context) => {
export const DELETE = withAdminAuthParams<RouteParams>(async (_request, context) => {
const { id: workflowId } = await context.params
const requestId = generateRequestId()
@@ -202,19 +79,17 @@ export const DELETE = withAdminAuthParams<RouteParams>(async (request, context)
return notFoundResponse('Workflow')
}
const result = await undeployWorkflow({ workflowId })
const result = await performFullUndeploy({
workflowId,
userId: workflowRecord.userId,
requestId,
actorId: 'admin-api',
})
if (!result.success) {
return internalErrorResponse(result.error || 'Failed to undeploy workflow')
}
await cleanupWebhooksForWorkflow(
workflowId,
workflowRecord as Record<string, unknown>,
requestId
)
await removeMcpToolsForWorkflow(workflowId, requestId)
logger.info(`Admin API: Undeployed workflow ${workflowId}`)
const response: AdminUndeployResult = {


@@ -13,12 +13,12 @@
*/
import { db } from '@sim/db'
import { templates, workflowBlocks, workflowEdges } from '@sim/db/schema'
import { workflowBlocks, workflowEdges } from '@sim/db/schema'
import { createLogger } from '@sim/logger'
import { count, eq } from 'drizzle-orm'
import { NextResponse } from 'next/server'
import { getActiveWorkflowRecord } from '@/lib/workflows/active-context'
import { archiveWorkflow } from '@/lib/workflows/lifecycle'
import { performDeleteWorkflow } from '@/lib/workflows/orchestration'
import { withAdminAuthParams } from '@/app/api/v1/admin/middleware'
import {
internalErrorResponse,
@@ -69,7 +69,7 @@ export const GET = withAdminAuthParams<RouteParams>(async (request, context) =>
}
})
export const DELETE = withAdminAuthParams<RouteParams>(async (request, context) => {
export const DELETE = withAdminAuthParams<RouteParams>(async (_request, context) => {
const { id: workflowId } = await context.params
try {
@@ -79,12 +79,18 @@ export const DELETE = withAdminAuthParams<RouteParams>(async (request, context)
return notFoundResponse('Workflow')
}
await db.update(templates).set({ workflowId: null }).where(eq(templates.workflowId, workflowId))
await archiveWorkflow(workflowId, {
const result = await performDeleteWorkflow({
workflowId,
userId: workflowData.userId,
skipLastWorkflowGuard: true,
requestId: `admin-workflow-${workflowId}`,
actorId: 'admin-api',
})
if (!result.success) {
return internalErrorResponse(result.error || 'Failed to delete workflow')
}
logger.info(`Admin API: Deleted workflow ${workflowId} (${workflowData.name})`)
return NextResponse.json({ success: true, workflowId })


@@ -1,16 +1,7 @@
import { db, workflowDeploymentVersion } from '@sim/db'
import { createLogger } from '@sim/logger'
import { and, eq } from 'drizzle-orm'
import { generateRequestId } from '@/lib/core/utils/request'
import { syncMcpToolsForWorkflow } from '@/lib/mcp/workflow-mcp-sync'
import { restorePreviousVersionWebhooks, saveTriggerWebhooksForDeploy } from '@/lib/webhooks/deploy'
import { getActiveWorkflowRecord } from '@/lib/workflows/active-context'
import { activateWorkflowVersion } from '@/lib/workflows/persistence/utils'
import {
cleanupDeploymentVersion,
createSchedulesForDeploy,
validateWorkflowSchedules,
} from '@/lib/workflows/schedules'
import { performActivateVersion } from '@/lib/workflows/orchestration'
import { withAdminAuthParams } from '@/app/api/v1/admin/middleware'
import {
badRequestResponse,
@@ -18,7 +9,6 @@ import {
notFoundResponse,
singleResponse,
} from '@/app/api/v1/admin/responses'
import type { BlockState } from '@/stores/workflows/workflow/types'
const logger = createLogger('AdminWorkflowActivateVersionAPI')
@@ -43,144 +33,22 @@ export const POST = withAdminAuthParams<RouteParams>(async (request, context) =>
return badRequestResponse('Invalid version number')
}
const [versionRow] = await db
.select({
id: workflowDeploymentVersion.id,
state: workflowDeploymentVersion.state,
})
.from(workflowDeploymentVersion)
.where(
and(
eq(workflowDeploymentVersion.workflowId, workflowId),
eq(workflowDeploymentVersion.version, versionNum)
)
)
.limit(1)
if (!versionRow?.state) {
return notFoundResponse('Deployment version')
}
const [currentActiveVersion] = await db
.select({ id: workflowDeploymentVersion.id })
.from(workflowDeploymentVersion)
.where(
and(
eq(workflowDeploymentVersion.workflowId, workflowId),
eq(workflowDeploymentVersion.isActive, true)
)
)
.limit(1)
const previousVersionId = currentActiveVersion?.id
const deployedState = versionRow.state as { blocks?: Record<string, BlockState> }
const blocks = deployedState.blocks
if (!blocks || typeof blocks !== 'object') {
return internalErrorResponse('Invalid deployed state structure')
}
const workflowData = workflowRecord as Record<string, unknown>
const scheduleValidation = validateWorkflowSchedules(blocks)
if (!scheduleValidation.isValid) {
return badRequestResponse(`Invalid schedule configuration: ${scheduleValidation.error}`)
}
const triggerSaveResult = await saveTriggerWebhooksForDeploy({
request,
const result = await performActivateVersion({
workflowId,
workflow: workflowData,
version: versionNum,
userId: workflowRecord.userId,
blocks,
workflow: workflowRecord as Record<string, unknown>,
requestId,
deploymentVersionId: versionRow.id,
previousVersionId,
forceRecreateSubscriptions: true,
request,
actorId: 'admin-api',
})
if (!triggerSaveResult.success) {
logger.error(
`[${requestId}] Admin API: Failed to sync triggers for workflow ${workflowId}`,
triggerSaveResult.error
)
return internalErrorResponse(
triggerSaveResult.error?.message || 'Failed to sync trigger configuration'
)
}
const scheduleResult = await createSchedulesForDeploy(workflowId, blocks, db, versionRow.id)
if (!scheduleResult.success) {
await cleanupDeploymentVersion({
workflowId,
workflow: workflowData,
requestId,
deploymentVersionId: versionRow.id,
})
if (previousVersionId) {
await restorePreviousVersionWebhooks({
request,
workflow: workflowData,
userId: workflowRecord.userId,
previousVersionId,
requestId,
})
}
return internalErrorResponse(scheduleResult.error || 'Failed to sync schedules')
}
const result = await activateWorkflowVersion({ workflowId, version: versionNum })
if (!result.success) {
await cleanupDeploymentVersion({
workflowId,
workflow: workflowData,
requestId,
deploymentVersionId: versionRow.id,
})
if (previousVersionId) {
await restorePreviousVersionWebhooks({
request,
workflow: workflowData,
userId: workflowRecord.userId,
previousVersionId,
requestId,
})
}
if (result.error === 'Deployment version not found') {
return notFoundResponse('Deployment version')
}
if (result.errorCode === 'not_found') return notFoundResponse('Deployment version')
if (result.errorCode === 'validation') return badRequestResponse(result.error!)
return internalErrorResponse(result.error || 'Failed to activate version')
}
if (previousVersionId && previousVersionId !== versionRow.id) {
try {
logger.info(
`[${requestId}] Admin API: Cleaning up previous version ${previousVersionId} webhooks/schedules`
)
await cleanupDeploymentVersion({
workflowId,
workflow: workflowData,
requestId,
deploymentVersionId: previousVersionId,
skipExternalCleanup: true,
})
logger.info(`[${requestId}] Admin API: Previous version cleanup completed`)
} catch (cleanupError) {
logger.error(
`[${requestId}] Admin API: Failed to clean up previous version ${previousVersionId}`,
cleanupError
)
}
}
await syncMcpToolsForWorkflow({
workflowId,
requestId,
state: versionRow.state,
context: 'activate',
})
logger.info(
`[${requestId}] Admin API: Activated version ${versionNum} for workflow ${workflowId}`
)
@@ -189,14 +57,12 @@ export const POST = withAdminAuthParams<RouteParams>(async (request, context) =>
success: true,
version: versionNum,
deployedAt: result.deployedAt!.toISOString(),
warnings: triggerSaveResult.warnings,
warnings: result.warnings,
})
} catch (error) {
logger.error(
`[${requestId}] Admin API: Failed to activate version for workflow ${workflowId}`,
{
error,
}
{ error }
)
return internalErrorResponse('Failed to activate deployment version')
}


@@ -2,7 +2,6 @@ import { createLogger } from '@sim/logger'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { createRunSegment } from '@/lib/copilot/async-runs/repository'
import { appendCopilotLogContext } from '@/lib/copilot/logging'
import { COPILOT_REQUEST_MODES } from '@/lib/copilot/models'
import { orchestrateCopilotStream } from '@/lib/copilot/orchestrator'
import { getWorkflowById, resolveWorkflowIdForUser } from '@/lib/workflows/utils'
@@ -84,17 +83,15 @@ export async function POST(req: NextRequest) {
const chatId = parsed.chatId || crypto.randomUUID()
messageId = crypto.randomUUID()
logger.error(
appendCopilotLogContext('Received headless copilot chat start request', { messageId }),
{
workflowId: resolved.workflowId,
workflowName: parsed.workflowName,
chatId,
mode: transportMode,
autoExecuteTools: parsed.autoExecuteTools,
timeout: parsed.timeout,
}
)
const reqLogger = logger.withMetadata({ messageId })
reqLogger.info('Received headless copilot chat start request', {
workflowId: resolved.workflowId,
workflowName: parsed.workflowName,
chatId,
mode: transportMode,
autoExecuteTools: parsed.autoExecuteTools,
timeout: parsed.timeout,
})
const requestPayload = {
message: parsed.message,
workflowId: resolved.workflowId,
@@ -144,7 +141,7 @@ export async function POST(req: NextRequest) {
)
}
logger.error(appendCopilotLogContext('Headless copilot request failed', { messageId }), {
logger.withMetadata({ messageId }).error('Headless copilot request failed', {
error: error instanceof Error ? error.message : String(error),
})
return NextResponse.json({ success: false, error: 'Internal server error' }, { status: 500 })


@@ -1,26 +1,9 @@
import { db, workflow, workflowDeploymentVersion } from '@sim/db'
import { db, workflow } from '@sim/db'
import { createLogger } from '@sim/logger'
import { and, eq } from 'drizzle-orm'
import { eq } from 'drizzle-orm'
import type { NextRequest } from 'next/server'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { generateRequestId } from '@/lib/core/utils/request'
import { removeMcpToolsForWorkflow, syncMcpToolsForWorkflow } from '@/lib/mcp/workflow-mcp-sync'
import {
cleanupWebhooksForWorkflow,
restorePreviousVersionWebhooks,
saveTriggerWebhooksForDeploy,
} from '@/lib/webhooks/deploy'
import {
activateWorkflowVersionById,
deployWorkflow,
loadWorkflowFromNormalizedTables,
undeployWorkflow,
} from '@/lib/workflows/persistence/utils'
import {
cleanupDeploymentVersion,
createSchedulesForDeploy,
validateWorkflowSchedules,
} from '@/lib/workflows/schedules'
import { performFullDeploy, performFullUndeploy } from '@/lib/workflows/orchestration'
import { validateWorkflowPermissions } from '@/lib/workflows/utils'
import {
checkNeedsRedeployment,
@@ -97,164 +80,22 @@ export async function POST(request: NextRequest, { params }: { params: Promise<{
return createErrorResponse('Unable to determine deploying user', 400)
}
const normalizedData = await loadWorkflowFromNormalizedTables(id)
if (!normalizedData) {
return createErrorResponse('Failed to load workflow state', 500)
}
const scheduleValidation = validateWorkflowSchedules(normalizedData.blocks)
if (!scheduleValidation.isValid) {
logger.warn(
`[${requestId}] Schedule validation failed for workflow ${id}: ${scheduleValidation.error}`
)
return createErrorResponse(`Invalid schedule configuration: ${scheduleValidation.error}`, 400)
}
const [currentActiveVersion] = await db
.select({ id: workflowDeploymentVersion.id })
.from(workflowDeploymentVersion)
.where(
and(
eq(workflowDeploymentVersion.workflowId, id),
eq(workflowDeploymentVersion.isActive, true)
)
)
.limit(1)
const previousVersionId = currentActiveVersion?.id
const rollbackDeployment = async () => {
if (previousVersionId) {
await restorePreviousVersionWebhooks({
request,
workflow: workflowData as Record<string, unknown>,
userId: actorUserId,
previousVersionId,
requestId,
})
const reactivateResult = await activateWorkflowVersionById({
workflowId: id,
deploymentVersionId: previousVersionId,
})
if (reactivateResult.success) {
return
}
}
await undeployWorkflow({ workflowId: id })
}
const deployResult = await deployWorkflow({
const result = await performFullDeploy({
workflowId: id,
deployedBy: actorUserId,
workflowName: workflowData!.name,
})
if (!deployResult.success) {
return createErrorResponse(deployResult.error || 'Failed to deploy workflow', 500)
}
const deployedAt = deployResult.deployedAt!
const deploymentVersionId = deployResult.deploymentVersionId
if (!deploymentVersionId) {
await undeployWorkflow({ workflowId: id })
return createErrorResponse('Failed to resolve deployment version', 500)
}
const triggerSaveResult = await saveTriggerWebhooksForDeploy({
request,
workflowId: id,
workflow: workflowData,
userId: actorUserId,
blocks: normalizedData.blocks,
workflowName: workflowData!.name || undefined,
requestId,
deploymentVersionId,
previousVersionId,
request,
})
if (!triggerSaveResult.success) {
await cleanupDeploymentVersion({
workflowId: id,
workflow: workflowData as Record<string, unknown>,
requestId,
deploymentVersionId,
})
await rollbackDeployment()
return createErrorResponse(
triggerSaveResult.error?.message || 'Failed to save trigger configuration',
triggerSaveResult.error?.status || 500
)
}
let scheduleInfo: { scheduleId?: string; cronExpression?: string; nextRunAt?: Date } = {}
const scheduleResult = await createSchedulesForDeploy(
id,
normalizedData.blocks,
db,
deploymentVersionId
)
if (!scheduleResult.success) {
logger.error(
`[${requestId}] Failed to create schedule for workflow ${id}: ${scheduleResult.error}`
)
await cleanupDeploymentVersion({
workflowId: id,
workflow: workflowData as Record<string, unknown>,
requestId,
deploymentVersionId,
})
await rollbackDeployment()
return createErrorResponse(scheduleResult.error || 'Failed to create schedule', 500)
}
if (scheduleResult.scheduleId) {
scheduleInfo = {
scheduleId: scheduleResult.scheduleId,
cronExpression: scheduleResult.cronExpression,
nextRunAt: scheduleResult.nextRunAt,
}
logger.info(
`[${requestId}] Schedule created for workflow ${id}: ${scheduleResult.scheduleId}`
)
}
if (previousVersionId && previousVersionId !== deploymentVersionId) {
try {
logger.info(`[${requestId}] Cleaning up previous version ${previousVersionId} DB records`)
await cleanupDeploymentVersion({
workflowId: id,
workflow: workflowData as Record<string, unknown>,
requestId,
deploymentVersionId: previousVersionId,
skipExternalCleanup: true,
})
} catch (cleanupError) {
logger.error(
`[${requestId}] Failed to clean up previous version ${previousVersionId}`,
cleanupError
)
// Non-fatal - continue with success response
}
if (!result.success) {
const status =
result.errorCode === 'validation' ? 400 : result.errorCode === 'not_found' ? 404 : 500
return createErrorResponse(result.error || 'Failed to deploy workflow', status)
}
logger.info(`[${requestId}] Workflow deployed successfully: ${id}`)
// Sync MCP tools with the latest parameter schema
await syncMcpToolsForWorkflow({ workflowId: id, requestId, context: 'deploy' })
recordAudit({
workspaceId: workflowData?.workspaceId || null,
actorId: actorUserId,
actorName: session?.user?.name,
actorEmail: session?.user?.email,
action: AuditAction.WORKFLOW_DEPLOYED,
resourceType: AuditResourceType.WORKFLOW,
resourceId: id,
resourceName: workflowData?.name,
description: `Deployed workflow "${workflowData?.name || id}"`,
metadata: { version: deploymentVersionId },
request,
})
const responseApiKeyInfo = workflowData!.workspaceId
? 'Workspace API keys'
: 'Personal API keys'
@@ -262,25 +103,13 @@ export async function POST(request: NextRequest, { params }: { params: Promise<{
return createSuccessResponse({
apiKey: responseApiKeyInfo,
isDeployed: true,
deployedAt,
schedule: scheduleInfo.scheduleId
? {
id: scheduleInfo.scheduleId,
cronExpression: scheduleInfo.cronExpression,
nextRunAt: scheduleInfo.nextRunAt,
}
: undefined,
warnings: triggerSaveResult.warnings,
deployedAt: result.deployedAt,
warnings: result.warnings,
})
} catch (error: any) {
logger.error(`[${requestId}] Error deploying workflow: ${id}`, {
error: error.message,
stack: error.stack,
name: error.name,
cause: error.cause,
fullError: error,
})
return createErrorResponse(error.message || 'Failed to deploy workflow', 500)
} catch (error: unknown) {
const message = error instanceof Error ? error.message : 'Failed to deploy workflow'
logger.error(`[${requestId}] Error deploying workflow: ${id}`, { error })
return createErrorResponse(message, 500)
}
}
@@ -328,60 +157,36 @@ export async function PATCH(request: NextRequest, { params }: { params: Promise<
}
export async function DELETE(
request: NextRequest,
_request: NextRequest,
{ params }: { params: Promise<{ id: string }> }
) {
const requestId = generateRequestId()
const { id } = await params
try {
const {
error,
session,
workflow: workflowData,
} = await validateWorkflowPermissions(id, requestId, 'admin')
const { error, session } = await validateWorkflowPermissions(id, requestId, 'admin')
if (error) {
return createErrorResponse(error.message, error.status)
}
const result = await undeployWorkflow({ workflowId: id })
const result = await performFullUndeploy({
workflowId: id,
userId: session!.user.id,
requestId,
})
if (!result.success) {
return createErrorResponse(result.error || 'Failed to undeploy workflow', 500)
}
await cleanupWebhooksForWorkflow(id, workflowData as Record<string, unknown>, requestId)
await removeMcpToolsForWorkflow(id, requestId)
logger.info(`[${requestId}] Workflow undeployed successfully: ${id}`)
try {
const { PlatformEvents } = await import('@/lib/core/telemetry')
PlatformEvents.workflowUndeployed({ workflowId: id })
} catch (_e) {
// Silently fail
}
recordAudit({
workspaceId: workflowData?.workspaceId || null,
actorId: session!.user.id,
actorName: session?.user?.name,
actorEmail: session?.user?.email,
action: AuditAction.WORKFLOW_UNDEPLOYED,
resourceType: AuditResourceType.WORKFLOW,
resourceId: id,
resourceName: workflowData?.name,
description: `Undeployed workflow "${workflowData?.name || id}"`,
request,
})
return createSuccessResponse({
isDeployed: false,
deployedAt: null,
apiKey: null,
})
} catch (error: any) {
logger.error(`[${requestId}] Error undeploying workflow: ${id}`, error)
return createErrorResponse(error.message || 'Failed to undeploy workflow', 500)
} catch (error: unknown) {
const message = error instanceof Error ? error.message : 'Failed to undeploy workflow'
logger.error(`[${requestId}] Error undeploying workflow: ${id}`, { error })
return createErrorResponse(message, 500)
}
}
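The orchestration results in the routes above carry an `errorCode` that each caller maps to an HTTP status (`not_found` to 404, `validation` to 400, everything else to 500). A sketch of that mapping as a standalone helper (the helper name is hypothetical; the routes inline the ternary):

```typescript
// Hypothetical helper mirroring the errorCode-to-status mapping used in these routes.
type OrchestrationErrorCode = 'not_found' | 'validation' | undefined

function statusForErrorCode(errorCode: OrchestrationErrorCode): number {
  if (errorCode === 'not_found') return 404
  if (errorCode === 'validation') return 400
  return 500 // unknown failures surface as internal errors
}
```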


@@ -3,19 +3,10 @@ import { createLogger } from '@sim/logger'
import { and, eq } from 'drizzle-orm'
import type { NextRequest } from 'next/server'
import { z } from 'zod'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { generateRequestId } from '@/lib/core/utils/request'
import { syncMcpToolsForWorkflow } from '@/lib/mcp/workflow-mcp-sync'
import { restorePreviousVersionWebhooks, saveTriggerWebhooksForDeploy } from '@/lib/webhooks/deploy'
import { activateWorkflowVersion } from '@/lib/workflows/persistence/utils'
import {
cleanupDeploymentVersion,
createSchedulesForDeploy,
validateWorkflowSchedules,
} from '@/lib/workflows/schedules'
import { performActivateVersion } from '@/lib/workflows/orchestration'
import { validateWorkflowPermissions } from '@/lib/workflows/utils'
import { createErrorResponse, createSuccessResponse } from '@/app/api/workflows/utils'
import type { BlockState } from '@/stores/workflows/workflow/types'
const logger = createLogger('WorkflowDeploymentVersionAPI')
@@ -129,140 +120,25 @@ export async function PATCH(
return createErrorResponse('Unable to determine activating user', 400)
}
const [versionRow] = await db
.select({
id: workflowDeploymentVersion.id,
state: workflowDeploymentVersion.state,
})
.from(workflowDeploymentVersion)
.where(
and(
eq(workflowDeploymentVersion.workflowId, id),
eq(workflowDeploymentVersion.version, versionNum)
)
)
.limit(1)
if (!versionRow?.state) {
return createErrorResponse('Deployment version not found', 404)
}
const [currentActiveVersion] = await db
.select({ id: workflowDeploymentVersion.id })
.from(workflowDeploymentVersion)
.where(
and(
eq(workflowDeploymentVersion.workflowId, id),
eq(workflowDeploymentVersion.isActive, true)
)
)
.limit(1)
const previousVersionId = currentActiveVersion?.id
const deployedState = versionRow.state as { blocks?: Record<string, BlockState> }
const blocks = deployedState.blocks
if (!blocks || typeof blocks !== 'object') {
return createErrorResponse('Invalid deployed state structure', 500)
}
const scheduleValidation = validateWorkflowSchedules(blocks)
if (!scheduleValidation.isValid) {
return createErrorResponse(
`Invalid schedule configuration: ${scheduleValidation.error}`,
400
)
}
const triggerSaveResult = await saveTriggerWebhooksForDeploy({
request,
const activateResult = await performActivateVersion({
workflowId: id,
workflow: workflowData as Record<string, unknown>,
version: versionNum,
userId: actorUserId,
blocks,
workflow: workflowData as Record<string, unknown>,
requestId,
deploymentVersionId: versionRow.id,
previousVersionId,
forceRecreateSubscriptions: true,
request,
})
if (!triggerSaveResult.success) {
return createErrorResponse(
triggerSaveResult.error?.message || 'Failed to sync trigger configuration',
triggerSaveResult.error?.status || 500
)
if (!activateResult.success) {
const status =
activateResult.errorCode === 'not_found'
? 404
: activateResult.errorCode === 'validation'
? 400
: 500
return createErrorResponse(activateResult.error || 'Failed to activate deployment', status)
}
const scheduleResult = await createSchedulesForDeploy(id, blocks, db, versionRow.id)
if (!scheduleResult.success) {
await cleanupDeploymentVersion({
workflowId: id,
workflow: workflowData as Record<string, unknown>,
requestId,
deploymentVersionId: versionRow.id,
})
if (previousVersionId) {
await restorePreviousVersionWebhooks({
request,
workflow: workflowData as Record<string, unknown>,
userId: actorUserId,
previousVersionId,
requestId,
})
}
return createErrorResponse(scheduleResult.error || 'Failed to sync schedules', 500)
}
const result = await activateWorkflowVersion({ workflowId: id, version: versionNum })
if (!result.success) {
await cleanupDeploymentVersion({
workflowId: id,
workflow: workflowData as Record<string, unknown>,
requestId,
deploymentVersionId: versionRow.id,
})
if (previousVersionId) {
await restorePreviousVersionWebhooks({
request,
workflow: workflowData as Record<string, unknown>,
userId: actorUserId,
previousVersionId,
requestId,
})
}
return createErrorResponse(result.error || 'Failed to activate deployment', 400)
}
if (previousVersionId && previousVersionId !== versionRow.id) {
try {
logger.info(
`[${requestId}] Cleaning up previous version ${previousVersionId} webhooks/schedules`
)
await cleanupDeploymentVersion({
workflowId: id,
workflow: workflowData as Record<string, unknown>,
requestId,
deploymentVersionId: previousVersionId,
skipExternalCleanup: true,
})
logger.info(`[${requestId}] Previous version cleanup completed`)
} catch (cleanupError) {
logger.error(
`[${requestId}] Failed to clean up previous version ${previousVersionId}`,
cleanupError
)
}
}
await syncMcpToolsForWorkflow({
workflowId: id,
requestId,
state: versionRow.state,
context: 'activate',
})
// Apply name/description updates if provided alongside activation
let updatedName: string | null | undefined
let updatedDescription: string | null | undefined
if (name !== undefined || description !== undefined) {
@@ -298,23 +174,10 @@ export async function PATCH(
}
}
recordAudit({
workspaceId: workflowData?.workspaceId,
actorId: actorUserId,
actorName: session?.user?.name,
actorEmail: session?.user?.email,
action: AuditAction.WORKFLOW_DEPLOYMENT_ACTIVATED,
resourceType: AuditResourceType.WORKFLOW,
resourceId: id,
description: `Activated deployment version ${versionNum}`,
metadata: { version: versionNum },
request,
})
return createSuccessResponse({
success: true,
deployedAt: result.deployedAt,
warnings: triggerSaveResult.warnings,
deployedAt: activateResult.deployedAt,
warnings: activateResult.warnings,
...(updatedName !== undefined && { name: updatedName }),
...(updatedDescription !== undefined && { description: updatedDescription }),
})


@@ -82,14 +82,16 @@ vi.mock('@/background/workflow-execution', () => ({
executeWorkflowJob: vi.fn(),
}))
vi.mock('@sim/logger', () => ({
createLogger: vi.fn().mockReturnValue({
vi.mock('@sim/logger', () => {
const createMockLogger = (): Record<string, any> => ({
info: vi.fn(),
warn: vi.fn(),
error: vi.fn(),
debug: vi.fn(),
}),
}))
withMetadata: vi.fn(() => createMockLogger()),
})
return { createLogger: vi.fn(() => createMockLogger()) }
})
vi.mock('uuid', () => ({
validate: vi.fn().mockReturnValue(true),

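The mock above has to mirror `withMetadata`, which returns a child logger carrying bound fields. A minimal sketch of that pattern (illustrative only, not the actual `@sim/logger` implementation):

```typescript
// Minimal sketch of a withMetadata-style child logger: bound fields accumulate
// across chained calls and merge under per-call extras (illustrative API).
type Fields = Record<string, unknown>

interface MiniLogger {
  info(msg: string, extra?: Fields): { msg: string; fields: Fields }
  withMetadata(fields: Fields): MiniLogger
}

function createMiniLogger(bound: Fields = {}): MiniLogger {
  return {
    info(msg, extra = {}) {
      return { msg, fields: { ...bound, ...extra } }
    },
    withMetadata(fields) {
      return createMiniLogger({ ...bound, ...fields })
    },
  }
}
```

Because each `withMetadata` call returns a new logger with the merged fields, route code can progressively rebind, e.g. `reqLogger = reqLogger.withMetadata({ userId, executionId })`.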

@@ -187,6 +187,13 @@ type AsyncExecutionParams = {
async function handleAsyncExecution(params: AsyncExecutionParams): Promise<NextResponse> {
const { requestId, workflowId, userId, workspaceId, input, triggerType, executionId, callChain } =
params
const asyncLogger = logger.withMetadata({
requestId,
workflowId,
workspaceId,
userId,
executionId,
})
const correlation = {
executionId,
@@ -233,10 +240,7 @@ async function handleAsyncExecution(params: AsyncExecutionParams): Promise<NextR
metadata: { workflowId, userId, correlation },
})
logger.info(`[${requestId}] Queued async workflow execution`, {
workflowId,
jobId,
})
asyncLogger.info('Queued async workflow execution', { jobId })
if (shouldExecuteInline() && jobQueue) {
const inlineJobQueue = jobQueue
@@ -247,14 +251,14 @@ async function handleAsyncExecution(params: AsyncExecutionParams): Promise<NextR
await inlineJobQueue.completeJob(jobId, output)
} catch (error) {
const errorMessage = error instanceof Error ? error.message : String(error)
logger.error(`[${requestId}] Async workflow execution failed`, {
asyncLogger.error('Async workflow execution failed', {
jobId,
error: errorMessage,
})
try {
await inlineJobQueue.markJobFailed(jobId, errorMessage)
} catch (markFailedError) {
logger.error(`[${requestId}] Failed to mark job as failed`, {
asyncLogger.error('Failed to mark job as failed', {
jobId,
error:
markFailedError instanceof Error
@@ -289,7 +293,7 @@ async function handleAsyncExecution(params: AsyncExecutionParams): Promise<NextR
)
}
logger.error(`[${requestId}] Failed to queue async execution`, error)
asyncLogger.error('Failed to queue async execution', error)
return NextResponse.json(
{ error: `Failed to queue async execution: ${error.message}` },
{ status: 500 }
@@ -352,11 +356,12 @@ async function handleExecutePost(
): Promise<NextResponse | Response> {
const requestId = generateRequestId()
const { id: workflowId } = await params
let reqLogger = logger.withMetadata({ requestId, workflowId })
const incomingCallChain = parseCallChain(req.headers.get(SIM_VIA_HEADER))
const callChainError = validateCallChain(incomingCallChain)
if (callChainError) {
logger.warn(`[${requestId}] Call chain rejected for workflow ${workflowId}: ${callChainError}`)
reqLogger.warn(`Call chain rejected: ${callChainError}`)
return NextResponse.json({ error: callChainError }, { status: 409 })
}
const callChain = buildNextCallChain(incomingCallChain, workflowId)
@@ -414,12 +419,12 @@ async function handleExecutePost(
body = JSON.parse(text)
}
} catch (error) {
logger.warn(`[${requestId}] Failed to parse request body, using defaults`)
reqLogger.warn('Failed to parse request body, using defaults')
}
const validation = ExecuteWorkflowSchema.safeParse(body)
if (!validation.success) {
logger.warn(`[${requestId}] Invalid request body:`, validation.error.errors)
reqLogger.warn('Invalid request body:', validation.error.errors)
return NextResponse.json(
{
error: 'Invalid request body',
@@ -589,9 +594,10 @@ async function handleExecutePost(
)
}
logger.info(`[${requestId}] Starting server-side execution`, {
workflowId,
userId,
const executionId = uuidv4()
reqLogger = reqLogger.withMetadata({ userId, executionId })
reqLogger.info('Starting server-side execution', {
hasInput: !!input,
triggerType,
authType: auth.authType,
@@ -600,8 +606,6 @@ async function handleExecutePost(
enableSSE,
isAsyncMode,
})
const executionId = uuidv4()
let loggingTriggerType: CoreTriggerType = 'manual'
if (CORE_TRIGGER_TYPES.includes(triggerType as CoreTriggerType)) {
loggingTriggerType = triggerType as CoreTriggerType
@@ -657,10 +661,11 @@ async function handleExecutePost(
const workflow = preprocessResult.workflowRecord!
if (!workflow.workspaceId) {
logger.error(`[${requestId}] Workflow ${workflowId} has no workspaceId`)
reqLogger.error('Workflow has no workspaceId')
return NextResponse.json({ error: 'Workflow has no associated workspace' }, { status: 500 })
}
const workspaceId = workflow.workspaceId
reqLogger = reqLogger.withMetadata({ workspaceId, userId: actorUserId })
if (auth.apiKeyType === 'workspace' && auth.workspaceId !== workspaceId) {
return NextResponse.json(
@@ -669,11 +674,7 @@ async function handleExecutePost(
)
}
logger.info(`[${requestId}] Preprocessing passed`, {
workflowId,
actorUserId,
workspaceId,
})
reqLogger.info('Preprocessing passed')
if (isAsyncMode) {
return handleAsyncExecution({
@@ -744,7 +745,7 @@ async function handleExecutePost(
)
}
} catch (fileError) {
logger.error(`[${requestId}] Failed to process input file fields:`, fileError)
reqLogger.error('Failed to process input file fields:', fileError)
await loggingSession.safeStart({
userId: actorUserId,
@@ -772,7 +773,7 @@ async function handleExecutePost(
sanitizedWorkflowStateOverride || cachedWorkflowData || undefined
if (!enableSSE) {
logger.info(`[${requestId}] Using non-SSE execution (direct JSON response)`)
reqLogger.info('Using non-SSE execution (direct JSON response)')
const metadata: ExecutionMetadata = {
requestId,
executionId,
@@ -866,7 +867,7 @@ async function handleExecutePost(
const errorMessage = error instanceof Error ? error.message : 'Unknown error'
logger.error(`[${requestId}] Queued non-SSE execution failed: ${errorMessage}`)
reqLogger.error(`Queued non-SSE execution failed: ${errorMessage}`)
return NextResponse.json(
{
@@ -908,7 +909,7 @@ async function handleExecutePost(
timeoutController.timeoutMs
) {
const timeoutErrorMessage = getTimeoutErrorMessage(null, timeoutController.timeoutMs)
logger.info(`[${requestId}] Non-SSE execution timed out`, {
reqLogger.info('Non-SSE execution timed out', {
timeoutMs: timeoutController.timeoutMs,
})
await loggingSession.markAsFailed(timeoutErrorMessage)
@@ -962,7 +963,7 @@ async function handleExecutePost(
} catch (error: unknown) {
const errorMessage = error instanceof Error ? error.message : 'Unknown error'
logger.error(`[${requestId}] Non-SSE execution failed: ${errorMessage}`)
reqLogger.error(`Non-SSE execution failed: ${errorMessage}`)
const executionResult = hasExecutionResult(error) ? error.executionResult : undefined
@@ -985,7 +986,7 @@ async function handleExecutePost(
timeoutController.cleanup()
if (executionId) {
void cleanupExecutionBase64Cache(executionId).catch((error) => {
logger.error(`[${requestId}] Failed to cleanup base64 cache`, { error })
reqLogger.error('Failed to cleanup base64 cache', { error })
})
}
}
@@ -1039,9 +1040,9 @@ async function handleExecutePost(
})
}
logger.info(`[${requestId}] Using SSE console log streaming (manual execution)`)
reqLogger.info('Using SSE console log streaming (manual execution)')
} else {
logger.info(`[${requestId}] Using streaming API response`)
reqLogger.info('Using streaming API response')
const resolvedSelectedOutputs = resolveOutputIds(
selectedOutputs,
@@ -1135,7 +1136,7 @@ async function handleExecutePost(
iterationContext?: IterationContext,
childWorkflowContext?: ChildWorkflowContext
) => {
logger.info(`[${requestId}] 🔷 onBlockStart called:`, { blockId, blockName, blockType })
reqLogger.info('onBlockStart called', { blockId, blockName, blockType })
sendEvent({
type: 'block:started',
timestamp: new Date().toISOString(),
@@ -1184,7 +1185,7 @@ async function handleExecutePost(
: {}
if (hasError) {
logger.info(`[${requestId}] ✗ onBlockComplete (error) called:`, {
reqLogger.info('onBlockComplete (error) called', {
blockId,
blockName,
blockType,
@@ -1219,7 +1220,7 @@ async function handleExecutePost(
},
})
} else {
logger.info(`[${requestId}] ✓ onBlockComplete called:`, {
reqLogger.info('onBlockComplete called', {
blockId,
blockName,
blockType,
@@ -1284,7 +1285,7 @@ async function handleExecutePost(
data: { blockId },
})
} catch (error) {
logger.error(`[${requestId}] Error streaming block content:`, error)
reqLogger.error('Error streaming block content:', error)
} finally {
try {
await reader.cancel().catch(() => {})
@@ -1360,9 +1361,7 @@ async function handleExecutePost(
if (result.status === 'paused') {
if (!result.snapshotSeed) {
logger.error(`[${requestId}] Missing snapshot seed for paused execution`, {
executionId,
})
reqLogger.error('Missing snapshot seed for paused execution')
await loggingSession.markAsFailed('Missing snapshot seed for paused execution')
} else {
try {
@@ -1374,8 +1373,7 @@ async function handleExecutePost(
executorUserId: result.metadata?.userId,
})
} catch (pauseError) {
logger.error(`[${requestId}] Failed to persist pause result`, {
executionId,
reqLogger.error('Failed to persist pause result', {
error: pauseError instanceof Error ? pauseError.message : String(pauseError),
})
await loggingSession.markAsFailed(
@@ -1390,7 +1388,7 @@ async function handleExecutePost(
if (result.status === 'cancelled') {
if (timeoutController.isTimedOut() && timeoutController.timeoutMs) {
const timeoutErrorMessage = getTimeoutErrorMessage(null, timeoutController.timeoutMs)
logger.info(`[${requestId}] Workflow execution timed out`, {
reqLogger.info('Workflow execution timed out', {
timeoutMs: timeoutController.timeoutMs,
})
@@ -1408,7 +1406,7 @@ async function handleExecutePost(
})
finalMetaStatus = 'error'
} else {
logger.info(`[${requestId}] Workflow execution was cancelled`)
reqLogger.info('Workflow execution was cancelled')
sendEvent({
type: 'execution:cancelled',
@@ -1452,7 +1450,7 @@ async function handleExecutePost(
? error.message
: 'Unknown error'
logger.error(`[${requestId}] SSE execution failed: ${errorMessage}`, { isTimeout })
reqLogger.error(`SSE execution failed: ${errorMessage}`, { isTimeout })
const executionResult = hasExecutionResult(error) ? error.executionResult : undefined
@@ -1475,7 +1473,7 @@ async function handleExecutePost(
try {
await eventWriter.close()
} catch (closeError) {
logger.warn(`[${requestId}] Failed to close event writer`, {
reqLogger.warn('Failed to close event writer', {
error: closeError instanceof Error ? closeError.message : String(closeError),
})
}
@@ -1496,7 +1494,7 @@ async function handleExecutePost(
},
cancel() {
isStreamClosed = true
logger.info(`[${requestId}] Client disconnected from SSE stream`)
reqLogger.info('Client disconnected from SSE stream')
},
})
@@ -1518,7 +1516,7 @@ async function handleExecutePost(
)
}
logger.error(`[${requestId}] Failed to start workflow execution:`, error)
reqLogger.error('Failed to start workflow execution:', error)
return NextResponse.json(
{ error: error.message || 'Failed to start workflow execution' },
{ status: 500 }

View File
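The route hunks above replace `logger.info(`[${requestId}] ...`, { workflowId, ... })` call sites with a request-scoped child logger. A minimal runnable sketch of that child-logger pattern — bound metadata (requestId, workflowId, executionId, ...) travels with the logger instead of being repeated at every call site. The shapes here are assumptions; the real `@sim/logger` API may differ:

```typescript
type Meta = Record<string, unknown>

// Collects entries so the sketch is observable without a real transport.
const entries: Array<{ msg: string; meta: Meta }> = []

function makeLogger(bound: Meta) {
  return {
    info(msg: string, extra: Meta = {}) {
      entries.push({ msg, meta: { ...bound, ...extra } })
    },
    // A child logger inherits the parent's bound fields and adds more,
    // mirroring reqLogger = reqLogger.withMetadata({ userId, executionId }).
    withMetadata(meta: Meta) {
      return makeLogger({ ...bound, ...meta })
    },
  }
}

let reqLogger = makeLogger({ requestId: 'req-1', workflowId: 'wf-1' })
reqLogger = reqLogger.withMetadata({ executionId: 'exec-1' })
reqLogger.info('Queued async workflow execution', { jobId: 'job-1' })
```

Note the hunks also reassign `reqLogger` as more context becomes known (`workspaceId` after preprocessing), which is why it is declared with `let` rather than `const`.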

@@ -5,14 +5,7 @@
* @vitest-environment node
*/
import {
auditMock,
envMock,
loggerMock,
requestUtilsMock,
setupGlobalFetchMock,
telemetryMock,
} from '@sim/testing'
import { auditMock, envMock, loggerMock, requestUtilsMock, telemetryMock } from '@sim/testing'
import { NextRequest } from 'next/server'
import { afterEach, beforeEach, describe, expect, it, vi } from 'vitest'
@@ -21,7 +14,7 @@ const mockCheckSessionOrInternalAuth = vi.fn()
const mockLoadWorkflowFromNormalizedTables = vi.fn()
const mockGetWorkflowById = vi.fn()
const mockAuthorizeWorkflowByWorkspacePermission = vi.fn()
const mockArchiveWorkflow = vi.fn()
const mockPerformDeleteWorkflow = vi.fn()
const mockDbUpdate = vi.fn()
const mockDbSelect = vi.fn()
@@ -72,8 +65,8 @@ vi.mock('@/lib/workflows/utils', () => ({
}) => mockAuthorizeWorkflowByWorkspacePermission(params),
}))
vi.mock('@/lib/workflows/lifecycle', () => ({
archiveWorkflow: (...args: unknown[]) => mockArchiveWorkflow(...args),
vi.mock('@/lib/workflows/orchestration', () => ({
performDeleteWorkflow: (...args: unknown[]) => mockPerformDeleteWorkflow(...args),
}))
vi.mock('@sim/db', () => ({
@@ -294,18 +287,7 @@ describe('Workflow By ID API Route', () => {
workspacePermission: 'admin',
})
mockDbSelect.mockReturnValue({
from: vi.fn().mockReturnValue({
where: vi.fn().mockResolvedValue([{ id: 'workflow-123' }, { id: 'workflow-456' }]),
}),
})
mockArchiveWorkflow.mockResolvedValue({
archived: true,
workflow: mockWorkflow,
})
setupGlobalFetchMock({ ok: true })
mockPerformDeleteWorkflow.mockResolvedValue({ success: true })
const req = new NextRequest('http://localhost:3000/api/workflows/workflow-123', {
method: 'DELETE',
@@ -317,6 +299,12 @@ describe('Workflow By ID API Route', () => {
expect(response.status).toBe(200)
const data = await response.json()
expect(data.success).toBe(true)
expect(mockPerformDeleteWorkflow).toHaveBeenCalledWith(
expect.objectContaining({
workflowId: 'workflow-123',
userId: 'user-123',
})
)
})
it('should allow admin to delete workspace workflow', async () => {
@@ -337,19 +325,7 @@ describe('Workflow By ID API Route', () => {
workspacePermission: 'admin',
})
// Mock db.select() to return multiple workflows so deletion is allowed
mockDbSelect.mockReturnValue({
from: vi.fn().mockReturnValue({
where: vi.fn().mockResolvedValue([{ id: 'workflow-123' }, { id: 'workflow-456' }]),
}),
})
mockArchiveWorkflow.mockResolvedValue({
archived: true,
workflow: mockWorkflow,
})
setupGlobalFetchMock({ ok: true })
mockPerformDeleteWorkflow.mockResolvedValue({ success: true })
const req = new NextRequest('http://localhost:3000/api/workflows/workflow-123', {
method: 'DELETE',
@@ -381,11 +357,10 @@ describe('Workflow By ID API Route', () => {
workspacePermission: 'admin',
})
// Mock db.select() to return only 1 workflow (the one being deleted)
mockDbSelect.mockReturnValue({
from: vi.fn().mockReturnValue({
where: vi.fn().mockResolvedValue([{ id: 'workflow-123' }]),
}),
mockPerformDeleteWorkflow.mockResolvedValue({
success: false,
error: 'Cannot delete the only workflow in the workspace',
errorCode: 'validation',
})
const req = new NextRequest('http://localhost:3000/api/workflows/workflow-123', {

View File

@@ -1,13 +1,12 @@
import { db } from '@sim/db'
import { templates, workflow } from '@sim/db/schema'
import { workflow } from '@sim/db/schema'
import { createLogger } from '@sim/logger'
import { and, eq, isNull, ne } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { AuthType, checkHybridAuth, checkSessionOrInternalAuth } from '@/lib/auth/hybrid'
import { generateRequestId } from '@/lib/core/utils/request'
import { archiveWorkflow } from '@/lib/workflows/lifecycle'
import { performDeleteWorkflow } from '@/lib/workflows/orchestration'
import { loadWorkflowFromNormalizedTables } from '@/lib/workflows/persistence/utils'
import { authorizeWorkflowByWorkspacePermission, getWorkflowById } from '@/lib/workflows/utils'
@@ -184,28 +183,12 @@ export async function DELETE(
)
}
// Check if this is the last workflow in the workspace
if (workflowData.workspaceId) {
const totalWorkflowsInWorkspace = await db
.select({ id: workflow.id })
.from(workflow)
.where(and(eq(workflow.workspaceId, workflowData.workspaceId), isNull(workflow.archivedAt)))
if (totalWorkflowsInWorkspace.length <= 1) {
return NextResponse.json(
{ error: 'Cannot delete the only workflow in the workspace' },
{ status: 400 }
)
}
}
// Check if workflow has published templates before deletion
const { searchParams } = new URL(request.url)
const checkTemplates = searchParams.get('check-templates') === 'true'
const deleteTemplatesParam = searchParams.get('deleteTemplates')
if (checkTemplates) {
// Return template information for frontend to handle
const { templates } = await import('@sim/db/schema')
const publishedTemplates = await db
.select({
id: templates.id,
@@ -229,49 +212,22 @@ export async function DELETE(
})
}
// Handle template deletion based on user choice
if (deleteTemplatesParam !== null) {
const deleteTemplates = deleteTemplatesParam === 'delete'
const result = await performDeleteWorkflow({
workflowId,
userId,
requestId,
templateAction: deleteTemplatesParam === 'delete' ? 'delete' : 'orphan',
})
if (deleteTemplates) {
// Delete all templates associated with this workflow
await db.delete(templates).where(eq(templates.workflowId, workflowId))
logger.info(`[${requestId}] Deleted templates for workflow ${workflowId}`)
} else {
// Orphan the templates (set workflowId to null)
await db
.update(templates)
.set({ workflowId: null })
.where(eq(templates.workflowId, workflowId))
logger.info(`[${requestId}] Orphaned templates for workflow ${workflowId}`)
}
}
const archiveResult = await archiveWorkflow(workflowId, { requestId })
if (!archiveResult.workflow) {
return NextResponse.json({ error: 'Workflow not found' }, { status: 404 })
if (!result.success) {
const status =
result.errorCode === 'not_found' ? 404 : result.errorCode === 'validation' ? 400 : 500
return NextResponse.json({ error: result.error }, { status })
}
const elapsed = Date.now() - startTime
logger.info(`[${requestId}] Successfully archived workflow ${workflowId} in ${elapsed}ms`)
recordAudit({
workspaceId: workflowData.workspaceId || null,
actorId: userId,
actorName: auth.userName,
actorEmail: auth.userEmail,
action: AuditAction.WORKFLOW_DELETED,
resourceType: AuditResourceType.WORKFLOW,
resourceId: workflowId,
resourceName: workflowData.name,
description: `Archived workflow "${workflowData.name}"`,
metadata: {
archived: archiveResult.archived,
deleteTemplates: deleteTemplatesParam === 'delete',
},
request,
})
return NextResponse.json({ success: true }, { status: 200 })
} catch (error: any) {
const elapsed = Date.now() - startTime

View File

@@ -293,7 +293,7 @@ export default function EmailAuth({ identifier, onAuthSuccess }: EmailAuthProps)
<button
onClick={() => handleVerifyOtp()}
disabled={otpValue.length !== 6 || isVerifyingOtp}
className='inline-flex h-[32px] w-full items-center justify-center gap-2 rounded-[5px] border border-white bg-white px-2.5 font-[430] font-season text-black text-sm transition-colors hover:border-[var(--border-1)] hover:bg-[var(--border-1)] disabled:cursor-not-allowed disabled:opacity-50'
className={AUTH_SUBMIT_BTN}
>
{isVerifyingOtp ? (
<span className='flex items-center gap-2'>

View File
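This hunk (and several below) replaces the same long inline `className` with a shared `AUTH_SUBMIT_BTN` constant imported from `@/app/(auth)/components/auth-button-classes`. Judging from the strings the hunks delete, the extracted module presumably looks like the following — the path and export names come from the imports, but the exact contents (especially `AUTH_PRIMARY_CTA_BASE`) are a reconstruction, not the repo's actual file:

```typescript
// Presumed contents of '@/app/(auth)/components/auth-button-classes',
// reconstructed from the className strings the hunks delete.
export const AUTH_SUBMIT_BTN =
  'inline-flex h-[32px] w-full items-center justify-center gap-2 rounded-[5px] border border-white bg-white px-2.5 font-[430] font-season text-black text-sm transition-colors hover:border-[var(--border-1)] hover:bg-[var(--border-1)] disabled:cursor-not-allowed disabled:opacity-50'

// Hypothetical: likely the same styling minus the w-full sizing, since
// call sites such as InviteStatusCard compose `${AUTH_PRIMARY_CTA_BASE} w-full`.
export const AUTH_PRIMARY_CTA_BASE =
  'inline-flex h-[32px] items-center justify-center gap-2 rounded-[5px] border border-white bg-white px-2.5 font-[430] font-season text-black text-sm transition-colors hover:border-[var(--border-1)] hover:bg-[var(--border-1)] disabled:cursor-not-allowed disabled:opacity-50'
```

Extracting the constant means a future restyle touches one file instead of ten near-identical class strings.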

@@ -1,6 +1,7 @@
'use client'
import { useRouter } from 'next/navigation'
import { AUTH_SUBMIT_BTN } from '@/app/(auth)/components/auth-button-classes'
import { StatusPageLayout } from '@/app/(auth)/components/status-page-layout'
interface FormErrorStateProps {
@@ -12,10 +13,7 @@ export function FormErrorState({ error }: FormErrorStateProps) {
return (
<StatusPageLayout title='Form Unavailable' description={error}>
<button
onClick={() => router.push('/workspace')}
className='inline-flex h-[32px] w-full items-center justify-center gap-2 rounded-[5px] border border-white bg-white px-2.5 font-[430] font-season text-black text-sm transition-colors hover:border-[var(--border-1)] hover:bg-[var(--border-1)] disabled:cursor-not-allowed disabled:opacity-50'
>
<button onClick={() => router.push('/workspace')} className={AUTH_SUBMIT_BTN}>
Return to Workspace
</button>
</StatusPageLayout>

View File

@@ -5,6 +5,7 @@ import { Eye, EyeOff, Loader2 } from 'lucide-react'
import { Input, Label } from '@/components/emcn'
import { cn } from '@/lib/core/utils/cn'
import AuthBackground from '@/app/(auth)/components/auth-background'
import { AUTH_SUBMIT_BTN } from '@/app/(auth)/components/auth-button-classes'
import { SupportFooter } from '@/app/(auth)/components/support-footer'
import Navbar from '@/app/(home)/components/navbar/navbar'
@@ -75,7 +76,7 @@ export function PasswordAuth({ onSubmit, error }: PasswordAuthProps) {
<button
type='submit'
disabled={!password.trim() || isSubmitting}
className='inline-flex h-[32px] w-full items-center justify-center gap-2 rounded-[5px] border border-white bg-white px-2.5 font-[430] font-season text-black text-sm transition-colors hover:border-[var(--border-1)] hover:bg-[var(--border-1)] disabled:cursor-not-allowed disabled:opacity-50'
className={AUTH_SUBMIT_BTN}
>
{isSubmitting ? (
<span className='flex items-center gap-2'>

View File

@@ -2,6 +2,7 @@
import { useEffect } from 'react'
import { createLogger } from '@sim/logger'
import { AUTH_SUBMIT_BTN } from '@/app/(auth)/components/auth-button-classes'
import { StatusPageLayout } from '@/app/(auth)/components/status-page-layout'
const logger = createLogger('FormError')
@@ -21,10 +22,7 @@ export default function FormError({ error, reset }: FormErrorProps) {
title='Something went wrong'
description='We encountered an error loading this form. Please try again.'
>
<button
onClick={reset}
className='inline-flex h-[32px] w-full items-center justify-center gap-2 rounded-[5px] border border-white bg-white px-2.5 font-[430] font-season text-black text-sm transition-colors hover:border-[var(--border-1)] hover:bg-[var(--border-1)] disabled:cursor-not-allowed disabled:opacity-50'
>
<button onClick={reset} className={AUTH_SUBMIT_BTN}>
Try again
</button>
</StatusPageLayout>

View File

@@ -5,6 +5,7 @@ import { createLogger } from '@sim/logger'
import { Loader2 } from 'lucide-react'
import { martianMono } from '@/app/_styles/fonts/martian-mono/martian-mono'
import AuthBackground from '@/app/(auth)/components/auth-background'
import { AUTH_SUBMIT_BTN } from '@/app/(auth)/components/auth-button-classes'
import { SupportFooter } from '@/app/(auth)/components/support-footer'
import Navbar from '@/app/(home)/components/navbar/navbar'
import {
@@ -322,11 +323,7 @@ export default function Form({ identifier }: { identifier: string }) {
)}
{fields.length > 0 && (
<button
type='submit'
disabled={isSubmitting}
className='inline-flex h-[32px] w-full items-center justify-center gap-2 rounded-[5px] border border-white bg-white px-2.5 font-[430] font-season text-black text-sm transition-colors hover:border-[var(--border-1)] hover:bg-[var(--border-1)] disabled:cursor-not-allowed disabled:opacity-50'
>
<button type='submit' disabled={isSubmitting} className={AUTH_SUBMIT_BTN}>
{isSubmitting ? (
<span className='flex items-center gap-2'>
<Loader2 className='h-4 w-4 animate-spin' />

View File

@@ -3,6 +3,7 @@
import { Loader2 } from 'lucide-react'
import { useRouter } from 'next/navigation'
import { cn } from '@/lib/core/utils/cn'
import { AUTH_PRIMARY_CTA_BASE } from '@/app/(auth)/components/auth-button-classes'
interface InviteStatusCardProps {
type: 'login' | 'loading' | 'error' | 'success' | 'invitation' | 'warning'
@@ -55,10 +56,7 @@ export function InviteStatusCard({
<div className='mt-8 w-full max-w-[410px] space-y-3'>
{isExpiredError && (
<button
onClick={() => router.push('/')}
className='inline-flex h-[32px] w-full items-center justify-center gap-2 rounded-[5px] border border-white bg-white px-2.5 font-[430] font-season text-black text-sm transition-colors hover:border-[var(--border-1)] hover:bg-[var(--border-1)] disabled:cursor-not-allowed disabled:opacity-50'
>
<button onClick={() => router.push('/')} className={`${AUTH_PRIMARY_CTA_BASE} w-full`}>
Request New Invitation
</button>
)}
@@ -69,9 +67,9 @@ export function InviteStatusCard({
onClick={action.onClick}
disabled={action.disabled || action.loading}
className={cn(
'inline-flex h-[32px] w-full items-center justify-center gap-2 rounded-[5px] border border-white bg-white px-2.5 font-[430] font-season text-black text-sm transition-colors hover:border-[var(--border-1)] hover:bg-[var(--border-1)] disabled:cursor-not-allowed disabled:opacity-50',
`${AUTH_PRIMARY_CTA_BASE} w-full`,
index !== 0 &&
'border-[var(--landing-border-strong)] bg-transparent text-[var(--landing-text)] hover:border-[var(--landing-border-strong)] hover:bg-[var(--landing-bg-elevated)]'
'border-[var(--landing-border-strong)] bg-transparent text-[var(--landing-text)] hover:border-[var(--landing-border-strong)] hover:bg-[var(--landing-bg-elevated)] hover:text-[var(--landing-text)]'
)}
>
{action.loading ? (

View File

@@ -218,6 +218,7 @@ export default function RootLayout({ children }: { children: React.ReactNode })
<meta httpEquiv='x-ua-compatible' content='ie=edge' />
{/* OneDollarStats Analytics */}
<link rel='dns-prefetch' href='https://assets.onedollarstats.com' />
<script defer src='https://assets.onedollarstats.com/stonks.js' />
<PublicEnvScript />

View File

@@ -2,6 +2,7 @@ import type { Metadata } from 'next'
import Link from 'next/link'
import { getNavBlogPosts } from '@/lib/blog/registry'
import AuthBackground from '@/app/(auth)/components/auth-background'
import { AUTH_PRIMARY_CTA_BASE } from '@/app/(auth)/components/auth-button-classes'
import Navbar from '@/app/(home)/components/navbar/navbar'
export const metadata: Metadata = {
@@ -9,9 +10,6 @@ export const metadata: Metadata = {
robots: { index: false, follow: true },
}
const CTA_BASE =
'inline-flex items-center h-[32px] rounded-[5px] border px-2.5 font-[430] font-season text-sm'
export default async function NotFound() {
const blogPosts = await getNavBlogPosts()
return (
@@ -29,10 +27,7 @@ export default async function NotFound() {
The page you&apos;re looking for doesn&apos;t exist or has been moved.
</p>
<div className='mt-3 flex items-center gap-2'>
<Link
href='/'
className={`${CTA_BASE} gap-2 border-white bg-white text-black transition-colors hover:border-[var(--border-1)] hover:bg-[var(--border-1)]`}
>
<Link href='/' className={AUTH_PRIMARY_CTA_BASE}>
Return to Home
</Link>
</div>

View File

@@ -2,7 +2,7 @@ import type { Metadata } from 'next'
import { getBaseUrl } from '@/lib/core/utils/urls'
import Landing from '@/app/(home)/landing'
export const dynamic = 'force-dynamic'
export const revalidate = 3600
const baseUrl = getBaseUrl()

View File

@@ -3,6 +3,7 @@
import { Suspense, useEffect, useState } from 'react'
import { Loader2 } from 'lucide-react'
import { useSearchParams } from 'next/navigation'
import { AUTH_SUBMIT_BTN } from '@/app/(auth)/components/auth-button-classes'
import { InviteLayout } from '@/app/invite/components'
interface UnsubscribeData {
@@ -143,10 +144,7 @@ function UnsubscribeContent() {
</div>
<div className={'mt-8 w-full max-w-[410px] space-y-3'}>
<button
onClick={() => window.history.back()}
className='inline-flex h-[32px] w-full items-center justify-center gap-2 rounded-[5px] border border-white bg-white px-2.5 font-[430] font-season text-black text-sm transition-colors hover:border-[var(--border-1)] hover:bg-[var(--border-1)] disabled:cursor-not-allowed disabled:opacity-50'
>
<button onClick={() => window.history.back()} className={AUTH_SUBMIT_BTN}>
Go Back
</button>
</div>
@@ -168,10 +166,7 @@ function UnsubscribeContent() {
</div>
<div className={'mt-8 w-full max-w-[410px] space-y-3'}>
<button
onClick={() => window.close()}
className='inline-flex h-[32px] w-full items-center justify-center gap-2 rounded-[5px] border border-white bg-white px-2.5 font-[430] font-season text-black text-sm transition-colors hover:border-[var(--border-1)] hover:bg-[var(--border-1)] disabled:cursor-not-allowed disabled:opacity-50'
>
<button onClick={() => window.close()} className={AUTH_SUBMIT_BTN}>
Close
</button>
</div>
@@ -193,10 +188,7 @@ function UnsubscribeContent() {
</div>
<div className={'mt-8 w-full max-w-[410px] space-y-3'}>
<button
onClick={() => window.close()}
className='inline-flex h-[32px] w-full items-center justify-center gap-2 rounded-[5px] border border-white bg-white px-2.5 font-[430] font-season text-black text-sm transition-colors hover:border-[var(--border-1)] hover:bg-[var(--border-1)] disabled:cursor-not-allowed disabled:opacity-50'
>
<button onClick={() => window.close()} className={AUTH_SUBMIT_BTN}>
Close
</button>
</div>
@@ -222,7 +214,7 @@ function UnsubscribeContent() {
<button
onClick={() => handleUnsubscribe('all')}
disabled={processing || isAlreadyUnsubscribedFromAll}
className='inline-flex h-[32px] w-full items-center justify-center gap-2 rounded-[5px] border border-white bg-white px-2.5 font-[430] font-season text-black text-sm transition-colors hover:border-[var(--border-1)] hover:bg-[var(--border-1)] disabled:cursor-not-allowed disabled:opacity-50'
className={AUTH_SUBMIT_BTN}
>
{processing ? (
<span className='flex items-center gap-2'>
@@ -249,7 +241,7 @@ function UnsubscribeContent() {
isAlreadyUnsubscribedFromAll ||
data?.currentPreferences.unsubscribeMarketing
}
className='inline-flex h-[32px] w-full items-center justify-center gap-2 rounded-[5px] border border-white bg-white px-2.5 font-[430] font-season text-black text-sm transition-colors hover:border-[var(--border-1)] hover:bg-[var(--border-1)] disabled:cursor-not-allowed disabled:opacity-50'
className={AUTH_SUBMIT_BTN}
>
{data?.currentPreferences.unsubscribeMarketing
? 'Unsubscribed from Marketing'
@@ -263,7 +255,7 @@ function UnsubscribeContent() {
isAlreadyUnsubscribedFromAll ||
data?.currentPreferences.unsubscribeUpdates
}
className='inline-flex h-[32px] w-full items-center justify-center gap-2 rounded-[5px] border border-white bg-white px-2.5 font-[430] font-season text-black text-sm transition-colors hover:border-[var(--border-1)] hover:bg-[var(--border-1)] disabled:cursor-not-allowed disabled:opacity-50'
className={AUTH_SUBMIT_BTN}
>
{data?.currentPreferences.unsubscribeUpdates
? 'Unsubscribed from Updates'
@@ -277,7 +269,7 @@ function UnsubscribeContent() {
isAlreadyUnsubscribedFromAll ||
data?.currentPreferences.unsubscribeNotifications
}
className='inline-flex h-[32px] w-full items-center justify-center gap-2 rounded-[5px] border border-white bg-white px-2.5 font-[430] font-season text-black text-sm transition-colors hover:border-[var(--border-1)] hover:bg-[var(--border-1)] disabled:cursor-not-allowed disabled:opacity-50'
className={AUTH_SUBMIT_BTN}
>
{data?.currentPreferences.unsubscribeNotifications
? 'Unsubscribed from Notifications'

View File

@@ -61,7 +61,7 @@ export const navTourSteps: Step[] = [
target: '[data-tour="nav-tasks"]',
title: 'Tasks',
content:
'Tasks that work for you. Mothership can create, edit, and delete resource throughout the platform. It can also perform actions on your behalf, like sending emails, creating tasks, and more.',
'Tasks that work for you. Mothership can create, edit, and delete resources throughout the platform. It can also perform actions on your behalf, like sending emails, creating tasks, and more.',
placement: 'right',
disableBeacon: true,
},

View File

@@ -1,6 +1,6 @@
'use client'
import { createContext, useCallback, useContext, useEffect, useState } from 'react'
import { createContext, useCallback, useContext } from 'react'
import type { TooltipRenderProps } from 'react-joyride'
import { TourTooltip } from '@/components/emcn'
@@ -59,18 +59,14 @@ export function TourTooltipAdapter({
closeProps,
}: TooltipRenderProps) {
const { isTooltipVisible, isEntrance, totalSteps } = useContext(TourStateContext)
const [targetEl, setTargetEl] = useState<HTMLElement | null>(null)
useEffect(() => {
const { target } = step
if (typeof target === 'string') {
setTargetEl(document.querySelector<HTMLElement>(target))
} else if (target instanceof HTMLElement) {
setTargetEl(target)
} else {
setTargetEl(null)
}
}, [step])
const { target } = step
const targetEl =
typeof target === 'string'
? document.querySelector<HTMLElement>(target)
: target instanceof HTMLElement
? target
: null
/**
* Forwards the Joyride tooltip ref safely, handling both

View File

@@ -114,6 +114,16 @@ export function useTour({
[steps.length, stopTour, cancelPendingTransitions, scheduleReveal]
)
useEffect(() => {
if (!run) return
const html = document.documentElement
const prev = html.style.scrollbarGutter
html.style.scrollbarGutter = 'stable'
return () => {
html.style.scrollbarGutter = prev
}
}, [run])
/** Stop the tour when disabled becomes true (e.g. navigating away from the relevant page) */
useEffect(() => {
if (disabled && run) {

View File

@@ -1,4 +1,4 @@
import { memo, type ReactNode, useCallback, useRef, useState } from 'react'
import { memo, type ReactNode } from 'react'
import * as PopoverPrimitive from '@radix-ui/react-popover'
import {
ArrowDown,
@@ -19,8 +19,6 @@ import { cn } from '@/lib/core/utils/cn'
const SEARCH_ICON = (
<Search className='pointer-events-none h-[14px] w-[14px] shrink-0 text-[var(--text-icon)]' />
)
const FILTER_ICON = <ListFilter className='mr-1.5 h-[14px] w-[14px] text-[var(--text-icon)]' />
const SORT_ICON = <ArrowUpDown className='mr-1.5 h-[14px] w-[14px] text-[var(--text-icon)]' />
type SortDirection = 'asc' | 'desc'
@@ -67,7 +65,12 @@ export interface SearchConfig {
interface ResourceOptionsBarProps {
search?: SearchConfig
sort?: SortConfig
/** Popover content — renders inside a Popover (used by logs, etc.) */
filter?: ReactNode
/** When provided, Filter button acts as a toggle instead of opening a Popover */
onFilterToggle?: () => void
/** Whether the filter is currently active (highlights the toggle button) */
filterActive?: boolean
filterTags?: FilterTag[]
extras?: ReactNode
}
@@ -76,10 +79,13 @@ export const ResourceOptionsBar = memo(function ResourceOptionsBar({
search,
sort,
filter,
onFilterToggle,
filterActive,
filterTags,
extras,
}: ResourceOptionsBarProps) {
const hasContent = search || sort || filter || extras || (filterTags && filterTags.length > 0)
const hasContent =
search || sort || filter || onFilterToggle || extras || (filterTags && filterTags.length > 0)
if (!hasContent) return null
return (
@@ -88,22 +94,39 @@ export const ResourceOptionsBar = memo(function ResourceOptionsBar({
{search && <SearchSection search={search} />}
<div className='flex items-center gap-1.5'>
{extras}
{filterTags?.map((tag) => (
{filterTags?.map((tag, i) => (
<Button
key={tag.label}
key={`${tag.label}-${i}`}
variant='subtle'
className='px-2 py-1 text-caption'
className='max-w-[200px] px-2 py-1 text-caption'
onClick={tag.onRemove}
>
{tag.label}
<span className='ml-1 text-[var(--text-icon)] text-micro'></span>
<span className='truncate'>{tag.label}</span>
<span className='ml-1 shrink-0 text-[var(--text-icon)] text-micro'></span>
</Button>
))}
{filter && (
{onFilterToggle ? (
<Button
variant='subtle'
className={cn(
'px-2 py-1 text-caption',
filterActive && 'bg-[var(--surface-3)] text-[var(--text-primary)]'
)}
onClick={onFilterToggle}
>
<ListFilter
className={cn(
'mr-1.5 h-[14px] w-[14px]',
filterActive ? 'text-[var(--text-primary)]' : 'text-[var(--text-icon)]'
)}
/>
Filter
</Button>
) : filter ? (
<PopoverPrimitive.Root>
<PopoverPrimitive.Trigger asChild>
<Button variant='subtle' className='px-2 py-1 text-caption'>
{FILTER_ICON}
<ListFilter className='mr-1.5 h-[14px] w-[14px] text-[var(--text-icon)]' />
Filter
</Button>
</PopoverPrimitive.Trigger>
@@ -111,15 +134,13 @@ export const ResourceOptionsBar = memo(function ResourceOptionsBar({
<PopoverPrimitive.Content
align='start'
sideOffset={6}
className={cn(
'z-50 rounded-lg border border-[var(--border)] bg-[var(--bg)] shadow-sm'
)}
className='z-50 w-fit rounded-lg border border-[var(--border)] bg-[var(--bg)] shadow-sm'
>
{filter}
</PopoverPrimitive.Content>
</PopoverPrimitive.Portal>
</PopoverPrimitive.Root>
)}
) : null}
{sort && <SortDropdown config={sort} />}
</div>
</div>
@@ -128,34 +149,6 @@ export const ResourceOptionsBar = memo(function ResourceOptionsBar({
})
const SearchSection = memo(function SearchSection({ search }: { search: SearchConfig }) {
const [localValue, setLocalValue] = useState(search.value)
const lastReportedRef = useRef(search.value)
if (search.value !== lastReportedRef.current) {
setLocalValue(search.value)
lastReportedRef.current = search.value
}
const handleInputChange = useCallback(
(e: React.ChangeEvent<HTMLInputElement>) => {
const next = e.target.value
setLocalValue(next)
search.onChange(next)
},
[search.onChange]
)
const handleClearAll = useCallback(() => {
setLocalValue('')
lastReportedRef.current = ''
if (search.onClearAll) {
search.onClearAll()
} else {
search.onChange('')
}
}, [search.onClearAll, search.onChange])
return (
<div className='relative flex flex-1 items-center'>
{SEARCH_ICON}
@@ -177,8 +170,8 @@ const SearchSection = memo(function SearchSection({ search }: { search: SearchCo
<input
ref={search.inputRef}
type='text'
value={localValue}
onChange={handleInputChange}
value={search.value}
onChange={(e) => search.onChange(e.target.value)}
onKeyDown={search.onKeyDown}
onFocus={search.onFocus}
onBlur={search.onBlur}
@@ -186,11 +179,11 @@ const SearchSection = memo(function SearchSection({ search }: { search: SearchCo
className='min-w-[80px] flex-1 bg-transparent py-1 text-[var(--text-secondary)] text-caption outline-none placeholder:text-[var(--text-subtle)]'
/>
</div>
{search.tags?.length || localValue ? (
{search.tags?.length || search.value ? (
<button
type='button'
className='mr-0.5 flex h-[14px] w-[14px] shrink-0 items-center justify-center text-[var(--text-subtle)] transition-colors hover-hover:text-[var(--text-secondary)]'
onClick={handleClearAll}
onClick={search.onClearAll ?? (() => search.onChange(''))}
>
<span className='text-caption'>×</span>
</button>
@@ -213,8 +206,19 @@ const SortDropdown = memo(function SortDropdown({ config }: { config: SortConfig
return (
<DropdownMenu>
<DropdownMenuTrigger asChild>
<Button variant='subtle' className='px-2 py-1 text-caption'>
{SORT_ICON}
<Button
variant='subtle'
className={cn(
'px-2 py-1 text-caption',
active && 'bg-[var(--surface-3)] text-[var(--text-primary)]'
)}
>
<ArrowUpDown
className={cn(
'mr-1.5 h-[14px] w-[14px]',
active ? 'text-[var(--text-primary)]' : 'text-[var(--text-icon)]'
)}
/>
Sort
</Button>
</DropdownMenuTrigger>
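The `hasContent` guard in this hunk now treats `onFilterToggle` as renderable content, so a bar configured with only the toggle (and no popover `filter`) still mounts. A minimal sketch of that predicate, extracted here as a pure function purely for illustration (the component inlines it):

```typescript
// Mirrors the hasContent guard above: render only when at least one
// section is configured. onFilterToggle now counts even without `filter`.
function hasBarContent(p: {
  search?: unknown
  sort?: unknown
  filter?: unknown
  onFilterToggle?: unknown
  extras?: unknown
  filterTags?: { label: string }[]
}): boolean {
  return Boolean(
    p.search || p.sort || p.filter || p.onFilterToggle || p.extras ||
      (p.filterTags && p.filterTags.length > 0)
  )
}

console.log(hasBarContent({})) // false — nothing to show
console.log(hasBarContent({ onFilterToggle: () => {} })) // true — the toggle alone suffices
```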

View File

@@ -2,6 +2,7 @@
import { memo, useCallback, useEffect, useRef, useState } from 'react'
import { createLogger } from '@sim/logger'
import { ZoomIn, ZoomOut } from 'lucide-react'
import { Skeleton } from '@/components/emcn'
import { cn } from '@/lib/core/utils/cn'
import type { WorkspaceFileRecord } from '@/lib/uploads/contexts/workspace'
@@ -432,17 +433,120 @@ const IframePreview = memo(function IframePreview({ file }: { file: WorkspaceFil
)
})
const ZOOM_MIN = 0.25
const ZOOM_MAX = 4
const ZOOM_WHEEL_SENSITIVITY = 0.005
const ZOOM_BUTTON_FACTOR = 1.2
const clampZoom = (z: number) => Math.min(Math.max(z, ZOOM_MIN), ZOOM_MAX)
const ImagePreview = memo(function ImagePreview({ file }: { file: WorkspaceFileRecord }) {
const serveUrl = `/api/files/serve/${encodeURIComponent(file.key)}?context=workspace`
const [zoom, setZoom] = useState(1)
const [offset, setOffset] = useState({ x: 0, y: 0 })
const isDragging = useRef(false)
const dragStart = useRef({ x: 0, y: 0 })
const offsetAtDragStart = useRef({ x: 0, y: 0 })
const offsetRef = useRef(offset)
offsetRef.current = offset
const containerRef = useRef<HTMLDivElement>(null)
const zoomIn = useCallback(() => setZoom((z) => clampZoom(z * ZOOM_BUTTON_FACTOR)), [])
const zoomOut = useCallback(() => setZoom((z) => clampZoom(z / ZOOM_BUTTON_FACTOR)), [])
useEffect(() => {
const el = containerRef.current
if (!el) return
const onWheel = (e: WheelEvent) => {
e.preventDefault()
if (e.ctrlKey || e.metaKey) {
setZoom((z) => clampZoom(z * Math.exp(-e.deltaY * ZOOM_WHEEL_SENSITIVITY)))
} else {
setOffset((o) => ({ x: o.x - e.deltaX, y: o.y - e.deltaY }))
}
}
el.addEventListener('wheel', onWheel, { passive: false })
return () => el.removeEventListener('wheel', onWheel)
}, [])
const handleMouseDown = useCallback((e: React.MouseEvent) => {
if (e.button !== 0) return
isDragging.current = true
dragStart.current = { x: e.clientX, y: e.clientY }
offsetAtDragStart.current = offsetRef.current
if (containerRef.current) containerRef.current.style.cursor = 'grabbing'
e.preventDefault()
}, [])
const handleMouseMove = useCallback((e: React.MouseEvent) => {
if (!isDragging.current) return
setOffset({
x: offsetAtDragStart.current.x + (e.clientX - dragStart.current.x),
y: offsetAtDragStart.current.y + (e.clientY - dragStart.current.y),
})
}, [])
const handleMouseUp = useCallback(() => {
isDragging.current = false
if (containerRef.current) containerRef.current.style.cursor = 'grab'
}, [])
useEffect(() => {
setZoom(1)
setOffset({ x: 0, y: 0 })
}, [file.key])
return (
<div className='flex flex-1 items-center justify-center overflow-auto bg-[var(--surface-1)] p-6'>
<img
src={serveUrl}
alt={file.name}
className='max-h-full max-w-full rounded-md object-contain'
loading='eager'
/>
<div
ref={containerRef}
className='relative flex flex-1 cursor-grab overflow-hidden bg-[var(--surface-1)]'
onMouseDown={handleMouseDown}
onMouseMove={handleMouseMove}
onMouseUp={handleMouseUp}
onMouseLeave={handleMouseUp}
>
<div
className='pointer-events-none absolute inset-0 flex items-center justify-center'
style={{
transform: `translate(${offset.x}px, ${offset.y}px) scale(${zoom})`,
transformOrigin: 'center center',
}}
>
<img
src={serveUrl}
alt={file.name}
className='max-h-full max-w-full select-none rounded-md object-contain'
draggable={false}
loading='eager'
/>
</div>
<div
className='absolute right-4 bottom-4 flex items-center gap-1 rounded-md border border-[var(--border)] bg-[var(--surface-2)] px-2 py-1 shadow-sm'
onMouseDown={(e) => e.stopPropagation()}
>
<button
type='button'
onClick={zoomOut}
disabled={zoom <= ZOOM_MIN}
className='flex h-6 w-6 items-center justify-center rounded text-[var(--text-secondary)] transition-colors hover:bg-[var(--surface-3)] hover:text-[var(--text-primary)] disabled:cursor-not-allowed disabled:opacity-40'
aria-label='Zoom out'
>
<ZoomOut className='h-3.5 w-3.5' />
</button>
<span className='min-w-[3rem] text-center text-[11px] text-[var(--text-secondary)]'>
{Math.round(zoom * 100)}%
</span>
<button
type='button'
onClick={zoomIn}
disabled={zoom >= ZOOM_MAX}
className='flex h-6 w-6 items-center justify-center rounded text-[var(--text-secondary)] transition-colors hover:bg-[var(--surface-3)] hover:text-[var(--text-primary)] disabled:cursor-not-allowed disabled:opacity-40'
aria-label='Zoom in'
>
<ZoomIn className='h-3.5 w-3.5' />
</button>
</div>
</div>
)
})
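The wheel handler above scales zoom by `Math.exp(-deltaY * sensitivity)` rather than adding a linear increment, so equal scroll distances multiply the zoom by equal factors in both directions. A standalone sketch of that math, with the constants copied from the diff:

```typescript
const ZOOM_MIN = 0.25
const ZOOM_MAX = 4
const ZOOM_WHEEL_SENSITIVITY = 0.005

// Keep zoom inside the allowed range.
const clampZoom = (z: number): number => Math.min(Math.max(z, ZOOM_MIN), ZOOM_MAX)

// Exponential scaling is symmetric: scrolling +d then -d returns to the
// starting zoom, which a linear `z + k * d` update would not guarantee.
const applyWheel = (zoom: number, deltaY: number): number =>
  clampZoom(zoom * Math.exp(-deltaY * ZOOM_WHEEL_SENSITIVITY))

console.log(applyWheel(1, -100).toFixed(3)) // zooms in by exp(0.5) ≈ 1.649
console.log(applyWheel(applyWheel(1, -100), 100).toFixed(3)) // round-trips to 1.000
```

The listener is registered with `{ passive: false }` because the handler calls `preventDefault()`, which passive wheel listeners are not allowed to do.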

View File

@@ -1,6 +1,7 @@
'use client'
import { memo, useMemo, useRef } from 'react'
import { createContext, memo, useContext, useMemo, useRef } from 'react'
import type { Components, ExtraProps } from 'react-markdown'
import ReactMarkdown from 'react-markdown'
import remarkBreaks from 'remark-breaks'
import remarkGfm from 'remark-gfm'
@@ -70,34 +71,51 @@ export const PreviewPanel = memo(function PreviewPanel({
const REMARK_PLUGINS = [remarkGfm, remarkBreaks]
/**
* Carries the contentRef and toggle handler from MarkdownPreview down to the
* task-list renderers. Only present when the preview is interactive.
*/
const MarkdownCheckboxCtx = createContext<{
contentRef: React.MutableRefObject<string>
onToggle: (index: number, checked: boolean) => void
} | null>(null)
/** Carries the resolved checkbox index from LiRenderer to InputRenderer. */
const CheckboxIndexCtx = createContext(-1)
const STATIC_MARKDOWN_COMPONENTS = {
p: ({ children }: any) => (
p: ({ children }: { children?: React.ReactNode }) => (
<p className='mb-3 break-words text-[14px] text-[var(--text-primary)] leading-[1.6] last:mb-0'>
{children}
</p>
),
h1: ({ children }: any) => (
h1: ({ children }: { children?: React.ReactNode }) => (
<h1 className='mt-6 mb-4 break-words font-semibold text-[24px] text-[var(--text-primary)] first:mt-0'>
{children}
</h1>
),
h2: ({ children }: any) => (
h2: ({ children }: { children?: React.ReactNode }) => (
<h2 className='mt-5 mb-3 break-words font-semibold text-[20px] text-[var(--text-primary)] first:mt-0'>
{children}
</h2>
),
h3: ({ children }: any) => (
h3: ({ children }: { children?: React.ReactNode }) => (
<h3 className='mt-4 mb-2 break-words font-semibold text-[16px] text-[var(--text-primary)] first:mt-0'>
{children}
</h3>
),
h4: ({ children }: any) => (
h4: ({ children }: { children?: React.ReactNode }) => (
<h4 className='mt-3 mb-2 break-words font-semibold text-[14px] text-[var(--text-primary)] first:mt-0'>
{children}
</h4>
),
code: ({ inline, className, children, ...props }: any) => {
const isInline = inline || !className?.includes('language-')
code: ({
className,
children,
node: _node,
...props
}: React.HTMLAttributes<HTMLElement> & ExtraProps) => {
const isInline = !className?.includes('language-')
if (isInline) {
return (
@@ -119,8 +137,8 @@ const STATIC_MARKDOWN_COMPONENTS = {
</code>
)
},
pre: ({ children }: any) => <>{children}</>,
a: ({ href, children }: any) => (
pre: ({ children }: { children?: React.ReactNode }) => <>{children}</>,
a: ({ href, children }: { href?: string; children?: React.ReactNode }) => (
<a
href={href}
target='_blank'
@@ -130,102 +148,133 @@ const STATIC_MARKDOWN_COMPONENTS = {
{children}
</a>
),
strong: ({ children }: any) => (
strong: ({ children }: { children?: React.ReactNode }) => (
<strong className='break-words font-semibold text-[var(--text-primary)]'>{children}</strong>
),
em: ({ children }: any) => (
em: ({ children }: { children?: React.ReactNode }) => (
<em className='break-words text-[var(--text-tertiary)]'>{children}</em>
),
blockquote: ({ children }: any) => (
blockquote: ({ children }: { children?: React.ReactNode }) => (
<blockquote className='my-4 break-words border-[var(--border-1)] border-l-4 py-1 pl-4 text-[var(--text-tertiary)] italic'>
{children}
</blockquote>
),
hr: () => <hr className='my-6 border-[var(--border)]' />,
img: ({ src, alt }: any) => (
img: ({ src, alt, node: _node }: React.ComponentPropsWithoutRef<'img'> & ExtraProps) => (
<img src={src} alt={alt ?? ''} className='my-3 max-w-full rounded-md' loading='lazy' />
),
table: ({ children }: any) => (
table: ({ children }: { children?: React.ReactNode }) => (
<div className='my-4 max-w-full overflow-x-auto'>
<table className='w-full border-collapse text-[13px]'>{children}</table>
</div>
),
thead: ({ children }: any) => <thead className='bg-[var(--surface-2)]'>{children}</thead>,
tbody: ({ children }: any) => <tbody>{children}</tbody>,
tr: ({ children }: any) => (
thead: ({ children }: { children?: React.ReactNode }) => (
<thead className='bg-[var(--surface-2)]'>{children}</thead>
),
tbody: ({ children }: { children?: React.ReactNode }) => <tbody>{children}</tbody>,
tr: ({ children }: { children?: React.ReactNode }) => (
<tr className='border-[var(--border)] border-b last:border-b-0'>{children}</tr>
),
th: ({ children }: any) => (
th: ({ children }: { children?: React.ReactNode }) => (
<th className='px-3 py-2 text-left font-semibold text-[12px] text-[var(--text-primary)]'>
{children}
</th>
),
td: ({ children }: any) => <td className='px-3 py-2 text-[var(--text-secondary)]'>{children}</td>,
td: ({ children }: { children?: React.ReactNode }) => (
<td className='px-3 py-2 text-[var(--text-secondary)]'>{children}</td>
),
}
function buildMarkdownComponents(
checkboxCounterRef: React.MutableRefObject<number>,
onCheckboxToggle?: (checkboxIndex: number, checked: boolean) => void
) {
const isInteractive = Boolean(onCheckboxToggle)
function UlRenderer({ className, children }: React.ComponentPropsWithoutRef<'ul'> & ExtraProps) {
const isTaskList = typeof className === 'string' && className.includes('contains-task-list')
return (
<ul
className={cn(
'mt-1 mb-3 space-y-1 break-words text-[14px] text-[var(--text-primary)]',
isTaskList ? 'list-none pl-0' : 'list-disc pl-6'
)}
>
{children}
</ul>
)
}
return {
...STATIC_MARKDOWN_COMPONENTS,
ul: ({ className, children }: any) => {
const isTaskList = typeof className === 'string' && className.includes('contains-task-list')
return (
<ul
className={cn(
'mt-1 mb-3 space-y-1 break-words text-[14px] text-[var(--text-primary)]',
isTaskList ? 'list-none pl-0' : 'list-disc pl-6'
)}
>
{children}
</ul>
)
},
ol: ({ className, children }: any) => {
const isTaskList = typeof className === 'string' && className.includes('contains-task-list')
return (
<ol
className={cn(
'mt-1 mb-3 space-y-1 break-words text-[14px] text-[var(--text-primary)]',
isTaskList ? 'list-none pl-0' : 'list-decimal pl-6'
)}
>
{children}
</ol>
)
},
li: ({ className, children }: any) => {
const isTaskItem = typeof className === 'string' && className.includes('task-list-item')
if (isTaskItem) {
function OlRenderer({ className, children }: React.ComponentPropsWithoutRef<'ol'> & ExtraProps) {
const isTaskList = typeof className === 'string' && className.includes('contains-task-list')
return (
<ol
className={cn(
'mt-1 mb-3 space-y-1 break-words text-[14px] text-[var(--text-primary)]',
isTaskList ? 'list-none pl-0' : 'list-decimal pl-6'
)}
>
{children}
</ol>
)
}
function LiRenderer({
className,
children,
node,
}: React.ComponentPropsWithoutRef<'li'> & ExtraProps) {
const ctx = useContext(MarkdownCheckboxCtx)
const isTaskItem = typeof className === 'string' && className.includes('task-list-item')
if (isTaskItem) {
if (ctx) {
const offset = node?.position?.start?.offset
if (offset === undefined) {
return <li className='flex items-start gap-2 break-words leading-[1.6]'>{children}</li>
}
return <li className='break-words leading-[1.6]'>{children}</li>
},
input: ({ type, checked, ...props }: any) => {
if (type !== 'checkbox') return <input type={type} checked={checked} {...props} />
const index = checkboxCounterRef.current++
const before = ctx.contentRef.current.slice(0, offset)
const prior = before.match(/^(\s*(?:[-*+]|\d+[.)]) +)\[([ xX])\]/gm)
return (
<Checkbox
checked={checked ?? false}
onCheckedChange={
isInteractive
? (newChecked) => onCheckboxToggle!(index, Boolean(newChecked))
: undefined
}
disabled={!isInteractive}
size='sm'
className='mt-1 shrink-0'
/>
<CheckboxIndexCtx.Provider value={prior ? prior.length : 0}>
<li className='flex items-start gap-2 break-words leading-[1.6]'>{children}</li>
</CheckboxIndexCtx.Provider>
)
},
}
return <li className='flex items-start gap-2 break-words leading-[1.6]'>{children}</li>
}
return <li className='break-words leading-[1.6]'>{children}</li>
}
function InputRenderer({
type,
checked,
node: _node,
...props
}: React.ComponentPropsWithoutRef<'input'> & ExtraProps) {
const ctx = useContext(MarkdownCheckboxCtx)
const index = useContext(CheckboxIndexCtx)
if (type !== 'checkbox') return <input type={type} checked={checked} {...props} />
const isInteractive = ctx !== null && index >= 0
return (
<Checkbox
checked={checked ?? false}
onCheckedChange={
isInteractive ? (newChecked) => ctx.onToggle(index, Boolean(newChecked)) : undefined
}
disabled={!isInteractive}
size='sm'
className='mt-1 shrink-0'
/>
)
}
const MARKDOWN_COMPONENTS = {
...STATIC_MARKDOWN_COMPONENTS,
ul: UlRenderer,
ol: OlRenderer,
li: LiRenderer,
input: InputRenderer,
} satisfies Components
const MarkdownPreview = memo(function MarkdownPreview({
content,
isStreaming = false,
@@ -238,32 +287,33 @@ const MarkdownPreview = memo(function MarkdownPreview({
const { ref: scrollRef } = useAutoScroll(isStreaming)
const { committed, incoming, generation } = useStreamingReveal(content, isStreaming)
const checkboxCounterRef = useRef(0)
const contentRef = useRef(content)
contentRef.current = content
const components = useMemo(
() => buildMarkdownComponents(checkboxCounterRef, onCheckboxToggle),
const ctxValue = useMemo(
() => (onCheckboxToggle ? { contentRef, onToggle: onCheckboxToggle } : null),
[onCheckboxToggle]
)
checkboxCounterRef.current = 0
const committedMarkdown = useMemo(
() =>
committed ? (
<ReactMarkdown remarkPlugins={REMARK_PLUGINS} components={components}>
<ReactMarkdown remarkPlugins={REMARK_PLUGINS} components={MARKDOWN_COMPONENTS}>
{committed}
</ReactMarkdown>
) : null,
[committed, components]
[committed]
)
if (onCheckboxToggle) {
return (
<div ref={scrollRef} className='h-full overflow-auto p-6'>
<ReactMarkdown remarkPlugins={REMARK_PLUGINS} components={components}>
{content}
</ReactMarkdown>
</div>
<MarkdownCheckboxCtx.Provider value={ctxValue}>
<div ref={scrollRef} className='h-full overflow-auto p-6'>
<ReactMarkdown remarkPlugins={REMARK_PLUGINS} components={MARKDOWN_COMPONENTS}>
{content}
</ReactMarkdown>
</div>
</MarkdownCheckboxCtx.Provider>
)
}
@@ -275,7 +325,7 @@ const MarkdownPreview = memo(function MarkdownPreview({
key={generation}
className={cn(isStreaming && 'animate-stream-fade-in', '[&>:first-child]:mt-0')}
>
<ReactMarkdown remarkPlugins={REMARK_PLUGINS} components={components}>
<ReactMarkdown remarkPlugins={REMARK_PLUGINS} components={MARKDOWN_COMPONENTS}>
{incoming}
</ReactMarkdown>
</div>
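The interesting move in `LiRenderer` is deriving a stable checkbox index without a render-order counter: it slices the markdown source up to the node's start offset and counts prior task-list markers with a regex. A standalone sketch of that counting step (regex copied from the diff):

```typescript
// Matches a task-list checkbox at the start of a line: a bullet or ordered
// marker ("-", "*", "+", "1.", "1)") followed by "[ ]", "[x]", or "[X]".
const TASK_MARKER = /^(\s*(?:[-*+]|\d+[.)]) +)\[([ xX])\]/gm

// Number of checkboxes appearing before `offset` in the source — i.e. the
// zero-based index of the checkbox whose list item starts at `offset`.
function checkboxIndexAt(markdown: string, offset: number): number {
  const prior = markdown.slice(0, offset).match(TASK_MARKER)
  return prior ? prior.length : 0
}

const sample = '- [x] first\n- [ ] second\n- [ ] third\n'
console.log(checkboxIndexAt(sample, sample.indexOf('- [ ] second'))) // 1
console.log(checkboxIndexAt(sample, sample.indexOf('- [ ] third'))) // 2
```

Because the index comes from the source text rather than a mutable counter reset on each render, memoized subtrees that skip re-rendering cannot desynchronize it — which is presumably why the `checkboxCounterRef` approach was dropped.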

View File

@@ -6,6 +6,8 @@ import { useParams, useRouter } from 'next/navigation'
import {
Button,
Columns2,
Combobox,
type ComboboxOption,
Download,
DropdownMenu,
DropdownMenuContent,
@@ -31,17 +33,22 @@ import {
formatFileSize,
getFileExtension,
getMimeTypeFromExtension,
isAudioFileType,
isVideoFileType,
} from '@/lib/uploads/utils/file-utils'
import {
isSupportedExtension,
SUPPORTED_AUDIO_EXTENSIONS,
SUPPORTED_DOCUMENT_EXTENSIONS,
SUPPORTED_VIDEO_EXTENSIONS,
} from '@/lib/uploads/utils/validation'
import type {
FilterTag,
HeaderAction,
ResourceColumn,
ResourceRow,
SearchConfig,
SortConfig,
} from '@/app/workspace/[workspaceId]/components'
import {
InlineRenameInput,
@@ -66,6 +73,7 @@ import {
useUploadWorkspaceFile,
useWorkspaceFiles,
} from '@/hooks/queries/workspace-files'
import { useDebounce } from '@/hooks/use-debounce'
import { useInlineRename } from '@/hooks/use-inline-rename'
type SaveStatus = 'idle' | 'saving' | 'saved' | 'error'
@@ -86,7 +94,6 @@ const COLUMNS: ResourceColumn[] = [
{ id: 'type', header: 'Type' },
{ id: 'created', header: 'Created' },
{ id: 'owner', header: 'Owner' },
{ id: 'updated', header: 'Last Updated' },
]
const MIME_TYPE_LABELS: Record<string, string> = {
@@ -161,16 +168,14 @@ export function Files() {
const [uploading, setUploading] = useState(false)
const [uploadProgress, setUploadProgress] = useState({ completed: 0, total: 0 })
const [inputValue, setInputValue] = useState('')
const [debouncedSearchTerm, setDebouncedSearchTerm] = useState('')
const searchTimerRef = useRef<ReturnType<typeof setTimeout>>(null)
const handleSearchChange = useCallback((value: string) => {
setInputValue(value)
if (searchTimerRef.current) clearTimeout(searchTimerRef.current)
searchTimerRef.current = setTimeout(() => {
setDebouncedSearchTerm(value)
}, 200)
}, [])
const debouncedSearchTerm = useDebounce(inputValue, 200)
const [activeSort, setActiveSort] = useState<{
column: string
direction: 'asc' | 'desc'
} | null>(null)
const [typeFilter, setTypeFilter] = useState<string[]>([])
const [sizeFilter, setSizeFilter] = useState<string[]>([])
const [uploadedByFilter, setUploadedByFilter] = useState<string[]>([])
const [creatingFile, setCreatingFile] = useState(false)
const [isDirty, setIsDirty] = useState(false)
@@ -206,10 +211,60 @@ export function Files() {
selectedFileRef.current = selectedFile
const filteredFiles = useMemo(() => {
if (!debouncedSearchTerm) return files
const q = debouncedSearchTerm.toLowerCase()
return files.filter((f) => f.name.toLowerCase().includes(q))
}, [files, debouncedSearchTerm])
let result = debouncedSearchTerm
? files.filter((f) => f.name.toLowerCase().includes(debouncedSearchTerm.toLowerCase()))
: files
if (typeFilter.length > 0) {
result = result.filter((f) => {
const ext = getFileExtension(f.name)
if (typeFilter.includes('document') && isSupportedExtension(ext)) return true
if (typeFilter.includes('audio') && isAudioFileType(f.type)) return true
if (typeFilter.includes('video') && isVideoFileType(f.type)) return true
return false
})
}
if (sizeFilter.length > 0) {
result = result.filter((f) => {
if (sizeFilter.includes('small') && f.size < 1_048_576) return true
if (sizeFilter.includes('medium') && f.size >= 1_048_576 && f.size <= 10_485_760)
return true
if (sizeFilter.includes('large') && f.size > 10_485_760) return true
return false
})
}
if (uploadedByFilter.length > 0) {
result = result.filter((f) => uploadedByFilter.includes(f.uploadedBy))
}
const col = activeSort?.column ?? 'created'
const dir = activeSort?.direction ?? 'desc'
return [...result].sort((a, b) => {
let cmp = 0
switch (col) {
case 'name':
cmp = a.name.localeCompare(b.name)
break
case 'size':
cmp = a.size - b.size
break
case 'type':
cmp = formatFileType(a.type, a.name).localeCompare(formatFileType(b.type, b.name))
break
case 'created':
cmp = new Date(a.uploadedAt).getTime() - new Date(b.uploadedAt).getTime()
break
case 'owner':
cmp = (members?.find((m) => m.userId === a.uploadedBy)?.name ?? '').localeCompare(
members?.find((m) => m.userId === b.uploadedBy)?.name ?? ''
)
break
}
return dir === 'asc' ? cmp : -cmp
})
}, [files, debouncedSearchTerm, typeFilter, sizeFilter, uploadedByFilter, activeSort, members])
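The comparator above computes one ascending `cmp` per column and flips the sign for descending order, instead of duplicating each branch. A reduced sketch with a trimmed field set (`FileLike` is illustrative, not the real record type):

```typescript
interface FileLike {
  name: string
  size: number
  uploadedAt: string
}

type Dir = 'asc' | 'desc'

// Mirrors the switch in filteredFiles: compute an ascending comparison
// once, then negate it for descending order.
function compareFiles(a: FileLike, b: FileLike, col: 'name' | 'size' | 'created', dir: Dir): number {
  let cmp = 0
  switch (col) {
    case 'name':
      cmp = a.name.localeCompare(b.name)
      break
    case 'size':
      cmp = a.size - b.size
      break
    case 'created':
      cmp = new Date(a.uploadedAt).getTime() - new Date(b.uploadedAt).getTime()
      break
  }
  return dir === 'asc' ? cmp : -cmp
}

const sampleFiles: FileLike[] = [
  { name: 'b.txt', size: 200, uploadedAt: '2026-03-02T00:00:00Z' },
  { name: 'a.txt', size: 100, uploadedAt: '2026-03-01T00:00:00Z' },
]
const byNameAsc = [...sampleFiles].sort((a, b) => compareFiles(a, b, 'name', 'asc'))
console.log(byNameAsc.map((f) => f.name)) // a.txt first
```

Spreading into `[...result].sort(...)` as the diff does keeps the memoized source array unmutated, since `Array.prototype.sort` sorts in place.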
const rowCacheRef = useRef(
new Map<string, { row: ResourceRow; file: WorkspaceFileRecord; members: typeof members }>()
@@ -245,12 +300,6 @@ export function Files() {
},
created: timeCell(file.uploadedAt),
owner: ownerCell(file.uploadedBy, members),
updated: timeCell(file.uploadedAt),
},
sortValues: {
size: file.size,
created: -new Date(file.uploadedAt).getTime(),
updated: -new Date(file.uploadedAt).getTime(),
},
}
nextCache.set(file.id, { row, file, members })
@@ -342,7 +391,7 @@ export function Files() {
}
}
},
[workspaceId]
[workspaceId, uploadFile]
)
const handleDownload = useCallback(async (file: WorkspaceFileRecord) => {
@@ -690,7 +739,6 @@ export function Files() {
handleDeleteSelected,
])
/** Stable refs for values used in callbacks to avoid dependency churn */
const listRenameRef = useRef(listRename)
listRenameRef.current = listRename
const headerRenameRef = useRef(headerRename)
@@ -711,18 +759,14 @@ export function Files() {
const canEdit = userPermissions.canEdit === true
const handleSearchClearAll = useCallback(() => {
handleSearchChange('')
}, [handleSearchChange])
const searchConfig: SearchConfig = useMemo(
() => ({
value: inputValue,
onChange: handleSearchChange,
onClearAll: handleSearchClearAll,
onChange: setInputValue,
onClearAll: () => setInputValue(''),
placeholder: 'Search files...',
}),
[inputValue, handleSearchChange, handleSearchClearAll]
[inputValue]
)
const createConfig = useMemo(
@@ -764,6 +808,205 @@ export function Files() {
[handleNavigateToFiles]
)
const typeDisplayLabel = useMemo(() => {
if (typeFilter.length === 0) return 'All'
if (typeFilter.length === 1) {
const labels: Record<string, string> = {
document: 'Documents',
audio: 'Audio',
video: 'Video',
}
return labels[typeFilter[0]] ?? typeFilter[0]
}
return `${typeFilter.length} selected`
}, [typeFilter])
const sizeDisplayLabel = useMemo(() => {
if (sizeFilter.length === 0) return 'All'
if (sizeFilter.length === 1) {
const labels: Record<string, string> = { small: 'Small', medium: 'Medium', large: 'Large' }
return labels[sizeFilter[0]] ?? sizeFilter[0]
}
return `${sizeFilter.length} selected`
}, [sizeFilter])
const uploadedByDisplayLabel = useMemo(() => {
if (uploadedByFilter.length === 0) return 'All'
if (uploadedByFilter.length === 1)
return members?.find((m) => m.userId === uploadedByFilter[0])?.name ?? '1 member'
return `${uploadedByFilter.length} members`
}, [uploadedByFilter, members])
const memberOptions: ComboboxOption[] = useMemo(
() =>
(members ?? []).map((m) => ({
value: m.userId,
label: m.name,
iconElement: m.image ? (
<img
src={m.image}
alt={m.name}
referrerPolicy='no-referrer'
className='h-[14px] w-[14px] rounded-full border border-[var(--border)] object-cover'
/>
) : (
<span className='flex h-[14px] w-[14px] items-center justify-center rounded-full border border-[var(--border)] bg-[var(--surface-3)] font-medium text-[8px] text-[var(--text-secondary)]'>
{m.name.charAt(0).toUpperCase()}
</span>
),
})),
[members]
)
const sortConfig: SortConfig = useMemo(
() => ({
options: [
{ id: 'name', label: 'Name' },
{ id: 'size', label: 'Size' },
{ id: 'type', label: 'Type' },
{ id: 'created', label: 'Created' },
{ id: 'owner', label: 'Owner' },
],
active: activeSort,
onSort: (column, direction) => setActiveSort({ column, direction }),
onClear: () => setActiveSort(null),
}),
[activeSort]
)
const hasActiveFilters =
typeFilter.length > 0 || sizeFilter.length > 0 || uploadedByFilter.length > 0
const filterContent = useMemo(
() => (
<div className='flex w-[240px] flex-col gap-3 p-3'>
<div className='flex flex-col gap-1.5'>
<span className='font-medium text-[var(--text-secondary)] text-caption'>File Type</span>
<Combobox
options={[
{ value: 'document', label: 'Documents' },
{ value: 'audio', label: 'Audio' },
{ value: 'video', label: 'Video' },
]}
multiSelect
multiSelectValues={typeFilter}
onMultiSelectChange={setTypeFilter}
overlayContent={
<span className='truncate text-[var(--text-primary)]'>{typeDisplayLabel}</span>
}
showAllOption
allOptionLabel='All'
size='sm'
className='h-[32px] w-full rounded-md'
/>
</div>
<div className='flex flex-col gap-1.5'>
<span className='font-medium text-[var(--text-secondary)] text-caption'>Size</span>
<Combobox
options={[
{ value: 'small', label: 'Small (< 1 MB)' },
{ value: 'medium', label: 'Medium (1–10 MB)' },
{ value: 'large', label: 'Large (> 10 MB)' },
]}
multiSelect
multiSelectValues={sizeFilter}
onMultiSelectChange={setSizeFilter}
overlayContent={
<span className='truncate text-[var(--text-primary)]'>{sizeDisplayLabel}</span>
}
showAllOption
allOptionLabel='All'
size='sm'
className='h-[32px] w-full rounded-md'
/>
</div>
{memberOptions.length > 0 && (
<div className='flex flex-col gap-1.5'>
<span className='font-medium text-[var(--text-secondary)] text-caption'>
Uploaded By
</span>
<Combobox
options={memberOptions}
multiSelect
multiSelectValues={uploadedByFilter}
onMultiSelectChange={setUploadedByFilter}
overlayContent={
<span className='truncate text-[var(--text-primary)]'>
{uploadedByDisplayLabel}
</span>
}
searchable
searchPlaceholder='Search members...'
showAllOption
allOptionLabel='All'
size='sm'
className='h-[32px] w-full rounded-md'
/>
</div>
)}
{hasActiveFilters && (
<button
type='button'
onClick={() => {
setTypeFilter([])
setSizeFilter([])
setUploadedByFilter([])
}}
className='flex h-[32px] w-full items-center justify-center rounded-md text-[var(--text-secondary)] text-caption transition-colors hover-hover:bg-[var(--surface-active)]'
>
Clear all filters
</button>
)}
</div>
),
[
typeFilter,
sizeFilter,
uploadedByFilter,
memberOptions,
typeDisplayLabel,
sizeDisplayLabel,
uploadedByDisplayLabel,
hasActiveFilters,
]
)
const filterTags: FilterTag[] = useMemo(() => {
const tags: FilterTag[] = []
if (typeFilter.length > 0) {
const typeLabels: Record<string, string> = {
document: 'Documents',
audio: 'Audio',
video: 'Video',
}
const label =
typeFilter.length === 1
? `Type: ${typeLabels[typeFilter[0]]}`
: `Type: ${typeFilter.length} selected`
tags.push({ label, onRemove: () => setTypeFilter([]) })
}
if (sizeFilter.length > 0) {
const sizeLabels: Record<string, string> = {
small: 'Small',
medium: 'Medium',
large: 'Large',
}
const label =
sizeFilter.length === 1
? `Size: ${sizeLabels[sizeFilter[0]]}`
: `Size: ${sizeFilter.length} selected`
tags.push({ label, onRemove: () => setSizeFilter([]) })
}
if (uploadedByFilter.length > 0) {
const label =
uploadedByFilter.length === 1
? `Uploaded by: ${members?.find((m) => m.userId === uploadedByFilter[0])?.name ?? '1 member'}`
: `Uploaded by: ${uploadedByFilter.length} members`
tags.push({ label, onRemove: () => setUploadedByFilter([]) })
}
return tags
}, [typeFilter, sizeFilter, uploadedByFilter, members])
if (fileIdFromRoute && !selectedFile) {
return (
<div className='flex h-full flex-1 flex-col overflow-hidden bg-[var(--bg)]'>
@@ -834,7 +1077,9 @@ export function Files() {
title='Files'
create={createConfig}
search={searchConfig}
defaultSort='created'
sort={sortConfig}
filter={filterContent}
filterTags={filterTags}
headerActions={headerActionsConfig}
columns={COLUMNS}
rows={rows}
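The size-filter thresholds in this file are binary megabytes: 1_048_576 bytes (1 MiB) and 10_485_760 bytes (10 MiB), with the medium bucket inclusive on both ends. A standalone sketch of that bucketing:

```typescript
type SizeBucket = 'small' | 'medium' | 'large'

const MIB = 1_048_576 // the filter uses binary megabytes

// Mirrors the sizeFilter predicate above:
// small < 1 MiB, medium 1–10 MiB inclusive, large > 10 MiB.
function sizeBucket(bytes: number): SizeBucket {
  if (bytes < MIB) return 'small'
  if (bytes <= 10 * MIB) return 'medium'
  return 'large'
}

console.log(sizeBucket(512_000)) // "small"
console.log(sizeBucket(5 * MIB)) // "medium"
```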

View File

@@ -91,7 +91,9 @@ export function MothershipChat({
}: MothershipChatProps) {
const styles = LAYOUT_STYLES[layout]
const isStreamActive = isSending || isReconnecting
const { ref: scrollContainerRef, scrollToBottom } = useAutoScroll(isStreamActive)
const { ref: scrollContainerRef, scrollToBottom } = useAutoScroll(isStreamActive, {
scrollOnMount: true,
})
const hasMessages = messages.length > 0
const initialScrollDoneRef = useRef(false)

View File

@@ -571,19 +571,19 @@ export function UserInput({
const items = e.clipboardData?.items
if (!items) return
const imageFiles: File[] = []
const pastedFiles: File[] = []
for (const item of Array.from(items)) {
if (item.kind === 'file' && item.type.startsWith('image/')) {
if (item.kind === 'file') {
const file = item.getAsFile()
if (file) imageFiles.push(file)
if (file) pastedFiles.push(file)
}
}
if (imageFiles.length === 0) return
if (pastedFiles.length === 0) return
e.preventDefault()
const dt = new DataTransfer()
for (const file of imageFiles) {
for (const file of pastedFiles) {
dt.items.add(file)
}
filesRef.current.processFiles(dt.files)

View File

@@ -1737,6 +1737,8 @@ export function useChat(
}
if (options?.error) {
pendingRecoveryMessageRef.current = null
setPendingRecoveryMessage(null)
setMessageQueue([])
return
}

View File

@@ -7,6 +7,7 @@ import { useParams, useRouter, useSearchParams } from 'next/navigation'
import {
Badge,
Button,
Combobox,
Modal,
ModalBody,
ModalContent,
@@ -15,7 +16,6 @@ import {
Trash,
} from '@/components/emcn'
import { SearchHighlight } from '@/components/ui/search-highlight'
import { cn } from '@/lib/core/utils/cn'
import type { ChunkData } from '@/lib/knowledge/types'
import { formatTokenCount } from '@/lib/tokenization'
import type {
@@ -27,6 +27,7 @@ import type {
ResourceRow,
SearchConfig,
SelectableConfig,
SortConfig,
} from '@/app/workspace/[workspaceId]/components'
import { Resource, ResourceHeader } from '@/app/workspace/[workspaceId]/components'
import {
@@ -152,7 +153,16 @@ export function Document({
   const [searchQuery, setSearchQuery] = useState('')
   const [debouncedSearchQuery, setDebouncedSearchQuery] = useState('')
-  const [enabledFilter, setEnabledFilter] = useState<'all' | 'enabled' | 'disabled'>('all')
+  const [enabledFilter, setEnabledFilter] = useState<string[]>([])
+  const [activeSort, setActiveSort] = useState<{
+    column: string
+    direction: 'asc' | 'desc'
+  } | null>(null)
+  const enabledFilterParam = useMemo(
+    () => (enabledFilter.length === 1 ? (enabledFilter[0] as 'enabled' | 'disabled') : 'all'),
+    [enabledFilter]
+  )
   const {
     chunks: initialChunks,
@@ -165,7 +175,21 @@
     refreshChunks: initialRefreshChunks,
     updateChunk: initialUpdateChunk,
     isFetching: isFetchingChunks,
-  } = useDocumentChunks(knowledgeBaseId, documentId, currentPageFromURL, '', enabledFilter)
+  } = useDocumentChunks(
+    knowledgeBaseId,
+    documentId,
+    currentPageFromURL,
+    '',
+    enabledFilterParam,
+    activeSort?.column === 'tokens'
+      ? 'tokenCount'
+      : activeSort?.column === 'status'
+        ? 'enabled'
+        : activeSort?.column === 'index'
+          ? 'chunkIndex'
+          : undefined,
+    activeSort?.direction
+  )

   const { data: searchResults = [], error: searchQueryError } = useDocumentChunkSearchQuery(
     {
@@ -229,7 +253,10 @@ export function Document({
     searchStartIndex + SEARCH_PAGE_SIZE
   )
-  const displayChunks = showingSearch ? paginatedSearchResults : initialChunks
+  const rawDisplayChunks = showingSearch ? paginatedSearchResults : initialChunks
+  const displayChunks = rawDisplayChunks ?? []
   const currentPage = showingSearch ? searchCurrentPage : initialPage
   const totalPages = showingSearch ? searchTotalPages : initialTotalPages
   const hasNextPage = showingSearch ? searchCurrentPage < searchTotalPages : initialHasNextPage
@@ -562,47 +589,68 @@ export function Document({
       }
     : undefined

-  const filterContent = (
-    <div className='w-[200px]'>
-      <div className='border-[var(--border-1)] border-b px-3 py-2'>
-        <span className='font-medium text-[var(--text-secondary)] text-caption'>Status</span>
-      </div>
-      <div className='flex flex-col gap-0.5 px-3 py-2'>
-        {(['all', 'enabled', 'disabled'] as const).map((value) => (
-          <button
-            key={value}
-            type='button'
-            className={cn(
-              'flex w-full cursor-pointer select-none items-center rounded-[5px] px-2 py-[5px] font-medium text-[var(--text-secondary)] text-caption outline-none transition-colors hover-hover:bg-[var(--surface-active)]',
-              enabledFilter === value && 'bg-[var(--surface-active)]'
-            )}
-            onClick={() => {
-              setEnabledFilter(value)
-              setSelectedChunks(new Set())
-              void goToPage(1)
-            }}
-          >
-            {value.charAt(0).toUpperCase() + value.slice(1)}
-          </button>
-        ))}
-      </div>
-    </div>
-  )
-
-  const filterTags: FilterTag[] = [
-    ...(enabledFilter !== 'all'
-      ? [
-          {
-            label: `Status: ${enabledFilter === 'enabled' ? 'Enabled' : 'Disabled'}`,
-            onRemove: () => {
-              setEnabledFilter('all')
-              setSelectedChunks(new Set())
-              void goToPage(1)
-            },
-          },
-        ]
-      : []),
-  ]
+  const enabledDisplayLabel = useMemo(() => {
+    if (enabledFilter.length === 0) return 'All'
+    if (enabledFilter.length === 1) return enabledFilter[0] === 'enabled' ? 'Enabled' : 'Disabled'
+    return `${enabledFilter.length} selected`
+  }, [enabledFilter])
+
+  const filterContent = useMemo(
+    () => (
+      <div className='flex w-[240px] flex-col gap-3 p-3'>
+        <div className='flex flex-col gap-1.5'>
+          <span className='font-medium text-[var(--text-secondary)] text-caption'>Status</span>
+          <Combobox
+            options={[
+              { value: 'enabled', label: 'Enabled' },
+              { value: 'disabled', label: 'Disabled' },
+            ]}
+            multiSelect
+            multiSelectValues={enabledFilter}
+            onMultiSelectChange={(values) => {
+              setEnabledFilter(values)
+              setSelectedChunks(new Set())
+              void goToPage(1)
+            }}
+            overlayContent={
+              <span className='truncate text-[var(--text-primary)]'>{enabledDisplayLabel}</span>
+            }
+            showAllOption
+            allOptionLabel='All'
+            size='sm'
+            className='h-[32px] w-full rounded-md'
+          />
+        </div>
+        {enabledFilter.length > 0 && (
+          <button
+            type='button'
+            onClick={() => {
+              setEnabledFilter([])
+              setSelectedChunks(new Set())
+              void goToPage(1)
+            }}
+            className='flex h-[32px] w-full items-center justify-center rounded-md text-[var(--text-secondary)] text-caption transition-colors hover-hover:bg-[var(--surface-active)]'
+          >
+            Clear all filters
+          </button>
+        )}
+      </div>
+    ),
+    [enabledFilter, enabledDisplayLabel, goToPage]
+  )
+
+  const filterTags: FilterTag[] = useMemo(
+    () =>
+      enabledFilter.map((value) => ({
+        label: `Status: ${value === 'enabled' ? 'Enabled' : 'Disabled'}`,
+        onRemove: () => {
+          setEnabledFilter((prev) => prev.filter((v) => v !== value))
+          setSelectedChunks(new Set())
+          void goToPage(1)
+        },
+      })),
+    [enabledFilter, goToPage]
+  )

   const handleChunkClick = useCallback((rowId: string) => {
     setSelectedChunkId(rowId)
@@ -814,6 +862,26 @@ export function Document({
       }
     : undefined

+  const sortConfig: SortConfig = useMemo(
+    () => ({
+      options: [
+        { id: 'index', label: 'Index' },
+        { id: 'tokens', label: 'Tokens' },
+        { id: 'status', label: 'Status' },
+      ],
+      active: activeSort,
+      onSort: (column, direction) => {
+        setActiveSort({ column, direction })
+        void goToPage(1)
+      },
+      onClear: () => {
+        setActiveSort(null)
+        void goToPage(1)
+      },
+    }),
+    [activeSort, goToPage]
+  )
+
   const chunkRows: ResourceRow[] = useMemo(() => {
     if (!isCompleted) {
       return [
@@ -1100,6 +1168,7 @@ export function Document({
         emptyMessage={emptyMessage}
         filter={combinedError ? undefined : filterContent}
         filterTags={combinedError ? undefined : filterTags}
+        sort={combinedError ? undefined : sortConfig}
       />
       <DocumentTagsModal

View File
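The Document diff above maps a multi-select status filter down to the single-value parameter the chunks query expects (exactly one selection narrows; zero or both selections mean "all"). A minimal standalone sketch of that mapping, outside React — `deriveEnabledParam` is a hypothetical name, the real code inlines this in a `useMemo`:

```typescript
// Collapse multi-select filter values into the single-value query param.
// Exactly one recognized selection narrows the query; anything else
// (empty, or both 'enabled' and 'disabled' selected) is equivalent to 'all'.
type EnabledParam = 'all' | 'enabled' | 'disabled'

function deriveEnabledParam(selected: string[]): EnabledParam {
  if (selected.length === 1 && (selected[0] === 'enabled' || selected[0] === 'disabled')) {
    return selected[0]
  }
  return 'all'
}
```

This keeps the UI state (`string[]`) decoupled from the narrower API contract, so the Combobox can stay generic while the hook still receives a value it understands.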

@@ -208,7 +208,7 @@ export function KnowledgeBase({
   const [searchQuery, setSearchQuery] = useState('')
   const [showTagsModal, setShowTagsModal] = useState(false)
-  const [enabledFilter, setEnabledFilter] = useState<'all' | 'enabled' | 'disabled'>('all')
+  const [enabledFilter, setEnabledFilter] = useState<string[]>([])
   const [tagFilterEntries, setTagFilterEntries] = useState<
     {
       id: string
@@ -235,6 +235,17 @@ export function KnowledgeBase({
     [tagFilterEntries]
   )

+  const enabledFilterParam = useMemo<'all' | 'enabled' | 'disabled'>(() => {
+    if (enabledFilter.length === 1) return enabledFilter[0] as 'enabled' | 'disabled'
+    return 'all'
+  }, [enabledFilter])
+
+  const enabledDisplayLabel = useMemo(() => {
+    if (enabledFilter.length === 0) return 'All'
+    if (enabledFilter.length === 1) return enabledFilter[0] === 'enabled' ? 'Enabled' : 'Disabled'
+    return '2 selected'
+  }, [enabledFilter])
+
   const handleSearchChange = useCallback((newQuery: string) => {
     setSearchQuery(newQuery)
     setCurrentPage(1)
@@ -249,8 +260,10 @@ export function KnowledgeBase({
   const [showBulkDeleteModal, setShowBulkDeleteModal] = useState(false)
   const [showConnectorsModal, setShowConnectorsModal] = useState(false)
   const [currentPage, setCurrentPage] = useState(1)
-  const [sortBy, setSortBy] = useState<DocumentSortField>('uploadedAt')
-  const [sortOrder, setSortOrder] = useState<SortOrder>('desc')
+  const [activeSort, setActiveSort] = useState<{
+    column: string
+    direction: 'asc' | 'desc'
+  } | null>(null)
   const [contextMenuDocument, setContextMenuDocument] = useState<DocumentData | null>(null)
   const [showRenameModal, setShowRenameModal] = useState(false)
   const [documentToRename, setDocumentToRename] = useState<DocumentData | null>(null)
@@ -290,8 +303,8 @@ export function KnowledgeBase({
     search: searchQuery || undefined,
     limit: DOCUMENTS_PER_PAGE,
     offset: (currentPage - 1) * DOCUMENTS_PER_PAGE,
-    sortBy,
-    sortOrder,
+    sortBy: (activeSort?.column ?? 'uploadedAt') as DocumentSortField,
+    sortOrder: (activeSort?.direction ?? 'desc') as SortOrder,
     refetchInterval: (data) => {
       if (isDeleting) return false
       const hasPending = data?.documents?.some(
@@ -301,7 +314,7 @@ export function KnowledgeBase({
       if (hasSyncingConnectorsRef.current) return 5000
       return false
     },
-    enabledFilter,
+    enabledFilter: enabledFilterParam,
     tagFilters: activeTagFilters.length > 0 ? activeTagFilters : undefined,
   })
@@ -571,7 +584,7 @@ export function KnowledgeBase({
         knowledgeBaseId: id,
         operation: 'enable',
         selectAll: true,
-        enabledFilter,
+        enabledFilter: enabledFilterParam,
       },
       {
         onSuccess: (result) => {
@@ -618,7 +631,7 @@ export function KnowledgeBase({
         knowledgeBaseId: id,
         operation: 'disable',
         selectAll: true,
-        enabledFilter,
+        enabledFilter: enabledFilterParam,
       },
       {
         onSuccess: (result) => {
@@ -667,7 +680,7 @@ export function KnowledgeBase({
         knowledgeBaseId: id,
         operation: 'delete',
         selectAll: true,
-        enabledFilter,
+        enabledFilter: enabledFilterParam,
       },
       {
         onSuccess: (result) => {
@@ -707,12 +720,12 @@ export function KnowledgeBase({
   const selectedDocumentsList = documents.filter((doc) => selectedDocuments.has(doc.id))
   const enabledCount = isSelectAllMode
-    ? enabledFilter === 'disabled'
+    ? enabledFilterParam === 'disabled'
       ? 0
       : pagination.total
     : selectedDocumentsList.filter((doc) => doc.enabled).length
   const disabledCount = isSelectAllMode
-    ? enabledFilter === 'enabled'
+    ? enabledFilterParam === 'enabled'
       ? 0
       : pagination.total
     : selectedDocumentsList.filter((doc) => !doc.enabled).length
@@ -795,59 +808,83 @@ export function KnowledgeBase({
         : []),
   ]

-  const sortConfig: SortConfig = {
-    options: [
-      { id: 'filename', label: 'Name' },
-      { id: 'fileSize', label: 'Size' },
-      { id: 'tokenCount', label: 'Tokens' },
-      { id: 'chunkCount', label: 'Chunks' },
-      { id: 'uploadedAt', label: 'Uploaded' },
-      { id: 'enabled', label: 'Status' },
-    ],
-    active: { column: sortBy, direction: sortOrder },
-    onSort: (column, direction) => {
-      setSortBy(column as DocumentSortField)
-      setSortOrder(direction)
-      setCurrentPage(1)
-    },
-  }
+  const sortConfig: SortConfig = useMemo(
+    () => ({
+      options: [
+        { id: 'filename', label: 'Name' },
+        { id: 'fileSize', label: 'Size' },
+        { id: 'tokenCount', label: 'Tokens' },
+        { id: 'chunkCount', label: 'Chunks' },
+        { id: 'uploadedAt', label: 'Uploaded' },
+        { id: 'enabled', label: 'Status' },
+      ],
+      active: activeSort,
+      onSort: (column, direction) => {
+        setActiveSort({ column, direction })
+        setCurrentPage(1)
+      },
+      onClear: () => {
+        setActiveSort(null)
+        setCurrentPage(1)
+      },
+    }),
+    [activeSort]
+  )

-  const filterContent = (
-    <div className='w-[320px]'>
-      <div className='border-[var(--border-1)] border-b px-3 py-2'>
-        <span className='font-medium text-[var(--text-secondary)] text-caption'>Status</span>
-      </div>
-      <div className='flex flex-col gap-0.5 px-3 py-2'>
-        {(['all', 'enabled', 'disabled'] as const).map((value) => (
-          <button
-            key={value}
-            type='button'
-            className={cn(
-              'flex w-full cursor-pointer select-none items-center rounded-[5px] px-2 py-[5px] font-medium text-[var(--text-secondary)] text-caption outline-none transition-colors hover-hover:bg-[var(--surface-active)]',
-              enabledFilter === value && 'bg-[var(--surface-active)]'
-            )}
-            onClick={() => {
-              setEnabledFilter(value)
-              setCurrentPage(1)
-              setSelectedDocuments(new Set())
-              setIsSelectAllMode(false)
-            }}
-          >
-            {value.charAt(0).toUpperCase() + value.slice(1)}
-          </button>
-        ))}
-      </div>
-      <TagFilterSection
-        tagDefinitions={tagDefinitions}
-        entries={tagFilterEntries}
-        onChange={(entries) => {
-          setTagFilterEntries(entries)
-          setCurrentPage(1)
-          setSelectedDocuments(new Set())
-          setIsSelectAllMode(false)
-        }}
-      />
-    </div>
-  )
+  const filterContent = useMemo(
+    () => (
+      <div className='flex w-[240px] flex-col gap-3 p-3'>
+        <div className='flex flex-col gap-1.5'>
+          <span className='font-medium text-[var(--text-secondary)] text-caption'>Status</span>
+          <Combobox
+            options={[
+              { value: 'enabled', label: 'Enabled' },
+              { value: 'disabled', label: 'Disabled' },
+            ]}
+            multiSelect
+            multiSelectValues={enabledFilter}
+            onMultiSelectChange={(values) => {
+              setEnabledFilter(values)
+              setCurrentPage(1)
+              setSelectedDocuments(new Set())
+              setIsSelectAllMode(false)
+            }}
+            overlayContent={
+              <span className='truncate text-[var(--text-primary)]'>{enabledDisplayLabel}</span>
+            }
+            showAllOption
+            allOptionLabel='All'
+            size='sm'
+            className='h-[32px] w-full rounded-md'
+          />
+        </div>
+        {enabledFilter.length > 0 && (
+          <button
+            type='button'
+            onClick={() => {
+              setEnabledFilter([])
+              setCurrentPage(1)
+              setSelectedDocuments(new Set())
+              setIsSelectAllMode(false)
+            }}
+            className='flex h-[32px] w-full items-center justify-center rounded-md text-[var(--text-secondary)] text-caption transition-colors hover-hover:bg-[var(--surface-active)]'
+          >
+            Clear status filter
+          </button>
+        )}
+        <TagFilterSection
+          tagDefinitions={tagDefinitions}
+          entries={tagFilterEntries}
+          onChange={(entries) => {
+            setTagFilterEntries(entries)
+            setCurrentPage(1)
+            setSelectedDocuments(new Set())
+            setIsSelectAllMode(false)
+          }}
+        />
+      </div>
+    ),
+    [enabledFilter, enabledDisplayLabel, tagDefinitions, tagFilterEntries]
+  )
   const connectorBadges =
@@ -863,7 +900,11 @@ export function KnowledgeBase({
                 onClick={() => setShowConnectorsModal(true)}
                 className='flex shrink-0 cursor-pointer items-center gap-1.5 rounded-md px-2 py-1 text-[var(--text-secondary)] text-caption shadow-[inset_0_0_0_1px_var(--border)] transition-colors hover-hover:bg-[var(--surface-3)]'
               >
-                {ConnectorIcon && <ConnectorIcon className='h-[14px] w-[14px]' />}
+                {connector.status === 'syncing' ? (
+                  <Loader2 className='h-[14px] w-[14px] animate-spin' />
+                ) : (
+                  ConnectorIcon && <ConnectorIcon className='h-[14px] w-[14px]' />
+                )}
                 {def?.name || connector.connectorType}
               </button>
             )
@@ -871,33 +912,39 @@ export function KnowledgeBase({
       </>
     ) : null

-  const filterTags: FilterTag[] = [
-    ...(enabledFilter !== 'all'
-      ? [
-          {
-            label: `Status: ${enabledFilter === 'enabled' ? 'Enabled' : 'Disabled'}`,
-            onRemove: () => {
-              setEnabledFilter('all')
-              setCurrentPage(1)
-              setSelectedDocuments(new Set())
-              setIsSelectAllMode(false)
-            },
-          },
-        ]
-      : []),
-    ...tagFilterEntries
-      .filter((f) => f.tagSlot && f.value.trim())
-      .map((f) => ({
-        label: `${f.tagName}: ${f.value}`,
-        onRemove: () => {
-          const updated = tagFilterEntries.filter((e) => e.id !== f.id)
-          setTagFilterEntries(updated)
-          setCurrentPage(1)
-          setSelectedDocuments(new Set())
-          setIsSelectAllMode(false)
-        },
-      })),
-  ]
+  const filterTags: FilterTag[] = useMemo(
+    () => [
+      ...(enabledFilter.length > 0
+        ? [
+            {
+              label:
+                enabledFilter.length === 1
+                  ? `Status: ${enabledFilter[0] === 'enabled' ? 'Enabled' : 'Disabled'}`
+                  : 'Status: 2 selected',
+              onRemove: () => {
+                setEnabledFilter([])
+                setCurrentPage(1)
+                setSelectedDocuments(new Set())
+                setIsSelectAllMode(false)
+              },
+            },
+          ]
+        : []),
+      ...tagFilterEntries
+        .filter((f) => f.tagSlot && f.value.trim())
+        .map((f) => ({
+          label: `${f.tagName}: ${f.value}`,
+          onRemove: () => {
+            const updated = tagFilterEntries.filter((_, idx) => idx !== tagFilterEntries.indexOf(f))
+            setTagFilterEntries(updated)
+            setCurrentPage(1)
+            setSelectedDocuments(new Set())
+            setIsSelectAllMode(false)
+          },
+        })),
+    ],
+    [enabledFilter, tagFilterEntries]
+  )

   const selectableConfig: SelectableConfig = {
     selectedIds: selectedDocuments,
@@ -922,7 +969,7 @@ export function KnowledgeBase({
       content: (
         <Tooltip.Root>
           <Tooltip.Trigger asChild>
-            <div style={{ cursor: 'help' }}>{getStatusBadge(doc)}</div>
+            <div className='cursor-help'>{getStatusBadge(doc)}</div>
           </Tooltip.Trigger>
           <Tooltip.Content side='top' className='max-w-xs'>
             {doc.processingError}
@@ -1019,7 +1066,7 @@ export function KnowledgeBase({
   const emptyMessage = searchQuery
     ? 'No documents found'
-    : enabledFilter !== 'all' || activeTagFilters.length > 0
+    : enabledFilter.length > 0 || activeTagFilters.length > 0
       ? 'Nothing matches your filter'
       : undefined

View File
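The KnowledgeBase diff above derives removable filter "tags" from the active multi-select status values. The labeling rule is small enough to sketch as a pure function — shapes are trimmed, `statusTags` is a hypothetical name, and the `onRemove` state updates are omitted:

```typescript
// Build the status portion of a FilterTag list from multi-select values:
// no selection means no tag; one selection names it; both collapse to a count.
interface FilterTagLabel {
  label: string
}

function statusTags(enabledFilter: string[]): FilterTagLabel[] {
  if (enabledFilter.length === 0) return []
  const label =
    enabledFilter.length === 1
      ? `Status: ${enabledFilter[0] === 'enabled' ? 'Enabled' : 'Disabled'}`
      : 'Status: 2 selected'
  return [{ label }]
}
```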

@@ -19,21 +19,17 @@ import {
   ModalHeader,
   Tooltip,
 } from '@/components/emcn'
-import { useSession } from '@/lib/auth/auth-client'
-import { consumeOAuthReturnContext, writeOAuthReturnContext } from '@/lib/credentials/client-state'
-import {
-  getCanonicalScopesForProvider,
-  getProviderIdFromServiceId,
-  type OAuthProvider,
-} from '@/lib/oauth'
+import { consumeOAuthReturnContext } from '@/lib/credentials/client-state'
+import { getProviderIdFromServiceId, type OAuthProvider } from '@/lib/oauth'
 import { ConnectorSelectorField } from '@/app/workspace/[workspaceId]/knowledge/[id]/components/add-connector-modal/components/connector-selector-field'
-import { OAuthRequiredModal } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/components/credential-selector/components/oauth-required-modal'
+import { ConnectCredentialModal } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/components/credential-selector/components/connect-credential-modal'
 import { getDependsOnFields } from '@/blocks/utils'
 import { CONNECTOR_REGISTRY } from '@/connectors/registry'
 import type { ConnectorConfig, ConnectorConfigField } from '@/connectors/types'
 import { useCreateConnector } from '@/hooks/queries/kb/connectors'
 import { useOAuthCredentials } from '@/hooks/queries/oauth/oauth-credentials'
-import type { SelectorKey } from '@/hooks/selectors/types'
+import { useCredentialRefreshTriggers } from '@/hooks/use-credential-refresh-triggers'

 const SYNC_INTERVALS = [
   { label: 'Every hour', value: 60 },
@@ -69,7 +65,6 @@ export function AddConnectorModal({ open, onOpenChange, knowledgeBaseId }: AddCo
   const [searchTerm, setSearchTerm] = useState('')
   const { workspaceId } = useParams<{ workspaceId: string }>()
-  const { data: session } = useSession()
   const { mutate: createConnector, isPending: isCreating } = useCreateConnector()

   const connectorConfig = selectedType ? CONNECTOR_REGISTRY[selectedType] : null
@@ -82,10 +77,16 @@ export function AddConnectorModal({ open, onOpenChange, knowledgeBaseId }: AddCo
     [connectorConfig]
   )

-  const { data: credentials = [], isLoading: credentialsLoading } = useOAuthCredentials(
-    connectorProviderId ?? undefined,
-    { enabled: Boolean(connectorConfig) && !isApiKeyMode, workspaceId }
-  )
+  const {
+    data: credentials = [],
+    isLoading: credentialsLoading,
+    refetch: refetchCredentials,
+  } = useOAuthCredentials(connectorProviderId ?? undefined, {
+    enabled: Boolean(connectorConfig) && !isApiKeyMode,
+    workspaceId,
+  })
+  useCredentialRefreshTriggers(refetchCredentials, connectorProviderId ?? '', workspaceId)

   const effectiveCredentialId =
     selectedCredentialId ?? (credentials.length === 1 ? credentials[0].id : null)
@@ -263,51 +264,9 @@ export function AddConnectorModal({ open, onOpenChange, knowledgeBaseId }: AddCo
     )
   }

-  const handleConnectNewAccount = useCallback(async () => {
-    if (!connectorConfig || !connectorProviderId || !workspaceId) return
-
-    const userName = session?.user?.name
-    const integrationName = connectorConfig.name
-    const displayName = userName ? `${userName}'s ${integrationName}` : integrationName
-
-    try {
-      const res = await fetch('/api/credentials/draft', {
-        method: 'POST',
-        headers: { 'Content-Type': 'application/json' },
-        body: JSON.stringify({
-          workspaceId,
-          providerId: connectorProviderId,
-          displayName,
-        }),
-      })
-      if (!res.ok) {
-        setError('Failed to prepare credential. Please try again.')
-        return
-      }
-    } catch {
-      setError('Failed to prepare credential. Please try again.')
-      return
-    }
-
-    writeOAuthReturnContext({
-      origin: 'kb-connectors',
-      knowledgeBaseId,
-      displayName,
-      providerId: connectorProviderId,
-      preCount: credentials.length,
-      workspaceId,
-      requestedAt: Date.now(),
-    })
-
-    setShowOAuthModal(true)
-  }, [
-    connectorConfig,
-    connectorProviderId,
-    workspaceId,
-    session?.user?.name,
-    knowledgeBaseId,
-    credentials.length,
-  ])
+  const handleConnectNewAccount = useCallback(() => {
+    setShowOAuthModal(true)
+  }, [])

   const filteredEntries = useMemo(() => {
     const term = searchTerm.toLowerCase().trim()
@@ -396,40 +355,40 @@ export function AddConnectorModal({ open, onOpenChange, knowledgeBaseId }: AddCo
           ) : (
             <div className='flex flex-col gap-2'>
               <Label>Account</Label>
-              {credentialsLoading ? (
-                <div className='flex items-center gap-2 text-[var(--text-muted)] text-small'>
-                  <Loader2 className='h-4 w-4 animate-spin' />
-                  Loading credentials...
-                </div>
-              ) : (
-                <Combobox
-                  size='sm'
-                  options={[
-                    ...credentials.map(
-                      (cred): ComboboxOption => ({
-                        label: cred.name || cred.provider,
-                        value: cred.id,
-                        icon: connectorConfig.icon,
-                      })
-                    ),
-                    {
-                      label: 'Connect new account',
-                      value: '__connect_new__',
-                      icon: Plus,
-                      onSelect: () => {
-                        void handleConnectNewAccount()
-                      },
-                    },
-                  ]}
-                  value={effectiveCredentialId ?? undefined}
-                  onChange={(value) => setSelectedCredentialId(value)}
-                  placeholder={
-                    credentials.length === 0
-                      ? `No ${connectorConfig.name} accounts`
-                      : 'Select account'
-                  }
-                />
-              )}
+              <Combobox
+                size='sm'
+                options={[
+                  ...credentials.map(
+                    (cred): ComboboxOption => ({
+                      label: cred.name || cred.provider,
+                      value: cred.id,
+                      icon: connectorConfig.icon,
+                    })
+                  ),
+                  {
+                    label:
+                      credentials.length > 0
+                        ? `Connect another ${connectorConfig.name} account`
+                        : `Connect ${connectorConfig.name} account`,
+                    value: '__connect_new__',
+                    icon: Plus,
+                    onSelect: () => {
+                      void handleConnectNewAccount()
+                    },
+                  },
+                ]}
+                value={effectiveCredentialId ?? undefined}
+                onChange={(value) => setSelectedCredentialId(value)}
+                onOpenChange={(isOpen) => {
+                  if (isOpen) void refetchCredentials()
+                }}
+                placeholder={
+                  credentials.length === 0
+                    ? `No ${connectorConfig.name} accounts`
+                    : 'Select account'
+                }
+                isLoading={credentialsLoading}
+              />
             </div>
           )}
@@ -590,20 +549,23 @@ export function AddConnectorModal({ open, onOpenChange, knowledgeBaseId }: AddCo
         )}
       </ModalContent>
     </Modal>
-      {connectorConfig && connectorConfig.auth.mode === 'oauth' && connectorProviderId && (
-        <OAuthRequiredModal
-          isOpen={showOAuthModal}
-          onClose={() => {
-            consumeOAuthReturnContext()
-            setShowOAuthModal(false)
-          }}
-          provider={connectorProviderId}
-          toolName={connectorConfig.name}
-          requiredScopes={getCanonicalScopesForProvider(connectorProviderId)}
-          newScopes={[]}
-          serviceId={connectorConfig.auth.provider}
-        />
-      )}
+      {showOAuthModal &&
+        connectorConfig &&
+        connectorConfig.auth.mode === 'oauth' &&
+        connectorProviderId && (
+          <ConnectCredentialModal
+            isOpen={showOAuthModal}
+            onClose={() => {
+              consumeOAuthReturnContext()
+              setShowOAuthModal(false)
+            }}
+            provider={connectorProviderId}
+            serviceId={connectorConfig.auth.provider}
+            workspaceId={workspaceId}
+            knowledgeBaseId={knowledgeBaseId}
+            credentialCount={credentials.length}
+          />
+        )}
     </>
   )
 }

View File

@@ -36,6 +36,7 @@ import {
 } from '@/lib/oauth'
 import { getMissingRequiredScopes } from '@/lib/oauth/utils'
 import { EditConnectorModal } from '@/app/workspace/[workspaceId]/knowledge/[id]/components/edit-connector-modal/edit-connector-modal'
+import { ConnectCredentialModal } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/components/credential-selector/components/connect-credential-modal'
 import { OAuthRequiredModal } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/components/credential-selector/components/oauth-required-modal'
 import { CONNECTOR_REGISTRY } from '@/connectors/registry'
 import type { ConnectorData, SyncLogData } from '@/hooks/queries/kb/connectors'
@@ -46,6 +47,7 @@ import {
   useUpdateConnector,
 } from '@/hooks/queries/kb/connectors'
 import { useOAuthCredentials } from '@/hooks/queries/oauth/oauth-credentials'
+import { useCredentialRefreshTriggers } from '@/hooks/use-credential-refresh-triggers'

 const logger = createLogger('ConnectorsSection')

@@ -328,11 +330,16 @@ function ConnectorCard({
   const requiredScopes =
     connectorDef?.auth.mode === 'oauth' ? (connectorDef.auth.requiredScopes ?? []) : []

-  const { data: credentials } = useOAuthCredentials(providerId, { workspaceId })
+  const { data: credentials, refetch: refetchCredentials } = useOAuthCredentials(providerId, {
+    workspaceId,
+  })
+  useCredentialRefreshTriggers(refetchCredentials, providerId ?? '', workspaceId)

   const missingScopes = useMemo(() => {
     if (!credentials || !connector.credentialId) return []
     const credential = credentials.find((c) => c.id === connector.credentialId)
     if (!credential) return []
     return getMissingRequiredScopes(credential, requiredScopes)
   }, [credentials, connector.credentialId, requiredScopes])
@@ -484,15 +491,17 @@ function ConnectorCard({
             <Button
               variant='active'
               onClick={() => {
-                writeOAuthReturnContext({
-                  origin: 'kb-connectors',
-                  knowledgeBaseId,
-                  displayName: connectorDef?.name ?? connector.connectorType,
-                  providerId: providerId!,
-                  preCount: credentials?.length ?? 0,
-                  workspaceId,
-                  requestedAt: Date.now(),
-                })
+                if (connector.credentialId) {
+                  writeOAuthReturnContext({
+                    origin: 'kb-connectors',
+                    knowledgeBaseId,
+                    displayName: connectorDef?.name ?? connector.connectorType,
+                    providerId: providerId!,
+                    preCount: credentials?.length ?? 0,
+                    workspaceId,
+                    requestedAt: Date.now(),
+                  })
+                }
                 setShowOAuthModal(true)
               }}
               className='w-full px-2 py-1 font-medium text-caption'
@@ -510,7 +519,22 @@ function ConnectorCard({
         </div>
       )}

-      {showOAuthModal && serviceId && providerId && (
+      {showOAuthModal && serviceId && providerId && !connector.credentialId && (
+        <ConnectCredentialModal
+          isOpen={showOAuthModal}
+          onClose={() => {
+            consumeOAuthReturnContext()
+            setShowOAuthModal(false)
+          }}
+          provider={providerId as OAuthProvider}
+          serviceId={serviceId}
+          workspaceId={workspaceId}
+          knowledgeBaseId={knowledgeBaseId}
+          credentialCount={credentials?.length ?? 0}
+        />
+      )}
+      {showOAuthModal && serviceId && providerId && connector.credentialId && (
         <OAuthRequiredModal
           isOpen={showOAuthModal}
           onClose={() => {
View File

@@ -3,15 +3,18 @@

 import { useCallback, useMemo, useRef, useState } from 'react'
 import { createLogger } from '@sim/logger'
 import { useParams, useRouter } from 'next/navigation'
-import { Tooltip } from '@/components/emcn'
+import type { ComboboxOption } from '@/components/emcn'
+import { Combobox, Tooltip } from '@/components/emcn'
 import { Database } from '@/components/emcn/icons'
+import type { KnowledgeBaseData } from '@/lib/knowledge/types'
 import type {
   CreateAction,
   FilterTag,
   ResourceCell,
   ResourceColumn,
   ResourceRow,
   SearchConfig,
+  SortConfig,
 } from '@/app/workspace/[workspaceId]/components'
 import { ownerCell, Resource, timeCell } from '@/app/workspace/[workspaceId]/components'
 import { BaseTagsModal } from '@/app/workspace/[workspaceId]/knowledge/[id]/components'
@@ -29,6 +32,7 @@ import { CONNECTOR_REGISTRY } from '@/connectors/registry'
 import { useKnowledgeBasesList } from '@/hooks/kb/use-knowledge'
 import { useDeleteKnowledgeBase, useUpdateKnowledgeBase } from '@/hooks/queries/kb/knowledge'
 import { useWorkspaceMembersQuery } from '@/hooks/queries/workspace'
+import { useDebounce } from '@/hooks/use-debounce'

 const logger = createLogger('Knowledge')

@@ -98,21 +102,16 @@ export function Knowledge() {
   const { mutateAsync: updateKnowledgeBaseMutation } = useUpdateKnowledgeBase(workspaceId)
   const { mutateAsync: deleteKnowledgeBaseMutation } = useDeleteKnowledgeBase(workspaceId)

+  const [activeSort, setActiveSort] = useState<{
+    column: string
+    direction: 'asc' | 'desc'
+  } | null>(null)
+  const [connectorFilter, setConnectorFilter] = useState<string[]>([])
+  const [contentFilter, setContentFilter] = useState<string[]>([])
+  const [ownerFilter, setOwnerFilter] = useState<string[]>([])
   const [searchInputValue, setSearchInputValue] = useState('')
-  const [debouncedSearchQuery, setDebouncedSearchQuery] = useState('')
-  const searchTimerRef = useRef<ReturnType<typeof setTimeout>>(null)
-
-  const handleSearchChange = useCallback((value: string) => {
-    setSearchInputValue(value)
-    if (searchTimerRef.current) clearTimeout(searchTimerRef.current)
-    searchTimerRef.current = setTimeout(() => {
-      setDebouncedSearchQuery(value)
-    }, 300)
-  }, [])
-
-  const handleSearchClearAll = useCallback(() => {
-    handleSearchChange('')
-  }, [handleSearchChange])
+  const debouncedSearchQuery = useDebounce(searchInputValue, 300)

   const [isCreateModalOpen, setIsCreateModalOpen] = useState(false)
const [isCreateModalOpen, setIsCreateModalOpen] = useState(false)
@@ -184,14 +183,77 @@ export function Knowledge() {
     [deleteKnowledgeBaseMutation]
   )

-  const filteredKnowledgeBases = useMemo(
-    () => filterKnowledgeBases(knowledgeBases, debouncedSearchQuery),
-    [knowledgeBases, debouncedSearchQuery]
-  )
+  const processedKBs = useMemo(() => {
+    let result = filterKnowledgeBases(knowledgeBases, debouncedSearchQuery)
+
+    if (connectorFilter.length > 0) {
+      result = result.filter((kb) => {
+        const hasConnectors = (kb.connectorTypes?.length ?? 0) > 0
+        if (connectorFilter.includes('connected') && hasConnectors) return true
+        if (connectorFilter.includes('unconnected') && !hasConnectors) return true
+        return false
+      })
+    }
+
+    if (contentFilter.length > 0) {
+      const docCount = (kb: KnowledgeBaseData) => (kb as KnowledgeBaseWithDocCount).docCount ?? 0
+      result = result.filter((kb) => {
+        if (contentFilter.includes('has-docs') && docCount(kb) > 0) return true
+        if (contentFilter.includes('empty') && docCount(kb) === 0) return true
+        return false
+      })
+    }
+
+    if (ownerFilter.length > 0) {
+      result = result.filter((kb) => ownerFilter.includes(kb.userId))
+    }
+
+    const col = activeSort?.column ?? 'created'
+    const dir = activeSort?.direction ?? 'desc'
+    return [...result].sort((a, b) => {
+      let cmp = 0
+      switch (col) {
+        case 'name':
+          cmp = a.name.localeCompare(b.name)
+          break
+        case 'documents':
+          cmp =
+            ((a as KnowledgeBaseWithDocCount).docCount || 0) -
+            ((b as KnowledgeBaseWithDocCount).docCount || 0)
+          break
+        case 'tokens':
+          cmp = (a.tokenCount || 0) - (b.tokenCount || 0)
+          break
+        case 'created':
+          cmp = new Date(a.createdAt).getTime() - new Date(b.createdAt).getTime()
+          break
+        case 'updated':
+          cmp = new Date(a.updatedAt).getTime() - new Date(b.updatedAt).getTime()
+          break
+        case 'connectors':
+          cmp = (a.connectorTypes?.length ?? 0) - (b.connectorTypes?.length ?? 0)
+          break
+        case 'owner':
+          cmp = (members?.find((m) => m.userId === a.userId)?.name ?? '').localeCompare(
+            members?.find((m) => m.userId === b.userId)?.name ?? ''
+          )
+          break
+      }
+      return dir === 'asc' ? cmp : -cmp
+    })
+  }, [
+    knowledgeBases,
+    debouncedSearchQuery,
+    connectorFilter,
+    contentFilter,
+    ownerFilter,
+    activeSort,
+    members,
+  ])

   const rows: ResourceRow[] = useMemo(
     () =>
-      filteredKnowledgeBases.map((kb) => {
+      processedKBs.map((kb) => {
         const kbWithCount = kb as KnowledgeBaseWithDocCount
         return {
           id: kb.id,
@@ -211,16 +273,9 @@ export function Knowledge() {
             owner: ownerCell(kb.userId, members),
             updated: timeCell(kb.updatedAt),
           },
-          sortValues: {
-            documents: kbWithCount.docCount || 0,
-            tokens: kb.tokenCount || 0,
-            connectors: kb.connectorTypes?.length || 0,
-            created: -new Date(kb.createdAt).getTime(),
-            updated: -new Date(kb.updatedAt).getTime(),
-          },
         }
       }),
-    [filteredKnowledgeBases, members]
+    [processedKBs, members]
   )

   const handleRowClick = useCallback(
const handleRowClick = useCallback(
@@ -303,13 +358,190 @@ export function Knowledge() {
   const searchConfig: SearchConfig = useMemo(
     () => ({
       value: searchInputValue,
-      onChange: handleSearchChange,
-      onClearAll: handleSearchClearAll,
+      onChange: setSearchInputValue,
+      onClearAll: () => setSearchInputValue(''),
       placeholder: 'Search knowledge bases...',
     }),
-    [searchInputValue, handleSearchChange, handleSearchClearAll]
+    [searchInputValue]
   )

+  const sortConfig: SortConfig = useMemo(
+    () => ({
+      options: [
+        { id: 'name', label: 'Name' },
+        { id: 'documents', label: 'Documents' },
+        { id: 'tokens', label: 'Tokens' },
+        { id: 'connectors', label: 'Connectors' },
+        { id: 'created', label: 'Created' },
+        { id: 'updated', label: 'Last Updated' },
+        { id: 'owner', label: 'Owner' },
+      ],
+      active: activeSort,
+      onSort: (column, direction) => setActiveSort({ column, direction }),
+      onClear: () => setActiveSort(null),
+    }),
+    [activeSort]
+  )
+
+  const connectorDisplayLabel = useMemo(() => {
+    if (connectorFilter.length === 0) return 'All'
+    if (connectorFilter.length === 1)
+      return connectorFilter[0] === 'connected' ? 'With connectors' : 'Without connectors'
+    return `${connectorFilter.length} selected`
+  }, [connectorFilter])
+
+  const contentDisplayLabel = useMemo(() => {
+    if (contentFilter.length === 0) return 'All'
+    if (contentFilter.length === 1)
+      return contentFilter[0] === 'has-docs' ? 'Has documents' : 'Empty'
+    return `${contentFilter.length} selected`
+  }, [contentFilter])
+
+  const ownerDisplayLabel = useMemo(() => {
+    if (ownerFilter.length === 0) return 'All'
+    if (ownerFilter.length === 1)
+      return members?.find((m) => m.userId === ownerFilter[0])?.name ?? '1 member'
+    return `${ownerFilter.length} members`
+  }, [ownerFilter, members])
+
+  const memberOptions: ComboboxOption[] = useMemo(
+    () =>
+      (members ?? []).map((m) => ({
+        value: m.userId,
+        label: m.name,
+        iconElement: m.image ? (
+          <img
+            src={m.image}
+            alt={m.name}
+            referrerPolicy='no-referrer'
+            className='h-[14px] w-[14px] rounded-full border border-[var(--border)] object-cover'
+          />
+        ) : (
+          <span className='flex h-[14px] w-[14px] items-center justify-center rounded-full border border-[var(--border)] bg-[var(--surface-3)] font-medium text-[8px] text-[var(--text-secondary)]'>
+            {m.name.charAt(0).toUpperCase()}
+          </span>
+        ),
+      })),
+    [members]
+  )
+
+  const hasActiveFilters =
+    connectorFilter.length > 0 || contentFilter.length > 0 || ownerFilter.length > 0
+
+  const filterContent = useMemo(
+    () => (
+      <div className='flex w-[240px] flex-col gap-3 p-3'>
+        <div className='flex flex-col gap-1.5'>
+          <span className='font-medium text-[var(--text-secondary)] text-caption'>Connectors</span>
+          <Combobox
+            options={[
+              { value: 'connected', label: 'With connectors' },
+              { value: 'unconnected', label: 'Without connectors' },
+            ]}
+            multiSelect
+            multiSelectValues={connectorFilter}
+            onMultiSelectChange={setConnectorFilter}
+            overlayContent={
+              <span className='truncate text-[var(--text-primary)]'>{connectorDisplayLabel}</span>
+            }
+            showAllOption
+            allOptionLabel='All'
+            size='sm'
+            className='h-[32px] w-full rounded-md'
+          />
+        </div>
+        <div className='flex flex-col gap-1.5'>
+          <span className='font-medium text-[var(--text-secondary)] text-caption'>Content</span>
+          <Combobox
+            options={[
+              { value: 'has-docs', label: 'Has documents' },
+              { value: 'empty', label: 'Empty' },
+            ]}
+            multiSelect
+            multiSelectValues={contentFilter}
+            onMultiSelectChange={setContentFilter}
+            overlayContent={
+              <span className='truncate text-[var(--text-primary)]'>{contentDisplayLabel}</span>
+            }
+            showAllOption
+            allOptionLabel='All'
+            size='sm'
+            className='h-[32px] w-full rounded-md'
+          />
+        </div>
+        {memberOptions.length > 0 && (
+          <div className='flex flex-col gap-1.5'>
+            <span className='font-medium text-[var(--text-secondary)] text-caption'>Owner</span>
+            <Combobox
+              options={memberOptions}
+              multiSelect
+              multiSelectValues={ownerFilter}
+              onMultiSelectChange={setOwnerFilter}
+              overlayContent={
+                <span className='truncate text-[var(--text-primary)]'>{ownerDisplayLabel}</span>
+              }
+              searchable
+              searchPlaceholder='Search members...'
+              showAllOption
+              allOptionLabel='All'
+              size='sm'
+              className='h-[32px] w-full rounded-md'
+            />
+          </div>
+        )}
+        {hasActiveFilters && (
+          <button
+            type='button'
+            onClick={() => {
+              setConnectorFilter([])
+              setContentFilter([])
+              setOwnerFilter([])
+            }}
+            className='flex h-[32px] w-full items-center justify-center rounded-md text-[var(--text-secondary)] text-caption transition-colors hover-hover:bg-[var(--surface-active)]'
+          >
+            Clear all filters
+          </button>
+        )}
+      </div>
+    ),
+    [
+      connectorFilter,
+      contentFilter,
+      ownerFilter,
+      memberOptions,
+      connectorDisplayLabel,
+      contentDisplayLabel,
+      ownerDisplayLabel,
+      hasActiveFilters,
+    ]
+  )
+
+  const filterTags: FilterTag[] = useMemo(() => {
+    const tags: FilterTag[] = []
+    if (connectorFilter.length > 0) {
+      const label =
+        connectorFilter.length === 1
+          ? `Connectors: ${connectorFilter[0] === 'connected' ? 'With connectors' : 'Without connectors'}`
+          : `Connectors: ${connectorFilter.length} types`
+      tags.push({ label, onRemove: () => setConnectorFilter([]) })
+    }
+    if (contentFilter.length > 0) {
+      const label =
+        contentFilter.length === 1
+          ? `Content: ${contentFilter[0] === 'has-docs' ? 'Has documents' : 'Empty'}`
+          : `Content: ${contentFilter.length} types`
+      tags.push({ label, onRemove: () => setContentFilter([]) })
+    }
+    if (ownerFilter.length > 0) {
+      const label =
+        ownerFilter.length === 1
+          ? `Owner: ${members?.find((m) => m.userId === ownerFilter[0])?.name ?? '1 member'}`
+          : `Owner: ${ownerFilter.length} members`
+      tags.push({ label, onRemove: () => setOwnerFilter([]) })
+    }
+    return tags
+  }, [connectorFilter, contentFilter, ownerFilter, members])
return (
<>
<Resource
@@ -317,7 +549,9 @@ export function Knowledge() {
title='Knowledge Base'
create={createAction}
search={searchConfig}
defaultSort='created'
sort={sortConfig}
filter={filterContent}
filterTags={filterTags}
columns={COLUMNS}
rows={rows}
onRowClick={handleRowClick}


@@ -8,7 +8,7 @@ import {
DropdownMenuSeparator,
DropdownMenuTrigger,
} from '@/components/emcn'
import { Copy, Eye, ListFilter, SquareArrowUpRight, X } from '@/components/emcn/icons'
import { Copy, Eye, Link, ListFilter, SquareArrowUpRight, X } from '@/components/emcn/icons'
import type { WorkflowLog } from '@/stores/logs/filters/types'
interface LogRowContextMenuProps {
@@ -17,6 +17,7 @@ interface LogRowContextMenuProps {
onClose: () => void
log: WorkflowLog | null
onCopyExecutionId: () => void
onCopyLink: () => void
onOpenWorkflow: () => void
onOpenPreview: () => void
onToggleWorkflowFilter: () => void
@@ -35,6 +36,7 @@ export const LogRowContextMenu = memo(function LogRowContextMenu({
onClose,
log,
onCopyExecutionId,
onCopyLink,
onOpenWorkflow,
onOpenPreview,
onToggleWorkflowFilter,
@@ -71,6 +73,10 @@ export const LogRowContextMenu = memo(function LogRowContextMenu({
<Copy />
Copy Execution ID
</DropdownMenuItem>
<DropdownMenuItem disabled={!hasExecutionId} onSelect={onCopyLink}>
<Link />
Copy Link
</DropdownMenuItem>
<DropdownMenuSeparator />
<DropdownMenuItem disabled={!hasWorkflow} onSelect={onOpenWorkflow}>


@@ -39,6 +39,7 @@ import type {
ResourceColumn,
ResourceRow,
SearchConfig,
SortConfig,
} from '@/app/workspace/[workspaceId]/components'
import {
ResourceHeader,
@@ -265,16 +266,17 @@ export default function Logs() {
isSidebarOpen: false,
})
const isInitialized = useRef<boolean>(false)
const pendingExecutionIdRef = useRef<string | null>(null)
const [searchQuery, setSearchQuery] = useState('')
const debouncedSearchQuery = useDebounce(searchQuery, 300)
useEffect(() => {
const urlSearch = new URLSearchParams(window.location.search).get('search') || ''
if (urlSearch && urlSearch !== searchQuery) {
setSearchQuery(urlSearch)
}
// eslint-disable-next-line react-hooks/exhaustive-deps
const params = new URLSearchParams(window.location.search)
const urlSearch = params.get('search')
if (urlSearch) setSearchQuery(urlSearch)
const urlExecutionId = params.get('executionId')
if (urlExecutionId) pendingExecutionIdRef.current = urlExecutionId
}, [])
const isLive = true
@@ -288,12 +290,15 @@ export default function Logs() {
const activeLogRefetchRef = useRef<() => void>(() => {})
const logsQueryRef = useRef({ isFetching: false, hasNextPage: false, fetchNextPage: () => {} })
const [isNotificationSettingsOpen, setIsNotificationSettingsOpen] = useState(false)
const [activeSort, setActiveSort] = useState<{
column: string
direction: 'asc' | 'desc'
} | null>(null)
const userPermissions = useUserPermissionsContext()
const [contextMenuOpen, setContextMenuOpen] = useState(false)
const [contextMenuPosition, setContextMenuPosition] = useState({ x: 0, y: 0 })
const [contextMenuLog, setContextMenuLog] = useState<WorkflowLog | null>(null)
const contextMenuRef = useRef<HTMLDivElement>(null)
const [isPreviewOpen, setIsPreviewOpen] = useState(false)
const [previewLogId, setPreviewLogId] = useState<string | null>(null)
@@ -358,11 +363,43 @@ export default function Logs() {
return logsQuery.data.pages.flatMap((page) => page.logs)
}, [logsQuery.data?.pages])
const sortedLogs = useMemo(() => {
if (!activeSort) return logs
const { column, direction } = activeSort
return [...logs].sort((a, b) => {
let cmp = 0
switch (column) {
case 'date':
cmp = new Date(a.createdAt).getTime() - new Date(b.createdAt).getTime()
break
case 'duration': {
const aDuration = parseDuration({ duration: a.duration ?? undefined }) ?? -1
const bDuration = parseDuration({ duration: b.duration ?? undefined }) ?? -1
cmp = aDuration - bDuration
break
}
case 'cost': {
const aCost = typeof a.cost?.total === 'number' ? a.cost.total : -1
const bCost = typeof b.cost?.total === 'number' ? b.cost.total : -1
cmp = aCost - bCost
break
}
case 'status':
cmp = (a.status ?? '').localeCompare(b.status ?? '')
break
default:
break
}
return direction === 'asc' ? cmp : -cmp
})
}, [logs, activeSort])
const selectedLogIndex = useMemo(
() => (selectedLogId ? logs.findIndex((l) => l.id === selectedLogId) : -1),
[logs, selectedLogId]
() => (selectedLogId ? sortedLogs.findIndex((l) => l.id === selectedLogId) : -1),
[sortedLogs, selectedLogId]
)
const selectedLogFromList = selectedLogIndex >= 0 ? logs[selectedLogIndex] : null
const selectedLogFromList = selectedLogIndex >= 0 ? sortedLogs[selectedLogIndex] : null
const selectedLog = useMemo(() => {
if (!selectedLogFromList) return null
@@ -380,28 +417,30 @@ export default function Logs() {
useFolders(workspaceId)
logsRef.current = sortedLogs
selectedLogIndexRef.current = selectedLogIndex
selectedLogIdRef.current = selectedLogId
logsRefetchRef.current = logsQuery.refetch
activeLogRefetchRef.current = activeLogQuery.refetch
logsQueryRef.current = {
isFetching: logsQuery.isFetching,
hasNextPage: logsQuery.hasNextPage ?? false,
fetchNextPage: logsQuery.fetchNextPage,
}
useEffect(() => {
logsRef.current = logs
}, [logs])
useEffect(() => {
selectedLogIndexRef.current = selectedLogIndex
}, [selectedLogIndex])
useEffect(() => {
selectedLogIdRef.current = selectedLogId
}, [selectedLogId])
useEffect(() => {
logsRefetchRef.current = logsQuery.refetch
}, [logsQuery.refetch])
useEffect(() => {
activeLogRefetchRef.current = activeLogQuery.refetch
}, [activeLogQuery.refetch])
useEffect(() => {
logsQueryRef.current = {
isFetching: logsQuery.isFetching,
hasNextPage: logsQuery.hasNextPage ?? false,
fetchNextPage: logsQuery.fetchNextPage,
if (!pendingExecutionIdRef.current) return
const targetExecutionId = pendingExecutionIdRef.current
const found = sortedLogs.find((l) => l.executionId === targetExecutionId)
if (found) {
pendingExecutionIdRef.current = null
dispatch({ type: 'TOGGLE_LOG', logId: found.id })
} else if (!logsQuery.hasNextPage && logsQuery.status === 'success') {
pendingExecutionIdRef.current = null
} else if (!logsQuery.isFetching && logsQuery.status === 'success') {
logsQueryRef.current.fetchNextPage()
}
}, [logsQuery.isFetching, logsQuery.hasNextPage, logsQuery.fetchNextPage])
}, [sortedLogs, logsQuery.hasNextPage, logsQuery.isFetching, logsQuery.status])
useEffect(() => {
const timers = refreshTimersRef.current
@@ -443,20 +482,27 @@ export default function Logs() {
const handleLogContextMenu = useCallback(
(e: React.MouseEvent, rowId: string) => {
e.preventDefault()
const log = logs.find((l) => l.id === rowId) ?? null
const log = sortedLogs.find((l) => l.id === rowId) ?? null
setContextMenuPosition({ x: e.clientX, y: e.clientY })
setContextMenuLog(log)
setContextMenuOpen(true)
},
[logs]
[sortedLogs]
)
const handleCopyExecutionId = useCallback(() => {
if (contextMenuLog?.executionId) {
navigator.clipboard.writeText(contextMenuLog.executionId)
navigator.clipboard.writeText(contextMenuLog.executionId).catch(() => {})
}
}, [contextMenuLog])
const handleCopyLink = useCallback(() => {
if (contextMenuLog?.executionId) {
const url = `${window.location.origin}/workspace/${workspaceId}/logs?executionId=${contextMenuLog.executionId}`
navigator.clipboard.writeText(url).catch(() => {})
}
}, [contextMenuLog, workspaceId])
const handleOpenWorkflow = useCallback(() => {
const wfId = contextMenuLog?.workflow?.id || contextMenuLog?.workflowId
if (wfId) {
@@ -603,11 +649,12 @@ export default function Logs() {
}, [initializeFromURL])
const loadMoreLogs = useCallback(() => {
if (activeSort) return
const { isFetching, hasNextPage, fetchNextPage } = logsQueryRef.current
if (!isFetching && hasNextPage) {
fetchNextPage()
}
}, [])
}, [activeSort])
useEffect(() => {
const handleKeyDown = (e: KeyboardEvent) => {
@@ -659,7 +706,7 @@ export default function Logs() {
const rows: ResourceRow[] = useMemo(
() =>
logs.map((log) => {
sortedLogs.map((log) => {
const formattedDate = formatDate(log.createdAt)
const displayStatus = getDisplayStatus(log.status)
const isMothershipJob = log.trigger === 'mothership'
@@ -710,7 +757,7 @@ export default function Logs() {
},
}
}),
[logs]
[sortedLogs]
)
const sidebarOverlay = useMemo(
@@ -721,7 +768,7 @@ export default function Logs() {
onClose={handleCloseSidebar}
onNavigateNext={handleNavigateNext}
onNavigatePrev={handleNavigatePrev}
hasNext={selectedLogIndex < logs.length - 1}
hasNext={selectedLogIndex < sortedLogs.length - 1}
hasPrev={selectedLogIndex > 0}
/>
),
@@ -732,7 +779,7 @@ export default function Logs() {
handleNavigateNext,
handleNavigatePrev,
selectedLogIndex,
logs.length,
sortedLogs.length,
]
)
@@ -978,6 +1025,21 @@ export default function Logs() {
[appliedFilters, textSearch, removeBadge, handleFiltersChange]
)
const sortConfig = useMemo<SortConfig>(
() => ({
options: [
{ id: 'date', label: 'Date' },
{ id: 'duration', label: 'Duration' },
{ id: 'cost', label: 'Cost' },
{ id: 'status', label: 'Status' },
],
active: activeSort,
onSort: (column, direction) => setActiveSort({ column, direction }),
onClear: () => setActiveSort(null),
}),
[activeSort]
)
const searchConfig = useMemo<SearchConfig>(
() => ({
value: currentInput,
@@ -1021,7 +1083,7 @@ export default function Logs() {
label: 'Export',
icon: Download,
onClick: handleExport,
disabled: !userPermissions.canEdit || isExporting || logs.length === 0,
disabled: !userPermissions.canEdit || isExporting || sortedLogs.length === 0,
},
{
label: 'Notifications',
@@ -1054,7 +1116,7 @@ export default function Logs() {
handleExport,
userPermissions.canEdit,
isExporting,
logs.length,
sortedLogs.length,
handleOpenNotificationSettings,
]
)
@@ -1065,6 +1127,7 @@ export default function Logs() {
<ResourceHeader icon={Library} title='Logs' actions={headerActions} />
<ResourceOptionsBar
search={searchConfig}
sort={sortConfig}
filter={
<LogsFilterPanel searchQuery={searchQuery} onSearchQueryChange={setSearchQuery} />
}
@@ -1091,7 +1154,7 @@ export default function Logs() {
onRowContextMenu={handleLogContextMenu}
isLoading={!logsQuery.data}
onLoadMore={loadMoreLogs}
hasMore={logsQuery.hasNextPage ?? false}
hasMore={!activeSort && (logsQuery.hasNextPage ?? false)}
isLoadingMore={logsQuery.isFetchingNextPage}
emptyMessage='No logs found'
overlay={sidebarOverlay}
@@ -1111,6 +1174,7 @@ export default function Logs() {
onClose={handleCloseContextMenu}
log={contextMenuLog}
onCopyExecutionId={handleCopyExecutionId}
onCopyLink={handleCopyLink}
onOpenWorkflow={handleOpenWorkflow}
onOpenPreview={handleOpenPreview}
onToggleWorkflowFilter={handleToggleWorkflowFilter}
@@ -1335,7 +1399,7 @@ function LogsFilterPanel({ searchQuery, onSearchQueryChange }: LogsFilterPanelPr
}, [resetFilters, onSearchQueryChange])
return (
<div className='flex flex-col gap-3 p-3'>
<div className='flex w-[240px] flex-col gap-3 p-3'>
<div className='flex flex-col gap-1.5'>
<span className='font-medium text-[var(--text-secondary)] text-caption'>Status</span>
<Combobox


@@ -3,11 +3,24 @@
import { useCallback, useMemo, useState } from 'react'
import { createLogger } from '@sim/logger'
import { useParams } from 'next/navigation'
import { Button, Modal, ModalBody, ModalContent, ModalFooter, ModalHeader } from '@/components/emcn'
import {
Button,
Combobox,
Modal,
ModalBody,
ModalContent,
ModalFooter,
ModalHeader,
} from '@/components/emcn'
import { Calendar } from '@/components/emcn/icons'
import { formatAbsoluteDate } from '@/lib/core/utils/formatting'
import { parseCronToHumanReadable } from '@/lib/workflows/schedules/utils'
import type { ResourceColumn, ResourceRow } from '@/app/workspace/[workspaceId]/components'
import type {
FilterTag,
ResourceColumn,
ResourceRow,
SortConfig,
} from '@/app/workspace/[workspaceId]/components'
import { Resource, timeCell } from '@/app/workspace/[workspaceId]/components'
import { ScheduleModal } from '@/app/workspace/[workspaceId]/scheduled-tasks/components/create-schedule-modal'
import { ScheduleContextMenu } from '@/app/workspace/[workspaceId]/scheduled-tasks/components/schedule-context-menu'
@@ -74,6 +87,13 @@ export function ScheduledTasks() {
const [activeTask, setActiveTask] = useState<WorkspaceScheduleData | null>(null)
const [searchQuery, setSearchQuery] = useState('')
const debouncedSearchQuery = useDebounce(searchQuery, 300)
const [activeSort, setActiveSort] = useState<{
column: string
direction: 'asc' | 'desc'
} | null>(null)
const [scheduleTypeFilter, setScheduleTypeFilter] = useState<string[]>([])
const [statusFilter, setStatusFilter] = useState<string[]>([])
const [healthFilter, setHealthFilter] = useState<string[]>([])
const visibleItems = useMemo(
() => allItems.filter((item) => item.sourceType === 'job' && item.status !== 'completed'),
@@ -81,15 +101,68 @@ export function ScheduledTasks() {
)
const filteredItems = useMemo(() => {
if (!debouncedSearchQuery) return visibleItems
const q = debouncedSearchQuery.toLowerCase()
return visibleItems.filter((item) => {
const task = item.prompt || ''
return (
task.toLowerCase().includes(q) || getScheduleDescription(item).toLowerCase().includes(q)
)
let result = debouncedSearchQuery
? visibleItems.filter((item) => {
const task = item.prompt || ''
return (
task.toLowerCase().includes(debouncedSearchQuery.toLowerCase()) ||
getScheduleDescription(item).toLowerCase().includes(debouncedSearchQuery.toLowerCase())
)
})
: visibleItems
if (scheduleTypeFilter.length > 0) {
result = result.filter((item) => {
if (scheduleTypeFilter.includes('recurring') && Boolean(item.cronExpression)) return true
if (scheduleTypeFilter.includes('once') && !item.cronExpression) return true
return false
})
}
if (statusFilter.length > 0) {
result = result.filter((item) => {
if (statusFilter.includes('active') && item.status === 'active') return true
if (statusFilter.includes('paused') && item.status === 'disabled') return true
return false
})
}
if (healthFilter.includes('has-failures')) {
result = result.filter((item) => (item.failedCount ?? 0) > 0)
}
const col = activeSort?.column ?? 'nextRun'
const dir = activeSort?.direction ?? 'desc'
return [...result].sort((a, b) => {
let cmp = 0
switch (col) {
case 'task':
cmp = (a.prompt || '').localeCompare(b.prompt || '')
break
case 'nextRun':
cmp =
(a.nextRunAt ? new Date(a.nextRunAt).getTime() : 0) -
(b.nextRunAt ? new Date(b.nextRunAt).getTime() : 0)
break
case 'lastRun':
cmp =
(a.lastRanAt ? new Date(a.lastRanAt).getTime() : 0) -
(b.lastRanAt ? new Date(b.lastRanAt).getTime() : 0)
break
case 'schedule':
cmp = getScheduleDescription(a).localeCompare(getScheduleDescription(b))
break
}
return dir === 'asc' ? cmp : -cmp
})
}, [visibleItems, debouncedSearchQuery])
}, [
visibleItems,
debouncedSearchQuery,
scheduleTypeFilter,
statusFilter,
healthFilter,
activeSort,
])
const rows: ResourceRow[] = useMemo(
() =>
@@ -104,10 +177,6 @@ export function ScheduledTasks() {
nextRun: timeCell(item.nextRunAt),
lastRun: timeCell(item.lastRanAt),
},
sortValues: {
nextRun: item.nextRunAt ? -new Date(item.nextRunAt).getTime() : 0,
lastRun: item.lastRanAt ? -new Date(item.lastRanAt).getTime() : 0,
},
})),
[filteredItems]
)
@@ -170,6 +239,151 @@ export function ScheduledTasks() {
}
}
const sortConfig: SortConfig = useMemo(
() => ({
options: [
{ id: 'task', label: 'Task' },
{ id: 'schedule', label: 'Schedule' },
{ id: 'nextRun', label: 'Next Run' },
{ id: 'lastRun', label: 'Last Run' },
],
active: activeSort,
onSort: (column, direction) => setActiveSort({ column, direction }),
onClear: () => setActiveSort(null),
}),
[activeSort]
)
const scheduleTypeDisplayLabel = useMemo(() => {
if (scheduleTypeFilter.length === 0) return 'All'
if (scheduleTypeFilter.length === 1)
return scheduleTypeFilter[0] === 'recurring' ? 'Recurring' : 'One-time'
return `${scheduleTypeFilter.length} selected`
}, [scheduleTypeFilter])
const statusDisplayLabel = useMemo(() => {
if (statusFilter.length === 0) return 'All'
if (statusFilter.length === 1) return statusFilter[0] === 'active' ? 'Active' : 'Paused'
return `${statusFilter.length} selected`
}, [statusFilter])
const healthDisplayLabel = useMemo(() => {
if (healthFilter.length === 0) return 'All'
return 'Has failures'
}, [healthFilter])
const hasActiveFilters =
scheduleTypeFilter.length > 0 || statusFilter.length > 0 || healthFilter.length > 0
const filterContent = useMemo(
() => (
<div className='flex w-[240px] flex-col gap-3 p-3'>
<div className='flex flex-col gap-1.5'>
<span className='font-medium text-[var(--text-secondary)] text-caption'>
Schedule Type
</span>
<Combobox
options={[
{ value: 'recurring', label: 'Recurring' },
{ value: 'once', label: 'One-time' },
]}
multiSelect
multiSelectValues={scheduleTypeFilter}
onMultiSelectChange={setScheduleTypeFilter}
overlayContent={
<span className='truncate text-[var(--text-primary)]'>
{scheduleTypeDisplayLabel}
</span>
}
showAllOption
allOptionLabel='All'
size='sm'
className='h-[32px] w-full rounded-md'
/>
</div>
<div className='flex flex-col gap-1.5'>
<span className='font-medium text-[var(--text-secondary)] text-caption'>Status</span>
<Combobox
options={[
{ value: 'active', label: 'Active' },
{ value: 'paused', label: 'Paused' },
]}
multiSelect
multiSelectValues={statusFilter}
onMultiSelectChange={setStatusFilter}
overlayContent={
<span className='truncate text-[var(--text-primary)]'>{statusDisplayLabel}</span>
}
showAllOption
allOptionLabel='All'
size='sm'
className='h-[32px] w-full rounded-md'
/>
</div>
<div className='flex flex-col gap-1.5'>
<span className='font-medium text-[var(--text-secondary)] text-caption'>Health</span>
<Combobox
options={[{ value: 'has-failures', label: 'Has failures' }]}
multiSelect
multiSelectValues={healthFilter}
onMultiSelectChange={setHealthFilter}
overlayContent={
<span className='truncate text-[var(--text-primary)]'>{healthDisplayLabel}</span>
}
showAllOption
allOptionLabel='All'
size='sm'
className='h-[32px] w-full rounded-md'
/>
</div>
{hasActiveFilters && (
<button
type='button'
onClick={() => {
setScheduleTypeFilter([])
setStatusFilter([])
setHealthFilter([])
}}
className='flex h-[32px] w-full items-center justify-center rounded-md text-[var(--text-secondary)] text-caption transition-colors hover-hover:bg-[var(--surface-active)]'
>
Clear all filters
</button>
)}
</div>
),
[
scheduleTypeFilter,
statusFilter,
healthFilter,
scheduleTypeDisplayLabel,
statusDisplayLabel,
healthDisplayLabel,
hasActiveFilters,
]
)
const filterTags: FilterTag[] = useMemo(() => {
const tags: FilterTag[] = []
if (scheduleTypeFilter.length > 0) {
const label =
scheduleTypeFilter.length === 1
? `Type: ${scheduleTypeFilter[0] === 'recurring' ? 'Recurring' : 'One-time'}`
: `Type: ${scheduleTypeFilter.length} selected`
tags.push({ label, onRemove: () => setScheduleTypeFilter([]) })
}
if (statusFilter.length > 0) {
const label =
statusFilter.length === 1
? `Status: ${statusFilter[0] === 'active' ? 'Active' : 'Paused'}`
: `Status: ${statusFilter.length} selected`
tags.push({ label, onRemove: () => setStatusFilter([]) })
}
if (healthFilter.length > 0) {
tags.push({ label: 'Health: Has failures', onRemove: () => setHealthFilter([]) })
}
return tags
}, [scheduleTypeFilter, statusFilter, healthFilter])
return (
<>
<Resource
@@ -184,7 +398,9 @@ export function ScheduledTasks() {
onChange: setSearchQuery,
placeholder: 'Search scheduled tasks...',
}}
defaultSort='nextRun'
sort={sortConfig}
filter={filterContent}
filterTags={filterTags}
columns={COLUMNS}
rows={rows}
onRowContextMenu={handleRowContextMenu}


@@ -3,7 +3,7 @@
import { useMemo, useState } from 'react'
import { Search } from 'lucide-react'
import { useParams, useRouter } from 'next/navigation'
import { Button, SModalTabs, SModalTabsList, SModalTabsTrigger } from '@/components/emcn'
import { Button, Combobox, SModalTabs, SModalTabsList, SModalTabsTrigger } from '@/components/emcn'
import { Input } from '@/components/ui'
import { formatDate } from '@/lib/core/utils/formatting'
import { RESOURCE_REGISTRY } from '@/app/workspace/[workspaceId]/home/components/mothership-view/components/resource-registry'
@@ -34,6 +34,21 @@ function getResourceHref(
type ResourceType = 'all' | 'workflow' | 'table' | 'knowledge' | 'file'
type SortColumn = 'deleted' | 'name' | 'type'
interface SortConfig {
column: SortColumn
direction: 'asc' | 'desc'
}
const DEFAULT_SORT: SortConfig = { column: 'deleted', direction: 'desc' }
const SORT_OPTIONS: { column: SortColumn; direction: 'asc' | 'desc'; label: string }[] = [
{ column: 'deleted', direction: 'desc', label: 'Deleted (newest first)' },
{ column: 'name', direction: 'asc', label: 'Name (A-Z)' },
{ column: 'type', direction: 'asc', label: 'Type (A-Z)' },
]
const ICON_CLASS = 'h-[14px] w-[14px]'
const RESOURCE_TYPE_TO_MOTHERSHIP: Record<Exclude<ResourceType, 'all'>, MothershipResourceType> = {
@@ -100,6 +115,7 @@ export function RecentlyDeleted() {
const workspaceId = params?.workspaceId as string
const [activeTab, setActiveTab] = useState<ResourceType>('all')
const [searchTerm, setSearchTerm] = useState('')
const [activeSort, setActiveSort] = useState<SortConfig | null>(null)
const [restoringIds, setRestoringIds] = useState<Set<string>>(new Set())
const [restoredItems, setRestoredItems] = useState<Map<string, DeletedResource>>(new Map())
@@ -174,7 +190,6 @@ export function RecentlyDeleted() {
}
}
items.sort((a, b) => b.deletedAt.getTime() - a.deletedAt.getTime())
return items
}, [
workflowsQuery.data,
@@ -191,10 +206,27 @@ export function RecentlyDeleted() {
const normalized = searchTerm.toLowerCase()
items = items.filter((r) => r.name.toLowerCase().includes(normalized))
}
return items
}, [resources, activeTab, searchTerm])
const col = (activeSort ?? DEFAULT_SORT).column
const dir = (activeSort ?? DEFAULT_SORT).direction
return [...items].sort((a, b) => {
let cmp = 0
switch (col) {
case 'name':
cmp = a.name.localeCompare(b.name)
break
case 'type':
cmp = a.type.localeCompare(b.type)
break
case 'deleted':
cmp = a.deletedAt.getTime() - b.deletedAt.getTime()
break
}
return dir === 'asc' ? cmp : -cmp
})
}, [resources, activeTab, searchTerm, activeSort])
const showNoResults = searchTerm.trim() && filtered.length === 0 && resources.length > 0
const selectedSort = activeSort ?? DEFAULT_SORT
function handleRestore(resource: DeletedResource) {
setRestoringIds((prev) => new Set(prev).add(resource.id))
@@ -232,18 +264,41 @@ export function RecentlyDeleted() {
return (
<div className='flex h-full flex-col gap-4.5'>
<div className='flex items-center gap-2 rounded-lg border border-[var(--border)] bg-transparent px-2 py-[5px] transition-colors duration-100 dark:bg-[var(--surface-4)] dark:hover-hover:border-[var(--border-1)] dark:hover-hover:bg-[var(--surface-5)]'>
<Search
className='h-[14px] w-[14px] flex-shrink-0 text-[var(--text-tertiary)]'
strokeWidth={2}
/>
<Input
placeholder='Search deleted items...'
value={searchTerm}
onChange={(e) => setSearchTerm(e.target.value)}
disabled={isLoading}
className='h-auto flex-1 border-0 bg-transparent p-0 font-base leading-none placeholder:text-[var(--text-tertiary)] focus-visible:ring-0 focus-visible:ring-offset-0'
/>
<div className='flex items-center gap-2'>
<div className='flex flex-1 items-center gap-2 rounded-lg border border-[var(--border)] bg-transparent px-2 py-[5px] transition-colors duration-100 dark:bg-[var(--surface-4)] dark:hover-hover:border-[var(--border-1)] dark:hover-hover:bg-[var(--surface-5)]'>
<Search
className='h-[14px] w-[14px] flex-shrink-0 text-[var(--text-tertiary)]'
strokeWidth={2}
/>
<Input
placeholder='Search deleted items...'
value={searchTerm}
onChange={(e) => setSearchTerm(e.target.value)}
disabled={isLoading}
className='h-auto flex-1 border-0 bg-transparent p-0 font-base leading-none placeholder:text-[var(--text-tertiary)] focus-visible:ring-0 focus-visible:ring-offset-0'
/>
</div>
<div className='w-[190px] shrink-0'>
<Combobox
size='sm'
align='end'
disabled={isLoading}
value={`${selectedSort.column}:${selectedSort.direction}`}
onChange={(value) => {
const option = SORT_OPTIONS.find(
(sortOption) => `${sortOption.column}:${sortOption.direction}` === value
)
if (option) {
setActiveSort({ column: option.column, direction: option.direction })
}
}}
options={SORT_OPTIONS.map((option) => ({
label: option.label,
value: `${option.column}:${option.direction}`,
}))}
className='h-[30px] rounded-lg border-[var(--border)] bg-transparent px-2.5 text-small dark:bg-[var(--surface-4)]'
/>
</div>
</div>
<SModalTabs value={activeTab} onValueChange={(v) => setActiveTab(v as ResourceType)}>


@@ -1,6 +1,6 @@
'use client'
import { useCallback, useMemo, useState } from 'react'
import { memo, useCallback, useMemo, useRef, useState } from 'react'
import { X } from 'lucide-react'
import { nanoid } from 'nanoid'
import {
@@ -11,29 +11,29 @@ import {
DropdownMenuTrigger,
} from '@/components/emcn'
import { ChevronDown, Plus } from '@/components/emcn/icons'
import { cn } from '@/lib/core/utils/cn'
import type { Filter, FilterRule } from '@/lib/table'
import { COMPARISON_OPERATORS } from '@/lib/table/query-builder/constants'
import { filterRulesToFilter } from '@/lib/table/query-builder/converters'
import { filterRulesToFilter, filterToRules } from '@/lib/table/query-builder/converters'
const OPERATOR_LABELS: Record<string, string> = {
eq: '=',
ne: '≠',
gt: '>',
gte: '≥',
lt: '<',
lte: '≤',
contains: '∋',
in: '∈',
} as const
const OPERATOR_LABELS = Object.fromEntries(
COMPARISON_OPERATORS.map((op) => [op.value, op.label])
) as Record<string, string>
interface TableFilterProps {
columns: Array<{ name: string; type: string }>
filter: Filter | null
onApply: (filter: Filter | null) => void
onClose: () => void
}
export function TableFilter({ columns, onApply }: TableFilterProps) {
const [rules, setRules] = useState<FilterRule[]>(() => [createRule(columns)])
export function TableFilter({ columns, filter, onApply, onClose }: TableFilterProps) {
const [rules, setRules] = useState<FilterRule[]>(() => {
const fromFilter = filterToRules(filter)
return fromFilter.length > 0 ? fromFilter : [createRule(columns)]
})
const rulesRef = useRef(rules)
rulesRef.current = rules
const columnOptions = useMemo(
() => columns.map((col) => ({ value: col.name, label: col.name })),
@@ -46,52 +46,82 @@ export function TableFilter({ columns, onApply }: TableFilterProps) {
const handleRemove = useCallback(
(id: string) => {
setRules((prev) => {
const next = prev.filter((r) => r.id !== id)
return next.length === 0 ? [createRule(columns)] : next
})
const next = rulesRef.current.filter((r) => r.id !== id)
if (next.length === 0) {
onApply(null)
onClose()
setRules([createRule(columns)])
} else {
setRules(next)
}
},
[columns]
[columns, onApply, onClose]
)
const handleUpdate = useCallback((id: string, field: keyof FilterRule, value: string) => {
setRules((prev) => prev.map((r) => (r.id === id ? { ...r, [field]: value } : r)))
}, [])
const handleToggleLogical = useCallback((id: string) => {
setRules((prev) =>
prev.map((r) =>
r.id === id ? { ...r, logicalOperator: r.logicalOperator === 'and' ? 'or' : 'and' } : r
)
)
}, [])
const handleApply = useCallback(() => {
const validRules = rules.filter((r) => r.column && r.value)
const validRules = rulesRef.current.filter((r) => r.column && r.value)
onApply(filterRulesToFilter(validRules))
}, [rules, onApply])
}, [onApply])
const handleClear = useCallback(() => {
setRules([createRule(columns)])
onApply(null)
}, [columns, onApply])
return (
<div className='flex flex-col gap-1.5 p-2'>
{rules.map((rule) => (
<FilterRuleRow
key={rule.id}
rule={rule}
columns={columnOptions}
onUpdate={handleUpdate}
onRemove={handleRemove}
onApply={handleApply}
/>
))}
<div className='border-[var(--border)] border-b bg-[var(--bg)] px-4 py-2'>
<div className='flex flex-col gap-1'>
{rules.map((rule, index) => (
<FilterRuleRow
key={rule.id}
rule={rule}
isFirst={index === 0}
columns={columnOptions}
onUpdate={handleUpdate}
onRemove={handleRemove}
onApply={handleApply}
onToggleLogical={handleToggleLogical}
/>
))}
<div className='flex items-center justify-between gap-3'>
<Button
variant='ghost'
size='sm'
onClick={handleAdd}
className={cn(
'border border-[var(--border)] border-dashed px-2 py-[3px] text-[var(--text-secondary)] text-xs'
)}
>
<Plus className='mr-1 h-[10px] w-[10px]' />
Add filter
</Button>
<Button variant='default' size='sm' onClick={handleApply} className='text-xs'>
Apply filter
</Button>
<div className='mt-1 flex items-center justify-between'>
<Button
variant='ghost'
size='sm'
onClick={handleAdd}
className='px-2 py-1 text-[var(--text-secondary)] text-xs'
>
<Plus className='mr-1 h-[10px] w-[10px]' />
Add filter
</Button>
<div className='flex items-center gap-1.5'>
{filter !== null && (
<Button
variant='ghost'
size='sm'
onClick={handleClear}
className='px-2 py-1 text-[var(--text-secondary)] text-xs'
>
Clear filters
</Button>
)}
<Button variant='default' size='sm' onClick={handleApply} className='text-xs'>
Apply filter
</Button>
</div>
</div>
</div>
</div>
)
@@ -99,18 +129,39 @@ export function TableFilter({ columns, onApply }: TableFilterProps) {
interface FilterRuleRowProps {
rule: FilterRule
isFirst: boolean
columns: Array<{ value: string; label: string }>
onUpdate: (id: string, field: keyof FilterRule, value: string) => void
onRemove: (id: string) => void
onApply: () => void
onToggleLogical: (id: string) => void
}
function FilterRuleRow({ rule, columns, onUpdate, onRemove, onApply }: FilterRuleRowProps) {
const FilterRuleRow = memo(function FilterRuleRow({
rule,
isFirst,
columns,
onUpdate,
onRemove,
onApply,
onToggleLogical,
}: FilterRuleRowProps) {
return (
<div className='flex items-center gap-1'>
<div className='flex items-center gap-1.5'>
{isFirst ? (
<span className='w-[42px] shrink-0 text-right text-[var(--text-muted)] text-xs'>Where</span>
) : (
<button
onClick={() => onToggleLogical(rule.id)}
className='w-[42px] shrink-0 rounded-full py-0.5 text-right font-medium text-[10px] text-[var(--text-muted)] uppercase tracking-wide transition-colors hover:text-[var(--text-secondary)]'
>
{rule.logicalOperator}
</button>
)}
<DropdownMenu>
<DropdownMenuTrigger asChild>
<button className='flex h-[30px] min-w-[100px] items-center justify-between rounded-[5px] border border-[var(--border)] bg-transparent px-2 text-[var(--text-secondary)] text-xs outline-none hover-hover:border-[var(--border-1)]'>
<button className='flex h-[28px] min-w-[100px] items-center justify-between rounded-[5px] border border-[var(--border)] bg-transparent px-2 text-[var(--text-secondary)] text-xs outline-none hover-hover:border-[var(--border-1)]'>
<span className='truncate'>{rule.column || 'Column'}</span>
<ChevronDown className='ml-1 h-[10px] w-[10px] shrink-0 text-[var(--text-icon)]' />
</button>
@@ -129,8 +180,8 @@ function FilterRuleRow({ rule, columns, onUpdate, onRemove, onApply }: FilterRul
<DropdownMenu>
<DropdownMenuTrigger asChild>
<button className='flex h-[30px] min-w-[50px] items-center justify-between rounded-[5px] border border-[var(--border)] bg-transparent px-2 text-[var(--text-secondary)] text-xs outline-none hover-hover:border-[var(--border-1)]'>
<span>{OPERATOR_LABELS[rule.operator] ?? rule.operator}</span>
<button className='flex h-[28px] min-w-[90px] items-center justify-between rounded-[5px] border border-[var(--border)] bg-transparent px-2 text-[var(--text-secondary)] text-xs outline-none hover-hover:border-[var(--border-1)]'>
<span className='truncate'>{OPERATOR_LABELS[rule.operator] ?? rule.operator}</span>
<ChevronDown className='ml-1 h-[10px] w-[10px] shrink-0 text-[var(--text-icon)]' />
</button>
</DropdownMenuTrigger>
@@ -151,25 +202,21 @@ function FilterRuleRow({ rule, columns, onUpdate, onRemove, onApply }: FilterRul
value={rule.value}
onChange={(e) => onUpdate(rule.id, 'value', e.target.value)}
onKeyDown={(e) => {
if (e.key === 'Enter') handleApply()
if (e.key === 'Enter') onApply()
}}
placeholder='Enter a value'
className='h-[30px] min-w-[160px] flex-1 rounded-[5px] border border-[var(--border)] bg-transparent px-2 text-[var(--text-secondary)] text-xs outline-none placeholder:text-[var(--text-subtle)] hover-hover:border-[var(--border-1)] focus:border-[var(--border-1)]'
className='h-[28px] flex-1 rounded-[5px] border border-[var(--border)] bg-transparent px-2 text-[var(--text-secondary)] text-xs outline-none placeholder:text-[var(--text-subtle)] hover-hover:border-[var(--border-1)] focus:border-[var(--border-1)]'
/>
<button
onClick={() => onRemove(rule.id)}
className='flex h-[30px] w-[30px] shrink-0 items-center justify-center rounded-[5px] text-[var(--text-tertiary)] transition-colors hover-hover:bg-[var(--surface-4)] hover-hover:text-[var(--text-primary)]'
className='flex h-[28px] w-[28px] shrink-0 items-center justify-center rounded-[5px] text-[var(--text-tertiary)] transition-colors hover-hover:bg-[var(--surface-4)] hover-hover:text-[var(--text-primary)]'
>
<X className='h-[12px] w-[12px]' />
</button>
</div>
)
function handleApply() {
onApply()
}
}
})
function createRule(columns: Array<{ name: string }>): FilterRule {
return {

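The new `onToggleLogical` prop implies each non-first rule flips its connective between AND and OR when clicked. A minimal standalone sketch of that toggle, with the `FilterRule` shape assumed from the props interface above:

```typescript
// Assumed shape of FilterRule, inferred from FilterRuleRowProps above.
interface FilterRule {
  id: string
  column: string
  operator: string
  value: string
  logicalOperator: 'and' | 'or'
}

// Flip the logical operator of the rule with the given id, leaving
// the input array untouched (new array and new rule object returned).
function toggleLogical(rules: FilterRule[], id: string): FilterRule[] {
  return rules.map((rule) =>
    rule.id === id
      ? { ...rule, logicalOperator: rule.logicalOperator === 'and' ? 'or' : 'and' }
      : rule
  )
}
```

The immutable update matters here: `FilterRuleRow` is wrapped in `memo`, so a new rule object (rather than mutation in place) is what triggers the re-render of exactly the row that changed.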
View File

@@ -1,6 +1,7 @@
'use client'
import React, { useCallback, useEffect, useMemo, useRef, useState } from 'react'
import { GripVertical } from 'lucide-react'
import { useParams, useRouter } from 'next/navigation'
import {
Button,
@@ -305,6 +306,17 @@ export function Table({
return 0
}, [resizingColumn, displayColumns, columnWidths])
const dropIndicatorLeft = useMemo(() => {
if (!dropTargetColumnName) return null
let left = CHECKBOX_COL_WIDTH
for (const col of displayColumns) {
if (dropSide === 'left' && col.name === dropTargetColumnName) return left
left += columnWidths[col.name] ?? COL_WIDTH
if (dropSide === 'right' && col.name === dropTargetColumnName) return left
}
return null
}, [dropTargetColumnName, dropSide, displayColumns, columnWidths])
const isAllRowsSelected = useMemo(() => {
if (checkedRows.size > 0 && rows.length > 0 && checkedRows.size >= rows.length) {
for (const row of rows) {
@@ -367,13 +379,11 @@ export function Table({
setColumnWidths(updatedWidths)
}
const updatedOrder = columnOrderRef.current?.map((n) => (n === columnName ? newName : n))
if (updatedOrder) {
setColumnOrder(updatedOrder)
updateMetadataRef.current({
columnWidths: updatedWidths,
columnOrder: updatedOrder,
})
}
if (updatedOrder) setColumnOrder(updatedOrder)
updateMetadataRef.current({
columnWidths: updatedWidths,
columnOrder: updatedOrder,
})
updateColumnMutation.mutate({ columnName, updates: { name: newName } })
},
})
@@ -682,6 +692,7 @@ export function Table({
}
setDragColumnName(null)
setDropTargetColumnName(null)
setDropSide('left')
}, [])
const handleColumnDragLeave = useCallback(() => {
@@ -1340,8 +1351,7 @@ export function Table({
const insertColumnInOrder = useCallback(
(anchorColumn: string, newColumn: string, side: 'left' | 'right') => {
const order = columnOrderRef.current
if (!order) return
const order = columnOrderRef.current ?? schemaColumnsRef.current.map((c) => c.name)
const newOrder = [...order]
let anchorIdx = newOrder.indexOf(anchorColumn)
if (anchorIdx === -1) {
@@ -1422,12 +1432,12 @@ export function Table({
const handleDeleteColumnConfirm = useCallback(() => {
if (!deletingColumn) return
const columnToDelete = deletingColumn
const orderAtDelete = columnOrderRef.current
setDeletingColumn(null)
deleteColumnMutation.mutate(columnToDelete, {
onSuccess: () => {
const order = columnOrderRef.current
if (!order) return
const newOrder = order.filter((n) => n !== columnToDelete)
if (!orderAtDelete) return
const newOrder = orderAtDelete.filter((n) => n !== columnToDelete)
setColumnOrder(newOrder)
updateMetadataRef.current({
columnWidths: columnWidthsRef.current,
@@ -1448,6 +1458,17 @@ export function Table({
const handleFilterApply = useCallback((filter: Filter | null) => {
setQueryOptions((prev) => ({ ...prev, filter }))
}, [])
const [filterOpen, setFilterOpen] = useState(false)
const handleFilterToggle = useCallback(() => {
setFilterOpen((prev) => !prev)
}, [])
const handleFilterClose = useCallback(() => {
setFilterOpen(false)
}, [])
const columnOptions = useMemo<ColumnOption[]>(
() =>
displayColumns.map((col) => ({
@@ -1526,11 +1547,6 @@ export function Table({
[handleAddColumn, addColumnMutation.isPending]
)
const filterElement = useMemo(
() => <TableFilter columns={displayColumns} onApply={handleFilterApply} />,
[displayColumns, handleFilterApply]
)
const activeSortState = useMemo(() => {
if (!queryOptions.sort) return null
const entries = Object.entries(queryOptions.sort)
@@ -1597,7 +1613,19 @@ export function Table({
<>
<ResourceHeader icon={TableIcon} breadcrumbs={breadcrumbs} create={createAction} />
<ResourceOptionsBar sort={sortConfig} filter={filterElement} />
<ResourceOptionsBar
sort={sortConfig}
onFilterToggle={handleFilterToggle}
filterActive={filterOpen || !!queryOptions.filter}
/>
{filterOpen && (
<TableFilter
columns={displayColumns}
filter={queryOptions.filter}
onApply={handleFilterApply}
onClose={handleFilterClose}
/>
)}
</>
)}
@@ -1677,8 +1705,6 @@ export function Table({
onResize={handleColumnResize}
onResizeEnd={handleColumnResizeEnd}
isDragging={dragColumnName === column.name}
isDropTarget={dropTargetColumnName === column.name}
dropSide={dropTargetColumnName === column.name ? dropSide : undefined}
onDragStart={handleColumnDragStart}
onDragOver={handleColumnDragOver}
onDragEnd={handleColumnDragEnd}
@@ -1701,7 +1727,7 @@ export function Table({
<>
{rows.map((row, index) => {
const prevPosition = index > 0 ? rows[index - 1].position : -1
const gapCount = row.position - prevPosition - 1
const gapCount = queryOptions.filter ? 0 : row.position - prevPosition - 1
return (
<React.Fragment key={row.id}>
{gapCount > 0 && (
@@ -1755,6 +1781,12 @@ export function Table({
style={{ left: resizeIndicatorLeft }}
/>
)}
{dropIndicatorLeft !== null && (
<div
className='-translate-x-[1px] pointer-events-none absolute top-0 z-20 h-full w-[2px] bg-[var(--selection)]'
style={{ left: dropIndicatorLeft }}
/>
)}
</div>
{!isLoadingTable && !isLoadingRows && userPermissions.canEdit && (
<AddRowButton onClick={handleAppendRow} />
@@ -2581,8 +2613,6 @@ const ColumnHeaderMenu = React.memo(function ColumnHeaderMenu({
onResize,
onResizeEnd,
isDragging,
isDropTarget,
dropSide,
onDragStart,
onDragOver,
onDragEnd,
@@ -2605,8 +2635,6 @@ const ColumnHeaderMenu = React.memo(function ColumnHeaderMenu({
onResize: (columnName: string, width: number) => void
onResizeEnd: () => void
isDragging?: boolean
isDropTarget?: boolean
dropSide?: 'left' | 'right'
onDragStart?: (columnName: string) => void
onDragOver?: (columnName: string, side: 'left' | 'right') => void
onDragEnd?: () => void
@@ -2698,22 +2726,13 @@ const ColumnHeaderMenu = React.memo(function ColumnHeaderMenu({
return (
<th
className={cn(
'relative border-[var(--border)] border-r border-b bg-[var(--bg)] p-0 text-left align-middle',
'group relative border-[var(--border)] border-r border-b bg-[var(--bg)] p-0 text-left align-middle',
isDragging && 'opacity-40'
)}
draggable={!readOnly && !isRenaming}
onDragStart={handleDragStart}
onDragOver={handleDragOver}
onDrop={handleDrop}
onDragEnd={handleDragEnd}
onDragLeave={handleDragLeave}
>
{isDropTarget && dropSide === 'left' && (
<div className='pointer-events-none absolute top-0 bottom-0 left-[-1px] z-10 w-[2px] bg-[var(--selection)]' />
)}
{isDropTarget && dropSide === 'right' && (
<div className='pointer-events-none absolute top-0 right-[-1px] bottom-0 z-10 w-[2px] bg-[var(--selection)]' />
)}
{isRenaming ? (
<div className='flex h-full w-full min-w-0 items-center px-2 py-[7px]'>
<ColumnTypeIcon type={column.type} />
@@ -2738,63 +2757,73 @@ const ColumnHeaderMenu = React.memo(function ColumnHeaderMenu({
</span>
</div>
) : (
<DropdownMenu>
<DropdownMenuTrigger asChild>
<button
type='button'
className='flex h-full w-full min-w-0 cursor-pointer items-center px-2 py-[7px] outline-none active:cursor-grabbing'
>
<ColumnTypeIcon type={column.type} />
<span className='ml-1.5 min-w-0 overflow-clip text-ellipsis whitespace-nowrap font-medium text-[var(--text-primary)] text-small'>
{column.name}
</span>
<ChevronDown className='ml-2 h-[7px] w-[9px] shrink-0 text-[var(--text-muted)]' />
</button>
</DropdownMenuTrigger>
<DropdownMenuContent align='start'>
<DropdownMenuItem onSelect={() => onRenameColumn(column.name)}>
<Pencil />
Rename column
</DropdownMenuItem>
<DropdownMenuSub>
<DropdownMenuSubTrigger>
{React.createElement(COLUMN_TYPE_ICONS[column.type] ?? TypeText)}
Change type
</DropdownMenuSubTrigger>
<DropdownMenuSubContent>
{COLUMN_TYPE_OPTIONS.map((option) => (
<DropdownMenuItem
key={option.type}
disabled={column.type === option.type}
onSelect={() => onChangeType(column.name, option.type)}
>
<option.icon />
{option.label}
</DropdownMenuItem>
))}
</DropdownMenuSubContent>
</DropdownMenuSub>
<DropdownMenuSeparator />
<DropdownMenuItem onSelect={() => onInsertLeft(column.name)}>
<ArrowLeft />
Insert column left
</DropdownMenuItem>
<DropdownMenuItem onSelect={() => onInsertRight(column.name)}>
<ArrowRight />
Insert column right
</DropdownMenuItem>
<DropdownMenuSeparator />
<DropdownMenuItem onSelect={() => onToggleUnique(column.name)}>
<Fingerprint />
{column.unique ? 'Remove unique' : 'Set unique'}
</DropdownMenuItem>
<DropdownMenuSeparator />
<DropdownMenuItem onSelect={() => onDeleteColumn(column.name)}>
<Trash />
Delete column
</DropdownMenuItem>
</DropdownMenuContent>
</DropdownMenu>
<div className='flex h-full w-full min-w-0 items-center'>
<DropdownMenu>
<DropdownMenuTrigger asChild>
<button
type='button'
className='flex min-w-0 flex-1 cursor-pointer items-center px-2 py-[7px] outline-none'
>
<ColumnTypeIcon type={column.type} />
<span className='ml-1.5 min-w-0 overflow-clip text-ellipsis whitespace-nowrap font-medium text-[var(--text-primary)] text-small'>
{column.name}
</span>
<ChevronDown className='ml-1.5 h-[7px] w-[9px] shrink-0 text-[var(--text-muted)]' />
</button>
</DropdownMenuTrigger>
<DropdownMenuContent align='start'>
<DropdownMenuItem onSelect={() => onRenameColumn(column.name)}>
<Pencil />
Rename column
</DropdownMenuItem>
<DropdownMenuSub>
<DropdownMenuSubTrigger>
{React.createElement(COLUMN_TYPE_ICONS[column.type] ?? TypeText)}
Change type
</DropdownMenuSubTrigger>
<DropdownMenuSubContent>
{COLUMN_TYPE_OPTIONS.map((option) => (
<DropdownMenuItem
key={option.type}
disabled={column.type === option.type}
onSelect={() => onChangeType(column.name, option.type)}
>
<option.icon />
{option.label}
</DropdownMenuItem>
))}
</DropdownMenuSubContent>
</DropdownMenuSub>
<DropdownMenuSeparator />
<DropdownMenuItem onSelect={() => onInsertLeft(column.name)}>
<ArrowLeft />
Insert column left
</DropdownMenuItem>
<DropdownMenuItem onSelect={() => onInsertRight(column.name)}>
<ArrowRight />
Insert column right
</DropdownMenuItem>
<DropdownMenuSeparator />
<DropdownMenuItem onSelect={() => onToggleUnique(column.name)}>
<Fingerprint />
{column.unique ? 'Remove unique' : 'Set unique'}
</DropdownMenuItem>
<DropdownMenuSeparator />
<DropdownMenuItem onSelect={() => onDeleteColumn(column.name)}>
<Trash />
Delete column
</DropdownMenuItem>
</DropdownMenuContent>
</DropdownMenu>
<div
draggable
onDragStart={handleDragStart}
onDragEnd={handleDragEnd}
className='flex h-full cursor-grab items-center pr-1.5 pl-0.5 opacity-0 transition-opacity active:cursor-grabbing group-hover:opacity-100'
>
<GripVertical className='h-3 w-3 shrink-0 text-[var(--text-muted)]' />
</div>
</div>
)}
<div
className='-right-[3px] absolute top-0 z-[1] h-full w-[6px] cursor-col-resize'

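The `dropIndicatorLeft` memo above walks the visible columns left to right, accumulating widths until it reaches the drop target. Extracted as a pure function (the `CHECKBOX_COL_WIDTH` and `COL_WIDTH` values below are placeholders, not the app's actual constants):

```typescript
// Placeholder constants; the real values live in the table component.
const CHECKBOX_COL_WIDTH = 32
const COL_WIDTH = 160

// Return the x-offset (px) at which the 2px drop indicator should
// render: the left edge of the target column for side 'left', or its
// right edge for side 'right'. null when the target column is absent.
function dropIndicatorLeft(
  columns: Array<{ name: string }>,
  widths: Record<string, number>,
  target: string,
  side: 'left' | 'right'
): number | null {
  let left = CHECKBOX_COL_WIDTH
  for (const col of columns) {
    if (side === 'left' && col.name === target) return left
    left += widths[col.name] ?? COL_WIDTH
    if (side === 'right' && col.name === target) return left
  }
  return null
}
```

Checking the `side === 'right'` case after adding the column's own width is what makes a single pass cover both edges.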
View File

@@ -3,8 +3,10 @@
import { useCallback, useMemo, useRef, useState } from 'react'
import { createLogger } from '@sim/logger'
import { useParams, useRouter } from 'next/navigation'
import type { ComboboxOption } from '@/components/emcn'
import {
Button,
Combobox,
Modal,
ModalBody,
ModalContent,
@@ -16,7 +18,13 @@ import {
import { Columns3, Rows3, Table as TableIcon } from '@/components/emcn/icons'
import type { TableDefinition } from '@/lib/table'
import { generateUniqueTableName } from '@/lib/table/constants'
import type { ResourceColumn, ResourceRow } from '@/app/workspace/[workspaceId]/components'
import type {
FilterTag,
ResourceColumn,
ResourceRow,
SearchConfig,
SortConfig,
} from '@/app/workspace/[workspaceId]/components'
import { ownerCell, Resource, timeCell } from '@/app/workspace/[workspaceId]/components'
import { useUserPermissionsContext } from '@/app/workspace/[workspaceId]/providers/workspace-permissions-provider'
import { TablesListContextMenu } from '@/app/workspace/[workspaceId]/tables/components'
@@ -29,6 +37,7 @@ import {
useUploadCsvToTable,
} from '@/hooks/queries/tables'
import { useWorkspaceMembersQuery } from '@/hooks/queries/workspace'
import { useDebounce } from '@/hooks/use-debounce'
const logger = createLogger('Tables')
@@ -60,6 +69,13 @@ export function Tables() {
const [isDeleteDialogOpen, setIsDeleteDialogOpen] = useState(false)
const [activeTable, setActiveTable] = useState<TableDefinition | null>(null)
const [searchTerm, setSearchTerm] = useState('')
const debouncedSearchTerm = useDebounce(searchTerm, 300)
const [activeSort, setActiveSort] = useState<{
column: string
direction: 'asc' | 'desc'
} | null>(null)
const [rowCountFilter, setRowCountFilter] = useState<string[]>([])
const [ownerFilter, setOwnerFilter] = useState<string[]>([])
const [uploading, setUploading] = useState(false)
const [uploadProgress, setUploadProgress] = useState({ completed: 0, total: 0 })
const csvInputRef = useRef<HTMLInputElement>(null)
@@ -78,15 +94,56 @@ export function Tables() {
closeMenu: closeRowContextMenu,
} = useContextMenu()
const filteredTables = useMemo(() => {
if (!searchTerm) return tables
const term = searchTerm.toLowerCase()
return tables.filter((table) => table.name.toLowerCase().includes(term))
}, [tables, searchTerm])
const processedTables = useMemo(() => {
let result = debouncedSearchTerm
? tables.filter((t) => t.name.toLowerCase().includes(debouncedSearchTerm.toLowerCase()))
: tables
if (rowCountFilter.length > 0) {
result = result.filter((t) => {
if (rowCountFilter.includes('empty') && t.rowCount === 0) return true
if (rowCountFilter.includes('small') && t.rowCount >= 1 && t.rowCount <= 100) return true
if (rowCountFilter.includes('large') && t.rowCount > 100) return true
return false
})
}
if (ownerFilter.length > 0) {
result = result.filter((t) => ownerFilter.includes(t.createdBy))
}
const col = activeSort?.column ?? 'created'
const dir = activeSort?.direction ?? 'desc'
return [...result].sort((a, b) => {
let cmp = 0
switch (col) {
case 'name':
cmp = a.name.localeCompare(b.name)
break
case 'columns':
cmp = a.schema.columns.length - b.schema.columns.length
break
case 'rows':
cmp = a.rowCount - b.rowCount
break
case 'created':
cmp = new Date(a.createdAt).getTime() - new Date(b.createdAt).getTime()
break
case 'updated':
cmp = new Date(a.updatedAt).getTime() - new Date(b.updatedAt).getTime()
break
case 'owner': {
const aName = members?.find((m) => m.userId === a.createdBy)?.name ?? ''
const bName = members?.find((m) => m.userId === b.createdBy)?.name ?? ''
cmp = aName.localeCompare(bName)
break
}
}
return dir === 'asc' ? cmp : -cmp
})
}, [tables, debouncedSearchTerm, rowCountFilter, ownerFilter, activeSort, members])
const rows: ResourceRow[] = useMemo(
() =>
filteredTables.map((table) => ({
processedTables.map((table) => ({
id: table.id,
cells: {
name: {
@@ -105,16 +162,167 @@ export function Tables() {
owner: ownerCell(table.createdBy, members),
updated: timeCell(table.updatedAt),
},
sortValues: {
columns: table.schema.columns.length,
rows: table.rowCount,
created: -new Date(table.createdAt).getTime(),
updated: -new Date(table.updatedAt).getTime(),
},
})),
[filteredTables, members]
[processedTables, members]
)
const searchConfig: SearchConfig = useMemo(
() => ({
value: searchTerm,
onChange: setSearchTerm,
onClearAll: () => setSearchTerm(''),
placeholder: 'Search tables...',
}),
[searchTerm]
)
const sortConfig: SortConfig = useMemo(
() => ({
options: [
{ id: 'name', label: 'Name' },
{ id: 'columns', label: 'Columns' },
{ id: 'rows', label: 'Rows' },
{ id: 'created', label: 'Created' },
{ id: 'owner', label: 'Owner' },
{ id: 'updated', label: 'Last Updated' },
],
active: activeSort,
onSort: (column, direction) => setActiveSort({ column, direction }),
onClear: () => setActiveSort(null),
}),
[activeSort]
)
const rowCountDisplayLabel = useMemo(() => {
if (rowCountFilter.length === 0) return 'All'
if (rowCountFilter.length === 1) {
const labels: Record<string, string> = {
empty: 'Empty',
small: 'Small (1-100)',
large: 'Large (101+)',
}
return labels[rowCountFilter[0]] ?? rowCountFilter[0]
}
return `${rowCountFilter.length} selected`
}, [rowCountFilter])
const ownerDisplayLabel = useMemo(() => {
if (ownerFilter.length === 0) return 'All'
if (ownerFilter.length === 1)
return members?.find((m) => m.userId === ownerFilter[0])?.name ?? '1 member'
return `${ownerFilter.length} members`
}, [ownerFilter, members])
const memberOptions: ComboboxOption[] = useMemo(
() =>
(members ?? []).map((m) => ({
value: m.userId,
label: m.name,
iconElement: m.image ? (
<img
src={m.image}
alt={m.name}
referrerPolicy='no-referrer'
className='h-[14px] w-[14px] rounded-full border border-[var(--border)] object-cover'
/>
) : (
<span className='flex h-[14px] w-[14px] items-center justify-center rounded-full border border-[var(--border)] bg-[var(--surface-3)] font-medium text-[8px] text-[var(--text-secondary)]'>
{m.name.charAt(0).toUpperCase()}
</span>
),
})),
[members]
)
const hasActiveFilters = rowCountFilter.length > 0 || ownerFilter.length > 0
const filterContent = useMemo(
() => (
<div className='flex w-[240px] flex-col gap-3 p-3'>
<div className='flex flex-col gap-1.5'>
<span className='font-medium text-[var(--text-secondary)] text-caption'>Row Count</span>
<Combobox
options={[
{ value: 'empty', label: 'Empty' },
{ value: 'small', label: 'Small (1-100 rows)' },
{ value: 'large', label: 'Large (101+ rows)' },
]}
multiSelect
multiSelectValues={rowCountFilter}
onMultiSelectChange={setRowCountFilter}
overlayContent={
<span className='truncate text-[var(--text-primary)]'>{rowCountDisplayLabel}</span>
}
showAllOption
allOptionLabel='All'
size='sm'
className='h-[32px] w-full rounded-md'
/>
</div>
{memberOptions.length > 0 && (
<div className='flex flex-col gap-1.5'>
<span className='font-medium text-[var(--text-secondary)] text-caption'>Owner</span>
<Combobox
options={memberOptions}
multiSelect
multiSelectValues={ownerFilter}
onMultiSelectChange={setOwnerFilter}
overlayContent={
<span className='truncate text-[var(--text-primary)]'>{ownerDisplayLabel}</span>
}
searchable
searchPlaceholder='Search members...'
showAllOption
allOptionLabel='All'
size='sm'
className='h-[32px] w-full rounded-md'
/>
</div>
)}
{hasActiveFilters && (
<button
type='button'
onClick={() => {
setRowCountFilter([])
setOwnerFilter([])
}}
className='flex h-[32px] w-full items-center justify-center rounded-md text-[var(--text-secondary)] text-caption transition-colors hover-hover:bg-[var(--surface-active)]'
>
Clear all filters
</button>
)}
</div>
),
[
rowCountFilter,
ownerFilter,
memberOptions,
rowCountDisplayLabel,
ownerDisplayLabel,
hasActiveFilters,
]
)
const filterTags: FilterTag[] = useMemo(() => {
const tags: FilterTag[] = []
if (rowCountFilter.length > 0) {
const rowLabels: Record<string, string> = { empty: 'Empty', small: 'Small', large: 'Large' }
const label =
rowCountFilter.length === 1
? `Rows: ${rowLabels[rowCountFilter[0]]}`
: `Rows: ${rowCountFilter.length} selected`
tags.push({ label, onRemove: () => setRowCountFilter([]) })
}
if (ownerFilter.length > 0) {
const label =
ownerFilter.length === 1
? `Owner: ${members?.find((m) => m.userId === ownerFilter[0])?.name ?? '1 member'}`
: `Owner: ${ownerFilter.length} members`
tags.push({ label, onRemove: () => setOwnerFilter([]) })
}
return tags
}, [rowCountFilter, ownerFilter, members])
const handleContentContextMenu = useCallback(
(e: React.MouseEvent) => {
const target = e.target as HTMLElement
@@ -215,7 +423,7 @@ export function Tables() {
}
}
},
[workspaceId, router]
[workspaceId, router, uploadCsv]
)
const handleListUploadCsv = useCallback(() => {
@@ -260,12 +468,10 @@ export function Tables() {
onClick: handleCreateTable,
disabled: uploading || userPermissions.canEdit !== true || createTable.isPending,
}}
search={{
value: searchTerm,
onChange: setSearchTerm,
placeholder: 'Search tables...',
}}
defaultSort='created'
search={searchConfig}
sort={sortConfig}
filter={filterContent}
filterTags={filterTags}
headerActions={[
{
label: uploadButtonLabel,

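The `processedTables` memo above combines filter, then sort: compute an ascending comparison per column and flip the sign for descending order. A simplified, testable sketch of that comparator pattern (field names trimmed to a subset of the real table shape):

```typescript
// Minimal subset of the table row shape used by processedTables.
interface TableRow {
  name: string
  rowCount: number
  createdAt: string
}

// Ascending comparison for one sort column; unknown columns compare equal.
function compareTables(a: TableRow, b: TableRow, col: string): number {
  switch (col) {
    case 'name':
      return a.name.localeCompare(b.name)
    case 'rows':
      return a.rowCount - b.rowCount
    case 'created':
      return new Date(a.createdAt).getTime() - new Date(b.createdAt).getTime()
    default:
      return 0
  }
}

// Sort a copy (never the input) so memoized upstream data stays stable.
function sortTables(rows: TableRow[], col: string, dir: 'asc' | 'desc'): TableRow[] {
  return [...rows].sort((a, b) => (dir === 'asc' ? 1 : -1) * compareTables(a, b, col))
}
```

Spreading into a new array before `sort` mirrors `[...result].sort(...)` in the diff: `Array.prototype.sort` mutates in place, and mutating a memoized input would corrupt the dependency tracking.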
View File

@@ -14,6 +14,7 @@ import {
ModalHeader,
} from '@/components/emcn'
import { client } from '@/lib/auth/auth-client'
import type { OAuthReturnContext } from '@/lib/credentials/client-state'
import { writeOAuthReturnContext } from '@/lib/credentials/client-state'
import {
getCanonicalScopesForProvider,
@@ -27,17 +28,22 @@ import { useCreateCredentialDraft } from '@/hooks/queries/credentials'
const logger = createLogger('ConnectCredentialModal')
export interface ConnectCredentialModalProps {
interface ConnectCredentialModalBaseProps {
isOpen: boolean
onClose: () => void
provider: OAuthProvider
serviceId: string
workspaceId: string
workflowId: string
/** Number of existing credentials for this provider — used to detect a successful new connection. */
credentialCount: number
}
export type ConnectCredentialModalProps = ConnectCredentialModalBaseProps &
(
| { workflowId: string; knowledgeBaseId?: never }
| { workflowId?: never; knowledgeBaseId: string }
)
export function ConnectCredentialModal({
isOpen,
onClose,
@@ -45,6 +51,7 @@ export function ConnectCredentialModal({
serviceId,
workspaceId,
workflowId,
knowledgeBaseId,
credentialCount,
}: ConnectCredentialModalProps) {
const [displayName, setDisplayName] = useState('')
@@ -97,15 +104,19 @@ export function ConnectCredentialModal({
try {
await createDraft.mutateAsync({ workspaceId, providerId, displayName: trimmedName })
writeOAuthReturnContext({
origin: 'workflow',
workflowId,
const baseContext = {
displayName: trimmedName,
providerId,
preCount: credentialCount,
workspaceId,
requestedAt: Date.now(),
})
}
const returnContext: OAuthReturnContext = knowledgeBaseId
? { ...baseContext, origin: 'kb-connectors' as const, knowledgeBaseId }
: { ...baseContext, origin: 'workflow' as const, workflowId: workflowId! }
writeOAuthReturnContext(returnContext)
if (providerId === 'trello') {
window.location.href = '/api/auth/trello/authorize'

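The new `ConnectCredentialModalProps` uses a discriminated-union intersection so `workflowId` and `knowledgeBaseId` are mutually exclusive at the type level, matching the two `origin` branches written into the OAuth return context. The same pattern in isolation:

```typescript
// Two mutually exclusive origins, discriminated by the 'origin' tag.
// Field names mirror the diff; the function is illustrative only.
type ReturnContext =
  | { origin: 'workflow'; workflowId: string }
  | { origin: 'kb-connectors'; knowledgeBaseId: string }

// Narrowing on 'origin' lets TypeScript prove which id field exists,
// so no non-null assertions are needed inside either branch.
function describeContext(ctx: ReturnContext): string {
  return ctx.origin === 'workflow'
    ? `workflow:${ctx.workflowId}`
    : `kb:${ctx.knowledgeBaseId}`
}
```

Passing both ids, or neither, fails to compile, which is exactly the constraint the `workflowId?: never` / `knowledgeBaseId?: never` variants enforce on the modal's props.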
View File

@@ -7,6 +7,7 @@ import { Button, Combobox } from '@/components/emcn/components'
import { getSubscriptionAccessState } from '@/lib/billing/client'
import { getEnv, isTruthy } from '@/lib/core/config/env'
import { getPollingProviderFromOAuth } from '@/lib/credential-sets/providers'
import { consumeOAuthReturnContext, writeOAuthReturnContext } from '@/lib/credentials/client-state'
import {
getCanonicalScopesForProvider,
getProviderIdFromServiceId,
@@ -357,7 +358,18 @@ export function CredentialSelector({
</div>
<Button
variant='active'
onClick={() => setShowOAuthModal(true)}
onClick={() => {
writeOAuthReturnContext({
origin: 'workflow',
workflowId: activeWorkflowId || '',
displayName: selectedCredential?.name ?? getProviderName(provider),
providerId: effectiveProviderId,
preCount: credentials.length,
workspaceId,
requestedAt: Date.now(),
})
setShowOAuthModal(true)
}}
className='w-full px-2 py-1 font-medium text-caption'
>
Update access
@@ -380,7 +392,10 @@ export function CredentialSelector({
{showOAuthModal && (
<OAuthRequiredModal
isOpen={showOAuthModal}
onClose={() => setShowOAuthModal(false)}
onClose={() => {
consumeOAuthReturnContext()
setShowOAuthModal(false)
}}
provider={provider}
toolName={getProviderName(provider)}
requiredScopes={getCanonicalScopesForProvider(effectiveProviderId)}

View File

@@ -7,9 +7,9 @@ import { useParams } from 'next/navigation'
import { Button, Combobox } from '@/components/emcn/components'
import { Progress } from '@/components/ui/progress'
import { cn } from '@/lib/core/utils/cn'
import type { WorkspaceFileRecord } from '@/lib/uploads/contexts/workspace'
import { getExtensionFromMimeType } from '@/lib/uploads/utils/file-utils'
import { useSubBlockValue } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/hooks/use-sub-block-value'
import { useWorkspaceFiles } from '@/hooks/queries/workspace-files'
import { useWorkflowRegistry } from '@/stores/workflows/registry/store'
import { useWorkflowStore } from '@/stores/workflows/workflow/store'
@@ -150,8 +150,6 @@ export function FileUpload({
const [storeValue, setStoreValue] = useSubBlockValue(blockId, subBlockId)
const [uploadingFiles, setUploadingFiles] = useState<UploadingFile[]>([])
const [uploadProgress, setUploadProgress] = useState(0)
const [workspaceFiles, setWorkspaceFiles] = useState<WorkspaceFileRecord[]>([])
const [loadingWorkspaceFiles, setLoadingWorkspaceFiles] = useState(false)
const [uploadError, setUploadError] = useState<string | null>(null)
const [inputValue, setInputValue] = useState('')
@@ -163,26 +161,14 @@ export function FileUpload({
const params = useParams()
const workspaceId = params?.workspaceId as string
const {
data: workspaceFiles = [],
isLoading: loadingWorkspaceFiles,
refetch: refetchWorkspaceFiles,
} = useWorkspaceFiles(isPreview ? '' : workspaceId)
const value = isPreview ? previewValue : storeValue
const loadWorkspaceFiles = async () => {
if (!workspaceId || isPreview) return
try {
setLoadingWorkspaceFiles(true)
const response = await fetch(`/api/workspaces/${workspaceId}/files`)
const data = await response.json()
if (data.success) {
setWorkspaceFiles(data.files || [])
}
} catch (error) {
logger.error('Error loading workspace files:', error)
} finally {
setLoadingWorkspaceFiles(false)
}
}
/**
* Checks if a file's MIME type matches the accepted types
* Supports exact matches, wildcard patterns (e.g., 'image/*'), and '*' for all types
@@ -226,10 +212,6 @@ export function FileUpload({
return !isAlreadySelected
})
useEffect(() => {
void loadWorkspaceFiles()
}, [workspaceId])
/**
* Opens file dialog
*/
@@ -394,7 +376,7 @@ export function FileUpload({
setUploadError(null)
if (workspaceId) {
void loadWorkspaceFiles()
void refetchWorkspaceFiles()
}
if (uploadedFiles.length === 1) {
@@ -726,7 +708,7 @@ export function FileUpload({
value={inputValue}
onChange={handleComboboxChange}
onOpenChange={(open) => {
if (open) void loadWorkspaceFiles()
if (open) void refetchWorkspaceFiles()
}}
placeholder={loadingWorkspaceFiles ? 'Loading files...' : '+ Add More'}
disabled={disabled || loadingWorkspaceFiles}
@@ -746,7 +728,7 @@ export function FileUpload({
onInputChange={handleComboboxChange}
onClear={(e) => handleRemoveFile(filesArray[0], e)}
onOpenChange={(open) => {
if (open) void loadWorkspaceFiles()
if (open) void refetchWorkspaceFiles()
}}
disabled={disabled}
isLoading={loadingWorkspaceFiles}
@@ -763,7 +745,7 @@ export function FileUpload({
value={inputValue}
onChange={handleComboboxChange}
onOpenChange={(open) => {
if (open) void loadWorkspaceFiles()
if (open) void refetchWorkspaceFiles()
}}
placeholder={loadingWorkspaceFiles ? 'Loading files...' : 'Select or upload file'}
disabled={disabled || loadingWorkspaceFiles}

View File

@@ -4,6 +4,7 @@ import { createElement, useCallback, useMemo, useRef, useState } from 'react'
import { ExternalLink } from 'lucide-react'
import { useParams } from 'next/navigation'
import { Button, Combobox } from '@/components/emcn/components'
import { consumeOAuthReturnContext, writeOAuthReturnContext } from '@/lib/credentials/client-state'
import {
getCanonicalScopesForProvider,
getProviderIdFromServiceId,
@@ -222,7 +223,18 @@ export function ToolCredentialSelector({
</div>
<Button
variant='active'
onClick={() => setShowOAuthModal(true)}
onClick={() => {
writeOAuthReturnContext({
origin: 'workflow',
workflowId: effectiveWorkflowId || '',
displayName: selectedCredential?.name ?? getProviderName(provider),
providerId: effectiveProviderId,
preCount: credentials.length,
workspaceId,
requestedAt: Date.now(),
})
setShowOAuthModal(true)
}}
className='w-full px-2 py-1 font-medium text-caption'
>
Update access
@@ -245,7 +257,10 @@ export function ToolCredentialSelector({
{showOAuthModal && (
<OAuthRequiredModal
isOpen={showOAuthModal}
onClose={() => setShowOAuthModal(false)}
onClose={() => {
consumeOAuthReturnContext()
setShowOAuthModal(false)
}}
provider={provider}
toolName={getProviderName(provider)}
requiredScopes={getCanonicalScopesForProvider(effectiveProviderId)}

View File

@@ -19,6 +19,7 @@ import { createLogger } from '@sim/logger'
import { useShallow } from 'zustand/react/shallow'
import { useSession } from '@/lib/auth/auth-client'
import type { OAuthConnectEventDetail } from '@/lib/copilot/tools/client/base-tool'
import { consumeOAuthReturnContext, writeOAuthReturnContext } from '@/lib/credentials/client-state'
import type { OAuthProvider } from '@/lib/oauth'
import { BLOCK_DIMENSIONS, CONTAINER_DIMENSIONS } from '@/lib/workflows/blocks/block-dimensions'
import { TriggerUtils } from '@/lib/workflows/triggers/triggers'
@@ -263,7 +264,7 @@ const WorkflowContent = React.memo(
const params = useParams()
const router = useRouter()
const reactFlowInstance = useReactFlow()
const { screenToFlowPosition, getNodes, setNodes, getIntersectingNodes } = reactFlowInstance
const { screenToFlowPosition, getNodes, setNodes } = reactFlowInstance
const { fitViewToBounds, getViewportCenter } = useCanvasViewport(reactFlowInstance, {
embedded,
})
@@ -478,6 +479,17 @@ const WorkflowContent = React.memo(
const handleOpenOAuthConnect = (event: Event) => {
const detail = (event as CustomEvent<OAuthConnectEventDetail>).detail
if (!detail) return
writeOAuthReturnContext({
origin: 'workflow',
workflowId: workflowIdParam,
displayName: detail.providerName,
providerId: detail.providerId,
preCount: 0,
workspaceId,
requestedAt: Date.now(),
})
setOauthModal({
provider: detail.providerId as OAuthProvider,
serviceId: detail.serviceId,
@@ -490,7 +502,7 @@ const WorkflowContent = React.memo(
window.addEventListener('open-oauth-connect', handleOpenOAuthConnect as EventListener)
return () =>
window.removeEventListener('open-oauth-connect', handleOpenOAuthConnect as EventListener)
}, [])
}, [workflowIdParam, workspaceId])
const { diffAnalysis, isShowingDiff, isDiffReady, reapplyDiffMarkers, hasActiveDiff } =
useWorkflowDiffStore(
@@ -2849,38 +2861,29 @@ const WorkflowContent = React.memo(
)
/**
* Finds the best node at a given flow position for drop-on-block connection.
* Skips subflow containers as they have their own connection logic.
* Finds the node under the cursor using DOM hit-testing for pixel-perfect
* detection that matches exactly what the user sees on screen.
* Uses the same approach as ReactFlow's internal handle detection.
*/
const findNodeAtPosition = useCallback(
(position: { x: number; y: number }) => {
const cursorRect = {
x: position.x - 1,
y: position.y - 1,
width: 2,
height: 2,
const findNodeAtScreenPosition = useCallback(
(clientX: number, clientY: number) => {
const elements = document.elementsFromPoint(clientX, clientY)
const nodes = getNodes()
for (const el of elements) {
const nodeEl = el.closest('.react-flow__node') as HTMLElement | null
if (!nodeEl) continue
const nodeId = nodeEl.getAttribute('data-id')
if (!nodeId) continue
const node = nodes.find((n) => n.id === nodeId)
if (node && node.type !== 'subflowNode') return node
}
const intersecting = getIntersectingNodes(cursorRect, true).filter(
(node) => node.type !== 'subflowNode'
)
if (intersecting.length === 0) return undefined
if (intersecting.length === 1) return intersecting[0]
return intersecting.reduce((closest, node) => {
const getDistance = (n: Node) => {
const absPos = getNodeAbsolutePosition(n.id)
const dims = getBlockDimensions(n.id)
const centerX = absPos.x + dims.width / 2
const centerY = absPos.y + dims.height / 2
return Math.hypot(position.x - centerX, position.y - centerY)
}
return getDistance(node) < getDistance(closest) ? node : closest
})
return undefined
},
[getIntersectingNodes, getNodeAbsolutePosition, getBlockDimensions]
[getNodes]
)
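The DOM hit-testing in `findNodeAtScreenPosition` can be distilled to a pure selection step: walk the elements under the cursor in stacking order and return the first matching node that is not a subflow container. A minimal sketch (types and names here are illustrative, not the app's real ones):

```typescript
// Illustrative stand-in for the app's ReactFlow node type.
interface FlowNode {
  id: string
  type: string
}

// Given node ids in top-to-bottom stacking order (the order
// document.elementsFromPoint yields matched .react-flow__node elements)
// and the current node list, return the topmost non-subflow node hit,
// or undefined if the cursor is over no eligible node.
function pickTopmostNode(hitNodeIds: string[], nodes: FlowNode[]): FlowNode | undefined {
  for (const id of hitNodeIds) {
    const node = nodes.find((n) => n.id === id)
    if (node && node.type !== 'subflowNode') return node
  }
  return undefined
}
```

Because `document.elementsFromPoint` works in screen pixels, this matches exactly what the user sees, unlike the earlier flow-coordinate rectangle intersection, which had to approximate with a 2×2 cursor box and a center-distance tiebreak.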
/**
@@ -3005,15 +3008,9 @@ const WorkflowContent = React.memo(
return
}
// Get cursor position in flow coordinates
// Find node under cursor using DOM hit-testing
const clientPos = 'changedTouches' in event ? event.changedTouches[0] : event
const flowPosition = screenToFlowPosition({
x: clientPos.clientX,
y: clientPos.clientY,
})
// Find node under cursor
const targetNode = findNodeAtPosition(flowPosition)
const targetNode = findNodeAtScreenPosition(clientPos.clientX, clientPos.clientY)
// Create connection if valid target found (handle-to-body case)
if (targetNode && targetNode.id !== source.nodeId) {
@@ -3027,7 +3024,7 @@ const WorkflowContent = React.memo(
connectionSourceRef.current = null
},
[screenToFlowPosition, findNodeAtPosition, onConnect]
[findNodeAtScreenPosition, onConnect]
)
/** Handles node drag to detect container intersections and update highlighting. */
@@ -4118,7 +4115,10 @@ const WorkflowContent = React.memo(
<Suspense fallback={null}>
<LazyOAuthRequiredModal
isOpen={true}
onClose={() => setOauthModal(null)}
onClose={() => {
consumeOAuthReturnContext()
setOauthModal(null)
}}
provider={oauthModal.provider}
toolName={oauthModal.providerName}
serviceId={oauthModal.serviceId}

View File

@@ -245,9 +245,6 @@ export function CollapsedTaskFlyoutItem({
title={task.name}
isActive={!!task.isActive}
isUnread={!!task.isUnread}
statusIndicatorClassName={
!(isCurrentRoute || isMenuOpen) ? 'group-hover:hidden' : undefined
}
/>
</Link>
{showActions && (

View File

@@ -3,7 +3,7 @@
import { useCallback, useMemo } from 'react'
import { useQueryClient } from '@tanstack/react-query'
import { useParams, usePathname, useRouter } from 'next/navigation'
import { ChevronDown, Skeleton, Tooltip } from '@/components/emcn'
import { ChevronDown, Skeleton } from '@/components/emcn'
import { useSession } from '@/lib/auth/auth-client'
import { getSubscriptionAccessState } from '@/lib/billing/client'
import { isHosted } from '@/lib/core/config/feature-flags'
@@ -15,6 +15,7 @@ import {
isBillingEnabled,
sectionConfig,
} from '@/app/workspace/[workspaceId]/settings/navigation'
import { SidebarTooltip } from '@/app/workspace/[workspaceId]/w/components/sidebar/sidebar'
import { useSSOProviders } from '@/ee/sso/hooks/sso'
import { prefetchWorkspaceCredentials } from '@/hooks/queries/credentials'
import { prefetchGeneralSettings, useGeneralSettings } from '@/hooks/queries/general-settings'
@@ -186,25 +187,18 @@ export function SettingsSidebar({
<>
{/* Back button */}
<div className='mt-2.5 flex flex-shrink-0 flex-col gap-0.5 px-2'>
<Tooltip.Root key={`back-${isCollapsed}`}>
<Tooltip.Trigger asChild>
<button
type='button'
onClick={handleBack}
className='group mx-0.5 flex h-[30px] items-center gap-2 rounded-lg px-2 text-sm hover-hover:bg-[var(--surface-hover)]'
>
<div className='flex h-[16px] w-[16px] flex-shrink-0 items-center justify-center text-[var(--text-icon)]'>
<ChevronDown className='h-[10px] w-[10px] rotate-90' />
</div>
<span className='truncate font-base text-[var(--text-body)]'>Back</span>
</button>
</Tooltip.Trigger>
{showCollapsedTooltips && (
<Tooltip.Content side='right'>
<p>Back</p>
</Tooltip.Content>
)}
</Tooltip.Root>
<SidebarTooltip label='Back' enabled={showCollapsedTooltips}>
<button
type='button'
onClick={handleBack}
className='group mx-0.5 flex h-[30px] items-center gap-2 rounded-lg px-2 text-sm hover-hover:bg-[var(--surface-hover)]'
>
<div className='flex h-[16px] w-[16px] flex-shrink-0 items-center justify-center text-[var(--text-icon)]'>
<ChevronDown className='h-[10px] w-[10px] rotate-90' />
</div>
<span className='truncate font-base text-[var(--text-body)]'>Back</span>
</button>
</SidebarTooltip>
</div>
{/* Settings sections */}
@@ -303,14 +297,13 @@ export function SettingsSidebar({
)
return (
<Tooltip.Root key={`${item.id}-${isCollapsed}`}>
<Tooltip.Trigger asChild>{element}</Tooltip.Trigger>
{showCollapsedTooltips && (
<Tooltip.Content side='right'>
<p>{item.label}</p>
</Tooltip.Content>
)}
</Tooltip.Root>
<SidebarTooltip
key={`${item.id}-${isCollapsed}`}
label={item.label}
enabled={showCollapsedTooltips}
>
{element}
</SidebarTooltip>
)
})}
</div>

View File

@@ -35,7 +35,7 @@ interface WorkflowItemProps {
active: boolean
level: number
dragDisabled?: boolean
onWorkflowClick: (workflowId: string, shiftKey: boolean, metaKey: boolean) => void
onWorkflowClick: (workflowId: string, shiftKey: boolean) => void
onDragStart?: () => void
onDragEnd?: () => void
}
@@ -368,13 +368,15 @@ export function WorkflowItem({
return
}
const isModifierClick = e.shiftKey || e.metaKey || e.ctrlKey
if (e.metaKey || e.ctrlKey) {
return
}
if (isModifierClick) {
if (e.shiftKey) {
e.preventDefault()
}
onWorkflowClick(workflow.id, e.shiftKey, e.metaKey || e.ctrlKey)
onWorkflowClick(workflow.id, e.shiftKey)
},
[shouldPreventClickRef, workflow.id, onWorkflowClick, isEditing]
)

View File

@@ -9,8 +9,9 @@ interface UseTaskSelectionProps {
}
/**
* Hook for managing task selection with support for single, range, and toggle selection.
* Handles shift-click for range selection and cmd/ctrl-click for toggle.
* Hook for managing task selection with support for single and range selection.
* Handles shift-click for range selection.
* cmd/ctrl+click is handled by the browser (opens in new tab) and never reaches this handler.
* Uses the last selected task as the anchor point for range selections.
* Selecting tasks clears workflow/folder selections and vice versa.
*/
@@ -18,16 +19,14 @@ export function useTaskSelection({ taskIds }: UseTaskSelectionProps) {
const selectedTasks = useFolderStore((s) => s.selectedTasks)
const handleTaskClick = useCallback(
(taskId: string, shiftKey: boolean, metaKey: boolean) => {
(taskId: string, shiftKey: boolean) => {
const {
selectTaskOnly,
selectTaskRange,
toggleTaskSelection,
lastSelectedTaskId: anchor,
} = useFolderStore.getState()
if (metaKey) {
toggleTaskSelection(taskId)
} else if (shiftKey && anchor && anchor !== taskId) {
if (shiftKey && anchor && anchor !== taskId) {
selectTaskRange(taskIds, anchor, taskId)
} else if (shiftKey) {
toggleTaskSelection(taskId)

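With cmd/ctrl-click now left to the browser, the remaining decision table in `handleTaskClick` is small: shift-click with a distinct anchor selects a range, shift-click without one toggles, and a plain click selects only the clicked task. A sketch of that decision as a pure function (the action shape is illustrative, not the store's real API):

```typescript
// Illustrative action union; the real code calls store methods
// (selectTaskRange / toggleTaskSelection / selectTaskOnly) directly.
type SelectionAction =
  | { kind: 'range'; from: string; to: string }
  | { kind: 'toggle'; id: string }
  | { kind: 'selectOnly'; id: string }

function resolveTaskClick(
  taskId: string,
  shiftKey: boolean,
  anchor: string | null
): SelectionAction {
  // Shift-click with a different last-selected task: range selection.
  if (shiftKey && anchor && anchor !== taskId) {
    return { kind: 'range', from: anchor, to: taskId }
  }
  // Shift-click with no usable anchor: toggle the clicked task.
  if (shiftKey) {
    return { kind: 'toggle', id: taskId }
  }
  // Plain click: select only this task.
  return { kind: 'selectOnly', id: taskId }
}
```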
View File

@@ -60,18 +60,15 @@ export function useWorkflowSelection({
}, [workflowAncestorFolderIds])
/**
* Handle workflow click with support for shift-click range selection and cmd/ctrl-click toggle.
* Handle workflow click with support for shift-click range selection.
* cmd/ctrl+click is handled by the browser (opens in new tab) and never reaches this handler.
*
* @param workflowId - ID of clicked workflow
* @param shiftKey - Whether shift key was pressed
* @param metaKey - Whether cmd (Mac) or ctrl (Windows) key was pressed
*/
const handleWorkflowClick = useCallback(
(workflowId: string, shiftKey: boolean, metaKey: boolean) => {
if (metaKey) {
toggleWorkflowSelection(workflowId)
deselectConflictingFolders()
} else if (shiftKey && activeWorkflowId && activeWorkflowId !== workflowId) {
(workflowId: string, shiftKey: boolean) => {
if (shiftKey && activeWorkflowId && activeWorkflowId !== workflowId) {
selectRange(workflowIds, activeWorkflowId, workflowId)
deselectConflictingFolders()
} else if (shiftKey) {

View File

@@ -100,6 +100,28 @@ import { useWorkflowRegistry } from '@/stores/workflows/registry/store'
const logger = createLogger('Sidebar')
export function SidebarTooltip({
children,
label,
enabled,
side = 'right',
}: {
children: React.ReactElement
label: string
enabled: boolean
side?: 'right' | 'bottom'
}) {
if (!enabled) return children
return (
<Tooltip.Root>
<Tooltip.Trigger asChild>{children}</Tooltip.Trigger>
<Tooltip.Content side={side}>
<p>{label}</p>
</Tooltip.Content>
</Tooltip.Root>
)
}
function SidebarItemSkeleton() {
return (
<div className='sidebar-collapse-hide mx-0.5 flex h-[30px] items-center gap-2 rounded-lg px-2'>
@@ -129,77 +151,68 @@ const SidebarTaskItem = memo(function SidebarTaskItem({
isUnread: boolean
isMenuOpen: boolean
showCollapsedTooltips: boolean
onMultiSelectClick: (taskId: string, shiftKey: boolean, metaKey: boolean) => void
onMultiSelectClick: (taskId: string, shiftKey: boolean) => void
onContextMenu: (e: React.MouseEvent, taskId: string) => void
onMorePointerDown: () => void
onMoreClick: (e: React.MouseEvent<HTMLButtonElement>, taskId: string) => void
}) {
return (
<Tooltip.Root>
<Tooltip.Trigger asChild>
<Link
href={task.href}
className={cn(
'group mx-0.5 flex h-[30px] items-center gap-2 rounded-lg px-2 text-sm',
!(isCurrentRoute || isSelected || isMenuOpen) &&
'hover-hover:bg-[var(--surface-hover)]',
(isCurrentRoute || isSelected || isMenuOpen) && 'bg-[var(--surface-active)]'
)}
onClick={(e) => {
if (task.id === 'new') return
if (e.shiftKey || e.metaKey || e.ctrlKey) {
e.preventDefault()
onMultiSelectClick(task.id, e.shiftKey, e.metaKey || e.ctrlKey)
} else {
useFolderStore.setState({
selectedTasks: new Set<string>(),
lastSelectedTaskId: task.id,
})
}
}}
onContextMenu={task.id !== 'new' ? (e) => onContextMenu(e, task.id) : undefined}
>
<Blimp className='h-[16px] w-[16px] flex-shrink-0 text-[var(--text-icon)]' />
<div className='min-w-0 flex-1 truncate font-base text-[var(--text-body)]'>
{task.name}
<SidebarTooltip label={task.name} enabled={showCollapsedTooltips}>
<Link
href={task.href}
className={cn(
'group mx-0.5 flex h-[30px] items-center gap-2 rounded-lg px-2 text-sm',
!(isCurrentRoute || isSelected || isMenuOpen) && 'hover-hover:bg-[var(--surface-hover)]',
(isCurrentRoute || isSelected || isMenuOpen) && 'bg-[var(--surface-active)]'
)}
onClick={(e) => {
if (task.id === 'new') return
if (e.metaKey || e.ctrlKey) return
if (e.shiftKey) {
e.preventDefault()
onMultiSelectClick(task.id, true)
} else {
useFolderStore.setState({
selectedTasks: new Set<string>(),
lastSelectedTaskId: task.id,
})
}
}}
onContextMenu={task.id !== 'new' ? (e) => onContextMenu(e, task.id) : undefined}
>
<Blimp className='h-[16px] w-[16px] flex-shrink-0 text-[var(--text-icon)]' />
<div className='min-w-0 flex-1 truncate font-base text-[var(--text-body)]'>{task.name}</div>
{task.id !== 'new' && (
<div className='relative flex h-[18px] w-[18px] flex-shrink-0 items-center justify-center'>
{isActive && !isCurrentRoute && !isMenuOpen && (
<span className='absolute h-[7px] w-[7px] animate-ping rounded-full bg-amber-400 opacity-30 group-hover:hidden' />
)}
{isActive && !isCurrentRoute && !isMenuOpen && (
<span className='absolute h-[7px] w-[7px] rounded-full bg-amber-400 group-hover:hidden' />
)}
{!isActive && isUnread && !isCurrentRoute && !isMenuOpen && (
<span className='absolute h-[7px] w-[7px] rounded-full bg-[var(--brand-accent)] group-hover:hidden' />
)}
<button
type='button'
aria-label='Task options'
onPointerDown={onMorePointerDown}
onClick={(e) => {
e.preventDefault()
e.stopPropagation()
onMoreClick(e, task.id)
}}
className={cn(
'flex h-[18px] w-[18px] items-center justify-center rounded-sm opacity-0 group-hover:opacity-100',
isMenuOpen && 'opacity-100'
)}
>
<MoreHorizontal className='h-[16px] w-[16px] text-[var(--text-icon)]' />
</button>
</div>
{task.id !== 'new' && (
<div className='relative flex h-[18px] w-[18px] flex-shrink-0 items-center justify-center'>
{isActive && !isCurrentRoute && (
<span className='absolute h-[7px] w-[7px] animate-ping rounded-full bg-amber-400 opacity-30 group-hover:hidden' />
)}
{isActive && !isCurrentRoute && (
<span className='absolute h-[7px] w-[7px] rounded-full bg-amber-400 group-hover:hidden' />
)}
{!isActive && isUnread && !isCurrentRoute && (
<span className='absolute h-[7px] w-[7px] rounded-full bg-[var(--brand-accent)] group-hover:hidden' />
)}
<button
type='button'
aria-label='Task options'
onPointerDown={onMorePointerDown}
onClick={(e) => {
e.preventDefault()
e.stopPropagation()
onMoreClick(e, task.id)
}}
className={cn(
'flex h-[18px] w-[18px] items-center justify-center rounded-sm opacity-0 group-hover:opacity-100',
isMenuOpen && 'opacity-100'
)}
>
<MoreHorizontal className='h-[16px] w-[16px] text-[var(--text-icon)]' />
</button>
</div>
)}
</Link>
</Tooltip.Trigger>
{showCollapsedTooltips && (
<Tooltip.Content side='right'>
<p>{task.name}</p>
</Tooltip.Content>
)}
</Tooltip.Root>
)}
</Link>
</SidebarTooltip>
)
})
@@ -265,15 +278,12 @@ const SidebarNavItem = memo(function SidebarNavItem({
</button>
) : null
if (!element) return null
return (
<Tooltip.Root>
<Tooltip.Trigger asChild>{element}</Tooltip.Trigger>
{showCollapsedTooltips && (
<Tooltip.Content side='right'>
<p>{item.label}</p>
</Tooltip.Content>
)}
</Tooltip.Root>
<SidebarTooltip label={item.label} enabled={showCollapsedTooltips}>
{element}
</SidebarTooltip>
)
})
@@ -317,6 +327,7 @@ export const Sidebar = memo(function Sidebar() {
const setSidebarWidth = useSidebarStore((state) => state.setSidebarWidth)
const isCollapsed = useSidebarStore((state) => state.isCollapsed)
const toggleCollapsed = useSidebarStore((state) => state.toggleCollapsed)
const _hasHydrated = useSidebarStore((state) => state._hasHydrated)
const isOnWorkflowPage = !!workflowId
const isCollapsedRef = useRef(isCollapsed)
@@ -326,14 +337,12 @@ export const Sidebar = memo(function Sidebar() {
const isMac = useMemo(() => isMacPlatform(), [])
// Delay collapsed tooltips until the width transition finishes.
const [showCollapsedTooltips, setShowCollapsedTooltips] = useState(isCollapsed)
useLayoutEffect(() => {
if (!isCollapsed) {
document.documentElement.removeAttribute('data-sidebar-collapsed')
}
}, [isCollapsed])
if (!_hasHydrated) return
document.documentElement.removeAttribute('data-sidebar-collapsed')
}, [_hasHydrated])
useEffect(() => {
if (isCollapsed) {
@@ -1010,10 +1019,6 @@ export const Sidebar = memo(function Sidebar() {
[importWorkspace]
)
// ── Memoised elements & objects for collapsed menus ──
// Prevents new JSX/object references on every render, which would defeat
// React.memo on CollapsedSidebarMenu and its children.
const tasksCollapsedIcon = useMemo(
() => <Blimp className='h-[16px] w-[16px] flex-shrink-0 text-[var(--text-icon)]' />,
[]
@@ -1054,9 +1059,6 @@ export const Sidebar = memo(function Sidebar() {
[handleCreateWorkflow]
)
// Stable no-op for collapsed workflow context menu delete (never changes)
const noop = useCallback(() => {}, [])
const handleExpandSidebar = useCallback(
(e: React.MouseEvent) => {
e.preventDefault()
@@ -1065,16 +1067,13 @@ export const Sidebar = memo(function Sidebar() {
[toggleCollapsed]
)
// Stable callback for the "New task" button in expanded mode
const handleNewTask = useCallback(
() => navigateToPage(`/workspace/${workspaceId}/home`),
[navigateToPage, workspaceId]
)
// Stable callback for "See more" tasks
const handleSeeMoreTasks = useCallback(() => setVisibleTaskCount((prev) => prev + 5), [])
// Stable callback for DeleteModal close
const handleCloseTaskDeleteModal = useCallback(() => setIsTaskDeleteModalOpen(false), [])
const handleEdgeKeyDown = useCallback(
@@ -1087,16 +1086,13 @@ export const Sidebar = memo(function Sidebar() {
[isCollapsed, toggleCollapsed]
)
// Stable handler for help modal open from dropdown
const handleOpenHelpFromMenu = useCallback(() => setIsHelpModalOpen(true), [])
// Stable handler for opening docs
const handleOpenDocs = useCallback(
() => window.open('https://docs.sim.ai', '_blank', 'noopener,noreferrer'),
[]
)
// Stable blur handlers for inline rename inputs
const handleTaskRenameBlur = useCallback(
() => void taskFlyoutRename.saveRename(),
[taskFlyoutRename.saveRename]
@@ -1107,7 +1103,6 @@ export const Sidebar = memo(function Sidebar() {
[workflowFlyoutRename.saveRename]
)
// Stable style for hidden file inputs
const hiddenStyle = useMemo(() => ({ display: 'none' }) as const, [])
const resolveWorkspaceIdFromPath = useCallback((): string | undefined => {
@@ -1205,69 +1200,68 @@ export const Sidebar = memo(function Sidebar() {
onClick={handleSidebarClick}
>
<div className='flex h-full flex-col pt-3'>
{/* Top bar: Logo + Collapse toggle */}
<div className='flex flex-shrink-0 items-center pr-2 pb-2 pl-2.5'>
<div className='flex h-[30px] items-center'>
<Tooltip.Root>
<Tooltip.Trigger asChild>
<div className='relative h-[30px]'>
<Link
href={`/workspace/${workspaceId}/home`}
className='sidebar-collapse-hide !transition-none group flex h-[30px] items-center rounded-[8px] px-[7px] hover-hover:bg-[var(--surface-hover)]'
tabIndex={isCollapsed ? -1 : undefined}
aria-label={brand.name}
>
{brand.logoUrl ? (
<Image
src={brand.logoUrl}
alt={brand.name}
width={16}
height={16}
className='h-[16px] w-[16px] flex-shrink-0 object-contain'
unoptimized
/>
) : (
<Wordmark className='h-[16px] w-auto text-[var(--text-body)]' />
)}
</Link>
<SidebarTooltip label='Expand sidebar' enabled={showCollapsedTooltips}>
<Link
href={`/workspace/${workspaceId}/home`}
onClick={isCollapsed ? handleExpandSidebar : undefined}
className='group flex h-[30px] items-center rounded-[8px] px-1.5 hover-hover:bg-[var(--surface-hover)]'
aria-label={isCollapsed ? 'Expand sidebar' : brand.name}
onClick={handleExpandSidebar}
className='sidebar-collapse-show !transition-none group absolute top-0 left-0 flex h-[30px] w-[30px] items-center justify-center rounded-[8px] hover-hover:bg-[var(--surface-hover)]'
tabIndex={isCollapsed ? undefined : -1}
aria-label='Expand sidebar'
>
{brand.logoUrl ? (
<Image
src={brand.logoUrl}
alt={brand.name}
alt=''
width={16}
height={16}
className={cn(
'h-[16px] w-[16px] flex-shrink-0 object-contain',
isCollapsed && 'group-hover:hidden'
)}
className='h-[16px] w-[16px] flex-shrink-0 object-contain group-hover:hidden'
unoptimized
/>
) : isCollapsed ? (
<Sim className='h-[16px] w-[16px] flex-shrink-0 group-hover:hidden' />
) : (
<Wordmark className='h-[16px] w-auto text-[var(--text-body)]' />
)}
{isCollapsed && (
<PanelLeft className='hidden h-[16px] w-[16px] flex-shrink-0 rotate-180 text-[var(--text-icon)] group-hover:block' />
<Sim className='h-[16px] w-[16px] flex-shrink-0 group-hover:hidden' />
)}
<PanelLeft className='hidden h-[16px] w-[16px] rotate-180 text-[var(--text-icon)] group-hover:block' />
</Link>
</Tooltip.Trigger>
{showCollapsedTooltips && (
<Tooltip.Content side='right'>
<p>Expand sidebar</p>
</Tooltip.Content>
)}
</Tooltip.Root>
</SidebarTooltip>
</div>
</div>
<Tooltip.Root>
<Tooltip.Trigger asChild>
<button
type='button'
onClick={toggleCollapsed}
className={cn(
'sidebar-collapse-btn ml-auto flex h-[30px] items-center justify-center overflow-hidden rounded-lg transition-all duration-200 hover-hover:bg-[var(--surface-hover)]',
isCollapsed ? 'w-0 opacity-0' : 'w-[30px] opacity-100'
)}
aria-label='Collapse sidebar'
>
<PanelLeft className='h-[16px] w-[16px] flex-shrink-0 text-[var(--text-icon)]' />
</button>
</Tooltip.Trigger>
{!isCollapsed && (
<Tooltip.Content side='bottom'>
<p>Collapse sidebar</p>
</Tooltip.Content>
)}
</Tooltip.Root>
<SidebarTooltip label='Collapse sidebar' enabled={!isCollapsed} side='bottom'>
<button
type='button'
onClick={toggleCollapsed}
className={cn(
'sidebar-collapse-btn ml-auto flex h-[30px] items-center justify-center overflow-hidden rounded-lg transition-all duration-200 hover-hover:bg-[var(--surface-hover)]',
isCollapsed ? 'w-0 opacity-0' : 'w-[30px] opacity-100'
)}
aria-label='Collapse sidebar'
>
<PanelLeft className='h-[16px] w-[16px] flex-shrink-0 text-[var(--text-icon)]' />
</button>
</SidebarTooltip>
</div>
{/* Workspace Header */}
<div className='flex-shrink-0 pr-2.5 pl-[9px]'>
<WorkspaceHeader
activeWorkspace={activeWorkspace}
@@ -1299,7 +1293,6 @@ export const Sidebar = memo(function Sidebar() {
/>
) : (
<>
{/* Top Navigation: Home, Search */}
<div className='mt-2.5 flex flex-shrink-0 flex-col gap-0.5 px-2'>
{topNavItems.map((item) => (
<SidebarNavItem
@@ -1312,7 +1305,6 @@ export const Sidebar = memo(function Sidebar() {
))}
</div>
{/* Workspace */}
<div className='mt-3.5 flex flex-shrink-0 flex-col pb-2'>
<div className='px-4 pb-1.5'>
<div className='font-base text-[var(--text-icon)] text-small'>Workspace</div>
@@ -1330,7 +1322,6 @@ export const Sidebar = memo(function Sidebar() {
</div>
</div>
{/* Scrollable Tasks + Workflows */}
<div
ref={isCollapsed ? undefined : scrollContainerRef}
className={cn(
@@ -1338,9 +1329,11 @@ export const Sidebar = memo(function Sidebar() {
!hasOverflowTop && 'border-transparent'
)}
>
{/* Tasks */}
<div className='tasks-section flex flex-shrink-0 flex-col' data-tour='nav-tasks'>
<div className='flex h-[18px] flex-shrink-0 items-center justify-between px-4'>
<div
className='tasks-section mx-2 flex flex-shrink-0 flex-col'
data-tour='nav-tasks'
>
<div className='flex h-[18px] flex-shrink-0 items-center justify-between px-2'>
<div className='font-base text-[var(--text-icon)] text-small'>All tasks</div>
{!isCollapsed && (
<div className='flex items-center justify-center gap-2'>
@@ -1460,12 +1453,11 @@ export const Sidebar = memo(function Sidebar() {
)}
</div>
{/* Workflows */}
<div
className='workflows-section relative mt-3.5 flex flex-col'
className='workflows-section relative mx-2 mt-3.5 flex flex-col'
data-tour='nav-workflows'
>
<div className='flex h-[18px] flex-shrink-0 items-center justify-between px-4'>
<div className='flex h-[18px] flex-shrink-0 items-center justify-between px-2'>
<div className='font-base text-[var(--text-icon)] text-small'>Workflows</div>
{!isCollapsed && (
<div className='flex items-center justify-center gap-2'>
@@ -1612,36 +1604,27 @@ export const Sidebar = memo(function Sidebar() {
</div>
</div>
{/* Footer */}
<div
className={cn(
'flex flex-shrink-0 flex-col gap-0.5 border-t px-2 pt-[9px] pb-2 transition-colors duration-150',
!hasOverflowBottom && 'border-transparent'
)}
>
{/* Help dropdown */}
<DropdownMenu>
<Tooltip.Root>
<SidebarTooltip label='Help' enabled={showCollapsedTooltips}>
<DropdownMenuTrigger asChild>
<Tooltip.Trigger asChild>
<button
type='button'
data-item-id='help'
className='group mx-0.5 flex h-[30px] items-center gap-2 rounded-[8px] px-2 text-[14px] hover-hover:bg-[var(--surface-hover)]'
>
<HelpCircle className='h-[16px] w-[16px] flex-shrink-0 text-[var(--text-icon)]' />
<span className='sidebar-collapse-hide truncate font-base text-[var(--text-body)]'>
Help
</span>
</button>
</Tooltip.Trigger>
<button
type='button'
data-item-id='help'
className='group mx-0.5 flex h-[30px] items-center gap-2 rounded-[8px] px-2 text-[14px] hover-hover:bg-[var(--surface-hover)]'
>
<HelpCircle className='h-[16px] w-[16px] flex-shrink-0 text-[var(--text-icon)]' />
<span className='sidebar-collapse-hide truncate font-base text-[var(--text-body)]'>
Help
</span>
</button>
</DropdownMenuTrigger>
{showCollapsedTooltips && (
<Tooltip.Content side='right'>
<p>Help</p>
</Tooltip.Content>
)}
</Tooltip.Root>
</SidebarTooltip>
<DropdownMenuContent align='start' side='top' sideOffset={4}>
<DropdownMenuItem onSelect={handleOpenDocs}>
<BookOpen className='h-[14px] w-[14px]' />
@@ -1669,7 +1652,6 @@ export const Sidebar = memo(function Sidebar() {
))}
</div>
{/* Nav Item Context Menu */}
<NavItemContextMenu
isOpen={isNavContextMenuOpen}
position={navContextMenuPosition}
@@ -1679,7 +1661,6 @@ export const Sidebar = memo(function Sidebar() {
onCopyLink={handleNavCopyLink}
/>
{/* Task Context Menu */}
<ContextMenu
isOpen={isTaskContextMenuOpen}
position={taskContextMenuPosition}
@@ -1704,7 +1685,6 @@ export const Sidebar = memo(function Sidebar() {
disableDelete={!canEdit}
/>
{/* Task Delete Confirmation Modal */}
<DeleteModal
isOpen={isTaskDeleteModalOpen}
onClose={handleCloseTaskDeleteModal}
@@ -1735,7 +1715,6 @@ export const Sidebar = memo(function Sidebar() {
)}
</div>
{/* Universal Search Modal */}
<SearchModal
open={isSearchModalOpen}
onOpenChange={setIsSearchModalOpen}
@@ -1748,14 +1727,12 @@ export const Sidebar = memo(function Sidebar() {
isOnWorkflowPage={!!workflowId}
/>
{/* Footer Navigation Modals */}
<HelpModal
open={isHelpModalOpen}
onOpenChange={setIsHelpModalOpen}
workflowId={workflowId}
workspaceId={workspaceId}
/>
{/* Hidden file input for workspace import */}
<input
ref={workspaceFileInputRef}
type='file'

View File

@@ -247,7 +247,7 @@ function formatCost(cost?: Record<string, unknown>): string {
}
function buildLogUrl(workspaceId: string, executionId: string): string {
return `${getBaseUrl()}/workspace/${workspaceId}/logs?search=${encodeURIComponent(executionId)}`
return `${getBaseUrl()}/workspace/${workspaceId}/logs?executionId=${encodeURIComponent(executionId)}`
}
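The switch from `?search=` to `?executionId=` gives notification links a stable deep-link target: the logs page can auto-select the exact execution rather than running a free-text filter. A self-contained sketch of the URL shape (`BASE_URL` is a stand-in for the app's `getBaseUrl()`):

```typescript
// Stand-in for getBaseUrl(); the real value comes from app config.
const BASE_URL = 'https://app.example.com'

function buildLogUrl(workspaceId: string, executionId: string): string {
  // encodeURIComponent keeps ids with reserved characters safe in the query.
  return `${BASE_URL}/workspace/${workspaceId}/logs?executionId=${encodeURIComponent(executionId)}`
}
```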
function formatAlertReason(alertConfig: AlertConfig): string {

View File

@@ -250,9 +250,9 @@ export const FileV2Block: BlockConfig<FileParserOutput> = {
export const FileV3Block: BlockConfig<FileParserV3Output> = {
type: 'file_v3',
name: 'File',
description: 'Read and parse multiple files',
description: 'Read and write workspace files',
longDescription:
'Upload files directly or import from external URLs to get UserFile objects for use in other blocks.',
'Read and parse files from uploads or URLs, write new workspace files, or append content to existing files.',
docsLink: 'https://docs.sim.ai/tools/file',
category: 'tools',
integrationType: IntegrationType.FileStorage,
@@ -260,6 +260,17 @@ export const FileV3Block: BlockConfig<FileParserV3Output> = {
bgColor: '#40916C',
icon: DocumentIcon,
subBlocks: [
{
id: 'operation',
title: 'Operation',
type: 'dropdown' as SubBlockType,
options: [
{ label: 'Read', id: 'file_parser_v3' },
{ label: 'Write', id: 'file_write' },
{ label: 'Append', id: 'file_append' },
],
value: () => 'file_parser_v3',
},
{
id: 'file',
title: 'Files',
@@ -270,7 +281,8 @@ export const FileV3Block: BlockConfig<FileParserV3Output> = {
multiple: true,
mode: 'basic',
maxSize: 100,
required: true,
required: { field: 'operation', value: 'file_parser_v3' },
condition: { field: 'operation', value: 'file_parser_v3' },
},
{
id: 'fileUrl',
@@ -279,15 +291,105 @@ export const FileV3Block: BlockConfig<FileParserV3Output> = {
canonicalParamId: 'fileInput',
placeholder: 'https://example.com/document.pdf',
mode: 'advanced',
required: true,
required: { field: 'operation', value: 'file_parser_v3' },
condition: { field: 'operation', value: 'file_parser_v3' },
},
{
id: 'fileName',
title: 'File Name',
type: 'short-input' as SubBlockType,
placeholder: 'File name (e.g., data.csv)',
condition: { field: 'operation', value: 'file_write' },
required: { field: 'operation', value: 'file_write' },
},
{
id: 'content',
title: 'Content',
type: 'long-input' as SubBlockType,
placeholder: 'File content to write...',
condition: { field: 'operation', value: 'file_write' },
required: { field: 'operation', value: 'file_write' },
},
{
id: 'contentType',
title: 'Content Type',
type: 'short-input' as SubBlockType,
placeholder: 'text/plain (auto-detected from extension)',
condition: { field: 'operation', value: 'file_write' },
mode: 'advanced',
},
{
id: 'appendFile',
title: 'File',
type: 'file-upload' as SubBlockType,
canonicalParamId: 'appendFileInput',
acceptedTypes: '.txt,.md,.json,.csv,.xml,.html,.htm,.yaml,.yml,.log,.rtf',
placeholder: 'Select or upload a workspace file',
mode: 'basic',
condition: { field: 'operation', value: 'file_append' },
required: { field: 'operation', value: 'file_append' },
},
{
id: 'appendFileName',
title: 'File',
type: 'short-input' as SubBlockType,
canonicalParamId: 'appendFileInput',
placeholder: 'File name (e.g., notes.md)',
mode: 'advanced',
condition: { field: 'operation', value: 'file_append' },
required: { field: 'operation', value: 'file_append' },
},
{
id: 'appendContent',
title: 'Content',
type: 'long-input' as SubBlockType,
placeholder: 'Content to append...',
condition: { field: 'operation', value: 'file_append' },
required: { field: 'operation', value: 'file_append' },
},
],
tools: {
access: ['file_parser_v3'],
access: ['file_parser_v3', 'file_write', 'file_append'],
config: {
tool: () => 'file_parser_v3',
tool: (params) => params.operation || 'file_parser_v3',
params: (params) => {
// Use canonical 'fileInput' param directly
const operation = params.operation || 'file_parser_v3'
if (operation === 'file_write') {
return {
fileName: params.fileName,
content: params.content,
contentType: params.contentType,
workspaceId: params._context?.workspaceId,
}
}
if (operation === 'file_append') {
const appendInput = params.appendFileInput
if (!appendInput) {
throw new Error('File is required for append')
}
let fileName: string
if (typeof appendInput === 'string') {
fileName = appendInput.trim()
} else {
const normalized = normalizeFileInput(appendInput, { single: true })
const file = normalized as Record<string, unknown> | null
fileName = (file?.name as string) ?? ''
}
if (!fileName) {
throw new Error('Could not determine file name')
}
return {
fileName,
content: params.appendContent,
workspaceId: params._context?.workspaceId,
}
}
const fileInput = params.fileInput
if (!fileInput) {
logger.error('No file input provided')
@@ -326,17 +428,39 @@ export const FileV3Block: BlockConfig<FileParserV3Output> = {
},
},
inputs: {
fileInput: { type: 'json', description: 'File input (canonical param)' },
fileType: { type: 'string', description: 'File type' },
operation: { type: 'string', description: 'Operation to perform (read, write, or append)' },
fileInput: { type: 'json', description: 'File input for read' },
fileType: { type: 'string', description: 'File type for read' },
fileName: { type: 'string', description: 'Name for a new file (write)' },
content: { type: 'string', description: 'File content to write' },
contentType: { type: 'string', description: 'MIME content type for write' },
appendFileInput: { type: 'json', description: 'File to append to' },
appendContent: { type: 'string', description: 'Content to append to file' },
},
outputs: {
files: {
type: 'file[]',
description: 'Parsed files as UserFile objects',
description: 'Parsed files as UserFile objects (read)',
},
combinedContent: {
type: 'string',
description: 'All file contents merged into a single text string',
description: 'All file contents merged into a single text string (read)',
},
id: {
type: 'string',
description: 'File ID (write)',
},
name: {
type: 'string',
description: 'File name (write)',
},
size: {
type: 'number',
description: 'File size in bytes (write)',
},
url: {
type: 'string',
description: 'URL to access the file (write)',
},
},
}
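The append branch of the params mapper accepts either a plain string (treated as a file name) or a file object (contributing its `.name`). A simplified sketch of that resolution, assuming the input is already normalized to a single file — the real code routes objects through `normalizeFileInput` first:

```typescript
// Illustrative input type: a file name string or a normalized file object.
type AppendInput = string | { name?: string }

function resolveAppendFileName(input: AppendInput | undefined): string {
  if (!input) throw new Error('File is required for append')
  // Strings are file names directly; objects contribute their .name.
  const fileName = typeof input === 'string' ? input.trim() : (input.name ?? '')
  if (!fileName) throw new Error('Could not determine file name')
  return fileName
}
```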

View File

@@ -0,0 +1,406 @@
import { ProfoundIcon } from '@/components/icons'
import type { BlockConfig } from '@/blocks/types'
import { AuthMode, IntegrationType } from '@/blocks/types'
const CATEGORY_REPORT_OPS = [
'visibility_report',
'sentiment_report',
'citations_report',
'prompt_answers',
'query_fanouts',
] as const
const DOMAIN_REPORT_OPS = ['bots_report', 'referrals_report', 'raw_logs', 'bot_logs'] as const
const ALL_REPORT_OPS = [...CATEGORY_REPORT_OPS, ...DOMAIN_REPORT_OPS] as const
const CATEGORY_ID_OPS = [
...CATEGORY_REPORT_OPS,
'category_topics',
'category_tags',
'category_prompts',
'category_assets',
'category_personas',
] as const
const DATE_REQUIRED_CATEGORY_OPS = [
'visibility_report',
'sentiment_report',
'citations_report',
'prompt_answers',
'query_fanouts',
'prompt_volume',
] as const
const DATE_REQUIRED_ALL_OPS = [...DATE_REQUIRED_CATEGORY_OPS, ...DOMAIN_REPORT_OPS] as const
const METRICS_REPORT_OPS = [
'visibility_report',
'sentiment_report',
'citations_report',
'bots_report',
'referrals_report',
'query_fanouts',
'prompt_volume',
] as const
const DIMENSION_OPS = [
'visibility_report',
'sentiment_report',
'citations_report',
'bots_report',
'referrals_report',
'query_fanouts',
'raw_logs',
'bot_logs',
'prompt_volume',
] as const
const FILTER_OPS = [...ALL_REPORT_OPS, 'prompt_volume'] as const
export const ProfoundBlock: BlockConfig = {
type: 'profound',
name: 'Profound',
description: 'AI visibility and analytics with Profound',
longDescription:
'Track how your brand appears across AI platforms. Monitor visibility scores, sentiment, citations, bot traffic, referrals, content optimization, and prompt volumes with Profound.',
docsLink: 'https://docs.sim.ai/tools/profound',
category: 'tools',
integrationType: IntegrationType.Analytics,
tags: ['seo', 'data-analytics'],
bgColor: '#000000',
icon: ProfoundIcon,
authMode: AuthMode.ApiKey,
subBlocks: [
{
id: 'operation',
title: 'Operation',
type: 'dropdown',
options: [
{ label: 'List Categories', id: 'list_categories' },
{ label: 'List Regions', id: 'list_regions' },
{ label: 'List Models', id: 'list_models' },
{ label: 'List Domains', id: 'list_domains' },
{ label: 'List Assets', id: 'list_assets' },
{ label: 'List Personas', id: 'list_personas' },
{ label: 'Category Topics', id: 'category_topics' },
{ label: 'Category Tags', id: 'category_tags' },
{ label: 'Category Prompts', id: 'category_prompts' },
{ label: 'Category Assets', id: 'category_assets' },
{ label: 'Category Personas', id: 'category_personas' },
{ label: 'Visibility Report', id: 'visibility_report' },
{ label: 'Sentiment Report', id: 'sentiment_report' },
{ label: 'Citations Report', id: 'citations_report' },
{ label: 'Query Fanouts', id: 'query_fanouts' },
{ label: 'Prompt Answers', id: 'prompt_answers' },
{ label: 'Bots Report', id: 'bots_report' },
{ label: 'Referrals Report', id: 'referrals_report' },
{ label: 'Raw Logs', id: 'raw_logs' },
{ label: 'Bot Logs', id: 'bot_logs' },
{ label: 'List Optimizations', id: 'list_optimizations' },
{ label: 'Optimization Analysis', id: 'optimization_analysis' },
{ label: 'Prompt Volume', id: 'prompt_volume' },
{ label: 'Citation Prompts', id: 'citation_prompts' },
],
value: () => 'visibility_report',
},
{
id: 'apiKey',
title: 'API Key',
type: 'short-input',
placeholder: 'Enter your Profound API key',
required: true,
password: true,
},
// Category ID - for category-based operations
{
id: 'categoryId',
title: 'Category ID',
type: 'short-input',
placeholder: 'Category UUID',
required: { field: 'operation', value: [...CATEGORY_ID_OPS] },
condition: { field: 'operation', value: [...CATEGORY_ID_OPS] },
},
// Domain - for domain-based operations
{
id: 'domain',
title: 'Domain',
type: 'short-input',
placeholder: 'e.g. example.com',
required: { field: 'operation', value: [...DOMAIN_REPORT_OPS] },
condition: { field: 'operation', value: [...DOMAIN_REPORT_OPS] },
},
// Input domain - for citation prompts
{
id: 'inputDomain',
title: 'Domain',
type: 'short-input',
placeholder: 'e.g. ramp.com',
required: { field: 'operation', value: 'citation_prompts' },
condition: { field: 'operation', value: 'citation_prompts' },
},
// Asset ID - for content optimization
{
id: 'assetId',
title: 'Asset ID',
type: 'short-input',
placeholder: 'Asset UUID',
required: { field: 'operation', value: ['list_optimizations', 'optimization_analysis'] },
condition: { field: 'operation', value: ['list_optimizations', 'optimization_analysis'] },
},
// Content ID - for optimization analysis
{
id: 'contentId',
title: 'Content ID',
type: 'short-input',
placeholder: 'Content/optimization UUID',
required: { field: 'operation', value: 'optimization_analysis' },
condition: { field: 'operation', value: 'optimization_analysis' },
},
// Date fields
{
id: 'startDate',
title: 'Start Date',
type: 'short-input',
placeholder: 'YYYY-MM-DD',
required: { field: 'operation', value: [...DATE_REQUIRED_ALL_OPS] },
condition: { field: 'operation', value: [...DATE_REQUIRED_ALL_OPS] },
wandConfig: {
enabled: true,
prompt: 'Generate a date in YYYY-MM-DD format. Return ONLY the date string.',
generationType: 'timestamp',
},
},
{
id: 'endDate',
title: 'End Date',
type: 'short-input',
placeholder: 'YYYY-MM-DD',
required: { field: 'operation', value: [...DATE_REQUIRED_CATEGORY_OPS] },
condition: { field: 'operation', value: [...DATE_REQUIRED_ALL_OPS] },
wandConfig: {
enabled: true,
prompt: 'Generate a date in YYYY-MM-DD format. Return ONLY the date string.',
generationType: 'timestamp',
},
},
// Per-operation metrics fields
{
id: 'visibilityMetrics',
title: 'Metrics',
type: 'short-input',
placeholder: 'share_of_voice, visibility_score, mentions_count',
required: { field: 'operation', value: 'visibility_report' },
condition: { field: 'operation', value: 'visibility_report' },
},
{
id: 'sentimentMetrics',
title: 'Metrics',
type: 'short-input',
placeholder: 'positive, negative, occurrences',
required: { field: 'operation', value: 'sentiment_report' },
condition: { field: 'operation', value: 'sentiment_report' },
},
{
id: 'citationsMetrics',
title: 'Metrics',
type: 'short-input',
placeholder: 'count, citation_share',
required: { field: 'operation', value: 'citations_report' },
condition: { field: 'operation', value: 'citations_report' },
},
{
id: 'botsMetrics',
title: 'Metrics',
type: 'short-input',
placeholder: 'count, citations, indexing, training',
required: { field: 'operation', value: 'bots_report' },
condition: { field: 'operation', value: 'bots_report' },
},
{
id: 'referralsMetrics',
title: 'Metrics',
type: 'short-input',
placeholder: 'visits, last_visit',
required: { field: 'operation', value: 'referrals_report' },
condition: { field: 'operation', value: 'referrals_report' },
},
{
id: 'fanoutsMetrics',
title: 'Metrics',
type: 'short-input',
placeholder: 'fanouts_per_execution, total_fanouts, share',
required: { field: 'operation', value: 'query_fanouts' },
condition: { field: 'operation', value: 'query_fanouts' },
},
{
id: 'volumeMetrics',
title: 'Metrics',
type: 'short-input',
placeholder: 'volume, change',
required: { field: 'operation', value: 'prompt_volume' },
condition: { field: 'operation', value: 'prompt_volume' },
},
// Advanced fields
{
id: 'dimensions',
title: 'Dimensions',
type: 'short-input',
placeholder: 'e.g. date, asset_name, model',
condition: { field: 'operation', value: [...DIMENSION_OPS] },
mode: 'advanced',
},
{
id: 'dateInterval',
title: 'Date Interval',
type: 'dropdown',
options: [
{ label: 'Day', id: 'day' },
{ label: 'Hour', id: 'hour' },
{ label: 'Week', id: 'week' },
{ label: 'Month', id: 'month' },
{ label: 'Year', id: 'year' },
],
condition: { field: 'operation', value: [...METRICS_REPORT_OPS] },
mode: 'advanced',
},
{
id: 'filters',
title: 'Filters',
type: 'long-input',
placeholder: '[{"field":"asset_name","operator":"is","value":"Company"}]',
condition: { field: 'operation', value: [...FILTER_OPS] },
mode: 'advanced',
wandConfig: {
enabled: true,
prompt:
'Generate a JSON array of filter objects. Each object has "field", "operator", and "value" keys. Return ONLY valid JSON.',
generationType: 'json-object',
},
},
{
id: 'limit',
title: 'Limit',
type: 'short-input',
placeholder: '10000',
condition: {
field: 'operation',
value: [...FILTER_OPS, 'category_prompts', 'list_optimizations'],
},
mode: 'advanced',
},
// Category prompts specific fields
{
id: 'cursor',
title: 'Cursor',
type: 'short-input',
placeholder: 'Pagination cursor from previous response',
condition: { field: 'operation', value: 'category_prompts' },
mode: 'advanced',
},
{
id: 'promptType',
title: 'Prompt Type',
type: 'short-input',
placeholder: 'visibility, sentiment',
condition: { field: 'operation', value: 'category_prompts' },
mode: 'advanced',
},
// Optimization list specific
{
id: 'offset',
title: 'Offset',
type: 'short-input',
placeholder: '0',
condition: { field: 'operation', value: 'list_optimizations' },
mode: 'advanced',
},
],
tools: {
access: [
'profound_list_categories',
'profound_list_regions',
'profound_list_models',
'profound_list_domains',
'profound_list_assets',
'profound_list_personas',
'profound_category_topics',
'profound_category_tags',
'profound_category_prompts',
'profound_category_assets',
'profound_category_personas',
'profound_visibility_report',
'profound_sentiment_report',
'profound_citations_report',
'profound_query_fanouts',
'profound_prompt_answers',
'profound_bots_report',
'profound_referrals_report',
'profound_raw_logs',
'profound_bot_logs',
'profound_list_optimizations',
'profound_optimization_analysis',
'profound_prompt_volume',
'profound_citation_prompts',
],
config: {
tool: (params) => `profound_${params.operation}`,
params: (params) => {
const result: Record<string, unknown> = {}
const metricsMap: Record<string, string> = {
visibility_report: 'visibilityMetrics',
sentiment_report: 'sentimentMetrics',
citations_report: 'citationsMetrics',
bots_report: 'botsMetrics',
referrals_report: 'referralsMetrics',
query_fanouts: 'fanoutsMetrics',
prompt_volume: 'volumeMetrics',
}
const metricsField = metricsMap[params.operation as string]
if (metricsField && params[metricsField]) {
result.metrics = params[metricsField]
}
if (params.limit != null) result.limit = Number(params.limit)
if (params.offset != null) result.offset = Number(params.offset)
return result
},
},
},
inputs: {
apiKey: { type: 'string' },
categoryId: { type: 'string' },
domain: { type: 'string' },
inputDomain: { type: 'string' },
assetId: { type: 'string' },
contentId: { type: 'string' },
startDate: { type: 'string' },
endDate: { type: 'string' },
metrics: { type: 'string' },
dimensions: { type: 'string' },
dateInterval: { type: 'string' },
filters: { type: 'string' },
limit: { type: 'number' },
offset: { type: 'number' },
cursor: { type: 'string' },
promptType: { type: 'string' },
},
outputs: {
response: {
type: 'json',
},
},
}
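The `tools.config` above derives the tool id directly from the selected operation and folds the per-operation metrics subblock into a single `metrics` param. Sketched standalone (only a few operations shown; names mirror the block config):

```typescript
type Params = Record<string, unknown>

// Per-operation metrics field, mirroring the block's metricsMap.
const metricsMap: Record<string, string> = {
  visibility_report: 'visibilityMetrics',
  sentiment_report: 'sentimentMetrics',
  prompt_volume: 'volumeMetrics',
}

// The tool id is built from the operation dropdown value.
function resolveTool(params: Params): string {
  return `profound_${params.operation}`
}

function buildParams(params: Params): Record<string, unknown> {
  const result: Record<string, unknown> = {}
  const metricsField = metricsMap[params.operation as string]
  if (metricsField && params[metricsField]) {
    result.metrics = params[metricsField]
  }
  // Numeric params arrive as strings from short-input fields.
  if (params.limit != null) result.limit = Number(params.limit)
  return result
}
```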


@@ -137,6 +137,7 @@ import { PipedriveBlock } from '@/blocks/blocks/pipedrive'
import { PolymarketBlock } from '@/blocks/blocks/polymarket'
import { PostgreSQLBlock } from '@/blocks/blocks/postgresql'
import { PostHogBlock } from '@/blocks/blocks/posthog'
import { ProfoundBlock } from '@/blocks/blocks/profound'
import { PulseBlock, PulseV2Block } from '@/blocks/blocks/pulse'
import { QdrantBlock } from '@/blocks/blocks/qdrant'
import { QuiverBlock } from '@/blocks/blocks/quiver'
@@ -357,6 +358,7 @@ export const registry: Record<string, BlockConfig> = {
perplexity: PerplexityBlock,
pinecone: PineconeBlock,
pipedrive: PipedriveBlock,
profound: ProfoundBlock,
polymarket: PolymarketBlock,
postgresql: PostgreSQLBlock,
posthog: PostHogBlock,


@@ -45,12 +45,12 @@ function TourCard({
return (
<>
<div className='flex items-center justify-between gap-2 px-4 pt-4 pb-2'>
<h3 className='min-w-0 font-medium text-[var(--text-primary)] text-sm leading-none'>
<h3 className='min-w-0 font-medium text-[var(--text-primary)] text-small leading-none'>
{title}
</h3>
<Button
variant='ghost'
className='h-[16px] w-[16px] flex-shrink-0 p-0'
className='relative h-[16px] w-[16px] flex-shrink-0 p-0 before:absolute before:inset-[-14px] before:content-[""]'
onClick={onClose}
aria-label='Close tour'
>
@@ -60,24 +60,23 @@ function TourCard({
</div>
<div className='px-4 pt-1 pb-3'>
<p className='text-[12px] text-[var(--text-secondary)] leading-[1.6]'>{description}</p>
<p className='text-[var(--text-secondary)] text-caption leading-[1.6]'>{description}</p>
</div>
<div className='flex items-center justify-between border-[var(--border)] border-t px-4 py-3'>
<span className='text-[11px] text-[var(--text-muted)] [font-variant-numeric:tabular-nums]'>
<div className='flex items-center justify-between rounded-b-xl border-[var(--border)] border-t bg-[color-mix(in_srgb,var(--surface-3)_50%,transparent)] px-4 py-2'>
<span className='text-[var(--text-muted)] text-xs [font-variant-numeric:tabular-nums]'>
{step} / {totalSteps}
</span>
<div className='flex items-center gap-1.5'>
<div className={cn(isFirst && 'invisible')}>
<Button
variant='default'
size='sm'
onClick={onBack}
tabIndex={isFirst ? -1 : undefined}
>
Back
</Button>
</div>
<Button
variant='default'
size='sm'
onClick={onBack}
tabIndex={isFirst ? -1 : undefined}
className={cn(isFirst && 'invisible')}
>
Back
</Button>
<Button variant='tertiary' size='sm' onClick={onNext}>
{isLast ? 'Done' : 'Next'}
</Button>
@@ -156,7 +155,7 @@ function TourTooltip({
const isCentered = placement === 'center'
const cardClasses = cn(
'w-[260px] overflow-hidden rounded-[8px] bg-[var(--bg)]',
'w-[260px] overflow-hidden rounded-xl bg-[var(--bg)]',
isEntrance && 'animate-tour-tooltip-in motion-reduce:animate-none',
className
)
@@ -181,7 +180,7 @@ function TourTooltip({
<div
className={cn(
cardClasses,
'pointer-events-auto relative border border-[var(--border)] shadow-sm'
'pointer-events-auto relative shadow-overlay ring-1 ring-foreground/10'
)}
>
{cardContent}
@@ -202,10 +201,7 @@ function TourTooltip({
sideOffset={10}
collisionPadding={12}
avoidCollisions
className='z-[10000300] outline-none'
style={{
filter: 'drop-shadow(0 0 0.5px var(--border)) drop-shadow(0 1px 2px rgba(0,0,0,0.1))',
}}
className='z-[10000300] outline-none drop-shadow-tour'
onOpenAutoFocus={(e) => e.preventDefault()}
onCloseAutoFocus={(e) => e.preventDefault()}
>


@@ -42,6 +42,7 @@ export { Key } from './key'
export { KeySquare } from './key-square'
export { Layout } from './layout'
export { Library } from './library'
export { Link } from './link'
export { ListFilter } from './list-filter'
export { Loader } from './loader'
export { Lock } from './lock'


@@ -0,0 +1,26 @@
import type { SVGProps } from 'react'
/**
* Link icon component
* @param props - SVG properties including className, size, etc.
*/
export function Link(props: SVGProps<SVGSVGElement>) {
return (
<svg
xmlns='http://www.w3.org/2000/svg'
width='24'
height='24'
viewBox='0 0 24 24'
fill='none'
stroke='currentColor'
strokeWidth='2'
strokeLinecap='round'
strokeLinejoin='round'
aria-hidden='true'
{...props}
>
<path d='M10 13a5 5 0 0 0 7.54.54l3-3a5 5 0 0 0-7.07-7.07l-1.72 1.71' />
<path d='M14 11a5 5 0 0 0-7.54-.54l-3 3a5 5 0 0 0 7.07 7.07l1.71-1.71' />
</svg>
)
}


@@ -1285,6 +1285,17 @@ export function StartIcon(props: SVGProps<SVGSVGElement>) {
)
}
export function ProfoundIcon(props: SVGProps<SVGSVGElement>) {
return (
<svg width='1em' height='1em' viewBox='0 0 55 55' xmlns='http://www.w3.org/2000/svg' {...props}>
<path
fill='currentColor'
d='M0 36.685V21.349a7.017 7.017 0 0 1 2.906-5.69l19.742-14.25A7.443 7.443 0 0 1 27.004 0h.062c1.623 0 3.193.508 4.501 1.452l19.684 14.207a7.016 7.016 0 0 1 2.906 5.69v12.302a7.013 7.013 0 0 1-2.907 5.689L31.527 53.562A7.605 7.605 0 0 1 27.078 55a7.641 7.641 0 0 1-4.465-1.44c-2.581-1.859-6.732-4.855-6.732-4.855V29.777c0-.249.28-.393.482-.248l10.538 7.605c.106.077.249.077.355 0l13.005-9.386a.306.306 0 0 0 0-.496l-13.005-9.386a.303.303 0 0 0-.355 0L.482 36.933A.304.304 0 0 1 0 36.685Z'
/>
</svg>
)
}
export function PineconeIcon(props: SVGProps<SVGSVGElement>) {
return (
<svg


@@ -10,17 +10,36 @@ function isDarkBackground(hexColor: string): boolean {
return luminance < 0.5
}
function getContrastTextColor(hexColor: string): string {
return isDarkBackground(hexColor) ? '#ffffff' : '#000000'
}
export function generateThemeCSS(): string {
const cssVars: string[] = []
if (process.env.NEXT_PUBLIC_BRAND_PRIMARY_COLOR) {
cssVars.push(`--brand: ${process.env.NEXT_PUBLIC_BRAND_PRIMARY_COLOR};`)
// Override brand-accent so Run/Deploy buttons and other accent-styled elements use the brand color
cssVars.push(`--brand-accent: ${process.env.NEXT_PUBLIC_BRAND_PRIMARY_COLOR};`)
cssVars.push(`--auth-primary-btn-bg: ${process.env.NEXT_PUBLIC_BRAND_PRIMARY_COLOR};`)
cssVars.push(`--auth-primary-btn-border: ${process.env.NEXT_PUBLIC_BRAND_PRIMARY_COLOR};`)
cssVars.push(`--auth-primary-btn-hover-bg: ${process.env.NEXT_PUBLIC_BRAND_PRIMARY_COLOR};`)
cssVars.push(`--auth-primary-btn-hover-border: ${process.env.NEXT_PUBLIC_BRAND_PRIMARY_COLOR};`)
const primaryTextColor = getContrastTextColor(process.env.NEXT_PUBLIC_BRAND_PRIMARY_COLOR)
cssVars.push(`--auth-primary-btn-text: ${primaryTextColor};`)
cssVars.push(`--auth-primary-btn-hover-text: ${primaryTextColor};`)
}
if (process.env.NEXT_PUBLIC_BRAND_PRIMARY_HOVER_COLOR) {
cssVars.push(`--brand-hover: ${process.env.NEXT_PUBLIC_BRAND_PRIMARY_HOVER_COLOR};`)
cssVars.push(
`--auth-primary-btn-hover-bg: ${process.env.NEXT_PUBLIC_BRAND_PRIMARY_HOVER_COLOR};`
)
cssVars.push(
`--auth-primary-btn-hover-border: ${process.env.NEXT_PUBLIC_BRAND_PRIMARY_HOVER_COLOR};`
)
cssVars.push(
`--auth-primary-btn-hover-text: ${getContrastTextColor(process.env.NEXT_PUBLIC_BRAND_PRIMARY_HOVER_COLOR)};`
)
}
if (process.env.NEXT_PUBLIC_BRAND_ACCENT_COLOR) {
@@ -32,7 +51,6 @@ export function generateThemeCSS(): string {
}
if (process.env.NEXT_PUBLIC_BRAND_BACKGROUND_COLOR) {
// Add dark theme class when background is dark
const isDark = isDarkBackground(process.env.NEXT_PUBLIC_BRAND_BACKGROUND_COLOR)
if (isDark) {
cssVars.push(`--brand-is-dark: 1;`)

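The branding hunk above keys button text color off background luminance via `isDarkBackground`/`getContrastTextColor`. The diff only shows the `luminance < 0.5` threshold, so the luminance computation below is an assumption (standard BT.709 coefficients):

```typescript
// Assumed relative-luminance formula (BT.709 coefficients); the hunk above
// only shows the `< 0.5` threshold, not how luminance is derived.
function isDarkBackground(hexColor: string): boolean {
  const hex = hexColor.replace('#', '')
  const r = parseInt(hex.slice(0, 2), 16) / 255
  const g = parseInt(hex.slice(2, 4), 16) / 255
  const b = parseInt(hex.slice(4, 6), 16) / 255
  return 0.2126 * r + 0.7152 * g + 0.0722 * b < 0.5
}

function getContrastTextColor(hexColor: string): string {
  return isDarkBackground(hexColor) ? '#ffffff' : '#000000'
}
```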

@@ -1,4 +1,4 @@
import { createLogger } from '@sim/logger'
import { createLogger, type Logger } from '@sim/logger'
import { redactApiKeys } from '@/lib/core/security/redaction'
import { getBaseUrl } from '@/lib/core/utils/urls'
import {
@@ -49,12 +49,22 @@ import { SYSTEM_SUBBLOCK_IDS } from '@/triggers/constants'
const logger = createLogger('BlockExecutor')
export class BlockExecutor {
private execLogger: Logger
constructor(
private blockHandlers: BlockHandler[],
private resolver: VariableResolver,
private contextExtensions: ContextExtensions,
private state: BlockStateWriter
) {}
) {
this.execLogger = logger.withMetadata({
workflowId: this.contextExtensions.metadata?.workflowId,
workspaceId: this.contextExtensions.workspaceId,
executionId: this.contextExtensions.executionId,
userId: this.contextExtensions.userId,
requestId: this.contextExtensions.metadata?.requestId,
})
}
async execute(
ctx: ExecutionContext,
@@ -273,7 +283,7 @@ export class BlockExecutor {
}
}
logger.error(
this.execLogger.error(
phase === 'input_resolution' ? 'Failed to resolve block inputs' : 'Block execution failed',
{
blockId: node.id,
@@ -306,7 +316,7 @@ export class BlockExecutor {
if (blockLog) {
blockLog.errorHandled = true
}
logger.info('Block has error port - returning error output instead of throwing', {
this.execLogger.info('Block has error port - returning error output instead of throwing', {
blockId: node.id,
error: errorMessage,
})
@@ -358,7 +368,7 @@ export class BlockExecutor {
blockName = `${blockName} (iteration ${loopScope.iteration})`
iterationIndex = loopScope.iteration
} else {
logger.warn('Loop scope not found for block', { blockId, loopId })
this.execLogger.warn('Loop scope not found for block', { blockId, loopId })
}
}
}
@@ -462,7 +472,7 @@ export class BlockExecutor {
ctx.childWorkflowContext
)
} catch (error) {
logger.warn('Block start callback failed', {
this.execLogger.warn('Block start callback failed', {
blockId,
blockType,
error: error instanceof Error ? error.message : String(error),
@@ -508,7 +518,7 @@ export class BlockExecutor {
ctx.childWorkflowContext
)
} catch (error) {
logger.warn('Block completion callback failed', {
this.execLogger.warn('Block completion callback failed', {
blockId,
blockType,
error: error instanceof Error ? error.message : String(error),
@@ -633,7 +643,7 @@ export class BlockExecutor {
try {
await ctx.onStream?.(clientStreamingExec)
} catch (error) {
logger.error('Error in onStream callback', { blockId, error })
this.execLogger.error('Error in onStream callback', { blockId, error })
// Cancel the client stream to release the tee'd buffer
await processedClientStream.cancel().catch(() => {})
}
@@ -663,7 +673,7 @@ export class BlockExecutor {
stream: processedStream,
})
} catch (error) {
logger.error('Error in onStream callback', { blockId, error })
this.execLogger.error('Error in onStream callback', { blockId, error })
await processedStream.cancel().catch(() => {})
}
}
@@ -687,7 +697,7 @@ export class BlockExecutor {
const tail = decoder.decode()
if (tail) chunks.push(tail)
} catch (error) {
logger.error('Error reading executor stream for block', { blockId, error })
this.execLogger.error('Error reading executor stream for block', { blockId, error })
} finally {
try {
await reader.cancel().catch(() => {})
@@ -718,7 +728,10 @@ export class BlockExecutor {
}
return
} catch (error) {
logger.warn('Failed to parse streamed content for response format', { blockId, error })
this.execLogger.warn('Failed to parse streamed content for response format', {
blockId,
error,
})
}
}
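The executor diffs here all apply the same change: bind execution context once with `logger.withMetadata(...)` and route every log call through the bound `execLogger`. A rough stand-in for how such a bound logger behaves (the real `@sim/logger` API is assumed, not shown):

```typescript
type Fields = Record<string, unknown>

// Minimal stand-in for a metadata-binding logger: withMetadata returns a
// child logger whose calls merge the captured context into each record.
class Logger {
  constructor(private meta: Fields = {}) {}

  withMetadata(meta: Fields): Logger {
    return new Logger({ ...this.meta, ...meta })
  }

  error(message: string, fields: Fields = {}): Fields {
    // A real logger writes to a sink; returning the merged record keeps
    // the sketch inspectable.
    return { level: 'error', message, ...this.meta, ...fields }
  }
}

const execLogger = new Logger().withMetadata({ workflowId: 'wf_1', executionId: 'exec_1' })
const record = execLogger.error('Block execution failed', { blockId: 'b1' })
```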


@@ -1,4 +1,4 @@
import { createLogger } from '@sim/logger'
import { createLogger, type Logger } from '@sim/logger'
import { isExecutionCancelled, isRedisCancellationEnabled } from '@/lib/execution/cancellation'
import { BlockType } from '@/executor/constants'
import type { DAG } from '@/executor/dag/builder'
@@ -34,6 +34,7 @@ export class ExecutionEngine {
private readonly CANCELLATION_CHECK_INTERVAL_MS = 500
private abortPromise: Promise<void> | null = null
private abortResolve: (() => void) | null = null
private execLogger: Logger
constructor(
private context: ExecutionContext,
@@ -43,6 +44,13 @@ export class ExecutionEngine {
) {
this.allowResumeTriggers = this.context.metadata.resumeFromSnapshot === true
this.useRedisCancellation = isRedisCancellationEnabled() && !!this.context.executionId
this.execLogger = logger.withMetadata({
workflowId: this.context.workflowId,
workspaceId: this.context.workspaceId,
executionId: this.context.executionId,
userId: this.context.userId,
requestId: this.context.metadata.requestId,
})
this.initializeAbortHandler()
}
@@ -88,7 +96,9 @@ export class ExecutionEngine {
const cancelled = await isExecutionCancelled(this.context.executionId!)
if (cancelled) {
this.cancelledFlag = true
logger.info('Execution cancelled via Redis', { executionId: this.context.executionId })
this.execLogger.info('Execution cancelled via Redis', {
executionId: this.context.executionId,
})
}
return cancelled
}
@@ -169,7 +179,7 @@ export class ExecutionEngine {
this.finalizeIncompleteLogs()
const errorMessage = normalizeError(error)
logger.error('Execution failed', { error: errorMessage })
this.execLogger.error('Execution failed', { error: errorMessage })
const executionResult: ExecutionResult = {
success: false,
@@ -270,7 +280,7 @@ export class ExecutionEngine {
private initializeQueue(triggerBlockId?: string): void {
if (this.context.runFromBlockContext) {
const { startBlockId } = this.context.runFromBlockContext
logger.info('Initializing queue for run-from-block mode', {
this.execLogger.info('Initializing queue for run-from-block mode', {
startBlockId,
dirtySetSize: this.context.runFromBlockContext.dirtySet.size,
})
@@ -282,7 +292,7 @@ export class ExecutionEngine {
const remainingEdges = (this.context.metadata as any).remainingEdges
if (remainingEdges && Array.isArray(remainingEdges) && remainingEdges.length > 0) {
logger.info('Removing edges from resumed pause blocks', {
this.execLogger.info('Removing edges from resumed pause blocks', {
edgeCount: remainingEdges.length,
edges: remainingEdges,
})
@@ -294,13 +304,13 @@ export class ExecutionEngine {
targetNode.incomingEdges.delete(edge.source)
if (this.edgeManager.isNodeReady(targetNode)) {
logger.info('Node became ready after edge removal', { nodeId: targetNode.id })
this.execLogger.info('Node became ready after edge removal', { nodeId: targetNode.id })
this.addToQueue(targetNode.id)
}
}
}
logger.info('Edge removal complete, queued ready nodes', {
this.execLogger.info('Edge removal complete, queued ready nodes', {
queueLength: this.readyQueue.length,
queuedNodes: this.readyQueue,
})
@@ -309,7 +319,7 @@ export class ExecutionEngine {
}
if (pendingBlocks && pendingBlocks.length > 0) {
logger.info('Initializing queue from pending blocks (resume mode)', {
this.execLogger.info('Initializing queue from pending blocks (resume mode)', {
pendingBlocks,
allowResumeTriggers: this.allowResumeTriggers,
dagNodeCount: this.dag.nodes.size,
@@ -319,7 +329,7 @@ export class ExecutionEngine {
this.addToQueue(nodeId)
}
logger.info('Pending blocks queued', {
this.execLogger.info('Pending blocks queued', {
queueLength: this.readyQueue.length,
queuedNodes: this.readyQueue,
})
@@ -341,7 +351,7 @@ export class ExecutionEngine {
if (startNode) {
this.addToQueue(startNode.id)
} else {
logger.warn('No start node found in DAG')
this.execLogger.warn('No start node found in DAG')
}
}
@@ -373,7 +383,7 @@ export class ExecutionEngine {
}
} catch (error) {
const errorMessage = normalizeError(error)
logger.error('Node execution failed', { nodeId, error: errorMessage })
this.execLogger.error('Node execution failed', { nodeId, error: errorMessage })
throw error
}
}
@@ -385,7 +395,7 @@ export class ExecutionEngine {
): Promise<void> {
const node = this.dag.nodes.get(nodeId)
if (!node) {
logger.error('Node not found during completion', { nodeId })
this.execLogger.error('Node not found during completion', { nodeId })
return
}
@@ -409,7 +419,7 @@ export class ExecutionEngine {
// shouldContinue: true means more iterations, shouldExit: true means loop is done
const shouldContinueLoop = output.shouldContinue === true
if (!shouldContinueLoop) {
logger.info('Stopping execution after target block', { nodeId })
this.execLogger.info('Stopping execution after target block', { nodeId })
this.stoppedEarlyFlag = true
return
}
@@ -417,7 +427,7 @@ export class ExecutionEngine {
const readyNodes = this.edgeManager.processOutgoingEdges(node, output, false)
logger.info('Processing outgoing edges', {
this.execLogger.info('Processing outgoing edges', {
nodeId,
outgoingEdgesCount: node.outgoingEdges.size,
outgoingEdges: Array.from(node.outgoingEdges.entries()).map(([id, e]) => ({
@@ -435,7 +445,7 @@ export class ExecutionEngine {
if (this.context.pendingDynamicNodes && this.context.pendingDynamicNodes.length > 0) {
const dynamicNodes = this.context.pendingDynamicNodes
this.context.pendingDynamicNodes = []
logger.info('Adding dynamically expanded parallel nodes', { dynamicNodes })
this.execLogger.info('Adding dynamically expanded parallel nodes', { dynamicNodes })
this.addMultipleToQueue(dynamicNodes)
}
}
@@ -482,7 +492,7 @@ export class ExecutionEngine {
}
return parsedSnapshot.state
} catch (error) {
logger.warn('Failed to serialize execution state', {
this.execLogger.warn('Failed to serialize execution state', {
error: error instanceof Error ? error.message : String(error),
})
return undefined


@@ -1,4 +1,4 @@
import { createLogger } from '@sim/logger'
import { createLogger, type Logger } from '@sim/logger'
import { StartBlockPath } from '@/lib/workflows/triggers/triggers'
import type { DAG } from '@/executor/dag/builder'
import { DAGBuilder } from '@/executor/dag/builder'
@@ -52,6 +52,7 @@ export class DAGExecutor {
private workflowVariables: Record<string, unknown>
private contextExtensions: ContextExtensions
private dagBuilder: DAGBuilder
private execLogger: Logger
constructor(options: DAGExecutorOptions) {
this.workflow = options.workflow
@@ -60,6 +61,13 @@ export class DAGExecutor {
this.workflowVariables = options.workflowVariables ?? {}
this.contextExtensions = options.contextExtensions ?? {}
this.dagBuilder = new DAGBuilder()
this.execLogger = logger.withMetadata({
workflowId: this.contextExtensions.metadata?.workflowId,
workspaceId: this.contextExtensions.workspaceId,
executionId: this.contextExtensions.executionId,
userId: this.contextExtensions.userId,
requestId: this.contextExtensions.metadata?.requestId,
})
}
async execute(workflowId: string, triggerBlockId?: string): Promise<ExecutionResult> {
@@ -79,7 +87,9 @@ export class DAGExecutor {
_pendingBlocks: string[],
context: ExecutionContext
): Promise<ExecutionResult> {
logger.warn('Debug mode (continueExecution) is not yet implemented in the refactored executor')
this.execLogger.warn(
'Debug mode (continueExecution) is not yet implemented in the refactored executor'
)
return {
success: false,
output: {},
@@ -163,7 +173,7 @@ export class DAGExecutor {
parallelExecutions: filteredParallelExecutions,
}
logger.info('Executing from block', {
this.execLogger.info('Executing from block', {
workflowId,
startBlockId,
effectiveStartBlockId,
@@ -247,7 +257,7 @@ export class DAGExecutor {
if (overrides?.runFromBlockContext) {
const { dirtySet } = overrides.runFromBlockContext
executedBlocks = new Set([...executedBlocks].filter((id) => !dirtySet.has(id)))
logger.info('Cleared executed status for dirty blocks', {
this.execLogger.info('Cleared executed status for dirty blocks', {
dirtySetSize: dirtySet.size,
remainingExecutedBlocks: executedBlocks.size,
})
@@ -332,7 +342,7 @@ export class DAGExecutor {
if (this.contextExtensions.resumeFromSnapshot) {
context.metadata.resumeFromSnapshot = true
logger.info('Resume from snapshot enabled', {
this.execLogger.info('Resume from snapshot enabled', {
resumePendingQueue: this.contextExtensions.resumePendingQueue,
remainingEdges: this.contextExtensions.remainingEdges,
triggerBlockId,
@@ -341,14 +351,14 @@ export class DAGExecutor {
if (this.contextExtensions.remainingEdges) {
;(context.metadata as any).remainingEdges = this.contextExtensions.remainingEdges
logger.info('Set remaining edges for resume', {
this.execLogger.info('Set remaining edges for resume', {
edgeCount: this.contextExtensions.remainingEdges.length,
})
}
if (this.contextExtensions.resumePendingQueue?.length) {
context.metadata.pendingBlocks = [...this.contextExtensions.resumePendingQueue]
logger.info('Set pending blocks from resume queue', {
this.execLogger.info('Set pending blocks from resume queue', {
pendingBlocks: context.metadata.pendingBlocks,
skipStarterBlockInit: true,
})
@@ -409,7 +419,7 @@ export class DAGExecutor {
if (triggerBlockId) {
const triggerBlock = this.workflow.blocks.find((b) => b.id === triggerBlockId)
if (!triggerBlock) {
logger.error('Specified trigger block not found in workflow', {
this.execLogger.error('Specified trigger block not found in workflow', {
triggerBlockId,
})
throw new Error(`Trigger block not found: ${triggerBlockId}`)
@@ -431,7 +441,7 @@ export class DAGExecutor {
})
if (!startResolution?.block) {
logger.warn('No start block found in workflow')
this.execLogger.warn('No start block found in workflow')
return
}
}

Some files were not shown because too many files have changed in this diff.