Compare commits

...

63 Commits

Author SHA1 Message Date
Waleed
cc2be33d6b v0.5.67: loading, password reset, ui improvements, helm updates (#2928)
* fix(zustand): updated to useShallow from deprecated createWithEqualityFn (#2919)

* fix(logger): use direct env access for webpack inlining (#2920)

* fix(notifications): text overflow with line-clamp (#2921)

* chore(helm): add env vars for Vertex AI, orgs, and telemetry (#2922)

* fix(auth): improve reset password flow and consolidate brand detection (#2924)

* fix(auth): improve reset password flow and consolidate brand detection

* fix(auth): set errorHandled for EMAIL_NOT_VERIFIED to prevent duplicate error

* fix(auth): clear success message on login errors

* chore(auth): fix import order per lint

* fix(action-bar): duplicate subflows with children (#2923)

* fix(action-bar): duplicate subflows with children

* fix(action-bar): add validateTriggerPaste for subflow duplicate

* fix(resolver): agent response format, input formats, root level (#2925)

* fix(resolvers): agent response format, input formats, root level

* fix response block initial seeding

* fix tests

* fix(messages-input): fix cursor alignment and auto-resize with overlay (#2926)

* fix(messages-input): fix cursor alignment and auto-resize with overlay

* fixed remaining zustand warnings

* fix(stores): remove dead code causing log spam on startup (#2927)

* fix(stores): remove dead code causing log spam on startup

* fix(stores): replace custom tools zustand store with react query cache

* improvement(ui): use BrandedButton and BrandedLink components (#2930)

- Refactor auth forms to use BrandedButton component
- Add BrandedLink component for changelog page
- Reduce code duplication in login, signup, reset-password forms
- Update star count default value

* fix(custom-tools): remove unsafe title fallback in getCustomTool (#2929)

* fix(custom-tools): remove unsafe title fallback in getCustomTool

* fix(custom-tools): restore title fallback in getCustomTool lookup

Custom tools are referenced by title (custom_${title}), not database ID.
The title fallback is required for client-side tool resolution to work.

* fix(null-bodies): empty bodies handling (#2931)

* fix(null-statuses): empty bodies handling

* address bugbot comment

* fix(token-refresh): microsoft, notion, x, linear (#2933)

* fix(microsoft): proactive refresh needed

* fix(x): missing token refresh flag

* notion and linear missing flag too

* address bugbot comment

* fix(auth): handle EMAIL_NOT_VERIFIED in onError callback (#2932)

* fix(auth): handle EMAIL_NOT_VERIFIED in onError callback

* refactor(auth): extract redirectToVerify helper to reduce duplication

* fix(workflow-selector): use dedicated selector for workflow dropdown (#2934)

* feat(workflow-block): preview (#2935)

* improvement(copilot): tool configs to show nested props (#2936)

* fix(auth): add genericOAuth providers to trustedProviders (#2937)

---------

Co-authored-by: Vikhyath Mondreti <vikhyathvikku@gmail.com>
Co-authored-by: Emir Karabeg <78010029+emir-karabeg@users.noreply.github.com>
2026-01-21 22:53:25 -08:00
Waleed
376f7cb571 fix(auth): add genericOAuth providers to trustedProviders (#2937) 2026-01-21 22:44:30 -08:00
Vikhyath Mondreti
42159c23b9 improvement(copilot): tool configs to show nested props (#2936) 2026-01-21 20:02:59 -08:00
Emir Karabeg
2f0f246002 feat(workflow-block): preview (#2935) 2026-01-21 19:12:28 -08:00
Waleed
900d3ef9ea fix(workflow-selector): use dedicated selector for workflow dropdown (#2934) 2026-01-21 18:38:03 -08:00
Waleed
f3fcc28f89 fix(auth): handle EMAIL_NOT_VERIFIED in onError callback (#2932)
* fix(auth): handle EMAIL_NOT_VERIFIED in onError callback

* refactor(auth): extract redirectToVerify helper to reduce duplication
2026-01-21 18:34:49 -08:00
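
The change above, together with the login form diff further down this page, describes the pattern: an errorHandled flag set inside the auth client's onError callback plus a shared redirectToVerify helper. A minimal sketch of that flow, with signInWithEmail and showError as simplified stand-ins for the real better-auth client call and the form's error state:

```typescript
// Sketch of the EMAIL_NOT_VERIFIED handling described above.
// signInWithEmail and showError are hypothetical stand-ins; only the control flow mirrors the diff.
type SignInResult = { error?: { code?: string; message?: string } | null }
type SignInCallbacks = { onError: (ctx: { error: { code?: string; message?: string } }) => void }

declare function signInWithEmail(
  creds: { email: string; password: string },
  callbacks: SignInCallbacks
): Promise<SignInResult>
declare function showError(message: string): void

export function makeLoginHandler(router: { push: (path: string) => void }) {
  // Shared helper extracted in the follow-up commit to avoid duplicating the redirect.
  const redirectToVerify = (emailToVerify: string) => {
    if (typeof window !== 'undefined') {
      sessionStorage.setItem('verificationEmail', emailToVerify)
    }
    router.push('/verify')
  }

  return async function submit(email: string, password: string) {
    let errorHandled = false

    const result = await signInWithEmail(
      { email, password },
      {
        onError: (ctx) => {
          if (ctx.error.code?.includes('EMAIL_NOT_VERIFIED')) {
            errorHandled = true
            redirectToVerify(email) // go straight to the verify screen, no error banner
            return
          }
          errorHandled = true
          showError('Invalid email or password')
        },
      }
    )

    // Only surface a generic failure if onError did not already handle it,
    // which avoids the duplicate error the commit message refers to.
    if ((!result || result.error) && !errorHandled) {
      showError(result?.error?.message ?? 'Login failed. Please try again.')
    }
  }
}
```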
Vikhyath Mondreti
7cfdf46724 fix(token-refresh): microsoft, notion, x, linear (#2933)
* fix(microsoft): proactive refresh needed

* fix(x): missing token refresh flag

* notion and linear missing flag too

* address bugbot comment
2026-01-21 18:30:53 -08:00
Vikhyath Mondreti
d681451297 fix(null-bodies): empty bodies handling (#2931)
* fix(null-statuses): empty bodies handling

* address bugbot comment
2026-01-21 18:10:33 -08:00
Waleed
5987a6d060 fix(custom-tools): remove unsafe title fallback in getCustomTool (#2929)
* fix(custom-tools): remove unsafe title fallback in getCustomTool

* fix(custom-tools): restore title fallback in getCustomTool lookup

Custom tools are referenced by title (custom_${title}), not database ID.
The title fallback is required for client-side tool resolution to work.
2026-01-21 17:36:10 -08:00
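
The note above is the key detail: custom tools are looked up by a `custom_${title}` reference, not by database ID. A minimal sketch of the kind of lookup this implies; the registry shape and helper name are illustrative assumptions, not the repository's actual getCustomTool:

```typescript
// Hypothetical shape of a client-side custom tool registry; the real data
// lives in the app's react-query cache per the commits above.
interface CustomTool {
  id: string    // database ID
  title: string // user-facing title; tool references use `custom_${title}`
}

// Resolve a reference such as "custom_My Tool" against the cached tools.
// The title fallback is what the follow-up commit restores: references carry
// the title, so an ID-only lookup would fail for client-side resolution.
function getCustomToolByReference(reference: string, tools: CustomTool[]): CustomTool | undefined {
  const key = reference.replace(/^custom_/, '')
  return tools.find((tool) => tool.id === key) ?? tools.find((tool) => tool.title === key)
}
```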
Waleed
e2ccefb2f4 improvement(ui): use BrandedButton and BrandedLink components (#2930)
- Refactor auth forms to use BrandedButton component
- Add BrandedLink component for changelog page
- Reduce code duplication in login, signup, reset-password forms
- Update star count default value
2026-01-21 17:25:30 -08:00
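
The auth form diffs later on this page replace a repeated checkCustomBrand effect with a useBrandedButtonClass() hook. A sketch of what that hook plausibly encapsulates, reconstructed from the logic removed from the forms; the real hook implementation is not shown in this compare:

```typescript
import { useEffect, useState } from 'react'

// Sketch of the shared hook implied by the refactor; mirrors the brand-detection
// effect removed from each auth form. The actual implementation may differ.
export function useBrandedButtonClass(): string {
  const [buttonClass, setButtonClass] = useState('branded-button-gradient')

  useEffect(() => {
    const checkCustomBrand = () => {
      // A custom brand accent (anything other than the default #6f3dfa)
      // switches the button to the non-gradient branded style.
      const computedStyle = getComputedStyle(document.documentElement)
      const brandAccent = computedStyle.getPropertyValue('--brand-accent-hex').trim()
      setButtonClass(
        brandAccent && brandAccent !== '#6f3dfa' ? 'branded-button-custom' : 'branded-button-gradient'
      )
    }

    checkCustomBrand()
    window.addEventListener('resize', checkCustomBrand)

    const observer = new MutationObserver(checkCustomBrand)
    observer.observe(document.documentElement, { attributes: true, attributeFilter: ['style', 'class'] })

    return () => {
      window.removeEventListener('resize', checkCustomBrand)
      observer.disconnect()
    }
  }, [])

  return buttonClass
}
```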
Waleed
103b31a569 fix(stores): remove dead code causing log spam on startup (#2927)
* fix(stores): remove dead code causing log spam on startup

* fix(stores): replace custom tools zustand store with react query cache
2026-01-21 16:08:26 -08:00
Waleed
004e058353 fix(messages-input): fix cursor alignment and auto-resize with overlay (#2926)
* fix(messages-input): fix cursor alignment and auto-resize with overlay

* fixed remaining zustand warnings
2026-01-21 15:30:13 -08:00
Vikhyath Mondreti
5157f0bbb2 fix(resolver): agent response format, input formats, root level (#2925)
* fix(resolvers): agent response format, input formats, root level

* fix response block initial seeding

* fix tests
2026-01-21 14:55:23 -08:00
Waleed
8bbcf31b83 fix(action-bar): duplicate subflows with children (#2923)
* fix(action-bar): duplicate subflows with children

* fix(action-bar): add validateTriggerPaste for subflow duplicate
2026-01-21 14:54:29 -08:00
Waleed
9e814315dd fix(auth): improve reset password flow and consolidate brand detection (#2924)
* fix(auth): improve reset password flow and consolidate brand detection

* fix(auth): set errorHandled for EMAIL_NOT_VERIFIED to prevent duplicate error

* fix(auth): clear success message on login errors

* chore(auth): fix import order per lint
2026-01-21 14:42:14 -08:00
Waleed
0ea0256623 chore(helm): add env vars for Vertex AI, orgs, and telemetry (#2922) 2026-01-21 11:36:16 -08:00
Waleed
fb8868c854 fix(notifications): text overflow with line-clamp (#2921) 2026-01-21 10:20:21 -08:00
Waleed
ea4964052d fix(logger): use direct env access for webpack inlining (#2920) 2026-01-21 10:14:40 -08:00
Waleed
268e2f114f fix(zustand): updated to useShallow from deprecated createWithEqualityFn (#2919) 2026-01-21 09:47:48 -08:00
Vikhyath Mondreti
45371e521e v0.5.66: external http requests fix, ring highlighting 2026-01-21 02:55:39 -08:00
Vikhyath Mondreti
5988d0e46f fix(ring): duplicate should clear original block (#2916)
* fix(ring): duplicate should clear original block

* rename correctly
2026-01-21 02:40:58 -08:00
Vikhyath Mondreti
145db9d8c3 fix(http): options not parsed accurately (#2914)
* fix(http): options not parsed accurately

* fix lint

* remove boilerplate code'
2026-01-21 01:36:29 -08:00
Waleed
0ce0f98aa5 v0.5.65: gemini updates, textract integration, ui updates (#2909)
* fix(google): wrap primitive tool responses for Gemini API compatibility (#2900)

* fix(canonical): copilot path + update parent (#2901)

* fix(rss): add top-level title, link, pubDate fields to RSS trigger output (#2902)

* fix(rss): add top-level title, link, pubDate fields to RSS trigger output

* fix(imap): add top-level fields to IMAP trigger output

* improvement(browseruse): add profile id param (#2903)

* improvement(browseruse): add profile id param

* make request a stub since we have directExec

* improvement(executor): upgraded abort controller to handle aborts for loops and parallels (#2880)

* improvement(executor): upgraded abort controller to handle aborts for loops and parallels

* comments

* improvement(files): update execution for passing base64 strings (#2906)

* progress

* improvement(execution): update execution for passing base64 strings

* fix types

* cleanup comments

* path security vuln

* reject promise correctly

* fix redirect case

* remove proxy routes

* fix tests

* use ipaddr

* feat(tools): added textract, added v2 for mistral, updated tag dropdown (#2904)

* feat(tools): added textract

* cleanup

* ack pr comments

* reorder

* removed upload for textract async version

* fix additional fields dropdown in editor, update parser to leave validation to be done on the server

* added mistral v2, files v2, and finalized textract

* updated the rest of the old file patterns, updated mistral outputs for v2

* updated tag dropdown to parse non-operation fields as well

* updated extension finder

* cleanup

* added description for inputs to workflow

* use helper for internal route check

* fix tag dropdown merge conflict change

* remove duplicate code

---------

Co-authored-by: Vikhyath Mondreti <vikhyath@simstudio.ai>

* fix(ui): change add inputs button to match output selector (#2907)

* fix(canvas): removed invite to workspace from canvas popover (#2908)

* fix(canvas): removed invite to workspace

* removed unused props

* fix(copilot): legacy tool display names (#2911)

* fix(a2a): canonical merge  (#2912)

* fix canonical merge

* fix empty array case

* fix(change-detection): copilot diffs have extra field (#2913)

* improvement(logs): improved logs ui bugs, added subflow disable UI (#2910)

* improvement(logs): improved logs ui bugs, added subflow disable UI

* added duplicate to action bar for subflows

* feat(broadcast): email v0.5 (#2905)

---------

Co-authored-by: Vikhyath Mondreti <vikhyathvikku@gmail.com>
Co-authored-by: Vikhyath Mondreti <vikhyath@simstudio.ai>
Co-authored-by: Emir Karabeg <78010029+emir-karabeg@users.noreply.github.com>
2026-01-20 23:54:55 -08:00
Emir Karabeg
294b168ed9 feat(broadcast): email v0.5 (#2905) 2026-01-20 23:42:48 -08:00
Waleed
0dc2c1fe0d improvement(logs): improved logs ui bugs, added subflow disable UI (#2910)
* improvement(logs): improved logs ui bugs, added subflow disable UI

* added duplicate to action bar for subflows
2026-01-20 23:13:05 -08:00
Vikhyath Mondreti
fb90c4e9b1 fix(change-detection): copilot diffs have extra field (#2913) 2026-01-20 22:04:08 -08:00
Vikhyath Mondreti
0af96d06c6 fix(a2a): canonical merge (#2912)
* fix canonical merge

* fix empty array case
2026-01-20 21:58:13 -08:00
Vikhyath Mondreti
1d450578c8 fix(copilot): legacy tool display names (#2911) 2026-01-20 21:16:48 -08:00
Waleed
c6d408c65b fix(canvas): removed invite to workspace from canvas popover (#2908)
* fix(canvas): removed invite to workspace

* removed unused props
2026-01-20 20:29:53 -08:00
Waleed
16716ea26a fix(ui): change add inputs button to match output selector (#2907) 2026-01-20 19:24:59 -08:00
Waleed
563098ca0a feat(tools): added textract, added v2 for mistral, updated tag dropdown (#2904)
* feat(tools): added textract

* cleanup

* ack pr comments

* reorder

* removed upload for textract async version

* fix additional fields dropdown in editor, update parser to leave validation to be done on the server

* added mistral v2, files v2, and finalized textract

* updated the rest of the old file patterns, updated mistral outputs for v2

* updated tag dropdown to parse non-operation fields as well

* updated extension finder

* cleanup

* added description for inputs to workflow

* use helper for internal route check

* fix tag dropdown merge conflict change

* remove duplicate code

---------

Co-authored-by: Vikhyath Mondreti <vikhyath@simstudio.ai>
2026-01-20 18:41:26 -08:00
Vikhyath Mondreti
1f1f015031 improvement(files): update execution for passing base64 strings (#2906)
* progress

* improvement(execution): update execution for passing base64 strings

* fix types

* cleanup comments

* path security vuln

* reject promise correctly

* fix redirect case

* remove proxy routes

* fix tests

* use ipaddr
2026-01-20 17:49:00 -08:00
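
The bullets above mention a path security vulnerability, redirect handling, and switching to ipaddr while files are passed as base64. A minimal sketch of the kind of private-address guard that combination suggests, using the ipaddr.js package; the helper name and the exact policy are assumptions rather than the repository's actual code:

```typescript
import { promises as dns } from 'node:dns'
import ipaddr from 'ipaddr.js'

// Ranges that should never be reachable from a server-side file fetch.
const BLOCKED_RANGES = new Set([
  'private',
  'loopback',
  'linkLocal',
  'uniqueLocal',
  'unspecified',
  'carrierGradeNat',
  'broadcast',
  'reserved',
])

// Hypothetical guard: resolve the URL's host and reject anything that maps to a
// non-public address before fetching file contents. Redirects would need the
// same check applied to each hop.
export async function assertPublicUrl(rawUrl: string): Promise<void> {
  const url = new URL(rawUrl)
  if (url.protocol !== 'http:' && url.protocol !== 'https:') {
    throw new Error(`Unsupported protocol: ${url.protocol}`)
  }

  const addresses = ipaddr.isValid(url.hostname)
    ? [url.hostname]
    : (await dns.lookup(url.hostname, { all: true })).map((entry) => entry.address)

  for (const address of addresses) {
    if (BLOCKED_RANGES.has(ipaddr.parse(address).range())) {
      throw new Error(`Refusing to fetch ${rawUrl}: resolves to a non-public address`)
    }
  }
}
```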
Waleed
dff1c9d083 v0.5.64: unsubscribe, search improvements, metrics, additional SSO configuration 2026-01-20 00:34:11 -08:00
Vikhyath Mondreti
b09f683072 v0.5.63: ui and performance improvements, more google tools 2026-01-18 15:22:42 -08:00
Vikhyath Mondreti
a8bb0db660 v0.5.62: webhook bug fixes, seeding default subblock values, block selection fixes 2026-01-16 20:27:06 -08:00
Waleed
af82820a28 v0.5.61: webhook improvements, workflow controls, react query for deployment status, chat fixes, reducto and pulse OCR, linear fixes 2026-01-16 18:06:23 -08:00
Waleed
4372841797 v0.5.60: invitation flow improvements, chat fixes, a2a improvements, additional copilot actions 2026-01-15 00:02:18 -08:00
Waleed
5e8c843241 v0.5.59: a2a support, documentation 2026-01-13 13:21:21 -08:00
Waleed
7bf3d73ee6 v0.5.58: export folders, new tools, permissions groups enhancements 2026-01-13 00:56:59 -08:00
Vikhyath Mondreti
7ffc11a738 v0.5.57: subagents, context menu improvements, bug fixes 2026-01-11 11:38:40 -08:00
Waleed
be578e2ed7 v0.5.56: batch operations, access control and permission groups, billing fixes 2026-01-10 00:31:34 -08:00
Waleed
f415e5edc4 v0.5.55: polling groups, bedrock provider, devcontainer fixes, workflow preview enhancements 2026-01-08 23:36:56 -08:00
Waleed
13a6e6c3fa v0.5.54: seo, model blacklist, helm chart updates, fireflies integration, autoconnect improvements, billing fixes 2026-01-07 16:09:45 -08:00
Waleed
f5ab7f21ae v0.5.53: hotkey improvements, added redis fallback, fixes for workflow tool 2026-01-06 23:34:52 -08:00
Waleed
bfb6fffe38 v0.5.52: new port-based router block, combobox expression and variable support 2026-01-06 16:14:10 -08:00
Waleed
4fbec0a43f v0.5.51: triggers, kb, condition block improvements, supabase and grain integration updates 2026-01-06 14:26:46 -08:00
Waleed
585f5e365b v0.5.50: import improvements, ui upgrades, kb styling and performance improvements 2026-01-05 00:35:55 -08:00
Waleed
3792bdd252 v0.5.49: hitl improvements, new email styles, imap trigger, logs context menu (#2672)
* feat(logs-context-menu): consolidated logs utils and types, added logs record context menu (#2659)

* feat(email): welcome email; improvement(emails): ui/ux (#2658)

* feat(email): welcome email; improvement(emails): ui/ux

* improvement(emails): links, accounts, preview

* refactor(emails): file structure and wrapper components

* added envvar for personal emails sent, added isHosted gate

* fixed failing tests, added env mock

* fix: removed comment

---------

Co-authored-by: waleed <walif6@gmail.com>

* fix(logging): hitl + trigger dev crash protection (#2664)

* hitl gaps

* deal with trigger worker crashes

* cleanup import strcuture

* feat(imap): added support for imap trigger (#2663)

* feat(tools): added support for imap trigger

* feat(imap): added parity, tested

* ack PR comments

* final cleanup

* feat(i18n): update translations (#2665)

Co-authored-by: waleedlatif1 <waleedlatif1@users.noreply.github.com>

* fix(grain): updated grain trigger to auto-establish trigger (#2666)

Co-authored-by: aadamgough <adam@sim.ai>

* feat(admin): routes to manage deployments (#2667)

* feat(admin): routes to manage deployments

* fix naming fo deployed by

* feat(time-picker): added timepicker emcn component, added to playground, added searchable prop for dropdown, added more timezones for schedule, updated license and notice date (#2668)

* feat(time-picker): added timepicker emcn component, added to playground, added searchable prop for dropdown, added more timezones for schedule, updated license and notice date

* removed unused params, cleaned up redundant utils

* improvement(invite): aligned styling (#2669)

* improvement(invite): aligned with rest of app

* fix(invite): error handling

* fix: addressed comments

---------

Co-authored-by: Emir Karabeg <78010029+emir-karabeg@users.noreply.github.com>
Co-authored-by: Vikhyath Mondreti <vikhyathvikku@gmail.com>
Co-authored-by: waleedlatif1 <waleedlatif1@users.noreply.github.com>
Co-authored-by: Adam Gough <77861281+aadamgough@users.noreply.github.com>
Co-authored-by: aadamgough <adam@sim.ai>
2026-01-03 13:19:18 -08:00
Waleed
eb5d1f3e5b v0.5.48: copy-paste workflow blocks, docs updates, mcp tool fixes 2025-12-31 18:00:04 -08:00
Waleed
54ab82c8dd v0.5.47: deploy workflow as mcp, kb chunks tokenizer, UI improvements, jira service management tools 2025-12-30 23:18:58 -08:00
Waleed
f895bf469b v0.5.46: build improvements, greptile, light mode improvements 2025-12-29 02:17:52 -08:00
Waleed
dd3209af06 v0.5.45: light mode fixes, realtime usage indicator, docker build improvements 2025-12-27 19:57:42 -08:00
Waleed
b6ba3b50a7 v0.5.44: keyboard shortcuts, autolayout, light mode, byok, testing improvements 2025-12-26 21:25:19 -08:00
Waleed
b304233062 v0.5.43: export logs, circleback, grain, vertex, code hygiene, schedule improvements 2025-12-23 19:19:18 -08:00
Vikhyath Mondreti
57e4b49bd6 v0.5.42: fix memory migration 2025-12-23 01:24:54 -08:00
Vikhyath Mondreti
e12dd204ed v0.5.41: memory fixes, copilot improvements, knowledgebase improvements, LLM providers standardization 2025-12-23 00:15:18 -08:00
Vikhyath Mondreti
3d9d9cbc54 v0.5.40: supabase ops to allow non-public schemas, jira uuid 2025-12-21 22:28:05 -08:00
Waleed
0f4ec962ad v0.5.39: notion, workflow variables fixes 2025-12-20 20:44:00 -08:00
Waleed
4827866f9a v0.5.38: snap to grid, copilot ux improvements, billing line items 2025-12-20 17:24:38 -08:00
Waleed
3e697d9ed9 v0.5.37: redaction utils consolidation, logs updates, autoconnect improvements, additional kb tag types 2025-12-19 22:31:55 -08:00
Martin Yankov
4431a1a484 fix(helm): add custom egress rules to realtime network policy (#2481)
The realtime service network policy was missing the custom egress rules section
that allows configuration of additional egress rules via values.yaml. This caused
the realtime pods to be unable to connect to external databases (e.g., PostgreSQL
on port 5432) when using external database configurations.

The app network policy already had this section, but the realtime network policy
was missing it, creating an inconsistency and preventing the realtime service
from accessing external databases configured via networkPolicy.egress values.

This fix adds the same custom egress rules template section to the realtime
network policy, matching the app network policy behavior and allowing users to
configure database connectivity via values.yaml.
2025-12-19 18:59:08 -08:00
Waleed
4d1a9a3f22 v0.5.36: hitl improvements, opengraph, slack fixes, one-click unsubscribe, auth checks, new db indexes 2025-12-19 01:27:49 -08:00
Vikhyath Mondreti
eb07a080fb v0.5.35: helm updates, copilot improvements, 404 for docs, salesforce fixes, subflow resize clamping 2025-12-18 16:23:19 -08:00
239 changed files with 6436 additions and 2813 deletions

View File

@@ -14,7 +14,7 @@
</p> </p>
<p align="center"> <p align="center">
<a href="https://deepwiki.com/simstudioai/sim" target="_blank" rel="noopener noreferrer"><img src="https://deepwiki.com/badge.svg" alt="Ask DeepWiki"></a> <a href="https://cursor.com/link/prompt?text=Help%20me%20set%20up%20Sim%20Studio%20locally.%20Follow%20these%20steps%3A%0A%0A1.%20First%2C%20verify%20Docker%20is%20installed%20and%20running%3A%0A%20%20%20docker%20--version%0A%20%20%20docker%20info%0A%0A2.%20Clone%20the%20repository%3A%0A%20%20%20git%20clone%20https%3A%2F%2Fgithub.com%2Fsimstudioai%2Fsim.git%0A%20%20%20cd%20sim%0A%0A3.%20Start%20the%20services%20with%20Docker%20Compose%3A%0A%20%20%20docker%20compose%20-f%20docker-compose.prod.yml%20up%20-d%0A%0A4.%20Wait%20for%20all%20containers%20to%20be%20healthy%20(this%20may%20take%201-2%20minutes)%3A%0A%20%20%20docker%20compose%20-f%20docker-compose.prod.yml%20ps%0A%0A5.%20Verify%20the%20app%20is%20accessible%20at%20http%3A%2F%2Flocalhost%3A3000%0A%0AIf%20there%20are%20any%20errors%2C%20help%20me%20troubleshoot%20them.%20Common%20issues%3A%0A-%20Port%203000%2C%203002%2C%20or%205432%20already%20in%20use%0A-%20Docker%20not%20running%0A-%20Insufficient%20memory%20(needs%2012GB%2B%20RAM)%0A%0AFor%20local%20AI%20models%20with%20Ollama%2C%20use%20this%20instead%20of%20step%203%3A%0A%20%20%20docker%20compose%20-f%20docker-compose.ollama.yml%20--profile%20setup%20up%20-d"><img src="https://img.shields.io/badge/Set%20Up%20with-Cursor-000000?logo=cursor&logoColor=white" alt="Set Up with Cursor"></a> <a href="https://deepwiki.com/simstudioai/sim" target="_blank" rel="noopener noreferrer"><img src="https://deepwiki.com/badge.svg" alt="Ask DeepWiki"></a> <a href="https://cursor.com/link/prompt?text=Help%20me%20set%20up%20Sim%20locally.%20Follow%20these%20steps%3A%0A%0A1.%20First%2C%20verify%20Docker%20is%20installed%20and%20running%3A%0A%20%20%20docker%20--version%0A%20%20%20docker%20info%0A%0A2.%20Clone%20the%20repository%3A%0A%20%20%20git%20clone%20https%3A%2F%2Fgithub.com%2Fsimstudioai%2Fsim.git%0A%20%20%20cd%20sim%0A%0A3.%20Start%20the%20services%20with%20Docker%20Compose%3A%0A%20%20%20docker%20compose%20-f%20docker-compose.prod.yml%20up%20-d%0A%0A4.%20Wait%20for%20all%20containers%20to%20be%20healthy%20(this%20may%20take%201-2%20minutes)%3A%0A%20%20%20docker%20compose%20-f%20docker-compose.prod.yml%20ps%0A%0A5.%20Verify%20the%20app%20is%20accessible%20at%20http%3A%2F%2Flocalhost%3A3000%0A%0AIf%20there%20are%20any%20errors%2C%20help%20me%20troubleshoot%20them.%20Common%20issues%3A%0A-%20Port%203000%2C%203002%2C%20or%205432%20already%20in%20use%0A-%20Docker%20not%20running%0A-%20Insufficient%20memory%20(needs%2012GB%2B%20RAM)%0A%0AFor%20local%20AI%20models%20with%20Ollama%2C%20use%20this%20instead%20of%20step%203%3A%0A%20%20%20docker%20compose%20-f%20docker-compose.ollama.yml%20--profile%20setup%20up%20-d"><img src="https://img.shields.io/badge/Set%20Up%20with-Cursor-000000?logo=cursor&logoColor=white" alt="Set Up with Cursor"></a>
</p> </p>
### Build Workflows with Ease ### Build Workflows with Ease

View File

@@ -4093,6 +4093,23 @@ export function SQSIcon(props: SVGProps<SVGSVGElement>) {
)
}
export function TextractIcon(props: SVGProps<SVGSVGElement>) {
return (
<svg
{...props}
viewBox='10 14 60 52'
version='1.1'
xmlns='http://www.w3.org/2000/svg'
xmlnsXlink='http://www.w3.org/1999/xlink'
>
<path
d='M22.0624102,50 C24.3763895,53.603 28.4103535,56 33.0003125,56 C40.1672485,56 45.9991964,50.168 45.9991964,43 C45.9991964,35.832 40.1672485,30 33.0003125,30 C27.6033607,30 22.9664021,33.307 21.0024196,38 L23.2143999,38 C25.0393836,34.444 28.7363506,32 33.0003125,32 C39.0652583,32 43.9992143,36.935 43.9992143,43 C43.9992143,49.065 39.0652583,54 33.0003125,54 C29.5913429,54 26.5413702,52.441 24.5213882,50 L22.0624102,50 Z M37.0002768,45 L37.0002768,43 L41.9992321,43 C41.9992321,38.038 37.9622682,34 33.0003125,34 C28.0373568,34 23.9993929,38.038 23.9993929,43 L28.9993482,43 L28.9993482,45 L24.2313908,45 C25.1443826,49.002 28.7253507,52 33.0003125,52 C35.1362934,52 37.0992759,51.249 38.6442621,50 L34.0003036,50 L34.0003036,48 L40.4782457,48 C41.0812403,47.102 41.5202364,46.087 41.7682342,45 L37.0002768,45 Z M21.0024196,48 L23.2143999,48 C22.4434068,46.498 22.0004107,44.801 22.0004107,43 C22.0004107,41.959 22.1554093,40.955 22.4264069,40 L20.3634253,40 C20.1344274,40.965 19.9994286,41.966 19.9994286,43 C19.9994286,44.771 20.3584254,46.46 21.0024196,48 L21.0024196,48 Z M19.7434309,50 L17.0004554,50 L17.0004554,48 L18.8744386,48 C18.5344417,47.04 18.2894438,46.038 18.1494451,45 L15.4144695,45 L16.707458,46.293 L15.2924706,47.707 L12.2924974,44.707 C11.9025009,44.316 11.9025009,43.684 12.2924974,43.293 L15.2924706,40.293 L16.707458,41.707 L15.4144695,43 L18.0004464,43 C18.0004464,41.973 18.1044455,40.97 18.3024437,40 L17.0004554,40 L17.0004554,38 L18.8744386,38 C20.9404202,32.184 26.4833707,28 33.0003125,28 C37.427273,28 41.4002375,29.939 44.148213,33 L59.0000804,33 L59.0000804,35 L45.6661994,35 C47.1351863,37.318 47.9991786,40.058 47.9991786,43 L59.0000804,43 L59.0000804,45 L47.8501799,45 C46.8681887,52.327 40.5912447,58 33.0003125,58 C27.2563638,58 22.2624084,54.752 19.7434309,50 L19.7434309,50 Z M37.0002768,39 C37.0002768,38.448 36.5522808,38 36.0002857,38 L29.9993482,38 C29.4473442,38 28.9993482,38.448 28.9993482,39 L28.9993482,41 L31.0003304,41 L31.0003304,40 L32.0003214,40 L32.0003214,43 L31.0003304,43 L31.0003304,45 L35.0002946,45 L35.0002946,43 L34.0003036,43 L34.0003036,40 L35.0002946,40 L35.0002946,41 L37.0002768,41 L37.0002768,39 Z M49.0001696,40 L59.0000804,40 L59.0000804,38 L49.0001696,38 L49.0001696,40 Z M49.0001696,50 L59.0000804,50 L59.0000804,48 L49.0001696,48 L49.0001696,50 Z M57.0000982,27 L60.5850662,27 L57.0000982,23.414 L57.0000982,27 Z M63.7070383,27.293 C63.8940367,27.48 64.0000357,27.735 64.0000357,28 L64.0000357,63 C64.0000357,63.552 63.5520397,64 63.0000446,64 L32.0003304,64 C31.4473264,64 31.0003304,63.552 31.0003304,63 L31.0003304,59 L33.0003125,59 L33.0003125,62 L62.0000536,62 L62.0000536,29 L56.0001071,29 C55.4471121,29 55.0001161,28.552 55.0001161,28 L55.0001161,22 L33.0003125,22 L33.0003125,27 L31.0003304,27 L31.0003304,21 C31.0003304,20.448 31.4473264,20 32.0003304,20 L56.0001071,20 C56.2651048,20 56.5191025,20.105 56.7071008,20.293 L63.7070383,27.293 Z M68,24.166 L68,61 C68,61.552 67.552004,62 67.0000089,62 L65.0000268,62 L65.0000268,60 L66.0000179,60 L66.0000179,24.612 L58.6170838,18 L36.0002857,18 L36.0002857,19 L34.0003036,19 L34.0003036,17 C34.0003036,16.448 34.4472996,16 35.0003036,16 L59.0000804,16 C59.2460782,16 59.483076,16.091 59.6660744,16.255 L67.666003,23.42 C67.8780011,23.61 68,23.881 68,24.166 L68,24.166 Z'
fill='currentColor'
/>
</svg>
)
}
export function McpIcon(props: SVGProps<SVGSVGElement>) {
return (
<svg

View File

@@ -110,6 +110,7 @@ import {
   SupabaseIcon,
   TavilyIcon,
   TelegramIcon,
+  TextractIcon,
   TinybirdIcon,
   TranslateIcon,
   TrelloIcon,
@@ -143,7 +144,7 @@ export const blockTypeToIconMap: Record<string, IconComponent> = {
   calendly: CalendlyIcon,
   circleback: CirclebackIcon,
   clay: ClayIcon,
-  confluence: ConfluenceIcon,
+  confluence_v2: ConfluenceIcon,
   cursor_v2: CursorIcon,
   datadog: DatadogIcon,
   discord: DiscordIcon,
@@ -153,7 +154,7 @@ export const blockTypeToIconMap: Record<string, IconComponent> = {
   elasticsearch: ElasticsearchIcon,
   elevenlabs: ElevenLabsIcon,
   exa: ExaAIIcon,
-  file: DocumentIcon,
+  file_v2: DocumentIcon,
   firecrawl: FirecrawlIcon,
   fireflies: FirefliesIcon,
   github_v2: GithubIcon,
@@ -195,7 +196,7 @@ export const blockTypeToIconMap: Record<string, IconComponent> = {
   microsoft_excel_v2: MicrosoftExcelIcon,
   microsoft_planner: MicrosoftPlannerIcon,
   microsoft_teams: MicrosoftTeamsIcon,
-  mistral_parse: MistralIcon,
+  mistral_parse_v2: MistralIcon,
   mongodb: MongoDBIcon,
   mysql: MySQLIcon,
   neo4j: Neo4jIcon,
@@ -237,6 +238,7 @@ export const blockTypeToIconMap: Record<string, IconComponent> = {
   supabase: SupabaseIcon,
   tavily: TavilyIcon,
   telegram: TelegramIcon,
+  textract: TextractIcon,
   tinybird: TinybirdIcon,
   translate: TranslateIcon,
   trello: TrelloIcon,
@@ -244,7 +246,7 @@ export const blockTypeToIconMap: Record<string, IconComponent> = {
   twilio_sms: TwilioIcon,
   twilio_voice: TwilioIcon,
   typeform: TypeformIcon,
-  video_generator: VideoIcon,
+  video_generator_v2: VideoIcon,
   vision: EyeIcon,
   wealthbox: WealthboxIcon,
   webflow: WebflowIcon,

View File

@@ -6,7 +6,7 @@ description: Interact with Confluence
 import { BlockInfoCard } from "@/components/ui/block-info-card"
 <BlockInfoCard
-  type="confluence"
+  type="confluence_v2"
   color="#E0E0E0"
 />

View File

@@ -6,7 +6,7 @@ description: Read and parse multiple files
 import { BlockInfoCard } from "@/components/ui/block-info-card"
 <BlockInfoCard
-  type="file"
+  type="file_v2"
   color="#40916C"
 />
@@ -48,7 +48,7 @@ Parse one or more uploaded files or files from URLs (text, PDF, CSV, images, etc
 | Parameter | Type | Description |
 | --------- | ---- | ----------- |
-| `files` | array | Array of parsed files |
+| `files` | array | Array of parsed files with content, metadata, and file properties |
-| `combinedContent` | string | Combined content of all parsed files |
+| `combinedContent` | string | All file contents merged into a single text string |

View File

@@ -106,6 +106,7 @@
   "supabase",
   "tavily",
   "telegram",
+  "textract",
   "tinybird",
   "translate",
   "trello",

View File

@@ -6,7 +6,7 @@ description: Extract text from PDF documents
 import { BlockInfoCard } from "@/components/ui/block-info-card"
 <BlockInfoCard
-  type="mistral_parse"
+  type="mistral_parse_v2"
   color="#000000"
 />
@@ -54,18 +54,37 @@ Parse PDF documents using Mistral OCR API
 | Parameter | Type | Description |
 | --------- | ---- | ----------- |
-| `success` | boolean | Whether the PDF was parsed successfully |
-| `content` | string | Extracted content in the requested format \(markdown, text, or JSON\) |
-| `metadata` | object | Processing metadata including jobId, fileType, pageCount, and usage info |
-| ↳ `jobId` | string | Unique job identifier |
-| ↳ `fileType` | string | File type \(e.g., pdf\) |
-| ↳ `fileName` | string | Original file name |
-| ↳ `source` | string | Source type \(url\) |
-| ↳ `pageCount` | number | Number of pages processed |
-| ↳ `model` | string | Mistral model used |
-| ↳ `resultType` | string | Output format \(markdown, text, json\) |
-| ↳ `processedAt` | string | Processing timestamp |
-| ↳ `sourceUrl` | string | Source URL if applicable |
-| ↳ `usageInfo` | object | Usage statistics from OCR processing |
+| `pages` | array | Array of page objects from Mistral OCR |
+| ↳ `index` | number | Page index \(zero-based\) |
+| ↳ `markdown` | string | Extracted markdown content |
+| ↳ `images` | array | Images extracted from this page with bounding boxes |
+| ↳ `id` | string | Image identifier \(e.g., img-0.jpeg\) |
+| ↳ `top_left_x` | number | Top-left X coordinate in pixels |
+| ↳ `top_left_y` | number | Top-left Y coordinate in pixels |
+| ↳ `bottom_right_x` | number | Bottom-right X coordinate in pixels |
+| ↳ `bottom_right_y` | number | Bottom-right Y coordinate in pixels |
+| ↳ `image_base64` | string | Base64-encoded image data \(when include_image_base64=true\) |
+| ↳ `id` | string | Image identifier \(e.g., img-0.jpeg\) |
+| ↳ `top_left_x` | number | Top-left X coordinate in pixels |
+| ↳ `top_left_y` | number | Top-left Y coordinate in pixels |
+| ↳ `bottom_right_x` | number | Bottom-right X coordinate in pixels |
+| ↳ `bottom_right_y` | number | Bottom-right Y coordinate in pixels |
+| ↳ `image_base64` | string | Base64-encoded image data \(when include_image_base64=true\) |
+| ↳ `dimensions` | object | Page dimensions |
+| ↳ `dpi` | number | Dots per inch |
+| ↳ `height` | number | Page height in pixels |
+| ↳ `width` | number | Page width in pixels |
+| ↳ `dpi` | number | Dots per inch |
+| ↳ `height` | number | Page height in pixels |
+| ↳ `width` | number | Page width in pixels |
+| ↳ `tables` | array | Extracted tables as HTML/markdown \(when table_format is set\). Referenced via placeholders like \[tbl-0.html\] |
+| ↳ `hyperlinks` | array | Array of URL strings detected in the page \(e.g., \[ |
+| ↳ `header` | string | Page header content \(when extract_header=true\) |
+| ↳ `footer` | string | Page footer content \(when extract_footer=true\) |
+| `model` | string | Mistral OCR model identifier \(e.g., mistral-ocr-latest\) |
+| `usage_info` | object | Usage and processing statistics |
+| ↳ `pages_processed` | number | Total number of pages processed |
+| ↳ `doc_size_bytes` | number | Document file size in bytes |
+| `document_annotation` | string | Structured annotation data as JSON string \(when applicable\) |
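
Because the new output is a list of per-page objects rather than a single content string, downstream consumers need to reassemble the document themselves. A small sketch of working with the shape documented above; the interface is transcribed from this table and trimmed to the fields used:

```typescript
// Subset of the Mistral OCR v2 output documented above.
interface OcrPage {
  index: number
  markdown: string
  images: { id: string; image_base64?: string }[]
}

interface MistralParseOutput {
  pages: OcrPage[]
  model: string
  usage_info: { pages_processed: number; doc_size_bytes: number }
}

// Rebuild a single markdown document, roughly what the old `content` field held.
function combineMarkdown(output: MistralParseOutput): string {
  return output.pages
    .slice()
    .sort((a, b) => a.index - b.index)
    .map((page) => page.markdown)
    .join('\n\n')
}
```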

View File

@@ -58,6 +58,7 @@ Upload a file to an AWS S3 bucket
 | Parameter | Type | Description |
 | --------- | ---- | ----------- |
 | `url` | string | URL of the uploaded S3 object |
+| `uri` | string | S3 URI of the uploaded object \(s3://bucket/key\) |
 | `metadata` | object | Upload metadata including ETag and location |
 ### `s3_get_object`
@@ -149,6 +150,7 @@ Copy an object within or between AWS S3 buckets
 | Parameter | Type | Description |
 | --------- | ---- | ----------- |
 | `url` | string | URL of the copied S3 object |
+| `uri` | string | S3 URI of the copied object \(s3://bucket/key\) |
 | `metadata` | object | Copy operation metadata |

View File

@@ -0,0 +1,120 @@
---
title: AWS Textract
description: Extract text, tables, and forms from documents
---
import { BlockInfoCard } from "@/components/ui/block-info-card"
<BlockInfoCard
type="textract"
color="linear-gradient(135deg, #055F4E 0%, #56C0A7 100%)"
/>
{/* MANUAL-CONTENT-START:intro */}
[AWS Textract](https://aws.amazon.com/textract/) is a powerful AI service from Amazon Web Services designed to automatically extract printed text, handwriting, tables, forms, key-value pairs, and other structured data from scanned documents and images. Textract leverages advanced optical character recognition (OCR) and document analysis to transform documents into actionable data, enabling automation, analytics, compliance, and more.
With AWS Textract, you can:
- **Extract text from images and documents**: Recognize printed text and handwriting in formats such as PDF, JPEG, PNG, or TIFF
- **Detect and extract tables**: Automatically find tables and output their structured content
- **Parse forms and key-value pairs**: Pull structured data from forms, including fields and their corresponding values
- **Identify signatures and layout features**: Detect signatures, geometric layout, and relationships between document elements
- **Customize extraction with queries**: Extract specific fields and answers using query-based extraction (e.g., "What is the invoice number?")
In Sim, the AWS Textract integration empowers your agents to intelligently process documents as part of their workflows. This unlocks automation scenarios such as data entry from invoices, onboarding documents, contracts, receipts, and more. Your agents can extract relevant data, analyze structured forms, and generate summaries or reports directly from document uploads or URLs. By connecting Sim with AWS Textract, you can reduce manual effort, improve data accuracy, and streamline your business processes with robust document understanding.
{/* MANUAL-CONTENT-END */}
## Usage Instructions
Integrate AWS Textract into your workflow to extract text, tables, forms, and key-value pairs from documents. Single-page mode supports JPEG, PNG, and single-page PDF. Multi-page mode supports multi-page PDF and TIFF.
## Tools
### `textract_parser`
Parse documents using AWS Textract OCR and document analysis
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `accessKeyId` | string | Yes | AWS Access Key ID |
| `secretAccessKey` | string | Yes | AWS Secret Access Key |
| `region` | string | Yes | AWS region for Textract service \(e.g., us-east-1\) |
| `processingMode` | string | No | Document type: single-page or multi-page. Defaults to single-page. |
| `filePath` | string | No | URL to a document to be processed \(JPEG, PNG, or single-page PDF\). |
| `s3Uri` | string | No | S3 URI for multi-page processing \(s3://bucket/key\). |
| `fileUpload` | object | No | File upload data from file-upload component |
| `featureTypes` | array | No | Feature types to detect: TABLES, FORMS, QUERIES, SIGNATURES, LAYOUT. If not specified, only text detection is performed. |
| `items` | string | No | Feature type |
| `queries` | array | No | Custom queries to extract specific information. Only used when featureTypes includes QUERIES. |
| `items` | object | No | Query configuration |
| `properties` | string | No | The query text |
| `Text` | string | No | No description |
| `Alias` | string | No | No description |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `blocks` | array | Array of Block objects containing detected text, tables, forms, and other elements |
| ↳ `BlockType` | string | Type of block \(PAGE, LINE, WORD, TABLE, CELL, KEY_VALUE_SET, etc.\) |
| ↳ `Id` | string | Unique identifier for the block |
| ↳ `Text` | string | Query text |
| ↳ `TextType` | string | Type of text \(PRINTED or HANDWRITING\) |
| ↳ `Confidence` | number | Confidence score \(0-100\) |
| ↳ `Page` | number | Page number |
| ↳ `Geometry` | object | Location and bounding box information |
| ↳ `BoundingBox` | object | Height as ratio of document height |
| ↳ `Height` | number | Height as ratio of document height |
| ↳ `Left` | number | Left position as ratio of document width |
| ↳ `Top` | number | Top position as ratio of document height |
| ↳ `Width` | number | Width as ratio of document width |
| ↳ `Height` | number | Height as ratio of document height |
| ↳ `Left` | number | Left position as ratio of document width |
| ↳ `Top` | number | Top position as ratio of document height |
| ↳ `Width` | number | Width as ratio of document width |
| ↳ `Polygon` | array | Polygon coordinates |
| ↳ `X` | number | X coordinate |
| ↳ `Y` | number | Y coordinate |
| ↳ `X` | number | X coordinate |
| ↳ `Y` | number | Y coordinate |
| ↳ `BoundingBox` | object | Height as ratio of document height |
| ↳ `Height` | number | Height as ratio of document height |
| ↳ `Left` | number | Left position as ratio of document width |
| ↳ `Top` | number | Top position as ratio of document height |
| ↳ `Width` | number | Width as ratio of document width |
| ↳ `Height` | number | Height as ratio of document height |
| ↳ `Left` | number | Left position as ratio of document width |
| ↳ `Top` | number | Top position as ratio of document height |
| ↳ `Width` | number | Width as ratio of document width |
| ↳ `Polygon` | array | Polygon coordinates |
| ↳ `X` | number | X coordinate |
| ↳ `Y` | number | Y coordinate |
| ↳ `X` | number | X coordinate |
| ↳ `Y` | number | Y coordinate |
| ↳ `Relationships` | array | Relationships to other blocks |
| ↳ `Type` | string | Relationship type \(CHILD, VALUE, ANSWER, etc.\) |
| ↳ `Ids` | array | IDs of related blocks |
| ↳ `Type` | string | Relationship type \(CHILD, VALUE, ANSWER, etc.\) |
| ↳ `Ids` | array | IDs of related blocks |
| ↳ `EntityTypes` | array | Entity types for KEY_VALUE_SET \(KEY or VALUE\) |
| ↳ `SelectionStatus` | string | For checkboxes: SELECTED or NOT_SELECTED |
| ↳ `RowIndex` | number | Row index for table cells |
| ↳ `ColumnIndex` | number | Column index for table cells |
| ↳ `RowSpan` | number | Row span for merged cells |
| ↳ `ColumnSpan` | number | Column span for merged cells |
| ↳ `Query` | object | Query information for QUERY blocks |
| ↳ `Text` | string | Query text |
| ↳ `Alias` | string | Query alias |
| ↳ `Pages` | array | Pages to search |
| ↳ `Alias` | string | Query alias |
| ↳ `Pages` | array | Pages to search |
| `documentMetadata` | object | Metadata about the analyzed document |
| ↳ `pages` | number | Number of pages in the document |
| `modelVersion` | string | Version of the Textract model used for processing |
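
The blocks array above is the raw Textract block graph, so answers from the QUERIES feature have to be joined up through relationships. A sketch of pulling query answers out of that shape; the types are trimmed from the table above, and the assumption that QUERY blocks point to their answers through an ANSWER relationship follows the standard Textract response format:

```typescript
// Trimmed view of the Textract Block objects documented above.
interface TextractBlock {
  BlockType: string // PAGE, LINE, WORD, QUERY, QUERY_RESULT, KEY_VALUE_SET, ...
  Id: string
  Text?: string
  Confidence?: number
  Query?: { Text: string; Alias?: string }
  Relationships?: { Type: string; Ids: string[] }[]
}

// Map each query (by alias, falling back to the query text) to its best answer.
function extractQueryAnswers(blocks: TextractBlock[]): Record<string, string> {
  const byId = new Map<string, TextractBlock>(blocks.map((block) => [block.Id, block]))
  const answers: Record<string, string> = {}

  for (const block of blocks) {
    if (block.BlockType !== 'QUERY' || !block.Query) continue
    const answerIds = block.Relationships?.find((rel) => rel.Type === 'ANSWER')?.Ids ?? []
    const answerText = answerIds
      .map((id) => byId.get(id)?.Text)
      .filter((text): text is string => Boolean(text))
      .join(' ')
    answers[block.Query.Alias ?? block.Query.Text] = answerText
  }

  return answers
}
```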

View File

@@ -6,7 +6,7 @@ description: Generate videos from text using AI
 import { BlockInfoCard } from "@/components/ui/block-info-card"
 <BlockInfoCard
-  type="video_generator"
+  type="video_generator_v2"
   color="#181C1E"
 />
View File

@@ -2,10 +2,9 @@
 import { useEffect, useState } from 'react'
 import { createLogger } from '@sim/logger'
-import { ArrowRight, ChevronRight, Eye, EyeOff } from 'lucide-react'
+import { Eye, EyeOff } from 'lucide-react'
 import Link from 'next/link'
 import { useRouter, useSearchParams } from 'next/navigation'
-import { Button } from '@/components/ui/button'
 import {
   Dialog,
   DialogContent,
@@ -22,8 +21,10 @@ import { getBaseUrl } from '@/lib/core/utils/urls'
 import { quickValidateEmail } from '@/lib/messaging/email/validation'
 import { inter } from '@/app/_styles/fonts/inter/inter'
 import { soehne } from '@/app/_styles/fonts/soehne/soehne'
+import { BrandedButton } from '@/app/(auth)/components/branded-button'
 import { SocialLoginButtons } from '@/app/(auth)/components/social-login-buttons'
 import { SSOLoginButton } from '@/app/(auth)/components/sso-login-button'
+import { useBrandedButtonClass } from '@/hooks/use-branded-button-class'
 const logger = createLogger('LoginForm')
@@ -105,8 +106,7 @@ export default function LoginPage({
   const [password, setPassword] = useState('')
   const [passwordErrors, setPasswordErrors] = useState<string[]>([])
   const [showValidationError, setShowValidationError] = useState(false)
-  const [buttonClass, setButtonClass] = useState('branded-button-gradient')
-  const [isButtonHovered, setIsButtonHovered] = useState(false)
+  const buttonClass = useBrandedButtonClass()
   const [callbackUrl, setCallbackUrl] = useState('/workspace')
   const [isInviteFlow, setIsInviteFlow] = useState(false)
@@ -114,7 +114,6 @@ export default function LoginPage({
const [forgotPasswordOpen, setForgotPasswordOpen] = useState(false) const [forgotPasswordOpen, setForgotPasswordOpen] = useState(false)
const [forgotPasswordEmail, setForgotPasswordEmail] = useState('') const [forgotPasswordEmail, setForgotPasswordEmail] = useState('')
const [isSubmittingReset, setIsSubmittingReset] = useState(false) const [isSubmittingReset, setIsSubmittingReset] = useState(false)
const [isResetButtonHovered, setIsResetButtonHovered] = useState(false)
const [resetStatus, setResetStatus] = useState<{ const [resetStatus, setResetStatus] = useState<{
type: 'success' | 'error' | null type: 'success' | 'error' | null
message: string message: string
@@ -123,6 +122,7 @@ export default function LoginPage({
const [email, setEmail] = useState('') const [email, setEmail] = useState('')
const [emailErrors, setEmailErrors] = useState<string[]>([]) const [emailErrors, setEmailErrors] = useState<string[]>([])
const [showEmailValidationError, setShowEmailValidationError] = useState(false) const [showEmailValidationError, setShowEmailValidationError] = useState(false)
const [resetSuccessMessage, setResetSuccessMessage] = useState<string | null>(null)
useEffect(() => { useEffect(() => {
setMounted(true) setMounted(true)
@@ -139,32 +139,12 @@ export default function LoginPage({
const inviteFlow = searchParams.get('invite_flow') === 'true' const inviteFlow = searchParams.get('invite_flow') === 'true'
setIsInviteFlow(inviteFlow) setIsInviteFlow(inviteFlow)
}
const checkCustomBrand = () => { const resetSuccess = searchParams.get('resetSuccess') === 'true'
const computedStyle = getComputedStyle(document.documentElement) if (resetSuccess) {
const brandAccent = computedStyle.getPropertyValue('--brand-accent-hex').trim() setResetSuccessMessage('Password reset successful. Please sign in with your new password.')
if (brandAccent && brandAccent !== '#6f3dfa') {
setButtonClass('branded-button-custom')
} else {
setButtonClass('branded-button-gradient')
} }
} }
checkCustomBrand()
window.addEventListener('resize', checkCustomBrand)
const observer = new MutationObserver(checkCustomBrand)
observer.observe(document.documentElement, {
attributes: true,
attributeFilter: ['style', 'class'],
})
return () => {
window.removeEventListener('resize', checkCustomBrand)
observer.disconnect()
}
}, [searchParams]) }, [searchParams])
useEffect(() => { useEffect(() => {
@@ -202,6 +182,13 @@ export default function LoginPage({
e.preventDefault() e.preventDefault()
setIsLoading(true) setIsLoading(true)
const redirectToVerify = (emailToVerify: string) => {
if (typeof window !== 'undefined') {
sessionStorage.setItem('verificationEmail', emailToVerify)
}
router.push('/verify')
}
const formData = new FormData(e.currentTarget) const formData = new FormData(e.currentTarget)
const emailRaw = formData.get('email') as string const emailRaw = formData.get('email') as string
const email = emailRaw.trim().toLowerCase() const email = emailRaw.trim().toLowerCase()
@@ -221,6 +208,7 @@ export default function LoginPage({
try { try {
const safeCallbackUrl = validateCallbackUrl(callbackUrl) ? callbackUrl : '/workspace' const safeCallbackUrl = validateCallbackUrl(callbackUrl) ? callbackUrl : '/workspace'
let errorHandled = false
const result = await client.signIn.email( const result = await client.signIn.email(
{ {
@@ -231,11 +219,16 @@ export default function LoginPage({
{ {
onError: (ctx) => { onError: (ctx) => {
logger.error('Login error:', ctx.error) logger.error('Login error:', ctx.error)
const errorMessage: string[] = ['Invalid email or password']
if (ctx.error.code?.includes('EMAIL_NOT_VERIFIED')) { if (ctx.error.code?.includes('EMAIL_NOT_VERIFIED')) {
errorHandled = true
redirectToVerify(email)
return return
} }
errorHandled = true
const errorMessage: string[] = ['Invalid email or password']
if ( if (
ctx.error.code?.includes('BAD_REQUEST') || ctx.error.code?.includes('BAD_REQUEST') ||
ctx.error.message?.includes('Email and password sign in is not enabled') ctx.error.message?.includes('Email and password sign in is not enabled')
@@ -271,6 +264,7 @@ export default function LoginPage({
errorMessage.push('Too many requests. Please wait a moment before trying again.') errorMessage.push('Too many requests. Please wait a moment before trying again.')
} }
setResetSuccessMessage(null)
setPasswordErrors(errorMessage) setPasswordErrors(errorMessage)
setShowValidationError(true) setShowValidationError(true)
}, },
@@ -278,15 +272,25 @@ export default function LoginPage({
) )
if (!result || result.error) { if (!result || result.error) {
// Show error if not already handled by onError callback
if (!errorHandled) {
setResetSuccessMessage(null)
const errorMessage = result?.error?.message || 'Login failed. Please try again.'
setPasswordErrors([errorMessage])
setShowValidationError(true)
}
setIsLoading(false) setIsLoading(false)
return return
} }
// Clear reset success message on successful login
setResetSuccessMessage(null)
// Explicit redirect fallback if better-auth doesn't redirect
router.push(safeCallbackUrl)
} catch (err: any) { } catch (err: any) {
if (err.message?.includes('not verified') || err.code?.includes('EMAIL_NOT_VERIFIED')) { if (err.message?.includes('not verified') || err.code?.includes('EMAIL_NOT_VERIFIED')) {
if (typeof window !== 'undefined') { redirectToVerify(email)
sessionStorage.setItem('verificationEmail', email)
}
router.push('/verify')
return return
} }
@@ -400,6 +404,13 @@ export default function LoginPage({
</div> </div>
)} )}
{/* Password reset success message */}
{resetSuccessMessage && (
<div className={`${inter.className} mt-1 space-y-1 text-[#4CAF50] text-xs`}>
<p>{resetSuccessMessage}</p>
</div>
)}
{/* Email/Password Form - show unless explicitly disabled */} {/* Email/Password Form - show unless explicitly disabled */}
{!isFalsy(getEnv('NEXT_PUBLIC_EMAIL_PASSWORD_SIGNUP_ENABLED')) && ( {!isFalsy(getEnv('NEXT_PUBLIC_EMAIL_PASSWORD_SIGNUP_ENABLED')) && (
<form onSubmit={onSubmit} className={`${inter.className} mt-8 space-y-8`}> <form onSubmit={onSubmit} className={`${inter.className} mt-8 space-y-8`}>
@@ -482,24 +493,14 @@ export default function LoginPage({
</div> </div>
</div> </div>
<Button <BrandedButton
type='submit' type='submit'
onMouseEnter={() => setIsButtonHovered(true)}
onMouseLeave={() => setIsButtonHovered(false)}
className='group inline-flex w-full items-center justify-center gap-2 rounded-[10px] border border-[#6F3DFA] bg-gradient-to-b from-[#8357FF] to-[#6F3DFA] py-[6px] pr-[10px] pl-[12px] text-[15px] text-white shadow-[inset_0_2px_4px_0_#9B77FF] transition-all'
disabled={isLoading} disabled={isLoading}
loading={isLoading}
loadingText='Signing in'
> >
<span className='flex items-center gap-1'> Sign in
{isLoading ? 'Signing in...' : 'Sign in'} </BrandedButton>
<span className='inline-flex transition-transform duration-200 group-hover:translate-x-0.5'>
{isButtonHovered ? (
<ArrowRight className='h-4 w-4' aria-hidden='true' />
) : (
<ChevronRight className='h-4 w-4' aria-hidden='true' />
)}
</span>
</span>
</Button>
</form> </form>
)} )}
@@ -610,25 +611,15 @@ export default function LoginPage({
<p>{resetStatus.message}</p> <p>{resetStatus.message}</p>
</div> </div>
)} )}
<Button <BrandedButton
type='button' type='button'
onClick={handleForgotPassword} onClick={handleForgotPassword}
onMouseEnter={() => setIsResetButtonHovered(true)}
onMouseLeave={() => setIsResetButtonHovered(false)}
className='group inline-flex w-full items-center justify-center gap-2 rounded-[10px] border border-[#6F3DFA] bg-gradient-to-b from-[#8357FF] to-[#6F3DFA] py-[6px] pr-[10px] pl-[12px] text-[15px] text-white shadow-[inset_0_2px_4px_0_#9B77FF] transition-all'
disabled={isSubmittingReset} disabled={isSubmittingReset}
loading={isSubmittingReset}
loadingText='Sending'
> >
<span className='flex items-center gap-1'> Send Reset Link
{isSubmittingReset ? 'Sending...' : 'Send Reset Link'} </BrandedButton>
<span className='inline-flex transition-transform duration-200 group-hover:translate-x-0.5'>
{isResetButtonHovered ? (
<ArrowRight className='h-4 w-4' aria-hidden='true' />
) : (
<ChevronRight className='h-4 w-4' aria-hidden='true' />
)}
</span>
</span>
</Button>
</div> </div>
</DialogContent> </DialogContent>
</Dialog> </Dialog>

View File

@@ -1,12 +1,12 @@
 'use client'
-import { useEffect, useState } from 'react'
+import { useState } from 'react'
-import { ArrowRight, ChevronRight, Eye, EyeOff } from 'lucide-react'
+import { Eye, EyeOff } from 'lucide-react'
-import { Button } from '@/components/ui/button'
 import { Input } from '@/components/ui/input'
 import { Label } from '@/components/ui/label'
 import { cn } from '@/lib/core/utils/cn'
 import { inter } from '@/app/_styles/fonts/inter/inter'
+import { BrandedButton } from '@/app/(auth)/components/branded-button'
 interface RequestResetFormProps {
   email: string
@@ -27,36 +27,6 @@ export function RequestResetForm({
statusMessage, statusMessage,
className, className,
}: RequestResetFormProps) { }: RequestResetFormProps) {
const [buttonClass, setButtonClass] = useState('branded-button-gradient')
const [isButtonHovered, setIsButtonHovered] = useState(false)
useEffect(() => {
const checkCustomBrand = () => {
const computedStyle = getComputedStyle(document.documentElement)
const brandAccent = computedStyle.getPropertyValue('--brand-accent-hex').trim()
if (brandAccent && brandAccent !== '#6f3dfa') {
setButtonClass('branded-button-custom')
} else {
setButtonClass('branded-button-gradient')
}
}
checkCustomBrand()
window.addEventListener('resize', checkCustomBrand)
const observer = new MutationObserver(checkCustomBrand)
observer.observe(document.documentElement, {
attributes: true,
attributeFilter: ['style', 'class'],
})
return () => {
window.removeEventListener('resize', checkCustomBrand)
observer.disconnect()
}
}, [])
const handleSubmit = async (e: React.FormEvent) => { const handleSubmit = async (e: React.FormEvent) => {
e.preventDefault() e.preventDefault()
onSubmit(email) onSubmit(email)
@@ -94,24 +64,14 @@ export function RequestResetForm({
)} )}
</div> </div>
<Button <BrandedButton
type='submit' type='submit'
disabled={isSubmitting} disabled={isSubmitting}
onMouseEnter={() => setIsButtonHovered(true)} loading={isSubmitting}
onMouseLeave={() => setIsButtonHovered(false)} loadingText='Sending'
className='group inline-flex w-full items-center justify-center gap-2 rounded-[10px] border border-[#6F3DFA] bg-gradient-to-b from-[#8357FF] to-[#6F3DFA] py-[6px] pr-[10px] pl-[12px] text-[15px] text-white shadow-[inset_0_2px_4px_0_#9B77FF] transition-all'
> >
<span className='flex items-center gap-1'> Send Reset Link
{isSubmitting ? 'Sending...' : 'Send Reset Link'} </BrandedButton>
<span className='inline-flex transition-transform duration-200 group-hover:translate-x-0.5'>
{isButtonHovered ? (
<ArrowRight className='h-4 w-4' aria-hidden='true' />
) : (
<ChevronRight className='h-4 w-4' aria-hidden='true' />
)}
</span>
</span>
</Button>
</form> </form>
) )
} }
@@ -138,35 +98,6 @@ export function SetNewPasswordForm({
const [validationMessage, setValidationMessage] = useState('') const [validationMessage, setValidationMessage] = useState('')
const [showPassword, setShowPassword] = useState(false) const [showPassword, setShowPassword] = useState(false)
const [showConfirmPassword, setShowConfirmPassword] = useState(false) const [showConfirmPassword, setShowConfirmPassword] = useState(false)
const [buttonClass, setButtonClass] = useState('branded-button-gradient')
const [isButtonHovered, setIsButtonHovered] = useState(false)
useEffect(() => {
const checkCustomBrand = () => {
const computedStyle = getComputedStyle(document.documentElement)
const brandAccent = computedStyle.getPropertyValue('--brand-accent-hex').trim()
if (brandAccent && brandAccent !== '#6f3dfa') {
setButtonClass('branded-button-custom')
} else {
setButtonClass('branded-button-gradient')
}
}
checkCustomBrand()
window.addEventListener('resize', checkCustomBrand)
const observer = new MutationObserver(checkCustomBrand)
observer.observe(document.documentElement, {
attributes: true,
attributeFilter: ['style', 'class'],
})
return () => {
window.removeEventListener('resize', checkCustomBrand)
observer.disconnect()
}
}, [])
const handleSubmit = async (e: React.FormEvent) => { const handleSubmit = async (e: React.FormEvent) => {
e.preventDefault() e.preventDefault()
@@ -296,24 +227,14 @@ export function SetNewPasswordForm({
        )}
      </div>
-      <Button
-        disabled={isSubmitting || !token}
+      <BrandedButton
        type='submit'
-        onMouseEnter={() => setIsButtonHovered(true)}
-        onMouseLeave={() => setIsButtonHovered(false)}
-        className='group inline-flex w-full items-center justify-center gap-2 rounded-[10px] border border-[#6F3DFA] bg-gradient-to-b from-[#8357FF] to-[#6F3DFA] py-[6px] pr-[10px] pl-[12px] text-[15px] text-white shadow-[inset_0_2px_4px_0_#9B77FF] transition-all'
+        disabled={isSubmitting || !token}
+        loading={isSubmitting}
+        loadingText='Resetting'
      >
-        <span className='flex items-center gap-1'>
-          {isSubmitting ? 'Resetting...' : 'Reset Password'}
-          <span className='inline-flex transition-transform duration-200 group-hover:translate-x-0.5'>
-            {isButtonHovered ? (
-              <ArrowRight className='h-4 w-4' aria-hidden='true' />
-            ) : (
-              <ChevronRight className='h-4 w-4' aria-hidden='true' />
-            )}
-          </span>
-        </span>
-      </Button>
+        Reset Password
+      </BrandedButton>
    </form>
  )
}
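Both reset-password forms (and the signup and careers forms further down) now render their submit buttons through a shared BrandedButton component. Only its call sites appear in this diff, so the following interface is reconstructed from the props those call sites pass (type, disabled, loading, loadingText, showArrow, fullWidth, className); treat it as a sketch rather than the component's actual source.

```tsx
// Hypothetical sketch of app/(auth)/components/branded-button.tsx, inferred
// from its call sites in this diff; the real implementation is not shown here.
import type { ReactNode } from 'react'
import { Button } from '@/components/ui/button'
import { cn } from '@/lib/core/utils/cn'
import { useBrandedButtonClass } from '@/hooks/use-branded-button-class'

interface BrandedButtonProps {
  type?: 'button' | 'submit'
  disabled?: boolean
  loading?: boolean
  loadingText?: string // 'Sending', 'Resetting', 'Creating account', 'Submitting'
  showArrow?: boolean // the careers page passes showArrow={false}
  fullWidth?: boolean // the careers page passes fullWidth={false}
  className?: string
  children: ReactNode
}

export function BrandedButton({
  type = 'button',
  disabled,
  loading,
  loadingText = 'Loading',
  fullWidth = true,
  className,
  children,
}: BrandedButtonProps) {
  // Same gradient-vs-custom-brand class the forms previously computed inline.
  const buttonClass = useBrandedButtonClass()
  return (
    <Button
      type={type}
      disabled={disabled}
      className={cn(buttonClass, fullWidth && 'w-full', className)}
    >
      {loading ? `${loadingText}...` : children}
    </Button>
  )
}
```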

View File

@@ -2,10 +2,9 @@
 import { Suspense, useEffect, useState } from 'react'
 import { createLogger } from '@sim/logger'
-import { ArrowRight, ChevronRight, Eye, EyeOff } from 'lucide-react'
+import { Eye, EyeOff } from 'lucide-react'
 import Link from 'next/link'
 import { useRouter, useSearchParams } from 'next/navigation'
-import { Button } from '@/components/ui/button'
 import { Input } from '@/components/ui/input'
 import { Label } from '@/components/ui/label'
 import { client, useSession } from '@/lib/auth/auth-client'
@@ -14,8 +13,10 @@ import { cn } from '@/lib/core/utils/cn'
 import { quickValidateEmail } from '@/lib/messaging/email/validation'
 import { inter } from '@/app/_styles/fonts/inter/inter'
 import { soehne } from '@/app/_styles/fonts/soehne/soehne'
+import { BrandedButton } from '@/app/(auth)/components/branded-button'
 import { SocialLoginButtons } from '@/app/(auth)/components/social-login-buttons'
 import { SSOLoginButton } from '@/app/(auth)/components/sso-login-button'
+import { useBrandedButtonClass } from '@/hooks/use-branded-button-class'
 const logger = createLogger('SignupForm')
@@ -95,8 +96,7 @@ function SignupFormContent({
   const [showEmailValidationError, setShowEmailValidationError] = useState(false)
   const [redirectUrl, setRedirectUrl] = useState('')
   const [isInviteFlow, setIsInviteFlow] = useState(false)
-  const [buttonClass, setButtonClass] = useState('branded-button-gradient')
-  const [isButtonHovered, setIsButtonHovered] = useState(false)
+  const buttonClass = useBrandedButtonClass()
   const [name, setName] = useState('')
   const [nameErrors, setNameErrors] = useState<string[]>([])
@@ -126,31 +126,6 @@ function SignupFormContent({
if (inviteFlowParam === 'true') { if (inviteFlowParam === 'true') {
setIsInviteFlow(true) setIsInviteFlow(true)
} }
const checkCustomBrand = () => {
const computedStyle = getComputedStyle(document.documentElement)
const brandAccent = computedStyle.getPropertyValue('--brand-accent-hex').trim()
if (brandAccent && brandAccent !== '#6f3dfa') {
setButtonClass('branded-button-custom')
} else {
setButtonClass('branded-button-gradient')
}
}
checkCustomBrand()
window.addEventListener('resize', checkCustomBrand)
const observer = new MutationObserver(checkCustomBrand)
observer.observe(document.documentElement, {
attributes: true,
attributeFilter: ['style', 'class'],
})
return () => {
window.removeEventListener('resize', checkCustomBrand)
observer.disconnect()
}
}, [searchParams]) }, [searchParams])
const validatePassword = (passwordValue: string): string[] => { const validatePassword = (passwordValue: string): string[] => {
@@ -500,24 +475,14 @@ function SignupFormContent({
          </div>
        </div>
-        <Button
+        <BrandedButton
          type='submit'
-          onMouseEnter={() => setIsButtonHovered(true)}
-          onMouseLeave={() => setIsButtonHovered(false)}
-          className='group inline-flex w-full items-center justify-center gap-2 rounded-[10px] border border-[#6F3DFA] bg-gradient-to-b from-[#8357FF] to-[#6F3DFA] py-[6px] pr-[10px] pl-[12px] text-[15px] text-white shadow-[inset_0_2px_4px_0_#9B77FF] transition-all'
          disabled={isLoading}
+          loading={isLoading}
+          loadingText='Creating account'
        >
-          <span className='flex items-center gap-1'>
-            {isLoading ? 'Creating account' : 'Create account'}
-            <span className='inline-flex transition-transform duration-200 group-hover:translate-x-0.5'>
-              {isButtonHovered ? (
-                <ArrowRight className='h-4 w-4' aria-hidden='true' />
-              ) : (
-                <ChevronRight className='h-4 w-4' aria-hidden='true' />
-              )}
-            </span>
-          </span>
-        </Button>
+          Create account
+        </BrandedButton>
      </form>
    )}
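The inline checkCustomBrand effect deleted from each of these forms is what the new useBrandedButtonClass hook presumably wraps. Reassembled from the removed lines above (the hook's own file is not part of this diff), it would look roughly like:

```ts
// Hypothetical hooks/use-branded-button-class.ts, reconstructed from the
// inline effect removed in the forms above; not the actual file contents.
import { useEffect, useState } from 'react'

export function useBrandedButtonClass(): string {
  const [buttonClass, setButtonClass] = useState('branded-button-gradient')

  useEffect(() => {
    const checkCustomBrand = () => {
      const computedStyle = getComputedStyle(document.documentElement)
      const brandAccent = computedStyle.getPropertyValue('--brand-accent-hex').trim()
      // Default accent is #6f3dfa; anything else means a custom brand is active.
      setButtonClass(
        brandAccent && brandAccent !== '#6f3dfa'
          ? 'branded-button-custom'
          : 'branded-button-gradient'
      )
    }

    checkCustomBrand()
    window.addEventListener('resize', checkCustomBrand)
    const observer = new MutationObserver(checkCustomBrand)
    observer.observe(document.documentElement, {
      attributes: true,
      attributeFilter: ['style', 'class'],
    })
    return () => {
      window.removeEventListener('resize', checkCustomBrand)
      observer.disconnect()
    }
  }, [])

  return buttonClass
}
```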

View File

@@ -13,6 +13,7 @@ import { cn } from '@/lib/core/utils/cn'
import { quickValidateEmail } from '@/lib/messaging/email/validation' import { quickValidateEmail } from '@/lib/messaging/email/validation'
import { inter } from '@/app/_styles/fonts/inter/inter' import { inter } from '@/app/_styles/fonts/inter/inter'
import { soehne } from '@/app/_styles/fonts/soehne/soehne' import { soehne } from '@/app/_styles/fonts/soehne/soehne'
import { useBrandedButtonClass } from '@/hooks/use-branded-button-class'
const logger = createLogger('SSOForm') const logger = createLogger('SSOForm')
@@ -57,7 +58,7 @@ export default function SSOForm() {
const [email, setEmail] = useState('') const [email, setEmail] = useState('')
const [emailErrors, setEmailErrors] = useState<string[]>([]) const [emailErrors, setEmailErrors] = useState<string[]>([])
const [showEmailValidationError, setShowEmailValidationError] = useState(false) const [showEmailValidationError, setShowEmailValidationError] = useState(false)
-  const [buttonClass, setButtonClass] = useState('branded-button-gradient')
+  const buttonClass = useBrandedButtonClass()
const [callbackUrl, setCallbackUrl] = useState('/workspace') const [callbackUrl, setCallbackUrl] = useState('/workspace')
useEffect(() => { useEffect(() => {
@@ -90,31 +91,6 @@ export default function SSOForm() {
setShowEmailValidationError(true) setShowEmailValidationError(true)
} }
} }
const checkCustomBrand = () => {
const computedStyle = getComputedStyle(document.documentElement)
const brandAccent = computedStyle.getPropertyValue('--brand-accent-hex').trim()
if (brandAccent && brandAccent !== '#6f3dfa') {
setButtonClass('branded-button-custom')
} else {
setButtonClass('branded-button-gradient')
}
}
checkCustomBrand()
window.addEventListener('resize', checkCustomBrand)
const observer = new MutationObserver(checkCustomBrand)
observer.observe(document.documentElement, {
attributes: true,
attributeFilter: ['style', 'class'],
})
return () => {
window.removeEventListener('resize', checkCustomBrand)
observer.disconnect()
}
}, [searchParams]) }, [searchParams])
const handleEmailChange = (e: React.ChangeEvent<HTMLInputElement>) => { const handleEmailChange = (e: React.ChangeEvent<HTMLInputElement>) => {

View File

@@ -8,6 +8,7 @@ import { cn } from '@/lib/core/utils/cn'
import { inter } from '@/app/_styles/fonts/inter/inter' import { inter } from '@/app/_styles/fonts/inter/inter'
import { soehne } from '@/app/_styles/fonts/soehne/soehne' import { soehne } from '@/app/_styles/fonts/soehne/soehne'
import { useVerification } from '@/app/(auth)/verify/use-verification' import { useVerification } from '@/app/(auth)/verify/use-verification'
import { useBrandedButtonClass } from '@/hooks/use-branded-button-class'
interface VerifyContentProps { interface VerifyContentProps {
hasEmailService: boolean hasEmailService: boolean
@@ -58,34 +59,7 @@ function VerificationForm({
setCountdown(30) setCountdown(30)
} }
-  const [buttonClass, setButtonClass] = useState('branded-button-gradient')
+  const buttonClass = useBrandedButtonClass()
useEffect(() => {
const checkCustomBrand = () => {
const computedStyle = getComputedStyle(document.documentElement)
const brandAccent = computedStyle.getPropertyValue('--brand-accent-hex').trim()
if (brandAccent && brandAccent !== '#6f3dfa') {
setButtonClass('branded-button-custom')
} else {
setButtonClass('branded-button-gradient')
}
}
checkCustomBrand()
window.addEventListener('resize', checkCustomBrand)
const observer = new MutationObserver(checkCustomBrand)
observer.observe(document.documentElement, {
attributes: true,
attributeFilter: ['style', 'class'],
})
return () => {
window.removeEventListener('resize', checkCustomBrand)
observer.disconnect()
}
}, [])
return ( return (
<> <>

View File

@@ -4,7 +4,6 @@ import { useRef, useState } from 'react'
import { createLogger } from '@sim/logger' import { createLogger } from '@sim/logger'
import { X } from 'lucide-react' import { X } from 'lucide-react'
import { Textarea } from '@/components/emcn' import { Textarea } from '@/components/emcn'
import { Button } from '@/components/ui/button'
import { Input } from '@/components/ui/input' import { Input } from '@/components/ui/input'
import { Label } from '@/components/ui/label' import { Label } from '@/components/ui/label'
import { import {
@@ -18,6 +17,7 @@ import { isHosted } from '@/lib/core/config/feature-flags'
import { cn } from '@/lib/core/utils/cn' import { cn } from '@/lib/core/utils/cn'
import { quickValidateEmail } from '@/lib/messaging/email/validation' import { quickValidateEmail } from '@/lib/messaging/email/validation'
import { soehne } from '@/app/_styles/fonts/soehne/soehne' import { soehne } from '@/app/_styles/fonts/soehne/soehne'
import { BrandedButton } from '@/app/(auth)/components/branded-button'
import Footer from '@/app/(landing)/components/footer/footer' import Footer from '@/app/(landing)/components/footer/footer'
import Nav from '@/app/(landing)/components/nav/nav' import Nav from '@/app/(landing)/components/nav/nav'
@@ -493,18 +493,17 @@ export default function CareersPage() {
              {/* Submit Button */}
              <div className='flex justify-end pt-2'>
-                <Button
+                <BrandedButton
                  type='submit'
                  disabled={isSubmitting || submitStatus === 'success'}
-                  className='min-w-[200px] rounded-[10px] border border-[#6F3DFA] bg-gradient-to-b from-[#8357FF] to-[#6F3DFA] text-white shadow-[inset_0_2px_4px_0_#9B77FF] transition-all duration-300 hover:opacity-90 disabled:opacity-50'
-                  size='lg'
+                  loading={isSubmitting}
+                  loadingText='Submitting'
+                  showArrow={false}
+                  fullWidth={false}
+                  className='min-w-[200px]'
                >
-                  {isSubmitting
-                    ? 'Submitting...'
-                    : submitStatus === 'success'
-                      ? 'Submitted'
-                      : 'Submit Application'}
-                </Button>
+                  {submitStatus === 'success' ? 'Submitted' : 'Submit Application'}
+                </BrandedButton>
              </div>
            </form>
          </section>

View File

@@ -11,6 +11,7 @@ import { useBrandConfig } from '@/lib/branding/branding'
import { isHosted } from '@/lib/core/config/feature-flags' import { isHosted } from '@/lib/core/config/feature-flags'
import { soehne } from '@/app/_styles/fonts/soehne/soehne' import { soehne } from '@/app/_styles/fonts/soehne/soehne'
import { getFormattedGitHubStars } from '@/app/(landing)/actions/github' import { getFormattedGitHubStars } from '@/app/(landing)/actions/github'
import { useBrandedButtonClass } from '@/hooks/use-branded-button-class'
const logger = createLogger('nav') const logger = createLogger('nav')
@@ -20,11 +21,12 @@ interface NavProps {
} }
export default function Nav({ hideAuthButtons = false, variant = 'landing' }: NavProps = {}) { export default function Nav({ hideAuthButtons = false, variant = 'landing' }: NavProps = {}) {
-  const [githubStars, setGithubStars] = useState('25.1k')
+  const [githubStars, setGithubStars] = useState('25.8k')
const [isHovered, setIsHovered] = useState(false) const [isHovered, setIsHovered] = useState(false)
const [isLoginHovered, setIsLoginHovered] = useState(false) const [isLoginHovered, setIsLoginHovered] = useState(false)
const router = useRouter() const router = useRouter()
const brand = useBrandConfig() const brand = useBrandConfig()
const buttonClass = useBrandedButtonClass()
useEffect(() => { useEffect(() => {
if (variant !== 'landing') return if (variant !== 'landing') return
@@ -183,7 +185,7 @@ export default function Nav({ hideAuthButtons = false, variant = 'landing' }: Na
href='/signup' href='/signup'
onMouseEnter={() => setIsHovered(true)} onMouseEnter={() => setIsHovered(true)}
onMouseLeave={() => setIsHovered(false)} onMouseLeave={() => setIsHovered(false)}
-              className='group inline-flex items-center justify-center gap-2 rounded-[10px] border border-[#6F3DFA] bg-gradient-to-b from-[#8357FF] to-[#6F3DFA] py-[6px] pr-[10px] pl-[12px] text-[14px] text-white shadow-[inset_0_2px_4px_0_#9B77FF] transition-all sm:text-[16px]'
+              className={`${buttonClass} group inline-flex items-center justify-center gap-2 rounded-[10px] py-[6px] pr-[10px] pl-[12px] text-[15px] text-white transition-all`}
aria-label='Get started with Sim - Sign up for free' aria-label='Get started with Sim - Sign up for free'
prefetch={true} prefetch={true}
> >

View File

@@ -4,6 +4,11 @@ import { createLogger } from '@sim/logger'
import { and, desc, eq, inArray } from 'drizzle-orm' import { and, desc, eq, inArray } from 'drizzle-orm'
import { getSession } from '@/lib/auth' import { getSession } from '@/lib/auth'
import { refreshOAuthToken } from '@/lib/oauth' import { refreshOAuthToken } from '@/lib/oauth'
import {
getMicrosoftRefreshTokenExpiry,
isMicrosoftProvider,
PROACTIVE_REFRESH_THRESHOLD_DAYS,
} from '@/lib/oauth/microsoft'
const logger = createLogger('OAuthUtilsAPI') const logger = createLogger('OAuthUtilsAPI')
@@ -205,15 +210,32 @@ export async function refreshAccessTokenIfNeeded(
   }
   // Decide if we should refresh: token missing OR expired
-  const expiresAt = credential.accessTokenExpiresAt
+  const accessTokenExpiresAt = credential.accessTokenExpiresAt
+  const refreshTokenExpiresAt = credential.refreshTokenExpiresAt
   const now = new Date()
-  const shouldRefresh =
-    !!credential.refreshToken && (!credential.accessToken || (expiresAt && expiresAt <= now))
+
+  // Check if access token needs refresh (missing or expired)
+  const accessTokenNeedsRefresh =
+    !!credential.refreshToken &&
+    (!credential.accessToken || (accessTokenExpiresAt && accessTokenExpiresAt <= now))
+
+  // Check if we should proactively refresh to prevent refresh token expiry
+  // This applies to Microsoft providers whose refresh tokens expire after 90 days of inactivity
+  const proactiveRefreshThreshold = new Date(
+    now.getTime() + PROACTIVE_REFRESH_THRESHOLD_DAYS * 24 * 60 * 60 * 1000
+  )
+  const refreshTokenNeedsProactiveRefresh =
+    !!credential.refreshToken &&
+    isMicrosoftProvider(credential.providerId) &&
+    refreshTokenExpiresAt &&
+    refreshTokenExpiresAt <= proactiveRefreshThreshold
+
+  const shouldRefresh = accessTokenNeedsRefresh || refreshTokenNeedsProactiveRefresh
   const accessToken = credential.accessToken
   if (shouldRefresh) {
-    logger.info(`[${requestId}] Token expired, attempting to refresh for credential`)
+    logger.info(`[${requestId}] Refreshing token for credential`)
     try {
       const refreshedToken = await refreshOAuthToken(
         credential.providerId,
@@ -227,11 +249,15 @@ export async function refreshAccessTokenIfNeeded(
userId: credential.userId, userId: credential.userId,
hasRefreshToken: !!credential.refreshToken, hasRefreshToken: !!credential.refreshToken,
}) })
if (!accessTokenNeedsRefresh && accessToken) {
logger.info(`[${requestId}] Proactive refresh failed but access token still valid`)
return accessToken
}
return null return null
} }
// Prepare update data // Prepare update data
-      const updateData: any = {
+      const updateData: Record<string, unknown> = {
accessToken: refreshedToken.accessToken, accessToken: refreshedToken.accessToken,
accessTokenExpiresAt: new Date(Date.now() + refreshedToken.expiresIn * 1000), accessTokenExpiresAt: new Date(Date.now() + refreshedToken.expiresIn * 1000),
updatedAt: new Date(), updatedAt: new Date(),
@@ -243,6 +269,10 @@ export async function refreshAccessTokenIfNeeded(
updateData.refreshToken = refreshedToken.refreshToken updateData.refreshToken = refreshedToken.refreshToken
} }
if (isMicrosoftProvider(credential.providerId)) {
updateData.refreshTokenExpiresAt = getMicrosoftRefreshTokenExpiry()
}
// Update the token in the database // Update the token in the database
await db.update(account).set(updateData).where(eq(account.id, credentialId)) await db.update(account).set(updateData).where(eq(account.id, credentialId))
@@ -256,6 +286,10 @@ export async function refreshAccessTokenIfNeeded(
credentialId, credentialId,
userId: credential.userId, userId: credential.userId,
}) })
if (!accessTokenNeedsRefresh && accessToken) {
logger.info(`[${requestId}] Proactive refresh failed but access token still valid`)
return accessToken
}
return null return null
} }
} else if (!accessToken) { } else if (!accessToken) {
@@ -277,10 +311,27 @@ export async function refreshTokenIfNeeded(
   credentialId: string
 ): Promise<{ accessToken: string; refreshed: boolean }> {
   // Decide if we should refresh: token missing OR expired
-  const expiresAt = credential.accessTokenExpiresAt
+  const accessTokenExpiresAt = credential.accessTokenExpiresAt
+  const refreshTokenExpiresAt = credential.refreshTokenExpiresAt
   const now = new Date()
-  const shouldRefresh =
-    !!credential.refreshToken && (!credential.accessToken || (expiresAt && expiresAt <= now))
+
+  // Check if access token needs refresh (missing or expired)
+  const accessTokenNeedsRefresh =
+    !!credential.refreshToken &&
+    (!credential.accessToken || (accessTokenExpiresAt && accessTokenExpiresAt <= now))
+
+  // Check if we should proactively refresh to prevent refresh token expiry
+  // This applies to Microsoft providers whose refresh tokens expire after 90 days of inactivity
+  const proactiveRefreshThreshold = new Date(
+    now.getTime() + PROACTIVE_REFRESH_THRESHOLD_DAYS * 24 * 60 * 60 * 1000
+  )
+  const refreshTokenNeedsProactiveRefresh =
+    !!credential.refreshToken &&
+    isMicrosoftProvider(credential.providerId) &&
+    refreshTokenExpiresAt &&
+    refreshTokenExpiresAt <= proactiveRefreshThreshold
+
+  const shouldRefresh = accessTokenNeedsRefresh || refreshTokenNeedsProactiveRefresh
   // If token appears valid and present, return it directly
   if (!shouldRefresh) {
@@ -293,13 +344,17 @@ export async function refreshTokenIfNeeded(
if (!refreshResult) { if (!refreshResult) {
logger.error(`[${requestId}] Failed to refresh token for credential`) logger.error(`[${requestId}] Failed to refresh token for credential`)
if (!accessTokenNeedsRefresh && credential.accessToken) {
logger.info(`[${requestId}] Proactive refresh failed but access token still valid`)
return { accessToken: credential.accessToken, refreshed: false }
}
throw new Error('Failed to refresh token') throw new Error('Failed to refresh token')
} }
const { accessToken: refreshedToken, expiresIn, refreshToken: newRefreshToken } = refreshResult const { accessToken: refreshedToken, expiresIn, refreshToken: newRefreshToken } = refreshResult
// Prepare update data // Prepare update data
-    const updateData: any = {
+    const updateData: Record<string, unknown> = {
accessToken: refreshedToken, accessToken: refreshedToken,
accessTokenExpiresAt: new Date(Date.now() + expiresIn * 1000), // Use provider's expiry accessTokenExpiresAt: new Date(Date.now() + expiresIn * 1000), // Use provider's expiry
updatedAt: new Date(), updatedAt: new Date(),
@@ -311,6 +366,10 @@ export async function refreshTokenIfNeeded(
updateData.refreshToken = newRefreshToken updateData.refreshToken = newRefreshToken
} }
if (isMicrosoftProvider(credential.providerId)) {
updateData.refreshTokenExpiresAt = getMicrosoftRefreshTokenExpiry()
}
await db.update(account).set(updateData).where(eq(account.id, credentialId)) await db.update(account).set(updateData).where(eq(account.id, credentialId))
logger.info(`[${requestId}] Successfully refreshed access token`) logger.info(`[${requestId}] Successfully refreshed access token`)
@@ -331,6 +390,11 @@ export async function refreshTokenIfNeeded(
} }
} }
if (!accessTokenNeedsRefresh && credential.accessToken) {
logger.info(`[${requestId}] Proactive refresh failed but access token still valid`)
return { accessToken: credential.accessToken, refreshed: false }
}
logger.error(`[${requestId}] Refresh failed and no valid token found in DB`, error) logger.error(`[${requestId}] Refresh failed and no valid token found in DB`, error)
throw error throw error
} }
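The refresh changes import isMicrosoftProvider, getMicrosoftRefreshTokenExpiry, and PROACTIVE_REFRESH_THRESHOLD_DAYS from @/lib/oauth/microsoft, a module not included in this diff. Given the comment that Microsoft refresh tokens lapse after 90 days of inactivity, a plausible sketch of those helpers follows; only the exported names are confirmed, and the values and provider list are assumptions.

```ts
// Hypothetical lib/oauth/microsoft.ts; names come from the import above,
// everything else (values, provider matching) is assumed for illustration.
const DAY_MS = 24 * 60 * 60 * 1000
const MICROSOFT_REFRESH_TOKEN_LIFETIME_DAYS = 90 // expires after 90 days of inactivity
export const PROACTIVE_REFRESH_THRESHOLD_DAYS = 30 // assumed value, not shown in this diff

export function isMicrosoftProvider(providerId: string): boolean {
  // Assumed provider IDs; the real list lives in the module above.
  return /^(microsoft|outlook|onedrive|sharepoint|teams)/.test(providerId)
}

export function getMicrosoftRefreshTokenExpiry(): Date {
  // Each successful refresh restarts the 90-day inactivity window.
  return new Date(Date.now() + MICROSOFT_REFRESH_TOKEN_LIFETIME_DAYS * DAY_MS)
}
```

With these pieces, shouldRefresh fires either when the access token is missing or expired or when a Microsoft refresh token is within the proactive threshold of expiring, and a failed proactive refresh falls back to the still-valid access token instead of erroring out.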

View File

@@ -15,7 +15,8 @@ const resetPasswordSchema = z.object({
.max(100, 'Password must not exceed 100 characters') .max(100, 'Password must not exceed 100 characters')
.regex(/[A-Z]/, 'Password must contain at least one uppercase letter') .regex(/[A-Z]/, 'Password must contain at least one uppercase letter')
     .regex(/[a-z]/, 'Password must contain at least one lowercase letter')
-    .regex(/[0-9]/, 'Password must contain at least one number'),
+    .regex(/[0-9]/, 'Password must contain at least one number')
+    .regex(/[^A-Za-z0-9]/, 'Password must contain at least one special character'),
}) })
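With the new rule, the reset-password schema requires all four character classes. Reassembled from the rules visible in this hunk (the minimum-length rule sits above the excerpt, so its exact value is an assumption), the password field reads roughly:

```ts
// Sketch of the password field only; the 8-character minimum is assumed,
// the remaining rules are copied from the hunk above.
import { z } from 'zod'

const password = z
  .string()
  .min(8, 'Password must be at least 8 characters') // assumed minimum
  .max(100, 'Password must not exceed 100 characters')
  .regex(/[A-Z]/, 'Password must contain at least one uppercase letter')
  .regex(/[a-z]/, 'Password must contain at least one lowercase letter')
  .regex(/[0-9]/, 'Password must contain at least one number')
  .regex(/[^A-Za-z0-9]/, 'Password must contain at least one special character')

// 'Sunny2024' now fails the new rule; 'Sunny-2024' passes.
```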
export async function POST(request: NextRequest) { export async function POST(request: NextRequest) {

View File

@@ -224,7 +224,7 @@ export async function POST(req: NextRequest) {
hasApiKey: !!executionParams.apiKey, hasApiKey: !!executionParams.apiKey,
}) })
-      const result = await executeTool(resolvedToolName, executionParams, true)
+      const result = await executeTool(resolvedToolName, executionParams)
logger.info(`[${tracker.requestId}] Tool execution complete`, { logger.info(`[${tracker.requestId}] Tool execution complete`, {
toolName, toolName,

View File

@@ -6,9 +6,10 @@ import { createLogger } from '@sim/logger'
import binaryExtensionsList from 'binary-extensions' import binaryExtensionsList from 'binary-extensions'
import { type NextRequest, NextResponse } from 'next/server' import { type NextRequest, NextResponse } from 'next/server'
import { checkHybridAuth } from '@/lib/auth/hybrid' import { checkHybridAuth } from '@/lib/auth/hybrid'
-import { createPinnedUrl, validateUrlWithDNS } from '@/lib/core/security/input-validation'
+import { secureFetchWithPinnedIP, validateUrlWithDNS } from '@/lib/core/security/input-validation'
import { isSupportedFileType, parseFile } from '@/lib/file-parsers' import { isSupportedFileType, parseFile } from '@/lib/file-parsers'
import { isUsingCloudStorage, type StorageContext, StorageService } from '@/lib/uploads' import { isUsingCloudStorage, type StorageContext, StorageService } from '@/lib/uploads'
import { uploadExecutionFile } from '@/lib/uploads/contexts/execution'
import { UPLOAD_DIR_SERVER } from '@/lib/uploads/core/setup.server' import { UPLOAD_DIR_SERVER } from '@/lib/uploads/core/setup.server'
import { getFileMetadataByKey } from '@/lib/uploads/server/metadata' import { getFileMetadataByKey } from '@/lib/uploads/server/metadata'
import { import {
@@ -21,6 +22,7 @@ import {
} from '@/lib/uploads/utils/file-utils' } from '@/lib/uploads/utils/file-utils'
import { getUserEntityPermissions } from '@/lib/workspaces/permissions/utils' import { getUserEntityPermissions } from '@/lib/workspaces/permissions/utils'
import { verifyFileAccess } from '@/app/api/files/authorization' import { verifyFileAccess } from '@/app/api/files/authorization'
import type { UserFile } from '@/executor/types'
import '@/lib/uploads/core/setup.server' import '@/lib/uploads/core/setup.server'
export const dynamic = 'force-dynamic' export const dynamic = 'force-dynamic'
@@ -30,6 +32,12 @@ const logger = createLogger('FilesParseAPI')
const MAX_DOWNLOAD_SIZE_BYTES = 100 * 1024 * 1024 // 100 MB const MAX_DOWNLOAD_SIZE_BYTES = 100 * 1024 * 1024 // 100 MB
const DOWNLOAD_TIMEOUT_MS = 30000 // 30 seconds const DOWNLOAD_TIMEOUT_MS = 30000 // 30 seconds
interface ExecutionContext {
workspaceId: string
workflowId: string
executionId: string
}
interface ParseResult { interface ParseResult {
success: boolean success: boolean
content?: string content?: string
@@ -37,6 +45,7 @@ interface ParseResult {
filePath: string filePath: string
originalName?: string // Original filename from database (for workspace files) originalName?: string // Original filename from database (for workspace files)
viewerUrl?: string | null // Viewer URL for the file if available viewerUrl?: string | null // Viewer URL for the file if available
userFile?: UserFile // UserFile object for the raw file
metadata?: { metadata?: {
fileType: string fileType: string
size: number size: number
@@ -70,27 +79,45 @@ export async function POST(request: NextRequest) {
const userId = authResult.userId const userId = authResult.userId
const requestData = await request.json() const requestData = await request.json()
-    const { filePath, fileType, workspaceId } = requestData
+    const { filePath, fileType, workspaceId, workflowId, executionId } = requestData
if (!filePath || (typeof filePath === 'string' && filePath.trim() === '')) { if (!filePath || (typeof filePath === 'string' && filePath.trim() === '')) {
return NextResponse.json({ success: false, error: 'No file path provided' }, { status: 400 }) return NextResponse.json({ success: false, error: 'No file path provided' }, { status: 400 })
} }
logger.info('File parse request received:', { filePath, fileType, workspaceId, userId }) // Build execution context if all required fields are present
const executionContext: ExecutionContext | undefined =
workspaceId && workflowId && executionId
? { workspaceId, workflowId, executionId }
: undefined
logger.info('File parse request received:', {
filePath,
fileType,
workspaceId,
userId,
hasExecutionContext: !!executionContext,
})
if (Array.isArray(filePath)) { if (Array.isArray(filePath)) {
const results = [] const results = []
for (const path of filePath) { for (const singlePath of filePath) {
if (!path || (typeof path === 'string' && path.trim() === '')) { if (!singlePath || (typeof singlePath === 'string' && singlePath.trim() === '')) {
results.push({ results.push({
success: false, success: false,
error: 'Empty file path in array', error: 'Empty file path in array',
filePath: path || '', filePath: singlePath || '',
}) })
continue continue
} }
const result = await parseFileSingle(path, fileType, workspaceId, userId) const result = await parseFileSingle(
singlePath,
fileType,
workspaceId,
userId,
executionContext
)
if (result.metadata) { if (result.metadata) {
result.metadata.processingTime = Date.now() - startTime result.metadata.processingTime = Date.now() - startTime
} }
@@ -106,6 +133,7 @@ export async function POST(request: NextRequest) {
fileType: result.metadata?.fileType || 'application/octet-stream', fileType: result.metadata?.fileType || 'application/octet-stream',
size: result.metadata?.size || 0, size: result.metadata?.size || 0,
binary: false, binary: false,
file: result.userFile,
}, },
filePath: result.filePath, filePath: result.filePath,
viewerUrl: result.viewerUrl, viewerUrl: result.viewerUrl,
@@ -121,7 +149,7 @@ export async function POST(request: NextRequest) {
}) })
} }
const result = await parseFileSingle(filePath, fileType, workspaceId, userId) const result = await parseFileSingle(filePath, fileType, workspaceId, userId, executionContext)
if (result.metadata) { if (result.metadata) {
result.metadata.processingTime = Date.now() - startTime result.metadata.processingTime = Date.now() - startTime
@@ -137,6 +165,7 @@ export async function POST(request: NextRequest) {
fileType: result.metadata?.fileType || 'application/octet-stream', fileType: result.metadata?.fileType || 'application/octet-stream',
size: result.metadata?.size || 0, size: result.metadata?.size || 0,
binary: false, binary: false,
file: result.userFile,
}, },
filePath: result.filePath, filePath: result.filePath,
viewerUrl: result.viewerUrl, viewerUrl: result.viewerUrl,
@@ -164,7 +193,8 @@ async function parseFileSingle(
filePath: string, filePath: string,
fileType: string, fileType: string,
workspaceId: string, workspaceId: string,
userId: string userId: string,
executionContext?: ExecutionContext
): Promise<ParseResult> { ): Promise<ParseResult> {
logger.info('Parsing file:', filePath) logger.info('Parsing file:', filePath)
@@ -186,18 +216,18 @@ async function parseFileSingle(
} }
if (filePath.includes('/api/files/serve/')) { if (filePath.includes('/api/files/serve/')) {
return handleCloudFile(filePath, fileType, undefined, userId) return handleCloudFile(filePath, fileType, undefined, userId, executionContext)
} }
if (filePath.startsWith('http://') || filePath.startsWith('https://')) { if (filePath.startsWith('http://') || filePath.startsWith('https://')) {
return handleExternalUrl(filePath, fileType, workspaceId, userId) return handleExternalUrl(filePath, fileType, workspaceId, userId, executionContext)
} }
if (isUsingCloudStorage()) { if (isUsingCloudStorage()) {
return handleCloudFile(filePath, fileType, undefined, userId) return handleCloudFile(filePath, fileType, undefined, userId, executionContext)
} }
return handleLocalFile(filePath, fileType, userId) return handleLocalFile(filePath, fileType, userId, executionContext)
} }
/** /**
@@ -230,12 +260,14 @@ function validateFilePath(filePath: string): { isValid: boolean; error?: string
/** /**
* Handle external URL * Handle external URL
* If workspaceId is provided, checks if file already exists and saves to workspace if not * If workspaceId is provided, checks if file already exists and saves to workspace if not
* If executionContext is provided, also stores the file in execution storage and returns UserFile
*/ */
async function handleExternalUrl( async function handleExternalUrl(
url: string, url: string,
fileType: string, fileType: string,
workspaceId: string, workspaceId: string,
userId: string userId: string,
executionContext?: ExecutionContext
): Promise<ParseResult> { ): Promise<ParseResult> {
try { try {
logger.info('Fetching external URL:', url) logger.info('Fetching external URL:', url)
@@ -312,17 +344,13 @@ async function handleExternalUrl(
if (existingFile) { if (existingFile) {
const storageFilePath = `/api/files/serve/${existingFile.key}` const storageFilePath = `/api/files/serve/${existingFile.key}`
return handleCloudFile(storageFilePath, fileType, 'workspace', userId) return handleCloudFile(storageFilePath, fileType, 'workspace', userId, executionContext)
} }
} }
} }
-    const pinnedUrl = createPinnedUrl(url, urlValidation.resolvedIP!)
-    const response = await fetch(pinnedUrl, {
-      signal: AbortSignal.timeout(DOWNLOAD_TIMEOUT_MS),
-      headers: {
-        Host: urlValidation.originalHostname!,
-      },
-    })
+    const response = await secureFetchWithPinnedIP(url, urlValidation.resolvedIP!, {
+      timeout: DOWNLOAD_TIMEOUT_MS,
+    })
if (!response.ok) { if (!response.ok) {
throw new Error(`Failed to fetch URL: ${response.status} ${response.statusText}`) throw new Error(`Failed to fetch URL: ${response.status} ${response.statusText}`)
@@ -341,6 +369,19 @@ async function handleExternalUrl(
logger.info(`Downloaded file from URL: ${url}, size: ${buffer.length} bytes`) logger.info(`Downloaded file from URL: ${url}, size: ${buffer.length} bytes`)
let userFile: UserFile | undefined
const mimeType = response.headers.get('content-type') || getMimeTypeFromExtension(extension)
if (executionContext) {
try {
userFile = await uploadExecutionFile(executionContext, buffer, filename, mimeType, userId)
logger.info(`Stored file in execution storage: ${filename}`, { key: userFile.key })
} catch (uploadError) {
logger.warn(`Failed to store file in execution storage:`, uploadError)
// Continue without userFile - parsing can still work
}
}
if (shouldCheckWorkspace) { if (shouldCheckWorkspace) {
try { try {
const permission = await getUserEntityPermissions(userId, 'workspace', workspaceId) const permission = await getUserEntityPermissions(userId, 'workspace', workspaceId)
@@ -353,8 +394,6 @@ async function handleExternalUrl(
}) })
} else { } else {
const { uploadWorkspaceFile } = await import('@/lib/uploads/contexts/workspace') const { uploadWorkspaceFile } = await import('@/lib/uploads/contexts/workspace')
const mimeType =
response.headers.get('content-type') || getMimeTypeFromExtension(extension)
await uploadWorkspaceFile(workspaceId, userId, buffer, filename, mimeType) await uploadWorkspaceFile(workspaceId, userId, buffer, filename, mimeType)
logger.info(`Saved URL file to workspace storage: ${filename}`) logger.info(`Saved URL file to workspace storage: ${filename}`)
} }
@@ -363,17 +402,23 @@ async function handleExternalUrl(
} }
} }
+    let parseResult: ParseResult
     if (extension === 'pdf') {
-      return await handlePdfBuffer(buffer, filename, fileType, url)
-    }
-    if (extension === 'csv') {
-      return await handleCsvBuffer(buffer, filename, fileType, url)
-    }
-    if (isSupportedFileType(extension)) {
-      return await handleGenericTextBuffer(buffer, filename, extension, fileType, url)
-    }
-    return handleGenericBuffer(buffer, filename, extension, fileType)
+      parseResult = await handlePdfBuffer(buffer, filename, fileType, url)
+    } else if (extension === 'csv') {
+      parseResult = await handleCsvBuffer(buffer, filename, fileType, url)
+    } else if (isSupportedFileType(extension)) {
+      parseResult = await handleGenericTextBuffer(buffer, filename, extension, fileType, url)
+    } else {
+      parseResult = handleGenericBuffer(buffer, filename, extension, fileType)
+    }
+
+    // Attach userFile to the result
+    if (userFile) {
+      parseResult.userFile = userFile
+    }
+
+    return parseResult
} catch (error) { } catch (error) {
logger.error(`Error handling external URL ${url}:`, error) logger.error(`Error handling external URL ${url}:`, error)
return { return {
@@ -386,12 +431,15 @@ async function handleExternalUrl(
/** /**
* Handle file stored in cloud storage * Handle file stored in cloud storage
* If executionContext is provided and file is not already from execution storage,
* copies the file to execution storage and returns UserFile
*/ */
async function handleCloudFile( async function handleCloudFile(
filePath: string, filePath: string,
fileType: string, fileType: string,
explicitContext: string | undefined, explicitContext: string | undefined,
userId: string userId: string,
executionContext?: ExecutionContext
): Promise<ParseResult> { ): Promise<ParseResult> {
try { try {
const cloudKey = extractStorageKey(filePath) const cloudKey = extractStorageKey(filePath)
@@ -438,6 +486,7 @@ async function handleCloudFile(
const filename = originalFilename || cloudKey.split('/').pop() || cloudKey const filename = originalFilename || cloudKey.split('/').pop() || cloudKey
const extension = path.extname(filename).toLowerCase().substring(1) const extension = path.extname(filename).toLowerCase().substring(1)
const mimeType = getMimeTypeFromExtension(extension)
const normalizedFilePath = `/api/files/serve/${encodeURIComponent(cloudKey)}?context=${context}` const normalizedFilePath = `/api/files/serve/${encodeURIComponent(cloudKey)}?context=${context}`
let workspaceIdFromKey: string | undefined let workspaceIdFromKey: string | undefined
@@ -453,6 +502,39 @@ async function handleCloudFile(
const viewerUrl = getViewerUrl(cloudKey, workspaceIdFromKey) const viewerUrl = getViewerUrl(cloudKey, workspaceIdFromKey)
// Store file in execution storage if executionContext is provided
let userFile: UserFile | undefined
if (executionContext) {
// If file is already from execution context, create UserFile reference without re-uploading
if (context === 'execution') {
userFile = {
id: `file_${Date.now()}_${Math.random().toString(36).substring(2, 9)}`,
name: filename,
url: normalizedFilePath,
size: fileBuffer.length,
type: mimeType,
key: cloudKey,
context: 'execution',
}
logger.info(`Created UserFile reference for existing execution file: ${filename}`)
} else {
// Copy from workspace/other storage to execution storage
try {
userFile = await uploadExecutionFile(
executionContext,
fileBuffer,
filename,
mimeType,
userId
)
logger.info(`Copied file to execution storage: ${filename}`, { key: userFile.key })
} catch (uploadError) {
logger.warn(`Failed to copy file to execution storage:`, uploadError)
}
}
}
let parseResult: ParseResult let parseResult: ParseResult
if (extension === 'pdf') { if (extension === 'pdf') {
parseResult = await handlePdfBuffer(fileBuffer, filename, fileType, normalizedFilePath) parseResult = await handlePdfBuffer(fileBuffer, filename, fileType, normalizedFilePath)
@@ -477,6 +559,11 @@ async function handleCloudFile(
parseResult.viewerUrl = viewerUrl parseResult.viewerUrl = viewerUrl
// Attach userFile to the result
if (userFile) {
parseResult.userFile = userFile
}
return parseResult return parseResult
} catch (error) { } catch (error) {
logger.error(`Error handling cloud file ${filePath}:`, error) logger.error(`Error handling cloud file ${filePath}:`, error)
@@ -500,7 +587,8 @@ async function handleCloudFile(
async function handleLocalFile( async function handleLocalFile(
filePath: string, filePath: string,
fileType: string, fileType: string,
userId: string userId: string,
executionContext?: ExecutionContext
): Promise<ParseResult> { ): Promise<ParseResult> {
try { try {
const filename = filePath.split('/').pop() || filePath const filename = filePath.split('/').pop() || filePath
@@ -540,13 +628,32 @@ async function handleLocalFile(
const hash = createHash('md5').update(fileBuffer).digest('hex') const hash = createHash('md5').update(fileBuffer).digest('hex')
const extension = path.extname(filename).toLowerCase().substring(1) const extension = path.extname(filename).toLowerCase().substring(1)
const mimeType = fileType || getMimeTypeFromExtension(extension)
// Store file in execution storage if executionContext is provided
let userFile: UserFile | undefined
if (executionContext) {
try {
userFile = await uploadExecutionFile(
executionContext,
fileBuffer,
filename,
mimeType,
userId
)
logger.info(`Stored local file in execution storage: ${filename}`, { key: userFile.key })
} catch (uploadError) {
logger.warn(`Failed to store local file in execution storage:`, uploadError)
}
}
return { return {
success: true, success: true,
content: result.content, content: result.content,
filePath, filePath,
userFile,
metadata: { metadata: {
fileType: fileType || getMimeTypeFromExtension(extension), fileType: mimeType,
size: stats.size, size: stats.size,
hash, hash,
processingTime: 0, processingTime: 0,
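Taken together, the parse route now threads an optional ExecutionContext (workspaceId + workflowId + executionId) through every handler and, when the context is present, mirrors the parsed file into execution storage and surfaces it as output.file. A hedged example of the new request and response shape; the route path and placeholder IDs are assumptions, the field names come from the hunks above.

```ts
// Illustrative call; '/api/files/parse' and the IDs are assumed for the example.
const res = await fetch('/api/files/parse', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    filePath: 'https://example.com/report.pdf',
    fileType: 'application/pdf',
    workspaceId: 'ws_123',
    workflowId: 'wf_456', // executionContext only forms when all three IDs are present
    executionId: 'exec_789',
  }),
})
const data = await res.json()
// data.output.file is the UserFile persisted to execution storage, shaped like:
// { id, name, url, size, type, key, context: 'execution' }
```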

View File

@@ -11,7 +11,7 @@ import { preprocessExecution } from '@/lib/execution/preprocessing'
import { LoggingSession } from '@/lib/logs/execution/logging-session' import { LoggingSession } from '@/lib/logs/execution/logging-session'
import { normalizeInputFormatValue } from '@/lib/workflows/input-format' import { normalizeInputFormatValue } from '@/lib/workflows/input-format'
import { createStreamingResponse } from '@/lib/workflows/streaming/streaming' import { createStreamingResponse } from '@/lib/workflows/streaming/streaming'
-import { isValidStartBlockType } from '@/lib/workflows/triggers/start-block-types'
+import { isInputDefinitionTrigger } from '@/lib/workflows/triggers/input-definition-triggers'
import { setFormAuthCookie, validateFormAuth } from '@/app/api/form/utils' import { setFormAuthCookie, validateFormAuth } from '@/app/api/form/utils'
import { createErrorResponse, createSuccessResponse } from '@/app/api/workflows/utils' import { createErrorResponse, createSuccessResponse } from '@/app/api/workflows/utils'
@@ -36,7 +36,7 @@ async function getWorkflowInputSchema(workflowId: string): Promise<any[]> {
.from(workflowBlocks) .from(workflowBlocks)
.where(eq(workflowBlocks.workflowId, workflowId)) .where(eq(workflowBlocks.workflowId, workflowId))
-  const startBlock = blocks.find((block) => isValidStartBlockType(block.type))
+  const startBlock = blocks.find((block) => isInputDefinitionTrigger(block.type))
if (!startBlock) { if (!startBlock) {
return [] return []

View File

@@ -276,8 +276,11 @@ describe('Function Execute API Route', () => {
it.concurrent('should resolve tag variables with <tag_name> syntax', async () => { it.concurrent('should resolve tag variables with <tag_name> syntax', async () => {
      const req = createMockRequest('POST', {
        code: 'return <email>',
-        params: {
-          email: { id: '123', subject: 'Test Email' },
+        blockData: {
+          'block-123': { id: '123', subject: 'Test Email' },
+        },
+        blockNameMapping: {
+          email: 'block-123',
        },
      })
@@ -305,9 +308,13 @@ describe('Function Execute API Route', () => {
it.concurrent('should only match valid variable names in angle brackets', async () => { it.concurrent('should only match valid variable names in angle brackets', async () => {
const req = createMockRequest('POST', { const req = createMockRequest('POST', {
code: 'return <validVar> + "<invalid@email.com>" + <another_valid>', code: 'return <validVar> + "<invalid@email.com>" + <another_valid>',
params: { blockData: {
validVar: 'hello', 'block-1': 'hello',
another_valid: 'world', 'block-2': 'world',
},
blockNameMapping: {
validVar: 'block-1',
another_valid: 'block-2',
}, },
}) })
@@ -321,28 +328,22 @@ describe('Function Execute API Route', () => {
it.concurrent( it.concurrent(
'should handle Gmail webhook data with email addresses containing angle brackets', 'should handle Gmail webhook data with email addresses containing angle brackets',
async () => { async () => {
const gmailData = { const emailData = {
email: { id: '123',
id: '123', from: 'Waleed Latif <waleed@sim.ai>',
from: 'Waleed Latif <waleed@sim.ai>', to: 'User <user@example.com>',
to: 'User <user@example.com>', subject: 'Test Email',
subject: 'Test Email', bodyText: 'Hello world',
bodyText: 'Hello world',
},
rawEmail: {
id: '123',
payload: {
headers: [
{ name: 'From', value: 'Waleed Latif <waleed@sim.ai>' },
{ name: 'To', value: 'User <user@example.com>' },
],
},
},
} }
const req = createMockRequest('POST', { const req = createMockRequest('POST', {
code: 'return <email>', code: 'return <email>',
params: gmailData, blockData: {
'block-email': emailData,
},
blockNameMapping: {
email: 'block-email',
},
}) })
const response = await POST(req) const response = await POST(req)
@@ -356,17 +357,20 @@ describe('Function Execute API Route', () => {
it.concurrent( it.concurrent(
'should properly serialize complex email objects with special characters', 'should properly serialize complex email objects with special characters',
async () => { async () => {
const complexEmailData = { const emailData = {
email: { from: 'Test User <test@example.com>',
from: 'Test User <test@example.com>', bodyHtml: '<div>HTML content with "quotes" and \'apostrophes\'</div>',
bodyHtml: '<div>HTML content with "quotes" and \'apostrophes\'</div>', bodyText: 'Text with\nnewlines\tand\ttabs',
bodyText: 'Text with\nnewlines\tand\ttabs',
},
} }
const req = createMockRequest('POST', { const req = createMockRequest('POST', {
code: 'return <email>', code: 'return <email>',
params: complexEmailData, blockData: {
'block-email': emailData,
},
blockNameMapping: {
email: 'block-email',
},
}) })
const response = await POST(req) const response = await POST(req)
@@ -519,18 +523,23 @@ describe('Function Execute API Route', () => {
}) })
it.concurrent('should handle JSON serialization edge cases', async () => { it.concurrent('should handle JSON serialization edge cases', async () => {
const complexData = {
special: 'chars"with\'quotes',
unicode: '🎉 Unicode content',
nested: {
deep: {
value: 'test',
},
},
}
const req = createMockRequest('POST', { const req = createMockRequest('POST', {
code: 'return <complexData>', code: 'return <complexData>',
params: { blockData: {
complexData: { 'block-complex': complexData,
special: 'chars"with\'quotes', },
unicode: '🎉 Unicode content', blockNameMapping: {
nested: { complexData: 'block-complex',
deep: {
value: 'test',
},
},
},
}, },
}) })
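These tests now describe the route's new contract: block values arrive in blockData keyed by block ID, and blockNameMapping translates a tag name like email into that ID. A minimal sketch of the lookup the route is expected to perform (the helper below is illustrative, not the route's actual code):

```ts
// Request shape used by the updated tests.
const body = {
  code: 'return <email>',
  blockData: { 'block-email': { id: '123', subject: 'Test Email' } },
  blockNameMapping: { email: 'block-email' }, // tag name -> block ID
}

// Assumed resolution step: tag -> block ID -> value.
function resolveTag(tag: string): unknown {
  const blockId = body.blockNameMapping[tag as keyof typeof body.blockNameMapping]
  return blockId ? body.blockData[blockId] : undefined
}

resolveTag('email') // { id: '123', subject: 'Test Email' }
```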

View File

@@ -9,8 +9,8 @@ import { escapeRegExp, normalizeName, REFERENCE } from '@/executor/constants'
import { import {
createEnvVarPattern, createEnvVarPattern,
createWorkflowVariablePattern, createWorkflowVariablePattern,
resolveEnvVarReferences,
} from '@/executor/utils/reference-validation' } from '@/executor/utils/reference-validation'
import { navigatePath } from '@/executor/variables/resolvers/reference'
export const dynamic = 'force-dynamic' export const dynamic = 'force-dynamic'
export const runtime = 'nodejs' export const runtime = 'nodejs'
@@ -18,8 +18,8 @@ export const MAX_DURATION = 210
const logger = createLogger('FunctionExecuteAPI') const logger = createLogger('FunctionExecuteAPI')
const E2B_JS_WRAPPER_LINES = 3 // Lines before user code: ';(async () => {', ' try {', ' const __sim_result = await (async () => {' const E2B_JS_WRAPPER_LINES = 3
const E2B_PYTHON_WRAPPER_LINES = 1 // Lines before user code: 'def __sim_main__():' const E2B_PYTHON_WRAPPER_LINES = 1
type TypeScriptModule = typeof import('typescript') type TypeScriptModule = typeof import('typescript')
@@ -134,33 +134,21 @@ function extractEnhancedError(
if (error.stack) { if (error.stack) {
enhanced.stack = error.stack enhanced.stack = error.stack
// Parse stack trace to extract line and column information
// Handle both compilation errors and runtime errors
const stackLines: string[] = error.stack.split('\n') const stackLines: string[] = error.stack.split('\n')
for (const line of stackLines) { for (const line of stackLines) {
// Pattern 1: Compilation errors - "user-function.js:6"
let match = line.match(/user-function\.js:(\d+)(?::(\d+))?/) let match = line.match(/user-function\.js:(\d+)(?::(\d+))?/)
// Pattern 2: Runtime errors - "at user-function.js:5:12"
if (!match) { if (!match) {
match = line.match(/at\s+user-function\.js:(\d+):(\d+)/) match = line.match(/at\s+user-function\.js:(\d+):(\d+)/)
} }
// Pattern 3: Generic patterns for any line containing our filename
if (!match) {
match = line.match(/user-function\.js:(\d+)(?::(\d+))?/)
}
if (match) { if (match) {
const stackLine = Number.parseInt(match[1], 10) const stackLine = Number.parseInt(match[1], 10)
const stackColumn = match[2] ? Number.parseInt(match[2], 10) : undefined const stackColumn = match[2] ? Number.parseInt(match[2], 10) : undefined
// Adjust line number to account for wrapper code
// The user code starts at a specific line in our wrapper
const adjustedLine = stackLine - userCodeStartLine + 1 const adjustedLine = stackLine - userCodeStartLine + 1
// Check if this is a syntax error in wrapper code caused by incomplete user code
const isWrapperSyntaxError = const isWrapperSyntaxError =
stackLine > userCodeStartLine && stackLine > userCodeStartLine &&
error.name === 'SyntaxError' && error.name === 'SyntaxError' &&
@@ -168,7 +156,6 @@ function extractEnhancedError(
error.message.includes('Unexpected end of input')) error.message.includes('Unexpected end of input'))
if (isWrapperSyntaxError && userCode) { if (isWrapperSyntaxError && userCode) {
// Map wrapper syntax errors to the last line of user code
const codeLines = userCode.split('\n') const codeLines = userCode.split('\n')
const lastUserLine = codeLines.length const lastUserLine = codeLines.length
enhanced.line = lastUserLine enhanced.line = lastUserLine
@@ -181,7 +168,6 @@ function extractEnhancedError(
enhanced.line = adjustedLine enhanced.line = adjustedLine
enhanced.column = stackColumn enhanced.column = stackColumn
// Extract the actual line content from user code
if (userCode) { if (userCode) {
const codeLines = userCode.split('\n') const codeLines = userCode.split('\n')
if (adjustedLine <= codeLines.length) { if (adjustedLine <= codeLines.length) {
@@ -192,7 +178,6 @@ function extractEnhancedError(
} }
if (stackLine <= userCodeStartLine) { if (stackLine <= userCodeStartLine) {
// Error is in wrapper code itself
enhanced.line = stackLine enhanced.line = stackLine
enhanced.column = stackColumn enhanced.column = stackColumn
break break
@@ -200,7 +185,6 @@ function extractEnhancedError(
} }
} }
// Clean up stack trace to show user-relevant information
const cleanedStackLines: string[] = stackLines const cleanedStackLines: string[] = stackLines
.filter( .filter(
(line: string) => (line: string) =>
@@ -214,9 +198,6 @@ function extractEnhancedError(
} }
} }
// Keep original message without adding error type prefix
// The error type will be added later in createUserFriendlyErrorMessage
return enhanced return enhanced
} }
@@ -231,7 +212,6 @@ function formatE2BError(
userCode: string, userCode: string,
prologueLineCount: number prologueLineCount: number
): { formattedError: string; cleanedOutput: string } { ): { formattedError: string; cleanedOutput: string } {
// Calculate line offset based on language and prologue
const wrapperLines = const wrapperLines =
language === CodeLanguage.Python ? E2B_PYTHON_WRAPPER_LINES : E2B_JS_WRAPPER_LINES language === CodeLanguage.Python ? E2B_PYTHON_WRAPPER_LINES : E2B_JS_WRAPPER_LINES
const totalOffset = prologueLineCount + wrapperLines const totalOffset = prologueLineCount + wrapperLines
@@ -241,27 +221,20 @@ function formatE2BError(
let cleanErrorMsg = '' let cleanErrorMsg = ''
if (language === CodeLanguage.Python) { if (language === CodeLanguage.Python) {
// Python error format: "Cell In[X], line Y" followed by error details
// Extract line number from the Cell reference
const cellMatch = errorOutput.match(/Cell In\[\d+\], line (\d+)/) const cellMatch = errorOutput.match(/Cell In\[\d+\], line (\d+)/)
if (cellMatch) { if (cellMatch) {
const originalLine = Number.parseInt(cellMatch[1], 10) const originalLine = Number.parseInt(cellMatch[1], 10)
userLine = originalLine - totalOffset userLine = originalLine - totalOffset
} }
// Extract clean error message from the error string
// Remove file references like "(detected at line X) (file.py, line Y)"
cleanErrorMsg = errorMessage cleanErrorMsg = errorMessage
.replace(/\s*\(detected at line \d+\)/g, '') .replace(/\s*\(detected at line \d+\)/g, '')
.replace(/\s*\([^)]+\.py, line \d+\)/g, '') .replace(/\s*\([^)]+\.py, line \d+\)/g, '')
.trim() .trim()
} else if (language === CodeLanguage.JavaScript) { } else if (language === CodeLanguage.JavaScript) {
// JavaScript error format from E2B: "SyntaxError: /path/file.ts: Message. (line:col)\n\n 9 | ..."
// First, extract the error type and message from the first line
const firstLineEnd = errorMessage.indexOf('\n') const firstLineEnd = errorMessage.indexOf('\n')
const firstLine = firstLineEnd > 0 ? errorMessage.substring(0, firstLineEnd) : errorMessage const firstLine = firstLineEnd > 0 ? errorMessage.substring(0, firstLineEnd) : errorMessage
// Parse: "SyntaxError: /home/user/index.ts: Missing semicolon. (11:9)"
const jsErrorMatch = firstLine.match(/^(\w+Error):\s*[^:]+:\s*([^(]+)\.\s*\((\d+):(\d+)\)/) const jsErrorMatch = firstLine.match(/^(\w+Error):\s*[^:]+:\s*([^(]+)\.\s*\((\d+):(\d+)\)/)
if (jsErrorMatch) { if (jsErrorMatch) {
cleanErrorType = jsErrorMatch[1] cleanErrorType = jsErrorMatch[1]
@@ -269,13 +242,11 @@ function formatE2BError(
const originalLine = Number.parseInt(jsErrorMatch[3], 10) const originalLine = Number.parseInt(jsErrorMatch[3], 10)
userLine = originalLine - totalOffset userLine = originalLine - totalOffset
} else { } else {
// Fallback: look for line number in the arrow pointer line (> 11 |)
const arrowMatch = errorMessage.match(/^>\s*(\d+)\s*\|/m) const arrowMatch = errorMessage.match(/^>\s*(\d+)\s*\|/m)
if (arrowMatch) { if (arrowMatch) {
const originalLine = Number.parseInt(arrowMatch[1], 10) const originalLine = Number.parseInt(arrowMatch[1], 10)
userLine = originalLine - totalOffset userLine = originalLine - totalOffset
} }
// Try to extract error type and message
const errorMatch = firstLine.match(/^(\w+Error):\s*(.+)/) const errorMatch = firstLine.match(/^(\w+Error):\s*(.+)/)
if (errorMatch) { if (errorMatch) {
cleanErrorType = errorMatch[1] cleanErrorType = errorMatch[1]
@@ -289,13 +260,11 @@ function formatE2BError(
} }
} }
// Build the final clean error message
const finalErrorMsg = const finalErrorMsg =
cleanErrorType && cleanErrorMsg cleanErrorType && cleanErrorMsg
? `${cleanErrorType}: ${cleanErrorMsg}` ? `${cleanErrorType}: ${cleanErrorMsg}`
: cleanErrorMsg || errorMessage : cleanErrorMsg || errorMessage
// Format with line number if available
let formattedError = finalErrorMsg let formattedError = finalErrorMsg
if (userLine && userLine > 0) { if (userLine && userLine > 0) {
const codeLines = userCode.split('\n') const codeLines = userCode.split('\n')
@@ -311,7 +280,6 @@ function formatE2BError(
} }
} }
// For stdout, just return the clean error message without the full traceback
const cleanedOutput = finalErrorMsg const cleanedOutput = finalErrorMsg
return { formattedError, cleanedOutput } return { formattedError, cleanedOutput }
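The comments stripped above described how sandbox line numbers map back to user code; the arithmetic itself is unchanged: subtract the injected prologue plus the language wrapper (3 lines for JavaScript, 1 for Python). A small worked example with the constants from this route (the prologue length is illustrative):

```ts
// Only the wrapper constants come from the route; the rest is illustrative.
const E2B_JS_WRAPPER_LINES = 3
const prologueLineCount = 5 // hypothetical injected variable prologue
const totalOffset = prologueLineCount + E2B_JS_WRAPPER_LINES // 8

// E2B reports: "SyntaxError: /home/user/index.ts: Missing semicolon. (11:9)"
const reportedLine = 11
const userLine = reportedLine - totalOffset // 3 -> third line of the user's code
```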
@@ -327,7 +295,6 @@ function createUserFriendlyErrorMessage(
): string { ): string {
let errorMessage = enhanced.message let errorMessage = enhanced.message
// Add line information if available
if (enhanced.line !== undefined) { if (enhanced.line !== undefined) {
let lineInfo = `Line ${enhanced.line}` let lineInfo = `Line ${enhanced.line}`
@@ -338,18 +305,14 @@ function createUserFriendlyErrorMessage(
errorMessage = `${lineInfo} - ${errorMessage}` errorMessage = `${lineInfo} - ${errorMessage}`
} else { } else {
// If no line number, try to extract it from stack trace for display
if (enhanced.stack) { if (enhanced.stack) {
const stackMatch = enhanced.stack.match(/user-function\.js:(\d+)(?::(\d+))?/) const stackMatch = enhanced.stack.match(/user-function\.js:(\d+)(?::(\d+))?/)
if (stackMatch) { if (stackMatch) {
const line = Number.parseInt(stackMatch[1], 10) const line = Number.parseInt(stackMatch[1], 10)
let lineInfo = `Line ${line}` let lineInfo = `Line ${line}`
// Try to get line content if we have userCode
if (userCode) { if (userCode) {
const codeLines = userCode.split('\n') const codeLines = userCode.split('\n')
// Note: stackMatch gives us VM line number, need to adjust
// This is a fallback case, so we might not have perfect line mapping
if (line <= codeLines.length) { if (line <= codeLines.length) {
const lineContent = codeLines[line - 1]?.trim() const lineContent = codeLines[line - 1]?.trim()
if (lineContent) { if (lineContent) {
@@ -363,7 +326,6 @@ function createUserFriendlyErrorMessage(
} }
} }
// Add error type prefix with consistent naming
if (enhanced.name !== 'Error') { if (enhanced.name !== 'Error') {
const errorTypePrefix = const errorTypePrefix =
enhanced.name === 'SyntaxError' enhanced.name === 'SyntaxError'
@@ -374,7 +336,6 @@ function createUserFriendlyErrorMessage(
? 'Reference Error' ? 'Reference Error'
: enhanced.name : enhanced.name
// Only add prefix if not already present
if (!errorMessage.toLowerCase().includes(errorTypePrefix.toLowerCase())) { if (!errorMessage.toLowerCase().includes(errorTypePrefix.toLowerCase())) {
errorMessage = `${errorTypePrefix}: ${errorMessage}` errorMessage = `${errorTypePrefix}: ${errorMessage}`
} }
@@ -383,9 +344,6 @@ function createUserFriendlyErrorMessage(
return errorMessage return errorMessage
} }
/**
* Resolves workflow variables with <variable.name> syntax
*/
function resolveWorkflowVariables( function resolveWorkflowVariables(
code: string, code: string,
workflowVariables: Record<string, any>, workflowVariables: Record<string, any>,
@@ -405,39 +363,35 @@ function resolveWorkflowVariables(
   while ((match = regex.exec(code)) !== null) {
     const variableName = match[1].trim()
-    // Find the variable by name (workflowVariables is indexed by ID, values are variable objects)
     const foundVariable = Object.entries(workflowVariables).find(
       ([_, variable]) => normalizeName(variable.name || '') === variableName
     )
-    let variableValue: unknown = ''
-    if (foundVariable) {
-      const variable = foundVariable[1]
-      variableValue = variable.value
-      if (variable.value !== undefined && variable.value !== null) {
-        try {
-          // Handle 'string' type the same as 'plain' for backward compatibility
-          const type = variable.type === 'string' ? 'plain' : variable.type
-          // For plain text, use exactly what's entered without modifications
-          if (type === 'plain' && typeof variableValue === 'string') {
-            // Use as-is for plain text
-          } else if (type === 'number') {
-            variableValue = Number(variableValue)
-          } else if (type === 'boolean') {
-            variableValue = variableValue === 'true' || variableValue === true
-          } else if (type === 'json') {
-            try {
-              variableValue =
-                typeof variableValue === 'string' ? JSON.parse(variableValue) : variableValue
-            } catch {
-              // Keep original value if JSON parsing fails
-            }
-          }
-        } catch {
-          // Fallback to original value on error
-          variableValue = variable.value
-        }
-      }
-    }
+    if (!foundVariable) {
+      const availableVars = Object.values(workflowVariables)
+        .map((v) => v.name)
+        .filter(Boolean)
+      throw new Error(
+        `Variable "${variableName}" doesn't exist.` +
+          (availableVars.length > 0 ? ` Available: ${availableVars.join(', ')}` : '')
+      )
+    }
+    const variable = foundVariable[1]
+    let variableValue: unknown = variable.value
+    if (variable.value !== undefined && variable.value !== null) {
+      const type = variable.type === 'string' ? 'plain' : variable.type
+      if (type === 'number') {
+        variableValue = Number(variableValue)
+      } else if (type === 'boolean') {
+        variableValue = variableValue === 'true' || variableValue === true
+      } else if (type === 'json' && typeof variableValue === 'string') {
+        try {
+          variableValue = JSON.parse(variableValue)
+        } catch {
+          // Keep as-is
+        }
+      }
+    }
@@ -450,11 +404,9 @@ function resolveWorkflowVariables(
}) })
} }
// Process replacements in reverse order to maintain correct indices
for (let i = replacements.length - 1; i >= 0; i--) { for (let i = replacements.length - 1; i >= 0; i--) {
const { match: matchStr, index, variableName, variableValue } = replacements[i] const { match: matchStr, index, variableName, variableValue } = replacements[i]
// Use variable reference approach
const safeVarName = `__variable_${variableName.replace(/[^a-zA-Z0-9_]/g, '_')}` const safeVarName = `__variable_${variableName.replace(/[^a-zA-Z0-9_]/g, '_')}`
contextVariables[safeVarName] = variableValue contextVariables[safeVarName] = variableValue
resolvedCode = resolvedCode =
@@ -464,9 +416,6 @@ function resolveWorkflowVariables(
return resolvedCode return resolvedCode
} }
/**
* Resolves environment variables with {{var_name}} syntax
*/
function resolveEnvironmentVariables( function resolveEnvironmentVariables(
code: string, code: string,
params: Record<string, any>, params: Record<string, any>,
@@ -482,32 +431,28 @@ function resolveEnvironmentVariables(
const resolverVars: Record<string, string> = {} const resolverVars: Record<string, string> = {}
Object.entries(params).forEach(([key, value]) => { Object.entries(params).forEach(([key, value]) => {
if (value) { if (value !== undefined && value !== null) {
resolverVars[key] = String(value) resolverVars[key] = String(value)
} }
}) })
Object.entries(envVars).forEach(([key, value]) => { Object.entries(envVars).forEach(([key, value]) => {
if (value) { if (value !== undefined && value !== null) {
resolverVars[key] = value resolverVars[key] = value
} }
}) })
while ((match = regex.exec(code)) !== null) { while ((match = regex.exec(code)) !== null) {
const varName = match[1].trim() const varName = match[1].trim()
const resolved = resolveEnvVarReferences(match[0], resolverVars, {
allowEmbedded: true, if (!(varName in resolverVars)) {
resolveExactMatch: true, continue
trimKeys: true, }
onMissing: 'empty',
deep: false,
})
const varValue =
typeof resolved === 'string' ? resolved : resolved == null ? '' : String(resolved)
replacements.push({ replacements.push({
match: match[0], match: match[0],
index: match.index, index: match.index,
varName, varName,
varValue: String(varValue), varValue: resolverVars[varName],
}) })
} }
@@ -523,12 +468,8 @@ function resolveEnvironmentVariables(
return resolvedCode return resolvedCode
} }
/**
* Resolves tags with <tag_name> syntax (including nested paths like <block.response.data>)
*/
function resolveTagVariables( function resolveTagVariables(
code: string, code: string,
params: Record<string, any>,
blockData: Record<string, any>, blockData: Record<string, any>,
blockNameMapping: Record<string, string>, blockNameMapping: Record<string, string>,
contextVariables: Record<string, any> contextVariables: Record<string, any>
@@ -543,27 +484,30 @@ function resolveTagVariables(
for (const match of tagMatches) { for (const match of tagMatches) {
const tagName = match.slice(REFERENCE.START.length, -REFERENCE.END.length).trim() const tagName = match.slice(REFERENCE.START.length, -REFERENCE.END.length).trim()
const pathParts = tagName.split(REFERENCE.PATH_DELIMITER)
const blockName = pathParts[0]
// Handle nested paths like "getrecord.response.data" or "function1.response.result" const blockId = blockNameMapping[blockName]
// First try params, then blockData directly, then try with block name mapping if (!blockId) {
let tagValue = getNestedValue(params, tagName) || getNestedValue(blockData, tagName) || '' continue
}
// If not found and the path starts with a block name, try mapping the block name to ID
if (!tagValue && tagName.includes(REFERENCE.PATH_DELIMITER)) { const blockOutput = blockData[blockId]
const pathParts = tagName.split(REFERENCE.PATH_DELIMITER) if (blockOutput === undefined) {
const normalizedBlockName = pathParts[0] // This should already be normalized like "function1" continue
}
// Direct lookup using normalized block name
const blockId = blockNameMapping[normalizedBlockName] ?? null let tagValue: any
if (pathParts.length === 1) {
if (blockId) { tagValue = blockOutput
const remainingPath = pathParts.slice(1).join('.') } else {
const fullPath = `${blockId}.${remainingPath}` tagValue = navigatePath(blockOutput, pathParts.slice(1))
tagValue = getNestedValue(blockData, fullPath) || '' }
}
if (tagValue === undefined) {
continue
} }
// If the value is a stringified JSON, parse it back to object
if ( if (
typeof tagValue === 'string' && typeof tagValue === 'string' &&
tagValue.length > 100 && tagValue.length > 100 &&
@@ -571,16 +515,13 @@ function resolveTagVariables(
) { ) {
try { try {
tagValue = JSON.parse(tagValue) tagValue = JSON.parse(tagValue)
} catch (e) { } catch {
// Keep as string if parsing fails // Keep as-is
} }
} }
// Instead of injecting large JSON directly, create a variable reference
const safeVarName = `__tag_${tagName.replace(/[^a-zA-Z0-9_]/g, '_')}` const safeVarName = `__tag_${tagName.replace(/[^a-zA-Z0-9_]/g, '_')}`
contextVariables[safeVarName] = tagValue contextVariables[safeVarName] = tagValue
// Replace the template with a variable reference
resolvedCode = resolvedCode.replace(new RegExp(escapeRegExp(match), 'g'), safeVarName) resolvedCode = resolvedCode.replace(new RegExp(escapeRegExp(match), 'g'), safeVarName)
} }
@@ -605,35 +546,13 @@ function resolveCodeVariables(
let resolvedCode = code let resolvedCode = code
const contextVariables: Record<string, any> = {} const contextVariables: Record<string, any> = {}
// Resolve workflow variables with <variable.name> syntax first
resolvedCode = resolveWorkflowVariables(resolvedCode, workflowVariables, contextVariables) resolvedCode = resolveWorkflowVariables(resolvedCode, workflowVariables, contextVariables)
// Resolve environment variables with {{var_name}} syntax
resolvedCode = resolveEnvironmentVariables(resolvedCode, params, envVars, contextVariables) resolvedCode = resolveEnvironmentVariables(resolvedCode, params, envVars, contextVariables)
resolvedCode = resolveTagVariables(resolvedCode, blockData, blockNameMapping, contextVariables)
// Resolve tags with <tag_name> syntax (including nested paths like <block.response.data>)
resolvedCode = resolveTagVariables(
resolvedCode,
params,
blockData,
blockNameMapping,
contextVariables
)
return { resolvedCode, contextVariables } return { resolvedCode, contextVariables }
} }
/**
* Get nested value from object using dot notation path
*/
function getNestedValue(obj: any, path: string): any {
if (!obj || !path) return undefined
return path.split('.').reduce((current, key) => {
return current && typeof current === 'object' ? current[key] : undefined
}, obj)
}
/** /**
* Remove one trailing newline from stdout * Remove one trailing newline from stdout
* This handles the common case where print() or console.log() adds a trailing \n * This handles the common case where print() or console.log() adds a trailing \n
@@ -671,7 +590,6 @@ export async function POST(req: NextRequest) {
isCustomTool = false, isCustomTool = false,
} = body } = body
// Extract internal parameters that shouldn't be passed to the execution context
const executionParams = { ...params } const executionParams = { ...params }
executionParams._context = undefined executionParams._context = undefined
@@ -697,7 +615,6 @@ export async function POST(req: NextRequest) {
const lang = isValidCodeLanguage(language) ? language : DEFAULT_CODE_LANGUAGE const lang = isValidCodeLanguage(language) ? language : DEFAULT_CODE_LANGUAGE
// Extract imports once for JavaScript code (reuse later to avoid double extraction)
let jsImports = '' let jsImports = ''
let jsRemainingCode = resolvedCode let jsRemainingCode = resolvedCode
let hasImports = false let hasImports = false
@@ -707,31 +624,22 @@ export async function POST(req: NextRequest) {
jsImports = extractionResult.imports jsImports = extractionResult.imports
jsRemainingCode = extractionResult.remainingCode jsRemainingCode = extractionResult.remainingCode
// Check for ES6 imports or CommonJS require statements
// ES6 imports are extracted by the TypeScript parser
// Also check for require() calls which indicate external dependencies
const hasRequireStatements = /require\s*\(\s*['"`]/.test(resolvedCode) const hasRequireStatements = /require\s*\(\s*['"`]/.test(resolvedCode)
hasImports = jsImports.trim().length > 0 || hasRequireStatements hasImports = jsImports.trim().length > 0 || hasRequireStatements
} }
// Python always requires E2B
if (lang === CodeLanguage.Python && !isE2bEnabled) { if (lang === CodeLanguage.Python && !isE2bEnabled) {
throw new Error( throw new Error(
'Python execution requires E2B to be enabled. Please contact your administrator to enable E2B, or use JavaScript instead.' 'Python execution requires E2B to be enabled. Please contact your administrator to enable E2B, or use JavaScript instead.'
) )
} }
// JavaScript with imports requires E2B
if (lang === CodeLanguage.JavaScript && hasImports && !isE2bEnabled) { if (lang === CodeLanguage.JavaScript && hasImports && !isE2bEnabled) {
throw new Error( throw new Error(
'JavaScript code with import statements requires E2B to be enabled. Please remove the import statements, or contact your administrator to enable E2B.' 'JavaScript code with import statements requires E2B to be enabled. Please remove the import statements, or contact your administrator to enable E2B.'
) )
} }
// Use E2B if:
// - E2B is enabled AND
// - Not a custom tool AND
// - (Python OR JavaScript with imports)
const useE2B = const useE2B =
isE2bEnabled && isE2bEnabled &&
!isCustomTool && !isCustomTool &&
@@ -744,13 +652,10 @@ export async function POST(req: NextRequest) {
language: lang, language: lang,
}) })
let prologue = '' let prologue = ''
const epilogue = ''
if (lang === CodeLanguage.JavaScript) { if (lang === CodeLanguage.JavaScript) {
// Track prologue lines for error adjustment
let prologueLineCount = 0 let prologueLineCount = 0
// Reuse the imports we already extracted earlier
const imports = jsImports const imports = jsImports
const remainingCode = jsRemainingCode const remainingCode = jsRemainingCode
@@ -782,7 +687,7 @@ export async function POST(req: NextRequest) {
' }', ' }',
'})();', '})();',
].join('\n') ].join('\n')
const codeForE2B = importSection + prologue + wrapped + epilogue const codeForE2B = importSection + prologue + wrapped
const execStart = Date.now() const execStart = Date.now()
const { const {
@@ -804,7 +709,6 @@ export async function POST(req: NextRequest) {
error: e2bError, error: e2bError,
}) })
// If there was an execution error, format it properly
if (e2bError) { if (e2bError) {
const { formattedError, cleanedOutput } = formatE2BError( const { formattedError, cleanedOutput } = formatE2BError(
e2bError, e2bError,
@@ -828,7 +732,7 @@ export async function POST(req: NextRequest) {
output: { result: e2bResult ?? null, stdout: cleanStdout(stdout), executionTime }, output: { result: e2bResult ?? null, stdout: cleanStdout(stdout), executionTime },
}) })
} }
// Track prologue lines for error adjustment
let prologueLineCount = 0 let prologueLineCount = 0
prologue += 'import json\n' prologue += 'import json\n'
prologueLineCount++ prologueLineCount++
@@ -846,7 +750,7 @@ export async function POST(req: NextRequest) {
'__sim_result__ = __sim_main__()', '__sim_result__ = __sim_main__()',
"print('__SIM_RESULT__=' + json.dumps(__sim_result__))", "print('__SIM_RESULT__=' + json.dumps(__sim_result__))",
].join('\n') ].join('\n')
const codeForE2B = prologue + wrapped + epilogue const codeForE2B = prologue + wrapped
const execStart = Date.now() const execStart = Date.now()
const { const {
@@ -868,7 +772,6 @@ export async function POST(req: NextRequest) {
error: e2bError, error: e2bError,
}) })
// If there was an execution error, format it properly
if (e2bError) { if (e2bError) {
const { formattedError, cleanedOutput } = formatE2BError( const { formattedError, cleanedOutput } = formatE2BError(
e2bError, e2bError,
@@ -897,7 +800,6 @@ export async function POST(req: NextRequest) {
const wrapperLines = ['(async () => {', ' try {'] const wrapperLines = ['(async () => {', ' try {']
if (isCustomTool) { if (isCustomTool) {
wrapperLines.push(' // For custom tools, make parameters directly accessible')
Object.keys(executionParams).forEach((key) => { Object.keys(executionParams).forEach((key) => {
wrapperLines.push(` const ${key} = params.${key};`) wrapperLines.push(` const ${key} = params.${key};`)
}) })
@@ -931,12 +833,10 @@ export async function POST(req: NextRequest) {
}) })
const ivmError = isolatedResult.error const ivmError = isolatedResult.error
// Adjust line number for prepended param destructuring in custom tools
let adjustedLine = ivmError.line let adjustedLine = ivmError.line
let adjustedLineContent = ivmError.lineContent let adjustedLineContent = ivmError.lineContent
if (prependedLineCount > 0 && ivmError.line !== undefined) { if (prependedLineCount > 0 && ivmError.line !== undefined) {
adjustedLine = Math.max(1, ivmError.line - prependedLineCount) adjustedLine = Math.max(1, ivmError.line - prependedLineCount)
// Get line content from original user code, not the prepended code
const codeLines = resolvedCode.split('\n') const codeLines = resolvedCode.split('\n')
if (adjustedLine <= codeLines.length) { if (adjustedLine <= codeLines.length) {
adjustedLineContent = codeLines[adjustedLine - 1]?.trim() adjustedLineContent = codeLines[adjustedLine - 1]?.trim()
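To make the resolver changes above concrete, here is a minimal sketch (block and variable names are hypothetical, not taken from this diff) of how Function-block code is rewritten before execution: workflow variables (<variable.name>) and block references (<block.path>) become safe identifiers supplied through contextVariables, while {{var_name}} environment references are filled in from params and envVars.

// Hypothetical user code before resolution:
//   const total = <variable.ordertotal> + <getrecord.data.amount>
//
// After resolveCodeVariables, the templates are identifier references and the
// values travel alongside the code (a sketch, not the actual runtime wiring):
const contextVariables: Record<string, any> = {
  __variable_ordertotal: 42, // from workflowVariables, coerced by its declared type
  __tag_getrecord_data_amount: 8, // looked up via blockNameMapping + blockData
}
const resolvedCode = 'const total = __variable_ordertotal + __tag_getrecord_data_amount'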


@@ -1,395 +0,0 @@
import { createLogger } from '@sim/logger'
import type { NextRequest } from 'next/server'
import { NextResponse } from 'next/server'
import { z } from 'zod'
import { checkHybridAuth } from '@/lib/auth/hybrid'
import { generateInternalToken } from '@/lib/auth/internal'
import { isDev } from '@/lib/core/config/feature-flags'
import { createPinnedUrl, validateUrlWithDNS } from '@/lib/core/security/input-validation'
import { generateRequestId } from '@/lib/core/utils/request'
import { getBaseUrl } from '@/lib/core/utils/urls'
import { executeTool } from '@/tools'
import { getTool, validateRequiredParametersAfterMerge } from '@/tools/utils'
const logger = createLogger('ProxyAPI')
const proxyPostSchema = z.object({
toolId: z.string().min(1, 'toolId is required'),
params: z.record(z.any()).optional().default({}),
executionContext: z
.object({
workflowId: z.string().optional(),
workspaceId: z.string().optional(),
executionId: z.string().optional(),
userId: z.string().optional(),
})
.optional(),
})
/**
* Creates a minimal set of default headers for proxy requests
* @returns Record of HTTP headers
*/
const getProxyHeaders = (): Record<string, string> => {
return {
'User-Agent':
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/135.0.0.0 Safari/537.36',
Accept: '*/*',
'Accept-Encoding': 'gzip, deflate, br',
'Cache-Control': 'no-cache',
Connection: 'keep-alive',
}
}
/**
* Formats a response with CORS headers
* @param responseData Response data object
* @param status HTTP status code
* @returns NextResponse with CORS headers
*/
const formatResponse = (responseData: any, status = 200) => {
return NextResponse.json(responseData, {
status,
headers: {
'Access-Control-Allow-Origin': '*',
'Access-Control-Allow-Methods': 'GET, POST, PUT, DELETE, OPTIONS',
'Access-Control-Allow-Headers': 'Content-Type, Authorization',
},
})
}
/**
* Creates an error response with consistent formatting
* @param error Error object or message
* @param status HTTP status code
* @param additionalData Additional data to include in the response
* @returns Formatted error response
*/
const createErrorResponse = (error: any, status = 500, additionalData = {}) => {
const errorMessage = error instanceof Error ? error.message : String(error)
const errorStack = error instanceof Error ? error.stack : undefined
logger.error('Creating error response', {
errorMessage,
status,
stack: isDev ? errorStack : undefined,
})
return formatResponse(
{
success: false,
error: errorMessage,
stack: isDev ? errorStack : undefined,
...additionalData,
},
status
)
}
/**
* GET handler for direct external URL proxying
* This allows for GET requests to external APIs
*/
export async function GET(request: Request) {
const url = new URL(request.url)
const targetUrl = url.searchParams.get('url')
const requestId = generateRequestId()
// Vault download proxy: /api/proxy?vaultDownload=1&bucket=...&object=...&credentialId=...
const vaultDownload = url.searchParams.get('vaultDownload')
if (vaultDownload === '1') {
try {
const bucket = url.searchParams.get('bucket')
const objectParam = url.searchParams.get('object')
const credentialId = url.searchParams.get('credentialId')
if (!bucket || !objectParam || !credentialId) {
return createErrorResponse('Missing bucket, object, or credentialId', 400)
}
// Fetch access token using existing token API
const baseUrl = new URL(getBaseUrl())
const tokenUrl = new URL('/api/auth/oauth/token', baseUrl)
// Build headers: forward session cookies if present; include internal auth for server-side
const tokenHeaders: Record<string, string> = { 'Content-Type': 'application/json' }
const incomingCookie = request.headers.get('cookie')
if (incomingCookie) tokenHeaders.Cookie = incomingCookie
try {
const internalToken = await generateInternalToken()
tokenHeaders.Authorization = `Bearer ${internalToken}`
} catch (_e) {
// best-effort internal auth
}
// Optional workflow context for collaboration auth
const workflowId = url.searchParams.get('workflowId') || undefined
const tokenRes = await fetch(tokenUrl.toString(), {
method: 'POST',
headers: tokenHeaders,
body: JSON.stringify({ credentialId, workflowId }),
})
if (!tokenRes.ok) {
const err = await tokenRes.text()
return createErrorResponse(`Failed to fetch access token: ${err}`, 401)
}
const tokenJson = await tokenRes.json()
const accessToken = tokenJson.accessToken
if (!accessToken) {
return createErrorResponse('No access token available', 401)
}
// Avoid double-encoding: incoming object may already be percent-encoded
const objectDecoded = decodeURIComponent(objectParam)
const gcsUrl = `https://storage.googleapis.com/storage/v1/b/${encodeURIComponent(
bucket
)}/o/${encodeURIComponent(objectDecoded)}?alt=media`
const fileRes = await fetch(gcsUrl, {
headers: { Authorization: `Bearer ${accessToken}` },
})
if (!fileRes.ok) {
const errText = await fileRes.text()
return createErrorResponse(errText || 'Failed to download file', fileRes.status)
}
const headers = new Headers()
fileRes.headers.forEach((v, k) => headers.set(k, v))
return new NextResponse(fileRes.body, { status: 200, headers })
} catch (error: any) {
logger.error(`[${requestId}] Vault download proxy failed`, {
error: error instanceof Error ? error.message : String(error),
})
return createErrorResponse('Vault download failed', 500)
}
}
if (!targetUrl) {
logger.error(`[${requestId}] Missing 'url' parameter`)
return createErrorResponse("Missing 'url' parameter", 400)
}
const urlValidation = await validateUrlWithDNS(targetUrl)
if (!urlValidation.isValid) {
logger.warn(`[${requestId}] Blocked proxy request`, {
url: targetUrl.substring(0, 100),
error: urlValidation.error,
})
return createErrorResponse(urlValidation.error || 'Invalid URL', 403)
}
const method = url.searchParams.get('method') || 'GET'
const bodyParam = url.searchParams.get('body')
let body: string | undefined
if (bodyParam && ['POST', 'PUT', 'PATCH'].includes(method.toUpperCase())) {
try {
body = decodeURIComponent(bodyParam)
} catch (error) {
logger.warn(`[${requestId}] Failed to decode body parameter`, error)
}
}
const customHeaders: Record<string, string> = {}
for (const [key, value] of url.searchParams.entries()) {
if (key.startsWith('header.')) {
const headerName = key.substring(7)
customHeaders[headerName] = value
}
}
if (body && !customHeaders['Content-Type']) {
customHeaders['Content-Type'] = 'application/json'
}
logger.info(`[${requestId}] Proxying ${method} request to: ${targetUrl}`)
try {
const pinnedUrl = createPinnedUrl(targetUrl, urlValidation.resolvedIP!)
const response = await fetch(pinnedUrl, {
method: method,
headers: {
...getProxyHeaders(),
...customHeaders,
Host: urlValidation.originalHostname!,
},
body: body || undefined,
})
const contentType = response.headers.get('content-type') || ''
let data
if (contentType.includes('application/json')) {
data = await response.json()
} else {
data = await response.text()
}
const errorMessage = !response.ok
? data && typeof data === 'object' && data.error
? `${data.error.message || JSON.stringify(data.error)}`
: response.statusText || `HTTP error ${response.status}`
: undefined
if (!response.ok) {
logger.error(`[${requestId}] External API error: ${response.status} ${response.statusText}`)
}
return formatResponse({
success: response.ok,
status: response.status,
statusText: response.statusText,
headers: Object.fromEntries(response.headers.entries()),
data,
error: errorMessage,
})
} catch (error: any) {
logger.error(`[${requestId}] Proxy GET request failed`, {
url: targetUrl,
error: error instanceof Error ? error.message : String(error),
stack: error instanceof Error ? error.stack : undefined,
})
return createErrorResponse(error)
}
}
export async function POST(request: NextRequest) {
const requestId = generateRequestId()
const startTime = new Date()
const startTimeISO = startTime.toISOString()
try {
const authResult = await checkHybridAuth(request, { requireWorkflowId: false })
if (!authResult.success) {
logger.error(`[${requestId}] Authentication failed for proxy:`, authResult.error)
return createErrorResponse('Unauthorized', 401)
}
let requestBody
try {
requestBody = await request.json()
} catch (parseError) {
logger.error(`[${requestId}] Failed to parse request body`, {
error: parseError instanceof Error ? parseError.message : String(parseError),
})
throw new Error('Invalid JSON in request body')
}
const validationResult = proxyPostSchema.safeParse(requestBody)
if (!validationResult.success) {
logger.error(`[${requestId}] Request validation failed`, {
errors: validationResult.error.errors,
})
const errorMessages = validationResult.error.errors
.map((err) => `${err.path.join('.')}: ${err.message}`)
.join(', ')
throw new Error(`Validation failed: ${errorMessages}`)
}
const { toolId, params } = validationResult.data
logger.info(`[${requestId}] Processing tool: ${toolId}`)
const tool = getTool(toolId)
if (!tool) {
logger.error(`[${requestId}] Tool not found: ${toolId}`)
throw new Error(`Tool not found: ${toolId}`)
}
try {
validateRequiredParametersAfterMerge(toolId, tool, params)
} catch (validationError) {
logger.warn(`[${requestId}] Tool validation failed for ${toolId}`, {
error: validationError instanceof Error ? validationError.message : String(validationError),
})
const endTime = new Date()
const endTimeISO = endTime.toISOString()
const duration = endTime.getTime() - startTime.getTime()
return createErrorResponse(validationError, 400, {
startTime: startTimeISO,
endTime: endTimeISO,
duration,
})
}
const hasFileOutputs =
tool.outputs &&
Object.values(tool.outputs).some(
(output) => output.type === 'file' || output.type === 'file[]'
)
const result = await executeTool(
toolId,
params,
true, // skipProxy (we're already in the proxy)
!hasFileOutputs, // skipPostProcess (don't skip if tool has file outputs)
undefined // execution context is not available in proxy context
)
if (!result.success) {
logger.warn(`[${requestId}] Tool execution failed for ${toolId}`, {
error: result.error || 'Unknown error',
})
throw new Error(result.error || 'Tool execution failed')
}
const endTime = new Date()
const endTimeISO = endTime.toISOString()
const duration = endTime.getTime() - startTime.getTime()
const responseWithTimingData = {
...result,
startTime: startTimeISO,
endTime: endTimeISO,
duration,
timing: {
startTime: startTimeISO,
endTime: endTimeISO,
duration,
},
}
logger.info(`[${requestId}] Tool executed successfully: ${toolId} (${duration}ms)`)
return formatResponse(responseWithTimingData)
} catch (error: any) {
logger.error(`[${requestId}] Proxy request failed`, {
error: error instanceof Error ? error.message : String(error),
stack: error instanceof Error ? error.stack : undefined,
name: error instanceof Error ? error.name : undefined,
})
const endTime = new Date()
const endTimeISO = endTime.toISOString()
const duration = endTime.getTime() - startTime.getTime()
return createErrorResponse(error, 500, {
startTime: startTimeISO,
endTime: endTimeISO,
duration,
})
}
}
export async function OPTIONS() {
return new NextResponse(null, {
status: 204,
headers: {
'Access-Control-Allow-Origin': '*',
'Access-Control-Allow-Methods': 'GET, POST, PUT, DELETE, OPTIONS',
'Access-Control-Allow-Headers': 'Content-Type, Authorization',
'Access-Control-Max-Age': '86400',
},
})
}


@@ -5,7 +5,11 @@ import { checkHybridAuth } from '@/lib/auth/hybrid'
import { generateRequestId } from '@/lib/core/utils/request'
import { getBaseUrl } from '@/lib/core/utils/urls'
import { StorageService } from '@/lib/uploads'
-import { extractStorageKey, inferContextFromKey } from '@/lib/uploads/utils/file-utils'
+import {
+  extractStorageKey,
+  inferContextFromKey,
+  isInternalFileUrl,
+} from '@/lib/uploads/utils/file-utils'
import { verifyFileAccess } from '@/app/api/files/authorization'
export const dynamic = 'force-dynamic'
@@ -47,13 +51,13 @@ export async function POST(request: NextRequest) {
    logger.info(`[${requestId}] Mistral parse request`, {
      filePath: validatedData.filePath,
-     isWorkspaceFile: validatedData.filePath.includes('/api/files/serve/'),
+     isWorkspaceFile: isInternalFileUrl(validatedData.filePath),
      userId,
    })
    let fileUrl = validatedData.filePath
-   if (validatedData.filePath?.includes('/api/files/serve/')) {
+   if (isInternalFileUrl(validatedData.filePath)) {
      try {
        const storageKey = extractStorageKey(validatedData.filePath)


@@ -5,7 +5,11 @@ import { checkHybridAuth } from '@/lib/auth/hybrid'
import { generateRequestId } from '@/lib/core/utils/request'
import { getBaseUrl } from '@/lib/core/utils/urls'
import { StorageService } from '@/lib/uploads'
-import { extractStorageKey, inferContextFromKey } from '@/lib/uploads/utils/file-utils'
+import {
+  extractStorageKey,
+  inferContextFromKey,
+  isInternalFileUrl,
+} from '@/lib/uploads/utils/file-utils'
import { verifyFileAccess } from '@/app/api/files/authorization'
export const dynamic = 'force-dynamic'
@@ -48,13 +52,13 @@ export async function POST(request: NextRequest) {
    logger.info(`[${requestId}] Pulse parse request`, {
      filePath: validatedData.filePath,
-     isWorkspaceFile: validatedData.filePath.includes('/api/files/serve/'),
+     isWorkspaceFile: isInternalFileUrl(validatedData.filePath),
      userId,
    })
    let fileUrl = validatedData.filePath
-   if (validatedData.filePath?.includes('/api/files/serve/')) {
+   if (isInternalFileUrl(validatedData.filePath)) {
      try {
        const storageKey = extractStorageKey(validatedData.filePath)
        const context = inferContextFromKey(storageKey)


@@ -5,7 +5,11 @@ import { checkHybridAuth } from '@/lib/auth/hybrid'
import { generateRequestId } from '@/lib/core/utils/request'
import { getBaseUrl } from '@/lib/core/utils/urls'
import { StorageService } from '@/lib/uploads'
-import { extractStorageKey, inferContextFromKey } from '@/lib/uploads/utils/file-utils'
+import {
+  extractStorageKey,
+  inferContextFromKey,
+  isInternalFileUrl,
+} from '@/lib/uploads/utils/file-utils'
import { verifyFileAccess } from '@/app/api/files/authorization'
export const dynamic = 'force-dynamic'
@@ -44,13 +48,13 @@ export async function POST(request: NextRequest) {
    logger.info(`[${requestId}] Reducto parse request`, {
      filePath: validatedData.filePath,
-     isWorkspaceFile: validatedData.filePath.includes('/api/files/serve/'),
+     isWorkspaceFile: isInternalFileUrl(validatedData.filePath),
      userId,
    })
    let fileUrl = validatedData.filePath
-   if (validatedData.filePath?.includes('/api/files/serve/')) {
+   if (isInternalFileUrl(validatedData.filePath)) {
      try {
        const storageKey = extractStorageKey(validatedData.filePath)
        const context = inferContextFromKey(storageKey)
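The three parse routes above now delegate the workspace-file check to isInternalFileUrl. The helper's implementation is not part of this compare view; a plausible shape, inferred only from the substring check it replaces, would be:

// Hypothetical reconstruction of the helper in '@/lib/uploads/utils/file-utils'
// (the real implementation is not shown in this diff and may do more).
export function isInternalFileUrl(filePath: string | null | undefined): boolean {
  return Boolean(filePath?.includes('/api/files/serve/'))
}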


@@ -79,11 +79,13 @@ export async function POST(request: NextRequest) {
  // Generate public URL for destination (properly encode the destination key)
  const encodedDestKey = validatedData.destinationKey.split('/').map(encodeURIComponent).join('/')
  const url = `https://${validatedData.destinationBucket}.s3.${validatedData.region}.amazonaws.com/${encodedDestKey}`
+ const uri = `s3://${validatedData.destinationBucket}/${validatedData.destinationKey}`
  return NextResponse.json({
    success: true,
    output: {
      url,
+     uri,
      copySourceVersionId: result.CopySourceVersionId,
      versionId: result.VersionId,
      etag: result.CopyObjectResult?.ETag,


@@ -117,11 +117,13 @@ export async function POST(request: NextRequest) {
  const encodedKey = validatedData.objectKey.split('/').map(encodeURIComponent).join('/')
  const url = `https://${validatedData.bucketName}.s3.${validatedData.region}.amazonaws.com/${encodedKey}`
+ const uri = `s3://${validatedData.bucketName}/${validatedData.objectKey}`
  return NextResponse.json({
    success: true,
    output: {
      url,
+     uri,
      etag: result.ETag,
      location: url,
      key: validatedData.objectKey,
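Both S3 routes now return an s3:// URI alongside the HTTPS URL, which is the form the Textract multi-page mode added later in this changeset expects. A small sketch of consuming it, mirroring the regex that parseS3Uri uses:

// Split an `s3://${bucket}/${key}` URI back into its parts (illustrative helper).
function splitS3Uri(uri: string): { bucket: string; key: string } {
  const match = uri.match(/^s3:\/\/([^/]+)\/(.+)$/)
  if (!match) throw new Error(`Invalid S3 URI: ${uri}`)
  return { bucket: match[1], key: match[2] }
}

// splitS3Uri('s3://my-bucket/reports/q1.pdf') -> { bucket: 'my-bucket', key: 'reports/q1.pdf' }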


@@ -0,0 +1,637 @@
import crypto from 'crypto'
import { createLogger } from '@sim/logger'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { checkHybridAuth } from '@/lib/auth/hybrid'
import {
validateAwsRegion,
validateExternalUrl,
validateS3BucketName,
} from '@/lib/core/security/input-validation'
import { generateRequestId } from '@/lib/core/utils/request'
import { StorageService } from '@/lib/uploads'
import {
extractStorageKey,
inferContextFromKey,
isInternalFileUrl,
} from '@/lib/uploads/utils/file-utils'
import { verifyFileAccess } from '@/app/api/files/authorization'
export const dynamic = 'force-dynamic'
export const maxDuration = 300 // 5 minutes for large multi-page PDF processing
const logger = createLogger('TextractParseAPI')
const QuerySchema = z.object({
Text: z.string().min(1),
Alias: z.string().optional(),
Pages: z.array(z.string()).optional(),
})
const TextractParseSchema = z
.object({
accessKeyId: z.string().min(1, 'AWS Access Key ID is required'),
secretAccessKey: z.string().min(1, 'AWS Secret Access Key is required'),
region: z.string().min(1, 'AWS region is required'),
processingMode: z.enum(['sync', 'async']).optional().default('sync'),
filePath: z.string().optional(),
s3Uri: z.string().optional(),
featureTypes: z
.array(z.enum(['TABLES', 'FORMS', 'QUERIES', 'SIGNATURES', 'LAYOUT']))
.optional(),
queries: z.array(QuerySchema).optional(),
})
.superRefine((data, ctx) => {
const regionValidation = validateAwsRegion(data.region, 'AWS region')
if (!regionValidation.isValid) {
ctx.addIssue({
code: z.ZodIssueCode.custom,
message: regionValidation.error,
path: ['region'],
})
}
})
function getSignatureKey(
key: string,
dateStamp: string,
regionName: string,
serviceName: string
): Buffer {
const kDate = crypto.createHmac('sha256', `AWS4${key}`).update(dateStamp).digest()
const kRegion = crypto.createHmac('sha256', kDate).update(regionName).digest()
const kService = crypto.createHmac('sha256', kRegion).update(serviceName).digest()
const kSigning = crypto.createHmac('sha256', kService).update('aws4_request').digest()
return kSigning
}
function signAwsRequest(
method: string,
host: string,
uri: string,
body: string,
accessKeyId: string,
secretAccessKey: string,
region: string,
service: string,
amzTarget: string
): Record<string, string> {
const date = new Date()
const amzDate = date.toISOString().replace(/[:-]|\.\d{3}/g, '')
const dateStamp = amzDate.slice(0, 8)
const payloadHash = crypto.createHash('sha256').update(body).digest('hex')
const canonicalHeaders =
`content-type:application/x-amz-json-1.1\n` +
`host:${host}\n` +
`x-amz-date:${amzDate}\n` +
`x-amz-target:${amzTarget}\n`
const signedHeaders = 'content-type;host;x-amz-date;x-amz-target'
const canonicalRequest = `${method}\n${uri}\n\n${canonicalHeaders}\n${signedHeaders}\n${payloadHash}`
const algorithm = 'AWS4-HMAC-SHA256'
const credentialScope = `${dateStamp}/${region}/${service}/aws4_request`
const stringToSign = `${algorithm}\n${amzDate}\n${credentialScope}\n${crypto.createHash('sha256').update(canonicalRequest).digest('hex')}`
const signingKey = getSignatureKey(secretAccessKey, dateStamp, region, service)
const signature = crypto.createHmac('sha256', signingKey).update(stringToSign).digest('hex')
const authorizationHeader = `${algorithm} Credential=${accessKeyId}/${credentialScope}, SignedHeaders=${signedHeaders}, Signature=${signature}`
return {
'Content-Type': 'application/x-amz-json-1.1',
Host: host,
'X-Amz-Date': amzDate,
'X-Amz-Target': amzTarget,
Authorization: authorizationHeader,
}
}
async function fetchDocumentBytes(url: string): Promise<{ bytes: string; contentType: string }> {
const response = await fetch(url)
if (!response.ok) {
throw new Error(`Failed to fetch document: ${response.statusText}`)
}
const arrayBuffer = await response.arrayBuffer()
const bytes = Buffer.from(arrayBuffer).toString('base64')
const contentType = response.headers.get('content-type') || 'application/octet-stream'
return { bytes, contentType }
}
function parseS3Uri(s3Uri: string): { bucket: string; key: string } {
const match = s3Uri.match(/^s3:\/\/([^/]+)\/(.+)$/)
if (!match) {
throw new Error(
`Invalid S3 URI format: ${s3Uri}. Expected format: s3://bucket-name/path/to/object`
)
}
const bucket = match[1]
const key = match[2]
const bucketValidation = validateS3BucketName(bucket, 'S3 bucket name')
if (!bucketValidation.isValid) {
throw new Error(bucketValidation.error)
}
if (key.includes('..') || key.startsWith('/')) {
throw new Error('S3 key contains invalid path traversal sequences')
}
return { bucket, key }
}
function sleep(ms: number): Promise<void> {
return new Promise((resolve) => setTimeout(resolve, ms))
}
async function callTextractAsync(
host: string,
amzTarget: string,
body: Record<string, unknown>,
accessKeyId: string,
secretAccessKey: string,
region: string
): Promise<Record<string, unknown>> {
const bodyString = JSON.stringify(body)
const headers = signAwsRequest(
'POST',
host,
'/',
bodyString,
accessKeyId,
secretAccessKey,
region,
'textract',
amzTarget
)
const response = await fetch(`https://${host}/`, {
method: 'POST',
headers,
body: bodyString,
})
if (!response.ok) {
const errorText = await response.text()
let errorMessage = `Textract API error: ${response.statusText}`
try {
const errorJson = JSON.parse(errorText)
if (errorJson.Message) {
errorMessage = errorJson.Message
} else if (errorJson.__type) {
errorMessage = `${errorJson.__type}: ${errorJson.message || errorText}`
}
} catch {
// Use default error message
}
throw new Error(errorMessage)
}
return response.json()
}
async function pollForJobCompletion(
host: string,
jobId: string,
accessKeyId: string,
secretAccessKey: string,
region: string,
useAnalyzeDocument: boolean,
requestId: string
): Promise<Record<string, unknown>> {
const pollIntervalMs = 5000 // 5 seconds between polls
const maxPollTimeMs = 180000 // 3 minutes maximum polling time
const maxAttempts = Math.ceil(maxPollTimeMs / pollIntervalMs)
const getTarget = useAnalyzeDocument
? 'Textract.GetDocumentAnalysis'
: 'Textract.GetDocumentTextDetection'
for (let attempt = 0; attempt < maxAttempts; attempt++) {
const result = await callTextractAsync(
host,
getTarget,
{ JobId: jobId },
accessKeyId,
secretAccessKey,
region
)
const jobStatus = result.JobStatus as string
if (jobStatus === 'SUCCEEDED') {
logger.info(`[${requestId}] Async job completed successfully after ${attempt + 1} polls`)
let allBlocks = (result.Blocks as unknown[]) || []
let nextToken = result.NextToken as string | undefined
while (nextToken) {
const nextResult = await callTextractAsync(
host,
getTarget,
{ JobId: jobId, NextToken: nextToken },
accessKeyId,
secretAccessKey,
region
)
allBlocks = allBlocks.concat((nextResult.Blocks as unknown[]) || [])
nextToken = nextResult.NextToken as string | undefined
}
return {
...result,
Blocks: allBlocks,
}
}
if (jobStatus === 'FAILED') {
throw new Error(`Textract job failed: ${result.StatusMessage || 'Unknown error'}`)
}
if (jobStatus === 'PARTIAL_SUCCESS') {
logger.warn(`[${requestId}] Job completed with partial success: ${result.StatusMessage}`)
let allBlocks = (result.Blocks as unknown[]) || []
let nextToken = result.NextToken as string | undefined
while (nextToken) {
const nextResult = await callTextractAsync(
host,
getTarget,
{ JobId: jobId, NextToken: nextToken },
accessKeyId,
secretAccessKey,
region
)
allBlocks = allBlocks.concat((nextResult.Blocks as unknown[]) || [])
nextToken = nextResult.NextToken as string | undefined
}
return {
...result,
Blocks: allBlocks,
}
}
logger.info(`[${requestId}] Job status: ${jobStatus}, attempt ${attempt + 1}/${maxAttempts}`)
await sleep(pollIntervalMs)
}
throw new Error(
`Timeout waiting for Textract job to complete (max ${maxPollTimeMs / 1000} seconds)`
)
}
export async function POST(request: NextRequest) {
const requestId = generateRequestId()
try {
const authResult = await checkHybridAuth(request, { requireWorkflowId: false })
if (!authResult.success || !authResult.userId) {
logger.warn(`[${requestId}] Unauthorized Textract parse attempt`, {
error: authResult.error || 'Missing userId',
})
return NextResponse.json(
{
success: false,
error: authResult.error || 'Unauthorized',
},
{ status: 401 }
)
}
const userId = authResult.userId
const body = await request.json()
const validatedData = TextractParseSchema.parse(body)
const processingMode = validatedData.processingMode || 'sync'
const featureTypes = validatedData.featureTypes ?? []
const useAnalyzeDocument = featureTypes.length > 0
const host = `textract.${validatedData.region}.amazonaws.com`
logger.info(`[${requestId}] Textract parse request`, {
processingMode,
filePath: validatedData.filePath?.substring(0, 50),
s3Uri: validatedData.s3Uri?.substring(0, 50),
featureTypes,
userId,
})
if (processingMode === 'async') {
if (!validatedData.s3Uri) {
return NextResponse.json(
{
success: false,
error: 'S3 URI is required for multi-page processing (s3://bucket/key)',
},
{ status: 400 }
)
}
const { bucket: s3Bucket, key: s3Key } = parseS3Uri(validatedData.s3Uri)
logger.info(`[${requestId}] Starting async Textract job`, { s3Bucket, s3Key })
const startTarget = useAnalyzeDocument
? 'Textract.StartDocumentAnalysis'
: 'Textract.StartDocumentTextDetection'
const startBody: Record<string, unknown> = {
DocumentLocation: {
S3Object: {
Bucket: s3Bucket,
Name: s3Key,
},
},
}
if (useAnalyzeDocument) {
startBody.FeatureTypes = featureTypes
if (
validatedData.queries &&
validatedData.queries.length > 0 &&
featureTypes.includes('QUERIES')
) {
startBody.QueriesConfig = {
Queries: validatedData.queries.map((q) => ({
Text: q.Text,
Alias: q.Alias,
Pages: q.Pages,
})),
}
}
}
const startResult = await callTextractAsync(
host,
startTarget,
startBody,
validatedData.accessKeyId,
validatedData.secretAccessKey,
validatedData.region
)
const jobId = startResult.JobId as string
if (!jobId) {
throw new Error('Failed to start Textract job: No JobId returned')
}
logger.info(`[${requestId}] Async job started`, { jobId })
const textractData = await pollForJobCompletion(
host,
jobId,
validatedData.accessKeyId,
validatedData.secretAccessKey,
validatedData.region,
useAnalyzeDocument,
requestId
)
logger.info(`[${requestId}] Textract async parse successful`, {
pageCount: (textractData.DocumentMetadata as { Pages?: number })?.Pages ?? 0,
blockCount: (textractData.Blocks as unknown[])?.length ?? 0,
})
return NextResponse.json({
success: true,
output: {
blocks: textractData.Blocks ?? [],
documentMetadata: {
pages: (textractData.DocumentMetadata as { Pages?: number })?.Pages ?? 0,
},
modelVersion: (textractData.AnalyzeDocumentModelVersion ??
textractData.DetectDocumentTextModelVersion) as string | undefined,
},
})
}
if (!validatedData.filePath) {
return NextResponse.json(
{
success: false,
error: 'File path is required for single-page processing',
},
{ status: 400 }
)
}
let fileUrl = validatedData.filePath
const isInternalFilePath = validatedData.filePath && isInternalFileUrl(validatedData.filePath)
if (isInternalFilePath) {
try {
const storageKey = extractStorageKey(validatedData.filePath)
const context = inferContextFromKey(storageKey)
const hasAccess = await verifyFileAccess(storageKey, userId, undefined, context, false)
if (!hasAccess) {
logger.warn(`[${requestId}] Unauthorized presigned URL generation attempt`, {
userId,
key: storageKey,
context,
})
return NextResponse.json(
{
success: false,
error: 'File not found',
},
{ status: 404 }
)
}
fileUrl = await StorageService.generatePresignedDownloadUrl(storageKey, context, 5 * 60)
logger.info(`[${requestId}] Generated presigned URL for ${context} file`)
} catch (error) {
logger.error(`[${requestId}] Failed to generate presigned URL:`, error)
return NextResponse.json(
{
success: false,
error: 'Failed to generate file access URL',
},
{ status: 500 }
)
}
} else if (validatedData.filePath?.startsWith('/')) {
// Reject arbitrary absolute paths that don't contain /api/files/serve/
logger.warn(`[${requestId}] Invalid internal path`, {
userId,
path: validatedData.filePath.substring(0, 50),
})
return NextResponse.json(
{
success: false,
error: 'Invalid file path. Only uploaded files are supported for internal paths.',
},
{ status: 400 }
)
} else {
const urlValidation = validateExternalUrl(fileUrl, 'Document URL')
if (!urlValidation.isValid) {
logger.warn(`[${requestId}] SSRF attempt blocked`, {
userId,
url: fileUrl.substring(0, 100),
error: urlValidation.error,
})
return NextResponse.json(
{
success: false,
error: urlValidation.error,
},
{ status: 400 }
)
}
}
const { bytes, contentType } = await fetchDocumentBytes(fileUrl)
// Track if this is a PDF for better error messaging
const isPdf = contentType.includes('pdf') || fileUrl.toLowerCase().endsWith('.pdf')
const uri = '/'
let textractBody: Record<string, unknown>
let amzTarget: string
if (useAnalyzeDocument) {
amzTarget = 'Textract.AnalyzeDocument'
textractBody = {
Document: {
Bytes: bytes,
},
FeatureTypes: featureTypes,
}
if (
validatedData.queries &&
validatedData.queries.length > 0 &&
featureTypes.includes('QUERIES')
) {
textractBody.QueriesConfig = {
Queries: validatedData.queries.map((q) => ({
Text: q.Text,
Alias: q.Alias,
Pages: q.Pages,
})),
}
}
} else {
amzTarget = 'Textract.DetectDocumentText'
textractBody = {
Document: {
Bytes: bytes,
},
}
}
const bodyString = JSON.stringify(textractBody)
const headers = signAwsRequest(
'POST',
host,
uri,
bodyString,
validatedData.accessKeyId,
validatedData.secretAccessKey,
validatedData.region,
'textract',
amzTarget
)
const textractResponse = await fetch(`https://${host}${uri}`, {
method: 'POST',
headers,
body: bodyString,
})
if (!textractResponse.ok) {
const errorText = await textractResponse.text()
logger.error(`[${requestId}] Textract API error:`, errorText)
let errorMessage = `Textract API error: ${textractResponse.statusText}`
let isUnsupportedFormat = false
try {
const errorJson = JSON.parse(errorText)
if (errorJson.Message) {
errorMessage = errorJson.Message
} else if (errorJson.__type) {
errorMessage = `${errorJson.__type}: ${errorJson.message || errorText}`
}
// Check for unsupported document format error
isUnsupportedFormat =
errorJson.__type === 'UnsupportedDocumentException' ||
errorJson.Message?.toLowerCase().includes('unsupported document') ||
errorText.toLowerCase().includes('unsupported document')
} catch {
isUnsupportedFormat = errorText.toLowerCase().includes('unsupported document')
}
// Provide helpful message for unsupported format (likely multi-page PDF)
if (isUnsupportedFormat && isPdf) {
errorMessage =
'This document format is not supported in Single Page mode. If this is a multi-page PDF, please use "Multi-Page (PDF, TIFF via S3)" mode instead, which requires uploading your document to S3 first. Single Page mode only supports JPEG, PNG, and single-page PDF files.'
}
return NextResponse.json(
{
success: false,
error: errorMessage,
},
{ status: textractResponse.status }
)
}
const textractData = await textractResponse.json()
logger.info(`[${requestId}] Textract parse successful`, {
pageCount: textractData.DocumentMetadata?.Pages ?? 0,
blockCount: textractData.Blocks?.length ?? 0,
})
return NextResponse.json({
success: true,
output: {
blocks: textractData.Blocks ?? [],
documentMetadata: {
pages: textractData.DocumentMetadata?.Pages ?? 0,
},
modelVersion:
textractData.AnalyzeDocumentModelVersion ??
textractData.DetectDocumentTextModelVersion ??
undefined,
},
})
} catch (error) {
if (error instanceof z.ZodError) {
logger.warn(`[${requestId}] Invalid request data`, { errors: error.errors })
return NextResponse.json(
{
success: false,
error: 'Invalid request data',
details: error.errors,
},
{ status: 400 }
)
}
logger.error(`[${requestId}] Error in Textract parse:`, error)
return NextResponse.json(
{
success: false,
error: error instanceof Error ? error.message : 'Internal server error',
},
{ status: 500 }
)
}
}
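A rough sketch of calling the new Textract parse route. The route's URL is not visible in this compare view, so the path below is a placeholder; the body fields follow TextractParseSchema and the response shape returned above.

// '/api/tools/textract/parse' is a placeholder path, not confirmed by this diff.
const res = await fetch('/api/tools/textract/parse', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    accessKeyId: process.env.AWS_ACCESS_KEY_ID,
    secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
    region: 'us-east-1',
    processingMode: 'async', // 'sync' handles single-page JPEG/PNG/PDF via filePath
    s3Uri: 's3://my-bucket/reports/q1.pdf', // required when processingMode is 'async'
    featureTypes: ['TABLES', 'QUERIES'],
    queries: [{ Text: 'What is the invoice total?', Alias: 'total' }],
  }),
})
const { success, output } = await res.json()
// output.blocks, output.documentMetadata.pages, output.modelVersion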


@@ -12,6 +12,10 @@ import { markExecutionCancelled } from '@/lib/execution/cancellation'
import { processInputFileFields } from '@/lib/execution/files'
import { preprocessExecution } from '@/lib/execution/preprocessing'
import { LoggingSession } from '@/lib/logs/execution/logging-session'
+import {
+  cleanupExecutionBase64Cache,
+  hydrateUserFilesWithBase64,
+} from '@/lib/uploads/utils/user-file-base64.server'
import { executeWorkflowCore } from '@/lib/workflows/executor/execution-core'
import { type ExecutionEvent, encodeSSEEvent } from '@/lib/workflows/executor/execution-events'
import { PauseResumeManager } from '@/lib/workflows/executor/human-in-the-loop-manager'
@@ -25,7 +29,7 @@ import type { WorkflowExecutionPayload } from '@/background/workflow-execution'
import { normalizeName } from '@/executor/constants'
import { ExecutionSnapshot } from '@/executor/execution/snapshot'
import type { ExecutionMetadata, IterationContext } from '@/executor/execution/types'
-import type { StreamingExecution } from '@/executor/types'
+import type { NormalizedBlockOutput, StreamingExecution } from '@/executor/types'
import { Serializer } from '@/serializer'
import { CORE_TRIGGER_TYPES, type CoreTriggerType } from '@/stores/logs/filters/types'
@@ -38,6 +42,8 @@ const ExecuteWorkflowSchema = z.object({
  useDraftState: z.boolean().optional(),
  input: z.any().optional(),
  isClientSession: z.boolean().optional(),
+ includeFileBase64: z.boolean().optional().default(true),
+ base64MaxBytes: z.number().int().positive().optional(),
  workflowStateOverride: z
    .object({
      blocks: z.record(z.any()),
@@ -214,6 +220,8 @@ export async function POST(req: NextRequest, { params }: { params: Promise<{ id:
    useDraftState,
    input: validatedInput,
    isClientSession = false,
+   includeFileBase64,
+   base64MaxBytes,
    workflowStateOverride,
  } = validation.data
@@ -227,6 +235,8 @@ export async function POST(req: NextRequest, { params }: { params: Promise<{ id:
    triggerType,
    stream,
    useDraftState,
+   includeFileBase64,
+   base64MaxBytes,
    workflowStateOverride,
    workflowId: _workflowId, // Also exclude workflowId used for internal JWT auth
    ...rest
@@ -427,16 +437,31 @@ export async function POST(req: NextRequest, { params }: { params: Promise<{ id:
      snapshot,
      callbacks: {},
      loggingSession,
+     includeFileBase64,
+     base64MaxBytes,
    })
-   const hasResponseBlock = workflowHasResponseBlock(result)
+   const outputWithBase64 = includeFileBase64
+     ? ((await hydrateUserFilesWithBase64(result.output, {
+         requestId,
+         executionId,
+         maxBytes: base64MaxBytes,
+       })) as NormalizedBlockOutput)
+     : result.output
+   const resultWithBase64 = { ...result, output: outputWithBase64 }
+   // Cleanup base64 cache for this execution
+   await cleanupExecutionBase64Cache(executionId)
+   const hasResponseBlock = workflowHasResponseBlock(resultWithBase64)
    if (hasResponseBlock) {
-     return createHttpResponseFromBlock(result)
+     return createHttpResponseFromBlock(resultWithBase64)
    }
    const filteredResult = {
      success: result.success,
-     output: result.output,
+     output: outputWithBase64,
      error: result.error,
      metadata: result.metadata
        ? {
@@ -498,6 +523,8 @@ export async function POST(req: NextRequest, { params }: { params: Promise<{ id:
        selectedOutputs: resolvedSelectedOutputs,
        isSecureMode: false,
        workflowTriggerType: triggerType === 'chat' ? 'chat' : 'api',
+       includeFileBase64,
+       base64MaxBytes,
      },
      executionId,
    })
@@ -698,6 +725,8 @@ export async function POST(req: NextRequest, { params }: { params: Promise<{ id:
      },
      loggingSession,
      abortSignal: abortController.signal,
+     includeFileBase64,
+     base64MaxBytes,
    })
    if (result.status === 'paused') {
@@ -750,12 +779,21 @@ export async function POST(req: NextRequest, { params }: { params: Promise<{ id:
      workflowId,
      data: {
        success: result.success,
-       output: result.output,
+       output: includeFileBase64
+         ? await hydrateUserFilesWithBase64(result.output, {
+             requestId,
+             executionId,
+             maxBytes: base64MaxBytes,
+           })
+         : result.output,
        duration: result.metadata?.duration || 0,
        startTime: result.metadata?.startTime || startTime.toISOString(),
        endTime: result.metadata?.endTime || new Date().toISOString(),
      },
    })
+   // Cleanup base64 cache for this execution
+   await cleanupExecutionBase64Cache(executionId)
  } catch (error: any) {
    const errorMessage = error.message || 'Unknown error'
    logger.error(`[${requestId}] SSE execution failed: ${errorMessage}`)
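A minimal sketch of how a caller opts in to the new base64 hydration on the execute route. The exact route path is truncated in this diff, so the URL below is illustrative; includeFileBase64 defaults to true per ExecuteWorkflowSchema.

const workflowId = 'wf_123' // hypothetical id for illustration
const res = await fetch(`/api/workflows/${workflowId}/execute`, {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    input: { message: 'summarize the attached file' },
    includeFileBase64: true,
    base64MaxBytes: 5 * 1024 * 1024, // skip inlining files larger than ~5 MB
  }),
})
const { output } = await res.json()
// UserFile entries in `output` are hydrated with base64 content when enabled.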


@@ -0,0 +1,27 @@
'use client'
import Link from 'next/link'
import { useBrandedButtonClass } from '@/hooks/use-branded-button-class'
interface BrandedLinkProps {
href: string
children: React.ReactNode
className?: string
target?: string
rel?: string
}
export function BrandedLink({ href, children, className = '', target, rel }: BrandedLinkProps) {
const buttonClass = useBrandedButtonClass()
return (
<Link
href={href}
target={target}
rel={rel}
className={`${buttonClass} group inline-flex items-center justify-center gap-2 rounded-[10px] py-[6px] pr-[10px] pl-[12px] text-[15px] text-white transition-all ${className}`}
>
{children}
</Link>
)
}
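Usage is a drop-in replacement for the hand-styled Link it supersedes, as the changelog change below shows; a minimal sketch:

import { BrandedLink } from '@/app/changelog/components/branded-link'

export function DocsButton() {
  return <BrandedLink href='https://docs.sim.ai'>Read the docs</BrandedLink>
}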


@@ -2,6 +2,7 @@ import { BookOpen, Github, Rss } from 'lucide-react'
import Link from 'next/link'
import { inter } from '@/app/_styles/fonts/inter/inter'
import { soehne } from '@/app/_styles/fonts/soehne/soehne'
+import { BrandedLink } from '@/app/changelog/components/branded-link'
import ChangelogList from '@/app/changelog/components/timeline-list'
export interface ChangelogEntry {
@@ -66,25 +67,24 @@ export default async function ChangelogContent() {
        <hr className='mt-6 border-border' />
        <div className='mt-6 flex flex-wrap items-center gap-3 text-sm'>
-         <Link
+         <BrandedLink
            href='https://github.com/simstudioai/sim/releases'
            target='_blank'
            rel='noopener noreferrer'
-           className='group inline-flex items-center justify-center gap-2 rounded-[10px] border border-[#6F3DFA] bg-gradient-to-b from-[#8357FF] to-[#6F3DFA] py-[6px] pr-[10px] pl-[12px] text-[14px] text-white shadow-[inset_0_2px_4px_0_#9B77FF] transition-all sm:text-[16px]'
          >
            <Github className='h-4 w-4' />
            View on GitHub
-         </Link>
+         </BrandedLink>
          <Link
            href='https://docs.sim.ai'
-           className='inline-flex items-center gap-2 rounded-md border border-border px-3 py-1.5 hover:bg-muted'
+           className='inline-flex items-center gap-2 rounded-[10px] border border-border py-[6px] pr-[10px] pl-[12px] text-[15px] transition-all hover:bg-muted'
          >
            <BookOpen className='h-4 w-4' />
            Documentation
          </Link>
          <Link
            href='/changelog.xml'
-           className='inline-flex items-center gap-2 rounded-md border border-border px-3 py-1.5 hover:bg-muted'
+           className='inline-flex items-center gap-2 rounded-[10px] border border-border py-[6px] pr-[10px] pl-[12px] text-[15px] transition-all hover:bg-muted'
          >
            <Rss className='h-4 w-4' />
            RSS Feed

View File

@@ -117,7 +117,7 @@ export default function ChatClient({ identifier }: { identifier: string }) {
const [error, setError] = useState<string | null>(null) const [error, setError] = useState<string | null>(null)
const messagesEndRef = useRef<HTMLDivElement>(null) const messagesEndRef = useRef<HTMLDivElement>(null)
const messagesContainerRef = useRef<HTMLDivElement>(null) const messagesContainerRef = useRef<HTMLDivElement>(null)
const [starCount, setStarCount] = useState('25.1k') const [starCount, setStarCount] = useState('25.8k')
const [conversationId, setConversationId] = useState('') const [conversationId, setConversationId] = useState('')
const [showScrollButton, setShowScrollButton] = useState(false) const [showScrollButton, setShowScrollButton] = useState(false)

View File

@@ -2,7 +2,7 @@
import { useRef, useState } from 'react' import { useRef, useState } from 'react'
import { createLogger } from '@sim/logger' import { createLogger } from '@sim/logger'
import { isUserFile } from '@/lib/core/utils/display-filters' import { isUserFileWithMetadata } from '@/lib/core/utils/user-file'
import type { ChatFile, ChatMessage } from '@/app/chat/components/message/message' import type { ChatFile, ChatMessage } from '@/app/chat/components/message/message'
import { CHAT_ERROR_MESSAGES } from '@/app/chat/constants' import { CHAT_ERROR_MESSAGES } from '@/app/chat/constants'
@@ -17,7 +17,7 @@ function extractFilesFromData(
return files return files
} }
if (isUserFile(data)) { if (isUserFileWithMetadata(data)) {
if (!seenIds.has(data.id)) { if (!seenIds.has(data.id)) {
seenIds.add(data.id) seenIds.add(data.id)
files.push({ files.push({
@@ -232,7 +232,7 @@ export function useChatStreaming() {
return null return null
} }
if (isUserFile(value)) { if (isUserFileWithMetadata(value)) {
return null return null
} }
@@ -285,7 +285,7 @@ export function useChatStreaming() {
const value = getOutputValue(blockOutputs, config.path) const value = getOutputValue(blockOutputs, config.path)
if (isUserFile(value)) { if (isUserFileWithMetadata(value)) {
extractedFiles.push({ extractedFiles.push({
id: value.id, id: value.id,
name: value.name, name: value.name,
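
The isUserFileWithMetadata guard that replaces isUserFile here is not shown in this compare; a plausible shape, inferred only from how it is used above (reading id and name), might be the following, and the real guard in '@/lib/core/utils/user-file' may check more fields:

interface UserFileWithMetadata {
  id: string
  name: string
  url?: string
  size?: number
  type?: string
}

// Hypothetical sketch: treat a value as a user file only when it carries the
// identifying metadata the chat streaming code reads off it.
function isUserFileWithMetadataSketch(value: unknown): value is UserFileWithMetadata {
  if (!value || typeof value !== 'object') return false
  const v = value as Record<string, unknown>
  return typeof v.id === 'string' && typeof v.name === 'string'
}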

View File

@@ -207,7 +207,6 @@ function TemplateCardInner({
isPannable={false} isPannable={false}
defaultZoom={0.8} defaultZoom={0.8}
fitPadding={0.2} fitPadding={0.2}
lightweight
/> />
) : ( ) : (
<div className='h-full w-full bg-[var(--surface-4)]' /> <div className='h-full w-full bg-[var(--surface-4)]' />

View File

@@ -16,8 +16,8 @@ import {
import { redactApiKeys } from '@/lib/core/security/redaction' import { redactApiKeys } from '@/lib/core/security/redaction'
import { cn } from '@/lib/core/utils/cn' import { cn } from '@/lib/core/utils/cn'
import { import {
BlockDetailsSidebar,
getLeftmostBlockId, getLeftmostBlockId,
PreviewEditor,
WorkflowPreview, WorkflowPreview,
} from '@/app/workspace/[workspaceId]/w/components/preview' } from '@/app/workspace/[workspaceId]/w/components/preview'
import { useExecutionSnapshot } from '@/hooks/queries/logs' import { useExecutionSnapshot } from '@/hooks/queries/logs'
@@ -248,11 +248,10 @@ export function ExecutionSnapshot({
cursorStyle='pointer' cursorStyle='pointer'
executedBlocks={blockExecutions} executedBlocks={blockExecutions}
selectedBlockId={pinnedBlockId} selectedBlockId={pinnedBlockId}
lightweight
/> />
</div> </div>
{pinnedBlockId && workflowState.blocks[pinnedBlockId] && ( {pinnedBlockId && workflowState.blocks[pinnedBlockId] && (
<BlockDetailsSidebar <PreviewEditor
block={workflowState.blocks[pinnedBlockId]} block={workflowState.blocks[pinnedBlockId]}
executionData={blockExecutions[pinnedBlockId]} executionData={blockExecutions[pinnedBlockId]}
allBlockExecutions={blockExecutions} allBlockExecutions={blockExecutions}

View File

@@ -234,7 +234,7 @@ function ProgressBar({
{segments.map((segment, index) => ( {segments.map((segment, index) => (
<div <div
key={index} key={index}
className='absolute h-full' className='absolute h-full opacity-70'
style={{ style={{
left: `${segment.startPercent}%`, left: `${segment.startPercent}%`,
width: `${segment.widthPercent}%`, width: `${segment.widthPercent}%`,

View File

@@ -257,7 +257,7 @@ export const LogDetails = memo(function LogDetails({
Version Version
</span> </span>
<div className='flex w-0 flex-1 justify-end'> <div className='flex w-0 flex-1 justify-end'>
<span className='max-w-full truncate rounded-[6px] bg-[#14291B] px-[9px] py-[2px] font-medium text-[#86EFAC] text-[12px]'> <span className='max-w-full truncate rounded-[6px] bg-[#bbf7d0] px-[9px] py-[2px] font-medium text-[#15803d] text-[12px] dark:bg-[#14291B] dark:text-[#86EFAC]'>
{log.deploymentVersionName || `v${log.deploymentVersion}`} {log.deploymentVersionName || `v${log.deploymentVersion}`}
</span> </span>
</div> </div>

View File

@@ -19,6 +19,7 @@ import { DatePicker } from '@/components/emcn/components/date-picker/date-picker
import { cn } from '@/lib/core/utils/cn' import { cn } from '@/lib/core/utils/cn'
import { hasActiveFilters } from '@/lib/logs/filters' import { hasActiveFilters } from '@/lib/logs/filters'
import { getTriggerOptions } from '@/lib/logs/get-trigger-options' import { getTriggerOptions } from '@/lib/logs/get-trigger-options'
import { type LogStatus, STATUS_CONFIG } from '@/app/workspace/[workspaceId]/logs/utils'
import { getBlock } from '@/blocks/registry' import { getBlock } from '@/blocks/registry'
import { useFolderStore } from '@/stores/folders/store' import { useFolderStore } from '@/stores/folders/store'
import { useFilterStore } from '@/stores/logs/filters/store' import { useFilterStore } from '@/stores/logs/filters/store'
@@ -211,12 +212,12 @@ export function LogsToolbar({
}, [level]) }, [level])
const statusOptions: ComboboxOption[] = useMemo( const statusOptions: ComboboxOption[] = useMemo(
() => [ () =>
{ value: 'error', label: 'Error', icon: getColorIcon('var(--text-error)') }, (Object.keys(STATUS_CONFIG) as LogStatus[]).map((status) => ({
{ value: 'info', label: 'Info', icon: getColorIcon('var(--terminal-status-info-color)') }, value: status,
{ value: 'running', label: 'Running', icon: getColorIcon('#22c55e') }, label: STATUS_CONFIG[status].label,
{ value: 'pending', label: 'Pending', icon: getColorIcon('#f59e0b') }, icon: getColorIcon(STATUS_CONFIG[status].color),
], })),
[] []
) )
@@ -242,12 +243,8 @@ export function LogsToolbar({
const selectedStatusColor = useMemo(() => { const selectedStatusColor = useMemo(() => {
if (selectedStatuses.length !== 1) return null if (selectedStatuses.length !== 1) return null
const status = selectedStatuses[0] const status = selectedStatuses[0] as LogStatus
if (status === 'error') return 'var(--text-error)' return STATUS_CONFIG[status]?.color ?? null
if (status === 'info') return 'var(--terminal-status-info-color)'
if (status === 'running') return '#22c55e'
if (status === 'pending') return '#f59e0b'
return null
}, [selectedStatuses]) }, [selectedStatuses])
const workflowOptions: ComboboxOption[] = useMemo( const workflowOptions: ComboboxOption[] = useMemo(

View File

@@ -5,7 +5,6 @@ import { getIntegrationMetadata } from '@/lib/logs/get-trigger-options'
import { getBlock } from '@/blocks/registry' import { getBlock } from '@/blocks/registry'
import { CORE_TRIGGER_TYPES } from '@/stores/logs/filters/types' import { CORE_TRIGGER_TYPES } from '@/stores/logs/filters/types'
/** Column configuration for logs table - shared between header and rows */
export const LOG_COLUMNS = { export const LOG_COLUMNS = {
date: { width: 'w-[8%]', minWidth: 'min-w-[70px]', label: 'Date' }, date: { width: 'w-[8%]', minWidth: 'min-w-[70px]', label: 'Date' },
time: { width: 'w-[12%]', minWidth: 'min-w-[90px]', label: 'Time' }, time: { width: 'w-[12%]', minWidth: 'min-w-[90px]', label: 'Time' },
@@ -16,10 +15,8 @@ export const LOG_COLUMNS = {
duration: { width: 'w-[20%]', minWidth: 'min-w-[100px]', label: 'Duration' }, duration: { width: 'w-[20%]', minWidth: 'min-w-[100px]', label: 'Duration' },
} as const } as const
/** Type-safe column key derived from LOG_COLUMNS */
export type LogColumnKey = keyof typeof LOG_COLUMNS export type LogColumnKey = keyof typeof LOG_COLUMNS
/** Ordered list of column keys for rendering table headers */
export const LOG_COLUMN_ORDER: readonly LogColumnKey[] = [ export const LOG_COLUMN_ORDER: readonly LogColumnKey[] = [
'date', 'date',
'time', 'time',
@@ -30,7 +27,6 @@ export const LOG_COLUMN_ORDER: readonly LogColumnKey[] = [
'duration', 'duration',
] as const ] as const
/** Possible execution status values for workflow logs */
export type LogStatus = 'error' | 'pending' | 'running' | 'info' | 'cancelled' export type LogStatus = 'error' | 'pending' | 'running' | 'info' | 'cancelled'
/** /**
@@ -53,30 +49,28 @@ export function getDisplayStatus(status: string | null | undefined): LogStatus {
} }
} }
/** Configuration mapping log status to Badge variant and display label */ export const STATUS_CONFIG: Record<
const STATUS_VARIANT_MAP: Record<
LogStatus, LogStatus,
{ variant: React.ComponentProps<typeof Badge>['variant']; label: string } { variant: React.ComponentProps<typeof Badge>['variant']; label: string; color: string }
> = { > = {
error: { variant: 'red', label: 'Error' }, error: { variant: 'red', label: 'Error', color: 'var(--text-error)' },
pending: { variant: 'amber', label: 'Pending' }, pending: { variant: 'amber', label: 'Pending', color: '#f59e0b' },
running: { variant: 'green', label: 'Running' }, running: { variant: 'green', label: 'Running', color: '#22c55e' },
cancelled: { variant: 'gray', label: 'Cancelled' }, cancelled: { variant: 'orange', label: 'Cancelled', color: '#f97316' },
info: { variant: 'gray', label: 'Info' }, info: { variant: 'gray', label: 'Info', color: 'var(--terminal-status-info-color)' },
} }
/** Configuration mapping core trigger types to Badge color variants */
const TRIGGER_VARIANT_MAP: Record<string, React.ComponentProps<typeof Badge>['variant']> = { const TRIGGER_VARIANT_MAP: Record<string, React.ComponentProps<typeof Badge>['variant']> = {
manual: 'gray-secondary', manual: 'gray-secondary',
api: 'blue', api: 'blue',
schedule: 'green', schedule: 'green',
chat: 'purple', chat: 'purple',
webhook: 'orange', webhook: 'orange',
mcp: 'cyan',
a2a: 'teal', a2a: 'teal',
} }
interface StatusBadgeProps { interface StatusBadgeProps {
/** The execution status to display */
status: LogStatus status: LogStatus
} }
@@ -86,14 +80,13 @@ interface StatusBadgeProps {
* @returns A Badge with dot indicator and status label * @returns A Badge with dot indicator and status label
*/ */
export const StatusBadge = React.memo(({ status }: StatusBadgeProps) => { export const StatusBadge = React.memo(({ status }: StatusBadgeProps) => {
const config = STATUS_VARIANT_MAP[status] const config = STATUS_CONFIG[status]
return React.createElement(Badge, { variant: config.variant, dot: true }, config.label) return React.createElement(Badge, { variant: config.variant, dot: true }, config.label)
}) })
StatusBadge.displayName = 'StatusBadge' StatusBadge.displayName = 'StatusBadge'
interface TriggerBadgeProps { interface TriggerBadgeProps {
/** The trigger type identifier (e.g., 'manual', 'api', or integration block type) */
trigger: string trigger: string
} }
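
Taken together with the LogsToolbar hunk earlier, the STATUS_CONFIG record above becomes the single source of truth for statuses: the filter combobox and the selected-status indicator color are both derived from it, so adding a status needs one new entry. A self-contained sketch of that derivation (Badge variants omitted for brevity):

type LogStatus = 'error' | 'pending' | 'running' | 'info' | 'cancelled'

// One record drives the badge label, the filter option, and the indicator color.
const STATUS_CONFIG: Record<LogStatus, { label: string; color: string }> = {
  error: { label: 'Error', color: 'var(--text-error)' },
  pending: { label: 'Pending', color: '#f59e0b' },
  running: { label: 'Running', color: '#22c55e' },
  cancelled: { label: 'Cancelled', color: '#f97316' },
  info: { label: 'Info', color: 'var(--terminal-status-info-color)' },
}

// Filter options for the status combobox, derived rather than hand-written.
const statusOptions = (Object.keys(STATUS_CONFIG) as LogStatus[]).map((status) => ({
  value: status,
  label: STATUS_CONFIG[status].label,
}))

// Indicator color when exactly one status filter is active.
function selectedStatusColor(selected: LogStatus[]): string | null {
  return selected.length === 1 ? (STATUS_CONFIG[selected[0]]?.color ?? null) : null
}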

View File

@@ -213,7 +213,6 @@ function TemplateCardInner({
isPannable={false} isPannable={false}
defaultZoom={0.8} defaultZoom={0.8}
fitPadding={0.2} fitPadding={0.2}
lightweight
cursorStyle='pointer' cursorStyle='pointer'
/> />
) : ( ) : (

View File

@@ -2,12 +2,12 @@ import { memo, useCallback } from 'react'
import { ArrowLeftRight, ArrowUpDown, Circle, CircleOff, LogOut } from 'lucide-react' import { ArrowLeftRight, ArrowUpDown, Circle, CircleOff, LogOut } from 'lucide-react'
import { Button, Copy, Tooltip, Trash2 } from '@/components/emcn' import { Button, Copy, Tooltip, Trash2 } from '@/components/emcn'
import { cn } from '@/lib/core/utils/cn' import { cn } from '@/lib/core/utils/cn'
import { isValidStartBlockType } from '@/lib/workflows/triggers/start-block-types' import { isInputDefinitionTrigger } from '@/lib/workflows/triggers/input-definition-triggers'
import { useUserPermissionsContext } from '@/app/workspace/[workspaceId]/providers/workspace-permissions-provider' import { useUserPermissionsContext } from '@/app/workspace/[workspaceId]/providers/workspace-permissions-provider'
import { validateTriggerPaste } from '@/app/workspace/[workspaceId]/w/[workflowId]/utils'
import { useCollaborativeWorkflow } from '@/hooks/use-collaborative-workflow' import { useCollaborativeWorkflow } from '@/hooks/use-collaborative-workflow'
import { useNotificationStore } from '@/stores/notifications'
import { useWorkflowRegistry } from '@/stores/workflows/registry/store' import { useWorkflowRegistry } from '@/stores/workflows/registry/store'
import { useSubBlockStore } from '@/stores/workflows/subblock/store'
import { getUniqueBlockName, prepareDuplicateBlockState } from '@/stores/workflows/utils'
import { useWorkflowStore } from '@/stores/workflows/workflow/store' import { useWorkflowStore } from '@/stores/workflows/workflow/store'
const DEFAULT_DUPLICATE_OFFSET = { x: 50, y: 50 } const DEFAULT_DUPLICATE_OFFSET = { x: 50, y: 50 }
@@ -48,29 +48,38 @@ export const ActionBar = memo(
collaborativeBatchToggleBlockEnabled, collaborativeBatchToggleBlockEnabled,
collaborativeBatchToggleBlockHandles, collaborativeBatchToggleBlockHandles,
} = useCollaborativeWorkflow() } = useCollaborativeWorkflow()
const { activeWorkflowId, setPendingSelection } = useWorkflowRegistry() const { setPendingSelection } = useWorkflowRegistry()
const addNotification = useNotificationStore((s) => s.addNotification)
const handleDuplicateBlock = useCallback(() => { const handleDuplicateBlock = useCallback(() => {
const blocks = useWorkflowStore.getState().blocks const { copyBlocks, preparePasteData, activeWorkflowId } = useWorkflowRegistry.getState()
const sourceBlock = blocks[blockId] const existingBlocks = useWorkflowStore.getState().blocks
if (!sourceBlock) return copyBlocks([blockId])
const newId = crypto.randomUUID() const pasteData = preparePasteData(DEFAULT_DUPLICATE_OFFSET)
const newName = getUniqueBlockName(sourceBlock.name, blocks) if (!pasteData) return
const subBlockValues =
useSubBlockStore.getState().workflowValues[activeWorkflowId || '']?.[blockId] || {}
const { block, subBlockValues: filteredValues } = prepareDuplicateBlockState({ const blocks = Object.values(pasteData.blocks)
sourceBlock, const validation = validateTriggerPaste(blocks, existingBlocks, 'duplicate')
newId, if (!validation.isValid) {
newName, addNotification({
positionOffset: DEFAULT_DUPLICATE_OFFSET, level: 'error',
subBlockValues, message: validation.message!,
}) workflowId: activeWorkflowId || undefined,
})
return
}
setPendingSelection([newId]) setPendingSelection(blocks.map((b) => b.id))
collaborativeBatchAddBlocks([block], [], {}, {}, { [newId]: filteredValues }) collaborativeBatchAddBlocks(
}, [blockId, activeWorkflowId, collaborativeBatchAddBlocks, setPendingSelection]) blocks,
pasteData.edges,
pasteData.loops,
pasteData.parallels,
pasteData.subBlockValues
)
}, [blockId, addNotification, collaborativeBatchAddBlocks, setPendingSelection])
const { isEnabled, horizontalHandles, parentId, parentType } = useWorkflowStore( const { isEnabled, horizontalHandles, parentId, parentType } = useWorkflowStore(
useCallback( useCallback(
@@ -90,7 +99,7 @@ export const ActionBar = memo(
const userPermissions = useUserPermissionsContext() const userPermissions = useUserPermissionsContext()
const isStartBlock = isValidStartBlockType(blockType) const isStartBlock = isInputDefinitionTrigger(blockType)
const isResponseBlock = blockType === 'response' const isResponseBlock = blockType === 'response'
const isNoteBlock = blockType === 'note' const isNoteBlock = blockType === 'note'
const isSubflowBlock = blockType === 'loop' || blockType === 'parallel' const isSubflowBlock = blockType === 'loop' || blockType === 'parallel'
@@ -142,7 +151,7 @@ export const ActionBar = memo(
</Tooltip.Root> </Tooltip.Root>
)} )}
{!isStartBlock && !isResponseBlock && !isSubflowBlock && ( {!isStartBlock && !isResponseBlock && (
<Tooltip.Root> <Tooltip.Root>
<Tooltip.Trigger asChild> <Tooltip.Trigger asChild>
<Button <Button
@@ -213,6 +222,29 @@ export const ActionBar = memo(
</Tooltip.Root> </Tooltip.Root>
)} )}
{isSubflowBlock && (
<Tooltip.Root>
<Tooltip.Trigger asChild>
<Button
variant='ghost'
onClick={(e) => {
e.stopPropagation()
if (!disabled) {
collaborativeBatchToggleBlockEnabled([blockId])
}
}}
className={ACTION_BUTTON_STYLES}
disabled={disabled}
>
{isEnabled ? <Circle className={ICON_SIZE} /> : <CircleOff className={ICON_SIZE} />}
</Button>
</Tooltip.Trigger>
<Tooltip.Content side='top'>
{getTooltipMessage(isEnabled ? 'Disable Block' : 'Enable Block')}
</Tooltip.Content>
</Tooltip.Root>
)}
<Tooltip.Root> <Tooltip.Root>
<Tooltip.Trigger asChild> <Tooltip.Trigger asChild>
<Button <Button
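
Because the split view interleaves the old inline-duplicate path with the new one, the new flow is easier to read on its own: duplication now goes through the same copy/paste machinery as manual paste, so the paste data can include nested blocks, edges, loops, and parallels, and trigger constraints are checked before anything is added. Condensed from the hunk above, reusing the identifiers imported there (not standalone):

const handleDuplicateBlock = useCallback(() => {
  const { copyBlocks, preparePasteData, activeWorkflowId } = useWorkflowRegistry.getState()
  const existingBlocks = useWorkflowStore.getState().blocks

  // Stage the block on the registry clipboard; preparePasteData returns the full set to insert.
  copyBlocks([blockId])
  const pasteData = preparePasteData(DEFAULT_DUPLICATE_OFFSET)
  if (!pasteData) return

  // Reject duplicates that would violate trigger rules.
  const blocks = Object.values(pasteData.blocks)
  const validation = validateTriggerPaste(blocks, existingBlocks, 'duplicate')
  if (!validation.isValid) {
    addNotification({
      level: 'error',
      message: validation.message!,
      workflowId: activeWorkflowId || undefined,
    })
    return
  }

  setPendingSelection(blocks.map((b) => b.id))
  collaborativeBatchAddBlocks(
    blocks,
    pasteData.edges,
    pasteData.loops,
    pasteData.parallels,
    pasteData.subBlockValues
  )
}, [blockId, addNotification, collaborativeBatchAddBlocks, setPendingSelection])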

View File

@@ -8,7 +8,7 @@ import {
PopoverDivider, PopoverDivider,
PopoverItem, PopoverItem,
} from '@/components/emcn' } from '@/components/emcn'
import { isValidStartBlockType } from '@/lib/workflows/triggers/start-block-types' import { TriggerUtils } from '@/lib/workflows/triggers/triggers'
/** /**
* Block information for context menu actions * Block information for context menu actions
@@ -74,12 +74,16 @@ export function BlockMenu({
const allEnabled = selectedBlocks.every((b) => b.enabled) const allEnabled = selectedBlocks.every((b) => b.enabled)
const allDisabled = selectedBlocks.every((b) => !b.enabled) const allDisabled = selectedBlocks.every((b) => !b.enabled)
const hasStarterBlock = selectedBlocks.some((b) => isValidStartBlockType(b.type)) const hasSingletonBlock = selectedBlocks.some(
(b) =>
TriggerUtils.requiresSingleInstance(b.type) || TriggerUtils.isSingleInstanceBlockType(b.type)
)
const hasTriggerBlock = selectedBlocks.some((b) => TriggerUtils.isTriggerBlock(b))
const allNoteBlocks = selectedBlocks.every((b) => b.type === 'note') const allNoteBlocks = selectedBlocks.every((b) => b.type === 'note')
const isSubflow = const isSubflow =
isSingleBlock && (selectedBlocks[0]?.type === 'loop' || selectedBlocks[0]?.type === 'parallel') isSingleBlock && (selectedBlocks[0]?.type === 'loop' || selectedBlocks[0]?.type === 'parallel')
const canRemoveFromSubflow = showRemoveFromSubflow && !hasStarterBlock const canRemoveFromSubflow = showRemoveFromSubflow && !hasTriggerBlock
const getToggleEnabledLabel = () => { const getToggleEnabledLabel = () => {
if (allEnabled) return 'Disable' if (allEnabled) return 'Disable'
@@ -127,7 +131,7 @@ export function BlockMenu({
<span>Paste</span> <span>Paste</span>
<span className='ml-auto opacity-70 group-hover:opacity-100'>V</span> <span className='ml-auto opacity-70 group-hover:opacity-100'>V</span>
</PopoverItem> </PopoverItem>
{!hasStarterBlock && ( {!hasSingletonBlock && (
<PopoverItem <PopoverItem
disabled={disableEdit} disabled={disableEdit}
onClick={() => { onClick={() => {

View File

@@ -26,7 +26,6 @@ export interface CanvasMenuProps {
onOpenLogs: () => void onOpenLogs: () => void
onToggleVariables: () => void onToggleVariables: () => void
onToggleChat: () => void onToggleChat: () => void
onInvite: () => void
isVariablesOpen?: boolean isVariablesOpen?: boolean
isChatOpen?: boolean isChatOpen?: boolean
hasClipboard?: boolean hasClipboard?: boolean
@@ -55,15 +54,12 @@ export function CanvasMenu({
onOpenLogs, onOpenLogs,
onToggleVariables, onToggleVariables,
onToggleChat, onToggleChat,
onInvite,
isVariablesOpen = false, isVariablesOpen = false,
isChatOpen = false, isChatOpen = false,
hasClipboard = false, hasClipboard = false,
disableEdit = false, disableEdit = false,
disableAdmin = false,
canUndo = false, canUndo = false,
canRedo = false, canRedo = false,
isInvitationsDisabled = false,
}: CanvasMenuProps) { }: CanvasMenuProps) {
return ( return (
<Popover <Popover
@@ -179,22 +175,6 @@ export function CanvasMenu({
> >
{isChatOpen ? 'Close Chat' : 'Open Chat'} {isChatOpen ? 'Close Chat' : 'Open Chat'}
</PopoverItem> </PopoverItem>
{/* Admin action - hidden when invitations are disabled */}
{!isInvitationsDisabled && (
<>
<PopoverDivider />
<PopoverItem
disabled={disableAdmin}
onClick={() => {
onInvite()
onClose()
}}
>
Invite to Workspace
</PopoverItem>
</>
)}
</PopoverContent> </PopoverContent>
</Popover> </Popover>
) )

View File

@@ -886,17 +886,16 @@ export function Chat() {
onMouseDown={(e) => e.stopPropagation()} onMouseDown={(e) => e.stopPropagation()}
> >
{shouldShowConfigureStartInputsButton && ( {shouldShowConfigureStartInputsButton && (
<Badge <div
variant='outline' className='flex flex-none cursor-pointer items-center whitespace-nowrap rounded-[6px] border border-[var(--border-1)] bg-[var(--surface-5)] px-[9px] py-[2px] font-medium font-sans text-[12px] text-[var(--text-primary)] hover:bg-[var(--surface-7)] dark:hover:border-[var(--surface-7)] dark:hover:bg-[var(--border-1)]'
className='flex-none cursor-pointer whitespace-nowrap rounded-[6px]'
title='Add chat inputs to Start block' title='Add chat inputs to Start block'
onMouseDown={(e) => { onMouseDown={(e) => {
e.stopPropagation() e.stopPropagation()
handleConfigureStartInputs() handleConfigureStartInputs()
}} }}
> >
<span className='whitespace-nowrap text-[12px]'>Add inputs</span> <span className='whitespace-nowrap'>Add inputs</span>
</Badge> </div>
)} )}
<OutputSelect <OutputSelect

View File

@@ -129,10 +129,6 @@ export function OutputSelect({
? baselineWorkflow.blocks?.[block.id]?.subBlocks?.responseFormat?.value ? baselineWorkflow.blocks?.[block.id]?.subBlocks?.responseFormat?.value
: subBlockValues?.[block.id]?.responseFormat : subBlockValues?.[block.id]?.responseFormat
const responseFormat = parseResponseFormatSafely(responseFormatValue, block.id) const responseFormat = parseResponseFormatSafely(responseFormatValue, block.id)
const operationValue =
shouldUseBaseline && baselineWorkflow
? baselineWorkflow.blocks?.[block.id]?.subBlocks?.operation?.value
: subBlockValues?.[block.id]?.operation
let outputsToProcess: Record<string, unknown> = {} let outputsToProcess: Record<string, unknown> = {}
@@ -146,10 +142,20 @@ export function OutputSelect({
outputsToProcess = blockConfig?.outputs || {} outputsToProcess = blockConfig?.outputs || {}
} }
} else { } else {
const toolOutputs = // Build subBlocks object for tool selector
blockConfig && typeof operationValue === 'string' const rawSubBlockValues =
? getToolOutputs(blockConfig, operationValue) shouldUseBaseline && baselineWorkflow
: {} ? baselineWorkflow.blocks?.[block.id]?.subBlocks
: subBlockValues?.[block.id]
const subBlocks: Record<string, { value: unknown }> = {}
if (rawSubBlockValues && typeof rawSubBlockValues === 'object') {
for (const [key, val] of Object.entries(rawSubBlockValues)) {
// Handle both { value: ... } and raw value formats
subBlocks[key] = val && typeof val === 'object' && 'value' in val ? val : { value: val }
}
}
const toolOutputs = blockConfig ? getToolOutputs(blockConfig, subBlocks) : {}
outputsToProcess = outputsToProcess =
Object.keys(toolOutputs).length > 0 ? toolOutputs : blockConfig?.outputs || {} Object.keys(toolOutputs).length > 0 ? toolOutputs : blockConfig?.outputs || {}
} }
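
A self-contained sketch of the normalization step added above: subBlock values can arrive either wrapped as { value } or as raw values, and both are coerced into the wrapper shape getToolOutputs expects (which source uses which shape is an inference from the two branches shown):

function normalizeSubBlocks(raw: Record<string, unknown> | undefined): Record<string, { value: unknown }> {
  const subBlocks: Record<string, { value: unknown }> = {}
  if (raw && typeof raw === 'object') {
    for (const [key, val] of Object.entries(raw)) {
      // Pass wrapped entries through; wrap anything else.
      subBlocks[key] = val && typeof val === 'object' && 'value' in val
        ? (val as { value: unknown })
        : { value: val }
    }
  }
  return subBlocks
}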

View File

@@ -138,18 +138,24 @@ export const Notifications = memo(function Notifications() {
}`} }`}
> >
<div className='flex h-full flex-col justify-between px-[8px] pt-[6px] pb-[8px]'> <div className='flex h-full flex-col justify-between px-[8px] pt-[6px] pb-[8px]'>
<div <div className='flex items-start gap-[8px]'>
className={`font-medium text-[12px] leading-[16px] ${ <div
hasAction ? 'line-clamp-2' : 'line-clamp-4' className={`min-w-0 flex-1 font-medium text-[12px] leading-[16px] ${
}`} hasAction ? 'line-clamp-2' : 'line-clamp-4'
> }`}
>
{notification.level === 'error' && (
<span className='mr-[6px] mb-[2.75px] inline-block h-[6px] w-[6px] rounded-[2px] bg-[var(--text-error)] align-middle' />
)}
{notification.message}
</div>
<Tooltip.Root> <Tooltip.Root>
<Tooltip.Trigger asChild> <Tooltip.Trigger asChild>
<Button <Button
variant='ghost' variant='ghost'
onClick={() => removeNotification(notification.id)} onClick={() => removeNotification(notification.id)}
aria-label='Dismiss notification' aria-label='Dismiss notification'
className='!p-1.5 -m-1.5 float-right ml-[16px]' className='!p-1.5 -m-1.5 shrink-0'
> >
<X className='h-3 w-3' /> <X className='h-3 w-3' />
</Button> </Button>
@@ -158,10 +164,6 @@ export const Notifications = memo(function Notifications() {
<Tooltip.Shortcut keys='⌘E'>Clear all</Tooltip.Shortcut> <Tooltip.Shortcut keys='⌘E'>Clear all</Tooltip.Shortcut>
</Tooltip.Content> </Tooltip.Content>
</Tooltip.Root> </Tooltip.Root>
{notification.level === 'error' && (
<span className='mr-[6px] mb-[2.75px] inline-block h-[6px] w-[6px] rounded-[2px] bg-[var(--text-error)] align-middle' />
)}
{notification.message}
</div> </div>
{hasAction && ( {hasAction && (
<Button <Button

View File

@@ -5,7 +5,6 @@ import { createLogger } from '@sim/logger'
import { Check, Clipboard } from 'lucide-react' import { Check, Clipboard } from 'lucide-react'
import { useParams } from 'next/navigation' import { useParams } from 'next/navigation'
import { import {
Badge,
Button, Button,
ButtonGroup, ButtonGroup,
ButtonGroupItem, ButtonGroupItem,
@@ -883,14 +882,13 @@ console.log(data);`
<code className='text-[10px]'>&lt;start.files&gt;</code>. <code className='text-[10px]'>&lt;start.files&gt;</code>.
</p> </p>
{missingFields.any && ( {missingFields.any && (
<Badge <div
variant='outline' className='flex flex-none cursor-pointer items-center whitespace-nowrap rounded-[6px] border border-[var(--border-1)] bg-[var(--surface-5)] px-[9px] py-[2px] font-medium font-sans text-[12px] text-[var(--text-primary)] hover:bg-[var(--surface-7)] dark:hover:border-[var(--surface-7)] dark:hover:bg-[var(--border-1)]'
className='flex-none cursor-pointer whitespace-nowrap rounded-[6px]'
title='Add required A2A input fields to Start block' title='Add required A2A input fields to Start block'
onClick={handleAddA2AInputs} onClick={handleAddA2AInputs}
> >
<span className='whitespace-nowrap text-[12px]'>Add inputs</span> <span className='whitespace-nowrap'>Add inputs</span>
</Badge> </div>
)} )}
</div> </div>
</div> </div>

View File

@@ -17,7 +17,7 @@ import { Skeleton } from '@/components/ui'
import { isDev } from '@/lib/core/config/feature-flags' import { isDev } from '@/lib/core/config/feature-flags'
import { cn } from '@/lib/core/utils/cn' import { cn } from '@/lib/core/utils/cn'
import { getBaseUrl, getEmailDomain } from '@/lib/core/utils/urls' import { getBaseUrl, getEmailDomain } from '@/lib/core/utils/urls'
import { isValidStartBlockType } from '@/lib/workflows/triggers/start-block-types' import { isInputDefinitionTrigger } from '@/lib/workflows/triggers/input-definition-triggers'
import { import {
type FieldConfig, type FieldConfig,
useCreateForm, useCreateForm,
@@ -147,7 +147,7 @@ export function FormDeploy({
useEffect(() => { useEffect(() => {
const blocks = Object.values(useWorkflowStore.getState().blocks) const blocks = Object.values(useWorkflowStore.getState().blocks)
const startBlock = blocks.find((b) => isValidStartBlockType(b.type)) const startBlock = blocks.find((b) => isInputDefinitionTrigger(b.type))
if (startBlock) { if (startBlock) {
const inputFormat = useSubBlockStore.getState().getValue(startBlock.id, 'inputFormat') const inputFormat = useSubBlockStore.getState().getValue(startBlock.id, 'inputFormat')

View File

@@ -14,7 +14,7 @@ import {
Textarea, Textarea,
} from '@/components/emcn' } from '@/components/emcn'
import { normalizeInputFormatValue } from '@/lib/workflows/input-format' import { normalizeInputFormatValue } from '@/lib/workflows/input-format'
import { isValidStartBlockType } from '@/lib/workflows/triggers/start-block-types' import { isInputDefinitionTrigger } from '@/lib/workflows/triggers/input-definition-triggers'
import type { InputFormatField } from '@/lib/workflows/types' import type { InputFormatField } from '@/lib/workflows/types'
import { useWorkflowRegistry } from '@/stores/workflows/registry/store' import { useWorkflowRegistry } from '@/stores/workflows/registry/store'
import { useSubBlockStore } from '@/stores/workflows/subblock/store' import { useSubBlockStore } from '@/stores/workflows/subblock/store'
@@ -52,7 +52,7 @@ export function ApiInfoModal({ open, onOpenChange, workflowId }: ApiInfoModalPro
for (const [blockId, block] of Object.entries(blocks)) { for (const [blockId, block] of Object.entries(blocks)) {
if (!block || typeof block !== 'object') continue if (!block || typeof block !== 'object') continue
const blockType = (block as { type?: string }).type const blockType = (block as { type?: string }).type
if (blockType && isValidStartBlockType(blockType)) { if (blockType && isInputDefinitionTrigger(blockType)) {
return blockId return blockId
} }
} }

View File

@@ -18,8 +18,8 @@ import {
import { Skeleton } from '@/components/ui' import { Skeleton } from '@/components/ui'
import type { WorkflowDeploymentVersionResponse } from '@/lib/workflows/persistence/utils' import type { WorkflowDeploymentVersionResponse } from '@/lib/workflows/persistence/utils'
import { import {
BlockDetailsSidebar,
getLeftmostBlockId, getLeftmostBlockId,
PreviewEditor,
WorkflowPreview, WorkflowPreview,
} from '@/app/workspace/[workspaceId]/w/components/preview' } from '@/app/workspace/[workspaceId]/w/components/preview'
import { useDeploymentVersionState, useRevertToVersion } from '@/hooks/queries/workflows' import { useDeploymentVersionState, useRevertToVersion } from '@/hooks/queries/workflows'
@@ -337,7 +337,7 @@ export function GeneralDeploy({
/> />
</div> </div>
{expandedSelectedBlockId && workflowToShow.blocks?.[expandedSelectedBlockId] && ( {expandedSelectedBlockId && workflowToShow.blocks?.[expandedSelectedBlockId] && (
<BlockDetailsSidebar <PreviewEditor
block={workflowToShow.blocks[expandedSelectedBlockId]} block={workflowToShow.blocks[expandedSelectedBlockId]}
workflowVariables={workflowToShow.variables} workflowVariables={workflowToShow.variables}
loops={workflowToShow.loops} loops={workflowToShow.loops}

View File

@@ -15,7 +15,7 @@ import {
import { Skeleton } from '@/components/ui' import { Skeleton } from '@/components/ui'
import { generateToolInputSchema, sanitizeToolName } from '@/lib/mcp/workflow-tool-schema' import { generateToolInputSchema, sanitizeToolName } from '@/lib/mcp/workflow-tool-schema'
import { normalizeInputFormatValue } from '@/lib/workflows/input-format' import { normalizeInputFormatValue } from '@/lib/workflows/input-format'
import { isValidStartBlockType } from '@/lib/workflows/triggers/start-block-types' import { isInputDefinitionTrigger } from '@/lib/workflows/triggers/input-definition-triggers'
import type { InputFormatField } from '@/lib/workflows/types' import type { InputFormatField } from '@/lib/workflows/types'
import { import {
useAddWorkflowMcpTool, useAddWorkflowMcpTool,
@@ -107,7 +107,7 @@ export function McpDeploy({
for (const [blockId, block] of Object.entries(blocks)) { for (const [blockId, block] of Object.entries(blocks)) {
if (!block || typeof block !== 'object') continue if (!block || typeof block !== 'object') continue
const blockType = (block as { type?: string }).type const blockType = (block as { type?: string }).type
if (blockType && isValidStartBlockType(blockType)) { if (blockType && isInputDefinitionTrigger(blockType)) {
return blockId return blockId
} }
} }
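
The deploy surfaces above (form, API info, and MCP) now share the same lookup: find the first block whose type defines workflow inputs, then read its inputFormat. A condensed sketch using the identifiers imported in those hunks (not standalone):

function findInputDefinitionBlockId(blocks: Record<string, { type?: string }>): string | null {
  for (const [blockId, block] of Object.entries(blocks)) {
    if (!block || typeof block !== 'object') continue
    if (block.type && isInputDefinitionTrigger(block.type)) return blockId
  }
  return null
}

const startBlockId = findInputDefinitionBlockId(useWorkflowStore.getState().blocks)
const inputFormat = startBlockId
  ? useSubBlockStore.getState().getValue(startBlockId, 'inputFormat')
  : undefined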

View File

@@ -446,7 +446,6 @@ const OGCaptureContainer = forwardRef<HTMLDivElement>((_, ref) => {
isPannable={false} isPannable={false}
defaultZoom={0.8} defaultZoom={0.8}
fitPadding={0.2} fitPadding={0.2}
lightweight
/> />
</div> </div>
) )

View File

@@ -35,9 +35,9 @@ import { WandPromptBar } from '@/app/workspace/[workspaceId]/w/[workflowId]/comp
import { useAccessibleReferencePrefixes } from '@/app/workspace/[workspaceId]/w/[workflowId]/hooks/use-accessible-reference-prefixes' import { useAccessibleReferencePrefixes } from '@/app/workspace/[workspaceId]/w/[workflowId]/hooks/use-accessible-reference-prefixes'
import { useWand } from '@/app/workspace/[workspaceId]/w/[workflowId]/hooks/use-wand' import { useWand } from '@/app/workspace/[workspaceId]/w/[workflowId]/hooks/use-wand'
import type { GenerationType } from '@/blocks/types' import type { GenerationType } from '@/blocks/types'
import { normalizeName } from '@/executor/constants'
import { createEnvVarPattern, createReferencePattern } from '@/executor/utils/reference-validation' import { createEnvVarPattern, createReferencePattern } from '@/executor/utils/reference-validation'
import { useTagSelection } from '@/hooks/kb/use-tag-selection' import { useTagSelection } from '@/hooks/kb/use-tag-selection'
import { normalizeName } from '@/stores/workflows/utils'
const logger = createLogger('Code') const logger = createLogger('Code')

View File

@@ -1,6 +1,7 @@
import { memo, useCallback, useEffect, useMemo, useRef, useState } from 'react' import { memo, useCallback, useEffect, useMemo, useRef, useState } from 'react'
import { isEqual } from 'lodash' import { isEqual } from 'lodash'
import { useReactFlow } from 'reactflow' import { useReactFlow } from 'reactflow'
import { useStoreWithEqualityFn } from 'zustand/traditional'
import { Combobox, type ComboboxOption } from '@/components/emcn/components' import { Combobox, type ComboboxOption } from '@/components/emcn/components'
import { cn } from '@/lib/core/utils/cn' import { cn } from '@/lib/core/utils/cn'
import { buildCanonicalIndex, resolveDependencyValue } from '@/lib/workflows/subblocks/visibility' import { buildCanonicalIndex, resolveDependencyValue } from '@/lib/workflows/subblocks/visibility'
@@ -102,7 +103,8 @@ export const ComboBox = memo(function ComboBox({
[blockConfig?.subBlocks] [blockConfig?.subBlocks]
) )
const canonicalModeOverrides = blockState?.data?.canonicalModes const canonicalModeOverrides = blockState?.data?.canonicalModes
const dependencyValues = useSubBlockStore( const dependencyValues = useStoreWithEqualityFn(
useSubBlockStore,
useCallback( useCallback(
(state) => { (state) => {
if (dependsOnFields.length === 0 || !activeWorkflowId) return [] if (dependsOnFields.length === 0 || !activeWorkflowId) return []

View File

@@ -32,9 +32,9 @@ import {
} from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/components/tag-dropdown/tag-dropdown' } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/components/tag-dropdown/tag-dropdown'
import { useSubBlockValue } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/hooks/use-sub-block-value' import { useSubBlockValue } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/hooks/use-sub-block-value'
import { useAccessibleReferencePrefixes } from '@/app/workspace/[workspaceId]/w/[workflowId]/hooks/use-accessible-reference-prefixes' import { useAccessibleReferencePrefixes } from '@/app/workspace/[workspaceId]/w/[workflowId]/hooks/use-accessible-reference-prefixes'
import { normalizeName } from '@/executor/constants'
import { createEnvVarPattern, createReferencePattern } from '@/executor/utils/reference-validation' import { createEnvVarPattern, createReferencePattern } from '@/executor/utils/reference-validation'
import { useTagSelection } from '@/hooks/kb/use-tag-selection' import { useTagSelection } from '@/hooks/kb/use-tag-selection'
import { normalizeName } from '@/stores/workflows/utils'
import { useWorkflowStore } from '@/stores/workflows/workflow/store' import { useWorkflowStore } from '@/stores/workflows/workflow/store'
const logger = createLogger('ConditionInput') const logger = createLogger('ConditionInput')

View File

@@ -1,5 +1,6 @@
import { memo, useCallback, useEffect, useMemo, useRef, useState } from 'react' import { memo, useCallback, useEffect, useMemo, useRef, useState } from 'react'
import { isEqual } from 'lodash' import { isEqual } from 'lodash'
import { useStoreWithEqualityFn } from 'zustand/traditional'
import { Badge } from '@/components/emcn' import { Badge } from '@/components/emcn'
import { Combobox, type ComboboxOption } from '@/components/emcn/components' import { Combobox, type ComboboxOption } from '@/components/emcn/components'
import { buildCanonicalIndex, resolveDependencyValue } from '@/lib/workflows/subblocks/visibility' import { buildCanonicalIndex, resolveDependencyValue } from '@/lib/workflows/subblocks/visibility'
@@ -100,7 +101,8 @@ export const Dropdown = memo(function Dropdown({
[blockConfig?.subBlocks] [blockConfig?.subBlocks]
) )
const canonicalModeOverrides = blockState?.data?.canonicalModes const canonicalModeOverrides = blockState?.data?.canonicalModes
const dependencyValues = useSubBlockStore( const dependencyValues = useStoreWithEqualityFn(
useSubBlockStore,
useCallback( useCallback(
(state) => { (state) => {
if (dependsOnFields.length === 0 || !activeWorkflowId) return [] if (dependsOnFields.length === 0 || !activeWorkflowId) return []
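
Both dependency-value selectors above build a fresh array on every run, so they need an equality check to avoid spurious re-renders; the hunks move that from calling the store hook directly to useStoreWithEqualityFn. A sketch of the resulting pattern; the isEqual comparator is an assumption, since the hunks only show the first two arguments:

import { isEqual } from 'lodash'
import { useStoreWithEqualityFn } from 'zustand/traditional'
import { useSubBlockStore } from '@/stores/workflows/subblock/store'

// Select a tuple of dependency values with deep-equality memoization so a new
// array of identical values does not trigger a re-render.
function useDependencyValues(workflowId: string | null, blockId: string, fields: string[]) {
  return useStoreWithEqualityFn(
    useSubBlockStore,
    (state) => {
      if (fields.length === 0 || !workflowId) return []
      const values = state.workflowValues[workflowId]?.[blockId] ?? {}
      return fields.map((field) => values[field])
    },
    isEqual
  )
}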

View File

@@ -8,9 +8,10 @@ import { Button, Combobox } from '@/components/emcn/components'
import { Progress } from '@/components/ui/progress' import { Progress } from '@/components/ui/progress'
import { cn } from '@/lib/core/utils/cn' import { cn } from '@/lib/core/utils/cn'
import type { WorkspaceFileRecord } from '@/lib/uploads/contexts/workspace' import type { WorkspaceFileRecord } from '@/lib/uploads/contexts/workspace'
import { getExtensionFromMimeType } from '@/lib/uploads/utils/file-utils'
import { useSubBlockValue } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/hooks/use-sub-block-value'
import { useWorkflowRegistry } from '@/stores/workflows/registry/store' import { useWorkflowRegistry } from '@/stores/workflows/registry/store'
import { useWorkflowStore } from '@/stores/workflows/workflow/store' import { useWorkflowStore } from '@/stores/workflows/workflow/store'
import { useSubBlockValue } from '../../hooks/use-sub-block-value'
const logger = createLogger('FileUpload') const logger = createLogger('FileUpload')
@@ -85,14 +86,47 @@ export function FileUpload({
} }
} }
/**
* Checks if a file's MIME type matches the accepted types
* Supports exact matches, wildcard patterns (e.g., 'image/*'), and '*' for all types
*/
const isFileTypeAccepted = (fileType: string | undefined, accepted: string): boolean => {
if (accepted === '*') return true
if (!fileType) return false
const acceptedList = accepted.split(',').map((t) => t.trim().toLowerCase())
const normalizedFileType = fileType.toLowerCase()
return acceptedList.some((acceptedType) => {
if (acceptedType === normalizedFileType) return true
if (acceptedType.endsWith('/*')) {
const typePrefix = acceptedType.slice(0, -1) // 'image/' from 'image/*'
return normalizedFileType.startsWith(typePrefix)
}
if (acceptedType.startsWith('.')) {
const extension = acceptedType.slice(1).toLowerCase()
const fileExtension = getExtensionFromMimeType(normalizedFileType)
if (fileExtension === extension) return true
return normalizedFileType.endsWith(`/${extension}`)
}
return false
})
}
const availableWorkspaceFiles = workspaceFiles.filter((workspaceFile) => { const availableWorkspaceFiles = workspaceFiles.filter((workspaceFile) => {
const existingFiles = Array.isArray(value) ? value : value ? [value] : [] const existingFiles = Array.isArray(value) ? value : value ? [value] : []
return !existingFiles.some(
const isAlreadySelected = existingFiles.some(
(existing) => (existing) =>
existing.name === workspaceFile.name || existing.name === workspaceFile.name ||
existing.path?.includes(workspaceFile.key) || existing.path?.includes(workspaceFile.key) ||
existing.key === workspaceFile.key existing.key === workspaceFile.key
) )
return !isAlreadySelected
}) })
useEffect(() => { useEffect(() => {
@@ -421,23 +455,23 @@ export function FileUpload({
return ( return (
<div <div
key={fileKey} key={fileKey}
className='flex items-center justify-between rounded-[4px] border border-[var(--border-1)] bg-[var(--surface-5)] px-[8px] py-[6px] hover:border-[var(--surface-7)] hover:bg-[var(--surface-5)] dark:bg-[var(--surface-5)] dark:hover:bg-[var(--border-1)]' className='relative rounded-[4px] border border-[var(--border-1)] bg-[var(--surface-5)] px-[8px] py-[6px] hover:border-[var(--surface-7)] hover:bg-[var(--surface-5)] dark:bg-[var(--surface-5)] dark:hover:bg-[var(--border-1)]'
> >
<div className='flex-1 truncate pr-2 text-sm' title={file.name}> <div className='truncate pr-[24px] text-sm' title={file.name}>
<span className='text-[var(--text-primary)]'>{truncateMiddle(file.name)}</span> <span className='text-[var(--text-primary)]'>{truncateMiddle(file.name)}</span>
<span className='ml-2 text-[var(--text-muted)]'>({formatFileSize(file.size)})</span> <span className='ml-2 text-[var(--text-muted)]'>({formatFileSize(file.size)})</span>
</div> </div>
<Button <Button
type='button' type='button'
variant='ghost' variant='ghost'
className='h-5 w-5 shrink-0 p-0' className='-translate-y-1/2 absolute top-1/2 right-[4px] h-6 w-6 p-0'
onClick={(e) => handleRemoveFile(file, e)} onClick={(e) => handleRemoveFile(file, e)}
disabled={isDeleting} disabled={isDeleting}
> >
{isDeleting ? ( {isDeleting ? (
<div className='h-3.5 w-3.5 animate-spin rounded-full border-[1.5px] border-current border-t-transparent' /> <div className='h-4 w-4 animate-spin rounded-full border-[1.5px] border-current border-t-transparent' />
) : ( ) : (
<X className='h-3.5 w-3.5' /> <X className='h-4 w-4 opacity-50' />
)} )}
</Button> </Button>
</div> </div>
@@ -468,19 +502,30 @@ export function FileUpload({
const comboboxOptions = useMemo( const comboboxOptions = useMemo(
() => [ () => [
{ label: 'Upload New File', value: '__upload_new__' }, { label: 'Upload New File', value: '__upload_new__' },
...availableWorkspaceFiles.map((file) => ({ ...availableWorkspaceFiles.map((file) => {
label: file.name, const isAccepted =
value: file.id, !acceptedTypes || acceptedTypes === '*' || isFileTypeAccepted(file.type, acceptedTypes)
})), return {
label: file.name,
value: file.id,
disabled: !isAccepted,
}
}),
], ],
[availableWorkspaceFiles] [availableWorkspaceFiles, acceptedTypes]
) )
const handleComboboxChange = (value: string) => { const handleComboboxChange = (value: string) => {
setInputValue(value) setInputValue(value)
const isValidOption = const selectedFile = availableWorkspaceFiles.find((file) => file.id === value)
value === '__upload_new__' || availableWorkspaceFiles.some((file) => file.id === value) const isAcceptedType =
selectedFile &&
(!acceptedTypes ||
acceptedTypes === '*' ||
isFileTypeAccepted(selectedFile.type, acceptedTypes))
const isValidOption = value === '__upload_new__' || isAcceptedType
if (!isValidOption) { if (!isValidOption) {
return return
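
A few illustrative calls against the accept-matcher added above, to make its three branches concrete:

isFileTypeAccepted('application/pdf', 'application/pdf, image/png') // true: exact MIME match
isFileTypeAccepted('image/jpeg', 'image/*')                         // true: wildcard family match
isFileTypeAccepted('text/csv', '.csv')                              // true: extension entry, via the MIME lookup or the trailing '/csv' fallback
isFileTypeAccepted(undefined, 'image/*')                            // false: unknown type only passes when accept is '*'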

View File

@@ -2,9 +2,8 @@
import type { ReactNode } from 'react' import type { ReactNode } from 'react'
import { splitReferenceSegment } from '@/lib/workflows/sanitization/references' import { splitReferenceSegment } from '@/lib/workflows/sanitization/references'
import { REFERENCE } from '@/executor/constants' import { normalizeName, REFERENCE } from '@/executor/constants'
import { createCombinedPattern } from '@/executor/utils/reference-validation' import { createCombinedPattern } from '@/executor/utils/reference-validation'
import { normalizeName } from '@/stores/workflows/utils'
export interface HighlightContext { export interface HighlightContext {
accessiblePrefixes?: Set<string> accessiblePrefixes?: Set<string>

View File

@@ -34,3 +34,4 @@ export { Text } from './text/text'
export { TimeInput } from './time-input/time-input' export { TimeInput } from './time-input/time-input'
export { ToolInput } from './tool-input/tool-input' export { ToolInput } from './tool-input/tool-input'
export { VariablesInput } from './variables-input/variables-input' export { VariablesInput } from './variables-input/variables-input'
export { WorkflowSelectorInput } from './workflow-selector/workflow-selector-input'

View File

@@ -2,12 +2,13 @@ import { useMemo, useRef, useState } from 'react'
import { Badge, Input } from '@/components/emcn' import { Badge, Input } from '@/components/emcn'
import { Label } from '@/components/ui/label' import { Label } from '@/components/ui/label'
import { cn } from '@/lib/core/utils/cn' import { cn } from '@/lib/core/utils/cn'
import { extractInputFieldsFromBlocks } from '@/lib/workflows/input-format'
import { formatDisplayText } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/components/formatted-text' import { formatDisplayText } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/components/formatted-text'
import { TagDropdown } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/components/tag-dropdown/tag-dropdown' import { TagDropdown } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/components/tag-dropdown/tag-dropdown'
import { useSubBlockInput } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/hooks/use-sub-block-input' import { useSubBlockInput } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/hooks/use-sub-block-input'
import { useSubBlockValue } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/hooks/use-sub-block-value' import { useSubBlockValue } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/hooks/use-sub-block-value'
import { useAccessibleReferencePrefixes } from '@/app/workspace/[workspaceId]/w/[workflowId]/hooks/use-accessible-reference-prefixes' import { useAccessibleReferencePrefixes } from '@/app/workspace/[workspaceId]/w/[workflowId]/hooks/use-accessible-reference-prefixes'
import { useWorkflowInputFields } from '@/hooks/queries/workflows' import { useWorkflowState } from '@/hooks/queries/workflows'
/** /**
* Props for the InputMappingField component * Props for the InputMappingField component
@@ -70,7 +71,11 @@ export function InputMapping({
const overlayRefs = useRef<Map<string, HTMLDivElement>>(new Map()) const overlayRefs = useRef<Map<string, HTMLDivElement>>(new Map())
const workflowId = typeof selectedWorkflowId === 'string' ? selectedWorkflowId : undefined const workflowId = typeof selectedWorkflowId === 'string' ? selectedWorkflowId : undefined
const { data: childInputFields = [], isLoading } = useWorkflowInputFields(workflowId) const { data: workflowState, isLoading } = useWorkflowState(workflowId)
const childInputFields = useMemo(
() => (workflowState?.blocks ? extractInputFieldsFromBlocks(workflowState.blocks) : []),
[workflowState?.blocks]
)
const [collapsedFields, setCollapsedFields] = useState<Record<string, boolean>>({}) const [collapsedFields, setCollapsedFields] = useState<Record<string, boolean>>({})
const valueObj: Record<string, string> = useMemo(() => { const valueObj: Record<string, string> = useMemo(() => {

View File

@@ -1,4 +1,12 @@
import { useCallback, useEffect, useImperativeHandle, useMemo, useRef, useState } from 'react' import {
useCallback,
useEffect,
useImperativeHandle,
useLayoutEffect,
useMemo,
useRef,
useState,
} from 'react'
import { isEqual } from 'lodash' import { isEqual } from 'lodash'
import { ChevronDown, ChevronsUpDown, ChevronUp, Plus } from 'lucide-react' import { ChevronDown, ChevronsUpDown, ChevronUp, Plus } from 'lucide-react'
import { Button, Popover, PopoverContent, PopoverItem, PopoverTrigger } from '@/components/emcn' import { Button, Popover, PopoverContent, PopoverItem, PopoverTrigger } from '@/components/emcn'
@@ -382,93 +390,138 @@ export function MessagesInput({
textareaRefs.current[fieldId]?.focus() textareaRefs.current[fieldId]?.focus()
}, []) }, [])
const autoResizeTextarea = useCallback((fieldId: string) => { const syncOverlay = useCallback((fieldId: string) => {
const textarea = textareaRefs.current[fieldId] const textarea = textareaRefs.current[fieldId]
if (!textarea) return
const overlay = overlayRefs.current[fieldId] const overlay = overlayRefs.current[fieldId]
if (!textarea || !overlay) return
// If user has manually resized, respect their chosen height and only sync overlay. overlay.style.width = `${textarea.clientWidth}px`
if (userResizedRef.current[fieldId]) { overlay.scrollTop = textarea.scrollTop
const currentHeight = overlay.scrollLeft = textarea.scrollLeft
textarea.offsetHeight || Number.parseFloat(textarea.style.height) || MIN_TEXTAREA_HEIGHT_PX
const clampedHeight = Math.max(MIN_TEXTAREA_HEIGHT_PX, currentHeight)
textarea.style.height = `${clampedHeight}px`
if (overlay) {
overlay.style.height = `${clampedHeight}px`
}
return
}
textarea.style.height = 'auto'
const naturalHeight = textarea.scrollHeight || MIN_TEXTAREA_HEIGHT_PX
const nextHeight = Math.min(
MAX_TEXTAREA_HEIGHT_PX,
Math.max(MIN_TEXTAREA_HEIGHT_PX, naturalHeight)
)
textarea.style.height = `${nextHeight}px`
if (overlay) {
overlay.style.height = `${nextHeight}px`
}
}, []) }, [])
const handleResizeStart = useCallback((fieldId: string, e: React.MouseEvent<HTMLDivElement>) => { const autoResizeTextarea = useCallback(
e.preventDefault() (fieldId: string) => {
e.stopPropagation() const textarea = textareaRefs.current[fieldId]
const overlay = overlayRefs.current[fieldId]
if (!textarea) return
const textarea = textareaRefs.current[fieldId] if (!textarea.value.trim()) {
if (!textarea) return userResizedRef.current[fieldId] = false
const startHeight = textarea.offsetHeight || textarea.scrollHeight || MIN_TEXTAREA_HEIGHT_PX
isResizingRef.current = true
resizeStateRef.current = {
fieldId,
startY: e.clientY,
startHeight,
}
const handleMouseMove = (moveEvent: MouseEvent) => {
if (!isResizingRef.current || !resizeStateRef.current) return
const { fieldId: activeFieldId, startY, startHeight } = resizeStateRef.current
const deltaY = moveEvent.clientY - startY
const nextHeight = Math.max(MIN_TEXTAREA_HEIGHT_PX, startHeight + deltaY)
const activeTextarea = textareaRefs.current[activeFieldId]
if (activeTextarea) {
activeTextarea.style.height = `${nextHeight}px`
} }
const overlay = overlayRefs.current[activeFieldId] if (userResizedRef.current[fieldId]) {
if (overlay) {
overlay.style.height = `${textarea.offsetHeight}px`
}
syncOverlay(fieldId)
return
}
textarea.style.height = 'auto'
const scrollHeight = textarea.scrollHeight
const height = Math.min(
MAX_TEXTAREA_HEIGHT_PX,
Math.max(MIN_TEXTAREA_HEIGHT_PX, scrollHeight)
)
textarea.style.height = `${height}px`
if (overlay) { if (overlay) {
overlay.style.height = `${nextHeight}px` overlay.style.height = `${height}px`
}
}
const handleMouseUp = () => {
if (resizeStateRef.current) {
const { fieldId: activeFieldId } = resizeStateRef.current
userResizedRef.current[activeFieldId] = true
} }
isResizingRef.current = false syncOverlay(fieldId)
resizeStateRef.current = null },
document.removeEventListener('mousemove', handleMouseMove) [syncOverlay]
document.removeEventListener('mouseup', handleMouseUp) )
}
document.addEventListener('mousemove', handleMouseMove) const handleResizeStart = useCallback(
document.addEventListener('mouseup', handleMouseUp) (fieldId: string, e: React.MouseEvent<HTMLDivElement>) => {
}, []) e.preventDefault()
e.stopPropagation()
useEffect(() => { const textarea = textareaRefs.current[fieldId]
if (!textarea) return
const startHeight = textarea.offsetHeight || textarea.scrollHeight || MIN_TEXTAREA_HEIGHT_PX
isResizingRef.current = true
resizeStateRef.current = {
fieldId,
startY: e.clientY,
startHeight,
}
const handleMouseMove = (moveEvent: MouseEvent) => {
if (!isResizingRef.current || !resizeStateRef.current) return
const { fieldId: activeFieldId, startY, startHeight } = resizeStateRef.current
const deltaY = moveEvent.clientY - startY
const nextHeight = Math.max(MIN_TEXTAREA_HEIGHT_PX, startHeight + deltaY)
const activeTextarea = textareaRefs.current[activeFieldId]
const overlay = overlayRefs.current[activeFieldId]
if (activeTextarea) {
activeTextarea.style.height = `${nextHeight}px`
}
if (overlay) {
overlay.style.height = `${nextHeight}px`
if (activeTextarea) {
overlay.scrollTop = activeTextarea.scrollTop
overlay.scrollLeft = activeTextarea.scrollLeft
}
}
}
const handleMouseUp = () => {
if (resizeStateRef.current) {
const { fieldId: activeFieldId } = resizeStateRef.current
userResizedRef.current[activeFieldId] = true
syncOverlay(activeFieldId)
}
isResizingRef.current = false
resizeStateRef.current = null
document.removeEventListener('mousemove', handleMouseMove)
document.removeEventListener('mouseup', handleMouseUp)
}
document.addEventListener('mousemove', handleMouseMove)
document.addEventListener('mouseup', handleMouseUp)
},
[syncOverlay]
)
useLayoutEffect(() => {
currentMessages.forEach((_, index) => { currentMessages.forEach((_, index) => {
const fieldId = `message-${index}` autoResizeTextarea(`message-${index}`)
autoResizeTextarea(fieldId)
}) })
}, [currentMessages, autoResizeTextarea]) }, [currentMessages, autoResizeTextarea])
useEffect(() => {
const observers: ResizeObserver[] = []
for (let i = 0; i < currentMessages.length; i++) {
const fieldId = `message-${i}`
const textarea = textareaRefs.current[fieldId]
const overlay = overlayRefs.current[fieldId]
if (textarea && overlay) {
const observer = new ResizeObserver(() => {
overlay.style.width = `${textarea.clientWidth}px`
})
observer.observe(textarea)
observers.push(observer)
}
}
return () => {
observers.forEach((observer) => observer.disconnect())
}
}, [currentMessages.length])
return ( return (
<div className='flex w-full flex-col gap-[10px]'> <div className='flex w-full flex-col gap-[10px]'>
{currentMessages.map((message, index) => ( {currentMessages.map((message, index) => (
@@ -621,19 +674,15 @@ export function MessagesInput({
</div> </div>
{/* Content Input with overlay for variable highlighting */} {/* Content Input with overlay for variable highlighting */}
<div className='relative w-full'> <div className='relative w-full overflow-hidden'>
<textarea <textarea
ref={(el) => { ref={(el) => {
textareaRefs.current[fieldId] = el textareaRefs.current[fieldId] = el
}} }}
className='allow-scroll box-border min-h-[80px] w-full resize-none whitespace-pre-wrap break-words border-none bg-transparent px-[8px] pt-[8px] font-[inherit] font-medium text-sm text-transparent leading-[inherit] caret-[var(--text-primary)] outline-none placeholder:text-[var(--text-muted)] focus:outline-none focus-visible:outline-none disabled:cursor-not-allowed' className='relative z-[2] m-0 box-border h-auto min-h-[80px] w-full resize-none overflow-y-auto overflow-x-hidden whitespace-pre-wrap break-words border-none bg-transparent px-[8px] py-[8px] font-medium font-sans text-sm text-transparent leading-[1.5] caret-[var(--text-primary)] outline-none [-ms-overflow-style:none] [scrollbar-width:none] placeholder:text-[var(--text-muted)] focus:outline-none focus-visible:outline-none disabled:cursor-not-allowed [&::-webkit-scrollbar]:hidden'
rows={3}
placeholder='Enter message content...' placeholder='Enter message content...'
value={message.content} value={message.content}
onChange={(e) => { onChange={fieldHandlers.onChange}
fieldHandlers.onChange(e)
autoResizeTextarea(fieldId)
}}
onKeyDown={(e) => { onKeyDown={(e) => {
if (e.key === 'Tab' && !isPreview && !disabled) { if (e.key === 'Tab' && !isPreview && !disabled) {
e.preventDefault() e.preventDefault()
@@ -670,12 +719,13 @@ export function MessagesInput({
ref={(el) => {
overlayRefs.current[fieldId] = el
}}
className='scrollbar-none pointer-events-none absolute top-0 left-0 box-border w-full overflow-auto whitespace-pre-wrap break-words border-none bg-transparent px-[8px] pt-[8px] font-[inherit] font-medium text-[var(--text-primary)] text-sm leading-[inherit]'
className='pointer-events-none absolute top-0 left-0 z-[1] m-0 box-border w-full overflow-y-auto overflow-x-hidden whitespace-pre-wrap break-words border-none bg-transparent px-[8px] py-[8px] font-medium font-sans text-[var(--text-primary)] text-sm leading-[1.5] [-ms-overflow-style:none] [scrollbar-width:none] [&::-webkit-scrollbar]:hidden'
>
{formatDisplayText(message.content, {
accessiblePrefixes,
highlightAll: !accessiblePrefixes,
})}
{message.content.endsWith('\n') && '\u200B'}
</div>
{/* Env var dropdown for this message */}
@@ -705,7 +755,7 @@ export function MessagesInput({
{!isPreview && !disabled && (
<div
className='absolute right-1 bottom-1 flex h-4 w-4 cursor-ns-resize items-center justify-center rounded-[4px] border border-[var(--border-1)] bg-[var(--surface-5)] dark:bg-[var(--surface-5)]'
className='absolute right-1 bottom-1 z-[3] flex h-4 w-4 cursor-ns-resize items-center justify-center rounded-[4px] border border-[var(--border-1)] bg-[var(--surface-5)] dark:bg-[var(--surface-5)]'
onMouseDown={(e) => handleResizeStart(fieldId, e)}
onDragStart={(e) => {
e.preventDefault()

View File
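The messages-input change above layers a transparent-text textarea over an overlay div that paints the highlighted copy of the same content, keeps both layers on identical typography so wrapping matches, appends a zero-width space for trailing newlines, and syncs the overlay width with a ResizeObserver. A minimal sketch of that pattern follows; HighlightOverlayInput, renderHighlighted, and the class names are illustrative, not identifiers from this codebase.

// Sketch only: transparent textarea + highlight overlay, assuming Tailwind-style classes.
import { useLayoutEffect, useRef, type ReactNode } from 'react'

// Both layers must share font, size, line-height, and padding so glyphs line up exactly.
const sharedTypography =
  'box-border w-full whitespace-pre-wrap break-words px-[8px] py-[8px] font-sans text-sm leading-[1.5]'

export function HighlightOverlayInput({
  value,
  onChange,
  renderHighlighted, // e.g. wraps <variable.name> references in colored spans
}: {
  value: string
  onChange: (next: string) => void
  renderHighlighted: (text: string) => ReactNode
}) {
  const textareaRef = useRef<HTMLTextAreaElement>(null)
  const overlayRef = useRef<HTMLDivElement>(null)

  // Keep the overlay width matched to the textarea so line wrapping stays identical,
  // mirroring the ResizeObserver added in the change above.
  useLayoutEffect(() => {
    const textarea = textareaRef.current
    const overlay = overlayRef.current
    if (!textarea || !overlay) return
    const observer = new ResizeObserver(() => {
      overlay.style.width = `${textarea.clientWidth}px`
    })
    observer.observe(textarea)
    return () => observer.disconnect()
  }, [])

  return (
    <div className='relative w-full overflow-hidden'>
      {/* The textarea draws the caret but its own text is transparent. */}
      <textarea
        ref={textareaRef}
        value={value}
        onChange={(e) => onChange(e.target.value)}
        className={`${sharedTypography} relative z-[2] resize-none bg-transparent text-transparent caret-black outline-none`}
      />
      {/* The overlay paints the visible, highlighted copy of the same text. */}
      <div
        ref={overlayRef}
        className={`${sharedTypography} pointer-events-none absolute top-0 left-0 z-[1]`}
      >
        {renderHighlighted(value)}
        {/* A zero-width space keeps the overlay height in step with a trailing newline. */}
        {value.endsWith('\n') && '\u200B'}
      </div>
    </div>
  )
}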

@@ -28,6 +28,7 @@ interface Field {
name: string
type?: 'string' | 'number' | 'boolean' | 'object' | 'array' | 'files'
value?: string
description?: string
collapsed?: boolean
}
@@ -41,7 +42,9 @@ interface FieldFormatProps {
placeholder?: string
showType?: boolean
showValue?: boolean
showDescription?: boolean
valuePlaceholder?: string
descriptionPlaceholder?: string
config?: any
}
@@ -73,6 +76,7 @@ const createDefaultField = (): Field => ({
name: '',
type: 'string',
value: '',
description: '',
collapsed: false,
})
@@ -93,7 +97,9 @@ export function FieldFormat({
placeholder = 'fieldName',
showType = true,
showValue = false,
showDescription = false,
valuePlaceholder = 'Enter default value',
descriptionPlaceholder = 'Describe this field',
}: FieldFormatProps) {
const [storeValue, setStoreValue] = useSubBlockValue<Field[]>(blockId, subBlockId)
const valueInputRefs = useRef<Record<string, HTMLInputElement | HTMLTextAreaElement>>({})
@@ -554,6 +560,18 @@ export function FieldFormat({
</div>
)}
{showDescription && (
<div className='flex flex-col gap-[6px]'>
<Label className='text-[13px]'>Description</Label>
<Input
value={field.description ?? ''}
onChange={(e) => updateField(field.id, 'description', e.target.value)}
placeholder={descriptionPlaceholder}
disabled={isReadOnly}
/>
</div>
)}
{showValue && (
<div className='flex flex-col gap-[6px]'>
<Label className='text-[13px]'>Value</Label>
@@ -568,8 +586,10 @@ export function FieldFormat({
)
}
export function InputFormat(props: Omit<FieldFormatProps, 'title' | 'placeholder'>) {
return <FieldFormat {...props} title='Input' placeholder='firstName' />
export function InputFormat(
props: Omit<FieldFormatProps, 'title' | 'placeholder' | 'showDescription'>
) {
return <FieldFormat {...props} title='Input' placeholder='firstName' showDescription={true} />
}
export function ResponseFormat(

View File

@@ -35,11 +35,11 @@ import type {
import { useAccessibleReferencePrefixes } from '@/app/workspace/[workspaceId]/w/[workflowId]/hooks/use-accessible-reference-prefixes'
import { getBlock } from '@/blocks'
import type { BlockConfig } from '@/blocks/types'
import { normalizeName } from '@/executor/constants'
import type { Variable } from '@/stores/panel'
import { useVariablesStore } from '@/stores/panel'
import { useWorkflowRegistry } from '@/stores/workflows/registry/store'
import { useSubBlockStore } from '@/stores/workflows/subblock/store'
import { normalizeName } from '@/stores/workflows/utils'
import { useWorkflowStore } from '@/stores/workflows/workflow/store'
import type { BlockState } from '@/stores/workflows/workflow/types'
@@ -241,13 +241,16 @@ const getOutputTypeForPath = (
const blockState = useWorkflowStore.getState().blocks[blockId]
const subBlocks = mergedSubBlocksOverride ?? (blockState?.subBlocks || {})
return getBlockOutputType(block.type, outputPath, subBlocks)
} else {
const operationValue = getSubBlockValue(blockId, 'operation')
if (blockConfig && operationValue) {
return getToolOutputType(blockConfig, operationValue, outputPath)
}
}
return 'any'
} else if (blockConfig?.tools?.config?.tool) {
const blockState = useWorkflowStore.getState().blocks[blockId]
const subBlocks = mergedSubBlocksOverride ?? (blockState?.subBlocks || {})
return getToolOutputType(blockConfig, subBlocks, outputPath)
}
const subBlocks =
mergedSubBlocksOverride ?? useWorkflowStore.getState().blocks[blockId]?.subBlocks
const triggerMode = block?.triggerMode && blockConfig?.triggers?.enabled
return getBlockOutputType(block?.type ?? '', outputPath, subBlocks, triggerMode)
}
/**
@@ -1211,11 +1214,7 @@ export const TagDropdown: React.FC<TagDropdownProps> = ({
: allTags
}
} else {
const operationValue =
mergedSubBlocks?.operation?.value ?? getSubBlockValue(activeSourceBlockId, 'operation')
const toolOutputPaths = operationValue
? getToolOutputPaths(blockConfig, operationValue, mergedSubBlocks)
: []
const toolOutputPaths = getToolOutputPaths(blockConfig, mergedSubBlocks)
if (toolOutputPaths.length > 0) {
blockTags = toolOutputPaths.map((path) => `${normalizedBlockName}.${path}`)
@@ -1535,7 +1534,6 @@ export const TagDropdown: React.FC<TagDropdownProps> = ({
if (dynamicOutputs.length > 0) {
const allTags = dynamicOutputs.map((path) => `${normalizedBlockName}.${path}`)
// For self-reference, only show url and resumeEndpoint (not response format fields)
blockTags = isSelfReference
? allTags.filter((tag) => tag.endsWith('.url') || tag.endsWith('.resumeEndpoint'))
: allTags
@@ -1543,11 +1541,7 @@ export const TagDropdown: React.FC<TagDropdownProps> = ({
blockTags = [`${normalizedBlockName}.url`, `${normalizedBlockName}.resumeEndpoint`]
}
} else {
const operationValue =
mergedSubBlocks?.operation?.value ?? getSubBlockValue(accessibleBlockId, 'operation')
const toolOutputPaths = operationValue
? getToolOutputPaths(blockConfig, operationValue, mergedSubBlocks)
: []
const toolOutputPaths = getToolOutputPaths(blockConfig, mergedSubBlocks)
if (toolOutputPaths.length > 0) {
blockTags = toolOutputPaths.map((path) => `${normalizedBlockName}.${path}`)
@@ -1789,7 +1783,7 @@ export const TagDropdown: React.FC<TagDropdownProps> = ({
mergedSubBlocks
)
if (fieldType === 'files' || fieldType === 'array') {
if (fieldType === 'files' || fieldType === 'file[]' || fieldType === 'array') {
const blockName = parts[0]
const remainingPath = parts.slice(2).join('.')
processedTag = `${blockName}.${arrayFieldName}[0].${remainingPath}`

View File

@@ -29,6 +29,7 @@ import {
type OAuthProvider,
type OAuthService,
} from '@/lib/oauth'
import { extractInputFieldsFromBlocks } from '@/lib/workflows/input-format'
import { useUserPermissionsContext } from '@/app/workspace/[workspaceId]/providers/workspace-permissions-provider'
import {
CheckboxList,
@@ -65,7 +66,7 @@ import { useForceRefreshMcpTools, useMcpServers, useStoredMcpTools } from '@/hoo
import {
useChildDeploymentStatus,
useDeployChildWorkflow,
useWorkflowInputFields,
useWorkflowState,
useWorkflows,
} from '@/hooks/queries/workflows'
import { usePermissionConfig } from '@/hooks/use-permission-config'
@@ -771,7 +772,11 @@ function WorkflowInputMapperSyncWrapper({
disabled: boolean
workflowId: string
}) {
const { data: inputFields = [], isLoading } = useWorkflowInputFields(workflowId)
const { data: workflowState, isLoading } = useWorkflowState(workflowId)
const inputFields = useMemo(
() => (workflowState?.blocks ? extractInputFieldsFromBlocks(workflowState.blocks) : []),
[workflowState?.blocks]
)
const parsedValue = useMemo(() => {
try {

View File
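The mapper wrapper above now derives its input fields from the child workflow's block state via extractInputFieldsFromBlocks instead of a dedicated query. A hedged sketch of what such a helper might do, assuming the declared inputs live in an inputFormat sub-block on one of the blocks; the field names and shapes here are guesses, not the real implementation in '@/lib/workflows/input-format'.

// Sketch only: scan blocks for a declared input format and return its fields.
interface InputFieldSketch {
  name: string
  type?: string
}

interface BlockLikeSketch {
  type: string
  subBlocks?: Record<string, { value?: unknown }>
}

export function extractInputFieldsFromBlocksSketch(
  blocks: Record<string, BlockLikeSketch>
): InputFieldSketch[] {
  for (const block of Object.values(blocks)) {
    // Assumption: a trigger-like block carries the workflow's declared inputs.
    const raw = block.subBlocks?.inputFormat?.value
    if (Array.isArray(raw)) {
      return raw.filter(
        (f): f is InputFieldSketch => typeof f === 'object' && f !== null && 'name' in f
      )
    }
  }
  return []
}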

@@ -0,0 +1,45 @@
'use client'
import { useMemo } from 'react'
import { SelectorCombobox } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/components/selector-combobox/selector-combobox'
import type { SubBlockConfig } from '@/blocks/types'
import type { SelectorContext } from '@/hooks/selectors/types'
import { useWorkflowRegistry } from '@/stores/workflows/registry/store'
interface WorkflowSelectorInputProps {
blockId: string
subBlock: SubBlockConfig
disabled?: boolean
isPreview?: boolean
previewValue?: string | null
}
export function WorkflowSelectorInput({
blockId,
subBlock,
disabled = false,
isPreview = false,
previewValue,
}: WorkflowSelectorInputProps) {
const activeWorkflowId = useWorkflowRegistry((s) => s.activeWorkflowId)
const context: SelectorContext = useMemo(
() => ({
excludeWorkflowId: activeWorkflowId ?? undefined,
}),
[activeWorkflowId]
)
return (
<SelectorCombobox
blockId={blockId}
subBlock={subBlock}
selectorKey='sim.workflows'
selectorContext={context}
disabled={disabled}
isPreview={isPreview}
previewValue={previewValue}
placeholder={subBlock.placeholder || 'Select workflow...'}
/>
)
}

View File

@@ -2,6 +2,7 @@
import { useCallback, useMemo } from 'react'
import { isEqual } from 'lodash'
import { useStoreWithEqualityFn } from 'zustand/traditional'
import {
buildCanonicalIndex,
isNonEmptyValue,
@@ -151,7 +152,7 @@ export function useDependsOnGate(
// Get values for all dependency fields (both all and any)
// Use isEqual to prevent re-renders when dependency values haven't actually changed
const dependencyValuesMap = useSubBlockStore(dependencySelector, isEqual)
const dependencyValuesMap = useStoreWithEqualityFn(useSubBlockStore, dependencySelector, isEqual)
const depsSatisfied = useMemo(() => {
// Check all fields (AND logic) - all must be satisfied

View File
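Several of these hooks swap a second equality-function argument on the store hook for useStoreWithEqualityFn from 'zustand/traditional', which appears to track zustand's newer API where the bound hook no longer accepts an equality function directly. A minimal sketch of the before/after shape, using an illustrative stand-in store (useBearStore is not from this codebase):

// Sketch only: migrating a deep-compared selector to useStoreWithEqualityFn.
import { isEqual } from 'lodash'
import { create } from 'zustand'
import { useStoreWithEqualityFn } from 'zustand/traditional'

interface BearState {
  bears: Record<string, number>
}

const useBearStore = create<BearState>(() => ({ bears: {} }))

// Before: const bears = useBearStore((s) => s.bears, isEqual)
// After: the store hook, selector, and equality function all go through
// useStoreWithEqualityFn, so the component only re-renders when the selected
// slice changes under deep comparison.
export function useBears() {
  return useStoreWithEqualityFn(useBearStore, (s) => s.bears, isEqual)
}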

@@ -2,6 +2,7 @@ import { useCallback, useEffect, useRef } from 'react'
import { createLogger } from '@sim/logger'
import { isEqual } from 'lodash'
import { useShallow } from 'zustand/react/shallow'
import { useStoreWithEqualityFn } from 'zustand/traditional'
import { useCollaborativeWorkflow } from '@/hooks/use-collaborative-workflow'
import { getProviderFromModel } from '@/providers/utils'
import { useWorkflowDiffStore } from '@/stores/workflow-diff/store'
@@ -58,7 +59,8 @@ export function useSubBlockValue<T = any>(
const streamingValueRef = useRef<T | null>(null)
const wasStreamingRef = useRef<boolean>(false)
const storeValue = useSubBlockStore(
const storeValue = useStoreWithEqualityFn(
useSubBlockStore,
useCallback(
(state) => {
// If the active workflow ID isn't available yet, return undefined so we can fall back to initialValue
@@ -92,7 +94,8 @@ export function useSubBlockValue<T = any>(
// Always call this hook unconditionally - don't wrap it in a condition
// Optimized: only re-render if model value actually changes
const modelSubBlockValue = useSubBlockStore(
const modelSubBlockValue = useStoreWithEqualityFn(
useSubBlockStore,
useCallback((state) => (blockId ? state.getValue(blockId, 'model') : null), [blockId]),
(a, b) => a === b
)

View File

@@ -40,6 +40,7 @@ import {
TimeInput,
ToolInput,
VariablesInput,
WorkflowSelectorInput,
} from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/components'
import { useDependsOnGate } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/hooks/use-depends-on-gate'
import type { SubBlockConfig } from '@/blocks/types'
@@ -90,7 +91,6 @@ const isFieldRequired = (config: SubBlockConfig, subBlockValues?: Record<string,
if (!config.required) return false
if (typeof config.required === 'boolean') return config.required
// Helper function to evaluate a condition
const evalCond = (
cond: {
field: string
@@ -132,7 +132,6 @@ const isFieldRequired = (config: SubBlockConfig, subBlockValues?: Record<string,
return match
}
// If required is a condition object or function, evaluate it
const condition = typeof config.required === 'function' ? config.required() : config.required
return evalCond(condition, subBlockValues || {})
}
@@ -378,7 +377,6 @@ function SubBlockComponent({
setIsValidJson(isValid)
}
// Check if wand is enabled for this sub-block
const isWandEnabled = config.wandConfig?.enabled ?? false
/**
@@ -438,8 +436,6 @@ function SubBlockComponent({
| null
| undefined
// Use dependsOn gating to compute final disabled state
// Only pass previewContextValues when in preview mode to avoid format mismatches
const { finalDisabled: gatedDisabled } = useDependsOnGate(blockId, config, {
disabled,
isPreview,
@@ -869,6 +865,17 @@ function SubBlockComponent({
/>
)
case 'workflow-selector':
return (
<WorkflowSelectorInput
blockId={blockId}
subBlock={config}
disabled={isDisabled}
isPreview={isPreview}
previewValue={previewValue as string | null}
/>
)
case 'mcp-server-selector':
return (
<McpServerSelector

View File

@@ -68,7 +68,7 @@ export function SubflowEditor({
<div className='flex flex-1 flex-col overflow-hidden pt-[0px]'>
{/* Subflow Editor Section */}
<div ref={subBlocksRef} className='subblocks-section flex flex-1 flex-col overflow-hidden'>
<div className='flex-1 overflow-y-auto overflow-x-hidden px-[8px] pt-[5px] pb-[8px]'>
<div className='flex-1 overflow-y-auto overflow-x-hidden px-[8px] pt-[9px] pb-[8px]'>
{/* Type Selection */}
<div>
<Label className='mb-[6.5px] block pl-[2px] font-medium text-[13px] text-[var(--text-primary)]'>

View File

@@ -2,8 +2,18 @@
import { useCallback, useEffect, useMemo, useRef, useState } from 'react'
import { isEqual } from 'lodash'
import { BookOpen, Check, ChevronDown, ChevronUp, Pencil } from 'lucide-react'
import {
BookOpen,
Check,
ChevronDown,
ChevronUp,
ExternalLink,
Loader2,
Pencil,
} from 'lucide-react'
import { useParams } from 'next/navigation'
import { useShallow } from 'zustand/react/shallow'
import { useStoreWithEqualityFn } from 'zustand/traditional'
import { Button, Tooltip } from '@/components/emcn'
import {
buildCanonicalIndex,
@@ -28,8 +38,10 @@ import { LoopTool } from '@/app/workspace/[workspaceId]/w/[workflowId]/component
import { ParallelTool } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/subflows/parallel/parallel-config'
import { getSubBlockStableKey } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/workflow-block/utils'
import { useCurrentWorkflow } from '@/app/workspace/[workspaceId]/w/[workflowId]/hooks'
import { WorkflowPreview } from '@/app/workspace/[workspaceId]/w/components/preview'
import { getBlock } from '@/blocks/registry'
import type { SubBlockType } from '@/blocks/types'
import { useWorkflowState } from '@/hooks/queries/workflows'
import { useCollaborativeWorkflow } from '@/hooks/use-collaborative-workflow'
import { usePanelEditorStore } from '@/stores/panel'
import { useWorkflowRegistry } from '@/stores/workflows/registry/store'
@@ -84,6 +96,14 @@ export function Editor() {
// Get subflow display properties from configs
const subflowConfig = isSubflow ? (currentBlock.type === 'loop' ? LoopTool : ParallelTool) : null
// Check if selected block is a workflow block
const isWorkflowBlock =
currentBlock && (currentBlock.type === 'workflow' || currentBlock.type === 'workflow_input')
// Get workspace ID from params
const params = useParams()
const workspaceId = params.workspaceId as string
// Refs for resize functionality
const subBlocksRef = useRef<HTMLDivElement>(null)
@@ -99,7 +119,8 @@ export function Editor() {
currentWorkflow.isSnapshotView
)
const blockSubBlockValues = useSubBlockStore(
const blockSubBlockValues = useStoreWithEqualityFn(
useSubBlockStore,
useCallback(
(state) => {
if (!activeWorkflowId || !currentBlockId) return EMPTY_SUBBLOCK_VALUES
@@ -252,11 +273,11 @@ export function Editor() {
// Trigger rename mode when signaled from context menu
useEffect(() => {
if (shouldFocusRename && currentBlock && !isSubflow) {
if (shouldFocusRename && currentBlock) {
handleStartRename()
setShouldFocusRename(false)
}
}, [shouldFocusRename, currentBlock, isSubflow, handleStartRename, setShouldFocusRename])
}, [shouldFocusRename, currentBlock, handleStartRename, setShouldFocusRename])
/**
* Handles opening documentation link in a new secure tab.
@@ -268,6 +289,22 @@ export function Editor() {
}
}
// Get child workflow ID for workflow blocks
const childWorkflowId = isWorkflowBlock ? blockSubBlockValues?.workflowId : null
// Fetch child workflow state for preview (only for workflow blocks with a selected workflow)
const { data: childWorkflowState, isLoading: isLoadingChildWorkflow } =
useWorkflowState(childWorkflowId)
/**
* Handles opening the child workflow in a new tab.
*/
const handleOpenChildWorkflow = useCallback(() => {
if (childWorkflowId && workspaceId) {
window.open(`/workspace/${workspaceId}/w/${childWorkflowId}`, '_blank', 'noopener,noreferrer')
}
}, [childWorkflowId, workspaceId])
// Determine if connections are at minimum height (collapsed state)
const isConnectionsAtMinHeight = connectionsHeight <= 35
@@ -320,7 +357,7 @@ export function Editor() {
</div>
<div className='flex shrink-0 items-center gap-[8px]'>
{/* Rename button */}
{currentBlock && !isSubflow && (
{currentBlock && (
<Tooltip.Root>
<Tooltip.Trigger asChild>
<Button
@@ -406,7 +443,66 @@ export function Editor() {
className='subblocks-section flex flex-1 flex-col overflow-hidden'
>
<div className='flex-1 overflow-y-auto overflow-x-hidden px-[8px] pt-[12px] pb-[8px] [overflow-anchor:none]'>
{subBlocks.length === 0 ? (
{/* Workflow Preview - only for workflow blocks with a selected child workflow */}
{isWorkflowBlock && childWorkflowId && (
<>
<div className='subblock-content flex flex-col gap-[9.5px]'>
<div className='pl-[2px] font-medium text-[13px] text-[var(--text-primary)] leading-none'>
Workflow Preview
</div>
<div className='relative h-[160px] overflow-hidden rounded-[4px] border border-[var(--border)]'>
{isLoadingChildWorkflow ? (
<div className='flex h-full items-center justify-center bg-[var(--surface-3)]'>
<Loader2 className='h-5 w-5 animate-spin text-[var(--text-tertiary)]' />
</div>
) : childWorkflowState ? (
<>
<div className='[&_*:active]:!cursor-grabbing [&_*]:!cursor-grab [&_.react-flow__handle]:!hidden h-full w-full'>
<WorkflowPreview
workflowState={childWorkflowState}
height={160}
width='100%'
isPannable={true}
defaultZoom={0.6}
fitPadding={0.15}
cursorStyle='grab'
/>
</div>
<Tooltip.Root>
<Tooltip.Trigger asChild>
<Button
type='button'
variant='ghost'
onClick={handleOpenChildWorkflow}
className='absolute right-[6px] bottom-[6px] z-10 h-[24px] w-[24px] cursor-pointer border border-[var(--border)] bg-[var(--surface-2)] p-0 hover:bg-[var(--surface-4)]'
>
<ExternalLink className='h-[12px] w-[12px]' />
</Button>
</Tooltip.Trigger>
<Tooltip.Content side='top'>Open workflow</Tooltip.Content>
</Tooltip.Root>
</>
) : (
<div className='flex h-full items-center justify-center bg-[var(--surface-3)]'>
<span className='text-[13px] text-[var(--text-tertiary)]'>
Unable to load preview
</span>
</div>
)}
</div>
</div>
<div className='subblock-divider px-[2px] pt-[16px] pb-[13px]'>
<div
className='h-[1.25px]'
style={{
backgroundImage:
'repeating-linear-gradient(to right, var(--border) 0px, var(--border) 6px, transparent 6px, transparent 12px)',
}}
/>
</div>
</>
)}
{subBlocks.length === 0 && !isWorkflowBlock ? (
<div className='flex h-full items-center justify-center text-center text-[#8D8D8D] text-[13px]'>
This block has no subblocks
</div>

View File

@@ -1,5 +1,4 @@
import { useCallback } from 'react'
import { shallow } from 'zustand/shallow'
import { useShallow } from 'zustand/react/shallow'
import { useWorkflowDiffStore } from '@/stores/workflow-diff'
import { useWorkflowStore } from '@/stores/workflows/workflow/store'
@@ -13,35 +12,26 @@ import { useWorkflowStore } from '@/stores/workflows/workflow/store'
*/
export function useEditorBlockProperties(blockId: string | null, isSnapshotView: boolean) {
const normalBlockProps = useWorkflowStore(
useCallback(
(state) => {
if (!blockId) return { advancedMode: false, triggerMode: false }
const block = state.blocks?.[blockId]
return {
advancedMode: block?.advancedMode ?? false,
triggerMode: block?.triggerMode ?? false,
}
},
[blockId]
),
shallow
useShallow((state) => {
if (!blockId) return { advancedMode: false, triggerMode: false }
const block = state.blocks?.[blockId]
return {
advancedMode: block?.advancedMode ?? false,
triggerMode: block?.triggerMode ?? false,
}
})
)
const baselineBlockProps = useWorkflowDiffStore(
useCallback(
(state) => {
if (!blockId) return { advancedMode: false, triggerMode: false }
const block = state.baselineWorkflow?.blocks?.[blockId]
return {
advancedMode: block?.advancedMode ?? false,
triggerMode: block?.triggerMode ?? false,
}
},
[blockId]
),
shallow
useShallow((state) => {
if (!blockId) return { advancedMode: false, triggerMode: false }
const block = state.baselineWorkflow?.blocks?.[blockId]
return {
advancedMode: block?.advancedMode ?? false,
triggerMode: block?.triggerMode ?? false,
}
})
)
// Use the appropriate props based on view mode
return isSnapshotView ? baselineBlockProps : normalBlockProps
}

View File
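The hook above is the companion form of the same zustand cleanup: instead of passing shallow as a second argument through useCallback, the selector is wrapped with useShallow so the shallow comparison happens inside the selector itself. A short sketch, with useWorkflowStoreSketch standing in for the real store:

// Sketch only: useShallow-wrapped selector returning a derived object.
import { create } from 'zustand'
import { useShallow } from 'zustand/react/shallow'

interface WorkflowSketchState {
  blocks: Record<string, { advancedMode?: boolean; triggerMode?: boolean }>
}

const useWorkflowStoreSketch = create<WorkflowSketchState>(() => ({ blocks: {} }))

export function useBlockFlags(blockId: string | null) {
  // The selector builds a fresh object each call; useShallow keeps renders stable
  // by comparing the object's fields rather than its identity.
  return useWorkflowStoreSketch(
    useShallow((state) => {
      if (!blockId) return { advancedMode: false, triggerMode: false }
      const block = state.blocks[blockId]
      return {
        advancedMode: block?.advancedMode ?? false,
        triggerMode: block?.triggerMode ?? false,
      }
    })
  )
}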

@@ -1,6 +1,7 @@
import { memo, useMemo, useRef } from 'react'
import { RepeatIcon, SplitIcon } from 'lucide-react'
import { Handle, type NodeProps, Position, useReactFlow } from 'reactflow'
import { Badge } from '@/components/emcn'
import { cn } from '@/lib/core/utils/cn'
import { HANDLE_POSITIONS } from '@/lib/workflows/blocks/block-dimensions'
import { type DiffStatus, hasDiffStatus } from '@/lib/workflows/diff/types'
@@ -78,6 +79,7 @@ export const SubflowNodeComponent = memo(({ data, id, selected }: NodeProps<Subf
? currentBlock.is_diff
: undefined
const isEnabled = currentBlock?.enabled ?? true
const isPreview = data?.isPreview || false
// Focus state
@@ -184,14 +186,21 @@ export const SubflowNodeComponent = memo(({ data, id, selected }: NodeProps<Subf
<div className='flex min-w-0 flex-1 items-center gap-[10px]'>
<div
className='flex h-[24px] w-[24px] flex-shrink-0 items-center justify-center rounded-[6px]'
style={{ backgroundColor: blockIconBg }}
style={{ backgroundColor: isEnabled ? blockIconBg : 'gray' }}
>
<BlockIcon className='h-[16px] w-[16px] text-white' />
</div>
<span className='font-medium text-[16px]' title={blockName}>
<span
className={cn(
'truncate font-medium text-[16px]',
!isEnabled && 'text-[var(--text-muted)]'
)}
title={blockName}
>
{blockName}
</span>
</div>
{!isEnabled && <Badge variant='gray-secondary'>disabled</Badge>}
</div>
{!isPreview && (

View File

@@ -3,6 +3,7 @@ import { createLogger } from '@sim/logger'
import { isEqual } from 'lodash'
import { useParams } from 'next/navigation'
import { Handle, type NodeProps, Position, useUpdateNodeInternals } from 'reactflow'
import { useStoreWithEqualityFn } from 'zustand/traditional'
import { Badge, Tooltip } from '@/components/emcn'
import { cn } from '@/lib/core/utils/cn'
import { getBaseUrl } from '@/lib/core/utils/urls'
@@ -526,7 +527,8 @@ const SubBlockRow = memo(function SubBlockRow({
* Subscribe only to variables for this workflow to avoid re-renders from other workflows.
* Uses isEqual for deep comparison since Object.fromEntries creates a new object each time.
*/
const workflowVariables = useVariablesStore(
const workflowVariables = useStoreWithEqualityFn(
useVariablesStore,
useCallback(
(state) => {
if (!workflowId) return {}
@@ -729,7 +731,8 @@ export const WorkflowBlock = memo(function WorkflowBlock({
const isStarterBlock = type === 'starter'
const isWebhookTriggerBlock = type === 'webhook' || type === 'generic_webhook'
const blockSubBlockValues = useSubBlockStore(
const blockSubBlockValues = useStoreWithEqualityFn(
useSubBlockStore,
useCallback(
(state) => {
if (!activeWorkflowId) return EMPTY_SUBBLOCK_VALUES

View File

@@ -2,7 +2,7 @@ import { useMemo } from 'react'
import { useShallow } from 'zustand/react/shallow'
import { BlockPathCalculator } from '@/lib/workflows/blocks/block-path-calculator'
import { SYSTEM_REFERENCE_PREFIXES } from '@/lib/workflows/sanitization/references'
import { isValidStartBlockType } from '@/lib/workflows/triggers/start-block-types'
import { isInputDefinitionTrigger } from '@/lib/workflows/triggers/input-definition-triggers'
import { normalizeName } from '@/executor/constants'
import { useWorkflowStore } from '@/stores/workflows/workflow/store'
import type { Loop, Parallel } from '@/stores/workflows/workflow/types'
@@ -27,7 +27,7 @@ export function useAccessibleReferencePrefixes(blockId?: string | null): Set<str
const accessibleIds = new Set<string>(ancestorIds)
accessibleIds.add(blockId)
const starterBlock = Object.values(blocks).find((block) => isValidStartBlockType(block.type))
const starterBlock = Object.values(blocks).find((block) => isInputDefinitionTrigger(block.type))
if (starterBlock && ancestorIds.includes(starterBlock.id)) {
accessibleIds.add(starterBlock.id)
}

View File

@@ -2,13 +2,15 @@
import { useMemo } from 'react'
import { extractFieldsFromSchema } from '@/lib/core/utils/response-format'
import { getBlockOutputPaths, getBlockOutputs } from '@/lib/workflows/blocks/block-outputs'
import {
getBlockOutputPaths,
getBlockOutputs,
getToolOutputs,
} from '@/lib/workflows/blocks/block-outputs'
import { TRIGGER_TYPES } from '@/lib/workflows/triggers/triggers'
import type { SchemaField } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/connection-blocks/components/field-item/field-item'
import { getBlock } from '@/blocks'
import type { BlockConfig } from '@/blocks/types'
import { useSubBlockStore } from '@/stores/workflows/subblock/store'
import { getTool } from '@/tools/utils'
const RESERVED_KEYS = new Set(['type', 'description'])
@@ -24,64 +26,6 @@ const getSubBlockValue = (blockId: string, property: string): any => {
return useSubBlockStore.getState().getValue(blockId, property)
}
/**
* Generates output paths for a tool-based block
*/
const generateToolOutputPaths = (blockConfig: BlockConfig, operation: string): string[] => {
if (!blockConfig?.tools?.config?.tool) return []
try {
const toolId = blockConfig.tools.config.tool({ operation })
if (!toolId) return []
const toolConfig = getTool(toolId)
if (!toolConfig?.outputs) return []
return generateOutputPaths(toolConfig.outputs)
} catch {
return []
}
}
/**
* Recursively generates all output paths from an outputs schema
*/
const generateOutputPaths = (outputs: Record<string, any>, prefix = ''): string[] => {
const paths: string[] = []
for (const [key, value] of Object.entries(outputs)) {
const currentPath = prefix ? `${prefix}.${key}` : key
if (typeof value === 'string') {
paths.push(currentPath)
} else if (typeof value === 'object' && value !== null) {
if ('type' in value && typeof value.type === 'string') {
paths.push(currentPath)
// Handle nested objects and arrays
if (value.type === 'object' && value.properties) {
paths.push(...generateOutputPaths(value.properties, currentPath))
} else if (value.type === 'array' && value.items?.properties) {
paths.push(...generateOutputPaths(value.items.properties, currentPath))
} else if (
value.type === 'array' &&
value.items &&
typeof value.items === 'object' &&
!('type' in value.items)
) {
paths.push(...generateOutputPaths(value.items, currentPath))
}
} else {
const subPaths = generateOutputPaths(value, currentPath)
paths.push(...subPaths)
}
} else {
paths.push(currentPath)
}
}
return paths
}
/**
* Extracts nested fields from array or object properties
*/
@@ -155,26 +99,6 @@ const createFieldFromOutput = (
return field
}
/**
* Gets tool outputs for a block's operation
*/
const getToolOutputs = (
blockConfig: BlockConfig | null,
operation?: string
): Record<string, any> => {
if (!blockConfig?.tools?.config?.tool || !operation) return {}
try {
const toolId = blockConfig.tools.config.tool({ operation })
if (!toolId) return {}
const toolConfig = getTool(toolId)
return toolConfig?.outputs || {}
} catch {
return {}
}
}
interface UseBlockOutputFieldsParams {
blockId: string
blockType: string
@@ -299,14 +223,11 @@ export function useBlockOutputFields({
baseOutputs = getBlockOutputs(blockType, mergedSubBlocks)
} else {
// For tool-based blocks, try to get tool outputs first
const operationValue =
operation ?? mergedSubBlocks?.operation?.value ?? getSubBlockValue(blockId, 'operation')
const toolOutputs = operationValue ? getToolOutputs(blockConfig, operationValue) : {}
const toolOutputs = blockConfig ? getToolOutputs(blockConfig, mergedSubBlocks) : {}
if (Object.keys(toolOutputs).length > 0) {
baseOutputs = toolOutputs
} else {
// Use getBlockOutputs which handles inputFormat merging
baseOutputs = getBlockOutputs(blockType, mergedSubBlocks, triggerMode)
}
}

View File
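This hook drops its local generateToolOutputPaths/getToolOutputs helpers in favor of a shared getToolOutputs that takes the block config plus the merged sub-block values. A hedged sketch of what that consolidated helper could look like, modeled on the removed code above; the real version lives in '@/lib/workflows/blocks/block-outputs', and how it maps sub-block values into the tool resolver is an assumption here.

// Sketch only: resolve the tool from sub-block values, then walk its outputs schema.
type OutputsSchemaSketch = Record<string, any>

interface ToolResolvingBlockConfigSketch {
  tools?: { config?: { tool?: (params: Record<string, unknown>) => string | undefined } }
}

export function getToolOutputsSketch(
  blockConfig: ToolResolvingBlockConfigSketch | null,
  subBlocks: Record<string, { value?: unknown }> | undefined,
  lookupTool: (toolId: string) => { outputs?: OutputsSchemaSketch } | undefined
): OutputsSchemaSketch {
  if (!blockConfig?.tools?.config?.tool) return {}
  try {
    // Assumption: the resolver receives the current sub-block values (operation included)
    // rather than a lone operation string as in the removed helper.
    const params = Object.fromEntries(
      Object.entries(subBlocks ?? {}).map(([key, entry]) => [key, entry?.value])
    )
    const toolId = blockConfig.tools.config.tool(params)
    if (!toolId) return {}
    return lookupTool(toolId)?.outputs ?? {}
  } catch {
    return {}
  }
}

// Dotted output paths can then be derived much like the removed generateOutputPaths did
// (simplified: every key contributes a path; nested objects and array items recurse).
export function outputPathsFromSketch(outputs: OutputsSchemaSketch, prefix = ''): string[] {
  const paths: string[] = []
  for (const [key, value] of Object.entries(outputs)) {
    const path = prefix ? `${prefix}.${key}` : key
    paths.push(path)
    if (value && typeof value === 'object') {
      if (value.type === 'object' && value.properties) {
        paths.push(...outputPathsFromSketch(value.properties, path))
      } else if (value.type === 'array' && value.items?.properties) {
        paths.push(...outputPathsFromSketch(value.items.properties, path))
      }
    }
  }
  return paths
}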

@@ -180,6 +180,21 @@ function mapEdgesByNode(edges: Edge[], nodeIds: Set<string>): Map<string, Edge[]
return result
}
/**
* Syncs the panel editor with the current selection state.
* Shows block details when exactly one block is selected, clears otherwise.
*/
function syncPanelWithSelection(selectedIds: string[]) {
const { currentBlockId, clearCurrentBlock, setCurrentBlockId } = usePanelEditorStore.getState()
if (selectedIds.length === 1 && selectedIds[0] !== currentBlockId) {
setCurrentBlockId(selectedIds[0])
} else if (selectedIds.length === 0 && currentBlockId) {
clearCurrentBlock()
} else if (selectedIds.length > 1 && currentBlockId) {
clearCurrentBlock()
}
}
/** Custom node types for ReactFlow. */
const nodeTypes: NodeTypes = {
workflowBlock: WorkflowBlock,
@@ -2075,7 +2090,10 @@ const WorkflowContent = React.memo(() => {
...node,
selected: pendingSet.has(node.id),
}))
setDisplayNodes(resolveParentChildSelectionConflicts(withSelection, blocks))
const resolved = resolveParentChildSelectionConflicts(withSelection, blocks)
setDisplayNodes(resolved)
const selectedIds = resolved.filter((node) => node.selected).map((node) => node.id)
syncPanelWithSelection(selectedIds)
return
}
@@ -2175,13 +2193,7 @@ const WorkflowContent = React.memo(() => {
})
const selectedIds = selectedIdsRef.current as string[] | null
if (selectedIds !== null) {
const { currentBlockId, clearCurrentBlock, setCurrentBlockId } =
usePanelEditorStore.getState()
if (selectedIds.length === 1 && selectedIds[0] !== currentBlockId) {
setCurrentBlockId(selectedIds[0])
} else if (selectedIds.length === 0 && currentBlockId) {
clearCurrentBlock()
}
syncPanelWithSelection(selectedIds)
}
},
[blocks]
@@ -3323,15 +3335,12 @@ const WorkflowContent = React.memo(() => {
onOpenLogs={handleContextOpenLogs}
onToggleVariables={handleContextToggleVariables}
onToggleChat={handleContextToggleChat}
onInvite={handleContextInvite}
isVariablesOpen={isVariablesOpen}
isChatOpen={isChatOpen}
hasClipboard={hasClipboard()}
disableEdit={!effectivePermissions.canEdit}
disableAdmin={!effectivePermissions.canAdmin}
canUndo={canUndo}
canRedo={canRedo}
isInvitationsDisabled={isInvitationsDisabled}
/>
</>
)}

View File

@@ -21,11 +21,9 @@ interface WorkflowPreviewBlockData {
}
/**
* Lightweight block component for workflow previews.
* Renders block header, dummy subblocks skeleton, and handles.
* Respects horizontalHandles and enabled state from workflow.
* No heavy hooks, store subscriptions, or interactive features.
* Used in template cards and other preview contexts for performance.
* Preview block component for workflow visualization.
* Renders block header, subblocks skeleton, and handles without
* hooks, store subscriptions, or interactive features.
*/
function WorkflowPreviewBlockInner({ data }: NodeProps<WorkflowPreviewBlockData>) {
const {

View File

@@ -685,7 +685,7 @@ interface WorkflowVariable {
value: unknown
}
interface BlockDetailsSidebarProps {
interface PreviewEditorProps {
block: BlockState
executionData?: ExecutionData
/** All block execution data for resolving variable references */
@@ -722,7 +722,7 @@ const DEFAULT_CONNECTIONS_HEIGHT = 150
/**
* Readonly sidebar panel showing block configuration using SubBlock components.
*/
function BlockDetailsSidebarContent({
function PreviewEditorContent({
block,
executionData,
allBlockExecutions,
@@ -732,7 +732,7 @@ function BlockDetailsSidebarContent({
parallels,
isExecutionMode = false,
onClose,
}: BlockDetailsSidebarProps) {
}: PreviewEditorProps) {
// Convert Record<string, Variable> to Array<Variable> for iteration
const normalizedWorkflowVariables = useMemo(() => {
if (!workflowVariables) return []
@@ -998,6 +998,22 @@ function BlockDetailsSidebarContent({
})
}, [extractedRefs.envVars])
const rawValues = useMemo(() => {
return Object.entries(subBlockValues).reduce<Record<string, unknown>>((acc, [key, entry]) => {
if (entry && typeof entry === 'object' && 'value' in entry) {
acc[key] = (entry as { value: unknown }).value
} else {
acc[key] = entry
}
return acc
}, {})
}, [subBlockValues])
const canonicalIndex = useMemo(
() => buildCanonicalIndex(blockConfig?.subBlocks || []),
[blockConfig?.subBlocks]
)
// Check if this is a subflow block (loop or parallel)
const isSubflow = block.type === 'loop' || block.type === 'parallel'
const loopConfig = block.type === 'loop' ? loops?.[block.id] : undefined
@@ -1079,21 +1095,6 @@ function BlockDetailsSidebarContent({
)
}
const rawValues = useMemo(() => {
return Object.entries(subBlockValues).reduce<Record<string, unknown>>((acc, [key, entry]) => {
if (entry && typeof entry === 'object' && 'value' in entry) {
acc[key] = (entry as { value: unknown }).value
} else {
acc[key] = entry
}
return acc
}, {})
}, [subBlockValues])
const canonicalIndex = useMemo(
() => buildCanonicalIndex(blockConfig.subBlocks),
[blockConfig.subBlocks]
)
const canonicalModeOverrides = block.data?.canonicalModes
const effectiveAdvanced =
(block.advancedMode ?? false) ||
@@ -1371,12 +1372,12 @@ function BlockDetailsSidebarContent({
}
/**
* Block details sidebar wrapped in ReactFlowProvider for hook compatibility.
* Preview editor wrapped in ReactFlowProvider for hook compatibility.
*/
export function BlockDetailsSidebar(props: BlockDetailsSidebarProps) {
export function PreviewEditor(props: PreviewEditorProps) {
return (
<ReactFlowProvider>
<BlockDetailsSidebarContent {...props} />
<PreviewEditorContent {...props} />
</ReactFlowProvider>
)
}

View File
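Alongside the rename, this file hoists the rawValues and canonicalIndex memos above the subflow early return. The likely motivation is React's rules of hooks: every hook must run on every render, so none may sit after a conditional return. A tiny illustrative sketch (names are placeholders, not from this codebase):

// Sketch only: hooks before any conditional return.
import { useMemo } from 'react'

function PreviewPanelSketch({
  isSubflow,
  entries,
}: {
  isSubflow: boolean
  entries: Record<string, { value: unknown }>
}) {
  // Unconditional hook first, even if the subflow branch never uses it.
  const rawValues = useMemo(
    () => Object.fromEntries(Object.entries(entries).map(([k, v]) => [k, v.value])),
    [entries]
  )

  // Early return only after every hook has been called.
  if (isSubflow) return <div>Subflow preview</div>

  return <pre>{JSON.stringify(rawValues, null, 2)}</pre>
}

export default PreviewPanelSketch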

@@ -15,10 +15,9 @@ interface WorkflowPreviewSubflowData {
}
/**
* Lightweight subflow component for workflow previews.
* Matches the styling of the actual SubflowNodeComponent but without
* hooks, store subscriptions, or interactive features.
* Used in template cards and other preview contexts for performance.
* Preview subflow component for workflow visualization.
* Renders loop/parallel containers without hooks, store subscriptions,
* or interactive features.
*/
function WorkflowPreviewSubflowInner({ data }: NodeProps<WorkflowPreviewSubflowData>) {
const { name, width = 500, height = 300, kind, isPreviewSelected = false } = data

View File

@@ -1,2 +1,2 @@
export { BlockDetailsSidebar } from './components/block-details-sidebar'
export { PreviewEditor } from './components/preview-editor'
export { getLeftmostBlockId, WorkflowPreview } from './preview'

View File

@@ -15,14 +15,10 @@ import 'reactflow/dist/style.css'
import { createLogger } from '@sim/logger'
import { cn } from '@/lib/core/utils/cn'
import { BLOCK_DIMENSIONS, CONTAINER_DIMENSIONS } from '@/lib/workflows/blocks/block-dimensions'
import { NoteBlock } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/note-block/note-block'
import { SubflowNodeComponent } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/subflows/subflow-node'
import { WorkflowBlock } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/workflow-block/workflow-block'
import { WorkflowEdge } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/workflow-edge/workflow-edge'
import { estimateBlockDimensions } from '@/app/workspace/[workspaceId]/w/[workflowId]/utils'
import { WorkflowPreviewBlock } from '@/app/workspace/[workspaceId]/w/components/preview/components/block'
import { WorkflowPreviewSubflow } from '@/app/workspace/[workspaceId]/w/components/preview/components/subflow'
import { getBlock } from '@/blocks'
import type { BlockState, WorkflowState } from '@/stores/workflows/workflow/types'
const logger = createLogger('WorkflowPreview')
@@ -134,8 +130,6 @@ interface WorkflowPreviewProps {
onNodeContextMenu?: (blockId: string, mousePosition: { x: number; y: number }) => void
/** Callback when the canvas (empty area) is clicked */
onPaneClick?: () => void
/** Use lightweight blocks for better performance in template cards */
lightweight?: boolean
/** Cursor style to show when hovering the canvas */
cursorStyle?: 'default' | 'pointer' | 'grab'
/** Map of executed block IDs to their status for highlighting the execution path */
@@ -145,19 +139,10 @@ interface WorkflowPreviewProps {
}
/**
* Full node types with interactive WorkflowBlock for detailed previews
* This prevents interaction issues while allowing canvas panning and node clicking.
*/
const fullNodeTypes: NodeTypes = {
workflowBlock: WorkflowBlock,
noteBlock: NoteBlock,
subflowNode: SubflowNodeComponent,
}
/**
* Lightweight node types for template cards and other high-volume previews.
* Uses minimal components without hooks or store subscriptions.
*/
const lightweightNodeTypes: NodeTypes = {
/**
* Preview node types using minimal components without hooks or store subscriptions.
*/
const previewNodeTypes: NodeTypes = {
workflowBlock: WorkflowPreviewBlock,
noteBlock: WorkflowPreviewBlock,
subflowNode: WorkflowPreviewSubflow,
@@ -172,17 +157,19 @@ const edgeTypes: EdgeTypes = {
interface FitViewOnChangeProps {
nodeIds: string
fitPadding: number
containerRef: React.RefObject<HTMLDivElement | null>
}
/**
* Helper component that calls fitView when the set of nodes changes.
* Helper component that calls fitView when the set of nodes changes or when the container resizes.
* Only triggers on actual node additions/removals, not on selection changes.
* Must be rendered inside ReactFlowProvider.
*/
function FitViewOnChange({ nodeIds, fitPadding }: FitViewOnChangeProps) {
function FitViewOnChange({ nodeIds, fitPadding, containerRef }: FitViewOnChangeProps) {
const { fitView } = useReactFlow()
const lastNodeIdsRef = useRef<string | null>(null)
// Fit view when nodes change
useEffect(() => {
if (!nodeIds.length) return
const shouldFit = lastNodeIdsRef.current !== nodeIds
@@ -195,6 +182,27 @@ function FitViewOnChange({ nodeIds, fitPadding }: FitViewOnChangeProps) {
return () => clearTimeout(timeoutId)
}, [nodeIds, fitPadding, fitView])
// Fit view when container resizes (debounced to avoid excessive calls during drag)
useEffect(() => {
const container = containerRef.current
if (!container) return
let timeoutId: ReturnType<typeof setTimeout> | null = null
const resizeObserver = new ResizeObserver(() => {
if (timeoutId) clearTimeout(timeoutId)
timeoutId = setTimeout(() => {
fitView({ padding: fitPadding, duration: 150 })
}, 100)
})
resizeObserver.observe(container)
return () => {
if (timeoutId) clearTimeout(timeoutId)
resizeObserver.disconnect()
}
}, [containerRef, fitPadding, fitView])
return null
}
@@ -210,12 +218,12 @@ export function WorkflowPreview({
onNodeClick,
onNodeContextMenu,
onPaneClick,
lightweight = false,
cursorStyle = 'grab',
executedBlocks,
selectedBlockId,
}: WorkflowPreviewProps) {
const nodeTypes = lightweight ? lightweightNodeTypes : fullNodeTypes
const containerRef = useRef<HTMLDivElement>(null)
const nodeTypes = previewNodeTypes
const isValidWorkflowState = workflowState?.blocks && workflowState.edges
const blocksStructure = useMemo(() => {
@@ -285,119 +293,28 @@ export function WorkflowPreview({
const absolutePosition = calculateAbsolutePosition(block, workflowState.blocks)
if (lightweight) {
// Handle loop/parallel containers
if (block.type === 'loop' || block.type === 'parallel') {
const isSelected = selectedBlockId === blockId
const dimensions = calculateContainerDimensions(blockId, workflowState.blocks)
nodeArray.push({
id: blockId,
type: 'subflowNode',
position: absolutePosition,
draggable: false,
data: {
name: block.name,
width: dimensions.width,
height: dimensions.height,
kind: block.type as 'loop' | 'parallel',
isPreviewSelected: isSelected,
},
})
return
}
const isSelected = selectedBlockId === blockId
let lightweightExecutionStatus: ExecutionStatus | undefined
if (executedBlocks) {
const blockExecution = executedBlocks[blockId]
if (blockExecution) {
if (blockExecution.status === 'error') {
lightweightExecutionStatus = 'error'
} else if (blockExecution.status === 'success') {
lightweightExecutionStatus = 'success'
} else {
lightweightExecutionStatus = 'not-executed'
}
} else {
lightweightExecutionStatus = 'not-executed'
}
}
nodeArray.push({
id: blockId,
type: 'workflowBlock',
position: absolutePosition,
draggable: false,
// Blocks inside subflows need higher z-index to appear above the container
zIndex: block.data?.parentId ? 10 : undefined,
data: {
type: block.type,
name: block.name,
isTrigger: block.triggerMode === true,
horizontalHandles: block.horizontalHandles ?? false,
enabled: block.enabled ?? true,
isPreviewSelected: isSelected,
executionStatus: lightweightExecutionStatus,
},
})
return
}
if (block.type === 'loop') {
const isSelected = selectedBlockId === blockId
const dimensions = calculateContainerDimensions(blockId, workflowState.blocks)
nodeArray.push({
id: blockId,
type: 'subflowNode',
position: absolutePosition,
parentId: block.data?.parentId,
extent: block.data?.extent || undefined,
draggable: false,
data: {
...block.data,
name: block.name,
width: dimensions.width,
height: dimensions.height,
state: 'valid',
kind: block.type as 'loop' | 'parallel',
isPreview: true,
isPreviewSelected: isSelected,
kind: 'loop',
},
})
return
}
if (block.type === 'parallel') {
// Handle regular blocks
const isSelected = selectedBlockId === blockId
const dimensions = calculateContainerDimensions(blockId, workflowState.blocks)
nodeArray.push({
id: blockId,
type: 'subflowNode',
position: absolutePosition,
parentId: block.data?.parentId,
extent: block.data?.extent || undefined,
draggable: false,
data: {
...block.data,
name: block.name,
width: dimensions.width,
height: dimensions.height,
state: 'valid',
isPreview: true,
isPreviewSelected: isSelected,
kind: 'parallel',
},
})
return
}
const blockConfig = getBlock(block.type)
if (!blockConfig) {
logger.error(`No configuration found for block type: ${block.type}`, { blockId })
return
}
const nodeType = block.type === 'note' ? 'noteBlock' : 'workflowBlock'
let executionStatus: ExecutionStatus | undefined
if (executedBlocks) {
@@ -415,24 +332,20 @@ export function WorkflowPreview({
}
}
const isSelected = selectedBlockId === blockId
nodeArray.push({
id: blockId,
type: nodeType,
type: 'workflowBlock',
position: absolutePosition,
draggable: false,
// Blocks inside subflows need higher z-index to appear above the container
zIndex: block.data?.parentId ? 10 : undefined,
data: {
type: block.type,
config: blockConfig,
name: block.name,
blockState: block,
isTrigger: block.triggerMode === true,
canEdit: false,
horizontalHandles: block.horizontalHandles ?? false,
isPreview: true,
enabled: block.enabled ?? true,
isPreviewSelected: isSelected,
subBlockValues: block.subBlocks ?? {},
executionStatus,
},
})
@@ -445,7 +358,6 @@ export function WorkflowPreview({
parallelsStructure,
workflowState.blocks,
isValidWorkflowState,
lightweight,
executedBlocks,
selectedBlockId,
])
@@ -503,6 +415,7 @@ export function WorkflowPreview({
return (
<ReactFlowProvider>
<div
ref={containerRef}
style={{ height, width, backgroundColor: 'var(--bg)' }}
className={cn('preview-mode', onNodeClick && 'interactive-nodes', className)}
>
@@ -513,6 +426,20 @@
.preview-mode .react-flow__selectionpane { cursor: ${cursorStyle} !important; }
.preview-mode .react-flow__renderer { cursor: ${cursorStyle}; }
/* Active/grabbing cursor when dragging */
${
cursorStyle === 'grab'
? `
.preview-mode .react-flow:active { cursor: grabbing; }
.preview-mode .react-flow__pane:active { cursor: grabbing !important; }
.preview-mode .react-flow__selectionpane:active { cursor: grabbing !important; }
.preview-mode .react-flow__renderer:active { cursor: grabbing; }
.preview-mode .react-flow__node:active { cursor: grabbing !important; }
.preview-mode .react-flow__node:active * { cursor: grabbing !important; }
`
: ''
}
/* Node cursor - pointer on nodes when onNodeClick is provided */
.preview-mode.interactive-nodes .react-flow__node { cursor: pointer !important; }
.preview-mode.interactive-nodes .react-flow__node > div { cursor: pointer !important; }
@@ -560,7 +487,11 @@ export function WorkflowPreview({
}
onPaneClick={onPaneClick}
/>
<FitViewOnChange nodeIds={blocksStructure.ids} fitPadding={fitPadding} />
<FitViewOnChange
nodeIds={blocksStructure.ids}
fitPadding={fitPadding}
containerRef={containerRef}
/>
</div>
</ReactFlowProvider>
)
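For context on the containerRef prop now threaded into FitViewOnChange above: the helper's implementation is not part of this diff, so the following is only a minimal sketch of how such a component could work, assuming the React Flow (@xyflow/react) useReactFlow/fitView API and assuming the ref is used to re-fit when the preview container resizes. Names ending in "Sketch" are illustrative, not the repository's code.

import { type RefObject, useEffect } from 'react'
import { useReactFlow } from '@xyflow/react'

interface FitViewOnChangeSketchProps {
  nodeIds: string[] // ids of the currently rendered preview nodes
  fitPadding?: number // padding forwarded to fitView
  containerRef?: RefObject<HTMLDivElement> // assumption: re-fit on container resize
}

// Sketch only: re-fit the viewport whenever the node set or the container size changes.
export function FitViewOnChangeSketch({ nodeIds, fitPadding = 0.1, containerRef }: FitViewOnChangeSketchProps) {
  const { fitView } = useReactFlow()

  // Re-fit when the set of rendered nodes changes.
  useEffect(() => {
    fitView({ padding: fitPadding })
  }, [nodeIds.join(','), fitPadding, fitView])

  // Re-fit when the preview container is resized (hypothetical use of containerRef).
  useEffect(() => {
    const el = containerRef?.current
    if (!el) return
    const observer = new ResizeObserver(() => fitView({ padding: fitPadding }))
    observer.observe(el)
    return () => observer.disconnect()
  }, [containerRef, fitPadding, fitView])

  return null
}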

View File

@@ -208,6 +208,8 @@ async function runWorkflowExecution({
snapshot,
callbacks: {},
loggingSession,
includeFileBase64: true,
base64MaxBytes: undefined,
})
if (executionResult.status === 'paused') {

View File

@@ -240,6 +240,8 @@ async function executeWebhookJobInternal(
snapshot,
callbacks: {},
loggingSession,
includeFileBase64: true, // Enable base64 hydration
base64MaxBytes: undefined, // Use default limit
})
if (executionResult.status === 'paused') {
@@ -493,6 +495,7 @@ async function executeWebhookJobInternal(
snapshot,
callbacks: {},
loggingSession,
includeFileBase64: true,
})
if (executionResult.status === 'paused') {

View File

@@ -109,6 +109,8 @@ export async function executeWorkflowJob(payload: WorkflowExecutionPayload) {
snapshot,
callbacks: {},
loggingSession,
includeFileBase64: true,
base64MaxBytes: undefined,
})
if (result.status === 'paused') {
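The three hunks above all thread the same two new options into the execution call alongside snapshot, callbacks, and loggingSession. As a rough sketch of the intent, with everything other than the two field names shown in the diff being illustrative rather than the repository's actual types:

// Hypothetical shape of the file-hydration options added in this diff.
interface FileHydrationOptions {
  // When true, file outputs are hydrated with base64 content so downstream blocks can consume them.
  includeFileBase64?: boolean
  // Optional cap on hydrated file size; undefined falls back to the default limit.
  base64MaxBytes?: number
}

// Example of the call-site shape used by the trigger, webhook, and job executors above (sketch only):
const executionOptions: FileHydrationOptions = {
  includeFileBase64: true,
  base64MaxBytes: undefined, // use the default limit
}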

View File

@@ -107,14 +107,26 @@ export const A2ABlock: BlockConfig<A2AResponse> = {
condition: { field: 'operation', value: 'a2a_send_message' },
},
{
id: 'files',
id: 'fileUpload',
title: 'Files',
type: 'file-upload',
canonicalParamId: 'files',
placeholder: 'Upload files to send',
description: 'Files to include with the message (FilePart)',
condition: { field: 'operation', value: 'a2a_send_message' },
mode: 'basic',
multiple: true,
},
{
id: 'fileReference',
title: 'Files',
type: 'short-input',
canonicalParamId: 'files',
placeholder: 'Reference files from previous blocks',
description: 'Files to include with the message (FilePart)',
condition: { field: 'operation', value: 'a2a_send_message' },
mode: 'advanced',
},
{
id: 'taskId',
title: 'Task ID',
@@ -202,6 +214,15 @@ export const A2ABlock: BlockConfig<A2AResponse> = {
],
config: {
tool: (params) => params.operation as string,
params: (params) => {
const { fileUpload, fileReference, ...rest } = params
const hasFileUpload = Array.isArray(fileUpload) ? fileUpload.length > 0 : !!fileUpload
const files = hasFileUpload ? fileUpload : fileReference
return {
...rest,
...(files ? { files } : {}),
}
},
},
},
inputs: {
@@ -233,6 +254,14 @@ export const A2ABlock: BlockConfig<A2AResponse> = {
type: 'array',
description: 'Files to include with the message',
},
fileUpload: {
type: 'array',
description: 'Uploaded files (basic mode)',
},
fileReference: {
type: 'json',
description: 'File reference from previous blocks (advanced mode)',
},
historyLength: {
type: 'number',
description: 'Number of history messages to include',
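The new fileUpload/fileReference pair above shares canonicalParamId 'files', and the params mapper prefers a non-empty upload, falling back to the referenced files otherwise. A standalone illustration of that resolution, mirroring the logic in the diff (the example values are hypothetical):

// Mirrors the A2A params mapper: prefer uploaded files, otherwise use the advanced-mode reference.
function resolveA2AFiles(fileUpload: unknown, fileReference: unknown): unknown {
  const hasFileUpload = Array.isArray(fileUpload) ? fileUpload.length > 0 : !!fileUpload
  return hasFileUpload ? fileUpload : fileReference
}

resolveA2AFiles([{ name: 'spec.pdf' }], undefined) // -> uploaded files (basic mode)
resolveA2AFiles([], 'reference-to-previous-block-files') // -> the reference (advanced mode)
resolveA2AFiles(undefined, undefined) // -> undefined, so no files param is sent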

View File

@@ -5,8 +5,9 @@ import type { ConfluenceResponse } from '@/tools/confluence/types'
export const ConfluenceBlock: BlockConfig<ConfluenceResponse> = {
type: 'confluence',
name: 'Confluence',
name: 'Confluence (Legacy)',
description: 'Interact with Confluence',
hideFromToolbar: true,
authMode: AuthMode.OAuth,
longDescription:
'Integrate Confluence into the workflow. Can read, create, update, delete pages, manage comments, attachments, labels, and search content.',
@@ -357,3 +358,342 @@ export const ConfluenceBlock: BlockConfig<ConfluenceResponse> = {
status: { type: 'string', description: 'Space status' },
},
}
export const ConfluenceV2Block: BlockConfig<ConfluenceResponse> = {
...ConfluenceBlock,
type: 'confluence_v2',
name: 'Confluence',
hideFromToolbar: false,
subBlocks: [
{
id: 'operation',
title: 'Operation',
type: 'dropdown',
options: [
{ label: 'Read Page', id: 'read' },
{ label: 'Create Page', id: 'create' },
{ label: 'Update Page', id: 'update' },
{ label: 'Delete Page', id: 'delete' },
{ label: 'Search Content', id: 'search' },
{ label: 'Create Comment', id: 'create_comment' },
{ label: 'List Comments', id: 'list_comments' },
{ label: 'Update Comment', id: 'update_comment' },
{ label: 'Delete Comment', id: 'delete_comment' },
{ label: 'Upload Attachment', id: 'upload_attachment' },
{ label: 'List Attachments', id: 'list_attachments' },
{ label: 'Delete Attachment', id: 'delete_attachment' },
{ label: 'List Labels', id: 'list_labels' },
{ label: 'Get Space', id: 'get_space' },
{ label: 'List Spaces', id: 'list_spaces' },
],
value: () => 'read',
},
{
id: 'domain',
title: 'Domain',
type: 'short-input',
placeholder: 'Enter Confluence domain (e.g., simstudio.atlassian.net)',
required: true,
},
{
id: 'credential',
title: 'Confluence Account',
type: 'oauth-input',
serviceId: 'confluence',
requiredScopes: [
'read:confluence-content.all',
'read:confluence-space.summary',
'read:space:confluence',
'read:space-details:confluence',
'write:confluence-content',
'write:confluence-space',
'write:confluence-file',
'read:content:confluence',
'read:page:confluence',
'write:page:confluence',
'read:comment:confluence',
'write:comment:confluence',
'delete:comment:confluence',
'read:attachment:confluence',
'write:attachment:confluence',
'delete:attachment:confluence',
'delete:page:confluence',
'read:label:confluence',
'write:label:confluence',
'search:confluence',
'read:me',
'offline_access',
],
placeholder: 'Select Confluence account',
required: true,
},
{
id: 'pageId',
title: 'Select Page',
type: 'file-selector',
canonicalParamId: 'pageId',
serviceId: 'confluence',
placeholder: 'Select Confluence page',
dependsOn: ['credential', 'domain'],
mode: 'basic',
},
{
id: 'manualPageId',
title: 'Page ID',
type: 'short-input',
canonicalParamId: 'pageId',
placeholder: 'Enter Confluence page ID',
mode: 'advanced',
},
{
id: 'spaceId',
title: 'Space ID',
type: 'short-input',
placeholder: 'Enter Confluence space ID',
required: true,
condition: { field: 'operation', value: ['create', 'get_space'] },
},
{
id: 'title',
title: 'Title',
type: 'short-input',
placeholder: 'Enter title for the page',
condition: { field: 'operation', value: ['create', 'update'] },
},
{
id: 'content',
title: 'Content',
type: 'long-input',
placeholder: 'Enter content for the page',
condition: { field: 'operation', value: ['create', 'update'] },
},
{
id: 'parentId',
title: 'Parent Page ID',
type: 'short-input',
placeholder: 'Enter parent page ID (optional)',
condition: { field: 'operation', value: 'create' },
},
{
id: 'query',
title: 'Search Query',
type: 'short-input',
placeholder: 'Enter search query',
required: true,
condition: { field: 'operation', value: 'search' },
},
{
id: 'comment',
title: 'Comment Text',
type: 'long-input',
placeholder: 'Enter comment text',
required: true,
condition: { field: 'operation', value: ['create_comment', 'update_comment'] },
},
{
id: 'commentId',
title: 'Comment ID',
type: 'short-input',
placeholder: 'Enter comment ID',
required: true,
condition: { field: 'operation', value: ['update_comment', 'delete_comment'] },
},
{
id: 'attachmentId',
title: 'Attachment ID',
type: 'short-input',
placeholder: 'Enter attachment ID',
required: true,
condition: { field: 'operation', value: 'delete_attachment' },
},
{
id: 'attachmentFileUpload',
title: 'File',
type: 'file-upload',
canonicalParamId: 'attachmentFile',
placeholder: 'Select file to upload',
condition: { field: 'operation', value: 'upload_attachment' },
mode: 'basic',
},
{
id: 'attachmentFileReference',
title: 'File',
type: 'short-input',
canonicalParamId: 'attachmentFile',
placeholder: 'Reference file from previous blocks',
condition: { field: 'operation', value: 'upload_attachment' },
mode: 'advanced',
},
{
id: 'attachmentFileName',
title: 'File Name',
type: 'short-input',
placeholder: 'Optional custom file name',
condition: { field: 'operation', value: 'upload_attachment' },
},
{
id: 'attachmentComment',
title: 'Comment',
type: 'short-input',
placeholder: 'Optional comment for the attachment',
condition: { field: 'operation', value: 'upload_attachment' },
},
{
id: 'labelName',
title: 'Label Name',
type: 'short-input',
placeholder: 'Enter label name',
required: true,
condition: { field: 'operation', value: ['add_label', 'remove_label'] },
},
{
id: 'limit',
title: 'Limit',
type: 'short-input',
placeholder: 'Enter maximum number of results (default: 25)',
condition: {
field: 'operation',
value: ['search', 'list_comments', 'list_attachments', 'list_spaces'],
},
},
],
tools: {
access: [
'confluence_retrieve',
'confluence_update',
'confluence_create_page',
'confluence_delete_page',
'confluence_search',
'confluence_create_comment',
'confluence_list_comments',
'confluence_update_comment',
'confluence_delete_comment',
'confluence_upload_attachment',
'confluence_list_attachments',
'confluence_delete_attachment',
'confluence_list_labels',
'confluence_get_space',
'confluence_list_spaces',
],
config: {
tool: (params) => {
switch (params.operation) {
case 'read':
return 'confluence_retrieve'
case 'create':
return 'confluence_create_page'
case 'update':
return 'confluence_update'
case 'delete':
return 'confluence_delete_page'
case 'search':
return 'confluence_search'
case 'create_comment':
return 'confluence_create_comment'
case 'list_comments':
return 'confluence_list_comments'
case 'update_comment':
return 'confluence_update_comment'
case 'delete_comment':
return 'confluence_delete_comment'
case 'upload_attachment':
return 'confluence_upload_attachment'
case 'list_attachments':
return 'confluence_list_attachments'
case 'delete_attachment':
return 'confluence_delete_attachment'
case 'list_labels':
return 'confluence_list_labels'
case 'get_space':
return 'confluence_get_space'
case 'list_spaces':
return 'confluence_list_spaces'
default:
return 'confluence_retrieve'
}
},
params: (params) => {
const {
credential,
pageId,
manualPageId,
operation,
attachmentFileUpload,
attachmentFileReference,
attachmentFile,
attachmentFileName,
attachmentComment,
...rest
} = params
const effectivePageId = (pageId || manualPageId || '').trim()
const requiresPageId = [
'read',
'update',
'delete',
'create_comment',
'list_comments',
'list_attachments',
'list_labels',
'upload_attachment',
]
const requiresSpaceId = ['create', 'get_space']
if (requiresPageId.includes(operation) && !effectivePageId) {
throw new Error('Page ID is required. Please select a page or enter a page ID manually.')
}
if (requiresSpaceId.includes(operation) && !rest.spaceId) {
throw new Error('Space ID is required for this operation.')
}
if (operation === 'upload_attachment') {
const fileInput = attachmentFileUpload || attachmentFileReference || attachmentFile
if (!fileInput) {
throw new Error('File is required for upload attachment operation.')
}
return {
credential,
pageId: effectivePageId,
operation,
file: fileInput,
fileName: attachmentFileName,
comment: attachmentComment,
...rest,
}
}
return {
credential,
pageId: effectivePageId || undefined,
operation,
...rest,
}
},
},
},
inputs: {
operation: { type: 'string', description: 'Operation to perform' },
domain: { type: 'string', description: 'Confluence domain' },
credential: { type: 'string', description: 'Confluence access token' },
pageId: { type: 'string', description: 'Page identifier' },
manualPageId: { type: 'string', description: 'Manual page identifier' },
spaceId: { type: 'string', description: 'Space identifier' },
title: { type: 'string', description: 'Page title' },
content: { type: 'string', description: 'Page content' },
parentId: { type: 'string', description: 'Parent page identifier' },
query: { type: 'string', description: 'Search query' },
comment: { type: 'string', description: 'Comment text' },
commentId: { type: 'string', description: 'Comment identifier' },
attachmentId: { type: 'string', description: 'Attachment identifier' },
attachmentFile: { type: 'json', description: 'File to upload as attachment' },
attachmentFileUpload: { type: 'json', description: 'Uploaded file (basic mode)' },
attachmentFileReference: { type: 'json', description: 'File reference (advanced mode)' },
attachmentFileName: { type: 'string', description: 'Custom file name for attachment' },
attachmentComment: { type: 'string', description: 'Comment for the attachment' },
labelName: { type: 'string', description: 'Label name' },
limit: { type: 'number', description: 'Maximum number of results' },
},
}
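In ConfluenceV2Block the file-selector (basic mode) and the manual short-input (advanced mode) both map to the canonical pageId, and page-scoped operations fail fast when neither is set. A small standalone illustration of that resolution, with hypothetical values:

// Mirrors the pageId handling in ConfluenceV2Block's params mapper above.
function resolvePageId(pageId?: string, manualPageId?: string): string {
  return (pageId || manualPageId || '').trim()
}

resolvePageId('123456', undefined) // -> '123456' (picked via the page selector, basic mode)
resolvePageId(undefined, ' 654321 ') // -> '654321' (typed manually, advanced mode)
resolvePageId() // -> '', so a page-scoped operation such as 'read' throws
                // 'Page ID is required. Please select a page or enter a page ID manually.'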

View File

@@ -1,13 +1,14 @@
import { createLogger } from '@sim/logger'
import { DocumentIcon } from '@/components/icons'
import type { BlockConfig, SubBlockType } from '@/blocks/types'
import { createVersionedToolSelector } from '@/blocks/utils'
import type { FileParserOutput } from '@/tools/file/types'
const logger = createLogger('FileBlock')
export const FileBlock: BlockConfig<FileParserOutput> = {
type: 'file',
name: 'File',
name: 'File (Legacy)',
description: 'Read and parse multiple files',
longDescription: `Integrate File into the workflow. Can upload a file manually or insert a file url.`,
bestPractices: `
@@ -17,6 +18,7 @@ export const FileBlock: BlockConfig<FileParserOutput> = {
category: 'tools',
bgColor: '#40916C',
icon: DocumentIcon,
hideFromToolbar: true,
subBlocks: [
{
id: 'inputMethod',
@@ -112,6 +114,99 @@ export const FileBlock: BlockConfig<FileParserOutput> = {
fileType: { type: 'string', description: 'File type' },
file: { type: 'json', description: 'Uploaded file data' },
},
outputs: {
files: {
type: 'json',
description: 'Array of parsed file objects with content, metadata, and file properties',
},
combinedContent: {
type: 'string',
description: 'All file contents merged into a single text string',
},
processedFiles: {
type: 'files',
description: 'Array of UserFile objects for downstream use (attachments, uploads, etc.)',
},
},
}
export const FileV2Block: BlockConfig<FileParserOutput> = {
...FileBlock,
type: 'file_v2',
name: 'File',
description: 'Read and parse multiple files',
hideFromToolbar: false,
subBlocks: [
{
id: 'file',
title: 'Files',
type: 'file-upload' as SubBlockType,
canonicalParamId: 'fileInput',
acceptedTypes:
'.pdf,.csv,.doc,.docx,.txt,.md,.xlsx,.xls,.html,.htm,.pptx,.ppt,.json,.xml,.rtf',
placeholder: 'Upload files to process',
multiple: true,
mode: 'basic',
maxSize: 100,
},
{
id: 'filePath',
title: 'Files',
type: 'short-input' as SubBlockType,
canonicalParamId: 'fileInput',
placeholder: 'File URL',
mode: 'advanced',
},
],
tools: {
access: ['file_parser_v2'],
config: {
tool: createVersionedToolSelector({
baseToolSelector: () => 'file_parser',
suffix: '_v2',
fallbackToolId: 'file_parser_v2',
}),
params: (params) => {
const fileInput = params.file || params.filePath || params.fileInput
if (!fileInput) {
logger.error('No file input provided')
throw new Error('File is required')
}
if (typeof fileInput === 'string') {
return {
filePath: fileInput.trim(),
fileType: params.fileType || 'auto',
workspaceId: params._context?.workspaceId,
}
}
if (Array.isArray(fileInput) && fileInput.length > 0) {
const filePaths = fileInput.map((file) => file.path)
return {
filePath: filePaths.length === 1 ? filePaths[0] : filePaths,
fileType: params.fileType || 'auto',
}
}
if (fileInput?.path) {
return {
filePath: fileInput.path,
fileType: params.fileType || 'auto',
}
}
logger.error('Invalid file input format')
throw new Error('Invalid file input')
},
},
},
inputs: {
fileInput: { type: 'json', description: 'File input (upload or URL reference)' },
filePath: { type: 'string', description: 'File URL (advanced mode)' },
file: { type: 'json', description: 'Uploaded file data (basic mode)' },
fileType: { type: 'string', description: 'File type' },
},
outputs: {
files: {
type: 'json',
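createVersionedToolSelector is imported from '@/blocks/utils' and used by the new V2 blocks, but its implementation is not part of this diff. The sketch below is only an assumption about its likely shape, inferred from the call sites (wrap a base tool selector, append a version suffix, fall back to a fixed tool id); the "Sketch" suffix marks it as hypothetical.

// Hypothetical sketch, not the repository implementation.
interface VersionedToolSelectorOptions {
  baseToolSelector: (params: Record<string, unknown>) => string
  suffix: string
  fallbackToolId: string
}

function createVersionedToolSelectorSketch({
  baseToolSelector,
  suffix,
  fallbackToolId,
}: VersionedToolSelectorOptions) {
  return (params: Record<string, unknown>): string => {
    const baseId = baseToolSelector(params)
    return baseId ? `${baseId}${suffix}` : fallbackToolId
  }
}

// With the FileV2Block arguments above, () => 'file_parser' plus '_v2' resolves to 'file_parser_v2'.
const selectFileParserTool = createVersionedToolSelectorSketch({
  baseToolSelector: () => 'file_parser',
  suffix: '_v2',
  fallbackToolId: 'file_parser_v2',
})
selectFileParserTool({}) // -> 'file_parser_v2'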

View File

@@ -1,11 +1,13 @@
import { MistralIcon } from '@/components/icons'
import { AuthMode, type BlockConfig, type SubBlockType } from '@/blocks/types'
import { createVersionedToolSelector } from '@/blocks/utils'
import type { MistralParserOutput } from '@/tools/mistral/types'
export const MistralParseBlock: BlockConfig<MistralParserOutput> = {
type: 'mistral_parse',
name: 'Mistral Parser',
name: 'Mistral Parser (Legacy)',
description: 'Extract text from PDF documents',
hideFromToolbar: true,
authMode: AuthMode.ApiKey,
longDescription: `Integrate Mistral Parse into the workflow. Can extract text from uploaded PDF documents, or from a URL.`,
docsLink: 'https://docs.sim.ai/tools/mistral_parse',
@@ -13,7 +15,6 @@ export const MistralParseBlock: BlockConfig<MistralParserOutput> = {
bgColor: '#000000',
icon: MistralIcon,
subBlocks: [
// Show input method selection
{
id: 'inputMethod',
title: 'Select Input Method',
@@ -23,8 +24,6 @@ export const MistralParseBlock: BlockConfig<MistralParserOutput> = {
{ id: 'upload', label: 'Upload PDF Document' },
],
},
// URL input - conditional on inputMethod
{
id: 'filePath',
title: 'PDF Document URL',
@@ -35,8 +34,6 @@ export const MistralParseBlock: BlockConfig<MistralParserOutput> = {
value: 'url',
},
},
// File upload option
{
id: 'fileUpload',
title: 'Upload PDF',
@@ -46,9 +43,8 @@ export const MistralParseBlock: BlockConfig<MistralParserOutput> = {
field: 'inputMethod',
value: 'upload',
},
maxSize: 50, // 50MB max via direct upload
maxSize: 50,
},
{
id: 'resultType',
title: 'Output Format',
@@ -65,28 +61,6 @@ export const MistralParseBlock: BlockConfig<MistralParserOutput> = {
type: 'short-input',
placeholder: 'e.g. 0,1,2 (leave empty for all pages)',
},
/*
* Image-related parameters - temporarily disabled
* Uncomment if PDF image extraction is needed
*
{
id: 'includeImageBase64',
title: 'Include PDF Images',
type: 'switch',
},
{
id: 'imageLimit',
title: 'Max Images',
type: 'short-input',
placeholder: 'Maximum number of images to extract',
},
{
id: 'imageMinSize',
title: 'Min Image Size (px)',
type: 'short-input',
placeholder: 'Min width/height in pixels',
},
*/
{
id: 'apiKey',
title: 'API Key',
@@ -101,18 +75,15 @@ export const MistralParseBlock: BlockConfig<MistralParserOutput> = {
config: {
tool: () => 'mistral_parser',
params: (params) => {
// Basic validation
if (!params || !params.apiKey || params.apiKey.trim() === '') {
throw new Error('Mistral API key is required')
}
// Build parameters object - file processing is now handled at the tool level
const parameters: Record<string, unknown> = {
const parameters: any = {
apiKey: params.apiKey.trim(),
resultType: params.resultType || 'markdown',
}
// Set filePath or fileUpload based on input method
const inputMethod = params.inputMethod || 'url'
if (inputMethod === 'url') {
if (!params.filePath || params.filePath.trim() === '') {
@@ -123,11 +94,9 @@ export const MistralParseBlock: BlockConfig<MistralParserOutput> = {
if (!params.fileUpload) {
throw new Error('Please upload a PDF document')
}
// Pass the entire fileUpload object to the tool
parameters.fileUpload = params.fileUpload
}
// Convert pages input from string to array of numbers if provided
let pagesArray: number[] | undefined
if (params.pages && params.pages.trim() !== '') {
try {
@@ -146,12 +115,12 @@ export const MistralParseBlock: BlockConfig<MistralParserOutput> = {
if (pagesArray && pagesArray.length === 0) {
pagesArray = undefined
}
} catch (error: any) {
} catch (error: unknown) {
throw new Error(`Page number format error: ${error.message}`)
const errorMessage = error instanceof Error ? error.message : String(error)
throw new Error(`Page number format error: ${errorMessage}`)
}
}
// Add optional parameters
if (pagesArray && pagesArray.length > 0) {
parameters.pages = pagesArray
}
@@ -173,3 +142,129 @@ export const MistralParseBlock: BlockConfig<MistralParserOutput> = {
metadata: { type: 'json', description: 'Processing metadata' },
},
}
export const MistralParseV2Block: BlockConfig<MistralParserOutput> = {
...MistralParseBlock,
type: 'mistral_parse_v2',
name: 'Mistral Parser',
description: 'Extract text from PDF documents',
hideFromToolbar: false,
subBlocks: [
{
id: 'fileUpload',
title: 'PDF Document',
type: 'file-upload' as SubBlockType,
canonicalParamId: 'document',
acceptedTypes: 'application/pdf',
placeholder: 'Upload a PDF document',
mode: 'basic',
maxSize: 50,
},
{
id: 'filePath',
title: 'PDF Document',
type: 'short-input' as SubBlockType,
canonicalParamId: 'document',
placeholder: 'Document URL',
mode: 'advanced',
},
{
id: 'resultType',
title: 'Output Format',
type: 'dropdown',
options: [
{ id: 'markdown', label: 'Markdown' },
{ id: 'text', label: 'Plain Text' },
{ id: 'json', label: 'JSON' },
],
},
{
id: 'pages',
title: 'Specific Pages',
type: 'short-input',
placeholder: 'e.g. 0,1,2 (leave empty for all pages)',
},
{
id: 'apiKey',
title: 'API Key',
type: 'short-input' as SubBlockType,
placeholder: 'Enter your Mistral API key',
password: true,
required: true,
},
],
tools: {
access: ['mistral_parser_v2'],
config: {
tool: createVersionedToolSelector({
baseToolSelector: () => 'mistral_parser',
suffix: '_v2',
fallbackToolId: 'mistral_parser_v2',
}),
params: (params) => {
if (!params || !params.apiKey || params.apiKey.trim() === '') {
throw new Error('Mistral API key is required')
}
const parameters: Record<string, unknown> = {
apiKey: params.apiKey.trim(),
resultType: params.resultType || 'markdown',
}
const documentInput = params.fileUpload || params.filePath || params.document
if (!documentInput) {
throw new Error('PDF document is required')
}
if (typeof documentInput === 'object') {
parameters.fileUpload = documentInput
} else if (typeof documentInput === 'string') {
parameters.filePath = documentInput.trim()
}
let pagesArray: number[] | undefined
if (params.pages && params.pages.trim() !== '') {
try {
pagesArray = params.pages
.split(',')
.map((p: string) => p.trim())
.filter((p: string) => p.length > 0)
.map((p: string) => {
const num = Number.parseInt(p, 10)
if (Number.isNaN(num) || num < 0) {
throw new Error(`Invalid page number: ${p}`)
}
return num
})
if (pagesArray && pagesArray.length === 0) {
pagesArray = undefined
}
} catch (error: unknown) {
const errorMessage = error instanceof Error ? error.message : String(error)
throw new Error(`Page number format error: ${errorMessage}`)
}
}
if (pagesArray && pagesArray.length > 0) {
parameters.pages = pagesArray
}
return parameters
},
},
},
inputs: {
document: { type: 'json', description: 'Document input (file upload or URL reference)' },
filePath: { type: 'string', description: 'PDF document URL (advanced mode)' },
fileUpload: { type: 'json', description: 'Uploaded PDF file (basic mode)' },
apiKey: { type: 'string', description: 'Mistral API key' },
resultType: { type: 'string', description: 'Output format type' },
pages: { type: 'string', description: 'Page selection' },
},
outputs: {
pages: { type: 'array', description: 'Array of page objects from Mistral OCR' },
model: { type: 'string', description: 'Mistral OCR model identifier' },
usage_info: { type: 'json', description: 'Usage statistics from the API' },
document_annotation: { type: 'string', description: 'Structured annotation data' },
},
}
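Both Mistral parser blocks share the same page-selection parsing: a comma-separated, whitespace-tolerant, zero-based list that rejects negative or non-numeric entries and treats an empty result as "all pages". Extracted as a standalone function for illustration (the example inputs are hypothetical):

// Mirrors the pages handling in the Mistral parser params mappers above.
function parsePages(pages: string): number[] | undefined {
  const parsed = pages
    .split(',')
    .map((p) => p.trim())
    .filter((p) => p.length > 0)
    .map((p) => {
      const num = Number.parseInt(p, 10)
      if (Number.isNaN(num) || num < 0) {
        throw new Error(`Invalid page number: ${p}`)
      }
      return num
    })
  return parsed.length > 0 ? parsed : undefined
}

parsePages('0, 1, 2') // -> [0, 1, 2]
parsePages('   ') // -> undefined (all pages)
parsePages('1,-2') // throws 'Invalid page number: -2'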

View File

@@ -15,34 +15,22 @@ export const PulseBlock: BlockConfig<PulseParserOutput> = {
icon: PulseIcon,
subBlocks: [
{
id: 'inputMethod',
id: 'fileUpload',
title: 'Select Input Method',
title: 'Document',
type: 'dropdown' as SubBlockType,
type: 'file-upload' as SubBlockType,
options: [
canonicalParamId: 'document',
{ id: 'url', label: 'Document URL' },
acceptedTypes: 'application/pdf,image/*,.docx,.pptx,.xlsx',
{ id: 'upload', label: 'Upload Document' },
placeholder: 'Upload a document',
],
mode: 'basic',
maxSize: 50,
},
{
id: 'filePath',
title: 'Document URL',
title: 'Document',
type: 'short-input' as SubBlockType,
placeholder: 'Enter full URL to a document (https://example.com/document.pdf)',
canonicalParamId: 'document',
condition: {
placeholder: 'Document URL',
field: 'inputMethod',
mode: 'advanced',
value: 'url',
},
},
{
id: 'fileUpload',
title: 'Upload Document',
type: 'file-upload' as SubBlockType,
acceptedTypes: 'application/pdf,image/*,.docx,.pptx,.xlsx',
condition: {
field: 'inputMethod',
value: 'upload',
},
maxSize: 50,
},
{
id: 'pages',
@@ -84,17 +72,14 @@ export const PulseBlock: BlockConfig<PulseParserOutput> = {
apiKey: params.apiKey.trim(),
}
const inputMethod = params.inputMethod || 'url'
const documentInput = params.fileUpload || params.filePath || params.document
if (inputMethod === 'url') {
if (!documentInput) {
if (!params.filePath || params.filePath.trim() === '') {
throw new Error('Document is required')
throw new Error('Document URL is required')
}
}
if (typeof documentInput === 'object') {
parameters.filePath = params.filePath.trim()
parameters.fileUpload = documentInput
} else if (inputMethod === 'upload') {
} else if (typeof documentInput === 'string') {
if (!params.fileUpload) {
parameters.filePath = documentInput.trim()
throw new Error('Please upload a document')
}
parameters.fileUpload = params.fileUpload
}
if (params.pages && params.pages.trim() !== '') {
@@ -117,9 +102,9 @@ export const PulseBlock: BlockConfig<PulseParserOutput> = {
},
},
inputs: {
inputMethod: { type: 'string', description: 'Input method selection' },
document: { type: 'json', description: 'Document input (file upload or URL reference)' },
filePath: { type: 'string', description: 'Document URL' },
filePath: { type: 'string', description: 'Document URL (advanced mode)' },
fileUpload: { type: 'json', description: 'Uploaded document file' },
fileUpload: { type: 'json', description: 'Uploaded document file (basic mode)' },
apiKey: { type: 'string', description: 'Pulse API key' },
pages: { type: 'string', description: 'Page range selection' },
chunking: {

View File

@@ -14,34 +14,22 @@ export const ReductoBlock: BlockConfig<ReductoParserOutput> = {
icon: ReductoIcon,
subBlocks: [
{
id: 'inputMethod',
id: 'fileUpload',
title: 'Select Input Method',
title: 'PDF Document',
type: 'dropdown' as SubBlockType,
type: 'file-upload' as SubBlockType,
options: [
canonicalParamId: 'document',
{ id: 'url', label: 'PDF Document URL' },
acceptedTypes: 'application/pdf',
{ id: 'upload', label: 'Upload PDF Document' },
placeholder: 'Upload a PDF document',
],
mode: 'basic',
maxSize: 50,
},
{
id: 'filePath',
title: 'PDF Document URL',
title: 'PDF Document',
type: 'short-input' as SubBlockType,
placeholder: 'Enter full URL to a PDF document (https://example.com/document.pdf)',
canonicalParamId: 'document',
condition: {
placeholder: 'Document URL',
field: 'inputMethod',
mode: 'advanced',
value: 'url',
},
},
{
id: 'fileUpload',
title: 'Upload PDF',
type: 'file-upload' as SubBlockType,
acceptedTypes: 'application/pdf',
condition: {
field: 'inputMethod',
value: 'upload',
},
maxSize: 50,
},
{
id: 'pages',
@@ -80,17 +68,15 @@ export const ReductoBlock: BlockConfig<ReductoParserOutput> = {
apiKey: params.apiKey.trim(),
}
const inputMethod = params.inputMethod || 'url'
const documentInput = params.fileUpload || params.filePath || params.document
if (inputMethod === 'url') {
if (!documentInput) {
if (!params.filePath || params.filePath.trim() === '') {
throw new Error('PDF document is required')
throw new Error('PDF Document URL is required')
}
}
parameters.filePath = params.filePath.trim()
if (typeof documentInput === 'object') {
} else if (inputMethod === 'upload') {
parameters.fileUpload = documentInput
if (!params.fileUpload) {
} else if (typeof documentInput === 'string') {
throw new Error('Please upload a PDF document')
parameters.filePath = documentInput.trim()
}
parameters.fileUpload = params.fileUpload
}
let pagesArray: number[] | undefined
@@ -130,9 +116,9 @@ export const ReductoBlock: BlockConfig<ReductoParserOutput> = {
},
},
inputs: {
inputMethod: { type: 'string', description: 'Input method selection' },
document: { type: 'json', description: 'Document input (file upload or URL reference)' },
filePath: { type: 'string', description: 'PDF document URL' },
filePath: { type: 'string', description: 'PDF document URL (advanced mode)' },
fileUpload: { type: 'json', description: 'Uploaded PDF file' },
fileUpload: { type: 'json', description: 'Uploaded PDF file (basic mode)' },
apiKey: { type: 'string', description: 'Reducto API key' },
pages: { type: 'string', description: 'Page selection' },
tableOutputFormat: { type: 'string', description: 'Table output format' },

View File

@@ -414,6 +414,10 @@ export const S3Block: BlockConfig<S3Response> = {
},
outputs: {
url: { type: 'string', description: 'URL of S3 object' },
uri: {
type: 'string',
description: 'S3 URI (s3://bucket/key) for use with other AWS services',
},
objects: { type: 'json', description: 'List of objects (for list operation)' },
deleted: { type: 'boolean', description: 'Deletion status' },
metadata: { type: 'json', description: 'Operation metadata' },
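The new uri output pairs naturally with the Textract block introduced later in this diff, whose multi-page mode expects an s3://bucket/key location. A trivial illustration of the URI shape involved (the bucket and key values are hypothetical):

// The s3://bucket/key shape described by the new `uri` output and expected by Textract's s3Uri field.
function toS3Uri(bucket: string, key: string): string {
  return `s3://${bucket}/${key}`
}

toS3Uri('invoices', '2026/01/statement.pdf') // -> 's3://invoices/2026/01/statement.pdf'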

View File

@@ -0,0 +1,191 @@
import { TextractIcon } from '@/components/icons'
import { AuthMode, type BlockConfig, type SubBlockType } from '@/blocks/types'
import type { TextractParserOutput } from '@/tools/textract/types'
export const TextractBlock: BlockConfig<TextractParserOutput> = {
type: 'textract',
name: 'AWS Textract',
description: 'Extract text, tables, and forms from documents',
authMode: AuthMode.ApiKey,
longDescription: `Integrate AWS Textract into your workflow to extract text, tables, forms, and key-value pairs from documents. Single-page mode supports JPEG, PNG, and single-page PDF. Multi-page mode supports multi-page PDF and TIFF.`,
docsLink: 'https://docs.sim.ai/tools/textract',
category: 'tools',
bgColor: 'linear-gradient(135deg, #055F4E 0%, #56C0A7 100%)',
icon: TextractIcon,
subBlocks: [
{
id: 'processingMode',
title: 'Processing Mode',
type: 'dropdown' as SubBlockType,
options: [
{ id: 'sync', label: 'Single Page (JPEG, PNG, 1-page PDF)' },
{ id: 'async', label: 'Multi-Page (PDF, TIFF via S3)' },
],
tooltip:
'Single Page uses synchronous API for JPEG, PNG, or single-page PDF. Multi-Page uses async API for multi-page PDF/TIFF stored in S3.',
},
{
id: 'fileUpload',
title: 'Document',
type: 'file-upload' as SubBlockType,
canonicalParamId: 'document',
acceptedTypes: 'image/jpeg,image/png,application/pdf',
placeholder: 'Upload JPEG, PNG, or single-page PDF (max 10MB)',
condition: {
field: 'processingMode',
value: 'async',
not: true,
},
mode: 'basic',
maxSize: 10,
},
{
id: 'filePath',
title: 'Document',
type: 'short-input' as SubBlockType,
canonicalParamId: 'document',
placeholder: 'URL to JPEG, PNG, or single-page PDF',
condition: {
field: 'processingMode',
value: 'async',
not: true,
},
mode: 'advanced',
},
{
id: 's3Uri',
title: 'S3 URI',
type: 'short-input' as SubBlockType,
placeholder: 's3://bucket-name/path/to/document.pdf',
condition: {
field: 'processingMode',
value: 'async',
},
},
{
id: 'region',
title: 'AWS Region',
type: 'short-input' as SubBlockType,
placeholder: 'e.g., us-east-1',
required: true,
},
{
id: 'accessKeyId',
title: 'AWS Access Key ID',
type: 'short-input' as SubBlockType,
placeholder: 'Enter your AWS Access Key ID',
password: true,
required: true,
},
{
id: 'secretAccessKey',
title: 'AWS Secret Access Key',
type: 'short-input' as SubBlockType,
placeholder: 'Enter your AWS Secret Access Key',
password: true,
required: true,
},
{
id: 'extractTables',
title: 'Extract Tables',
type: 'switch' as SubBlockType,
},
{
id: 'extractForms',
title: 'Extract Forms (Key-Value Pairs)',
type: 'switch' as SubBlockType,
},
{
id: 'detectSignatures',
title: 'Detect Signatures',
type: 'switch' as SubBlockType,
},
{
id: 'analyzeLayout',
title: 'Analyze Document Layout',
type: 'switch' as SubBlockType,
},
],
tools: {
access: ['textract_parser'],
config: {
tool: () => 'textract_parser',
params: (params) => {
if (!params.accessKeyId || params.accessKeyId.trim() === '') {
throw new Error('AWS Access Key ID is required')
}
if (!params.secretAccessKey || params.secretAccessKey.trim() === '') {
throw new Error('AWS Secret Access Key is required')
}
if (!params.region || params.region.trim() === '') {
throw new Error('AWS Region is required')
}
const processingMode = params.processingMode || 'sync'
const parameters: Record<string, unknown> = {
accessKeyId: params.accessKeyId.trim(),
secretAccessKey: params.secretAccessKey.trim(),
region: params.region.trim(),
processingMode,
}
if (processingMode === 'async') {
if (!params.s3Uri || params.s3Uri.trim() === '') {
throw new Error('S3 URI is required for multi-page processing')
}
parameters.s3Uri = params.s3Uri.trim()
} else {
const documentInput = params.fileUpload || params.filePath || params.document
if (!documentInput) {
throw new Error('Document is required')
}
if (typeof documentInput === 'object') {
parameters.fileUpload = documentInput
} else if (typeof documentInput === 'string') {
parameters.filePath = documentInput.trim()
}
}
const featureTypes: string[] = []
if (params.extractTables) featureTypes.push('TABLES')
if (params.extractForms) featureTypes.push('FORMS')
if (params.detectSignatures) featureTypes.push('SIGNATURES')
if (params.analyzeLayout) featureTypes.push('LAYOUT')
if (featureTypes.length > 0) {
parameters.featureTypes = featureTypes
}
return parameters
},
},
},
inputs: {
processingMode: { type: 'string', description: 'Document type: single-page or multi-page' },
document: { type: 'json', description: 'Document input (file upload or URL reference)' },
filePath: { type: 'string', description: 'Document URL (advanced mode)' },
fileUpload: { type: 'json', description: 'Uploaded document file (basic mode)' },
s3Uri: { type: 'string', description: 'S3 URI for multi-page processing (s3://bucket/key)' },
extractTables: { type: 'boolean', description: 'Extract tables from document' },
extractForms: { type: 'boolean', description: 'Extract form key-value pairs' },
detectSignatures: { type: 'boolean', description: 'Detect signatures' },
analyzeLayout: { type: 'boolean', description: 'Analyze document layout' },
region: { type: 'string', description: 'AWS region' },
accessKeyId: { type: 'string', description: 'AWS Access Key ID' },
secretAccessKey: { type: 'string', description: 'AWS Secret Access Key' },
},
outputs: {
blocks: {
type: 'json',
description: 'Array of detected blocks (PAGE, LINE, WORD, TABLE, CELL, KEY_VALUE_SET, etc.)',
},
documentMetadata: {
type: 'json',
description: 'Document metadata containing pages count',
},
modelVersion: {
type: 'string',
description: 'Version of the Textract model used for processing',
},
},
}
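The switches above map one-to-one onto Textract feature flags; when none are enabled, the block performs plain text detection and omits featureTypes entirely. A standalone illustration of that mapping, mirroring the params mapper above:

// Mirrors the featureTypes handling in TextractBlock's params mapper above.
function buildFeatureTypes(opts: {
  extractTables?: boolean
  extractForms?: boolean
  detectSignatures?: boolean
  analyzeLayout?: boolean
}): string[] | undefined {
  const featureTypes: string[] = []
  if (opts.extractTables) featureTypes.push('TABLES')
  if (opts.extractForms) featureTypes.push('FORMS')
  if (opts.detectSignatures) featureTypes.push('SIGNATURES')
  if (opts.analyzeLayout) featureTypes.push('LAYOUT')
  return featureTypes.length > 0 ? featureTypes : undefined
}

buildFeatureTypes({ extractTables: true, extractForms: true }) // -> ['TABLES', 'FORMS']
buildFeatureTypes({}) // -> undefined (plain text detection)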

Some files were not shown because too many files have changed in this diff.