Compare commits


52 Commits

Author SHA1 Message Date
Waleed
e9c4251c1c v0.5.68: router block reasoning, executor improvements, variable resolution consolidation, helm updates (#2946)
* improvement(workflow-item): stabilize avatar layout and fix name truncation (#2939)

* improvement(workflow-item): stabilize avatar layout and fix name truncation

* fix(avatars): revert overflow bg to hardcoded color for contrast

* fix(executor): stop parallel execution when block errors (#2940)

* improvement(helm): add per-deployment extraVolumes support (#2942)

* fix(gmail): expose messageId field in read email block (#2943)

* fix(resolver): consolidate reference resolution (#2941)

* fix(resolver): consolidate code to resolve references

* fix edge cases

* use already formatted error

* fix multi index

* fix backwards compat reachability

* handle backwards compatibility accurately

* use shared constant correctly

* feat(router): expose reasoning output in router v2 block (#2945)

* fix(copilot): always allow, credential masking (#2947)

* Fix always allow, credential validation

* Credential masking

* Autoload

* fix(executor): handle condition dead-end branches in loops (#2944)

---------

Co-authored-by: Vikhyath Mondreti <vikhyathvikku@gmail.com>
Co-authored-by: Siddharth Ganesan <33737564+Sg312@users.noreply.github.com>
2026-01-22 13:48:15 -08:00
Waleed
748793e07d fix(executor): handle condition dead-end branches in loops (#2944) 2026-01-22 13:30:11 -08:00
Siddharth Ganesan
91da7e183a fix(copilot): always allow, credential masking (#2947)
* Fix always allow, credential validation

* Credential masking

* Autoload
2026-01-22 13:07:16 -08:00
Waleed
ab09a5ad23 feat(router): expose reasoning output in router v2 block (#2945) 2026-01-22 12:43:57 -08:00
Vikhyath Mondreti
fcd0240db6 fix(resolver): consolidate reference resolution (#2941)
* fix(resolver): consolidate code to resolve references

* fix edge cases

* use already formatted error

* fix multi index

* fix backwards compat reachability

* handle backwards compatibility accurately

* use shared constant correctly
2026-01-22 12:38:50 -08:00
Waleed
4e4149792a fix(gmail): expose messageId field in read email block (#2943) 2026-01-22 11:46:34 -08:00
Waleed
9a8b591257 improvement(helm): add per-deployment extraVolumes support (#2942) 2026-01-22 11:35:23 -08:00
Waleed
f3ae3f8442 fix(executor): stop parallel execution when block errors (#2940) 2026-01-22 11:34:40 -08:00
Waleed
66dfe2c6b2 improvement(workflow-item): stabilize avatar layout and fix name truncation (#2939)
* improvement(workflow-item): stabilize avatar layout and fix name truncation

* fix(avatars): revert overflow bg to hardcoded color for contrast
2026-01-22 11:26:47 -08:00
Waleed
cc2be33d6b v0.5.67: loading, password reset, ui improvements, helm updates (#2928)
* fix(zustand): updated to useShallow from deprecated createWithEqualityFn (#2919)

* fix(logger): use direct env access for webpack inlining (#2920)

* fix(notifications): text overflow with line-clamp (#2921)

* chore(helm): add env vars for Vertex AI, orgs, and telemetry (#2922)

* fix(auth): improve reset password flow and consolidate brand detection (#2924)

* fix(auth): improve reset password flow and consolidate brand detection

* fix(auth): set errorHandled for EMAIL_NOT_VERIFIED to prevent duplicate error

* fix(auth): clear success message on login errors

* chore(auth): fix import order per lint

* fix(action-bar): duplicate subflows with children (#2923)

* fix(action-bar): duplicate subflows with children

* fix(action-bar): add validateTriggerPaste for subflow duplicate

* fix(resolver): agent response format, input formats, root level (#2925)

* fix(resolvers): agent response format, input formats, root level

* fix response block initial seeding

* fix tests

* fix(messages-input): fix cursor alignment and auto-resize with overlay (#2926)

* fix(messages-input): fix cursor alignment and auto-resize with overlay

* fixed remaining zustand warnings

* fix(stores): remove dead code causing log spam on startup (#2927)

* fix(stores): remove dead code causing log spam on startup

* fix(stores): replace custom tools zustand store with react query cache

* improvement(ui): use BrandedButton and BrandedLink components (#2930)

- Refactor auth forms to use BrandedButton component
- Add BrandedLink component for changelog page
- Reduce code duplication in login, signup, reset-password forms
- Update star count default value

* fix(custom-tools): remove unsafe title fallback in getCustomTool (#2929)

* fix(custom-tools): remove unsafe title fallback in getCustomTool

* fix(custom-tools): restore title fallback in getCustomTool lookup

Custom tools are referenced by title (custom_${title}), not database ID.
The title fallback is required for client-side tool resolution to work.

* fix(null-bodies): empty bodies handling (#2931)

* fix(null-statuses): empty bodies handling

* address bugbot comment

* fix(token-refresh): microsoft, notion, x, linear (#2933)

* fix(microsoft): proactive refresh needed

* fix(x): missing token refresh flag

* notion and linear missing flag too

* address bugbot comment

* fix(auth): handle EMAIL_NOT_VERIFIED in onError callback (#2932)

* fix(auth): handle EMAIL_NOT_VERIFIED in onError callback

* refactor(auth): extract redirectToVerify helper to reduce duplication

* fix(workflow-selector): use dedicated selector for workflow dropdown (#2934)

* feat(workflow-block): preview (#2935)

* improvement(copilot): tool configs to show nested props (#2936)

* fix(auth): add genericOAuth providers to trustedProviders (#2937)

---------

Co-authored-by: Vikhyath Mondreti <vikhyathvikku@gmail.com>
Co-authored-by: Emir Karabeg <78010029+emir-karabeg@users.noreply.github.com>
2026-01-21 22:53:25 -08:00
Waleed
376f7cb571 fix(auth): add genericOAuth providers to trustedProviders (#2937) 2026-01-21 22:44:30 -08:00
Vikhyath Mondreti
42159c23b9 improvement(copilot): tool configs to show nested props (#2936) 2026-01-21 20:02:59 -08:00
Emir Karabeg
2f0f246002 feat(workflow-block): preview (#2935) 2026-01-21 19:12:28 -08:00
Waleed
900d3ef9ea fix(workflow-selector): use dedicated selector for workflow dropdown (#2934) 2026-01-21 18:38:03 -08:00
Waleed
f3fcc28f89 fix(auth): handle EMAIL_NOT_VERIFIED in onError callback (#2932)
* fix(auth): handle EMAIL_NOT_VERIFIED in onError callback

* refactor(auth): extract redirectToVerify helper to reduce duplication
2026-01-21 18:34:49 -08:00
Vikhyath Mondreti
7cfdf46724 fix(token-refresh): microsoft, notion, x, linear (#2933)
* fix(microsoft): proactive refresh needed

* fix(x): missing token refresh flag

* notion and linear missing flag too

* address bugbot comment
2026-01-21 18:30:53 -08:00
Vikhyath Mondreti
d681451297 fix(null-bodies): empty bodies handling (#2931)
* fix(null-statuses): empty bodies handling

* address bugbot comment
2026-01-21 18:10:33 -08:00
Waleed
5987a6d060 fix(custom-tools): remove unsafe title fallback in getCustomTool (#2929)
* fix(custom-tools): remove unsafe title fallback in getCustomTool

* fix(custom-tools): restore title fallback in getCustomTool lookup

Custom tools are referenced by title (custom_${title}), not database ID.
The title fallback is required for client-side tool resolution to work.
2026-01-21 17:36:10 -08:00
Waleed
e2ccefb2f4 improvement(ui): use BrandedButton and BrandedLink components (#2930)
- Refactor auth forms to use BrandedButton component
- Add BrandedLink component for changelog page
- Reduce code duplication in login, signup, reset-password forms
- Update star count default value
2026-01-21 17:25:30 -08:00
Vikhyath Mondreti
45371e521e v0.5.66: external http requests fix, ring highlighting 2026-01-21 02:55:39 -08:00
Waleed
0ce0f98aa5 v0.5.65: gemini updates, textract integration, ui updates (#2909)
* fix(google): wrap primitive tool responses for Gemini API compatibility (#2900)

* fix(canonical): copilot path + update parent (#2901)

* fix(rss): add top-level title, link, pubDate fields to RSS trigger output (#2902)

* fix(rss): add top-level title, link, pubDate fields to RSS trigger output

* fix(imap): add top-level fields to IMAP trigger output

* improvement(browseruse): add profile id param (#2903)

* improvement(browseruse): add profile id param

* make request a stub since we have directExec

* improvement(executor): upgraded abort controller to handle aborts for loops and parallels (#2880)

* improvement(executor): upgraded abort controller to handle aborts for loops and parallels

* comments

* improvement(files): update execution for passing base64 strings (#2906)

* progress

* improvement(execution): update execution for passing base64 strings

* fix types

* cleanup comments

* path security vuln

* reject promise correctly

* fix redirect case

* remove proxy routes

* fix tests

* use ipaddr

* feat(tools): added textract, added v2 for mistral, updated tag dropdown (#2904)

* feat(tools): added textract

* cleanup

* ack pr comments

* reorder

* removed upload for textract async version

* fix additional fields dropdown in editor, update parser to leave validation to be done on the server

* added mistral v2, files v2, and finalized textract

* updated the rest of the old file patterns, updated mistral outputs for v2

* updated tag dropdown to parse non-operation fields as well

* updated extension finder

* cleanup

* added description for inputs to workflow

* use helper for internal route check

* fix tag dropdown merge conflict change

* remove duplicate code

---------

Co-authored-by: Vikhyath Mondreti <vikhyath@simstudio.ai>

* fix(ui): change add inputs button to match output selector (#2907)

* fix(canvas): removed invite to workspace from canvas popover (#2908)

* fix(canvas): removed invite to workspace

* removed unused props

* fix(copilot): legacy tool display names (#2911)

* fix(a2a): canonical merge  (#2912)

* fix canonical merge

* fix empty array case

* fix(change-detection): copilot diffs have extra field (#2913)

* improvement(logs): improved logs ui bugs, added subflow disable UI (#2910)

* improvement(logs): improved logs ui bugs, added subflow disable UI

* added duplicate to action bar for subflows

* feat(broadcast): email v0.5 (#2905)

---------

Co-authored-by: Vikhyath Mondreti <vikhyathvikku@gmail.com>
Co-authored-by: Vikhyath Mondreti <vikhyath@simstudio.ai>
Co-authored-by: Emir Karabeg <78010029+emir-karabeg@users.noreply.github.com>
2026-01-20 23:54:55 -08:00
Waleed
dff1c9d083 v0.5.64: unsubscribe, search improvements, metrics, additional SSO configuration 2026-01-20 00:34:11 -08:00
Vikhyath Mondreti
b09f683072 v0.5.63: ui and performance improvements, more google tools 2026-01-18 15:22:42 -08:00
Vikhyath Mondreti
a8bb0db660 v0.5.62: webhook bug fixes, seeding default subblock values, block selection fixes 2026-01-16 20:27:06 -08:00
Waleed
af82820a28 v0.5.61: webhook improvements, workflow controls, react query for deployment status, chat fixes, reducto and pulse OCR, linear fixes 2026-01-16 18:06:23 -08:00
Waleed
4372841797 v0.5.60: invitation flow improvements, chat fixes, a2a improvements, additional copilot actions 2026-01-15 00:02:18 -08:00
Waleed
5e8c843241 v0.5.59: a2a support, documentation 2026-01-13 13:21:21 -08:00
Waleed
7bf3d73ee6 v0.5.58: export folders, new tools, permissions groups enhancements 2026-01-13 00:56:59 -08:00
Vikhyath Mondreti
7ffc11a738 v0.5.57: subagents, context menu improvements, bug fixes 2026-01-11 11:38:40 -08:00
Waleed
be578e2ed7 v0.5.56: batch operations, access control and permission groups, billing fixes 2026-01-10 00:31:34 -08:00
Waleed
f415e5edc4 v0.5.55: polling groups, bedrock provider, devcontainer fixes, workflow preview enhancements 2026-01-08 23:36:56 -08:00
Waleed
13a6e6c3fa v0.5.54: seo, model blacklist, helm chart updates, fireflies integration, autoconnect improvements, billing fixes 2026-01-07 16:09:45 -08:00
Waleed
f5ab7f21ae v0.5.53: hotkey improvements, added redis fallback, fixes for workflow tool 2026-01-06 23:34:52 -08:00
Waleed
bfb6fffe38 v0.5.52: new port-based router block, combobox expression and variable support 2026-01-06 16:14:10 -08:00
Waleed
4fbec0a43f v0.5.51: triggers, kb, condition block improvements, supabase and grain integration updates 2026-01-06 14:26:46 -08:00
Waleed
585f5e365b v0.5.50: import improvements, ui upgrades, kb styling and performance improvements 2026-01-05 00:35:55 -08:00
Waleed
3792bdd252 v0.5.49: hitl improvements, new email styles, imap trigger, logs context menu (#2672)
* feat(logs-context-menu): consolidated logs utils and types, added logs record context menu (#2659)

* feat(email): welcome email; improvement(emails): ui/ux (#2658)

* feat(email): welcome email; improvement(emails): ui/ux

* improvement(emails): links, accounts, preview

* refactor(emails): file structure and wrapper components

* added envvar for personal emails sent, added isHosted gate

* fixed failing tests, added env mock

* fix: removed comment

---------

Co-authored-by: waleed <walif6@gmail.com>

* fix(logging): hitl + trigger dev crash protection (#2664)

* hitl gaps

* deal with trigger worker crashes

* cleanup import structure

* feat(imap): added support for imap trigger (#2663)

* feat(tools): added support for imap trigger

* feat(imap): added parity, tested

* ack PR comments

* final cleanup

* feat(i18n): update translations (#2665)

Co-authored-by: waleedlatif1 <waleedlatif1@users.noreply.github.com>

* fix(grain): updated grain trigger to auto-establish trigger (#2666)

Co-authored-by: aadamgough <adam@sim.ai>

* feat(admin): routes to manage deployments (#2667)

* feat(admin): routes to manage deployments

* fix naming of deployed by

* feat(time-picker): added timepicker emcn component, added to playground, added searchable prop for dropdown, added more timezones for schedule, updated license and notice date (#2668)

* feat(time-picker): added timepicker emcn component, added to playground, added searchable prop for dropdown, added more timezones for schedule, updated license and notice date

* removed unused params, cleaned up redundant utils

* improvement(invite): aligned styling (#2669)

* improvement(invite): aligned with rest of app

* fix(invite): error handling

* fix: addressed comments

---------

Co-authored-by: Emir Karabeg <78010029+emir-karabeg@users.noreply.github.com>
Co-authored-by: Vikhyath Mondreti <vikhyathvikku@gmail.com>
Co-authored-by: waleedlatif1 <waleedlatif1@users.noreply.github.com>
Co-authored-by: Adam Gough <77861281+aadamgough@users.noreply.github.com>
Co-authored-by: aadamgough <adam@sim.ai>
2026-01-03 13:19:18 -08:00
Waleed
eb5d1f3e5b v0.5.48: copy-paste workflow blocks, docs updates, mcp tool fixes 2025-12-31 18:00:04 -08:00
Waleed
54ab82c8dd v0.5.47: deploy workflow as mcp, kb chunks tokenizer, UI improvements, jira service management tools 2025-12-30 23:18:58 -08:00
Waleed
f895bf469b v0.5.46: build improvements, greptile, light mode improvements 2025-12-29 02:17:52 -08:00
Waleed
dd3209af06 v0.5.45: light mode fixes, realtime usage indicator, docker build improvements 2025-12-27 19:57:42 -08:00
Waleed
b6ba3b50a7 v0.5.44: keyboard shortcuts, autolayout, light mode, byok, testing improvements 2025-12-26 21:25:19 -08:00
Waleed
b304233062 v0.5.43: export logs, circleback, grain, vertex, code hygiene, schedule improvements 2025-12-23 19:19:18 -08:00
Vikhyath Mondreti
57e4b49bd6 v0.5.42: fix memory migration 2025-12-23 01:24:54 -08:00
Vikhyath Mondreti
e12dd204ed v0.5.41: memory fixes, copilot improvements, knowledgebase improvements, LLM providers standardization 2025-12-23 00:15:18 -08:00
Vikhyath Mondreti
3d9d9cbc54 v0.5.40: supabase ops to allow non-public schemas, jira uuid 2025-12-21 22:28:05 -08:00
Waleed
0f4ec962ad v0.5.39: notion, workflow variables fixes 2025-12-20 20:44:00 -08:00
Waleed
4827866f9a v0.5.38: snap to grid, copilot ux improvements, billing line items 2025-12-20 17:24:38 -08:00
Waleed
3e697d9ed9 v0.5.37: redaction utils consolidation, logs updates, autoconnect improvements, additional kb tag types 2025-12-19 22:31:55 -08:00
Martin Yankov
4431a1a484 fix(helm): add custom egress rules to realtime network policy (#2481)
The realtime service network policy was missing the custom egress rules section
that allows configuration of additional egress rules via values.yaml. This caused
the realtime pods to be unable to connect to external databases (e.g., PostgreSQL
on port 5432) when using external database configurations.

The app network policy already had this section, but the realtime network policy
was missing it, creating an inconsistency and preventing the realtime service
from accessing external databases configured via networkPolicy.egress values.

This fix adds the same custom egress rules template section to the realtime
network policy, matching the app network policy behavior and allowing users to
configure database connectivity via values.yaml.
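
A rough sketch of the kind of values.yaml override this unlocks; the networkPolicy.egress key comes from the description above, but the surrounding layout and addresses are assumptions for illustration only:

# Hypothetical values.yaml excerpt (key nesting assumed): allow egress from the
# realtime pods to an external PostgreSQL database on port 5432 via the custom
# egress rules that this fix wires into the realtime network policy template.
networkPolicy:
  egress:
    - to:
        - ipBlock:
            cidr: 10.20.0.0/16   # example CIDR of the external database network
      ports:
        - protocol: TCP
          port: 5432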
2025-12-19 18:59:08 -08:00
Waleed
4d1a9a3f22 v0.5.36: hitl improvements, opengraph, slack fixes, one-click unsubscribe, auth checks, new db indexes 2025-12-19 01:27:49 -08:00
Vikhyath Mondreti
eb07a080fb v0.5.35: helm updates, copilot improvements, 404 for docs, salesforce fixes, subflow resize clamping 2025-12-18 16:23:19 -08:00
79 changed files with 3842 additions and 890 deletions

View File

@@ -2,10 +2,9 @@
 import { useEffect, useState } from 'react'
 import { createLogger } from '@sim/logger'
-import { ArrowRight, ChevronRight, Eye, EyeOff } from 'lucide-react'
+import { Eye, EyeOff } from 'lucide-react'
 import Link from 'next/link'
 import { useRouter, useSearchParams } from 'next/navigation'
-import { Button } from '@/components/ui/button'
 import {
 Dialog,
 DialogContent,
@@ -22,6 +21,7 @@ import { getBaseUrl } from '@/lib/core/utils/urls'
 import { quickValidateEmail } from '@/lib/messaging/email/validation'
 import { inter } from '@/app/_styles/fonts/inter/inter'
 import { soehne } from '@/app/_styles/fonts/soehne/soehne'
+import { BrandedButton } from '@/app/(auth)/components/branded-button'
 import { SocialLoginButtons } from '@/app/(auth)/components/social-login-buttons'
 import { SSOLoginButton } from '@/app/(auth)/components/sso-login-button'
 import { useBrandedButtonClass } from '@/hooks/use-branded-button-class'
@@ -107,7 +107,6 @@ export default function LoginPage({
 const [passwordErrors, setPasswordErrors] = useState<string[]>([])
 const [showValidationError, setShowValidationError] = useState(false)
 const buttonClass = useBrandedButtonClass()
-const [isButtonHovered, setIsButtonHovered] = useState(false)
 const [callbackUrl, setCallbackUrl] = useState('/workspace')
 const [isInviteFlow, setIsInviteFlow] = useState(false)
@@ -115,7 +114,6 @@
 const [forgotPasswordOpen, setForgotPasswordOpen] = useState(false)
 const [forgotPasswordEmail, setForgotPasswordEmail] = useState('')
 const [isSubmittingReset, setIsSubmittingReset] = useState(false)
-const [isResetButtonHovered, setIsResetButtonHovered] = useState(false)
 const [resetStatus, setResetStatus] = useState<{
 type: 'success' | 'error' | null
 message: string
@@ -184,6 +182,13 @@
 e.preventDefault()
 setIsLoading(true)
+const redirectToVerify = (emailToVerify: string) => {
+if (typeof window !== 'undefined') {
+sessionStorage.setItem('verificationEmail', emailToVerify)
+}
+router.push('/verify')
+}
 const formData = new FormData(e.currentTarget)
 const emailRaw = formData.get('email') as string
 const email = emailRaw.trim().toLowerCase()
@@ -215,9 +220,9 @@
 onError: (ctx) => {
 logger.error('Login error:', ctx.error)
-// EMAIL_NOT_VERIFIED is handled by the catch block which redirects to /verify
 if (ctx.error.code?.includes('EMAIL_NOT_VERIFIED')) {
 errorHandled = true
+redirectToVerify(email)
 return
 }
@@ -285,10 +290,7 @@
 router.push(safeCallbackUrl)
 } catch (err: any) {
 if (err.message?.includes('not verified') || err.code?.includes('EMAIL_NOT_VERIFIED')) {
-if (typeof window !== 'undefined') {
-sessionStorage.setItem('verificationEmail', email)
-}
-router.push('/verify')
+redirectToVerify(email)
 return
 }
@@ -491,24 +493,14 @@
 </div>
 </div>
-<Button
+<BrandedButton
 type='submit'
-onMouseEnter={() => setIsButtonHovered(true)}
-onMouseLeave={() => setIsButtonHovered(false)}
-className='group inline-flex w-full items-center justify-center gap-2 rounded-[10px] border border-[#6F3DFA] bg-gradient-to-b from-[#8357FF] to-[#6F3DFA] py-[6px] pr-[10px] pl-[12px] text-[15px] text-white shadow-[inset_0_2px_4px_0_#9B77FF] transition-all'
 disabled={isLoading}
+loading={isLoading}
+loadingText='Signing in'
 >
-<span className='flex items-center gap-1'>
-{isLoading ? 'Signing in...' : 'Sign in'}
-<span className='inline-flex transition-transform duration-200 group-hover:translate-x-0.5'>
-{isButtonHovered ? (
-<ArrowRight className='h-4 w-4' aria-hidden='true' />
-) : (
-<ChevronRight className='h-4 w-4' aria-hidden='true' />
-)}
-</span>
-</span>
-</Button>
+Sign in
+</BrandedButton>
 </form>
 )}
@@ -619,25 +611,15 @@
 <p>{resetStatus.message}</p>
 </div>
 )}
-<Button
+<BrandedButton
 type='button'
 onClick={handleForgotPassword}
-onMouseEnter={() => setIsResetButtonHovered(true)}
-onMouseLeave={() => setIsResetButtonHovered(false)}
-className='group inline-flex w-full items-center justify-center gap-2 rounded-[10px] border border-[#6F3DFA] bg-gradient-to-b from-[#8357FF] to-[#6F3DFA] py-[6px] pr-[10px] pl-[12px] text-[15px] text-white shadow-[inset_0_2px_4px_0_#9B77FF] transition-all'
 disabled={isSubmittingReset}
+loading={isSubmittingReset}
+loadingText='Sending'
 >
-<span className='flex items-center gap-1'>
-{isSubmittingReset ? 'Sending...' : 'Send Reset Link'}
-<span className='inline-flex transition-transform duration-200 group-hover:translate-x-0.5'>
-{isResetButtonHovered ? (
-<ArrowRight className='h-4 w-4' aria-hidden='true' />
-) : (
-<ChevronRight className='h-4 w-4' aria-hidden='true' />
-)}
-</span>
-</span>
-</Button>
+Send Reset Link
+</BrandedButton>
 </div>
 </DialogContent>
 </Dialog>

View File

@@ -1,13 +1,12 @@
 'use client'
 import { useState } from 'react'
-import { ArrowRight, ChevronRight, Eye, EyeOff } from 'lucide-react'
+import { Eye, EyeOff } from 'lucide-react'
-import { Button } from '@/components/ui/button'
 import { Input } from '@/components/ui/input'
 import { Label } from '@/components/ui/label'
 import { cn } from '@/lib/core/utils/cn'
 import { inter } from '@/app/_styles/fonts/inter/inter'
-import { useBrandedButtonClass } from '@/hooks/use-branded-button-class'
+import { BrandedButton } from '@/app/(auth)/components/branded-button'
 interface RequestResetFormProps {
 email: string
@@ -28,9 +27,6 @@
 statusMessage,
 className,
 }: RequestResetFormProps) {
-const buttonClass = useBrandedButtonClass()
-const [isButtonHovered, setIsButtonHovered] = useState(false)
 const handleSubmit = async (e: React.FormEvent) => {
 e.preventDefault()
 onSubmit(email)
@@ -68,24 +64,14 @@
 )}
 </div>
-<Button
+<BrandedButton
 type='submit'
 disabled={isSubmitting}
-onMouseEnter={() => setIsButtonHovered(true)}
-onMouseLeave={() => setIsButtonHovered(false)}
-className='group inline-flex w-full items-center justify-center gap-2 rounded-[10px] border border-[#6F3DFA] bg-gradient-to-b from-[#8357FF] to-[#6F3DFA] py-[6px] pr-[10px] pl-[12px] text-[15px] text-white shadow-[inset_0_2px_4px_0_#9B77FF] transition-all'
+loading={isSubmitting}
+loadingText='Sending'
 >
-<span className='flex items-center gap-1'>
-{isSubmitting ? 'Sending...' : 'Send Reset Link'}
-<span className='inline-flex transition-transform duration-200 group-hover:translate-x-0.5'>
-{isButtonHovered ? (
-<ArrowRight className='h-4 w-4' aria-hidden='true' />
-) : (
-<ChevronRight className='h-4 w-4' aria-hidden='true' />
-)}
-</span>
-</span>
-</Button>
+Send Reset Link
+</BrandedButton>
 </form>
 )
 }
@@ -112,8 +98,6 @@ export function SetNewPasswordForm({
 const [validationMessage, setValidationMessage] = useState('')
 const [showPassword, setShowPassword] = useState(false)
 const [showConfirmPassword, setShowConfirmPassword] = useState(false)
-const buttonClass = useBrandedButtonClass()
-const [isButtonHovered, setIsButtonHovered] = useState(false)
 const handleSubmit = async (e: React.FormEvent) => {
 e.preventDefault()
@@ -243,24 +227,14 @@
 )}
 </div>
-<Button
-disabled={isSubmitting || !token}
+<BrandedButton
 type='submit'
-onMouseEnter={() => setIsButtonHovered(true)}
-onMouseLeave={() => setIsButtonHovered(false)}
-className='group inline-flex w-full items-center justify-center gap-2 rounded-[10px] border border-[#6F3DFA] bg-gradient-to-b from-[#8357FF] to-[#6F3DFA] py-[6px] pr-[10px] pl-[12px] text-[15px] text-white shadow-[inset_0_2px_4px_0_#9B77FF] transition-all'
+disabled={isSubmitting || !token}
+loading={isSubmitting}
+loadingText='Resetting'
 >
-<span className='flex items-center gap-1'>
-{isSubmitting ? 'Resetting...' : 'Reset Password'}
-<span className='inline-flex transition-transform duration-200 group-hover:translate-x-0.5'>
-{isButtonHovered ? (
-<ArrowRight className='h-4 w-4' aria-hidden='true' />
-) : (
-<ChevronRight className='h-4 w-4' aria-hidden='true' />
-)}
-</span>
-</span>
-</Button>
+Reset Password
+</BrandedButton>
 </form>
 )
 }

View File

@@ -2,10 +2,9 @@
 import { Suspense, useEffect, useState } from 'react'
 import { createLogger } from '@sim/logger'
-import { ArrowRight, ChevronRight, Eye, EyeOff } from 'lucide-react'
+import { Eye, EyeOff } from 'lucide-react'
 import Link from 'next/link'
 import { useRouter, useSearchParams } from 'next/navigation'
-import { Button } from '@/components/ui/button'
 import { Input } from '@/components/ui/input'
 import { Label } from '@/components/ui/label'
 import { client, useSession } from '@/lib/auth/auth-client'
@@ -14,6 +13,7 @@ import { cn } from '@/lib/core/utils/cn'
 import { quickValidateEmail } from '@/lib/messaging/email/validation'
 import { inter } from '@/app/_styles/fonts/inter/inter'
 import { soehne } from '@/app/_styles/fonts/soehne/soehne'
+import { BrandedButton } from '@/app/(auth)/components/branded-button'
 import { SocialLoginButtons } from '@/app/(auth)/components/social-login-buttons'
 import { SSOLoginButton } from '@/app/(auth)/components/sso-login-button'
 import { useBrandedButtonClass } from '@/hooks/use-branded-button-class'
@@ -97,7 +97,6 @@ function SignupFormContent({
 const [redirectUrl, setRedirectUrl] = useState('')
 const [isInviteFlow, setIsInviteFlow] = useState(false)
 const buttonClass = useBrandedButtonClass()
-const [isButtonHovered, setIsButtonHovered] = useState(false)
 const [name, setName] = useState('')
 const [nameErrors, setNameErrors] = useState<string[]>([])
@@ -476,24 +475,14 @@
 </div>
 </div>
-<Button
+<BrandedButton
 type='submit'
-onMouseEnter={() => setIsButtonHovered(true)}
-onMouseLeave={() => setIsButtonHovered(false)}
-className='group inline-flex w-full items-center justify-center gap-2 rounded-[10px] border border-[#6F3DFA] bg-gradient-to-b from-[#8357FF] to-[#6F3DFA] py-[6px] pr-[10px] pl-[12px] text-[15px] text-white shadow-[inset_0_2px_4px_0_#9B77FF] transition-all'
 disabled={isLoading}
+loading={isLoading}
+loadingText='Creating account'
 >
-<span className='flex items-center gap-1'>
-{isLoading ? 'Creating account' : 'Create account'}
-<span className='inline-flex transition-transform duration-200 group-hover:translate-x-0.5'>
-{isButtonHovered ? (
-<ArrowRight className='h-4 w-4' aria-hidden='true' />
-) : (
-<ChevronRight className='h-4 w-4' aria-hidden='true' />
-)}
-</span>
-</span>
-</Button>
+Create account
+</BrandedButton>
 </form>
 )}

View File

@@ -4,7 +4,6 @@ import { useRef, useState } from 'react'
 import { createLogger } from '@sim/logger'
 import { X } from 'lucide-react'
 import { Textarea } from '@/components/emcn'
-import { Button } from '@/components/ui/button'
 import { Input } from '@/components/ui/input'
 import { Label } from '@/components/ui/label'
 import {
@@ -18,6 +17,7 @@ import { isHosted } from '@/lib/core/config/feature-flags'
 import { cn } from '@/lib/core/utils/cn'
 import { quickValidateEmail } from '@/lib/messaging/email/validation'
 import { soehne } from '@/app/_styles/fonts/soehne/soehne'
+import { BrandedButton } from '@/app/(auth)/components/branded-button'
 import Footer from '@/app/(landing)/components/footer/footer'
 import Nav from '@/app/(landing)/components/nav/nav'
@@ -493,18 +493,17 @@ export default function CareersPage() {
 {/* Submit Button */}
 <div className='flex justify-end pt-2'>
-<Button
+<BrandedButton
 type='submit'
 disabled={isSubmitting || submitStatus === 'success'}
-className='min-w-[200px] rounded-[10px] border border-[#6F3DFA] bg-gradient-to-b from-[#8357FF] to-[#6F3DFA] text-white shadow-[inset_0_2px_4px_0_#9B77FF] transition-all duration-300 hover:opacity-90 disabled:opacity-50'
-size='lg'
+loading={isSubmitting}
+loadingText='Submitting'
+showArrow={false}
+fullWidth={false}
+className='min-w-[200px]'
 >
-{isSubmitting
-? 'Submitting...'
-: submitStatus === 'success'
-? 'Submitted'
-: 'Submit Application'}
-</Button>
+{submitStatus === 'success' ? 'Submitted' : 'Submit Application'}
+</BrandedButton>
 </div>
 </form>
 </section>

View File

@@ -11,6 +11,7 @@ import { useBrandConfig } from '@/lib/branding/branding'
 import { isHosted } from '@/lib/core/config/feature-flags'
 import { soehne } from '@/app/_styles/fonts/soehne/soehne'
 import { getFormattedGitHubStars } from '@/app/(landing)/actions/github'
+import { useBrandedButtonClass } from '@/hooks/use-branded-button-class'
 const logger = createLogger('nav')
@@ -20,11 +21,12 @@ interface NavProps {
 }
 export default function Nav({ hideAuthButtons = false, variant = 'landing' }: NavProps = {}) {
-const [githubStars, setGithubStars] = useState('25.1k')
+const [githubStars, setGithubStars] = useState('25.8k')
 const [isHovered, setIsHovered] = useState(false)
 const [isLoginHovered, setIsLoginHovered] = useState(false)
 const router = useRouter()
 const brand = useBrandConfig()
+const buttonClass = useBrandedButtonClass()
 useEffect(() => {
 if (variant !== 'landing') return
@@ -183,7 +185,7 @@ export default function Nav({ hideAuthButtons = false, variant = 'landing' }: Na
 href='/signup'
 onMouseEnter={() => setIsHovered(true)}
 onMouseLeave={() => setIsHovered(false)}
-className='group inline-flex items-center justify-center gap-2 rounded-[10px] border border-[#6F3DFA] bg-gradient-to-b from-[#8357FF] to-[#6F3DFA] py-[6px] pr-[10px] pl-[12px] text-[14px] text-white shadow-[inset_0_2px_4px_0_#9B77FF] transition-all sm:text-[16px]'
+className={`${buttonClass} group inline-flex items-center justify-center gap-2 rounded-[10px] py-[6px] pr-[10px] pl-[12px] text-[15px] text-white transition-all`}
 aria-label='Get started with Sim - Sign up for free'
 prefetch={true}
 >

View File

@@ -4,6 +4,11 @@ import { createLogger } from '@sim/logger'
 import { and, desc, eq, inArray } from 'drizzle-orm'
 import { getSession } from '@/lib/auth'
 import { refreshOAuthToken } from '@/lib/oauth'
+import {
+getMicrosoftRefreshTokenExpiry,
+isMicrosoftProvider,
+PROACTIVE_REFRESH_THRESHOLD_DAYS,
+} from '@/lib/oauth/microsoft'
 const logger = createLogger('OAuthUtilsAPI')
@@ -205,15 +210,32 @@ export async function refreshAccessTokenIfNeeded(
 }
 // Decide if we should refresh: token missing OR expired
-const expiresAt = credential.accessTokenExpiresAt
+const accessTokenExpiresAt = credential.accessTokenExpiresAt
+const refreshTokenExpiresAt = credential.refreshTokenExpiresAt
 const now = new Date()
-const shouldRefresh =
-!!credential.refreshToken && (!credential.accessToken || (expiresAt && expiresAt <= now))
+// Check if access token needs refresh (missing or expired)
+const accessTokenNeedsRefresh =
+!!credential.refreshToken &&
+(!credential.accessToken || (accessTokenExpiresAt && accessTokenExpiresAt <= now))
+// Check if we should proactively refresh to prevent refresh token expiry
+// This applies to Microsoft providers whose refresh tokens expire after 90 days of inactivity
+const proactiveRefreshThreshold = new Date(
+now.getTime() + PROACTIVE_REFRESH_THRESHOLD_DAYS * 24 * 60 * 60 * 1000
+)
+const refreshTokenNeedsProactiveRefresh =
+!!credential.refreshToken &&
+isMicrosoftProvider(credential.providerId) &&
+refreshTokenExpiresAt &&
+refreshTokenExpiresAt <= proactiveRefreshThreshold
+const shouldRefresh = accessTokenNeedsRefresh || refreshTokenNeedsProactiveRefresh
 const accessToken = credential.accessToken
 if (shouldRefresh) {
-logger.info(`[${requestId}] Token expired, attempting to refresh for credential`)
+logger.info(`[${requestId}] Refreshing token for credential`)
 try {
 const refreshedToken = await refreshOAuthToken(
 credential.providerId,
@@ -227,11 +249,15 @@
 userId: credential.userId,
 hasRefreshToken: !!credential.refreshToken,
 })
+if (!accessTokenNeedsRefresh && accessToken) {
+logger.info(`[${requestId}] Proactive refresh failed but access token still valid`)
+return accessToken
+}
 return null
 }
 // Prepare update data
-const updateData: any = {
+const updateData: Record<string, unknown> = {
 accessToken: refreshedToken.accessToken,
 accessTokenExpiresAt: new Date(Date.now() + refreshedToken.expiresIn * 1000),
 updatedAt: new Date(),
@@ -243,6 +269,10 @@
 updateData.refreshToken = refreshedToken.refreshToken
 }
+if (isMicrosoftProvider(credential.providerId)) {
+updateData.refreshTokenExpiresAt = getMicrosoftRefreshTokenExpiry()
+}
 // Update the token in the database
 await db.update(account).set(updateData).where(eq(account.id, credentialId))
@@ -256,6 +286,10 @@
 credentialId,
 userId: credential.userId,
 })
+if (!accessTokenNeedsRefresh && accessToken) {
+logger.info(`[${requestId}] Proactive refresh failed but access token still valid`)
+return accessToken
+}
 return null
 }
 } else if (!accessToken) {
@@ -277,10 +311,27 @@ export async function refreshTokenIfNeeded(
 credentialId: string
 ): Promise<{ accessToken: string; refreshed: boolean }> {
 // Decide if we should refresh: token missing OR expired
-const expiresAt = credential.accessTokenExpiresAt
+const accessTokenExpiresAt = credential.accessTokenExpiresAt
+const refreshTokenExpiresAt = credential.refreshTokenExpiresAt
 const now = new Date()
-const shouldRefresh =
-!!credential.refreshToken && (!credential.accessToken || (expiresAt && expiresAt <= now))
+// Check if access token needs refresh (missing or expired)
+const accessTokenNeedsRefresh =
+!!credential.refreshToken &&
+(!credential.accessToken || (accessTokenExpiresAt && accessTokenExpiresAt <= now))
+// Check if we should proactively refresh to prevent refresh token expiry
+// This applies to Microsoft providers whose refresh tokens expire after 90 days of inactivity
+const proactiveRefreshThreshold = new Date(
+now.getTime() + PROACTIVE_REFRESH_THRESHOLD_DAYS * 24 * 60 * 60 * 1000
+)
+const refreshTokenNeedsProactiveRefresh =
+!!credential.refreshToken &&
+isMicrosoftProvider(credential.providerId) &&
+refreshTokenExpiresAt &&
+refreshTokenExpiresAt <= proactiveRefreshThreshold
+const shouldRefresh = accessTokenNeedsRefresh || refreshTokenNeedsProactiveRefresh
 // If token appears valid and present, return it directly
 if (!shouldRefresh) {
@@ -293,13 +344,17 @@
 if (!refreshResult) {
 logger.error(`[${requestId}] Failed to refresh token for credential`)
+if (!accessTokenNeedsRefresh && credential.accessToken) {
+logger.info(`[${requestId}] Proactive refresh failed but access token still valid`)
+return { accessToken: credential.accessToken, refreshed: false }
+}
 throw new Error('Failed to refresh token')
 }
 const { accessToken: refreshedToken, expiresIn, refreshToken: newRefreshToken } = refreshResult
 // Prepare update data
-const updateData: any = {
+const updateData: Record<string, unknown> = {
 accessToken: refreshedToken,
 accessTokenExpiresAt: new Date(Date.now() + expiresIn * 1000), // Use provider's expiry
 updatedAt: new Date(),
@@ -311,6 +366,10 @@
 updateData.refreshToken = newRefreshToken
 }
+if (isMicrosoftProvider(credential.providerId)) {
+updateData.refreshTokenExpiresAt = getMicrosoftRefreshTokenExpiry()
+}
 await db.update(account).set(updateData).where(eq(account.id, credentialId))
 logger.info(`[${requestId}] Successfully refreshed access token`)
@@ -331,6 +390,11 @@
 }
 }
+if (!accessTokenNeedsRefresh && credential.accessToken) {
+logger.info(`[${requestId}] Proactive refresh failed but access token still valid`)
+return { accessToken: credential.accessToken, refreshed: false }
+}
 logger.error(`[${requestId}] Refresh failed and no valid token found in DB`, error)
 throw error
 }

View File

@@ -313,7 +313,7 @@ describe('Function Execute API Route', () => {
 'block-2': 'world',
 },
 blockNameMapping: {
-validVar: 'block-1',
+validvar: 'block-1',
 another_valid: 'block-2',
 },
 })
@@ -539,7 +539,7 @@
 'block-complex': complexData,
 },
 blockNameMapping: {
-complexData: 'block-complex',
+complexdata: 'block-complex',
 },
 })

View File

@@ -6,11 +6,11 @@ import { executeInE2B } from '@/lib/execution/e2b'
 import { executeInIsolatedVM } from '@/lib/execution/isolated-vm'
 import { CodeLanguage, DEFAULT_CODE_LANGUAGE, isValidCodeLanguage } from '@/lib/execution/languages'
 import { escapeRegExp, normalizeName, REFERENCE } from '@/executor/constants'
+import { type OutputSchema, resolveBlockReference } from '@/executor/utils/block-reference'
 import {
 createEnvVarPattern,
 createWorkflowVariablePattern,
 } from '@/executor/utils/reference-validation'
-import { navigatePath } from '@/executor/variables/resolvers/reference'
 export const dynamic = 'force-dynamic'
 export const runtime = 'nodejs'
@@ -470,14 +470,17 @@ function resolveEnvironmentVariables(
 function resolveTagVariables(
 code: string,
-blockData: Record<string, any>,
+blockData: Record<string, unknown>,
 blockNameMapping: Record<string, string>,
-contextVariables: Record<string, any>
+blockOutputSchemas: Record<string, OutputSchema>,
+contextVariables: Record<string, unknown>,
+language = 'javascript'
 ): string {
 let resolvedCode = code
+const undefinedLiteral = language === 'python' ? 'None' : 'undefined'
 const tagPattern = new RegExp(
-`${REFERENCE.START}([a-zA-Z_][a-zA-Z0-9_${REFERENCE.PATH_DELIMITER}]*[a-zA-Z0-9_])${REFERENCE.END}`,
+`${REFERENCE.START}([a-zA-Z_](?:[a-zA-Z0-9_${REFERENCE.PATH_DELIMITER}]*[a-zA-Z0-9_])?)${REFERENCE.END}`,
 'g'
 )
 const tagMatches = resolvedCode.match(tagPattern) || []
@@ -486,41 +489,37 @@
 const tagName = match.slice(REFERENCE.START.length, -REFERENCE.END.length).trim()
 const pathParts = tagName.split(REFERENCE.PATH_DELIMITER)
 const blockName = pathParts[0]
+const fieldPath = pathParts.slice(1)
-const blockId = blockNameMapping[blockName]
-if (!blockId) {
+const result = resolveBlockReference(blockName, fieldPath, {
+blockNameMapping,
+blockData,
+blockOutputSchemas,
+})
+if (!result) {
 continue
 }
-const blockOutput = blockData[blockId]
-if (blockOutput === undefined) {
-continue
-}
-let tagValue: any
-if (pathParts.length === 1) {
-tagValue = blockOutput
-} else {
-tagValue = navigatePath(blockOutput, pathParts.slice(1))
-}
+let tagValue = result.value
 if (tagValue === undefined) {
+resolvedCode = resolvedCode.replace(new RegExp(escapeRegExp(match), 'g'), undefinedLiteral)
 continue
 }
-if (
-typeof tagValue === 'string' &&
-tagValue.length > 100 &&
-(tagValue.startsWith('{') || tagValue.startsWith('['))
-) {
-try {
-tagValue = JSON.parse(tagValue)
-} catch {
-// Keep as-is
-}
-}
+if (typeof tagValue === 'string') {
+const trimmed = tagValue.trimStart()
+if (trimmed.startsWith('{') || trimmed.startsWith('[')) {
+try {
+tagValue = JSON.parse(tagValue)
+} catch {
+// Keep as string if not valid JSON
+}
+}
+}
-const safeVarName = `__tag_${tagName.replace(/[^a-zA-Z0-9_]/g, '_')}`
+const safeVarName = `__tag_${tagName.replace(/_/g, '_1').replace(/\./g, '_0')}`
 contextVariables[safeVarName] = tagValue
 resolvedCode = resolvedCode.replace(new RegExp(escapeRegExp(match), 'g'), safeVarName)
 }
@@ -537,18 +536,27 @@
 */
 function resolveCodeVariables(
 code: string,
-params: Record<string, any>,
+params: Record<string, unknown>,
 envVars: Record<string, string> = {},
-blockData: Record<string, any> = {},
+blockData: Record<string, unknown> = {},
 blockNameMapping: Record<string, string> = {},
-workflowVariables: Record<string, any> = {}
-): { resolvedCode: string; contextVariables: Record<string, any> } {
+blockOutputSchemas: Record<string, OutputSchema> = {},
+workflowVariables: Record<string, unknown> = {},
+language = 'javascript'
+): { resolvedCode: string; contextVariables: Record<string, unknown> } {
 let resolvedCode = code
-const contextVariables: Record<string, any> = {}
+const contextVariables: Record<string, unknown> = {}
 resolvedCode = resolveWorkflowVariables(resolvedCode, workflowVariables, contextVariables)
 resolvedCode = resolveEnvironmentVariables(resolvedCode, params, envVars, contextVariables)
-resolvedCode = resolveTagVariables(resolvedCode, blockData, blockNameMapping, contextVariables)
+resolvedCode = resolveTagVariables(
+resolvedCode,
+blockData,
+blockNameMapping,
+blockOutputSchemas,
+contextVariables,
+language
+)
 return { resolvedCode, contextVariables }
 }
@@ -585,6 +593,7 @@ export async function POST(req: NextRequest) {
 envVars = {},
 blockData = {},
 blockNameMapping = {},
+blockOutputSchemas = {},
 workflowVariables = {},
 workflowId,
 isCustomTool = false,
@@ -601,20 +610,21 @@
 isCustomTool,
 })
-// Resolve variables in the code with workflow environment variables
+const lang = isValidCodeLanguage(language) ? language : DEFAULT_CODE_LANGUAGE
 const codeResolution = resolveCodeVariables(
 code,
 executionParams,
 envVars,
 blockData,
 blockNameMapping,
-workflowVariables
+blockOutputSchemas,
+workflowVariables,
+lang
 )
 resolvedCode = codeResolution.resolvedCode
 const contextVariables = codeResolution.contextVariables
-const lang = isValidCodeLanguage(language) ? language : DEFAULT_CODE_LANGUAGE
 let jsImports = ''
 let jsRemainingCode = resolvedCode
 let hasImports = false
@@ -670,7 +680,11 @@
 prologue += `const environmentVariables = JSON.parse(${JSON.stringify(JSON.stringify(envVars))});\n`
 prologueLineCount++
 for (const [k, v] of Object.entries(contextVariables)) {
-prologue += `const ${k} = JSON.parse(${JSON.stringify(JSON.stringify(v))});\n`
+if (v === undefined) {
+prologue += `const ${k} = undefined;\n`
+} else {
+prologue += `const ${k} = JSON.parse(${JSON.stringify(JSON.stringify(v))});\n`
+}
 prologueLineCount++
 }
@@ -741,7 +755,11 @@
 prologue += `environmentVariables = json.loads(${JSON.stringify(JSON.stringify(envVars))})\n`
 prologueLineCount++
 for (const [k, v] of Object.entries(contextVariables)) {
-prologue += `${k} = json.loads(${JSON.stringify(JSON.stringify(v))})\n`
+if (v === undefined) {
+prologue += `${k} = None\n`
+} else {
+prologue += `${k} = json.loads(${JSON.stringify(JSON.stringify(v))})\n`
+}
 prologueLineCount++
 }
 const wrapped = [

View File

@@ -0,0 +1,27 @@
+'use client'
+import Link from 'next/link'
+import { useBrandedButtonClass } from '@/hooks/use-branded-button-class'
+interface BrandedLinkProps {
+href: string
+children: React.ReactNode
+className?: string
+target?: string
+rel?: string
+}
+export function BrandedLink({ href, children, className = '', target, rel }: BrandedLinkProps) {
+const buttonClass = useBrandedButtonClass()
+return (
+<Link
+href={href}
+target={target}
+rel={rel}
+className={`${buttonClass} group inline-flex items-center justify-center gap-2 rounded-[10px] py-[6px] pr-[10px] pl-[12px] text-[15px] text-white transition-all ${className}`}
+>
+{children}
+</Link>
+)
+}

View File

@@ -2,6 +2,7 @@ import { BookOpen, Github, Rss } from 'lucide-react'
 import Link from 'next/link'
 import { inter } from '@/app/_styles/fonts/inter/inter'
 import { soehne } from '@/app/_styles/fonts/soehne/soehne'
+import { BrandedLink } from '@/app/changelog/components/branded-link'
 import ChangelogList from '@/app/changelog/components/timeline-list'
 export interface ChangelogEntry {
@@ -66,25 +67,24 @@ export default async function ChangelogContent() {
 <hr className='mt-6 border-border' />
 <div className='mt-6 flex flex-wrap items-center gap-3 text-sm'>
-<Link
+<BrandedLink
 href='https://github.com/simstudioai/sim/releases'
 target='_blank'
 rel='noopener noreferrer'
-className='group inline-flex items-center justify-center gap-2 rounded-[10px] border border-[#6F3DFA] bg-gradient-to-b from-[#8357FF] to-[#6F3DFA] py-[6px] pr-[10px] pl-[12px] text-[14px] text-white shadow-[inset_0_2px_4px_0_#9B77FF] transition-all sm:text-[16px]'
 >
 <Github className='h-4 w-4' />
 View on GitHub
-</Link>
+</BrandedLink>
 <Link
 href='https://docs.sim.ai'
-className='inline-flex items-center gap-2 rounded-md border border-border px-3 py-1.5 hover:bg-muted'
+className='inline-flex items-center gap-2 rounded-[10px] border border-border py-[6px] pr-[10px] pl-[12px] text-[15px] transition-all hover:bg-muted'
 >
 <BookOpen className='h-4 w-4' />
 Documentation
 </Link>
 <Link
 href='/changelog.xml'
-className='inline-flex items-center gap-2 rounded-md border border-border px-3 py-1.5 hover:bg-muted'
+className='inline-flex items-center gap-2 rounded-[10px] border border-border py-[6px] pr-[10px] pl-[12px] text-[15px] transition-all hover:bg-muted'
 >
 <Rss className='h-4 w-4' />
 RSS Feed

View File

@@ -117,7 +117,7 @@ export default function ChatClient({ identifier }: { identifier: string }) {
 const [error, setError] = useState<string | null>(null)
 const messagesEndRef = useRef<HTMLDivElement>(null)
 const messagesContainerRef = useRef<HTMLDivElement>(null)
-const [starCount, setStarCount] = useState('25.1k')
+const [starCount, setStarCount] = useState('25.8k')
 const [conversationId, setConversationId] = useState('')
 const [showScrollButton, setShowScrollButton] = useState(false)

View File

@@ -207,7 +207,6 @@ function TemplateCardInner({
 isPannable={false}
 defaultZoom={0.8}
 fitPadding={0.2}
-lightweight
 />
 ) : (
 <div className='h-full w-full bg-[var(--surface-4)]' />

View File

@@ -16,8 +16,8 @@ import {
 import { redactApiKeys } from '@/lib/core/security/redaction'
 import { cn } from '@/lib/core/utils/cn'
 import {
-BlockDetailsSidebar,
 getLeftmostBlockId,
+PreviewEditor,
 WorkflowPreview,
 } from '@/app/workspace/[workspaceId]/w/components/preview'
 import { useExecutionSnapshot } from '@/hooks/queries/logs'
@@ -248,11 +248,10 @@ export function ExecutionSnapshot({
 cursorStyle='pointer'
 executedBlocks={blockExecutions}
 selectedBlockId={pinnedBlockId}
-lightweight
 />
 </div>
 {pinnedBlockId && workflowState.blocks[pinnedBlockId] && (
-<BlockDetailsSidebar
+<PreviewEditor
 block={workflowState.blocks[pinnedBlockId]}
 executionData={blockExecutions[pinnedBlockId]}
 allBlockExecutions={blockExecutions}


@@ -213,7 +213,6 @@ function TemplateCardInner({
isPannable={false}
defaultZoom={0.8}
fitPadding={0.2}
- lightweight
cursorStyle='pointer'
/>
) : (
) : ( ) : (


@@ -151,6 +151,29 @@ export const ActionBar = memo(
</Tooltip.Root>
)}
{isSubflowBlock && (
<Tooltip.Root>
<Tooltip.Trigger asChild>
<Button
variant='ghost'
onClick={(e) => {
e.stopPropagation()
if (!disabled) {
collaborativeBatchToggleBlockEnabled([blockId])
}
}}
className={ACTION_BUTTON_STYLES}
disabled={disabled}
>
{isEnabled ? <Circle className={ICON_SIZE} /> : <CircleOff className={ICON_SIZE} />}
</Button>
</Tooltip.Trigger>
<Tooltip.Content side='top'>
{getTooltipMessage(isEnabled ? 'Disable Block' : 'Enable Block')}
</Tooltip.Content>
</Tooltip.Root>
)}
{!isStartBlock && !isResponseBlock && (
<Tooltip.Root>
<Tooltip.Trigger asChild>
@@ -222,29 +245,6 @@ export const ActionBar = memo(
</Tooltip.Root>
)}
{isSubflowBlock && (
<Tooltip.Root>
<Tooltip.Trigger asChild>
<Button
variant='ghost'
onClick={(e) => {
e.stopPropagation()
if (!disabled) {
collaborativeBatchToggleBlockEnabled([blockId])
}
}}
className={ACTION_BUTTON_STYLES}
disabled={disabled}
>
{isEnabled ? <Circle className={ICON_SIZE} /> : <CircleOff className={ICON_SIZE} />}
</Button>
</Tooltip.Trigger>
<Tooltip.Content side='top'>
{getTooltipMessage(isEnabled ? 'Disable Block' : 'Enable Block')}
</Tooltip.Content>
</Tooltip.Root>
)}
<Tooltip.Root>
<Tooltip.Trigger asChild>
<Button


@@ -78,6 +78,7 @@ const CopilotMessage: FC<CopilotMessageProps> = memo(
mode,
setMode,
isAborting,
+ maskCredentialValue,
} = useCopilotStore()
const messageCheckpoints = isUser ? allMessageCheckpoints[message.id] || [] : []
@@ -210,7 +211,10 @@ const CopilotMessage: FC<CopilotMessageProps> = memo(
const isLastTextBlock =
index === message.contentBlocks!.length - 1 && block.type === 'text'
const parsed = parseSpecialTags(block.content)
- const cleanBlockContent = parsed.cleanContent.replace(/\n{3,}/g, '\n\n')
+ // Mask credential IDs in the displayed content
+ const cleanBlockContent = maskCredentialValue(
+ parsed.cleanContent.replace(/\n{3,}/g, '\n\n')
+ )
if (!cleanBlockContent.trim()) return null
@@ -238,7 +242,7 @@ const CopilotMessage: FC<CopilotMessageProps> = memo(
return (
<div key={blockKey} className='w-full'>
<ThinkingBlock
- content={block.content}
+ content={maskCredentialValue(block.content)}
isStreaming={isActivelyStreaming}
hasFollowingContent={hasFollowingContent}
hasSpecialTags={hasSpecialTags}
@@ -261,7 +265,7 @@ const CopilotMessage: FC<CopilotMessageProps> = memo(
}
return null
})
- }, [message.contentBlocks, isActivelyStreaming, parsedTags, isLastMessage])
+ }, [message.contentBlocks, isActivelyStreaming, parsedTags, isLastMessage, maskCredentialValue])
if (isUser) {
return (


@@ -782,6 +782,7 @@ const SubagentContentRenderer = memo(function SubagentContentRenderer({
const [isExpanded, setIsExpanded] = useState(true) const [isExpanded, setIsExpanded] = useState(true)
const [duration, setDuration] = useState(0) const [duration, setDuration] = useState(0)
const startTimeRef = useRef<number>(Date.now()) const startTimeRef = useRef<number>(Date.now())
const maskCredentialValue = useCopilotStore((s) => s.maskCredentialValue)
const wasStreamingRef = useRef(false) const wasStreamingRef = useRef(false)
// Only show streaming animations for current message // Only show streaming animations for current message
@@ -816,14 +817,16 @@ const SubagentContentRenderer = memo(function SubagentContentRenderer({
currentText += parsed.cleanContent
} else if (block.type === 'subagent_tool_call' && block.toolCall) {
if (currentText.trim()) {
- segments.push({ type: 'text', content: currentText })
+ // Mask any credential IDs in the accumulated text before displaying
+ segments.push({ type: 'text', content: maskCredentialValue(currentText) })
currentText = ''
}
segments.push({ type: 'tool', block })
}
}
if (currentText.trim()) {
- segments.push({ type: 'text', content: currentText })
+ // Mask any credential IDs in the accumulated text before displaying
+ segments.push({ type: 'text', content: maskCredentialValue(currentText) })
}
const allParsed = parseSpecialTags(allRawText)
@@ -952,6 +955,7 @@ const WorkflowEditSummary = memo(function WorkflowEditSummary({
toolCall: CopilotToolCall toolCall: CopilotToolCall
}) { }) {
const blocks = useWorkflowStore((s) => s.blocks) const blocks = useWorkflowStore((s) => s.blocks)
const maskCredentialValue = useCopilotStore((s) => s.maskCredentialValue)
const cachedBlockInfoRef = useRef<Record<string, { name: string; type: string }>>({}) const cachedBlockInfoRef = useRef<Record<string, { name: string; type: string }>>({})
@@ -983,6 +987,7 @@ const WorkflowEditSummary = memo(function WorkflowEditSummary({
title: string title: string
value: any value: any
isPassword?: boolean isPassword?: boolean
isCredential?: boolean
} }
interface BlockChange { interface BlockChange {
@@ -1091,6 +1096,7 @@ const WorkflowEditSummary = memo(function WorkflowEditSummary({
title: subBlockConfig.title ?? subBlockConfig.id, title: subBlockConfig.title ?? subBlockConfig.id,
value, value,
isPassword: subBlockConfig.password === true, isPassword: subBlockConfig.password === true,
isCredential: subBlockConfig.type === 'oauth-input',
}) })
} }
} }
@@ -1172,8 +1178,15 @@ const WorkflowEditSummary = memo(function WorkflowEditSummary({
{subBlocksToShow && subBlocksToShow.length > 0 && ( {subBlocksToShow && subBlocksToShow.length > 0 && (
<div className='border-[var(--border-1)] border-t px-2.5 py-1.5'> <div className='border-[var(--border-1)] border-t px-2.5 py-1.5'>
{subBlocksToShow.map((sb) => {
- // Mask password fields like the canvas does
- const displayValue = sb.isPassword ? '•••' : getDisplayValue(sb.value)
+ // Mask password fields and credential IDs
+ let displayValue: string
+ if (sb.isPassword) {
+ displayValue = '•••'
+ } else {
+ // Get display value first, then mask any credential IDs that might be in it
+ const rawValue = getDisplayValue(sb.value)
+ displayValue = maskCredentialValue(rawValue)
+ }
return ( return (
<div key={sb.id} className='flex items-start gap-1.5 py-0.5 text-[11px]'> <div key={sb.id} className='flex items-start gap-1.5 py-0.5 text-[11px]'>
<span <span
@@ -1412,10 +1425,13 @@ function RunSkipButtons({
setIsProcessing(true) setIsProcessing(true)
setButtonsHidden(true) setButtonsHidden(true)
try { try {
- // Add to auto-allowed list first
+ // Add to auto-allowed list - this also executes all pending integration tools of this type
await addAutoAllowedTool(toolCall.name)
- // Then execute
- await handleRun(toolCall, setToolCallState, onStateChange, editedParams)
+ // For client tools with interrupts (not integration tools), we still need to call handleRun
+ // since executeIntegrationTool only works for server-side tools
+ if (!isIntegrationTool(toolCall.name)) {
+ await handleRun(toolCall, setToolCallState, onStateChange, editedParams)
+ }
} finally { } finally {
setIsProcessing(false) setIsProcessing(false)
actionInProgressRef.current = false actionInProgressRef.current = false
@@ -1438,10 +1454,10 @@ function RunSkipButtons({
if (buttonsHidden) return null if (buttonsHidden) return null
// Hide "Always Allow" for integration tools (only show for client tools with interrupts) // Show "Always Allow" for all tools that require confirmation
const showAlwaysAllow = !isIntegrationTool(toolCall.name) const showAlwaysAllow = true
// Standardized buttons for all interrupt tools: Allow, (Always Allow for client tools only), Skip // Standardized buttons for all interrupt tools: Allow, Always Allow, Skip
return ( return (
<div className='mt-[10px] flex gap-[6px]'> <div className='mt-[10px] flex gap-[6px]'>
<Button onClick={onRun} disabled={isProcessing} variant='tertiary'> <Button onClick={onRun} disabled={isProcessing} variant='tertiary'>


@@ -105,10 +105,10 @@ export function useCopilotInitialization(props: UseCopilotInitializationProps) {
isSendingMessage,
])
- /** Load auto-allowed tools once on mount */
+ /** Load auto-allowed tools once on mount - runs immediately, independent of workflow */
const hasLoadedAutoAllowedToolsRef = useRef(false)
useEffect(() => {
- if (hasMountedRef.current && !hasLoadedAutoAllowedToolsRef.current) {
+ if (!hasLoadedAutoAllowedToolsRef.current) {
hasLoadedAutoAllowedToolsRef.current = true
loadAutoAllowedTools().catch((err) => {
logger.warn('[Copilot] Failed to load auto-allowed tools', err)


@@ -18,8 +18,8 @@ import {
import { Skeleton } from '@/components/ui'
import type { WorkflowDeploymentVersionResponse } from '@/lib/workflows/persistence/utils'
import {
- BlockDetailsSidebar,
getLeftmostBlockId,
+ PreviewEditor,
WorkflowPreview,
} from '@/app/workspace/[workspaceId]/w/components/preview'
import { useDeploymentVersionState, useRevertToVersion } from '@/hooks/queries/workflows'
@@ -337,7 +337,7 @@ export function GeneralDeploy({
/>
</div>
{expandedSelectedBlockId && workflowToShow.blocks?.[expandedSelectedBlockId] && (
- <BlockDetailsSidebar
+ <PreviewEditor
block={workflowToShow.blocks[expandedSelectedBlockId]}
workflowVariables={workflowToShow.variables}
loops={workflowToShow.loops}


@@ -446,7 +446,6 @@ const OGCaptureContainer = forwardRef<HTMLDivElement>((_, ref) => {
isPannable={false}
defaultZoom={0.8}
fitPadding={0.2}
- lightweight
/>
</div>
)


@@ -34,3 +34,4 @@ export { Text } from './text/text'
export { TimeInput } from './time-input/time-input'
export { ToolInput } from './tool-input/tool-input'
export { VariablesInput } from './variables-input/variables-input'
+ export { WorkflowSelectorInput } from './workflow-selector/workflow-selector-input'


@@ -2,12 +2,13 @@ import { useMemo, useRef, useState } from 'react'
import { Badge, Input } from '@/components/emcn' import { Badge, Input } from '@/components/emcn'
import { Label } from '@/components/ui/label' import { Label } from '@/components/ui/label'
import { cn } from '@/lib/core/utils/cn' import { cn } from '@/lib/core/utils/cn'
import { extractInputFieldsFromBlocks } from '@/lib/workflows/input-format'
import { formatDisplayText } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/components/formatted-text' import { formatDisplayText } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/components/formatted-text'
import { TagDropdown } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/components/tag-dropdown/tag-dropdown' import { TagDropdown } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/components/tag-dropdown/tag-dropdown'
import { useSubBlockInput } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/hooks/use-sub-block-input' import { useSubBlockInput } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/hooks/use-sub-block-input'
import { useSubBlockValue } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/hooks/use-sub-block-value' import { useSubBlockValue } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/hooks/use-sub-block-value'
import { useAccessibleReferencePrefixes } from '@/app/workspace/[workspaceId]/w/[workflowId]/hooks/use-accessible-reference-prefixes' import { useAccessibleReferencePrefixes } from '@/app/workspace/[workspaceId]/w/[workflowId]/hooks/use-accessible-reference-prefixes'
- import { useWorkflowInputFields } from '@/hooks/queries/workflows'
+ import { useWorkflowState } from '@/hooks/queries/workflows'
/** /**
* Props for the InputMappingField component * Props for the InputMappingField component
@@ -70,7 +71,11 @@ export function InputMapping({
const overlayRefs = useRef<Map<string, HTMLDivElement>>(new Map()) const overlayRefs = useRef<Map<string, HTMLDivElement>>(new Map())
const workflowId = typeof selectedWorkflowId === 'string' ? selectedWorkflowId : undefined const workflowId = typeof selectedWorkflowId === 'string' ? selectedWorkflowId : undefined
- const { data: childInputFields = [], isLoading } = useWorkflowInputFields(workflowId)
+ const { data: workflowState, isLoading } = useWorkflowState(workflowId)
+ const childInputFields = useMemo(
+ () => (workflowState?.blocks ? extractInputFieldsFromBlocks(workflowState.blocks) : []),
+ [workflowState?.blocks]
+ )
const [collapsedFields, setCollapsedFields] = useState<Record<string, boolean>>({}) const [collapsedFields, setCollapsedFields] = useState<Record<string, boolean>>({})
const valueObj: Record<string, string> = useMemo(() => { const valueObj: Record<string, string> = useMemo(() => {


@@ -29,6 +29,7 @@ import {
type OAuthProvider, type OAuthProvider,
type OAuthService, type OAuthService,
} from '@/lib/oauth' } from '@/lib/oauth'
import { extractInputFieldsFromBlocks } from '@/lib/workflows/input-format'
import { useUserPermissionsContext } from '@/app/workspace/[workspaceId]/providers/workspace-permissions-provider' import { useUserPermissionsContext } from '@/app/workspace/[workspaceId]/providers/workspace-permissions-provider'
import { import {
CheckboxList, CheckboxList,
@@ -65,7 +66,7 @@ import { useForceRefreshMcpTools, useMcpServers, useStoredMcpTools } from '@/hoo
import { import {
useChildDeploymentStatus, useChildDeploymentStatus,
useDeployChildWorkflow, useDeployChildWorkflow,
- useWorkflowInputFields,
+ useWorkflowState,
useWorkflows, useWorkflows,
} from '@/hooks/queries/workflows' } from '@/hooks/queries/workflows'
import { usePermissionConfig } from '@/hooks/use-permission-config' import { usePermissionConfig } from '@/hooks/use-permission-config'
@@ -771,7 +772,11 @@ function WorkflowInputMapperSyncWrapper({
disabled: boolean disabled: boolean
workflowId: string workflowId: string
}) { }) {
- const { data: inputFields = [], isLoading } = useWorkflowInputFields(workflowId)
+ const { data: workflowState, isLoading } = useWorkflowState(workflowId)
+ const inputFields = useMemo(
+ () => (workflowState?.blocks ? extractInputFieldsFromBlocks(workflowState.blocks) : []),
+ [workflowState?.blocks]
+ )
const parsedValue = useMemo(() => { const parsedValue = useMemo(() => {
try { try {


@@ -0,0 +1,45 @@
'use client'
import { useMemo } from 'react'
import { SelectorCombobox } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/components/selector-combobox/selector-combobox'
import type { SubBlockConfig } from '@/blocks/types'
import type { SelectorContext } from '@/hooks/selectors/types'
import { useWorkflowRegistry } from '@/stores/workflows/registry/store'
interface WorkflowSelectorInputProps {
blockId: string
subBlock: SubBlockConfig
disabled?: boolean
isPreview?: boolean
previewValue?: string | null
}
export function WorkflowSelectorInput({
blockId,
subBlock,
disabled = false,
isPreview = false,
previewValue,
}: WorkflowSelectorInputProps) {
const activeWorkflowId = useWorkflowRegistry((s) => s.activeWorkflowId)
const context: SelectorContext = useMemo(
() => ({
excludeWorkflowId: activeWorkflowId ?? undefined,
}),
[activeWorkflowId]
)
return (
<SelectorCombobox
blockId={blockId}
subBlock={subBlock}
selectorKey='sim.workflows'
selectorContext={context}
disabled={disabled}
isPreview={isPreview}
previewValue={previewValue}
placeholder={subBlock.placeholder || 'Select workflow...'}
/>
)
}


@@ -40,6 +40,7 @@ import {
TimeInput, TimeInput,
ToolInput, ToolInput,
VariablesInput, VariablesInput,
WorkflowSelectorInput,
} from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/components' } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/components'
import { useDependsOnGate } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/hooks/use-depends-on-gate' import { useDependsOnGate } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/hooks/use-depends-on-gate'
import type { SubBlockConfig } from '@/blocks/types' import type { SubBlockConfig } from '@/blocks/types'
@@ -90,7 +91,6 @@ const isFieldRequired = (config: SubBlockConfig, subBlockValues?: Record<string,
if (!config.required) return false if (!config.required) return false
if (typeof config.required === 'boolean') return config.required if (typeof config.required === 'boolean') return config.required
// Helper function to evaluate a condition
const evalCond = ( const evalCond = (
cond: { cond: {
field: string field: string
@@ -132,7 +132,6 @@ const isFieldRequired = (config: SubBlockConfig, subBlockValues?: Record<string,
return match return match
} }
// If required is a condition object or function, evaluate it
const condition = typeof config.required === 'function' ? config.required() : config.required const condition = typeof config.required === 'function' ? config.required() : config.required
return evalCond(condition, subBlockValues || {}) return evalCond(condition, subBlockValues || {})
} }
@@ -378,7 +377,6 @@ function SubBlockComponent({
setIsValidJson(isValid) setIsValidJson(isValid)
} }
// Check if wand is enabled for this sub-block
const isWandEnabled = config.wandConfig?.enabled ?? false const isWandEnabled = config.wandConfig?.enabled ?? false
/** /**
@@ -438,8 +436,6 @@ function SubBlockComponent({
| null | null
| undefined | undefined
// Use dependsOn gating to compute final disabled state
// Only pass previewContextValues when in preview mode to avoid format mismatches
const { finalDisabled: gatedDisabled } = useDependsOnGate(blockId, config, { const { finalDisabled: gatedDisabled } = useDependsOnGate(blockId, config, {
disabled, disabled,
isPreview, isPreview,
@@ -869,6 +865,17 @@ function SubBlockComponent({
/> />
) )
case 'workflow-selector':
return (
<WorkflowSelectorInput
blockId={blockId}
subBlock={config}
disabled={isDisabled}
isPreview={isPreview}
previewValue={previewValue as string | null}
/>
)
case 'mcp-server-selector': case 'mcp-server-selector':
return ( return (
<McpServerSelector <McpServerSelector


@@ -68,7 +68,7 @@ export function SubflowEditor({
<div className='flex flex-1 flex-col overflow-hidden pt-[0px]'>
{/* Subflow Editor Section */}
<div ref={subBlocksRef} className='subblocks-section flex flex-1 flex-col overflow-hidden'>
- <div className='flex-1 overflow-y-auto overflow-x-hidden px-[8px] pt-[5px] pb-[8px]'>
+ <div className='flex-1 overflow-y-auto overflow-x-hidden px-[8px] pt-[9px] pb-[8px]'>
{/* Type Selection */}
<div>
<Label className='mb-[6.5px] block pl-[2px] font-medium text-[13px] text-[var(--text-primary)]'>


@@ -2,7 +2,16 @@
import { useCallback, useEffect, useMemo, useRef, useState } from 'react'
import { isEqual } from 'lodash'
- import { BookOpen, Check, ChevronDown, ChevronUp, Pencil } from 'lucide-react'
+ import {
+ BookOpen,
+ Check,
+ ChevronDown,
+ ChevronUp,
+ ExternalLink,
+ Loader2,
+ Pencil,
+ } from 'lucide-react'
+ import { useParams } from 'next/navigation'
import { useShallow } from 'zustand/react/shallow'
import { useStoreWithEqualityFn } from 'zustand/traditional'
import { Button, Tooltip } from '@/components/emcn'
@@ -29,8 +38,10 @@ import { LoopTool } from '@/app/workspace/[workspaceId]/w/[workflowId]/component
import { ParallelTool } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/subflows/parallel/parallel-config' import { ParallelTool } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/subflows/parallel/parallel-config'
import { getSubBlockStableKey } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/workflow-block/utils' import { getSubBlockStableKey } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/workflow-block/utils'
import { useCurrentWorkflow } from '@/app/workspace/[workspaceId]/w/[workflowId]/hooks' import { useCurrentWorkflow } from '@/app/workspace/[workspaceId]/w/[workflowId]/hooks'
import { WorkflowPreview } from '@/app/workspace/[workspaceId]/w/components/preview'
import { getBlock } from '@/blocks/registry' import { getBlock } from '@/blocks/registry'
import type { SubBlockType } from '@/blocks/types' import type { SubBlockType } from '@/blocks/types'
import { useWorkflowState } from '@/hooks/queries/workflows'
import { useCollaborativeWorkflow } from '@/hooks/use-collaborative-workflow' import { useCollaborativeWorkflow } from '@/hooks/use-collaborative-workflow'
import { usePanelEditorStore } from '@/stores/panel' import { usePanelEditorStore } from '@/stores/panel'
import { useWorkflowRegistry } from '@/stores/workflows/registry/store' import { useWorkflowRegistry } from '@/stores/workflows/registry/store'
@@ -85,6 +96,14 @@ export function Editor() {
// Get subflow display properties from configs // Get subflow display properties from configs
const subflowConfig = isSubflow ? (currentBlock.type === 'loop' ? LoopTool : ParallelTool) : null const subflowConfig = isSubflow ? (currentBlock.type === 'loop' ? LoopTool : ParallelTool) : null
// Check if selected block is a workflow block
const isWorkflowBlock =
currentBlock && (currentBlock.type === 'workflow' || currentBlock.type === 'workflow_input')
// Get workspace ID from params
const params = useParams()
const workspaceId = params.workspaceId as string
// Refs for resize functionality // Refs for resize functionality
const subBlocksRef = useRef<HTMLDivElement>(null) const subBlocksRef = useRef<HTMLDivElement>(null)
@@ -254,11 +273,11 @@ export function Editor() {
// Trigger rename mode when signaled from context menu
useEffect(() => {
- if (shouldFocusRename && currentBlock && !isSubflow) {
+ if (shouldFocusRename && currentBlock) {
handleStartRename()
setShouldFocusRename(false)
}
- }, [shouldFocusRename, currentBlock, isSubflow, handleStartRename, setShouldFocusRename])
+ }, [shouldFocusRename, currentBlock, handleStartRename, setShouldFocusRename])
/**
* Handles opening documentation link in a new secure tab.
@@ -270,6 +289,22 @@ export function Editor() {
}
}
// Get child workflow ID for workflow blocks
const childWorkflowId = isWorkflowBlock ? blockSubBlockValues?.workflowId : null
// Fetch child workflow state for preview (only for workflow blocks with a selected workflow)
const { data: childWorkflowState, isLoading: isLoadingChildWorkflow } =
useWorkflowState(childWorkflowId)
/**
* Handles opening the child workflow in a new tab.
*/
const handleOpenChildWorkflow = useCallback(() => {
if (childWorkflowId && workspaceId) {
window.open(`/workspace/${workspaceId}/w/${childWorkflowId}`, '_blank', 'noopener,noreferrer')
}
}, [childWorkflowId, workspaceId])
// Determine if connections are at minimum height (collapsed state) // Determine if connections are at minimum height (collapsed state)
const isConnectionsAtMinHeight = connectionsHeight <= 35 const isConnectionsAtMinHeight = connectionsHeight <= 35
@@ -322,7 +357,7 @@ export function Editor() {
</div>
<div className='flex shrink-0 items-center gap-[8px]'>
{/* Rename button */}
- {currentBlock && !isSubflow && (
+ {currentBlock && (
<Tooltip.Root>
<Tooltip.Trigger asChild>
<Button
@@ -408,7 +443,66 @@ export function Editor() {
className='subblocks-section flex flex-1 flex-col overflow-hidden' className='subblocks-section flex flex-1 flex-col overflow-hidden'
> >
<div className='flex-1 overflow-y-auto overflow-x-hidden px-[8px] pt-[12px] pb-[8px] [overflow-anchor:none]'> <div className='flex-1 overflow-y-auto overflow-x-hidden px-[8px] pt-[12px] pb-[8px] [overflow-anchor:none]'>
- {subBlocks.length === 0 ? (
+ {/* Workflow Preview - only for workflow blocks with a selected child workflow */}
{isWorkflowBlock && childWorkflowId && (
<>
<div className='subblock-content flex flex-col gap-[9.5px]'>
<div className='pl-[2px] font-medium text-[13px] text-[var(--text-primary)] leading-none'>
Workflow Preview
</div>
<div className='relative h-[160px] overflow-hidden rounded-[4px] border border-[var(--border)]'>
{isLoadingChildWorkflow ? (
<div className='flex h-full items-center justify-center bg-[var(--surface-3)]'>
<Loader2 className='h-5 w-5 animate-spin text-[var(--text-tertiary)]' />
</div>
) : childWorkflowState ? (
<>
<div className='[&_*:active]:!cursor-grabbing [&_*]:!cursor-grab [&_.react-flow__handle]:!hidden h-full w-full'>
<WorkflowPreview
workflowState={childWorkflowState}
height={160}
width='100%'
isPannable={true}
defaultZoom={0.6}
fitPadding={0.15}
cursorStyle='grab'
/>
</div>
<Tooltip.Root>
<Tooltip.Trigger asChild>
<Button
type='button'
variant='ghost'
onClick={handleOpenChildWorkflow}
className='absolute right-[6px] bottom-[6px] z-10 h-[24px] w-[24px] cursor-pointer border border-[var(--border)] bg-[var(--surface-2)] p-0 hover:bg-[var(--surface-4)]'
>
<ExternalLink className='h-[12px] w-[12px]' />
</Button>
</Tooltip.Trigger>
<Tooltip.Content side='top'>Open workflow</Tooltip.Content>
</Tooltip.Root>
</>
) : (
<div className='flex h-full items-center justify-center bg-[var(--surface-3)]'>
<span className='text-[13px] text-[var(--text-tertiary)]'>
Unable to load preview
</span>
</div>
)}
</div>
</div>
<div className='subblock-divider px-[2px] pt-[16px] pb-[13px]'>
<div
className='h-[1.25px]'
style={{
backgroundImage:
'repeating-linear-gradient(to right, var(--border) 0px, var(--border) 6px, transparent 6px, transparent 12px)',
}}
/>
</div>
</>
)}
{subBlocks.length === 0 && !isWorkflowBlock ? (
<div className='flex h-full items-center justify-center text-center text-[#8D8D8D] text-[13px]'>
This block has no subblocks
</div>


@@ -21,11 +21,9 @@ interface WorkflowPreviewBlockData {
}
/**
- * Lightweight block component for workflow previews.
- * Renders block header, dummy subblocks skeleton, and handles.
- * Respects horizontalHandles and enabled state from workflow.
- * No heavy hooks, store subscriptions, or interactive features.
- * Used in template cards and other preview contexts for performance.
+ * Preview block component for workflow visualization.
+ * Renders block header, subblocks skeleton, and handles without
+ * hooks, store subscriptions, or interactive features.
*/
function WorkflowPreviewBlockInner({ data }: NodeProps<WorkflowPreviewBlockData>) {
const {


@@ -685,7 +685,7 @@ interface WorkflowVariable {
value: unknown value: unknown
} }
- interface BlockDetailsSidebarProps {
+ interface PreviewEditorProps {
block: BlockState block: BlockState
executionData?: ExecutionData executionData?: ExecutionData
/** All block execution data for resolving variable references */ /** All block execution data for resolving variable references */
@@ -722,7 +722,7 @@ const DEFAULT_CONNECTIONS_HEIGHT = 150
/** /**
* Readonly sidebar panel showing block configuration using SubBlock components. * Readonly sidebar panel showing block configuration using SubBlock components.
*/ */
- function BlockDetailsSidebarContent({
+ function PreviewEditorContent({
block, block,
executionData, executionData,
allBlockExecutions, allBlockExecutions,
@@ -732,7 +732,7 @@ function BlockDetailsSidebarContent({
parallels, parallels,
isExecutionMode = false, isExecutionMode = false,
onClose, onClose,
- }: BlockDetailsSidebarProps) {
+ }: PreviewEditorProps) {
// Convert Record<string, Variable> to Array<Variable> for iteration // Convert Record<string, Variable> to Array<Variable> for iteration
const normalizedWorkflowVariables = useMemo(() => { const normalizedWorkflowVariables = useMemo(() => {
if (!workflowVariables) return [] if (!workflowVariables) return []
@@ -998,6 +998,22 @@ function BlockDetailsSidebarContent({
}) })
}, [extractedRefs.envVars]) }, [extractedRefs.envVars])
const rawValues = useMemo(() => {
return Object.entries(subBlockValues).reduce<Record<string, unknown>>((acc, [key, entry]) => {
if (entry && typeof entry === 'object' && 'value' in entry) {
acc[key] = (entry as { value: unknown }).value
} else {
acc[key] = entry
}
return acc
}, {})
}, [subBlockValues])
const canonicalIndex = useMemo(
() => buildCanonicalIndex(blockConfig?.subBlocks || []),
[blockConfig?.subBlocks]
)
// Check if this is a subflow block (loop or parallel) // Check if this is a subflow block (loop or parallel)
const isSubflow = block.type === 'loop' || block.type === 'parallel' const isSubflow = block.type === 'loop' || block.type === 'parallel'
const loopConfig = block.type === 'loop' ? loops?.[block.id] : undefined const loopConfig = block.type === 'loop' ? loops?.[block.id] : undefined
@@ -1079,21 +1095,6 @@ function BlockDetailsSidebarContent({
) )
} }
const rawValues = useMemo(() => {
return Object.entries(subBlockValues).reduce<Record<string, unknown>>((acc, [key, entry]) => {
if (entry && typeof entry === 'object' && 'value' in entry) {
acc[key] = (entry as { value: unknown }).value
} else {
acc[key] = entry
}
return acc
}, {})
}, [subBlockValues])
const canonicalIndex = useMemo(
() => buildCanonicalIndex(blockConfig.subBlocks),
[blockConfig.subBlocks]
)
const canonicalModeOverrides = block.data?.canonicalModes const canonicalModeOverrides = block.data?.canonicalModes
const effectiveAdvanced = const effectiveAdvanced =
(block.advancedMode ?? false) || (block.advancedMode ?? false) ||
@@ -1371,12 +1372,12 @@ function BlockDetailsSidebarContent({
}
/**
- * Block details sidebar wrapped in ReactFlowProvider for hook compatibility.
+ * Preview editor wrapped in ReactFlowProvider for hook compatibility.
*/
- export function BlockDetailsSidebar(props: BlockDetailsSidebarProps) {
+ export function PreviewEditor(props: PreviewEditorProps) {
return (
<ReactFlowProvider>
- <BlockDetailsSidebarContent {...props} />
+ <PreviewEditorContent {...props} />
</ReactFlowProvider>
)
}


@@ -15,10 +15,9 @@ interface WorkflowPreviewSubflowData {
}
/**
- * Lightweight subflow component for workflow previews.
- * Matches the styling of the actual SubflowNodeComponent but without
- * hooks, store subscriptions, or interactive features.
- * Used in template cards and other preview contexts for performance.
+ * Preview subflow component for workflow visualization.
+ * Renders loop/parallel containers without hooks, store subscriptions,
+ * or interactive features.
*/
function WorkflowPreviewSubflowInner({ data }: NodeProps<WorkflowPreviewSubflowData>) {
const { name, width = 500, height = 300, kind, isPreviewSelected = false } = data


@@ -1,2 +1,2 @@
- export { BlockDetailsSidebar } from './components/block-details-sidebar'
+ export { PreviewEditor } from './components/preview-editor'
export { getLeftmostBlockId, WorkflowPreview } from './preview'


@@ -15,14 +15,10 @@ import 'reactflow/dist/style.css'
import { createLogger } from '@sim/logger'
import { cn } from '@/lib/core/utils/cn'
import { BLOCK_DIMENSIONS, CONTAINER_DIMENSIONS } from '@/lib/workflows/blocks/block-dimensions'
- import { NoteBlock } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/note-block/note-block'
- import { SubflowNodeComponent } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/subflows/subflow-node'
- import { WorkflowBlock } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/workflow-block/workflow-block'
import { WorkflowEdge } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/workflow-edge/workflow-edge'
import { estimateBlockDimensions } from '@/app/workspace/[workspaceId]/w/[workflowId]/utils'
import { WorkflowPreviewBlock } from '@/app/workspace/[workspaceId]/w/components/preview/components/block'
import { WorkflowPreviewSubflow } from '@/app/workspace/[workspaceId]/w/components/preview/components/subflow'
- import { getBlock } from '@/blocks'
import type { BlockState, WorkflowState } from '@/stores/workflows/workflow/types'
const logger = createLogger('WorkflowPreview')
@@ -134,8 +130,6 @@ interface WorkflowPreviewProps {
onNodeContextMenu?: (blockId: string, mousePosition: { x: number; y: number }) => void onNodeContextMenu?: (blockId: string, mousePosition: { x: number; y: number }) => void
/** Callback when the canvas (empty area) is clicked */ /** Callback when the canvas (empty area) is clicked */
onPaneClick?: () => void onPaneClick?: () => void
/** Use lightweight blocks for better performance in template cards */
lightweight?: boolean
/** Cursor style to show when hovering the canvas */ /** Cursor style to show when hovering the canvas */
cursorStyle?: 'default' | 'pointer' | 'grab' cursorStyle?: 'default' | 'pointer' | 'grab'
/** Map of executed block IDs to their status for highlighting the execution path */ /** Map of executed block IDs to their status for highlighting the execution path */
@@ -145,19 +139,10 @@ interface WorkflowPreviewProps {
}
/**
- * Full node types with interactive WorkflowBlock for detailed previews
- * This prevents interaction issues while allowing canvas panning and node clicking.
+ * Preview node types using minimal components without hooks or store subscriptions.
*/
- const fullNodeTypes: NodeTypes = {
- workflowBlock: WorkflowBlock,
- noteBlock: NoteBlock,
- subflowNode: SubflowNodeComponent,
- }
- /**
- * Lightweight node types for template cards and other high-volume previews.
- * Uses minimal components without hooks or store subscriptions.
- */
- const lightweightNodeTypes: NodeTypes = {
+ const previewNodeTypes: NodeTypes = {
workflowBlock: WorkflowPreviewBlock,
noteBlock: WorkflowPreviewBlock,
subflowNode: WorkflowPreviewSubflow,
@@ -172,17 +157,19 @@ const edgeTypes: EdgeTypes = {
interface FitViewOnChangeProps {
nodeIds: string
fitPadding: number
+ containerRef: React.RefObject<HTMLDivElement | null>
}
/**
- * Helper component that calls fitView when the set of nodes changes.
+ * Helper component that calls fitView when the set of nodes changes or when the container resizes.
* Only triggers on actual node additions/removals, not on selection changes.
* Must be rendered inside ReactFlowProvider.
*/
- function FitViewOnChange({ nodeIds, fitPadding }: FitViewOnChangeProps) {
+ function FitViewOnChange({ nodeIds, fitPadding, containerRef }: FitViewOnChangeProps) {
const { fitView } = useReactFlow()
const lastNodeIdsRef = useRef<string | null>(null)
+ // Fit view when nodes change
useEffect(() => {
if (!nodeIds.length) return
const shouldFit = lastNodeIdsRef.current !== nodeIds
@@ -195,6 +182,27 @@ function FitViewOnChange({ nodeIds, fitPadding }: FitViewOnChangeProps) {
return () => clearTimeout(timeoutId) return () => clearTimeout(timeoutId)
}, [nodeIds, fitPadding, fitView]) }, [nodeIds, fitPadding, fitView])
// Fit view when container resizes (debounced to avoid excessive calls during drag)
useEffect(() => {
const container = containerRef.current
if (!container) return
let timeoutId: ReturnType<typeof setTimeout> | null = null
const resizeObserver = new ResizeObserver(() => {
if (timeoutId) clearTimeout(timeoutId)
timeoutId = setTimeout(() => {
fitView({ padding: fitPadding, duration: 150 })
}, 100)
})
resizeObserver.observe(container)
return () => {
if (timeoutId) clearTimeout(timeoutId)
resizeObserver.disconnect()
}
}, [containerRef, fitPadding, fitView])
return null return null
} }
@@ -210,15 +218,12 @@ export function WorkflowPreview({
onNodeClick,
onNodeContextMenu,
onPaneClick,
- lightweight = false,
cursorStyle = 'grab',
executedBlocks,
selectedBlockId,
}: WorkflowPreviewProps) {
- const nodeTypes = useMemo(
- () => (lightweight ? lightweightNodeTypes : fullNodeTypes),
- [lightweight]
- )
+ const containerRef = useRef<HTMLDivElement>(null)
+ const nodeTypes = previewNodeTypes
const isValidWorkflowState = workflowState?.blocks && workflowState.edges
const blocksStructure = useMemo(() => {
@@ -288,119 +293,28 @@ export function WorkflowPreview({
const absolutePosition = calculateAbsolutePosition(block, workflowState.blocks) const absolutePosition = calculateAbsolutePosition(block, workflowState.blocks)
if (lightweight) { // Handle loop/parallel containers
if (block.type === 'loop' || block.type === 'parallel') { if (block.type === 'loop' || block.type === 'parallel') {
const isSelected = selectedBlockId === blockId
const dimensions = calculateContainerDimensions(blockId, workflowState.blocks)
nodeArray.push({
id: blockId,
type: 'subflowNode',
position: absolutePosition,
draggable: false,
data: {
name: block.name,
width: dimensions.width,
height: dimensions.height,
kind: block.type as 'loop' | 'parallel',
isPreviewSelected: isSelected,
},
})
return
}
const isSelected = selectedBlockId === blockId
let lightweightExecutionStatus: ExecutionStatus | undefined
if (executedBlocks) {
const blockExecution = executedBlocks[blockId]
if (blockExecution) {
if (blockExecution.status === 'error') {
lightweightExecutionStatus = 'error'
} else if (blockExecution.status === 'success') {
lightweightExecutionStatus = 'success'
} else {
lightweightExecutionStatus = 'not-executed'
}
} else {
lightweightExecutionStatus = 'not-executed'
}
}
nodeArray.push({
id: blockId,
type: 'workflowBlock',
position: absolutePosition,
draggable: false,
// Blocks inside subflows need higher z-index to appear above the container
zIndex: block.data?.parentId ? 10 : undefined,
data: {
type: block.type,
name: block.name,
isTrigger: block.triggerMode === true,
horizontalHandles: block.horizontalHandles ?? false,
enabled: block.enabled ?? true,
isPreviewSelected: isSelected,
executionStatus: lightweightExecutionStatus,
},
})
return
}
if (block.type === 'loop') {
const isSelected = selectedBlockId === blockId const isSelected = selectedBlockId === blockId
const dimensions = calculateContainerDimensions(blockId, workflowState.blocks) const dimensions = calculateContainerDimensions(blockId, workflowState.blocks)
nodeArray.push({ nodeArray.push({
id: blockId, id: blockId,
type: 'subflowNode', type: 'subflowNode',
position: absolutePosition, position: absolutePosition,
parentId: block.data?.parentId,
extent: block.data?.extent || undefined,
draggable: false, draggable: false,
data: { data: {
...block.data,
name: block.name, name: block.name,
width: dimensions.width, width: dimensions.width,
height: dimensions.height, height: dimensions.height,
state: 'valid', kind: block.type as 'loop' | 'parallel',
isPreview: true,
isPreviewSelected: isSelected, isPreviewSelected: isSelected,
kind: 'loop',
}, },
}) })
return return
} }
if (block.type === 'parallel') { // Handle regular blocks
const isSelected = selectedBlockId === blockId const isSelected = selectedBlockId === blockId
const dimensions = calculateContainerDimensions(blockId, workflowState.blocks)
nodeArray.push({
id: blockId,
type: 'subflowNode',
position: absolutePosition,
parentId: block.data?.parentId,
extent: block.data?.extent || undefined,
draggable: false,
data: {
...block.data,
name: block.name,
width: dimensions.width,
height: dimensions.height,
state: 'valid',
isPreview: true,
isPreviewSelected: isSelected,
kind: 'parallel',
},
})
return
}
const blockConfig = getBlock(block.type)
if (!blockConfig) {
logger.error(`No configuration found for block type: ${block.type}`, { blockId })
return
}
const nodeType = block.type === 'note' ? 'noteBlock' : 'workflowBlock'
let executionStatus: ExecutionStatus | undefined let executionStatus: ExecutionStatus | undefined
if (executedBlocks) { if (executedBlocks) {
@@ -418,24 +332,20 @@ export function WorkflowPreview({
} }
} }
const isSelected = selectedBlockId === blockId
nodeArray.push({ nodeArray.push({
id: blockId, id: blockId,
type: nodeType, type: 'workflowBlock',
position: absolutePosition, position: absolutePosition,
draggable: false, draggable: false,
// Blocks inside subflows need higher z-index to appear above the container // Blocks inside subflows need higher z-index to appear above the container
zIndex: block.data?.parentId ? 10 : undefined, zIndex: block.data?.parentId ? 10 : undefined,
data: { data: {
type: block.type, type: block.type,
config: blockConfig,
name: block.name, name: block.name,
blockState: block, isTrigger: block.triggerMode === true,
canEdit: false, horizontalHandles: block.horizontalHandles ?? false,
isPreview: true, enabled: block.enabled ?? true,
isPreviewSelected: isSelected, isPreviewSelected: isSelected,
subBlockValues: block.subBlocks ?? {},
executionStatus, executionStatus,
}, },
}) })
@@ -448,7 +358,6 @@ export function WorkflowPreview({
parallelsStructure,
workflowState.blocks,
isValidWorkflowState,
- lightweight,
executedBlocks,
selectedBlockId,
])
@@ -506,6 +415,7 @@ export function WorkflowPreview({
return ( return (
<ReactFlowProvider> <ReactFlowProvider>
<div <div
ref={containerRef}
style={{ height, width, backgroundColor: 'var(--bg)' }} style={{ height, width, backgroundColor: 'var(--bg)' }}
className={cn('preview-mode', onNodeClick && 'interactive-nodes', className)} className={cn('preview-mode', onNodeClick && 'interactive-nodes', className)}
> >
@@ -516,6 +426,20 @@ export function WorkflowPreview({
.preview-mode .react-flow__selectionpane { cursor: ${cursorStyle} !important; } .preview-mode .react-flow__selectionpane { cursor: ${cursorStyle} !important; }
.preview-mode .react-flow__renderer { cursor: ${cursorStyle}; } .preview-mode .react-flow__renderer { cursor: ${cursorStyle}; }
/* Active/grabbing cursor when dragging */
${
cursorStyle === 'grab'
? `
.preview-mode .react-flow:active { cursor: grabbing; }
.preview-mode .react-flow__pane:active { cursor: grabbing !important; }
.preview-mode .react-flow__selectionpane:active { cursor: grabbing !important; }
.preview-mode .react-flow__renderer:active { cursor: grabbing; }
.preview-mode .react-flow__node:active { cursor: grabbing !important; }
.preview-mode .react-flow__node:active * { cursor: grabbing !important; }
`
: ''
}
/* Node cursor - pointer on nodes when onNodeClick is provided */ /* Node cursor - pointer on nodes when onNodeClick is provided */
.preview-mode.interactive-nodes .react-flow__node { cursor: pointer !important; } .preview-mode.interactive-nodes .react-flow__node { cursor: pointer !important; }
.preview-mode.interactive-nodes .react-flow__node > div { cursor: pointer !important; } .preview-mode.interactive-nodes .react-flow__node > div { cursor: pointer !important; }
@@ -563,7 +487,11 @@ export function WorkflowPreview({
}
onPaneClick={onPaneClick}
/>
- <FitViewOnChange nodeIds={blocksStructure.ids} fitPadding={fitPadding} />
+ <FitViewOnChange
+ nodeIds={blocksStructure.ids}
+ fitPadding={fitPadding}
+ containerRef={containerRef}
+ />
</div>
</ReactFlowProvider>
)


@@ -11,7 +11,7 @@ import { useSidebarStore } from '@/stores/sidebar/store'
* Avatar display configuration for responsive layout. * Avatar display configuration for responsive layout.
*/ */
const AVATAR_CONFIG = { const AVATAR_CONFIG = {
- MIN_COUNT: 3,
+ MIN_COUNT: 4,
MAX_COUNT: 12, MAX_COUNT: 12,
WIDTH_PER_AVATAR: 20, WIDTH_PER_AVATAR: 20,
} as const } as const
@@ -106,7 +106,9 @@ export function Avatars({ workflowId }: AvatarsProps) {
}, [presenceUsers, currentWorkflowId, workflowId, currentSocketId]) }, [presenceUsers, currentWorkflowId, workflowId, currentSocketId])
/** /**
- * Calculate visible users and overflow count
+ * Calculate visible users and overflow count.
+ * Shows up to maxVisible avatars, with overflow indicator for any remaining.
+ * Users are reversed so new avatars appear on the left (keeping right side stable).
*/ */
const { visibleUsers, overflowCount } = useMemo(() => { const { visibleUsers, overflowCount } = useMemo(() => {
if (workflowUsers.length === 0) { if (workflowUsers.length === 0) {
@@ -116,7 +118,8 @@ export function Avatars({ workflowId }: AvatarsProps) {
const visible = workflowUsers.slice(0, maxVisible) const visible = workflowUsers.slice(0, maxVisible)
const overflow = Math.max(0, workflowUsers.length - maxVisible) const overflow = Math.max(0, workflowUsers.length - maxVisible)
- return { visibleUsers: visible, overflowCount: overflow }
+ // Reverse so rightmost avatars stay stable as new ones are revealed on the left
+ return { visibleUsers: [...visible].reverse(), overflowCount: overflow }
}, [workflowUsers, maxVisible]) }, [workflowUsers, maxVisible])
if (visibleUsers.length === 0) { if (visibleUsers.length === 0) {
@@ -139,9 +142,8 @@ export function Avatars({ workflowId }: AvatarsProps) {
</Tooltip.Content> </Tooltip.Content>
</Tooltip.Root> </Tooltip.Root>
)} )}
{visibleUsers.map((user, index) => (
- <UserAvatar key={user.socketId} user={user} index={overflowCount > 0 ? index + 1 : index} />
+ <UserAvatar key={user.socketId} user={user} index={index} />
))} ))}
</div> </div>
) )


@@ -347,7 +347,7 @@ export function WorkflowItem({
) : (
<div
className={clsx(
- 'min-w-0 flex-1 truncate font-medium',
+ 'min-w-0 truncate font-medium',
active
? 'text-[var(--text-primary)]'
: 'text-[var(--text-tertiary)] group-hover:text-[var(--text-primary)]'


@@ -242,15 +242,9 @@ Return ONLY the email body - no explanations, no extra text.`,
id: 'messageId',
title: 'Message ID',
type: 'short-input',
- placeholder: 'Enter message ID to read (optional)',
- condition: {
- field: 'operation',
- value: 'read_gmail',
- and: {
- field: 'folder',
- value: '',
- },
- },
+ placeholder: 'Read specific email by ID (overrides label/folder)',
+ condition: { field: 'operation', value: 'read_gmail' },
+ mode: 'advanced',
},
// Search Fields
{


@@ -129,12 +129,9 @@ ROUTING RULES:
3. If the context is even partially related to a route's description, select that route
4. ONLY output NO_MATCH if the context is completely unrelated to ALL route descriptions
- - Output EXACTLY one route ID (copied exactly as shown above) OR "NO_MATCH"
- - No explanation, no punctuation, no additional text
- - Just the route ID or NO_MATCH
- Your response:`
+ OUTPUT FORMAT: Respond with a JSON object containing:
+ - route: EXACTLY one route ID (copied exactly as shown above) OR "NO_MATCH"
+ - reasoning: A brief explanation (1-2 sentences) of why you chose this route`
}
/**
@@ -272,6 +269,7 @@ interface RouterV2Response extends ToolResponse {
total: number
}
selectedRoute: string
+ reasoning: string
selectedPath: {
blockId: string
blockType: string
@@ -355,6 +353,7 @@ export const RouterV2Block: BlockConfig<RouterV2Response> = {
tokens: { type: 'json', description: 'Token usage' },
cost: { type: 'json', description: 'Cost information' },
selectedRoute: { type: 'string', description: 'Selected route ID' },
+ reasoning: { type: 'string', description: 'Explanation of why this route was chosen' },
selectedPath: { type: 'json', description: 'Selected routing path' },
},
}


@@ -1,30 +1,5 @@
import { createLogger } from '@sim/logger'
import { WorkflowIcon } from '@/components/icons' import { WorkflowIcon } from '@/components/icons'
import type { BlockConfig } from '@/blocks/types' import type { BlockConfig } from '@/blocks/types'
import { useWorkflowRegistry } from '@/stores/workflows/registry/store'
const logger = createLogger('WorkflowBlock')
// Helper function to get available workflows for the dropdown
const getAvailableWorkflows = (): Array<{ label: string; id: string }> => {
try {
const { workflows, activeWorkflowId } = useWorkflowRegistry.getState()
// Filter out the current workflow to prevent recursion
const availableWorkflows = Object.entries(workflows)
.filter(([id]) => id !== activeWorkflowId)
.map(([id, workflow]) => ({
label: workflow.name || `Workflow ${id.slice(0, 8)}`,
id: id,
}))
.sort((a, b) => a.label.localeCompare(b.label))
return availableWorkflows
} catch (error) {
logger.error('Error getting available workflows:', error)
return []
}
}
export const WorkflowBlock: BlockConfig = { export const WorkflowBlock: BlockConfig = {
type: 'workflow', type: 'workflow',
@@ -38,8 +13,7 @@ export const WorkflowBlock: BlockConfig = {
{
id: 'workflowId',
title: 'Select Workflow',
- type: 'combobox',
- options: getAvailableWorkflows,
+ type: 'workflow-selector',
placeholder: 'Search workflows...',
required: true,
},


@@ -1,18 +1,5 @@
import { WorkflowIcon } from '@/components/icons' import { WorkflowIcon } from '@/components/icons'
import type { BlockConfig } from '@/blocks/types' import type { BlockConfig } from '@/blocks/types'
import { useWorkflowRegistry } from '@/stores/workflows/registry/store'
const getAvailableWorkflows = (): Array<{ label: string; id: string }> => {
try {
const { workflows, activeWorkflowId } = useWorkflowRegistry.getState()
return Object.entries(workflows)
.filter(([id]) => id !== activeWorkflowId)
.map(([id, w]) => ({ label: w.name || `Workflow ${id.slice(0, 8)}`, id }))
.sort((a, b) => a.label.localeCompare(b.label))
} catch {
return []
}
}
export const WorkflowInputBlock: BlockConfig = { export const WorkflowInputBlock: BlockConfig = {
type: 'workflow_input', type: 'workflow_input',
@@ -25,21 +12,19 @@ export const WorkflowInputBlock: BlockConfig = {
`,
category: 'blocks',
docsLink: 'https://docs.sim.ai/blocks/workflow',
- bgColor: '#6366F1', // Indigo - modern and professional
+ bgColor: '#6366F1',
icon: WorkflowIcon,
subBlocks: [
{
id: 'workflowId',
title: 'Select Workflow',
- type: 'combobox',
- options: getAvailableWorkflows,
+ type: 'workflow-selector',
placeholder: 'Search workflows...',
required: true,
},
- // Renders dynamic mapping UI based on selected child workflow's Start trigger inputFormat
{
id: 'inputMapping',
- title: 'Input Mapping',
+ title: 'Inputs',
type: 'input-mapping',
description:
"Map fields defined in the child workflow's Start block to variables/values in this workflow.",


@@ -24,6 +24,71 @@ function createBlock(id: string, metadataId: string): SerializedBlock {
} }
} }
describe('DAGBuilder disabled subflow validation', () => {
it('skips validation for disabled loops with no blocks inside', () => {
const workflow: SerializedWorkflow = {
version: '1',
blocks: [
createBlock('start', BlockType.STARTER),
{ ...createBlock('loop-block', BlockType.FUNCTION), enabled: false },
],
connections: [],
loops: {
'loop-1': {
id: 'loop-1',
nodes: [], // Empty loop - would normally throw
iterations: 3,
},
},
}
const builder = new DAGBuilder()
// Should not throw even though loop has no blocks inside
expect(() => builder.build(workflow)).not.toThrow()
})
it('skips validation for disabled parallels with no blocks inside', () => {
const workflow: SerializedWorkflow = {
version: '1',
blocks: [createBlock('start', BlockType.STARTER)],
connections: [],
loops: {},
parallels: {
'parallel-1': {
id: 'parallel-1',
nodes: [], // Empty parallel - would normally throw
},
},
}
const builder = new DAGBuilder()
// Should not throw even though parallel has no blocks inside
expect(() => builder.build(workflow)).not.toThrow()
})
it('skips validation for loops where all inner blocks are disabled', () => {
const workflow: SerializedWorkflow = {
version: '1',
blocks: [
createBlock('start', BlockType.STARTER),
{ ...createBlock('inner-block', BlockType.FUNCTION), enabled: false },
],
connections: [],
loops: {
'loop-1': {
id: 'loop-1',
nodes: ['inner-block'], // Has node but it's disabled
iterations: 3,
},
},
}
const builder = new DAGBuilder()
// Should not throw - loop is effectively disabled since all inner blocks are disabled
expect(() => builder.build(workflow)).not.toThrow()
})
})
describe('DAGBuilder human-in-the-loop transformation', () => {
it('creates trigger nodes and rewires edges for pause blocks', () => {
const workflow: SerializedWorkflow = {


@@ -136,17 +136,18 @@ export class DAGBuilder {
nodes: string[] | undefined,
type: 'Loop' | 'Parallel'
): void {
+ const sentinelStartId =
+ type === 'Loop' ? buildSentinelStartId(id) : buildParallelSentinelStartId(id)
+ const sentinelStartNode = dag.nodes.get(sentinelStartId)
+ if (!sentinelStartNode) return
if (!nodes || nodes.length === 0) {
throw new Error(
`${type} has no blocks inside. Add at least one block to the ${type.toLowerCase()}.`
)
}
- const sentinelStartId =
- type === 'Loop' ? buildSentinelStartId(id) : buildParallelSentinelStartId(id)
- const sentinelStartNode = dag.nodes.get(sentinelStartId)
- if (!sentinelStartNode) return
const hasConnections = Array.from(sentinelStartNode.outgoingEdges.values()).some((edge) =>
nodes.includes(extractBaseBlockId(edge.target))
)
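Moving the sentinel lookup ahead of the emptiness check is what lets the new disabled-subflow tests pass: a loop or parallel whose sentinel node was never materialized (e.g. because it, or every block inside it, is disabled) returns before validation. A minimal sketch of that control flow, with simplified types rather than the project's real DAGBuilder:

interface SketchDag {
  nodes: Map<string, { id: string }>
}

function validateSubflow(
  dag: SketchDag,
  sentinelStartId: string,
  nodes: string[] | undefined,
  type: 'Loop' | 'Parallel'
): void {
  // If the subflow's sentinel was never built (e.g. the subflow is disabled),
  // there is nothing to validate, so skip the emptiness check entirely.
  const sentinelStartNode = dag.nodes.get(sentinelStartId)
  if (!sentinelStartNode) return

  // Only enabled, materialized subflows must contain at least one block.
  if (!nodes || nodes.length === 0) {
    throw new Error(
      `${type} has no blocks inside. Add at least one block to the ${type.toLowerCase()}.`
    )
  }
}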

File diff suppressed because it is too large


@@ -20,21 +20,13 @@ export class EdgeManager {
const activatedTargets: string[] = []
const edgesToDeactivate: Array<{ target: string; handle?: string }> = []
- // First pass: categorize edges as activating or deactivating
- // Don't modify incomingEdges yet - we need the original state for deactivation checks
- for (const [edgeId, edge] of node.outgoingEdges) {
+ for (const [, edge] of node.outgoingEdges) {
if (skipBackwardsEdge && this.isBackwardsEdge(edge.sourceHandle)) {
continue
}
- const shouldActivate = this.shouldActivateEdge(edge, output)
- if (!shouldActivate) {
- const isLoopEdge =
- edge.sourceHandle === EDGE.LOOP_CONTINUE ||
- edge.sourceHandle === EDGE.LOOP_CONTINUE_ALT ||
- edge.sourceHandle === EDGE.LOOP_EXIT
- if (!isLoopEdge) {
+ if (!this.shouldActivateEdge(edge, output)) {
+ if (!this.isLoopEdge(edge.sourceHandle)) {
edgesToDeactivate.push({ target: edge.target, handle: edge.sourceHandle })
}
continue
@@ -43,13 +35,19 @@ export class EdgeManager {
activatedTargets.push(edge.target)
}
- // Second pass: process deactivations while incomingEdges is still intact
- // This ensures hasActiveIncomingEdges can find all potential sources
+ const cascadeTargets = new Set<string>()
for (const { target, handle } of edgesToDeactivate) {
- this.deactivateEdgeAndDescendants(node.id, target, handle)
+ this.deactivateEdgeAndDescendants(node.id, target, handle, cascadeTargets)
+ }
+ if (activatedTargets.length === 0) {
+ for (const { target } of edgesToDeactivate) {
+ if (this.isTerminalControlNode(target)) {
+ cascadeTargets.add(target)
+ }
+ }
}
- // Third pass: update incomingEdges for activated targets
for (const targetId of activatedTargets) {
const targetNode = this.dag.nodes.get(targetId)
if (!targetNode) {
@@ -59,28 +57,25 @@ export class EdgeManager {
targetNode.incomingEdges.delete(node.id)
}
- // Fourth pass: check readiness after all edge processing is complete
for (const targetId of activatedTargets) {
- const targetNode = this.dag.nodes.get(targetId)
- if (targetNode && this.isNodeReady(targetNode)) {
+ if (this.isTargetReady(targetId)) {
readyNodes.push(targetId)
}
}
+ for (const targetId of cascadeTargets) {
+ if (!readyNodes.includes(targetId) && !activatedTargets.includes(targetId)) {
+ if (this.isTargetReady(targetId)) {
+ readyNodes.push(targetId)
+ }
+ }
+ }
return readyNodes
}
isNodeReady(node: DAGNode): boolean {
- if (node.incomingEdges.size === 0) {
- return true
- }
- const activeIncomingCount = this.countActiveIncomingEdges(node)
- if (activeIncomingCount > 0) {
- return false
- }
- return true
+ return node.incomingEdges.size === 0 || this.countActiveIncomingEdges(node) === 0
}
restoreIncomingEdge(targetNodeId: string, sourceNodeId: string): void {
@@ -99,13 +94,10 @@ export class EdgeManager {
/**
* Clear deactivated edges for a set of nodes (used when restoring loop state for next iteration).
- * This ensures error/success edges can be re-evaluated on each iteration.
*/
clearDeactivatedEdgesForNodes(nodeIds: Set<string>): void {
const edgesToRemove: string[] = []
for (const edgeKey of this.deactivatedEdges) {
- // Edge key format is "sourceId-targetId-handle"
- // Check if either source or target is in the nodeIds set
for (const nodeId of nodeIds) {
if (edgeKey.startsWith(`${nodeId}-`) || edgeKey.includes(`-${nodeId}-`)) {
edgesToRemove.push(edgeKey)
@@ -118,6 +110,44 @@ export class EdgeManager {
}
}
private isTargetReady(targetId: string): boolean {
const targetNode = this.dag.nodes.get(targetId)
return targetNode ? this.isNodeReady(targetNode) : false
}
private isLoopEdge(handle?: string): boolean {
return (
handle === EDGE.LOOP_CONTINUE ||
handle === EDGE.LOOP_CONTINUE_ALT ||
handle === EDGE.LOOP_EXIT
)
}
private isControlEdge(handle?: string): boolean {
return (
handle === EDGE.LOOP_CONTINUE ||
handle === EDGE.LOOP_CONTINUE_ALT ||
handle === EDGE.LOOP_EXIT ||
handle === EDGE.PARALLEL_EXIT
)
}
private isBackwardsEdge(sourceHandle?: string): boolean {
return sourceHandle === EDGE.LOOP_CONTINUE || sourceHandle === EDGE.LOOP_CONTINUE_ALT
}
private isTerminalControlNode(nodeId: string): boolean {
const node = this.dag.nodes.get(nodeId)
if (!node || node.outgoingEdges.size === 0) return false
for (const [, edge] of node.outgoingEdges) {
if (!this.isControlEdge(edge.sourceHandle)) {
return false
}
}
return true
}
private shouldActivateEdge(edge: DAGEdge, output: NormalizedBlockOutput): boolean {
const handle = edge.sourceHandle
@@ -159,14 +189,12 @@ export class EdgeManager {
}
}
- private isBackwardsEdge(sourceHandle?: string): boolean {
- return sourceHandle === EDGE.LOOP_CONTINUE || sourceHandle === EDGE.LOOP_CONTINUE_ALT
- }
private deactivateEdgeAndDescendants(
sourceId: string,
targetId: string,
- sourceHandle?: string
+ sourceHandle?: string,
+ cascadeTargets?: Set<string>,
+ isCascade = false
): void {
const edgeKey = this.createEdgeKey(sourceId, targetId, sourceHandle)
if (this.deactivatedEdges.has(edgeKey)) {
@@ -174,38 +202,46 @@ export class EdgeManager {
}
this.deactivatedEdges.add(edgeKey)
const targetNode = this.dag.nodes.get(targetId)
if (!targetNode) return
- // Check if target has other active incoming edges
- // Pass the specific edge key being deactivated, not just source ID,
- // to handle multiple edges from same source to same target (e.g., condition branches)
- const hasOtherActiveIncoming = this.hasActiveIncomingEdges(targetNode, edgeKey)
- if (!hasOtherActiveIncoming) {
- for (const [_, outgoingEdge] of targetNode.outgoingEdges) {
- this.deactivateEdgeAndDescendants(targetId, outgoingEdge.target, outgoingEdge.sourceHandle)
+ if (isCascade && this.isTerminalControlNode(targetId)) {
+ cascadeTargets?.add(targetId)
+ }
+ if (this.hasActiveIncomingEdges(targetNode, edgeKey)) {
+ return
+ }
+ for (const [, outgoingEdge] of targetNode.outgoingEdges) {
+ if (!this.isControlEdge(outgoingEdge.sourceHandle)) {
+ this.deactivateEdgeAndDescendants(
+ targetId,
+ outgoingEdge.target,
+ outgoingEdge.sourceHandle,
+ cascadeTargets,
+ true
+ )
}
}
}
/**
* Checks if a node has any active incoming edges besides the one being excluded.
- * This properly handles the case where multiple edges from the same source go to
- * the same target (e.g., multiple condition branches pointing to one block).
*/
private hasActiveIncomingEdges(node: DAGNode, excludeEdgeKey: string): boolean {
for (const incomingSourceId of node.incomingEdges) {
const incomingNode = this.dag.nodes.get(incomingSourceId)
if (!incomingNode) continue
- for (const [_, incomingEdge] of incomingNode.outgoingEdges) {
+ for (const [, incomingEdge] of incomingNode.outgoingEdges) {
if (incomingEdge.target === node.id) {
const incomingEdgeKey = this.createEdgeKey(
incomingSourceId,
node.id,
incomingEdge.sourceHandle
)
- // Skip the specific edge being excluded, but check other edges from same source
if (incomingEdgeKey === excludeEdgeKey) continue
if (!this.deactivatedEdges.has(incomingEdgeKey)) {
return true
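The new helpers classify edge handles so that deactivating an untaken branch never cascades across loop or parallel control edges; terminal control nodes (nodes whose only outgoing edges are control edges, such as a loop-end sentinel) are instead collected and re-checked, which is how dead-end condition branches inside loops no longer stall the iteration. A standalone sketch of that classification (the handle string values below are illustrative assumptions, not the executor's real EDGE constants):

const EDGE = {
  LOOP_CONTINUE: 'loop_continue',
  LOOP_CONTINUE_ALT: 'loop_continue_alt',
  LOOP_EXIT: 'loop_exit',
  PARALLEL_EXIT: 'parallel_exit',
} as const

type Handle = string | undefined

// Loop edges are left alone when a branch is skipped, so the loop can still advance.
const isLoopEdge = (h: Handle) =>
  h === EDGE.LOOP_CONTINUE || h === EDGE.LOOP_CONTINUE_ALT || h === EDGE.LOOP_EXIT

// Control edges (loop handles plus the parallel exit) stop the deactivation cascade.
const isControlEdge = (h: Handle) => isLoopEdge(h) || h === EDGE.PARALLEL_EXIT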


@@ -554,6 +554,413 @@ describe('ExecutionEngine', () => {
})
})
describe('Error handling in execution', () => {
it('should fail execution when a single node throws an error', async () => {
const startNode = createMockNode('start', 'starter')
const errorNode = createMockNode('error-node', 'function')
startNode.outgoingEdges.set('edge1', { target: 'error-node' })
const dag = createMockDAG([startNode, errorNode])
const context = createMockContext()
const edgeManager = createMockEdgeManager((node) => {
if (node.id === 'start') return ['error-node']
return []
})
const nodeOrchestrator = {
executionCount: 0,
executeNode: vi.fn().mockImplementation(async (_ctx: ExecutionContext, nodeId: string) => {
if (nodeId === 'error-node') {
throw new Error('Block execution failed')
}
return { nodeId, output: {}, isFinalOutput: false }
}),
handleNodeCompletion: vi.fn(),
} as unknown as MockNodeOrchestrator
const engine = new ExecutionEngine(context, dag, edgeManager, nodeOrchestrator)
await expect(engine.run('start')).rejects.toThrow('Block execution failed')
})
it('should stop parallel branches when one branch throws an error', async () => {
const startNode = createMockNode('start', 'starter')
const parallelNodes = Array.from({ length: 5 }, (_, i) =>
createMockNode(`parallel${i}`, 'function')
)
parallelNodes.forEach((_, i) => {
startNode.outgoingEdges.set(`edge${i}`, { target: `parallel${i}` })
})
const dag = createMockDAG([startNode, ...parallelNodes])
const context = createMockContext()
const edgeManager = createMockEdgeManager((node) => {
if (node.id === 'start') return parallelNodes.map((_, i) => `parallel${i}`)
return []
})
const executedNodes: string[] = []
const nodeOrchestrator = {
executionCount: 0,
executeNode: vi.fn().mockImplementation(async (_ctx: ExecutionContext, nodeId: string) => {
executedNodes.push(nodeId)
if (nodeId === 'parallel0') {
await new Promise((resolve) => setTimeout(resolve, 10))
throw new Error('Parallel branch failed')
}
await new Promise((resolve) => setTimeout(resolve, 100))
return { nodeId, output: {}, isFinalOutput: false }
}),
handleNodeCompletion: vi.fn(),
} as unknown as MockNodeOrchestrator
const engine = new ExecutionEngine(context, dag, edgeManager, nodeOrchestrator)
await expect(engine.run('start')).rejects.toThrow('Parallel branch failed')
})
it('should capture only the first error when multiple parallel branches fail', async () => {
const startNode = createMockNode('start', 'starter')
const parallelNodes = Array.from({ length: 3 }, (_, i) =>
createMockNode(`parallel${i}`, 'function')
)
parallelNodes.forEach((_, i) => {
startNode.outgoingEdges.set(`edge${i}`, { target: `parallel${i}` })
})
const dag = createMockDAG([startNode, ...parallelNodes])
const context = createMockContext()
const edgeManager = createMockEdgeManager((node) => {
if (node.id === 'start') return parallelNodes.map((_, i) => `parallel${i}`)
return []
})
const nodeOrchestrator = {
executionCount: 0,
executeNode: vi.fn().mockImplementation(async (_ctx: ExecutionContext, nodeId: string) => {
if (nodeId === 'parallel0') {
await new Promise((resolve) => setTimeout(resolve, 10))
throw new Error('First error')
}
if (nodeId === 'parallel1') {
await new Promise((resolve) => setTimeout(resolve, 20))
throw new Error('Second error')
}
if (nodeId === 'parallel2') {
await new Promise((resolve) => setTimeout(resolve, 30))
throw new Error('Third error')
}
return { nodeId, output: {}, isFinalOutput: false }
}),
handleNodeCompletion: vi.fn(),
} as unknown as MockNodeOrchestrator
const engine = new ExecutionEngine(context, dag, edgeManager, nodeOrchestrator)
await expect(engine.run('start')).rejects.toThrow('First error')
})
it('should wait for ongoing executions to complete before throwing error', async () => {
const startNode = createMockNode('start', 'starter')
const fastErrorNode = createMockNode('fast-error', 'function')
const slowNode = createMockNode('slow', 'function')
startNode.outgoingEdges.set('edge1', { target: 'fast-error' })
startNode.outgoingEdges.set('edge2', { target: 'slow' })
const dag = createMockDAG([startNode, fastErrorNode, slowNode])
const context = createMockContext()
const edgeManager = createMockEdgeManager((node) => {
if (node.id === 'start') return ['fast-error', 'slow']
return []
})
let slowNodeCompleted = false
const nodeOrchestrator = {
executionCount: 0,
executeNode: vi.fn().mockImplementation(async (_ctx: ExecutionContext, nodeId: string) => {
if (nodeId === 'fast-error') {
await new Promise((resolve) => setTimeout(resolve, 10))
throw new Error('Fast error')
}
if (nodeId === 'slow') {
await new Promise((resolve) => setTimeout(resolve, 50))
slowNodeCompleted = true
return { nodeId, output: {}, isFinalOutput: false }
}
return { nodeId, output: {}, isFinalOutput: false }
}),
handleNodeCompletion: vi.fn(),
} as unknown as MockNodeOrchestrator
const engine = new ExecutionEngine(context, dag, edgeManager, nodeOrchestrator)
await expect(engine.run('start')).rejects.toThrow('Fast error')
expect(slowNodeCompleted).toBe(true)
})
it('should not queue new nodes after an error occurs', async () => {
const startNode = createMockNode('start', 'starter')
const errorNode = createMockNode('error-node', 'function')
const afterErrorNode = createMockNode('after-error', 'function')
startNode.outgoingEdges.set('edge1', { target: 'error-node' })
errorNode.outgoingEdges.set('edge2', { target: 'after-error' })
const dag = createMockDAG([startNode, errorNode, afterErrorNode])
const context = createMockContext()
const queuedNodes: string[] = []
const edgeManager = createMockEdgeManager((node) => {
if (node.id === 'start') {
queuedNodes.push('error-node')
return ['error-node']
}
if (node.id === 'error-node') {
queuedNodes.push('after-error')
return ['after-error']
}
return []
})
const executedNodes: string[] = []
const nodeOrchestrator = {
executionCount: 0,
executeNode: vi.fn().mockImplementation(async (_ctx: ExecutionContext, nodeId: string) => {
executedNodes.push(nodeId)
if (nodeId === 'error-node') {
throw new Error('Node error')
}
return { nodeId, output: {}, isFinalOutput: false }
}),
handleNodeCompletion: vi.fn(),
} as unknown as MockNodeOrchestrator
const engine = new ExecutionEngine(context, dag, edgeManager, nodeOrchestrator)
await expect(engine.run('start')).rejects.toThrow('Node error')
expect(executedNodes).not.toContain('after-error')
})
it('should populate error result with metadata when execution fails', async () => {
const startNode = createMockNode('start', 'starter')
const errorNode = createMockNode('error-node', 'function')
startNode.outgoingEdges.set('edge1', { target: 'error-node' })
const dag = createMockDAG([startNode, errorNode])
const context = createMockContext()
context.blockLogs.push({
blockId: 'start',
blockName: 'Start',
blockType: 'starter',
startedAt: new Date().toISOString(),
endedAt: new Date().toISOString(),
durationMs: 10,
success: true,
})
const edgeManager = createMockEdgeManager((node) => {
if (node.id === 'start') return ['error-node']
return []
})
const nodeOrchestrator = {
executionCount: 0,
executeNode: vi.fn().mockImplementation(async (_ctx: ExecutionContext, nodeId: string) => {
if (nodeId === 'error-node') {
const error = new Error('Execution failed') as any
error.executionResult = {
success: false,
output: { partial: 'data' },
logs: context.blockLogs,
metadata: context.metadata,
}
throw error
}
return { nodeId, output: {}, isFinalOutput: false }
}),
handleNodeCompletion: vi.fn(),
} as unknown as MockNodeOrchestrator
const engine = new ExecutionEngine(context, dag, edgeManager, nodeOrchestrator)
try {
await engine.run('start')
expect.fail('Should have thrown')
} catch (error: any) {
expect(error.executionResult).toBeDefined()
expect(error.executionResult.metadata.endTime).toBeDefined()
expect(error.executionResult.metadata.duration).toBeDefined()
}
})
it('should prefer cancellation status over error when both occur', async () => {
const abortController = new AbortController()
const startNode = createMockNode('start', 'starter')
const errorNode = createMockNode('error-node', 'function')
startNode.outgoingEdges.set('edge1', { target: 'error-node' })
const dag = createMockDAG([startNode, errorNode])
const context = createMockContext({ abortSignal: abortController.signal })
const edgeManager = createMockEdgeManager((node) => {
if (node.id === 'start') return ['error-node']
return []
})
const nodeOrchestrator = {
executionCount: 0,
executeNode: vi.fn().mockImplementation(async (_ctx: ExecutionContext, nodeId: string) => {
if (nodeId === 'error-node') {
abortController.abort()
throw new Error('Node error')
}
return { nodeId, output: {}, isFinalOutput: false }
}),
handleNodeCompletion: vi.fn(),
} as unknown as MockNodeOrchestrator
const engine = new ExecutionEngine(context, dag, edgeManager, nodeOrchestrator)
const result = await engine.run('start')
expect(result.status).toBe('cancelled')
expect(result.success).toBe(false)
})
it('should stop loop iteration when error occurs in loop body', async () => {
const loopStartNode = createMockNode('loop-start', 'loop_sentinel')
loopStartNode.metadata = { isSentinel: true, sentinelType: 'start', loopId: 'loop1' }
const loopBodyNode = createMockNode('loop-body', 'function')
loopBodyNode.metadata = { isLoopNode: true, loopId: 'loop1' }
const loopEndNode = createMockNode('loop-end', 'loop_sentinel')
loopEndNode.metadata = { isSentinel: true, sentinelType: 'end', loopId: 'loop1' }
const afterLoopNode = createMockNode('after-loop', 'function')
loopStartNode.outgoingEdges.set('edge1', { target: 'loop-body' })
loopBodyNode.outgoingEdges.set('edge2', { target: 'loop-end' })
loopEndNode.outgoingEdges.set('loop_continue', {
target: 'loop-start',
sourceHandle: 'loop_continue',
})
loopEndNode.outgoingEdges.set('loop_complete', {
target: 'after-loop',
sourceHandle: 'loop_complete',
})
const dag = createMockDAG([loopStartNode, loopBodyNode, loopEndNode, afterLoopNode])
const context = createMockContext()
let iterationCount = 0
const edgeManager = createMockEdgeManager((node) => {
if (node.id === 'loop-start') return ['loop-body']
if (node.id === 'loop-body') return ['loop-end']
if (node.id === 'loop-end') {
iterationCount++
if (iterationCount < 5) return ['loop-start']
return ['after-loop']
}
return []
})
const nodeOrchestrator = {
executionCount: 0,
executeNode: vi.fn().mockImplementation(async (_ctx: ExecutionContext, nodeId: string) => {
if (nodeId === 'loop-body' && iterationCount >= 2) {
throw new Error('Loop body error on iteration 3')
}
return { nodeId, output: {}, isFinalOutput: false }
}),
handleNodeCompletion: vi.fn(),
} as unknown as MockNodeOrchestrator
const engine = new ExecutionEngine(context, dag, edgeManager, nodeOrchestrator)
await expect(engine.run('loop-start')).rejects.toThrow('Loop body error on iteration 3')
expect(iterationCount).toBeLessThanOrEqual(3)
})
it('should handle error that is not an Error instance', async () => {
const startNode = createMockNode('start', 'starter')
const errorNode = createMockNode('error-node', 'function')
startNode.outgoingEdges.set('edge1', { target: 'error-node' })
const dag = createMockDAG([startNode, errorNode])
const context = createMockContext()
const edgeManager = createMockEdgeManager((node) => {
if (node.id === 'start') return ['error-node']
return []
})
const nodeOrchestrator = {
executionCount: 0,
executeNode: vi.fn().mockImplementation(async (_ctx: ExecutionContext, nodeId: string) => {
if (nodeId === 'error-node') {
throw 'String error message'
}
return { nodeId, output: {}, isFinalOutput: false }
}),
handleNodeCompletion: vi.fn(),
} as unknown as MockNodeOrchestrator
const engine = new ExecutionEngine(context, dag, edgeManager, nodeOrchestrator)
await expect(engine.run('start')).rejects.toThrow('String error message')
})
it('should preserve partial output when error occurs after some blocks complete', async () => {
const startNode = createMockNode('start', 'starter')
const successNode = createMockNode('success', 'function')
const errorNode = createMockNode('error-node', 'function')
startNode.outgoingEdges.set('edge1', { target: 'success' })
successNode.outgoingEdges.set('edge2', { target: 'error-node' })
const dag = createMockDAG([startNode, successNode, errorNode])
const context = createMockContext()
const edgeManager = createMockEdgeManager((node) => {
if (node.id === 'start') return ['success']
if (node.id === 'success') return ['error-node']
return []
})
const nodeOrchestrator = {
executionCount: 0,
executeNode: vi.fn().mockImplementation(async (_ctx: ExecutionContext, nodeId: string) => {
if (nodeId === 'success') {
return { nodeId, output: { successData: 'preserved' }, isFinalOutput: false }
}
if (nodeId === 'error-node') {
throw new Error('Late error')
}
return { nodeId, output: {}, isFinalOutput: false }
}),
handleNodeCompletion: vi.fn(),
} as unknown as MockNodeOrchestrator
const engine = new ExecutionEngine(context, dag, edgeManager, nodeOrchestrator)
try {
await engine.run('start')
expect.fail('Should have thrown')
} catch (error: any) {
// Verify the error was thrown
expect(error.message).toBe('Late error')
// The partial output should be available in executionResult if attached
if (error.executionResult) {
expect(error.executionResult.output).toBeDefined()
}
}
})
})
describe('Cancellation flag behavior', () => {
it('should set cancelledFlag when abort signal fires', async () => {
const abortController = new AbortController()


@@ -25,6 +25,8 @@ export class ExecutionEngine {
private pausedBlocks: Map<string, PauseMetadata> = new Map()
private allowResumeTriggers: boolean
private cancelledFlag = false
+ private errorFlag = false
+ private executionError: Error | null = null
private lastCancellationCheck = 0
private readonly useRedisCancellation: boolean
private readonly CANCELLATION_CHECK_INTERVAL_MS = 500
@@ -103,7 +105,7 @@ export class ExecutionEngine {
this.initializeQueue(triggerBlockId)
while (this.hasWork()) {
- if (await this.checkCancellation()) {
+ if ((await this.checkCancellation()) || this.errorFlag) {
break
}
await this.processQueue()
@@ -113,6 +115,11 @@ export class ExecutionEngine {
await this.waitForAllExecutions()
}
+ // Rethrow the captured error so it's handled by the catch block
+ if (this.errorFlag && this.executionError) {
+ throw this.executionError
+ }
if (this.pausedBlocks.size > 0) {
return this.buildPausedResult(startTime)
}
@@ -196,11 +203,17 @@ export class ExecutionEngine {
}
private trackExecution(promise: Promise<void>): void {
- this.executing.add(promise)
- promise.catch(() => {})
- promise.finally(() => {
- this.executing.delete(promise)
- })
+ const trackedPromise = promise
+ .catch((error) => {
+ if (!this.errorFlag) {
+ this.errorFlag = true
+ this.executionError = error instanceof Error ? error : new Error(String(error))
+ }
+ })
+ .finally(() => {
+ this.executing.delete(trackedPromise)
+ })
+ this.executing.add(trackedPromise)
}
private async waitForAnyExecution(): Promise<void> {
@@ -315,7 +328,7 @@ export class ExecutionEngine {
private async processQueue(): Promise<void> {
while (this.readyQueue.length > 0) {
- if (await this.checkCancellation()) {
+ if ((await this.checkCancellation()) || this.errorFlag) {
break
}
const nodeId = this.dequeue()
@@ -324,7 +337,7 @@ export class ExecutionEngine {
this.trackExecution(promise)
}
- if (this.executing.size > 0 && !this.cancelledFlag) {
+ if (this.executing.size > 0 && !this.cancelledFlag && !this.errorFlag) {
await this.waitForAnyExecution()
}
}
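The changes above make the engine remember the first failure from any tracked promise, stop pulling new work, let in-flight executions drain, and then rethrow. A simplified, self-contained sketch of that pattern (this is not the real ExecutionEngine, just the idea in isolation):

class FirstErrorTracker {
  private executing = new Set<Promise<void>>()
  private errorFlag = false
  private executionError: Error | null = null

  track(promise: Promise<void>): void {
    const tracked = promise
      .catch((error) => {
        // Only the first failure is kept; later failures are ignored.
        if (!this.errorFlag) {
          this.errorFlag = true
          this.executionError = error instanceof Error ? error : new Error(String(error))
        }
      })
      .finally(() => {
        this.executing.delete(tracked)
      })
    this.executing.add(tracked)
  }

  get failed(): boolean {
    return this.errorFlag
  }

  async drainAndRethrow(): Promise<void> {
    // Let already-started work finish before surfacing the captured error.
    await Promise.all(this.executing)
    if (this.errorFlag && this.executionError) {
      throw this.executionError
    }
  }
}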


@@ -305,7 +305,7 @@ export class AgentBlockHandler implements BlockHandler {
base.executeFunction = async (callParams: Record<string, any>) => {
const mergedParams = mergeToolParameters(userProvidedParams, callParams)
- const { blockData, blockNameMapping } = collectBlockData(ctx)
+ const { blockData, blockNameMapping, blockOutputSchemas } = collectBlockData(ctx)
const result = await executeTool(
'function_execute',
@@ -317,6 +317,7 @@ export class AgentBlockHandler implements BlockHandler {
workflowVariables: ctx.workflowVariables || {},
blockData,
blockNameMapping,
+ blockOutputSchemas,
isCustomTool: true,
_context: {
workflowId: ctx.workflowId,


@@ -26,7 +26,7 @@ export async function evaluateConditionExpression(
const contextSetup = `const context = ${JSON.stringify(evalContext)};`
const code = `${contextSetup}\nreturn Boolean(${conditionExpression})`
- const { blockData, blockNameMapping } = collectBlockData(ctx)
+ const { blockData, blockNameMapping, blockOutputSchemas } = collectBlockData(ctx)
const result = await executeTool(
'function_execute',
@@ -37,6 +37,7 @@ export async function evaluateConditionExpression(
workflowVariables: ctx.workflowVariables || {},
blockData,
blockNameMapping,
+ blockOutputSchemas,
_context: {
workflowId: ctx.workflowId,
workspaceId: ctx.workspaceId,


@@ -75,7 +75,12 @@ describe('FunctionBlockHandler', () => {
workflowVariables: {},
blockData: {},
blockNameMapping: {},
- _context: { workflowId: mockContext.workflowId, workspaceId: mockContext.workspaceId },
+ blockOutputSchemas: {},
+ _context: {
+ workflowId: mockContext.workflowId,
+ workspaceId: mockContext.workspaceId,
+ isDeployedContext: mockContext.isDeployedContext,
+ },
}
const expectedOutput: any = { result: 'Success' }
@@ -84,8 +89,8 @@ describe('FunctionBlockHandler', () => {
expect(mockExecuteTool).toHaveBeenCalledWith(
'function_execute',
expectedToolParams,
- false, // skipPostProcess
- mockContext // execution context
+ false,
+ mockContext
)
expect(result).toEqual(expectedOutput)
})
@@ -107,7 +112,12 @@ describe('FunctionBlockHandler', () => {
workflowVariables: {},
blockData: {},
blockNameMapping: {},
- _context: { workflowId: mockContext.workflowId, workspaceId: mockContext.workspaceId },
+ blockOutputSchemas: {},
+ _context: {
+ workflowId: mockContext.workflowId,
+ workspaceId: mockContext.workspaceId,
+ isDeployedContext: mockContext.isDeployedContext,
+ },
}
const expectedOutput: any = { result: 'Success' }
@@ -116,8 +126,8 @@ describe('FunctionBlockHandler', () => {
expect(mockExecuteTool).toHaveBeenCalledWith(
'function_execute',
expectedToolParams,
- false, // skipPostProcess
- mockContext // execution context
+ false,
+ mockContext
)
expect(result).toEqual(expectedOutput)
})
@@ -132,7 +142,12 @@ describe('FunctionBlockHandler', () => {
workflowVariables: {},
blockData: {},
blockNameMapping: {},
- _context: { workflowId: mockContext.workflowId, workspaceId: mockContext.workspaceId },
+ blockOutputSchemas: {},
+ _context: {
+ workflowId: mockContext.workflowId,
+ workspaceId: mockContext.workspaceId,
+ isDeployedContext: mockContext.isDeployedContext,
+ },
}
await handler.execute(mockContext, mockBlock, inputs)


@@ -23,7 +23,7 @@ export class FunctionBlockHandler implements BlockHandler {
? inputs.code.map((c: { content: string }) => c.content).join('\n')
: inputs.code
- const { blockData, blockNameMapping } = collectBlockData(ctx)
+ const { blockData, blockNameMapping, blockOutputSchemas } = collectBlockData(ctx)
const result = await executeTool(
'function_execute',
@@ -35,6 +35,7 @@ export class FunctionBlockHandler implements BlockHandler {
workflowVariables: ctx.workflowVariables || {},
blockData,
blockNameMapping,
+ blockOutputSchemas,
_context: {
workflowId: ctx.workflowId,
workspaceId: ctx.workspaceId,


@@ -1,7 +1,7 @@
import '@sim/testing/mocks/executor' import '@sim/testing/mocks/executor'
import { beforeEach, describe, expect, it, type Mock, vi } from 'vitest'
- import { generateRouterPrompt } from '@/blocks/blocks/router'
+ import { generateRouterPrompt, generateRouterV2Prompt } from '@/blocks/blocks/router'
import { BlockType } from '@/executor/constants'
import { RouterBlockHandler } from '@/executor/handlers/router/router-handler'
import type { ExecutionContext } from '@/executor/types'
@@ -9,6 +9,7 @@ import { getProviderFromModel } from '@/providers/utils'
import type { SerializedBlock, SerializedWorkflow } from '@/serializer/types'
const mockGenerateRouterPrompt = generateRouterPrompt as Mock
+ const mockGenerateRouterV2Prompt = generateRouterV2Prompt as Mock
const mockGetProviderFromModel = getProviderFromModel as Mock
const mockFetch = global.fetch as unknown as Mock
@@ -44,7 +45,7 @@ describe('RouterBlockHandler', () => {
metadata: { id: BlockType.ROUTER, name: 'Test Router' },
position: { x: 50, y: 50 },
config: { tool: BlockType.ROUTER, params: {} },
- inputs: { prompt: 'string', model: 'string' }, // Using ParamType strings
+ inputs: { prompt: 'string', model: 'string' },
outputs: {},
enabled: true,
}
@@ -72,14 +73,11 @@ describe('RouterBlockHandler', () => {
workflow: mockWorkflow as SerializedWorkflow,
}
- // Reset mocks using vi
vi.clearAllMocks()
- // Default mock implementations
mockGetProviderFromModel.mockReturnValue('openai')
mockGenerateRouterPrompt.mockReturnValue('Generated System Prompt')
- // Set up fetch mock to return a successful response
mockFetch.mockImplementation(() => {
return Promise.resolve({
ok: true,
@@ -147,7 +145,6 @@ describe('RouterBlockHandler', () => {
})
)
- // Verify the request body contains the expected data
const fetchCallArgs = mockFetch.mock.calls[0]
const requestBody = JSON.parse(fetchCallArgs[1].body)
expect(requestBody).toMatchObject({
@@ -180,7 +177,6 @@ describe('RouterBlockHandler', () => {
const inputs = { prompt: 'Test' }
mockContext.workflow!.blocks = [mockBlock, mockTargetBlock2]
- // Expect execute to throw because getTargetBlocks (called internally) will throw
await expect(handler.execute(mockContext, mockBlock, inputs)).rejects.toThrow(
'Target block target-block-1 not found'
)
@@ -190,7 +186,6 @@ describe('RouterBlockHandler', () => {
it('should throw error if LLM response is not a valid target block ID', async () => {
const inputs = { prompt: 'Test', apiKey: 'test-api-key' }
- // Override fetch mock to return an invalid block ID
mockFetch.mockImplementationOnce(() => {
return Promise.resolve({
ok: true,
@@ -228,7 +223,6 @@ describe('RouterBlockHandler', () => {
it('should handle server error responses', async () => {
const inputs = { prompt: 'Test error handling.', apiKey: 'test-api-key' }
- // Override fetch mock to return an error
mockFetch.mockImplementationOnce(() => {
return Promise.resolve({
ok: false,
@@ -276,13 +270,12 @@ describe('RouterBlockHandler', () => {
mockGetProviderFromModel.mockReturnValue('vertex')
- // Mock the database query for Vertex credential
const mockDb = await import('@sim/db')
const mockAccount = {
id: 'test-vertex-credential-id',
accessToken: 'mock-access-token',
refreshToken: 'mock-refresh-token',
- expiresAt: new Date(Date.now() + 3600000), // 1 hour from now
+ expiresAt: new Date(Date.now() + 3600000),
}
vi.spyOn(mockDb.db.query.account, 'findFirst').mockResolvedValue(mockAccount as any)
@@ -300,3 +293,287 @@ describe('RouterBlockHandler', () => {
expect(requestBody.apiKey).toBe('mock-access-token')
})
})
describe('RouterBlockHandler V2', () => {
let handler: RouterBlockHandler
let mockRouterV2Block: SerializedBlock
let mockContext: ExecutionContext
let mockWorkflow: Partial<SerializedWorkflow>
let mockTargetBlock1: SerializedBlock
let mockTargetBlock2: SerializedBlock
beforeEach(() => {
mockTargetBlock1 = {
id: 'target-block-1',
metadata: { id: 'agent', name: 'Support Agent' },
position: { x: 100, y: 100 },
config: { tool: 'agent', params: {} },
inputs: {},
outputs: {},
enabled: true,
}
mockTargetBlock2 = {
id: 'target-block-2',
metadata: { id: 'agent', name: 'Sales Agent' },
position: { x: 100, y: 150 },
config: { tool: 'agent', params: {} },
inputs: {},
outputs: {},
enabled: true,
}
mockRouterV2Block = {
id: 'router-v2-block-1',
metadata: { id: BlockType.ROUTER_V2, name: 'Test Router V2' },
position: { x: 50, y: 50 },
config: { tool: BlockType.ROUTER_V2, params: {} },
inputs: {},
outputs: {},
enabled: true,
}
mockWorkflow = {
blocks: [mockRouterV2Block, mockTargetBlock1, mockTargetBlock2],
connections: [
{
source: mockRouterV2Block.id,
target: mockTargetBlock1.id,
sourceHandle: 'router-route-support',
},
{
source: mockRouterV2Block.id,
target: mockTargetBlock2.id,
sourceHandle: 'router-route-sales',
},
],
}
handler = new RouterBlockHandler({})
mockContext = {
workflowId: 'test-workflow-id',
blockStates: new Map(),
blockLogs: [],
metadata: { duration: 0 },
environmentVariables: {},
decisions: { router: new Map(), condition: new Map() },
loopExecutions: new Map(),
completedLoops: new Set(),
executedBlocks: new Set(),
activeExecutionPath: new Set(),
workflow: mockWorkflow as SerializedWorkflow,
}
vi.clearAllMocks()
mockGetProviderFromModel.mockReturnValue('openai')
mockGenerateRouterV2Prompt.mockReturnValue('Generated V2 System Prompt')
})
it('should handle router_v2 blocks', () => {
expect(handler.canHandle(mockRouterV2Block)).toBe(true)
})
it('should execute router V2 and return reasoning', async () => {
const inputs = {
context: 'I need help with a billing issue',
model: 'gpt-4o',
apiKey: 'test-api-key',
routes: JSON.stringify([
{ id: 'route-support', title: 'Support', value: 'Customer support inquiries' },
{ id: 'route-sales', title: 'Sales', value: 'Sales and pricing questions' },
]),
}
mockFetch.mockImplementationOnce(() => {
return Promise.resolve({
ok: true,
json: () =>
Promise.resolve({
content: JSON.stringify({
route: 'route-support',
reasoning: 'The user mentioned a billing issue which is a customer support matter.',
}),
model: 'gpt-4o',
tokens: { input: 150, output: 25, total: 175 },
}),
})
})
const result = await handler.execute(mockContext, mockRouterV2Block, inputs)
expect(result).toMatchObject({
context: 'I need help with a billing issue',
model: 'gpt-4o',
selectedRoute: 'route-support',
reasoning: 'The user mentioned a billing issue which is a customer support matter.',
selectedPath: {
blockId: 'target-block-1',
blockType: 'agent',
blockTitle: 'Support Agent',
},
})
})
it('should include responseFormat in provider request', async () => {
const inputs = {
context: 'Test context',
model: 'gpt-4o',
apiKey: 'test-api-key',
routes: JSON.stringify([{ id: 'route-1', title: 'Route 1', value: 'Description 1' }]),
}
mockFetch.mockImplementationOnce(() => {
return Promise.resolve({
ok: true,
json: () =>
Promise.resolve({
content: JSON.stringify({ route: 'route-1', reasoning: 'Test reasoning' }),
model: 'gpt-4o',
tokens: { input: 100, output: 20, total: 120 },
}),
})
})
await handler.execute(mockContext, mockRouterV2Block, inputs)
const fetchCallArgs = mockFetch.mock.calls[0]
const requestBody = JSON.parse(fetchCallArgs[1].body)
expect(requestBody.responseFormat).toEqual({
name: 'router_response',
schema: {
type: 'object',
properties: {
route: {
type: 'string',
description: 'The selected route ID or NO_MATCH',
},
reasoning: {
type: 'string',
description: 'Brief explanation of why this route was chosen',
},
},
required: ['route', 'reasoning'],
additionalProperties: false,
},
strict: true,
})
})
it('should handle NO_MATCH response with reasoning', async () => {
const inputs = {
context: 'Random unrelated query',
model: 'gpt-4o',
apiKey: 'test-api-key',
routes: JSON.stringify([{ id: 'route-1', title: 'Route 1', value: 'Specific topic' }]),
}
mockFetch.mockImplementationOnce(() => {
return Promise.resolve({
ok: true,
json: () =>
Promise.resolve({
content: JSON.stringify({
route: 'NO_MATCH',
reasoning: 'The query does not relate to any available route.',
}),
model: 'gpt-4o',
tokens: { input: 100, output: 20, total: 120 },
}),
})
})
await expect(handler.execute(mockContext, mockRouterV2Block, inputs)).rejects.toThrow(
'Router could not determine a matching route: The query does not relate to any available route.'
)
})
it('should throw error for invalid route ID in response', async () => {
const inputs = {
context: 'Test context',
model: 'gpt-4o',
apiKey: 'test-api-key',
routes: JSON.stringify([{ id: 'route-1', title: 'Route 1', value: 'Description' }]),
}
mockFetch.mockImplementationOnce(() => {
return Promise.resolve({
ok: true,
json: () =>
Promise.resolve({
content: JSON.stringify({ route: 'invalid-route', reasoning: 'Some reasoning' }),
model: 'gpt-4o',
tokens: { input: 100, output: 20, total: 120 },
}),
})
})
await expect(handler.execute(mockContext, mockRouterV2Block, inputs)).rejects.toThrow(
/Router could not determine a valid route/
)
})
it('should handle routes passed as array instead of JSON string', async () => {
const inputs = {
context: 'Test context',
model: 'gpt-4o',
apiKey: 'test-api-key',
routes: [{ id: 'route-1', title: 'Route 1', value: 'Description' }],
}
mockFetch.mockImplementationOnce(() => {
return Promise.resolve({
ok: true,
json: () =>
Promise.resolve({
content: JSON.stringify({ route: 'route-1', reasoning: 'Matched route 1' }),
model: 'gpt-4o',
tokens: { input: 100, output: 20, total: 120 },
}),
})
})
const result = await handler.execute(mockContext, mockRouterV2Block, inputs)
expect(result.selectedRoute).toBe('route-1')
expect(result.reasoning).toBe('Matched route 1')
})
it('should throw error when no routes are defined', async () => {
const inputs = {
context: 'Test context',
model: 'gpt-4o',
apiKey: 'test-api-key',
routes: '[]',
}
await expect(handler.execute(mockContext, mockRouterV2Block, inputs)).rejects.toThrow(
'No routes defined for router'
)
})
it('should handle fallback when JSON parsing fails', async () => {
const inputs = {
context: 'Test context',
model: 'gpt-4o',
apiKey: 'test-api-key',
routes: JSON.stringify([{ id: 'route-1', title: 'Route 1', value: 'Description' }]),
}
mockFetch.mockImplementationOnce(() => {
return Promise.resolve({
ok: true,
json: () =>
Promise.resolve({
content: 'route-1',
model: 'gpt-4o',
tokens: { input: 100, output: 5, total: 105 },
}),
})
})
const result = await handler.execute(mockContext, mockRouterV2Block, inputs)
expect(result.selectedRoute).toBe('route-1')
expect(result.reasoning).toBe('')
})
})
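The V2 tests above exercise both accepted shapes for the routes input: a JSON string and a plain array. For illustration (values invented):

const routesAsArray = [
  { id: 'route-support', title: 'Support', value: 'Customer support inquiries' },
  { id: 'route-sales', title: 'Sales', value: 'Sales and pricing questions' },
]
const routesAsJsonString = JSON.stringify(routesAsArray)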


@@ -238,6 +238,25 @@ export class RouterBlockHandler implements BlockHandler {
apiKey: finalApiKey,
workflowId: ctx.workflowId,
workspaceId: ctx.workspaceId,
responseFormat: {
name: 'router_response',
schema: {
type: 'object',
properties: {
route: {
type: 'string',
description: 'The selected route ID or NO_MATCH',
},
reasoning: {
type: 'string',
description: 'Brief explanation of why this route was chosen',
},
},
required: ['route', 'reasoning'],
additionalProperties: false,
},
strict: true,
},
}
if (providerId === 'vertex') {
@@ -277,16 +296,31 @@ export class RouterBlockHandler implements BlockHandler {
const result = await response.json()
- const chosenRouteId = result.content.trim()
+ let chosenRouteId: string
let reasoning = ''
try {
const parsedResponse = JSON.parse(result.content)
chosenRouteId = parsedResponse.route?.trim() || ''
reasoning = parsedResponse.reasoning || ''
} catch (_parseError) {
logger.error('Router response was not valid JSON despite responseFormat', {
content: result.content,
})
chosenRouteId = result.content.trim()
}
if (chosenRouteId === 'NO_MATCH' || chosenRouteId.toUpperCase() === 'NO_MATCH') {
logger.info('Router determined no route matches the context, routing to error path')
- throw new Error('Router could not determine a matching route for the given context')
+ throw new Error(
+ reasoning
+ ? `Router could not determine a matching route: ${reasoning}`
+ : 'Router could not determine a matching route for the given context'
+ )
}
const chosenRoute = routes.find((r) => r.id === chosenRouteId)
- // Throw error if LLM returns invalid route ID - this routes through error path
if (!chosenRoute) {
const availableRoutes = routes.map((r) => ({ id: r.id, title: r.title }))
logger.error(
@@ -298,7 +332,6 @@ export class RouterBlockHandler implements BlockHandler {
)
}
- // Find the target block connected to this route's handle
const connection = ctx.workflow?.connections.find(
(conn) => conn.source === block.id && conn.sourceHandle === `router-${chosenRoute.id}`
)
@@ -334,6 +367,7 @@ export class RouterBlockHandler implements BlockHandler {
total: cost.total,
},
selectedRoute: chosenRoute.id,
+ reasoning,
selectedPath: targetBlock
? {
blockId: targetBlock.id,
@@ -353,7 +387,7 @@ export class RouterBlockHandler implements BlockHandler {
}
/**
- * Parse routes from input (can be JSON string or array).
+ * Parse routes from input (can be JSON string or array)
*/
private parseRoutes(input: any): RouteDefinition[] {
try {
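With responseFormat enforced, the handler now expects a JSON object carrying route and reasoning, and falls back to treating the raw content as the route ID when parsing fails. A standalone sketch of that parsing (names are illustrative; the real handler does this inline):

interface ParsedRouterChoice {
  routeId: string
  reasoning: string
}

function parseRouterContent(content: string): ParsedRouterChoice {
  try {
    const parsed = JSON.parse(content)
    if (parsed && typeof parsed === 'object') {
      return {
        routeId: String(parsed.route ?? '').trim(),
        reasoning: String(parsed.reasoning ?? ''),
      }
    }
  } catch {
    // Not JSON: fall through to the plain-text fallback below.
  }
  // Fallback: treat the raw content as the route ID (no reasoning available).
  return { routeId: content.trim(), reasoning: '' }
}

// parseRouterContent('{"route":"route-1","reasoning":"matched"}') -> { routeId: 'route-1', reasoning: 'matched' }
// parseRouterContent('route-1') -> { routeId: 'route-1', reasoning: '' }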


@@ -204,26 +204,21 @@ describe('WorkflowBlockHandler', () => {
})
})
- it('should map failed child output correctly', () => {
+ it('should throw error for failed child output so BlockExecutor can check error port', () => {
const childResult = {
success: false,
error: 'Child workflow failed',
}
- const result = (handler as any).mapChildOutputToParent(
- childResult,
- 'child-id',
- 'Child Workflow',
- 100
- )
- expect(result).toEqual({
- success: false,
- childWorkflowName: 'Child Workflow',
- result: {},
- error: 'Child workflow failed',
- childTraceSpans: [],
- })
+ expect(() =>
+ (handler as any).mapChildOutputToParent(childResult, 'child-id', 'Child Workflow', 100)
+ ).toThrow('Error in child workflow "Child Workflow": Child workflow failed')
+ try {
+ ;(handler as any).mapChildOutputToParent(childResult, 'child-id', 'Child Workflow', 100)
+ } catch (error: any) {
+ expect(error.childTraceSpans).toEqual([])
+ }
})
it('should handle nested response structures', () => {


@@ -144,6 +144,11 @@ export class WorkflowBlockHandler implements BlockHandler {
const workflowMetadata = workflows[workflowId]
const childWorkflowName = workflowMetadata?.name || workflowId
const originalError = error.message || 'Unknown error'
const wrappedError = new Error(
`Error in child workflow "${childWorkflowName}": ${originalError}`
)
if (error.executionResult?.logs) {
const executionResult = error.executionResult as ExecutionResult
@@ -159,28 +164,12 @@ export class WorkflowBlockHandler implements BlockHandler {
)
logger.info(`Captured ${childTraceSpans.length} child trace spans from failed execution`)
- return {
- success: false,
- childWorkflowName,
- result: {},
- error: error.message || 'Child workflow execution failed',
- childTraceSpans: childTraceSpans,
- } as Record<string, any>
+ ;(wrappedError as any).childTraceSpans = childTraceSpans
+ } else if (error.childTraceSpans && Array.isArray(error.childTraceSpans)) {
+ ;(wrappedError as any).childTraceSpans = error.childTraceSpans
}
- if (error.childTraceSpans && Array.isArray(error.childTraceSpans)) {
- return {
- success: false,
- childWorkflowName,
- result: {},
- error: error.message || 'Child workflow execution failed',
- childTraceSpans: error.childTraceSpans,
- } as Record<string, any>
- }
- const originalError = error.message || 'Unknown error'
- throw new Error(`Error in child workflow "${childWorkflowName}": ${originalError}`)
+ throw wrappedError
}
}
@@ -452,17 +441,13 @@ export class WorkflowBlockHandler implements BlockHandler {
if (!success) {
logger.warn(`Child workflow ${childWorkflowName} failed`)
- // Return failure with child trace spans so they can be displayed
- return {
- success: false,
- childWorkflowName,
- result,
- error: childResult.error || 'Child workflow execution failed',
- childTraceSpans: childTraceSpans || [],
- } as Record<string, any>
+ const error = new Error(
+ `Error in child workflow "${childWorkflowName}": ${childResult.error || 'Child workflow execution failed'}`
+ )
+ ;(error as any).childTraceSpans = childTraceSpans || []
+ throw error
}
- // Success case
return {
success: true,
childWorkflowName,


@@ -1,24 +1,43 @@
import { getBlockOutputs } from '@/lib/workflows/blocks/block-outputs'
import { normalizeName } from '@/executor/constants'
import type { ExecutionContext } from '@/executor/types'
import type { OutputSchema } from '@/executor/utils/block-reference'
export interface BlockDataCollection {
- blockData: Record<string, any>
+ blockData: Record<string, unknown>
blockNameMapping: Record<string, string>
blockOutputSchemas: Record<string, OutputSchema>
}
export function collectBlockData(ctx: ExecutionContext): BlockDataCollection {
- const blockData: Record<string, any> = {}
+ const blockData: Record<string, unknown> = {}
const blockNameMapping: Record<string, string> = {}
const blockOutputSchemas: Record<string, OutputSchema> = {}
for (const [id, state] of ctx.blockStates.entries()) {
if (state.output !== undefined) {
blockData[id] = state.output
- const workflowBlock = ctx.workflow?.blocks?.find((b) => b.id === id)
- if (workflowBlock?.metadata?.name) {
- blockNameMapping[normalizeName(workflowBlock.metadata.name)] = id
+ }
+ const workflowBlock = ctx.workflow?.blocks?.find((b) => b.id === id)
if (!workflowBlock) continue
if (workflowBlock.metadata?.name) {
blockNameMapping[normalizeName(workflowBlock.metadata.name)] = id
}
const blockType = workflowBlock.metadata?.id
if (blockType) {
const params = workflowBlock.config?.params as Record<string, unknown> | undefined
const subBlocks = params
? Object.fromEntries(Object.entries(params).map(([k, v]) => [k, { value: v }]))
: undefined
const schema = getBlockOutputs(blockType, subBlocks)
if (schema && Object.keys(schema).length > 0) {
blockOutputSchemas[id] = schema
}
}
}
- return { blockData, blockNameMapping }
+ return { blockData, blockNameMapping, blockOutputSchemas }
}
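For illustration only (invented values), the collector's new return shape for a workflow with one completed block might look like:

const exampleCollection = {
  blockData: {
    'block-1': { input: 'hello' },
  },
  blockNameMapping: {
    // normalized block name -> block id
    start: 'block-1',
  },
  blockOutputSchemas: {
    // derived from the block type and its subblock params; consumed by the
    // reference resolution exercised in the tests below
    'block-1': { input: { type: 'string' }, conversationId: { type: 'string' } },
  },
}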


@@ -0,0 +1,255 @@
/**
* @vitest-environment node
*/
import { describe, expect, it } from 'vitest'
import {
type BlockReferenceContext,
InvalidFieldError,
resolveBlockReference,
} from './block-reference'
describe('resolveBlockReference', () => {
const createContext = (
overrides: Partial<BlockReferenceContext> = {}
): BlockReferenceContext => ({
blockNameMapping: { start: 'block-1', agent: 'block-2' },
blockData: {},
blockOutputSchemas: {},
...overrides,
})
describe('block name resolution', () => {
it('should return undefined when block name does not exist', () => {
const ctx = createContext()
const result = resolveBlockReference('unknown', ['field'], ctx)
expect(result).toBeUndefined()
})
it('should normalize block name before lookup', () => {
const ctx = createContext({
blockNameMapping: { myblock: 'block-1' },
blockData: { 'block-1': { value: 'test' } },
})
const result = resolveBlockReference('MyBlock', ['value'], ctx)
expect(result).toEqual({ value: 'test', blockId: 'block-1' })
})
it('should handle block names with spaces', () => {
const ctx = createContext({
blockNameMapping: { myblock: 'block-1' },
blockData: { 'block-1': { value: 'test' } },
})
const result = resolveBlockReference('My Block', ['value'], ctx)
expect(result).toEqual({ value: 'test', blockId: 'block-1' })
})
})
describe('field resolution', () => {
it('should return entire block output when no path specified', () => {
const ctx = createContext({
blockData: { 'block-1': { input: 'hello', other: 'data' } },
})
const result = resolveBlockReference('start', [], ctx)
expect(result).toEqual({
value: { input: 'hello', other: 'data' },
blockId: 'block-1',
})
})
it('should resolve simple field path', () => {
const ctx = createContext({
blockData: { 'block-1': { input: 'hello' } },
})
const result = resolveBlockReference('start', ['input'], ctx)
expect(result).toEqual({ value: 'hello', blockId: 'block-1' })
})
it('should resolve nested field path', () => {
const ctx = createContext({
blockData: { 'block-1': { response: { data: { name: 'test' } } } },
})
const result = resolveBlockReference('start', ['response', 'data', 'name'], ctx)
expect(result).toEqual({ value: 'test', blockId: 'block-1' })
})
it('should resolve array index path', () => {
const ctx = createContext({
blockData: { 'block-1': { items: ['a', 'b', 'c'] } },
})
const result = resolveBlockReference('start', ['items', '1'], ctx)
expect(result).toEqual({ value: 'b', blockId: 'block-1' })
})
it('should return undefined value when field exists but has no value', () => {
const ctx = createContext({
blockData: { 'block-1': { input: undefined } },
blockOutputSchemas: {
'block-1': { input: { type: 'string' } },
},
})
const result = resolveBlockReference('start', ['input'], ctx)
expect(result).toEqual({ value: undefined, blockId: 'block-1' })
})
it('should return null value when field has null', () => {
const ctx = createContext({
blockData: { 'block-1': { input: null } },
})
const result = resolveBlockReference('start', ['input'], ctx)
expect(result).toEqual({ value: null, blockId: 'block-1' })
})
})
describe('schema validation', () => {
it('should throw InvalidFieldError when field not in schema', () => {
const ctx = createContext({
blockData: { 'block-1': { existing: 'value' } },
blockOutputSchemas: {
'block-1': {
input: { type: 'string' },
conversationId: { type: 'string' },
},
},
})
expect(() => resolveBlockReference('start', ['invalid'], ctx)).toThrow(InvalidFieldError)
expect(() => resolveBlockReference('start', ['invalid'], ctx)).toThrow(
/"invalid" doesn't exist on block "start"/
)
})
it('should include available fields in error message', () => {
const ctx = createContext({
blockData: { 'block-1': {} },
blockOutputSchemas: {
'block-1': {
input: { type: 'string' },
conversationId: { type: 'string' },
files: { type: 'files' },
},
},
})
try {
resolveBlockReference('start', ['typo'], ctx)
expect.fail('Should have thrown')
} catch (error) {
expect(error).toBeInstanceOf(InvalidFieldError)
const fieldError = error as InvalidFieldError
expect(fieldError.availableFields).toContain('input')
expect(fieldError.availableFields).toContain('conversationId')
expect(fieldError.availableFields).toContain('files')
}
})
it('should allow valid field even when value is undefined', () => {
const ctx = createContext({
blockData: { 'block-1': {} },
blockOutputSchemas: {
'block-1': { input: { type: 'string' } },
},
})
const result = resolveBlockReference('start', ['input'], ctx)
expect(result).toEqual({ value: undefined, blockId: 'block-1' })
})
it('should validate path when block has no output yet', () => {
const ctx = createContext({
blockData: {},
blockOutputSchemas: {
'block-1': { input: { type: 'string' } },
},
})
expect(() => resolveBlockReference('start', ['invalid'], ctx)).toThrow(InvalidFieldError)
})
it('should return undefined for valid field when block has no output', () => {
const ctx = createContext({
blockData: {},
blockOutputSchemas: {
'block-1': { input: { type: 'string' } },
},
})
const result = resolveBlockReference('start', ['input'], ctx)
expect(result).toEqual({ value: undefined, blockId: 'block-1' })
})
})
describe('without schema (pass-through mode)', () => {
it('should return undefined value without throwing when no schema', () => {
const ctx = createContext({
blockData: { 'block-1': { existing: 'value' } },
})
const result = resolveBlockReference('start', ['missing'], ctx)
expect(result).toEqual({ value: undefined, blockId: 'block-1' })
})
})
describe('file type handling', () => {
it('should allow file property access', () => {
const ctx = createContext({
blockData: {
'block-1': {
files: [{ name: 'test.txt', url: 'http://example.com/file' }],
},
},
blockOutputSchemas: {
'block-1': { files: { type: 'files' } },
},
})
const result = resolveBlockReference('start', ['files', '0', 'name'], ctx)
expect(result).toEqual({ value: 'test.txt', blockId: 'block-1' })
})
it('should validate file property names', () => {
const ctx = createContext({
blockData: { 'block-1': { files: [] } },
blockOutputSchemas: {
'block-1': { files: { type: 'files' } },
},
})
expect(() => resolveBlockReference('start', ['files', '0', 'invalid'], ctx)).toThrow(
InvalidFieldError
)
})
})
})
describe('InvalidFieldError', () => {
it('should have correct properties', () => {
const error = new InvalidFieldError('myBlock', 'invalid.path', ['field1', 'field2'])
expect(error.blockName).toBe('myBlock')
expect(error.fieldPath).toBe('invalid.path')
expect(error.availableFields).toEqual(['field1', 'field2'])
expect(error.name).toBe('InvalidFieldError')
})
it('should format message correctly', () => {
const error = new InvalidFieldError('start', 'typo', ['input', 'files'])
expect(error.message).toBe(
'"typo" doesn\'t exist on block "start". Available fields: input, files'
)
})
it('should handle empty available fields', () => {
const error = new InvalidFieldError('start', 'field', [])
expect(error.message).toBe('"field" doesn\'t exist on block "start". Available fields: none')
})
})

View File

@@ -0,0 +1,210 @@
import { USER_FILE_ACCESSIBLE_PROPERTIES } from '@/lib/workflows/types'
import { normalizeName } from '@/executor/constants'
import { navigatePath } from '@/executor/variables/resolvers/reference'
export type OutputSchema = Record<string, { type?: string; description?: string } | unknown>
export interface BlockReferenceContext {
blockNameMapping: Record<string, string>
blockData: Record<string, unknown>
blockOutputSchemas?: Record<string, OutputSchema>
}
export interface BlockReferenceResult {
value: unknown
blockId: string
}
export class InvalidFieldError extends Error {
constructor(
public readonly blockName: string,
public readonly fieldPath: string,
public readonly availableFields: string[]
) {
super(
`"${fieldPath}" doesn't exist on block "${blockName}". ` +
`Available fields: ${availableFields.length > 0 ? availableFields.join(', ') : 'none'}`
)
this.name = 'InvalidFieldError'
}
}
function isFileType(value: unknown): boolean {
if (typeof value !== 'object' || value === null) return false
const typed = value as { type?: string }
return typed.type === 'file[]' || typed.type === 'files'
}
function isArrayType(value: unknown): value is { type: 'array'; items?: unknown } {
if (typeof value !== 'object' || value === null) return false
return (value as { type?: string }).type === 'array'
}
function getArrayItems(schema: unknown): unknown {
if (typeof schema !== 'object' || schema === null) return undefined
return (schema as { items?: unknown }).items
}
function getProperties(schema: unknown): Record<string, unknown> | undefined {
if (typeof schema !== 'object' || schema === null) return undefined
const props = (schema as { properties?: unknown }).properties
return typeof props === 'object' && props !== null
? (props as Record<string, unknown>)
: undefined
}
function lookupField(schema: unknown, fieldName: string): unknown | undefined {
if (typeof schema !== 'object' || schema === null) return undefined
const typed = schema as Record<string, unknown>
if (fieldName in typed) {
return typed[fieldName]
}
const props = getProperties(schema)
if (props && fieldName in props) {
return props[fieldName]
}
return undefined
}
function isPathInSchema(schema: OutputSchema | undefined, pathParts: string[]): boolean {
if (!schema || pathParts.length === 0) {
return true
}
let current: unknown = schema
for (let i = 0; i < pathParts.length; i++) {
const part = pathParts[i]
if (current === null || current === undefined) {
return false
}
if (/^\d+$/.test(part)) {
if (isFileType(current)) {
const nextPart = pathParts[i + 1]
return (
!nextPart ||
USER_FILE_ACCESSIBLE_PROPERTIES.includes(
nextPart as (typeof USER_FILE_ACCESSIBLE_PROPERTIES)[number]
)
)
}
if (isArrayType(current)) {
current = getArrayItems(current)
}
continue
}
const arrayMatch = part.match(/^([^[]+)\[(\d+)\]$/)
if (arrayMatch) {
const [, prop] = arrayMatch
const fieldDef = lookupField(current, prop)
if (!fieldDef) return false
if (isFileType(fieldDef)) {
const nextPart = pathParts[i + 1]
return (
!nextPart ||
USER_FILE_ACCESSIBLE_PROPERTIES.includes(
nextPart as (typeof USER_FILE_ACCESSIBLE_PROPERTIES)[number]
)
)
}
current = isArrayType(fieldDef) ? getArrayItems(fieldDef) : fieldDef
continue
}
if (
isFileType(current) &&
USER_FILE_ACCESSIBLE_PROPERTIES.includes(
part as (typeof USER_FILE_ACCESSIBLE_PROPERTIES)[number]
)
) {
return true
}
const fieldDef = lookupField(current, part)
if (fieldDef !== undefined) {
if (isFileType(fieldDef)) {
const nextPart = pathParts[i + 1]
if (!nextPart) return true
if (/^\d+$/.test(nextPart)) {
const afterIndex = pathParts[i + 2]
return (
!afterIndex ||
USER_FILE_ACCESSIBLE_PROPERTIES.includes(
afterIndex as (typeof USER_FILE_ACCESSIBLE_PROPERTIES)[number]
)
)
}
return USER_FILE_ACCESSIBLE_PROPERTIES.includes(
nextPart as (typeof USER_FILE_ACCESSIBLE_PROPERTIES)[number]
)
}
current = fieldDef
continue
}
if (isArrayType(current)) {
const items = getArrayItems(current)
const itemField = lookupField(items, part)
if (itemField !== undefined) {
current = itemField
continue
}
}
return false
}
return true
}
function getSchemaFieldNames(schema: OutputSchema | undefined): string[] {
if (!schema) return []
return Object.keys(schema)
}
export function resolveBlockReference(
blockName: string,
pathParts: string[],
context: BlockReferenceContext
): BlockReferenceResult | undefined {
const normalizedName = normalizeName(blockName)
const blockId = context.blockNameMapping[normalizedName]
if (!blockId) {
return undefined
}
const blockOutput = context.blockData[blockId]
const schema = context.blockOutputSchemas?.[blockId]
if (blockOutput === undefined) {
if (schema && pathParts.length > 0) {
if (!isPathInSchema(schema, pathParts)) {
throw new InvalidFieldError(blockName, pathParts.join('.'), getSchemaFieldNames(schema))
}
}
return { value: undefined, blockId }
}
if (pathParts.length === 0) {
return { value: blockOutput, blockId }
}
const value = navigatePath(blockOutput, pathParts)
if (value === undefined && schema) {
if (!isPathInSchema(schema, pathParts)) {
throw new InvalidFieldError(blockName, pathParts.join('.'), getSchemaFieldNames(schema))
}
}
return { value, blockId }
}
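A minimal usage sketch of the new module, assuming the caller has already split a reference such as <agent.response.data.name> into a block name and path parts (as the resolver does via parseReferencePath); the context values are illustrative:

import { InvalidFieldError, resolveBlockReference } from '@/executor/utils/block-reference'

const context = {
  blockNameMapping: { agent: 'block-2' },
  blockData: { 'block-2': { response: { data: { name: 'Ada' } } } },
  blockOutputSchemas: { 'block-2': { response: { type: 'object' } } },
}

try {
  const result = resolveBlockReference('agent', ['response', 'data', 'name'], context)
  console.log(result?.value) // 'Ada'
} catch (error) {
  if (error instanceof InvalidFieldError) {
    // e.g. '"typo" doesn't exist on block "agent". Available fields: response'
    console.warn(error.message, error.availableFields)
  }
}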

View File

@@ -1,11 +1,15 @@
import { getBlockOutputs } from '@/lib/workflows/blocks/block-outputs'
- import { USER_FILE_ACCESSIBLE_PROPERTIES } from '@/lib/workflows/types'
import {
isReference,
normalizeName,
parseReferencePath,
SPECIAL_REFERENCE_PREFIXES,
} from '@/executor/constants'
+ import {
+ InvalidFieldError,
+ type OutputSchema,
+ resolveBlockReference,
+ } from '@/executor/utils/block-reference'
import {
navigatePath,
type ResolutionContext,
@@ -14,123 +18,6 @@ import {
import type { SerializedBlock, SerializedWorkflow } from '@/serializer/types'
import { getTool } from '@/tools/utils'
function isPathInOutputSchema(
outputs: Record<string, any> | undefined,
pathParts: string[]
): boolean {
if (!outputs || pathParts.length === 0) {
return true
}
const isFileArrayType = (value: any): boolean =>
value?.type === 'file[]' || value?.type === 'files'
let current: any = outputs
for (let i = 0; i < pathParts.length; i++) {
const part = pathParts[i]
const arrayMatch = part.match(/^([^[]+)\[(\d+)\]$/)
if (arrayMatch) {
const [, prop] = arrayMatch
let fieldDef: any
if (prop in current) {
fieldDef = current[prop]
} else if (current.properties && prop in current.properties) {
fieldDef = current.properties[prop]
} else if (current.type === 'array' && current.items) {
if (current.items.properties && prop in current.items.properties) {
fieldDef = current.items.properties[prop]
} else if (prop in current.items) {
fieldDef = current.items[prop]
}
}
if (!fieldDef) {
return false
}
if (isFileArrayType(fieldDef)) {
if (i + 1 < pathParts.length) {
return USER_FILE_ACCESSIBLE_PROPERTIES.includes(pathParts[i + 1] as any)
}
return true
}
if (fieldDef.type === 'array' && fieldDef.items) {
current = fieldDef.items
continue
}
current = fieldDef
continue
}
if (/^\d+$/.test(part)) {
if (isFileArrayType(current)) {
if (i + 1 < pathParts.length) {
const nextPart = pathParts[i + 1]
return USER_FILE_ACCESSIBLE_PROPERTIES.includes(nextPart as any)
}
return true
}
continue
}
if (current === null || current === undefined) {
return false
}
if (part in current) {
const nextCurrent = current[part]
if (nextCurrent?.type === 'file[]' && i + 1 < pathParts.length) {
const nextPart = pathParts[i + 1]
if (/^\d+$/.test(nextPart) && i + 2 < pathParts.length) {
const propertyPart = pathParts[i + 2]
return USER_FILE_ACCESSIBLE_PROPERTIES.includes(propertyPart as any)
}
}
current = nextCurrent
continue
}
if (current.properties && part in current.properties) {
current = current.properties[part]
continue
}
if (current.type === 'array' && current.items) {
if (current.items.properties && part in current.items.properties) {
current = current.items.properties[part]
continue
}
if (part in current.items) {
current = current.items[part]
continue
}
}
if (isFileArrayType(current) && USER_FILE_ACCESSIBLE_PROPERTIES.includes(part as any)) {
return true
}
if ('type' in current && typeof current.type === 'string') {
if (!current.properties && !current.items) {
return false
}
}
return false
}
return true
}
function getSchemaFieldNames(outputs: Record<string, any> | undefined): string[] {
if (!outputs) return []
return Object.keys(outputs)
}
export class BlockResolver implements Resolver {
private nameToBlockId: Map<string, string>
private blockById: Map<string, SerializedBlock>
@@ -170,83 +57,94 @@ export class BlockResolver implements Resolver {
return undefined
}
- const block = this.blockById.get(blockId)
+ const block = this.blockById.get(blockId)!
const output = this.getBlockOutput(blockId, context)
- if (output === undefined) {
+ const blockData: Record<string, unknown> = {}
+ const blockOutputSchemas: Record<string, OutputSchema> = {}
+ if (output !== undefined) {
+ blockData[blockId] = output
+ }
+ const blockType = block.metadata?.id
+ const params = block.config?.params as Record<string, unknown> | undefined
+ const subBlocks = params
+ ? Object.fromEntries(Object.entries(params).map(([k, v]) => [k, { value: v }]))
+ : undefined
+ const toolId = block.config?.tool
+ const toolConfig = toolId ? getTool(toolId) : undefined
+ const outputSchema =
+ toolConfig?.outputs ?? (blockType ? getBlockOutputs(blockType, subBlocks) : block.outputs)
+ if (outputSchema && Object.keys(outputSchema).length > 0) {
+ blockOutputSchemas[blockId] = outputSchema
+ }
+ try {
+ const result = resolveBlockReference(blockName, pathParts, {
+ blockNameMapping: Object.fromEntries(this.nameToBlockId),
+ blockData,
+ blockOutputSchemas,
+ })!
+ if (result.value !== undefined) {
+ return result.value
+ }
+ return this.handleBackwardsCompat(block, output, pathParts)
+ } catch (error) {
+ if (error instanceof InvalidFieldError) {
+ const fallback = this.handleBackwardsCompat(block, output, pathParts)
+ if (fallback !== undefined) {
+ return fallback
+ }
+ throw new Error(error.message)
+ }
+ throw error
+ }
+ }
+ private handleBackwardsCompat(
+ block: SerializedBlock,
+ output: unknown,
+ pathParts: string[]
+ ): unknown {
+ if (output === undefined || pathParts.length === 0) {
return undefined
}
- if (pathParts.length === 0) {
- return output
- }
- // Try the original path first
- let result = navigatePath(output, pathParts)
- // If successful, return it immediately
- if (result !== undefined) {
- return result
- }
- // Response block backwards compatibility:
- // Old: <responseBlock.response.data> -> New: <responseBlock.data>
- // Only apply fallback if:
- // 1. Block type is 'response'
- // 2. Path starts with 'response.'
- // 3. Output doesn't have a 'response' key (confirming it's the new format)
if (
- block?.metadata?.id === 'response' &&
+ block.metadata?.id === 'response' &&
pathParts[0] === 'response' &&
- output?.response === undefined
+ (output as Record<string, unknown>)?.response === undefined
) {
const adjustedPathParts = pathParts.slice(1)
if (adjustedPathParts.length === 0) {
return output
}
- result = navigatePath(output, adjustedPathParts)
- if (result !== undefined) {
- return result
+ const fallbackResult = navigatePath(output, adjustedPathParts)
+ if (fallbackResult !== undefined) {
+ return fallbackResult
}
}
- // Workflow block backwards compatibility:
- // Old: <workflowBlock.result.response.data> -> New: <workflowBlock.result.data>
- // Only apply fallback if:
- // 1. Block type is 'workflow' or 'workflow_input'
- // 2. Path starts with 'result.response.'
- // 3. output.result.response doesn't exist (confirming child used new format)
const isWorkflowBlock =
- block?.metadata?.id === 'workflow' || block?.metadata?.id === 'workflow_input'
+ block.metadata?.id === 'workflow' || block.metadata?.id === 'workflow_input'
+ const outputRecord = output as Record<string, Record<string, unknown> | undefined>
if (
isWorkflowBlock &&
pathParts[0] === 'result' &&
pathParts[1] === 'response' &&
- output?.result?.response === undefined
+ outputRecord?.result?.response === undefined
) {
const adjustedPathParts = ['result', ...pathParts.slice(2)]
- result = navigatePath(output, adjustedPathParts)
- if (result !== undefined) {
- return result
+ const fallbackResult = navigatePath(output, adjustedPathParts)
+ if (fallbackResult !== undefined) {
+ return fallbackResult
}
}
- const blockType = block?.metadata?.id
- const params = block?.config?.params as Record<string, unknown> | undefined
- const subBlocks = params
- ? Object.fromEntries(Object.entries(params).map(([k, v]) => [k, { value: v }]))
- : undefined
- const toolId = block?.config?.tool
- const toolConfig = toolId ? getTool(toolId) : undefined
- const outputSchema =
- toolConfig?.outputs ?? (blockType ? getBlockOutputs(blockType, subBlocks) : block?.outputs)
- const schemaFields = getSchemaFieldNames(outputSchema)
- if (schemaFields.length > 0 && !isPathInOutputSchema(outputSchema, pathParts)) {
- throw new Error(
- `"${pathParts.join('.')}" doesn't exist on block "${blockName}". ` +
- `Available fields: ${schemaFields.join(', ')}`
- )
- }
return undefined
}

View File

@@ -27,23 +27,28 @@ export function navigatePath(obj: any, path: string[]): any {
return undefined
}
- // Handle array indexing like "items[0]" or just numeric indices
- const arrayMatch = part.match(/^([^[]+)\[(\d+)\](.*)$/)
+ const arrayMatch = part.match(/^([^[]+)(\[.+)$/)
if (arrayMatch) {
- // Handle complex array access like "items[0]"
- const [, prop, index] = arrayMatch
+ const [, prop, bracketsPart] = arrayMatch
current = current[prop]
if (current === undefined || current === null) {
return undefined
}
- const idx = Number.parseInt(index, 10)
- current = Array.isArray(current) ? current[idx] : undefined
+ const indices = bracketsPart.match(/\[(\d+)\]/g)
+ if (indices) {
+ for (const indexMatch of indices) {
+ if (current === null || current === undefined) {
+ return undefined
+ }
+ const idx = Number.parseInt(indexMatch.slice(1, -1), 10)
+ current = Array.isArray(current) ? current[idx] : undefined
+ }
+ }
} else if (/^\d+$/.test(part)) {
- // Handle plain numeric index
const index = Number.parseInt(part, 10)
current = Array.isArray(current) ? current[index] : undefined
} else {
- // Handle regular property access
current = current[part]
}
}
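A small sketch of what the widened regex buys, assuming navigatePath is imported from the resolver module above; the object and paths are illustrative:

import { navigatePath } from '@/executor/variables/resolvers/reference'

const output = { items: ['x', 'y'], matrix: [['a', 'b'], ['c', 'd']] }

navigatePath(output, ['items[1]'])     // 'y' - single index, as before
navigatePath(output, ['matrix[1][0]']) // 'c' - chained indices in one segment now resolve
navigatePath(output, ['matrix[5][0]']) // undefined - missing indices still fall through quietly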

View File

@@ -110,15 +110,16 @@ export function getCustomTools(workspaceId?: string): CustomToolDefinition[] {
}
/**
- * Get a specific custom tool from the query cache by ID (for non-React code)
+ * Get a specific custom tool from the query cache by ID or title (for non-React code)
+ * Custom tools are referenced by title in the system (custom_${title}), so title lookup is required.
* If workspaceId is not provided, extracts it from the current URL
*/
export function getCustomTool(
- toolId: string,
+ identifier: string,
workspaceId?: string
): CustomToolDefinition | undefined {
const tools = getCustomTools(workspaceId)
- return tools.find((tool) => tool.id === toolId) || tools.find((tool) => tool.title === toolId)
+ return tools.find((tool) => tool.id === identifier || tool.title === identifier)
}
/**

View File

@@ -3,7 +3,6 @@ import { createLogger } from '@sim/logger'
import { keepPreviousData, useMutation, useQuery, useQueryClient } from '@tanstack/react-query'
import { getNextWorkflowColor } from '@/lib/workflows/colors'
import { buildDefaultWorkflowArtifacts } from '@/lib/workflows/defaults'
- import { extractInputFieldsFromBlocks, type WorkflowInputField } from '@/lib/workflows/input-format'
import { deploymentKeys } from '@/hooks/queries/deployments'
import {
createOptimisticMutationHandlers,
@@ -26,34 +25,35 @@ export const workflowKeys = {
deploymentVersions: () => [...workflowKeys.all, 'deploymentVersion'] as const,
deploymentVersion: (workflowId: string | undefined, version: number | undefined) =>
[...workflowKeys.deploymentVersions(), workflowId ?? '', version ?? 0] as const,
- inputFields: (workflowId: string | undefined) =>
- [...workflowKeys.all, 'inputFields', workflowId ?? ''] as const,
+ state: (workflowId: string | undefined) =>
+ [...workflowKeys.all, 'state', workflowId ?? ''] as const,
}
/**
- * Fetches workflow input fields from the workflow state.
+ * Fetches workflow state from the API.
+ * Used as the base query for both state preview and input fields extraction.
*/
- async function fetchWorkflowInputFields(workflowId: string): Promise<WorkflowInputField[]> {
+ async function fetchWorkflowState(workflowId: string): Promise<WorkflowState | null> {
const response = await fetch(`/api/workflows/${workflowId}`)
if (!response.ok) throw new Error('Failed to fetch workflow')
const { data } = await response.json()
- return extractInputFieldsFromBlocks(data?.state?.blocks)
+ return data?.state ?? null
}
/**
- * Hook to fetch workflow input fields for configuration.
- * Uses React Query for caching and deduplication.
+ * Hook to fetch workflow state.
+ * Used by workflow blocks to show a preview of the child workflow
+ * and as a base query for input fields extraction.
*
- * @param workflowId - The workflow ID to fetch input fields for
- * @returns Query result with input fields array
+ * @param workflowId - The workflow ID to fetch state for
+ * @returns Query result with workflow state
*/
- export function useWorkflowInputFields(workflowId: string | undefined) {
+ export function useWorkflowState(workflowId: string | undefined) {
return useQuery({
- queryKey: workflowKeys.inputFields(workflowId),
- queryFn: () => fetchWorkflowInputFields(workflowId!),
+ queryKey: workflowKeys.state(workflowId),
+ queryFn: () => fetchWorkflowState(workflowId!),
enabled: Boolean(workflowId),
- staleTime: 0,
- refetchOnMount: 'always',
+ staleTime: 30 * 1000, // 30 seconds
})
}
@@ -532,13 +532,19 @@ export interface ChildDeploymentStatus {
}
/**
- * Fetches deployment status for a child workflow
+ * Fetches deployment status for a child workflow.
+ * Uses Promise.all to fetch status and deployments in parallel for better performance.
*/
async function fetchChildDeploymentStatus(workflowId: string): Promise<ChildDeploymentStatus> {
- const statusRes = await fetch(`/api/workflows/${workflowId}/status`, {
- cache: 'no-store',
+ const fetchOptions = {
+ cache: 'no-store' as const,
headers: { 'Cache-Control': 'no-cache' },
- })
+ }
+ const [statusRes, deploymentsRes] = await Promise.all([
+ fetch(`/api/workflows/${workflowId}/status`, fetchOptions),
+ fetch(`/api/workflows/${workflowId}/deployments`, fetchOptions),
+ ])
if (!statusRes.ok) {
throw new Error('Failed to fetch workflow status')
@@ -546,11 +552,6 @@ async function fetchChildDeploymentStatus(workflowId: string): Promise<ChildDepl
const statusData = await statusRes.json()
- const deploymentsRes = await fetch(`/api/workflows/${workflowId}/deployments`, {
- cache: 'no-store',
- headers: { 'Cache-Control': 'no-cache' },
- })
let activeVersion: number | null = null
if (deploymentsRes.ok) {
const deploymentsJson = await deploymentsRes.json()
@@ -580,7 +581,7 @@ export function useChildDeploymentStatus(workflowId: string | undefined) {
queryKey: workflowKeys.deploymentStatus(workflowId),
queryFn: () => fetchChildDeploymentStatus(workflowId!),
enabled: Boolean(workflowId),
- staleTime: 30 * 1000, // 30 seconds
+ staleTime: 0,
retry: false,
})
}
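For callers that previously used useWorkflowInputFields, a hedged sketch of how input fields might now be derived from the state query; the hook name and import path below are hypothetical, and it assumes extractInputFieldsFromBlocks is still exported from '@/lib/workflows/input-format' (only this file's import of it was removed):

import { extractInputFieldsFromBlocks } from '@/lib/workflows/input-format'
import { useWorkflowState } from '@/hooks/queries/workflows' // hypothetical path

function useChildWorkflowInputFields(workflowId: string | undefined) {
  const { data: state, isLoading } = useWorkflowState(workflowId)
  // Derive input fields from the shared state query instead of a dedicated fetch
  const inputFields = state ? extractInputFieldsFromBlocks(state.blocks) : []
  return { inputFields, isLoading }
}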

View File

@@ -6,6 +6,7 @@ import type {
SelectorOption,
SelectorQueryArgs,
} from '@/hooks/selectors/types'
+ import { useWorkflowRegistry } from '@/stores/workflows/registry/store'
const SELECTOR_STALE = 60 * 1000
@@ -853,6 +854,36 @@ const registry: Record<SelectorKey, SelectorDefinition> = {
}))
},
},
'sim.workflows': {
key: 'sim.workflows',
staleTime: 0, // Always fetch fresh from store
getQueryKey: ({ context }: SelectorQueryArgs) => [
'selectors',
'sim.workflows',
context.excludeWorkflowId ?? 'none',
],
enabled: () => true,
fetchList: async ({ context }: SelectorQueryArgs): Promise<SelectorOption[]> => {
const { workflows } = useWorkflowRegistry.getState()
return Object.entries(workflows)
.filter(([id]) => id !== context.excludeWorkflowId)
.map(([id, workflow]) => ({
id,
label: workflow.name || `Workflow ${id.slice(0, 8)}`,
}))
.sort((a, b) => a.label.localeCompare(b.label))
},
fetchById: async ({ detailId }: SelectorQueryArgs): Promise<SelectorOption | null> => {
if (!detailId) return null
const { workflows } = useWorkflowRegistry.getState()
const workflow = workflows[detailId]
if (!workflow) return null
return {
id: detailId,
label: workflow.name || `Workflow ${detailId.slice(0, 8)}`,
}
},
},
}
export function getSelectorDefinition(key: SelectorKey): SelectorDefinition {

View File

@@ -29,6 +29,7 @@ export type SelectorKey =
| 'webflow.sites'
| 'webflow.collections'
| 'webflow.items'
+ | 'sim.workflows'
export interface SelectorOption {
id: string
@@ -52,6 +53,7 @@ export interface SelectorContext {
siteId?: string
collectionId?: string
spreadsheetId?: string
+ excludeWorkflowId?: string
}
export interface SelectorQueryArgs {

View File

@@ -64,6 +64,8 @@ import { SSO_TRUSTED_PROVIDERS } from './sso/constants'
const logger = createLogger('Auth')
+ import { getMicrosoftRefreshTokenExpiry, isMicrosoftProvider } from '@/lib/oauth/microsoft'
const validStripeKey = env.STRIPE_SECRET_KEY
let stripeClient = null
@@ -187,6 +189,10 @@ export const auth = betterAuth({
}
}
+ const refreshTokenExpiresAt = isMicrosoftProvider(account.providerId)
+ ? getMicrosoftRefreshTokenExpiry()
+ : account.refreshTokenExpiresAt
await db
.update(schema.account)
.set({
@@ -195,7 +201,7 @@
refreshToken: account.refreshToken,
idToken: account.idToken,
accessTokenExpiresAt: account.accessTokenExpiresAt,
- refreshTokenExpiresAt: account.refreshTokenExpiresAt,
+ refreshTokenExpiresAt,
scope: scopeToStore,
updatedAt: new Date(),
})
@@ -292,6 +298,13 @@
}
}
+ if (isMicrosoftProvider(account.providerId)) {
+ await db
+ .update(schema.account)
+ .set({ refreshTokenExpiresAt: getMicrosoftRefreshTokenExpiry() })
+ .where(eq(schema.account.id, account.id))
+ }
// Sync webhooks for credential sets after connecting a new credential
const requestId = crypto.randomUUID().slice(0, 8)
const userMemberships = await db
@@ -387,7 +400,6 @@ export const auth = betterAuth({
enabled: true,
allowDifferentEmails: true,
trustedProviders: [
- // Standard OAuth providers
'google',
'github',
'email-password',
@@ -403,8 +415,32 @@
'hubspot',
'linkedin',
'spotify',
- // Common SSO provider patterns
+ 'google-email',
+ 'google-calendar',
+ 'google-drive',
+ 'google-docs',
+ 'google-sheets',
+ 'google-forms',
+ 'google-vault',
+ 'google-groups',
+ 'vertex-ai',
+ 'github-repo',
+ 'microsoft-teams',
+ 'microsoft-excel',
+ 'microsoft-planner',
+ 'outlook',
+ 'onedrive',
+ 'sharepoint',
+ 'jira',
+ 'airtable',
+ 'dropbox',
+ 'salesforce',
+ 'wealthbox',
+ 'zoom',
+ 'wordpress',
+ 'linear',
+ 'shopify',
+ 'trello',
...SSO_TRUSTED_PROVIDERS,
],
},

View File

@@ -123,6 +123,8 @@ function resolveSubBlockOptions(sb: SubBlockConfig): string[] | undefined {
interface OutputFieldSchema {
type: string
description?: string
+ properties?: Record<string, OutputFieldSchema>
+ items?: { type: string }
}
/**
@@ -256,6 +258,42 @@ function mapSubBlockTypeToSchemaType(type: string): string {
return typeMap[type] || 'string'
}
/**
* Extracts a single output field schema, including nested properties
*/
function extractOutputField(def: any): OutputFieldSchema {
if (typeof def === 'string') {
return { type: def }
}
if (typeof def !== 'object' || def === null) {
return { type: 'any' }
}
const field: OutputFieldSchema = {
type: def.type || 'any',
}
if (def.description) {
field.description = def.description
}
// Include nested properties if present
if (def.properties && typeof def.properties === 'object') {
field.properties = {}
for (const [propKey, propDef] of Object.entries(def.properties)) {
field.properties[propKey] = extractOutputField(propDef)
}
}
// Include items schema for arrays
if (def.items && typeof def.items === 'object') {
field.items = { type: def.items.type || 'any' }
}
return field
}
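An illustrative input/output pair for extractOutputField (the definition below is made up, not taken from a real block config):

// Hypothetical output definition with nested properties and an array
const def = {
  type: 'object',
  description: 'Parsed email',
  properties: {
    subject: { type: 'string' },
    attachments: { type: 'array', items: { type: 'string' } },
  },
}

// extractOutputField(def) then yields the same nesting, normalized:
// { type: 'object', description: 'Parsed email',
//   properties: { subject: { type: 'string' },
//                 attachments: { type: 'array', items: { type: 'string' } } } }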
/**
* Extracts trigger outputs from the first available trigger
*/
@@ -272,15 +310,7 @@ function extractTriggerOutputs(blockConfig: any): Record<string, OutputFieldSche
const trigger = getTrigger(triggerId)
if (trigger.outputs) {
for (const [key, def] of Object.entries(trigger.outputs)) {
- if (typeof def === 'string') {
- outputs[key] = { type: def }
- } else if (typeof def === 'object' && def !== null) {
- const typedDef = def as { type?: string; description?: string }
- outputs[key] = {
- type: typedDef.type || 'any',
- description: typedDef.description,
- }
- }
+ outputs[key] = extractOutputField(def)
}
}
}
@@ -312,11 +342,7 @@
const tool = toolsRegistry[toolId]
if (tool?.outputs) {
for (const [key, def] of Object.entries(tool.outputs)) {
- const typedDef = def as { type: string; description?: string }
- outputs[key] = {
- type: typedDef.type || 'any',
- description: typedDef.description,
- }
+ outputs[key] = extractOutputField(def)
}
return outputs
}
@@ -329,15 +355,7 @@
// Use block-level outputs
if (blockConfig.outputs) {
for (const [key, def] of Object.entries(blockConfig.outputs)) {
- if (typeof def === 'string') {
- outputs[key] = { type: def }
- } else if (typeof def === 'object' && def !== null) {
- const typedDef = def as { type?: string; description?: string }
- outputs[key] = {
- type: typedDef.type || 'any',
- description: typedDef.description,
- }
- }
+ outputs[key] = extractOutputField(def)
}
}

View File

@@ -2468,16 +2468,17 @@ async function validateWorkflowSelectorIds(
const result = await validateSelectorIds(selector.selectorType, selector.value, context)
if (result.invalid.length > 0) {
+ // Include warning info (like available credentials) in the error message for better LLM feedback
+ const warningInfo = result.warning ? `. ${result.warning}` : ''
errors.push({
blockId: selector.blockId,
blockType: selector.blockType,
field: selector.fieldName,
value: selector.value,
- error: `Invalid ${selector.selectorType} ID(s): ${result.invalid.join(', ')} - ID(s) do not exist`,
+ error: `Invalid ${selector.selectorType} ID(s): ${result.invalid.join(', ')} - ID(s) do not exist or user doesn't have access${warningInfo}`,
})
- }
- // Log warnings that don't have errors (shouldn't happen for credentials but may for other selectors)
- if (result.warning) {
+ } else if (result.warning) {
logger.warn(result.warning, {
blockId: selector.blockId,
fieldName: selector.fieldName,

View File

@@ -39,6 +39,31 @@ export async function validateSelectorIds(
.from(account)
.where(and(inArray(account.id, idsArray), eq(account.userId, context.userId)))
existingIds = results.map((r) => r.id)
// If any IDs are invalid, fetch user's available credentials to include in error message
const existingSet = new Set(existingIds)
const invalidIds = idsArray.filter((id) => !existingSet.has(id))
if (invalidIds.length > 0) {
// Fetch all of the user's credentials to provide helpful feedback
const allUserCredentials = await db
.select({ id: account.id, providerId: account.providerId })
.from(account)
.where(eq(account.userId, context.userId))
const availableCredentials = allUserCredentials
.map((c) => `${c.id} (${c.providerId})`)
.join(', ')
const noCredentialsMessage = 'User has no credentials configured.'
return {
valid: existingIds,
invalid: invalidIds,
warning:
allUserCredentials.length > 0
? `Available credentials for this user: ${availableCredentials}`
: noCredentialsMessage,
}
}
break
}

View File

@@ -130,7 +130,11 @@ async function executeCode(request) {
await jail.set('environmentVariables', new ivm.ExternalCopy(envVars).copyInto())
for (const [key, value] of Object.entries(contextVariables)) {
- await jail.set(key, new ivm.ExternalCopy(value).copyInto())
+ if (value === undefined) {
+ await jail.set(key, undefined)
+ } else {
+ await jail.set(key, new ivm.ExternalCopy(value).copyInto())
+ }
}
const fetchCallback = new ivm.Reference(async (url, optionsJson) => {

View File

@@ -1,3 +1,4 @@
+ export * from './microsoft'
export * from './oauth'
export * from './types'
export * from './utils'

View File

@@ -0,0 +1,19 @@
export const MICROSOFT_REFRESH_TOKEN_LIFETIME_DAYS = 90
export const PROACTIVE_REFRESH_THRESHOLD_DAYS = 7
export const MICROSOFT_PROVIDERS = new Set([
'microsoft-excel',
'microsoft-planner',
'microsoft-teams',
'outlook',
'onedrive',
'sharepoint',
])
export function isMicrosoftProvider(providerId: string): boolean {
return MICROSOFT_PROVIDERS.has(providerId)
}
export function getMicrosoftRefreshTokenExpiry(): Date {
return new Date(Date.now() + MICROSOFT_REFRESH_TOKEN_LIFETIME_DAYS * 24 * 60 * 60 * 1000)
}
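PROACTIVE_REFRESH_THRESHOLD_DAYS is exported but not used in the code shown here; a minimal sketch of how it could be consumed, with a hypothetical helper name:

import { PROACTIVE_REFRESH_THRESHOLD_DAYS } from '@/lib/oauth/microsoft'

// Hypothetical helper: refresh while the 90-day rolling window still has headroom
export function shouldProactivelyRefresh(refreshTokenExpiresAt: Date | null): boolean {
  if (!refreshTokenExpiresAt) return false
  const thresholdMs = PROACTIVE_REFRESH_THRESHOLD_DAYS * 24 * 60 * 60 * 1000
  return refreshTokenExpiresAt.getTime() - Date.now() <= thresholdMs
}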

View File

@@ -835,6 +835,7 @@ function getProviderAuthConfig(provider: string): ProviderAuthConfig {
clientId,
clientSecret,
useBasicAuth: true,
+ supportsRefreshTokenRotation: true,
}
}
case 'confluence': {
@@ -883,6 +884,7 @@ function getProviderAuthConfig(provider: string): ProviderAuthConfig {
clientId,
clientSecret,
useBasicAuth: false,
+ supportsRefreshTokenRotation: true,
}
}
case 'microsoft':
@@ -910,6 +912,7 @@ function getProviderAuthConfig(provider: string): ProviderAuthConfig {
clientId,
clientSecret,
useBasicAuth: true,
+ supportsRefreshTokenRotation: true,
}
}
case 'dropbox': {

View File

@@ -771,12 +771,50 @@ function deepClone<T>(obj: T): T {
}
}
/**
* Recursively masks credential IDs in any value (string, object, or array).
* Used during serialization to ensure sensitive IDs are never persisted.
*/
function maskCredentialIdsInValue(value: any, credentialIds: Set<string>): any {
if (!value || credentialIds.size === 0) return value
if (typeof value === 'string') {
let masked = value
// Sort by length descending to mask longer IDs first
const sortedIds = Array.from(credentialIds).sort((a, b) => b.length - a.length)
for (const id of sortedIds) {
if (id && masked.includes(id)) {
masked = masked.split(id).join('••••••••')
}
}
return masked
}
if (Array.isArray(value)) {
return value.map((item) => maskCredentialIdsInValue(item, credentialIds))
}
if (typeof value === 'object') {
const masked: any = {}
for (const key of Object.keys(value)) {
masked[key] = maskCredentialIdsInValue(value[key], credentialIds)
}
return masked
}
return value
}
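An illustrative example of how maskCredentialIdsInValue treats nested data (the IDs are made up):

const credentialIds = new Set(['cred_abc123'])
const message = {
  text: 'Using credential cred_abc123 for Gmail',
  toolCalls: [{ params: { credential: 'cred_abc123' } }],
}
maskCredentialIdsInValue(message, credentialIds)
// => { text: 'Using credential •••••••• for Gmail',
//      toolCalls: [{ params: { credential: '••••••••' } }] }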
/**
* Serializes messages for database storage.
* Deep clones all fields to ensure proper JSON serialization.
+ * Masks sensitive credential IDs before persisting.
* This ensures they render identically when loaded back.
*/
function serializeMessagesForDB(messages: CopilotMessage[]): any[] {
+ // Get credential IDs to mask
+ const credentialIds = useCopilotStore.getState().sensitiveCredentialIds
const result = messages
.map((msg) => {
// Deep clone the entire message to ensure all nested data is serializable
@@ -824,7 +862,8 @@ function serializeMessagesForDB(messages: CopilotMessage[]): any[] {
serialized.errorType = msg.errorType
}
- return serialized
+ // Mask credential IDs in the serialized message before persisting
+ return maskCredentialIdsInValue(serialized, credentialIds)
})
.filter((msg) => {
// Filter out empty assistant messages
@@ -1320,7 +1359,16 @@ const sseHandlers: Record<string, SSEHandler> = {
typeof def.hasInterrupt === 'function'
? !!def.hasInterrupt(args || {})
: !!def.hasInterrupt
- if (!hasInterrupt && typeof def.execute === 'function') {
+ // Check if tool is auto-allowed - if so, execute even if it has an interrupt
+ const { autoAllowedTools } = get()
+ const isAutoAllowed = name ? autoAllowedTools.includes(name) : false
+ if ((!hasInterrupt || isAutoAllowed) && typeof def.execute === 'function') {
+ if (isAutoAllowed && hasInterrupt) {
+ logger.info('[toolCallsById] Auto-executing tool with interrupt (auto-allowed)', {
+ id,
+ name,
+ })
+ }
const ctx = createExecutionContext({ toolCallId: id, toolName: name || 'unknown_tool' })
// Defer executing transition by a tick to let pending render
setTimeout(() => {
@@ -1426,11 +1474,23 @@ const sseHandlers: Record<string, SSEHandler> = {
logger.warn('tool_call registry auto-exec check failed', { id, name, error: e })
}
- // Class-based auto-exec for non-interrupt tools
+ // Class-based auto-exec for non-interrupt tools or auto-allowed tools
try {
const inst = getClientTool(id) as any
const hasInterrupt = !!inst?.getInterruptDisplays?.()
- if (!hasInterrupt && typeof inst?.execute === 'function') {
+ // Check if tool is auto-allowed - if so, execute even if it has an interrupt
+ const { autoAllowedTools: classAutoAllowed } = get()
+ const isClassAutoAllowed = name ? classAutoAllowed.includes(name) : false
+ if (
+ (!hasInterrupt || isClassAutoAllowed) &&
+ (typeof inst?.execute === 'function' || typeof inst?.handleAccept === 'function')
+ ) {
+ if (isClassAutoAllowed && hasInterrupt) {
+ logger.info('[toolCallsById] Auto-executing class tool with interrupt (auto-allowed)', {
+ id,
+ name,
+ })
+ }
setTimeout(() => {
// Guard against duplicate execution - check if already executing or terminal
const currentState = get().toolCallsById[id]?.state
@@ -1449,7 +1509,12 @@
Promise.resolve()
.then(async () => {
- await inst.execute(args || {})
+ // Use handleAccept for tools with interrupts, execute for others
+ if (hasInterrupt && typeof inst?.handleAccept === 'function') {
+ await inst.handleAccept(args || {})
+ } else {
+ await inst.execute(args || {})
+ }
// Success/error will be synced via registerToolStateSync
})
.catch(() => {
@@ -1474,20 +1539,35 @@ const sseHandlers: Record<string, SSEHandler> = {
}
} catch {}
- // Integration tools: Stay in pending state until user confirms via buttons
+ // Integration tools: Check auto-allowed or stay in pending state until user confirms
// This handles tools like google_calendar_*, exa_*, gmail_read, etc. that aren't in the client registry
// Only relevant if mode is 'build' (agent)
- const { mode, workflowId } = get()
+ const { mode, workflowId, autoAllowedTools, executeIntegrationTool } = get()
if (mode === 'build' && workflowId) {
// Check if tool was NOT found in client registry
const def = name ? getTool(name) : undefined
const inst = getClientTool(id) as any
if (!def && !inst && name) {
- // Integration tools stay in pending state until user confirms
- logger.info('[build mode] Integration tool awaiting user confirmation', {
- id,
- name,
- })
+ // Check if this integration tool is auto-allowed - if so, execute it immediately
+ if (autoAllowedTools.includes(name)) {
+ logger.info('[build mode] Auto-executing integration tool (auto-allowed)', { id, name })
+ // Defer to allow pending state to render briefly
+ setTimeout(() => {
+ executeIntegrationTool(id).catch((err) => {
+ logger.error('[build mode] Auto-execute integration tool failed', {
+ id,
+ name,
+ error: err,
+ })
+ })
+ }, 0)
+ } else {
+ // Integration tools stay in pending state until user confirms
+ logger.info('[build mode] Integration tool awaiting user confirmation', {
+ id,
+ name,
+ })
+ }
}
}
},
@@ -1976,6 +2056,10 @@ const subAgentSSEHandlers: Record<string, SSEHandler> = {
}
// Execute client tools in parallel (non-blocking) - same pattern as main tool_call handler
+ // Check if tool is auto-allowed
+ const { autoAllowedTools: subAgentAutoAllowed } = get()
+ const isSubAgentAutoAllowed = name ? subAgentAutoAllowed.includes(name) : false
try {
const def = getTool(name)
if (def) {
@@ -1983,8 +2067,15 @@
typeof def.hasInterrupt === 'function'
? !!def.hasInterrupt(args || {})
: !!def.hasInterrupt
- if (!hasInterrupt) {
- // Auto-execute tools without interrupts - non-blocking
+ // Auto-execute if no interrupt OR if auto-allowed
+ if (!hasInterrupt || isSubAgentAutoAllowed) {
+ if (isSubAgentAutoAllowed && hasInterrupt) {
+ logger.info('[SubAgent] Auto-executing tool with interrupt (auto-allowed)', {
+ id,
+ name,
+ })
+ }
+ // Auto-execute tools - non-blocking
const ctx = createExecutionContext({ toolCallId: id, toolName: name })
Promise.resolve()
.then(() => def.execute(ctx, args || {}))
@@ -2001,9 +2092,22 @@ const subAgentSSEHandlers: Record<string, SSEHandler> = {
const instance = getClientTool(id)
if (instance) {
const hasInterruptDisplays = !!instance.getInterruptDisplays?.()
- if (!hasInterruptDisplays) {
+ // Auto-execute if no interrupt OR if auto-allowed
+ if (!hasInterruptDisplays || isSubAgentAutoAllowed) {
+ if (isSubAgentAutoAllowed && hasInterruptDisplays) {
+ logger.info('[SubAgent] Auto-executing class tool with interrupt (auto-allowed)', {
+ id,
+ name,
+ })
+ }
Promise.resolve()
- .then(() => instance.execute(args || {}))
+ .then(() => {
+ // Use handleAccept for tools with interrupts, execute for others
+ if (hasInterruptDisplays && typeof instance.handleAccept === 'function') {
+ return instance.handleAccept(args || {})
+ }
+ return instance.execute(args || {})
+ })
.catch((execErr: any) => {
logger.error('[SubAgent] Class tool execution failed', {
id,
@@ -2232,6 +2336,7 @@ const initialState = {
autoAllowedTools: [] as string[],
messageQueue: [] as import('./types').QueuedMessage[],
suppressAbortContinueOption: false,
+ sensitiveCredentialIds: new Set<string>(),
}
export const useCopilotStore = create<CopilotStore>()(
@@ -2614,6 +2719,12 @@
}))
}
+ // Load sensitive credential IDs for masking before streaming starts
+ await get().loadSensitiveCredentialIds()
+ // Ensure auto-allowed tools are loaded before tool calls arrive
+ await get().loadAutoAllowedTools()
let newMessages: CopilotMessage[]
if (revertState) {
const currentMessages = get().messages
@@ -3676,6 +3787,16 @@ export const useCopilotStore = create<CopilotStore>()(
const { id, name, params } = toolCall
+ // Guard against double execution - skip if already executing or in terminal state
+ if (toolCall.state === ClientToolCallState.executing || isTerminalState(toolCall.state)) {
+ logger.info('[executeIntegrationTool] Skipping - already executing or terminal', {
+ id,
+ name,
+ state: toolCall.state,
+ })
+ return
+ }
// Set to executing state
const executingMap = { ...get().toolCallsById }
executingMap[id] = {
@@ -3824,6 +3945,46 @@ export const useCopilotStore = create<CopilotStore>()(
const data = await res.json()
set({ autoAllowedTools: data.autoAllowedTools || [] })
logger.info('[AutoAllowedTools] Added tool', { toolId })
// Auto-execute all pending tools of the same type
const { toolCallsById, executeIntegrationTool } = get()
const pendingToolCalls = Object.values(toolCallsById).filter(
(tc) => tc.name === toolId && tc.state === ClientToolCallState.pending
)
if (pendingToolCalls.length > 0) {
const isIntegrationTool = !CLASS_TOOL_METADATA[toolId]
logger.info('[AutoAllowedTools] Auto-executing pending tools', {
toolId,
count: pendingToolCalls.length,
isIntegrationTool,
})
for (const tc of pendingToolCalls) {
if (isIntegrationTool) {
// Integration tools use executeIntegrationTool
executeIntegrationTool(tc.id).catch((err) => {
logger.error('[AutoAllowedTools] Auto-execute pending integration tool failed', {
toolCallId: tc.id,
toolId,
error: err,
})
})
} else {
// Client tools with interrupts use handleAccept
const inst = getClientTool(tc.id) as any
if (inst && typeof inst.handleAccept === 'function') {
Promise.resolve()
.then(() => inst.handleAccept(tc.params || {}))
.catch((err: any) => {
logger.error('[AutoAllowedTools] Auto-execute pending client tool failed', {
toolCallId: tc.id,
toolId,
error: err,
})
})
}
}
}
}
}
} catch (err) {
logger.error('[AutoAllowedTools] Failed to add tool', { toolId, error: err })
@@ -3853,6 +4014,57 @@ export const useCopilotStore = create<CopilotStore>()(
return autoAllowedTools.includes(toolId)
},
// Credential masking
loadSensitiveCredentialIds: async () => {
try {
const res = await fetch('/api/copilot/execute-copilot-server-tool', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({ toolName: 'get_credentials', payload: {} }),
})
if (!res.ok) {
logger.warn('[loadSensitiveCredentialIds] Failed to fetch credentials', {
status: res.status,
})
return
}
const json = await res.json()
// Credentials are at result.oauth.connected.credentials
const credentials = json?.result?.oauth?.connected?.credentials || []
logger.info('[loadSensitiveCredentialIds] Response', {
hasResult: !!json?.result,
credentialCount: credentials.length,
})
const ids = new Set<string>()
for (const cred of credentials) {
if (cred?.id) {
ids.add(cred.id)
}
}
set({ sensitiveCredentialIds: ids })
logger.info('[loadSensitiveCredentialIds] Loaded credential IDs', {
count: ids.size,
})
} catch (err) {
logger.warn('[loadSensitiveCredentialIds] Error loading credentials', err)
}
},
maskCredentialValue: (value: string) => {
const { sensitiveCredentialIds } = get()
if (!value || sensitiveCredentialIds.size === 0) return value
let masked = value
// Sort by length descending to mask longer IDs first
const sortedIds = Array.from(sensitiveCredentialIds).sort((a, b) => b.length - a.length)
for (const id of sortedIds) {
if (id && masked.includes(id)) {
masked = masked.split(id).join('••••••••')
}
}
return masked
},
// Message queue actions
addToQueue: (message, options) => {
const queuedMessage: import('./types').QueuedMessage = {

View File

@@ -156,6 +156,9 @@ export interface CopilotState {
// Message queue for messages sent while another is in progress
messageQueue: QueuedMessage[]
+ // Credential IDs to mask in UI (for sensitive data protection)
+ sensitiveCredentialIds: Set<string>
}
export interface CopilotActions {
@@ -235,6 +238,10 @@
removeAutoAllowedTool: (toolId: string) => Promise<void>
isToolAutoAllowed: (toolId: string) => boolean
+ // Credential masking
+ loadSensitiveCredentialIds: () => Promise<void>
+ maskCredentialValue: (value: string) => string
// Message queue actions
addToQueue: (
message: string,

View File

@@ -56,6 +56,7 @@ describe('Function Execute Tool', () => {
workflowVariables: {},
blockData: {},
blockNameMapping: {},
+ blockOutputSchemas: {},
isCustomTool: false,
language: 'javascript',
timeout: 5000,
@@ -83,6 +84,7 @@
workflowVariables: {},
blockData: {},
blockNameMapping: {},
+ blockOutputSchemas: {},
isCustomTool: false,
language: 'javascript',
workflowId: undefined,
@@ -101,6 +103,7 @@
workflowVariables: {},
blockData: {},
blockNameMapping: {},
+ blockOutputSchemas: {},
isCustomTool: false,
language: 'javascript',
workflowId: undefined,

View File

@@ -53,6 +53,13 @@ export const functionExecuteTool: ToolConfig<CodeExecutionInput, CodeExecutionOu
description: 'Mapping of block names to block IDs',
default: {},
},
+ blockOutputSchemas: {
+ type: 'object',
+ required: false,
+ visibility: 'hidden',
+ description: 'Mapping of block IDs to their output schemas for validation',
+ default: {},
+ },
workflowVariables: {
type: 'object',
required: false,
@@ -81,6 +88,7 @@
workflowVariables: params.workflowVariables || {},
blockData: params.blockData || {},
blockNameMapping: params.blockNameMapping || {},
+ blockOutputSchemas: params.blockOutputSchemas || {},
workflowId: params._context?.workflowId,
isCustomTool: params.isCustomTool || false,
}

View File

@@ -11,6 +11,7 @@ export interface CodeExecutionInput {
workflowVariables?: Record<string, unknown>
blockData?: Record<string, unknown>
blockNameMapping?: Record<string, string>
+ blockOutputSchemas?: Record<string, Record<string, unknown>>
_context?: {
workflowId?: string
}
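Taken together, the new blockOutputSchemas parameter flows from the tool config into CodeExecutionInput as an opaque map of block ID to output schema. A hedged sketch of what a call payload might look like; the interface is mirrored locally from the hunks above, and the block IDs and schema contents are purely illustrative:

```ts
// Local mirror of the fields shown in the diffs above (illustrative, not the full type).
interface CodeExecutionInput {
  workflowVariables?: Record<string, unknown>
  blockData?: Record<string, unknown>
  blockNameMapping?: Record<string, string>
  blockOutputSchemas?: Record<string, Record<string, unknown>>
  _context?: { workflowId?: string }
}

// Hypothetical payload: block IDs and the per-block schema shape are made up.
const params: CodeExecutionInput = {
  workflowVariables: {},
  blockData: { 'block-1': { count: 21 } },
  blockNameMapping: { 'Block 1': 'block-1' },
  blockOutputSchemas: {
    'block-1': { type: 'object', properties: { count: { type: 'number' } } },
  },
}

console.log(Object.keys(params.blockOutputSchemas ?? {})) // ['block-1']
```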

View File

@@ -643,13 +643,22 @@ async function executeToolRequest(
})
const responseHeaders = new Headers(secureResponse.headers.toRecord())
- const bodyBuffer = await secureResponse.arrayBuffer()
- response = new Response(bodyBuffer, {
-   status: secureResponse.status,
-   statusText: secureResponse.statusText,
-   headers: responseHeaders,
- })
+ const nullBodyStatuses = new Set([101, 204, 205, 304])
+ if (nullBodyStatuses.has(secureResponse.status)) {
+   response = new Response(null, {
+     status: secureResponse.status,
+     statusText: secureResponse.statusText,
+     headers: responseHeaders,
+   })
+ } else {
+   const bodyBuffer = await secureResponse.arrayBuffer()
+   response = new Response(bodyBuffer, {
+     status: secureResponse.status,
+     statusText: secureResponse.statusText,
+     headers: responseHeaders,
+   })
+ }
}
// For non-OK responses, attempt JSON first; if parsing fails, fall back to text
@@ -693,11 +702,9 @@ async function executeToolRequest(
throw errorToTransform
}
- // Parse response data once with guard for empty 202 bodies
let responseData
const status = response.status
- if (status === 202) {
- // Many APIs (e.g., Microsoft Graph) return 202 with empty body
+ if (status === 202 || status === 204 || status === 205) {
responseData = { status }
} else {
if (tool.transformResponse) {
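The nullBodyStatuses guard exists because the WHATWG Fetch Response constructor rejects a non-null body for statuses that forbid one (such as 204, 205, and 304), even when the buffer is empty. A small repro, assuming a runtime with the standard Fetch API (Node 18+ or a browser):

```ts
// Minimal repro of the null-body constraint. Passing even an empty buffer
// with status 204 throws; a null body is the only legal option for it.
const headers = new Headers({ 'x-example': 'demo' })

try {
  new Response(new ArrayBuffer(0), { status: 204, headers })
} catch (err) {
  console.log('rejected:', (err as Error).constructor.name) // TypeError
}

// The guarded form from the diff above.
const res = new Response(null, { status: 204, statusText: 'No Content', headers })
console.log(res.status, res.statusText) // 204 'No Content'
```

The second hunk extends the same idea to parsing: 204 and 205 join 202 as statuses whose (empty) bodies are never handed to a JSON or transform step.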

View File

@@ -110,12 +110,22 @@ spec:
{{- end }}
{{- include "sim.resources" .Values.app | nindent 10 }}
{{- include "sim.securityContext" .Values.app | nindent 10 }}
- {{- with .Values.extraVolumeMounts }}
+ {{- if or .Values.extraVolumeMounts .Values.app.extraVolumeMounts }}
volumeMounts:
+ {{- with .Values.extraVolumeMounts }}
{{- toYaml . | nindent 12 }}
+ {{- end }}
+ {{- with .Values.app.extraVolumeMounts }}
+ {{- toYaml . | nindent 12 }}
+ {{- end }}
{{- end }}
- {{- with .Values.extraVolumes }}
+ {{- if or .Values.extraVolumes .Values.app.extraVolumes }}
volumes:
+ {{- with .Values.extraVolumes }}
{{- toYaml . | nindent 8 }}
+ {{- end }}
+ {{- with .Values.app.extraVolumes }}
+ {{- toYaml . | nindent 8 }}
+ {{- end }}
{{- end }}
{{- end }}

View File

@@ -92,6 +92,7 @@ spec:
{{- toYaml .Values.ollama.readinessProbe | nindent 12 }}
{{- end }}
{{- include "sim.resources" .Values.ollama | nindent 10 }}
+ {{- if or .Values.ollama.persistence.enabled .Values.extraVolumeMounts .Values.ollama.extraVolumeMounts }}
volumeMounts:
{{- if .Values.ollama.persistence.enabled }}
- name: ollama-data
@@ -100,13 +101,22 @@ spec:
{{- with .Values.extraVolumeMounts }}
{{- toYaml . | nindent 12 }}
{{- end }}
- {{- if .Values.ollama.persistence.enabled }}
+ {{- with .Values.ollama.extraVolumeMounts }}
+ {{- toYaml . | nindent 12 }}
+ {{- end }}
+ {{- end }}
+ {{- if or .Values.ollama.persistence.enabled .Values.extraVolumes .Values.ollama.extraVolumes }}
volumes:
+ {{- if .Values.ollama.persistence.enabled }}
- name: ollama-data
persistentVolumeClaim:
claimName: {{ include "sim.fullname" . }}-ollama-data
+ {{- end }}
{{- with .Values.extraVolumes }}
{{- toYaml . | nindent 8 }}
{{- end }}
+ {{- with .Values.ollama.extraVolumes }}
+ {{- toYaml . | nindent 8 }}
+ {{- end }}
{{- end }}
{{- end }}

View File

@@ -84,12 +84,22 @@ spec:
{{- end }}
{{- include "sim.resources" .Values.realtime | nindent 10 }}
{{- include "sim.securityContext" .Values.realtime | nindent 10 }}
- {{- with .Values.extraVolumeMounts }}
+ {{- if or .Values.extraVolumeMounts .Values.realtime.extraVolumeMounts }}
volumeMounts:
+ {{- with .Values.extraVolumeMounts }}
{{- toYaml . | nindent 12 }}
+ {{- end }}
+ {{- with .Values.realtime.extraVolumeMounts }}
+ {{- toYaml . | nindent 12 }}
+ {{- end }}
{{- end }}
- {{- with .Values.extraVolumes }}
+ {{- if or .Values.extraVolumes .Values.realtime.extraVolumes }}
volumes:
+ {{- with .Values.extraVolumes }}
{{- toYaml . | nindent 8 }}
+ {{- end }}
+ {{- with .Values.realtime.extraVolumes }}
+ {{- toYaml . | nindent 8 }}
+ {{- end }}
{{- end }}
{{- end }}

View File

@@ -224,6 +224,10 @@ app:
timeoutSeconds: 5
failureThreshold: 3
+ # Additional volumes for app deployment (e.g., branding assets, custom configs)
+ extraVolumes: []
+ extraVolumeMounts: []
# Realtime socket server configuration
realtime:
# Enable/disable the realtime service
@@ -301,6 +305,10 @@ realtime:
timeoutSeconds: 5
failureThreshold: 3
+ # Additional volumes for realtime deployment
+ extraVolumes: []
+ extraVolumeMounts: []
# Database migrations job configuration
migrations:
# Enable/disable migrations job
@@ -539,6 +547,10 @@ ollama:
timeoutSeconds: 5
failureThreshold: 3
+ # Additional volumes for ollama deployment
+ extraVolumes: []
+ extraVolumeMounts: []
# Ingress configuration
ingress:
# Enable/disable ingress
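For reference, a hypothetical values override exercising the new per-deployment keys; the volume name, ConfigMap, and mount path below are made up, and the chart-level extraVolumes/extraVolumeMounts still apply to every deployment alongside them:

```yaml
# Hypothetical override file: names and paths are illustrative only.
app:
  extraVolumes:
    - name: branding-assets
      configMap:
        name: my-branding-config
  extraVolumeMounts:
    - name: branding-assets
      mountPath: /app/public/branding
      readOnly: true

realtime:
  extraVolumes: []
  extraVolumeMounts: []
```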