Compare commits

53 Commits

Author SHA1 Message Date
priyanshu.solanki
2b8b67179a fix to the response types and formatting 2025-12-18 19:00:17 -07:00
Priyanshu Solanki
2a7f51a2f6 adding clamps for subflow drag and drops of blocks (#2460)
Co-authored-by: priyanshu.solanki <priyanshu.solanki@saviynt.com>
2025-12-18 16:57:58 -07:00
Waleed
90c3c43607 fix(blog): add back unoptimized tag, fix styling (#2461) 2025-12-18 15:55:47 -08:00
Siddharth Ganesan
83d813a7cc improvement(copilot): add edge handle validation to copilot edit workflow (#2448)
* Add edge handle validation

* Clean

* Fix lint

* Fix empty target handle
2025-12-18 15:40:00 -08:00
Vikhyath Mondreti
811c736705 fix failing lint from os contributor (#2459) 2025-12-18 15:03:31 -08:00
Vikhyath Mondreti
c6757311af Merge branch 'main' into staging 2025-12-18 14:58:48 -08:00
div
b5b12ba2d1 fix(teams): webhook notifications crash (#2426)
* fix(docs): clarify working directory for drizzle migration (#2375)

* fix(landing): prevent url encoding for spaces for footer links (#2376)

* fix: handle empty body.value in Teams webhook notification parser (#2425)

* Update directory path for migration command

---------

Co-authored-by: Vikhyath Mondreti <vikhyathvikku@gmail.com>
Co-authored-by: Waleed <walif6@gmail.com>
Co-authored-by: waleedlatif1 <waleedlatif1@users.noreply.github.com>
Co-authored-by: icecrasher321 <icecrasher321@users.noreply.github.com>
Co-authored-by: Siddharth Ganesan <33737564+Sg312@users.noreply.github.com>
Co-authored-by: mosa <mosaxiv@gmail.com>
Co-authored-by: Emir Karabeg <78010029+emir-karabeg@users.noreply.github.com>
Co-authored-by: Adam Gough <77861281+aadamgough@users.noreply.github.com>
Co-authored-by: Shivam <shivamprajapati035@gmail.com>
Co-authored-by: Gaurav Chadha <65453826+Chadha93@users.noreply.github.com>
Co-authored-by: root <root@Delta.localdomain>
2025-12-18 14:57:27 -08:00
Waleed
0d30676e34 fix(blog): revert back to using next image tags in blog (#2458) 2025-12-18 13:51:58 -08:00
Waleed
36bdccb449 fix(ui): fixed visibility issue on reset password page (#2456) 2025-12-18 13:24:32 -08:00
Waleed
f45730a89e improvement(helm): added SSO and cloud storage variables to helm charts (#2454)
* improvement(helm): added SSO and cloud storage variables to helm charts

* consolidated sf types
2025-12-18 13:12:21 -08:00
Vikhyath Mondreti
04cd837e9c fix(notifs): inactivity polling filters, consolidate trigger types, minor consistency issue with filter parsing (#2452)
* fix(notifs-slack): display name for account

* fix inactivity polling check

* consolidate trigger types

* remove redundant defaults

* fix
2025-12-18 12:49:58 -08:00
Waleed
c23130a26e Revert "fix(salesforce): updated to more flexible oauth that allows production, developer, and custom domain salesforce orgs (#2441) (#2444)" (#2453)
This reverts commit 9da19e84b7.
2025-12-18 12:46:24 -08:00
Priyanshu Solanki
7575cd6f27 Merge pull request #2451 from simstudioai/improvement/SIM-514-useWebhookUrl-conditioning
improvement(useWebhookUrl): GET api/webhook is called when useWebhookUrl:true
2025-12-18 13:31:06 -07:00
priyanshu.solanki
fbde64f0b0 fixing lint errors 2025-12-18 13:04:25 -07:00
Waleed
25f7ed20f6 feat(docs): added 404 page for the docs (#2450)
* feat(docs): added 404 page for the docs

* added metadata
2025-12-18 11:46:42 -08:00
priyanshu.solanki
261aa3d72d fixing a React component 2025-12-18 12:39:47 -07:00
Waleed
9da19e84b7 fix(salesforce): updated to more flexible oauth that allows production, developer, and custom domain salesforce orgs (#2441) (#2444)
* fix(oauth): updated oauth providers that had unstable reference IDs leading to duplicate oauth records (#2441)

* fix(oauth): updated oauth providers that had unstable reference IDs leading to duplicate oauth records

* ack PR comments

* ack PR comments

* cleanup salesforce refresh logic

* ack more PR comments
2025-12-18 11:39:28 -08:00
priyanshu.solanki
e83afc0a62 fixing the useWebhookManagement call to only call the loadWebhookOrGenerateUrl function when the useWebhookUrl flag is true 2025-12-18 12:31:18 -07:00
Vikhyath Mondreti
1720fa8749 feat(compare-schema): ci check to make sure schema.ts never goes out of sync with migrations (#2449)
* feat(compare-schema): ci check to make sure schema.ts never goes out of sync with migrations

* test out of sync [do not merge]

* Revert "test out of sync [do not merge]"

This reverts commit 9771f66b84.
2025-12-18 11:25:19 -08:00
Waleed
f3ad7750af fix(auth): added same-origin validation to forget password route, added confirmation for disable auth FF (#2447)
* fix(auth): added same-origin validation to forget password route, added confirmation for disable auth FF

* ack PR comments
2025-12-18 11:07:25 -08:00
Vikhyath Mondreti
78b7643e65 fix(condition): async execution isolated vm error (#2446)
* fix(condition): async execution isolated vm error

* fix tests
2025-12-18 11:02:01 -08:00
Waleed
67cfb21d08 v0.5.34: servicenow, code cleanup, prevent cyclic edge connections, custom tool fixes 2025-12-17 23:39:10 -08:00
Waleed
1d6975db49 v0.5.33: loops, chat fixes, subflow resizing refactor, terminal updates 2025-12-17 15:45:39 -08:00
Waleed
837aabca5e v0.5.32: google sheets fix, schedule input format 2025-12-16 15:41:04 -08:00
Vikhyath Mondreti
f9cfca92bf v0.5.31: add zod as direct dep 2025-12-15 20:40:02 -08:00
Waleed
25afacb25e v0.5.30: vllm fixes, permissions fixes, isolated vms for code execution, tool fixes 2025-12-15 19:38:01 -08:00
Gaurav Chadha
fcf52ac4d5 fix(landing): prevent url encoding for spaces for footer links (#2376) 2025-12-15 10:59:12 -08:00
Shivam
842200bcf2 fix(docs): clarify working directory for drizzle migration (#2375) 2025-12-15 10:58:27 -08:00
Waleed
a0fb889644 v0.5.29: chat voice mode, opengraph for docs, option to disable auth 2025-12-13 19:50:06 -08:00
Waleed
f526c36fc0 v0.5.28: tool fixes, sqs, spotify, nextjs update, component playground 2025-12-12 21:05:57 -08:00
Waleed
e24f31cbce v0.5.27: sidebar updates, ssrf patches, gpt-5.2, stagehand fixes 2025-12-11 14:45:25 -08:00
Waleed
3fbd57caf1 v0.5.26: tool fixes, templates and knowledgebase fixes, deployment versions in logs 2025-12-11 00:52:13 -08:00
Vikhyath Mondreti
b5da61377c v0.5.25: minor ui improvements, copilot billing fix 2025-12-10 18:32:27 -08:00
Waleed
18b7032494 v0.5.24: agent tool and UX improvements, redis service overhaul (#2291)
* feat(folders): add the ability to create a folder within a folder in popover (#2287)

* fix(agent): filter out empty params to ensure LLM can set tool params at runtime (#2288)

* fix(mcp): added backfill effect to add missing descriptions for mcp tools (#2290)

* fix(redis): cleanup access pattern across callsites (#2289)

* fix(redis): cleanup access pattern across callsites

* swap redis command to be non blocking

* improvement(log-details): polling, trace spans (#2292)

---------

Co-authored-by: Vikhyath Mondreti <vikhyathvikku@gmail.com>
Co-authored-by: Emir Karabeg <78010029+emir-karabeg@users.noreply.github.com>
2025-12-10 13:09:21 -08:00
Waleed
b7bbef8620 v0.5.23: kb, logs, general ui improvements, token bucket rate limits, docs, mcp, autolayout improvements (#2286)
* fix(mcp): prevent redundant MCP server discovery calls at runtime, use cached tool schema instead (#2273)

* fix(mcp): prevent redundant MCP server discovery calls at runtime, use cached tool schema instead

* added backfill, added loading state for tools in settings > mcp

* fix tool inp

* feat(rate-limiter): token bucket algorithm  (#2270)

* fix(ratelimit): make deployed chat rate limited

* improvement(rate-limiter): use token bucket algo

* update docs

* fix

* fix type

* fix db rate limiter

* address greptile comments

* feat(i18n): update translations (#2275)

Co-authored-by: icecrasher321 <icecrasher321@users.noreply.github.com>

* fix(tools): updated kalshi and polymarket tools to accurately reflect outputs (#2274)

* feat(i18n): update translations (#2276)

Co-authored-by: waleedlatif1 <waleedlatif1@users.noreply.github.com>

* fix(autolayout): align by handle (#2277)

* fix(autolayout): align by handle

* use shared constants everywhere

* cleanup

* fix(copilot): fix custom tools (#2278)

* Fix title custom tool

* Checkpoint (broken)

* Fix custom tool flash

* Edit workflow returns null fix

* Works

* Fix lint

* fix(ime): prevent form submission during IME composition steps (#2279)

* fix(ui): prevent form submission during IME composition steps

* chore(gitignore): add IntelliJ IDE files to .gitignore

---------

Co-authored-by: Vikhyath Mondreti <vikhyathvikku@gmail.com>
Co-authored-by: Waleed <walif6@gmail.com>
Co-authored-by: waleedlatif1 <waleedlatif1@users.noreply.github.com>

* feat(ui): logs, kb, emcn (#2207)

* feat(kb): emcn alignment; sidebar: popover primary; settings-modal: expand

* feat: EMCN breadcrumb; improvement(KB): UI

* fix: hydration error

* improvement(KB): UI

* feat: emcn modal sizing, KB tags; refactor: deleted old sidebar

* feat(logs): UI

* fix: add documents modal name

* feat: logs, emcn, cursorrules; refactor: logs

* feat: dashboard

* feat: notifications; improvement: logs details

* fixed random rectangle on canvas

* fixed the name of the file to align

* fix build

---------

Co-authored-by: waleed <walif6@gmail.com>

* fix(creds): glitch allowing multiple credentials in an integration (#2282)

* improvement: custom tools modal, logs-details (#2283)

* fix(docs): fix copy page button and header hook (#2284)

* improvement(chat): add the ability to download files from the deployed chat (#2280)

* added teams download and chat download file

* Removed comments

* removed comments

* component structure and download all

* removed comments

* cleanup code

* fix empty files case

* small fix

* fix(container): resize heuristic improvement (#2285)

* estimate block height for resize based on subblocks

* fix hydration error

* make more conservative

---------

Co-authored-by: Vikhyath Mondreti <vikhyathvikku@gmail.com>
Co-authored-by: icecrasher321 <icecrasher321@users.noreply.github.com>
Co-authored-by: waleedlatif1 <waleedlatif1@users.noreply.github.com>
Co-authored-by: Siddharth Ganesan <33737564+Sg312@users.noreply.github.com>
Co-authored-by: mosa <mosaxiv@gmail.com>
Co-authored-by: Emir Karabeg <78010029+emir-karabeg@users.noreply.github.com>
Co-authored-by: Adam Gough <77861281+aadamgough@users.noreply.github.com>
2025-12-10 00:57:58 -08:00
Waleed
52edbea659 v0.5.22: rss feed trigger, sftp tool, billing fixes, 413 surfacing, copilot improvements 2025-12-09 10:27:36 -08:00
Vikhyath Mondreti
d480057fd3 fix(migration): migration got removed by force push (#2253) 2025-12-08 14:08:12 -08:00
Waleed
c27c233da0 v0.5.21: google groups, virtualized code viewer, ui, autolayout, docs improvements 2025-12-08 13:10:50 -08:00
Waleed
ebef5f3a27 v0.5.20: google slides, ui fixes, subflow resizing improvements 2025-12-06 15:36:09 -08:00
Vikhyath Mondreti
12c4c2d44f v0.5.19: copilot fix 2025-12-05 15:27:31 -08:00
Vikhyath Mondreti
929a352edb fix(build): added trigger.dev sdk mock to tests (#2216) 2025-12-05 14:26:50 -08:00
Vikhyath Mondreti
6cd078b0fe v0.5.18: ui fixes, nextjs16, workspace notifications, admin APIs, loading improvements, new slack tools 2025-12-05 14:03:09 -08:00
Waleed
31874939ee v0.5.17: modals, billing fixes, bun update, zoom, dropbox, kalshi, polymarket, datadog, ahrefs, gitlab, shopify, ssh, wordpress integrations 2025-12-04 13:29:46 -08:00
Waleed
e157ce5fbc v0.5.16: MCP fixes, code refactors, jira fixes, new mistral models 2025-12-02 22:02:11 -08:00
Vikhyath Mondreti
774e5d585c v0.5.15: add tools, revert subblock prop change 2025-12-01 13:52:12 -08:00
Vikhyath Mondreti
54cc93743f v0.5.14: fix issue with teams, google selectors + cleanup code 2025-12-01 12:39:39 -08:00
Waleed
8c32ad4c0d v0.5.13: polling fixes, generic agent search tool, status page, smtp, sendgrid, linkedin, more tools (#2148)
* feat(tools): added smtp, sendgrid, mailgun, linkedin, fixed permissions in context menu (#2133)

* feat(tools): added twilio sendgrid integration

* feat(tools): added smtp, sendgrid, mailgun, fixed permissions in context menu

* added top level mocks for sporadically failing tests

* incr type safety

* fix(team-plans): track departed member usage so value not lost (#2118)

* fix(team-plans): track departed member usage so value not lost

* reset usage to 0 when they leave team

* prep merge with staging

* regen migrations

* fix org invite + ws selection

---------

Co-authored-by: Waleed <walif6@gmail.com>

* feat(i18n): update translations (#2134)

Co-authored-by: waleedlatif1 <waleedlatif1@users.noreply.github.com>

* feat(creators): add verification for creators (#2135)

* feat(tools): added apify block/tools  (#2136)

* feat(tools): added apify

* cleanup

* feat(i18n): update translations (#2137)

Co-authored-by: waleedlatif1 <waleedlatif1@users.noreply.github.com>

* feat(env): added more optional env var examples (#2138)

* feat(statuspage): added statuspage, updated list of tools in footer, renamed routes (#2139)

* feat(statuspage): added statuspage, updated list of tools in footer, renamed routes

* ack PR comments

* feat(tools): add generic search tool (#2140)

* feat(i18n): update translations (#2141)

* fix(sdks): bump sdk versions (#2142)

* fix(webhooks): count test webhooks towards usage limit (#2143)

* fix(bill): add requestId to webhook processing (#2144)

* improvement(subflow): remove all associated edges when moving a block into a subflow (#2145)

* improvement(subflow): remove all associated edges when moving a block into a subflow

* ack PR comments

* fix(polling): mark webhook failed on webhook trigger errors (#2146)

* fix(deps): declare core transient deps explicitly (#2147)

* fix(deps): declare core transient deps explicitly

* ack PR comments

---------

Co-authored-by: Vikhyath Mondreti <vikhyathvikku@gmail.com>
Co-authored-by: waleedlatif1 <waleedlatif1@users.noreply.github.com>
2025-12-01 10:15:36 -08:00
Waleed
1d08796853 v0.5.12: memory optimizations, sentry, incidentio, posthog, zendesk, pylon, intercom, mailchimp, loading optimizations (#2132)
* fix(memory-util): fixed unbounded array of gmail/outlook pollers causing high memory util, added missing db indexes/removed unused ones, auto-disable schedules/webhooks after 10 consecutive failures (#2115)

* fix(memory-util): fixed unbounded array of gmail/outlook pollers causing high memory util, added missing db indexes/removed unused ones, auto-disable schedules/webhooks after 10 consecutive failures

* ack PR comments

* ack

* improvement(teams-plan): seats increase simplification + not triggering checkout session (#2117)

* improvement(teams-plan): seats increase simplification + not triggering checkout session

* cleanup via helper

* feat(tools): added sentry, incidentio, and posthog tools (#2116)

* feat(tools): added sentry, incidentio, and posthog tools

* update docs

* fixed docs to use native fumadocs for llms.txt and copy markdown, fixed tool issues

* cleanup

* enhance error extractor, fixed posthog tools

* docs enhancements, cleanup

* added more incident io ops, remove zustand/shallow in favor of zustand/react/shallow

* fix type errors

* remove unnecessary comments

* added vllm to docs

* feat(i18n): update translations (#2120)

* feat(i18n): update translations

* fix build

---------

Co-authored-by: waleedlatif1 <waleedlatif1@users.noreply.github.com>

* improvement(workflow-execution): perf improvements to passing workflow state + decrypted env vars (#2119)

* improvement(execution): load workflow state once instead of 2-3 times

* decrypt only in get helper

* remove comments

* remove comments

* feat(models): host google gemini models (#2122)

* feat(models): host google gemini models

* remove unused primary key

* feat(i18n): update translations (#2123)

Co-authored-by: waleedlatif1 <waleedlatif1@users.noreply.github.com>

* feat(tools): added zendesk, pylon, intercom, & mailchimp (#2126)

* feat(tools): added zendesk, pylon, intercom, & mailchimp

* finish zendesk and pylon

* updated docs

* feat(i18n): update translations (#2129)

* feat(i18n): update translations

* fixed build

---------

Co-authored-by: waleedlatif1 <waleedlatif1@users.noreply.github.com>

* fix(permissions): add client-side permissions validation to prevent unauthorized actions, upgraded custom tool modal (#2130)

* fix(permissions): add client-side permissions validation to prevent unauthorized actions, upgraded custom tool modal

* fix failing test

* fix test

* cleanup

* fix(custom-tools): add composite index on custom tool names & workspace id (#2131)

---------

Co-authored-by: Vikhyath Mondreti <vikhyathvikku@gmail.com>
Co-authored-by: waleedlatif1 <waleedlatif1@users.noreply.github.com>
2025-11-28 16:08:06 -08:00
Waleed
ebcd243942 v0.5.11: stt, videogen, vllm, billing fixes, new models 2025-11-25 01:14:12 -08:00
Waleed
b7e814b721 v0.5.10: copilot upgrade, preprocessor, logs search, UI, code hygiene 2025-11-21 12:04:34 -08:00
Waleed
842ef27ed9 v0.5.9: add backwards compatibility for agent messages array 2025-11-20 11:19:42 -08:00
Vikhyath Mondreti
31c34b2ea3 v0.5.8: notifications, billing, ui changes, store loading state machine 2025-11-20 01:32:32 -08:00
Vikhyath Mondreti
8f0ef58056 v0.5.7: combobox selectors, usage indicator, workflow loading race condition, other improvements 2025-11-17 21:25:51 -08:00
99 changed files with 2169 additions and 1527 deletions

@@ -48,6 +48,19 @@ jobs:
ENCRYPTION_KEY: '7cf672e460e430c1fba707575c2b0e2ad5a99dddf9b7b7e3b5646e630861db1c' # dummy key for CI only
run: bun run test
- name: Check schema and migrations are in sync
working-directory: packages/db
run: |
bunx drizzle-kit generate --config=./drizzle.config.ts
if [ -n "$(git status --porcelain ./migrations)" ]; then
echo "❌ Schema and migrations are out of sync!"
echo "Run 'cd packages/db && bunx drizzle-kit generate' and commit the new migrations."
git status --porcelain ./migrations
git diff ./migrations
exit 1
fi
echo "✅ Schema and migrations are in sync"
- name: Build application
env:
NODE_OPTIONS: '--no-warnings'

@@ -188,6 +188,7 @@ DATABASE_URL="postgresql://postgres:your_password@localhost:5432/simstudio"
Then run the migrations:
```bash
cd packages/db # Required so drizzle picks correct .env file
bunx drizzle-kit migrate --config=./drizzle.config.ts
```

@@ -0,0 +1,23 @@
import { DocsBody, DocsPage } from 'fumadocs-ui/page'
export const metadata = {
title: 'Page Not Found',
}
export default function NotFound() {
return (
<DocsPage>
<DocsBody>
<div className='flex min-h-[60vh] flex-col items-center justify-center text-center'>
<h1 className='mb-4 bg-gradient-to-b from-[#8357FF] to-[#6F3DFA] bg-clip-text font-bold text-8xl text-transparent'>
404
</h1>
<h2 className='mb-2 font-semibold text-2xl text-foreground'>Page Not Found</h2>
<p className='text-muted-foreground'>
The page you're looking for doesn't exist or has been moved.
</p>
</div>
</DocsBody>
</DocsPage>
)
}

@@ -573,10 +573,10 @@ export default function LoginPage({
<Dialog open={forgotPasswordOpen} onOpenChange={setForgotPasswordOpen}>
<DialogContent className='auth-card auth-card-shadow max-w-[540px] rounded-[10px] border backdrop-blur-sm'>
<DialogHeader>
<DialogTitle className='auth-text-primary font-semibold text-xl tracking-tight'>
<DialogTitle className='font-semibold text-black text-xl tracking-tight'>
Reset Password
</DialogTitle>
<DialogDescription className='auth-text-secondary text-sm'>
<DialogDescription className='text-muted-foreground text-sm'>
Enter your email address and we'll send you a link to reset your password if your
account exists.
</DialogDescription>

@@ -109,7 +109,7 @@ export default function Footer({ fullWidth = false }: FooterProps) {
{FOOTER_BLOCKS.map((block) => (
<Link
key={block}
href={`https://docs.sim.ai/blocks/${block.toLowerCase().replace(' ', '-')}`}
href={`https://docs.sim.ai/blocks/${block.toLowerCase().replaceAll(' ', '-')}`}
target='_blank'
rel='noopener noreferrer'
className='text-[14px] text-muted-foreground transition-colors hover:text-foreground'

@@ -1,8 +1,7 @@
import Image from 'next/image'
import Link from 'next/link'
import { Avatar, AvatarFallback, AvatarImage } from '@/components/ui/avatar'
import { getAllPostMeta } from '@/lib/blog/registry'
import { soehne } from '@/app/_styles/fonts/soehne/soehne'
import { PostGrid } from '@/app/(landing)/studio/post-grid'
export const revalidate = 3600
@@ -18,7 +17,6 @@ export default async function StudioIndex({
const all = await getAllPostMeta()
const filtered = tag ? all.filter((p) => p.tags.includes(tag)) : all
// Sort to ensure featured post is first on page 1
const sorted =
pageNum === 1
? filtered.sort((a, b) => {
@@ -63,69 +61,7 @@ export default async function StudioIndex({
</div> */}
{/* Grid layout for consistent rows */}
<div className='grid grid-cols-1 gap-4 md:grid-cols-2 md:gap-6 lg:grid-cols-3'>
{posts.map((p, i) => {
return (
<Link key={p.slug} href={`/studio/${p.slug}`} className='group flex flex-col'>
<div className='flex h-full flex-col overflow-hidden rounded-xl border border-gray-200 transition-colors duration-300 hover:border-gray-300'>
<Image
src={p.ogImage}
alt={p.title}
width={800}
height={450}
className='h-48 w-full object-cover'
sizes='(max-width: 768px) 100vw, (max-width: 1024px) 50vw, 33vw'
loading='lazy'
unoptimized
/>
<div className='flex flex-1 flex-col p-4'>
<div className='mb-2 text-gray-600 text-xs'>
{new Date(p.date).toLocaleDateString('en-US', {
month: 'short',
day: 'numeric',
year: 'numeric',
})}
</div>
<h3 className='shine-text mb-1 font-medium text-lg leading-tight'>{p.title}</h3>
<p className='mb-3 line-clamp-3 flex-1 text-gray-700 text-sm'>{p.description}</p>
<div className='flex items-center gap-2'>
<div className='-space-x-1.5 flex'>
{(p.authors && p.authors.length > 0 ? p.authors : [p.author])
.slice(0, 3)
.map((author, idx) => (
<Avatar key={idx} className='size-4 border border-white'>
<AvatarImage src={author?.avatarUrl} alt={author?.name} />
<AvatarFallback className='border border-white bg-gray-100 text-[10px] text-gray-600'>
{author?.name.slice(0, 2)}
</AvatarFallback>
</Avatar>
))}
</div>
<span className='text-gray-600 text-xs'>
{(p.authors && p.authors.length > 0 ? p.authors : [p.author])
.slice(0, 2)
.map((a) => a?.name)
.join(', ')}
{(p.authors && p.authors.length > 0 ? p.authors : [p.author]).length > 2 && (
<>
{' '}
and{' '}
{(p.authors && p.authors.length > 0 ? p.authors : [p.author]).length - 2}{' '}
other
{(p.authors && p.authors.length > 0 ? p.authors : [p.author]).length - 2 >
1
? 's'
: ''}
</>
)}
</span>
</div>
</div>
</div>
</Link>
)
})}
</div>
<PostGrid posts={posts} />
{totalPages > 1 && (
<div className='mt-10 flex items-center justify-center gap-3'>

@@ -0,0 +1,90 @@
'use client'
import Image from 'next/image'
import Link from 'next/link'
import { Avatar, AvatarFallback, AvatarImage } from '@/components/ui/avatar'
interface Author {
id: string
name: string
avatarUrl?: string
url?: string
}
interface Post {
slug: string
title: string
description: string
date: string
ogImage: string
author: Author
authors?: Author[]
featured?: boolean
}
export function PostGrid({ posts }: { posts: Post[] }) {
return (
<div className='grid grid-cols-1 gap-4 md:grid-cols-2 md:gap-6 lg:grid-cols-3'>
{posts.map((p, index) => (
<Link key={p.slug} href={`/studio/${p.slug}`} className='group flex flex-col'>
<div className='flex h-full flex-col overflow-hidden rounded-xl border border-gray-200 transition-colors duration-300 hover:border-gray-300'>
{/* Image container with fixed aspect ratio to prevent layout shift */}
<div className='relative aspect-video w-full overflow-hidden'>
<Image
src={p.ogImage}
alt={p.title}
sizes='(max-width: 768px) 100vw, (max-width: 1024px) 50vw, 33vw'
unoptimized
priority={index < 6}
loading={index < 6 ? undefined : 'lazy'}
fill
style={{ objectFit: 'cover' }}
/>
</div>
<div className='flex flex-1 flex-col p-4'>
<div className='mb-2 text-gray-600 text-xs'>
{new Date(p.date).toLocaleDateString('en-US', {
month: 'short',
day: 'numeric',
year: 'numeric',
})}
</div>
<h3 className='shine-text mb-1 font-medium text-lg leading-tight'>{p.title}</h3>
<p className='mb-3 line-clamp-3 flex-1 text-gray-700 text-sm'>{p.description}</p>
<div className='flex items-center gap-2'>
<div className='-space-x-1.5 flex'>
{(p.authors && p.authors.length > 0 ? p.authors : [p.author])
.slice(0, 3)
.map((author, idx) => (
<Avatar key={idx} className='size-4 border border-white'>
<AvatarImage src={author?.avatarUrl} alt={author?.name} />
<AvatarFallback className='border border-white bg-gray-100 text-[10px] text-gray-600'>
{author?.name.slice(0, 2)}
</AvatarFallback>
</Avatar>
))}
</div>
<span className='text-gray-600 text-xs'>
{(p.authors && p.authors.length > 0 ? p.authors : [p.author])
.slice(0, 2)
.map((a) => a?.name)
.join(', ')}
{(p.authors && p.authors.length > 0 ? p.authors : [p.author]).length > 2 && (
<>
{' '}
and {(p.authors && p.authors.length > 0 ? p.authors : [p.author]).length - 2}{' '}
other
{(p.authors && p.authors.length > 0 ? p.authors : [p.author]).length - 2 > 1
? 's'
: ''}
</>
)}
</span>
</div>
</div>
</div>
</Link>
))}
</div>
)
}

@@ -12,6 +12,7 @@ export function ThemeProvider({ children, ...props }: ThemeProviderProps) {
pathname === '/' ||
pathname.startsWith('/login') ||
pathname.startsWith('/signup') ||
pathname.startsWith('/reset-password') ||
pathname.startsWith('/sso') ||
pathname.startsWith('/terms') ||
pathname.startsWith('/privacy') ||

@@ -759,3 +759,24 @@ input[type="search"]::-ms-clear {
--surface-elevated: #202020;
}
}
/**
* Remove backticks from inline code in prose (Tailwind Typography default)
*/
.prose code::before,
.prose code::after {
content: none !important;
}
/**
* Remove underlines from heading anchor links in prose
*/
.prose h1 a,
.prose h2 a,
.prose h3 a,
.prose h4 a,
.prose h5 a,
.prose h6 a {
text-decoration: none !important;
color: inherit !important;
}

@@ -32,7 +32,17 @@ export async function GET(request: NextRequest) {
.from(account)
.where(and(...whereConditions))
return NextResponse.json({ accounts })
// Use the user's email as the display name (consistent with credential selector)
const userEmail = session.user.email
const accountsWithDisplayName = accounts.map((acc) => ({
id: acc.id,
accountId: acc.accountId,
providerId: acc.providerId,
displayName: userEmail || acc.providerId,
}))
return NextResponse.json({ accounts: accountsWithDisplayName })
} catch (error) {
logger.error('Failed to fetch accounts', { error })
return NextResponse.json({ error: 'Internal server error' }, { status: 500 })

@@ -6,6 +6,10 @@
import { afterEach, beforeEach, describe, expect, it, vi } from 'vitest'
import { createMockRequest, setupAuthApiMocks } from '@/app/api/__test-utils__/utils'
vi.mock('@/lib/core/utils/urls', () => ({
getBaseUrl: vi.fn(() => 'https://app.example.com'),
}))
describe('Forget Password API Route', () => {
beforeEach(() => {
vi.resetModules()
@@ -15,7 +19,7 @@ describe('Forget Password API Route', () => {
vi.clearAllMocks()
})
it('should send password reset email successfully', async () => {
it('should send password reset email successfully with same-origin redirectTo', async () => {
setupAuthApiMocks({
operations: {
forgetPassword: { success: true },
@@ -24,7 +28,7 @@ describe('Forget Password API Route', () => {
const req = createMockRequest('POST', {
email: 'test@example.com',
redirectTo: 'https://example.com/reset',
redirectTo: 'https://app.example.com/reset',
})
const { POST } = await import('@/app/api/auth/forget-password/route')
@@ -39,12 +43,36 @@ describe('Forget Password API Route', () => {
expect(auth.auth.api.forgetPassword).toHaveBeenCalledWith({
body: {
email: 'test@example.com',
redirectTo: 'https://example.com/reset',
redirectTo: 'https://app.example.com/reset',
},
method: 'POST',
})
})
it('should reject external redirectTo URL', async () => {
setupAuthApiMocks({
operations: {
forgetPassword: { success: true },
},
})
const req = createMockRequest('POST', {
email: 'test@example.com',
redirectTo: 'https://evil.com/phishing',
})
const { POST } = await import('@/app/api/auth/forget-password/route')
const response = await POST(req)
const data = await response.json()
expect(response.status).toBe(400)
expect(data.message).toBe('Redirect URL must be a valid same-origin URL')
const auth = await import('@/lib/auth')
expect(auth.auth.api.forgetPassword).not.toHaveBeenCalled()
})
it('should send password reset email without redirectTo', async () => {
setupAuthApiMocks({
operations: {

@@ -1,6 +1,7 @@
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { auth } from '@/lib/auth'
import { isSameOrigin } from '@/lib/core/utils/validation'
import { createLogger } from '@/lib/logs/console/logger'
export const dynamic = 'force-dynamic'
@@ -13,10 +14,15 @@ const forgetPasswordSchema = z.object({
.email('Please provide a valid email address'),
redirectTo: z
.string()
.url('Redirect URL must be a valid URL')
.optional()
.or(z.literal(''))
.transform((val) => (val === '' ? undefined : val)),
.transform((val) => (val === '' || val === undefined ? undefined : val))
.refine(
(val) => val === undefined || (z.string().url().safeParse(val).success && isSameOrigin(val)),
{
message: 'Redirect URL must be a valid same-origin URL',
}
),
})
export async function POST(request: NextRequest) {
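The schema above refines `redirectTo` with an `isSameOrigin` helper whose implementation is not part of this diff. A minimal sketch of such a check, assuming it compares the candidate URL's origin against the app's own base URL (the test file mocks `getBaseUrl` as `'https://app.example.com'`), might look like:

```typescript
// Hypothetical sketch only: the real isSameOrigin lives in
// '@/lib/core/utils/validation' and is not shown in this diff.
// BASE_URL stands in for the value getBaseUrl() would return.
const BASE_URL = 'https://app.example.com' // assumption for illustration

export function isSameOrigin(url: string): boolean {
  try {
    // new URL() throws on malformed input; treat that as not same-origin
    return new URL(url).origin === new URL(BASE_URL).origin
  } catch {
    return false
  }
}
```

This matches the test expectations in the diff: a same-origin `redirectTo` passes, while `https://evil.com/phishing` is rejected before `forgetPassword` is ever called.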

@@ -11,6 +11,7 @@ import { processInputFileFields } from '@/lib/execution/files'
import { preprocessExecution } from '@/lib/execution/preprocessing'
import { createLogger } from '@/lib/logs/console/logger'
import { LoggingSession } from '@/lib/logs/execution/logging-session'
import { ALL_TRIGGER_TYPES } from '@/lib/logs/types'
import { executeWorkflowCore } from '@/lib/workflows/executor/execution-core'
import { type ExecutionEvent, encodeSSEEvent } from '@/lib/workflows/executor/execution-events'
import { PauseResumeManager } from '@/lib/workflows/executor/human-in-the-loop-manager'
@@ -30,7 +31,7 @@ const logger = createLogger('WorkflowExecuteAPI')
const ExecuteWorkflowSchema = z.object({
selectedOutputs: z.array(z.string()).optional().default([]),
triggerType: z.enum(['api', 'webhook', 'schedule', 'manual', 'chat']).optional(),
triggerType: z.enum(ALL_TRIGGER_TYPES).optional(),
stream: z.boolean().optional(),
useDraftState: z.boolean().optional(),
input: z.any().optional(),
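This change swaps the hard-coded literal union for a shared `ALL_TRIGGER_TYPES` constant from `@/lib/logs/types` (part of the trigger-type consolidation in #2452). The constant itself is not shown in this diff; assuming it at least covers the union it replaces, its shape would be something like:

```typescript
// Hypothetical shape of ALL_TRIGGER_TYPES; the real constant in
// '@/lib/logs/types' is not shown in this diff and may contain more entries.
// z.enum() requires a readonly, non-empty tuple, hence `as const`.
export const ALL_TRIGGER_TYPES = ['api', 'webhook', 'schedule', 'manual', 'chat'] as const

// Derived union type stays in sync with the runtime list automatically
export type TriggerType = (typeof ALL_TRIGGER_TYPES)[number]
```

Defining the tuple once and deriving the type from it means adding a trigger type in one place updates both the Zod schema and the TypeScript union.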

@@ -1,111 +0,0 @@
import { db } from '@sim/db'
import { workflow, workflowFolder } from '@sim/db/schema'
import { eq } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { getSession } from '@/lib/auth'
import { createLogger } from '@/lib/logs/console/logger'
import { loadBulkWorkflowsFromNormalizedTables } from '@/lib/workflows/persistence/utils'
import { getUserEntityPermissions } from '@/lib/workspaces/permissions/utils'
const logger = createLogger('WorkspaceExportAPI')
/**
* GET /api/workspaces/[id]/export
* Export all workspace data (workflows with states, folders) in a single request.
* Much more efficient than fetching each workflow individually.
*/
export async function GET(request: NextRequest, { params }: { params: Promise<{ id: string }> }) {
const startTime = Date.now()
const { id: workspaceId } = await params
try {
const session = await getSession()
if (!session?.user?.id) {
return NextResponse.json({ error: 'Unauthorized' }, { status: 401 })
}
// Check if user has access to this workspace
const userPermission = await getUserEntityPermissions(session.user.id, 'workspace', workspaceId)
if (!userPermission) {
return NextResponse.json({ error: 'Workspace not found or access denied' }, { status: 404 })
}
// Fetch all workflows and folders in parallel (2 queries)
const [workflows, folders] = await Promise.all([
db.select().from(workflow).where(eq(workflow.workspaceId, workspaceId)),
db.select().from(workflowFolder).where(eq(workflowFolder.workspaceId, workspaceId)),
])
const workflowIds = workflows.map((w) => w.id)
// Bulk load all workflow states (3 queries total via inArray)
const workflowStates = await loadBulkWorkflowsFromNormalizedTables(workflowIds)
// Build export data
const workflowsExport = workflows.map((w) => {
const state = workflowStates.get(w.id)
// Build the workflow state with defaults if no normalized data
const workflowState = state
? {
blocks: state.blocks,
edges: state.edges,
loops: state.loops,
parallels: state.parallels,
lastSaved: Date.now(),
isDeployed: w.isDeployed || false,
deployedAt: w.deployedAt,
}
: {
blocks: {},
edges: [],
loops: {},
parallels: {},
lastSaved: Date.now(),
isDeployed: w.isDeployed || false,
deployedAt: w.deployedAt,
}
// Extract variables from workflow record
const variables = Object.values((w.variables as Record<string, any>) || {}).map((v: any) => ({
id: v.id,
name: v.name,
type: v.type,
value: v.value,
}))
return {
workflow: {
id: w.id,
name: w.name,
description: w.description,
color: w.color,
folderId: w.folderId,
},
state: workflowState,
variables,
}
})
const foldersExport = folders.map((f) => ({
id: f.id,
name: f.name,
parentId: f.parentId,
}))
const elapsed = Date.now() - startTime
logger.info(`Exported workspace ${workspaceId} in ${elapsed}ms`, {
workflowsCount: workflowsExport.length,
foldersCount: foldersExport.length,
})
return NextResponse.json({
workflows: workflowsExport,
folders: foldersExport,
})
} catch (error) {
const elapsed = Date.now() - startTime
logger.error(`Error exporting workspace ${workspaceId} after ${elapsed}ms:`, error)
return NextResponse.json({ error: 'Failed to export workspace' }, { status: 500 })
}
}

View File

@@ -6,13 +6,14 @@ import { z } from 'zod'
import { getSession } from '@/lib/auth'
import { encryptSecret } from '@/lib/core/security/encryption'
import { createLogger } from '@/lib/logs/console/logger'
import { ALL_TRIGGER_TYPES } from '@/lib/logs/types'
import { getUserEntityPermissions } from '@/lib/workspaces/permissions/utils'
import { MAX_EMAIL_RECIPIENTS, MAX_WORKFLOW_IDS } from '../constants'
const logger = createLogger('WorkspaceNotificationAPI')
const levelFilterSchema = z.array(z.enum(['info', 'error']))
const triggerFilterSchema = z.array(z.enum(['api', 'webhook', 'schedule', 'manual', 'chat']))
const triggerFilterSchema = z.array(z.enum(ALL_TRIGGER_TYPES))
const alertRuleSchema = z.enum([
'consecutive_failures',

View File

@@ -7,6 +7,7 @@ import { z } from 'zod'
import { getSession } from '@/lib/auth'
import { encryptSecret } from '@/lib/core/security/encryption'
import { createLogger } from '@/lib/logs/console/logger'
import { ALL_TRIGGER_TYPES } from '@/lib/logs/types'
import { getUserEntityPermissions } from '@/lib/workspaces/permissions/utils'
import { MAX_EMAIL_RECIPIENTS, MAX_NOTIFICATIONS_PER_TYPE, MAX_WORKFLOW_IDS } from './constants'
@@ -14,7 +15,7 @@ const logger = createLogger('WorkspaceNotificationsAPI')
const notificationTypeSchema = z.enum(['webhook', 'email', 'slack'])
const levelFilterSchema = z.array(z.enum(['info', 'error']))
const triggerFilterSchema = z.array(z.enum(['api', 'webhook', 'schedule', 'manual', 'chat']))
const triggerFilterSchema = z.array(z.enum(ALL_TRIGGER_TYPES))
const alertRuleSchema = z.enum([
'consecutive_failures',
@@ -80,7 +81,7 @@ const createNotificationSchema = z
workflowIds: z.array(z.string()).max(MAX_WORKFLOW_IDS).default([]),
allWorkflows: z.boolean().default(false),
levelFilter: levelFilterSchema.default(['info', 'error']),
triggerFilter: triggerFilterSchema.default(['api', 'webhook', 'schedule', 'manual', 'chat']),
triggerFilter: triggerFilterSchema.default([...ALL_TRIGGER_TYPES]),
includeFinalOutput: z.boolean().default(false),
includeTraceSpans: z.boolean().default(false),
includeRateLimits: z.boolean().default(false),

View File

@@ -104,6 +104,8 @@ export function SlackChannelSelector({
disabled={disabled || channels.length === 0}
isLoading={isLoading}
error={fetchError}
searchable
searchPlaceholder='Search channels...'
/>
{selectedChannel && !fetchError && (
<p className='text-[12px] text-[var(--text-muted)]'>

View File

@@ -22,6 +22,7 @@ import { SlackIcon } from '@/components/icons'
import { Skeleton } from '@/components/ui'
import { cn } from '@/lib/core/utils/cn'
import { createLogger } from '@/lib/logs/console/logger'
import { ALL_TRIGGER_TYPES, type TriggerType } from '@/lib/logs/types'
import { quickValidateEmail } from '@/lib/messaging/email/validation'
import {
type NotificationSubscription,
@@ -43,7 +44,6 @@ const PRIMARY_BUTTON_STYLES =
type NotificationType = 'webhook' | 'email' | 'slack'
type LogLevel = 'info' | 'error'
type TriggerType = 'api' | 'webhook' | 'schedule' | 'manual' | 'chat'
type AlertRule =
| 'none'
| 'consecutive_failures'
@@ -84,7 +84,6 @@ interface NotificationSettingsProps {
}
const LOG_LEVELS: LogLevel[] = ['info', 'error']
const TRIGGER_TYPES: TriggerType[] = ['api', 'webhook', 'schedule', 'manual', 'chat']
function formatAlertConfigLabel(config: {
rule: AlertRule
@@ -137,7 +136,7 @@ export function NotificationSettings({
workflowIds: [] as string[],
allWorkflows: true,
levelFilter: ['info', 'error'] as LogLevel[],
triggerFilter: ['api', 'webhook', 'schedule', 'manual', 'chat'] as TriggerType[],
triggerFilter: [...ALL_TRIGGER_TYPES] as TriggerType[],
includeFinalOutput: false,
includeTraceSpans: false,
includeRateLimits: false,
@@ -207,7 +206,7 @@ export function NotificationSettings({
workflowIds: [],
allWorkflows: true,
levelFilter: ['info', 'error'],
triggerFilter: ['api', 'webhook', 'schedule', 'manual', 'chat'],
triggerFilter: [...ALL_TRIGGER_TYPES],
includeFinalOutput: false,
includeTraceSpans: false,
includeRateLimits: false,
@@ -768,7 +767,7 @@ export function NotificationSettings({
<Combobox
options={slackAccounts.map((acc) => ({
value: acc.id,
label: acc.accountId,
label: acc.displayName || 'Slack Workspace',
}))}
value={formData.slackAccountId}
onChange={(value) => {
@@ -859,7 +858,7 @@ export function NotificationSettings({
<div className='flex flex-col gap-[8px]'>
<Label className='text-[var(--text-secondary)]'>Trigger Type Filters</Label>
<Combobox
options={TRIGGER_TYPES.map((trigger) => ({
options={ALL_TRIGGER_TYPES.map((trigger) => ({
label: trigger.charAt(0).toUpperCase() + trigger.slice(1),
value: trigger,
}))}

View File

@@ -90,6 +90,7 @@ export function ShortInput({
blockId,
triggerId: undefined,
isPreview,
useWebhookUrl,
})
const wandHook = useWand({

View File

@@ -74,6 +74,7 @@ export function TriggerSave({
blockId,
triggerId: effectiveTriggerId,
isPreview,
useWebhookUrl: true, // persist the webhook URL in the store

})
const triggerConfig = useSubBlockStore((state) => state.getValue(blockId, 'triggerConfig'))

View File

@@ -6,6 +6,61 @@ import { getBlock } from '@/blocks/registry'
const logger = createLogger('NodeUtilities')
/**
* Estimates block dimensions based on block type.
* Uses subblock count to estimate height for blocks that haven't been measured yet.
*
* @param blockType - The type of block (e.g., 'condition', 'agent')
* @returns Estimated width and height for the block
*/
export function estimateBlockDimensions(blockType: string): { width: number; height: number } {
const blockConfig = getBlock(blockType)
const subBlockCount = blockConfig?.subBlocks?.length ?? 3
// Many subblocks are conditionally rendered (advanced mode, provider-specific, etc.)
// Use roughly half the config count as a reasonable estimate, capped between 3-7 rows
const estimatedRows = Math.max(3, Math.min(Math.ceil(subBlockCount / 2), 7))
const hasErrorRow = blockType !== 'starter' && blockType !== 'response' ? 1 : 0
const height =
BLOCK_DIMENSIONS.HEADER_HEIGHT +
BLOCK_DIMENSIONS.WORKFLOW_CONTENT_PADDING +
(estimatedRows + hasErrorRow) * BLOCK_DIMENSIONS.WORKFLOW_ROW_HEIGHT
return {
width: BLOCK_DIMENSIONS.FIXED_WIDTH,
height: Math.max(height, BLOCK_DIMENSIONS.MIN_HEIGHT),
}
}
/**
* Clamps a position to keep a block fully inside a container's content area.
* Content area starts after the header and padding, and ends before the right/bottom padding.
*
* @param position - Raw position relative to container origin
* @param containerDimensions - Container width and height
* @param blockDimensions - Block width and height
* @returns Clamped position that keeps block inside content area
*/
export function clampPositionToContainer(
position: { x: number; y: number },
containerDimensions: { width: number; height: number },
blockDimensions: { width: number; height: number }
): { x: number; y: number } {
const { width: containerWidth, height: containerHeight } = containerDimensions
const { width: blockWidth, height: blockHeight } = blockDimensions
// Content area bounds (where blocks can be placed)
const minX = CONTAINER_DIMENSIONS.LEFT_PADDING
const minY = CONTAINER_DIMENSIONS.HEADER_HEIGHT + CONTAINER_DIMENSIONS.TOP_PADDING
const maxX = containerWidth - CONTAINER_DIMENSIONS.RIGHT_PADDING - blockWidth
const maxY = containerHeight - CONTAINER_DIMENSIONS.BOTTOM_PADDING - blockHeight
return {
x: Math.max(minX, Math.min(position.x, Math.max(minX, maxX))),
y: Math.max(minY, Math.min(position.y, Math.max(minY, maxY))),
}
}
/**
* Hook providing utilities for node position, hierarchy, and dimension calculations
*/
@@ -21,7 +76,7 @@ export function useNodeUtilities(blocks: Record<string, any>) {
/**
* Get the dimensions of a block.
* For regular blocks, estimates height based on block config if not yet measured.
* For regular blocks, uses stored height or estimates based on block config.
*/
const getBlockDimensions = useCallback(
(blockId: string): { width: number; height: number } => {
@@ -41,32 +96,16 @@ export function useNodeUtilities(blocks: Record<string, any>) {
}
}
// Workflow block nodes have fixed visual width
const width = BLOCK_DIMENSIONS.FIXED_WIDTH
// Prefer deterministic height published by the block component; fallback to estimate
let height = block.height
if (!height) {
// Estimate height based on block config's subblock count for more accurate initial sizing
// This is critical for subflow containers to size correctly before child blocks are measured
const blockConfig = getBlock(block.type)
const subBlockCount = blockConfig?.subBlocks?.length ?? 3
// Many subblocks are conditionally rendered (advanced mode, provider-specific, etc.)
// Use roughly half the config count as a reasonable estimate, capped between 3-7 rows
const estimatedRows = Math.max(3, Math.min(Math.ceil(subBlockCount / 2), 7))
const hasErrorRow = block.type !== 'starter' && block.type !== 'response' ? 1 : 0
height =
BLOCK_DIMENSIONS.HEADER_HEIGHT +
BLOCK_DIMENSIONS.WORKFLOW_CONTENT_PADDING +
(estimatedRows + hasErrorRow) * BLOCK_DIMENSIONS.WORKFLOW_ROW_HEIGHT
if (block.height) {
return {
width: BLOCK_DIMENSIONS.FIXED_WIDTH,
height: Math.max(block.height, BLOCK_DIMENSIONS.MIN_HEIGHT),
}
}
return {
width,
height: Math.max(height, BLOCK_DIMENSIONS.MIN_HEIGHT),
}
// Use shared estimation utility for blocks without measured height
return estimateBlockDimensions(block.type)
},
[blocks, isContainerType]
)
@@ -164,29 +203,36 @@ export function useNodeUtilities(blocks: Record<string, any>) {
)
/**
* Calculates the relative position of a node to a new parent's content area.
* Accounts for header height and padding offsets in container nodes.
* Calculates the relative position of a node to a new parent's origin.
* React Flow positions children relative to parent origin, so we clamp
* to the content area bounds (after header and padding).
* @param nodeId ID of the node being repositioned
* @param newParentId ID of the new parent
* @returns Relative position coordinates {x, y} within the parent's content area
* @returns Relative position coordinates {x, y} within the parent
*/
const calculateRelativePosition = useCallback(
(nodeId: string, newParentId: string): { x: number; y: number } => {
const nodeAbsPos = getNodeAbsolutePosition(nodeId)
const parentAbsPos = getNodeAbsolutePosition(newParentId)
const parentNode = getNodes().find((n) => n.id === newParentId)
// Account for container's header and padding
// Children are positioned relative to content area, not container origin
const headerHeight = 50
const leftPadding = 16
const topPadding = 16
return {
x: nodeAbsPos.x - parentAbsPos.x - leftPadding,
y: nodeAbsPos.y - parentAbsPos.y - headerHeight - topPadding,
// Calculate raw relative position (relative to parent origin)
const rawPosition = {
x: nodeAbsPos.x - parentAbsPos.x,
y: nodeAbsPos.y - parentAbsPos.y,
}
// Get container and block dimensions
const containerDimensions = {
width: parentNode?.data?.width || CONTAINER_DIMENSIONS.DEFAULT_WIDTH,
height: parentNode?.data?.height || CONTAINER_DIMENSIONS.DEFAULT_HEIGHT,
}
const blockDimensions = getBlockDimensions(nodeId)
// Clamp position to keep block inside content area
return clampPositionToContainer(rawPosition, containerDimensions, blockDimensions)
},
[getNodeAbsolutePosition]
[getNodeAbsolutePosition, getNodes, getBlockDimensions]
)
/**
@@ -252,7 +298,11 @@ export function useNodeUtilities(blocks: Record<string, any>) {
*/
const calculateLoopDimensions = useCallback(
(nodeId: string): { width: number; height: number } => {
const childNodes = getNodes().filter((node) => node.parentId === nodeId)
// Check both React Flow's node.parentId AND blocks store's data.parentId
// This ensures we catch children even if React Flow hasn't re-rendered yet
const childNodes = getNodes().filter(
(node) => node.parentId === nodeId || blocks[node.id]?.data?.parentId === nodeId
)
if (childNodes.length === 0) {
return {
width: CONTAINER_DIMENSIONS.DEFAULT_WIDTH,
@@ -265,8 +315,11 @@ export function useNodeUtilities(blocks: Record<string, any>) {
childNodes.forEach((node) => {
const { width: nodeWidth, height: nodeHeight } = getBlockDimensions(node.id)
maxRight = Math.max(maxRight, node.position.x + nodeWidth)
maxBottom = Math.max(maxBottom, node.position.y + nodeHeight)
// Use block position from store if available (more up-to-date)
const block = blocks[node.id]
const position = block?.position || node.position
maxRight = Math.max(maxRight, position.x + nodeWidth)
maxBottom = Math.max(maxBottom, position.y + nodeHeight)
})
const width = Math.max(
@@ -283,7 +336,7 @@ export function useNodeUtilities(blocks: Record<string, any>) {
return { width, height }
},
[getNodes, getBlockDimensions]
[getNodes, getBlockDimensions, blocks]
)
/**

View File
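The `clampPositionToContainer` helper introduced above can be exercised in isolation. The padding and header values below are illustrative placeholders standing in for `CONTAINER_DIMENSIONS`, not the real constants:

```typescript
// Hypothetical stand-ins for CONTAINER_DIMENSIONS; the real values live in
// @/lib/workflows/blocks/block-dimensions.
const PADDING = { left: 16, right: 16, top: 16, bottom: 16, header: 50 }

// Same shape as clampPositionToContainer: keep a block fully inside the
// container's content area (after the header/top padding, before the
// right/bottom padding), preferring the min bound when the container is
// too small to fit the block.
function clampToContent(
  pos: { x: number; y: number },
  container: { width: number; height: number },
  block: { width: number; height: number }
): { x: number; y: number } {
  const minX = PADDING.left
  const minY = PADDING.header + PADDING.top
  const maxX = container.width - PADDING.right - block.width
  const maxY = container.height - PADDING.bottom - block.height
  return {
    x: Math.max(minX, Math.min(pos.x, Math.max(minX, maxX))),
    y: Math.max(minY, Math.min(pos.y, Math.max(minY, maxY))),
  }
}
```

The nested `Math.max(minX, maxX)` guards the degenerate case where the block is wider or taller than the content area: the clamp then pins the block to the top-left of the content area rather than producing an inverted range.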

@@ -18,7 +18,7 @@ import { useShallow } from 'zustand/react/shallow'
import type { OAuthConnectEventDetail } from '@/lib/copilot/tools/client/other/oauth-request-access'
import { createLogger } from '@/lib/logs/console/logger'
import type { OAuthProvider } from '@/lib/oauth'
import { CONTAINER_DIMENSIONS } from '@/lib/workflows/blocks/block-dimensions'
import { BLOCK_DIMENSIONS, CONTAINER_DIMENSIONS } from '@/lib/workflows/blocks/block-dimensions'
import { TriggerUtils } from '@/lib/workflows/triggers/triggers'
import { useWorkspacePermissionsContext } from '@/app/workspace/[workspaceId]/providers/workspace-permissions-provider'
import {
@@ -40,6 +40,10 @@ import {
useCurrentWorkflow,
useNodeUtilities,
} from '@/app/workspace/[workspaceId]/w/[workflowId]/hooks'
import {
clampPositionToContainer,
estimateBlockDimensions,
} from '@/app/workspace/[workspaceId]/w/[workflowId]/hooks/use-node-utilities'
import { useSocket } from '@/app/workspace/providers/socket-provider'
import { getBlock } from '@/blocks'
import { isAnnotationOnlyBlock } from '@/executor/constants'
@@ -694,17 +698,19 @@ const WorkflowContent = React.memo(() => {
return
}
// Calculate position relative to the container's content area
// Account for header (50px), left padding (16px), and top padding (16px)
const headerHeight = 50
const leftPadding = 16
const topPadding = 16
const relativePosition = {
x: position.x - containerInfo.loopPosition.x - leftPadding,
y: position.y - containerInfo.loopPosition.y - headerHeight - topPadding,
// Calculate raw position relative to container origin
const rawPosition = {
x: position.x - containerInfo.loopPosition.x,
y: position.y - containerInfo.loopPosition.y,
}
// Clamp position to keep block inside container's content area
const relativePosition = clampPositionToContainer(
rawPosition,
containerInfo.dimensions,
estimateBlockDimensions(data.type)
)
// Capture existing child blocks before adding the new one
const existingChildBlocks = Object.values(blocks).filter(
(b) => b.data?.parentId === containerInfo.loopId
@@ -1910,17 +1916,47 @@ const WorkflowContent = React.memo(() => {
})
document.body.style.cursor = ''
// Get the block's current parent (if any)
const currentBlock = blocks[node.id]
const currentParentId = currentBlock?.data?.parentId
// Calculate position - clamp if inside a container
let finalPosition = node.position
if (currentParentId) {
// Block is inside a container - clamp position to keep it fully inside
const parentNode = getNodes().find((n) => n.id === currentParentId)
if (parentNode) {
const containerDimensions = {
width: parentNode.data?.width || CONTAINER_DIMENSIONS.DEFAULT_WIDTH,
height: parentNode.data?.height || CONTAINER_DIMENSIONS.DEFAULT_HEIGHT,
}
const blockDimensions = {
width: BLOCK_DIMENSIONS.FIXED_WIDTH,
height: Math.max(
currentBlock?.height || BLOCK_DIMENSIONS.MIN_HEIGHT,
BLOCK_DIMENSIONS.MIN_HEIGHT
),
}
finalPosition = clampPositionToContainer(
node.position,
containerDimensions,
blockDimensions
)
}
}
// Emit collaborative position update for the final position
// This ensures other users see the smooth final position
collaborativeUpdateBlockPosition(node.id, node.position, true)
collaborativeUpdateBlockPosition(node.id, finalPosition, true)
// Record single move entry on drag end to avoid micro-moves
const start = getDragStartPosition()
if (start && start.id === node.id) {
const before = { x: start.x, y: start.y, parentId: start.parentId }
const after = {
x: node.position.x,
y: node.position.y,
x: finalPosition.x,
y: finalPosition.y,
parentId: node.parentId || blocks[node.id]?.data?.parentId,
}
const moved =

View File

@@ -1,6 +1,9 @@
import { useCallback, useState } from 'react'
import { createLogger } from '@/lib/logs/console/logger'
import { exportWorkspaceToZip } from '@/lib/workflows/operations/import-export'
import {
exportWorkspaceToZip,
type WorkflowExportData,
} from '@/lib/workflows/operations/import-export'
const logger = createLogger('useExportWorkspace')
@@ -15,7 +18,8 @@ interface UseExportWorkspaceProps {
* Hook for managing workspace export to ZIP.
*
* Handles:
* - Fetching all workflows and folders from workspace via bulk export endpoint
* - Fetching all workflows and folders from workspace
* - Fetching workflow states and variables
* - Creating ZIP file with all workspace data
* - Downloading the ZIP file
* - Loading state management
@@ -38,13 +42,74 @@ export function useExportWorkspace({ onSuccess }: UseExportWorkspaceProps = {})
try {
logger.info('Exporting workspace', { workspaceId })
// Single API call to get all workspace data (workflows with states + folders)
const response = await fetch(`/api/workspaces/${workspaceId}/export`)
if (!response.ok) {
throw new Error('Failed to export workspace')
// Fetch all workflows in workspace
const workflowsResponse = await fetch(`/api/workflows?workspaceId=${workspaceId}`)
if (!workflowsResponse.ok) {
throw new Error('Failed to fetch workflows')
}
const { data: workflows } = await workflowsResponse.json()
// Fetch all folders in workspace
const foldersResponse = await fetch(`/api/folders?workspaceId=${workspaceId}`)
if (!foldersResponse.ok) {
throw new Error('Failed to fetch folders')
}
const foldersData = await foldersResponse.json()
// Export each workflow
const workflowsToExport: WorkflowExportData[] = []
for (const workflow of workflows) {
try {
const workflowResponse = await fetch(`/api/workflows/${workflow.id}`)
if (!workflowResponse.ok) {
logger.warn(`Failed to fetch workflow ${workflow.id}`)
continue
}
const { data: workflowData } = await workflowResponse.json()
if (!workflowData?.state) {
logger.warn(`Workflow ${workflow.id} has no state`)
continue
}
const variablesResponse = await fetch(`/api/workflows/${workflow.id}/variables`)
let workflowVariables: any[] = []
if (variablesResponse.ok) {
const variablesData = await variablesResponse.json()
workflowVariables = Object.values(variablesData?.data || {}).map((v: any) => ({
id: v.id,
name: v.name,
type: v.type,
value: v.value,
}))
}
workflowsToExport.push({
workflow: {
id: workflow.id,
name: workflow.name,
description: workflow.description,
color: workflow.color,
folderId: workflow.folderId,
},
state: workflowData.state,
variables: workflowVariables,
})
} catch (error) {
logger.error(`Failed to export workflow ${workflow.id}:`, error)
}
}
const { workflows: workflowsToExport, folders: foldersToExport } = await response.json()
const foldersToExport: Array<{
id: string
name: string
parentId: string | null
}> = (foldersData.folders || []).map((folder: any) => ({
id: folder.id,
name: folder.name,
parentId: folder.parentId,
}))
const zipBlob = await exportWorkspaceToZip(
workspaceName,

View File

@@ -15,6 +15,7 @@ export const ResponseBlock: BlockConfig<ResponseBlockOutput> = {
- This is usually used as the last block in the workflow.
`,
category: 'blocks',
hideFromToolbar: true,
bgColor: '#2F55FF',
icon: ResponseIcon,
subBlocks: [

View File

@@ -0,0 +1,121 @@
import { ResponseIcon } from '@/components/icons'
import type { BlockConfig } from '@/blocks/types'
/**
* WorkflowResponseBlock - A simplified response block with flat output structure.
* Output is directly accessible via <Response.fieldName> instead of nested paths.
*/
export const WorkflowResponseBlock: BlockConfig = {
type: 'workflow_response',
name: 'Response',
description: 'Send structured API response',
longDescription:
'A Response block with direct field access. Use <Response.fieldName> to reference output fields directly.',
docsLink: 'https://docs.sim.ai/blocks/response',
bestPractices: `
- Only use this if the trigger block is the API Trigger.
- Prefer the builder mode over the editor mode.
- This is usually used as the last block in the workflow.
- Output fields are directly accessible: <Response.fieldName>
`,
category: 'blocks',
bgColor: '#2F55FF',
icon: ResponseIcon,
subBlocks: [
{
id: 'dataMode',
title: 'Response Data Mode',
type: 'dropdown',
options: [
{ label: 'Builder', id: 'structured' },
{ label: 'Editor', id: 'json' },
],
value: () => 'structured',
description: 'Choose how to define your response data structure',
},
{
id: 'builderData',
title: 'Response Structure',
type: 'response-format',
condition: { field: 'dataMode', value: 'structured' },
description:
'Define the structure of your response data. Use <variable.name> in field names to reference workflow variables.',
},
{
id: 'data',
title: 'Response Data',
type: 'code',
placeholder: '{\n "message": "Hello world",\n "userId": "<variable.userId>"\n}',
language: 'json',
condition: { field: 'dataMode', value: 'json' },
description:
'Data that will be sent as the response body on API calls. Use <variable.name> to reference workflow variables.',
wandConfig: {
enabled: true,
maintainHistory: true,
prompt: `You are an expert JSON programmer.
Generate ONLY the raw JSON object based on the user's request.
The output MUST be a single, valid JSON object, starting with { and ending with }.
Current response: {context}
Do not include any explanations, markdown formatting, or other text outside the JSON object.
You have access to the following variables you can use to generate the JSON body:
- 'params' (object): Contains input parameters derived from the JSON schema. Access these directly using the parameter name wrapped in angle brackets, e.g., '<paramName>'. Do NOT use 'params.paramName'.
- 'environmentVariables' (object): Contains environment variables. Reference these using the double curly brace syntax: '{{ENV_VAR_NAME}}'. Do NOT use 'environmentVariables.VAR_NAME' or env.
Example:
{
"name": "<block.agent.response.content>",
"age": <block.function.output.age>,
"success": true
}`,
placeholder: 'Describe the API response structure you need...',
generationType: 'json-object',
},
},
{
id: 'status',
title: 'Status Code',
type: 'short-input',
placeholder: '200',
description: 'HTTP status code (default: 200)',
},
{
id: 'headers',
title: 'Response Headers',
type: 'table',
columns: ['Key', 'Value'],
description: 'Additional HTTP headers to include in the response',
},
],
tools: { access: [] },
inputs: {
dataMode: {
type: 'string',
description: 'Response data definition mode',
},
builderData: {
type: 'json',
description: 'Structured response data',
},
data: {
type: 'json',
description: 'JSON response body',
},
status: {
type: 'number',
description: 'HTTP status code',
},
headers: {
type: 'json',
description: 'Response headers',
},
},
outputs: {
// User's data fields are spread directly at root level
status: { type: 'number', description: 'HTTP status code' },
headers: { type: 'json', description: 'Response headers' },
},
}

View File
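The flat output contract the new `workflow_response` block describes — user-defined fields spread at the root next to `status` and `headers`, so `<Response.fieldName>` resolves without a wrapper — can be sketched as a minimal merge; field names here are illustrative:

```typescript
// Sketch of the flat-output merge the workflow_response handler performs.
// Unlike the old Response block, there is no `response`/`data` wrapper:
// user fields land at the root alongside status and headers.
function buildFlatOutput(
  data: Record<string, unknown>,
  status: number,
  headers: Record<string, string>
): Record<string, unknown> {
  return { ...data, status, headers }
}
```

One consequence of the spread: a user field named `status` or `headers` would be shadowed by the block's own keys, since they are merged last.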

@@ -131,6 +131,7 @@ import { WikipediaBlock } from '@/blocks/blocks/wikipedia'
import { WordPressBlock } from '@/blocks/blocks/wordpress'
import { WorkflowBlock } from '@/blocks/blocks/workflow'
import { WorkflowInputBlock } from '@/blocks/blocks/workflow_input'
import { WorkflowResponseBlock } from '@/blocks/blocks/workflow_response'
import { XBlock } from '@/blocks/blocks/x'
import { YouTubeBlock } from '@/blocks/blocks/youtube'
import { ZendeskBlock } from '@/blocks/blocks/zendesk'
@@ -231,6 +232,7 @@ export const registry: Record<string, BlockConfig> = {
reddit: RedditBlock,
resend: ResendBlock,
response: ResponseBlock,
workflow_response: WorkflowResponseBlock,
rss: RssBlock,
router: RouterBlock,
s3: S3Block,

View File

@@ -3,5 +3,5 @@
"name": "Emir Karabeg",
"url": "https://x.com/karabegemir",
"xHandle": "karabegemir",
"avatarUrl": "/studio/authors/emir.png"
"avatarUrl": "/studio/authors/emir.jpg"
}

View File

@@ -3,5 +3,5 @@
"name": "Siddharth",
"url": "https://x.com/sidganesan",
"xHandle": "sidganesan",
"avatarUrl": "/studio/authors/sid.png"
"avatarUrl": "/studio/authors/sid.jpg"
}

View File

@@ -3,5 +3,5 @@
"name": "Waleed Latif",
"url": "https://x.com/typingwala",
"xHandle": "typingwala",
"avatarUrl": "/studio/authors/waleed.png"
"avatarUrl": "/studio/authors/waleed.jpg"
}

View File

@@ -18,7 +18,7 @@ featured: true
draft: false
---
![Sim team photo](/studio/series-a/team.png)
![Sim team photo](/studio/series-a/team.jpg)
## Why we're excited

View File

@@ -15,6 +15,7 @@ export enum BlockType {
VARIABLES = 'variables',
RESPONSE = 'response',
WORKFLOW_RESPONSE = 'workflow_response',
HUMAN_IN_THE_LOOP = 'human_in_the_loop',
WORKFLOW = 'workflow',
WORKFLOW_INPUT = 'workflow_input',

View File

@@ -17,27 +17,32 @@ vi.mock('@/lib/core/utils/request', () => ({
generateRequestId: vi.fn(() => 'test-request-id'),
}))
vi.mock('@/lib/execution/isolated-vm', () => ({
executeInIsolatedVM: vi.fn(),
vi.mock('@/tools', () => ({
executeTool: vi.fn(),
}))
import { executeInIsolatedVM } from '@/lib/execution/isolated-vm'
import { executeTool } from '@/tools'
const mockExecuteInIsolatedVM = executeInIsolatedVM as ReturnType<typeof vi.fn>
const mockExecuteTool = executeTool as ReturnType<typeof vi.fn>
function simulateIsolatedVMExecution(
code: string,
contextVariables: Record<string, unknown>
): { result: unknown; stdout: string; error?: { message: string; name: string } } {
/**
* Simulates what the function_execute tool does when evaluating condition code
*/
function simulateConditionExecution(code: string): {
success: boolean
output?: { result: unknown }
error?: string
} {
try {
const fn = new Function(...Object.keys(contextVariables), code)
const result = fn(...Object.values(contextVariables))
return { result, stdout: '' }
// The code is in format: "const context = {...};\nreturn Boolean(...)"
// We need to execute it and return the result
const fn = new Function(code)
const result = fn()
return { success: true, output: { result } }
} catch (error: any) {
return {
result: null,
stdout: '',
error: { message: error.message, name: error.name || 'Error' },
success: false,
error: error.message,
}
}
}
@@ -143,8 +148,8 @@ describe('ConditionBlockHandler', () => {
vi.clearAllMocks()
mockExecuteInIsolatedVM.mockImplementation(async ({ code, contextVariables }) => {
return simulateIsolatedVMExecution(code, contextVariables)
mockExecuteTool.mockImplementation(async (_toolId: string, params: { code: string }) => {
return simulateConditionExecution(params.code)
})
})

View File

@@ -1,10 +1,9 @@
import { generateRequestId } from '@/lib/core/utils/request'
import { executeInIsolatedVM } from '@/lib/execution/isolated-vm'
import { createLogger } from '@/lib/logs/console/logger'
import type { BlockOutput } from '@/blocks/types'
import { BlockType, CONDITION, DEFAULTS, EDGE } from '@/executor/constants'
import type { BlockHandler, ExecutionContext } from '@/executor/types'
import type { SerializedBlock } from '@/serializer/types'
import { executeTool } from '@/tools'
const logger = createLogger('ConditionBlockHandler')
@@ -39,32 +38,38 @@ export async function evaluateConditionExpression(
}
try {
const requestId = generateRequestId()
const contextSetup = `const context = ${JSON.stringify(evalContext)};`
const code = `${contextSetup}\nreturn Boolean(${resolvedConditionValue})`
const code = `return Boolean(${resolvedConditionValue})`
const result = await executeTool(
'function_execute',
{
code,
timeout: CONDITION_TIMEOUT_MS,
envVars: {},
_context: {
workflowId: ctx.workflowId,
workspaceId: ctx.workspaceId,
},
},
false,
false,
ctx
)
const result = await executeInIsolatedVM({
code,
params: {},
envVars: {},
contextVariables: { context: evalContext },
timeoutMs: CONDITION_TIMEOUT_MS,
requestId,
})
if (result.error) {
logger.error(`Failed to evaluate condition: ${result.error.message}`, {
if (!result.success) {
logger.error(`Failed to evaluate condition: ${result.error}`, {
originalCondition: conditionExpression,
resolvedCondition: resolvedConditionValue,
evalContext,
error: result.error,
})
throw new Error(
`Evaluation error in condition: ${result.error.message}. (Resolved: ${resolvedConditionValue})`
`Evaluation error in condition: ${result.error}. (Resolved: ${resolvedConditionValue})`
)
}
return Boolean(result.result)
return Boolean(result.output?.result)
} catch (evalError: any) {
logger.error(`Failed to evaluate condition: ${evalError.message}`, {
originalCondition: conditionExpression,

View File
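The condition handler diff above changes the evaluated code from `const context = {...}; return Boolean(expr)` to a bare `return Boolean(expr)` routed through the `function_execute` tool. The test file's `simulateConditionExecution` captures the core mechanic, which can be sketched standalone; this bypasses the tool layer and is only a simulation of the evaluation, as in the tests:

```typescript
// Minimal sketch of condition evaluation as simulated in the updated tests:
// wrap the resolved expression in `return Boolean(...)` and execute it.
// The production path instead sends this code through the function_execute tool.
function evaluateCondition(expression: string): boolean {
  const code = `return Boolean(${expression})`
  // new Function compiles the body in its own scope, mirroring the simulation.
  const fn = new Function(code)
  return fn() as boolean
}
```

Because the expression is interpolated directly into the function body, the production path runs it in a sandboxed tool execution rather than the host process.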

@@ -11,6 +11,7 @@ import { TriggerBlockHandler } from '@/executor/handlers/trigger/trigger-handler
import { VariablesBlockHandler } from '@/executor/handlers/variables/variables-handler'
import { WaitBlockHandler } from '@/executor/handlers/wait/wait-handler'
import { WorkflowBlockHandler } from '@/executor/handlers/workflow/workflow-handler'
import { WorkflowResponseBlockHandler } from '@/executor/handlers/workflow-response/workflow-response-handler'
export {
AgentBlockHandler,
@@ -19,6 +20,7 @@ export {
EvaluatorBlockHandler,
FunctionBlockHandler,
GenericBlockHandler,
WorkflowResponseBlockHandler,
ResponseBlockHandler,
HumanInTheLoopBlockHandler,
RouterBlockHandler,

View File

@@ -18,6 +18,7 @@ import { TriggerBlockHandler } from '@/executor/handlers/trigger/trigger-handler
import { VariablesBlockHandler } from '@/executor/handlers/variables/variables-handler'
import { WaitBlockHandler } from '@/executor/handlers/wait/wait-handler'
import { WorkflowBlockHandler } from '@/executor/handlers/workflow/workflow-handler'
import { WorkflowResponseBlockHandler } from '@/executor/handlers/workflow-response/workflow-response-handler'
import type { BlockHandler } from '@/executor/types'
/**
@@ -34,6 +35,7 @@ export function createBlockHandlers(): BlockHandler[] {
new ConditionBlockHandler(),
new RouterBlockHandler(),
new ResponseBlockHandler(),
new WorkflowResponseBlockHandler(),
new HumanInTheLoopBlockHandler(),
new AgentBlockHandler(),
new VariablesBlockHandler(),

View File

@@ -0,0 +1,258 @@
import { createLogger } from '@/lib/logs/console/logger'
import type { BlockOutput } from '@/blocks/types'
import { BlockType, HTTP } from '@/executor/constants'
import type { BlockHandler, ExecutionContext } from '@/executor/types'
import type { SerializedBlock } from '@/serializer/types'
const logger = createLogger('WorkflowResponseBlockHandler')
interface JSONProperty {
id: string
name: string
type: 'string' | 'number' | 'boolean' | 'object' | 'array' | 'files'
value: any
collapsed?: boolean
}
/**
* Handler for the WorkflowResponse block.
* Returns a flat output structure where data fields are directly accessible.
* Reference: <Response.fieldName> directly (no wrapper like old Response block)
* API output is exactly what the user defines - no metadata, no wrappers.
*/
export class WorkflowResponseBlockHandler implements BlockHandler {
canHandle(block: SerializedBlock): boolean {
return block.metadata?.id === BlockType.WORKFLOW_RESPONSE
}
async execute(
ctx: ExecutionContext,
block: SerializedBlock,
inputs: Record<string, any>
): Promise<BlockOutput> {
logger.info(`Executing workflow response block: ${block.id}`)
try {
const responseData = this.parseResponseData(inputs)
const statusCode = this.parseStatus(inputs.status)
const responseHeaders = this.parseHeaders(inputs.headers)
logger.info('Workflow Response prepared', {
status: statusCode,
dataKeys: responseData && typeof responseData === 'object' ? Object.keys(responseData) : [],
headerKeys: Object.keys(responseHeaders),
})
// Flat output structure - no response or data wrapper
// Access as <Response.fieldName>, <Response.status>, <Response.headers>
return {
...responseData,
status: statusCode,
headers: responseHeaders,
}
} catch (error: any) {
logger.error('Workflow Response block execution failed:', error)
return {
error: {
type: 'Workflow Response block execution failed',
message: error.message || 'Unknown error',
},
status: HTTP.STATUS.SERVER_ERROR,
headers: { 'Content-Type': HTTP.CONTENT_TYPE.JSON },
}
}
}
private parseResponseData(inputs: Record<string, any>): any {
const dataMode = inputs.dataMode || 'structured'
if (dataMode === 'json' && inputs.data) {
if (typeof inputs.data === 'string') {
try {
return JSON.parse(inputs.data)
} catch (error) {
logger.warn('Failed to parse JSON data, returning as string:', error)
return inputs.data
}
} else if (typeof inputs.data === 'object' && inputs.data !== null) {
return inputs.data
}
return inputs.data
}
if (dataMode === 'structured' && inputs.builderData) {
const convertedData = this.convertBuilderDataToJson(inputs.builderData)
return this.parseObjectStrings(convertedData)
}
return inputs.data || {}
}
private convertBuilderDataToJson(builderData: JSONProperty[]): any {
if (!Array.isArray(builderData)) {
return {}
}
const result: any = {}
for (const prop of builderData) {
if (!prop.name || !prop.name.trim()) {
continue
}
const value = this.convertPropertyValue(prop)
result[prop.name] = value
}
return result
}
private convertPropertyValue(prop: JSONProperty): any {
switch (prop.type) {
case 'object':
return this.convertObjectValue(prop.value)
case 'array':
return this.convertArrayValue(prop.value)
case 'number':
return this.convertNumberValue(prop.value)
case 'boolean':
return this.convertBooleanValue(prop.value)
case 'files':
return prop.value
default:
return prop.value
}
}
private convertObjectValue(value: any): any {
if (Array.isArray(value)) {
return this.convertBuilderDataToJson(value)
}
if (typeof value === 'string' && !this.isVariableReference(value)) {
return this.tryParseJson(value, value)
}
return value
}
private convertArrayValue(value: any): any {
if (Array.isArray(value)) {
return value.map((item: any) => this.convertArrayItem(item))
}
if (typeof value === 'string' && !this.isVariableReference(value)) {
const parsed = this.tryParseJson(value, value)
if (Array.isArray(parsed)) {
return parsed
}
return value
}
return value
}
private convertArrayItem(item: any): any {
if (item === null || typeof item !== 'object' || !item.type) {
return item
}
if (item.type === 'object' && Array.isArray(item.value)) {
return this.convertBuilderDataToJson(item.value)
}
if (item.type === 'array' && Array.isArray(item.value)) {
return item.value.map((subItem: any) => {
if (typeof subItem === 'object' && subItem.type) {
return subItem.value
}
return subItem
})
}
return item.value
}
private convertNumberValue(value: any): any {
if (this.isVariableReference(value)) {
return value
}
const numValue = Number(value)
if (Number.isNaN(numValue)) {
return value
}
return numValue
}
private convertBooleanValue(value: any): any {
if (this.isVariableReference(value)) {
return value
}
return value === 'true' || value === true
}
private tryParseJson(jsonString: string, fallback: any): any {
try {
return JSON.parse(jsonString)
} catch {
return fallback
}
}
private isVariableReference(value: any): boolean {
return typeof value === 'string' && value.trim().startsWith('<') && value.trim().includes('>')
}
private parseObjectStrings(data: any): any {
if (typeof data === 'string') {
try {
const parsed = JSON.parse(data)
if (typeof parsed === 'object' && parsed !== null) {
return this.parseObjectStrings(parsed)
}
return parsed
} catch {
return data
}
} else if (Array.isArray(data)) {
return data.map((item) => this.parseObjectStrings(item))
} else if (typeof data === 'object' && data !== null) {
const result: any = {}
for (const [key, value] of Object.entries(data)) {
result[key] = this.parseObjectStrings(value)
}
return result
}
return data
}
private parseStatus(status?: string): number {
if (!status) return HTTP.STATUS.OK
const parsed = Number(status)
if (Number.isNaN(parsed) || parsed < 100 || parsed > 599) {
return HTTP.STATUS.OK
}
return parsed
}
private parseHeaders(
headers?: {
id: string
cells: { Key: string; Value: string }
}[]
): Record<string, string> {
const defaultHeaders = { 'Content-Type': HTTP.CONTENT_TYPE.JSON }
if (!headers) return defaultHeaders
const headerObj = headers.reduce((acc: Record<string, string>, header) => {
if (header?.cells?.Key && header?.cells?.Value) {
acc[header.cells.Key] = header.cells.Value
}
return acc
}, {})
return { ...defaultHeaders, ...headerObj }
}
}
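The structured-data path above can be sketched in isolation. This is a hedged, minimal reimplementation (the names `BuilderRow` and `toJson` are illustrative, not the real API), showing how each named row becomes one JSON key with string values coerced per the declared type, mirroring `convertBuilderDataToJson` and `convertPropertyValue`:

```typescript
// Illustrative sketch only - mirrors the conversion logic above, not the real API.
interface BuilderRow {
  name: string
  type: 'string' | 'number' | 'boolean'
  value: any
}

function toJson(rows: BuilderRow[]): Record<string, any> {
  const out: Record<string, any> = {}
  for (const row of rows) {
    if (!row.name?.trim()) continue // unnamed rows are skipped, as in convertBuilderDataToJson
    if (row.type === 'number') {
      const n = Number(row.value)
      out[row.name] = Number.isNaN(n) ? row.value : n // non-numeric strings pass through
    } else if (row.type === 'boolean') {
      out[row.name] = row.value === 'true' || row.value === true
    } else {
      out[row.name] = row.value
    }
  }
  return out
}

const payload = toJson([
  { name: 'count', type: 'number', value: '3' },
  { name: 'ok', type: 'boolean', value: 'true' },
  { name: ' ', type: 'string', value: 'dropped' },
])
// payload: { count: 3, ok: true }
```

The result is exactly the flat object the handler spreads into its output, so fields are referenced as `<Response.count>` with no wrapper.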

View File

@@ -4,6 +4,7 @@ interface SlackAccount {
id: string
accountId: string
providerId: string
displayName?: string
}
interface UseSlackAccountsResult {

View File

@@ -14,6 +14,7 @@ interface UseWebhookManagementProps {
blockId: string
triggerId?: string
isPreview?: boolean
useWebhookUrl?: boolean
}
interface WebhookManagementState {
@@ -90,6 +91,7 @@ export function useWebhookManagement({
blockId,
triggerId,
isPreview = false,
useWebhookUrl = false,
}: UseWebhookManagementProps): WebhookManagementState {
const params = useParams()
const workflowId = params.workflowId as string
@@ -204,9 +206,10 @@ export function useWebhookManagement({
})
}
}
loadWebhookOrGenerateUrl()
}, [isPreview, triggerId, workflowId, blockId])
if (useWebhookUrl) {
loadWebhookOrGenerateUrl()
}
}, [isPreview, triggerId, workflowId, blockId, useWebhookUrl])
const createWebhook = async (
effectiveTriggerId: string | undefined,

View File

@@ -1,5 +1,7 @@
'use client'
import { useState } from 'react'
import { Check, Copy } from 'lucide-react'
import { Code } from '@/components/emcn'
interface CodeBlockProps {
@@ -8,5 +10,36 @@ interface CodeBlockProps {
}
export function CodeBlock({ code, language }: CodeBlockProps) {
return <Code.Viewer code={code} showGutter={true} language={language} />
const [copied, setCopied] = useState(false)
const handleCopy = () => {
navigator.clipboard.writeText(code)
setCopied(true)
setTimeout(() => setCopied(false), 2000)
}
return (
<div className='dark w-full overflow-hidden rounded-md border border-[#2a2a2a] bg-[#1F1F1F] text-sm'>
<div className='flex items-center justify-between border-[#2a2a2a] border-b px-4 py-1.5'>
<span className='text-[#A3A3A3] text-xs'>{language}</span>
<button
onClick={handleCopy}
className='text-[#A3A3A3] transition-colors hover:text-gray-300'
title='Copy code'
>
{copied ? (
<Check className='h-3 w-3' strokeWidth={2} />
) : (
<Copy className='h-3 w-3' strokeWidth={2} />
)}
</button>
</div>
<Code.Viewer
code={code}
showGutter
language={language}
className='[&_pre]:!pb-0 m-0 rounded-none border-0 bg-transparent'
/>
</div>
)
}

View File

@@ -67,7 +67,7 @@ export const mdxComponents: MDXRemoteProps['components'] = {
a: (props: any) => {
const isAnchorLink = props.className?.includes('anchor')
if (isAnchorLink) {
return <a {...props} />
return <a {...props} className={clsx('text-inherit no-underline', props.className)} />
}
return (
<a
@@ -113,7 +113,7 @@ export const mdxComponents: MDXRemoteProps['components'] = {
const mappedLanguage = languageMap[language.toLowerCase()] || 'javascript'
return (
<div className='my-6'>
<div className='not-prose my-6'>
<CodeBlock
code={typeof codeContent === 'string' ? codeContent.trim() : String(codeContent)}
language={mappedLanguage}
@@ -129,9 +129,10 @@ export const mdxComponents: MDXRemoteProps['components'] = {
<code
{...props}
className={clsx(
'rounded bg-gray-100 px-1.5 py-0.5 font-mono text-[0.9em] text-red-600',
'rounded bg-gray-100 px-1.5 py-0.5 font-mono font-normal text-[0.9em] text-red-600',
props.className
)}
style={{ fontWeight: 400 }}
/>
)
}

View File

@@ -38,7 +38,9 @@ function slugify(text: string): string {
}
async function scanFrontmatters(): Promise<BlogMeta[]> {
if (cachedMeta) return cachedMeta
if (cachedMeta) {
return cachedMeta
}
await ensureContentDirs()
const entries = await fs.readdir(BLOG_DIR).catch(() => [])
const authorsMap = await loadAuthors()

View File

@@ -50,6 +50,8 @@ type SkippedItemType =
| 'invalid_block_type'
| 'invalid_edge_target'
| 'invalid_edge_source'
| 'invalid_source_handle'
| 'invalid_target_handle'
| 'invalid_subblock_field'
| 'missing_required_params'
| 'invalid_subflow_parent'
@@ -734,8 +736,279 @@ function normalizeResponseFormat(value: any): string {
}
}
interface EdgeHandleValidationResult {
valid: boolean
error?: string
}
/**
 * Validates that a source handle is valid for the given block type.
 * The bare 'error' handle is accepted for every block type.
 */
function validateSourceHandleForBlock(
sourceHandle: string,
sourceBlockType: string,
sourceBlock: any
): EdgeHandleValidationResult {
if (sourceHandle === 'error') {
return { valid: true }
}
switch (sourceBlockType) {
case 'loop':
if (sourceHandle === 'loop-start-source' || sourceHandle === 'loop-end-source') {
return { valid: true }
}
return {
valid: false,
error: `Invalid source handle "${sourceHandle}" for loop block. Valid handles: loop-start-source, loop-end-source, error`,
}
case 'parallel':
if (sourceHandle === 'parallel-start-source' || sourceHandle === 'parallel-end-source') {
return { valid: true }
}
return {
valid: false,
error: `Invalid source handle "${sourceHandle}" for parallel block. Valid handles: parallel-start-source, parallel-end-source, error`,
}
case 'condition': {
if (!sourceHandle.startsWith('condition-')) {
return {
valid: false,
error: `Invalid source handle "${sourceHandle}" for condition block. Must start with "condition-"`,
}
}
const conditionsValue = sourceBlock?.subBlocks?.conditions?.value
if (!conditionsValue) {
return {
valid: false,
error: `Invalid condition handle "${sourceHandle}" - no conditions defined`,
}
}
return validateConditionHandle(sourceHandle, sourceBlock.id, conditionsValue)
}
case 'router':
if (sourceHandle === 'source' || sourceHandle.startsWith('router-')) {
return { valid: true }
}
return {
valid: false,
error: `Invalid source handle "${sourceHandle}" for router block. Valid handles: source, router-{targetId}, error`,
}
default:
if (sourceHandle === 'source') {
return { valid: true }
}
return {
valid: false,
error: `Invalid source handle "${sourceHandle}" for ${sourceBlockType} block. Valid handles: source, error`,
}
}
}
/**
* Validates condition handle references a valid condition in the block.
* Accepts both internal IDs (condition-blockId-if) and semantic keys (condition-blockId-else-if)
*/
function validateConditionHandle(
sourceHandle: string,
blockId: string,
conditionsValue: string | any[]
): EdgeHandleValidationResult {
let conditions: any[]
if (typeof conditionsValue === 'string') {
try {
conditions = JSON.parse(conditionsValue)
} catch {
return {
valid: false,
error: `Cannot validate condition handle "${sourceHandle}" - conditions is not valid JSON`,
}
}
} else if (Array.isArray(conditionsValue)) {
conditions = conditionsValue
} else {
return {
valid: false,
error: `Cannot validate condition handle "${sourceHandle}" - conditions is not an array`,
}
}
if (!Array.isArray(conditions) || conditions.length === 0) {
return {
valid: false,
error: `Invalid condition handle "${sourceHandle}" - no conditions defined`,
}
}
const validHandles = new Set<string>()
const semanticPrefix = `condition-${blockId}-`
let elseIfCount = 0
for (const condition of conditions) {
if (condition.id) {
validHandles.add(`condition-${condition.id}`)
}
const title = condition.title?.toLowerCase()
if (title === 'if') {
validHandles.add(`${semanticPrefix}if`)
} else if (title === 'else if') {
elseIfCount++
validHandles.add(
elseIfCount === 1 ? `${semanticPrefix}else-if` : `${semanticPrefix}else-if-${elseIfCount}`
)
} else if (title === 'else') {
validHandles.add(`${semanticPrefix}else`)
}
}
if (validHandles.has(sourceHandle)) {
return { valid: true }
}
const validOptions = Array.from(validHandles).slice(0, 5)
const moreCount = validHandles.size - validOptions.length
let validOptionsStr = validOptions.join(', ')
if (moreCount > 0) {
validOptionsStr += `, ... and ${moreCount} more`
}
return {
valid: false,
error: `Invalid condition handle "${sourceHandle}". Valid handles: ${validOptionsStr}`,
}
}
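The semantic naming scheme that `validateConditionHandle` accepts can be sketched on its own. This is an illustrative extraction of the title-to-handle mapping above (`semanticHandles` is a hypothetical helper, not part of the source):

```typescript
// Sketch of the semantic handle derivation: condition titles map to
// condition-{blockId}-if / -else-if / -else-if-N / -else.
function semanticHandles(blockId: string, titles: string[]): string[] {
  const prefix = `condition-${blockId}-`
  const handles: string[] = []
  let elseIfCount = 0
  for (const raw of titles) {
    const title = raw.toLowerCase()
    if (title === 'if') {
      handles.push(`${prefix}if`)
    } else if (title === 'else if') {
      elseIfCount++
      // first "else if" has no numeric suffix; later ones are numbered from 2
      handles.push(elseIfCount === 1 ? `${prefix}else-if` : `${prefix}else-if-${elseIfCount}`)
    } else if (title === 'else') {
      handles.push(`${prefix}else`)
    }
  }
  return handles
}

const handles = semanticHandles('b1', ['if', 'else if', 'else if', 'else'])
// handles: ['condition-b1-if', 'condition-b1-else-if',
//           'condition-b1-else-if-2', 'condition-b1-else']
```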
/**
 * Validates that the target handle is the literal 'target'
 */
function validateTargetHandle(targetHandle: string): EdgeHandleValidationResult {
if (targetHandle === 'target') {
return { valid: true }
}
return {
valid: false,
error: `Invalid target handle "${targetHandle}". Expected "target"`,
}
}
/**
* Creates a validated edge between two blocks.
* Returns true if edge was created, false if skipped due to validation errors.
*/
function createValidatedEdge(
modifiedState: any,
sourceBlockId: string,
targetBlockId: string,
sourceHandle: string,
targetHandle: string,
operationType: string,
logger: ReturnType<typeof createLogger>,
skippedItems?: SkippedItem[]
): boolean {
if (!modifiedState.blocks[targetBlockId]) {
logger.warn(`Target block "${targetBlockId}" not found. Edge skipped.`, {
sourceBlockId,
targetBlockId,
sourceHandle,
})
skippedItems?.push({
type: 'invalid_edge_target',
operationType,
blockId: sourceBlockId,
reason: `Edge from "${sourceBlockId}" to "${targetBlockId}" skipped - target block does not exist`,
details: { sourceHandle, targetHandle, targetId: targetBlockId },
})
return false
}
const sourceBlock = modifiedState.blocks[sourceBlockId]
if (!sourceBlock) {
logger.warn(`Source block "${sourceBlockId}" not found. Edge skipped.`, {
sourceBlockId,
targetBlockId,
})
skippedItems?.push({
type: 'invalid_edge_source',
operationType,
blockId: sourceBlockId,
reason: `Edge from "${sourceBlockId}" to "${targetBlockId}" skipped - source block does not exist`,
details: { sourceHandle, targetHandle, targetId: targetBlockId },
})
return false
}
const sourceBlockType = sourceBlock.type
if (!sourceBlockType) {
logger.warn(`Source block "${sourceBlockId}" has no type. Edge skipped.`, {
sourceBlockId,
targetBlockId,
})
skippedItems?.push({
type: 'invalid_edge_source',
operationType,
blockId: sourceBlockId,
reason: `Edge from "${sourceBlockId}" to "${targetBlockId}" skipped - source block has no type`,
details: { sourceHandle, targetHandle, targetId: targetBlockId },
})
return false
}
const sourceValidation = validateSourceHandleForBlock(sourceHandle, sourceBlockType, sourceBlock)
if (!sourceValidation.valid) {
logger.warn(`Invalid source handle. Edge skipped.`, {
sourceBlockId,
targetBlockId,
sourceHandle,
error: sourceValidation.error,
})
skippedItems?.push({
type: 'invalid_source_handle',
operationType,
blockId: sourceBlockId,
reason: sourceValidation.error || `Invalid source handle "${sourceHandle}"`,
details: { sourceHandle, targetHandle, targetId: targetBlockId },
})
return false
}
const targetValidation = validateTargetHandle(targetHandle)
if (!targetValidation.valid) {
logger.warn(`Invalid target handle. Edge skipped.`, {
sourceBlockId,
targetBlockId,
targetHandle,
error: targetValidation.error,
})
skippedItems?.push({
type: 'invalid_target_handle',
operationType,
blockId: sourceBlockId,
reason: targetValidation.error || `Invalid target handle "${targetHandle}"`,
details: { sourceHandle, targetHandle, targetId: targetBlockId },
})
return false
}
modifiedState.edges.push({
id: crypto.randomUUID(),
source: sourceBlockId,
sourceHandle,
target: targetBlockId,
targetHandle,
type: 'default',
})
return true
}
/**
* Adds connections as edges for a block
*/
function addConnectionsAsEdges(
modifiedState: any,
@@ -747,34 +1020,16 @@ function addConnectionsAsEdges(
Object.entries(connections).forEach(([sourceHandle, targets]) => {
const targetArray = Array.isArray(targets) ? targets : [targets]
targetArray.forEach((targetId: string) => {
// Validate target block exists - skip edge if target doesn't exist
if (!modifiedState.blocks[targetId]) {
logger.warn(
`Target block "${targetId}" not found when creating connection from "${blockId}". ` +
`Edge skipped.`,
{
sourceBlockId: blockId,
targetBlockId: targetId,
existingBlocks: Object.keys(modifiedState.blocks),
}
)
skippedItems?.push({
type: 'invalid_edge_target',
operationType: 'add_edge',
blockId: blockId,
reason: `Edge from "${blockId}" to "${targetId}" skipped - target block does not exist`,
details: { sourceHandle, targetId },
})
return
}
modifiedState.edges.push({
id: crypto.randomUUID(),
source: blockId,
createValidatedEdge(
modifiedState,
blockId,
targetId,
sourceHandle,
target: targetId,
targetHandle: 'target',
type: 'default',
})
'target',
'add_edge',
logger,
skippedItems
)
})
})
}
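The fan-out in `addConnectionsAsEdges` can be illustrated without the validation machinery. A minimal sketch, with validation via `createValidatedEdge` deliberately omitted: each handle maps to one target or many, and every (handle, target) pair becomes a candidate edge:

```typescript
// Illustrative fan-out only - real code routes each pair through createValidatedEdge.
const connections: Record<string, string | string[]> = {
  source: ['block-2', 'block-3'],
  error: 'block-4',
}

const candidateEdges: Array<{ sourceHandle: string; target: string }> = []
for (const [sourceHandle, targets] of Object.entries(connections)) {
  const targetArray = Array.isArray(targets) ? targets : [targets] // normalize to array
  for (const target of targetArray) {
    candidateEdges.push({ sourceHandle, target })
  }
}
// candidateEdges: source→block-2, source→block-3, error→block-4
```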
@@ -1257,67 +1512,44 @@ function applyOperationsToWorkflowState(
// Handle connections update (convert to edges)
if (params?.connections) {
// Remove existing edges from this block
modifiedState.edges = modifiedState.edges.filter((edge: any) => edge.source !== block_id)
// Add new edges based on connections
Object.entries(params.connections).forEach(([connectionType, targets]) => {
if (targets === null) return
// Map semantic connection names to actual React Flow handle IDs
// 'success' in YAML/connections maps to 'source' handle in React Flow
const mapConnectionTypeToHandle = (type: string): string => {
if (type === 'success') return 'source'
if (type === 'error') return 'error'
// Conditions and other types pass through as-is
return type
}
const actualSourceHandle = mapConnectionTypeToHandle(connectionType)
const sourceHandle = mapConnectionTypeToHandle(connectionType)
const addEdge = (targetBlock: string, targetHandle?: string) => {
// Validate target block exists - skip edge if target doesn't exist
if (!modifiedState.blocks[targetBlock]) {
logger.warn(
`Target block "${targetBlock}" not found when creating connection from "${block_id}". ` +
`Edge skipped.`,
{
sourceBlockId: block_id,
targetBlockId: targetBlock,
existingBlocks: Object.keys(modifiedState.blocks),
}
)
logSkippedItem(skippedItems, {
type: 'invalid_edge_target',
operationType: 'edit',
blockId: block_id,
reason: `Edge from "${block_id}" to "${targetBlock}" skipped - target block does not exist`,
details: { sourceHandle: actualSourceHandle, targetId: targetBlock },
})
return
}
modifiedState.edges.push({
id: crypto.randomUUID(),
source: block_id,
sourceHandle: actualSourceHandle,
target: targetBlock,
targetHandle: targetHandle || 'target',
type: 'default',
})
const addEdgeForTarget = (targetBlock: string, targetHandle?: string) => {
createValidatedEdge(
modifiedState,
block_id,
targetBlock,
sourceHandle,
targetHandle || 'target',
'edit',
logger,
skippedItems
)
}
if (typeof targets === 'string') {
addEdge(targets)
addEdgeForTarget(targets)
} else if (Array.isArray(targets)) {
targets.forEach((target: any) => {
if (typeof target === 'string') {
addEdge(target)
addEdgeForTarget(target)
} else if (target?.block) {
addEdge(target.block, target.handle)
addEdgeForTarget(target.block, target.handle)
}
})
} else if (typeof targets === 'object' && (targets as any)?.block) {
addEdge((targets as any).block, (targets as any).handle)
addEdgeForTarget((targets as any).block, (targets as any).handle)
}
})
}

View File

@@ -37,8 +37,28 @@ export const isEmailVerificationEnabled = isTruthy(env.EMAIL_VERIFICATION_ENABLE
/**
* Is authentication disabled (for self-hosted deployments behind private networks)
* This flag is blocked when isHosted is true.
*/
export const isAuthDisabled = isTruthy(env.DISABLE_AUTH)
export const isAuthDisabled = isTruthy(env.DISABLE_AUTH) && !isHosted
if (isTruthy(env.DISABLE_AUTH)) {
import('@/lib/logs/console/logger')
.then(({ createLogger }) => {
const logger = createLogger('FeatureFlags')
if (isHosted) {
logger.error(
'DISABLE_AUTH is set but ignored on hosted environment. Authentication remains enabled for security.'
)
} else {
logger.warn(
'DISABLE_AUTH is enabled. Authentication is bypassed and all requests use an anonymous session. Only use this in trusted private networks.'
)
}
})
.catch(() => {
// Fallback during config compilation when logger is unavailable
})
}
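The `&& !isHosted` gate means the flag can never take effect on a hosted deployment, regardless of the environment variable. A hedged sketch of that interaction (the values `isTruthy` accepts here are an assumption, not the real implementation):

```typescript
// Assumption: isTruthy accepts 'true'/'1'; the real helper may differ.
const isTruthy = (v?: string) => ['true', '1'].includes((v ?? '').toLowerCase())

function authDisabled(disableAuthEnv: string | undefined, isHosted: boolean): boolean {
  return isTruthy(disableAuthEnv) && !isHosted
}

authDisabled('true', true)  // false: hosted deployments ignore the flag
authDisabled('true', false) // true: self-hosted deployments honor it
```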
/**
* Is user registration disabled

View File

@@ -31,20 +31,25 @@ vi.mock('crypto', () => ({
}),
}))
vi.mock('@/lib/core/config/env', () => ({
env: {
ENCRYPTION_KEY: '0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef',
OPENAI_API_KEY_1: 'test-openai-key-1',
OPENAI_API_KEY_2: 'test-openai-key-2',
OPENAI_API_KEY_3: 'test-openai-key-3',
ANTHROPIC_API_KEY_1: 'test-anthropic-key-1',
ANTHROPIC_API_KEY_2: 'test-anthropic-key-2',
ANTHROPIC_API_KEY_3: 'test-anthropic-key-3',
GEMINI_API_KEY_1: 'test-gemini-key-1',
GEMINI_API_KEY_2: 'test-gemini-key-2',
GEMINI_API_KEY_3: 'test-gemini-key-3',
},
}))
vi.mock('@/lib/core/config/env', async (importOriginal) => {
const actual = await importOriginal<typeof import('@/lib/core/config/env')>()
return {
...actual,
env: {
...actual.env,
ENCRYPTION_KEY: '0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef', // fake key for testing
OPENAI_API_KEY_1: 'test-openai-key-1', // fake key for testing
OPENAI_API_KEY_2: 'test-openai-key-2', // fake key for testing
OPENAI_API_KEY_3: 'test-openai-key-3', // fake key for testing
ANTHROPIC_API_KEY_1: 'test-anthropic-key-1', // fake key for testing
ANTHROPIC_API_KEY_2: 'test-anthropic-key-2', // fake key for testing
ANTHROPIC_API_KEY_3: 'test-anthropic-key-3', // fake key for testing
GEMINI_API_KEY_1: 'test-gemini-key-1', // fake key for testing
GEMINI_API_KEY_2: 'test-gemini-key-2', // fake key for testing
GEMINI_API_KEY_3: 'test-gemini-key-3', // fake key for testing
},
}
})
afterEach(() => {
vi.clearAllMocks()

View File

@@ -1,3 +1,22 @@
import { getBaseUrl } from './urls'
/**
* Checks if a URL is same-origin with the application's base URL.
* Used to prevent open redirect vulnerabilities.
*
* @param url - The URL to validate
* @returns True if the URL is same-origin, false otherwise (secure default)
*/
export function isSameOrigin(url: string): boolean {
try {
const targetUrl = new URL(url)
const appUrl = new URL(getBaseUrl())
return targetUrl.origin === appUrl.origin
} catch {
return false
}
}
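A usage sketch of the same-origin check, with `getBaseUrl` replaced by a hard-coded base URL since the real value comes from deployment config:

```typescript
// Standalone variant of isSameOrigin for illustration; baseUrl is an assumption.
function sameOrigin(url: string, baseUrl = 'https://app.example.com'): boolean {
  try {
    return new URL(url).origin === new URL(baseUrl).origin
  } catch {
    return false // malformed input fails closed
  }
}

sameOrigin('https://app.example.com/workflows/1') // true: same origin
sameOrigin('https://evil.example.net/login')      // false: cross-origin redirect blocked
sameOrigin('not-a-url')                           // false: unparsable input
```

Failing closed on parse errors is the secure default for redirect validation: an attacker-supplied string that cannot be parsed is never treated as trusted.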
/**
* Validates a name by removing any characters that could cause issues
* with variable references or node naming.

View File

@@ -81,8 +81,8 @@ export async function emitWorkflowExecutionCompleted(log: WorkflowExecutionLog):
)
for (const subscription of subscriptions) {
const levelMatches = subscription.levelFilter?.includes(log.level) ?? true
const triggerMatches = subscription.triggerFilter?.includes(log.trigger) ?? true
const levelMatches = subscription.levelFilter.includes(log.level)
const triggerMatches = subscription.triggerFilter.includes(log.trigger)
if (!levelMatches || !triggerMatches) {
logger.debug(`Skipping subscription ${subscription.id} due to filter mismatch`)
@@ -98,6 +98,7 @@ export async function emitWorkflowExecutionCompleted(log: WorkflowExecutionLog):
status: log.level === 'error' ? 'error' : 'success',
durationMs: log.totalDurationMs || 0,
cost: (log.cost as { total?: number })?.total || 0,
triggerFilter: subscription.triggerFilter,
}
const shouldAlert = await shouldTriggerAlert(alertConfig, context, subscription.lastAlertAt)

View File

@@ -51,8 +51,11 @@ export interface ExecutionEnvironment {
workspaceId: string
}
export const ALL_TRIGGER_TYPES = ['api', 'webhook', 'schedule', 'manual', 'chat'] as const
export type TriggerType = (typeof ALL_TRIGGER_TYPES)[number]
export interface ExecutionTrigger {
type: 'api' | 'webhook' | 'schedule' | 'manual' | 'chat' | string
type: TriggerType | string
source: string
data?: Record<string, unknown>
timestamp: string

View File

@@ -1,6 +1,6 @@
import { db } from '@sim/db'
import { workflowExecutionLogs } from '@sim/db/schema'
import { and, avg, count, desc, eq, gte } from 'drizzle-orm'
import { and, avg, count, desc, eq, gte, inArray } from 'drizzle-orm'
import { createLogger } from '@/lib/logs/console/logger'
const logger = createLogger('AlertRules')
@@ -135,25 +135,29 @@ export function isInCooldown(lastAlertAt: Date | null): boolean {
return new Date() < cooldownEnd
}
/**
* Context passed to alert check functions
*/
export interface AlertCheckContext {
workflowId: string
executionId: string
status: 'success' | 'error'
durationMs: number
cost: number
triggerFilter: string[]
}
/**
* Check if consecutive failures threshold is met
*/
async function checkConsecutiveFailures(workflowId: string, threshold: number): Promise<boolean> {
async function checkConsecutiveFailures(
workflowId: string,
threshold: number,
triggerFilter: string[]
): Promise<boolean> {
const recentLogs = await db
.select({ level: workflowExecutionLogs.level })
.from(workflowExecutionLogs)
.where(eq(workflowExecutionLogs.workflowId, workflowId))
.where(
and(
eq(workflowExecutionLogs.workflowId, workflowId),
inArray(workflowExecutionLogs.trigger, triggerFilter)
)
)
.orderBy(desc(workflowExecutionLogs.createdAt))
.limit(threshold)
@@ -162,13 +166,11 @@ async function checkConsecutiveFailures(workflowId: string, threshold: number):
return recentLogs.every((log) => log.level === 'error')
}
/**
* Check if failure rate exceeds threshold
*/
async function checkFailureRate(
workflowId: string,
ratePercent: number,
windowHours: number
windowHours: number,
triggerFilter: string[]
): Promise<boolean> {
const windowStart = new Date(Date.now() - windowHours * 60 * 60 * 1000)
@@ -181,7 +183,8 @@ async function checkFailureRate(
.where(
and(
eq(workflowExecutionLogs.workflowId, workflowId),
gte(workflowExecutionLogs.createdAt, windowStart)
gte(workflowExecutionLogs.createdAt, windowStart),
inArray(workflowExecutionLogs.trigger, triggerFilter)
)
)
.orderBy(workflowExecutionLogs.createdAt)
@@ -206,14 +209,12 @@ function checkLatencyThreshold(durationMs: number, thresholdMs: number): boolean
return durationMs > thresholdMs
}
/**
* Check if execution duration is significantly above average
*/
async function checkLatencySpike(
workflowId: string,
currentDurationMs: number,
spikePercent: number,
windowHours: number
windowHours: number,
triggerFilter: string[]
): Promise<boolean> {
const windowStart = new Date(Date.now() - windowHours * 60 * 60 * 1000)
@@ -226,7 +227,8 @@ async function checkLatencySpike(
.where(
and(
eq(workflowExecutionLogs.workflowId, workflowId),
gte(workflowExecutionLogs.createdAt, windowStart)
gte(workflowExecutionLogs.createdAt, windowStart),
inArray(workflowExecutionLogs.trigger, triggerFilter)
)
)
@@ -248,13 +250,11 @@ function checkCostThreshold(cost: number, thresholdDollars: number): boolean {
return cost > thresholdDollars
}
/**
* Check if error count exceeds threshold within window
*/
async function checkErrorCount(
workflowId: string,
threshold: number,
windowHours: number
windowHours: number,
triggerFilter: string[]
): Promise<boolean> {
const windowStart = new Date(Date.now() - windowHours * 60 * 60 * 1000)
@@ -265,7 +265,8 @@ async function checkErrorCount(
and(
eq(workflowExecutionLogs.workflowId, workflowId),
eq(workflowExecutionLogs.level, 'error'),
gte(workflowExecutionLogs.createdAt, windowStart)
gte(workflowExecutionLogs.createdAt, windowStart),
inArray(workflowExecutionLogs.trigger, triggerFilter)
)
)
@@ -273,9 +274,6 @@ async function checkErrorCount(
return errorCount >= threshold
}
/**
* Evaluates if an alert should be triggered based on the configuration
*/
export async function shouldTriggerAlert(
config: AlertConfig,
context: AlertCheckContext,
@@ -287,16 +285,21 @@ export async function shouldTriggerAlert(
}
const { rule } = config
const { workflowId, status, durationMs, cost } = context
const { workflowId, status, durationMs, cost, triggerFilter } = context
switch (rule) {
case 'consecutive_failures':
if (status !== 'error') return false
return checkConsecutiveFailures(workflowId, config.consecutiveFailures!)
return checkConsecutiveFailures(workflowId, config.consecutiveFailures!, triggerFilter)
case 'failure_rate':
if (status !== 'error') return false
return checkFailureRate(workflowId, config.failureRatePercent!, config.windowHours!)
return checkFailureRate(
workflowId,
config.failureRatePercent!,
config.windowHours!,
triggerFilter
)
case 'latency_threshold':
return checkLatencyThreshold(durationMs, config.durationThresholdMs!)
@@ -306,19 +309,24 @@ export async function shouldTriggerAlert(
workflowId,
durationMs,
config.latencySpikePercent!,
config.windowHours!
config.windowHours!,
triggerFilter
)
case 'cost_threshold':
return checkCostThreshold(cost, config.costThresholdDollars!)
case 'no_activity':
// no_activity alerts are handled by the hourly polling job, not execution events
return false
case 'error_count':
if (status !== 'error') return false
return checkErrorCount(workflowId, config.errorCountThreshold!, config.windowHours!)
return checkErrorCount(
workflowId,
config.errorCountThreshold!,
config.windowHours!,
triggerFilter
)
default:
logger.warn(`Unknown alert rule: ${rule}`)

View File

@@ -1,6 +1,7 @@
import { db } from '@sim/db'
import {
workflow,
workflowDeploymentVersion,
workflowExecutionLogs,
workspaceNotificationDelivery,
workspaceNotificationSubscription,
@@ -9,15 +10,81 @@ import { and, eq, gte, inArray, sql } from 'drizzle-orm'
import { v4 as uuidv4 } from 'uuid'
import { isTriggerDevEnabled } from '@/lib/core/config/feature-flags'
import { createLogger } from '@/lib/logs/console/logger'
import { TRIGGER_TYPES } from '@/lib/workflows/triggers/triggers'
import {
executeNotificationDelivery,
workspaceNotificationDeliveryTask,
} from '@/background/workspace-notification-delivery'
import type { WorkflowState } from '@/stores/workflows/workflow/types'
import type { AlertConfig } from './alert-rules'
import { isInCooldown } from './alert-rules'
const logger = createLogger('InactivityPolling')
const SCHEDULE_BLOCK_TYPES: string[] = [TRIGGER_TYPES.SCHEDULE]
const WEBHOOK_BLOCK_TYPES: string[] = [TRIGGER_TYPES.WEBHOOK, TRIGGER_TYPES.GENERIC_WEBHOOK]
function deploymentHasTriggerType(
deploymentState: Pick<WorkflowState, 'blocks'>,
triggerFilter: string[]
): boolean {
const blocks = deploymentState.blocks
if (!blocks) return false
const alwaysAvailable = ['api', 'manual', 'chat']
if (triggerFilter.some((t) => alwaysAvailable.includes(t))) {
return true
}
for (const block of Object.values(blocks)) {
if (triggerFilter.includes('schedule') && SCHEDULE_BLOCK_TYPES.includes(block.type)) {
return true
}
if (triggerFilter.includes('webhook')) {
if (WEBHOOK_BLOCK_TYPES.includes(block.type)) {
return true
}
if (block.triggerMode === true) {
return true
}
}
}
return false
}
async function getWorkflowsWithTriggerTypes(
workspaceId: string,
triggerFilter: string[]
): Promise<Set<string>> {
const workflowIds = new Set<string>()
const deployedWorkflows = await db
.select({
workflowId: workflow.id,
deploymentState: workflowDeploymentVersion.state,
})
.from(workflow)
.innerJoin(
workflowDeploymentVersion,
and(
eq(workflowDeploymentVersion.workflowId, workflow.id),
eq(workflowDeploymentVersion.isActive, true)
)
)
.where(and(eq(workflow.workspaceId, workspaceId), eq(workflow.isDeployed, true)))
for (const w of deployedWorkflows) {
const state = w.deploymentState as WorkflowState | null
if (state && deploymentHasTriggerType(state, triggerFilter)) {
workflowIds.add(w.workflowId)
}
}
return workflowIds
}
interface InactivityCheckResult {
subscriptionId: string
workflowId: string
@@ -25,9 +92,6 @@ interface InactivityCheckResult {
reason?: string
}
/**
* Checks a single workflow for inactivity and triggers notification if needed
*/
async function checkWorkflowInactivity(
subscription: typeof workspaceNotificationSubscription.$inferSelect,
workflowId: string,
@@ -141,9 +205,6 @@ async function checkWorkflowInactivity(
return result
}
/**
* Polls all active no_activity subscriptions and triggers alerts as needed
*/
export async function pollInactivityAlerts(): Promise<{
total: number
triggered: number
@@ -179,19 +240,30 @@ export async function pollInactivityAlerts(): Promise<{
continue
}
const triggerFilter = subscription.triggerFilter as string[]
if (!triggerFilter || triggerFilter.length === 0) {
logger.warn(`Subscription ${subscription.id} has no trigger filter, skipping`)
continue
}
const eligibleWorkflowIds = await getWorkflowsWithTriggerTypes(
subscription.workspaceId,
triggerFilter
)
let workflowIds: string[] = []
if (subscription.allWorkflows) {
const workflows = await db
.select({ id: workflow.id })
.from(workflow)
.where(eq(workflow.workspaceId, subscription.workspaceId))
workflowIds = workflows.map((w) => w.id)
workflowIds = Array.from(eligibleWorkflowIds)
} else {
workflowIds = subscription.workflowIds || []
workflowIds = (subscription.workflowIds || []).filter((id) => eligibleWorkflowIds.has(id))
}
logger.debug(`Checking ${workflowIds.length} workflows for subscription ${subscription.id}`, {
triggerFilter,
eligibleCount: eligibleWorkflowIds.size,
})
for (const workflowId of workflowIds) {
const result = await checkWorkflowInactivity(subscription, workflowId, alertConfig)
results.push(result)

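The eligibility check in `deploymentHasTriggerType` above reduces to a small predicate over the deployed blocks. A minimal sketch with simplified stand-in types (the `Block` shape and the trigger-type string constants here are assumptions, not the real `TRIGGER_TYPES` values):

```typescript
// Simplified stand-ins for the real block state and TRIGGER_TYPES constants.
type Block = { type: string; triggerMode?: boolean }

const SCHEDULE_TYPES = ['schedule']
const WEBHOOK_TYPES = ['webhook', 'generic_webhook']
const ALWAYS_AVAILABLE = ['api', 'manual', 'chat']

function hasTriggerType(blocks: Record<string, Block>, filter: string[]): boolean {
  // API/manual/chat triggers are available on every deployed workflow.
  if (filter.some((t) => ALWAYS_AVAILABLE.includes(t))) return true
  for (const block of Object.values(blocks)) {
    if (filter.includes('schedule') && SCHEDULE_TYPES.includes(block.type)) return true
    // A block counts as a webhook trigger by type, or by having triggerMode enabled.
    if (
      filter.includes('webhook') &&
      (WEBHOOK_TYPES.includes(block.type) || block.triggerMode === true)
    ) {
      return true
    }
  }
  return false
}
```

The polling loop then intersects each subscription's workflow list with the set of workflows passing this predicate, so alerts only fire for workflows that can actually receive the filtered trigger types.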
View File

@@ -81,7 +81,11 @@ async function formatTeamsGraphNotification(
foundWorkflow: any,
request: NextRequest
): Promise<any> {
const notification = body.value[0]
const notification = body.value?.[0]
if (!notification) {
logger.warn('Received empty Teams notification body')
return null
}
const changeType = notification.changeType || 'created'
const resource = notification.resource || ''
const subscriptionId = notification.subscriptionId || ''

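The Teams fix above is the standard guard for webhook payloads whose notification array may be missing or empty: optional chaining plus an early return, instead of indexing `body.value[0]` directly. A minimal sketch with simplified types (the `TeamsWebhookBody` shape here is an assumption covering only the fields the handler reads):

```typescript
// Simplified notification payload shape; real Graph payloads carry more fields.
interface TeamsWebhookBody {
  value?: Array<{ changeType?: string; resource?: string; subscriptionId?: string }>
}

function firstNotification(body: TeamsWebhookBody) {
  // Optional chaining: undefined body.value or an empty array both yield undefined.
  const notification = body.value?.[0]
  if (!notification) return null // previously crashed here on body.value[0]
  return {
    changeType: notification.changeType ?? 'created',
    resource: notification.resource ?? '',
    subscriptionId: notification.subscriptionId ?? '',
  }
}
```

Returning `null` lets the caller log a warning and skip delivery rather than crash the notification worker.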
View File

@@ -9,7 +9,7 @@ import {
workflowSubflows,
} from '@sim/db'
import type { InferSelectModel } from 'drizzle-orm'
import { and, desc, eq, inArray, sql } from 'drizzle-orm'
import { and, desc, eq, sql } from 'drizzle-orm'
import type { Edge } from 'reactflow'
import { v4 as uuidv4 } from 'uuid'
import { createLogger } from '@/lib/logs/console/logger'
@@ -602,178 +602,6 @@ export async function deployWorkflow(params: {
}
}
/**
* Bulk load workflow states for multiple workflows in a single set of queries.
* Much more efficient than calling loadWorkflowFromNormalizedTables for each workflow.
*/
export async function loadBulkWorkflowsFromNormalizedTables(
workflowIds: string[]
): Promise<Map<string, NormalizedWorkflowData>> {
const result = new Map<string, NormalizedWorkflowData>()
if (workflowIds.length === 0) {
return result
}
try {
// Load all components for all workflows in parallel (just 3 queries total)
const [allBlocks, allEdges, allSubflows] = await Promise.all([
db.select().from(workflowBlocks).where(inArray(workflowBlocks.workflowId, workflowIds)),
db.select().from(workflowEdges).where(inArray(workflowEdges.workflowId, workflowIds)),
db.select().from(workflowSubflows).where(inArray(workflowSubflows.workflowId, workflowIds)),
])
// Group blocks by workflow
const blocksByWorkflow = new Map<string, typeof allBlocks>()
for (const block of allBlocks) {
const existing = blocksByWorkflow.get(block.workflowId) || []
existing.push(block)
blocksByWorkflow.set(block.workflowId, existing)
}
// Group edges by workflow
const edgesByWorkflow = new Map<string, typeof allEdges>()
for (const edge of allEdges) {
const existing = edgesByWorkflow.get(edge.workflowId) || []
existing.push(edge)
edgesByWorkflow.set(edge.workflowId, existing)
}
// Group subflows by workflow
const subflowsByWorkflow = new Map<string, typeof allSubflows>()
for (const subflow of allSubflows) {
const existing = subflowsByWorkflow.get(subflow.workflowId) || []
existing.push(subflow)
subflowsByWorkflow.set(subflow.workflowId, existing)
}
// Process each workflow
for (const workflowId of workflowIds) {
const blocks = blocksByWorkflow.get(workflowId) || []
const edges = edgesByWorkflow.get(workflowId) || []
const subflows = subflowsByWorkflow.get(workflowId) || []
// Skip workflows with no blocks (not migrated yet)
if (blocks.length === 0) {
continue
}
// Convert blocks to the expected format
const blocksMap: Record<string, BlockState> = {}
blocks.forEach((block) => {
const blockData = block.data || {}
const assembled: BlockState = {
id: block.id,
type: block.type,
name: block.name,
position: {
x: Number(block.positionX),
y: Number(block.positionY),
},
enabled: block.enabled,
horizontalHandles: block.horizontalHandles,
advancedMode: block.advancedMode,
triggerMode: block.triggerMode,
height: Number(block.height),
subBlocks: (block.subBlocks as BlockState['subBlocks']) || {},
outputs: (block.outputs as BlockState['outputs']) || {},
data: blockData,
}
blocksMap[block.id] = assembled
})
// Sanitize any invalid custom tools in agent blocks
const { blocks: sanitizedBlocks } = sanitizeAgentToolsInBlocks(blocksMap)
// Migrate old agent block format to new messages array format
const migratedBlocks = migrateAgentBlocksToMessagesFormat(sanitizedBlocks)
// Convert edges to the expected format
const edgesArray: Edge[] = edges.map((edge) => ({
id: edge.id,
source: edge.sourceBlockId,
target: edge.targetBlockId,
sourceHandle: edge.sourceHandle ?? undefined,
targetHandle: edge.targetHandle ?? undefined,
type: 'default',
data: {},
}))
// Convert subflows to loops and parallels
const loops: Record<string, Loop> = {}
const parallels: Record<string, Parallel> = {}
subflows.forEach((subflow) => {
const config = (subflow.config ?? {}) as Partial<Loop & Parallel>
if (subflow.type === SUBFLOW_TYPES.LOOP) {
const loopType =
(config as Loop).loopType === 'for' ||
(config as Loop).loopType === 'forEach' ||
(config as Loop).loopType === 'while' ||
(config as Loop).loopType === 'doWhile'
? (config as Loop).loopType
: 'for'
const loop: Loop = {
id: subflow.id,
nodes: Array.isArray((config as Loop).nodes) ? (config as Loop).nodes : [],
iterations:
typeof (config as Loop).iterations === 'number' ? (config as Loop).iterations : 1,
loopType,
forEachItems: (config as Loop).forEachItems ?? '',
whileCondition: (config as Loop).whileCondition ?? '',
doWhileCondition: (config as Loop).doWhileCondition ?? '',
}
loops[subflow.id] = loop
// Sync block.data with loop config
if (migratedBlocks[subflow.id]) {
const block = migratedBlocks[subflow.id]
migratedBlocks[subflow.id] = {
...block,
data: {
...block.data,
collection: loop.forEachItems ?? block.data?.collection ?? '',
whileCondition: loop.whileCondition ?? block.data?.whileCondition ?? '',
doWhileCondition: loop.doWhileCondition ?? block.data?.doWhileCondition ?? '',
},
}
}
} else if (subflow.type === SUBFLOW_TYPES.PARALLEL) {
const parallel: Parallel = {
id: subflow.id,
nodes: Array.isArray((config as Parallel).nodes) ? (config as Parallel).nodes : [],
count: typeof (config as Parallel).count === 'number' ? (config as Parallel).count : 5,
distribution: (config as Parallel).distribution ?? '',
parallelType:
(config as Parallel).parallelType === 'count' ||
(config as Parallel).parallelType === 'collection'
? (config as Parallel).parallelType
: 'count',
}
parallels[subflow.id] = parallel
}
})
result.set(workflowId, {
blocks: migratedBlocks,
edges: edgesArray,
loops,
parallels,
isFromNormalizedTables: true,
})
}
return result
} catch (error) {
logger.error('Error bulk loading workflows from normalized tables:', error)
return result
}
}
/**
* Regenerates all IDs in a workflow state to avoid conflicts when duplicating or using templates
* Returns a new state with all IDs regenerated and references updated

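The removed bulk loader issued three `inArray` queries and then grouped each table's rows by `workflowId` in memory. The core of that pattern is a generic group-by over a shared key; a sketch of the idea (this generic helper is illustrative, not the original code):

```typescript
// Group rows by a derived key, e.g. key = (row) => row.workflowId.
function groupBy<T, K>(rows: T[], key: (row: T) => K): Map<K, T[]> {
  const grouped = new Map<K, T[]>()
  for (const row of rows) {
    const k = key(row)
    const existing = grouped.get(k) ?? []
    existing.push(row)
    grouped.set(k, existing)
  }
  return grouped
}
```

With blocks, edges, and subflows each grouped this way, assembling N workflow states costs three queries total instead of three queries per workflow.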
View File

@@ -439,26 +439,46 @@ export function stripCustomToolPrefix(name: string) {
}
export const workflowHasResponseBlock = (executionResult: ExecutionResult): boolean => {
if (
!executionResult?.logs ||
!Array.isArray(executionResult.logs) ||
!executionResult.success ||
!executionResult.output.response
) {
if (!executionResult?.logs || !Array.isArray(executionResult.logs) || !executionResult.success) {
return false
}
const responseBlock = executionResult.logs.find(
(log) => log?.blockType === 'response' && log?.success
// Check for old response block format (has output.response)
if (executionResult.output.response) {
const responseBlock = executionResult.logs.find(
(log) => log?.blockType === 'response' && log?.success
)
if (responseBlock) return true
}
// Check for new workflow_response block format (has status/headers at root)
const workflowResponseBlock = executionResult.logs.find(
(log) => log?.blockType === 'workflow_response' && log?.success
)
return responseBlock !== undefined
return workflowResponseBlock !== undefined
}
// Create an HTTP response from a response block
export const createHttpResponseFromBlock = (executionResult: ExecutionResult): NextResponse => {
const output = executionResult.output.response
const { data = {}, status = 200, headers = {} } = output
// Check if it's the old response block format
if (executionResult.output.response) {
const output = executionResult.output.response
const { data = {}, status = 200, headers = {} } = output
const responseHeaders = new Headers({
'Content-Type': 'application/json',
...headers,
})
return NextResponse.json(data, {
status: status,
headers: responseHeaders,
})
}
// New workflow_response format - status/headers at root, data spread at root
const { status = 200, headers = {}, ...data } = executionResult.output
const responseHeaders = new Headers({
'Content-Type': 'application/json',

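The branching above handles two response shapes: the old block nests everything under `output.response`, while the new `workflow_response` block keeps `status` and `headers` at the root and spreads the payload alongside them, so a rest-destructure separates data from metadata. A minimal sketch with a simplified output type (an assumption covering only the fields this logic reads):

```typescript
// Simplified execution output: old format nests under `response`,
// new format puts status/headers at the root next to the payload.
interface ExecutionOutput {
  response?: { data?: unknown; status?: number; headers?: Record<string, string> }
  status?: number
  headers?: Record<string, string>
  [key: string]: unknown
}

function extractResponse(output: ExecutionOutput) {
  if (output.response) {
    // Old response block format.
    const { data = {}, status = 200, headers = {} } = output.response
    return { data, status, headers }
  }
  // New workflow_response format: rest-destructure the payload away from metadata.
  const { status = 200, headers = {}, ...data } = output
  return { data, status, headers }
}
```

Either branch yields the same `{ data, status, headers }` triple, which then feeds a single `NextResponse.json` call.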
View File

@@ -84,6 +84,7 @@ const nextConfig: NextConfig = {
],
outputFileTracingIncludes: {
'/api/tools/stagehand/*': ['./node_modules/ws/**/*'],
'/*': ['./node_modules/sharp/**/*', './node_modules/@img/**/*'],
},
experimental: {
optimizeCss: true,

Binary image files changed (not shown). Sizes as listed: After 32 KiB; Before 2.0 MiB; After 349 KiB; Before 123 KiB; After 33 KiB; Before 2.4 MiB; After 515 KiB; Before 10 MiB.

View File

@@ -1,39 +1,12 @@
import { createLogger } from '@/lib/logs/console/logger'
import type {
SalesforceCreateAccountParams,
SalesforceCreateAccountResponse,
} from '@/tools/salesforce/types'
import type { ToolConfig } from '@/tools/types'
const logger = createLogger('SalesforceCreateAccount')
export interface SalesforceCreateAccountParams {
accessToken: string
idToken?: string
instanceUrl?: string
name: string
type?: string
industry?: string
phone?: string
website?: string
billingStreet?: string
billingCity?: string
billingState?: string
billingPostalCode?: string
billingCountry?: string
description?: string
annualRevenue?: string
numberOfEmployees?: string
}
export interface SalesforceCreateAccountResponse {
success: boolean
output: {
id: string
success: boolean
created: boolean
metadata: {
operation: 'create_account'
}
}
}
export const salesforceCreateAccountTool: ToolConfig<
SalesforceCreateAccountParams,
SalesforceCreateAccountResponse

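The Salesforce refactors in this and the following files all follow one pattern: the per-file `Params`/`Response` interfaces move into the shared `@/tools/salesforce/types` module, and each tool stays a `ToolConfig<Params, Response>`. A minimal sketch of that shape (the `ToolConfig` fields and endpoint version below are simplified assumptions, not the project's real definition):

```typescript
// Simplified stand-in for the project's ToolConfig; the real one has more fields.
interface ToolConfig<P, R> {
  id: string
  request: (params: P) => { url: string; method: string }
  transformResponse: (raw: unknown) => R
}

// These would now live in a shared types module rather than inline.
interface CreateAccountParams { accessToken: string; name: string }
interface CreateAccountResponse {
  success: boolean
  output: { id: string; created: boolean }
}

const createAccountTool: ToolConfig<CreateAccountParams, CreateAccountResponse> = {
  id: 'salesforce_create_account',
  // API version v59.0 is an assumption for illustration.
  request: (_p) => ({ url: '/services/data/v59.0/sobjects/Account', method: 'POST' }),
  transformResponse: (raw) => raw as CreateAccountResponse,
}
```

Centralizing the interfaces removes the duplicated declarations scattered across each tool file and keeps request/response contracts in one place.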
View File

@@ -1,30 +1,9 @@
import type {
SalesforceCreateCaseParams,
SalesforceCreateCaseResponse,
} from '@/tools/salesforce/types'
import { getInstanceUrl } from '@/tools/salesforce/utils'
import type { ToolConfig } from '@/tools/types'
import { getInstanceUrl } from './utils'
export interface SalesforceCreateCaseParams {
accessToken: string
idToken?: string
instanceUrl?: string
subject: string
status?: string
priority?: string
origin?: string
contactId?: string
accountId?: string
description?: string
}
export interface SalesforceCreateCaseResponse {
success: boolean
output: {
id: string
success: boolean
created: boolean
metadata: {
operation: 'create_case'
}
}
}
export const salesforceCreateCaseTool: ToolConfig<
SalesforceCreateCaseParams,

View File

@@ -1,38 +1,13 @@
import { createLogger } from '@/lib/logs/console/logger'
import type {
SalesforceCreateContactParams,
SalesforceCreateContactResponse,
} from '@/tools/salesforce/types'
import { getInstanceUrl } from '@/tools/salesforce/utils'
import type { ToolConfig } from '@/tools/types'
import { getInstanceUrl } from './utils'
const logger = createLogger('SalesforceContacts')
export interface SalesforceCreateContactParams {
accessToken: string
idToken?: string
instanceUrl?: string
lastName: string
firstName?: string
email?: string
phone?: string
accountId?: string
title?: string
department?: string
mailingStreet?: string
mailingCity?: string
mailingState?: string
mailingPostalCode?: string
mailingCountry?: string
description?: string
}
export interface SalesforceCreateContactResponse {
success: boolean
output: {
id: string
success: boolean
created: boolean
metadata: { operation: 'create_contact' }
}
}
export const salesforceCreateContactTool: ToolConfig<
SalesforceCreateContactParams,
SalesforceCreateContactResponse

View File

@@ -1,32 +1,9 @@
import type {
SalesforceCreateLeadParams,
SalesforceCreateLeadResponse,
} from '@/tools/salesforce/types'
import { getInstanceUrl } from '@/tools/salesforce/utils'
import type { ToolConfig } from '@/tools/types'
import { getInstanceUrl } from './utils'
export interface SalesforceCreateLeadParams {
accessToken: string
idToken?: string
instanceUrl?: string
lastName: string
company: string
firstName?: string
email?: string
phone?: string
status?: string
leadSource?: string
title?: string
description?: string
}
export interface SalesforceCreateLeadResponse {
success: boolean
output: {
id: string
success: boolean
created: boolean
metadata: {
operation: 'create_lead'
}
}
}
export const salesforceCreateLeadTool: ToolConfig<
SalesforceCreateLeadParams,

View File

@@ -1,30 +1,9 @@
import type {
SalesforceCreateOpportunityParams,
SalesforceCreateOpportunityResponse,
} from '@/tools/salesforce/types'
import { getInstanceUrl } from '@/tools/salesforce/utils'
import type { ToolConfig } from '@/tools/types'
import { getInstanceUrl } from './utils'
export interface SalesforceCreateOpportunityParams {
accessToken: string
idToken?: string
instanceUrl?: string
name: string
stageName: string
closeDate: string
accountId?: string
amount?: string
probability?: string
description?: string
}
export interface SalesforceCreateOpportunityResponse {
success: boolean
output: {
id: string
success: boolean
created: boolean
metadata: {
operation: 'create_opportunity'
}
}
}
export const salesforceCreateOpportunityTool: ToolConfig<
SalesforceCreateOpportunityParams,

View File

@@ -1,30 +1,9 @@
import type {
SalesforceCreateTaskParams,
SalesforceCreateTaskResponse,
} from '@/tools/salesforce/types'
import { getInstanceUrl } from '@/tools/salesforce/utils'
import type { ToolConfig } from '@/tools/types'
import { getInstanceUrl } from './utils'
export interface SalesforceCreateTaskParams {
accessToken: string
idToken?: string
instanceUrl?: string
subject: string
status?: string
priority?: string
activityDate?: string
whoId?: string
whatId?: string
description?: string
}
export interface SalesforceCreateTaskResponse {
success: boolean
output: {
id: string
success: boolean
created: boolean
metadata: {
operation: 'create_task'
}
}
}
export const salesforceCreateTaskTool: ToolConfig<
SalesforceCreateTaskParams,

View File

@@ -1,26 +1,12 @@
import { createLogger } from '@/lib/logs/console/logger'
import type {
SalesforceDeleteAccountParams,
SalesforceDeleteAccountResponse,
} from '@/tools/salesforce/types'
import type { ToolConfig } from '@/tools/types'
const logger = createLogger('SalesforceDeleteAccount')
export interface SalesforceDeleteAccountParams {
accessToken: string
idToken?: string
instanceUrl?: string
accountId: string
}
export interface SalesforceDeleteAccountResponse {
success: boolean
output: {
id: string
deleted: boolean
metadata: {
operation: 'delete_account'
}
}
}
export const salesforceDeleteAccountTool: ToolConfig<
SalesforceDeleteAccountParams,
SalesforceDeleteAccountResponse

View File

@@ -1,23 +1,9 @@
import type {
SalesforceDeleteCaseParams,
SalesforceDeleteCaseResponse,
} from '@/tools/salesforce/types'
import { getInstanceUrl } from '@/tools/salesforce/utils'
import type { ToolConfig } from '@/tools/types'
import { getInstanceUrl } from './utils'
export interface SalesforceDeleteCaseParams {
accessToken: string
idToken?: string
instanceUrl?: string
caseId: string
}
export interface SalesforceDeleteCaseResponse {
success: boolean
output: {
id: string
deleted: boolean
metadata: {
operation: 'delete_case'
}
}
}
export const salesforceDeleteCaseTool: ToolConfig<
SalesforceDeleteCaseParams,

View File

@@ -1,25 +1,13 @@
import { createLogger } from '@/lib/logs/console/logger'
import type {
SalesforceDeleteContactParams,
SalesforceDeleteContactResponse,
} from '@/tools/salesforce/types'
import { getInstanceUrl } from '@/tools/salesforce/utils'
import type { ToolConfig } from '@/tools/types'
import { getInstanceUrl } from './utils'
const logger = createLogger('SalesforceContacts')
export interface SalesforceDeleteContactParams {
accessToken: string
idToken?: string
instanceUrl?: string
contactId: string
}
export interface SalesforceDeleteContactResponse {
success: boolean
output: {
id: string
deleted: boolean
metadata: { operation: 'delete_contact' }
}
}
export const salesforceDeleteContactTool: ToolConfig<
SalesforceDeleteContactParams,
SalesforceDeleteContactResponse

View File

@@ -1,23 +1,9 @@
import type {
SalesforceDeleteLeadParams,
SalesforceDeleteLeadResponse,
} from '@/tools/salesforce/types'
import { getInstanceUrl } from '@/tools/salesforce/utils'
import type { ToolConfig } from '@/tools/types'
import { getInstanceUrl } from './utils'
export interface SalesforceDeleteLeadParams {
accessToken: string
idToken?: string
instanceUrl?: string
leadId: string
}
export interface SalesforceDeleteLeadResponse {
success: boolean
output: {
id: string
deleted: boolean
metadata: {
operation: 'delete_lead'
}
}
}
export const salesforceDeleteLeadTool: ToolConfig<
SalesforceDeleteLeadParams,

View File

@@ -1,23 +1,9 @@
import type {
SalesforceDeleteOpportunityParams,
SalesforceDeleteOpportunityResponse,
} from '@/tools/salesforce/types'
import { getInstanceUrl } from '@/tools/salesforce/utils'
import type { ToolConfig } from '@/tools/types'
import { getInstanceUrl } from './utils'
export interface SalesforceDeleteOpportunityParams {
accessToken: string
idToken?: string
instanceUrl?: string
opportunityId: string
}
export interface SalesforceDeleteOpportunityResponse {
success: boolean
output: {
id: string
deleted: boolean
metadata: {
operation: 'delete_opportunity'
}
}
}
export const salesforceDeleteOpportunityTool: ToolConfig<
SalesforceDeleteOpportunityParams,

View File

@@ -1,23 +1,9 @@
import type {
SalesforceDeleteTaskParams,
SalesforceDeleteTaskResponse,
} from '@/tools/salesforce/types'
import { getInstanceUrl } from '@/tools/salesforce/utils'
import type { ToolConfig } from '@/tools/types'
import { getInstanceUrl } from './utils'
export interface SalesforceDeleteTaskParams {
accessToken: string
idToken?: string
instanceUrl?: string
taskId: string
}
export interface SalesforceDeleteTaskResponse {
success: boolean
output: {
id: string
deleted: boolean
metadata: {
operation: 'delete_task'
}
}
}
export const salesforceDeleteTaskTool: ToolConfig<
SalesforceDeleteTaskParams,

View File

@@ -1,38 +1,13 @@
import { createLogger } from '@/lib/logs/console/logger'
import type {
SalesforceDescribeObjectParams,
SalesforceDescribeObjectResponse,
} from '@/tools/salesforce/types'
import { extractErrorMessage, getInstanceUrl } from '@/tools/salesforce/utils'
import type { ToolConfig } from '@/tools/types'
import { extractErrorMessage, getInstanceUrl } from './utils'
const logger = createLogger('SalesforceQuery')
export interface SalesforceDescribeObjectParams {
accessToken: string
idToken?: string
instanceUrl?: string
objectName: string
}
export interface SalesforceDescribeObjectResponse {
success: boolean
output: {
objectName: string
label?: string
labelPlural?: string
fields?: any[]
keyPrefix?: string
queryable?: boolean
createable?: boolean
updateable?: boolean
deletable?: boolean
childRelationships?: any[]
recordTypeInfos?: any[]
metadata: {
operation: 'describe_object'
fieldCount: number
}
success: boolean
}
}
/**
* Describe a Salesforce object to get its metadata/fields
* Useful for discovering available fields for queries

View File

@@ -1,34 +1,6 @@
import type { SalesforceGetCasesParams, SalesforceGetCasesResponse } from '@/tools/salesforce/types'
import { getInstanceUrl } from '@/tools/salesforce/utils'
import type { ToolConfig } from '@/tools/types'
import { getInstanceUrl } from './utils'
export interface SalesforceGetCasesParams {
accessToken: string
idToken?: string
instanceUrl?: string
caseId?: string
limit?: string
fields?: string
orderBy?: string
}
export interface SalesforceGetCasesResponse {
success: boolean
output: {
case?: any
cases?: any[]
paging?: {
nextRecordsUrl?: string
totalSize: number
done: boolean
}
metadata: {
operation: 'get_cases'
totalReturned?: number
hasMore?: boolean
}
success: boolean
}
}
export const salesforceGetCasesTool: ToolConfig<
SalesforceGetCasesParams,

View File

@@ -1,39 +1,13 @@
import { createLogger } from '@/lib/logs/console/logger'
import type {
SalesforceGetContactsParams,
SalesforceGetContactsResponse,
} from '@/tools/salesforce/types'
import { getInstanceUrl } from '@/tools/salesforce/utils'
import type { ToolConfig } from '@/tools/types'
import { getInstanceUrl } from './utils'
const logger = createLogger('SalesforceContacts')
export interface SalesforceGetContactsParams {
accessToken: string
idToken?: string
instanceUrl?: string
contactId?: string
limit?: string
fields?: string
orderBy?: string
}
export interface SalesforceGetContactsResponse {
success: boolean
output: {
contacts?: any[]
contact?: any
paging?: {
nextRecordsUrl?: string
totalSize: number
done: boolean
}
metadata: {
operation: 'get_contacts'
totalReturned?: number
hasMore?: boolean
singleContact?: boolean
}
success: boolean
}
}
export const salesforceGetContactsTool: ToolConfig<
SalesforceGetContactsParams,
SalesforceGetContactsResponse

View File

@@ -1,32 +1,13 @@
import { createLogger } from '@/lib/logs/console/logger'
import type {
SalesforceGetDashboardParams,
SalesforceGetDashboardResponse,
} from '@/tools/salesforce/types'
import { extractErrorMessage, getInstanceUrl } from '@/tools/salesforce/utils'
import type { ToolConfig } from '@/tools/types'
import { extractErrorMessage, getInstanceUrl } from './utils'
const logger = createLogger('SalesforceDashboards')
export interface SalesforceGetDashboardParams {
accessToken: string
idToken?: string
instanceUrl?: string
dashboardId: string
}
export interface SalesforceGetDashboardResponse {
success: boolean
output: {
dashboard: any
dashboardId: string
components: any[]
metadata: {
operation: 'get_dashboard'
dashboardName?: string
folderId?: string
runningUser?: any
}
success: boolean
}
}
/**
* Get details for a specific dashboard
* @see https://developer.salesforce.com/docs/atlas.en-us.api_analytics.meta/api_analytics/sforce_analytics_rest_api_dashboard_results.htm

View File

@@ -1,35 +1,6 @@
import type { SalesforceGetLeadsParams, SalesforceGetLeadsResponse } from '@/tools/salesforce/types'
import { getInstanceUrl } from '@/tools/salesforce/utils'
import type { ToolConfig } from '@/tools/types'
import { getInstanceUrl } from './utils'
export interface SalesforceGetLeadsParams {
accessToken: string
idToken?: string
instanceUrl?: string
leadId?: string
limit?: string
fields?: string
orderBy?: string
}
export interface SalesforceGetLeadsResponse {
success: boolean
output: {
lead?: any
leads?: any[]
paging?: {
nextRecordsUrl?: string
totalSize: number
done: boolean
}
metadata: {
operation: 'get_leads'
totalReturned?: number
hasMore?: boolean
singleLead?: boolean
}
success: boolean
}
}
export const salesforceGetLeadsTool: ToolConfig<
SalesforceGetLeadsParams,

View File

@@ -1,34 +1,9 @@
import type {
SalesforceGetOpportunitiesParams,
SalesforceGetOpportunitiesResponse,
} from '@/tools/salesforce/types'
import { getInstanceUrl } from '@/tools/salesforce/utils'
import type { ToolConfig } from '@/tools/types'
import { getInstanceUrl } from './utils'
export interface SalesforceGetOpportunitiesParams {
accessToken: string
idToken?: string
instanceUrl?: string
opportunityId?: string
limit?: string
fields?: string
orderBy?: string
}
export interface SalesforceGetOpportunitiesResponse {
success: boolean
output: {
opportunity?: any
opportunities?: any[]
paging?: {
nextRecordsUrl?: string
totalSize: number
done: boolean
}
metadata: {
operation: 'get_opportunities'
totalReturned?: number
hasMore?: boolean
}
success: boolean
}
}
export const salesforceGetOpportunitiesTool: ToolConfig<
SalesforceGetOpportunitiesParams,

View File

@@ -1,28 +1,13 @@
import { createLogger } from '@/lib/logs/console/logger'
import type {
SalesforceGetReportParams,
SalesforceGetReportResponse,
} from '@/tools/salesforce/types'
import { extractErrorMessage, getInstanceUrl } from '@/tools/salesforce/utils'
import type { ToolConfig } from '@/tools/types'
import { extractErrorMessage, getInstanceUrl } from './utils'
const logger = createLogger('SalesforceReports')
export interface SalesforceGetReportParams {
accessToken: string
idToken?: string
instanceUrl?: string
reportId: string
}
export interface SalesforceGetReportResponse {
success: boolean
output: {
report: any
reportId: string
metadata: {
operation: 'get_report'
}
success: boolean
}
}
/**
* Get metadata for a specific report
* @see https://developer.salesforce.com/docs/atlas.en-us.api_analytics.meta/api_analytics/sforce_analytics_rest_api_get_reportmetadata.htm

View File

@@ -1,34 +1,6 @@
import type { SalesforceGetTasksParams, SalesforceGetTasksResponse } from '@/tools/salesforce/types'
import { getInstanceUrl } from '@/tools/salesforce/utils'
import type { ToolConfig } from '@/tools/types'
import { getInstanceUrl } from './utils'
export interface SalesforceGetTasksParams {
accessToken: string
idToken?: string
instanceUrl?: string
taskId?: string
limit?: string
fields?: string
orderBy?: string
}
export interface SalesforceGetTasksResponse {
success: boolean
output: {
task?: any
tasks?: any[]
paging?: {
nextRecordsUrl?: string
totalSize: number
done: boolean
}
metadata: {
operation: 'get_tasks'
totalReturned?: number
hasMore?: boolean
}
success: boolean
}
}
export const salesforceGetTasksTool: ToolConfig<
SalesforceGetTasksParams,

View File

@@ -1,28 +1,13 @@
import { createLogger } from '@/lib/logs/console/logger'
import type {
SalesforceListDashboardsParams,
SalesforceListDashboardsResponse,
} from '@/tools/salesforce/types'
import { extractErrorMessage, getInstanceUrl } from '@/tools/salesforce/utils'
import type { ToolConfig } from '@/tools/types'
import { extractErrorMessage, getInstanceUrl } from './utils'
const logger = createLogger('SalesforceDashboards')
export interface SalesforceListDashboardsParams {
accessToken: string
idToken?: string
instanceUrl?: string
folderName?: string
}
export interface SalesforceListDashboardsResponse {
success: boolean
output: {
dashboards: any[]
metadata: {
operation: 'list_dashboards'
totalReturned: number
}
success: boolean
}
}
/**
* List all dashboards accessible by the current user
* @see https://developer.salesforce.com/docs/atlas.en-us.api_analytics.meta/api_analytics/sforce_analytics_rest_api_getbasic_dashboardlist.htm

View File

@@ -1,29 +1,13 @@
import { createLogger } from '@/lib/logs/console/logger'
import type {
SalesforceListObjectsParams,
SalesforceListObjectsResponse,
} from '@/tools/salesforce/types'
import { extractErrorMessage, getInstanceUrl } from '@/tools/salesforce/utils'
import type { ToolConfig } from '@/tools/types'
import { extractErrorMessage, getInstanceUrl } from './utils'
const logger = createLogger('SalesforceQuery')
export interface SalesforceListObjectsParams {
accessToken: string
idToken?: string
instanceUrl?: string
}
export interface SalesforceListObjectsResponse {
success: boolean
output: {
objects: any[]
encoding?: string
maxBatchSize?: number
metadata: {
operation: 'list_objects'
totalReturned: number
}
success: boolean
}
}
/**
* List all available Salesforce objects
* Useful for discovering what objects are available

View File

@@ -1,27 +1,13 @@
import { createLogger } from '@/lib/logs/console/logger'
import type {
SalesforceListReportTypesParams,
SalesforceListReportTypesResponse,
} from '@/tools/salesforce/types'
import { extractErrorMessage, getInstanceUrl } from '@/tools/salesforce/utils'
import type { ToolConfig } from '@/tools/types'
import { extractErrorMessage, getInstanceUrl } from './utils'
const logger = createLogger('SalesforceReports')
export interface SalesforceListReportTypesParams {
accessToken: string
idToken?: string
instanceUrl?: string
}
export interface SalesforceListReportTypesResponse {
success: boolean
output: {
reportTypes: any[]
metadata: {
operation: 'list_report_types'
totalReturned: number
}
success: boolean
}
}
/**
* Get list of available report types
* @see https://developer.salesforce.com/docs/atlas.en-us.api_analytics.meta/api_analytics/sforce_analytics_rest_api_list_reporttypes.htm

View File

@@ -1,29 +1,13 @@
import { createLogger } from '@/lib/logs/console/logger'
import type {
SalesforceListReportsParams,
SalesforceListReportsResponse,
} from '@/tools/salesforce/types'
import { extractErrorMessage, getInstanceUrl } from '@/tools/salesforce/utils'
import type { ToolConfig } from '@/tools/types'
import { extractErrorMessage, getInstanceUrl } from './utils'
const logger = createLogger('SalesforceReports')
export interface SalesforceListReportsParams {
accessToken: string
idToken?: string
instanceUrl?: string
folderName?: string
searchTerm?: string
}
export interface SalesforceListReportsResponse {
success: boolean
output: {
reports: any[]
metadata: {
operation: 'list_reports'
totalReturned: number
}
success: boolean
}
}
/**
* List all reports accessible by the current user
* @see https://developer.salesforce.com/docs/atlas.en-us.api_analytics.meta/api_analytics/sforce_analytics_rest_api_get_reportlist.htm

View File

@@ -1,33 +1,10 @@
import { createLogger } from '@/lib/logs/console/logger'
import type { SalesforceQueryParams, SalesforceQueryResponse } from '@/tools/salesforce/types'
import { extractErrorMessage, getInstanceUrl } from '@/tools/salesforce/utils'
import type { ToolConfig } from '@/tools/types'
import { extractErrorMessage, getInstanceUrl } from './utils'
const logger = createLogger('SalesforceQuery')
export interface SalesforceQueryParams {
accessToken: string
idToken?: string
instanceUrl?: string
query: string
}
export interface SalesforceQueryResponse {
success: boolean
output: {
records: any[]
totalSize: number
done: boolean
nextRecordsUrl?: string
query: string
metadata: {
operation: 'query'
totalReturned: number
hasMore: boolean
}
success: boolean
}
}
/**
* Execute a custom SOQL query
* @see https://developer.salesforce.com/docs/atlas.en-us.api_rest.meta/api_rest/dome_query.htm

View File

@@ -1,32 +1,13 @@
import { createLogger } from '@/lib/logs/console/logger'
import type {
SalesforceQueryMoreParams,
SalesforceQueryMoreResponse,
} from '@/tools/salesforce/types'
import { extractErrorMessage, getInstanceUrl } from '@/tools/salesforce/utils'
import type { ToolConfig } from '@/tools/types'
import { extractErrorMessage, getInstanceUrl } from './utils'
const logger = createLogger('SalesforceQuery')
export interface SalesforceQueryMoreParams {
accessToken: string
idToken?: string
instanceUrl?: string
nextRecordsUrl: string
}
export interface SalesforceQueryMoreResponse {
success: boolean
output: {
records: any[]
totalSize: number
done: boolean
nextRecordsUrl?: string
metadata: {
operation: 'query_more'
totalReturned: number
hasMore: boolean
}
success: boolean
}
}
/**
* Retrieve additional query results using the nextRecordsUrl
* @see https://developer.salesforce.com/docs/atlas.en-us.api_rest.meta/api_rest/dome_query.htm
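The query and query_more diffs above share one paging contract (`records` / `totalSize` / `done` / `nextRecordsUrl`). A minimal sketch of how a caller might drain that contract; `fetchAll` and the injected `queryMore` fetcher are illustrative names, not part of this PR:

```typescript
// Illustrative paging shape matching the response fields above.
interface QueryPage {
  records: unknown[]
  totalSize: number
  done: boolean
  nextRecordsUrl?: string
}

// Follow nextRecordsUrl until Salesforce reports done: true.
async function fetchAll(
  firstPage: QueryPage,
  queryMore: (nextRecordsUrl: string) => Promise<QueryPage>
): Promise<unknown[]> {
  const records = [...firstPage.records]
  let page = firstPage
  while (!page.done && page.nextRecordsUrl) {
    page = await queryMore(page.nextRecordsUrl)
    records.push(...page.records)
  }
  return records
}
```

This is why `SalesforceQueryMoreParams` needs only `nextRecordsUrl` on top of the base auth fields: the cursor URL is the whole continuation state.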

View File

@@ -1,32 +1,13 @@
import { createLogger } from '@/lib/logs/console/logger'
import type {
SalesforceRefreshDashboardParams,
SalesforceRefreshDashboardResponse,
} from '@/tools/salesforce/types'
import { extractErrorMessage, getInstanceUrl } from '@/tools/salesforce/utils'
import type { ToolConfig } from '@/tools/types'
import { extractErrorMessage, getInstanceUrl } from './utils'
const logger = createLogger('SalesforceDashboards')
export interface SalesforceRefreshDashboardParams {
accessToken: string
idToken?: string
instanceUrl?: string
dashboardId: string
}
export interface SalesforceRefreshDashboardResponse {
success: boolean
output: {
dashboard: any
dashboardId: string
components: any[]
status?: any
metadata: {
operation: 'refresh_dashboard'
dashboardName?: string
refreshDate?: string
}
success: boolean
}
}
/**
* Refresh a dashboard to get latest data
* @see https://developer.salesforce.com/docs/atlas.en-us.api_analytics.meta/api_analytics/sforce_analytics_rest_api_refresh_dashboard.htm

View File

@@ -1,38 +1,13 @@
import { createLogger } from '@/lib/logs/console/logger'
import type {
SalesforceRunReportParams,
SalesforceRunReportResponse,
} from '@/tools/salesforce/types'
import { extractErrorMessage, getInstanceUrl } from '@/tools/salesforce/utils'
import type { ToolConfig } from '@/tools/types'
import { extractErrorMessage, getInstanceUrl } from './utils'
const logger = createLogger('SalesforceReports')
export interface SalesforceRunReportParams {
accessToken: string
idToken?: string
instanceUrl?: string
reportId: string
includeDetails?: string
filters?: string
}
export interface SalesforceRunReportResponse {
success: boolean
output: {
reportId: string
reportMetadata?: any
reportExtendedMetadata?: any
factMap?: any
groupingsDown?: any
groupingsAcross?: any
hasDetailRows?: boolean
allData?: boolean
metadata: {
operation: 'run_report'
reportName?: string
reportFormat?: string
}
success: boolean
}
}
/**
* Run a report and return the results
* @see https://developer.salesforce.com/docs/atlas.en-us.api_analytics.meta/api_analytics/sforce_analytics_rest_api_get_reportdata.htm
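The run-report params above carry `includeDetails` and `filters` as strings, as tool inputs typically do. A hedged sketch of coercing them into JSON before posting to the Analytics endpoint; the helper name and exact body layout are assumptions, not taken from this PR:

```typescript
// Hypothetical coercion of string-typed tool params into request JSON.
function buildRunReportBody(includeDetails?: string, filters?: string): Record<string, unknown> {
  const body: Record<string, unknown> = {}
  // Tool params arrive as strings; the API wants a boolean and structured JSON.
  if (includeDetails !== undefined) {
    body.includeDetails = includeDetails === 'true'
  }
  if (filters !== undefined) {
    body.reportMetadata = { reportFilters: JSON.parse(filters) }
  }
  return body
}
```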

View File

@@ -1,6 +1,23 @@
import type { ToolResponse } from '@/tools/types'
// Common Salesforce types
/**
* Base parameters shared by all Salesforce operations
*/
export interface BaseSalesforceParams {
accessToken: string
idToken?: string
instanceUrl?: string
}
/**
* Common paging structure for list operations
*/
export interface SalesforcePaging {
nextRecordsUrl?: string
totalSize: number
done: boolean
}
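The two shared interfaces above replace per-tool copies of the auth fields and the inline paging objects. A quick sketch of the pattern with a hypothetical tool's params (`GetWidgetsParams` and `describePaging` are illustrative names only):

```typescript
// Shared shapes, mirroring the interfaces introduced above.
interface BaseSalesforceParams {
  accessToken: string
  idToken?: string
  instanceUrl?: string
}

interface SalesforcePaging {
  nextRecordsUrl?: string
  totalSize: number
  done: boolean
}

// A hypothetical tool now declares only its own fields.
interface GetWidgetsParams extends BaseSalesforceParams {
  limit?: string
}

// Paging is consumed once, rather than re-declared inline per response type.
function describePaging(p: SalesforcePaging): string {
  return p.done ? `complete (${p.totalSize} rows)` : 'more pages available'
}

const params: GetWidgetsParams = { accessToken: 'token', limit: '10' }
```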
export interface SalesforceAccount {
Id: string
Name: string
@@ -22,13 +39,12 @@ export interface SalesforceAccount {
[key: string]: any
}
export interface SalesforcePaging {
nextRecordsUrl?: string
totalSize: number
done: boolean
export interface SalesforceGetAccountsParams extends BaseSalesforceParams {
limit?: string
fields?: string
orderBy?: string
}
// Get Accounts
export interface SalesforceGetAccountsResponse extends ToolResponse {
output: {
accounts: SalesforceAccount[]
@@ -42,16 +58,22 @@ export interface SalesforceGetAccountsResponse extends ToolResponse {
}
}
export interface SalesforceGetAccountsParams {
accessToken: string
idToken?: string
instanceUrl?: string
limit?: string
fields?: string
orderBy?: string
export interface SalesforceCreateAccountParams extends BaseSalesforceParams {
name: string
type?: string
industry?: string
phone?: string
website?: string
billingStreet?: string
billingCity?: string
billingState?: string
billingPostalCode?: string
billingCountry?: string
description?: string
annualRevenue?: string
numberOfEmployees?: string
}
// Create Account
export interface SalesforceCreateAccountResponse {
success: boolean
output: {
@@ -64,7 +86,23 @@ export interface SalesforceCreateAccountResponse {
}
}
// Update Account
export interface SalesforceUpdateAccountParams extends BaseSalesforceParams {
accountId: string
name?: string
type?: string
industry?: string
phone?: string
website?: string
billingStreet?: string
billingCity?: string
billingState?: string
billingPostalCode?: string
billingCountry?: string
description?: string
annualRevenue?: string
numberOfEmployees?: string
}
export interface SalesforceUpdateAccountResponse {
success: boolean
output: {
@@ -76,7 +114,10 @@ export interface SalesforceUpdateAccountResponse {
}
}
// Delete Account
export interface SalesforceDeleteAccountParams extends BaseSalesforceParams {
accountId: string
}
export interface SalesforceDeleteAccountResponse {
success: boolean
output: {
@@ -88,17 +129,19 @@ export interface SalesforceDeleteAccountResponse {
}
}
// Contact types
export interface SalesforceGetContactsParams extends BaseSalesforceParams {
contactId?: string
limit?: string
fields?: string
orderBy?: string
}
export interface SalesforceGetContactsResponse {
success: boolean
output: {
contacts?: any[]
contact?: any
paging?: {
nextRecordsUrl?: string
totalSize: number
done: boolean
}
paging?: SalesforcePaging
metadata: {
operation: 'get_contacts'
totalReturned?: number
@@ -109,6 +152,22 @@ export interface SalesforceGetContactsResponse {
}
}
export interface SalesforceCreateContactParams extends BaseSalesforceParams {
lastName: string
firstName?: string
email?: string
phone?: string
accountId?: string
title?: string
department?: string
mailingStreet?: string
mailingCity?: string
mailingState?: string
mailingPostalCode?: string
mailingCountry?: string
description?: string
}
export interface SalesforceCreateContactResponse {
success: boolean
output: {
@@ -119,6 +178,23 @@ export interface SalesforceCreateContactResponse {
}
}
export interface SalesforceUpdateContactParams extends BaseSalesforceParams {
contactId: string
lastName?: string
firstName?: string
email?: string
phone?: string
accountId?: string
title?: string
department?: string
mailingStreet?: string
mailingCity?: string
mailingState?: string
mailingPostalCode?: string
mailingCountry?: string
description?: string
}
export interface SalesforceUpdateContactResponse {
success: boolean
output: {
@@ -128,6 +204,10 @@ export interface SalesforceUpdateContactResponse {
}
}
export interface SalesforceDeleteContactParams extends BaseSalesforceParams {
contactId: string
}
export interface SalesforceDeleteContactResponse {
success: boolean
output: {
@@ -137,7 +217,335 @@ export interface SalesforceDeleteContactResponse {
}
}
// Report types
export interface SalesforceGetLeadsParams extends BaseSalesforceParams {
leadId?: string
limit?: string
fields?: string
orderBy?: string
}
export interface SalesforceGetLeadsResponse {
success: boolean
output: {
lead?: any
leads?: any[]
paging?: SalesforcePaging
metadata: {
operation: 'get_leads'
totalReturned?: number
hasMore?: boolean
singleLead?: boolean
}
success: boolean
}
}
export interface SalesforceCreateLeadParams extends BaseSalesforceParams {
lastName: string
company: string
firstName?: string
email?: string
phone?: string
status?: string
leadSource?: string
title?: string
description?: string
}
export interface SalesforceCreateLeadResponse {
success: boolean
output: {
id: string
success: boolean
created: boolean
metadata: {
operation: 'create_lead'
}
}
}
export interface SalesforceUpdateLeadParams extends BaseSalesforceParams {
leadId: string
lastName?: string
company?: string
firstName?: string
email?: string
phone?: string
status?: string
leadSource?: string
title?: string
description?: string
}
export interface SalesforceUpdateLeadResponse {
success: boolean
output: {
id: string
updated: boolean
metadata: {
operation: 'update_lead'
}
}
}
export interface SalesforceDeleteLeadParams extends BaseSalesforceParams {
leadId: string
}
export interface SalesforceDeleteLeadResponse {
success: boolean
output: {
id: string
deleted: boolean
metadata: {
operation: 'delete_lead'
}
}
}
export interface SalesforceGetOpportunitiesParams extends BaseSalesforceParams {
opportunityId?: string
limit?: string
fields?: string
orderBy?: string
}
export interface SalesforceGetOpportunitiesResponse {
success: boolean
output: {
opportunity?: any
opportunities?: any[]
paging?: SalesforcePaging
metadata: {
operation: 'get_opportunities'
totalReturned?: number
hasMore?: boolean
}
success: boolean
}
}
export interface SalesforceCreateOpportunityParams extends BaseSalesforceParams {
name: string
stageName: string
closeDate: string
accountId?: string
amount?: string
probability?: string
description?: string
}
export interface SalesforceCreateOpportunityResponse {
success: boolean
output: {
id: string
success: boolean
created: boolean
metadata: {
operation: 'create_opportunity'
}
}
}
export interface SalesforceUpdateOpportunityParams extends BaseSalesforceParams {
opportunityId: string
name?: string
stageName?: string
closeDate?: string
accountId?: string
amount?: string
probability?: string
description?: string
}
export interface SalesforceUpdateOpportunityResponse {
success: boolean
output: {
id: string
updated: boolean
metadata: {
operation: 'update_opportunity'
}
}
}
export interface SalesforceDeleteOpportunityParams extends BaseSalesforceParams {
opportunityId: string
}
export interface SalesforceDeleteOpportunityResponse {
success: boolean
output: {
id: string
deleted: boolean
metadata: {
operation: 'delete_opportunity'
}
}
}
export interface SalesforceGetCasesParams extends BaseSalesforceParams {
caseId?: string
limit?: string
fields?: string
orderBy?: string
}
export interface SalesforceGetCasesResponse {
success: boolean
output: {
case?: any
cases?: any[]
paging?: SalesforcePaging
metadata: {
operation: 'get_cases'
totalReturned?: number
hasMore?: boolean
}
success: boolean
}
}
export interface SalesforceCreateCaseParams extends BaseSalesforceParams {
subject: string
status?: string
priority?: string
origin?: string
contactId?: string
accountId?: string
description?: string
}
export interface SalesforceCreateCaseResponse {
success: boolean
output: {
id: string
success: boolean
created: boolean
metadata: {
operation: 'create_case'
}
}
}
export interface SalesforceUpdateCaseParams extends BaseSalesforceParams {
caseId: string
subject?: string
status?: string
priority?: string
description?: string
}
export interface SalesforceUpdateCaseResponse {
success: boolean
output: {
id: string
updated: boolean
metadata: {
operation: 'update_case'
}
}
}
export interface SalesforceDeleteCaseParams extends BaseSalesforceParams {
caseId: string
}
export interface SalesforceDeleteCaseResponse {
success: boolean
output: {
id: string
deleted: boolean
metadata: {
operation: 'delete_case'
}
}
}
export interface SalesforceGetTasksParams extends BaseSalesforceParams {
taskId?: string
limit?: string
fields?: string
orderBy?: string
}
export interface SalesforceGetTasksResponse {
success: boolean
output: {
task?: any
tasks?: any[]
paging?: SalesforcePaging
metadata: {
operation: 'get_tasks'
totalReturned?: number
hasMore?: boolean
}
success: boolean
}
}
export interface SalesforceCreateTaskParams extends BaseSalesforceParams {
subject: string
status?: string
priority?: string
activityDate?: string
whoId?: string
whatId?: string
description?: string
}
export interface SalesforceCreateTaskResponse {
success: boolean
output: {
id: string
success: boolean
created: boolean
metadata: {
operation: 'create_task'
}
}
}
export interface SalesforceUpdateTaskParams extends BaseSalesforceParams {
taskId: string
subject?: string
status?: string
priority?: string
activityDate?: string
description?: string
}
export interface SalesforceUpdateTaskResponse {
success: boolean
output: {
id: string
updated: boolean
metadata: {
operation: 'update_task'
}
}
}
export interface SalesforceDeleteTaskParams extends BaseSalesforceParams {
taskId: string
}
export interface SalesforceDeleteTaskResponse {
success: boolean
output: {
id: string
deleted: boolean
metadata: {
operation: 'delete_task'
}
}
}
export interface SalesforceListReportsParams extends BaseSalesforceParams {
folderName?: string
searchTerm?: string
}
export interface SalesforceListReportsResponse {
success: boolean
output: {
@@ -150,6 +558,10 @@ export interface SalesforceListReportsResponse {
}
}
export interface SalesforceGetReportParams extends BaseSalesforceParams {
reportId: string
}
export interface SalesforceGetReportResponse {
success: boolean
output: {
@@ -162,26 +574,34 @@ export interface SalesforceGetReportResponse {
}
}
export interface SalesforceRunReportParams extends BaseSalesforceParams {
reportId: string
includeDetails?: string
filters?: string
}
export interface SalesforceRunReportResponse {
success: boolean
output: {
reportId: string
reportMetadata: any
reportExtendedMetadata: any
factMap: any
groupingsDown: any
groupingsAcross: any
hasDetailRows: boolean
allData: boolean
reportMetadata?: any
reportExtendedMetadata?: any
factMap?: any
groupingsDown?: any
groupingsAcross?: any
hasDetailRows?: boolean
allData?: boolean
metadata: {
operation: 'run_report'
reportName: string
reportFormat: string
reportName?: string
reportFormat?: string
}
success: boolean
}
}
export interface SalesforceListReportTypesParams extends BaseSalesforceParams {}
export interface SalesforceListReportTypesResponse {
success: boolean
output: {
@@ -194,7 +614,10 @@ export interface SalesforceListReportTypesResponse {
}
}
// Dashboard types
export interface SalesforceListDashboardsParams extends BaseSalesforceParams {
folderName?: string
}
export interface SalesforceListDashboardsResponse {
success: boolean
output: {
@@ -207,6 +630,10 @@ export interface SalesforceListDashboardsResponse {
}
}
export interface SalesforceGetDashboardParams extends BaseSalesforceParams {
dashboardId: string
}
export interface SalesforceGetDashboardResponse {
success: boolean
output: {
@@ -215,31 +642,38 @@ export interface SalesforceGetDashboardResponse {
components: any[]
metadata: {
operation: 'get_dashboard'
dashboardName: string
folderId: string
runningUser: any
dashboardName?: string
folderId?: string
runningUser?: any
}
success: boolean
}
}
export interface SalesforceRefreshDashboardParams extends BaseSalesforceParams {
dashboardId: string
}
export interface SalesforceRefreshDashboardResponse {
success: boolean
output: {
dashboard: any
dashboardId: string
components: any[]
status: any
status?: any
metadata: {
operation: 'refresh_dashboard'
dashboardName: string
refreshDate: string
dashboardName?: string
refreshDate?: string
}
success: boolean
}
}
// Query types
export interface SalesforceQueryParams extends BaseSalesforceParams {
query: string
}
export interface SalesforceQueryResponse {
success: boolean
output: {
@@ -257,6 +691,10 @@ export interface SalesforceQueryResponse {
}
}
export interface SalesforceQueryMoreParams extends BaseSalesforceParams {
nextRecordsUrl: string
}
export interface SalesforceQueryMoreResponse {
success: boolean
output: {
@@ -273,20 +711,24 @@ export interface SalesforceQueryMoreResponse {
}
}
export interface SalesforceDescribeObjectParams extends BaseSalesforceParams {
objectName: string
}
export interface SalesforceDescribeObjectResponse {
success: boolean
output: {
objectName: string
label: string
labelPlural: string
fields: any[]
keyPrefix: string
queryable: boolean
createable: boolean
updateable: boolean
deletable: boolean
childRelationships: any[]
recordTypeInfos: any[]
label?: string
labelPlural?: string
fields?: any[]
keyPrefix?: string
queryable?: boolean
createable?: boolean
updateable?: boolean
deletable?: boolean
childRelationships?: any[]
recordTypeInfos?: any[]
metadata: {
operation: 'describe_object'
fieldCount: number
@@ -295,12 +737,14 @@ export interface SalesforceDescribeObjectResponse {
}
}
export interface SalesforceListObjectsParams extends BaseSalesforceParams {}
export interface SalesforceListObjectsResponse {
success: boolean
output: {
objects: any[]
encoding: string
maxBatchSize: number
encoding?: string
maxBatchSize?: number
metadata: {
operation: 'list_objects'
totalReturned: number
@@ -309,7 +753,6 @@ export interface SalesforceListObjectsResponse {
}
}
// Generic Salesforce response type for the block
export type SalesforceResponse =
| SalesforceGetAccountsResponse
| SalesforceCreateAccountResponse
@@ -319,6 +762,22 @@ export type SalesforceResponse =
| SalesforceCreateContactResponse
| SalesforceUpdateContactResponse
| SalesforceDeleteContactResponse
| SalesforceGetLeadsResponse
| SalesforceCreateLeadResponse
| SalesforceUpdateLeadResponse
| SalesforceDeleteLeadResponse
| SalesforceGetOpportunitiesResponse
| SalesforceCreateOpportunityResponse
| SalesforceUpdateOpportunityResponse
| SalesforceDeleteOpportunityResponse
| SalesforceGetCasesResponse
| SalesforceCreateCaseResponse
| SalesforceUpdateCaseResponse
| SalesforceDeleteCaseResponse
| SalesforceGetTasksResponse
| SalesforceCreateTaskResponse
| SalesforceUpdateTaskResponse
| SalesforceDeleteTaskResponse
| SalesforceListReportsResponse
| SalesforceGetReportResponse
| SalesforceRunReportResponse
@@ -330,4 +789,3 @@ export type SalesforceResponse =
| SalesforceQueryMoreResponse
| SalesforceDescribeObjectResponse
| SalesforceListObjectsResponse
| { success: boolean; output: any } // Generic for leads, opportunities, cases, tasks

View File

@@ -1,39 +1,12 @@
import { createLogger } from '@/lib/logs/console/logger'
import type {
SalesforceUpdateAccountParams,
SalesforceUpdateAccountResponse,
} from '@/tools/salesforce/types'
import type { ToolConfig } from '@/tools/types'
const logger = createLogger('SalesforceUpdateAccount')
export interface SalesforceUpdateAccountParams {
accessToken: string
idToken?: string
instanceUrl?: string
accountId: string
name?: string
type?: string
industry?: string
phone?: string
website?: string
billingStreet?: string
billingCity?: string
billingState?: string
billingPostalCode?: string
billingCountry?: string
description?: string
annualRevenue?: string
numberOfEmployees?: string
}
export interface SalesforceUpdateAccountResponse {
success: boolean
output: {
id: string
updated: boolean
metadata: {
operation: 'update_account'
}
}
}
export const salesforceUpdateAccountTool: ToolConfig<
SalesforceUpdateAccountParams,
SalesforceUpdateAccountResponse

View File

@@ -1,27 +1,9 @@
import type {
SalesforceUpdateCaseParams,
SalesforceUpdateCaseResponse,
} from '@/tools/salesforce/types'
import { getInstanceUrl } from '@/tools/salesforce/utils'
import type { ToolConfig } from '@/tools/types'
import { getInstanceUrl } from './utils'
export interface SalesforceUpdateCaseParams {
accessToken: string
idToken?: string
instanceUrl?: string
caseId: string
subject?: string
status?: string
priority?: string
description?: string
}
export interface SalesforceUpdateCaseResponse {
success: boolean
output: {
id: string
updated: boolean
metadata: {
operation: 'update_case'
}
}
}
export const salesforceUpdateCaseTool: ToolConfig<
SalesforceUpdateCaseParams,

View File

@@ -1,38 +1,13 @@
import { createLogger } from '@/lib/logs/console/logger'
import type {
SalesforceUpdateContactParams,
SalesforceUpdateContactResponse,
} from '@/tools/salesforce/types'
import { getInstanceUrl } from '@/tools/salesforce/utils'
import type { ToolConfig } from '@/tools/types'
import { getInstanceUrl } from './utils'
const logger = createLogger('SalesforceContacts')
export interface SalesforceUpdateContactParams {
accessToken: string
idToken?: string
instanceUrl?: string
contactId: string
lastName?: string
firstName?: string
email?: string
phone?: string
accountId?: string
title?: string
department?: string
mailingStreet?: string
mailingCity?: string
mailingState?: string
mailingPostalCode?: string
mailingCountry?: string
description?: string
}
export interface SalesforceUpdateContactResponse {
success: boolean
output: {
id: string
updated: boolean
metadata: { operation: 'update_contact' }
}
}
export const salesforceUpdateContactTool: ToolConfig<
SalesforceUpdateContactParams,
SalesforceUpdateContactResponse

View File

@@ -1,32 +1,9 @@
import type {
SalesforceUpdateLeadParams,
SalesforceUpdateLeadResponse,
} from '@/tools/salesforce/types'
import { getInstanceUrl } from '@/tools/salesforce/utils'
import type { ToolConfig } from '@/tools/types'
import { getInstanceUrl } from './utils'
export interface SalesforceUpdateLeadParams {
accessToken: string
idToken?: string
instanceUrl?: string
leadId: string
lastName?: string
company?: string
firstName?: string
email?: string
phone?: string
status?: string
leadSource?: string
title?: string
description?: string
}
export interface SalesforceUpdateLeadResponse {
success: boolean
output: {
id: string
updated: boolean
metadata: {
operation: 'update_lead'
}
}
}
export const salesforceUpdateLeadTool: ToolConfig<
SalesforceUpdateLeadParams,

View File

@@ -1,30 +1,9 @@
import type {
SalesforceUpdateOpportunityParams,
SalesforceUpdateOpportunityResponse,
} from '@/tools/salesforce/types'
import { getInstanceUrl } from '@/tools/salesforce/utils'
import type { ToolConfig } from '@/tools/types'
import { getInstanceUrl } from './utils'
export interface SalesforceUpdateOpportunityParams {
accessToken: string
idToken?: string
instanceUrl?: string
opportunityId: string
name?: string
stageName?: string
closeDate?: string
accountId?: string
amount?: string
probability?: string
description?: string
}
export interface SalesforceUpdateOpportunityResponse {
success: boolean
output: {
id: string
updated: boolean
metadata: {
operation: 'update_opportunity'
}
}
}
export const salesforceUpdateOpportunityTool: ToolConfig<
SalesforceUpdateOpportunityParams,

View File

@@ -1,28 +1,9 @@
import type {
SalesforceUpdateTaskParams,
SalesforceUpdateTaskResponse,
} from '@/tools/salesforce/types'
import { getInstanceUrl } from '@/tools/salesforce/utils'
import type { ToolConfig } from '@/tools/types'
import { getInstanceUrl } from './utils'
export interface SalesforceUpdateTaskParams {
accessToken: string
idToken?: string
instanceUrl?: string
taskId: string
subject?: string
status?: string
priority?: string
activityDate?: string
description?: string
}
export interface SalesforceUpdateTaskResponse {
success: boolean
output: {
id: string
updated: boolean
metadata: {
operation: 'update_task'
}
}
}
export const salesforceUpdateTaskTool: ToolConfig<
SalesforceUpdateTaskParams,

View File

@@ -44,9 +44,18 @@ app:
NODE_ENV: "production"
NEXT_TELEMETRY_DISABLED: "1"
# AWS-specific environment variables
# AWS S3 Cloud Storage Configuration (RECOMMENDED for production)
# Create S3 buckets in your AWS account and configure IAM permissions
AWS_REGION: "us-west-2"
AWS_ACCESS_KEY_ID: "" # AWS access key (or use IRSA for EKS)
AWS_SECRET_ACCESS_KEY: "" # AWS secret key (or use IRSA for EKS)
S3_BUCKET_NAME: "workspace-files" # Workspace files
S3_KB_BUCKET_NAME: "knowledge-base" # Knowledge base documents
S3_EXECUTION_FILES_BUCKET_NAME: "execution-files" # Workflow execution outputs
S3_CHAT_BUCKET_NAME: "chat-files" # Deployed chat assets
S3_COPILOT_BUCKET_NAME: "copilot-files" # Copilot attachments
S3_PROFILE_PICTURES_BUCKET_NAME: "profile-pictures" # User avatars
# Realtime service
realtime:

View File

@@ -42,10 +42,23 @@ app:
# Optional: API Key Encryption (RECOMMENDED for production)
# Generate 64-character hex string using: openssl rand -hex 32
API_ENCRYPTION_KEY: "your-64-char-hex-api-encryption-key-here" # Optional but recommended
NODE_ENV: "production"
NEXT_TELEMETRY_DISABLED: "1"
# Azure Blob Storage Configuration (RECOMMENDED for production)
# Create a storage account and containers in your Azure subscription
AZURE_ACCOUNT_NAME: "simstudiostorageacct" # Azure storage account name
AZURE_ACCOUNT_KEY: "" # Storage account access key
# Or use connection string instead of account name/key:
# AZURE_CONNECTION_STRING: "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...;EndpointSuffix=core.windows.net"
AZURE_STORAGE_CONTAINER_NAME: "workspace-files" # Workspace files container
AZURE_STORAGE_KB_CONTAINER_NAME: "knowledge-base" # Knowledge base documents container
AZURE_STORAGE_EXECUTION_FILES_CONTAINER_NAME: "execution-files" # Workflow execution outputs
AZURE_STORAGE_CHAT_CONTAINER_NAME: "chat-files" # Deployed chat assets container
AZURE_STORAGE_COPILOT_CONTAINER_NAME: "copilot-files" # Copilot attachments container
AZURE_STORAGE_PROFILE_PICTURES_CONTAINER_NAME: "profile-pictures" # User avatars container
# Realtime service
realtime:
enabled: true

View File

@@ -116,8 +116,37 @@ app:
# Access Control (leave empty if not restricting login)
ALLOWED_LOGIN_EMAILS: "" # Comma-separated list of allowed email addresses for login
ALLOWED_LOGIN_DOMAINS: "" # Comma-separated list of allowed email domains for login
# SSO Configuration (Enterprise Single Sign-On)
# Set to "true" AFTER running the SSO registration script
SSO_ENABLED: "" # Enable SSO authentication ("true" to enable)
NEXT_PUBLIC_SSO_ENABLED: "" # Show SSO login button in UI ("true" to enable)
# AWS S3 Cloud Storage Configuration (optional - for file storage)
# If configured, files will be stored in S3 instead of local storage
AWS_REGION: "" # AWS region (e.g., "us-east-1")
AWS_ACCESS_KEY_ID: "" # AWS access key ID
AWS_SECRET_ACCESS_KEY: "" # AWS secret access key
S3_BUCKET_NAME: "" # S3 bucket for workspace files
S3_KB_BUCKET_NAME: "" # S3 bucket for knowledge base files
S3_EXECUTION_FILES_BUCKET_NAME: "" # S3 bucket for workflow execution files
S3_CHAT_BUCKET_NAME: "" # S3 bucket for deployed chat files
S3_COPILOT_BUCKET_NAME: "" # S3 bucket for copilot files
S3_PROFILE_PICTURES_BUCKET_NAME: "" # S3 bucket for user profile pictures
# Azure Blob Storage Configuration (optional - for file storage)
# If configured, files will be stored in Azure Blob instead of local storage
# Note: Azure Blob takes precedence over S3 if both are configured
AZURE_ACCOUNT_NAME: "" # Azure storage account name
AZURE_ACCOUNT_KEY: "" # Azure storage account key
AZURE_CONNECTION_STRING: "" # Azure connection string (alternative to account name/key)
AZURE_STORAGE_CONTAINER_NAME: "" # Azure container for workspace files
AZURE_STORAGE_KB_CONTAINER_NAME: "" # Azure container for knowledge base files
AZURE_STORAGE_EXECUTION_FILES_CONTAINER_NAME: "" # Azure container for workflow execution files
AZURE_STORAGE_CHAT_CONTAINER_NAME: "" # Azure container for deployed chat files
AZURE_STORAGE_COPILOT_CONTAINER_NAME: "" # Azure container for copilot files
AZURE_STORAGE_PROFILE_PICTURES_CONTAINER_NAME: "" # Azure container for user profile pictures
# Service configuration
service:
type: ClusterIP
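The chart comments above state that Azure Blob takes precedence over S3 when both are configured, with local storage as the fallback. A hedged sketch of that selection order; `pickStorageProvider` is an illustrative name, and the minimal env checks are assumptions rather than the app's actual logic:

```typescript
// Selection order implied by the values-file comments: Azure > S3 > local.
type StorageProvider = 'azure' | 's3' | 'local'

function pickStorageProvider(env: Record<string, string | undefined>): StorageProvider {
  // Azure is usable via a connection string, or an account name/key pair.
  if (env.AZURE_CONNECTION_STRING || (env.AZURE_ACCOUNT_NAME && env.AZURE_ACCOUNT_KEY)) {
    return 'azure'
  }
  // S3 needs at minimum a region and a bucket.
  if (env.AWS_REGION && env.S3_BUCKET_NAME) {
    return 's3'
  }
  return 'local'
}
```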