Compare commits

..

38 Commits

Author SHA1 Message Date
Waleed
d63a5cb504 v0.5.71: ux, ci improvements, docs updates 2026-01-25 03:08:08 -08:00
Waleed
8bd5d41723 v0.5.70: router fix, anthropic agent response format adherence 2026-01-24 20:57:02 -08:00
Waleed
c12931bc50 v0.5.69: kb upgrades, blog, copilot improvements, auth consolidation (#2973)
* fix(subflows): tag dropdown + resolution logic (#2949)

* fix(subflows): tag dropdown + resolution logic

* fixes;

* revert parallel change

* chore(deps): bump posthog-js to 1.334.1 (#2948)

* fix(idempotency): add conflict target to atomicallyClaimDb query + remove redundant db namespace tracking (#2950)

* fix(idempotency): add conflict target to atomicallyClaimDb query

* delete needs to account for namespace

* simplify namespace filtering logic

* fix cleanup

* consistent target
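
For context, a minimal sketch of the claim pattern these commits describe, assuming a hypothetical drizzle-orm Postgres table `idempotencyKey` with a unique index on `(namespace, key)`; the real `atomicallyClaimDb` query lives in the Sim codebase.

// Hypothetical sketch: atomically claim an idempotency key. ON CONFLICT
// needs an explicit target so concurrent claims on the same
// (namespace, key) pair race safely instead of erroring.
const namespace = 'webhooks' // hypothetical value
const key = 'evt_123'        // hypothetical value
const claimed = await db
  .insert(idempotencyKey)
  .values({ namespace, key })
  .onConflictDoNothing({ target: [idempotencyKey.namespace, idempotencyKey.key] })
  .returning()
const weOwnTheClaim = claimed.length > 0 // empty result: another worker claimed it first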

* improvement(kb): add document filtering, select all, and React Query migration (#2951)

* improvement(kb): add document filtering, select all, and React Query migration

* test(kb): update tests for enabledFilter and removed userId params

* fix(kb): remove non-null assertion, add explicit guard

* improvement(logs): trace span, details (#2952)

* improvement(action-bar): ordering

* improvement(logs): details, trace span

* feat(blog): v0.5 release post (#2953)

* feat(blog): v0.5 post

* improvement(blog): simplify title and remove code block header

- Simplified blog title from Introducing Sim Studio v0.5 to Introducing Sim v0.5
- Removed language label header and copy button from code blocks for cleaner appearance

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

* ack PR comments

* small styling improvements

* created a system for post-specific components

* updated component

* cache invalidation

---------

Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>

* feat(admin): add credits endpoint to issue credits to users (#2954)

* feat(admin): add credits endpoint to issue credits to users

* fix(admin): use existing credit functions and handle enterprise seats

* fix(admin): reject NaN and Infinity in amount validation

* styling

* fix(admin): validate userId and email are strings
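
A minimal sketch of the validation constraints these fixes describe (body shape and names are assumptions, not the actual handler):

// Hypothetical request-body guard mirroring the fixes above:
// reject NaN/Infinity amounts and non-string userId/email values.
function isValidCreditsRequest(body: { amount?: unknown; userId?: unknown; email?: unknown }): boolean {
  if (typeof body.amount !== 'number' || !Number.isFinite(body.amount)) return false
  if (body.userId !== undefined && typeof body.userId !== 'string') return false
  if (body.email !== undefined && typeof body.email !== 'string') return false
  return true
}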

* improvement(copilot): fast mode, subagent tool responses and allow preferences (#2955)

* Improvements

* Fix actions mapping

* Remove console logs

* fix(billing): handle missing userStats and prevent crashes (#2956)

* fix(billing): handle missing userStats and prevent crashes

* fix(billing): correct import path for getFilledPillColor

* fix(billing): add Number.isFinite check to lastPeriodCost

* fix(logs): refresh logic to refresh logs details (#2958)

* fix(security): add authentication and input validation to API routes (#2959)

* fix(security): add authentication and input validation to API routes

* moved utils

* remove extraneous comments

* removed unused dep

* improvement(helm): add internal ingress support and same-host path consolidation (#2960)

* improvement(helm): add internal ingress support and same-host path consolidation

* improvement(helm): clean up ingress template comments

Simplify verbose inline Helm comments and section dividers to match the
minimal style used in services.yaml.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

* fix(helm): add missing copilot path consolidation for realtime host

When copilot.host equals realtime.host but differs from app.host,
copilot paths were not being routed. Added logic to consolidate
copilot paths into the realtime rule for this scenario.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

* improvement(helm): follow ingress best practices

- Remove orphan comments that appeared when services were disabled
- Add documentation about path ordering requirements
- Paths rendered in order: realtime, copilot, app (specific before catch-all)
- Clean template output matching industry Helm chart standards

---------

Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>

* feat(blog): enterprise post (#2961)

* feat(blog): enterprise post

* added more images, styling

* more content

* updated v0-5 post

* remove unused transition

---------

Co-authored-by: Vikhyath Mondreti <vikhyath@simstudio.ai>

* fix(envvars): resolution standardized (#2957)

* fix(envvars): resolution standardized

* remove comments

* address bugbot

* fix highlighting for env vars

* remove comments

* address greptile

* address bugbot

* fix(copilot): mask credentials fix (#2963)

* Fix copilot masking

* Clean up

* Lint

* improvement(webhooks): remove dead code (#2965)

* fix(webhooks): subscription recreation path

* improvement(webhooks): remove dead code

* fix tests

* address bugbot comments

* fix restoration edge case

* fix more edge cases

* address bugbot comments

* fix gmail polling

* add warnings for UI indication for credential sets

* fix(preview): subblock values (#2969)

* fix(child-workflow): nested spans handoff (#2966)

* fix(child-workflow): nested spans handoff

* remove overly defensive programming

* update type check

* type more code

* remove more dead code

* address bugbot comments

* fix(security): restrict API key access on internal-only routes (#2964)

* fix(security): restrict API key access on internal-only routes

* test(security): update function execute tests for checkInternalAuth

* updated agent handler

* move session check higher in checkSessionOrInternalAuth

* extracted duplicate code into helper for resolving user from jwt
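
A hedged sketch of the session-first flow described above; `getSession` and `checkInternalAuth` appear elsewhere in this changeset, but the body of `checkSessionOrInternalAuth` here is assumed.

import { type NextRequest } from 'next/server'
import { getSession } from '@/lib/auth'
import { checkInternalAuth } from '@/lib/auth/hybrid'

// Prefer the browser session, then fall back to internal (server-to-server)
// auth; plain API keys are intentionally rejected on internal-only routes.
async function checkSessionOrInternalAuth(request: NextRequest) {
  const session = await getSession()
  if (session?.user?.id) return { success: true as const, userId: session.user.id }
  const internal = await checkInternalAuth(request, { requireWorkflowId: false })
  return internal.success ? internal : { success: false as const, error: 'Authentication required' }
}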

* fix(copilot): update copilot chat title (#2968)

* fix(hitl): fix condition blocks after hitl (#2967)

* fix(notes): ghost edges (#2970)

* fix(notes): ghost edges

* fix deployed state fallback

* fallback

* remove UI level checks

* annotation missing from autoconnect source check

* improvement(docs): loop and parallel var reference syntax (#2975)

* fix(blog): slash actions description (#2976)

* improvement(docs): loop and parallel var reference syntax

* fix(blog): slash actions description

* fix(auth): copilot routes (#2977)

* Fix copilot auth

* Fix

* Fix

* Fix

* fix(copilot): fix edit summary for loops/parallels (#2978)

* fix(integrations): hide from tool bar (#2544)

* fix(landing): ui (#2979)

* fix(edge-validation): race condition on collaborative add (#2980)

* fix(variables): boolean type support and input improvements (#2981)

* fix(variables): boolean type support and input improvements

* fix formatting

---------

Co-authored-by: Vikhyath Mondreti <vikhyathvikku@gmail.com>
Co-authored-by: Emir Karabeg <78010029+emir-karabeg@users.noreply.github.com>
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
Co-authored-by: Siddharth Ganesan <33737564+Sg312@users.noreply.github.com>
Co-authored-by: Vikhyath Mondreti <vikhyath@simstudio.ai>
2026-01-24 14:29:53 -08:00
Waleed
e9c4251c1c v0.5.68: router block reasoning, executor improvements, variable resolution consolidation, helm updates (#2946)
* improvement(workflow-item): stabilize avatar layout and fix name truncation (#2939)

* improvement(workflow-item): stabilize avatar layout and fix name truncation

* fix(avatars): revert overflow bg to hardcoded color for contrast

* fix(executor): stop parallel execution when block errors (#2940)

* improvement(helm): add per-deployment extraVolumes support (#2942)

* fix(gmail): expose messageId field in read email block (#2943)

* fix(resolver): consolidate reference resolution  (#2941)

* fix(resolver): consolidate code to resolve references

* fix edge cases

* use already formatted error

* fix multi index

* fix backwards compat reachability

* handle backwards compatibility accurately

* use shared constant correctly

* feat(router): expose reasoning output in router v2 block (#2945)

* fix(copilot): always allow, credential masking (#2947)

* Fix always allow, credential validation

* Credential masking

* Autoload

* fix(executor): handle condition dead-end branches in loops (#2944)

---------

Co-authored-by: Vikhyath Mondreti <vikhyathvikku@gmail.com>
Co-authored-by: Siddharth Ganesan <33737564+Sg312@users.noreply.github.com>
2026-01-22 13:48:15 -08:00
Waleed
cc2be33d6b v0.5.67: loading, password reset, ui improvements, helm updates (#2928)
* fix(zustand): updated to useShallow from deprecated createWithEqualityFn (#2919)

* fix(logger): use direct env access for webpack inlining (#2920)

* fix(notifications): text overflow with line-clamp (#2921)

* chore(helm): add env vars for Vertex AI, orgs, and telemetry (#2922)

* fix(auth): improve reset password flow and consolidate brand detection (#2924)

* fix(auth): improve reset password flow and consolidate brand detection

* fix(auth): set errorHandled for EMAIL_NOT_VERIFIED to prevent duplicate error

* fix(auth): clear success message on login errors

* chore(auth): fix import order per lint

* fix(action-bar): duplicate subflows with children (#2923)

* fix(action-bar): duplicate subflows with children

* fix(action-bar): add validateTriggerPaste for subflow duplicate

* fix(resolver): agent response format, input formats, root level (#2925)

* fix(resolvers): agent response format, input formats, root level

* fix response block initial seeding

* fix tests

* fix(messages-input): fix cursor alignment and auto-resize with overlay (#2926)

* fix(messages-input): fix cursor alignment and auto-resize with overlay

* fixed remaining zustand warnings

* fix(stores): remove dead code causing log spam on startup (#2927)

* fix(stores): remove dead code causing log spam on startup

* fix(stores): replace custom tools zustand store with react query cache

* improvement(ui): use BrandedButton and BrandedLink components (#2930)

- Refactor auth forms to use BrandedButton component
- Add BrandedLink component for changelog page
- Reduce code duplication in login, signup, reset-password forms
- Update star count default value

* fix(custom-tools): remove unsafe title fallback in getCustomTool (#2929)

* fix(custom-tools): remove unsafe title fallback in getCustomTool

* fix(custom-tools): restore title fallback in getCustomTool lookup

Custom tools are referenced by title (custom_${title}), not database ID.
The title fallback is required for client-side tool resolution to work.
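
A rough sketch of the restored lookup order (the array shape and primary-key lookup are assumptions):

// Custom tools are referenced by `custom_${title}` on the client, so the
// title fallback must stay even when ids are the primary lookup key.
function getCustomTool(ref: string, tools: Array<{ id: string; title: string }>) {
  return tools.find((t) => t.id === ref) ?? tools.find((t) => `custom_${t.title}` === ref)
}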

* fix(null-bodies): empty bodies handling (#2931)

* fix(null-statuses): empty bodies handling

* address bugbot comment

* fix(token-refresh): microsoft, notion, x, linear (#2933)

* fix(microsoft): proactive refresh needed

* fix(x): missing token refresh flag

* notion and linear missing flag too

* address bugbot comment

* fix(auth): handle EMAIL_NOT_VERIFIED in onError callback (#2932)

* fix(auth): handle EMAIL_NOT_VERIFIED in onError callback

* refactor(auth): extract redirectToVerify helper to reduce duplication

* fix(workflow-selector): use dedicated selector for workflow dropdown (#2934)

* feat(workflow-block): preview (#2935)

* improvement(copilot): tool configs to show nested props (#2936)

* fix(auth): add genericOAuth providers to trustedProviders (#2937)

---------

Co-authored-by: Vikhyath Mondreti <vikhyathvikku@gmail.com>
Co-authored-by: Emir Karabeg <78010029+emir-karabeg@users.noreply.github.com>
2026-01-21 22:53:25 -08:00
Vikhyath Mondreti
45371e521e v0.5.66: external http requests fix, ring highlighting 2026-01-21 02:55:39 -08:00
Waleed
0ce0f98aa5 v0.5.65: gemini updates, textract integration, ui updates (#2909)
* fix(google): wrap primitive tool responses for Gemini API compatibility (#2900)

* fix(canonical): copilot path + update parent (#2901)

* fix(rss): add top-level title, link, pubDate fields to RSS trigger output (#2902)

* fix(rss): add top-level title, link, pubDate fields to RSS trigger output

* fix(imap): add top-level fields to IMAP trigger output

* improvement(browseruse): add profile id param (#2903)

* improvement(browseruse): add profile id param

* make request a stub since we have directExec

* improvement(executor): upgraded abort controller to handle aborts for loops and parallels (#2880)

* improvement(executor): upgraded abort controller to handle aborts for loops and parallels

* comments

* improvement(files): update execution for passing base64 strings (#2906)

* progress

* improvement(execution): update execution for passing base64 strings

* fix types

* cleanup comments

* path security vuln

* reject promise correctly

* fix redirect case

* remove proxy routes

* fix tests

* use ipaddr

* feat(tools): added textract, added v2 for mistral, updated tag dropdown (#2904)

* feat(tools): added textract

* cleanup

* ack pr comments

* reorder

* removed upload for textract async version

* fix additional fields dropdown in editor, update parser to leave validation to the server

* added mistral v2, files v2, and finalized textract

* updated the rest of the old file patterns, updated mistral outputs for v2

* updated tag dropdown to parse non-operation fields as well

* updated extension finder

* cleanup

* added description for inputs to workflow

* use helper for internal route check

* fix tag dropdown merge conflict change

* remove duplicate code

---------

Co-authored-by: Vikhyath Mondreti <vikhyath@simstudio.ai>

* fix(ui): change add inputs button to match output selector (#2907)

* fix(canvas): removed invite to workspace from canvas popover (#2908)

* fix(canvas): removed invite to workspace

* removed unused props

* fix(copilot): legacy tool display names (#2911)

* fix(a2a): canonical merge  (#2912)

* fix canonical merge

* fix empty array case

* fix(change-detection): copilot diffs have extra field (#2913)

* improvement(logs): improved logs ui bugs, added subflow disable UI (#2910)

* improvement(logs): improved logs ui bugs, added subflow disable UI

* added duplicate to action bar for subflows

* feat(broadcast): email v0.5 (#2905)

---------

Co-authored-by: Vikhyath Mondreti <vikhyathvikku@gmail.com>
Co-authored-by: Vikhyath Mondreti <vikhyath@simstudio.ai>
Co-authored-by: Emir Karabeg <78010029+emir-karabeg@users.noreply.github.com>
2026-01-20 23:54:55 -08:00
Waleed
dff1c9d083 v0.5.64: unsubscribe, search improvements, metrics, additional SSO configuration 2026-01-20 00:34:11 -08:00
Vikhyath Mondreti
b09f683072 v0.5.63: ui and performance improvements, more google tools 2026-01-18 15:22:42 -08:00
Vikhyath Mondreti
a8bb0db660 v0.5.62: webhook bug fixes, seeding default subblock values, block selection fixes 2026-01-16 20:27:06 -08:00
Waleed
af82820a28 v0.5.61: webhook improvements, workflow controls, react query for deployment status, chat fixes, reducto and pulse OCR, linear fixes 2026-01-16 18:06:23 -08:00
Waleed
4372841797 v0.5.60: invitation flow improvements, chat fixes, a2a improvements, additional copilot actions 2026-01-15 00:02:18 -08:00
Waleed
5e8c843241 v0.5.59: a2a support, documentation 2026-01-13 13:21:21 -08:00
Waleed
7bf3d73ee6 v0.5.58: export folders, new tools, permissions groups enhancements 2026-01-13 00:56:59 -08:00
Vikhyath Mondreti
7ffc11a738 v0.5.57: subagents, context menu improvements, bug fixes 2026-01-11 11:38:40 -08:00
Waleed
be578e2ed7 v0.5.56: batch operations, access control and permission groups, billing fixes 2026-01-10 00:31:34 -08:00
Waleed
f415e5edc4 v0.5.55: polling groups, bedrock provider, devcontainer fixes, workflow preview enhancements 2026-01-08 23:36:56 -08:00
Waleed
13a6e6c3fa v0.5.54: seo, model blacklist, helm chart updates, fireflies integration, autoconnect improvements, billing fixes 2026-01-07 16:09:45 -08:00
Waleed
f5ab7f21ae v0.5.53: hotkey improvements, added redis fallback, fixes for workflow tool 2026-01-06 23:34:52 -08:00
Waleed
bfb6fffe38 v0.5.52: new port-based router block, combobox expression and variable support 2026-01-06 16:14:10 -08:00
Waleed
4fbec0a43f v0.5.51: triggers, kb, condition block improvements, supabase and grain integration updates 2026-01-06 14:26:46 -08:00
Waleed
585f5e365b v0.5.50: import improvements, ui upgrades, kb styling and performance improvements 2026-01-05 00:35:55 -08:00
Waleed
3792bdd252 v0.5.49: hitl improvements, new email styles, imap trigger, logs context menu (#2672)
* feat(logs-context-menu): consolidated logs utils and types, added logs record context menu (#2659)

* feat(email): welcome email; improvement(emails): ui/ux (#2658)

* feat(email): welcome email; improvement(emails): ui/ux

* improvement(emails): links, accounts, preview

* refactor(emails): file structure and wrapper components

* added envvar for personal emails sent, added isHosted gate

* fixed failing tests, added env mock

* fix: removed comment

---------

Co-authored-by: waleed <walif6@gmail.com>

* fix(logging): hitl + trigger dev crash protection (#2664)

* hitl gaps

* deal with trigger worker crashes

* clean up import structure

* feat(imap): added support for imap trigger (#2663)

* feat(tools): added support for imap trigger

* feat(imap): added parity, tested

* ack PR comments

* final cleanup

* feat(i18n): update translations (#2665)

Co-authored-by: waleedlatif1 <waleedlatif1@users.noreply.github.com>

* fix(grain): updated grain trigger to auto-establish trigger (#2666)

Co-authored-by: aadamgough <adam@sim.ai>

* feat(admin): routes to manage deployments (#2667)

* feat(admin): routes to manage deployments

* fix naming of deployed by

* feat(time-picker): added timepicker emcn component, added to playground, added searchable prop for dropdown, added more timezones for schedule, updated license and notice date (#2668)

* feat(time-picker): added timepicker emcn component, added to playground, added searchable prop for dropdown, added more timezones for schedule, updated license and notice date

* removed unused params, cleaned up redundant utils

* improvement(invite): aligned styling (#2669)

* improvement(invite): aligned with rest of app

* fix(invite): error handling

* fix: addressed comments

---------

Co-authored-by: Emir Karabeg <78010029+emir-karabeg@users.noreply.github.com>
Co-authored-by: Vikhyath Mondreti <vikhyathvikku@gmail.com>
Co-authored-by: waleedlatif1 <waleedlatif1@users.noreply.github.com>
Co-authored-by: Adam Gough <77861281+aadamgough@users.noreply.github.com>
Co-authored-by: aadamgough <adam@sim.ai>
2026-01-03 13:19:18 -08:00
Waleed
eb5d1f3e5b v0.5.48: copy-paste workflow blocks, docs updates, mcp tool fixes 2025-12-31 18:00:04 -08:00
Waleed
54ab82c8dd v0.5.47: deploy workflow as mcp, kb chunks tokenizer, UI improvements, jira service management tools 2025-12-30 23:18:58 -08:00
Waleed
f895bf469b v0.5.46: build improvements, greptile, light mode improvements 2025-12-29 02:17:52 -08:00
Waleed
dd3209af06 v0.5.45: light mode fixes, realtime usage indicator, docker build improvements 2025-12-27 19:57:42 -08:00
Waleed
b6ba3b50a7 v0.5.44: keyboard shortcuts, autolayout, light mode, byok, testing improvements 2025-12-26 21:25:19 -08:00
Waleed
b304233062 v0.5.43: export logs, circleback, grain, vertex, code hygiene, schedule improvements 2025-12-23 19:19:18 -08:00
Vikhyath Mondreti
57e4b49bd6 v0.5.42: fix memory migration 2025-12-23 01:24:54 -08:00
Vikhyath Mondreti
e12dd204ed v0.5.41: memory fixes, copilot improvements, knowledgebase improvements, LLM providers standardization 2025-12-23 00:15:18 -08:00
Vikhyath Mondreti
3d9d9cbc54 v0.5.40: supabase ops to allow non-public schemas, jira uuid 2025-12-21 22:28:05 -08:00
Waleed
0f4ec962ad v0.5.39: notion, workflow variables fixes 2025-12-20 20:44:00 -08:00
Waleed
4827866f9a v0.5.38: snap to grid, copilot ux improvements, billing line items 2025-12-20 17:24:38 -08:00
Waleed
3e697d9ed9 v0.5.37: redaction utils consolidation, logs updates, autoconnect improvements, additional kb tag types 2025-12-19 22:31:55 -08:00
Martin Yankov
4431a1a484 fix(helm): add custom egress rules to realtime network policy (#2481)
The realtime service network policy was missing the custom egress rules section
that allows configuration of additional egress rules via values.yaml. This caused
the realtime pods to be unable to connect to external databases (e.g., PostgreSQL
on port 5432) when using external database configurations.

The app network policy already had this section, but the realtime network policy
was missing it, creating an inconsistency and preventing the realtime service
from accessing external databases configured via networkPolicy.egress values.

This fix adds the same custom egress rules template section to the realtime
network policy, matching the app network policy behavior and allowing users to
configure database connectivity via values.yaml.
2025-12-19 18:59:08 -08:00
Waleed
4d1a9a3f22 v0.5.36: hitl improvements, opengraph, slack fixes, one-click unsubscribe, auth checks, new db indexes 2025-12-19 01:27:49 -08:00
Vikhyath Mondreti
eb07a080fb v0.5.35: helm updates, copilot improvements, 404 for docs, salesforce fixes, subflow resize clamping 2025-12-18 16:23:19 -08:00
68 changed files with 578 additions and 1113 deletions

View File

@@ -44,7 +44,7 @@ services:
deploy:
resources:
limits:
memory: 1G
memory: 4G
environment:
- NODE_ENV=development
- DATABASE_URL=postgresql://postgres:postgres@db:5432/simstudio

View File

@@ -1,40 +0,0 @@
'use client'
import { getAssetUrl } from '@/lib/utils'
interface ActionImageProps {
src: string
alt: string
}
interface ActionVideoProps {
src: string
alt: string
}
export function ActionImage({ src, alt }: ActionImageProps) {
const resolvedSrc = getAssetUrl(src.startsWith('/') ? src.slice(1) : src)
return (
<img
src={resolvedSrc}
alt={alt}
className='inline-block w-full max-w-[200px] rounded border border-neutral-200 dark:border-neutral-700'
/>
)
}
export function ActionVideo({ src, alt }: ActionVideoProps) {
const resolvedSrc = getAssetUrl(src.startsWith('/') ? src.slice(1) : src)
return (
<video
src={resolvedSrc}
autoPlay
loop
muted
playsInline
className='inline-block w-full max-w-[200px] rounded border border-neutral-200 dark:border-neutral-700'
/>
)
}

View File

@@ -10,20 +10,12 @@ Stellen Sie Sim auf Ihrer eigenen Infrastruktur mit Docker oder Kubernetes berei
## Anforderungen
| Ressource | Klein | Standard | Produktion |
|----------|-------|----------|------------|
| CPU | 2 Kerne | 4 Kerne | 8+ Kerne |
| RAM | 12 GB | 16 GB | 32+ GB |
| Speicher | 20 GB SSD | 50 GB SSD | 100+ GB SSD |
| Docker | 20.10+ | 20.10+ | Neueste Version |
**Klein**: Entwicklung, Tests, Einzelnutzer (1-5 Nutzer)
**Standard**: Teams (5-50 Nutzer), moderate Arbeitslasten
**Produktion**: Große Teams (50+ Nutzer), Hochverfügbarkeit, intensive Workflow-Ausführung
<Callout type="info">
Die Ressourcenanforderungen werden durch Workflow-Ausführung (isolated-vm Sandboxing), Dateiverarbeitung (In-Memory-Dokumentenparsing) und Vektoroperationen (pgvector) bestimmt. Arbeitsspeicher ist typischerweise der limitierende Faktor, nicht CPU. Produktionsdaten zeigen, dass die Hauptanwendung durchschnittlich 4-8 GB und bei hoher Last bis zu 12 GB benötigt.
</Callout>
| Ressource | Minimum | Empfohlen |
|----------|---------|-------------|
| CPU | 2 Kerne | 4+ Kerne |
| RAM | 12 GB | 16+ GB |
| Speicher | 20 GB SSD | 50+ GB SSD |
| Docker | 20.10+ | Neueste Version |
## Schnellstart

View File

@@ -4,7 +4,6 @@ description: Essential actions for navigating and using the Sim workflow editor
---
import { Callout } from 'fumadocs-ui/components/callout'
import { ActionImage, ActionVideo } from '@/components/ui/action-media'
A quick lookup for everyday actions in the Sim workflow editor. For keyboard shortcuts, see [Keyboard Shortcuts](/keyboard-shortcuts).
@@ -14,362 +13,124 @@ A quick lookup for everyday actions in the Sim workflow editor. For keyboard sho
## Workspaces
<table>
<thead>
<tr><th>Action</th><th>How</th><th>Preview</th></tr>
</thead>
<tbody>
<tr>
<td>Create a workspace</td>
<td>Click workspace dropdown → **New Workspace**</td>
<td><ActionVideo src="/static/quick-reference/create-workspace.mp4" alt="Create workspace" /></td>
</tr>
<tr>
<td>Switch workspaces</td>
<td>Click workspace dropdown → Select workspace</td>
<td><ActionVideo src="/static/quick-reference/switch-workspace.mp4" alt="Switch workspaces" /></td>
</tr>
<tr>
<td>Invite team members</td>
<td>Sidebar → **Invite**</td>
<td><ActionVideo src="/static/quick-reference/invite.mp4" alt="Invite team members" /></td>
</tr>
<tr>
<td>Rename a workspace</td>
<td>Right-click workspace → **Rename**</td>
<td rowSpan={4}><ActionImage src="/static/quick-reference/workspace-context-menu.png" alt="Workspace context menu" /></td>
</tr>
<tr>
<td>Duplicate a workspace</td>
<td>Right-click workspace → **Duplicate**</td>
</tr>
<tr>
<td>Export a workspace</td>
<td>Right-click workspace → **Export**</td>
</tr>
<tr>
<td>Delete a workspace</td>
<td>Right-click workspace → **Delete**</td>
</tr>
</tbody>
</table>
| Action | How |
|--------|-----|
| Create a workspace | Click workspace dropdown in sidebar → **New Workspace** |
| Rename a workspace | Workspace settings → Edit name |
| Switch workspaces | Click workspace dropdown in sidebar → Select workspace |
| Invite team members | Workspace settings → **Team** → **Invite** |
## Workflows
<table>
<thead>
<tr><th>Action</th><th>How</th><th>Preview</th></tr>
</thead>
<tbody>
<tr>
<td>Create a workflow</td>
<td>Click **+** button in sidebar</td>
<td><ActionImage src="/static/quick-reference/create-workflow.png" alt="Create workflow" /></td>
</tr>
<tr>
<td>Reorder / move workflows</td>
<td>Drag workflow up/down or onto a folder</td>
<td><ActionVideo src="/static/quick-reference/reordering.mp4" alt="Reorder workflows" /></td>
</tr>
<tr>
<td>Import a workflow</td>
<td>Click import button in sidebar → Select file</td>
<td><ActionImage src="/static/quick-reference/import-workflow.png" alt="Import workflow" /></td>
</tr>
<tr>
<td>Multi-select workflows</td>
<td>`Mod+Click` or `Shift+Click` workflows in sidebar</td>
<td><ActionVideo src="/static/quick-reference/multiselect.mp4" alt="Multi-select workflows" /></td>
</tr>
<tr>
<td>Open in new tab</td>
<td>Right-click workflow → **Open in New Tab**</td>
<td rowSpan={6}><ActionImage src="/static/quick-reference/workflow-context-menu.png" alt="Workflow context menu" /></td>
</tr>
<tr>
<td>Rename a workflow</td>
<td>Right-click workflow → **Rename**</td>
</tr>
<tr>
<td>Assign workflow color</td>
<td>Right-click workflow → **Change Color**</td>
</tr>
<tr>
<td>Duplicate a workflow</td>
<td>Right-click workflow → **Duplicate**</td>
</tr>
<tr>
<td>Export a workflow</td>
<td>Right-click workflow → **Export**</td>
</tr>
<tr>
<td>Delete a workflow</td>
<td>Right-click workflow → **Delete**</td>
</tr>
<tr>
<td>Rename a folder</td>
<td>Right-click folder → **Rename**</td>
<td rowSpan={6}><ActionImage src="/static/quick-reference/folder-context-menu.png" alt="Folder context menu" /></td>
</tr>
<tr>
<td>Create workflow in folder</td>
<td>Right-click folder → **Create workflow**</td>
</tr>
<tr>
<td>Create folder in folder</td>
<td>Right-click folder → **Create folder**</td>
</tr>
<tr>
<td>Duplicate a folder</td>
<td>Right-click folder → **Duplicate**</td>
</tr>
<tr>
<td>Export a folder</td>
<td>Right-click folder → **Export**</td>
</tr>
<tr>
<td>Delete a folder</td>
<td>Right-click folder → **Delete**</td>
</tr>
</tbody>
</table>
| Action | How |
|--------|-----|
| Create a workflow | Click **New Workflow** button or `Mod+Shift+A` |
| Rename a workflow | Double-click workflow name in sidebar, or right-click → **Rename** |
| Duplicate a workflow | Right-click workflow → **Duplicate** |
| Reorder workflows | Drag workflow up/down in the sidebar list |
| Import a workflow | Sidebar menu → **Import** → Select file |
| Create a folder | Right-click in sidebar → **New Folder** |
| Rename a folder | Right-click folder → **Rename** |
| Delete a folder | Right-click folder → **Delete** |
| Collapse/expand folder | Click folder arrow, or double-click folder |
| Move workflow to folder | Drag workflow onto folder in sidebar |
| Delete a workflow | Right-click workflow → **Delete** |
| Export a workflow | Right-click workflow → **Export** |
| Assign workflow color | Right-click workflow → **Change Color** |
| Multi-select workflows | `Mod+Click` or `Shift+Click` workflows in sidebar |
| Open in new tab | Right-click workflow → **Open in New Tab** |
## Blocks
<table>
<thead>
<tr><th>Action</th><th>How</th><th>Preview</th></tr>
</thead>
<tbody>
<tr>
<td>Add a block</td>
<td>Drag from Toolbar panel, or right-click canvas → **Add Block**</td>
<td><ActionVideo src="/static/quick-reference/add-block.mp4" alt="Add a block" /></td>
</tr>
<tr>
<td>Multi-select blocks</td>
<td>`Mod+Click` additional blocks, or shift-drag to draw selection box</td>
<td><ActionVideo src="/static/quick-reference/multiselect-blocks.mp4" alt="Multi-select blocks" /></td>
</tr>
<tr>
<td>Copy blocks</td>
<td>`Mod+C` with blocks selected</td>
<td rowSpan={2}><ActionVideo src="/static/quick-reference/copy-paste.mp4" alt="Copy and paste blocks" /></td>
</tr>
<tr>
<td>Paste blocks</td>
<td>`Mod+V` to paste copied blocks</td>
</tr>
<tr>
<td>Duplicate blocks</td>
<td>Right-click → **Duplicate**</td>
<td><ActionVideo src="/static/quick-reference/duplicate-block.mp4" alt="Duplicate blocks" /></td>
</tr>
<tr>
<td>Delete blocks</td>
<td>`Delete` or `Backspace` key, or right-click → **Delete**</td>
<td><ActionImage src="/static/quick-reference/delete-block.png" alt="Delete block" /></td>
</tr>
<tr>
<td>Rename a block</td>
<td>Click block name in header, or edit in the Editor panel</td>
<td><ActionVideo src="/static/quick-reference/rename-block.mp4" alt="Rename a block" /></td>
</tr>
<tr>
<td>Enable/Disable a block</td>
<td>Right-click → **Enable/Disable**</td>
<td><ActionImage src="/static/quick-reference/disable-block.png" alt="Disable block" /></td>
</tr>
<tr>
<td>Toggle handle orientation</td>
<td>Right-click → **Toggle Handles**</td>
<td><ActionVideo src="/static/quick-reference/toggle-handles.mp4" alt="Toggle handle orientation" /></td>
</tr>
<tr>
<td>Configure a block</td>
<td>Select block → use Editor panel on right</td>
<td><ActionVideo src="/static/quick-reference/configure-block.mp4" alt="Configure a block" /></td>
</tr>
</tbody>
</table>
| Action | How |
|--------|-----|
| Add a block | Drag from Toolbar panel, or right-click canvas → **Add Block** |
| Select a block | Click on the block |
| Multi-select blocks | `Mod+Click` additional blocks, or right-drag to draw selection box |
| Move blocks | Drag selected block(s) to new position |
| Copy blocks | `Mod+C` with blocks selected |
| Paste blocks | `Mod+V` to paste copied blocks |
| Duplicate blocks | Right-click → **Duplicate** |
| Delete blocks | `Delete` or `Backspace` key, or right-click → **Delete** |
| Rename a block | Click block name in header, or edit in the Editor panel |
| Enable/Disable a block | Right-click → **Enable/Disable** |
| Toggle handle orientation | Right-click → **Toggle Handles** |
| Toggle trigger mode | Right-click trigger block → **Toggle Trigger Mode** |
| Configure a block | Select block → use Editor panel on right |
## Connections
<table>
<thead>
<tr><th>Action</th><th>How</th><th>Preview</th></tr>
</thead>
<tbody>
<tr>
<td>Create a connection</td>
<td>Drag from output handle to input handle</td>
<td><ActionVideo src="/static/quick-reference/connect-blocks.mp4" alt="Connect blocks" /></td>
</tr>
<tr>
<td>Delete a connection</td>
<td>Click edge to select → `Delete` key</td>
<td><ActionVideo src="/static/quick-reference/delete-connection.mp4" alt="Delete connection" /></td>
</tr>
<tr>
<td>Use output in another block</td>
<td>Drag connection tag into input field</td>
<td><ActionVideo src="/static/quick-reference/connection-tag.mp4" alt="Use connection tag" /></td>
</tr>
</tbody>
</table>
| Action | How |
|--------|-----|
| Create a connection | Drag from output handle to input handle |
| Delete a connection | Click edge to select → `Delete` key |
| Use output in another block | Drag connection tag into input field |
## Canvas Navigation
| Action | How |
|--------|-----|
| Pan/move canvas | Left-drag on empty space, or scroll/trackpad |
| Zoom in/out | Scroll wheel or pinch gesture |
| Auto-layout | `Shift+L` |
| Draw selection box | Right-drag on empty canvas area |
## Panels & Views
<table>
<thead>
<tr><th>Action</th><th>How</th><th>Preview</th></tr>
</thead>
<tbody>
<tr>
<td>Search toolbar</td>
<td>`Mod+F`</td>
<td><ActionVideo src="/static/quick-reference/search-toolbar.mp4" alt="Search toolbar" /></td>
</tr>
<tr>
<td>Search everything</td>
<td>`Mod+K`</td>
<td><ActionImage src="/static/quick-reference/search-everything.png" alt="Search everything" /></td>
</tr>
<tr>
<td>Toggle manual mode</td>
<td>Click toggle button to switch between manual and selector</td>
<td><ActionImage src="/static/quick-reference/toggle-manual-mode.png" alt="Toggle manual mode" /></td>
</tr>
<tr>
<td>Collapse/expand sidebar</td>
<td>Click collapse button on sidebar</td>
<td><ActionVideo src="/static/quick-reference/collapse-sidebar.mp4" alt="Collapse sidebar" /></td>
</tr>
</tbody>
</table>
| Action | How |
|--------|-----|
| Open Copilot tab | Press `C` or click Copilot tab |
| Open Toolbar tab | Press `T` or click Toolbar tab |
| Open Editor tab | Press `E` or click Editor tab |
| Search toolbar | `Mod+F` |
| Toggle advanced mode | Click toggle button on input fields |
| Resize panels | Drag panel edge |
| Collapse/expand sidebar | Click collapse button on sidebar |
## Running & Testing
<table>
<thead>
<tr><th>Action</th><th>How</th><th>Preview</th></tr>
</thead>
<tbody>
<tr>
<td>Run workflow</td>
<td>Click Run Workflow button or `Mod+Enter`</td>
<td><ActionImage src="/static/quick-reference/run-workflow.png" alt="Run workflow" /></td>
</tr>
<tr>
<td>Stop workflow</td>
<td>Click Stop button or `Mod+Enter` while running</td>
<td><ActionImage src="/static/quick-reference/stop-workflow.png" alt="Stop workflow" /></td>
</tr>
<tr>
<td>Test with chat</td>
<td>Use Chat panel on the right side</td>
<td><ActionImage src="/static/quick-reference/test-chat.png" alt="Test with chat" /></td>
</tr>
<tr>
<td>Select output to view</td>
<td>Click dropdown in Chat panel → Select block output</td>
<td><ActionImage src="/static/quick-reference/output-select.png" alt="Select output to view" /></td>
</tr>
<tr>
<td>Clear chat history</td>
<td>Click clear button in Chat panel</td>
<td><ActionImage src="/static/quick-reference/clear-chat.png" alt="Clear chat history" /></td>
</tr>
<tr>
<td>View execution logs</td>
<td>Open terminal panel at bottom, or `Mod+L`</td>
<td><ActionImage src="/static/quick-reference/terminal.png" alt="Execution logs terminal" /></td>
</tr>
<tr>
<td>Filter logs by block or status</td>
<td>Click block filter in terminal or right-click log entry → **Filter by Block** or **Filter by Status**</td>
<td><ActionImage src="/static/quick-reference/filter-block.png" alt="Filter logs by block" /></td>
</tr>
<tr>
<td>Search logs</td>
<td>Use search field in terminal or right-click log entry → **Search**</td>
<td><ActionImage src="/static/quick-reference/terminal-search.png" alt="Search logs" /></td>
</tr>
<tr>
<td>Copy log entry</td>
<td>Clipboard icon or right-click log entry → **Copy**</td>
<td><ActionImage src="/static/quick-reference/copy-log.png" alt="Copy log entry" /></td>
</tr>
<tr>
<td>Clear terminal</td>
<td>Trash icon or `Mod+D`</td>
<td><ActionImage src="/static/quick-reference/clear-terminal.png" alt="Clear terminal" /></td>
</tr>
</tbody>
</table>
| Action | How |
|--------|-----|
| Run workflow | Click Play button or `Mod+Enter` |
| Stop workflow | Click Stop button or `Mod+Enter` while running |
| Test with chat | Use Chat panel on the right side |
| Select output to view | Click dropdown in Chat panel → Select block output |
| Clear chat history | Click clear button in Chat panel |
| View execution logs | Open terminal panel at bottom, or `Mod+L` |
| Filter logs by block | Click block filter in terminal |
| Filter logs by status | Click status filter in terminal |
| Search logs | Use search field in terminal |
| Copy log entry | Right-click log entry → **Copy** |
| Clear terminal | `Mod+D` |
## Deployment
<table>
<thead>
<tr><th>Action</th><th>How</th><th>Preview</th></tr>
</thead>
<tbody>
<tr>
<td>Deploy a workflow</td>
<td>Click **Deploy** button in panel</td>
<td><ActionImage src="/static/quick-reference/deploy.png" alt="Deploy workflow" /></td>
</tr>
<tr>
<td>Update deployment</td>
<td>Click **Update** when changes are detected</td>
<td><ActionImage src="/static/quick-reference/update-deployment.png" alt="Update deployment" /></td>
</tr>
<tr>
<td>View deployment status</td>
<td>Check status indicator (Live/Update/Deploy) in Deploy tab</td>
<td><ActionImage src="/static/quick-reference/view-deployment.png" alt="View deployment status" /></td>
</tr>
<tr>
<td>Revert deployment</td>
<td>Access previous versions in Deploy tab → **Promote to live**</td>
<td><ActionImage src="/static/quick-reference/promote-deployment.png" alt="Promote deployment to live" /></td>
</tr>
<tr>
<td>Copy API endpoint</td>
<td>Deploy tab → Copy API endpoint URL</td>
<td><ActionImage src="/static/quick-reference/copy-api.png" alt="Copy API endpoint" /></td>
</tr>
</tbody>
</table>
| Action | How |
|--------|-----|
| Deploy a workflow | Click **Deploy** button in Deploy tab |
| Update deployment | Click **Update** when changes are detected |
| View deployment status | Check status indicator (Live/Update/Deploy) in Deploy tab |
| Revert deployment | Access previous versions in Deploy tab |
| Copy webhook URL | Deploy tab → Copy webhook URL |
| Copy API endpoint | Deploy tab → Copy API endpoint URL |
| Set up a schedule | Add Schedule trigger block → Configure interval |
## Variables
<table>
<thead>
<tr><th>Action</th><th>How</th><th>Preview</th></tr>
</thead>
<tbody>
<tr>
<td>Add / Edit / Delete workflow variable</td>
<td>Panel → Variables → **Add Variable**, click to edit, or delete icon</td>
<td><ActionImage src="/static/quick-reference/variables.png" alt="Variables panel" /></td>
</tr>
<tr>
<td>Add environment variable</td>
<td>Settings → **Environment Variables** → **Add**</td>
<td><ActionImage src="/static/quick-reference/add-env-variable.png" alt="Add environment variable" /></td>
</tr>
<tr>
<td>Reference a workflow variable</td>
<td>Use `<blockName.itemName>` syntax in block inputs</td>
<td><ActionImage src="/static/quick-reference/variable-reference.png" alt="Reference workflow variable" /></td>
</tr>
<tr>
<td>Reference an environment variable</td>
<td>Use `&#123;&#123;ENV_VAR&#125;&#125;` syntax in block inputs</td>
<td><ActionImage src="/static/quick-reference/env-variable-reference.png" alt="Reference environment variable" /></td>
</tr>
</tbody>
</table>
| Action | How |
|--------|-----|
| Add workflow variable | Variables tab → **Add Variable** |
| Edit workflow variable | Variables tab → Click variable to edit |
| Delete workflow variable | Variables tab → Click delete icon on variable |
| Add environment variable | Settings → **Environment Variables** → **Add** |
| Reference a variable | Use `{{variableName}}` syntax in block inputs |
## Credentials
| Action | How |
|--------|-----|
| Add API key | Block credential field → **Add Credential** → Enter API key |
| Connect OAuth account | Block credential field → **Connect** → Authorize with provider |
| Manage credentials | Settings → **Credentials** |
| Remove credential | Settings → **Credentials** → Delete credential |

View File

@@ -16,20 +16,12 @@ Deploy Sim on your own infrastructure with Docker or Kubernetes.
## Requirements
| Resource | Small | Standard | Production |
|----------|-------|----------|------------|
| CPU | 2 cores | 4 cores | 8+ cores |
| RAM | 12 GB | 16 GB | 32+ GB |
| Storage | 20 GB SSD | 50 GB SSD | 100+ GB SSD |
| Docker | 20.10+ | 20.10+ | Latest |
**Small**: Development, testing, single user (1-5 users)
**Standard**: Teams (5-50 users), moderate workloads
**Production**: Large teams (50+ users), high availability, heavy workflow execution
<Callout type="info">
Resource requirements are driven by workflow execution (isolated-vm sandboxing), file processing (in-memory document parsing), and vector operations (pgvector). Memory is typically the constraining factor rather than CPU. Production telemetry shows the main app uses 4-8 GB average with peaks up to 12 GB under heavy load.
</Callout>
| Resource | Minimum | Recommended |
|----------|---------|-------------|
| CPU | 2 cores | 4+ cores |
| RAM | 12 GB | 16+ GB |
| Storage | 20 GB SSD | 50+ GB SSD |
| Docker | 20.10+ | Latest |
## Quick Start

View File

@@ -10,20 +10,12 @@ Despliega Sim en tu propia infraestructura con Docker o Kubernetes.
## Requisitos
| Recurso | Pequeño | Estándar | Producción |
|----------|---------|----------|------------|
| CPU | 2 núcleos | 4 núcleos | 8+ núcleos |
| RAM | 12 GB | 16 GB | 32+ GB |
| Almacenamiento | 20 GB SSD | 50 GB SSD | 100+ GB SSD |
| Docker | 20.10+ | 20.10+ | Última versión |
**Pequeño**: Desarrollo, pruebas, usuario único (1-5 usuarios)
**Estándar**: Equipos (5-50 usuarios), cargas de trabajo moderadas
**Producción**: Equipos grandes (50+ usuarios), alta disponibilidad, ejecución intensiva de workflows
<Callout type="info">
Los requisitos de recursos están determinados por la ejecución de workflows (sandboxing isolated-vm), procesamiento de archivos (análisis de documentos en memoria) y operaciones vectoriales (pgvector). La memoria suele ser el factor limitante, no la CPU. La telemetría de producción muestra que la aplicación principal usa 4-8 GB en promedio con picos de hasta 12 GB bajo carga pesada.
</Callout>
| Recurso | Mínimo | Recomendado |
|----------|---------|-------------|
| CPU | 2 núcleos | 4+ núcleos |
| RAM | 12 GB | 16+ GB |
| Almacenamiento | 20 GB SSD | 50+ GB SSD |
| Docker | 20.10+ | Última versión |
## Inicio rápido

View File

@@ -10,20 +10,12 @@ Déployez Sim sur votre propre infrastructure avec Docker ou Kubernetes.
## Prérequis
| Ressource | Petit | Standard | Production |
|----------|-------|----------|------------|
| CPU | 2 cœurs | 4 cœurs | 8+ cœurs |
| RAM | 12 Go | 16 Go | 32+ Go |
| Stockage | 20 Go SSD | 50 Go SSD | 100+ Go SSD |
| Docker | 20.10+ | 20.10+ | Dernière version |
**Petit** : Développement, tests, utilisateur unique (1-5 utilisateurs)
**Standard** : Équipes (5-50 utilisateurs), charges de travail modérées
**Production** : Grandes équipes (50+ utilisateurs), haute disponibilité, exécution intensive de workflows
<Callout type="info">
Les besoins en ressources sont déterminés par l'exécution des workflows (sandboxing isolated-vm), le traitement des fichiers (analyse de documents en mémoire) et les opérations vectorielles (pgvector). La mémoire est généralement le facteur limitant, pas le CPU. La télémétrie de production montre que l'application principale utilise 4-8 Go en moyenne avec des pics jusqu'à 12 Go sous forte charge.
</Callout>
| Ressource | Minimum | Recommandé |
|----------|---------|-------------|
| CPU | 2 cœurs | 4+ cœurs |
| RAM | 12 Go | 16+ Go |
| Stockage | 20 Go SSD | 50+ Go SSD |
| Docker | 20.10+ | Dernière version |
## Démarrage rapide

View File

@@ -10,20 +10,12 @@ DockerまたはKubernetesを使用して、自社のインフラストラクチ
## 要件
| リソース | スモール | スタンダード | プロダクション |
|----------|---------|-------------|----------------|
| CPU | 2コア | 4コア | 8+コア |
| RAM | 12 GB | 16 GB | 32+ GB |
| ストレージ | 20 GB SSD | 50 GB SSD | 100+ GB SSD |
| Docker | 20.10+ | 20.10+ | 最新版 |
**スモール**: 開発、テスト、シングルユーザー(1-5ユーザー)
**スタンダード**: チーム(5-50ユーザー)、中程度のワークロード
**プロダクション**: 大規模チーム(50+ユーザー)、高可用性、高負荷ワークフロー実行
<Callout type="info">
リソース要件は、ワークフロー実行(isolated-vmサンドボックス)、ファイル処理(メモリ内ドキュメント解析)、ベクトル演算(pgvector)によって決まります。CPUよりもメモリが制約要因となることが多いです。本番環境のテレメトリによると、メインアプリは平均4-8 GB、高負荷時は最大12 GBを使用します。
</Callout>
| リソース | 最小 | 推奨 |
|----------|---------|-------------|
| CPU | 2コア | 4+コア |
| RAM | 12 GB | 16+ GB |
| ストレージ | 20 GB SSD | 50+ GB SSD |
| Docker | 20.10+ | 最新版 |
## クイックスタート

View File

@@ -10,20 +10,12 @@ import { Callout } from 'fumadocs-ui/components/callout'
## 要求
| 资源 | 小型 | 标准 | 生产环境 |
|----------|------|------|----------|
| CPU | 2 核 | 4 核 | 8+ 核 |
| 内存 | 12 GB | 16 GB | 32+ GB |
| 存储 | 20 GB SSD | 50 GB SSD | 100+ GB SSD |
| Docker | 20.10+ | 20.10+ | 最新版本 |
**小型**: 开发、测试、单用户(1-5 用户)
**标准**: 团队(5-50 用户)、中等工作负载
**生产环境**: 大型团队(50+ 用户)、高可用性、密集工作流执行
<Callout type="info">
资源需求由工作流执行(isolated-vm 沙箱)、文件处理(内存中文档解析)和向量运算(pgvector)决定。内存通常是限制因素而不是 CPU。生产遥测数据显示主应用平均使用 4-8 GB,高负载时峰值可达 12 GB。
</Callout>
| 资源 | 最低要求 | 推荐配置 |
|----------|---------|-------------|
| CPU | 2 核 | 4 核及以上 |
| 内存 | 12 GB | 16 GB 及以上 |
| 存储 | 20 GB SSD | 50 GB 及以上 SSD |
| Docker | 20.10+ | 最新版本 |
## 快速开始

29 binary files not shown (removed quick-reference screenshots and videos, 5.9 KiB to 146 KiB each).

View File

@@ -8,7 +8,6 @@ import { executeInIsolatedVM } from '@/lib/execution/isolated-vm'
import { CodeLanguage, DEFAULT_CODE_LANGUAGE, isValidCodeLanguage } from '@/lib/execution/languages'
import { escapeRegExp, normalizeName, REFERENCE } from '@/executor/constants'
import { type OutputSchema, resolveBlockReference } from '@/executor/utils/block-reference'
import { formatLiteralForCode } from '@/executor/utils/code-formatting'
import {
createEnvVarPattern,
createWorkflowVariablePattern,
@@ -388,12 +387,7 @@ function resolveWorkflowVariables(
if (type === 'number') {
variableValue = Number(variableValue)
} else if (type === 'boolean') {
if (typeof variableValue === 'boolean') {
// Already a boolean, keep as-is
} else {
const normalized = String(variableValue).toLowerCase().trim()
variableValue = normalized === 'true'
}
variableValue = variableValue === 'true' || variableValue === true
} else if (type === 'json' && typeof variableValue === 'string') {
try {
variableValue = JSON.parse(variableValue)
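
The one-line replacement above tightens boolean coercion; an illustrative comparison (not part of the diff):

// Old: String(v).toLowerCase().trim() === 'true'  → 'TRUE', ' true ' coerced to true
// New: v === 'true' || v === true                 → only exact matches coerce
const toBool = (v: unknown) => v === 'true' || v === true
toBool(true)    // true
toBool('true')  // true
toBool('TRUE')  // false (was true before this change)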
@@ -693,7 +687,11 @@ export async function POST(req: NextRequest) {
prologue += `const environmentVariables = JSON.parse(${JSON.stringify(JSON.stringify(envVars))});\n`
prologueLineCount++
for (const [k, v] of Object.entries(contextVariables)) {
prologue += `const ${k} = ${formatLiteralForCode(v, 'javascript')};\n`
if (v === undefined) {
prologue += `const ${k} = undefined;\n`
} else {
prologue += `const ${k} = JSON.parse(${JSON.stringify(JSON.stringify(v))});\n`
}
prologueLineCount++
}
@@ -764,7 +762,11 @@ export async function POST(req: NextRequest) {
prologue += `environmentVariables = json.loads(${JSON.stringify(JSON.stringify(envVars))})\n`
prologueLineCount++
for (const [k, v] of Object.entries(contextVariables)) {
prologue += `${k} = ${formatLiteralForCode(v, 'python')}\n`
if (v === undefined) {
prologue += `${k} = None\n`
} else {
prologue += `${k} = json.loads(${JSON.stringify(JSON.stringify(v))})\n`
}
prologueLineCount++
}
const wrapped = [
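
Both the JavaScript and Python prologues splice context values in with a double-stringify round-trip; a small sketch of why that is safe (simplified illustration):

// JSON.stringify(JSON.stringify(v)) yields a quoted, fully escaped string
// literal, so arbitrary user values cannot break out of the generated code;
// JSON.parse then restores the original value at runtime.
const v = { msg: 'say "hi"\nbye' }
const line = `const ctx = JSON.parse(${JSON.stringify(JSON.stringify(v))});`
// `line` is a single valid JavaScript statement despite the quotes/newline in v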

View File

@@ -408,7 +408,6 @@ describe('Knowledge Search Utils', () => {
input: ['test query'],
model: 'text-embedding-3-small',
encoding_format: 'float',
dimensions: 1536,
}),
})
)

View File

@@ -0,0 +1,204 @@
import { db } from '@sim/db'
import { member, permissions, user, workspace } from '@sim/db/schema'
import { createLogger } from '@sim/logger'
import { and, eq, or } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { getSession } from '@/lib/auth'
const logger = createLogger('OrganizationWorkspacesAPI')
/**
* GET /api/organizations/[id]/workspaces
* Get workspaces related to the organization with optional filtering
* Query parameters:
* - ?available=true - Only workspaces where user can invite others (admin permissions)
* - ?member=userId - Workspaces where specific member has access
*/
export async function GET(request: NextRequest, { params }: { params: Promise<{ id: string }> }) {
try {
const session = await getSession()
if (!session?.user?.id) {
return NextResponse.json({ error: 'Unauthorized' }, { status: 401 })
}
const { id: organizationId } = await params
const url = new URL(request.url)
const availableOnly = url.searchParams.get('available') === 'true'
const memberId = url.searchParams.get('member')
// Verify user is a member of this organization
const memberEntry = await db
.select()
.from(member)
.where(and(eq(member.organizationId, organizationId), eq(member.userId, session.user.id)))
.limit(1)
if (memberEntry.length === 0) {
return NextResponse.json(
{
error: 'Forbidden - Not a member of this organization',
},
{ status: 403 }
)
}
const userRole = memberEntry[0].role
const hasAdminAccess = ['owner', 'admin'].includes(userRole)
if (availableOnly) {
// Get workspaces where user has admin permissions (can invite others)
const availableWorkspaces = await db
.select({
id: workspace.id,
name: workspace.name,
ownerId: workspace.ownerId,
createdAt: workspace.createdAt,
isOwner: eq(workspace.ownerId, session.user.id),
permissionType: permissions.permissionType,
})
.from(workspace)
.leftJoin(
permissions,
and(
eq(permissions.entityType, 'workspace'),
eq(permissions.entityId, workspace.id),
eq(permissions.userId, session.user.id)
)
)
.where(
or(
// User owns the workspace
eq(workspace.ownerId, session.user.id),
// User has admin permission on the workspace
and(
eq(permissions.userId, session.user.id),
eq(permissions.entityType, 'workspace'),
eq(permissions.permissionType, 'admin')
)
)
)
// Filter and format the results
const workspacesWithInvitePermission = availableWorkspaces
.filter((workspace) => {
// Include if user owns the workspace OR has admin permission
return workspace.isOwner || workspace.permissionType === 'admin'
})
.map((workspace) => ({
id: workspace.id,
name: workspace.name,
isOwner: workspace.isOwner,
canInvite: true, // All returned workspaces have invite permission
createdAt: workspace.createdAt,
}))
logger.info('Retrieved available workspaces for organization member', {
organizationId,
userId: session.user.id,
workspaceCount: workspacesWithInvitePermission.length,
})
return NextResponse.json({
success: true,
data: {
workspaces: workspacesWithInvitePermission,
totalCount: workspacesWithInvitePermission.length,
filter: 'available',
},
})
}
if (memberId && hasAdminAccess) {
// Get workspaces where specific member has access (admin only)
const memberWorkspaces = await db
.select({
id: workspace.id,
name: workspace.name,
ownerId: workspace.ownerId,
isOwner: eq(workspace.ownerId, memberId),
permissionType: permissions.permissionType,
createdAt: permissions.createdAt,
})
.from(workspace)
.leftJoin(
permissions,
and(
eq(permissions.entityType, 'workspace'),
eq(permissions.entityId, workspace.id),
eq(permissions.userId, memberId)
)
)
.where(
or(
// Member owns the workspace
eq(workspace.ownerId, memberId),
// Member has permissions on the workspace
and(eq(permissions.userId, memberId), eq(permissions.entityType, 'workspace'))
)
)
const formattedWorkspaces = memberWorkspaces.map((workspace) => ({
id: workspace.id,
name: workspace.name,
isOwner: workspace.isOwner,
permission: workspace.permissionType,
joinedAt: workspace.createdAt,
createdAt: workspace.createdAt,
}))
return NextResponse.json({
success: true,
data: {
workspaces: formattedWorkspaces,
totalCount: formattedWorkspaces.length,
filter: 'member',
memberId,
},
})
}
// Default: Get all workspaces (basic info only for regular members)
if (!hasAdminAccess) {
return NextResponse.json({
success: true,
data: {
workspaces: [],
totalCount: 0,
message: 'Workspace access information is only available to organization admins',
},
})
}
// For admins: Get summary of all workspaces
const allWorkspaces = await db
.select({
id: workspace.id,
name: workspace.name,
ownerId: workspace.ownerId,
createdAt: workspace.createdAt,
ownerName: user.name,
})
.from(workspace)
.leftJoin(user, eq(workspace.ownerId, user.id))
return NextResponse.json({
success: true,
data: {
workspaces: allWorkspaces,
totalCount: allWorkspaces.length,
filter: 'all',
},
userRole,
hasAdminAccess,
})
} catch (error) {
logger.error('Failed to get organization workspaces', { error })
return NextResponse.json(
{
error: 'Internal server error',
},
{ status: 500 }
)
}
}
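
A hedged client-side usage sketch for this route; the response shape is taken from the handler above, while `orgId` is a placeholder.

// Fetch workspaces where the current user can invite others.
// Assumes a browser session; the route returns 401 otherwise.
const orgId = 'org_123' // hypothetical organization id
const res = await fetch(`/api/organizations/${orgId}/workspaces?available=true`)
if (res.ok) {
  const { data } = await res.json()
  // every returned workspace has canInvite: true by construction
  console.log(data.workspaces.map((w: { name: string }) => w.name))
}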

View File

@@ -1,257 +0,0 @@
import { createLogger } from '@sim/logger'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { checkInternalAuth } from '@/lib/auth/hybrid'
import { generateRequestId } from '@/lib/core/utils/request'
import { processSingleFileToUserFile } from '@/lib/uploads/utils/file-utils'
import { downloadFileFromStorage } from '@/lib/uploads/utils/file-utils.server'
export const dynamic = 'force-dynamic'
const logger = createLogger('SupabaseStorageUploadAPI')
const SupabaseStorageUploadSchema = z.object({
projectId: z.string().min(1, 'Project ID is required'),
apiKey: z.string().min(1, 'API key is required'),
bucket: z.string().min(1, 'Bucket name is required'),
fileName: z.string().min(1, 'File name is required'),
path: z.string().optional().nullable(),
fileData: z.any(),
contentType: z.string().optional().nullable(),
upsert: z.boolean().optional().default(false),
})
export async function POST(request: NextRequest) {
const requestId = generateRequestId()
try {
const authResult = await checkInternalAuth(request, { requireWorkflowId: false })
if (!authResult.success) {
logger.warn(
`[${requestId}] Unauthorized Supabase storage upload attempt: ${authResult.error}`
)
return NextResponse.json(
{
success: false,
error: authResult.error || 'Authentication required',
},
{ status: 401 }
)
}
logger.info(
`[${requestId}] Authenticated Supabase storage upload request via ${authResult.authType}`,
{
userId: authResult.userId,
}
)
const body = await request.json()
const validatedData = SupabaseStorageUploadSchema.parse(body)
const fileData = validatedData.fileData
const isStringInput = typeof fileData === 'string'
logger.info(`[${requestId}] Uploading to Supabase Storage`, {
bucket: validatedData.bucket,
fileName: validatedData.fileName,
path: validatedData.path,
fileDataType: isStringInput ? 'string' : 'object',
})
if (!fileData) {
return NextResponse.json(
{
success: false,
error: 'fileData is required',
},
{ status: 400 }
)
}
let uploadBody: Buffer
let uploadContentType: string | undefined
if (isStringInput) {
let content = fileData as string
const dataUrlMatch = content.match(/^data:([^;]+);base64,(.+)$/s)
if (dataUrlMatch) {
const [, mimeType, base64Data] = dataUrlMatch
content = base64Data
if (!validatedData.contentType) {
uploadContentType = mimeType
}
logger.info(`[${requestId}] Extracted base64 from data URL (MIME: ${mimeType})`)
}
const cleanedContent = content.replace(/[\s\r\n]/g, '')
const isLikelyBase64 = /^[A-Za-z0-9+/]*={0,2}$/.test(cleanedContent)
if (isLikelyBase64 && cleanedContent.length >= 4) {
try {
uploadBody = Buffer.from(cleanedContent, 'base64')
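// base64 encodes 3 bytes as 4 chars, so a genuine decode is ~0.75x the
// cleaned character count; the 0.7-0.8 bounds below tolerate '=' padding.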
const expectedMinSize = Math.floor(cleanedContent.length * 0.7)
const expectedMaxSize = Math.ceil(cleanedContent.length * 0.8)
if (
uploadBody.length >= expectedMinSize &&
uploadBody.length <= expectedMaxSize &&
uploadBody.length > 0
) {
logger.info(
`[${requestId}] Decoded base64 content: ${cleanedContent.length} chars -> ${uploadBody.length} bytes`
)
} else {
const reEncoded = uploadBody.toString('base64')
if (reEncoded !== cleanedContent) {
logger.info(
`[${requestId}] Content looked like base64 but re-encoding didn't match, using as plain text`
)
uploadBody = Buffer.from(content, 'utf-8')
} else {
logger.info(
`[${requestId}] Decoded base64 content (verified): ${uploadBody.length} bytes`
)
}
}
} catch (decodeError) {
logger.info(
`[${requestId}] Failed to decode as base64, using as plain text: ${decodeError}`
)
uploadBody = Buffer.from(content, 'utf-8')
}
} else {
uploadBody = Buffer.from(content, 'utf-8')
logger.info(`[${requestId}] Using content as plain text (${uploadBody.length} bytes)`)
}
uploadContentType =
uploadContentType || validatedData.contentType || 'application/octet-stream'
} else {
const rawFile = fileData
logger.info(`[${requestId}] Processing file object: ${rawFile.name || 'unknown'}`)
let userFile
try {
userFile = processSingleFileToUserFile(rawFile, requestId, logger)
} catch (error) {
return NextResponse.json(
{
success: false,
error: error instanceof Error ? error.message : 'Failed to process file',
},
{ status: 400 }
)
}
const buffer = await downloadFileFromStorage(userFile, requestId, logger)
uploadBody = buffer
uploadContentType = validatedData.contentType || userFile.type || 'application/octet-stream'
}
let fullPath = validatedData.fileName
if (validatedData.path) {
const folderPath = validatedData.path.endsWith('/')
? validatedData.path
: `${validatedData.path}/`
fullPath = `${folderPath}${validatedData.fileName}`
}
const supabaseUrl = `https://${validatedData.projectId}.supabase.co/storage/v1/object/${validatedData.bucket}/${fullPath}`
const headers: Record<string, string> = {
apikey: validatedData.apiKey,
Authorization: `Bearer ${validatedData.apiKey}`,
'Content-Type': uploadContentType,
}
if (validatedData.upsert) {
headers['x-upsert'] = 'true'
}
logger.info(`[${requestId}] Sending to Supabase: ${supabaseUrl}`, {
contentType: uploadContentType,
bodySize: uploadBody.length,
upsert: validatedData.upsert,
})
const response = await fetch(supabaseUrl, {
method: 'POST',
headers,
body: new Uint8Array(uploadBody),
})
if (!response.ok) {
const errorText = await response.text()
let errorData
try {
errorData = JSON.parse(errorText)
} catch {
errorData = { message: errorText }
}
logger.error(`[${requestId}] Supabase Storage upload failed:`, {
status: response.status,
statusText: response.statusText,
error: errorData,
})
return NextResponse.json(
{
success: false,
error: errorData.message || errorData.error || `Upload failed: ${response.statusText}`,
details: errorData,
},
{ status: response.status }
)
}
const result = await response.json()
logger.info(`[${requestId}] File uploaded successfully to Supabase Storage`, {
bucket: validatedData.bucket,
path: fullPath,
})
const publicUrl = `https://${validatedData.projectId}.supabase.co/storage/v1/object/public/${validatedData.bucket}/${fullPath}`
return NextResponse.json({
success: true,
output: {
message: 'Successfully uploaded file to storage',
results: {
...result,
path: fullPath,
bucket: validatedData.bucket,
publicUrl,
},
},
})
} catch (error) {
if (error instanceof z.ZodError) {
logger.warn(`[${requestId}] Invalid request data`, { errors: error.errors })
return NextResponse.json(
{
success: false,
error: 'Invalid request data',
details: error.errors,
},
{ status: 400 }
)
}
logger.error(`[${requestId}] Error uploading to Supabase Storage:`, error)
return NextResponse.json(
{
success: false,
error: error instanceof Error ? error.message : 'Internal server error',
},
{ status: 500 }
)
}
}
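
Worth pausing on the deleted route's base64 heuristic: it decoded the candidate string and accepted the result only when the byte count landed near the 3/4 ratio that base64 implies (4 chars encode 3 bytes), falling back to a re-encode comparison in the ambiguous zone. A minimal standalone sketch of that check, with a hypothetical helper name (the route inlined this logic):

function decodeIfBase64(content: string): Buffer {
  const cleaned = content.replace(/[\s\r\n]/g, '')
  // Base64 alphabet, with at most two '=' padding characters
  if (!/^[A-Za-z0-9+/]*={0,2}$/.test(cleaned) || cleaned.length < 4) {
    return Buffer.from(content, 'utf-8')
  }
  const decoded = Buffer.from(cleaned, 'base64')
  // 4 base64 chars encode 3 bytes, so decoded length should be ~0.75x the char count
  const min = Math.floor(cleaned.length * 0.7)
  const max = Math.ceil(cleaned.length * 0.8)
  if (decoded.length >= min && decoded.length <= max && decoded.length > 0) {
    return decoded
  }
  // Ambiguous size: re-encode and compare to confirm it really was base64
  return decoded.toString('base64') === cleaned ? decoded : Buffer.from(content, 'utf-8')
}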

View File

@@ -338,11 +338,6 @@ const arePropsEqual = (prevProps: SubBlockProps, nextProps: SubBlockProps): bool
const configEqual =
prevProps.config.id === nextProps.config.id && prevProps.config.type === nextProps.config.type
const canonicalToggleEqual =
!!prevProps.canonicalToggle === !!nextProps.canonicalToggle &&
prevProps.canonicalToggle?.mode === nextProps.canonicalToggle?.mode &&
prevProps.canonicalToggle?.disabled === nextProps.canonicalToggle?.disabled
return (
prevProps.blockId === nextProps.blockId &&
configEqual &&
@@ -351,7 +346,8 @@ const arePropsEqual = (prevProps: SubBlockProps, nextProps: SubBlockProps): bool
prevProps.disabled === nextProps.disabled &&
prevProps.fieldDiffStatus === nextProps.fieldDiffStatus &&
prevProps.allowExpandInPreview === nextProps.allowExpandInPreview &&
canonicalToggleEqual
prevProps.canonicalToggle?.mode === nextProps.canonicalToggle?.mode &&
prevProps.canonicalToggle?.disabled === nextProps.canonicalToggle?.disabled
)
}
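
One observation on the simplified comparator: optional chaining alone cannot distinguish a missing canonicalToggle from one whose fields are undefined, which is the edge the removed presence check (!!prev === !!next) used to catch. A two-line illustration:

const a: { mode?: string } | undefined = { mode: undefined }
const b: { mode?: string } | undefined = undefined
// Both sides evaluate to undefined, so the memo comparator treats the props as
// equal even though one canonicalToggle is present and the other is not:
console.log(a?.mode === b?.mode) // true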

View File

@@ -214,6 +214,15 @@ export const A2ABlock: BlockConfig<A2AResponse> = {
],
config: {
tool: (params) => params.operation as string,
params: (params) => {
const { fileUpload, fileReference, ...rest } = params
const hasFileUpload = Array.isArray(fileUpload) ? fileUpload.length > 0 : !!fileUpload
const files = hasFileUpload ? fileUpload : fileReference
return {
...rest,
...(files ? { files } : {}),
}
},
},
},
inputs: {
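
The new params mapper prefers freshly uploaded files over stored references, treating an empty upload array as absent. Restating the mapper standalone, with made-up inputs, to show the fallback:

const toToolParams = (params: Record<string, any>) => {
  const { fileUpload, fileReference, ...rest } = params
  const hasFileUpload = Array.isArray(fileUpload) ? fileUpload.length > 0 : !!fileUpload
  const files = hasFileUpload ? fileUpload : fileReference
  return { ...rest, ...(files ? { files } : {}) }
}

toToolParams({ operation: 'a2a_send_message', fileUpload: [], fileReference: [{ id: 'f1' }] })
// => { operation: 'a2a_send_message', files: [{ id: 'f1' }] }
// the empty upload array falls back to the stored reference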

View File

@@ -581,18 +581,6 @@ export const GmailV2Block: BlockConfig<GmailToolResponse> = {
results: { type: 'json', description: 'Search/read summary results' },
attachments: { type: 'json', description: 'Downloaded attachments (if enabled)' },
// Draft-specific outputs
draftId: {
type: 'string',
description: 'Draft ID',
condition: { field: 'operation', value: 'draft_gmail' },
},
messageId: {
type: 'string',
description: 'Gmail message ID for the draft',
condition: { field: 'operation', value: 'draft_gmail' },
},
// Trigger outputs (unchanged)
email_id: { type: 'string', description: 'Gmail message ID' },
thread_id: { type: 'string', description: 'Gmail thread ID' },

View File

@@ -661,25 +661,12 @@ Return ONLY the PostgREST filter expression - no explanations, no markdown, no e
placeholder: 'folder/subfolder/',
condition: { field: 'operation', value: 'storage_upload' },
},
{
id: 'file',
title: 'File',
type: 'file-upload',
canonicalParamId: 'fileData',
placeholder: 'Upload file to storage',
condition: { field: 'operation', value: 'storage_upload' },
mode: 'basic',
multiple: false,
required: true,
},
{
id: 'fileContent',
title: 'File Content',
type: 'code',
canonicalParamId: 'fileData',
placeholder: 'Base64 encoded for binary files, or plain text',
condition: { field: 'operation', value: 'storage_upload' },
mode: 'advanced',
required: true,
},
{

View File

@@ -1,9 +1,7 @@
import { getBlockOutputs } from '@/lib/workflows/blocks/block-outputs'
import { normalizeName } from '@/executor/constants'
import type { ExecutionContext } from '@/executor/types'
import type { OutputSchema } from '@/executor/utils/block-reference'
import type { SerializedBlock } from '@/serializer/types'
import type { ToolConfig } from '@/tools/types'
import { getTool } from '@/tools/utils'
export interface BlockDataCollection {
blockData: Record<string, unknown>
@@ -11,32 +9,6 @@ export interface BlockDataCollection {
blockOutputSchemas: Record<string, OutputSchema>
}
export function getBlockSchema(
block: SerializedBlock,
toolConfig?: ToolConfig
): OutputSchema | undefined {
const isTrigger =
block.metadata?.category === 'triggers' ||
(block.config?.params as Record<string, unknown> | undefined)?.triggerMode === true
// Triggers use saved outputs (defines the trigger payload schema)
if (isTrigger && block.outputs && Object.keys(block.outputs).length > 0) {
return block.outputs as OutputSchema
}
// When a tool is selected, tool outputs are the source of truth
if (toolConfig?.outputs && Object.keys(toolConfig.outputs).length > 0) {
return toolConfig.outputs as OutputSchema
}
// Fallback to saved outputs for blocks without tools
if (block.outputs && Object.keys(block.outputs).length > 0) {
return block.outputs as OutputSchema
}
return undefined
}
export function collectBlockData(ctx: ExecutionContext): BlockDataCollection {
const blockData: Record<string, unknown> = {}
const blockNameMapping: Record<string, string> = {}
@@ -46,23 +18,26 @@ export function collectBlockData(ctx: ExecutionContext): BlockDataCollection {
if (state.output !== undefined) {
blockData[id] = state.output
}
const workflowBlock = ctx.workflow?.blocks?.find((b) => b.id === id)
if (!workflowBlock) continue
if (workflowBlock.metadata?.name) {
blockNameMapping[normalizeName(workflowBlock.metadata.name)] = id
}
const workflowBlocks = ctx.workflow?.blocks ?? []
for (const block of workflowBlocks) {
const id = block.id
if (block.metadata?.name) {
blockNameMapping[normalizeName(block.metadata.name)] = id
}
const toolId = block.config?.tool
const toolConfig = toolId ? getTool(toolId) : undefined
const schema = getBlockSchema(block, toolConfig)
const blockType = workflowBlock.metadata?.id
if (blockType) {
const params = workflowBlock.config?.params as Record<string, unknown> | undefined
const subBlocks = params
? Object.fromEntries(Object.entries(params).map(([k, v]) => [k, { value: v }]))
: undefined
const schema = getBlockOutputs(blockType, subBlocks)
if (schema && Object.keys(schema).length > 0) {
blockOutputSchemas[id] = schema
}
}
}
return { blockData, blockNameMapping, blockOutputSchemas }
}
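
The rewritten loop feeds getBlockOutputs the subBlock shape it expects by wrapping each raw param value in { value }. That mapping in isolation:

const params = { operation: 'storage_upload', upsert: true }
const subBlocks = Object.fromEntries(
  Object.entries(params).map(([k, v]) => [k, { value: v }])
)
// => { operation: { value: 'storage_upload' }, upsert: { value: true } }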

View File

@@ -1,48 +0,0 @@
/**
* Formats a JavaScript/TypeScript value as a code literal for the target language.
* Handles special cases like null, undefined, booleans, and Python-specific number representations.
*
* @param value - The value to format
* @param language - Target language ('javascript' or 'python')
* @returns A string literal representation valid in the target language
*
* @example
* formatLiteralForCode(null, 'python') // => 'None'
* formatLiteralForCode(true, 'python') // => 'True'
* formatLiteralForCode(NaN, 'python') // => "float('nan')"
* formatLiteralForCode("hello", 'javascript') // => '"hello"'
* formatLiteralForCode({a: 1}, 'python') // => "json.loads('{\"a\":1}')"
*/
export function formatLiteralForCode(value: unknown, language: 'javascript' | 'python'): string {
const isPython = language === 'python'
if (value === undefined) {
return isPython ? 'None' : 'undefined'
}
if (value === null) {
return isPython ? 'None' : 'null'
}
if (typeof value === 'boolean') {
return isPython ? (value ? 'True' : 'False') : String(value)
}
if (typeof value === 'number') {
if (Number.isNaN(value)) {
return isPython ? "float('nan')" : 'NaN'
}
if (value === Number.POSITIVE_INFINITY) {
return isPython ? "float('inf')" : 'Infinity'
}
if (value === Number.NEGATIVE_INFINITY) {
return isPython ? "float('-inf')" : '-Infinity'
}
return String(value)
}
if (typeof value === 'string') {
return JSON.stringify(value)
}
// Objects and arrays - Python needs json.loads() because JSON true/false/null aren't valid Python
if (isPython) {
return `json.loads(${JSON.stringify(JSON.stringify(value))})`
}
return JSON.stringify(value)
}
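
The double JSON.stringify in the Python branch is the subtle part: the inner call serializes the value to JSON text, and the outer call turns that text into a quoted literal that is valid Python source. Tracing one call:

formatLiteralForCode({ ok: true, n: null }, 'python')
// inner stringify: {"ok":true,"n":null}
// outer stringify: "{\"ok\":true,\"n\":null}"
// returned source: json.loads("{\"ok\":true,\"n\":null}")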

View File

@@ -378,30 +378,8 @@ function buildManualTriggerOutput(
return mergeFilesIntoOutput(output, workflowInput)
}
function buildIntegrationTriggerOutput(
workflowInput: unknown,
structuredInput: Record<string, unknown>,
hasStructured: boolean
): NormalizedBlockOutput {
const output: NormalizedBlockOutput = {}
if (hasStructured) {
for (const [key, value] of Object.entries(structuredInput)) {
output[key] = value
}
}
if (isPlainObject(workflowInput)) {
for (const [key, value] of Object.entries(workflowInput)) {
if (value !== undefined && value !== null) {
output[key] = value
} else if (!Object.hasOwn(output, key)) {
output[key] = value
}
}
}
return mergeFilesIntoOutput(output, workflowInput)
function buildIntegrationTriggerOutput(workflowInput: unknown): NormalizedBlockOutput {
return isPlainObject(workflowInput) ? (workflowInput as NormalizedBlockOutput) : {}
}
function extractSubBlocks(block: SerializedBlock): Record<string, unknown> | undefined {
@@ -450,7 +428,7 @@ export function buildStartBlockOutput(options: StartBlockOutputOptions): Normali
return buildManualTriggerOutput(finalInput, workflowInput)
case StartBlockPath.EXTERNAL_TRIGGER:
return buildIntegrationTriggerOutput(workflowInput, structuredInput, hasStructured)
return buildIntegrationTriggerOutput(workflowInput)
case StartBlockPath.LEGACY_STARTER:
return buildLegacyStarterOutput(

View File

@@ -157,14 +157,7 @@ export class VariableResolver {
let replacementError: Error | null = null
const blockType = block?.metadata?.id
const language =
blockType === BlockType.FUNCTION
? ((block?.config?.params as Record<string, unknown> | undefined)?.language as
| string
| undefined)
: undefined
// Use generic utility for smart variable reference replacement
let result = replaceValidReferences(template, (match) => {
if (replacementError) return match
@@ -174,7 +167,14 @@ export class VariableResolver {
return match
}
return this.blockResolver.formatValueForBlock(resolved, blockType, language)
const blockType = block?.metadata?.id
const isInTemplateLiteral =
blockType === BlockType.FUNCTION &&
template.includes('${') &&
template.includes('}') &&
template.includes('`')
return this.blockResolver.formatValueForBlock(resolved, blockType, isInTemplateLiteral)
} catch (error) {
replacementError = error instanceof Error ? error : new Error(String(error))
return match

View File

@@ -257,9 +257,15 @@ describe('BlockResolver', () => {
expect(result).toBe('"hello"')
})
it.concurrent('should format object for function block', () => {
it.concurrent('should format string for function block in template literal', () => {
const resolver = new BlockResolver(createTestWorkflow())
const result = resolver.formatValueForBlock({ a: 1 }, 'function')
const result = resolver.formatValueForBlock('hello', 'function', true)
expect(result).toBe('hello')
})
it.concurrent('should format object for function block in template literal', () => {
const resolver = new BlockResolver(createTestWorkflow())
const result = resolver.formatValueForBlock({ a: 1 }, 'function', true)
expect(result).toBe('{"a":1}')
})

View File

@@ -1,16 +1,15 @@
import { getBlockOutputs } from '@/lib/workflows/blocks/block-outputs'
import {
isReference,
normalizeName,
parseReferencePath,
SPECIAL_REFERENCE_PREFIXES,
} from '@/executor/constants'
import { getBlockSchema } from '@/executor/utils/block-data'
import {
InvalidFieldError,
type OutputSchema,
resolveBlockReference,
} from '@/executor/utils/block-reference'
import { formatLiteralForCode } from '@/executor/utils/code-formatting'
import {
navigatePath,
type ResolutionContext,
@@ -68,9 +67,15 @@ export class BlockResolver implements Resolver {
blockData[blockId] = output
}
const blockType = block.metadata?.id
const params = block.config?.params as Record<string, unknown> | undefined
const subBlocks = params
? Object.fromEntries(Object.entries(params).map(([k, v]) => [k, { value: v }]))
: undefined
const toolId = block.config?.tool
const toolConfig = toolId ? getTool(toolId) : undefined
const outputSchema = getBlockSchema(block, toolConfig)
const outputSchema =
toolConfig?.outputs ?? (blockType ? getBlockOutputs(blockType, subBlocks) : block.outputs)
if (outputSchema && Object.keys(outputSchema).length > 0) {
blockOutputSchemas[blockId] = outputSchema
@@ -160,13 +165,17 @@ export class BlockResolver implements Resolver {
return this.nameToBlockId.get(normalizeName(name))
}
public formatValueForBlock(value: any, blockType: string | undefined, language?: string): string {
public formatValueForBlock(
value: any,
blockType: string | undefined,
isInTemplateLiteral = false
): string {
if (blockType === 'condition') {
return this.stringifyForCondition(value)
}
if (blockType === 'function') {
return this.formatValueForCodeContext(value, language)
return this.formatValueForCodeContext(value, isInTemplateLiteral)
}
if (blockType === 'response') {
@@ -207,7 +216,29 @@ export class BlockResolver implements Resolver {
return String(value)
}
private formatValueForCodeContext(value: any, language?: string): string {
return formatLiteralForCode(value, language === 'python' ? 'python' : 'javascript')
private formatValueForCodeContext(value: any, isInTemplateLiteral: boolean): string {
if (isInTemplateLiteral) {
if (typeof value === 'string') {
return value
}
if (typeof value === 'object' && value !== null) {
return JSON.stringify(value)
}
return String(value)
}
if (typeof value === 'string') {
return JSON.stringify(value)
}
if (typeof value === 'object' && value !== null) {
return JSON.stringify(value)
}
if (value === undefined) {
return 'undefined'
}
if (value === null) {
return 'null'
}
return String(value)
}
}
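
Tying the new flag back to the tests above: inside a template literal the resolved value is spliced in as raw text, so strings stay unquoted; in statement position they must be valid JavaScript literals. Reusing the test fixture from the spec:

const resolver = new BlockResolver(createTestWorkflow())
resolver.formatValueForBlock('hello', 'function', true)  // => hello    (raw, for `${...}` contexts)
resolver.formatValueForBlock('hello', 'function')        // => "hello"  (quoted literal)
resolver.formatValueForBlock({ a: 1 }, 'function', true) // => {"a":1}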

View File

@@ -30,10 +30,7 @@ export function navigatePath(obj: any, path: string[]): any {
const arrayMatch = part.match(/^([^[]+)(\[.+)$/)
if (arrayMatch) {
const [, prop, bracketsPart] = arrayMatch
current =
typeof current === 'object' && current !== null
? (current as Record<string, unknown>)[prop]
: undefined
current = current[prop]
if (current === undefined || current === null) {
return undefined
}
@@ -52,10 +49,7 @@ export function navigatePath(obj: any, path: string[]): any {
const index = Number.parseInt(part, 10)
current = Array.isArray(current) ? current[index] : undefined
} else {
current =
typeof current === 'object' && current !== null
? (current as Record<string, unknown>)[part]
: undefined
current = current[part]
}
}
return current
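
The simplification drops the explicit object checks and leans on JavaScript property-access semantics: primitives box and yield undefined, while null or undefined would throw if they ever reached the indexing step. A sketch of the resulting behavior, assuming a dot-split path and no guards beyond those shown in the hunk:

navigatePath({ user: { name: 'Ada' } }, ['user', 'name']) // => 'Ada'
navigatePath({ user: 42 }, ['user', 'name'])              // => undefined (42 boxes to Number)
// navigatePath(null, ['user']) would now throw a TypeError where the removed
// guard previously returned undefined; fine only if callers always pass objects.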

View File

@@ -680,10 +680,6 @@ export function useCollaborativeWorkflow() {
previousPositions?: Map<string, { x: number; y: number; parentId?: string }>
}
) => {
if (isBaselineDiffView) {
return
}
if (!isInActiveRoom()) {
logger.debug('Skipping batch position update - not in active workflow')
return
@@ -729,7 +725,7 @@ export function useCollaborativeWorkflow() {
}
}
},
[isBaselineDiffView, addToQueue, activeWorkflowId, session?.user?.id, isInActiveRoom, undoRedo]
[addToQueue, activeWorkflowId, session?.user?.id, isInActiveRoom, undoRedo]
)
const collaborativeUpdateBlockName = useCallback(
@@ -821,10 +817,6 @@ export function useCollaborativeWorkflow() {
const collaborativeBatchToggleBlockEnabled = useCallback(
(ids: string[]) => {
if (isBaselineDiffView) {
return
}
if (ids.length === 0) return
const previousStates: Record<string, boolean> = {}
@@ -857,7 +849,7 @@ export function useCollaborativeWorkflow() {
undoRedo.recordBatchToggleEnabled(validIds, previousStates)
},
[isBaselineDiffView, addToQueue, activeWorkflowId, session?.user?.id, undoRedo]
[addToQueue, activeWorkflowId, session?.user?.id, undoRedo]
)
const collaborativeBatchUpdateParent = useCallback(
@@ -869,10 +861,6 @@ export function useCollaborativeWorkflow() {
affectedEdges: Edge[]
}>
) => {
if (isBaselineDiffView) {
return
}
if (!isInActiveRoom()) {
logger.debug('Skipping batch update parent - not in active workflow')
return
@@ -943,7 +931,7 @@ export function useCollaborativeWorkflow() {
logger.debug('Batch updated parent for blocks', { updateCount: updates.length })
},
[isBaselineDiffView, isInActiveRoom, undoRedo, addToQueue, activeWorkflowId, session?.user?.id]
[isInActiveRoom, undoRedo, addToQueue, activeWorkflowId, session?.user?.id]
)
const collaborativeToggleBlockAdvancedMode = useCallback(
@@ -963,37 +951,18 @@ export function useCollaborativeWorkflow() {
const collaborativeSetBlockCanonicalMode = useCallback(
(id: string, canonicalId: string, canonicalMode: 'basic' | 'advanced') => {
if (isBaselineDiffView) {
return
}
useWorkflowStore.getState().setBlockCanonicalMode(id, canonicalId, canonicalMode)
if (!activeWorkflowId) {
return
}
const operationId = crypto.randomUUID()
addToQueue({
id: operationId,
operation: {
operation: BLOCK_OPERATIONS.UPDATE_CANONICAL_MODE,
target: OPERATION_TARGETS.BLOCK,
payload: { id, canonicalId, canonicalMode },
executeQueuedOperation(
BLOCK_OPERATIONS.UPDATE_CANONICAL_MODE,
OPERATION_TARGETS.BLOCK,
{ id, canonicalId, canonicalMode },
() => useWorkflowStore.getState().setBlockCanonicalMode(id, canonicalId, canonicalMode)
)
},
workflowId: activeWorkflowId,
userId: session?.user?.id || 'unknown',
})
},
[isBaselineDiffView, activeWorkflowId, addToQueue, session?.user?.id]
[executeQueuedOperation]
)
const collaborativeBatchToggleBlockHandles = useCallback(
(ids: string[]) => {
if (isBaselineDiffView) {
return
}
if (ids.length === 0) return
const previousStates: Record<string, boolean> = {}
@@ -1026,15 +995,11 @@ export function useCollaborativeWorkflow() {
undoRedo.recordBatchToggleHandles(validIds, previousStates)
},
[isBaselineDiffView, addToQueue, activeWorkflowId, session?.user?.id, undoRedo]
[addToQueue, activeWorkflowId, session?.user?.id, undoRedo]
)
const collaborativeBatchAddEdges = useCallback(
(edges: Edge[], options?: { skipUndoRedo?: boolean }) => {
if (isBaselineDiffView) {
return false
}
if (!isInActiveRoom()) {
logger.debug('Skipping batch add edges - not in active workflow')
return false
@@ -1070,15 +1035,11 @@ export function useCollaborativeWorkflow() {
return true
},
[isBaselineDiffView, addToQueue, activeWorkflowId, session?.user?.id, isInActiveRoom, undoRedo]
[addToQueue, activeWorkflowId, session?.user?.id, isInActiveRoom, undoRedo]
)
const collaborativeBatchRemoveEdges = useCallback(
(edgeIds: string[], options?: { skipUndoRedo?: boolean }) => {
if (isBaselineDiffView) {
return false
}
if (!isInActiveRoom()) {
logger.debug('Skipping batch remove edges - not in active workflow')
return false
@@ -1128,7 +1089,7 @@ export function useCollaborativeWorkflow() {
logger.info('Batch removed edges', { count: validEdgeIds.length })
return true
},
[isBaselineDiffView, isInActiveRoom, addToQueue, activeWorkflowId, session, undoRedo]
[isInActiveRoom, addToQueue, activeWorkflowId, session, undoRedo]
)
const collaborativeSetSubblockValue = useCallback(
@@ -1204,10 +1165,6 @@ export function useCollaborativeWorkflow() {
(blockId: string, subblockId: string, value: any) => {
if (isApplyingRemoteChange.current) return
if (isBaselineDiffView) {
return
}
if (!isInActiveRoom()) {
logger.debug('Skipping tag selection - not in active workflow', {
currentWorkflowId,
@@ -1235,14 +1192,7 @@ export function useCollaborativeWorkflow() {
userId: session?.user?.id || 'unknown',
})
},
[
isBaselineDiffView,
addToQueue,
currentWorkflowId,
activeWorkflowId,
session?.user?.id,
isInActiveRoom,
]
[addToQueue, currentWorkflowId, activeWorkflowId, session?.user?.id, isInActiveRoom]
)
const collaborativeUpdateLoopType = useCallback(
@@ -1588,10 +1538,6 @@ export function useCollaborativeWorkflow() {
const collaborativeBatchRemoveBlocks = useCallback(
(blockIds: string[], options?: { skipUndoRedo?: boolean }) => {
if (isBaselineDiffView) {
return false
}
if (!isInActiveRoom()) {
logger.debug('Skipping batch remove blocks - not in active workflow')
return false
@@ -1673,7 +1619,6 @@ export function useCollaborativeWorkflow() {
return true
},
[
isBaselineDiffView,
addToQueue,
activeWorkflowId,
session?.user?.id,
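
The pattern repeated throughout this hook: each isBaselineDiffView early return is dropped, and the flag leaves the matching useCallback dependency array. In miniature (isBaselineDiffView and doWork stand in for the real state and callbacks):

import { useCallback } from 'react'

declare const isBaselineDiffView: boolean
declare function doWork(): void

function Example() {
  // Before: guard inside the callback, flag in the dependency array
  const before = useCallback(() => {
    if (isBaselineDiffView) return
    doWork()
  }, [isBaselineDiffView])

  // After: guard handled elsewhere, one fewer dependency to invalidate on
  const after = useCallback(() => {
    doWork()
  }, [])

  return null
}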

View File

@@ -132,8 +132,6 @@ async function executeCode(request) {
for (const [key, value] of Object.entries(contextVariables)) {
if (value === undefined) {
await jail.set(key, undefined)
} else if (value === null) {
await jail.set(key, null)
} else {
await jail.set(key, new ivm.ExternalCopy(value).copyInto())
}

View File

@@ -8,17 +8,6 @@ const logger = createLogger('EmbeddingUtils')
const MAX_TOKENS_PER_REQUEST = 8000
const MAX_CONCURRENT_BATCHES = env.KB_CONFIG_CONCURRENCY_LIMIT || 50
const EMBEDDING_DIMENSIONS = 1536
/**
* Check if the model supports custom dimensions.
* text-embedding-3-* models support the dimensions parameter.
* Checks for 'embedding-3' to handle Azure deployments with custom naming conventions.
*/
function supportsCustomDimensions(modelName: string): boolean {
const name = modelName.toLowerCase()
return name.includes('embedding-3') && !name.includes('ada')
}
export class EmbeddingAPIError extends Error {
public status: number
@@ -104,19 +93,15 @@ async function getEmbeddingConfig(
async function callEmbeddingAPI(inputs: string[], config: EmbeddingConfig): Promise<number[][]> {
return retryWithExponentialBackoff(
async () => {
const useDimensions = supportsCustomDimensions(config.modelName)
const requestBody = config.useAzure
? {
input: inputs,
encoding_format: 'float',
...(useDimensions && { dimensions: EMBEDDING_DIMENSIONS }),
}
: {
input: inputs,
model: config.modelName,
encoding_format: 'float',
...(useDimensions && { dimensions: EMBEDDING_DIMENSIONS }),
}
const response = await fetch(config.apiUrl, {
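
For context on the removed gate: the OpenAI embeddings API accepts a dimensions parameter only on text-embedding-3-* models (ada-era models reject it), which is what supportsCustomDimensions guarded. A sketch of the conditional body it produced, with an illustrative model name:

const modelName = 'text-embedding-3-small'
const name = modelName.toLowerCase()
const useDimensions = name.includes('embedding-3') && !name.includes('ada')
const requestBody = {
  input: ['some text to embed'],
  model: modelName,
  encoding_format: 'float',
  // Spread in `dimensions` only when the model supports it
  ...(useDimensions && { dimensions: 1536 }),
}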

View File

@@ -18,52 +18,6 @@ const logger = createLogger('BlobClient')
let _blobServiceClient: BlobServiceClientInstance | null = null
interface ParsedCredentials {
accountName: string
accountKey: string
}
/**
* Extract account name and key from an Azure connection string.
* Connection strings have the format: DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...;EndpointSuffix=...
*/
function parseConnectionString(connectionString: string): ParsedCredentials {
const accountNameMatch = connectionString.match(/AccountName=([^;]+)/)
if (!accountNameMatch) {
throw new Error('Cannot extract account name from connection string')
}
const accountKeyMatch = connectionString.match(/AccountKey=([^;]+)/)
if (!accountKeyMatch) {
throw new Error('Cannot extract account key from connection string')
}
return {
accountName: accountNameMatch[1],
accountKey: accountKeyMatch[1],
}
}
/**
* Get account credentials from BLOB_CONFIG, extracting from connection string if necessary.
*/
function getAccountCredentials(): ParsedCredentials {
if (BLOB_CONFIG.connectionString) {
return parseConnectionString(BLOB_CONFIG.connectionString)
}
if (BLOB_CONFIG.accountName && BLOB_CONFIG.accountKey) {
return {
accountName: BLOB_CONFIG.accountName,
accountKey: BLOB_CONFIG.accountKey,
}
}
throw new Error(
'Azure Blob Storage credentials are missing: set AZURE_CONNECTION_STRING or both AZURE_ACCOUNT_NAME and AZURE_ACCOUNT_KEY'
)
}
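
The parser is a pair of regex captures over the semicolon-delimited pairs. With an illustrative (fake) connection string:

parseConnectionString(
  'DefaultEndpointsProtocol=https;AccountName=acme;AccountKey=abc123==;EndpointSuffix=core.windows.net'
)
// => { accountName: 'acme', accountKey: 'abc123==' }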
export async function getBlobServiceClient(): Promise<BlobServiceClientInstance> {
if (_blobServiceClient) return _blobServiceClient
@@ -173,8 +127,6 @@ export async function getPresignedUrl(key: string, expiresIn = 3600) {
const containerClient = blobServiceClient.getContainerClient(BLOB_CONFIG.containerName)
const blockBlobClient = containerClient.getBlockBlobClient(key)
const { accountName, accountKey } = getAccountCredentials()
const sasOptions = {
containerName: BLOB_CONFIG.containerName,
blobName: key,
@@ -185,7 +137,13 @@ export async function getPresignedUrl(key: string, expiresIn = 3600) {
const sasToken = generateBlobSASQueryParameters(
sasOptions,
new StorageSharedKeyCredential(accountName, accountKey)
new StorageSharedKeyCredential(
BLOB_CONFIG.accountName,
BLOB_CONFIG.accountKey ??
(() => {
throw new Error('AZURE_ACCOUNT_KEY is required when using account name authentication')
})()
)
).toString()
return `${blockBlobClient.url}?${sasToken}`
@@ -210,14 +168,9 @@ export async function getPresignedUrlWithConfig(
StorageSharedKeyCredential,
} = await import('@azure/storage-blob')
let tempBlobServiceClient: BlobServiceClientInstance
let accountName: string
let accountKey: string
if (customConfig.connectionString) {
tempBlobServiceClient = BlobServiceClient.fromConnectionString(customConfig.connectionString)
const credentials = parseConnectionString(customConfig.connectionString)
accountName = credentials.accountName
accountKey = credentials.accountKey
} else if (customConfig.accountName && customConfig.accountKey) {
const sharedKeyCredential = new StorageSharedKeyCredential(
customConfig.accountName,
@@ -227,8 +180,6 @@ export async function getPresignedUrlWithConfig(
`https://${customConfig.accountName}.blob.core.windows.net`,
sharedKeyCredential
)
accountName = customConfig.accountName
accountKey = customConfig.accountKey
} else {
throw new Error(
'Custom blob config must include either connectionString or accountName + accountKey'
@@ -248,7 +199,13 @@ export async function getPresignedUrlWithConfig(
const sasToken = generateBlobSASQueryParameters(
sasOptions,
new StorageSharedKeyCredential(accountName, accountKey)
new StorageSharedKeyCredential(
customConfig.accountName,
customConfig.accountKey ??
(() => {
throw new Error('Account key is required when using account name authentication')
})()
)
).toString()
return `${blockBlobClient.url}?${sasToken}`
@@ -446,9 +403,13 @@ export async function getMultipartPartUrls(
if (customConfig) {
if (customConfig.connectionString) {
blobServiceClient = BlobServiceClient.fromConnectionString(customConfig.connectionString)
const credentials = parseConnectionString(customConfig.connectionString)
accountName = credentials.accountName
accountKey = credentials.accountKey
const match = customConfig.connectionString.match(/AccountName=([^;]+)/)
if (!match) throw new Error('Cannot extract account name from connection string')
accountName = match[1]
const keyMatch = customConfig.connectionString.match(/AccountKey=([^;]+)/)
if (!keyMatch) throw new Error('Cannot extract account key from connection string')
accountKey = keyMatch[1]
} else if (customConfig.accountName && customConfig.accountKey) {
const credential = new StorageSharedKeyCredential(
customConfig.accountName,
@@ -467,9 +428,12 @@ export async function getMultipartPartUrls(
} else {
blobServiceClient = await getBlobServiceClient()
containerName = BLOB_CONFIG.containerName
const credentials = getAccountCredentials()
accountName = credentials.accountName
accountKey = credentials.accountKey
accountName = BLOB_CONFIG.accountName
accountKey =
BLOB_CONFIG.accountKey ||
(() => {
throw new Error('AZURE_ACCOUNT_KEY is required')
})()
}
const containerClient = blobServiceClient.getContainerClient(containerName)
@@ -537,10 +501,12 @@ export async function completeMultipartUpload(
const containerClient = blobServiceClient.getContainerClient(containerName)
const blockBlobClient = containerClient.getBlockBlobClient(key)
// Sort parts by part number and extract block IDs
const sortedBlockIds = parts
.sort((a, b) => a.partNumber - b.partNumber)
.map((part) => part.blockId)
// Commit the block list to create the final blob
await blockBlobClient.commitBlockList(sortedBlockIds, {
metadata: {
multipartUpload: 'completed',
@@ -591,8 +557,10 @@ export async function abortMultipartUpload(key: string, customConfig?: BlobConfi
const blockBlobClient = containerClient.getBlockBlobClient(key)
try {
// Delete the blob if it exists (this also cleans up any uncommitted blocks)
await blockBlobClient.deleteIfExists()
} catch (error) {
// Ignore errors since we're just cleaning up
logger.warn('Error cleaning up multipart upload:', error)
}
}
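
The replacement credential construction uses a `?? (() => { throw ... })()` pattern, which emulates a throw expression: the IIFE runs, and throws, only when the left-hand side is null or undefined. The idiom in isolation (Node context assumed):

const accountKey: string =
  process.env.AZURE_ACCOUNT_KEY ??
  (() => {
    throw new Error('AZURE_ACCOUNT_KEY is required when using account name authentication')
  })()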

View File

@@ -618,6 +618,13 @@ export function getToolOutputs(
}
}
/**
* Generates output paths for a tool-based block.
*
* @param blockConfig - The block configuration containing tools config
* @param subBlocks - SubBlock values for tool selection and condition evaluation
* @returns Array of output paths for the tool, or empty array on error
*/
export function getToolOutputPaths(
blockConfig: BlockConfig,
subBlocks?: Record<string, SubBlockWithValue>
@@ -627,22 +634,12 @@ export function getToolOutputPaths(
if (!outputs || Object.keys(outputs).length === 0) return []
if (subBlocks && blockConfig.outputs) {
const filteredBlockOutputs = filterOutputsByCondition(blockConfig.outputs, subBlocks)
const allowedKeys = new Set(Object.keys(filteredBlockOutputs))
const filteredOutputs: Record<string, any> = {}
for (const [key, value] of Object.entries(outputs)) {
const blockOutput = blockConfig.outputs[key]
if (!blockOutput || typeof blockOutput !== 'object') {
filteredOutputs[key] = value
continue
}
const condition = 'condition' in blockOutput ? blockOutput.condition : undefined
if (condition) {
if (evaluateOutputCondition(condition, subBlocks)) {
filteredOutputs[key] = value
}
} else {
if (allowedKeys.has(key)) {
filteredOutputs[key] = value
}
}
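
The rewrite computes the allowed key set once from the condition-filtered block outputs instead of re-evaluating each condition inline. A simplified sketch with stand-in data (the real code also passes through tool outputs that have no block-level entry):

const toolOutputs: Record<string, unknown> = {
  draftId: { type: 'string' },
  results: { type: 'json' },
}
// Suppose condition filtering kept only `results`
const filteredBlockOutputs: Record<string, unknown> = { results: { type: 'json' } }
const allowedKeys = new Set(Object.keys(filteredBlockOutputs))
const filteredOutputs = Object.fromEntries(
  Object.entries(toolOutputs).filter(([key]) => allowedKeys.has(key))
)
// => { results: { type: 'json' } }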

View File

@@ -26,7 +26,7 @@ describe('VariableManager', () => {
it.concurrent('should handle boolean type variables', () => {
expect(VariableManager.parseInputForStorage('true', 'boolean')).toBe(true)
expect(VariableManager.parseInputForStorage('false', 'boolean')).toBe(false)
expect(VariableManager.parseInputForStorage('1', 'boolean')).toBe(false)
expect(VariableManager.parseInputForStorage('1', 'boolean')).toBe(true)
expect(VariableManager.parseInputForStorage('0', 'boolean')).toBe(false)
expect(VariableManager.parseInputForStorage('"true"', 'boolean')).toBe(true)
expect(VariableManager.parseInputForStorage("'false'", 'boolean')).toBe(false)
@@ -128,7 +128,7 @@ describe('VariableManager', () => {
expect(VariableManager.resolveForExecution(false, 'boolean')).toBe(false)
expect(VariableManager.resolveForExecution('true', 'boolean')).toBe(true)
expect(VariableManager.resolveForExecution('false', 'boolean')).toBe(false)
expect(VariableManager.resolveForExecution('1', 'boolean')).toBe(false)
expect(VariableManager.resolveForExecution('1', 'boolean')).toBe(true)
expect(VariableManager.resolveForExecution('0', 'boolean')).toBe(false)
})

View File

@@ -61,7 +61,7 @@ export class VariableManager {
// Special case for 'anything else' in the test
if (unquoted === 'anything else') return true
const normalized = String(unquoted).toLowerCase().trim()
return normalized === 'true'
return normalized === 'true' || normalized === '1'
}
case 'object':
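
With this change '1' now coerces to true, matching the updated tests above. The normalization path in isolation:

const toBool = (raw: string) => {
  const normalized = raw.toLowerCase().trim()
  return normalized === 'true' || normalized === '1'
}
toBool(' TRUE ') // => true
toBool('1')      // => true
toBool('0')      // => false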

View File

@@ -27,9 +27,6 @@ export function registerEmitFunctions(
emitSubblockUpdate = subblockEmit
emitVariableUpdate = variableEmit
currentRegisteredWorkflowId = workflowId
if (workflowId) {
useOperationQueueStore.getState().processNextOperation()
}
}
let currentRegisteredWorkflowId: string | null = null
@@ -265,14 +262,16 @@ export const useOperationQueueStore = create<OperationQueueState>((set, get) =>
return
}
if (!currentRegisteredWorkflowId) {
const nextOperation = currentRegisteredWorkflowId
? state.operations.find(
(op) => op.status === 'pending' && op.workflowId === currentRegisteredWorkflowId
)
: state.operations.find((op) => op.status === 'pending')
if (!nextOperation) {
return
}
const nextOperation = state.operations.find(
(op) => op.status === 'pending' && op.workflowId === currentRegisteredWorkflowId
)
if (!nextOperation) {
if (currentRegisteredWorkflowId && nextOperation.workflowId !== currentRegisteredWorkflowId) {
return
}

View File

@@ -38,12 +38,11 @@ export const storageUploadTool: ToolConfig<
visibility: 'user-or-llm',
description: 'Optional folder path (e.g., "folder/subfolder/")',
},
fileData: {
type: 'json',
fileContent: {
type: 'string',
required: true,
visibility: 'user-or-llm',
description:
'File to upload - UserFile object (basic mode) or string content (advanced mode: base64 or plain text). Supports data URLs.',
description: 'The file content (base64 encoded for binary files, or plain text)',
},
contentType: {
type: 'string',
@@ -66,28 +65,65 @@ export const storageUploadTool: ToolConfig<
},
request: {
url: '/api/tools/supabase/storage-upload',
url: (params) => {
// Combine folder path and fileName, ensuring proper formatting
let fullPath = params.fileName
if (params.path) {
// Ensure path ends with / and doesn't have double slashes
const folderPath = params.path.endsWith('/') ? params.path : `${params.path}/`
fullPath = `${folderPath}${params.fileName}`
}
return `https://${params.projectId}.supabase.co/storage/v1/object/${params.bucket}/${fullPath}`
},
method: 'POST',
headers: () => ({
'Content-Type': 'application/json',
}),
body: (params) => ({
projectId: params.projectId,
apiKey: params.apiKey,
bucket: params.bucket,
fileName: params.fileName,
path: params.path,
fileData: params.fileData,
contentType: params.contentType,
upsert: params.upsert,
}),
headers: (params) => {
const headers: Record<string, string> = {
apikey: params.apiKey,
Authorization: `Bearer ${params.apiKey}`,
}
if (params.contentType) {
headers['Content-Type'] = params.contentType
}
if (params.upsert) {
headers['x-upsert'] = 'true'
}
return headers
},
body: (params) => {
// Return the file content wrapped in an object
// The actual upload will need to handle this appropriately
return {
content: params.fileContent,
}
},
},
transformResponse: async (response: Response) => {
let data
try {
data = await response.json()
} catch (parseError) {
throw new Error(`Failed to parse Supabase storage upload response: ${parseError}`)
}
return {
success: true,
output: {
message: 'Successfully uploaded file to storage',
results: data,
},
error: undefined,
}
},
outputs: {
message: { type: 'string', description: 'Operation status message' },
results: {
type: 'object',
description: 'Upload result including file path, bucket, and public URL',
description: 'Upload result including file path and metadata',
},
},
}
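
Net effect of the tool change: requests now go straight to the Supabase Storage object endpoint instead of the internal proxy route deleted above. How the URL and auth headers come together, with placeholder values:

const params = {
  projectId: 'abcd1234',
  bucket: 'avatars',
  path: 'users/42',
  fileName: 'photo.png',
  apiKey: '<service-role-key>',
}
// Mirrors the url builder above: ensure the folder path ends with a single slash
const folderPath = params.path.endsWith('/') ? params.path : `${params.path}/`
const url = `https://${params.projectId}.supabase.co/storage/v1/object/${params.bucket}/${folderPath}${params.fileName}`
// => https://abcd1234.supabase.co/storage/v1/object/avatars/users/42/photo.png
// Sent with: apikey: <service-role-key>, Authorization: Bearer <service-role-key>,
// plus x-upsert: 'true' when upsert is requested.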

View File

@@ -136,7 +136,7 @@ export interface SupabaseStorageUploadParams {
bucket: string
fileName: string
path?: string
fileData: any // UserFile object (basic mode) or string (advanced mode: base64/plain text)
fileContent: string
contentType?: string
upsert?: boolean
}

View File

@@ -52,7 +52,7 @@ services:
deploy:
resources:
limits:
memory: 1G
memory: 8G
healthcheck:
test: ['CMD', 'wget', '--spider', '--quiet', 'http://127.0.0.1:3002/health']
interval: 90s

View File

@@ -56,7 +56,7 @@ services:
deploy:
resources:
limits:
memory: 1G
memory: 8G
healthcheck:
test: ['CMD', 'wget', '--spider', '--quiet', 'http://127.0.0.1:3002/health']
interval: 90s

View File

@@ -42,7 +42,7 @@ services:
deploy:
resources:
limits:
memory: 1G
memory: 4G
environment:
- DATABASE_URL=postgresql://${POSTGRES_USER:-postgres}:${POSTGRES_PASSWORD:-postgres}@db:5432/${POSTGRES_DB:-simstudio}
- NEXT_PUBLIC_APP_URL=${NEXT_PUBLIC_APP_URL:-http://localhost:3000}

View File

@@ -13,10 +13,10 @@ app:
resources:
limits:
memory: "8Gi"
memory: "6Gi"
cpu: "2000m"
requests:
memory: "6Gi"
memory: "4Gi"
cpu: "1000m"
# Production URLs (REQUIRED - update with your actual domain names)
@@ -52,11 +52,11 @@ realtime:
resources:
limits:
memory: "1Gi"
cpu: "500m"
memory: "4Gi"
cpu: "1000m"
requests:
memory: "512Mi"
cpu: "250m"
memory: "2Gi"
cpu: "500m"
env:
NEXT_PUBLIC_APP_URL: "https://sim.acme.ai"

View File

@@ -29,10 +29,10 @@ app:
# Resource limits and requests
resources:
limits:
memory: "8Gi"
memory: "4Gi"
cpu: "2000m"
requests:
memory: "4Gi"
memory: "2Gi"
cpu: "1000m"
# Node selector for pod scheduling (leave empty to allow scheduling on any node)
@@ -245,11 +245,11 @@ realtime:
# Resource limits and requests
resources:
limits:
memory: "2Gi"
cpu: "1000m"
requests:
memory: "1Gi"
cpu: "500m"
requests:
memory: "512Mi"
cpu: "250m"
# Node selector for pod scheduling (leave empty to allow scheduling on any node)
nodeSelector: {}