Compare commits

**3 Commits** · `improvemen...` → `fix/kb-ws-...`

| Author | SHA1 | Date |
|---|---|---|
| | e94d74a541 | |
| | 974f3b9eff | |
| | d83c418111 | |

@@ -5,14 +5,6 @@ description: Essential actions for navigating and using the Sim workflow editor

import { Callout } from 'fumadocs-ui/components/callout'

export const ActionImage = ({ src, alt }) => (
  <img src={src} alt={alt} className="inline-block max-h-8 rounded border border-neutral-200 dark:border-neutral-700" />
)

export const ActionVideo = ({ src, alt }) => (
  <video src={src} alt={alt} autoPlay loop muted playsInline className="inline-block max-h-8 rounded border border-neutral-200 dark:border-neutral-700" />
)

A quick lookup for everyday actions in the Sim workflow editor. For keyboard shortcuts, see [Keyboard Shortcuts](/keyboard-shortcuts).

<Callout type="info">
@@ -21,209 +13,67 @@ A quick lookup for everyday actions in the Sim workflow editor. For keyboard sho

## Workspaces

<table>
  <thead>
    <tr><th>Action</th><th>How</th><th>Preview</th></tr>
  </thead>
  <tbody>
    <tr><td>Create a workspace</td><td>Click workspace dropdown → **New Workspace**</td><td><ActionVideo src="/static/quick-reference/create-workspace.mp4" alt="Create workspace" /></td></tr>
    <tr><td>Switch workspaces</td><td>Click workspace dropdown → Select workspace</td><td><ActionVideo src="/static/quick-reference/switch-workspace.mp4" alt="Switch workspaces" /></td></tr>
    <tr><td>Invite team members</td><td>Workspace settings → **Team** → **Invite**</td><td><ActionVideo src="/static/quick-reference/invite.mp4" alt="Invite team members" /></td></tr>
    <tr><td>Rename a workspace</td><td>Right-click workspace → **Rename**</td><td rowSpan={4}><ActionImage src="/static/quick-reference/workspace-context-menu.png" alt="Workspace context menu" /></td></tr>
    <tr><td>Duplicate a workspace</td><td>Right-click workspace → **Duplicate**</td></tr>
    <tr><td>Export a workspace</td><td>Right-click workspace → **Export**</td></tr>
    <tr><td>Delete a workspace</td><td>Right-click workspace → **Delete**</td></tr>
  </tbody>
</table>

| Action | How |
|--------|-----|
| Create a workspace | Click workspace dropdown in sidebar → **New Workspace** |
| Rename a workspace | Workspace settings → Edit name |
| Switch workspaces | Click workspace dropdown in sidebar → Select workspace |
| Invite team members | Workspace settings → **Team** → **Invite** |

## Workflows

<table>
  <thead>
    <tr><th>Action</th><th>How</th><th>Preview</th></tr>
  </thead>
  <tbody>
    <tr><td>Create a workflow</td><td>Click **+** button in sidebar</td><td><ActionImage src="/static/quick-reference/create-workflow.png" alt="Create workflow" /></td></tr>
    <tr><td>Reorder / move workflows</td><td>Drag workflow up/down or onto a folder</td><td><ActionVideo src="/static/quick-reference/reordering.mp4" alt="Reorder workflows" /></td></tr>
    <tr><td>Import a workflow</td><td>Click import button in sidebar → Select file</td><td><ActionImage src="/static/quick-reference/import-workflow.png" alt="Import workflow" /></td></tr>
    <tr><td>Multi-select workflows</td><td>`Mod+Click` or `Shift+Click` workflows in sidebar</td><td><ActionVideo src="/static/quick-reference/multiselect.mp4" alt="Multi-select workflows" /></td></tr>
    <tr><td>Open in new tab</td><td>Right-click workflow → **Open in New Tab**</td><td rowSpan={6}><ActionImage src="/static/quick-reference/workflow-context-menu.png" alt="Workflow context menu" /></td></tr>
    <tr><td>Rename a workflow</td><td>Right-click workflow → **Rename**</td></tr>
    <tr><td>Assign workflow color</td><td>Right-click workflow → **Change Color**</td></tr>
    <tr><td>Duplicate a workflow</td><td>Right-click workflow → **Duplicate**</td></tr>
    <tr><td>Export a workflow</td><td>Right-click workflow → **Export**</td></tr>
    <tr><td>Delete a workflow</td><td>Right-click workflow → **Delete**</td></tr>
    <tr><td>Rename a folder</td><td>Right-click folder → **Rename**</td><td rowSpan={6}><ActionImage src="/static/quick-reference/folder-context-menu.png" alt="Folder context menu" /></td></tr>
    <tr><td>Create workflow in folder</td><td>Right-click folder → **Create workflow**</td></tr>
    <tr><td>Create folder in folder</td><td>Right-click folder → **Create folder**</td></tr>
    <tr><td>Duplicate a folder</td><td>Right-click folder → **Duplicate**</td></tr>
    <tr><td>Export a folder</td><td>Right-click folder → **Export**</td></tr>
    <tr><td>Delete a folder</td><td>Right-click folder → **Delete**</td></tr>
  </tbody>
</table>

| Action | How |
|--------|-----|
| Create a workflow | Click **New Workflow** button or `Mod+Shift+A` |
| Rename a workflow | Double-click workflow name in sidebar, or right-click → **Rename** |
| Duplicate a workflow | Right-click workflow → **Duplicate** |
| Reorder workflows | Drag workflow up/down in the sidebar list |
| Import a workflow | Sidebar menu → **Import** → Select file |
| Create a folder | Right-click in sidebar → **New Folder** |
| Rename a folder | Right-click folder → **Rename** |
| Delete a folder | Right-click folder → **Delete** |
| Collapse/expand folder | Click folder arrow, or double-click folder |
| Move workflow to folder | Drag workflow onto folder in sidebar |
| Delete a workflow | Right-click workflow → **Delete** |
| Export a workflow | Right-click workflow → **Export** |
| Assign workflow color | Right-click workflow → **Change Color** |
| Multi-select workflows | `Mod+Click` or `Shift+Click` workflows in sidebar |
| Open in new tab | Right-click workflow → **Open in New Tab** |

## Blocks

<table>
  <thead>
    <tr><th>Action</th><th>How</th><th>Preview</th></tr>
  </thead>
  <tbody>
    <tr><td>Add a block</td><td>Drag from Toolbar panel, or right-click canvas → **Add Block**</td><td><ActionVideo src="/static/quick-reference/add-block.mp4" alt="Add a block" /></td></tr>
    <tr><td>Multi-select blocks</td><td>`Mod+Click` additional blocks, or right-drag to draw selection box</td><td><ActionVideo src="/static/quick-reference/multiselect-blocks.mp4" alt="Multi-select blocks" /></td></tr>
    <tr><td>Copy blocks</td><td>`Mod+C` with blocks selected</td><td rowSpan={2}><ActionVideo src="/static/quick-reference/copy-paste.mp4" alt="Copy and paste blocks" /></td></tr>
    <tr><td>Paste blocks</td><td>`Mod+V` to paste copied blocks</td></tr>
    <tr><td>Duplicate blocks</td><td>Right-click → **Duplicate**</td><td><ActionVideo src="/static/quick-reference/duplicate-block.mp4" alt="Duplicate blocks" /></td></tr>
    <tr><td>Delete blocks</td><td>`Delete` or `Backspace` key, or right-click → **Delete**</td><td><ActionImage src="/static/quick-reference/delete-block.png" alt="Delete block" /></td></tr>
    <tr><td>Rename a block</td><td>Click block name in header, or edit in the Editor panel</td><td><ActionVideo src="/static/quick-reference/rename-block.mp4" alt="Rename a block" /></td></tr>
    <tr><td>Enable/Disable a block</td><td>Right-click → **Enable/Disable**</td><td><ActionImage src="/static/quick-reference/disable-block.png" alt="Disable block" /></td></tr>
    <tr><td>Toggle handle orientation</td><td>Right-click → **Toggle Handles**</td><td><ActionVideo src="/static/quick-reference/toggle-handles.mp4" alt="Toggle handle orientation" /></td></tr>
    <tr><td>Configure a block</td><td>Select block → use Editor panel on right</td><td><ActionVideo src="/static/quick-reference/configure-block.mp4" alt="Configure a block" /></td></tr>
  </tbody>
</table>

| Action | How |
|--------|-----|
| Add a block | Drag from Toolbar panel, or right-click canvas → **Add Block** |
| Select a block | Click on the block |
| Multi-select blocks | `Mod+Click` additional blocks, or right-drag to draw selection box |
| Move blocks | Drag selected block(s) to new position |
| Copy blocks | `Mod+C` with blocks selected |
| Paste blocks | `Mod+V` to paste copied blocks |
| Duplicate blocks | Right-click → **Duplicate** |
| Delete blocks | `Delete` or `Backspace` key, or right-click → **Delete** |
| Rename a block | Click block name in header, or edit in the Editor panel |
| Enable/Disable a block | Right-click → **Enable/Disable** |
| Toggle handle orientation | Right-click → **Toggle Handles** |
| Toggle trigger mode | Right-click trigger block → **Toggle Trigger Mode** |
| Configure a block | Select block → use Editor panel on right |

## Connections

<table>
  <thead>
    <tr><th>Action</th><th>How</th><th>Preview</th></tr>
  </thead>
  <tbody>
    <tr><td>Create a connection</td><td>Drag from output handle to input handle</td><td><ActionVideo src="/static/quick-reference/connect-blocks.mp4" alt="Connect blocks" /></td></tr>
    <tr><td>Delete a connection</td><td>Click edge to select → `Delete` key</td><td><ActionVideo src="/static/quick-reference/delete-connection.mp4" alt="Delete connection" /></td></tr>
    <tr><td>Use output in another block</td><td>Drag connection tag into input field</td><td><ActionVideo src="/static/quick-reference/connection-tag.mp4" alt="Use connection tag" /></td></tr>
  </tbody>
</table>

| Action | How |
|--------|-----|
| Create a connection | Drag from output handle to input handle |
| Delete a connection | Click edge to select → `Delete` key |
| Use output in another block | Drag connection tag into input field |

## Canvas Navigation

| Action | How |
|--------|-----|
| Pan/move canvas | Left-drag on empty space, or scroll/trackpad |
| Zoom in/out | Scroll wheel or pinch gesture |
| Auto-layout | `Shift+L` |
| Draw selection box | Right-drag on empty canvas area |

## Panels & Views

@@ -233,8 +83,7 @@ A quick lookup for everyday actions in the Sim workflow editor. For keyboard sho
| Open Toolbar tab | Press `T` or click Toolbar tab |
| Open Editor tab | Press `E` or click Editor tab |
| Search toolbar | `Mod+F` |
| Search everything | `Mod+K` |
| Toggle manual mode | Click toggle button in editor fields to switch between manual and selector |
| Toggle advanced mode | Click toggle button on input fields |
| Resize panels | Drag panel edge |
| Collapse/expand sidebar | Click collapse button on sidebar |

@@ -274,8 +123,7 @@ A quick lookup for everyday actions in the Sim workflow editor. For keyboard sho
| Edit workflow variable | Variables tab → Click variable to edit |
| Delete workflow variable | Variables tab → Click delete icon on variable |
| Add environment variable | Settings → **Environment Variables** → **Add** |
| Reference a workflow variable | Use `<blockName.itemName>` syntax in block inputs |
| Reference an environment variable | Use `{{ENV_VAR}}` syntax in block inputs |
| Reference a variable | Use `{{variableName}}` syntax in block inputs |
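
For example, a single block input can combine a block-output reference with an environment variable; the block name, output field, and variable name below are illustrative, not taken from a real workflow:

```
Summarize <scraper.content> in two sentences and authenticate with {{OPENAI_API_KEY}}.
```
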
## Credentials

(Eight image files deleted in this change: 26 KiB, 27 KiB, 24 KiB, 48 KiB, 20 KiB, 25 KiB, 36 KiB, and 41 KiB.)

@@ -16,6 +16,10 @@ mockKnowledgeSchemas()
mockDrizzleOrm()
mockConsoleLogger()

vi.mock('@/lib/workspaces/permissions/utils', () => ({
  getUserEntityPermissions: vi.fn().mockResolvedValue({ role: 'owner' }),
}))

describe('Knowledge Base API Route', () => {
  const mockAuth$ = mockAuth()

@@ -86,6 +90,7 @@ describe('Knowledge Base API Route', () => {
  const validKnowledgeBaseData = {
    name: 'Test Knowledge Base',
    description: 'Test description',
    workspaceId: 'test-workspace-id',
    chunkingConfig: {
      maxSize: 1024,
      minSize: 100,

@@ -133,11 +138,25 @@ describe('Knowledge Base API Route', () => {
    expect(data.details).toBeDefined()
  })

  it('should require workspaceId', async () => {
    mockAuth$.mockAuthenticatedUser()

    const req = createMockRequest('POST', { name: 'Test KB' })
    const { POST } = await import('@/app/api/knowledge/route')
    const response = await POST(req)
    const data = await response.json()

    expect(response.status).toBe(400)
    expect(data.error).toBe('Invalid request data')
    expect(data.details).toBeDefined()
  })

  it('should validate chunking config constraints', async () => {
    mockAuth$.mockAuthenticatedUser()

    const invalidData = {
      name: 'Test KB',
      workspaceId: 'test-workspace-id',
      chunkingConfig: {
        maxSize: 100, // 100 tokens = 400 characters
        minSize: 500, // Invalid: minSize (500 chars) > maxSize (400 chars)

@@ -157,7 +176,7 @@ describe('Knowledge Base API Route', () => {
  it('should use default values for optional fields', async () => {
    mockAuth$.mockAuthenticatedUser()

    const minimalData = { name: 'Test KB' }
    const minimalData = { name: 'Test KB', workspaceId: 'test-workspace-id' }
    const req = createMockRequest('POST', minimalData)
    const { POST } = await import('@/app/api/knowledge/route')
    const response = await POST(req)

@@ -19,7 +19,7 @@ const logger = createLogger('KnowledgeBaseAPI')
const CreateKnowledgeBaseSchema = z.object({
  name: z.string().min(1, 'Name is required'),
  description: z.string().optional(),
  workspaceId: z.string().optional(),
  workspaceId: z.string().min(1, 'Workspace ID is required'),
  embeddingModel: z.literal('text-embedding-3-small').default('text-embedding-3-small'),
  embeddingDimension: z.literal(1536).default(1536),
  chunkingConfig: z
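
A quick way to see the effect of the stricter schema: the payload used by the old `minimalData` test no longer parses once `workspaceId` is required. A minimal sketch using a trimmed copy of the two relevant fields (not the full schema):

```ts
import { z } from 'zod'

// Trimmed-down copy of the tightened fields, for illustration only.
const CreateKnowledgeBaseSchema = z.object({
  name: z.string().min(1, 'Name is required'),
  workspaceId: z.string().min(1, 'Workspace ID is required'),
})

const result = CreateKnowledgeBaseSchema.safeParse({ name: 'Test KB' })
console.log(result.success) // false — the route now answers 400 'Invalid request data'
```
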
@@ -1,204 +0,0 @@
|
||||
import { db } from '@sim/db'
|
||||
import { member, permissions, user, workspace } from '@sim/db/schema'
|
||||
import { createLogger } from '@sim/logger'
|
||||
import { and, eq, or } from 'drizzle-orm'
|
||||
import { type NextRequest, NextResponse } from 'next/server'
|
||||
import { getSession } from '@/lib/auth'
|
||||
|
||||
const logger = createLogger('OrganizationWorkspacesAPI')
|
||||
|
||||
/**
|
||||
* GET /api/organizations/[id]/workspaces
|
||||
* Get workspaces related to the organization with optional filtering
|
||||
* Query parameters:
|
||||
* - ?available=true - Only workspaces where user can invite others (admin permissions)
|
||||
* - ?member=userId - Workspaces where specific member has access
|
||||
*/
|
||||
export async function GET(request: NextRequest, { params }: { params: Promise<{ id: string }> }) {
|
||||
try {
|
||||
const session = await getSession()
|
||||
|
||||
if (!session?.user?.id) {
|
||||
return NextResponse.json({ error: 'Unauthorized' }, { status: 401 })
|
||||
}
|
||||
|
||||
const { id: organizationId } = await params
|
||||
const url = new URL(request.url)
|
||||
const availableOnly = url.searchParams.get('available') === 'true'
|
||||
const memberId = url.searchParams.get('member')
|
||||
|
||||
// Verify user is a member of this organization
|
||||
const memberEntry = await db
|
||||
.select()
|
||||
.from(member)
|
||||
.where(and(eq(member.organizationId, organizationId), eq(member.userId, session.user.id)))
|
||||
.limit(1)
|
||||
|
||||
if (memberEntry.length === 0) {
|
||||
return NextResponse.json(
|
||||
{
|
||||
error: 'Forbidden - Not a member of this organization',
|
||||
},
|
||||
{ status: 403 }
|
||||
)
|
||||
}
|
||||
|
||||
const userRole = memberEntry[0].role
|
||||
const hasAdminAccess = ['owner', 'admin'].includes(userRole)
|
||||
|
||||
if (availableOnly) {
|
||||
// Get workspaces where user has admin permissions (can invite others)
|
||||
const availableWorkspaces = await db
|
||||
.select({
|
||||
id: workspace.id,
|
||||
name: workspace.name,
|
||||
ownerId: workspace.ownerId,
|
||||
createdAt: workspace.createdAt,
|
||||
isOwner: eq(workspace.ownerId, session.user.id),
|
||||
permissionType: permissions.permissionType,
|
||||
})
|
||||
.from(workspace)
|
||||
.leftJoin(
|
||||
permissions,
|
||||
and(
|
||||
eq(permissions.entityType, 'workspace'),
|
||||
eq(permissions.entityId, workspace.id),
|
||||
eq(permissions.userId, session.user.id)
|
||||
)
|
||||
)
|
||||
.where(
|
||||
or(
|
||||
// User owns the workspace
|
||||
eq(workspace.ownerId, session.user.id),
|
||||
// User has admin permission on the workspace
|
||||
and(
|
||||
eq(permissions.userId, session.user.id),
|
||||
eq(permissions.entityType, 'workspace'),
|
||||
eq(permissions.permissionType, 'admin')
|
||||
)
|
||||
)
|
||||
)
|
||||
|
||||
// Filter and format the results
|
||||
const workspacesWithInvitePermission = availableWorkspaces
|
||||
.filter((workspace) => {
|
||||
// Include if user owns the workspace OR has admin permission
|
||||
return workspace.isOwner || workspace.permissionType === 'admin'
|
||||
})
|
||||
.map((workspace) => ({
|
||||
id: workspace.id,
|
||||
name: workspace.name,
|
||||
isOwner: workspace.isOwner,
|
||||
canInvite: true, // All returned workspaces have invite permission
|
||||
createdAt: workspace.createdAt,
|
||||
}))
|
||||
|
||||
logger.info('Retrieved available workspaces for organization member', {
|
||||
organizationId,
|
||||
userId: session.user.id,
|
||||
workspaceCount: workspacesWithInvitePermission.length,
|
||||
})
|
||||
|
||||
return NextResponse.json({
|
||||
success: true,
|
||||
data: {
|
||||
workspaces: workspacesWithInvitePermission,
|
||||
totalCount: workspacesWithInvitePermission.length,
|
||||
filter: 'available',
|
||||
},
|
||||
})
|
||||
}
|
||||
|
||||
if (memberId && hasAdminAccess) {
|
||||
// Get workspaces where specific member has access (admin only)
|
||||
const memberWorkspaces = await db
|
||||
.select({
|
||||
id: workspace.id,
|
||||
name: workspace.name,
|
||||
ownerId: workspace.ownerId,
|
||||
isOwner: eq(workspace.ownerId, memberId),
|
||||
permissionType: permissions.permissionType,
|
||||
createdAt: permissions.createdAt,
|
||||
})
|
||||
.from(workspace)
|
||||
.leftJoin(
|
||||
permissions,
|
||||
and(
|
||||
eq(permissions.entityType, 'workspace'),
|
||||
eq(permissions.entityId, workspace.id),
|
||||
eq(permissions.userId, memberId)
|
||||
)
|
||||
)
|
||||
.where(
|
||||
or(
|
||||
// Member owns the workspace
|
||||
eq(workspace.ownerId, memberId),
|
||||
// Member has permissions on the workspace
|
||||
and(eq(permissions.userId, memberId), eq(permissions.entityType, 'workspace'))
|
||||
)
|
||||
)
|
||||
|
||||
const formattedWorkspaces = memberWorkspaces.map((workspace) => ({
|
||||
id: workspace.id,
|
||||
name: workspace.name,
|
||||
isOwner: workspace.isOwner,
|
||||
permission: workspace.permissionType,
|
||||
joinedAt: workspace.createdAt,
|
||||
createdAt: workspace.createdAt,
|
||||
}))
|
||||
|
||||
return NextResponse.json({
|
||||
success: true,
|
||||
data: {
|
||||
workspaces: formattedWorkspaces,
|
||||
totalCount: formattedWorkspaces.length,
|
||||
filter: 'member',
|
||||
memberId,
|
||||
},
|
||||
})
|
||||
}
|
||||
|
||||
// Default: Get all workspaces (basic info only for regular members)
|
||||
if (!hasAdminAccess) {
|
||||
return NextResponse.json({
|
||||
success: true,
|
||||
data: {
|
||||
workspaces: [],
|
||||
totalCount: 0,
|
||||
message: 'Workspace access information is only available to organization admins',
|
||||
},
|
||||
})
|
||||
}
|
||||
|
||||
// For admins: Get summary of all workspaces
|
||||
const allWorkspaces = await db
|
||||
.select({
|
||||
id: workspace.id,
|
||||
name: workspace.name,
|
||||
ownerId: workspace.ownerId,
|
||||
createdAt: workspace.createdAt,
|
||||
ownerName: user.name,
|
||||
})
|
||||
.from(workspace)
|
||||
.leftJoin(user, eq(workspace.ownerId, user.id))
|
||||
|
||||
return NextResponse.json({
|
||||
success: true,
|
||||
data: {
|
||||
workspaces: allWorkspaces,
|
||||
totalCount: allWorkspaces.length,
|
||||
filter: 'all',
|
||||
},
|
||||
userRole,
|
||||
hasAdminAccess,
|
||||
})
|
||||
} catch (error) {
|
||||
logger.error('Failed to get organization workspaces', { error })
|
||||
return NextResponse.json(
|
||||
{
|
||||
error: 'Internal server error',
|
||||
},
|
||||
{ status: 500 }
|
||||
)
|
||||
}
|
||||
}
|
||||
apps/sim/app/api/tools/supabase/storage-upload/route.ts (new file, 257 lines)
@@ -0,0 +1,257 @@
|
||||
import { createLogger } from '@sim/logger'
|
||||
import { type NextRequest, NextResponse } from 'next/server'
|
||||
import { z } from 'zod'
|
||||
import { checkInternalAuth } from '@/lib/auth/hybrid'
|
||||
import { generateRequestId } from '@/lib/core/utils/request'
|
||||
import { processSingleFileToUserFile } from '@/lib/uploads/utils/file-utils'
|
||||
import { downloadFileFromStorage } from '@/lib/uploads/utils/file-utils.server'
|
||||
|
||||
export const dynamic = 'force-dynamic'
|
||||
|
||||
const logger = createLogger('SupabaseStorageUploadAPI')
|
||||
|
||||
const SupabaseStorageUploadSchema = z.object({
|
||||
projectId: z.string().min(1, 'Project ID is required'),
|
||||
apiKey: z.string().min(1, 'API key is required'),
|
||||
bucket: z.string().min(1, 'Bucket name is required'),
|
||||
fileName: z.string().min(1, 'File name is required'),
|
||||
path: z.string().optional().nullable(),
|
||||
fileData: z.any(),
|
||||
contentType: z.string().optional().nullable(),
|
||||
upsert: z.boolean().optional().default(false),
|
||||
})
|
||||
|
||||
export async function POST(request: NextRequest) {
|
||||
const requestId = generateRequestId()
|
||||
|
||||
try {
|
||||
const authResult = await checkInternalAuth(request, { requireWorkflowId: false })
|
||||
|
||||
if (!authResult.success) {
|
||||
logger.warn(
|
||||
`[${requestId}] Unauthorized Supabase storage upload attempt: ${authResult.error}`
|
||||
)
|
||||
return NextResponse.json(
|
||||
{
|
||||
success: false,
|
||||
error: authResult.error || 'Authentication required',
|
||||
},
|
||||
{ status: 401 }
|
||||
)
|
||||
}
|
||||
|
||||
logger.info(
|
||||
`[${requestId}] Authenticated Supabase storage upload request via ${authResult.authType}`,
|
||||
{
|
||||
userId: authResult.userId,
|
||||
}
|
||||
)
|
||||
|
||||
const body = await request.json()
|
||||
const validatedData = SupabaseStorageUploadSchema.parse(body)
|
||||
|
||||
const fileData = validatedData.fileData
|
||||
const isStringInput = typeof fileData === 'string'
|
||||
|
||||
logger.info(`[${requestId}] Uploading to Supabase Storage`, {
|
||||
bucket: validatedData.bucket,
|
||||
fileName: validatedData.fileName,
|
||||
path: validatedData.path,
|
||||
fileDataType: isStringInput ? 'string' : 'object',
|
||||
})
|
||||
|
||||
if (!fileData) {
|
||||
return NextResponse.json(
|
||||
{
|
||||
success: false,
|
||||
error: 'fileData is required',
|
||||
},
|
||||
{ status: 400 }
|
||||
)
|
||||
}
|
||||
|
||||
let uploadBody: Buffer
|
||||
let uploadContentType: string | undefined
|
||||
|
||||
if (isStringInput) {
|
||||
let content = fileData as string
|
||||
|
||||
const dataUrlMatch = content.match(/^data:([^;]+);base64,(.+)$/s)
|
||||
if (dataUrlMatch) {
|
||||
const [, mimeType, base64Data] = dataUrlMatch
|
||||
content = base64Data
|
||||
if (!validatedData.contentType) {
|
||||
uploadContentType = mimeType
|
||||
}
|
||||
logger.info(`[${requestId}] Extracted base64 from data URL (MIME: ${mimeType})`)
|
||||
}
|
||||
|
||||
const cleanedContent = content.replace(/[\s\r\n]/g, '')
|
||||
const isLikelyBase64 = /^[A-Za-z0-9+/]*={0,2}$/.test(cleanedContent)
|
||||
|
||||
if (isLikelyBase64 && cleanedContent.length >= 4) {
|
||||
try {
|
||||
uploadBody = Buffer.from(cleanedContent, 'base64')
|
||||
|
||||
const expectedMinSize = Math.floor(cleanedContent.length * 0.7)
|
||||
const expectedMaxSize = Math.ceil(cleanedContent.length * 0.8)
|
||||
|
||||
if (
|
||||
uploadBody.length >= expectedMinSize &&
|
||||
uploadBody.length <= expectedMaxSize &&
|
||||
uploadBody.length > 0
|
||||
) {
|
||||
logger.info(
|
||||
`[${requestId}] Decoded base64 content: ${cleanedContent.length} chars -> ${uploadBody.length} bytes`
|
||||
)
|
||||
} else {
|
||||
const reEncoded = uploadBody.toString('base64')
|
||||
if (reEncoded !== cleanedContent) {
|
||||
logger.info(
|
||||
`[${requestId}] Content looked like base64 but re-encoding didn't match, using as plain text`
|
||||
)
|
||||
uploadBody = Buffer.from(content, 'utf-8')
|
||||
} else {
|
||||
logger.info(
|
||||
`[${requestId}] Decoded base64 content (verified): ${uploadBody.length} bytes`
|
||||
)
|
||||
}
|
||||
}
|
||||
} catch (decodeError) {
|
||||
logger.info(
|
||||
`[${requestId}] Failed to decode as base64, using as plain text: ${decodeError}`
|
||||
)
|
||||
uploadBody = Buffer.from(content, 'utf-8')
|
||||
}
|
||||
} else {
|
||||
uploadBody = Buffer.from(content, 'utf-8')
|
||||
logger.info(`[${requestId}] Using content as plain text (${uploadBody.length} bytes)`)
|
||||
}
|
||||
|
||||
uploadContentType =
|
||||
uploadContentType || validatedData.contentType || 'application/octet-stream'
|
||||
} else {
|
||||
const rawFile = fileData
|
||||
logger.info(`[${requestId}] Processing file object: ${rawFile.name || 'unknown'}`)
|
||||
|
||||
let userFile
|
||||
try {
|
||||
userFile = processSingleFileToUserFile(rawFile, requestId, logger)
|
||||
} catch (error) {
|
||||
return NextResponse.json(
|
||||
{
|
||||
success: false,
|
||||
error: error instanceof Error ? error.message : 'Failed to process file',
|
||||
},
|
||||
{ status: 400 }
|
||||
)
|
||||
}
|
||||
|
||||
const buffer = await downloadFileFromStorage(userFile, requestId, logger)
|
||||
|
||||
uploadBody = buffer
|
||||
uploadContentType = validatedData.contentType || userFile.type || 'application/octet-stream'
|
||||
}
|
||||
|
||||
let fullPath = validatedData.fileName
|
||||
if (validatedData.path) {
|
||||
const folderPath = validatedData.path.endsWith('/')
|
||||
? validatedData.path
|
||||
: `${validatedData.path}/`
|
||||
fullPath = `${folderPath}${validatedData.fileName}`
|
||||
}
|
||||
|
||||
const supabaseUrl = `https://${validatedData.projectId}.supabase.co/storage/v1/object/${validatedData.bucket}/${fullPath}`
|
||||
|
||||
const headers: Record<string, string> = {
|
||||
apikey: validatedData.apiKey,
|
||||
Authorization: `Bearer ${validatedData.apiKey}`,
|
||||
'Content-Type': uploadContentType,
|
||||
}
|
||||
|
||||
if (validatedData.upsert) {
|
||||
headers['x-upsert'] = 'true'
|
||||
}
|
||||
|
||||
logger.info(`[${requestId}] Sending to Supabase: ${supabaseUrl}`, {
|
||||
contentType: uploadContentType,
|
||||
bodySize: uploadBody.length,
|
||||
upsert: validatedData.upsert,
|
||||
})
|
||||
|
||||
const response = await fetch(supabaseUrl, {
|
||||
method: 'POST',
|
||||
headers,
|
||||
body: new Uint8Array(uploadBody),
|
||||
})
|
||||
|
||||
if (!response.ok) {
|
||||
const errorText = await response.text()
|
||||
let errorData
|
||||
try {
|
||||
errorData = JSON.parse(errorText)
|
||||
} catch {
|
||||
errorData = { message: errorText }
|
||||
}
|
||||
|
||||
logger.error(`[${requestId}] Supabase Storage upload failed:`, {
|
||||
status: response.status,
|
||||
statusText: response.statusText,
|
||||
error: errorData,
|
||||
})
|
||||
|
||||
return NextResponse.json(
|
||||
{
|
||||
success: false,
|
||||
error: errorData.message || errorData.error || `Upload failed: ${response.statusText}`,
|
||||
details: errorData,
|
||||
},
|
||||
{ status: response.status }
|
||||
)
|
||||
}
|
||||
|
||||
const result = await response.json()
|
||||
|
||||
logger.info(`[${requestId}] File uploaded successfully to Supabase Storage`, {
|
||||
bucket: validatedData.bucket,
|
||||
path: fullPath,
|
||||
})
|
||||
|
||||
const publicUrl = `https://${validatedData.projectId}.supabase.co/storage/v1/object/public/${validatedData.bucket}/${fullPath}`
|
||||
|
||||
return NextResponse.json({
|
||||
success: true,
|
||||
output: {
|
||||
message: 'Successfully uploaded file to storage',
|
||||
results: {
|
||||
...result,
|
||||
path: fullPath,
|
||||
bucket: validatedData.bucket,
|
||||
publicUrl,
|
||||
},
|
||||
},
|
||||
})
|
||||
} catch (error) {
|
||||
if (error instanceof z.ZodError) {
|
||||
logger.warn(`[${requestId}] Invalid request data`, { errors: error.errors })
|
||||
return NextResponse.json(
|
||||
{
|
||||
success: false,
|
||||
error: 'Invalid request data',
|
||||
details: error.errors,
|
||||
},
|
||||
{ status: 400 }
|
||||
)
|
||||
}
|
||||
|
||||
logger.error(`[${requestId}] Error uploading to Supabase Storage:`, error)
|
||||
|
||||
return NextResponse.json(
|
||||
{
|
||||
success: false,
|
||||
error: error instanceof Error ? error.message : 'Internal server error',
|
||||
},
|
||||
{ status: 500 }
|
||||
)
|
||||
}
|
||||
}
|
||||
@@ -1,379 +0,0 @@
|
||||
import { createLogger } from '@sim/logger'
|
||||
import { type NextRequest, NextResponse } from 'next/server'
|
||||
import { z } from 'zod'
|
||||
import { checkInternalAuth } from '@/lib/auth/hybrid'
|
||||
import { generateRequestId } from '@/lib/core/utils/request'
|
||||
import { getBaseUrl } from '@/lib/core/utils/urls'
|
||||
import { StorageService } from '@/lib/uploads'
|
||||
import {
|
||||
extractStorageKey,
|
||||
inferContextFromKey,
|
||||
isInternalFileUrl,
|
||||
} from '@/lib/uploads/utils/file-utils'
|
||||
import { verifyFileAccess } from '@/app/api/files/authorization'
|
||||
|
||||
export const dynamic = 'force-dynamic'
|
||||
|
||||
const logger = createLogger('SupabaseStorageUploadAPI')
|
||||
|
||||
const SupabaseStorageUploadSchema = z.object({
|
||||
apiKey: z.string().min(1, 'API key is required'),
|
||||
projectId: z.string().min(1, 'Project ID is required'),
|
||||
bucket: z.string().min(1, 'Bucket name is required'),
|
||||
fileName: z.string().min(1, 'File name is required'),
|
||||
path: z.string().optional().nullable(),
|
||||
fileUpload: z
|
||||
.object({
|
||||
name: z.string().optional(),
|
||||
type: z.string().optional(),
|
||||
url: z.string().optional(),
|
||||
path: z.string().optional(),
|
||||
})
|
||||
.optional()
|
||||
.nullable(),
|
||||
fileContent: z.string().optional().nullable(),
|
||||
contentType: z.string().optional().nullable(),
|
||||
upsert: z.boolean().optional().default(false),
|
||||
})
|
||||
|
||||
/**
|
||||
* Detects if a string is base64 encoded and decodes it to a Buffer.
|
||||
* Handles both standard base64 and base64url encoding.
|
||||
*/
|
||||
function decodeBase64ToBuffer(content: string): Buffer {
|
||||
// Remove data URI prefix if present (e.g., "data:application/pdf;base64,")
|
||||
const base64Content = content.includes(',') ? content.split(',')[1] : content
|
||||
|
||||
// Convert base64url to standard base64 if needed
|
||||
let normalizedBase64 = base64Content
|
||||
if (base64Content.includes('-') || base64Content.includes('_')) {
|
||||
normalizedBase64 = base64Content.replace(/-/g, '+').replace(/_/g, '/')
|
||||
}
|
||||
|
||||
// Add padding if necessary
|
||||
const padding = normalizedBase64.length % 4
|
||||
if (padding > 0) {
|
||||
normalizedBase64 += '='.repeat(4 - padding)
|
||||
}
|
||||
|
||||
return Buffer.from(normalizedBase64, 'base64')
|
||||
}
|
||||
|
||||
/**
|
||||
* Checks if a string appears to be base64 encoded.
|
||||
*/
|
||||
function isBase64(str: string): boolean {
|
||||
// Remove data URI prefix if present
|
||||
const content = str.includes(',') ? str.split(',')[1] : str
|
||||
|
||||
// Check if it matches base64 pattern (including base64url)
|
||||
const base64Regex = /^[A-Za-z0-9+/_-]*={0,2}$/
|
||||
if (!base64Regex.test(content)) {
|
||||
return false
|
||||
}
|
||||
|
||||
// Additional heuristic: base64 strings are typically longer and don't contain spaces
|
||||
if (content.length < 4 || content.includes(' ')) {
|
||||
return false
|
||||
}
|
||||
|
||||
// Try to decode and check if it produces valid bytes
|
||||
try {
|
||||
const decoded = decodeBase64ToBuffer(str)
|
||||
// If decoded length is significantly smaller than input, it's likely base64
|
||||
return decoded.length < content.length
|
||||
} catch {
|
||||
return false
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Infer content type from file extension
|
||||
*/
|
||||
function inferContentType(fileName: string): string {
|
||||
const ext = fileName.split('.').pop()?.toLowerCase()
|
||||
const mimeTypes: Record<string, string> = {
|
||||
pdf: 'application/pdf',
|
||||
png: 'image/png',
|
||||
jpg: 'image/jpeg',
|
||||
jpeg: 'image/jpeg',
|
||||
gif: 'image/gif',
|
||||
webp: 'image/webp',
|
||||
svg: 'image/svg+xml',
|
||||
txt: 'text/plain',
|
||||
html: 'text/html',
|
||||
css: 'text/css',
|
||||
js: 'application/javascript',
|
||||
json: 'application/json',
|
||||
xml: 'application/xml',
|
||||
zip: 'application/zip',
|
||||
doc: 'application/msword',
|
||||
docx: 'application/vnd.openxmlformats-officedocument.wordprocessingml.document',
|
||||
xls: 'application/vnd.ms-excel',
|
||||
xlsx: 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet',
|
||||
ppt: 'application/vnd.ms-powerpoint',
|
||||
pptx: 'application/vnd.openxmlformats-officedocument.presentationml.presentation',
|
||||
mp3: 'audio/mpeg',
|
||||
mp4: 'video/mp4',
|
||||
wav: 'audio/wav',
|
||||
csv: 'text/csv',
|
||||
}
|
||||
return mimeTypes[ext || ''] || 'application/octet-stream'
|
||||
}
|
||||
|
||||
export async function POST(request: NextRequest) {
|
||||
const requestId = generateRequestId()
|
||||
|
||||
try {
|
||||
const authResult = await checkInternalAuth(request, { requireWorkflowId: false })
|
||||
|
||||
if (!authResult.success || !authResult.userId) {
|
||||
logger.warn(
|
||||
`[${requestId}] Unauthorized Supabase storage upload attempt: ${authResult.error}`
|
||||
)
|
||||
return NextResponse.json(
|
||||
{
|
||||
success: false,
|
||||
error: authResult.error || 'Authentication required',
|
||||
},
|
||||
{ status: 401 }
|
||||
)
|
||||
}
|
||||
|
||||
const userId = authResult.userId
|
||||
|
||||
logger.info(
|
||||
`[${requestId}] Authenticated Supabase storage upload request via ${authResult.authType}`,
|
||||
{ userId }
|
||||
)
|
||||
|
||||
const body = await request.json()
|
||||
const validatedData = SupabaseStorageUploadSchema.parse(body)
|
||||
|
||||
// Build the full file path
|
||||
let fullPath = validatedData.fileName
|
||||
if (validatedData.path) {
|
||||
const folderPath = validatedData.path.endsWith('/')
|
||||
? validatedData.path
|
||||
: `${validatedData.path}/`
|
||||
fullPath = `${folderPath}${validatedData.fileName}`
|
||||
}
|
||||
|
||||
logger.info(`[${requestId}] Uploading to Supabase Storage`, {
|
||||
projectId: validatedData.projectId,
|
||||
bucket: validatedData.bucket,
|
||||
path: fullPath,
|
||||
upsert: validatedData.upsert,
|
||||
hasFileUpload: !!validatedData.fileUpload,
|
||||
hasFileContent: !!validatedData.fileContent,
|
||||
})
|
||||
|
||||
// Determine content type
|
||||
let contentType = validatedData.contentType
|
||||
if (!contentType && validatedData.fileUpload?.type) {
|
||||
contentType = validatedData.fileUpload.type
|
||||
}
|
||||
if (!contentType) {
|
||||
contentType = inferContentType(validatedData.fileName)
|
||||
}
|
||||
|
||||
// Get the file content - either from fileUpload (internal storage) or fileContent (base64)
|
||||
let uploadBody: Buffer
|
||||
|
||||
if (validatedData.fileUpload) {
|
||||
// Handle file upload from internal storage
|
||||
const fileUrl = validatedData.fileUpload.url || validatedData.fileUpload.path
|
||||
|
||||
if (!fileUrl) {
|
||||
return NextResponse.json(
|
||||
{
|
||||
success: false,
|
||||
error: 'File upload is missing URL or path',
|
||||
},
|
||||
{ status: 400 }
|
||||
)
|
||||
}
|
||||
|
||||
logger.info(`[${requestId}] Processing file upload from: ${fileUrl}`)
|
||||
|
||||
// Check if it's an internal file URL (workspace file)
|
||||
if (isInternalFileUrl(fileUrl)) {
|
||||
try {
|
||||
const storageKey = extractStorageKey(fileUrl)
|
||||
const context = inferContextFromKey(storageKey)
|
||||
|
||||
const hasAccess = await verifyFileAccess(storageKey, userId, undefined, context, false)
|
||||
|
||||
if (!hasAccess) {
|
||||
logger.warn(`[${requestId}] Unauthorized file access attempt`, {
|
||||
userId,
|
||||
key: storageKey,
|
||||
context,
|
||||
})
|
||||
return NextResponse.json(
|
||||
{
|
||||
success: false,
|
||||
error: 'File not found or access denied',
|
||||
},
|
||||
{ status: 404 }
|
||||
)
|
||||
}
|
||||
|
||||
// Download file from internal storage
|
||||
const fileBuffer = await StorageService.downloadFile({ key: storageKey, context })
|
||||
uploadBody = Buffer.from(fileBuffer)
|
||||
logger.info(
|
||||
`[${requestId}] Downloaded file from internal storage: ${fileBuffer.byteLength} bytes`
|
||||
)
|
||||
} catch (error) {
|
||||
logger.error(`[${requestId}] Failed to download from internal storage:`, error)
|
||||
return NextResponse.json(
|
||||
{
|
||||
success: false,
|
||||
error: 'Failed to access uploaded file',
|
||||
},
|
||||
{ status: 500 }
|
||||
)
|
||||
}
|
||||
} else {
|
||||
// External URL - fetch the file
|
||||
let fetchUrl = fileUrl
|
||||
if (fetchUrl.startsWith('/')) {
|
||||
const baseUrl = getBaseUrl()
|
||||
fetchUrl = `${baseUrl}${fetchUrl}`
|
||||
}
|
||||
|
||||
try {
|
||||
const response = await fetch(fetchUrl)
|
||||
if (!response.ok) {
|
||||
throw new Error(`Failed to fetch file: ${response.status} ${response.statusText}`)
|
||||
}
|
||||
const arrayBuffer = await response.arrayBuffer()
|
||||
uploadBody = Buffer.from(arrayBuffer)
|
||||
logger.info(`[${requestId}] Downloaded file from URL: ${uploadBody.length} bytes`)
|
||||
} catch (error) {
|
||||
logger.error(`[${requestId}] Failed to fetch file from URL:`, error)
|
||||
return NextResponse.json(
|
||||
{
|
||||
success: false,
|
||||
error: `Failed to fetch file from URL: ${error instanceof Error ? error.message : 'Unknown error'}`,
|
||||
},
|
||||
{ status: 500 }
|
||||
)
|
||||
}
|
||||
}
|
||||
} else if (validatedData.fileContent) {
|
||||
// Handle direct file content (base64 or plain text)
|
||||
if (isBase64(validatedData.fileContent)) {
|
||||
logger.info(`[${requestId}] Detected base64 content, decoding to binary`)
|
||||
uploadBody = decodeBase64ToBuffer(validatedData.fileContent)
|
||||
} else {
|
||||
logger.info(`[${requestId}] Using plain text content`)
|
||||
uploadBody = Buffer.from(validatedData.fileContent, 'utf-8')
|
||||
}
|
||||
} else {
|
||||
return NextResponse.json(
|
||||
{
|
||||
success: false,
|
||||
error: 'Either fileUpload or fileContent is required',
|
||||
},
|
||||
{ status: 400 }
|
||||
)
|
||||
}
|
||||
|
||||
logger.info(`[${requestId}] Upload body size: ${uploadBody.length} bytes`)
|
||||
|
||||
// Build Supabase Storage URL
|
||||
const supabaseUrl = `https://${validatedData.projectId}.supabase.co/storage/v1/object/${validatedData.bucket}/${fullPath}`
|
||||
|
||||
// Build headers
|
||||
const headers: Record<string, string> = {
|
||||
apikey: validatedData.apiKey,
|
||||
Authorization: `Bearer ${validatedData.apiKey}`,
|
||||
'Content-Type': contentType,
|
||||
}
|
||||
|
||||
if (validatedData.upsert) {
|
||||
headers['x-upsert'] = 'true'
|
||||
}
|
||||
|
||||
// Make the request to Supabase Storage
|
||||
// Convert Buffer to Uint8Array for fetch compatibility
|
||||
const response = await fetch(supabaseUrl, {
|
||||
method: 'POST',
|
||||
headers,
|
||||
body: new Uint8Array(uploadBody),
|
||||
})
|
||||
|
||||
if (!response.ok) {
|
||||
let errorData: any
|
||||
try {
|
||||
errorData = await response.json()
|
||||
} catch {
|
||||
errorData = await response.text()
|
||||
}
|
||||
|
||||
logger.error(`[${requestId}] Supabase Storage upload failed`, {
|
||||
status: response.status,
|
||||
statusText: response.statusText,
|
||||
error: errorData,
|
||||
})
|
||||
|
||||
return NextResponse.json(
|
||||
{
|
||||
success: false,
|
||||
error:
|
||||
typeof errorData === 'object' && errorData.message
|
||||
? errorData.message
|
||||
: `Upload failed: ${response.status} ${response.statusText}`,
|
||||
},
|
||||
{ status: response.status }
|
||||
)
|
||||
}
|
||||
|
||||
const result = await response.json()
|
||||
|
||||
logger.info(`[${requestId}] File uploaded successfully to Supabase Storage`, {
|
||||
bucket: validatedData.bucket,
|
||||
path: fullPath,
|
||||
})
|
||||
|
||||
// Build public URL for reference
|
||||
const publicUrl = `https://${validatedData.projectId}.supabase.co/storage/v1/object/public/${validatedData.bucket}/${fullPath}`
|
||||
|
||||
return NextResponse.json({
|
||||
success: true,
|
||||
output: {
|
||||
message: 'Successfully uploaded file to storage',
|
||||
results: {
|
||||
...result,
|
||||
publicUrl,
|
||||
bucket: validatedData.bucket,
|
||||
path: fullPath,
|
||||
},
|
||||
},
|
||||
})
|
||||
} catch (error) {
|
||||
if (error instanceof z.ZodError) {
|
||||
logger.warn(`[${requestId}] Invalid request data`, { errors: error.errors })
|
||||
return NextResponse.json(
|
||||
{
|
||||
success: false,
|
||||
error: 'Invalid request data',
|
||||
details: error.errors,
|
||||
},
|
||||
{ status: 400 }
|
||||
)
|
||||
}
|
||||
|
||||
logger.error(`[${requestId}] Error uploading to Supabase Storage:`, error)
|
||||
|
||||
return NextResponse.json(
|
||||
{
|
||||
success: false,
|
||||
error: error instanceof Error ? error.message : 'Internal server error',
|
||||
},
|
||||
{ status: 500 }
|
||||
)
|
||||
}
|
||||
}
|
||||
@@ -338,6 +338,11 @@ const arePropsEqual = (prevProps: SubBlockProps, nextProps: SubBlockProps): bool
  const configEqual =
    prevProps.config.id === nextProps.config.id && prevProps.config.type === nextProps.config.type

  const canonicalToggleEqual =
    !!prevProps.canonicalToggle === !!nextProps.canonicalToggle &&
    prevProps.canonicalToggle?.mode === nextProps.canonicalToggle?.mode &&
    prevProps.canonicalToggle?.disabled === nextProps.canonicalToggle?.disabled

  return (
    prevProps.blockId === nextProps.blockId &&
    configEqual &&

@@ -346,8 +351,7 @@ const arePropsEqual = (prevProps: SubBlockProps, nextProps: SubBlockProps): bool
    prevProps.disabled === nextProps.disabled &&
    prevProps.fieldDiffStatus === nextProps.fieldDiffStatus &&
    prevProps.allowExpandInPreview === nextProps.allowExpandInPreview &&
    prevProps.canonicalToggle?.mode === nextProps.canonicalToggle?.mode &&
    prevProps.canonicalToggle?.disabled === nextProps.canonicalToggle?.disabled
    canonicalToggleEqual
  )
}

@@ -214,15 +214,6 @@ export const A2ABlock: BlockConfig<A2AResponse> = {
    ],
    config: {
      tool: (params) => params.operation as string,
      params: (params) => {
        const { fileUpload, fileReference, ...rest } = params
        const hasFileUpload = Array.isArray(fileUpload) ? fileUpload.length > 0 : !!fileUpload
        const files = hasFileUpload ? fileUpload : fileReference
        return {
          ...rest,
          ...(files ? { files } : {}),
        }
      },
    },
  },
  inputs: {

@@ -661,12 +661,25 @@ Return ONLY the PostgREST filter expression - no explanations, no markdown, no e
      placeholder: 'folder/subfolder/',
      condition: { field: 'operation', value: 'storage_upload' },
    },
    {
      id: 'file',
      title: 'File',
      type: 'file-upload',
      canonicalParamId: 'fileData',
      placeholder: 'Upload file to storage',
      condition: { field: 'operation', value: 'storage_upload' },
      mode: 'basic',
      multiple: false,
      required: true,
    },
    {
      id: 'fileContent',
      title: 'File Content',
      type: 'code',
      canonicalParamId: 'fileData',
      placeholder: 'Base64 encoded for binary files, or plain text',
      condition: { field: 'operation', value: 'storage_upload' },
      mode: 'advanced',
      required: true,
    },
    {
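
Both sub-blocks resolve to the same canonical `fileData` parameter, so the tool receives one of two shapes depending on the mode. A sketch of the two payloads (field values are placeholders; the exact UserFile shape comes from Sim's upload utilities and is only partially shown):

```ts
// Basic mode: the file-upload sub-block yields a stored-file reference object.
const basicModeParams = {
  operation: 'storage_upload',
  bucket: 'avatars',
  fileName: 'profile.png',
  fileData: { name: 'profile.png', type: 'image/png' /* ...storage reference fields */ },
}

// Advanced mode: the code sub-block yields a raw string (base64, data URL, or plain text).
const advancedModeParams = {
  operation: 'storage_upload',
  bucket: 'exports',
  fileName: 'notes.txt',
  fileData: 'Plain text is uploaded as-is; base64 and data URLs are decoded server-side.',
}
```
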
@@ -680,6 +680,10 @@ export function useCollaborativeWorkflow() {
|
||||
previousPositions?: Map<string, { x: number; y: number; parentId?: string }>
|
||||
}
|
||||
) => {
|
||||
if (isBaselineDiffView) {
|
||||
return
|
||||
}
|
||||
|
||||
if (!isInActiveRoom()) {
|
||||
logger.debug('Skipping batch position update - not in active workflow')
|
||||
return
|
||||
@@ -725,7 +729,7 @@ export function useCollaborativeWorkflow() {
|
||||
}
|
||||
}
|
||||
},
|
||||
[addToQueue, activeWorkflowId, session?.user?.id, isInActiveRoom, undoRedo]
|
||||
[isBaselineDiffView, addToQueue, activeWorkflowId, session?.user?.id, isInActiveRoom, undoRedo]
|
||||
)
|
||||
|
||||
const collaborativeUpdateBlockName = useCallback(
|
||||
@@ -817,6 +821,10 @@ export function useCollaborativeWorkflow() {
|
||||
|
||||
const collaborativeBatchToggleBlockEnabled = useCallback(
|
||||
(ids: string[]) => {
|
||||
if (isBaselineDiffView) {
|
||||
return
|
||||
}
|
||||
|
||||
if (ids.length === 0) return
|
||||
|
||||
const previousStates: Record<string, boolean> = {}
|
||||
@@ -849,7 +857,7 @@ export function useCollaborativeWorkflow() {
|
||||
|
||||
undoRedo.recordBatchToggleEnabled(validIds, previousStates)
|
||||
},
|
||||
[addToQueue, activeWorkflowId, session?.user?.id, undoRedo]
|
||||
[isBaselineDiffView, addToQueue, activeWorkflowId, session?.user?.id, undoRedo]
|
||||
)
|
||||
|
||||
const collaborativeBatchUpdateParent = useCallback(
|
||||
@@ -861,6 +869,10 @@ export function useCollaborativeWorkflow() {
|
||||
affectedEdges: Edge[]
|
||||
}>
|
||||
) => {
|
||||
if (isBaselineDiffView) {
|
||||
return
|
||||
}
|
||||
|
||||
if (!isInActiveRoom()) {
|
||||
logger.debug('Skipping batch update parent - not in active workflow')
|
||||
return
|
||||
@@ -931,7 +943,7 @@ export function useCollaborativeWorkflow() {
|
||||
|
||||
logger.debug('Batch updated parent for blocks', { updateCount: updates.length })
|
||||
},
|
||||
[isInActiveRoom, undoRedo, addToQueue, activeWorkflowId, session?.user?.id]
|
||||
[isBaselineDiffView, isInActiveRoom, undoRedo, addToQueue, activeWorkflowId, session?.user?.id]
|
||||
)
|
||||
|
||||
const collaborativeToggleBlockAdvancedMode = useCallback(
|
||||
@@ -951,18 +963,37 @@ export function useCollaborativeWorkflow() {
|
||||
|
||||
const collaborativeSetBlockCanonicalMode = useCallback(
|
||||
(id: string, canonicalId: string, canonicalMode: 'basic' | 'advanced') => {
|
||||
executeQueuedOperation(
|
||||
BLOCK_OPERATIONS.UPDATE_CANONICAL_MODE,
|
||||
OPERATION_TARGETS.BLOCK,
|
||||
{ id, canonicalId, canonicalMode },
|
||||
() => useWorkflowStore.getState().setBlockCanonicalMode(id, canonicalId, canonicalMode)
|
||||
)
|
||||
if (isBaselineDiffView) {
|
||||
return
|
||||
}
|
||||
|
||||
useWorkflowStore.getState().setBlockCanonicalMode(id, canonicalId, canonicalMode)
|
||||
|
||||
if (!activeWorkflowId) {
|
||||
return
|
||||
}
|
||||
|
||||
const operationId = crypto.randomUUID()
|
||||
addToQueue({
|
||||
id: operationId,
|
||||
operation: {
|
||||
operation: BLOCK_OPERATIONS.UPDATE_CANONICAL_MODE,
|
||||
target: OPERATION_TARGETS.BLOCK,
|
||||
payload: { id, canonicalId, canonicalMode },
|
||||
},
|
||||
workflowId: activeWorkflowId,
|
||||
userId: session?.user?.id || 'unknown',
|
||||
})
|
||||
},
|
||||
[executeQueuedOperation]
|
||||
[isBaselineDiffView, activeWorkflowId, addToQueue, session?.user?.id]
|
||||
)
|
||||
|
||||
const collaborativeBatchToggleBlockHandles = useCallback(
|
||||
(ids: string[]) => {
|
||||
if (isBaselineDiffView) {
|
||||
return
|
||||
}
|
||||
|
||||
if (ids.length === 0) return
|
||||
|
||||
const previousStates: Record<string, boolean> = {}
|
||||
@@ -995,11 +1026,15 @@ export function useCollaborativeWorkflow() {
|
||||
|
||||
undoRedo.recordBatchToggleHandles(validIds, previousStates)
|
||||
},
|
||||
[addToQueue, activeWorkflowId, session?.user?.id, undoRedo]
|
||||
[isBaselineDiffView, addToQueue, activeWorkflowId, session?.user?.id, undoRedo]
|
||||
)
|
||||
|
||||
const collaborativeBatchAddEdges = useCallback(
|
||||
(edges: Edge[], options?: { skipUndoRedo?: boolean }) => {
|
||||
if (isBaselineDiffView) {
|
||||
return false
|
||||
}
|
||||
|
||||
if (!isInActiveRoom()) {
|
||||
logger.debug('Skipping batch add edges - not in active workflow')
|
||||
return false
|
||||
@@ -1035,11 +1070,15 @@ export function useCollaborativeWorkflow() {
|
||||
|
||||
return true
|
||||
},
|
||||
[addToQueue, activeWorkflowId, session?.user?.id, isInActiveRoom, undoRedo]
|
||||
[isBaselineDiffView, addToQueue, activeWorkflowId, session?.user?.id, isInActiveRoom, undoRedo]
|
||||
)
|
||||
|
||||
const collaborativeBatchRemoveEdges = useCallback(
|
||||
(edgeIds: string[], options?: { skipUndoRedo?: boolean }) => {
|
||||
if (isBaselineDiffView) {
|
||||
return false
|
||||
}
|
||||
|
||||
if (!isInActiveRoom()) {
|
||||
logger.debug('Skipping batch remove edges - not in active workflow')
|
||||
return false
|
||||
@@ -1089,7 +1128,7 @@ export function useCollaborativeWorkflow() {
|
||||
logger.info('Batch removed edges', { count: validEdgeIds.length })
|
||||
return true
|
||||
},
|
||||
[isInActiveRoom, addToQueue, activeWorkflowId, session, undoRedo]
|
||||
[isBaselineDiffView, isInActiveRoom, addToQueue, activeWorkflowId, session, undoRedo]
|
||||
)
|
||||
|
||||
const collaborativeSetSubblockValue = useCallback(
|
||||
@@ -1165,6 +1204,10 @@ export function useCollaborativeWorkflow() {
|
||||
(blockId: string, subblockId: string, value: any) => {
|
||||
if (isApplyingRemoteChange.current) return
|
||||
|
||||
if (isBaselineDiffView) {
|
||||
return
|
||||
}
|
||||
|
||||
if (!isInActiveRoom()) {
|
||||
logger.debug('Skipping tag selection - not in active workflow', {
|
||||
currentWorkflowId,
|
||||
@@ -1192,7 +1235,14 @@ export function useCollaborativeWorkflow() {
|
||||
userId: session?.user?.id || 'unknown',
|
||||
})
|
||||
},
|
||||
[addToQueue, currentWorkflowId, activeWorkflowId, session?.user?.id, isInActiveRoom]
|
||||
[
|
||||
isBaselineDiffView,
|
||||
addToQueue,
|
||||
currentWorkflowId,
|
||||
activeWorkflowId,
|
||||
session?.user?.id,
|
||||
isInActiveRoom,
|
||||
]
|
||||
)
|
||||
|
||||
const collaborativeUpdateLoopType = useCallback(
|
||||
@@ -1538,6 +1588,10 @@ export function useCollaborativeWorkflow() {
|
||||
|
||||
const collaborativeBatchRemoveBlocks = useCallback(
|
||||
(blockIds: string[], options?: { skipUndoRedo?: boolean }) => {
|
||||
if (isBaselineDiffView) {
|
||||
return false
|
||||
}
|
||||
|
||||
if (!isInActiveRoom()) {
|
||||
logger.debug('Skipping batch remove blocks - not in active workflow')
|
||||
return false
|
||||
@@ -1619,6 +1673,7 @@ export function useCollaborativeWorkflow() {
|
||||
return true
|
||||
},
|
||||
[
|
||||
isBaselineDiffView,
|
||||
addToQueue,
|
||||
activeWorkflowId,
|
||||
session?.user?.id,
|
||||
|
||||
@@ -37,6 +37,13 @@ export const knowledgeBaseServerTool: BaseServerTool<KnowledgeBaseArgs, Knowledg
      }
    }

    if (!args.workspaceId) {
      return {
        success: false,
        message: 'Workspace ID is required for creating a knowledge base',
      }
    }

    const requestId = crypto.randomUUID().slice(0, 8)
    const newKnowledgeBase = await createKnowledgeBase(
      {

@@ -79,7 +79,7 @@ export const KnowledgeBaseArgsSchema = z.object({
  name: z.string().optional(),
  /** Description of the knowledge base (optional for create) */
  description: z.string().optional(),
  /** Workspace ID to associate with (optional for create/list) */
  /** Workspace ID to associate with (required for create, optional for list) */
  workspaceId: z.string().optional(),
  /** Knowledge base ID (required for get, query) */
  knowledgeBaseId: z.string().optional(),

@@ -86,18 +86,16 @@ export async function createKnowledgeBase(
  const kbId = randomUUID()
  const now = new Date()

  if (data.workspaceId) {
    const hasPermission = await getUserEntityPermissions(data.userId, 'workspace', data.workspaceId)
    if (hasPermission === null) {
      throw new Error('User does not have permission to create knowledge bases in this workspace')
    }
  const hasPermission = await getUserEntityPermissions(data.userId, 'workspace', data.workspaceId)
  if (hasPermission === null) {
    throw new Error('User does not have permission to create knowledge bases in this workspace')
  }

  const newKnowledgeBase = {
    id: kbId,
    name: data.name,
    description: data.description ?? null,
    workspaceId: data.workspaceId ?? null,
    workspaceId: data.workspaceId,
    userId: data.userId,
    tokenCount: 0,
    embeddingModel: data.embeddingModel,

@@ -122,7 +120,7 @@ export async function createKnowledgeBase(
    chunkingConfig: data.chunkingConfig,
    createdAt: now,
    updatedAt: now,
    workspaceId: data.workspaceId ?? null,
    workspaceId: data.workspaceId,
    docCount: 0,
  }
}

@@ -32,7 +32,7 @@ export interface KnowledgeBaseWithCounts {
export interface CreateKnowledgeBaseData {
  name: string
  description?: string
  workspaceId?: string
  workspaceId: string
  embeddingModel: 'text-embedding-3-small'
  embeddingDimension: 1536
  chunkingConfig: ChunkingConfig
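
Because the check no longer sits behind `if (data.workspaceId)`, every create call goes through the permission gate. Condensed, the guard amounts to the following (the import path is taken from the test mock above; the helper name is illustrative):

```ts
import { getUserEntityPermissions } from '@/lib/workspaces/permissions/utils'

// The permission lookup now runs unconditionally before a knowledge base is created.
async function assertCanCreateInWorkspace(userId: string, workspaceId: string): Promise<void> {
  const permission = await getUserEntityPermissions(userId, 'workspace', workspaceId)
  if (permission === null) {
    throw new Error('User does not have permission to create knowledge bases in this workspace')
  }
}
```
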
@@ -27,6 +27,9 @@ export function registerEmitFunctions(
  emitSubblockUpdate = subblockEmit
  emitVariableUpdate = variableEmit
  currentRegisteredWorkflowId = workflowId
  if (workflowId) {
    useOperationQueueStore.getState().processNextOperation()
  }
}

let currentRegisteredWorkflowId: string | null = null

@@ -262,16 +265,14 @@ export const useOperationQueueStore = create<OperationQueueState>((set, get) =>
      return
    }

    const nextOperation = currentRegisteredWorkflowId
      ? state.operations.find(
          (op) => op.status === 'pending' && op.workflowId === currentRegisteredWorkflowId
        )
      : state.operations.find((op) => op.status === 'pending')
    if (!nextOperation) {
    if (!currentRegisteredWorkflowId) {
      return
    }

    if (currentRegisteredWorkflowId && nextOperation.workflowId !== currentRegisteredWorkflowId) {
    const nextOperation = state.operations.find(
      (op) => op.status === 'pending' && op.workflowId === currentRegisteredWorkflowId
    )
    if (!nextOperation) {
      return
    }
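
The net effect is that the queue only drains operations belonging to the workflow whose emit functions are currently registered, and registering a workflow immediately kicks processing. A condensed sketch of the new selection rule (store shape simplified; only the 'pending' status is taken from the diff):

```ts
interface QueuedOperation {
  id: string
  status: string // 'pending' is the value checked here; other statuses omitted
  workflowId: string
}

// Simplified: nothing is processed until a workflow registers its emitters,
// and only that workflow's pending operations are eligible.
function selectNextOperation(
  operations: QueuedOperation[],
  currentRegisteredWorkflowId: string | null
): QueuedOperation | undefined {
  if (!currentRegisteredWorkflowId) return undefined
  return operations.find(
    (op) => op.status === 'pending' && op.workflowId === currentRegisteredWorkflowId
  )
}
```
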
@@ -38,11 +38,12 @@ export const storageUploadTool: ToolConfig<
|
||||
visibility: 'user-or-llm',
|
||||
description: 'Optional folder path (e.g., "folder/subfolder/")',
|
||||
},
|
||||
fileContent: {
|
||||
type: 'string',
|
||||
fileData: {
|
||||
type: 'json',
|
||||
required: true,
|
||||
visibility: 'user-or-llm',
|
||||
description: 'The file content (base64 encoded for binary files, or plain text)',
|
||||
description:
|
||||
'File to upload - UserFile object (basic mode) or string content (advanced mode: base64 or plain text). Supports data URLs.',
|
||||
},
|
||||
contentType: {
|
||||
type: 'string',
|
||||
@@ -65,65 +66,28 @@ export const storageUploadTool: ToolConfig<
|
||||
},
|
||||
|
||||
request: {
|
||||
url: (params) => {
|
||||
// Combine folder path and fileName, ensuring proper formatting
|
||||
let fullPath = params.fileName
|
||||
if (params.path) {
|
||||
// Ensure path ends with / and doesn't have double slashes
|
||||
const folderPath = params.path.endsWith('/') ? params.path : `${params.path}/`
|
||||
fullPath = `${folderPath}${params.fileName}`
|
||||
}
|
||||
return `https://${params.projectId}.supabase.co/storage/v1/object/${params.bucket}/${fullPath}`
|
||||
},
|
||||
url: '/api/tools/supabase/storage-upload',
|
||||
method: 'POST',
|
||||
headers: (params) => {
|
||||
const headers: Record<string, string> = {
|
||||
apikey: params.apiKey,
|
||||
Authorization: `Bearer ${params.apiKey}`,
|
||||
}
|
||||
|
||||
if (params.contentType) {
|
||||
headers['Content-Type'] = params.contentType
|
||||
}
|
||||
|
||||
if (params.upsert) {
|
||||
headers['x-upsert'] = 'true'
|
||||
}
|
||||
|
||||
return headers
|
||||
},
|
||||
body: (params) => {
|
||||
// Return the file content wrapped in an object
|
||||
// The actual upload will need to handle this appropriately
|
||||
return {
|
||||
content: params.fileContent,
|
||||
}
|
||||
},
|
||||
},
|
||||
|
||||
transformResponse: async (response: Response) => {
|
||||
let data
|
||||
try {
|
||||
data = await response.json()
|
||||
} catch (parseError) {
|
||||
throw new Error(`Failed to parse Supabase storage upload response: ${parseError}`)
|
||||
}
|
||||
|
||||
return {
|
||||
success: true,
|
||||
output: {
|
||||
message: 'Successfully uploaded file to storage',
|
||||
results: data,
|
||||
},
|
||||
error: undefined,
|
||||
}
|
||||
headers: () => ({
|
||||
'Content-Type': 'application/json',
|
||||
}),
|
||||
body: (params) => ({
|
||||
projectId: params.projectId,
|
||||
apiKey: params.apiKey,
|
||||
bucket: params.bucket,
|
||||
fileName: params.fileName,
|
||||
path: params.path,
|
||||
fileData: params.fileData,
|
||||
contentType: params.contentType,
|
||||
upsert: params.upsert,
|
||||
}),
|
||||
},
|
||||
|
||||
outputs: {
|
||||
message: { type: 'string', description: 'Operation status message' },
|
||||
results: {
|
||||
type: 'object',
|
||||
description: 'Upload result including file path and metadata',
|
||||
description: 'Upload result including file path, bucket, and public URL',
|
||||
},
|
||||
},
|
||||
}
|

@@ -136,7 +136,7 @@ export interface SupabaseStorageUploadParams {
bucket: string
fileName: string
path?: string
fileContent: string
fileData: any // UserFile object (basic mode) or string (advanced mode: base64/plain text)
contentType?: string
upsert?: boolean
}
@@ -1,362 +0,0 @@
# Enterprise Self-Hosting FAQ Response

This document addresses common questions from enterprise customers regarding self-hosted Sim deployments.

---

## 1. Resource Requirements and Scalability

### What drives resource consumption?

Sim's resource requirements are driven by several memory-intensive components:

| Component | Memory Driver | Description |
|-----------|--------------|-------------|
| **Isolated-VM** | High | JavaScript sandboxing for secure workflow code execution. Each concurrent workflow maintains an execution context in memory. |
| **File Processing** | Medium-High | Documents (PDF, DOCX, XLSX, etc.) are parsed in-memory before chunking for knowledge base operations. |
| **pgvector Operations** | Medium | Vector database operations for embeddings (1536 dimensions per vector for knowledge base). |
| **FFmpeg** | Variable | Media transcoding for audio/video processing happens synchronously in memory. |
| **Sharp** | Low-Medium | Image processing and manipulation. |

### Actual Production Metrics

Based on production telemetry from our cloud deployment:

**Main Application (simstudio)**

| Metric | Average | Peak | Notes |
|--------|---------|------|-------|
| CPU | ~10% | ~30% | Spikes during workflow execution |
| Memory | ~35% | ~75% | Increases with concurrent workflows |

**WebSocket Server (realtime)**

| Metric | Average | Peak | Notes |
|--------|---------|------|-------|
| CPU | ~1-2% | ~30% | Very lightweight |
| Memory | ~7% | ~13% | Scales with connected clients |

### Recommended Resource Tiers

Based on actual production data (60k+ users), we recommend the following tiers:

#### Small (Development/Testing)
- **CPU**: 2 cores
- **RAM**: 12 GB
- **Storage**: 20 GB SSD
- **Use case**: 1-5 users, development, testing, light workloads

#### Standard (Teams)
- **CPU**: 4 cores
- **RAM**: 16 GB
- **Storage**: 50 GB SSD
- **Use case**: 5-50 users, moderate workflow execution

#### Production (Enterprise)
- **CPU**: 8+ cores
- **RAM**: 32+ GB
- **Storage**: 100+ GB SSD
- **Use case**: 50+ users, high availability, heavy workflow execution
- **Note**: Consider running multiple replicas for high availability

### Memory Breakdown (Standard Deployment)

| Component | Recommended | Notes |
|-----------|-------------|-------|
| Main App | 6-8 GB | Handles workflow execution, API, UI (peaks to 12 GB under heavy load) |
| WebSocket | 1 GB | Real-time updates (typically uses 300-500 MB) |
| PostgreSQL + pgvector | 2-4 GB | Database with vector extensions |
| OS/Buffer | 2-4 GB | System overhead, file cache |
| **Total** | **~12-16 GB** | |
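
If you run the containers with plain Docker, the budget above can be expressed as explicit container limits. The sketch below is illustrative only: the image name, tag, and port are placeholders rather than the documented deployment command, so adapt it to your compose file or Helm values.

```bash
# Hypothetical example: cap the main app at the "Standard" tier budget from the
# table above. Image name, tag, and port mapping are placeholders, not documented values.
docker run -d \
  --name simstudio \
  --cpus="4" \
  --memory="8g" \
  --memory-swap="8g" \
  -p 3000:3000 \
  your-registry/simstudio:latest
```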

### Scalability Considerations

- **Horizontal scaling**: The main app and WebSocket server are stateless and can be scaled horizontally with a load balancer (see the sketch below).
- **Database**: PostgreSQL can be scaled vertically or replaced with managed services (Supabase, Neon, RDS).
- **Workflow concurrency**: Each concurrent workflow execution consumes additional memory. Plan for peak usage.
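
A minimal sketch of horizontal scaling with Docker Compose, assuming the main app service is named `simstudio` in your compose file; the service name and the load balancer in front of the replicas are assumptions, not part of the documented setup.

```bash
# Run three replicas of the (stateless) main app behind your own load balancer.
# "simstudio" is an assumed service name - match it to your docker-compose.yml.
docker compose up -d --scale simstudio=3
```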

---

## 2. Managing Releases in Enterprise Environments

### Multi-Environment Strategy

For enterprise deployments requiring dev/staging/production environments, we recommend deploying **separate Sim instances** for each environment:

```
┌─────────────┐    ┌─────────────┐    ┌─────────────┐
│     Dev     │ -> │   Staging   │ -> │ Production  │
│  Instance   │    │  Instance   │    │  Instance   │
└─────────────┘    └─────────────┘    └─────────────┘
       │                  │                  │
       v                  v                  v
    Develop            Test/QA            Deploy
```

**Advantages**:
- Complete isolation between environments
- Independent scaling per environment
- No risk of accidental production changes
- Environment-specific configurations and credentials

### Promoting Changes Between Environments

Sim provides multiple ways to move workflows, folders, and workspaces between environments:

#### UI-Based Export/Import
1. **Export** workflows, folders, or entire workspaces from the source environment via the UI
2. **Import** into the target environment
3. Configure environment-specific variables and credentials

#### Admin APIs (Automation)
For CI/CD integration, use the admin APIs to programmatically:
- Export workflows, folders, and workspaces as JSON
- Import configurations into target environments
- Automate promotion pipelines between dev → staging → production (see the sketch below)
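
As a rough illustration of what such a pipeline step could look like, the snippet below reuses the `x-admin-key` header shown later in this document, but the export/import endpoint paths are hypothetical placeholders; consult the admin API reference for the actual routes.

```bash
# HYPOTHETICAL endpoint paths - shown only to illustrate the promotion flow.
# Check the Sim admin API documentation for the real export/import routes.

# 1. Export a workspace from staging as JSON
curl -H "x-admin-key: $STAGING_ADMIN_API_KEY" \
  "https://staging.example.com/api/v1/admin/workspaces/<workspace-id>/export" \
  -o workspace.json

# 2. Import the JSON into production
curl -X POST "https://prod.example.com/api/v1/admin/workspaces/import" \
  -H "x-admin-key: $PROD_ADMIN_API_KEY" \
  -H "Content-Type: application/json" \
  --data-binary @workspace.json
```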

### Version Control Within an Instance

Within a single Sim instance, the **Deploy Modal** provides version control:

1. **Draft Mode**: Edit and test workflows without affecting the live version
2. **Explicit Deploy**: The live version is **not updated** until you explicitly click Deploy
3. **Snapshots**: Each deployment creates a snapshot of the workflow state
4. **Rollback**: Revert to any previous version at any time with one click

This allows teams to:
- Safely iterate on workflows without disrupting production
- Test changes before making them live
- Quickly recover from issues by rolling back

---

## 3. Stable Releases and Backward Compatibility

### Versioning Strategy

Sim uses the following versioning scheme:
- **Major versions** (0.x): e.g., 0.5, 0.6 - New major features
- **Minor versions** (0.x.y): e.g., 0.5.1, 0.5.2 - Incremental updates, bug fixes

### Backward Compatibility Guarantees

**Forward upgrades are safe:**
- Changes are **additive** - new features don't break existing workflows
- We ensure no breaking changes between versions
- Breaking changes are announced in advance when necessary
- Database migrations are automatic and handle schema changes

**Rollbacks are not guaranteed:**
- Rolling back to an older version may break things due to database schema changes
- Always back up your database before upgrading
- If you need to roll back, restore from a database backup taken before the upgrade

### Upgrade Best Practices

1. **Backup first**: Always back up your database before upgrading (see the example below)
2. **Review release notes**: Check for any announced changes
3. **Test in staging**: Upgrade your staging environment first
4. **Monitor after upgrade**: Verify workflows continue to function correctly
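
For example, with a containerized PostgreSQL a pre-upgrade backup might look like the following; the service name, database name, and user are assumptions taken from a typical docker-compose setup, so substitute your own values.

```bash
# Assumed service/database/user names - adjust to your docker-compose.yml.
# Take a backup before pulling the new image.
docker compose exec -T db pg_dump -U postgres sim > sim-backup-$(date +%F).sql

# If an upgrade has to be rolled back, restore that backup into the old version.
docker compose exec -T db psql -U postgres sim < sim-backup-2025-01-01.sql
```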

### Enterprise Support

For enterprise customers requiring additional stability guarantees:
- Contact us for support arrangements
- We can provide guidance on upgrade planning
- Security patches are prioritized for supported versions

---

## 4. OAuth and OIDC Providers

### Built-in OAuth Providers (Environment Variables)

Only the following providers can be configured via environment variables:

| Provider | Environment Variables |
|----------|----------------------|
| **GitHub** | `GITHUB_CLIENT_ID`, `GITHUB_CLIENT_SECRET` |
| **Google** | `GOOGLE_CLIENT_ID`, `GOOGLE_CLIENT_SECRET` |

There are no plans to add additional OAuth providers via environment variables.

### All Other Identity Providers (SSO)

For any other identity providers, configure SSO through the app settings:

1. Enable SSO in environment variables:
   ```
   SSO_ENABLED=true
   NEXT_PUBLIC_SSO_ENABLED=true
   ```

2. Configure your identity provider in the app's SSO settings UI

Supported protocols:
- SAML 2.0
- OpenID Connect (OIDC)

Compatible with any OIDC/SAML provider including:
- Okta
- Azure AD / Entra ID
- Auth0
- Ping Identity
- OneLogin
- Custom OIDC providers

---

## 5. Known Issues and Workarounds

### SSO Save Button Disabled

**Issue**: The 'Save' button remains disabled when configuring SSO.

**Cause**: The form has strict validation on all required fields. The button remains disabled until ALL validations pass.

**Required fields for OIDC**:
- Provider ID (letters, numbers, dashes only)
- Issuer URL (must be HTTPS, except for localhost)
- Domain (no `https://` prefix, must be valid domain format)
- Client ID
- Client Secret
- Scopes (defaults to `openid,profile,email`)

**Required fields for SAML**:
- Provider ID
- Issuer URL
- Domain
- Entry Point URL
- Certificate

**Common validation issues**:
1. **Domain field**: Do NOT include `https://` - enter only the domain (e.g., `login.okta.com` not `https://login.okta.com`)
2. **Issuer URL**: Must use HTTPS protocol (except localhost for testing)
3. **Provider ID**: Only lowercase letters, numbers, and dashes allowed (e.g., `okta-prod`); see the example values below
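
For reference, a combination of values that passes these checks for an OIDC provider might look like this; the values are illustrative only, so use your own tenant's issuer, domain, and credentials.

```
Provider ID:   okta-prod
Issuer URL:    https://login.okta.com/oauth2/default
Domain:        login.okta.com
Client ID:     <client ID from your identity provider>
Client Secret: <client secret from your identity provider>
Scopes:        openid,profile,email
```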

**Debugging**:
- Open browser DevTools console to check for JavaScript errors
- Ensure `SSO_ENABLED=true` and `NEXT_PUBLIC_SSO_ENABLED=true` environment variables are set
- Try using one of the suggested provider IDs from the dropdown (e.g., `okta`, `azure-ad`)

### Access Control Group Creation

**Issue**: Button appears enabled but nothing happens when clicked.

**Cause**: For self-hosted deployments, an organization must be created via the admin API before access control groups can be used.

**Required Setup**:

1. **Enable required environment variables**:
   ```env
   ADMIN_API_KEY=your-admin-api-key
   ACCESS_CONTROL_ENABLED=true
   ORGANIZATIONS_ENABLED=true
   NEXT_PUBLIC_ACCESS_CONTROL_ENABLED=true
   NEXT_PUBLIC_ORGANIZATIONS_ENABLED=true
   ```

2. **Create an organization via admin API**:
   ```bash
   # List users to get admin user ID
   curl -H "x-admin-key: $ADMIN_API_KEY" \
     "https://your-sim-instance.com/api/v1/admin/users?limit=10"

   # Create organization
   curl -X POST https://your-sim-instance.com/api/v1/admin/organizations \
     -H "x-admin-key: $ADMIN_API_KEY" \
     -H "Content-Type: application/json" \
     -d '{"name": "Your Organization", "slug": "your-org", "ownerId": "<user-id-from-step-1>"}'

   # Add members to organization
   curl -X POST https://your-sim-instance.com/api/v1/admin/organizations/<org-id>/members \
     -H "x-admin-key: $ADMIN_API_KEY" \
     -H "Content-Type: application/json" \
     -d '{"userId": "<user-id>", "role": "member"}'
   ```

3. **Create permission groups**: After the organization is set up, go to Settings > Permission Groups in the UI.

---

## 6. File Storage Configuration

### Supported Storage Backends

Sim supports multiple storage backends for file storage:

#### Local Storage (Default)
Files are stored on the local filesystem. Suitable for development and single-node deployments.

#### AWS S3
```env
AWS_REGION=us-east-1
AWS_ACCESS_KEY_ID=your-access-key
AWS_SECRET_ACCESS_KEY=your-secret-key
S3_BUCKET_NAME=sim-files
S3_KB_BUCKET_NAME=sim-knowledge-base
S3_EXECUTION_FILES_BUCKET_NAME=sim-execution-files
S3_CHAT_BUCKET_NAME=sim-chat-files
```

#### Azure Blob Storage

You can configure Azure Blob Storage using either a connection string or account name/key:

**Option 1: Connection String**
```env
AZURE_CONNECTION_STRING=DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...;EndpointSuffix=core.windows.net
AZURE_STORAGE_CONTAINER_NAME=sim-files
AZURE_STORAGE_KB_CONTAINER_NAME=sim-knowledge-base
AZURE_STORAGE_EXECUTION_FILES_CONTAINER_NAME=sim-execution-files
AZURE_STORAGE_CHAT_CONTAINER_NAME=sim-chat-files
```

**Option 2: Account Name and Key**
```env
AZURE_ACCOUNT_NAME=your-storage-account
AZURE_ACCOUNT_KEY=your-storage-key
AZURE_STORAGE_CONTAINER_NAME=sim-files
AZURE_STORAGE_KB_CONTAINER_NAME=sim-knowledge-base
AZURE_STORAGE_EXECUTION_FILES_CONTAINER_NAME=sim-execution-files
AZURE_STORAGE_CHAT_CONTAINER_NAME=sim-chat-files
```

Both options are fully supported. The connection string is automatically parsed to extract credentials when needed for operations like presigned URL generation.

---

## 7. Knowledge Base Configuration

### Required Environment Variables

```env
# OpenAI API key for embeddings
OPENAI_API_KEY=your-openai-api-key

# Embedding model configuration (optional)
KB_OPENAI_MODEL_NAME=text-embedding-3-small
```

### Embedding Model Compatibility

**Supported models**:
- `text-embedding-3-small` (default, 1536 dimensions)
- `text-embedding-3-large` (1536 dimensions, automatically reduced from 3072)
- `text-embedding-ada-002` (1536 dimensions)

All text-embedding-3-* models automatically use 1536 dimensions to match the database schema. This allows you to use `text-embedding-3-large` for higher quality embeddings without schema modifications.
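
To see the dimension reduction in isolation, you can call the OpenAI embeddings endpoint directly with the `dimensions` parameter, which is the underlying capability that makes the larger model fit a 1536-dimension schema; this is an example request only, and how Sim applies the reduction internally is an implementation detail.

```bash
# Request a 1536-dimension embedding from text-embedding-3-large (normally 3072).
curl https://api.openai.com/v1/embeddings \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "text-embedding-3-large",
    "input": "example knowledge base chunk",
    "dimensions": 1536
  }'
```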

### Database Requirements

The knowledge base requires PostgreSQL with the pgvector extension:
- PostgreSQL 12+ with pgvector
- The `vector` extension must be enabled (see the command below)
- Tables are created automatically during migration
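
If your PostgreSQL image does not enable it already, the extension can be created manually; a minimal sketch, assuming `DATABASE_URL` is the connection string your deployment uses for the Sim database.

```bash
# Enable pgvector (the extension is named "vector") on the Sim database.
# DATABASE_URL is an assumed variable - substitute your actual connection string.
psql "$DATABASE_URL" -c 'CREATE EXTENSION IF NOT EXISTS vector;'
```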

---

## Questions?

For additional support:
- Documentation: https://docs.sim.ai
- GitHub Issues: https://github.com/simstudioai/sim/issues
- Enterprise Support: Contact your account representative