Compare commits


1 Commit

Author SHA1 Message Date
Waleed Latif
c2f786e40b v0.2.9: fix + feat (#643)
* fix(sharing): fixed folders not appearing when sharing workflows (#616)

* fix(sharing): fixed folders not appearing when sharing workflows

* cleanup

* fixed error case

* fix(deletions): folder deletions were hanging + use cascade deletions throughout (#620)

* use cascade deletion

* fix lint

---------

Co-authored-by: Vikhyath Mondreti <vikhyathmondreti@vikhyaths-air.lan>

* fix(envvars): t3-env standardization (#606)

* chore: use t3-env as source of truth

* chore: update mock env for failing tests

* feat(enhanced logs): integration + log visualizer canvas (#618)

* feat(logs): enhanced logging system with cleanup and theme fixes

- Implement enhanced logging cleanup with S3 archival and retention policies
- Fix error propagation in trace spans for manual executions
- Add theme-aware styling for frozen canvas modal
- Integrate enhanced logging system across all execution pathways
- Add comprehensive trace span processing and iteration navigation
- Fix boolean parameter types in enhanced logs API

* add warning for old logs

* fix lint

* added cost for streaming outputs

* fix overflow issue

* fix lint

* fix selection on closing sidebar

* tooltips z index increase

---------

Co-authored-by: Vikhyath Mondreti <vikhyathmondreti@vikhyaths-air.lan>
Co-authored-by: Waleed Latif <walif6@gmail.com>

* fix(frozen canvas): don't error if workflow state not available for migrated logs (#624)

* fix(frozen canvas): don't error if workflow state not available for old logs

* fix lint

---------

Co-authored-by: Vikhyath Mondreti <vikhyathmondreti@vikhyaths-air.lan>

* fix(reddit): update to oauth endpoints (#627)

* fix(reddit): change tool to use oauth token

* fix lint

* add contact info

* Update apps/sim/tools/reddit/get_comments.ts

Co-authored-by: greptile-apps[bot] <165735046+greptile-apps[bot]@users.noreply.github.com>

* Update apps/sim/tools/reddit/hot_posts.ts

Co-authored-by: greptile-apps[bot] <165735046+greptile-apps[bot]@users.noreply.github.com>

* Update apps/sim/tools/reddit/get_posts.ts

Co-authored-by: greptile-apps[bot] <165735046+greptile-apps[bot]@users.noreply.github.com>

* fix type error

---------

Co-authored-by: Vikhyath Mondreti <vikhyathmondreti@Vikhyaths-MacBook-Air.local>
Co-authored-by: Vikhyath Mondreti <vikhyathmondreti@Vikhyaths-Air.attlocal.net>
Co-authored-by: greptile-apps[bot] <165735046+greptile-apps[bot]@users.noreply.github.com>

* feat(tools): added reordering of tool calls in agent tool input  (#629)

* added tool re-ordering in agent block

* styling

* fix(oauth): fix oauth to use correct subblock value setter + remove unused local storage code (#628)

* fix(oauth): fixed oauth state not persisting in credential selector

* remove unused local storage code for oauth

* fix lint

* selector clearance issue fix

* fix typing issue

* fix lint

* remove cred id from logs

* fix lint

* works

---------

Co-authored-by: Vikhyath Mondreti <vikhyathmondreti@Vikhyaths-Air.attlocal.net>

* fix(mem-deletion): hard deletion of memory (#622)

* fix: memory deletion

* fix: bun run lint

---------

Co-authored-by: Adam Gough <adamgough@Adams-MacBook-Pro.local>

* feat(build): added turbopack builds to prod (#630)

* added turbopack to prod builds

* block access to sourcemaps

* revert changes to docs

* fix(docs): fixed broken docs links (#632)

* fix(resp format): non-json input was crashing (#631)

* fix response format non-json input crash bug

* fix lint

---------

Co-authored-by: Vikhyath Mondreti <vikhyathmondreti@Vikhyaths-Air.attlocal.net>

* fix(revert-deployed): correctly revert to deployed state as unit op using separate endpoint (#633)

* fix(revert-deployed): revert deployed functionality with separate endpoint

* fix lint

---------

Co-authored-by: Vikhyath Mondreti <vikhyathmondreti@Vikhyaths-Air.attlocal.net>

* fix(dropdown): simplify & fix tag dropdown for parallel & loop blocks (#634)

* fix(dropdown): simplify & fix tag dropdown for parallel & loop blocks

* fixed build

* fix(response-format): add response format to tag dropdown, chat panel, and chat client (#637)

* add response format structure to tag dropdown

* handle response format outputs for chat client and chat panel, implemented the response format handling for streamed responses

* cleanup

* fix(sockets-server-disconnection): on reconnect force sync store to db  (#638)

* keep warning until refresh

* works

* fix sockets server sync on reconnection

* infinite reconn attempts

* fix lint

---------

Co-authored-by: Vikhyath Mondreti <vikhyathmondreti@Vikhyaths-Air.attlocal.net>

* fix(build): fixed build

* Revert "fix(sockets-server-disconnection): on reconnect force sync store to d…" (#640)

This reverts commit 6dc8b17bed.

* fix(sockets): force user to refresh on disconnect in order to make changes, add read-only offline mode (#641)

* force user to refresh on disconnect in order to make changes, add read-only offline mode

* remove unused hook

* style

* update tooltip msg

* remove unnecessary useMemo around log

* fix(sockets): added debouncing for sub-block values to prevent overloading socket server, fixed persistence issue during streaming back from LLM response format, removed unused events (#642)

* fix(sockets): added debouncing for sub-block values to prevent overloading socket server, fixed persistence issue during streaming back from LLM response format, removed unused events

* reuse existing isStreaming state for code block llm-generated response format

---------

Co-authored-by: Vikhyath Mondreti <vikhyathvikku@gmail.com>
Co-authored-by: Vikhyath Mondreti <vikhyathmondreti@vikhyaths-air.lan>
Co-authored-by: Aditya Tripathi <aditya@climactic.co>
Co-authored-by: Vikhyath Mondreti <vikhyathmondreti@Vikhyaths-MacBook-Air.local>
Co-authored-by: Vikhyath Mondreti <vikhyathmondreti@Vikhyaths-Air.attlocal.net>
Co-authored-by: greptile-apps[bot] <165735046+greptile-apps[bot]@users.noreply.github.com>
Co-authored-by: Adam Gough <77861281+aadamgough@users.noreply.github.com>
Co-authored-by: Adam Gough <adamgough@Adams-MacBook-Pro.local>
2025-07-08 21:58:06 -07:00
77 changed files with 3138 additions and 1035 deletions

View File

@@ -81,4 +81,4 @@ Sim Studio provides a wide range of features designed to accelerate your develop
##
Ready to get started? Check out our [Getting Started](/getting-started) guide or explore our [Blocks](/docs/blocks) and [Tools](/docs/tools) in more detail.
Ready to get started? Check out our [Getting Started](/getting-started) guide or explore our [Blocks](/blocks) and [Tools](/tools) in more detail.

View File

@@ -19,7 +19,7 @@
"fumadocs-mdx": "^11.5.6",
"fumadocs-ui": "^15.0.16",
"lucide-react": "^0.511.0",
"next": "^15.2.3",
"next": "^15.3.2",
"next-themes": "^0.4.6",
"react": "19.1.0",
"react-dom": "19.1.0",

View File

@@ -14,6 +14,8 @@ const logger = createLogger('OAuthTokenAPI')
export async function POST(request: NextRequest) {
const requestId = crypto.randomUUID().slice(0, 8)
logger.info(`[${requestId}] OAuth token API POST request received`)
try {
// Parse request body
const body = await request.json()
@@ -38,6 +40,7 @@ export async function POST(request: NextRequest) {
const credential = await getCredential(requestId, credentialId, userId)
if (!credential) {
logger.error(`[${requestId}] Credential not found: ${credentialId}`)
return NextResponse.json({ error: 'Credential not found' }, { status: 404 })
}
@@ -45,7 +48,8 @@ export async function POST(request: NextRequest) {
// Refresh the token if needed
const { accessToken } = await refreshTokenIfNeeded(requestId, credential, credentialId)
return NextResponse.json({ accessToken }, { status: 200 })
} catch (_error) {
} catch (error) {
logger.error(`[${requestId}] Failed to refresh access token:`, error)
return NextResponse.json({ error: 'Failed to refresh access token' }, { status: 401 })
}
} catch (error) {

View File

@@ -89,6 +89,7 @@ export async function getOAuthToken(userId: string, providerId: string): Promise
// Check if the token is expired and needs refreshing
const now = new Date()
const tokenExpiry = credential.accessTokenExpiresAt
// Only refresh if we have an expiration time AND it's expired AND we have a refresh token
const needsRefresh = tokenExpiry && tokenExpiry < now && !!credential.refreshToken
if (needsRefresh) {
@@ -166,7 +167,9 @@ export async function refreshAccessTokenIfNeeded(
// Check if we need to refresh the token
const expiresAt = credential.accessTokenExpiresAt
const now = new Date()
const needsRefresh = !expiresAt || expiresAt <= now
// Only refresh if we have an expiration time AND it's expired
// If no expiration time is set (newly created credentials), assume token is valid
const needsRefresh = expiresAt && expiresAt <= now
const accessToken = credential.accessToken
@@ -233,7 +236,9 @@ export async function refreshTokenIfNeeded(
// Check if we need to refresh the token
const expiresAt = credential.accessTokenExpiresAt
const now = new Date()
const needsRefresh = !expiresAt || expiresAt <= now
// Only refresh if we have an expiration time AND it's expired
// If no expiration time is set (newly created credentials), assume token is valid
const needsRefresh = expiresAt && expiresAt <= now
// If token is still valid, return it directly
if (!needsRefresh || !credential.refreshToken) {
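The refresh policy changed in the hunks above can be sketched as a small standalone predicate. Note this is a hypothetical helper written for illustration, not the actual function from the codebase: it combines the "expiry must exist AND have passed" rule with the "must have a refresh token" guard.

```typescript
// Hypothetical sketch of the token-refresh decision after this change:
// a token is refreshed only when an expiration time is recorded, that time
// has passed, and a refresh token is available to refresh with.
interface CredentialLike {
  accessTokenExpiresAt: Date | null
  refreshToken: string | null
}

function shouldRefresh(credential: CredentialLike, now: Date = new Date()): boolean {
  const expiresAt = credential.accessTokenExpiresAt
  // Newly created credentials may have no expiry yet: assume the token is valid.
  if (!expiresAt) return false
  // Without a refresh token there is nothing to refresh with; return the token as-is.
  if (!credential.refreshToken) return false
  return expiresAt <= now
}
```

The key difference from the previous behavior is the first guard: `!expiresAt || expiresAt <= now` treated a missing expiry as "expired", which forced refresh attempts on freshly created credentials.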

View File

@@ -194,6 +194,7 @@ export async function GET(
description: deployment.description,
customizations: deployment.customizations,
authType: deployment.authType,
outputConfigs: deployment.outputConfigs,
}),
request
)
@@ -219,6 +220,7 @@ export async function GET(
description: deployment.description,
customizations: deployment.customizations,
authType: deployment.authType,
outputConfigs: deployment.outputConfigs,
}),
request
)

View File

@@ -263,17 +263,26 @@ export async function executeWorkflowForChat(
let outputBlockIds: string[] = []
// Extract output configs from the new schema format
let selectedOutputIds: string[] = []
if (deployment.outputConfigs && Array.isArray(deployment.outputConfigs)) {
// Extract block IDs and paths from the new outputConfigs array format
// Extract output IDs in the format expected by the streaming processor
logger.debug(
`[${requestId}] Found ${deployment.outputConfigs.length} output configs in deployment`
)
deployment.outputConfigs.forEach((config) => {
selectedOutputIds = deployment.outputConfigs.map((config) => {
const outputId = config.path
? `${config.blockId}_${config.path}`
: `${config.blockId}.content`
logger.debug(
`[${requestId}] Processing output config: blockId=${config.blockId}, path=${config.path || 'none'}`
`[${requestId}] Processing output config: blockId=${config.blockId}, path=${config.path || 'content'} -> outputId=${outputId}`
)
return outputId
})
// Also extract block IDs for legacy compatibility
outputBlockIds = deployment.outputConfigs.map((config) => config.blockId)
} else {
// Use customizations as fallback
@@ -291,7 +300,9 @@ export async function executeWorkflowForChat(
outputBlockIds = customizations.outputBlockIds
}
logger.debug(`[${requestId}] Using ${outputBlockIds.length} output blocks for extraction`)
logger.debug(
`[${requestId}] Using ${outputBlockIds.length} output blocks and ${selectedOutputIds.length} selected output IDs for extraction`
)
// Find the workflow (deployedState is NOT deprecated - needed for chat execution)
const workflowResult = await db
@@ -457,7 +468,7 @@ export async function executeWorkflowForChat(
workflowVariables,
contextExtensions: {
stream: true,
selectedOutputIds: outputBlockIds,
selectedOutputIds: selectedOutputIds.length > 0 ? selectedOutputIds : outputBlockIds,
edges: edges.map((e: any) => ({
source: e.source,
target: e.target,

View File

@@ -1,4 +1,4 @@
import { and, eq, isNull } from 'drizzle-orm'
import { and, eq } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { createLogger } from '@/lib/logs/console-logger'
import { db } from '@/db'
@@ -40,7 +40,7 @@ export async function GET(request: NextRequest, { params }: { params: Promise<{
const memories = await db
.select()
.from(memory)
.where(and(eq(memory.key, id), eq(memory.workflowId, workflowId), isNull(memory.deletedAt)))
.where(and(eq(memory.key, id), eq(memory.workflowId, workflowId)))
.orderBy(memory.createdAt)
.limit(1)
@@ -112,7 +112,7 @@ export async function DELETE(
const existingMemory = await db
.select({ id: memory.id })
.from(memory)
.where(and(eq(memory.key, id), eq(memory.workflowId, workflowId), isNull(memory.deletedAt)))
.where(and(eq(memory.key, id), eq(memory.workflowId, workflowId)))
.limit(1)
if (existingMemory.length === 0) {
@@ -128,14 +128,8 @@ export async function DELETE(
)
}
// Soft delete by setting deletedAt timestamp
await db
.update(memory)
.set({
deletedAt: new Date(),
updatedAt: new Date(),
})
.where(and(eq(memory.key, id), eq(memory.workflowId, workflowId)))
// Hard delete the memory
await db.delete(memory).where(and(eq(memory.key, id), eq(memory.workflowId, workflowId)))
logger.info(`[${requestId}] Memory deleted successfully: ${id} for workflow: ${workflowId}`)
return NextResponse.json(
@@ -202,7 +196,7 @@ export async function PUT(request: NextRequest, { params }: { params: Promise<{
const existingMemories = await db
.select()
.from(memory)
.where(and(eq(memory.key, id), eq(memory.workflowId, workflowId), isNull(memory.deletedAt)))
.where(and(eq(memory.key, id), eq(memory.workflowId, workflowId)))
.limit(1)
if (existingMemories.length === 0) {
@@ -250,13 +244,7 @@ export async function PUT(request: NextRequest, { params }: { params: Promise<{
}
// Update the memory with new data
await db
.update(memory)
.set({
data,
updatedAt: new Date(),
})
.where(and(eq(memory.key, id), eq(memory.workflowId, workflowId)))
await db.delete(memory).where(and(eq(memory.key, id), eq(memory.workflowId, workflowId)))
// Fetch the updated memory
const updatedMemories = await db

View File

@@ -197,18 +197,42 @@ async function executeWorkflow(workflow: any, requestId: string, input?: any) {
(acc, [blockId, blockState]) => {
// Check if this block has a responseFormat that needs to be parsed
if (blockState.responseFormat && typeof blockState.responseFormat === 'string') {
try {
logger.debug(`[${requestId}] Parsing responseFormat for block ${blockId}`)
// Attempt to parse the responseFormat if it's a string
const parsedResponseFormat = JSON.parse(blockState.responseFormat)
const responseFormatValue = blockState.responseFormat.trim()
// Check for variable references like <start.input>
if (responseFormatValue.startsWith('<') && responseFormatValue.includes('>')) {
logger.debug(
`[${requestId}] Response format contains variable reference for block ${blockId}`
)
// Keep variable references as-is - they will be resolved during execution
acc[blockId] = blockState
} else if (responseFormatValue === '') {
// Empty string - remove response format
acc[blockId] = {
...blockState,
responseFormat: parsedResponseFormat,
responseFormat: undefined,
}
} else {
try {
logger.debug(`[${requestId}] Parsing responseFormat for block ${blockId}`)
// Attempt to parse the responseFormat if it's a string
const parsedResponseFormat = JSON.parse(responseFormatValue)
acc[blockId] = {
...blockState,
responseFormat: parsedResponseFormat,
}
} catch (error) {
logger.warn(
`[${requestId}] Failed to parse responseFormat for block ${blockId}, using undefined`,
error
)
// Set to undefined instead of keeping malformed JSON - this allows execution to continue
acc[blockId] = {
...blockState,
responseFormat: undefined,
}
}
} catch (error) {
logger.warn(`[${requestId}] Failed to parse responseFormat for block ${blockId}`, error)
acc[blockId] = blockState
}
} else {
acc[blockId] = blockState

View File

@@ -0,0 +1,121 @@
import crypto from 'crypto'
import { eq } from 'drizzle-orm'
import type { NextRequest } from 'next/server'
import { createLogger } from '@/lib/logs/console-logger'
import { saveWorkflowToNormalizedTables } from '@/lib/workflows/db-helpers'
import { db } from '@/db'
import { workflow } from '@/db/schema'
import type { WorkflowState } from '@/stores/workflows/workflow/types'
import { validateWorkflowAccess } from '../../middleware'
import { createErrorResponse, createSuccessResponse } from '../../utils'
const logger = createLogger('RevertToDeployedAPI')
export const dynamic = 'force-dynamic'
export const runtime = 'nodejs'
/**
* POST /api/workflows/[id]/revert-to-deployed
* Revert workflow to its deployed state by saving deployed state to normalized tables
*/
export async function POST(request: NextRequest, { params }: { params: Promise<{ id: string }> }) {
const requestId = crypto.randomUUID().slice(0, 8)
const { id } = await params
try {
logger.debug(`[${requestId}] Reverting workflow to deployed state: ${id}`)
const validation = await validateWorkflowAccess(request, id, false)
if (validation.error) {
logger.warn(`[${requestId}] Workflow revert failed: ${validation.error.message}`)
return createErrorResponse(validation.error.message, validation.error.status)
}
const workflowData = validation.workflow
// Check if workflow is deployed and has deployed state
if (!workflowData.isDeployed || !workflowData.deployedState) {
logger.warn(`[${requestId}] Cannot revert: workflow is not deployed or has no deployed state`)
return createErrorResponse('Workflow is not deployed or has no deployed state', 400)
}
// Validate deployed state structure
const deployedState = workflowData.deployedState as WorkflowState
if (!deployedState.blocks || !deployedState.edges) {
logger.error(`[${requestId}] Invalid deployed state structure`, { deployedState })
return createErrorResponse('Invalid deployed state structure', 500)
}
logger.debug(`[${requestId}] Saving deployed state to normalized tables`, {
blocksCount: Object.keys(deployedState.blocks).length,
edgesCount: deployedState.edges.length,
loopsCount: Object.keys(deployedState.loops || {}).length,
parallelsCount: Object.keys(deployedState.parallels || {}).length,
})
// Save deployed state to normalized tables
const saveResult = await saveWorkflowToNormalizedTables(id, {
blocks: deployedState.blocks,
edges: deployedState.edges,
loops: deployedState.loops || {},
parallels: deployedState.parallels || {},
lastSaved: Date.now(),
isDeployed: workflowData.isDeployed,
deployedAt: workflowData.deployedAt,
deploymentStatuses: deployedState.deploymentStatuses || {},
hasActiveSchedule: deployedState.hasActiveSchedule || false,
hasActiveWebhook: deployedState.hasActiveWebhook || false,
})
if (!saveResult.success) {
logger.error(`[${requestId}] Failed to save deployed state to normalized tables`, {
error: saveResult.error,
})
return createErrorResponse(
saveResult.error || 'Failed to save deployed state to normalized tables',
500
)
}
// Update workflow's last_synced timestamp to indicate changes
await db
.update(workflow)
.set({
lastSynced: new Date(),
updatedAt: new Date(),
})
.where(eq(workflow.id, id))
// Notify socket server about the revert operation for real-time sync
try {
const socketServerUrl = process.env.SOCKET_SERVER_URL || 'http://localhost:3002'
await fetch(`${socketServerUrl}/api/workflow-reverted`, {
method: 'POST',
headers: {
'Content-Type': 'application/json',
},
body: JSON.stringify({
workflowId: id,
timestamp: Date.now(),
}),
})
logger.debug(`[${requestId}] Notified socket server about workflow revert: ${id}`)
} catch (socketError) {
// Don't fail the request if socket notification fails
logger.warn(`[${requestId}] Failed to notify socket server about revert:`, socketError)
}
logger.info(`[${requestId}] Successfully reverted workflow to deployed state: ${id}`)
return createSuccessResponse({
message: 'Workflow successfully reverted to deployed state',
lastSaved: Date.now(),
})
} catch (error: any) {
logger.error(`[${requestId}] Error reverting workflow to deployed state: ${id}`, {
error: error.message,
stack: error.stack,
})
return createErrorResponse(error.message || 'Failed to revert workflow to deployed state', 500)
}
}

View File

@@ -33,6 +33,7 @@ interface ChatConfig {
headerText?: string
}
authType?: 'public' | 'password' | 'email'
outputConfigs?: Array<{ blockId: string; path?: string }>
}
interface AudioStreamingOptions {
@@ -373,8 +374,16 @@ export default function ChatClient({ subdomain }: { subdomain: string }) {
const json = JSON.parse(line.substring(6))
const { blockId, chunk: contentChunk, event: eventType } = json
if (eventType === 'final') {
if (eventType === 'final' && json.data) {
setIsLoading(false)
// Process final execution result for field extraction
const result = json.data
const nonStreamingLogs =
result.logs?.filter((log: any) => !messageIdMap.has(log.blockId)) || []
// Chat field extraction will be handled by the backend using deployment outputConfigs
return
}

View File

@@ -305,15 +305,18 @@ export default function Logs() {
<div className='flex flex-1 flex-col overflow-hidden'>
{/* Table container */}
<div className='flex flex-1 flex-col overflow-hidden'>
{/* Simple header */}
<div className='border-border/50 border-b px-4 py-3'>
<div className='flex items-center gap-4 font-medium text-muted-foreground text-xs'>
<div className='w-32'>Time</div>
<div className='w-20'>Status</div>
<div className='flex-1'>Workflow</div>
<div className='hidden w-24 lg:block'>Trigger</div>
<div className='hidden w-20 xl:block'>Cost</div>
<div className='w-20'>Duration</div>
{/* Table with fixed layout */}
<div className='w-full min-w-[800px]'>
{/* Header */}
<div className='border-border/50 border-b'>
<div className='grid grid-cols-[160px_100px_1fr_120px_100px_100px] gap-4 px-4 py-3 font-medium text-muted-foreground text-xs'>
<div>Time</div>
<div>Status</div>
<div>Workflow</div>
<div className='hidden lg:block'>Trigger</div>
<div className='hidden xl:block'>Cost</div>
<div>Duration</div>
</div>
</div>
</div>
@@ -357,9 +360,9 @@ export default function Logs() {
}`}
onClick={() => handleLogClick(log)}
>
<div className='flex items-center gap-4 p-4'>
<div className='grid grid-cols-[160px_100px_1fr_120px_100px_100px] gap-4 p-4'>
{/* Time */}
<div className='w-32 flex-shrink-0'>
<div>
<div className='font-medium text-sm'>{formattedDate.formatted}</div>
<div className='text-muted-foreground text-xs'>
{formattedDate.relative}
@@ -367,7 +370,7 @@ export default function Logs() {
</div>
{/* Status */}
<div className='w-20 flex-shrink-0'>
<div>
<div
className={`inline-flex items-center justify-center rounded-md px-2 py-1 text-xs ${
log.level === 'error'
@@ -382,7 +385,7 @@ export default function Logs() {
</div>
{/* Workflow */}
<div className='min-w-0 flex-1'>
<div className='min-w-0'>
<div className='truncate font-medium text-sm'>
{log.workflow?.name || 'Unknown Workflow'}
</div>
@@ -392,14 +395,14 @@ export default function Logs() {
</div>
{/* Trigger */}
<div className='hidden w-24 flex-shrink-0 lg:block'>
<div className='hidden lg:block'>
<div className='text-muted-foreground text-xs'>
{log.trigger || '—'}
</div>
</div>
{/* Cost */}
<div className='hidden w-20 flex-shrink-0 xl:block'>
<div className='hidden xl:block'>
<div className='text-xs'>
{log.metadata?.enhanced && log.metadata?.cost?.total ? (
<span className='text-muted-foreground'>
@@ -412,7 +415,7 @@ export default function Logs() {
</div>
{/* Duration */}
<div className='w-20 flex-shrink-0'>
<div>
<div className='text-muted-foreground text-xs'>
{log.duration || '—'}
</div>

View File

@@ -1,53 +1,57 @@
'use client'
import { useEffect, useState } from 'react'
import { AlertTriangle, RefreshCw } from 'lucide-react'
import { Button } from '@/components/ui/button'
import { Tooltip, TooltipContent, TooltipTrigger } from '@/components/ui/tooltip'
import { useUserPermissionsContext } from '@/app/workspace/[workspaceId]/w/components/providers/workspace-permissions-provider'
interface ConnectionStatusProps {
isConnected: boolean
}
export function ConnectionStatus({ isConnected }: ConnectionStatusProps) {
const [showOfflineNotice, setShowOfflineNotice] = useState(false)
const userPermissions = useUserPermissionsContext()
useEffect(() => {
let timeoutId: NodeJS.Timeout
const handleRefresh = () => {
window.location.reload()
}
if (!isConnected) {
// Show offline notice after 6 seconds of being disconnected
timeoutId = setTimeout(() => {
setShowOfflineNotice(true)
}, 6000) // 6 seconds
} else {
// Hide notice immediately when reconnected
setShowOfflineNotice(false)
}
return () => {
if (timeoutId) {
clearTimeout(timeoutId)
}
}
}, [isConnected])
// Don't render anything if connected or if we haven't been disconnected long enough
if (!showOfflineNotice) {
// Don't render anything if not in offline mode
if (!userPermissions.isOfflineMode) {
return null
}
return (
<div className='flex items-center gap-1.5'>
<div className='flex items-center gap-1.5 text-red-600'>
<div className='flex items-center gap-2 rounded-md border border-red-200 bg-red-50 px-3 py-2'>
<div className='flex items-center gap-2 text-red-700'>
<div className='relative flex items-center justify-center'>
<div className='absolute h-3 w-3 animate-ping rounded-full bg-red-500/20' />
<div className='relative h-2 w-2 rounded-full bg-red-500' />
{!isConnected && (
<div className='absolute h-4 w-4 animate-ping rounded-full bg-red-500/20' />
)}
<AlertTriangle className='relative h-4 w-4' />
</div>
<div className='flex flex-col'>
<span className='font-medium text-xs leading-tight'>Connection lost</span>
<span className='text-xs leading-tight opacity-90'>
Changes not saved - please refresh
<span className='font-medium text-xs leading-tight'>
{isConnected ? 'Reconnected' : 'Connection lost - please refresh'}
</span>
<span className='text-red-600 text-xs leading-tight'>
{isConnected ? 'Refresh to continue editing' : 'Read-only mode active'}
</span>
</div>
</div>
<Tooltip>
<TooltipTrigger asChild>
<Button
onClick={handleRefresh}
variant='ghost'
size='sm'
className='h-7 w-7 p-0 text-red-700 hover:bg-red-100 hover:text-red-800'
>
<RefreshCw className='h-4 w-4' />
</Button>
</TooltipTrigger>
<TooltipContent className='z-[9999]'>Refresh page to continue editing</TooltipContent>
</Tooltip>
</div>
)
}

View File

@@ -44,16 +44,6 @@ export function UserAvatarStack({
}
}, [users, maxVisible])
// Show connection status component regardless of user count
// This will handle the offline notice when disconnected for 15 seconds
const connectionStatusElement = <ConnectionStatus isConnected={isConnected} />
// Only show presence when there are multiple users (>1)
// But always show connection status
if (users.length <= 1) {
return connectionStatusElement
}
// Determine spacing based on size
const spacingClass = {
sm: '-space-x-1',
@@ -62,46 +52,55 @@ export function UserAvatarStack({
}[size]
return (
<div className={`flex items-center ${spacingClass} ${className}`}>
{/* Connection status - always present */}
{connectionStatusElement}
<div className={`flex items-center gap-3 ${className}`}>
{/* Connection status - always check, shows when offline */}
<ConnectionStatus isConnected={isConnected} />
{/* Render visible user avatars */}
{visibleUsers.map((user, index) => (
<UserAvatar
key={user.connectionId}
connectionId={user.connectionId}
name={user.name}
color={user.color}
size={size}
index={index}
tooltipContent={
user.name ? (
<div className='text-center'>
<div className='font-medium'>{user.name}</div>
{user.info && <div className='mt-1 text-muted-foreground text-xs'>{user.info}</div>}
</div>
) : null
}
/>
))}
{/* Only show avatar stack when there are multiple users (>1) */}
{users.length > 1 && (
<div className={`flex items-center ${spacingClass}`}>
{/* Render visible user avatars */}
{visibleUsers.map((user, index) => (
<UserAvatar
key={user.connectionId}
connectionId={user.connectionId}
name={user.name}
color={user.color}
size={size}
index={index}
tooltipContent={
user.name ? (
<div className='text-center'>
<div className='font-medium'>{user.name}</div>
{user.info && (
<div className='mt-1 text-muted-foreground text-xs'>{user.info}</div>
)}
</div>
) : null
}
/>
))}
{/* Render overflow indicator if there are more users */}
{overflowCount > 0 && (
<UserAvatar
connectionId='overflow-indicator' // Use a unique string identifier
name={`+${overflowCount}`}
size={size}
index={visibleUsers.length}
tooltipContent={
<div className='text-center'>
<div className='font-medium'>
{overflowCount} more user{overflowCount > 1 ? 's' : ''}
</div>
<div className='mt-1 text-muted-foreground text-xs'>{users.length} total online</div>
</div>
}
/>
{/* Render overflow indicator if there are more users */}
{overflowCount > 0 && (
<UserAvatar
connectionId='overflow-indicator' // Use a unique string identifier
name={`+${overflowCount}`}
size={size}
index={visibleUsers.length}
tooltipContent={
<div className='text-center'>
<div className='font-medium'>
{overflowCount} more user{overflowCount > 1 ? 's' : ''}
</div>
<div className='mt-1 text-muted-foreground text-xs'>
{users.length} total online
</div>
</div>
}
/>
)}
</div>
)}
</div>
)

View File

@@ -670,7 +670,11 @@ export function ControlBar({ hasValidationErrors = false }: ControlBarProps) {
</h2>
</TooltipTrigger>
{!canEdit && (
<TooltipContent>Edit permissions required to rename workflows</TooltipContent>
<TooltipContent>
{userPermissions.isOfflineMode
? 'Connection lost - please refresh'
: 'Edit permissions required to rename workflows'}
</TooltipContent>
)}
</Tooltip>
)}
@@ -934,7 +938,11 @@ export function ControlBar({ hasValidationErrors = false }: ControlBarProps) {
)}
</TooltipTrigger>
<TooltipContent>
{canEdit ? 'Duplicate Workflow' : 'Admin permission required to duplicate workflows'}
{canEdit
? 'Duplicate Workflow'
: userPermissions.isOfflineMode
? 'Connection lost - please refresh'
: 'Admin permission required to duplicate workflows'}
</TooltipContent>
</Tooltip>
)
@@ -975,7 +983,9 @@ export function ControlBar({ hasValidationErrors = false }: ControlBarProps) {
</TooltipTrigger>
<TooltipContent command='Shift+L'>
{!userPermissions.canEdit
? 'Admin permission required to use auto-layout'
? userPermissions.isOfflineMode
? 'Connection lost - please refresh'
: 'Admin permission required to use auto-layout'
: 'Auto Layout'}
</TooltipContent>
</Tooltip>

View File

@@ -5,6 +5,12 @@ import { ArrowUp } from 'lucide-react'
import { Button } from '@/components/ui/button'
import { Input } from '@/components/ui/input'
import { ScrollArea } from '@/components/ui/scroll-area'
import { createLogger } from '@/lib/logs/console-logger'
import {
extractBlockIdFromOutputId,
extractPathFromOutputId,
parseOutputContentSafely,
} from '@/lib/response-format'
import type { BlockLog, ExecutionResult } from '@/executor/types'
import { useExecutionStore } from '@/stores/execution/store'
import { useChatStore } from '@/stores/panel/chat/store'
@@ -14,6 +20,8 @@ import { useWorkflowExecution } from '../../../../hooks/use-workflow-execution'
import { ChatMessage } from './components/chat-message/chat-message'
import { OutputSelect } from './components/output-select/output-select'
const logger = createLogger('ChatPanel')
interface ChatProps {
panelWidth: number
chatMessage: string
@@ -60,8 +68,8 @@ export function Chat({ panelWidth, chatMessage, setChatMessage }: ChatProps) {
const selected = selectedWorkflowOutputs[activeWorkflowId]
if (!selected || selected.length === 0) {
const defaultSelection = outputEntries.length > 0 ? [outputEntries[0].id] : []
return defaultSelection
// Return empty array when nothing is explicitly selected
return []
}
// Ensure we have no duplicates in the selection
@@ -74,7 +82,7 @@ export function Chat({ panelWidth, chatMessage, setChatMessage }: ChatProps) {
}
return selected
}, [selectedWorkflowOutputs, activeWorkflowId, outputEntries, setSelectedWorkflowOutput])
}, [selectedWorkflowOutputs, activeWorkflowId, setSelectedWorkflowOutput])
// Auto-scroll to bottom when new messages are added
useEffect(() => {
@@ -141,25 +149,22 @@ export function Chat({ panelWidth, chatMessage, setChatMessage }: ChatProps) {
if (nonStreamingLogs.length > 0) {
const outputsToRender = selectedOutputs.filter((outputId) => {
// Extract block ID correctly - handle both formats:
// - "blockId" (direct block ID)
// - "blockId_response.result" (block ID with path)
const blockIdForOutput = outputId.includes('_')
? outputId.split('_')[0]
: outputId.split('.')[0]
const blockIdForOutput = extractBlockIdFromOutputId(outputId)
return nonStreamingLogs.some((log) => log.blockId === blockIdForOutput)
})
for (const outputId of outputsToRender) {
const blockIdForOutput = outputId.includes('_')
? outputId.split('_')[0]
: outputId.split('.')[0]
const path = outputId.substring(blockIdForOutput.length + 1)
const blockIdForOutput = extractBlockIdFromOutputId(outputId)
const path = extractPathFromOutputId(outputId, blockIdForOutput)
const log = nonStreamingLogs.find((l) => l.blockId === blockIdForOutput)
if (log) {
let outputValue: any = log.output
if (path) {
// Parse JSON content safely
outputValue = parseOutputContentSafely(outputValue)
const pathParts = path.split('.')
for (const part of pathParts) {
if (
@@ -211,42 +216,41 @@ export function Chat({ panelWidth, chatMessage, setChatMessage }: ChatProps) {
}
}
} catch (e) {
console.error('Error parsing stream data:', e)
logger.error('Error parsing stream data:', e)
}
}
}
}
}
processStream().catch((e) => console.error('Error processing stream:', e))
processStream().catch((e) => logger.error('Error processing stream:', e))
} else if (result && 'success' in result && result.success && 'logs' in result) {
const finalOutputs: any[] = []
if (selectedOutputs && selectedOutputs.length > 0) {
if (selectedOutputs?.length > 0) {
for (const outputId of selectedOutputs) {
// Find the log that corresponds to the start of the outputId
const log = result.logs?.find(
(l: BlockLog) => l.blockId === outputId || outputId.startsWith(`${l.blockId}_`)
)
const blockIdForOutput = extractBlockIdFromOutputId(outputId)
const path = extractPathFromOutputId(outputId, blockIdForOutput)
const log = result.logs?.find((l: BlockLog) => l.blockId === blockIdForOutput)
if (log) {
let output = log.output
// Check if there is a path to traverse
if (outputId.length > log.blockId.length) {
const path = outputId.substring(log.blockId.length + 1)
if (path) {
const pathParts = path.split('.')
let current = output
for (const part of pathParts) {
if (current && typeof current === 'object' && part in current) {
current = current[part]
} else {
current = undefined
break
}
if (path) {
// Parse JSON content safely
output = parseOutputContentSafely(output)
const pathParts = path.split('.')
let current = output
for (const part of pathParts) {
if (current && typeof current === 'object' && part in current) {
current = current[part]
} else {
current = undefined
break
}
output = current
}
output = current
}
if (output !== undefined) {
finalOutputs.push(output)
@@ -255,10 +259,8 @@ export function Chat({ panelWidth, chatMessage, setChatMessage }: ChatProps) {
}
}
// If no specific outputs could be resolved, fall back to the final workflow output
if (finalOutputs.length === 0 && result.output) {
finalOutputs.push(result.output)
}
// Only show outputs if something was explicitly selected
// If no outputs are selected, don't show anything
// Add a new message for each resolved output
finalOutputs.forEach((output) => {
@@ -266,19 +268,8 @@ export function Chat({ panelWidth, chatMessage, setChatMessage }: ChatProps) {
if (typeof output === 'string') {
content = output
} else if (output && typeof output === 'object') {
// Handle cases where output is { response: ... }
const outputObj = output as Record<string, any>
const response = outputObj.response
if (response) {
if (typeof response.content === 'string') {
content = response.content
} else {
// Pretty print for better readability
content = `\`\`\`json\n${JSON.stringify(response, null, 2)}\n\`\`\``
}
} else {
content = `\`\`\`json\n${JSON.stringify(output, null, 2)}\n\`\`\``
}
// For structured responses, pretty print the JSON
content = `\`\`\`json\n${JSON.stringify(output, null, 2)}\n\`\`\``
}
if (content) {

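The hunks above replace inline `split('_')` parsing with shared helpers from `@/lib/response-format`. A minimal sketch of what those helpers plausibly do, reconstructed from the inline logic being deleted (the real implementations may differ):

```typescript
// Hypothetical re-implementations of the helpers referenced in the diff.
// An output ID is either "blockId" or "blockId_some.dot.path".
function extractBlockIdFromOutputId(outputId: string): string {
  // "blockId_response.result" -> "blockId"; "blockId" -> "blockId"
  return outputId.includes('_') ? outputId.split('_')[0] : outputId.split('.')[0]
}

function extractPathFromOutputId(outputId: string, blockId: string): string {
  // Everything after "blockId_" is the dot-path into the block's output
  return outputId.length > blockId.length ? outputId.substring(blockId.length + 1) : ''
}

// The path-traversal loop that appears in both hunks, factored out:
// walk the dot-path, bailing to undefined on any missing segment.
function resolveOutputPath(output: unknown, path: string): unknown {
  let current: any = output
  for (const part of path.split('.')) {
    if (current && typeof current === 'object' && part in current) {
      current = current[part]
    } else {
      return undefined
    }
  }
  return current
}
```

Centralizing this parsing means the streaming and non-streaming branches can no longer drift apart, which is the bug class the old duplicated ternaries invited.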
View File

@@ -1,8 +1,10 @@
import { useEffect, useMemo, useRef, useState } from 'react'
import { Check, ChevronDown } from 'lucide-react'
import { Button } from '@/components/ui/button'
import { extractFieldsFromSchema, parseResponseFormatSafely } from '@/lib/response-format'
import { cn } from '@/lib/utils'
import { getBlock } from '@/blocks'
import { useSubBlockStore } from '@/stores/workflows/subblock/store'
import { useWorkflowStore } from '@/stores/workflows/workflow/store'
interface OutputSelectProps {
@@ -48,8 +50,31 @@ export function OutputSelect({
? block.name.replace(/\s+/g, '').toLowerCase()
: `block-${block.id}`
// Check for custom response format first
const responseFormatValue = useSubBlockStore.getState().getValue(block.id, 'responseFormat')
const responseFormat = parseResponseFormatSafely(responseFormatValue, block.id)
let outputsToProcess: Record<string, any> = {}
if (responseFormat) {
// Use custom schema properties if response format is specified
const schemaFields = extractFieldsFromSchema(responseFormat)
if (schemaFields.length > 0) {
// Convert schema fields to output structure
schemaFields.forEach((field) => {
outputsToProcess[field.name] = { type: field.type }
})
} else {
// Fallback to default outputs if schema extraction failed
outputsToProcess = block.outputs || {}
}
} else {
// Use default block outputs
outputsToProcess = block.outputs || {}
}
// Add response outputs
if (block.outputs && typeof block.outputs === 'object') {
if (Object.keys(outputsToProcess).length > 0) {
const addOutput = (path: string, outputObj: any, prefix = '') => {
const fullPath = prefix ? `${prefix}.${path}` : path
@@ -100,7 +125,7 @@ export function OutputSelect({
}
// Process all output properties directly (flattened structure)
Object.entries(block.outputs).forEach(([key, value]) => {
Object.entries(outputsToProcess).forEach(([key, value]) => {
addOutput(key, value)
})
}

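The `addOutput` recursion above flattens a block's (possibly nested) output descriptors into the dotted paths shown in the output dropdown. A framework-free sketch of that flattening, assuming leaf descriptors are objects with a `type` key (the shapes are inferred from the diff, not confirmed):

```typescript
// Flatten nested output descriptors into selectable dotted paths.
// A leaf is assumed to look like { type: 'string' }; anything else recurses.
function flattenOutputs(outputs: Record<string, any>): string[] {
  const paths: string[] = []
  const addOutput = (path: string, value: any, prefix = '') => {
    const fullPath = prefix ? `${prefix}.${path}` : path
    if (value && typeof value === 'object' && !('type' in value)) {
      // Nested structure: recurse with the accumulated prefix
      Object.entries(value).forEach(([key, child]) => addOutput(key, child, fullPath))
    } else {
      // Leaf: a concrete output the user can select
      paths.push(fullPath)
    }
  }
  Object.entries(outputs).forEach(([key, value]) => addOutput(key, value))
  return paths
}
```

With a custom response format, `outputsToProcess` is built from the schema fields instead of `block.outputs`, but the same flattening applies to both.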
View File

@@ -125,35 +125,33 @@ export function ConsoleEntry({ entry, consoleWidth }: ConsoleEntryProps) {
<div className='flex items-start gap-2'>
<Terminal className='mt-1 h-4 w-4 text-muted-foreground' />
<div className='overflow-wrap-anywhere relative flex-1 whitespace-normal break-normal font-mono text-sm'>
{typeof entry.output === 'object' &&
entry.output !== null &&
hasNestedStructure(entry.output) && (
<div className='absolute top-0 right-0 z-10'>
<Button
variant='ghost'
size='sm'
className='h-6 px-2 text-muted-foreground hover:text-foreground'
onClick={(e) => {
e.stopPropagation()
setExpandAllJson(!expandAllJson)
}}
>
<span className='flex items-center'>
{expandAllJson ? (
<>
<ChevronUp className='mr-1 h-3 w-3' />
<span className='text-xs'>Collapse</span>
</>
) : (
<>
<ChevronDown className='mr-1 h-3 w-3' />
<span className='text-xs'>Expand</span>
</>
)}
</span>
</Button>
</div>
)}
{entry.output != null && (
<div className='absolute top-0 right-0 z-10'>
<Button
variant='ghost'
size='sm'
className='h-6 px-2 text-muted-foreground hover:text-foreground'
onClick={(e) => {
e.stopPropagation()
setExpandAllJson(!expandAllJson)
}}
>
<span className='flex items-center'>
{expandAllJson ? (
<>
<ChevronUp className='mr-1 h-3 w-3' />
<span className='text-xs'>Collapse</span>
</>
) : (
<>
<ChevronDown className='mr-1 h-3 w-3' />
<span className='text-xs'>Expand</span>
</>
)}
</span>
</Button>
</div>
)}
<JSONView data={entry.output} initiallyExpanded={expandAllJson} />
</div>
</div>

View File

@@ -1,6 +1,7 @@
import { useCallback } from 'react'
import { Tooltip, TooltipContent, TooltipTrigger } from '@/components/ui/tooltip'
import { cn } from '@/lib/utils'
import { useUserPermissionsContext } from '@/app/workspace/[workspaceId]/w/components/providers/workspace-permissions-provider'
import type { BlockConfig } from '@/blocks/types'
export type ToolbarBlockProps = {
@@ -9,6 +10,8 @@ export type ToolbarBlockProps = {
}
export function ToolbarBlock({ config, disabled = false }: ToolbarBlockProps) {
const userPermissions = useUserPermissionsContext()
const handleDragStart = (e: React.DragEvent) => {
if (disabled) {
e.preventDefault()
@@ -66,7 +69,11 @@ export function ToolbarBlock({ config, disabled = false }: ToolbarBlockProps) {
return (
<Tooltip>
<TooltipTrigger asChild>{blockContent}</TooltipTrigger>
<TooltipContent>Edit permissions required to add blocks</TooltipContent>
<TooltipContent>
{userPermissions.isOfflineMode
? 'Connection lost - please refresh'
: 'Edit permissions required to add blocks'}
</TooltipContent>
</Tooltip>
)
}

View File

@@ -1,6 +1,7 @@
import { useCallback } from 'react'
import { Tooltip, TooltipContent, TooltipTrigger } from '@/components/ui/tooltip'
import { cn } from '@/lib/utils'
import { useUserPermissionsContext } from '@/app/workspace/[workspaceId]/w/components/providers/workspace-permissions-provider'
import { LoopTool } from '../../../loop-node/loop-config'
type LoopToolbarItemProps = {
@@ -9,6 +10,8 @@ type LoopToolbarItemProps = {
// Custom component for the Loop Tool
export default function LoopToolbarItem({ disabled = false }: LoopToolbarItemProps) {
const userPermissions = useUserPermissionsContext()
const handleDragStart = (e: React.DragEvent) => {
if (disabled) {
e.preventDefault()
@@ -74,7 +77,11 @@ export default function LoopToolbarItem({ disabled = false }: LoopToolbarItemPro
return (
<Tooltip>
<TooltipTrigger asChild>{blockContent}</TooltipTrigger>
<TooltipContent>Edit permissions required to add blocks</TooltipContent>
<TooltipContent>
{userPermissions.isOfflineMode
? 'Connection lost - please refresh'
: 'Edit permissions required to add blocks'}
</TooltipContent>
</Tooltip>
)
}

View File

@@ -1,6 +1,7 @@
import { useCallback } from 'react'
import { Tooltip, TooltipContent, TooltipTrigger } from '@/components/ui/tooltip'
import { cn } from '@/lib/utils'
import { useUserPermissionsContext } from '@/app/workspace/[workspaceId]/w/components/providers/workspace-permissions-provider'
import { ParallelTool } from '../../../parallel-node/parallel-config'
type ParallelToolbarItemProps = {
@@ -9,6 +10,7 @@ type ParallelToolbarItemProps = {
// Custom component for the Parallel Tool
export default function ParallelToolbarItem({ disabled = false }: ParallelToolbarItemProps) {
const userPermissions = useUserPermissionsContext()
const handleDragStart = (e: React.DragEvent) => {
if (disabled) {
e.preventDefault()
@@ -75,7 +77,11 @@ export default function ParallelToolbarItem({ disabled = false }: ParallelToolba
return (
<Tooltip>
<TooltipTrigger asChild>{blockContent}</TooltipTrigger>
<TooltipContent>Edit permissions required to add blocks</TooltipContent>
<TooltipContent>
{userPermissions.isOfflineMode
? 'Connection lost - please refresh'
: 'Edit permissions required to add blocks'}
</TooltipContent>
</Tooltip>
)
}

View File

@@ -2,6 +2,7 @@ import { ArrowLeftRight, ArrowUpDown, Circle, CircleOff, Copy, Trash2 } from 'lu
import { Button } from '@/components/ui/button'
import { Tooltip, TooltipContent, TooltipTrigger } from '@/components/ui/tooltip'
import { cn } from '@/lib/utils'
import { useUserPermissionsContext } from '@/app/workspace/[workspaceId]/w/components/providers/workspace-permissions-provider'
import { useCollaborativeWorkflow } from '@/hooks/use-collaborative-workflow'
import { useWorkflowStore } from '@/stores/workflows/workflow/store'
@@ -22,9 +23,17 @@ export function ActionBar({ blockId, blockType, disabled = false }: ActionBarPro
const horizontalHandles = useWorkflowStore(
(state) => state.blocks[blockId]?.horizontalHandles ?? false
)
const userPermissions = useUserPermissionsContext()
const isStarterBlock = blockType === 'starter'
const getTooltipMessage = (defaultMessage: string) => {
if (disabled) {
return userPermissions.isOfflineMode ? 'Connection lost - please refresh' : 'Read-only mode'
}
return defaultMessage
}
return (
<div
className={cn(
@@ -68,7 +77,7 @@ export function ActionBar({ blockId, blockType, disabled = false }: ActionBarPro
</Button>
</TooltipTrigger>
<TooltipContent side='right'>
{disabled ? 'Read-only mode' : isEnabled ? 'Disable Block' : 'Enable Block'}
{getTooltipMessage(isEnabled ? 'Disable Block' : 'Enable Block')}
</TooltipContent>
</Tooltip>
@@ -89,9 +98,7 @@ export function ActionBar({ blockId, blockType, disabled = false }: ActionBarPro
<Copy className='h-4 w-4' />
</Button>
</TooltipTrigger>
<TooltipContent side='right'>
{disabled ? 'Read-only mode' : 'Duplicate Block'}
</TooltipContent>
<TooltipContent side='right'>{getTooltipMessage('Duplicate Block')}</TooltipContent>
</Tooltip>
)}
@@ -116,7 +123,7 @@ export function ActionBar({ blockId, blockType, disabled = false }: ActionBarPro
</Button>
</TooltipTrigger>
<TooltipContent side='right'>
{disabled ? 'Read-only mode' : horizontalHandles ? 'Vertical Ports' : 'Horizontal Ports'}
{getTooltipMessage(horizontalHandles ? 'Vertical Ports' : 'Horizontal Ports')}
</TooltipContent>
</Tooltip>
@@ -140,9 +147,7 @@ export function ActionBar({ blockId, blockType, disabled = false }: ActionBarPro
<Trash2 className='h-4 w-4' />
</Button>
</TooltipTrigger>
<TooltipContent side='right'>
{disabled ? 'Read-only mode' : 'Delete Block'}
</TooltipContent>
<TooltipContent side='right'>{getTooltipMessage('Delete Block')}</TooltipContent>
</Tooltip>
)}
</div>

View File

@@ -1,3 +1,4 @@
import { RepeatIcon, SplitIcon } from 'lucide-react'
import { Card } from '@/components/ui/card'
import { cn } from '@/lib/utils'
import {
@@ -77,8 +78,20 @@ export function ConnectionBlocks({
// Get block configuration for icon and color
const blockConfig = getBlock(connection.type)
const displayName = connection.name // Use the actual block name instead of transforming it
const Icon = blockConfig?.icon
const bgColor = blockConfig?.bgColor || '#6B7280' // Fallback to gray
// Handle special blocks that aren't in the registry (loop and parallel)
let Icon = blockConfig?.icon
let bgColor = blockConfig?.bgColor || '#6B7280' // Fallback to gray
if (!blockConfig) {
if (connection.type === 'loop') {
Icon = RepeatIcon as typeof Icon
bgColor = '#2FB3FF' // Blue color for loop blocks
} else if (connection.type === 'parallel') {
Icon = SplitIcon as typeof Icon
bgColor = '#FEE12B' // Yellow color for parallel blocks
}
}
return (
<Card

View File

@@ -73,8 +73,6 @@ export function Code({
}
}, [generationType])
// State management
const [storeValue, setStoreValue] = useSubBlockValue(blockId, subBlockId)
const [code, setCode] = useState<string>('')
const [_lineCount, setLineCount] = useState(1)
const [showTags, setShowTags] = useState(false)
@@ -98,34 +96,13 @@ export function Code({
const toggleCollapsed = () => {
setCollapsedValue(blockId, collapsedStateKey, !isCollapsed)
}
// Use preview value when in preview mode, otherwise use store value or prop value
const value = isPreview ? previewValue : propValue !== undefined ? propValue : storeValue
// Create refs to hold the handlers
const handleStreamStartRef = useRef<() => void>(() => {})
const handleGeneratedContentRef = useRef<(generatedCode: string) => void>(() => {})
const handleStreamChunkRef = useRef<(chunk: string) => void>(() => {})
// AI Code Generation Hook
const handleStreamStart = () => {
setCode('')
// Optionally clear the store value too, though handleStreamChunk will update it
// setStoreValue('')
}
const handleGeneratedContent = (generatedCode: string) => {
setCode(generatedCode)
if (!isPreview && !disabled) {
setStoreValue(generatedCode)
}
}
// Handle streaming chunks directly into the editor
const handleStreamChunk = (chunk: string) => {
setCode((currentCode) => {
const newCode = currentCode + chunk
if (!isPreview && !disabled) {
setStoreValue(newCode)
}
return newCode
})
}
const {
isLoading: isAiLoading,
isStreaming: isAiStreaming,
@@ -140,11 +117,48 @@ export function Code({
} = useCodeGeneration({
generationType: generationType,
initialContext: code,
onGeneratedContent: handleGeneratedContent,
onStreamChunk: handleStreamChunk,
onStreamStart: handleStreamStart,
onGeneratedContent: (content: string) => handleGeneratedContentRef.current?.(content),
onStreamChunk: (chunk: string) => handleStreamChunkRef.current?.(chunk),
onStreamStart: () => handleStreamStartRef.current?.(),
})
// State management - useSubBlockValue with explicit streaming control
const [storeValue, setStoreValue] = useSubBlockValue(blockId, subBlockId, false, {
debounceMs: 150,
isStreaming: isAiStreaming, // Use AI streaming state directly
onStreamingEnd: () => {
logger.debug('AI streaming ended, value persisted', { blockId, subBlockId })
},
})
// Use preview value when in preview mode, otherwise use store value or prop value
const value = isPreview ? previewValue : propValue !== undefined ? propValue : storeValue
// Define the handlers now that we have access to setStoreValue
handleStreamStartRef.current = () => {
setCode('')
// Streaming state is now controlled by isAiStreaming
}
handleGeneratedContentRef.current = (generatedCode: string) => {
setCode(generatedCode)
if (!isPreview && !disabled) {
setStoreValue(generatedCode)
// Final value will be persisted when isAiStreaming becomes false
}
}
handleStreamChunkRef.current = (chunk: string) => {
setCode((currentCode) => {
const newCode = currentCode + chunk
if (!isPreview && !disabled) {
// Update the value - it won't be persisted until streaming ends
setStoreValue(newCode)
}
return newCode
})
}
// Effects
useEffect(() => {
const valueString = value?.toString() ?? ''

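The ref indirection above resolves a chicken-and-egg ordering problem: `useCodeGeneration` needs its callbacks at construction time, but the callbacks need `setStoreValue`, which only exists after the `useSubBlockValue` call that itself depends on the hook's `isAiStreaming` flag. A framework-free sketch of the same late-binding pattern (names are illustrative, not from the codebase):

```typescript
type Handler<T> = (arg: T) => void

// Hand out a stable `invoke` immediately; the real handler can be
// assigned to `ref.current` later, and `invoke` always calls the latest one.
function createLateBoundHandler<T>(): { ref: { current: Handler<T> }; invoke: Handler<T> } {
  const ref = { current: (() => {}) as Handler<T> }
  const invoke: Handler<T> = (arg) => ref.current(arg)
  return { ref, invoke }
}

// Usage mirroring the diff: `invoke` goes to the consumer up front,
// the handler body is defined once its dependencies exist.
const { ref, invoke } = createLateBoundHandler<string>()
const received: string[] = []
invoke('dropped') // safe no-op: real handler not yet assigned
ref.current = (chunk) => received.push(chunk)
invoke('a')
invoke('b')
```

The accompanying `debounceMs`/`isStreaming` options on `useSubBlockValue` then defer persistence until streaming ends, so each chunk updates local state without hammering the store.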
View File

@@ -19,7 +19,6 @@ import {
type OAuthProvider,
parseProvider,
} from '@/lib/oauth'
import { saveToStorage } from '@/stores/workflows/persistence'
const logger = createLogger('OAuthRequiredModal')
@@ -157,42 +156,11 @@ export function OAuthRequiredModal({
(scope) => !scope.includes('userinfo.email') && !scope.includes('userinfo.profile')
)
const handleRedirectToSettings = () => {
try {
// Determine the appropriate serviceId and providerId
const providerId = getProviderIdFromServiceId(effectiveServiceId)
// Store information about the required connection
saveToStorage<string>('pending_service_id', effectiveServiceId)
saveToStorage<string[]>('pending_oauth_scopes', requiredScopes)
saveToStorage<string>('pending_oauth_return_url', window.location.href)
saveToStorage<string>('pending_oauth_provider_id', providerId)
saveToStorage<boolean>('from_oauth_modal', true)
// Close the modal
onClose()
// Open the settings modal with the credentials tab
const event = new CustomEvent('open-settings', {
detail: { tab: 'credentials' },
})
window.dispatchEvent(event)
} catch (error) {
logger.error('Error redirecting to settings:', { error })
}
}
const handleConnectDirectly = async () => {
try {
// Determine the appropriate serviceId and providerId
const providerId = getProviderIdFromServiceId(effectiveServiceId)
// Store information about the required connection
saveToStorage<string>('pending_service_id', effectiveServiceId)
saveToStorage<string[]>('pending_oauth_scopes', requiredScopes)
saveToStorage<string>('pending_oauth_return_url', window.location.href)
saveToStorage<string>('pending_oauth_provider_id', providerId)
// Close the modal
onClose()
@@ -258,14 +226,6 @@ export function OAuthRequiredModal({
<Button type='button' onClick={handleConnectDirectly} className='sm:order-3'>
Connect Now
</Button>
<Button
type='button'
variant='secondary'
onClick={handleRedirectToSettings}
className='sm:order-2'
>
Go to Settings
</Button>
</DialogFooter>
</DialogContent>
</Dialog>

View File

@@ -21,31 +21,24 @@ import {
type OAuthProvider,
parseProvider,
} from '@/lib/oauth'
import { saveToStorage } from '@/stores/workflows/persistence'
import type { SubBlockConfig } from '@/blocks/types'
import { useSubBlockValue } from '../../hooks/use-sub-block-value'
import { OAuthRequiredModal } from './components/oauth-required-modal'
const logger = createLogger('CredentialSelector')
interface CredentialSelectorProps {
value: string
onChange: (value: string) => void
provider: OAuthProvider
requiredScopes?: string[]
label?: string
blockId: string
subBlock: SubBlockConfig
disabled?: boolean
serviceId?: string
isPreview?: boolean
previewValue?: any | null
}
export function CredentialSelector({
value,
onChange,
provider,
requiredScopes = [],
label = 'Select credential',
blockId,
subBlock,
disabled = false,
serviceId,
isPreview = false,
previewValue,
}: CredentialSelectorProps) {
@@ -55,14 +48,22 @@ export function CredentialSelector({
const [showOAuthModal, setShowOAuthModal] = useState(false)
const [selectedId, setSelectedId] = useState('')
// Use collaborative state management via useSubBlockValue hook
const [storeValue, setStoreValue] = useSubBlockValue(blockId, subBlock.id)
// Extract values from subBlock config
const provider = subBlock.provider as OAuthProvider
const requiredScopes = subBlock.requiredScopes || []
const label = subBlock.placeholder || 'Select credential'
const serviceId = subBlock.serviceId
// Get the effective value (preview or store value)
const effectiveValue = isPreview && previewValue !== undefined ? previewValue : storeValue
// Initialize selectedId with the effective value
useEffect(() => {
if (isPreview && previewValue !== undefined) {
setSelectedId(previewValue || '')
} else {
setSelectedId(value)
}
}, [value, isPreview, previewValue])
setSelectedId(effectiveValue || '')
}, [effectiveValue])
// Derive service and provider IDs using useMemo
const effectiveServiceId = useMemo(() => {
@@ -85,7 +86,9 @@ export function CredentialSelector({
// If we have a value but it's not in the credentials, reset it
if (selectedId && !data.credentials.some((cred: Credential) => cred.id === selectedId)) {
setSelectedId('')
onChange('')
if (!isPreview) {
setStoreValue('')
}
}
// Auto-select logic:
@@ -99,11 +102,15 @@ export function CredentialSelector({
const defaultCred = data.credentials.find((cred: Credential) => cred.isDefault)
if (defaultCred) {
setSelectedId(defaultCred.id)
onChange(defaultCred.id)
if (!isPreview) {
setStoreValue(defaultCred.id)
}
} else if (data.credentials.length === 1) {
// If only one credential, select it
setSelectedId(data.credentials[0].id)
onChange(data.credentials[0].id)
if (!isPreview) {
setStoreValue(data.credentials[0].id)
}
}
}
}
@@ -112,7 +119,7 @@ export function CredentialSelector({
} finally {
setIsLoading(false)
}
}, [effectiveProviderId, onChange, selectedId])
}, [effectiveProviderId, selectedId, isPreview, setStoreValue])
// Fetch credentials on initial mount
useEffect(() => {
@@ -121,11 +128,7 @@ export function CredentialSelector({
// eslint-disable-next-line react-hooks/exhaustive-deps
}, [])
// Update local state when external value changes
useEffect(() => {
const currentValue = isPreview ? previewValue : value
setSelectedId(currentValue || '')
}, [value, isPreview, previewValue])
// This effect is no longer needed since we're using effectiveValue directly
// Listen for visibility changes to update credentials when user returns from settings
useEffect(() => {
@@ -158,19 +161,13 @@ export function CredentialSelector({
const handleSelect = (credentialId: string) => {
setSelectedId(credentialId)
if (!isPreview) {
onChange(credentialId)
setStoreValue(credentialId)
}
setOpen(false)
}
// Handle adding a new credential
const handleAddCredential = () => {
// Store information about the required connection
saveToStorage<string>('pending_service_id', effectiveServiceId)
saveToStorage<string[]>('pending_oauth_scopes', requiredScopes)
saveToStorage<string>('pending_oauth_return_url', window.location.href)
saveToStorage<string>('pending_oauth_provider_id', effectiveProviderId)
// Show the OAuth modal
setShowOAuthModal(true)
setOpen(false)

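The auto-select branch in `fetchCredentials` above follows a simple rule: prefer the default credential, otherwise select only when the choice is unambiguous. A sketch of that decision in isolation (the `Credential` shape is assumed from the diff):

```typescript
interface Credential {
  id: string
  isDefault?: boolean
}

// Returns the credential ID to auto-select, or null when the user must choose.
function autoSelectCredential(credentials: Credential[]): string | null {
  const defaultCred = credentials.find((cred) => cred.isDefault)
  if (defaultCred) return defaultCred.id
  if (credentials.length === 1) return credentials[0].id
  return null // multiple non-default credentials: ambiguous, don't guess
}
```

Note that in the diff the store write is additionally gated on `!isPreview`, so preview renders never mutate collaborative state.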
View File

@@ -19,7 +19,6 @@ import {
getServiceIdFromScopes,
type OAuthProvider,
} from '@/lib/oauth'
import { saveToStorage } from '@/stores/workflows/persistence'
import { OAuthRequiredModal } from '../../credential-selector/components/oauth-required-modal'
export interface ConfluenceFileInfo {
@@ -355,15 +354,6 @@ export function ConfluenceFileSelector({
// Handle adding a new credential
const handleAddCredential = () => {
const effectiveServiceId = getServiceId()
const providerId = getProviderId()
// Store information about the required connection
saveToStorage<string>('pending_service_id', effectiveServiceId)
saveToStorage<string[]>('pending_oauth_scopes', requiredScopes)
saveToStorage<string>('pending_oauth_return_url', window.location.href)
saveToStorage<string>('pending_oauth_provider_id', providerId)
// Show the OAuth modal
setShowOAuthModal(true)
setOpen(false)

View File

@@ -24,7 +24,6 @@ import {
type OAuthProvider,
parseProvider,
} from '@/lib/oauth'
import { saveToStorage } from '@/stores/workflows/persistence'
import { OAuthRequiredModal } from '../../credential-selector/components/oauth-required-modal'
const logger = createLogger('GoogleDrivePicker')
@@ -79,6 +78,7 @@ export function GoogleDrivePicker({
const [isLoading, setIsLoading] = useState(false)
const [isLoadingSelectedFile, setIsLoadingSelectedFile] = useState(false)
const [showOAuthModal, setShowOAuthModal] = useState(false)
const [credentialsLoaded, setCredentialsLoaded] = useState(false)
const initialFetchRef = useRef(false)
const [openPicker, _authResponse] = useDrivePicker()
@@ -97,6 +97,7 @@ export function GoogleDrivePicker({
// Fetch available credentials for this provider
const fetchCredentials = useCallback(async () => {
setIsLoading(true)
setCredentialsLoaded(false)
try {
const providerId = getProviderId()
const response = await fetch(`/api/auth/oauth/credentials?provider=${providerId}`)
@@ -128,6 +129,7 @@ export function GoogleDrivePicker({
logger.error('Error fetching credentials:', { error })
} finally {
setIsLoading(false)
setCredentialsLoaded(true)
}
}, [provider, getProviderId, selectedCredentialId])
@@ -154,9 +156,16 @@ export function GoogleDrivePicker({
return data.file
}
} else {
logger.error('Error fetching file by ID:', {
error: await response.text(),
})
const errorText = await response.text()
logger.error('Error fetching file by ID:', { error: errorText })
// If file not found or access denied, clear the selection
if (response.status === 404 || response.status === 403) {
logger.info('File not accessible, clearing selection')
setSelectedFileId('')
onChange('')
onFileInfoChange?.(null)
}
}
return null
} catch (error) {
@@ -166,7 +175,7 @@ export function GoogleDrivePicker({
setIsLoadingSelectedFile(false)
}
},
[selectedCredentialId, onFileInfoChange]
[selectedCredentialId, onChange, onFileInfoChange]
)
// Fetch credentials on initial mount
@@ -177,20 +186,61 @@ export function GoogleDrivePicker({
}
}, [fetchCredentials])
// Fetch the selected file metadata once credentials are loaded or changed
useEffect(() => {
// If we have a file ID selected and credentials are ready but we still don't have the file info, fetch it
if (value && selectedCredentialId && !selectedFile) {
fetchFileById(value)
}
}, [value, selectedCredentialId, selectedFile, fetchFileById])
// Keep internal selectedFileId in sync with the value prop
useEffect(() => {
if (value !== selectedFileId) {
const previousFileId = selectedFileId
setSelectedFileId(value)
// Only clear selected file info if we had a different file before (not initial load)
if (previousFileId && previousFileId !== value && selectedFile) {
setSelectedFile(null)
}
}
}, [value])
}, [value, selectedFileId, selectedFile])
// Track previous credential ID to detect changes
const prevCredentialIdRef = useRef<string>('')
// Clear selected file when credentials are removed or changed
useEffect(() => {
const prevCredentialId = prevCredentialIdRef.current
prevCredentialIdRef.current = selectedCredentialId
if (!selectedCredentialId) {
// No credentials - clear everything
if (selectedFile) {
setSelectedFile(null)
setSelectedFileId('')
onChange('')
}
} else if (prevCredentialId && prevCredentialId !== selectedCredentialId) {
// Credentials changed (not initial load) - clear file info to force refetch
if (selectedFile) {
setSelectedFile(null)
}
}
}, [selectedCredentialId, selectedFile, onChange])
// Fetch the selected file metadata once credentials are loaded or changed
useEffect(() => {
// Only fetch if we have both a file ID and credentials, credentials are loaded, but no file info yet
if (
value &&
selectedCredentialId &&
credentialsLoaded &&
!selectedFile &&
!isLoadingSelectedFile
) {
fetchFileById(value)
}
}, [
value,
selectedCredentialId,
credentialsLoaded,
selectedFile,
isLoadingSelectedFile,
fetchFileById,
])
// Fetch the access token for the selected credential
const fetchAccessToken = async (): Promise<string | null> => {
@@ -286,15 +336,6 @@ export function GoogleDrivePicker({
// Handle adding a new credential
const handleAddCredential = () => {
const effectiveServiceId = getServiceId()
const providerId = getProviderId()
// Store information about the required connection
saveToStorage<string>('pending_service_id', effectiveServiceId)
saveToStorage<string[]>('pending_oauth_scopes', requiredScopes)
saveToStorage<string>('pending_oauth_return_url', window.location.href)
saveToStorage<string>('pending_oauth_provider_id', providerId)
// Show the OAuth modal
setShowOAuthModal(true)
setOpen(false)
@@ -399,7 +440,7 @@ export function GoogleDrivePicker({
{getFileIcon(selectedFile, 'sm')}
<span className='truncate font-normal'>{selectedFile.name}</span>
</div>
) : selectedFileId && (isLoadingSelectedFile || !selectedCredentialId) ? (
) : selectedFileId && isLoadingSelectedFile && selectedCredentialId ? (
<div className='flex items-center gap-2'>
<RefreshCw className='h-4 w-4 animate-spin' />
<span className='text-muted-foreground'>Loading document...</span>

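The `prevCredentialIdRef` effect above has to tell three situations apart: the credential was removed, it was swapped for a different one, or this is the initial load. A plain TypeScript sketch of that classification (names are illustrative):

```typescript
type CredentialTransition = 'cleared' | 'changed' | 'unchanged-or-initial'

// prev is the ref's value from the last run; next is the current credential ID.
function classifyCredentialTransition(prev: string, next: string): CredentialTransition {
  if (!next) return 'cleared' // no credential left: drop the file selection entirely
  if (prev && prev !== next) return 'changed' // different account: force a metadata refetch
  return 'unchanged-or-initial' // first run or same credential: keep existing state
}
```

Gating the metadata fetch on `credentialsLoaded` (set in `finally` after the credentials request) is what prevents the earlier behavior of fetching file info with a credential that was about to be invalidated.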
View File

@@ -20,7 +20,6 @@ import {
getServiceIdFromScopes,
type OAuthProvider,
} from '@/lib/oauth'
import { saveToStorage } from '@/stores/workflows/persistence'
import { OAuthRequiredModal } from '../../credential-selector/components/oauth-required-modal'
const logger = new Logger('jira_issue_selector')
@@ -420,15 +419,6 @@ export function JiraIssueSelector({
// Handle adding a new credential
const handleAddCredential = () => {
const effectiveServiceId = getServiceId()
const providerId = getProviderId()
// Store information about the required connection
saveToStorage<string>('pending_service_id', effectiveServiceId)
saveToStorage<string[]>('pending_oauth_scopes', requiredScopes)
saveToStorage<string>('pending_oauth_return_url', window.location.href)
saveToStorage<string>('pending_oauth_provider_id', providerId)
// Show the OAuth modal
setShowOAuthModal(true)
setOpen(false)

View File

@@ -23,7 +23,6 @@ import {
type OAuthProvider,
parseProvider,
} from '@/lib/oauth'
import { saveToStorage } from '@/stores/workflows/persistence'
import { OAuthRequiredModal } from '../../credential-selector/components/oauth-required-modal'
const logger = createLogger('MicrosoftFileSelector')
@@ -75,6 +74,7 @@ export function MicrosoftFileSelector({
const [availableFiles, setAvailableFiles] = useState<MicrosoftFileInfo[]>([])
const [searchQuery, setSearchQuery] = useState<string>('')
const [showOAuthModal, setShowOAuthModal] = useState(false)
const [credentialsLoaded, setCredentialsLoaded] = useState(false)
const initialFetchRef = useRef(false)
// Determine the appropriate service ID based on provider and scopes
@@ -92,6 +92,7 @@ export function MicrosoftFileSelector({
// Fetch available credentials for this provider
const fetchCredentials = useCallback(async () => {
setIsLoading(true)
setCredentialsLoaded(false)
try {
const providerId = getProviderId()
const response = await fetch(`/api/auth/oauth/credentials?provider=${providerId}`)
@@ -123,6 +124,7 @@ export function MicrosoftFileSelector({
logger.error('Error fetching credentials:', { error })
} finally {
setIsLoading(false)
setCredentialsLoaded(true)
}
}, [provider, getProviderId, selectedCredentialId])
@@ -183,9 +185,16 @@ export function MicrosoftFileSelector({
return data.file
}
} else {
logger.error('Error fetching file by ID:', {
error: await response.text(),
})
const errorText = await response.text()
logger.error('Error fetching file by ID:', { error: errorText })
// If file not found or access denied, clear the selection
if (response.status === 404 || response.status === 403) {
logger.info('File not accessible, clearing selection')
setSelectedFileId('')
onChange('')
onFileInfoChange?.(null)
}
}
return null
} catch (error) {
@@ -224,20 +233,61 @@ export function MicrosoftFileSelector({
}
}, [searchQuery, selectedCredentialId, fetchAvailableFiles])
// Fetch the selected file metadata once credentials are loaded or changed
useEffect(() => {
// If we have a file ID selected and credentials are ready but we still don't have the file info, fetch it
if (value && selectedCredentialId && !selectedFile) {
fetchFileById(value)
}
}, [value, selectedCredentialId, selectedFile, fetchFileById])
// Keep internal selectedFileId in sync with the value prop
useEffect(() => {
if (value !== selectedFileId) {
const previousFileId = selectedFileId
setSelectedFileId(value)
// Only clear selected file info if we had a different file before (not initial load)
if (previousFileId && previousFileId !== value && selectedFile) {
setSelectedFile(null)
}
}
}, [value])
}, [value, selectedFileId, selectedFile])
// Track previous credential ID to detect changes
const prevCredentialIdRef = useRef<string>('')
// Clear selected file when credentials are removed or changed
useEffect(() => {
const prevCredentialId = prevCredentialIdRef.current
prevCredentialIdRef.current = selectedCredentialId
if (!selectedCredentialId) {
// No credentials - clear everything
if (selectedFile) {
setSelectedFile(null)
setSelectedFileId('')
onChange('')
}
} else if (prevCredentialId && prevCredentialId !== selectedCredentialId) {
// Credentials changed (not initial load) - clear file info to force refetch
if (selectedFile) {
setSelectedFile(null)
}
}
}, [selectedCredentialId, selectedFile, onChange])
// Fetch the selected file metadata once credentials are loaded or changed
useEffect(() => {
// Only fetch if we have both a file ID and credentials, credentials are loaded, but no file info yet
if (
value &&
selectedCredentialId &&
credentialsLoaded &&
!selectedFile &&
!isLoadingSelectedFile
) {
fetchFileById(value)
}
}, [
value,
selectedCredentialId,
credentialsLoaded,
selectedFile,
isLoadingSelectedFile,
fetchFileById,
])
// Handle selecting a file from the available files
const handleFileSelect = (file: MicrosoftFileInfo) => {
@@ -251,15 +301,6 @@ export function MicrosoftFileSelector({
// Handle adding a new credential
const handleAddCredential = () => {
const effectiveServiceId = getServiceId()
const providerId = getProviderId()
// Store information about the required connection
saveToStorage<string>('pending_service_id', effectiveServiceId)
saveToStorage<string[]>('pending_oauth_scopes', requiredScopes)
saveToStorage<string>('pending_oauth_return_url', window.location.href)
saveToStorage<string>('pending_oauth_provider_id', providerId)
// Show the OAuth modal
setShowOAuthModal(true)
setOpen(false)
@@ -381,7 +422,7 @@ export function MicrosoftFileSelector({
{getFileIcon(selectedFile, 'sm')}
<span className='truncate font-normal'>{selectedFile.name}</span>
</div>
) : selectedFileId && (isLoadingSelectedFile || !selectedCredentialId) ? (
) : selectedFileId && isLoadingSelectedFile && selectedCredentialId ? (
<div className='flex items-center gap-2'>
<RefreshCw className='h-4 w-4 animate-spin' />
<span className='text-muted-foreground'>Loading document...</span>

View File

@@ -20,7 +20,6 @@ import {
getServiceIdFromScopes,
type OAuthProvider,
} from '@/lib/oauth'
import { saveToStorage } from '@/stores/workflows/persistence'
import { OAuthRequiredModal } from '../../credential-selector/components/oauth-required-modal'
const logger = new Logger('TeamsMessageSelector')
@@ -399,15 +398,6 @@ export function TeamsMessageSelector({
// Handle adding a new credential
const handleAddCredential = () => {
const effectiveServiceId = getServiceId()
const providerId = getProviderId()
// Store information about the required connection
saveToStorage<string>('pending_service_id', effectiveServiceId)
saveToStorage<string[]>('pending_oauth_scopes', requiredScopes)
saveToStorage<string>('pending_oauth_return_url', window.location.href)
saveToStorage<string>('pending_oauth_provider_id', providerId)
// Show the OAuth modal
setShowOAuthModal(true)
setOpen(false)

View File

@@ -16,7 +16,6 @@ import { Popover, PopoverContent, PopoverTrigger } from '@/components/ui/popover
import { createLogger } from '@/lib/logs/console-logger'
import { type Credential, getProviderIdFromServiceId, getServiceIdFromScopes } from '@/lib/oauth'
import { OAuthRequiredModal } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/workflow-block/components/sub-block/components/credential-selector/components/oauth-required-modal'
import { saveToStorage } from '@/stores/workflows/persistence'
const logger = createLogger('FolderSelector')
@@ -274,15 +273,6 @@ export function FolderSelector({
// Handle adding a new credential
const handleAddCredential = () => {
const effectiveServiceId = getServiceId()
const providerId = getProviderId()
// Store information about the required connection
saveToStorage<string>('pending_service_id', effectiveServiceId)
saveToStorage<string[]>('pending_oauth_scopes', requiredScopes)
saveToStorage<string>('pending_oauth_return_url', window.location.href)
saveToStorage<string>('pending_oauth_provider_id', providerId)
// Show the OAuth modal
setShowOAuthModal(true)
setOpen(false)

View File

@@ -20,7 +20,6 @@ import {
getServiceIdFromScopes,
type OAuthProvider,
} from '@/lib/oauth'
import { saveToStorage } from '@/stores/workflows/persistence'
import { OAuthRequiredModal } from '../../credential-selector/components/oauth-required-modal'
const logger = new Logger('jira_project_selector')
@@ -371,15 +370,6 @@ export function JiraProjectSelector({
// Handle adding a new credential
const handleAddCredential = () => {
const effectiveServiceId = getServiceId()
const providerId = getProviderId()
// Store information about the required connection
saveToStorage<string>('pending_service_id', effectiveServiceId)
saveToStorage<string[]>('pending_oauth_scopes', requiredScopes)
saveToStorage<string>('pending_oauth_return_url', window.location.href)
saveToStorage<string>('pending_oauth_provider_id', providerId)
// Show the OAuth modal
setShowOAuthModal(true)
setOpen(false)

View File

@@ -50,7 +50,11 @@ export function ResponseFormat({
isPreview = false,
previewValue,
}: ResponseFormatProps) {
const [storeValue, setStoreValue] = useSubBlockValue<JSONProperty[]>(blockId, subBlockId)
// useSubBlockValue now includes debouncing by default
const [storeValue, setStoreValue] = useSubBlockValue<JSONProperty[]>(blockId, subBlockId, false, {
debounceMs: 200, // Slightly longer debounce for complex structures
})
const [showPreview, setShowPreview] = useState(false)
const value = isPreview ? previewValue : storeValue
@@ -290,7 +294,13 @@ export function ResponseFormat({
{showPreview && (
<div className='rounded border bg-muted/30 p-2'>
<pre className='max-h-32 overflow-auto text-xs'>
{JSON.stringify(generateJSON(properties), null, 2)}
{(() => {
try {
return JSON.stringify(generateJSON(properties), null, 2)
} catch (error) {
return `Error generating preview: ${error instanceof Error ? error.message : 'Unknown error'}`
}
})()}
</pre>
</div>
)}

View File

@@ -0,0 +1,213 @@
import { useCallback, useEffect, useState } from 'react'
import { Check, ChevronDown, ExternalLink, Plus, RefreshCw } from 'lucide-react'
import { Button } from '@/components/ui/button'
import {
Command,
CommandEmpty,
CommandGroup,
CommandItem,
CommandList,
} from '@/components/ui/command'
import { Popover, PopoverContent, PopoverTrigger } from '@/components/ui/popover'
import { createLogger } from '@/lib/logs/console-logger'
import {
type Credential,
OAUTH_PROVIDERS,
type OAuthProvider,
type OAuthService,
parseProvider,
} from '@/lib/oauth'
import { OAuthRequiredModal } from '../../credential-selector/components/oauth-required-modal'
const logger = createLogger('ToolCredentialSelector')
// Helper functions for provider icons and names
const getProviderIcon = (providerName: OAuthProvider) => {
const { baseProvider } = parseProvider(providerName)
const baseProviderConfig = OAUTH_PROVIDERS[baseProvider]
if (!baseProviderConfig) {
return <ExternalLink className='h-4 w-4' />
}
// Always use the base provider icon for a more consistent UI
return baseProviderConfig.icon({ className: 'h-4 w-4' })
}
const getProviderName = (providerName: OAuthProvider) => {
const { baseProvider } = parseProvider(providerName)
const baseProviderConfig = OAUTH_PROVIDERS[baseProvider]
if (baseProviderConfig) {
return baseProviderConfig.name
}
// Fallback: capitalize the provider name
return providerName
.split('-')
.map((part) => part.charAt(0).toUpperCase() + part.slice(1))
.join(' ')
}
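The fallback branch above can be checked in isolation. This standalone sketch (a hypothetical `fallbackProviderName` helper, not part of the component) mirrors the split/capitalize/join logic for providers missing from `OAUTH_PROVIDERS`:

```typescript
// Hypothetical standalone mirror of the fallback branch in getProviderName.
function fallbackProviderName(providerName: string): string {
  return providerName
    .split('-')
    .map((part) => part.charAt(0).toUpperCase() + part.slice(1))
    .join(' ')
}

console.log(fallbackProviderName('google-email')) // → "Google Email"
console.log(fallbackProviderName('microsoft-teams')) // → "Microsoft Teams"
```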
interface ToolCredentialSelectorProps {
value: string
onChange: (value: string) => void
provider: OAuthProvider
requiredScopes?: string[]
label?: string
serviceId?: OAuthService
disabled?: boolean
}
export function ToolCredentialSelector({
value,
onChange,
provider,
requiredScopes = [],
label = 'Select account',
serviceId,
disabled = false,
}: ToolCredentialSelectorProps) {
const [open, setOpen] = useState(false)
const [credentials, setCredentials] = useState<Credential[]>([])
const [isLoading, setIsLoading] = useState(false)
const [showOAuthModal, setShowOAuthModal] = useState(false)
const [selectedId, setSelectedId] = useState('')
// Update selected ID when value changes
useEffect(() => {
setSelectedId(value)
}, [value])
const fetchCredentials = useCallback(async () => {
setIsLoading(true)
try {
const response = await fetch(`/api/auth/oauth/credentials?provider=${provider}`)
if (response.ok) {
const data = await response.json()
setCredentials(data.credentials || [])
// If we have a selected value but it's not in the credentials list, clear it
if (value && !data.credentials?.some((cred: Credential) => cred.id === value)) {
onChange('')
}
} else {
logger.error('Error fetching credentials:', { error: await response.text() })
setCredentials([])
}
} catch (error) {
logger.error('Error fetching credentials:', { error })
setCredentials([])
} finally {
setIsLoading(false)
}
}, [provider, value, onChange])
// Fetch credentials on mount and when provider changes
useEffect(() => {
fetchCredentials()
}, [fetchCredentials])
const handleSelect = (credentialId: string) => {
setSelectedId(credentialId)
onChange(credentialId)
setOpen(false)
}
const handleOAuthClose = () => {
setShowOAuthModal(false)
// Refetch credentials to include any new ones
fetchCredentials()
}
const selectedCredential = credentials.find((cred) => cred.id === selectedId)
return (
<>
<Popover open={open} onOpenChange={setOpen}>
<PopoverTrigger asChild>
<Button
variant='outline'
role='combobox'
aria-expanded={open}
className='w-full justify-between'
disabled={disabled}
>
{selectedCredential ? (
<div className='flex items-center gap-2 overflow-hidden'>
{getProviderIcon(provider)}
<span className='truncate font-normal'>{selectedCredential.name}</span>
</div>
) : (
<div className='flex items-center gap-2'>
{getProviderIcon(provider)}
<span className='text-muted-foreground'>{label}</span>
</div>
)}
<ChevronDown className='ml-2 h-4 w-4 shrink-0 opacity-50' />
</Button>
</PopoverTrigger>
<PopoverContent className='w-[300px] p-0' align='start'>
<Command>
<CommandList>
<CommandEmpty>
{isLoading ? (
<div className='flex items-center justify-center p-4'>
<RefreshCw className='h-4 w-4 animate-spin' />
<span className='ml-2'>Loading...</span>
</div>
) : credentials.length === 0 ? (
<div className='p-4 text-center'>
<p className='font-medium text-sm'>No accounts connected.</p>
<p className='text-muted-foreground text-xs'>
Connect a {getProviderName(provider)} account to continue.
</p>
</div>
) : (
<div className='p-4 text-center'>
<p className='font-medium text-sm'>No accounts found.</p>
</div>
)}
</CommandEmpty>
{credentials.length > 0 && (
<CommandGroup>
{credentials.map((credential) => (
<CommandItem
key={credential.id}
value={credential.id}
onSelect={() => handleSelect(credential.id)}
>
<div className='flex items-center gap-2'>
{getProviderIcon(credential.provider)}
<span className='font-normal'>{credential.name}</span>
</div>
{credential.id === selectedId && <Check className='ml-auto h-4 w-4' />}
</CommandItem>
))}
</CommandGroup>
)}
<CommandGroup>
<CommandItem onSelect={() => setShowOAuthModal(true)}>
<div className='flex items-center gap-2'>
<Plus className='h-4 w-4' />
<span className='font-normal'>Connect {getProviderName(provider)} account</span>
</div>
</CommandItem>
</CommandGroup>
</CommandList>
</Command>
</PopoverContent>
</Popover>
<OAuthRequiredModal
isOpen={showOAuthModal}
onClose={handleOAuthClose}
provider={provider}
toolName={label}
requiredScopes={requiredScopes}
serviceId={serviceId}
/>
</>
)
}

View File

@@ -22,10 +22,10 @@ import { useWorkflowStore } from '@/stores/workflows/workflow/store'
import { getTool } from '@/tools/utils'
import { useSubBlockValue } from '../../hooks/use-sub-block-value'
import { ChannelSelectorInput } from '../channel-selector/channel-selector-input'
import { CredentialSelector } from '../credential-selector/credential-selector'
import { ShortInput } from '../short-input'
import { type CustomTool, CustomToolModal } from './components/custom-tool-modal/custom-tool-modal'
import { ToolCommand } from './components/tool-command/tool-command'
import { ToolCredentialSelector } from './components/tool-credential-selector'
interface ToolInputProps {
blockId: string
@@ -347,6 +347,8 @@ export function ToolInput({
const [customToolModalOpen, setCustomToolModalOpen] = useState(false)
const [editingToolIndex, setEditingToolIndex] = useState<number | null>(null)
const [searchQuery, setSearchQuery] = useState('')
const [draggedIndex, setDraggedIndex] = useState<number | null>(null)
const [dragOverIndex, setDragOverIndex] = useState<number | null>(null)
const isWide = useWorkflowStore((state) => state.blocks[blockId]?.isWide)
const customTools = useCustomToolsStore((state) => state.getAllTools())
const subBlockStore = useSubBlockStore()
@@ -668,6 +670,46 @@ export function ToolInput({
)
}
const handleDragStart = (e: React.DragEvent, index: number) => {
if (isPreview || disabled) return
setDraggedIndex(index)
e.dataTransfer.effectAllowed = 'move'
e.dataTransfer.setData('text/html', '')
}
const handleDragOver = (e: React.DragEvent, index: number) => {
if (isPreview || disabled || draggedIndex === null) return
e.preventDefault()
e.dataTransfer.dropEffect = 'move'
setDragOverIndex(index)
}
const handleDragEnd = () => {
setDraggedIndex(null)
setDragOverIndex(null)
}
const handleDrop = (e: React.DragEvent, dropIndex: number) => {
if (isPreview || disabled || draggedIndex === null || draggedIndex === dropIndex) return
e.preventDefault()
const newTools = [...selectedTools]
const draggedTool = newTools[draggedIndex]
newTools.splice(draggedIndex, 1)
if (dropIndex === selectedTools.length) {
newTools.push(draggedTool)
} else {
const adjustedDropIndex = draggedIndex < dropIndex ? dropIndex - 1 : dropIndex
newTools.splice(adjustedDropIndex, 0, draggedTool)
}
setStoreValue(newTools)
setDraggedIndex(null)
setDragOverIndex(null)
}
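The splice arithmetic in `handleDrop` is easy to get off by one, so here is a pure sketch (a hypothetical `reorderTools` helper, using the same index adjustment as above) that makes the behavior checkable on its own: removing the dragged item shifts later indices left, hence the `dropIndex - 1` adjustment when dragging forward.

```typescript
// Hypothetical pure version of the reorder logic in handleDrop:
// remove the dragged item, then reinsert it at the adjusted index.
function reorderTools<T>(tools: T[], draggedIndex: number, dropIndex: number): T[] {
  if (draggedIndex === dropIndex) return tools
  const next = [...tools]
  const [dragged] = next.splice(draggedIndex, 1)
  if (dropIndex === tools.length) {
    next.push(dragged) // dropped on the trailing drop zone
  } else {
    const adjusted = draggedIndex < dropIndex ? dropIndex - 1 : dropIndex
    next.splice(adjusted, 0, dragged)
  }
  return next
}

console.log(reorderTools(['a', 'b', 'c'], 0, 2)) // → ['b', 'a', 'c']
console.log(reorderTools(['a', 'b', 'c'], 0, 3)) // → ['b', 'c', 'a']
```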
const IconComponent = ({ icon: Icon, className }: { icon: any; className?: string }) => {
if (!Icon) return null
return <Icon className={className} />
@@ -827,9 +869,34 @@ export function ToolInput({
return (
<div
key={`${tool.type}-${toolIndex}`}
className={cn('group flex flex-col', isWide ? 'w-[calc(50%-0.25rem)]' : 'w-full')}
className={cn(
'group relative flex flex-col transition-all duration-200 ease-in-out',
isWide ? 'w-[calc(50%-0.25rem)]' : 'w-full',
draggedIndex === toolIndex ? 'scale-95 opacity-40' : '',
dragOverIndex === toolIndex && draggedIndex !== toolIndex && draggedIndex !== null
? 'translate-y-1 transform'
: '',
selectedTools.length > 1 && !isPreview && !disabled
? 'cursor-grab active:cursor-grabbing'
: ''
)}
draggable={!isPreview && !disabled}
onDragStart={(e) => handleDragStart(e, toolIndex)}
onDragOver={(e) => handleDragOver(e, toolIndex)}
onDragEnd={handleDragEnd}
onDrop={(e) => handleDrop(e, toolIndex)}
>
<div className='flex flex-col overflow-visible rounded-md border bg-card'>
{/* Subtle drop indicator - use border highlight instead of separate line */}
<div
className={cn(
'flex flex-col overflow-visible rounded-md border bg-card',
dragOverIndex === toolIndex &&
draggedIndex !== toolIndex &&
draggedIndex !== null
? 'border-t-2 border-t-muted-foreground/40'
: ''
)}
>
<div
className={cn(
'flex items-center justify-between bg-accent/50 p-2',
@@ -993,13 +1060,14 @@ export function ToolInput({
<div className='font-medium text-muted-foreground text-xs'>
Account
</div>
<CredentialSelector
<ToolCredentialSelector
value={tool.params.credential || ''}
onChange={(value) => handleCredentialChange(toolIndex, value)}
provider={oauthConfig.provider as OAuthProvider}
requiredScopes={oauthConfig.additionalScopes || []}
label={`Select ${oauthConfig.provider} account`}
serviceId={oauthConfig.provider}
disabled={disabled}
/>
</div>
)
@@ -1091,6 +1159,20 @@ export function ToolInput({
)
})}
{/* Drop zone for the end of the list */}
{selectedTools.length > 0 && draggedIndex !== null && (
<div
className={cn(
'h-2 w-full rounded transition-all duration-200 ease-in-out',
dragOverIndex === selectedTools.length
? 'border-b-2 border-b-muted-foreground/40'
: ''
)}
onDragOver={(e) => handleDragOver(e, selectedTools.length)}
onDrop={(e) => handleDrop(e, selectedTools.length)}
/>
)}
<Popover open={open} onOpenChange={setOpen}>
<PopoverTrigger asChild>
<Button

View File

@@ -16,7 +16,7 @@ import { createLogger } from '@/lib/logs/console-logger'
import { useSubBlockStore } from '@/stores/workflows/subblock/store'
import { useWorkflowStore } from '@/stores/workflows/workflow/store'
import { useSubBlockValue } from '../../hooks/use-sub-block-value'
import { CredentialSelector } from '../credential-selector/credential-selector'
import { ToolCredentialSelector } from '../tool-input/components/tool-credential-selector'
import { WebhookModal } from './components/webhook-modal'
const logger = createLogger('WebhookConfig')
@@ -564,7 +564,7 @@ export function WebhookConfig({
{error && <div className='mb-2 text-red-500 text-sm dark:text-red-400'>{error}</div>}
<div className='mb-3'>
<CredentialSelector
<ToolCredentialSelector
value={gmailCredentialId}
onChange={handleCredentialChange}
provider='google-email'

View File

@@ -1,11 +1,15 @@
import { useCallback, useEffect, useRef } from 'react'
import { isEqual } from 'lodash'
import { createLogger } from '@/lib/logs/console-logger'
import { useCollaborativeWorkflow } from '@/hooks/use-collaborative-workflow'
import { getProviderFromModel } from '@/providers/utils'
import { useGeneralStore } from '@/stores/settings/general/store'
import { useWorkflowRegistry } from '@/stores/workflows/registry/store'
import { useSubBlockStore } from '@/stores/workflows/subblock/store'
import { useWorkflowStore } from '@/stores/workflows/workflow/store'
const logger = createLogger('SubBlockValue')
// Helper function to dispatch collaborative subblock updates
const dispatchSubblockUpdate = (blockId: string, subBlockId: string, value: any) => {
const event = new CustomEvent('update-subblock-value', {
@@ -154,20 +158,31 @@ function storeApiKeyValue(
}
}
interface UseSubBlockValueOptions {
debounceMs?: number
isStreaming?: boolean // Explicit streaming state
onStreamingEnd?: () => void
}
/**
* Custom hook to get and set values for a sub-block in a workflow.
* Handles complex object values properly by using deep equality comparison.
* Includes automatic debouncing and explicit streaming mode for AI generation.
*
* @param blockId The ID of the block containing the sub-block
* @param subBlockId The ID of the sub-block
* @param triggerWorkflowUpdate Whether to trigger a workflow update when the value changes
* @returns A tuple containing the current value and a setter function
* @param options Configuration for debouncing and streaming behavior
* @returns A tuple containing the current value and setter function
*/
export function useSubBlockValue<T = any>(
blockId: string,
subBlockId: string,
triggerWorkflowUpdate = false
triggerWorkflowUpdate = false,
options?: UseSubBlockValueOptions
): readonly [T | null, (value: T) => void] {
const { debounceMs = 150, isStreaming = false, onStreamingEnd } = options || {}
const { collaborativeSetSubblockValue } = useCollaborativeWorkflow()
const blockType = useWorkflowStore(
@@ -187,6 +202,12 @@ export function useSubBlockValue<T = any>(
// Previous model reference for detecting model changes
const prevModelRef = useRef<string | null>(null)
// Debouncing refs
const debounceTimerRef = useRef<NodeJS.Timeout | null>(null)
const lastEmittedValueRef = useRef<T | null>(null)
const streamingValueRef = useRef<T | null>(null)
const wasStreamingRef = useRef<boolean>(false)
// Get value from subblock store - always call this hook unconditionally
const storeValue = useSubBlockStore(
useCallback((state) => state.getValue(blockId, subBlockId), [blockId, subBlockId])
@@ -211,6 +232,36 @@ export function useSubBlockValue<T = any>(
// Compute the modelValue based on block type
const modelValue = isProviderBasedBlock ? (modelSubBlockValue as string) : null
// Cleanup timer on unmount
useEffect(() => {
return () => {
if (debounceTimerRef.current) {
clearTimeout(debounceTimerRef.current)
}
}
}, [])
// Emit the value to socket/DB
const emitValue = useCallback(
(value: T) => {
collaborativeSetSubblockValue(blockId, subBlockId, value)
lastEmittedValueRef.current = value
},
[blockId, subBlockId, collaborativeSetSubblockValue]
)
// Handle streaming mode changes
useEffect(() => {
// If we just exited streaming mode, emit the final value
if (wasStreamingRef.current && !isStreaming && streamingValueRef.current !== null) {
logger.debug('Streaming ended, persisting final value', { blockId, subBlockId })
emitValue(streamingValueRef.current)
streamingValueRef.current = null
onStreamingEnd?.()
}
wasStreamingRef.current = isStreaming
}, [isStreaming, blockId, subBlockId, emitValue, onStreamingEnd])
// Hook to set a value in the subblock store
const setValue = useCallback(
(newValue: T) => {
@@ -218,6 +269,22 @@ export function useSubBlockValue<T = any>(
if (!isEqual(valueRef.current, newValue)) {
valueRef.current = newValue
// Always update local store immediately for UI responsiveness
useSubBlockStore.setState((state) => ({
workflowValues: {
...state.workflowValues,
[useWorkflowRegistry.getState().activeWorkflowId || '']: {
...state.workflowValues[useWorkflowRegistry.getState().activeWorkflowId || ''],
[blockId]: {
...state.workflowValues[useWorkflowRegistry.getState().activeWorkflowId || '']?.[
blockId
],
[subBlockId]: newValue,
},
},
},
}))
// Ensure we're passing the actual value, not a reference that might change
const valueCopy =
newValue === null
@@ -231,8 +298,27 @@ export function useSubBlockValue<T = any>(
storeApiKeyValue(blockId, blockType, modelValue, newValue, storeValue)
}
// Use collaborative function which handles both local store update and socket emission
collaborativeSetSubblockValue(blockId, subBlockId, valueCopy)
// Clear any existing debounce timer
if (debounceTimerRef.current) {
clearTimeout(debounceTimerRef.current)
debounceTimerRef.current = null
}
// If streaming, just store the value without emitting
if (isStreaming) {
streamingValueRef.current = valueCopy
} else {
// Detect large changes for extended debounce
const isLargeChange = detectLargeChange(lastEmittedValueRef.current, valueCopy)
const effectiveDebounceMs = isLargeChange ? debounceMs * 2 : debounceMs
// Debounce the socket emission
debounceTimerRef.current = setTimeout(() => {
if (valueRef.current !== null && valueRef.current !== lastEmittedValueRef.current) {
emitValue(valueCopy)
}
}, effectiveDebounceMs)
}
if (triggerWorkflowUpdate) {
useWorkflowStore.getState().triggerUpdate()
@@ -247,7 +333,9 @@ export function useSubBlockValue<T = any>(
storeValue,
triggerWorkflowUpdate,
modelValue,
collaborativeSetSubblockValue,
isStreaming,
debounceMs,
emitValue,
]
)
@@ -320,5 +408,29 @@ export function useSubBlockValue<T = any>(
}
}, [storeValue, initialValue])
// Return the store value if set, otherwise the initial value, plus the setter
return [storeValue !== undefined ? storeValue : initialValue, setValue] as const
}
// Helper function to detect large changes
function detectLargeChange(oldValue: any, newValue: any): boolean {
// Handle null/undefined
if (oldValue == null && newValue == null) return false
if (oldValue == null || newValue == null) return true
// For strings, check if it's a large paste or deletion
if (typeof oldValue === 'string' && typeof newValue === 'string') {
const sizeDiff = Math.abs(newValue.length - oldValue.length)
// Consider it a large change if more than 50 characters changed at once
return sizeDiff > 50
}
// For arrays, check length difference
if (Array.isArray(oldValue) && Array.isArray(newValue)) {
const sizeDiff = Math.abs(newValue.length - oldValue.length)
return sizeDiff > 5
}
// For other types, always treat as small change
return false
}
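As a sanity check, the thresholds above (50 characters for strings, 5 elements for arrays) can be exercised directly. This sketch restates the helper so the heuristic is runnable on its own:

```typescript
// Standalone restatement of detectLargeChange, matching the helper above.
function detectLargeChange(oldValue: any, newValue: any): boolean {
  if (oldValue == null && newValue == null) return false
  if (oldValue == null || newValue == null) return true
  if (typeof oldValue === 'string' && typeof newValue === 'string') {
    // Large paste or deletion: more than 50 characters changed at once
    return Math.abs(newValue.length - oldValue.length) > 50
  }
  if (Array.isArray(oldValue) && Array.isArray(newValue)) {
    return Math.abs(newValue.length - oldValue.length) > 5
  }
  return false
}

console.log(detectLargeChange('abc', 'abcd')) // → false (small edit)
console.log(detectLargeChange('', 'x'.repeat(100))) // → true (large paste)
console.log(detectLargeChange([1], [1, 2, 3, 4, 5, 6, 7])) // → true
```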

View File

@@ -297,27 +297,11 @@ export function SubBlock({
case 'oauth-input':
return (
<CredentialSelector
value={
isPreview ? previewValue || '' : typeof config.value === 'string' ? config.value : ''
}
onChange={(value) => {
// Only allow changes in non-preview mode and when not disabled
if (!isPreview && !disabled) {
const event = new CustomEvent('update-subblock-value', {
detail: {
blockId,
subBlockId: config.id,
value,
},
})
window.dispatchEvent(event)
}
}}
provider={config.provider as any}
requiredScopes={config.requiredScopes || []}
label={config.placeholder || 'Select a credential'}
serviceId={config.serviceId}
blockId={blockId}
subBlock={config}
disabled={isDisabled}
isPreview={isPreview}
previewValue={previewValue}
/>
)
case 'file-selector':

View File

@@ -654,7 +654,9 @@ export function WorkflowBlock({ id, data }: NodeProps<WorkflowBlockProps>) {
</TooltipTrigger>
<TooltipContent side='top'>
{!userPermissions.canEdit
? 'Read-only mode'
? userPermissions.isOfflineMode
? 'Connection lost - please refresh'
: 'Read-only mode'
: blockAdvancedMode
? 'Switch to Basic Mode'
: 'Switch to Advanced Mode'}
@@ -750,7 +752,9 @@ export function WorkflowBlock({ id, data }: NodeProps<WorkflowBlockProps>) {
</TooltipTrigger>
<TooltipContent side='top'>
{!userPermissions.canEdit
? 'Read-only mode'
? userPermissions.isOfflineMode
? 'Connection lost - please refresh'
: 'Read-only mode'
: isWide
? 'Narrow Block'
: 'Expand Block'}

View File

@@ -29,6 +29,35 @@ export interface ConnectedBlock {
}
}
function parseResponseFormatSafely(responseFormatValue: any, blockId: string): any {
if (!responseFormatValue) {
return undefined
}
if (typeof responseFormatValue === 'object' && responseFormatValue !== null) {
return responseFormatValue
}
if (typeof responseFormatValue === 'string') {
const trimmedValue = responseFormatValue.trim()
if (trimmedValue.startsWith('<') && trimmedValue.includes('>')) {
return trimmedValue
}
if (trimmedValue === '') {
return undefined
}
try {
return JSON.parse(trimmedValue)
} catch (error) {
return undefined
}
}
return undefined
}
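The parser above has four distinct outcomes: objects pass through, strings that look like variable references (wrapped in `<` and `>`) pass through unparsed, valid JSON strings are parsed, and everything else collapses to `undefined`. This standalone restatement (the `blockId` argument is dropped here since this sketch does no logging) makes those cases checkable:

```typescript
// Standalone restatement of parseResponseFormatSafely from the hunk above.
function parseResponseFormatSafely(responseFormatValue: any): any {
  if (!responseFormatValue) return undefined
  if (typeof responseFormatValue === 'object') return responseFormatValue
  if (typeof responseFormatValue === 'string') {
    const trimmed = responseFormatValue.trim()
    // Strings that look like variable references pass through unparsed
    if (trimmed.startsWith('<') && trimmed.includes('>')) return trimmed
    if (trimmed === '') return undefined
    try {
      return JSON.parse(trimmed)
    } catch {
      return undefined // malformed JSON degrades silently instead of throwing
    }
  }
  return undefined
}

console.log(parseResponseFormatSafely('{"type":"object"}')) // → { type: 'object' }
console.log(parseResponseFormatSafely('<agent.output>')) // → '<agent.output>'
console.log(parseResponseFormatSafely('not json')) // → undefined
```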
// Helper function to extract fields from JSON Schema
function extractFieldsFromSchema(schema: any): Field[] {
if (!schema || typeof schema !== 'object') {
@@ -75,17 +104,8 @@ export function useBlockConnections(blockId: string) {
// Get the response format from the subblock store
const responseFormatValue = useSubBlockStore.getState().getValue(sourceId, 'responseFormat')
let responseFormat
try {
responseFormat =
typeof responseFormatValue === 'string' && responseFormatValue
? JSON.parse(responseFormatValue)
: responseFormatValue // Handle case where it's already an object
} catch (e) {
logger.error('Failed to parse response format:', { e })
responseFormat = undefined
}
// Safely parse response format with proper error handling
const responseFormat = parseResponseFormatSafely(responseFormatValue, sourceId)
// Get the default output type from the block's outputs
const defaultOutputs: Field[] = Object.entries(sourceBlock.outputs || {}).map(([key]) => ({
@@ -118,17 +138,8 @@ export function useBlockConnections(blockId: string) {
.getState()
.getValue(edge.source, 'responseFormat')
let responseFormat
try {
responseFormat =
typeof responseFormatValue === 'string' && responseFormatValue
? JSON.parse(responseFormatValue)
: responseFormatValue // Handle case where it's already an object
} catch (e) {
logger.error('Failed to parse response format:', { e })
responseFormat = undefined
}
// Safely parse response format with proper error handling
const responseFormat = parseResponseFormatSafely(responseFormatValue, edge.source)
// Get the default output type from the block's outputs
const defaultOutputs: Field[] = Object.entries(sourceBlock.outputs || {}).map(([key]) => ({

View File

@@ -217,10 +217,13 @@ export function useWorkflowExecution() {
result.logs.forEach((log: BlockLog) => {
if (streamedContent.has(log.blockId)) {
const content = streamedContent.get(log.blockId) || ''
if (log.output) {
log.output.content = content
}
useConsoleStore.getState().updateConsole(log.blockId, content)
// For console display, show the actual structured block output instead of formatted streaming content
// This ensures console logs match the block state structure
// Use replaceOutput to completely replace the output instead of merging
useConsoleStore.getState().updateConsole(log.blockId, {
replaceOutput: log.output,
success: true,
})
}
})

View File

@@ -1,6 +1,7 @@
'use client'
import React, { createContext, useContext, useMemo } from 'react'
import type React from 'react'
import { createContext, useContext, useEffect, useMemo, useState } from 'react'
import { useParams } from 'next/navigation'
import { createLogger } from '@/lib/logs/console-logger'
import { useUserPermissions, type WorkspaceUserPermissions } from '@/hooks/use-user-permissions'
@@ -8,6 +9,7 @@ import {
useWorkspacePermissions,
type WorkspacePermissions,
} from '@/hooks/use-workspace-permissions'
import { usePresence } from '../../[workflowId]/hooks/use-presence'
const logger = createLogger('WorkspacePermissionsProvider')
@@ -18,88 +20,140 @@ interface WorkspacePermissionsContextType {
permissionsError: string | null
updatePermissions: (newPermissions: WorkspacePermissions) => void
// Computed user permissions
userPermissions: WorkspaceUserPermissions
// Computed user permissions (connection-aware)
userPermissions: WorkspaceUserPermissions & { isOfflineMode?: boolean }
// Connection state management
setOfflineMode: (isOffline: boolean) => void
}
const WorkspacePermissionsContext = createContext<WorkspacePermissionsContextType | null>(null)
const WorkspacePermissionsContext = createContext<WorkspacePermissionsContextType>({
workspacePermissions: null,
permissionsLoading: false,
permissionsError: null,
updatePermissions: () => {},
userPermissions: {
canRead: false,
canEdit: false,
canAdmin: false,
userPermissions: 'read',
isLoading: false,
error: null,
},
setOfflineMode: () => {},
})
interface WorkspacePermissionsProviderProps {
children: React.ReactNode
}
const WorkspacePermissionsProvider = React.memo<WorkspacePermissionsProviderProps>(
({ children }) => {
const params = useParams()
const workspaceId = params.workspaceId as string
/**
* Provider that manages workspace permissions and user access
* Also provides connection-aware permissions that enforce read-only mode when offline
*/
export function WorkspacePermissionsProvider({ children }: WorkspacePermissionsProviderProps) {
const params = useParams()
const workspaceId = params?.workspaceId as string
if (!workspaceId) {
logger.warn('Workspace ID is undefined from params:', params)
// Manage offline mode state locally
const [isOfflineMode, setIsOfflineMode] = useState(false)
const [hasBeenConnected, setHasBeenConnected] = useState(false)
// Fetch workspace permissions and loading state
const {
permissions: workspacePermissions,
loading: permissionsLoading,
error: permissionsError,
updatePermissions,
} = useWorkspacePermissions(workspaceId)
// Get base user permissions from workspace permissions
const baseUserPermissions = useUserPermissions(
workspacePermissions,
permissionsLoading,
permissionsError
)
// Get connection status and update offline mode accordingly
const { isConnected } = usePresence()
useEffect(() => {
if (isConnected) {
// Mark that we've been connected at least once
setHasBeenConnected(true)
// On initial connection, allow going online
if (!hasBeenConnected) {
setIsOfflineMode(false)
}
// If we were previously connected and this is a reconnection, stay offline (user must refresh)
} else if (hasBeenConnected) {
// Only enter offline mode if we were previously connected and now disconnected
setIsOfflineMode(true)
}
// If not connected and never been connected, stay in initial state (not offline mode)
}, [isConnected, hasBeenConnected])
// Create connection-aware permissions that override user permissions when offline
const userPermissions = useMemo((): WorkspaceUserPermissions & { isOfflineMode?: boolean } => {
if (isOfflineMode) {
// In offline mode, force read-only permissions regardless of actual user permissions
return {
...baseUserPermissions,
canEdit: false,
canAdmin: false,
// Keep canRead true so users can still view content
canRead: baseUserPermissions.canRead,
isOfflineMode: true,
}
}
const {
permissions: workspacePermissions,
loading: permissionsLoading,
error: permissionsError,
updatePermissions,
} = useWorkspacePermissions(workspaceId)
// When online, use normal permissions
return {
...baseUserPermissions,
isOfflineMode: false,
}
}, [baseUserPermissions, isOfflineMode])
const userPermissions = useUserPermissions(
const contextValue = useMemo(
() => ({
workspacePermissions,
permissionsLoading,
permissionsError
)
permissionsError,
updatePermissions,
userPermissions,
setOfflineMode: setIsOfflineMode,
}),
[workspacePermissions, permissionsLoading, permissionsError, updatePermissions, userPermissions]
)
const contextValue = useMemo(
() => ({
workspacePermissions,
permissionsLoading,
permissionsError,
updatePermissions,
userPermissions,
}),
[
workspacePermissions,
permissionsLoading,
permissionsError,
updatePermissions,
userPermissions,
]
)
return (
<WorkspacePermissionsContext.Provider value={contextValue}>
{children}
</WorkspacePermissionsContext.Provider>
)
}
)
WorkspacePermissionsProvider.displayName = 'WorkspacePermissionsProvider'
export { WorkspacePermissionsProvider }
return (
<WorkspacePermissionsContext.Provider value={contextValue}>
{children}
</WorkspacePermissionsContext.Provider>
)
}
/**
* Hook to access workspace permissions context
* This replaces individual useWorkspacePermissions calls to avoid duplicate API requests
* Hook to access workspace permissions and data from context
* This provides both raw workspace permissions and computed user permissions
*/
export function useWorkspacePermissionsContext(): WorkspacePermissionsContextType {
const context = useContext(WorkspacePermissionsContext)
if (!context) {
throw new Error(
'useWorkspacePermissionsContext must be used within a WorkspacePermissionsProvider'
)
}
return context
}
/**
* Hook to access user permissions from context
* This replaces individual useUserPermissions calls
* This replaces individual useUserPermissions calls and includes connection-aware permissions
*/
export function useUserPermissionsContext(): WorkspaceUserPermissions {
export function useUserPermissionsContext(): WorkspaceUserPermissions & {
isOfflineMode?: boolean
} {
const { userPermissions } = useWorkspacePermissionsContext()
return userPermissions
}


@@ -58,6 +58,22 @@ export function WorkflowPreview({
defaultZoom,
onNodeClick,
}: WorkflowPreviewProps) {
// Handle migrated logs that don't have complete workflow state
if (!workflowState || !workflowState.blocks || !workflowState.edges) {
return (
<div
style={{ height, width }}
className='flex items-center justify-center rounded-lg border border-gray-200 bg-gray-50 dark:border-gray-700 dark:bg-gray-900'
>
<div className='text-center text-gray-500 dark:text-gray-400'>
<div className='mb-2 font-medium text-lg'>Logged State Not Found</div>
<div className='text-sm'>
This log was migrated from the old system and doesn't contain workflow state data.
</div>
</div>
</div>
)
}
const blocksStructure = useMemo(
() => ({
count: Object.keys(workflowState.blocks || {}).length,
@@ -84,8 +100,8 @@ export function WorkflowPreview({
const edgesStructure = useMemo(
() => ({
count: workflowState.edges.length,
ids: workflowState.edges.map((e) => e.id).join(','),
count: workflowState.edges?.length || 0,
ids: workflowState.edges?.map((e) => e.id).join(',') || '',
}),
[workflowState.edges]
)
@@ -115,7 +131,7 @@ export function WorkflowPreview({
const nodes: Node[] = useMemo(() => {
const nodeArray: Node[] = []
Object.entries(workflowState.blocks).forEach(([blockId, block]) => {
Object.entries(workflowState.blocks || {}).forEach(([blockId, block]) => {
if (!block || !block.type) {
logger.warn(`Skipping invalid block: ${blockId}`)
return
@@ -186,7 +202,7 @@ export function WorkflowPreview({
})
if (block.type === 'loop') {
const childBlocks = Object.entries(workflowState.blocks).filter(
const childBlocks = Object.entries(workflowState.blocks || {}).filter(
([_, childBlock]) => childBlock.data?.parentId === blockId
)
@@ -223,7 +239,7 @@ export function WorkflowPreview({
}, [blocksStructure, loopsStructure, parallelsStructure, showSubBlocks, workflowState.blocks])
const edges: Edge[] = useMemo(() => {
return workflowState.edges.map((edge) => ({
return (workflowState.edges || []).map((edge) => ({
id: edge.id,
source: edge.source,
target: edge.target,


@@ -31,6 +31,18 @@ export const RedditBlock: BlockConfig<
],
},
// Reddit OAuth Authentication
{
id: 'credential',
title: 'Reddit Account',
type: 'oauth-input',
layout: 'full',
provider: 'reddit',
serviceId: 'reddit',
requiredScopes: ['identity', 'read'],
placeholder: 'Select Reddit account',
},
// Common fields - appear for all actions
{
id: 'subreddit',
@@ -151,27 +163,31 @@ export const RedditBlock: BlockConfig<
},
params: (inputs) => {
const action = inputs.action || 'get_posts'
const { credential, ...rest } = inputs
if (action === 'get_comments') {
return {
postId: inputs.postId,
subreddit: inputs.subreddit,
sort: inputs.commentSort,
limit: inputs.commentLimit ? Number.parseInt(inputs.commentLimit) : undefined,
postId: rest.postId,
subreddit: rest.subreddit,
sort: rest.commentSort,
limit: rest.commentLimit ? Number.parseInt(rest.commentLimit) : undefined,
credential: credential,
}
}
return {
subreddit: inputs.subreddit,
sort: inputs.sort,
limit: inputs.limit ? Number.parseInt(inputs.limit) : undefined,
time: inputs.sort === 'top' ? inputs.time : undefined,
subreddit: rest.subreddit,
sort: rest.sort,
limit: rest.limit ? Number.parseInt(rest.limit) : undefined,
time: rest.sort === 'top' ? rest.time : undefined,
credential: credential,
}
},
},
},
inputs: {
action: { type: 'string', required: true },
credential: { type: 'string', required: true },
subreddit: { type: 'string', required: true },
sort: { type: 'string', required: true },
time: { type: 'string', required: false },


@@ -1,7 +1,8 @@
import { describe, expect, it, test, vi } from 'vitest'
import { extractFieldsFromSchema, parseResponseFormatSafely } from '@/lib/response-format'
import type { BlockState } from '@/stores/workflows/workflow/types'
import { generateLoopBlocks } from '@/stores/workflows/workflow/utils'
import { checkTagTrigger, extractFieldsFromSchema } from './tag-dropdown'
import { checkTagTrigger } from './tag-dropdown'
vi.mock('@/stores/workflows/workflow/store', () => ({
useWorkflowStore: vi.fn(() => ({
@@ -24,6 +25,15 @@ vi.mock('@/stores/panel/variables/store', () => ({
})),
}))
vi.mock('@/stores/workflows/subblock/store', () => ({
useSubBlockStore: vi.fn(() => ({
getValue: vi.fn(() => null),
getState: vi.fn(() => ({
getValue: vi.fn(() => null),
})),
})),
}))
describe('TagDropdown Loop Suggestions', () => {
test('should generate correct loop suggestions for forEach loops', () => {
const blocks: Record<string, BlockState> = {
@@ -603,3 +613,180 @@ describe('TagDropdown Tag Selection Logic', () => {
})
})
})
describe('TagDropdown Response Format Support', () => {
it.concurrent(
'should use custom schema properties when response format is specified',
async () => {
// Mock the subblock store to return a custom response format
const mockGetValue = vi.fn()
const mockUseSubBlockStore = vi.mocked(
await import('@/stores/workflows/subblock/store')
).useSubBlockStore
// Set up the mock to return the example schema from the user
const responseFormatValue = JSON.stringify({
name: 'short_schema',
description: 'A minimal example schema with a single string property.',
strict: true,
schema: {
type: 'object',
properties: {
example_property: {
type: 'string',
description: 'A simple string property.',
},
},
additionalProperties: false,
required: ['example_property'],
},
})
mockGetValue.mockImplementation((blockId: string, subBlockId: string) => {
if (blockId === 'agent1' && subBlockId === 'responseFormat') {
return responseFormatValue
}
return null
})
mockUseSubBlockStore.mockReturnValue({
getValue: mockGetValue,
getState: () => ({
getValue: mockGetValue,
}),
} as any)
// Test the parseResponseFormatSafely function
const parsedFormat = parseResponseFormatSafely(responseFormatValue, 'agent1')
expect(parsedFormat).toEqual({
name: 'short_schema',
description: 'A minimal example schema with a single string property.',
strict: true,
schema: {
type: 'object',
properties: {
example_property: {
type: 'string',
description: 'A simple string property.',
},
},
additionalProperties: false,
required: ['example_property'],
},
})
// Test the extractFieldsFromSchema function with the parsed format
const fields = extractFieldsFromSchema(parsedFormat)
expect(fields).toEqual([
{
name: 'example_property',
type: 'string',
description: 'A simple string property.',
},
])
}
)
it.concurrent(
'should fall back to default outputs when response format parsing fails',
async () => {
// Test with invalid JSON
const invalidFormat = parseResponseFormatSafely('invalid json', 'agent1')
expect(invalidFormat).toBeNull()
// Test with null/undefined values
expect(parseResponseFormatSafely(null, 'agent1')).toBeNull()
expect(parseResponseFormatSafely(undefined, 'agent1')).toBeNull()
expect(parseResponseFormatSafely('', 'agent1')).toBeNull()
}
)
it.concurrent('should handle response format with nested schema correctly', async () => {
const responseFormat = {
schema: {
type: 'object',
properties: {
user: {
type: 'object',
description: 'User information',
properties: {
name: { type: 'string', description: 'User name' },
age: { type: 'number', description: 'User age' },
},
},
status: { type: 'string', description: 'Response status' },
},
},
}
const fields = extractFieldsFromSchema(responseFormat)
expect(fields).toEqual([
{ name: 'user', type: 'object', description: 'User information' },
{ name: 'status', type: 'string', description: 'Response status' },
])
})
it.concurrent('should handle response format without schema wrapper', async () => {
const responseFormat = {
type: 'object',
properties: {
result: { type: 'boolean', description: 'Operation result' },
message: { type: 'string', description: 'Status message' },
},
}
const fields = extractFieldsFromSchema(responseFormat)
expect(fields).toEqual([
{ name: 'result', type: 'boolean', description: 'Operation result' },
{ name: 'message', type: 'string', description: 'Status message' },
])
})
it.concurrent('should return object as-is when it is already parsed', async () => {
const responseFormat = {
name: 'test_schema',
schema: {
properties: {
data: { type: 'string' },
},
},
}
const result = parseResponseFormatSafely(responseFormat, 'agent1')
expect(result).toEqual(responseFormat)
})
it.concurrent('should simulate block tag generation with custom response format', async () => {
// Simulate the tag generation logic that would happen in the component
const blockName = 'Agent 1'
const normalizedBlockName = blockName.replace(/\s+/g, '').toLowerCase() // 'agent1'
// Mock response format
const responseFormat = {
schema: {
properties: {
example_property: { type: 'string', description: 'A simple string property.' },
another_field: { type: 'number', description: 'Another field.' },
},
},
}
const schemaFields = extractFieldsFromSchema(responseFormat)
// Generate block tags as they would be in the component
const blockTags = schemaFields.map((field) => `${normalizedBlockName}.${field.name}`)
expect(blockTags).toEqual(['agent1.example_property', 'agent1.another_field'])
// Verify the fields extracted correctly
expect(schemaFields).toEqual([
{ name: 'example_property', type: 'string', description: 'A simple string property.' },
{ name: 'another_field', type: 'number', description: 'Another field.' },
])
})
})


@@ -1,18 +1,16 @@
import type React from 'react'
import { useCallback, useEffect, useMemo, useState } from 'react'
import { BlockPathCalculator } from '@/lib/block-path-calculator'
import { createLogger } from '@/lib/logs/console-logger'
import { extractFieldsFromSchema, parseResponseFormatSafely } from '@/lib/response-format'
import { cn } from '@/lib/utils'
import { getBlock } from '@/blocks'
import { Serializer } from '@/serializer'
import { useVariablesStore } from '@/stores/panel/variables/store'
import type { Variable } from '@/stores/panel/variables/types'
import { useWorkflowRegistry } from '@/stores/workflows/registry/store'
import { useSubBlockStore } from '@/stores/workflows/subblock/store'
import { useWorkflowStore } from '@/stores/workflows/workflow/store'
const logger = createLogger('TagDropdown')
// Type definitions for component data structures
interface BlockTagGroup {
blockName: string
blockId: string
@@ -21,49 +19,6 @@ interface BlockTagGroup {
distance: number
}
interface Field {
name: string
type: string
description?: string
}
// Helper function to extract fields from JSON Schema
export function extractFieldsFromSchema(schema: any): Field[] {
if (!schema || typeof schema !== 'object') {
return []
}
// Handle legacy format with fields array
if (Array.isArray(schema.fields)) {
return schema.fields
}
// Handle new JSON Schema format
const schemaObj = schema.schema || schema
if (!schemaObj || !schemaObj.properties || typeof schemaObj.properties !== 'object') {
return []
}
// Extract fields from schema properties
return Object.entries(schemaObj.properties).map(([name, prop]: [string, any]) => {
// Handle array format like ['string', 'array']
if (Array.isArray(prop)) {
return {
name,
type: prop.includes('array') ? 'array' : prop[0] || 'string',
description: undefined,
}
}
// Handle object format like { type: 'string', description: '...' }
return {
name,
type: prop.type || 'string',
description: prop.description,
}
})
}
interface TagDropdownProps {
visible: boolean
onSelect: (newValue: string) => void
@@ -169,18 +124,68 @@ export const TagDropdown: React.FC<TagDropdownProps> = ({
}
const blockConfig = getBlock(sourceBlock.type)
// Handle special blocks that aren't in the registry (loop and parallel)
if (!blockConfig) {
if (sourceBlock.type === 'loop' || sourceBlock.type === 'parallel') {
// Create a mock config with results output for loop/parallel blocks
const mockConfig = {
outputs: {
results: 'array', // These blocks have a results array output
},
}
const blockName = sourceBlock.name || sourceBlock.type
const normalizedBlockName = blockName.replace(/\s+/g, '').toLowerCase()
// Generate output paths for the mock config
const outputPaths = generateOutputPaths(mockConfig.outputs)
const blockTags = outputPaths.map((path) => `${normalizedBlockName}.${path}`)
const blockTagGroups: BlockTagGroup[] = [
{
blockName,
blockId: activeSourceBlockId,
blockType: sourceBlock.type,
tags: blockTags,
distance: 0,
},
]
return {
tags: blockTags,
variableInfoMap: {},
blockTagGroups,
}
}
return { tags: [], variableInfoMap: {}, blockTagGroups: [] }
}
const blockName = sourceBlock.name || sourceBlock.type
const normalizedBlockName = blockName.replace(/\s+/g, '').toLowerCase()
// Handle blocks with no outputs (like starter) - show as just <blockname>
// Check for custom response format first
const responseFormatValue = useSubBlockStore
.getState()
.getValue(activeSourceBlockId, 'responseFormat')
const responseFormat = parseResponseFormatSafely(responseFormatValue, activeSourceBlockId)
let blockTags: string[]
if (Object.keys(blockConfig.outputs).length === 0) {
if (responseFormat) {
// Use custom schema properties if response format is specified
const schemaFields = extractFieldsFromSchema(responseFormat)
if (schemaFields.length > 0) {
blockTags = schemaFields.map((field) => `${normalizedBlockName}.${field.name}`)
} else {
// Fallback to default if schema extraction failed
const outputPaths = generateOutputPaths(blockConfig.outputs)
blockTags = outputPaths.map((path) => `${normalizedBlockName}.${path}`)
}
} else if (Object.keys(blockConfig.outputs).length === 0) {
// Handle blocks with no outputs (like starter) - show as just <blockname>
blockTags = [normalizedBlockName]
} else {
// Use default block outputs
const outputPaths = generateOutputPaths(blockConfig.outputs)
blockTags = outputPaths.map((path) => `${normalizedBlockName}.${path}`)
}
@@ -270,28 +275,65 @@ export const TagDropdown: React.FC<TagDropdownProps> = ({
{} as Record<string, { type: string; id: string }>
)
// Generate loop tags if current block is in a loop
const loopTags: string[] = []
// Generate loop contextual block group if current block is in a loop
let loopBlockGroup: BlockTagGroup | null = null
const containingLoop = Object.entries(loops).find(([_, loop]) => loop.nodes.includes(blockId))
let containingLoopBlockId: string | null = null
if (containingLoop) {
const [_loopId, loop] = containingLoop
const [loopId, loop] = containingLoop
containingLoopBlockId = loopId
const loopType = loop.loopType || 'for'
loopTags.push('loop.index')
const contextualTags: string[] = ['index']
if (loopType === 'forEach') {
loopTags.push('loop.currentItem')
loopTags.push('loop.items')
contextualTags.push('currentItem')
contextualTags.push('items')
}
// Add the containing loop block's results to the contextual tags
const containingLoopBlock = blocks[loopId]
if (containingLoopBlock) {
const loopBlockName = containingLoopBlock.name || containingLoopBlock.type
const normalizedLoopBlockName = loopBlockName.replace(/\s+/g, '').toLowerCase()
contextualTags.push(`${normalizedLoopBlockName}.results`)
// Create a block group for the loop contextual tags
loopBlockGroup = {
blockName: loopBlockName,
blockId: loopId,
blockType: 'loop',
tags: contextualTags,
distance: 0, // Contextual tags have highest priority
}
}
}
// Generate parallel tags if current block is in parallel
const parallelTags: string[] = []
// Generate parallel contextual block group if current block is in parallel
let parallelBlockGroup: BlockTagGroup | null = null
const containingParallel = Object.entries(parallels || {}).find(([_, parallel]) =>
parallel.nodes.includes(blockId)
)
let containingParallelBlockId: string | null = null
if (containingParallel) {
parallelTags.push('parallel.index')
parallelTags.push('parallel.currentItem')
parallelTags.push('parallel.items')
const [parallelId] = containingParallel
containingParallelBlockId = parallelId
const contextualTags: string[] = ['index', 'currentItem', 'items']
// Add the containing parallel block's results to the contextual tags
const containingParallelBlock = blocks[parallelId]
if (containingParallelBlock) {
const parallelBlockName = containingParallelBlock.name || containingParallelBlock.type
const normalizedParallelBlockName = parallelBlockName.replace(/\s+/g, '').toLowerCase()
contextualTags.push(`${normalizedParallelBlockName}.results`)
// Create a block group for the parallel contextual tags
parallelBlockGroup = {
blockName: parallelBlockName,
blockId: parallelId,
blockType: 'parallel',
tags: contextualTags,
distance: 0, // Contextual tags have highest priority
}
}
}
// Create block tag groups from accessible blocks
@@ -303,16 +345,70 @@ export const TagDropdown: React.FC<TagDropdownProps> = ({
if (!accessibleBlock) continue
const blockConfig = getBlock(accessibleBlock.type)
if (!blockConfig) continue
// Handle special blocks that aren't in the registry (loop and parallel)
if (!blockConfig) {
// For loop and parallel blocks, create a mock config with results output
if (accessibleBlock.type === 'loop' || accessibleBlock.type === 'parallel') {
// Skip this block if it's the containing loop/parallel block - we'll handle it with contextual tags
if (
accessibleBlockId === containingLoopBlockId ||
accessibleBlockId === containingParallelBlockId
) {
continue
}
const mockConfig = {
outputs: {
results: 'array', // These blocks have a results array output
},
}
const blockName = accessibleBlock.name || accessibleBlock.type
const normalizedBlockName = blockName.replace(/\s+/g, '').toLowerCase()
// Generate output paths for the mock config
const outputPaths = generateOutputPaths(mockConfig.outputs)
const blockTags = outputPaths.map((path) => `${normalizedBlockName}.${path}`)
blockTagGroups.push({
blockName,
blockId: accessibleBlockId,
blockType: accessibleBlock.type,
tags: blockTags,
distance: blockDistances[accessibleBlockId] || 0,
})
allBlockTags.push(...blockTags)
}
continue
}
const blockName = accessibleBlock.name || accessibleBlock.type
const normalizedBlockName = blockName.replace(/\s+/g, '').toLowerCase()
// Handle blocks with no outputs (like starter) - show as just <blockname>
// Check for custom response format first
const responseFormatValue = useSubBlockStore
.getState()
.getValue(accessibleBlockId, 'responseFormat')
const responseFormat = parseResponseFormatSafely(responseFormatValue, accessibleBlockId)
let blockTags: string[]
if (Object.keys(blockConfig.outputs).length === 0) {
if (responseFormat) {
// Use custom schema properties if response format is specified
const schemaFields = extractFieldsFromSchema(responseFormat)
if (schemaFields.length > 0) {
blockTags = schemaFields.map((field) => `${normalizedBlockName}.${field.name}`)
} else {
// Fallback to default if schema extraction failed
const outputPaths = generateOutputPaths(blockConfig.outputs)
blockTags = outputPaths.map((path) => `${normalizedBlockName}.${path}`)
}
} else if (Object.keys(blockConfig.outputs).length === 0) {
// Handle blocks with no outputs (like starter) - show as just <blockname>
blockTags = [normalizedBlockName]
} else {
// Use default block outputs
const outputPaths = generateOutputPaths(blockConfig.outputs)
blockTags = outputPaths.map((path) => `${normalizedBlockName}.${path}`)
}
@@ -328,13 +424,32 @@ export const TagDropdown: React.FC<TagDropdownProps> = ({
allBlockTags.push(...blockTags)
}
// Sort block groups by distance (closest first)
// Add contextual block groups at the beginning (they have highest priority)
const finalBlockTagGroups: BlockTagGroup[] = []
if (loopBlockGroup) {
finalBlockTagGroups.push(loopBlockGroup)
}
if (parallelBlockGroup) {
finalBlockTagGroups.push(parallelBlockGroup)
}
// Sort regular block groups by distance (closest first) and add them
blockTagGroups.sort((a, b) => a.distance - b.distance)
finalBlockTagGroups.push(...blockTagGroups)
// Collect all tags for the main tags array
const contextualTags: string[] = []
if (loopBlockGroup) {
contextualTags.push(...loopBlockGroup.tags)
}
if (parallelBlockGroup) {
contextualTags.push(...parallelBlockGroup.tags)
}
return {
tags: [...variableTags, ...loopTags, ...parallelTags, ...allBlockTags],
tags: [...variableTags, ...contextualTags, ...allBlockTags],
variableInfoMap,
blockTagGroups,
blockTagGroups: finalBlockTagGroups,
}
}, [blocks, edges, loops, parallels, blockId, activeSourceBlockId, workflowVariables])
@@ -345,18 +460,12 @@ export const TagDropdown: React.FC<TagDropdownProps> = ({
}, [tags, searchTerm])
// Group filtered tags by category
const { variableTags, loopTags, parallelTags, filteredBlockTagGroups } = useMemo(() => {
const { variableTags, filteredBlockTagGroups } = useMemo(() => {
const varTags: string[] = []
const loopTags: string[] = []
const parTags: string[] = []
filteredTags.forEach((tag) => {
if (tag.startsWith('variable.')) {
varTags.push(tag)
} else if (tag.startsWith('loop.')) {
loopTags.push(tag)
} else if (tag.startsWith('parallel.')) {
parTags.push(tag)
}
})
@@ -370,8 +479,6 @@ export const TagDropdown: React.FC<TagDropdownProps> = ({
return {
variableTags: varTags,
loopTags: loopTags,
parallelTags: parTags,
filteredBlockTagGroups,
}
}, [filteredTags, blockTagGroups, searchTerm])
@@ -379,8 +486,8 @@ export const TagDropdown: React.FC<TagDropdownProps> = ({
// Create ordered tags for keyboard navigation
const orderedTags = useMemo(() => {
const allBlockTags = filteredBlockTagGroups.flatMap((group) => group.tags)
return [...variableTags, ...loopTags, ...parallelTags, ...allBlockTags]
}, [variableTags, loopTags, parallelTags, filteredBlockTagGroups])
return [...variableTags, ...allBlockTags]
}, [variableTags, filteredBlockTagGroups])
// Create efficient tag index lookup map
const tagIndexMap = useMemo(() => {
@@ -393,7 +500,7 @@ export const TagDropdown: React.FC<TagDropdownProps> = ({
// Handle tag selection and text replacement
const handleTagSelect = useCallback(
(tag: string) => {
(tag: string, blockGroup?: BlockTagGroup) => {
const textBeforeCursor = inputValue.slice(0, cursorPosition)
const textAfterCursor = inputValue.slice(cursorPosition)
@@ -401,8 +508,10 @@ export const TagDropdown: React.FC<TagDropdownProps> = ({
const lastOpenBracket = textBeforeCursor.lastIndexOf('<')
if (lastOpenBracket === -1) return
// Process variable tags to maintain compatibility
// Process different types of tags
let processedTag = tag
// Handle variable tags
if (tag.startsWith('variable.')) {
const variableName = tag.substring('variable.'.length)
const variableObj = Object.values(variables).find(
@@ -413,6 +522,19 @@ export const TagDropdown: React.FC<TagDropdownProps> = ({
processedTag = tag
}
}
// Handle contextual loop/parallel tags
else if (
blockGroup &&
(blockGroup.blockType === 'loop' || blockGroup.blockType === 'parallel')
) {
// Check if this is a contextual tag (without dots) that needs a prefix
if (!tag.includes('.') && ['index', 'currentItem', 'items'].includes(tag)) {
processedTag = `${blockGroup.blockType}.${tag}`
} else {
// It's already a properly formatted tag (like blockname.results)
processedTag = tag
}
}
// Handle existing closing bracket
const nextCloseBracket = textAfterCursor.indexOf('>')
@@ -465,7 +587,12 @@ export const TagDropdown: React.FC<TagDropdownProps> = ({
e.preventDefault()
e.stopPropagation()
if (selectedIndex >= 0 && selectedIndex < orderedTags.length) {
handleTagSelect(orderedTags[selectedIndex])
const selectedTag = orderedTags[selectedIndex]
// Find which block group this tag belongs to
const belongsToGroup = filteredBlockTagGroups.find((group) =>
group.tags.includes(selectedTag)
)
handleTagSelect(selectedTag, belongsToGroup)
}
break
case 'Escape':
@@ -479,7 +606,7 @@ export const TagDropdown: React.FC<TagDropdownProps> = ({
window.addEventListener('keydown', handleKeyboardEvent, true)
return () => window.removeEventListener('keydown', handleKeyboardEvent, true)
}
}, [visible, selectedIndex, orderedTags, handleTagSelect, onClose])
}, [visible, selectedIndex, orderedTags, filteredBlockTagGroups, handleTagSelect, onClose])
// Early return if dropdown should not be visible
if (!visible || tags.length === 0 || orderedTags.length === 0) return null
@@ -552,152 +679,21 @@ export const TagDropdown: React.FC<TagDropdownProps> = ({
</>
)}
{/* Loop section */}
{loopTags.length > 0 && (
<>
{variableTags.length > 0 && <div className='my-0' />}
<div className='px-2 pt-2.5 pb-0.5 font-medium text-muted-foreground text-xs'>
Loop
</div>
<div className='-mx-1 -px-1'>
{loopTags.map((tag: string) => {
const tagIndex = tagIndexMap.get(tag) ?? -1
const loopProperty = tag.split('.')[1]
// Choose appropriate icon and description based on loop property
let tagIcon = 'L'
let tagDescription = ''
const bgColor = '#8857E6'
if (loopProperty === 'currentItem') {
tagIcon = 'i'
tagDescription = 'Current item'
} else if (loopProperty === 'items') {
tagIcon = 'I'
tagDescription = 'All items'
} else if (loopProperty === 'index') {
tagIcon = '#'
tagDescription = 'Index'
}
return (
<button
key={tag}
className={cn(
'flex w-full items-center gap-2 px-3 py-1.5 text-left text-sm',
'hover:bg-accent hover:text-accent-foreground',
'focus:bg-accent focus:text-accent-foreground focus:outline-none',
tagIndex === selectedIndex &&
tagIndex >= 0 &&
'bg-accent text-accent-foreground'
)}
onMouseEnter={() => setSelectedIndex(tagIndex >= 0 ? tagIndex : 0)}
onMouseDown={(e) => {
e.preventDefault()
e.stopPropagation()
handleTagSelect(tag)
}}
onClick={(e) => {
e.preventDefault()
e.stopPropagation()
handleTagSelect(tag)
}}
>
<div
className='flex h-5 w-5 items-center justify-center rounded'
style={{ backgroundColor: bgColor }}
>
<span className='h-3 w-3 font-bold text-white text-xs'>{tagIcon}</span>
</div>
<span className='flex-1 truncate'>{tag}</span>
<span className='ml-auto text-muted-foreground text-xs'>
{tagDescription}
</span>
</button>
)
})}
</div>
</>
)}
{/* Parallel section */}
{parallelTags.length > 0 && (
<>
{loopTags.length > 0 && <div className='my-0' />}
<div className='px-2 pt-2.5 pb-0.5 font-medium text-muted-foreground text-xs'>
Parallel
</div>
<div className='-mx-1 -px-1'>
{parallelTags.map((tag: string) => {
const tagIndex = tagIndexMap.get(tag) ?? -1
const parallelProperty = tag.split('.')[1]
// Choose appropriate icon and description based on parallel property
let tagIcon = 'P'
let tagDescription = ''
const bgColor = '#FF5757'
if (parallelProperty === 'currentItem') {
tagIcon = 'i'
tagDescription = 'Current item'
} else if (parallelProperty === 'items') {
tagIcon = 'I'
tagDescription = 'All items'
} else if (parallelProperty === 'index') {
tagIcon = '#'
tagDescription = 'Index'
}
return (
<button
key={tag}
className={cn(
'flex w-full items-center gap-2 px-3 py-1.5 text-left text-sm',
'hover:bg-accent hover:text-accent-foreground',
'focus:bg-accent focus:text-accent-foreground focus:outline-none',
tagIndex === selectedIndex &&
tagIndex >= 0 &&
'bg-accent text-accent-foreground'
)}
onMouseEnter={() => setSelectedIndex(tagIndex >= 0 ? tagIndex : 0)}
onMouseDown={(e) => {
e.preventDefault()
e.stopPropagation()
handleTagSelect(tag)
}}
onClick={(e) => {
e.preventDefault()
e.stopPropagation()
handleTagSelect(tag)
}}
>
<div
className='flex h-5 w-5 items-center justify-center rounded'
style={{ backgroundColor: bgColor }}
>
<span className='h-3 w-3 font-bold text-white text-xs'>{tagIcon}</span>
</div>
<span className='flex-1 truncate'>{tag}</span>
<span className='ml-auto text-muted-foreground text-xs'>
{tagDescription}
</span>
</button>
)
})}
</div>
</>
)}
{/* Block sections */}
{filteredBlockTagGroups.length > 0 && (
<>
{(variableTags.length > 0 || loopTags.length > 0 || parallelTags.length > 0) && (
<div className='my-0' />
)}
{variableTags.length > 0 && <div className='my-0' />}
{filteredBlockTagGroups.map((group) => {
// Get block color from configuration
const blockConfig = getBlock(group.blockType)
const blockColor = blockConfig?.bgColor || '#2F55FF'
let blockColor = blockConfig?.bgColor || '#2F55FF'
// Handle special colors for loop and parallel blocks
if (group.blockType === 'loop') {
blockColor = '#8857E6' // Purple color for loop blocks
} else if (group.blockType === 'parallel') {
blockColor = '#FF5757' // Red color for parallel blocks
}
return (
<div key={group.blockId}>
@@ -707,11 +703,37 @@ export const TagDropdown: React.FC<TagDropdownProps> = ({
<div>
{group.tags.map((tag: string) => {
const tagIndex = tagIndexMap.get(tag) ?? -1
// Handle display text based on tag type
let displayText: string
let tagDescription = ''
let tagIcon = group.blockName.charAt(0).toUpperCase()
if (
(group.blockType === 'loop' || group.blockType === 'parallel') &&
!tag.includes('.')
) {
// Contextual tags like 'index', 'currentItem', 'items'
displayText = tag
if (tag === 'index') {
tagIcon = '#'
tagDescription = 'Index'
} else if (tag === 'currentItem') {
tagIcon = 'i'
tagDescription = 'Current item'
} else if (tag === 'items') {
tagIcon = 'I'
tagDescription = 'All items'
}
} else {
// Regular block output tags like 'blockname.field' or 'blockname.results'
const tagParts = tag.split('.')
const path = tagParts.slice(1).join('.')
displayText = path || group.blockName
if (path === 'results') {
tagDescription = 'Results array'
}
}
return (
<button
@@ -728,12 +750,12 @@ export const TagDropdown: React.FC<TagDropdownProps> = ({
onMouseDown={(e) => {
e.preventDefault()
e.stopPropagation()
handleTagSelect(tag, group)
}}
onClick={(e) => {
e.preventDefault()
e.stopPropagation()
handleTagSelect(tag, group)
}}
>
<div
@@ -741,12 +763,15 @@ export const TagDropdown: React.FC<TagDropdownProps> = ({
style={{ backgroundColor: blockColor }}
>
<span className='h-3 w-3 font-bold text-white text-xs'>
{tagIcon}
</span>
</div>
<span className='flex-1 truncate'>{displayText}</span>
{tagDescription && (
<span className='ml-auto text-muted-foreground text-xs'>
{tagDescription}
</span>
)}
</button>
)
})}


@@ -50,6 +50,7 @@ interface SocketContextType {
onUserJoined: (handler: (data: any) => void) => void
onUserLeft: (handler: (data: any) => void) => void
onWorkflowDeleted: (handler: (data: any) => void) => void
onWorkflowReverted: (handler: (data: any) => void) => void
}
const SocketContext = createContext<SocketContextType>({
@@ -71,6 +72,7 @@ const SocketContext = createContext<SocketContextType>({
onUserJoined: () => {},
onUserLeft: () => {},
onWorkflowDeleted: () => {},
onWorkflowReverted: () => {},
})
export const useSocket = () => useContext(SocketContext)
@@ -100,6 +102,7 @@ export function SocketProvider({ children, user }: SocketProviderProps) {
userJoined?: (data: any) => void
userLeft?: (data: any) => void
workflowDeleted?: (data: any) => void
workflowReverted?: (data: any) => void
}>({})
// Helper function to generate a fresh socket token
@@ -147,9 +150,9 @@ export function SocketProvider({ children, user }: SocketProviderProps) {
const socketInstance = io(socketUrl, {
transports: ['websocket', 'polling'], // Keep polling fallback for reliability
withCredentials: true,
reconnectionAttempts: Number.POSITIVE_INFINITY, // Socket.IO handles base reconnection
reconnectionDelay: 1000, // Start with 1 second delay
reconnectionDelayMax: 30000, // Max 30 second delay
timeout: 10000, // Back to original timeout
auth: (cb) => {
// Generate a fresh token for each connection attempt (including reconnections)
@@ -281,6 +284,12 @@ export function SocketProvider({ children, user }: SocketProviderProps) {
eventHandlers.current.workflowDeleted?.(data)
})
// Workflow revert events
socketInstance.on('workflow-reverted', (data) => {
logger.info(`Workflow ${data.workflowId} has been reverted to deployed state`)
eventHandlers.current.workflowReverted?.(data)
})
// Cursor update events
socketInstance.on('cursor-update', (data) => {
setPresenceUsers((prev) =>
@@ -557,6 +566,10 @@ export function SocketProvider({ children, user }: SocketProviderProps) {
eventHandlers.current.workflowDeleted = handler
}, [])
const onWorkflowReverted = useCallback((handler: (data: any) => void) => {
eventHandlers.current.workflowReverted = handler
}, [])
return (
<SocketContext.Provider
value={{
@@ -578,6 +591,7 @@ export function SocketProvider({ children, user }: SocketProviderProps) {
onUserJoined,
onUserLeft,
onWorkflowDeleted,
onWorkflowReverted,
}}
>
{children}


@@ -736,7 +736,29 @@ describe('AgentBlockHandler', () => {
})
})
it('should handle invalid JSON in responseFormat gracefully', async () => {
mockFetch.mockImplementationOnce(() => {
return Promise.resolve({
ok: true,
headers: {
get: (name: string) => {
if (name === 'Content-Type') return 'application/json'
if (name === 'X-Execution-Data') return null
return null
},
},
json: () =>
Promise.resolve({
content: 'Regular text response',
model: 'mock-model',
tokens: { prompt: 10, completion: 20, total: 30 },
timing: { total: 100 },
toolCalls: [],
cost: undefined,
}),
})
})
const inputs = {
model: 'gpt-4o',
userPrompt: 'Format this output.',
@@ -744,9 +766,60 @@ describe('AgentBlockHandler', () => {
responseFormat: '{invalid-json',
}
// Should not throw an error, but continue with default behavior
const result = await handler.execute(mockBlock, inputs, mockContext)
expect(result).toEqual({
content: 'Regular text response',
model: 'mock-model',
tokens: { prompt: 10, completion: 20, total: 30 },
toolCalls: { list: [], count: 0 },
providerTiming: { total: 100 },
cost: undefined,
})
})
it('should handle variable references in responseFormat gracefully', async () => {
mockFetch.mockImplementationOnce(() => {
return Promise.resolve({
ok: true,
headers: {
get: (name: string) => {
if (name === 'Content-Type') return 'application/json'
if (name === 'X-Execution-Data') return null
return null
},
},
json: () =>
Promise.resolve({
content: 'Regular text response',
model: 'mock-model',
tokens: { prompt: 10, completion: 20, total: 30 },
timing: { total: 100 },
toolCalls: [],
cost: undefined,
}),
})
})
const inputs = {
model: 'gpt-4o',
userPrompt: 'Format this output.',
apiKey: 'test-api-key',
responseFormat: '<start.input>',
}
// Should not throw an error, but continue with default behavior
const result = await handler.execute(mockBlock, inputs, mockContext)
expect(result).toEqual({
content: 'Regular text response',
model: 'mock-model',
tokens: { prompt: 10, completion: 20, total: 30 },
toolCalls: { list: [], count: 0 },
providerTiming: { total: 100 },
cost: undefined,
})
})
it('should handle errors from the provider request', async () => {


@@ -58,22 +58,63 @@ export class AgentBlockHandler implements BlockHandler {
private parseResponseFormat(responseFormat?: string | object): any {
if (!responseFormat || responseFormat === '') return undefined
try {
// If already an object, process it directly
if (typeof responseFormat === 'object' && responseFormat !== null) {
const formatObj = responseFormat as any
if (!formatObj.schema && !formatObj.name) {
return {
name: 'response_schema',
schema: responseFormat,
strict: true,
}
}
return responseFormat
}
// Handle string values
if (typeof responseFormat === 'string') {
const trimmedValue = responseFormat.trim()
// Check for variable references like <start.input>
if (trimmedValue.startsWith('<') && trimmedValue.includes('>')) {
logger.info('Response format contains variable reference:', {
value: trimmedValue,
})
// Variable references should have been resolved by the resolver before reaching here
// If we still have a variable reference, it means it couldn't be resolved
// Return undefined to use default behavior (no structured response)
return undefined
}
// Try to parse as JSON
try {
const parsed = JSON.parse(trimmedValue)
if (parsed && typeof parsed === 'object' && !parsed.schema && !parsed.name) {
return {
name: 'response_schema',
schema: parsed,
strict: true,
}
}
return parsed
} catch (error: any) {
logger.warn('Failed to parse response format as JSON, using default behavior:', {
error: error.message,
value: trimmedValue,
})
// Return undefined instead of throwing - this allows execution to continue
// without structured response format
return undefined
}
}
// For any other type, return undefined
logger.warn('Unexpected response format type, using default behavior:', {
type: typeof responseFormat,
value: responseFormat,
})
return undefined
}
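The new fallback behavior is easier to see in isolation. The following is a simplified, self-contained sketch of the parsing rules above (an illustrative re-implementation for reading the diff, not the class method itself):

```typescript
// Simplified sketch of parseResponseFormat's rules; names are illustrative.
function parseResponseFormatSketch(responseFormat?: string | object): any {
  if (!responseFormat || responseFormat === '') return undefined
  // Objects are used directly, wrapped in a schema envelope if bare.
  if (typeof responseFormat === 'object') {
    const obj = responseFormat as any
    return obj.schema || obj.name
      ? obj
      : { name: 'response_schema', schema: obj, strict: true }
  }
  const trimmed = (responseFormat as string).trim()
  // Unresolved variable references like <start.input> fall back to default behavior.
  if (trimmed.startsWith('<') && trimmed.includes('>')) return undefined
  try {
    const parsed = JSON.parse(trimmed)
    return parsed && typeof parsed === 'object' && !parsed.schema && !parsed.name
      ? { name: 'response_schema', schema: parsed, strict: true }
      : parsed
  } catch {
    // Invalid JSON no longer throws; execution continues unstructured.
    return undefined
  }
}
```

The key change versus the old code: both unresolved references and malformed JSON now return `undefined` instead of throwing, so the agent block degrades to an unstructured response.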
private async formatTools(inputTools: ToolInput[], context: ExecutionContext): Promise<any[]> {


@@ -30,6 +30,7 @@ import type {
NormalizedBlockOutput,
StreamingExecution,
} from './types'
import { streamingResponseFormatProcessor } from './utils'
const logger = createLogger('Executor')
@@ -242,7 +243,25 @@ export class Executor {
const streamingExec = output as StreamingExecution
const [streamForClient, streamForExecutor] = streamingExec.stream.tee()
// Apply response format processing to the client stream if needed
const blockId = (streamingExec.execution as any).blockId
// Get response format from initial block states (passed from useWorkflowExecution)
// The initialBlockStates contain the subblock values including responseFormat
let responseFormat: any
if (this.initialBlockStates?.[blockId]) {
const blockState = this.initialBlockStates[blockId] as any
responseFormat = blockState.responseFormat
}
const processedClientStream = streamingResponseFormatProcessor.processStream(
streamForClient,
blockId,
context.selectedOutputIds || [],
responseFormat
)
const clientStreamingExec = { ...streamingExec, stream: processedClientStream }
try {
// Handle client stream with proper error handling
@@ -267,7 +286,41 @@ export class Executor {
const blockId = (streamingExec.execution as any).blockId
const blockState = context.blockStates.get(blockId)
if (blockState?.output) {
// Check if we have response format - if so, preserve structured response
let responseFormat: any
if (this.initialBlockStates?.[blockId]) {
const initialBlockState = this.initialBlockStates[blockId] as any
responseFormat = initialBlockState.responseFormat
}
if (responseFormat && fullContent) {
// For structured responses, always try to parse the raw streaming content
// The streamForExecutor contains the raw JSON response, not the processed display text
try {
const parsedContent = JSON.parse(fullContent)
// Preserve metadata but spread parsed fields at root level (same as manual execution)
const structuredOutput = {
...parsedContent,
tokens: blockState.output.tokens,
toolCalls: blockState.output.toolCalls,
providerTiming: blockState.output.providerTiming,
cost: blockState.output.cost,
}
blockState.output = structuredOutput
// Also update the corresponding block log with the structured output
const blockLog = context.blockLogs.find((log) => log.blockId === blockId)
if (blockLog) {
blockLog.output = structuredOutput
}
} catch (parseError) {
// If parsing fails, fall back to setting content
blockState.output.content = fullContent
}
} else {
// No response format, use standard content setting
blockState.output.content = fullContent
}
}
} catch (readerError: any) {
logger.error('Error reading stream for executor:', readerError)
@@ -275,7 +328,40 @@ export class Executor {
const blockId = (streamingExec.execution as any).blockId
const blockState = context.blockStates.get(blockId)
if (blockState?.output && fullContent) {
// Check if we have response format for error handling too
let responseFormat: any
if (this.initialBlockStates?.[blockId]) {
const initialBlockState = this.initialBlockStates[blockId] as any
responseFormat = initialBlockState.responseFormat
}
if (responseFormat) {
// For structured responses, always try to parse the raw streaming content
// The streamForExecutor contains the raw JSON response, not the processed display text
try {
const parsedContent = JSON.parse(fullContent)
const structuredOutput = {
...parsedContent,
tokens: blockState.output.tokens,
toolCalls: blockState.output.toolCalls,
providerTiming: blockState.output.providerTiming,
cost: blockState.output.cost,
}
blockState.output = structuredOutput
// Also update the corresponding block log with the structured output
const blockLog = context.blockLogs.find((log) => log.blockId === blockId)
if (blockLog) {
blockLog.output = structuredOutput
}
} catch (parseError) {
// If parsing fails, fall back to setting content
blockState.output.content = fullContent
}
} else {
// No response format, use standard content setting
blockState.output.content = fullContent
}
}
} finally {
try {
@@ -1257,6 +1343,7 @@ export class Executor {
context.blockLogs.push(blockLog)
// Skip console logging for infrastructure blocks like loops and parallels
// For streaming blocks, we'll add the console entry after stream processing
if (block.metadata?.id !== 'loop' && block.metadata?.id !== 'parallel') {
addConsole({
output: blockLog.output,

View File

@@ -269,3 +269,15 @@ export interface Tool<P = any, O = Record<string, any>> {
export interface ToolRegistry {
[key: string]: Tool
}
/**
* Interface for a stream processor that can process a stream based on a response format.
*/
export interface ResponseFormatStreamProcessor {
processStream(
originalStream: ReadableStream,
blockId: string,
selectedOutputIds: string[],
responseFormat?: any
): ReadableStream
}


@@ -0,0 +1,354 @@
import { afterEach, beforeEach, describe, expect, it, vi } from 'vitest'
import { StreamingResponseFormatProcessor, streamingResponseFormatProcessor } from './utils'
vi.mock('@/lib/logs/console-logger', () => ({
createLogger: vi.fn().mockReturnValue({
debug: vi.fn(),
info: vi.fn(),
warn: vi.fn(),
error: vi.fn(),
}),
}))
describe('StreamingResponseFormatProcessor', () => {
let processor: StreamingResponseFormatProcessor
beforeEach(() => {
processor = new StreamingResponseFormatProcessor()
})
afterEach(() => {
vi.clearAllMocks()
})
describe('processStream', () => {
it.concurrent('should return original stream when no response format selection', async () => {
const mockStream = new ReadableStream({
start(controller) {
controller.enqueue(new TextEncoder().encode('{"content": "test"}'))
controller.close()
},
})
const result = processor.processStream(
mockStream,
'block-1',
['block-1.content'], // No underscore, not response format
{ schema: { properties: { username: { type: 'string' } } } }
)
expect(result).toBe(mockStream)
})
it.concurrent('should return original stream when no response format provided', async () => {
const mockStream = new ReadableStream({
start(controller) {
controller.enqueue(new TextEncoder().encode('{"content": "test"}'))
controller.close()
},
})
const result = processor.processStream(
mockStream,
'block-1',
['block-1_username'], // Has underscore but no response format
undefined
)
expect(result).toBe(mockStream)
})
it.concurrent('should process stream and extract single selected field', async () => {
const mockStream = new ReadableStream({
start(controller) {
controller.enqueue(new TextEncoder().encode('{"username": "alice", "age": 25}'))
controller.close()
},
})
const processedStream = processor.processStream(mockStream, 'block-1', ['block-1_username'], {
schema: { properties: { username: { type: 'string' }, age: { type: 'number' } } },
})
const reader = processedStream.getReader()
const decoder = new TextDecoder()
let result = ''
while (true) {
const { done, value } = await reader.read()
if (done) break
result += decoder.decode(value)
}
expect(result).toBe('alice')
})
it.concurrent('should process stream and extract multiple selected fields', async () => {
const mockStream = new ReadableStream({
start(controller) {
controller.enqueue(
new TextEncoder().encode('{"username": "bob", "age": 30, "email": "bob@test.com"}')
)
controller.close()
},
})
const processedStream = processor.processStream(
mockStream,
'block-1',
['block-1_username', 'block-1_age'],
{ schema: { properties: { username: { type: 'string' }, age: { type: 'number' } } } }
)
const reader = processedStream.getReader()
const decoder = new TextDecoder()
let result = ''
while (true) {
const { done, value } = await reader.read()
if (done) break
result += decoder.decode(value)
}
expect(result).toBe('bob\n30')
})
it.concurrent('should handle non-string field values by JSON stringifying them', async () => {
const mockStream = new ReadableStream({
start(controller) {
controller.enqueue(
new TextEncoder().encode(
'{"config": {"theme": "dark", "notifications": true}, "count": 42}'
)
)
controller.close()
},
})
const processedStream = processor.processStream(
mockStream,
'block-1',
['block-1_config', 'block-1_count'],
{ schema: { properties: { config: { type: 'object' }, count: { type: 'number' } } } }
)
const reader = processedStream.getReader()
const decoder = new TextDecoder()
let result = ''
while (true) {
const { done, value } = await reader.read()
if (done) break
result += decoder.decode(value)
}
expect(result).toBe('{"theme":"dark","notifications":true}\n42')
})
it.concurrent('should handle streaming JSON that comes in chunks', async () => {
const mockStream = new ReadableStream({
start(controller) {
// Simulate streaming JSON in chunks
controller.enqueue(new TextEncoder().encode('{"username": "charlie"'))
controller.enqueue(new TextEncoder().encode(', "age": 35}'))
controller.close()
},
})
const processedStream = processor.processStream(mockStream, 'block-1', ['block-1_username'], {
schema: { properties: { username: { type: 'string' }, age: { type: 'number' } } },
})
const reader = processedStream.getReader()
const decoder = new TextDecoder()
let result = ''
while (true) {
const { done, value } = await reader.read()
if (done) break
result += decoder.decode(value)
}
expect(result).toBe('charlie')
})
it.concurrent('should handle missing fields gracefully', async () => {
const mockStream = new ReadableStream({
start(controller) {
controller.enqueue(new TextEncoder().encode('{"username": "diana"}'))
controller.close()
},
})
const processedStream = processor.processStream(
mockStream,
'block-1',
['block-1_username', 'block-1_missing_field'],
{ schema: { properties: { username: { type: 'string' } } } }
)
const reader = processedStream.getReader()
const decoder = new TextDecoder()
let result = ''
while (true) {
const { done, value } = await reader.read()
if (done) break
result += decoder.decode(value)
}
expect(result).toBe('diana')
})
it.concurrent('should handle invalid JSON gracefully', async () => {
const mockStream = new ReadableStream({
start(controller) {
controller.enqueue(new TextEncoder().encode('invalid json'))
controller.close()
},
})
const processedStream = processor.processStream(mockStream, 'block-1', ['block-1_username'], {
schema: { properties: { username: { type: 'string' } } },
})
const reader = processedStream.getReader()
const decoder = new TextDecoder()
let result = ''
while (true) {
const { done, value } = await reader.read()
if (done) break
result += decoder.decode(value)
}
expect(result).toBe('')
})
it.concurrent('should filter selected fields for correct block ID', async () => {
const mockStream = new ReadableStream({
start(controller) {
controller.enqueue(new TextEncoder().encode('{"username": "eve", "age": 28}'))
controller.close()
},
})
const processedStream = processor.processStream(
mockStream,
'block-1',
['block-1_username', 'block-2_age'], // Different block ID should be filtered out
{ schema: { properties: { username: { type: 'string' }, age: { type: 'number' } } } }
)
const reader = processedStream.getReader()
const decoder = new TextDecoder()
let result = ''
while (true) {
const { done, value } = await reader.read()
if (done) break
result += decoder.decode(value)
}
expect(result).toBe('eve')
})
it.concurrent('should handle empty result when no matching fields', async () => {
const mockStream = new ReadableStream({
start(controller) {
controller.enqueue(new TextEncoder().encode('{"other_field": "value"}'))
controller.close()
},
})
const processedStream = processor.processStream(mockStream, 'block-1', ['block-1_username'], {
schema: { properties: { username: { type: 'string' } } },
})
const reader = processedStream.getReader()
const decoder = new TextDecoder()
let result = ''
while (true) {
const { done, value } = await reader.read()
if (done) break
result += decoder.decode(value)
}
expect(result).toBe('')
})
})
describe('singleton instance', () => {
it.concurrent('should export a singleton instance', () => {
expect(streamingResponseFormatProcessor).toBeInstanceOf(StreamingResponseFormatProcessor)
})
it.concurrent('should return the same instance on multiple imports', () => {
const instance1 = streamingResponseFormatProcessor
const instance2 = streamingResponseFormatProcessor
expect(instance1).toBe(instance2)
})
})
describe('edge cases', () => {
it.concurrent('should handle empty stream', async () => {
const mockStream = new ReadableStream({
start(controller) {
controller.close()
},
})
const processedStream = processor.processStream(mockStream, 'block-1', ['block-1_username'], {
schema: { properties: { username: { type: 'string' } } },
})
const reader = processedStream.getReader()
const decoder = new TextDecoder()
let result = ''
while (true) {
const { done, value } = await reader.read()
if (done) break
result += decoder.decode(value)
}
expect(result).toBe('')
})
it.concurrent('should handle very large JSON objects', async () => {
const largeObject = {
username: 'frank',
data: 'x'.repeat(10000), // Large string
nested: {
deep: {
value: 'test',
},
},
}
const mockStream = new ReadableStream({
start(controller) {
controller.enqueue(new TextEncoder().encode(JSON.stringify(largeObject)))
controller.close()
},
})
const processedStream = processor.processStream(mockStream, 'block-1', ['block-1_username'], {
schema: { properties: { username: { type: 'string' } } },
})
const reader = processedStream.getReader()
const decoder = new TextDecoder()
let result = ''
while (true) {
const { done, value } = await reader.read()
if (done) break
result += decoder.decode(value)
}
expect(result).toBe('frank')
})
})
})

apps/sim/executor/utils.ts · 201 lines · Normal file

@@ -0,0 +1,201 @@
import { createLogger } from '@/lib/logs/console-logger'
import type { ResponseFormatStreamProcessor } from './types'
const logger = createLogger('ExecutorUtils')
/**
* Processes a streaming response to extract only the selected response format fields
* instead of streaming the full JSON wrapper.
*/
export class StreamingResponseFormatProcessor implements ResponseFormatStreamProcessor {
processStream(
originalStream: ReadableStream,
blockId: string,
selectedOutputIds: string[],
responseFormat?: any
): ReadableStream {
// Check if this block has response format selected outputs
const hasResponseFormatSelection = selectedOutputIds.some((outputId) => {
const blockIdForOutput = outputId.includes('_')
? outputId.split('_')[0]
: outputId.split('.')[0]
return blockIdForOutput === blockId && outputId.includes('_')
})
// If no response format selection, return original stream unchanged
if (!hasResponseFormatSelection || !responseFormat) {
return originalStream
}
// Get the selected field names for this block
const selectedFields = selectedOutputIds
.filter((outputId) => {
const blockIdForOutput = outputId.includes('_')
? outputId.split('_')[0]
: outputId.split('.')[0]
return blockIdForOutput === blockId && outputId.includes('_')
})
.map((outputId) => outputId.substring(blockId.length + 1))
logger.info('Processing streaming response format', {
blockId,
selectedFields,
hasResponseFormat: !!responseFormat,
selectedFieldsCount: selectedFields.length,
})
return this.createProcessedStream(originalStream, selectedFields, blockId)
}
private createProcessedStream(
originalStream: ReadableStream,
selectedFields: string[],
blockId: string
): ReadableStream {
let buffer = ''
let hasProcessedComplete = false // Track if we've already processed the complete JSON
const self = this
return new ReadableStream({
async start(controller) {
const reader = originalStream.getReader()
const decoder = new TextDecoder()
try {
while (true) {
const { done, value } = await reader.read()
if (done) {
// Handle any remaining buffer at the end only if we haven't processed complete JSON yet
if (buffer.trim() && !hasProcessedComplete) {
self.processCompleteJson(buffer, selectedFields, controller)
}
controller.close()
break
}
const chunk = decoder.decode(value, { stream: true })
buffer += chunk
// Try to process the current buffer only if we haven't processed complete JSON yet
if (!hasProcessedComplete) {
const processedChunk = self.processStreamingChunk(buffer, selectedFields)
if (processedChunk) {
controller.enqueue(new TextEncoder().encode(processedChunk))
hasProcessedComplete = true // Mark as processed to prevent duplicate processing
}
}
}
} catch (error) {
logger.error('Error processing streaming response format:', { error, blockId })
controller.error(error)
} finally {
reader.releaseLock()
}
},
})
}
private processStreamingChunk(buffer: string, selectedFields: string[]): string | null {
// For streaming response format, we need to parse the JSON as it comes in
// and extract only the field values we care about
// Try to parse as complete JSON first
try {
const parsed = JSON.parse(buffer.trim())
if (typeof parsed === 'object' && parsed !== null) {
// We have a complete JSON object, extract the selected fields
// Process all selected fields and format them properly
const results: string[] = []
for (const field of selectedFields) {
if (field in parsed) {
const value = parsed[field]
const formattedValue = typeof value === 'string' ? value : JSON.stringify(value)
results.push(formattedValue)
}
}
if (results.length > 0) {
// Join multiple fields with newlines for readability
const result = results.join('\n')
return result
}
return null
}
} catch (e) {
// Not complete JSON yet, continue buffering
}
// For real-time extraction during streaming, we'd need more sophisticated parsing
// For now, let's handle the case where we receive chunks that might be partial JSON
// Simple heuristic: if buffer contains what looks like a complete JSON object
const openBraces = (buffer.match(/\{/g) || []).length
const closeBraces = (buffer.match(/\}/g) || []).length
if (openBraces > 0 && openBraces === closeBraces) {
// Likely a complete JSON object
try {
const parsed = JSON.parse(buffer.trim())
if (typeof parsed === 'object' && parsed !== null) {
// Process all selected fields and format them properly
const results: string[] = []
for (const field of selectedFields) {
if (field in parsed) {
const value = parsed[field]
const formattedValue = typeof value === 'string' ? value : JSON.stringify(value)
results.push(formattedValue)
}
}
if (results.length > 0) {
// Join multiple fields with newlines for readability
const result = results.join('\n')
return result
}
return null
}
} catch (e) {
// Still not valid JSON, continue
}
}
return null
}
private processCompleteJson(
buffer: string,
selectedFields: string[],
controller: ReadableStreamDefaultController
): void {
try {
const parsed = JSON.parse(buffer.trim())
if (typeof parsed === 'object' && parsed !== null) {
// Process all selected fields and format them properly
const results: string[] = []
for (const field of selectedFields) {
if (field in parsed) {
const value = parsed[field]
const formattedValue = typeof value === 'string' ? value : JSON.stringify(value)
results.push(formattedValue)
}
}
if (results.length > 0) {
// Join multiple fields with newlines for readability
const result = results.join('\n')
controller.enqueue(new TextEncoder().encode(result))
}
}
} catch (error) {
logger.warn('Failed to parse complete JSON in streaming processor:', { error })
}
}
}
// Create singleton instance
export const streamingResponseFormatProcessor = new StreamingResponseFormatProcessor()
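The `${blockId}_${field}` naming convention that `processStream` relies on can be illustrated in isolation (the IDs below are hypothetical examples, mirroring the test fixtures):

```typescript
// Hypothetical selected-output IDs following the `${blockId}_${field}` convention.
const blockId = 'block-1'
const selectedOutputIds = ['block-1_username', 'block-1_age', 'block-2_email']

// IDs with an underscore belong to response-format fields; dotted IDs are
// regular block outputs and are ignored here, as are other blocks' IDs.
const selectedFields = selectedOutputIds
  .filter((id) => id.includes('_') && id.split('_')[0] === blockId)
  .map((id) => id.substring(blockId.length + 1))

console.log(selectedFields) // ['username', 'age']
```

Only `block-1`'s underscore-style IDs survive the filter, and the field name is whatever follows the block ID and its separator.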


@@ -25,6 +25,7 @@ export function useCollaborativeWorkflow() {
onUserJoined,
onUserLeft,
onWorkflowDeleted,
onWorkflowReverted,
} = useSocket()
const { activeWorkflowId } = useWorkflowRegistry()
@@ -262,12 +263,80 @@ export function useCollaborativeWorkflow() {
}
}
const handleWorkflowReverted = async (data: any) => {
const { workflowId } = data
logger.info(`Workflow ${workflowId} has been reverted to deployed state`)
// If the reverted workflow is the currently active one, reload the workflow state
if (activeWorkflowId === workflowId) {
logger.info(`Currently active workflow ${workflowId} was reverted, reloading state`)
try {
// Fetch the updated workflow state from the server (which loads from normalized tables)
const response = await fetch(`/api/workflows/${workflowId}`)
if (response.ok) {
const responseData = await response.json()
const workflowData = responseData.data
if (workflowData?.state) {
// Update the workflow store with the reverted state
isApplyingRemoteChange.current = true
try {
// Update the main workflow state using the API response
useWorkflowStore.setState({
blocks: workflowData.state.blocks || {},
edges: workflowData.state.edges || [],
loops: workflowData.state.loops || {},
parallels: workflowData.state.parallels || {},
isDeployed: workflowData.state.isDeployed || false,
deployedAt: workflowData.state.deployedAt,
lastSaved: workflowData.state.lastSaved || Date.now(),
hasActiveSchedule: workflowData.state.hasActiveSchedule || false,
hasActiveWebhook: workflowData.state.hasActiveWebhook || false,
deploymentStatuses: workflowData.state.deploymentStatuses || {},
})
// Update subblock store with reverted values
const subblockValues: Record<string, Record<string, any>> = {}
Object.entries(workflowData.state.blocks || {}).forEach(([blockId, block]) => {
const blockState = block as any
subblockValues[blockId] = {}
Object.entries(blockState.subBlocks || {}).forEach(([subblockId, subblock]) => {
subblockValues[blockId][subblockId] = (subblock as any).value
})
})
// Update subblock store for this workflow
useSubBlockStore.setState((state: any) => ({
workflowValues: {
...state.workflowValues,
[workflowId]: subblockValues,
},
}))
logger.info(`Successfully loaded reverted workflow state for ${workflowId}`)
} finally {
isApplyingRemoteChange.current = false
}
} else {
logger.error('No state found in workflow data after revert', { workflowData })
}
} else {
logger.error(`Failed to fetch workflow data after revert: ${response.statusText}`)
}
} catch (error) {
logger.error('Error reloading workflow state after revert:', error)
}
}
}
// Register event handlers
onWorkflowOperation(handleWorkflowOperation)
onSubblockUpdate(handleSubblockUpdate)
onUserJoined(handleUserJoined)
onUserLeft(handleUserLeft)
onWorkflowDeleted(handleWorkflowDeleted)
onWorkflowReverted(handleWorkflowReverted)
return () => {
// Cleanup handled by socket context
@@ -278,6 +347,7 @@ export function useCollaborativeWorkflow() {
onUserJoined,
onUserLeft,
onWorkflowDeleted,
onWorkflowReverted,
workflowStore,
subBlockStore,
activeWorkflowId,


@@ -135,6 +135,7 @@ export const auth = betterAuth({
'notion',
'microsoft',
'slack',
'reddit',
],
},
},
@@ -825,6 +826,57 @@ export const auth = betterAuth({
},
},
// Reddit provider
{
providerId: 'reddit',
clientId: env.REDDIT_CLIENT_ID as string,
clientSecret: env.REDDIT_CLIENT_SECRET as string,
authorizationUrl: 'https://www.reddit.com/api/v1/authorize',
tokenUrl: 'https://www.reddit.com/api/v1/access_token',
userInfoUrl: 'https://oauth.reddit.com/api/v1/me',
scopes: ['identity', 'read'],
responseType: 'code',
pkce: false,
accessType: 'offline',
authentication: 'basic',
prompt: 'consent',
redirectURI: `${env.NEXT_PUBLIC_APP_URL}/api/auth/oauth2/callback/reddit`,
getUserInfo: async (tokens) => {
try {
const response = await fetch('https://oauth.reddit.com/api/v1/me', {
headers: {
Authorization: `Bearer ${tokens.accessToken}`,
'User-Agent': 'sim-studio/1.0',
},
})
if (!response.ok) {
logger.error('Error fetching Reddit user info:', {
status: response.status,
statusText: response.statusText,
})
return null
}
const data = await response.json()
const now = new Date()
return {
id: data.id,
name: data.name || 'Reddit User',
email: `${data.name}@reddit.user`, // Reddit doesn't provide email in identity scope
image: data.icon_img || null,
emailVerified: false,
createdAt: now,
updatedAt: now,
}
} catch (error) {
logger.error('Error in Reddit getUserInfo:', { error })
return null
}
},
},
{
providerId: 'linear',
clientId: env.LINEAR_CLIENT_ID as string,

View File

@@ -103,6 +103,8 @@ export const env = createEnv({
LINEAR_CLIENT_SECRET: z.string().optional(),
SLACK_CLIENT_ID: z.string().optional(),
SLACK_CLIENT_SECRET: z.string().optional(),
REDDIT_CLIENT_ID: z.string().optional(),
REDDIT_CLIENT_SECRET: z.string().optional(),
SOCKET_SERVER_URL: z.string().url().optional(),
SOCKET_PORT: z.number().optional(),
PORT: z.number().optional(),

View File

@@ -26,6 +26,8 @@ vi.mock('../env', () => ({
LINEAR_CLIENT_SECRET: 'linear_client_secret',
SLACK_CLIENT_ID: 'slack_client_id',
SLACK_CLIENT_SECRET: 'slack_client_secret',
REDDIT_CLIENT_ID: 'reddit_client_id',
REDDIT_CLIENT_SECRET: 'reddit_client_secret',
},
}))
@@ -80,6 +82,11 @@ describe('OAuth Token Refresh', () => {
endpoint: 'https://discord.com/api/v10/oauth2/token',
},
{ name: 'Linear', providerId: 'linear', endpoint: 'https://api.linear.app/oauth/token' },
{
name: 'Reddit',
providerId: 'reddit',
endpoint: 'https://www.reddit.com/api/v1/access_token',
},
]
basicAuthProviders.forEach(({ name, providerId, endpoint }) => {

View File

@@ -17,6 +17,7 @@ import {
MicrosoftTeamsIcon,
NotionIcon,
OutlookIcon,
RedditIcon,
SlackIcon,
SupabaseIcon,
xIcon,
@@ -39,6 +40,7 @@ export type OAuthProvider =
| 'microsoft'
| 'linear'
| 'slack'
| 'reddit'
| string
export type OAuthService =
@@ -61,6 +63,7 @@ export type OAuthService =
| 'outlook'
| 'linear'
| 'slack'
| 'reddit'
export interface OAuthProviderConfig {
id: OAuthProvider
@@ -387,6 +390,23 @@ export const OAUTH_PROVIDERS: Record<string, OAuthProviderConfig> = {
},
defaultService: 'slack',
},
reddit: {
id: 'reddit',
name: 'Reddit',
icon: (props) => RedditIcon(props),
services: {
reddit: {
id: 'reddit',
name: 'Reddit',
description: 'Access Reddit data and content from subreddits.',
providerId: 'reddit',
icon: (props) => RedditIcon(props),
baseProviderIcon: (props) => RedditIcon(props),
scopes: ['identity', 'read'],
},
},
defaultService: 'reddit',
},
}
// Helper function to get a service by provider and service ID
@@ -695,6 +715,18 @@ function getProviderAuthConfig(provider: string): ProviderAuthConfig {
useBasicAuth: false,
}
}
case 'reddit': {
const { clientId, clientSecret } = getCredentials(
env.REDDIT_CLIENT_ID,
env.REDDIT_CLIENT_SECRET
)
return {
tokenEndpoint: 'https://www.reddit.com/api/v1/access_token',
clientId,
clientSecret,
useBasicAuth: true,
}
}
default:
throw new Error(`Unsupported provider: ${provider}`)
}

View File

@@ -0,0 +1,185 @@
import { createLogger } from '@/lib/logs/console-logger'
const logger = createLogger('ResponseFormatUtils')
// Type definitions for component data structures
export interface Field {
name: string
type: string
description?: string
}
/**
* Helper function to extract fields from JSON Schema
* Handles both legacy format with fields array and new JSON Schema format
*/
export function extractFieldsFromSchema(schema: any): Field[] {
if (!schema || typeof schema !== 'object') {
return []
}
// Handle legacy format with fields array
if (Array.isArray(schema.fields)) {
return schema.fields
}
// Handle new JSON Schema format
const schemaObj = schema.schema || schema
if (!schemaObj || !schemaObj.properties || typeof schemaObj.properties !== 'object') {
return []
}
// Extract fields from schema properties
return Object.entries(schemaObj.properties).map(([name, prop]: [string, any]) => {
// Handle array format like ['string', 'array']
if (Array.isArray(prop)) {
return {
name,
type: prop.includes('array') ? 'array' : prop[0] || 'string',
description: undefined,
}
}
// Handle object format like { type: 'string', description: '...' }
return {
name,
type: prop.type || 'string',
description: prop.description,
}
})
}
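To illustrate the two input shapes `extractFieldsFromSchema` accepts, here is a compact standalone mirror of the function above with both the legacy `fields` array and the JSON Schema `properties` form resolving to the same field list (the schema contents are illustrative):

```typescript
// Standalone mirror of extractFieldsFromSchema above, for illustration only.
interface Field {
  name: string
  type: string
  description?: string
}

function extractFieldsFromSchema(schema: any): Field[] {
  if (!schema || typeof schema !== 'object') return []
  // Legacy format: fields array passed through as-is
  if (Array.isArray(schema.fields)) return schema.fields
  // New format: JSON Schema, possibly wrapped in a `schema` key
  const schemaObj = schema.schema || schema
  if (!schemaObj?.properties || typeof schemaObj.properties !== 'object') return []
  return Object.entries(schemaObj.properties).map(([name, prop]: [string, any]) => ({
    name,
    type: Array.isArray(prop)
      ? prop.includes('array') ? 'array' : prop[0] || 'string'
      : prop.type || 'string',
    description: Array.isArray(prop) ? undefined : prop.description,
  }))
}

const legacy = { fields: [{ name: 'title', type: 'string' }] }
const jsonSchema = {
  schema: { properties: { title: { type: 'string', description: 'Post title' } } },
}
console.log(extractFieldsFromSchema(legacy))     // [{ name: 'title', type: 'string' }]
console.log(extractFieldsFromSchema(jsonSchema)) // [{ name: 'title', type: 'string', description: 'Post title' }]
```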
/**
* Helper function to safely parse response format
* Handles both string and object formats
*/
export function parseResponseFormatSafely(responseFormatValue: any, blockId: string): any {
if (!responseFormatValue) {
return null
}
try {
if (typeof responseFormatValue === 'string') {
return JSON.parse(responseFormatValue)
}
return responseFormatValue
} catch (error) {
logger.warn(`Failed to parse response format for block ${blockId}:`, error)
return null
}
}
/**
* Extract field values from a parsed JSON object based on selected output paths
* Used for both workspace and chat client field extraction
*/
export function extractFieldValues(
parsedContent: any,
selectedOutputIds: string[],
blockId: string
): Record<string, any> {
const extractedValues: Record<string, any> = {}
for (const outputId of selectedOutputIds) {
const blockIdForOutput = extractBlockIdFromOutputId(outputId)
if (blockIdForOutput !== blockId) {
continue
}
const path = extractPathFromOutputId(outputId, blockIdForOutput)
if (path) {
const pathParts = path.split('.')
let current = parsedContent
for (const part of pathParts) {
if (current && typeof current === 'object' && part in current) {
current = current[part]
} else {
current = undefined
break
}
}
if (current !== undefined) {
extractedValues[path] = current
}
}
}
return extractedValues
}
/**
* Format extracted field values for display
* Returns formatted string representation of field values
*/
export function formatFieldValues(extractedValues: Record<string, any>): string {
const formattedValues: string[] = []
for (const [fieldName, value] of Object.entries(extractedValues)) {
const formattedValue = typeof value === 'string' ? value : JSON.stringify(value)
formattedValues.push(formattedValue)
}
return formattedValues.join('\n')
}
/**
* Extract block ID from output ID
* Handles both formats: "blockId" and "blockId_path" or "blockId.path"
*/
export function extractBlockIdFromOutputId(outputId: string): string {
return outputId.includes('_') ? outputId.split('_')[0] : outputId.split('.')[0]
}
/**
* Extract path from output ID after the block ID
*/
export function extractPathFromOutputId(outputId: string, blockId: string): string {
return outputId.substring(blockId.length + 1)
}
/**
* Parse JSON content from output safely
* Handles both string and object formats with proper error handling
*/
export function parseOutputContentSafely(output: any): any {
if (!output?.content) {
return output
}
if (typeof output.content === 'string') {
try {
return JSON.parse(output.content)
} catch (e) {
// Fallback to original structure if parsing fails
return output
}
}
return output
}
/**
* Check if a set of output IDs contains response format selections for a specific block
*/
export function hasResponseFormatSelection(selectedOutputIds: string[], blockId: string): boolean {
return selectedOutputIds.some((outputId) => {
const blockIdForOutput = extractBlockIdFromOutputId(outputId)
return blockIdForOutput === blockId && outputId.includes('_')
})
}
/**
* Get selected field names for a specific block from output IDs
*/
export function getSelectedFieldNames(selectedOutputIds: string[], blockId: string): string[] {
return selectedOutputIds
.filter((outputId) => {
const blockIdForOutput = extractBlockIdFromOutputId(outputId)
return blockIdForOutput === blockId && outputId.includes('_')
})
.map((outputId) => extractPathFromOutputId(outputId, blockId))
}
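The output-ID helpers above compose as follows: an ID like `agent1_result.text` names a block and a dotted path into its output, and `extractFieldValues` walks that path. A standalone sketch (the IDs and content are illustrative):

```typescript
// Standalone mirrors of the output-ID helpers above, for illustration only.
function extractBlockIdFromOutputId(outputId: string): string {
  return outputId.includes('_') ? outputId.split('_')[0] : outputId.split('.')[0]
}
function extractPathFromOutputId(outputId: string, blockId: string): string {
  return outputId.substring(blockId.length + 1)
}

const outputId = 'agent1_result.text'
const blockId = extractBlockIdFromOutputId(outputId) // 'agent1'
const path = extractPathFromOutputId(outputId, blockId) // 'result.text'

// Walk the parsed content along the dotted path, as extractFieldValues does:
const parsed = { result: { text: 'hello', score: 0.9 } }
let current: any = parsed
for (const part of path.split('.')) {
  current =
    current && typeof current === 'object' && part in current ? current[part] : undefined
}
console.log(current) // 'hello'
```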

View File

@@ -78,7 +78,7 @@ export function processStreamingBlockLog(log: BlockLog, streamedContent: string)
log.output.cost = result.cost
log.output.model = result.model
logTokenizationDetails(`Streaming tokenization completed for ${log.blockType}`, {
blockId: log.blockId,
blockType: log.blockType,
model: result.model,
@@ -92,7 +92,7 @@ export function processStreamingBlockLog(log: BlockLog, streamedContent: string)
return true
} catch (error) {
logger.error(`Streaming tokenization failed for block ${log.blockId}`, {
blockType: log.blockType,
error: error instanceof Error ? error.message : String(error),
contentLength: streamedContent?.length || 0,

View File

@@ -24,6 +24,7 @@ const nextConfig: NextConfig = {
},
experimental: {
optimizeCss: true,
turbopackSourceMaps: false,
},
...(env.NODE_ENV === 'development' && {
allowedDevOrigins: [
@@ -41,36 +42,6 @@ const nextConfig: NextConfig = {
],
outputFileTracingRoot: path.join(__dirname, '../../'),
}),
webpack: (config, { isServer, dev }) => {
// Skip webpack configuration in development when using Turbopack
if (dev && env.NEXT_RUNTIME === 'turbopack') {
return config
}
// Configure webpack to use filesystem cache for faster incremental builds
if (config.cache) {
config.cache = {
type: 'filesystem',
buildDependencies: {
config: [__filename],
},
cacheDirectory: path.resolve(process.cwd(), '.next/cache/webpack'),
}
}
// Avoid aliasing React on the server/edge runtime builds because it bypasses
// the "react-server" export condition, which Next.js relies on when
// bundling React Server Components and API route handlers.
if (!isServer) {
config.resolve.alias = {
...config.resolve.alias,
react: path.join(__dirname, '../../node_modules/react'),
'react-dom': path.join(__dirname, '../../node_modules/react-dom'),
}
}
return config
},
transpilePackages: ['prettier', '@react-email/components', '@react-email/render'],
async headers() {
return [
@@ -144,6 +115,16 @@ const nextConfig: NextConfig = {
},
],
},
// Block access to sourcemap files (defense in depth)
{
source: '/(.*)\\.map$',
headers: [
{
key: 'x-robots-tag',
value: 'noindex',
},
],
},
// Apply security headers to all routes
{
source: '/:path*',

View File

@@ -3,7 +3,6 @@
"version": "0.1.0",
"private": true,
"license": "Apache-2.0",
"type": "module",
"engines": {
"bun": ">=1.2.13",
"node": ">=20.0.0"
@@ -13,7 +12,7 @@
"dev:classic": "next dev",
"dev:sockets": "bun run socket-server/index.ts",
"dev:full": "concurrently -n \"NextJS,Realtime\" -c \"cyan,magenta\" \"bun run dev\" \"bun run dev:sockets\"",
"build": "next build",
"build": "next build --turbopack",
"start": "next start",
"prepare": "cd ../.. && bun husky",
"db:push": "bunx drizzle-kit push",

View File

@@ -121,7 +121,7 @@ export class Serializer {
// Include response format fields if available
...(params.responseFormat
? {
responseFormat: JSON.parse(params.responseFormat),
responseFormat: this.parseResponseFormatSafely(params.responseFormat),
}
: {}),
},
@@ -136,6 +136,48 @@ export class Serializer {
}
}
private parseResponseFormatSafely(responseFormat: any): any {
if (!responseFormat) {
return undefined
}
// If already an object, return as-is
if (typeof responseFormat === 'object' && responseFormat !== null) {
return responseFormat
}
// Handle string values
if (typeof responseFormat === 'string') {
const trimmedValue = responseFormat.trim()
// Check for variable references like <start.input>
if (trimmedValue.startsWith('<') && trimmedValue.includes('>')) {
// Keep variable references as-is
return trimmedValue
}
if (trimmedValue === '') {
return undefined
}
// Try to parse as JSON
try {
return JSON.parse(trimmedValue)
} catch (error) {
// If parsing fails, return undefined to avoid crashes
// This allows the workflow to continue without structured response format
logger.warn('Failed to parse response format as JSON in serializer, using undefined:', {
value: trimmedValue,
error: error instanceof Error ? error.message : String(error),
})
return undefined
}
}
// For any other type, return undefined
return undefined
}
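The serializer's new `parseResponseFormatSafely` has three interesting branches: already-parsed objects pass through, variable references like `<start.input>` are kept as strings for later resolution, and invalid JSON degrades to `undefined` instead of throwing. A standalone sketch of that behavior:

```typescript
// Standalone sketch of the serializer's parseResponseFormatSafely branches.
function parseResponseFormatSafely(responseFormat: any): any {
  if (!responseFormat) return undefined
  // Already an object: return as-is
  if (typeof responseFormat === 'object') return responseFormat
  if (typeof responseFormat === 'string') {
    const trimmed = responseFormat.trim()
    // Variable references like <start.input> are resolved later, so pass them through
    if (trimmed.startsWith('<') && trimmed.includes('>')) return trimmed
    if (trimmed === '') return undefined
    try {
      return JSON.parse(trimmed)
    } catch {
      return undefined // invalid JSON no longer crashes serialization
    }
  }
  return undefined
}

console.log(parseResponseFormatSafely('{"type":"object"}')) // { type: 'object' }
console.log(parseResponseFormatSafely('<start.input>'))     // '<start.input>'
console.log(parseResponseFormatSafely('not json'))          // undefined
```

Compared with the bare `JSON.parse(params.responseFormat)` it replaces, the worst case is now a skipped structured-output format rather than a serialization crash.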
private extractParams(block: BlockState): Record<string, any> {
// Special handling for subflow blocks (loops, parallels, etc.)
if (block.type === 'loop' || block.type === 'parallel') {

View File

@@ -28,7 +28,5 @@ export function setupConnectionHandlers(
roomManager.cleanupUserFromRoom(socket.id, workflowId)
roomManager.broadcastPresenceUpdate(workflowId)
}
roomManager.clearPendingOperations(socket.id)
})
}

View File

@@ -75,11 +75,6 @@ export class RoomManager {
this.userSessions.delete(socketId)
}
// This would be used if we implement operation queuing
clearPendingOperations(socketId: string) {
logger.debug(`Cleared pending operations for socket ${socketId}`)
}
handleWorkflowDeletion(workflowId: string) {
logger.info(`Handling workflow deletion notification for ${workflowId}`)
@@ -115,6 +110,26 @@ export class RoomManager {
)
}
handleWorkflowRevert(workflowId: string, timestamp: number) {
logger.info(`Handling workflow revert notification for ${workflowId}`)
const room = this.workflowRooms.get(workflowId)
if (!room) {
logger.debug(`No active room found for reverted workflow ${workflowId}`)
return
}
this.io.to(workflowId).emit('workflow-reverted', {
workflowId,
message: 'Workflow has been reverted to deployed state',
timestamp,
})
room.lastModified = timestamp
logger.info(`Notified ${room.users.size} users about workflow revert: ${workflowId}`)
}
async validateWorkflowConsistency(
workflowId: string
): Promise<{ valid: boolean; issues: string[] }> {

View File

@@ -50,6 +50,27 @@ export function createHttpHandler(roomManager: RoomManager, logger: Logger) {
return
}
// Handle workflow revert notifications from the main API
if (req.method === 'POST' && req.url === '/api/workflow-reverted') {
let body = ''
req.on('data', (chunk) => {
body += chunk.toString()
})
req.on('end', () => {
try {
const { workflowId, timestamp } = JSON.parse(body)
roomManager.handleWorkflowRevert(workflowId, timestamp)
res.writeHead(200, { 'Content-Type': 'application/json' })
res.end(JSON.stringify({ success: true }))
} catch (error) {
logger.error('Error handling workflow revert notification:', error)
res.writeHead(500, { 'Content-Type': 'application/json' })
res.end(JSON.stringify({ error: 'Failed to process revert notification' }))
}
})
return
}
res.writeHead(404, { 'Content-Type': 'application/json' })
res.end(JSON.stringify({ error: 'Not found' }))
}
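From the caller's side, the handler above implies the main API POSTs a small JSON payload to the socket server. A hedged sketch of that notification — the helper names and the `socketServerUrl` parameter are assumptions; only the `/api/workflow-reverted` route and the `{ workflowId, timestamp }` shape come from the handler:

```typescript
// Illustrative payload builder matching what the handler above JSON.parses.
function buildRevertNotification(workflowId: string, timestamp: number): string {
  return JSON.stringify({ workflowId, timestamp })
}

// Hypothetical caller on the main API side (socketServerUrl is an assumption).
async function notifyWorkflowReverted(
  socketServerUrl: string,
  workflowId: string
): Promise<void> {
  await fetch(`${socketServerUrl}/api/workflow-reverted`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: buildRevertNotification(workflowId, Date.now()),
  })
}
```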

View File

@@ -193,7 +193,10 @@ export const useConsoleStore = create<ConsoleStore>()(
updatedEntry.output = newOutput
}
if (update.output !== undefined) {
if (update.replaceOutput !== undefined) {
// Complete replacement of output
updatedEntry.output = update.replaceOutput
} else if (update.output !== undefined) {
const existingOutput = entry.output || {}
updatedEntry.output = {
...existingOutput,

View File

@@ -20,6 +20,7 @@ export interface ConsoleEntry {
export interface ConsoleUpdate {
content?: string
output?: Partial<NormalizedBlockOutput>
replaceOutput?: NormalizedBlockOutput // New field for complete replacement
error?: string
warning?: string
success?: boolean
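The new `replaceOutput` field changes the console store's update semantics: `output` shallow-merges into the existing entry, while `replaceOutput` swaps the whole object. A minimal sketch of the distinction (types simplified from the store's):

```typescript
// Sketch of the merge-vs-replace semantics added to the console store.
interface Update {
  output?: Record<string, any>       // shallow-merged into the existing output
  replaceOutput?: Record<string, any> // replaces the output wholesale
}

function applyUpdate(
  existing: Record<string, any>,
  update: Update
): Record<string, any> {
  if (update.replaceOutput !== undefined) return update.replaceOutput
  if (update.output !== undefined) return { ...existing, ...update.output }
  return existing
}

console.log(applyUpdate({ a: 1, b: 2 }, { output: { b: 3 } }))        // { a: 1, b: 3 }
console.log(applyUpdate({ a: 1, b: 2 }, { replaceOutput: { b: 3 } })) // { b: 3 }
```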

View File

@@ -1,128 +0,0 @@
/**
* OAuth state persistence for secure OAuth redirects
* This is the ONLY localStorage usage in the app - for temporary OAuth state during redirects
*/
import { createLogger } from '@/lib/logs/console-logger'
const logger = createLogger('OAuthPersistence')
interface OAuthState {
providerId: string
serviceId: string
requiredScopes: string[]
returnUrl: string
context: string
timestamp: number
data?: Record<string, any>
}
const OAUTH_STATE_KEY = 'pending_oauth_state'
const OAUTH_STATE_EXPIRY = 10 * 60 * 1000 // 10 minutes
/**
* Generic function to save data to localStorage (used by main branch OAuth flow)
*/
export function saveToStorage<T>(key: string, data: T): boolean {
try {
localStorage.setItem(key, JSON.stringify(data))
return true
} catch (error) {
logger.error(`Failed to save data to ${key}:`, { error })
return false
}
}
/**
* Generic function to load data from localStorage
*/
export function loadFromStorage<T>(key: string): T | null {
try {
const stored = localStorage.getItem(key)
if (!stored) return null
return JSON.parse(stored) as T
} catch (error) {
logger.error(`Failed to load data from ${key}:`, { error })
return null
}
}
/**
* Save OAuth state to localStorage before redirect
*/
export function saveOAuthState(state: OAuthState): boolean {
try {
const stateWithTimestamp = {
...state,
timestamp: Date.now(),
}
localStorage.setItem(OAUTH_STATE_KEY, JSON.stringify(stateWithTimestamp))
return true
} catch (error) {
logger.error('Failed to save OAuth state to localStorage:', error)
return false
}
}
/**
* Load and remove OAuth state from localStorage after redirect
*/
export function loadOAuthState(): OAuthState | null {
try {
const stored = localStorage.getItem(OAUTH_STATE_KEY)
if (!stored) return null
const state = JSON.parse(stored) as OAuthState
// Check if state has expired
if (Date.now() - state.timestamp > OAUTH_STATE_EXPIRY) {
localStorage.removeItem(OAUTH_STATE_KEY)
logger.warn('OAuth state expired, removing from localStorage')
return null
}
// Remove state after loading (one-time use)
localStorage.removeItem(OAUTH_STATE_KEY)
return state
} catch (error) {
logger.error('Failed to load OAuth state from localStorage:', error)
// Clean up corrupted state
localStorage.removeItem(OAUTH_STATE_KEY)
return null
}
}
/**
* Remove OAuth state from localStorage (cleanup)
*/
export function clearOAuthState(): void {
try {
localStorage.removeItem(OAUTH_STATE_KEY)
} catch (error) {
logger.error('Failed to clear OAuth state from localStorage:', error)
}
}
/**
* Check if there's pending OAuth state
*/
export function hasPendingOAuthState(): boolean {
try {
const stored = localStorage.getItem(OAUTH_STATE_KEY)
if (!stored) return false
const state = JSON.parse(stored) as OAuthState
// Check if expired
if (Date.now() - state.timestamp > OAUTH_STATE_EXPIRY) {
localStorage.removeItem(OAUTH_STATE_KEY)
return false
}
return true
} catch (error) {
logger.error('Failed to check pending OAuth state:', error)
localStorage.removeItem(OAUTH_STATE_KEY)
return false
}
}

View File

@@ -822,13 +822,18 @@ export const useWorkflowStore = create<WorkflowStoreWithHistory>()(
}
},
revertToDeployedState: (deployedState: WorkflowState) => {
revertToDeployedState: async (deployedState: WorkflowState) => {
const activeWorkflowId = useWorkflowRegistry.getState().activeWorkflowId
if (!activeWorkflowId) {
console.error('Cannot revert: no active workflow ID')
return
}
// Preserving the workflow-specific deployment status if it exists
const deploymentStatus = activeWorkflowId
? useWorkflowRegistry.getState().getWorkflowDeploymentStatus(activeWorkflowId)
: null
const deploymentStatus = useWorkflowRegistry
.getState()
.getWorkflowDeploymentStatus(activeWorkflowId)
const newState = {
blocks: deployedState.blocks,
@@ -841,7 +846,7 @@ export const useWorkflowStore = create<WorkflowStoreWithHistory>()(
// Keep existing deployment statuses and update for the active workflow if needed
deploymentStatuses: {
...get().deploymentStatuses,
...(activeWorkflowId && deploymentStatus
...(deploymentStatus
? {
[activeWorkflowId]: deploymentStatus,
}
@@ -852,9 +857,6 @@ export const useWorkflowStore = create<WorkflowStoreWithHistory>()(
// Update the main workflow state
set(newState)
// Get the active workflow ID
if (!activeWorkflowId) return
// Initialize subblock store with values from deployed state
const subBlockStore = useSubBlockStore.getState()
const values: Record<string, Record<string, any>> = {}
@@ -885,7 +887,27 @@ export const useWorkflowStore = create<WorkflowStoreWithHistory>()(
pushHistory(set, get, newState, 'Reverted to deployed state')
get().updateLastSaved()
// Note: Socket.IO handles real-time sync automatically
// Call API to persist the revert to normalized tables
try {
const response = await fetch(`/api/workflows/${activeWorkflowId}/revert-to-deployed`, {
method: 'POST',
headers: {
'Content-Type': 'application/json',
},
})
if (!response.ok) {
const errorData = await response.json()
console.error('Failed to persist revert to deployed state:', errorData.error)
// Don't throw error to avoid breaking the UI, but log it
} else {
console.log('Successfully persisted revert to deployed state')
}
} catch (error) {
console.error('Error calling revert to deployed API:', error)
// Don't throw error to avoid breaking the UI
}
},
toggleBlockAdvancedMode: (id: string) => {

View File

@@ -48,6 +48,9 @@ export async function executeTool(
// If we have a credential parameter, fetch the access token
if (contextParams.credential) {
logger.info(
`[${requestId}] Tool ${toolId} needs access token for credential: ${contextParams.credential}`
)
try {
const baseUrl = env.NEXT_PUBLIC_APP_URL
if (!baseUrl) {
@@ -69,6 +72,8 @@ export async function executeTool(
}
}
logger.info(`[${requestId}] Fetching access token from ${baseUrl}/api/auth/oauth/token`)
const tokenUrl = new URL('/api/auth/oauth/token', baseUrl).toString()
const response = await fetch(tokenUrl, {
method: 'POST',
@@ -88,6 +93,10 @@ export async function executeTool(
const data = await response.json()
contextParams.accessToken = data.accessToken
logger.info(
`[${requestId}] Successfully got access token for ${toolId}, length: ${data.accessToken?.length || 0}`
)
// Clean up params we don't need to pass to the actual tool
contextParams.credential = undefined
if (contextParams.workflowId) contextParams.workflowId = undefined

View File

@@ -7,6 +7,12 @@ export const getCommentsTool: ToolConfig<RedditCommentsParams, RedditCommentsRes
description: 'Fetch comments from a specific Reddit post',
version: '1.0.0',
oauth: {
required: true,
provider: 'reddit',
additionalScopes: ['read'],
},
params: {
postId: {
type: 'string',
@@ -38,15 +44,21 @@ export const getCommentsTool: ToolConfig<RedditCommentsParams, RedditCommentsRes
const sort = params.sort || 'confidence'
const limit = Math.min(Math.max(1, params.limit || 50), 100)
// Build URL
return `https://www.reddit.com/r/${subreddit}/comments/${params.postId}.json?sort=${sort}&limit=${limit}&raw_json=1`
// Build URL using OAuth endpoint
return `https://oauth.reddit.com/r/${subreddit}/comments/${params.postId}?sort=${sort}&limit=${limit}&raw_json=1`
},
method: 'GET',
headers: () => ({
'User-Agent':
'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/115.0.0.0 Safari/537.36',
Accept: 'application/json',
}),
headers: (params: RedditCommentsParams) => {
if (!params.accessToken?.trim()) {
throw new Error('Access token is required for Reddit API')
}
return {
Authorization: `Bearer ${params.accessToken}`,
'User-Agent': 'sim-studio/1.0 (https://github.com/simstudioai/sim)',
Accept: 'application/json',
}
},
},
transformResponse: async (response: Response, requestParams?: RedditCommentsParams) => {

View File

@@ -7,6 +7,12 @@ export const getPostsTool: ToolConfig<RedditPostsParams, RedditPostsResponse> =
description: 'Fetch posts from a subreddit with different sorting options',
version: '1.0.0',
oauth: {
required: true,
provider: 'reddit',
additionalScopes: ['read'],
},
params: {
subreddit: {
type: 'string',
@@ -38,8 +44,8 @@ export const getPostsTool: ToolConfig<RedditPostsParams, RedditPostsResponse> =
const sort = params.sort || 'hot'
const limit = Math.min(Math.max(1, params.limit || 10), 100)
// Build URL with appropriate parameters
let url = `https://www.reddit.com/r/${subreddit}/${sort}.json?limit=${limit}&raw_json=1`
// Build URL with appropriate parameters using OAuth endpoint
let url = `https://oauth.reddit.com/r/${subreddit}/${sort}?limit=${limit}&raw_json=1`
// Add time parameter only for 'top' sorting
if (sort === 'top' && params.time) {
@@ -49,29 +55,54 @@ export const getPostsTool: ToolConfig<RedditPostsParams, RedditPostsResponse> =
return url
},
method: 'GET',
headers: () => ({
'User-Agent':
'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/115.0.0.0 Safari/537.36',
Accept: 'application/json',
}),
headers: (params: RedditPostsParams) => {
if (!params.accessToken) {
throw new Error('Access token is required for Reddit API')
}
return {
Authorization: `Bearer ${params.accessToken}`,
'User-Agent': 'sim-studio/1.0 (https://github.com/simstudioai/sim)',
Accept: 'application/json',
}
},
},
transformResponse: async (response: Response, requestParams?: RedditPostsParams) => {
try {
// Check if response is OK
if (!response.ok) {
// Get response text for better error details
const errorText = await response.text()
console.error('Reddit API Error:', {
status: response.status,
statusText: response.statusText,
body: errorText,
url: response.url,
})
if (response.status === 403 || response.status === 429) {
throw new Error('Reddit API access blocked or rate limited. Please try again later.')
}
throw new Error(`Reddit API returned ${response.status}: ${response.statusText}`)
throw new Error(
`Reddit API returned ${response.status}: ${response.statusText}. Body: ${errorText}`
)
}
// Attempt to parse JSON
let data
try {
data = await response.json()
} catch (_error) {
throw new Error('Failed to parse Reddit API response: Response was not valid JSON')
} catch (error) {
const responseText = await response.text()
console.error('Failed to parse Reddit API response as JSON:', {
error: error instanceof Error ? error.message : String(error),
responseText,
contentType: response.headers.get('content-type'),
})
throw new Error(
`Failed to parse Reddit API response: Response was not valid JSON. Content: ${responseText}`
)
}
// Check if response contains error

View File

@@ -4,6 +4,7 @@ import type { RedditHotPostsResponse, RedditPost } from './types'
interface HotPostsParams {
subreddit: string
limit?: number
accessToken: string
}
export const hotPostsTool: ToolConfig<HotPostsParams, RedditHotPostsResponse> = {
@@ -12,6 +13,12 @@ export const hotPostsTool: ToolConfig<HotPostsParams, RedditHotPostsResponse> =
description: 'Fetch the most popular (hot) posts from a specified subreddit.',
version: '1.0.0',
oauth: {
required: true,
provider: 'reddit',
additionalScopes: ['read'],
},
params: {
subreddit: {
type: 'string',
@@ -31,14 +38,20 @@ export const hotPostsTool: ToolConfig<HotPostsParams, RedditHotPostsResponse> =
const subreddit = params.subreddit.trim().replace(/^r\//, '')
const limit = Math.min(Math.max(1, params.limit || 10), 100)
return `https://www.reddit.com/r/${subreddit}/hot.json?limit=${limit}&raw_json=1`
return `https://oauth.reddit.com/r/${subreddit}/hot?limit=${limit}&raw_json=1`
},
method: 'GET',
headers: () => ({
'User-Agent':
'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/115.0.0.0 Safari/537.36',
Accept: 'application/json',
}),
headers: (params: HotPostsParams) => {
if (!params.accessToken) {
throw new Error('Access token is required for Reddit API')
}
return {
Authorization: `Bearer ${params.accessToken}`,
'User-Agent': 'sim-studio/1.0 (https://github.com/simstudioai/sim)',
Accept: 'application/json',
}
},
},
transformResponse: async (response: Response, requestParams?: HotPostsParams) => {

View File

@@ -39,6 +39,7 @@ export interface RedditPostsParams {
sort?: 'hot' | 'new' | 'top' | 'rising'
limit?: number
time?: 'day' | 'week' | 'month' | 'year' | 'all'
accessToken?: string
}
// Response for the generalized get_posts tool
@@ -55,6 +56,7 @@ export interface RedditCommentsParams {
subreddit: string
sort?: 'confidence' | 'top' | 'new' | 'controversial' | 'old' | 'random' | 'qa'
limit?: number
accessToken?: string
}
// Response for the get_comments tool

View File

@@ -1,10 +1,10 @@
import path, { resolve } from 'path'
/// <reference types="vitest" />
import nextEnv from '@next/env'
import react from '@vitejs/plugin-react'
import { configDefaults, defineConfig } from 'vitest/config'
const { loadEnvConfig } = nextEnv
const nextEnv = require('@next/env')
const { loadEnvConfig } = nextEnv.default || nextEnv
const projectDir = process.cwd()
loadEnvConfig(projectDir)

View File

@@ -34,7 +34,7 @@
"fumadocs-mdx": "^11.5.6",
"fumadocs-ui": "^15.0.16",
"lucide-react": "^0.511.0",
"next": "^15.2.3",
"next": "^15.3.2",
"next-themes": "^0.4.6",
"react": "19.1.0",
"react-dom": "19.1.0",
@@ -1176,35 +1176,35 @@
"@tabler/icons-react": ["@tabler/icons-react@3.34.0", "", { "dependencies": { "@tabler/icons": "3.34.0" }, "peerDependencies": { "react": ">= 16" } }, "sha512-OpEIR2iZsIXECtAIMbn1zfKfQ3zKJjXyIZlkgOGUL9UkMCFycEiF2Y8AVfEQsyre/3FnBdlWJvGr0NU47n2TbQ=="],
"@tailwindcss/node": ["@tailwindcss/node@4.1.10", "", { "dependencies": { "@ampproject/remapping": "^2.3.0", "enhanced-resolve": "^5.18.1", "jiti": "^2.4.2", "lightningcss": "1.30.1", "magic-string": "^0.30.17", "source-map-js": "^1.2.1", "tailwindcss": "4.1.10" } }, "sha512-2ACf1znY5fpRBwRhMgj9ZXvb2XZW8qs+oTfotJ2C5xR0/WNL7UHZ7zXl6s+rUqedL1mNi+0O+WQr5awGowS3PQ=="],
"@tailwindcss/node": ["@tailwindcss/node@4.1.11", "", { "dependencies": { "@ampproject/remapping": "^2.3.0", "enhanced-resolve": "^5.18.1", "jiti": "^2.4.2", "lightningcss": "1.30.1", "magic-string": "^0.30.17", "source-map-js": "^1.2.1", "tailwindcss": "4.1.11" } }, "sha512-yzhzuGRmv5QyU9qLNg4GTlYI6STedBWRE7NjxP45CsFYYq9taI0zJXZBMqIC/c8fViNLhmrbpSFS57EoxUmD6Q=="],
"@tailwindcss/oxide": ["@tailwindcss/oxide@4.1.10", "", { "dependencies": { "detect-libc": "^2.0.4", "tar": "^7.4.3" }, "optionalDependencies": { "@tailwindcss/oxide-android-arm64": "4.1.10", "@tailwindcss/oxide-darwin-arm64": "4.1.10", "@tailwindcss/oxide-darwin-x64": "4.1.10", "@tailwindcss/oxide-freebsd-x64": "4.1.10", "@tailwindcss/oxide-linux-arm-gnueabihf": "4.1.10", "@tailwindcss/oxide-linux-arm64-gnu": "4.1.10", "@tailwindcss/oxide-linux-arm64-musl": "4.1.10", "@tailwindcss/oxide-linux-x64-gnu": "4.1.10", "@tailwindcss/oxide-linux-x64-musl": "4.1.10", "@tailwindcss/oxide-wasm32-wasi": "4.1.10", "@tailwindcss/oxide-win32-arm64-msvc": "4.1.10", "@tailwindcss/oxide-win32-x64-msvc": "4.1.10" } }, "sha512-v0C43s7Pjw+B9w21htrQwuFObSkio2aV/qPx/mhrRldbqxbWJK6KizM+q7BF1/1CmuLqZqX3CeYF7s7P9fbA8Q=="],
"@tailwindcss/oxide": ["@tailwindcss/oxide@4.1.11", "", { "dependencies": { "detect-libc": "^2.0.4", "tar": "^7.4.3" }, "optionalDependencies": { "@tailwindcss/oxide-android-arm64": "4.1.11", "@tailwindcss/oxide-darwin-arm64": "4.1.11", "@tailwindcss/oxide-darwin-x64": "4.1.11", "@tailwindcss/oxide-freebsd-x64": "4.1.11", "@tailwindcss/oxide-linux-arm-gnueabihf": "4.1.11", "@tailwindcss/oxide-linux-arm64-gnu": "4.1.11", "@tailwindcss/oxide-linux-arm64-musl": "4.1.11", "@tailwindcss/oxide-linux-x64-gnu": "4.1.11", "@tailwindcss/oxide-linux-x64-musl": "4.1.11", "@tailwindcss/oxide-wasm32-wasi": "4.1.11", "@tailwindcss/oxide-win32-arm64-msvc": "4.1.11", "@tailwindcss/oxide-win32-x64-msvc": "4.1.11" } }, "sha512-Q69XzrtAhuyfHo+5/HMgr1lAiPP/G40OMFAnws7xcFEYqcypZmdW8eGXaOUIeOl1dzPJBPENXgbjsOyhg2nkrg=="],
-"@tailwindcss/oxide-android-arm64": ["@tailwindcss/oxide-android-arm64@4.1.10", "", { "os": "android", "cpu": "arm64" }, "sha512-VGLazCoRQ7rtsCzThaI1UyDu/XRYVyH4/EWiaSX6tFglE+xZB5cvtC5Omt0OQ+FfiIVP98su16jDVHDEIuH4iQ=="],
+"@tailwindcss/oxide-android-arm64": ["@tailwindcss/oxide-android-arm64@4.1.11", "", { "os": "android", "cpu": "arm64" }, "sha512-3IfFuATVRUMZZprEIx9OGDjG3Ou3jG4xQzNTvjDoKmU9JdmoCohQJ83MYd0GPnQIu89YoJqvMM0G3uqLRFtetg=="],
-"@tailwindcss/oxide-darwin-arm64": ["@tailwindcss/oxide-darwin-arm64@4.1.10", "", { "os": "darwin", "cpu": "arm64" }, "sha512-ZIFqvR1irX2yNjWJzKCqTCcHZbgkSkSkZKbRM3BPzhDL/18idA8uWCoopYA2CSDdSGFlDAxYdU2yBHwAwx8euQ=="],
+"@tailwindcss/oxide-darwin-arm64": ["@tailwindcss/oxide-darwin-arm64@4.1.11", "", { "os": "darwin", "cpu": "arm64" }, "sha512-ESgStEOEsyg8J5YcMb1xl8WFOXfeBmrhAwGsFxxB2CxY9evy63+AtpbDLAyRkJnxLy2WsD1qF13E97uQyP1lfQ=="],
-"@tailwindcss/oxide-darwin-x64": ["@tailwindcss/oxide-darwin-x64@4.1.10", "", { "os": "darwin", "cpu": "x64" }, "sha512-eCA4zbIhWUFDXoamNztmS0MjXHSEJYlvATzWnRiTqJkcUteSjO94PoRHJy1Xbwp9bptjeIxxBHh+zBWFhttbrQ=="],
+"@tailwindcss/oxide-darwin-x64": ["@tailwindcss/oxide-darwin-x64@4.1.11", "", { "os": "darwin", "cpu": "x64" }, "sha512-EgnK8kRchgmgzG6jE10UQNaH9Mwi2n+yw1jWmof9Vyg2lpKNX2ioe7CJdf9M5f8V9uaQxInenZkOxnTVL3fhAw=="],
-"@tailwindcss/oxide-freebsd-x64": ["@tailwindcss/oxide-freebsd-x64@4.1.10", "", { "os": "freebsd", "cpu": "x64" }, "sha512-8/392Xu12R0cc93DpiJvNpJ4wYVSiciUlkiOHOSOQNH3adq9Gi/dtySK7dVQjXIOzlpSHjeCL89RUUI8/GTI6g=="],
+"@tailwindcss/oxide-freebsd-x64": ["@tailwindcss/oxide-freebsd-x64@4.1.11", "", { "os": "freebsd", "cpu": "x64" }, "sha512-xdqKtbpHs7pQhIKmqVpxStnY1skuNh4CtbcyOHeX1YBE0hArj2romsFGb6yUmzkq/6M24nkxDqU8GYrKrz+UcA=="],
-"@tailwindcss/oxide-linux-arm-gnueabihf": ["@tailwindcss/oxide-linux-arm-gnueabihf@4.1.10", "", { "os": "linux", "cpu": "arm" }, "sha512-t9rhmLT6EqeuPT+MXhWhlRYIMSfh5LZ6kBrC4FS6/+M1yXwfCtp24UumgCWOAJVyjQwG+lYva6wWZxrfvB+NhQ=="],
+"@tailwindcss/oxide-linux-arm-gnueabihf": ["@tailwindcss/oxide-linux-arm-gnueabihf@4.1.11", "", { "os": "linux", "cpu": "arm" }, "sha512-ryHQK2eyDYYMwB5wZL46uoxz2zzDZsFBwfjssgB7pzytAeCCa6glsiJGjhTEddq/4OsIjsLNMAiMlHNYnkEEeg=="],
-"@tailwindcss/oxide-linux-arm64-gnu": ["@tailwindcss/oxide-linux-arm64-gnu@4.1.10", "", { "os": "linux", "cpu": "arm64" }, "sha512-3oWrlNlxLRxXejQ8zImzrVLuZ/9Z2SeKoLhtCu0hpo38hTO2iL86eFOu4sVR8cZc6n3z7eRXXqtHJECa6mFOvA=="],
+"@tailwindcss/oxide-linux-arm64-gnu": ["@tailwindcss/oxide-linux-arm64-gnu@4.1.11", "", { "os": "linux", "cpu": "arm64" }, "sha512-mYwqheq4BXF83j/w75ewkPJmPZIqqP1nhoghS9D57CLjsh3Nfq0m4ftTotRYtGnZd3eCztgbSPJ9QhfC91gDZQ=="],
-"@tailwindcss/oxide-linux-arm64-musl": ["@tailwindcss/oxide-linux-arm64-musl@4.1.10", "", { "os": "linux", "cpu": "arm64" }, "sha512-saScU0cmWvg/Ez4gUmQWr9pvY9Kssxt+Xenfx1LG7LmqjcrvBnw4r9VjkFcqmbBb7GCBwYNcZi9X3/oMda9sqQ=="],
+"@tailwindcss/oxide-linux-arm64-musl": ["@tailwindcss/oxide-linux-arm64-musl@4.1.11", "", { "os": "linux", "cpu": "arm64" }, "sha512-m/NVRFNGlEHJrNVk3O6I9ggVuNjXHIPoD6bqay/pubtYC9QIdAMpS+cswZQPBLvVvEF6GtSNONbDkZrjWZXYNQ=="],
-"@tailwindcss/oxide-linux-x64-gnu": ["@tailwindcss/oxide-linux-x64-gnu@4.1.10", "", { "os": "linux", "cpu": "x64" }, "sha512-/G3ao/ybV9YEEgAXeEg28dyH6gs1QG8tvdN9c2MNZdUXYBaIY/Gx0N6RlJzfLy/7Nkdok4kaxKPHKJUlAaoTdA=="],
+"@tailwindcss/oxide-linux-x64-gnu": ["@tailwindcss/oxide-linux-x64-gnu@4.1.11", "", { "os": "linux", "cpu": "x64" }, "sha512-YW6sblI7xukSD2TdbbaeQVDysIm/UPJtObHJHKxDEcW2exAtY47j52f8jZXkqE1krdnkhCMGqP3dbniu1Te2Fg=="],
-"@tailwindcss/oxide-linux-x64-musl": ["@tailwindcss/oxide-linux-x64-musl@4.1.10", "", { "os": "linux", "cpu": "x64" }, "sha512-LNr7X8fTiKGRtQGOerSayc2pWJp/9ptRYAa4G+U+cjw9kJZvkopav1AQc5HHD+U364f71tZv6XamaHKgrIoVzA=="],
+"@tailwindcss/oxide-linux-x64-musl": ["@tailwindcss/oxide-linux-x64-musl@4.1.11", "", { "os": "linux", "cpu": "x64" }, "sha512-e3C/RRhGunWYNC3aSF7exsQkdXzQ/M+aYuZHKnw4U7KQwTJotnWsGOIVih0s2qQzmEzOFIJ3+xt7iq67K/p56Q=="],
-"@tailwindcss/oxide-wasm32-wasi": ["@tailwindcss/oxide-wasm32-wasi@4.1.10", "", { "dependencies": { "@emnapi/core": "^1.4.3", "@emnapi/runtime": "^1.4.3", "@emnapi/wasi-threads": "^1.0.2", "@napi-rs/wasm-runtime": "^0.2.10", "@tybys/wasm-util": "^0.9.0", "tslib": "^2.8.0" }, "cpu": "none" }, "sha512-d6ekQpopFQJAcIK2i7ZzWOYGZ+A6NzzvQ3ozBvWFdeyqfOZdYHU66g5yr+/HC4ipP1ZgWsqa80+ISNILk+ae/Q=="],
+"@tailwindcss/oxide-wasm32-wasi": ["@tailwindcss/oxide-wasm32-wasi@4.1.11", "", { "dependencies": { "@emnapi/core": "^1.4.3", "@emnapi/runtime": "^1.4.3", "@emnapi/wasi-threads": "^1.0.2", "@napi-rs/wasm-runtime": "^0.2.11", "@tybys/wasm-util": "^0.9.0", "tslib": "^2.8.0" }, "cpu": "none" }, "sha512-Xo1+/GU0JEN/C/dvcammKHzeM6NqKovG+6921MR6oadee5XPBaKOumrJCXvopJ/Qb5TH7LX/UAywbqrP4lax0g=="],
-"@tailwindcss/oxide-win32-arm64-msvc": ["@tailwindcss/oxide-win32-arm64-msvc@4.1.10", "", { "os": "win32", "cpu": "arm64" }, "sha512-i1Iwg9gRbwNVOCYmnigWCCgow8nDWSFmeTUU5nbNx3rqbe4p0kRbEqLwLJbYZKmSSp23g4N6rCDmm7OuPBXhDA=="],
+"@tailwindcss/oxide-win32-arm64-msvc": ["@tailwindcss/oxide-win32-arm64-msvc@4.1.11", "", { "os": "win32", "cpu": "arm64" }, "sha512-UgKYx5PwEKrac3GPNPf6HVMNhUIGuUh4wlDFR2jYYdkX6pL/rn73zTq/4pzUm8fOjAn5L8zDeHp9iXmUGOXZ+w=="],
-"@tailwindcss/oxide-win32-x64-msvc": ["@tailwindcss/oxide-win32-x64-msvc@4.1.10", "", { "os": "win32", "cpu": "x64" }, "sha512-sGiJTjcBSfGq2DVRtaSljq5ZgZS2SDHSIfhOylkBvHVjwOsodBhnb3HdmiKkVuUGKD0I7G63abMOVaskj1KpOA=="],
+"@tailwindcss/oxide-win32-x64-msvc": ["@tailwindcss/oxide-win32-x64-msvc@4.1.11", "", { "os": "win32", "cpu": "x64" }, "sha512-YfHoggn1j0LK7wR82TOucWc5LDCguHnoS879idHekmmiR7g9HUtMw9MI0NHatS28u/Xlkfi9w5RJWgz2Dl+5Qg=="],
-"@tailwindcss/postcss": ["@tailwindcss/postcss@4.1.10", "", { "dependencies": { "@alloc/quick-lru": "^5.2.0", "@tailwindcss/node": "4.1.10", "@tailwindcss/oxide": "4.1.10", "postcss": "^8.4.41", "tailwindcss": "4.1.10" } }, "sha512-B+7r7ABZbkXJwpvt2VMnS6ujcDoR2OOcFaqrLIo1xbcdxje4Vf+VgJdBzNNbrAjBj/rLZ66/tlQ1knIGNLKOBQ=="],
+"@tailwindcss/postcss": ["@tailwindcss/postcss@4.1.11", "", { "dependencies": { "@alloc/quick-lru": "^5.2.0", "@tailwindcss/node": "4.1.11", "@tailwindcss/oxide": "4.1.11", "postcss": "^8.4.41", "tailwindcss": "4.1.11" } }, "sha512-q/EAIIpF6WpLhKEuQSEVMZNMIY8KhWoAemZ9eylNAih9jxMGAYPPWBn3I9QL/2jZ+e7OEz/tZkX5HwbBR4HohA=="],
"@testing-library/dom": ["@testing-library/dom@10.4.0", "", { "dependencies": { "@babel/code-frame": "^7.10.4", "@babel/runtime": "^7.12.5", "@types/aria-query": "^5.0.1", "aria-query": "5.3.0", "chalk": "^4.1.0", "dom-accessibility-api": "^0.5.9", "lz-string": "^1.5.0", "pretty-format": "^27.0.2" } }, "sha512-pemlzrSESWbdAloYml3bAJMEfNh1Z7EduzqPKprCH5S341frlpYnUEW0H72dLxa6IsYr+mPno20GiSm+h9dEdQ=="],
@@ -3374,11 +3374,11 @@
"@tailwindcss/node/jiti": ["jiti@2.4.2", "", { "bin": { "jiti": "lib/jiti-cli.mjs" } }, "sha512-rg9zJN+G4n2nfJl5MW3BMygZX56zKPNVEYYqq7adpmMh4Jn2QNEwhvQlFy6jPVdcod7txZtKHWnyZiA3a0zP7A=="],
-"@tailwindcss/oxide-wasm32-wasi/@emnapi/core": ["@emnapi/core@1.4.3", "", { "dependencies": { "@emnapi/wasi-threads": "1.0.2", "tslib": "^2.4.0" }, "bundled": true }, "sha512-4m62DuCE07lw01soJwPiBGC0nAww0Q+RY70VZ+n49yDIO13yyinhbWCeNnaob0lakDtWQzSdtNWzJeOJt2ma+g=="],
+"@tailwindcss/oxide-wasm32-wasi/@emnapi/core": ["@emnapi/core@1.4.4", "", { "dependencies": { "@emnapi/wasi-threads": "1.0.3", "tslib": "^2.4.0" }, "bundled": true }, "sha512-A9CnAbC6ARNMKcIcrQwq6HeHCjpcBZ5wSx4U01WXCqEKlrzB9F9315WDNHkrs2xbx7YjjSxbUYxuN6EQzpcY2g=="],
"@tailwindcss/oxide-wasm32-wasi/@emnapi/runtime": ["@emnapi/runtime@1.4.3", "", { "dependencies": { "tslib": "^2.4.0" }, "bundled": true }, "sha512-pBPWdu6MLKROBX05wSNKcNb++m5Er+KQ9QkB+WVM+pW2Kx9hoSrVTnu3BdkI5eBLZoKu/J6mW/B6i6bJB2ytXQ=="],
-"@tailwindcss/oxide-wasm32-wasi/@emnapi/wasi-threads": ["@emnapi/wasi-threads@1.0.2", "", { "dependencies": { "tslib": "^2.4.0" }, "bundled": true }, "sha512-5n3nTJblwRi8LlXkJ9eBzu+kZR8Yxcc7ubakyQTFzPMtIhFpUBRbsnc2Dv88IZDIbCDlBiWrknhB4Lsz7mg6BA=="],
+"@tailwindcss/oxide-wasm32-wasi/@emnapi/wasi-threads": ["@emnapi/wasi-threads@1.0.3", "", { "dependencies": { "tslib": "^2.4.0" }, "bundled": true }, "sha512-8K5IFFsQqF9wQNJptGbS6FNKgUTsSRYnTqNCG1vPP8jFdjSv18n2mQfJpkt2Oibo9iBEzcDnDxNwKTzC7svlJw=="],
"@tailwindcss/oxide-wasm32-wasi/@napi-rs/wasm-runtime": ["@napi-rs/wasm-runtime@0.2.11", "", { "dependencies": { "@emnapi/core": "^1.4.3", "@emnapi/runtime": "^1.4.3", "@tybys/wasm-util": "^0.9.0" }, "bundled": true }, "sha512-9DPkXtvHydrcOsopiYpUgPHpmj0HWZKMUnL2dZqpvC42lsratuBG06V5ipyno0fUek5VlFsNQ+AcFATSrJXgMA=="],