mirror of
https://github.com/simstudioai/sim.git
synced 2026-04-06 03:00:16 -04:00
v0.5.6: executor fixes, UI improvements, run paths (#2028)
* test(pr): hackathon (#1999)
* test(pr): github trigger (#2000)
* fix(usage-indicator): conditional rendering, upgrade, and ui/ux (#2001)
* fix: usage-limit indicator and render conditionally on is billing enabled
* fix: upgrade render
* fix(notes): fix notes, tighten spacing, update deprecated zustand function, update use mention data to ignore block position (#2002)
* fix(pdfs): use unpdf instead of pdf-parse (#2004)
* fix(modals): fix z-index for various modals and output selector and variables (#2005)
* fix(condition): treat condition input the same as the code subblock (#2006)
* feat(models): added gpt-5.1 (#2007)
* improvement: runpath edges, blocks, active (#2008)
* feat(i18n): update translations (#2009)
* fix(triggers): check triggermode and consolidate block type (#2011)
* fix(triggers): disabled trigger shouldn't be added to dag (#2012)
* Fix disabled blocks
* Comments
* Fix api/chat trigger not found message
* fix(tags): only show start block upstream if is ancestor (#2013)
* fix(variables): Fix resolution on double < (#2016)
* Fix variable <>
* Ling
* Clean
* feat(billing): add notif for first failed payment, added upgrade email from free, updated providers that supported granular tool control to support them, fixed envvar popover, fixed redirect to wrong workspace after oauth connect (#2015)
* feat(billing): add notif for first failed payment, added upgrade email from free, updated providers that supported granular tool control to support them, fixed envvar popover, fixed redirect to wrong workspace after oauth connect
* fix build
* ack PR comments
* feat(performance): added reactquery hooks for workflow operations, for logs, fixed logs reloading, fix subscription UI (#2017)
* feat(performance): added reactquery hooks for workflow operations, for logs, fixed logs reloading, fix subscription UI
* use useInfiniteQuery for logs fetching
* fix(copilot): run workflow supports input format and fix run id (#2018)
* fix(router): fix error edge in router block + fix source handle problem (#2019)
* Fix router block error port handling
* Remove comment
* Fix edge execution
* improvement: code subblock, action bar, connections (#2024)
* improvement: action bar, connections
* fix: code block draggable resize
* fix(response): fix response block http format (#2027)
* Fix response block
* Lint
* fix(notes): fix notes block spacing, additional logs for billing transfer route (#2029)

---------

Co-authored-by: Vikhyath Mondreti <vikhyathvikku@gmail.com>
Co-authored-by: Emir Karabeg <78010029+emir-karabeg@users.noreply.github.com>
Co-authored-by: Siddharth Ganesan <33737564+Sg312@users.noreply.github.com>
@@ -42,10 +42,10 @@ Der Benutzer-Prompt stellt die primären Eingabedaten für die Inferenzverarbeit
 Der Agent-Block unterstützt mehrere LLM-Anbieter über eine einheitliche Inferenzschnittstelle. Verfügbare Modelle umfassen:
 
-- **OpenAI**: GPT-5, GPT-4o, o1, o3, o4-mini, gpt-4.1
-- **Anthropic**: Claude 3.7 Sonnet
+- **OpenAI**: GPT-5.1, GPT-5, GPT-4o, o1, o3, o4-mini, gpt-4.1
+- **Anthropic**: Claude 4.5 Sonnet, Claude Opus 4.1
 - **Google**: Gemini 2.5 Pro, Gemini 2.0 Flash
-- **Andere Anbieter**: Groq, Cerebras, xAI, DeepSeek
+- **Andere Anbieter**: Groq, Cerebras, xAI, Azure OpenAI, OpenRouter
 - **Lokale Modelle**: Ollama-kompatible Modelle
 
 ### Temperatur
@@ -42,10 +42,10 @@ The user prompt represents the primary input data for inference processing. This
 The Agent block supports multiple LLM providers through a unified inference interface. Available models include:
 
-- **OpenAI**: GPT-5, GPT-4o, o1, o3, o4-mini, gpt-4.1
-- **Anthropic**: Claude 3.7 Sonnet
+- **OpenAI**: GPT-5.1, GPT-5, GPT-4o, o1, o3, o4-mini, gpt-4.1
+- **Anthropic**: Claude 4.5 Sonnet, Claude Opus 4.1
 - **Google**: Gemini 2.5 Pro, Gemini 2.0 Flash
-- **Other Providers**: Groq, Cerebras, xAI, DeepSeek
+- **Other Providers**: Groq, Cerebras, xAI, Azure OpenAI, OpenRouter
 - **Local Models**: Ollama-compatible models
 
 ### Temperature
@@ -42,11 +42,11 @@ El prompt del usuario representa los datos de entrada principales para el proces
 El bloque Agente admite múltiples proveedores de LLM a través de una interfaz de inferencia unificada. Los modelos disponibles incluyen:
 
-- **OpenAI**: GPT-5, GPT-4o, o1, o3, o4-mini, gpt-4.1
-- **Anthropic**: Claude 3.7 Sonnet
+- **OpenAI**: GPT-5.1, GPT-5, GPT-4o, o1, o3, o4-mini, gpt-4.1
+- **Anthropic**: Claude 4.5 Sonnet, Claude Opus 4.1
 - **Google**: Gemini 2.5 Pro, Gemini 2.0 Flash
-- **Otros proveedores**: Groq, Cerebras, xAI, DeepSeek
-- **Modelos locales**: Modelos compatibles con Ollama
+- **Otros proveedores**: Groq, Cerebras, xAI, Azure OpenAI, OpenRouter
+- **Modelos locales**: modelos compatibles con Ollama
 
 ### Temperatura
@@ -42,10 +42,10 @@ Le prompt utilisateur représente les données d'entrée principales pour le tra
 Le bloc Agent prend en charge plusieurs fournisseurs de LLM via une interface d'inférence unifiée. Les modèles disponibles comprennent :
 
-- **OpenAI** : GPT-5, GPT-4o, o1, o3, o4-mini, gpt-4.1
-- **Anthropic** : Claude 3.7 Sonnet
+- **OpenAI** : GPT-5.1, GPT-5, GPT-4o, o1, o3, o4-mini, gpt-4.1
+- **Anthropic** : Claude 4.5 Sonnet, Claude Opus 4.1
 - **Google** : Gemini 2.5 Pro, Gemini 2.0 Flash
-- **Autres fournisseurs** : Groq, Cerebras, xAI, DeepSeek
+- **Autres fournisseurs** : Groq, Cerebras, xAI, Azure OpenAI, OpenRouter
 - **Modèles locaux** : modèles compatibles avec Ollama
 
 ### Température
@@ -42,10 +42,10 @@ When responding to questions about investments, include risk disclaimers.
 エージェントブロックは統一された推論インターフェースを通じて複数のLLMプロバイダーをサポートしています。利用可能なモデルには以下が含まれます:
 
-- **OpenAI**: GPT-5、GPT-4o、o1、o3、o4-mini、gpt-4.1
-- **Anthropic**: Claude 3.7 Sonnet
+- **OpenAI**: GPT-5.1、GPT-5、GPT-4o、o1、o3、o4-mini、gpt-4.1
+- **Anthropic**: Claude 4.5 Sonnet、Claude Opus 4.1
 - **Google**: Gemini 2.5 Pro、Gemini 2.0 Flash
-- **その他のプロバイダー**: Groq、Cerebras、xAI、DeepSeek
+- **その他のプロバイダー**: Groq、Cerebras、xAI、Azure OpenAI、OpenRouter
 - **ローカルモデル**: Ollama互換モデル
 
 ### 温度
@@ -42,10 +42,10 @@ When responding to questions about investments, include risk disclaimers.
 代理模块通过统一的推理接口支持多个 LLM 提供商。可用模型包括:
 
-- **OpenAI**:GPT-5、GPT-4o、o1、o3、o4-mini、gpt-4.1
-- **Anthropic**:Claude 3.7 Sonnet
+- **OpenAI**:GPT-5.1、GPT-5、GPT-4o、o1、o3、o4-mini、gpt-4.1
+- **Anthropic**:Claude 4.5 Sonnet、Claude Opus 4.1
 - **Google**:Gemini 2.5 Pro、Gemini 2.0 Flash
-- **其他提供商**:Groq、Cerebras、xAI、DeepSeek
+- **其他提供商**:Groq、Cerebras、xAI、Azure OpenAI、OpenRouter
 - **本地模型**:兼容 Ollama 的模型
 
 ### 温度
@@ -5117,7 +5117,7 @@ checksums:
 content/9: e688b523909d6d6e9966c17892a18c96
 content/10: e50bd5107ca3410126cf0252b3c47eca
 content/11: d03d17960348dea95c6df8f46114bd0a
-content/12: 3850cfbd618a9d1c836fc7086da0f9b4
+content/12: 80da7e96414b75bb5b910c437bf7894a
 content/13: 6a7479225be3a7c7a42ba557ece50d03
 content/14: c64f9cd5168b3e592fe3341cbe1a41fe
 content/15: 87d6b6280da1c98b1bc291483459c8cf
@@ -1,8 +1,8 @@
 import { render } from '@react-email/components'
 import { type NextRequest, NextResponse } from 'next/server'
 import { z } from 'zod'
-import CareersConfirmationEmail from '@/components/emails/careers-confirmation-email'
-import CareersSubmissionEmail from '@/components/emails/careers-submission-email'
+import CareersConfirmationEmail from '@/components/emails/careers/careers-confirmation-email'
+import CareersSubmissionEmail from '@/components/emails/careers/careers-submission-email'
 import { sendEmail } from '@/lib/email/mailer'
 import { createLogger } from '@/lib/logs/console/logger'
 import { generateRequestId } from '@/lib/utils'
@@ -15,6 +15,7 @@ import { executeWorkflowCore } from '@/lib/workflows/executor/execution-core'
 import { type ExecutionEvent, encodeSSEEvent } from '@/lib/workflows/executor/execution-events'
 import { PauseResumeManager } from '@/lib/workflows/executor/human-in-the-loop-manager'
 import { createStreamingResponse } from '@/lib/workflows/streaming'
+import { createHttpResponseFromBlock, workflowHasResponseBlock } from '@/lib/workflows/utils'
 import { validateWorkflowAccess } from '@/app/api/workflows/middleware'
 import { type ExecutionMetadata, ExecutionSnapshot } from '@/executor/execution/snapshot'
 import type { StreamingExecution } from '@/executor/types'
@@ -495,6 +496,11 @@ export async function POST(req: NextRequest, { params }: { params: Promise<{ id:
       loggingSession,
     })
 
+    const hasResponseBlock = workflowHasResponseBlock(result)
+    if (hasResponseBlock) {
+      return createHttpResponseFromBlock(result)
+    }
+
     const filteredResult = {
       success: result.success,
       output: result.output,
@@ -118,18 +118,18 @@ export async function POST(req: NextRequest) {
 
     logger.info(`[${requestId}] Creating workflow ${workflowId} for user ${session.user.id}`)
 
     // Track workflow creation
     try {
-      const { trackPlatformEvent } = await import('@/lib/telemetry/tracer')
-      trackPlatformEvent('platform.workflow.created', {
-        'workflow.id': workflowId,
-        'workflow.name': name,
-        'workflow.has_workspace': !!workspaceId,
-        'workflow.has_folder': !!folderId,
-      })
+      import('@/lib/telemetry/tracer')
+        .then(({ trackPlatformEvent }) => {
+          trackPlatformEvent('platform.workflow.created', {
+            'workflow.id': workflowId,
+            'workflow.name': name,
+            'workflow.has_workspace': !!workspaceId,
+            'workflow.has_folder': !!folderId,
+          })
+        })
+        .catch(() => {
+          // Silently fail
+        })
     } catch (_e) {
       // Silently fail
     }
 
     await db.insert(workflow).values({
       id: workflowId,
@@ -74,30 +74,6 @@
   animation: dash-animation 1.5s linear infinite !important;
 }
 
-/**
- * Active block ring animation - cycles through gray tones using box-shadow
- */
-@keyframes ring-pulse-colors {
-  0%,
-  100% {
-    box-shadow: 0 0 0 4px var(--surface-14);
-  }
-  33% {
-    box-shadow: 0 0 0 4px var(--surface-12);
-  }
-  66% {
-    box-shadow: 0 0 0 4px var(--surface-15);
-  }
-}
-
-.dark .animate-ring-pulse {
-  animation: ring-pulse-colors 2s ease-in-out infinite !important;
-}
-
-.light .animate-ring-pulse {
-  animation: ring-pulse-colors 2s ease-in-out infinite !important;
-}
-
 /**
  * Dark color tokens - single source of truth for all colors (dark-only)
  */
@@ -135,6 +111,7 @@
   --border-strong: #d1d1d1;
   --divider: #e5e5e5;
   --border-muted: #eeeeee;
+  --border-success: #d5d5d5;
 
   /* Brand & state */
   --brand-400: #8e4cfb;
@@ -250,6 +227,7 @@
   --border-strong: #303030;
   --divider: #393939;
   --border-muted: #424242;
+  --border-success: #575757;
 
   /* Brand & state */
   --brand-400: #8e4cfb;
@@ -23,6 +23,7 @@ import '@/components/emcn/components/code/code.css'
 interface LogSidebarProps {
   log: WorkflowLog | null
   isOpen: boolean
+  isLoadingDetails?: boolean
   onClose: () => void
   onNavigateNext?: () => void
   onNavigatePrev?: () => void
@@ -192,6 +193,7 @@ const BlockContentDisplay = ({
 export function Sidebar({
   log,
   isOpen,
+  isLoadingDetails = false,
   onClose,
   onNavigateNext,
   onNavigatePrev,
@@ -219,15 +221,6 @@ export function Sidebar({
     }
   }, [log?.id])
 
-  const isLoadingDetails = useMemo(() => {
-    if (!log) return false
-    // Only show while we expect details to arrive (has executionId)
-    if (!log.executionId) return false
-    const hasEnhanced = !!log.executionData?.enhanced
-    const hasAnyDetails = hasEnhanced || !!log.cost || Array.isArray(log.executionData?.traceSpans)
-    return !hasAnyDetails
-  }, [log])
-
   const formattedContent = useMemo(() => {
     if (!log) return null
@@ -3,7 +3,6 @@
 import { useCallback, useEffect, useMemo, useRef, useState } from 'react'
-import { Loader2 } from 'lucide-react'
 import { useParams, useRouter, useSearchParams } from 'next/navigation'
 import { createLogger } from '@/lib/logs/console/logger'
 import { soehne } from '@/app/fonts/soehne/soehne'
 import Controls from '@/app/workspace/[workspaceId]/logs/components/dashboard/controls'
 import KPIs from '@/app/workspace/[workspaceId]/logs/components/dashboard/kpis'
@@ -11,12 +10,15 @@ import WorkflowDetails from '@/app/workspace/[workspaceId]/logs/components/dashb
 import WorkflowsList from '@/app/workspace/[workspaceId]/logs/components/dashboard/workflows-list'
 import Timeline from '@/app/workspace/[workspaceId]/logs/components/filters/components/timeline'
+import { mapToExecutionLog, mapToExecutionLogAlt } from '@/app/workspace/[workspaceId]/logs/utils'
+import {
+  useExecutionsMetrics,
+  useGlobalDashboardLogs,
+  useWorkflowDashboardLogs,
+} from '@/hooks/queries/logs'
 import { formatCost } from '@/providers/utils'
 import { useFilterStore } from '@/stores/logs/filters/store'
 import { useWorkflowRegistry } from '@/stores/workflows/registry/store'
 
 const logger = createLogger('Dashboard')
 
 type TimeFilter = '30m' | '1h' | '6h' | '12h' | '24h' | '3d' | '7d' | '14d' | '30d'
 
 interface WorkflowExecution {
@@ -59,15 +61,6 @@ interface ExecutionLog {
   workflowColor?: string
 }
 
-interface WorkflowDetailsDataLocal {
-  errorRates: { timestamp: string; value: number }[]
-  durations: { timestamp: string; value: number }[]
-  executionCounts: { timestamp: string; value: number }[]
-  logs: ExecutionLog[]
-  allLogs: ExecutionLog[]
-  __meta?: { offset: number; hasMore: boolean }
-}
-
 export default function Dashboard() {
   const params = useParams()
   const workspaceId = params.workspaceId as string
@@ -99,23 +92,7 @@ export default function Dashboard() {
     }
   }
   const [endTime, setEndTime] = useState<Date>(new Date())
-  const [executions, setExecutions] = useState<WorkflowExecution[]>([])
-  const [loading, setLoading] = useState(true)
-  const [isRefetching, setIsRefetching] = useState(false)
-  const [error, setError] = useState<string | null>(null)
   const [expandedWorkflowId, setExpandedWorkflowId] = useState<string | null>(null)
-  const [workflowDetails, setWorkflowDetails] = useState<Record<string, WorkflowDetailsDataLocal>>(
-    {}
-  )
-  const [globalDetails, setGlobalDetails] = useState<WorkflowDetailsDataLocal | null>(null)
-  const [globalLogsMeta, setGlobalLogsMeta] = useState<{ offset: number; hasMore: boolean }>({
-    offset: 0,
-    hasMore: true,
-  })
-  const [globalLoadingMore, setGlobalLoadingMore] = useState(false)
-  const [aggregateSegments, setAggregateSegments] = useState<
-    { timestamp: string; totalExecutions: number; successfulExecutions: number }[]
-  >([])
   const [selectedSegments, setSelectedSegments] = useState<Record<string, number[]>>({})
   const [lastAnchorIndices, setLastAnchorIndices] = useState<Record<string, number>>({})
   const [searchQuery, setSearchQuery] = useState('')
@@ -135,6 +112,134 @@ export default function Dashboard() {
 
   const timeFilter = getTimeFilterFromRange(sidebarTimeRange)
 
+  const getStartTime = useCallback(() => {
+    const start = new Date(endTime)
+
+    switch (timeFilter) {
+      case '30m':
+        start.setMinutes(endTime.getMinutes() - 30)
+        break
+      case '1h':
+        start.setHours(endTime.getHours() - 1)
+        break
+      case '6h':
+        start.setHours(endTime.getHours() - 6)
+        break
+      case '12h':
+        start.setHours(endTime.getHours() - 12)
+        break
+      case '24h':
+        start.setHours(endTime.getHours() - 24)
+        break
+      case '3d':
+        start.setDate(endTime.getDate() - 3)
+        break
+      case '7d':
+        start.setDate(endTime.getDate() - 7)
+        break
+      case '14d':
+        start.setDate(endTime.getDate() - 14)
+        break
+      case '30d':
+        start.setDate(endTime.getDate() - 30)
+        break
+      default:
+        start.setHours(endTime.getHours() - 24)
+    }
+
+    return start
+  }, [endTime, timeFilter])
+
+  const metricsFilters = useMemo(
+    () => ({
+      workspaceId,
+      segments: segmentCount || DEFAULT_SEGMENTS,
+      startTime: getStartTime().toISOString(),
+      endTime: endTime.toISOString(),
+      workflowIds: workflowIds.length > 0 ? workflowIds : undefined,
+      folderIds: folderIds.length > 0 ? folderIds : undefined,
+      triggers: triggers.length > 0 ? triggers : undefined,
+    }),
+    [workspaceId, segmentCount, getStartTime, endTime, workflowIds, folderIds, triggers]
+  )
+
+  const logsFilters = useMemo(
+    () => ({
+      workspaceId,
+      startDate: getStartTime().toISOString(),
+      endDate: endTime.toISOString(),
+      workflowIds: workflowIds.length > 0 ? workflowIds : undefined,
+      folderIds: folderIds.length > 0 ? folderIds : undefined,
+      triggers: triggers.length > 0 ? triggers : undefined,
+      limit: 50,
+    }),
+    [workspaceId, getStartTime, endTime, workflowIds, folderIds, triggers]
+  )
+
+  const metricsQuery = useExecutionsMetrics(metricsFilters, {
+    enabled: Boolean(workspaceId),
+  })
+
+  const globalLogsQuery = useGlobalDashboardLogs(logsFilters, {
+    enabled: Boolean(workspaceId),
+  })
+
+  const workflowLogsQuery = useWorkflowDashboardLogs(expandedWorkflowId ?? undefined, logsFilters, {
+    enabled: Boolean(workspaceId) && Boolean(expandedWorkflowId),
+  })
+
+  const executions = metricsQuery.data?.workflows ?? []
+  const aggregateSegments = metricsQuery.data?.aggregateSegments ?? []
+  const loading = metricsQuery.isLoading
+  const isRefetching = metricsQuery.isFetching && !metricsQuery.isLoading
+  const error = metricsQuery.error?.message ?? null
+
+  const globalLogs = useMemo(() => {
+    if (!globalLogsQuery.data?.pages) return []
+    return globalLogsQuery.data.pages.flatMap((page) => page.logs).map(mapToExecutionLog)
+  }, [globalLogsQuery.data?.pages])
+
+  const workflowLogs = useMemo(() => {
+    if (!workflowLogsQuery.data?.pages) return []
+    return workflowLogsQuery.data.pages.flatMap((page) => page.logs).map(mapToExecutionLogAlt)
+  }, [workflowLogsQuery.data?.pages])
+
+  const globalDetails = useMemo(() => {
+    if (!aggregateSegments.length) return null
+
+    const errorRates = aggregateSegments.map((s) => ({
+      timestamp: s.timestamp,
+      value: s.totalExecutions > 0 ? (1 - s.successfulExecutions / s.totalExecutions) * 100 : 0,
+    }))
+
+    const executionCounts = aggregateSegments.map((s) => ({
+      timestamp: s.timestamp,
+      value: s.totalExecutions,
+    }))
+
+    return {
+      errorRates,
+      durations: [],
+      executionCounts,
+      logs: globalLogs,
+      allLogs: globalLogs,
+    }
+  }, [aggregateSegments, globalLogs])
+
+  const workflowDetails = useMemo(() => {
+    if (!expandedWorkflowId || !workflowLogs.length) return {}
+
+    return {
+      [expandedWorkflowId]: {
+        errorRates: [],
+        durations: [],
+        executionCounts: [],
+        logs: workflowLogs,
+        allLogs: workflowLogs,
+      },
+    }
+  }, [expandedWorkflowId, workflowLogs])
+
   useEffect(() => {
     const urlView = searchParams.get('view')
     if (urlView === 'dashboard' || urlView === 'logs') {
@@ -190,362 +295,24 @@ export default function Dashboard() {
|
||||
}
|
||||
}, [executions])
|
||||
|
||||
const getStartTime = useCallback(() => {
|
||||
const start = new Date(endTime)
|
||||
|
||||
switch (timeFilter) {
|
||||
case '30m':
|
||||
start.setMinutes(endTime.getMinutes() - 30)
|
||||
break
|
||||
case '1h':
|
||||
start.setHours(endTime.getHours() - 1)
|
||||
break
|
||||
case '6h':
|
||||
start.setHours(endTime.getHours() - 6)
|
||||
break
|
||||
case '12h':
|
||||
start.setHours(endTime.getHours() - 12)
|
||||
break
|
||||
case '24h':
|
||||
start.setHours(endTime.getHours() - 24)
|
||||
break
|
||||
case '3d':
|
||||
start.setDate(endTime.getDate() - 3)
|
||||
break
|
||||
case '7d':
|
||||
start.setDate(endTime.getDate() - 7)
|
||||
break
|
||||
case '14d':
|
||||
start.setDate(endTime.getDate() - 14)
|
||||
break
|
||||
case '30d':
|
||||
start.setDate(endTime.getDate() - 30)
|
||||
break
|
||||
default:
|
||||
start.setHours(endTime.getHours() - 24)
|
||||
}
|
||||
|
||||
return start
|
||||
}, [endTime, timeFilter])
|
||||
|
||||
const fetchExecutions = useCallback(
|
||||
async (isInitialLoad = false) => {
|
||||
try {
|
||||
if (isInitialLoad) {
|
||||
setLoading(true)
|
||||
} else {
|
||||
setIsRefetching(true)
|
||||
}
|
||||
setError(null)
|
||||
|
||||
const startTime = getStartTime()
|
||||
const params = new URLSearchParams({
|
||||
segments: String(segmentCount || DEFAULT_SEGMENTS),
|
||||
startTime: startTime.toISOString(),
|
||||
endTime: endTime.toISOString(),
|
||||
})
|
||||
|
||||
if (workflowIds.length > 0) {
|
||||
params.set('workflowIds', workflowIds.join(','))
|
||||
}
|
||||
|
||||
if (folderIds.length > 0) {
|
||||
params.set('folderIds', folderIds.join(','))
|
||||
}
|
||||
|
||||
if (triggers.length > 0) {
|
||||
params.set('triggers', triggers.join(','))
|
||||
}
|
||||
|
||||
const response = await fetch(
|
||||
`/api/workspaces/${workspaceId}/metrics/executions?${params.toString()}`
|
||||
)
|
||||
|
||||
if (!response.ok) {
|
||||
throw new Error('Failed to fetch execution history')
|
||||
}
|
||||
|
||||
const data = await response.json()
|
||||
const mapped: WorkflowExecution[] = (data.workflows || []).map((wf: any) => {
|
||||
const segments = (wf.segments || []).map((s: any) => {
|
||||
const total = s.totalExecutions || 0
|
||||
const success = s.successfulExecutions || 0
|
||||
const hasExecutions = total > 0
|
||||
const successRate = hasExecutions ? (success / total) * 100 : 100
|
||||
return {
|
||||
timestamp: s.timestamp,
|
||||
hasExecutions,
|
||||
totalExecutions: total,
|
||||
successfulExecutions: success,
|
||||
successRate,
|
||||
avgDurationMs: typeof s.avgDurationMs === 'number' ? s.avgDurationMs : 0,
|
||||
p50Ms: typeof s.p50Ms === 'number' ? s.p50Ms : 0,
|
||||
p90Ms: typeof s.p90Ms === 'number' ? s.p90Ms : 0,
|
||||
p99Ms: typeof s.p99Ms === 'number' ? s.p99Ms : 0,
|
||||
}
|
||||
})
|
||||
const totals = segments.reduce(
|
||||
(acc: { total: number; success: number }, seg: (typeof segments)[number]) => {
|
||||
acc.total += seg.totalExecutions
|
||||
acc.success += seg.successfulExecutions
|
||||
return acc
|
||||
},
|
||||
{ total: 0, success: 0 }
|
||||
)
|
||||
const overallSuccessRate = totals.total > 0 ? (totals.success / totals.total) * 100 : 100
|
||||
return {
|
||||
workflowId: wf.workflowId,
|
||||
workflowName: wf.workflowName,
|
||||
segments,
|
||||
overallSuccessRate,
|
||||
} as WorkflowExecution
|
||||
})
|
||||
const sortedWorkflows = mapped.sort((a, b) => {
|
||||
const errA = a.overallSuccessRate < 100 ? 1 - a.overallSuccessRate / 100 : 0
|
||||
const errB = b.overallSuccessRate < 100 ? 1 - b.overallSuccessRate / 100 : 0
|
||||
return errB - errA
|
||||
})
|
||||
setExecutions(sortedWorkflows)
|
||||
|
||||
const segmentsCount: number = Number(params.get('segments') || DEFAULT_SEGMENTS)
|
||||
const agg: { timestamp: string; totalExecutions: number; successfulExecutions: number }[] =
|
||||
Array.from({ length: segmentsCount }, (_, i) => {
|
||||
const base = startTime.getTime()
|
||||
const ts = new Date(base + Math.floor((i * (endTime.getTime() - base)) / segmentsCount))
|
||||
return {
|
||||
timestamp: ts.toISOString(),
|
||||
totalExecutions: 0,
|
||||
successfulExecutions: 0,
|
||||
}
|
||||
})
|
||||
for (const wf of data.workflows as any[]) {
|
||||
wf.segments.forEach((s: any, i: number) => {
|
||||
const index = Math.min(i, segmentsCount - 1)
|
||||
agg[index].totalExecutions += s.totalExecutions || 0
|
||||
agg[index].successfulExecutions += s.successfulExecutions || 0
|
||||
})
|
||||
}
|
||||
setAggregateSegments(agg)
|
||||
|
||||
const errorRates = agg.map((s) => ({
|
||||
timestamp: s.timestamp,
|
||||
value: s.totalExecutions > 0 ? (1 - s.successfulExecutions / s.totalExecutions) * 100 : 0,
|
||||
}))
|
||||
const executionCounts = agg.map((s) => ({
|
||||
timestamp: s.timestamp,
|
||||
value: s.totalExecutions,
|
||||
}))
|
||||
|
||||
const logsParams = new URLSearchParams({
|
||||
limit: '50',
|
||||
offset: '0',
|
||||
workspaceId,
|
||||
startDate: startTime.toISOString(),
|
||||
endDate: endTime.toISOString(),
|
||||
order: 'desc',
|
||||
details: 'full',
|
||||
})
|
||||
if (workflowIds.length > 0) logsParams.set('workflowIds', workflowIds.join(','))
|
||||
if (folderIds.length > 0) logsParams.set('folderIds', folderIds.join(','))
|
||||
if (triggers.length > 0) logsParams.set('triggers', triggers.join(','))
|
||||
|
||||
const logsResponse = await fetch(`/api/logs?${logsParams.toString()}`)
|
||||
let mappedLogs: ExecutionLog[] = []
|
||||
if (logsResponse.ok) {
|
||||
const logsData = await logsResponse.json()
|
||||
mappedLogs = (logsData.data || []).map(mapToExecutionLog)
|
||||
}
|
||||
|
||||
setGlobalDetails({
|
||||
errorRates,
|
||||
durations: [],
|
||||
executionCounts,
|
||||
logs: mappedLogs,
|
||||
allLogs: mappedLogs,
|
||||
})
|
||||
setGlobalLogsMeta({ offset: mappedLogs.length, hasMore: mappedLogs.length === 50 })
|
||||
} catch (err) {
|
||||
logger.error('Error fetching executions:', err)
|
||||
setError(err instanceof Error ? err.message : 'An error occurred')
|
||||
} finally {
|
||||
setLoading(false)
|
||||
setIsRefetching(false)
|
||||
}
|
||||
},
|
||||
[workspaceId, timeFilter, endTime, getStartTime, workflowIds, folderIds, triggers, segmentCount]
|
||||
)
|
||||
|
||||
const fetchWorkflowDetails = useCallback(
|
||||
async (workflowId: string, silent = false) => {
|
||||
try {
|
||||
const startTime = getStartTime()
|
||||
const params = new URLSearchParams({
|
||||
startTime: startTime.toISOString(),
|
||||
endTime: endTime.toISOString(),
|
||||
})
|
||||
|
||||
if (triggers.length > 0) {
|
||||
params.set('triggers', triggers.join(','))
|
||||
}
|
||||
|
||||
const response = await fetch(
|
||||
`/api/logs?${new URLSearchParams({
|
||||
limit: '50',
|
||||
offset: '0',
|
||||
workspaceId,
|
||||
startDate: startTime.toISOString(),
|
||||
endDate: endTime.toISOString(),
|
||||
order: 'desc',
|
||||
details: 'full',
|
||||
workflowIds: workflowId,
|
||||
...(triggers.length > 0 ? { triggers: triggers.join(',') } : {}),
|
||||
}).toString()}`
|
||||
)
|
||||
|
||||
if (!response.ok) {
|
||||
throw new Error('Failed to fetch workflow details')
|
||||
}
|
||||
|
||||
const data = await response.json()
|
||||
const mappedLogs: ExecutionLog[] = (data.data || []).map(mapToExecutionLogAlt)
|
||||
|
||||
setWorkflowDetails((prev) => ({
|
||||
...prev,
|
||||
[workflowId]: {
|
||||
errorRates: [],
|
||||
durations: [],
|
||||
executionCounts: [],
|
||||
logs: mappedLogs,
|
||||
allLogs: mappedLogs,
|
||||
__meta: { offset: mappedLogs.length, hasMore: (data.data || []).length === 50 },
|
||||
},
|
||||
}))
|
||||
} catch (err) {
|
||||
logger.error('Error fetching workflow details:', err)
|
||||
}
|
||||
},
|
||||
[workspaceId, endTime, getStartTime, triggers]
|
||||
)
|
||||
|
||||
// Infinite scroll for details logs
|
||||
const loadMoreLogs = useCallback(
|
||||
async (workflowId: string) => {
|
||||
const details = (workflowDetails as any)[workflowId]
|
||||
if (!details) return
|
||||
if (details.__loading) return
|
||||
if (!details.__meta?.hasMore) return
|
||||
try {
|
||||
// mark loading to prevent duplicate fetches
|
||||
setWorkflowDetails((prev) => ({
|
||||
...prev,
|
||||
[workflowId]: { ...(prev as any)[workflowId], __loading: true },
|
||||
}))
|
||||
const startTime = getStartTime()
|
||||
const offset = details.__meta.offset || 0
|
||||
const qp = new URLSearchParams({
|
||||
limit: '50',
|
||||
offset: String(offset),
|
||||
workspaceId,
|
||||
startDate: startTime.toISOString(),
|
||||
endDate: endTime.toISOString(),
|
||||
order: 'desc',
|
||||
details: 'full',
|
||||
workflowIds: workflowId,
|
||||
})
|
||||
if (triggers.length > 0) qp.set('triggers', triggers.join(','))
|
||||
const res = await fetch(`/api/logs?${qp.toString()}`)
|
||||
if (!res.ok) return
|
||||
const data = await res.json()
|
||||
const more: ExecutionLog[] = (data.data || []).map(mapToExecutionLogAlt)
|
||||
|
||||
setWorkflowDetails((prev) => {
|
||||
const cur = prev[workflowId]
|
||||
const seen = new Set<string>()
|
||||
const dedup = [...(cur?.allLogs || []), ...more].filter((x) => {
|
||||
const id = x.id
|
||||
if (seen.has(id)) return false
|
||||
seen.add(id)
|
||||
return true
|
||||
})
|
||||
return {
|
||||
...prev,
|
||||
[workflowId]: {
|
||||
...cur,
|
||||
logs: dedup,
|
||||
allLogs: dedup,
|
||||
__meta: {
|
||||
offset: (cur?.__meta?.offset || 0) + more.length,
|
||||
hasMore: more.length === 50,
|
||||
},
|
||||
__loading: false,
|
||||
},
|
||||
}
|
||||
})
|
||||
} catch {
|
||||
setWorkflowDetails((prev) => ({
|
||||
...prev,
|
||||
[workflowId]: { ...(prev as any)[workflowId], __loading: false },
|
||||
}))
|
||||
(workflowId: string) => {
|
||||
if (
|
||||
workflowId === expandedWorkflowId &&
|
||||
workflowLogsQuery.hasNextPage &&
|
||||
!workflowLogsQuery.isFetchingNextPage
|
||||
) {
|
||||
workflowLogsQuery.fetchNextPage()
|
||||
}
|
||||
},
|
||||
[workspaceId, endTime, getStartTime, triggers, workflowDetails]
|
||||
[expandedWorkflowId, workflowLogsQuery]
|
||||
)
|
||||
|
||||
const loadMoreGlobalLogs = useCallback(async () => {
if (!globalDetails || !globalLogsMeta.hasMore) return
if (globalLoadingMore) return
try {
setGlobalLoadingMore(true)
const startTime = getStartTime()
const qp = new URLSearchParams({
limit: '50',
offset: String(globalLogsMeta.offset || 0),
workspaceId,
startDate: startTime.toISOString(),
endDate: endTime.toISOString(),
order: 'desc',
details: 'full',
})
if (workflowIds.length > 0) qp.set('workflowIds', workflowIds.join(','))
if (folderIds.length > 0) qp.set('folderIds', folderIds.join(','))
if (triggers.length > 0) qp.set('triggers', triggers.join(','))

const res = await fetch(`/api/logs?${qp.toString()}`)
if (!res.ok) return
const data = await res.json()
const more: ExecutionLog[] = (data.data || []).map(mapToExecutionLog)

setGlobalDetails((prev) => {
if (!prev) return prev
const seen = new Set<string>()
const dedup = [...prev.allLogs, ...more].filter((x) => {
const id = x.id
if (seen.has(id)) return false
seen.add(id)
return true
})
return { ...prev, logs: dedup, allLogs: dedup }
})
setGlobalLogsMeta((m) => ({
offset: (m.offset || 0) + more.length,
hasMore: more.length === 50,
}))
} catch {
// ignore
} finally {
setGlobalLoadingMore(false)
const loadMoreGlobalLogs = useCallback(() => {
if (globalLogsQuery.hasNextPage && !globalLogsQuery.isFetchingNextPage) {
globalLogsQuery.fetchNextPage()
}
}, [
globalDetails,
globalLogsMeta,
globalLoadingMore,
workspaceId,
endTime,
getStartTime,
workflowIds,
folderIds,
triggers,
])
}, [globalLogsQuery])

const toggleWorkflow = useCallback(
(workflowId: string) => {
@@ -553,12 +320,9 @@ export default function Dashboard() {
setExpandedWorkflowId(null)
} else {
setExpandedWorkflowId(workflowId)
if (!workflowDetails[workflowId]) {
fetchWorkflowDetails(workflowId)
}
}
},
[expandedWorkflowId, workflowDetails, fetchWorkflowDetails]
[expandedWorkflowId]
)

const handleSegmentClick = useCallback(
@@ -568,13 +332,7 @@ export default function Dashboard() {
_timestamp: string,
mode: 'single' | 'toggle' | 'range'
) => {
// Fetch workflow details if not already loaded
if (!workflowDetails[workflowId]) {
fetchWorkflowDetails(workflowId)
}

if (mode === 'toggle') {
// Toggle mode: Add/remove segment from selection, allowing cross-workflow selection
setSelectedSegments((prev) => {
const currentSegments = prev[workflowId] || []
const exists = currentSegments.includes(segmentIndex)
@@ -584,7 +342,6 @@ export default function Dashboard() {

if (nextSegments.length === 0) {
const { [workflowId]: _, ...rest } = prev
// If this was the only workflow with selections, clear expanded
if (Object.keys(rest).length === 0) {
setExpandedWorkflowId(null)
}
@@ -593,7 +350,6 @@ export default function Dashboard() {

const newState = { ...prev, [workflowId]: nextSegments }

// Set to multi-workflow mode if multiple workflows have selections
const selectedWorkflowIds = Object.keys(newState)
if (selectedWorkflowIds.length > 1) {
setExpandedWorkflowId('__multi__')
@@ -606,27 +362,23 @@ export default function Dashboard() {

setLastAnchorIndices((prev) => ({ ...prev, [workflowId]: segmentIndex }))
} else if (mode === 'single') {
// Single mode: Select this segment, or deselect if already selected
setSelectedSegments((prev) => {
const currentSegments = prev[workflowId] || []
const isOnlySelectedSegment =
currentSegments.length === 1 && currentSegments[0] === segmentIndex
const isOnlyWorkflowSelected = Object.keys(prev).length === 1 && prev[workflowId]

// If this is the only selected segment in the only selected workflow, deselect it
if (isOnlySelectedSegment && isOnlyWorkflowSelected) {
setExpandedWorkflowId(null)
setLastAnchorIndices({})
return {}
}

// Otherwise, select only this segment
setExpandedWorkflowId(workflowId)
setLastAnchorIndices({ [workflowId]: segmentIndex })
return { [workflowId]: [segmentIndex] }
})
} else if (mode === 'range') {
// Range mode: Expand selection within the current workflow
if (expandedWorkflowId === workflowId) {
setSelectedSegments((prev) => {
const currentSegments = prev[workflowId] || []
@@ -638,31 +390,15 @@ export default function Dashboard() {
return { ...prev, [workflowId]: Array.from(union).sort((a, b) => a - b) }
})
} else {
// If clicking range on a different workflow, treat as single click
setExpandedWorkflowId(workflowId)
setSelectedSegments({ [workflowId]: [segmentIndex] })
setLastAnchorIndices({ [workflowId]: segmentIndex })
}
}
},
[expandedWorkflowId, workflowDetails, fetchWorkflowDetails, lastAnchorIndices]
[expandedWorkflowId, workflowDetails, lastAnchorIndices]
)

const isInitialMount = useRef(true)
useEffect(() => {
const isInitial = isInitialMount.current
if (isInitial) {
isInitialMount.current = false
}
fetchExecutions(isInitial)
}, [workspaceId, timeFilter, endTime, workflowIds, folderIds, triggers, segmentCount])

useEffect(() => {
if (expandedWorkflowId) {
fetchWorkflowDetails(expandedWorkflowId)
}
}, [expandedWorkflowId, timeFilter, endTime, workflowIds, folderIds, fetchWorkflowDetails])

useEffect(() => {
setSelectedSegments({})
setLastAnchorIndices({})
@@ -692,68 +428,15 @@ export default function Dashboard() {
}
}, [])

const getShiftLabel = () => {
switch (sidebarTimeRange) {
case 'Past 30 minutes':
return '30 minutes'
case 'Past hour':
return 'hour'
case 'Past 12 hours':
return '12 hours'
case 'Past 24 hours':
return '24 hours'
default:
return 'period'
}
}

const getDateRange = () => {
const start = getStartTime()
return `${start.toLocaleDateString('en-US', { month: 'short', day: 'numeric', hour: 'numeric', minute: '2-digit' })} - ${endTime.toLocaleDateString('en-US', { month: 'short', day: 'numeric', hour: 'numeric', minute: '2-digit', year: 'numeric' })}`
}

const shiftTimeWindow = (direction: 'back' | 'forward') => {
let shift: number
switch (timeFilter) {
case '30m':
shift = 30 * 60 * 1000
break
case '1h':
shift = 60 * 60 * 1000
break
case '6h':
shift = 6 * 60 * 60 * 1000
break
case '12h':
shift = 12 * 60 * 60 * 1000
break
case '24h':
shift = 24 * 60 * 60 * 1000
break
case '3d':
shift = 3 * 24 * 60 * 60 * 1000
break
case '7d':
shift = 7 * 24 * 60 * 60 * 1000
break
case '14d':
shift = 14 * 24 * 60 * 60 * 1000
break
case '30d':
shift = 30 * 24 * 60 * 60 * 1000
break
default:
shift = 24 * 60 * 60 * 1000
}

setEndTime((prev) => new Date(prev.getTime() + (direction === 'forward' ? shift : -shift)))
}

const resetToNow = () => {
setEndTime(new Date())
}

const isLive = endTime.getTime() > Date.now() - 60000 // Within last minute
const [live, setLive] = useState(false)

useEffect(() => {
@@ -768,8 +451,6 @@ export default function Dashboard() {
}
}, [live])

// Infinite scroll is now handled inside WorkflowDetails

return (
<div className={`flex h-full min-w-0 flex-col pl-64 ${soehne.className}`}>
<div className='flex min-w-0 flex-1 overflow-hidden'>
@@ -873,25 +554,21 @@ export default function Dashboard() {
{/* Details section in its own scroll area */}
<div className='min-h-0 flex-1 overflow-auto'>
{(() => {
// Handle multi-workflow selection view
if (expandedWorkflowId === '__multi__') {
const selectedWorkflowIds = Object.keys(selectedSegments)
const totalMs = endTime.getTime() - getStartTime().getTime()
const segMs = totalMs / Math.max(1, segmentCount)

// Collect all unique segment indices across all workflows
const allSegmentIndices = new Set<number>()
for (const indices of Object.values(selectedSegments)) {
indices.forEach((idx) => allSegmentIndices.add(idx))
}
const sortedIndices = Array.from(allSegmentIndices).sort((a, b) => a - b)

// Aggregate logs from all selected workflows/segments
const allLogs: any[] = []
let totalExecutions = 0
let totalSuccess = 0

// Build aggregated chart series
const aggregatedSegments: Array<{
timestamp: string
totalExecutions: number
@@ -900,9 +577,7 @@ export default function Dashboard() {
durationCount: number
}> = []

// Initialize aggregated segments for each unique index
for (const idx of sortedIndices) {
// Get the timestamp from the first workflow that has this index
let timestamp = ''
for (const wfId of selectedWorkflowIds) {
const wf = executions.find((w) => w.workflowId === wfId)
@@ -921,7 +596,6 @@ export default function Dashboard() {
})
}

// Aggregate data from all workflows
for (const wfId of selectedWorkflowIds) {
const wf = executions.find((w) => w.workflowId === wfId)
const details = workflowDetails[wfId]
@@ -929,7 +603,6 @@ export default function Dashboard() {

if (!wf || !details || indices.length === 0) continue

// Calculate time windows for this workflow's selected segments
const windows = indices
.map((idx) => wf.segments[idx])
.filter(Boolean)
@@ -944,7 +617,6 @@ export default function Dashboard() {
const inAnyWindow = (t: number) =>
windows.some((w) => t >= w.start && t < w.end)

// Filter logs for this workflow's selected segments
const workflowLogs = details.allLogs
.filter((log) => inAnyWindow(new Date(log.startedAt).getTime()))
.map((log) => ({
@@ -956,7 +628,6 @@ export default function Dashboard() {

allLogs.push(...workflowLogs)

// Aggregate segment metrics
indices.forEach((idx) => {
const segment = wf.segments[idx]
if (!segment) return
@@ -974,7 +645,6 @@ export default function Dashboard() {
})
}

// Build chart series
const errorRates = aggregatedSegments.map((seg) => ({
timestamp: seg.timestamp,
value:
@@ -993,7 +663,6 @@ export default function Dashboard() {
value: seg.durationCount > 0 ? seg.avgDurationMs / seg.durationCount : 0,
}))

// Sort logs by time (most recent first)
allLogs.sort(
(a, b) => new Date(b.startedAt).getTime() - new Date(a.startedAt).getTime()
)
@@ -1002,13 +671,11 @@ export default function Dashboard() {
const totalRate =
totalExecutions > 0 ? (totalSuccess / totalExecutions) * 100 : 100

// Calculate overall time range across all selected workflows
let multiWorkflowTimeRange: { start: Date; end: Date } | null = null
if (sortedIndices.length > 0) {
const firstIdx = sortedIndices[0]
const lastIdx = sortedIndices[sortedIndices.length - 1]

// Find earliest start time
let earliestStart: Date | null = null
for (const wfId of selectedWorkflowIds) {
const wf = executions.find((w) => w.workflowId === wfId)
@@ -1021,7 +688,6 @@ export default function Dashboard() {
}
}

// Find latest end time
let latestEnd: Date | null = null
for (const wfId of selectedWorkflowIds) {
const wf = executions.find((w) => w.workflowId === wfId)
@@ -1042,7 +708,6 @@ export default function Dashboard() {
}
}

// Get workflow names
const workflowNames = selectedWorkflowIds
.map((id) => executions.find((w) => w.workflowId === id)?.workflowName)
.filter(Boolean) as string[]
@@ -1179,33 +844,25 @@ export default function Dashboard() {
...log,
workflowName: (log as any).workflowName || wf.workflowName,
}))

// Build series from selected segments indices
const idxSet = new Set(workflowSelectedIndices)
const selectedSegs = wf.segments.filter((_, i) => idxSet.has(i))
;(details as any).__filtered = buildSeriesFromSegments(selectedSegs as any)
} else if (details) {
// Clear filtered data when no segments are selected
;(details as any).__filtered = undefined
}

// Compute series data based on selected segments or all segments
const segmentsToUse =
workflowSelectedIndices.length > 0
? wf.segments.filter((_, i) => workflowSelectedIndices.includes(i))
: wf.segments
const series = buildSeriesFromSegments(segmentsToUse as any)

const detailsWithFilteredLogs = details
? {
...details,
logs: logsToDisplay,
...(() => {
const series =
(details as any).__filtered ||
buildSeriesFromSegments(wf.segments as any)
return {
errorRates: series.errorRates,
durations: series.durations,
executionCounts: series.executionCounts,
durationP50: series.durationP50,
durationP90: series.durationP90,
durationP99: series.durationP99,
}
})(),
errorRates: series.errorRates,
durations: series.durations,
executionCounts: series.executionCounts,
durationP50: series.durationP50,
durationP90: series.durationP90,
durationP99: series.durationP99,
}
: undefined

@@ -1261,8 +918,8 @@ export default function Dashboard() {
}}
formatCost={formatCost}
onLoadMore={() => loadMoreLogs(expandedWorkflowId)}
hasMore={(workflowDetails as any)[expandedWorkflowId]?.__meta?.hasMore}
isLoadingMore={(workflowDetails as any)[expandedWorkflowId]?.__loading}
hasMore={workflowLogsQuery.hasNextPage ?? false}
isLoadingMore={workflowLogsQuery.isFetchingNextPage}
/>
)
}
@@ -1297,8 +954,8 @@ export default function Dashboard() {
}}
formatCost={formatCost}
onLoadMore={loadMoreGlobalLogs}
hasMore={globalLogsMeta.hasMore}
isLoadingMore={globalLoadingMore}
hasMore={globalLogsQuery.hasNextPage ?? false}
isLoadingMore={globalLogsQuery.isFetchingNextPage}
/>
)
})()}

@@ -1,10 +1,9 @@
'use client'

import { useCallback, useEffect, useRef, useState } from 'react'
import { useCallback, useEffect, useMemo, useRef, useState } from 'react'
import { AlertCircle, ArrowUpRight, Info, Loader2 } from 'lucide-react'
import Link from 'next/link'
import { useParams } from 'next/navigation'
import { createLogger } from '@/lib/logs/console/logger'
import { parseQuery, queryToApiParams } from '@/lib/logs/query-parser'
import { cn } from '@/lib/utils'
import Controls from '@/app/workspace/[workspaceId]/logs/components/dashboard/controls'
@@ -13,12 +12,12 @@ import { Sidebar } from '@/app/workspace/[workspaceId]/logs/components/sidebar/s
import Dashboard from '@/app/workspace/[workspaceId]/logs/dashboard'
import { formatDate } from '@/app/workspace/[workspaceId]/logs/utils'
import { useFolders } from '@/hooks/queries/folders'
import { useLogDetail, useLogsList } from '@/hooks/queries/logs'
import { useDebounce } from '@/hooks/use-debounce'
import { useFolderStore } from '@/stores/folders/store'
import { useFilterStore } from '@/stores/logs/filters/store'
import type { LogsResponse, WorkflowLog } from '@/stores/logs/filters/types'
import type { WorkflowLog } from '@/stores/logs/filters/types'

const logger = createLogger('Logs')
const LOGS_PER_PAGE = 50

/**
@@ -63,19 +62,7 @@ export default function Logs() {
const workspaceId = params.workspaceId as string

const {
logs,
loading,
error,
setLogs,
setLoading,
setError,
setWorkspaceId,
page,
setPage,
hasMore,
setHasMore,
isFetchingMore,
setIsFetchingMore,
initializeFromURL,
timeRange,
level,
@@ -95,10 +82,6 @@ export default function Logs() {
const [selectedLog, setSelectedLog] = useState<WorkflowLog | null>(null)
const [selectedLogIndex, setSelectedLogIndex] = useState<number>(-1)
const [isSidebarOpen, setIsSidebarOpen] = useState(false)
const [isDetailsLoading, setIsDetailsLoading] = useState(false)
const detailsCacheRef = useRef<Map<string, any>>(new Map())
const detailsAbortRef = useRef<AbortController | null>(null)
const currentDetailsIdRef = useRef<string | null>(null)
const selectedRowRef = useRef<HTMLTableRowElement | null>(null)
const loaderRef = useRef<HTMLDivElement>(null)
const scrollContainerRef = useRef<HTMLDivElement>(null)
@@ -107,16 +90,37 @@ export default function Logs() {
const [searchQuery, setSearchQuery] = useState(storeSearchQuery)
const debouncedSearchQuery = useDebounce(searchQuery, 300)

const [availableWorkflows, setAvailableWorkflows] = useState<string[]>([])
const [availableFolders, setAvailableFolders] = useState<string[]>([])
const [, setAvailableWorkflows] = useState<string[]>([])
const [, setAvailableFolders] = useState<string[]>([])

// Live and refresh state
const [isLive, setIsLive] = useState(false)
const [isRefreshing, setIsRefreshing] = useState(false)
const liveIntervalRef = useRef<NodeJS.Timeout | null>(null)
const isSearchOpenRef = useRef<boolean>(false)

// Sync local search query with store search query
const logFilters = useMemo(
() => ({
timeRange,
level,
workflowIds,
folderIds,
triggers,
searchQuery: debouncedSearchQuery,
limit: LOGS_PER_PAGE,
}),
[timeRange, level, workflowIds, folderIds, triggers, debouncedSearchQuery]
)

const logsQuery = useLogsList(workspaceId, logFilters, {
enabled: Boolean(workspaceId) && isInitialized.current,
refetchInterval: isLive ? 5000 : false,
})

const logDetailQuery = useLogDetail(selectedLog?.id)

const logs = useMemo(() => {
if (!logsQuery.data?.pages) return []
return logsQuery.data.pages.flatMap((page) => page.logs)
}, [logsQuery.data?.pages])

useEffect(() => {
setSearchQuery(storeSearchQuery)
}, [storeSearchQuery])
@@ -182,62 +186,6 @@ export default function Logs() {
const index = logs.findIndex((l) => l.id === log.id)
setSelectedLogIndex(index)
setIsSidebarOpen(true)
setIsDetailsLoading(true)

const currentId = log.id
const prevId = index > 0 ? logs[index - 1]?.id : undefined
const nextId = index < logs.length - 1 ? logs[index + 1]?.id : undefined

if (detailsAbortRef.current) {
try {
detailsAbortRef.current.abort()
} catch {
/* no-op */
}
}
const controller = new AbortController()
detailsAbortRef.current = controller
currentDetailsIdRef.current = currentId

const idsToFetch: Array<{ id: string; merge: boolean }> = []
const cachedCurrent = currentId ? detailsCacheRef.current.get(currentId) : undefined
if (currentId && !cachedCurrent) idsToFetch.push({ id: currentId, merge: true })
if (prevId && !detailsCacheRef.current.has(prevId))
idsToFetch.push({ id: prevId, merge: false })
if (nextId && !detailsCacheRef.current.has(nextId))
idsToFetch.push({ id: nextId, merge: false })

if (cachedCurrent) {
setSelectedLog((prev) =>
prev && prev.id === currentId
? ({ ...(prev as any), ...(cachedCurrent as any) } as any)
: prev
)
setIsDetailsLoading(false)
}
if (idsToFetch.length === 0) return

Promise.all(
idsToFetch.map(async ({ id, merge }) => {
try {
const res = await fetch(`/api/logs/${id}`, { signal: controller.signal })
if (!res.ok) return
const body = await res.json()
const detailed = body?.data
if (detailed) {
detailsCacheRef.current.set(id, detailed)
if (merge && id === currentId) {
setSelectedLog((prev) =>
prev && prev.id === id ? ({ ...(prev as any), ...(detailed as any) } as any) : prev
)
if (currentDetailsIdRef.current === id) setIsDetailsLoading(false)
}
}
} catch (e: any) {
if (e?.name === 'AbortError') return
}
})
).catch(() => {})
}

const handleNavigateNext = useCallback(() => {
@@ -246,54 +194,6 @@ export default function Logs() {
setSelectedLogIndex(nextIndex)
const nextLog = logs[nextIndex]
setSelectedLog(nextLog)
if (detailsAbortRef.current) {
try {
detailsAbortRef.current.abort()
} catch {
/* no-op */
}
}
const controller = new AbortController()
detailsAbortRef.current = controller

const cached = detailsCacheRef.current.get(nextLog.id)
if (cached) {
setSelectedLog((prev) =>
prev && prev.id === nextLog.id ? ({ ...(prev as any), ...(cached as any) } as any) : prev
)
} else {
const prevId = nextIndex > 0 ? logs[nextIndex - 1]?.id : undefined
const afterId = nextIndex < logs.length - 1 ? logs[nextIndex + 1]?.id : undefined
const idsToFetch: Array<{ id: string; merge: boolean }> = []
if (nextLog.id && !detailsCacheRef.current.has(nextLog.id))
idsToFetch.push({ id: nextLog.id, merge: true })
if (prevId && !detailsCacheRef.current.has(prevId))
idsToFetch.push({ id: prevId, merge: false })
if (afterId && !detailsCacheRef.current.has(afterId))
idsToFetch.push({ id: afterId, merge: false })
Promise.all(
idsToFetch.map(async ({ id, merge }) => {
try {
const res = await fetch(`/api/logs/${id}`, { signal: controller.signal })
if (!res.ok) return
const body = await res.json()
const detailed = body?.data
if (detailed) {
detailsCacheRef.current.set(id, detailed)
if (merge && id === nextLog.id) {
setSelectedLog((prev) =>
prev && prev.id === id
? ({ ...(prev as any), ...(detailed as any) } as any)
: prev
)
}
}
} catch (e: any) {
if (e?.name === 'AbortError') return
}
})
).catch(() => {})
}
}
}, [selectedLogIndex, logs])

@@ -303,54 +203,6 @@ export default function Logs() {
setSelectedLogIndex(prevIndex)
const prevLog = logs[prevIndex]
setSelectedLog(prevLog)
if (detailsAbortRef.current) {
try {
detailsAbortRef.current.abort()
} catch {
/* no-op */
}
}
const controller = new AbortController()
detailsAbortRef.current = controller

const cached = detailsCacheRef.current.get(prevLog.id)
if (cached) {
setSelectedLog((prev) =>
prev && prev.id === prevLog.id ? ({ ...(prev as any), ...(cached as any) } as any) : prev
)
} else {
const beforeId = prevIndex > 0 ? logs[prevIndex - 1]?.id : undefined
const afterId = prevIndex < logs.length - 1 ? logs[prevIndex + 1]?.id : undefined
const idsToFetch: Array<{ id: string; merge: boolean }> = []
if (prevLog.id && !detailsCacheRef.current.has(prevLog.id))
idsToFetch.push({ id: prevLog.id, merge: true })
if (beforeId && !detailsCacheRef.current.has(beforeId))
idsToFetch.push({ id: beforeId, merge: false })
if (afterId && !detailsCacheRef.current.has(afterId))
idsToFetch.push({ id: afterId, merge: false })
Promise.all(
idsToFetch.map(async ({ id, merge }) => {
try {
const res = await fetch(`/api/logs/${id}`, { signal: controller.signal })
if (!res.ok) return
const body = await res.json()
const detailed = body?.data
if (detailed) {
detailsCacheRef.current.set(id, detailed)
if (merge && id === prevLog.id) {
setSelectedLog((prev) =>
prev && prev.id === id
? ({ ...(prev as any), ...(detailed as any) } as any)
: prev
)
}
}
} catch (e: any) {
if (e?.name === 'AbortError') return
}
})
).catch(() => {})
}
}
}, [selectedLogIndex, logs])

@@ -369,106 +221,13 @@ export default function Logs() {
}
}, [selectedLogIndex])

const fetchLogs = useCallback(async (pageNum: number, append = false) => {
try {
// Don't fetch if workspaceId is not set
const { workspaceId: storeWorkspaceId } = useFilterStore.getState()
if (!storeWorkspaceId) {
return
}

if (pageNum === 1) {
setLoading(true)
} else {
setIsFetchingMore(true)
}

const { buildQueryParams: getCurrentQueryParams } = useFilterStore.getState()
const queryParams = getCurrentQueryParams(pageNum, LOGS_PER_PAGE)

const { searchQuery: currentSearchQuery } = useFilterStore.getState()
const parsedQuery = parseQuery(currentSearchQuery)
const enhancedParams = queryToApiParams(parsedQuery)

const allParams = new URLSearchParams(queryParams)
Object.entries(enhancedParams).forEach(([key, value]) => {
if (key === 'triggers' && allParams.has('triggers')) {
const existingTriggers = allParams.get('triggers')?.split(',') || []
const searchTriggers = value.split(',')
const combined = [...new Set([...existingTriggers, ...searchTriggers])]
allParams.set('triggers', combined.join(','))
} else {
allParams.set(key, value)
}
})

allParams.set('details', 'basic')
const response = await fetch(`/api/logs?${allParams.toString()}`)

if (!response.ok) {
throw new Error(`Error fetching logs: ${response.statusText}`)
}

const data: LogsResponse = await response.json()

setHasMore(data.data.length === LOGS_PER_PAGE && data.page < data.totalPages)

setLogs(data.data, append)

setError(null)
} catch (err) {
logger.error('Failed to fetch logs:', { err })
setError(err instanceof Error ? err.message : 'An unknown error occurred')
} finally {
if (pageNum === 1) {
setLoading(false)
} else {
setIsFetchingMore(false)
}
}
}, [])

const handleRefresh = async () => {
if (isRefreshing) return

setIsRefreshing(true)

try {
await fetchLogs(1)
setError(null)
} catch (err) {
setError(err instanceof Error ? err.message : 'An unknown error occurred')
} finally {
setIsRefreshing(false)
await logsQuery.refetch()
if (selectedLog?.id) {
await logDetailQuery.refetch()
}
}

// Setup or clear the live refresh interval when isLive changes
useEffect(() => {
if (liveIntervalRef.current) {
clearInterval(liveIntervalRef.current)
liveIntervalRef.current = null
}

if (isLive) {
handleRefresh()
liveIntervalRef.current = setInterval(() => {
handleRefresh()
}, 5000)
}

return () => {
if (liveIntervalRef.current) {
clearInterval(liveIntervalRef.current)
liveIntervalRef.current = null
}
}
}, [isLive])

const toggleLive = () => {
setIsLive(!isLive)
}

const handleExport = async () => {
const params = new URLSearchParams()
params.set('workspaceId', workspaceId)
@@ -506,101 +265,14 @@ export default function Logs() {
return () => window.removeEventListener('popstate', handlePopState)
}, [initializeFromURL])

useEffect(() => {
if (!isInitialized.current) {
return
}

// Don't fetch if workspaceId is not set yet
if (!workspaceId) {
return
}

setPage(1)
setHasMore(true)

const fetchWithFilters = async () => {
try {
setLoading(true)

const params = new URLSearchParams()
params.set('details', 'basic')
params.set('limit', LOGS_PER_PAGE.toString())
params.set('offset', '0') // Always start from page 1
params.set('workspaceId', workspaceId)

const parsedQuery = parseQuery(debouncedSearchQuery)
const enhancedParams = queryToApiParams(parsedQuery)

if (level !== 'all') params.set('level', level)
if (triggers.length > 0) params.set('triggers', triggers.join(','))
if (workflowIds.length > 0) params.set('workflowIds', workflowIds.join(','))
if (folderIds.length > 0) params.set('folderIds', folderIds.join(','))

Object.entries(enhancedParams).forEach(([key, value]) => {
if (key === 'triggers' && params.has('triggers')) {
const storeTriggers = params.get('triggers')?.split(',') || []
const searchTriggers = value.split(',')
const combined = [...new Set([...storeTriggers, ...searchTriggers])]
params.set('triggers', combined.join(','))
} else {
params.set(key, value)
}
})

if (timeRange !== 'All time') {
const now = new Date()
let startDate: Date
switch (timeRange) {
case 'Past 30 minutes':
startDate = new Date(now.getTime() - 30 * 60 * 1000)
break
case 'Past hour':
startDate = new Date(now.getTime() - 60 * 60 * 1000)
break
case 'Past 24 hours':
startDate = new Date(now.getTime() - 24 * 60 * 60 * 1000)
break
default:
startDate = new Date(0)
}
params.set('startDate', startDate.toISOString())
}

const response = await fetch(`/api/logs?${params.toString()}`)

if (!response.ok) {
throw new Error(`Error fetching logs: ${response.statusText}`)
}

const data: LogsResponse = await response.json()
setHasMore(data.data.length === LOGS_PER_PAGE && data.page < data.totalPages)
setLogs(data.data, false)
setError(null)
} catch (err) {
logger.error('Failed to fetch logs:', { err })
setError(err instanceof Error ? err.message : 'An unknown error occurred')
} finally {
setLoading(false)
}
}

fetchWithFilters()
}, [workspaceId, timeRange, level, workflowIds, folderIds, debouncedSearchQuery, triggers])

const loadMoreLogs = useCallback(() => {
if (!isFetchingMore && hasMore) {
const nextPage = page + 1
setPage(nextPage)
setIsFetchingMore(true)
setTimeout(() => {
fetchLogs(nextPage, true)
}, 50)
if (!logsQuery.isFetching && logsQuery.hasNextPage) {
logsQuery.fetchNextPage()
}
}, [fetchLogs, isFetchingMore, hasMore, page])
}, [logsQuery])

useEffect(() => {
|
||||
if (loading || !hasMore) return
|
||||
if (logsQuery.isLoading || !logsQuery.hasNextPage) return
|
||||
|
||||
const scrollContainer = scrollContainerRef.current
|
||||
if (!scrollContainer) return
|
||||
@@ -612,7 +284,7 @@ export default function Logs() {

const scrollPercentage = (scrollTop / (scrollHeight - clientHeight)) * 100

if (scrollPercentage > 60 && !isFetchingMore && hasMore) {
if (scrollPercentage > 60 && !logsQuery.isFetchingNextPage && logsQuery.hasNextPage) {
loadMoreLogs()
}
}
@@ -622,13 +294,14 @@ export default function Logs() {
return () => {
scrollContainer.removeEventListener('scroll', handleScroll)
}
}, [loading, hasMore, isFetchingMore, loadMoreLogs])
}, [logsQuery.isLoading, logsQuery.hasNextPage, logsQuery.isFetchingNextPage, loadMoreLogs])

useEffect(() => {
const currentLoaderRef = loaderRef.current
const scrollContainer = scrollContainerRef.current

if (!currentLoaderRef || !scrollContainer || loading || !hasMore) return
if (!currentLoaderRef || !scrollContainer || logsQuery.isLoading || !logsQuery.hasNextPage)
return

const observer = new IntersectionObserver(
(entries) => {
@@ -636,7 +309,7 @@ export default function Logs() {
if (!e?.isIntersecting) return
const { scrollTop, scrollHeight, clientHeight } = scrollContainer
const pct = (scrollTop / (scrollHeight - clientHeight)) * 100
if (pct > 70 && !isFetchingMore) {
if (pct > 70 && !logsQuery.isFetchingNextPage) {
loadMoreLogs()
}
},
@@ -652,7 +325,7 @@ export default function Logs() {
return () => {
observer.unobserve(currentLoaderRef)
}
}, [loading, hasMore, isFetchingMore, loadMoreLogs])
}, [logsQuery.isLoading, logsQuery.hasNextPage, logsQuery.isFetchingNextPage, loadMoreLogs])

useEffect(() => {
const handleKeyDown = (e: KeyboardEvent) => {
@@ -686,7 +359,6 @@ export default function Logs() {
return () => window.removeEventListener('keydown', handleKeyDown)
}, [logs, selectedLogIndex, isSidebarOpen, selectedLog, handleNavigateNext, handleNavigatePrev])

// If in dashboard mode, show the dashboard
if (viewMode === 'dashboard') {
return <Dashboard />
}
@@ -701,7 +373,7 @@ export default function Logs() {
<div className='flex min-w-0 flex-1 overflow-hidden'>
<div className='flex flex-1 flex-col p-[24px]'>
<Controls
isRefetching={isRefreshing}
isRefetching={logsQuery.isFetching}
resetToNow={handleRefresh}
live={isLive}
setLive={(fn) => setIsLive(fn)}
@@ -750,18 +422,20 @@ export default function Logs() {

{/* Table body - scrollable */}
<div className='flex-1 overflow-y-auto overflow-x-hidden' ref={scrollContainerRef}>
{loading && page === 1 ? (
{logsQuery.isLoading && !logsQuery.data ? (
<div className='flex h-full items-center justify-center'>
<div className='flex items-center gap-[8px] text-[var(--text-secondary)] dark:text-[var(--text-secondary)]'>
<Loader2 className='h-[16px] w-[16px] animate-spin' />
<span className='text-[13px]'>Loading logs...</span>
</div>
</div>
) : error ? (
) : logsQuery.isError ? (
<div className='flex h-full items-center justify-center'>
<div className='flex items-center gap-[8px] text-[var(--text-error)] dark:text-[var(--text-error)]'>
<AlertCircle className='h-[16px] w-[16px]' />
<span className='text-[13px]'>Error: {error}</span>
<span className='text-[13px]'>
Error: {logsQuery.error?.message || 'Failed to load logs'}
</span>
</div>
</div>
) : logs.length === 0 ? (
@@ -778,7 +452,6 @@ export default function Logs() {
const isSelected = selectedLog?.id === log.id
const baseLevel = (log.level || 'info').toLowerCase()
const isError = baseLevel === 'error'
// If it's an error, don't treat it as pending even if hasPendingPause is true
const isPending = !isError && log.hasPendingPause === true
const statusLabel = isPending
? 'Pending'
@@ -906,13 +579,13 @@ export default function Logs() {
})}

{/* Infinite scroll loader */}
{hasMore && (
{logsQuery.hasNextPage && (
<div className='flex items-center justify-center py-[16px]'>
<div
ref={loaderRef}
className='flex items-center gap-[8px] text-[var(--text-secondary)] dark:text-[var(--text-secondary)]'
>
{isFetchingMore ? (
{logsQuery.isFetchingNextPage ? (
<>
<Loader2 className='h-[16px] w-[16px] animate-spin' />
<span className='text-[13px]'>Loading more...</span>
@@ -932,8 +605,9 @@ export default function Logs() {

{/* Log Sidebar */}
<Sidebar
log={selectedLog}
log={logDetailQuery.data || selectedLog}
isOpen={isSidebarOpen}
isLoadingDetails={logDetailQuery.isLoading}
onClose={handleCloseSidebar}
onNavigateNext={handleNavigateNext}
onNavigatePrev={handleNavigatePrev}

@@ -600,6 +600,7 @@ export function Chat() {
onOutputSelect={handleOutputSelection}
disabled={!activeWorkflowId}
placeholder='Select outputs'
align='end'
/>
</div>

@@ -7,7 +7,6 @@ import {
Popover,
PopoverContent,
PopoverItem,
PopoverScrollArea,
PopoverSection,
PopoverTrigger,
} from '@/components/emcn'
@@ -24,6 +23,7 @@ interface OutputSelectProps {
disabled?: boolean
placeholder?: string
valueMode?: 'id' | 'label'
align?: 'start' | 'end' | 'center'
}

export function OutputSelect({
@@ -33,10 +33,13 @@ export function OutputSelect({
disabled = false,
placeholder = 'Select outputs',
valueMode = 'id',
align = 'start',
}: OutputSelectProps) {
const [open, setOpen] = useState(false)
const [highlightedIndex, setHighlightedIndex] = useState(-1)
const triggerRef = useRef<HTMLDivElement>(null)
const popoverRef = useRef<HTMLDivElement>(null)
const contentRef = useRef<HTMLDivElement>(null)
const blocks = useWorkflowStore((state) => state.blocks)
const { isShowingDiff, isDiffReady, diffWorkflow } = useWorkflowDiffStore()
const subBlockValues = useSubBlockStore((state) =>
@@ -230,6 +233,13 @@ export function OutputSelect({
return blockConfig?.bgColor || '#2F55FF'
}

/**
 * Flattened outputs for keyboard navigation
 */
const flattenedOutputs = useMemo(() => {
return Object.values(groupedOutputs).flat()
}, [groupedOutputs])

/**
 * Handles output selection - toggle selection
 */
@@ -246,6 +256,75 @@ export function OutputSelect({
onOutputSelect(newSelectedOutputs)
}

/**
 * Keyboard navigation handler
 */
const handleKeyDown = (e: React.KeyboardEvent) => {
if (flattenedOutputs.length === 0) return

switch (e.key) {
case 'ArrowDown':
e.preventDefault()
setHighlightedIndex((prev) => {
const next = prev < flattenedOutputs.length - 1 ? prev + 1 : 0
return next
})
break

case 'ArrowUp':
e.preventDefault()
setHighlightedIndex((prev) => {
const next = prev > 0 ? prev - 1 : flattenedOutputs.length - 1
return next
})
break

case 'Enter':
e.preventDefault()
if (highlightedIndex >= 0 && highlightedIndex < flattenedOutputs.length) {
handleOutputSelection(flattenedOutputs[highlightedIndex].label)
}
break

case 'Escape':
e.preventDefault()
setOpen(false)
break
}
}

/**
 * Reset highlighted index when popover opens/closes
 */
useEffect(() => {
if (open) {
// Find first selected item, or start at -1
const firstSelectedIndex = flattenedOutputs.findIndex((output) => isSelectedValue(output))
setHighlightedIndex(firstSelectedIndex >= 0 ? firstSelectedIndex : -1)

// Focus the content for keyboard navigation
setTimeout(() => {
contentRef.current?.focus()
}, 0)
} else {
setHighlightedIndex(-1)
}
}, [open, flattenedOutputs])

/**
 * Scroll highlighted item into view
 */
useEffect(() => {
if (highlightedIndex >= 0 && contentRef.current) {
const highlightedElement = contentRef.current.querySelector(
`[data-option-index="${highlightedIndex}"]`
)
if (highlightedElement) {
highlightedElement.scrollIntoView({ behavior: 'smooth', block: 'nearest' })
}
}
}, [highlightedIndex])

/**
 * Closes popover when clicking outside
 */
@@ -288,44 +367,57 @@ export function OutputSelect({
<PopoverContent
ref={popoverRef}
side='bottom'
align='start'
align={align}
sideOffset={4}
maxHeight={140}
maxWidth={140}
minWidth={140}
onOpenAutoFocus={(e) => e.preventDefault()}
onCloseAutoFocus={(e) => e.preventDefault()}
maxHeight={300}
maxWidth={300}
minWidth={200}
onKeyDown={handleKeyDown}
tabIndex={0}
style={{ outline: 'none' }}
>
<PopoverScrollArea className='space-y-[2px]'>
{Object.entries(groupedOutputs).map(([blockName, outputs]) => (
<div key={blockName}>
<PopoverSection>{blockName}</PopoverSection>
<div ref={contentRef} className='space-y-[2px]'>
{Object.entries(groupedOutputs).map(([blockName, outputs]) => {
// Calculate the starting index for this group
const startIndex = flattenedOutputs.findIndex((o) => o.blockName === blockName)

<div className='flex flex-col gap-[2px]'>
{outputs.map((output) => (
<PopoverItem
key={output.id}
active={isSelectedValue(output)}
onClick={() => handleOutputSelection(output.label)}
>
<div
className='flex h-[14px] w-[14px] flex-shrink-0 items-center justify-center rounded'
style={{
backgroundColor: getOutputColor(output.blockId, output.blockType),
}}
>
<span className='font-bold text-[10px] text-white'>
{blockName.charAt(0).toUpperCase()}
</span>
</div>
<span className='min-w-0 flex-1 truncate'>{output.path}</span>
{isSelectedValue(output) && <Check className='h-3 w-3 flex-shrink-0' />}
</PopoverItem>
))}
return (
<div key={blockName}>
<PopoverSection>{blockName}</PopoverSection>

<div className='flex flex-col gap-[2px]'>
{outputs.map((output, localIndex) => {
const globalIndex = startIndex + localIndex
const isHighlighted = globalIndex === highlightedIndex

return (
<PopoverItem
key={output.id}
active={isSelectedValue(output) || isHighlighted}
data-option-index={globalIndex}
onClick={() => handleOutputSelection(output.label)}
onMouseEnter={() => setHighlightedIndex(globalIndex)}
>
<div
className='flex h-[14px] w-[14px] flex-shrink-0 items-center justify-center rounded'
style={{
backgroundColor: getOutputColor(output.blockId, output.blockType),
}}
>
<span className='font-bold text-[10px] text-white'>
{blockName.charAt(0).toUpperCase()}
</span>
</div>
<span className='min-w-0 flex-1 truncate'>{output.path}</span>
{isSelectedValue(output) && <Check className='h-3 w-3 flex-shrink-0' />}
</PopoverItem>
)
})}
</div>
</div>
</div>
))}
</PopoverScrollArea>
)
})}
</div>
</PopoverContent>
</Popover>
)

@@ -35,37 +35,64 @@ const NoteMarkdown = memo(function NoteMarkdown({ content }: { content: string }
<ReactMarkdown
remarkPlugins={[remarkGfm]}
components={{
p: ({ children }) => <p className='mb-0 text-[#E5E5E5] text-sm'>{children}</p>,
h1: ({ children }) => (
<h1 className='mt-0 mb-[-2px] font-semibold text-[#E5E5E5] text-lg'>{children}</h1>
p: ({ children }: any) => (
<p className='mb-2 break-words text-[#E5E5E5] text-sm'>{children}</p>
),
h2: ({ children }) => (
<h2 className='mt-0 mb-[-2px] font-semibold text-[#E5E5E5] text-base'>{children}</h2>
h1: ({ children }: any) => (
<h1 className='mt-3 mb-1 break-words font-semibold text-[#E5E5E5] text-lg first:mt-0'>
{children}
</h1>
),
h3: ({ children }) => (
<h3 className='mt-0 mb-[-2px] font-semibold text-[#E5E5E5] text-sm'>{children}</h3>
h2: ({ children }: any) => (
<h2 className='mt-3 mb-1 break-words font-semibold text-[#E5E5E5] text-base first:mt-0'>
{children}
</h2>
),
h4: ({ children }) => (
<h4 className='mt-0 mb-[-2px] font-semibold text-[#E5E5E5] text-xs'>{children}</h4>
h3: ({ children }: any) => (
<h3 className='mt-3 mb-1 break-words font-semibold text-[#E5E5E5] text-sm first:mt-0'>
{children}
</h3>
),
ul: ({ children }) => (
<ul className='-mt-[2px] mb-0 list-disc pl-4 text-[#E5E5E5] text-sm'>{children}</ul>
h4: ({ children }: any) => (
<h4 className='mt-3 mb-1 break-words font-semibold text-[#E5E5E5] text-xs first:mt-0'>
{children}
</h4>
),
ol: ({ children }) => (
<ol className='-mt-[2px] mb-0 list-decimal pl-4 text-[#E5E5E5] text-sm'>{children}</ol>
ul: ({ children }: any) => (
<ul className='mt-1 mb-2 list-disc break-words pl-4 text-[#E5E5E5] text-sm'>
{children}
</ul>
),
li: ({ children }) => <li className='mb-0'>{children}</li>,
code: ({ inline, children }: any) =>
inline ? (
<code className='rounded bg-[var(--divider)] px-1 py-0.5 text-[#F59E0B] text-xs'>
ol: ({ children }: any) => (
<ol className='mt-1 mb-2 list-decimal break-words pl-4 text-[#E5E5E5] text-sm'>
{children}
</ol>
),
li: ({ children }: any) => <li className='mb-0 break-words'>{children}</li>,
code: ({ inline, className, children, ...props }: any) => {
const isInline = inline || !className?.includes('language-')

if (isInline) {
return (
<code
{...props}
className='whitespace-normal rounded bg-gray-200 px-1 py-0.5 font-mono text-[#F59E0B] text-xs dark:bg-[var(--surface-11)] dark:text-[#F59E0B]'
>
{children}
</code>
)
}

return (
<code
{...props}
className='block whitespace-pre-wrap break-words rounded bg-[#1A1A1A] p-2 text-[#E5E5E5] text-xs'
>
{children}
</code>
) : (
<code className='block rounded bg-[#1A1A1A] p-2 text-[#E5E5E5] text-xs'>
{children}
</code>
),
a: ({ href, children }) => (
)
},
a: ({ href, children }: any) => (
<a
href={href}
target='_blank'
@@ -75,10 +102,12 @@ const NoteMarkdown = memo(function NoteMarkdown({ content }: { content: string }
{children}
</a>
),
strong: ({ children }) => <strong className='font-semibold text-white'>{children}</strong>,
em: ({ children }) => <em className='text-[#B8B8B8]'>{children}</em>,
blockquote: ({ children }) => (
<blockquote className='m-0 border-[#F59E0B] border-l-2 pl-3 text-[#B8B8B8] italic'>
strong: ({ children }: any) => (
<strong className='break-words font-semibold text-white'>{children}</strong>
),
em: ({ children }: any) => <em className='break-words text-[#B8B8B8]'>{children}</em>,
blockquote: ({ children }: any) => (
<blockquote className='mt-1 mb-2 break-words border-[#F59E0B] border-l-2 pl-3 text-[#B8B8B8] italic'>
{children}
</blockquote>
),
@@ -181,15 +210,13 @@ export const NoteBlock = memo(function NoteBlock({ id, data }: NodeProps<NoteBlo
</div>

<div className='relative px-[12px] pt-[6px] pb-[8px]'>
<div className='relative whitespace-pre-wrap break-words'>
<div className='relative break-words'>
{isEmpty ? (
<p className='text-[#868686] text-sm italic'>Add a note...</p>
) : showMarkdown ? (
<NoteMarkdown content={content} />
) : (
<p className='whitespace-pre-wrap text-[#E5E5E5] text-sm leading-relaxed'>
{content}
</p>
<p className='whitespace-pre-wrap text-[#E5E5E5] text-sm leading-snug'>{content}</p>
)}
</div>
</div>

@@ -1,6 +1,7 @@
'use client'

import { useCallback, useEffect, useState } from 'react'
import { shallow } from 'zustand/shallow'
import { createLogger } from '@/lib/logs/console/logger'
import { useWorkflowRegistry } from '@/stores/workflows/registry/store'
import { useWorkflowStore } from '@/stores/workflows/workflow/store'
@@ -109,7 +110,11 @@ export function useMentionData(props: UseMentionDataProps) {
const [workflowBlocks, setWorkflowBlocks] = useState<WorkflowBlockItem[]>([])
const [isLoadingWorkflowBlocks, setIsLoadingWorkflowBlocks] = useState(false)

const workflowStoreBlocks = useWorkflowStore((state) => state.blocks)
// Only subscribe to block keys to avoid re-rendering on position updates
const blockKeys = useWorkflowStore(
useCallback((state) => Object.keys(state.blocks), []),
shallow
)

// Use workflow registry as source of truth for workflows
const registryWorkflows = useWorkflowRegistry((state) => state.workflows)
@@ -139,15 +144,19 @@ export function useMentionData(props: UseMentionDataProps) {

/**
 * Syncs workflow blocks from store
 * Only re-runs when blocks are added/removed (not on position updates)
 */
useEffect(() => {
const syncWorkflowBlocks = async () => {
if (!workflowId || !workflowStoreBlocks || Object.keys(workflowStoreBlocks).length === 0) {
if (!workflowId || blockKeys.length === 0) {
setWorkflowBlocks([])
return
}

try {
// Fetch current blocks from store
const workflowStoreBlocks = useWorkflowStore.getState().blocks

const { registry: blockRegistry } = await import('@/blocks/registry')
const mapped = Object.values(workflowStoreBlocks).map((b: any) => {
const reg = (blockRegistry as any)[b.type]
@@ -169,7 +178,7 @@ export function useMentionData(props: UseMentionDataProps) {
}

syncWorkflowBlocks()
}, [workflowStoreBlocks, workflowId])
}, [blockKeys, workflowId])

/**
 * Ensures past chats are loaded
@@ -323,10 +332,10 @@ export function useMentionData(props: UseMentionDataProps) {
if (!workflowId) return
logger.debug('ensureWorkflowBlocksLoaded called', {
workflowId,
storeBlocksCount: Object.keys(workflowStoreBlocks || {}).length,
storeBlocksCount: blockKeys.length,
workflowBlocksCount: workflowBlocks.length,
})
}, [workflowId, workflowStoreBlocks, workflowBlocks.length])
}, [workflowId, blockKeys.length, workflowBlocks.length])

return {
// State

@@ -35,6 +35,7 @@ import { WandPromptBar } from '@/app/workspace/[workspaceId]/w/[workflowId]/comp
import { useAccessibleReferencePrefixes } from '@/app/workspace/[workspaceId]/w/[workflowId]/hooks/use-accessible-reference-prefixes'
import { useWand } from '@/app/workspace/[workspaceId]/w/[workflowId]/hooks/use-wand'
import type { GenerationType } from '@/blocks/types'
import { createEnvVarPattern, createReferencePattern } from '@/executor/utils/reference-validation'
import { useTagSelection } from '@/hooks/use-tag-selection'
import { normalizeBlockName } from '@/stores/workflows/utils'

@@ -99,14 +100,15 @@ const createHighlightFunction = (
let processedCode = codeToHighlight

// Replace environment variables with placeholders
processedCode = processedCode.replace(/\{\{([^}]+)\}\}/g, (match) => {
processedCode = processedCode.replace(createEnvVarPattern(), (match) => {
const placeholder = `__ENV_VAR_${placeholders.length}__`
placeholders.push({ placeholder, original: match, type: 'env' })
return placeholder
})

// Replace variable references with placeholders
processedCode = processedCode.replace(/<([^>]+)>/g, (match) => {
// Use [^<>]+ to prevent matching across nested brackets (e.g., "<3 <real.ref>" should match separately)
processedCode = processedCode.replace(createReferencePattern(), (match) => {
if (shouldHighlightReference(match)) {
const placeholder = `__VAR_REF_${placeholders.length}__`
placeholders.push({ placeholder, original: match, type: 'var' })

@@ -31,6 +31,7 @@ import {
} from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel-new/components/editor/components/sub-block/components/tag-dropdown/tag-dropdown'
import { useSubBlockValue } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel-new/components/editor/components/sub-block/hooks/use-sub-block-value'
import { useAccessibleReferencePrefixes } from '@/app/workspace/[workspaceId]/w/[workflowId]/hooks/use-accessible-reference-prefixes'
import { createEnvVarPattern, createReferencePattern } from '@/executor/utils/reference-validation'
import { useTagSelection } from '@/hooks/use-tag-selection'
import { normalizeBlockName } from '@/stores/workflows/utils'
import { useWorkflowStore } from '@/stores/workflows/workflow/store'
@@ -864,25 +865,41 @@ export function ConditionInput({
placeholder: string
original: string
type: 'var' | 'env'
shouldHighlight: boolean
}[] = []
let processedCode = codeToHighlight

// Replace environment variables with placeholders
processedCode = processedCode.replace(/\{\{([^}]+)\}\}/g, (match) => {
processedCode = processedCode.replace(createEnvVarPattern(), (match) => {
const placeholder = `__ENV_VAR_${placeholders.length}__`
placeholders.push({ placeholder, original: match, type: 'env' })
placeholders.push({
placeholder,
original: match,
type: 'env',
shouldHighlight: true,
})
return placeholder
})

// Replace variable references with placeholders
processedCode = processedCode.replace(/<([^>]+)>/g, (match) => {
if (shouldHighlightReference(match)) {
const placeholder = `__VAR_REF_${placeholders.length}__`
placeholders.push({ placeholder, original: match, type: 'var' })
return placeholder
// Use [^<>]+ to prevent matching across nested brackets (e.g., "<3 <real.ref>" should match separately)
processedCode = processedCode.replace(
createReferencePattern(),
(match) => {
const shouldHighlight = shouldHighlightReference(match)
if (shouldHighlight) {
const placeholder = `__VAR_REF_${placeholders.length}__`
placeholders.push({
placeholder,
original: match,
type: 'var',
shouldHighlight: true,
})
return placeholder
}
return match
}
return match
})
)

// Apply Prism syntax highlighting
let highlightedCode = highlight(
@@ -892,21 +909,25 @@ export function ConditionInput({
)

// Restore and highlight the placeholders
placeholders.forEach(({ placeholder, original, type }) => {
if (type === 'env') {
highlightedCode = highlightedCode.replace(
placeholder,
`<span class="text-blue-500">${original}</span>`
)
} else if (type === 'var') {
// Escape the < and > for display
const escaped = original.replace(/</g, '&lt;').replace(/>/g, '&gt;')
highlightedCode = highlightedCode.replace(
placeholder,
`<span class="text-blue-500">${escaped}</span>`
)
placeholders.forEach(
({ placeholder, original, type, shouldHighlight }) => {
if (!shouldHighlight) return

if (type === 'env') {
highlightedCode = highlightedCode.replace(
placeholder,
`<span class="text-blue-500">${original}</span>`
)
} else if (type === 'var') {
// Escape the < and > for display
const escaped = original.replace(/</g, '&lt;').replace(/>/g, '&gt;')
highlightedCode = highlightedCode.replace(
placeholder,
`<span class="text-blue-500">${escaped}</span>`
)
}
}
})
)

return highlightedCode
}}

@@ -2,6 +2,8 @@

import type { ReactNode } from 'react'
import { splitReferenceSegment } from '@/lib/workflows/references'
import { REFERENCE } from '@/executor/consts'
import { createCombinedPattern } from '@/executor/utils/reference-validation'
import { normalizeBlockName } from '@/stores/workflows/utils'

export interface HighlightContext {
@@ -43,7 +45,9 @@ export function formatDisplayText(text: string, context?: HighlightContext): Rea
}

const nodes: ReactNode[] = []
const regex = /<[^>]+>|\{\{[^}]+\}\}/g
// Match variable references without allowing nested brackets to prevent matching across references
// e.g., "<3. text <real.ref>" should match "<3" and "<real.ref>", not the whole string
const regex = createCombinedPattern()
let lastIndex = 0
let key = 0

@@ -61,7 +65,7 @@ export function formatDisplayText(text: string, context?: HighlightContext): Rea
pushPlainText(text.slice(lastIndex, index))
}

if (matchText.startsWith('{{')) {
if (matchText.startsWith(REFERENCE.ENV_VAR_START)) {
nodes.push(
<span key={key++} className='text-[#34B5FF] dark:text-[#34B5FF]'>
{matchText}

@@ -1,4 +1,5 @@
import { useEffect, useMemo, useRef, useState } from 'react'
import { Badge } from '@/components/emcn'
import { Input } from '@/components/emcn/components/input/input'
import { Label } from '@/components/ui/label'
import { cn } from '@/lib/utils'
@@ -50,12 +51,13 @@ interface InputMappingFieldProps {
value: string
onChange: (value: string) => void
blockId: string
subBlockId: string
disabled: boolean
accessiblePrefixes: Set<string> | undefined
inputController: ReturnType<typeof useSubBlockInput>
inputRefs: React.MutableRefObject<Map<string, HTMLInputElement>>
overlayRefs: React.MutableRefObject<Map<string, HTMLDivElement>>
inputRefs: React.RefObject<Map<string, HTMLInputElement>>
overlayRefs: React.RefObject<Map<string, HTMLDivElement>>
collapsed: boolean
onToggleCollapse: () => void
}

/**
@@ -169,6 +171,7 @@ export function InputMapping({

const [childInputFields, setChildInputFields] = useState<InputFormatField[]>([])
const [isLoading, setIsLoading] = useState(false)
const [collapsedFields, setCollapsedFields] = useState<Record<string, boolean>>({})

useEffect(() => {
let isMounted = true
@@ -245,6 +248,13 @@
setMapping(updated)
}

const toggleCollapse = (fieldName: string) => {
setCollapsedFields((prev) => ({
...prev,
[fieldName]: !prev[fieldName],
}))
}

if (!selectedWorkflowId) {
return (
<div className='flex flex-col items-center justify-center rounded-[4px] border border-[var(--border-strong)] bg-[#1F1F1F] p-8 text-center'>
@@ -278,12 +288,13 @@
value=''
onChange={() => {}}
blockId={blockId}
subBlockId={subBlockId}
disabled={true}
accessiblePrefixes={accessiblePrefixes}
inputController={inputController}
inputRefs={inputRefs}
overlayRefs={overlayRefs}
collapsed={false}
onToggleCollapse={() => {}}
/>
</div>
)
@@ -303,12 +314,13 @@
value={valueObj[field.name] || ''}
onChange={(value) => handleFieldUpdate(field.name, value)}
blockId={blockId}
subBlockId={subBlockId}
disabled={isPreview || disabled}
accessiblePrefixes={accessiblePrefixes}
inputController={inputController}
inputRefs={inputRefs}
overlayRefs={overlayRefs}
collapsed={collapsedFields[field.name] || false}
onToggleCollapse={() => toggleCollapse(field.name)}
/>
))}
</div>
@@ -326,12 +338,13 @@ function InputMappingField({
value,
onChange,
blockId,
subBlockId,
disabled,
accessiblePrefixes,
inputController,
inputRefs,
overlayRefs,
collapsed,
onToggleCollapse,
}: InputMappingFieldProps) {
const fieldId = fieldName
const fieldState = inputController.fieldHelpers.getFieldState(fieldId)
@@ -354,64 +367,91 @@ function InputMappingField({
}

return (
<div className='group relative overflow-visible rounded-[4px] border border-[var(--border-strong)] bg-[#1F1F1F]'>
<div className='flex items-center justify-between bg-transparent px-[10px] py-[5px]'>
<Label className='font-medium text-[14px] text-[var(--text-tertiary)]'>{fieldName}</Label>
{fieldType && (
<span className='rounded-md bg-[#2A2A2A] px-1.5 py-0.5 font-mono text-[10px] text-[var(--text-tertiary)]'>
{fieldType}
<div
className={cn(
'rounded-[4px] border border-[var(--border-strong)] bg-[#1F1F1F]',
collapsed ? 'overflow-hidden' : 'overflow-visible'
)}
>
<div
className='flex cursor-pointer items-center justify-between bg-transparent px-[10px] py-[5px]'
onClick={onToggleCollapse}
>
<div className='flex min-w-0 flex-1 items-center gap-[8px]'>
<span className='block truncate font-medium text-[14px] text-[var(--text-tertiary)]'>
{fieldName}
</span>
)}
{fieldType && <Badge className='h-[20px] text-[13px]'>{fieldType}</Badge>}
</div>
</div>
<div className='relative w-full border-[var(--border-strong)] border-t bg-transparent'>
<Input
ref={(el) => {
if (el) inputRefs.current.set(fieldId, el)
}}
className={cn(
'allow-scroll !bg-transparent w-full overflow-auto rounded-none border-0 px-[10px] py-[8px] text-transparent caret-white [-ms-overflow-style:none] [scrollbar-width:none] placeholder:text-[var(--text-muted)] focus-visible:ring-0 focus-visible:ring-offset-0 [&::-webkit-scrollbar]:hidden'
)}
type='text'
value={value}
onChange={handlers.onChange}
onKeyDown={handlers.onKeyDown}
onScroll={handleScroll}
onDrop={handlers.onDrop}
onDragOver={handlers.onDragOver}
autoComplete='off'
disabled={disabled}
/>
<div
ref={(el) => {
if (el) overlayRefs.current.set(fieldId, el)
}}
className='pointer-events-none absolute inset-0 flex items-center overflow-x-auto bg-transparent px-[10px] py-[8px] font-medium font-sans text-[#eeeeee] text-sm [-ms-overflow-style:none] [scrollbar-width:none] [&::-webkit-scrollbar]:hidden'
>
<div className='min-w-fit whitespace-pre'>
{formatDisplayText(value, {
accessiblePrefixes,
highlightAll: !accessiblePrefixes,
})}

{!collapsed && (
<div className='flex flex-col gap-[6px] border-[var(--border-strong)] border-t px-[10px] pt-[6px] pb-[10px]'>
<div className='space-y-[4px]'>
<Label className='text-[13px]'>Value</Label>
<div className='relative'>
<Input
ref={(el) => {
if (el) inputRefs.current.set(fieldId, el)
}}
name='value'
value={value}
onChange={handlers.onChange}
onKeyDown={handlers.onKeyDown}
onDrop={handlers.onDrop}
onDragOver={handlers.onDragOver}
onScroll={(e) => handleScroll(e)}
onPaste={() =>
setTimeout(() => {
const input = inputRefs.current.get(fieldId)
input && handleScroll({ currentTarget: input } as any)
}, 0)
}
placeholder='Enter value or reference'
disabled={disabled}
autoComplete='off'
className={cn(
'allow-scroll w-full overflow-auto text-transparent caret-foreground'
)}
style={{ overflowX: 'auto' }}
/>
<div
ref={(el) => {
if (el) overlayRefs.current.set(fieldId, el)
}}
className='pointer-events-none absolute inset-0 flex items-center overflow-x-auto bg-transparent px-[8px] py-[6px] font-medium font-sans text-sm'
style={{ overflowX: 'auto' }}
>
<div
className='w-full whitespace-pre'
style={{ scrollbarWidth: 'none', minWidth: 'fit-content' }}
|
||||
>
|
||||
{formatDisplayText(
|
||||
value,
|
||||
accessiblePrefixes ? { accessiblePrefixes } : { highlightAll: true }
|
||||
)}
|
||||
</div>
|
||||
</div>
|
||||
{fieldState.showTags && (
|
||||
<TagDropdown
|
||||
visible={fieldState.showTags}
|
||||
onSelect={tagSelectHandler}
|
||||
blockId={blockId}
|
||||
activeSourceBlockId={fieldState.activeSourceBlockId}
|
||||
inputValue={value}
|
||||
cursorPosition={fieldState.cursorPosition}
|
||||
onClose={() => inputController.fieldHelpers.hideFieldDropdowns(fieldId)}
|
||||
inputRef={
|
||||
{
|
||||
current: inputRefs.current.get(fieldId) || null,
|
||||
} as React.RefObject<HTMLInputElement>
|
||||
}
|
||||
/>
|
||||
)}
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{fieldState.showTags && (
|
||||
<TagDropdown
|
||||
visible={fieldState.showTags}
|
||||
onSelect={tagSelectHandler}
|
||||
blockId={blockId}
|
||||
activeSourceBlockId={fieldState.activeSourceBlockId}
|
||||
inputValue={value}
|
||||
cursorPosition={fieldState.cursorPosition}
|
||||
onClose={() => inputController.fieldHelpers.hideFieldDropdowns(fieldId)}
|
||||
inputRef={
|
||||
{
|
||||
current: inputRefs.current.get(fieldId) || null,
|
||||
} as React.RefObject<HTMLInputElement>
|
||||
}
|
||||
/>
|
||||
)}
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
)
|
||||
}
|
||||
|
||||
@@ -1642,19 +1642,19 @@ export function ToolInput({
|
||||
<p className='text-xs'>
|
||||
{tool.usageControl === 'auto' && (
|
||||
<span>
|
||||
<span className='font-medium'>Auto:</span> The model decides when to
|
||||
use the tool
|
||||
<span className='font-medium' /> The model decides when to use the
|
||||
tool
|
||||
</span>
|
||||
)}
|
||||
{tool.usageControl === 'force' && (
|
||||
<span>
|
||||
<span className='font-medium'>Force:</span> Always use this tool in
|
||||
the response
|
||||
<span className='font-medium' /> Always use this tool in the
|
||||
response
|
||||
</span>
|
||||
)}
|
||||
{tool.usageControl === 'none' && (
|
||||
<span>
|
||||
<span className='font-medium'>Deny:</span> Never use this tool
|
||||
<span className='font-medium' /> Never use this tool
|
||||
</span>
|
||||
)}
|
||||
</p>
|
||||
|
||||
@@ -1,17 +1,9 @@
|
||||
import { useRef, useState } from 'react'
|
||||
import { useEffect, useRef, useState } from 'react'
|
||||
import { Plus } from 'lucide-react'
|
||||
import { useParams } from 'next/navigation'
|
||||
import { Badge, Button, Combobox, Input } from '@/components/emcn'
|
||||
import { Trash } from '@/components/emcn/icons/trash'
|
||||
import { Button } from '@/components/ui/button'
|
||||
import { Input } from '@/components/ui/input'
|
||||
import { Label } from '@/components/ui/label'
|
||||
import {
|
||||
Select,
|
||||
SelectContent,
|
||||
SelectItem,
|
||||
SelectTrigger,
|
||||
SelectValue,
|
||||
} from '@/components/ui/select'
|
||||
import { Textarea } from '@/components/ui/textarea'
|
||||
import { cn } from '@/lib/utils'
|
||||
import { formatDisplayText } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel-new/components/editor/components/sub-block/components/formatted-text'
|
||||
@@ -68,6 +60,7 @@ export function VariablesInput({
|
||||
const valueInputRefs = useRef<Record<string, HTMLInputElement | HTMLTextAreaElement>>({})
|
||||
const overlayRefs = useRef<Record<string, HTMLDivElement>>({})
|
||||
const [dragHighlight, setDragHighlight] = useState<Record<string, boolean>>({})
|
||||
const [collapsedAssignments, setCollapsedAssignments] = useState<Record<string, boolean>>({})
|
||||
|
||||
const currentWorkflowVariables = Object.values(workflowVariables).filter(
|
||||
(v: Variable) => v.workflowId === workflowId
|
||||
@@ -75,6 +68,7 @@ export function VariablesInput({
|
||||
|
||||
const value = isPreview ? previewValue : storeValue
|
||||
const assignments: VariableAssignment[] = value || []
|
||||
const isReadOnly = isPreview || disabled
|
||||
|
||||
const getAvailableVariablesFor = (currentAssignmentId: string) => {
|
||||
const otherSelectedIds = new Set(
|
||||
@@ -91,8 +85,41 @@ export function VariablesInput({
|
||||
const allVariablesAssigned =
|
||||
!hasNoWorkflowVariables && getAvailableVariablesFor('new').length === 0
|
||||
|
||||
// Initialize with one empty assignment if none exist and not in preview/disabled mode
|
||||
// Also add assignment when first variable is created
|
||||
useEffect(() => {
|
||||
if (!isReadOnly && assignments.length === 0 && currentWorkflowVariables.length > 0) {
|
||||
const initialAssignment: VariableAssignment = {
|
||||
...DEFAULT_ASSIGNMENT,
|
||||
id: crypto.randomUUID(),
|
||||
}
|
||||
setStoreValue([initialAssignment])
|
||||
}
|
||||
}, [currentWorkflowVariables.length, isReadOnly, assignments.length, setStoreValue])
|
||||
|
||||
// Clean up assignments when their associated variables are deleted
|
||||
useEffect(() => {
|
||||
if (isReadOnly || assignments.length === 0) return
|
||||
|
||||
const currentVariableIds = new Set(currentWorkflowVariables.map((v) => v.id))
|
||||
const validAssignments = assignments.filter((assignment) => {
|
||||
// Keep assignments that haven't selected a variable yet
|
||||
if (!assignment.variableId) return true
|
||||
// Keep assignments whose variable still exists
|
||||
return currentVariableIds.has(assignment.variableId)
|
||||
})
|
||||
|
||||
// If all variables were deleted, clear all assignments
|
||||
if (currentWorkflowVariables.length === 0) {
|
||||
setStoreValue([])
|
||||
} else if (validAssignments.length !== assignments.length) {
|
||||
// Some assignments reference deleted variables, remove them
|
||||
setStoreValue(validAssignments.length > 0 ? validAssignments : [])
|
||||
}
|
||||
}, [currentWorkflowVariables, assignments, isReadOnly, setStoreValue])
|
||||
|
||||
const addAssignment = () => {
|
||||
if (isPreview || disabled) return
|
||||
if (isPreview || disabled || allVariablesAssigned) return
|
||||
|
||||
const newAssignment: VariableAssignment = {
|
||||
...DEFAULT_ASSIGNMENT,
|
||||
@@ -219,6 +246,13 @@ export function VariablesInput({
|
||||
setDragHighlight((prev) => ({ ...prev, [assignmentId]: false }))
|
||||
}
|
||||
|
||||
const toggleCollapse = (assignmentId: string) => {
|
||||
setCollapsedAssignments((prev) => ({
|
||||
...prev,
|
||||
[assignmentId]: !prev[assignmentId],
|
||||
}))
|
||||
}
|
||||
|
||||
if (isPreview && (!assignments || assignments.length === 0)) {
|
||||
return (
|
||||
<div className='flex flex-col items-center justify-center rounded-md border border-border/40 border-dashed bg-muted/20 py-8 text-center'>
|
||||
@@ -244,225 +278,195 @@ export function VariablesInput({
|
||||
}
|
||||
|
||||
if (!isPreview && hasNoWorkflowVariables && assignments.length === 0) {
|
||||
return (
|
||||
<div className='flex flex-col items-center justify-center rounded-lg border border-border/50 bg-muted/30 p-8 text-center'>
|
||||
<svg
|
||||
className='mb-3 h-10 w-10 text-muted-foreground/60'
|
||||
fill='none'
|
||||
viewBox='0 0 24 24'
|
||||
stroke='currentColor'
|
||||
>
|
||||
<path
|
||||
strokeLinecap='round'
|
||||
strokeLinejoin='round'
|
||||
strokeWidth={1.5}
|
||||
d='M9 12h6m-6 4h6m2 5H7a2 2 0 01-2-2V5a2 2 0 012-2h5.586a1 1 0 01.707.293l5.414 5.414a1 1 0 01.293.707V19a2 2 0 01-2 2z'
|
||||
/>
|
||||
</svg>
|
||||
<p className='font-medium text-muted-foreground text-sm'>No variables found</p>
|
||||
<p className='mt-1 text-muted-foreground/80 text-xs'>
|
||||
Add variables in the Variables panel to get started
|
||||
</p>
|
||||
</div>
|
||||
)
|
||||
return <p className='text-[var(--text-muted)] text-sm'>No variables available</p>
|
||||
}
|
||||
|
||||
return (
|
||||
<div className='space-y-2'>
|
||||
{assignments && assignments.length > 0 ? (
|
||||
<div className='space-y-2'>
|
||||
{assignments.map((assignment) => {
|
||||
<div className='space-y-[8px]'>
|
||||
{assignments && assignments.length > 0 && (
|
||||
<div className='space-y-[8px]'>
|
||||
{assignments.map((assignment, index) => {
|
||||
const collapsed = collapsedAssignments[assignment.id] || false
|
||||
const availableVars = getAvailableVariablesFor(assignment.id)
|
||||
|
||||
return (
|
||||
<div
|
||||
key={assignment.id}
|
||||
className='group relative rounded-lg border border-border/50 bg-background/50 p-3 transition-all hover:border-border hover:bg-background'
|
||||
>
|
||||
{!isPreview && !disabled && (
|
||||
<Button
|
||||
variant='ghost'
|
||||
size='icon'
|
||||
className='absolute top-2 right-2 h-6 w-6 opacity-0 transition-opacity group-hover:opacity-100'
|
||||
onClick={() => removeAssignment(assignment.id)}
|
||||
>
|
||||
<Trash className='h-3.5 w-3.5 text-muted-foreground hover:text-destructive' />
|
||||
</Button>
|
||||
data-assignment-id={assignment.id}
|
||||
className={cn(
|
||||
'rounded-[4px] border border-[var(--border-strong)] bg-[#1F1F1F]',
|
||||
collapsed ? 'overflow-hidden' : 'overflow-visible'
|
||||
)}
|
||||
|
||||
<div className='space-y-3'>
|
||||
<div className='space-y-1.5'>
|
||||
<div className='flex items-center justify-between pr-8'>
|
||||
<Label className='text-xs'>Variable</Label>
|
||||
{assignment.variableName && (
|
||||
<span className='rounded-md bg-muted px-1.5 py-0.5 font-mono text-[10px] text-muted-foreground'>
|
||||
{assignment.type}
|
||||
</span>
|
||||
)}
|
||||
</div>
|
||||
<Select
|
||||
value={assignment.variableId || assignment.variableName || ''}
|
||||
onValueChange={(value) => {
|
||||
if (value === '__new__') {
|
||||
return
|
||||
}
|
||||
handleVariableSelect(assignment.id, value)
|
||||
}}
|
||||
disabled={isPreview || disabled}
|
||||
>
|
||||
<SelectTrigger className='h-9 border border-input bg-white dark:border-input/60 dark:bg-background'>
|
||||
<SelectValue placeholder='Select a variable...' />
|
||||
</SelectTrigger>
|
||||
<SelectContent>
|
||||
{(() => {
|
||||
const availableVars = getAvailableVariablesFor(assignment.id)
|
||||
return availableVars.length > 0 ? (
|
||||
availableVars.map((variable) => (
|
||||
<SelectItem key={variable.id} value={variable.id}>
|
||||
{variable.name}
|
||||
</SelectItem>
|
||||
))
|
||||
) : (
|
||||
<div className='p-2 text-center text-muted-foreground text-sm'>
|
||||
{currentWorkflowVariables.length > 0
|
||||
? 'All variables have been assigned.'
|
||||
: 'No variables defined in this workflow.'}
|
||||
{currentWorkflowVariables.length === 0 && (
|
||||
<>
|
||||
<br />
|
||||
Add them in the Variables panel.
|
||||
</>
|
||||
)}
|
||||
</div>
|
||||
)
|
||||
})()}
|
||||
</SelectContent>
|
||||
</Select>
|
||||
>
|
||||
<div
|
||||
className='flex cursor-pointer items-center justify-between bg-transparent px-[10px] py-[5px]'
|
||||
onClick={() => toggleCollapse(assignment.id)}
|
||||
>
|
||||
<div className='flex min-w-0 flex-1 items-center gap-[8px]'>
|
||||
<span className='block truncate font-medium text-[14px] text-[var(--text-tertiary)]'>
|
||||
{assignment.variableName || `Variable ${index + 1}`}
|
||||
</span>
|
||||
{assignment.variableName && (
|
||||
<Badge className='h-[20px] text-[13px]'>{assignment.type}</Badge>
|
||||
)}
|
||||
</div>
|
||||
|
||||
<div className='space-y-1.5'>
|
||||
<Label className='text-xs'>Value</Label>
|
||||
{assignment.type === 'object' || assignment.type === 'array' ? (
|
||||
<div className='relative'>
|
||||
<Textarea
|
||||
ref={(el) => {
|
||||
if (el) valueInputRefs.current[assignment.id] = el
|
||||
}}
|
||||
value={assignment.value || ''}
|
||||
onChange={(e) =>
|
||||
handleValueInputChange(
|
||||
assignment.id,
|
||||
e.target.value,
|
||||
e.target.selectionStart ?? undefined
|
||||
)
|
||||
}
|
||||
placeholder={
|
||||
assignment.type === 'object'
|
||||
? '{\n "key": "value"\n}'
|
||||
: '[\n 1, 2, 3\n]'
|
||||
}
|
||||
disabled={isPreview || disabled}
|
||||
className={cn(
|
||||
'min-h-[120px] border border-input bg-white font-mono text-sm text-transparent caret-foreground placeholder:text-muted-foreground/50 dark:border-input/60 dark:bg-background',
|
||||
dragHighlight[assignment.id] && 'ring-2 ring-blue-500 ring-offset-2'
|
||||
)}
|
||||
style={{
|
||||
fontFamily: 'inherit',
|
||||
lineHeight: 'inherit',
|
||||
wordBreak: 'break-word',
|
||||
whiteSpace: 'pre-wrap',
|
||||
}}
|
||||
onDrop={(e) => handleDrop(e, assignment.id)}
|
||||
onDragOver={(e) => handleDragOver(e, assignment.id)}
|
||||
onDragLeave={(e) => handleDragLeave(e, assignment.id)}
|
||||
/>
|
||||
<div
|
||||
ref={(el) => {
|
||||
if (el) overlayRefs.current[assignment.id] = el
|
||||
}}
|
||||
className='pointer-events-none absolute inset-0 flex items-start overflow-auto bg-transparent px-3 py-2 font-mono text-sm'
|
||||
style={{
|
||||
fontFamily: 'inherit',
|
||||
lineHeight: 'inherit',
|
||||
}}
|
||||
>
|
||||
<div className='w-full whitespace-pre-wrap break-words'>
|
||||
{formatDisplayText(assignment.value || '', {
|
||||
accessiblePrefixes,
|
||||
highlightAll: !accessiblePrefixes,
|
||||
})}
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
) : (
|
||||
<div className='relative'>
|
||||
<Input
|
||||
ref={(el) => {
|
||||
if (el) valueInputRefs.current[assignment.id] = el
|
||||
}}
|
||||
value={assignment.value || ''}
|
||||
onChange={(e) =>
|
||||
handleValueInputChange(
|
||||
assignment.id,
|
||||
e.target.value,
|
||||
e.target.selectionStart ?? undefined
|
||||
)
|
||||
}
|
||||
placeholder={`${assignment.type} value`}
|
||||
disabled={isPreview || disabled}
|
||||
autoComplete='off'
|
||||
className={cn(
|
||||
'h-9 border border-input bg-white text-transparent caret-foreground placeholder:text-muted-foreground/50 dark:border-input/60 dark:bg-background',
|
||||
dragHighlight[assignment.id] && 'ring-2 ring-blue-500 ring-offset-2'
|
||||
)}
|
||||
onDrop={(e) => handleDrop(e, assignment.id)}
|
||||
onDragOver={(e) => handleDragOver(e, assignment.id)}
|
||||
onDragLeave={(e) => handleDragLeave(e, assignment.id)}
|
||||
/>
|
||||
<div
|
||||
ref={(el) => {
|
||||
if (el) overlayRefs.current[assignment.id] = el
|
||||
}}
|
||||
className='pointer-events-none absolute inset-0 flex items-center overflow-hidden bg-transparent px-3 text-sm'
|
||||
>
|
||||
<div className='w-full whitespace-nowrap'>
|
||||
{formatDisplayText(assignment.value || '', {
|
||||
accessiblePrefixes,
|
||||
highlightAll: !accessiblePrefixes,
|
||||
})}
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
)}
|
||||
|
||||
{showTags && activeFieldId === assignment.id && (
|
||||
<TagDropdown
|
||||
visible={showTags}
|
||||
onSelect={handleTagSelect}
|
||||
blockId={blockId}
|
||||
activeSourceBlockId={activeSourceBlockId}
|
||||
inputValue={assignment.value || ''}
|
||||
cursorPosition={cursorPosition}
|
||||
onClose={() => setShowTags(false)}
|
||||
className='absolute top-full left-0 z-50 mt-1'
|
||||
/>
|
||||
)}
|
||||
<div
|
||||
className='flex items-center gap-[8px] pl-[8px]'
|
||||
onClick={(e) => e.stopPropagation()}
|
||||
>
|
||||
<Button
|
||||
variant='ghost'
|
||||
onClick={addAssignment}
|
||||
disabled={isPreview || disabled || allVariablesAssigned}
|
||||
className='h-auto p-0'
|
||||
>
|
||||
<Plus className='h-[14px] w-[14px]' />
|
||||
<span className='sr-only'>Add Variable</span>
|
||||
</Button>
|
||||
<Button
|
||||
variant='ghost'
|
||||
onClick={() => removeAssignment(assignment.id)}
|
||||
disabled={isPreview || disabled || assignments.length === 1}
|
||||
className='h-auto p-0 text-[var(--text-error)] hover:text-[var(--text-error)]'
|
||||
>
|
||||
<Trash className='h-[14px] w-[14px]' />
|
||||
<span className='sr-only'>Delete Variable</span>
|
||||
</Button>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{!collapsed && (
|
||||
<div className='flex flex-col gap-[6px] border-[var(--border-strong)] border-t px-[10px] pt-[6px] pb-[10px]'>
|
||||
<div className='flex flex-col gap-[4px]'>
|
||||
<Label className='text-[13px]'>Variable</Label>
|
||||
<Combobox
|
||||
options={availableVars.map((v) => ({ label: v.name, value: v.id }))}
|
||||
value={assignment.variableId || assignment.variableName || ''}
|
||||
onChange={(value) => handleVariableSelect(assignment.id, value)}
|
||||
placeholder='Select a variable...'
|
||||
disabled={isPreview || disabled}
|
||||
/>
|
||||
</div>
|
||||
|
||||
<div className='space-y-[4px]'>
|
||||
<Label className='text-[13px]'>Value</Label>
|
||||
{assignment.type === 'object' || assignment.type === 'array' ? (
|
||||
<div className='relative'>
|
||||
<Textarea
|
||||
ref={(el) => {
|
||||
if (el) valueInputRefs.current[assignment.id] = el
|
||||
}}
|
||||
value={assignment.value || ''}
|
||||
onChange={(e) =>
|
||||
handleValueInputChange(
|
||||
assignment.id,
|
||||
e.target.value,
|
||||
e.target.selectionStart ?? undefined
|
||||
)
|
||||
}
|
||||
placeholder={
|
||||
assignment.type === 'object'
|
||||
? '{\n "key": "value"\n}'
|
||||
: '[\n 1, 2, 3\n]'
|
||||
}
|
||||
disabled={isPreview || disabled}
|
||||
className={cn(
|
||||
'min-h-[120px] font-mono text-sm text-transparent caret-foreground placeholder:text-muted-foreground/50',
|
||||
dragHighlight[assignment.id] && 'ring-2 ring-blue-500 ring-offset-2'
|
||||
)}
|
||||
style={{
|
||||
fontFamily: 'inherit',
|
||||
lineHeight: 'inherit',
|
||||
wordBreak: 'break-word',
|
||||
whiteSpace: 'pre-wrap',
|
||||
}}
|
||||
onDrop={(e) => handleDrop(e, assignment.id)}
|
||||
onDragOver={(e) => handleDragOver(e, assignment.id)}
|
||||
onDragLeave={(e) => handleDragLeave(e, assignment.id)}
|
||||
/>
|
||||
<div
|
||||
ref={(el) => {
|
||||
if (el) overlayRefs.current[assignment.id] = el
|
||||
}}
|
||||
className='pointer-events-none absolute inset-0 flex items-start overflow-auto bg-transparent px-3 py-2 font-mono text-sm'
|
||||
style={{
|
||||
fontFamily: 'inherit',
|
||||
lineHeight: 'inherit',
|
||||
}}
|
||||
>
|
||||
<div className='w-full whitespace-pre-wrap break-words'>
|
||||
{formatDisplayText(assignment.value || '', {
|
||||
accessiblePrefixes,
|
||||
highlightAll: !accessiblePrefixes,
|
||||
})}
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
) : (
|
||||
<div className='relative'>
|
||||
<Input
|
||||
ref={(el) => {
|
||||
if (el) valueInputRefs.current[assignment.id] = el
|
||||
}}
|
||||
name='value'
|
||||
value={assignment.value || ''}
|
||||
onChange={(e) =>
|
||||
handleValueInputChange(
|
||||
assignment.id,
|
||||
e.target.value,
|
||||
e.target.selectionStart ?? undefined
|
||||
)
|
||||
}
|
||||
placeholder={`${assignment.type} value`}
|
||||
disabled={isPreview || disabled}
|
||||
autoComplete='off'
|
||||
className={cn(
|
||||
'allow-scroll w-full overflow-auto text-transparent caret-foreground',
|
||||
dragHighlight[assignment.id] && 'ring-2 ring-blue-500 ring-offset-2'
|
||||
)}
|
||||
style={{ overflowX: 'auto' }}
|
||||
onDrop={(e) => handleDrop(e, assignment.id)}
|
||||
onDragOver={(e) => handleDragOver(e, assignment.id)}
|
||||
onDragLeave={(e) => handleDragLeave(e, assignment.id)}
|
||||
/>
|
||||
<div
|
||||
ref={(el) => {
|
||||
if (el) overlayRefs.current[assignment.id] = el
|
||||
}}
|
||||
className='pointer-events-none absolute inset-0 flex items-center overflow-x-auto bg-transparent px-[8px] py-[6px] font-medium font-sans text-sm'
|
||||
style={{ overflowX: 'auto' }}
|
||||
>
|
||||
<div
|
||||
className='w-full whitespace-pre'
|
||||
style={{ scrollbarWidth: 'none', minWidth: 'fit-content' }}
|
||||
>
|
||||
{formatDisplayText(
|
||||
assignment.value || '',
|
||||
accessiblePrefixes ? { accessiblePrefixes } : { highlightAll: true }
|
||||
)}
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
)}
|
||||
|
||||
{showTags && activeFieldId === assignment.id && (
|
||||
<TagDropdown
|
||||
visible={showTags}
|
||||
onSelect={handleTagSelect}
|
||||
blockId={blockId}
|
||||
activeSourceBlockId={activeSourceBlockId}
|
||||
inputValue={assignment.value || ''}
|
||||
cursorPosition={cursorPosition}
|
||||
onClose={() => setShowTags(false)}
|
||||
/>
|
||||
)}
|
||||
</div>
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
)
|
||||
})}
|
||||
</div>
|
||||
) : null}
|
||||
|
||||
{!isPreview && !disabled && !hasNoWorkflowVariables && (
|
||||
<Button
|
||||
onClick={addAssignment}
|
||||
variant='outline'
|
||||
className='h-9 w-full border-dashed'
|
||||
disabled={allVariablesAssigned}
|
||||
>
|
||||
<Plus className='mr-2 h-4 w-4' />
|
||||
{allVariablesAssigned ? 'All Variables Assigned' : 'Add Variable Assignment'}
|
||||
</Button>
|
||||
)}
|
||||
</div>
|
||||
)
|
||||
|
||||
@@ -210,9 +210,14 @@ export function Editor() {
|
||||
/>
|
||||
) : (
|
||||
<h2
|
||||
className='min-w-0 flex-1 cursor-pointer truncate pr-[8px] font-medium text-[14px] text-[var(--white)] dark:text-[var(--white)]'
|
||||
className='min-w-0 flex-1 cursor-pointer select-none truncate pr-[8px] font-medium text-[14px] text-[var(--white)] dark:text-[var(--white)]'
|
||||
title={title}
|
||||
onDoubleClick={handleStartRename}
|
||||
onMouseDown={(e) => {
|
||||
if (e.detail === 2) {
|
||||
e.preventDefault()
|
||||
}
|
||||
}}
|
||||
>
|
||||
{title}
|
||||
</h2>
|
||||
|
||||
@@ -7,6 +7,7 @@ import {
|
||||
} from '@/lib/workflows/references'
|
||||
import { checkTagTrigger } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel-new/components/editor/components/sub-block/components/tag-dropdown/tag-dropdown'
|
||||
import { useAccessibleReferencePrefixes } from '@/app/workspace/[workspaceId]/w/[workflowId]/hooks/use-accessible-reference-prefixes'
|
||||
import { createEnvVarPattern, createReferencePattern } from '@/executor/utils/reference-validation'
|
||||
import { useCollaborativeWorkflow } from '@/hooks/use-collaborative-workflow'
|
||||
import { normalizeBlockName } from '@/stores/workflows/utils'
|
||||
import { useWorkflowStore } from '@/stores/workflows/workflow/store'
|
||||
@@ -133,13 +134,14 @@ export function useSubflowEditor(currentBlock: BlockState | null, currentBlockId
|
||||
|
||||
let processedCode = code
|
||||
|
||||
processedCode = processedCode.replace(/\{\{([^}]+)\}\}/g, (match) => {
|
||||
processedCode = processedCode.replace(createEnvVarPattern(), (match) => {
|
||||
const placeholder = `__ENV_VAR_${placeholders.length}__`
|
||||
placeholders.push({ placeholder, original: match, type: 'env' })
|
||||
return placeholder
|
||||
})
|
||||
|
||||
processedCode = processedCode.replace(/<[^>]+>/g, (match) => {
|
||||
// Use [^<>]+ to prevent matching across nested brackets (e.g., "<3 <real.ref>" should match separately)
|
||||
processedCode = processedCode.replace(createReferencePattern(), (match) => {
|
||||
if (shouldHighlightReference(match)) {
|
||||
const placeholder = `__VAR_REF_${placeholders.length}__`
|
||||
placeholders.push({ placeholder, original: match, type: 'var' })
|
||||
|
||||
@@ -1,7 +1,7 @@
|
||||
'use client'
|
||||
|
||||
import { useCallback, useEffect, useRef, useState } from 'react'
|
||||
import { Braces, Square } from 'lucide-react'
|
||||
import { ArrowDown, Braces, Square } from 'lucide-react'
|
||||
import { useParams, useRouter } from 'next/navigation'
|
||||
import {
|
||||
BubbleChatPreview,
|
||||
@@ -22,12 +22,13 @@ import {
|
||||
PopoverTrigger,
|
||||
Trash,
|
||||
} from '@/components/emcn'
|
||||
import { VariableIcon } from '@/components/icons'
|
||||
import { createLogger } from '@/lib/logs/console/logger'
|
||||
import { useRegisterGlobalCommands } from '@/app/workspace/[workspaceId]/providers/global-commands-provider'
|
||||
import { useUserPermissionsContext } from '@/app/workspace/[workspaceId]/providers/workspace-permissions-provider'
|
||||
import { Variables } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/variables/variables'
|
||||
import { useWorkflowExecution } from '@/app/workspace/[workspaceId]/w/[workflowId]/hooks/use-workflow-execution'
|
||||
import { useDeleteWorkflow } from '@/app/workspace/[workspaceId]/w/hooks'
|
||||
import { useDeleteWorkflow, useImportWorkflow } from '@/app/workspace/[workspaceId]/w/hooks'
|
||||
import { useChatStore } from '@/stores/chat/store'
|
||||
import { usePanelStore } from '@/stores/panel-new/store'
|
||||
import type { PanelTab } from '@/stores/panel-new/types'
|
||||
@@ -62,6 +63,7 @@ export function Panel() {
|
||||
const workspaceId = params.workspaceId as string
|
||||
|
||||
const panelRef = useRef<HTMLElement>(null)
|
||||
const fileInputRef = useRef<HTMLInputElement>(null)
|
||||
const { activeTab, setActiveTab, panelWidth, _hasHydrated, setHasHydrated } = usePanelStore()
|
||||
const copilotRef = useRef<{
|
||||
createNewChat: () => void
|
||||
@@ -77,6 +79,7 @@ export function Panel() {
|
||||
|
||||
// Hooks
|
||||
const userPermissions = useUserPermissionsContext()
|
||||
const { isImporting, handleFileChange } = useImportWorkflow({ workspaceId })
|
||||
const {
|
||||
workflows,
|
||||
activeWorkflowId,
|
||||
@@ -262,6 +265,14 @@ export function Panel() {
|
||||
workspaceId,
|
||||
])
|
||||
|
||||
/**
|
||||
* Handles triggering file input for workflow import
|
||||
*/
|
||||
const handleImportWorkflow = useCallback(() => {
|
||||
setIsMenuOpen(false)
|
||||
fileInputRef.current?.click()
|
||||
}, [])
|
||||
|
||||
// Compute run button state
|
||||
const canRun = userPermissions.canRead // Running only requires read permissions
|
||||
const isLoadingPermissions = userPermissions.isLoading
|
||||
@@ -314,7 +325,7 @@ export function Panel() {
|
||||
</PopoverItem>
|
||||
{
|
||||
<PopoverItem onClick={() => setVariablesOpen(!isVariablesOpen)}>
|
||||
<Braces className='h-3 w-3' />
|
||||
<VariableIcon className='h-3 w-3' />
|
||||
<span>Variables</span>
|
||||
</PopoverItem>
|
||||
}
|
||||
@@ -331,7 +342,14 @@ export function Panel() {
|
||||
disabled={isExporting || !currentWorkflow}
|
||||
>
|
||||
<Braces className='h-3 w-3' />
|
||||
<span>Export JSON</span>
|
||||
<span>Export workflow</span>
|
||||
</PopoverItem>
|
||||
<PopoverItem
|
||||
onClick={handleImportWorkflow}
|
||||
disabled={isImporting || !userPermissions.canEdit}
|
||||
>
|
||||
<ArrowDown className='h-3 w-3' />
|
||||
<span>Import workflow</span>
|
||||
</PopoverItem>
|
||||
<PopoverItem
|
||||
onClick={handleDuplicateWorkflow}
|
||||
@@ -499,6 +517,16 @@ export function Panel() {
|
||||
|
||||
{/* Floating Variables Modal */}
|
||||
<Variables />
|
||||
|
||||
{/* Hidden file input for workflow import */}
|
||||
<input
|
||||
ref={fileInputRef}
|
||||
type='file'
|
||||
accept='.json,.zip'
|
||||
multiple
|
||||
style={{ display: 'none' }}
|
||||
onChange={handleFileChange}
|
||||
/>
|
||||
</>
|
||||
)
|
||||
}
|
||||
|
||||
@@ -74,10 +74,10 @@ export const ActionBar = memo(
|
||||
return (
|
||||
<div
|
||||
className={cn(
|
||||
'-right-20 absolute top-0',
|
||||
'flex flex-col items-center',
|
||||
'-top-[46px] absolute right-0',
|
||||
'flex flex-row items-center',
|
||||
'opacity-0 transition-opacity duration-200 group-hover:opacity-100',
|
||||
'gap-[6px] rounded-[10px] bg-[var(--surface-3)] p-[6px]'
|
||||
'gap-[5px] rounded-[10px] bg-[var(--surface-3)] p-[5px]'
|
||||
)}
|
||||
>
|
||||
<Tooltip.Root>
|
||||
@@ -90,17 +90,17 @@ export const ActionBar = memo(
|
||||
collaborativeToggleBlockEnabled(blockId)
|
||||
}
|
||||
}}
|
||||
className='h-[30px] w-[30px] rounded-[8px] bg-[var(--surface-9)] p-0 text-[#868686] hover:bg-[var(--brand-secondary)] hover:text-[var(--bg)] dark:text-[#868686] dark:hover:bg-[var(--brand-secondary)] dark:hover:text-[var(--bg)]'
|
||||
className='h-[23px] w-[23px] rounded-[8px] bg-[var(--surface-9)] p-0 text-[#868686] hover:bg-[var(--brand-secondary)] hover:text-[var(--bg)] dark:text-[#868686] dark:hover:bg-[var(--brand-secondary)] dark:hover:text-[var(--bg)]'
|
||||
disabled={disabled}
|
||||
>
|
||||
{isEnabled ? (
|
||||
<Circle className='h-[14px] w-[14px]' />
|
||||
<Circle className='h-[11px] w-[11px]' />
|
||||
) : (
|
||||
<CircleOff className='h-[14px] w-[14px]' />
|
||||
<CircleOff className='h-[11px] w-[11px]' />
|
||||
)}
|
||||
</Button>
|
||||
</Tooltip.Trigger>
|
||||
<Tooltip.Content side='right'>
|
||||
<Tooltip.Content side='top'>
|
||||
{getTooltipMessage(isEnabled ? 'Disable Block' : 'Enable Block')}
|
||||
</Tooltip.Content>
|
||||
</Tooltip.Root>
|
||||
@@ -116,13 +116,13 @@ export const ActionBar = memo(
               collaborativeDuplicateBlock(blockId)
             }
           }}
-          className='h-[30px] w-[30px] rounded-[8px] bg-[var(--surface-9)] p-0 text-[#868686] hover:bg-[var(--brand-secondary)] hover:text-[var(--bg)] dark:text-[#868686] dark:hover:bg-[var(--brand-secondary)] dark:hover:text-[var(--bg)]'
+          className='h-[23px] w-[23px] rounded-[8px] bg-[var(--surface-9)] p-0 text-[#868686] hover:bg-[var(--brand-secondary)] hover:text-[var(--bg)] dark:text-[#868686] dark:hover:bg-[var(--brand-secondary)] dark:hover:text-[var(--bg)]'
           disabled={disabled}
         >
-          <Duplicate className='h-[14px] w-[14px]' />
+          <Duplicate className='h-[11px] w-[11px]' />
         </Button>
       </Tooltip.Trigger>
-      <Tooltip.Content side='right'>{getTooltipMessage('Duplicate Block')}</Tooltip.Content>
+      <Tooltip.Content side='top'>{getTooltipMessage('Duplicate Block')}</Tooltip.Content>
     </Tooltip.Root>
   )}

@@ -139,15 +139,13 @@ export const ActionBar = memo(
               )
             }
           }}
-          className='h-[30px] w-[30px] rounded-[8px] bg-[var(--surface-9)] p-0 text-[#868686] hover:bg-[var(--brand-secondary)] hover:text-[var(--bg)] dark:text-[#868686] dark:hover:bg-[var(--brand-secondary)] dark:hover:text-[var(--bg)]'
+          className='h-[23px] w-[23px] rounded-[8px] bg-[var(--surface-9)] p-0 text-[#868686] hover:bg-[var(--brand-secondary)] hover:text-[var(--bg)] dark:text-[#868686] dark:hover:bg-[var(--brand-secondary)] dark:hover:text-[var(--bg)]'
           disabled={disabled || !userPermissions.canEdit}
         >
-          <LogOut className='h-[14px] w-[14px]' />
+          <LogOut className='h-[11px] w-[11px]' />
         </Button>
       </Tooltip.Trigger>
-      <Tooltip.Content side='right'>
-        {getTooltipMessage('Remove From Subflow')}
-      </Tooltip.Content>
+      <Tooltip.Content side='top'>{getTooltipMessage('Remove from Subflow')}</Tooltip.Content>
     </Tooltip.Root>
   )}

@@ -161,17 +159,17 @@ export const ActionBar = memo(
             collaborativeToggleBlockHandles(blockId)
           }
         }}
-        className='h-[30px] w-[30px] rounded-[8px] bg-[var(--surface-9)] p-0 text-[#868686] hover:bg-[var(--brand-secondary)] hover:text-[var(--bg)] dark:text-[#868686] dark:hover:bg-[var(--brand-secondary)] dark:hover:text-[var(--bg)]'
+        className='h-[23px] w-[23px] rounded-[8px] bg-[var(--surface-9)] p-0 text-[#868686] hover:bg-[var(--brand-secondary)] hover:text-[var(--bg)] dark:text-[#868686] dark:hover:bg-[var(--brand-secondary)] dark:hover:text-[var(--bg)]'
        disabled={disabled}
       >
         {horizontalHandles ? (
-          <ArrowLeftRight className='h-[14px] w-[14px]' />
+          <ArrowLeftRight className='h-[11px] w-[11px]' />
         ) : (
-          <ArrowUpDown className='h-[14px] w-[14px]' />
+          <ArrowUpDown className='h-[11px] w-[11px]' />
         )}
       </Button>
     </Tooltip.Trigger>
-    <Tooltip.Content side='right'>
+    <Tooltip.Content side='top'>
       {getTooltipMessage(horizontalHandles ? 'Vertical Ports' : 'Horizontal Ports')}
     </Tooltip.Content>
   </Tooltip.Root>
@@ -186,13 +184,13 @@ export const ActionBar = memo(
             collaborativeRemoveBlock(blockId)
           }
         }}
-        className='h-[30px] w-[30px] rounded-[8px] bg-[var(--surface-9)] p-0 text-[#868686] hover:bg-[var(--brand-secondary)] hover:text-[var(--bg)] dark:text-[#868686] dark:hover:bg-[var(--brand-secondary)] dark:hover:text-[var(--bg)] '
+        className='h-[23px] w-[23px] rounded-[8px] bg-[var(--surface-9)] p-0 text-[#868686] hover:bg-[var(--brand-secondary)] hover:text-[var(--bg)] dark:text-[#868686] dark:hover:bg-[var(--brand-secondary)] dark:hover:text-[var(--bg)] '
        disabled={disabled}
       >
-        <Trash2 className='h-[14px] w-[14px]' />
+        <Trash2 className='h-[11px] w-[11px]' />
       </Button>
     </Tooltip.Trigger>
-    <Tooltip.Content side='right'>{getTooltipMessage('Delete Block')}</Tooltip.Content>
+    <Tooltip.Content side='top'>{getTooltipMessage('Delete Block')}</Tooltip.Content>
   </Tooltip.Root>
 </div>
 )

@@ -1,84 +1,23 @@
-import { RepeatIcon, SplitIcon } from 'lucide-react'
-import {
-  type ConnectedBlock,
-  useBlockConnections,
-} from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel-new/components/editor/hooks/use-block-connections'
-import { getBlock } from '@/blocks'
+import { useBlockConnections } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel-new/components/editor/hooks/use-block-connections'

 interface ConnectionsProps {
   blockId: string
-  horizontalHandles: boolean
 }

 /**
- * Retrieves the icon component for a given connection block
- * @param connection - The connected block to get the icon for
- * @returns The icon component or null if not found
+ * Displays incoming connections at the bottom left of the workflow block
  */
-function getConnectionIcon(connection: ConnectedBlock) {
-  const blockConfig = getBlock(connection.type)
-
-  if (blockConfig?.icon) {
-    return blockConfig.icon
-  }
-
-  if (connection.type === 'loop') {
-    return RepeatIcon
-  }
-
-  if (connection.type === 'parallel') {
-    return SplitIcon
-  }
-
-  return null
-}
-
-/**
- * Displays incoming connections as compact floating text above the workflow block
- */
-export function Connections({ blockId, horizontalHandles }: ConnectionsProps) {
+export function Connections({ blockId }: ConnectionsProps) {
   const { incomingConnections, hasIncomingConnections } = useBlockConnections(blockId)

   if (!hasIncomingConnections) return null

   const connectionCount = incomingConnections.length
-  const maxVisibleIcons = 4
-  const visibleConnections = incomingConnections.slice(0, maxVisibleIcons)
-  const remainingCount = connectionCount - maxVisibleIcons

   const connectionText = `${connectionCount} ${connectionCount === 1 ? 'connection' : 'connections'}`

-  const connectionIcons = (
-    <>
-      {visibleConnections.map((connection: ConnectedBlock) => {
-        const Icon = getConnectionIcon(connection)
-        if (!Icon) return null
-        return (
-          <Icon key={connection.id} className='h-[14px] w-[14px] text-[var(--text-tertiary)]' />
-        )
-      })}
-      {remainingCount > 0 && (
-        <span className='text-[14px] text-[var(--text-tertiary)]'>+{remainingCount}</span>
-      )}
-    </>
-  )
-
-  if (!horizontalHandles) {
-    return (
-      <div className='-translate-x-full -translate-y-1/2 pointer-events-none absolute top-1/2 left-0 flex flex-col items-end gap-[8px] pr-[8px] opacity-0 transition-opacity group-hover:opacity-100'>
-        <span className='text-[14px] text-[var(--text-tertiary)] leading-[14px]'>
-          {connectionText}
-        </span>
-        <div className='flex items-center justify-end gap-[4px]'>{connectionIcons}</div>
-      </div>
-    )
-  }
-
   return (
-    <div className='pointer-events-none absolute bottom-full left-0 ml-[8px] flex items-center gap-[8px] pb-[8px] opacity-0 transition-opacity group-hover:opacity-100'>
-      <span className='text-[14px] text-[var(--text-tertiary)]'>{connectionText}</span>
-      <div className='h-[14px] w-[1px] bg-[var(--text-tertiary)]' />
-      <div className='flex items-center gap-[4px]'>{connectionIcons}</div>
+    <div className='pointer-events-none absolute top-full left-0 ml-[8px] flex items-center gap-[8px] pt-[8px] opacity-0 transition-opacity group-hover:opacity-100'>
+      <span className='text-[12px] text-[var(--text-tertiary)]'>{connectionText}</span>
     </div>
   )
 }

@@ -343,6 +343,7 @@ export const WorkflowBlock = memo(function WorkflowBlock({
     handleClick,
     hasRing,
     ringStyles,
+    runPathStatus,
   } = useBlockCore({ blockId: id, data, isPending })

   const currentBlock = currentWorkflow.getBlockById(id)
@@ -722,9 +723,7 @@ export const WorkflowBlock = memo(function WorkflowBlock({

       <ActionBar blockId={id} blockType={type} disabled={!userPermissions.canEdit} />

-      {shouldShowDefaultHandles && (
-        <Connections blockId={id} horizontalHandles={horizontalHandles} />
-      )}
+      {shouldShowDefaultHandles && <Connections blockId={id} />}

       {shouldShowDefaultHandles && (
         <Handle
@@ -750,21 +749,26 @@ export const WorkflowBlock = memo(function WorkflowBlock({
           e.stopPropagation()
         }}
       >
-        <div className='flex min-w-0 flex-1 items-center gap-[10px]'>
+        <div className='relative z-10 flex min-w-0 flex-1 items-center gap-[10px]'>
           <div
             className='flex h-[24px] w-[24px] flex-shrink-0 items-center justify-center rounded-[6px]'
-            style={{ backgroundColor: isEnabled ? config.bgColor : 'gray' }}
+            style={{
+              backgroundColor: isEnabled ? config.bgColor : 'gray',
+            }}
           >
             <config.icon className='h-[16px] w-[16px] text-white' />
           </div>
           <span
-            className={cn('truncate font-medium text-[16px]', !isEnabled && 'text-[#808080]')}
+            className={cn(
+              'truncate font-medium text-[16px]',
+              !isEnabled && runPathStatus !== 'success' && 'text-[#808080]'
+            )}
             title={name}
           >
             {name}
           </span>
         </div>
-        <div className='flex flex-shrink-0 items-center gap-2'>
+        <div className='relative z-10 flex flex-shrink-0 items-center gap-2'>
           {isWorkflowSelector && childWorkflowId && (
             <>
               {typeof childIsDeployed === 'boolean' ? (
@@ -890,6 +894,14 @@ export const WorkflowBlock = memo(function WorkflowBlock({
               </Tooltip.Content>
             </Tooltip.Root>
           )}
+          {/* {isActive && (
+            <div className='mr-[2px] ml-2 flex h-[16px] w-[16px] items-center justify-center'>
+              <div
+                className='h-full w-full animate-spin-slow rounded-full border-[2.5px] border-[rgba(255,102,0,0.25)] border-t-[var(--warning)]'
+                aria-hidden='true'
+              />
+            </div>
+          )} */}
         </div>
       </div>

@@ -1,6 +1,7 @@
 import { X } from 'lucide-react'
 import { BaseEdge, EdgeLabelRenderer, type EdgeProps, getSmoothStepPath } from 'reactflow'
 import type { EdgeDiffStatus } from '@/lib/workflows/diff/types'
+import { useExecutionStore } from '@/stores/execution/store'
 import { useWorkflowDiffStore } from '@/stores/workflow-diff'

 interface WorkflowEdgeProps extends EdgeProps {
@@ -43,6 +44,7 @@ export const WorkflowEdge = ({
   const diffAnalysis = useWorkflowDiffStore((state) => state.diffAnalysis)
   const isShowingDiff = useWorkflowDiffStore((state) => state.isShowingDiff)
   const isDiffReady = useWorkflowDiffStore((state) => state.isDiffReady)
+  const lastRunEdges = useExecutionStore((state) => state.lastRunEdges)

   const generateEdgeIdentity = (
     sourceId: string,
@@ -78,10 +80,16 @@ export const WorkflowEdge = ({
   const dataSourceHandle = (data as { sourceHandle?: string } | undefined)?.sourceHandle
   const isErrorEdge = (sourceHandle ?? dataSourceHandle) === 'error'

+  // Check if this edge was traversed during last execution
+  const edgeRunStatus = lastRunEdges.get(id)
+
   const getEdgeColor = () => {
     if (edgeDiffStatus === 'deleted') return 'var(--text-error)'
     if (isErrorEdge) return 'var(--text-error)'
     if (edgeDiffStatus === 'new') return 'var(--brand-tertiary)'
+    // Show run path status if edge was traversed
+    if (edgeRunStatus === 'success') return 'var(--border-success)'
+    if (edgeRunStatus === 'error') return 'var(--text-error)'
     return 'var(--surface-12)'
   }

@@ -30,7 +30,7 @@ export function useAccessibleReferencePrefixes(blockId?: string | null): Set<str
   const starterBlock = Object.values(blocks).find(
     (block) => block.type === 'starter' || block.type === 'start_trigger'
   )
-  if (starterBlock) {
+  if (starterBlock && ancestorIds.includes(starterBlock.id)) {
     accessibleIds.add(starterBlock.id)
   }

@@ -1,10 +1,10 @@
 import { useCallback, useMemo } from 'react'
-import { cn } from '@/lib/utils'
 import { useExecutionStore } from '@/stores/execution/store'
 import { usePanelEditorStore } from '@/stores/panel-new/editor/store'
 import { useWorkflowRegistry } from '@/stores/workflows/registry/store'
 import { useBlockState } from '../components/workflow-block/hooks'
 import type { WorkflowBlockProps } from '../components/workflow-block/types'
+import { getBlockRingStyles } from '../utils/block-ring-utils'
 import { useCurrentWorkflow } from './use-current-workflow'

 interface UseBlockCoreOptions {
@@ -43,60 +43,19 @@ export function useBlockCore({ blockId, data, isPending = false }: UseBlockCoreO
   }, [blockId, setCurrentBlockId])

-  // Ring styling based on all states
-  // Priority: active (animated) > pending > focused > deleted > diff > run path
-  const { hasRing, ringStyles } = useMemo(() => {
-    const hasRing =
-      isActive ||
-      isPending ||
-      isFocused ||
-      diffStatus === 'new' ||
-      diffStatus === 'edited' ||
-      isDeletedBlock ||
-      !!runPathStatus
-
-    const ringStyles = cn(
-      // Executing block: animated ring cycling through gray tones (animation handles all styling)
-      isActive && 'animate-ring-pulse',
-      // Non-active states use standard ring utilities
-      !isActive && hasRing && 'ring-[1.75px]',
-      // Pending state: warning ring
-      !isActive && isPending && 'ring-[var(--warning)]',
-      // Focused (selected) state: brand ring
-      !isActive && !isPending && isFocused && 'ring-[var(--brand-secondary)]',
-      // Deleted state (highest priority after active/pending/focused)
-      !isActive && !isPending && !isFocused && isDeletedBlock && 'ring-[var(--text-error)]',
-      // Diff states
-      !isActive &&
-        !isPending &&
-        !isFocused &&
-        !isDeletedBlock &&
-        diffStatus === 'new' &&
-        'ring-[#22C55E]',
-      !isActive &&
-        !isPending &&
-        !isFocused &&
-        !isDeletedBlock &&
-        diffStatus === 'edited' &&
-        'ring-[var(--warning)]',
-      // Run path states (lowest priority - only show if no other states active)
-      !isActive &&
-        !isPending &&
-        !isFocused &&
-        !isDeletedBlock &&
-        !diffStatus &&
-        runPathStatus === 'success' &&
-        'ring-[var(--surface-14)]',
-      !isActive &&
-        !isPending &&
-        !isFocused &&
-        !isDeletedBlock &&
-        !diffStatus &&
-        runPathStatus === 'error' &&
-        'ring-[var(--text-error)]'
-    )
-
-    return { hasRing, ringStyles }
-  }, [isActive, isPending, isFocused, diffStatus, isDeletedBlock, runPathStatus])
+  // Priority: active (executing) > pending > focused > deleted > diff > run path
+  const { hasRing, ringClassName: ringStyles } = useMemo(
+    () =>
+      getBlockRingStyles({
+        isActive,
+        isPending,
+        isFocused,
+        isDeletedBlock,
+        diffStatus,
+        runPathStatus,
+      }),
+    [isActive, isPending, isFocused, isDeletedBlock, diffStatus, runPathStatus]
+  )

   return {
     // Workflow context
@@ -116,5 +75,6 @@ export function useBlockCore({ blockId, data, isPending = false }: UseBlockCoreO
     // Ring styling
     hasRing,
     ringStyles,
+    runPathStatus,
   }
 }

@@ -11,7 +11,7 @@ const DEFAULT_CONTAINER_HEIGHT = 300
  * Hook providing utilities for node position, hierarchy, and dimension calculations
  */
 export function useNodeUtilities(blocks: Record<string, any>) {
-  const { getNodes, project } = useReactFlow()
+  const { getNodes } = useReactFlow()

   /**
    * Check if a block is a container type (loop, parallel, or subflow)

@@ -100,6 +100,7 @@ export function useWorkflowExecution() {
     setDebugContext,
     setActiveBlocks,
     setBlockRunStatus,
+    setEdgeRunStatus,
   } = useExecutionStore()
   const [executionResult, setExecutionResult] = useState<ExecutionResult | null>(null)
   const executionStream = useExecutionStream()
@@ -681,10 +682,10 @@ export function useWorkflowExecution() {
     const workflowEdges = (executionWorkflowState?.edges ??
       latestWorkflowState.edges) as typeof currentWorkflow.edges

-    // Filter out blocks without type (these are layout-only blocks)
+    // Filter out blocks without type (these are layout-only blocks) and disabled blocks
     const validBlocks = Object.entries(workflowBlocks).reduce(
       (acc, [blockId, block]) => {
-        if (block?.type) {
+        if (block?.type && block.enabled !== false) {
           acc[blockId] = block
         }
         return acc
@@ -724,13 +725,18 @@ export function useWorkflowExecution() {
       }
     })

-    // Do not filter out trigger blocks; executor may need to start from them
+    // Filter out blocks without type and disabled blocks
     const filteredStates = Object.entries(mergedStates).reduce(
       (acc, [id, block]) => {
         if (!block || !block.type) {
           logger.warn(`Skipping block with undefined type: ${id}`, block)
           return acc
         }
+        // Skip disabled blocks to prevent them from being passed to executor
+        if (block.enabled === false) {
+          logger.warn(`Skipping disabled block: ${id}`)
+          return acc
+        }
         acc[id] = block
         return acc
       },
@@ -892,6 +898,12 @@ export function useWorkflowExecution() {
           activeBlocksSet.add(data.blockId)
           // Create a new Set to trigger React re-render
           setActiveBlocks(new Set(activeBlocksSet))
+
+          // Track edges that led to this block as soon as execution starts
+          const incomingEdges = workflowEdges.filter((edge) => edge.target === data.blockId)
+          incomingEdges.forEach((edge) => {
+            setEdgeRunStatus(edge.id, 'success')
+          })
         },

         onBlockCompleted: (data) => {
@@ -904,6 +916,8 @@ export function useWorkflowExecution() {
           // Track successful block execution in run path
           setBlockRunStatus(data.blockId, 'success')

+          // Edges already tracked in onBlockStarted, no need to track again
+
           // Add to console
           addConsole({
             input: data.input || {},
@@ -938,7 +952,6 @@ export function useWorkflowExecution() {

           // Track failed block execution in run path
           setBlockRunStatus(data.blockId, 'error')
-
           // Add error to console
           addConsole({
             input: data.input || {},

@@ -0,0 +1,77 @@
+import { cn } from '@/lib/utils'
+
+export type BlockDiffStatus = 'new' | 'edited' | null | undefined
+
+export type BlockRunPathStatus = 'success' | 'error' | undefined
+
+export interface BlockRingOptions {
+  isActive: boolean
+  isPending: boolean
+  isFocused: boolean
+  isDeletedBlock: boolean
+  diffStatus: BlockDiffStatus
+  runPathStatus: BlockRunPathStatus
+}
+
+/**
+ * Derives visual ring visibility and class names for workflow blocks
+ * based on execution, focus, diff, deletion, and run-path states.
+ */
+export function getBlockRingStyles(options: BlockRingOptions): {
+  hasRing: boolean
+  ringClassName: string
+} {
+  const { isActive, isPending, isFocused, isDeletedBlock, diffStatus, runPathStatus } = options
+
+  const hasRing =
+    isActive ||
+    isPending ||
+    isFocused ||
+    diffStatus === 'new' ||
+    diffStatus === 'edited' ||
+    isDeletedBlock ||
+    !!runPathStatus
+
+  const ringClassName = cn(
+    // Executing block: pulsing success ring with prominent thickness
+    isActive && 'ring-[3.5px] ring-[var(--border-success)] animate-ring-pulse',
+    // Non-active states use standard ring utilities
+    !isActive && hasRing && 'ring-[1.75px]',
+    // Pending state: warning ring
+    !isActive && isPending && 'ring-[var(--warning)]',
+    // Focused (selected) state: brand ring
+    !isActive && !isPending && isFocused && 'ring-[var(--brand-secondary)]',
+    // Deleted state (highest priority after active/pending/focused)
+    !isActive && !isPending && !isFocused && isDeletedBlock && 'ring-[var(--text-error)]',
+    // Diff states
+    !isActive &&
+      !isPending &&
+      !isFocused &&
+      !isDeletedBlock &&
+      diffStatus === 'new' &&
+      'ring-[#22C55E]',
+    !isActive &&
+      !isPending &&
+      !isFocused &&
+      !isDeletedBlock &&
+      diffStatus === 'edited' &&
+      'ring-[var(--warning)]',
+    // Run path states (lowest priority - only show if no other states active)
+    !isActive &&
+      !isPending &&
+      !isFocused &&
+      !isDeletedBlock &&
+      !diffStatus &&
+      runPathStatus === 'success' &&
+      'ring-[var(--border-success)]',
+    !isActive &&
+      !isPending &&
+      !isFocused &&
+      !isDeletedBlock &&
+      !diffStatus &&
+      runPathStatus === 'error' &&
+      'ring-[var(--text-error)]'
+  )
+
+  return { hasRing, ringClassName }
+}

@@ -1,2 +1,3 @@
 export * from './auto-layout-utils'
+export * from './block-ring-utils'
 export * from './workflow-execution-utils'

@@ -110,14 +110,13 @@ const WorkflowContent = React.memo(() => {
   // Hooks
   const params = useParams()
   const router = useRouter()
-  const { project, getNodes, fitView } = useReactFlow()
+  const { screenToFlowPosition, getNodes, fitView } = useReactFlow()
   const { emitCursorUpdate } = useSocket()

   // Get workspace ID from the params
   const workspaceId = params.workspaceId as string

-  const { workflows, activeWorkflowId, isLoading, setActiveWorkflow, createWorkflow } =
-    useWorkflowRegistry()
+  const { workflows, activeWorkflowId, isLoading, setActiveWorkflow } = useWorkflowRegistry()

   // Use the clean abstraction for current workflow state
   const currentWorkflow = useCurrentWorkflow()
@@ -170,7 +169,7 @@ const WorkflowContent = React.memo(() => {
   // Get diff analysis for edge reconstruction
   const { diffAnalysis, isShowingDiff, isDiffReady } = useWorkflowDiffStore()

-  // Reconstruct deleted edges when viewing original workflow and filter trigger edges
+  // Reconstruct deleted edges when viewing original workflow and filter out invalid edges
   const edgesForDisplay = useMemo(() => {
     let edgesToFilter = edges

@@ -237,7 +236,21 @@ const WorkflowContent = React.memo(() => {
       // Combine existing edges with reconstructed deleted edges
       edgesToFilter = [...edges, ...reconstructedEdges]
     }
-    return edgesToFilter
+
+    // Filter out edges that connect to/from annotation-only blocks (note blocks)
+    // These blocks don't have handles and shouldn't have connections
+    return edgesToFilter.filter((edge) => {
+      const sourceBlock = blocks[edge.source]
+      const targetBlock = blocks[edge.target]
+
+      // Remove edge if either source or target is an annotation-only block
+      if (!sourceBlock || !targetBlock) return false
+      if (isAnnotationOnlyBlock(sourceBlock.type) || isAnnotationOnlyBlock(targetBlock.type)) {
+        return false
+      }
+
+      return true
+    })
   }, [edges, isShowingDiff, isDiffReady, diffAnalysis, blocks])

   // User permissions - get current user's specific permissions from context
@@ -680,7 +693,11 @@ const WorkflowContent = React.memo(() => {
       // Auto-connect logic for blocks inside containers
       const isAutoConnectEnabled = useGeneralStore.getState().isAutoConnectEnabled
       let autoConnectEdge
-      if (isAutoConnectEnabled && data.type !== 'starter') {
+      if (
+        isAutoConnectEnabled &&
+        data.type !== 'starter' &&
+        !isAnnotationOnlyBlock(data.type)
+      ) {
         if (existingChildBlocks.length > 0) {
           // Connect to the nearest existing child block within the container
           const closestBlock = existingChildBlocks
@@ -694,7 +711,7 @@ const WorkflowContent = React.memo(() => {
             .sort((a, b) => a.distance - b.distance)[0]?.block

           if (closestBlock) {
-            // Don't create edges into trigger blocks
+            // Don't create edges into trigger blocks or annotation blocks
             const targetBlockConfig = getBlock(data.type)
             const isTargetTrigger =
               data.enableTriggerMode === true || targetBlockConfig?.category === 'triggers'
@@ -769,10 +786,14 @@ const WorkflowContent = React.memo(() => {
       // Regular auto-connect logic
       const isAutoConnectEnabled = useGeneralStore.getState().isAutoConnectEnabled
       let autoConnectEdge
-      if (isAutoConnectEnabled && data.type !== 'starter') {
+      if (
+        isAutoConnectEnabled &&
+        data.type !== 'starter' &&
+        !isAnnotationOnlyBlock(data.type)
+      ) {
         const closestBlock = findClosestOutput(position)
         if (closestBlock) {
-          // Don't create edges into trigger blocks
+          // Don't create edges into trigger blocks or annotation blocks
           const targetBlockConfig = getBlock(data.type)
           const isTargetTrigger =
             data.enableTriggerMode === true || targetBlockConfig?.category === 'triggers'
@@ -842,7 +863,7 @@ const WorkflowContent = React.memo(() => {
       const baseName = type === 'loop' ? 'Loop' : 'Parallel'
       const name = getUniqueBlockName(baseName, blocks)

-      const centerPosition = project({
+      const centerPosition = screenToFlowPosition({
         x: window.innerWidth / 2,
         y: window.innerHeight / 2,
       })
@@ -891,7 +912,7 @@ const WorkflowContent = React.memo(() => {
       }

       // Calculate the center position of the viewport
-      const centerPosition = project({
+      const centerPosition = screenToFlowPosition({
         x: window.innerWidth / 2,
         y: window.innerHeight / 2,
       })
@@ -906,11 +927,11 @@ const WorkflowContent = React.memo(() => {
       // Auto-connect logic
       const isAutoConnectEnabled = useGeneralStore.getState().isAutoConnectEnabled
       let autoConnectEdge
-      if (isAutoConnectEnabled && type !== 'starter') {
+      if (isAutoConnectEnabled && type !== 'starter' && !isAnnotationOnlyBlock(type)) {
         const closestBlock = findClosestOutput(centerPosition)
         logger.info('Closest block found:', closestBlock)
         if (closestBlock) {
-          // Don't create edges into trigger blocks
+          // Don't create edges into trigger blocks or annotation blocks
           const targetBlockConfig = blockConfig
           const isTargetTrigger = enableTriggerMode || targetBlockConfig?.category === 'triggers'

@@ -977,7 +998,7 @@ const WorkflowContent = React.memo(() => {
       )
     }
   }, [
-    project,
+    screenToFlowPosition,
     blocks,
     addBlock,
     addEdge,
@@ -1014,7 +1035,7 @@ const WorkflowContent = React.memo(() => {
       }

       const bounds = canvasElement.getBoundingClientRect()
-      const position = project({
+      const position = screenToFlowPosition({
         x: detail.clientX - bounds.left,
         y: detail.clientY - bounds.top,
       })
@@ -1041,7 +1062,7 @@ const WorkflowContent = React.memo(() => {
       'toolbar-drop-on-empty-workflow-overlay',
       handleOverlayToolbarDrop as EventListener
     )
-  }, [project, handleToolbarDrop])
+  }, [screenToFlowPosition, handleToolbarDrop])

   /**
    * Recenter canvas when diff appears
@@ -1090,7 +1111,7 @@ const WorkflowContent = React.memo(() => {
       if (!data?.type) return

       const reactFlowBounds = event.currentTarget.getBoundingClientRect()
-      const position = project({
+      const position = screenToFlowPosition({
         x: event.clientX - reactFlowBounds.left,
         y: event.clientY - reactFlowBounds.top,
       })
@@ -1106,7 +1127,7 @@ const WorkflowContent = React.memo(() => {
         logger.error('Error dropping block on ReactFlow canvas:', { err })
       }
     },
-    [project, handleToolbarDrop]
+    [screenToFlowPosition, handleToolbarDrop]
   )

   const handleCanvasPointerMove = useCallback(
@@ -1114,14 +1135,14 @@ const WorkflowContent = React.memo(() => {
       const target = event.currentTarget as HTMLElement
       const bounds = target.getBoundingClientRect()

-      const position = project({
+      const position = screenToFlowPosition({
         x: event.clientX - bounds.left,
         y: event.clientY - bounds.top,
       })

       emitCursorUpdate(position)
     },
-    [project, emitCursorUpdate]
+    [screenToFlowPosition, emitCursorUpdate]
   )

   const handleCanvasPointerLeave = useCallback(() => {
@@ -1144,7 +1165,7 @@ const WorkflowContent = React.memo(() => {

     try {
       const reactFlowBounds = event.currentTarget.getBoundingClientRect()
-      const position = project({
+      const position = screenToFlowPosition({
         x: event.clientX - reactFlowBounds.left,
         y: event.clientY - reactFlowBounds.top,
       })
@@ -1188,7 +1209,7 @@ const WorkflowContent = React.memo(() => {
       logger.error('Error in onDragOver', { err })
     }
   },
-    [project, isPointInLoopNode, getNodes]
+    [screenToFlowPosition, isPointInLoopNode, getNodes]
   )

   // Initialize workflow when it exists in registry and isn't active
@@ -1584,8 +1605,8 @@ const WorkflowContent = React.memo(() => {
       // Store currently dragged node ID
       setDraggedNodeId(node.id)

-      // Emit collaborative position update during drag for smooth real-time movement
-      collaborativeUpdateBlockPosition(node.id, node.position, false)
+      // Note: We don't emit position updates during drag to avoid flooding socket events.
+      // The final position is sent in onNodeDragStop for collaborative updates.

       // Get the current parent ID of the node being dragged
       const currentParentId = blocks[node.id]?.data?.parentId || null
@@ -1721,14 +1742,7 @@ const WorkflowContent = React.memo(() => {
         }
       }
     },
-    [
-      getNodes,
-      potentialParentId,
-      blocks,
-      getNodeAbsolutePosition,
-      getNodeDepth,
-      collaborativeUpdateBlockPosition,
-    ]
+    [getNodes, potentialParentId, blocks, getNodeAbsolutePosition, getNodeDepth]
   )

   // Add in a nodeDrag start event to set the dragStartParentId
@@ -1855,7 +1869,8 @@ const WorkflowContent = React.memo(() => {

         // Auto-connect when moving an existing block into a container
         const isAutoConnectEnabled = useGeneralStore.getState().isAutoConnectEnabled
-        if (isAutoConnectEnabled) {
+        // Don't auto-connect annotation blocks (like note blocks)
+        if (isAutoConnectEnabled && !isAnnotationOnlyBlock(node.data?.type)) {
          // Existing children in the target container (excluding the moved node)
          const existingChildBlocks = Object.values(blocks).filter(
            (b) => b.data?.parentId === potentialParentId && b.id !== node.id

@@ -26,7 +26,7 @@ export function Account(_props: AccountProps) {
|
||||
const router = useRouter()
|
||||
const brandConfig = useBrandConfig()
|
||||
|
||||
// React Query hooks - with placeholderData to show cached data immediately (no skeleton loading!)
|
||||
// React Query hooks - with placeholderData to show cached data immediately
|
||||
const { data: profile } = useUserProfile()
|
||||
const updateProfile = useUpdateUserProfile()
|
||||
|
||||
|
||||
@@ -41,7 +41,7 @@ export function CreatorProfile() {
|
||||
const { data: session } = useSession()
|
||||
const userId = session?.user?.id || ''
|
||||
|
||||
// React Query hooks - with placeholderData to show cached data immediately (no skeleton loading!)
|
||||
// React Query hooks - with placeholderData to show cached data immediately
|
||||
const { data: organizations = [] } = useOrganizations()
|
||||
const { data: existingProfile } = useCreatorProfile(userId)
|
||||
const saveProfile = useSaveCreatorProfile()
|
||||
|
||||
@@ -5,7 +5,6 @@ import { Check, ChevronDown, ExternalLink, Search } from 'lucide-react'
 import { useRouter, useSearchParams } from 'next/navigation'
 import { Button } from '@/components/emcn'
 import { Input, Label } from '@/components/ui'
 import { useSession } from '@/lib/auth-client'
 import { createLogger } from '@/lib/logs/console/logger'
 import { OAUTH_PROVIDERS } from '@/lib/oauth/oauth'
 import { cn } from '@/lib/utils'

@@ -26,11 +25,9 @@ interface CredentialsProps {
 export function Credentials({ onOpenChange, registerCloseHandler }: CredentialsProps) {
 const router = useRouter()
 const searchParams = useSearchParams()
 const { data: session } = useSession()
 const userId = session?.user?.id
 const pendingServiceRef = useRef<HTMLDivElement>(null)

-// React Query hooks - with placeholderData to show cached data immediately (no skeleton loading!)
+// React Query hooks - with placeholderData to show cached data immediately
 const { data: services = [] } = useOAuthConnections()
 const connectService = useConnectOAuthService()
 const disconnectService = useDisconnectOAuthService()
@@ -38,51 +35,28 @@ export function Credentials({ onOpenChange, registerCloseHandler }: CredentialsP
 // Local UI state
 const [searchTerm, setSearchTerm] = useState('')
 const [pendingService, setPendingService] = useState<string | null>(null)
 const [_pendingScopes, setPendingScopes] = useState<string[]>([])
 const [authSuccess, setAuthSuccess] = useState(false)
 const [showActionRequired, setShowActionRequired] = useState(false)
-const prevConnectedIdsRef = useRef<Set<string>>(new Set())
-const connectionAddedRef = useRef<boolean>(false)

-// Check for OAuth callback
+// Check for OAuth callback - just show success message
 useEffect(() => {
   const code = searchParams.get('code')
   const state = searchParams.get('state')
   const error = searchParams.get('error')

   // Handle OAuth callback
   if (code && state) {
-    // This is an OAuth callback - try to restore state from localStorage
-    try {
-      const stored = localStorage.getItem('pending_oauth_state')
-      if (stored) {
-        const oauthState = JSON.parse(stored)
-        logger.info('OAuth callback with restored state:', oauthState)
-
-        // Mark as pending if we have context about what service was being connected
-        if (oauthState.serviceId) {
-          setPendingService(oauthState.serviceId)
-          setShowActionRequired(true)
-        }
-
-        // Clean up the state (one-time use)
-        localStorage.removeItem('pending_oauth_state')
-      } else {
-        logger.warn('OAuth callback but no state found in localStorage')
-      }
-    } catch (error) {
-      logger.error('Error loading OAuth state from localStorage:', error)
-      localStorage.removeItem('pending_oauth_state') // Clean up corrupted state
-    }

     // Set success flag
     logger.info('OAuth callback successful')
     setAuthSuccess(true)

-    // Clear the URL parameters
-    router.replace('/workspace')
+    // Clear URL parameters without changing the page
+    const url = new URL(window.location.href)
+    url.searchParams.delete('code')
+    url.searchParams.delete('state')
+    router.replace(url.pathname + url.search)
   } else if (error) {
     logger.error('OAuth error:', { error })
     router.replace('/workspace')
   }
 }, [searchParams, router])
@@ -132,6 +106,7 @@ export function Credentials({ onOpenChange, registerCloseHandler }: CredentialsP
   scopes: service.scopes,
 })

+// better-auth will automatically redirect back to this URL after OAuth
 await connectService.mutateAsync({
   providerId: service.providerId,
   callbackURL: window.location.href,
@@ -55,7 +55,7 @@ export function Files() {
 const params = useParams()
 const workspaceId = params?.workspaceId as string

-// React Query hooks - with placeholderData to show cached data immediately (no skeleton loading!)
+// React Query hooks - with placeholderData to show cached data immediately
 const { data: files = [] } = useWorkspaceFiles(workspaceId)
 const { data: storageInfo } = useStorageInfo(isBillingEnabled)
 const uploadFile = useUploadWorkspaceFile()

@@ -34,7 +34,7 @@ export function General() {
 const [isSuperUser, setIsSuperUser] = useState(false)
 const [loadingSuperUser, setLoadingSuperUser] = useState(true)

-// React Query hooks - with placeholderData to show cached data immediately (no skeleton loading!)
+// React Query hooks - with placeholderData to show cached data immediately
 const { data: settings, isLoading } = useGeneralSettings()
 const updateSetting = useUpdateGeneralSetting()

@@ -13,7 +13,7 @@ const TOOLTIPS = {
 }

 export function Privacy() {
-  // React Query hooks - with placeholderData to show cached data immediately (no skeleton loading!)
+  // React Query hooks - with placeholderData to show cached data immediately
   const { data: settings } = useGeneralSettings()
   const updateSetting = useUpdateGeneralSetting()
@@ -227,12 +227,8 @@ export function CancelSubscription({ subscription, subscriptionData }: CancelSub
 onClick={() => setIsDialogOpen(true)}
 disabled={isLoading}
 className={cn(
-  'h-8 rounded-[8px] font-medium text-xs transition-all duration-200',
-  error
-    ? 'border-red-500 text-red-500 dark:border-red-500 dark:text-red-500'
-    : isCancelAtPeriodEnd
-      ? 'text-muted-foreground hover:border-green-500 hover:bg-green-500 hover:text-white dark:hover:border-green-500 dark:hover:bg-green-500'
-      : 'text-muted-foreground hover:border-red-500 hover:bg-red-500 hover:text-white dark:hover:border-red-500 dark:hover:bg-red-500'
+  'h-8 rounded-[8px] font-medium text-xs',
+  error && 'border-red-500 text-red-500 dark:border-red-500 dark:text-red-500'
 )}
 >
 {error ? 'Error' : isCancelAtPeriodEnd ? 'Restore' : 'Manage'}

@@ -107,12 +107,11 @@ export function PlanCard({
 <Button
   onClick={onButtonClick}
   className={cn(
-    'h-9 rounded-[8px] text-xs transition-colors',
+    'h-9 rounded-[8px] text-xs',
     isHorizontal ? 'px-4' : 'w-full',
-    isError &&
-      'border-red-500 bg-transparent text-red-500 hover:bg-red-500 hover:text-white dark:border-red-500 dark:text-red-500 dark:hover:bg-red-500'
+    isError && 'border-red-500 text-red-500 dark:border-red-500 dark:text-red-500'
   )}
-  variant={isError ? 'outline' : 'default'}
+  variant='outline'
   aria-label={`${buttonText} ${name} plan`}
 >
 {isError ? 'Error' : buttonText}
@@ -469,6 +469,15 @@ export function Subscription({ onOpenChange }: SubscriptionProps) {
 />
 </div>

+{/* Enterprise Usage Limit Notice */}
+{subscription.isEnterprise && (
+  <div className='text-center'>
+    <p className='text-muted-foreground text-xs'>
+      Contact enterprise for support usage limit changes
+    </p>
+  </div>
+)}
+
 {/* Cost Breakdown */}
 {/* TODO: Re-enable CostBreakdown component in the next billing period
 once sufficient copilot cost data has been collected for accurate display.

@@ -554,14 +563,6 @@ export function Subscription({ onOpenChange }: SubscriptionProps) {
 {/* Billing usage notifications toggle */}
 {subscription.isPaid && <BillingUsageNotificationsToggle />}

-{subscription.isEnterprise && (
-  <div className='text-center'>
-    <p className='text-muted-foreground text-xs'>
-      Contact enterprise for support usage limit changes
-    </p>
-  </div>
-)}
-
 {/* Cancel Subscription */}
 {permissions.canCancelSubscription && (
 <div className='mt-2'>

@@ -631,9 +632,6 @@ function BillingUsageNotificationsToggle() {
 const updateSetting = useUpdateGeneralSetting()
 const isLoading = updateSetting.isPending

-// Settings are automatically loaded by SettingsLoader provider
-// No need to load here - Zustand is synced from React Query
-
 return (
 <div className='mt-4 flex items-center justify-between'>
 <div className='flex flex-col'>
@@ -16,30 +16,39 @@ import { MIN_SIDEBAR_WIDTH, useSidebarStore } from '@/stores/sidebar/store'
 const logger = createLogger('UsageIndicator')

 /**
- * Minimum number of pills to display (at minimum sidebar width)
+ * Minimum number of pills to display (at minimum sidebar width).
  */
 const MIN_PILL_COUNT = 6

 /**
- * Maximum number of pills to display
+ * Maximum number of pills to display.
  */
 const MAX_PILL_COUNT = 8

 /**
- * Width increase (in pixels) required to add one additional pill
+ * Width increase (in pixels) required to add one additional pill.
  */
 const WIDTH_PER_PILL = 50

 /**
- * Animation configuration for usage pills
- * Controls how smoothly and quickly the highlight progresses across pills
+ * Animation tick interval in milliseconds.
+ * Controls the update frequency of the wave animation.
  */
 const PILL_ANIMATION_TICK_MS = 30

+/**
+ * Speed of the wave animation in pills per second.
+ */
+const PILLS_PER_SECOND = 1.8
+
+/**
+ * Distance (in pill units) the wave advances per animation tick.
+ * Derived from {@link PILLS_PER_SECOND} and {@link PILL_ANIMATION_TICK_MS}.
+ */
+const PILL_STEP_PER_TICK = (PILLS_PER_SECOND * PILL_ANIMATION_TICK_MS) / 1000

 /**
- * Plan name mapping
+ * Human-readable plan name labels.
  */
 const PLAN_NAMES = {
   enterprise: 'Enterprise',
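The derived constant above is easy to sanity-check in isolation. This standalone sketch reuses the constants from the hunk; `advanceWave` is a hypothetical helper mirroring the interval callback introduced later in the same PR (advance one tick, clamp at the end of the row rather than wrapping):

```typescript
// Constants as defined in the hunk above.
const PILL_ANIMATION_TICK_MS = 30
const PILLS_PER_SECOND = 1.8
const PILL_STEP_PER_TICK = (PILLS_PER_SECOND * PILL_ANIMATION_TICK_MS) / 1000

// Hypothetical helper mirroring the interval callback: advance the wave by
// one tick's worth of distance, clamping at maxDistance (single pass, no wrap).
function advanceWave(current: number, maxDistance: number): number {
  if (current >= maxDistance) return current
  const next = current + PILL_STEP_PER_TICK
  return next >= maxDistance ? maxDistance : next
}
```

At 30 ms per tick and 1.8 pills/s, the wave advances 0.054 pill-widths per tick, so a full six-pill row takes roughly 3.3 seconds.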
@@ -48,17 +57,37 @@ const PLAN_NAMES = {
   free: 'Free',
 } as const

+/**
+ * Props for the {@link UsageIndicator} component.
+ */
 interface UsageIndicatorProps {
+  /**
+   * Optional click handler. If provided, overrides the default behavior
+   * of opening the settings modal to the subscription tab.
+   */
   onClick?: () => void
 }

+/**
+ * Displays a visual usage indicator showing current subscription usage
+ * with an animated pill bar that responds to hover interactions.
+ *
+ * The component shows:
+ * - Current plan type (Free, Pro, Team, Enterprise)
+ * - Current usage vs. limit (e.g., $7.00 / $10.00)
+ * - Visual pill bar representing usage percentage
+ * - Upgrade button for free plans or when blocked
+ *
+ * @param props - Component props
+ * @returns A usage indicator component with responsive pill visualization
+ */
 export function UsageIndicator({ onClick }: UsageIndicatorProps) {
 const { data: subscriptionData, isLoading } = useSubscriptionData()
 const sidebarWidth = useSidebarStore((state) => state.sidebarWidth)

 /**
- * Calculate pill count based on sidebar width (6-8 pills dynamically)
- * This provides responsive feedback as the sidebar width changes
+ * Calculate pill count based on sidebar width (6-8 pills dynamically).
+ * This provides responsive feedback as the sidebar width changes.
  */
 const pillCount = useMemo(() => {
   const widthDelta = sidebarWidth - MIN_SIDEBAR_WIDTH
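The diff truncates the `useMemo` body after `widthDelta`, so the remainder below is an assumed reconstruction: one extra pill per `WIDTH_PER_PILL` pixels beyond the minimum sidebar width, clamped to the 6–8 range. `pillCountFor` is a hypothetical pure-function version for illustration:

```typescript
const MIN_PILL_COUNT = 6
const MAX_PILL_COUNT = 8
const WIDTH_PER_PILL = 50

// Assumed reconstruction (the diff only shows the widthDelta line): grow the
// pill count by one per WIDTH_PER_PILL pixels of extra width, clamped to
// [MIN_PILL_COUNT, MAX_PILL_COUNT].
function pillCountFor(sidebarWidth: number, minSidebarWidth: number): number {
  const widthDelta = Math.max(0, sidebarWidth - minSidebarWidth)
  return Math.min(MAX_PILL_COUNT, MIN_PILL_COUNT + Math.floor(widthDelta / WIDTH_PER_PILL))
}
```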
@@ -82,54 +111,56 @@ export function UsageIndicator({ onClick }: UsageIndicatorProps) {
 const billingStatus = getBillingStatus(subscriptionData?.data)
 const isBlocked = billingStatus === 'blocked'
-const showUpgradeButton = planType === 'free' || isBlocked
+const showUpgradeButton =
+  (planType === 'free' || isBlocked || progressPercentage >= 80) && planType !== 'enterprise'

 /**
- * Calculate which pills should be filled based on usage percentage
- * Uses shared Math.ceil heuristic but with dynamic pill count (6-8)
- * This ensures consistent calculation logic while maintaining responsive pill count
+ * Calculate which pills should be filled based on usage percentage.
+ * Uses Math.ceil heuristic with dynamic pill count (6-8).
+ * This ensures consistent calculation logic while maintaining responsive pill count.
  */
 const filledPillsCount = Math.ceil((progressPercentage / 100) * pillCount)
 const isAlmostOut = filledPillsCount === pillCount

 const [isHovered, setIsHovered] = useState(false)
 const [wavePosition, setWavePosition] = useState<number | null>(null)
 const [hasWrapped, setHasWrapped] = useState(false)

 const startAnimationIndex = pillCount === 0 ? 0 : Math.min(filledPillsCount, pillCount - 1)

 useEffect(() => {
-  if (!isHovered || pillCount <= 0) {
+  const isFreePlan = subscription.isFree
+
+  if (!isHovered || pillCount <= 0 || !isFreePlan) {
     setWavePosition(null)
     setHasWrapped(false)
     return
   }

-  const totalSpan = pillCount
-  let wrapped = false
-  setHasWrapped(false)
+  /**
+   * Maximum distance (in pill units) the wave should travel from
+   * {@link startAnimationIndex} to the end of the row. The wave stops
+   * once it reaches the final pill and does not wrap.
+   */
+  const maxDistance = pillCount <= 0 ? 0 : Math.max(0, pillCount - startAnimationIndex)

   setWavePosition(0)

   const interval = window.setInterval(() => {
     setWavePosition((prev) => {
       const current = prev ?? 0
-      const next = current + PILL_STEP_PER_TICK
-
-      // Mark as wrapped after first complete cycle
-      if (next >= totalSpan && !wrapped) {
-        wrapped = true
-        setHasWrapped(true)
+      if (current >= maxDistance) {
+        return current
       }

-      // Return continuous value, never reset (seamless loop)
-      return next
+      const next = current + PILL_STEP_PER_TICK
+      return next >= maxDistance ? maxDistance : next
     })
   }, PILL_ANIMATION_TICK_MS)

   return () => {
     window.clearInterval(interval)
   }
-}, [isHovered, pillCount, startAnimationIndex])
+}, [isHovered, pillCount, startAnimationIndex, subscription.isFree])

 if (isLoading) {
   return (
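The fill heuristic and the revised upgrade-button condition from the hunk above are both pure logic, so they can be checked standalone. `filledPills` and `shouldShowUpgrade` are hypothetical extractions for illustration; in the component these values come from subscription data:

```typescript
// Math.ceil heuristic from the hunk above: any partially-consumed pill
// renders as filled, so usage is never under-represented.
function filledPills(progressPercentage: number, pillCount: number): number {
  return Math.ceil((progressPercentage / 100) * pillCount)
}

// The revised condition: show the upgrade button for free plans, blocked
// accounts, or usage at 80% and above - but never for enterprise plans.
function shouldShowUpgrade(planType: string, isBlocked: boolean, progressPercentage: number): boolean {
  return (planType === 'free' || isBlocked || progressPercentage >= 80) && planType !== 'enterprise'
}
```

Note that `Math.ceil` means a single dollar of usage on a fresh plan already lights one pill, which matches the "never under-represent" intent.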
@@ -153,7 +184,7 @@ export function UsageIndicator({ onClick }: UsageIndicatorProps) {
   )
 }

-const handleClick = () => {
+const handleClick = async () => {
   try {
     if (onClick) {
       onClick()
@@ -163,7 +194,35 @@ export function UsageIndicator({ onClick }: UsageIndicatorProps) {
 const blocked = getBillingStatus(subscriptionData?.data) === 'blocked'
 const canUpg = canUpgrade(subscriptionData?.data)

-// Open Settings modal to the subscription tab (upgrade UI lives there)
+// If blocked, try to open billing portal directly for faster recovery
+if (blocked) {
+  try {
+    const context = subscription.isTeam || subscription.isEnterprise ? 'organization' : 'user'
+    const organizationId =
+      subscription.isTeam || subscription.isEnterprise
+        ? subscriptionData?.data?.organization?.id
+        : undefined
+
+    const response = await fetch('/api/billing/portal', {
+      method: 'POST',
+      headers: { 'Content-Type': 'application/json' },
+      body: JSON.stringify({ context, organizationId }),
+    })
+
+    if (response.ok) {
+      const { url } = await response.json()
+      window.open(url, '_blank')
+      logger.info('Opened billing portal for blocked account', { context, organizationId })
+      return
+    }
+  } catch (portalError) {
+    logger.warn('Failed to open billing portal, falling back to settings', {
+      error: portalError,
+    })
+  }
+}
+
+// Fallback: Open Settings modal to the subscription tab
 if (typeof window !== 'undefined') {
   window.dispatchEvent(new CustomEvent('open-settings', { detail: { tab: 'subscription' } }))
   logger.info('Opened settings to subscription tab', { blocked, canUpgrade: canUpg })
@@ -175,7 +234,9 @@ export function UsageIndicator({ onClick }: UsageIndicatorProps) {
 return (
   <div
-    className='group flex flex-shrink-0 cursor-pointer flex-col gap-[8px] border-t px-[13.5px] pt-[8px] pb-[10px]'
+    className={`group flex flex-shrink-0 cursor-pointer flex-col gap-[8px] border-t px-[13.5px] pt-[8px] pb-[10px] ${
+      isBlocked ? 'border-red-500/50 bg-red-950/20' : ''
+    }`}
     onClick={handleClick}
     onMouseEnter={() => setIsHovered(true)}
     onMouseLeave={() => setIsHovered(false)}

@@ -188,8 +249,8 @@ export function UsageIndicator({ onClick }: UsageIndicatorProps) {
 <div className='flex items-center gap-[4px]'>
   {isBlocked ? (
     <>
-      <span className='font-medium text-[12px] text-[var(--text-tertiary)]'>Over</span>
-      <span className='font-medium text-[12px] text-[var(--text-tertiary)]'>limit</span>
+      <span className='font-medium text-[12px] text-red-400'>Payment</span>
+      <span className='font-medium text-[12px] text-red-400'>Required</span>
     </>
   ) : (
     <>

@@ -207,10 +268,14 @@ export function UsageIndicator({ onClick }: UsageIndicatorProps) {
 {showUpgradeButton && (
   <Button
     variant='ghost'
-    className='-mx-1 !h-auto !px-1 !py-0 !text-[#F473B7] group-hover:!text-[#F789C4] mt-[-2px] transition-colors duration-100'
+    className={`-mx-1 !h-auto !px-1 !py-0 mt-[-2px] transition-colors duration-100 ${
+      isBlocked
+        ? '!text-red-400 group-hover:!text-red-300'
+        : '!text-[#F473B7] group-hover:!text-[#F789C4]'
+    }`}
     onClick={handleClick}
   >
-    <span className='font-medium text-[12px]'>Upgrade</span>
+    <span className='font-medium text-[12px]'>{isBlocked ? 'Fix Now' : 'Upgrade'}</span>
   </Button>
 )}
 </div>
@@ -220,63 +285,42 @@ export function UsageIndicator({ onClick }: UsageIndicatorProps) {
 {Array.from({ length: pillCount }).map((_, i) => {
   const isFilled = i < filledPillsCount

-  const baseColor = isFilled ? (isAlmostOut ? '#ef4444' : '#34B5FF') : '#414141'
+  const baseColor = isFilled
+    ? isBlocked || isAlmostOut
+      ? '#ef4444'
+      : '#34B5FF'
+    : '#414141'

   let backgroundColor = baseColor
   let backgroundImage: string | undefined

-  if (isHovered && wavePosition !== null && pillCount > 0) {
-    const totalSpan = pillCount
+  if (isHovered && wavePosition !== null && pillCount > 0 && subscription.isFree) {
     const grayColor = '#414141'
     const activeColor = isAlmostOut ? '#ef4444' : '#34B5FF'

-    if (!hasWrapped) {
-      // First pass: respect original fill state, start from startAnimationIndex
-      const headIndex = Math.floor(wavePosition)
-      const progress = wavePosition - headIndex
+    /**
+     * Single-pass wave: travel from {@link startAnimationIndex} to the end
+     * of the row without wrapping. Previously highlighted pills remain
+     * filled; the wave only affects pills at or after the start index.
+     */
+    const headIndex = Math.floor(wavePosition)
+    const progress = wavePosition - headIndex

-      const pillOffsetFromStart =
-        i >= startAnimationIndex
-          ? i - startAnimationIndex
-          : totalSpan - startAnimationIndex + i
+    const pillOffsetFromStart = i - startAnimationIndex

-      if (pillOffsetFromStart < headIndex) {
-        backgroundColor = baseColor
-        backgroundImage = `linear-gradient(to right, ${activeColor} 0%, ${activeColor} 100%)`
-      } else if (pillOffsetFromStart === headIndex) {
-        const fillPercent = Math.max(0, Math.min(1, progress)) * 100
-        backgroundColor = baseColor
-        backgroundImage = `linear-gradient(to right, ${activeColor} 0%, ${activeColor} ${fillPercent}%, ${baseColor} ${fillPercent}%, ${baseColor} 100%)`
-      }
+    if (pillOffsetFromStart < 0) {
+      // Before the wave start; keep original baseColor.
+    } else if (pillOffsetFromStart < headIndex) {
+      backgroundColor = isFilled ? baseColor : grayColor
+      backgroundImage = `linear-gradient(to right, ${activeColor} 0%, ${activeColor} 100%)`
+    } else if (pillOffsetFromStart === headIndex) {
+      const fillPercent = Math.max(0, Math.min(1, progress)) * 100
+      backgroundColor = isFilled ? baseColor : grayColor
+      backgroundImage = `linear-gradient(to right, ${activeColor} 0%, ${activeColor} ${fillPercent}%, ${
+        isFilled ? baseColor : grayColor
+      } ${fillPercent}%, ${isFilled ? baseColor : grayColor} 100%)`
     } else {
-      // Subsequent passes: render wave at BOTH current and next-cycle positions for seamless wrap
-      const wrappedPosition = wavePosition % totalSpan
-      const currentHead = Math.floor(wrappedPosition)
-      const progress = wrappedPosition - currentHead
-
-      // Primary wave position
-      const primaryFilled = i < currentHead
-      const primaryActive = i === currentHead
-
-      // Secondary wave position (one full cycle ahead, wraps to beginning)
-      const secondaryHead = Math.floor(wavePosition + totalSpan) % totalSpan
-      const secondaryProgress =
-        wavePosition + totalSpan - Math.floor(wavePosition + totalSpan)
-      const secondaryFilled = i < secondaryHead
-      const secondaryActive = i === secondaryHead
-
-      // Render: pill is filled if either wave position has filled it
-      if (primaryFilled || secondaryFilled) {
-        backgroundColor = grayColor
-        backgroundImage = `linear-gradient(to right, ${activeColor} 0%, ${activeColor} 100%)`
-      } else if (primaryActive || secondaryActive) {
-        const activeProgress = primaryActive ? progress : secondaryProgress
-        const fillPercent = Math.max(0, Math.min(1, activeProgress)) * 100
-        backgroundColor = grayColor
-        backgroundImage = `linear-gradient(to right, ${activeColor} 0%, ${activeColor} ${fillPercent}%, ${grayColor} ${fillPercent}%, ${grayColor} 100%)`
-      } else {
-        backgroundColor = grayColor
-      }
+      backgroundColor = isFilled ? baseColor : grayColor
     }
   }
@@ -14,8 +14,8 @@ import {
 } from '@/app/workspace/[workspaceId]/w/components/sidebar/hooks'
 import { useDeleteFolder, useDuplicateFolder } from '@/app/workspace/[workspaceId]/w/hooks'
 import { useUpdateFolder } from '@/hooks/queries/folders'
+import { useCreateWorkflow } from '@/hooks/queries/workflows'
 import type { FolderTreeNode } from '@/stores/folders/store'
-import { useWorkflowRegistry } from '@/stores/workflows/registry/store'

 interface FolderItemProps {
   folder: FolderTreeNode

@@ -39,7 +39,7 @@ export function FolderItem({ folder, level, hoverHandlers }: FolderItemProps) {
 const router = useRouter()
 const workspaceId = params.workspaceId as string
 const updateFolderMutation = useUpdateFolder()
-const { createWorkflow } = useWorkflowRegistry()
+const createWorkflowMutation = useCreateWorkflow()

 // Delete modal state
 const [isDeleteModalOpen, setIsDeleteModalOpen] = useState(false)
@@ -58,18 +58,18 @@ export function FolderItem({ folder, level, hoverHandlers }: FolderItemProps) {
 })

 /**
- * Handle create workflow in folder
+ * Handle create workflow in folder using React Query mutation
  */
 const handleCreateWorkflowInFolder = useCallback(async () => {
-  const workflowId = await createWorkflow({
+  const result = await createWorkflowMutation.mutateAsync({
     workspaceId,
     folderId: folder.id,
   })

-  if (workflowId) {
-    router.push(`/workspace/${workspaceId}/w/${workflowId}`)
+  if (result.id) {
+    router.push(`/workspace/${workspaceId}/w/${result.id}`)
   }
-}, [createWorkflow, workspaceId, folder.id, router])
+}, [createWorkflowMutation, workspaceId, folder.id, router])

 // Folder expand hook
 const {
@@ -6,18 +6,18 @@ import { useParams, useRouter } from 'next/navigation'
 import { Button } from '@/components/ui/button'
 import { Popover, PopoverContent, PopoverTrigger } from '@/components/ui/popover'
 import { createLogger } from '@/lib/logs/console/logger'
-import { generateFolderName } from '@/lib/naming'
 import { cn } from '@/lib/utils'
 import {
   extractWorkflowName,
   extractWorkflowsFromFiles,
   extractWorkflowsFromZip,
 } from '@/lib/workflows/import-export'
+import { generateFolderName } from '@/lib/workspaces/naming'
 import { useUserPermissionsContext } from '@/app/workspace/[workspaceId]/providers/workspace-permissions-provider'
 import { useCreateFolder } from '@/hooks/queries/folders'
+import { useCreateWorkflow } from '@/hooks/queries/workflows'
 import { useWorkflowDiffStore } from '@/stores/workflow-diff/store'
 import { parseWorkflowJson } from '@/stores/workflows/json/importer'
-import { useWorkflowRegistry } from '@/stores/workflows/registry/store'

 const logger = createLogger('CreateMenu')

@@ -44,7 +44,7 @@ export function CreateMenu({ onCreateWorkflow, isCreatingWorkflow = false }: Cre
 const router = useRouter()
 const workspaceId = params.workspaceId as string
 const createFolderMutation = useCreateFolder()
-const { createWorkflow } = useWorkflowRegistry()
+const createWorkflowMutation = useCreateWorkflow()
 const userPermissions = useUserPermissionsContext()
 const fileInputRef = useRef<HTMLInputElement>(null)
@@ -194,12 +194,13 @@ export function CreateMenu({ onCreateWorkflow, isCreatingWorkflow = false }: Cre
 const { clearDiff } = useWorkflowDiffStore.getState()
 clearDiff()

-const newWorkflowId = await createWorkflow({
+const result = await createWorkflowMutation.mutateAsync({
   name: workflowName,
   description: 'Imported from workspace export',
   workspaceId,
   folderId: targetFolderId,
 })
+const newWorkflowId = result.id

 const response = await fetch(`/api/workflows/${newWorkflowId}/state`, {
   method: 'PUT',

@@ -255,11 +256,12 @@ export function CreateMenu({ onCreateWorkflow, isCreatingWorkflow = false }: Cre
 const { clearDiff } = useWorkflowDiffStore.getState()
 clearDiff()

-const newWorkflowId = await createWorkflow({
+const result = await createWorkflowMutation.mutateAsync({
   name: workflowName,
   description: 'Imported from JSON',
   workspaceId,
 })
+const newWorkflowId = result.id

 const response = await fetch(`/api/workflows/${newWorkflowId}/state`, {
   method: 'PUT',

@@ -299,8 +301,8 @@ export function CreateMenu({ onCreateWorkflow, isCreatingWorkflow = false }: Cre
   }
 }

-const { loadWorkflows } = useWorkflowRegistry.getState()
-await loadWorkflows(workspaceId)
+// Invalidate workflow queries to reload the list
+// The useWorkflows hook in the sidebar will automatically refetch
 } catch (error) {
   logger.error('Failed to import workflows:', error)
 } finally {

@@ -310,7 +312,7 @@ export function CreateMenu({ onCreateWorkflow, isCreatingWorkflow = false }: Cre
     }
   }
 },
-[workspaceId, createWorkflow, createFolderMutation]
+[workspaceId, createWorkflowMutation, createFolderMutation]
 )

 // Button event handlers
@@ -1,6 +1,6 @@
 import { useCallback, useState } from 'react'
 import { createLogger } from '@/lib/logs/console/logger'
-import { generateFolderName } from '@/lib/naming'
+import { generateFolderName } from '@/lib/workspaces/naming'
 import { useCreateFolder } from '@/hooks/queries/folders'

 const logger = createLogger('useFolderOperations')
@@ -1,6 +1,7 @@
-import { useCallback, useEffect, useState } from 'react'
+import { useCallback, useState } from 'react'
 import { useRouter } from 'next/navigation'
 import { createLogger } from '@/lib/logs/console/logger'
+import { useCreateWorkflow, useWorkflows } from '@/hooks/queries/workflows'
 import { useWorkflowDiffStore } from '@/stores/workflow-diff/store'
 import { useWorkflowRegistry } from '@/stores/workflows/registry/store'
@@ -25,12 +26,9 @@ export function useWorkflowOperations({
   onWorkspaceInvalid,
 }: UseWorkflowOperationsProps) {
 const router = useRouter()
-const {
-  workflows,
-  isLoading: workflowsLoading,
-  loadWorkflows,
-  createWorkflow,
-} = useWorkflowRegistry()
+const { workflows } = useWorkflowRegistry()
+const workflowsQuery = useWorkflows(workspaceId)
+const createWorkflowMutation = useCreateWorkflow()
 const [isCreatingWorkflow, setIsCreatingWorkflow] = useState(false)

 /**
@@ -45,6 +43,7 @@ export function useWorkflowOperations({
 /**
  * Create workflow handler - creates workflow and navigates to it
+ * Now uses React Query mutation for better performance and caching
  */
 const handleCreateWorkflow = useCallback(async (): Promise<string | null> => {
   if (isCreatingWorkflow) {

@@ -59,14 +58,15 @@ export function useWorkflowOperations({
 const { clearDiff } = useWorkflowDiffStore.getState()
 clearDiff()

-const workflowId = await createWorkflow({
-  workspaceId: workspaceId || undefined,
+// Use React Query mutation for creation
+const result = await createWorkflowMutation.mutateAsync({
+  workspaceId: workspaceId,
 })

 // Navigate to the newly created workflow
-if (workflowId) {
-  router.push(`/workspace/${workspaceId}/w/${workflowId}`)
-  return workflowId
+if (result.id) {
+  router.push(`/workspace/${workspaceId}/w/${result.id}`)
+  return result.id
 }
 return null
 } catch (error) {

@@ -75,34 +75,16 @@ export function useWorkflowOperations({
 } finally {
   setIsCreatingWorkflow(false)
 }
-}, [isCreatingWorkflow, createWorkflow, workspaceId, router])
-
-/**
- * Load workflows for the current workspace when workspaceId changes
- */
-useEffect(() => {
-  if (workspaceId) {
-    // Validate workspace exists before loading workflows
-    isWorkspaceValid(workspaceId).then((valid) => {
-      if (valid) {
-        loadWorkflows(workspaceId)
-      } else {
-        logger.warn(`Workspace ${workspaceId} no longer exists, triggering workspace refresh`)
-        onWorkspaceInvalid()
-      }
-    })
-  }
-}, [workspaceId, loadWorkflows, isWorkspaceValid, onWorkspaceInvalid])
+}, [isCreatingWorkflow, createWorkflowMutation, workspaceId, router])

 return {
   // State
   workflows,
   regularWorkflows,
-  workflowsLoading,
+  workflowsLoading: workflowsQuery.isLoading,
   isCreatingWorkflow,

   // Operations
   handleCreateWorkflow,
-  loadWorkflows,
 }
 }
@@ -1,7 +1,7 @@
 import { useCallback, useEffect, useRef, useState } from 'react'
 import { usePathname, useRouter } from 'next/navigation'
 import { createLogger } from '@/lib/logs/console/logger'
-import { generateWorkspaceName } from '@/lib/naming'
+import { generateWorkspaceName } from '@/lib/workspaces/naming'
 import { useWorkflowRegistry } from '@/stores/workflows/registry/store'

 const logger = createLogger('useWorkspaceManagement')
@@ -5,6 +5,7 @@ import { ArrowDown, Plus, Search } from 'lucide-react'
import { useParams, useRouter } from 'next/navigation'
import { Button, FolderPlus, Tooltip } from '@/components/emcn'
import { useSession } from '@/lib/auth-client'
import { getEnv, isTruthy } from '@/lib/env'
import { createLogger } from '@/lib/logs/console/logger'
import { useRegisterGlobalCommands } from '@/app/workspace/[workspaceId]/providers/global-commands-provider'
import {
@@ -32,8 +33,7 @@ import { MIN_SIDEBAR_WIDTH, useSidebarStore } from '@/stores/sidebar/store'
const logger = createLogger('SidebarNew')

// Feature flag: Billing usage indicator visibility (matches legacy sidebar behavior)
// const isBillingEnabled = isTruthy(getEnv('NEXT_PUBLIC_BILLING_ENABLED'))
const isBillingEnabled = true
const isBillingEnabled = isTruthy(getEnv('NEXT_PUBLIC_BILLING_ENABLED'))

/**
 * Sidebar component with resizable width that persists across page refreshes.

@@ -7,8 +7,8 @@ import { ScrollArea } from '@/components/ui'
import { useSession } from '@/lib/auth-client'
import { getEnv, isTruthy } from '@/lib/env'
import { createLogger } from '@/lib/logs/console/logger'
import { generateWorkspaceName } from '@/lib/naming'
import { canUpgrade, getBillingStatus } from '@/lib/subscription/helpers'
import { generateWorkspaceName } from '@/lib/workspaces/naming'
import { useUserPermissionsContext } from '@/app/workspace/[workspaceId]/providers/workspace-permissions-provider'
import {
CreateMenu,
@@ -28,7 +28,6 @@ import { InviteModal } from '@/app/workspace/[workspaceId]/w/components/sidebar/
import { useAutoScroll } from '@/app/workspace/[workspaceId]/w/components/sidebar/hooks/use-auto-scroll'
import { useSubscriptionData } from '@/hooks/queries/subscription'
import { useKnowledgeBasesList } from '@/hooks/use-knowledge'
import { useWorkflowDiffStore } from '@/stores/workflow-diff/store'
import { useWorkflowRegistry } from '@/stores/workflows/registry/store'
import type { WorkflowMetadata } from '@/stores/workflows/registry/types'

@@ -82,13 +81,7 @@ interface TemplateData {
export function Sidebar() {
// useGlobalShortcuts()

const {
workflows,
createWorkflow,
isLoading: workflowsLoading,
loadWorkflows,
switchToWorkspace,
} = useWorkflowRegistry()
const { workflows, isLoading: workflowsLoading, switchToWorkspace } = useWorkflowRegistry()
const { data: sessionData, isPending: sessionLoading } = useSession()
const userPermissions = useUserPermissionsContext()
const isLoading = workflowsLoading || sessionLoading
@@ -605,19 +598,20 @@ export function Sidebar() {
}

// Load workflows for the current workspace when workspaceId changes
useEffect(() => {
if (workspaceId) {
// Validate workspace exists before loading workflows
isWorkspaceValid(workspaceId).then((valid) => {
if (valid) {
loadWorkflows(workspaceId)
} else {
logger.warn(`Workspace ${workspaceId} no longer exists, triggering workspace refresh`)
fetchWorkspaces() // This will handle the redirect through the fallback logic
}
})
}
}, [workspaceId, loadWorkflows]) // Removed isWorkspaceValid and fetchWorkspaces dependencies
// NOTE: This useEffect is disabled - workflows now loaded via React Query in sidebar-new
// useEffect(() => {
//   if (workspaceId) {
//     // Validate workspace exists before loading workflows
//     isWorkspaceValid(workspaceId).then((valid) => {
//       if (valid) {
//         loadWorkflows(workspaceId)
//       } else {
//         logger.warn(`Workspace ${workspaceId} no longer exists, triggering workspace refresh`)
//         fetchWorkspaces() // This will handle the redirect through the fallback logic
//       }
//     })
//   }
// }, [workspaceId, loadWorkflows]) // Removed isWorkspaceValid and fetchWorkspaces dependencies

// Initialize workspace data on mount (uses full validation with URL handling)
useEffect(() => {
@@ -736,30 +730,33 @@ export function Sidebar() {
}, [knowledgeBases, workspaceId, knowledgeBaseId])

// Create workflow handler
// NOTE: This is disabled - workflow creation now handled via React Query in sidebar-new
const handleCreateWorkflow = async (folderId?: string): Promise<string> => {
if (isCreatingWorkflow) {
logger.info('Workflow creation already in progress, ignoring request')
throw new Error('Workflow creation already in progress')
}
logger.warn('Old sidebar handleCreateWorkflow called - should use sidebar-new')
return ''
// if (isCreatingWorkflow) {
//   logger.info('Workflow creation already in progress, ignoring request')
//   throw new Error('Workflow creation already in progress')
// }

try {
setIsCreatingWorkflow(true)
// try {
//   setIsCreatingWorkflow(true)

// Clear workflow diff store when creating a new workflow
const { clearDiff } = useWorkflowDiffStore.getState()
clearDiff()
// // Clear workflow diff store when creating a new workflow
// const { clearDiff } = useWorkflowDiffStore.getState()
// clearDiff()

const id = await createWorkflow({
workspaceId: workspaceId || undefined,
folderId: folderId || undefined,
})
return id
} catch (error) {
logger.error('Error creating workflow:', error)
throw error
} finally {
setIsCreatingWorkflow(false)
}
// const id = await createWorkflow({
//   workspaceId: workspaceId || undefined,
//   folderId: folderId || undefined,
// })
// return id
// } catch (error) {
//   logger.error('Error creating workflow:', error)
//   throw error
// } finally {
//   setIsCreatingWorkflow(false)
// }
}

// Toggle workspace selector visibility

@@ -6,31 +6,14 @@ import { useFolderStore } from '@/stores/folders/store'
const logger = createLogger('useDuplicateFolder')

interface UseDuplicateFolderProps {
/**
 * Current workspace ID
 */
workspaceId: string
/**
 * Function that returns the folder ID(s) to duplicate
 * This function is called when duplication occurs to get fresh selection state
 */
getFolderIds: () => string | string[]
/**
 * Optional callback after successful duplication
 */
onSuccess?: () => void
}

/**
 * Hook for managing folder duplication.
 *
 * Handles:
 * - Single or bulk folder duplication
 * - Calling duplicate API for each folder
 * - Loading state management
 * - Error handling and logging
 * - Clearing selection after duplication
 *
 * @param props - Hook configuration
 * @returns Duplicate folder handlers and state
 */

@@ -8,9 +8,9 @@ import {
extractWorkflowsFromZip,
} from '@/lib/workflows/import-export'
import { folderKeys, useCreateFolder } from '@/hooks/queries/folders'
import { useCreateWorkflow, workflowKeys } from '@/hooks/queries/workflows'
import { useWorkflowDiffStore } from '@/stores/workflow-diff/store'
import { parseWorkflowJson } from '@/stores/workflows/json/importer'
import { useWorkflowRegistry } from '@/stores/workflows/registry/store'

const logger = createLogger('useImportWorkflow')

@@ -30,7 +30,7 @@ interface UseImportWorkflowProps {
 */
export function useImportWorkflow({ workspaceId }: UseImportWorkflowProps) {
const router = useRouter()
const { createWorkflow, loadWorkflows } = useWorkflowRegistry()
const createWorkflowMutation = useCreateWorkflow()
const queryClient = useQueryClient()
const createFolderMutation = useCreateFolder()
const [isImporting, setIsImporting] = useState(false)
@@ -55,12 +55,13 @@ export function useImportWorkflow({ workspaceId }: UseImportWorkflowProps) {
const workflowColor =
parsedContent.state?.metadata?.color || parsedContent.metadata?.color || '#3972F6'

const newWorkflowId = await createWorkflow({
const result = await createWorkflowMutation.mutateAsync({
name: workflowName,
description: workflowData.metadata?.description || 'Imported from JSON',
workspaceId,
folderId: folderId || undefined,
})
const newWorkflowId = result.id

// Update workflow color if we extracted one
if (workflowColor !== '#3972F6') {
@@ -98,7 +99,7 @@ export function useImportWorkflow({ workspaceId }: UseImportWorkflowProps) {
logger.info(`Imported workflow: ${workflowName}`)
return newWorkflowId
},
[createWorkflow, workspaceId]
[createWorkflowMutation, workspaceId]
)

/**
@@ -184,8 +185,8 @@ export function useImportWorkflow({ workspaceId }: UseImportWorkflowProps) {
}
}

// Reload workflows to show newly imported ones
await loadWorkflows(workspaceId)
// Reload workflows and folders to show newly imported ones
await queryClient.invalidateQueries({ queryKey: workflowKeys.list(workspaceId) })
await queryClient.invalidateQueries({ queryKey: folderKeys.list(workspaceId) })

logger.info(`Import complete. Imported ${importedWorkflowIds.length} workflow(s)`)
@@ -205,7 +206,7 @@ export function useImportWorkflow({ workspaceId }: UseImportWorkflowProps) {
}
}
},
[importSingleWorkflow, workspaceId, loadWorkflows, router, createFolderMutation, queryClient]
[importSingleWorkflow, workspaceId, router, createFolderMutation, queryClient]
)

return {

@@ -1,42 +1,32 @@
'use client'

import { useEffect, useState } from 'react'
import { useEffect } from 'react'
import { Loader2 } from 'lucide-react'
import { useParams, useRouter } from 'next/navigation'
import { createLogger } from '@/lib/logs/console/logger'
import { useWorkflows } from '@/hooks/queries/workflows'
import { useWorkflowRegistry } from '@/stores/workflows/registry/store'

const logger = createLogger('WorkflowsPage')

export default function WorkflowsPage() {
const router = useRouter()
const { workflows, isLoading, loadWorkflows, setActiveWorkflow } = useWorkflowRegistry()
const [hasInitialized, setHasInitialized] = useState(false)

const { workflows, setActiveWorkflow } = useWorkflowRegistry()
const params = useParams()
const workspaceId = params.workspaceId as string

// Initialize workspace workflows
useEffect(() => {
const initializeWorkspace = async () => {
try {
await loadWorkflows(workspaceId)
setHasInitialized(true)
} catch (error) {
logger.error('Failed to load workflows for workspace:', error)
setHasInitialized(true) // Still mark as initialized to show error state
}
}

if (!hasInitialized) {
initializeWorkspace()
}
}, [workspaceId, loadWorkflows, hasInitialized])
// Fetch workflows using React Query
const { isLoading, isError } = useWorkflows(workspaceId)

// Handle redirection once workflows are loaded
useEffect(() => {
// Only proceed if we've initialized and workflows are not loading
if (!hasInitialized || isLoading) return
// Only proceed if workflows are done loading
if (isLoading) return

if (isError) {
logger.error('Failed to load workflows for workspace')
return
}

const workflowIds = Object.keys(workflows)

@@ -55,7 +45,7 @@ export default function WorkflowsPage() {
router.replace(`/workspace/${workspaceId}/w/${firstWorkflowId}`)
})
}
}, [hasInitialized, isLoading, workflows, workspaceId, router, setActiveWorkflow])
}, [isLoading, workflows, workspaceId, router, setActiveWorkflow, isError])

// Always show loading state until redirect happens
// There should always be a default workflow, so we never show "no workflows found"

@@ -210,6 +210,7 @@ Create a system prompt appropriately detailed for the request, using clear langu
type: 'dropdown',
placeholder: 'Select reasoning effort...',
options: [
{ label: 'none', id: 'none' },
{ label: 'minimal', id: 'minimal' },
{ label: 'low', id: 'low' },
{ label: 'medium', id: 'medium' },

@@ -12,10 +12,10 @@ import {
Text,
} from '@react-email/components'
import { format } from 'date-fns'
import { baseStyles } from '@/components/emails/base-styles'
import EmailFooter from '@/components/emails/footer'
import { getBrandConfig } from '@/lib/branding/branding'
import { getBaseUrl } from '@/lib/urls/utils'
import { baseStyles } from './base-styles'
import EmailFooter from './footer'

interface EnterpriseSubscriptionEmailProps {
userName?: string
142
apps/sim/components/emails/billing/free-tier-upgrade-email.tsx
Normal file
@@ -0,0 +1,142 @@
import {
  Body,
  Column,
  Container,
  Head,
  Hr,
  Html,
  Img,
  Link,
  Preview,
  Row,
  Section,
  Text,
} from '@react-email/components'
import { baseStyles } from '@/components/emails/base-styles'
import EmailFooter from '@/components/emails/footer'
import { getBrandConfig } from '@/lib/branding/branding'
import { getBaseUrl } from '@/lib/urls/utils'

interface FreeTierUpgradeEmailProps {
  userName?: string
  percentUsed: number
  currentUsage: number
  limit: number
  upgradeLink: string
  updatedDate?: Date
}

export function FreeTierUpgradeEmail({
  userName,
  percentUsed,
  currentUsage,
  limit,
  upgradeLink,
  updatedDate = new Date(),
}: FreeTierUpgradeEmailProps) {
  const brand = getBrandConfig()
  const baseUrl = getBaseUrl()

  const previewText = `${brand.name}: You've used ${percentUsed}% of your free credits`

  return (
    <Html>
      <Head />
      <Preview>{previewText}</Preview>
      <Body style={baseStyles.main}>
        <Container style={baseStyles.container}>
          <Section style={{ padding: '30px 0', textAlign: 'center' }}>
            <Row>
              <Column style={{ textAlign: 'center' }}>
                <Img
                  src={brand.logoUrl || `${baseUrl}/logo/reverse/text/medium.png`}
                  width='114'
                  alt={brand.name}
                  style={{
                    margin: '0 auto',
                  }}
                />
              </Column>
            </Row>
          </Section>

          <Section style={baseStyles.sectionsBorders}>
            <Row>
              <Column style={baseStyles.sectionBorder} />
              <Column style={baseStyles.sectionCenter} />
              <Column style={baseStyles.sectionBorder} />
            </Row>
          </Section>

          <Section style={baseStyles.content}>
            <Text style={{ ...baseStyles.paragraph, marginTop: 0 }}>
              {userName ? `Hi ${userName},` : 'Hi,'}
            </Text>

            <Text style={baseStyles.paragraph}>
              You've used <strong>${currentUsage.toFixed(2)}</strong> of your{' '}
              <strong>${limit.toFixed(2)}</strong> free credits ({percentUsed}%).
            </Text>

            <Text style={baseStyles.paragraph}>
              To ensure uninterrupted service and unlock the full power of {brand.name}, upgrade to
              Pro today.
            </Text>

            <Section
              style={{
                backgroundColor: '#f8f9fa',
                padding: '20px',
                borderRadius: '5px',
                margin: '20px 0',
              }}
            >
              <Text
                style={{
                  ...baseStyles.paragraph,
                  marginTop: 0,
                  marginBottom: 12,
                  fontWeight: 'bold',
                }}
              >
                What you get with Pro:
              </Text>
              <Text style={{ ...baseStyles.paragraph, margin: '8px 0', lineHeight: 1.6 }}>
                • <strong>$20/month in credits</strong> – 2x your free tier
                <br />• <strong>Priority support</strong> – Get help when you need it
                <br />• <strong>Advanced features</strong> – Access to premium blocks and
                integrations
                <br />• <strong>No interruptions</strong> – Never worry about running out of credits
              </Text>
            </Section>

            <Hr />

            <Text style={baseStyles.paragraph}>Upgrade now to keep building without limits.</Text>

            <Link href={upgradeLink} style={{ textDecoration: 'none' }}>
              <Text style={baseStyles.button}>Upgrade to Pro</Text>
            </Link>

            <Text style={baseStyles.paragraph}>
              Questions? We're here to help.
              <br />
              <br />
              Best regards,
              <br />
              The {brand.name} Team
            </Text>

            <Text style={{ ...baseStyles.paragraph, fontSize: '12px', color: '#666' }}>
              Sent on {updatedDate.toLocaleDateString()} • This is a one-time notification at 90%.
            </Text>
          </Section>
        </Container>

        <EmailFooter baseUrl={baseUrl} />
      </Body>
    </Html>
  )
}

export default FreeTierUpgradeEmail
167
apps/sim/components/emails/billing/payment-failed-email.tsx
Normal file
@@ -0,0 +1,167 @@
import {
  Body,
  Column,
  Container,
  Head,
  Hr,
  Html,
  Img,
  Link,
  Preview,
  Row,
  Section,
  Text,
} from '@react-email/components'
import { baseStyles } from '@/components/emails/base-styles'
import EmailFooter from '@/components/emails/footer'
import { getBrandConfig } from '@/lib/branding/branding'
import { getBaseUrl } from '@/lib/urls/utils'

interface PaymentFailedEmailProps {
  userName?: string
  amountDue: number
  lastFourDigits?: string
  billingPortalUrl: string
  failureReason?: string
  sentDate?: Date
}

export function PaymentFailedEmail({
  userName,
  amountDue,
  lastFourDigits,
  billingPortalUrl,
  failureReason,
  sentDate = new Date(),
}: PaymentFailedEmailProps) {
  const brand = getBrandConfig()
  const baseUrl = getBaseUrl()

  const previewText = `${brand.name}: Payment Failed - Action Required`

  return (
    <Html>
      <Head />
      <Preview>{previewText}</Preview>
      <Body style={baseStyles.main}>
        <Container style={baseStyles.container}>
          <Section style={{ padding: '30px 0', textAlign: 'center' }}>
            <Row>
              <Column style={{ textAlign: 'center' }}>
                <Img
                  src={brand.logoUrl || `${baseUrl}/logo/reverse/text/medium.png`}
                  width='114'
                  alt={brand.name}
                  style={{
                    margin: '0 auto',
                  }}
                />
              </Column>
            </Row>
          </Section>

          <Section style={baseStyles.sectionsBorders}>
            <Row>
              <Column style={baseStyles.sectionBorder} />
              <Column style={baseStyles.sectionCenter} />
              <Column style={baseStyles.sectionBorder} />
            </Row>
          </Section>

          <Section style={baseStyles.content}>
            <Text style={{ ...baseStyles.paragraph, marginTop: 0 }}>
              {userName ? `Hi ${userName},` : 'Hi,'}
            </Text>

            <Text style={{ ...baseStyles.paragraph, fontSize: '18px', fontWeight: 'bold' }}>
              We were unable to process your payment.
            </Text>

            <Text style={baseStyles.paragraph}>
              Your {brand.name} account has been temporarily blocked to prevent service
              interruptions and unexpected charges. To restore access immediately, please update
              your payment method.
            </Text>

            <Section
              style={{
                backgroundColor: '#fff5f5',
                border: '1px solid #fed7d7',
                borderRadius: '5px',
                padding: '16px',
                margin: '20px 0',
              }}
            >
              <Row>
                <Column>
                  <Text style={{ ...baseStyles.paragraph, marginBottom: 8, marginTop: 0 }}>
                    <strong>Payment Details</strong>
                  </Text>
                  <Text style={{ ...baseStyles.paragraph, margin: '4px 0' }}>
                    Amount due: ${amountDue.toFixed(2)}
                  </Text>
                  {lastFourDigits && (
                    <Text style={{ ...baseStyles.paragraph, margin: '4px 0' }}>
                      Payment method: •••• {lastFourDigits}
                    </Text>
                  )}
                  {failureReason && (
                    <Text style={{ ...baseStyles.paragraph, margin: '4px 0' }}>
                      Reason: {failureReason}
                    </Text>
                  )}
                </Column>
              </Row>
            </Section>

            <Link href={billingPortalUrl} style={{ textDecoration: 'none' }}>
              <Text style={baseStyles.button}>Update Payment Method</Text>
            </Link>

            <Hr />

            <Text style={baseStyles.paragraph}>
              <strong>What happens next?</strong>
            </Text>

            <Text style={baseStyles.paragraph}>
              • Your workflows and automations are currently paused
              <br />• Update your payment method to restore service immediately
              <br />• Stripe will automatically retry the charge once payment is updated
            </Text>

            <Hr />

            <Text style={baseStyles.paragraph}>
              <strong>Need help?</strong>
            </Text>

            <Text style={baseStyles.paragraph}>
              Common reasons for payment failures include expired cards, insufficient funds, or
              incorrect billing information. If you continue to experience issues, please{' '}
              <Link href={`${baseUrl}/support`} style={baseStyles.link}>
                contact our support team
              </Link>
              .
            </Text>

            <Text style={baseStyles.paragraph}>
              Best regards,
              <br />
              The Sim Team
            </Text>

            <Text style={{ ...baseStyles.paragraph, fontSize: '12px', color: '#666' }}>
              Sent on {sentDate.toLocaleDateString()} • This is a critical transactional
              notification.
            </Text>
          </Section>
        </Container>

        <EmailFooter baseUrl={baseUrl} />
      </Body>
    </Html>
  )
}

export default PaymentFailedEmail
@@ -12,10 +12,10 @@ import {
Section,
Text,
} from '@react-email/components'
import { baseStyles } from '@/components/emails/base-styles'
import EmailFooter from '@/components/emails/footer'
import { getBrandConfig } from '@/lib/branding/branding'
import { getBaseUrl } from '@/lib/urls/utils'
import { baseStyles } from './base-styles'

interface PlanWelcomeEmailProps {
planName: 'Pro' | 'Team'
@@ -12,10 +12,10 @@ import {
Section,
Text,
} from '@react-email/components'
import { baseStyles } from '@/components/emails/base-styles'
import EmailFooter from '@/components/emails/footer'
import { getBrandConfig } from '@/lib/branding/branding'
import { getBaseUrl } from '@/lib/urls/utils'
import { baseStyles } from './base-styles'

interface UsageThresholdEmailProps {
userName?: string
@@ -11,10 +11,10 @@ import {
Text,
} from '@react-email/components'
import { format } from 'date-fns'
import { baseStyles } from '@/components/emails/base-styles'
import EmailFooter from '@/components/emails/footer'
import { getBrandConfig } from '@/lib/branding/branding'
import { getBaseUrl } from '@/lib/urls/utils'
import { baseStyles } from './base-styles'
import EmailFooter from './footer'

interface CareersConfirmationEmailProps {
name: string
@@ -11,9 +11,9 @@ import {
Text,
} from '@react-email/components'
import { format } from 'date-fns'
import { baseStyles } from '@/components/emails/base-styles'
import { getBrandConfig } from '@/lib/branding/branding'
import { getBaseUrl } from '@/lib/urls/utils'
import { baseStyles } from './base-styles'

interface CareersSubmissionEmailProps {
name: string
@@ -1,12 +1,12 @@
export * from './base-styles'
export { BatchInvitationEmail } from './batch-invitation-email'
export { EnterpriseSubscriptionEmail } from './enterprise-subscription-email'
export { EnterpriseSubscriptionEmail } from './billing/enterprise-subscription-email'
export { PlanWelcomeEmail } from './billing/plan-welcome-email'
export { UsageThresholdEmail } from './billing/usage-threshold-email'
export { default as EmailFooter } from './footer'
export { HelpConfirmationEmail } from './help-confirmation-email'
export { InvitationEmail } from './invitation-email'
export { OTPVerificationEmail } from './otp-verification-email'
export { PlanWelcomeEmail } from './plan-welcome-email'
export * from './render-email'
export { ResetPasswordEmail } from './reset-password-email'
export { UsageThresholdEmail } from './usage-threshold-email'
export { WorkspaceInvitationEmail } from './workspace-invitation'

@@ -9,6 +9,7 @@ import {
ResetPasswordEmail,
UsageThresholdEmail,
} from '@/components/emails'
import FreeTierUpgradeEmail from '@/components/emails/billing/free-tier-upgrade-email'
import { getBrandConfig } from '@/lib/branding/branding'
import { getBaseUrl } from '@/lib/urls/utils'

@@ -124,6 +125,25 @@ export async function renderUsageThresholdEmail(params: {
)
}

export async function renderFreeTierUpgradeEmail(params: {
  userName?: string
  percentUsed: number
  currentUsage: number
  limit: number
  upgradeLink: string
}): Promise<string> {
  return await render(
    FreeTierUpgradeEmail({
      userName: params.userName,
      percentUsed: params.percentUsed,
      currentUsage: params.currentUsage,
      limit: params.limit,
      upgradeLink: params.upgradeLink,
      updatedDate: new Date(),
    })
  )
}

export function getEmailSubject(
type:
| 'sign-in'
@@ -135,6 +155,7 @@ export function getEmailSubject(
| 'help-confirmation'
| 'enterprise-subscription'
| 'usage-threshold'
| 'free-tier-upgrade'
| 'plan-welcome-pro'
| 'plan-welcome-team'
): string {
@@ -159,6 +180,8 @@ export function getEmailSubject(
return `Your Enterprise Plan is now active on ${brandName}`
case 'usage-threshold':
return `You're nearing your monthly budget on ${brandName}`
case 'free-tier-upgrade':
return `You're at 90% of your free credits on ${brandName}`
case 'plan-welcome-pro':
return `Your Pro plan is now active on ${brandName}`
case 'plan-welcome-team':

@@ -81,9 +81,7 @@ function Container({
'bg-[#1F1F1F] font-medium font-mono text-sm transition-colors',
'dark:border-[var(--border-strong)]',
// Overflow handling for long content
'overflow-x-auto',
// Vertical resize handle
'resize-y overflow-y-auto',
'overflow-x-auto overflow-y-auto',
// Streaming state
isStreaming && 'streaming-effect',
className

@@ -41,8 +41,9 @@ import { cn } from '@/lib/utils'

/**
 * Modal z-index configuration
 * Set higher than Dialog component (10000000) to ensure Settings modal appears on top when opened from Deploy modal
 */
const MODAL_Z_INDEX = 9999999
const MODAL_Z_INDEX = 10000100

/**
 * Modal sizing constants

@@ -296,7 +296,8 @@ const PopoverContent = React.forwardRef<
'z-[10000001] flex flex-col overflow-auto rounded-[8px] bg-[var(--surface-3)] px-[5.5px] py-[5px] text-foreground outline-none dark:bg-[var(--surface-3)]',
// If width is constrained by the caller (prop or style), ensure inner flexible text truncates by default,
// and also truncate section headers.
hasUserWidthConstraint && '[&_.flex-1]:truncate [&_[data-popover-section]]:truncate'
hasUserWidthConstraint && '[&_.flex-1]:truncate [&_[data-popover-section]]:truncate',
className
)}
style={{
maxHeight: `${maxHeight || 400}px`,

@@ -123,7 +123,7 @@ export class EdgeConstructor {
}
}

if (metadata.routerBlockIds.has(source)) {
if (metadata.routerBlockIds.has(source) && handle !== EDGE.ERROR) {
handle = `${EDGE.ROUTER_PREFIX}${target}`
}

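The router-block hunk above guards the handle rewrite so that error edges keep their dedicated handle. A minimal sketch of that decision (the `resolveHandle` helper and its signature are illustrative, not the real `EdgeConstructor` API; only the `EDGE` constant names come from the diff):

```typescript
// Illustrative constants mirroring the EDGE values referenced in the diff.
const EDGE = { ERROR: 'error', ROUTER_PREFIX: 'router-' }

// Sketch: a router source rewrites the handle to point at its chosen target,
// EXCEPT when the edge is an error edge, which must keep its 'error' handle.
function resolveHandle(
  isRouterSource: boolean,
  handle: string | undefined,
  target: string
): string | undefined {
  if (isRouterSource && handle !== EDGE.ERROR) {
    return `${EDGE.ROUTER_PREFIX}${target}`
  }
  return handle
}

console.log(resolveHandle(true, 'error', 'b2')) // error edge preserved
console.log(resolveHandle(true, undefined, 'b2')) // rewritten to router handle
```

Without the `handle !== EDGE.ERROR` check, an error edge leaving a router block would be misclassified as a routing edge, which is the bug the hunk fixes.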
@@ -28,6 +28,24 @@ export class PathConstructor {
const block = workflow.blocks.find((b) => b.id === triggerBlockId)

if (block) {
if (!block.enabled) {
logger.error('Provided triggerBlockId is disabled, finding alternative', {
triggerBlockId,
blockEnabled: block.enabled,
})
// Try to find an alternative enabled trigger instead of failing
const alternativeTrigger = this.findExplicitTrigger(workflow)
if (alternativeTrigger) {
logger.info('Using alternative enabled trigger', {
disabledTriggerId: triggerBlockId,
alternativeTriggerId: alternativeTrigger,
})
return alternativeTrigger
}
throw new Error(
`Trigger block ${triggerBlockId} is disabled and no alternative enabled trigger found`
)
}
return triggerBlockId
}

@@ -95,8 +113,13 @@ export class PathConstructor {

private buildAdjacencyMap(workflow: SerializedWorkflow): Map<string, string[]> {
const adjacency = new Map<string, string[]>()
const enabledBlocks = new Set(workflow.blocks.filter((b) => b.enabled).map((b) => b.id))

for (const connection of workflow.connections) {
if (!enabledBlocks.has(connection.source) || !enabledBlocks.has(connection.target)) {
continue
}

const neighbors = adjacency.get(connection.source) ?? []
neighbors.push(connection.target)
adjacency.set(connection.source, neighbors)

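The `buildAdjacencyMap` change above is what keeps disabled blocks out of the DAG: any connection touching a disabled block is skipped, so traversal can never reach it. A self-contained sketch of that filtering (the `Block`/`Connection` shapes are simplified stand-ins for the serialized workflow types):

```typescript
// Simplified stand-ins for the serialized workflow types in the diff.
interface Block { id: string; enabled: boolean }
interface Connection { source: string; target: string }

// Edges whose source OR target is disabled are dropped entirely,
// mirroring the enabledBlocks filter in buildAdjacencyMap.
function buildAdjacency(blocks: Block[], connections: Connection[]): Map<string, string[]> {
  const enabled = new Set(blocks.filter((b) => b.enabled).map((b) => b.id))
  const adjacency = new Map<string, string[]>()
  for (const c of connections) {
    if (!enabled.has(c.source) || !enabled.has(c.target)) continue
    const neighbors = adjacency.get(c.source) ?? []
    neighbors.push(c.target)
    adjacency.set(c.source, neighbors)
  }
  return adjacency
}

const adj = buildAdjacency(
  [{ id: 'start', enabled: true }, { id: 'mid', enabled: false }, { id: 'end', enabled: true }],
  [{ source: 'start', target: 'mid' }, { source: 'mid', target: 'end' }]
)
console.log(adj.size) // 0 — both edges touch the disabled block, so neither survives
```

Note this also severs downstream blocks that are only reachable through a disabled block, which matches the run-path semantics the PR describes.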
@@ -5,6 +5,7 @@ import type { LoopScope } from '@/executor/execution/state'
import type { BlockStateController } from '@/executor/execution/types'
import type { ExecutionContext, NormalizedBlockOutput } from '@/executor/types'
import type { LoopConfigWithNodes } from '@/executor/types/loop'
import { replaceValidReferences } from '@/executor/utils/reference-validation'
import {
buildSentinelEndId,
buildSentinelStartId,
@@ -271,16 +272,14 @@ export class LoopOrchestrator {
}

try {
const referencePattern = /<([^>]+)>/g
let evaluatedCondition = condition

logger.info('Evaluating loop condition', {
originalCondition: condition,
iteration: scope.iteration,
workflowVariables: ctx.workflowVariables,
})

evaluatedCondition = evaluatedCondition.replace(referencePattern, (match) => {
// Use generic utility for smart variable reference replacement
const evaluatedCondition = replaceValidReferences(condition, (match) => {
const resolved = this.resolver.resolveSingleReference(ctx, '', match, scope)
logger.info('Resolved variable reference in loop condition', {
reference: match,

49 apps/sim/executor/utils/reference-validation.ts (Normal file)
@@ -0,0 +1,49 @@
import { isLikelyReferenceSegment } from '@/lib/workflows/references'
import { REFERENCE } from '@/executor/consts'

/**
 * Creates a regex pattern for matching variable references.
 * Uses [^<>]+ to prevent matching across nested brackets (e.g., "<3 <real.ref>" matches separately).
 */
export function createReferencePattern(): RegExp {
  return new RegExp(
    `${REFERENCE.START}([^${REFERENCE.START}${REFERENCE.END}]+)${REFERENCE.END}`,
    'g'
  )
}

/**
 * Creates a regex pattern for matching environment variables {{variable}}
 */
export function createEnvVarPattern(): RegExp {
  return new RegExp(`\\${REFERENCE.ENV_VAR_START}([^}]+)\\${REFERENCE.ENV_VAR_END}`, 'g')
}

/**
 * Combined pattern matching both <reference> and {{env_var}}
 */
export function createCombinedPattern(): RegExp {
  return new RegExp(
    `${REFERENCE.START}[^${REFERENCE.START}${REFERENCE.END}]+${REFERENCE.END}|` +
      `\\${REFERENCE.ENV_VAR_START}[^}]+\\${REFERENCE.ENV_VAR_END}`,
    'g'
  )
}

/**
 * Replaces variable references with smart validation.
 * Distinguishes < operator from < bracket using isLikelyReferenceSegment.
 */
export function replaceValidReferences(
  template: string,
  replacer: (match: string) => string
): string {
  const pattern = createReferencePattern()

  return template.replace(pattern, (match) => {
    if (!isLikelyReferenceSegment(match)) {
      return match
    }
    return replacer(match)
  })
}
||||
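The point of `replaceValidReferences` is to leave comparison operators like `<3` alone while still rewriting real references like `<loop.index>`. A self-contained sketch follows; note that the `REFERENCE` constant and `isLikelyReferenceSegment` heuristic below are simplified assumptions (the real implementations live in `@/executor/consts` and `@/lib/workflows/references` and may differ):

```typescript
// Assumed delimiters; the real values come from '@/executor/consts'
const REFERENCE = { START: '<', END: '>' }

// Assumed heuristic: "<block.output>" looks like a reference, while
// "<3" or "< 5" is probably a comparison operator and is left alone
function isLikelyReferenceSegment(segment: string): boolean {
  const inner = segment.slice(1, -1)
  return /^[a-zA-Z_][\w-]*(\.[\w-]+)*$/.test(inner)
}

function createReferencePattern(): RegExp {
  // [^<>]+ prevents matching across nested brackets, e.g. "<3 <real.ref>"
  return new RegExp(
    `${REFERENCE.START}([^${REFERENCE.START}${REFERENCE.END}]+)${REFERENCE.END}`,
    'g'
  )
}

function replaceValidReferences(template: string, replacer: (match: string) => string): string {
  return template.replace(createReferencePattern(), (match) =>
    isLikelyReferenceSegment(match) ? replacer(match) : match
  )
}
```

With this sketch, `replaceValidReferences('count <3 and <loop.index>', () => '7')` rewrites only the bracketed reference, because `<3 and <` can never match a pattern that forbids `<` inside the brackets.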
@@ -2,6 +2,7 @@ import { createLogger } from '@/lib/logs/console/logger'
import { BlockType, REFERENCE } from '@/executor/consts'
import type { ExecutionState, LoopScope } from '@/executor/execution/state'
import type { ExecutionContext } from '@/executor/types'
+import { replaceValidReferences } from '@/executor/utils/reference-validation'
import { BlockResolver } from '@/executor/variables/resolvers/block'
import { EnvResolver } from '@/executor/variables/resolvers/env'
import { LoopResolver } from '@/executor/variables/resolvers/loop'

@@ -147,21 +148,17 @@ export class VariableResolver {
    loopScope?: LoopScope,
    block?: SerializedBlock
  ): string {
-    let result = template
    const resolutionContext: ResolutionContext = {
      executionContext: ctx,
      executionState: this.state,
      currentNodeId,
      loopScope,
    }
-    const referenceRegex = new RegExp(
-      `${REFERENCE.START}([^${REFERENCE.END}]+)${REFERENCE.END}`,
-      'g'
-    )

    let replacementError: Error | null = null

-    result = result.replace(referenceRegex, (match) => {
+    // Use generic utility for smart variable reference replacement
+    let result = replaceValidReferences(template, (match) => {
      if (replacementError) return match

      try {

@@ -202,21 +199,17 @@ export class VariableResolver {
    template: string,
    loopScope?: LoopScope
  ): string {
-    let result = template
    const resolutionContext: ResolutionContext = {
      executionContext: ctx,
      executionState: this.state,
      currentNodeId,
      loopScope,
    }
-    const referenceRegex = new RegExp(
-      `${REFERENCE.START}([^${REFERENCE.END}]+)${REFERENCE.END}`,
-      'g'
-    )

    let replacementError: Error | null = null

-    result = result.replace(referenceRegex, (match) => {
+    // Use generic utility for smart variable reference replacement
+    let result = replaceValidReferences(template, (match) => {
      if (replacementError) return match

      try {
@@ -59,7 +59,7 @@ export function useOrganizations() {
    queryKey: creatorProfileKeys.organizations(),
    queryFn: fetchOrganizations,
    staleTime: 5 * 60 * 1000, // 5 minutes - organizations don't change often
-    placeholderData: keepPreviousData, // Show cached data immediately (no skeleton loading!)
+    placeholderData: keepPreviousData, // Show cached data immediately
  })
}

@@ -97,7 +97,7 @@ export function useCreatorProfile(userId: string) {
    enabled: !!userId,
    retry: false, // Don't retry on 404
    staleTime: 60 * 1000, // 1 minute
-    placeholderData: keepPreviousData, // Show cached data immediately (no skeleton loading!)
+    placeholderData: keepPreviousData, // Show cached data immediately
  })
}
@@ -1,8 +1,8 @@
import { useEffect } from 'react'
import { keepPreviousData, useMutation, useQuery, useQueryClient } from '@tanstack/react-query'
import { createLogger } from '@/lib/logs/console/logger'
import { workflowKeys } from '@/hooks/queries/workflows'
import { useFolderStore, type WorkflowFolder } from '@/stores/folders/store'
import { useWorkflowRegistry } from '@/stores/workflows/registry/store'

const logger = createLogger('FolderQueries')

@@ -150,11 +150,8 @@ export function useDeleteFolderMutation() {
    },
    onSuccess: async (_data, variables) => {
      queryClient.invalidateQueries({ queryKey: folderKeys.list(variables.workspaceId) })
-      try {
-        await useWorkflowRegistry.getState().loadWorkflows(variables.workspaceId)
-      } catch (error) {
-        logger.error('Failed to reload workflows after folder delete', { error })
-      }
+      // Invalidate workflow queries to reload workflows after folder changes
+      queryClient.invalidateQueries({ queryKey: workflowKeys.list(variables.workspaceId) })
    },
  })
}

@@ -184,11 +181,8 @@ export function useDuplicateFolderMutation() {
    },
    onSuccess: async (_data, variables) => {
      queryClient.invalidateQueries({ queryKey: folderKeys.list(variables.workspaceId) })
-      try {
-        await useWorkflowRegistry.getState().loadWorkflows(variables.workspaceId)
-      } catch (error) {
-        logger.error('Failed to reload workflows after folder duplicate', { error })
-      }
+      // Invalidate workflow queries to reload workflows after folder changes
+      queryClient.invalidateQueries({ queryKey: workflowKeys.list(variables.workspaceId) })
    },
  })
}
421 apps/sim/hooks/queries/logs.ts (Normal file)
@@ -0,0 +1,421 @@
import { keepPreviousData, useInfiniteQuery, useQuery } from '@tanstack/react-query'
import type { LogsResponse, WorkflowLog } from '@/stores/logs/filters/types'

export const logKeys = {
  all: ['logs'] as const,
  lists: () => [...logKeys.all, 'list'] as const,
  list: (workspaceId: string | undefined, filters: Omit<LogFilters, 'page'>) =>
    [...logKeys.lists(), workspaceId ?? '', filters] as const,
  details: () => [...logKeys.all, 'detail'] as const,
  detail: (logId: string | undefined) => [...logKeys.details(), logId ?? ''] as const,
  metrics: () => [...logKeys.all, 'metrics'] as const,
  executions: (workspaceId: string | undefined, filters: Record<string, any>) =>
    [...logKeys.metrics(), 'executions', workspaceId ?? '', filters] as const,
  workflowLogs: (
    workspaceId: string | undefined,
    workflowId: string | undefined,
    filters: Record<string, any>
  ) => [...logKeys.all, 'workflow-logs', workspaceId ?? '', workflowId ?? '', filters] as const,
  globalLogs: (workspaceId: string | undefined, filters: Record<string, any>) =>
    [...logKeys.all, 'global-logs', workspaceId ?? '', filters] as const,
}

interface LogFilters {
  timeRange: string
  level: string
  workflowIds: string[]
  folderIds: string[]
  triggers: string[]
  searchQuery: string
  limit: number
}

async function fetchLogsPage(
  workspaceId: string,
  filters: LogFilters,
  page: number
): Promise<{ logs: WorkflowLog[]; hasMore: boolean; nextPage: number | undefined }> {
  const queryParams = buildQueryParams(workspaceId, filters, page)
  const response = await fetch(`/api/logs?${queryParams}`)

  if (!response.ok) {
    throw new Error('Failed to fetch logs')
  }

  const apiData: LogsResponse = await response.json()
  const hasMore = apiData.data.length === filters.limit && apiData.page < apiData.totalPages

  return {
    logs: apiData.data || [],
    hasMore,
    nextPage: hasMore ? page + 1 : undefined,
  }
}
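The `logKeys` object follows the hierarchical query-key factory pattern: every narrower key is built by spreading its parent, so a parent key is always a prefix of its children, which is what TanStack Query's prefix-based invalidation relies on. A reduced sketch of the idea (with a hypothetical `isPrefix` helper to make the property explicit):

```typescript
// Reduced sketch of the hierarchical key factory pattern
const logKeys = {
  all: ['logs'] as const,
  lists: () => [...logKeys.all, 'list'] as const,
  list: (workspaceId: string | undefined, filters: Record<string, unknown>) =>
    [...logKeys.lists(), workspaceId ?? '', filters] as const,
}

// A parent key is a structural prefix of its children; invalidating
// logKeys.lists() therefore also matches every logKeys.list(...) entry
function isPrefix(parent: readonly unknown[], child: readonly unknown[]): boolean {
  return parent.every((part, i) => JSON.stringify(part) === JSON.stringify(child[i]))
}
```

The `?? ''` fallbacks keep the key shape stable even when `workspaceId` is still undefined, avoiding accidental cache-key collisions between "no workspace" and a missing array slot.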
async function fetchLogDetail(logId: string): Promise<WorkflowLog> {
  const response = await fetch(`/api/logs/${logId}`)

  if (!response.ok) {
    throw new Error('Failed to fetch log details')
  }

  const { data } = await response.json()
  return data
}

function buildQueryParams(workspaceId: string, filters: LogFilters, page: number): string {
  const params = new URLSearchParams()

  params.set('workspaceId', workspaceId)
  params.set('limit', filters.limit.toString())
  params.set('offset', ((page - 1) * filters.limit).toString())

  if (filters.level !== 'all') {
    params.set('level', filters.level)
  }

  if (filters.triggers.length > 0) {
    params.set('triggers', filters.triggers.join(','))
  }

  if (filters.workflowIds.length > 0) {
    params.set('workflowIds', filters.workflowIds.join(','))
  }

  if (filters.folderIds.length > 0) {
    params.set('folderIds', filters.folderIds.join(','))
  }

  if (filters.timeRange !== 'All time') {
    const now = new Date()
    let startDate: Date

    switch (filters.timeRange) {
      case 'Past 30 minutes':
        startDate = new Date(now.getTime() - 30 * 60 * 1000)
        break
      case 'Past hour':
        startDate = new Date(now.getTime() - 60 * 60 * 1000)
        break
      case 'Past 6 hours':
        startDate = new Date(now.getTime() - 6 * 60 * 60 * 1000)
        break
      case 'Past 12 hours':
        startDate = new Date(now.getTime() - 12 * 60 * 60 * 1000)
        break
      case 'Past 24 hours':
        startDate = new Date(now.getTime() - 24 * 60 * 60 * 1000)
        break
      case 'Past 3 days':
        startDate = new Date(now.getTime() - 3 * 24 * 60 * 60 * 1000)
        break
      case 'Past 7 days':
        startDate = new Date(now.getTime() - 7 * 24 * 60 * 60 * 1000)
        break
      case 'Past 14 days':
        startDate = new Date(now.getTime() - 14 * 24 * 60 * 60 * 1000)
        break
      case 'Past 30 days':
        startDate = new Date(now.getTime() - 30 * 24 * 60 * 60 * 1000)
        break
      default:
        startDate = new Date(0)
    }

    params.set('startDate', startDate.toISOString())
  }

  if (filters.searchQuery.trim()) {
    params.set('search', filters.searchQuery.trim())
  }

  return params.toString()
}
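The pagination contract between `buildQueryParams` and `fetchLogsPage` is simple offset math plus a "full page means there may be more" heuristic. A minimal sketch of just that arithmetic:

```typescript
// Page numbers are 1-based; the server takes a zero-based offset
function pageToOffset(page: number, limit: number): number {
  return (page - 1) * limit
}

// A page is assumed to be the last one when it comes back shorter
// than the requested limit; otherwise the next page param is page + 1
function nextPageParam(receivedCount: number, limit: number, page: number): number | undefined {
  const hasMore = receivedCount === limit
  return hasMore ? page + 1 : undefined
}
```

Returning `undefined` is what signals `useInfiniteQuery`'s `getNextPageParam` that there are no further pages to fetch.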
interface UseLogsListOptions {
  enabled?: boolean
  refetchInterval?: number | false
}

export function useLogsList(
  workspaceId: string | undefined,
  filters: LogFilters,
  options?: UseLogsListOptions
) {
  return useInfiniteQuery({
    queryKey: logKeys.list(workspaceId, filters),
    queryFn: ({ pageParam }) => fetchLogsPage(workspaceId as string, filters, pageParam),
    enabled: Boolean(workspaceId) && (options?.enabled ?? true),
    refetchInterval: options?.refetchInterval ?? false,
    staleTime: 0, // Always consider stale for real-time logs
    initialPageParam: 1,
    getNextPageParam: (lastPage) => lastPage.nextPage,
  })
}

export function useLogDetail(logId: string | undefined) {
  return useQuery({
    queryKey: logKeys.detail(logId),
    queryFn: () => fetchLogDetail(logId as string),
    enabled: Boolean(logId),
    staleTime: 30 * 1000, // Details can be slightly stale (30 seconds)
    placeholderData: keepPreviousData,
  })
}

interface WorkflowSegment {
  timestamp: string
  hasExecutions: boolean
  totalExecutions: number
  successfulExecutions: number
  successRate: number
  avgDurationMs?: number
  p50Ms?: number
  p90Ms?: number
  p99Ms?: number
}

interface WorkflowExecution {
  workflowId: string
  workflowName: string
  segments: WorkflowSegment[]
  overallSuccessRate: number
}

interface AggregateSegment {
  timestamp: string
  totalExecutions: number
  successfulExecutions: number
}

interface ExecutionsMetricsResponse {
  workflows: WorkflowExecution[]
  aggregateSegments: AggregateSegment[]
}

interface DashboardMetricsFilters {
  workspaceId: string
  segments: number
  startTime: string
  endTime: string
  workflowIds?: string[]
  folderIds?: string[]
  triggers?: string[]
}

async function fetchExecutionsMetrics(
  filters: DashboardMetricsFilters
): Promise<ExecutionsMetricsResponse> {
  const params = new URLSearchParams({
    segments: String(filters.segments),
    startTime: filters.startTime,
    endTime: filters.endTime,
  })

  if (filters.workflowIds && filters.workflowIds.length > 0) {
    params.set('workflowIds', filters.workflowIds.join(','))
  }

  if (filters.folderIds && filters.folderIds.length > 0) {
    params.set('folderIds', filters.folderIds.join(','))
  }

  if (filters.triggers && filters.triggers.length > 0) {
    params.set('triggers', filters.triggers.join(','))
  }

  const response = await fetch(
    `/api/workspaces/${filters.workspaceId}/metrics/executions?${params.toString()}`
  )

  if (!response.ok) {
    throw new Error('Failed to fetch execution metrics')
  }

  const data = await response.json()

  const workflows: WorkflowExecution[] = (data.workflows || []).map((wf: any) => {
    const segments = (wf.segments || []).map((s: any) => {
      const total = s.totalExecutions || 0
      const success = s.successfulExecutions || 0
      const hasExecutions = total > 0
      const successRate = hasExecutions ? (success / total) * 100 : 100
      return {
        timestamp: s.timestamp,
        hasExecutions,
        totalExecutions: total,
        successfulExecutions: success,
        successRate,
        avgDurationMs: typeof s.avgDurationMs === 'number' ? s.avgDurationMs : 0,
        p50Ms: typeof s.p50Ms === 'number' ? s.p50Ms : 0,
        p90Ms: typeof s.p90Ms === 'number' ? s.p90Ms : 0,
        p99Ms: typeof s.p99Ms === 'number' ? s.p99Ms : 0,
      }
    })

    const totals = segments.reduce(
      (acc: { total: number; success: number }, seg: WorkflowSegment) => {
        acc.total += seg.totalExecutions
        acc.success += seg.successfulExecutions
        return acc
      },
      { total: 0, success: 0 }
    )

    const overallSuccessRate = totals.total > 0 ? (totals.success / totals.total) * 100 : 100

    return {
      workflowId: wf.workflowId,
      workflowName: wf.workflowName,
      segments,
      overallSuccessRate,
    }
  })

  const sortedWorkflows = workflows.sort((a, b) => {
    const errA = a.overallSuccessRate < 100 ? 1 - a.overallSuccessRate / 100 : 0
    const errB = b.overallSuccessRate < 100 ? 1 - b.overallSuccessRate / 100 : 0
    return errB - errA
  })

  const segmentCount = filters.segments
  const startTime = new Date(filters.startTime)
  const endTime = new Date(filters.endTime)

  const aggregateSegments: AggregateSegment[] = Array.from({ length: segmentCount }, (_, i) => {
    const base = startTime.getTime()
    const ts = new Date(base + Math.floor((i * (endTime.getTime() - base)) / segmentCount))
    return {
      timestamp: ts.toISOString(),
      totalExecutions: 0,
      successfulExecutions: 0,
    }
  })

  for (const wf of data.workflows as any[]) {
    wf.segments.forEach((s: any, i: number) => {
      const index = Math.min(i, segmentCount - 1)
      aggregateSegments[index].totalExecutions += s.totalExecutions || 0
      aggregateSegments[index].successfulExecutions += s.successfulExecutions || 0
    })
  }

  return {
    workflows: sortedWorkflows,
    aggregateSegments,
  }
}
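Two choices in the metrics mapping above are worth isolating: empty segments report a 100% success rate (so gaps in activity don't drag the chart down), and per-workflow segments are rolled up into fixed aggregate buckets with overflow clamped into the last one. A simplified sketch of both, using a pared-down `Segment` shape:

```typescript
// Simplified segment shape from the metrics response
interface Segment { totalExecutions: number; successfulExecutions: number }

// Segments with no executions report 100% rather than 0%,
// so idle periods don't look like failures
function successRate(s: Segment): number {
  return s.totalExecutions > 0 ? (s.successfulExecutions / s.totalExecutions) * 100 : 100
}

// Roll per-workflow segments into a fixed number of aggregate buckets;
// any overflow segment is clamped into the final bucket
function aggregate(workflows: Segment[][], segmentCount: number): Segment[] {
  const out: Segment[] = Array.from({ length: segmentCount }, () => ({
    totalExecutions: 0,
    successfulExecutions: 0,
  }))
  for (const segments of workflows) {
    segments.forEach((s, i) => {
      const index = Math.min(i, segmentCount - 1)
      out[index].totalExecutions += s.totalExecutions
      out[index].successfulExecutions += s.successfulExecutions
    })
  }
  return out
}
```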
interface UseExecutionsMetricsOptions {
  enabled?: boolean
  refetchInterval?: number | false
}

export function useExecutionsMetrics(
  filters: DashboardMetricsFilters,
  options?: UseExecutionsMetricsOptions
) {
  return useQuery({
    queryKey: logKeys.executions(filters.workspaceId, filters),
    queryFn: () => fetchExecutionsMetrics(filters),
    enabled: Boolean(filters.workspaceId) && (options?.enabled ?? true),
    refetchInterval: options?.refetchInterval ?? false,
    staleTime: 10 * 1000, // Metrics can be slightly stale (10 seconds)
    placeholderData: keepPreviousData,
  })
}

interface DashboardLogsFilters {
  workspaceId: string
  startDate: string
  endDate: string
  workflowIds?: string[]
  folderIds?: string[]
  triggers?: string[]
  limit: number
}

interface DashboardLogsPage {
  logs: any[] // Will be mapped by the consumer
  hasMore: boolean
  nextPage: number | undefined
}

async function fetchDashboardLogsPage(
  filters: DashboardLogsFilters,
  page: number,
  workflowId?: string
): Promise<DashboardLogsPage> {
  const params = new URLSearchParams({
    limit: filters.limit.toString(),
    offset: ((page - 1) * filters.limit).toString(),
    workspaceId: filters.workspaceId,
    startDate: filters.startDate,
    endDate: filters.endDate,
    order: 'desc',
    details: 'full',
  })

  if (workflowId) {
    params.set('workflowIds', workflowId)
  } else if (filters.workflowIds && filters.workflowIds.length > 0) {
    params.set('workflowIds', filters.workflowIds.join(','))
  }

  if (filters.folderIds && filters.folderIds.length > 0) {
    params.set('folderIds', filters.folderIds.join(','))
  }

  if (filters.triggers && filters.triggers.length > 0) {
    params.set('triggers', filters.triggers.join(','))
  }

  const response = await fetch(`/api/logs?${params.toString()}`)

  if (!response.ok) {
    throw new Error('Failed to fetch dashboard logs')
  }

  const data = await response.json()
  const logs = data.data || []
  const hasMore = logs.length === filters.limit

  return {
    logs,
    hasMore,
    nextPage: hasMore ? page + 1 : undefined,
  }
}

interface UseDashboardLogsOptions {
  enabled?: boolean
}

export function useGlobalDashboardLogs(
  filters: DashboardLogsFilters,
  options?: UseDashboardLogsOptions
) {
  return useInfiniteQuery({
    queryKey: logKeys.globalLogs(filters.workspaceId, filters),
    queryFn: ({ pageParam }) => fetchDashboardLogsPage(filters, pageParam),
    enabled: Boolean(filters.workspaceId) && (options?.enabled ?? true),
    staleTime: 10 * 1000, // Slightly stale (10 seconds)
    initialPageParam: 1,
    getNextPageParam: (lastPage) => lastPage.nextPage,
  })
}

export function useWorkflowDashboardLogs(
  workflowId: string | undefined,
  filters: DashboardLogsFilters,
  options?: UseDashboardLogsOptions
) {
  return useInfiniteQuery({
    queryKey: logKeys.workflowLogs(filters.workspaceId, workflowId, filters),
    queryFn: ({ pageParam }) => fetchDashboardLogsPage(filters, pageParam, workflowId),
    enabled: Boolean(filters.workspaceId) && Boolean(workflowId) && (options?.enabled ?? true),
    staleTime: 10 * 1000, // Slightly stale (10 seconds)
    initialPageParam: 1,
    getNextPageParam: (lastPage) => lastPage.nextPage,
  })
}
@@ -123,7 +123,7 @@ export function useOAuthConnections() {
    queryFn: fetchOAuthConnections,
    staleTime: 30 * 1000, // 30 seconds - connections don't change often
    retry: false, // Don't retry on 404
-    placeholderData: keepPreviousData, // Show cached data immediately (no skeleton loading!)
+    placeholderData: keepPreviousData, // Show cached data immediately
  })
}

@@ -53,7 +53,7 @@ export function useUserProfile() {
    queryKey: userProfileKeys.profile(),
    queryFn: fetchUserProfile,
    staleTime: 5 * 60 * 1000, // 5 minutes - profile data doesn't change often
-    placeholderData: keepPreviousData, // Show cached data immediately (no skeleton loading!)
+    placeholderData: keepPreviousData, // Show cached data immediately
  })
}
169 apps/sim/hooks/queries/workflows.ts (Normal file)
@@ -0,0 +1,169 @@
import { useEffect } from 'react'
import { keepPreviousData, useMutation, useQuery, useQueryClient } from '@tanstack/react-query'
import { createLogger } from '@/lib/logs/console/logger'
import { buildDefaultWorkflowArtifacts } from '@/lib/workflows/defaults'
import { useWorkflowRegistry } from '@/stores/workflows/registry/store'
import type { WorkflowMetadata } from '@/stores/workflows/registry/types'
import {
  generateCreativeWorkflowName,
  getNextWorkflowColor,
} from '@/stores/workflows/registry/utils'
import { useSubBlockStore } from '@/stores/workflows/subblock/store'

const logger = createLogger('WorkflowQueries')

export const workflowKeys = {
  all: ['workflows'] as const,
  lists: () => [...workflowKeys.all, 'list'] as const,
  list: (workspaceId: string | undefined) => [...workflowKeys.lists(), workspaceId ?? ''] as const,
}

function mapWorkflow(workflow: any): WorkflowMetadata {
  return {
    id: workflow.id,
    name: workflow.name,
    description: workflow.description,
    color: workflow.color,
    workspaceId: workflow.workspaceId,
    folderId: workflow.folderId,
    createdAt: new Date(workflow.createdAt),
    lastModified: new Date(workflow.updatedAt || workflow.createdAt),
  }
}

async function fetchWorkflows(workspaceId: string): Promise<WorkflowMetadata[]> {
  const response = await fetch(`/api/workflows?workspaceId=${workspaceId}`)

  if (!response.ok) {
    throw new Error('Failed to fetch workflows')
  }

  const { data }: { data: any[] } = await response.json()
  return data.map(mapWorkflow)
}

export function useWorkflows(workspaceId?: string) {
  const setWorkflows = useWorkflowRegistry((state) => state.setWorkflows)

  const query = useQuery({
    queryKey: workflowKeys.list(workspaceId),
    queryFn: () => fetchWorkflows(workspaceId as string),
    enabled: Boolean(workspaceId),
    placeholderData: keepPreviousData,
    staleTime: 60 * 1000,
  })

  useEffect(() => {
    if (query.data) {
      setWorkflows(query.data)
    }
  }, [query.data, setWorkflows])

  return query
}

interface CreateWorkflowVariables {
  workspaceId: string
  name?: string
  description?: string
  color?: string
  folderId?: string | null
}

export function useCreateWorkflow() {
  const queryClient = useQueryClient()

  return useMutation({
    mutationFn: async (variables: CreateWorkflowVariables) => {
      const { workspaceId, name, description, color, folderId } = variables

      logger.info(`Creating new workflow in workspace: ${workspaceId}`)

      const createResponse = await fetch('/api/workflows', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({
          name: name || generateCreativeWorkflowName(),
          description: description || 'New workflow',
          color: color || getNextWorkflowColor(),
          workspaceId,
          folderId: folderId || null,
        }),
      })

      if (!createResponse.ok) {
        const errorData = await createResponse.json()
        throw new Error(
          `Failed to create workflow: ${errorData.error || createResponse.statusText}`
        )
      }

      const createdWorkflow = await createResponse.json()
      const workflowId = createdWorkflow.id

      logger.info(`Successfully created workflow ${workflowId}`)

      const { workflowState } = buildDefaultWorkflowArtifacts()

      fetch(`/api/workflows/${workflowId}/state`, {
        method: 'PUT',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify(workflowState),
      })
        .then((response) => {
          if (!response.ok) {
            response.text().then((text) => {
              logger.error('Failed to persist default Start block:', text)
            })
          } else {
            logger.info('Successfully persisted default Start block')
          }
        })
        .catch((error) => {
          logger.error('Error persisting default Start block:', error)
        })

      return {
        id: workflowId,
        name: createdWorkflow.name,
        description: createdWorkflow.description,
        color: createdWorkflow.color,
        workspaceId,
        folderId: createdWorkflow.folderId,
      }
    },
    onSuccess: (data, variables) => {
      logger.info(`Workflow ${data.id} created successfully`)

      const { subBlockValues } = buildDefaultWorkflowArtifacts()
      useSubBlockStore.setState((state) => ({
        workflowValues: {
          ...state.workflowValues,
          [data.id]: subBlockValues,
        },
      }))

      useWorkflowRegistry.setState((state) => ({
        workflows: {
          ...state.workflows,
          [data.id]: {
            id: data.id,
            name: data.name,
            lastModified: new Date(),
            createdAt: new Date(),
            description: data.description,
            color: data.color,
            workspaceId: data.workspaceId,
            folderId: data.folderId,
          },
        },
        error: null,
      }))

      queryClient.invalidateQueries({ queryKey: workflowKeys.list(variables.workspaceId) })
    },
    onError: (error: Error) => {
      logger.error('Failed to create workflow:', error)
    },
  })
}
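`mapWorkflow` above quietly encodes a fallback: when a row has never been updated, `lastModified` falls back to `createdAt` rather than becoming `Invalid Date`. A self-contained sketch of just that mapping, with a pared-down row shape standing in for the API response:

```typescript
// Pared-down stand-ins for the API row and WorkflowMetadata shapes
interface Row { id: string; name: string; createdAt: string; updatedAt?: string }
interface Meta { id: string; name: string; createdAt: Date; lastModified: Date }

// lastModified falls back to createdAt when updatedAt is absent,
// so freshly created workflows still sort sensibly by recency
function mapWorkflow(w: Row): Meta {
  return {
    id: w.id,
    name: w.name,
    createdAt: new Date(w.createdAt),
    lastModified: new Date(w.updatedAt || w.createdAt),
  }
}
```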
@@ -49,7 +49,7 @@ export function useWorkspaceFiles(workspaceId: string) {
    queryFn: () => fetchWorkspaceFiles(workspaceId),
    enabled: !!workspaceId,
    staleTime: 30 * 1000, // 30 seconds - files can change frequently
-    placeholderData: keepPreviousData, // Show cached data immediately (no skeleton loading!)
+    placeholderData: keepPreviousData, // Show cached data immediately
  })
}
@@ -1,7 +1,11 @@
|
||||
import { db } from '@sim/db'
|
||||
import { member, organization, settings, user, userStats } from '@sim/db/schema'
|
||||
import { eq, inArray } from 'drizzle-orm'
|
||||
import { getEmailSubject, renderUsageThresholdEmail } from '@/components/emails/render-email'
|
||||
import {
|
||||
getEmailSubject,
|
||||
renderFreeTierUpgradeEmail,
|
||||
renderUsageThresholdEmail,
|
||||
} from '@/components/emails/render-email'
|
||||
import { getHighestPrioritySubscription } from '@/lib/billing/core/subscription'
|
||||
import {
|
||||
canEditUsageLimit,
|
||||
@@ -614,60 +618,113 @@ export async function maybeSendUsageThresholdEmail(params: {
|
||||
}): Promise<void> {
|
||||
try {
|
||||
if (!isBillingEnabled) return
|
||||
// Only on upward crossing to >= 80%
|
||||
if (!(params.percentBefore < 80 && params.percentAfter >= 80)) return
|
||||
if (params.limit <= 0 || params.currentUsageAfter <= 0) return
|
||||
|
||||
const baseUrl = getBaseUrl()
|
||||
const ctaLink = `${baseUrl}/workspace?billing=usage`
|
||||
const sendTo = async (email: string, name?: string) => {
|
||||
const prefs = await getEmailPreferences(email)
|
||||
if (prefs?.unsubscribeAll || prefs?.unsubscribeNotifications) return
|
||||
const isFreeUser = params.planName === 'Free'
|
||||
|
||||
const html = await renderUsageThresholdEmail({
|
||||
userName: name,
|
||||
planName: params.planName,
|
||||
percentUsed: Math.min(100, Math.round(params.percentAfter)),
|
||||
currentUsage: params.currentUsageAfter,
|
||||
limit: params.limit,
|
||||
ctaLink,
|
||||
})
|
||||
// Check for 80% threshold (all users)
|
||||
const crosses80 = params.percentBefore < 80 && params.percentAfter >= 80
|
||||
// Check for 90% threshold (free users only)
|
||||
const crosses90 = params.percentBefore < 90 && params.percentAfter >= 90
|
||||
|
||||
await sendEmail({
|
||||
to: email,
|
||||
subject: getEmailSubject('usage-threshold'),
|
||||
html,
|
||||
emailType: 'notifications',
|
||||
})
|
||||
// Skip if no thresholds crossed
|
||||
if (!crosses80 && !crosses90) return
|
||||
|
||||
// For 80% threshold email (all users)
|
||||
if (crosses80) {
|
||||
const ctaLink = `${baseUrl}/workspace?billing=usage`
|
||||
const sendTo = async (email: string, name?: string) => {
|
||||
const prefs = await getEmailPreferences(email)
|
||||
if (prefs?.unsubscribeAll || prefs?.unsubscribeNotifications) return
|
||||
|
||||
const html = await renderUsageThresholdEmail({
|
||||
userName: name,
|
||||
planName: params.planName,
|
||||
percentUsed: Math.min(100, Math.round(params.percentAfter)),
|
||||
currentUsage: params.currentUsageAfter,
|
||||
limit: params.limit,
|
||||
ctaLink,
|
||||
})
|
||||
|
||||
await sendEmail({
|
||||
to: email,
|
||||
subject: getEmailSubject('usage-threshold'),
|
||||
html,
|
||||
emailType: 'notifications',
|
||||
})
|
||||
}
|
||||
|
||||
if (params.scope === 'user' && params.userId && params.userEmail) {
|
||||
const rows = await db
|
||||
.select({ enabled: settings.billingUsageNotificationsEnabled })
|
||||
.from(settings)
|
||||
.where(eq(settings.userId, params.userId))
|
||||
.limit(1)
|
||||
if (rows.length > 0 && rows[0].enabled === false) return
|
||||
await sendTo(params.userEmail, params.userName)
|
||||
} else if (params.scope === 'organization' && params.organizationId) {
|
||||
const admins = await db
|
||||
.select({
|
||||
email: user.email,
|
||||
name: user.name,
|
||||
enabled: settings.billingUsageNotificationsEnabled,
|
||||
role: member.role,
|
||||
})
|
||||
.from(member)
|
||||
.innerJoin(user, eq(member.userId, user.id))
|
||||
.leftJoin(settings, eq(settings.userId, member.userId))
|
||||
.where(eq(member.organizationId, params.organizationId))
|
||||
|
||||
for (const a of admins) {
|
||||
const isAdmin = a.role === 'owner' || a.role === 'admin'
|
||||
if (!isAdmin) continue
|
||||
if (a.enabled === false) continue
|
||||
if (!a.email) continue
|
||||
await sendTo(a.email, a.name || undefined)
|
||||
}
|
||||
}
|
||||
}
// For 90% threshold email (free users only)
if (crosses90 && isFreeUser) {
  const upgradeLink = `${baseUrl}/workspace?billing=upgrade`
  const sendFreeTierEmail = async (email: string, name?: string) => {
    const prefs = await getEmailPreferences(email)
    if (prefs?.unsubscribeAll || prefs?.unsubscribeNotifications) return
    const html = await renderFreeTierUpgradeEmail({
      userName: name,
      percentUsed: Math.min(100, Math.round(params.percentAfter)),
      currentUsage: params.currentUsageAfter,
      limit: params.limit,
      upgradeLink,
    })

    await sendEmail({
      to: email,
      subject: getEmailSubject('free-tier-upgrade'),
      html,
      emailType: 'notifications',
    })

    logger.info('Free tier upgrade email sent', {
      email,
      percentUsed: Math.round(params.percentAfter),
      currentUsage: params.currentUsageAfter,
      limit: params.limit,
    })
  }

  // Free users are always individual scope (not organization)
  if (params.scope === 'user' && params.userId && params.userEmail) {
    const rows = await db
      .select({ enabled: settings.billingUsageNotificationsEnabled })
      .from(settings)
      .where(eq(settings.userId, params.userId))
      .limit(1)
    if (rows.length > 0 && rows[0].enabled === false) return
    await sendFreeTierEmail(params.userEmail, params.userName)
  }
}
} catch (error) {
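The notification code above fires only when this usage update actually crosses a threshold, not on every update past it. A minimal sketch of that gating, assuming hypothetical `percentBefore`/`percentAfter` values (the `crosses80`/`crosses90` names mirror the diff, but the function itself is illustrative):

```typescript
// Illustrative sketch: a threshold email should fire only when this update
// moves usage from below the threshold to at-or-above it.
function crossedThreshold(
  percentBefore: number,
  percentAfter: number,
  threshold: number
): boolean {
  return percentBefore < threshold && percentAfter >= threshold
}

// 75% -> 85% crosses the 80% line but not the 90% line.
const crosses80 = crossedThreshold(75, 85, 80)
const crosses90 = crossedThreshold(75, 85, 90)
```

An update from 85% to 95% would then trigger only the 90% email, which is why the early `if (!crosses80 && !crosses90) return` is safe.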
@@ -1,10 +1,15 @@
 import { render } from '@react-email/components'
 import { db } from '@sim/db'
-import { member, subscription as subscriptionTable, userStats } from '@sim/db/schema'
+import { member, subscription as subscriptionTable, user, userStats } from '@sim/db/schema'
 import { eq, inArray } from 'drizzle-orm'
 import type Stripe from 'stripe'
 import PaymentFailedEmail from '@/components/emails/billing/payment-failed-email'
 import { calculateSubscriptionOverage } from '@/lib/billing/core/billing'
+import { requireStripeClient } from '@/lib/billing/stripe-client'
 import { sendEmail } from '@/lib/email/mailer'
+import { quickValidateEmail } from '@/lib/email/validation'
 import { createLogger } from '@/lib/logs/console/logger'
 import { getBaseUrl } from '@/lib/urls/utils'

 const logger = createLogger('StripeInvoiceWebhooks')

@@ -19,6 +24,199 @@ function parseDecimal(value: string | number | null | undefined): number {
  return Number.parseFloat(value.toString())
}
/**
 * Create a billing portal URL for a Stripe customer
 */
async function createBillingPortalUrl(stripeCustomerId: string): Promise<string> {
  try {
    const stripe = requireStripeClient()
    const baseUrl = getBaseUrl()
    const portal = await stripe.billingPortal.sessions.create({
      customer: stripeCustomerId,
      return_url: `${baseUrl}/workspace?billing=updated`,
    })
    return portal.url
  } catch (error) {
    logger.error('Failed to create billing portal URL', { error, stripeCustomerId })
    // Fallback to generic billing page
    return `${getBaseUrl()}/workspace?tab=subscription`
  }
}

/**
 * Get payment method details from Stripe invoice
 */
async function getPaymentMethodDetails(
  invoice: Stripe.Invoice
): Promise<{ lastFourDigits?: string; failureReason?: string }> {
  let lastFourDigits: string | undefined
  let failureReason: string | undefined

  // Try to get last 4 digits from payment method
  try {
    const stripe = requireStripeClient()

    // Try to get from default payment method
    if (invoice.default_payment_method && typeof invoice.default_payment_method === 'string') {
      const paymentMethod = await stripe.paymentMethods.retrieve(invoice.default_payment_method)
      if (paymentMethod.card?.last4) {
        lastFourDigits = paymentMethod.card.last4
      }
    }

    // If no default payment method, try getting from customer's default
    if (!lastFourDigits && invoice.customer && typeof invoice.customer === 'string') {
      const customer = await stripe.customers.retrieve(invoice.customer)
      if (customer && !('deleted' in customer)) {
        const defaultPm = customer.invoice_settings?.default_payment_method
        if (defaultPm && typeof defaultPm === 'string') {
          const paymentMethod = await stripe.paymentMethods.retrieve(defaultPm)
          if (paymentMethod.card?.last4) {
            lastFourDigits = paymentMethod.card.last4
          }
        }
      }
    }
  } catch (error) {
    logger.warn('Failed to retrieve payment method details', { error, invoiceId: invoice.id })
  }

  // Get failure message - check multiple sources
  if (invoice.last_finalization_error?.message) {
    failureReason = invoice.last_finalization_error.message
  }

  // If not found, check the payments array (requires expand: ['payments'])
  if (!failureReason && invoice.payments?.data) {
    const defaultPayment = invoice.payments.data.find((p) => p.is_default)
    const payment = defaultPayment || invoice.payments.data[0]

    if (payment?.payment) {
      try {
        const stripe = requireStripeClient()

        if (payment.payment.type === 'payment_intent' && payment.payment.payment_intent) {
          const piId =
            typeof payment.payment.payment_intent === 'string'
              ? payment.payment.payment_intent
              : payment.payment.payment_intent.id

          const paymentIntent = await stripe.paymentIntents.retrieve(piId)
          if (paymentIntent.last_payment_error?.message) {
            failureReason = paymentIntent.last_payment_error.message
          }
        } else if (payment.payment.type === 'charge' && payment.payment.charge) {
          const chargeId =
            typeof payment.payment.charge === 'string'
              ? payment.payment.charge
              : payment.payment.charge.id

          const charge = await stripe.charges.retrieve(chargeId)
          if (charge.failure_message) {
            failureReason = charge.failure_message
          }
        }
      } catch (error) {
        logger.warn('Failed to retrieve payment details for failure reason', {
          error,
          invoiceId: invoice.id,
        })
      }
    }
  }

  return { lastFourDigits, failureReason }
}

/**
 * Send payment failure notification emails to affected users
 */
async function sendPaymentFailureEmails(
  sub: { plan: string | null; referenceId: string },
  invoice: Stripe.Invoice,
  stripeCustomerId: string
): Promise<void> {
  try {
    const billingPortalUrl = await createBillingPortalUrl(stripeCustomerId)
    const amountDue = invoice.amount_due / 100 // Convert cents to dollars
    const { lastFourDigits, failureReason } = await getPaymentMethodDetails(invoice)

    // Get users to notify
    let usersToNotify: Array<{ email: string; name: string | null }> = []

    if (sub.plan === 'team' || sub.plan === 'enterprise') {
      // For team/enterprise, notify all owners and admins
      const members = await db
        .select({
          userId: member.userId,
          role: member.role,
        })
        .from(member)
        .where(eq(member.organizationId, sub.referenceId))

      // Get owner/admin user details
      const ownerAdminIds = members
        .filter((m) => m.role === 'owner' || m.role === 'admin')
        .map((m) => m.userId)

      if (ownerAdminIds.length > 0) {
        const users = await db
          .select({ email: user.email, name: user.name })
          .from(user)
          .where(inArray(user.id, ownerAdminIds))

        usersToNotify = users.filter((u) => u.email && quickValidateEmail(u.email).isValid)
      }
    } else {
      // For individual plans, notify the user
      const users = await db
        .select({ email: user.email, name: user.name })
        .from(user)
        .where(eq(user.id, sub.referenceId))
        .limit(1)

      if (users.length > 0) {
        usersToNotify = users.filter((u) => u.email && quickValidateEmail(u.email).isValid)
      }
    }

    // Send emails to all affected users
    for (const userToNotify of usersToNotify) {
      try {
        const emailHtml = await render(
          PaymentFailedEmail({
            userName: userToNotify.name || undefined,
            amountDue,
            lastFourDigits,
            billingPortalUrl,
            failureReason,
            sentDate: new Date(),
          })
        )

        await sendEmail({
          to: userToNotify.email,
          subject: 'Payment Failed - Action Required',
          html: emailHtml,
          emailType: 'transactional',
        })

        logger.info('Payment failure email sent', {
          email: userToNotify.email,
          invoiceId: invoice.id,
        })
      } catch (emailError) {
        logger.error('Failed to send payment failure email', {
          error: emailError,
          email: userToNotify.email,
        })
      }
    }
  } catch (error) {
    logger.error('Failed to send payment failure emails', { error })
  }
}
/**
 * Get total billed overage for a subscription, handling team vs individual plans
 * For team plans: sums billedOverageThisPeriod across all members
@@ -237,10 +435,19 @@ export async function handleInvoicePaymentFailed(event: Stripe.Event) {
    return
  }

-  const customerId = invoice.customer as string
+  // Extract and validate customer ID
+  const customerId = invoice.customer
+  if (!customerId || typeof customerId !== 'string') {
+    logger.error('Invalid customer ID on invoice', {
+      invoiceId: invoice.id,
+      customer: invoice.customer,
+    })
+    return
+  }

  const failedAmount = invoice.amount_due / 100 // Convert from cents to dollars
  const billingPeriod = invoice.metadata?.billingPeriod || 'unknown'
-  const attemptCount = invoice.attempt_count || 1
+  const attemptCount = invoice.attempt_count ?? 1

  logger.warn('Invoice payment failed', {
    invoiceId: invoice.id,
@@ -300,6 +507,23 @@ export async function handleInvoicePaymentFailed(event: Stripe.Event) {
        isOverageInvoice,
      })
    }

+    // Send payment failure notification emails
+    // Only send on FIRST failure (attempt_count === 1), not on Stripe's automatic retries
+    // This prevents spamming users with duplicate emails every 3-5-7 days
+    if (attemptCount === 1) {
+      await sendPaymentFailureEmails(sub, invoice, customerId)
+      logger.info('Payment failure email sent on first attempt', {
+        invoiceId: invoice.id,
+        customerId,
+      })
+    } else {
+      logger.info('Skipping payment failure email on retry attempt', {
+        invoiceId: invoice.id,
+        attemptCount,
+        customerId,
+      })
+    }
  } else {
    logger.warn('Subscription not found in database for failed payment', {
      stripeSubscriptionId,
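The first-failure gate above is easy to isolate. A sketch of the same rule as a pure predicate (the helper name is hypothetical, not from the source):

```typescript
// Stripe retries a failed invoice automatically over the following days,
// so only the very first attempt should produce a notification email.
// Note `??` rather than `||`, matching the diff's attempt_count fix.
function shouldNotifyPaymentFailure(attemptCount: number | null | undefined): boolean {
  return (attemptCount ?? 1) === 1
}
```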
@@ -85,9 +85,10 @@ export class BlockPathCalculator {
    const pathNodes = BlockPathCalculator.findAllPathNodes(workflow.connections, block.id)
    pathNodes.forEach((nodeId) => accessibleBlocks.add(nodeId))

-    // Always allow referencing the starter block (special case)
+    // Only add starter block if it's actually upstream (already in pathNodes)
+    // Don't add it just because it exists on the canvas
    const starterBlock = workflow.blocks.find((b) => b.metadata?.id === 'starter')
-    if (starterBlock && starterBlock.id !== block.id) {
+    if (starterBlock && starterBlock.id !== block.id && pathNodes.includes(starterBlock.id)) {
      accessibleBlocks.add(starterBlock.id)
    }
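The fix above stops the starter block from being referenceable merely because it exists on the canvas. A reduced sketch of the rule, with a hypothetical path-node list standing in for `findAllPathNodes`:

```typescript
// Reduced model of the accessibility rule: a block may reference the
// starter only if the starter is genuinely upstream of it, i.e. appears
// among the block's computed path nodes.
function canReferenceStarter(
  blockId: string,
  starterId: string,
  pathNodes: string[]
): boolean {
  return starterId !== blockId && pathNodes.includes(starterId)
}
```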
@@ -1,4 +1,5 @@
 import { Loader2, MinusCircle, Play, XCircle } from 'lucide-react'
+import { v4 as uuidv4 } from 'uuid'
 import {
   BaseClientTool,
   type BaseClientToolMetadata,
@@ -12,7 +13,7 @@ import { useWorkflowRegistry } from '@/stores/workflows/registry/store'
interface RunWorkflowArgs {
  workflowId?: string
  description?: string
-  workflow_input?: string
+  workflow_input?: Record<string, any>
}

export class RunWorkflowClientTool extends BaseClientTool {
@@ -76,10 +77,11 @@ export class RunWorkflowClientTool extends BaseClientTool {
    }
    logger.debug('Using active workflow', { activeWorkflowId })

-    const workflowInput = params.workflow_input ? { input: params.workflow_input } : undefined
-    if (workflowInput?.input) {
+    const workflowInput = params.workflow_input || undefined
+    if (workflowInput) {
      logger.debug('Workflow input provided', {
-        inputPreview: String(workflowInput.input).slice(0, 120),
+        inputFields: Object.keys(workflowInput),
+        inputPreview: JSON.stringify(workflowInput).slice(0, 120),
      })
    }

@@ -87,15 +89,17 @@ export class RunWorkflowClientTool extends BaseClientTool {
    logger.debug('Set isExecuting(true) and switching state to executing')
    this.setState(ClientToolCallState.executing)

+    const executionId = uuidv4()
    const executionStartTime = new Date().toISOString()
    logger.debug('Starting workflow execution', {
      executionStartTime,
-      executionId: this.toolCallId,
+      executionId,
+      toolCallId: this.toolCallId,
    })

    const result = await executeWorkflowWithFullLogging({
      workflowInput,
-      executionId: this.toolCallId,
+      executionId,
    })

    setIsExecuting(false)
@@ -28,29 +28,23 @@ export class PdfParser implements FileParser {
    try {
      logger.info('Starting to parse buffer, size:', dataBuffer.length)

-      const { PDFParse } = await import('pdf-parse')
+      const { extractText, getDocumentProxy } = await import('unpdf')

-      const parser = new PDFParse({ data: dataBuffer })
-      const textResult = await parser.getText()
-      const infoResult = await parser.getInfo()
-      await parser.destroy()
+      const uint8Array = new Uint8Array(dataBuffer)
+      const pdf = await getDocumentProxy(uint8Array)
+      const { totalPages, text } = await extractText(pdf, { mergePages: true })

-      logger.info(
-        'PDF parsed successfully, pages:',
-        textResult.total,
-        'text length:',
-        textResult.text.length
-      )
+      logger.info('PDF parsed successfully, pages:', totalPages, 'text length:', text.length)

-      const cleanContent = textResult.text.replace(/\u0000/g, '')
+      const cleanContent = text.replace(/\u0000/g, '')

      return {
        content: cleanContent,
        metadata: {
-          pageCount: textResult.total,
-          info: infoResult.info,
-          version: infoResult.metadata?.get('pdf:PDFVersion'),
-          source: 'pdf-parse',
+          pageCount: totalPages,
+          source: 'unpdf',
        },
      }
    } catch (error) {
@@ -76,7 +76,7 @@ export function useSubscriptionUpgrade() {
    try {
      await client.organization.setActive({ organizationId: result.organizationId })

-      logger.info('Set organization as active and updated referenceId', {
+      logger.info('Set organization as active', {
        organizationId: result.organizationId,
        oldReferenceId: userId,
        newReferenceId: referenceId,
@@ -90,6 +90,11 @@ export function useSubscriptionUpgrade() {
    }

+    if (currentSubscriptionId) {
+      logger.info('Transferring personal subscription to organization', {
+        subscriptionId: currentSubscriptionId,
+        organizationId: referenceId,
+      })

      const transferResponse = await fetch(
        `/api/users/me/subscription/${currentSubscriptionId}/transfer`,
        {
@@ -101,8 +106,18 @@ export function useSubscriptionUpgrade() {

      if (!transferResponse.ok) {
        const text = await transferResponse.text()
+        logger.error('Failed to transfer subscription to organization', {
+          subscriptionId: currentSubscriptionId,
+          organizationId: referenceId,
+          error: text,
+        })
        throw new Error(text || 'Failed to transfer subscription to organization')
      }

+      logger.info('Successfully transferred subscription to organization', {
+        subscriptionId: currentSubscriptionId,
+        organizationId: referenceId,
+      })
    }
  } catch (error) {
    logger.error('Failed to create organization for team plan', error)
@@ -257,12 +257,7 @@ export async function executeWorkflowCore(
  const startBlock = TriggerUtils.findStartBlock(mergedStates, executionKind, false)

  if (!startBlock) {
-    const errorMsg =
-      executionKind === 'api'
-        ? 'No API trigger block found. Add an API Trigger block to this workflow.'
-        : executionKind === 'chat'
-          ? 'No chat trigger block found. Add a Chat Trigger block to this workflow.'
-          : 'No trigger block found for this workflow.'
+    const errorMsg = 'No start block found. Add a start block to this workflow.'
    logger.error(`[${requestId}] ${errorMsg}`)
    throw new Error(errorMsg)
  }
@@ -1,4 +1,5 @@
 import { getBlock } from '@/blocks'
+import type { BlockState } from '@/stores/workflows/workflow/types'

/**
 * Unified trigger type definitions
@@ -62,14 +63,9 @@ const START_CONFLICT_TYPES: TriggerType[] = [
  TRIGGER_TYPES.STARTER, // Legacy starter also conflicts with start_trigger
]

-type BlockWithType = { type: string; subBlocks?: Record<string, unknown> | undefined }
+type MinimalBlock = { type: string; subBlocks?: Record<string, unknown> | undefined }

-type BlockWithMetadata = BlockWithType & {
-  category?: string
-  triggers?: { enabled?: boolean }
-}

-export interface StartBlockCandidate<T extends BlockWithType> {
+export interface StartBlockCandidate<T extends MinimalBlock> {
  blockId: string
  block: T
  path: StartBlockPath
@@ -108,20 +104,18 @@ export function classifyStartBlockType(
  }
}

-export function classifyStartBlock<T extends BlockWithType>(block: T): StartBlockPath | null {
-  const blockWithMetadata = block as BlockWithMetadata
+export function classifyStartBlock<T extends MinimalBlock>(block: T): StartBlockPath | null {
+  const blockState = block as Partial<BlockState>

-  // Try to get metadata from the block itself first
-  let category = blockWithMetadata.category
-  let triggerModeEnabled = Boolean(blockWithMetadata.triggers?.enabled)
+  let category: string | undefined
+  const triggerModeEnabled = Boolean(blockState.triggerMode)

-  // If not available on the block, fetch from registry
-  if (!category || triggerModeEnabled === undefined) {
-    const blockConfig = getBlock(block.type)
-    if (blockConfig) {
-      category = category || blockConfig.category
-      triggerModeEnabled = triggerModeEnabled || Boolean(blockConfig.triggers?.enabled)
-    }
-  }
+  const blockConfig = getBlock(block.type)
+  if (blockConfig) {
+    category = blockConfig.category
+  }

  return classifyStartBlockType(block.type, { category, triggerModeEnabled })
@@ -131,7 +125,7 @@ export function isLegacyStartPath(path: StartBlockPath): boolean {
  return path !== StartBlockPath.UNIFIED
}

-function toEntries<T extends BlockWithType>(blocks: Record<string, T> | T[]): Array<[string, T]> {
+function toEntries<T extends MinimalBlock>(blocks: Record<string, T> | T[]): Array<[string, T]> {
  if (Array.isArray(blocks)) {
    return blocks.map((block, index) => {
      const potentialId = (block as { id?: unknown }).id
@@ -169,7 +163,7 @@ function supportsExecution(path: StartBlockPath, execution: StartExecutionKind):
  )
}

-export function resolveStartCandidates<T extends BlockWithType>(
+export function resolveStartCandidates<T extends MinimalBlock>(
  blocks: Record<string, T> | T[],
  options: ResolveStartOptions
): StartBlockCandidate<T>[] {
@@ -183,6 +177,11 @@ export function resolveStartCandidates<T extends BlockWithType>(
  const candidates: StartBlockCandidate<T>[] = []

  for (const [blockId, block] of entries) {
+    // Skip disabled blocks - they cannot be used as triggers
+    if ('enabled' in block && block.enabled === false) {
+      continue
+    }

    const path = classifyStartBlock(block)
    if (!path) continue
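The loop above now drops disabled blocks before classifying start candidates. A self-contained sketch of that filter, with a simplified block shape standing in for the real `MinimalBlock`:

```typescript
// Simplified block shape for illustration; the real type carries more fields.
type SketchBlock = { type: string; enabled?: boolean }

// Disabled blocks can never act as triggers, so drop them before
// attempting to classify each entry as a start candidate.
function eligibleStartBlocks(blocks: Record<string, SketchBlock>): string[] {
  return Object.entries(blocks)
    .filter(([, block]) => !('enabled' in block && block.enabled === false))
    .map(([id]) => id)
}
```

This mirrors the `#2012` fix: a disabled trigger no longer reaches the DAG at all.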
@@ -1,5 +1,5 @@
 /**
- * Utility functions for generating names for all entities (workspaces, folders, workflows)
+ * Utility functions for generating names for workspaces and folders
 */

 import type { Workspace } from '@/lib/organization/types'
@@ -17,148 +17,6 @@ interface FoldersApiResponse {
  folders: WorkflowFolder[]
}

(removed) const ADJECTIVES = [
  'Blazing', 'Crystal', 'Golden', 'Silver', 'Mystic', 'Cosmic', 'Electric', 'Frozen',
  'Burning', 'Shining', 'Dancing', 'Flying', 'Roaring', 'Whispering', 'Glowing', 'Sparkling',
  'Thunder', 'Lightning', 'Storm', 'Ocean', 'Mountain', 'Forest', 'Desert', 'Arctic',
  'Tropical', 'Midnight', 'Dawn', 'Sunset', 'Rainbow', 'Diamond', 'Ruby', 'Emerald',
  'Sapphire', 'Pearl', 'Jade', 'Amber', 'Coral', 'Ivory', 'Obsidian', 'Marble',
  'Velvet', 'Silk', 'Satin', 'Linen', 'Cotton', 'Wool', 'Cashmere', 'Denim',
  'Neon', 'Pastel', 'Vibrant', 'Muted', 'Bold', 'Subtle', 'Bright', 'Dark',
]

(removed) const NOUNS = [
  'Phoenix', 'Dragon', 'Eagle', 'Wolf', 'Lion', 'Tiger', 'Panther', 'Falcon',
  'Hawk', 'Raven', 'Swan', 'Dove', 'Butterfly', 'Firefly', 'Dragonfly', 'Hummingbird',
  'Galaxy', 'Nebula', 'Comet', 'Meteor', 'Star', 'Moon', 'Sun', 'Planet',
  'Asteroid', 'Constellation', 'Aurora', 'Eclipse', 'Solstice', 'Equinox', 'Horizon', 'Zenith',
  'Castle', 'Tower', 'Bridge', 'Garden', 'Fountain', 'Palace', 'Temple', 'Cathedral',
  'Lighthouse', 'Windmill', 'Waterfall', 'Canyon', 'Valley', 'Peak', 'Ridge', 'Cliff',
  'Ocean', 'River', 'Lake', 'Stream', 'Pond', 'Bay', 'Cove', 'Harbor',
  'Island', 'Peninsula', 'Archipelago', 'Atoll', 'Reef', 'Lagoon', 'Fjord', 'Delta',
  'Cake', 'Cookie', 'Muffin', 'Cupcake', 'Pie', 'Tart', 'Brownie', 'Donut',
  'Pancake', 'Waffle', 'Croissant', 'Bagel', 'Pretzel', 'Biscuit', 'Scone', 'Crumpet',
]

/**
 * Generates the next incremental name for entities following pattern: "{prefix} {number}"
 *
@@ -226,13 +84,3 @@ export async function generateSubfolderName(

  return generateIncrementalName(subfolders, 'Subfolder')
}

-/**
- * Generates a creative workflow name using random adjectives and nouns
- * @returns A creative workflow name like "blazing-phoenix" or "crystal-dragon"
- */
-export function generateCreativeWorkflowName(): string {
-  const adjective = ADJECTIVES[Math.floor(Math.random() * ADJECTIVES.length)]
-  const noun = NOUNS[Math.floor(Math.random() * NOUNS.length)]
-  return `${adjective.toLowerCase()}-${noun.toLowerCase()}`
-}
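`generateIncrementalName` is only referenced above, not shown. A minimal sketch of the "{prefix} {number}" pattern it documents, assuming entities expose their names as plain strings (the helper name here is hypothetical):

```typescript
// Sketch of the "{prefix} {number}" pattern: scan existing names for the
// highest number used with this prefix and return the next one, so
// ["Folder 1", "Folder 2"] yields "Folder 3".
function nextIncrementalName(names: string[], prefix: string): string {
  const pattern = new RegExp(`^${prefix} (\\d+)$`)
  let max = 0
  for (const name of names) {
    const match = pattern.exec(name)
    if (match) max = Math.max(max, Number.parseInt(match[1], 10))
  }
  return `${prefix} ${max + 1}`
}
```

Note the sketch assumes the prefix contains no regex metacharacters; a production version would escape it.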
@@ -75,7 +75,7 @@ const nextConfig: NextConfig = {
  turbopack: {
    resolveExtensions: ['.tsx', '.ts', '.jsx', '.js', '.mjs', '.json'],
  },
-  serverExternalPackages: ['pdf-parse'],
+  serverExternalPackages: ['unpdf'],
  experimental: {
    optimizeCss: true,
    turbopackSourceMaps: false,
@@ -61,7 +61,6 @@
    "@radix-ui/react-tooltip": "1.2.8",
    "@react-email/components": "^0.0.34",
    "@trigger.dev/sdk": "4.0.4",
-   "@types/pdf-parse": "1.1.5",
    "@types/three": "0.177.0",
    "better-auth": "1.3.12",
    "browser-image-compression": "^2.0.2",
@@ -76,6 +75,7 @@
    "entities": "6.0.1",
    "framer-motion": "^12.5.0",
    "fuse.js": "7.1.0",
+   "gray-matter": "^4.0.3",
    "groq-sdk": "^0.15.0",
    "html-to-text": "^9.0.5",
    "input-otp": "^1.4.2",
@@ -96,8 +96,6 @@
    "officeparser": "^5.2.0",
    "openai": "^4.91.1",
    "papaparse": "5.5.3",
-   "pdf-parse": "2.4.5",
-   "gray-matter": "^4.0.3",
    "posthog-js": "1.268.9",
    "posthog-node": "5.9.2",
    "prismjs": "^1.30.0",
@@ -109,9 +107,9 @@
    "react-markdown": "^10.1.0",
    "react-simple-code-editor": "^0.14.1",
    "reactflow": "^11.11.4",
-   "remark-gfm": "4.0.1",
    "rehype-autolink-headings": "^7.1.0",
    "rehype-slug": "^6.0.0",
+   "remark-gfm": "4.0.1",
    "resend": "^4.1.2",
    "sharp": "0.34.3",
    "socket.io": "^4.8.1",
@@ -119,6 +117,7 @@
    "tailwind-merge": "^2.6.0",
    "tailwindcss-animate": "^1.0.7",
    "three": "0.177.0",
+   "unpdf": "1.4.0",
    "uuid": "^11.1.0",
    "xlsx": "0.18.5",
    "zod": "^3.24.2"
@@ -8,8 +8,13 @@ import type {
  ProviderResponse,
  TimeSegment,
} from '@/providers/types'
-import { prepareToolExecution } from '@/providers/utils'
+import {
+  prepareToolExecution,
+  prepareToolsWithUsageControl,
+  trackForcedToolUsage,
+} from '@/providers/utils'
import { executeTool } from '@/tools'
import type { CerebrasResponse } from './types'

const logger = createLogger('CerebrasProvider')

@@ -116,29 +121,29 @@ export const cerebrasProvider: ProviderConfig = {
      }
    }

-    // Add tools if provided
-    if (tools?.length) {
-      // Filter out any tools with usageControl='none', treat 'force' as 'auto' since Cerebras only supports 'auto'
-      const filteredTools = tools.filter((tool) => {
-        const toolId = tool.function?.name
-        const toolConfig = request.tools?.find((t) => t.id === toolId)
-        // Only filter out tools with usageControl='none'
-        return toolConfig?.usageControl !== 'none'
-      })
+    // Handle tools and tool usage control
+    // Cerebras supports full OpenAI-compatible tool_choice including forcing specific tools
+    let originalToolChoice: any
+    let forcedTools: string[] = []
+    let hasFilteredTools = false

-      if (filteredTools?.length) {
-        payload.tools = filteredTools
-        // Always use 'auto' for Cerebras, explicitly converting any 'force' usageControl to 'auto'
-        payload.tool_choice = 'auto'
+    if (tools?.length) {
+      const preparedTools = prepareToolsWithUsageControl(tools, request.tools, logger, 'openai')

+      if (preparedTools.tools?.length) {
+        payload.tools = preparedTools.tools
+        payload.tool_choice = preparedTools.toolChoice || 'auto'
+        originalToolChoice = preparedTools.toolChoice
+        forcedTools = preparedTools.forcedTools || []
+        hasFilteredTools = preparedTools.hasFilteredTools

        logger.info('Cerebras request configuration:', {
-          toolCount: filteredTools.length,
-          toolChoice: 'auto', // Cerebras always uses auto, 'force' is treated as 'auto'
+          toolCount: preparedTools.tools.length,
+          toolChoice: payload.tool_choice,
+          forcedToolsCount: forcedTools.length,
+          hasFilteredTools,
          model: request.model,
        })
-      } else if (tools.length > 0 && filteredTools.length === 0) {
-        // Handle case where all tools are filtered out
-        logger.info(`All tools have usageControl='none', removing tools from request`)
      }
    }

@@ -357,6 +362,29 @@ export const cerebrasProvider: ProviderConfig = {
        const thisToolsTime = Date.now() - toolsStartTime
        toolsTime += thisToolsTime

+        // Check if we used any forced tools and update tool_choice for the next iteration
+        let usedForcedTools: string[] = []
+        if (typeof originalToolChoice === 'object' && forcedTools.length > 0) {
+          const toolTracking = trackForcedToolUsage(
+            currentResponse.choices[0]?.message?.tool_calls,
+            originalToolChoice,
+            logger,
+            'openai',
+            forcedTools,
+            usedForcedTools
+          )
+          usedForcedTools = toolTracking.usedForcedTools
+          const nextToolChoice = toolTracking.nextToolChoice
+
+          // Update tool_choice for next iteration if we're still forcing tools
+          if (nextToolChoice && typeof nextToolChoice === 'object') {
+            payload.tool_choice = nextToolChoice
+          } else if (nextToolChoice === 'auto' || !nextToolChoice) {
+            // All forced tools have been used, switch to auto
+            payload.tool_choice = 'auto'
+          }
+        }

        // After processing tool calls, get a final response
        if (processedAnyToolCall || hasRepeatedToolCalls) {
          // Time the next model call
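Both providers now delegate to the shared `prepareToolsWithUsageControl` helper instead of local ad-hoc filtering. A simplified sketch of the three `usageControl` modes it distinguishes, with hypothetical types standing in for the real ones:

```typescript
type UsageControl = 'auto' | 'force' | 'none'
type ToolDef = { name: string; usageControl: UsageControl }

// Sketch only: 'none' tools are removed entirely; a 'force' tool produces
// an explicit OpenAI-style tool_choice so the model must call it;
// everything else falls back to 'auto'.
function prepareTools(tools: ToolDef[]): {
  tools: ToolDef[]
  toolChoice: 'auto' | { type: 'function'; function: { name: string } }
} {
  const kept = tools.filter((t) => t.usageControl !== 'none')
  const forced = kept.find((t) => t.usageControl === 'force')
  return {
    tools: kept,
    toolChoice: forced ? { type: 'function', function: { name: forced.name } } : 'auto',
  }
}
```

The real helper also tracks which forced tools have already fired so `tool_choice` can relax back to `'auto'` on later iterations, as the `trackForcedToolUsage` block above shows.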
@@ -1,4 +1,4 @@
-interface CerebrasMessage {
+export interface CerebrasMessage {
  role: string
  content: string | null
  tool_calls?: Array<{
@@ -12,19 +12,19 @@ interface CerebrasMessage {
  tool_call_id?: string
}

-interface CerebrasChoice {
+export interface CerebrasChoice {
  message: CerebrasMessage
  index: number
  finish_reason: string
}

-interface CerebrasUsage {
+export interface CerebrasUsage {
  prompt_tokens: number
  completion_tokens: number
  total_tokens: number
}

-interface CerebrasResponse {
+export interface CerebrasResponse {
  id: string
  object: string
  created: number
@@ -8,7 +8,11 @@ import type {
   ProviderResponse,
   TimeSegment,
 } from '@/providers/types'
-import { prepareToolExecution } from '@/providers/utils'
+import {
+  prepareToolExecution,
+  prepareToolsWithUsageControl,
+  trackForcedToolUsage,
+} from '@/providers/utils'
 import { executeTool } from '@/tools'

 const logger = createLogger('GroqProvider')
@@ -110,23 +114,26 @@ export const groqProvider: ProviderConfig = {
     }

     // Handle tools and tool usage control
-    if (tools?.length) {
-      // Filter out any tools with usageControl='none', but ignore 'force' since Groq doesn't support it
-      const filteredTools = tools.filter((tool) => {
-        const toolId = tool.function?.name
-        const toolConfig = request.tools?.find((t) => t.id === toolId)
-        // Only filter out 'none', treat 'force' as 'auto'
-        return toolConfig?.usageControl !== 'none'
-      })
+    // Groq supports full OpenAI-compatible tool_choice including forcing specific tools
+    let originalToolChoice: any
+    let forcedTools: string[] = []
+    let hasFilteredTools = false

-      if (filteredTools?.length) {
-        payload.tools = filteredTools
-        // Always use 'auto' for Groq, regardless of the tool_choice setting
-        payload.tool_choice = 'auto'
+    if (tools?.length) {
+      const preparedTools = prepareToolsWithUsageControl(tools, request.tools, logger, 'openai')
+
+      if (preparedTools.tools?.length) {
+        payload.tools = preparedTools.tools
+        payload.tool_choice = preparedTools.toolChoice || 'auto'
+        originalToolChoice = preparedTools.toolChoice
+        forcedTools = preparedTools.forcedTools || []
+        hasFilteredTools = preparedTools.hasFilteredTools

         logger.info('Groq request configuration:', {
-          toolCount: filteredTools.length,
-          toolChoice: 'auto', // Groq always uses auto
+          toolCount: preparedTools.tools.length,
+          toolChoice: payload.tool_choice,
+          forcedToolsCount: forcedTools.length,
+          hasFilteredTools,
           model: request.model || 'groq/meta-llama/llama-4-scout-17b-16e-instruct',
         })
       }
@@ -328,6 +335,29 @@ export const groqProvider: ProviderConfig = {
           const thisToolsTime = Date.now() - toolsStartTime
           toolsTime += thisToolsTime

+          // Check if we used any forced tools and update tool_choice for the next iteration
+          let usedForcedTools: string[] = []
+          if (typeof originalToolChoice === 'object' && forcedTools.length > 0) {
+            const toolTracking = trackForcedToolUsage(
+              currentResponse.choices[0]?.message?.tool_calls,
+              originalToolChoice,
+              logger,
+              'openai',
+              forcedTools,
+              usedForcedTools
+            )
+            usedForcedTools = toolTracking.usedForcedTools
+            const nextToolChoice = toolTracking.nextToolChoice
+
+            // Update tool_choice for next iteration if we're still forcing tools
+            if (nextToolChoice && typeof nextToolChoice === 'object') {
+              payload.tool_choice = nextToolChoice
+            } else if (nextToolChoice === 'auto' || !nextToolChoice) {
+              // All forced tools have been used, switch to auto
+              payload.tool_choice = 'auto'
+            }
+          }
+
           // Make the next request with updated messages
           const nextPayload = {
             ...payload,
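The Cerebras and Groq hunks above add the same forced-tool bookkeeping: keep pinning `tool_choice` to an as-yet-unused forced tool on each iteration, then fall back to `'auto'` once every forced tool has been called. A minimal standalone sketch of that cycle — the helper name and return shape here are illustrative only; the real logic lives in `trackForcedToolUsage` in `@/providers/utils` and also handles provider-specific `tool_choice` formats:

```typescript
// Simplified model of the forced-tool cycle: given the tools the model just
// called and the list of tools that must be forced, compute the tool_choice
// for the next request. `nextToolChoice` is a hypothetical helper.
type ToolChoice = 'auto' | { type: 'function'; function: { name: string } }

function nextToolChoice(
  forcedTools: string[],
  usedForcedTools: string[],
  calledNow: string[]
): ToolChoice {
  // Record any forced tools the model just invoked.
  const used = [...usedForcedTools, ...calledNow.filter((t) => forcedTools.includes(t))]
  const remaining = forcedTools.filter((t) => !used.includes(t))
  // Still tools left to force: pin tool_choice to the next one.
  if (remaining.length > 0) {
    return { type: 'function', function: { name: remaining[0] } }
  }
  // All forced tools have been used, switch to auto.
  return 'auto'
}
```

The same shape explains why the provider code checks `typeof nextToolChoice === 'object'`: an object means "still forcing a specific tool", a string (or absence) means the loop can relax to `'auto'`.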
@@ -101,6 +101,74 @@ export const PROVIDER_DEFINITIONS: Record<string, ProviderDefinition> = {
         temperature: { min: 0, max: 2 },
       },
     },
+    {
+      id: 'gpt-5.1',
+      pricing: {
+        input: 1.25,
+        cachedInput: 0.125,
+        output: 10.0,
+        updatedAt: '2025-11-14',
+      },
+      capabilities: {
+        reasoningEffort: {
+          values: ['none', 'low', 'medium', 'high'],
+        },
+        verbosity: {
+          values: ['low', 'medium', 'high'],
+        },
+      },
+    },
+    {
+      id: 'gpt-5.1-mini',
+      pricing: {
+        input: 0.25,
+        cachedInput: 0.025,
+        output: 2.0,
+        updatedAt: '2025-11-14',
+      },
+      capabilities: {
+        reasoningEffort: {
+          values: ['none', 'low', 'medium', 'high'],
+        },
+        verbosity: {
+          values: ['low', 'medium', 'high'],
+        },
+      },
+    },
+    {
+      id: 'gpt-5.1-nano',
+      pricing: {
+        input: 0.05,
+        cachedInput: 0.005,
+        output: 0.4,
+        updatedAt: '2025-11-14',
+      },
+      capabilities: {
+        reasoningEffort: {
+          values: ['none', 'low', 'medium', 'high'],
+        },
+        verbosity: {
+          values: ['low', 'medium', 'high'],
+        },
+      },
+    },
+    {
+      id: 'gpt-5.1-codex',
+      pricing: {
+        input: 1.25,
+        cachedInput: 0.125,
+        output: 10.0,
+        updatedAt: '2025-11-14',
+      },
+      capabilities: {
+        reasoningEffort: {
+          values: ['none', 'medium', 'high'],
+        },
+        verbosity: {
+          values: ['low', 'medium', 'high'],
+        },
+      },
+    },
     {
       id: 'gpt-5',
       pricing: {
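Each new model entry carries `input` / `cachedInput` / `output` pricing. Assuming these rates are USD per million tokens — an assumption about the convention, since the diff does not state the unit — a per-request cost estimate from one of these entries can be sketched as:

```typescript
// Estimate request cost from a pricing entry. ASSUMPTION: rates are USD per
// million tokens; the diff itself does not state the unit.
interface Pricing {
  input: number
  cachedInput: number
  output: number
}

function estimateCost(
  pricing: Pricing,
  tokens: { input: number; cachedInput: number; output: number }
): number {
  // Convert a per-million-token rate into a per-token rate.
  const perToken = (rate: number) => rate / 1_000_000
  return (
    tokens.input * perToken(pricing.input) +
    tokens.cachedInput * perToken(pricing.cachedInput) +
    tokens.output * perToken(pricing.output)
  )
}
```

For example, with the `gpt-5.1` entry (1.25 / 0.125 / 10.0), a call using 1M uncached input tokens and 100k output tokens would come to about $2.25 under that assumption.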
@@ -253,6 +321,74 @@ export const PROVIDER_DEFINITIONS: Record<string, ProviderDefinition> = {
         temperature: { min: 0, max: 2 },
       },
     },
+    {
+      id: 'azure/gpt-5.1',
+      pricing: {
+        input: 1.25,
+        cachedInput: 0.125,
+        output: 10.0,
+        updatedAt: '2025-11-14',
+      },
+      capabilities: {
+        reasoningEffort: {
+          values: ['none', 'low', 'medium', 'high'],
+        },
+        verbosity: {
+          values: ['low', 'medium', 'high'],
+        },
+      },
+    },
+    {
+      id: 'azure/gpt-5.1-mini',
+      pricing: {
+        input: 0.25,
+        cachedInput: 0.025,
+        output: 2.0,
+        updatedAt: '2025-11-14',
+      },
+      capabilities: {
+        reasoningEffort: {
+          values: ['none', 'low', 'medium', 'high'],
+        },
+        verbosity: {
+          values: ['low', 'medium', 'high'],
+        },
+      },
+    },
+    {
+      id: 'azure/gpt-5.1-nano',
+      pricing: {
+        input: 0.05,
+        cachedInput: 0.005,
+        output: 0.4,
+        updatedAt: '2025-11-14',
+      },
+      capabilities: {
+        reasoningEffort: {
+          values: ['none', 'low', 'medium', 'high'],
+        },
+        verbosity: {
+          values: ['low', 'medium', 'high'],
+        },
+      },
+    },
+    {
+      id: 'azure/gpt-5.1-codex',
+      pricing: {
+        input: 1.25,
+        cachedInput: 0.125,
+        output: 10.0,
+        updatedAt: '2025-11-14',
+      },
+      capabilities: {
+        reasoningEffort: {
+          values: ['none', 'medium', 'high'],
+        },
+        verbosity: {
+          values: ['low', 'medium', 'high'],
+        },
+      },
+    },
     {
       id: 'azure/gpt-5',
       pricing: {
@@ -630,7 +766,7 @@ export const PROVIDER_DEFINITIONS: Record<string, ProviderDefinition> = {
     modelPatterns: [/^cerebras/],
     icon: CerebrasIcon,
     capabilities: {
-      toolUsageControl: false,
+      toolUsageControl: true,
     },
     models: [
       {
@@ -679,7 +815,7 @@ export const PROVIDER_DEFINITIONS: Record<string, ProviderDefinition> = {
     modelPatterns: [/^groq/],
     icon: GroqIcon,
     capabilities: {
-      toolUsageControl: false,
+      toolUsageControl: true,
     },
     models: [
       {
@@ -957,6 +1093,9 @@ export const PROVIDER_DEFINITIONS: Record<string, ProviderDefinition> = {
     defaultModel: '',
     modelPatterns: [],
     icon: OllamaIcon,
+    capabilities: {
+      toolUsageControl: false, // Ollama does not support tool_choice parameter
+    },
     models: [], // Populated dynamically
   },
 }
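With Cerebras and Groq flipped to `toolUsageControl: true` and Ollama pinned to `false`, a check like the `supportsToolUsageControl` used in the tests reduces to a lookup over these definitions. A trimmed-down sketch — the map below mirrors only the three providers touched by this diff, and the lookup helper is written for illustration, not copied from the repo:

```typescript
interface ProviderDefinitionLite {
  capabilities: { toolUsageControl: boolean }
}

// Trimmed-down mirror of the capability flags this diff changes.
const DEFS: Record<string, ProviderDefinitionLite> = {
  cerebras: { capabilities: { toolUsageControl: true } },
  groq: { capabilities: { toolUsageControl: true } },
  ollama: { capabilities: { toolUsageControl: false } },
}

function supportsToolUsageControl(provider: string): boolean {
  // Unknown providers report false rather than throwing.
  return DEFS[provider]?.capabilities.toolUsageControl ?? false
}
```

This matches the updated test expectation later in the diff, where `unsupportedProviders` shrinks to `['ollama', 'non-existent-provider']`.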
@@ -9,11 +9,7 @@ import type {
   ProviderResponse,
   TimeSegment,
 } from '@/providers/types'
-import {
-  prepareToolExecution,
-  prepareToolsWithUsageControl,
-  trackForcedToolUsage,
-} from '@/providers/utils'
+import { prepareToolExecution } from '@/providers/utils'
 import { useProvidersStore } from '@/stores/providers/store'
 import { executeTool } from '@/tools'

@@ -173,20 +169,42 @@ export const ollamaProvider: ProviderConfig = {
     }

     // Handle tools and tool usage control
-    let preparedTools: ReturnType<typeof prepareToolsWithUsageControl> | null = null
-
+    // NOTE: Ollama does NOT support the tool_choice parameter beyond basic 'auto' behavior
+    // According to official documentation, tool_choice is silently ignored
+    // Ollama only supports basic function calling where the model autonomously decides
     if (tools?.length) {
-      preparedTools = prepareToolsWithUsageControl(tools, request.tools, logger, 'ollama')
-      const { tools: filteredTools, toolChoice } = preparedTools
+      // Filter out tools with usageControl='none'
+      // Treat 'force' as 'auto' since Ollama doesn't support forced tool selection
+      const filteredTools = tools.filter((tool) => {
+        const toolId = tool.function?.name
+        const toolConfig = request.tools?.find((t) => t.id === toolId)
+        // Only filter out 'none', treat 'force' as 'auto'
+        return toolConfig?.usageControl !== 'none'
+      })

-      if (filteredTools?.length && toolChoice) {
+      // Check if any tools were forcibly marked
+      const hasForcedTools = tools.some((tool) => {
+        const toolId = tool.function?.name
+        const toolConfig = request.tools?.find((t) => t.id === toolId)
+        return toolConfig?.usageControl === 'force'
+      })
+
+      if (hasForcedTools) {
+        logger.warn(
+          'Ollama does not support forced tool selection (tool_choice parameter is ignored). ' +
+            'Tools marked with usageControl="force" will behave as "auto" instead.'
+        )
+      }
+
+      if (filteredTools?.length) {
         payload.tools = filteredTools
-        // Ollama supports 'auto' but not forced tool selection - convert 'force' to 'auto'
-        payload.tool_choice = typeof toolChoice === 'string' ? toolChoice : 'auto'
+        // Ollama only supports 'auto' behavior - model decides whether to use tools
+        payload.tool_choice = 'auto'

         logger.info('Ollama request configuration:', {
           toolCount: filteredTools.length,
-          toolChoice: payload.tool_choice,
+          toolChoice: 'auto', // Ollama always uses auto
+          forcedToolsIgnored: hasForcedTools,
           model: request.model,
         })
       }
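The Ollama path now applies the `usageControl` filter inline instead of going through `prepareToolsWithUsageControl`: only `'none'` removes a tool, while `'force'` is downgraded to `'auto'` with a warning. The filtering rule on its own — with tool and config shapes simplified from the provider code:

```typescript
// Sketch of the usageControl filter the Ollama provider now applies inline.
// Shapes are simplified; the real ToolDef/ToolConfig types live in the repo.
interface ToolDef {
  function?: { name: string }
}
interface ToolConfig {
  id: string
  usageControl?: 'auto' | 'force' | 'none'
}

function filterTools(tools: ToolDef[], configs: ToolConfig[]): ToolDef[] {
  return tools.filter((tool) => {
    const cfg = configs.find((c) => c.id === tool.function?.name)
    // Only 'none' removes a tool; 'force' behaves as 'auto' on Ollama.
    return cfg?.usageControl !== 'none'
  })
}
```

Note that a tool with no matching config is kept, since `undefined !== 'none'` — the same behavior the provider's `toolConfig?.usageControl !== 'none'` check gives.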
@@ -295,33 +313,6 @@ export const ollamaProvider: ProviderConfig = {
     // Make the initial API request
     const initialCallTime = Date.now()

-    // Track the original tool_choice for forced tool tracking
-    const originalToolChoice = payload.tool_choice
-
-    // Track forced tools and their usage
-    const forcedTools = preparedTools?.forcedTools || []
-    let usedForcedTools: string[] = []
-
-    // Helper function to check for forced tool usage in responses
-    const checkForForcedToolUsage = (
-      response: any,
-      toolChoice: string | { type: string; function?: { name: string }; name?: string; any?: any }
-    ) => {
-      if (typeof toolChoice === 'object' && response.choices[0]?.message?.tool_calls) {
-        const toolCallsResponse = response.choices[0].message.tool_calls
-        const result = trackForcedToolUsage(
-          toolCallsResponse,
-          toolChoice,
-          logger,
-          'ollama',
-          forcedTools,
-          usedForcedTools
-        )
-        hasUsedForcedTool = result.hasUsedForcedTool
-        usedForcedTools = result.usedForcedTools
-      }
-    }
-
     let currentResponse = await ollama.chat.completions.create(payload)
     const firstResponseTime = Date.now() - initialCallTime
@@ -349,9 +340,6 @@ export const ollamaProvider: ProviderConfig = {
     let modelTime = firstResponseTime
     let toolsTime = 0

-    // Track if a forced tool has been used
-    let hasUsedForcedTool = false
-
     // Track each model and tool call segment with timestamps
     const timeSegments: TimeSegment[] = [
       {
@@ -363,9 +351,6 @@ export const ollamaProvider: ProviderConfig = {
       },
     ]

-    // Check if a forced tool was used in the first response
-    checkForForcedToolUsage(currentResponse, originalToolChoice)
-
     while (iterationCount < MAX_ITERATIONS) {
       // Check for tool calls
       const toolCallsInResponse = currentResponse.choices[0]?.message?.tool_calls
@@ -470,31 +455,12 @@ export const ollamaProvider: ProviderConfig = {
         messages: currentMessages,
       }

-      // Update tool_choice based on which forced tools have been used
-      if (typeof originalToolChoice === 'object' && hasUsedForcedTool && forcedTools.length > 0) {
-        // If we have remaining forced tools, get the next one to force
-        const remainingTools = forcedTools.filter((tool) => !usedForcedTools.includes(tool))
-
-        if (remainingTools.length > 0) {
-          // Ollama doesn't support forced tool selection, so we keep using 'auto'
-          nextPayload.tool_choice = 'auto'
-          logger.info(`Ollama doesn't support forced tools, using auto for: ${remainingTools[0]}`)
-        } else {
-          // All forced tools have been used, continue with auto
-          nextPayload.tool_choice = 'auto'
-          logger.info('All forced tools have been used, continuing with auto tool_choice')
-        }
-      }
-
       // Time the next model call
       const nextModelStartTime = Date.now()

       // Make the next request
       currentResponse = await ollama.chat.completions.create(nextPayload)

-      // Check if any forced tools were used in this response
-      checkForForcedToolUsage(currentResponse, nextPayload.tool_choice)
-
       const nextModelEndTime = Date.now()
       const thisModelTime = nextModelEndTime - nextModelStartTime

@@ -35,7 +35,6 @@ const mockGetRotatingApiKey = vi.fn().mockReturnValue('rotating-server-key')
-const originalRequire = module.require

 describe('getApiKey', () => {
   // Save original env and reset between tests
   const originalEnv = { ...process.env }

   beforeEach(() => {
@@ -146,6 +145,15 @@ describe('Model Capabilities', () => {
       'deepseek-chat',
       'azure/gpt-4.1',
       'azure/model-router',
+      // GPT-5.1 models don't support temperature (removed in our implementation)
+      'gpt-5.1',
+      'gpt-5.1-mini',
+      'gpt-5.1-nano',
+      'gpt-5.1-codex',
+      'azure/gpt-5.1',
+      'azure/gpt-5.1-mini',
+      'azure/gpt-5.1-nano',
+      'azure/gpt-5.1-codex',
       // GPT-5 models don't support temperature (removed in our implementation)
       'gpt-5',
       'gpt-5-mini',
@@ -218,6 +226,15 @@ describe('Model Capabilities', () => {
     expect(getMaxTemperature('azure/o3')).toBeUndefined()
     expect(getMaxTemperature('azure/o4-mini')).toBeUndefined()
     expect(getMaxTemperature('deepseek-r1')).toBeUndefined()
+    // GPT-5.1 models don't support temperature
+    expect(getMaxTemperature('gpt-5.1')).toBeUndefined()
+    expect(getMaxTemperature('gpt-5.1-mini')).toBeUndefined()
+    expect(getMaxTemperature('gpt-5.1-nano')).toBeUndefined()
+    expect(getMaxTemperature('gpt-5.1-codex')).toBeUndefined()
+    expect(getMaxTemperature('azure/gpt-5.1')).toBeUndefined()
+    expect(getMaxTemperature('azure/gpt-5.1-mini')).toBeUndefined()
+    expect(getMaxTemperature('azure/gpt-5.1-nano')).toBeUndefined()
+    expect(getMaxTemperature('azure/gpt-5.1-codex')).toBeUndefined()
     // GPT-5 models don't support temperature
     expect(getMaxTemperature('gpt-5')).toBeUndefined()
     expect(getMaxTemperature('gpt-5-mini')).toBeUndefined()
@@ -263,7 +280,7 @@ describe('Model Capabilities', () => {
   it.concurrent(
     'should return false for providers that do not support tool usage control',
     () => {
-      const unsupportedProviders = ['ollama', 'cerebras', 'groq', 'non-existent-provider']
+      const unsupportedProviders = ['ollama', 'non-existent-provider']

       for (const provider of unsupportedProviders) {
         expect(supportsToolUsageControl(provider)).toBe(false)
@@ -306,6 +323,16 @@ describe('Model Capabilities', () => {
   )

   it.concurrent('should have correct models in MODELS_WITH_REASONING_EFFORT', () => {
+    // Should contain GPT-5.1 models that support reasoning effort
+    expect(MODELS_WITH_REASONING_EFFORT).toContain('gpt-5.1')
+    expect(MODELS_WITH_REASONING_EFFORT).toContain('gpt-5.1-mini')
+    expect(MODELS_WITH_REASONING_EFFORT).toContain('gpt-5.1-nano')
+    expect(MODELS_WITH_REASONING_EFFORT).toContain('gpt-5.1-codex')
+    expect(MODELS_WITH_REASONING_EFFORT).toContain('azure/gpt-5.1')
+    expect(MODELS_WITH_REASONING_EFFORT).toContain('azure/gpt-5.1-mini')
+    expect(MODELS_WITH_REASONING_EFFORT).toContain('azure/gpt-5.1-nano')
+    expect(MODELS_WITH_REASONING_EFFORT).toContain('azure/gpt-5.1-codex')
+
     // Should contain GPT-5 models that support reasoning effort
     expect(MODELS_WITH_REASONING_EFFORT).toContain('gpt-5')
     expect(MODELS_WITH_REASONING_EFFORT).toContain('gpt-5-mini')
@@ -325,6 +352,16 @@ describe('Model Capabilities', () => {
   })

   it.concurrent('should have correct models in MODELS_WITH_VERBOSITY', () => {
+    // Should contain GPT-5.1 models that support verbosity
+    expect(MODELS_WITH_VERBOSITY).toContain('gpt-5.1')
+    expect(MODELS_WITH_VERBOSITY).toContain('gpt-5.1-mini')
+    expect(MODELS_WITH_VERBOSITY).toContain('gpt-5.1-nano')
+    expect(MODELS_WITH_VERBOSITY).toContain('gpt-5.1-codex')
+    expect(MODELS_WITH_VERBOSITY).toContain('azure/gpt-5.1')
+    expect(MODELS_WITH_VERBOSITY).toContain('azure/gpt-5.1-mini')
+    expect(MODELS_WITH_VERBOSITY).toContain('azure/gpt-5.1-nano')
+    expect(MODELS_WITH_VERBOSITY).toContain('azure/gpt-5.1-codex')
+
     // Should contain GPT-5 models that support verbosity
     expect(MODELS_WITH_VERBOSITY).toContain('gpt-5')
    expect(MODELS_WITH_VERBOSITY).toContain('gpt-5-mini')