Compare commits


27 Commits

Author SHA1 Message Date
Waleed
4372841797 v0.5.60: invitation flow improvements, chat fixes, a2a improvements, additional copilot actions 2026-01-15 00:02:18 -08:00
Waleed
5e8c843241 v0.5.59: a2a support, documentation 2026-01-13 13:21:21 -08:00
Waleed
7bf3d73ee6 v0.5.58: export folders, new tools, permissions groups enhancements 2026-01-13 00:56:59 -08:00
Vikhyath Mondreti
7ffc11a738 v0.5.57: subagents, context menu improvements, bug fixes 2026-01-11 11:38:40 -08:00
Waleed
be578e2ed7 v0.5.56: batch operations, access control and permission groups, billing fixes 2026-01-10 00:31:34 -08:00
Waleed
f415e5edc4 v0.5.55: polling groups, bedrock provider, devcontainer fixes, workflow preview enhancements 2026-01-08 23:36:56 -08:00
Waleed
13a6e6c3fa v0.5.54: seo, model blacklist, helm chart updates, fireflies integration, autoconnect improvements, billing fixes 2026-01-07 16:09:45 -08:00
Waleed
f5ab7f21ae v0.5.53: hotkey improvements, added redis fallback, fixes for workflow tool 2026-01-06 23:34:52 -08:00
Waleed
bfb6fffe38 v0.5.52: new port-based router block, combobox expression and variable support 2026-01-06 16:14:10 -08:00
Waleed
4fbec0a43f v0.5.51: triggers, kb, condition block improvements, supabase and grain integration updates 2026-01-06 14:26:46 -08:00
Waleed
585f5e365b v0.5.50: import improvements, ui upgrades, kb styling and performance improvements 2026-01-05 00:35:55 -08:00
Waleed
3792bdd252 v0.5.49: hitl improvements, new email styles, imap trigger, logs context menu (#2672)
* feat(logs-context-menu): consolidated logs utils and types, added logs record context menu (#2659)

* feat(email): welcome email; improvement(emails): ui/ux (#2658)

* feat(email): welcome email; improvement(emails): ui/ux

* improvement(emails): links, accounts, preview

* refactor(emails): file structure and wrapper components

* added envvar for personal emails sent, added isHosted gate

* fixed failing tests, added env mock

* fix: removed comment

---------

Co-authored-by: waleed <walif6@gmail.com>

* fix(logging): hitl + trigger dev crash protection (#2664)

* hitl gaps

* deal with trigger worker crashes

* cleanup import structure

* feat(imap): added support for imap trigger (#2663)

* feat(tools): added support for imap trigger

* feat(imap): added parity, tested

* ack PR comments

* final cleanup

* feat(i18n): update translations (#2665)

Co-authored-by: waleedlatif1 <waleedlatif1@users.noreply.github.com>

* fix(grain): updated grain trigger to auto-establish trigger (#2666)

Co-authored-by: aadamgough <adam@sim.ai>

* feat(admin): routes to manage deployments (#2667)

* feat(admin): routes to manage deployments

* fix naming of deployed by

* feat(time-picker): added timepicker emcn component, added to playground, added searchable prop for dropdown, added more timezones for schedule, updated license and notice date (#2668)

* feat(time-picker): added timepicker emcn component, added to playground, added searchable prop for dropdown, added more timezones for schedule, updated license and notice date

* removed unused params, cleaned up redundant utils

* improvement(invite): aligned styling (#2669)

* improvement(invite): aligned with rest of app

* fix(invite): error handling

* fix: addressed comments

---------

Co-authored-by: Emir Karabeg <78010029+emir-karabeg@users.noreply.github.com>
Co-authored-by: Vikhyath Mondreti <vikhyathvikku@gmail.com>
Co-authored-by: waleedlatif1 <waleedlatif1@users.noreply.github.com>
Co-authored-by: Adam Gough <77861281+aadamgough@users.noreply.github.com>
Co-authored-by: aadamgough <adam@sim.ai>
2026-01-03 13:19:18 -08:00
Waleed
eb5d1f3e5b v0.5.48: copy-paste workflow blocks, docs updates, mcp tool fixes 2025-12-31 18:00:04 -08:00
Waleed
54ab82c8dd v0.5.47: deploy workflow as mcp, kb chunks tokenizer, UI improvements, jira service management tools 2025-12-30 23:18:58 -08:00
Waleed
f895bf469b v0.5.46: build improvements, greptile, light mode improvements 2025-12-29 02:17:52 -08:00
Waleed
dd3209af06 v0.5.45: light mode fixes, realtime usage indicator, docker build improvements 2025-12-27 19:57:42 -08:00
Waleed
b6ba3b50a7 v0.5.44: keyboard shortcuts, autolayout, light mode, byok, testing improvements 2025-12-26 21:25:19 -08:00
Waleed
b304233062 v0.5.43: export logs, circleback, grain, vertex, code hygiene, schedule improvements 2025-12-23 19:19:18 -08:00
Vikhyath Mondreti
57e4b49bd6 v0.5.42: fix memory migration 2025-12-23 01:24:54 -08:00
Vikhyath Mondreti
e12dd204ed v0.5.41: memory fixes, copilot improvements, knowledgebase improvements, LLM providers standardization 2025-12-23 00:15:18 -08:00
Vikhyath Mondreti
3d9d9cbc54 v0.5.40: supabase ops to allow non-public schemas, jira uuid 2025-12-21 22:28:05 -08:00
Waleed
0f4ec962ad v0.5.39: notion, workflow variables fixes 2025-12-20 20:44:00 -08:00
Waleed
4827866f9a v0.5.38: snap to grid, copilot ux improvements, billing line items 2025-12-20 17:24:38 -08:00
Waleed
3e697d9ed9 v0.5.37: redaction utils consolidation, logs updates, autoconnect improvements, additional kb tag types 2025-12-19 22:31:55 -08:00
Martin Yankov
4431a1a484 fix(helm): add custom egress rules to realtime network policy (#2481)
The realtime service network policy was missing the custom egress rules section
that allows configuration of additional egress rules via values.yaml. This caused
the realtime pods to be unable to connect to external databases (e.g., PostgreSQL
on port 5432) when using external database configurations.

The app network policy already had this section, but the realtime network policy
was missing it, creating an inconsistency and preventing the realtime service
from accessing external databases configured via networkPolicy.egress values.

This fix adds the same custom egress rules template section to the realtime
network policy, matching the app network policy behavior and allowing users to
configure database connectivity via values.yaml.
2025-12-19 18:59:08 -08:00
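The fix above wires `networkPolicy.egress` values into the realtime policy template. A minimal values.yaml sketch of the kind of custom egress rule this enables — the key names follow standard Kubernetes NetworkPolicy shape, and the CIDR is illustrative:

```yaml
networkPolicy:
  egress:
    # Allow the realtime pods to reach an external PostgreSQL instance.
    - to:
        - ipBlock:
            cidr: 10.0.0.0/16
      ports:
        - protocol: TCP
          port: 5432
```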
Waleed
4d1a9a3f22 v0.5.36: hitl improvements, opengraph, slack fixes, one-click unsubscribe, auth checks, new db indexes 2025-12-19 01:27:49 -08:00
Vikhyath Mondreti
eb07a080fb v0.5.35: helm updates, copilot improvements, 404 for docs, salesforce fixes, subflow resize clamping 2025-12-18 16:23:19 -08:00
17 changed files with 534 additions and 580 deletions

View File

@@ -43,27 +43,6 @@ In Sim, the Slack integration enables your agents to programmatically interact w
- **Download files**: Retrieve files shared in Slack channels for processing or archival
This allows for powerful automation scenarios such as sending notifications with dynamic updates, managing conversational flows with editable status messages, acknowledging important messages with reactions, and maintaining clean channels by removing outdated bot messages. Your agents can deliver timely information, update messages as workflows progress, create collaborative documents, or alert team members when attention is needed. This integration bridges the gap between your AI workflows and your team's communication, ensuring everyone stays informed with accurate, up-to-date information. By connecting Sim with Slack, you can create agents that keep your team updated with relevant information at the right time, enhance collaboration by sharing and updating insights automatically, and reduce the need for manual status updates—all while leveraging your existing Slack workspace where your team already communicates.
## Getting Started
To connect Slack to your Sim workflows:
1. Sign up or log in at [sim.ai](https://sim.ai)
2. Create a new workflow or open an existing one
3. Drag a **Slack** block onto your canvas
4. Click the credential selector and choose **Connect**
5. Authorize Sim to access your Slack workspace
6. Select your target channel or user
Once connected, you can use any of the Slack operations listed below.
## AI-Generated Content
Sim workflows may use AI models to generate messages and responses sent to Slack. AI-generated content may be inaccurate or contain errors. Always review automated outputs, especially for critical communications.
## Need Help?
If you encounter issues with the Slack integration, contact us at [help@sim.ai](mailto:help@sim.ai)
{/* MANUAL-CONTENT-END */}

View File

@@ -7,7 +7,6 @@ import { z } from 'zod'
import { getSession } from '@/lib/auth'
import { generateChatTitle } from '@/lib/copilot/chat-title'
import { getCopilotModel } from '@/lib/copilot/config'
import { COPILOT_MODEL_IDS, COPILOT_REQUEST_MODES } from '@/lib/copilot/models'
import { SIM_AGENT_API_URL_DEFAULT, SIM_AGENT_VERSION } from '@/lib/copilot/constants'
import {
authenticateCopilotRequestSessionOnly,
@@ -41,8 +40,34 @@ const ChatMessageSchema = z.object({
userMessageId: z.string().optional(), // ID from frontend for the user message
chatId: z.string().optional(),
workflowId: z.string().min(1, 'Workflow ID is required'),
model: z.enum(COPILOT_MODEL_IDS).optional().default('claude-4.5-opus'),
mode: z.enum(COPILOT_REQUEST_MODES).optional().default('agent'),
model: z
.enum([
'gpt-5-fast',
'gpt-5',
'gpt-5-medium',
'gpt-5-high',
'gpt-5.1-fast',
'gpt-5.1',
'gpt-5.1-medium',
'gpt-5.1-high',
'gpt-5-codex',
'gpt-5.1-codex',
'gpt-5.2',
'gpt-5.2-codex',
'gpt-5.2-pro',
'gpt-4o',
'gpt-4.1',
'o3',
'claude-4-sonnet',
'claude-4.5-haiku',
'claude-4.5-sonnet',
'claude-4.5-opus',
'claude-4.1-opus',
'gemini-3-pro',
])
.optional()
.default('claude-4.5-opus'),
mode: z.enum(['ask', 'agent', 'plan']).optional().default('agent'),
prefetch: z.boolean().optional(),
createNewChat: z.boolean().optional().default(false),
stream: z.boolean().optional().default(true),
@@ -270,8 +295,7 @@ export async function POST(req: NextRequest) {
}
const defaults = getCopilotModel('chat')
const selectedModel = model || defaults.model
const envModel = env.COPILOT_MODEL || defaults.model
const modelToUse = env.COPILOT_MODEL || defaults.model
let providerConfig: CopilotProviderConfig | undefined
const providerEnv = env.COPILOT_PROVIDER as any
@@ -280,7 +304,7 @@ export async function POST(req: NextRequest) {
if (providerEnv === 'azure-openai') {
providerConfig = {
provider: 'azure-openai',
model: envModel,
model: modelToUse,
apiKey: env.AZURE_OPENAI_API_KEY,
apiVersion: 'preview',
endpoint: env.AZURE_OPENAI_ENDPOINT,
@@ -288,7 +312,7 @@ export async function POST(req: NextRequest) {
} else if (providerEnv === 'vertex') {
providerConfig = {
provider: 'vertex',
model: envModel,
model: modelToUse,
apiKey: env.COPILOT_API_KEY,
vertexProject: env.VERTEX_PROJECT,
vertexLocation: env.VERTEX_LOCATION,
@@ -296,15 +320,12 @@ export async function POST(req: NextRequest) {
} else {
providerConfig = {
provider: providerEnv,
model: selectedModel,
model: modelToUse,
apiKey: env.COPILOT_API_KEY,
}
}
}
const effectiveMode = mode === 'agent' ? 'build' : mode
const transportMode = effectiveMode === 'build' ? 'agent' : effectiveMode
// Determine conversationId to use for this request
const effectiveConversationId =
(currentChat?.conversationId as string | undefined) || conversationId
@@ -324,7 +345,7 @@ export async function POST(req: NextRequest) {
}
} | null = null
if (effectiveMode === 'build') {
if (mode === 'agent') {
// Build base tools (executed locally, not deferred)
// Include function_execute for code execution capability
baseTools = [
@@ -431,8 +452,8 @@ export async function POST(req: NextRequest) {
userId: authenticatedUserId,
stream: stream,
streamToolCalls: true,
model: selectedModel,
mode: transportMode,
model: model,
mode: mode,
messageId: userMessageIdToUse,
version: SIM_AGENT_VERSION,
...(providerConfig ? { provider: providerConfig } : {}),
@@ -456,7 +477,7 @@ export async function POST(req: NextRequest) {
hasConversationId: !!effectiveConversationId,
hasFileAttachments: processedFileContents.length > 0,
messageLength: message.length,
mode: effectiveMode,
mode,
hasTools: integrationTools.length > 0,
toolCount: integrationTools.length,
hasBaseTools: baseTools.length > 0,

View File

@@ -11,7 +11,6 @@ import {
createRequestTracker,
createUnauthorizedResponse,
} from '@/lib/copilot/request-helpers'
import { COPILOT_MODES } from '@/lib/copilot/models'
const logger = createLogger('CopilotChatUpdateAPI')
@@ -46,7 +45,7 @@ const UpdateMessagesSchema = z.object({
planArtifact: z.string().nullable().optional(),
config: z
.object({
mode: z.enum(COPILOT_MODES).optional(),
mode: z.enum(['ask', 'build', 'plan']).optional(),
model: z.string().optional(),
})
.nullable()

View File

@@ -2,13 +2,12 @@ import { createLogger } from '@sim/logger'
import { eq } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { getSession } from '@/lib/auth'
import type { CopilotModelId } from '@/lib/copilot/models'
import { db } from '@/../../packages/db'
import { settings } from '@/../../packages/db/schema'
const logger = createLogger('CopilotUserModelsAPI')
const DEFAULT_ENABLED_MODELS: Record<CopilotModelId, boolean> = {
const DEFAULT_ENABLED_MODELS: Record<string, boolean> = {
'gpt-4o': false,
'gpt-4.1': false,
'gpt-5-fast': false,
@@ -29,7 +28,7 @@ const DEFAULT_ENABLED_MODELS: Record<CopilotModelId, boolean> = {
'claude-4.5-haiku': true,
'claude-4.5-sonnet': true,
'claude-4.5-opus': true,
'claude-4.1-opus': false,
// 'claude-4.1-opus': true,
'gemini-3-pro': true,
}
@@ -55,9 +54,7 @@ export async function GET(request: NextRequest) {
const mergedModels = { ...DEFAULT_ENABLED_MODELS }
for (const [modelId, enabled] of Object.entries(userModelsMap)) {
if (modelId in mergedModels) {
mergedModels[modelId as CopilotModelId] = enabled
}
mergedModels[modelId] = enabled
}
const hasNewModels = Object.keys(DEFAULT_ENABLED_MODELS).some(
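The merge in this hunk is a defaults-plus-overrides pattern; a minimal sketch (model IDs are illustrative). Note that with the `modelId in mergedModels` guard removed, user-stored IDs that are not in the defaults now pass through into the merged map:

```typescript
// Defaults-plus-overrides merge: start from the default map, then apply
// per-user toggles on top (model IDs here are illustrative).
const DEFAULT_ENABLED_MODELS: Record<string, boolean> = {
  'claude-4.5-opus': true,
  'gpt-4o': false,
}

function mergeEnabledModels(userModelsMap: Record<string, boolean>): Record<string, boolean> {
  const mergedModels = { ...DEFAULT_ENABLED_MODELS }
  for (const [modelId, enabled] of Object.entries(userModelsMap)) {
    mergedModels[modelId] = enabled
  }
  return mergedModels
}
```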

View File

@@ -7,33 +7,192 @@ import { usePreventZoom } from '@/app/workspace/[workspaceId]/w/[workflowId]/hoo
import { useCopilotStore, usePanelStore } from '@/stores/panel'
import { useTerminalStore } from '@/stores/terminal'
import { useWorkflowDiffStore } from '@/stores/workflow-diff'
import { useWorkflowRegistry } from '@/stores/workflows/registry/store'
import { mergeSubblockState } from '@/stores/workflows/utils'
import { useWorkflowStore } from '@/stores/workflows/workflow/store'
const logger = createLogger('DiffControls')
export const DiffControls = memo(function DiffControls() {
const isTerminalResizing = useTerminalStore((state) => state.isResizing)
const isPanelResizing = usePanelStore((state) => state.isResizing)
const { isDiffReady, hasActiveDiff, acceptChanges, rejectChanges } = useWorkflowDiffStore(
const { isDiffReady, hasActiveDiff, acceptChanges, rejectChanges, baselineWorkflow } =
useWorkflowDiffStore(
useCallback(
(state) => ({
isDiffReady: state.isDiffReady,
hasActiveDiff: state.hasActiveDiff,
acceptChanges: state.acceptChanges,
rejectChanges: state.rejectChanges,
baselineWorkflow: state.baselineWorkflow,
}),
[]
)
)
const { updatePreviewToolCallState, currentChat, messages } = useCopilotStore(
useCallback(
(state) => ({
isDiffReady: state.isDiffReady,
hasActiveDiff: state.hasActiveDiff,
acceptChanges: state.acceptChanges,
rejectChanges: state.rejectChanges,
updatePreviewToolCallState: state.updatePreviewToolCallState,
currentChat: state.currentChat,
messages: state.messages,
}),
[]
)
)
const { updatePreviewToolCallState } = useCopilotStore(
useCallback(
(state) => ({
updatePreviewToolCallState: state.updatePreviewToolCallState,
}),
[]
)
const { activeWorkflowId } = useWorkflowRegistry(
useCallback((state) => ({ activeWorkflowId: state.activeWorkflowId }), [])
)
const createCheckpoint = useCallback(async () => {
if (!activeWorkflowId || !currentChat?.id) {
logger.warn('Cannot create checkpoint: missing workflowId or chatId', {
workflowId: activeWorkflowId,
chatId: currentChat?.id,
})
return false
}
try {
logger.info('Creating checkpoint before accepting changes')
// Use the baseline workflow (state before diff) instead of current state
// This ensures reverting to the checkpoint restores the pre-diff state
const rawState = baselineWorkflow || useWorkflowStore.getState().getWorkflowState()
// The baseline already has merged subblock values, but we'll merge again to be safe
// This ensures all user inputs and subblock data are captured
const blocksWithSubblockValues = mergeSubblockState(rawState.blocks, activeWorkflowId)
// Filter and complete blocks to ensure all required fields are present
// This matches the validation logic from /api/workflows/[id]/state
const filteredBlocks = Object.entries(blocksWithSubblockValues).reduce(
(acc, [blockId, block]) => {
if (block.type && block.name) {
// Ensure all required fields are present
acc[blockId] = {
...block,
id: block.id || blockId, // Ensure id field is set
enabled: block.enabled !== undefined ? block.enabled : true,
horizontalHandles:
block.horizontalHandles !== undefined ? block.horizontalHandles : true,
height: block.height !== undefined ? block.height : 90,
subBlocks: block.subBlocks || {},
outputs: block.outputs || {},
data: block.data || {},
position: block.position || { x: 0, y: 0 }, // Ensure position exists
}
}
return acc
},
{} as typeof rawState.blocks
)
// Clean the workflow state - only include valid fields, exclude null/undefined values
const workflowState = {
blocks: filteredBlocks,
edges: rawState.edges || [],
loops: rawState.loops || {},
parallels: rawState.parallels || {},
lastSaved: rawState.lastSaved || Date.now(),
deploymentStatuses: rawState.deploymentStatuses || {},
}
logger.info('Prepared complete workflow state for checkpoint', {
blocksCount: Object.keys(workflowState.blocks).length,
edgesCount: workflowState.edges.length,
loopsCount: Object.keys(workflowState.loops).length,
parallelsCount: Object.keys(workflowState.parallels).length,
hasRequiredFields: Object.values(workflowState.blocks).every(
(block) => block.id && block.type && block.name && block.position
),
hasSubblockValues: Object.values(workflowState.blocks).some((block) =>
Object.values(block.subBlocks || {}).some(
(subblock) => subblock.value !== null && subblock.value !== undefined
)
),
sampleBlock: Object.values(workflowState.blocks)[0],
})
// Find the most recent user message ID from the current chat
const userMessages = messages.filter((msg) => msg.role === 'user')
const lastUserMessage = userMessages[userMessages.length - 1]
const messageId = lastUserMessage?.id
logger.info('Creating checkpoint with message association', {
totalMessages: messages.length,
userMessageCount: userMessages.length,
lastUserMessageId: messageId,
chatId: currentChat.id,
entireMessageArray: messages,
allMessageIds: messages.map((m) => ({
id: m.id,
role: m.role,
content: m.content.substring(0, 50),
})),
selectedUserMessages: userMessages.map((m) => ({
id: m.id,
content: m.content.substring(0, 100),
})),
allRawMessageIds: messages.map((m) => m.id),
userMessageIds: userMessages.map((m) => m.id),
checkpointData: {
workflowId: activeWorkflowId,
chatId: currentChat.id,
messageId: messageId,
messageFound: !!lastUserMessage,
},
})
const response = await fetch('/api/copilot/checkpoints', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({
workflowId: activeWorkflowId,
chatId: currentChat.id,
messageId,
workflowState: JSON.stringify(workflowState),
}),
})
if (!response.ok) {
throw new Error(`Failed to create checkpoint: ${response.statusText}`)
}
const result = await response.json()
const newCheckpoint = result.checkpoint
logger.info('Checkpoint created successfully', {
messageId,
chatId: currentChat.id,
checkpointId: newCheckpoint?.id,
})
// Update the copilot store immediately to show the checkpoint icon
if (newCheckpoint && messageId) {
const { messageCheckpoints: currentCheckpoints } = useCopilotStore.getState()
const existingCheckpoints = currentCheckpoints[messageId] || []
const updatedCheckpoints = {
...currentCheckpoints,
[messageId]: [newCheckpoint, ...existingCheckpoints],
}
useCopilotStore.setState({ messageCheckpoints: updatedCheckpoints })
logger.info('Updated copilot store with new checkpoint', {
messageId,
checkpointId: newCheckpoint.id,
})
}
return true
} catch (error) {
logger.error('Failed to create checkpoint:', error)
return false
}
}, [activeWorkflowId, currentChat, messages, baselineWorkflow])
const handleAccept = useCallback(() => {
logger.info('Accepting proposed changes with backup protection')
@@ -70,8 +229,12 @@ export const DiffControls = memo(function DiffControls() {
})
// Create checkpoint in the background (fire-and-forget) so it doesn't block UI
createCheckpoint().catch((error) => {
logger.warn('Failed to create checkpoint after accept:', error)
})
logger.info('Accept triggered; UI will update optimistically')
}, [updatePreviewToolCallState, acceptChanges])
}, [createCheckpoint, updatePreviewToolCallState, acceptChanges])
const handleReject = useCallback(() => {
logger.info('Rejecting proposed changes (optimistic)')

View File

@@ -57,7 +57,7 @@ export function useCheckpointManagement(
const { messageCheckpoints: currentCheckpoints } = useCopilotStore.getState()
const updatedCheckpoints = {
...currentCheckpoints,
[message.id]: [],
[message.id]: messageCheckpoints.slice(1),
}
useCopilotStore.setState({ messageCheckpoints: updatedCheckpoints })
@@ -140,7 +140,7 @@ export function useCheckpointManagement(
const { messageCheckpoints: currentCheckpoints } = useCopilotStore.getState()
const updatedCheckpoints = {
...currentCheckpoints,
[message.id]: [],
[message.id]: messageCheckpoints.slice(1),
}
useCopilotStore.setState({ messageCheckpoints: updatedCheckpoints })
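The change in both hunks drops only the checkpoint being reverted — the newest entry, prepended at index 0 — instead of clearing the message's whole checkpoint list. In isolation:

```typescript
// Dropping only the most recent checkpoint instead of clearing the list.
// Checkpoint IDs are illustrative; newest entries are prepended at index 0.
const messageCheckpoints = ['ckpt-3', 'ckpt-2', 'ckpt-1']
const remaining = messageCheckpoints.slice(1)
// Older checkpoints survive, so the user can keep reverting further back.
```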

View File

@@ -1446,10 +1446,8 @@ function WorkflowEditSummary({ toolCall }: { toolCall: CopilotToolCall }) {
blockType = blockType || op.block_type || ''
}
if (!blockName) blockName = blockType || ''
if (!blockName && !blockType) {
continue
}
// Fallback name to type or ID
if (!blockName) blockName = blockType || blockId
const change: BlockChange = { blockId, blockName, blockType }

View File

@@ -22,9 +22,6 @@ interface UseContextManagementProps {
export function useContextManagement({ message, initialContexts }: UseContextManagementProps) {
const [selectedContexts, setSelectedContexts] = useState<ChatContext[]>(initialContexts ?? [])
const initializedRef = useRef(false)
const escapeRegex = useCallback((value: string) => {
return value.replace(/[.*+?^${}()|[\]\\]/g, '\\$&')
}, [])
// Initialize with initial contexts when they're first provided (for edit mode)
useEffect(() => {
@@ -81,8 +78,10 @@ export function useContextManagement({ message, initialContexts }: UseContextMan
// Check for slash command tokens or mention tokens based on kind
const isSlashCommand = c.kind === 'slash_command'
const prefix = isSlashCommand ? '/' : '@'
const tokenPattern = new RegExp(`(^|\\s)${escapeRegex(prefix)}${escapeRegex(c.label)}(\\s|$)`)
return tokenPattern.test(message)
const tokenWithSpaces = ` ${prefix}${c.label} `
const tokenAtStart = `${prefix}${c.label} `
// Token can appear with leading space OR at the start of the message
return message.includes(tokenWithSpaces) || message.startsWith(tokenAtStart)
})
return filtered.length === prev.length ? prev : filtered
})
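The regex-based check is replaced with plain string matching; a self-contained sketch of the new logic (the function name is illustrative). A token now counts only when it sits at the start of the message or is surrounded by spaces, which also means a token at the very end without a trailing space no longer matches — consistent with the `tokenAtEnd` handling removed from `useMentionTokens` below:

```typescript
// Simplified context-token detection: "@label" or "/label" counts as present
// only at the start of the message or when surrounded by spaces.
function hasContextToken(message: string, label: string, isSlashCommand: boolean): boolean {
  const prefix = isSlashCommand ? '/' : '@'
  const tokenWithSpaces = ` ${prefix}${label} `
  const tokenAtStart = `${prefix}${label} `
  return message.includes(tokenWithSpaces) || message.startsWith(tokenAtStart)
}
```

This trades the old regex's exactness for simplicity: an email-like `user@docs.example` no longer needs regex escaping to avoid a false positive, because the required surrounding spaces already rule it out.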

View File

@@ -76,15 +76,6 @@ export function useMentionTokens({
ranges.push({ start: idx, end: idx + token.length, label })
fromIndex = idx + token.length
}
// Token at end of message without trailing space: "@label" or " /label"
const tokenAtEnd = `${prefix}${label}`
if (message.endsWith(tokenAtEnd)) {
const idx = message.lastIndexOf(tokenAtEnd)
const hasLeadingSpace = idx > 0 && message[idx - 1] === ' '
const start = hasLeadingSpace ? idx - 1 : idx
ranges.push({ start, end: message.length, label })
}
}
ranges.sort((a, b) => a.start - b.start)

View File

@@ -613,7 +613,7 @@ const UserInput = forwardRef<UserInputRef, UserInputProps>(
const insertTriggerAndOpenMenu = useCallback(
(trigger: '@' | '/') => {
if (disabled) return
if (disabled || isLoading) return
const textarea = mentionMenu.textareaRef.current
if (!textarea) return
@@ -642,7 +642,7 @@ const UserInput = forwardRef<UserInputRef, UserInputProps>(
}
mentionMenu.setSubmenuActiveIndex(0)
},
[disabled, mentionMenu, message, setMessage]
[disabled, isLoading, mentionMenu, message, setMessage]
)
const handleOpenMentionMenuWithAt = useCallback(
@@ -735,7 +735,10 @@ const UserInput = forwardRef<UserInputRef, UserInputProps>(
variant='outline'
onClick={handleOpenMentionMenuWithAt}
title='Insert @'
className={cn('cursor-pointer rounded-[6px] p-[4.5px]', disabled && 'cursor-not-allowed')}
className={cn(
'cursor-pointer rounded-[6px] p-[4.5px]',
(disabled || isLoading) && 'cursor-not-allowed'
)}
>
<AtSign className='h-3 w-3' strokeWidth={1.75} />
</Badge>
@@ -744,7 +747,10 @@ const UserInput = forwardRef<UserInputRef, UserInputProps>(
variant='outline'
onClick={handleOpenSlashMenu}
title='Insert /'
className={cn('cursor-pointer rounded-[6px] p-[4.5px]', disabled && 'cursor-not-allowed')}
className={cn(
'cursor-pointer rounded-[6px] p-[4.5px]',
(disabled || isLoading) && 'cursor-not-allowed'
)}
>
<span className='flex h-3 w-3 items-center justify-center font-medium text-[11px] leading-none'>
/

View File

@@ -1,9 +1,4 @@
import { createLogger } from '@sim/logger'
import type {
CopilotMode,
CopilotModelId,
CopilotTransportMode,
} from '@/lib/copilot/models'
const logger = createLogger('CopilotAPI')
@@ -32,8 +27,8 @@ export interface CopilotMessage {
* Chat config stored in database
*/
export interface CopilotChatConfig {
mode?: CopilotMode
model?: CopilotModelId
mode?: 'ask' | 'build' | 'plan'
model?: string
}
/**
@@ -70,8 +65,30 @@ export interface SendMessageRequest {
userMessageId?: string // ID from frontend for the user message
chatId?: string
workflowId?: string
mode?: CopilotMode | CopilotTransportMode
model?: CopilotModelId
mode?: 'ask' | 'agent' | 'plan'
model?:
| 'gpt-5-fast'
| 'gpt-5'
| 'gpt-5-medium'
| 'gpt-5-high'
| 'gpt-5.1-fast'
| 'gpt-5.1'
| 'gpt-5.1-medium'
| 'gpt-5.1-high'
| 'gpt-5-codex'
| 'gpt-5.1-codex'
| 'gpt-5.2'
| 'gpt-5.2-codex'
| 'gpt-5.2-pro'
| 'gpt-4o'
| 'gpt-4.1'
| 'o3'
| 'claude-4-sonnet'
| 'claude-4.5-haiku'
| 'claude-4.5-sonnet'
| 'claude-4.5-opus'
| 'claude-4.1-opus'
| 'gemini-3-pro'
prefetch?: boolean
createNewChat?: boolean
stream?: boolean

View File

@@ -1,36 +0,0 @@
export const COPILOT_MODEL_IDS = [
'gpt-5-fast',
'gpt-5',
'gpt-5-medium',
'gpt-5-high',
'gpt-5.1-fast',
'gpt-5.1',
'gpt-5.1-medium',
'gpt-5.1-high',
'gpt-5-codex',
'gpt-5.1-codex',
'gpt-5.2',
'gpt-5.2-codex',
'gpt-5.2-pro',
'gpt-4o',
'gpt-4.1',
'o3',
'claude-4-sonnet',
'claude-4.5-haiku',
'claude-4.5-sonnet',
'claude-4.5-opus',
'claude-4.1-opus',
'gemini-3-pro',
] as const
export type CopilotModelId = (typeof COPILOT_MODEL_IDS)[number]
export const COPILOT_MODES = ['ask', 'build', 'plan'] as const
export type CopilotMode = (typeof COPILOT_MODES)[number]
export const COPILOT_TRANSPORT_MODES = ['ask', 'agent', 'plan'] as const
export type CopilotTransportMode = (typeof COPILOT_TRANSPORT_MODES)[number]
export const COPILOT_REQUEST_MODES = ['ask', 'build', 'plan', 'agent'] as const
export type CopilotRequestMode = (typeof COPILOT_REQUEST_MODES)[number]

View File

@@ -38,18 +38,6 @@ export class EditWorkflowClientTool extends BaseClientTool {
super(toolCallId, EditWorkflowClientTool.id, EditWorkflowClientTool.metadata)
}
async markToolComplete(status: number, message?: any, data?: any): Promise<boolean> {
const logger = createLogger('EditWorkflowClientTool')
logger.info('markToolComplete payload', {
toolCallId: this.toolCallId,
toolName: this.name,
status,
message,
data,
})
return super.markToolComplete(status, message, data)
}
/**
* Get sanitized workflow JSON from a workflow state, merge subblocks, and sanitize for copilot
* This matches what get_user_workflow returns
@@ -185,13 +173,21 @@ export class EditWorkflowClientTool extends BaseClientTool {
async execute(args?: EditWorkflowArgs): Promise<void> {
const logger = createLogger('EditWorkflowClientTool')
if (this.hasExecuted) {
logger.info('execute skipped (already executed)', { toolCallId: this.toolCallId })
return
}
// Use timeout protection to ensure tool always completes
await this.executeWithTimeout(async () => {
if (this.hasExecuted) {
logger.info('execute skipped (already executed)', { toolCallId: this.toolCallId })
// Even if skipped, ensure we mark complete with current workflow state
if (!this.hasBeenMarkedComplete()) {
const currentWorkflowJson = this.getCurrentWorkflowJsonSafe(logger)
await this.markToolComplete(
200,
'Tool already executed',
currentWorkflowJson ? { userWorkflow: currentWorkflowJson } : undefined
)
}
return
}
this.hasExecuted = true
logger.info('execute called', { toolCallId: this.toolCallId, argsProvided: !!args })
this.setState(ClientToolCallState.executing)

View File

@@ -495,7 +495,7 @@ export const OAUTH_PROVIDERS: Record<string, OAuthProviderConfig> = {
services: {
slack: {
name: 'Slack',
description: 'Send messages using a bot for Slack.',
description: 'Send messages using a Slack bot.',
providerId: 'slack',
icon: SlackIcon,
baseProviderIcon: SlackIcon,

View File

@@ -4,7 +4,6 @@ import { createLogger } from '@sim/logger'
import { create } from 'zustand'
import { devtools } from 'zustand/middleware'
import { type CopilotChat, sendStreamingMessage } from '@/lib/copilot/api'
import type { CopilotTransportMode } from '@/lib/copilot/models'
import type {
BaseClientToolMetadata,
ClientToolDisplay,
@@ -85,9 +84,7 @@ import type {
} from '@/stores/panel/copilot/types'
import { useWorkflowDiffStore } from '@/stores/workflow-diff/store'
import { useSubBlockStore } from '@/stores/workflows/subblock/store'
import { mergeSubblockState } from '@/stores/workflows/utils'
import { useWorkflowStore } from '@/stores/workflows/workflow/store'
import type { WorkflowState } from '@/stores/workflows/workflow/types'
const logger = createLogger('CopilotStore')
@@ -240,7 +237,6 @@ const TEXT_BLOCK_TYPE = 'text'
const THINKING_BLOCK_TYPE = 'thinking'
const DATA_PREFIX = 'data: '
const DATA_PREFIX_LENGTH = 6
const CONTINUE_OPTIONS_TAG = '<options>{"1":"Continue"}</options>'
// Resolve display text/icon for a tool based on its state
function resolveToolDisplay(
@@ -364,7 +360,6 @@ function abortAllInProgressTools(set: any, get: () => CopilotStore) {
const { toolCallsById, messages } = get()
const updatedMap = { ...toolCallsById }
const abortedIds = new Set<string>()
let hasUpdates = false
for (const [id, tc] of Object.entries(toolCallsById)) {
const st = tc.state as any
// Abort anything not already terminal success/error/rejected/aborted
@@ -378,19 +373,11 @@ function abortAllInProgressTools(set: any, get: () => CopilotStore) {
updatedMap[id] = {
...tc,
state: ClientToolCallState.aborted,
subAgentStreaming: false,
display: resolveToolDisplay(tc.name, ClientToolCallState.aborted, id, (tc as any).params),
}
hasUpdates = true
} else if (tc.subAgentStreaming) {
updatedMap[id] = {
...tc,
subAgentStreaming: false,
}
hasUpdates = true
}
}
if (abortedIds.size > 0 || hasUpdates) {
if (abortedIds.size > 0) {
set({ toolCallsById: updatedMap })
// Update inline blocks in-place for the latest assistant message only (most relevant)
set((s: CopilotStore) => {
@@ -633,97 +620,6 @@ function createErrorMessage(
}
}
/**
* Builds a workflow snapshot suitable for checkpoint persistence.
*/
function buildCheckpointWorkflowState(workflowId: string): WorkflowState | null {
const rawState = useWorkflowStore.getState().getWorkflowState()
if (!rawState) return null
const blocksWithSubblockValues = mergeSubblockState(rawState.blocks, workflowId)
const filteredBlocks = Object.entries(blocksWithSubblockValues).reduce(
(acc, [blockId, block]) => {
if (block?.type && block?.name) {
acc[blockId] = {
...block,
id: block.id || blockId,
enabled: block.enabled !== undefined ? block.enabled : true,
horizontalHandles: block.horizontalHandles !== undefined ? block.horizontalHandles : true,
height: block.height !== undefined ? block.height : 90,
subBlocks: block.subBlocks || {},
outputs: block.outputs || {},
data: block.data || {},
position: block.position || { x: 0, y: 0 },
}
}
return acc
},
{} as WorkflowState['blocks']
)
return {
blocks: filteredBlocks,
edges: rawState.edges || [],
loops: rawState.loops || {},
parallels: rawState.parallels || {},
lastSaved: rawState.lastSaved || Date.now(),
deploymentStatuses: rawState.deploymentStatuses || {},
}
}
/**
* Persists a previously captured snapshot as a workflow checkpoint.
*/
async function saveMessageCheckpoint(
messageId: string,
get: () => CopilotStore,
set: (partial: Partial<CopilotStore> | ((state: CopilotStore) => Partial<CopilotStore>)) => void
): Promise<boolean> {
const { workflowId, currentChat, messageSnapshots, messageCheckpoints } = get()
if (!workflowId || !currentChat?.id) return false
const snapshot = messageSnapshots[messageId]
if (!snapshot) return false
const nextSnapshots = { ...messageSnapshots }
delete nextSnapshots[messageId]
set({ messageSnapshots: nextSnapshots })
try {
const response = await fetch('/api/copilot/checkpoints', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({
workflowId,
chatId: currentChat.id,
messageId,
workflowState: JSON.stringify(snapshot),
}),
})
if (!response.ok) {
throw new Error(`Failed to create checkpoint: ${response.statusText}`)
}
const result = await response.json()
const newCheckpoint = result.checkpoint
if (newCheckpoint) {
const existingCheckpoints = messageCheckpoints[messageId] || []
const updatedCheckpoints = {
...messageCheckpoints,
[messageId]: [newCheckpoint, ...existingCheckpoints],
}
set({ messageCheckpoints: updatedCheckpoints })
}
return true
} catch (error) {
logger.error('Failed to create checkpoint from snapshot:', error)
return false
}
}
function stripTodoTags(text: string): string {
if (!text) return text
return text
@@ -930,8 +826,6 @@ interface StreamingContext {
newChatId?: string
doneEventCount: number
streamComplete?: boolean
wasAborted?: boolean
suppressContinueOption?: boolean
/** Track active subagent sessions by parent tool call ID */
subAgentParentToolCallId?: string
/** Track subagent content per parent tool call */
@@ -949,129 +843,6 @@ type SSEHandler = (
set: any
) => Promise<void> | void
function appendTextBlock(context: StreamingContext, text: string) {
if (!text) return
context.accumulatedContent.append(text)
if (context.currentTextBlock && context.contentBlocks.length > 0) {
const lastBlock = context.contentBlocks[context.contentBlocks.length - 1]
if (lastBlock.type === TEXT_BLOCK_TYPE && lastBlock === context.currentTextBlock) {
lastBlock.content += text
return
}
}
context.currentTextBlock = contentBlockPool.get()
context.currentTextBlock.type = TEXT_BLOCK_TYPE
context.currentTextBlock.content = text
context.currentTextBlock.timestamp = Date.now()
context.contentBlocks.push(context.currentTextBlock)
}
function appendContinueOption(content: string): string {
if (/<options>/i.test(content)) return content
const suffix = content.trim().length > 0 ? '\n\n' : ''
return `${content}${suffix}${CONTINUE_OPTIONS_TAG}`
}
function appendContinueOptionBlock(blocks: any[]): any[] {
if (!Array.isArray(blocks)) return blocks
const hasOptions = blocks.some(
(block) => block?.type === TEXT_BLOCK_TYPE && typeof block.content === 'string' && /<options>/i.test(block.content)
)
if (hasOptions) return blocks
return [
...blocks,
{
type: TEXT_BLOCK_TYPE,
content: CONTINUE_OPTIONS_TAG,
timestamp: Date.now(),
},
]
}
function beginThinkingBlock(context: StreamingContext) {
if (!context.currentThinkingBlock) {
context.currentThinkingBlock = contentBlockPool.get()
context.currentThinkingBlock.type = THINKING_BLOCK_TYPE
context.currentThinkingBlock.content = ''
context.currentThinkingBlock.timestamp = Date.now()
;(context.currentThinkingBlock as any).startTime = Date.now()
context.contentBlocks.push(context.currentThinkingBlock)
}
context.isInThinkingBlock = true
context.currentTextBlock = null
}
/**
* Removes thinking tags from streamed content.
*/
function stripThinkingTags(text: string): string {
return text.replace(/<\/?thinking>/g, '')
}
function appendThinkingContent(context: StreamingContext, text: string) {
if (!text) return
const cleanedText = stripThinkingTags(text)
if (!cleanedText) return
if (context.currentThinkingBlock) {
context.currentThinkingBlock.content += cleanedText
} else {
context.currentThinkingBlock = contentBlockPool.get()
context.currentThinkingBlock.type = THINKING_BLOCK_TYPE
context.currentThinkingBlock.content = cleanedText
context.currentThinkingBlock.timestamp = Date.now()
context.currentThinkingBlock.startTime = Date.now()
context.contentBlocks.push(context.currentThinkingBlock)
}
context.isInThinkingBlock = true
context.currentTextBlock = null
}
function finalizeThinkingBlock(context: StreamingContext) {
if (context.currentThinkingBlock) {
context.currentThinkingBlock.duration =
Date.now() - (context.currentThinkingBlock.startTime || Date.now())
}
context.isInThinkingBlock = false
context.currentThinkingBlock = null
context.currentTextBlock = null
}
function upsertToolCallBlock(context: StreamingContext, toolCall: CopilotToolCall) {
let found = false
for (let i = 0; i < context.contentBlocks.length; i++) {
const b = context.contentBlocks[i] as any
if (b.type === 'tool_call' && b.toolCall?.id === toolCall.id) {
context.contentBlocks[i] = { ...b, toolCall }
found = true
break
}
}
if (!found) {
context.contentBlocks.push({ type: 'tool_call', toolCall, timestamp: Date.now() })
}
}
function appendSubAgentText(context: StreamingContext, parentToolCallId: string, text: string) {
if (!context.subAgentContent[parentToolCallId]) {
context.subAgentContent[parentToolCallId] = ''
}
if (!context.subAgentBlocks[parentToolCallId]) {
context.subAgentBlocks[parentToolCallId] = []
}
context.subAgentContent[parentToolCallId] += text
const blocks = context.subAgentBlocks[parentToolCallId]
const lastBlock = blocks[blocks.length - 1]
if (lastBlock && lastBlock.type === 'subagent_text') {
lastBlock.content = (lastBlock.content || '') + text
} else {
blocks.push({
type: 'subagent_text',
content: text,
timestamp: Date.now(),
})
}
}
const sseHandlers: Record<string, SSEHandler> = {
chat_id: async (data, context, get) => {
context.newChatId = data.chatId
@@ -1262,7 +1033,17 @@ const sseHandlers: Record<string, SSEHandler> = {
logger.info('[toolCallsById] map updated', updated)
// Add/refresh inline content block
upsertToolCallBlock(context, tc)
let found = false
for (let i = 0; i < context.contentBlocks.length; i++) {
const b = context.contentBlocks[i] as any
if (b.type === 'tool_call' && b.toolCall?.id === toolCallId) {
context.contentBlocks[i] = { ...b, toolCall: tc }
found = true
break
}
}
if (!found)
context.contentBlocks.push({ type: 'tool_call', toolCall: tc, timestamp: Date.now() })
updateStreamingMessage(set, context)
}
},
@@ -1298,10 +1079,20 @@ const sseHandlers: Record<string, SSEHandler> = {
logger.info('[toolCallsById] → pending', { id, name, params: args })
// Ensure an inline content block exists/updated for this tool call
upsertToolCallBlock(context, next)
let found = false
for (let i = 0; i < context.contentBlocks.length; i++) {
const b = context.contentBlocks[i] as any
if (b.type === 'tool_call' && b.toolCall?.id === id) {
context.contentBlocks[i] = { ...b, toolCall: next }
found = true
break
}
}
if (!found) {
context.contentBlocks.push({ type: 'tool_call', toolCall: next, timestamp: Date.now() })
}
updateStreamingMessage(set, context)
// Prefer interface-based registry to determine interrupt and execute
try {
const def = name ? getTool(name) : undefined
@@ -1484,18 +1275,44 @@ const sseHandlers: Record<string, SSEHandler> = {
reasoning: (data, context, _get, set) => {
const phase = (data && (data.phase || data?.data?.phase)) as string | undefined
if (phase === 'start') {
beginThinkingBlock(context)
if (!context.currentThinkingBlock) {
context.currentThinkingBlock = contentBlockPool.get()
context.currentThinkingBlock.type = THINKING_BLOCK_TYPE
context.currentThinkingBlock.content = ''
context.currentThinkingBlock.timestamp = Date.now()
;(context.currentThinkingBlock as any).startTime = Date.now()
context.contentBlocks.push(context.currentThinkingBlock)
}
context.isInThinkingBlock = true
context.currentTextBlock = null
updateStreamingMessage(set, context)
return
}
if (phase === 'end') {
finalizeThinkingBlock(context)
if (context.currentThinkingBlock) {
;(context.currentThinkingBlock as any).duration =
Date.now() - ((context.currentThinkingBlock as any).startTime || Date.now())
}
context.isInThinkingBlock = false
context.currentThinkingBlock = null
context.currentTextBlock = null
updateStreamingMessage(set, context)
return
}
const chunk: string = typeof data?.data === 'string' ? data.data : data?.content || ''
if (!chunk) return
appendThinkingContent(context, chunk)
if (context.currentThinkingBlock) {
context.currentThinkingBlock.content += chunk
} else {
context.currentThinkingBlock = contentBlockPool.get()
context.currentThinkingBlock.type = THINKING_BLOCK_TYPE
context.currentThinkingBlock.content = chunk
context.currentThinkingBlock.timestamp = Date.now()
;(context.currentThinkingBlock as any).startTime = Date.now()
context.contentBlocks.push(context.currentThinkingBlock)
}
context.isInThinkingBlock = true
context.currentTextBlock = null
updateStreamingMessage(set, context)
},
content: (data, context, get, set) => {
@@ -1510,23 +1327,21 @@ const sseHandlers: Record<string, SSEHandler> = {
const designWorkflowStartRegex = /<design_workflow>/
const designWorkflowEndRegex = /<\/design_workflow>/
const splitTrailingPartialTag = (
text: string,
tags: string[]
): { text: string; remaining: string } => {
const partialIndex = text.lastIndexOf('<')
if (partialIndex < 0) {
return { text, remaining: '' }
}
const possibleTag = text.substring(partialIndex)
const matchesTagStart = tags.some((tag) => tag.startsWith(possibleTag))
if (!matchesTagStart) {
return { text, remaining: '' }
}
return {
text: text.substring(0, partialIndex),
remaining: possibleTag,
const appendTextToContent = (text: string) => {
if (!text) return
context.accumulatedContent.append(text)
if (context.currentTextBlock && context.contentBlocks.length > 0) {
const lastBlock = context.contentBlocks[context.contentBlocks.length - 1]
if (lastBlock.type === TEXT_BLOCK_TYPE && lastBlock === context.currentTextBlock) {
lastBlock.content += text
return
}
}
context.currentTextBlock = contentBlockPool.get()
context.currentTextBlock.type = TEXT_BLOCK_TYPE
context.currentTextBlock.content = text
context.currentTextBlock.timestamp = Date.now()
context.contentBlocks.push(context.currentTextBlock)
}
while (contentToProcess.length > 0) {
@@ -1548,17 +1363,13 @@ const sseHandlers: Record<string, SSEHandler> = {
hasProcessedContent = true
} else {
// Still in design_workflow block, accumulate content
const { text, remaining } = splitTrailingPartialTag(contentToProcess, ['</design_workflow>'])
context.designWorkflowContent += text
context.designWorkflowContent += contentToProcess
// Update store with partial content for streaming effect (available in all modes)
set({ streamingPlanContent: context.designWorkflowContent })
contentToProcess = remaining
contentToProcess = ''
hasProcessedContent = true
if (remaining) {
break
}
}
continue
}
@@ -1569,7 +1380,7 @@ const sseHandlers: Record<string, SSEHandler> = {
if (designStartMatch) {
const textBeforeDesign = contentToProcess.substring(0, designStartMatch.index)
if (textBeforeDesign) {
appendTextBlock(context, textBeforeDesign)
appendTextToContent(textBeforeDesign)
hasProcessedContent = true
}
context.isInDesignWorkflowBlock = true
@@ -1660,27 +1471,63 @@ const sseHandlers: Record<string, SSEHandler> = {
const endMatch = thinkingEndRegex.exec(contentToProcess)
if (endMatch) {
const thinkingContent = contentToProcess.substring(0, endMatch.index)
appendThinkingContent(context, thinkingContent)
finalizeThinkingBlock(context)
if (context.currentThinkingBlock) {
context.currentThinkingBlock.content += thinkingContent
} else {
context.currentThinkingBlock = contentBlockPool.get()
context.currentThinkingBlock.type = THINKING_BLOCK_TYPE
context.currentThinkingBlock.content = thinkingContent
context.currentThinkingBlock.timestamp = Date.now()
context.currentThinkingBlock.startTime = Date.now()
context.contentBlocks.push(context.currentThinkingBlock)
}
context.isInThinkingBlock = false
if (context.currentThinkingBlock) {
context.currentThinkingBlock.duration =
Date.now() - (context.currentThinkingBlock.startTime || Date.now())
}
context.currentThinkingBlock = null
context.currentTextBlock = null
contentToProcess = contentToProcess.substring(endMatch.index + endMatch[0].length)
hasProcessedContent = true
} else {
const { text, remaining } = splitTrailingPartialTag(contentToProcess, ['</thinking>'])
if (text) {
appendThinkingContent(context, text)
hasProcessedContent = true
}
contentToProcess = remaining
if (remaining) {
break
if (context.currentThinkingBlock) {
context.currentThinkingBlock.content += contentToProcess
} else {
context.currentThinkingBlock = contentBlockPool.get()
context.currentThinkingBlock.type = THINKING_BLOCK_TYPE
context.currentThinkingBlock.content = contentToProcess
context.currentThinkingBlock.timestamp = Date.now()
context.currentThinkingBlock.startTime = Date.now()
context.contentBlocks.push(context.currentThinkingBlock)
}
contentToProcess = ''
hasProcessedContent = true
}
} else {
const startMatch = thinkingStartRegex.exec(contentToProcess)
if (startMatch) {
const textBeforeThinking = contentToProcess.substring(0, startMatch.index)
if (textBeforeThinking) {
appendTextBlock(context, textBeforeThinking)
context.accumulatedContent.append(textBeforeThinking)
if (context.currentTextBlock && context.contentBlocks.length > 0) {
const lastBlock = context.contentBlocks[context.contentBlocks.length - 1]
if (lastBlock.type === TEXT_BLOCK_TYPE && lastBlock === context.currentTextBlock) {
lastBlock.content += textBeforeThinking
} else {
context.currentTextBlock = contentBlockPool.get()
context.currentTextBlock.type = TEXT_BLOCK_TYPE
context.currentTextBlock.content = textBeforeThinking
context.currentTextBlock.timestamp = Date.now()
context.contentBlocks.push(context.currentTextBlock)
}
} else {
context.currentTextBlock = contentBlockPool.get()
context.currentTextBlock.type = TEXT_BLOCK_TYPE
context.currentTextBlock.content = textBeforeThinking
context.currentTextBlock.timestamp = Date.now()
context.contentBlocks.push(context.currentTextBlock)
}
hasProcessedContent = true
}
context.isInThinkingBlock = true
@@ -1709,7 +1556,25 @@ const sseHandlers: Record<string, SSEHandler> = {
remaining = contentToProcess.substring(partialTagIndex)
}
if (textToAdd) {
appendTextBlock(context, textToAdd)
context.accumulatedContent.append(textToAdd)
if (context.currentTextBlock && context.contentBlocks.length > 0) {
const lastBlock = context.contentBlocks[context.contentBlocks.length - 1]
if (lastBlock.type === TEXT_BLOCK_TYPE && lastBlock === context.currentTextBlock) {
lastBlock.content += textToAdd
} else {
context.currentTextBlock = contentBlockPool.get()
context.currentTextBlock.type = TEXT_BLOCK_TYPE
context.currentTextBlock.content = textToAdd
context.currentTextBlock.timestamp = Date.now()
context.contentBlocks.push(context.currentTextBlock)
}
} else {
context.currentTextBlock = contentBlockPool.get()
context.currentTextBlock.type = TEXT_BLOCK_TYPE
context.currentTextBlock.content = textToAdd
context.currentTextBlock.timestamp = Date.now()
context.contentBlocks.push(context.currentTextBlock)
}
hasProcessedContent = true
}
contentToProcess = remaining
@@ -1747,13 +1612,37 @@ const sseHandlers: Record<string, SSEHandler> = {
stream_end: (_data, context, _get, set) => {
if (context.pendingContent) {
if (context.isInThinkingBlock && context.currentThinkingBlock) {
appendThinkingContent(context, context.pendingContent)
context.currentThinkingBlock.content += context.pendingContent
} else if (context.pendingContent.trim()) {
appendTextBlock(context, context.pendingContent)
context.accumulatedContent.append(context.pendingContent)
if (context.currentTextBlock && context.contentBlocks.length > 0) {
const lastBlock = context.contentBlocks[context.contentBlocks.length - 1]
if (lastBlock.type === TEXT_BLOCK_TYPE && lastBlock === context.currentTextBlock) {
lastBlock.content += context.pendingContent
} else {
context.currentTextBlock = contentBlockPool.get()
context.currentTextBlock.type = TEXT_BLOCK_TYPE
context.currentTextBlock.content = context.pendingContent
context.currentTextBlock.timestamp = Date.now()
context.contentBlocks.push(context.currentTextBlock)
}
} else {
context.currentTextBlock = contentBlockPool.get()
context.currentTextBlock.type = TEXT_BLOCK_TYPE
context.currentTextBlock.content = context.pendingContent
context.currentTextBlock.timestamp = Date.now()
context.contentBlocks.push(context.currentTextBlock)
}
}
context.pendingContent = ''
}
finalizeThinkingBlock(context)
if (context.currentThinkingBlock) {
context.currentThinkingBlock.duration =
Date.now() - (context.currentThinkingBlock.startTime || Date.now())
}
context.isInThinkingBlock = false
context.currentThinkingBlock = null
context.currentTextBlock = null
updateStreamingMessage(set, context)
},
default: () => {},
@@ -1851,7 +1740,29 @@ const subAgentSSEHandlers: Record<string, SSEHandler> = {
return
}
appendSubAgentText(context, parentToolCallId, data.data)
// Initialize if needed
if (!context.subAgentContent[parentToolCallId]) {
context.subAgentContent[parentToolCallId] = ''
}
if (!context.subAgentBlocks[parentToolCallId]) {
context.subAgentBlocks[parentToolCallId] = []
}
// Append content
context.subAgentContent[parentToolCallId] += data.data
// Update or create the last text block in subAgentBlocks
const blocks = context.subAgentBlocks[parentToolCallId]
const lastBlock = blocks[blocks.length - 1]
if (lastBlock && lastBlock.type === 'subagent_text') {
lastBlock.content = (lastBlock.content || '') + data.data
} else {
blocks.push({
type: 'subagent_text',
content: data.data,
timestamp: Date.now(),
})
}
updateToolCallWithSubAgentData(context, get, set, parentToolCallId)
},
@@ -1862,13 +1773,34 @@ const subAgentSSEHandlers: Record<string, SSEHandler> = {
const phase = data?.phase || data?.data?.phase
if (!parentToolCallId) return
// Initialize if needed
if (!context.subAgentContent[parentToolCallId]) {
context.subAgentContent[parentToolCallId] = ''
}
if (!context.subAgentBlocks[parentToolCallId]) {
context.subAgentBlocks[parentToolCallId] = []
}
// For reasoning, we just append the content (treating start/end as markers)
if (phase === 'start' || phase === 'end') return
const chunk = typeof data?.data === 'string' ? data.data : data?.content || ''
if (!chunk) return
appendSubAgentText(context, parentToolCallId, chunk)
context.subAgentContent[parentToolCallId] += chunk
// Update or create the last text block in subAgentBlocks
const blocks = context.subAgentBlocks[parentToolCallId]
const lastBlock = blocks[blocks.length - 1]
if (lastBlock && lastBlock.type === 'subagent_text') {
lastBlock.content = (lastBlock.content || '') + chunk
} else {
blocks.push({
type: 'subagent_text',
content: chunk,
timestamp: Date.now(),
})
}
updateToolCallWithSubAgentData(context, get, set, parentToolCallId)
},
@@ -2070,14 +2002,6 @@ const MIN_BATCH_INTERVAL = 16
const MAX_BATCH_INTERVAL = 50
const MAX_QUEUE_SIZE = 5
function stopStreamingUpdates() {
if (streamingUpdateRAF !== null) {
cancelAnimationFrame(streamingUpdateRAF)
streamingUpdateRAF = null
}
streamingUpdateQueue.clear()
}
function createOptimizedContentBlocks(contentBlocks: any[]): any[] {
const result: any[] = new Array(contentBlocks.length)
for (let i = 0; i < contentBlocks.length; i++) {
@@ -2185,7 +2109,6 @@ const initialState = {
messages: [] as CopilotMessage[],
checkpoints: [] as any[],
messageCheckpoints: {} as Record<string, any[]>,
messageSnapshots: {} as Record<string, WorkflowState>,
isLoading: false,
isLoadingChats: false,
isLoadingCheckpoints: false,
@@ -2209,7 +2132,6 @@ const initialState = {
suppressAutoSelect: false,
autoAllowedTools: [] as string[],
messageQueue: [] as import('./types').QueuedMessage[],
suppressAbortContinueOption: false,
}
export const useCopilotStore = create<CopilotStore>()(
@@ -2232,7 +2154,7 @@ export const useCopilotStore = create<CopilotStore>()(
// Abort all in-progress tools and clear any diff preview
abortAllInProgressTools(set, get)
try {
useWorkflowDiffStore.getState().clearDiff({ restoreBaseline: false })
useWorkflowDiffStore.getState().clearDiff()
} catch {}
set({
@@ -2266,7 +2188,7 @@ export const useCopilotStore = create<CopilotStore>()(
// Abort in-progress tools and clear diff when changing chats
abortAllInProgressTools(set, get)
try {
useWorkflowDiffStore.getState().clearDiff({ restoreBaseline: false })
useWorkflowDiffStore.getState().clearDiff()
} catch {}
// Restore plan content and config (mode/model) from selected chat
@@ -2359,7 +2281,7 @@ export const useCopilotStore = create<CopilotStore>()(
// Abort in-progress tools and clear diff on new chat
abortAllInProgressTools(set, get)
try {
useWorkflowDiffStore.getState().clearDiff({ restoreBaseline: false })
useWorkflowDiffStore.getState().clearDiff()
} catch {}
// Background-save the current chat before clearing (optimistic)
@@ -2562,12 +2484,6 @@ export const useCopilotStore = create<CopilotStore>()(
const userMessage = createUserMessage(message, fileAttachments, contexts, messageId)
const streamingMessage = createStreamingMessage()
const snapshot = workflowId ? buildCheckpointWorkflowState(workflowId) : null
if (snapshot) {
set((state) => ({
messageSnapshots: { ...state.messageSnapshots, [userMessage.id]: snapshot },
}))
}
let newMessages: CopilotMessage[]
if (revertState) {
@@ -2632,7 +2548,7 @@ export const useCopilotStore = create<CopilotStore>()(
}
// Call copilot API
const apiMode: CopilotTransportMode =
const apiMode: 'ask' | 'agent' | 'plan' =
mode === 'ask' ? 'ask' : mode === 'plan' ? 'plan' : 'agent'
// Extract slash commands from contexts (lowercase) and filter them out from contexts
@@ -2724,14 +2640,12 @@ export const useCopilotStore = create<CopilotStore>()(
},
// Abort streaming
abortMessage: (options?: { suppressContinueOption?: boolean }) => {
abortMessage: () => {
const { abortController, isSendingMessage, messages } = get()
if (!isSendingMessage || !abortController) return
const suppressContinueOption = options?.suppressContinueOption === true
set({ isAborting: true, suppressAbortContinueOption: suppressContinueOption })
set({ isAborting: true })
try {
abortController.abort()
stopStreamingUpdates()
const lastMessage = messages[messages.length - 1]
if (lastMessage && lastMessage.role === 'assistant') {
const textContent =
@@ -2739,19 +2653,10 @@ export const useCopilotStore = create<CopilotStore>()(
?.filter((b) => b.type === 'text')
.map((b: any) => b.content)
.join('') || ''
const nextContentBlocks = suppressContinueOption
? lastMessage.contentBlocks ?? []
: appendContinueOptionBlock(lastMessage.contentBlocks ? [...lastMessage.contentBlocks] : [])
set((state) => ({
messages: state.messages.map((msg) =>
msg.id === lastMessage.id
? {
...msg,
content: suppressContinueOption
? textContent.trim() || 'Message was aborted'
: appendContinueOption(textContent.trim() || 'Message was aborted'),
contentBlocks: nextContentBlocks,
}
? { ...msg, content: textContent.trim() || 'Message was aborted' }
: msg
),
isSendingMessage: false,
@@ -3050,10 +2955,6 @@ export const useCopilotStore = create<CopilotStore>()(
if (!workflowId) return
set({ isRevertingCheckpoint: true, checkpointError: null })
try {
const { messageCheckpoints } = get()
const checkpointMessageId = Object.entries(messageCheckpoints).find(([, cps]) =>
(cps || []).some((cp: any) => cp?.id === checkpointId)
)?.[0]
const response = await fetch('/api/copilot/checkpoints/revert', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
@@ -3099,11 +3000,6 @@ export const useCopilotStore = create<CopilotStore>()(
},
})
}
if (checkpointMessageId) {
const { messageCheckpoints: currentCheckpoints } = get()
const updatedCheckpoints = { ...currentCheckpoints, [checkpointMessageId]: [] }
set({ messageCheckpoints: updatedCheckpoints })
}
set({ isRevertingCheckpoint: false })
} catch (error) {
set({
@@ -3117,10 +3013,6 @@ export const useCopilotStore = create<CopilotStore>()(
const { messageCheckpoints } = get()
return messageCheckpoints[messageId] || []
},
saveMessageCheckpoint: async (messageId: string) => {
if (!messageId) return false
return saveMessageCheckpoint(messageId, get, set)
},
// Handle streaming response
handleStreamingResponse: async (
@@ -3168,19 +3060,7 @@ export const useCopilotStore = create<CopilotStore>()(
try {
for await (const data of parseSSEStream(reader, decoder)) {
const { abortController } = get()
if (abortController?.signal.aborted) {
context.wasAborted = true
const { suppressAbortContinueOption } = get()
context.suppressContinueOption = suppressAbortContinueOption === true
if (suppressAbortContinueOption) {
set({ suppressAbortContinueOption: false })
}
context.pendingContent = ''
finalizeThinkingBlock(context)
stopStreamingUpdates()
reader.cancel()
break
}
if (abortController?.signal.aborted) break
// Log SSE events for debugging
logger.info('[SSE] Received event', {
@@ -3280,9 +3160,7 @@ export const useCopilotStore = create<CopilotStore>()(
if (context.streamComplete) break
}
if (!context.wasAborted && sseHandlers.stream_end) {
sseHandlers.stream_end({}, context, get, set)
}
if (sseHandlers.stream_end) sseHandlers.stream_end({}, context, get, set)
if (streamingUpdateRAF !== null) {
cancelAnimationFrame(streamingUpdateRAF)
@@ -3299,9 +3177,6 @@ export const useCopilotStore = create<CopilotStore>()(
: block
)
}
if (context.wasAborted && !context.suppressContinueOption) {
sanitizedContentBlocks = appendContinueOptionBlock(sanitizedContentBlocks)
}
if (context.contentBlocks) {
context.contentBlocks.forEach((block) => {
@@ -3312,36 +3187,21 @@ export const useCopilotStore = create<CopilotStore>()(
}
const finalContent = stripTodoTags(context.accumulatedContent.toString())
const finalContentWithOptions = context.wasAborted && !context.suppressContinueOption
? appendContinueOption(finalContent)
: finalContent
set((state) => {
const snapshotId = state.currentUserMessageId
const nextSnapshots =
snapshotId && state.messageSnapshots[snapshotId]
? (() => {
const updated = { ...state.messageSnapshots }
delete updated[snapshotId]
return updated
})()
: state.messageSnapshots
return {
messages: state.messages.map((msg) =>
msg.id === assistantMessageId
? {
...msg,
content: finalContentWithOptions,
contentBlocks: sanitizedContentBlocks,
}
: msg
),
isSendingMessage: false,
isAborting: false,
abortController: null,
currentUserMessageId: null,
messageSnapshots: nextSnapshots,
}
})
set((state) => ({
messages: state.messages.map((msg) =>
msg.id === assistantMessageId
? {
...msg,
content: finalContent,
contentBlocks: sanitizedContentBlocks,
}
: msg
),
isSendingMessage: false,
isAborting: false,
abortController: null,
currentUserMessageId: null,
}))
if (context.newChatId && !get().currentChat) {
await get().handleNewChatCreation(context.newChatId)
@@ -3849,7 +3709,7 @@ export const useCopilotStore = create<CopilotStore>()(
// If currently sending, abort and send this one
const { isSendingMessage } = get()
if (isSendingMessage) {
get().abortMessage({ suppressContinueOption: true })
get().abortMessage()
// Wait a tick for abort to complete
await new Promise((resolve) => setTimeout(resolve, 50))
}

View File

@@ -1,7 +1,4 @@
import type { CopilotMode, CopilotModelId } from '@/lib/copilot/models'
export type { CopilotMode, CopilotModelId } from '@/lib/copilot/models'
import type { ClientToolCallState, ClientToolDisplay } from '@/lib/copilot/tools/client/base-tool'
import type { WorkflowState } from '@/stores/workflows/workflow/types'
export type ToolState = ClientToolCallState
@@ -94,9 +91,33 @@ import type { CopilotChat as ApiCopilotChat } from '@/lib/copilot/api'
export type CopilotChat = ApiCopilotChat
export type CopilotMode = 'ask' | 'build' | 'plan'
export interface CopilotState {
mode: CopilotMode
selectedModel: CopilotModelId
selectedModel:
| 'gpt-5-fast'
| 'gpt-5'
| 'gpt-5-medium'
| 'gpt-5-high'
| 'gpt-5.1-fast'
| 'gpt-5.1'
| 'gpt-5.1-medium'
| 'gpt-5.1-high'
| 'gpt-5-codex'
| 'gpt-5.1-codex'
| 'gpt-5.2'
| 'gpt-5.2-codex'
| 'gpt-5.2-pro'
| 'gpt-4o'
| 'gpt-4.1'
| 'o3'
| 'claude-4-sonnet'
| 'claude-4.5-haiku'
| 'claude-4.5-sonnet'
| 'claude-4.5-opus'
| 'claude-4.1-opus'
| 'gemini-3-pro'
agentPrefetch: boolean
enabledModels: string[] | null // Null means not loaded yet, array of model IDs when loaded
isCollapsed: boolean
@@ -108,7 +129,6 @@ export interface CopilotState {
checkpoints: any[]
messageCheckpoints: Record<string, any[]>
messageSnapshots: Record<string, WorkflowState>
isLoading: boolean
isLoadingChats: boolean
@@ -117,8 +137,6 @@ export interface CopilotState {
isSaving: boolean
isRevertingCheckpoint: boolean
isAborting: boolean
/** Skip adding Continue option on abort for queued send-now */
suppressAbortContinueOption?: boolean
error: string | null
saveError: string | null
@@ -179,7 +197,7 @@ export interface CopilotActions {
messageId?: string
}
) => Promise<void>
abortMessage: (options?: { suppressContinueOption?: boolean }) => void
abortMessage: () => void
sendImplicitFeedback: (
implicitFeedback: string,
toolCallState?: 'accepted' | 'rejected' | 'error'
@@ -197,7 +215,6 @@ export interface CopilotActions {
loadMessageCheckpoints: (chatId: string) => Promise<void>
revertToCheckpoint: (checkpointId: string) => Promise<void>
getCheckpointsForMessage: (messageId: string) => any[]
saveMessageCheckpoint: (messageId: string) => Promise<boolean>
clearMessages: () => void
clearError: () => void

View File

@@ -23,31 +23,6 @@ import {
const logger = createLogger('WorkflowDiffStore')
const diffEngine = new WorkflowDiffEngine()
/**
* Detects when a diff contains no meaningful changes.
*/
function isEmptyDiffAnalysis(diffAnalysis?: {
new_blocks?: string[]
edited_blocks?: string[]
deleted_blocks?: string[]
field_diffs?: Record<string, { changed_fields: string[] }>
edge_diff?: { new_edges?: string[]; deleted_edges?: string[] }
} | null): boolean {
if (!diffAnalysis) return false
const hasBlockChanges =
(diffAnalysis.new_blocks?.length || 0) > 0 ||
(diffAnalysis.edited_blocks?.length || 0) > 0 ||
(diffAnalysis.deleted_blocks?.length || 0) > 0
const hasEdgeChanges =
(diffAnalysis.edge_diff?.new_edges?.length || 0) > 0 ||
(diffAnalysis.edge_diff?.deleted_edges?.length || 0) > 0
const hasFieldChanges =
Object.values(diffAnalysis.field_diffs || {}).some(
(diff) => (diff?.changed_fields?.length || 0) > 0
)
return !hasBlockChanges && !hasEdgeChanges && !hasFieldChanges
}
export const useWorkflowDiffStore = create<WorkflowDiffState & WorkflowDiffActions>()(
devtools(
(set, get) => {
@@ -100,24 +75,6 @@ export const useWorkflowDiffStore = create<WorkflowDiffState & WorkflowDiffActio
throw new Error(errorMessage)
}
const diffAnalysisResult = diffResult.diff.diffAnalysis || null
if (isEmptyDiffAnalysis(diffAnalysisResult)) {
logger.info('No workflow diff detected; skipping diff view')
diffEngine.clearDiff()
batchedUpdate({
hasActiveDiff: false,
isShowingDiff: false,
isDiffReady: false,
baselineWorkflow: null,
baselineWorkflowId: null,
diffAnalysis: null,
diffMetadata: null,
diffError: null,
_triggerMessageId: null,
})
return
}
const candidateState = diffResult.diff.proposedState
// Validate proposed workflow using serializer round-trip
@@ -146,22 +103,12 @@ export const useWorkflowDiffStore = create<WorkflowDiffState & WorkflowDiffActio
isDiffReady: true,
baselineWorkflow: baselineWorkflow,
baselineWorkflowId,
diffAnalysis: diffAnalysisResult,
diffAnalysis: diffResult.diff.diffAnalysis || null,
diffMetadata: diffResult.diff.metadata,
diffError: null,
_triggerMessageId: triggerMessageId ?? null,
})
if (triggerMessageId) {
import('@/stores/panel/copilot/store')
.then(({ useCopilotStore }) =>
useCopilotStore.getState().saveMessageCheckpoint(triggerMessageId)
)
.catch((error) => {
logger.warn('Failed to save checkpoint for diff', { error })
})
}
logger.info('Workflow diff applied optimistically', {
workflowId: activeWorkflowId,
blocks: Object.keys(candidateState.blocks || {}).length,