Compare commits

..

20 Commits

Author SHA1 Message Date
Waleed
fdca73679d v0.5.93: NextJS config changes, MCP and Blocks whitelisting, copilot keyboard shortcuts, audit logs 2026-02-18 12:10:05 -08:00
Waleed
86ca984926 fix(normalization): update allowed integrations checks to be fully lowercase (#3248) 2026-02-18 12:08:03 -08:00
Emir Karabeg
e3964624ac feat(sub): hide usage limits and seats info from enterprise members (non-admin) (#3243)
- Add isEnterpriseMember and canViewUsageInfo flags to subscription permissions
- Hide UsageHeader, CreditBalance, billing date, and usage notifications from enterprise members
- Show only plan name in subscription tab for enterprise members (non-admin)
- Hide usage indicator details (amount, progress pills) from enterprise members
- Team tab already hidden via requiresTeam check in settings modal

Closes #6882

Co-authored-by: Cursor Agent <cursoragent@cursor.com>
Co-authored-by: Emir Karabeg <emir-karabeg@users.noreply.github.com>
2026-02-18 12:01:47 -08:00
Waleed
7c7c0fd955 feat(audit-log): add audit events for templates, billing, credentials, env, deployments, passwords (#3246)
* feat(audit-log): add audit events for templates, billing, credentials, env, deployments, passwords

* improvement(audit-log): add actorName/actorEmail to all recordAudit calls

* fix(audit-log): resolve user for password reset, add CREDENTIAL_SET_INVITATION_RESENT action

* fix(audit-log): add workspaceId to deployment activation audit

* improvement(audit-log): use better-auth callback for password reset audit, remove cast

- Move password reset audit to onPasswordReset callback in auth config
  instead of coupling to better-auth's verification table internals
- Remove ugly double-cast on workflowData.workspaceId in deployment activation

* fix(audit-log): add missing actorName/actorEmail to workflow duplicate

* improvement(audit-log): add resourceName to credential set invitation accept
2026-02-18 11:53:08 -08:00
Waleed
e37b4a926d feat(audit-log): add persistent audit log system with comprehensive route instrumentation (#3242)
* feat(audit-log): add persistent audit log system with comprehensive route instrumentation

* fix(audit-log): address PR review — nullable workspaceId, enum usage, remove redundant queries

- Make audit_log.workspace_id nullable with ON DELETE SET NULL (logs survive workspace/user deletion)
- Make audit_log.actor_id nullable with ON DELETE SET NULL
- Replace all 53 routes' string literal action/resourceType with AuditAction.X and AuditResourceType.X enums
- Fix empty workspaceId ('') → null for OAuth, form, and org routes to avoid FK violations
- Remove redundant DB queries in chat manage route (use checkChatAccess return data)
- Fix organization routes to pass workspaceId: null instead of organizationId

* fix(audit-log): replace remaining workspaceId '' fallbacks with null

* fix(audit-log): credential-set org IDs, workspace deletion FK, actorId fallback, string literal action

* reran migrations

* fix(mcp,audit): tighten env var domain bypass, add post-resolution check, form workspaceId

- Only bypass MCP domain check when env var is in hostname/authority, not path/query
- Add post-resolution validateMcpDomain call in test-connection endpoint
- Match client-side isDomainAllowed to same hostname-only bypass logic
- Return workspaceId from checkFormAccess, use in form audit logs
- Add 49 comprehensive domain-check tests covering all edge cases

* fix(mcp): stateful regex lastIndex bug, RFC 3986 authority parsing

- Remove /g flag from module-level ENV_VAR_PATTERN to avoid lastIndex state
- Create fresh regex instances per call in server-side hasEnvVarInHostname
- Fix authority extraction to terminate at /, ?, or # per RFC 3986
- Prevents bypass via https://evil.com?token={{SECRET}} (no path)
- Add test cases for query-only and fragment-only env var URLs (53 total)

* fix(audit-log): try/catch for never-throw contract, accept null actorName/Email, fix misleading action

- Wrap recordAudit body in try/catch so nanoid() or header extraction can't throw
- Accept string | null for actorName and actorEmail (session.user.name can be null)
- Normalize null -> undefined before insert to match DB column types
- Fix org members route: ORG_MEMBER_ADDED -> ORG_INVITATION_CREATED (sends invite, not adds member)

* improvement(audit-log): add resource names and specific invitation actions

* fix(audit-log): use validated chat record, add mock sync tests
2026-02-18 00:54:52 -08:00
Waleed
11f3a14c02 fix(lock): prevent socket crash when locking agent blocks (#3245) 2026-02-18 00:32:09 -08:00
Emir Karabeg
eab01e0272 fix(copilot): copilot shortcut conflict (#3219)
* fix: prevent copilot keyboard shortcuts from triggering when panel is inactive

The OptionsSelector component was capturing keyboard events (1-9 number keys and Enter)
globally on the document, causing accidental option selections when users were
interacting with other parts of the application.

This fix adds a check to only handle keyboard shortcuts when the copilot panel
is the active tab, preventing the shortcuts from interfering with other workflows.

Co-authored-by: Emir Karabeg <emir-karabeg@users.noreply.github.com>

* lint

---------

Co-authored-by: Cursor Agent <cursoragent@cursor.com>
Co-authored-by: Emir Karabeg <emir-karabeg@users.noreply.github.com>
Co-authored-by: Waleed Latif <walif6@gmail.com>
2026-02-17 18:47:07 -08:00
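A minimal sketch of the kind of guard described in that commit, assuming a hypothetical hook with an `activeTab` value and option-select callbacks (names are illustrative, not the actual OptionsSelector code):

```typescript
import { useEffect } from 'react'

// Illustrative only: `activeTab` and the callbacks are assumed parameters,
// not the real component props.
function useOptionShortcuts(
  activeTab: string,
  selectOption: (index: number) => void,
  confirmSelection: () => void
) {
  useEffect(() => {
    const handleKeyDown = (event: KeyboardEvent) => {
      // Bail out unless the copilot panel is the active tab.
      if (activeTab !== 'copilot') return
      if (event.key === 'Enter') {
        confirmSelection()
      } else if (/^[1-9]$/.test(event.key)) {
        selectOption(Number(event.key) - 1)
      }
    }
    document.addEventListener('keydown', handleKeyDown)
    return () => document.removeEventListener('keydown', handleKeyDown)
  }, [activeTab, selectOption, confirmSelection])
}
```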
Waleed
bbcef7ce5c feat(access-control): add ALLOWED_INTEGRATIONS env var for self-hosted block restrictions (#3238)
* feat(access-control): add ALLOWED_INTEGRATIONS env var for self-hosted block restrictions

* fix(tests): add getAllowedIntegrationsFromEnv mock to agent-handler tests

* fix(access-control): add auth to allowlist endpoint, fix loading state race, use accurate error message

* fix(access-control): remove auth from allowed-integrations endpoint to match models endpoint pattern

* fix(access-control): normalize blockType to lowercase before env allowlist check

* fix(access-control): expose merged allowedIntegrations on config to prevent bypass via direct access

* consolidate merging of allowed blocks so all callers have it by default

* normalize to lower case

* added tests

* added tests, normalize to lower case

* added safety in case userId is missing

* fix failing tests
2026-02-17 18:46:24 -08:00
Emir Karabeg
0ee52df5a7 feat(canvas): allow locked block outbound connections (#3229)
* Allow outbound connections from locked blocks to be modified

- Modified isEdgeProtected to only check target block protection
- Outbound connections (from locked blocks) can now be added/removed
- Inbound connections (to locked blocks) remain protected
- Updated notification messages and comments to reflect the change

Co-authored-by: Emir Karabeg <emir-karabeg@users.noreply.github.com>

* update notif msg

---------

Co-authored-by: Cursor Agent <cursoragent@cursor.com>
Co-authored-by: Emir Karabeg <emir-karabeg@users.noreply.github.com>
Co-authored-by: waleed <walif6@gmail.com>
2026-02-17 18:16:17 -08:00
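A rough sketch of the check described in that commit (the signature and names are illustrative, not the actual helper):

```typescript
// Illustrative: an edge is protected only when its target block is locked,
// so outbound edges from locked blocks stay editable while inbound edges do not.
function isEdgeProtected(
  edge: { source: string; target: string },
  lockedBlockIds: Set<string>
): boolean {
  return lockedBlockIds.has(edge.target)
}
```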
Waleed
6421b1a0ca feat(mcp): add ALLOWED_MCP_DOMAINS env var for domain allowlist (#3240)
* feat(mcp): add ALLOWED_MCP_DOMAINS env var for domain allowlist

* ack PR comments

* cleanup
2026-02-17 18:01:52 -08:00
Waleed
61a5c98717 fix(shortlink): use redirect instead of rewrite for Beluga tracking (#3239) 2026-02-17 16:27:20 -08:00
Waleed
da46a387c9 v0.5.92: shortlinks, copilot scrolling stickiness, pagination 2026-02-17 15:13:21 -08:00
Waleed
a0afb5d03e feat(pipedrive): added sort order to endpoints that support it, upgraded turborepo (#3237)
* feat(pipedrive): added sort order to endpoints that support it

* upgraded turborepo

* fix
2026-02-17 14:58:54 -08:00
Waleed
cdacb796a8 improvement(providers): replace @ts-ignore with typed ProviderError class (#3235) 2026-02-17 14:20:31 -08:00
Waleed
3ce54147e6 fix(pagination): add missing next_page to response interfaces and operator comments (#3236) 2026-02-17 14:13:45 -08:00
Waleed
08690b2906 feat(pagination): update pagination for remaining integrations that support it (#3233)
* feat(pagination): update pagination for remaining integrations that support it

* fixed remaining

* ack comments
2026-02-17 13:34:46 -08:00
Waleed
299cc26694 improvement(lint): fix react-doctor errors and warnings (#3232)
* improvement(lint): fix react-doctor errors and warnings

* remove separators
2026-02-17 11:40:47 -08:00
Emir Karabeg
48715ff013 improvement(copilot): scrolling stickiness (#3218)
- Changed default stickinessThreshold from 100 to 30 in use-scroll-management.ts
- Removed explicit stickinessThreshold override (40) from copilot.tsx
- Both copilot and chat now use the same default value of 30
- This makes scrolling less sticky across all copilot message interactions

Co-authored-by: Cursor Agent <cursoragent@cursor.com>
Co-authored-by: Emir Karabeg <emir-karabeg@users.noreply.github.com>
2026-02-17 10:33:10 -08:00
Waleed
ad0d0ed1f1 feat(shortlink): add Beluga short link rewrite for hosted campaigns (#3231) 2026-02-17 10:32:32 -08:00
Waleed
b7e377ec4b v0.5.91: docs i18n, turborepo upgrade 2026-02-16 00:36:05 -08:00
293 changed files with 4519 additions and 11872 deletions

View File

@@ -1,261 +0,0 @@
---
description: Add a knowledge base connector for syncing documents from an external source
argument-hint: <service-name> [api-docs-url]
---
# Add Connector Skill
You are an expert at adding knowledge base connectors to Sim. A connector syncs documents from an external source (Confluence, Google Drive, Notion, etc.) into a knowledge base.
## Your Task
When the user asks you to create a connector:
1. Use Context7 or WebFetch to read the service's API documentation
2. Create the connector directory and config
3. Register it in the connector registry
## Directory Structure
Create files in `apps/sim/connectors/{service}/`:
```
connectors/{service}/
├── index.ts # Barrel export
└── {service}.ts # ConnectorConfig definition
```
## ConnectorConfig Structure
```typescript
import { createLogger } from '@sim/logger'
import { {Service}Icon } from '@/components/icons'
import { fetchWithRetry } from '@/lib/knowledge/documents/utils'
import type { ConnectorConfig, ExternalDocument, ExternalDocumentList } from '@/connectors/types'
const logger = createLogger('{Service}Connector')
export const {service}Connector: ConnectorConfig = {
  id: '{service}',
  name: '{Service}',
  description: 'Sync documents from {Service} into your knowledge base',
  version: '1.0.0',
  icon: {Service}Icon,
  oauth: {
    required: true,
    provider: '{service}', // Must match OAuthService in lib/oauth/types.ts
    requiredScopes: ['read:...'],
  },
  configFields: [
    // Rendered dynamically by the add-connector modal UI
    // Supports 'short-input' and 'dropdown' types
  ],
  listDocuments: async (accessToken, sourceConfig, cursor) => {
    // Paginate via cursor, extract text, compute SHA-256 hash
    // Return { documents: ExternalDocument[], nextCursor?, hasMore }
  },
  getDocument: async (accessToken, sourceConfig, externalId) => {
    // Return ExternalDocument or null
  },
  validateConfig: async (accessToken, sourceConfig) => {
    // Return { valid: true } or { valid: false, error: 'message' }
  },
  // Optional: map source metadata to semantic tag keys (translated to slots by sync engine)
  mapTags: (metadata) => {
    // Return Record<string, unknown> with keys matching tagDefinitions[].id
  },
}
```
## ConfigField Types
The add-connector modal renders these automatically — no custom UI needed.
```typescript
// Text input
{
  id: 'domain',
  title: 'Domain',
  type: 'short-input',
  placeholder: 'yoursite.example.com',
  required: true,
}

// Dropdown (static options)
{
  id: 'contentType',
  title: 'Content Type',
  type: 'dropdown',
  required: false,
  options: [
    { label: 'Pages only', id: 'page' },
    { label: 'Blog posts only', id: 'blogpost' },
    { label: 'All content', id: 'all' },
  ],
}
```
## ExternalDocument Shape
Every document returned from `listDocuments`/`getDocument` must include:
```typescript
{
  externalId: string                  // Source-specific unique ID
  title: string                       // Document title
  content: string                     // Extracted plain text
  mimeType: 'text/plain'              // Always text/plain (content is extracted)
  contentHash: string                 // SHA-256 of content (change detection)
  sourceUrl?: string                  // Link back to original (stored on document record)
  metadata?: Record<string, unknown>  // Source-specific data (fed to mapTags)
}
```
## Content Hashing (Required)
The sync engine uses content hashes for change detection:
```typescript
async function computeContentHash(content: string): Promise<string> {
  const data = new TextEncoder().encode(content)
  const hashBuffer = await crypto.subtle.digest('SHA-256', data)
  return Array.from(new Uint8Array(hashBuffer))
    .map((b) => b.toString(16).padStart(2, '0'))
    .join('')
}
```
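Putting these pieces together, a minimal `listDocuments` sketch could look like the following. The endpoint path, response fields, and the `domain` config key are illustrative assumptions rather than any specific service's API; it reuses `fetchWithRetry` and the `computeContentHash` helper shown above.
```typescript
// Illustrative sketch: endpoint, response shape, and config keys are assumptions.
listDocuments: async (accessToken, sourceConfig, cursor) => {
  const domain = String(sourceConfig.domain ?? '')
  const base = `https://${domain}/api/pages`
  const url = cursor ? `${base}?cursor=${encodeURIComponent(cursor)}` : base

  const response = await fetchWithRetry(url, {
    method: 'GET',
    headers: { Authorization: `Bearer ${accessToken}` },
  })
  if (!response.ok) {
    throw new Error(`Failed to list pages: ${response.status}`)
  }

  const data = await response.json()
  type Page = { id: string; title: string; body: string; webUrl: string; labels?: string[] }

  const documents = await Promise.all(
    (data.results as Page[]).map(async (page) => ({
      externalId: page.id,
      title: page.title,
      content: page.body,
      mimeType: 'text/plain' as const,
      contentHash: await computeContentHash(page.body),
      sourceUrl: page.webUrl, // always a full URL (see the sourceUrl section below)
      metadata: { labels: page.labels ?? [] },
    }))
  )

  return {
    documents,
    nextCursor: data.nextCursor ?? undefined,
    hasMore: Boolean(data.nextCursor),
  }
},
```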
## tagDefinitions — Declared Tag Definitions
Declare which tags the connector populates using semantic IDs. These are shown in the add-connector modal as opt-out checkboxes.
On connector creation, slots are **dynamically assigned** via `getNextAvailableSlot` — connectors never hardcode slot names.
```typescript
tagDefinitions: [
  { id: 'labels', displayName: 'Labels', fieldType: 'text' },
  { id: 'version', displayName: 'Version', fieldType: 'number' },
  { id: 'lastModified', displayName: 'Last Modified', fieldType: 'date' },
],
```
Each entry has:
- `id`: Semantic key matching a key returned by `mapTags` (e.g. `'labels'`, `'version'`)
- `displayName`: Human-readable name shown in the UI (e.g. "Labels", "Last Modified")
- `fieldType`: `'text'` | `'number'` | `'date'` | `'boolean'` — determines which slot pool to draw from
Users can opt out of specific tags in the modal. Disabled IDs are stored in `sourceConfig.disabledTagIds`.
The assigned mapping (`semantic id → slot`) is stored in `sourceConfig.tagSlotMapping`.
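For example, a stored `sourceConfig` might end up looking roughly like this (the exact slot names depend on what `getNextAvailableSlot` assigns, so the values here are only illustrative):
```typescript
// Illustrative only: slots are assigned dynamically at connector creation time.
const sourceConfig = {
  domain: 'yoursite.example.com',
  disabledTagIds: ['lastModified'], // tags the user unchecked in the add-connector modal
  tagSlotMapping: {
    labels: 'tag1',     // assigned from the text slot pool
    version: 'number1', // assigned from the number slot pool
  },
}
```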
## mapTags — Metadata to Semantic Keys
Maps source metadata to semantic tag keys. Required if `tagDefinitions` is set.
The sync engine calls this automatically and translates semantic keys to actual DB slots
using the `tagSlotMapping` stored on the connector.
Return keys must match the `id` values declared in `tagDefinitions`.
```typescript
mapTags: (metadata: Record<string, unknown>): Record<string, unknown> => {
  const result: Record<string, unknown> = {}

  // Validate arrays before casting — metadata may be malformed
  const labels = Array.isArray(metadata.labels) ? (metadata.labels as string[]) : []
  if (labels.length > 0) result.labels = labels.join(', ')

  // Validate numbers — guard against NaN
  if (metadata.version != null) {
    const num = Number(metadata.version)
    if (!Number.isNaN(num)) result.version = num
  }

  // Validate dates — guard against Invalid Date
  if (typeof metadata.lastModified === 'string') {
    const date = new Date(metadata.lastModified)
    if (!Number.isNaN(date.getTime())) result.lastModified = date
  }

  return result
}
```
## External API Calls — Use `fetchWithRetry`
All external API calls must use `fetchWithRetry` from `@/lib/knowledge/documents/utils` instead of raw `fetch()`. This provides exponential backoff with retries on 429/502/503/504 errors. It returns a standard `Response` — all `.ok`, `.json()`, `.text()` checks work unchanged.
For `validateConfig` (user-facing, called on save), pass `VALIDATE_RETRY_OPTIONS` to cap wait time at ~7s. Background operations (`listDocuments`, `getDocument`) use the built-in defaults (5 retries, ~31s max).
```typescript
import { VALIDATE_RETRY_OPTIONS, fetchWithRetry } from '@/lib/knowledge/documents/utils'
// Background sync — use defaults
const response = await fetchWithRetry(url, {
  method: 'GET',
  headers: { Authorization: `Bearer ${accessToken}` },
})

// validateConfig — tighter retry budget
const response = await fetchWithRetry(url, { ... }, VALIDATE_RETRY_OPTIONS)
```
## sourceUrl
If `ExternalDocument.sourceUrl` is set, the sync engine stores it on the document record. Always construct the full URL (not a relative path).
## Sync Engine Behavior (Do Not Modify)
The sync engine (`lib/knowledge/connectors/sync-engine.ts`) is connector-agnostic. It:
1. Calls `listDocuments` with pagination until `hasMore` is false
2. Compares `contentHash` to detect new/changed/unchanged documents
3. Stores `sourceUrl` and calls `mapTags` on insert/update automatically
4. Handles soft-delete of removed documents
You never need to modify the sync engine when adding a connector.
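For intuition, here is a rough, hypothetical sketch of the loop the engine runs; the helper parameters are stand-ins for this example, not the real code in `lib/knowledge/connectors/sync-engine.ts`, and the `sourceConfig` type is assumed.
```typescript
import type { ConnectorConfig, ExternalDocument } from '@/connectors/types'

// Hypothetical simplification of the engine's pagination and change-detection loop.
async function syncSketch(
  connector: ConnectorConfig,
  accessToken: string,
  sourceConfig: Record<string, unknown>,
  existingByExternalId: Map<string, { contentHash: string }>,
  insertDocument: (doc: ExternalDocument) => Promise<void>,
  updateDocument: (doc: ExternalDocument) => Promise<void>
): Promise<void> {
  let cursor: string | undefined
  let hasMore = true
  while (hasMore) {
    const page = await connector.listDocuments(accessToken, sourceConfig, cursor)
    for (const doc of page.documents) {
      const existing = existingByExternalId.get(doc.externalId)
      if (!existing) {
        await insertDocument(doc) // new document: sourceUrl stored, mapTags applied
      } else if (existing.contentHash !== doc.contentHash) {
        await updateDocument(doc) // changed document: contentHash differs
      }
      // unchanged documents (same contentHash) are skipped
    }
    cursor = page.nextCursor
    hasMore = page.hasMore && cursor !== undefined
  }
  // documents no longer present at the source are soft-deleted afterwards
}
```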
## OAuth Credential Reuse
Connectors reuse the existing OAuth infrastructure. The `oauth.provider` must match an `OAuthService` from `apps/sim/lib/oauth/types.ts`. Check existing providers before adding a new one.
## Icon
The `icon` field on `ConnectorConfig` is used throughout the UI — in the connector list, the add-connector modal, and as the document icon in the knowledge base table (replacing the generic file type icon for connector-sourced documents). The icon is read from `CONNECTOR_REGISTRY[connectorType].icon` at runtime — no separate icon map to maintain.
If the service already has an icon in `apps/sim/components/icons.tsx` (from a tool integration), reuse it. Otherwise, ask the user to provide the SVG.
## Registering
Add one line to `apps/sim/connectors/registry.ts`:
```typescript
import { {service}Connector } from '@/connectors/{service}'

export const CONNECTOR_REGISTRY: ConnectorRegistry = {
  // ... existing connectors ...
  {service}: {service}Connector,
}
```
## Reference Implementation
See `apps/sim/connectors/confluence/confluence.ts` for a complete example with:
- Multiple config field types (text + dropdown)
- Label fetching and CQL search filtering
- Blogpost + page content types
- `mapTags` mapping labels, version, and dates to semantic keys
## Checklist
- [ ] Created `connectors/{service}/{service}.ts` with full ConnectorConfig
- [ ] Created `connectors/{service}/index.ts` barrel export
- [ ] `oauth.provider` matches an existing OAuthService in `lib/oauth/types.ts`
- [ ] `listDocuments` handles pagination and computes content hashes
- [ ] `sourceUrl` set on each ExternalDocument (full URL, not relative)
- [ ] `metadata` includes source-specific data for tag mapping
- [ ] `tagDefinitions` declared for each semantic key returned by `mapTags`
- [ ] `mapTags` implemented if source has useful metadata (labels, dates, versions)
- [ ] `validateConfig` verifies the source is accessible
- [ ] All external API calls use `fetchWithRetry` (not raw `fetch`)
- [ ] All optional config fields validated in `validateConfig`
- [ ] Icon exists in `components/icons.tsx` (or asked user to provide SVG)
- [ ] Registered in `connectors/registry.ts`

View File

@@ -59,12 +59,6 @@ body {
--content-gap: 1.75rem;
}
/* Remove custom layout variable overrides to fallback to fumadocs defaults */
/* ============================================
Navbar Light Mode Styling
============================================ */
/* Light mode navbar and search styling */
:root:not(.dark) nav {
background-color: hsla(0, 0%, 96%, 0.85) !important;
@@ -88,10 +82,6 @@ body {
-webkit-backdrop-filter: blur(25px) saturate(180%) brightness(0.6) !important;
}
/* ============================================
Custom Sidebar Styling (Turborepo-inspired)
============================================ */
/* Floating sidebar appearance - remove background */
[data-sidebar-container],
#nd-sidebar {
@@ -468,10 +458,6 @@ aside[data-sidebar],
writing-mode: horizontal-tb !important;
}
/* ============================================
Code Block Styling (Improved)
============================================ */
/* Apply Geist Mono to code elements */
code,
pre,
@@ -532,10 +518,6 @@ pre code .line {
color: var(--color-fd-primary);
}
/* ============================================
TOC (Table of Contents) Styling
============================================ */
/* Remove the thin border-left on nested TOC items (keeps main indicator only) */
#nd-toc a[style*="padding-inline-start"] {
border-left: none !important;
@@ -554,10 +536,6 @@ main article,
padding-bottom: 4rem;
}
/* ============================================
Center and Constrain Main Content Width
============================================ */
/* Main content area - center and constrain like turborepo/raindrop */
/* Note: --sidebar-offset and --toc-offset are now applied at #nd-docs-layout level */
main[data-main] {

View File

@@ -130,37 +130,4 @@ Update multiple existing records in an Airtable table
| `records` | json | Array of updated Airtable records |
| `metadata` | json | Operation metadata including record count and updated record IDs |
### `airtable_list_bases`
List all bases the authenticated user has access to
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `bases` | json | Array of Airtable bases with id, name, and permissionLevel |
| `metadata` | json | Operation metadata including total bases count |
### `airtable_get_base_schema`
Get the schema of all tables, fields, and views in an Airtable base
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `baseId` | string | Yes | Airtable base ID \(starts with "app", e.g., "appXXXXXXXXXXXXXX"\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `tables` | json | Array of table schemas with fields and views |
| `metadata` | json | Operation metadata including total tables count |

View File

@@ -234,7 +234,6 @@ List actions from incident.io. Optionally filter by incident ID.
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | incident.io API Key |
| `incident_id` | string | No | Filter actions by incident ID \(e.g., "01FCNDV6P870EA6S7TK1DSYDG0"\) |
| `page_size` | number | No | Number of actions to return per page \(e.g., 10, 25, 50\) |
#### Output
@@ -309,7 +308,6 @@ List follow-ups from incident.io. Optionally filter by incident ID.
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | incident.io API Key |
| `incident_id` | string | No | Filter follow-ups by incident ID \(e.g., "01FCNDV6P870EA6S7TK1DSYDG0"\) |
| `page_size` | number | No | Number of follow-ups to return per page \(e.g., 10, 25, 50\) |
#### Output
@@ -396,6 +394,7 @@ List all users in your Incident.io workspace. Returns user details including id,
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | Incident.io API Key |
| `page_size` | number | No | Number of results to return per page \(e.g., 10, 25, 50\). Default: 25 |
| `after` | string | No | Pagination cursor to fetch the next page of results |
#### Output
@@ -406,6 +405,10 @@ List all users in your Incident.io workspace. Returns user details including id,
| ↳ `name` | string | Full name of the user |
| ↳ `email` | string | Email address of the user |
| ↳ `role` | string | Role of the user in the workspace |
| `pagination_meta` | object | Pagination metadata |
| ↳ `after` | string | Cursor for next page |
| ↳ `page_size` | number | Number of items per page |
| ↳ `total_record_count` | number | Total number of records |
### `incidentio_users_show`
@@ -644,7 +647,6 @@ List all escalation policies in incident.io
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `apiKey` | string | Yes | incident.io API Key |
| `page_size` | number | No | Number of results per page \(e.g., 10, 25, 50\). Default: 25 |
#### Output

View File

@@ -29,7 +29,7 @@ In Sim, the Knowledge Base block enables your agents to perform intelligent sema
## Usage Instructions
Integrate Knowledge into the workflow. Perform full CRUD operations on documents, chunks, and tags.
Integrate Knowledge into the workflow. Can search, upload chunks, and create documents.
@@ -126,161 +126,4 @@ Create a new document in a knowledge base
| `message` | string | Success or error message describing the operation result |
| `documentId` | string | ID of the created document |
### `knowledge_list_tags`
List all tag definitions for a knowledge base
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `knowledgeBaseId` | string | Yes | ID of the knowledge base to list tags for |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `knowledgeBaseId` | string | ID of the knowledge base |
| `tags` | array | Array of tag definitions for the knowledge base |
| ↳ `id` | string | Tag definition ID |
| ↳ `tagSlot` | string | Internal tag slot \(e.g. tag1, number1\) |
| ↳ `displayName` | string | Human-readable tag name |
| ↳ `fieldType` | string | Tag field type \(text, number, date, boolean\) |
| ↳ `createdAt` | string | Creation timestamp |
| ↳ `updatedAt` | string | Last update timestamp |
| `totalTags` | number | Total number of tag definitions |
### `knowledge_list_documents`
List documents in a knowledge base with optional filtering, search, and pagination
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `knowledgeBaseId` | string | Yes | ID of the knowledge base to list documents from |
| `search` | string | No | Search query to filter documents by filename |
| `enabledFilter` | string | No | Filter by enabled status: "all", "enabled", or "disabled" |
| `limit` | number | No | Maximum number of documents to return \(default: 50\) |
| `offset` | number | No | Number of documents to skip for pagination \(default: 0\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `knowledgeBaseId` | string | ID of the knowledge base |
| `documents` | array | Array of documents in the knowledge base |
| ↳ `id` | string | Document ID |
| ↳ `filename` | string | Document filename |
| ↳ `fileSize` | number | File size in bytes |
| ↳ `mimeType` | string | MIME type of the document |
| ↳ `enabled` | boolean | Whether the document is enabled |
| ↳ `processingStatus` | string | Processing status \(pending, processing, completed, failed\) |
| ↳ `chunkCount` | number | Number of chunks in the document |
| ↳ `tokenCount` | number | Total token count across chunks |
| ↳ `uploadedAt` | string | Upload timestamp |
| ↳ `updatedAt` | string | Last update timestamp |
| `totalDocuments` | number | Total number of documents matching the filter |
| `limit` | number | Page size used |
| `offset` | number | Offset used for pagination |
### `knowledge_delete_document`
Delete a document from a knowledge base
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `knowledgeBaseId` | string | Yes | ID of the knowledge base containing the document |
| `documentId` | string | Yes | ID of the document to delete |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `documentId` | string | ID of the deleted document |
| `message` | string | Confirmation message |
### `knowledge_list_chunks`
List chunks for a document in a knowledge base with optional filtering and pagination
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `knowledgeBaseId` | string | Yes | ID of the knowledge base |
| `documentId` | string | Yes | ID of the document to list chunks from |
| `search` | string | No | Search query to filter chunks by content |
| `enabled` | string | No | Filter by enabled status: "true", "false", or "all" \(default: "all"\) |
| `limit` | number | No | Maximum number of chunks to return \(1-100, default: 50\) |
| `offset` | number | No | Number of chunks to skip for pagination \(default: 0\) |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `knowledgeBaseId` | string | ID of the knowledge base |
| `documentId` | string | ID of the document |
| `chunks` | array | Array of chunks in the document |
| ↳ `id` | string | Chunk ID |
| ↳ `chunkIndex` | number | Index of the chunk within the document |
| ↳ `content` | string | Chunk text content |
| ↳ `contentLength` | number | Content length in characters |
| ↳ `tokenCount` | number | Token count for the chunk |
| ↳ `enabled` | boolean | Whether the chunk is enabled |
| ↳ `createdAt` | string | Creation timestamp |
| ↳ `updatedAt` | string | Last update timestamp |
| `totalChunks` | number | Total number of chunks matching the filter |
| `limit` | number | Page size used |
| `offset` | number | Offset used for pagination |
### `knowledge_update_chunk`
Update the content or enabled status of a chunk in a knowledge base
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `knowledgeBaseId` | string | Yes | ID of the knowledge base |
| `documentId` | string | Yes | ID of the document containing the chunk |
| `chunkId` | string | Yes | ID of the chunk to update |
| `content` | string | No | New content for the chunk |
| `enabled` | boolean | No | Whether the chunk should be enabled or disabled |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `documentId` | string | ID of the parent document |
| `id` | string | Chunk ID |
| `chunkIndex` | number | Index of the chunk within the document |
| `content` | string | Updated chunk content |
| `contentLength` | number | Content length in characters |
| `tokenCount` | number | Token count for the chunk |
| `enabled` | boolean | Whether the chunk is enabled |
| `updatedAt` | string | Last update timestamp |
### `knowledge_delete_chunk`
Delete a chunk from a document in a knowledge base
#### Input
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `knowledgeBaseId` | string | Yes | ID of the knowledge base |
| `documentId` | string | Yes | ID of the document containing the chunk |
| `chunkId` | string | Yes | ID of the chunk to delete |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `chunkId` | string | ID of the deleted chunk |
| `documentId` | string | ID of the parent document |
| `message` | string | Confirmation message |

View File

@@ -49,6 +49,7 @@ Retrieve all deals from Pipedrive with optional filters
| `pipeline_id` | string | No | If supplied, only deals in the specified pipeline are returned \(e.g., "1"\) |
| `updated_since` | string | No | If set, only deals updated after this time are returned. Format: 2025-01-01T10:20:00Z |
| `limit` | string | No | Number of results to return \(e.g., "50", default: 100, max: 500\) |
| `cursor` | string | No | For pagination, the marker representing the first item on the next page |
#### Output
@@ -74,6 +75,8 @@ Retrieve all deals from Pipedrive with optional filters
| `metadata` | object | Pagination metadata for the response |
| ↳ `total_items` | number | Total number of items |
| ↳ `has_more` | boolean | Whether more items are available |
| ↳ `next_cursor` | string | Cursor for fetching the next page \(v2 endpoints\) |
| ↳ `next_start` | number | Offset for fetching the next page \(v1 endpoints\) |
| `success` | boolean | Operation success status |
### `pipedrive_get_deal`
@@ -148,10 +151,9 @@ Retrieve files from Pipedrive with optional filters
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `deal_id` | string | No | Filter files by deal ID \(e.g., "123"\) |
| `person_id` | string | No | Filter files by person ID \(e.g., "456"\) |
| `org_id` | string | No | Filter files by organization ID \(e.g., "789"\) |
| `limit` | string | No | Number of results to return \(e.g., "50", default: 100, max: 500\) |
| `sort` | string | No | Sort files by field \(supported: "id", "update_time"\) |
| `limit` | string | No | Number of results to return \(e.g., "50", default: 100, max: 100\) |
| `start` | string | No | Pagination start offset \(0-based index of the first item to return\) |
| `downloadFiles` | boolean | No | Download file contents into file outputs |
#### Output
@@ -171,6 +173,8 @@ Retrieve files from Pipedrive with optional filters
| ↳ `url` | string | File download URL |
| `downloadedFiles` | file[] | Downloaded files from Pipedrive |
| `total_items` | number | Total number of files returned |
| `has_more` | boolean | Whether more files are available |
| `next_start` | number | Offset for fetching the next page |
| `success` | boolean | Operation success status |
### `pipedrive_get_mail_messages`
@@ -183,6 +187,7 @@ Retrieve mail threads from Pipedrive mailbox
| --------- | ---- | -------- | ----------- |
| `folder` | string | No | Filter by folder: inbox, drafts, sent, archive \(default: inbox\) |
| `limit` | string | No | Number of results to return \(e.g., "25", default: 50\) |
| `start` | string | No | Pagination start offset \(0-based index of the first item to return\) |
#### Output
@@ -190,6 +195,8 @@ Retrieve mail threads from Pipedrive mailbox
| --------- | ---- | ----------- |
| `messages` | array | Array of mail thread objects from Pipedrive mailbox |
| `total_items` | number | Total number of mail threads returned |
| `has_more` | boolean | Whether more messages are available |
| `next_start` | number | Offset for fetching the next page |
| `success` | boolean | Operation success status |
### `pipedrive_get_mail_thread`
@@ -221,7 +228,7 @@ Retrieve all pipelines from Pipedrive
| `sort_by` | string | No | Field to sort by: id, update_time, add_time \(default: id\) |
| `sort_direction` | string | No | Sorting direction: asc, desc \(default: asc\) |
| `limit` | string | No | Number of results to return \(e.g., "50", default: 100, max: 500\) |
| `cursor` | string | No | For pagination, the marker representing the first item on the next page |
| `start` | string | No | Pagination start offset \(0-based index of the first item to return\) |
#### Output
@@ -237,6 +244,8 @@ Retrieve all pipelines from Pipedrive
| ↳ `add_time` | string | When the pipeline was created |
| ↳ `update_time` | string | When the pipeline was last updated |
| `total_items` | number | Total number of pipelines returned |
| `has_more` | boolean | Whether more pipelines are available |
| `next_start` | number | Offset for fetching the next page |
| `success` | boolean | Operation success status |
### `pipedrive_get_pipeline_deals`
@@ -249,8 +258,8 @@ Retrieve all deals in a specific pipeline
| --------- | ---- | -------- | ----------- |
| `pipeline_id` | string | Yes | The ID of the pipeline \(e.g., "1"\) |
| `stage_id` | string | No | Filter by specific stage within the pipeline \(e.g., "2"\) |
| `status` | string | No | Filter by deal status: open, won, lost |
| `limit` | string | No | Number of results to return \(e.g., "50", default: 100, max: 500\) |
| `start` | string | No | Pagination start offset \(0-based index of the first item to return\) |
#### Output
@@ -271,6 +280,7 @@ Retrieve all projects or a specific project from Pipedrive
| `project_id` | string | No | Optional: ID of a specific project to retrieve \(e.g., "123"\) |
| `status` | string | No | Filter by project status: open, completed, deleted \(only for listing all\) |
| `limit` | string | No | Number of results to return \(e.g., "50", default: 100, max: 500, only for listing all\) |
| `cursor` | string | No | For pagination, the marker representing the first item on the next page |
#### Output
@@ -279,6 +289,8 @@ Retrieve all projects or a specific project from Pipedrive
| `projects` | array | Array of project objects \(when listing all\) |
| `project` | object | Single project object \(when project_id is provided\) |
| `total_items` | number | Total number of projects returned |
| `has_more` | boolean | Whether more projects are available |
| `next_cursor` | string | Cursor for fetching the next page |
| `success` | boolean | Operation success status |
### `pipedrive_create_project`
@@ -309,12 +321,11 @@ Retrieve activities (tasks) from Pipedrive with optional filters
| Parameter | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `deal_id` | string | No | Filter activities by deal ID \(e.g., "123"\) |
| `person_id` | string | No | Filter activities by person ID \(e.g., "456"\) |
| `org_id` | string | No | Filter activities by organization ID \(e.g., "789"\) |
| `user_id` | string | No | Filter activities by user ID \(e.g., "123"\) |
| `type` | string | No | Filter by activity type \(call, meeting, task, deadline, email, lunch\) |
| `done` | string | No | Filter by completion status: 0 for not done, 1 for done |
| `limit` | string | No | Number of results to return \(e.g., "50", default: 100, max: 500\) |
| `start` | string | No | Pagination start offset \(0-based index of the first item to return\) |
#### Output
@@ -335,6 +346,8 @@ Retrieve activities (tasks) from Pipedrive with optional filters
| ↳ `add_time` | string | When the activity was created |
| ↳ `update_time` | string | When the activity was last updated |
| `total_items` | number | Total number of activities returned |
| `has_more` | boolean | Whether more activities are available |
| `next_start` | number | Offset for fetching the next page |
| `success` | boolean | Operation success status |
### `pipedrive_create_activity`
@@ -399,6 +412,7 @@ Retrieve all leads or a specific lead from Pipedrive
| `person_id` | string | No | Filter by person ID \(e.g., "456"\) |
| `organization_id` | string | No | Filter by organization ID \(e.g., "789"\) |
| `limit` | string | No | Number of results to return \(e.g., "50", default: 100, max: 500\) |
| `start` | string | No | Pagination start offset \(0-based index of the first item to return\) |
#### Output
@@ -433,6 +447,8 @@ Retrieve all leads or a specific lead from Pipedrive
| ↳ `add_time` | string | When the lead was created \(ISO 8601\) |
| ↳ `update_time` | string | When the lead was last updated \(ISO 8601\) |
| `total_items` | number | Total number of leads returned |
| `has_more` | boolean | Whether more leads are available |
| `next_start` | number | Offset for fetching the next page |
| `success` | boolean | Operation success status |
### `pipedrive_create_lead`

View File

@@ -57,6 +57,7 @@ Query data from a Supabase table
| `filter` | string | No | PostgREST filter \(e.g., "id=eq.123"\) |
| `orderBy` | string | No | Column to order by \(add DESC for descending\) |
| `limit` | number | No | Maximum number of rows to return |
| `offset` | number | No | Number of rows to skip \(for pagination\) |
| `apiKey` | string | Yes | Your Supabase service role secret key |
#### Output
@@ -211,6 +212,7 @@ Perform full-text search on a Supabase table
| `searchType` | string | No | Search type: plain, phrase, or websearch \(default: websearch\) |
| `language` | string | No | Language for text search configuration \(default: english\) |
| `limit` | number | No | Maximum number of rows to return |
| `offset` | number | No | Number of rows to skip \(for pagination\) |
| `apiKey` | string | Yes | Your Supabase service role secret key |
#### Output

View File

@@ -43,6 +43,8 @@ Retrieve form responses from Typeform
| `formId` | string | Yes | Typeform form ID \(e.g., "abc123XYZ"\) |
| `apiKey` | string | Yes | Typeform Personal Access Token |
| `pageSize` | number | No | Number of responses to retrieve \(e.g., 10, 25, 50\) |
| `before` | string | No | Cursor token for fetching the next page of older responses |
| `after` | string | No | Cursor token for fetching the next page of newer responses |
| `since` | string | No | Retrieve responses submitted after this date \(e.g., "2024-01-01T00:00:00Z"\) |
| `until` | string | No | Retrieve responses submitted before this date \(e.g., "2024-12-31T23:59:59Z"\) |
| `completed` | string | No | Filter by completion status \(e.g., "true", "false", "all"\) |

View File

@@ -67,10 +67,9 @@ Retrieve a list of tickets from Zendesk with optional filtering
| `type` | string | No | Filter by type: "problem", "incident", "question", or "task" |
| `assigneeId` | string | No | Filter by assignee user ID as a numeric string \(e.g., "12345"\) |
| `organizationId` | string | No | Filter by organization ID as a numeric string \(e.g., "67890"\) |
| `sortBy` | string | No | Sort field: "created_at", "updated_at", "priority", or "status" |
| `sortOrder` | string | No | Sort order: "asc" or "desc" |
| `sort` | string | No | Sort field for ticket listing \(only applies without filters\): "updated_at", "id", or "status". Prefix with "-" for descending \(e.g., "-updated_at"\) |
| `perPage` | string | No | Results per page as a number string \(default: "100", max: "100"\) |
| `page` | string | No | Page number as a string \(e.g., "1", "2"\) |
| `pageAfter` | string | No | Cursor from a previous response to fetch the next page of results |
#### Output
@@ -129,10 +128,10 @@ Retrieve a list of tickets from Zendesk with optional filtering
| ↳ `from_messaging_channel` | boolean | Whether the ticket originated from a messaging channel |
| ↳ `ticket_form_id` | number | Ticket form ID |
| ↳ `generated_timestamp` | number | Unix timestamp of the ticket generation |
| `paging` | object | Pagination information |
| `paging` | object | Cursor-based pagination information |
| ↳ `after_cursor` | string | Cursor for fetching the next page of results |
| ↳ `has_more` | boolean | Whether more results are available |
| ↳ `next_page` | string | URL for next page of results |
| ↳ `previous_page` | string | URL for previous page of results |
| ↳ `count` | number | Total count of items |
| `metadata` | object | Response metadata |
| ↳ `total_returned` | number | Number of items returned in this response |
| ↳ `has_more` | boolean | Whether more items are available |
@@ -515,7 +514,7 @@ Retrieve a list of users from Zendesk with optional filtering
| `role` | string | No | Filter by role: "end-user", "agent", or "admin" |
| `permissionSet` | string | No | Filter by permission set ID as a numeric string \(e.g., "12345"\) |
| `perPage` | string | No | Results per page as a number string \(default: "100", max: "100"\) |
| `page` | string | No | Page number as a string \(e.g., "1", "2"\) |
| `pageAfter` | string | No | Cursor from a previous response to fetch the next page of results |
#### Output
@@ -563,10 +562,10 @@ Retrieve a list of users from Zendesk with optional filtering
| ↳ `shared` | boolean | Whether the user is shared from a different Zendesk |
| ↳ `shared_agent` | boolean | Whether the agent is shared from a different Zendesk |
| ↳ `remote_photo_url` | string | URL to a remote photo |
| `paging` | object | Pagination information |
| `paging` | object | Cursor-based pagination information |
| ↳ `after_cursor` | string | Cursor for fetching the next page of results |
| ↳ `has_more` | boolean | Whether more results are available |
| ↳ `next_page` | string | URL for next page of results |
| ↳ `previous_page` | string | URL for previous page of results |
| ↳ `count` | number | Total count of items |
| `metadata` | object | Response metadata |
| ↳ `total_returned` | number | Number of items returned in this response |
| ↳ `has_more` | boolean | Whether more items are available |
@@ -706,7 +705,7 @@ Search for users in Zendesk using a query string
| `query` | string | No | Search query string \(e.g., user name or email\) |
| `externalId` | string | No | External ID to search by \(your system identifier\) |
| `perPage` | string | No | Results per page as a number string \(default: "100", max: "100"\) |
| `page` | string | No | Page number as a string \(e.g., "1", "2"\) |
| `page` | string | No | Page number for pagination \(1-based\) |
#### Output
@@ -754,10 +753,10 @@ Search for users in Zendesk using a query string
| ↳ `shared` | boolean | Whether the user is shared from a different Zendesk |
| ↳ `shared_agent` | boolean | Whether the agent is shared from a different Zendesk |
| ↳ `remote_photo_url` | string | URL to a remote photo |
| `paging` | object | Pagination information |
| `paging` | object | Cursor-based pagination information |
| ↳ `after_cursor` | string | Cursor for fetching the next page of results |
| ↳ `has_more` | boolean | Whether more results are available |
| ↳ `next_page` | string | URL for next page of results |
| ↳ `previous_page` | string | URL for previous page of results |
| ↳ `count` | number | Total count of items |
| `metadata` | object | Response metadata |
| ↳ `total_returned` | number | Number of items returned in this response |
| ↳ `has_more` | boolean | Whether more items are available |
@@ -999,7 +998,7 @@ Retrieve a list of organizations from Zendesk
| `apiToken` | string | Yes | Zendesk API token |
| `subdomain` | string | Yes | Your Zendesk subdomain \(e.g., "mycompany" for mycompany.zendesk.com\) |
| `perPage` | string | No | Results per page as a number string \(default: "100", max: "100"\) |
| `page` | string | No | Page number as a string \(e.g., "1", "2"\) |
| `pageAfter` | string | No | Cursor from a previous response to fetch the next page of results |
#### Output
@@ -1020,10 +1019,10 @@ Retrieve a list of organizations from Zendesk
| ↳ `created_at` | string | When the organization was created \(ISO 8601 format\) |
| ↳ `updated_at` | string | When the organization was last updated \(ISO 8601 format\) |
| ↳ `external_id` | string | External ID for linking to external records |
| `paging` | object | Pagination information |
| `paging` | object | Cursor-based pagination information |
| ↳ `after_cursor` | string | Cursor for fetching the next page of results |
| ↳ `has_more` | boolean | Whether more results are available |
| ↳ `next_page` | string | URL for next page of results |
| ↳ `previous_page` | string | URL for previous page of results |
| ↳ `count` | number | Total count of items |
| `metadata` | object | Response metadata |
| ↳ `total_returned` | number | Number of items returned in this response |
| ↳ `has_more` | boolean | Whether more items are available |
@@ -1075,7 +1074,7 @@ Autocomplete organizations in Zendesk by name prefix (for name matching/autocomp
| `subdomain` | string | Yes | Your Zendesk subdomain |
| `name` | string | Yes | Organization name prefix to search for \(e.g., "Acme"\) |
| `perPage` | string | No | Results per page as a number string \(default: "100", max: "100"\) |
| `page` | string | No | Page number as a string \(e.g., "1", "2"\) |
| `page` | string | No | Page number for pagination \(1-based\) |
#### Output
@@ -1096,10 +1095,10 @@ Autocomplete organizations in Zendesk by name prefix (for name matching/autocomp
| ↳ `created_at` | string | When the organization was created \(ISO 8601 format\) |
| ↳ `updated_at` | string | When the organization was last updated \(ISO 8601 format\) |
| ↳ `external_id` | string | External ID for linking to external records |
| `paging` | object | Pagination information |
| `paging` | object | Cursor-based pagination information |
| ↳ `after_cursor` | string | Cursor for fetching the next page of results |
| ↳ `has_more` | boolean | Whether more results are available |
| ↳ `next_page` | string | URL for next page of results |
| ↳ `previous_page` | string | URL for previous page of results |
| ↳ `count` | number | Total count of items |
| `metadata` | object | Response metadata |
| ↳ `total_returned` | number | Number of items returned in this response |
| ↳ `has_more` | boolean | Whether more items are available |
@@ -1249,19 +1248,18 @@ Unified search across tickets, users, and organizations in Zendesk
| `apiToken` | string | Yes | Zendesk API token |
| `subdomain` | string | Yes | Your Zendesk subdomain |
| `query` | string | Yes | Search query string using Zendesk search syntax \(e.g., "type:ticket status:open"\) |
| `sortBy` | string | No | Sort field: "relevance", "created_at", "updated_at", "priority", "status", or "ticket_type" |
| `sortOrder` | string | No | Sort order: "asc" or "desc" |
| `filterType` | string | Yes | Resource type to search for: "ticket", "user", "organization", or "group" |
| `perPage` | string | No | Results per page as a number string \(default: "100", max: "100"\) |
| `page` | string | No | Page number as a string \(e.g., "1", "2"\) |
| `pageAfter` | string | No | Cursor from a previous response to fetch the next page of results |
#### Output
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `paging` | object | Pagination information |
| `paging` | object | Cursor-based pagination information |
| ↳ `after_cursor` | string | Cursor for fetching the next page of results |
| ↳ `has_more` | boolean | Whether more results are available |
| ↳ `next_page` | string | URL for next page of results |
| ↳ `previous_page` | string | URL for previous page of results |
| ↳ `count` | number | Total count of items |
| `metadata` | object | Response metadata |
| ↳ `total_returned` | number | Number of items returned in this response |
| ↳ `has_more` | boolean | Whether more items are available |

View File

@@ -1,5 +1,3 @@
'use server'
import { env } from '@/lib/core/config/env'
import { isProd } from '@/lib/core/config/feature-flags'

View File

@@ -85,7 +85,7 @@ export const LandingNode = React.memo(function LandingNode({ data }: { data: Lan
transform: isAnimated ? 'translateY(0) scale(1)' : 'translateY(8px) scale(0.98)',
transition:
'opacity 0.6s cubic-bezier(0.22, 1, 0.36, 1), transform 0.6s cubic-bezier(0.22, 1, 0.36, 1)',
willChange: 'transform, opacity',
willChange: isAnimated ? 'auto' : 'transform, opacity',
}}
>
<LandingBlock icon={data.icon} color={data.color} name={data.name} tags={data.tags} />

View File

@@ -67,7 +67,6 @@ export const LandingEdge = React.memo(function LandingEdge(props: EdgeProps) {
strokeLinejoin: 'round',
pointerEvents: 'none',
animation: `landing-edge-dash-${id} 1s linear infinite`,
willChange: 'stroke-dashoffset',
...style,
}}
/>

View File

@@ -754,3 +754,100 @@ input[type="search"]::-ms-clear {
text-decoration: none !important;
color: inherit !important;
}
/**
 * Respect user's prefers-reduced-motion setting (WCAG 2.3.3)
 * Disables animations and transitions for users who prefer reduced motion.
 */
@media (prefers-reduced-motion: reduce) {
  *,
  *::before,
  *::after {
    animation-duration: 0.01ms !important;
    animation-iteration-count: 1 !important;
    transition-duration: 0.01ms !important;
    scroll-behavior: auto !important;
  }
}
/* WandPromptBar status indicator */
@keyframes smoke-pulse {
  0%,
  100% {
    transform: scale(0.8);
    opacity: 0.4;
  }
  50% {
    transform: scale(1.1);
    opacity: 0.8;
  }
}
.status-indicator {
  position: relative;
  width: 12px;
  height: 12px;
  border-radius: 50%;
  overflow: hidden;
  background-color: hsl(var(--muted-foreground) / 0.5);
  transition: background-color 0.3s ease;
}
.status-indicator.streaming {
  background-color: transparent;
}
.status-indicator.streaming::before {
  content: "";
  position: absolute;
  inset: 0;
  border-radius: 50%;
  background: radial-gradient(
    circle,
    hsl(var(--primary) / 0.9) 0%,
    hsl(var(--primary) / 0.4) 60%,
    transparent 80%
  );
  animation: smoke-pulse 1.8s ease-in-out infinite;
  opacity: 0.9;
}
.dark .status-indicator.streaming::before {
  background: #6b7280;
  opacity: 0.9;
  animation: smoke-pulse 1.8s ease-in-out infinite;
}
/* MessageContainer loading dot */
@keyframes growShrink {
  0%,
  100% {
    transform: scale(0.9);
  }
  50% {
    transform: scale(1.1);
  }
}
.loading-dot {
  animation: growShrink 1.5s infinite ease-in-out;
}
/* Subflow node z-index and drag-over styles */
.workflow-container .react-flow__node-subflowNode {
  z-index: -1 !important;
}
.workflow-container .react-flow__node-subflowNode:has([data-subflow-selected="true"]) {
  z-index: 10 !important;
}
.loop-node-drag-over,
.parallel-node-drag-over {
  box-shadow: 0 0 0 1.75px var(--brand-secondary) !important;
  border-radius: 8px !important;
}
.react-flow__node[data-parent-node-id] .react-flow__handle {
  z-index: 30;
}

View File

@@ -3,7 +3,7 @@
*
* @vitest-environment node
*/
import { createMockLogger, createMockRequest } from '@sim/testing'
import { auditMock, createMockLogger, createMockRequest } from '@sim/testing'
import { afterEach, beforeEach, describe, expect, it, vi } from 'vitest'
describe('OAuth Disconnect API Route', () => {
@@ -67,6 +67,8 @@ describe('OAuth Disconnect API Route', () => {
vi.doMock('@/lib/webhooks/utils.server', () => ({
syncAllWebhooksForCredentialSet: mockSyncAllWebhooksForCredentialSet,
}))
vi.doMock('@/lib/audit/log', () => auditMock)
})
afterEach(() => {

View File

@@ -4,6 +4,7 @@ import { createLogger } from '@sim/logger'
import { and, eq, like, or } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { getSession } from '@/lib/auth'
import { generateRequestId } from '@/lib/core/utils/request'
import { syncAllWebhooksForCredentialSet } from '@/lib/webhooks/utils.server'
@@ -118,6 +119,20 @@ export async function POST(request: NextRequest) {
}
}
recordAudit({
workspaceId: null,
actorId: session.user.id,
action: AuditAction.OAUTH_DISCONNECTED,
resourceType: AuditResourceType.OAUTH,
resourceId: providerId ?? provider,
actorName: session.user.name ?? undefined,
actorEmail: session.user.email ?? undefined,
resourceName: provider,
description: `Disconnected OAuth provider: ${provider}`,
metadata: { provider, providerId },
request,
})
return NextResponse.json({ success: true }, { status: 200 })
} catch (error) {
logger.error(`[${requestId}] Error disconnecting OAuth provider`, error)

View File

@@ -1,6 +1,7 @@
import { createLogger } from '@sim/logger'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { getSession } from '@/lib/auth'
import { getCreditBalance } from '@/lib/billing/credits/balance'
import { purchaseCredits } from '@/lib/billing/credits/purchase'
@@ -57,6 +58,17 @@ export async function POST(request: NextRequest) {
return NextResponse.json({ error: result.error }, { status: 400 })
}
recordAudit({
actorId: session.user.id,
actorName: session.user.name,
actorEmail: session.user.email,
action: AuditAction.CREDIT_PURCHASED,
resourceType: AuditResourceType.BILLING,
description: `Purchased $${validation.data.amount} in credits`,
metadata: { amount: validation.data.amount, requestId: validation.data.requestId },
request,
})
return NextResponse.json({ success: true })
} catch (error) {
logger.error('Failed to purchase credits', { error, userId: session.user.id })

View File

@@ -3,10 +3,12 @@
*
* @vitest-environment node
*/
import { loggerMock } from '@sim/testing'
import { auditMock, loggerMock } from '@sim/testing'
import { NextRequest } from 'next/server'
import { afterEach, beforeEach, describe, expect, it, vi } from 'vitest'
vi.mock('@/lib/audit/log', () => auditMock)
vi.mock('@/lib/core/config/feature-flags', () => ({
isDev: true,
isHosted: false,
@@ -216,8 +218,11 @@ describe('Chat Edit API Route', () => {
workflowId: 'workflow-123',
}
mockCheckChatAccess.mockResolvedValue({ hasAccess: true, chat: mockChat })
mockLimit.mockResolvedValueOnce([]) // No identifier conflict
mockCheckChatAccess.mockResolvedValue({
hasAccess: true,
chat: mockChat,
workspaceId: 'workspace-123',
})
const req = new NextRequest('http://localhost:3000/api/chat/manage/chat-123', {
method: 'PATCH',
@@ -311,8 +316,11 @@ describe('Chat Edit API Route', () => {
workflowId: 'workflow-123',
}
mockCheckChatAccess.mockResolvedValue({ hasAccess: true, chat: mockChat })
mockLimit.mockResolvedValueOnce([])
mockCheckChatAccess.mockResolvedValue({
hasAccess: true,
chat: mockChat,
workspaceId: 'workspace-123',
})
const req = new NextRequest('http://localhost:3000/api/chat/manage/chat-123', {
method: 'PATCH',
@@ -371,8 +379,11 @@ describe('Chat Edit API Route', () => {
}),
}))
mockCheckChatAccess.mockResolvedValue({ hasAccess: true })
mockWhere.mockResolvedValue(undefined)
mockCheckChatAccess.mockResolvedValue({
hasAccess: true,
chat: { title: 'Test Chat', workflowId: 'workflow-123' },
workspaceId: 'workspace-123',
})
const req = new NextRequest('http://localhost:3000/api/chat/manage/chat-123', {
method: 'DELETE',
@@ -393,8 +404,11 @@ describe('Chat Edit API Route', () => {
}),
}))
mockCheckChatAccess.mockResolvedValue({ hasAccess: true })
mockWhere.mockResolvedValue(undefined)
mockCheckChatAccess.mockResolvedValue({
hasAccess: true,
chat: { title: 'Test Chat', workflowId: 'workflow-123' },
workspaceId: 'workspace-123',
})
const req = new NextRequest('http://localhost:3000/api/chat/manage/chat-123', {
method: 'DELETE',

View File

@@ -4,6 +4,7 @@ import { createLogger } from '@sim/logger'
import { eq } from 'drizzle-orm'
import type { NextRequest } from 'next/server'
import { z } from 'zod'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { getSession } from '@/lib/auth'
import { isDev } from '@/lib/core/config/feature-flags'
import { encryptSecret } from '@/lib/core/security/encryption'
@@ -103,7 +104,11 @@ export async function PATCH(request: NextRequest, { params }: { params: Promise<
try {
const validatedData = chatUpdateSchema.parse(body)
const { hasAccess, chat: existingChatRecord } = await checkChatAccess(chatId, session.user.id)
const {
hasAccess,
chat: existingChatRecord,
workspaceId: chatWorkspaceId,
} = await checkChatAccess(chatId, session.user.id)
if (!hasAccess || !existingChatRecord) {
return createErrorResponse('Chat not found or access denied', 404)
@@ -217,6 +222,19 @@ export async function PATCH(request: NextRequest, { params }: { params: Promise<
logger.info(`Chat "${chatId}" updated successfully`)
recordAudit({
workspaceId: chatWorkspaceId || null,
actorId: session.user.id,
actorName: session.user.name,
actorEmail: session.user.email,
action: AuditAction.CHAT_UPDATED,
resourceType: AuditResourceType.CHAT,
resourceId: chatId,
resourceName: title || existingChatRecord.title,
description: `Updated chat deployment "${title || existingChatRecord.title}"`,
request,
})
return createSuccessResponse({
id: chatId,
chatUrl,
@@ -252,7 +270,11 @@ export async function DELETE(
return createErrorResponse('Unauthorized', 401)
}
const { hasAccess } = await checkChatAccess(chatId, session.user.id)
const {
hasAccess,
chat: chatRecord,
workspaceId: chatWorkspaceId,
} = await checkChatAccess(chatId, session.user.id)
if (!hasAccess) {
return createErrorResponse('Chat not found or access denied', 404)
@@ -262,6 +284,19 @@ export async function DELETE(
logger.info(`Chat "${chatId}" deleted successfully`)
recordAudit({
workspaceId: chatWorkspaceId || null,
actorId: session.user.id,
actorName: session.user.name,
actorEmail: session.user.email,
action: AuditAction.CHAT_DELETED,
resourceType: AuditResourceType.CHAT,
resourceId: chatId,
resourceName: chatRecord?.title || chatId,
description: `Deleted chat deployment "${chatRecord?.title || chatId}"`,
request: _request,
})
return createSuccessResponse({
message: 'Chat deployment deleted successfully',
})

View File

@@ -1,9 +1,10 @@
import { NextRequest } from 'next/server'
/**
* Tests for chat API route
*
* @vitest-environment node
*/
import { auditMock } from '@sim/testing'
import { NextRequest } from 'next/server'
import { afterEach, beforeEach, describe, expect, it, vi } from 'vitest'
describe('Chat API Route', () => {
@@ -30,6 +31,8 @@ describe('Chat API Route', () => {
mockInsert.mockReturnValue({ values: mockValues })
mockValues.mockReturnValue({ returning: mockReturning })
vi.doMock('@/lib/audit/log', () => auditMock)
vi.doMock('@sim/db', () => ({
db: {
select: mockSelect,

View File

@@ -5,6 +5,7 @@ import { eq } from 'drizzle-orm'
import type { NextRequest } from 'next/server'
import { v4 as uuidv4 } from 'uuid'
import { z } from 'zod'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { getSession } from '@/lib/auth'
import { isDev } from '@/lib/core/config/feature-flags'
import { encryptSecret } from '@/lib/core/security/encryption'
@@ -42,7 +43,7 @@ const chatSchema = z.object({
.default([]),
})
export async function GET(request: NextRequest) {
export async function GET(_request: NextRequest) {
try {
const session = await getSession()
@@ -174,7 +175,7 @@ export async function POST(request: NextRequest) {
userId: session.user.id,
identifier,
title,
description: description || '',
description: description || null,
customizations: mergedCustomizations,
isActive: true,
authType,
@@ -224,6 +225,20 @@ export async function POST(request: NextRequest) {
// Silently fail
}
recordAudit({
workspaceId: workflowRecord.workspaceId || null,
actorId: session.user.id,
actorName: session.user.name,
actorEmail: session.user.email,
action: AuditAction.CHAT_DEPLOYED,
resourceType: AuditResourceType.CHAT,
resourceId: id,
resourceName: title,
description: `Deployed chat "${title}"`,
metadata: { workflowId, identifier, authType },
request,
})
return createSuccessResponse({
id,
chatUrl,

View File

@@ -52,7 +52,7 @@ export async function checkWorkflowAccessForChatCreation(
export async function checkChatAccess(
chatId: string,
userId: string
): Promise<{ hasAccess: boolean; chat?: any }> {
): Promise<{ hasAccess: boolean; chat?: any; workspaceId?: string }> {
const chatData = await db
.select({
chat: chat,
@@ -78,7 +78,9 @@ export async function checkChatAccess(
action: 'admin',
})
return authorization.allowed ? { hasAccess: true, chat: chatRecord } : { hasAccess: false }
return authorization.allowed
? { hasAccess: true, chat: chatRecord, workspaceId: workflowWorkspaceId }
: { hasAccess: false }
}
export async function validateChatAuth(

View File

@@ -4,6 +4,7 @@ import { createLogger } from '@sim/logger'
import { and, eq } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { getEmailSubject, renderPollingGroupInvitationEmail } from '@/components/emails'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { getSession } from '@/lib/auth'
import { hasCredentialSetsAccess } from '@/lib/billing'
import { getBaseUrl } from '@/lib/core/utils/urls'
@@ -148,6 +149,19 @@ export async function POST(
userId: session.user.id,
})
recordAudit({
actorId: session.user.id,
actorName: session.user.name,
actorEmail: session.user.email,
action: AuditAction.CREDENTIAL_SET_INVITATION_RESENT,
resourceType: AuditResourceType.CREDENTIAL_SET,
resourceId: id,
resourceName: result.set.name,
description: `Resent credential set invitation to ${invitation.email}`,
metadata: { invitationId, email: invitation.email },
request: req,
})
return NextResponse.json({ success: true })
} catch (error) {
logger.error('Error resending invitation', error)

View File

@@ -5,6 +5,7 @@ import { and, eq } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { getEmailSubject, renderPollingGroupInvitationEmail } from '@/components/emails'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { getSession } from '@/lib/auth'
import { hasCredentialSetsAccess } from '@/lib/billing'
import { getBaseUrl } from '@/lib/core/utils/urls'
@@ -175,6 +176,19 @@ export async function POST(req: NextRequest, { params }: { params: Promise<{ id:
emailSent: !!email,
})
recordAudit({
workspaceId: null,
actorId: session.user.id,
action: AuditAction.CREDENTIAL_SET_INVITATION_CREATED,
resourceType: AuditResourceType.CREDENTIAL_SET,
resourceId: id,
actorName: session.user.name ?? undefined,
actorEmail: session.user.email ?? undefined,
resourceName: result.set.name,
description: `Created invitation for credential set "${result.set.name}"${email ? ` to ${email}` : ''}`,
request: req,
})
return NextResponse.json({
invitation: {
...invitation,
@@ -235,6 +249,19 @@ export async function DELETE(req: NextRequest, { params }: { params: Promise<{ i
)
)
recordAudit({
workspaceId: null,
actorId: session.user.id,
action: AuditAction.CREDENTIAL_SET_INVITATION_REVOKED,
resourceType: AuditResourceType.CREDENTIAL_SET,
resourceId: id,
actorName: session.user.name ?? undefined,
actorEmail: session.user.email ?? undefined,
resourceName: result.set.name,
description: `Revoked invitation "${invitationId}" for credential set "${result.set.name}"`,
request: req,
})
return NextResponse.json({ success: true })
} catch (error) {
logger.error('Error cancelling invitation', error)

View File

@@ -3,6 +3,7 @@ import { account, credentialSet, credentialSetMember, member, user } from '@sim/
import { createLogger } from '@sim/logger'
import { and, eq, inArray } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { getSession } from '@/lib/auth'
import { hasCredentialSetsAccess } from '@/lib/billing'
import { syncAllWebhooksForCredentialSet } from '@/lib/webhooks/utils.server'
@@ -13,6 +14,7 @@ async function getCredentialSetWithAccess(credentialSetId: string, userId: strin
const [set] = await db
.select({
id: credentialSet.id,
name: credentialSet.name,
organizationId: credentialSet.organizationId,
providerId: credentialSet.providerId,
})
@@ -177,6 +179,19 @@ export async function DELETE(req: NextRequest, { params }: { params: Promise<{ i
userId: session.user.id,
})
recordAudit({
workspaceId: null,
actorId: session.user.id,
action: AuditAction.CREDENTIAL_SET_MEMBER_REMOVED,
resourceType: AuditResourceType.CREDENTIAL_SET,
resourceId: id,
actorName: session.user.name ?? undefined,
actorEmail: session.user.email ?? undefined,
resourceName: result.set.name,
description: `Removed member from credential set "${result.set.name}"`,
request: req,
})
return NextResponse.json({ success: true })
} catch (error) {
logger.error('Error removing member from credential set', error)

View File

@@ -4,6 +4,7 @@ import { createLogger } from '@sim/logger'
import { and, eq } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { getSession } from '@/lib/auth'
import { hasCredentialSetsAccess } from '@/lib/billing'
@@ -131,6 +132,19 @@ export async function PUT(req: NextRequest, { params }: { params: Promise<{ id:
const [updated] = await db.select().from(credentialSet).where(eq(credentialSet.id, id)).limit(1)
recordAudit({
workspaceId: null,
actorId: session.user.id,
action: AuditAction.CREDENTIAL_SET_UPDATED,
resourceType: AuditResourceType.CREDENTIAL_SET,
resourceId: id,
actorName: session.user.name ?? undefined,
actorEmail: session.user.email ?? undefined,
resourceName: updated?.name ?? result.set.name,
description: `Updated credential set "${updated?.name ?? result.set.name}"`,
request: req,
})
return NextResponse.json({ credentialSet: updated })
} catch (error) {
if (error instanceof z.ZodError) {
@@ -175,6 +189,19 @@ export async function DELETE(req: NextRequest, { params }: { params: Promise<{ i
logger.info('Deleted credential set', { credentialSetId: id, userId: session.user.id })
recordAudit({
workspaceId: null,
actorId: session.user.id,
action: AuditAction.CREDENTIAL_SET_DELETED,
resourceType: AuditResourceType.CREDENTIAL_SET,
resourceId: id,
actorName: session.user.name ?? undefined,
actorEmail: session.user.email ?? undefined,
resourceName: result.set.name,
description: `Deleted credential set "${result.set.name}"`,
request: req,
})
return NextResponse.json({ success: true })
} catch (error) {
logger.error('Error deleting credential set', error)

View File

@@ -8,6 +8,7 @@ import {
import { createLogger } from '@sim/logger'
import { and, eq } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { getSession } from '@/lib/auth'
import { syncAllWebhooksForCredentialSet } from '@/lib/webhooks/utils.server'
@@ -78,6 +79,7 @@ export async function POST(req: NextRequest, { params }: { params: Promise<{ tok
status: credentialSetInvitation.status,
expiresAt: credentialSetInvitation.expiresAt,
invitedBy: credentialSetInvitation.invitedBy,
credentialSetName: credentialSet.name,
providerId: credentialSet.providerId,
})
.from(credentialSetInvitation)
@@ -125,7 +127,6 @@ export async function POST(req: NextRequest, { params }: { params: Promise<{ tok
const now = new Date()
const requestId = crypto.randomUUID().slice(0, 8)
// Use transaction to ensure membership + invitation update + webhook sync are atomic
await db.transaction(async (tx) => {
await tx.insert(credentialSetMember).values({
id: crypto.randomUUID(),
@@ -147,8 +148,6 @@ export async function POST(req: NextRequest, { params }: { params: Promise<{ tok
})
.where(eq(credentialSetInvitation.id, invitation.id))
// Clean up all other pending invitations for the same credential set and email
// This prevents duplicate invites from showing up after accepting one
if (invitation.email) {
await tx
.update(credentialSetInvitation)
@@ -166,7 +165,6 @@ export async function POST(req: NextRequest, { params }: { params: Promise<{ tok
)
}
// Sync webhooks within the transaction
const syncResult = await syncAllWebhooksForCredentialSet(
invitation.credentialSetId,
requestId,
@@ -184,6 +182,19 @@ export async function POST(req: NextRequest, { params }: { params: Promise<{ tok
userId: session.user.id,
})
recordAudit({
actorId: session.user.id,
actorName: session.user.name,
actorEmail: session.user.email,
action: AuditAction.CREDENTIAL_SET_INVITATION_ACCEPTED,
resourceType: AuditResourceType.CREDENTIAL_SET,
resourceId: invitation.credentialSetId,
resourceName: invitation.credentialSetName,
description: `Accepted credential set invitation`,
metadata: { invitationId: invitation.id },
request: req,
})
return NextResponse.json({
success: true,
credentialSetId: invitation.credentialSetId,

View File

@@ -3,6 +3,7 @@ import { credentialSet, credentialSetMember, organization } from '@sim/db/schema
import { createLogger } from '@sim/logger'
import { and, eq } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { getSession } from '@/lib/auth'
import { syncAllWebhooksForCredentialSet } from '@/lib/webhooks/utils.server'
@@ -106,6 +107,17 @@ export async function DELETE(req: NextRequest) {
userId: session.user.id,
})
recordAudit({
actorId: session.user.id,
actorName: session.user.name,
actorEmail: session.user.email,
action: AuditAction.CREDENTIAL_SET_MEMBER_LEFT,
resourceType: AuditResourceType.CREDENTIAL_SET,
resourceId: credentialSetId,
description: `Left credential set`,
request: req,
})
return NextResponse.json({ success: true })
} catch (error) {
const message = error instanceof Error ? error.message : 'Failed to leave credential set'

View File

@@ -4,6 +4,7 @@ import { createLogger } from '@sim/logger'
import { and, count, desc, eq } from 'drizzle-orm'
import { NextResponse } from 'next/server'
import { z } from 'zod'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { getSession } from '@/lib/auth'
import { hasCredentialSetsAccess } from '@/lib/billing'
@@ -165,6 +166,19 @@ export async function POST(req: Request) {
userId: session.user.id,
})
recordAudit({
workspaceId: null,
actorId: session.user.id,
action: AuditAction.CREDENTIAL_SET_CREATED,
resourceType: AuditResourceType.CREDENTIAL_SET,
resourceId: newCredentialSet.id,
actorName: session.user.name ?? undefined,
actorEmail: session.user.email ?? undefined,
resourceName: name,
description: `Created credential set "${name}"`,
request: req,
})
return NextResponse.json({ credentialSet: newCredentialSet }, { status: 201 })
} catch (error) {
if (error instanceof z.ZodError) {

View File

@@ -4,6 +4,7 @@ import { createLogger } from '@sim/logger'
import { eq } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { getSession } from '@/lib/auth'
import { decryptSecret, encryptSecret } from '@/lib/core/security/encryption'
import { generateRequestId } from '@/lib/core/utils/request'
@@ -53,6 +54,17 @@ export async function POST(req: NextRequest) {
},
})
recordAudit({
actorId: session.user.id,
actorName: session.user.name,
actorEmail: session.user.email,
action: AuditAction.ENVIRONMENT_UPDATED,
resourceType: AuditResourceType.ENVIRONMENT,
description: 'Updated global environment variables',
metadata: { variableCount: Object.keys(variables).length },
request: req,
})
return NextResponse.json({ success: true })
} catch (validationError) {
if (validationError instanceof z.ZodError) {

View File
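One detail worth noting in the environment route above: the audit metadata carries only a key count, so variable names and their (encrypted) values never enter the audit trail. A trivial illustration with hypothetical variables:

// Illustrative only — the variables object here is hypothetical; only its key count is audited.
const variables: Record<string, string> = {
  OPENAI_API_KEY: '<encrypted>',
  SLACK_TOKEN: '<encrypted>',
}
const metadata = { variableCount: Object.keys(variables).length } // { variableCount: 2 }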

@@ -4,6 +4,7 @@ import { createLogger } from '@sim/logger'
import { and, eq } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { getSession } from '@/lib/auth'
import { generateRequestId } from '@/lib/core/utils/request'
import { duplicateWorkflow } from '@/lib/workflows/persistence/duplicate'
@@ -115,6 +116,19 @@ export async function POST(req: NextRequest, { params }: { params: Promise<{ id:
}
)
recordAudit({
workspaceId: targetWorkspaceId,
actorId: session.user.id,
action: AuditAction.FOLDER_DUPLICATED,
resourceType: AuditResourceType.FOLDER,
resourceId: newFolderId,
actorName: session.user.name ?? undefined,
actorEmail: session.user.email ?? undefined,
resourceName: name,
description: `Duplicated folder "${sourceFolder.name}" as "${name}"`,
request: req,
})
return NextResponse.json(
{
id: newFolderId,

View File

@@ -4,6 +4,7 @@
* @vitest-environment node
*/
import {
auditMock,
createMockRequest,
type MockUser,
mockAuth,
@@ -12,6 +13,8 @@ import {
} from '@sim/testing'
import { afterEach, beforeEach, describe, expect, it, vi } from 'vitest'
vi.mock('@/lib/audit/log', () => auditMock)
/** Type for captured folder values in tests */
interface CapturedFolderValues {
name?: string

View File

@@ -4,6 +4,7 @@ import { createLogger } from '@sim/logger'
import { and, eq } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { getSession } from '@/lib/auth'
import { getUserEntityPermissions } from '@/lib/workspaces/permissions/utils'
@@ -167,6 +168,19 @@ export async function DELETE(
deletionStats,
})
recordAudit({
workspaceId: existingFolder.workspaceId,
actorId: session.user.id,
actorName: session.user.name,
actorEmail: session.user.email,
action: AuditAction.FOLDER_DELETED,
resourceType: AuditResourceType.FOLDER,
resourceId: id,
resourceName: existingFolder.name,
description: `Deleted folder "${existingFolder.name}"`,
request,
})
return NextResponse.json({
success: true,
deletedItems: deletionStats,

View File

@@ -3,9 +3,17 @@
*
* @vitest-environment node
*/
import { createMockRequest, mockAuth, mockConsoleLogger, setupCommonApiMocks } from '@sim/testing'
import {
auditMock,
createMockRequest,
mockAuth,
mockConsoleLogger,
setupCommonApiMocks,
} from '@sim/testing'
import { afterEach, beforeEach, describe, expect, it, vi } from 'vitest'
vi.mock('@/lib/audit/log', () => auditMock)
interface CapturedFolderValues {
name?: string
color?: string

View File

@@ -3,6 +3,7 @@ import { workflowFolder } from '@sim/db/schema'
import { createLogger } from '@sim/logger'
import { and, asc, desc, eq, isNull } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { getSession } from '@/lib/auth'
import { getUserEntityPermissions } from '@/lib/workspaces/permissions/utils'
@@ -119,6 +120,20 @@ export async function POST(request: NextRequest) {
logger.info('Created new folder:', { id, name, workspaceId, parentId })
recordAudit({
workspaceId,
actorId: session.user.id,
actorName: session.user.name,
actorEmail: session.user.email,
action: AuditAction.FOLDER_CREATED,
resourceType: AuditResourceType.FOLDER,
resourceId: id,
resourceName: name.trim(),
description: `Created folder "${name.trim()}"`,
metadata: { name: name.trim() },
request,
})
return NextResponse.json({ folder: newFolder })
} catch (error) {
logger.error('Error creating folder:', { error })

View File

@@ -4,6 +4,7 @@ import { createLogger } from '@sim/logger'
import { eq } from 'drizzle-orm'
import type { NextRequest } from 'next/server'
import { z } from 'zod'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { getSession } from '@/lib/auth'
import { encryptSecret } from '@/lib/core/security/encryption'
import { checkFormAccess, DEFAULT_FORM_CUSTOMIZATIONS } from '@/app/api/form/utils'
@@ -102,7 +103,11 @@ export async function PATCH(request: NextRequest, { params }: { params: Promise<
const { id } = await params
const { hasAccess, form: formRecord } = await checkFormAccess(id, session.user.id)
const {
hasAccess,
form: formRecord,
workspaceId: formWorkspaceId,
} = await checkFormAccess(id, session.user.id)
if (!hasAccess || !formRecord) {
return createErrorResponse('Form not found or access denied', 404)
@@ -184,6 +189,19 @@ export async function PATCH(request: NextRequest, { params }: { params: Promise<
logger.info(`Form ${id} updated successfully`)
recordAudit({
workspaceId: formWorkspaceId ?? null,
actorId: session.user.id,
action: AuditAction.FORM_UPDATED,
resourceType: AuditResourceType.FORM,
resourceId: id,
actorName: session.user.name ?? undefined,
actorEmail: session.user.email ?? undefined,
resourceName: formRecord.title ?? undefined,
description: `Updated form "${formRecord.title}"`,
request,
})
return createSuccessResponse({
message: 'Form updated successfully',
})
@@ -213,7 +231,11 @@ export async function DELETE(
const { id } = await params
const { hasAccess, form: formRecord } = await checkFormAccess(id, session.user.id)
const {
hasAccess,
form: formRecord,
workspaceId: formWorkspaceId,
} = await checkFormAccess(id, session.user.id)
if (!hasAccess || !formRecord) {
return createErrorResponse('Form not found or access denied', 404)
@@ -223,6 +245,19 @@ export async function DELETE(
logger.info(`Form ${id} deleted (soft delete)`)
recordAudit({
workspaceId: formWorkspaceId ?? null,
actorId: session.user.id,
action: AuditAction.FORM_DELETED,
resourceType: AuditResourceType.FORM,
resourceId: id,
actorName: session.user.name ?? undefined,
actorEmail: session.user.email ?? undefined,
resourceName: formRecord.title ?? undefined,
description: `Deleted form "${formRecord.title}"`,
request,
})
return createSuccessResponse({
message: 'Form deleted successfully',
})

View File

@@ -5,6 +5,7 @@ import { eq } from 'drizzle-orm'
import type { NextRequest } from 'next/server'
import { v4 as uuidv4 } from 'uuid'
import { z } from 'zod'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { getSession } from '@/lib/auth'
import { isDev } from '@/lib/core/config/feature-flags'
import { encryptSecret } from '@/lib/core/security/encryption'
@@ -178,7 +179,7 @@ export async function POST(request: NextRequest) {
userId: session.user.id,
identifier,
title,
description: description || '',
description: description || null,
customizations: mergedCustomizations,
isActive: true,
authType,
@@ -195,6 +196,19 @@ export async function POST(request: NextRequest) {
logger.info(`Form "${title}" deployed successfully at ${formUrl}`)
recordAudit({
workspaceId: workflowRecord.workspaceId ?? null,
actorId: session.user.id,
action: AuditAction.FORM_CREATED,
resourceType: AuditResourceType.FORM,
resourceId: id,
actorName: session.user.name ?? undefined,
actorEmail: session.user.email ?? undefined,
resourceName: title,
description: `Created form "${title}" for workflow ${workflowId}`,
request,
})
return createSuccessResponse({
id,
formUrl,

View File

@@ -52,7 +52,7 @@ export async function checkWorkflowAccessForFormCreation(
export async function checkFormAccess(
formId: string,
userId: string
): Promise<{ hasAccess: boolean; form?: any }> {
): Promise<{ hasAccess: boolean; form?: any; workspaceId?: string }> {
const formData = await db
.select({ form: form, workflowWorkspaceId: workflow.workspaceId })
.from(form)
@@ -75,7 +75,9 @@ export async function checkFormAccess(
action: 'admin',
})
return authorization.allowed ? { hasAccess: true, form: formRecord } : { hasAccess: false }
return authorization.allowed
? { hasAccess: true, form: formRecord, workspaceId: workflowWorkspaceId }
: { hasAccess: false }
}
export async function validateFormAuth(

View File
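checkChatAccess and checkFormAccess now share the same pattern: join the deployment to its workflow, authorize against the workflow's workspace, and expose that workspace id only on the authorized branch so the routes can scope their audit events. A condensed, hypothetical illustration of that shape (not code from this diff):

// Illustrative only. Generic form of the result both helpers return after this change:
// workspaceId is present only when access was granted.
interface AccessResult<T> {
  hasAccess: boolean
  resource?: T
  workspaceId?: string
}

function toAccessResult<T>(allowed: boolean, resource: T, workspaceId?: string): AccessResult<T> {
  return allowed ? { hasAccess: true, resource, workspaceId } : { hasAccess: false }
}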

@@ -1,300 +0,0 @@
/**
* @vitest-environment node
*/
import { createMockRequest, mockConsoleLogger, mockDrizzleOrm } from '@sim/testing'
import { afterEach, beforeEach, describe, expect, it, vi } from 'vitest'
vi.mock('@sim/db/schema', () => ({
document: {
id: 'id',
connectorId: 'connectorId',
deletedAt: 'deletedAt',
filename: 'filename',
externalId: 'externalId',
sourceUrl: 'sourceUrl',
enabled: 'enabled',
userExcluded: 'userExcluded',
uploadedAt: 'uploadedAt',
processingStatus: 'processingStatus',
},
knowledgeConnector: {
id: 'id',
knowledgeBaseId: 'knowledgeBaseId',
deletedAt: 'deletedAt',
},
}))
vi.mock('@/app/api/knowledge/utils', () => ({
checkKnowledgeBaseAccess: vi.fn(),
checkKnowledgeBaseWriteAccess: vi.fn(),
}))
vi.mock('@/lib/auth/hybrid', () => ({
checkSessionOrInternalAuth: vi.fn(),
}))
vi.mock('@/lib/core/utils/request', () => ({
generateRequestId: vi.fn().mockReturnValue('test-req-id'),
}))
mockDrizzleOrm()
mockConsoleLogger()
describe('Connector Documents API Route', () => {
/**
* The route chains db calls in sequence. We track call order
* to return different values for connector lookup vs document queries.
*/
let limitCallCount: number
let orderByCallCount: number
const mockDbChain = {
select: vi.fn().mockReturnThis(),
from: vi.fn().mockReturnThis(),
where: vi.fn().mockReturnThis(),
orderBy: vi.fn(() => {
orderByCallCount++
return Promise.resolve([])
}),
limit: vi.fn(() => {
limitCallCount++
return Promise.resolve([])
}),
update: vi.fn().mockReturnThis(),
set: vi.fn().mockReturnThis(),
returning: vi.fn().mockResolvedValue([]),
}
const mockParams = Promise.resolve({ id: 'kb-123', connectorId: 'conn-456' })
beforeEach(() => {
vi.clearAllMocks()
limitCallCount = 0
orderByCallCount = 0
mockDbChain.select.mockReturnThis()
mockDbChain.from.mockReturnThis()
mockDbChain.where.mockReturnThis()
mockDbChain.orderBy.mockImplementation(() => {
orderByCallCount++
return Promise.resolve([])
})
mockDbChain.limit.mockImplementation(() => {
limitCallCount++
return Promise.resolve([])
})
mockDbChain.update.mockReturnThis()
mockDbChain.set.mockReturnThis()
mockDbChain.returning.mockResolvedValue([])
vi.doMock('@sim/db', () => ({ db: mockDbChain }))
})
afterEach(() => {
vi.clearAllMocks()
})
describe('GET', () => {
it('returns 401 when unauthenticated', async () => {
const { checkSessionOrInternalAuth } = await import('@/lib/auth/hybrid')
vi.mocked(checkSessionOrInternalAuth).mockResolvedValue({
success: false,
userId: null,
} as never)
const req = createMockRequest('GET')
const { GET } = await import(
'@/app/api/knowledge/[id]/connectors/[connectorId]/documents/route'
)
const response = await GET(req as never, { params: mockParams })
expect(response.status).toBe(401)
})
it('returns 404 when connector not found', async () => {
const { checkSessionOrInternalAuth } = await import('@/lib/auth/hybrid')
const { checkKnowledgeBaseAccess } = await import('@/app/api/knowledge/utils')
vi.mocked(checkSessionOrInternalAuth).mockResolvedValue({
success: true,
userId: 'user-1',
} as never)
vi.mocked(checkKnowledgeBaseAccess).mockResolvedValue({ hasAccess: true } as never)
mockDbChain.limit.mockResolvedValueOnce([])
const req = createMockRequest('GET')
const { GET } = await import(
'@/app/api/knowledge/[id]/connectors/[connectorId]/documents/route'
)
const response = await GET(req as never, { params: mockParams })
expect(response.status).toBe(404)
})
it('returns documents list on success', async () => {
const { checkSessionOrInternalAuth } = await import('@/lib/auth/hybrid')
const { checkKnowledgeBaseAccess } = await import('@/app/api/knowledge/utils')
vi.mocked(checkSessionOrInternalAuth).mockResolvedValue({
success: true,
userId: 'user-1',
} as never)
vi.mocked(checkKnowledgeBaseAccess).mockResolvedValue({ hasAccess: true } as never)
const doc = { id: 'doc-1', filename: 'test.txt', userExcluded: false }
mockDbChain.limit.mockResolvedValueOnce([{ id: 'conn-456' }])
mockDbChain.orderBy.mockResolvedValueOnce([doc])
const url = 'http://localhost/api/knowledge/kb-123/connectors/conn-456/documents'
const req = createMockRequest('GET', undefined, undefined, url)
Object.assign(req, { nextUrl: new URL(url) })
const { GET } = await import(
'@/app/api/knowledge/[id]/connectors/[connectorId]/documents/route'
)
const response = await GET(req as never, { params: mockParams })
const data = await response.json()
expect(response.status).toBe(200)
expect(data.data.documents).toHaveLength(1)
expect(data.data.counts.active).toBe(1)
expect(data.data.counts.excluded).toBe(0)
})
it('includes excluded documents when includeExcluded=true', async () => {
const { checkSessionOrInternalAuth } = await import('@/lib/auth/hybrid')
const { checkKnowledgeBaseAccess } = await import('@/app/api/knowledge/utils')
vi.mocked(checkSessionOrInternalAuth).mockResolvedValue({
success: true,
userId: 'user-1',
} as never)
vi.mocked(checkKnowledgeBaseAccess).mockResolvedValue({ hasAccess: true } as never)
mockDbChain.limit.mockResolvedValueOnce([{ id: 'conn-456' }])
mockDbChain.orderBy
.mockResolvedValueOnce([{ id: 'doc-1', userExcluded: false }])
.mockResolvedValueOnce([{ id: 'doc-2', userExcluded: true }])
const url =
'http://localhost/api/knowledge/kb-123/connectors/conn-456/documents?includeExcluded=true'
const req = createMockRequest('GET', undefined, undefined, url)
Object.assign(req, { nextUrl: new URL(url) })
const { GET } = await import(
'@/app/api/knowledge/[id]/connectors/[connectorId]/documents/route'
)
const response = await GET(req as never, { params: mockParams })
const data = await response.json()
expect(response.status).toBe(200)
expect(data.data.documents).toHaveLength(2)
expect(data.data.counts.active).toBe(1)
expect(data.data.counts.excluded).toBe(1)
})
})
describe('PATCH', () => {
it('returns 401 when unauthenticated', async () => {
const { checkSessionOrInternalAuth } = await import('@/lib/auth/hybrid')
vi.mocked(checkSessionOrInternalAuth).mockResolvedValue({
success: false,
userId: null,
} as never)
const req = createMockRequest('PATCH', { operation: 'restore', documentIds: ['doc-1'] })
const { PATCH } = await import(
'@/app/api/knowledge/[id]/connectors/[connectorId]/documents/route'
)
const response = await PATCH(req as never, { params: mockParams })
expect(response.status).toBe(401)
})
it('returns 400 for invalid body', async () => {
const { checkSessionOrInternalAuth } = await import('@/lib/auth/hybrid')
const { checkKnowledgeBaseWriteAccess } = await import('@/app/api/knowledge/utils')
vi.mocked(checkSessionOrInternalAuth).mockResolvedValue({
success: true,
userId: 'user-1',
} as never)
vi.mocked(checkKnowledgeBaseWriteAccess).mockResolvedValue({ hasAccess: true } as never)
mockDbChain.limit.mockResolvedValueOnce([{ id: 'conn-456' }])
const req = createMockRequest('PATCH', { documentIds: [] })
const { PATCH } = await import(
'@/app/api/knowledge/[id]/connectors/[connectorId]/documents/route'
)
const response = await PATCH(req as never, { params: mockParams })
expect(response.status).toBe(400)
})
it('returns 404 when connector not found', async () => {
const { checkSessionOrInternalAuth } = await import('@/lib/auth/hybrid')
const { checkKnowledgeBaseWriteAccess } = await import('@/app/api/knowledge/utils')
vi.mocked(checkSessionOrInternalAuth).mockResolvedValue({
success: true,
userId: 'user-1',
} as never)
vi.mocked(checkKnowledgeBaseWriteAccess).mockResolvedValue({ hasAccess: true } as never)
mockDbChain.limit.mockResolvedValueOnce([])
const req = createMockRequest('PATCH', { operation: 'restore', documentIds: ['doc-1'] })
const { PATCH } = await import(
'@/app/api/knowledge/[id]/connectors/[connectorId]/documents/route'
)
const response = await PATCH(req as never, { params: mockParams })
expect(response.status).toBe(404)
})
it('returns success for restore operation', async () => {
const { checkSessionOrInternalAuth } = await import('@/lib/auth/hybrid')
const { checkKnowledgeBaseWriteAccess } = await import('@/app/api/knowledge/utils')
vi.mocked(checkSessionOrInternalAuth).mockResolvedValue({
success: true,
userId: 'user-1',
} as never)
vi.mocked(checkKnowledgeBaseWriteAccess).mockResolvedValue({ hasAccess: true } as never)
mockDbChain.limit.mockResolvedValueOnce([{ id: 'conn-456' }])
mockDbChain.returning.mockResolvedValueOnce([{ id: 'doc-1' }])
const req = createMockRequest('PATCH', { operation: 'restore', documentIds: ['doc-1'] })
const { PATCH } = await import(
'@/app/api/knowledge/[id]/connectors/[connectorId]/documents/route'
)
const response = await PATCH(req as never, { params: mockParams })
const data = await response.json()
expect(response.status).toBe(200)
expect(data.data.restoredCount).toBe(1)
})
it('returns success for exclude operation', async () => {
const { checkSessionOrInternalAuth } = await import('@/lib/auth/hybrid')
const { checkKnowledgeBaseWriteAccess } = await import('@/app/api/knowledge/utils')
vi.mocked(checkSessionOrInternalAuth).mockResolvedValue({
success: true,
userId: 'user-1',
} as never)
vi.mocked(checkKnowledgeBaseWriteAccess).mockResolvedValue({ hasAccess: true } as never)
mockDbChain.limit.mockResolvedValueOnce([{ id: 'conn-456' }])
mockDbChain.returning.mockResolvedValueOnce([{ id: 'doc-2' }, { id: 'doc-3' }])
const req = createMockRequest('PATCH', {
operation: 'exclude',
documentIds: ['doc-2', 'doc-3'],
})
const { PATCH } = await import(
'@/app/api/knowledge/[id]/connectors/[connectorId]/documents/route'
)
const response = await PATCH(req as never, { params: mockParams })
const data = await response.json()
expect(response.status).toBe(200)
expect(data.data.excludedCount).toBe(2)
expect(data.data.documentIds).toEqual(['doc-2', 'doc-3'])
})
})
})

View File

@@ -1,210 +0,0 @@
import { db } from '@sim/db'
import { document, knowledgeConnector } from '@sim/db/schema'
import { createLogger } from '@sim/logger'
import { and, eq, inArray, isNull } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { checkSessionOrInternalAuth } from '@/lib/auth/hybrid'
import { generateRequestId } from '@/lib/core/utils/request'
import { checkKnowledgeBaseAccess, checkKnowledgeBaseWriteAccess } from '@/app/api/knowledge/utils'
const logger = createLogger('ConnectorDocumentsAPI')
type RouteParams = { params: Promise<{ id: string; connectorId: string }> }
/**
* GET /api/knowledge/[id]/connectors/[connectorId]/documents
* Returns documents for a connector, optionally including user-excluded ones.
*/
export async function GET(request: NextRequest, { params }: RouteParams) {
const requestId = generateRequestId()
const { id: knowledgeBaseId, connectorId } = await params
try {
const auth = await checkSessionOrInternalAuth(request, { requireWorkflowId: false })
if (!auth.success || !auth.userId) {
return NextResponse.json({ error: 'Unauthorized' }, { status: 401 })
}
const accessCheck = await checkKnowledgeBaseAccess(knowledgeBaseId, auth.userId)
if (!accessCheck.hasAccess) {
const status = 'notFound' in accessCheck && accessCheck.notFound ? 404 : 401
return NextResponse.json({ error: status === 404 ? 'Not found' : 'Unauthorized' }, { status })
}
const connectorRows = await db
.select({ id: knowledgeConnector.id })
.from(knowledgeConnector)
.where(
and(
eq(knowledgeConnector.id, connectorId),
eq(knowledgeConnector.knowledgeBaseId, knowledgeBaseId),
isNull(knowledgeConnector.deletedAt)
)
)
.limit(1)
if (connectorRows.length === 0) {
return NextResponse.json({ error: 'Connector not found' }, { status: 404 })
}
const includeExcluded = request.nextUrl.searchParams.get('includeExcluded') === 'true'
const activeDocs = await db
.select({
id: document.id,
filename: document.filename,
externalId: document.externalId,
sourceUrl: document.sourceUrl,
enabled: document.enabled,
userExcluded: document.userExcluded,
uploadedAt: document.uploadedAt,
processingStatus: document.processingStatus,
})
.from(document)
.where(
and(
eq(document.connectorId, connectorId),
isNull(document.deletedAt),
eq(document.userExcluded, false)
)
)
.orderBy(document.filename)
const excludedDocs = includeExcluded
? await db
.select({
id: document.id,
filename: document.filename,
externalId: document.externalId,
sourceUrl: document.sourceUrl,
enabled: document.enabled,
userExcluded: document.userExcluded,
uploadedAt: document.uploadedAt,
processingStatus: document.processingStatus,
})
.from(document)
.where(
and(
eq(document.connectorId, connectorId),
eq(document.userExcluded, true),
isNull(document.deletedAt)
)
)
.orderBy(document.filename)
: []
const docs = [...activeDocs, ...excludedDocs]
const activeCount = activeDocs.length
const excludedCount = excludedDocs.length
return NextResponse.json({
success: true,
data: {
documents: docs,
counts: { active: activeCount, excluded: excludedCount },
},
})
} catch (error) {
logger.error(`[${requestId}] Error fetching connector documents`, error)
return NextResponse.json({ error: 'Internal server error' }, { status: 500 })
}
}
const PatchSchema = z.object({
operation: z.enum(['restore', 'exclude']),
documentIds: z.array(z.string()).min(1),
})
/**
* PATCH /api/knowledge/[id]/connectors/[connectorId]/documents
* Restore or exclude connector documents.
*/
export async function PATCH(request: NextRequest, { params }: RouteParams) {
const requestId = generateRequestId()
const { id: knowledgeBaseId, connectorId } = await params
try {
const auth = await checkSessionOrInternalAuth(request, { requireWorkflowId: false })
if (!auth.success || !auth.userId) {
return NextResponse.json({ error: 'Unauthorized' }, { status: 401 })
}
const writeCheck = await checkKnowledgeBaseWriteAccess(knowledgeBaseId, auth.userId)
if (!writeCheck.hasAccess) {
const status = 'notFound' in writeCheck && writeCheck.notFound ? 404 : 401
return NextResponse.json({ error: status === 404 ? 'Not found' : 'Unauthorized' }, { status })
}
const connectorRows = await db
.select({ id: knowledgeConnector.id })
.from(knowledgeConnector)
.where(
and(
eq(knowledgeConnector.id, connectorId),
eq(knowledgeConnector.knowledgeBaseId, knowledgeBaseId),
isNull(knowledgeConnector.deletedAt)
)
)
.limit(1)
if (connectorRows.length === 0) {
return NextResponse.json({ error: 'Connector not found' }, { status: 404 })
}
const body = await request.json()
const parsed = PatchSchema.safeParse(body)
if (!parsed.success) {
return NextResponse.json(
{ error: 'Invalid request', details: parsed.error.flatten() },
{ status: 400 }
)
}
const { operation, documentIds } = parsed.data
if (operation === 'restore') {
const updated = await db
.update(document)
.set({ userExcluded: false, deletedAt: null, enabled: true })
.where(
and(
eq(document.connectorId, connectorId),
inArray(document.id, documentIds),
eq(document.userExcluded, true)
)
)
.returning({ id: document.id })
logger.info(`[${requestId}] Restored ${updated.length} excluded documents`, { connectorId })
return NextResponse.json({
success: true,
data: { restoredCount: updated.length, documentIds: updated.map((d) => d.id) },
})
}
const updated = await db
.update(document)
.set({ userExcluded: true })
.where(
and(
eq(document.connectorId, connectorId),
inArray(document.id, documentIds),
eq(document.userExcluded, false),
isNull(document.deletedAt)
)
)
.returning({ id: document.id })
logger.info(`[${requestId}] Excluded ${updated.length} documents`, { connectorId })
return NextResponse.json({
success: true,
data: { excludedCount: updated.length, documentIds: updated.map((d) => d.id) },
})
} catch (error) {
logger.error(`[${requestId}] Error updating connector documents`, error)
return NextResponse.json({ error: 'Internal server error' }, { status: 500 })
}
}

View File
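For reference, the connector-documents route removed above accepted PATCH bodies of the shape { operation: 'restore' | 'exclude', documentIds: string[] }. A minimal client call against it would have looked roughly like the following; the relative URL and error handling are illustrative, and the response shape is taken from the deleted handler:

// Illustrative client for the deleted PATCH handler above.
async function excludeConnectorDocuments(kbId: string, connectorId: string, documentIds: string[]) {
  const res = await fetch(`/api/knowledge/${kbId}/connectors/${connectorId}/documents`, {
    method: 'PATCH',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ operation: 'exclude', documentIds }),
  })
  if (!res.ok) throw new Error(`Exclude failed with status ${res.status}`)
  const { data } = await res.json()
  return data as { excludedCount: number; documentIds: string[] }
}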

@@ -1,231 +0,0 @@
/**
* @vitest-environment node
*/
import { createMockRequest, mockConsoleLogger, mockDrizzleOrm } from '@sim/testing'
import { afterEach, beforeEach, describe, expect, it, vi } from 'vitest'
vi.mock('@/app/api/knowledge/utils', () => ({
checkKnowledgeBaseAccess: vi.fn(),
checkKnowledgeBaseWriteAccess: vi.fn(),
}))
vi.mock('@/lib/auth/hybrid', () => ({
checkSessionOrInternalAuth: vi.fn(),
}))
vi.mock('@/lib/core/utils/request', () => ({
generateRequestId: vi.fn().mockReturnValue('test-req-id'),
}))
vi.mock('@/app/api/auth/oauth/utils', () => ({
refreshAccessTokenIfNeeded: vi.fn(),
}))
vi.mock('@/connectors/registry', () => ({
CONNECTOR_REGISTRY: {
jira: { validateConfig: vi.fn() },
},
}))
vi.mock('@sim/db/schema', () => ({
knowledgeBase: { id: 'id', userId: 'userId' },
knowledgeConnector: {
id: 'id',
knowledgeBaseId: 'knowledgeBaseId',
deletedAt: 'deletedAt',
connectorType: 'connectorType',
credentialId: 'credentialId',
},
knowledgeConnectorSyncLog: { connectorId: 'connectorId', startedAt: 'startedAt' },
}))
mockDrizzleOrm()
mockConsoleLogger()
describe('Knowledge Connector By ID API Route', () => {
const mockDbChain = {
select: vi.fn().mockReturnThis(),
from: vi.fn().mockReturnThis(),
where: vi.fn().mockReturnThis(),
orderBy: vi.fn().mockReturnThis(),
limit: vi.fn().mockResolvedValue([]),
insert: vi.fn().mockReturnThis(),
values: vi.fn().mockResolvedValue(undefined),
update: vi.fn().mockReturnThis(),
set: vi.fn().mockReturnThis(),
returning: vi.fn().mockResolvedValue([]),
}
const mockParams = Promise.resolve({ id: 'kb-123', connectorId: 'conn-456' })
beforeEach(() => {
vi.clearAllMocks()
vi.resetModules()
mockDbChain.select.mockReturnThis()
mockDbChain.from.mockReturnThis()
mockDbChain.where.mockReturnThis()
mockDbChain.orderBy.mockReturnThis()
mockDbChain.limit.mockResolvedValue([])
mockDbChain.update.mockReturnThis()
mockDbChain.set.mockReturnThis()
vi.doMock('@sim/db', () => ({ db: mockDbChain }))
})
afterEach(() => {
vi.clearAllMocks()
})
describe('GET', () => {
it('returns 401 when unauthenticated', async () => {
const { checkSessionOrInternalAuth } = await import('@/lib/auth/hybrid')
vi.mocked(checkSessionOrInternalAuth).mockResolvedValue({ success: false, userId: null })
const req = createMockRequest('GET')
const { GET } = await import('@/app/api/knowledge/[id]/connectors/[connectorId]/route')
const response = await GET(req, { params: mockParams })
expect(response.status).toBe(401)
})
it('returns 404 when KB not found', async () => {
const { checkSessionOrInternalAuth } = await import('@/lib/auth/hybrid')
const { checkKnowledgeBaseAccess } = await import('@/app/api/knowledge/utils')
vi.mocked(checkSessionOrInternalAuth).mockResolvedValue({ success: true, userId: 'user-1' })
vi.mocked(checkKnowledgeBaseAccess).mockResolvedValue({ hasAccess: false, notFound: true })
const req = createMockRequest('GET')
const { GET } = await import('@/app/api/knowledge/[id]/connectors/[connectorId]/route')
const response = await GET(req, { params: mockParams })
expect(response.status).toBe(404)
})
it('returns 404 when connector not found', async () => {
const { checkSessionOrInternalAuth } = await import('@/lib/auth/hybrid')
const { checkKnowledgeBaseAccess } = await import('@/app/api/knowledge/utils')
vi.mocked(checkSessionOrInternalAuth).mockResolvedValue({ success: true, userId: 'user-1' })
vi.mocked(checkKnowledgeBaseAccess).mockResolvedValue({ hasAccess: true })
mockDbChain.limit.mockResolvedValueOnce([])
const req = createMockRequest('GET')
const { GET } = await import('@/app/api/knowledge/[id]/connectors/[connectorId]/route')
const response = await GET(req, { params: mockParams })
expect(response.status).toBe(404)
})
it('returns connector with sync logs on success', async () => {
const { checkSessionOrInternalAuth } = await import('@/lib/auth/hybrid')
const { checkKnowledgeBaseAccess } = await import('@/app/api/knowledge/utils')
vi.mocked(checkSessionOrInternalAuth).mockResolvedValue({ success: true, userId: 'user-1' })
vi.mocked(checkKnowledgeBaseAccess).mockResolvedValue({ hasAccess: true })
const mockConnector = { id: 'conn-456', connectorType: 'jira', status: 'active' }
const mockLogs = [{ id: 'log-1', status: 'completed' }]
mockDbChain.limit.mockResolvedValueOnce([mockConnector]).mockResolvedValueOnce(mockLogs)
const req = createMockRequest('GET')
const { GET } = await import('@/app/api/knowledge/[id]/connectors/[connectorId]/route')
const response = await GET(req, { params: mockParams })
const data = await response.json()
expect(response.status).toBe(200)
expect(data.success).toBe(true)
expect(data.data.id).toBe('conn-456')
expect(data.data.syncLogs).toHaveLength(1)
})
})
describe('PATCH', () => {
it('returns 401 when unauthenticated', async () => {
const { checkSessionOrInternalAuth } = await import('@/lib/auth/hybrid')
vi.mocked(checkSessionOrInternalAuth).mockResolvedValue({ success: false, userId: null })
const req = createMockRequest('PATCH', { status: 'paused' })
const { PATCH } = await import('@/app/api/knowledge/[id]/connectors/[connectorId]/route')
const response = await PATCH(req, { params: mockParams })
expect(response.status).toBe(401)
})
it('returns 400 for invalid body', async () => {
const { checkSessionOrInternalAuth } = await import('@/lib/auth/hybrid')
const { checkKnowledgeBaseWriteAccess } = await import('@/app/api/knowledge/utils')
vi.mocked(checkSessionOrInternalAuth).mockResolvedValue({ success: true, userId: 'user-1' })
vi.mocked(checkKnowledgeBaseWriteAccess).mockResolvedValue({ hasAccess: true })
const req = createMockRequest('PATCH', { syncIntervalMinutes: 'not a number' })
const { PATCH } = await import('@/app/api/knowledge/[id]/connectors/[connectorId]/route')
const response = await PATCH(req, { params: mockParams })
const data = await response.json()
expect(response.status).toBe(400)
expect(data.error).toBe('Invalid request')
})
it('returns 404 when connector not found during sourceConfig validation', async () => {
const { checkSessionOrInternalAuth } = await import('@/lib/auth/hybrid')
const { checkKnowledgeBaseWriteAccess } = await import('@/app/api/knowledge/utils')
vi.mocked(checkSessionOrInternalAuth).mockResolvedValue({ success: true, userId: 'user-1' })
vi.mocked(checkKnowledgeBaseWriteAccess).mockResolvedValue({ hasAccess: true })
mockDbChain.limit.mockResolvedValueOnce([])
const req = createMockRequest('PATCH', { sourceConfig: { project: 'NEW' } })
const { PATCH } = await import('@/app/api/knowledge/[id]/connectors/[connectorId]/route')
const response = await PATCH(req, { params: mockParams })
expect(response.status).toBe(404)
})
it('returns 200 and updates status', async () => {
const { checkSessionOrInternalAuth } = await import('@/lib/auth/hybrid')
const { checkKnowledgeBaseWriteAccess } = await import('@/app/api/knowledge/utils')
vi.mocked(checkSessionOrInternalAuth).mockResolvedValue({ success: true, userId: 'user-1' })
vi.mocked(checkKnowledgeBaseWriteAccess).mockResolvedValue({ hasAccess: true })
const updatedConnector = { id: 'conn-456', status: 'paused', syncIntervalMinutes: 120 }
mockDbChain.limit.mockResolvedValueOnce([updatedConnector])
const req = createMockRequest('PATCH', { status: 'paused', syncIntervalMinutes: 120 })
const { PATCH } = await import('@/app/api/knowledge/[id]/connectors/[connectorId]/route')
const response = await PATCH(req, { params: mockParams })
const data = await response.json()
expect(response.status).toBe(200)
expect(data.success).toBe(true)
expect(data.data.status).toBe('paused')
})
})
describe('DELETE', () => {
it('returns 401 when unauthenticated', async () => {
const { checkSessionOrInternalAuth } = await import('@/lib/auth/hybrid')
vi.mocked(checkSessionOrInternalAuth).mockResolvedValue({ success: false, userId: null })
const req = createMockRequest('DELETE')
const { DELETE } = await import('@/app/api/knowledge/[id]/connectors/[connectorId]/route')
const response = await DELETE(req, { params: mockParams })
expect(response.status).toBe(401)
})
it('returns 200 on successful soft-delete', async () => {
const { checkSessionOrInternalAuth } = await import('@/lib/auth/hybrid')
const { checkKnowledgeBaseWriteAccess } = await import('@/app/api/knowledge/utils')
vi.mocked(checkSessionOrInternalAuth).mockResolvedValue({ success: true, userId: 'user-1' })
vi.mocked(checkKnowledgeBaseWriteAccess).mockResolvedValue({ hasAccess: true })
const req = createMockRequest('DELETE')
const { DELETE } = await import('@/app/api/knowledge/[id]/connectors/[connectorId]/route')
const response = await DELETE(req, { params: mockParams })
const data = await response.json()
expect(response.status).toBe(200)
expect(data.success).toBe(true)
})
})
})

View File

@@ -1,248 +0,0 @@
import { db } from '@sim/db'
import { knowledgeBase, knowledgeConnector, knowledgeConnectorSyncLog } from '@sim/db/schema'
import { createLogger } from '@sim/logger'
import { and, desc, eq, isNull } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { checkSessionOrInternalAuth } from '@/lib/auth/hybrid'
import { generateRequestId } from '@/lib/core/utils/request'
import { refreshAccessTokenIfNeeded } from '@/app/api/auth/oauth/utils'
import { checkKnowledgeBaseAccess, checkKnowledgeBaseWriteAccess } from '@/app/api/knowledge/utils'
import { CONNECTOR_REGISTRY } from '@/connectors/registry'
const logger = createLogger('KnowledgeConnectorByIdAPI')
type RouteParams = { params: Promise<{ id: string; connectorId: string }> }
const UpdateConnectorSchema = z.object({
sourceConfig: z.record(z.unknown()).optional(),
syncIntervalMinutes: z.number().int().min(0).optional(),
status: z.enum(['active', 'paused']).optional(),
})
/**
* GET /api/knowledge/[id]/connectors/[connectorId] - Get connector details with recent sync logs
*/
export async function GET(request: NextRequest, { params }: RouteParams) {
const requestId = generateRequestId()
const { id: knowledgeBaseId, connectorId } = await params
try {
const auth = await checkSessionOrInternalAuth(request, { requireWorkflowId: false })
if (!auth.success || !auth.userId) {
return NextResponse.json({ error: 'Unauthorized' }, { status: 401 })
}
const accessCheck = await checkKnowledgeBaseAccess(knowledgeBaseId, auth.userId)
if (!accessCheck.hasAccess) {
const status = 'notFound' in accessCheck && accessCheck.notFound ? 404 : 401
return NextResponse.json({ error: status === 404 ? 'Not found' : 'Unauthorized' }, { status })
}
const connectorRows = await db
.select()
.from(knowledgeConnector)
.where(
and(
eq(knowledgeConnector.id, connectorId),
eq(knowledgeConnector.knowledgeBaseId, knowledgeBaseId),
isNull(knowledgeConnector.deletedAt)
)
)
.limit(1)
if (connectorRows.length === 0) {
return NextResponse.json({ error: 'Connector not found' }, { status: 404 })
}
const syncLogs = await db
.select()
.from(knowledgeConnectorSyncLog)
.where(eq(knowledgeConnectorSyncLog.connectorId, connectorId))
.orderBy(desc(knowledgeConnectorSyncLog.startedAt))
.limit(10)
return NextResponse.json({
success: true,
data: {
...connectorRows[0],
syncLogs,
},
})
} catch (error) {
logger.error(`[${requestId}] Error fetching connector`, error)
return NextResponse.json({ error: 'Internal server error' }, { status: 500 })
}
}
/**
* PATCH /api/knowledge/[id]/connectors/[connectorId] - Update a connector
*/
export async function PATCH(request: NextRequest, { params }: RouteParams) {
const requestId = generateRequestId()
const { id: knowledgeBaseId, connectorId } = await params
try {
const auth = await checkSessionOrInternalAuth(request, { requireWorkflowId: false })
if (!auth.success || !auth.userId) {
return NextResponse.json({ error: 'Unauthorized' }, { status: 401 })
}
const writeCheck = await checkKnowledgeBaseWriteAccess(knowledgeBaseId, auth.userId)
if (!writeCheck.hasAccess) {
const status = 'notFound' in writeCheck && writeCheck.notFound ? 404 : 401
return NextResponse.json({ error: status === 404 ? 'Not found' : 'Unauthorized' }, { status })
}
const body = await request.json()
const parsed = UpdateConnectorSchema.safeParse(body)
if (!parsed.success) {
return NextResponse.json(
{ error: 'Invalid request', details: parsed.error.flatten() },
{ status: 400 }
)
}
if (parsed.data.sourceConfig !== undefined) {
const existingRows = await db
.select()
.from(knowledgeConnector)
.where(
and(
eq(knowledgeConnector.id, connectorId),
eq(knowledgeConnector.knowledgeBaseId, knowledgeBaseId),
isNull(knowledgeConnector.deletedAt)
)
)
.limit(1)
if (existingRows.length === 0) {
return NextResponse.json({ error: 'Connector not found' }, { status: 404 })
}
const existing = existingRows[0]
const connectorConfig = CONNECTOR_REGISTRY[existing.connectorType]
if (!connectorConfig) {
return NextResponse.json(
{ error: `Unknown connector type: ${existing.connectorType}` },
{ status: 400 }
)
}
const kbRows = await db
.select({ userId: knowledgeBase.userId })
.from(knowledgeBase)
.where(eq(knowledgeBase.id, knowledgeBaseId))
.limit(1)
if (kbRows.length === 0) {
return NextResponse.json({ error: 'Knowledge base not found' }, { status: 404 })
}
const accessToken = await refreshAccessTokenIfNeeded(
existing.credentialId,
kbRows[0].userId,
`patch-${connectorId}`
)
if (!accessToken) {
return NextResponse.json(
{ error: 'Failed to refresh access token. Please reconnect your account.' },
{ status: 401 }
)
}
const validation = await connectorConfig.validateConfig(accessToken, parsed.data.sourceConfig)
if (!validation.valid) {
return NextResponse.json(
{ error: validation.error || 'Invalid source configuration' },
{ status: 400 }
)
}
}
const updates: Record<string, unknown> = { updatedAt: new Date() }
if (parsed.data.sourceConfig !== undefined) {
updates.sourceConfig = parsed.data.sourceConfig
}
if (parsed.data.syncIntervalMinutes !== undefined) {
updates.syncIntervalMinutes = parsed.data.syncIntervalMinutes
if (parsed.data.syncIntervalMinutes > 0) {
updates.nextSyncAt = new Date(Date.now() + parsed.data.syncIntervalMinutes * 60 * 1000)
} else {
updates.nextSyncAt = null
}
}
if (parsed.data.status !== undefined) {
updates.status = parsed.data.status
}
await db
.update(knowledgeConnector)
.set(updates)
.where(
and(
eq(knowledgeConnector.id, connectorId),
eq(knowledgeConnector.knowledgeBaseId, knowledgeBaseId),
isNull(knowledgeConnector.deletedAt)
)
)
const updated = await db
.select()
.from(knowledgeConnector)
.where(
and(
eq(knowledgeConnector.id, connectorId),
eq(knowledgeConnector.knowledgeBaseId, knowledgeBaseId),
isNull(knowledgeConnector.deletedAt)
)
)
.limit(1)
return NextResponse.json({ success: true, data: updated[0] })
} catch (error) {
logger.error(`[${requestId}] Error updating connector`, error)
return NextResponse.json({ error: 'Internal server error' }, { status: 500 })
}
}
/**
* DELETE /api/knowledge/[id]/connectors/[connectorId] - Soft-delete a connector
*/
export async function DELETE(request: NextRequest, { params }: RouteParams) {
const requestId = generateRequestId()
const { id: knowledgeBaseId, connectorId } = await params
try {
const auth = await checkSessionOrInternalAuth(request, { requireWorkflowId: false })
if (!auth.success || !auth.userId) {
return NextResponse.json({ error: 'Unauthorized' }, { status: 401 })
}
const writeCheck = await checkKnowledgeBaseWriteAccess(knowledgeBaseId, auth.userId)
if (!writeCheck.hasAccess) {
const status = 'notFound' in writeCheck && writeCheck.notFound ? 404 : 401
return NextResponse.json({ error: status === 404 ? 'Not found' : 'Unauthorized' }, { status })
}
await db
.update(knowledgeConnector)
.set({ deletedAt: new Date(), status: 'paused', updatedAt: new Date() })
.where(
and(
eq(knowledgeConnector.id, connectorId),
eq(knowledgeConnector.knowledgeBaseId, knowledgeBaseId),
isNull(knowledgeConnector.deletedAt)
)
)
logger.info(`[${requestId}] Soft-deleted connector ${connectorId}`)
return NextResponse.json({ success: true })
} catch (error) {
logger.error(`[${requestId}] Error deleting connector`, error)
return NextResponse.json({ error: 'Internal server error' }, { status: 500 })
}
}

View File
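The scheduling rule in the removed PATCH handler above is compact enough to restate on its own: a positive syncIntervalMinutes pushes nextSyncAt that many minutes into the future, while 0 clears it so the connector is never auto-synced. Extracted as a pure function (illustrative, not part of the diff):

// Mirrors the nextSyncAt computation of the deleted PATCH handler above.
function computeNextSyncAt(syncIntervalMinutes: number, now: Date = new Date()): Date | null {
  return syncIntervalMinutes > 0
    ? new Date(now.getTime() + syncIntervalMinutes * 60 * 1000)
    : null
}

// computeNextSyncAt(120) → a Date two hours from now; computeNextSyncAt(0) → null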

@@ -1,133 +0,0 @@
/**
* @vitest-environment node
*/
import { createMockRequest, mockConsoleLogger, mockDrizzleOrm } from '@sim/testing'
import { afterEach, beforeEach, describe, expect, it, vi } from 'vitest'
vi.mock('@sim/db/schema', () => ({
knowledgeConnector: {
id: 'id',
knowledgeBaseId: 'knowledgeBaseId',
deletedAt: 'deletedAt',
status: 'status',
},
}))
vi.mock('@/app/api/knowledge/utils', () => ({
checkKnowledgeBaseWriteAccess: vi.fn(),
}))
vi.mock('@/lib/auth/hybrid', () => ({
checkSessionOrInternalAuth: vi.fn(),
}))
vi.mock('@/lib/core/utils/request', () => ({
generateRequestId: vi.fn().mockReturnValue('test-req-id'),
}))
vi.mock('@/lib/knowledge/connectors/sync-engine', () => ({
dispatchSync: vi.fn().mockResolvedValue(undefined),
}))
mockDrizzleOrm()
mockConsoleLogger()
describe('Connector Manual Sync API Route', () => {
const mockDbChain = {
select: vi.fn().mockReturnThis(),
from: vi.fn().mockReturnThis(),
where: vi.fn().mockReturnThis(),
orderBy: vi.fn().mockResolvedValue([]),
limit: vi.fn().mockResolvedValue([]),
update: vi.fn().mockReturnThis(),
set: vi.fn().mockReturnThis(),
}
const mockParams = Promise.resolve({ id: 'kb-123', connectorId: 'conn-456' })
beforeEach(() => {
vi.clearAllMocks()
mockDbChain.select.mockReturnThis()
mockDbChain.from.mockReturnThis()
mockDbChain.where.mockReturnThis()
mockDbChain.orderBy.mockResolvedValue([])
mockDbChain.limit.mockResolvedValue([])
mockDbChain.update.mockReturnThis()
mockDbChain.set.mockReturnThis()
vi.doMock('@sim/db', () => ({ db: mockDbChain }))
})
afterEach(() => {
vi.clearAllMocks()
})
it('returns 401 when unauthenticated', async () => {
const { checkSessionOrInternalAuth } = await import('@/lib/auth/hybrid')
vi.mocked(checkSessionOrInternalAuth).mockResolvedValue({
success: false,
userId: null,
} as never)
const req = createMockRequest('POST')
const { POST } = await import('@/app/api/knowledge/[id]/connectors/[connectorId]/sync/route')
const response = await POST(req as never, { params: mockParams })
expect(response.status).toBe(401)
})
it('returns 404 when connector not found', async () => {
const { checkSessionOrInternalAuth } = await import('@/lib/auth/hybrid')
const { checkKnowledgeBaseWriteAccess } = await import('@/app/api/knowledge/utils')
vi.mocked(checkSessionOrInternalAuth).mockResolvedValue({
success: true,
userId: 'user-1',
} as never)
vi.mocked(checkKnowledgeBaseWriteAccess).mockResolvedValue({ hasAccess: true } as never)
mockDbChain.limit.mockResolvedValueOnce([])
const req = createMockRequest('POST')
const { POST } = await import('@/app/api/knowledge/[id]/connectors/[connectorId]/sync/route')
const response = await POST(req as never, { params: mockParams })
expect(response.status).toBe(404)
})
it('returns 409 when connector is syncing', async () => {
const { checkSessionOrInternalAuth } = await import('@/lib/auth/hybrid')
const { checkKnowledgeBaseWriteAccess } = await import('@/app/api/knowledge/utils')
vi.mocked(checkSessionOrInternalAuth).mockResolvedValue({
success: true,
userId: 'user-1',
} as never)
vi.mocked(checkKnowledgeBaseWriteAccess).mockResolvedValue({ hasAccess: true } as never)
mockDbChain.limit.mockResolvedValueOnce([{ id: 'conn-456', status: 'syncing' }])
const req = createMockRequest('POST')
const { POST } = await import('@/app/api/knowledge/[id]/connectors/[connectorId]/sync/route')
const response = await POST(req as never, { params: mockParams })
expect(response.status).toBe(409)
})
it('dispatches sync on valid request', async () => {
const { checkSessionOrInternalAuth } = await import('@/lib/auth/hybrid')
const { checkKnowledgeBaseWriteAccess } = await import('@/app/api/knowledge/utils')
const { dispatchSync } = await import('@/lib/knowledge/connectors/sync-engine')
vi.mocked(checkSessionOrInternalAuth).mockResolvedValue({
success: true,
userId: 'user-1',
} as never)
vi.mocked(checkKnowledgeBaseWriteAccess).mockResolvedValue({ hasAccess: true } as never)
mockDbChain.limit.mockResolvedValueOnce([{ id: 'conn-456', status: 'active' }])
const req = createMockRequest('POST')
const { POST } = await import('@/app/api/knowledge/[id]/connectors/[connectorId]/sync/route')
const response = await POST(req as never, { params: mockParams })
const data = await response.json()
expect(response.status).toBe(200)
expect(data.success).toBe(true)
expect(vi.mocked(dispatchSync)).toHaveBeenCalledWith('conn-456', { requestId: 'test-req-id' })
})
})

View File

@@ -1,71 +0,0 @@
import { db } from '@sim/db'
import { knowledgeConnector } from '@sim/db/schema'
import { createLogger } from '@sim/logger'
import { and, eq, isNull } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { checkSessionOrInternalAuth } from '@/lib/auth/hybrid'
import { generateRequestId } from '@/lib/core/utils/request'
import { dispatchSync } from '@/lib/knowledge/connectors/sync-engine'
import { checkKnowledgeBaseWriteAccess } from '@/app/api/knowledge/utils'
const logger = createLogger('ConnectorManualSyncAPI')
type RouteParams = { params: Promise<{ id: string; connectorId: string }> }
/**
* POST /api/knowledge/[id]/connectors/[connectorId]/sync - Trigger a manual sync
*/
export async function POST(request: NextRequest, { params }: RouteParams) {
const requestId = generateRequestId()
const { id: knowledgeBaseId, connectorId } = await params
try {
const auth = await checkSessionOrInternalAuth(request, { requireWorkflowId: false })
if (!auth.success || !auth.userId) {
return NextResponse.json({ error: 'Unauthorized' }, { status: 401 })
}
const writeCheck = await checkKnowledgeBaseWriteAccess(knowledgeBaseId, auth.userId)
if (!writeCheck.hasAccess) {
const status = 'notFound' in writeCheck && writeCheck.notFound ? 404 : 401
return NextResponse.json({ error: status === 404 ? 'Not found' : 'Unauthorized' }, { status })
}
const connectorRows = await db
.select()
.from(knowledgeConnector)
.where(
and(
eq(knowledgeConnector.id, connectorId),
eq(knowledgeConnector.knowledgeBaseId, knowledgeBaseId),
isNull(knowledgeConnector.deletedAt)
)
)
.limit(1)
if (connectorRows.length === 0) {
return NextResponse.json({ error: 'Connector not found' }, { status: 404 })
}
if (connectorRows[0].status === 'syncing') {
return NextResponse.json({ error: 'Sync already in progress' }, { status: 409 })
}
logger.info(`[${requestId}] Manual sync triggered for connector ${connectorId}`)
dispatchSync(connectorId, { requestId }).catch((error) => {
logger.error(
`[${requestId}] Failed to dispatch manual sync for connector ${connectorId}`,
error
)
})
return NextResponse.json({
success: true,
message: 'Sync triggered',
})
} catch (error) {
logger.error(`[${requestId}] Error triggering manual sync`, error)
return NextResponse.json({ error: 'Internal server error' }, { status: 500 })
}
}
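The manual-sync POST above returns immediately: dispatchSync is fired without await and failures are only logged, so a 200 means "sync queued", not "sync finished". A hedged sketch of triggering it and handling the 409 busy case (the path comes from the JSDoc above; everything else is illustrative):
// Sketch: trigger a manual sync and surface the "already syncing" case.
async function triggerSync(knowledgeBaseId: string, connectorId: string) {
  const res = await fetch(
    `/api/knowledge/${knowledgeBaseId}/connectors/${connectorId}/sync`,
    { method: 'POST' }
  )
  if (res.status === 409) return { queued: false, reason: 'Sync already in progress' }
  if (!res.ok) throw new Error(`Sync trigger failed: ${res.status}`)
  const body = await res.json() // { success: true, message: 'Sync triggered' }
  return { queued: body.success === true }
}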

View File

@@ -1,204 +0,0 @@
import { db } from '@sim/db'
import { knowledgeBaseTagDefinitions, knowledgeConnector } from '@sim/db/schema'
import { createLogger } from '@sim/logger'
import { and, desc, eq, isNull } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { checkSessionOrInternalAuth } from '@/lib/auth/hybrid'
import { generateRequestId } from '@/lib/core/utils/request'
import { dispatchSync } from '@/lib/knowledge/connectors/sync-engine'
import { allocateTagSlots } from '@/lib/knowledge/constants'
import { createTagDefinition } from '@/lib/knowledge/tags/service'
import { getCredential } from '@/app/api/auth/oauth/utils'
import { checkKnowledgeBaseAccess, checkKnowledgeBaseWriteAccess } from '@/app/api/knowledge/utils'
import { CONNECTOR_REGISTRY } from '@/connectors/registry'
const logger = createLogger('KnowledgeConnectorsAPI')
const CreateConnectorSchema = z.object({
connectorType: z.string().min(1),
credentialId: z.string().min(1),
sourceConfig: z.record(z.unknown()),
syncIntervalMinutes: z.number().int().min(0).default(1440),
})
/**
* GET /api/knowledge/[id]/connectors - List connectors for a knowledge base
*/
export async function GET(request: NextRequest, { params }: { params: Promise<{ id: string }> }) {
const requestId = generateRequestId()
const { id: knowledgeBaseId } = await params
try {
const auth = await checkSessionOrInternalAuth(request, { requireWorkflowId: false })
if (!auth.success || !auth.userId) {
return NextResponse.json({ error: 'Unauthorized' }, { status: 401 })
}
const accessCheck = await checkKnowledgeBaseAccess(knowledgeBaseId, auth.userId)
if (!accessCheck.hasAccess) {
const status = 'notFound' in accessCheck && accessCheck.notFound ? 404 : 401
return NextResponse.json({ error: status === 404 ? 'Not found' : 'Unauthorized' }, { status })
}
const connectors = await db
.select()
.from(knowledgeConnector)
.where(
and(
eq(knowledgeConnector.knowledgeBaseId, knowledgeBaseId),
isNull(knowledgeConnector.deletedAt)
)
)
.orderBy(desc(knowledgeConnector.createdAt))
return NextResponse.json({ success: true, data: connectors })
} catch (error) {
logger.error(`[${requestId}] Error listing connectors`, error)
return NextResponse.json({ error: 'Internal server error' }, { status: 500 })
}
}
/**
* POST /api/knowledge/[id]/connectors - Create a new connector
*/
export async function POST(request: NextRequest, { params }: { params: Promise<{ id: string }> }) {
const requestId = generateRequestId()
const { id: knowledgeBaseId } = await params
try {
const auth = await checkSessionOrInternalAuth(request, { requireWorkflowId: false })
if (!auth.success || !auth.userId) {
return NextResponse.json({ error: 'Unauthorized' }, { status: 401 })
}
const writeCheck = await checkKnowledgeBaseWriteAccess(knowledgeBaseId, auth.userId)
if (!writeCheck.hasAccess) {
const status = 'notFound' in writeCheck && writeCheck.notFound ? 404 : 401
return NextResponse.json({ error: status === 404 ? 'Not found' : 'Unauthorized' }, { status })
}
const body = await request.json()
const parsed = CreateConnectorSchema.safeParse(body)
if (!parsed.success) {
return NextResponse.json(
{ error: 'Invalid request', details: parsed.error.flatten() },
{ status: 400 }
)
}
const { connectorType, credentialId, sourceConfig, syncIntervalMinutes } = parsed.data
const connectorConfig = CONNECTOR_REGISTRY[connectorType]
if (!connectorConfig) {
return NextResponse.json(
{ error: `Unknown connector type: ${connectorType}` },
{ status: 400 }
)
}
const credential = await getCredential(requestId, credentialId, auth.userId)
if (!credential) {
return NextResponse.json({ error: 'Credential not found' }, { status: 400 })
}
if (!credential.accessToken) {
return NextResponse.json(
{ error: 'Credential has no access token. Please reconnect your account.' },
{ status: 400 }
)
}
const validation = await connectorConfig.validateConfig(credential.accessToken, sourceConfig)
if (!validation.valid) {
return NextResponse.json(
{ error: validation.error || 'Invalid source configuration' },
{ status: 400 }
)
}
let finalSourceConfig: Record<string, unknown> = sourceConfig
const tagSlotMapping: Record<string, string> = {}
if (connectorConfig.tagDefinitions?.length) {
const disabledIds = new Set((sourceConfig.disabledTagIds as string[] | undefined) ?? [])
const enabledDefs = connectorConfig.tagDefinitions.filter((td) => !disabledIds.has(td.id))
const existingDefs = await db
.select({ tagSlot: knowledgeBaseTagDefinitions.tagSlot })
.from(knowledgeBaseTagDefinitions)
.where(eq(knowledgeBaseTagDefinitions.knowledgeBaseId, knowledgeBaseId))
const usedSlots = new Set<string>(existingDefs.map((d) => d.tagSlot))
const { mapping, skipped: skippedTags } = allocateTagSlots(enabledDefs, usedSlots)
Object.assign(tagSlotMapping, mapping)
for (const name of skippedTags) {
logger.warn(`[${requestId}] No available slots for "${name}"`)
}
if (skippedTags.length > 0 && Object.keys(tagSlotMapping).length === 0) {
return NextResponse.json(
{ error: `No available tag slots. Could not assign: ${skippedTags.join(', ')}` },
{ status: 422 }
)
}
finalSourceConfig = { ...sourceConfig, tagSlotMapping }
}
const now = new Date()
const connectorId = crypto.randomUUID()
const nextSyncAt =
syncIntervalMinutes > 0 ? new Date(now.getTime() + syncIntervalMinutes * 60 * 1000) : null
await db.transaction(async (tx) => {
for (const [semanticId, slot] of Object.entries(tagSlotMapping)) {
const td = connectorConfig.tagDefinitions!.find((d) => d.id === semanticId)!
await createTagDefinition(
{
knowledgeBaseId,
tagSlot: slot,
displayName: td.displayName,
fieldType: td.fieldType,
},
requestId,
tx
)
}
await tx.insert(knowledgeConnector).values({
id: connectorId,
knowledgeBaseId,
connectorType,
credentialId,
sourceConfig: finalSourceConfig,
syncIntervalMinutes,
status: 'active',
nextSyncAt,
createdAt: now,
updatedAt: now,
})
})
logger.info(`[${requestId}] Created connector ${connectorId} for KB ${knowledgeBaseId}`)
dispatchSync(connectorId, { requestId }).catch((error) => {
logger.error(
`[${requestId}] Failed to dispatch initial sync for connector ${connectorId}`,
error
)
})
const created = await db
.select()
.from(knowledgeConnector)
.where(eq(knowledgeConnector.id, connectorId))
.limit(1)
return NextResponse.json({ success: true, data: created[0] }, { status: 201 })
} catch (error) {
logger.error(`[${requestId}] Error creating connector`, error)
return NextResponse.json({ error: 'Internal server error' }, { status: 500 })
}
}
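A hedged sketch of a create-connector request matching CreateConnectorSchema above; the connectorType, credentialId, and sourceConfig values are placeholders, only the field names, defaults, and status codes come from the handler:
// Sketch: create a connector; syncIntervalMinutes defaults to 1440 (daily), 0 disables scheduling.
async function createConnector(knowledgeBaseId: string) {
  const res = await fetch(`/api/knowledge/${knowledgeBaseId}/connectors`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      connectorType: 'example-source', // placeholder; must exist in CONNECTOR_REGISTRY
      credentialId: 'cred_123', // placeholder OAuth credential id
      sourceConfig: { folderId: 'root' }, // placeholder; checked by the connector's validateConfig
      syncIntervalMinutes: 60,
    }),
  })
  if (res.status === 422) throw new Error('No free tag slots on this knowledge base')
  if (!res.ok) throw new Error(`Create failed: ${res.status}`)
  // 201 with the inserted row; an initial sync is dispatched in the background.
  return (await res.json()).data
}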

View File

@@ -4,6 +4,7 @@
* @vitest-environment node
*/
import {
auditMock,
createMockRequest,
mockAuth,
mockConsoleLogger,
@@ -35,6 +36,8 @@ vi.mock('@/lib/knowledge/documents/service', () => ({
mockDrizzleOrm()
mockConsoleLogger()
vi.mock('@/lib/audit/log', () => auditMock)
describe('Document By ID API Route', () => {
const mockAuth$ = mockAuth()

View File

@@ -1,6 +1,7 @@
import { createLogger } from '@sim/logger'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { checkSessionOrInternalAuth } from '@/lib/auth/hybrid'
import { generateRequestId } from '@/lib/core/utils/request'
import {
@@ -197,6 +198,19 @@ export async function PUT(
`[${requestId}] Document updated: ${documentId} in knowledge base ${knowledgeBaseId}`
)
recordAudit({
workspaceId: accessCheck.knowledgeBase?.workspaceId ?? null,
actorId: userId,
actorName: auth.userName,
actorEmail: auth.userEmail,
action: AuditAction.DOCUMENT_UPDATED,
resourceType: AuditResourceType.DOCUMENT,
resourceId: documentId,
resourceName: validatedData.filename ?? accessCheck.document?.filename,
description: `Updated document "${documentId}" in knowledge base "${knowledgeBaseId}"`,
request: req,
})
return NextResponse.json({
success: true,
data: updatedDocument,
@@ -257,6 +271,19 @@ export async function DELETE(
`[${requestId}] Document deleted: ${documentId} from knowledge base ${knowledgeBaseId}`
)
recordAudit({
workspaceId: accessCheck.knowledgeBase?.workspaceId ?? null,
actorId: userId,
actorName: auth.userName,
actorEmail: auth.userEmail,
action: AuditAction.DOCUMENT_DELETED,
resourceType: AuditResourceType.DOCUMENT,
resourceId: documentId,
resourceName: accessCheck.document?.filename,
description: `Deleted document "${documentId}" from knowledge base "${knowledgeBaseId}"`,
request: req,
})
return NextResponse.json({
success: true,
data: result,
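For reference, the recordAudit call shape used throughout these routes, inferred purely from the call sites in this diff; the real type lives in @/lib/audit/log and may differ:
// Inferred sketch only, not the actual definition in @/lib/audit/log.
interface RecordAuditInputSketch {
  workspaceId?: string | null // null for org- and user-scoped events
  actorId: string
  actorName?: string | null
  actorEmail?: string | null
  action: string // AuditAction enum member
  resourceType: string // AuditResourceType enum member
  resourceId: string
  resourceName?: string
  description: string
  metadata?: Record<string, unknown>
  request?: Request // assumption: used to capture request context
}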

View File

@@ -4,6 +4,7 @@
* @vitest-environment node
*/
import {
auditMock,
createMockRequest,
mockAuth,
mockConsoleLogger,
@@ -40,6 +41,8 @@ vi.mock('@/lib/knowledge/documents/service', () => ({
mockDrizzleOrm()
mockConsoleLogger()
vi.mock('@/lib/audit/log', () => auditMock)
describe('Knowledge Base Documents API Route', () => {
const mockAuth$ = mockAuth()

View File

@@ -2,6 +2,7 @@ import { randomUUID } from 'crypto'
import { createLogger } from '@sim/logger'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { getSession } from '@/lib/auth'
import { checkSessionOrInternalAuth } from '@/lib/auth/hybrid'
import {
@@ -12,7 +13,6 @@ import {
getDocuments,
getProcessingConfig,
processDocumentsWithQueue,
type TagFilterCondition,
} from '@/lib/knowledge/documents/service'
import type { DocumentSortField, SortOrder } from '@/lib/knowledge/documents/types'
import { authorizeWorkflowByWorkspacePermission } from '@/lib/workflows/utils'
@@ -131,21 +131,6 @@ export async function GET(req: NextRequest, { params }: { params: Promise<{ id:
? (sortOrderParam as SortOrder)
: undefined
let tagFilters: TagFilterCondition[] | undefined
const tagFiltersParam = url.searchParams.get('tagFilters')
if (tagFiltersParam) {
try {
const parsed = JSON.parse(tagFiltersParam)
if (Array.isArray(parsed)) {
tagFilters = parsed.filter(
(f: TagFilterCondition) => f.tagSlot && f.operator && f.value !== undefined
)
}
} catch {
logger.warn(`[${requestId}] Invalid tagFilters param`)
}
}
const result = await getDocuments(
knowledgeBaseId,
{
@@ -155,7 +140,6 @@ export async function GET(req: NextRequest, { params }: { params: Promise<{ id:
offset,
...(sortBy && { sortBy }),
...(sortOrder && { sortOrder }),
tagFilters,
},
requestId
)
@@ -261,6 +245,19 @@ export async function POST(req: NextRequest, { params }: { params: Promise<{ id:
logger.error(`[${requestId}] Critical error in document processing pipeline:`, error)
})
recordAudit({
workspaceId: accessCheck.knowledgeBase?.workspaceId ?? null,
actorId: userId,
actorName: auth.userName,
actorEmail: auth.userEmail,
action: AuditAction.DOCUMENT_UPLOADED,
resourceType: AuditResourceType.DOCUMENT,
resourceId: knowledgeBaseId,
resourceName: `${createdDocuments.length} document(s)`,
description: `Uploaded ${createdDocuments.length} document(s) to knowledge base "${knowledgeBaseId}"`,
request: req,
})
return NextResponse.json({
success: true,
data: {
@@ -309,6 +306,19 @@ export async function POST(req: NextRequest, { params }: { params: Promise<{ id:
// Silently fail
}
recordAudit({
workspaceId: accessCheck.knowledgeBase?.workspaceId ?? null,
actorId: userId,
actorName: auth.userName,
actorEmail: auth.userEmail,
action: AuditAction.DOCUMENT_UPLOADED,
resourceType: AuditResourceType.DOCUMENT,
resourceId: knowledgeBaseId,
resourceName: validatedData.filename,
description: `Uploaded document "${validatedData.filename}" to knowledge base "${knowledgeBaseId}"`,
request: req,
})
return NextResponse.json({
success: true,
data: newDocument,

View File

@@ -4,6 +4,7 @@
* @vitest-environment node
*/
import {
auditMock,
createMockRequest,
mockAuth,
mockConsoleLogger,
@@ -16,6 +17,8 @@ mockKnowledgeSchemas()
mockDrizzleOrm()
mockConsoleLogger()
vi.mock('@/lib/audit/log', () => auditMock)
vi.mock('@/lib/knowledge/service', () => ({
getKnowledgeBaseById: vi.fn(),
updateKnowledgeBase: vi.fn(),

View File

@@ -1,6 +1,7 @@
import { createLogger } from '@sim/logger'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { checkSessionOrInternalAuth } from '@/lib/auth/hybrid'
import { PlatformEvents } from '@/lib/core/telemetry'
import { generateRequestId } from '@/lib/core/utils/request'
@@ -135,6 +136,19 @@ export async function PUT(req: NextRequest, { params }: { params: Promise<{ id:
logger.info(`[${requestId}] Knowledge base updated: ${id} for user ${userId}`)
recordAudit({
workspaceId: accessCheck.knowledgeBase.workspaceId ?? null,
actorId: userId,
actorName: auth.userName,
actorEmail: auth.userEmail,
action: AuditAction.KNOWLEDGE_BASE_UPDATED,
resourceType: AuditResourceType.KNOWLEDGE_BASE,
resourceId: id,
resourceName: validatedData.name ?? updatedKnowledgeBase.name,
description: `Updated knowledge base "${validatedData.name ?? updatedKnowledgeBase.name}"`,
request: req,
})
return NextResponse.json({
success: true,
data: updatedKnowledgeBase,
@@ -197,6 +211,19 @@ export async function DELETE(
logger.info(`[${requestId}] Knowledge base deleted: ${id} for user ${userId}`)
recordAudit({
workspaceId: accessCheck.knowledgeBase.workspaceId ?? null,
actorId: userId,
actorName: auth.userName,
actorEmail: auth.userEmail,
action: AuditAction.KNOWLEDGE_BASE_DELETED,
resourceType: AuditResourceType.KNOWLEDGE_BASE,
resourceId: id,
resourceName: accessCheck.knowledgeBase.name,
description: `Deleted knowledge base "${accessCheck.knowledgeBase.name || id}"`,
request: _request,
})
return NextResponse.json({
success: true,
data: { message: 'Knowledge base deleted successfully' },

View File

@@ -1,68 +0,0 @@
import { db } from '@sim/db'
import { knowledgeConnector } from '@sim/db/schema'
import { createLogger } from '@sim/logger'
import { and, eq, isNull, lte } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { verifyCronAuth } from '@/lib/auth/internal'
import { generateRequestId } from '@/lib/core/utils/request'
import { dispatchSync } from '@/lib/knowledge/connectors/sync-engine'
export const dynamic = 'force-dynamic'
const logger = createLogger('ConnectorSyncSchedulerAPI')
/**
* Cron endpoint that checks for connectors due for sync and dispatches sync jobs.
* Should be called every 5 minutes by an external cron service.
*/
export async function GET(request: NextRequest) {
const requestId = generateRequestId()
logger.info(`[${requestId}] Connector sync scheduler triggered`)
const authError = verifyCronAuth(request, 'Connector sync scheduler')
if (authError) {
return authError
}
try {
const now = new Date()
const dueConnectors = await db
.select({
id: knowledgeConnector.id,
})
.from(knowledgeConnector)
.where(
and(
eq(knowledgeConnector.status, 'active'),
lte(knowledgeConnector.nextSyncAt, now),
isNull(knowledgeConnector.deletedAt)
)
)
logger.info(`[${requestId}] Found ${dueConnectors.length} connectors due for sync`)
if (dueConnectors.length === 0) {
return NextResponse.json({
success: true,
message: 'No connectors due for sync',
count: 0,
})
}
for (const connector of dueConnectors) {
dispatchSync(connector.id, { requestId }).catch((error) => {
logger.error(`[${requestId}] Failed to dispatch sync for connector ${connector.id}`, error)
})
}
return NextResponse.json({
success: true,
message: `Dispatched ${dueConnectors.length} connector sync(s)`,
count: dueConnectors.length,
})
} catch (error) {
logger.error(`[${requestId}] Connector sync scheduler error`, error)
return NextResponse.json({ error: 'Internal server error' }, { status: 500 })
}
}
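The scheduler route is pull-based: nothing runs unless an external cron hits it. A hedged sketch of a minimal external trigger; the URL and auth header below are placeholders, and the real values depend on the deployment and on what verifyCronAuth expects:
// Sketch: poll the scheduler endpoint every 5 minutes from an external process.
const SCHEDULER_URL = 'https://example.com/api/schedules/connectors' // placeholder path
const CRON_SECRET = process.env.CRON_SECRET ?? '' // placeholder auth secret

setInterval(async () => {
  const res = await fetch(SCHEDULER_URL, {
    headers: { authorization: `Bearer ${CRON_SECRET}` }, // assumption: bearer-style cron auth
  })
  const body = await res.json()
  console.log(`dispatched ${body.count ?? 0} connector sync(s)`)
}, 5 * 60 * 1000)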

View File

@@ -4,6 +4,7 @@
* @vitest-environment node
*/
import {
auditMock,
createMockRequest,
mockAuth,
mockConsoleLogger,
@@ -16,6 +17,8 @@ mockKnowledgeSchemas()
mockDrizzleOrm()
mockConsoleLogger()
vi.mock('@/lib/audit/log', () => auditMock)
vi.mock('@/lib/workspaces/permissions/utils', () => ({
getUserEntityPermissions: vi.fn().mockResolvedValue('admin'),
}))

View File

@@ -1,6 +1,7 @@
import { createLogger } from '@sim/logger'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { getSession } from '@/lib/auth'
import { PlatformEvents } from '@/lib/core/telemetry'
import { generateRequestId } from '@/lib/core/utils/request'
@@ -109,6 +110,20 @@ export async function POST(req: NextRequest) {
`[${requestId}] Knowledge base created: ${newKnowledgeBase.id} for user ${session.user.id}`
)
recordAudit({
workspaceId: validatedData.workspaceId,
actorId: session.user.id,
actorName: session.user.name,
actorEmail: session.user.email,
action: AuditAction.KNOWLEDGE_BASE_CREATED,
resourceType: AuditResourceType.KNOWLEDGE_BASE,
resourceId: newKnowledgeBase.id,
resourceName: validatedData.name,
description: `Created knowledge base "${validatedData.name}"`,
metadata: { name: validatedData.name },
request: req,
})
return NextResponse.json({
success: true,
data: newKnowledgeBase,

View File

@@ -99,7 +99,7 @@ export interface EmbeddingData {
export interface KnowledgeBaseAccessResult {
hasAccess: true
knowledgeBase: Pick<KnowledgeBaseData, 'id' | 'userId' | 'workspaceId'>
knowledgeBase: Pick<KnowledgeBaseData, 'id' | 'userId' | 'workspaceId' | 'name'>
}
export interface KnowledgeBaseAccessDenied {
@@ -113,7 +113,7 @@ export type KnowledgeBaseAccessCheck = KnowledgeBaseAccessResult | KnowledgeBase
export interface DocumentAccessResult {
hasAccess: true
document: DocumentData
knowledgeBase: Pick<KnowledgeBaseData, 'id' | 'userId' | 'workspaceId'>
knowledgeBase: Pick<KnowledgeBaseData, 'id' | 'userId' | 'workspaceId' | 'name'>
}
export interface DocumentAccessDenied {
@@ -128,7 +128,7 @@ export interface ChunkAccessResult {
hasAccess: true
chunk: EmbeddingData
document: DocumentData
knowledgeBase: Pick<KnowledgeBaseData, 'id' | 'userId' | 'workspaceId'>
knowledgeBase: Pick<KnowledgeBaseData, 'id' | 'userId' | 'workspaceId' | 'name'>
}
export interface ChunkAccessDenied {
@@ -151,6 +151,7 @@ export async function checkKnowledgeBaseAccess(
id: knowledgeBase.id,
userId: knowledgeBase.userId,
workspaceId: knowledgeBase.workspaceId,
name: knowledgeBase.name,
})
.from(knowledgeBase)
.where(and(eq(knowledgeBase.id, knowledgeBaseId), isNull(knowledgeBase.deletedAt)))
@@ -193,6 +194,7 @@ export async function checkKnowledgeBaseWriteAccess(
id: knowledgeBase.id,
userId: knowledgeBase.userId,
workspaceId: knowledgeBase.workspaceId,
name: knowledgeBase.name,
})
.from(knowledgeBase)
.where(and(eq(knowledgeBase.id, knowledgeBaseId), isNull(knowledgeBase.deletedAt)))
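Adding name to the access-check payload keeps the existing discriminated-union flow intact; callers still narrow on hasAccess (and the optional notFound flag) before touching knowledgeBase. A small usage sketch mirroring the route handlers above:
// Sketch: narrowing a KnowledgeBaseAccessCheck result.
import { checkKnowledgeBaseWriteAccess } from '@/app/api/knowledge/utils'

async function describeKb(knowledgeBaseId: string, userId: string) {
  const check = await checkKnowledgeBaseWriteAccess(knowledgeBaseId, userId)
  if (!check.hasAccess) {
    // 'notFound' only exists on the denied branch, hence the `in` guard used in the routes.
    return 'notFound' in check && check.notFound ? 404 : 401
  }
  // On the success branch the name field added above is available for audit descriptions.
  return `KB "${check.knowledgeBase.name}" in workspace ${check.knowledgeBase.workspaceId}`
}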

View File

@@ -3,6 +3,8 @@ import { mcpServers } from '@sim/db/schema'
import { createLogger } from '@sim/logger'
import { and, eq, isNull } from 'drizzle-orm'
import type { NextRequest } from 'next/server'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { McpDomainNotAllowedError, validateMcpDomain } from '@/lib/mcp/domain-check'
import { getParsedBody, withMcpAuth } from '@/lib/mcp/middleware'
import { mcpService } from '@/lib/mcp/service'
import { createMcpErrorResponse, createMcpSuccessResponse } from '@/lib/mcp/utils'
@@ -15,7 +17,11 @@ export const dynamic = 'force-dynamic'
* PATCH - Update an MCP server in the workspace (requires write or admin permission)
*/
export const PATCH = withMcpAuth<{ id: string }>('write')(
async (request: NextRequest, { userId, workspaceId, requestId }, { params }) => {
async (
request: NextRequest,
{ userId, userName, userEmail, workspaceId, requestId },
{ params }
) => {
const { id: serverId } = await params
try {
@@ -29,6 +35,17 @@ export const PATCH = withMcpAuth<{ id: string }>('write')(
// Remove workspaceId from body to prevent it from being updated
const { workspaceId: _, ...updateData } = body
if (updateData.url) {
try {
validateMcpDomain(updateData.url)
} catch (e) {
if (e instanceof McpDomainNotAllowedError) {
return createMcpErrorResponse(e, e.message, 403)
}
throw e
}
}
// Get the current server to check if URL is changing
const [currentServer] = await db
.select({ url: mcpServers.url })
@@ -73,6 +90,20 @@ export const PATCH = withMcpAuth<{ id: string }>('write')(
}
logger.info(`[${requestId}] Successfully updated MCP server: ${serverId}`)
recordAudit({
workspaceId,
actorId: userId,
actorName: userName,
actorEmail: userEmail,
action: AuditAction.MCP_SERVER_UPDATED,
resourceType: AuditResourceType.MCP_SERVER,
resourceId: serverId,
resourceName: updatedServer.name || serverId,
description: `Updated MCP server "${updatedServer.name || serverId}"`,
request,
})
return createMcpSuccessResponse({ server: updatedServer })
} catch (error) {
logger.error(`[${requestId}] Error updating MCP server:`, error)

View File

@@ -3,6 +3,8 @@ import { mcpServers } from '@sim/db/schema'
import { createLogger } from '@sim/logger'
import { and, eq, isNull } from 'drizzle-orm'
import type { NextRequest } from 'next/server'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { McpDomainNotAllowedError, validateMcpDomain } from '@/lib/mcp/domain-check'
import { getParsedBody, withMcpAuth } from '@/lib/mcp/middleware'
import { mcpService } from '@/lib/mcp/service'
import {
@@ -54,7 +56,7 @@ export const GET = withMcpAuth('read')(
* it will be updated instead of creating a duplicate.
*/
export const POST = withMcpAuth('write')(
async (request: NextRequest, { userId, workspaceId, requestId }) => {
async (request: NextRequest, { userId, userName, userEmail, workspaceId, requestId }) => {
try {
const body = getParsedBody(request) || (await request.json())
@@ -72,6 +74,15 @@ export const POST = withMcpAuth('write')(
)
}
try {
validateMcpDomain(body.url)
} catch (e) {
if (e instanceof McpDomainNotAllowedError) {
return createMcpErrorResponse(e, e.message, 403)
}
throw e
}
const serverId = body.url ? generateMcpServerId(workspaceId, body.url) : crypto.randomUUID()
const [existingServer] = await db
@@ -151,6 +162,20 @@ export const POST = withMcpAuth('write')(
// Silently fail
}
recordAudit({
workspaceId,
actorId: userId,
actorName: userName,
actorEmail: userEmail,
action: AuditAction.MCP_SERVER_ADDED,
resourceType: AuditResourceType.MCP_SERVER,
resourceId: serverId,
resourceName: body.name,
description: `Added MCP server "${body.name}"`,
metadata: { serverName: body.name, transport: body.transport },
request,
})
return createMcpSuccessResponse({ serverId }, 201)
} catch (error) {
logger.error(`[${requestId}] Error registering MCP server:`, error)
@@ -167,7 +192,7 @@ export const POST = withMcpAuth('write')(
* DELETE - Delete an MCP server from the workspace (requires admin permission)
*/
export const DELETE = withMcpAuth('admin')(
async (request: NextRequest, { userId, workspaceId, requestId }) => {
async (request: NextRequest, { userId, userName, userEmail, workspaceId, requestId }) => {
try {
const { searchParams } = new URL(request.url)
const serverId = searchParams.get('serverId')
@@ -198,6 +223,20 @@ export const DELETE = withMcpAuth('admin')(
await mcpService.clearCache(workspaceId)
logger.info(`[${requestId}] Successfully deleted MCP server: ${serverId}`)
recordAudit({
workspaceId,
actorId: userId,
actorName: userName,
actorEmail: userEmail,
action: AuditAction.MCP_SERVER_REMOVED,
resourceType: AuditResourceType.MCP_SERVER,
resourceId: serverId!,
resourceName: deletedServer.name,
description: `Removed MCP server "${deletedServer.name}"`,
request,
})
return createMcpSuccessResponse({ message: `Server ${serverId} deleted successfully` })
} catch (error) {
logger.error(`[${requestId}] Error deleting MCP server:`, error)

View File

@@ -1,6 +1,7 @@
import { createLogger } from '@sim/logger'
import type { NextRequest } from 'next/server'
import { McpClient } from '@/lib/mcp/client'
import { McpDomainNotAllowedError, validateMcpDomain } from '@/lib/mcp/domain-check'
import { getParsedBody, withMcpAuth } from '@/lib/mcp/middleware'
import { resolveMcpConfigEnvVars } from '@/lib/mcp/resolve-config'
import type { McpTransport } from '@/lib/mcp/types'
@@ -71,6 +72,15 @@ export const POST = withMcpAuth('write')(
)
}
try {
validateMcpDomain(body.url)
} catch (e) {
if (e instanceof McpDomainNotAllowedError) {
return createMcpErrorResponse(e, e.message, 403)
}
throw e
}
// Build initial config for resolution
const initialConfig = {
id: `test-${requestId}`,
@@ -95,6 +105,16 @@ export const POST = withMcpAuth('write')(
logger.warn(`[${requestId}] Some environment variables not found:`, { missingVars })
}
// Re-validate domain after env var resolution
try {
validateMcpDomain(testConfig.url)
} catch (e) {
if (e instanceof McpDomainNotAllowedError) {
return createMcpErrorResponse(e, e.message, 403)
}
throw e
}
const testSecurityPolicy = {
requireConsent: false,
auditLevel: 'none' as const,
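The same validateMcpDomain try/catch appears in the register, update, and test-connection routes. A sketch of one way to factor it, using only symbols already imported in those routes; this is illustrative, the repo keeps the inline form:
// Sketch: shared guard returning an error response instead of repeating the try/catch.
function guardMcpDomain(url: string | undefined) {
  if (!url) return null
  try {
    validateMcpDomain(url)
    return null
  } catch (e) {
    if (e instanceof McpDomainNotAllowedError) {
      return createMcpErrorResponse(e, e.message, 403)
    }
    throw e
  }
}
// Usage inside a handler:
//   const denied = guardMcpDomain(body.url)
//   if (denied) return denied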

View File

@@ -3,6 +3,7 @@ import { workflowMcpServer, workflowMcpTool } from '@sim/db/schema'
import { createLogger } from '@sim/logger'
import { and, eq } from 'drizzle-orm'
import type { NextRequest } from 'next/server'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { getParsedBody, withMcpAuth } from '@/lib/mcp/middleware'
import { mcpPubSub } from '@/lib/mcp/pubsub'
import { createMcpErrorResponse, createMcpSuccessResponse } from '@/lib/mcp/utils'
@@ -71,7 +72,11 @@ export const GET = withMcpAuth<RouteParams>('read')(
* PATCH - Update a workflow MCP server
*/
export const PATCH = withMcpAuth<RouteParams>('write')(
async (request: NextRequest, { userId, workspaceId, requestId }, { params }) => {
async (
request: NextRequest,
{ userId, userName, userEmail, workspaceId, requestId },
{ params }
) => {
try {
const { id: serverId } = await params
const body = getParsedBody(request) || (await request.json())
@@ -112,6 +117,19 @@ export const PATCH = withMcpAuth<RouteParams>('write')(
logger.info(`[${requestId}] Successfully updated workflow MCP server: ${serverId}`)
recordAudit({
workspaceId,
actorId: userId,
actorName: userName,
actorEmail: userEmail,
action: AuditAction.MCP_SERVER_UPDATED,
resourceType: AuditResourceType.MCP_SERVER,
resourceId: serverId,
resourceName: updatedServer.name,
description: `Updated workflow MCP server "${updatedServer.name}"`,
request,
})
return createMcpSuccessResponse({ server: updatedServer })
} catch (error) {
logger.error(`[${requestId}] Error updating workflow MCP server:`, error)
@@ -128,7 +146,11 @@ export const PATCH = withMcpAuth<RouteParams>('write')(
* DELETE - Delete a workflow MCP server and all its tools
*/
export const DELETE = withMcpAuth<RouteParams>('admin')(
async (request: NextRequest, { userId, workspaceId, requestId }, { params }) => {
async (
request: NextRequest,
{ userId, userName, userEmail, workspaceId, requestId },
{ params }
) => {
try {
const { id: serverId } = await params
@@ -149,6 +171,19 @@ export const DELETE = withMcpAuth<RouteParams>('admin')(
mcpPubSub?.publishWorkflowToolsChanged({ serverId, workspaceId })
recordAudit({
workspaceId,
actorId: userId,
actorName: userName,
actorEmail: userEmail,
action: AuditAction.MCP_SERVER_REMOVED,
resourceType: AuditResourceType.MCP_SERVER,
resourceId: serverId,
resourceName: deletedServer.name,
description: `Unpublished workflow MCP server "${deletedServer.name}"`,
request,
})
return createMcpSuccessResponse({ message: `Server ${serverId} deleted successfully` })
} catch (error) {
logger.error(`[${requestId}] Error deleting workflow MCP server:`, error)

View File

@@ -3,6 +3,7 @@ import { workflowMcpServer, workflowMcpTool } from '@sim/db/schema'
import { createLogger } from '@sim/logger'
import { and, eq } from 'drizzle-orm'
import type { NextRequest } from 'next/server'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { getParsedBody, withMcpAuth } from '@/lib/mcp/middleware'
import { mcpPubSub } from '@/lib/mcp/pubsub'
import { createMcpErrorResponse, createMcpSuccessResponse } from '@/lib/mcp/utils'
@@ -65,7 +66,11 @@ export const GET = withMcpAuth<RouteParams>('read')(
* PATCH - Update a tool's configuration
*/
export const PATCH = withMcpAuth<RouteParams>('write')(
async (request: NextRequest, { userId, workspaceId, requestId }, { params }) => {
async (
request: NextRequest,
{ userId, userName, userEmail, workspaceId, requestId },
{ params }
) => {
try {
const { id: serverId, toolId } = await params
const body = getParsedBody(request) || (await request.json())
@@ -118,6 +123,19 @@ export const PATCH = withMcpAuth<RouteParams>('write')(
mcpPubSub?.publishWorkflowToolsChanged({ serverId, workspaceId })
recordAudit({
workspaceId,
actorId: userId,
actorName: userName,
actorEmail: userEmail,
action: AuditAction.MCP_SERVER_UPDATED,
resourceType: AuditResourceType.MCP_SERVER,
resourceId: serverId,
description: `Updated tool "${updatedTool.toolName}" in MCP server`,
metadata: { toolId, toolName: updatedTool.toolName },
request,
})
return createMcpSuccessResponse({ tool: updatedTool })
} catch (error) {
logger.error(`[${requestId}] Error updating tool:`, error)
@@ -134,7 +152,11 @@ export const PATCH = withMcpAuth<RouteParams>('write')(
* DELETE - Remove a tool from an MCP server
*/
export const DELETE = withMcpAuth<RouteParams>('write')(
async (request: NextRequest, { userId, workspaceId, requestId }, { params }) => {
async (
request: NextRequest,
{ userId, userName, userEmail, workspaceId, requestId },
{ params }
) => {
try {
const { id: serverId, toolId } = await params
@@ -165,6 +187,19 @@ export const DELETE = withMcpAuth<RouteParams>('write')(
mcpPubSub?.publishWorkflowToolsChanged({ serverId, workspaceId })
recordAudit({
workspaceId,
actorId: userId,
actorName: userName,
actorEmail: userEmail,
action: AuditAction.MCP_SERVER_UPDATED,
resourceType: AuditResourceType.MCP_SERVER,
resourceId: serverId,
description: `Removed tool "${deletedTool.toolName}" from MCP server`,
metadata: { toolId, toolName: deletedTool.toolName },
request,
})
return createMcpSuccessResponse({ message: `Tool ${toolId} deleted successfully` })
} catch (error) {
logger.error(`[${requestId}] Error deleting tool:`, error)

View File

@@ -3,6 +3,7 @@ import { workflow, workflowMcpServer, workflowMcpTool } from '@sim/db/schema'
import { createLogger } from '@sim/logger'
import { and, eq } from 'drizzle-orm'
import type { NextRequest } from 'next/server'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { getParsedBody, withMcpAuth } from '@/lib/mcp/middleware'
import { mcpPubSub } from '@/lib/mcp/pubsub'
import { createMcpErrorResponse, createMcpSuccessResponse } from '@/lib/mcp/utils'
@@ -76,7 +77,11 @@ export const GET = withMcpAuth<RouteParams>('read')(
* POST - Add a workflow as a tool to an MCP server
*/
export const POST = withMcpAuth<RouteParams>('write')(
async (request: NextRequest, { userId, workspaceId, requestId }, { params }) => {
async (
request: NextRequest,
{ userId, userName, userEmail, workspaceId, requestId },
{ params }
) => {
try {
const { id: serverId } = await params
const body = getParsedBody(request) || (await request.json())
@@ -197,6 +202,19 @@ export const POST = withMcpAuth<RouteParams>('write')(
mcpPubSub?.publishWorkflowToolsChanged({ serverId, workspaceId })
recordAudit({
workspaceId,
actorId: userId,
actorName: userName,
actorEmail: userEmail,
action: AuditAction.MCP_SERVER_UPDATED,
resourceType: AuditResourceType.MCP_SERVER,
resourceId: serverId,
description: `Added tool "${toolName}" to MCP server`,
metadata: { toolId, toolName, workflowId: body.workflowId },
request,
})
return createMcpSuccessResponse({ tool }, 201)
} catch (error) {
logger.error(`[${requestId}] Error adding tool:`, error)

View File

@@ -3,6 +3,7 @@ import { workflow, workflowMcpServer, workflowMcpTool } from '@sim/db/schema'
import { createLogger } from '@sim/logger'
import { eq, inArray, sql } from 'drizzle-orm'
import type { NextRequest } from 'next/server'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { getParsedBody, withMcpAuth } from '@/lib/mcp/middleware'
import { mcpPubSub } from '@/lib/mcp/pubsub'
import { createMcpErrorResponse, createMcpSuccessResponse } from '@/lib/mcp/utils'
@@ -85,7 +86,7 @@ export const GET = withMcpAuth('read')(
* POST - Create a new workflow MCP server
*/
export const POST = withMcpAuth('write')(
async (request: NextRequest, { userId, workspaceId, requestId }) => {
async (request: NextRequest, { userId, userName, userEmail, workspaceId, requestId }) => {
try {
const body = getParsedBody(request) || (await request.json())
@@ -188,6 +189,19 @@ export const POST = withMcpAuth('write')(
`[${requestId}] Successfully created workflow MCP server: ${body.name} (ID: ${serverId})`
)
recordAudit({
workspaceId,
actorId: userId,
actorName: userName,
actorEmail: userEmail,
action: AuditAction.MCP_SERVER_ADDED,
resourceType: AuditResourceType.MCP_SERVER,
resourceId: serverId,
resourceName: body.name.trim(),
description: `Published workflow MCP server "${body.name.trim()}" with ${addedTools.length} tool(s)`,
request,
})
return createMcpSuccessResponse({ server, addedTools }, 201)
} catch (error) {
logger.error(`[${requestId}] Error creating workflow MCP server:`, error)

View File

@@ -18,6 +18,7 @@ import { and, eq } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { getEmailSubject, renderInvitationEmail } from '@/components/emails'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { getSession } from '@/lib/auth'
import { hasAccessControlAccess } from '@/lib/billing'
import { syncUsageLimitsFromSubscription } from '@/lib/billing/core/usage'
@@ -552,6 +553,25 @@ export async function PUT(
email: orgInvitation.email,
})
const auditActionMap = {
accepted: AuditAction.ORG_INVITATION_ACCEPTED,
rejected: AuditAction.ORG_INVITATION_REJECTED,
cancelled: AuditAction.ORG_INVITATION_CANCELLED,
} as const
recordAudit({
workspaceId: null,
actorId: session.user.id,
action: auditActionMap[status],
resourceType: AuditResourceType.ORGANIZATION,
resourceId: organizationId,
actorName: session.user.name ?? undefined,
actorEmail: session.user.email ?? undefined,
description: `Organization invitation ${status} for ${orgInvitation.email}`,
metadata: { invitationId, email: orgInvitation.email, status },
request: req,
})
return NextResponse.json({
success: true,
message: `Invitation ${status} successfully`,

View File

@@ -17,6 +17,7 @@ import {
renderBatchInvitationEmail,
renderInvitationEmail,
} from '@/components/emails'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { getSession } from '@/lib/auth'
import {
validateBulkInvitations,
@@ -411,6 +412,22 @@ export async function POST(request: NextRequest, { params }: { params: Promise<{
workspaceInvitationCount: workspaceInvitationIds.length,
})
for (const inv of invitationsToCreate) {
recordAudit({
workspaceId: null,
actorId: session.user.id,
action: AuditAction.ORG_INVITATION_CREATED,
resourceType: AuditResourceType.ORGANIZATION,
resourceId: organizationId,
actorName: session.user.name ?? undefined,
actorEmail: session.user.email ?? undefined,
resourceName: organizationEntry[0]?.name,
description: `Invited ${inv.email} to organization as ${role}`,
metadata: { invitationId: inv.id, email: inv.email, role },
request,
})
}
return NextResponse.json({
success: true,
message: `${invitationsToCreate.length} invitation(s) sent successfully`,
@@ -532,6 +549,19 @@ export async function DELETE(
email: result[0].email,
})
recordAudit({
workspaceId: null,
actorId: session.user.id,
action: AuditAction.ORG_INVITATION_REVOKED,
resourceType: AuditResourceType.ORGANIZATION,
resourceId: organizationId,
actorName: session.user.name ?? undefined,
actorEmail: session.user.email ?? undefined,
description: `Revoked organization invitation for ${result[0].email}`,
metadata: { invitationId, email: result[0].email },
request,
})
return NextResponse.json({
success: true,
message: 'Invitation cancelled successfully',

View File

@@ -4,6 +4,7 @@ import { createLogger } from '@sim/logger'
import { and, eq } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { getSession } from '@/lib/auth'
import { getUserUsageData } from '@/lib/billing/core/usage'
import { removeUserFromOrganization } from '@/lib/billing/organizations/membership'
@@ -213,6 +214,19 @@ export async function PUT(
updatedBy: session.user.id,
})
recordAudit({
workspaceId: null,
actorId: session.user.id,
action: AuditAction.ORG_MEMBER_ROLE_CHANGED,
resourceType: AuditResourceType.ORGANIZATION,
resourceId: organizationId,
actorName: session.user.name ?? undefined,
actorEmail: session.user.email ?? undefined,
description: `Changed role for member ${memberId} to ${role}`,
metadata: { targetUserId: memberId, newRole: role },
request,
})
return NextResponse.json({
success: true,
message: 'Member role updated successfully',
@@ -305,6 +319,22 @@ export async function DELETE(
billingActions: result.billingActions,
})
recordAudit({
workspaceId: null,
actorId: session.user.id,
action: AuditAction.ORG_MEMBER_REMOVED,
resourceType: AuditResourceType.ORGANIZATION,
resourceId: organizationId,
actorName: session.user.name ?? undefined,
actorEmail: session.user.email ?? undefined,
description:
session.user.id === targetUserId
? 'Left the organization'
: `Removed member ${targetUserId} from organization`,
metadata: { targetUserId, wasSelfRemoval: session.user.id === targetUserId },
request,
})
return NextResponse.json({
success: true,
message:

View File

@@ -5,6 +5,7 @@ import { createLogger } from '@sim/logger'
import { and, eq } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { getEmailSubject, renderInvitationEmail } from '@/components/emails'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { getSession } from '@/lib/auth'
import { getUserUsageData } from '@/lib/billing/core/usage'
import { validateSeatAvailability } from '@/lib/billing/validation/seat-management'
@@ -285,6 +286,19 @@ export async function POST(request: NextRequest, { params }: { params: Promise<{
// Don't fail the request if email fails
}
recordAudit({
workspaceId: null,
actorId: session.user.id,
action: AuditAction.ORG_INVITATION_CREATED,
resourceType: AuditResourceType.ORGANIZATION,
resourceId: organizationId,
actorName: session.user.name ?? undefined,
actorEmail: session.user.email ?? undefined,
description: `Invited ${normalizedEmail} to organization as ${role}`,
metadata: { invitationId, email: normalizedEmail, role },
request,
})
return NextResponse.json({
success: true,
message: `Invitation sent to ${normalizedEmail}`,

View File

@@ -4,6 +4,7 @@ import { createLogger } from '@sim/logger'
import { and, eq, ne } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { getSession } from '@/lib/auth'
import {
getOrganizationSeatAnalytics,
@@ -192,6 +193,20 @@ export async function PUT(request: NextRequest, { params }: { params: Promise<{
changes: { name, slug, logo },
})
recordAudit({
workspaceId: null,
actorId: session.user.id,
action: AuditAction.ORGANIZATION_UPDATED,
resourceType: AuditResourceType.ORGANIZATION,
resourceId: organizationId,
actorName: session.user.name ?? undefined,
actorEmail: session.user.email ?? undefined,
resourceName: updatedOrg[0].name,
description: `Updated organization settings`,
metadata: { changes: { name, slug, logo } },
request,
})
return NextResponse.json({
success: true,
message: 'Organization updated successfully',

View File

@@ -3,6 +3,7 @@ import { member, organization } from '@sim/db/schema'
import { createLogger } from '@sim/logger'
import { and, eq, or } from 'drizzle-orm'
import { NextResponse } from 'next/server'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { getSession } from '@/lib/auth'
import { createOrganizationForTeamPlan } from '@/lib/billing/organization'
@@ -115,6 +116,19 @@ export async function POST(request: Request) {
organizationId,
})
recordAudit({
workspaceId: null,
actorId: user.id,
action: AuditAction.ORGANIZATION_CREATED,
resourceType: AuditResourceType.ORGANIZATION,
resourceId: organizationId,
actorName: user.name ?? undefined,
actorEmail: user.email ?? undefined,
resourceName: organizationName ?? undefined,
description: `Created organization "${organizationName}"`,
request,
})
return NextResponse.json({
success: true,
organizationId,

View File

@@ -4,6 +4,7 @@ import { createLogger } from '@sim/logger'
import { and, eq } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { getSession } from '@/lib/auth'
import { hasAccessControlAccess } from '@/lib/billing'
@@ -13,6 +14,7 @@ async function getPermissionGroupWithAccess(groupId: string, userId: string) {
const [group] = await db
.select({
id: permissionGroup.id,
name: permissionGroup.name,
organizationId: permissionGroup.organizationId,
})
.from(permissionGroup)
@@ -151,6 +153,20 @@ export async function POST(req: NextRequest, { params }: { params: Promise<{ id:
assignedBy: session.user.id,
})
recordAudit({
workspaceId: null,
actorId: session.user.id,
action: AuditAction.PERMISSION_GROUP_MEMBER_ADDED,
resourceType: AuditResourceType.PERMISSION_GROUP,
resourceId: id,
resourceName: result.group.name,
actorName: session.user.name ?? undefined,
actorEmail: session.user.email ?? undefined,
description: `Added member ${userId} to permission group "${result.group.name}"`,
metadata: { targetUserId: userId, permissionGroupId: id },
request: req,
})
return NextResponse.json({ member: newMember }, { status: 201 })
} catch (error) {
if (error instanceof z.ZodError) {
@@ -221,6 +237,20 @@ export async function DELETE(req: NextRequest, { params }: { params: Promise<{ i
userId: session.user.id,
})
recordAudit({
workspaceId: null,
actorId: session.user.id,
action: AuditAction.PERMISSION_GROUP_MEMBER_REMOVED,
resourceType: AuditResourceType.PERMISSION_GROUP,
resourceId: id,
resourceName: result.group.name,
actorName: session.user.name ?? undefined,
actorEmail: session.user.email ?? undefined,
description: `Removed member ${memberToRemove.userId} from permission group "${result.group.name}"`,
metadata: { targetUserId: memberToRemove.userId, memberId, permissionGroupId: id },
request: req,
})
return NextResponse.json({ success: true })
} catch (error) {
logger.error('Error removing member from permission group', error)

View File

@@ -4,6 +4,7 @@ import { createLogger } from '@sim/logger'
import { and, eq } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { getSession } from '@/lib/auth'
import { hasAccessControlAccess } from '@/lib/billing'
import {
@@ -181,6 +182,19 @@ export async function PUT(req: NextRequest, { params }: { params: Promise<{ id:
.where(eq(permissionGroup.id, id))
.limit(1)
recordAudit({
workspaceId: null,
actorId: session.user.id,
action: AuditAction.PERMISSION_GROUP_UPDATED,
resourceType: AuditResourceType.PERMISSION_GROUP,
resourceId: id,
actorName: session.user.name ?? undefined,
actorEmail: session.user.email ?? undefined,
resourceName: updated.name,
description: `Updated permission group "${updated.name}"`,
request: req,
})
return NextResponse.json({
permissionGroup: {
...updated,
@@ -229,6 +243,19 @@ export async function DELETE(req: NextRequest, { params }: { params: Promise<{ i
logger.info('Deleted permission group', { permissionGroupId: id, userId: session.user.id })
recordAudit({
workspaceId: null,
actorId: session.user.id,
action: AuditAction.PERMISSION_GROUP_DELETED,
resourceType: AuditResourceType.PERMISSION_GROUP,
resourceId: id,
actorName: session.user.name ?? undefined,
actorEmail: session.user.email ?? undefined,
resourceName: result.group.name,
description: `Deleted permission group "${result.group.name}"`,
request: req,
})
return NextResponse.json({ success: true })
} catch (error) {
logger.error('Error deleting permission group', error)

View File

@@ -4,6 +4,7 @@ import { createLogger } from '@sim/logger'
import { and, count, desc, eq } from 'drizzle-orm'
import { NextResponse } from 'next/server'
import { z } from 'zod'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { getSession } from '@/lib/auth'
import { hasAccessControlAccess } from '@/lib/billing'
import {
@@ -198,6 +199,19 @@ export async function POST(req: Request) {
userId: session.user.id,
})
recordAudit({
workspaceId: null,
actorId: session.user.id,
action: AuditAction.PERMISSION_GROUP_CREATED,
resourceType: AuditResourceType.PERMISSION_GROUP,
resourceId: newGroup.id,
actorName: session.user.name ?? undefined,
actorEmail: session.user.email ?? undefined,
resourceName: name,
description: `Created permission group "${name}"`,
request: req,
})
return NextResponse.json({ permissionGroup: newGroup }, { status: 201 })
} catch (error) {
if (error instanceof z.ZodError) {

View File

@@ -3,7 +3,7 @@
*
* @vitest-environment node
*/
import { databaseMock, loggerMock } from '@sim/testing'
import { auditMock, databaseMock, loggerMock } from '@sim/testing'
import { NextRequest } from 'next/server'
import { afterEach, beforeEach, describe, expect, it, vi } from 'vitest'
@@ -37,6 +37,8 @@ vi.mock('@/lib/core/utils/request', () => ({
vi.mock('@sim/logger', () => loggerMock)
vi.mock('@/lib/audit/log', () => auditMock)
import { PUT } from './route'
function createRequest(body: Record<string, unknown>): NextRequest {

View File

@@ -4,6 +4,7 @@ import { createLogger } from '@sim/logger'
import { eq } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { getSession } from '@/lib/auth'
import { generateRequestId } from '@/lib/core/utils/request'
import { validateCronExpression } from '@/lib/workflows/schedules/utils'
@@ -106,6 +107,18 @@ export async function PUT(request: NextRequest, { params }: { params: Promise<{
logger.info(`[${requestId}] Reactivated schedule: ${scheduleId}`)
recordAudit({
workspaceId: authorization.workflow.workspaceId ?? null,
actorId: session.user.id,
action: AuditAction.SCHEDULE_UPDATED,
resourceType: AuditResourceType.SCHEDULE,
resourceId: scheduleId,
actorName: session.user.name ?? undefined,
actorEmail: session.user.email ?? undefined,
description: `Reactivated schedule for workflow ${schedule.workflowId}`,
request,
})
return NextResponse.json({
message: 'Schedule activated successfully',
nextRunAt,

View File

@@ -0,0 +1,14 @@
import { NextResponse } from 'next/server'
import { getSession } from '@/lib/auth'
import { getAllowedIntegrationsFromEnv } from '@/lib/core/config/feature-flags'
export async function GET() {
const session = await getSession()
if (!session?.user?.id) {
return NextResponse.json({ error: 'Unauthorized' }, { status: 401 })
}
return NextResponse.json({
allowedIntegrations: getAllowedIntegrationsFromEnv(),
})
}

View File

@@ -0,0 +1,27 @@
import { NextResponse } from 'next/server'
import { getSession } from '@/lib/auth'
import { getAllowedMcpDomainsFromEnv } from '@/lib/core/config/feature-flags'
import { getBaseUrl } from '@/lib/core/utils/urls'
export async function GET() {
const session = await getSession()
if (!session?.user?.id) {
return NextResponse.json({ error: 'Unauthorized' }, { status: 401 })
}
const configuredDomains = getAllowedMcpDomainsFromEnv()
if (configuredDomains === null) {
return NextResponse.json({ allowedMcpDomains: null })
}
try {
const platformHostname = new URL(getBaseUrl()).hostname.toLowerCase()
if (!configuredDomains.includes(platformHostname)) {
return NextResponse.json({
allowedMcpDomains: [...configuredDomains, platformHostname],
})
}
} catch {}
return NextResponse.json({ allowedMcpDomains: configuredDomains })
}
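A hedged sketch of consuming this endpoint on the client: null means no whitelist is configured, otherwise the returned entries are hostnames (the route itself compares lowercase hostnames). The helper and the request path below are illustrative assumptions:
// Sketch: client-side check of a candidate MCP server URL against the allowed-domains list.
async function isMcpUrlAllowed(candidateUrl: string): Promise<boolean> {
  const { allowedMcpDomains } = await fetch('/api/mcp/allowed-domains').then((r) => r.json()) // placeholder path
  if (allowedMcpDomains === null) return true // null = no restriction configured
  const hostname = new URL(candidateUrl).hostname.toLowerCase()
  return allowedMcpDomains.includes(hostname)
}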

View File

@@ -4,6 +4,7 @@ import { createLogger } from '@sim/logger'
import { eq, sql } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { getSession } from '@/lib/auth'
import { generateRequestId } from '@/lib/core/utils/request'
import {
@@ -247,6 +248,18 @@ export async function PUT(request: NextRequest, { params }: { params: Promise<{
logger.info(`[${requestId}] Successfully updated template: ${id}`)
recordAudit({
actorId: session.user.id,
actorName: session.user.name,
actorEmail: session.user.email,
action: AuditAction.TEMPLATE_UPDATED,
resourceType: AuditResourceType.TEMPLATE,
resourceId: id,
resourceName: name ?? template.name,
description: `Updated template "${name ?? template.name}"`,
request,
})
return NextResponse.json({
data: updatedTemplate[0],
message: 'Template updated successfully',
@@ -300,6 +313,19 @@ export async function DELETE(
await db.delete(templates).where(eq(templates.id, id))
logger.info(`[${requestId}] Deleted template: ${id}`)
recordAudit({
actorId: session.user.id,
actorName: session.user.name,
actorEmail: session.user.email,
action: AuditAction.TEMPLATE_DELETED,
resourceType: AuditResourceType.TEMPLATE,
resourceId: id,
resourceName: template.name,
description: `Deleted template "${template.name}"`,
request,
})
return NextResponse.json({ success: true })
} catch (error: any) {
logger.error(`[${requestId}] Error deleting template: ${id}`, error)

View File

@@ -11,6 +11,7 @@ import { and, desc, eq, ilike, or, sql } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { v4 as uuidv4 } from 'uuid'
import { z } from 'zod'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { getSession } from '@/lib/auth'
import { generateRequestId } from '@/lib/core/utils/request'
import { verifyEffectiveSuperUser } from '@/lib/templates/permissions'
@@ -285,6 +286,18 @@ export async function POST(request: NextRequest) {
logger.info(`[${requestId}] Successfully created template: ${templateId}`)
recordAudit({
actorId: session.user.id,
actorName: session.user.name,
actorEmail: session.user.email,
action: AuditAction.TEMPLATE_CREATED,
resourceType: AuditResourceType.TEMPLATE,
resourceId: templateId,
resourceName: data.name,
description: `Created template "${data.name}"`,
request,
})
return NextResponse.json(
{
id: templateId,

View File

@@ -22,15 +22,20 @@ interface PipedriveFile {
interface PipedriveApiResponse {
success: boolean
data?: PipedriveFile[]
additional_data?: {
pagination?: {
more_items_in_collection: boolean
next_start: number
}
}
error?: string
}
const PipedriveGetFilesSchema = z.object({
accessToken: z.string().min(1, 'Access token is required'),
deal_id: z.string().optional().nullable(),
person_id: z.string().optional().nullable(),
org_id: z.string().optional().nullable(),
sort: z.enum(['id', 'update_time']).optional().nullable(),
limit: z.string().optional().nullable(),
start: z.string().optional().nullable(),
downloadFiles: z.boolean().optional().default(false),
})
@@ -54,20 +59,19 @@ export async function POST(request: NextRequest) {
const body = await request.json()
const validatedData = PipedriveGetFilesSchema.parse(body)
const { accessToken, deal_id, person_id, org_id, limit, downloadFiles } = validatedData
const { accessToken, sort, limit, start, downloadFiles } = validatedData
const baseUrl = 'https://api.pipedrive.com/v1/files'
const queryParams = new URLSearchParams()
if (deal_id) queryParams.append('deal_id', deal_id)
if (person_id) queryParams.append('person_id', person_id)
if (org_id) queryParams.append('org_id', org_id)
if (sort) queryParams.append('sort', sort)
if (limit) queryParams.append('limit', limit)
if (start) queryParams.append('start', start)
const queryString = queryParams.toString()
const apiUrl = queryString ? `${baseUrl}?${queryString}` : baseUrl
logger.info(`[${requestId}] Fetching files from Pipedrive`, { deal_id, person_id, org_id })
logger.info(`[${requestId}] Fetching files from Pipedrive`)
const urlValidation = await validateUrlWithDNS(apiUrl, 'apiUrl')
if (!urlValidation.isValid) {
@@ -93,6 +97,8 @@ export async function POST(request: NextRequest) {
}
const files = data.data || []
const hasMore = data.additional_data?.pagination?.more_items_in_collection || false
const nextStart = data.additional_data?.pagination?.next_start ?? null
const downloadedFiles: Array<{
name: string
mimeType: string
@@ -149,6 +155,8 @@ export async function POST(request: NextRequest) {
files,
downloadedFiles: downloadedFiles.length > 0 ? downloadedFiles : undefined,
total_items: files.length,
has_more: hasMore,
next_start: nextStart,
success: true,
},
})

View File
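The hunk above adds cursor parameters (sort, start on the request; has_more and next_start on the response) that mirror Pipedrive's own pagination fields. Below is a hedged sketch of how a caller could walk all pages against the upstream endpoint used in the route; the page size, sort choice, and Bearer authorization header are illustrative assumptions, and error handling is omitted.

// Illustrative pagination loop over the Pipedrive files endpoint, following
// additional_data.pagination.more_items_in_collection / next_start as declared
// in the PipedriveApiResponse interface above. A sketch, not the project's code.
async function listAllPipedriveFiles(accessToken: string): Promise<unknown[]> {
  const all: unknown[] = []
  let start = 0
  while (true) {
    const params = new URLSearchParams({
      start: String(start),
      limit: '100', // assumed page size
      sort: 'update_time',
    })
    const res = await fetch(`https://api.pipedrive.com/v1/files?${params}`, {
      headers: { Authorization: `Bearer ${accessToken}` }, // assumed OAuth-style auth
    })
    const data = await res.json()
    all.push(...(data.data ?? []))
    const pagination = data.additional_data?.pagination
    if (!pagination?.more_items_in_collection) break
    start = pagination.next_start
  }
  return all
}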

@@ -3,6 +3,7 @@ import { apiKey } from '@sim/db/schema'
import { createLogger } from '@sim/logger'
import { and, eq } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { getSession } from '@/lib/auth'
import { generateRequestId } from '@/lib/core/utils/request'
@@ -34,12 +35,27 @@ export async function DELETE(
const result = await db
.delete(apiKey)
.where(and(eq(apiKey.id, keyId), eq(apiKey.userId, userId)))
.returning({ id: apiKey.id })
.returning({ id: apiKey.id, name: apiKey.name })
if (!result.length) {
return NextResponse.json({ error: 'API key not found' }, { status: 404 })
}
const deletedKey = result[0]
recordAudit({
workspaceId: null,
actorId: userId,
action: AuditAction.PERSONAL_API_KEY_REVOKED,
resourceType: AuditResourceType.API_KEY,
resourceId: keyId,
actorName: session.user.name ?? undefined,
actorEmail: session.user.email ?? undefined,
resourceName: deletedKey.name,
description: `Revoked personal API key: ${deletedKey.name}`,
request,
})
return NextResponse.json({ success: true })
} catch (error) {
logger.error('Failed to delete API key', { error })

View File

@@ -5,6 +5,7 @@ import { and, eq } from 'drizzle-orm'
import { nanoid } from 'nanoid'
import { type NextRequest, NextResponse } from 'next/server'
import { createApiKey, getApiKeyDisplayFormat } from '@/lib/api-key/auth'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { getSession } from '@/lib/auth'
const logger = createLogger('ApiKeysAPI')
@@ -110,6 +111,19 @@ export async function POST(request: NextRequest) {
createdAt: apiKey.createdAt,
})
recordAudit({
workspaceId: null,
actorId: userId,
action: AuditAction.PERSONAL_API_KEY_CREATED,
resourceType: AuditResourceType.API_KEY,
resourceId: newKey.id,
actorName: session.user.name ?? undefined,
actorEmail: session.user.email ?? undefined,
resourceName: name,
description: `Created personal API key: ${name}`,
request,
})
return NextResponse.json({
key: {
...newKey,

View File

@@ -3,6 +3,7 @@ import { webhook, workflow } from '@sim/db/schema'
import { createLogger } from '@sim/logger'
import { and, eq } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { checkSessionOrInternalAuth } from '@/lib/auth/hybrid'
import { validateInteger } from '@/lib/core/security/input-validation'
import { PlatformEvents } from '@/lib/core/telemetry'
@@ -261,6 +262,20 @@ export async function DELETE(
logger.info(`[${requestId}] Successfully deleted webhook: ${id}`)
}
recordAudit({
workspaceId: webhookData.workflow.workspaceId || null,
actorId: userId,
actorName: auth.userName,
actorEmail: auth.userEmail,
action: AuditAction.WEBHOOK_DELETED,
resourceType: AuditResourceType.WEBHOOK,
resourceId: id,
resourceName: foundWebhook.provider || 'generic',
description: 'Deleted webhook',
metadata: { workflowId: webhookData.workflow.id },
request,
})
return NextResponse.json({ success: true }, { status: 200 })
} catch (error: any) {
logger.error(`[${requestId}] Error deleting webhook`, {

View File

@@ -4,6 +4,7 @@ import { createLogger } from '@sim/logger'
import { and, desc, eq, inArray, isNull, or } from 'drizzle-orm'
import { nanoid } from 'nanoid'
import { type NextRequest, NextResponse } from 'next/server'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { getSession } from '@/lib/auth'
import { PlatformEvents } from '@/lib/core/telemetry'
import { generateRequestId } from '@/lib/core/utils/request'
@@ -145,7 +146,8 @@ export async function GET(request: NextRequest) {
// Create or Update a webhook
export async function POST(request: NextRequest) {
const requestId = generateRequestId()
const userId = (await getSession())?.user?.id
const session = await getSession()
const userId = session?.user?.id
if (!userId) {
logger.warn(`[${requestId}] Unauthorized webhook creation attempt`)
@@ -678,6 +680,20 @@ export async function POST(request: NextRequest) {
} catch {
// Telemetry should not fail the operation
}
recordAudit({
workspaceId: workflowRecord.workspaceId || null,
actorId: userId,
actorName: session?.user?.name ?? undefined,
actorEmail: session?.user?.email ?? undefined,
action: AuditAction.WEBHOOK_CREATED,
resourceType: AuditResourceType.WEBHOOK,
resourceId: savedWebhook.id,
resourceName: provider || 'generic',
description: `Created ${provider || 'generic'} webhook`,
metadata: { provider, workflowId },
request,
})
}
const status = targetWebhookId ? 200 : 201

View File

@@ -2,6 +2,7 @@ import { db, workflow, workflowDeploymentVersion } from '@sim/db'
import { createLogger } from '@sim/logger'
import { and, desc, eq } from 'drizzle-orm'
import type { NextRequest } from 'next/server'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { generateRequestId } from '@/lib/core/utils/request'
import { removeMcpToolsForWorkflow, syncMcpToolsForWorkflow } from '@/lib/mcp/workflow-mcp-sync'
import {
@@ -258,6 +259,19 @@ export async function POST(request: NextRequest, { params }: { params: Promise<{
// Sync MCP tools with the latest parameter schema
await syncMcpToolsForWorkflow({ workflowId: id, requestId, context: 'deploy' })
recordAudit({
workspaceId: workflowData?.workspaceId || null,
actorId: actorUserId,
actorName: session?.user?.name,
actorEmail: session?.user?.email,
action: AuditAction.WORKFLOW_DEPLOYED,
resourceType: AuditResourceType.WORKFLOW,
resourceId: id,
resourceName: workflowData?.name,
description: `Deployed workflow "${workflowData?.name || id}"`,
request,
})
const responseApiKeyInfo = workflowData!.workspaceId
? 'Workspace API keys'
: 'Personal API keys'
@@ -297,11 +311,11 @@ export async function DELETE(
try {
logger.debug(`[${requestId}] Undeploying workflow: ${id}`)
const { error, workflow: workflowData } = await validateWorkflowPermissions(
id,
requestId,
'admin'
)
const {
error,
session,
workflow: workflowData,
} = await validateWorkflowPermissions(id, requestId, 'admin')
if (error) {
return createErrorResponse(error.message, error.status)
}
@@ -325,6 +339,19 @@ export async function DELETE(
// Silently fail
}
recordAudit({
workspaceId: workflowData?.workspaceId || null,
actorId: session!.user.id,
actorName: session?.user?.name,
actorEmail: session?.user?.email,
action: AuditAction.WORKFLOW_UNDEPLOYED,
resourceType: AuditResourceType.WORKFLOW,
resourceId: id,
resourceName: workflowData?.name,
description: `Undeployed workflow "${workflowData?.name || id}"`,
request,
})
return createSuccessResponse({
isDeployed: false,
deployedAt: null,

View File

@@ -2,6 +2,7 @@ import { db, workflow, workflowDeploymentVersion } from '@sim/db'
import { createLogger } from '@sim/logger'
import { and, eq } from 'drizzle-orm'
import type { NextRequest } from 'next/server'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { env } from '@/lib/core/config/env'
import { generateRequestId } from '@/lib/core/utils/request'
import { syncMcpToolsForWorkflow } from '@/lib/mcp/workflow-mcp-sync'
@@ -22,7 +23,11 @@ export async function POST(
const { id, version } = await params
try {
const { error } = await validateWorkflowPermissions(id, requestId, 'admin')
const {
error,
session,
workflow: workflowRecord,
} = await validateWorkflowPermissions(id, requestId, 'admin')
if (error) {
return createErrorResponse(error.message, error.status)
}
@@ -107,6 +112,19 @@ export async function POST(
logger.error('Error sending workflow reverted event to socket server', e)
}
recordAudit({
workspaceId: workflowRecord?.workspaceId ?? null,
actorId: session!.user.id,
action: AuditAction.WORKFLOW_DEPLOYMENT_REVERTED,
resourceType: AuditResourceType.WORKFLOW,
resourceId: id,
actorName: session!.user.name ?? undefined,
actorEmail: session!.user.email ?? undefined,
resourceName: workflowRecord?.name ?? undefined,
description: `Reverted workflow to deployment version ${version}`,
request,
})
return createSuccessResponse({
message: 'Reverted to deployment version',
lastSaved: Date.now(),

View File

@@ -3,6 +3,7 @@ import { createLogger } from '@sim/logger'
import { and, eq } from 'drizzle-orm'
import type { NextRequest } from 'next/server'
import { z } from 'zod'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { generateRequestId } from '@/lib/core/utils/request'
import { syncMcpToolsForWorkflow } from '@/lib/mcp/workflow-mcp-sync'
import { restorePreviousVersionWebhooks, saveTriggerWebhooksForDeploy } from '@/lib/webhooks/deploy'
@@ -297,6 +298,19 @@ export async function PATCH(
}
}
recordAudit({
workspaceId: workflowData?.workspaceId,
actorId: actorUserId,
actorName: session?.user?.name,
actorEmail: session?.user?.email,
action: AuditAction.WORKFLOW_DEPLOYMENT_ACTIVATED,
resourceType: AuditResourceType.WORKFLOW,
resourceId: id,
description: `Activated deployment version ${versionNum}`,
metadata: { version: versionNum },
request,
})
return createSuccessResponse({
success: true,
deployedAt: result.deployedAt,

View File

@@ -1,6 +1,7 @@
import { createLogger } from '@sim/logger'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { checkSessionOrInternalAuth } from '@/lib/auth/hybrid'
import { PlatformEvents } from '@/lib/core/telemetry'
import { generateRequestId } from '@/lib/core/utils/request'
@@ -61,6 +62,20 @@ export async function POST(req: NextRequest, { params }: { params: Promise<{ id:
`[${requestId}] Successfully duplicated workflow ${sourceWorkflowId} to ${result.id} in ${elapsed}ms`
)
recordAudit({
workspaceId: workspaceId || null,
actorId: userId,
actorName: auth.userName,
actorEmail: auth.userEmail,
action: AuditAction.WORKFLOW_DUPLICATED,
resourceType: AuditResourceType.WORKFLOW,
resourceId: result.id,
resourceName: result.name,
description: `Duplicated workflow from ${sourceWorkflowId}`,
metadata: { sourceWorkflowId },
request: req,
})
return NextResponse.json(result, { status: 201 })
} catch (error) {
if (error instanceof Error) {

View File

@@ -5,7 +5,7 @@
* @vitest-environment node
*/
import { loggerMock, setupGlobalFetchMock } from '@sim/testing'
import { auditMock, loggerMock, setupGlobalFetchMock } from '@sim/testing'
import { NextRequest } from 'next/server'
import { afterEach, beforeEach, describe, expect, it, vi } from 'vitest'
@@ -23,6 +23,8 @@ vi.mock('@/lib/auth', () => ({
vi.mock('@sim/logger', () => loggerMock)
vi.mock('@/lib/audit/log', () => auditMock)
vi.mock('@/lib/workflows/persistence/utils', () => ({
loadWorkflowFromNormalizedTables: (workflowId: string) =>
mockLoadWorkflowFromNormalizedTables(workflowId),

View File
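The test hunk above (and a similar one further down) swaps the audit module for an auditMock helper from @sim/testing. The helper's real shape is not shown in this diff; a hypothetical stand-in that would satisfy these vi.mock calls (the routes import recordAudit, AuditAction, and AuditResourceType from '@/lib/audit/log') might look like the following.

import { vi } from 'vitest'

// Hypothetical stand-in only; the actual export in @sim/testing is not part of this diff.
export const auditMock = {
  recordAudit: vi.fn(),
  // Enum stand-ins: any property access resolves to its own name, so
  // AuditAction.WORKFLOW_DELETED === 'WORKFLOW_DELETED' inside the tests.
  AuditAction: new Proxy({} as Record<string, string>, {
    get: (_target, prop) => String(prop),
  }),
  AuditResourceType: new Proxy({} as Record<string, string>, {
    get: (_target, prop) => String(prop),
  }),
}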

@@ -4,6 +4,7 @@ import { createLogger } from '@sim/logger'
import { eq } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { checkHybridAuth, checkSessionOrInternalAuth } from '@/lib/auth/hybrid'
import { env } from '@/lib/core/config/env'
import { PlatformEvents } from '@/lib/core/telemetry'
@@ -336,6 +337,19 @@ export async function DELETE(
// Don't fail the deletion if Socket.IO notification fails
}
recordAudit({
workspaceId: workflowData.workspaceId || null,
actorId: userId,
actorName: auth.userName,
actorEmail: auth.userEmail,
action: AuditAction.WORKFLOW_DELETED,
resourceType: AuditResourceType.WORKFLOW,
resourceId: workflowId,
resourceName: workflowData.name,
description: `Deleted workflow "${workflowData.name}"`,
request,
})
return NextResponse.json({ success: true }, { status: 200 })
} catch (error: any) {
const elapsed = Date.now() - startTime

View File

@@ -5,6 +5,7 @@
* @vitest-environment node
*/
import {
auditMock,
databaseMock,
defaultMockUser,
mockAuth,
@@ -27,6 +28,8 @@ describe('Workflow Variables API Route', () => {
vi.doMock('@sim/db', () => databaseMock)
vi.doMock('@/lib/audit/log', () => auditMock)
vi.doMock('@/lib/workflows/utils', () => ({
authorizeWorkflowByWorkspacePermission: mockAuthorizeWorkflowByWorkspacePermission,
}))

View File

@@ -4,6 +4,7 @@ import { createLogger } from '@sim/logger'
import { eq } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { checkSessionOrInternalAuth } from '@/lib/auth/hybrid'
import { generateRequestId } from '@/lib/core/utils/request'
import { authorizeWorkflowByWorkspacePermission } from '@/lib/workflows/utils'
@@ -79,6 +80,19 @@ export async function POST(req: NextRequest, { params }: { params: Promise<{ id:
})
.where(eq(workflow.id, workflowId))
recordAudit({
workspaceId: workflowData.workspaceId ?? null,
actorId: userId,
actorName: auth.userName,
actorEmail: auth.userEmail,
action: AuditAction.WORKFLOW_VARIABLES_UPDATED,
resourceType: AuditResourceType.WORKFLOW,
resourceId: workflowId,
resourceName: workflowData.name ?? undefined,
description: `Updated workflow variables`,
request: req,
})
return NextResponse.json({ success: true })
} catch (validationError) {
if (validationError instanceof z.ZodError) {

View File

@@ -4,6 +4,7 @@ import { createLogger } from '@sim/logger'
import { and, asc, eq, inArray, isNull, min } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { checkSessionOrInternalAuth } from '@/lib/auth/hybrid'
import { generateRequestId } from '@/lib/core/utils/request'
import { getUserEntityPermissions, workspaceExists } from '@/lib/workspaces/permissions/utils'
@@ -188,6 +189,20 @@ export async function POST(req: NextRequest) {
logger.info(`[${requestId}] Successfully created empty workflow ${workflowId}`)
recordAudit({
workspaceId,
actorId: userId,
actorName: auth.userName,
actorEmail: auth.userEmail,
action: AuditAction.WORKFLOW_CREATED,
resourceType: AuditResourceType.WORKFLOW,
resourceId: workflowId,
resourceName: name,
description: `Created workflow "${name}"`,
metadata: { name },
request: req,
})
return NextResponse.json({
id: workflowId,
name,

View File

@@ -4,6 +4,7 @@ import { createLogger } from '@sim/logger'
import { and, eq, not } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { getSession } from '@/lib/auth'
import { generateRequestId } from '@/lib/core/utils/request'
import { getUserEntityPermissions } from '@/lib/workspaces/permissions/utils'
@@ -86,6 +87,19 @@ export async function PUT(
updatedAt: apiKey.updatedAt,
})
recordAudit({
workspaceId,
actorId: userId,
action: AuditAction.API_KEY_UPDATED,
resourceType: AuditResourceType.API_KEY,
resourceId: keyId,
actorName: session.user.name ?? undefined,
actorEmail: session.user.email ?? undefined,
resourceName: name,
description: `Updated workspace API key: ${name}`,
request,
})
logger.info(`[${requestId}] Updated workspace API key: ${keyId} in workspace ${workspaceId}`)
return NextResponse.json({ key: updatedKey })
} catch (error: unknown) {
@@ -123,12 +137,27 @@ export async function DELETE(
.where(
and(eq(apiKey.workspaceId, workspaceId), eq(apiKey.id, keyId), eq(apiKey.type, 'workspace'))
)
.returning({ id: apiKey.id })
.returning({ id: apiKey.id, name: apiKey.name })
if (deletedRows.length === 0) {
return NextResponse.json({ error: 'API key not found' }, { status: 404 })
}
const deletedKey = deletedRows[0]
recordAudit({
workspaceId,
actorId: userId,
action: AuditAction.API_KEY_REVOKED,
resourceType: AuditResourceType.API_KEY,
resourceId: keyId,
actorName: session.user.name ?? undefined,
actorEmail: session.user.email ?? undefined,
resourceName: deletedKey.name,
description: `Revoked workspace API key: ${deletedKey.name}`,
request,
})
logger.info(`[${requestId}] Deleted workspace API key: ${keyId} from workspace ${workspaceId}`)
return NextResponse.json({ success: true })
} catch (error: unknown) {

View File

@@ -6,6 +6,7 @@ import { nanoid } from 'nanoid'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { createApiKey, getApiKeyDisplayFormat } from '@/lib/api-key/auth'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { getSession } from '@/lib/auth'
import { PlatformEvents } from '@/lib/core/telemetry'
import { generateRequestId } from '@/lib/core/utils/request'
@@ -159,6 +160,20 @@ export async function POST(request: NextRequest, { params }: { params: Promise<{
logger.info(`[${requestId}] Created workspace API key: ${name} in workspace ${workspaceId}`)
recordAudit({
workspaceId,
actorId: userId,
actorName: session?.user?.name,
actorEmail: session?.user?.email,
action: AuditAction.API_KEY_CREATED,
resourceType: AuditResourceType.API_KEY,
resourceId: newKey.id,
resourceName: name,
description: `Created API key "${name}"`,
metadata: { keyName: name },
request,
})
return NextResponse.json({
key: {
...newKey,
@@ -222,6 +237,19 @@ export async function DELETE(
logger.info(
`[${requestId}] Deleted ${deletedCount} workspace API keys from workspace ${workspaceId}`
)
recordAudit({
workspaceId,
actorId: userId,
actorName: session?.user?.name,
actorEmail: session?.user?.email,
action: AuditAction.API_KEY_REVOKED,
resourceType: AuditResourceType.API_KEY,
description: `Revoked ${deletedCount} API key(s)`,
metadata: { keyIds: keys, deletedCount },
request,
})
return NextResponse.json({ success: true, deletedCount })
} catch (error: unknown) {
logger.error(`[${requestId}] Workspace API key DELETE error`, error)

View File

@@ -5,6 +5,7 @@ import { and, eq } from 'drizzle-orm'
import { nanoid } from 'nanoid'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { getSession } from '@/lib/auth'
import { decryptSecret, encryptSecret } from '@/lib/core/security/encryption'
import { generateRequestId } from '@/lib/core/utils/request'
@@ -185,6 +186,20 @@ export async function POST(request: NextRequest, { params }: { params: Promise<{
logger.info(`[${requestId}] Created BYOK key for ${providerId} in workspace ${workspaceId}`)
recordAudit({
workspaceId,
actorId: userId,
actorName: session?.user?.name,
actorEmail: session?.user?.email,
action: AuditAction.BYOK_KEY_CREATED,
resourceType: AuditResourceType.BYOK_KEY,
resourceId: newKey.id,
resourceName: providerId,
description: `Added BYOK key for ${providerId}`,
metadata: { providerId },
request,
})
return NextResponse.json({
success: true,
key: {
@@ -242,6 +257,19 @@ export async function DELETE(
logger.info(`[${requestId}] Deleted BYOK key for ${providerId} from workspace ${workspaceId}`)
recordAudit({
workspaceId,
actorId: userId,
actorName: session?.user?.name,
actorEmail: session?.user?.email,
action: AuditAction.BYOK_KEY_DELETED,
resourceType: AuditResourceType.BYOK_KEY,
resourceName: providerId,
description: `Removed BYOK key for ${providerId}`,
metadata: { providerId },
request,
})
return NextResponse.json({ success: true })
} catch (error: unknown) {
logger.error(`[${requestId}] BYOK key DELETE error`, error)

View File

@@ -1,6 +1,7 @@
import { createLogger } from '@sim/logger'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { getSession } from '@/lib/auth'
import { generateRequestId } from '@/lib/core/utils/request'
import { duplicateWorkspace } from '@/lib/workspaces/duplicate'
@@ -45,6 +46,19 @@ export async function POST(req: NextRequest, { params }: { params: Promise<{ id:
`[${requestId}] Successfully duplicated workspace ${sourceWorkspaceId} to ${result.id} in ${elapsed}ms`
)
recordAudit({
workspaceId: sourceWorkspaceId,
actorId: session.user.id,
actorName: session.user.name,
actorEmail: session.user.email,
action: AuditAction.WORKSPACE_DUPLICATED,
resourceType: AuditResourceType.WORKSPACE,
resourceId: result.id,
resourceName: name,
description: `Duplicated workspace to "${name}"`,
request: req,
})
return NextResponse.json(result, { status: 201 })
} catch (error) {
if (error instanceof Error) {

View File

@@ -4,6 +4,7 @@ import { createLogger } from '@sim/logger'
import { eq } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { getSession } from '@/lib/auth'
import { decryptSecret, encryptSecret } from '@/lib/core/security/encryption'
import { generateRequestId } from '@/lib/core/utils/request'
@@ -156,6 +157,19 @@ export async function PUT(request: NextRequest, { params }: { params: Promise<{
set: { variables: merged, updatedAt: new Date() },
})
recordAudit({
workspaceId,
actorId: userId,
actorName: session?.user?.name,
actorEmail: session?.user?.email,
action: AuditAction.ENVIRONMENT_UPDATED,
resourceType: AuditResourceType.ENVIRONMENT,
resourceId: workspaceId,
description: `Updated environment variables`,
metadata: { keysUpdated: Object.keys(variables) },
request,
})
return NextResponse.json({ success: true })
} catch (error: any) {
logger.error(`[${requestId}] Workspace env PUT error`, error)

View File

@@ -1,5 +1,6 @@
import { createLogger } from '@sim/logger'
import { type NextRequest, NextResponse } from 'next/server'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { getSession } from '@/lib/auth'
import { generateRequestId } from '@/lib/core/utils/request'
import { deleteWorkspaceFile } from '@/lib/uploads/contexts/workspace'
@@ -39,6 +40,18 @@ export async function DELETE(
logger.info(`[${requestId}] Deleted workspace file: ${fileId}`)
recordAudit({
workspaceId,
actorId: session.user.id,
actorName: session.user.name,
actorEmail: session.user.email,
action: AuditAction.FILE_DELETED,
resourceType: AuditResourceType.FILE,
resourceId: fileId,
description: `Deleted file "${fileId}"`,
request,
})
return NextResponse.json({
success: true,
})

View File

@@ -1,5 +1,6 @@
import { createLogger } from '@sim/logger'
import { type NextRequest, NextResponse } from 'next/server'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { getSession } from '@/lib/auth'
import { generateRequestId } from '@/lib/core/utils/request'
import { listWorkspaceFiles, uploadWorkspaceFile } from '@/lib/uploads/contexts/workspace'
@@ -104,6 +105,19 @@ export async function POST(request: NextRequest, { params }: { params: Promise<{
logger.info(`[${requestId}] Uploaded workspace file: ${file.name}`)
recordAudit({
workspaceId,
actorId: session.user.id,
actorName: session.user.name,
actorEmail: session.user.email,
action: AuditAction.FILE_UPLOADED,
resourceType: AuditResourceType.FILE,
resourceId: userFile.id,
resourceName: file.name,
description: `Uploaded file "${file.name}"`,
request,
})
return NextResponse.json({
success: true,
file: userFile,

View File

@@ -4,6 +4,7 @@ import { createLogger } from '@sim/logger'
import { and, eq, inArray } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { getSession } from '@/lib/auth'
import { encryptSecret } from '@/lib/core/security/encryption'
import { getUserEntityPermissions } from '@/lib/workspaces/permissions/utils'
@@ -251,6 +252,19 @@ export async function PUT(request: NextRequest, { params }: RouteParams) {
subscriptionId: subscription.id,
})
recordAudit({
workspaceId,
actorId: session.user.id,
action: AuditAction.NOTIFICATION_UPDATED,
resourceType: AuditResourceType.NOTIFICATION,
resourceId: notificationId,
resourceName: subscription.notificationType,
actorName: session.user.name ?? undefined,
actorEmail: session.user.email ?? undefined,
description: `Updated ${subscription.notificationType} notification subscription`,
request,
})
return NextResponse.json({
data: {
id: subscription.id,
@@ -300,17 +314,35 @@ export async function DELETE(request: NextRequest, { params }: RouteParams) {
eq(workspaceNotificationSubscription.workspaceId, workspaceId)
)
)
.returning({ id: workspaceNotificationSubscription.id })
.returning({
id: workspaceNotificationSubscription.id,
notificationType: workspaceNotificationSubscription.notificationType,
})
if (deleted.length === 0) {
return NextResponse.json({ error: 'Notification not found' }, { status: 404 })
}
const deletedSubscription = deleted[0]
logger.info('Deleted notification subscription', {
workspaceId,
subscriptionId: notificationId,
})
recordAudit({
workspaceId,
actorId: session.user.id,
action: AuditAction.NOTIFICATION_DELETED,
resourceType: AuditResourceType.NOTIFICATION,
resourceId: notificationId,
actorName: session.user.name ?? undefined,
actorEmail: session.user.email ?? undefined,
resourceName: deletedSubscription.notificationType,
description: `Deleted ${deletedSubscription.notificationType} notification subscription`,
request,
})
return NextResponse.json({ success: true })
} catch (error) {
logger.error('Error deleting notification', { error })

View File

@@ -5,6 +5,7 @@ import { and, eq, inArray } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { v4 as uuidv4 } from 'uuid'
import { z } from 'zod'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { getSession } from '@/lib/auth'
import { encryptSecret } from '@/lib/core/security/encryption'
import { getUserEntityPermissions } from '@/lib/workspaces/permissions/utils'
@@ -256,6 +257,19 @@ export async function POST(request: NextRequest, { params }: { params: Promise<{
type: data.notificationType,
})
recordAudit({
workspaceId,
actorId: session.user.id,
action: AuditAction.NOTIFICATION_CREATED,
resourceType: AuditResourceType.NOTIFICATION,
resourceId: subscription.id,
resourceName: data.notificationType,
actorName: session.user.name ?? undefined,
actorEmail: session.user.email ?? undefined,
description: `Created ${data.notificationType} notification subscription`,
request,
})
return NextResponse.json({
data: {
id: subscription.id,

View File

@@ -5,6 +5,7 @@ import { createLogger } from '@sim/logger'
import { and, eq } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { AuditAction, AuditResourceType, recordAudit } from '@/lib/audit/log'
import { getSession } from '@/lib/auth'
import {
getUsersWithPermissions,
@@ -156,6 +157,21 @@ export async function PATCH(request: NextRequest, { params }: { params: Promise<
const updatedUsers = await getUsersWithPermissions(workspaceId)
for (const update of body.updates) {
recordAudit({
workspaceId,
actorId: session.user.id,
action: AuditAction.MEMBER_ROLE_CHANGED,
resourceType: AuditResourceType.WORKSPACE,
resourceId: workspaceId,
actorName: session.user.name ?? undefined,
actorEmail: session.user.email ?? undefined,
description: `Changed permissions for user ${update.userId} to ${update.permissions}`,
metadata: { targetUserId: update.userId, newPermissions: update.permissions },
request,
})
}
return NextResponse.json({
message: 'Permissions updated successfully',
users: updatedUsers,

Some files were not shown because too many files have changed in this diff.