mirror of https://github.com/simstudioai/sim.git
synced 2026-04-06 03:00:16 -04:00

updates

PLAN.md (new file, 250 lines)
@@ -0,0 +1,250 @@
# Table Block Implementation Plan

> Create a new "table" block type that enables users to define schemas and perform CRUD operations on lightweight, workspace/workflow-scoped tables stored in the existing PostgreSQL database, using JSONB with application-level schema enforcement.

## Table of Contents

- [Architecture Overview](#architecture-overview)
- [Data Model](#data-model)
- [Implementation Files](#implementation-files)
- [Key Design Decisions](#key-design-decisions)
- [Limits and Limitations](#limits-and-limitations)
- [Implementation Checklist](#implementation-checklist)
## Architecture Overview

```mermaid
flowchart TB
    subgraph UI [Block UI Layer]
        TableBlock[Table Block]
        SchemaEditor[Schema Editor SubBlock]
    end

    subgraph Tools [Tool Layer]
        CreateTable[table_create]
        Insert[table_insert]
        Select[table_select]
        Update[table_update]
        Delete[table_delete]
        DropTable[table_drop]
    end

    subgraph API [API Routes]
        TableAPI["/api/tables"]
        RowAPI["/api/tables/tableId/rows"]
    end

    subgraph DB [PostgreSQL]
        SimTable[sim_table]
        SimTableRow[sim_table_row]
    end

    TableBlock --> Tools
    Tools --> API
    API --> DB
```
## Data Model

Two new tables in the existing PostgreSQL database:

### `sim_table` - Table Definitions

| Column | Type | Description |
|--------|------|-------------|
| id | text | Primary key |
| workspace_id | text | FK to workspace |
| workflow_id | text | FK to workflow (nullable for workspace scope) |
| name | text | Table name (unique per scope) |
| schema | jsonb | Column definitions with types/constraints |
| created_by | text | FK to user |
| created_at | timestamp | Creation time |
| updated_at | timestamp | Last update time |

### `sim_table_row` - Row Data

| Column | Type | Description |
|--------|------|-------------|
| id | text | Primary key |
| table_id | text | FK to sim_table |
| data | jsonb | Row data (validated against schema) |
| created_at | timestamp | Creation time |
| updated_at | timestamp | Last update time |

### Schema Format

**Example schema definition:**

```json
{
  "columns": [
    { "name": "id", "type": "string", "primaryKey": true },
    { "name": "email", "type": "string", "required": true, "unique": true },
    { "name": "age", "type": "number" },
    { "name": "active", "type": "boolean", "default": true }
  ]
}
```

**Supported Types:** `string`, `number`, `boolean`, `date`, `json`
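A minimal sketch of the application-level validation this schema format implies. The names `ColumnDef` and `validateRow` are illustrative only; the plan puts the real utilities in `apps/sim/lib/tables/schema.ts`.

```typescript
type ColumnType = 'string' | 'number' | 'boolean' | 'date' | 'json'

interface ColumnDef {
  name: string
  type: ColumnType
  primaryKey?: boolean
  required?: boolean
  unique?: boolean
  default?: unknown
}

// Hypothetical validator: checks each value against its column definition.
function validateRow(
  data: Record<string, unknown>,
  columns: ColumnDef[]
): { valid: boolean; errors: string[] } {
  const errors: string[] = []
  const known = new Set(columns.map((c) => c.name))

  // Reject columns that are not in the schema.
  for (const key of Object.keys(data)) {
    if (!known.has(key)) errors.push(`Unknown column: ${key}`)
  }

  for (const col of columns) {
    const value = data[col.name]
    if (value === undefined || value === null) {
      if (col.required || col.primaryKey) errors.push(`Missing required column: ${col.name}`)
      continue
    }
    switch (col.type) {
      case 'string':
        if (typeof value !== 'string') errors.push(`${col.name} must be a string`)
        break
      case 'number':
        if (typeof value !== 'number') errors.push(`${col.name} must be a number`)
        break
      case 'boolean':
        if (typeof value !== 'boolean') errors.push(`${col.name} must be a boolean`)
        break
      case 'date':
        if (typeof value !== 'string' || Number.isNaN(Date.parse(value)))
          errors.push(`${col.name} must be an ISO 8601 date string`)
        break
      case 'json':
        if (typeof value !== 'object') errors.push(`${col.name} must be an object or array`)
        break
    }
  }
  return { valid: errors.length === 0, errors }
}
```

Unique-constraint and default-value handling would need database access, so they are omitted here.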
## Implementation Files

### 1. Database Schema

- `packages/db/schema.ts` - Add `simTable` and `simTableRow` table definitions
- Generate a migration for the new tables

### 2. Tools (`apps/sim/tools/table/`)

| File | Purpose |
|------|---------|
| `types.ts` | Type definitions for params/responses |
| `create.ts` | Create table with schema |
| `insert.ts` | Insert row(s) with schema validation |
| `select.ts` | Query rows with filtering |
| `update.ts` | Update rows with schema validation |
| `delete.ts` | Delete rows |
| `drop.ts` | Drop table |
| `index.ts` | Barrel export |

### 3. Block Definition

- `apps/sim/blocks/blocks/table.ts` - Block config with:
  - Operation dropdown (create, insert, select, update, delete, drop)
  - Scope selector (workspace/workflow)
  - Table selector (for existing tables)
  - Schema editor (for the create operation)
  - Data/query inputs (operation-specific)

### 4. API Routes

- `apps/sim/app/api/tables/route.ts` - Create table, list tables
- `apps/sim/app/api/tables/[tableId]/route.ts` - Get/drop table
- `apps/sim/app/api/tables/[tableId]/rows/route.ts` - CRUD on rows

### 5. Schema Validation Library

- `apps/sim/lib/tables/schema.ts` - Schema validation utilities
- `apps/sim/lib/tables/types.ts` - Shared types
## Key Design Decisions

1. **Schema Enforcement**: Application-layer validation before database writes. JSONB stores the data, but every insert/update validates against the table's schema.

2. **Concurrency**: PostgreSQL handles concurrent reads/writes natively. Row-level locking for updates.

3. **Indexing**: GIN index on the `data` column for efficient JSONB queries. Additional indexes on `table_id` for fast row lookups.

4. **Scope Resolution**: Tables with `workflow_id = NULL` are workspace-scoped. Tables with `workflow_id` set are workflow-scoped.

5. **Table Selector**: New SubBlock type `table-selector` that fetches available tables based on the current workspace/workflow context.
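Decision 4 reduces to a one-line predicate; a sketch, with illustrative names (`TableScope`, `resolveScope` are not from the plan):

```typescript
type TableScope = 'workspace' | 'workflow'

// Shape of a sim_table record as far as scoping is concerned:
// workflow_id is NULL (null here) for workspace-scoped tables.
interface TableDef {
  workspaceId: string
  workflowId: string | null
}

// Resolve scope exactly as decision 4 describes.
function resolveScope(table: TableDef): TableScope {
  return table.workflowId === null ? 'workspace' : 'workflow'
}
```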
## Limits and Limitations

### Table Limits

| Limit | Free Plan | Pro Plan | Enterprise |
|-------|-----------|----------|------------|
| Tables per workspace | 10 | 50 | Unlimited |
| Tables per workflow | 5 | 20 | Unlimited |
| Columns per table | 50 | 100 | 200 |

### Row Limits

| Limit | Free Plan | Pro Plan | Enterprise |
|-------|-----------|----------|------------|
| Rows per table | 10,000 | 100,000 | 1,000,000 |
| Batch insert size | 100 | 500 | 1,000 |
| Batch update/delete size | 100 | 500 | 1,000 |

### Size Limits

| Limit | Value | Rationale |
|-------|-------|-----------|
| Column name length | 64 chars | PostgreSQL identifier limit |
| Table name length | 64 chars | PostgreSQL identifier limit |
| String field max length | 65,535 chars | ~64KB per text field |
| JSON field max size | 1 MB | PostgreSQL JSONB practical limit |
| Single row max size | 2 MB | Reasonable row size limit |
| Total table data size | Based on plan | Tied to workspace storage quota |

### Query Limits

| Limit | Value | Notes |
|-------|-------|-------|
| Default page size | 100 rows | Can be overridden up to the max |
| Max page size | 1,000 rows | Prevents memory issues |
| Max filter conditions | 20 | AND/OR conditions combined |
| Query timeout | 30 seconds | Prevents long-running queries |
| Max concurrent queries per table | 50 | Rate limiting per table |

### Schema Constraints

| Constraint | Limit |
|------------|-------|
| Primary key columns | 1 (single column only) |
| Unique constraints | 5 per table |
| Required (NOT NULL) columns | Unlimited |
| Default values | Supported for all types |
| Foreign keys | Not supported (v1) |
| Computed columns | Not supported (v1) |
| Indexes | Auto-created for primary key and unique columns |

### Data Type Specifications

| Type | Storage | Min | Max | Notes |
|------|---------|-----|-----|-------|
| `string` | text | 0 chars | 65,535 chars | UTF-8 encoded |
| `number` | double precision | -1.7e308 | 1.7e308 | IEEE 754 double |
| `boolean` | boolean | - | - | true/false |
| `date` | timestamp | 4713 BC | 294276 AD | ISO 8601 format |
| `json` | jsonb | - | 1 MB | Nested objects/arrays |

### Operational Limitations

1. **No Transactions Across Tables**: Each operation is atomic to a single table. Cross-table transactions are not supported.

2. **No JOINs**: Cannot join data between tables. Use workflow logic to combine data from multiple tables.

3. **No Triggers/Hooks**: No automatic actions on insert/update/delete. Use workflow blocks for reactive logic.

4. **No Full-Text Search**: Basic filtering only. For full-text search, use the Knowledge Base feature.

5. **No Schema Migrations**: Schema changes require dropping and recreating the table (with data loss). Future versions may support additive migrations.

6. **Query Complexity**: Only basic operators are supported:
   - Comparison: `=`, `!=`, `>`, `<`, `>=`, `<=`
   - String: `LIKE`, `ILIKE`, `STARTS_WITH`, `ENDS_WITH`, `CONTAINS`
   - Logical: `AND`, `OR`, `NOT`
   - Null checks: `IS NULL`, `IS NOT NULL`
   - Array: `IN`, `NOT IN`
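To make the operator list above concrete, here is an in-memory evaluation sketch of the non-SQL operators. This is illustrative only; the actual implementation translates filters to SQL via `buildFilterClause` in `apps/sim/lib/table/query-builder.ts`, and `LIKE`/`ILIKE` are omitted since their pattern semantics live in PostgreSQL.

```typescript
type Op =
  | '=' | '!=' | '>' | '<' | '>=' | '<='
  | 'STARTS_WITH' | 'ENDS_WITH' | 'CONTAINS'
  | 'IN' | 'NOT IN'

// Evaluate a single filter condition against an in-memory value.
function matches(value: unknown, op: Op, operand: unknown): boolean {
  switch (op) {
    case '=': return value === operand
    case '!=': return value !== operand
    case '>': return (value as number) > (operand as number)
    case '<': return (value as number) < (operand as number)
    case '>=': return (value as number) >= (operand as number)
    case '<=': return (value as number) <= (operand as number)
    case 'STARTS_WITH': return String(value).startsWith(String(operand))
    case 'ENDS_WITH': return String(value).endsWith(String(operand))
    case 'CONTAINS': return String(value).includes(String(operand))
    case 'IN': return (operand as unknown[]).includes(value)
    case 'NOT IN': return !(operand as unknown[]).includes(value)
  }
}
```

`AND`/`OR`/`NOT` and the null checks then compose these per-condition results, subject to the 20-condition cap from the Query Limits table.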
### Performance Characteristics

| Operation | Expected Latency | Notes |
|-----------|------------------|-------|
| Insert (single row) | < 50ms | With schema validation |
| Insert (batch 100) | < 200ms | Parallel validation |
| Select (indexed) | < 20ms | Primary key or unique column |
| Select (filtered, 1K rows) | < 100ms | With GIN index |
| Update (single row) | < 50ms | By primary key |
| Delete (single row) | < 30ms | By primary key |

### Storage Accounting

- Table storage counts toward the workspace storage quota
- Calculated as: `sum(row_data_size) + schema_overhead`
- Schema overhead: ~1KB per table
- Row overhead: ~100 bytes per row (metadata, timestamps)
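The accounting rule above as a sketch. The constants come straight from the bullets; the function name and the choice of serialized-JSON byte length as `row_data_size` are assumptions.

```typescript
const SCHEMA_OVERHEAD_BYTES = 1024 // ~1KB per table
const ROW_OVERHEAD_BYTES = 100 // ~100 bytes per row (metadata, timestamps)

// Estimate a table's storage: serialized row data plus the fixed overheads.
function estimateTableBytes(rows: Array<Record<string, unknown>>): number {
  const dataBytes = rows.reduce(
    (sum, row) => sum + new TextEncoder().encode(JSON.stringify(row)).length,
    0
  )
  return dataBytes + rows.length * ROW_OVERHEAD_BYTES + SCHEMA_OVERHEAD_BYTES
}
```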
## Implementation Checklist

- [ ] Add `simTable` and `simTableRow` to `packages/db/schema.ts` and generate migration
- [ ] Create `apps/sim/lib/tables/` with schema validation and types
- [ ] Create `apps/sim/tools/table/` with all 6 tool implementations
- [ ] Register tools in `apps/sim/tools/registry.ts`
- [ ] Create API routes for tables and rows CRUD operations
- [ ] Create `apps/sim/blocks/blocks/table.ts` block definition
- [ ] Register block in `apps/sim/blocks/registry.ts`
- [ ] Add `TableIcon` to `apps/sim/components/icons.tsx`
apps/sim/app/api/table/[tableId]/route.ts (new file, 208 lines)
@@ -0,0 +1,208 @@
import { db } from '@sim/db'
import { permissions, userTableDefinitions, userTableRows, workspace } from '@sim/db/schema'
import { createLogger } from '@sim/logger'
import { and, eq, isNull } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { checkHybridAuth } from '@/lib/auth/hybrid'
import { generateRequestId } from '@/lib/core/utils/request'

const logger = createLogger('TableDetailAPI')

const GetTableSchema = z.object({
  workspaceId: z.string().min(1),
})

/**
 * Check if user has write access to workspace
 */
async function checkWorkspaceAccess(workspaceId: string, userId: string) {
  const [workspaceData] = await db
    .select({
      id: workspace.id,
      ownerId: workspace.ownerId,
    })
    .from(workspace)
    .where(eq(workspace.id, workspaceId))
    .limit(1)

  if (!workspaceData) {
    return { hasAccess: false, canWrite: false }
  }

  if (workspaceData.ownerId === userId) {
    return { hasAccess: true, canWrite: true }
  }

  const [permission] = await db
    .select({
      permissionType: permissions.permissionType,
    })
    .from(permissions)
    .where(
      and(
        eq(permissions.userId, userId),
        eq(permissions.entityType, 'workspace'),
        eq(permissions.entityId, workspaceId)
      )
    )
    .limit(1)

  if (!permission) {
    return { hasAccess: false, canWrite: false }
  }

  const canWrite = permission.permissionType === 'admin' || permission.permissionType === 'write'

  return {
    hasAccess: true,
    canWrite,
  }
}

/**
 * GET /api/table/[tableId]?workspaceId=xxx
 * Get table details
 */
export async function GET(
  request: NextRequest,
  { params }: { params: Promise<{ tableId: string }> }
) {
  const requestId = generateRequestId()
  const { tableId } = await params

  try {
    const authResult = await checkHybridAuth(request)
    if (!authResult.success || !authResult.userId) {
      return NextResponse.json({ error: 'Authentication required' }, { status: 401 })
    }

    const { searchParams } = new URL(request.url)
    const validated = GetTableSchema.parse({
      workspaceId: searchParams.get('workspaceId'),
    })

    // Check workspace access
    const { hasAccess } = await checkWorkspaceAccess(validated.workspaceId, authResult.userId)

    if (!hasAccess) {
      return NextResponse.json({ error: 'Access denied' }, { status: 403 })
    }

    // Get table
    const [table] = await db
      .select()
      .from(userTableDefinitions)
      .where(
        and(
          eq(userTableDefinitions.id, tableId),
          eq(userTableDefinitions.workspaceId, validated.workspaceId),
          isNull(userTableDefinitions.deletedAt)
        )
      )
      .limit(1)

    if (!table) {
      return NextResponse.json({ error: 'Table not found' }, { status: 404 })
    }

    logger.info(`[${requestId}] Retrieved table ${tableId}`)

    return NextResponse.json({
      table: {
        id: table.id,
        name: table.name,
        description: table.description,
        schema: table.schema,
        rowCount: table.rowCount,
        maxRows: table.maxRows,
        createdAt: table.createdAt.toISOString(),
        updatedAt: table.updatedAt.toISOString(),
      },
    })
  } catch (error) {
    if (error instanceof z.ZodError) {
      return NextResponse.json(
        { error: 'Validation error', details: error.errors },
        { status: 400 }
      )
    }

    logger.error(`[${requestId}] Error getting table:`, error)
    return NextResponse.json({ error: 'Failed to get table' }, { status: 500 })
  }
}

/**
 * DELETE /api/table/[tableId]?workspaceId=xxx
 * Delete a table (soft delete)
 */
export async function DELETE(
  request: NextRequest,
  { params }: { params: Promise<{ tableId: string }> }
) {
  const requestId = generateRequestId()
  const { tableId } = await params

  try {
    const authResult = await checkHybridAuth(request)
    if (!authResult.success || !authResult.userId) {
      return NextResponse.json({ error: 'Authentication required' }, { status: 401 })
    }

    const { searchParams } = new URL(request.url)
    const validated = GetTableSchema.parse({
      workspaceId: searchParams.get('workspaceId'),
    })

    // Check workspace access
    const { hasAccess, canWrite } = await checkWorkspaceAccess(
      validated.workspaceId,
      authResult.userId
    )

    if (!hasAccess || !canWrite) {
      return NextResponse.json({ error: 'Access denied' }, { status: 403 })
    }

    // Soft delete table
    const [deletedTable] = await db
      .update(userTableDefinitions)
      .set({
        deletedAt: new Date(),
        updatedAt: new Date(),
      })
      .where(
        and(
          eq(userTableDefinitions.id, tableId),
          eq(userTableDefinitions.workspaceId, validated.workspaceId),
          isNull(userTableDefinitions.deletedAt)
        )
      )
      .returning()

    if (!deletedTable) {
      return NextResponse.json({ error: 'Table not found' }, { status: 404 })
    }

    // Delete all rows
    await db.delete(userTableRows).where(eq(userTableRows.tableId, tableId))

    logger.info(`[${requestId}] Deleted table ${tableId}`)

    return NextResponse.json({
      message: 'Table deleted successfully',
      success: true,
    })
  } catch (error) {
    if (error instanceof z.ZodError) {
      return NextResponse.json(
        { error: 'Validation error', details: error.errors },
        { status: 400 }
      )
    }

    logger.error(`[${requestId}] Error deleting table:`, error)
    return NextResponse.json({ error: 'Failed to delete table' }, { status: 500 })
  }
}
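Note that both handlers above take `workspaceId` as a query parameter, not from the body. A hypothetical client-side helper (not part of the repo) that builds the URL the way these handlers expect:

```typescript
// Build the detail-route URL: path param tableId plus a required
// workspaceId query parameter, both URL-encoded.
function tableDetailUrl(tableId: string, workspaceId: string): string {
  return `/api/table/${encodeURIComponent(tableId)}?workspaceId=${encodeURIComponent(workspaceId)}`
}

// Usage sketch (assumes a fetch-capable environment and a valid session):
// const res = await fetch(tableDetailUrl('tbl_123', 'ws_456'), { method: 'DELETE' })
```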
apps/sim/app/api/table/[tableId]/rows/[rowId]/route.ts (new file, 331 lines)
@@ -0,0 +1,331 @@
import { db } from '@sim/db'
import { permissions, userTableDefinitions, userTableRows, workspace } from '@sim/db/schema'
import { createLogger } from '@sim/logger'
import { and, eq, isNull, sql } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { checkHybridAuth } from '@/lib/auth/hybrid'
import { generateRequestId } from '@/lib/core/utils/request'
import type { TableSchema } from '@/lib/table'
import { validateRowAgainstSchema, validateRowSize } from '@/lib/table'

const logger = createLogger('TableRowAPI')

const GetRowSchema = z.object({
  workspaceId: z.string().min(1),
})

const UpdateRowSchema = z.object({
  workspaceId: z.string().min(1),
  data: z.record(z.any()),
})

const DeleteRowSchema = z.object({
  workspaceId: z.string().min(1),
})

/**
 * Check if user has write access to workspace
 */
async function checkWorkspaceAccess(workspaceId: string, userId: string) {
  const [workspaceData] = await db
    .select({
      id: workspace.id,
      ownerId: workspace.ownerId,
    })
    .from(workspace)
    .where(eq(workspace.id, workspaceId))
    .limit(1)

  if (!workspaceData) {
    return { hasAccess: false, canWrite: false }
  }

  if (workspaceData.ownerId === userId) {
    return { hasAccess: true, canWrite: true }
  }

  const [permission] = await db
    .select({
      permissionType: permissions.permissionType,
    })
    .from(permissions)
    .where(
      and(
        eq(permissions.userId, userId),
        eq(permissions.entityType, 'workspace'),
        eq(permissions.entityId, workspaceId)
      )
    )
    .limit(1)

  if (!permission) {
    return { hasAccess: false, canWrite: false }
  }

  const canWrite = permission.permissionType === 'admin' || permission.permissionType === 'write'

  return {
    hasAccess: true,
    canWrite,
  }
}

/**
 * GET /api/table/[tableId]/rows/[rowId]?workspaceId=xxx
 * Get a single row by ID
 */
export async function GET(
  request: NextRequest,
  { params }: { params: Promise<{ tableId: string; rowId: string }> }
) {
  const requestId = generateRequestId()
  const { tableId, rowId } = await params

  try {
    const authResult = await checkHybridAuth(request)
    if (!authResult.success || !authResult.userId) {
      return NextResponse.json({ error: 'Authentication required' }, { status: 401 })
    }

    const { searchParams } = new URL(request.url)
    const validated = GetRowSchema.parse({
      workspaceId: searchParams.get('workspaceId'),
    })

    // Check workspace access
    const { hasAccess } = await checkWorkspaceAccess(validated.workspaceId, authResult.userId)

    if (!hasAccess) {
      return NextResponse.json({ error: 'Access denied' }, { status: 403 })
    }

    // Get row
    const [row] = await db
      .select({
        id: userTableRows.id,
        data: userTableRows.data,
        createdAt: userTableRows.createdAt,
        updatedAt: userTableRows.updatedAt,
      })
      .from(userTableRows)
      .where(
        and(
          eq(userTableRows.id, rowId),
          eq(userTableRows.tableId, tableId),
          eq(userTableRows.workspaceId, validated.workspaceId)
        )
      )
      .limit(1)

    if (!row) {
      return NextResponse.json({ error: 'Row not found' }, { status: 404 })
    }

    logger.info(`[${requestId}] Retrieved row ${rowId} from table ${tableId}`)

    return NextResponse.json({
      row: {
        id: row.id,
        data: row.data,
        createdAt: row.createdAt.toISOString(),
        updatedAt: row.updatedAt.toISOString(),
      },
    })
  } catch (error) {
    if (error instanceof z.ZodError) {
      return NextResponse.json(
        { error: 'Validation error', details: error.errors },
        { status: 400 }
      )
    }

    logger.error(`[${requestId}] Error getting row:`, error)
    return NextResponse.json({ error: 'Failed to get row' }, { status: 500 })
  }
}

/**
 * PATCH /api/table/[tableId]/rows/[rowId]
 * Update an existing row
 */
export async function PATCH(
  request: NextRequest,
  { params }: { params: Promise<{ tableId: string; rowId: string }> }
) {
  const requestId = generateRequestId()
  const { tableId, rowId } = await params

  try {
    const authResult = await checkHybridAuth(request)
    if (!authResult.success || !authResult.userId) {
      return NextResponse.json({ error: 'Authentication required' }, { status: 401 })
    }

    const body = await request.json()
    const validated = UpdateRowSchema.parse(body)

    // Check workspace access
    const { hasAccess, canWrite } = await checkWorkspaceAccess(
      validated.workspaceId,
      authResult.userId
    )

    if (!hasAccess || !canWrite) {
      return NextResponse.json({ error: 'Access denied' }, { status: 403 })
    }

    // Get table definition
    const [table] = await db
      .select()
      .from(userTableDefinitions)
      .where(
        and(
          eq(userTableDefinitions.id, tableId),
          eq(userTableDefinitions.workspaceId, validated.workspaceId),
          isNull(userTableDefinitions.deletedAt)
        )
      )
      .limit(1)

    if (!table) {
      return NextResponse.json({ error: 'Table not found' }, { status: 404 })
    }

    // Validate row size
    const sizeValidation = validateRowSize(validated.data)
    if (!sizeValidation.valid) {
      return NextResponse.json(
        { error: 'Invalid row data', details: sizeValidation.errors },
        { status: 400 }
      )
    }

    // Validate row against schema
    const rowValidation = validateRowAgainstSchema(validated.data, table.schema as TableSchema)
    if (!rowValidation.valid) {
      return NextResponse.json(
        { error: 'Row data does not match schema', details: rowValidation.errors },
        { status: 400 }
      )
    }

    // Update row
    const now = new Date()

    const [updatedRow] = await db
      .update(userTableRows)
      .set({
        data: validated.data,
        updatedAt: now,
      })
      .where(
        and(
          eq(userTableRows.id, rowId),
          eq(userTableRows.tableId, tableId),
          eq(userTableRows.workspaceId, validated.workspaceId)
        )
      )
      .returning()

    if (!updatedRow) {
      return NextResponse.json({ error: 'Row not found' }, { status: 404 })
    }

    logger.info(`[${requestId}] Updated row ${rowId} in table ${tableId}`)

    return NextResponse.json({
      row: {
        id: updatedRow.id,
        data: updatedRow.data,
        createdAt: updatedRow.createdAt.toISOString(),
        updatedAt: updatedRow.updatedAt.toISOString(),
      },
      message: 'Row updated successfully',
    })
  } catch (error) {
    if (error instanceof z.ZodError) {
      return NextResponse.json(
        { error: 'Validation error', details: error.errors },
        { status: 400 }
      )
    }

    logger.error(`[${requestId}] Error updating row:`, error)
    return NextResponse.json({ error: 'Failed to update row' }, { status: 500 })
  }
}

/**
 * DELETE /api/table/[tableId]/rows/[rowId]
 * Delete a row
 */
export async function DELETE(
  request: NextRequest,
  { params }: { params: Promise<{ tableId: string; rowId: string }> }
) {
  const requestId = generateRequestId()
  const { tableId, rowId } = await params

  try {
    const authResult = await checkHybridAuth(request)
    if (!authResult.success || !authResult.userId) {
      return NextResponse.json({ error: 'Authentication required' }, { status: 401 })
    }

    const body = await request.json()
    const validated = DeleteRowSchema.parse(body)

    // Check workspace access
    const { hasAccess, canWrite } = await checkWorkspaceAccess(
      validated.workspaceId,
      authResult.userId
    )

    if (!hasAccess || !canWrite) {
      return NextResponse.json({ error: 'Access denied' }, { status: 403 })
    }

    // Delete row
    const [deletedRow] = await db
      .delete(userTableRows)
      .where(
        and(
          eq(userTableRows.id, rowId),
          eq(userTableRows.tableId, tableId),
          eq(userTableRows.workspaceId, validated.workspaceId)
        )
      )
      .returning()

    if (!deletedRow) {
      return NextResponse.json({ error: 'Row not found' }, { status: 404 })
    }

    // Update row count
    await db
      .update(userTableDefinitions)
      .set({
        rowCount: sql`${userTableDefinitions.rowCount} - 1`,
        updatedAt: new Date(),
      })
      .where(eq(userTableDefinitions.id, tableId))

    logger.info(`[${requestId}] Deleted row ${rowId} from table ${tableId}`)

    return NextResponse.json({
      message: 'Row deleted successfully',
      deletedCount: 1,
    })
  } catch (error) {
    if (error instanceof z.ZodError) {
      return NextResponse.json(
        { error: 'Validation error', details: error.errors },
        { status: 400 }
      )
    }

    logger.error(`[${requestId}] Error deleting row:`, error)
    return NextResponse.json({ error: 'Failed to delete row' }, { status: 500 })
  }
}
apps/sim/app/api/table/[tableId]/rows/route.ts (new file, 780 lines)
@@ -0,0 +1,780 @@
import { db } from '@sim/db'
|
||||
import { permissions, userTableDefinitions, userTableRows, workspace } from '@sim/db/schema'
|
||||
import { createLogger } from '@sim/logger'
|
||||
import { and, eq, isNull, sql } from 'drizzle-orm'
|
||||
import { type NextRequest, NextResponse } from 'next/server'
|
||||
import { z } from 'zod'
|
||||
import { checkHybridAuth } from '@/lib/auth/hybrid'
|
||||
import { generateRequestId } from '@/lib/core/utils/request'
|
||||
import type { QueryFilter, TableSchema } from '@/lib/table'
|
||||
import { TABLE_LIMITS, validateRowAgainstSchema, validateRowSize } from '@/lib/table'
|
||||
import { buildFilterClause, buildSortClause } from '@/lib/table/query-builder'
|
||||
|
||||
const logger = createLogger('TableRowsAPI')
|
||||
|
||||
const InsertRowSchema = z.object({
|
||||
workspaceId: z.string().min(1),
|
||||
data: z.record(z.any()),
|
||||
})
|
||||
|
||||
const BatchInsertRowsSchema = z.object({
|
||||
workspaceId: z.string().min(1),
|
||||
rows: z.array(z.record(z.any())).min(1).max(1000), // Max 1000 rows per batch
|
||||
})
|
||||
|
||||
const QueryRowsSchema = z.object({
|
||||
workspaceId: z.string().min(1),
|
||||
filter: z.record(z.any()).optional(),
|
||||
sort: z.record(z.enum(['asc', 'desc'])).optional(),
|
||||
limit: z.coerce.number().int().min(1).max(TABLE_LIMITS.MAX_QUERY_LIMIT).optional().default(100),
|
||||
offset: z.coerce.number().int().min(0).optional().default(0),
|
||||
})
|
||||
|
||||
const UpdateRowsByFilterSchema = z.object({
|
||||
workspaceId: z.string().min(1),
|
||||
filter: z.record(z.any()), // Required - must specify what to update
|
||||
data: z.record(z.any()), // New data to set
|
||||
limit: z.coerce.number().int().min(1).max(1000).optional(), // Safety limit for bulk updates
|
||||
})
|
||||
|
||||
const DeleteRowsByFilterSchema = z.object({
|
||||
workspaceId: z.string().min(1),
|
||||
filter: z.record(z.any()), // Required - must specify what to delete
|
||||
limit: z.coerce.number().int().min(1).max(1000).optional(), // Safety limit for bulk deletes
|
||||
})
|
||||
|
||||
/**
|
||||
* Check if user has write access to workspace
|
||||
*/
|
||||
async function checkWorkspaceAccess(workspaceId: string, userId: string) {
|
||||
const [workspaceData] = await db
|
||||
.select({
|
||||
id: workspace.id,
|
||||
ownerId: workspace.ownerId,
|
||||
})
|
||||
.from(workspace)
|
||||
.where(eq(workspace.id, workspaceId))
|
||||
.limit(1)
|
||||
|
||||
if (!workspaceData) {
|
||||
return { hasAccess: false, canWrite: false }
|
||||
}
|
||||
|
||||
if (workspaceData.ownerId === userId) {
|
||||
return { hasAccess: true, canWrite: true }
|
||||
}
|
||||
|
||||
const [permission] = await db
|
||||
.select({
|
||||
permissionType: permissions.permissionType,
|
||||
})
|
||||
.from(permissions)
|
||||
.where(
|
||||
and(
|
||||
eq(permissions.userId, userId),
|
||||
eq(permissions.entityType, 'workspace'),
|
||||
eq(permissions.entityId, workspaceId)
|
||||
)
|
||||
)
|
||||
.limit(1)
|
||||
|
||||
if (!permission) {
|
||||
return { hasAccess: false, canWrite: false }
|
||||
}
|
||||
|
||||
const canWrite = permission.permissionType === 'admin' || permission.permissionType === 'write'
|
||||
|
||||
return {
|
||||
hasAccess: true,
|
||||
canWrite,
|
||||
}
|
||||
}

/**
 * Handle batch insert of multiple rows
 */
async function handleBatchInsert(requestId: string, tableId: string, body: any, userId: string) {
  const validated = BatchInsertRowsSchema.parse(body)

  // Check workspace access
  const { hasAccess, canWrite } = await checkWorkspaceAccess(validated.workspaceId, userId)

  if (!hasAccess || !canWrite) {
    return NextResponse.json({ error: 'Access denied' }, { status: 403 })
  }

  // Get table definition
  const [table] = await db
    .select()
    .from(userTableDefinitions)
    .where(
      and(
        eq(userTableDefinitions.id, tableId),
        eq(userTableDefinitions.workspaceId, validated.workspaceId),
        isNull(userTableDefinitions.deletedAt)
      )
    )
    .limit(1)

  if (!table) {
    return NextResponse.json({ error: 'Table not found' }, { status: 404 })
  }

  // Check row count limit
  const remainingCapacity = table.maxRows - table.rowCount
  if (remainingCapacity < validated.rows.length) {
    return NextResponse.json(
      {
        error: `Insufficient capacity. Can only insert ${remainingCapacity} more rows (table has ${table.rowCount}/${table.maxRows} rows)`,
      },
      { status: 400 }
    )
  }

  // Validate all rows
  const errors: { row: number; errors: string[] }[] = []

  for (let i = 0; i < validated.rows.length; i++) {
    const rowData = validated.rows[i]

    // Validate row size
    const sizeValidation = validateRowSize(rowData)
    if (!sizeValidation.valid) {
      errors.push({ row: i, errors: sizeValidation.errors })
      continue
    }

    // Validate row against schema
    const rowValidation = validateRowAgainstSchema(rowData, table.schema as TableSchema)
    if (!rowValidation.valid) {
      errors.push({ row: i, errors: rowValidation.errors })
    }
  }

  if (errors.length > 0) {
    return NextResponse.json(
      {
        error: 'Validation failed for some rows',
        details: errors,
      },
      { status: 400 }
    )
  }

  // Insert all rows
  const now = new Date()
  const rowsToInsert = validated.rows.map((data) => ({
    id: `row_${crypto.randomUUID().replace(/-/g, '')}`,
    tableId,
    workspaceId: validated.workspaceId,
    data,
    createdAt: now,
    updatedAt: now,
    createdBy: userId,
  }))

  const insertedRows = await db.insert(userTableRows).values(rowsToInsert).returning()

  // Update row count
  await db
    .update(userTableDefinitions)
    .set({
      rowCount: sql`${userTableDefinitions.rowCount} + ${validated.rows.length}`,
      updatedAt: now,
    })
    .where(eq(userTableDefinitions.id, tableId))

  logger.info(`[${requestId}] Batch inserted ${insertedRows.length} rows into table ${tableId}`)

  return NextResponse.json({
    rows: insertedRows.map((r) => ({
      id: r.id,
      data: r.data,
      createdAt: r.createdAt.toISOString(),
      updatedAt: r.updatedAt.toISOString(),
    })),
    insertedCount: insertedRows.length,
    message: `Successfully inserted ${insertedRows.length} rows`,
  })
}

/**
 * POST /api/table/[tableId]/rows
 * Insert a new row into the table
 * Supports both single row and batch insert (JSON body with a `rows` array)
 */
export async function POST(
  request: NextRequest,
  { params }: { params: Promise<{ tableId: string }> }
) {
  const requestId = generateRequestId()
  const { tableId } = await params

  try {
    const authResult = await checkHybridAuth(request)
    if (!authResult.success || !authResult.userId) {
      return NextResponse.json({ error: 'Authentication required' }, { status: 401 })
    }

    const body = await request.json()

    // Check if this is a batch insert
    if (body.rows && Array.isArray(body.rows)) {
      return handleBatchInsert(requestId, tableId, body, authResult.userId)
    }

    // Single row insert
    const validated = InsertRowSchema.parse(body)

    // Check workspace access
    const { hasAccess, canWrite } = await checkWorkspaceAccess(
      validated.workspaceId,
      authResult.userId
    )

    if (!hasAccess || !canWrite) {
      return NextResponse.json({ error: 'Access denied' }, { status: 403 })
    }

    // Get table definition
    const [table] = await db
      .select()
      .from(userTableDefinitions)
      .where(
        and(
          eq(userTableDefinitions.id, tableId),
          eq(userTableDefinitions.workspaceId, validated.workspaceId),
          isNull(userTableDefinitions.deletedAt)
        )
      )
      .limit(1)

    if (!table) {
      return NextResponse.json({ error: 'Table not found' }, { status: 404 })
    }

    // Validate row size
    const sizeValidation = validateRowSize(validated.data)
    if (!sizeValidation.valid) {
      return NextResponse.json(
        { error: 'Invalid row data', details: sizeValidation.errors },
        { status: 400 }
      )
    }

    // Validate row against schema
    const rowValidation = validateRowAgainstSchema(validated.data, table.schema as TableSchema)
    if (!rowValidation.valid) {
      return NextResponse.json(
        { error: 'Row data does not match schema', details: rowValidation.errors },
        { status: 400 }
      )
    }

    // Check row count limit
    if (table.rowCount >= table.maxRows) {
      return NextResponse.json(
        { error: `Table row limit reached (${table.maxRows} rows max)` },
        { status: 400 }
      )
    }

    // Insert row
    const rowId = `row_${crypto.randomUUID().replace(/-/g, '')}`
    const now = new Date()

    const [row] = await db
      .insert(userTableRows)
      .values({
        id: rowId,
        tableId,
        workspaceId: validated.workspaceId,
        data: validated.data,
        createdAt: now,
        updatedAt: now,
        createdBy: authResult.userId,
      })
      .returning()

    // Update row count
    await db
      .update(userTableDefinitions)
      .set({
        rowCount: sql`${userTableDefinitions.rowCount} + 1`,
        updatedAt: now,
      })
      .where(eq(userTableDefinitions.id, tableId))

    logger.info(`[${requestId}] Inserted row ${rowId} into table ${tableId}`)

    return NextResponse.json({
      row: {
        id: row.id,
        data: row.data,
        createdAt: row.createdAt.toISOString(),
        updatedAt: row.updatedAt.toISOString(),
      },
      message: 'Row inserted successfully',
    })
  } catch (error) {
    if (error instanceof z.ZodError) {
      return NextResponse.json(
        { error: 'Validation error', details: error.errors },
        { status: 400 }
      )
    }

    logger.error(`[${requestId}] Error inserting row:`, error)
    return NextResponse.json({ error: 'Failed to insert row' }, { status: 500 })
  }
}

/**
 * GET /api/table/[tableId]/rows?workspaceId=xxx&filter=...&sort=...&limit=100&offset=0
 * Query rows from the table with filtering, sorting, and pagination
 */
export async function GET(
  request: NextRequest,
  { params }: { params: Promise<{ tableId: string }> }
) {
  const requestId = generateRequestId()
  const { tableId } = await params

  try {
    const authResult = await checkHybridAuth(request)
    if (!authResult.success || !authResult.userId) {
      return NextResponse.json({ error: 'Authentication required' }, { status: 401 })
    }

    const { searchParams } = new URL(request.url)
    const workspaceId = searchParams.get('workspaceId')
    const filterParam = searchParams.get('filter')
    const sortParam = searchParams.get('sort')
    const limit = searchParams.get('limit')
    const offset = searchParams.get('offset')

    let filter
    let sort

    try {
      if (filterParam) {
        filter = JSON.parse(filterParam)
      }
      if (sortParam) {
        sort = JSON.parse(sortParam)
      }
    } catch {
      return NextResponse.json({ error: 'Invalid filter or sort JSON' }, { status: 400 })
    }

    const validated = QueryRowsSchema.parse({
      workspaceId,
      filter,
      sort,
      limit,
      offset,
    })

    // Check workspace access
    const { hasAccess } = await checkWorkspaceAccess(validated.workspaceId, authResult.userId)

    if (!hasAccess) {
      return NextResponse.json({ error: 'Access denied' }, { status: 403 })
    }

    // Verify table exists
    const [table] = await db
      .select({ id: userTableDefinitions.id })
      .from(userTableDefinitions)
      .where(
        and(
          eq(userTableDefinitions.id, tableId),
          eq(userTableDefinitions.workspaceId, validated.workspaceId),
          isNull(userTableDefinitions.deletedAt)
        )
      )
      .limit(1)

    if (!table) {
      return NextResponse.json({ error: 'Table not found' }, { status: 404 })
    }

    // Build base where conditions
    const baseConditions = [
      eq(userTableRows.tableId, tableId),
      eq(userTableRows.workspaceId, validated.workspaceId),
    ]

    // Add filter conditions if provided
    if (validated.filter) {
      const filterClause = buildFilterClause(validated.filter as QueryFilter, 'user_table_rows')
      if (filterClause) {
        baseConditions.push(filterClause)
      }
    }

    // Build query with combined conditions
    let query = db
      .select({
        id: userTableRows.id,
        data: userTableRows.data,
        createdAt: userTableRows.createdAt,
        updatedAt: userTableRows.updatedAt,
      })
      .from(userTableRows)
      .where(and(...baseConditions))

    // Apply sorting
    if (validated.sort) {
      const sortClause = buildSortClause(validated.sort, 'user_table_rows')
      if (sortClause) {
        query = query.orderBy(sortClause) as any
      }
    } else {
      query = query.orderBy(userTableRows.createdAt) as any
    }

    // Get total count with same filters (without pagination)
    const countQuery = db
      .select({ count: sql<number>`count(*)` })
      .from(userTableRows)
      .where(and(...baseConditions))

    const [{ count: totalCount }] = await countQuery

    // Apply pagination
    const rows = await query.limit(validated.limit).offset(validated.offset)

    logger.info(
      `[${requestId}] Queried ${rows.length} rows from table ${tableId} (total: ${totalCount})`
    )

    return NextResponse.json({
      rows: rows.map((r) => ({
        id: r.id,
        data: r.data,
        createdAt: r.createdAt.toISOString(),
        updatedAt: r.updatedAt.toISOString(),
      })),
      rowCount: rows.length,
      totalCount: Number(totalCount),
      limit: validated.limit,
      offset: validated.offset,
    })
  } catch (error) {
    if (error instanceof z.ZodError) {
      return NextResponse.json(
        { error: 'Validation error', details: error.errors },
        { status: 400 }
      )
    }

    logger.error(`[${requestId}] Error querying rows:`, error)
    return NextResponse.json({ error: 'Failed to query rows' }, { status: 500 })
  }
}

/**
 * PUT /api/table/[tableId]/rows
 * Update multiple rows by filter criteria
 * Example: Update all rows where name contains "test"
 */
export async function PUT(
  request: NextRequest,
  { params }: { params: Promise<{ tableId: string }> }
) {
  const requestId = generateRequestId()
  const { tableId } = await params

  try {
    const authResult = await checkHybridAuth(request)
    if (!authResult.success || !authResult.userId) {
      return NextResponse.json({ error: 'Authentication required' }, { status: 401 })
    }

    const body = await request.json()
    const validated = UpdateRowsByFilterSchema.parse(body)

    // Check workspace access
    const { hasAccess, canWrite } = await checkWorkspaceAccess(
      validated.workspaceId,
      authResult.userId
    )

    if (!hasAccess || !canWrite) {
      return NextResponse.json({ error: 'Access denied' }, { status: 403 })
    }

    // Get table definition
    const [table] = await db
      .select()
      .from(userTableDefinitions)
      .where(
        and(
          eq(userTableDefinitions.id, tableId),
          eq(userTableDefinitions.workspaceId, validated.workspaceId),
          isNull(userTableDefinitions.deletedAt)
        )
      )
      .limit(1)

    if (!table) {
      return NextResponse.json({ error: 'Table not found' }, { status: 404 })
    }

    // Validate new data size
    const sizeValidation = validateRowSize(validated.data)
    if (!sizeValidation.valid) {
      return NextResponse.json(
        { error: 'Invalid row data', details: sizeValidation.errors },
        { status: 400 }
      )
    }

    // Build base where conditions
    const baseConditions = [
      eq(userTableRows.tableId, tableId),
      eq(userTableRows.workspaceId, validated.workspaceId),
    ]

    // Add filter conditions
    const filterClause = buildFilterClause(validated.filter as QueryFilter, 'user_table_rows')
    if (filterClause) {
      baseConditions.push(filterClause)
    }

    // First, get the rows that match the filter to validate against schema
    let matchingRowsQuery = db
      .select({
        id: userTableRows.id,
        data: userTableRows.data,
      })
      .from(userTableRows)
      .where(and(...baseConditions))

    if (validated.limit) {
      matchingRowsQuery = matchingRowsQuery.limit(validated.limit) as any
    }

    const matchingRows = await matchingRowsQuery

    if (matchingRows.length === 0) {
      return NextResponse.json(
        {
          message: 'No rows matched the filter criteria',
          updatedCount: 0,
        },
        { status: 200 }
      )
    }

    // Log warning for large operations but allow them
    if (matchingRows.length > 1000) {
      logger.warn(`[${requestId}] Updating ${matchingRows.length} rows. This may take some time.`)
    }

    // Validate that merged data matches schema for each row
    for (const row of matchingRows) {
      const mergedData = { ...row.data, ...validated.data }
      const rowValidation = validateRowAgainstSchema(mergedData, table.schema as TableSchema)
      if (!rowValidation.valid) {
        return NextResponse.json(
          {
            error: 'Updated data does not match schema',
            details: rowValidation.errors,
            affectedRowId: row.id,
          },
          { status: 400 }
        )
      }
    }

    // Update rows by merging existing data with new data in batches
    const now = new Date()
    const BATCH_SIZE = 100 // Smaller batch for updates since each is a separate query
    let totalUpdated = 0

    for (let i = 0; i < matchingRows.length; i += BATCH_SIZE) {
      const batch = matchingRows.slice(i, i + BATCH_SIZE)
      const updatePromises = batch.map((row) =>
        db
          .update(userTableRows)
          .set({
            data: { ...row.data, ...validated.data },
            updatedAt: now,
          })
          .where(eq(userTableRows.id, row.id))
      )
      await Promise.all(updatePromises)
      totalUpdated += batch.length
      logger.info(
        `[${requestId}] Updated batch ${Math.floor(i / BATCH_SIZE) + 1} (${totalUpdated}/${matchingRows.length} rows)`
      )
    }

    logger.info(`[${requestId}] Updated ${matchingRows.length} rows in table ${tableId}`)

    return NextResponse.json({
      message: 'Rows updated successfully',
      updatedCount: matchingRows.length,
      updatedRowIds: matchingRows.map((r) => r.id),
    })
  } catch (error) {
    if (error instanceof z.ZodError) {
      return NextResponse.json(
        { error: 'Validation error', details: error.errors },
        { status: 400 }
      )
    }

    logger.error(`[${requestId}] Error updating rows by filter:`, error)

    const errorMessage = error instanceof Error ? error.message : String(error)
    const detailedError = `Failed to update rows: ${errorMessage}`

    return NextResponse.json({ error: detailedError }, { status: 500 })
  }
}

/**
 * DELETE /api/table/[tableId]/rows
 * Delete multiple rows by filter criteria
 * Example: Delete all rows where seen is false
 */
export async function DELETE(
  request: NextRequest,
  { params }: { params: Promise<{ tableId: string }> }
) {
  const requestId = generateRequestId()
  const { tableId } = await params

  try {
    const authResult = await checkHybridAuth(request)
    if (!authResult.success || !authResult.userId) {
      return NextResponse.json({ error: 'Authentication required' }, { status: 401 })
    }

    const body = await request.json()
    const validated = DeleteRowsByFilterSchema.parse(body)

    // Check workspace access
    const { hasAccess, canWrite } = await checkWorkspaceAccess(
      validated.workspaceId,
      authResult.userId
    )

    if (!hasAccess || !canWrite) {
      return NextResponse.json({ error: 'Access denied' }, { status: 403 })
    }

    // Verify table exists
    const [table] = await db
      .select({ id: userTableDefinitions.id })
      .from(userTableDefinitions)
      .where(
        and(
          eq(userTableDefinitions.id, tableId),
          eq(userTableDefinitions.workspaceId, validated.workspaceId),
          isNull(userTableDefinitions.deletedAt)
        )
      )
      .limit(1)

    if (!table) {
      return NextResponse.json({ error: 'Table not found' }, { status: 404 })
    }

    // Build base where conditions
    const baseConditions = [
      eq(userTableRows.tableId, tableId),
      eq(userTableRows.workspaceId, validated.workspaceId),
    ]

    // Add filter conditions
    const filterClause = buildFilterClause(validated.filter as QueryFilter, 'user_table_rows')
    if (filterClause) {
      baseConditions.push(filterClause)
    }

    // Get matching rows first (for reporting and limit enforcement)
    let matchingRowsQuery = db
      .select({ id: userTableRows.id })
      .from(userTableRows)
      .where(and(...baseConditions))

    if (validated.limit) {
      matchingRowsQuery = matchingRowsQuery.limit(validated.limit) as any
    }

    const matchingRows = await matchingRowsQuery

    if (matchingRows.length === 0) {
      return NextResponse.json(
        {
          message: 'No rows matched the filter criteria',
          deletedCount: 0,
        },
        { status: 200 }
      )
    }

    // Log warning for large operations but allow them
    if (matchingRows.length > 1000) {
      logger.warn(`[${requestId}] Deleting ${matchingRows.length} rows. This may take some time.`)
    }

    // Delete the matching rows in batches to avoid stack overflow
    const rowIds = matchingRows.map((r) => r.id)
    const BATCH_SIZE = 1000
    let totalDeleted = 0

    for (let i = 0; i < rowIds.length; i += BATCH_SIZE) {
      const batch = rowIds.slice(i, i + BATCH_SIZE)
      await db.delete(userTableRows).where(
        and(
          eq(userTableRows.tableId, tableId),
          eq(userTableRows.workspaceId, validated.workspaceId),
          sql`${userTableRows.id} = ANY(ARRAY[${sql.join(
            batch.map((id) => sql`${id}`),
            sql`, `
          )}])`
        )
      )
      totalDeleted += batch.length
      logger.info(
        `[${requestId}] Deleted batch ${Math.floor(i / BATCH_SIZE) + 1} (${totalDeleted}/${rowIds.length} rows)`
      )
    }

    // Update row count
    await db
      .update(userTableDefinitions)
      .set({
        rowCount: sql`${userTableDefinitions.rowCount} - ${matchingRows.length}`,
        updatedAt: new Date(),
      })
      .where(eq(userTableDefinitions.id, tableId))

    logger.info(`[${requestId}] Deleted ${matchingRows.length} rows from table ${tableId}`)

    return NextResponse.json({
      message: 'Rows deleted successfully',
      deletedCount: matchingRows.length,
      deletedRowIds: rowIds,
    })
  } catch (error) {
    if (error instanceof z.ZodError) {
      return NextResponse.json(
        { error: 'Validation error', details: error.errors },
        { status: 400 }
      )
    }

    logger.error(`[${requestId}] Error deleting rows by filter:`, error)

    const errorMessage = error instanceof Error ? error.message : String(error)
    const detailedError = `Failed to delete rows: ${errorMessage}`

    return NextResponse.json({ error: detailedError }, { status: 500 })
  }
}
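The batching pattern used by the PUT and DELETE handlers above (slice the matched IDs into fixed-size chunks so each statement stays bounded) can be sketched as a standalone helper. This is an illustrative extraction, not a function that exists in the codebase; the name `chunk` is hypothetical.

```typescript
// Sketch: split a list of row IDs (or any items) into fixed-size batches,
// mirroring the `for (let i = 0; i < rowIds.length; i += BATCH_SIZE)` loops above.
function chunk<T>(items: T[], size: number): T[][] {
  const batches: T[][] = []
  for (let i = 0; i < items.length; i += size) {
    // slice never reads past the end, so the last batch may be smaller
    batches.push(items.slice(i, i + size))
  }
  return batches
}
```

Each batch then becomes one `db.delete(...)` or one `Promise.all` of updates, which keeps the `ANY(ARRAY[...])` parameter list and the number of in-flight queries bounded.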
297
apps/sim/app/api/table/route.ts
Normal file
@@ -0,0 +1,297 @@

import { db } from '@sim/db'
import { permissions, userTableDefinitions, workspace } from '@sim/db/schema'
import { createLogger } from '@sim/logger'
import { and, eq, isNull, sql } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { checkHybridAuth } from '@/lib/auth/hybrid'
import { generateRequestId } from '@/lib/core/utils/request'
import { TABLE_LIMITS, validateTableName, validateTableSchema } from '@/lib/table'
import type { TableSchema } from '@/lib/table/validation'

const logger = createLogger('TableAPI')

const ColumnSchema = z.object({
  name: z
    .string()
    .min(1)
    .max(TABLE_LIMITS.MAX_COLUMN_NAME_LENGTH)
    .regex(/^[a-z_][a-z0-9_]*$/i, 'Invalid column name'),
  type: z.enum(['string', 'number', 'boolean', 'date', 'json']),
  required: z.boolean().optional().default(false),
})

const CreateTableSchema = z.object({
  name: z
    .string()
    .min(1)
    .max(TABLE_LIMITS.MAX_TABLE_NAME_LENGTH)
    .regex(/^[a-z_][a-z0-9_]*$/i, 'Invalid table name'),
  description: z.string().max(TABLE_LIMITS.MAX_DESCRIPTION_LENGTH).optional(),
  schema: z.object({
    columns: z.array(ColumnSchema).min(1).max(TABLE_LIMITS.MAX_COLUMNS_PER_TABLE),
  }),
  workspaceId: z.string().min(1),
})

const ListTablesSchema = z.object({
  workspaceId: z.string().min(1),
})

/**
 * Check if user has write access to workspace
 */
async function checkWorkspaceAccess(workspaceId: string, userId: string) {
  const [workspaceData] = await db
    .select({
      id: workspace.id,
      ownerId: workspace.ownerId,
    })
    .from(workspace)
    .where(eq(workspace.id, workspaceId))
    .limit(1)

  if (!workspaceData) {
    return { hasAccess: false, canWrite: false }
  }

  // Owner has full access
  if (workspaceData.ownerId === userId) {
    return { hasAccess: true, canWrite: true }
  }

  // Check permissions
  const [permission] = await db
    .select({
      permissionType: permissions.permissionType,
    })
    .from(permissions)
    .where(
      and(
        eq(permissions.userId, userId),
        eq(permissions.entityType, 'workspace'),
        eq(permissions.entityId, workspaceId)
      )
    )
    .limit(1)

  if (!permission) {
    return { hasAccess: false, canWrite: false }
  }

  const canWrite = permission.permissionType === 'admin' || permission.permissionType === 'write'

  return {
    hasAccess: true,
    canWrite,
  }
}

/**
 * POST /api/table
 * Create a new user-defined table
 */
export async function POST(request: NextRequest) {
  const requestId = generateRequestId()

  try {
    const authResult = await checkHybridAuth(request)
    if (!authResult.success || !authResult.userId) {
      return NextResponse.json({ error: 'Authentication required' }, { status: 401 })
    }

    const body = await request.json()
    const params = CreateTableSchema.parse(body)

    // Validate table name
    const nameValidation = validateTableName(params.name)
    if (!nameValidation.valid) {
      return NextResponse.json(
        { error: 'Invalid table name', details: nameValidation.errors },
        { status: 400 }
      )
    }

    // Validate schema
    const schemaValidation = validateTableSchema(params.schema as TableSchema)
    if (!schemaValidation.valid) {
      return NextResponse.json(
        { error: 'Invalid table schema', details: schemaValidation.errors },
        { status: 400 }
      )
    }

    // Check workspace access
    const { hasAccess, canWrite } = await checkWorkspaceAccess(
      params.workspaceId,
      authResult.userId
    )

    if (!hasAccess || !canWrite) {
      return NextResponse.json({ error: 'Access denied' }, { status: 403 })
    }

    // Check workspace table limit
    const [tableCount] = await db
      .select({ count: sql<number>`count(*)` })
      .from(userTableDefinitions)
      .where(
        and(
          eq(userTableDefinitions.workspaceId, params.workspaceId),
          isNull(userTableDefinitions.deletedAt)
        )
      )

    if (Number(tableCount.count) >= TABLE_LIMITS.MAX_TABLES_PER_WORKSPACE) {
      return NextResponse.json(
        {
          error: `Workspace table limit reached (${TABLE_LIMITS.MAX_TABLES_PER_WORKSPACE} tables max)`,
        },
        { status: 400 }
      )
    }

    // Check for duplicate table name
    const [existing] = await db
      .select({ id: userTableDefinitions.id })
      .from(userTableDefinitions)
      .where(
        and(
          eq(userTableDefinitions.workspaceId, params.workspaceId),
          eq(userTableDefinitions.name, params.name),
          isNull(userTableDefinitions.deletedAt)
        )
      )
      .limit(1)

    if (existing) {
      return NextResponse.json(
        { error: `Table "${params.name}" already exists in this workspace` },
        { status: 400 }
      )
    }

    // Create table
    const tableId = `tbl_${crypto.randomUUID().replace(/-/g, '')}`
    const now = new Date()

    const [table] = await db
      .insert(userTableDefinitions)
      .values({
        id: tableId,
        workspaceId: params.workspaceId,
        name: params.name,
        description: params.description,
        schema: params.schema,
        maxRows: TABLE_LIMITS.MAX_ROWS_PER_TABLE,
        rowCount: 0,
        createdBy: authResult.userId,
        createdAt: now,
        updatedAt: now,
      })
      .returning()

    logger.info(`[${requestId}] Created table ${tableId} in workspace ${params.workspaceId}`)

    return NextResponse.json({
      table: {
        id: table.id,
        name: table.name,
        description: table.description,
        schema: table.schema,
        rowCount: table.rowCount,
        maxRows: table.maxRows,
        createdAt: table.createdAt.toISOString(),
        updatedAt: table.updatedAt.toISOString(),
      },
      message: 'Table created successfully',
    })
  } catch (error) {
    if (error instanceof z.ZodError) {
      return NextResponse.json(
        { error: 'Validation error', details: error.errors },
        { status: 400 }
      )
    }

    logger.error(`[${requestId}] Error creating table:`, error)
    return NextResponse.json({ error: 'Failed to create table' }, { status: 500 })
  }
}

/**
 * GET /api/table?workspaceId=xxx
 * List all tables in a workspace
 */
export async function GET(request: NextRequest) {
  const requestId = generateRequestId()

  try {
    const authResult = await checkHybridAuth(request)
    if (!authResult.success || !authResult.userId) {
      return NextResponse.json({ error: 'Authentication required' }, { status: 401 })
    }

    const { searchParams } = new URL(request.url)
    const workspaceId = searchParams.get('workspaceId')

    const validation = ListTablesSchema.safeParse({ workspaceId })
    if (!validation.success) {
      return NextResponse.json(
        { error: 'Validation error', details: validation.error.errors },
        { status: 400 }
      )
    }

    const params = validation.data

    // Check workspace access
    const { hasAccess } = await checkWorkspaceAccess(params.workspaceId, authResult.userId)

    if (!hasAccess) {
      return NextResponse.json({ error: 'Access denied' }, { status: 403 })
    }

    // Get tables
    const tables = await db
      .select({
        id: userTableDefinitions.id,
        name: userTableDefinitions.name,
        description: userTableDefinitions.description,
        schema: userTableDefinitions.schema,
        rowCount: userTableDefinitions.rowCount,
        maxRows: userTableDefinitions.maxRows,
        createdAt: userTableDefinitions.createdAt,
        updatedAt: userTableDefinitions.updatedAt,
      })
      .from(userTableDefinitions)
      .where(
        and(
          eq(userTableDefinitions.workspaceId, params.workspaceId),
          isNull(userTableDefinitions.deletedAt)
        )
      )
      .orderBy(userTableDefinitions.createdAt)

    logger.info(`[${requestId}] Listed ${tables.length} tables in workspace ${params.workspaceId}`)

    return NextResponse.json({
      tables: tables.map((t) => ({
        ...t,
        createdAt: t.createdAt.toISOString(),
        updatedAt: t.updatedAt.toISOString(),
      })),
      totalCount: tables.length,
    })
  } catch (error) {
    if (error instanceof z.ZodError) {
      return NextResponse.json(
        { error: 'Validation error', details: error.errors },
        { status: 400 }
      )
    }

    logger.error(`[${requestId}] Error listing tables:`, error)
    return NextResponse.json({ error: 'Failed to list tables' }, { status: 500 })
  }
}
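The GET handler above expects `filter` and `sort` as JSON-encoded query parameters alongside plain `workspaceId`, `limit`, and `offset`. A minimal client-side sketch of assembling that query string follows; the helper name and the exact filter/sort shapes are illustrative assumptions, not part of the codebase.

```typescript
// Sketch: build the query string for GET /api/table/[tableId]/rows.
// filter and sort are JSON.stringify'd, matching the JSON.parse in the handler.
interface RowQueryOptions {
  workspaceId: string
  filter?: Record<string, unknown>
  sort?: Record<string, unknown>
  limit?: number
  offset?: number
}

function buildRowsQueryString(opts: RowQueryOptions): string {
  const qs = new URLSearchParams()
  qs.set('workspaceId', opts.workspaceId)
  if (opts.filter) qs.set('filter', JSON.stringify(opts.filter))
  if (opts.sort) qs.set('sort', JSON.stringify(opts.sort))
  if (opts.limit !== undefined) qs.set('limit', String(opts.limit))
  if (opts.offset !== undefined) qs.set('offset', String(opts.offset))
  return qs.toString()
}
```

A caller would then fetch `/api/table/${tableId}/rows?${buildRowsQueryString(opts)}`; omitted parameters simply fall back to the schema defaults on the server.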
@@ -0,0 +1,278 @@
'use client'

import { useState } from 'react'
import { createLogger } from '@sim/logger'
import { Plus, Trash2 } from 'lucide-react'
import { useParams } from 'next/navigation'
import {
  Button,
  Checkbox,
  Combobox,
  Input,
  Label,
  Modal,
  ModalBody,
  ModalContent,
  ModalFooter,
  ModalHeader,
  Textarea,
} from '@/components/emcn'
import { useCreateTable } from '@/hooks/queries/use-tables'

const logger = createLogger('CreateTableModal')

interface ColumnDefinition {
  name: string
  type: 'string' | 'number' | 'boolean' | 'date' | 'json'
  required: boolean
}

interface CreateTableModalProps {
  isOpen: boolean
  onClose: () => void
}

const COLUMN_TYPES = [
  { value: 'string', label: 'String' },
  { value: 'number', label: 'Number' },
  { value: 'boolean', label: 'Boolean' },
  { value: 'date', label: 'Date' },
  { value: 'json', label: 'JSON' },
]

export function CreateTableModal({ isOpen, onClose }: CreateTableModalProps) {
  const params = useParams()
  const workspaceId = params.workspaceId as string

  const [tableName, setTableName] = useState('')
  const [description, setDescription] = useState('')
  const [columns, setColumns] = useState<ColumnDefinition[]>([
    { name: '', type: 'string', required: false },
  ])
  const [error, setError] = useState<string | null>(null)

  const createTable = useCreateTable(workspaceId)

  const handleAddColumn = () => {
    setColumns([...columns, { name: '', type: 'string', required: false }])
  }

  const handleRemoveColumn = (index: number) => {
    if (columns.length > 1) {
      setColumns(columns.filter((_, i) => i !== index))
    }
  }

  const handleColumnChange = (
    index: number,
    field: keyof ColumnDefinition,
    value: string | boolean
  ) => {
    const newColumns = [...columns]
    newColumns[index] = { ...newColumns[index], [field]: value }
    setColumns(newColumns)
  }

  const handleSubmit = async (e: React.FormEvent) => {
    e.preventDefault()
    setError(null)

    if (!tableName.trim()) {
      setError('Table name is required')
      return
    }

    // Validate column names
    const validColumns = columns.filter((col) => col.name.trim())
    if (validColumns.length === 0) {
      setError('At least one column is required')
      return
    }

    // Check for duplicate column names
    const columnNames = validColumns.map((col) => col.name.toLowerCase())
    const uniqueNames = new Set(columnNames)
    if (uniqueNames.size !== columnNames.length) {
      setError('Duplicate column names found')
      return
    }

    try {
      await createTable.mutateAsync({
        name: tableName,
        description: description || undefined,
        schema: {
          columns: validColumns,
        },
      })

      // Reset form
      setTableName('')
      setDescription('')
      setColumns([{ name: '', type: 'string', required: false }])
      setError(null)
      onClose()
    } catch (err) {
      logger.error('Failed to create table:', err)
      setError(err instanceof Error ? err.message : 'Failed to create table')
    }
  }

  const handleClose = () => {
    // Reset form on close
    setTableName('')
    setDescription('')
    setColumns([{ name: '', type: 'string', required: false }])
    setError(null)
    onClose()
  }

  return (
    <Modal open={isOpen} onOpenChange={handleClose}>
      <ModalContent className='w-[600px]'>
        <ModalHeader>Create New Table</ModalHeader>
        <ModalBody className='max-h-[70vh] overflow-y-auto'>
          <form onSubmit={handleSubmit} className='flex flex-col gap-[16px]'>
            {error && (
              <div className='rounded-[6px] border border-[var(--status-error-border)] bg-[var(--status-error-bg)] px-[12px] py-[10px] text-[12px] text-[var(--status-error-text)]'>
|
||||
{error}
|
||||
</div>
|
||||
)}
|
||||
|
||||
{/* Table Name */}
|
||||
<div className='flex flex-col gap-[6px]'>
|
||||
<Label htmlFor='tableName' className='text-[12px] font-medium'>
|
||||
Table Name*
|
||||
</Label>
|
||||
<Input
|
||||
id='tableName'
|
||||
value={tableName}
|
||||
onChange={(e: React.ChangeEvent<HTMLInputElement>) => setTableName(e.target.value)}
|
||||
placeholder='customers, orders, products'
|
||||
className='h-[36px]'
|
||||
required
|
||||
/>
|
||||
<p className='text-[11px] text-[var(--text-muted)]'>
|
||||
Use lowercase with underscores (e.g., customer_orders)
|
||||
</p>
|
||||
</div>
|
||||
|
||||
{/* Description */}
|
||||
<div className='flex flex-col gap-[6px]'>
|
||||
<Label htmlFor='description' className='text-[12px] font-medium'>
|
||||
Description
|
||||
</Label>
|
||||
<Textarea
|
||||
id='description'
|
||||
value={description}
|
||||
onChange={(e: React.ChangeEvent<HTMLTextAreaElement>) =>
|
||||
setDescription(e.target.value)
|
||||
}
|
||||
placeholder='Optional description for this table'
|
||||
rows={2}
|
||||
className='resize-none'
|
||||
/>
|
||||
</div>
|
||||
|
||||
{/* Columns */}
|
||||
<div className='flex flex-col gap-[12px]'>
|
||||
<div className='flex items-center justify-between'>
|
||||
<Label className='text-[12px] font-medium'>Columns*</Label>
|
||||
<Button
|
||||
type='button'
|
||||
size='sm'
|
||||
variant='default'
|
||||
onClick={handleAddColumn}
|
||||
className='h-[28px] rounded-[6px] px-[10px] text-[12px]'
|
||||
>
|
||||
<Plus className='mr-[4px] h-[12px] w-[12px]' />
|
||||
Add Column
|
||||
</Button>
|
||||
</div>
|
||||
|
||||
{/* Column Headers */}
|
||||
<div className='flex items-center gap-[8px] text-[11px] font-medium text-[var(--text-muted)]'>
|
||||
<div className='flex-1'>Name</div>
|
||||
<div className='w-[120px]'>Type</div>
|
||||
<div className='w-[70px] text-center'>Required</div>
|
||||
<div className='w-[32px]' />
|
||||
</div>
|
||||
|
||||
{/* Column Rows */}
|
||||
<div className='flex flex-col gap-[8px]'>
|
||||
{columns.map((column, index) => (
|
||||
<div key={index} className='flex items-center gap-[8px]'>
|
||||
{/* Column Name */}
|
||||
<div className='flex-1'>
|
||||
<Input
|
||||
value={column.name}
|
||||
onChange={(e: React.ChangeEvent<HTMLInputElement>) =>
|
||||
handleColumnChange(index, 'name', e.target.value)
|
||||
}
|
||||
placeholder='column_name'
|
||||
className='h-[36px]'
|
||||
/>
|
||||
</div>
|
||||
|
||||
{/* Column Type */}
|
||||
<div className='w-[120px]'>
|
||||
<Combobox
|
||||
options={COLUMN_TYPES}
|
||||
value={column.type}
|
||||
selectedValue={column.type}
|
||||
onChange={(value) =>
|
||||
handleColumnChange(index, 'type', value as ColumnDefinition['type'])
|
||||
}
|
||||
placeholder='Type'
|
||||
editable={false}
|
||||
filterOptions={false}
|
||||
size='sm'
|
||||
/>
|
||||
</div>
|
||||
|
||||
{/* Required Checkbox */}
|
||||
<div className='flex w-[70px] items-center justify-center'>
|
||||
<Checkbox
|
||||
checked={column.required}
|
||||
onCheckedChange={(checked) =>
|
||||
handleColumnChange(index, 'required', checked === true)
|
||||
}
|
||||
/>
|
||||
</div>
|
||||
|
||||
{/* Delete Button */}
|
||||
<div className='w-[32px]'>
|
||||
<Button
|
||||
type='button'
|
||||
size='sm'
|
||||
variant='ghost'
|
||||
onClick={() => handleRemoveColumn(index)}
|
||||
disabled={columns.length === 1}
|
||||
className='h-[32px] w-[32px] p-0 text-[var(--text-muted)] hover:text-[var(--text-error)]'
|
||||
>
|
||||
<Trash2 className='h-[14px] w-[14px]' />
|
||||
</Button>
|
||||
</div>
|
||||
</div>
|
||||
))}
|
||||
</div>
|
||||
</div>
|
||||
</form>
|
||||
</ModalBody>
|
||||
<ModalFooter>
|
||||
<Button type='button' variant='default' onClick={handleClose}>
|
||||
Cancel
|
||||
</Button>
|
||||
<Button
|
||||
type='button'
|
||||
variant='tertiary'
|
||||
onClick={handleSubmit}
|
||||
disabled={createTable.isPending}
|
||||
>
|
||||
{createTable.isPending ? 'Creating...' : 'Create Table'}
|
||||
</Button>
|
||||
</ModalFooter>
|
||||
</ModalContent>
|
||||
</Modal>
|
||||
)
|
||||
}
|
||||
@@ -0,0 +1,2 @@
export * from './create-table-modal'
export * from './table-card'

@@ -0,0 +1,143 @@
'use client'

import { useState } from 'react'
import { createLogger } from '@sim/logger'
import { Database, MoreVertical, Trash2 } from 'lucide-react'
import { useRouter } from 'next/navigation'
import {
  Button,
  Modal,
  ModalBody,
  ModalContent,
  ModalFooter,
  ModalHeader,
  Popover,
  PopoverContent,
  PopoverItem,
  PopoverTrigger,
} from '@/components/emcn'
import { useDeleteTable } from '@/hooks/queries/use-tables'
import type { TableDefinition } from '@/tools/table/types'

const logger = createLogger('TableCard')

interface TableCardProps {
  table: TableDefinition
  workspaceId: string
}

export function TableCard({ table, workspaceId }: TableCardProps) {
  const router = useRouter()
  const [isDeleteDialogOpen, setIsDeleteDialogOpen] = useState(false)
  const [isMenuOpen, setIsMenuOpen] = useState(false)

  const deleteTable = useDeleteTable(workspaceId)

  const handleDelete = async () => {
    try {
      await deleteTable.mutateAsync(table.id)
      setIsDeleteDialogOpen(false)
    } catch (error) {
      logger.error('Failed to delete table:', error)
    }
  }

  const columnCount = table.schema.columns.length

  return (
    <>
      <div
        data-table-card
        className='group relative cursor-pointer rounded-[8px] border border-[var(--border-muted)] bg-[var(--surface-1)] p-[16px] transition-colors hover:border-[var(--border-color)]'
        onClick={() => router.push(`/workspace/${workspaceId}/tables/${table.id}`)}
      >
        <div className='flex items-start justify-between gap-[8px]'>
          <div className='flex min-w-0 flex-1 items-start gap-[12px]'>
            <div className='mt-[2px] flex-shrink-0'>
              <div className='flex h-[40px] w-[40px] items-center justify-center rounded-[8px] border border-[#3B82F6] bg-[#EFF6FF] dark:border-[#1E40AF] dark:bg-[#1E3A5F]'>
                <Database className='h-[20px] w-[20px] text-[#3B82F6] dark:text-[#60A5FA]' />
              </div>
            </div>

            <div className='min-w-0 flex-1'>
              <h3 className='truncate font-medium text-[14px] text-[var(--text-primary)]'>
                {table.name}
              </h3>

              {table.description && (
                <p className='mt-[4px] line-clamp-2 text-[12px] text-[var(--text-tertiary)]'>
                  {table.description}
                </p>
              )}

              <div className='mt-[12px] flex items-center gap-[16px] text-[11px] text-[var(--text-subtle)]'>
                <span>{columnCount} columns</span>
                <span>{table.rowCount} rows</span>
              </div>

              <div className='mt-[8px] text-[11px] text-[var(--text-muted)]'>
                Updated {new Date(table.updatedAt).toLocaleDateString()}
              </div>
            </div>
          </div>

          <Popover open={isMenuOpen} onOpenChange={setIsMenuOpen}>
            <PopoverTrigger asChild>
              <Button
                variant='ghost'
                size='sm'
                className='h-[24px] w-[24px] p-0 opacity-0 group-hover:opacity-100'
                onClick={(e) => e.stopPropagation()}
              >
                <MoreVertical className='h-[14px] w-[14px]' />
              </Button>
            </PopoverTrigger>
            <PopoverContent align='end' className='w-[160px]'>
              <PopoverItem
                onClick={(e) => {
                  e.stopPropagation()
                  setIsMenuOpen(false)
                  setIsDeleteDialogOpen(true)
                }}
              >
                <Trash2 className='mr-[8px] h-[14px] w-[14px]' />
                Delete
              </PopoverItem>
            </PopoverContent>
          </Popover>
        </div>
      </div>

      <Modal open={isDeleteDialogOpen} onOpenChange={setIsDeleteDialogOpen}>
        <ModalContent className='w-[400px]'>
          <ModalHeader>Delete Table</ModalHeader>
          <ModalBody>
            <p className='text-[12px] text-[var(--text-secondary)]'>
              Are you sure you want to delete{' '}
              <span className='font-medium text-[var(--text-primary)]'>{table.name}</span>? This
              will permanently delete all {table.rowCount} rows.{' '}
              <span className='text-[var(--text-error)]'>This action cannot be undone.</span>
            </p>
          </ModalBody>
          <ModalFooter>
            <Button
              variant='default'
              onClick={() => setIsDeleteDialogOpen(false)}
              disabled={deleteTable.isPending}
            >
              Cancel
            </Button>
            <Button
              variant='ghost'
              onClick={handleDelete}
              disabled={deleteTable.isPending}
              className='text-[var(--text-error)] hover:text-[var(--text-error)]'
            >
              {deleteTable.isPending ? 'Deleting...' : 'Delete'}
            </Button>
          </ModalFooter>
        </ModalContent>
      </Modal>
    </>
  )
}

10
apps/sim/app/workspace/[workspaceId]/tables/layout.tsx
Normal file

@@ -0,0 +1,10 @@
/**
 * Tables layout - applies sidebar padding for all table routes.
 */
export default function TablesLayout({ children }: { children: React.ReactNode }) {
  return (
    <div className='flex h-full flex-1 flex-col overflow-hidden pl-[var(--sidebar-width)]'>
      {children}
    </div>
  )
}

26
apps/sim/app/workspace/[workspaceId]/tables/page.tsx
Normal file

@@ -0,0 +1,26 @@
import { redirect } from 'next/navigation'
import { getSession } from '@/lib/auth'
import { verifyWorkspaceMembership } from '@/app/api/workflows/utils'
import { Tables } from './tables'

interface TablesPageProps {
  params: Promise<{
    workspaceId: string
  }>
}

export default async function TablesPage({ params }: TablesPageProps) {
  const { workspaceId } = await params
  const session = await getSession()

  if (!session?.user?.id) {
    redirect('/')
  }

  const hasPermission = await verifyWorkspaceMembership(session.user.id, workspaceId)
  if (!hasPermission) {
    redirect('/')
  }

  return <Tables />
}

142
apps/sim/app/workspace/[workspaceId]/tables/tables.tsx
Normal file

@@ -0,0 +1,142 @@
'use client'

import { useState } from 'react'
import { createLogger } from '@sim/logger'
import { Database, Plus, Search } from 'lucide-react'
import { useParams } from 'next/navigation'
import { Button, Tooltip } from '@/components/emcn'
import { Input } from '@/components/ui/input'
import { useUserPermissionsContext } from '@/app/workspace/[workspaceId]/providers/workspace-permissions-provider'
import { useTablesList } from '@/hooks/queries/use-tables'
import { useDebounce } from '@/hooks/use-debounce'
import { CreateTableModal } from './components/create-table-modal'
import { TableCard } from './components/table-card'

const logger = createLogger('Tables')

export function Tables() {
  const params = useParams()
  const workspaceId = params.workspaceId as string
  const userPermissions = useUserPermissionsContext()

  const { data: tables = [], isLoading, error } = useTablesList(workspaceId)

  const [searchQuery, setSearchQuery] = useState('')
  const debouncedSearchQuery = useDebounce(searchQuery, 300)
  const [isCreateModalOpen, setIsCreateModalOpen] = useState(false)

  // Filter tables by search query
  const filteredTables = tables.filter((table) => {
    if (!debouncedSearchQuery) return true

    const query = debouncedSearchQuery.toLowerCase()
    return (
      table.name.toLowerCase().includes(query) || table.description?.toLowerCase().includes(query)
    )
  })

  return (
    <>
      <div className='flex h-full flex-1 flex-col'>
        <div className='flex flex-1 overflow-hidden'>
          <div className='flex flex-1 flex-col overflow-auto bg-white px-[24px] pt-[28px] pb-[24px] dark:bg-[var(--bg)]'>
            {/* Header */}
            <div>
              <div className='flex items-start gap-[12px]'>
                <div className='flex h-[26px] w-[26px] items-center justify-center rounded-[6px] border border-[#3B82F6] bg-[#EFF6FF] dark:border-[#1E40AF] dark:bg-[#1E3A5F]'>
                  <Database className='h-[14px] w-[14px] text-[#3B82F6] dark:text-[#60A5FA]' />
                </div>
                <h1 className='font-medium text-[18px]'>Tables</h1>
              </div>
              <p className='mt-[10px] text-[14px] text-[var(--text-tertiary)]'>
                Create and manage data tables for your workflows.
              </p>
            </div>

            {/* Search and Actions */}
            <div className='mt-[14px] flex items-center justify-between'>
              <div className='flex h-[32px] w-[400px] items-center gap-[6px] rounded-[8px] bg-[var(--surface-4)] px-[8px]'>
                <Search className='h-[14px] w-[14px] text-[var(--text-subtle)]' />
                <Input
                  placeholder='Search'
                  value={searchQuery}
                  onChange={(e) => setSearchQuery(e.target.value)}
                  className='flex-1 border-0 bg-transparent px-0 font-medium text-[var(--text-secondary)] text-small leading-none placeholder:text-[var(--text-subtle)] focus-visible:ring-0 focus-visible:ring-offset-0'
                />
              </div>
              <div className='flex items-center gap-[8px]'>
                <Tooltip.Root>
                  <Tooltip.Trigger asChild>
                    <Button
                      onClick={() => setIsCreateModalOpen(true)}
                      disabled={userPermissions.canEdit !== true}
                      variant='tertiary'
                      className='h-[32px] rounded-[6px]'
                    >
                      <Plus className='mr-[6px] h-[14px] w-[14px]' />
                      Create Table
                    </Button>
                  </Tooltip.Trigger>
                  {userPermissions.canEdit !== true && (
                    <Tooltip.Content>Write permission required to create tables</Tooltip.Content>
                  )}
                </Tooltip.Root>
              </div>
            </div>

            {/* Content */}
            <div className='mt-[24px] grid grid-cols-1 gap-[20px] md:grid-cols-2 lg:grid-cols-3 xl:grid-cols-4'>
              {isLoading ? (
                // Loading skeleton
                Array.from({ length: 8 }).map((_, i) => (
                  <div
                    key={i}
                    className='animate-pulse rounded-[8px] border border-[var(--border-muted)] bg-[var(--surface-1)] p-[16px]'
                  >
                    <div className='flex items-start gap-[12px]'>
                      <div className='h-[40px] w-[40px] rounded-[8px] bg-[var(--surface-4)]' />
                      <div className='flex-1 space-y-[8px]'>
                        <div className='h-[16px] w-3/4 rounded bg-[var(--surface-4)]' />
                        <div className='h-[12px] w-1/2 rounded bg-[var(--surface-4)]' />
                      </div>
                    </div>
                  </div>
                ))
              ) : error ? (
                <div className='col-span-full flex h-64 items-center justify-center rounded-lg border border-muted-foreground/25 bg-muted/20'>
                  <div className='text-center'>
                    <p className='font-medium text-[var(--text-secondary)] text-sm'>
                      Error loading tables
                    </p>
                    <p className='mt-1 text-[var(--text-muted)] text-xs'>
                      {error instanceof Error ? error.message : 'An error occurred'}
                    </p>
                  </div>
                </div>
              ) : filteredTables.length === 0 ? (
                <div className='col-span-full flex h-64 items-center justify-center rounded-lg border border-muted-foreground/25 bg-muted/20'>
                  <div className='text-center'>
                    <p className='font-medium text-[var(--text-secondary)] text-sm'>
                      {searchQuery ? 'No tables found' : 'No tables yet'}
                    </p>
                    <p className='mt-1 text-[var(--text-muted)] text-xs'>
                      {searchQuery
                        ? 'Try adjusting your search query'
                        : 'Create your first table to store structured data for your workflows'}
                    </p>
                  </div>
                </div>
              ) : (
                filteredTables.map((table) => (
                  <TableCard key={table.id} table={table} workspaceId={workspaceId} />
                ))
              )}
            </div>
          </div>
        </div>
      </div>

      <CreateTableModal isOpen={isCreateModalOpen} onClose={() => setIsCreateModalOpen(false)} />
    </>
  )
}

@@ -35,6 +35,7 @@ import { getDependsOnFields } from '@/blocks/utils'
 import { useKnowledgeBase } from '@/hooks/kb/use-knowledge'
 import { useMcpServers, useMcpToolsQuery } from '@/hooks/queries/mcp'
 import { useCredentialName } from '@/hooks/queries/oauth-credentials'
+import { useTablesList } from '@/hooks/queries/use-tables'
 import { useSelectorDisplayName } from '@/hooks/use-selector-display-name'
 import { useVariablesStore } from '@/stores/panel'
 import { useWorkflowRegistry } from '@/stores/workflows/registry/store'
@@ -85,7 +86,11 @@ const isFieldFormatArray = (value: unknown): value is FieldFormat[] => {
   if (!Array.isArray(value) || value.length === 0) return false
   const firstItem = value[0]
   return (
-    typeof firstItem === 'object' && firstItem !== null && 'id' in firstItem && 'name' in firstItem
+    typeof firstItem === 'object' &&
+    firstItem !== null &&
+    'id' in firstItem &&
+    'name' in firstItem &&
+    typeof firstItem.name === 'string'
   )
 }
 
@@ -151,7 +156,8 @@ const isTagFilterArray = (value: unknown): value is TagFilterItem[] => {
     typeof firstItem === 'object' &&
     firstItem !== null &&
     'tagName' in firstItem &&
-    'tagValue' in firstItem
+    'tagValue' in firstItem &&
+    typeof firstItem.tagName === 'string'
   )
 }
 
@@ -173,7 +179,8 @@ const isDocumentTagArray = (value: unknown): value is DocumentTagItem[] => {
     firstItem !== null &&
     'tagName' in firstItem &&
     'value' in firstItem &&
-    !('tagValue' in firstItem) // Distinguish from tag filters
+    !('tagValue' in firstItem) && // Distinguish from tag filters
+    typeof firstItem.tagName === 'string'
   )
 }
 
@@ -222,7 +229,9 @@ export const getDisplayValue = (value: unknown): string => {
   }
 
   if (isTagFilterArray(parsedValue)) {
-    const validFilters = parsedValue.filter((f) => f.tagName?.trim())
+    const validFilters = parsedValue.filter(
+      (f) => typeof f.tagName === 'string' && f.tagName.trim() !== ''
+    )
     if (validFilters.length === 0) return '-'
     if (validFilters.length === 1) return validFilters[0].tagName
     if (validFilters.length === 2) return `${validFilters[0].tagName}, ${validFilters[1].tagName}`
@@ -230,7 +239,9 @@ export const getDisplayValue = (value: unknown): string => {
   }
 
   if (isDocumentTagArray(parsedValue)) {
-    const validTags = parsedValue.filter((t) => t.tagName?.trim())
+    const validTags = parsedValue.filter(
+      (t) => typeof t.tagName === 'string' && t.tagName.trim() !== ''
+    )
     if (validTags.length === 0) return '-'
     if (validTags.length === 1) return validTags[0].tagName
     if (validTags.length === 2) return `${validTags[0].tagName}, ${validTags[1].tagName}`
@@ -258,7 +269,9 @@ export const getDisplayValue = (value: unknown): string => {
   }
 
   if (isFieldFormatArray(parsedValue)) {
-    const namedFields = parsedValue.filter((field) => field.name && field.name.trim() !== '')
+    const namedFields = parsedValue.filter(
+      (field) => typeof field.name === 'string' && field.name.trim() !== ''
+    )
    if (namedFields.length === 0) return '-'
    if (namedFields.length === 1) return namedFields[0].name
    if (namedFields.length === 2) return `${namedFields[0].name}, ${namedFields[1].name}`
@@ -442,6 +455,15 @@ const SubBlockRow = ({
     return tool?.name ?? null
   }, [subBlock?.type, rawValue, mcpToolsData])
 
+  const { data: tables = [] } = useTablesList(workspaceId || '')
+  const tableDisplayName = useMemo(() => {
+    if (subBlock?.id !== 'tableId' || typeof rawValue !== 'string') {
+      return null
+    }
+    const table = tables.find((t) => t.id === rawValue)
+    return table?.name ?? null
+  }, [subBlock?.id, rawValue, tables])
+
   const webhookUrlDisplayValue = useMemo(() => {
     if (subBlock?.id !== 'webhookUrlDisplay' || !blockId) {
       return null
@@ -481,18 +503,42 @@ const SubBlockRow = ({
     return `${names[0]}, ${names[1]} +${names.length - 2}`
   }, [subBlock?.type, rawValue, workflowId, allVariables])
 
+  const filterDisplayValue = useMemo(() => {
+    const isFilterField =
+      subBlock?.id === 'filter' || subBlock?.id === 'filterCriteria' || subBlock?.id === 'sort'
+
+    if (!isFilterField || !rawValue) return null
+
+    const parsedValue = tryParseJson(rawValue)
+
+    if (isPlainObject(parsedValue) || Array.isArray(parsedValue)) {
+      try {
+        const jsonStr = JSON.stringify(parsedValue, null, 0)
+        if (jsonStr.length <= 35) return jsonStr
+        return `${jsonStr.slice(0, 32)}...`
+      } catch {
+        return null
+      }
+    }
+
+    return null
+  }, [subBlock?.id, rawValue])
+
   const isPasswordField = subBlock?.password === true
   const maskedValue = isPasswordField && value && value !== '-' ? '•••' : null
+  const isMonospaceField = Boolean(filterDisplayValue)
 
   const isSelectorType = subBlock?.type && SELECTOR_TYPES_HYDRATION_REQUIRED.includes(subBlock.type)
   const hydratedName =
     credentialName ||
     dropdownLabel ||
     variablesDisplayValue ||
+    filterDisplayValue ||
     knowledgeBaseDisplayName ||
     workflowSelectionName ||
     mcpServerDisplayName ||
     mcpToolDisplayName ||
+    tableDisplayName ||
     webhookUrlDisplayValue ||
     selectorDisplayName
   const displayValue = maskedValue || hydratedName || (isSelectorType && value ? '-' : value)
@@ -507,7 +553,10 @@ const SubBlockRow = ({
         </span>
         {displayValue !== undefined && (
           <span
-            className='flex-1 truncate text-right text-[14px] text-[var(--text-primary)]'
+            className={cn(
+              'flex-1 truncate text-right text-[14px] text-[var(--text-primary)]',
+              isMonospaceField && 'font-mono'
+            )}
             title={displayValue}
           >
             {displayValue}

@@ -2,7 +2,7 @@
 
 import { useCallback, useEffect, useMemo, useRef, useState } from 'react'
 import { createLogger } from '@sim/logger'
-import { Database, HelpCircle, Layout, Plus, Search, Settings } from 'lucide-react'
+import { Database, HelpCircle, Layout, Plus, Search, Settings, Table } from 'lucide-react'
 import Link from 'next/link'
 import { useParams, usePathname, useRouter } from 'next/navigation'
 import { Button, Download, FolderPlus, Library, Loader, Tooltip } from '@/components/emcn'
@@ -265,6 +265,12 @@ export function Sidebar() {
       href: `/workspace/${workspaceId}/knowledge`,
       hidden: permissionConfig.hideKnowledgeBaseTab,
     },
+    {
+      id: 'tables',
+      label: 'Tables',
+      icon: Table,
+      href: `/workspace/${workspaceId}/tables`,
+    },
     {
       id: 'help',
       label: 'Help',

544
apps/sim/blocks/blocks/table.ts
Normal file

@@ -0,0 +1,544 @@
|
||||
import { TableIcon } from '@/components/icons'
|
||||
import type { BlockConfig } from '@/blocks/types'
|
||||
import type { TableQueryResponse } from '@/tools/table/types'
|
||||
|
||||
export const TableBlock: BlockConfig<TableQueryResponse> = {
|
||||
type: 'table',
|
||||
name: 'Table',
|
||||
description: 'User-defined data tables',
|
||||
longDescription:
|
||||
'Create and manage custom data tables. Store, query, and manipulate structured data within workflows.',
|
||||
docsLink: 'https://docs.sim.ai/tools/table',
|
||||
category: 'blocks',
|
||||
bgColor: '#10B981',
|
||||
icon: TableIcon,
|
||||
subBlocks: [
|
||||
{
|
||||
id: 'operation',
|
||||
title: 'Operation',
|
||||
type: 'dropdown',
|
||||
options: [
|
||||
{ label: 'Query Rows', id: 'queryRows' },
|
||||
{ label: 'Insert Row', id: 'insertRow' },
|
||||
{ label: 'Batch Insert Rows', id: 'batchInsertRows' },
|
||||
{ label: 'Update Rows by Filter', id: 'updateRowsByFilter' },
|
||||
{ label: 'Delete Rows by Filter', id: 'deleteRowsByFilter' },
|
||||
{ label: 'Update Row by ID', id: 'updateRow' },
|
||||
{ label: 'Delete Row by ID', id: 'deleteRow' },
|
||||
{ label: 'Get Row by ID', id: 'getRow' },
|
||||
],
|
||||
value: () => 'queryRows',
|
||||
},
|
||||
|
||||
// Table selector (for all operations)
|
||||
{
|
||||
id: 'tableId',
|
||||
title: 'Table',
|
||||
type: 'dropdown',
|
||||
placeholder: 'Select a table',
|
||||
required: true,
|
||||
options: [],
|
||||
fetchOptions: async () => {
|
||||
const { useWorkflowRegistry } = await import('@/stores/workflows/registry/store')
|
||||
|
||||
const workspaceId = useWorkflowRegistry.getState().hydration.workspaceId
|
||||
if (!workspaceId) {
|
||||
return []
|
||||
}
|
||||
|
||||
try {
|
||||
const response = await fetch(`/api/table?workspaceId=${workspaceId}`)
|
||||
if (!response.ok) {
|
||||
return []
|
||||
}
|
||||
|
||||
const data = await response.json()
|
||||
return (data.tables || []).map((table: any) => ({
|
||||
label: table.name,
|
||||
id: table.id,
|
||||
}))
|
||||
} catch (error) {
|
||||
return []
|
||||
}
|
||||
},
|
||||
fetchOptionById: async (_blockId: string, _subBlockId: string, tableId: string) => {
|
||||
const { useWorkflowRegistry } = await import('@/stores/workflows/registry/store')
|
||||
|
||||
const workspaceId = useWorkflowRegistry.getState().hydration.workspaceId
|
||||
if (!workspaceId) {
|
||||
return null
|
||||
}
|
||||
|
||||
try {
|
||||
const response = await fetch(`/api/table?workspaceId=${workspaceId}`)
|
||||
if (!response.ok) {
|
||||
return null
|
||||
}
|
||||
|
||||
const data = await response.json()
|
||||
const table = (data.tables || []).find((t: any) => t.id === tableId)
|
||||
return table ? { label: table.name, id: table.id } : null
|
||||
} catch (error) {
|
||||
return null
|
||||
}
|
||||
},
|
||||
},
|
||||
|
||||
// Row ID for get/update/delete
|
||||
{
|
||||
id: 'rowId',
|
||||
title: 'Row ID',
|
||||
type: 'short-input',
|
||||
placeholder: 'row_xxxxx',
|
||||
condition: { field: 'operation', value: ['getRow', 'updateRow', 'deleteRow'] },
|
||||
required: true,
|
||||
},
|
||||
|
||||
// Insert/Update Row data (single row)
|
||||
{
|
||||
id: 'rowData',
|
||||
title: 'Row Data (JSON)',
|
||||
type: 'code',
|
||||
placeholder: '{"column_name": "value"}',
|
||||
condition: { field: 'operation', value: ['insertRow', 'updateRow', 'updateRowsByFilter'] },
|
||||
required: true,
|
||||
wandConfig: {
|
||||
enabled: true,
|
||||
maintainHistory: true,
|
||||
prompt: `Generate row data as a JSON object matching the table's column schema.
|
||||
|
||||
### CONTEXT
|
||||
{context}
|
||||
|
||||
### INSTRUCTION
|
||||
Return ONLY a valid JSON object with field values based on the table's columns. No explanations or markdown.
|
||||
|
||||
IMPORTANT: Reference the table schema visible in the table selector to know which columns exist and their types.
|
||||
|
||||
### EXAMPLES
|
||||
|
||||
Table with columns: email (string), name (string), age (number)
|
||||
"user with email john@example.com and age 25"
|
||||
→ {"email": "john@example.com", "name": "John", "age": 25}
|
||||
|
||||
Table with columns: customer_id (string), total (number), status (string)
|
||||
"order with customer ID 123, total 99.99, status pending"
|
||||
→ {"customer_id": "123", "total": 99.99, "status": "pending"}
|
||||
|
||||
Return ONLY the data JSON:`,
|
||||
generationType: 'json-object',
|
||||
},
|
||||
},
|
||||
|
||||
// Batch Insert - multiple rows
|
||||
{
|
||||
id: 'batchRows',
|
||||
title: 'Rows Data (Array of JSON)',
|
||||
type: 'code',
|
||||
placeholder: '[{"col1": "val1"}, {"col1": "val2"}]',
|
||||
condition: { field: 'operation', value: 'batchInsertRows' },
|
||||
required: true,
|
||||
wandConfig: {
|
||||
enabled: true,
|
||||
maintainHistory: true,
|
||||
prompt: `Generate an array of row data objects matching the table's column schema.
|
||||
|
||||
### CONTEXT
|
||||
{context}
|
||||
|
||||
### INSTRUCTION
|
||||
Return ONLY a valid JSON array of objects. Each object represents one row. No explanations or markdown.
|
||||
Maximum 1000 rows per batch.
|
||||
|
||||
IMPORTANT: Reference the table schema to know which columns exist and their types.
|
||||
|
||||
### EXAMPLES
|
||||
|
||||
Table with columns: email (string), name (string), age (number)
|
||||
"3 users: john@example.com age 25, jane@example.com age 30, bob@example.com age 28"
|
||||
→ [
|
||||
{"email": "john@example.com", "name": "John", "age": 25},
|
||||
{"email": "jane@example.com", "name": "Jane", "age": 30},
|
||||
{"email": "bob@example.com", "name": "Bob", "age": 28}
|
||||
]
|
||||
|
||||
Return ONLY the rows array:`,
|
||||
generationType: 'json-object',
|
||||
},
|
||||
},
|
||||
|
||||
    // Filter for update/delete/query operations
    {
      id: 'filterCriteria',
      title: 'Filter Criteria',
      type: 'code',
      placeholder: '{"column_name": {"$eq": "value"}}',
      condition: {
        field: 'operation',
        value: ['updateRowsByFilter', 'deleteRowsByFilter'],
      },
      required: true,
      wandConfig: {
        enabled: true,
        maintainHistory: true,
        prompt: `Generate filter criteria for selecting rows to update or delete.

### CONTEXT
{context}

### INSTRUCTION
Return ONLY a valid JSON filter object. No explanations or markdown.

IMPORTANT: Reference the table schema to know which columns exist and their types.

### OPERATORS
- **$eq**: Equals - {"column": {"$eq": "value"}} or {"column": "value"}
- **$ne**: Not equals - {"column": {"$ne": "value"}}
- **$gt**: Greater than - {"column": {"$gt": 18}}
- **$gte**: Greater than or equal - {"column": {"$gte": 100}}
- **$lt**: Less than - {"column": {"$lt": 90}}
- **$lte**: Less than or equal - {"column": {"$lte": 5}}
- **$in**: In array - {"column": {"$in": ["value1", "value2"]}}
- **$nin**: Not in array - {"column": {"$nin": ["value1", "value2"]}}
- **$contains**: String contains - {"column": {"$contains": "text"}}

### EXAMPLES

"rows where status is active"
→ {"status": "active"}

"rows where age is over 18 and status is pending"
→ {"age": {"$gte": 18}, "status": "pending"}

"rows where email contains gmail.com"
→ {"email": {"$contains": "gmail.com"}}

Return ONLY the filter JSON:`,
        generationType: 'json-object',
      },
    },

    // Safety limit for bulk operations
    {
      id: 'bulkLimit',
      title: 'Limit',
      type: 'short-input',
      placeholder: '100',
      condition: {
        field: 'operation',
        value: ['updateRowsByFilter', 'deleteRowsByFilter'],
      },
    },

    // Query filters
    {
      id: 'filter',
      title: 'Filter',
      type: 'code',
      placeholder: '{"column_name": {"$eq": "value"}}',
      condition: { field: 'operation', value: 'queryRows' },
      wandConfig: {
        enabled: true,
        maintainHistory: true,
        prompt: `Generate query filters for table data using MongoDB-style operators.

### CONTEXT
{context}

### INSTRUCTION
Return ONLY a valid JSON filter object based on the table's columns. No explanations or markdown.

IMPORTANT: Reference the table schema to know which columns exist and their types (string, number, boolean, date, json).

### OPERATORS
- **$eq**: Equals - {"column": {"$eq": "value"}} or {"column": "value"}
- **$ne**: Not equals - {"column": {"$ne": "value"}}
- **$gt**: Greater than - {"column": {"$gt": 18}} (numbers/dates only)
- **$gte**: Greater than or equal - {"column": {"$gte": 100}} (numbers/dates only)
- **$lt**: Less than - {"column": {"$lt": 90}} (numbers/dates only)
- **$lte**: Less than or equal - {"column": {"$lte": 5}} (numbers/dates only)
- **$in**: In array - {"column": {"$in": ["value1", "value2"]}}
- **$nin**: Not in array - {"column": {"$nin": ["value1", "value2"]}}
- **$contains**: String contains (case-insensitive) - {"column": {"$contains": "text"}} (strings only)

### EXAMPLES

Table with columns: status (string), age (number), email (string), active (boolean)

"active users"
→ {"active": true}

"users over 18 years old"
→ {"age": {"$gte": 18}}

"users with status active or pending"
→ {"status": {"$in": ["active", "pending"]}}

"users with age between 18 and 65 and active status"
→ {"age": {"$gte": 18, "$lte": 65}, "active": true}

"users with email containing 'example.com'"
→ {"email": {"$contains": "example.com"}}

Return ONLY the filter JSON:`,
        generationType: 'json-object',
      },
    },
    {
      id: 'sort',
      title: 'Sort',
      type: 'code',
      placeholder: '{"column_name": "desc"}',
      condition: { field: 'operation', value: 'queryRows' },
      wandConfig: {
        enabled: true,
        maintainHistory: true,
        prompt: `Generate sort order for table query results.

### CONTEXT
{context}

### INSTRUCTION
Return ONLY a valid JSON object specifying sort order. No explanations or markdown.

IMPORTANT: Reference the table schema to know which columns exist. You can sort by any column or the built-in columns (createdAt, updatedAt).

### FORMAT
{"column_name": "asc" or "desc"}

You can specify multiple columns for multi-level sorting.

### EXAMPLES

Table with columns: name (string), age (number), email (string), createdAt (date)

"sort by newest first"
→ {"createdAt": "desc"}

"sort by name alphabetically"
→ {"name": "asc"}

"sort by age descending"
→ {"age": "desc"}

"sort by age descending, then name ascending"
→ {"age": "desc", "name": "asc"}

"sort by oldest created first"
→ {"createdAt": "asc"}

Return ONLY the sort JSON:`,
        generationType: 'json-object',
      },
    },
    {
      id: 'limit',
      title: 'Limit',
      type: 'short-input',
      placeholder: '100',
      condition: { field: 'operation', value: 'queryRows' },
      value: () => '100',
    },
    {
      id: 'offset',
      title: 'Offset',
      type: 'short-input',
      placeholder: '0',
      condition: { field: 'operation', value: 'queryRows' },
      value: () => '0',
    },
  ],

  tools: {
    access: [
      'table_insert_row',
      'table_batch_insert_rows',
      'table_update_row',
      'table_update_rows_by_filter',
      'table_delete_row',
      'table_delete_rows_by_filter',
      'table_query_rows',
      'table_get_row',
    ],
    config: {
      tool: (params) => {
        const toolMap: Record<string, string> = {
          insertRow: 'table_insert_row',
          batchInsertRows: 'table_batch_insert_rows',
          updateRow: 'table_update_row',
          updateRowsByFilter: 'table_update_rows_by_filter',
          deleteRow: 'table_delete_row',
          deleteRowsByFilter: 'table_delete_rows_by_filter',
          queryRows: 'table_query_rows',
          getRow: 'table_get_row',
        }
        return toolMap[params.operation] || 'table_query_rows'
      },
      params: (params) => {
        const { operation, ...rest } = params

        /**
         * Helper to parse JSON with better error messages
         */
        const parseJSON = (value: any, fieldName: string): any => {
          if (typeof value !== 'string') return value

          try {
            return JSON.parse(value)
          } catch (error) {
            const errorMsg = error instanceof Error ? error.message : String(error)
            throw new Error(
              `Invalid JSON in ${fieldName}: ${errorMsg}. Make sure all property names are in double quotes (e.g., {"name": "value"} not {name: "value"})`
            )
          }
        }

        // Insert Row
        if (operation === 'insertRow') {
          const data = parseJSON(rest.rowData, 'Row Data')
          return {
            tableId: rest.tableId,
            data,
          }
        }

        // Batch Insert Rows
        if (operation === 'batchInsertRows') {
          const rows = parseJSON(rest.batchRows, 'Rows Data')
          return {
            tableId: rest.tableId,
            rows,
          }
        }

        // Update Row by ID
        if (operation === 'updateRow') {
          const data = parseJSON(rest.rowData, 'Row Data')
          return {
            tableId: rest.tableId,
            rowId: rest.rowId,
            data,
          }
        }

        // Update Rows by Filter
        if (operation === 'updateRowsByFilter') {
          const filter = parseJSON(rest.filterCriteria, 'Filter Criteria')
          const data = parseJSON(rest.rowData, 'Row Data')
          return {
            tableId: rest.tableId,
            filter,
            data,
            limit: rest.bulkLimit ? Number.parseInt(rest.bulkLimit as string, 10) : undefined,
          }
        }

        // Delete Row by ID
        if (operation === 'deleteRow') {
          return {
            tableId: rest.tableId,
            rowId: rest.rowId,
          }
        }

        // Delete Rows by Filter
        if (operation === 'deleteRowsByFilter') {
          const filter = parseJSON(rest.filterCriteria, 'Filter Criteria')
          return {
            tableId: rest.tableId,
            filter,
            limit: rest.bulkLimit ? Number.parseInt(rest.bulkLimit as string, 10) : undefined,
          }
        }

        // Get Row by ID
        if (operation === 'getRow') {
          return {
            tableId: rest.tableId,
            rowId: rest.rowId,
          }
        }

        // Query Rows
        if (operation === 'queryRows') {
          const filter = rest.filter ? parseJSON(rest.filter, 'Filter') : undefined
          const sort = rest.sort ? parseJSON(rest.sort, 'Sort') : undefined

          return {
            tableId: rest.tableId,
            filter,
            sort,
            limit: rest.limit ? Number.parseInt(rest.limit as string, 10) : 100,
            offset: rest.offset ? Number.parseInt(rest.offset as string, 10) : 0,
          }
        }

        return rest
      },
    },
  },

  inputs: {
    operation: { type: 'string', description: 'Table operation to perform' },
    tableId: { type: 'string', description: 'Table identifier' },
    rowData: { type: 'json', description: 'Row data for insert/update' },
    batchRows: { type: 'array', description: 'Array of row data for batch insert' },
    rowId: { type: 'string', description: 'Row identifier for ID-based operations' },
    filterCriteria: { type: 'json', description: 'Filter criteria for bulk operations' },
    bulkLimit: { type: 'number', description: 'Safety limit for bulk operations' },
    filter: { type: 'json', description: 'Query filter conditions' },
    sort: { type: 'json', description: 'Sort order' },
    limit: { type: 'number', description: 'Query result limit' },
    offset: { type: 'number', description: 'Query result offset' },
  },

  outputs: {
    success: { type: 'boolean', description: 'Operation success status' },
    row: {
      type: 'json',
      description: 'Single row data',
      condition: { field: 'operation', value: ['getRow', 'insertRow', 'updateRow'] },
    },
    rows: {
      type: 'array',
      description: 'Array of rows',
      condition: { field: 'operation', value: ['queryRows', 'batchInsertRows'] },
    },
    rowCount: {
      type: 'number',
      description: 'Number of rows returned',
      condition: { field: 'operation', value: 'queryRows' },
    },
    totalCount: {
      type: 'number',
      description: 'Total rows matching filter',
      condition: { field: 'operation', value: 'queryRows' },
    },
    insertedCount: {
      type: 'number',
      description: 'Number of rows inserted',
      condition: { field: 'operation', value: 'batchInsertRows' },
    },
    updatedCount: {
      type: 'number',
      description: 'Number of rows updated',
      condition: { field: 'operation', value: 'updateRowsByFilter' },
    },
    updatedRowIds: {
      type: 'array',
      description: 'IDs of updated rows',
      condition: { field: 'operation', value: 'updateRowsByFilter' },
    },
    deletedCount: {
      type: 'number',
      description: 'Number of rows deleted',
      condition: { field: 'operation', value: ['deleteRow', 'deleteRowsByFilter'] },
    },
    deletedRowIds: {
      type: 'array',
      description: 'IDs of deleted rows',
      condition: { field: 'operation', value: 'deleteRowsByFilter' },
    },
    message: { type: 'string', description: 'Operation status message' },
  },
}
@@ -118,6 +118,7 @@ import { StarterBlock } from '@/blocks/blocks/starter'
import { StripeBlock } from '@/blocks/blocks/stripe'
import { SttBlock } from '@/blocks/blocks/stt'
import { SupabaseBlock } from '@/blocks/blocks/supabase'
import { TableBlock } from '@/blocks/blocks/table'
import { TavilyBlock } from '@/blocks/blocks/tavily'
import { TelegramBlock } from '@/blocks/blocks/telegram'
import { ThinkingBlock } from '@/blocks/blocks/thinking'
@@ -278,6 +279,7 @@ export const registry: Record<string, BlockConfig> = {
  tts: TtsBlock,
  stripe: StripeBlock,
  supabase: SupabaseBlock,
  table: TableBlock,
  tavily: TavilyBlock,
  telegram: TelegramBlock,
  thinking: ThinkingBlock,
@@ -4644,3 +4644,24 @@ export function BedrockIcon(props: SVGProps<SVGSVGElement>) {
    </svg>
  )
}

export function TableIcon(props: SVGProps<SVGSVGElement>) {
  return (
    <svg
      xmlns='http://www.w3.org/2000/svg'
      viewBox='0 0 24 24'
      fill='none'
      stroke='currentColor'
      strokeWidth={2}
      strokeLinecap='round'
      strokeLinejoin='round'
      {...props}
    >
      <rect width='18' height='18' x='3' y='3' rx='2' />
      <path d='M3 9h18' />
      <path d='M3 15h18' />
      <path d='M9 3v18' />
      <path d='M15 3v18' />
    </svg>
  )
}

apps/sim/hooks/queries/use-tables.ts (new file)
@@ -0,0 +1,152 @@
/**
 * React Query hooks for managing user-defined tables.
 *
 * Provides hooks for fetching, creating, and deleting tables within a workspace.
 * Tables are user-defined data structures that can store rows of data in JSONB format.
 *
 * @module hooks/queries/use-tables
 */

import { useMutation, useQuery, useQueryClient } from '@tanstack/react-query'
import type { TableDefinition } from '@/tools/table/types'

/**
 * Query keys for table-related queries in React Query.
 * Use these keys to ensure consistent cache scoping and invalidation
 * in queries or mutations dealing with user-defined tables.
 */
export const tableKeys = {
  /**
   * Base key for all table queries.
   * Example: ['tables']
   */
  all: ['tables'] as const,

  /**
   * Key for all lists of tables.
   * Useful for cache invalidation across all table lists.
   * Example: ['tables', 'list']
   */
  lists: () => [...tableKeys.all, 'list'] as const,

  /**
   * Key for the list of tables in a specific workspace.
   * @param workspaceId - The workspace ID to scope the list to.
   * Example: ['tables', 'list', 'workspace_abc123']
   */
  list: (workspaceId?: string) => [...tableKeys.lists(), workspaceId ?? ''] as const,

  /**
   * Key for all individual table detail queries.
   * Useful for cache invalidation across all details.
   * Example: ['tables', 'detail']
   */
  details: () => [...tableKeys.all, 'detail'] as const,

  /**
   * Key for a specific table's detail.
   * @param tableId - The table ID to scope the detail to.
   * Example: ['tables', 'detail', 'table_abc123']
   */
  detail: (tableId: string) => [...tableKeys.details(), tableId] as const,
}

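The key-factory pattern above derives every cache key from `tableKeys.all`, so invalidating a prefix invalidates everything beneath it. A standalone sketch of the derivation (the workspace and table IDs are made-up example values, and this mirror is illustrative, not part of the commit):

```typescript
// Minimal mirror of the tableKeys factory, runnable without React Query.
const keys = {
  all: ['tables'] as const,
  lists: () => [...keys.all, 'list'] as const,
  list: (workspaceId?: string) => [...keys.lists(), workspaceId ?? ''] as const,
  details: () => [...keys.all, 'detail'] as const,
  detail: (tableId: string) => [...keys.details(), tableId] as const,
}

console.log(keys.list('workspace_abc123')) // ['tables', 'list', 'workspace_abc123']
console.log(keys.detail('table_abc123')) // ['tables', 'detail', 'table_abc123']
```

Because `keys.list(...)` extends `keys.lists()`, a call like `invalidateQueries({ queryKey: keys.lists() })` matches every workspace's list at once.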
/**
 * Hook to fetch all tables for a workspace.
 *
 * @param workspaceId - The workspace ID to fetch tables for. If undefined, the query is disabled.
 * @returns React Query result containing the list of tables.
 */
export function useTablesList(workspaceId?: string) {
  return useQuery({
    queryKey: tableKeys.list(workspaceId),
    queryFn: async () => {
      if (!workspaceId) throw new Error('Workspace ID required')

      const res = await fetch(`/api/table?workspaceId=${encodeURIComponent(workspaceId)}`)

      if (!res.ok) {
        const error = await res.json()
        throw new Error(error.error || 'Failed to fetch tables')
      }

      const data = await res.json()
      return data.tables as TableDefinition[]
    },
    enabled: Boolean(workspaceId),
    staleTime: 30 * 1000, // Cache data for 30 seconds before refetching
  })
}

/**
 * Hook to create a new table in a workspace.
 *
 * @param workspaceId - The workspace ID where the table will be created.
 * @returns React Query mutation object with mutationFn and onSuccess handler.
 * The mutationFn accepts table creation parameters (name, description, schema).
 * On success, invalidates the tables list query to refresh the UI.
 */
export function useCreateTable(workspaceId: string) {
  const queryClient = useQueryClient()

  return useMutation({
    mutationFn: async (params: {
      name: string
      description?: string
      schema: { columns: Array<{ name: string; type: string; required?: boolean }> }
    }) => {
      const res = await fetch('/api/table', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ ...params, workspaceId }),
      })

      if (!res.ok) {
        const error = await res.json()
        throw new Error(error.error || 'Failed to create table')
      }

      return res.json()
    },
    onSuccess: () => {
      queryClient.invalidateQueries({ queryKey: tableKeys.list(workspaceId) })
    },
  })
}

/**
 * Hook to delete a table from a workspace.
 *
 * @param workspaceId - The workspace ID containing the table to delete.
 * @returns React Query mutation object with mutationFn and onSuccess handler.
 * The mutationFn accepts a tableId string.
 * On success, invalidates the tables list query to refresh the UI.
 */
export function useDeleteTable(workspaceId: string) {
  const queryClient = useQueryClient()

  return useMutation({
    mutationFn: async (tableId: string) => {
      const res = await fetch(
        `/api/table/${tableId}?workspaceId=${encodeURIComponent(workspaceId)}`,
        {
          method: 'DELETE',
        }
      )

      if (!res.ok) {
        const error = await res.json()
        throw new Error(error.error || 'Failed to delete table')
      }

      return res.json()
    },
    onSuccess: () => {
      queryClient.invalidateQueries({ queryKey: tableKeys.list(workspaceId) })
    },
  })
}

apps/sim/lib/table/constants.ts (new file)
@@ -0,0 +1,28 @@
/**
 * Limits and constants for user-defined tables
 */
export const TABLE_LIMITS = {
  MAX_TABLES_PER_WORKSPACE: 100,
  MAX_ROWS_PER_TABLE: 10000,
  MAX_ROW_SIZE_BYTES: 100 * 1024, // 100KB
  MAX_COLUMNS_PER_TABLE: 50,
  MAX_TABLE_NAME_LENGTH: 50,
  MAX_COLUMN_NAME_LENGTH: 50,
  MAX_STRING_VALUE_LENGTH: 10000,
  MAX_DESCRIPTION_LENGTH: 500,
  DEFAULT_QUERY_LIMIT: 100,
  MAX_QUERY_LIMIT: 1000,
} as const

/**
 * Valid column types for table schema
 */
export const COLUMN_TYPES = ['string', 'number', 'boolean', 'date', 'json'] as const

export type ColumnType = (typeof COLUMN_TYPES)[number]

/**
 * Regex pattern for valid table and column names.
 * Must start with a letter or underscore, followed by alphanumerics or underscores.
 */
export const NAME_PATTERN = /^[a-z_][a-z0-9_]*$/i
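The `NAME_PATTERN` rule above (case-insensitive via the `i` flag) can be exercised directly; the sample names are illustrative:

```typescript
// Standalone check of the NAME_PATTERN naming rule: names must start with a
// letter or underscore and contain only letters, digits, and underscores.
const NAME_PATTERN = /^[a-z_][a-z0-9_]*$/i

console.log(NAME_PATTERN.test('customer_id')) // true
console.log(NAME_PATTERN.test('_private')) // true
console.log(NAME_PATTERN.test('1st_column')) // false (starts with a digit)
console.log(NAME_PATTERN.test('total-price')) // false (hyphen not allowed)
```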
apps/sim/lib/table/index.ts (new file)
@@ -0,0 +1,3 @@
export * from './constants'
export * from './query-builder'
export * from './validation'

apps/sim/lib/table/query-builder.ts (new file)
@@ -0,0 +1,144 @@
/**
 * Query builder utilities for user-defined tables.
 *
 * Provides functions to build SQL WHERE and ORDER BY clauses for querying
 * user table rows stored as JSONB in PostgreSQL. Supports filtering on
 * JSONB fields using various operators ($eq, $ne, $gt, $gte, $lt, $lte, $in, $nin, $contains)
 * and sorting by both JSONB fields and built-in columns (createdAt, updatedAt).
 */

import type { SQL } from 'drizzle-orm'
import { sql } from 'drizzle-orm'

export interface QueryFilter {
  [key: string]:
    | any
    | {
        $eq?: any
        $ne?: any
        $gt?: number
        $gte?: number
        $lt?: number
        $lte?: number
        $in?: any[]
        $nin?: any[]
        $contains?: string
      }
}

/**
 * Build WHERE clause from filter object
 * Supports: $eq, $ne, $gt, $gte, $lt, $lte, $in, $nin, $contains
 */
export function buildFilterClause(filter: QueryFilter, tableName: string): SQL | undefined {
  const conditions: SQL[] = []

  for (const [field, condition] of Object.entries(filter)) {
    // Escape field name to prevent SQL injection
    const escapedField = field.replace(/'/g, "''")

    if (typeof condition === 'object' && condition !== null && !Array.isArray(condition)) {
      // Operator-based filter
      for (const [op, value] of Object.entries(condition)) {
        switch (op) {
          case '$eq':
            conditions.push(
              sql`${sql.raw(`${tableName}.data->>'${escapedField}'`)} = ${String(value)}`
            )
            break
          case '$ne':
            conditions.push(
              sql`${sql.raw(`${tableName}.data->>'${escapedField}'`)} != ${String(value)}`
            )
            break
          case '$gt':
            conditions.push(
              sql`(${sql.raw(`${tableName}.data->>'${escapedField}'`)})::numeric > ${value}`
            )
            break
          case '$gte':
            conditions.push(
              sql`(${sql.raw(`${tableName}.data->>'${escapedField}'`)})::numeric >= ${value}`
            )
            break
          case '$lt':
            conditions.push(
              sql`(${sql.raw(`${tableName}.data->>'${escapedField}'`)})::numeric < ${value}`
            )
            break
          case '$lte':
            conditions.push(
              sql`(${sql.raw(`${tableName}.data->>'${escapedField}'`)})::numeric <= ${value}`
            )
            break
          case '$in':
            if (Array.isArray(value) && value.length > 0) {
              const valuesList = value.map((v) => String(v))
              conditions.push(
                sql`${sql.raw(`${tableName}.data->>'${escapedField}'`)} = ANY(${valuesList})`
              )
            }
            break
          case '$nin':
            if (Array.isArray(value) && value.length > 0) {
              const valuesList = value.map((v) => String(v))
              conditions.push(
                sql`${sql.raw(`${tableName}.data->>'${escapedField}'`)} != ALL(${valuesList})`
              )
            }
            break
          case '$contains':
            conditions.push(
              sql`${sql.raw(`${tableName}.data->>'${escapedField}'`)} ILIKE ${`%${value}%`}`
            )
            break
        }
      }
    } else {
      // Direct equality
      conditions.push(
        sql`${sql.raw(`${tableName}.data->>'${escapedField}'`)} = ${String(condition)}`
      )
    }
  }

  if (conditions.length === 0) return undefined
  if (conditions.length === 1) return conditions[0]

  return sql.join(conditions, sql.raw(' AND '))
}

/**
 * Build ORDER BY clause from sort object
 * Format: {field: 'asc'|'desc'}
 */
export function buildSortClause(
  sort: Record<string, 'asc' | 'desc'>,
  tableName: string
): SQL | undefined {
  const clauses: SQL[] = []

  for (const [field, direction] of Object.entries(sort)) {
    // Escape field name to prevent SQL injection
    const escapedField = field.replace(/'/g, "''")

    if (field === 'createdAt' || field === 'updatedAt') {
      // Built-in columns
      clauses.push(
        direction === 'asc'
          ? sql.raw(`${tableName}.${escapedField} ASC`)
          : sql.raw(`${tableName}.${escapedField} DESC`)
      )
    } else {
      // JSONB fields
      clauses.push(
        direction === 'asc'
          ? sql.raw(`${tableName}.data->>'${escapedField}' ASC`)
          : sql.raw(`${tableName}.data->>'${escapedField}' DESC`)
      )
    }
  }

  return clauses.length > 0 ? sql.join(clauses, sql.raw(', ')) : undefined
}
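The operator semantics that `buildFilterClause` encodes in SQL can be pictured with a small in-memory evaluator. This sketch is illustrative only (it is not part of the commit) and mirrors the text-comparison behavior of `data->>'field'`, which extracts JSONB values as text:

```typescript
// In-memory sketch of the MongoDB-style operator semantics.
type Row = Record<string, any>

function matches(row: Row, filter: Record<string, any>): boolean {
  return Object.entries(filter).every(([field, cond]) => {
    const v = row[field]
    if (typeof cond !== 'object' || cond === null || Array.isArray(cond)) {
      return String(v) === String(cond) // direct equality, like data->>'field' = value
    }
    return Object.entries(cond).every(([op, target]) => {
      switch (op) {
        case '$eq':
          return String(v) === String(target)
        case '$ne':
          return String(v) !== String(target)
        case '$gt':
          return Number(v) > Number(target)
        case '$gte':
          return Number(v) >= Number(target)
        case '$lt':
          return Number(v) < Number(target)
        case '$lte':
          return Number(v) <= Number(target)
        case '$in':
          return (target as any[]).map(String).includes(String(v))
        case '$nin':
          return !(target as any[]).map(String).includes(String(v))
        case '$contains':
          return String(v).toLowerCase().includes(String(target).toLowerCase()) // ILIKE
        default:
          return false
      }
    })
  })
}

const rows = [
  { status: 'active', age: 25 },
  { status: 'pending', age: 17 },
]
console.log(rows.filter((r) => matches(r, { age: { $gte: 18 }, status: 'active' })).length) // 1
```

Multiple fields and multiple operators on one field combine with AND, matching the `sql.join(conditions, sql.raw(' AND '))` at the end of `buildFilterClause`.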
apps/sim/lib/table/validation.ts (new file)
@@ -0,0 +1,199 @@
import type { ColumnType } from './constants'
import { COLUMN_TYPES, NAME_PATTERN, TABLE_LIMITS } from './constants'

export interface ColumnDefinition {
  name: string
  type: ColumnType
  required?: boolean
}

export interface TableSchema {
  columns: ColumnDefinition[]
}

interface ValidationResult {
  valid: boolean
  errors: string[]
}

/**
 * Validates table name against naming rules
 */
export function validateTableName(name: string): ValidationResult {
  const errors: string[] = []

  if (!name || typeof name !== 'string') {
    errors.push('Table name is required')
    return { valid: false, errors }
  }

  if (name.length > TABLE_LIMITS.MAX_TABLE_NAME_LENGTH) {
    errors.push(
      `Table name exceeds maximum length (${TABLE_LIMITS.MAX_TABLE_NAME_LENGTH} characters)`
    )
  }

  if (!NAME_PATTERN.test(name)) {
    errors.push(
      'Table name must start with a letter or underscore, followed by alphanumerics or underscores'
    )
  }

  return {
    valid: errors.length === 0,
    errors,
  }
}

/**
 * Validates column definition
 */
export function validateColumnDefinition(column: ColumnDefinition): ValidationResult {
  const errors: string[] = []

  if (!column.name || typeof column.name !== 'string') {
    errors.push('Column name is required')
    return { valid: false, errors }
  }

  if (column.name.length > TABLE_LIMITS.MAX_COLUMN_NAME_LENGTH) {
    errors.push(
      `Column name "${column.name}" exceeds maximum length (${TABLE_LIMITS.MAX_COLUMN_NAME_LENGTH} characters)`
    )
  }

  if (!NAME_PATTERN.test(column.name)) {
    errors.push(
      `Column name "${column.name}" must start with a letter or underscore, followed by alphanumerics or underscores`
    )
  }

  if (!COLUMN_TYPES.includes(column.type)) {
    errors.push(
      `Column "${column.name}" has invalid type "${column.type}". Valid types: ${COLUMN_TYPES.join(', ')}`
    )
  }

  return {
    valid: errors.length === 0,
    errors,
  }
}

/**
 * Validates table schema
 */
export function validateTableSchema(schema: TableSchema): ValidationResult {
  const errors: string[] = []

  if (!schema || typeof schema !== 'object') {
    errors.push('Schema is required')
    return { valid: false, errors }
  }

  if (!Array.isArray(schema.columns)) {
    errors.push('Schema must have a columns array')
    return { valid: false, errors }
  }

  if (schema.columns.length === 0) {
    errors.push('Schema must have at least one column')
  }

  if (schema.columns.length > TABLE_LIMITS.MAX_COLUMNS_PER_TABLE) {
    errors.push(`Schema exceeds maximum columns (${TABLE_LIMITS.MAX_COLUMNS_PER_TABLE})`)
  }

  // Validate each column
  for (const column of schema.columns) {
    const columnResult = validateColumnDefinition(column)
    errors.push(...columnResult.errors)
  }

  // Check for duplicate column names
  const columnNames = schema.columns.map((c) => c.name.toLowerCase())
  const uniqueNames = new Set(columnNames)
  if (uniqueNames.size !== columnNames.length) {
    errors.push('Duplicate column names found')
  }

  return {
    valid: errors.length === 0,
    errors,
  }
}

/**
 * Validates row size
 */
export function validateRowSize(data: Record<string, any>): ValidationResult {
  // Approximation: string length counts UTF-16 code units, not encoded bytes
  const size = JSON.stringify(data).length
  if (size > TABLE_LIMITS.MAX_ROW_SIZE_BYTES) {
    return {
      valid: false,
      errors: [`Row size exceeds limit (${size} bytes > ${TABLE_LIMITS.MAX_ROW_SIZE_BYTES} bytes)`],
    }
  }
  return { valid: true, errors: [] }
}

/**
 * Validates row data against schema
 */
export function validateRowAgainstSchema(
  data: Record<string, any>,
  schema: TableSchema
): ValidationResult {
  const errors: string[] = []

  for (const column of schema.columns) {
    const value = data[column.name]

    // Check required fields
    if (column.required && (value === undefined || value === null)) {
      errors.push(`Missing required field: ${column.name}`)
      continue
    }

    // Skip type validation if value is null/undefined for optional fields
    if (value === null || value === undefined) continue

    // Type validation
    switch (column.type) {
      case 'string':
        if (typeof value !== 'string') {
          errors.push(`${column.name} must be string, got ${typeof value}`)
        } else if (value.length > TABLE_LIMITS.MAX_STRING_VALUE_LENGTH) {
          errors.push(`${column.name} exceeds max string length`)
        }
        break
      case 'number':
        if (typeof value !== 'number' || Number.isNaN(value)) {
          errors.push(`${column.name} must be number`)
        }
        break
      case 'boolean':
        if (typeof value !== 'boolean') {
          errors.push(`${column.name} must be boolean`)
        }
        break
      case 'date':
        if (!(value instanceof Date) && Number.isNaN(Date.parse(value))) {
          errors.push(`${column.name} must be valid date`)
        }
        break
      case 'json':
        try {
          JSON.stringify(value)
        } catch {
          errors.push(`${column.name} must be valid JSON`)
        }
        break
    }
  }

  return {
    valid: errors.length === 0,
    errors,
  }
}
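The required-field and type checks at the core of `validateRowAgainstSchema` can be shown with a trimmed, standalone version (illustrative only; it covers just the primitive types, not the date/json cases, and the sample schema is made up):

```typescript
// Trimmed standalone version of the required/type checks above.
type Column = { name: string; type: 'string' | 'number' | 'boolean'; required?: boolean }

function checkRow(data: Record<string, any>, columns: Column[]): string[] {
  const errors: string[] = []
  for (const col of columns) {
    const value = data[col.name]
    // Required fields must be present and non-null
    if (col.required && (value === undefined || value === null)) {
      errors.push(`Missing required field: ${col.name}`)
      continue
    }
    // Optional fields may be absent; only validate present values
    if (value === null || value === undefined) continue
    if (typeof value !== col.type) {
      errors.push(`${col.name} must be ${col.type}, got ${typeof value}`)
    }
  }
  return errors
}

const schema: Column[] = [
  { name: 'email', type: 'string', required: true },
  { name: 'age', type: 'number' },
]
console.log(checkRow({ email: 'a@b.com', age: 30 }, schema)) // []
console.log(checkRow({ age: 'thirty' }, schema)) // missing email + wrong age type
```

Collecting every error rather than failing fast is what lets the batch-insert error extractor report all problems for a row at once.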
@@ -57,6 +57,54 @@ const ERROR_EXTRACTORS: ErrorExtractorConfig[] = [
    examples: ['Various REST APIs'],
    extract: (errorInfo) => errorInfo?.data?.details?.[0]?.message,
  },
  {
    id: 'details-string-array',
    description: 'Details array containing strings (validation errors)',
    examples: ['Table API', 'Validation APIs'],
    extract: (errorInfo) => {
      const details = errorInfo?.data?.details
      if (!Array.isArray(details) || details.length === 0) return undefined

      // Check if it's an array of strings
      if (details.every((d) => typeof d === 'string')) {
        const errorMessage = errorInfo?.data?.error || 'Validation failed'
        return `${errorMessage}: ${details.join('; ')}`
      }

      return undefined
    },
  },
  {
    id: 'batch-validation-errors',
    description: 'Batch validation errors with row numbers and error arrays',
    examples: ['Table Batch Insert'],
    extract: (errorInfo) => {
      const details = errorInfo?.data?.details
      if (!Array.isArray(details) || details.length === 0) return undefined

      // Check if it's an array of objects with row numbers and errors
      if (
        details.every(
          (d) =>
            typeof d === 'object' &&
            d !== null &&
            'row' in d &&
            'errors' in d &&
            Array.isArray(d.errors)
        )
      ) {
        const errorMessage = errorInfo?.data?.error || 'Validation failed'
        const rowErrors = details
          .map((detail: { row: number; errors: string[] }) => {
            return `Row ${detail.row}: ${detail.errors.join(', ')}`
          })
          .join('; ')
        return `${errorMessage}: ${rowErrors}`
      }

      return undefined
    },
  },
  {
    id: 'hunter-errors',
    description: 'Hunter API error details',
@@ -176,6 +224,8 @@ export const ErrorExtractorId = {
  GRAPHQL_ERRORS: 'graphql-errors',
  TWITTER_ERRORS: 'twitter-errors',
  DETAILS_ARRAY: 'details-array',
  DETAILS_STRING_ARRAY: 'details-string-array',
  BATCH_VALIDATION_ERRORS: 'batch-validation-errors',
  HUNTER_ERRORS: 'hunter-errors',
  ERRORS_ARRAY_STRING: 'errors-array-string',
  TELEGRAM_DESCRIPTION: 'telegram-description',
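The formatting logic of the `batch-validation-errors` extractor, pulled out as a standalone sketch (the `formatBatchValidationError` helper name is hypothetical; the real extractor reads these values off `errorInfo.data`):

```typescript
interface RowError {
  row: number
  errors: string[]
}

// Joins per-row validation errors into the single message the extractor returns:
// "<error>: Row N: e1, e2; Row M: e3"
function formatBatchValidationError(error: string | undefined, details: RowError[]): string {
  const message = error || 'Validation failed'
  const rowErrors = details.map((d) => `Row ${d.row}: ${d.errors.join(', ')}`).join('; ')
  return `${message}: ${rowErrors}`
}
```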
@@ -1270,6 +1270,7 @@ import {
  spotifyUnfollowPlaylistTool,
  spotifyUpdatePlaylistTool,
} from '@/tools/spotify'
import { sqsSendTool } from '@/tools/sqs'
import {
  sshCheckCommandExistsTool,
  sshCheckFileExistsTool,
@@ -1369,6 +1370,18 @@ import {
  supabaseUpsertTool,
  supabaseVectorSearchTool,
} from '@/tools/supabase'
import {
  tableBatchInsertRowsTool,
  tableCreateTool,
  tableDeleteRowsByFilterTool,
  tableDeleteRowTool,
  tableGetRowTool,
  tableInsertRowTool,
  tableListTool,
  tableQueryRowsTool,
  tableUpdateRowsByFilterTool,
  tableUpdateRowTool,
} from '@/tools/table'
import { tavilyCrawlTool, tavilyExtractTool, tavilyMapTool, tavilySearchTool } from '@/tools/tavily'
import {
  telegramDeleteMessageTool,
@@ -1530,7 +1543,6 @@ import {
  zoomListRecordingsTool,
  zoomUpdateMeetingTool,
} from '@/tools/zoom'
import { sqsSendTool } from './sqs'

// Registry of all available tools
export const tools: Record<string, ToolConfig> = {
@@ -2702,6 +2714,16 @@ export const tools: Record<string, ToolConfig> = {
  salesforce_describe_object: salesforceDescribeObjectTool,
  salesforce_list_objects: salesforceListObjectsTool,
  sqs_send: sqsSendTool,
  table_create: tableCreateTool,
  table_list: tableListTool,
  table_insert_row: tableInsertRowTool,
  table_batch_insert_rows: tableBatchInsertRowsTool,
  table_update_row: tableUpdateRowTool,
  table_update_rows_by_filter: tableUpdateRowsByFilterTool,
  table_delete_row: tableDeleteRowTool,
  table_delete_rows_by_filter: tableDeleteRowsByFilterTool,
  table_query_rows: tableQueryRowsTool,
  table_get_row: tableGetRowTool,
  mailchimp_get_audiences: mailchimpGetAudiencesTool,
  mailchimp_get_audience: mailchimpGetAudienceTool,
  mailchimp_create_audience: mailchimpCreateAudienceTool,
93  apps/sim/tools/table/batch-insert-rows.ts  Normal file
@@ -0,0 +1,93 @@
import type { ToolConfig } from '@/tools/types'
import type { TableBatchInsertParams, TableBatchInsertResponse } from './types'

export const tableBatchInsertRowsTool: ToolConfig<
  TableBatchInsertParams,
  TableBatchInsertResponse
> = {
  id: 'table_batch_insert_rows',
  name: 'Batch Insert Rows',
  description: 'Insert multiple rows into a table at once (up to 1000 rows)',
  version: '1.0.0',

  params: {
    tableId: {
      type: 'string',
      required: true,
      description: 'Table ID',
      visibility: 'user-or-llm',
    },
    rows: {
      type: 'array',
      required: true,
      description: 'Array of row data objects (max 1000 rows)',
      visibility: 'user-or-llm',
    },
  },

  request: {
    url: (params: any) => `/api/table/${params.tableId}/rows`,
    method: 'POST',
    headers: () => ({
      'Content-Type': 'application/json',
    }),
    body: (params: any) => {
      const workspaceId = params._context?.workspaceId
      if (!workspaceId) {
        throw new Error('workspaceId is required in execution context')
      }

      return {
        rows: params.rows,
        workspaceId,
      }
    },
  },

  transformResponse: async (response): Promise<TableBatchInsertResponse> => {
    const data = await response.json()

    if (!response.ok) {
      let errorMessage = data.error || 'Failed to batch insert rows'

      // Include details if present
      if (data.details) {
        if (Array.isArray(data.details) && data.details.length > 0) {
          // Check if details is array of error objects with row numbers
          if (typeof data.details[0] === 'object' && 'row' in data.details[0]) {
            const errorSummary = data.details
              .map((detail: { row: number; errors: string[] }) => {
                const rowErrors = detail.errors.join(', ')
                return `Row ${detail.row}: ${rowErrors}`
              })
              .join('; ')
            errorMessage = `${errorMessage}: ${errorSummary}`
          } else {
            // Simple array of strings
            errorMessage = `${errorMessage}: ${data.details.join('; ')}`
          }
        } else if (typeof data.details === 'string') {
          errorMessage = `${errorMessage}: ${data.details}`
        }
      }

      throw new Error(errorMessage)
    }

    return {
      success: true,
      output: {
        rows: data.rows,
        insertedCount: data.insertedCount,
        message: data.message || 'Rows inserted successfully',
      },
    }
  },

  outputs: {
    success: { type: 'boolean', description: 'Whether rows were inserted' },
    rows: { type: 'array', description: 'Inserted rows data' },
    insertedCount: { type: 'number', description: 'Number of rows inserted' },
    message: { type: 'string', description: 'Status message' },
  },
}
73  apps/sim/tools/table/create.ts  Normal file
@@ -0,0 +1,73 @@
import type { ToolConfig } from '@/tools/types'
import type { TableCreateParams, TableCreateResponse } from './types'

export const tableCreateTool: ToolConfig<TableCreateParams, TableCreateResponse> = {
  id: 'table_create',
  name: 'Create Table',
  description: 'Create a new user-defined table with schema',
  version: '1.0.0',

  params: {
    name: {
      type: 'string',
      required: true,
      description: 'Table name (alphanumeric, underscores, 1-50 chars)',
      visibility: 'user-or-llm',
    },
    description: {
      type: 'string',
      required: false,
      description: 'Optional table description',
      visibility: 'user-or-llm',
    },
    schema: {
      type: 'object',
      required: true,
      description: 'Table schema with column definitions',
      visibility: 'user-or-llm',
    },
  },

  request: {
    url: '/api/table',
    method: 'POST',
    headers: () => ({
      'Content-Type': 'application/json',
    }),
    body: (params) => {
      const workspaceId = params._context?.workspaceId
      if (!workspaceId) {
        throw new Error('workspaceId is required in execution context')
      }

      return {
        name: params.name,
        description: params.description,
        schema: params.schema,
        workspaceId,
      }
    },
  },

  transformResponse: async (response): Promise<TableCreateResponse> => {
    const data = await response.json()

    if (!response.ok) {
      throw new Error(data.error || 'Failed to create table')
    }

    return {
      success: true,
      output: {
        table: data.table,
        message: data.message || 'Table created successfully',
      },
    }
  },

  outputs: {
    success: { type: 'boolean', description: 'Whether table was created' },
    table: { type: 'json', description: 'Created table metadata' },
    message: { type: 'string', description: 'Status message' },
  },
}
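A hypothetical payload for `table_create`, matching the param shapes above (the table and column names are made up for illustration; the executor injects `_context` separately):

```typescript
// Example arguments a caller or LLM might supply to table_create.
const createParams = {
  name: 'contacts',
  description: 'CRM contacts synced from workflows',
  schema: {
    columns: [
      { name: 'email', type: 'string', required: true },
      { name: 'age', type: 'number' },
      { name: 'active', type: 'boolean' },
    ],
  },
}
```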
64  apps/sim/tools/table/delete-row.ts  Normal file
@@ -0,0 +1,64 @@
import type { ToolConfig } from '@/tools/types'
import type { TableDeleteResponse, TableRowDeleteParams } from './types'

export const tableDeleteRowTool: ToolConfig<TableRowDeleteParams, TableDeleteResponse> = {
  id: 'table_delete_row',
  name: 'Delete Row',
  description: 'Delete a row from a table',
  version: '1.0.0',

  params: {
    tableId: {
      type: 'string',
      required: true,
      description: 'Table ID',
      visibility: 'user-or-llm',
    },
    rowId: {
      type: 'string',
      required: true,
      description: 'Row ID to delete',
      visibility: 'user-or-llm',
    },
  },

  request: {
    url: (params: any) => `/api/table/${params.tableId}/rows/${params.rowId}`,
    method: 'DELETE',
    headers: () => ({
      'Content-Type': 'application/json',
    }),
    body: (params: any) => {
      const workspaceId = params._context?.workspaceId
      if (!workspaceId) {
        throw new Error('workspaceId is required in execution context')
      }

      return {
        workspaceId,
      }
    },
  },

  transformResponse: async (response): Promise<TableDeleteResponse> => {
    const data = await response.json()

    if (!response.ok) {
      throw new Error(data.error || 'Failed to delete row')
    }

    return {
      success: true,
      output: {
        deletedCount: data.deletedCount,
        message: data.message || 'Row deleted successfully',
      },
    }
  },

  outputs: {
    success: { type: 'boolean', description: 'Whether row was deleted' },
    deletedCount: { type: 'number', description: 'Number of rows deleted' },
    message: { type: 'string', description: 'Status message' },
  },
}
78  apps/sim/tools/table/delete-rows-by-filter.ts  Normal file
@@ -0,0 +1,78 @@
import type { ToolConfig } from '@/tools/types'
import type { TableBulkOperationResponse, TableDeleteByFilterParams } from './types'

export const tableDeleteRowsByFilterTool: ToolConfig<
  TableDeleteByFilterParams,
  TableBulkOperationResponse
> = {
  id: 'table_delete_rows_by_filter',
  name: 'Delete Rows by Filter',
  description:
    'Delete multiple rows that match filter criteria. Use with caution - supports optional limit for safety.',
  version: '1.0.0',

  params: {
    tableId: {
      type: 'string',
      required: true,
      description: 'Table ID',
      visibility: 'user-or-llm',
    },
    filter: {
      type: 'object',
      required: true,
      description: 'Filter criteria using operators like $eq, $ne, $gt, $lt, $contains, $in, etc.',
      visibility: 'user-or-llm',
    },
    limit: {
      type: 'number',
      required: false,
      description: 'Maximum number of rows to delete (default: no limit, max: 1000)',
      visibility: 'user-or-llm',
    },
  },

  request: {
    url: (params: any) => `/api/table/${params.tableId}/rows`,
    method: 'DELETE',
    headers: () => ({
      'Content-Type': 'application/json',
    }),
    body: (params: any) => {
      const workspaceId = params._context?.workspaceId
      if (!workspaceId) {
        throw new Error('workspaceId is required in execution context')
      }

      return {
        filter: params.filter,
        limit: params.limit,
        workspaceId,
      }
    },
  },

  transformResponse: async (response): Promise<TableBulkOperationResponse> => {
    const data = await response.json()

    if (!response.ok) {
      throw new Error(data.error || 'Failed to delete rows')
    }

    return {
      success: true,
      output: {
        deletedCount: data.deletedCount || 0,
        deletedRowIds: data.deletedRowIds || [],
        message: data.message || 'Rows deleted successfully',
      },
    }
  },

  outputs: {
    success: { type: 'boolean', description: 'Whether rows were deleted' },
    deletedCount: { type: 'number', description: 'Number of rows deleted' },
    deletedRowIds: { type: 'array', description: 'IDs of deleted rows' },
    message: { type: 'string', description: 'Status message' },
  },
}
61  apps/sim/tools/table/get-row.ts  Normal file
@@ -0,0 +1,61 @@
import type { ToolConfig } from '@/tools/types'
import type { TableRowGetParams, TableRowResponse } from './types'

export const tableGetRowTool: ToolConfig<TableRowGetParams, TableRowResponse> = {
  id: 'table_get_row',
  name: 'Get Row',
  description: 'Get a single row by ID',
  version: '1.0.0',

  params: {
    tableId: {
      type: 'string',
      required: true,
      description: 'Table ID',
      visibility: 'user-or-llm',
    },
    rowId: {
      type: 'string',
      required: true,
      description: 'Row ID to retrieve',
      visibility: 'user-or-llm',
    },
  },

  request: {
    url: (params: any) => {
      const workspaceId = params._context?.workspaceId
      if (!workspaceId) {
        throw new Error('workspaceId is required in execution context')
      }

      return `/api/table/${params.tableId}/rows/${params.rowId}?workspaceId=${encodeURIComponent(workspaceId)}`
    },
    method: 'GET',
    headers: () => ({
      'Content-Type': 'application/json',
    }),
  },

  transformResponse: async (response): Promise<TableRowResponse> => {
    const data = await response.json()

    if (!response.ok) {
      throw new Error(data.error || 'Failed to get row')
    }

    return {
      success: true,
      output: {
        row: data.row,
        message: 'Row retrieved successfully',
      },
    }
  },

  outputs: {
    success: { type: 'boolean', description: 'Whether row was retrieved' },
    row: { type: 'json', description: 'Row data' },
    message: { type: 'string', description: 'Status message' },
  },
}
11  apps/sim/tools/table/index.ts  Normal file
@@ -0,0 +1,11 @@
export * from './batch-insert-rows'
export * from './create'
export * from './delete-row'
export * from './delete-rows-by-filter'
export * from './get-row'
export * from './insert-row'
export * from './list'
export * from './query-rows'
export * from './types'
export * from './update-row'
export * from './update-rows-by-filter'
77  apps/sim/tools/table/insert-row.ts  Normal file
@@ -0,0 +1,77 @@
import type { ToolConfig } from '@/tools/types'
import type { TableRowInsertParams, TableRowResponse } from './types'

export const tableInsertRowTool: ToolConfig<TableRowInsertParams, TableRowResponse> = {
  id: 'table_insert_row',
  name: 'Insert Row',
  description: 'Insert a new row into a table',
  version: '1.0.0',

  params: {
    tableId: {
      type: 'string',
      required: true,
      description: 'Table ID',
      visibility: 'user-or-llm',
    },
    data: {
      type: 'object',
      required: true,
      description: 'Row data as JSON object',
      visibility: 'user-or-llm',
    },
  },

  request: {
    url: (params: any) => `/api/table/${params.tableId}/rows`,
    method: 'POST',
    headers: () => ({
      'Content-Type': 'application/json',
    }),
    body: (params: any) => {
      const workspaceId = params._context?.workspaceId
      if (!workspaceId) {
        throw new Error('workspaceId is required in execution context')
      }

      return {
        data: params.data,
        workspaceId,
      }
    },
  },

  transformResponse: async (response): Promise<TableRowResponse> => {
    const data = await response.json()

    if (!response.ok) {
      let errorMessage = data.error || 'Failed to insert row'

      // Include details array if present
      if (data.details) {
        if (Array.isArray(data.details) && data.details.length > 0) {
          const detailsStr = data.details.join('; ')
          errorMessage = `${errorMessage}: ${detailsStr}`
        } else if (typeof data.details === 'string') {
          errorMessage = `${errorMessage}: ${data.details}`
        }
      }

      throw new Error(errorMessage)
    }

    return {
      success: true,
      output: {
        row: data.row,
        message: data.message || 'Row inserted successfully',
      },
    }
  },

  outputs: {
    success: { type: 'boolean', description: 'Whether row was inserted' },
    row: { type: 'json', description: 'Inserted row data' },
    message: { type: 'string', description: 'Status message' },
  },
}
47  apps/sim/tools/table/list.ts  Normal file
@@ -0,0 +1,47 @@
import type { ToolConfig } from '@/tools/types'
import type { TableListParams, TableListResponse } from './types'

export const tableListTool: ToolConfig<TableListParams, TableListResponse> = {
  id: 'table_list',
  name: 'List Tables',
  description: 'List all tables in the workspace',
  version: '1.0.0',

  params: {},

  request: {
    url: (params: any) => {
      const workspaceId = params._context?.workspaceId
      if (!workspaceId) {
        throw new Error('workspaceId is required in execution context')
      }
      return `/api/table?workspaceId=${encodeURIComponent(workspaceId)}`
    },
    method: 'GET',
    headers: () => ({
      'Content-Type': 'application/json',
    }),
  },

  transformResponse: async (response): Promise<TableListResponse> => {
    const data = await response.json()

    if (!response.ok) {
      throw new Error(data.error || 'Failed to list tables')
    }

    return {
      success: true,
      output: {
        tables: data.tables,
        totalCount: data.totalCount,
      },
    }
  },

  outputs: {
    success: { type: 'boolean', description: 'Whether operation succeeded' },
    tables: { type: 'array', description: 'List of tables' },
    totalCount: { type: 'number', description: 'Total number of tables' },
  },
}
103  apps/sim/tools/table/query-rows.ts  Normal file
@@ -0,0 +1,103 @@
import type { ToolConfig } from '@/tools/types'
import type { TableQueryResponse, TableRowQueryParams } from './types'

export const tableQueryRowsTool: ToolConfig<TableRowQueryParams, TableQueryResponse> = {
  id: 'table_query_rows',
  name: 'Query Rows',
  description: 'Query rows from a table with filtering, sorting, and pagination',
  version: '1.0.0',

  params: {
    tableId: {
      type: 'string',
      required: true,
      description: 'Table ID',
      visibility: 'user-or-llm',
    },
    filter: {
      type: 'object',
      required: false,
      description:
        'Filter conditions (MongoDB-style operators: $eq, $ne, $gt, $gte, $lt, $lte, $in, $nin, $contains)',
      visibility: 'user-or-llm',
    },
    sort: {
      type: 'object',
      required: false,
      description: 'Sort order as {field: "asc"|"desc"}',
      visibility: 'user-or-llm',
    },
    limit: {
      type: 'number',
      required: false,
      description: 'Maximum rows to return (default: 100, max: 1000)',
      visibility: 'user-or-llm',
    },
    offset: {
      type: 'number',
      required: false,
      description: 'Number of rows to skip (default: 0)',
      visibility: 'user-or-llm',
    },
  },

  request: {
    url: (params: any) => {
      const workspaceId = params._context?.workspaceId
      if (!workspaceId) {
        throw new Error('workspaceId is required in execution context')
      }

      const searchParams = new URLSearchParams({
        workspaceId,
      })

      if (params.filter) {
        searchParams.append('filter', JSON.stringify(params.filter))
      }
      if (params.sort) {
        searchParams.append('sort', JSON.stringify(params.sort))
      }
      if (params.limit !== undefined) {
        searchParams.append('limit', String(params.limit))
      }
      if (params.offset !== undefined) {
        searchParams.append('offset', String(params.offset))
      }

      return `/api/table/${params.tableId}/rows?${searchParams.toString()}`
    },
    method: 'GET',
    headers: () => ({
      'Content-Type': 'application/json',
    }),
  },

  transformResponse: async (response): Promise<TableQueryResponse> => {
    const data = await response.json()

    if (!response.ok) {
      throw new Error(data.error || 'Failed to query rows')
    }

    return {
      success: true,
      output: {
        rows: data.rows,
        rowCount: data.rowCount,
        totalCount: data.totalCount,
        limit: data.limit,
        offset: data.offset,
      },
    }
  },

  outputs: {
    success: { type: 'boolean', description: 'Whether query succeeded' },
    rows: { type: 'array', description: 'Query result rows' },
    rowCount: { type: 'number', description: 'Number of rows returned' },
    totalCount: { type: 'number', description: 'Total rows matching filter' },
    limit: { type: 'number', description: 'Limit used in query' },
    offset: { type: 'number', description: 'Offset used in query' },
  },
}
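The URL construction in `table_query_rows` above, as a self-contained sketch (the `buildQueryUrl` name and the example IDs are hypothetical):

```typescript
// Builds the GET URL the same way the tool's request.url function does:
// workspaceId always present, filter/sort JSON-encoded, limit/offset optional.
function buildQueryUrl(
  tableId: string,
  workspaceId: string,
  opts: { filter?: object; sort?: object; limit?: number; offset?: number } = {}
): string {
  const searchParams = new URLSearchParams({ workspaceId })
  if (opts.filter) searchParams.append('filter', JSON.stringify(opts.filter))
  if (opts.sort) searchParams.append('sort', JSON.stringify(opts.sort))
  if (opts.limit !== undefined) searchParams.append('limit', String(opts.limit))
  if (opts.offset !== undefined) searchParams.append('offset', String(opts.offset))
  return `/api/table/${tableId}/rows?${searchParams.toString()}`
}
```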
186  apps/sim/tools/table/types.ts  Normal file
@@ -0,0 +1,186 @@
import type { ToolResponse } from '@/tools/types'

/**
 * Execution context provided by the workflow executor
 */
export interface ExecutionContext {
  workspaceId: string
  workflowId: string
  userId?: string
  executionId?: string
}

/**
 * Base type for tool parameters with execution context
 */
export interface ToolParamsWithContext {
  _context?: ExecutionContext
}

export type ColumnType = 'string' | 'number' | 'boolean' | 'date' | 'json'

export interface ColumnDefinition {
  name: string
  type: ColumnType
  required?: boolean
}

export interface TableSchema {
  columns: ColumnDefinition[]
}

export interface TableDefinition {
  id: string
  name: string
  description?: string
  schema: TableSchema
  rowCount: number
  maxRows: number
  createdAt: string
  updatedAt: string
}

export interface TableRow {
  id: string
  data: Record<string, any>
  createdAt: string
  updatedAt: string
}

export interface QueryFilter {
  [key: string]:
    | any
    | {
        $eq?: any
        $ne?: any
        $gt?: number
        $gte?: number
        $lt?: number
        $lte?: number
        $in?: any[]
        $nin?: any[]
        $contains?: string
      }
}

export interface TableCreateParams extends ToolParamsWithContext {
  name: string
  description?: string
  schema: TableSchema
  workspaceId?: string
}

export interface TableListParams extends ToolParamsWithContext {
  workspaceId?: string
}

export interface TableRowInsertParams extends ToolParamsWithContext {
  tableId: string
  data: Record<string, any>
  workspaceId?: string
}

export interface TableRowUpdateParams extends ToolParamsWithContext {
  tableId: string
  rowId: string
  data: Record<string, any>
  workspaceId?: string
}

export interface TableRowDeleteParams extends ToolParamsWithContext {
  tableId: string
  rowId: string
  workspaceId?: string
}

export interface TableRowQueryParams extends ToolParamsWithContext {
  tableId: string
  filter?: QueryFilter
  sort?: Record<string, 'asc' | 'desc'>
  limit?: number
  offset?: number
  workspaceId?: string
}

export interface TableRowGetParams extends ToolParamsWithContext {
  tableId: string
  rowId: string
  workspaceId?: string
}

export interface TableCreateResponse extends ToolResponse {
  output: {
    table: TableDefinition
    message: string
  }
}

export interface TableListResponse extends ToolResponse {
  output: {
    tables: TableDefinition[]
    totalCount: number
  }
}

export interface TableRowResponse extends ToolResponse {
  output: {
    row: TableRow
    message: string
  }
}

export interface TableQueryResponse extends ToolResponse {
  output: {
    rows: TableRow[]
    rowCount: number
    totalCount: number
    limit: number
    offset: number
  }
}

export interface TableDeleteResponse extends ToolResponse {
  output: {
    deletedCount: number
    message: string
  }
}

export interface TableBatchInsertParams extends ToolParamsWithContext {
  tableId: string
  rows: Record<string, any>[]
  workspaceId?: string
}

export interface TableBatchInsertResponse extends ToolResponse {
  output: {
    rows: TableRow[]
    insertedCount: number
    message: string
  }
}

export interface TableUpdateByFilterParams extends ToolParamsWithContext {
  tableId: string
  filter: QueryFilter
  data: Record<string, any>
  limit?: number
  workspaceId?: string
}

export interface TableDeleteByFilterParams extends ToolParamsWithContext {
  tableId: string
  filter: QueryFilter
  limit?: number
  workspaceId?: string
}

export interface TableBulkOperationResponse extends ToolResponse {
  output: {
    updatedCount?: number
    deletedCount?: number
    updatedRowIds?: string[]
    deletedRowIds?: string[]
    message: string
  }
}
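To make the `QueryFilter` semantics concrete, a minimal client-side evaluator sketch (illustrative only: the actual filtering runs server-side against JSONB, and `matchesFilter`/`matchesCondition` are hypothetical helpers):

```typescript
type FilterOp = {
  $eq?: any
  $ne?: any
  $gt?: number
  $gte?: number
  $lt?: number
  $lte?: number
  $in?: any[]
  $nin?: any[]
  $contains?: string
}

// A bare value is shorthand for { $eq: value }; an object applies each operator.
function matchesCondition(value: any, cond: any): boolean {
  if (cond === null || typeof cond !== 'object' || Array.isArray(cond)) {
    return value === cond
  }
  const op = cond as FilterOp
  if ('$eq' in op && value !== op.$eq) return false
  if ('$ne' in op && value === op.$ne) return false
  if ('$gt' in op && !(value > op.$gt!)) return false
  if ('$gte' in op && !(value >= op.$gte!)) return false
  if ('$lt' in op && !(value < op.$lt!)) return false
  if ('$lte' in op && !(value <= op.$lte!)) return false
  if ('$in' in op && !op.$in!.includes(value)) return false
  if ('$nin' in op && op.$nin!.includes(value)) return false
  if ('$contains' in op && !(typeof value === 'string' && value.includes(op.$contains!))) {
    return false
  }
  return true
}

// Every key in the filter must match the corresponding row field.
function matchesFilter(row: Record<string, any>, filter: Record<string, any>): boolean {
  return Object.entries(filter).every(([key, cond]) => matchesCondition(row[key], cond))
}
```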
83  apps/sim/tools/table/update-row.ts  Normal file
@@ -0,0 +1,83 @@
import type { ToolConfig } from '@/tools/types'
import type { TableRowResponse, TableRowUpdateParams } from './types'

export const tableUpdateRowTool: ToolConfig<TableRowUpdateParams, TableRowResponse> = {
  id: 'table_update_row',
  name: 'Update Row',
  description: 'Update an existing row in a table',
  version: '1.0.0',

  params: {
    tableId: {
      type: 'string',
      required: true,
      description: 'Table ID',
      visibility: 'user-or-llm',
    },
    rowId: {
      type: 'string',
      required: true,
      description: 'Row ID to update',
      visibility: 'user-or-llm',
    },
    data: {
      type: 'object',
      required: true,
      description: 'Updated row data',
      visibility: 'user-or-llm',
    },
  },

  request: {
    url: (params: any) => `/api/table/${params.tableId}/rows/${params.rowId}`,
    method: 'PATCH',
    headers: () => ({
      'Content-Type': 'application/json',
    }),
    body: (params: any) => {
      const workspaceId = params._context?.workspaceId
      if (!workspaceId) {
        throw new Error('workspaceId is required in execution context')
      }

      return {
        data: params.data,
        workspaceId,
      }
    },
  },

  transformResponse: async (response): Promise<TableRowResponse> => {
    const data = await response.json()

    if (!response.ok) {
      let errorMessage = data.error || 'Failed to update row'

      // Include details array if present
      if (data.details) {
        if (Array.isArray(data.details) && data.details.length > 0) {
          const detailsStr = data.details.join('; ')
          errorMessage = `${errorMessage}: ${detailsStr}`
        } else if (typeof data.details === 'string') {
          errorMessage = `${errorMessage}: ${data.details}`
        }
      }

      throw new Error(errorMessage)
    }

    return {
      success: true,
      output: {
        row: data.row,
        message: data.message || 'Row updated successfully',
      },
    }
  },

  outputs: {
    success: { type: 'boolean', description: 'Whether row was updated' },
    row: { type: 'json', description: 'Updated row data' },
    message: { type: 'string', description: 'Status message' },
  },
}
97
apps/sim/tools/table/update-rows-by-filter.ts
Normal file
97
apps/sim/tools/table/update-rows-by-filter.ts
Normal file
@@ -0,0 +1,97 @@
```ts
import type { ToolConfig } from '@/tools/types'
import type { TableBulkOperationResponse, TableUpdateByFilterParams } from './types'

export const tableUpdateRowsByFilterTool: ToolConfig<
  TableUpdateByFilterParams,
  TableBulkOperationResponse
> = {
  id: 'table_update_rows_by_filter',
  name: 'Update Rows by Filter',
  description:
    'Update multiple rows that match filter criteria. Data is merged with existing row data.',
  version: '1.0.0',

  params: {
    tableId: {
      type: 'string',
      required: true,
      description: 'Table ID',
      visibility: 'user-or-llm',
    },
    filter: {
      type: 'object',
      required: true,
      description: 'Filter criteria using operators like $eq, $ne, $gt, $lt, $contains, $in, etc.',
      visibility: 'user-or-llm',
    },
    data: {
      type: 'object',
      required: true,
      description: 'Fields to update (merged with existing data)',
      visibility: 'user-or-llm',
    },
    limit: {
      type: 'number',
      required: false,
      description: 'Maximum number of rows to update (default: no limit, max: 1000)',
      visibility: 'user-or-llm',
    },
  },

  request: {
    url: (params: any) => `/api/table/${params.tableId}/rows`,
    method: 'PUT',
    headers: () => ({
      'Content-Type': 'application/json',
    }),
    body: (params: any) => {
      const workspaceId = params._context?.workspaceId
      if (!workspaceId) {
        throw new Error('workspaceId is required in execution context')
      }

      return {
        filter: params.filter,
        data: params.data,
        limit: params.limit,
        workspaceId,
      }
    },
  },

  transformResponse: async (response): Promise<TableBulkOperationResponse> => {
    const data = await response.json()

    if (!response.ok) {
      let errorMessage = data.error || 'Failed to update rows'

      // Include details array if present
      if (data.details) {
        if (Array.isArray(data.details) && data.details.length > 0) {
          const detailsStr = data.details.join('; ')
          errorMessage = `${errorMessage}: ${detailsStr}`
        } else if (typeof data.details === 'string') {
          errorMessage = `${errorMessage}: ${data.details}`
        }
      }

      throw new Error(errorMessage)
    }

    return {
      success: true,
      output: {
        updatedCount: data.updatedCount || 0,
        updatedRowIds: data.updatedRowIds || [],
        message: data.message || 'Rows updated successfully',
      },
    }
  },

  outputs: {
    success: { type: 'boolean', description: 'Whether rows were updated' },
    updatedCount: { type: 'number', description: 'Number of rows updated' },
    updatedRowIds: { type: 'array', description: 'IDs of updated rows' },
    message: { type: 'string', description: 'Status message' },
  },
}
```
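The `filter` parameter names Mongo-style operators (`$eq`, `$ne`, `$gt`, `$lt`, `$contains`, `$in`). The actual evaluation happens server-side; the sketch below shows one plausible reading of those operator semantics, where a bare value is shorthand for `$eq`. The exact coercion rules are assumptions, not the API's confirmed behavior.

```typescript
type FilterCondition = {
  $eq?: unknown
  $ne?: unknown
  $gt?: number
  $lt?: number
  $contains?: string
  $in?: unknown[]
}

type Filter = Record<string, unknown>

// Returns true when every field condition in the filter holds for the row.
function matchesFilter(row: Record<string, unknown>, filter: Filter): boolean {
  return Object.entries(filter).every(([field, cond]) => {
    const value = row[field]
    // A bare value is treated as shorthand for { $eq: value }
    if (cond === null || typeof cond !== 'object' || Array.isArray(cond)) {
      return value === cond
    }
    const c = cond as FilterCondition
    if ('$eq' in c && value !== c.$eq) return false
    if ('$ne' in c && value === c.$ne) return false
    if ('$gt' in c && !(typeof value === 'number' && value > (c.$gt as number))) return false
    if ('$lt' in c && !(typeof value === 'number' && value < (c.$lt as number))) return false
    if ('$contains' in c && !(typeof value === 'string' && value.includes(c.$contains as string)))
      return false
    if ('$in' in c && !(c.$in as unknown[]).includes(value)) return false
    return true
  })
}
```

Conditions on the same field are ANDed, as are conditions across fields; whether the real API also supports `$or`-style combinators is not specified here.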
packages/db/migrations/0140_awesome_killer_shrike.sql (new file, +36 lines)
```sql
CREATE TABLE "user_table_definitions" (
	"id" text PRIMARY KEY NOT NULL,
	"workspace_id" text NOT NULL,
	"name" text NOT NULL,
	"description" text,
	"schema" jsonb NOT NULL,
	"max_rows" integer DEFAULT 10000 NOT NULL,
	"row_count" integer DEFAULT 0 NOT NULL,
	"created_by" text NOT NULL,
	"created_at" timestamp DEFAULT now() NOT NULL,
	"updated_at" timestamp DEFAULT now() NOT NULL,
	"deleted_at" timestamp
);
--> statement-breakpoint
CREATE TABLE "user_table_rows" (
	"id" text PRIMARY KEY NOT NULL,
	"table_id" text NOT NULL,
	"workspace_id" text NOT NULL,
	"data" jsonb NOT NULL,
	"created_at" timestamp DEFAULT now() NOT NULL,
	"updated_at" timestamp DEFAULT now() NOT NULL,
	"created_by" text
);
--> statement-breakpoint
ALTER TABLE "user_table_definitions" ADD CONSTRAINT "user_table_definitions_workspace_id_workspace_id_fk" FOREIGN KEY ("workspace_id") REFERENCES "public"."workspace"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "user_table_definitions" ADD CONSTRAINT "user_table_definitions_created_by_user_id_fk" FOREIGN KEY ("created_by") REFERENCES "public"."user"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "user_table_rows" ADD CONSTRAINT "user_table_rows_table_id_user_table_definitions_id_fk" FOREIGN KEY ("table_id") REFERENCES "public"."user_table_definitions"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "user_table_rows" ADD CONSTRAINT "user_table_rows_workspace_id_workspace_id_fk" FOREIGN KEY ("workspace_id") REFERENCES "public"."workspace"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "user_table_rows" ADD CONSTRAINT "user_table_rows_created_by_user_id_fk" FOREIGN KEY ("created_by") REFERENCES "public"."user"("id") ON DELETE set null ON UPDATE no action;--> statement-breakpoint
CREATE INDEX "user_table_def_workspace_id_idx" ON "user_table_definitions" USING btree ("workspace_id");--> statement-breakpoint
CREATE UNIQUE INDEX "user_table_def_workspace_name_unique" ON "user_table_definitions" USING btree ("workspace_id","name");--> statement-breakpoint
CREATE INDEX "user_table_def_deleted_at_idx" ON "user_table_definitions" USING btree ("deleted_at");--> statement-breakpoint
CREATE INDEX "user_table_rows_table_id_idx" ON "user_table_rows" USING btree ("table_id");--> statement-breakpoint
CREATE INDEX "user_table_rows_workspace_id_idx" ON "user_table_rows" USING btree ("workspace_id");--> statement-breakpoint
CREATE INDEX "user_table_rows_data_gin_idx" ON "user_table_rows" USING gin ("data");--> statement-breakpoint
CREATE INDEX "user_table_rows_workspace_table_idx" ON "user_table_rows" USING btree ("workspace_id","table_id");
```
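The GIN index on `data` is there to accelerate JSONB containment lookups (`data @> '{"field": "value"}'`). The hypothetical helper below sketches the parameterized query shape that index serves; the table and column names come from the migration, but `buildContainmentQuery` itself is illustrative and not part of the codebase.

```typescript
// Builds a parameterized Postgres query that finds rows whose JSONB "data"
// contains the given key/value pairs. Casting $3 to jsonb lets the planner
// use the GIN index on "data" for the @> containment operator.
function buildContainmentQuery(
  tableId: string,
  workspaceId: string,
  match: Record<string, unknown>
): { text: string; values: unknown[] } {
  return {
    text:
      'SELECT id, data FROM user_table_rows ' +
      'WHERE workspace_id = $1 AND table_id = $2 AND data @> $3::jsonb',
    values: [workspaceId, tableId, JSON.stringify(match)],
  }
}
```

Containment only covers equality matches; range operators like `$gt`/`$lt` would still need expression predicates (e.g. on `(data->>'age')::numeric`) that the GIN index does not serve.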
packages/db/migrations/meta/0140_snapshot.json (new file, 10046 lines; diff suppressed because it is too large)
@@ -974,6 +974,13 @@

```json
    "when": 1768260112533,
    "tag": "0139_late_cargill",
    "breakpoints": true
  },
  {
    "idx": 140,
    "version": "7",
    "when": 1768267681365,
    "tag": "0140_awesome_killer_shrike",
    "breakpoints": true
  }
]
}
```
@@ -2106,3 +2106,66 @@ export const permissionGroupMember = pgTable(

```ts
    userIdUnique: uniqueIndex('permission_group_member_user_id_unique').on(table.userId),
  })
)

/**
 * User-defined table definitions
 * Stores schema and metadata for custom tables created by users
 */
export const userTableDefinitions = pgTable(
  'user_table_definitions',
  {
    id: text('id').primaryKey(), // tbl_xxxxx
    workspaceId: text('workspace_id')
      .notNull()
      .references(() => workspace.id, { onDelete: 'cascade' }),
    name: text('name').notNull(),
    description: text('description'),
    schema: jsonb('schema').notNull(), // {columns: [{name, type, required}]}
    maxRows: integer('max_rows').notNull().default(10000),
    rowCount: integer('row_count').notNull().default(0),
    createdBy: text('created_by')
      .notNull()
      .references(() => user.id, { onDelete: 'cascade' }),
    createdAt: timestamp('created_at').notNull().defaultNow(),
    updatedAt: timestamp('updated_at').notNull().defaultNow(),
    deletedAt: timestamp('deleted_at'),
  },
  (table) => ({
    workspaceIdIdx: index('user_table_def_workspace_id_idx').on(table.workspaceId),
    workspaceNameUnique: uniqueIndex('user_table_def_workspace_name_unique').on(
      table.workspaceId,
      table.name
    ),
    deletedAtIdx: index('user_table_def_deleted_at_idx').on(table.deletedAt),
  })
)

/**
 * User-defined table rows
 * Stores actual row data as JSONB for flexible schema
 */
export const userTableRows = pgTable(
  'user_table_rows',
  {
    id: text('id').primaryKey(), // row_xxxxx
    tableId: text('table_id')
      .notNull()
      .references(() => userTableDefinitions.id, { onDelete: 'cascade' }),
    workspaceId: text('workspace_id')
      .notNull()
      .references(() => workspace.id, { onDelete: 'cascade' }),
    data: jsonb('data').notNull(),
    createdAt: timestamp('created_at').notNull().defaultNow(),
    updatedAt: timestamp('updated_at').notNull().defaultNow(),
    createdBy: text('created_by').references(() => user.id, { onDelete: 'set null' }),
  },
  (table) => ({
    tableIdIdx: index('user_table_rows_table_id_idx').on(table.tableId),
    workspaceIdIdx: index('user_table_rows_workspace_id_idx').on(table.workspaceId),
    dataGinIdx: index('user_table_rows_data_gin_idx').using('gin', table.data),
    workspaceTableIdx: index('user_table_rows_workspace_table_idx').on(
      table.workspaceId,
      table.tableId
    ),
  })
)
```
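The plan relies on application-level schema enforcement: rows live in a free-form JSONB column, so validation against the `schema` stored on `user_table_definitions` (`{columns: [{name, type, required}]}`) must happen in code before insert or update. A minimal sketch, assuming column types are limited to `string`/`number`/`boolean` (the real set of supported types is not specified in this diff):

```typescript
type ColumnDef = { name: string; type: 'string' | 'number' | 'boolean'; required?: boolean }
type TableSchema = { columns: ColumnDef[] }

// Returns a list of validation errors; an empty list means the row conforms.
function validateRow(schema: TableSchema, row: Record<string, unknown>): string[] {
  const errors: string[] = []
  const known = new Set(schema.columns.map((c) => c.name))

  for (const col of schema.columns) {
    const value = row[col.name]
    if (value === undefined || value === null) {
      if (col.required) errors.push(`Missing required column "${col.name}"`)
      continue
    }
    if (typeof value !== col.type) {
      errors.push(`Column "${col.name}" expected ${col.type}, got ${typeof value}`)
    }
  }
  // Reject keys not declared in the schema so JSONB rows stay shape-consistent
  for (const key of Object.keys(row)) {
    if (!known.has(key)) errors.push(`Unknown column "${key}"`)
  }
  return errors
}
```

An insert handler would run this against the table's stored schema and return the error list with a 400 response, which is also where the `details` array consumed by `transformResponse` above would plausibly originate.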