Mirror of https://github.com/simstudioai/sim.git
Synced 2026-01-11 07:58:06 -05:00

Compare commits: 23 commits
| Author | SHA1 | Date |
|---|---|---|
| | f5ab7f21ae | |
| | 02229f0cb2 | |
| | a2451ef3d3 | |
| | 6a262f3988 | |
| | 5145ce1684 | |
| | e5bd5e4474 | |
| | e9aede087d | |
| | bfb6fffe38 | |
| | ba2377f83b | |
| | f502f984f3 | |
| | 74f371cc79 | |
| | 4fbec0a43f | |
| | d248557042 | |
| | 8215a819e5 | |
| | 155f544ce8 | |
| | 22f949a41c | |
| | f9aef6ae22 | |
| | 46b04a964d | |
| | 964b40de45 | |
| | 75aca00b6e | |
| | d25084e05d | |
| | 445932c1c8 | |
| | cc3f565d5e | |
89 apps/docs/content/docs/de/blocks/webhook.mdx Normal file
@@ -0,0 +1,89 @@
---
title: Webhook
---

import { Callout } from 'fumadocs-ui/components/callout'
import { Image } from '@/components/ui/image'

Der Webhook-Block sendet HTTP-POST-Anfragen an externe Webhook-Endpunkte mit automatischen Webhook-Headern und optionaler HMAC-Signierung.

<div className="flex justify-center">
  <Image
    src="/static/blocks/webhook.png"
    alt="Webhook-Block"
    width={500}
    height={400}
    className="my-6"
  />
</div>

## Konfiguration

### Webhook-URL

Der Ziel-Endpunkt für Ihre Webhook-Anfrage. Unterstützt sowohl statische URLs als auch dynamische Werte aus anderen Blöcken.

### Payload

JSON-Daten, die im Anfrage-Body gesendet werden. Verwenden Sie den KI-Zauberstab, um Payloads zu generieren oder auf Workflow-Variablen zu verweisen:

```json
{
  "event": "workflow.completed",
  "data": {
    "result": "<agent.content>",
    "timestamp": "<function.result>"
  }
}
```

### Signierungsgeheimnis

Optionales Geheimnis für die HMAC-SHA256-Payload-Signierung. Wenn angegeben, wird ein `X-Webhook-Signature`-Header hinzugefügt:

```
X-Webhook-Signature: t=1704067200000,v1=5d41402abc4b2a76b9719d911017c592...
```

Um Signaturen zu verifizieren, berechnen Sie `HMAC-SHA256(secret, "${timestamp}.${body}")` und vergleichen Sie das Ergebnis mit dem `v1`-Wert.
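
Der Verifikationsschritt lässt sich z. B. in Node.js skizzieren. Die Hilfsfunktion `verifyWebhookSignature` und das Parsen des Header-Formats `t=...,v1=...` sind illustrativ und nicht Teil der Sim-API:

```typescript
import { createHmac, timingSafeEqual } from 'node:crypto'

// Illustrative verifier: parse "t=<ms>,v1=<hex>" and recompute
// HMAC-SHA256(secret, `${timestamp}.${body}`) as described above.
function verifyWebhookSignature(header: string, body: string, secret: string): boolean {
  const parts = new Map(header.split(',').map((kv) => kv.split('=') as [string, string]))
  const timestamp = parts.get('t')
  const received = parts.get('v1') ?? ''
  if (!timestamp) return false
  const expected = createHmac('sha256', secret).update(`${timestamp}.${body}`).digest('hex')
  const a = Buffer.from(expected, 'hex')
  const b = Buffer.from(received, 'hex')
  // Length check first: timingSafeEqual throws on buffers of unequal length
  return a.length === b.length && timingSafeEqual(a, b)
}
```

Vergleichen Sie dabei die rohen Request-Body-Bytes, nicht ein erneut serialisiertes JSON-Objekt, da jede Formatierungsänderung den Hash verändert.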

### Zusätzliche Header

Benutzerdefinierte Schlüssel-Wert-Header, die in die Anfrage aufgenommen werden. Diese überschreiben alle automatischen Header mit demselben Namen.

## Automatische Header

Jede Anfrage enthält automatisch diese Header:

| Header | Beschreibung |
|--------|-------------|
| `Content-Type` | `application/json` |
| `X-Webhook-Timestamp` | Unix-Zeitstempel in Millisekunden |
| `X-Delivery-ID` | Eindeutige UUID für diese Zustellung |
| `Idempotency-Key` | Identisch mit `X-Delivery-ID` zur Deduplizierung |

## Ausgaben

| Ausgabe | Typ | Beschreibung |
|--------|------|-------------|
| `data` | json | Antwort-Body vom Endpunkt |
| `status` | number | HTTP-Statuscode |
| `headers` | object | Antwort-Header |

## Beispiel-Anwendungsfälle

**Externe Dienste benachrichtigen** - Workflow-Ergebnisse an Slack, Discord oder benutzerdefinierte Endpunkte senden

```
Agent → Function (format) → Webhook (notify)
```

**Externe Workflows auslösen** - Prozesse in anderen Systemen starten, wenn Bedingungen erfüllt sind

```
Condition (check) → Webhook (trigger) → Response
```

<Callout>
  Der Webhook-Block verwendet immer POST. Für andere HTTP-Methoden oder mehr Kontrolle verwenden Sie den [API-Block](/blocks/api).
</Callout>

@@ -1,231 +0,0 @@
---
title: Webhook
description: Empfangen Sie Webhooks von jedem Dienst durch Konfiguration eines benutzerdefinierten Webhooks.
---

import { BlockInfoCard } from "@/components/ui/block-info-card"
import { Image } from '@/components/ui/image'

<BlockInfoCard
  type="generic_webhook"
  color="#10B981"
/>

<div className="flex justify-center">
  <Image
    src="/static/blocks/webhook.png"
    alt="Webhook-Block-Konfiguration"
    width={500}
    height={400}
    className="my-6"
  />
</div>

## Übersicht

Der generische Webhook-Block ermöglicht den Empfang von Webhooks von jedem externen Dienst. Dies ist ein flexibler Trigger, der jede JSON-Nutzlast verarbeiten kann und sich daher ideal für die Integration mit Diensten eignet, die keinen dedizierten Sim-Block haben.

## Grundlegende Verwendung

### Einfacher Durchleitungsmodus

Ohne ein definiertes Eingabeformat leitet der Webhook den gesamten Anforderungstext unverändert weiter:

```bash
curl -X POST https://sim.ai/api/webhooks/trigger/{webhook-path} \
  -H "Content-Type: application/json" \
  -H "X-Sim-Secret: your-secret" \
  -d '{
    "message": "Test webhook trigger",
    "data": {
      "key": "value"
    }
  }'
```

Greifen Sie in nachgelagerten Blöcken auf die Daten zu mit:

- `<webhook1.message>` → "Test webhook trigger"
- `<webhook1.data.key>` → "value"

### Strukturiertes Eingabeformat (optional)

Definieren Sie ein Eingabeschema, um typisierte Felder zu erhalten und erweiterte Funktionen wie Datei-Uploads zu aktivieren:

**Konfiguration des Eingabeformats:**

```json
[
  { "name": "message", "type": "string" },
  { "name": "priority", "type": "number" },
  { "name": "documents", "type": "files" }
]
```

**Webhook-Anfrage:**

```bash
curl -X POST https://sim.ai/api/webhooks/trigger/{webhook-path} \
  -H "Content-Type: application/json" \
  -H "X-Sim-Secret: your-secret" \
  -d '{
    "message": "Invoice submission",
    "priority": 1,
    "documents": [
      {
        "type": "file",
        "data": "data:application/pdf;base64,JVBERi0xLjQK...",
        "name": "invoice.pdf",
        "mime": "application/pdf"
      }
    ]
  }'
```

## Datei-Uploads

### Unterstützte Dateiformate

Der Webhook unterstützt zwei Dateieingabeformate:

#### 1. Base64-kodierte Dateien

Zum direkten Hochladen von Dateiinhalten:

```json
{
  "documents": [
    {
      "type": "file",
      "data": "data:image/png;base64,iVBORw0KGgoAAAANSUhEUgA...",
      "name": "screenshot.png",
      "mime": "image/png"
    }
  ]
}
```

- **Maximale Größe**: 20 MB pro Datei
- **Format**: Standard-Daten-URL mit Base64-Kodierung
- **Speicherung**: Dateien werden in sicheren Ausführungsspeicher hochgeladen

#### 2. URL-Referenzen

Zum Übergeben vorhandener Datei-URLs:

```json
{
  "documents": [
    {
      "type": "url",
      "data": "https://example.com/files/document.pdf",
      "name": "document.pdf",
      "mime": "application/pdf"
    }
  ]
}
```

### Zugriff auf Dateien in nachgelagerten Blöcken

Dateien werden zu `UserFile`-Objekten mit den folgenden Eigenschaften verarbeitet:

```typescript
{
  id: string,          // Unique file identifier
  name: string,        // Original filename
  url: string,         // Presigned URL (valid for 5 minutes)
  size: number,        // File size in bytes
  type: string,        // MIME type
  key: string,         // Storage key
  uploadedAt: string,  // ISO timestamp
  expiresAt: string    // ISO timestamp (5 minutes)
}
```

**Zugriff in Blöcken:**

- `<webhook1.documents[0].url>` → Download-URL
- `<webhook1.documents[0].name>` → "invoice.pdf"
- `<webhook1.documents[0].size>` → 524288
- `<webhook1.documents[0].type>` → "application/pdf"

### Vollständiges Datei-Upload-Beispiel

```bash
# Create a base64-encoded file
echo "Hello World" | base64
# SGVsbG8gV29ybGQK

# Send webhook with file
curl -X POST https://sim.ai/api/webhooks/trigger/{webhook-path} \
  -H "Content-Type: application/json" \
  -H "X-Sim-Secret: your-secret" \
  -d '{
    "subject": "Document for review",
    "attachments": [
      {
        "type": "file",
        "data": "data:text/plain;base64,SGVsbG8gV29ybGQK",
        "name": "sample.txt",
        "mime": "text/plain"
      }
    ]
  }'
```

## Authentifizierung

### Authentifizierung konfigurieren (optional)

In der Webhook-Konfiguration:

1. Aktiviere "Authentifizierung erforderlich"
2. Setze einen geheimen Token
3. Wähle den Header-Typ:
   - **Benutzerdefinierter Header**: `X-Sim-Secret: your-token`
   - **Authorization Bearer**: `Authorization: Bearer your-token`

### Verwendung der Authentifizierung

```bash
# With custom header
curl -X POST https://sim.ai/api/webhooks/trigger/{webhook-path} \
  -H "Content-Type: application/json" \
  -H "X-Sim-Secret: your-secret-token" \
  -d '{"message": "Authenticated request"}'

# With bearer token
curl -X POST https://sim.ai/api/webhooks/trigger/{webhook-path} \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer your-secret-token" \
  -d '{"message": "Authenticated request"}'
```

## Best Practices

1. **Eingabeformat für Struktur verwenden**: Definiere ein Eingabeformat, wenn du das erwartete Schema kennst. Dies bietet:
   - Typvalidierung
   - Bessere Autovervollständigung im Editor
   - Datei-Upload-Funktionen

2. **Authentifizierung**: Aktiviere immer die Authentifizierung für Produktions-Webhooks, um unbefugten Zugriff zu verhindern.

3. **Dateigrößenbeschränkungen**: Halte Dateien unter 20 MB. Verwende für größere Dateien URL-Referenzen.

4. **Dateiablauf**: Hochgeladene Dateien haben URLs mit einer Gültigkeit von 5 Minuten. Verarbeite sie umgehend oder speichere sie an anderer Stelle, wenn sie länger benötigt werden.

5. **Fehlerbehandlung**: Die Webhook-Verarbeitung erfolgt asynchron. Überprüfe die Ausführungsprotokolle auf Fehler.

6. **Testen**: Verwende die Schaltfläche "Webhook testen" im Editor, um deine Konfiguration vor der Bereitstellung zu validieren.

## Anwendungsfälle

- **Formularübermittlungen**: Empfange Daten von benutzerdefinierten Formularen mit Datei-Uploads
- **Drittanbieter-Integrationen**: Verbinde dich mit Diensten, die Webhooks senden (Stripe, GitHub usw.)
- **Dokumentenverarbeitung**: Akzeptiere Dokumente von externen Systemen zur Verarbeitung
- **Ereignisbenachrichtigungen**: Empfange Ereignisdaten aus verschiedenen Quellen
- **Benutzerdefinierte APIs**: Erstelle benutzerdefinierte API-Endpunkte für deine Anwendungen

## Hinweise

- Kategorie: `triggers`
- Typ: `generic_webhook`
- **Dateiunterstützung**: Verfügbar über Eingabeformat-Konfiguration
- **Maximale Dateigröße**: 20 MB pro Datei

@@ -123,8 +123,6 @@ Kontostand und Portfoliowert von Kalshi abrufen

| --------- | ---- | ----------- |
| `balance` | number | Kontostand in Cent |
| `portfolioValue` | number | Portfoliowert in Cent |
| `balanceDollars` | number | Kontostand in Dollar |
| `portfolioValueDollars` | number | Portfoliowert in Dollar |

### `kalshi_get_positions`

@@ -47,10 +47,11 @@ Daten aus einer Supabase-Tabelle abfragen

| Parameter | Typ | Erforderlich | Beschreibung |
| --------- | ---- | -------- | ----------- |
| `projectId` | string | Ja | Ihre Supabase-Projekt-ID \(z. B. jdrkgepadsdopsntdlom\) |
| `projectId` | string | Ja | Ihre Supabase-Projekt-ID \(z.B. jdrkgepadsdopsntdlom\) |
| `table` | string | Ja | Der Name der abzufragenden Supabase-Tabelle |
| `schema` | string | Nein | Datenbankschema für die Abfrage \(Standard: public\). Verwenden Sie dies, um auf Tabellen in anderen Schemas zuzugreifen. |
| `filter` | string | Nein | PostgREST-Filter \(z. B. "id=eq.123"\) |
| `select` | string | Nein | Zurückzugebende Spalten \(durch Komma getrennt\). Standard ist * \(alle Spalten\) |
| `filter` | string | Nein | PostgREST-Filter \(z.B. "id=eq.123"\) |
| `orderBy` | string | Nein | Spalte zum Sortieren \(fügen Sie DESC für absteigende Sortierung hinzu\) |
| `limit` | number | Nein | Maximale Anzahl der zurückzugebenden Zeilen |
| `apiKey` | string | Ja | Ihr Supabase Service Role Secret Key |

@@ -91,10 +92,11 @@ Eine einzelne Zeile aus einer Supabase-Tabelle basierend auf Filterkriterien abrufen

| Parameter | Typ | Erforderlich | Beschreibung |
| --------- | ---- | -------- | ----------- |
| `projectId` | string | Ja | Ihre Supabase-Projekt-ID \(z. B. jdrkgepadsdopsntdlom\) |
| `projectId` | string | Ja | Ihre Supabase-Projekt-ID \(z.B. jdrkgepadsdopsntdlom\) |
| `table` | string | Ja | Der Name der abzufragenden Supabase-Tabelle |
| `schema` | string | Nein | Datenbankschema für die Abfrage \(Standard: public\). Verwenden Sie dies, um auf Tabellen in anderen Schemas zuzugreifen. |
| `filter` | string | Ja | PostgREST-Filter zum Auffinden der spezifischen Zeile \(z. B. "id=eq.123"\) |
| `select` | string | Nein | Zurückzugebende Spalten \(durch Komma getrennt\). Standard ist * \(alle Spalten\) |
| `filter` | string | Ja | PostgREST-Filter zum Finden der spezifischen Zeile \(z.B. "id=eq.123"\) |
| `apiKey` | string | Ja | Ihr Supabase Service Role Secret Key |

#### Ausgabe

@@ -15,7 +15,7 @@ Der generische Webhook-Block erstellt einen flexiblen Endpunkt, der beliebige Payloads empfangen kann

<div className="flex justify-center">
  <Image
    src="/static/blocks/webhook.png"
    src="/static/blocks/webhook-trigger.png"
    alt="Generische Webhook-Konfiguration"
    width={500}
    height={400}

@@ -14,6 +14,7 @@
  "router",
  "variables",
  "wait",
  "webhook",
  "workflow"
]
}

87 apps/docs/content/docs/en/blocks/webhook.mdx Normal file
@@ -0,0 +1,87 @@
---
title: Webhook
---

import { Callout } from 'fumadocs-ui/components/callout'
import { Image } from '@/components/ui/image'

The Webhook block sends HTTP POST requests to external webhook endpoints with automatic webhook headers and optional HMAC signing.

<div className="flex justify-center">
  <Image
    src="/static/blocks/webhook.png"
    alt="Webhook Block"
    width={500}
    height={400}
    className="my-6"
  />
</div>

## Configuration

### Webhook URL

The destination endpoint for your webhook request. Supports both static URLs and dynamic values from other blocks.

### Payload

JSON data to send in the request body. Use the AI wand to generate payloads or reference workflow variables:

```json
{
  "event": "workflow.completed",
  "data": {
    "result": "<agent.content>",
    "timestamp": "<function.result>"
  }
}
```

### Signing Secret

Optional secret for HMAC-SHA256 payload signing. When provided, adds an `X-Webhook-Signature` header:

```
X-Webhook-Signature: t=1704067200000,v1=5d41402abc4b2a76b9719d911017c592...
```

To verify signatures, compute `HMAC-SHA256(secret, "${timestamp}.${body}")` and compare with the `v1` value.
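
The verification step above can be sketched in Node.js; the helper name `verifyWebhookSignature` and the parsing of the `t=...,v1=...` header format are illustrative, not part of the Sim API:

```typescript
import { createHmac, timingSafeEqual } from 'node:crypto'

// Illustrative verifier: parse "t=<ms>,v1=<hex>" and recompute
// HMAC-SHA256(secret, `${timestamp}.${body}`) as described above.
function verifyWebhookSignature(header: string, body: string, secret: string): boolean {
  const parts = new Map(header.split(',').map((kv) => kv.split('=') as [string, string]))
  const timestamp = parts.get('t')
  const received = parts.get('v1') ?? ''
  if (!timestamp) return false
  const expected = createHmac('sha256', secret).update(`${timestamp}.${body}`).digest('hex')
  const a = Buffer.from(expected, 'hex')
  const b = Buffer.from(received, 'hex')
  // Length check first: timingSafeEqual throws on buffers of unequal length
  return a.length === b.length && timingSafeEqual(a, b)
}
```

Compare against the raw request body bytes, not a re-serialized JSON object, since any formatting difference changes the digest.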

### Additional Headers

Custom key-value headers to include with the request. These override any automatic headers with the same name.

## Automatic Headers

Every request includes these headers automatically:

| Header | Description |
|--------|-------------|
| `Content-Type` | `application/json` |
| `X-Webhook-Timestamp` | Unix timestamp in milliseconds |
| `X-Delivery-ID` | Unique UUID for this delivery |
| `Idempotency-Key` | Same as `X-Delivery-ID` for deduplication |
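
On the receiving side, the `Idempotency-Key` header lets an endpoint skip repeated deliveries of the same request. A minimal in-memory sketch of a hypothetical receiver; a production service would persist seen keys (e.g. in a database) with an expiry:

```typescript
// Remember keys that have already been processed. In-memory only,
// for illustration; a real receiver would use a persistent store.
const processedKeys = new Set<string>()

function shouldProcessDelivery(idempotencyKey: string): boolean {
  if (processedKeys.has(idempotencyKey)) {
    return false // duplicate delivery: acknowledge it, but skip processing
  }
  processedKeys.add(idempotencyKey)
  return true
}
```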

## Outputs

| Output | Type | Description |
|--------|------|-------------|
| `data` | json | Response body from the endpoint |
| `status` | number | HTTP status code |
| `headers` | object | Response headers |

## Example Use Cases

**Notify external services** - Send workflow results to Slack, Discord, or custom endpoints
```
Agent → Function (format) → Webhook (notify)
```

**Trigger external workflows** - Start processes in other systems when conditions are met
```
Condition (check) → Webhook (trigger) → Response
```

<Callout>
  The Webhook block always uses POST. For other HTTP methods or more control, use the [API block](/blocks/api).
</Callout>

@@ -126,8 +126,6 @@ Retrieve your account balance and portfolio value from Kalshi

| --------- | ---- | ----------- |
| `balance` | number | Account balance in cents |
| `portfolioValue` | number | Portfolio value in cents |
| `balanceDollars` | number | Account balance in dollars |
| `portfolioValueDollars` | number | Portfolio value in dollars |

### `kalshi_get_positions`

@@ -53,6 +53,7 @@ Query data from a Supabase table

| `projectId` | string | Yes | Your Supabase project ID \(e.g., jdrkgepadsdopsntdlom\) |
| `table` | string | Yes | The name of the Supabase table to query |
| `schema` | string | No | Database schema to query from \(default: public\). Use this to access tables in other schemas. |
| `select` | string | No | Columns to return \(comma-separated\). Defaults to * \(all columns\) |
| `filter` | string | No | PostgREST filter \(e.g., "id=eq.123"\) |
| `orderBy` | string | No | Column to order by \(add DESC for descending\) |
| `limit` | number | No | Maximum number of rows to return |

@@ -97,6 +98,7 @@ Get a single row from a Supabase table based on filter criteria

| `projectId` | string | Yes | Your Supabase project ID \(e.g., jdrkgepadsdopsntdlom\) |
| `table` | string | Yes | The name of the Supabase table to query |
| `schema` | string | No | Database schema to query from \(default: public\). Use this to access tables in other schemas. |
| `select` | string | No | Columns to return \(comma-separated\). Defaults to * \(all columns\) |
| `filter` | string | Yes | PostgREST filter to find the specific row \(e.g., "id=eq.123"\) |
| `apiKey` | string | Yes | Your Supabase service role secret key |

@@ -15,7 +15,7 @@ The Generic Webhook block creates a flexible endpoint that can receive any payload

<div className="flex justify-center">
  <Image
    src="/static/blocks/webhook.png"
    src="/static/blocks/webhook-trigger.png"
    alt="Generic Webhook Configuration"
    width={500}
    height={400}

89 apps/docs/content/docs/es/blocks/webhook.mdx Normal file
@@ -0,0 +1,89 @@
---
title: Webhook
---

import { Callout } from 'fumadocs-ui/components/callout'
import { Image } from '@/components/ui/image'

El bloque Webhook envía solicitudes HTTP POST a endpoints de webhook externos con encabezados de webhook automáticos y firma HMAC opcional.

<div className="flex justify-center">
  <Image
    src="/static/blocks/webhook.png"
    alt="Bloque Webhook"
    width={500}
    height={400}
    className="my-6"
  />
</div>

## Configuración

### URL del webhook

El endpoint de destino para tu solicitud de webhook. Admite tanto URL estáticas como valores dinámicos de otros bloques.

### Carga útil

Datos JSON para enviar en el cuerpo de la solicitud. Usa la varita de IA para generar cargas útiles o referenciar variables del flujo de trabajo:

```json
{
  "event": "workflow.completed",
  "data": {
    "result": "<agent.content>",
    "timestamp": "<function.result>"
  }
}
```

### Secreto de firma

Secreto opcional para la firma HMAC-SHA256 de la carga útil. Cuando se proporciona, añade un encabezado `X-Webhook-Signature`:

```
X-Webhook-Signature: t=1704067200000,v1=5d41402abc4b2a76b9719d911017c592...
```

Para verificar las firmas, calcula `HMAC-SHA256(secret, "${timestamp}.${body}")` y compara el resultado con el valor `v1`.
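
El paso de verificación anterior puede esbozarse en Node.js. La función auxiliar `verifyWebhookSignature` y el análisis del formato de encabezado `t=...,v1=...` son ilustrativos y no forman parte de la API de Sim:

```typescript
import { createHmac, timingSafeEqual } from 'node:crypto'

// Illustrative verifier: parse "t=<ms>,v1=<hex>" and recompute
// HMAC-SHA256(secret, `${timestamp}.${body}`) as described above.
function verifyWebhookSignature(header: string, body: string, secret: string): boolean {
  const parts = new Map(header.split(',').map((kv) => kv.split('=') as [string, string]))
  const timestamp = parts.get('t')
  const received = parts.get('v1') ?? ''
  if (!timestamp) return false
  const expected = createHmac('sha256', secret).update(`${timestamp}.${body}`).digest('hex')
  const a = Buffer.from(expected, 'hex')
  const b = Buffer.from(received, 'hex')
  // Length check first: timingSafeEqual throws on buffers of unequal length
  return a.length === b.length && timingSafeEqual(a, b)
}
```

Compara los bytes sin procesar del cuerpo de la solicitud, no un objeto JSON re-serializado, ya que cualquier diferencia de formato cambia el hash.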

### Encabezados adicionales

Encabezados personalizados de clave-valor para incluir con la solicitud. Estos sobrescriben cualquier encabezado automático con el mismo nombre.

## Encabezados automáticos

Cada solicitud incluye estos encabezados automáticamente:

| Encabezado | Descripción |
|--------|-------------|
| `Content-Type` | `application/json` |
| `X-Webhook-Timestamp` | Marca de tiempo Unix en milisegundos |
| `X-Delivery-ID` | UUID único para esta entrega |
| `Idempotency-Key` | Igual que `X-Delivery-ID` para deduplicación |

## Salidas

| Salida | Tipo | Descripción |
|--------|------|-------------|
| `data` | json | Cuerpo de respuesta del endpoint |
| `status` | number | Código de estado HTTP |
| `headers` | object | Encabezados de respuesta |

## Ejemplos de casos de uso

**Notificar servicios externos** - Envía resultados del flujo de trabajo a Slack, Discord o endpoints personalizados

```
Agent → Function (format) → Webhook (notify)
```

**Activar flujos de trabajo externos** - Inicia procesos en otros sistemas cuando se cumplan las condiciones

```
Condition (check) → Webhook (trigger) → Response
```

<Callout>
  El bloque Webhook siempre usa POST. Para otros métodos HTTP o más control, usa el [bloque API](/blocks/api).
</Callout>

@@ -1,230 +0,0 @@
---
title: Webhook
description: Recibe webhooks de cualquier servicio configurando un webhook personalizado.
---

import { BlockInfoCard } from "@/components/ui/block-info-card"
import { Image } from '@/components/ui/image'

<BlockInfoCard
  type="generic_webhook"
  color="#10B981"
/>

<div className="flex justify-center">
  <Image
    src="/static/blocks/webhook.png"
    alt="Configuración del bloque Webhook"
    width={500}
    height={400}
    className="my-6"
  />
</div>

## Descripción general

El bloque Webhook genérico te permite recibir webhooks desde cualquier servicio externo. Este es un disparador flexible que puede manejar cualquier carga útil JSON, lo que lo hace ideal para integrarse con servicios que no tienen un bloque Sim dedicado.

## Uso básico

### Modo de paso simple

Sin definir un formato de entrada, el webhook transmite todo el cuerpo de la solicitud tal como está:

```bash
curl -X POST https://sim.ai/api/webhooks/trigger/{webhook-path} \
  -H "Content-Type: application/json" \
  -H "X-Sim-Secret: your-secret" \
  -d '{
    "message": "Test webhook trigger",
    "data": {
      "key": "value"
    }
  }'
```

Accede a los datos en bloques posteriores usando:

- `<webhook1.message>` → "Test webhook trigger"
- `<webhook1.data.key>` → "value"

### Formato de entrada estructurado (opcional)

Define un esquema de entrada para obtener campos tipados y habilitar funciones avanzadas como cargas de archivos:

**Configuración del formato de entrada:**

```json
[
  { "name": "message", "type": "string" },
  { "name": "priority", "type": "number" },
  { "name": "documents", "type": "files" }
]
```

**Solicitud de webhook:**

```bash
curl -X POST https://sim.ai/api/webhooks/trigger/{webhook-path} \
  -H "Content-Type: application/json" \
  -H "X-Sim-Secret: your-secret" \
  -d '{
    "message": "Invoice submission",
    "priority": 1,
    "documents": [
      {
        "type": "file",
        "data": "data:application/pdf;base64,JVBERi0xLjQK...",
        "name": "invoice.pdf",
        "mime": "application/pdf"
      }
    ]
  }'
```

## Cargas de archivos

### Formatos de archivo compatibles

El webhook admite dos formatos de entrada de archivos:

#### 1. Archivos codificados en Base64

Para cargar contenido de archivos directamente:

```json
{
  "documents": [
    {
      "type": "file",
      "data": "data:image/png;base64,iVBORw0KGgoAAAANSUhEUgA...",
      "name": "screenshot.png",
      "mime": "image/png"
    }
  ]
}
```

- **Tamaño máximo**: 20 MB por archivo
- **Formato**: URL de datos estándar con codificación base64
- **Almacenamiento**: Los archivos se cargan en almacenamiento seguro de ejecución

#### 2. Referencias URL

Para pasar URLs de archivos existentes:

```json
{
  "documents": [
    {
      "type": "url",
      "data": "https://example.com/files/document.pdf",
      "name": "document.pdf",
      "mime": "application/pdf"
    }
  ]
}
```

### Acceso a archivos en bloques posteriores

Los archivos se procesan en objetos `UserFile` con las siguientes propiedades:

```typescript
{
  id: string,          // Unique file identifier
  name: string,        // Original filename
  url: string,         // Presigned URL (valid for 5 minutes)
  size: number,        // File size in bytes
  type: string,        // MIME type
  key: string,         // Storage key
  uploadedAt: string,  // ISO timestamp
  expiresAt: string    // ISO timestamp (5 minutes)
}
```

**Acceso en bloques:**

- `<webhook1.documents[0].url>` → URL de descarga
- `<webhook1.documents[0].name>` → "invoice.pdf"
- `<webhook1.documents[0].size>` → 524288
- `<webhook1.documents[0].type>` → "application/pdf"

### Ejemplo completo de carga de archivos

```bash
# Create a base64-encoded file
echo "Hello World" | base64
# SGVsbG8gV29ybGQK

# Send webhook with file
curl -X POST https://sim.ai/api/webhooks/trigger/{webhook-path} \
  -H "Content-Type: application/json" \
  -H "X-Sim-Secret: your-secret" \
  -d '{
    "subject": "Document for review",
    "attachments": [
      {
        "type": "file",
        "data": "data:text/plain;base64,SGVsbG8gV29ybGQK",
        "name": "sample.txt",
        "mime": "text/plain"
      }
    ]
  }'
```

## Autenticación

### Configurar autenticación (opcional)

En la configuración del webhook:

1. Habilitar "Requerir autenticación"
2. Establecer un token secreto
3. Elegir tipo de encabezado:
   - **Encabezado personalizado**: `X-Sim-Secret: your-token`
   - **Autorización Bearer**: `Authorization: Bearer your-token`

### Uso de la autenticación

```bash
# With custom header
curl -X POST https://sim.ai/api/webhooks/trigger/{webhook-path} \
  -H "Content-Type: application/json" \
  -H "X-Sim-Secret: your-secret-token" \
  -d '{"message": "Authenticated request"}'

# With bearer token
curl -X POST https://sim.ai/api/webhooks/trigger/{webhook-path} \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer your-secret-token" \
  -d '{"message": "Authenticated request"}'
```

## Mejores prácticas

1. **Usar formato de entrada para estructura**: Define un formato de entrada cuando conozcas el esquema esperado. Esto proporciona:
   - Validación de tipo
   - Mejor autocompletado en el editor
   - Capacidades de carga de archivos

2. **Autenticación**: Habilita siempre la autenticación para webhooks en producción para prevenir accesos no autorizados.

3. **Límites de tamaño de archivo**: Mantén los archivos por debajo de 20 MB. Para archivos más grandes, usa referencias URL en su lugar.

4. **Caducidad de archivos**: Los archivos cargados tienen URLs con caducidad de 5 minutos. Procésalos rápidamente o almacénalos en otro lugar si los necesitas por más tiempo.

5. **Manejo de errores**: El procesamiento de webhooks es asíncrono. Revisa los registros de ejecución para detectar errores.

6. **Pruebas**: Usa el botón "Probar webhook" en el editor para validar tu configuración antes de implementarla.

## Casos de uso

- **Envíos de formularios**: Recibe datos de formularios personalizados con cargas de archivos
- **Integraciones con terceros**: Conéctate con servicios que envían webhooks (Stripe, GitHub, etc.)
- **Procesamiento de documentos**: Acepta documentos de sistemas externos para procesarlos
- **Notificaciones de eventos**: Recibe datos de eventos de varias fuentes
- **APIs personalizadas**: Construye endpoints de API personalizados para tus aplicaciones

## Notas

- Categoría: `triggers`
- Tipo: `generic_webhook`
- **Soporte de archivos**: Disponible a través de la configuración del formato de entrada
- **Tamaño máximo de archivo**: 20 MB por archivo

@@ -122,9 +122,7 @@ Recuperar el saldo de tu cuenta y el valor de la cartera desde Kalshi

| Parámetro | Tipo | Descripción |
| --------- | ---- | ----------- |
| `balance` | number | Saldo de la cuenta en centavos |
| `portfolioValue` | number | Valor de la cartera en centavos |
| `balanceDollars` | number | Saldo de la cuenta en dólares |
| `portfolioValueDollars` | number | Valor de la cartera en dólares |
| `portfolioValue` | number | Valor del portafolio en centavos |

### `kalshi_get_positions`
@@ -46,12 +46,13 @@ Consultar datos de una tabla de Supabase

#### Entrada

| Parámetro | Tipo | Obligatorio | Descripción |
| --------- | ---- | ----------- | ----------- |
| --------- | ---- | -------- | ----------- |
| `projectId` | string | Sí | ID de tu proyecto Supabase \(p. ej., jdrkgepadsdopsntdlom\) |
| `table` | string | Sí | Nombre de la tabla Supabase a consultar |
| `schema` | string | No | Esquema de base de datos desde donde consultar \(predeterminado: public\). Usa esto para acceder a tablas en otros esquemas. |
| `schema` | string | No | Esquema de base de datos desde el que consultar \(predeterminado: public\). Usa esto para acceder a tablas en otros esquemas. |
| `select` | string | No | Columnas a devolver \(separadas por comas\). Predeterminado: * \(todas las columnas\) |
| `filter` | string | No | Filtro PostgREST \(p. ej., "id=eq.123"\) |
| `orderBy` | string | No | Columna para ordenar \(añade DESC para descendente\) |
| `orderBy` | string | No | Columna por la que ordenar \(añade DESC para orden descendente\) |
| `limit` | number | No | Número máximo de filas a devolver |
| `apiKey` | string | Sí | Tu clave secreta de rol de servicio de Supabase |
@@ -90,10 +91,11 @@ Obtener una sola fila de una tabla de Supabase basada en criterios de filtro

#### Entrada

| Parámetro | Tipo | Obligatorio | Descripción |
| --------- | ---- | ----------- | ----------- |
| --------- | ---- | -------- | ----------- |
| `projectId` | string | Sí | ID de tu proyecto Supabase \(p. ej., jdrkgepadsdopsntdlom\) |
| `table` | string | Sí | Nombre de la tabla Supabase a consultar |
| `schema` | string | No | Esquema de base de datos desde donde consultar \(predeterminado: public\). Usa esto para acceder a tablas en otros esquemas. |
| `schema` | string | No | Esquema de base de datos desde el que consultar \(predeterminado: public\). Usa esto para acceder a tablas en otros esquemas. |
| `select` | string | No | Columnas a devolver \(separadas por comas\). Predeterminado: * \(todas las columnas\) |
| `filter` | string | Sí | Filtro PostgREST para encontrar la fila específica \(p. ej., "id=eq.123"\) |
| `apiKey` | string | Sí | Tu clave secreta de rol de servicio de Supabase |
@@ -15,8 +15,8 @@ El bloque de webhook genérico crea un punto de conexión flexible que puede rec

<div className="flex justify-center">
  <Image
    src="/static/blocks/webhook.png"
    alt="Configuración genérica de webhook"
    src="/static/blocks/webhook-trigger.png"
    alt="Configuración de webhook genérico"
    width={500}
    height={400}
    className="my-6"
89
apps/docs/content/docs/fr/blocks/webhook.mdx
Normal file
@@ -0,0 +1,89 @@
---
title: Webhook
---

import { Callout } from 'fumadocs-ui/components/callout'
import { Image } from '@/components/ui/image'

Le bloc Webhook envoie des requêtes HTTP POST vers des points de terminaison webhook externes avec des en-têtes webhook automatiques et une signature HMAC optionnelle.

<div className="flex justify-center">
  <Image
    src="/static/blocks/webhook.png"
    alt="Bloc Webhook"
    width={500}
    height={400}
    className="my-6"
  />
</div>

## Configuration

### URL du webhook

Le point de terminaison de destination pour votre requête webhook. Prend en charge les URL statiques et les valeurs dynamiques provenant d'autres blocs.

### Charge utile

Données JSON à envoyer dans le corps de la requête. Utilisez la baguette IA pour générer des charges utiles ou référencer des variables de workflow :

```json
{
  "event": "workflow.completed",
  "data": {
    "result": "<agent.content>",
    "timestamp": "<function.result>"
  }
}
```

### Secret de signature

Secret optionnel pour la signature HMAC-SHA256 de la charge utile. Lorsqu'il est fourni, ajoute un en-tête `X-Webhook-Signature` :

```
X-Webhook-Signature: t=1704067200000,v1=5d41402abc4b2a76b9719d911017c592...
```

Pour vérifier les signatures, calculez `HMAC-SHA256(secret, "${timestamp}.${body}")` et comparez avec la valeur `v1`.

### En-têtes supplémentaires

En-têtes personnalisés clé-valeur à inclure avec la requête. Ceux-ci remplacent tous les en-têtes automatiques portant le même nom.

## En-têtes automatiques

Chaque requête inclut automatiquement ces en-têtes :

| En-tête | Description |
|--------|-------------|
| `Content-Type` | `application/json` |
| `X-Webhook-Timestamp` | Horodatage Unix en millisecondes |
| `X-Delivery-ID` | UUID unique pour cette livraison |
| `Idempotency-Key` | Identique à `X-Delivery-ID` pour la déduplication |

## Sorties

| Sortie | Type | Description |
|--------|------|-------------|
| `data` | json | Corps de la réponse du point de terminaison |
| `status` | number | Code de statut HTTP |
| `headers` | object | En-têtes de réponse |

## Exemples de cas d'usage

**Notifier des services externes** - Envoyer les résultats du workflow vers Slack, Discord ou des points de terminaison personnalisés

```
Agent → Function (format) → Webhook (notify)
```

**Déclencher des workflows externes** - Démarrer des processus dans d'autres systèmes lorsque des conditions sont remplies

```
Condition (check) → Webhook (trigger) → Response
```

<Callout>
Le bloc Webhook utilise toujours POST. Pour d'autres méthodes HTTP ou plus de contrôle, utilisez le [bloc API](/blocks/api).
</Callout>
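The signature format documented in these webhook pages (`X-Webhook-Signature: t=<ms>,v1=<hex>`, where `v1 = HMAC-SHA256(secret, "${timestamp}.${body}")`) can be checked on the receiving side roughly as follows. This is a minimal sketch, not code from the Sim repo; the header parsing and the replay-tolerance window are assumptions:

```python
import hashlib
import hmac
import time

def verify_webhook_signature(header: str, body: str, secret: str,
                             tolerance_ms: int = 5 * 60 * 1000) -> bool:
    """Verify an X-Webhook-Signature header of the form 't=<ms>,v1=<hex>'."""
    parts = dict(p.split("=", 1) for p in header.split(","))
    timestamp, received = parts.get("t"), parts.get("v1")
    if not timestamp or not received:
        return False
    # Reject stale deliveries (replay protection); the window is an assumption.
    if abs(int(time.time() * 1000) - int(timestamp)) > tolerance_ms:
        return False
    # Recompute HMAC-SHA256(secret, "<timestamp>.<body>") and compare
    # in constant time against the received v1 value.
    expected = hmac.new(secret.encode(), f"{timestamp}.{body}".encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, received)
```

Comparing with `hmac.compare_digest` rather than `==` avoids timing side channels when rejecting forged signatures.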
@@ -1,231 +0,0 @@
---
title: Webhook
description: Recevez des webhooks de n'importe quel service en configurant un
  webhook personnalisé.
---

import { BlockInfoCard } from "@/components/ui/block-info-card"
import { Image } from '@/components/ui/image'

<BlockInfoCard
  type="generic_webhook"
  color="#10B981"
/>

<div className="flex justify-center">
  <Image
    src="/static/blocks/webhook.png"
    alt="Configuration du bloc Webhook"
    width={500}
    height={400}
    className="my-6"
  />
</div>

## Aperçu

Le bloc Webhook générique vous permet de recevoir des webhooks depuis n'importe quel service externe. C'est un déclencheur flexible qui peut traiter n'importe quelle charge utile JSON, ce qui le rend idéal pour l'intégration avec des services qui n'ont pas de bloc Sim dédié.

## Utilisation de base

### Mode de transmission simple

Sans définir un format d'entrée, le webhook transmet l'intégralité du corps de la requête tel quel :

```bash
curl -X POST https://sim.ai/api/webhooks/trigger/{webhook-path} \
  -H "Content-Type: application/json" \
  -H "X-Sim-Secret: your-secret" \
  -d '{
    "message": "Test webhook trigger",
    "data": {
      "key": "value"
    }
  }'
```

Accédez aux données dans les blocs en aval en utilisant :
- `<webhook1.message>` → "Test webhook trigger"
- `<webhook1.data.key>` → "value"

### Format d'entrée structuré (optionnel)

Définissez un schéma d'entrée pour obtenir des champs typés et activer des fonctionnalités avancées comme les téléchargements de fichiers :

**Configuration du format d'entrée :**

```json
[
  { "name": "message", "type": "string" },
  { "name": "priority", "type": "number" },
  { "name": "documents", "type": "files" }
]
```

**Requête Webhook :**

```bash
curl -X POST https://sim.ai/api/webhooks/trigger/{webhook-path} \
  -H "Content-Type: application/json" \
  -H "X-Sim-Secret: your-secret" \
  -d '{
    "message": "Invoice submission",
    "priority": 1,
    "documents": [
      {
        "type": "file",
        "data": "data:application/pdf;base64,JVBERi0xLjQK...",
        "name": "invoice.pdf",
        "mime": "application/pdf"
      }
    ]
  }'
```

## Téléchargements de fichiers

### Formats de fichiers pris en charge

Le webhook prend en charge deux formats d'entrée de fichiers :

#### 1. Fichiers encodés en Base64
Pour télécharger directement le contenu du fichier :

```json
{
  "documents": [
    {
      "type": "file",
      "data": "data:image/png;base64,iVBORw0KGgoAAAANSUhEUgA...",
      "name": "screenshot.png",
      "mime": "image/png"
    }
  ]
}
```

- **Taille maximale** : 20 Mo par fichier
- **Format** : URL de données standard avec encodage base64
- **Stockage** : Les fichiers sont téléchargés dans un stockage d'exécution sécurisé

#### 2. Références URL
Pour transmettre des URL de fichiers existants :

```json
{
  "documents": [
    {
      "type": "url",
      "data": "https://example.com/files/document.pdf",
      "name": "document.pdf",
      "mime": "application/pdf"
    }
  ]
}
```

### Accès aux fichiers dans les blocs en aval

Les fichiers sont traités en objets `UserFile` avec les propriétés suivantes :

```typescript
{
  id: string,          // Unique file identifier
  name: string,        // Original filename
  url: string,         // Presigned URL (valid for 5 minutes)
  size: number,        // File size in bytes
  type: string,        // MIME type
  key: string,         // Storage key
  uploadedAt: string,  // ISO timestamp
  expiresAt: string    // ISO timestamp (5 minutes)
}
```

**Accès dans les blocs :**
- `<webhook1.documents[0].url>` → URL de téléchargement
- `<webhook1.documents[0].name>` → "invoice.pdf"
- `<webhook1.documents[0].size>` → 524288
- `<webhook1.documents[0].type>` → "application/pdf"

### Exemple complet de téléchargement de fichier

```bash
# Create a base64-encoded file
echo "Hello World" | base64
# SGVsbG8gV29ybGQK

# Send webhook with file
curl -X POST https://sim.ai/api/webhooks/trigger/{webhook-path} \
  -H "Content-Type: application/json" \
  -H "X-Sim-Secret: your-secret" \
  -d '{
    "subject": "Document for review",
    "attachments": [
      {
        "type": "file",
        "data": "data:text/plain;base64,SGVsbG8gV29ybGQK",
        "name": "sample.txt",
        "mime": "text/plain"
      }
    ]
  }'
```

## Authentification

### Configurer l'authentification (optionnel)

Dans la configuration du webhook :
1. Activez "Exiger l'authentification"
2. Définissez un jeton secret
3. Choisissez le type d'en-tête :
   - **En-tête personnalisé** : `X-Sim-Secret: your-token`
   - **Autorisation Bearer** : `Authorization: Bearer your-token`

### Utilisation de l'authentification

```bash
# With custom header
curl -X POST https://sim.ai/api/webhooks/trigger/{webhook-path} \
  -H "Content-Type: application/json" \
  -H "X-Sim-Secret: your-secret-token" \
  -d '{"message": "Authenticated request"}'

# With bearer token
curl -X POST https://sim.ai/api/webhooks/trigger/{webhook-path} \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer your-secret-token" \
  -d '{"message": "Authenticated request"}'
```

## Bonnes pratiques

1. **Utiliser le format d'entrée pour la structure** : définissez un format d'entrée lorsque vous connaissez le schéma attendu. Cela fournit :
   - Validation de type
   - Meilleure autocomplétion dans l'éditeur
   - Capacités de téléchargement de fichiers

2. **Authentification** : activez toujours l'authentification pour les webhooks en production afin d'empêcher les accès non autorisés.

3. **Limites de taille de fichier** : gardez les fichiers en dessous de 20 Mo. Pour les fichiers plus volumineux, utilisez plutôt des références URL.

4. **Expiration des fichiers** : les fichiers téléchargés ont des URL d'expiration de 5 minutes. Traitez-les rapidement ou stockez-les ailleurs si vous en avez besoin plus longtemps.

5. **Gestion des erreurs** : le traitement des webhooks est asynchrone. Vérifiez les journaux d'exécution pour les erreurs.

6. **Tests** : utilisez le bouton "Tester le webhook" dans l'éditeur pour valider votre configuration avant le déploiement.

## Cas d'utilisation

- **Soumissions de formulaires** : recevez des données de formulaires personnalisés avec téléchargement de fichiers
- **Intégrations tierces** : connectez-vous avec des services qui envoient des webhooks (Stripe, GitHub, etc.)
- **Traitement de documents** : acceptez des documents de systèmes externes pour traitement
- **Notifications d'événements** : recevez des données d'événements de diverses sources
- **API personnalisées** : créez des points de terminaison API personnalisés pour vos applications

## Remarques

- Catégorie : `triggers`
- Type : `generic_webhook`
- **Support de fichiers** : disponible via la configuration du format d'entrée
- **Taille maximale de fichier** : 20 Mo par fichier
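The base64 file-upload payloads shown in these docs (`{"type": "file", "data": "data:<mime>;base64,...", "name": ..., "mime": ...}`) can be assembled programmatically instead of by hand in curl. A hedged sketch — the helper name is invented here, and only the payload shape follows the examples above; this is not an official client:

```python
import base64
import json

def build_file_payload(path: str, mime: str, field: str = "documents") -> str:
    """Build the JSON body for a generic-webhook file upload.

    Encodes the file as a standard data URL, matching the
    {"type": "file", "data": "data:<mime>;base64,..."} shape used
    in the webhook docs' curl examples.
    """
    with open(path, "rb") as f:
        encoded = base64.b64encode(f.read()).decode("ascii")
    return json.dumps({
        field: [{
            "type": "file",
            "data": f"data:{mime};base64,{encoded}",
            "name": path.rsplit("/", 1)[-1],
            "mime": mime,
        }]
    })
```

The resulting string can be sent as the request body with the `X-Sim-Secret` header, exactly like the `-d` argument in the curl examples; remember the documented 20 MB per-file limit, and prefer `{"type": "url", ...}` references for anything larger.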
@@ -123,8 +123,6 @@ Récupérer le solde de votre compte et la valeur de votre portefeuille depuis K

| --------- | ---- | ----------- |
| `balance` | number | Solde du compte en centimes |
| `portfolioValue` | number | Valeur du portefeuille en centimes |
| `balanceDollars` | number | Solde du compte en dollars |
| `portfolioValueDollars` | number | Valeur du portefeuille en dollars |

### `kalshi_get_positions`
@@ -49,7 +49,8 @@ Interroger des données d'une table Supabase

| --------- | ---- | ----------- | ----------- |
| `projectId` | string | Oui | L'ID de votre projet Supabase \(ex. : jdrkgepadsdopsntdlom\) |
| `table` | string | Oui | Le nom de la table Supabase à interroger |
| `schema` | string | Non | Schéma de base de données à interroger \(par défaut : public\). Utilisez ceci pour accéder aux tables dans d'autres schémas. |
| `schema` | string | Non | Schéma de base de données à partir duquel interroger \(par défaut : public\). Utilisez ceci pour accéder aux tables dans d'autres schémas. |
| `select` | string | Non | Colonnes à retourner \(séparées par des virgules\). Par défaut * \(toutes les colonnes\) |
| `filter` | string | Non | Filtre PostgREST \(ex. : "id=eq.123"\) |
| `orderBy` | string | Non | Colonne pour le tri \(ajoutez DESC pour l'ordre décroissant\) |
| `limit` | number | Non | Nombre maximum de lignes à retourner |
@@ -93,7 +94,8 @@ Obtenir une seule ligne d'une table Supabase selon des critères de filtrage

| --------- | ---- | ----------- | ----------- |
| `projectId` | string | Oui | L'ID de votre projet Supabase \(ex. : jdrkgepadsdopsntdlom\) |
| `table` | string | Oui | Le nom de la table Supabase à interroger |
| `schema` | string | Non | Schéma de base de données à interroger \(par défaut : public\). Utilisez ceci pour accéder aux tables dans d'autres schémas. |
| `schema` | string | Non | Schéma de base de données à partir duquel interroger \(par défaut : public\). Utilisez ceci pour accéder aux tables dans d'autres schémas. |
| `select` | string | Non | Colonnes à retourner \(séparées par des virgules\). Par défaut * \(toutes les colonnes\) |
| `filter` | string | Oui | Filtre PostgREST pour trouver la ligne spécifique \(ex. : "id=eq.123"\) |
| `apiKey` | string | Oui | Votre clé secrète de rôle de service Supabase |
@@ -15,8 +15,8 @@ Le bloc Webhook générique crée un point de terminaison flexible qui peut rece

<div className="flex justify-center">
  <Image
    src="/static/blocks/webhook.png"
    alt="Configuration de webhook générique"
    src="/static/blocks/webhook-trigger.png"
    alt="Configuration du webhook générique"
    width={500}
    height={400}
    className="my-6"
89
apps/docs/content/docs/ja/blocks/webhook.mdx
Normal file
@@ -0,0 +1,89 @@
---
title: Webhook
---

import { Callout } from 'fumadocs-ui/components/callout'
import { Image } from '@/components/ui/image'

Webhookブロックは、自動的なWebhookヘッダーとオプションのHMAC署名を使用して、外部のWebhookエンドポイントにHTTP POSTリクエストを送信します。

<div className="flex justify-center">
  <Image
    src="/static/blocks/webhook.png"
    alt="Webhookブロック"
    width={500}
    height={400}
    className="my-6"
  />
</div>

## 設定

### Webhook URL

Webhookリクエストの送信先エンドポイントです。静的URLと他のブロックからの動的な値の両方に対応しています。

### ペイロード

リクエストボディで送信するJSONデータです。AIワンドを使用してペイロードを生成したり、ワークフロー変数を参照したりできます。

```json
{
  "event": "workflow.completed",
  "data": {
    "result": "<agent.content>",
    "timestamp": "<function.result>"
  }
}
```

### 署名シークレット

HMAC-SHA256ペイロード署名用のオプションのシークレットです。指定すると、`X-Webhook-Signature`ヘッダーが追加されます。

```
X-Webhook-Signature: t=1704067200000,v1=5d41402abc4b2a76b9719d911017c592...
```

署名を検証するには、`HMAC-SHA256(secret, "${timestamp}.${body}")`を計算し、`v1`の値と比較します。

### 追加ヘッダー

リクエストに含めるカスタムのキーと値のヘッダーです。同じ名前の自動ヘッダーがある場合は上書きされます。

## 自動ヘッダー

すべてのリクエストには、以下のヘッダーが自動的に含まれます。

| ヘッダー | 説明 |
|--------|-------------|
| `Content-Type` | `application/json` |
| `X-Webhook-Timestamp` | ミリ秒単位のUnixタイムスタンプ |
| `X-Delivery-ID` | この配信の一意のUUID |
| `Idempotency-Key` | 重複排除用の`X-Delivery-ID`と同じ |

## 出力

| 出力 | 型 | 説明 |
|--------|------|-------------|
| `data` | json | エンドポイントからのレスポンスボディ |
| `status` | number | HTTPステータスコード |
| `headers` | object | レスポンスヘッダー |

## 使用例

**外部サービスへの通知** - ワークフローの結果をSlack、Discord、またはカスタムエンドポイントに送信します。

```
Agent → Function (format) → Webhook (notify)
```

**外部ワークフローのトリガー** - 条件が満たされたときに他のシステムでプロセスを開始します。

```
Condition (check) → Webhook (trigger) → Response
```

<Callout>
Webhookブロックは常にPOSTを使用します。他のHTTPメソッドやより詳細な制御が必要な場合は、[APIブロック](/blocks/api)を使用してください。
</Callout>
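These docs note that the `Idempotency-Key` header duplicates `X-Delivery-ID` so receivers can deduplicate retried deliveries. A minimal receiver-side sketch of that idea, assuming an in-memory store (a production receiver would more likely use a TTL-backed store such as Redis; the class and method names here are illustrative):

```python
import threading

class DeliveryDeduplicator:
    """Track seen Idempotency-Key values so retried webhook
    deliveries are processed at most once."""

    def __init__(self) -> None:
        self._seen: set[str] = set()
        self._lock = threading.Lock()

    def first_delivery(self, idempotency_key: str) -> bool:
        """Return True the first time a key is seen, False on repeats."""
        with self._lock:
            if idempotency_key in self._seen:
                return False
            self._seen.add(idempotency_key)
            return True

dedup = DeliveryDeduplicator()
```

A handler would call `dedup.first_delivery(request.headers["Idempotency-Key"])` before running side effects, and acknowledge duplicates with a 2xx without reprocessing.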
@@ -1,230 +0,0 @@
---
title: Webhook
description: カスタムウェブフックを設定して、任意のサービスからウェブフックを受信します。
---

import { BlockInfoCard } from "@/components/ui/block-info-card"
import { Image } from '@/components/ui/image'

<BlockInfoCard
  type="generic_webhook"
  color="#10B981"
/>

<div className="flex justify-center">
  <Image
    src="/static/blocks/webhook.png"
    alt="Webhookブロックの設定"
    width={500}
    height={400}
    className="my-6"
  />
</div>

## 概要

汎用Webhookブロックを使用すると、任意の外部サービスからWebhookを受信できます。これは柔軟なトリガーであり、あらゆるJSONペイロードを処理できるため、専用のSimブロックがないサービスとの統合に最適です。

## 基本的な使用方法

### シンプルなパススルーモード

入力フォーマットを定義しない場合、Webhookはリクエスト本文全体をそのまま渡します:

```bash
curl -X POST https://sim.ai/api/webhooks/trigger/{webhook-path} \
  -H "Content-Type: application/json" \
  -H "X-Sim-Secret: your-secret" \
  -d '{
    "message": "Test webhook trigger",
    "data": {
      "key": "value"
    }
  }'
```

下流のブロックでデータにアクセスする方法:
- `<webhook1.message>` → "Test webhook trigger"
- `<webhook1.data.key>` → "value"

### 構造化入力フォーマット(オプション)

入力スキーマを定義して、型付きフィールドを取得し、ファイルアップロードなどの高度な機能を有効にします:

**入力フォーマット設定:**

```json
[
  { "name": "message", "type": "string" },
  { "name": "priority", "type": "number" },
  { "name": "documents", "type": "files" }
]
```

**Webhookリクエスト:**

```bash
curl -X POST https://sim.ai/api/webhooks/trigger/{webhook-path} \
  -H "Content-Type: application/json" \
  -H "X-Sim-Secret: your-secret" \
  -d '{
    "message": "Invoice submission",
    "priority": 1,
    "documents": [
      {
        "type": "file",
        "data": "data:application/pdf;base64,JVBERi0xLjQK...",
        "name": "invoice.pdf",
        "mime": "application/pdf"
      }
    ]
  }'
```

## ファイルアップロード

### サポートされているファイル形式

Webhookは2つのファイル入力形式をサポートしています:

#### 1. Base64エンコードファイル
ファイルコンテンツを直接アップロードする場合:

```json
{
  "documents": [
    {
      "type": "file",
      "data": "data:image/png;base64,iVBORw0KGgoAAAANSUhEUgA...",
      "name": "screenshot.png",
      "mime": "image/png"
    }
  ]
}
```

- **最大サイズ**: ファイルあたり20MB
- **フォーマット**: Base64エンコーディングを使用した標準データURL
- **ストレージ**: ファイルは安全な実行ストレージにアップロードされます

#### 2. URL参照
既存のファイルURLを渡す場合:

```json
{
  "documents": [
    {
      "type": "url",
      "data": "https://example.com/files/document.pdf",
      "name": "document.pdf",
      "mime": "application/pdf"
    }
  ]
}
```

### 下流のブロックでファイルにアクセスする

ファイルは以下のプロパティを持つ `UserFile` オブジェクトに処理されます:

```typescript
{
  id: string,          // Unique file identifier
  name: string,        // Original filename
  url: string,         // Presigned URL (valid for 5 minutes)
  size: number,        // File size in bytes
  type: string,        // MIME type
  key: string,         // Storage key
  uploadedAt: string,  // ISO timestamp
  expiresAt: string    // ISO timestamp (5 minutes)
}
```

**ブロック内でのアクセス:**
- `<webhook1.documents[0].url>` → ダウンロードURL
- `<webhook1.documents[0].name>` → "invoice.pdf"
- `<webhook1.documents[0].size>` → 524288
- `<webhook1.documents[0].type>` → "application/pdf"

### ファイルアップロードの完全な例

```bash
# Create a base64-encoded file
echo "Hello World" | base64
# SGVsbG8gV29ybGQK

# Send webhook with file
curl -X POST https://sim.ai/api/webhooks/trigger/{webhook-path} \
  -H "Content-Type: application/json" \
  -H "X-Sim-Secret: your-secret" \
  -d '{
    "subject": "Document for review",
    "attachments": [
      {
        "type": "file",
        "data": "data:text/plain;base64,SGVsbG8gV29ybGQK",
        "name": "sample.txt",
        "mime": "text/plain"
      }
    ]
  }'
```

## 認証

### 認証の設定(オプション)

ウェブフック設定で:
1. 「認証を要求する」を有効にする
2. シークレットトークンを設定する
3. ヘッダータイプを選択する:
   - **カスタムヘッダー**: `X-Sim-Secret: your-token`
   - **認証ベアラー**: `Authorization: Bearer your-token`

### 認証の使用

```bash
# With custom header
curl -X POST https://sim.ai/api/webhooks/trigger/{webhook-path} \
  -H "Content-Type: application/json" \
  -H "X-Sim-Secret: your-secret-token" \
  -d '{"message": "Authenticated request"}'

# With bearer token
curl -X POST https://sim.ai/api/webhooks/trigger/{webhook-path} \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer your-secret-token" \
  -d '{"message": "Authenticated request"}'
```

## ベストプラクティス

1. **構造化のための入力フォーマットの使用**: 予想されるスキーマがわかっている場合は入力フォーマットを定義してください。これにより以下が提供されます:
   - 型の検証
   - エディタでのより良いオートコンプリート
   - ファイルアップロード機能

2. **認証**: 不正アクセスを防ぐため、本番環境のウェブフックには常に認証を有効にしてください。

3. **ファイルサイズの制限**: ファイルは20MB未満に保ってください。より大きなファイルの場合は、代わりにURL参照を使用してください。

4. **ファイルの有効期限**: ダウンロードされたファイルのURLは5分間有効です。すぐに処理するか、長期間必要な場合は別の場所に保存してください。

5. **エラー処理**: ウェブフック処理は非同期です。エラーについては実行ログを確認してください。

6. **テスト**: 設定をデプロイする前に、エディタの「ウェブフックをテスト」ボタンを使用して設定を検証してください。

## ユースケース

- **フォーム送信**: ファイルアップロード機能を持つカスタムフォームからデータを受け取る
- **サードパーティ連携**: ウェブフックを送信するサービス(Stripe、GitHubなど)と接続する
- **ドキュメント処理**: 外部システムからドキュメントを受け取って処理する
- **イベント通知**: さまざまなソースからイベントデータを受け取る
- **カスタムAPI**: アプリケーション用のカスタムAPIエンドポイントを構築する

## 注意事項

- カテゴリ:`triggers`
- タイプ:`generic_webhook`
- **ファイルサポート**:入力フォーマット設定で利用可能
- **最大ファイルサイズ**:ファイルあたり20MB
@@ -121,10 +121,8 @@ Kalshiからアカウント残高とポートフォリオ価値を取得

| パラメータ | 型 | 説明 |
| --------- | ---- | ----------- |
| `balance` | number | セント単位のアカウント残高 |
| `portfolioValue` | number | セント単位のポートフォリオ価値 |
| `balanceDollars` | number | ドル単位のアカウント残高 |
| `portfolioValueDollars` | number | ドル単位のポートフォリオ価値 |
| `balance` | number | アカウント残高(セント単位) |
| `portfolioValue` | number | ポートフォリオ価値(セント単位) |

### `kalshi_get_positions`
@@ -49,7 +49,8 @@ Supabaseテーブルからデータを照会する

| --------- | ---- | -------- | ----------- |
| `projectId` | string | はい | あなたのSupabaseプロジェクトID(例:jdrkgepadsdopsntdlom) |
| `table` | string | はい | クエリするSupabaseテーブルの名前 |
| `schema` | string | いいえ | クエリするデータベーススキーマ(デフォルト:public)。他のスキーマのテーブルにアクセスする場合に使用します。 |
| `schema` | string | いいえ | クエリ元のデータベーススキーマ(デフォルト:public)。他のスキーマのテーブルにアクセスする場合に使用します。 |
| `select` | string | いいえ | 返す列(カンマ区切り)。デフォルトは*(すべての列) |
| `filter` | string | いいえ | PostgRESTフィルター(例:"id=eq.123") |
| `orderBy` | string | いいえ | 並べ替える列(降順の場合はDESCを追加) |
| `limit` | number | いいえ | 返す最大行数 |
@@ -93,7 +94,8 @@ Supabaseテーブルにデータを挿入する

| --------- | ---- | -------- | ----------- |
| `projectId` | string | はい | あなたのSupabaseプロジェクトID(例:jdrkgepadsdopsntdlom) |
| `table` | string | はい | クエリするSupabaseテーブルの名前 |
| `schema` | string | いいえ | クエリするデータベーススキーマ(デフォルト:public)。他のスキーマのテーブルにアクセスする場合に使用します。 |
| `schema` | string | いいえ | クエリ元のデータベーススキーマ(デフォルト:public)。他のスキーマのテーブルにアクセスする場合に使用します。 |
| `select` | string | いいえ | 返す列(カンマ区切り)。デフォルトは*(すべての列) |
| `filter` | string | はい | 特定の行を見つけるためのPostgRESTフィルター(例:"id=eq.123") |
| `apiKey` | string | はい | あなたのSupabaseサービスロールシークレットキー |
@@ -15,7 +15,7 @@ Webhookを使用すると、外部サービスがHTTPリクエストを送信し

<div className="flex justify-center">
  <Image
    src="/static/blocks/webhook.png"
    src="/static/blocks/webhook-trigger.png"
    alt="汎用Webhook設定"
    width={500}
    height={400}
89
apps/docs/content/docs/zh/blocks/webhook.mdx
Normal file
@@ -0,0 +1,89 @@
---
title: Webhook
---

import { Callout } from 'fumadocs-ui/components/callout'
import { Image } from '@/components/ui/image'

Webhook 模块会向外部 webhook 端点发送 HTTP POST 请求,自动附加 webhook 头部,并可选用 HMAC 签名。

<div className="flex justify-center">
  <Image
    src="/static/blocks/webhook.png"
    alt="Webhook 模块"
    width={500}
    height={400}
    className="my-6"
  />
</div>

## 配置

### Webhook URL

Webhook 请求的目标端点。支持静态 URL 和来自其他模块的动态值。

### 负载

要在请求体中发送的 JSON 数据。可使用 AI 魔杖生成负载,或引用工作流变量:

```json
{
  "event": "workflow.completed",
  "data": {
    "result": "<agent.content>",
    "timestamp": "<function.result>"
  }
}
```

### 签名密钥

可选的 HMAC-SHA256 负载签名密钥。填写后会添加 `X-Webhook-Signature` 头部:

```
X-Webhook-Signature: t=1704067200000,v1=5d41402abc4b2a76b9719d911017c592...
```

要验证签名,请计算 `HMAC-SHA256(secret, "${timestamp}.${body}")` 并与 `v1` 的值进行比对。

### 额外头部

自定义的键值头部,将随请求一同发送。若与自动头部同名,则会覆盖自动头部。

## 自动头部

每个请求都会自动包含以下头部:

| Header | 说明 |
|--------|------|
| `Content-Type` | `application/json` |
| `X-Webhook-Timestamp` | Unix 时间戳(毫秒) |
| `X-Delivery-ID` | 本次投递的唯一 UUID |
| `Idempotency-Key` | 与 `X-Delivery-ID` 相同,用于去重 |

## 输出

| 输出 | 类型 | 说明 |
|------|------|------|
| `data` | json | 端点返回的响应体 |
| `status` | number | HTTP 状态码 |
| `headers` | object | 响应头部 |

## 示例用例

**通知外部服务** - 将工作流结果发送到 Slack、Discord 或自定义端点

```
Agent → Function (format) → Webhook (notify)
```

**触发外部工作流** - 当满足条件时,在其他系统中启动流程

```
Condition (check) → Webhook (trigger) → Response
```

<Callout>
Webhook 模块始终使用 POST。如需使用其他 HTTP 方法或获得更多控制,请使用 [API 模块](/blocks/api)。
</Callout>
@@ -1,230 +0,0 @@
|
||||
---
|
||||
title: Webhook
|
||||
description: 通过配置自定义 webhook,从任何服务接收 webhook。
|
||||
---
|
||||
|
||||
import { BlockInfoCard } from "@/components/ui/block-info-card"
|
||||
import { Image } from '@/components/ui/image'
|
||||
|
||||
<BlockInfoCard
|
||||
type="generic_webhook"
|
||||
color="#10B981"
|
||||
/>
|
||||
|
||||
<div className="flex justify-center">
|
||||
<Image
|
||||
src="/static/blocks/webhook.png"
|
||||
alt="Webhook Block Configuration"
|
||||
width={500}
|
||||
height={400}
|
||||
className="my-6"
|
||||
/>
|
||||
</div>
|
||||
|
||||
## 概述
|
||||
|
||||
通用 Webhook 模块允许您接收来自任何外部服务的 webhook。这是一个灵活的触发器,可以处理任何 JSON 负载,非常适合与没有专用 Sim 模块的服务集成。
|
||||
|
||||
## 基本用法
|
||||
|
||||
### 简单直通模式
|
||||
|
||||
在未定义输入格式的情况下,webhook 会按原样传递整个请求正文:
|
||||
|
||||
```bash
|
||||
curl -X POST https://sim.ai/api/webhooks/trigger/{webhook-path} \
|
||||
-H "Content-Type: application/json" \
|
||||
-H "X-Sim-Secret: your-secret" \
|
||||
-d '{
|
||||
"message": "Test webhook trigger",
|
||||
"data": {
|
||||
"key": "value"
|
||||
}
|
||||
}'
|
||||
```
|
||||
|
||||
在下游模块中使用以下方式访问数据:
|
||||
- `<webhook1.message>` → "测试 webhook 触发器"
|
||||
- `<webhook1.data.key>` → "值"
|
||||
|
||||
### 结构化输入格式(可选)
|
||||
|
||||
定义输入模式以获取类型化字段,并启用高级功能,例如文件上传:
|
||||
|
||||
**输入格式配置:**
|
||||
|
||||
```json
|
||||
[
|
||||
{ "name": "message", "type": "string" },
|
||||
{ "name": "priority", "type": "number" },
|
||||
{ "name": "documents", "type": "files" }
|
||||
]
|
||||
```
|
||||
|
||||
**Webhook 请求:**
|
||||
|
||||
```bash
|
||||
curl -X POST https://sim.ai/api/webhooks/trigger/{webhook-path} \
|
||||
-H "Content-Type: application/json" \
|
||||
-H "X-Sim-Secret: your-secret" \
|
||||
-d '{
|
||||
"message": "Invoice submission",
|
||||
"priority": 1,
|
||||
"documents": [
|
||||
{
|
||||
"type": "file",
|
||||
"data": "data:application/pdf;base64,JVBERi0xLjQK...",
|
||||
"name": "invoice.pdf",
|
||||
"mime": "application/pdf"
|
||||
}
|
||||
]
|
||||
}'
|
||||
```
|
||||
|
||||
## File Uploads

### Supported File Formats

The webhook supports two file input formats:

#### 1. Base64-Encoded Files
For uploading file contents directly:

```json
{
  "documents": [
    {
      "type": "file",
      "data": "data:image/png;base64,iVBORw0KGgoAAAANSUhEUgA...",
      "name": "screenshot.png",
      "mime": "image/png"
    }
  ]
}
```

- **Maximum size**: 20MB per file
- **Format**: standard data URL with base64 encoding
- **Storage**: files are uploaded to secure execution storage
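The base64 `data` field above is a standard data URL, so any client can build it in a few lines. A hedged Python sketch (the helper names are invented; only the data-URL shape and the 20MB limit come from the docs):

```python
# Illustrative helper for building the base64 "data" field shown above.
# Any client that produces a standard data URL works equally well.
import base64

def to_data_url(content: bytes, mime: str) -> str:
    """Encode raw bytes as a data URL suitable for the webhook's file format."""
    encoded = base64.b64encode(content).decode("ascii")
    return f"data:{mime};base64,{encoded}"

def file_entry(content: bytes, name: str, mime: str) -> dict:
    # 20MB per-file limit (documented above); enforcing it client-side avoids a failed request
    if len(content) > 20 * 1024 * 1024:
        raise ValueError(f"{name} exceeds the 20MB per-file limit")
    return {"type": "file", "data": to_data_url(content, mime), "name": name, "mime": mime}
```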
#### 2. URL References
For passing existing file URLs:

```json
{
  "documents": [
    {
      "type": "url",
      "data": "https://example.com/files/document.pdf",
      "name": "document.pdf",
      "mime": "application/pdf"
    }
  ]
}
```

### Accessing Files in Downstream Blocks

Files are processed into `UserFile` objects with the following properties:

```typescript
{
  id: string,          // Unique file identifier
  name: string,        // Original filename
  url: string,         // Presigned URL (valid for 5 minutes)
  size: number,        // File size in bytes
  type: string,        // MIME type
  key: string,         // Storage key
  uploadedAt: string,  // ISO timestamp
  expiresAt: string    // ISO timestamp (5 minutes)
}
```
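Since presigned URLs expire after 5 minutes, a client that holds on to `UserFile` objects may want to check `expiresAt` before reusing a URL. A small illustrative sketch (not part of Sim's API):

```python
# Hypothetical expiry check for a UserFile's presigned URL, using its
# ISO `expiresAt` timestamp. Illustrative only.
from datetime import datetime, timezone

def is_url_valid(user_file: dict, now=None) -> bool:
    """True while the presigned URL has not yet expired."""
    # fromisoformat in older Pythons does not accept a trailing "Z"
    expires = datetime.fromisoformat(user_file["expiresAt"].replace("Z", "+00:00"))
    now = now or datetime.now(timezone.utc)
    return now < expires
```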
**Access in blocks:**
- `<webhook1.documents[0].url>` → download URL
- `<webhook1.documents[0].name>` → "invoice.pdf"
- `<webhook1.documents[0].size>` → 524288
- `<webhook1.documents[0].type>` → "application/pdf"

### Complete File Upload Example

```bash
# Create a base64-encoded file
echo "Hello World" | base64
# SGVsbG8gV29ybGQK

# Send webhook with file
curl -X POST https://sim.ai/api/webhooks/trigger/{webhook-path} \
  -H "Content-Type: application/json" \
  -H "X-Sim-Secret: your-secret" \
  -d '{
    "subject": "Document for review",
    "attachments": [
      {
        "type": "file",
        "data": "data:text/plain;base64,SGVsbG8gV29ybGQK",
        "name": "sample.txt",
        "mime": "text/plain"
      }
    ]
  }'
```
## Authentication

### Configuring Authentication (Optional)

In the webhook configuration:
1. Enable "Require Authentication"
2. Set a secret token
3. Choose the header type:
   - **Custom header**: `X-Sim-Secret: your-token`
   - **Authorization Bearer**: `Authorization: Bearer your-token`

### Using Authentication

```bash
# With custom header
curl -X POST https://sim.ai/api/webhooks/trigger/{webhook-path} \
  -H "Content-Type: application/json" \
  -H "X-Sim-Secret: your-secret-token" \
  -d '{"message": "Authenticated request"}'

# With bearer token
curl -X POST https://sim.ai/api/webhooks/trigger/{webhook-path} \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer your-secret-token" \
  -d '{"message": "Authenticated request"}'
```
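For a non-curl client, the two header styles reduce to a small helper. An illustrative Python sketch (the header names come from the docs above; the function itself is invented):

```python
# Hedged sketch: build request headers for either documented auth style.
def auth_headers(token: str, style: str = "custom") -> dict:
    """Return headers for the 'custom' (X-Sim-Secret) or 'bearer' auth style."""
    headers = {"Content-Type": "application/json"}
    if style == "custom":
        headers["X-Sim-Secret"] = token
    elif style == "bearer":
        headers["Authorization"] = f"Bearer {token}"
    else:
        raise ValueError(f"unknown auth style: {style}")
    return headers
```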
## Best Practices

1. **Define structure with an input format**: Define an input format when you know the expected schema. This provides:
   - Type validation
   - Better autocomplete in the editor
   - File upload support

2. **Authentication**: Always enable authentication on production webhooks to prevent unauthorized access.

3. **File size limits**: Keep files under 20MB. Use URL references for larger files.

4. **File expiration**: Downloaded file URLs expire after 5 minutes. Process them promptly, or store them elsewhere if you need them longer.

5. **Error handling**: Webhook processing is asynchronous. Check the execution logs for errors.

6. **Testing**: Use the "Test Webhook" button in the editor to verify your configuration before deploying.

## Use Cases

- **Form submissions**: Receive custom form data with file uploads
- **Third-party integrations**: Connect with services that send webhooks (Stripe, GitHub, etc.)
- **Document processing**: Accept documents from external systems for processing
- **Event notifications**: Receive event data from a variety of sources
- **Custom APIs**: Build custom API endpoints for your applications

## Notes

- Category: `triggers`
- Type: `generic_webhook`
- **File support**: Available via input format configuration
- **Maximum file size**: 20MB per file
@@ -123,8 +123,6 @@ import { BlockInfoCard } from "@/components/ui/block-info-card"
| --------- | ---- | ----------- |
| `balance` | number | Account balance (in cents) |
| `portfolioValue` | number | Portfolio value (in cents) |
| `balanceDollars` | number | Account balance (in dollars) |
| `portfolioValueDollars` | number | Portfolio value (in dollars) |

### `kalshi_get_positions`
@@ -50,8 +50,9 @@ Sim's Supabase integration makes it easy to connect your agent workflows to your Su
 | `projectId` | string | Yes | Your Supabase project ID (e.g. jdrkgepadsdopsntdlom) |
 | `table` | string | Yes | The Supabase table name to query |
+| `schema` | string | No | The database schema to query (default: public). Used to access tables under other schemas. |
 | `select` | string | No | Columns to return (comma-separated). Defaults to * (all columns) |
 | `filter` | string | No | PostgREST filter condition (e.g. "id=eq.123") |
-| `orderBy` | string | No | Column name to sort by (add DESC for descending) |
+| `orderBy` | string | No | Column to sort by (add DESC for descending) |
 | `limit` | number | No | Maximum number of rows to return |
 | `apiKey` | string | Yes | Your Supabase service role key |

@@ -94,7 +95,8 @@ Sim's Supabase integration makes it easy to connect your agent workflows to your Su
 | `projectId` | string | Yes | Your Supabase project ID (e.g. jdrkgepadsdopsntdlom) |
 | `table` | string | Yes | The Supabase table name to query |
+| `schema` | string | No | The database schema to query (default: public). Used to access tables under other schemas. |
-| `filter` | string | Yes | PostgREST filter condition for finding the specific row (e.g. "id=eq.123") |
 | `select` | string | No | Columns to return (comma-separated). Defaults to * (all columns) |
+| `filter` | string | Yes | PostgREST filter condition, used to find the specific row (e.g. "id=eq.123") |
 | `apiKey` | string | Yes | Your Supabase service role key |

 #### Output
@@ -15,7 +15,7 @@ Webhooks allow external services to trigger workf

 <div className="flex justify-center">
   <Image
-    src="/static/blocks/webhook.png"
+    src="/static/blocks/webhook-trigger.png"
     alt="Generic Webhook Configuration"
     width={500}
     height={400}
@@ -169,7 +169,7 @@ checksums:
 content/1: 9d1b6de2021f809cc43502d19a19bd15
 content/2: f4c40c45a45329eca670aca4fcece6f3
 content/3: b03a97486cc185beb7b51644b548875a
-content/4: a77222cf7a57362fc7eb5ebf7cc652c6
+content/4: 01c24bef59948dbecc1ae19794019d5f
 content/5: ba18ac99184b17d7e49bd1abdc814437
 content/6: 171c4e97e509427ca63acccf136779b3
 content/7: 98e1babdd0136267807b7e94ae7da6c7
@@ -700,7 +700,7 @@ checksums:
 content/11: 04bd9805ef6a50af8469463c34486dbf
 content/12: a3671dd7ba76a87dc75464d9bf9b7b4b
 content/13: 371d0e46b4bd2c23f559b8bc112f6955
-content/14: 80578981b8b3a1cf579e52ff05e7468d
+content/14: 5102b3705883f9e0c5440aeabafd1d24
 content/15: bcadfc362b69078beee0088e5936c98b
 content/16: 09ed43219d02501c829594dbf4128959
 content/17: 88ae2285d728c80937e1df8194d92c60
@@ -712,7 +712,7 @@ checksums:
 content/23: 7d96d99e45880195ccbd34bddaac6319
 content/24: 75d05f96dff406db06b338d9ab8d0bd7
 content/25: 371d0e46b4bd2c23f559b8bc112f6955
-content/26: cfd801fa517b4bcfa5fa034b2c4e908a
+content/26: 38373ac018fd7db3a20ba5308beac81e
 content/27: bcadfc362b69078beee0088e5936c98b
 content/28: a0284632eb0a15e66f69479ec477c5b1
 content/29: b1e60734e590a8ad894a96581a253bf4
@@ -48276,7 +48276,7 @@ checksums:
 content/35: 371d0e46b4bd2c23f559b8bc112f6955
 content/36: bddd30707802c07aac61620721bfaf16
 content/37: bcadfc362b69078beee0088e5936c98b
-content/38: fa2c581e6fb204f5ddbd0ffcbf0f7123
+content/38: 4619dad6a45478396332397f1e53db85
 content/39: 65de097e276f762b71d59fa7f9b0a207
 content/40: 013f52c249b5919fdb6d96700b25f379
 content/41: 371d0e46b4bd2c23f559b8bc112f6955
@@ -50197,3 +50197,31 @@ checksums:
 content/7: 7b29d23aec8fda839f3934c5fc71c6d3
 content/8: b3f310d5ef115bea5a8b75bf25d7ea9a
 content/9: 79ecd09a7bedc128285814d8b439ed40
+2bf1f583bd3a431e459e5a0142a82efd:
+  meta/title: 70f95b2c27f2c3840b500fcaf79ee83c
+  content/0: eb0ed7078f192304703144f4cac3442f
+  content/1: 1bc1f971556fb854666c22551215d3c2
+  content/2: 5127a30fba20289720806082df2eae87
+  content/3: 0441638444240cd20a6c69ea1d3afbb1
+  content/4: 0b5805c0201ed427ba1b56b9814ee0cb
+  content/5: cf5305db38e782a1001f5208cdf6b5f1
+  content/6: 575a2fc0f65f0d24a9d75fac8e8bf5f8
+  content/7: 1acea0b3685c12e5c3d73c7afa9c5582
+  content/8: 4464a6c6f5ccc67b95309ba6399552e9
+  content/9: 336794d9cf3e900c1b5aba0071944f1c
+  content/10: bf46b631598a496c37560e074454f5ec
+  content/11: 3d6a55b18007832eb2ed751638e968ca
+  content/12: 3f97586d23efe56c4ab94c03a0b91706
+  content/13: f2caee00e0e386a5e5257862209aaaef
+  content/14: 15c9ed641ef776a33a945b6e0ddb908c
+  content/15: db087c66ef8c0ab22775072b10655d05
+  content/16: e148c1c6e1345e9ee95657c5ba40ebf4
+  content/17: 9feca6cbb058fb8070b23d139d2d96e6
+  content/18: 987932038f4e9442bd89f0f8ed3c5319
+  content/19: 8e0258b3891544d355fa4a92f2ae96e4
+  content/20: 9c2f91f89a914bf4661512275e461104
+  content/21: a5cc8d50937a37d5ae7e921fc85a71f1
+  content/22: 51b2fdf484e8d6b07cdf8434034dc872
+  content/23: 59da7694b8be001fec8b9f9d7b604faf
+  content/24: 8fb6954068c6687d44121e21a95cf1b6
+  content/25: 9e7b1a1a453340d20adf4cacbd532018
BIN apps/docs/public/static/blocks/webhook-trigger.png (new file)
Binary file not shown. (After: 55 KiB)
Binary file not shown. (Before: 55 KiB, After: 38 KiB)
@@ -20,7 +20,7 @@ interface NavProps {
 }

 export default function Nav({ hideAuthButtons = false, variant = 'landing' }: NavProps = {}) {
-  const [githubStars, setGithubStars] = useState('24.4k')
+  const [githubStars, setGithubStars] = useState('25.1k')
   const [isHovered, setIsHovered] = useState(false)
   const [isLoginHovered, setIsLoginHovered] = useState(false)
   const router = useRouter()
@@ -136,16 +136,29 @@ vi.mock('@sim/db', () => {
       },
     }),
   }),
+  delete: () => ({
+    where: () => Promise.resolve(),
+  }),
+  insert: () => ({
+    values: (records: any) => {
+      dbOps.order.push('insert')
+      dbOps.insertRecords.push(records)
+      return Promise.resolve()
+    },
+  }),
   transaction: vi.fn(async (fn: any) => {
     await fn({
-      insert: (table: any) => ({
+      delete: () => ({
+        where: () => Promise.resolve(),
+      }),
+      insert: () => ({
         values: (records: any) => {
           dbOps.order.push('insert')
           dbOps.insertRecords.push(records)
           return Promise.resolve()
         },
       }),
-      update: (table: any) => ({
+      update: () => ({
         set: (payload: any) => ({
           where: () => {
             dbOps.updatePayloads.push(payload)
@@ -21,14 +21,15 @@ export async function POST(
 ) {
   const { workflowId, executionId, contextId } = await params

+  // Allow resume from dashboard without requiring deployment
   const access = await validateWorkflowAccess(request, workflowId, false)
   if (access.error) {
     return NextResponse.json({ error: access.error.message }, { status: access.error.status })
   }

-  const workflow = access.workflow!
+  const workflow = access.workflow

-  let payload: any = {}
+  let payload: Record<string, unknown> = {}
   try {
     payload = await request.json()
   } catch {
@@ -148,6 +149,7 @@ export async function GET(
 ) {
   const { workflowId, executionId, contextId } = await params

+  // Allow access without API key for browser-based UI (same as parent execution endpoint)
   const access = await validateWorkflowAccess(request, workflowId, false)
   if (access.error) {
     return NextResponse.json({ error: access.error.message }, { status: access.error.status })
@@ -14,10 +14,6 @@ import {
 } from '@/app/api/__test-utils__/utils'

 const {
-  hasProcessedMessageMock,
-  markMessageAsProcessedMock,
-  closeRedisConnectionMock,
-  acquireLockMock,
   generateRequestHashMock,
   validateSlackSignatureMock,
   handleWhatsAppVerificationMock,
@@ -28,10 +24,6 @@ const {
   processWebhookMock,
   executeMock,
 } = vi.hoisted(() => ({
-  hasProcessedMessageMock: vi.fn().mockResolvedValue(false),
-  markMessageAsProcessedMock: vi.fn().mockResolvedValue(true),
-  closeRedisConnectionMock: vi.fn().mockResolvedValue(undefined),
-  acquireLockMock: vi.fn().mockResolvedValue(true),
   generateRequestHashMock: vi.fn().mockResolvedValue('test-hash-123'),
   validateSlackSignatureMock: vi.fn().mockResolvedValue(true),
   handleWhatsAppVerificationMock: vi.fn().mockResolvedValue(null),
@@ -73,13 +65,6 @@ vi.mock('@/background/logs-webhook-delivery', () => ({
   logsWebhookDelivery: {},
 }))

-vi.mock('@/lib/redis', () => ({
-  hasProcessedMessage: hasProcessedMessageMock,
-  markMessageAsProcessed: markMessageAsProcessedMock,
-  closeRedisConnection: closeRedisConnectionMock,
-  acquireLock: acquireLockMock,
-}))
-
 vi.mock('@/lib/webhooks/utils', () => ({
   handleWhatsAppVerification: handleWhatsAppVerificationMock,
   handleSlackChallenge: handleSlackChallengeMock,
@@ -201,9 +186,6 @@ describe('Webhook Trigger API Route', () => {
       workspaceId: 'test-workspace-id',
     })

-    hasProcessedMessageMock.mockResolvedValue(false)
-    markMessageAsProcessedMock.mockResolvedValue(true)
-    acquireLockMock.mockResolvedValue(true)
     handleWhatsAppVerificationMock.mockResolvedValue(null)
     processGenericDeduplicationMock.mockResolvedValue(null)
     processWebhookMock.mockResolvedValue(new Response('Webhook processed', { status: 200 }))
@@ -117,7 +117,7 @@ export default function ChatClient({ identifier }: { identifier: string }) {
   const [error, setError] = useState<string | null>(null)
   const messagesEndRef = useRef<HTMLDivElement>(null)
   const messagesContainerRef = useRef<HTMLDivElement>(null)
-  const [starCount, setStarCount] = useState('24.4k')
+  const [starCount, setStarCount] = useState('25.1k')
   const [conversationId, setConversationId] = useState('')

   const [showScrollButton, setShowScrollButton] = useState(false)
(File diff suppressed because it is too large)
@@ -1,6 +1,12 @@
 'use client'

-import { Popover, PopoverAnchor, PopoverContent, PopoverItem } from '@/components/emcn'
+import {
+  Popover,
+  PopoverAnchor,
+  PopoverContent,
+  PopoverDivider,
+  PopoverItem,
+} from '@/components/emcn'

 interface ChunkContextMenuProps {
   isOpen: boolean
@@ -102,6 +108,7 @@ export function ChunkContextMenu({
       <PopoverContent ref={menuRef} align='start' side='bottom' sideOffset={4}>
         {hasChunk ? (
           <>
+            {/* Navigation */}
             {!isMultiSelect && onOpenInNewTab && (
               <PopoverItem
                 onClick={() => {
@@ -112,6 +119,9 @@ export function ChunkContextMenu({
                 Open in new tab
               </PopoverItem>
             )}
+            {!isMultiSelect && onOpenInNewTab && <PopoverDivider />}
+
+            {/* Edit and copy actions */}
             {!isMultiSelect && onEdit && (
               <PopoverItem
                 onClick={() => {
@@ -132,6 +142,9 @@ export function ChunkContextMenu({
                 Copy content
               </PopoverItem>
             )}
+            {!isMultiSelect && (onEdit || onCopyContent) && <PopoverDivider />}
+
+            {/* State toggle */}
             {onToggleEnabled && (
               <PopoverItem
                 disabled={disableToggleEnabled}
@@ -143,6 +156,13 @@ export function ChunkContextMenu({
                 {getToggleLabel()}
               </PopoverItem>
             )}
+
+            {/* Destructive action */}
+            {onDelete &&
+              ((!isMultiSelect && onOpenInNewTab) ||
+                (!isMultiSelect && onEdit) ||
+                (!isMultiSelect && onCopyContent) ||
+                onToggleEnabled) && <PopoverDivider />}
             {onDelete && (
               <PopoverItem
                 disabled={disableDelete}
@@ -453,6 +453,8 @@ export function KnowledgeBase({
     error: knowledgeBaseError,
     refresh: refreshKnowledgeBase,
   } = useKnowledgeBase(id)
+  const [hasProcessingDocuments, setHasProcessingDocuments] = useState(false)
+
   const {
     documents,
     pagination,
@@ -468,6 +470,7 @@ export function KnowledgeBase({
     offset: (currentPage - 1) * DOCUMENTS_PER_PAGE,
     sortBy,
     sortOrder,
+    refetchInterval: hasProcessingDocuments && !isDeleting ? 3000 : false,
   })

   const { tagDefinitions } = useKnowledgeBaseTagDefinitions(id)
@@ -534,25 +537,15 @@ export function KnowledgeBase({
   )

   useEffect(() => {
-    const hasProcessingDocuments = documents.some(
+    const processing = documents.some(
       (doc) => doc.processingStatus === 'pending' || doc.processingStatus === 'processing'
     )
+    setHasProcessingDocuments(processing)

-    if (!hasProcessingDocuments) return
-
-    const refreshInterval = setInterval(async () => {
-      try {
-        if (!isDeleting) {
-          await checkForDeadProcesses()
-          await refreshDocuments()
-        }
-      } catch (error) {
-        logger.error('Error refreshing documents:', error)
-      }
-    }, 3000)
-
-    return () => clearInterval(refreshInterval)
-  }, [documents, refreshDocuments, isDeleting])
+    if (processing) {
+      checkForDeadProcesses()
+    }
+  }, [documents])

   /**
    * Checks for documents with stale processing states and marks them as failed
@@ -672,25 +665,6 @@ export function KnowledgeBase({

       await refreshDocuments()

-      let refreshAttempts = 0
-      const maxRefreshAttempts = 3
-      const refreshInterval = setInterval(async () => {
-        try {
-          refreshAttempts++
-          await refreshDocuments()
-          if (refreshAttempts >= maxRefreshAttempts) {
-            clearInterval(refreshInterval)
-          }
-        } catch (error) {
-          logger.error('Error refreshing documents after retry:', error)
-          clearInterval(refreshInterval)
-        }
-      }, 1000)
-
-      setTimeout(() => {
-        clearInterval(refreshInterval)
-      }, 4000)
-
       logger.info(`Document retry initiated successfully for: ${docId}`)
     } catch (err) {
       logger.error('Error retrying document:', err)
@@ -1,6 +1,12 @@
 'use client'

-import { Popover, PopoverAnchor, PopoverContent, PopoverItem } from '@/components/emcn'
+import {
+  Popover,
+  PopoverAnchor,
+  PopoverContent,
+  PopoverDivider,
+  PopoverItem,
+} from '@/components/emcn'

 interface DocumentContextMenuProps {
   isOpen: boolean
@@ -107,6 +113,7 @@ export function DocumentContextMenu({
       <PopoverContent ref={menuRef} align='start' side='bottom' sideOffset={4}>
         {hasDocument ? (
           <>
+            {/* Navigation */}
             {!isMultiSelect && onOpenInNewTab && (
               <PopoverItem
                 onClick={() => {
@@ -117,6 +124,9 @@ export function DocumentContextMenu({
                 Open in new tab
               </PopoverItem>
             )}
+            {!isMultiSelect && onOpenInNewTab && <PopoverDivider />}
+
+            {/* Edit and view actions */}
             {!isMultiSelect && onRename && (
               <PopoverItem
                 onClick={() => {
@@ -137,6 +147,9 @@ export function DocumentContextMenu({
                 View tags
               </PopoverItem>
             )}
+            {!isMultiSelect && (onRename || (hasTags && onViewTags)) && <PopoverDivider />}
+
+            {/* State toggle */}
             {onToggleEnabled && (
               <PopoverItem
                 disabled={disableToggleEnabled}
@@ -148,6 +161,13 @@ export function DocumentContextMenu({
                 {getToggleLabel()}
               </PopoverItem>
             )}
+
+            {/* Destructive action */}
+            {onDelete &&
+              ((!isMultiSelect && onOpenInNewTab) ||
+                (!isMultiSelect && onRename) ||
+                (!isMultiSelect && hasTags && onViewTags) ||
+                onToggleEnabled) && <PopoverDivider />}
             {onDelete && (
               <PopoverItem
                 disabled={disableDelete}
@@ -1,6 +1,12 @@
 'use client'

-import { Popover, PopoverAnchor, PopoverContent, PopoverItem } from '@/components/emcn'
+import {
+  Popover,
+  PopoverAnchor,
+  PopoverContent,
+  PopoverDivider,
+  PopoverItem,
+} from '@/components/emcn'

 interface KnowledgeBaseContextMenuProps {
   /**
@@ -104,6 +110,7 @@ export function KnowledgeBaseContextMenu({
        }}
      />
      <PopoverContent ref={menuRef} align='start' side='bottom' sideOffset={4}>
+       {/* Navigation */}
        {showOpenInNewTab && onOpenInNewTab && (
          <PopoverItem
            onClick={() => {
@@ -114,6 +121,9 @@ export function KnowledgeBaseContextMenu({
            Open in new tab
          </PopoverItem>
        )}
+       {showOpenInNewTab && onOpenInNewTab && <PopoverDivider />}
+
+       {/* View and copy actions */}
        {showViewTags && onViewTags && (
          <PopoverItem
            onClick={() => {
@@ -134,6 +144,9 @@ export function KnowledgeBaseContextMenu({
            Copy ID
          </PopoverItem>
        )}
+       {((showViewTags && onViewTags) || onCopyId) && <PopoverDivider />}
+
+       {/* Edit action */}
        {showEdit && onEdit && (
          <PopoverItem
            disabled={disableEdit}
@@ -145,6 +158,14 @@ export function KnowledgeBaseContextMenu({
            Edit
          </PopoverItem>
        )}
+
+       {/* Destructive action */}
+       {showDelete &&
+         onDelete &&
+         ((showOpenInNewTab && onOpenInNewTab) ||
+           (showViewTags && onViewTags) ||
+           onCopyId ||
+           (showEdit && onEdit)) && <PopoverDivider />}
        {showDelete && onDelete && (
          <PopoverItem
            disabled={disableDelete}
@@ -164,7 +164,7 @@ function getBlockIconAndColor(
     return { icon: ParallelTool.icon, bgColor: ParallelTool.bgColor }
   }
   if (lowerType === 'workflow') {
-    return { icon: WorkflowIcon, bgColor: '#705335' }
+    return { icon: WorkflowIcon, bgColor: '#6366F1' }
   }

   // Look up from block registry (model maps to agent)
@@ -1,7 +1,13 @@
 'use client'

 import type { RefObject } from 'react'
-import { Popover, PopoverAnchor, PopoverContent, PopoverItem } from '@/components/emcn'
+import {
+  Popover,
+  PopoverAnchor,
+  PopoverContent,
+  PopoverDivider,
+  PopoverItem,
+} from '@/components/emcn'
 import type { WorkflowLog } from '@/stores/logs/filters/types'

 interface LogRowContextMenuProps {
@@ -50,7 +56,7 @@ export function LogRowContextMenu({
        }}
      />
      <PopoverContent ref={menuRef} align='start' side='bottom' sideOffset={4}>
-       {/* Copy Execution ID */}
+       {/* Copy action */}
        <PopoverItem
          disabled={!hasExecutionId}
          onClick={() => {
@@ -61,7 +67,8 @@ export function LogRowContextMenu({
          Copy Execution ID
        </PopoverItem>

-       {/* Open Workflow */}
+       {/* Navigation */}
+       <PopoverDivider />
        <PopoverItem
          disabled={!hasWorkflow}
          onClick={() => {
@@ -72,7 +79,8 @@ export function LogRowContextMenu({
          Open Workflow
        </PopoverItem>

-       {/* Filter by Workflow - only show when not already filtered by this workflow */}
+       {/* Filter actions */}
+       <PopoverDivider />
        {!isFilteredByThisWorkflow && (
          <PopoverItem
            disabled={!hasWorkflow}
@@ -84,8 +92,6 @@ export function LogRowContextMenu({
            Filter by Workflow
          </PopoverItem>
        )}
-
-       {/* Clear All Filters - show when any filters are active */}
        {hasActiveFilters && (
          <PopoverItem
            onClick={() => {
@@ -157,7 +157,7 @@ export function ChatMessage({ message }: ChatMessageProps) {

       {formattedContent && !formattedContent.startsWith('Uploaded') && (
         <div className='rounded-[4px] border border-[var(--border-1)] bg-[var(--surface-5)] px-[8px] py-[6px] transition-all duration-200'>
-          <div className='whitespace-pre-wrap break-words font-medium font-sans text-gray-100 text-sm leading-[1.25rem]'>
+          <div className='whitespace-pre-wrap break-words font-medium font-sans text-[var(--text-primary)] text-sm leading-[1.25rem]'>
             <WordWrap text={formattedContent} />
           </div>
         </div>
@@ -168,7 +168,7 @@ export function ChatMessage({ message }: ChatMessageProps) {

   return (
     <div className='w-full max-w-full overflow-hidden pl-[2px] opacity-100 transition-opacity duration-200'>
-      <div className='whitespace-pre-wrap break-words font-[470] font-season text-[#E8E8E8] text-sm leading-[1.25rem]'>
+      <div className='whitespace-pre-wrap break-words font-[470] font-season text-[var(--text-primary)] text-sm leading-[1.25rem]'>
         <WordWrap text={formattedContent} />
         {message.isStreaming && <StreamingIndicator />}
       </div>
@@ -1,4 +1,4 @@
-import { useCallback, useEffect, useMemo, useState } from 'react'
+import { useCallback, useEffect, useMemo, useRef, useState } from 'react'
 import { useReactFlow } from 'reactflow'
 import { Combobox, type ComboboxOption } from '@/components/emcn/components'
 import { cn } from '@/lib/core/utils/cn'
@@ -7,6 +7,9 @@ import { SubBlockInputController } from '@/app/workspace/[workspaceId]/w/[workfl
 import { useSubBlockValue } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/hooks/use-sub-block-value'
 import { useAccessibleReferencePrefixes } from '@/app/workspace/[workspaceId]/w/[workflowId]/hooks/use-accessible-reference-prefixes'
 import type { SubBlockConfig } from '@/blocks/types'
+import { getDependsOnFields } from '@/blocks/utils'
+import { useWorkflowRegistry } from '@/stores/workflows/registry/store'
+import { useSubBlockStore } from '@/stores/workflows/subblock/store'

 /**
  * Constants for ComboBox component behavior
@@ -48,6 +51,19 @@ interface ComboBoxProps {
   placeholder?: string
   /** Configuration for the sub-block */
   config: SubBlockConfig
+  /** Async function to fetch options dynamically */
+  fetchOptions?: (
+    blockId: string,
+    subBlockId: string
+  ) => Promise<Array<{ label: string; id: string }>>
+  /** Async function to fetch a single option's label by ID (for hydration) */
+  fetchOptionById?: (
+    blockId: string,
+    subBlockId: string,
+    optionId: string
+  ) => Promise<{ label: string; id: string } | null>
+  /** Field dependencies that trigger option refetch when changed */
+  dependsOn?: SubBlockConfig['dependsOn']
 }

 export function ComboBox({
@@ -61,23 +77,89 @@ export function ComboBox({
   disabled,
   placeholder = 'Type or select an option...',
   config,
+  fetchOptions,
+  fetchOptionById,
+  dependsOn,
 }: ComboBoxProps) {
   // Hooks and context
   const [storeValue, setStoreValue] = useSubBlockValue<string>(blockId, subBlockId)
   const accessiblePrefixes = useAccessibleReferencePrefixes(blockId)
   const reactFlowInstance = useReactFlow()

+  // Dependency tracking for fetchOptions
+  const dependsOnFields = useMemo(() => getDependsOnFields(dependsOn), [dependsOn])
+  const activeWorkflowId = useWorkflowRegistry((s) => s.activeWorkflowId)
+  const dependencyValues = useSubBlockStore(
+    useCallback(
+      (state) => {
+        if (dependsOnFields.length === 0 || !activeWorkflowId) return []
+        const workflowValues = state.workflowValues[activeWorkflowId] || {}
+        const blockValues = workflowValues[blockId] || {}
+        return dependsOnFields.map((depKey) => blockValues[depKey] ?? null)
+      },
+      [dependsOnFields, activeWorkflowId, blockId]
+    )
+  )
+
   // State management
   const [storeInitialized, setStoreInitialized] = useState(false)
+  const [fetchedOptions, setFetchedOptions] = useState<Array<{ label: string; id: string }>>([])
+  const [isLoadingOptions, setIsLoadingOptions] = useState(false)
+  const [fetchError, setFetchError] = useState<string | null>(null)
+  const [hydratedOption, setHydratedOption] = useState<{ label: string; id: string } | null>(null)
+  const previousDependencyValuesRef = useRef<string>('')
+
+  /**
+   * Fetches options from the async fetchOptions function if provided
+   */
+  const fetchOptionsIfNeeded = useCallback(async () => {
+    if (!fetchOptions || isPreview || disabled) return
+
+    setIsLoadingOptions(true)
+    setFetchError(null)
+    try {
+      const options = await fetchOptions(blockId, subBlockId)
+      setFetchedOptions(options)
+    } catch (error) {
+      const errorMessage = error instanceof Error ? error.message : 'Failed to fetch options'
+      setFetchError(errorMessage)
+      setFetchedOptions([])
+    } finally {
+      setIsLoadingOptions(false)
+    }
+  }, [fetchOptions, blockId, subBlockId, isPreview, disabled])
+
   // Determine the active value based on mode (preview vs. controlled vs. store)
   const value = isPreview ? previewValue : propValue !== undefined ? propValue : storeValue

-  // Evaluate options if provided as a function
-  const evaluatedOptions = useMemo(() => {
+  // Evaluate static options if provided as a function
+  const staticOptions = useMemo(() => {
     return typeof options === 'function' ? options() : options
   }, [options])

+  // Normalize fetched options to match ComboBoxOption format
+  const normalizedFetchedOptions = useMemo((): ComboBoxOption[] => {
+    return fetchedOptions.map((opt) => ({ label: opt.label, id: opt.id }))
+  }, [fetchedOptions])
+
+  // Merge static and fetched options - fetched options take priority when available
+  const evaluatedOptions = useMemo((): ComboBoxOption[] => {
+    let opts: ComboBoxOption[] =
+      fetchOptions && normalizedFetchedOptions.length > 0 ? normalizedFetchedOptions : staticOptions
+
+    // Merge hydrated option if not already present
+    if (hydratedOption) {
+      const alreadyPresent = opts.some((o) =>
+        typeof o === 'string' ? o === hydratedOption.id : o.id === hydratedOption.id
+      )
+      if (!alreadyPresent) {
+        opts = [hydratedOption, ...opts]
+      }
+    }
+
+    return opts
+  }, [fetchOptions, normalizedFetchedOptions, staticOptions, hydratedOption])
+
   // Convert options to Combobox format
   const comboboxOptions = useMemo((): ComboboxOption[] => {
     return evaluatedOptions.map((option) => {
@@ -160,6 +242,94 @@ export function ComboBox({
     }
   }, [storeInitialized, value, defaultOptionValue, setStoreValue])

+  // Clear fetched options and hydrated option when dependencies change
+  useEffect(() => {
+    if (fetchOptions && dependsOnFields.length > 0) {
+      const currentDependencyValuesStr = JSON.stringify(dependencyValues)
+      const previousDependencyValuesStr = previousDependencyValuesRef.current
+
+      if (
+        previousDependencyValuesStr &&
+        currentDependencyValuesStr !== previousDependencyValuesStr
+      ) {
+        setFetchedOptions([])
+        setHydratedOption(null)
+      }
+
+      previousDependencyValuesRef.current = currentDependencyValuesStr
+    }
+  }, [dependencyValues, fetchOptions, dependsOnFields.length])
+
+  // Fetch options when needed (on mount, when enabled, or when dependencies change)
+  useEffect(() => {
+    if (
+      fetchOptions &&
+      !isPreview &&
+      !disabled &&
+      fetchedOptions.length === 0 &&
+      !isLoadingOptions &&
+      !fetchError
+    ) {
+      fetchOptionsIfNeeded()
+    }
+    // eslint-disable-next-line react-hooks/exhaustive-deps -- fetchOptionsIfNeeded deps already covered above
+  }, [
+    fetchOptions,
+    isPreview,
+    disabled,
+    fetchedOptions.length,
+    isLoadingOptions,
+    fetchError,
+    dependencyValues,
+  ])
+
+  // Hydrate the stored value's label by fetching it individually
+  useEffect(() => {
+    if (!fetchOptionById || isPreview || disabled) return
+
+    const valueToHydrate = value as string | null | undefined
+    if (!valueToHydrate) return
+
+    // Skip if value is an expression (not a real ID)
+    if (valueToHydrate.startsWith('<') || valueToHydrate.includes('{{')) return
+
+    // Skip if already hydrated with the same value
+    if (hydratedOption?.id === valueToHydrate) return
+
+    // Skip if value is already in fetched options or static options
+    const alreadyInFetchedOptions = fetchedOptions.some((opt) => opt.id === valueToHydrate)
+    const alreadyInStaticOptions = staticOptions.some((opt) =>
+      typeof opt === 'string' ? opt === valueToHydrate : opt.id === valueToHydrate
+    )
+    if (alreadyInFetchedOptions || alreadyInStaticOptions) return
+
+    // Track if effect is still active (cleanup on unmount or value change)
+    let isActive = true
+
+    // Fetch the hydrated option
+    fetchOptionById(blockId, subBlockId, valueToHydrate)
+      .then((option) => {
+        if (isActive) setHydratedOption(option)
+      })
+      .catch(() => {
+        if (isActive) setHydratedOption(null)
+      })
+
+    return () => {
+      isActive = false
+    }
+  }, [
+    fetchOptionById,
+    value,
+    blockId,
+    subBlockId,
+    isPreview,
+    disabled,
+    fetchedOptions,
+    staticOptions,
+    hydratedOption?.id,
+  ])
+
   /**
    * Handles wheel event for ReactFlow zoom control
    * Intercepts Ctrl/Cmd+Wheel to zoom the canvas
|
||||
@@ -247,11 +417,13 @@ export function ComboBox({
|
||||
return option.id === newValue
|
||||
})
|
||||
|
||||
if (!matchedOption) {
|
||||
return
|
||||
}
|
||||
|
||||
const nextValue = typeof matchedOption === 'string' ? matchedOption : matchedOption.id
|
||||
// If a matching option is found, store its ID; otherwise store the raw value
|
||||
// (allows expressions like <block.output> to be entered directly)
|
||||
const nextValue = matchedOption
|
||||
? typeof matchedOption === 'string'
|
||||
? matchedOption
|
||||
: matchedOption.id
|
||||
: newValue
|
||||
setStoreValue(nextValue)
|
||||
}}
|
||||
isPreview={isPreview}
|
||||
@@ -293,6 +465,13 @@ export function ComboBox({
|
||||
onWheel: handleWheel,
|
||||
autoComplete: 'off',
|
||||
}}
|
||||
isLoading={isLoadingOptions}
|
||||
error={fetchError}
|
||||
onOpenChange={(open) => {
|
||||
if (open) {
|
||||
void fetchOptionsIfNeeded()
|
||||
}
|
||||
}}
|
||||
/>
|
||||
)}
|
||||
</SubBlockInputController>
|
||||
|
||||
@@ -12,6 +12,7 @@ import {
getCodeEditorProps,
highlight,
languages,
Textarea,
Tooltip,
} from '@/components/emcn'
import { Trash } from '@/components/emcn/icons/trash'
@@ -74,6 +75,8 @@ interface ConditionInputProps {
previewValue?: string | null
/** Whether the component is disabled */
disabled?: boolean
/** Mode: 'condition' for code editor, 'router' for text input */
mode?: 'condition' | 'router'
}

/**
@@ -101,7 +104,9 @@ export function ConditionInput({
isPreview = false,
previewValue,
disabled = false,
mode = 'condition',
}: ConditionInputProps) {
const isRouterMode = mode === 'router'
const params = useParams()
const workspaceId = params.workspaceId as string
const [storeValue, setStoreValue] = useSubBlockValue(blockId, subBlockId)
@@ -161,32 +166,50 @@ export function ConditionInput({
const shouldPersistRef = useRef<boolean>(false)

/**
* Creates default if/else conditional blocks with stable IDs.
* Creates default blocks with stable IDs.
* For conditions: if/else blocks. For router: one route block.
*
* @returns Array of two default blocks (if and else)
* @returns Array of default blocks
*/
const createDefaultBlocks = (): ConditionalBlock[] => [
{
id: generateStableId(blockId, 'if'),
title: 'if',
value: '',
showTags: false,
showEnvVars: false,
searchTerm: '',
cursorPosition: 0,
activeSourceBlockId: null,
},
{
id: generateStableId(blockId, 'else'),
title: 'else',
value: '',
showTags: false,
showEnvVars: false,
searchTerm: '',
cursorPosition: 0,
activeSourceBlockId: null,
},
]
const createDefaultBlocks = (): ConditionalBlock[] => {
if (isRouterMode) {
return [
{
id: generateStableId(blockId, 'route1'),
title: 'route1',
value: '',
showTags: false,
showEnvVars: false,
searchTerm: '',
cursorPosition: 0,
activeSourceBlockId: null,
},
]
}

return [
{
id: generateStableId(blockId, 'if'),
title: 'if',
value: '',
showTags: false,
showEnvVars: false,
searchTerm: '',
cursorPosition: 0,
activeSourceBlockId: null,
},
{
id: generateStableId(blockId, 'else'),
title: 'else',
value: '',
showTags: false,
showEnvVars: false,
searchTerm: '',
cursorPosition: 0,
activeSourceBlockId: null,
},
]
}

// Initialize with a loading state instead of default blocks
const [conditionalBlocks, setConditionalBlocks] = useState<ConditionalBlock[]>([])
@@ -270,10 +293,13 @@ export function ConditionInput({
const parsedBlocks = safeParseJSON(effectiveValueStr)

if (parsedBlocks) {
const blocksWithCorrectTitles = parsedBlocks.map((block, index) => ({
...block,
title: index === 0 ? 'if' : index === parsedBlocks.length - 1 ? 'else' : 'else if',
}))
// For router mode, keep original titles. For condition mode, assign if/else if/else
const blocksWithCorrectTitles = isRouterMode
? parsedBlocks
: parsedBlocks.map((block, index) => ({
...block,
title: index === 0 ? 'if' : index === parsedBlocks.length - 1 ? 'else' : 'else if',
}))

setConditionalBlocks(blocksWithCorrectTitles)
hasInitializedRef.current = true
@@ -573,12 +599,17 @@ export function ConditionInput({

/**
* Updates block titles based on their position in the array.
* First block is always 'if', last is 'else', middle ones are 'else if'.
* For conditions: First block is 'if', last is 'else', middle ones are 'else if'.
* For router: Titles are user-editable and not auto-updated.
*
* @param blocks - Array of conditional blocks
* @returns Updated blocks with correct titles
*/
const updateBlockTitles = (blocks: ConditionalBlock[]): ConditionalBlock[] => {
if (isRouterMode) {
// For router mode, don't change titles - they're user-editable
return blocks
}
return blocks.map((block, index) => ({
...block,
title: index === 0 ? 'if' : index === blocks.length - 1 ? 'else' : 'else if',
@@ -590,13 +621,15 @@ export function ConditionInput({
if (isPreview || disabled) return

const blockIndex = conditionalBlocks.findIndex((block) => block.id === afterId)
if (conditionalBlocks[blockIndex]?.title === 'else') return
if (!isRouterMode && conditionalBlocks[blockIndex]?.title === 'else') return

const newBlockId = generateStableId(blockId, `else-if-${Date.now()}`)
const newBlockId = isRouterMode
? generateStableId(blockId, `route-${Date.now()}`)
: generateStableId(blockId, `else-if-${Date.now()}`)

const newBlock: ConditionalBlock = {
id: newBlockId,
title: '',
title: isRouterMode ? `route-${Date.now()}` : '',
value: '',
showTags: false,
showEnvVars: false,
@@ -710,13 +743,15 @@ export function ConditionInput({
<div
className={cn(
'flex items-center justify-between overflow-hidden bg-transparent px-[10px] py-[5px]',
block.title === 'else'
? 'rounded-[4px] border-0'
: 'rounded-t-[4px] border-[var(--border-1)] border-b'
isRouterMode
? 'rounded-t-[4px] border-[var(--border-1)] border-b'
: block.title === 'else'
? 'rounded-[4px] border-0'
: 'rounded-t-[4px] border-[var(--border-1)] border-b'
)}
>
<span className='font-medium text-[14px] text-[var(--text-tertiary)]'>
{block.title}
{isRouterMode ? `Route ${index + 1}` : block.title}
</span>
<div className='flex items-center gap-[8px]'>
<Tooltip.Root>
@@ -724,7 +759,7 @@ export function ConditionInput({
<Button
variant='ghost'
onClick={() => addBlock(block.id)}
disabled={isPreview || disabled || block.title === 'else'}
disabled={isPreview || disabled || (!isRouterMode && block.title === 'else')}
className='h-auto p-0'
>
<Plus className='h-[14px] w-[14px]' />
@@ -739,7 +774,12 @@ export function ConditionInput({
<Button
variant='ghost'
onClick={() => moveBlock(block.id, 'up')}
disabled={isPreview || index === 0 || disabled || block.title === 'else'}
disabled={
isPreview ||
index === 0 ||
disabled ||
(!isRouterMode && block.title === 'else')
}
className='h-auto p-0'
>
<ChevronUp className='h-[14px] w-[14px]' />
@@ -758,8 +798,8 @@ export function ConditionInput({
isPreview ||
disabled ||
index === conditionalBlocks.length - 1 ||
conditionalBlocks[index + 1]?.title === 'else' ||
block.title === 'else'
(!isRouterMode && conditionalBlocks[index + 1]?.title === 'else') ||
(!isRouterMode && block.title === 'else')
}
className='h-auto p-0'
>
@@ -775,18 +815,122 @@ export function ConditionInput({
<Button
variant='ghost'
onClick={() => removeBlock(block.id)}
disabled={isPreview || conditionalBlocks.length === 1 || disabled}
disabled={isPreview || disabled || conditionalBlocks.length === 1}
className='h-auto p-0 text-[var(--text-error)] hover:text-[var(--text-error)]'
>
<Trash className='h-[14px] w-[14px]' />
<span className='sr-only'>Delete Block</span>
</Button>
</Tooltip.Trigger>
<Tooltip.Content>Delete Condition</Tooltip.Content>
<Tooltip.Content>
{isRouterMode ? 'Delete Route' : 'Delete Condition'}
</Tooltip.Content>
</Tooltip.Root>
</div>
</div>
{block.title !== 'else' &&
{/* Router mode: show description textarea with tag/env var support */}
{isRouterMode && (
<div
className='relative'
onDragOver={(e) => e.preventDefault()}
onDrop={(e) => handleDrop(block.id, e)}
>
<Textarea
data-router-block-id={block.id}
value={block.value}
onChange={(e) => {
if (!isPreview && !disabled) {
const newValue = e.target.value
const pos = e.target.selectionStart ?? 0

const tagTrigger = checkTagTrigger(newValue, pos)
const envVarTrigger = checkEnvVarTrigger(newValue, pos)

shouldPersistRef.current = true
setConditionalBlocks((blocks) =>
blocks.map((b) =>
b.id === block.id
? {
...b,
value: newValue,
showTags: tagTrigger.show,
showEnvVars: envVarTrigger.show,
searchTerm: envVarTrigger.show ? envVarTrigger.searchTerm : '',
cursorPosition: pos,
}
: b
)
)
}
}}
onBlur={() => {
setTimeout(() => {
setConditionalBlocks((blocks) =>
blocks.map((b) =>
b.id === block.id ? { ...b, showTags: false, showEnvVars: false } : b
)
)
}, 150)
}}
placeholder='Describe when this route should be taken...'
disabled={disabled || isPreview}
className='min-h-[60px] resize-none rounded-none border-0 px-3 py-2 text-sm placeholder:text-muted-foreground/50 focus-visible:ring-0 focus-visible:ring-offset-0'
rows={2}
/>

{block.showEnvVars && (
<EnvVarDropdown
visible={block.showEnvVars}
onSelect={(newValue) => handleEnvVarSelectImmediate(block.id, newValue)}
searchTerm={block.searchTerm}
inputValue={block.value}
cursorPosition={block.cursorPosition}
workspaceId={workspaceId}
onClose={() => {
setConditionalBlocks((blocks) =>
blocks.map((b) =>
b.id === block.id
? {
...b,
showEnvVars: false,
searchTerm: '',
}
: b
)
)
}}
/>
)}

{block.showTags && (
<TagDropdown
visible={block.showTags}
onSelect={(newValue) => handleTagSelectImmediate(block.id, newValue)}
blockId={blockId}
activeSourceBlockId={block.activeSourceBlockId}
inputValue={block.value}
cursorPosition={block.cursorPosition}
onClose={() => {
setConditionalBlocks((blocks) =>
blocks.map((b) =>
b.id === block.id
? {
...b,
showTags: false,
activeSourceBlockId: null,
}
: b
)
)
}}
/>
)}
</div>
)}

{/* Condition mode: show code editor */}
{!isRouterMode &&
block.title !== 'else' &&
(() => {
const blockLineCount = block.value.split('\n').length
const blockGutterWidth = calculateGutterWidth(blockLineCount)

@@ -44,6 +44,12 @@ interface DropdownProps {
blockId: string,
subBlockId: string
) => Promise<Array<{ label: string; id: string }>>
/** Async function to fetch a single option's label by ID (for hydration) */
fetchOptionById?: (
blockId: string,
subBlockId: string,
optionId: string
) => Promise<{ label: string; id: string } | null>
/** Field dependencies that trigger option refetch when changed */
dependsOn?: SubBlockConfig['dependsOn']
/** Enable search input in dropdown */
@@ -71,6 +77,7 @@ export function Dropdown({
placeholder = 'Select an option...',
multiSelect = false,
fetchOptions,
fetchOptionById,
dependsOn,
searchable = false,
}: DropdownProps) {
@@ -98,6 +105,7 @@ export function Dropdown({
const [fetchedOptions, setFetchedOptions] = useState<Array<{ label: string; id: string }>>([])
const [isLoadingOptions, setIsLoadingOptions] = useState(false)
const [fetchError, setFetchError] = useState<string | null>(null)
const [hydratedOption, setHydratedOption] = useState<{ label: string; id: string } | null>(null)

const previousModeRef = useRef<string | null>(null)
const previousDependencyValuesRef = useRef<string>('')
@@ -150,11 +158,23 @@ export function Dropdown({
}, [fetchedOptions])

const availableOptions = useMemo(() => {
if (fetchOptions && normalizedFetchedOptions.length > 0) {
return normalizedFetchedOptions
let opts: DropdownOption[] =
fetchOptions && normalizedFetchedOptions.length > 0
? normalizedFetchedOptions
: evaluatedOptions

// Merge hydrated option if not already present
if (hydratedOption) {
const alreadyPresent = opts.some((o) =>
typeof o === 'string' ? o === hydratedOption.id : o.id === hydratedOption.id
)
if (!alreadyPresent) {
opts = [hydratedOption, ...opts]
}
}
return evaluatedOptions
}, [fetchOptions, normalizedFetchedOptions, evaluatedOptions])

return opts
}, [fetchOptions, normalizedFetchedOptions, evaluatedOptions, hydratedOption])

/**
* Convert dropdown options to Combobox format
@@ -310,7 +330,7 @@ export function Dropdown({
)

/**
* Effect to clear fetched options when dependencies actually change
* Effect to clear fetched options and hydrated option when dependencies actually change
* This ensures options are refetched with new dependency values (e.g., new credentials)
*/
useEffect(() => {
@@ -323,6 +343,7 @@ export function Dropdown({
currentDependencyValuesStr !== previousDependencyValuesStr
) {
setFetchedOptions([])
setHydratedOption(null)
}

previousDependencyValuesRef.current = currentDependencyValuesStr
@@ -338,18 +359,72 @@ export function Dropdown({
!isPreview &&
!disabled &&
fetchedOptions.length === 0 &&
!isLoadingOptions
!isLoadingOptions &&
!fetchError
) {
fetchOptionsIfNeeded()
}
// eslint-disable-next-line react-hooks/exhaustive-deps -- fetchOptionsIfNeeded deps already covered above
}, [
fetchOptions,
isPreview,
disabled,
fetchedOptions.length,
isLoadingOptions,
fetchOptionsIfNeeded,
dependencyValues, // Refetch when dependencies change
fetchError,
dependencyValues,
])

/**
* Effect to hydrate the stored value's label by fetching it individually
* This ensures the correct label is shown before the full options list loads
*/
useEffect(() => {
if (!fetchOptionById || isPreview || disabled) return

// Get the value to hydrate (single value only, not multi-select)
const valueToHydrate = multiSelect ? null : (singleValue as string | null | undefined)
if (!valueToHydrate) return

// Skip if value is an expression (not a real ID)
if (valueToHydrate.startsWith('<') || valueToHydrate.includes('{{')) return

// Skip if already hydrated with the same value
if (hydratedOption?.id === valueToHydrate) return

// Skip if value is already in fetched options or static options
const alreadyInFetchedOptions = fetchedOptions.some((opt) => opt.id === valueToHydrate)
const alreadyInStaticOptions = evaluatedOptions.some((opt) =>
typeof opt === 'string' ? opt === valueToHydrate : opt.id === valueToHydrate
)
if (alreadyInFetchedOptions || alreadyInStaticOptions) return

// Track if effect is still active (cleanup on unmount or value change)
let isActive = true

// Fetch the hydrated option
fetchOptionById(blockId, subBlockId, valueToHydrate)
.then((option) => {
if (isActive) setHydratedOption(option)
})
.catch(() => {
if (isActive) setHydratedOption(null)
})

return () => {
isActive = false
}
}, [
fetchOptionById,
singleValue,
multiSelect,
blockId,
subBlockId,
isPreview,
disabled,
fetchedOptions,
evaluatedOptions,
hydratedOption?.id,
])

/**

@@ -256,24 +256,13 @@ export function InputMapping({

if (!selectedWorkflowId) {
return (
<div className='flex flex-col items-center justify-center rounded-[4px] border border-[var(--border-1)] bg-[var(--surface-3)] p-8 text-center dark:bg-[#1F1F1F]'>
<svg
className='mb-3 h-10 w-10 text-[var(--text-tertiary)]'
fill='none'
viewBox='0 0 24 24'
stroke='currentColor'
>
<path
strokeLinecap='round'
strokeLinejoin='round'
strokeWidth={1.5}
d='M13 10V3L4 14h7v7l9-11h-7z'
/>
</svg>
<p className='font-medium text-[var(--text-tertiary)] text-sm'>No workflow selected</p>
<p className='mt-1 text-[var(--text-tertiary)]/80 text-xs'>
Select a workflow above to configure inputs
</p>
<div className='flex h-32 items-center justify-center rounded-[4px] border border-[var(--border-1)] border-dashed bg-[var(--surface-3)] dark:bg-[#1F1F1F]'>
<div className='text-center'>
<p className='font-medium text-[var(--text-secondary)] text-sm'>No workflow selected</p>
<p className='mt-1 text-[var(--text-muted)] text-xs'>
Select a workflow above to configure inputs
</p>
</div>
</div>
)
}

@@ -95,7 +95,9 @@ export function FieldFormat({
}: FieldFormatProps) {
const [storeValue, setStoreValue] = useSubBlockValue<Field[]>(blockId, subBlockId)
const valueInputRefs = useRef<Record<string, HTMLInputElement | HTMLTextAreaElement>>({})
const nameInputRefs = useRef<Record<string, HTMLInputElement>>({})
const overlayRefs = useRef<Record<string, HTMLDivElement>>({})
const nameOverlayRefs = useRef<Record<string, HTMLDivElement>>({})
const accessiblePrefixes = useAccessibleReferencePrefixes(blockId)

const inputController = useSubBlockInput({
@@ -158,6 +160,97 @@ export function FieldFormat({
if (overlay) overlay.scrollLeft = scrollLeft
}

/**
* Syncs scroll position between name input and overlay for text highlighting
*/
const syncNameOverlayScroll = (fieldId: string, scrollLeft: number) => {
const overlay = nameOverlayRefs.current[fieldId]
if (overlay) overlay.scrollLeft = scrollLeft
}

/**
* Generates a unique field key for name inputs to avoid collision with value inputs
*/
const getNameFieldKey = (fieldId: string) => `name-${fieldId}`

/**
* Renders the name input field with tag dropdown support
*/
const renderNameInput = (field: Field) => {
const nameFieldKey = getNameFieldKey(field.id)
const fieldValue = field.name ?? ''
const fieldState = inputController.fieldHelpers.getFieldState(nameFieldKey)
const handlers = inputController.fieldHelpers.createFieldHandlers(
nameFieldKey,
fieldValue,
(newValue) => updateField(field.id, 'name', newValue)
)
const tagSelectHandler = inputController.fieldHelpers.createTagSelectHandler(
nameFieldKey,
fieldValue,
(newValue) => updateField(field.id, 'name', newValue)
)

const inputClassName = cn('text-transparent caret-foreground')

return (
<>
<Input
ref={(el) => {
if (el) nameInputRefs.current[field.id] = el
}}
name='name'
value={fieldValue}
onChange={handlers.onChange}
onKeyDown={handlers.onKeyDown}
onDrop={handlers.onDrop}
onDragOver={handlers.onDragOver}
onScroll={(e) => syncNameOverlayScroll(field.id, e.currentTarget.scrollLeft)}
onPaste={() =>
setTimeout(() => {
const input = nameInputRefs.current[field.id]
input && syncNameOverlayScroll(field.id, input.scrollLeft)
}, 0)
}
placeholder={placeholder}
disabled={isReadOnly}
autoComplete='off'
className={cn('allow-scroll w-full overflow-auto', inputClassName)}
style={{ overflowX: 'auto' }}
/>
<div
ref={(el) => {
if (el) nameOverlayRefs.current[field.id] = el
}}
className='pointer-events-none absolute inset-0 flex items-center overflow-x-auto bg-transparent px-[8px] py-[6px] font-medium font-sans text-sm'
style={{ overflowX: 'auto' }}
>
<div
className='w-full whitespace-pre'
style={{ scrollbarWidth: 'none', minWidth: 'fit-content' }}
>
{formatDisplayText(
fieldValue,
accessiblePrefixes ? { accessiblePrefixes } : { highlightAll: true }
)}
</div>
</div>
{fieldState.showTags && (
<TagDropdown
visible={fieldState.showTags}
onSelect={tagSelectHandler}
blockId={blockId}
activeSourceBlockId={fieldState.activeSourceBlockId}
inputValue={fieldValue}
cursorPosition={fieldState.cursorPosition}
onClose={() => inputController.fieldHelpers.hideFieldDropdowns(nameFieldKey)}
inputRef={{ current: nameInputRefs.current[field.id] || null }}
/>
)}
</>
)
}

/**
* Renders the field header with name, type badge, and action buttons
*/
@@ -417,14 +510,7 @@ export function FieldFormat({
<div className='flex flex-col gap-[8px] border-[var(--border-1)] border-t px-[10px] pt-[6px] pb-[10px]'>
<div className='flex flex-col gap-[6px]'>
<Label className='text-[13px]'>Name</Label>
<Input
name='name'
value={field.name}
onChange={(e) => updateField(field.id, 'name', e.target.value)}
placeholder={placeholder}
disabled={isReadOnly}
autoComplete='off'
/>
<div className='relative'>{renderNameInput(field)}</div>
</div>

{showType && (

@@ -755,6 +755,24 @@ export const TagDropdown: React.FC<TagDropdownProps> = ({
const allTags = outputPaths.map((path) => `${normalizedBlockName}.${path}`)
blockTags = isSelfReference ? allTags.filter((tag) => tag.endsWith('.url')) : allTags
}
} else if (sourceBlock.type === 'human_in_the_loop') {
const dynamicOutputs = getBlockOutputPaths(sourceBlock.type, mergedSubBlocks)

const isSelfReference = activeSourceBlockId === blockId

if (dynamicOutputs.length > 0) {
const allTags = dynamicOutputs.map((path) => `${normalizedBlockName}.${path}`)
// For self-reference, only show url and resumeEndpoint (not response format fields)
blockTags = isSelfReference
? allTags.filter((tag) => tag.endsWith('.url') || tag.endsWith('.resumeEndpoint'))
: allTags
} else {
const outputPaths = getBlockOutputPaths(sourceBlock.type, mergedSubBlocks)
const allTags = outputPaths.map((path) => `${normalizedBlockName}.${path}`)
blockTags = isSelfReference
? allTags.filter((tag) => tag.endsWith('.url') || tag.endsWith('.resumeEndpoint'))
: allTags
}
} else {
const operationValue =
mergedSubBlocks?.operation?.value ?? getSubBlockValue(activeSourceBlockId, 'operation')
@@ -1074,7 +1092,19 @@ export const TagDropdown: React.FC<TagDropdownProps> = ({
blockTags = isSelfReference ? allTags.filter((tag) => tag.endsWith('.url')) : allTags
}
} else if (accessibleBlock.type === 'human_in_the_loop') {
blockTags = [`${normalizedBlockName}.url`]
const dynamicOutputs = getBlockOutputPaths(accessibleBlock.type, mergedSubBlocks)

const isSelfReference = accessibleBlockId === blockId

if (dynamicOutputs.length > 0) {
const allTags = dynamicOutputs.map((path) => `${normalizedBlockName}.${path}`)
// For self-reference, only show url and resumeEndpoint (not response format fields)
blockTags = isSelfReference
? allTags.filter((tag) => tag.endsWith('.url') || tag.endsWith('.resumeEndpoint'))
: allTags
} else {
blockTags = [`${normalizedBlockName}.url`, `${normalizedBlockName}.resumeEndpoint`]
}
} else {
const operationValue =
mergedSubBlocks?.operation?.value ?? getSubBlockValue(accessibleBlockId, 'operation')

@@ -50,6 +50,7 @@ import {
} from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/components/tool-input/components/custom-tool-modal/custom-tool-modal'
import { ToolCredentialSelector } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/components/tool-input/components/tool-credential-selector'
import { useSubBlockValue } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/hooks/use-sub-block-value'
import { useChildDeployment } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/workflow-block/hooks/use-child-deployment'
import { getAllBlocks } from '@/blocks'
import {
type CustomTool as CustomToolDefinition,
@@ -582,6 +583,8 @@ function WorkflowSelectorSyncWrapper({
onChange={onChange}
placeholder={uiComponent.placeholder || 'Select workflow'}
disabled={disabled || isLoading}
searchable
searchPlaceholder='Search workflows...'
/>
)
}
@@ -752,6 +755,81 @@ function CodeEditorSyncWrapper({
)
}

/**
* Badge component showing deployment status for workflow tools
*/
function WorkflowToolDeployBadge({
workflowId,
onDeploySuccess,
}: {
workflowId: string
onDeploySuccess?: () => void
}) {
const { isDeployed, needsRedeploy, isLoading, refetch } = useChildDeployment(workflowId)
const [isDeploying, setIsDeploying] = useState(false)

const deployWorkflow = useCallback(async () => {
if (isDeploying || !workflowId) return

try {
setIsDeploying(true)
const response = await fetch(`/api/workflows/${workflowId}/deploy`, {
method: 'POST',
headers: {
'Content-Type': 'application/json',
},
body: JSON.stringify({
deployChatEnabled: false,
}),
})

if (response.ok) {
refetch()
onDeploySuccess?.()
} else {
logger.error('Failed to deploy workflow')
}
} catch (error) {
logger.error('Error deploying workflow:', error)
} finally {
setIsDeploying(false)
}
}, [isDeploying, workflowId, refetch, onDeploySuccess])

if (isLoading || (isDeployed && !needsRedeploy)) {
return null
}

if (typeof isDeployed !== 'boolean') {
return null
}

return (
<Tooltip.Root>
<Tooltip.Trigger asChild>
<Badge
variant={!isDeployed ? 'red' : 'amber'}
className='cursor-pointer'
size='sm'
dot
onClick={(e: React.MouseEvent) => {
e.stopPropagation()
e.preventDefault()
if (!isDeploying) {
deployWorkflow()
}
}}
>
{isDeploying ? 'Deploying...' : !isDeployed ? 'undeployed' : 'redeploy'}
</Badge>
</Tooltip.Trigger>
<Tooltip.Content>
<span className='text-sm'>{!isDeployed ? 'Click to deploy' : 'Click to redeploy'}</span>
</Tooltip.Content>
</Tooltip.Root>
)
}

/**
* Set of built-in tool types that are core platform tools.
*
@@ -760,6 +838,7 @@ function CodeEditorSyncWrapper({
* in the tool selection dropdown.
*/
const BUILT_IN_TOOL_TYPES = new Set([
'api',
'file',
'function',
'knowledge',
@@ -772,6 +851,7 @@ const BUILT_IN_TOOL_TYPES = new Set([
'tts',
'stt',
'memory',
'webhook_request',
'workflow',
])

@@ -926,6 +1006,8 @@ export function ToolInput({
const toolBlocks = getAllBlocks().filter(
(block) =>
(block.category === 'tools' ||
block.type === 'api' ||
block.type === 'webhook_request' ||
block.type === 'workflow' ||
block.type === 'knowledge' ||
block.type === 'function') &&
@@ -2215,10 +2297,15 @@ export function ToolInput({
{getIssueBadgeLabel(issue)}
</Badge>
</Tooltip.Trigger>
<Tooltip.Content>{issue.message}: click to open settings</Tooltip.Content>
<Tooltip.Content>
<span className='text-sm'>{issue.message}: click to open settings</span>
</Tooltip.Content>
</Tooltip.Root>
)
})()}
{tool.type === 'workflow' && tool.params?.workflowId && (
<WorkflowToolDeployBadge workflowId={tool.params.workflowId} />
)}
</div>
<div className='flex flex-shrink-0 items-center gap-[8px]'>
{supportsToolControl && !(isMcpTool && isMcpToolUnavailable(tool)) && (

@@ -361,9 +361,9 @@ export function TriggerSave({
        onClick={handleSave}
        disabled={disabled || isProcessing}
        className={cn(
          'h-[32px] flex-1 rounded-[8px] px-[12px] transition-all duration-200',
          saveStatus === 'saved' && 'bg-green-600 hover:bg-green-700',
          saveStatus === 'error' && 'bg-red-600 hover:bg-red-700'
          'flex-1',
          saveStatus === 'saved' && '!bg-green-600 !text-white hover:!bg-green-700',
          saveStatus === 'error' && '!bg-red-600 !text-white hover:!bg-red-700'
        )}
      >
        {saveStatus === 'saving' && 'Saving...'}
@@ -373,12 +373,7 @@ export function TriggerSave({
      </Button>

      {webhookId && (
        <Button
          variant='default'
          onClick={handleDeleteClick}
          disabled={disabled || isProcessing}
          className='h-[32px] rounded-[8px] px-[12px]'
        >
        <Button variant='default' onClick={handleDeleteClick} disabled={disabled || isProcessing}>
          <Trash className='h-[14px] w-[14px]' />
        </Button>
      )}

@@ -460,6 +460,7 @@ function SubBlockComponent({
            disabled={isDisabled}
            multiSelect={config.multiSelect}
            fetchOptions={config.fetchOptions}
            fetchOptionById={config.fetchOptionById}
            dependsOn={config.dependsOn}
            searchable={config.searchable}
          />
@@ -479,6 +480,9 @@ function SubBlockComponent({
            previewValue={previewValue as any}
            disabled={isDisabled}
            config={config}
            fetchOptions={config.fetchOptions}
            fetchOptionById={config.fetchOptionById}
            dependsOn={config.dependsOn}
          />
        </div>
      )
@@ -605,6 +609,18 @@ function SubBlockComponent({
        />
      )

    case 'router-input':
      return (
        <ConditionInput
          blockId={blockId}
          subBlockId={config.id}
          isPreview={isPreview}
          previewValue={previewValue as any}
          disabled={isDisabled}
          mode='router'
        />
      )

    case 'eval-input':
      return (
        <EvalInput

@@ -1 +1,3 @@
export { LogRowContextMenu } from './log-row-context-menu'
export { OutputContextMenu } from './output-context-menu'
export { PrettierOutput } from './prettier-output'

@@ -0,0 +1,145 @@
'use client'

import type { RefObject } from 'react'
import {
  Popover,
  PopoverAnchor,
  PopoverContent,
  PopoverDivider,
  PopoverItem,
} from '@/components/emcn'
import type { ConsoleEntry } from '@/stores/terminal'

interface ContextMenuPosition {
  x: number
  y: number
}

interface TerminalFilters {
  blockIds: Set<string>
  statuses: Set<'error' | 'info'>
  runIds: Set<string>
}

interface LogRowContextMenuProps {
  isOpen: boolean
  position: ContextMenuPosition
  menuRef: RefObject<HTMLDivElement | null>
  onClose: () => void
  entry: ConsoleEntry | null
  filters: TerminalFilters
  onFilterByBlock: (blockId: string) => void
  onFilterByStatus: (status: 'error' | 'info') => void
  onFilterByRunId: (runId: string) => void
  onClearFilters: () => void
  onClearConsole: () => void
  hasActiveFilters: boolean
}

/**
 * Context menu for terminal log rows (left side).
 * Displays filtering options based on the selected row's properties.
 */
export function LogRowContextMenu({
  isOpen,
  position,
  menuRef,
  onClose,
  entry,
  filters,
  onFilterByBlock,
  onFilterByStatus,
  onFilterByRunId,
  onClearFilters,
  onClearConsole,
  hasActiveFilters,
}: LogRowContextMenuProps) {
  const hasRunId = entry?.executionId != null

  const isBlockFiltered = entry ? filters.blockIds.has(entry.blockId) : false
  const entryStatus = entry?.success ? 'info' : 'error'
  const isStatusFiltered = entry ? filters.statuses.has(entryStatus) : false
  const isRunIdFiltered = entry?.executionId ? filters.runIds.has(entry.executionId) : false

  return (
    <Popover
      open={isOpen}
      onOpenChange={onClose}
      variant='secondary'
      size='sm'
      colorScheme='inverted'
    >
      <PopoverAnchor
        style={{
          position: 'fixed',
          left: `${position.x}px`,
          top: `${position.y}px`,
          width: '1px',
          height: '1px',
        }}
      />
      <PopoverContent ref={menuRef} align='start' side='bottom' sideOffset={4}>
        {/* Clear filters at top when active */}
        {hasActiveFilters && (
          <>
            <PopoverItem
              onClick={() => {
                onClearFilters()
                onClose()
              }}
            >
              Clear All Filters
            </PopoverItem>
            {entry && <PopoverDivider />}
          </>
        )}

        {/* Filter actions */}
        {entry && (
          <>
            <PopoverItem
              showCheck={isBlockFiltered}
              onClick={() => {
                onFilterByBlock(entry.blockId)
                onClose()
              }}
            >
              Filter by Block
            </PopoverItem>
            <PopoverItem
              showCheck={isStatusFiltered}
              onClick={() => {
                onFilterByStatus(entryStatus)
                onClose()
              }}
            >
              Filter by Status
            </PopoverItem>
            {hasRunId && (
              <PopoverItem
                showCheck={isRunIdFiltered}
                onClick={() => {
                  onFilterByRunId(entry.executionId!)
                  onClose()
                }}
              >
                Filter by Run ID
              </PopoverItem>
            )}
          </>
        )}

        {/* Destructive action */}
        {(entry || hasActiveFilters) && <PopoverDivider />}
        <PopoverItem
          onClick={() => {
            onClearConsole()
            onClose()
          }}
        >
          Clear Console
        </PopoverItem>
      </PopoverContent>
    </Popover>
  )
}
@@ -0,0 +1,119 @@
'use client'

import type { RefObject } from 'react'
import {
  Popover,
  PopoverAnchor,
  PopoverContent,
  PopoverDivider,
  PopoverItem,
} from '@/components/emcn'

interface ContextMenuPosition {
  x: number
  y: number
}

interface OutputContextMenuProps {
  isOpen: boolean
  position: ContextMenuPosition
  menuRef: RefObject<HTMLDivElement | null>
  onClose: () => void
  onCopySelection: () => void
  onCopyAll: () => void
  onSearch: () => void
  wrapText: boolean
  onToggleWrap: () => void
  openOnRun: boolean
  onToggleOpenOnRun: () => void
  onClearConsole: () => void
  hasSelection: boolean
}

/**
 * Context menu for terminal output panel (right side).
 * Displays copy, search, and display options for the code viewer.
 */
export function OutputContextMenu({
  isOpen,
  position,
  menuRef,
  onClose,
  onCopySelection,
  onCopyAll,
  onSearch,
  wrapText,
  onToggleWrap,
  openOnRun,
  onToggleOpenOnRun,
  onClearConsole,
  hasSelection,
}: OutputContextMenuProps) {
  return (
    <Popover
      open={isOpen}
      onOpenChange={onClose}
      variant='secondary'
      size='sm'
      colorScheme='inverted'
    >
      <PopoverAnchor
        style={{
          position: 'fixed',
          left: `${position.x}px`,
          top: `${position.y}px`,
          width: '1px',
          height: '1px',
        }}
      />
      <PopoverContent ref={menuRef} align='start' side='bottom' sideOffset={4}>
        {/* Copy and search actions */}
        <PopoverItem
          disabled={!hasSelection}
          onClick={() => {
            onCopySelection()
            onClose()
          }}
        >
          Copy Selection
        </PopoverItem>
        <PopoverItem
          onClick={() => {
            onCopyAll()
            onClose()
          }}
        >
          Copy All
        </PopoverItem>
        <PopoverItem
          onClick={() => {
            onSearch()
            onClose()
          }}
        >
          Search
        </PopoverItem>

        {/* Display settings - toggles don't close menu */}
        <PopoverDivider />
        <PopoverItem showCheck={wrapText} onClick={onToggleWrap}>
          Wrap Text
        </PopoverItem>
        <PopoverItem showCheck={openOnRun} onClick={onToggleOpenOnRun}>
          Open on Run
        </PopoverItem>

        {/* Destructive action */}
        <PopoverDivider />
        <PopoverItem
          onClick={() => {
            onClearConsole()
            onClose()
          }}
        >
          Clear Console
        </PopoverItem>
      </PopoverContent>
    </Popover>
  )
}
@@ -38,11 +38,16 @@ import {
import { getEnv, isTruthy } from '@/lib/core/config/env'
import { useRegisterGlobalCommands } from '@/app/workspace/[workspaceId]/providers/global-commands-provider'
import { createCommands } from '@/app/workspace/[workspaceId]/utils/commands-utils'
import {
  LogRowContextMenu,
  OutputContextMenu,
} from '@/app/workspace/[workspaceId]/w/[workflowId]/components/terminal/components'
import {
  useOutputPanelResize,
  useTerminalFilters,
  useTerminalResize,
} from '@/app/workspace/[workspaceId]/w/[workflowId]/components/terminal/hooks'
import { useContextMenu } from '@/app/workspace/[workspaceId]/w/components/sidebar/hooks'
import { getBlock } from '@/blocks'
import { OUTPUT_PANEL_WIDTH, TERMINAL_HEIGHT } from '@/stores/constants'
import { useCopilotTrainingStore } from '@/stores/copilot-training/store'
@@ -365,6 +370,28 @@ export function Terminal() {
    hasActiveFilters,
  } = useTerminalFilters()

  // Context menu state
  const [hasSelection, setHasSelection] = useState(false)
  const [contextMenuEntry, setContextMenuEntry] = useState<ConsoleEntry | null>(null)
  const [storedSelectionText, setStoredSelectionText] = useState('')

  // Context menu hooks
  const {
    isOpen: isLogRowMenuOpen,
    position: logRowMenuPosition,
    menuRef: logRowMenuRef,
    handleContextMenu: handleLogRowContextMenu,
    closeMenu: closeLogRowMenu,
  } = useContextMenu()

  const {
    isOpen: isOutputMenuOpen,
    position: outputMenuPosition,
    menuRef: outputMenuRef,
    handleContextMenu: handleOutputContextMenu,
    closeMenu: closeOutputMenu,
  } = useContextMenu()

  /**
   * Expands the terminal to its last meaningful height, with safeguards:
   * - Never expands below {@link DEFAULT_EXPANDED_HEIGHT}.
@@ -511,15 +538,11 @@ export function Terminal() {
  const handleRowClick = useCallback((entry: ConsoleEntry) => {
    setSelectedEntry((prev) => {
      const isDeselecting = prev?.id === entry.id
      // Re-enable auto-select when deselecting, disable when selecting
      setAutoSelectEnabled(isDeselecting)
      return isDeselecting ? null : entry
    })
  }, [])

  /**
   * Handle header click - toggle between expanded and collapsed
   */
  const handleHeaderClick = useCallback(() => {
    if (isExpanded) {
      setIsToggling(true)
@@ -529,16 +552,10 @@ export function Terminal() {
    }
  }, [expandToLastHeight, isExpanded, setTerminalHeight])

  /**
   * Handle transition end - reset toggling state
   */
  const handleTransitionEnd = useCallback(() => {
    setIsToggling(false)
  }, [])

  /**
   * Handle copy output to clipboard
   */
  const handleCopy = useCallback(() => {
    if (!selectedEntry) return

@@ -560,9 +577,6 @@ export function Terminal() {
    }
  }, [activeWorkflowId, clearWorkflowConsole])

  /**
   * Activates output search and focuses the search input.
   */
  const activateOutputSearch = useCallback(() => {
    setIsOutputSearchActive(true)
    setTimeout(() => {
@@ -570,9 +584,6 @@ export function Terminal() {
    }, 0)
  }, [])

  /**
   * Closes output search and clears the query.
   */
  const closeOutputSearch = useCallback(() => {
    setIsOutputSearchActive(false)
    setOutputSearchQuery('')
@@ -604,9 +615,6 @@ export function Terminal() {
    setCurrentMatchIndex(0)
  }, [])

  /**
   * Handle clear console for current workflow via mouse interaction.
   */
  const handleClearConsole = useCallback(
    (e: React.MouseEvent) => {
      e.stopPropagation()
@@ -615,10 +623,6 @@ export function Terminal() {
    [clearCurrentWorkflowConsole]
  )

  /**
   * Handle export of console entries for the current workflow via mouse interaction.
   * Mirrors the visibility and interaction behavior of the clear console action.
   */
  const handleExportConsole = useCallback(
    (e: React.MouseEvent) => {
      e.stopPropagation()
@@ -629,9 +633,60 @@ export function Terminal() {
    [activeWorkflowId, exportConsoleCSV]
  )

  /**
   * Handle training button click - toggle training state or open modal
   */
  const handleCopySelection = useCallback(() => {
    if (storedSelectionText) {
      navigator.clipboard.writeText(storedSelectionText)
      setShowCopySuccess(true)
    }
  }, [storedSelectionText])

  const handleOutputPanelContextMenu = useCallback(
    (e: React.MouseEvent) => {
      const selection = window.getSelection()
      const selectionText = selection?.toString() || ''
      setStoredSelectionText(selectionText)
      setHasSelection(selectionText.length > 0)
      handleOutputContextMenu(e)
    },
    [handleOutputContextMenu]
  )

  const handleRowContextMenu = useCallback(
    (e: React.MouseEvent, entry: ConsoleEntry) => {
      setContextMenuEntry(entry)
      handleLogRowContextMenu(e)
    },
    [handleLogRowContextMenu]
  )

  const handleFilterByBlock = useCallback(
    (blockId: string) => {
      toggleBlock(blockId)
      closeLogRowMenu()
    },
    [toggleBlock, closeLogRowMenu]
  )

  const handleFilterByStatus = useCallback(
    (status: 'error' | 'info') => {
      toggleStatus(status)
      closeLogRowMenu()
    },
    [toggleStatus, closeLogRowMenu]
  )

  const handleFilterByRunId = useCallback(
    (runId: string) => {
      toggleRunId(runId)
      closeLogRowMenu()
    },
    [toggleRunId, closeLogRowMenu]
  )

  const handleClearConsoleFromMenu = useCallback(() => {
    clearCurrentWorkflowConsole()
  }, [clearCurrentWorkflowConsole])

  const handleTrainingClick = useCallback(
    (e: React.MouseEvent) => {
      e.stopPropagation()
@@ -644,9 +699,6 @@ export function Terminal() {
    [isTraining, stopTraining, toggleTrainingModal]
  )

  /**
   * Whether training controls should be visible
   */
  const shouldShowTrainingButton = isTrainingEnvEnabled && showTrainingControls

  /**
@@ -721,6 +773,23 @@ export function Terminal() {
    }
  }, [showCopySuccess])

  /**
   * Track text selection state for context menu.
   * Skip updates when the context menu is open to prevent the selection
   * state from changing mid-click (which would disable the copy button).
   */
  useEffect(() => {
    const handleSelectionChange = () => {
      if (isOutputMenuOpen) return

      const selection = window.getSelection()
      setHasSelection(Boolean(selection && selection.toString().length > 0))
    }

    document.addEventListener('selectionchange', handleSelectionChange)
    return () => document.removeEventListener('selectionchange', handleSelectionChange)
  }, [isOutputMenuOpen])

  /**
   * Auto-select the latest entry when new logs arrive
   * Re-enables auto-selection when all entries are cleared
@@ -1311,6 +1380,7 @@ export function Terminal() {
                isSelected && 'bg-[var(--surface-6)] dark:bg-[var(--surface-4)]'
              )}
              onClick={() => handleRowClick(entry)}
              onContextMenu={(e) => handleRowContextMenu(e, entry)}
            >
              {/* Block */}
              <div
@@ -1327,7 +1397,13 @@ export function Terminal() {
              </div>

              {/* Status */}
              <div className={clsx(COLUMN_WIDTHS.STATUS, COLUMN_BASE_CLASS)}>
              <div
                className={clsx(
                  COLUMN_WIDTHS.STATUS,
                  COLUMN_BASE_CLASS,
                  'flex items-center'
                )}
              >
                {statusInfo ? (
                  <Badge variant={statusInfo.isError ? 'red' : 'gray'} dot>
                    {statusInfo.label}
@@ -1719,7 +1795,10 @@ export function Terminal() {
        )}

        {/* Content */}
        <div className={clsx('flex-1 overflow-y-auto', !wrapText && 'overflow-x-auto')}>
        <div
          className={clsx('flex-1 overflow-y-auto', !wrapText && 'overflow-x-auto')}
          onContextMenu={handleOutputPanelContextMenu}
        >
          {shouldShowCodeDisplay ? (
            <OutputCodeContent
              code={selectedEntry.input.code}
@@ -1748,6 +1827,42 @@ export function Terminal() {
          )}
        </div>
      </aside>

      {/* Log Row Context Menu */}
      <LogRowContextMenu
        isOpen={isLogRowMenuOpen}
        position={logRowMenuPosition}
        menuRef={logRowMenuRef}
        onClose={closeLogRowMenu}
        entry={contextMenuEntry}
        filters={filters}
        onFilterByBlock={handleFilterByBlock}
        onFilterByStatus={handleFilterByStatus}
        onFilterByRunId={handleFilterByRunId}
        onClearFilters={() => {
          clearFilters()
          closeLogRowMenu()
        }}
        onClearConsole={handleClearConsoleFromMenu}
        hasActiveFilters={hasActiveFilters}
      />

      {/* Output Panel Context Menu */}
      <OutputContextMenu
        isOpen={isOutputMenuOpen}
        position={outputMenuPosition}
        menuRef={outputMenuRef}
        onClose={closeOutputMenu}
        onCopySelection={handleCopySelection}
        onCopyAll={handleCopy}
        onSearch={activateOutputSearch}
        wrapText={wrapText}
        onToggleWrap={() => setWrapText(!wrapText)}
        openOnRun={openOnRun}
        onToggleOpenOnRun={() => setOpenOnRun(!openOnRun)}
        onClearConsole={handleClearConsoleFromMenu}
        hasSelection={hasSelection}
      />
    </>
  )
}
@@ -841,6 +841,37 @@ export const WorkflowBlock = memo(function WorkflowBlock({
    ]
  }, [type, subBlockState, id])

  /**
   * Compute per-route rows (id/value) for router_v2 blocks so we can render
   * one row per route with its own output handle.
   * Uses same structure as conditions: { id, title, value }
   */
  const routerRows = useMemo(() => {
    if (type !== 'router_v2') return [] as { id: string; value: string }[]

    const routesValue = subBlockState.routes?.value
    const raw = typeof routesValue === 'string' ? routesValue : undefined

    try {
      if (raw) {
        const parsed = JSON.parse(raw) as unknown
        if (Array.isArray(parsed)) {
          return parsed.map((item: unknown, index: number) => {
            const routeItem = item as { id?: string; value?: string }
            return {
              id: routeItem?.id ?? `${id}-route-${index}`,
              value: routeItem?.value ?? '',
            }
          })
        }
      }
    } catch (error) {
      logger.warn('Failed to parse router routes value', { error, blockId: id })
    }

    return [{ id: `${id}-route-route1`, value: '' }]
  }, [type, subBlockState, id])

  /**
   * Compute and publish deterministic layout metrics for workflow blocks.
   * This avoids ResizeObserver/animation-frame jitter and prevents initial "jump".
@@ -857,6 +888,9 @@ export const WorkflowBlock = memo(function WorkflowBlock({
    let rowsCount = 0
    if (type === 'condition') {
      rowsCount = conditionRows.length + defaultHandlesRow
    } else if (type === 'router_v2') {
      // +1 for context row, plus route rows
      rowsCount = 1 + routerRows.length + defaultHandlesRow
    } else {
      const subblockRowCount = subBlockRows.reduce((acc, row) => acc + row.length, 0)
      rowsCount = subblockRowCount + defaultHandlesRow
@@ -879,6 +913,7 @@ export const WorkflowBlock = memo(function WorkflowBlock({
      displayTriggerMode,
      subBlockRows.length,
      conditionRows.length,
      routerRows.length,
      horizontalHandles,
    ],
  })
@@ -1073,32 +1108,45 @@ export const WorkflowBlock = memo(function WorkflowBlock({

      {hasContentBelowHeader && (
        <div className='flex flex-col gap-[8px] p-[8px]'>
          {type === 'condition'
            ? conditionRows.map((cond) => (
          {type === 'condition' ? (
            conditionRows.map((cond) => (
              <SubBlockRow key={cond.id} title={cond.title} value={getDisplayValue(cond.value)} />
            ))
          ) : type === 'router_v2' ? (
            <>
              <SubBlockRow
                key='context'
                title='Context'
                value={getDisplayValue(subBlockState.context?.value)}
              />
              {routerRows.map((route, index) => (
                <SubBlockRow
                  key={cond.id}
                  title={cond.title}
                  value={getDisplayValue(cond.value)}
                  key={route.id}
                  title={`Route ${index + 1}`}
                  value={getDisplayValue(route.value)}
                />
              ))
            : subBlockRows.map((row, rowIndex) =>
                row.map((subBlock) => {
                  const rawValue = subBlockState[subBlock.id]?.value
                  return (
                    <SubBlockRow
                      key={`${subBlock.id}-${rowIndex}`}
                      title={subBlock.title ?? subBlock.id}
                      value={getDisplayValue(rawValue)}
                      subBlock={subBlock}
                      rawValue={rawValue}
                      workspaceId={workspaceId}
                      workflowId={currentWorkflowId}
                      blockId={id}
                      allSubBlockValues={subBlockState}
                    />
                  )
                })
              )}
              ))}
            </>
          ) : (
            subBlockRows.map((row, rowIndex) =>
              row.map((subBlock) => {
                const rawValue = subBlockState[subBlock.id]?.value
                return (
                  <SubBlockRow
                    key={`${subBlock.id}-${rowIndex}`}
                    title={subBlock.title ?? subBlock.id}
                    value={getDisplayValue(rawValue)}
                    subBlock={subBlock}
                    rawValue={rawValue}
                    workspaceId={workspaceId}
                    workflowId={currentWorkflowId}
                    blockId={id}
                    allSubBlockValues={subBlockState}
                  />
                )
              })
            )
          )}
          {shouldShowDefaultHandles && <SubBlockRow title='error' />}
        </div>
      )}
@@ -1153,7 +1201,58 @@ export const WorkflowBlock = memo(function WorkflowBlock({
        </>
      )}

      {type !== 'condition' && type !== 'response' && (
      {type === 'router_v2' && (
        <>
          {routerRows.map((route, routeIndex) => {
            // +1 row offset for context row at the top
            const topOffset =
              HANDLE_POSITIONS.CONDITION_START_Y +
              (routeIndex + 1) * HANDLE_POSITIONS.CONDITION_ROW_HEIGHT
            return (
              <Handle
                key={`handle-${route.id}`}
                type='source'
                position={Position.Right}
                id={`router-${route.id}`}
                className={getHandleClasses('right')}
                style={{ top: `${topOffset}px`, transform: 'translateY(-50%)' }}
                data-nodeid={id}
                data-handleid={`router-${route.id}`}
                isConnectableStart={true}
                isConnectableEnd={false}
                isValidConnection={(connection) => {
                  if (connection.target === id) return false
                  const edges = useWorkflowStore.getState().edges
                  return !wouldCreateCycle(edges, connection.source!, connection.target!)
                }}
              />
            )
          })}
          <Handle
            type='source'
            position={Position.Right}
            id='error'
            className={getHandleClasses('right', true)}
            style={{
              right: '-7px',
              top: 'auto',
              bottom: `${HANDLE_POSITIONS.ERROR_BOTTOM_OFFSET}px`,
              transform: 'translateY(50%)',
            }}
            data-nodeid={id}
            data-handleid='error'
            isConnectableStart={true}
            isConnectableEnd={false}
            isValidConnection={(connection) => {
              if (connection.target === id) return false
              const edges = useWorkflowStore.getState().edges
              return !wouldCreateCycle(edges, connection.source!, connection.target!)
            }}
          />
        </>
      )}

      {type !== 'condition' && type !== 'router_v2' && type !== 'response' && (
        <>
          <Handle
            type='source'

@@ -795,6 +795,13 @@ const WorkflowContent = React.memo(() => {
        event.preventDefault()
        redo()
      } else if ((event.ctrlKey || event.metaKey) && event.key === 'c') {
        const selection = window.getSelection()
        const hasTextSelection = selection && selection.toString().length > 0

        if (hasTextSelection) {
          return
        }

        const selectedNodes = getNodes().filter((node) => node.selected)
        if (selectedNodes.length > 0) {
          event.preventDefault()
@@ -1940,11 +1947,26 @@ const WorkflowContent = React.memo(() => {
    const handleKeyUp = (e: KeyboardEvent) => {
      if (e.key === 'Shift') setIsShiftPressed(false)
    }
    const handleFocusLoss = () => {
      setIsShiftPressed(false)
      setIsSelectionDragActive(false)
    }
    const handleVisibilityChange = () => {
      if (document.hidden) {
        handleFocusLoss()
      }
    }

    window.addEventListener('keydown', handleKeyDown)
    window.addEventListener('keyup', handleKeyUp)
    window.addEventListener('blur', handleFocusLoss)
    document.addEventListener('visibilitychange', handleVisibilityChange)

    return () => {
      window.removeEventListener('keydown', handleKeyDown)
      window.removeEventListener('keyup', handleKeyUp)
      window.removeEventListener('blur', handleFocusLoss)
      document.removeEventListener('visibilitychange', handleVisibilityChange)
    }
  }, [])

@@ -21,6 +21,7 @@ import { signOut, useSession } from '@/lib/auth/auth-client'
import { ANONYMOUS_USER_ID } from '@/lib/auth/constants'
import { useBrandConfig } from '@/lib/branding/branding'
import { getEnv, isTruthy } from '@/lib/core/config/env'
import { isHosted } from '@/lib/core/config/feature-flags'
import { getBaseUrl } from '@/lib/core/utils/urls'
import { useProfilePictureUpload } from '@/app/workspace/[workspaceId]/w/components/sidebar/components/settings-modal/hooks/use-profile-picture-upload'
import { useGeneralSettings, useUpdateGeneralSetting } from '@/hooks/queries/general-settings'
@@ -565,13 +566,15 @@ export function General({ onOpenChange }: GeneralProps) {
            </Button>
          </>
        )}
        <Button
          onClick={() => window.open('/?from=settings', '_blank', 'noopener,noreferrer')}
          variant='active'
          className='ml-auto'
        >
          Home Page
        </Button>
        {isHosted && (
          <Button
            onClick={() => window.open('/?from=settings', '_blank', 'noopener,noreferrer')}
            variant='active'
            className='ml-auto'
          >
            Home Page
          </Button>
        )}
      </div>

      {/* Password Reset Confirmation Modal */}

@@ -27,6 +27,7 @@ export type DocumentProcessingPayload = {
export const processDocument = task({
  id: 'knowledge-process-document',
  maxDuration: env.KB_CONFIG_MAX_DURATION || 600,
  machine: 'large-1x', // 2 vCPU, 2GB RAM - needed for large PDF processing
  retry: {
    maxAttempts: env.KB_CONFIG_MAX_ATTEMPTS || 3,
    factor: env.KB_CONFIG_RETRY_FACTOR || 2,

@@ -321,7 +321,7 @@ describe('Blocks Module', () => {

    it('should have correct metadata', () => {
      expect(block?.type).toBe('router')
      expect(block?.name).toBe('Router')
      expect(block?.name).toBe('Router (Legacy)')
      expect(block?.category).toBe('blocks')
      expect(block?.authMode).toBe(AuthMode.ApiKey)
    })
@@ -352,53 +352,6 @@ describe('Blocks Module', () => {
      expect(typeof block?.tools.config?.tool).toBe('function')
    })
  })

  describe('WebhookBlock', () => {
    const block = getBlock('webhook')

    it('should have correct metadata', () => {
      expect(block?.type).toBe('webhook')
      expect(block?.name).toBe('Webhook')
      expect(block?.category).toBe('triggers')
      expect(block?.authMode).toBe(AuthMode.OAuth)
      expect(block?.triggerAllowed).toBe(true)
      expect(block?.hideFromToolbar).toBe(true)
    })

    it('should have webhookProvider dropdown with multiple providers', () => {
      const providerSubBlock = block?.subBlocks.find((sb) => sb.id === 'webhookProvider')
      expect(providerSubBlock).toBeDefined()
      expect(providerSubBlock?.type).toBe('dropdown')
      const options = providerSubBlock?.options as Array<{ label: string; id: string }>
      expect(options?.map((o) => o.id)).toContain('slack')
      expect(options?.map((o) => o.id)).toContain('generic')
      expect(options?.map((o) => o.id)).toContain('github')
    })

    it('should have conditional OAuth inputs', () => {
      const gmailCredentialSubBlock = block?.subBlocks.find((sb) => sb.id === 'gmailCredential')
      expect(gmailCredentialSubBlock).toBeDefined()
      expect(gmailCredentialSubBlock?.type).toBe('oauth-input')
      expect(gmailCredentialSubBlock?.condition).toEqual({
        field: 'webhookProvider',
        value: 'gmail',
      })

      const outlookCredentialSubBlock = block?.subBlocks.find(
        (sb) => sb.id === 'outlookCredential'
      )
      expect(outlookCredentialSubBlock).toBeDefined()
      expect(outlookCredentialSubBlock?.type).toBe('oauth-input')
      expect(outlookCredentialSubBlock?.condition).toEqual({
        field: 'webhookProvider',
        value: 'outlook',
      })
    })

    it('should have empty tools access', () => {
      expect(block?.tools.access).toEqual([])
    })
  })
})

describe('SubBlock Validation', () => {
@@ -454,6 +407,7 @@ describe('Blocks Module', () => {
      'workflow-selector',
      'workflow-input-mapper',
      'text',
      'router-input',
    ]

    const blocks = getAllBlocks()
@@ -544,8 +498,8 @@ describe('Blocks Module', () => {
    })

    it('should handle blocks with triggerAllowed flag', () => {
      const webhookBlock = getBlock('webhook')
      expect(webhookBlock?.triggerAllowed).toBe(true)
      const gmailBlock = getBlock('gmail')
      expect(gmailBlock?.triggerAllowed).toBe(true)

      const functionBlock = getBlock('function')
      expect(functionBlock?.triggerAllowed).toBeUndefined()
@@ -662,16 +616,6 @@ describe('Blocks Module', () => {
      expect(temperatureSubBlock?.min).toBe(0)
      expect(temperatureSubBlock?.max).toBe(2)
    })

    it('should have required scopes on OAuth inputs', () => {
      const webhookBlock = getBlock('webhook')
      const gmailCredentialSubBlock = webhookBlock?.subBlocks.find(
        (sb) => sb.id === 'gmailCredential'
      )
      expect(gmailCredentialSubBlock?.requiredScopes).toBeDefined()
      expect(Array.isArray(gmailCredentialSubBlock?.requiredScopes)).toBe(true)
      expect((gmailCredentialSubBlock?.requiredScopes?.length ?? 0) > 0).toBe(true)
    })
  })

  describe('Block Consistency', () => {

@@ -2,6 +2,7 @@ import { GrainIcon } from '@/components/icons'
import type { BlockConfig } from '@/blocks/types'
import { AuthMode } from '@/blocks/types'
import { getTrigger } from '@/triggers'
+import { grainTriggerOptions } from '@/triggers/grain/utils'

export const GrainBlock: BlockConfig = {
  type: 'grain',

@@ -207,13 +208,21 @@ export const GrainBlock: BlockConfig = {
        value: ['grain_delete_hook'],
      },
    },
    // Trigger SubBlocks
-   ...getTrigger('grain_recording_created').subBlocks,
-   ...getTrigger('grain_recording_updated').subBlocks,
-   ...getTrigger('grain_highlight_created').subBlocks,
-   ...getTrigger('grain_highlight_updated').subBlocks,
-   ...getTrigger('grain_story_created').subBlocks,
-   ...getTrigger('grain_webhook').subBlocks,
+   {
+     id: 'selectedTriggerId',
+     title: 'Trigger Type',
+     type: 'dropdown',
+     mode: 'trigger',
+     options: grainTriggerOptions,
+     value: () => 'grain_webhook',
+     required: true,
+   },
+   ...getTrigger('grain_recording_created').subBlocks.slice(1),
+   ...getTrigger('grain_recording_updated').subBlocks.slice(1),
+   ...getTrigger('grain_highlight_created').subBlocks.slice(1),
+   ...getTrigger('grain_highlight_updated').subBlocks.slice(1),
+   ...getTrigger('grain_story_created').subBlocks.slice(1),
+   ...getTrigger('grain_webhook').subBlocks.slice(1),
  ],
  tools: {
    access: [
@@ -10,6 +10,7 @@ export const HumanInTheLoopBlock: BlockConfig<ResponseBlockOutput> = {
    'Combines response and start functionality. Sends structured responses and allows workflow to resume from this point.',
  category: 'blocks',
  bgColor: '#10B981',
+ docsLink: 'https://docs.sim.ai/blocks/human-in-the-loop',
  icon: HumanInTheLoopIcon,
  subBlocks: [
    // Operation dropdown hidden - block defaults to human approval mode

@@ -27,7 +28,7 @@ export const HumanInTheLoopBlock: BlockConfig<ResponseBlockOutput> = {
    // },
    {
      id: 'builderData',
-     title: 'Paused Output',
+     title: 'Display Data',
      type: 'response-format',
      // condition: { field: 'operation', value: 'human' }, // Always shown since we only support human mode
      description:

@@ -35,7 +36,7 @@ export const HumanInTheLoopBlock: BlockConfig<ResponseBlockOutput> = {
    },
    {
      id: 'notification',
-     title: 'Notification',
+     title: 'Notification (Send URL)',
      type: 'tool-input',
      // condition: { field: 'operation', value: 'human' }, // Always shown since we only support human mode
      description: 'Configure notification tools to alert approvers (e.g., Slack, Email)',

@@ -57,7 +58,7 @@ export const HumanInTheLoopBlock: BlockConfig<ResponseBlockOutput> = {
    // },
    {
      id: 'inputFormat',
-     title: 'Resume Input',
+     title: 'Resume Form',
      type: 'input-format',
      // condition: { field: 'operation', value: 'human' }, // Always shown since we only support human mode
      description: 'Define the fields the approver can fill in when resuming',

@@ -157,6 +158,9 @@ export const HumanInTheLoopBlock: BlockConfig<ResponseBlockOutput> = {
  },
  outputs: {
    url: { type: 'string', description: 'Resume UI URL' },
    // apiUrl: { type: 'string', description: 'Resume API URL' }, // Commented out - not accessible as output
    resumeEndpoint: {
      type: 'string',
      description: 'Resume API endpoint URL for direct curl requests',
    },
  },
}
@@ -51,6 +51,9 @@ interface TargetBlock {
  currentState?: any
}

+/**
+ * Generates the system prompt for the legacy router (block-based).
+ */
export const generateRouterPrompt = (prompt: string, targetBlocks?: TargetBlock[]): string => {
  const basePrompt = `You are an intelligent routing agent responsible for directing workflow requests to the most appropriate block. Your task is to analyze the input and determine the single most suitable destination based on the request.

@@ -107,9 +110,88 @@ Example: "2acd9007-27e8-4510-a487-73d3b825e7c1"
Remember: Your response must be ONLY the block ID - no additional text, formatting, or explanation.`
}

/**
 * Generates the system prompt for the port-based router (v2).
 * Instead of selecting a block by ID, it selects a route by evaluating all route descriptions.
 */
export const generateRouterV2Prompt = (
  context: string,
  routes: Array<{ id: string; title: string; value: string }>
): string => {
  const routesInfo = routes
    .map(
      (route, index) => `
Route ${index + 1}:
ID: ${route.id}
Description: ${route.value || 'No description provided'}
---`
    )
    .join('\n')

  return `You are an intelligent routing agent. Your task is to analyze the provided context and select the most appropriate route from the available options.

Available Routes:
${routesInfo}

Context to analyze:
${context}

Instructions:
1. Carefully analyze the context against each route's description
2. Select the route that best matches the context's intent and requirements
3. Consider the semantic meaning, not just keyword matching
4. If multiple routes could match, choose the most specific one

Response Format:
Return ONLY the route ID as a single string, no punctuation, no explanation.
Example: "route-abc123"

Remember: Your response must be ONLY the route ID - no additional text, formatting, or explanation.`
}

/**
 * Helper to get model options for both router versions.
 */
const getModelOptions = () => {
  const providersState = useProvidersStore.getState()
  const baseModels = providersState.providers.base.models
  const ollamaModels = providersState.providers.ollama.models
  const vllmModels = providersState.providers.vllm.models
  const openrouterModels = providersState.providers.openrouter.models
  const allModels = Array.from(
    new Set([...baseModels, ...ollamaModels, ...vllmModels, ...openrouterModels])
  )

  return allModels.map((model) => {
    const icon = getProviderIcon(model)
    return { label: model, id: model, ...(icon && { icon }) }
  })
}

/**
 * Helper to get API key condition for both router versions.
 */
const getApiKeyCondition = () => {
  return isHosted
    ? {
        field: 'model',
        value: [...getHostedModels(), ...providers.vertex.models],
        not: true,
      }
    : () => ({
        field: 'model',
        value: [...getCurrentOllamaModels(), ...getCurrentVLLMModels(), ...providers.vertex.models],
        not: true,
      })
}

/**
 * Legacy Router Block (block-based routing).
 * Hidden from toolbar but still supported for existing workflows.
 */
export const RouterBlock: BlockConfig<RouterResponse> = {
  type: 'router',
- name: 'Router',
+ name: 'Router (Legacy)',
  description: 'Route workflow',
  authMode: AuthMode.ApiKey,
  longDescription:

@@ -121,6 +203,7 @@ export const RouterBlock: BlockConfig<RouterResponse> = {
  category: 'blocks',
  bgColor: '#28C43F',
  icon: ConnectIcon,
+ hideFromToolbar: true, // Hide legacy version from toolbar
  subBlocks: [
    {
      id: 'prompt',

@@ -136,21 +219,7 @@ export const RouterBlock: BlockConfig<RouterResponse> = {
      placeholder: 'Type or select a model...',
      required: true,
      defaultValue: 'claude-sonnet-4-5',
-     options: () => {
-       const providersState = useProvidersStore.getState()
-       const baseModels = providersState.providers.base.models
-       const ollamaModels = providersState.providers.ollama.models
-       const vllmModels = providersState.providers.vllm.models
-       const openrouterModels = providersState.providers.openrouter.models
-       const allModels = Array.from(
-         new Set([...baseModels, ...ollamaModels, ...vllmModels, ...openrouterModels])
-       )
-
-       return allModels.map((model) => {
-         const icon = getProviderIcon(model)
-         return { label: model, id: model, ...(icon && { icon }) }
-       })
-     },
+     options: getModelOptions,
    },
    {
      id: 'vertexCredential',

@@ -173,22 +242,7 @@ export const RouterBlock: BlockConfig<RouterResponse> = {
      password: true,
      connectionDroppable: false,
      required: true,
-     // Hide API key for hosted models, Ollama models, vLLM models, and Vertex models (uses OAuth)
-     condition: isHosted
-       ? {
-           field: 'model',
-           value: [...getHostedModels(), ...providers.vertex.models],
-           not: true, // Show for all models EXCEPT those listed
-         }
-       : () => ({
-           field: 'model',
-           value: [
-             ...getCurrentOllamaModels(),
-             ...getCurrentVLLMModels(),
-             ...providers.vertex.models,
-           ],
-           not: true, // Show for all models EXCEPT Ollama, vLLM, and Vertex models
-         }),
+     condition: getApiKeyCondition(),
    },
    {
      id: 'azureEndpoint',

@@ -303,3 +357,185 @@ export const RouterBlock: BlockConfig<RouterResponse> = {
    selectedPath: { type: 'json', description: 'Selected routing path' },
  },
}

/**
 * Router V2 Block (port-based routing).
 * Uses route definitions with descriptions instead of downstream block names.
 */
interface RouterV2Response extends ToolResponse {
  output: {
    context: string
    model: string
    tokens?: {
      prompt?: number
      completion?: number
      total?: number
    }
    cost?: {
      input: number
      output: number
      total: number
    }
    selectedRoute: string
    selectedPath: {
      blockId: string
      blockType: string
      blockTitle: string
    }
  }
}

export const RouterV2Block: BlockConfig<RouterV2Response> = {
  type: 'router_v2',
  name: 'Router',
  description: 'Route workflow based on context',
  authMode: AuthMode.ApiKey,
  longDescription:
    'Intelligently route workflow execution to different paths based on context analysis. Define multiple routes with descriptions, and an LLM will determine which route to take based on the provided context.',
  bestPractices: `
  - Write clear, specific descriptions for each route
  - The context field should contain all relevant information for routing decisions
  - Route descriptions should be mutually exclusive when possible
  - Use descriptive route names to make the workflow readable
  `,
  category: 'blocks',
  bgColor: '#28C43F',
  icon: ConnectIcon,
  subBlocks: [
    {
      id: 'context',
      title: 'Context',
      type: 'long-input',
      placeholder: 'Enter the context to analyze for routing...',
      required: true,
    },
    {
      id: 'routes',
      type: 'router-input',
    },
    {
      id: 'model',
      title: 'Model',
      type: 'combobox',
      placeholder: 'Type or select a model...',
      required: true,
      defaultValue: 'claude-sonnet-4-5',
      options: getModelOptions,
    },
    {
      id: 'vertexCredential',
      title: 'Google Cloud Account',
      type: 'oauth-input',
      serviceId: 'vertex-ai',
      requiredScopes: ['https://www.googleapis.com/auth/cloud-platform'],
      placeholder: 'Select Google Cloud account',
      required: true,
      condition: {
        field: 'model',
        value: providers.vertex.models,
      },
    },
    {
      id: 'apiKey',
      title: 'API Key',
      type: 'short-input',
      placeholder: 'Enter your API key',
      password: true,
      connectionDroppable: false,
      required: true,
      condition: getApiKeyCondition(),
    },
    {
      id: 'azureEndpoint',
      title: 'Azure OpenAI Endpoint',
      type: 'short-input',
      password: true,
      placeholder: 'https://your-resource.openai.azure.com',
      connectionDroppable: false,
      condition: {
        field: 'model',
        value: providers['azure-openai'].models,
      },
    },
    {
      id: 'azureApiVersion',
      title: 'Azure API Version',
      type: 'short-input',
      placeholder: '2024-07-01-preview',
      connectionDroppable: false,
      condition: {
        field: 'model',
        value: providers['azure-openai'].models,
      },
    },
    {
      id: 'vertexProject',
      title: 'Vertex AI Project',
      type: 'short-input',
      placeholder: 'your-gcp-project-id',
      connectionDroppable: false,
      required: true,
      condition: {
        field: 'model',
        value: providers.vertex.models,
      },
    },
    {
      id: 'vertexLocation',
      title: 'Vertex AI Location',
      type: 'short-input',
      placeholder: 'us-central1',
      connectionDroppable: false,
      required: true,
      condition: {
        field: 'model',
        value: providers.vertex.models,
      },
    },
  ],
  tools: {
    access: [
      'openai_chat',
      'anthropic_chat',
      'google_chat',
      'xai_chat',
      'deepseek_chat',
      'deepseek_reasoner',
    ],
    config: {
      tool: (params: Record<string, any>) => {
        const model = params.model || 'gpt-4o'
        if (!model) {
          throw new Error('No model selected')
        }
        const tool = getAllModelProviders()[model as ProviderId]
        if (!tool) {
          throw new Error(`Invalid model selected: ${model}`)
        }
        return tool
      },
    },
  },
  inputs: {
    context: { type: 'string', description: 'Context for routing decision' },
    routes: { type: 'json', description: 'Route definitions with descriptions' },
    model: { type: 'string', description: 'AI model to use' },
    apiKey: { type: 'string', description: 'Provider API key' },
    azureEndpoint: { type: 'string', description: 'Azure OpenAI endpoint URL' },
    azureApiVersion: { type: 'string', description: 'Azure API version' },
    vertexProject: { type: 'string', description: 'Google Cloud project ID for Vertex AI' },
    vertexLocation: { type: 'string', description: 'Google Cloud location for Vertex AI' },
    vertexCredential: {
      type: 'string',
      description: 'Google Cloud OAuth credential ID for Vertex AI',
    },
  },
  outputs: {
    context: { type: 'string', description: 'Context used for routing' },
    model: { type: 'string', description: 'Model used' },
    tokens: { type: 'json', description: 'Token usage' },
    cost: { type: 'json', description: 'Cost information' },
    selectedRoute: { type: 'string', description: 'Selected route ID' },
    selectedPath: { type: 'json', description: 'Selected routing path' },
  },
}
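The v2 prompt builder above is plain string templating, so its behavior is easy to check in isolation. Below is a trimmed, standalone restatement of `generateRouterV2Prompt` (the route-listing loop and the `'No description provided'` fallback match the diff; the surrounding Sim imports and instruction boilerplate are abbreviated, and the sample routes are made-up):

```typescript
type Route = { id: string; title: string; value: string }

// Standalone sketch of the v2 router prompt: list every route with its
// description, then append the context and the response-format instruction.
function generateRouterV2Prompt(context: string, routes: Route[]): string {
  const routesInfo = routes
    .map(
      (route, index) => `
Route ${index + 1}:
ID: ${route.id}
Description: ${route.value || 'No description provided'}
---`
    )
    .join('\n')

  return `You are an intelligent routing agent. Select the most appropriate route.

Available Routes:
${routesInfo}

Context to analyze:
${context}

Remember: Your response must be ONLY the route ID - no additional text, formatting, or explanation.`
}

// Hypothetical routes for illustration; the second has an empty description
// to exercise the fallback text.
const prompt = generateRouterV2Prompt('Customer asks for a refund', [
  { id: 'route-refunds', title: 'Refunds', value: 'Billing and refund requests' },
  { id: 'route-other', title: 'Other', value: '' },
])
```

Because the LLM is told to answer with a bare route ID, downstream code can match the response against `routes[i].id` directly, which is what makes the port-based design independent of downstream block names.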
@@ -79,6 +79,16 @@ export const SupabaseBlock: BlockConfig<SupabaseResponse> = {
        value: ['query', 'get_row', 'insert', 'update', 'delete', 'upsert', 'count', 'text_search'],
      },
    },
+   {
+     id: 'select',
+     title: 'Select Columns',
+     type: 'short-input',
+     placeholder: '* (all columns) or id,name,email',
+     condition: {
+       field: 'operation',
+       value: ['query', 'get_row'],
+     },
+   },
    {
      id: 'apiKey',
      title: 'Service Role Secret',

@@ -1044,6 +1054,7 @@ Return ONLY the PostgREST filter expression - no explanations, no markdown, no e
    projectId: { type: 'string', description: 'Supabase project identifier' },
    table: { type: 'string', description: 'Database table name' },
    schema: { type: 'string', description: 'Database schema (default: public)' },
+   select: { type: 'string', description: 'Columns to return (comma-separated, defaults to *)' },
    apiKey: { type: 'string', description: 'Service role secret key' },
    // Data for insert/update operations
    data: { type: 'json', description: 'Row data' },
@@ -9,7 +9,7 @@ export const TtsBlock: BlockConfig<TtsBlockResponse> = {
  authMode: AuthMode.ApiKey,
  longDescription:
    'Generate natural-sounding speech from text using state-of-the-art AI voices from OpenAI, Deepgram, ElevenLabs, Cartesia, Google Cloud, Azure, and PlayHT. Supports multiple voices, languages, and audio formats.',
- docsLink: 'https://docs.sim.ai/blocks/tts',
+ docsLink: 'https://docs.sim.ai/tools/tts',
  category: 'tools',
  bgColor: '#181C1E',
  icon: TTSIcon,
@@ -1,132 +0,0 @@
import {
  AirtableIcon,
  DiscordIcon,
  GithubIcon,
  GmailIcon,
  MicrosoftTeamsIcon,
  OutlookIcon,
  SignalIcon,
  SlackIcon,
  StripeIcon,
  TelegramIcon,
  WebhookIcon,
  WhatsAppIcon,
} from '@/components/icons'
import type { BlockConfig } from '@/blocks/types'
import { AuthMode } from '@/blocks/types'

const getWebhookProviderIcon = (provider: string) => {
  const iconMap: Record<string, React.ComponentType<{ className?: string }>> = {
    slack: SlackIcon,
    gmail: GmailIcon,
    outlook: OutlookIcon,
    airtable: AirtableIcon,
    telegram: TelegramIcon,
    generic: SignalIcon,
    whatsapp: WhatsAppIcon,
    github: GithubIcon,
    discord: DiscordIcon,
    stripe: StripeIcon,
    microsoftteams: MicrosoftTeamsIcon,
  }

  return iconMap[provider.toLowerCase()]
}

export const WebhookBlock: BlockConfig = {
  type: 'webhook',
  name: 'Webhook',
  description: 'Trigger workflow execution from external webhooks',
  authMode: AuthMode.OAuth,
  category: 'triggers',
  icon: WebhookIcon,
  bgColor: '#10B981', // Green color for triggers
  docsLink: 'https://docs.sim.ai/triggers/webhook',
  triggerAllowed: true,
  hideFromToolbar: true, // Hidden for backwards compatibility - use generic webhook trigger instead

  subBlocks: [
    {
      id: 'webhookProvider',
      title: 'Webhook Provider',
      type: 'dropdown',
      options: [
        'slack',
        'gmail',
        'outlook',
        'airtable',
        'telegram',
        'generic',
        'whatsapp',
        'github',
        'discord',
        'stripe',
        'microsoftteams',
      ].map((provider) => {
        const providerLabels = {
          slack: 'Slack',
          gmail: 'Gmail',
          outlook: 'Outlook',
          airtable: 'Airtable',
          telegram: 'Telegram',
          generic: 'Generic',
          whatsapp: 'WhatsApp',
          github: 'GitHub',
          discord: 'Discord',
          stripe: 'Stripe',
          microsoftteams: 'Microsoft Teams',
        }

        const icon = getWebhookProviderIcon(provider)
        return {
          label: providerLabels[provider as keyof typeof providerLabels],
          id: provider,
          ...(icon && { icon }),
        }
      }),
      value: () => 'generic',
    },
    {
      id: 'gmailCredential',
      title: 'Gmail Account',
      type: 'oauth-input',
      serviceId: 'gmail',
      requiredScopes: [
        'https://www.googleapis.com/auth/gmail.modify',
        'https://www.googleapis.com/auth/gmail.labels',
      ],
      placeholder: 'Select Gmail account',
      condition: { field: 'webhookProvider', value: 'gmail' },
      required: true,
    },
    {
      id: 'outlookCredential',
      title: 'Microsoft Account',
      type: 'oauth-input',
      serviceId: 'outlook',
      requiredScopes: [
        'Mail.ReadWrite',
        'Mail.ReadBasic',
        'Mail.Read',
        'Mail.Send',
        'offline_access',
      ],
      placeholder: 'Select Microsoft account',
      condition: { field: 'webhookProvider', value: 'outlook' },
      required: true,
    },
    {
      id: 'webhookConfig',
      title: 'Webhook Configuration',
      type: 'webhook-config',
    },
  ],

  tools: {
    access: [], // No external tools needed
  },

  inputs: {}, // No inputs - webhook triggers receive data externally

  outputs: {}, // No outputs - webhook data is injected directly into workflow context
}
86
apps/sim/blocks/blocks/webhook_request.ts
Normal file
@@ -0,0 +1,86 @@
import { WebhookIcon } from '@/components/icons'
import type { BlockConfig } from '@/blocks/types'
import type { RequestResponse } from '@/tools/http/types'

export const WebhookRequestBlock: BlockConfig<RequestResponse> = {
  type: 'webhook_request',
  name: 'Webhook',
  description: 'Send a webhook request',
  longDescription:
    'Send an HTTP POST request to a webhook URL with automatic webhook headers. Optionally sign the payload with HMAC-SHA256 for secure webhook delivery.',
  docsLink: 'https://docs.sim.ai/blocks/webhook',
  category: 'blocks',
  bgColor: '#10B981',
  icon: WebhookIcon,
  subBlocks: [
    {
      id: 'url',
      title: 'Webhook URL',
      type: 'short-input',
      placeholder: 'https://example.com/webhook',
      required: true,
    },
    {
      id: 'body',
      title: 'Payload',
      type: 'code',
      placeholder: 'Enter JSON payload...',
      language: 'json',
      wandConfig: {
        enabled: true,
        maintainHistory: true,
        prompt: `You are an expert JSON programmer.
Generate ONLY the raw JSON object based on the user's request.
The output MUST be a single, valid JSON object, starting with { and ending with }.

Current payload: {context}

Do not include any explanations, markdown formatting, or other text outside the JSON object.

You have access to the following variables you can use to generate the JSON payload:
- Use angle brackets for workflow variables, e.g., '<blockName.output>'.
- Use double curly braces for environment variables, e.g., '{{ENV_VAR_NAME}}'.

Example:
{
  "event": "workflow.completed",
  "data": {
    "result": "<agent.content>",
    "timestamp": "<function.result>"
  }
}`,
        placeholder: 'Describe the webhook payload you need...',
        generationType: 'json-object',
      },
    },
    {
      id: 'secret',
      title: 'Signing Secret',
      type: 'short-input',
      placeholder: 'Optional: Secret for HMAC signature',
      password: true,
      connectionDroppable: false,
    },
    {
      id: 'headers',
      title: 'Additional Headers',
      type: 'table',
      columns: ['Key', 'Value'],
      description: 'Optional custom headers to include with the webhook request',
    },
  ],
  tools: {
    access: ['webhook_request'],
  },
  inputs: {
    url: { type: 'string', description: 'Webhook URL to send the request to' },
    body: { type: 'json', description: 'JSON payload to send' },
    secret: { type: 'string', description: 'Optional secret for HMAC-SHA256 signature' },
    headers: { type: 'json', description: 'Optional additional headers' },
  },
  outputs: {
    data: { type: 'json', description: 'Response data from the webhook endpoint' },
    status: { type: 'number', description: 'HTTP status code' },
    headers: { type: 'json', description: 'Response headers' },
  },
}
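The new block's `secret` input enables HMAC-SHA256 payload signing, delivered to the receiver as an `X-Webhook-Signature` header. A minimal sketch of how a receiving endpoint might verify such a signature with Node's built-in `crypto` module — the exact signature format (hex digest of the raw request body) is an assumption, so confirm it against an actual delivery before relying on it:

```typescript
import { createHmac, timingSafeEqual } from 'node:crypto'

// Compute the HMAC-SHA256 hex digest of the raw request body.
function signPayload(rawBody: string, secret: string): string {
  return createHmac('sha256', secret).update(rawBody).digest('hex')
}

// Constant-time comparison of the received signature against the expected one,
// to avoid leaking digest bytes through timing differences.
function verifySignature(rawBody: string, secret: string, signature: string): boolean {
  const expected = Buffer.from(signPayload(rawBody, secret), 'hex')
  const received = Buffer.from(signature, 'hex')
  return expected.length === received.length && timingSafeEqual(expected, received)
}
```

Verify against the raw body bytes as received on the wire, not a re-serialized JSON object — re-serialization can reorder keys or change whitespace and alter the digest.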
@@ -32,7 +32,7 @@ export const WorkflowBlock: BlockConfig = {
  description:
    'This is a core workflow block. Execute another workflow as a block in your workflow. Enter the input variable to pass to the child workflow.',
  category: 'blocks',
- bgColor: '#705335',
+ bgColor: '#6366F1',
  icon: WorkflowIcon,
  subBlocks: [
    {
@@ -2,7 +2,6 @@ import { WorkflowIcon } from '@/components/icons'
import type { BlockConfig } from '@/blocks/types'
import { useWorkflowRegistry } from '@/stores/workflows/registry/store'

-// Helper: list workflows excluding self
const getAvailableWorkflows = (): Array<{ label: string; id: string }> => {
  try {
    const { workflows, activeWorkflowId } = useWorkflowRegistry.getState()

@@ -15,7 +14,6 @@ const getAvailableWorkflows = (): Array<{ label: string; id: string }> => {
  }
}

-// New workflow block variant that visualizes child Input Trigger schema for mapping
export const WorkflowInputBlock: BlockConfig = {
  type: 'workflow_input',
  name: 'Workflow',

@@ -26,6 +24,7 @@ export const WorkflowInputBlock: BlockConfig = {
  - Remember, that the start point of the child workflow is the Start block.
  `,
  category: 'blocks',
+ docsLink: 'https://docs.sim.ai/blocks/workflow',
  bgColor: '#6366F1', // Indigo - modern and professional
  icon: WorkflowIcon,
  subBlocks: [
@@ -92,7 +92,7 @@ import { RDSBlock } from '@/blocks/blocks/rds'
import { RedditBlock } from '@/blocks/blocks/reddit'
import { ResendBlock } from '@/blocks/blocks/resend'
import { ResponseBlock } from '@/blocks/blocks/response'
-import { RouterBlock } from '@/blocks/blocks/router'
+import { RouterBlock, RouterV2Block } from '@/blocks/blocks/router'
import { RssBlock } from '@/blocks/blocks/rss'
import { S3Block } from '@/blocks/blocks/s3'
import { SalesforceBlock } from '@/blocks/blocks/salesforce'

@@ -130,7 +130,7 @@ import { VisionBlock } from '@/blocks/blocks/vision'
import { WaitBlock } from '@/blocks/blocks/wait'
import { WealthboxBlock } from '@/blocks/blocks/wealthbox'
import { WebflowBlock } from '@/blocks/blocks/webflow'
import { WebhookBlock } from '@/blocks/blocks/webhook'
import { WebhookRequestBlock } from '@/blocks/blocks/webhook_request'
import { WhatsAppBlock } from '@/blocks/blocks/whatsapp'
import { WikipediaBlock } from '@/blocks/blocks/wikipedia'
import { WordPressBlock } from '@/blocks/blocks/wordpress'

@@ -243,6 +243,7 @@ export const registry: Record<string, BlockConfig> = {
  response: ResponseBlock,
  rss: RssBlock,
  router: RouterBlock,
+ router_v2: RouterV2Block,
  s3: S3Block,
  salesforce: SalesforceBlock,
  schedule: ScheduleBlock,

@@ -279,7 +280,7 @@ export const registry: Record<string, BlockConfig> = {
  wait: WaitBlock,
  wealthbox: WealthboxBlock,
  webflow: WebflowBlock,
  webhook: WebhookBlock,
  webhook_request: WebhookRequestBlock,
  whatsapp: WhatsAppBlock,
  wikipedia: WikipediaBlock,
  wordpress: WordPressBlock,

@@ -293,11 +294,9 @@ export const registry: Record<string, BlockConfig> = {
}

export const getBlock = (type: string): BlockConfig | undefined => {
  // Direct lookup first
  if (registry[type]) {
    return registry[type]
  }
  // Fallback: normalize hyphens to underscores (e.g., 'microsoft-teams' -> 'microsoft_teams')
  const normalized = type.replace(/-/g, '_')
  return registry[normalized]
}
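The `getBlock` fallback above makes hyphenated and underscored block types resolve to the same registry entry. A tiny self-contained sketch of that lookup (the registry values here are placeholder strings, not real `BlockConfig` objects):

```typescript
// Placeholder registry standing in for the real Record<string, BlockConfig>.
const registry: Record<string, string> = {
  microsoft_teams: 'MicrosoftTeamsBlock',
  webhook_request: 'WebhookRequestBlock',
}

function getBlock(type: string): string | undefined {
  // Direct lookup first.
  if (registry[type]) {
    return registry[type]
  }
  // Fallback: normalize hyphens to underscores, e.g. 'microsoft-teams' -> 'microsoft_teams'.
  return registry[type.replace(/-/g, '_')]
}
```

This keeps older workflows that stored hyphenated type strings working against the underscore-keyed registry without migrating stored data.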
@@ -78,6 +78,7 @@ export type SubBlockType =
  | 'workflow-selector' // Workflow selector for agent tools
  | 'workflow-input-mapper' // Dynamic workflow input mapper based on selected workflow
  | 'text' // Read-only text display
+ | 'router-input' // Router route definitions with descriptions

/**
 * Selector types that require display name hydration

@@ -217,6 +218,7 @@ export interface SubBlockConfig {
  hideFromPreview?: boolean // Hide this subblock from the workflow block preview
  requiresFeature?: string // Environment variable name that must be truthy for this subblock to be visible
  description?: string
+ tooltip?: string // Tooltip text displayed via info icon next to the title
  value?: (params: Record<string, any>) => string
  grouped?: boolean
  scrollable?: boolean

@@ -287,11 +289,19 @@ export interface SubBlockConfig {
  useWebhookUrl?: boolean
  // Trigger-save specific: The trigger ID for validation and saving
  triggerId?: string
- // Dropdown specific: Function to fetch options dynamically (for multi-select or single-select)
+ // Dropdown/Combobox: Function to fetch options dynamically
+ // Works with both 'dropdown' (select-only) and 'combobox' (editable with expression support)
  fetchOptions?: (
    blockId: string,
    subBlockId: string
  ) => Promise<Array<{ label: string; id: string }>>
+ // Dropdown/Combobox: Function to fetch a single option's label by ID (for hydration)
+ // Called when component mounts with a stored value to display the correct label before options load
+ fetchOptionById?: (
+   blockId: string,
+   subBlockId: string,
+   optionId: string
+ ) => Promise<{ label: string; id: string } | null>
}

export interface BlockConfig<T extends ToolResponse = ToolResponse> {
@@ -119,10 +119,8 @@ const STYLES = {
      'hover:bg-[var(--border-1)] hover:text-[var(--text-primary)] hover:[&_svg]:text-[var(--text-primary)]',
  },
  secondary: {
-   active:
-     'bg-[var(--brand-secondary)] text-[var(--text-inverse)] [&_svg]:text-[var(--text-inverse)]',
-   hover:
-     'hover:bg-[var(--brand-secondary)] hover:text-[var(--text-inverse)] dark:hover:text-[var(--text-inverse)] hover:[&_svg]:text-[var(--text-inverse)] dark:hover:[&_svg]:text-[var(--text-inverse)]',
+   active: 'bg-[var(--brand-secondary)] text-white [&_svg]:text-white',
+   hover: 'hover:bg-[var(--brand-secondary)] hover:text-white hover:[&_svg]:text-white',
  },
  inverted: {
    active:

@@ -474,14 +472,20 @@ const PopoverScrollArea = React.forwardRef<HTMLDivElement, PopoverScrollAreaProp
PopoverScrollArea.displayName = 'PopoverScrollArea'

export interface PopoverItemProps extends React.HTMLAttributes<HTMLDivElement> {
- /** Whether this item is currently active/selected */
+ /**
+  * Whether this item has active/highlighted background styling.
+  * Use for keyboard navigation focus or persistent highlight states.
+  */
  active?: boolean
  /** Only show when not inside any folder */
  rootOnly?: boolean
  /** Whether this item is disabled */
  disabled?: boolean
  /**
-  * Show checkmark when active
+  * Show a checkmark to indicate selection/checked state.
+  * Unlike `active`, this only shows the checkmark without background highlight,
+  * following the pattern where hover provides interaction feedback
+  * and checkmarks indicate current value.
   * @default false
   */
  showCheck?: boolean

@@ -528,7 +532,7 @@ const PopoverItem = React.forwardRef<HTMLDivElement, PopoverItemProps>(
      {...props}
    >
      {children}
-     {showCheck && active && <Check className={cn('ml-auto', STYLES.size[size].icon)} />}
+     {showCheck && <Check className={cn('ml-auto', STYLES.size[size].icon)} />}
    </div>
  )
}
@@ -2,6 +2,7 @@ export enum BlockType {
|
||||
PARALLEL = 'parallel',
|
||||
LOOP = 'loop',
|
||||
ROUTER = 'router',
|
||||
ROUTER_V2 = 'router_v2',
|
||||
CONDITION = 'condition',
|
||||
|
||||
START_TRIGGER = 'start_trigger',
|
||||
@@ -271,7 +272,11 @@ export function isConditionBlockType(blockType: string | undefined): boolean {
|
||||
}
|
||||
|
||||
export function isRouterBlockType(blockType: string | undefined): boolean {
|
||||
return blockType === BlockType.ROUTER
|
||||
return blockType === BlockType.ROUTER || blockType === BlockType.ROUTER_V2
|
||||
}
|
||||
|
||||
export function isRouterV2BlockType(blockType: string | undefined): boolean {
|
||||
return blockType === BlockType.ROUTER_V2
|
||||
}
|
||||
|
||||
export function isAgentBlockType(blockType: string | undefined): boolean {
|
||||
|
||||
@@ -1,5 +1,10 @@
|
||||
import { createLogger } from '@sim/logger'
|
||||
import { EDGE, isConditionBlockType, isRouterBlockType } from '@/executor/constants'
|
||||
import {
|
||||
EDGE,
|
||||
isConditionBlockType,
|
||||
isRouterBlockType,
|
||||
isRouterV2BlockType,
|
||||
} from '@/executor/constants'
|
||||
import type { DAG } from '@/executor/dag/builder'
|
||||
import {
|
||||
buildBranchNodeId,
|
||||
@@ -19,10 +24,17 @@ interface ConditionConfig {
|
||||
condition: string
|
||||
}
|
||||
|
||||
interface RouterV2RouteConfig {
|
||||
id: string
|
||||
title: string
|
||||
description: string
|
||||
}
|
||||
|
||||
interface EdgeMetadata {
|
||||
blockTypeMap: Map<string, string>
|
||||
conditionConfigMap: Map<string, ConditionConfig[]>
|
||||
routerBlockIds: Set<string>
|
||||
routerV2ConfigMap: Map<string, RouterV2RouteConfig[]>
|
||||
}
|
||||
|
||||
export class EdgeConstructor {
|
||||
@@ -58,6 +70,7 @@ export class EdgeConstructor {
|
||||
const blockTypeMap = new Map<string, string>()
|
||||
const conditionConfigMap = new Map<string, ConditionConfig[]>()
|
||||
const routerBlockIds = new Set<string>()
|
||||
const routerV2ConfigMap = new Map<string, RouterV2RouteConfig[]>()
|
||||
|
||||
for (const block of workflow.blocks) {
|
||||
const blockType = block.metadata?.id ?? ''
|
||||
@@ -69,12 +82,19 @@ export class EdgeConstructor {
|
||||
if (conditions) {
|
||||
conditionConfigMap.set(block.id, conditions)
|
||||
}
|
||||
} else if (isRouterV2BlockType(blockType)) {
|
||||
// Router V2 uses port-based routing with route configs
|
||||
const routes = this.parseRouterV2Config(block)
|
||||
if (routes) {
|
||||
routerV2ConfigMap.set(block.id, routes)
|
||||
}
|
||||
} else if (isRouterBlockType(blockType)) {
|
||||
// Legacy router uses target block IDs
|
||||
routerBlockIds.add(block.id)
|
||||
}
|
||||
}
|
||||
|
||||
return { blockTypeMap, conditionConfigMap, routerBlockIds }
|
||||
return { blockTypeMap, conditionConfigMap, routerBlockIds, routerV2ConfigMap }
|
||||
}
|
||||
|
||||
private parseConditionConfig(block: any): ConditionConfig[] | null {
|
||||
@@ -100,6 +120,29 @@ export class EdgeConstructor {
|
||||
}
|
||||
}
|
||||
|
||||
private parseRouterV2Config(block: any): RouterV2RouteConfig[] | null {
|
||||
try {
|
||||
const routesJson = block.config.params?.routes
|
||||
|
||||
if (typeof routesJson === 'string') {
|
||||
return JSON.parse(routesJson)
|
||||
}
|
||||
|
||||
if (Array.isArray(routesJson)) {
|
||||
return routesJson
|
||||
}
|
||||
|
||||
return null
|
||||
} catch (error) {
|
||||
logger.warn('Failed to parse router v2 config', {
|
||||
blockId: block.id,
|
||||
error: error instanceof Error ? error.message : String(error),
|
||||
})
|
||||
|
||||
return null
|
||||
}
|
||||
}
|
||||
|
||||
private generateSourceHandle(
|
||||
source: string,
|
||||
target: string,
|
||||
@@ -123,6 +166,26 @@ export class EdgeConstructor {
|
||||
}
|
||||
}
|
||||
|
||||
// Router V2 uses port-based routing - handle is already set from UI (router-{routeId})
|
||||
// We don't modify it here, just validate it exists
|
||||
if (metadata.routerV2ConfigMap.has(source)) {
|
||||
// For router_v2, the sourceHandle should already be set from the UI
|
||||
// If not set and not an error handle, generate based on route index
|
||||
if (!handle || (!handle.startsWith(EDGE.ROUTER_PREFIX) && handle !== EDGE.ERROR)) {
|
||||
const routes = metadata.routerV2ConfigMap.get(source)
|
||||
if (routes && routes.length > 0) {
|
||||
const edgesFromRouter = workflow.connections.filter((c) => c.source === source)
|
||||
const edgeIndex = edgesFromRouter.findIndex((e) => e.target === target)
|
||||
|
||||
if (edgeIndex >= 0 && edgeIndex < routes.length) {
|
||||
const correspondingRoute = routes[edgeIndex]
|
||||
handle = `${EDGE.ROUTER_PREFIX}${correspondingRoute.id}`
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// Legacy router uses target block ID
|
||||
if (metadata.routerBlockIds.has(source) && handle !== EDGE.ERROR) {
|
||||
handle = `${EDGE.ROUTER_PREFIX}${target}`
|
||||
}
|
||||
|
||||
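The fallback above can be sketched in isolation. This is a hypothetical standalone version, not the actual `EdgeConstructor` code: the `Route` and `Connection` shapes are simplified, and `ROUTER_PREFIX` is assumed to be `'router-'` based on the `router-{routeId}` comment in the diff. When an edge leaving a router_v2 block lacks an explicit `sourceHandle`, the route is chosen by the edge's index among all edges from that router.

```typescript
// Hypothetical sketch of the route-index fallback in generateSourceHandle.
interface Route {
  id: string
  title: string
}

interface Connection {
  source: string
  target: string
  sourceHandle?: string
}

const ROUTER_PREFIX = 'router-' // assumed value of EDGE.ROUTER_PREFIX

function resolveHandle(
  conn: Connection,
  routes: Route[],
  allConnections: Connection[]
): string | undefined {
  // An explicit router handle from the UI is kept as-is.
  if (conn.sourceHandle?.startsWith(ROUTER_PREFIX)) return conn.sourceHandle

  // Otherwise, map the edge's position among this router's edges to a route.
  const edgesFromRouter = allConnections.filter((c) => c.source === conn.source)
  const edgeIndex = edgesFromRouter.findIndex((e) => e.target === conn.target)
  if (edgeIndex >= 0 && edgeIndex < routes.length) {
    return `${ROUTER_PREFIX}${routes[edgeIndex].id}`
  }
  return conn.sourceHandle
}
```

The index-based fallback only works when edges and routes are in matching order, which is presumably why the comment stresses that the handle should already be set from the UI.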
@@ -331,6 +331,22 @@ export class BlockExecutor {
      }
      return filtered
    }

    const isTrigger =
      block.metadata?.category === 'triggers' ||
      block.config?.params?.triggerMode === true ||
      block.metadata?.id === BlockType.STARTER

    if (isTrigger) {
      const filtered: NormalizedBlockOutput = {}
      const internalKeys = ['webhook', 'workflowId', 'input']
      for (const [key, value] of Object.entries(output)) {
        if (internalKeys.includes(key)) continue
        filtered[key] = value
      }
      return filtered
    }

    return output
  }
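The trigger-output filtering added above can be exercised on its own. A minimal sketch, with `NormalizedBlockOutput` simplified to a plain record: the internal bookkeeping keys named in the diff are dropped before a trigger block's output is exposed downstream.

```typescript
// Internal keys stripped from trigger outputs, as listed in the diff.
const INTERNAL_KEYS = ['webhook', 'workflowId', 'input']

function filterTriggerOutput(output: Record<string, unknown>): Record<string, unknown> {
  const filtered: Record<string, unknown> = {}
  for (const [key, value] of Object.entries(output)) {
    if (INTERNAL_KEYS.includes(key)) continue
    filtered[key] = value
  }
  return filtered
}
```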
@@ -510,7 +526,7 @@ export class BlockExecutor {
    const placeholderState: BlockState = {
      output: {
        url: resumeLinks.uiUrl,
        // apiUrl: resumeLinks.apiUrl, // Hidden from output
        resumeEndpoint: resumeLinks.apiUrl,
      },
      executed: false,
      executionTime: existingState?.executionTime ?? 0,

@@ -227,7 +227,7 @@ export class HumanInTheLoopBlockHandler implements BlockHandler {

    if (resumeLinks) {
      output.url = resumeLinks.uiUrl
      // output.apiUrl = resumeLinks.apiUrl // Hidden from output
      output.resumeEndpoint = resumeLinks.apiUrl
    }

    return output
@@ -576,9 +576,9 @@ export class HumanInTheLoopBlockHandler implements BlockHandler {
      if (context.resumeLinks.uiUrl) {
        pauseOutput.url = context.resumeLinks.uiUrl
      }
      // if (context.resumeLinks.apiUrl) {
      //   pauseOutput.apiUrl = context.resumeLinks.apiUrl
      // } // Hidden from output
      if (context.resumeLinks.apiUrl) {
        pauseOutput.resumeEndpoint = context.resumeLinks.apiUrl
      }
    }

    if (Array.isArray(context.inputFormat)) {

@@ -1,7 +1,6 @@
import { createLogger } from '@sim/logger'
import type { BlockOutput } from '@/blocks/types'
import { BlockType, HTTP, REFERENCE } from '@/executor/constants'
import type { BlockHandler, ExecutionContext } from '@/executor/types'
import type { BlockHandler, ExecutionContext, NormalizedBlockOutput } from '@/executor/types'
import type { SerializedBlock } from '@/serializer/types'

const logger = createLogger('ResponseBlockHandler')
@@ -23,7 +22,7 @@ export class ResponseBlockHandler implements BlockHandler {
    ctx: ExecutionContext,
    block: SerializedBlock,
    inputs: Record<string, any>
  ): Promise<BlockOutput> {
  ): Promise<NormalizedBlockOutput> {
    logger.info(`Executing response block: ${block.id}`)

    try {
@@ -38,23 +37,19 @@ export class ResponseBlockHandler implements BlockHandler {
      })

      return {
        response: {
          data: responseData,
          status: statusCode,
          headers: responseHeaders,
        },
        data: responseData,
        status: statusCode,
        headers: responseHeaders,
      }
    } catch (error: any) {
      logger.error('Response block execution failed:', error)
      return {
        response: {
          data: {
            error: 'Response block execution failed',
            message: error.message || 'Unknown error',
          },
          status: HTTP.STATUS.SERVER_ERROR,
          headers: { 'Content-Type': HTTP.CONTENT_TYPE.JSON },
        data: {
          error: 'Response block execution failed',
          message: error.message || 'Unknown error',
        },
        status: HTTP.STATUS.SERVER_ERROR,
        headers: { 'Content-Type': HTTP.CONTENT_TYPE.JSON },
      }
    }
  }
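With the Response block output flattened from `{ response: { data, status, headers } }` to `{ data, status, headers }`, old references like `<response.response.data>` must still resolve. The resolver itself is not shown in this diff, so the following is only an assumed sketch of the backwards-compat rule that the `BlockResolver` tests later in this diff exercise: for a Response block, a leading `response.` path segment is stripped unless the output actually has a `response` key.

```typescript
// Hypothetical sketch of the backwards-compat path resolution (not the real
// BlockResolver): strip a legacy leading 'response' segment when the flat
// output has no such key, then walk the remaining path.
function resolvePath(output: Record<string, any>, path: string[]): any {
  if (path[0] === 'response' && !('response' in output)) {
    path = path.slice(1) // old format <block.response.data> → <block.data>
  }
  return path.reduce((acc, key) => acc?.[key], output)
}
```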
@@ -4,29 +4,60 @@ import { createLogger } from '@sim/logger'
import { eq } from 'drizzle-orm'
import { getBaseUrl } from '@/lib/core/utils/urls'
import { refreshTokenIfNeeded } from '@/app/api/auth/oauth/utils'
import { generateRouterPrompt } from '@/blocks/blocks/router'
import { generateRouterPrompt, generateRouterV2Prompt } from '@/blocks/blocks/router'
import type { BlockOutput } from '@/blocks/types'
import { BlockType, DEFAULTS, HTTP, isAgentBlockType, ROUTER } from '@/executor/constants'
import {
  BlockType,
  DEFAULTS,
  HTTP,
  isAgentBlockType,
  isRouterV2BlockType,
  ROUTER,
} from '@/executor/constants'
import type { BlockHandler, ExecutionContext } from '@/executor/types'
import { calculateCost, getProviderFromModel } from '@/providers/utils'
import type { SerializedBlock } from '@/serializer/types'

const logger = createLogger('RouterBlockHandler')

interface RouteDefinition {
  id: string
  title: string
  value: string
}

/**
 * Handler for Router blocks that dynamically select execution paths.
 * Supports both legacy router (block-based) and router_v2 (port-based).
 */
export class RouterBlockHandler implements BlockHandler {
  constructor(private pathTracker?: any) {}

  canHandle(block: SerializedBlock): boolean {
    return block.metadata?.id === BlockType.ROUTER
    return block.metadata?.id === BlockType.ROUTER || block.metadata?.id === BlockType.ROUTER_V2
  }

  async execute(
    ctx: ExecutionContext,
    block: SerializedBlock,
    inputs: Record<string, any>
  ): Promise<BlockOutput> {
    const isV2 = isRouterV2BlockType(block.metadata?.id)

    if (isV2) {
      return this.executeV2(ctx, block, inputs)
    }

    return this.executeLegacy(ctx, block, inputs)
  }

  /**
   * Execute legacy router (block-based routing).
   */
  private async executeLegacy(
    ctx: ExecutionContext,
    block: SerializedBlock,
    inputs: Record<string, any>
  ): Promise<BlockOutput> {
    const targetBlocks = this.getTargetBlocks(ctx, block)

@@ -144,6 +175,168 @@ export class RouterBlockHandler implements BlockHandler {
    }
  }

  /**
   * Execute router v2 (port-based routing).
   * Uses route definitions with descriptions instead of downstream block names.
   */
  private async executeV2(
    ctx: ExecutionContext,
    block: SerializedBlock,
    inputs: Record<string, any>
  ): Promise<BlockOutput> {
    const routes = this.parseRoutes(inputs.routes)

    if (routes.length === 0) {
      throw new Error('No routes defined for router')
    }

    const routerConfig = {
      context: inputs.context,
      model: inputs.model || ROUTER.DEFAULT_MODEL,
      apiKey: inputs.apiKey,
      vertexProject: inputs.vertexProject,
      vertexLocation: inputs.vertexLocation,
      vertexCredential: inputs.vertexCredential,
    }

    const providerId = getProviderFromModel(routerConfig.model)

    try {
      const url = new URL('/api/providers', getBaseUrl())

      const messages = [{ role: 'user', content: routerConfig.context }]
      const systemPrompt = generateRouterV2Prompt(routerConfig.context, routes)

      let finalApiKey: string | undefined = routerConfig.apiKey
      if (providerId === 'vertex' && routerConfig.vertexCredential) {
        finalApiKey = await this.resolveVertexCredential(routerConfig.vertexCredential)
      }

      const providerRequest: Record<string, any> = {
        provider: providerId,
        model: routerConfig.model,
        systemPrompt: systemPrompt,
        context: JSON.stringify(messages),
        temperature: ROUTER.INFERENCE_TEMPERATURE,
        apiKey: finalApiKey,
        workflowId: ctx.workflowId,
        workspaceId: ctx.workspaceId,
      }

      if (providerId === 'vertex') {
        providerRequest.vertexProject = routerConfig.vertexProject
        providerRequest.vertexLocation = routerConfig.vertexLocation
      }

      if (providerId === 'azure-openai') {
        providerRequest.azureEndpoint = inputs.azureEndpoint
        providerRequest.azureApiVersion = inputs.azureApiVersion
      }

      const response = await fetch(url.toString(), {
        method: 'POST',
        headers: {
          'Content-Type': HTTP.CONTENT_TYPE.JSON,
        },
        body: JSON.stringify(providerRequest),
      })

      if (!response.ok) {
        let errorMessage = `Provider API request failed with status ${response.status}`
        try {
          const errorData = await response.json()
          if (errorData.error) {
            errorMessage = errorData.error
          }
        } catch (_e) {}
        throw new Error(errorMessage)
      }

      const result = await response.json()

      const chosenRouteId = result.content.trim()
      const chosenRoute = routes.find((r) => r.id === chosenRouteId)

      if (!chosenRoute) {
        logger.error(
          `Invalid routing decision. Response content: "${result.content}", available routes:`,
          routes.map((r) => ({ id: r.id, title: r.title }))
        )
        throw new Error(`Invalid routing decision: ${chosenRouteId}`)
      }

      // Find the target block connected to this route's handle
      const connection = ctx.workflow?.connections.find(
        (conn) => conn.source === block.id && conn.sourceHandle === `router-${chosenRoute.id}`
      )

      const targetBlock = connection
        ? ctx.workflow?.blocks.find((b) => b.id === connection.target)
        : null

      const tokens = result.tokens || {
        input: DEFAULTS.TOKENS.PROMPT,
        output: DEFAULTS.TOKENS.COMPLETION,
        total: DEFAULTS.TOKENS.TOTAL,
      }

      const cost = calculateCost(
        result.model,
        tokens.input || DEFAULTS.TOKENS.PROMPT,
        tokens.output || DEFAULTS.TOKENS.COMPLETION,
        false
      )

      return {
        context: inputs.context,
        model: result.model,
        tokens: {
          input: tokens.input || DEFAULTS.TOKENS.PROMPT,
          output: tokens.output || DEFAULTS.TOKENS.COMPLETION,
          total: tokens.total || DEFAULTS.TOKENS.TOTAL,
        },
        cost: {
          input: cost.input,
          output: cost.output,
          total: cost.total,
        },
        selectedRoute: chosenRoute.id,
        selectedPath: targetBlock
          ? {
              blockId: targetBlock.id,
              blockType: targetBlock.metadata?.id || DEFAULTS.BLOCK_TYPE,
              blockTitle: targetBlock.metadata?.name || DEFAULTS.BLOCK_TITLE,
            }
          : {
              blockId: '',
              blockType: DEFAULTS.BLOCK_TYPE,
              blockTitle: chosenRoute.title,
            },
      } as BlockOutput
    } catch (error) {
      logger.error('Router V2 execution failed:', error)
      throw error
    }
  }

  /**
   * Parse routes from input (can be JSON string or array).
   */
  private parseRoutes(input: any): RouteDefinition[] {
    try {
      if (typeof input === 'string') {
        return JSON.parse(input)
      }
      if (Array.isArray(input)) {
        return input
      }
      return []
    } catch (error) {
      logger.error('Failed to parse routes:', { input, error })
      return []
    }
  }

  private getTargetBlocks(ctx: ExecutionContext, block: SerializedBlock) {
    return ctx.workflow?.connections
      .filter((conn) => conn.source === block.id)

@@ -205,7 +205,6 @@ describe('TriggerBlockHandler', () => {

    const result = await handler.execute(mockContext, scheduleBlock, {})

    // Schedule triggers typically don't have input data, just trigger the workflow
    expect(result).toEqual({})
  })


@@ -31,10 +31,7 @@ export class TriggerBlockHandler implements BlockHandler {

    const existingState = ctx.blockStates.get(block.id)
    if (existingState?.output && Object.keys(existingState.output).length > 0) {
      const existingOutput = existingState.output as any
      const existingProvider = existingOutput?.webhook?.data?.provider

      return existingOutput
      return existingState.output
    }

    const starterBlock = ctx.workflow?.blocks?.find((b) => b.metadata?.id === 'starter')
@@ -44,88 +41,8 @@
    const starterOutput = starterState.output

    if (starterOutput.webhook?.data) {
      const webhookData = starterOutput.webhook?.data || {}
      const provider = webhookData.provider

      if (provider === 'github') {
        const payloadSource = webhookData.payload || {}
        return {
          ...payloadSource,
          webhook: starterOutput.webhook,
        }
      }

      if (provider === 'microsoft-teams') {
        const providerData = (starterOutput as any)[provider] || webhookData[provider] || {}
        const payloadSource = providerData?.message?.raw || webhookData.payload || {}
        return {
          ...payloadSource,
          [provider]: providerData,
          webhook: starterOutput.webhook,
        }
      }

      if (provider === 'airtable') {
        return starterOutput
      }

      const result: any = {
        input: starterOutput.input,
      }

      for (const [key, value] of Object.entries(starterOutput)) {
        if (key !== 'webhook' && key !== provider) {
          result[key] = value
        }
      }

      if (provider && starterOutput[provider]) {
        const providerData = starterOutput[provider]

        for (const [key, value] of Object.entries(providerData)) {
          if (typeof value === 'object' && value !== null) {
            if (!result[key]) {
              result[key] = value
            }
          }
        }

        result[provider] = providerData
      } else if (provider && webhookData[provider]) {
        const providerData = webhookData[provider]

        for (const [key, value] of Object.entries(providerData)) {
          if (typeof value === 'object' && value !== null) {
            if (!result[key]) {
              result[key] = value
            }
          }
        }

        result[provider] = providerData
      } else if (
        provider &&
        (provider === 'gmail' || provider === 'outlook') &&
        webhookData.payload?.email
      ) {
        const emailData = webhookData.payload.email

        for (const [key, value] of Object.entries(emailData)) {
          if (!result[key]) {
            result[key] = value
          }
        }

        result.email = emailData

        if (webhookData.payload.timestamp) {
          result.timestamp = webhookData.payload.timestamp
        }
      }

      if (starterOutput.webhook) result.webhook = starterOutput.webhook

      return result
      const { webhook, workflowId, ...cleanOutput } = starterOutput
      return cleanOutput
    }

    return starterOutput
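The whole provider-specific branching above collapses into a single rest-destructuring expression that strips the internal `webhook` and `workflowId` keys. A small self-contained illustration of that pattern (with a made-up `starterOutput` value, not data from the repo):

```typescript
// Rest destructuring removes named keys and collects the remainder,
// mirroring the simplified TriggerBlockHandler in the diff.
const starterOutput: Record<string, unknown> = {
  webhook: { data: {} },
  workflowId: 'wf-1',
  email: { subject: 'hi' },
}

const { webhook, workflowId, ...cleanOutput } = starterOutput
// cleanOutput now contains only the provider-facing fields (here: email)
```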
@@ -109,6 +109,9 @@ export class WorkflowBlockHandler implements BlockHandler {
      contextExtensions: {
        isChildExecution: true,
        isDeployedContext: ctx.isDeployedContext === true,
        workspaceId: ctx.workspaceId,
        userId: ctx.userId,
        executionId: ctx.executionId,
      },
    })


@@ -293,6 +293,532 @@ describe('BlockResolver', () => {
    })
  })

  describe('Response block backwards compatibility', () => {
    it.concurrent('should resolve new format: <responseBlock.data>', () => {
      const workflow = createTestWorkflow([
        { id: 'response-block', name: 'Response', type: 'response' },
      ])
      const resolver = new BlockResolver(workflow)
      const ctx = createTestContext('current', {
        'response-block': {
          data: { message: 'hello', userId: 123 },
          status: 200,
          headers: { 'Content-Type': 'application/json' },
        },
      })

      expect(resolver.resolve('<response.data>', ctx)).toEqual({ message: 'hello', userId: 123 })
      expect(resolver.resolve('<response.data.message>', ctx)).toBe('hello')
      expect(resolver.resolve('<response.data.userId>', ctx)).toBe(123)
    })

    it.concurrent('should resolve new format: <responseBlock.status>', () => {
      const workflow = createTestWorkflow([
        { id: 'response-block', name: 'Response', type: 'response' },
      ])
      const resolver = new BlockResolver(workflow)
      const ctx = createTestContext('current', {
        'response-block': {
          data: { message: 'hello' },
          status: 201,
          headers: {},
        },
      })

      expect(resolver.resolve('<response.status>', ctx)).toBe(201)
    })

    it.concurrent('should resolve new format: <responseBlock.headers>', () => {
      const workflow = createTestWorkflow([
        { id: 'response-block', name: 'Response', type: 'response' },
      ])
      const resolver = new BlockResolver(workflow)
      const ctx = createTestContext('current', {
        'response-block': {
          data: {},
          status: 200,
          headers: { 'X-Custom-Header': 'custom-value', 'Content-Type': 'application/json' },
        },
      })

      expect(resolver.resolve('<response.headers>', ctx)).toEqual({
        'X-Custom-Header': 'custom-value',
        'Content-Type': 'application/json',
      })
    })

    it.concurrent(
      'should resolve old format (backwards compat): <responseBlock.response.data>',
      () => {
        const workflow = createTestWorkflow([
          { id: 'response-block', name: 'Response', type: 'response' },
        ])
        const resolver = new BlockResolver(workflow)
        const ctx = createTestContext('current', {
          'response-block': {
            data: { message: 'hello', userId: 123 },
            status: 200,
            headers: { 'Content-Type': 'application/json' },
          },
        })

        // Old format: <responseBlock.response.data> should strip 'response.' and resolve to data
        expect(resolver.resolve('<response.response.data>', ctx)).toEqual({
          message: 'hello',
          userId: 123,
        })
        expect(resolver.resolve('<response.response.data.message>', ctx)).toBe('hello')
        expect(resolver.resolve('<response.response.data.userId>', ctx)).toBe(123)
      }
    )

    it.concurrent(
      'should resolve old format (backwards compat): <responseBlock.response.status>',
      () => {
        const workflow = createTestWorkflow([
          { id: 'response-block', name: 'Response', type: 'response' },
        ])
        const resolver = new BlockResolver(workflow)
        const ctx = createTestContext('current', {
          'response-block': {
            data: { message: 'hello' },
            status: 404,
            headers: {},
          },
        })

        // Old format: <responseBlock.response.status> should strip 'response.' and resolve to status
        expect(resolver.resolve('<response.response.status>', ctx)).toBe(404)
      }
    )

    it.concurrent(
      'should resolve old format (backwards compat): <responseBlock.response.headers>',
      () => {
        const workflow = createTestWorkflow([
          { id: 'response-block', name: 'Response', type: 'response' },
        ])
        const resolver = new BlockResolver(workflow)
        const ctx = createTestContext('current', {
          'response-block': {
            data: {},
            status: 200,
            headers: { 'X-Request-Id': 'abc-123' },
          },
        })

        // Old format: <responseBlock.response.headers> should strip 'response.' and resolve to headers
        expect(resolver.resolve('<response.response.headers>', ctx)).toEqual({
          'X-Request-Id': 'abc-123',
        })
      }
    )

    it.concurrent('should resolve entire Response block output with new format', () => {
      const workflow = createTestWorkflow([
        { id: 'response-block', name: 'My Response', type: 'response' },
      ])
      const resolver = new BlockResolver(workflow)
      const fullOutput = {
        data: { result: 'success' },
        status: 200,
        headers: { 'Content-Type': 'application/json' },
      }
      const ctx = createTestContext('current', { 'response-block': fullOutput })

      expect(resolver.resolve('<myresponse>', ctx)).toEqual(fullOutput)
    })

    it.concurrent(
      'should only strip response prefix for response block type, not other blocks',
      () => {
        // For non-response blocks, 'response' is a valid property name that should NOT be stripped
        const workflow = createTestWorkflow([{ id: 'agent-block', name: 'Agent', type: 'agent' }])
        const resolver = new BlockResolver(workflow)
        const ctx = createTestContext('current', {
          'agent-block': {
            response: { content: 'AI generated text' },
            tokens: { input: 100, output: 50 },
          },
        })

        // For agent blocks, 'response' is a valid property and should be accessed normally
        expect(resolver.resolve('<agent.response.content>', ctx)).toBe('AI generated text')
      }
    )

    it.concurrent(
      'should NOT strip response prefix if output actually has response key (edge case)',
      () => {
        // Edge case: What if a Response block somehow has a 'response' key in its output?
        // This shouldn't happen in practice, but if it does, we should respect it.
        const workflow = createTestWorkflow([
          { id: 'response-block', name: 'Response', type: 'response' },
        ])
        const resolver = new BlockResolver(workflow)
        // Hypothetical edge case where output has an actual 'response' property
        const ctx = createTestContext('current', {
          'response-block': {
            response: { legacyData: 'some value' },
            data: { newData: 'other value' },
          },
        })

        // Since output.response exists, we should NOT strip it - access the actual 'response' property
        expect(resolver.resolve('<response.response.legacyData>', ctx)).toBe('some value')
        expect(resolver.resolve('<response.data.newData>', ctx)).toBe('other value')
      }
    )
  })

  describe('Workflow block with child Response block backwards compatibility', () => {
    it.concurrent('should resolve new format: <workflowBlock.result.data>', () => {
      const workflow = createTestWorkflow([
        { id: 'workflow-block', name: 'My Workflow', type: 'workflow' },
      ])
      const resolver = new BlockResolver(workflow)
      // After our change, child workflow with Response block returns { data, status, headers }
      // Workflow block wraps it in { success, result: { data, status, headers }, ... }
      const ctx = createTestContext('current', {
        'workflow-block': {
          success: true,
          childWorkflowName: 'Child Workflow',
          result: {
            data: { userId: 456, name: 'Test User' },
            status: 200,
            headers: { 'Content-Type': 'application/json' },
          },
        },
      })

      expect(resolver.resolve('<myworkflow.result.data>', ctx)).toEqual({
        userId: 456,
        name: 'Test User',
      })
      expect(resolver.resolve('<myworkflow.result.data.userId>', ctx)).toBe(456)
      expect(resolver.resolve('<myworkflow.result.data.name>', ctx)).toBe('Test User')
    })

    it.concurrent('should resolve new format: <workflowBlock.result.status>', () => {
      const workflow = createTestWorkflow([
        { id: 'workflow-block', name: 'My Workflow', type: 'workflow' },
      ])
      const resolver = new BlockResolver(workflow)
      const ctx = createTestContext('current', {
        'workflow-block': {
          success: true,
          childWorkflowName: 'Child Workflow',
          result: {
            data: { message: 'created' },
            status: 201,
            headers: {},
          },
        },
      })

      expect(resolver.resolve('<myworkflow.result.status>', ctx)).toBe(201)
    })

    it.concurrent('should resolve new format: <workflowBlock.result.headers>', () => {
      const workflow = createTestWorkflow([
        { id: 'workflow-block', name: 'My Workflow', type: 'workflow' },
      ])
      const resolver = new BlockResolver(workflow)
      const ctx = createTestContext('current', {
        'workflow-block': {
          success: true,
          childWorkflowName: 'Child Workflow',
          result: {
            data: {},
            status: 200,
            headers: { 'X-Trace-Id': 'trace-abc-123' },
          },
        },
      })

      expect(resolver.resolve('<myworkflow.result.headers>', ctx)).toEqual({
        'X-Trace-Id': 'trace-abc-123',
      })
    })

    it.concurrent(
      'should resolve old format (backwards compat): <workflowBlock.result.response.data>',
      () => {
        const workflow = createTestWorkflow([
          { id: 'workflow-block', name: 'My Workflow', type: 'workflow' },
        ])
        const resolver = new BlockResolver(workflow)
        const ctx = createTestContext('current', {
          'workflow-block': {
            success: true,
            childWorkflowName: 'Child Workflow',
            result: {
              data: { userId: 456, name: 'Test User' },
              status: 200,
              headers: { 'Content-Type': 'application/json' },
            },
          },
        })

        // Old format: <workflowBlock.result.response.data> should strip 'response.' and resolve to result.data
        expect(resolver.resolve('<myworkflow.result.response.data>', ctx)).toEqual({
          userId: 456,
          name: 'Test User',
        })
        expect(resolver.resolve('<myworkflow.result.response.data.userId>', ctx)).toBe(456)
        expect(resolver.resolve('<myworkflow.result.response.data.name>', ctx)).toBe('Test User')
      }
    )

    it.concurrent(
      'should resolve old format (backwards compat): <workflowBlock.result.response.status>',
      () => {
        const workflow = createTestWorkflow([
          { id: 'workflow-block', name: 'My Workflow', type: 'workflow' },
        ])
        const resolver = new BlockResolver(workflow)
        const ctx = createTestContext('current', {
          'workflow-block': {
            success: true,
            childWorkflowName: 'Child Workflow',
            result: {
              data: { message: 'error' },
              status: 500,
              headers: {},
            },
          },
        })

        // Old format: <workflowBlock.result.response.status> should strip 'response.' and resolve to result.status
        expect(resolver.resolve('<myworkflow.result.response.status>', ctx)).toBe(500)
      }
    )

    it.concurrent(
      'should resolve old format (backwards compat): <workflowBlock.result.response.headers>',
      () => {
        const workflow = createTestWorkflow([
          { id: 'workflow-block', name: 'My Workflow', type: 'workflow' },
        ])
        const resolver = new BlockResolver(workflow)
        const ctx = createTestContext('current', {
          'workflow-block': {
            success: true,
            childWorkflowName: 'Child Workflow',
            result: {
              data: {},
              status: 200,
              headers: { 'Cache-Control': 'no-cache' },
            },
          },
        })

        // Old format: <workflowBlock.result.response.headers> should strip 'response.' and resolve to result.headers
        expect(resolver.resolve('<myworkflow.result.response.headers>', ctx)).toEqual({
          'Cache-Control': 'no-cache',
        })
      }
    )

    it.concurrent('should resolve workflow block success and other properties', () => {
      const workflow = createTestWorkflow([
        { id: 'workflow-block', name: 'My Workflow', type: 'workflow' },
      ])
      const resolver = new BlockResolver(workflow)
      const ctx = createTestContext('current', {
        'workflow-block': {
          success: true,
          childWorkflowName: 'Child Workflow',
          result: { data: {}, status: 200, headers: {} },
        },
      })

      expect(resolver.resolve('<myworkflow.success>', ctx)).toBe(true)
      expect(resolver.resolve('<myworkflow.childWorkflowName>', ctx)).toBe('Child Workflow')
    })

    it.concurrent('should handle workflow block with failed child workflow', () => {
      const workflow = createTestWorkflow([
|
||||
{ id: 'workflow-block', name: 'My Workflow', type: 'workflow' },
|
||||
])
|
||||
const resolver = new BlockResolver(workflow)
|
||||
const ctx = createTestContext('current', {
|
||||
'workflow-block': {
|
||||
success: false,
|
||||
childWorkflowName: 'Child Workflow',
|
||||
result: {},
|
||||
error: 'Child workflow execution failed',
|
||||
},
|
||||
})
|
||||
|
||||
expect(resolver.resolve('<myworkflow.success>', ctx)).toBe(false)
|
||||
expect(resolver.resolve('<myworkflow.error>', ctx)).toBe('Child workflow execution failed')
|
||||
})
|
||||
|
||||
it.concurrent('should handle workflow block where child has non-Response final block', () => {
|
||||
// When child workflow does NOT have a Response block as final block,
|
||||
// the result structure will be different (not data/status/headers)
|
||||
const workflow = createTestWorkflow([
|
||||
{ id: 'workflow-block', name: 'My Workflow', type: 'workflow' },
|
||||
])
|
||||
const resolver = new BlockResolver(workflow)
|
||||
const ctx = createTestContext('current', {
|
||||
'workflow-block': {
|
||||
success: true,
|
||||
childWorkflowName: 'Child Workflow',
|
||||
result: {
|
||||
content: 'AI generated response',
|
||||
tokens: { input: 100, output: 50 },
|
||||
},
|
||||
},
|
||||
})
|
||||
|
||||
// No backwards compat needed here since child didn't have Response block
|
||||
expect(resolver.resolve('<myworkflow.result.content>', ctx)).toBe('AI generated response')
|
||||
expect(resolver.resolve('<myworkflow.result.tokens.input>', ctx)).toBe(100)
|
||||
})
|
||||
|
||||
it.concurrent('should not apply workflow backwards compat for non-workflow blocks', () => {
|
||||
// For non-workflow blocks, 'result.response' is a valid path that should NOT be modified
|
||||
const workflow = createTestWorkflow([
|
||||
{ id: 'function-block', name: 'Function', type: 'function' },
|
||||
])
|
||||
const resolver = new BlockResolver(workflow)
|
||||
const ctx = createTestContext('current', {
|
||||
'function-block': {
|
||||
result: {
|
||||
response: { apiData: 'test' },
|
||||
other: 'value',
|
||||
},
|
||||
},
|
||||
})
|
||||
|
||||
// For function blocks, 'result.response' is a valid nested property
|
||||
expect(resolver.resolve('<function.result.response.apiData>', ctx)).toBe('test')
|
||||
})
|
||||
|
||||
it.concurrent(
|
||||
'should NOT strip result.response if child actually has response property (edge case)',
|
||||
() => {
|
||||
// Edge case: Child workflow's final output legitimately has a 'response' property
|
||||
// (e.g., child ended with an Agent block that outputs response data)
|
||||
const workflow = createTestWorkflow([
|
||||
{ id: 'workflow-block', name: 'My Workflow', type: 'workflow' },
|
||||
])
|
||||
const resolver = new BlockResolver(workflow)
|
||||
const ctx = createTestContext('current', {
|
||||
'workflow-block': {
|
||||
success: true,
|
||||
childWorkflowName: 'Child Workflow',
|
||||
result: {
|
||||
// Child workflow ended with Agent block, not Response block
|
||||
content: 'AI generated text',
|
||||
response: { apiCallData: 'from external API' }, // legitimate 'response' property
|
||||
},
|
||||
},
|
||||
})
|
||||
|
||||
// Since output.result.response exists, we should NOT strip it - access the actual property
|
||||
expect(resolver.resolve('<myworkflow.result.response.apiCallData>', ctx)).toBe(
|
||||
'from external API'
|
||||
)
|
||||
expect(resolver.resolve('<myworkflow.result.content>', ctx)).toBe('AI generated text')
|
||||
}
|
||||
)
|
||||
|
||||
it.concurrent('should handle mixed scenarios correctly', () => {
|
||||
// Test that new format works when child workflow had Response block
|
||||
const workflow = createTestWorkflow([
|
||||
{ id: 'workflow-block', name: 'My Workflow', type: 'workflow' },
|
||||
])
|
||||
const resolver = new BlockResolver(workflow)
|
||||
|
||||
// Scenario 1: Child had Response block (new format - no 'response' key in result)
|
||||
const ctx1 = createTestContext('current', {
|
||||
'workflow-block': {
|
||||
success: true,
|
||||
result: { data: { id: 1 }, status: 200, headers: {} },
|
||||
},
|
||||
})
|
||||
// New format works
|
||||
expect(resolver.resolve('<myworkflow.result.data.id>', ctx1)).toBe(1)
|
||||
// Old format also works (backwards compat kicks in because result.response is undefined)
|
||||
expect(resolver.resolve('<myworkflow.result.response.data.id>', ctx1)).toBe(1)
|
||||
|
||||
// Scenario 2: Child had Agent block with 'response' property
|
||||
const ctx2 = createTestContext('current', {
|
||||
'workflow-block': {
|
||||
success: true,
|
||||
result: {
|
||||
content: 'text',
|
||||
response: { external: 'data' }, // actual 'response' property
|
||||
},
|
||||
},
|
||||
})
|
||||
// Access the actual 'response' property - no stripping
|
||||
expect(resolver.resolve('<myworkflow.result.response.external>', ctx2)).toBe('data')
|
||||
})
|
||||
|
||||
it.concurrent(
|
||||
'real-world scenario: parent workflow referencing child Response block via <workflow1.result.response.data>',
|
||||
() => {
|
||||
/**
|
||||
* This test simulates the exact scenario from user workflows:
|
||||
*
|
||||
* Child workflow (vibrant-cliff):
|
||||
* Start → Function 1 (returns "fuck") → Response 1
|
||||
* Response 1 outputs: { data: { hi: "fuck" }, status: 200, headers: {...} }
|
||||
*
|
||||
* Parent workflow (flying-glacier):
|
||||
* Start → Workflow 1 (calls vibrant-cliff) → Function 1
|
||||
* Function 1 code: return <workflow1.result.response.data>
|
||||
*
|
||||
* After our changes:
|
||||
* - Child Response block outputs { data, status, headers } (no wrapper)
|
||||
* - Workflow block wraps it in { success, result: { data, status, headers }, ... }
|
||||
* - Parent uses OLD reference <workflow1.result.response.data>
|
||||
* - Backwards compat should strip 'response.' and resolve to result.data
|
||||
*/
|
||||
const workflow = createTestWorkflow([
|
||||
{ id: 'workflow-block', name: 'Workflow 1', type: 'workflow' },
|
||||
])
|
||||
const resolver = new BlockResolver(workflow)
|
||||
|
||||
// Simulate the workflow block output after child (vibrant-cliff) executes
|
||||
// Child's Response block now outputs { data, status, headers } directly (no wrapper)
|
||||
// Workflow block wraps it in { success, result: <child_output>, ... }
|
||||
const ctx = createTestContext('current', {
|
||||
'workflow-block': {
|
||||
success: true,
|
||||
childWorkflowName: 'vibrant-cliff',
|
||||
result: {
|
||||
// This is what Response block outputs after our changes (no 'response' wrapper)
|
||||
data: { hi: 'fuck' },
|
||||
status: 200,
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
},
|
||||
},
|
||||
})
|
||||
|
||||
// OLD reference pattern: <workflow1.result.response.data>
|
||||
// Should work via backwards compatibility (strips 'response.')
|
||||
expect(resolver.resolve('<workflow1.result.response.data>', ctx)).toEqual({ hi: 'fuck' })
|
||||
expect(resolver.resolve('<workflow1.result.response.data.hi>', ctx)).toBe('fuck')
|
||||
expect(resolver.resolve('<workflow1.result.response.status>', ctx)).toBe(200)
|
||||
|
||||
// NEW reference pattern: <workflow1.result.data>
|
||||
// Should work directly
|
||||
expect(resolver.resolve('<workflow1.result.data>', ctx)).toEqual({ hi: 'fuck' })
|
||||
expect(resolver.resolve('<workflow1.result.data.hi>', ctx)).toBe('fuck')
|
||||
expect(resolver.resolve('<workflow1.result.status>', ctx)).toBe(200)
|
||||
|
||||
// Other workflow block properties should still work
|
||||
expect(resolver.resolve('<workflow1.success>', ctx)).toBe(true)
|
||||
expect(resolver.resolve('<workflow1.childWorkflowName>', ctx)).toBe('vibrant-cliff')
|
||||
}
|
||||
)
|
||||
})
|
||||
|
||||
describe('edge cases', () => {
  it.concurrent('should handle case-insensitive block name matching', () => {
    const workflow = createTestWorkflow([{ id: 'block-1', name: 'My Block' }])

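The fallback behavior these tests exercise can be condensed into a small standalone sketch. This is an illustrative reconstruction, not the actual BlockResolver code: `navigatePath` mirrors the helper referenced in the diff below, while `resolveWorkflowRef` is a hypothetical name for the workflow-block branch of the compat logic.

```typescript
// Sketch (assumptions labeled): resolve a path against a workflow block's
// output, stripping the legacy 'response' segment only when the output
// confirms the new format (no 'response' key under result).

function navigatePath(value: unknown, parts: string[]): unknown {
  let current: any = value
  for (const part of parts) {
    if (current == null || typeof current !== 'object') return undefined
    current = current[part]
  }
  return current
}

function resolveWorkflowRef(output: any, parts: string[]): unknown {
  // Try the path exactly as written first.
  const direct = navigatePath(output, parts)
  if (direct !== undefined) return direct

  // Backwards compat: <result.response.X> -> <result.X>, but only when
  // result.response does not actually exist on the output.
  if (parts[0] === 'result' && parts[1] === 'response' && output?.result?.response === undefined) {
    return navigatePath(output, ['result', ...parts.slice(2)])
  }
  return undefined
}
```

If the child's final output legitimately contains a `response` key, the direct lookup succeeds first and no stripping occurs, which matches the edge-case test above.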
@@ -48,7 +48,6 @@ export class BlockResolver implements Resolver {
|
||||
}
|
||||
|
||||
const output = this.getBlockOutput(blockId, context)
|
||||
|
||||
if (output === undefined) {
|
||||
return undefined
|
||||
}
|
||||
@@ -56,16 +55,62 @@ export class BlockResolver implements Resolver {
|
||||
return output
|
||||
}
|
||||
|
||||
const result = navigatePath(output, pathParts)
|
||||
// Try the original path first
|
||||
let result = navigatePath(output, pathParts)
|
||||
|
||||
if (result === undefined) {
|
||||
const availableKeys = output && typeof output === 'object' ? Object.keys(output) : []
|
||||
throw new Error(
|
||||
`No value found at path "${pathParts.join('.')}" in block "${blockName}". Available fields: ${availableKeys.join(', ')}`
|
||||
)
|
||||
// If successful, return it immediately
|
||||
if (result !== undefined) {
|
||||
return result
|
||||
}
|
||||
|
||||
return result
|
||||
// If failed, check if we should try backwards compatibility fallback
|
||||
const block = this.workflow.blocks.find((b) => b.id === blockId)
|
||||
|
||||
// Response block backwards compatibility:
|
||||
// Old: <responseBlock.response.data> -> New: <responseBlock.data>
|
||||
// Only apply fallback if:
|
||||
// 1. Block type is 'response'
|
||||
// 2. Path starts with 'response.'
|
||||
// 3. Output doesn't have a 'response' key (confirming it's the new format)
|
||||
if (
|
||||
block?.metadata?.id === 'response' &&
|
||||
pathParts[0] === 'response' &&
|
||||
output?.response === undefined
|
||||
) {
|
||||
const adjustedPathParts = pathParts.slice(1)
|
||||
if (adjustedPathParts.length === 0) {
|
||||
return output
|
||||
}
|
||||
result = navigatePath(output, adjustedPathParts)
|
||||
if (result !== undefined) {
|
||||
return result
|
||||
}
|
||||
}
|
||||
|
||||
// Workflow block backwards compatibility:
|
||||
// Old: <workflowBlock.result.response.data> -> New: <workflowBlock.result.data>
|
||||
// Only apply fallback if:
|
||||
// 1. Block type is 'workflow'
|
||||
// 2. Path starts with 'result.response.'
|
||||
// 3. output.result.response doesn't exist (confirming child used new format)
|
||||
if (
|
||||
block?.metadata?.id === 'workflow' &&
|
||||
pathParts[0] === 'result' &&
|
||||
pathParts[1] === 'response' &&
|
||||
output?.result?.response === undefined
|
||||
) {
|
||||
const adjustedPathParts = ['result', ...pathParts.slice(2)]
|
||||
result = navigatePath(output, adjustedPathParts)
|
||||
if (result !== undefined) {
|
||||
return result
|
||||
}
|
||||
}
|
||||
|
||||
// If still undefined, throw error with original path
|
||||
const availableKeys = output && typeof output === 'object' ? Object.keys(output) : []
|
||||
throw new Error(
|
||||
`No value found at path "${pathParts.join('.')}" in block "${blockName}". Available fields: ${availableKeys.join(', ')}`
|
||||
)
|
||||
}
|
||||
|
||||
private getBlockOutput(blockId: string, context: ResolutionContext): any {
|
||||
|
||||
@@ -228,6 +228,7 @@ export function useKnowledgeDocumentsQuery(
  params: KnowledgeDocumentsParams,
  options?: {
    enabled?: boolean
    refetchInterval?: number | false
  }
) {
  const paramsKey = serializeDocumentParams(params)
@@ -237,6 +238,7 @@ export function useKnowledgeDocumentsQuery(
    enabled: (options?.enabled ?? true) && Boolean(params.knowledgeBaseId),
    staleTime: 60 * 1000,
    placeholderData: keepPreviousData,
    refetchInterval: options?.refetchInterval ?? false,
  })
}


@@ -67,6 +67,7 @@ export function useKnowledgeBaseDocuments(
    sortBy?: string
    sortOrder?: string
    enabled?: boolean
    refetchInterval?: number | false
  }
) {
  const queryClient = useQueryClient()
@@ -92,6 +93,7 @@
    },
    {
      enabled: (options?.enabled ?? true) && Boolean(knowledgeBaseId),
      refetchInterval: options?.refetchInterval,
    }
  )

@@ -16,7 +16,7 @@ interface HeaderInfo {
interface Frontmatter {
  title?: string
  description?: string
  [key: string]: any
  [key: string]: unknown
}

const logger = createLogger('DocsChunker')

@@ -6,6 +6,11 @@ import { estimateTokenCount } from '@/lib/tokenization/estimators'

const logger = createLogger('JsonYamlChunker')

type JsonPrimitive = string | number | boolean | null
type JsonValue = JsonPrimitive | JsonObject | JsonArray
type JsonObject = { [key: string]: JsonValue }
type JsonArray = JsonValue[]

function getTokenCount(text: string): number {
  try {
    return getAccurateTokenCount(text, 'text-embedding-3-small')
@@ -59,11 +64,11 @@ export class JsonYamlChunker {
   */
  async chunk(content: string): Promise<Chunk[]> {
    try {
      let data: any
      let data: JsonValue
      try {
        data = JSON.parse(content)
        data = JSON.parse(content) as JsonValue
      } catch {
        data = yaml.load(content)
        data = yaml.load(content) as JsonValue
      }
      const chunks = this.chunkStructuredData(data)

@@ -86,7 +91,7 @@ export class JsonYamlChunker {
  /**
   * Chunk structured data based on its structure
   */
  private chunkStructuredData(data: any, path: string[] = []): Chunk[] {
  private chunkStructuredData(data: JsonValue, path: string[] = []): Chunk[] {
    const chunks: Chunk[] = []

    if (Array.isArray(data)) {
@@ -94,7 +99,7 @@ export class JsonYamlChunker {
    }

    if (typeof data === 'object' && data !== null) {
      return this.chunkObject(data, path)
      return this.chunkObject(data as JsonObject, path)
    }

    const content = JSON.stringify(data, null, 2)
@@ -118,9 +123,9 @@ export class JsonYamlChunker {
  /**
   * Chunk an array intelligently
   */
  private chunkArray(arr: any[], path: string[]): Chunk[] {
  private chunkArray(arr: JsonArray, path: string[]): Chunk[] {
    const chunks: Chunk[] = []
    let currentBatch: any[] = []
    let currentBatch: JsonValue[] = []
    let currentTokens = 0

    const contextHeader = path.length > 0 ? `// ${path.join('.')}\n` : ''
@@ -194,7 +199,7 @@ export class JsonYamlChunker {
  /**
   * Chunk an object intelligently
   */
  private chunkObject(obj: Record<string, any>, path: string[]): Chunk[] {
  private chunkObject(obj: JsonObject, path: string[]): Chunk[] {
    const chunks: Chunk[] = []
    const entries = Object.entries(obj)

@@ -213,7 +218,7 @@ export class JsonYamlChunker {
      return chunks
    }

    let currentObj: Record<string, any> = {}
    let currentObj: JsonObject = {}
    let currentTokens = 0
    let currentKeys: string[] = []


@@ -110,10 +110,12 @@ export class TextChunker {
        chunks.push(currentChunk.trim())
      }

      // Start new chunk with current part
      // If part itself is too large, split it further
      if (this.estimateTokens(part) > this.chunkSize) {
        chunks.push(...(await this.splitRecursively(part, separatorIndex + 1)))
        const subChunks = await this.splitRecursively(part, separatorIndex + 1)
        for (const subChunk of subChunks) {
          chunks.push(subChunk)
        }
        currentChunk = ''
      } else {
        currentChunk = part

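The recursive split strategy in the TextChunker hunk above can be sketched as a pure function. This is a simplified illustration, not the production class: the separator list, the `budget` parameter, and the characters-divided-by-four token estimate are assumptions standing in for the real `estimateTokens` and `chunkSize`.

```typescript
// Minimal sketch of recursive splitting: try separators in order; any part
// still over the token budget is split again with the next separator.
function splitRecursively(text: string, separators: string[], budget: number): string[] {
  // Crude token estimate: roughly 4 characters per token (assumption).
  const estimate = (s: string) => Math.ceil(s.length / 4)
  if (estimate(text) <= budget) return [text]

  const [sep, ...rest] = separators
  if (sep === undefined) return [text] // nothing left to split on

  const out: string[] = []
  for (const part of text.split(sep)) {
    if (part === '') continue
    if (estimate(part) > budget && rest.length > 0) {
      // Part is still too large: recurse with the next separator.
      out.push(...splitRecursively(part, rest, budget))
    } else {
      out.push(part)
    }
  }
  return out
}
```

The change in the diff replaces a spread-push of the recursive result with an explicit loop; the splitting logic itself is unchanged.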
@@ -178,6 +178,7 @@ export const env = createEnv({
  KB_CONFIG_BATCH_SIZE: z.number().optional().default(2000), // Chunks to process per embedding batch
  KB_CONFIG_DELAY_BETWEEN_BATCHES: z.number().optional().default(0), // Delay between batches in ms (0 for max speed)
  KB_CONFIG_DELAY_BETWEEN_DOCUMENTS: z.number().optional().default(50), // Delay between documents in ms
  KB_CONFIG_CHUNK_CONCURRENCY: z.number().optional().default(10), // Concurrent PDF chunk OCR processing

  // Real-time Communication
  SOCKET_SERVER_URL: z.string().url().optional(), // WebSocket server URL for real-time features

@@ -61,54 +61,6 @@ export function getRedisClient(): Redis | null {
  }
}

/**
 * Check if Redis is ready for commands.
 * Use for health checks only - commands should be sent regardless (ioredis queues them).
 */
export function isRedisConnected(): boolean {
  return globalRedisClient?.status === 'ready'
}

/**
 * Get Redis connection status for diagnostics.
 */
export function getRedisStatus(): string {
  return globalRedisClient?.status ?? 'not initialized'
}

const MESSAGE_ID_PREFIX = 'processed:'
const MESSAGE_ID_EXPIRY = 60 * 60 * 24 * 7

/**
 * Check if a message has been processed (for idempotency).
 * Requires Redis - throws if Redis is not available.
 */
export async function hasProcessedMessage(key: string): Promise<boolean> {
  const redis = getRedisClient()
  if (!redis) {
    throw new Error('Redis not available for message deduplication')
  }

  const result = await redis.exists(`${MESSAGE_ID_PREFIX}${key}`)
  return result === 1
}

/**
 * Mark a message as processed (for idempotency).
 * Requires Redis - throws if Redis is not available.
 */
export async function markMessageAsProcessed(
  key: string,
  expirySeconds: number = MESSAGE_ID_EXPIRY
): Promise<void> {
  const redis = getRedisClient()
  if (!redis) {
    throw new Error('Redis not available for message deduplication')
  }

  await redis.set(`${MESSAGE_ID_PREFIX}${key}`, '1', 'EX', expirySeconds)
}

/**
 * Lua script for safe lock release.
 * Only deletes the key if the value matches (ownership verification).
@@ -125,7 +77,10 @@ end
/**
 * Acquire a distributed lock using Redis SET NX.
 * Returns true if lock acquired, false if already held.
 * Requires Redis - throws if Redis is not available.
 *
 * When Redis is not available, returns true (lock "acquired") to allow
 * single-replica deployments to function without Redis. In multi-replica
 * deployments without Redis, the idempotency layer prevents duplicate processing.
 */
export async function acquireLock(
  lockKey: string,
@@ -134,36 +89,24 @@ export async function acquireLock(
): Promise<boolean> {
  const redis = getRedisClient()
  if (!redis) {
    throw new Error('Redis not available for distributed locking')
    return true // No-op when Redis unavailable; idempotency layer handles duplicates
  }

  const result = await redis.set(lockKey, value, 'EX', expirySeconds, 'NX')
  return result === 'OK'
}

/**
 * Get the value of a lock key.
 * Requires Redis - throws if Redis is not available.
 */
export async function getLockValue(key: string): Promise<string | null> {
  const redis = getRedisClient()
  if (!redis) {
    throw new Error('Redis not available')
  }

  return redis.get(key)
}

/**
 * Release a distributed lock safely.
 * Only releases if the caller owns the lock (value matches).
 * Returns true if lock was released, false if not owned or already expired.
 * Requires Redis - throws if Redis is not available.
 *
 * When Redis is not available, returns true (no-op) since no lock was held.
 */
export async function releaseLock(lockKey: string, value: string): Promise<boolean> {
  const redis = getRedisClient()
  if (!redis) {
    throw new Error('Redis not available for distributed locking')
    return true // No-op when Redis unavailable; no lock was actually held
  }

  const result = await redis.eval(RELEASE_LOCK_SCRIPT, 1, lockKey, value)

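The ownership-verified release that the Lua script implements (delete only if the stored token matches the caller's) can be modeled against an in-memory store to make the semantics concrete. This is a sketch of the behavior, not the production Redis code; `FakeLockStore` is a hypothetical name and ignores expiry.

```typescript
// In-memory model of SET NX acquire + compare-and-delete release, mirroring
// the semantics of Redis `SET key value NX` and the safe-release Lua script.
class FakeLockStore {
  private store = new Map<string, string>()

  // Like SET key token NX: succeeds only if the key is absent.
  acquire(key: string, token: string): boolean {
    if (this.store.has(key)) return false
    this.store.set(key, token)
    return true
  }

  // Like the Lua script: GET, compare tokens, DEL only on ownership match.
  // Prevents a client whose lock already expired from deleting a lock that
  // another client has since acquired.
  release(key: string, token: string): boolean {
    if (this.store.get(key) !== token) return false
    this.store.delete(key)
    return true
  }
}
```

The compare step is why release must be a single Lua script in real Redis: a separate GET followed by DEL would race with expiry and re-acquisition.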
@@ -17,8 +17,6 @@ export class DocParser implements FileParser {
        throw new Error(`File not found: ${filePath}`)
      }

      logger.info(`Parsing DOC file: ${filePath}`)

      const buffer = await readFile(filePath)
      return this.parseBuffer(buffer)
    } catch (error) {
@@ -29,53 +27,80 @@ export class DocParser implements FileParser {

  async parseBuffer(buffer: Buffer): Promise<FileParseResult> {
    try {
      logger.info('Parsing DOC buffer, size:', buffer.length)

      if (!buffer || buffer.length === 0) {
        throw new Error('Empty buffer provided')
      }

      let parseOfficeAsync
      try {
        const officeParser = await import('officeparser')
        parseOfficeAsync = officeParser.parseOfficeAsync
      } catch (importError) {
        logger.warn('officeparser not available, using fallback extraction')
        return this.fallbackExtraction(buffer)
        const result = await officeParser.parseOfficeAsync(buffer)

        if (result) {
          const resultString = typeof result === 'string' ? result : String(result)
          const content = sanitizeTextForUTF8(resultString.trim())

          if (content.length > 0) {
            return {
              content,
              metadata: {
                characterCount: content.length,
                extractionMethod: 'officeparser',
              },
            }
          }
        }
      } catch (officeError) {
        logger.warn('officeparser failed, trying mammoth:', officeError)
      }

      try {
        const result = await parseOfficeAsync(buffer)
        const mammoth = await import('mammoth')
        const result = await mammoth.extractRawText({ buffer })

        if (!result) {
          throw new Error('officeparser returned no result')
        if (result.value && result.value.trim().length > 0) {
          const content = sanitizeTextForUTF8(result.value.trim())
          return {
            content,
            metadata: {
              characterCount: content.length,
              extractionMethod: 'mammoth',
              messages: result.messages,
            },
          }
        }

        const resultString = typeof result === 'string' ? result : String(result)

        const content = sanitizeTextForUTF8(resultString.trim())

        logger.info('DOC parsing completed successfully with officeparser')

        return {
          content: content,
          metadata: {
            characterCount: content.length,
            extractionMethod: 'officeparser',
          },
        }
      } catch (extractError) {
        logger.warn('officeparser failed, using fallback:', extractError)
        return this.fallbackExtraction(buffer)
      } catch (mammothError) {
        logger.warn('mammoth failed:', mammothError)
      }

      return this.fallbackExtraction(buffer)
    } catch (error) {
      logger.error('DOC buffer parsing error:', error)
      logger.error('DOC parsing error:', error)
      throw new Error(`Failed to parse DOC buffer: ${(error as Error).message}`)
    }
  }

  private fallbackExtraction(buffer: Buffer): FileParseResult {
    logger.info('Using fallback text extraction for DOC file')
    const isBinaryDoc = buffer.length >= 2 && buffer[0] === 0xd0 && buffer[1] === 0xcf

    if (!isBinaryDoc) {
      const textContent = buffer.toString('utf8').trim()

      if (textContent.length > 0) {
        const printableChars = textContent.match(/[\x20-\x7E\n\r\t]/g)?.length || 0
        const isProbablyText = printableChars / textContent.length > 0.9

        if (isProbablyText) {
          return {
            content: sanitizeTextForUTF8(textContent),
            metadata: {
              extractionMethod: 'plaintext-fallback',
              characterCount: textContent.length,
              warning: 'File is not a valid DOC format, extracted as plain text',
            },
          }
        }
      }
    }

    const text = buffer.toString('utf8', 0, Math.min(buffer.length, 100000))


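The plain-text heuristic used by the fallback path above is small enough to isolate and check. This standalone helper (a hypothetical name; the production code inlines the logic) applies the same rule: treat the content as text when more than 90% of its characters are printable ASCII or common whitespace.

```typescript
// Mirrors the fallback check: count printable-ASCII/whitespace characters
// and require a >0.9 ratio before accepting the buffer as plain text.
function looksLikePlainText(text: string): boolean {
  if (text.length === 0) return false
  const printable = text.match(/[\x20-\x7E\n\r\t]/g)?.length ?? 0
  return printable / text.length > 0.9
}
```

The 0.9 threshold means a buffer that is mostly control bytes (a genuine binary `.doc` body, for example) is rejected and falls through to the raw-extraction path instead.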
@@ -2,13 +2,18 @@ import { readFile } from 'fs/promises'
import { createLogger } from '@sim/logger'
import mammoth from 'mammoth'
import type { FileParseResult, FileParser } from '@/lib/file-parsers/types'
import { sanitizeTextForUTF8 } from '@/lib/file-parsers/utils'

const logger = createLogger('DocxParser')

// Define interface for mammoth result
interface MammothMessage {
  type: 'warning' | 'error'
  message: string
}

interface MammothResult {
  value: string
  messages: any[]
  messages: MammothMessage[]
}

export class DocxParser implements FileParser {
@@ -19,7 +24,6 @@ export class DocxParser implements FileParser {
      }

      const buffer = await readFile(filePath)

      return this.parseBuffer(buffer)
    } catch (error) {
      logger.error('DOCX file error:', error)
@@ -29,26 +33,74 @@ export class DocxParser implements FileParser {

  async parseBuffer(buffer: Buffer): Promise<FileParseResult> {
    try {
      logger.info('Parsing buffer, size:', buffer.length)
      if (!buffer || buffer.length === 0) {
        throw new Error('Empty buffer provided')
      }

      const result = await mammoth.extractRawText({ buffer })

      let htmlResult: MammothResult = { value: '', messages: [] }
      try {
        htmlResult = await mammoth.convertToHtml({ buffer })
      } catch (htmlError) {
        logger.warn('HTML conversion warning:', htmlError)
        const result = await mammoth.extractRawText({ buffer })

        if (result.value && result.value.trim().length > 0) {
          let htmlResult: MammothResult = { value: '', messages: [] }
          try {
            htmlResult = await mammoth.convertToHtml({ buffer })
          } catch {
            // HTML conversion is optional
          }

          return {
            content: sanitizeTextForUTF8(result.value),
            metadata: {
              extractionMethod: 'mammoth',
              messages: [...result.messages, ...htmlResult.messages],
              html: htmlResult.value,
            },
          }
        }
      } catch (mammothError) {
        logger.warn('mammoth failed, trying officeparser:', mammothError)
      }

      return {
        content: result.value,
        metadata: {
          messages: [...result.messages, ...htmlResult.messages],
          html: htmlResult.value,
        },
      try {
        const officeParser = await import('officeparser')
        const result = await officeParser.parseOfficeAsync(buffer)

        if (result) {
          const resultString = typeof result === 'string' ? result : String(result)
          const content = sanitizeTextForUTF8(resultString.trim())

          if (content.length > 0) {
            return {
              content,
              metadata: {
                extractionMethod: 'officeparser',
                characterCount: content.length,
              },
            }
          }
        }
      } catch (officeError) {
        logger.warn('officeparser failed:', officeError)
      }

      const isZipFile = buffer.length >= 2 && buffer[0] === 0x50 && buffer[1] === 0x4b
      if (!isZipFile) {
        const textContent = buffer.toString('utf8').trim()
        if (textContent.length > 0) {
          return {
            content: sanitizeTextForUTF8(textContent),
            metadata: {
              extractionMethod: 'plaintext-fallback',
              characterCount: textContent.length,
              warning: 'File is not a valid DOCX format, extracted as plain text',
            },
          }
        }
      }

      throw new Error('Failed to extract text from DOCX file')
    } catch (error) {
      logger.error('DOCX buffer parsing error:', error)
      logger.error('DOCX parsing error:', error)
      throw new Error(`Failed to parse DOCX buffer: ${(error as Error).message}`)
    }
  }

@@ -1,6 +1,22 @@
export interface FileParseMetadata {
  characterCount?: number
  pageCount?: number
  extractionMethod?: string
  warning?: string
  messages?: unknown[]
  html?: string
  type?: string
  headers?: string[]
  totalRows?: number
  rowCount?: number
  sheetNames?: string[]
  source?: string
  [key: string]: unknown
}

export interface FileParseResult {
  content: string
  metadata?: Record<string, any>
  metadata?: FileParseMetadata
}

export interface FileParser {

@@ -1,8 +1,10 @@
import { createLogger } from '@sim/logger'
import { PDFDocument } from 'pdf-lib'
import { getBYOKKey } from '@/lib/api-key/byok'
import { type Chunk, JsonYamlChunker, StructuredDataChunker, TextChunker } from '@/lib/chunkers'
import { env } from '@/lib/core/config/env'
import { parseBuffer, parseFile } from '@/lib/file-parsers'
import type { FileParseMetadata } from '@/lib/file-parsers/types'
import { retryWithExponentialBackoff } from '@/lib/knowledge/documents/utils'
import { StorageService } from '@/lib/uploads'
import { downloadFileFromUrl } from '@/lib/uploads/utils/file-utils.server'
@@ -15,6 +17,8 @@ const TIMEOUTS = {
  MISTRAL_OCR_API: 120000,
} as const

const MAX_CONCURRENT_CHUNKS = env.KB_CONFIG_CHUNK_CONCURRENCY

type OCRResult = {
  success: boolean
  error?: string
@@ -36,6 +40,61 @@ type OCRRequestBody = {
  include_image_base64: boolean
}

const MISTRAL_MAX_PAGES = 1000

/**
 * Get page count from a PDF buffer using unpdf
 */
async function getPdfPageCount(buffer: Buffer): Promise<number> {
  try {
    const { getDocumentProxy } = await import('unpdf')
    const uint8Array = new Uint8Array(buffer)
    const pdf = await getDocumentProxy(uint8Array)
    return pdf.numPages
  } catch (error) {
    logger.warn('Failed to get PDF page count:', error)
    return 0
  }
}

/**
 * Split a PDF buffer into multiple smaller PDFs
 * Returns an array of PDF buffers, each with at most maxPages pages
 */
async function splitPdfIntoChunks(
  pdfBuffer: Buffer,
  maxPages: number
): Promise<{ buffer: Buffer; startPage: number; endPage: number }[]> {
  const sourcePdf = await PDFDocument.load(pdfBuffer)
  const totalPages = sourcePdf.getPageCount()

  if (totalPages <= maxPages) {
    return [{ buffer: pdfBuffer, startPage: 0, endPage: totalPages - 1 }]
  }

  const chunks: { buffer: Buffer; startPage: number; endPage: number }[] = []

  for (let startPage = 0; startPage < totalPages; startPage += maxPages) {
    const endPage = Math.min(startPage + maxPages - 1, totalPages - 1)
    const pageCount = endPage - startPage + 1

    const newPdf = await PDFDocument.create()
    const pageIndices = Array.from({ length: pageCount }, (_, i) => startPage + i)
    const copiedPages = await newPdf.copyPages(sourcePdf, pageIndices)

    copiedPages.forEach((page) => newPdf.addPage(page))

    const pdfBytes = await newPdf.save()
    chunks.push({
      buffer: Buffer.from(pdfBytes),
      startPage,
      endPage,
    })
  }

  return chunks
}

type AzureOCRResponse = {
|
||||
pages?: OCRPage[]
|
||||
[key: string]: unknown
|
||||
@@ -81,7 +140,7 @@ export async function processDocument(
   const cloudUrl = 'cloudUrl' in parseResult ? parseResult.cloudUrl : undefined

   let chunks: Chunk[]
-  const metadata = 'metadata' in parseResult ? parseResult.metadata : {}
+  const metadata: FileParseMetadata = parseResult.metadata ?? {}

   const isJsonYaml =
     metadata.type === 'json' ||

@@ -97,10 +156,11 @@ export async function processDocument(
     })
   } else if (StructuredDataChunker.isStructuredData(content, mimeType)) {
     logger.info('Using structured data chunker for spreadsheet/CSV content')
+    const rowCount = metadata.totalRows ?? metadata.rowCount
     chunks = await StructuredDataChunker.chunkStructuredData(content, {
       chunkSize,
       headers: metadata.headers,
-      totalRows: metadata.totalRows || metadata.rowCount,
+      totalRows: typeof rowCount === 'number' ? rowCount : undefined,
       sheetName: metadata.sheetNames?.[0],
     })
   } else {
@@ -153,7 +213,7 @@ async function parseDocument(
   content: string
   processingMethod: 'file-parser' | 'mistral-ocr'
   cloudUrl?: string
-  metadata?: any
+  metadata?: FileParseMetadata
 }> {
   const isPDF = mimeType === 'application/pdf'
   const hasAzureMistralOCR =

@@ -165,7 +225,7 @@ async function parseDocument(
   if (isPDF && (hasAzureMistralOCR || hasMistralOCR)) {
     if (hasAzureMistralOCR) {
       logger.info(`Using Azure Mistral OCR: $(unknown)`)
-      return parseWithAzureMistralOCR(fileUrl, filename, mimeType, userId, workspaceId)
+      return parseWithAzureMistralOCR(fileUrl, filename, mimeType)
     }

     if (hasMistralOCR) {
@@ -188,13 +248,32 @@ async function handleFileForOCR(
   const isExternalHttps = fileUrl.startsWith('https://') && !fileUrl.includes('/api/files/serve/')

   if (isExternalHttps) {
-    return { httpsUrl: fileUrl }
+    if (mimeType === 'application/pdf') {
+      logger.info(`handleFileForOCR: Downloading external PDF to check page count`)
+      try {
+        const buffer = await downloadFileWithTimeout(fileUrl)
+        logger.info(`handleFileForOCR: Downloaded external PDF: ${buffer.length} bytes`)
+        return { httpsUrl: fileUrl, buffer }
+      } catch (error) {
+        logger.warn(
+          `handleFileForOCR: Failed to download external PDF for page count check, proceeding without batching`,
+          {
+            error: error instanceof Error ? error.message : String(error),
+          }
+        )
+        return { httpsUrl: fileUrl, buffer: undefined }
+      }
+    }
+    logger.info(`handleFileForOCR: Using external URL directly`)
+    return { httpsUrl: fileUrl, buffer: undefined }
   }

   logger.info(`Uploading "$(unknown)" to cloud storage for OCR`)

   const buffer = await downloadFileWithTimeout(fileUrl)

   logger.info(`Downloaded $(unknown): ${buffer.length} bytes`)

   try {
     const metadata: Record<string, string> = {
       originalName: filename,

@@ -224,8 +303,7 @@ async function handleFileForOCR(
       900 // 15 minutes
     )

-    logger.info(`Successfully uploaded for OCR: ${cloudResult.key}`)
-    return { httpsUrl, cloudUrl: httpsUrl }
+    return { httpsUrl, cloudUrl: httpsUrl, buffer }
   } catch (uploadError) {
     const message = uploadError instanceof Error ? uploadError.message : 'Unknown error'
     throw new Error(`Cloud upload failed: ${message}. Cloud upload is required for OCR.`)
@@ -321,13 +399,7 @@
   }
 }

-async function parseWithAzureMistralOCR(
-  fileUrl: string,
-  filename: string,
-  mimeType: string,
-  userId?: string,
-  workspaceId?: string | null
-) {
+async function parseWithAzureMistralOCR(fileUrl: string, filename: string, mimeType: string) {
   validateOCRConfig(
     env.OCR_AZURE_API_KEY,
     env.OCR_AZURE_ENDPOINT,

@@ -336,6 +408,19 @@ async function parseWithAzureMistralOCR(
   )

   const fileBuffer = await downloadFileForBase64(fileUrl)

+  if (mimeType === 'application/pdf') {
+    const pageCount = await getPdfPageCount(fileBuffer)
+    if (pageCount > MISTRAL_MAX_PAGES) {
+      logger.info(
+        `PDF has ${pageCount} pages, exceeds Azure OCR limit of ${MISTRAL_MAX_PAGES}. ` +
+          `Falling back to file parser.`
+      )
+      return parseWithFileParser(fileUrl, filename, mimeType)
+    }
+    logger.info(`Azure Mistral OCR: PDF page count for $(unknown): ${pageCount}`)
+  }
+
   const base64Data = fileBuffer.toString('base64')
   const dataUri = `data:${mimeType};base64,${base64Data}`

@@ -374,17 +459,7 @@ async function parseWithAzureMistralOCR(
     message: error instanceof Error ? error.message : String(error),
   })

-  const fallbackMistralKey = await getMistralApiKey(workspaceId)
-  if (fallbackMistralKey) {
-    return parseWithMistralOCR(
-      fileUrl,
-      filename,
-      mimeType,
-      userId,
-      workspaceId,
-      fallbackMistralKey
-    )
-  }
   logger.info(`Falling back to file parser: $(unknown)`)
   return parseWithFileParser(fileUrl, filename, mimeType)
  }
 }
@@ -406,50 +481,35 @@ async function parseWithMistralOCR(
     throw new Error('Mistral parser tool not configured')
   }

-  const { httpsUrl, cloudUrl } = await handleFileForOCR(
+  const { httpsUrl, cloudUrl, buffer } = await handleFileForOCR(
     fileUrl,
     filename,
     mimeType,
     userId,
     workspaceId
   )

   logger.info(`Mistral OCR: Using presigned URL for $(unknown): ${httpsUrl.substring(0, 120)}...`)

+  let pageCount = 0
+  if (mimeType === 'application/pdf' && buffer) {
+    pageCount = await getPdfPageCount(buffer)
+    logger.info(`PDF page count for $(unknown): ${pageCount}`)
+  }
+
+  const needsBatching = pageCount > MISTRAL_MAX_PAGES
+
+  if (needsBatching && buffer) {
+    logger.info(
+      `PDF has ${pageCount} pages, exceeds limit of ${MISTRAL_MAX_PAGES}. Splitting and processing in chunks.`
+    )
+    return processMistralOCRInBatches(filename, apiKey, buffer, userId, cloudUrl)
+  }
+
   const params = { filePath: httpsUrl, apiKey, resultType: 'text' as const }

   try {
-    const response = await retryWithExponentialBackoff(
-      async () => {
-        let url =
-          typeof mistralParserTool.request!.url === 'function'
-            ? mistralParserTool.request!.url(params)
-            : mistralParserTool.request!.url
-
-        const isInternalRoute = url.startsWith('/')
-
-        if (isInternalRoute) {
-          const { getBaseUrl } = await import('@/lib/core/utils/urls')
-          url = `${getBaseUrl()}${url}`
-        }
-
-        let headers =
-          typeof mistralParserTool.request!.headers === 'function'
-            ? mistralParserTool.request!.headers(params)
-            : mistralParserTool.request!.headers
-
-        if (isInternalRoute) {
-          const { generateInternalToken } = await import('@/lib/auth/internal')
-          const internalToken = await generateInternalToken(userId)
-          headers = {
-            ...headers,
-            Authorization: `Bearer ${internalToken}`,
-          }
-        }
-
-        const requestBody = mistralParserTool.request!.body!(params) as OCRRequestBody
-        return makeOCRRequest(url, headers as Record<string, string>, requestBody)
-      },
-      { maxRetries: 3, initialDelayMs: 1000, maxDelayMs: 10000 }
-    )
-
+    const response = await executeMistralOCRRequest(params, userId)
     const result = (await mistralParserTool.transformResponse!(response, params)) as OCRResult
     const content = processOCRContent(result, filename)
@@ -464,10 +524,204 @@ async function parseWithMistralOCR(
   }
 }

+async function executeMistralOCRRequest(
+  params: { filePath: string; apiKey: string; resultType: 'text' },
+  userId?: string
+): Promise<Response> {
+  return retryWithExponentialBackoff(
+    async () => {
+      let url =
+        typeof mistralParserTool.request!.url === 'function'
+          ? mistralParserTool.request!.url(params)
+          : mistralParserTool.request!.url
+
+      const isInternalRoute = url.startsWith('/')
+
+      if (isInternalRoute) {
+        const { getBaseUrl } = await import('@/lib/core/utils/urls')
+        url = `${getBaseUrl()}${url}`
+      }
+
+      let headers =
+        typeof mistralParserTool.request!.headers === 'function'
+          ? mistralParserTool.request!.headers(params)
+          : mistralParserTool.request!.headers
+
+      if (isInternalRoute) {
+        const { generateInternalToken } = await import('@/lib/auth/internal')
+        const internalToken = await generateInternalToken(userId)
+        headers = {
+          ...headers,
+          Authorization: `Bearer ${internalToken}`,
+        }
+      }
+
+      const requestBody = mistralParserTool.request!.body!(params) as OCRRequestBody
+      return makeOCRRequest(url, headers as Record<string, string>, requestBody)
+    },
+    { maxRetries: 3, initialDelayMs: 1000, maxDelayMs: 10000 }
+  )
+}
+
+/**
+ * Process a single PDF chunk: upload to S3, OCR, cleanup
+ */
+async function processChunk(
+  chunk: { buffer: Buffer; startPage: number; endPage: number },
+  chunkIndex: number,
+  totalChunks: number,
+  filename: string,
+  apiKey: string,
+  userId?: string
+): Promise<{ index: number; content: string | null }> {
+  const chunkPageCount = chunk.endPage - chunk.startPage + 1
+
+  logger.info(
+    `Processing chunk ${chunkIndex + 1}/${totalChunks} (pages ${chunk.startPage + 1}-${chunk.endPage + 1}, ${chunkPageCount} pages)`
+  )
+
+  let uploadedKey: string | null = null
+
+  try {
+    // Upload the chunk to S3
+    const timestamp = Date.now()
+    const uniqueId = Math.random().toString(36).substring(2, 9)
+    const safeFileName = filename.replace(/[^a-zA-Z0-9.-]/g, '_')
+    const chunkKey = `kb/${timestamp}-${uniqueId}-chunk${chunkIndex + 1}-${safeFileName}`
+
+    const metadata: Record<string, string> = {
+      originalName: `$(unknown)_chunk${chunkIndex + 1}`,
+      uploadedAt: new Date().toISOString(),
+      purpose: 'knowledge-base',
+      ...(userId && { userId }),
+    }
+
+    const uploadResult = await StorageService.uploadFile({
+      file: chunk.buffer,
+      fileName: `$(unknown)_chunk${chunkIndex + 1}`,
+      contentType: 'application/pdf',
+      context: 'knowledge-base',
+      customKey: chunkKey,
+      metadata,
+    })
+
+    uploadedKey = uploadResult.key
+
+    const chunkUrl = await StorageService.generatePresignedDownloadUrl(
+      uploadResult.key,
+      'knowledge-base',
+      900 // 15 minutes
+    )
+
+    logger.info(`Uploaded chunk ${chunkIndex + 1} to S3: ${chunkKey}`)
+
+    // Process the chunk with Mistral OCR
+    const params = {
+      filePath: chunkUrl,
+      apiKey,
+      resultType: 'text' as const,
+    }
+
+    const response = await executeMistralOCRRequest(params, userId)
+    const result = (await mistralParserTool.transformResponse!(response, params)) as OCRResult
+
+    if (result.success && result.output?.content) {
+      logger.info(`Chunk ${chunkIndex + 1}/${totalChunks} completed successfully`)
+      return { index: chunkIndex, content: result.output.content }
+    }
+    logger.warn(`Chunk ${chunkIndex + 1}/${totalChunks} returned no content`)
+    return { index: chunkIndex, content: null }
+  } catch (error) {
+    logger.error(`Chunk ${chunkIndex + 1}/${totalChunks} failed:`, {
+      message: error instanceof Error ? error.message : String(error),
+    })
+    return { index: chunkIndex, content: null }
+  } finally {
+    // Clean up the chunk file from S3 after processing
+    if (uploadedKey) {
+      try {
+        await StorageService.deleteFile({ key: uploadedKey, context: 'knowledge-base' })
+        logger.info(`Cleaned up chunk ${chunkIndex + 1} from S3`)
+      } catch (deleteError) {
+        logger.warn(`Failed to clean up chunk ${chunkIndex + 1} from S3:`, {
+          message: deleteError instanceof Error ? deleteError.message : String(deleteError),
+        })
+      }
+    }
+  }
+}
+
+async function processMistralOCRInBatches(
+  filename: string,
+  apiKey: string,
+  pdfBuffer: Buffer,
+  userId?: string,
+  cloudUrl?: string
+): Promise<{
+  content: string
+  processingMethod: 'mistral-ocr'
+  cloudUrl?: string
+}> {
+  const totalPages = await getPdfPageCount(pdfBuffer)
+  logger.info(
+    `Splitting $(unknown) (${totalPages} pages) into chunks of ${MISTRAL_MAX_PAGES} pages`
+  )
+
+  const pdfChunks = await splitPdfIntoChunks(pdfBuffer, MISTRAL_MAX_PAGES)
+  logger.info(
+    `Split into ${pdfChunks.length} chunks, processing with concurrency ${MAX_CONCURRENT_CHUNKS}`
+  )
+
+  // Process chunks concurrently with limited concurrency
+  const results: { index: number; content: string | null }[] = []
+
+  for (let i = 0; i < pdfChunks.length; i += MAX_CONCURRENT_CHUNKS) {
+    const batch = pdfChunks.slice(i, i + MAX_CONCURRENT_CHUNKS)
+    const batchPromises = batch.map((chunk, batchIndex) =>
+      processChunk(chunk, i + batchIndex, pdfChunks.length, filename, apiKey, userId)
+    )
+
+    const batchResults = await Promise.all(batchPromises)
+    for (const result of batchResults) {
+      results.push(result)
+    }
+
+    logger.info(
+      `Completed batch ${Math.floor(i / MAX_CONCURRENT_CHUNKS) + 1}/${Math.ceil(pdfChunks.length / MAX_CONCURRENT_CHUNKS)}`
+    )
+  }
+
+  // Sort by index to maintain page order and filter out nulls
+  const sortedResults = results
+    .sort((a, b) => a.index - b.index)
+    .filter((r) => r.content !== null)
+    .map((r) => r.content as string)
+
+  if (sortedResults.length === 0) {
+    // Don't fall back to file parser for large PDFs - it produces poor results
+    // Better to fail clearly than return low-quality extraction
+    throw new Error(
+      `OCR failed for all ${pdfChunks.length} chunks of $(unknown). ` +
+        `Large PDFs require OCR - file parser fallback would produce poor results.`
+    )
+  }
+
+  const combinedContent = sortedResults.join('\n\n')
+  logger.info(
+    `Successfully processed ${sortedResults.length}/${pdfChunks.length} chunks for $(unknown)`
+  )
+
+  return {
+    content: combinedContent,
+    processingMethod: 'mistral-ocr',
+    cloudUrl,
+  }
+}
+
 async function parseWithFileParser(fileUrl: string, filename: string, mimeType: string) {
   try {
     let content: string
-    let metadata: any = {}
+    let metadata: FileParseMetadata = {}

     if (fileUrl.startsWith('data:')) {
       content = await parseDataURI(fileUrl, filename, mimeType)
@@ -513,7 +767,7 @@ async function parseDataURI(fileUrl: string, filename: string, mimeType: string)
 async function parseHttpFile(
   fileUrl: string,
   filename: string
-): Promise<{ content: string; metadata?: any }> {
+): Promise<{ content: string; metadata?: FileParseMetadata }> {
   const buffer = await downloadFileWithTimeout(fileUrl)

   const extension = filename.split('.').pop()?.toLowerCase()
@@ -212,7 +212,6 @@ export async function processDocumentTags(
     return result
   }

-  // Fetch existing tag definitions
   const existingDefinitions = await db
     .select()
     .from(knowledgeBaseTagDefinitions)

@@ -220,18 +219,15 @@ export async function processDocumentTags(
   const existingByName = new Map(existingDefinitions.map((def) => [def.displayName, def]))

-  // First pass: collect all validation errors
   const undefinedTags: string[] = []
   const typeErrors: string[] = []

   for (const tag of tagData) {
-    // Skip if no tag name
     if (!tag.tagName?.trim()) continue

     const tagName = tag.tagName.trim()
     const fieldType = tag.fieldType || 'text'

-    // For boolean, check if value is defined; for others, check if value is non-empty
     const hasValue =
       fieldType === 'boolean'
         ? tag.value !== undefined && tag.value !== null && tag.value !== ''

@@ -239,14 +235,12 @@ export async function processDocumentTags(
     if (!hasValue) continue

-    // Check if tag exists
     const existingDef = existingByName.get(tagName)
     if (!existingDef) {
       undefinedTags.push(tagName)
       continue
     }

-    // Validate value type using shared validation
     const rawValue = typeof tag.value === 'string' ? tag.value.trim() : tag.value
     const actualFieldType = existingDef.fieldType || fieldType
     const validationError = validateTagValue(tagName, String(rawValue), actualFieldType)

@@ -255,7 +249,6 @@ export async function processDocumentTags(
     }
   }

-  // Throw combined error if there are any validation issues
   if (undefinedTags.length > 0 || typeErrors.length > 0) {
     const errorParts: string[] = []

@@ -270,7 +263,6 @@ export async function processDocumentTags(
     throw new Error(errorParts.join('\n'))
   }

-  // Second pass: process valid tags
   for (const tag of tagData) {
     if (!tag.tagName?.trim()) continue

@@ -285,14 +277,13 @@ export async function processDocumentTags(
     if (!hasValue) continue

     const existingDef = existingByName.get(tagName)
-    if (!existingDef) continue // Already validated above
+    if (!existingDef) continue

     const targetSlot = existingDef.tagSlot
     const actualFieldType = existingDef.fieldType || fieldType
     const rawValue = typeof tag.value === 'string' ? tag.value.trim() : tag.value
     const stringValue = String(rawValue).trim()

-    // Assign value to the slot with proper type conversion (values already validated)
     if (actualFieldType === 'boolean') {
       setTagValue(result, targetSlot, parseBooleanValue(stringValue) ?? false)
     } else if (actualFieldType === 'number') {
@@ -440,7 +431,6 @@ export async function processDocumentAsync(
   logger.info(`[${documentId}] Status updated to 'processing', starting document processor`)

-  // Use KB's chunkingConfig as fallback if processingOptions not provided
   const kbConfig = kb[0].chunkingConfig as { maxSize: number; minSize: number; overlap: number }

   await withTimeout(

@@ -469,7 +459,6 @@ export async function processDocumentAsync(
     `[${documentId}] Document parsed successfully, generating embeddings for ${processed.chunks.length} chunks`
   )

-  // Generate embeddings in batches for large documents
   const chunkTexts = processed.chunks.map((chunk) => chunk.text)
   const embeddings: number[][] = []

@@ -485,7 +474,9 @@ export async function processDocumentAsync(
     logger.info(`[${documentId}] Processing embedding batch ${batchNum}/${totalBatches}`)
     const batchEmbeddings = await generateEmbeddings(batch, undefined, kb[0].workspaceId)
-    embeddings.push(...batchEmbeddings)
+    for (const emb of batchEmbeddings) {
+      embeddings.push(emb)
+    }
   }
 }

@@ -562,23 +553,18 @@ export async function processDocumentAsync(
   }))

   await db.transaction(async (tx) => {
-    // Insert embeddings in batches for large documents
     if (embeddingRecords.length > 0) {
-      const batchSize = LARGE_DOC_CONFIG.MAX_CHUNKS_PER_BATCH
-      const totalBatches = Math.ceil(embeddingRecords.length / batchSize)
       await tx.delete(embedding).where(eq(embedding.documentId, documentId))

-      logger.info(
-        `[${documentId}] Inserting ${embeddingRecords.length} embeddings in ${totalBatches} batches`
-      )
-
-      for (let i = 0; i < embeddingRecords.length; i += batchSize) {
-        const batch = embeddingRecords.slice(i, i + batchSize)
-        const batchNum = Math.floor(i / batchSize) + 1
+      const insertBatchSize = LARGE_DOC_CONFIG.MAX_CHUNKS_PER_BATCH
+      const batches: (typeof embeddingRecords)[] = []
+      for (let i = 0; i < embeddingRecords.length; i += insertBatchSize) {
+        batches.push(embeddingRecords.slice(i, i + insertBatchSize))
+      }
+
+      logger.info(`[${documentId}] Inserting ${embeddingRecords.length} embeddings`)
+      for (const batch of batches) {
         await tx.insert(embedding).values(batch)
-        logger.info(
-          `[${documentId}] Inserted batch ${batchNum}/${totalBatches} (${batch.length} records)`
-        )
       }
     }
@@ -689,11 +675,9 @@ export async function createDocumentRecords(
   requestId: string,
   userId?: string
 ): Promise<DocumentData[]> {
-  // Check storage limits before creating documents
   if (userId) {
     const totalSize = documents.reduce((sum, doc) => sum + doc.fileSize, 0)

-    // Get knowledge base owner
     const kb = await db
       .select({ userId: knowledgeBase.userId })
       .from(knowledgeBase)

@@ -713,7 +697,7 @@ export async function createDocumentRecords(
   for (const docData of documents) {
     const documentId = randomUUID()

-    let processedTags: Record<string, any> = {}
+    let processedTags: Partial<ProcessedDocumentTags> = {}

     if (docData.documentTagsData) {
       try {

@@ -722,7 +706,6 @@ export async function createDocumentRecords(
         processedTags = await processDocumentTags(knowledgeBaseId, tagData, requestId)
       }
     } catch (error) {
-      // Re-throw validation errors, only catch JSON parse errors
       if (error instanceof SyntaxError) {
         logger.warn(`[${requestId}] Failed to parse documentTagsData for bulk document:`, error)
       } else {

@@ -791,7 +774,6 @@ export async function createDocumentRecords(
   if (userId) {
     const totalSize = documents.reduce((sum, doc) => sum + doc.fileSize, 0)

-    // Get knowledge base owner
     const kb = await db
       .select({ userId: knowledgeBase.userId })
       .from(knowledgeBase)

@@ -1079,7 +1061,7 @@ export async function createSingleDocument(
   const now = new Date()

   // Process structured tag data if provided
-  let processedTags: Record<string, any> = {
+  let processedTags: ProcessedDocumentTags = {
     // Text tags (7 slots)
     tag1: documentData.tag1 ?? null,
     tag2: documentData.tag2 ?? null,
@@ -1555,23 +1537,30 @@ export async function updateDocument(
     return value || null
   }

+  // Type-safe access to tag slots in updateData
+  type UpdateDataWithTags = typeof updateData & Record<TagSlot, string | undefined>
+  const typedUpdateData = updateData as UpdateDataWithTags
+
   ALL_TAG_SLOTS.forEach((slot: TagSlot) => {
-    const updateValue = (updateData as any)[slot]
+    const updateValue = typedUpdateData[slot]
     if (updateValue !== undefined) {
-      ;(dbUpdateData as any)[slot] = convertTagValue(slot, updateValue)
+      ;(dbUpdateData as Record<TagSlot, string | number | Date | boolean | null>)[slot] =
+        convertTagValue(slot, updateValue)
     }
   })

   await db.transaction(async (tx) => {
     await tx.update(document).set(dbUpdateData).where(eq(document.id, documentId))

-    const hasTagUpdates = ALL_TAG_SLOTS.some((field) => (updateData as any)[field] !== undefined)
+    const hasTagUpdates = ALL_TAG_SLOTS.some((field) => typedUpdateData[field] !== undefined)

     if (hasTagUpdates) {
-      const embeddingUpdateData: Record<string, any> = {}
+      const embeddingUpdateData: Partial<ProcessedDocumentTags> = {}
       ALL_TAG_SLOTS.forEach((field) => {
-        if ((updateData as any)[field] !== undefined) {
-          embeddingUpdateData[field] = convertTagValue(field, (updateData as any)[field])
+        if (typedUpdateData[field] !== undefined) {
+          ;(embeddingUpdateData as Record<TagSlot, string | number | Date | boolean | null>)[
+            field
+          ] = convertTagValue(field, typedUpdateData[field])
         }
       })
@@ -14,7 +14,7 @@ export interface RetryOptions {
   initialDelayMs?: number
   maxDelayMs?: number
   backoffMultiplier?: number
-  retryCondition?: (error: RetryableError) => boolean
+  retryCondition?: (error: unknown) => boolean
 }

 export interface RetryResult<T> {

@@ -30,11 +30,18 @@ function hasStatus(
   return typeof error === 'object' && error !== null && 'status' in error
 }

+function isRetryableErrorType(error: unknown): error is RetryableError {
+  if (!error) return false
+  if (error instanceof Error) return true
+  if (typeof error === 'object' && ('status' in error || 'message' in error)) return true
+  return false
+}
+
 /**
  * Default retry condition for rate limiting errors
  */
-export function isRetryableError(error: RetryableError): boolean {
-  if (!error) return false
+export function isRetryableError(error: unknown): boolean {
+  if (!isRetryableErrorType(error)) return false

   // Check for rate limiting status codes
   if (

@@ -45,7 +52,7 @@ export function isRetryableError(error: RetryableError): boolean {
   }

   // Check for rate limiting in error messages
-  const errorMessage = error.message || error.toString()
+  const errorMessage = error instanceof Error ? error.message : String(error)
   const rateLimitKeywords = [
     'rate limit',
     'rate_limit',