Compare commits

...

9 Commits

Author SHA1 Message Date
aadamgough
1e75f44b65 fixed linear bugs 2026-01-08 18:58:33 -08:00
aadamgough
780b4fe22d added missing params 2026-01-08 18:26:16 -08:00
Vikhyath Mondreti
c2180bf8a0 improvement(enterprise): feature flagging + runtime checks consolidation (#2730)
* improvement(enterprise): enterprise checks code consolidation

* update docs

* revert isHosted check

* add unique index to prevent multiple orgs per user

* address greptile comments

* ui bug
2026-01-08 13:53:22 -08:00
Waleed
fdac4314d2 fix(chat): update stream to respect all output select objects (#2729) 2026-01-08 11:54:07 -08:00
Waleed
a54fcbc094 improvement(auth): added ability to inject secrets to kubernetes, server-side ff to disable email registration (#2728)
* improvement(auth): added ability to inject secrets to kubernetes, server-side ff to disable email registration

* consolidated telemetry events

* comments cleanup

* ack PR comment

* refactor to use createEnvMock helper instead of local mocks
2026-01-08 11:09:35 -08:00
Waleed
05904a73b2 feat(i18n): update translations (#2721)
Co-authored-by: icecrasher321 <icecrasher321@users.noreply.github.com>
2026-01-08 10:30:53 -08:00
Lakshman Patel
1b22d2ce81 fix(devcontainer): use bunx for concurrently command (#2723) 2026-01-07 21:20:29 -08:00
Waleed
26dff7cffe feat(bedrock): added aws bedrock as a model provider (#2722) 2026-01-07 20:08:03 -08:00
Vikhyath Mondreti
020037728d feat(polling-groups): can invite multiple people to have their gmail/outlook inboxes connected to a workflow (#2695)
* progress on cred sets

* fix credential set system

* return data to render credential set in block preview

* progress

* invite flow

* simplify code

* fix ui

* fix tests

* fix types

* fix

* fix icon for outlook

* fix cred set name not showing up for owner

* fix rendering of credential set name

* fix outlook well known folder id resolution

* fix perms for creating cred set

* add to docs and simplify ui

* consolidate webhook code better

* fix tests

* fix credential collab logic issue

* fix ui

* fix lint
2026-01-07 17:49:40 -08:00
165 changed files with 28745 additions and 1442 deletions

View File

@@ -17,7 +17,7 @@ MCP-Server gruppieren Ihre Workflow-Tools zusammen. Erstellen und verwalten Sie
<Video src="mcp/mcp-server.mp4" width={700} height={450} />
</div>
1. Navigieren Sie zu **Einstellungen → MCP-Server**
1. Navigieren Sie zu **Einstellungen → Bereitgestellte MCPs**
2. Klicken Sie auf **Server erstellen**
3. Geben Sie einen Namen und eine optionale Beschreibung ein
4. Kopieren Sie die Server-URL zur Verwendung in Ihren MCP-Clients
@@ -79,7 +79,7 @@ Füge deinen API-Key-Header (`X-API-Key`) für authentifizierten Zugriff hinzu,
## Server-Verwaltung
In der Server-Detailansicht unter **Einstellungen → MCP-Server** kannst du:
In der Server-Detailansicht unter **Einstellungen → Bereitgestellte MCPs** können Sie:
- **Tools anzeigen**: Alle Workflows sehen, die einem Server hinzugefügt wurden
- **URL kopieren**: Die Server-URL für MCP-Clients abrufen

View File

@@ -27,7 +27,7 @@ MCP-Server stellen Sammlungen von Tools bereit, die Ihre Agenten nutzen können.
</div>
1. Navigieren Sie zu Ihren Workspace-Einstellungen
2. Gehen Sie zum Abschnitt **MCP-Server**
2. Gehen Sie zum Abschnitt **Bereitgestellte MCPs**
3. Klicken Sie auf **MCP-Server hinzufügen**
4. Geben Sie die Server-Konfigurationsdetails ein
5. Speichern Sie die Konfiguration

View File

@@ -22,7 +22,7 @@ Verwende den Start-Block für alles, was aus dem Editor, deploy-to-API oder depl
<Cards>
<Card title="Start" href="/triggers/start">
Einheitlicher Einstiegspunkt, der Editor-Ausführungen, API-Bereitstellungen und Chat-Bereitstellungen unterstützt
Einheitlicher Einstiegspunkt, der Editor-Ausführungen, API-Deployments und Chat-Deployments unterstützt
</Card>
<Card title="Webhook" href="/triggers/webhook">
Externe Webhook-Payloads empfangen
@@ -33,6 +33,9 @@ Verwende den Start-Block für alles, was aus dem Editor, deploy-to-API oder depl
<Card title="RSS Feed" href="/triggers/rss">
RSS- und Atom-Feeds auf neue Inhalte überwachen
</Card>
<Card title="Email Polling Groups" href="#email-polling-groups">
Team-Gmail- und Outlook-Postfächer überwachen
</Card>
</Cards>
## Schneller Vergleich
@@ -43,6 +46,7 @@ Verwende den Start-Block für alles, was aus dem Editor, deploy-to-API oder depl
| **Schedule** | Timer, der im Schedule-Block verwaltet wird |
| **Webhook** | Bei eingehender HTTP-Anfrage |
| **RSS Feed** | Neues Element im Feed veröffentlicht |
| **Email Polling Groups** | Neue E-Mail in Team-Gmail- oder Outlook-Postfächern empfangen |
> Der Start-Block stellt immer `input`, `conversationId` und `files` Felder bereit. Füge benutzerdefinierte Felder zum Eingabeformat für zusätzliche strukturierte Daten hinzu.
@@ -65,3 +69,25 @@ Wenn du im Editor auf **Run** klickst, wählt Sim automatisch aus, welcher Trigg
Wenn dein Workflow mehrere Trigger hat, wird der Trigger mit der höchsten Priorität ausgeführt. Wenn du beispielsweise sowohl einen Start-Block als auch einen Webhook-Trigger hast, wird beim Klicken auf Run der Start-Block ausgeführt.
**Externe Auslöser mit Mock-Payloads**: Wenn externe Auslöser (Webhooks und Integrationen) manuell ausgeführt werden, generiert Sim automatisch Mock-Payloads basierend auf der erwarteten Datenstruktur des Auslösers. Dies stellt sicher, dass nachgelagerte Blöcke während des Testens Variablen korrekt auflösen können.
## E-Mail-Polling-Gruppen
Polling-Gruppen ermöglichen es Ihnen, die Gmail- oder Outlook-Postfächer mehrerer Teammitglieder mit einem einzigen Trigger zu überwachen. Erfordert einen Team- oder Enterprise-Plan.
**Erstellen einer Polling-Gruppe** (Admin/Owner)
1. Gehen Sie zu **Einstellungen → E-Mail-Polling**
2. Klicken Sie auf **Erstellen** und wählen Sie Gmail oder Outlook
3. Geben Sie einen Namen für die Gruppe ein
**Mitglieder einladen**
1. Klicken Sie auf **Mitglieder hinzufügen** bei Ihrer Polling-Gruppe
2. Geben Sie E-Mail-Adressen ein (durch Komma oder Zeilenumbruch getrennt oder ziehen Sie eine CSV-Datei per Drag & Drop)
3. Klicken Sie auf **Einladungen senden**
Eingeladene erhalten eine E-Mail mit einem Link, um ihr Konto zu verbinden. Sobald die Verbindung hergestellt ist, wird ihr Postfach automatisch in die Polling-Gruppe aufgenommen. Eingeladene müssen keine Mitglieder Ihrer Sim-Organisation sein.
**Verwendung in einem Workflow**
Wählen Sie beim Konfigurieren eines E-Mail-Triggers Ihre Polling-Gruppe aus dem Dropdown-Menü für Anmeldeinformationen anstelle eines einzelnen Kontos aus. Das System erstellt Webhooks für jedes Mitglied und leitet alle E-Mails durch Ihren Workflow.

View File

@@ -0,0 +1,75 @@
---
title: Enterprise
description: Enterprise features for organizations with advanced security and compliance requirements
---
import { Callout } from 'fumadocs-ui/components/callout'
Sim Studio Enterprise provides advanced features for organizations with enhanced security, compliance, and management requirements.
---
## Bring Your Own Key (BYOK)
Use your own API keys for AI model providers instead of Sim Studio's hosted keys.
### Supported Providers
| Provider | Usage |
|----------|-------|
| OpenAI | Knowledge Base embeddings, Agent block |
| Anthropic | Agent block |
| Google | Agent block |
| Mistral | Knowledge Base OCR |
### Setup
1. Navigate to **Settings** → **BYOK** in your workspace
2. Click **Add Key** for your provider
3. Enter your API key and save
<Callout type="warn">
BYOK keys are encrypted at rest. Only organization admins and owners can manage keys.
</Callout>
When configured, workflows use your key instead of Sim Studio's hosted keys. If removed, workflows automatically fall back to hosted keys.
---
## Single Sign-On (SSO)
Enterprise authentication with SAML 2.0 and OIDC support for centralized identity management.
### Supported Providers
- Okta
- Azure AD / Entra ID
- Google Workspace
- OneLogin
- Any SAML 2.0 or OIDC provider
### Setup
1. Navigate to **Settings** → **SSO** in your workspace
2. Choose your identity provider
3. Configure the connection using your IdP's metadata
4. Enable SSO for your organization
<Callout type="info">
Once SSO is enabled, team members authenticate through your identity provider instead of email/password.
</Callout>
---
## Self-Hosted
For self-hosted deployments, enterprise features can be enabled via environment variables:
| Variable | Description |
|----------|-------------|
| `SSO_ENABLED`, `NEXT_PUBLIC_SSO_ENABLED` | Single Sign-On with SAML/OIDC |
| `CREDENTIAL_SETS_ENABLED`, `NEXT_PUBLIC_CREDENTIAL_SETS_ENABLED` | Polling Groups for email triggers |
<Callout type="warn">
BYOK is only available on hosted Sim Studio. Self-hosted deployments configure AI provider keys directly via environment variables.
</Callout>
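As an illustration, a self-hosted deployment could enable these features by exporting the variables from the table above in its environment (the values here are examples, not documented defaults):

```shell
# Illustrative environment fragment for a self-hosted deployment.
# Variable names come from the table above; "true" values are assumptions.
export SSO_ENABLED=true
export NEXT_PUBLIC_SSO_ENABLED=true
export CREDENTIAL_SETS_ENABLED=true
export NEXT_PUBLIC_CREDENTIAL_SETS_ENABLED=true
```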

View File

@@ -15,6 +15,7 @@
"permissions",
"sdks",
"self-hosting",
"./enterprise/index",
"./keyboard-shortcuts/index"
],
"defaultOpen": false

View File

@@ -33,6 +33,9 @@ Use the Start block for everything originating from the editor, deploy-to-API, o
<Card title="RSS Feed" href="/triggers/rss">
Monitor RSS and Atom feeds for new content
</Card>
<Card title="Email Polling Groups" href="#email-polling-groups">
Monitor team Gmail and Outlook inboxes
</Card>
</Cards>
## Quick Comparison
@@ -43,6 +46,7 @@ Use the Start block for everything originating from the editor, deploy-to-API, o
| **Schedule** | Timer managed in schedule block |
| **Webhook** | On inbound HTTP request |
| **RSS Feed** | New item published to feed |
| **Email Polling Groups** | New email received in team Gmail or Outlook inboxes |
> The Start block always exposes `input`, `conversationId`, and `files` fields. Add custom fields to the input format for additional structured data.
@@ -66,3 +70,24 @@ If your workflow has multiple triggers, the highest priority trigger will be exe
**External triggers with mock payloads**: When external triggers (webhooks and integrations) are executed manually, Sim automatically generates mock payloads based on the trigger's expected data structure. This ensures downstream blocks can resolve variables correctly during testing.
## Email Polling Groups
Polling Groups let you monitor multiple team members' Gmail or Outlook inboxes with a single trigger. Requires a Team or Enterprise plan.
**Creating a Polling Group** (Admin/Owner)
1. Go to **Settings → Email Polling**
2. Click **Create** and choose Gmail or Outlook
3. Enter a name for the group
**Inviting Members**
1. Click **Add Members** on your polling group
2. Enter email addresses (comma or newline separated, or drag & drop a CSV)
3. Click **Send Invites**
Invitees receive an email with a link to connect their account. Once connected, their inbox is automatically included in the polling group. Invitees don't need to be members of your Sim organization.
**Using in a Workflow**
When configuring an email trigger, select your polling group from the credentials dropdown instead of an individual account. The system creates webhooks for each member and routes all emails through your workflow.

View File

@@ -17,7 +17,7 @@ Los servidores MCP agrupan tus herramientas de flujo de trabajo. Créalos y gest
<Video src="mcp/mcp-server.mp4" width={700} height={450} />
</div>
1. Navega a **Configuración → Servidores MCP**
1. Navega a **Configuración → MCP implementados**
2. Haz clic en **Crear servidor**
3. Introduce un nombre y una descripción opcional
4. Copia la URL del servidor para usarla en tus clientes MCP
@@ -79,7 +79,7 @@ Incluye tu encabezado de clave API (`X-API-Key`) para acceso autenticado al usar
## Gestión del servidor
Desde la vista de detalle del servidor en **Configuración → Servidores MCP**, puedes:
Desde la vista de detalles del servidor en **Configuración → MCP implementados**, puedes:
- **Ver herramientas**: consulta todos los flujos de trabajo añadidos a un servidor
- **Copiar URL**: obtén la URL del servidor para clientes MCP

View File

@@ -26,8 +26,8 @@ Los servidores MCP proporcionan colecciones de herramientas que tus agentes pued
<Video src="mcp/settings-mcp-tools.mp4" width={700} height={450} />
</div>
1. Navega a los ajustes de tu espacio de trabajo
2. Ve a la sección **Servidores MCP**
1. Navega a la configuración de tu espacio de trabajo
2. Ve a la sección **MCP implementados**
3. Haz clic en **Añadir servidor MCP**
4. Introduce los detalles de configuración del servidor
5. Guarda la configuración

View File

@@ -22,7 +22,7 @@ Utiliza el bloque Start para todo lo que se origina desde el editor, despliegue
<Cards>
<Card title="Start" href="/triggers/start">
Punto de entrada unificado que admite ejecuciones del editor, despliegues de API y despliegues de chat
Punto de entrada unificado que admite ejecuciones en el editor, despliegues de API y despliegues de chat
</Card>
<Card title="Webhook" href="/triggers/webhook">
Recibe cargas útiles de webhooks externos
@@ -31,18 +31,22 @@ Utiliza el bloque Start para todo lo que se origina desde el editor, despliegue
Ejecución basada en cron o intervalos
</Card>
<Card title="RSS Feed" href="/triggers/rss">
Monitorea feeds RSS y Atom para nuevo contenido
Monitorea feeds RSS y Atom para detectar contenido nuevo
</Card>
<Card title="Email Polling Groups" href="#email-polling-groups">
Monitorea bandejas de entrada de Gmail y Outlook del equipo
</Card>
</Cards>
## Comparación rápida
| Disparador | Condición de inicio |
| Trigger | Condición de inicio |
|---------|-----------------|
| **Start** | Ejecuciones del editor, solicitudes de despliegue a API o mensajes de chat |
| **Start** | Ejecuciones en el editor, solicitudes de despliegue a API o mensajes de chat |
| **Schedule** | Temporizador gestionado en el bloque de programación |
| **Webhook** | Al recibir una solicitud HTTP entrante |
| **RSS Feed** | Nuevo elemento publicado en el feed |
| **Email Polling Groups** | Nuevo correo electrónico recibido en bandejas de entrada de Gmail o Outlook del equipo |
> El bloque Start siempre expone los campos `input`, `conversationId` y `files`. Añade campos personalizados al formato de entrada para datos estructurados adicionales.
@@ -65,3 +69,25 @@ Cuando haces clic en **Ejecutar** en el editor, Sim selecciona automáticamente
Si tu flujo de trabajo tiene múltiples disparadores, se ejecutará el disparador de mayor prioridad. Por ejemplo, si tienes tanto un bloque Start como un disparador Webhook, al hacer clic en Ejecutar se ejecutará el bloque Start.
**Disparadores externos con cargas útiles simuladas**: Cuando los disparadores externos (webhooks e integraciones) se ejecutan manualmente, Sim genera automáticamente cargas útiles simuladas basadas en la estructura de datos esperada del disparador. Esto asegura que los bloques posteriores puedan resolver las variables correctamente durante las pruebas.
## Grupos de sondeo de correo electrónico
Los grupos de sondeo te permiten monitorear las bandejas de entrada de Gmail o Outlook de varios miembros del equipo con un solo activador. Requiere un plan Team o Enterprise.
**Crear un grupo de sondeo** (administrador/propietario)
1. Ve a **Configuración → Sondeo de correo electrónico**
2. Haz clic en **Crear** y elige Gmail u Outlook
3. Ingresa un nombre para el grupo
**Invitar miembros**
1. Haz clic en **Agregar miembros** en tu grupo de sondeo
2. Ingresa direcciones de correo electrónico (separadas por comas o saltos de línea, o arrastra y suelta un CSV)
3. Haz clic en **Enviar invitaciones**
Los invitados reciben un correo electrónico con un enlace para conectar su cuenta. Una vez conectada, su bandeja de entrada se incluye automáticamente en el grupo de sondeo. Los invitados no necesitan ser miembros de tu organización Sim.
**Usar en un flujo de trabajo**
Al configurar un activador de correo electrónico, selecciona tu grupo de sondeo del menú desplegable de credenciales en lugar de una cuenta individual. El sistema crea webhooks para cada miembro y enruta todos los correos electrónicos a través de tu flujo de trabajo.

View File

@@ -17,11 +17,11 @@ Les serveurs MCP regroupent vos outils de workflow. Créez-les et gérez-les dan
<Video src="mcp/mcp-server.mp4" width={700} height={450} />
</div>
1. Accédez à **Paramètres → Serveurs MCP**
1. Accédez à **Paramètres → MCP déployés**
2. Cliquez sur **Créer un serveur**
3. Saisissez un nom et une description facultative
4. Copiez l'URL du serveur pour l'utiliser dans vos clients MCP
5. Consultez et gérez tous les outils ajoutés au serveur
5. Affichez et gérez tous les outils ajoutés au serveur
## Ajouter un workflow en tant qu'outil
@@ -79,7 +79,7 @@ Incluez votre en-tête de clé API (`X-API-Key`) pour un accès authentifié lor
## Gestion du serveur
Depuis la vue détaillée du serveur dans **Paramètres → Serveurs MCP**, vous pouvez :
Depuis la vue détaillée du serveur dans **Paramètres → MCP déployés**, vous pouvez :
- **Voir les outils** : voir tous les workflows ajoutés à un serveur
- **Copier l'URL** : obtenir l'URL du serveur pour les clients MCP

View File

@@ -28,7 +28,7 @@ Les serveurs MCP fournissent des collections d'outils que vos agents peuvent uti
</div>
1. Accédez aux paramètres de votre espace de travail
2. Allez à la section **Serveurs MCP**
2. Allez dans la section **MCP déployés**
3. Cliquez sur **Ajouter un serveur MCP**
4. Saisissez les détails de configuration du serveur
5. Enregistrez la configuration

View File

@@ -22,7 +22,7 @@ Utilisez le bloc Démarrer pour tout ce qui provient de l'éditeur, du déploiem
<Cards>
<Card title="Start" href="/triggers/start">
Point d'entrée unifié qui prend en charge les exécutions de l'éditeur, les déploiements d'API et les déploiements de chat
Point d'entrée unifié qui prend en charge les exécutions dans l'éditeur, les déploiements API et les déploiements de chat
</Card>
<Card title="Webhook" href="/triggers/webhook">
Recevoir des charges utiles de webhook externes
@@ -31,18 +31,22 @@ Utilisez le bloc Démarrer pour tout ce qui provient de l'éditeur, du déploiem
Exécution basée sur cron ou intervalle
</Card>
<Card title="RSS Feed" href="/triggers/rss">
Surveiller les flux RSS et Atom pour du nouveau contenu
Surveiller les flux RSS et Atom pour détecter du nouveau contenu
</Card>
<Card title="Email Polling Groups" href="#email-polling-groups">
Surveiller les boîtes de réception Gmail et Outlook de l'équipe
</Card>
</Cards>
## Comparaison rapide
| Déclencheur | Condition de démarrage |
|---------|-----------------|
| **Start** | Exécutions de l'éditeur, requêtes de déploiement d'API ou messages de chat |
|-------------|------------------------|
| **Start** | Exécutions dans l'éditeur, requêtes de déploiement vers l'API ou messages de chat |
| **Schedule** | Minuteur géré dans le bloc de planification |
| **Webhook** | Sur requête HTTP entrante |
| **Webhook** | Lors d'une requête HTTP entrante |
| **RSS Feed** | Nouvel élément publié dans le flux |
| **Email Polling Groups** | Nouvel e-mail reçu dans les boîtes de réception Gmail ou Outlook de l'équipe |
> Le bloc Démarrer expose toujours les champs `input`, `conversationId` et `files`. Ajoutez des champs personnalisés au format d'entrée pour des données structurées supplémentaires.
@@ -65,3 +69,25 @@ Lorsque vous cliquez sur **Exécuter** dans l'éditeur, Sim sélectionne automat
Si votre flux de travail comporte plusieurs déclencheurs, le déclencheur de priorité la plus élevée sera exécuté. Par exemple, si vous avez à la fois un bloc Démarrer et un déclencheur Webhook, cliquer sur Exécuter exécutera le bloc Démarrer.
**Déclencheurs externes avec charges utiles simulées** : lorsque des déclencheurs externes (webhooks et intégrations) sont exécutés manuellement, Sim génère automatiquement des charges utiles simulées basées sur la structure de données attendue du déclencheur. Cela garantit que les blocs en aval peuvent résoudre correctement les variables pendant les tests.
## Groupes de surveillance d'e-mails
Les groupes de surveillance vous permettent de surveiller les boîtes de réception Gmail ou Outlook de plusieurs membres de l'équipe avec un seul déclencheur. Nécessite un forfait Team ou Enterprise.
**Créer un groupe de surveillance** (Admin/Propriétaire)
1. Accédez à **Paramètres → Surveillance d'e-mails**
2. Cliquez sur **Créer** et choisissez Gmail ou Outlook
3. Entrez un nom pour le groupe
**Inviter des membres**
1. Cliquez sur **Ajouter des membres** dans votre groupe de surveillance
2. Entrez les adresses e-mail (séparées par des virgules ou des sauts de ligne, ou glissez-déposez un fichier CSV)
3. Cliquez sur **Envoyer les invitations**
Les personnes invitées reçoivent un e-mail avec un lien pour connecter leur compte. Une fois connectée, leur boîte de réception est automatiquement incluse dans le groupe de surveillance. Les personnes invitées n'ont pas besoin d'être membres de votre organisation Sim.
**Utiliser dans un workflow**
Lors de la configuration d'un déclencheur d'e-mail, sélectionnez votre groupe de surveillance dans le menu déroulant des identifiants au lieu d'un compte individuel. Le système crée des webhooks pour chaque membre et achemine tous les e-mails via votre workflow.

View File

@@ -16,11 +16,11 @@ MCPサーバーは、ワークフローツールをまとめてグループ化
<Video src="mcp/mcp-server.mp4" width={700} height={450} />
</div>
1. **設定 → MCPサーバー**に移動
2. **サーバーを作成**をクリック
3. 名前と説明(任意)を入力
4. MCPクライアントで使用するためにサーバーURLをコピー
5. サーバーに追加されたすべてのツールを表示・管理
1. **設定 → デプロイ済みMCP**に移動します
2. **サーバーを作成**をクリックします
3. 名前とオプションの説明を入力します
4. MCPクライアントで使用するためにサーバーURLをコピーします
5. サーバーに追加されたすべてのツールを表示および管理します
## ワークフローをツールとして追加
@@ -78,7 +78,7 @@ mcp-remoteまたは他のHTTPベースのMCPトランスポートを使用する
## サーバー管理
**設定 → MCPサーバー**のサーバー詳細ビューから、以下の操作が可能です:
**設定 → デプロイ済みMCP**のサーバー詳細ビューから、次のことができます:
- **ツールを表示**: サーバーに追加されたすべてのワークフローを確認
- **URLをコピー**: MCPクライアント用のサーバーURLを取得

View File

@@ -27,10 +27,10 @@ MCPサーバーはエージェントが使用できるツールのコレクシ
</div>
1. ワークスペース設定に移動します
2. **MCPサーバー**セクションに進みます
2. **デプロイ済みMCP**セクションに移動します
3. **MCPサーバーを追加**をクリックします
4. サーバー構成の詳細を入力します
5. 構成を保存します
4. サーバー設定の詳細を入力します
5. 設定を保存します
<Callout type="info">
エージェントブロックのツールバーから直接MCPサーバーを構成することもできます(クイックセットアップ)

View File

@@ -22,16 +22,19 @@ import { Image } from '@/components/ui/image'
<Cards>
<Card title="Start" href="/triggers/start">
エディタ実行、APIデプロイメント、チャットデプロイメントをサポートする統合エントリーポイント
エディタ実行、APIデプロイ、チャットデプロイをサポートする統合エントリーポイント
</Card>
<Card title="Webhook" href="/triggers/webhook">
外部のwebhookペイロードを受信
外部Webhookペイロードを受信
</Card>
<Card title="Schedule" href="/triggers/schedule">
Cronまたは間隔ベースの実行
Cronまたはインターバルベースの実行
</Card>
<Card title="RSS Feed" href="/triggers/rss">
新しいコンテンツのRSSとAtomフィードを監視
RSSおよびAtomフィードの新しいコンテンツを監視
</Card>
<Card title="Email Polling Groups" href="#email-polling-groups">
チームのGmailおよびOutlook受信トレイを監視
</Card>
</Cards>
@@ -39,10 +42,11 @@ import { Image } from '@/components/ui/image'
| トリガー | 開始条件 |
|---------|-----------------|
| **Start** | エディタ実行、APIへのデプロイリクエスト、またはチャットメッセージ |
| **Start** | エディタ実行、deploy-to-APIリクエスト、またはチャットメッセージ |
| **Schedule** | スケジュールブロックで管理されるタイマー |
| **Webhook** | 受信HTTPリクエスト時 |
| **Webhook** | インバウンドHTTPリクエスト時 |
| **RSS Feed** | フィードに新しいアイテムが公開された時 |
| **Email Polling Groups** | チームのGmailまたはOutlook受信トレイに新しいメールが受信された時 |
> スタートブロックは常に `input`、`conversationId`、および `files` フィールドを公開します。追加の構造化データには入力フォーマットにカスタムフィールドを追加してください。
@@ -65,3 +69,25 @@ import { Image } from '@/components/ui/image'
ワークフローに複数のトリガーがある場合、最も優先度の高いトリガーが実行されます。例えば、スタートブロックとウェブフックトリガーの両方がある場合、実行をクリックするとスタートブロックが実行されます。
**モックペイロードを持つ外部トリガー**: 外部トリガー(ウェブフックと連携)が手動で実行される場合、Simはトリガーの予想されるデータ構造に基づいてモックペイロードを自動生成します。これにより、テスト中に下流のブロックが変数を正しく解決できるようになります。
## Email Polling Groups
Polling Groupsを使用すると、単一のトリガーで複数のチームメンバーのGmailまたはOutlook受信トレイを監視できます。TeamまたはEnterpriseプランが必要です。
**Polling Groupの作成**(管理者/オーナー)
1. **設定 → Email Polling**に移動
2. **作成**をクリックし、GmailまたはOutlookを選択
3. グループの名前を入力
**メンバーの招待**
1. Polling Groupの**メンバーを追加**をクリック
2. メールアドレスを入力(カンマまたは改行で区切る、またはCSVをドラッグ&ドロップ)
3. **招待を送信**をクリック
招待された人は、アカウントを接続するためのリンクが記載されたメールを受信します。接続されると、その受信トレイは自動的にPolling Groupに含まれます。招待された人は、Sim組織のメンバーである必要はありません。
**ワークフローでの使用**
メールトリガーを設定する際、個別のアカウントではなく、認証情報ドロップダウンからPolling Groupを選択します。システムは各メンバーのWebhookを作成し、すべてのメールをワークフローを通じてルーティングします。

View File

@@ -16,11 +16,11 @@ MCP 服务器用于将您的工作流工具进行分组。您可以在工作区
<Video src="mcp/mcp-server.mp4" width={700} height={450} />
</div>
1. 进入 **设置 → MCP 服务器**
1. 进入 **设置 → 已部署的 MCPs**
2. 点击 **创建服务器**
3. 输入名称和可选描述
4. 复制服务器 URL 以在您的 MCP 客户端中使用
5. 查看并管理已添加到服务器的所有工具
4. 复制服务器 URL 以在你的 MCP 客户端中使用
5. 查看并管理已添加到服务器的所有工具
## 添加工作流为工具
@@ -78,7 +78,7 @@ MCP 服务器用于将您的工作流工具进行分组。您可以在工作区
## 服务器管理
在 **设置 → MCP 服务器** 的服务器详情视图中,您可以:
在 **设置 → 已部署的 MCPs** 的服务器详情页,你可以:
- **查看工具**:查看添加到服务器的所有工作流
- **复制 URL**:获取 MCP 客户端的服务器 URL

View File

@@ -27,9 +27,9 @@ MCP 服务器提供工具集合,供您的代理使用。您可以在工作区
</div>
1. 进入您的工作区设置
2. 转到 **MCP 服务器** 部分
3. 点击 **添加 MCP 服务器**
4. 输入服务器配置详情
2. 前往 **Deployed MCPs** 部分
3. 点击 **Add MCP Server**
4. 输入服务器配置信息
5. 保存配置
<Callout type="info">

View File

@@ -21,17 +21,20 @@ import { Image } from '@/components/ui/image'
使用 Start 块处理从编辑器、部署到 API 或部署到聊天的所有操作。其他触发器可用于事件驱动的工作流:
<Cards>
<Card title="开始" href="/triggers/start">
支持编辑器运行、API 部署和聊天部署的统一入口
<Card title="Start" href="/triggers/start">
支持编辑器运行、API 部署和聊天部署的统一入口
</Card>
<Card title="Webhook" href="/triggers/webhook">
接收外部 webhook 负载
</Card>
<Card title="计划" href="/triggers/schedule">
基于 Cron 或间隔的执行
<Card title="Schedule" href="/triggers/schedule">
基于 cron 或间隔的执行
</Card>
<Card title="RSS 源" href="/triggers/rss">
监控 RSS 和 Atom 源的新内容
<Card title="RSS Feed" href="/triggers/rss">
监控 RSS 和 Atom 订阅源的新内容
</Card>
<Card title="Email Polling Groups" href="#email-polling-groups">
监控团队 Gmail 和 Outlook 收件箱
</Card>
</Cards>
@@ -39,10 +42,11 @@ import { Image } from '@/components/ui/image'
| 触发器 | 启动条件 |
|---------|-----------------|
| **开始** | 编辑器运行、部署到 API 请求或聊天消息 |
| **计划** | 在计划块中管理的计时器 |
| **Start** | 编辑器运行、API 部署请求或聊天消息 |
| **Schedule** | 在 schedule 块中管理的计时器 |
| **Webhook** | 收到入站 HTTP 请求时 |
| **RSS 源** | 源中发布了新项目 |
| **RSS Feed** | 订阅源中有新内容发布时 |
| **Email Polling Groups** | 团队 Gmail 或 Outlook 收件箱收到新邮件时 |
> Start 块始终公开 `input`、`conversationId` 和 `files` 字段。通过向输入格式添加自定义字段来增加结构化数据。
@@ -65,3 +69,25 @@ import { Image } from '@/components/ui/image'
如果您的工作流有多个触发器,将执行优先级最高的触发器。例如,如果您同时有 Start 块和 Webhook 触发器,点击运行将执行 Start 块。
**带有模拟负载的外部触发器**:当手动执行外部触发器(如 webhooks 和集成)时,Sim 会根据触发器的预期数据结构自动生成模拟负载。这确保了在测试过程中,下游模块可以正确解析变量。
## 邮件轮询组
轮询组可让你通过单一触发器监控多个团队成员的 Gmail 或 Outlook 收件箱。需要 Team 或 Enterprise 方案。
**创建轮询组**(管理员/所有者)
1. 前往 **设置 → 邮件轮询**
2. 点击 **创建**,选择 Gmail 或 Outlook
3. 输入组名
**邀请成员**
1. 在你的轮询组中点击 **添加成员**
2. 输入邮箱地址(用逗号或换行分隔,或拖拽 CSV 文件)
3. 点击 **发送邀请**
受邀者会收到一封带有连接账户链接的邮件。连接后,他们的收件箱会自动加入轮询组。受邀者无需成为你的 Sim 组织成员。
**在工作流中使用**
配置邮件触发器时,从凭据下拉菜单中选择你的轮询组,而不是单独账户。系统会为每位成员创建 webhook并将所有邮件通过你的工作流进行处理。

View File

@@ -4343,7 +4343,7 @@ checksums:
content/5: 6eee8c607e72b6c444d7b3ef07244f20
content/6: 747991e0e80e306dce1061ef7802db2a
content/7: 430153eacb29c66026cf71944df7be20
content/8: 5950966e19939b7a3a320d56ee4a674c
content/8: f9bdeac954d1d138c954c151db0403ec
content/9: 159cf7a6d62e64b0c5db27e73b8c1ff5
content/10: a723187777f9a848d4daa563e9dcbe17
content/11: b1c5f14e5290bcbbf5d590361ee7c053
@@ -5789,9 +5789,9 @@ checksums:
content/1: e71056df0f7b2eb3b2f271f21d0052cc
content/2: da2b445db16c149f56558a4ea876a5f0
content/3: cec18f48b2cd7974eb556880e6604f7f
content/4: b200402d6a01ab565fd56d113c530ef6
content/4: cff35e4208de8f6ef36a6eae79915fab
content/5: 4c3a5708af82c1ee42a12d14fd34e950
content/6: 64fbd5b16f4cff18ba976492a275c05e
content/6: 00a9f255e60b5979014694b0c2a3ba26
content/7: a28151eeb5ba3518b33809055b04f0f6
content/8: cffe5b901d78ebf2000d07dc7579533e
content/9: 73486253d24eeff7ac44dfd0c8868d87
@@ -5801,6 +5801,15 @@ checksums:
content/13: e5ca2445d3b69b062af5bf0a2988e760
content/14: 67e0b520d57e352689789eff5803ebbc
content/15: a1d7382600994068ca24dc03f46b7c73
content/16: 1895a0c773fddeb014c7aab468593b30
content/17: 5b478d664a0b1bc76f19516b2a6e2788
content/18: c97883b63e5e455cd2de51f0406f963f
content/19: 2ff6c01b8eebbdd653d864b105f53cde
content/20: 523b34e945343591d1df51a6ba6357dd
content/21: e6611cff00c91bd2327660aebf9418f4
content/22: 87e7e7df71f0883369e8abda30289c0f
content/23: b248d9eda347cfb122101a4e4b5eaa53
content/24: 2f003723d891d6c53c398b86c7397577
0bf172ef4ee9a2c94a2967d7d320b81b:
meta/title: 330265974a03ee22a09f42fa4ece25f6
meta/description: e3d54cbedf551315cf9e8749228c2d1c
@@ -50141,7 +50150,7 @@ checksums:
content/2: b082096b0c871b2a40418e479af6f158
content/3: 9c94aa34f44540b0632931a8244a6488
content/4: 14f33e16b5a98e4dbdda2a27aa0d7afb
content/5: d7b36732970b7649dd1aa1f1d0a34e74
content/5: 3ea8bad9314f442a69a87f313419ef1a
content/6: f554f833467a6dae5391372fc41dad53
content/7: 9cdb9189ecfcc4a6f567d3fd5fe342f0
content/8: 9a107692cb52c284c1cb022b516d700b
@@ -50158,7 +50167,7 @@ checksums:
content/19: a618fcff50c4856113428639359a922b
content/20: 5fd3a6d2dcd8aa18dbf0b784acaa271c
content/21: d118656dd565c4c22f3c0c3a7c7f3bee
content/22: f49b9be78f1e7a569e290acc1365d417
content/22: c161e7bcfba9cf6ef0ab8ef40ac0c17a
content/23: 0a70ebe6eb4c543c3810977ed46b69b0
content/24: ad8638a3473c909dbcb1e1d9f4f26381
content/25: 95343a9f81cd050d3713988c677c750f

View File

@@ -109,11 +109,15 @@ function SignupFormContent({
setEmail(emailParam)
}
const redirectParam = searchParams.get('redirect')
// Check both 'redirect' and 'callbackUrl' params (login page uses callbackUrl)
const redirectParam = searchParams.get('redirect') || searchParams.get('callbackUrl')
if (redirectParam) {
setRedirectUrl(redirectParam)
if (redirectParam.startsWith('/invite/')) {
if (
redirectParam.startsWith('/invite/') ||
redirectParam.startsWith('/credential-account/')
) {
setIsInviteFlow(true)
}
}

View File

@@ -8,11 +8,18 @@ import { createMockLogger, createMockRequest } from '@/app/api/__test-utils__/ut
describe('OAuth Disconnect API Route', () => {
const mockGetSession = vi.fn()
const mockSelectChain = {
from: vi.fn().mockReturnThis(),
innerJoin: vi.fn().mockReturnThis(),
where: vi.fn().mockResolvedValue([]),
}
const mockDb = {
delete: vi.fn().mockReturnThis(),
where: vi.fn(),
select: vi.fn().mockReturnValue(mockSelectChain),
}
const mockLogger = createMockLogger()
const mockSyncAllWebhooksForCredentialSet = vi.fn().mockResolvedValue({})
const mockUUID = 'mock-uuid-12345678-90ab-cdef-1234-567890abcdef'
@@ -33,6 +40,13 @@ describe('OAuth Disconnect API Route', () => {
vi.doMock('@sim/db/schema', () => ({
account: { userId: 'userId', providerId: 'providerId' },
credentialSetMember: {
id: 'id',
credentialSetId: 'credentialSetId',
userId: 'userId',
status: 'status',
},
credentialSet: { id: 'id', providerId: 'providerId' },
}))
vi.doMock('drizzle-orm', () => ({
@@ -45,6 +59,14 @@ describe('OAuth Disconnect API Route', () => {
vi.doMock('@sim/logger', () => ({
createLogger: vi.fn().mockReturnValue(mockLogger),
}))
vi.doMock('@/lib/core/utils/request', () => ({
generateRequestId: vi.fn().mockReturnValue('test-request-id'),
}))
vi.doMock('@/lib/webhooks/utils.server', () => ({
syncAllWebhooksForCredentialSet: mockSyncAllWebhooksForCredentialSet,
}))
})
afterEach(() => {

View File

@@ -1,11 +1,12 @@
import { db } from '@sim/db'
import { account } from '@sim/db/schema'
import { account, credentialSet, credentialSetMember } from '@sim/db/schema'
import { createLogger } from '@sim/logger'
import { and, eq, like, or } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { getSession } from '@/lib/auth'
import { generateRequestId } from '@/lib/core/utils/request'
import { syncAllWebhooksForCredentialSet } from '@/lib/webhooks/utils.server'
export const dynamic = 'force-dynamic'
@@ -74,6 +75,49 @@ export async function POST(request: NextRequest) {
)
}
// Sync webhooks for all credential sets the user is a member of
// This removes webhooks that were using the disconnected credential
const userMemberships = await db
.select({
id: credentialSetMember.id,
credentialSetId: credentialSetMember.credentialSetId,
providerId: credentialSet.providerId,
})
.from(credentialSetMember)
.innerJoin(credentialSet, eq(credentialSetMember.credentialSetId, credentialSet.id))
.where(
and(
eq(credentialSetMember.userId, session.user.id),
eq(credentialSetMember.status, 'active')
)
)
for (const membership of userMemberships) {
// Only sync if the credential set matches this provider
// Credential sets store OAuth provider IDs like 'google-email' or 'outlook'
const matchesProvider =
membership.providerId === provider ||
membership.providerId === providerId ||
membership.providerId?.startsWith(`${provider}-`)
if (matchesProvider) {
try {
await syncAllWebhooksForCredentialSet(membership.credentialSetId, requestId)
logger.info(`[${requestId}] Synced webhooks after credential disconnect`, {
credentialSetId: membership.credentialSetId,
provider,
})
} catch (error) {
// Log but don't fail the disconnect - credential is already removed
logger.error(`[${requestId}] Failed to sync webhooks after credential disconnect`, {
credentialSetId: membership.credentialSetId,
provider,
error,
})
}
}
}
return NextResponse.json({ success: true }, { status: 200 })
} catch (error) {
logger.error(`[${requestId}] Error disconnecting OAuth provider`, error)
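The provider-matching rule in the hunk above (exact match on `provider` or `providerId`, or a `provider-` prefix for scoped IDs like `google-email`) can be sketched as a standalone predicate. This is an illustrative extraction, not part of the route:

```typescript
// Hypothetical standalone version of the matchesProvider check above.
// A credential set stores either a base provider id ('outlook') or a
// scoped one ('google-email'); both should match a disconnect of the
// base provider.
function matchesProvider(
  setProviderId: string | null | undefined,
  provider: string,
  providerId: string
): boolean {
  if (!setProviderId) return false
  return (
    setProviderId === provider ||
    setProviderId === providerId ||
    setProviderId.startsWith(`${provider}-`)
  )
}
```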

View File

@@ -138,7 +138,10 @@ describe('OAuth Token API Routes', () => {
const data = await response.json()
expect(response.status).toBe(400)
expect(data).toHaveProperty('error', 'Credential ID is required')
expect(data).toHaveProperty(
'error',
'Either credentialId or (credentialAccountUserId + providerId) is required'
)
expect(mockLogger.warn).toHaveBeenCalled()
})

View File

@@ -4,7 +4,7 @@ import { z } from 'zod'
import { authorizeCredentialUse } from '@/lib/auth/credential-access'
import { checkHybridAuth } from '@/lib/auth/hybrid'
import { generateRequestId } from '@/lib/core/utils/request'
import { getCredential, refreshTokenIfNeeded } from '@/app/api/auth/oauth/utils'
import { getCredential, getOAuthToken, refreshTokenIfNeeded } from '@/app/api/auth/oauth/utils'
export const dynamic = 'force-dynamic'
@@ -12,12 +12,17 @@ const logger = createLogger('OAuthTokenAPI')
const SALESFORCE_INSTANCE_URL_REGEX = /__sf_instance__:([^\s]+)/
const tokenRequestSchema = z.object({
credentialId: z
.string({ required_error: 'Credential ID is required' })
.min(1, 'Credential ID is required'),
workflowId: z.string().min(1, 'Workflow ID is required').nullish(),
})
const tokenRequestSchema = z
.object({
credentialId: z.string().min(1).optional(),
credentialAccountUserId: z.string().min(1).optional(),
providerId: z.string().min(1).optional(),
workflowId: z.string().min(1).nullish(),
})
.refine(
(data) => data.credentialId || (data.credentialAccountUserId && data.providerId),
'Either credentialId or (credentialAccountUserId + providerId) is required'
)
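The `.refine` above encodes an either/or contract: a request must carry `credentialId`, or the `credentialAccountUserId` + `providerId` pair. Expressed as a plain predicate, independent of zod (a sketch with illustrative names):

```typescript
interface TokenRequest {
  credentialId?: string
  credentialAccountUserId?: string
  providerId?: string
}

// Mirrors the zod .refine above: accept a direct credentialId, or the
// (credentialAccountUserId + providerId) pair, but reject anything else.
function isValidTokenRequest(body: TokenRequest): boolean {
  return Boolean(
    body.credentialId || (body.credentialAccountUserId && body.providerId)
  )
}
```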
const tokenQuerySchema = z.object({
credentialId: z
@@ -58,9 +63,37 @@ export async function POST(request: NextRequest) {
)
}
const { credentialId, workflowId } = parseResult.data
const { credentialId, credentialAccountUserId, providerId, workflowId } = parseResult.data
if (credentialAccountUserId && providerId) {
logger.info(`[${requestId}] Fetching token by credentialAccountUserId + providerId`, {
credentialAccountUserId,
providerId,
})
try {
const accessToken = await getOAuthToken(credentialAccountUserId, providerId)
if (!accessToken) {
return NextResponse.json(
{
error: `No credential found for user ${credentialAccountUserId} and provider ${providerId}`,
},
{ status: 404 }
)
}
return NextResponse.json({ accessToken }, { status: 200 })
} catch (error) {
const message = error instanceof Error ? error.message : 'Failed to get OAuth token'
logger.warn(`[${requestId}] OAuth token error: ${message}`)
return NextResponse.json({ error: message }, { status: 403 })
}
}
if (!credentialId) {
return NextResponse.json({ error: 'Credential ID is required' }, { status: 400 })
}
// We already have workflowId from the parsed body; avoid forcing hybrid auth to re-read it
const authz = await authorizeCredentialUse(request, {
credentialId,
workflowId: workflowId ?? undefined,
@@ -70,7 +103,6 @@ export async function POST(request: NextRequest) {
return NextResponse.json({ error: authz.error || 'Unauthorized' }, { status: 403 })
}
// Fetch the credential as the owner to enforce ownership scoping
const credential = await getCredential(requestId, credentialId, authz.credentialOwnerUserId)
if (!credential) {
@@ -78,7 +110,6 @@ export async function POST(request: NextRequest) {
}
try {
// Refresh the token if needed
const { accessToken } = await refreshTokenIfNeeded(requestId, credential, credentialId)
let instanceUrl: string | undefined
@@ -145,7 +176,6 @@ export async function GET(request: NextRequest) {
return NextResponse.json({ error: 'User not authenticated' }, { status: 401 })
}
// Get the credential from the database
const credential = await getCredential(requestId, credentialId, auth.userId)
if (!credential) {

View File

@@ -1,7 +1,7 @@
import { db } from '@sim/db'
import { account, workflow } from '@sim/db/schema'
import { account, credentialSetMember, workflow } from '@sim/db/schema'
import { createLogger } from '@sim/logger'
import { and, desc, eq } from 'drizzle-orm'
import { and, desc, eq, inArray } from 'drizzle-orm'
import { getSession } from '@/lib/auth'
import { refreshOAuthToken } from '@/lib/oauth'
@@ -105,10 +105,10 @@ export async function getOAuthToken(userId: string, providerId: string): Promise
refreshToken: account.refreshToken,
accessTokenExpiresAt: account.accessTokenExpiresAt,
idToken: account.idToken,
scope: account.scope,
})
.from(account)
.where(and(eq(account.userId, userId), eq(account.providerId, providerId)))
// Always use the most recently updated credential for this provider
.orderBy(desc(account.updatedAt))
.limit(1)
@@ -335,3 +335,108 @@ export async function refreshTokenIfNeeded(
throw error
}
}
export interface CredentialSetCredential {
userId: string
credentialId: string
accessToken: string
providerId: string
}
export async function getCredentialsForCredentialSet(
credentialSetId: string,
providerId: string
): Promise<CredentialSetCredential[]> {
logger.info(`Getting credentials for credential set ${credentialSetId}, provider ${providerId}`)
const members = await db
.select({ userId: credentialSetMember.userId })
.from(credentialSetMember)
.where(
and(
eq(credentialSetMember.credentialSetId, credentialSetId),
eq(credentialSetMember.status, 'active')
)
)
logger.info(`Found ${members.length} active members in credential set ${credentialSetId}`)
if (members.length === 0) {
logger.warn(`No active members found for credential set ${credentialSetId}`)
return []
}
const userIds = members.map((m) => m.userId)
logger.debug(`Member user IDs: ${userIds.join(', ')}`)
const credentials = await db
.select({
id: account.id,
userId: account.userId,
providerId: account.providerId,
accessToken: account.accessToken,
refreshToken: account.refreshToken,
accessTokenExpiresAt: account.accessTokenExpiresAt,
})
.from(account)
.where(and(inArray(account.userId, userIds), eq(account.providerId, providerId)))
logger.info(
`Found ${credentials.length} credentials with provider ${providerId} for ${members.length} members`
)
const results: CredentialSetCredential[] = []
for (const cred of credentials) {
const now = new Date()
const tokenExpiry = cred.accessTokenExpiresAt
const shouldRefresh =
!!cred.refreshToken && (!cred.accessToken || (tokenExpiry && tokenExpiry < now))
let accessToken = cred.accessToken
if (shouldRefresh && cred.refreshToken) {
try {
const refreshResult = await refreshOAuthToken(providerId, cred.refreshToken)
if (refreshResult) {
accessToken = refreshResult.accessToken
const updateData: Record<string, unknown> = {
accessToken: refreshResult.accessToken,
accessTokenExpiresAt: new Date(Date.now() + refreshResult.expiresIn * 1000),
updatedAt: new Date(),
}
if (refreshResult.refreshToken && refreshResult.refreshToken !== cred.refreshToken) {
updateData.refreshToken = refreshResult.refreshToken
}
await db.update(account).set(updateData).where(eq(account.id, cred.id))
logger.info(`Refreshed token for user ${cred.userId}, provider ${providerId}`)
}
} catch (error) {
logger.error(`Failed to refresh token for user ${cred.userId}, provider ${providerId}`, {
error: error instanceof Error ? error.message : String(error),
})
continue
}
}
if (accessToken) {
results.push({
userId: cred.userId,
credentialId: cred.id,
accessToken,
providerId,
})
}
}
logger.info(
`Found ${results.length} valid credentials for credential set ${credentialSetId}, provider ${providerId}`
)
return results
}
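The refresh decision inside the loop above (refresh only when a refresh token exists and the access token is missing or past its expiry) can be isolated into a small pure function; the name and signature here are illustrative:

```typescript
// Hypothetical extraction of the shouldRefresh check above. Note that a
// present access token with no recorded expiry is treated as still valid,
// matching the (tokenExpiry && tokenExpiry < now) guard in the original.
function shouldRefreshToken(
  accessToken: string | null,
  refreshToken: string | null,
  expiresAt: Date | null,
  now: Date = new Date()
): boolean {
  if (!refreshToken) return false // nothing to refresh with
  if (!accessToken) return true // no usable token at all
  return expiresAt !== null && expiresAt < now
}
```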

View File

@@ -1,7 +1,8 @@
import { createLogger } from '@sim/logger'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { auth } from '@/lib/auth'
import { auth, getSession } from '@/lib/auth'
import { hasSSOAccess } from '@/lib/billing'
import { env } from '@/lib/core/config/env'
import { REDACTED_MARKER } from '@/lib/core/security/redaction'
@@ -63,10 +64,22 @@ const ssoRegistrationSchema = z.discriminatedUnion('providerType', [
export async function POST(request: NextRequest) {
try {
// SSO plugin must be enabled in Better Auth
if (!env.SSO_ENABLED) {
return NextResponse.json({ error: 'SSO is not enabled' }, { status: 400 })
}
// Check plan access (enterprise) or env var override
const session = await getSession()
if (!session?.user?.id) {
return NextResponse.json({ error: 'Authentication required' }, { status: 401 })
}
const hasAccess = await hasSSOAccess(session.user.id)
if (!hasAccess) {
return NextResponse.json({ error: 'SSO requires an Enterprise plan' }, { status: 403 })
}
const rawBody = await request.json()
const parseResult = ssoRegistrationSchema.safeParse(rawBody)

View File

@@ -212,6 +212,18 @@ export async function POST(request: NextRequest) {
logger.info(`Chat "${title}" deployed successfully at ${chatUrl}`)
try {
const { PlatformEvents } = await import('@/lib/core/telemetry')
PlatformEvents.chatDeployed({
chatId: id,
workflowId,
authType,
hasOutputConfigs: outputConfigs.length > 0,
})
} catch (_e) {
// Silently fail
}
return createSuccessResponse({
id,
chatUrl,

View File

@@ -0,0 +1,156 @@
import { db } from '@sim/db'
import { credentialSet, credentialSetInvitation, member, organization, user } from '@sim/db/schema'
import { createLogger } from '@sim/logger'
import { and, eq } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { getEmailSubject, renderPollingGroupInvitationEmail } from '@/components/emails'
import { getSession } from '@/lib/auth'
import { hasCredentialSetsAccess } from '@/lib/billing'
import { getBaseUrl } from '@/lib/core/utils/urls'
import { sendEmail } from '@/lib/messaging/email/mailer'
const logger = createLogger('CredentialSetInviteResend')
async function getCredentialSetWithAccess(credentialSetId: string, userId: string) {
const [set] = await db
.select({
id: credentialSet.id,
organizationId: credentialSet.organizationId,
name: credentialSet.name,
providerId: credentialSet.providerId,
})
.from(credentialSet)
.where(eq(credentialSet.id, credentialSetId))
.limit(1)
if (!set) return null
const [membership] = await db
.select({ role: member.role })
.from(member)
.where(and(eq(member.userId, userId), eq(member.organizationId, set.organizationId)))
.limit(1)
if (!membership) return null
return { set, role: membership.role }
}
export async function POST(
req: NextRequest,
{ params }: { params: Promise<{ id: string; invitationId: string }> }
) {
const session = await getSession()
if (!session?.user?.id) {
return NextResponse.json({ error: 'Unauthorized' }, { status: 401 })
}
// Check plan access (team/enterprise) or env var override
const hasAccess = await hasCredentialSetsAccess(session.user.id)
if (!hasAccess) {
return NextResponse.json(
{ error: 'Credential sets require a Team or Enterprise plan' },
{ status: 403 }
)
}
const { id, invitationId } = await params
try {
const result = await getCredentialSetWithAccess(id, session.user.id)
if (!result) {
return NextResponse.json({ error: 'Credential set not found' }, { status: 404 })
}
if (result.role !== 'admin' && result.role !== 'owner') {
return NextResponse.json({ error: 'Admin or owner permissions required' }, { status: 403 })
}
const [invitation] = await db
.select()
.from(credentialSetInvitation)
.where(
and(
eq(credentialSetInvitation.id, invitationId),
eq(credentialSetInvitation.credentialSetId, id)
)
)
.limit(1)
if (!invitation) {
return NextResponse.json({ error: 'Invitation not found' }, { status: 404 })
}
if (invitation.status !== 'pending') {
return NextResponse.json({ error: 'Only pending invitations can be resent' }, { status: 400 })
}
// Update expiration
const newExpiresAt = new Date()
newExpiresAt.setDate(newExpiresAt.getDate() + 7)
await db
.update(credentialSetInvitation)
.set({ expiresAt: newExpiresAt })
.where(eq(credentialSetInvitation.id, invitationId))
const inviteUrl = `${getBaseUrl()}/credential-account/${invitation.token}`
// Send email if email address exists
if (invitation.email) {
try {
const [inviter] = await db
.select({ name: user.name })
.from(user)
.where(eq(user.id, session.user.id))
.limit(1)
const [org] = await db
.select({ name: organization.name })
.from(organization)
.where(eq(organization.id, result.set.organizationId))
.limit(1)
const provider = (result.set.providerId as 'google-email' | 'outlook') || 'google-email'
const emailHtml = await renderPollingGroupInvitationEmail({
inviterName: inviter?.name || 'A team member',
organizationName: org?.name || 'your organization',
pollingGroupName: result.set.name,
provider,
inviteLink: inviteUrl,
})
const emailResult = await sendEmail({
to: invitation.email,
subject: getEmailSubject('polling-group-invitation'),
html: emailHtml,
emailType: 'transactional',
})
if (!emailResult.success) {
logger.warn('Failed to resend invitation email', {
email: invitation.email,
error: emailResult.message,
})
return NextResponse.json({ error: 'Failed to send email' }, { status: 500 })
}
} catch (emailError) {
logger.error('Error sending invitation email', emailError)
return NextResponse.json({ error: 'Failed to send email' }, { status: 500 })
}
}
logger.info('Resent credential set invitation', {
credentialSetId: id,
invitationId,
userId: session.user.id,
})
return NextResponse.json({ success: true })
} catch (error) {
logger.error('Error resending invitation', error)
return NextResponse.json({ error: 'Failed to resend invitation' }, { status: 500 })
}
}

View File

@@ -0,0 +1,243 @@
import { db } from '@sim/db'
import { credentialSet, credentialSetInvitation, member, organization, user } from '@sim/db/schema'
import { createLogger } from '@sim/logger'
import { and, eq } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { getEmailSubject, renderPollingGroupInvitationEmail } from '@/components/emails'
import { getSession } from '@/lib/auth'
import { hasCredentialSetsAccess } from '@/lib/billing'
import { getBaseUrl } from '@/lib/core/utils/urls'
import { sendEmail } from '@/lib/messaging/email/mailer'
const logger = createLogger('CredentialSetInvite')
const createInviteSchema = z.object({
email: z.string().email().optional(),
})
async function getCredentialSetWithAccess(credentialSetId: string, userId: string) {
const [set] = await db
.select({
id: credentialSet.id,
organizationId: credentialSet.organizationId,
name: credentialSet.name,
providerId: credentialSet.providerId,
})
.from(credentialSet)
.where(eq(credentialSet.id, credentialSetId))
.limit(1)
if (!set) return null
const [membership] = await db
.select({ role: member.role })
.from(member)
.where(and(eq(member.userId, userId), eq(member.organizationId, set.organizationId)))
.limit(1)
if (!membership) return null
return { set, role: membership.role }
}
export async function GET(req: NextRequest, { params }: { params: Promise<{ id: string }> }) {
const session = await getSession()
if (!session?.user?.id) {
return NextResponse.json({ error: 'Unauthorized' }, { status: 401 })
}
// Check plan access (team/enterprise) or env var override
const hasAccess = await hasCredentialSetsAccess(session.user.id)
if (!hasAccess) {
return NextResponse.json(
{ error: 'Credential sets require a Team or Enterprise plan' },
{ status: 403 }
)
}
const { id } = await params
const result = await getCredentialSetWithAccess(id, session.user.id)
if (!result) {
return NextResponse.json({ error: 'Credential set not found' }, { status: 404 })
}
const invitations = await db
.select()
.from(credentialSetInvitation)
.where(eq(credentialSetInvitation.credentialSetId, id))
return NextResponse.json({ invitations })
}
export async function POST(req: NextRequest, { params }: { params: Promise<{ id: string }> }) {
const session = await getSession()
if (!session?.user?.id) {
return NextResponse.json({ error: 'Unauthorized' }, { status: 401 })
}
// Check plan access (team/enterprise) or env var override
const hasAccess = await hasCredentialSetsAccess(session.user.id)
if (!hasAccess) {
return NextResponse.json(
{ error: 'Credential sets require a Team or Enterprise plan' },
{ status: 403 }
)
}
const { id } = await params
try {
const result = await getCredentialSetWithAccess(id, session.user.id)
if (!result) {
return NextResponse.json({ error: 'Credential set not found' }, { status: 404 })
}
if (result.role !== 'admin' && result.role !== 'owner') {
return NextResponse.json({ error: 'Admin or owner permissions required' }, { status: 403 })
}
const body = await req.json()
const { email } = createInviteSchema.parse(body)
const token = crypto.randomUUID()
const expiresAt = new Date()
expiresAt.setDate(expiresAt.getDate() + 7)
const invitation = {
id: crypto.randomUUID(),
credentialSetId: id,
email: email || null,
token,
invitedBy: session.user.id,
status: 'pending' as const,
expiresAt,
createdAt: new Date(),
}
await db.insert(credentialSetInvitation).values(invitation)
const inviteUrl = `${getBaseUrl()}/credential-account/${token}`
// Send email if email address was provided
if (email) {
try {
// Get inviter name
const [inviter] = await db
.select({ name: user.name })
.from(user)
.where(eq(user.id, session.user.id))
.limit(1)
// Get organization name
const [org] = await db
.select({ name: organization.name })
.from(organization)
.where(eq(organization.id, result.set.organizationId))
.limit(1)
const provider = (result.set.providerId as 'google-email' | 'outlook') || 'google-email'
const emailHtml = await renderPollingGroupInvitationEmail({
inviterName: inviter?.name || 'A team member',
organizationName: org?.name || 'your organization',
pollingGroupName: result.set.name,
provider,
inviteLink: inviteUrl,
})
const emailResult = await sendEmail({
to: email,
subject: getEmailSubject('polling-group-invitation'),
html: emailHtml,
emailType: 'transactional',
})
if (!emailResult.success) {
logger.warn('Failed to send invitation email', {
email,
error: emailResult.message,
})
}
} catch (emailError) {
logger.error('Error sending invitation email', emailError)
// Don't fail the invitation creation if email fails
}
}
logger.info('Created credential set invitation', {
credentialSetId: id,
invitationId: invitation.id,
userId: session.user.id,
emailSent: !!email,
})
return NextResponse.json({
invitation: {
...invitation,
inviteUrl,
},
})
} catch (error) {
if (error instanceof z.ZodError) {
return NextResponse.json({ error: error.errors[0].message }, { status: 400 })
}
logger.error('Error creating invitation', error)
return NextResponse.json({ error: 'Failed to create invitation' }, { status: 500 })
}
}
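The seven-day expiry computed in the POST handler above relies on `Date.setDate` rolling over month boundaries; a minimal sketch of that arithmetic (helper name is illustrative):

```typescript
// Computes an expiry N days from a start date; setDate handles
// month/year rollover (e.g. Jan 28 + 7 days lands in February).
function expiryInDays(from: Date, days: number): Date {
  const d = new Date(from)
  d.setDate(d.getDate() + days)
  return d
}
```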
export async function DELETE(req: NextRequest, { params }: { params: Promise<{ id: string }> }) {
const session = await getSession()
if (!session?.user?.id) {
return NextResponse.json({ error: 'Unauthorized' }, { status: 401 })
}
// Check plan access (team/enterprise) or env var override
const hasAccess = await hasCredentialSetsAccess(session.user.id)
if (!hasAccess) {
return NextResponse.json(
{ error: 'Credential sets require a Team or Enterprise plan' },
{ status: 403 }
)
}
const { id } = await params
const { searchParams } = new URL(req.url)
const invitationId = searchParams.get('invitationId')
if (!invitationId) {
return NextResponse.json({ error: 'invitationId is required' }, { status: 400 })
}
try {
const result = await getCredentialSetWithAccess(id, session.user.id)
if (!result) {
return NextResponse.json({ error: 'Credential set not found' }, { status: 404 })
}
if (result.role !== 'admin' && result.role !== 'owner') {
return NextResponse.json({ error: 'Admin or owner permissions required' }, { status: 403 })
}
await db
.update(credentialSetInvitation)
.set({ status: 'cancelled' })
.where(
and(
eq(credentialSetInvitation.id, invitationId),
eq(credentialSetInvitation.credentialSetId, id)
)
)
return NextResponse.json({ success: true })
} catch (error) {
logger.error('Error cancelling invitation', error)
return NextResponse.json({ error: 'Failed to cancel invitation' }, { status: 500 })
}
}

View File

@@ -0,0 +1,185 @@
import { db } from '@sim/db'
import { account, credentialSet, credentialSetMember, member, user } from '@sim/db/schema'
import { createLogger } from '@sim/logger'
import { and, eq, inArray } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { getSession } from '@/lib/auth'
import { hasCredentialSetsAccess } from '@/lib/billing'
import { syncAllWebhooksForCredentialSet } from '@/lib/webhooks/utils.server'
const logger = createLogger('CredentialSetMembers')
async function getCredentialSetWithAccess(credentialSetId: string, userId: string) {
const [set] = await db
.select({
id: credentialSet.id,
organizationId: credentialSet.organizationId,
providerId: credentialSet.providerId,
})
.from(credentialSet)
.where(eq(credentialSet.id, credentialSetId))
.limit(1)
if (!set) return null
const [membership] = await db
.select({ role: member.role })
.from(member)
.where(and(eq(member.userId, userId), eq(member.organizationId, set.organizationId)))
.limit(1)
if (!membership) return null
return { set, role: membership.role }
}
export async function GET(req: NextRequest, { params }: { params: Promise<{ id: string }> }) {
const session = await getSession()
if (!session?.user?.id) {
return NextResponse.json({ error: 'Unauthorized' }, { status: 401 })
}
// Check plan access (team/enterprise) or env var override
const hasAccess = await hasCredentialSetsAccess(session.user.id)
if (!hasAccess) {
return NextResponse.json(
{ error: 'Credential sets require a Team or Enterprise plan' },
{ status: 403 }
)
}
const { id } = await params
const result = await getCredentialSetWithAccess(id, session.user.id)
if (!result) {
return NextResponse.json({ error: 'Credential set not found' }, { status: 404 })
}
const members = await db
.select({
id: credentialSetMember.id,
userId: credentialSetMember.userId,
status: credentialSetMember.status,
joinedAt: credentialSetMember.joinedAt,
createdAt: credentialSetMember.createdAt,
userName: user.name,
userEmail: user.email,
userImage: user.image,
})
.from(credentialSetMember)
.leftJoin(user, eq(credentialSetMember.userId, user.id))
.where(eq(credentialSetMember.credentialSetId, id))
// Get credentials for all active members filtered by the polling group's provider
const activeMembers = members.filter((m) => m.status === 'active')
const memberUserIds = activeMembers.map((m) => m.userId)
let credentials: { userId: string; providerId: string; accountId: string }[] = []
if (memberUserIds.length > 0 && result.set.providerId) {
credentials = await db
.select({
userId: account.userId,
providerId: account.providerId,
accountId: account.accountId,
})
.from(account)
.where(
and(inArray(account.userId, memberUserIds), eq(account.providerId, result.set.providerId))
)
}
// Group credentials by userId
const credentialsByUser = credentials.reduce(
(acc, cred) => {
if (!acc[cred.userId]) {
acc[cred.userId] = []
}
acc[cred.userId].push({
providerId: cred.providerId,
accountId: cred.accountId,
})
return acc
},
{} as Record<string, { providerId: string; accountId: string }[]>
)
// Attach credentials to members
const membersWithCredentials = members.map((m) => ({
...m,
credentials: credentialsByUser[m.userId] || [],
}))
return NextResponse.json({ members: membersWithCredentials })
}
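The reduce in the GET handler above builds a userId-to-credentials index before attaching it to each member. Isolated as a helper (types and name are illustrative):

```typescript
interface CredentialRow {
  userId: string
  providerId: string
  accountId: string
}

// Groups flat credential rows into a per-user index, mirroring the
// credentialsByUser reduce in the members GET handler above.
function groupByUser(
  rows: CredentialRow[]
): Record<string, { providerId: string; accountId: string }[]> {
  const acc: Record<string, { providerId: string; accountId: string }[]> = {}
  for (const row of rows) {
    if (!acc[row.userId]) acc[row.userId] = []
    acc[row.userId].push({ providerId: row.providerId, accountId: row.accountId })
  }
  return acc
}
```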
export async function DELETE(req: NextRequest, { params }: { params: Promise<{ id: string }> }) {
const session = await getSession()
if (!session?.user?.id) {
return NextResponse.json({ error: 'Unauthorized' }, { status: 401 })
}
// Check plan access (team/enterprise) or env var override
const hasAccess = await hasCredentialSetsAccess(session.user.id)
if (!hasAccess) {
return NextResponse.json(
{ error: 'Credential sets require a Team or Enterprise plan' },
{ status: 403 }
)
}
const { id } = await params
const { searchParams } = new URL(req.url)
const memberId = searchParams.get('memberId')
if (!memberId) {
return NextResponse.json({ error: 'memberId is required' }, { status: 400 })
}
try {
const result = await getCredentialSetWithAccess(id, session.user.id)
if (!result) {
return NextResponse.json({ error: 'Credential set not found' }, { status: 404 })
}
if (result.role !== 'admin' && result.role !== 'owner') {
return NextResponse.json({ error: 'Admin or owner permissions required' }, { status: 403 })
}
const [memberToRemove] = await db
.select()
.from(credentialSetMember)
.where(and(eq(credentialSetMember.id, memberId), eq(credentialSetMember.credentialSetId, id)))
.limit(1)
if (!memberToRemove) {
return NextResponse.json({ error: 'Member not found' }, { status: 404 })
}
const requestId = crypto.randomUUID().slice(0, 8)
// Use a transaction so member deletion and webhook sync succeed or fail together
await db.transaction(async (tx) => {
await tx.delete(credentialSetMember).where(eq(credentialSetMember.id, memberId))
const syncResult = await syncAllWebhooksForCredentialSet(id, requestId, tx)
logger.info('Synced webhooks after member removed', {
credentialSetId: id,
...syncResult,
})
})
logger.info('Removed member from credential set', {
credentialSetId: id,
memberId,
userId: session.user.id,
})
return NextResponse.json({ success: true })
} catch (error) {
logger.error('Error removing member from credential set', error)
return NextResponse.json({ error: 'Failed to remove member' }, { status: 500 })
}
}

View File

@@ -0,0 +1,183 @@
import { db } from '@sim/db'
import { credentialSet, credentialSetMember, member } from '@sim/db/schema'
import { createLogger } from '@sim/logger'
import { and, eq } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { getSession } from '@/lib/auth'
import { hasCredentialSetsAccess } from '@/lib/billing'
const logger = createLogger('CredentialSet')
const updateCredentialSetSchema = z.object({
name: z.string().trim().min(1).max(100).optional(),
description: z.string().max(500).nullable().optional(),
})
async function getCredentialSetWithAccess(credentialSetId: string, userId: string) {
const [set] = await db
.select({
id: credentialSet.id,
organizationId: credentialSet.organizationId,
name: credentialSet.name,
description: credentialSet.description,
providerId: credentialSet.providerId,
createdBy: credentialSet.createdBy,
createdAt: credentialSet.createdAt,
updatedAt: credentialSet.updatedAt,
})
.from(credentialSet)
.where(eq(credentialSet.id, credentialSetId))
.limit(1)
if (!set) return null
const [membership] = await db
.select({ role: member.role })
.from(member)
.where(and(eq(member.userId, userId), eq(member.organizationId, set.organizationId)))
.limit(1)
if (!membership) return null
return { set, role: membership.role }
}
export async function GET(req: NextRequest, { params }: { params: Promise<{ id: string }> }) {
const session = await getSession()
if (!session?.user?.id) {
return NextResponse.json({ error: 'Unauthorized' }, { status: 401 })
}
// Check plan access (team/enterprise) or env var override
const hasAccess = await hasCredentialSetsAccess(session.user.id)
if (!hasAccess) {
return NextResponse.json(
{ error: 'Credential sets require a Team or Enterprise plan' },
{ status: 403 }
)
}
const { id } = await params
const result = await getCredentialSetWithAccess(id, session.user.id)
if (!result) {
return NextResponse.json({ error: 'Credential set not found' }, { status: 404 })
}
return NextResponse.json({ credentialSet: result.set })
}
export async function PUT(req: NextRequest, { params }: { params: Promise<{ id: string }> }) {
const session = await getSession()
if (!session?.user?.id) {
return NextResponse.json({ error: 'Unauthorized' }, { status: 401 })
}
// Check plan access (team/enterprise) or env var override
const hasAccess = await hasCredentialSetsAccess(session.user.id)
if (!hasAccess) {
return NextResponse.json(
{ error: 'Credential sets require a Team or Enterprise plan' },
{ status: 403 }
)
}
const { id } = await params
try {
const result = await getCredentialSetWithAccess(id, session.user.id)
if (!result) {
return NextResponse.json({ error: 'Credential set not found' }, { status: 404 })
}
if (result.role !== 'admin' && result.role !== 'owner') {
return NextResponse.json({ error: 'Admin or owner permissions required' }, { status: 403 })
}
const body = await req.json()
const updates = updateCredentialSetSchema.parse(body)
if (updates.name) {
const existingSet = await db
.select({ id: credentialSet.id })
.from(credentialSet)
.where(
and(
eq(credentialSet.organizationId, result.set.organizationId),
eq(credentialSet.name, updates.name)
)
)
.limit(1)
if (existingSet.length > 0 && existingSet[0].id !== id) {
return NextResponse.json(
{ error: 'A credential set with this name already exists' },
{ status: 409 }
)
}
}
await db
.update(credentialSet)
.set({
...updates,
updatedAt: new Date(),
})
.where(eq(credentialSet.id, id))
const [updated] = await db.select().from(credentialSet).where(eq(credentialSet.id, id)).limit(1)
return NextResponse.json({ credentialSet: updated })
} catch (error) {
if (error instanceof z.ZodError) {
return NextResponse.json({ error: error.errors[0].message }, { status: 400 })
}
logger.error('Error updating credential set', error)
return NextResponse.json({ error: 'Failed to update credential set' }, { status: 500 })
}
}
export async function DELETE(req: NextRequest, { params }: { params: Promise<{ id: string }> }) {
const session = await getSession()
if (!session?.user?.id) {
return NextResponse.json({ error: 'Unauthorized' }, { status: 401 })
}
// Check plan access (team/enterprise) or env var override
const hasAccess = await hasCredentialSetsAccess(session.user.id)
if (!hasAccess) {
return NextResponse.json(
{ error: 'Credential sets require a Team or Enterprise plan' },
{ status: 403 }
)
}
const { id } = await params
try {
const result = await getCredentialSetWithAccess(id, session.user.id)
if (!result) {
return NextResponse.json({ error: 'Credential set not found' }, { status: 404 })
}
if (result.role !== 'admin' && result.role !== 'owner') {
return NextResponse.json({ error: 'Admin or owner permissions required' }, { status: 403 })
}
await db.delete(credentialSetMember).where(eq(credentialSetMember.credentialSetId, id))
await db.delete(credentialSet).where(eq(credentialSet.id, id))
logger.info('Deleted credential set', { credentialSetId: id, userId: session.user.id })
return NextResponse.json({ success: true })
} catch (error) {
logger.error('Error deleting credential set', error)
return NextResponse.json({ error: 'Failed to delete credential set' }, { status: 500 })
}
}

View File

@@ -0,0 +1,53 @@
import { db } from '@sim/db'
import { credentialSet, credentialSetInvitation, organization, user } from '@sim/db/schema'
import { createLogger } from '@sim/logger'
import { and, eq, gt, isNull, or } from 'drizzle-orm'
import { NextResponse } from 'next/server'
import { getSession } from '@/lib/auth'
const logger = createLogger('CredentialSetInvitations')
export async function GET() {
const session = await getSession()
if (!session?.user?.id || !session?.user?.email) {
return NextResponse.json({ error: 'Unauthorized' }, { status: 401 })
}
try {
const invitations = await db
.select({
invitationId: credentialSetInvitation.id,
token: credentialSetInvitation.token,
status: credentialSetInvitation.status,
expiresAt: credentialSetInvitation.expiresAt,
createdAt: credentialSetInvitation.createdAt,
credentialSetId: credentialSet.id,
credentialSetName: credentialSet.name,
providerId: credentialSet.providerId,
organizationId: organization.id,
organizationName: organization.name,
invitedByName: user.name,
invitedByEmail: user.email,
})
.from(credentialSetInvitation)
.innerJoin(credentialSet, eq(credentialSetInvitation.credentialSetId, credentialSet.id))
.innerJoin(organization, eq(credentialSet.organizationId, organization.id))
.leftJoin(user, eq(credentialSetInvitation.invitedBy, user.id))
.where(
and(
or(
eq(credentialSetInvitation.email, session.user.email),
isNull(credentialSetInvitation.email)
),
eq(credentialSetInvitation.status, 'pending'),
gt(credentialSetInvitation.expiresAt, new Date())
)
)
return NextResponse.json({ invitations })
} catch (error) {
logger.error('Error fetching credential set invitations', error)
return NextResponse.json({ error: 'Failed to fetch invitations' }, { status: 500 })
}
}
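The WHERE clause above admits an invitation only when it is still pending, not yet expired, and either addressed to the caller's email or open (NULL email). The same predicate can be sketched as a pure function (hypothetical names, not code from this PR):

```typescript
interface PendingInvitation {
  email: string | null // null = open invitation, visible to any signed-in user
  status: 'pending' | 'accepted' | 'expired' | 'cancelled'
  expiresAt: Date
}

// Hypothetical in-memory mirror of the route's drizzle WHERE clause:
// (email matches OR email is null) AND status = 'pending' AND expiresAt > now
function isVisibleTo(inv: PendingInvitation, userEmail: string, now: Date = new Date()): boolean {
  const emailMatches = inv.email === null || inv.email === userEmail
  return emailMatches && inv.status === 'pending' && inv.expiresAt > now
}
```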


@@ -0,0 +1,196 @@
import { db } from '@sim/db'
import {
credentialSet,
credentialSetInvitation,
credentialSetMember,
organization,
} from '@sim/db/schema'
import { createLogger } from '@sim/logger'
import { and, eq } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { getSession } from '@/lib/auth'
import { syncAllWebhooksForCredentialSet } from '@/lib/webhooks/utils.server'
const logger = createLogger('CredentialSetInviteToken')
export async function GET(req: NextRequest, { params }: { params: Promise<{ token: string }> }) {
const { token } = await params
const [invitation] = await db
.select({
id: credentialSetInvitation.id,
credentialSetId: credentialSetInvitation.credentialSetId,
email: credentialSetInvitation.email,
status: credentialSetInvitation.status,
expiresAt: credentialSetInvitation.expiresAt,
credentialSetName: credentialSet.name,
providerId: credentialSet.providerId,
organizationId: credentialSet.organizationId,
organizationName: organization.name,
})
.from(credentialSetInvitation)
.innerJoin(credentialSet, eq(credentialSetInvitation.credentialSetId, credentialSet.id))
.innerJoin(organization, eq(credentialSet.organizationId, organization.id))
.where(eq(credentialSetInvitation.token, token))
.limit(1)
if (!invitation) {
return NextResponse.json({ error: 'Invitation not found' }, { status: 404 })
}
if (invitation.status !== 'pending') {
return NextResponse.json({ error: 'Invitation is no longer valid' }, { status: 410 })
}
if (new Date() > invitation.expiresAt) {
await db
.update(credentialSetInvitation)
.set({ status: 'expired' })
.where(eq(credentialSetInvitation.id, invitation.id))
return NextResponse.json({ error: 'Invitation has expired' }, { status: 410 })
}
return NextResponse.json({
invitation: {
credentialSetName: invitation.credentialSetName,
organizationName: invitation.organizationName,
providerId: invitation.providerId,
email: invitation.email,
},
})
}
export async function POST(req: NextRequest, { params }: { params: Promise<{ token: string }> }) {
const { token } = await params
const session = await getSession()
if (!session?.user?.id) {
return NextResponse.json({ error: 'Authentication required' }, { status: 401 })
}
try {
const [invitationData] = await db
.select({
id: credentialSetInvitation.id,
credentialSetId: credentialSetInvitation.credentialSetId,
email: credentialSetInvitation.email,
status: credentialSetInvitation.status,
expiresAt: credentialSetInvitation.expiresAt,
invitedBy: credentialSetInvitation.invitedBy,
providerId: credentialSet.providerId,
})
.from(credentialSetInvitation)
.innerJoin(credentialSet, eq(credentialSetInvitation.credentialSetId, credentialSet.id))
.where(eq(credentialSetInvitation.token, token))
.limit(1)
if (!invitationData) {
return NextResponse.json({ error: 'Invitation not found' }, { status: 404 })
}
const invitation = invitationData
if (invitation.status !== 'pending') {
return NextResponse.json({ error: 'Invitation is no longer valid' }, { status: 410 })
}
if (new Date() > invitation.expiresAt) {
await db
.update(credentialSetInvitation)
.set({ status: 'expired' })
.where(eq(credentialSetInvitation.id, invitation.id))
return NextResponse.json({ error: 'Invitation has expired' }, { status: 410 })
}
const existingMember = await db
.select()
.from(credentialSetMember)
.where(
and(
eq(credentialSetMember.credentialSetId, invitation.credentialSetId),
eq(credentialSetMember.userId, session.user.id)
)
)
.limit(1)
if (existingMember.length > 0) {
return NextResponse.json(
{ error: 'Already a member of this credential set' },
{ status: 409 }
)
}
const now = new Date()
const requestId = crypto.randomUUID().slice(0, 8)
// Use transaction to ensure membership + invitation update + webhook sync are atomic
await db.transaction(async (tx) => {
await tx.insert(credentialSetMember).values({
id: crypto.randomUUID(),
credentialSetId: invitation.credentialSetId,
userId: session.user.id,
status: 'active',
joinedAt: now,
invitedBy: invitation.invitedBy,
createdAt: now,
updatedAt: now,
})
await tx
.update(credentialSetInvitation)
.set({
status: 'accepted',
acceptedAt: now,
acceptedByUserId: session.user.id,
})
.where(eq(credentialSetInvitation.id, invitation.id))
// Clean up all other pending invitations for the same credential set and email
// This prevents duplicate invites from showing up after accepting one
if (invitation.email) {
await tx
.update(credentialSetInvitation)
.set({
status: 'accepted',
acceptedAt: now,
acceptedByUserId: session.user.id,
})
.where(
and(
eq(credentialSetInvitation.credentialSetId, invitation.credentialSetId),
eq(credentialSetInvitation.email, invitation.email),
eq(credentialSetInvitation.status, 'pending')
)
)
}
// Sync webhooks within the transaction
const syncResult = await syncAllWebhooksForCredentialSet(
invitation.credentialSetId,
requestId,
tx
)
logger.info('Synced webhooks after member joined', {
credentialSetId: invitation.credentialSetId,
...syncResult,
})
})
logger.info('Accepted credential set invitation', {
invitationId: invitation.id,
credentialSetId: invitation.credentialSetId,
userId: session.user.id,
})
return NextResponse.json({
success: true,
credentialSetId: invitation.credentialSetId,
providerId: invitation.providerId,
})
} catch (error) {
logger.error('Error accepting invitation', error)
return NextResponse.json({ error: 'Failed to accept invitation' }, { status: 500 })
}
}


@@ -0,0 +1,115 @@
import { db } from '@sim/db'
import { credentialSet, credentialSetMember, organization } from '@sim/db/schema'
import { createLogger } from '@sim/logger'
import { and, eq } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { getSession } from '@/lib/auth'
import { syncAllWebhooksForCredentialSet } from '@/lib/webhooks/utils.server'
const logger = createLogger('CredentialSetMemberships')
export async function GET() {
const session = await getSession()
if (!session?.user?.id) {
return NextResponse.json({ error: 'Unauthorized' }, { status: 401 })
}
try {
const memberships = await db
.select({
membershipId: credentialSetMember.id,
status: credentialSetMember.status,
joinedAt: credentialSetMember.joinedAt,
credentialSetId: credentialSet.id,
credentialSetName: credentialSet.name,
credentialSetDescription: credentialSet.description,
providerId: credentialSet.providerId,
organizationId: organization.id,
organizationName: organization.name,
})
.from(credentialSetMember)
.innerJoin(credentialSet, eq(credentialSetMember.credentialSetId, credentialSet.id))
.innerJoin(organization, eq(credentialSet.organizationId, organization.id))
.where(eq(credentialSetMember.userId, session.user.id))
return NextResponse.json({ memberships })
} catch (error) {
logger.error('Error fetching credential set memberships', error)
return NextResponse.json({ error: 'Failed to fetch memberships' }, { status: 500 })
}
}
/**
* Leave a credential set (self-revocation).
* Sets status to 'revoked' immediately (blocks execution), then syncs webhooks to clean up.
*/
export async function DELETE(req: NextRequest) {
const session = await getSession()
if (!session?.user?.id) {
return NextResponse.json({ error: 'Unauthorized' }, { status: 401 })
}
const { searchParams } = new URL(req.url)
const credentialSetId = searchParams.get('credentialSetId')
if (!credentialSetId) {
return NextResponse.json({ error: 'credentialSetId is required' }, { status: 400 })
}
try {
const requestId = crypto.randomUUID().slice(0, 8)
// Use transaction to ensure revocation + webhook sync are atomic
await db.transaction(async (tx) => {
// Find and verify membership
const [membership] = await tx
.select()
.from(credentialSetMember)
.where(
and(
eq(credentialSetMember.credentialSetId, credentialSetId),
eq(credentialSetMember.userId, session.user.id)
)
)
.limit(1)
if (!membership) {
throw new Error('Not a member of this credential set')
}
if (membership.status === 'revoked') {
throw new Error('Already left this credential set')
}
// Set status to 'revoked' - this immediately blocks credential from being used
await tx
.update(credentialSetMember)
.set({
status: 'revoked',
updatedAt: new Date(),
})
.where(eq(credentialSetMember.id, membership.id))
// Sync webhooks to remove this user's credential webhooks
const syncResult = await syncAllWebhooksForCredentialSet(credentialSetId, requestId, tx)
logger.info('Synced webhooks after member left', {
credentialSetId,
userId: session.user.id,
...syncResult,
})
})
logger.info('User left credential set', {
credentialSetId,
userId: session.user.id,
})
return NextResponse.json({ success: true })
} catch (error) {
const message = error instanceof Error ? error.message : 'Failed to leave credential set'
logger.error('Error leaving credential set', error)
return NextResponse.json({ error: message }, { status: 500 })
}
}


@@ -0,0 +1,176 @@
import { db } from '@sim/db'
import { credentialSet, credentialSetMember, member, organization, user } from '@sim/db/schema'
import { createLogger } from '@sim/logger'
import { and, count, desc, eq } from 'drizzle-orm'
import { NextResponse } from 'next/server'
import { z } from 'zod'
import { getSession } from '@/lib/auth'
import { hasCredentialSetsAccess } from '@/lib/billing'
const logger = createLogger('CredentialSets')
const createCredentialSetSchema = z.object({
organizationId: z.string().min(1),
name: z.string().trim().min(1).max(100),
description: z.string().max(500).optional(),
providerId: z.enum(['google-email', 'outlook']),
})
export async function GET(req: Request) {
const session = await getSession()
if (!session?.user?.id) {
return NextResponse.json({ error: 'Unauthorized' }, { status: 401 })
}
// Check plan access (team/enterprise) or env var override
const hasAccess = await hasCredentialSetsAccess(session.user.id)
if (!hasAccess) {
return NextResponse.json(
{ error: 'Credential sets require a Team or Enterprise plan' },
{ status: 403 }
)
}
const { searchParams } = new URL(req.url)
const organizationId = searchParams.get('organizationId')
if (!organizationId) {
return NextResponse.json({ error: 'organizationId is required' }, { status: 400 })
}
const membership = await db
.select({ id: member.id, role: member.role })
.from(member)
.where(and(eq(member.userId, session.user.id), eq(member.organizationId, organizationId)))
.limit(1)
if (membership.length === 0) {
return NextResponse.json({ error: 'Forbidden' }, { status: 403 })
}
const sets = await db
.select({
id: credentialSet.id,
name: credentialSet.name,
description: credentialSet.description,
providerId: credentialSet.providerId,
createdBy: credentialSet.createdBy,
createdAt: credentialSet.createdAt,
updatedAt: credentialSet.updatedAt,
creatorName: user.name,
creatorEmail: user.email,
})
.from(credentialSet)
.leftJoin(user, eq(credentialSet.createdBy, user.id))
.where(eq(credentialSet.organizationId, organizationId))
.orderBy(desc(credentialSet.createdAt))
const setsWithCounts = await Promise.all(
sets.map(async (set) => {
const [memberCount] = await db
.select({ count: count() })
.from(credentialSetMember)
.where(
and(
eq(credentialSetMember.credentialSetId, set.id),
eq(credentialSetMember.status, 'active')
)
)
return {
...set,
memberCount: memberCount?.count ?? 0,
}
})
)
return NextResponse.json({ credentialSets: setsWithCounts })
}
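The GET handler above issues one COUNT query per credential set to attach active-member counts. The same tally can be expressed as a single pass over a flat membership result set (a hypothetical sketch, not code from this PR):

```typescript
// Hypothetical: tally active members per credential set from one flat
// result set, instead of issuing a COUNT query per set.
function countActiveMembers(rows: { credentialSetId: string; status: string }[]): Map<string, number> {
  const counts = new Map<string, number>()
  for (const row of rows) {
    if (row.status !== 'active') continue // only 'active' memberships count
    counts.set(row.credentialSetId, (counts.get(row.credentialSetId) ?? 0) + 1)
  }
  return counts
}
```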
export async function POST(req: Request) {
const session = await getSession()
if (!session?.user?.id) {
return NextResponse.json({ error: 'Unauthorized' }, { status: 401 })
}
// Check plan access (team/enterprise) or env var override
const hasAccess = await hasCredentialSetsAccess(session.user.id)
if (!hasAccess) {
return NextResponse.json(
{ error: 'Credential sets require a Team or Enterprise plan' },
{ status: 403 }
)
}
try {
const body = await req.json()
const { organizationId, name, description, providerId } = createCredentialSetSchema.parse(body)
const membership = await db
.select({ id: member.id, role: member.role })
.from(member)
.where(and(eq(member.userId, session.user.id), eq(member.organizationId, organizationId)))
.limit(1)
const role = membership[0]?.role
if (membership.length === 0 || (role !== 'admin' && role !== 'owner')) {
return NextResponse.json(
{ error: 'Admin or owner permissions required to create credential sets' },
{ status: 403 }
)
}
const orgExists = await db
.select({ id: organization.id })
.from(organization)
.where(eq(organization.id, organizationId))
.limit(1)
if (orgExists.length === 0) {
return NextResponse.json({ error: 'Organization not found' }, { status: 404 })
}
const existingSet = await db
.select({ id: credentialSet.id })
.from(credentialSet)
.where(and(eq(credentialSet.organizationId, organizationId), eq(credentialSet.name, name)))
.limit(1)
if (existingSet.length > 0) {
return NextResponse.json(
{ error: 'A credential set with this name already exists' },
{ status: 409 }
)
}
const now = new Date()
const newCredentialSet = {
id: crypto.randomUUID(),
organizationId,
name,
description: description || null,
providerId,
createdBy: session.user.id,
createdAt: now,
updatedAt: now,
}
await db.insert(credentialSet).values(newCredentialSet)
logger.info('Created credential set', {
credentialSetId: newCredentialSet.id,
organizationId,
userId: session.user.id,
})
return NextResponse.json({ credentialSet: newCredentialSet }, { status: 201 })
} catch (error) {
if (error instanceof z.ZodError) {
return NextResponse.json({ error: error.errors[0].message }, { status: 400 })
}
logger.error('Error creating credential set', error)
return NextResponse.json({ error: 'Failed to create credential set' }, { status: 500 })
}
}
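createCredentialSetSchema constrains the payload to a non-empty organization ID, a trimmed name of 1-100 characters, an optional description of at most 500 characters, and one of two provider IDs. A dependency-free mirror of those rules (hypothetical, for illustration; the route itself relies on zod):

```typescript
interface CreateCredentialSetInput {
  organizationId: string
  name: string
  description?: string
  providerId: 'google-email' | 'outlook'
}

// Hypothetical hand-rolled mirror of createCredentialSetSchema's constraints.
// Returns the normalized input on success, or an error message string.
function validateCreateInput(body: Record<string, unknown>): CreateCredentialSetInput | string {
  const { organizationId, name, description, providerId } = body as Partial<CreateCredentialSetInput>
  if (typeof organizationId !== 'string' || organizationId.length < 1) return 'organizationId is required'
  const trimmed = typeof name === 'string' ? name.trim() : ''
  if (trimmed.length < 1 || trimmed.length > 100) return 'name must be 1-100 characters'
  if (description !== undefined && (typeof description !== 'string' || description.length > 500))
    return 'description must be at most 500 characters'
  if (providerId !== 'google-email' && providerId !== 'outlook') return 'unsupported providerId'
  return { organizationId, name: trimmed, description, providerId }
}
```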


@@ -198,15 +198,14 @@ export async function POST(req: NextRequest, { params }: { params: Promise<{ id:
`[${requestId}] Starting controlled async processing of ${createdDocuments.length} documents`
)
// Track bulk document upload
try {
- const { trackPlatformEvent } = await import('@/lib/core/telemetry')
- trackPlatformEvent('platform.knowledge_base.documents_uploaded', {
- 'knowledge_base.id': knowledgeBaseId,
- 'documents.count': createdDocuments.length,
- 'documents.upload_type': 'bulk',
- 'processing.chunk_size': validatedData.processingOptions.chunkSize,
- 'processing.recipe': validatedData.processingOptions.recipe,
+ const { PlatformEvents } = await import('@/lib/core/telemetry')
+ PlatformEvents.knowledgeBaseDocumentsUploaded({
+ knowledgeBaseId,
+ documentsCount: createdDocuments.length,
+ uploadType: 'bulk',
+ chunkSize: validatedData.processingOptions.chunkSize,
+ recipe: validatedData.processingOptions.recipe,
})
} catch (_e) {
// Silently fail
@@ -262,15 +261,14 @@ export async function POST(req: NextRequest, { params }: { params: Promise<{ id:
userId
)
// Track single document upload
try {
- const { trackPlatformEvent } = await import('@/lib/core/telemetry')
- trackPlatformEvent('platform.knowledge_base.documents_uploaded', {
- 'knowledge_base.id': knowledgeBaseId,
- 'documents.count': 1,
- 'documents.upload_type': 'single',
- 'document.mime_type': validatedData.mimeType,
- 'document.file_size': validatedData.fileSize,
+ const { PlatformEvents } = await import('@/lib/core/telemetry')
+ PlatformEvents.knowledgeBaseDocumentsUploaded({
+ knowledgeBaseId,
+ documentsCount: 1,
+ uploadType: 'single',
+ mimeType: validatedData.mimeType,
+ fileSize: validatedData.fileSize,
})
} catch (_e) {
// Silently fail


@@ -2,6 +2,7 @@ import { createLogger } from '@sim/logger'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { getSession } from '@/lib/auth'
import { PlatformEvents } from '@/lib/core/telemetry'
import { generateRequestId } from '@/lib/core/utils/request'
import {
deleteKnowledgeBase,
@@ -183,6 +184,14 @@ export async function DELETE(
await deleteKnowledgeBase(id, requestId)
try {
PlatformEvents.knowledgeBaseDeleted({
knowledgeBaseId: id,
})
} catch {
// Telemetry should not fail the operation
}
logger.info(`[${requestId}] Knowledge base deleted: ${id} for user ${session.user.id}`)
return NextResponse.json({


@@ -2,6 +2,7 @@ import { createLogger } from '@sim/logger'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { getSession } from '@/lib/auth'
import { PlatformEvents } from '@/lib/core/telemetry'
import { generateRequestId } from '@/lib/core/utils/request'
import { createKnowledgeBase, getKnowledgeBases } from '@/lib/knowledge/service'
@@ -94,6 +95,16 @@ export async function POST(req: NextRequest) {
const newKnowledgeBase = await createKnowledgeBase(createData, requestId)
try {
PlatformEvents.knowledgeBaseCreated({
knowledgeBaseId: newKnowledgeBase.id,
name: validatedData.name,
workspaceId: validatedData.workspaceId,
})
} catch {
// Telemetry should not fail the operation
}
logger.info(
`[${requestId}] Knowledge base created: ${newKnowledgeBase.id} for user ${session.user.id}`
)


@@ -5,6 +5,7 @@
*
* @vitest-environment node
*/
import { createEnvMock } from '@sim/testing'
import { afterEach, beforeEach, describe, expect, it, vi } from 'vitest'
import {
createMockRequest,
@@ -26,13 +27,7 @@ vi.mock('drizzle-orm', () => ({
mockKnowledgeSchemas()
- vi.mock('@/lib/core/config/env', () => ({
- env: {
- OPENAI_API_KEY: 'test-api-key',
- },
- isTruthy: (value: string | boolean | number | undefined) =>
- typeof value === 'string' ? value === 'true' || value === '1' : Boolean(value),
- }))
+ vi.mock('@/lib/core/config/env', () => createEnvMock({ OPENAI_API_KEY: 'test-api-key' }))
vi.mock('@/lib/core/utils/request', () => ({
generateRequestId: vi.fn(() => 'test-request-id'),


@@ -1,6 +1,7 @@
import { createLogger } from '@sim/logger'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { PlatformEvents } from '@/lib/core/telemetry'
import { generateRequestId } from '@/lib/core/utils/request'
import { ALL_TAG_SLOTS } from '@/lib/knowledge/constants'
import { getDocumentTagDefinitions } from '@/lib/knowledge/tags/service'
@@ -294,6 +295,16 @@ export async function POST(request: NextRequest) {
const documentIds = results.map((result) => result.documentId)
const documentNameMap = await getDocumentNamesByIds(documentIds)
try {
PlatformEvents.knowledgeBaseSearched({
knowledgeBaseId: accessibleKbIds[0],
resultsCount: results.length,
workspaceId: workspaceId || undefined,
})
} catch {
// Telemetry should not fail the operation
}
return NextResponse.json({
success: true,
data: {


@@ -4,6 +4,7 @@
*
* @vitest-environment node
*/
import { createEnvMock } from '@sim/testing'
import { beforeEach, describe, expect, it, vi } from 'vitest'
vi.mock('drizzle-orm')
@@ -30,12 +31,7 @@ vi.stubGlobal(
})
)
- vi.mock('@/lib/core/config/env', () => ({
- env: {},
- getEnv: (key: string) => process.env[key],
- isTruthy: (value: string | boolean | number | undefined) =>
- typeof value === 'string' ? value === 'true' || value === '1' : Boolean(value),
- }))
+ vi.mock('@/lib/core/config/env', () => createEnvMock())
import {
generateSearchEmbedding,


@@ -6,6 +6,7 @@
* This file contains unit tests for the knowledge base utility functions,
* including access checks, document processing, and embedding generation.
*/
import { createEnvMock } from '@sim/testing'
import { beforeEach, describe, expect, it, vi } from 'vitest'
vi.mock('drizzle-orm', () => ({
@@ -15,12 +16,7 @@ vi.mock('drizzle-orm', () => ({
sql: (strings: TemplateStringsArray, ...expr: any[]) => ({ strings, expr }),
}))
- vi.mock('@/lib/core/config/env', () => ({
- env: { OPENAI_API_KEY: 'test-key' },
- getEnv: (key: string) => process.env[key],
- isTruthy: (value: string | boolean | number | undefined) =>
- typeof value === 'string' ? value === 'true' || value === '1' : Boolean(value),
- }))
+ vi.mock('@/lib/core/config/env', () => createEnvMock({ OPENAI_API_KEY: 'test-key' }))
vi.mock('@/lib/knowledge/documents/utils', () => ({
retryWithExponentialBackoff: (fn: any) => fn(),


@@ -140,12 +140,12 @@ export const POST = withMcpAuth('write')(
)
try {
- const { trackPlatformEvent } = await import('@/lib/core/telemetry')
- trackPlatformEvent('platform.mcp.server_added', {
- 'mcp.server_id': serverId,
- 'mcp.server_name': body.name,
- 'mcp.transport': body.transport,
- 'workspace.id': workspaceId,
+ const { PlatformEvents } = await import('@/lib/core/telemetry')
+ PlatformEvents.mcpServerAdded({
+ serverId,
+ serverName: body.name,
+ transport: body.transport,
+ workspaceId,
})
} catch (_e) {
// Silently fail


@@ -194,12 +194,12 @@ export const POST = withMcpAuth('read')(
logger.info(`[${requestId}] Successfully executed tool ${toolName} on server ${serverId}`)
try {
- const { trackPlatformEvent } = await import('@/lib/core/telemetry')
- trackPlatformEvent('platform.mcp.tool_executed', {
- 'mcp.server_id': serverId,
- 'mcp.tool_name': toolName,
- 'mcp.execution_status': 'success',
- 'workspace.id': workspaceId,
+ const { PlatformEvents } = await import('@/lib/core/telemetry')
+ PlatformEvents.mcpToolExecuted({
+ serverId,
+ toolName,
+ status: 'success',
+ workspaceId,
})
} catch {
// Telemetry failure is non-critical


@@ -15,8 +15,11 @@ import { createLogger } from '@sim/logger'
import { and, eq } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { getEmailSubject, renderInvitationEmail } from '@/components/emails'
import { getSession } from '@/lib/auth'
import { requireStripeClient } from '@/lib/billing/stripe-client'
import { getBaseUrl } from '@/lib/core/utils/urls'
import { sendEmail } from '@/lib/messaging/email/mailer'
const logger = createLogger('OrganizationInvitation')
@@ -69,6 +72,102 @@ export async function GET(
}
}
// Resend invitation
export async function POST(
_request: NextRequest,
{ params }: { params: Promise<{ id: string; invitationId: string }> }
) {
const { id: organizationId, invitationId } = await params
const session = await getSession()
if (!session?.user?.id) {
return NextResponse.json({ error: 'Unauthorized' }, { status: 401 })
}
try {
// Verify user is admin/owner
const memberEntry = await db
.select()
.from(member)
.where(and(eq(member.organizationId, organizationId), eq(member.userId, session.user.id)))
.limit(1)
if (memberEntry.length === 0 || !['owner', 'admin'].includes(memberEntry[0].role)) {
return NextResponse.json({ error: 'Forbidden - Admin access required' }, { status: 403 })
}
const orgInvitation = await db
.select()
.from(invitation)
.where(and(eq(invitation.id, invitationId), eq(invitation.organizationId, organizationId)))
.then((rows) => rows[0])
if (!orgInvitation) {
return NextResponse.json({ error: 'Invitation not found' }, { status: 404 })
}
if (orgInvitation.status !== 'pending') {
return NextResponse.json({ error: 'Can only resend pending invitations' }, { status: 400 })
}
const org = await db
.select({ name: organization.name })
.from(organization)
.where(eq(organization.id, organizationId))
.then((rows) => rows[0])
const inviter = await db
.select({ name: user.name })
.from(user)
.where(eq(user.id, session.user.id))
.limit(1)
// Update expiration date
const newExpiresAt = new Date(Date.now() + 7 * 24 * 60 * 60 * 1000) // 7 days
await db
.update(invitation)
.set({ expiresAt: newExpiresAt })
.where(eq(invitation.id, invitationId))
// Send email
const emailHtml = await renderInvitationEmail(
inviter[0]?.name || 'Someone',
org?.name || 'organization',
`${getBaseUrl()}/invite/${invitationId}`
)
const emailResult = await sendEmail({
to: orgInvitation.email,
subject: getEmailSubject('invitation'),
html: emailHtml,
emailType: 'transactional',
})
if (!emailResult.success) {
logger.error('Failed to resend invitation email', {
email: orgInvitation.email,
error: emailResult.message,
})
return NextResponse.json({ error: 'Failed to send invitation email' }, { status: 500 })
}
logger.info('Organization invitation resent', {
organizationId,
invitationId,
resentBy: session.user.id,
email: orgInvitation.email,
})
return NextResponse.json({
success: true,
message: 'Invitation resent successfully',
})
} catch (error) {
logger.error('Error resending organization invitation:', error)
return NextResponse.json({ error: 'Failed to resend invitation' }, { status: 500 })
}
}
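Resending bumps the invitation's expiry to seven days out from the moment of the resend. The arithmetic the route performs for newExpiresAt reduces to (a trivial sketch, names hypothetical):

```typescript
const SEVEN_DAYS_MS = 7 * 24 * 60 * 60 * 1000

// Mirrors the route's newExpiresAt computation: 7 days from "now".
function renewExpiry(now: Date): Date {
  return new Date(now.getTime() + SEVEN_DAYS_MS)
}
```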
export async function PUT(
req: NextRequest,
{ params }: { params: Promise<{ id: string; invitationId: string }> }


@@ -41,6 +41,9 @@ export async function POST(request: NextRequest) {
vertexProject,
vertexLocation,
vertexCredential,
bedrockAccessKeyId,
bedrockSecretKey,
bedrockRegion,
responseFormat,
workflowId,
workspaceId,
@@ -67,6 +70,9 @@ export async function POST(request: NextRequest) {
hasVertexProject: !!vertexProject,
hasVertexLocation: !!vertexLocation,
hasVertexCredential: !!vertexCredential,
hasBedrockAccessKeyId: !!bedrockAccessKeyId,
hasBedrockSecretKey: !!bedrockSecretKey,
hasBedrockRegion: !!bedrockRegion,
hasResponseFormat: !!responseFormat,
workflowId,
stream: !!stream,
@@ -116,6 +122,9 @@ export async function POST(request: NextRequest) {
azureApiVersion,
vertexProject,
vertexLocation,
bedrockAccessKeyId,
bedrockSecretKey,
bedrockRegion,
responseFormat,
workflowId,
workspaceId,


@@ -168,18 +168,15 @@ export async function POST(request: NextRequest, { params }: { params: Promise<{
`[${requestId}] Successfully used template: ${id}, created workflow: ${newWorkflowId}`
)
// Track template usage
try {
- const { trackPlatformEvent } = await import('@/lib/core/telemetry')
+ const { PlatformEvents } = await import('@/lib/core/telemetry')
const templateState = templateData.state as any
- trackPlatformEvent('platform.template.used', {
- 'template.id': id,
- 'template.name': templateData.name,
- 'workflow.created_id': newWorkflowId,
- 'workflow.blocks_count': templateState?.blocks
- ? Object.keys(templateState.blocks).length
- : 0,
- 'workspace.id': workspaceId,
+ PlatformEvents.templateUsed({
+ templateId: id,
+ templateName: templateData.name,
+ newWorkflowId,
+ blocksCount: templateState?.blocks ? Object.keys(templateState.blocks).length : 0,
+ workspaceId,
})
} catch (_e) {
// Silently fail


@@ -0,0 +1,199 @@
/**
* Admin BYOK Keys API
*
* GET /api/v1/admin/byok
* List all BYOK keys with optional filtering.
*
* Query Parameters:
* - organizationId?: string - Filter by organization ID (finds all workspaces billed to this org)
* - workspaceId?: string - Filter by specific workspace ID
*
* Response: { data: AdminBYOKKey[], pagination: PaginationMeta }
*
* DELETE /api/v1/admin/byok
* Delete BYOK keys for an organization or workspace.
* Used when an enterprise plan churns to clean up BYOK keys.
*
* Query Parameters:
* - organizationId: string - Delete all BYOK keys for workspaces billed to this org
* - workspaceId?: string - Delete keys for a specific workspace only (optional)
*
* Response: { success: true, deletedCount: number, workspacesAffected: string[] }
*/
import { db } from '@sim/db'
import { user, workspace, workspaceBYOKKeys } from '@sim/db/schema'
import { createLogger } from '@sim/logger'
import { eq, inArray, sql } from 'drizzle-orm'
import { withAdminAuth } from '@/app/api/v1/admin/middleware'
import {
badRequestResponse,
internalErrorResponse,
singleResponse,
} from '@/app/api/v1/admin/responses'
const logger = createLogger('AdminBYOKAPI')
export interface AdminBYOKKey {
id: string
workspaceId: string
workspaceName: string
organizationId: string
providerId: string
createdAt: string
createdByUserId: string | null
createdByEmail: string | null
}
export const GET = withAdminAuth(async (request) => {
const url = new URL(request.url)
const organizationId = url.searchParams.get('organizationId')
const workspaceId = url.searchParams.get('workspaceId')
try {
let workspaceIds: string[] = []
if (workspaceId) {
workspaceIds = [workspaceId]
} else if (organizationId) {
const workspaces = await db
.select({ id: workspace.id })
.from(workspace)
.where(eq(workspace.billedAccountUserId, organizationId))
workspaceIds = workspaces.map((w) => w.id)
}
const query = db
.select({
id: workspaceBYOKKeys.id,
workspaceId: workspaceBYOKKeys.workspaceId,
workspaceName: workspace.name,
organizationId: workspace.billedAccountUserId,
providerId: workspaceBYOKKeys.providerId,
createdAt: workspaceBYOKKeys.createdAt,
createdByUserId: workspaceBYOKKeys.createdBy,
createdByEmail: user.email,
})
.from(workspaceBYOKKeys)
.innerJoin(workspace, eq(workspaceBYOKKeys.workspaceId, workspace.id))
.leftJoin(user, eq(workspaceBYOKKeys.createdBy, user.id))
let keys
if (workspaceIds.length > 0) {
keys = await query.where(inArray(workspaceBYOKKeys.workspaceId, workspaceIds))
} else {
keys = await query
}
const formattedKeys: AdminBYOKKey[] = keys.map((k) => ({
id: k.id,
workspaceId: k.workspaceId,
workspaceName: k.workspaceName,
organizationId: k.organizationId,
providerId: k.providerId,
createdAt: k.createdAt.toISOString(),
createdByUserId: k.createdByUserId,
createdByEmail: k.createdByEmail,
}))
logger.info('Admin API: Listed BYOK keys', {
organizationId,
workspaceId,
count: formattedKeys.length,
})
return singleResponse({
data: formattedKeys,
pagination: {
total: formattedKeys.length,
limit: formattedKeys.length,
offset: 0,
hasMore: false,
},
})
} catch (error) {
logger.error('Admin API: Failed to list BYOK keys', { error, organizationId, workspaceId })
return internalErrorResponse('Failed to list BYOK keys')
}
})
export const DELETE = withAdminAuth(async (request) => {
const url = new URL(request.url)
const organizationId = url.searchParams.get('organizationId')
const workspaceId = url.searchParams.get('workspaceId')
const reason = url.searchParams.get('reason') || 'Enterprise plan churn cleanup'
if (!organizationId && !workspaceId) {
return badRequestResponse('Either organizationId or workspaceId is required')
}
try {
let workspaceIds: string[] = []
if (workspaceId) {
workspaceIds = [workspaceId]
} else if (organizationId) {
const workspaces = await db
.select({ id: workspace.id })
.from(workspace)
.where(eq(workspace.billedAccountUserId, organizationId))
workspaceIds = workspaces.map((w) => w.id)
}
if (workspaceIds.length === 0) {
logger.info('Admin API: No workspaces found for BYOK cleanup', {
organizationId,
workspaceId,
})
return singleResponse({
success: true,
deletedCount: 0,
workspacesAffected: [],
message: 'No workspaces found for the given organization/workspace ID',
})
}
const countResult = await db
.select({ count: sql<number>`count(*)` })
.from(workspaceBYOKKeys)
.where(inArray(workspaceBYOKKeys.workspaceId, workspaceIds))
const totalToDelete = Number(countResult[0]?.count ?? 0)
if (totalToDelete === 0) {
logger.info('Admin API: No BYOK keys to delete', {
organizationId,
workspaceId,
workspaceIds,
})
return singleResponse({
success: true,
deletedCount: 0,
workspacesAffected: [],
message: 'No BYOK keys found for the specified workspaces',
})
}
await db.delete(workspaceBYOKKeys).where(inArray(workspaceBYOKKeys.workspaceId, workspaceIds))
logger.info('Admin API: Deleted BYOK keys', {
organizationId,
workspaceId,
workspaceIds,
deletedCount: totalToDelete,
reason,
})
return singleResponse({
success: true,
deletedCount: totalToDelete,
workspacesAffected: workspaceIds,
reason,
})
} catch (error) {
logger.error('Admin API: Failed to delete BYOK keys', { error, organizationId, workspaceId })
return internalErrorResponse('Failed to delete BYOK keys')
}
})

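Reviewer note: the DELETE handler above requires at least one of `organizationId` or `workspaceId` and defaults `reason` when omitted. A minimal sketch of building a matching request URL from a client — `buildByokCleanupUrl` is a hypothetical helper, not part of the codebase; only the endpoint path (`DELETE /api/v1/admin/byok`) and parameter names come from the route:

```typescript
// Hypothetical client-side helper mirroring the route's query-param contract.
function buildByokCleanupUrl(
  base: string,
  opts: { organizationId?: string; workspaceId?: string; reason?: string }
): string {
  // Same validation as the handler: one of the two IDs is mandatory.
  if (!opts.organizationId && !opts.workspaceId) {
    throw new Error('Either organizationId or workspaceId is required')
  }
  const url = new URL('/api/v1/admin/byok', base)
  if (opts.organizationId) url.searchParams.set('organizationId', opts.organizationId)
  if (opts.workspaceId) url.searchParams.set('workspaceId', opts.workspaceId)
  // Falls back to the same default reason the handler uses.
  url.searchParams.set('reason', opts.reason ?? 'Enterprise plan churn cleanup')
  return url.toString()
}
```

The URL would then be sent with `fetch(url, { method: 'DELETE', headers: { /* admin auth */ } })`.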
View File

@@ -51,6 +51,10 @@
* GET /api/v1/admin/subscriptions - List all subscriptions
* GET /api/v1/admin/subscriptions/:id - Get subscription details
* DELETE /api/v1/admin/subscriptions/:id - Cancel subscription (?atPeriodEnd=true for scheduled)
*
* BYOK Keys:
* GET /api/v1/admin/byok - List BYOK keys (?organizationId=X or ?workspaceId=X)
* DELETE /api/v1/admin/byok - Delete BYOK keys for org/workspace
*/
export type { AdminAuthFailure, AdminAuthResult, AdminAuthSuccess } from '@/app/api/v1/admin/auth'

View File

@@ -1,10 +1,11 @@
import { db } from '@sim/db'
import { webhook, workflow } from '@sim/db/schema'
import { createLogger } from '@sim/logger'
import { and, eq } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { getSession } from '@/lib/auth'
import { validateInteger } from '@/lib/core/security/input-validation'
import { PlatformEvents } from '@/lib/core/telemetry'
import { generateRequestId } from '@/lib/core/utils/request'
import { getUserEntityPermissions } from '@/lib/workspaces/permissions/utils'
@@ -184,16 +185,28 @@ export async function PATCH(request: NextRequest, { params }: { params: Promise<
hasFailedCountUpdate: failedCount !== undefined,
})
// Update the webhook
// Merge providerConfig to preserve credential-related fields
let finalProviderConfig = webhooks[0].webhook.providerConfig
if (providerConfig !== undefined) {
const existingConfig = (webhooks[0].webhook.providerConfig as Record<string, unknown>) || {}
finalProviderConfig = {
...resolvedProviderConfig,
credentialId: existingConfig.credentialId,
credentialSetId: existingConfig.credentialSetId,
userId: existingConfig.userId,
historyId: existingConfig.historyId,
lastCheckedTimestamp: existingConfig.lastCheckedTimestamp,
setupCompleted: existingConfig.setupCompleted,
externalId: existingConfig.externalId,
}
}
const updatedWebhook = await db
.update(webhook)
.set({
path: path !== undefined ? path : webhooks[0].webhook.path,
provider: provider !== undefined ? provider : webhooks[0].webhook.provider,
providerConfig: finalProviderConfig,
isActive: isActive !== undefined ? isActive : webhooks[0].webhook.isActive,
failedCount: failedCount !== undefined ? failedCount : webhooks[0].webhook.failedCount,
updatedAt: new Date(),
@@ -276,13 +289,67 @@ export async function DELETE(
}
const foundWebhook = webhookData.webhook
const { cleanupExternalWebhook } = await import('@/lib/webhooks/provider-subscriptions')
const providerConfig = foundWebhook.providerConfig as Record<string, unknown> | null
const credentialSetId = providerConfig?.credentialSetId as string | undefined
const blockId = providerConfig?.blockId as string | undefined
if (credentialSetId && blockId) {
const allCredentialSetWebhooks = await db
.select()
.from(webhook)
.where(and(eq(webhook.workflowId, webhookData.workflow.id), eq(webhook.blockId, blockId)))
const webhooksToDelete = allCredentialSetWebhooks.filter((w) => {
const config = w.providerConfig as Record<string, unknown> | null
return config?.credentialSetId === credentialSetId
})
for (const w of webhooksToDelete) {
await cleanupExternalWebhook(w, webhookData.workflow, requestId)
}
const idsToDelete = webhooksToDelete.map((w) => w.id)
for (const wId of idsToDelete) {
await db.delete(webhook).where(eq(webhook.id, wId))
}
try {
for (const wId of idsToDelete) {
PlatformEvents.webhookDeleted({
webhookId: wId,
workflowId: webhookData.workflow.id,
})
}
} catch {
// Telemetry should not fail the operation
}
logger.info(
`[${requestId}] Successfully deleted ${idsToDelete.length} webhooks for credential set`,
{
credentialSetId,
blockId,
deletedIds: idsToDelete,
}
)
} else {
await cleanupExternalWebhook(foundWebhook, webhookData.workflow, requestId)
await db.delete(webhook).where(eq(webhook.id, id))
try {
PlatformEvents.webhookDeleted({
webhookId: id,
workflowId: webhookData.workflow.id,
})
} catch {
// Telemetry should not fail the operation
}
logger.info(`[${requestId}] Successfully deleted webhook: ${id}`)
}
return NextResponse.json({ success: true }, { status: 200 })
} catch (error: any) {
logger.error(`[${requestId}] Error deleting webhook`, {

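The merge in the PATCH handler above carries a fixed set of credential- and sync-related fields over from the stored config so that a config update cannot sever the credential linkage or reset Gmail/Outlook polling state. A standalone sketch of that merge (the field list matches the route; the function itself is illustrative):

```typescript
type ProviderConfig = Record<string, unknown>

// Fields preserved from the existing config, matching the PATCH handler above.
const PRESERVED_FIELDS = [
  'credentialId',
  'credentialSetId',
  'userId',
  'historyId',
  'lastCheckedTimestamp',
  'setupCompleted',
  'externalId',
] as const

function mergeProviderConfig(
  existing: ProviderConfig | null,
  incoming: ProviderConfig
): ProviderConfig {
  const base = existing ?? {}
  const merged: ProviderConfig = { ...incoming }
  for (const field of PRESERVED_FIELDS) {
    // The stored value always wins for these fields, even when undefined,
    // mirroring the unconditional assignments in the route.
    merged[field] = base[field]
  }
  return merged
}
```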
View File

@@ -5,6 +5,7 @@ import { and, desc, eq } from 'drizzle-orm'
import { nanoid } from 'nanoid'
import { type NextRequest, NextResponse } from 'next/server'
import { getSession } from '@/lib/auth'
import { PlatformEvents } from '@/lib/core/telemetry'
import { generateRequestId } from '@/lib/core/utils/request'
import { getBaseUrl } from '@/lib/core/utils/urls'
import { getUserEntityPermissions } from '@/lib/workspaces/permissions/utils'
@@ -262,6 +263,157 @@ export async function POST(request: NextRequest) {
workflowRecord.workspaceId || undefined
)
// --- Credential Set Handling ---
// For credential sets, we fan out to create one webhook per credential at save time.
// This applies to all OAuth-based triggers, not just polling ones.
// Check for credentialSetId directly (frontend may already extract it) or credential set value in credential fields
const rawCredentialId = (resolvedProviderConfig?.credentialId ||
resolvedProviderConfig?.triggerCredentials) as string | undefined
const directCredentialSetId = resolvedProviderConfig?.credentialSetId as string | undefined
if (directCredentialSetId || rawCredentialId) {
const { isCredentialSetValue, extractCredentialSetId } = await import('@/executor/constants')
const credentialSetId =
directCredentialSetId ||
(rawCredentialId && isCredentialSetValue(rawCredentialId)
? extractCredentialSetId(rawCredentialId)
: null)
if (credentialSetId) {
logger.info(
`[${requestId}] Credential set detected for ${provider} trigger. Syncing webhooks for set ${credentialSetId}`
)
const { getProviderIdFromServiceId } = await import('@/lib/oauth')
const { syncWebhooksForCredentialSet, configureGmailPolling, configureOutlookPolling } =
await import('@/lib/webhooks/utils.server')
// Map provider to OAuth provider ID
const oauthProviderId = getProviderIdFromServiceId(provider)
const {
credentialId: _cId,
triggerCredentials: _tCred,
credentialSetId: _csId,
...baseProviderConfig
} = resolvedProviderConfig
try {
const syncResult = await syncWebhooksForCredentialSet({
workflowId,
blockId,
provider,
basePath: finalPath,
credentialSetId,
oauthProviderId,
providerConfig: baseProviderConfig,
requestId,
})
if (syncResult.webhooks.length === 0) {
logger.error(
`[${requestId}] No webhooks created for credential set - no valid credentials found`
)
return NextResponse.json(
{
error: `No valid credentials found in credential set for ${provider}`,
details: 'Please ensure team members have connected their accounts',
},
{ status: 400 }
)
}
// Configure each new webhook (for providers that need configuration)
const pollingProviders = ['gmail', 'outlook']
const needsConfiguration = pollingProviders.includes(provider)
if (needsConfiguration) {
const configureFunc =
provider === 'gmail' ? configureGmailPolling : configureOutlookPolling
const configureErrors: string[] = []
for (const wh of syncResult.webhooks) {
if (wh.isNew) {
// Fetch the webhook data for configuration
const webhookRows = await db
.select()
.from(webhook)
.where(eq(webhook.id, wh.id))
.limit(1)
if (webhookRows.length > 0) {
const success = await configureFunc(webhookRows[0], requestId)
if (!success) {
configureErrors.push(
`Failed to configure webhook for credential ${wh.credentialId}`
)
logger.warn(
`[${requestId}] Failed to configure ${provider} polling for webhook ${wh.id}`
)
}
}
}
}
if (
configureErrors.length > 0 &&
configureErrors.length === syncResult.webhooks.length
) {
// All configurations failed - roll back
logger.error(`[${requestId}] All webhook configurations failed, rolling back`)
for (const wh of syncResult.webhooks) {
await db.delete(webhook).where(eq(webhook.id, wh.id))
}
return NextResponse.json(
{
error: `Failed to configure ${provider} polling`,
details: 'Please check account permissions and try again',
},
{ status: 500 }
)
}
}
logger.info(
`[${requestId}] Successfully synced ${syncResult.webhooks.length} webhooks for credential set ${credentialSetId}`
)
// Return the first webhook as the "primary" for the UI
// The UI will query by credentialSetId to get all of them
const primaryWebhookRows = await db
.select()
.from(webhook)
.where(eq(webhook.id, syncResult.webhooks[0].id))
.limit(1)
return NextResponse.json(
{
webhook: primaryWebhookRows[0],
credentialSetInfo: {
credentialSetId,
totalWebhooks: syncResult.webhooks.length,
created: syncResult.created,
updated: syncResult.updated,
deleted: syncResult.deleted,
},
},
{ status: syncResult.created > 0 ? 201 : 200 }
)
} catch (err) {
logger.error(`[${requestId}] Error syncing webhooks for credential set`, err)
return NextResponse.json(
{
error: `Failed to configure ${provider} webhook`,
details: err instanceof Error ? err.message : 'Unknown error',
},
{ status: 500 }
)
}
}
}
// --- End Credential Set Handling ---
// Create external subscriptions before saving to DB to prevent orphaned records
let externalSubscriptionId: string | undefined
let externalSubscriptionCreated = false
@@ -422,6 +574,10 @@ export async function POST(request: NextRequest) {
blockId,
provider,
providerConfig: resolvedProviderConfig,
credentialSetId:
((resolvedProviderConfig as Record<string, unknown>)?.credentialSetId as
| string
| null) || null,
isActive: true,
updatedAt: new Date(),
})
@@ -445,6 +601,10 @@ export async function POST(request: NextRequest) {
path: finalPath,
provider,
providerConfig: resolvedProviderConfig,
credentialSetId:
((resolvedProviderConfig as Record<string, unknown>)?.credentialSetId as
| string
| null) || null,
isActive: true,
createdAt: new Date(),
updatedAt: new Date(),
@@ -631,6 +791,19 @@ export async function POST(request: NextRequest) {
}
// --- End Grain specific logic ---
if (!targetWebhookId && savedWebhook) {
try {
PlatformEvents.webhookCreated({
webhookId: savedWebhook.id,
workflowId: workflowId,
provider: provider || 'generic',
workspaceId: workflowRecord.workspaceId || undefined,
})
} catch {
// Telemetry should not fail the operation
}
}
const status = targetWebhookId ? 200 : 201
return NextResponse.json({ webhook: savedWebhook }, { status })
} catch (error: any) {

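The POST handler above distinguishes a direct `credentialSetId` from a credential set encoded in the credential field via `isCredentialSetValue`/`extractCredentialSetId`. A minimal sketch of those two helpers, assuming the `credentialSet:` prefix convention used by `buildWebhookMetadata` elsewhere in this change (the real implementations in `@/executor/constants` may differ):

```typescript
const CREDENTIAL_SET_PREFIX = 'credentialSet:'

// A credential field holds either a plain credential ID or a
// prefix-encoded credential set reference.
function isCredentialSetValue(value: string): boolean {
  return value.startsWith(CREDENTIAL_SET_PREFIX)
}

// Returns the set ID for a credential set value, null otherwise.
function extractCredentialSetId(value: string): string | null {
  return isCredentialSetValue(value) ? value.slice(CREDENTIAL_SET_PREFIX.length) : null
}
```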
View File

@@ -157,6 +157,112 @@ vi.mock('@/lib/workflows/persistence/utils', () => ({
blockExistsInDeployment: vi.fn().mockResolvedValue(true),
}))
vi.mock('@/lib/webhooks/processor', () => ({
findAllWebhooksForPath: vi.fn().mockImplementation(async (options: { path: string }) => {
// Filter webhooks by path from globalMockData
const matchingWebhooks = globalMockData.webhooks.filter(
(wh) => wh.path === options.path && wh.isActive
)
if (matchingWebhooks.length === 0) {
return []
}
// Return array of {webhook, workflow} objects
return matchingWebhooks.map((wh) => {
const matchingWorkflow = globalMockData.workflows.find((w) => w.id === wh.workflowId) || {
id: wh.workflowId || 'test-workflow-id',
userId: 'test-user-id',
workspaceId: 'test-workspace-id',
}
return {
webhook: wh,
workflow: matchingWorkflow,
}
})
}),
parseWebhookBody: vi.fn().mockImplementation(async (request: any) => {
try {
const cloned = request.clone()
const rawBody = await cloned.text()
const body = rawBody ? JSON.parse(rawBody) : {}
return { body, rawBody }
} catch {
return { body: {}, rawBody: '' }
}
}),
handleProviderChallenges: vi.fn().mockResolvedValue(null),
handleProviderReachabilityTest: vi.fn().mockReturnValue(null),
verifyProviderAuth: vi
.fn()
.mockImplementation(
async (
foundWebhook: any,
_foundWorkflow: any,
request: any,
_rawBody: string,
_requestId: string
) => {
// Implement generic webhook auth verification for tests
if (foundWebhook.provider === 'generic') {
const providerConfig = foundWebhook.providerConfig || {}
if (providerConfig.requireAuth) {
const configToken = providerConfig.token
const secretHeaderName = providerConfig.secretHeaderName
if (configToken) {
let isTokenValid = false
if (secretHeaderName) {
// Custom header auth
const headerValue = request.headers.get(secretHeaderName.toLowerCase())
if (headerValue === configToken) {
isTokenValid = true
}
} else {
// Bearer token auth
const authHeader = request.headers.get('authorization')
if (authHeader?.toLowerCase().startsWith('bearer ')) {
const token = authHeader.substring(7)
if (token === configToken) {
isTokenValid = true
}
}
}
if (!isTokenValid) {
const { NextResponse } = await import('next/server')
return new NextResponse('Unauthorized - Invalid authentication token', {
status: 401,
})
}
} else {
// Auth required but no token configured
const { NextResponse } = await import('next/server')
return new NextResponse('Unauthorized - Authentication required but not configured', {
status: 401,
})
}
}
}
return null
}
),
checkWebhookPreprocessing: vi.fn().mockResolvedValue(null),
formatProviderErrorResponse: vi.fn().mockImplementation((_webhook, error, status) => {
const { NextResponse } = require('next/server')
return NextResponse.json({ error }, { status })
}),
shouldSkipWebhookEvent: vi.fn().mockReturnValue(false),
handlePreDeploymentVerification: vi.fn().mockReturnValue(null),
queueWebhookExecution: vi.fn().mockImplementation(async () => {
// Call processWebhookMock so tests can verify it was called
processWebhookMock()
const { NextResponse } = await import('next/server')
return NextResponse.json({ message: 'Webhook processed' })
}),
}))
vi.mock('drizzle-orm/postgres-js', () => ({
drizzle: vi.fn().mockReturnValue({}),
}))
@@ -165,6 +271,10 @@ vi.mock('postgres', () => vi.fn().mockReturnValue({}))
vi.mock('@sim/logger', () => loggerMock)
vi.mock('@/lib/core/utils/request', () => ({
generateRequestId: vi.fn().mockReturnValue('test-request-id'),
}))
process.env.DATABASE_URL = 'postgresql://test:test@localhost:5432/test'
import { POST } from '@/app/api/webhooks/trigger/[path]/route'

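The mocked `verifyProviderAuth` above encodes the generic-webhook auth rules: no auth required passes, a configured token is checked against either a custom secret header or a Bearer `Authorization` header, and requiring auth without a token always fails. A standalone sketch of the same decision, using a plain `Map` of lowercased header names in place of a request object (the shape is illustrative):

```typescript
interface GenericAuthConfig {
  requireAuth?: boolean
  token?: string
  secretHeaderName?: string
}

// Returns true if the request headers satisfy the webhook's auth config.
function isAuthorized(config: GenericAuthConfig, headers: Map<string, string>): boolean {
  if (!config.requireAuth) return true
  if (!config.token) return false // auth required but no token configured
  if (config.secretHeaderName) {
    // Custom header auth: exact match against the configured token.
    return headers.get(config.secretHeaderName.toLowerCase()) === config.token
  }
  // Bearer token auth.
  const authHeader = headers.get('authorization')
  if (!authHeader?.toLowerCase().startsWith('bearer ')) return false
  return authHeader.substring(7) === config.token
}
```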
View File

@@ -3,11 +3,14 @@ import { type NextRequest, NextResponse } from 'next/server'
import { generateRequestId } from '@/lib/core/utils/request'
import {
checkWebhookPreprocessing,
findAllWebhooksForPath,
formatProviderErrorResponse,
handlePreDeploymentVerification,
handleProviderChallenges,
handleProviderReachabilityTest,
parseWebhookBody,
queueWebhookExecution,
shouldSkipWebhookEvent,
verifyProviderAuth,
} from '@/lib/webhooks/processor'
import { blockExistsInDeployment } from '@/lib/workflows/persistence/utils'
@@ -22,19 +25,7 @@ export async function GET(request: NextRequest, { params }: { params: Promise<{
const requestId = generateRequestId()
const { path } = await params
// Handle provider-specific GET verifications (Microsoft Graph, WhatsApp, etc.)
const challengeResponse = await handleProviderChallenges({}, request, requestId, path)
if (challengeResponse) {
return challengeResponse
@@ -50,26 +41,10 @@ export async function POST(
const requestId = generateRequestId()
const { path } = await params
// Log ALL incoming webhook requests for debugging
logger.info(`[${requestId}] Incoming webhook request`, {
path,
method: request.method,
headers: Object.fromEntries(request.headers.entries()),
})
// Handle Microsoft Graph subscription validation (some environments send POST with validationToken)
try {
const url = new URL(request.url)
const validationToken = url.searchParams.get('validationToken')
if (validationToken) {
logger.info(`[${requestId}] Microsoft Graph subscription validation (POST) for path: ${path}`)
return new NextResponse(validationToken, {
status: 200,
headers: { 'Content-Type': 'text/plain' },
})
}
} catch {
// ignore URL parsing errors; proceed to normal handling
// Handle provider challenges before body parsing (Microsoft Graph validationToken, etc.)
const earlyChallenge = await handleProviderChallenges({}, request, requestId, path)
if (earlyChallenge) {
return earlyChallenge
}
const parseResult = await parseWebhookBody(request, requestId)
@@ -86,118 +61,118 @@ export async function POST(
return challengeResponse
}
// Find all webhooks for this path (supports credential set fan-out where multiple webhooks share a path)
const webhooksForPath = await findAllWebhooksForPath({ requestId, path })
if (webhooksForPath.length === 0) {
logger.warn(`[${requestId}] Webhook or workflow not found for path: ${path}`)
return new NextResponse('Not Found', { status: 404 })
}
// Process each webhook
// For credential sets with shared paths, each webhook represents a different credential
const responses: NextResponse[] = []
// Log HubSpot webhook details for debugging
if (foundWebhook.provider === 'hubspot') {
const events = Array.isArray(body) ? body : [body]
const firstEvent = events[0]
logger.info(`[${requestId}] HubSpot webhook received`, {
path,
subscriptionType: firstEvent?.subscriptionType,
objectId: firstEvent?.objectId,
portalId: firstEvent?.portalId,
webhookId: foundWebhook.id,
workflowId: foundWorkflow.id,
triggerId: foundWebhook.providerConfig?.triggerId,
eventCount: events.length,
})
}
const authError = await verifyProviderAuth(
foundWebhook,
foundWorkflow,
request,
rawBody,
requestId
)
if (authError) {
return authError
}
const reachabilityResponse = handleProviderReachabilityTest(foundWebhook, body, requestId)
if (reachabilityResponse) {
return reachabilityResponse
}
let preprocessError: NextResponse | null = null
try {
preprocessError = await checkWebhookPreprocessing(foundWorkflow, foundWebhook, requestId)
if (preprocessError) {
return preprocessError
}
} catch (error) {
logger.error(`[${requestId}] Unexpected error during webhook preprocessing`, {
error: error instanceof Error ? error.message : String(error),
stack: error instanceof Error ? error.stack : undefined,
webhookId: foundWebhook.id,
workflowId: foundWorkflow.id,
})
if (foundWebhook.provider === 'microsoft-teams') {
return NextResponse.json(
{
type: 'message',
text: 'An unexpected error occurred during preprocessing',
},
{ status: 500 }
)
}
return NextResponse.json(
{ error: 'An unexpected error occurred during preprocessing' },
{ status: 500 }
for (const { webhook: foundWebhook, workflow: foundWorkflow } of webhooksForPath) {
const authError = await verifyProviderAuth(
foundWebhook,
foundWorkflow,
request,
rawBody,
requestId
)
}
if (authError) {
// For multi-webhook, log and continue to next webhook
if (webhooksForPath.length > 1) {
logger.warn(`[${requestId}] Auth failed for webhook ${foundWebhook.id}, continuing to next`)
continue
}
return authError
}
if (foundWebhook.blockId) {
const blockExists = await blockExistsInDeployment(foundWorkflow.id, foundWebhook.blockId)
if (!blockExists) {
// For Grain, if block doesn't exist in deployment, treat as verification request
// Grain validates webhook URLs during creation, and the block may not be deployed yet
if (foundWebhook.provider === 'grain') {
logger.info(
`[${requestId}] Grain webhook verification - block not in deployment, returning 200 OK`
)
return NextResponse.json({ status: 'ok', message: 'Webhook endpoint verified' })
const reachabilityResponse = handleProviderReachabilityTest(foundWebhook, body, requestId)
if (reachabilityResponse) {
// Reachability test should return immediately for the first webhook
return reachabilityResponse
}
let preprocessError: NextResponse | null = null
try {
preprocessError = await checkWebhookPreprocessing(foundWorkflow, foundWebhook, requestId)
if (preprocessError) {
if (webhooksForPath.length > 1) {
logger.warn(
`[${requestId}] Preprocessing failed for webhook ${foundWebhook.id}, continuing to next`
)
continue
}
return preprocessError
}
} catch (error) {
logger.error(`[${requestId}] Unexpected error during webhook preprocessing`, {
error: error instanceof Error ? error.message : String(error),
stack: error instanceof Error ? error.stack : undefined,
webhookId: foundWebhook.id,
workflowId: foundWorkflow.id,
})
if (webhooksForPath.length > 1) {
continue
}
logger.info(
`[${requestId}] Trigger block ${foundWebhook.blockId} not found in deployment for workflow ${foundWorkflow.id}`
return formatProviderErrorResponse(
foundWebhook,
'An unexpected error occurred during preprocessing',
500
)
return new NextResponse('Trigger block not found in deployment', { status: 404 })
}
}
if (foundWebhook.provider === 'stripe') {
const providerConfig = (foundWebhook.providerConfig as Record<string, any>) || {}
const eventTypes = providerConfig.eventTypes
if (foundWebhook.blockId) {
const blockExists = await blockExistsInDeployment(foundWorkflow.id, foundWebhook.blockId)
if (!blockExists) {
const preDeploymentResponse = handlePreDeploymentVerification(foundWebhook, requestId)
if (preDeploymentResponse) {
return preDeploymentResponse
}
if (eventTypes && Array.isArray(eventTypes) && eventTypes.length > 0) {
const eventType = body?.type
if (eventType && !eventTypes.includes(eventType)) {
logger.info(
`[${requestId}] Stripe event type '${eventType}' not in allowed list, skipping execution`
`[${requestId}] Trigger block ${foundWebhook.blockId} not found in deployment for workflow ${foundWorkflow.id}`
)
return new NextResponse('Event type filtered', { status: 200 })
if (webhooksForPath.length > 1) {
continue
}
return new NextResponse('Trigger block not found in deployment', { status: 404 })
}
}
if (shouldSkipWebhookEvent(foundWebhook, body, requestId)) {
continue
}
const response = await queueWebhookExecution(foundWebhook, foundWorkflow, body, request, {
requestId,
path,
testMode: false,
executionTarget: 'deployed',
})
responses.push(response)
}
return queueWebhookExecution(foundWebhook, foundWorkflow, body, request, {
requestId,
path,
testMode: false,
executionTarget: 'deployed',
// Return the last successful response, or a combined response for multiple webhooks
if (responses.length === 0) {
return new NextResponse('No webhooks processed successfully', { status: 500 })
}
if (responses.length === 1) {
return responses[0]
}
// For multiple webhooks, return success if at least one succeeded
logger.info(
`[${requestId}] Processed ${responses.length} webhooks for path: ${path} (credential set fan-out)`
)
return NextResponse.json({
success: true,
webhooksProcessed: responses.length,
})
}

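The end of the POST handler above reduces the per-webhook results of the credential-set fan-out to a single HTTP response: no responses is a 500, one response is returned as-is, and multiple responses collapse into a combined success payload. A plain-data sketch of that decision table (`Outcome` is a stand-in for `NextResponse`):

```typescript
type Outcome = { status: number; body: unknown }

// Mirrors the response aggregation at the end of the webhook trigger route.
function aggregateResponses(responses: Outcome[]): Outcome {
  if (responses.length === 0) {
    return { status: 500, body: 'No webhooks processed successfully' }
  }
  if (responses.length === 1) {
    return responses[0]
  }
  // Credential-set fan-out: report success if at least one webhook ran.
  return { status: 200, body: { success: true, webhooksProcessed: responses.length } }
}
```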
View File

@@ -217,10 +217,8 @@ export async function DELETE(
logger.info(`[${requestId}] Workflow undeployed successfully: ${id}`)
try {
const { PlatformEvents } = await import('@/lib/core/telemetry')
PlatformEvents.workflowUndeployed({ workflowId: id })
} catch (_e) {
// Silently fail
}

View File

@@ -2,6 +2,7 @@ import { createLogger } from '@sim/logger'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { getSession } from '@/lib/auth'
import { PlatformEvents } from '@/lib/core/telemetry'
import { generateRequestId } from '@/lib/core/utils/request'
import { duplicateWorkflow } from '@/lib/workflows/persistence/duplicate'
@@ -46,6 +47,16 @@ export async function POST(req: NextRequest, { params }: { params: Promise<{ id:
requestId,
})
try {
PlatformEvents.workflowDuplicated({
sourceWorkflowId,
newWorkflowId: result.id,
workspaceId,
})
} catch {
// Telemetry should not fail the operation
}
const elapsed = Date.now() - startTime
logger.info(
`[${requestId}] Successfully duplicated workflow ${sourceWorkflowId} to ${result.id} in ${elapsed}ms`

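The `try { PlatformEvents.… } catch { /* Telemetry should not fail the operation */ }` pattern recurs across the routes in this change. A hypothetical helper that captures it (not part of the codebase; shown only to make the invariant explicit):

```typescript
// Runs a telemetry emitter and swallows any error, so instrumentation
// can never fail the surrounding request handler.
function trackSafely(emit: () => void): void {
  try {
    emit()
  } catch {
    // Telemetry should not fail the operation
  }
}
```

Usage would look like `trackSafely(() => PlatformEvents.workflowDuplicated({ … }))`.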
View File

@@ -8,6 +8,7 @@ import { authenticateApiKeyFromHeader, updateApiKeyLastUsed } from '@/lib/api-ke
import { getSession } from '@/lib/auth'
import { verifyInternalToken } from '@/lib/auth/internal'
import { env } from '@/lib/core/config/env'
import { PlatformEvents } from '@/lib/core/telemetry'
import { generateRequestId } from '@/lib/core/utils/request'
import { loadWorkflowFromNormalizedTables } from '@/lib/workflows/persistence/utils'
import { getWorkflowAccessContext, getWorkflowById } from '@/lib/workflows/utils'
@@ -335,6 +336,15 @@ export async function DELETE(
await db.delete(workflow).where(eq(workflow.id, workflowId))
try {
PlatformEvents.workflowDeleted({
workflowId,
workspaceId: workflowData.workspaceId || undefined,
})
} catch {
// Telemetry should not fail the operation
}
const elapsed = Date.now() - startTime
logger.info(`[${requestId}] Successfully deleted workflow ${workflowId} in ${elapsed}ms`)

View File

@@ -317,6 +317,8 @@ interface WebhookMetadata {
providerConfig: Record<string, any>
}
const CREDENTIAL_SET_PREFIX = 'credentialSet:'
function buildWebhookMetadata(block: BlockState): WebhookMetadata | null {
const triggerId =
getSubBlockValue<string>(block, 'triggerId') ||
@@ -328,9 +330,17 @@ function buildWebhookMetadata(block: BlockState): WebhookMetadata | null {
const triggerDef = triggerId ? getTrigger(triggerId) : undefined
const provider = triggerDef?.provider || null
// Handle credential sets vs individual credentials
const isCredentialSet = triggerCredentials?.startsWith(CREDENTIAL_SET_PREFIX)
const credentialSetId = isCredentialSet
? triggerCredentials!.slice(CREDENTIAL_SET_PREFIX.length)
: undefined
const credentialId = isCredentialSet ? undefined : triggerCredentials
const providerConfig = {
...(typeof triggerConfig === 'object' ? triggerConfig : {}),
...(credentialId ? { credentialId } : {}),
...(credentialSetId ? { credentialSetId } : {}),
...(triggerId ? { triggerId } : {}),
}
@@ -347,6 +357,54 @@ async function upsertWebhookRecord(
webhookId: string,
metadata: WebhookMetadata
): Promise<void> {
const providerConfig = metadata.providerConfig as Record<string, unknown>
const credentialSetId = providerConfig?.credentialSetId as string | undefined
// For credential sets, delegate to the sync function which handles fan-out
if (credentialSetId && metadata.provider) {
const { syncWebhooksForCredentialSet } = await import('@/lib/webhooks/utils.server')
const { getProviderIdFromServiceId } = await import('@/lib/oauth')
const oauthProviderId = getProviderIdFromServiceId(metadata.provider)
const requestId = crypto.randomUUID().slice(0, 8)
// Extract base config (without credential-specific fields)
const {
credentialId: _cId,
credentialSetId: _csId,
userId: _uId,
...baseConfig
} = providerConfig
try {
await syncWebhooksForCredentialSet({
workflowId,
blockId: block.id,
provider: metadata.provider,
basePath: metadata.triggerPath,
credentialSetId,
oauthProviderId,
providerConfig: baseConfig as Record<string, any>,
requestId,
})
logger.info('Synced credential set webhooks during workflow save', {
workflowId,
blockId: block.id,
credentialSetId,
})
} catch (error) {
logger.error('Failed to sync credential set webhooks during workflow save', {
workflowId,
blockId: block.id,
credentialSetId,
error,
})
}
return
}
// For individual credentials, use the existing single webhook logic
const [existing] = await db.select().from(webhook).where(eq(webhook.id, webhookId)).limit(1)
if (existing) {
@@ -381,6 +439,7 @@ async function upsertWebhookRecord(
path: metadata.triggerPath,
provider: metadata.provider,
providerConfig: metadata.providerConfig,
credentialSetId: null,
isActive: true,
createdAt: new Date(),
updatedAt: new Date(),

View File

@@ -119,12 +119,12 @@ export async function POST(req: NextRequest) {
logger.info(`[${requestId}] Creating workflow ${workflowId} for user ${session.user.id}`)
import('@/lib/core/telemetry')
.then(({ PlatformEvents }) => {
PlatformEvents.workflowCreated({
workflowId,
name,
workspaceId: workspaceId || undefined,
folderId: folderId || undefined,
})
})
.catch(() => {

View File

@@ -7,6 +7,7 @@ import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { createApiKey, getApiKeyDisplayFormat } from '@/lib/api-key/auth'
import { getSession } from '@/lib/auth'
import { PlatformEvents } from '@/lib/core/telemetry'
import { generateRequestId } from '@/lib/core/utils/request'
import { getUserEntityPermissions } from '@/lib/workspaces/permissions/utils'
@@ -147,6 +148,15 @@ export async function POST(request: NextRequest, { params }: { params: Promise<{
createdAt: apiKey.createdAt,
})
try {
PlatformEvents.apiKeyGenerated({
userId: userId,
keyName: name,
})
} catch {
// Telemetry should not fail the operation
}
logger.info(`[${requestId}] Created workspace API key: ${name} in workspace ${workspaceId}`)
return NextResponse.json({
@@ -198,6 +208,17 @@ export async function DELETE(
)
)
try {
for (const keyId of keys) {
PlatformEvents.apiKeyRevoked({
userId: userId,
keyId: keyId,
})
}
} catch {
// Telemetry should not fail the operation
}
logger.info(
`[${requestId}] Deleted ${deletedCount} workspace API keys from workspace ${workspaceId}`
)

View File

@@ -6,6 +6,8 @@ import { nanoid } from 'nanoid'
import { type NextRequest, NextResponse } from 'next/server'
import { z } from 'zod'
import { getSession } from '@/lib/auth'
import { isEnterpriseOrgAdminOrOwner } from '@/lib/billing/core/subscription'
import { isHosted } from '@/lib/core/config/feature-flags'
import { decryptSecret, encryptSecret } from '@/lib/core/security/encryption'
import { generateRequestId } from '@/lib/core/utils/request'
import { getUserEntityPermissions } from '@/lib/workspaces/permissions/utils'
@@ -56,6 +58,15 @@ export async function GET(request: NextRequest, { params }: { params: Promise<{
return NextResponse.json({ error: 'Unauthorized' }, { status: 401 })
}
let byokEnabled = true
if (isHosted) {
byokEnabled = await isEnterpriseOrgAdminOrOwner(userId)
}
if (!byokEnabled) {
return NextResponse.json({ keys: [], byokEnabled: false })
}
const byokKeys = await db
.select({
id: workspaceBYOKKeys.id,
@@ -97,7 +108,7 @@ export async function GET(request: NextRequest, { params }: { params: Promise<{
})
)
return NextResponse.json({ keys: formattedKeys, byokEnabled: true })
} catch (error: unknown) {
logger.error(`[${requestId}] BYOK keys GET error`, error)
return NextResponse.json(
@@ -120,6 +131,20 @@ export async function POST(request: NextRequest, { params }: { params: Promise<{
const userId = session.user.id
if (isHosted) {
const canManageBYOK = await isEnterpriseOrgAdminOrOwner(userId)
if (!canManageBYOK) {
logger.warn(`[${requestId}] User not authorized to manage BYOK keys`, { userId })
return NextResponse.json(
{
error:
'BYOK is an Enterprise-only feature. Only organization admins and owners can manage API keys.',
},
{ status: 403 }
)
}
}
const permission = await getUserEntityPermissions(userId, 'workspace', workspaceId)
if (permission !== 'admin') {
return NextResponse.json(
@@ -220,6 +245,20 @@ export async function DELETE(
const userId = session.user.id
if (isHosted) {
const canManageBYOK = await isEnterpriseOrgAdminOrOwner(userId)
if (!canManageBYOK) {
logger.warn(`[${requestId}] User not authorized to manage BYOK keys`, { userId })
return NextResponse.json(
{
error:
'BYOK is an Enterprise-only feature. Only organization admins and owners can manage API keys.',
},
{ status: 403 }
)
}
}
const permission = await getUserEntityPermissions(userId, 'workspace', workspaceId)
if (permission !== 'admin') {
return NextResponse.json(

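The gate added to these BYOK routes reduces to one decision: self-hosted deployments always allow BYOK, while hosted deployments defer to the enterprise org-role check. A minimal sketch of that decision as a pure function (the boolean parameters stand in for the real `isHosted` flag and the awaited `isEnterpriseOrgAdminOrOwner` result):

```typescript
// Sketch of the BYOK gate used by the GET/POST/DELETE handlers above.
// Self-hosted deployments skip the enterprise check entirely; hosted
// deployments only allow enterprise org admins/owners to manage keys.
function resolveByokEnabled(isHosted: boolean, isEnterpriseAdmin: boolean): boolean {
  if (!isHosted) return true
  return isEnterpriseAdmin
}
```

In the handlers, a `false` result short-circuits to an empty key list (GET) or a 403 (POST/DELETE) before any workspace permission check runs.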
View File

@@ -14,6 +14,7 @@ import { and, eq, inArray } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { WorkspaceInvitationEmail } from '@/components/emails'
import { getSession } from '@/lib/auth'
import { PlatformEvents } from '@/lib/core/telemetry'
import { getBaseUrl } from '@/lib/core/utils/urls'
import { sendEmail } from '@/lib/messaging/email/mailer'
import { getFromEmailAddress } from '@/lib/messaging/email/utils'
@@ -81,7 +82,6 @@ export async function POST(req: NextRequest) {
return NextResponse.json({ error: 'Workspace ID and email are required' }, { status: 400 })
}
// Validate permission type
const validPermissions: PermissionType[] = ['admin', 'write', 'read']
if (!validPermissions.includes(permission)) {
return NextResponse.json(
@@ -90,7 +90,6 @@ export async function POST(req: NextRequest) {
)
}
// Check if user has admin permissions for this workspace
const userPermission = await db
.select()
.from(permissions)
@@ -111,7 +110,6 @@ export async function POST(req: NextRequest) {
)
}
// Get the workspace details for the email
const workspaceDetails = await db
.select()
.from(workspace)
@@ -122,8 +120,6 @@ export async function POST(req: NextRequest) {
return NextResponse.json({ error: 'Workspace not found' }, { status: 404 })
}
// Check if the user is already a member
// First find if a user with this email exists
const existingUser = await db
.select()
.from(user)
@@ -131,7 +127,6 @@ export async function POST(req: NextRequest) {
.then((rows) => rows[0])
if (existingUser) {
// Check if the user already has permissions for this workspace
const existingPermission = await db
.select()
.from(permissions)
@@ -155,7 +150,6 @@ export async function POST(req: NextRequest) {
}
}
// Check if there's already a pending invitation
const existingInvitation = await db
.select()
.from(workspaceInvitation)
@@ -178,12 +172,10 @@ export async function POST(req: NextRequest) {
)
}
// Generate a unique token and set expiry date (1 week from now)
const token = randomUUID()
const expiresAt = new Date()
expiresAt.setDate(expiresAt.getDate() + 7) // 7 days expiry
// Create the invitation
const invitationData = {
id: randomUUID(),
workspaceId,
@@ -198,10 +190,19 @@ export async function POST(req: NextRequest) {
updatedAt: new Date(),
}
// Create invitation
await db.insert(workspaceInvitation).values(invitationData)
// Send the invitation email
try {
PlatformEvents.workspaceMemberInvited({
workspaceId,
invitedBy: session.user.id,
inviteeEmail: email,
role: permission,
})
} catch {
// Telemetry should not fail the operation
}
await sendInvitationEmail({
to: email,
inviterName: session.user.name || session.user.email || 'A user',
@@ -217,7 +218,6 @@ export async function POST(req: NextRequest) {
}
}
// Helper function to send invitation email using the Resend API
async function sendInvitationEmail({
to,
inviterName,
@@ -233,7 +233,6 @@ async function sendInvitationEmail({
}) {
try {
const baseUrl = getBaseUrl()
// Use invitation ID in path, token in query parameter for security
const invitationLink = `${baseUrl}/invite/${invitationId}?token=${token}`
const emailHtml = await render(
@@ -263,6 +262,5 @@ async function sendInvitationEmail({
}
} catch (error) {
logger.error('Error sending invitation email:', error)
// Continue even if email fails - the invitation is still created
}
}
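The `try { PlatformEvents.workspaceMemberInvited(...) } catch {}` pattern above keeps telemetry failures from ever breaking the invite flow. The same idea as a reusable wrapper (`safeTrack` is a hypothetical helper, not part of the codebase):

```typescript
// Fire-and-forget telemetry: swallow anything the emitter throws,
// because analytics must never fail the user-facing operation.
function safeTrack(emit: () => void): void {
  try {
    emit()
  } catch {
    // Intentionally ignored: telemetry is best-effort.
  }
}

// A throwing emitter does not propagate to the caller.
safeTrack(() => {
  throw new Error('telemetry backend down')
})
```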

View File

@@ -5,6 +5,7 @@ import { and, desc, eq, isNull } from 'drizzle-orm'
import { NextResponse } from 'next/server'
import { z } from 'zod'
import { getSession } from '@/lib/auth'
import { PlatformEvents } from '@/lib/core/telemetry'
import { buildDefaultWorkflowArtifacts } from '@/lib/workflows/defaults'
import { saveWorkflowToNormalizedTables } from '@/lib/workflows/persistence/utils'
@@ -22,7 +23,6 @@ export async function GET() {
return NextResponse.json({ error: 'Unauthorized' }, { status: 401 })
}
// Get all workspaces where the user has permissions
const userWorkspaces = await db
.select({
workspace: workspace,
@@ -34,19 +34,15 @@ export async function GET() {
.orderBy(desc(workspace.createdAt))
if (userWorkspaces.length === 0) {
// Create a default workspace for the user
const defaultWorkspace = await createDefaultWorkspace(session.user.id, session.user.name)
// Migrate existing workflows to the default workspace
await migrateExistingWorkflows(session.user.id, defaultWorkspace.id)
return NextResponse.json({ workspaces: [defaultWorkspace] })
}
// If user has workspaces but might have orphaned workflows, migrate them
await ensureWorkflowsHaveWorkspace(session.user.id, userWorkspaces[0].workspace.id)
// Format the response with permission information
const workspacesWithPermissions = userWorkspaces.map(
({ workspace: workspaceDetails, permissionType }) => ({
...workspaceDetails,
@@ -78,24 +74,19 @@ export async function POST(req: Request) {
}
}
// Helper function to create a default workspace
async function createDefaultWorkspace(userId: string, userName?: string | null) {
// Extract first name only by splitting on spaces and taking the first part
const firstName = userName?.split(' ')[0] || null
const workspaceName = firstName ? `${firstName}'s Workspace` : 'My Workspace'
return createWorkspace(userId, workspaceName)
}
// Helper function to create a workspace
async function createWorkspace(userId: string, name: string) {
const workspaceId = crypto.randomUUID()
const workflowId = crypto.randomUUID()
const now = new Date()
// Create the workspace and initial workflow in a transaction
try {
await db.transaction(async (tx) => {
// Create the workspace
await tx.insert(workspace).values({
id: workspaceId,
name,
@@ -135,8 +126,6 @@ async function createWorkspace(userId: string, name: string) {
variables: {},
})
// No blocks are inserted - empty canvas
logger.info(
`Created workspace ${workspaceId} with initial workflow ${workflowId} for user ${userId}`
)
@@ -153,7 +142,16 @@ async function createWorkspace(userId: string, name: string) {
throw error
}
// Return the workspace data directly instead of querying again
try {
PlatformEvents.workspaceCreated({
workspaceId,
userId,
name,
})
} catch {
// Telemetry should not fail the operation
}
return {
id: workspaceId,
name,
@@ -166,9 +164,7 @@ async function createWorkspace(userId: string, name: string) {
}
}
// Helper function to migrate existing workflows to a workspace
async function migrateExistingWorkflows(userId: string, workspaceId: string) {
// Find all workflows that have no workspace ID
const orphanedWorkflows = await db
.select({ id: workflow.id })
.from(workflow)
@@ -182,7 +178,6 @@ async function migrateExistingWorkflows(userId: string, workspaceId: string) {
`Migrating ${orphanedWorkflows.length} workflows to workspace ${workspaceId} for user ${userId}`
)
// Bulk update all orphaned workflows at once
await db
.update(workflow)
.set({
@@ -192,16 +187,13 @@ async function migrateExistingWorkflows(userId: string, workspaceId: string) {
.where(and(eq(workflow.userId, userId), isNull(workflow.workspaceId)))
}
// Helper function to ensure all workflows have a workspace
async function ensureWorkflowsHaveWorkspace(userId: string, defaultWorkspaceId: string) {
// First check if there are any orphaned workflows
const orphanedWorkflows = await db
.select()
.from(workflow)
.where(and(eq(workflow.userId, userId), isNull(workflow.workspaceId)))
if (orphanedWorkflows.length > 0) {
// Directly update any workflows that don't have a workspace ID in a single query
await db
.update(workflow)
.set({

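The default-workspace naming in `createDefaultWorkspace` (first name, possessive, fallback) can be isolated into a small pure function; a sketch mirroring the logic above:

```typescript
// Derive a default workspace name from an optional display name:
// take the first space-separated token, or fall back to "My Workspace".
function defaultWorkspaceName(userName?: string | null): string {
  const firstName = userName?.split(' ')[0] || null
  return firstName ? `${firstName}'s Workspace` : 'My Workspace'
}
```

The `|| null` matters: an empty string splits to `''`, which is falsy, so blank names also fall back.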
View File

@@ -0,0 +1,269 @@
'use client'
import { useCallback, useEffect, useState } from 'react'
import { Mail } from 'lucide-react'
import { useParams, useRouter } from 'next/navigation'
import { GmailIcon, OutlookIcon } from '@/components/icons'
import { client, useSession } from '@/lib/auth/auth-client'
import { getProviderDisplayName, isPollingProvider } from '@/lib/credential-sets/providers'
import { InviteLayout, InviteStatusCard } from '@/app/invite/components'
interface InvitationInfo {
credentialSetName: string
organizationName: string
providerId: string | null
email: string | null
}
type AcceptedState = 'connecting' | 'already-connected'
export default function CredentialAccountInvitePage() {
const params = useParams()
const router = useRouter()
const token = params.token as string
const { data: session, isPending: sessionLoading } = useSession()
const [invitation, setInvitation] = useState<InvitationInfo | null>(null)
const [loading, setLoading] = useState(true)
const [error, setError] = useState<string | null>(null)
const [accepting, setAccepting] = useState(false)
const [acceptedState, setAcceptedState] = useState<AcceptedState | null>(null)
useEffect(() => {
async function fetchInvitation() {
try {
const res = await fetch(`/api/credential-sets/invite/${token}`)
if (!res.ok) {
const data = await res.json()
setError(data.error || 'Failed to load invitation')
return
}
const data = await res.json()
setInvitation(data.invitation)
} catch {
setError('Failed to load invitation')
} finally {
setLoading(false)
}
}
fetchInvitation()
}, [token])
const handleAccept = useCallback(async () => {
if (!session?.user?.id) {
// Include invite_flow=true so the login page preserves callbackUrl when linking to signup
const callbackUrl = encodeURIComponent(`/credential-account/${token}`)
router.push(`/login?invite_flow=true&callbackUrl=${callbackUrl}`)
return
}
setAccepting(true)
try {
const res = await fetch(`/api/credential-sets/invite/${token}`, {
method: 'POST',
})
if (!res.ok) {
const data = await res.json()
setError(data.error || 'Failed to accept invitation')
return
}
const data = await res.json()
const credentialSetProviderId = data.providerId || invitation?.providerId
// Check if user already has this provider connected
let isAlreadyConnected = false
if (credentialSetProviderId && isPollingProvider(credentialSetProviderId)) {
try {
const connectionsRes = await fetch('/api/auth/oauth/connections')
if (connectionsRes.ok) {
const connectionsData = await connectionsRes.json()
const connections = connectionsData.connections || []
isAlreadyConnected = connections.some(
(conn: { provider: string; accounts?: { id: string }[] }) =>
conn.provider === credentialSetProviderId &&
conn.accounts &&
conn.accounts.length > 0
)
}
} catch {
// If we can't check connections, proceed with OAuth flow
}
}
if (isAlreadyConnected) {
// Already connected - redirect to workspace
setAcceptedState('already-connected')
setTimeout(() => {
router.push('/workspace')
}, 2000)
} else if (credentialSetProviderId && isPollingProvider(credentialSetProviderId)) {
// Not connected - start OAuth flow
setAcceptedState('connecting')
// Small delay to show success message before redirect
setTimeout(async () => {
try {
await client.oauth2.link({
providerId: credentialSetProviderId,
callbackURL: `${window.location.origin}/workspace`,
})
} catch (oauthError) {
// The OAuth redirect will happen; this catch handles any pre-redirect errors
console.error('OAuth initiation error:', oauthError)
// If OAuth fails, redirect to workspace where they can connect manually
router.push('/workspace')
}
}, 1500)
} else {
// No provider specified - just redirect to workspace
router.push('/workspace')
}
} catch {
setError('Failed to accept invitation')
} finally {
setAccepting(false)
}
}, [session?.user?.id, token, router, invitation?.providerId])
const providerName = invitation?.providerId
? getProviderDisplayName(invitation.providerId)
: 'email'
const ProviderIcon =
invitation?.providerId === 'outlook'
? OutlookIcon
: invitation?.providerId === 'google-email'
? GmailIcon
: Mail
const providerWithIcon = (
<span className='inline-flex items-baseline gap-1'>
<ProviderIcon className='inline-block h-4 w-4 translate-y-[2px]' />
{providerName}
</span>
)
const getCallbackUrl = () => `/credential-account/${token}`
if (loading || sessionLoading) {
return (
<InviteLayout>
<InviteStatusCard type='loading' title='' description='Loading invitation...' />
</InviteLayout>
)
}
if (error) {
return (
<InviteLayout>
<InviteStatusCard
type='error'
title='Unable to load invitation'
description={error}
icon='error'
actions={[
{
label: 'Return to Home',
onClick: () => router.push('/'),
},
]}
/>
</InviteLayout>
)
}
if (acceptedState === 'already-connected') {
return (
<InviteLayout>
<InviteStatusCard
type='success'
title="You're all set!"
description={`You've joined ${invitation?.credentialSetName}. Your ${providerName} account is already connected. Redirecting to workspace...`}
icon='success'
/>
</InviteLayout>
)
}
if (acceptedState === 'connecting') {
return (
<InviteLayout>
<InviteStatusCard
type='loading'
title={`Connecting to ${providerName}...`}
description={`You've joined ${invitation?.credentialSetName}. You'll be redirected to connect your ${providerName} account.`}
/>
</InviteLayout>
)
}
// Not logged in
if (!session?.user) {
const callbackUrl = encodeURIComponent(getCallbackUrl())
return (
<InviteLayout>
<InviteStatusCard
type='login'
title='Join Email Polling Group'
description={`You've been invited to join ${invitation?.credentialSetName} by ${invitation?.organizationName}. Sign in or create an account to accept this invitation.`}
icon='mail'
actions={[
{
label: 'Sign in',
onClick: () => router.push(`/login?callbackUrl=${callbackUrl}&invite_flow=true`),
},
{
label: 'Create an account',
onClick: () =>
router.push(`/signup?callbackUrl=${callbackUrl}&invite_flow=true&new=true`),
variant: 'outline' as const,
},
{
label: 'Return to Home',
onClick: () => router.push('/'),
variant: 'ghost' as const,
},
]}
/>
</InviteLayout>
)
}
// Logged in - show invitation
return (
<InviteLayout>
<InviteStatusCard
type='invitation'
title='Join Email Polling Group'
description={
<>
You've been invited to join {invitation?.credentialSetName} by{' '}
{invitation?.organizationName}.
{invitation?.providerId && (
<> You'll be asked to connect your {providerWithIcon} account after accepting.</>
)}
</>
}
icon='mail'
actions={[
{
label: `Accept & Connect ${providerName}`,
onClick: handleAccept,
disabled: accepting,
loading: accepting,
},
{
label: 'Return to Home',
onClick: () => router.push('/'),
variant: 'ghost' as const,
},
]}
/>
</InviteLayout>
)
}
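The "already connected" branch above hinges on one check against `/api/auth/oauth/connections`: does any connection match the credential set's provider and carry at least one linked account? A self-contained sketch of that predicate, using the same shape the page destructures:

```typescript
interface OAuthConnection {
  provider: string
  accounts?: { id: string }[]
}

// True when the user already has a linked account for the credential
// set's provider, mirroring the invite page's connection check.
function isProviderConnected(connections: OAuthConnection[], providerId: string): boolean {
  return connections.some(
    (conn) => conn.provider === providerId && !!conn.accounts && conn.accounts.length > 0
  )
}
```

A `true` result skips the OAuth link flow and redirects straight to the workspace; `false` (or a failed fetch) falls through to `client.oauth2.link`.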

View File

@@ -473,6 +473,7 @@ export function Chat() {
/**
* Processes streaming response from workflow execution
* Reads the stream chunk by chunk and updates the message content in real-time
* When the final event arrives, extracts any additional selected outputs (model, tokens, toolCalls)
* @param stream - ReadableStream containing the workflow execution response
* @param responseMessageId - ID of the message to update with streamed content
*/
@@ -529,6 +530,35 @@ export function Chat() {
return
}
if (
selectedOutputs.length > 0 &&
'logs' in result &&
Array.isArray(result.logs) &&
activeWorkflowId
) {
const additionalOutputs: string[] = []
for (const outputId of selectedOutputs) {
const blockId = extractBlockIdFromOutputId(outputId)
const path = extractPathFromOutputId(outputId, blockId)
if (path === 'content') continue
const outputValue = extractOutputFromLogs(result.logs as BlockLog[], outputId)
if (outputValue !== undefined) {
const formattedValue =
typeof outputValue === 'string' ? outputValue : JSON.stringify(outputValue)
if (formattedValue) {
additionalOutputs.push(`**${path}:** ${formattedValue}`)
}
}
}
if (additionalOutputs.length > 0) {
appendMessageContent(responseMessageId, `\n\n${additionalOutputs.join('\n\n')}`)
}
}
finalizeMessageStream(responseMessageId)
} else if (contentChunk) {
accumulatedContent += contentChunk
@@ -552,7 +582,7 @@ export function Chat() {
focusInput(100)
}
},
[appendMessageContent, finalizeMessageStream, focusInput]
[appendMessageContent, finalizeMessageStream, focusInput, selectedOutputs, activeWorkflowId]
)
/**
@@ -564,7 +594,6 @@ export function Chat() {
if (!result || !activeWorkflowId) return
if (typeof result !== 'object') return
// Handle streaming response
if ('stream' in result && result.stream instanceof ReadableStream) {
const responseMessageId = crypto.randomUUID()
addMessage({
@@ -578,7 +607,6 @@ export function Chat() {
return
}
// Handle success with logs
if ('success' in result && result.success && 'logs' in result && Array.isArray(result.logs)) {
selectedOutputs
.map((outputId) => extractOutputFromLogs(result.logs as BlockLog[], outputId))
@@ -596,7 +624,6 @@ export function Chat() {
return
}
// Handle error response
if ('success' in result && !result.success) {
const errorMessage =
'error' in result && typeof result.error === 'string'
@@ -622,7 +649,6 @@ export function Chat() {
const sentMessage = chatMessage.trim()
// Update prompt history (only if new unique message)
if (sentMessage && promptHistory[promptHistory.length - 1] !== sentMessage) {
setPromptHistory((prev) => [...prev, sentMessage])
}
@@ -631,10 +657,8 @@ export function Chat() {
const conversationId = getConversationId(activeWorkflowId)
try {
// Process file attachments
const attachmentsWithData = await processFileAttachments(chatFiles)
// Add user message
const messageContent =
sentMessage || (chatFiles.length > 0 ? `Uploaded ${chatFiles.length} file(s)` : '')
addMessage({
@@ -644,7 +668,6 @@ export function Chat() {
attachments: attachmentsWithData,
})
// Prepare workflow input
const workflowInput: {
input: string
conversationId: string
@@ -667,13 +690,11 @@ export function Chat() {
}
}
// Clear input and files
setChatMessage('')
clearFiles()
clearErrors()
focusInput(10)
// Execute workflow
const result = await handleRunWorkflow(workflowInput)
handleWorkflowResponse(result)
} catch (error) {

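The additional-output handling in the final stream event boils down to three rules: skip the main `content` path, stringify non-string values, and drop empties. A sketch of that formatting step, assuming the values have already been extracted from the block logs:

```typescript
// Format extra selected outputs as markdown lines, matching the chat
// stream handler: the main `content` path is skipped, non-strings are
// JSON-stringified, and empty or missing values are dropped.
function formatAdditionalOutputs(outputs: { path: string; value: unknown }[]): string[] {
  const lines: string[] = []
  for (const { path, value } of outputs) {
    if (path === 'content' || value === undefined) continue
    const formatted = typeof value === 'string' ? value : JSON.stringify(value)
    if (formatted) lines.push(`**${path}:** ${formatted}`)
  }
  return lines
}
```

The handler then joins these lines with blank lines and appends them to the streamed message before finalizing it.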
View File

@@ -2,8 +2,10 @@
import { useCallback, useEffect, useMemo, useState } from 'react'
import { createLogger } from '@sim/logger'
import { ExternalLink } from 'lucide-react'
import { ExternalLink, Users } from 'lucide-react'
import { Button, Combobox } from '@/components/emcn/components'
import { getSubscriptionStatus } from '@/lib/billing/client'
import { getPollingProviderFromOAuth } from '@/lib/credential-sets/providers'
import {
getCanonicalScopesForProvider,
getProviderIdFromServiceId,
@@ -15,7 +17,11 @@ import { OAuthRequiredModal } from '@/app/workspace/[workspaceId]/w/[workflowId]
import { useDependsOnGate } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/hooks/use-depends-on-gate'
import { useSubBlockValue } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/hooks/use-sub-block-value'
import type { SubBlockConfig } from '@/blocks/types'
import { CREDENTIAL, CREDENTIAL_SET } from '@/executor/constants'
import { useCredentialSets } from '@/hooks/queries/credential-sets'
import { useOAuthCredentialDetail, useOAuthCredentials } from '@/hooks/queries/oauth-credentials'
import { useOrganizations } from '@/hooks/queries/organization'
import { useSubscriptionData } from '@/hooks/queries/subscription'
import { getMissingRequiredScopes } from '@/hooks/use-oauth-scope-status'
import { useWorkflowRegistry } from '@/stores/workflows/registry/store'
@@ -45,6 +51,19 @@ export function CredentialSelector({
const requiredScopes = subBlock.requiredScopes || []
const label = subBlock.placeholder || 'Select credential'
const serviceId = subBlock.serviceId || ''
const supportsCredentialSets = subBlock.supportsCredentialSets || false
const { data: organizationsData } = useOrganizations()
const { data: subscriptionData } = useSubscriptionData()
const activeOrganization = organizationsData?.activeOrganization
const subscriptionStatus = getSubscriptionStatus(subscriptionData?.data)
const hasTeamPlan = subscriptionStatus.isTeam || subscriptionStatus.isEnterprise
const canUseCredentialSets = supportsCredentialSets && hasTeamPlan && !!activeOrganization?.id
const { data: credentialSets = [] } = useCredentialSets(
activeOrganization?.id,
canUseCredentialSets
)
const { depsSatisfied, dependsOn } = useDependsOnGate(blockId, subBlock, { disabled, isPreview })
const hasDependencies = dependsOn.length > 0
@@ -52,7 +71,12 @@ export function CredentialSelector({
const effectiveDisabled = disabled || (hasDependencies && !depsSatisfied)
const effectiveValue = isPreview && previewValue !== undefined ? previewValue : storeValue
const selectedId = typeof effectiveValue === 'string' ? effectiveValue : ''
const rawSelectedId = typeof effectiveValue === 'string' ? effectiveValue : ''
const isCredentialSetSelected = rawSelectedId.startsWith(CREDENTIAL_SET.PREFIX)
const selectedId = isCredentialSetSelected ? '' : rawSelectedId
const selectedCredentialSetId = isCredentialSetSelected
? rawSelectedId.slice(CREDENTIAL_SET.PREFIX.length)
: ''
const effectiveProviderId = useMemo(
() => getProviderIdFromServiceId(serviceId) as OAuthProvider,
@@ -87,11 +111,20 @@ export function CredentialSelector({
const hasForeignMeta = foreignCredentials.length > 0
const isForeign = Boolean(selectedId && !selectedCredential && hasForeignMeta)
const selectedCredentialSet = useMemo(
() => credentialSets.find((cs) => cs.id === selectedCredentialSetId),
[credentialSets, selectedCredentialSetId]
)
const isForeignCredentialSet = Boolean(isCredentialSetSelected && !selectedCredentialSet)
const resolvedLabel = useMemo(() => {
if (selectedCredentialSet) return selectedCredentialSet.name
if (isForeignCredentialSet) return CREDENTIAL.FOREIGN_LABEL
if (selectedCredential) return selectedCredential.name
if (isForeign) return 'Saved by collaborator'
if (isForeign) return CREDENTIAL.FOREIGN_LABEL
return ''
}, [selectedCredential, isForeign])
}, [selectedCredentialSet, isForeignCredentialSet, selectedCredential, isForeign])
useEffect(() => {
if (!isEditing) {
@@ -148,6 +181,15 @@ export function CredentialSelector({
[isPreview, setStoreValue]
)
const handleCredentialSetSelect = useCallback(
(credentialSetId: string) => {
if (isPreview) return
setStoreValue(`${CREDENTIAL_SET.PREFIX}${credentialSetId}`)
setIsEditing(false)
},
[isPreview, setStoreValue]
)
const handleAddCredential = useCallback(() => {
setShowOAuthModal(true)
}, [])
@@ -176,7 +218,56 @@ export function CredentialSelector({
.join(' ')
}, [])
const comboboxOptions = useMemo(() => {
const { comboboxOptions, comboboxGroups } = useMemo(() => {
const pollingProviderId = getPollingProviderFromOAuth(effectiveProviderId)
// Handle both old ('gmail') and new ('google-email') provider IDs for backwards compatibility
const matchesProvider = (csProviderId: string | null) => {
if (!csProviderId || !pollingProviderId) return false
if (csProviderId === pollingProviderId) return true
// Handle legacy 'gmail' mapping to 'google-email'
if (pollingProviderId === 'google-email' && csProviderId === 'gmail') return true
return false
}
const filteredCredentialSets = pollingProviderId
? credentialSets.filter((cs) => matchesProvider(cs.providerId))
: []
if (canUseCredentialSets && filteredCredentialSets.length > 0) {
const groups = []
groups.push({
section: 'Polling Groups',
items: filteredCredentialSets.map((cs) => ({
label: cs.name,
value: `${CREDENTIAL_SET.PREFIX}${cs.id}`,
})),
})
const credentialItems = credentials.map((cred) => ({
label: cred.name,
value: cred.id,
}))
if (credentialItems.length > 0) {
groups.push({
section: 'Personal Credential',
items: credentialItems,
})
} else {
groups.push({
section: 'Personal Credential',
items: [
{
label: `Connect ${getProviderName(provider)} account`,
value: '__connect_account__',
},
],
})
}
return { comboboxOptions: [], comboboxGroups: groups }
}
const options = credentials.map((cred) => ({
label: cred.name,
value: cred.id,
@@ -189,14 +280,32 @@ export function CredentialSelector({
})
}
return options
}, [credentials, provider, getProviderName])
return { comboboxOptions: options, comboboxGroups: undefined }
}, [
credentials,
provider,
effectiveProviderId,
getProviderName,
canUseCredentialSets,
credentialSets,
])
const selectedCredentialProvider = selectedCredential?.provider ?? provider
const overlayContent = useMemo(() => {
if (!inputValue) return null
if (isCredentialSetSelected && selectedCredentialSet) {
return (
<div className='flex w-full items-center truncate'>
<div className='mr-2 flex-shrink-0 opacity-90'>
<Users className='h-3 w-3' />
</div>
<span className='truncate'>{inputValue}</span>
</div>
)
}
return (
<div className='flex w-full items-center truncate'>
<div className='mr-2 flex-shrink-0 opacity-90'>
@@ -205,7 +314,13 @@ export function CredentialSelector({
<span className='truncate'>{inputValue}</span>
</div>
)
}, [getProviderIcon, inputValue, selectedCredentialProvider])
}, [
getProviderIcon,
inputValue,
selectedCredentialProvider,
isCredentialSetSelected,
selectedCredentialSet,
])
const handleComboboxChange = useCallback(
(value: string) => {
@@ -214,6 +329,16 @@ export function CredentialSelector({
return
}
if (value.startsWith(CREDENTIAL_SET.PREFIX)) {
const credentialSetId = value.slice(CREDENTIAL_SET.PREFIX.length)
const matchedSet = credentialSets.find((cs) => cs.id === credentialSetId)
if (matchedSet) {
setInputValue(matchedSet.name)
handleCredentialSetSelect(credentialSetId)
return
}
}
const matchedCred = credentials.find((c) => c.id === value)
if (matchedCred) {
setInputValue(matchedCred.name)
@@ -224,15 +349,16 @@ export function CredentialSelector({
setIsEditing(true)
setInputValue(value)
},
[credentials, handleAddCredential, handleSelect]
[credentials, credentialSets, handleAddCredential, handleSelect, handleCredentialSetSelect]
)
return (
<div>
<Combobox
options={comboboxOptions}
groups={comboboxGroups}
value={inputValue}
selectedValue={selectedId}
selectedValue={rawSelectedId}
onChange={handleComboboxChange}
onOpenChange={handleOpenChange}
placeholder={
@@ -240,10 +366,10 @@ export function CredentialSelector({
}
disabled={effectiveDisabled}
editable={true}
filterOptions={true}
filterOptions={!isForeign && !isForeignCredentialSet}
isLoading={credentialsLoading}
overlayContent={overlayContent}
className={selectedId ? 'pl-[28px]' : ''}
className={selectedId || isCredentialSetSelected ? 'pl-[28px]' : ''}
/>
{needsUpdate && (

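Credential sets share the same stored string field as plain credential ids, distinguished only by a prefix. A sketch of the encode/decode round trip the selector performs (the literal `'credential-set:'` prefix here is an assumption; the real value lives in `CREDENTIAL_SET.PREFIX`):

```typescript
// Hypothetical prefix value; the real constant is CREDENTIAL_SET.PREFIX.
const PREFIX = 'credential-set:'

// Encode a credential-set selection into the shared string field.
const encodeCredentialSet = (id: string): string => `${PREFIX}${id}`

// Decode: returns the credential-set id, or null when the stored value
// is an ordinary credential id, mirroring the selector's branching on
// rawSelectedId.startsWith(...) / rawSelectedId.slice(...).
function decodeCredentialSet(value: string): string | null {
  return value.startsWith(PREFIX) ? value.slice(PREFIX.length) : null
}
```

This keeps the sub-block value a single string while letting the UI render either a personal credential or a polling group.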
View File

@@ -10,6 +10,7 @@ import {
parseProvider,
} from '@/lib/oauth'
import { OAuthRequiredModal } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/editor/components/sub-block/components/credential-selector/components/oauth-required-modal'
import { CREDENTIAL } from '@/executor/constants'
import { useOAuthCredentialDetail, useOAuthCredentials } from '@/hooks/queries/oauth-credentials'
import { getMissingRequiredScopes } from '@/hooks/use-oauth-scope-status'
import { useWorkflowRegistry } from '@/stores/workflows/registry/store'
@@ -95,7 +96,7 @@ export function ToolCredentialSelector({
const resolvedLabel = useMemo(() => {
if (selectedCredential) return selectedCredential.name
if (isForeign) return 'Saved by collaborator'
if (isForeign) return CREDENTIAL.FOREIGN_LABEL
return ''
}, [selectedCredential, isForeign])
@@ -210,7 +211,7 @@ export function ToolCredentialSelector({
placeholder={label}
disabled={disabled}
editable={true}
filterOptions={true}
filterOptions={!isForeign}
isLoading={credentialsLoading}
overlayContent={overlayContent}
className={selectedId ? 'pl-[28px]' : ''}

View File

@@ -885,6 +885,7 @@ export function useWorkflowExecution() {
const activeBlocksSet = new Set<string>()
const streamedContent = new Map<string, string>()
const accumulatedBlockLogs: BlockLog[] = []
// Execute the workflow
try {
@@ -933,14 +934,30 @@ export function useWorkflowExecution() {
// Edges already tracked in onBlockStarted, no need to track again
const startedAt = new Date(Date.now() - data.durationMs).toISOString()
const endedAt = new Date().toISOString()
// Accumulate block log for the execution result
accumulatedBlockLogs.push({
blockId: data.blockId,
blockName: data.blockName || 'Unknown Block',
blockType: data.blockType || 'unknown',
input: data.input || {},
output: data.output,
success: true,
durationMs: data.durationMs,
startedAt,
endedAt,
})
// Add to console
addConsole({
input: data.input || {},
output: data.output,
success: true,
durationMs: data.durationMs,
startedAt: new Date(Date.now() - data.durationMs).toISOString(),
endedAt: new Date().toISOString(),
startedAt,
endedAt,
workflowId: activeWorkflowId,
blockId: data.blockId,
executionId: executionId || uuidv4(),
@@ -967,6 +984,24 @@ export function useWorkflowExecution() {
// Track failed block execution in run path
setBlockRunStatus(data.blockId, 'error')
const startedAt = new Date(Date.now() - data.durationMs).toISOString()
const endedAt = new Date().toISOString()
// Accumulate block error log for the execution result
accumulatedBlockLogs.push({
blockId: data.blockId,
blockName: data.blockName || 'Unknown Block',
blockType: data.blockType || 'unknown',
input: data.input || {},
output: {},
success: false,
error: data.error,
durationMs: data.durationMs,
startedAt,
endedAt,
})
// Add error to console
addConsole({
input: data.input || {},
@@ -974,8 +1009,8 @@ export function useWorkflowExecution() {
success: false,
error: data.error,
durationMs: data.durationMs,
startedAt: new Date(Date.now() - data.durationMs).toISOString(),
endedAt: new Date().toISOString(),
startedAt,
endedAt,
workflowId: activeWorkflowId,
blockId: data.blockId,
executionId: executionId || uuidv4(),
@@ -1029,7 +1064,7 @@ export function useWorkflowExecution() {
startTime: data.startTime,
endTime: data.endTime,
},
logs: [],
logs: accumulatedBlockLogs,
}
},
@@ -1041,7 +1076,7 @@ export function useWorkflowExecution() {
metadata: {
duration: data.duration,
},
logs: [],
logs: accumulatedBlockLogs,
}
// Only add workflow-level error if no blocks have executed yet

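Both the console entry and the newly accumulated block log derive `startedAt` by subtracting the reported duration from the completion time, so the two records now agree. A sketch of that derivation (with `now` injectable for testing):

```typescript
// Derive consistent timestamps from a block's reported duration:
// the block ended "now" and started durationMs earlier.
function deriveBlockTimes(durationMs: number, now: number = Date.now()) {
  return {
    startedAt: new Date(now - durationMs).toISOString(),
    endedAt: new Date(now).toISOString(),
  }
}
```

Computing the pair once and reusing it (as the diff does) also avoids the subtle skew of calling `Date.now()` twice.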
View File

@@ -31,7 +31,6 @@ export async function executeWorkflowWithFullLogging(
const { setActiveBlocks, setBlockRunStatus, setEdgeRunStatus } = useExecutionStore.getState()
const workflowEdges = useWorkflowStore.getState().edges
// Track active blocks for pulsing animation
const activeBlocksSet = new Set<string>()
const payload: any = {
@@ -59,7 +58,6 @@ export async function executeWorkflowWithFullLogging(
throw new Error('No response body')
}
// Parse SSE stream
const reader = response.body.getReader()
const decoder = new TextDecoder()
let buffer = ''
@@ -89,11 +87,9 @@ export async function executeWorkflowWithFullLogging(
switch (event.type) {
case 'block:started': {
// Add block to active set for pulsing animation
activeBlocksSet.add(event.data.blockId)
setActiveBlocks(new Set(activeBlocksSet))
// Track edges that led to this block as soon as execution starts
const incomingEdges = workflowEdges.filter(
(edge) => edge.target === event.data.blockId
)
@@ -104,11 +100,9 @@ export async function executeWorkflowWithFullLogging(
}
case 'block:completed':
// Remove block from active set
activeBlocksSet.delete(event.data.blockId)
setActiveBlocks(new Set(activeBlocksSet))
// Track successful block execution in run path
setBlockRunStatus(event.data.blockId, 'success')
addConsole({
@@ -134,11 +128,9 @@ export async function executeWorkflowWithFullLogging(
break
case 'block:error':
// Remove block from active set
activeBlocksSet.delete(event.data.blockId)
setActiveBlocks(new Set(activeBlocksSet))
// Track failed block execution in run path
setBlockRunStatus(event.data.blockId, 'error')
addConsole({
@@ -183,7 +175,6 @@ export async function executeWorkflowWithFullLogging(
}
} finally {
reader.releaseLock()
// Clear active blocks when execution ends
setActiveBlocks(new Set())
}

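The handler keeps one local mutable `Set` of active block ids but publishes a fresh copy on every change (`setActiveBlocks(new Set(activeBlocksSet))`), because stores that compare by reference would otherwise miss in-place mutations. A sketch of that pattern (`createActiveBlockTracker` is a hypothetical name for illustration):

```typescript
// Track active blocks with one mutable set, but hand the store a new
// Set instance on each update so reference-equality change detection
// fires — mutating the shared set in place would not.
function createActiveBlockTracker(publish: (blocks: Set<string>) => void) {
  const active = new Set<string>()
  return {
    start(blockId: string) {
      active.add(blockId)
      publish(new Set(active))
    },
    finish(blockId: string) {
      active.delete(blockId)
      publish(new Set(active))
    },
  }
}
```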
View File

@@ -2,7 +2,7 @@
import { useState } from 'react'
import { createLogger } from '@sim/logger'
import { Eye, EyeOff } from 'lucide-react'
import { Crown, Eye, EyeOff } from 'lucide-react'
import { useParams } from 'next/navigation'
import {
Button,
@@ -81,7 +81,9 @@ export function BYOK() {
const params = useParams()
const workspaceId = (params?.workspaceId as string) || ''
const { data: keys = [], isLoading } = useBYOKKeys(workspaceId)
const { data, isLoading } = useBYOKKeys(workspaceId)
const keys = data?.keys ?? []
const byokEnabled = data?.byokEnabled ?? true
const upsertKey = useUpsertBYOKKey()
const deleteKey = useDeleteBYOKKey()
@@ -96,6 +98,31 @@ export function BYOK() {
return keys.find((k) => k.providerId === providerId)
}
// Show enterprise-only gate if BYOK is not enabled
if (!isLoading && !byokEnabled) {
return (
<div className='flex h-full flex-col items-center justify-center gap-[16px] py-[32px]'>
<div className='flex h-[48px] w-[48px] items-center justify-center rounded-full bg-[var(--surface-6)]'>
<Crown className='h-[24px] w-[24px] text-[var(--amber-9)]' />
</div>
<div className='flex flex-col items-center gap-[8px] text-center'>
<h3 className='font-medium text-[15px] text-[var(--text-primary)]'>Enterprise Feature</h3>
<p className='max-w-[320px] text-[13px] text-[var(--text-secondary)]'>
Bring Your Own Key (BYOK) is available exclusively on the Enterprise plan. Upgrade to
use your own API keys and eliminate the 2x cost multiplier.
</p>
</div>
<Button
variant='primary'
className='!bg-[var(--brand-tertiary-2)] !text-[var(--text-inverse)] hover:!bg-[var(--brand-tertiary-2)]/90'
onClick={() => window.open('https://sim.ai/enterprise', '_blank')}
>
Contact Sales
</Button>
</div>
)
}
const handleSave = async () => {
if (!editingProvider || !apiKeyInput.trim()) return

View File

@@ -1,6 +1,7 @@
export { ApiKeys } from './api-keys/api-keys'
export { BYOK } from './byok/byok'
export { Copilot } from './copilot/copilot'
export { CredentialSets } from './credential-sets/credential-sets'
export { CustomTools } from './custom-tools/custom-tools'
export { EnvironmentVariables } from './environment/environment'
export { Files as FileUploads } from './files/files'

View File

@@ -1,7 +1,7 @@
'use client'
import React, { useMemo, useState } from 'react'
import { CheckCircle, ChevronDown } from 'lucide-react'
import { ChevronDown } from 'lucide-react'
import {
Button,
Checkbox,
@@ -302,14 +302,11 @@ export function MemberInvitationCard({
{/* Success message */}
{inviteSuccess && (
<div className='flex items-start gap-[8px] rounded-[6px] bg-green-500/10 px-[10px] py-[8px] text-green-600 dark:text-green-400'>
<CheckCircle className='h-4 w-4 flex-shrink-0' />
<p className='text-[12px]'>
Invitation sent successfully
{selectedCount > 0 &&
` with access to ${selectedCount} workspace${selectedCount !== 1 ? 's' : ''}`}
</p>
</div>
<p className='text-[11px] text-[var(--text-success)] leading-tight'>
Invitation sent successfully
{selectedCount > 0 &&
` with access to ${selectedCount} workspace${selectedCount !== 1 ? 's' : ''}`}
</p>
)}
</div>
</div>

View File

@@ -5,7 +5,11 @@ import { createLogger } from '@sim/logger'
import { Avatar, AvatarFallback, AvatarImage, Badge, Button } from '@/components/emcn'
import type { Invitation, Member, Organization } from '@/lib/workspaces/organization'
import { getUserColor } from '@/app/workspace/[workspaceId]/w/utils/get-user-color'
import { useCancelInvitation, useOrganizationMembers } from '@/hooks/queries/organization'
import {
useCancelInvitation,
useOrganizationMembers,
useResendInvitation,
} from '@/hooks/queries/organization'
const logger = createLogger('TeamMembers')
@@ -46,12 +50,16 @@ export function TeamMembers({
onRemoveMember,
}: TeamMembersProps) {
const [cancellingInvitations, setCancellingInvitations] = useState<Set<string>>(new Set())
const [resendingInvitations, setResendingInvitations] = useState<Set<string>>(new Set())
const [resentInvitations, setResentInvitations] = useState<Set<string>>(new Set())
const [resendCooldowns, setResendCooldowns] = useState<Record<string, number>>({})
const { data: memberUsageResponse, isLoading: isLoadingUsage } = useOrganizationMembers(
organization?.id || ''
)
const cancelInvitationMutation = useCancelInvitation()
const resendInvitationMutation = useResendInvitation()
const memberUsageData: Record<string, number> = {}
if (memberUsageResponse?.data) {
@@ -140,6 +148,54 @@ export function TeamMembers({
}
}
const handleResendInvitation = async (invitationId: string) => {
if (!organization?.id) return
const secondsLeft = resendCooldowns[invitationId]
if (secondsLeft && secondsLeft > 0) return
setResendingInvitations((prev) => new Set([...prev, invitationId]))
try {
await resendInvitationMutation.mutateAsync({
invitationId,
orgId: organization.id,
})
setResentInvitations((prev) => new Set([...prev, invitationId]))
setTimeout(() => {
setResentInvitations((prev) => {
const next = new Set(prev)
next.delete(invitationId)
return next
})
}, 4000)
// Start 60s cooldown
setResendCooldowns((prev) => ({ ...prev, [invitationId]: 60 }))
const interval = setInterval(() => {
setResendCooldowns((prev) => {
const current = prev[invitationId]
if (current === undefined) return prev
if (current <= 1) {
const next = { ...prev }
delete next[invitationId]
clearInterval(interval)
return next
}
return { ...prev, [invitationId]: current - 1 }
})
}, 1000)
} catch (error) {
logger.error('Failed to resend invitation', { error })
} finally {
setResendingInvitations((prev) => {
const next = new Set(prev)
next.delete(invitationId)
return next
})
}
}
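The cooldown logic in `handleResendInvitation` decrements a per-invitation seconds map once a second and deletes the entry (clearing its interval) at zero. The state transition can be sketched as a pure function — `tick` is a hypothetical name, extracted here only to show the update rule:

```typescript
// One countdown step for a map of per-key cooldowns, mirroring the
// setInterval updater above: decrement, and drop the entry at zero.
type Cooldowns = Record<string, number>

function tick(prev: Cooldowns, key: string): Cooldowns {
  const current = prev[key]
  if (current === undefined) return prev // no active cooldown for this key
  if (current <= 1) {
    const next = { ...prev }
    delete next[key] // cooldown finished; the interval is cleared at this point
    return next
  }
  return { ...prev, [key]: current - 1 }
}
```

Returning the same object when the key is absent matters for React: an unchanged reference lets `setResendCooldowns` skip a re-render.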
return (
<div className='flex flex-col gap-[16px]'>
{/* Header */}
@@ -148,13 +204,13 @@ export function TeamMembers({
</div>
{/* Members list */}
<div className='flex flex-col gap-[16px]'>
<div className='flex flex-col gap-[8px]'>
{teamItems.map((item) => (
<div key={item.id} className='flex items-center justify-between'>
{/* Left section: Avatar + Name/Role + Action buttons */}
<div className='flex flex-1 items-center gap-[12px]'>
{/* Avatar */}
<Avatar size='sm'>
<Avatar className='h-9 w-9'>
{item.avatarUrl && <AvatarImage src={item.avatarUrl} alt={item.name} />}
<AvatarFallback
style={{ background: getUserColor(item.userId || item.email) }}
@@ -179,32 +235,60 @@ export function TeamMembers({
</Badge>
)}
{item.type === 'invitation' && (
<Badge variant='amber' size='sm'>
<Badge variant='gray-secondary' size='sm'>
Pending
</Badge>
)}
</div>
<div className='truncate text-[12px] text-[var(--text-muted)]'>{item.email}</div>
<div className='truncate text-[13px] text-[var(--text-muted)]'>{item.email}</div>
</div>
{/* Action buttons */}
{isAdminOrOwner && (
<>
{/* Admin/Owner can remove other members */}
{item.type === 'member' &&
item.role !== 'owner' &&
item.email !== currentUserEmail && (
<Button
variant='ghost'
onClick={() => onRemoveMember(item.member)}
className='h-8'
>
Remove
</Button>
)}
{/* Action buttons for members */}
{isAdminOrOwner &&
item.type === 'member' &&
item.role !== 'owner' &&
item.email !== currentUserEmail && (
<Button
variant='ghost'
onClick={() => onRemoveMember(item.member)}
className='h-8'
>
Remove
</Button>
)}
</div>
{/* Admin can cancel invitations */}
{item.type === 'invitation' && (
{/* Right section */}
{isAdminOrOwner && (
<div className='ml-[16px] flex flex-col items-end'>
{item.type === 'member' ? (
<>
<div className='text-[12px] text-[var(--text-muted)]'>Usage</div>
<div className='font-medium text-[12px] text-[var(--text-primary)] tabular-nums'>
{isLoadingUsage ? (
<span className='inline-block h-3 w-12 animate-pulse rounded-[4px] bg-[var(--surface-4)]' />
) : (
item.usage
)}
</div>
</>
) : (
<div className='flex items-center gap-[4px]'>
<Button
variant='ghost'
onClick={() => handleResendInvitation(item.invitation.id)}
disabled={
resendingInvitations.has(item.invitation.id) ||
(resendCooldowns[item.invitation.id] ?? 0) > 0
}
className='h-8'
>
{resendingInvitations.has(item.invitation.id)
? 'Sending...'
: resendCooldowns[item.invitation.id]
? `Resend (${resendCooldowns[item.invitation.id]}s)`
: 'Resend'}
</Button>
<Button
variant='ghost'
onClick={() => handleCancelInvitation(item.invitation.id)}
@@ -213,22 +297,8 @@ export function TeamMembers({
>
{cancellingInvitations.has(item.invitation.id) ? 'Cancelling...' : 'Cancel'}
</Button>
)}
</>
)}
</div>
{/* Right section: Usage column (right-aligned) */}
{isAdminOrOwner && (
<div className='ml-[16px] flex flex-col items-end'>
<div className='text-[12px] text-[var(--text-muted)]'>Usage</div>
<div className='font-medium text-[12px] text-[var(--text-primary)] tabular-nums'>
{isLoadingUsage && item.type === 'member' ? (
<span className='inline-block h-3 w-12 animate-pulse rounded-[4px] bg-[var(--surface-4)]' />
) : (
item.usage
)}
</div>
</div>
)}
</div>
)}
</div>

View File

@@ -4,7 +4,7 @@ import { useCallback, useEffect, useMemo, useRef, useState } from 'react'
import * as DialogPrimitive from '@radix-ui/react-dialog'
import * as VisuallyHidden from '@radix-ui/react-visually-hidden'
import { useQueryClient } from '@tanstack/react-query'
import { Files, KeySquare, LogIn, Server, Settings, User, Users, Wrench } from 'lucide-react'
import { Files, KeySquare, LogIn, Mail, Server, Settings, User, Users, Wrench } from 'lucide-react'
import {
Card,
Connections,
@@ -32,6 +32,7 @@ import {
ApiKeys,
BYOK,
Copilot,
CredentialSets,
CustomTools,
EnvironmentVariables,
FileUploads,
@@ -52,6 +53,7 @@ import { useSettingsModalStore } from '@/stores/settings-modal/store'
const isBillingEnabled = isTruthy(getEnv('NEXT_PUBLIC_BILLING_ENABLED'))
const isSSOEnabled = isTruthy(getEnv('NEXT_PUBLIC_SSO_ENABLED'))
const isCredentialSetsEnabled = isTruthy(getEnv('NEXT_PUBLIC_CREDENTIAL_SETS_ENABLED'))
interface SettingsModalProps {
open: boolean
@@ -63,6 +65,7 @@ type SettingsSection =
| 'environment'
| 'template-profile'
| 'integrations'
| 'credential-sets'
| 'apikeys'
| 'byok'
| 'files'
@@ -84,8 +87,8 @@ type NavigationItem = {
hideWhenBillingDisabled?: boolean
requiresTeam?: boolean
requiresEnterprise?: boolean
requiresOwner?: boolean
requiresHosted?: boolean
selfHostedOverride?: boolean
}
const sectionConfig: { key: NavigationSection; title: string }[] = [
@@ -111,11 +114,20 @@ const allNavigationItems: NavigationItem[] = [
icon: Users,
section: 'subscription',
hideWhenBillingDisabled: true,
requiresHosted: true,
requiresTeam: true,
},
{ id: 'integrations', label: 'Integrations', icon: Connections, section: 'tools' },
{ id: 'custom-tools', label: 'Custom Tools', icon: Wrench, section: 'tools' },
{ id: 'mcp', label: 'MCP Tools', icon: McpIcon, section: 'tools' },
{
id: 'credential-sets',
label: 'Email Polling',
icon: Mail,
section: 'system',
requiresHosted: true,
selfHostedOverride: isCredentialSetsEnabled,
},
{ id: 'environment', label: 'Environment', icon: FolderCode, section: 'system' },
{ id: 'apikeys', label: 'API Keys', icon: Key, section: 'system' },
{ id: 'workflow-mcp-servers', label: 'Deployed MCPs', icon: Server, section: 'system' },
@@ -125,6 +137,7 @@ const allNavigationItems: NavigationItem[] = [
icon: KeySquare,
section: 'system',
requiresHosted: true,
requiresEnterprise: true,
},
{
id: 'copilot',
@@ -139,9 +152,9 @@ const allNavigationItems: NavigationItem[] = [
label: 'Single Sign-On',
icon: LogIn,
section: 'system',
requiresTeam: true,
requiresHosted: true,
requiresEnterprise: true,
requiresOwner: true,
selfHostedOverride: isSSOEnabled,
},
]
@@ -164,8 +177,9 @@ export function SettingsModal({ open, onOpenChange }: SettingsModalProps) {
const userRole = getUserRole(activeOrganization, userEmail)
const isOwner = userRole === 'owner'
const isAdmin = userRole === 'admin'
const canManageSSO = isOwner || isAdmin
const isOrgAdminOrOwner = isOwner || isAdmin
const subscriptionStatus = getSubscriptionStatus(subscriptionData?.data)
const hasTeamPlan = subscriptionStatus.isTeam || subscriptionStatus.isEnterprise
const hasEnterprisePlan = subscriptionStatus.isEnterprise
const hasOrganization = !!activeOrganization?.id
@@ -183,29 +197,19 @@ export function SettingsModal({ open, onOpenChange }: SettingsModalProps) {
return false
}
// SSO has special logic that must be checked before requiresTeam
if (item.id === 'sso') {
if (isHosted) {
return hasOrganization && hasEnterprisePlan && canManageSSO
if (item.selfHostedOverride && !isHosted) {
if (item.id === 'sso') {
const hasProviders = (ssoProvidersData?.providers?.length ?? 0) > 0
return !hasProviders || isSSOProviderOwner === true
}
// For self-hosted, only show SSO tab if explicitly enabled via environment variable
if (!isSSOEnabled) return false
// Show tab if user is the SSO provider owner, or if no providers exist yet (to allow initial setup)
const hasProviders = (ssoProvidersData?.providers?.length ?? 0) > 0
return !hasProviders || isSSOProviderOwner === true
return true
}
if (item.requiresTeam) {
const isMember = userRole === 'member' || isAdmin
const hasTeamPlan = subscriptionStatus.isTeam || subscriptionStatus.isEnterprise
if (isMember) return true
if (isOwner && hasTeamPlan) return true
if (item.requiresTeam && (!hasTeamPlan || !isOrgAdminOrOwner)) {
return false
}
if (item.requiresEnterprise && !hasEnterprisePlan) {
if (item.requiresEnterprise && (!hasEnterprisePlan || !isOrgAdminOrOwner)) {
return false
}
@@ -213,24 +217,17 @@ export function SettingsModal({ open, onOpenChange }: SettingsModalProps) {
return false
}
if (item.requiresOwner && !isOwner) {
return false
}
return true
})
}, [
hasOrganization,
hasTeamPlan,
hasEnterprisePlan,
canManageSSO,
isOrgAdminOrOwner,
isSSOProviderOwner,
isSSOEnabled,
ssoProvidersData?.providers?.length,
isOwner,
isAdmin,
userRole,
subscriptionStatus.isTeam,
subscriptionStatus.isEnterprise,
])
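The consolidated visibility filter above combines `requiresHosted`, `requiresTeam`, `requiresEnterprise`, and `selfHostedOverride` into a single pass. A hypothetical standalone form of that gating (simplified — it omits the SSO-specific provider-ownership branch):

```typescript
// Sketch of nav-item gating: an item renders only if every declared
// requirement is satisfied for the current deployment and plan.
interface NavItem {
  requiresTeam?: boolean
  requiresEnterprise?: boolean
  requiresHosted?: boolean
  selfHostedOverride?: boolean
}

interface Ctx {
  isHosted: boolean
  hasTeamPlan: boolean
  hasEnterprisePlan: boolean
  isOrgAdminOrOwner: boolean
}

function isVisible(item: NavItem, ctx: Ctx): boolean {
  // A self-hosted override (e.g. an env flag) bypasses hosted-only gating
  if (item.selfHostedOverride && !ctx.isHosted) return true
  if (item.requiresHosted && !ctx.isHosted) return false
  if (item.requiresTeam && (!ctx.hasTeamPlan || !ctx.isOrgAdminOrOwner)) return false
  if (item.requiresEnterprise && (!ctx.hasEnterprisePlan || !ctx.isOrgAdminOrOwner)) return false
  return true
}
```

Note how plan checks are also gated on `isOrgAdminOrOwner`, matching the diff's change from per-item role flags (`requiresOwner`) to a single admin-or-owner predicate.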
// Memoized callbacks to prevent infinite loops in child components
@@ -462,6 +459,7 @@ export function SettingsModal({ open, onOpenChange }: SettingsModalProps) {
registerCloseHandler={registerIntegrationsCloseHandler}
/>
)}
{activeSection === 'credential-sets' && <CredentialSets />}
{activeSection === 'apikeys' && <ApiKeys onOpenChange={onOpenChange} />}
{activeSection === 'files' && <FileUploads />}
{isBillingEnabled && activeSection === 'subscription' && <Subscription />}

View File

@@ -2,6 +2,7 @@
import React, { type KeyboardEvent, useCallback, useEffect, useRef, useState } from 'react'
import { createLogger } from '@sim/logger'
import { Paperclip, X } from 'lucide-react'
import { useParams } from 'next/navigation'
import {
Button,
@@ -40,9 +41,12 @@ interface PendingInvitation {
export function InviteModal({ open, onOpenChange, workspaceName }: InviteModalProps) {
const formRef = useRef<HTMLFormElement>(null)
const fileInputRef = useRef<HTMLInputElement>(null)
const [inputValue, setInputValue] = useState('')
const [emails, setEmails] = useState<string[]>([])
const [invalidEmails, setInvalidEmails] = useState<string[]>([])
const [duplicateEmails, setDuplicateEmails] = useState<string[]>([])
const [isDragging, setIsDragging] = useState(false)
const [userPermissions, setUserPermissions] = useState<UserPermissions[]>([])
const [pendingInvitations, setPendingInvitations] = useState<UserPermissions[]>([])
const [isPendingInvitationsLoading, setIsPendingInvitationsLoading] = useState(false)
@@ -134,13 +138,20 @@ export function InviteModal({ open, onOpenChange, workspaceName }: InviteModalPr
const validation = quickValidateEmail(normalized)
const isValid = validation.isValid
if (emails.includes(normalized) || invalidEmails.includes(normalized)) {
if (
emails.includes(normalized) ||
invalidEmails.includes(normalized) ||
duplicateEmails.includes(normalized)
) {
return false
}
const hasPendingInvitation = pendingInvitations.some((inv) => inv.email === normalized)
if (hasPendingInvitation) {
setErrorMessage(`${normalized} already has a pending invitation`)
setDuplicateEmails((prev) => {
if (prev.includes(normalized)) return prev
return [...prev, normalized]
})
setInputValue('')
return false
}
@@ -149,7 +160,10 @@ export function InviteModal({ open, onOpenChange, workspaceName }: InviteModalPr
(user) => user.email === normalized
)
if (isExistingMember) {
setErrorMessage(`${normalized} is already a member of this workspace`)
setDuplicateEmails((prev) => {
if (prev.includes(normalized)) return prev
return [...prev, normalized]
})
setInputValue('')
return false
}
@@ -161,13 +175,19 @@ export function InviteModal({ open, onOpenChange, workspaceName }: InviteModalPr
}
if (!isValid) {
setInvalidEmails((prev) => [...prev, normalized])
setInvalidEmails((prev) => {
if (prev.includes(normalized)) return prev
return [...prev, normalized]
})
setInputValue('')
return false
}
setErrorMessage(null)
setEmails((prev) => [...prev, normalized])
setEmails((prev) => {
if (prev.includes(normalized)) return prev
return [...prev, normalized]
})
setUserPermissions((prev) => [
...prev,
@@ -180,7 +200,14 @@ export function InviteModal({ open, onOpenChange, workspaceName }: InviteModalPr
setInputValue('')
return true
},
[emails, invalidEmails, pendingInvitations, workspacePermissions?.users, session?.user?.email]
[
emails,
invalidEmails,
duplicateEmails,
pendingInvitations,
workspacePermissions?.users,
session?.user?.email,
]
)
const removeEmail = useCallback(
@@ -196,6 +223,80 @@ export function InviteModal({ open, onOpenChange, workspaceName }: InviteModalPr
setInvalidEmails((prev) => prev.filter((_, i) => i !== index))
}, [])
const removeDuplicateEmail = useCallback((index: number) => {
setDuplicateEmails((prev) => prev.filter((_, i) => i !== index))
}, [])
const extractEmailsFromText = useCallback((text: string): string[] => {
const emailRegex = /[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}/g
const matches = text.match(emailRegex) || []
return [...new Set(matches.map((e) => e.toLowerCase()))]
}, [])
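`extractEmailsFromText` above does three things: global-regex match, lowercase normalization, and `Set`-based de-duplication. The same logic as a standalone function, verbatim from the callback body:

```typescript
// Pull every email-shaped token out of free-form text (CSV rows, pasted
// address lists, etc.), lowercased and de-duplicated.
function extractEmails(text: string): string[] {
  const emailRegex = /[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}/g
  const matches = text.match(emailRegex) || []
  return [...new Set(matches.map((e) => e.toLowerCase()))]
}
```

Lowercasing before de-duplication is what makes `Alice@example.com` and `alice@example.com` collapse into one invite.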
const handleFileDrop = useCallback(
async (file: File) => {
try {
const text = await file.text()
const extractedEmails = extractEmailsFromText(text)
extractedEmails.forEach((email) => {
addEmail(email)
})
} catch (error) {
logger.error('Error reading dropped file', error)
}
},
[extractEmailsFromText, addEmail]
)
const handleDragOver = useCallback((e: React.DragEvent) => {
e.preventDefault()
e.stopPropagation()
e.dataTransfer.dropEffect = 'copy'
setIsDragging(true)
}, [])
const handleDragLeave = useCallback((e: React.DragEvent) => {
e.preventDefault()
e.stopPropagation()
setIsDragging(false)
}, [])
const handleDrop = useCallback(
async (e: React.DragEvent) => {
e.preventDefault()
e.stopPropagation()
setIsDragging(false)
const files = Array.from(e.dataTransfer.files)
const validFiles = files.filter(
(f) =>
f.type === 'text/csv' ||
f.type === 'text/plain' ||
f.name.endsWith('.csv') ||
f.name.endsWith('.txt')
)
for (const file of validFiles) {
await handleFileDrop(file)
}
},
[handleFileDrop]
)
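The drop handler filters `DataTransfer` files by MIME type or extension before reading them. Extracted as a predicate (the function name is illustrative; the accept rules are the ones in the diff):

```typescript
// Accept only CSV / plain-text files for the email-import drop target.
// Checking the extension as a fallback matters because browsers report
// an empty MIME type for some dragged files.
function isAcceptedFile(name: string, type: string): boolean {
  return (
    type === 'text/csv' ||
    type === 'text/plain' ||
    name.endsWith('.csv') ||
    name.endsWith('.txt')
  )
}
```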
const handleFileInputChange = useCallback(
async (e: React.ChangeEvent<HTMLInputElement>) => {
const files = e.target.files
if (!files) return
for (const file of Array.from(files)) {
await handleFileDrop(file)
}
e.target.value = ''
},
[handleFileDrop]
)
const handlePermissionChange = useCallback(
(identifier: string, permissionType: PermissionType) => {
const existingUser = workspacePermissions?.users?.find((user) => user.userId === identifier)
@@ -204,11 +305,9 @@ export function InviteModal({ open, onOpenChange, workspaceName }: InviteModalPr
setExistingUserPermissionChanges((prev) => {
const newChanges = { ...prev }
// If the new permission matches the original, remove the change entry
if (existingUser.permissionType === permissionType) {
delete newChanges[identifier]
} else {
// Otherwise, track the change
newChanges[identifier] = { permissionType }
}
@@ -297,7 +396,6 @@ export function InviteModal({ open, onOpenChange, workspaceName }: InviteModalPr
setErrorMessage(null)
try {
// Verify the user exists in workspace permissions
const userRecord = workspacePermissions?.users?.find(
(user) => user.userId === memberToRemove.userId
)
@@ -322,7 +420,6 @@ export function InviteModal({ open, onOpenChange, workspaceName }: InviteModalPr
throw new Error(data.error || 'Failed to remove member')
}
// Update the workspace permissions to remove the user
if (workspacePermissions) {
const updatedUsers = workspacePermissions.users.filter(
(user) => user.userId !== memberToRemove.userId
@@ -333,7 +430,6 @@ export function InviteModal({ open, onOpenChange, workspaceName }: InviteModalPr
})
}
// Clear any pending changes for this user
setExistingUserPermissionChanges((prev) => {
const updated = { ...prev }
delete updated[memberToRemove.userId]
@@ -384,7 +480,6 @@ export function InviteModal({ open, onOpenChange, workspaceName }: InviteModalPr
throw new Error(data.error || 'Failed to cancel invitation')
}
// Remove the invitation from the pending invitations list
setPendingInvitations((prev) =>
prev.filter((inv) => inv.invitationId !== invitationToRemove.invitationId)
)
@@ -452,7 +547,6 @@ export function InviteModal({ open, onOpenChange, workspaceName }: InviteModalPr
delete next[invitationId]
return next
})
// Start 60s cooldown
setResendCooldowns((prev) => ({ ...prev, [invitationId]: 60 }))
const interval = setInterval(() => {
setResendCooldowns((prev) => {
@@ -474,40 +568,52 @@ export function InviteModal({ open, onOpenChange, workspaceName }: InviteModalPr
const handleKeyDown = useCallback(
(e: KeyboardEvent<HTMLInputElement>) => {
if (['Enter', ',', ' '].includes(e.key) && inputValue.trim()) {
if (e.key === 'Enter') {
e.preventDefault()
if (inputValue.trim()) {
addEmail(inputValue)
}
return
}
if ([',', ' '].includes(e.key) && inputValue.trim()) {
e.preventDefault()
addEmail(inputValue)
}
if (e.key === 'Backspace' && !inputValue) {
if (invalidEmails.length > 0) {
if (duplicateEmails.length > 0) {
removeDuplicateEmail(duplicateEmails.length - 1)
} else if (invalidEmails.length > 0) {
removeInvalidEmail(invalidEmails.length - 1)
} else if (emails.length > 0) {
removeEmail(emails.length - 1)
}
}
},
[inputValue, addEmail, invalidEmails, emails, removeInvalidEmail, removeEmail]
[
inputValue,
addEmail,
duplicateEmails,
invalidEmails,
emails,
removeDuplicateEmail,
removeInvalidEmail,
removeEmail,
]
)
const handlePaste = useCallback(
(e: React.ClipboardEvent<HTMLInputElement>) => {
e.preventDefault()
const pastedText = e.clipboardData.getData('text')
const pastedEmails = pastedText.split(/[\s,;]+/).filter(Boolean)
const pastedEmails = extractEmailsFromText(pastedText)
let addedCount = 0
pastedEmails.forEach((email) => {
if (addEmail(email)) {
addedCount++
}
addEmail(email)
})
if (addedCount === 0 && pastedEmails.length === 1) {
setInputValue(inputValue + pastedEmails[0])
}
},
[addEmail, inputValue]
[addEmail, extractEmailsFromText]
)
const handleSubmit = useCallback(
@@ -518,7 +624,6 @@ export function InviteModal({ open, onOpenChange, workspaceName }: InviteModalPr
addEmail(inputValue)
}
// Clear messages at start of submission
setErrorMessage(null)
setSuccessMessage(null)
@@ -644,10 +749,11 @@ export function InviteModal({ open, onOpenChange, workspaceName }: InviteModalPr
)
const resetState = useCallback(() => {
// Batch state updates using React's automatic batching in React 18+
setInputValue('')
setEmails([])
setInvalidEmails([])
setDuplicateEmails([])
setIsDragging(false)
setUserPermissions([])
setPendingInvitations([])
setIsPendingInvitationsLoading(false)
@@ -718,7 +824,29 @@ export function InviteModal({ open, onOpenChange, workspaceName }: InviteModalPr
tabIndex={-1}
readOnly
/>
<div className='scrollbar-hide flex max-h-32 min-h-9 flex-wrap items-center gap-x-[8px] gap-y-[4px] overflow-y-auto rounded-[4px] border border-[var(--border-1)] bg-[var(--surface-4)] px-[6px] py-[4px] focus-within:outline-none'>
<input
ref={fileInputRef}
type='file'
accept='.csv,.txt,text/csv,text/plain'
onChange={handleFileInputChange}
className='hidden'
/>
<div
onDragOver={handleDragOver}
onDragLeave={handleDragLeave}
onDrop={handleDrop}
className={cn(
'scrollbar-hide relative flex max-h-32 min-h-9 flex-wrap items-center gap-x-[8px] gap-y-[4px] overflow-y-auto rounded-[4px] border border-[var(--border-1)] bg-[var(--surface-4)] px-[6px] py-[4px] transition-colors focus-within:outline-none',
isDragging && 'border-[var(--border)] border-dashed bg-[var(--surface-5)]'
)}
>
{isDragging && (
<div className='absolute inset-0 flex items-center justify-center rounded-[4px] bg-[var(--surface-5)]/90'>
<span className='text-[13px] text-[var(--text-tertiary)]'>
Drop file here
</span>
</div>
)}
{invalidEmails.map((email, index) => (
<EmailTag
key={`invalid-${index}`}
@@ -728,6 +856,25 @@ export function InviteModal({ open, onOpenChange, workspaceName }: InviteModalPr
isInvalid={true}
/>
))}
{duplicateEmails.map((email, index) => (
<div
key={`duplicate-${index}`}
className='flex w-auto items-center gap-[4px] rounded-[4px] border border-amber-500 bg-amber-500/10 px-[6px] py-[2px] text-[12px] text-amber-600 dark:bg-amber-500/20 dark:text-amber-400'
>
<span className='max-w-[200px] truncate'>{email}</span>
<span className='text-[11px] opacity-70'>duplicate</span>
{!isSubmitting && userPerms.canAdmin && (
<button
type='button'
onClick={() => removeDuplicateEmail(index)}
className='flex-shrink-0 text-amber-600 transition-colors hover:text-amber-700 focus:outline-none dark:text-amber-400 dark:hover:text-amber-300'
aria-label={`Remove ${email}`}
>
<X className='h-[12px] w-[12px] translate-y-[0.2px]' />
</button>
)}
</div>
))}
{emails.map((email, index) => (
<EmailTag
key={`valid-${index}`}
@@ -736,36 +883,52 @@ export function InviteModal({ open, onOpenChange, workspaceName }: InviteModalPr
disabled={isSubmitting || !userPerms.canAdmin}
/>
))}
<Input
id='invite-field'
name='invite_search_field'
type='text'
value={inputValue}
onChange={(e) => setInputValue(e.target.value)}
onKeyDown={handleKeyDown}
onPaste={handlePaste}
onBlur={() => inputValue.trim() && addEmail(inputValue)}
placeholder={
!userPerms.canAdmin
? 'Only administrators can invite new members'
: emails.length > 0 || invalidEmails.length > 0
? 'Add another email'
: 'Enter emails'
}
className={cn(
'h-6 min-w-[180px] flex-1 border-none bg-transparent p-0 text-[13px] focus-visible:ring-0 focus-visible:ring-offset-0',
emails.length > 0 || invalidEmails.length > 0 ? 'pl-[4px]' : 'pl-[4px]'
<div className='relative flex flex-1 items-center'>
<Input
id='invite-field'
name='invite_search_field'
type='text'
value={inputValue}
onChange={(e) => setInputValue(e.target.value)}
onKeyDown={handleKeyDown}
onPaste={handlePaste}
onBlur={() => inputValue.trim() && addEmail(inputValue)}
placeholder={
!userPerms.canAdmin
? 'Only administrators can invite new members'
: emails.length > 0 ||
invalidEmails.length > 0 ||
duplicateEmails.length > 0
? 'Add another email'
: 'Enter emails'
}
className={cn(
'h-6 min-w-[140px] flex-1 border-none bg-transparent p-0 text-[13px] focus-visible:ring-0 focus-visible:ring-offset-0',
emails.length > 0 || invalidEmails.length > 0 || duplicateEmails.length > 0
? 'pl-[4px]'
: 'pl-[4px]'
)}
autoFocus={userPerms.canAdmin}
disabled={isSubmitting || !userPerms.canAdmin}
autoComplete='off'
autoCorrect='off'
autoCapitalize='off'
spellCheck={false}
data-lpignore='true'
data-form-type='other'
aria-autocomplete='none'
/>
{userPerms.canAdmin && (
<button
type='button'
onClick={() => fileInputRef.current?.click()}
className='ml-[4px] flex-shrink-0 text-[var(--text-tertiary)] transition-colors hover:text-[var(--text-secondary)]'
disabled={isSubmitting}
>
<Paperclip className='h-[14px] w-[14px]' strokeWidth={2} />
</button>
)}
autoFocus={userPerms.canAdmin}
disabled={isSubmitting || !userPerms.canAdmin}
autoComplete='off'
autoCorrect='off'
autoCapitalize='off'
spellCheck={false}
data-lpignore='true'
data-form-type='other'
aria-autocomplete='none'
/>
</div>
</div>
</div>
{errorMessage && (

View File

@@ -95,6 +95,7 @@ export type WebhookExecutionPayload = {
testMode?: boolean
executionTarget?: 'deployed' | 'live'
credentialId?: string
credentialAccountUserId?: string
}
export async function executeWebhookJob(payload: WebhookExecutionPayload) {
@@ -241,6 +242,7 @@ async function executeWebhookJobInternal(
useDraftState: false,
startTime: new Date().toISOString(),
isClientSession: false,
credentialAccountUserId: payload.credentialAccountUserId,
workflowStateOverride: {
blocks,
edges,
@@ -499,6 +501,7 @@ async function executeWebhookJobInternal(
useDraftState: false,
startTime: new Date().toISOString(),
isClientSession: false,
credentialAccountUserId: payload.credentialAccountUserId,
workflowStateOverride: {
blocks,
edges,
@@ -508,7 +511,9 @@ async function executeWebhookJobInternal(
},
}
const snapshot = new ExecutionSnapshot(metadata, workflow, input || {}, workflowVariables, [])
const triggerInput = input || {}
const snapshot = new ExecutionSnapshot(metadata, workflow, triggerInput, workflowVariables, [])
const executionResult = await executeWorkflowCore({
snapshot,

View File

@@ -94,7 +94,6 @@ export const AgentBlock: BlockConfig<AgentResponse> = {
placeholder: 'Type or select a model...',
required: true,
defaultValue: 'claude-sonnet-4-5',
searchable: true,
options: () => {
const providersState = useProvidersStore.getState()
const baseModels = providersState.providers.base.models
@@ -329,6 +328,43 @@ export const AgentBlock: BlockConfig<AgentResponse> = {
value: providers.vertex.models,
},
},
{
id: 'bedrockAccessKeyId',
title: 'AWS Access Key ID',
type: 'short-input',
password: true,
placeholder: 'Enter your AWS Access Key ID',
connectionDroppable: false,
required: true,
condition: {
field: 'model',
value: providers.bedrock.models,
},
},
{
id: 'bedrockSecretKey',
title: 'AWS Secret Access Key',
type: 'short-input',
password: true,
placeholder: 'Enter your AWS Secret Access Key',
connectionDroppable: false,
required: true,
condition: {
field: 'model',
value: providers.bedrock.models,
},
},
{
id: 'bedrockRegion',
title: 'AWS Region',
type: 'short-input',
placeholder: 'us-east-1',
connectionDroppable: false,
condition: {
field: 'model',
value: providers.bedrock.models,
},
},
{
id: 'tools',
title: 'Tools',
@@ -343,11 +379,11 @@ export const AgentBlock: BlockConfig<AgentResponse> = {
password: true,
connectionDroppable: false,
required: true,
// Hide API key for hosted models, Ollama models, vLLM models, and Vertex models (uses OAuth)
// Hide API key for hosted models, Ollama models, vLLM models, Vertex models (uses OAuth), and Bedrock (uses AWS credentials)
condition: isHosted
? {
field: 'model',
value: [...getHostedModels(), ...providers.vertex.models],
value: [...getHostedModels(), ...providers.vertex.models, ...providers.bedrock.models],
not: true, // Show for all models EXCEPT those listed
}
: () => ({
@@ -356,8 +392,9 @@ export const AgentBlock: BlockConfig<AgentResponse> = {
...getCurrentOllamaModels(),
...getCurrentVLLMModels(),
...providers.vertex.models,
...providers.bedrock.models,
],
not: true, // Show for all models EXCEPT Ollama, vLLM, and Vertex models
not: true, // Show for all models EXCEPT Ollama, vLLM, Vertex, and Bedrock models
}),
},
{
@@ -634,6 +671,9 @@ Example 3 (Array Input):
azureApiVersion: { type: 'string', description: 'Azure API version' },
vertexProject: { type: 'string', description: 'Google Cloud project ID for Vertex AI' },
vertexLocation: { type: 'string', description: 'Google Cloud location for Vertex AI' },
bedrockAccessKeyId: { type: 'string', description: 'AWS Access Key ID for Bedrock' },
bedrockSecretKey: { type: 'string', description: 'AWS Secret Access Key for Bedrock' },
bedrockRegion: { type: 'string', description: 'AWS region for Bedrock' },
responseFormat: {
type: 'json',
description: 'JSON response format schema',
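The Bedrock subblocks above are shown or hidden via `condition` objects with a `not` flag (e.g. the API key field is shown for all models *except* hosted, Ollama, vLLM, Vertex, and Bedrock). A hypothetical sketch of how such a condition could be evaluated — the shape mirrors the diff, but `shouldShow` is not the repo's actual helper:

```typescript
// Evaluate a subblock visibility condition: match the selected value
// against a list, optionally inverted by `not`.
interface Condition {
  field: string
  value: string[]
  not?: boolean
}

function shouldShow(cond: Condition, values: Record<string, string>): boolean {
  const matches = cond.value.includes(values[cond.field])
  return cond.not ? !matches : matches
}
```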

View File

@@ -1,27 +1,14 @@
import { createLogger } from '@sim/logger'
import { ChartBarIcon } from '@/components/icons'
import { isHosted } from '@/lib/core/config/feature-flags'
import type { BlockConfig, ParamType } from '@/blocks/types'
import { getProviderCredentialSubBlocks, PROVIDER_CREDENTIAL_INPUTS } from '@/blocks/utils'
import type { ProviderId } from '@/providers/types'
import {
getBaseModelProviders,
getHostedModels,
getProviderIcon,
providers,
} from '@/providers/utils'
import { getBaseModelProviders, getProviderIcon } from '@/providers/utils'
import { useProvidersStore } from '@/stores/providers/store'
import type { ToolResponse } from '@/tools/types'
const logger = createLogger('EvaluatorBlock')
const getCurrentOllamaModels = () => {
return useProvidersStore.getState().providers.ollama.models
}
const getCurrentVLLMModels = () => {
return useProvidersStore.getState().providers.vllm.models
}
interface Metric {
name: string
description: string
@@ -204,91 +191,7 @@ export const EvaluatorBlock: BlockConfig<EvaluatorResponse> = {
})
},
},
{
id: 'vertexCredential',
title: 'Google Cloud Account',
type: 'oauth-input',
serviceId: 'vertex-ai',
requiredScopes: ['https://www.googleapis.com/auth/cloud-platform'],
placeholder: 'Select Google Cloud account',
required: true,
condition: {
field: 'model',
value: providers.vertex.models,
},
},
{
id: 'apiKey',
title: 'API Key',
type: 'short-input',
placeholder: 'Enter your API key',
password: true,
connectionDroppable: false,
required: true,
// Hide API key for hosted models, Ollama models, vLLM models, and Vertex models (uses OAuth)
condition: isHosted
? {
field: 'model',
value: [...getHostedModels(), ...providers.vertex.models],
not: true, // Show for all models EXCEPT those listed
}
: () => ({
field: 'model',
value: [
...getCurrentOllamaModels(),
...getCurrentVLLMModels(),
...providers.vertex.models,
],
not: true, // Show for all models EXCEPT Ollama, vLLM, and Vertex models
}),
},
{
id: 'azureEndpoint',
title: 'Azure OpenAI Endpoint',
type: 'short-input',
password: true,
placeholder: 'https://your-resource.openai.azure.com',
connectionDroppable: false,
condition: {
field: 'model',
value: providers['azure-openai'].models,
},
},
{
id: 'azureApiVersion',
title: 'Azure API Version',
type: 'short-input',
placeholder: '2024-07-01-preview',
connectionDroppable: false,
condition: {
field: 'model',
value: providers['azure-openai'].models,
},
},
{
id: 'vertexProject',
title: 'Vertex AI Project',
type: 'short-input',
placeholder: 'your-gcp-project-id',
connectionDroppable: false,
required: true,
condition: {
field: 'model',
value: providers.vertex.models,
},
},
{
id: 'vertexLocation',
title: 'Vertex AI Location',
type: 'short-input',
placeholder: 'us-central1',
connectionDroppable: false,
required: true,
condition: {
field: 'model',
value: providers.vertex.models,
},
},
...getProviderCredentialSubBlocks(),
{
id: 'temperature',
title: 'Temperature',
@@ -403,21 +306,7 @@ export const EvaluatorBlock: BlockConfig<EvaluatorResponse> = {
},
},
model: { type: 'string' as ParamType, description: 'AI model to use' },
apiKey: { type: 'string' as ParamType, description: 'Provider API key' },
azureEndpoint: { type: 'string' as ParamType, description: 'Azure OpenAI endpoint URL' },
azureApiVersion: { type: 'string' as ParamType, description: 'Azure API version' },
vertexProject: {
type: 'string' as ParamType,
description: 'Google Cloud project ID for Vertex AI',
},
vertexLocation: {
type: 'string' as ParamType,
description: 'Google Cloud location for Vertex AI',
},
vertexCredential: {
type: 'string' as ParamType,
description: 'Google Cloud OAuth credential ID for Vertex AI',
},
...PROVIDER_CREDENTIAL_INPUTS,
temperature: {
type: 'number' as ParamType,
description: 'Response randomness level (low for consistent evaluation)',

View File

@@ -1,15 +1,10 @@
import { ShieldCheckIcon } from '@/components/icons'
import { isHosted } from '@/lib/core/config/feature-flags'
import type { BlockConfig } from '@/blocks/types'
import { getHostedModels, getProviderIcon } from '@/providers/utils'
import { getProviderCredentialSubBlocks, PROVIDER_CREDENTIAL_INPUTS } from '@/blocks/utils'
import { getProviderIcon } from '@/providers/utils'
import { useProvidersStore } from '@/stores/providers/store'
import type { ToolResponse } from '@/tools/types'
const getCurrentOllamaModels = () => {
const providersState = useProvidersStore.getState()
return providersState.providers.ollama.models
}
export interface GuardrailsResponse extends ToolResponse {
output: {
passed: boolean
@@ -120,8 +115,11 @@ Return ONLY the regex pattern - no explanations, no quotes, no forward slashes,
const providersState = useProvidersStore.getState()
const baseModels = providersState.providers.base.models
const ollamaModels = providersState.providers.ollama.models
const vllmModels = providersState.providers.vllm.models
const openrouterModels = providersState.providers.openrouter.models
const allModels = Array.from(new Set([...baseModels, ...ollamaModels, ...openrouterModels]))
const allModels = Array.from(
new Set([...baseModels, ...ollamaModels, ...vllmModels, ...openrouterModels])
)
return allModels.map((model) => {
const icon = getProviderIcon(model)
@@ -160,44 +158,19 @@ Return ONLY the regex pattern - no explanations, no quotes, no forward slashes,
value: ['hallucination'],
},
},
{
id: 'apiKey',
title: 'API Key',
type: 'short-input',
placeholder: 'Enter your API key',
password: true,
connectionDroppable: false,
required: true,
// Show API key field only for hallucination validation
// Hide for hosted models and Ollama models
condition: () => {
const baseCondition = {
field: 'validationType' as const,
value: ['hallucination'],
}
if (isHosted) {
// In hosted mode, hide for hosted models
return {
...baseCondition,
and: {
field: 'model' as const,
value: getHostedModels(),
not: true, // Show for all models EXCEPT hosted ones
},
}
}
// In self-hosted mode, hide for Ollama models
return {
...baseCondition,
and: {
field: 'model' as const,
value: getCurrentOllamaModels(),
not: true, // Show for all models EXCEPT Ollama ones
},
}
},
},
// Provider credential subblocks - only shown for hallucination validation
...getProviderCredentialSubBlocks().map((subBlock) => ({
...subBlock,
// Combine with hallucination condition
condition: subBlock.condition
? {
field: 'validationType' as const,
value: ['hallucination'],
and:
typeof subBlock.condition === 'function' ? subBlock.condition() : subBlock.condition,
}
: { field: 'validationType' as const, value: ['hallucination'] },
})),
{
id: 'piiEntityTypes',
title: 'PII Types to Detect',
@@ -332,10 +305,7 @@ Return ONLY the regex pattern - no explanations, no quotes, no forward slashes,
type: 'string',
description: 'LLM model for hallucination scoring (default: gpt-4o-mini)',
},
apiKey: {
type: 'string',
description: 'API key for LLM provider (optional if using hosted)',
},
...PROVIDER_CREDENTIAL_INPUTS,
piiEntityTypes: {
type: 'json',
description: 'PII entity types to detect (array of strings, empty = detect all)',

View File

@@ -77,7 +77,6 @@ export const LinearBlock: BlockConfig<LinearResponse> = {
// Project Update Operations
{ label: 'Create Project Update', id: 'linear_create_project_update' },
{ label: 'List Project Updates', id: 'linear_list_project_updates' },
{ label: 'Create Project Link', id: 'linear_create_project_link' },
// Notification Operations
{ label: 'List Notifications', id: 'linear_list_notifications' },
{ label: 'Update Notification', id: 'linear_update_notification' },
@@ -227,6 +226,7 @@ export const LinearBlock: BlockConfig<LinearResponse> = {
'linear_update_project',
'linear_archive_project',
'linear_delete_project',
'linear_create_project_update',
'linear_list_project_updates',
],
},
@@ -239,6 +239,7 @@ export const LinearBlock: BlockConfig<LinearResponse> = {
'linear_update_project',
'linear_archive_project',
'linear_delete_project',
'linear_create_project_update',
'linear_list_project_updates',
'linear_list_project_labels',
],
@@ -261,7 +262,6 @@ export const LinearBlock: BlockConfig<LinearResponse> = {
'linear_delete_project',
'linear_create_project_update',
'linear_list_project_updates',
'linear_create_project_link',
],
},
condition: {
@@ -275,7 +275,6 @@ export const LinearBlock: BlockConfig<LinearResponse> = {
'linear_delete_project',
'linear_create_project_update',
'linear_list_project_updates',
'linear_create_project_link',
'linear_list_project_labels',
],
},
@@ -625,7 +624,7 @@ Return ONLY the date string in YYYY-MM-DD format - no explanations, no quotes, n
required: true,
condition: {
field: 'operation',
value: ['linear_create_attachment', 'linear_create_project_link'],
value: ['linear_create_attachment'],
},
},
// Attachment title
@@ -1221,6 +1220,36 @@ Return ONLY the date string in YYYY-MM-DD format - no explanations, no quotes, n
value: ['linear_create_project_status'],
},
},
{
id: 'projectStatusType',
title: 'Status Type',
type: 'dropdown',
options: [
{ label: 'Backlog', id: 'backlog' },
{ label: 'Planned', id: 'planned' },
{ label: 'Started', id: 'started' },
{ label: 'Paused', id: 'paused' },
{ label: 'Completed', id: 'completed' },
{ label: 'Canceled', id: 'canceled' },
],
value: () => 'started',
required: true,
condition: {
field: 'operation',
value: ['linear_create_project_status'],
},
},
{
id: 'projectStatusPosition',
title: 'Position',
type: 'short-input',
placeholder: 'Enter position (e.g. 0, 1, 2...)',
required: true,
condition: {
field: 'operation',
value: ['linear_create_project_status'],
},
},
{
id: 'projectStatusId',
title: 'Status ID',
@@ -1326,7 +1355,6 @@ Return ONLY the date string in YYYY-MM-DD format - no explanations, no quotes, n
'linear_list_favorites',
'linear_create_project_update',
'linear_list_project_updates',
'linear_create_project_link',
'linear_list_notifications',
'linear_update_notification',
'linear_create_customer',
@@ -1772,17 +1800,6 @@ Return ONLY the date string in YYYY-MM-DD format - no explanations, no quotes, n
projectId: effectiveProjectId,
}
case 'linear_create_project_link':
if (!effectiveProjectId || !params.url?.trim()) {
throw new Error('Project ID and URL are required.')
}
return {
...baseParams,
projectId: effectiveProjectId,
url: params.url.trim(),
label: params.name,
}
case 'linear_list_notifications':
return baseParams
@@ -2033,22 +2050,22 @@ Return ONLY the date string in YYYY-MM-DD format - no explanations, no quotes, n
}
case 'linear_add_label_to_project':
if (!effectiveProjectId || !params.projectLabelId?.trim()) {
if (!params.projectIdForMilestone?.trim() || !params.projectLabelId?.trim()) {
throw new Error('Project ID and label ID are required.')
}
return {
...baseParams,
projectId: effectiveProjectId,
projectId: params.projectIdForMilestone.trim(),
labelId: params.projectLabelId.trim(),
}
case 'linear_remove_label_from_project':
if (!effectiveProjectId || !params.projectLabelId?.trim()) {
if (!params.projectIdForMilestone?.trim() || !params.projectLabelId?.trim()) {
throw new Error('Project ID and label ID are required.')
}
return {
...baseParams,
projectId: effectiveProjectId,
projectId: params.projectIdForMilestone.trim(),
labelId: params.projectLabelId.trim(),
}
@@ -2097,13 +2114,20 @@ Return ONLY the date string in YYYY-MM-DD format - no explanations, no quotes, n
// Project Status Operations
case 'linear_create_project_status':
if (!params.projectStatusName?.trim() || !params.statusColor?.trim()) {
throw new Error('Project status name and color are required.')
if (
!params.projectStatusName?.trim() ||
!params.projectStatusType?.trim() ||
!params.statusColor?.trim() ||
!params.projectStatusPosition?.trim()
) {
throw new Error('Project status name, type, color, and position are required.')
}
return {
...baseParams,
name: params.projectStatusName.trim(),
type: params.projectStatusType.trim(),
color: params.statusColor.trim(),
position: Number.parseFloat(params.projectStatusPosition.trim()),
description: params.projectStatusDescription?.trim() || undefined,
indefinite: params.projectStatusIndefinite === 'true',
}
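The required-field check above verifies only that the position string is non-empty after trimming, so a value like `'abc'` would still reach `Number.parseFloat` and produce `NaN`. As a design alternative, a stricter parse could reject such inputs up front; the `parsePosition` helper below is a hypothetical sketch, not part of the codebase:

```typescript
// Hypothetical stricter parse for the position field: rejects inputs that
// Number.parseFloat cannot interpret as a finite number (e.g. 'abc' -> NaN).
function parsePosition(raw: string): number {
  const n = Number.parseFloat(raw.trim())
  if (!Number.isFinite(n)) {
    throw new Error(`Invalid position: ${raw}`)
  }
  return n
}
```

Note that `Number.parseFloat` still accepts partially numeric strings such as `'1.5abc'` (returning `1.5`), so a fully strict parse would need `Number(raw)` or a regex instead.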
@@ -2270,7 +2294,6 @@ Return ONLY the date string in YYYY-MM-DD format - no explanations, no quotes, n
// Project update outputs
update: { type: 'json', description: 'Project update data' },
updates: { type: 'json', description: 'Project updates list' },
link: { type: 'json', description: 'Project link data' },
// Notification outputs
notification: { type: 'json', description: 'Notification data' },
notifications: { type: 'json', description: 'Notifications list' },

View File

@@ -1,24 +1,11 @@
import { ConnectIcon } from '@/components/icons'
import { isHosted } from '@/lib/core/config/feature-flags'
import { AuthMode, type BlockConfig } from '@/blocks/types'
import { getProviderCredentialSubBlocks, PROVIDER_CREDENTIAL_INPUTS } from '@/blocks/utils'
import type { ProviderId } from '@/providers/types'
import {
getBaseModelProviders,
getHostedModels,
getProviderIcon,
providers,
} from '@/providers/utils'
import { getBaseModelProviders, getProviderIcon } from '@/providers/utils'
import { useProvidersStore } from '@/stores/providers/store'
import type { ToolResponse } from '@/tools/types'
const getCurrentOllamaModels = () => {
return useProvidersStore.getState().providers.ollama.models
}
const getCurrentVLLMModels = () => {
return useProvidersStore.getState().providers.vllm.models
}
interface RouterResponse extends ToolResponse {
output: {
prompt: string
@@ -168,23 +155,6 @@ const getModelOptions = () => {
})
}
/**
* Helper to get API key condition for both router versions.
*/
const getApiKeyCondition = () => {
return isHosted
? {
field: 'model',
value: [...getHostedModels(), ...providers.vertex.models],
not: true,
}
: () => ({
field: 'model',
value: [...getCurrentOllamaModels(), ...getCurrentVLLMModels(), ...providers.vertex.models],
not: true,
})
}
/**
* Legacy Router Block (block-based routing).
* Hidden from toolbar but still supported for existing workflows.
@@ -221,76 +191,7 @@ export const RouterBlock: BlockConfig<RouterResponse> = {
defaultValue: 'claude-sonnet-4-5',
options: getModelOptions,
},
{
id: 'vertexCredential',
title: 'Google Cloud Account',
type: 'oauth-input',
serviceId: 'vertex-ai',
requiredScopes: ['https://www.googleapis.com/auth/cloud-platform'],
placeholder: 'Select Google Cloud account',
required: true,
condition: {
field: 'model',
value: providers.vertex.models,
},
},
{
id: 'apiKey',
title: 'API Key',
type: 'short-input',
placeholder: 'Enter your API key',
password: true,
connectionDroppable: false,
required: true,
condition: getApiKeyCondition(),
},
{
id: 'azureEndpoint',
title: 'Azure OpenAI Endpoint',
type: 'short-input',
password: true,
placeholder: 'https://your-resource.openai.azure.com',
connectionDroppable: false,
condition: {
field: 'model',
value: providers['azure-openai'].models,
},
},
{
id: 'azureApiVersion',
title: 'Azure API Version',
type: 'short-input',
placeholder: '2024-07-01-preview',
connectionDroppable: false,
condition: {
field: 'model',
value: providers['azure-openai'].models,
},
},
{
id: 'vertexProject',
title: 'Vertex AI Project',
type: 'short-input',
placeholder: 'your-gcp-project-id',
connectionDroppable: false,
required: true,
condition: {
field: 'model',
value: providers.vertex.models,
},
},
{
id: 'vertexLocation',
title: 'Vertex AI Location',
type: 'short-input',
placeholder: 'us-central1',
connectionDroppable: false,
required: true,
condition: {
field: 'model',
value: providers.vertex.models,
},
},
...getProviderCredentialSubBlocks(),
{
id: 'temperature',
title: 'Temperature',
@@ -335,15 +236,7 @@ export const RouterBlock: BlockConfig<RouterResponse> = {
inputs: {
prompt: { type: 'string', description: 'Routing prompt content' },
model: { type: 'string', description: 'AI model to use' },
apiKey: { type: 'string', description: 'Provider API key' },
azureEndpoint: { type: 'string', description: 'Azure OpenAI endpoint URL' },
azureApiVersion: { type: 'string', description: 'Azure API version' },
vertexProject: { type: 'string', description: 'Google Cloud project ID for Vertex AI' },
vertexLocation: { type: 'string', description: 'Google Cloud location for Vertex AI' },
vertexCredential: {
type: 'string',
description: 'Google Cloud OAuth credential ID for Vertex AI',
},
...PROVIDER_CREDENTIAL_INPUTS,
temperature: {
type: 'number',
description: 'Response randomness level (low for consistent routing)',
@@ -422,76 +315,7 @@ export const RouterV2Block: BlockConfig<RouterV2Response> = {
defaultValue: 'claude-sonnet-4-5',
options: getModelOptions,
},
{
id: 'vertexCredential',
title: 'Google Cloud Account',
type: 'oauth-input',
serviceId: 'vertex-ai',
requiredScopes: ['https://www.googleapis.com/auth/cloud-platform'],
placeholder: 'Select Google Cloud account',
required: true,
condition: {
field: 'model',
value: providers.vertex.models,
},
},
{
id: 'apiKey',
title: 'API Key',
type: 'short-input',
placeholder: 'Enter your API key',
password: true,
connectionDroppable: false,
required: true,
condition: getApiKeyCondition(),
},
{
id: 'azureEndpoint',
title: 'Azure OpenAI Endpoint',
type: 'short-input',
password: true,
placeholder: 'https://your-resource.openai.azure.com',
connectionDroppable: false,
condition: {
field: 'model',
value: providers['azure-openai'].models,
},
},
{
id: 'azureApiVersion',
title: 'Azure API Version',
type: 'short-input',
placeholder: '2024-07-01-preview',
connectionDroppable: false,
condition: {
field: 'model',
value: providers['azure-openai'].models,
},
},
{
id: 'vertexProject',
title: 'Vertex AI Project',
type: 'short-input',
placeholder: 'your-gcp-project-id',
connectionDroppable: false,
required: true,
condition: {
field: 'model',
value: providers.vertex.models,
},
},
{
id: 'vertexLocation',
title: 'Vertex AI Location',
type: 'short-input',
placeholder: 'us-central1',
connectionDroppable: false,
required: true,
condition: {
field: 'model',
value: providers.vertex.models,
},
},
...getProviderCredentialSubBlocks(),
],
tools: {
access: [
@@ -520,15 +344,7 @@ export const RouterV2Block: BlockConfig<RouterV2Response> = {
context: { type: 'string', description: 'Context for routing decision' },
routes: { type: 'json', description: 'Route definitions with descriptions' },
model: { type: 'string', description: 'AI model to use' },
apiKey: { type: 'string', description: 'Provider API key' },
azureEndpoint: { type: 'string', description: 'Azure OpenAI endpoint URL' },
azureApiVersion: { type: 'string', description: 'Azure API version' },
vertexProject: { type: 'string', description: 'Google Cloud project ID for Vertex AI' },
vertexLocation: { type: 'string', description: 'Google Cloud location for Vertex AI' },
vertexCredential: {
type: 'string',
description: 'Google Cloud OAuth credential ID for Vertex AI',
},
...PROVIDER_CREDENTIAL_INPUTS,
},
outputs: {
context: { type: 'string', description: 'Context used for routing' },

View File

@@ -1,17 +1,9 @@
import { TranslateIcon } from '@/components/icons'
import { isHosted } from '@/lib/core/config/feature-flags'
import { AuthMode, type BlockConfig } from '@/blocks/types'
import { getHostedModels, getProviderIcon, providers } from '@/providers/utils'
import { getProviderCredentialSubBlocks, PROVIDER_CREDENTIAL_INPUTS } from '@/blocks/utils'
import { getProviderIcon } from '@/providers/utils'
import { useProvidersStore } from '@/stores/providers/store'
const getCurrentOllamaModels = () => {
return useProvidersStore.getState().providers.ollama.models
}
const getCurrentVLLMModels = () => {
return useProvidersStore.getState().providers.vllm.models
}
const getTranslationPrompt = (targetLanguage: string) =>
`Translate the following text into ${targetLanguage || 'English'}. Output ONLY the translated text with no additional commentary, explanations, or notes.`
@@ -59,91 +51,7 @@ export const TranslateBlock: BlockConfig = {
})
},
},
{
id: 'vertexCredential',
title: 'Google Cloud Account',
type: 'oauth-input',
serviceId: 'vertex-ai',
requiredScopes: ['https://www.googleapis.com/auth/cloud-platform'],
placeholder: 'Select Google Cloud account',
required: true,
condition: {
field: 'model',
value: providers.vertex.models,
},
},
{
id: 'apiKey',
title: 'API Key',
type: 'short-input',
placeholder: 'Enter your API key',
password: true,
connectionDroppable: false,
required: true,
// Hide API key for hosted models, Ollama models, vLLM models, and Vertex models (uses OAuth)
condition: isHosted
? {
field: 'model',
value: [...getHostedModels(), ...providers.vertex.models],
not: true, // Show for all models EXCEPT those listed
}
: () => ({
field: 'model',
value: [
...getCurrentOllamaModels(),
...getCurrentVLLMModels(),
...providers.vertex.models,
],
not: true, // Show for all models EXCEPT Ollama, vLLM, and Vertex models
}),
},
{
id: 'azureEndpoint',
title: 'Azure OpenAI Endpoint',
type: 'short-input',
password: true,
placeholder: 'https://your-resource.openai.azure.com',
connectionDroppable: false,
condition: {
field: 'model',
value: providers['azure-openai'].models,
},
},
{
id: 'azureApiVersion',
title: 'Azure API Version',
type: 'short-input',
placeholder: '2024-07-01-preview',
connectionDroppable: false,
condition: {
field: 'model',
value: providers['azure-openai'].models,
},
},
{
id: 'vertexProject',
title: 'Vertex AI Project',
type: 'short-input',
placeholder: 'your-gcp-project-id',
connectionDroppable: false,
required: true,
condition: {
field: 'model',
value: providers.vertex.models,
},
},
{
id: 'vertexLocation',
title: 'Vertex AI Location',
type: 'short-input',
placeholder: 'us-central1',
connectionDroppable: false,
required: true,
condition: {
field: 'model',
value: providers.vertex.models,
},
},
...getProviderCredentialSubBlocks(),
{
id: 'systemPrompt',
title: 'System Prompt',
@@ -168,21 +76,15 @@ export const TranslateBlock: BlockConfig = {
vertexProject: params.vertexProject,
vertexLocation: params.vertexLocation,
vertexCredential: params.vertexCredential,
bedrockRegion: params.bedrockRegion,
bedrockSecretKey: params.bedrockSecretKey,
}),
},
},
inputs: {
context: { type: 'string', description: 'Text to translate' },
targetLanguage: { type: 'string', description: 'Target language' },
apiKey: { type: 'string', description: 'Provider API key' },
azureEndpoint: { type: 'string', description: 'Azure OpenAI endpoint URL' },
azureApiVersion: { type: 'string', description: 'Azure API version' },
vertexProject: { type: 'string', description: 'Google Cloud project ID for Vertex AI' },
vertexLocation: { type: 'string', description: 'Google Cloud location for Vertex AI' },
vertexCredential: {
type: 'string',
description: 'Google Cloud OAuth credential ID for Vertex AI',
},
...PROVIDER_CREDENTIAL_INPUTS,
systemPrompt: { type: 'string', description: 'Translation instructions' },
},
outputs: {

View File

@@ -254,6 +254,8 @@ export interface SubBlockConfig {
// OAuth specific properties - serviceId is the canonical identifier for OAuth services
serviceId?: string
requiredScopes?: string[]
// Whether this credential selector supports credential sets (for trigger blocks)
supportsCredentialSets?: boolean
// File selector specific properties
mimeType?: string
// File upload specific properties

View File

@@ -1,4 +1,7 @@
import { isHosted } from '@/lib/core/config/feature-flags'
import type { BlockOutput, OutputFieldDefinition, SubBlockConfig } from '@/blocks/types'
import { getHostedModels, providers } from '@/providers/utils'
import { useProvidersStore } from '@/stores/providers/store'
/**
* Checks if a field is included in the dependsOn config.
@@ -37,3 +40,177 @@ export function resolveOutputType(
return resolvedOutputs
}
/**
* Helper to get current Ollama models from store
*/
const getCurrentOllamaModels = () => {
return useProvidersStore.getState().providers.ollama.models
}
/**
* Helper to get current vLLM models from store
*/
const getCurrentVLLMModels = () => {
return useProvidersStore.getState().providers.vllm.models
}
/**
* Get the API key condition for provider credential subblocks.
* Handles hosted vs. self-hosted environments and excludes providers that don't need an API key.
*/
export function getApiKeyCondition() {
return isHosted
? {
field: 'model',
value: [...getHostedModels(), ...providers.vertex.models, ...providers.bedrock.models],
not: true,
}
: () => ({
field: 'model',
value: [
...getCurrentOllamaModels(),
...getCurrentVLLMModels(),
...providers.vertex.models,
...providers.bedrock.models,
],
not: true,
})
}
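The objects returned above are declarative visibility conditions: the API key field is shown for every model EXCEPT those in the `value` list when `not: true` is set. A minimal, self-contained sketch of how such a condition might be evaluated (the `Condition` type and `evaluateCondition` helper are illustrative assumptions, not the real renderer logic):

```typescript
// Illustrative shape of the condition objects built by getApiKeyCondition().
interface Condition {
  field: string
  value: string[]
  not?: boolean
}

// Returns true when the subblock should be shown for the current field value.
// This evaluator is a hypothetical sketch of the condition semantics.
function evaluateCondition(cond: Condition, currentValue: string): boolean {
  const matches = cond.value.includes(currentValue)
  // With not: true, the subblock is shown for every value EXCEPT those listed.
  return cond.not ? !matches : matches
}

// Example: hide the API key field for Vertex/Bedrock models, which use
// OAuth and AWS credentials respectively ('bedrock-model' is a made-up id).
const apiKeyCondition: Condition = {
  field: 'model',
  value: ['vertex-model', 'bedrock-model'],
  not: true,
}

const showForGpt = evaluateCondition(apiKeyCondition, 'gpt-4o') // shown
const showForBedrock = evaluateCondition(apiKeyCondition, 'bedrock-model') // hidden
```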
/**
* Returns the standard provider credential subblocks used by LLM-based blocks.
* This includes: Vertex AI OAuth, API Key, Azure OpenAI, Vertex AI config, and Bedrock config.
*
* Usage: Spread into your block's subBlocks array after block-specific fields
*/
export function getProviderCredentialSubBlocks(): SubBlockConfig[] {
return [
{
id: 'vertexCredential',
title: 'Google Cloud Account',
type: 'oauth-input',
serviceId: 'vertex-ai',
requiredScopes: ['https://www.googleapis.com/auth/cloud-platform'],
placeholder: 'Select Google Cloud account',
required: true,
condition: {
field: 'model',
value: providers.vertex.models,
},
},
{
id: 'apiKey',
title: 'API Key',
type: 'short-input',
placeholder: 'Enter your API key',
password: true,
connectionDroppable: false,
required: true,
condition: getApiKeyCondition(),
},
{
id: 'azureEndpoint',
title: 'Azure OpenAI Endpoint',
type: 'short-input',
password: true,
placeholder: 'https://your-resource.openai.azure.com',
connectionDroppable: false,
condition: {
field: 'model',
value: providers['azure-openai'].models,
},
},
{
id: 'azureApiVersion',
title: 'Azure API Version',
type: 'short-input',
placeholder: '2024-07-01-preview',
connectionDroppable: false,
condition: {
field: 'model',
value: providers['azure-openai'].models,
},
},
{
id: 'vertexProject',
title: 'Vertex AI Project',
type: 'short-input',
placeholder: 'your-gcp-project-id',
connectionDroppable: false,
required: true,
condition: {
field: 'model',
value: providers.vertex.models,
},
},
{
id: 'vertexLocation',
title: 'Vertex AI Location',
type: 'short-input',
placeholder: 'us-central1',
connectionDroppable: false,
required: true,
condition: {
field: 'model',
value: providers.vertex.models,
},
},
{
id: 'bedrockAccessKeyId',
title: 'AWS Access Key ID',
type: 'short-input',
password: true,
placeholder: 'Enter your AWS Access Key ID',
connectionDroppable: false,
required: true,
condition: {
field: 'model',
value: providers.bedrock.models,
},
},
{
id: 'bedrockSecretKey',
title: 'AWS Secret Access Key',
type: 'short-input',
password: true,
placeholder: 'Enter your AWS Secret Access Key',
connectionDroppable: false,
required: true,
condition: {
field: 'model',
value: providers.bedrock.models,
},
},
{
id: 'bedrockRegion',
title: 'AWS Region',
type: 'short-input',
placeholder: 'us-east-1',
connectionDroppable: false,
condition: {
field: 'model',
value: providers.bedrock.models,
},
},
]
}
/**
* Returns the standard input definitions for provider credentials.
* Use this in your block's inputs definition.
*/
export const PROVIDER_CREDENTIAL_INPUTS = {
apiKey: { type: 'string', description: 'Provider API key' },
azureEndpoint: { type: 'string', description: 'Azure OpenAI endpoint URL' },
azureApiVersion: { type: 'string', description: 'Azure API version' },
vertexProject: { type: 'string', description: 'Google Cloud project ID for Vertex AI' },
vertexLocation: { type: 'string', description: 'Google Cloud location for Vertex AI' },
vertexCredential: {
type: 'string',
description: 'Google Cloud OAuth credential ID for Vertex AI',
},
bedrockAccessKeyId: { type: 'string', description: 'AWS Access Key ID for Bedrock' },
bedrockSecretKey: { type: 'string', description: 'AWS Secret Access Key for Bedrock' },
bedrockRegion: { type: 'string', description: 'AWS region for Bedrock' },
} as const
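A block consumes both helpers by spreading them into its config, as the blocks refactored in this change do. The sketch below uses hypothetical stand-ins for the real helpers so it runs on its own; only the spread pattern itself mirrors the code above:

```typescript
// Hypothetical stand-ins for getProviderCredentialSubBlocks() and
// PROVIDER_CREDENTIAL_INPUTS, reduced to two entries for illustration.
const getProviderCredentialSubBlocks = () => [
  { id: 'apiKey', title: 'API Key' },
  { id: 'bedrockRegion', title: 'AWS Region' },
]
const PROVIDER_CREDENTIAL_INPUTS = {
  apiKey: { type: 'string', description: 'Provider API key' },
  bedrockRegion: { type: 'string', description: 'AWS region for Bedrock' },
} as const

// A block spreads the shared credential subblocks after its own fields,
// and the shared input definitions into its inputs map.
const myBlock = {
  subBlocks: [{ id: 'model', title: 'Model' }, ...getProviderCredentialSubBlocks()],
  inputs: { model: { type: 'string' }, ...PROVIDER_CREDENTIAL_INPUTS },
}
```

Centralizing the credential fields this way is what lets the Bedrock inputs appear in every LLM block without repeating the six-subblock boilerplate per file.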

View File

@@ -1,3 +1,4 @@
export { BatchInvitationEmail } from './batch-invitation-email'
export { InvitationEmail } from './invitation-email'
export { PollingGroupInvitationEmail } from './polling-group-invitation-email'
export { WorkspaceInvitationEmail } from './workspace-invitation-email'

View File

@@ -0,0 +1,52 @@
import { Link, Text } from '@react-email/components'
import { baseStyles } from '@/components/emails/_styles'
import { EmailLayout } from '@/components/emails/components'
import { getBrandConfig } from '@/lib/branding/branding'
interface PollingGroupInvitationEmailProps {
inviterName?: string
organizationName?: string
pollingGroupName?: string
provider?: 'google-email' | 'outlook'
inviteLink?: string
}
export function PollingGroupInvitationEmail({
inviterName = 'A team member',
organizationName = 'an organization',
pollingGroupName = 'a polling group',
provider = 'google-email',
inviteLink = '',
}: PollingGroupInvitationEmailProps) {
const brand = getBrandConfig()
const providerName = provider === 'google-email' ? 'Gmail' : 'Outlook'
return (
<EmailLayout preview={`You've been invited to join ${pollingGroupName} on ${brand.name}`}>
<Text style={baseStyles.paragraph}>Hello,</Text>
<Text style={baseStyles.paragraph}>
<strong>{inviterName}</strong> from <strong>{organizationName}</strong> has invited you to
join the polling group <strong>{pollingGroupName}</strong> on {brand.name}.
</Text>
<Text style={baseStyles.paragraph}>
By accepting this invitation, your {providerName} account will be connected to enable email
polling for automated workflows.
</Text>
<Link href={inviteLink} style={{ textDecoration: 'none' }}>
<Text style={baseStyles.button}>Accept Invitation</Text>
</Link>
{/* Divider */}
<div style={baseStyles.divider} />
<Text style={{ ...baseStyles.footerText, textAlign: 'left' }}>
This invitation expires in 7 days. If you weren't expecting this email, you can safely
ignore it.
</Text>
</EmailLayout>
)
}
export default PollingGroupInvitationEmail

View File

@@ -12,6 +12,7 @@ import { CareersConfirmationEmail, CareersSubmissionEmail } from '@/components/e
import {
BatchInvitationEmail,
InvitationEmail,
PollingGroupInvitationEmail,
WorkspaceInvitationEmail,
} from '@/components/emails/invitations'
import { HelpConfirmationEmail } from '@/components/emails/support'
@@ -184,6 +185,24 @@ export async function renderWorkspaceInvitationEmail(
)
}
export async function renderPollingGroupInvitationEmail(params: {
inviterName: string
organizationName: string
pollingGroupName: string
provider: 'google-email' | 'outlook'
inviteLink: string
}): Promise<string> {
return await render(
PollingGroupInvitationEmail({
inviterName: params.inviterName,
organizationName: params.organizationName,
pollingGroupName: params.pollingGroupName,
provider: params.provider,
inviteLink: params.inviteLink,
})
)
}
export async function renderPaymentFailedEmail(params: {
userName?: string
amountDue: number

View File

@@ -8,6 +8,7 @@ export type EmailSubjectType =
| 'reset-password'
| 'invitation'
| 'batch-invitation'
| 'polling-group-invitation'
| 'help-confirmation'
| 'enterprise-subscription'
| 'usage-threshold'
@@ -38,6 +39,8 @@ export function getEmailSubject(type: EmailSubjectType): string {
return `You've been invited to join a team on ${brandName}`
case 'batch-invitation':
return `You've been invited to join a team and workspaces on ${brandName}`
case 'polling-group-invitation':
return `You've been invited to join an email polling group on ${brandName}`
case 'help-confirmation':
return 'Your request has been received'
case 'enterprise-subscription':

View File

@@ -4575,3 +4575,22 @@ export function FirefliesIcon(props: SVGProps<SVGSVGElement>) {
</svg>
)
}
export function BedrockIcon(props: SVGProps<SVGSVGElement>) {
return (
<svg {...props} viewBox='0 0 24 24' xmlns='http://www.w3.org/2000/svg'>
<defs>
<linearGradient id='bedrock_gradient' x1='80%' x2='20%' y1='20%' y2='80%'>
<stop offset='0%' stopColor='#6350FB' />
<stop offset='50%' stopColor='#3D8FFF' />
<stop offset='100%' stopColor='#9AD8F8' />
</linearGradient>
</defs>
<path
d='M13.05 15.513h3.08c.214 0 .389.177.389.394v1.82a1.704 1.704 0 011.296 1.661c0 .943-.755 1.708-1.685 1.708-.931 0-1.686-.765-1.686-1.708 0-.807.554-1.484 1.297-1.662v-1.425h-2.69v4.663a.395.395 0 01-.188.338l-2.69 1.641a.385.385 0 01-.405-.002l-4.926-3.086a.395.395 0 01-.185-.336V16.3L2.196 14.87A.395.395 0 012 14.555L2 14.528V9.406c0-.14.073-.27.192-.34l2.465-1.462V4.448c0-.129.062-.249.165-.322l.021-.014L9.77 1.058a.385.385 0 01.407 0l2.69 1.675a.395.395 0 01.185.336V7.6h3.856V5.683a1.704 1.704 0 01-1.296-1.662c0-.943.755-1.708 1.685-1.708.931 0 1.685.765 1.685 1.708 0 .807-.553 1.484-1.296 1.662v2.311a.391.391 0 01-.389.394h-4.245v1.806h6.624a1.69 1.69 0 011.64-1.313c.93 0 1.685.764 1.685 1.707 0 .943-.754 1.708-1.685 1.708a1.69 1.69 0 01-1.64-1.314H13.05v1.937h4.953l.915 1.18a1.66 1.66 0 01.84-.227c.931 0 1.685.764 1.685 1.707 0 .943-.754 1.708-1.685 1.708-.93 0-1.685-.765-1.685-1.708 0-.346.102-.668.276-.937l-.724-.935H13.05v1.806zM9.973 1.856L7.93 3.122V6.09h-.778V3.604L5.435 4.669v2.945l2.11 1.36L9.712 7.61V5.334h.778V7.83c0 .136-.07.263-.184.335L7.963 9.638v2.081l1.422 1.009-.446.646-1.406-.998-1.53 1.005-.423-.66 1.605-1.055v-1.99L5.038 8.29l-2.26 1.34v1.676l1.972-1.189.398.677-2.37 1.429V14.3l2.166 1.258 2.27-1.368.397.677-2.176 1.311V19.3l1.876 1.175 2.365-1.426.398.678-2.017 1.216 1.918 1.201 2.298-1.403v-5.78l-4.758 2.893-.4-.675 5.158-3.136V3.289L9.972 1.856zM16.13 18.47a.913.913 0 00-.908.92c0 .507.406.918.908.918a.913.913 0 00.907-.919.913.913 0 00-.907-.92zm3.63-3.81a.913.913 0 00-.908.92c0 .508.406.92.907.92a.913.913 0 00.908-.92.913.913 0 00-.908-.92zm1.555-4.99a.913.913 0 00-.908.92c0 .507.407.918.908.918a.913.913 0 00.907-.919.913.913 0 00-.907-.92zM17.296 3.1a.913.913 0 00-.907.92c0 .508.406.92.907.92a.913.913 0 00.908-.92.913.913 0 00-.908-.92z'
fill='url(#bedrock_gradient)'
fillRule='nonzero'
/>
</svg>
)
}

View File

@@ -181,6 +181,22 @@ export const MCP = {
TOOL_PREFIX: 'mcp-',
} as const
export const CREDENTIAL_SET = {
PREFIX: 'credentialSet:',
} as const
export const CREDENTIAL = {
FOREIGN_LABEL: 'Saved by collaborator',
} as const
export function isCredentialSetValue(value: string | null | undefined): boolean {
return typeof value === 'string' && value.startsWith(CREDENTIAL_SET.PREFIX)
}
export function extractCredentialSetId(value: string): string {
return value.slice(CREDENTIAL_SET.PREFIX.length)
}
export const MEMORY = {
DEFAULT_SLIDING_WINDOW_SIZE: 10,
DEFAULT_SLIDING_WINDOW_TOKENS: 4000,

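The prefix helpers added in this hunk are small enough to exercise in isolation. A self-contained sketch (names copied from the hunk above; the sample id is made up):

```typescript
// Mirrors the CREDENTIAL_SET constant and helpers added above.
const CREDENTIAL_SET = {
  PREFIX: 'credentialSet:',
} as const

function isCredentialSetValue(value: string | null | undefined): boolean {
  return typeof value === 'string' && value.startsWith(CREDENTIAL_SET.PREFIX)
}

function extractCredentialSetId(value: string): string {
  // Strips the "credentialSet:" prefix, leaving the raw set id.
  return value.slice(CREDENTIAL_SET.PREFIX.length)
}

// A credential picker value may be a plain credential id or a prefixed set id.
console.log(isCredentialSetValue('credentialSet:abc123')) // true
console.log(isCredentialSetValue('plain-credential-id')) // false
console.log(extractCredentialSetId('credentialSet:abc123')) // 'abc123'
```

Sharing one prefix constant between the check and the extractor keeps the two from drifting apart.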
View File

@@ -339,7 +339,7 @@ export class BlockExecutor {
if (isTrigger) {
const filtered: NormalizedBlockOutput = {}
-const internalKeys = ['webhook', 'workflowId', 'input']
+const internalKeys = ['webhook', 'workflowId']
for (const [key, value] of Object.entries(output)) {
if (internalKeys.includes(key)) continue
filtered[key] = value

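The behavior change in this hunk — `input` is no longer treated as an internal key — can be sketched with a standalone filter. This is a simplification of the handler above: `NormalizedBlockOutput` is flattened to a plain record, and the sample payload is invented:

```typescript
// Simplified sketch of the trigger-output filtering above: internal keys are
// dropped, everything else passes through. After this change, `input` is
// forwarded to downstream blocks instead of being stripped.
function filterTriggerOutput(output: Record<string, unknown>): Record<string, unknown> {
  const internalKeys = ['webhook', 'workflowId']
  const filtered: Record<string, unknown> = {}
  for (const [key, value] of Object.entries(output)) {
    if (internalKeys.includes(key)) continue
    filtered[key] = value
  }
  return filtered
}

const out = filterTriggerOutput({ webhook: {}, workflowId: 'wf_1', input: { q: 'hi' }, email: {} })
// `input` and `email` survive; `webhook` and `workflowId` are dropped.
console.log(Object.keys(out)) // ['input', 'email']
```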
View File

@@ -17,6 +17,7 @@ export interface ExecutionMetadata {
isClientSession?: boolean
pendingBlocks?: string[]
resumeFromSnapshot?: boolean
credentialAccountUserId?: string
workflowStateOverride?: {
blocks: Record<string, any>
edges: Edge[]

View File

@@ -928,6 +928,9 @@ export class AgentBlockHandler implements BlockHandler {
vertexProject: inputs.vertexProject,
vertexLocation: inputs.vertexLocation,
vertexCredential: inputs.vertexCredential,
bedrockAccessKeyId: inputs.bedrockAccessKeyId,
bedrockSecretKey: inputs.bedrockSecretKey,
bedrockRegion: inputs.bedrockRegion,
responseFormat,
workflowId: ctx.workflowId,
workspaceId: ctx.workspaceId,
@@ -1029,6 +1032,9 @@ export class AgentBlockHandler implements BlockHandler {
azureApiVersion: providerRequest.azureApiVersion,
vertexProject: providerRequest.vertexProject,
vertexLocation: providerRequest.vertexLocation,
bedrockAccessKeyId: providerRequest.bedrockAccessKeyId,
bedrockSecretKey: providerRequest.bedrockSecretKey,
bedrockRegion: providerRequest.bedrockRegion,
responseFormat: providerRequest.responseFormat,
workflowId: providerRequest.workflowId,
workspaceId: ctx.workspaceId,

View File

@@ -22,6 +22,9 @@ export interface AgentInputs {
vertexProject?: string
vertexLocation?: string
vertexCredential?: string
bedrockAccessKeyId?: string
bedrockSecretKey?: string
bedrockRegion?: string
reasoningEffort?: string
verbosity?: string
}

View File

@@ -32,6 +32,9 @@ export class EvaluatorBlockHandler implements BlockHandler {
vertexProject: inputs.vertexProject,
vertexLocation: inputs.vertexLocation,
vertexCredential: inputs.vertexCredential,
bedrockAccessKeyId: inputs.bedrockAccessKeyId,
bedrockSecretKey: inputs.bedrockSecretKey,
bedrockRegion: inputs.bedrockRegion,
}
const providerId = getProviderFromModel(evaluatorConfig.model)
@@ -128,6 +131,12 @@ export class EvaluatorBlockHandler implements BlockHandler {
providerRequest.azureApiVersion = inputs.azureApiVersion
}
if (providerId === 'bedrock') {
providerRequest.bedrockAccessKeyId = evaluatorConfig.bedrockAccessKeyId
providerRequest.bedrockSecretKey = evaluatorConfig.bedrockSecretKey
providerRequest.bedrockRegion = evaluatorConfig.bedrockRegion
}
const response = await fetch(url.toString(), {
method: 'POST',
headers: {

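The pattern in this hunk (and the router hunks that follow) — copy provider-specific fields onto the request only when the resolved provider matches — can be isolated as a pure function. Field names follow the hunk; the request shape and `applyBedrockParams` name are simplifications for illustration:

```typescript
interface ProviderRequest {
  model: string
  bedrockAccessKeyId?: string
  bedrockSecretKey?: string
  bedrockRegion?: string
}

// Sketch of the provider-specific branching above: Bedrock credentials are
// only attached when the model resolves to the 'bedrock' provider, so other
// providers never receive AWS fields.
function applyBedrockParams(
  request: ProviderRequest,
  providerId: string,
  config: { bedrockAccessKeyId?: string; bedrockSecretKey?: string; bedrockRegion?: string }
): ProviderRequest {
  if (providerId === 'bedrock') {
    request.bedrockAccessKeyId = config.bedrockAccessKeyId
    request.bedrockSecretKey = config.bedrockSecretKey
    request.bedrockRegion = config.bedrockRegion
  }
  return request
}

const req = applyBedrockParams({ model: 'claude' }, 'bedrock', { bedrockRegion: 'us-east-1' })
console.log(req.bedrockRegion) // 'us-east-1'
```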
View File

@@ -68,6 +68,9 @@ export class RouterBlockHandler implements BlockHandler {
vertexProject: inputs.vertexProject,
vertexLocation: inputs.vertexLocation,
vertexCredential: inputs.vertexCredential,
bedrockAccessKeyId: inputs.bedrockAccessKeyId,
bedrockSecretKey: inputs.bedrockSecretKey,
bedrockRegion: inputs.bedrockRegion,
}
const providerId = getProviderFromModel(routerConfig.model)
@@ -104,6 +107,12 @@ export class RouterBlockHandler implements BlockHandler {
providerRequest.azureApiVersion = inputs.azureApiVersion
}
if (providerId === 'bedrock') {
providerRequest.bedrockAccessKeyId = routerConfig.bedrockAccessKeyId
providerRequest.bedrockSecretKey = routerConfig.bedrockSecretKey
providerRequest.bedrockRegion = routerConfig.bedrockRegion
}
const response = await fetch(url.toString(), {
method: 'POST',
headers: {
@@ -197,6 +206,9 @@ export class RouterBlockHandler implements BlockHandler {
vertexProject: inputs.vertexProject,
vertexLocation: inputs.vertexLocation,
vertexCredential: inputs.vertexCredential,
bedrockAccessKeyId: inputs.bedrockAccessKeyId,
bedrockSecretKey: inputs.bedrockSecretKey,
bedrockRegion: inputs.bedrockRegion,
}
const providerId = getProviderFromModel(routerConfig.model)
@@ -233,6 +245,12 @@ export class RouterBlockHandler implements BlockHandler {
providerRequest.azureApiVersion = inputs.azureApiVersion
}
if (providerId === 'bedrock') {
providerRequest.bedrockAccessKeyId = routerConfig.bedrockAccessKeyId
providerRequest.bedrockSecretKey = routerConfig.bedrockSecretKey
providerRequest.bedrockRegion = routerConfig.bedrockRegion
}
const response = await fetch(url.toString(), {
method: 'POST',
headers: {

View File

@@ -124,6 +124,7 @@ export interface ExecutionMetadata {
isDebugSession?: boolean
context?: ExecutionContext
workflowConnections?: Array<{ source: string; target: string }>
credentialAccountUserId?: string
status?: 'running' | 'paused' | 'completed'
pausePoints?: string[]
resumeChain?: {

View File

@@ -15,18 +15,26 @@ export interface BYOKKey {
updatedAt: string
}
export interface BYOKKeysResponse {
keys: BYOKKey[]
byokEnabled: boolean
}
export const byokKeysKeys = {
all: ['byok-keys'] as const,
workspace: (workspaceId: string) => [...byokKeysKeys.all, 'workspace', workspaceId] as const,
}
-async function fetchBYOKKeys(workspaceId: string): Promise<BYOKKey[]> {
+async function fetchBYOKKeys(workspaceId: string): Promise<BYOKKeysResponse> {
const response = await fetch(API_ENDPOINTS.WORKSPACE_BYOK_KEYS(workspaceId))
if (!response.ok) {
throw new Error(`Failed to load BYOK keys: ${response.statusText}`)
}
-const { keys } = await response.json()
-return keys
+const data = await response.json()
+return {
+keys: data.keys ?? [],
+byokEnabled: data.byokEnabled ?? true,
+}
}
export function useBYOKKeys(workspaceId: string) {
@@ -36,6 +44,7 @@ export function useBYOKKeys(workspaceId: string) {
enabled: !!workspaceId,
staleTime: 60 * 1000,
placeholderData: keepPreviousData,
select: (data) => data,
})
}

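The defensive parsing added to `fetchBYOKKeys` — defaulting `keys` to an empty array and `byokEnabled` to `true` when the server omits them — can be isolated. `parseBYOKResponse` is a hypothetical helper name; the defaults are copied from the hunk:

```typescript
interface BYOKKeysResponse {
  keys: unknown[]
  byokEnabled: boolean
}

// Hypothetical helper isolating the nullish-coalescing defaults used in
// fetchBYOKKeys: missing fields fall back to safe values instead of
// propagating undefined into callers.
function parseBYOKResponse(data: { keys?: unknown[]; byokEnabled?: boolean }): BYOKKeysResponse {
  return {
    keys: data.keys ?? [],
    byokEnabled: data.byokEnabled ?? true,
  }
}

console.log(parseBYOKResponse({}).byokEnabled) // true
console.log(parseBYOKResponse({ byokEnabled: false }).byokEnabled) // false
```

Defaulting `byokEnabled` to `true` keeps older server responses (which never sent the flag) behaving as before.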
View File

@@ -0,0 +1,370 @@
import { useMutation, useQuery, useQueryClient } from '@tanstack/react-query'
import { fetchJson } from '@/hooks/selectors/helpers'
export interface CredentialSet {
id: string
name: string
description: string | null
providerId: string | null
createdBy: string
createdAt: string
updatedAt: string
creatorName: string | null
creatorEmail: string | null
memberCount: number
}
export interface CredentialSetMembership {
membershipId: string
status: string
joinedAt: string | null
credentialSetId: string
credentialSetName: string
credentialSetDescription: string | null
providerId: string | null
organizationId: string
organizationName: string
}
export interface CredentialSetInvitation {
invitationId: string
token: string
status: string
expiresAt: string
createdAt: string
credentialSetId: string
credentialSetName: string
providerId: string | null
organizationId: string
organizationName: string
invitedByName: string | null
invitedByEmail: string | null
}
interface CredentialSetsResponse {
credentialSets?: CredentialSet[]
}
interface MembershipsResponse {
memberships?: CredentialSetMembership[]
}
interface InvitationsResponse {
invitations?: CredentialSetInvitation[]
}
export const credentialSetKeys = {
all: ['credentialSets'] as const,
list: (organizationId?: string) => ['credentialSets', 'list', organizationId ?? 'none'] as const,
detail: (id?: string) => ['credentialSets', 'detail', id ?? 'none'] as const,
memberships: () => ['credentialSets', 'memberships'] as const,
invitations: () => ['credentialSets', 'invitations'] as const,
}
export async function fetchCredentialSets(organizationId: string): Promise<CredentialSet[]> {
if (!organizationId) return []
const data = await fetchJson<CredentialSetsResponse>('/api/credential-sets', {
searchParams: { organizationId },
})
return data.credentialSets ?? []
}
export function useCredentialSets(organizationId?: string, enabled = true) {
return useQuery<CredentialSet[]>({
queryKey: credentialSetKeys.list(organizationId),
queryFn: () => fetchCredentialSets(organizationId ?? ''),
enabled: Boolean(organizationId) && enabled,
staleTime: 60 * 1000,
})
}
interface CredentialSetDetailResponse {
credentialSet?: CredentialSet
}
export async function fetchCredentialSetById(id: string): Promise<CredentialSet | null> {
if (!id) return null
const data = await fetchJson<CredentialSetDetailResponse>(`/api/credential-sets/${id}`)
return data.credentialSet ?? null
}
export function useCredentialSetDetail(id?: string, enabled = true) {
return useQuery<CredentialSet | null>({
queryKey: credentialSetKeys.detail(id),
queryFn: () => fetchCredentialSetById(id ?? ''),
enabled: Boolean(id) && enabled,
staleTime: 60 * 1000,
})
}
export function useCredentialSetMemberships() {
return useQuery<CredentialSetMembership[]>({
queryKey: credentialSetKeys.memberships(),
queryFn: async () => {
const data = await fetchJson<MembershipsResponse>('/api/credential-sets/memberships')
return data.memberships ?? []
},
staleTime: 60 * 1000,
})
}
export function useCredentialSetInvitations() {
return useQuery<CredentialSetInvitation[]>({
queryKey: credentialSetKeys.invitations(),
queryFn: async () => {
const data = await fetchJson<InvitationsResponse>('/api/credential-sets/invitations')
return data.invitations ?? []
},
staleTime: 30 * 1000,
})
}
export function useAcceptCredentialSetInvitation() {
const queryClient = useQueryClient()
return useMutation({
mutationFn: async (token: string) => {
const response = await fetch(`/api/credential-sets/invite/${token}`, {
method: 'POST',
})
if (!response.ok) {
const data = await response.json()
throw new Error(data.error || 'Failed to accept invitation')
}
return response.json()
},
onSuccess: () => {
queryClient.invalidateQueries({ queryKey: credentialSetKeys.memberships() })
queryClient.invalidateQueries({ queryKey: credentialSetKeys.invitations() })
},
})
}
export interface CreateCredentialSetData {
organizationId: string
name: string
description?: string
providerId?: string
}
export function useCreateCredentialSet() {
const queryClient = useQueryClient()
return useMutation({
mutationFn: async (data: CreateCredentialSetData) => {
const response = await fetch('/api/credential-sets', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify(data),
})
if (!response.ok) {
const result = await response.json()
throw new Error(result.error || 'Failed to create credential set')
}
return response.json()
},
onSuccess: (_data, variables) => {
queryClient.invalidateQueries({ queryKey: credentialSetKeys.list(variables.organizationId) })
},
})
}
export function useCreateCredentialSetInvitation() {
const queryClient = useQueryClient()
return useMutation({
mutationFn: async (data: { credentialSetId: string; email?: string }) => {
const response = await fetch(`/api/credential-sets/${data.credentialSetId}/invite`, {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({ email: data.email }),
})
if (!response.ok) {
const result = await response.json()
throw new Error(result.error || 'Failed to create invitation')
}
return response.json()
},
onSuccess: () => {
queryClient.invalidateQueries({ queryKey: credentialSetKeys.all })
},
})
}
export interface CredentialSetMember {
id: string
userId: string
status: string
joinedAt: string | null
createdAt: string
userName: string | null
userEmail: string | null
userImage: string | null
credentials: { providerId: string; accountId: string }[]
}
interface MembersResponse {
members?: CredentialSetMember[]
}
export function useCredentialSetMembers(credentialSetId?: string) {
return useQuery<CredentialSetMember[]>({
queryKey: [...credentialSetKeys.detail(credentialSetId), 'members'],
queryFn: async () => {
const data = await fetchJson<MembersResponse>(
`/api/credential-sets/${credentialSetId}/members`
)
return data.members ?? []
},
enabled: Boolean(credentialSetId),
staleTime: 30 * 1000,
})
}
export function useRemoveCredentialSetMember() {
const queryClient = useQueryClient()
return useMutation({
mutationFn: async (data: { credentialSetId: string; memberId: string }) => {
const response = await fetch(
`/api/credential-sets/${data.credentialSetId}/members?memberId=${data.memberId}`,
{ method: 'DELETE' }
)
if (!response.ok) {
const result = await response.json()
throw new Error(result.error || 'Failed to remove member')
}
return response.json()
},
onSuccess: (_data, variables) => {
queryClient.invalidateQueries({
queryKey: [...credentialSetKeys.detail(variables.credentialSetId), 'members'],
})
queryClient.invalidateQueries({ queryKey: credentialSetKeys.all })
},
})
}
export function useLeaveCredentialSet() {
const queryClient = useQueryClient()
return useMutation({
mutationFn: async (credentialSetId: string) => {
const response = await fetch(
`/api/credential-sets/memberships?credentialSetId=${credentialSetId}`,
{ method: 'DELETE' }
)
if (!response.ok) {
const data = await response.json()
throw new Error(data.error || 'Failed to leave credential set')
}
return response.json()
},
onSuccess: () => {
queryClient.invalidateQueries({ queryKey: credentialSetKeys.memberships() })
},
})
}
export interface DeleteCredentialSetParams {
credentialSetId: string
organizationId: string
}
export function useDeleteCredentialSet() {
const queryClient = useQueryClient()
return useMutation({
mutationFn: async ({ credentialSetId }: DeleteCredentialSetParams) => {
const response = await fetch(`/api/credential-sets/${credentialSetId}`, {
method: 'DELETE',
})
if (!response.ok) {
const data = await response.json()
throw new Error(data.error || 'Failed to delete credential set')
}
return response.json()
},
onSuccess: (_data, variables) => {
queryClient.invalidateQueries({
queryKey: credentialSetKeys.list(variables.organizationId),
})
queryClient.invalidateQueries({ queryKey: credentialSetKeys.memberships() })
},
})
}
export interface CredentialSetInvitationDetail {
id: string
credentialSetId: string
email: string | null
token: string
status: string
expiresAt: string
createdAt: string
invitedBy: string
}
interface InvitationsDetailResponse {
invitations?: CredentialSetInvitationDetail[]
}
export function useCredentialSetInvitationsDetail(credentialSetId?: string) {
return useQuery<CredentialSetInvitationDetail[]>({
queryKey: [...credentialSetKeys.detail(credentialSetId), 'invitations'],
queryFn: async () => {
const data = await fetchJson<InvitationsDetailResponse>(
`/api/credential-sets/${credentialSetId}/invite`
)
return (data.invitations ?? []).filter((inv) => inv.status === 'pending')
},
enabled: Boolean(credentialSetId),
staleTime: 30 * 1000,
})
}
export function useCancelCredentialSetInvitation() {
const queryClient = useQueryClient()
return useMutation({
mutationFn: async (data: { credentialSetId: string; invitationId: string }) => {
const response = await fetch(
`/api/credential-sets/${data.credentialSetId}/invite?invitationId=${data.invitationId}`,
{ method: 'DELETE' }
)
if (!response.ok) {
const result = await response.json()
throw new Error(result.error || 'Failed to cancel invitation')
}
return response.json()
},
onSuccess: (_data, variables) => {
queryClient.invalidateQueries({
queryKey: [...credentialSetKeys.detail(variables.credentialSetId), 'invitations'],
})
},
})
}
export function useResendCredentialSetInvitation() {
const queryClient = useQueryClient()
return useMutation({
mutationFn: async (data: { credentialSetId: string; invitationId: string; email: string }) => {
const response = await fetch(
`/api/credential-sets/${data.credentialSetId}/invite/${data.invitationId}`,
{ method: 'POST' }
)
if (!response.ok) {
const result = await response.json()
throw new Error(result.error || 'Failed to resend invitation')
}
return response.json()
},
onSuccess: (_data, variables) => {
queryClient.invalidateQueries({
queryKey: [...credentialSetKeys.detail(variables.credentialSetId), 'invitations'],
})
},
})
}

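The `credentialSetKeys` factory above follows the common TanStack Query key-hierarchy pattern: broad keys are prefixes of narrow ones, so invalidating `all` also matches every list and detail query (as the mutation `onSuccess` handlers rely on). A standalone sketch of that containment property, with `isPrefix` as a minimal stand-in for the library's prefix matching:

```typescript
// Mirrors the key factory above: each narrower key extends a broader prefix,
// so invalidateQueries({ queryKey: credentialSetKeys.all }) touches them all.
const credentialSetKeys = {
  all: ['credentialSets'] as const,
  list: (organizationId?: string) => ['credentialSets', 'list', organizationId ?? 'none'] as const,
  detail: (id?: string) => ['credentialSets', 'detail', id ?? 'none'] as const,
}

// Minimal reimplementation of prefix matching for illustration; TanStack
// Query performs an equivalent check when deciding which queries to invalidate.
function isPrefix(prefix: readonly unknown[], key: readonly unknown[]): boolean {
  return prefix.every((part, i) => key[i] === part)
}

console.log(isPrefix(credentialSetKeys.all, credentialSetKeys.list('org_1'))) // true
console.log(isPrefix(credentialSetKeys.list('org_1'), credentialSetKeys.list('org_2'))) // false
```

The `?? 'none'` fallback keeps keys serializable and stable when no id is supplied, matching the `enabled: Boolean(...)` guards on the queries.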
View File

@@ -1,5 +1,7 @@
import { useQuery } from '@tanstack/react-query'
import type { Credential } from '@/lib/oauth'
import { CREDENTIAL, CREDENTIAL_SET } from '@/executor/constants'
import { useCredentialSetDetail } from '@/hooks/queries/credential-sets'
import { fetchJson } from '@/hooks/selectors/helpers'
interface CredentialListResponse {
@@ -61,14 +63,28 @@ export function useOAuthCredentialDetail(
}
export function useCredentialName(credentialId?: string, providerId?: string, workflowId?: string) {
// Check if this is a credential set value
const isCredentialSet = credentialId?.startsWith(CREDENTIAL_SET.PREFIX) ?? false
const credentialSetId = isCredentialSet
? credentialId?.slice(CREDENTIAL_SET.PREFIX.length)
: undefined
// Fetch credential set by ID directly
const { data: credentialSetData, isFetching: credentialSetLoading } = useCredentialSetDetail(
credentialSetId,
isCredentialSet
)
const { data: credentials = [], isFetching: credentialsLoading } = useOAuthCredentials(
providerId,
-Boolean(providerId)
+Boolean(providerId) && !isCredentialSet
)
const selectedCredential = credentials.find((cred) => cred.id === credentialId)
-const shouldFetchDetail = Boolean(credentialId && !selectedCredential && providerId && workflowId)
+const shouldFetchDetail = Boolean(
+credentialId && !selectedCredential && providerId && workflowId && !isCredentialSet
+)
const { data: foreignCredentials = [], isFetching: foreignLoading } = useOAuthCredentialDetail(
shouldFetchDetail ? credentialId : undefined,
@@ -77,12 +93,17 @@ export function useCredentialName(credentialId?: string, providerId?: string, wo
)
const hasForeignMeta = foreignCredentials.length > 0
const isForeignCredentialSet = isCredentialSet && !credentialSetData && !credentialSetLoading
-const displayName = selectedCredential?.name ?? (hasForeignMeta ? 'Saved by collaborator' : null)
+const displayName =
+credentialSetData?.name ??
+selectedCredential?.name ??
+(hasForeignMeta ? CREDENTIAL.FOREIGN_LABEL : null) ??
+(isForeignCredentialSet ? CREDENTIAL.FOREIGN_LABEL : null)
return {
displayName,
-isLoading: credentialsLoading || foreignLoading,
+isLoading: credentialsLoading || foreignLoading || (isCredentialSet && credentialSetLoading),
hasForeignMeta,
}
}

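The name-resolution order in `useCredentialName` — credential set name first, then the user's own credential, then the collaborator fallback — reduces to a chain of nullish coalescing. A pure sketch with the hook's data dependencies flattened into plain arguments (`resolveDisplayName` and its options shape are invented for illustration; the label and precedence are copied from the hunk):

```typescript
const FOREIGN_LABEL = 'Saved by collaborator'

// Pure sketch of the displayName precedence above: a resolved credential-set
// name wins, then the user's own credential name, then the collaborator
// fallback when only foreign metadata (or an unresolvable set) remains.
function resolveDisplayName(opts: {
  credentialSetName?: string
  ownCredentialName?: string
  hasForeignMeta: boolean
  isForeignCredentialSet: boolean
}): string | null {
  return (
    opts.credentialSetName ??
    opts.ownCredentialName ??
    (opts.hasForeignMeta ? FOREIGN_LABEL : null) ??
    (opts.isForeignCredentialSet ? FOREIGN_LABEL : null)
  )
}

console.log(
  resolveDisplayName({
    credentialSetName: 'Support Inboxes',
    ownCredentialName: 'me@example.com',
    hasForeignMeta: false,
    isForeignCredentialSet: false,
  })
) // 'Support Inboxes'
console.log(resolveDisplayName({ hasForeignMeta: true, isForeignCredentialSet: false })) // 'Saved by collaborator'
console.log(resolveDisplayName({ hasForeignMeta: false, isForeignCredentialSet: false })) // null
```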
Some files were not shown because too many files have changed in this diff.