Compare commits

...

33 Commits

Author SHA1 Message Date
Waleed Latif
e107363ea7 v0.3.35: migrations, custom email address support 2025-08-21 12:36:51 -07:00
Waleed Latif
7e364a7977 fix(emails): remove unused useCustomFromFormat param (#1082)
* fix(mailer): remove unused useCustomFormat

* bun.lock changes
2025-08-21 12:09:03 -07:00
Waleed Latif
35a37d8b45 fix(acs): added FROM_EMAIL_ADDRESS envvar for ACS (#1081)
* fix: clear Docker build cache to use correct Next.js version

* fix(mailer): add FROM_EMAIL_ADDRESS envvar for ACS

* bun.lock

* added tests
2025-08-21 11:57:44 -07:00
Vikhyath Mondreti
2b52d88cee fix(migrations): add missing migration for document table (#1080)
* fix(migrations): add missing migration for document table

* add newline at end of file
2025-08-21 11:48:54 -07:00
Waleed Latif
abad3620a3 fix(build): clear docker build cache to use correct Next.js version 2025-08-21 01:43:45 -07:00
Waleed Latif
a37c6bc812 fix(build): clear docker build cache to use correct Next.js version (#1075)
* fix: clear Docker build cache to use correct Next.js version

- Changed GitHub Actions cache scope from build-v2 to build-v3
- This should force a fresh build without cached Next.js 15.5.0 layers
- Reverted to ^15.3.2 version format that worked on main branch

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>

* run install

---------

Co-authored-by: Claude <noreply@anthropic.com>
2025-08-21 01:38:47 -07:00
Waleed Latif
cd1bd95952 fix(nextjs): downgrade nextjs due to known issue with bun commonjs module bundling (#1073) 2025-08-21 01:24:06 -07:00
Waleed Latif
4c9fdbe7fb fix(nextjs): downgrade nextjs due to known issue with bun commonjs module bundling (#1073) 2025-08-21 01:23:10 -07:00
Waleed Latif
2c47cf4161 v0.3.34: azure-openai options, billing fixes, mistral OCR via Azure, start block input format changes 2025-08-20 21:05:48 -07:00
Waleed Latif
db1cf8a6db fix(placeholder): fix starter block placeholder (#1071) 2025-08-20 21:01:37 -07:00
Vikhyath Mondreti
c6912095f7 fix placeholder text 2025-08-20 20:38:15 -07:00
Waleed Latif
154d9eef6a fix(gpt-5): fix chat-completions api (#1070) 2025-08-20 20:36:12 -07:00
Emir Karabeg
c2ded1f3e1 fix(theme-provider): preventing flash on page load (#1067)
* fix(theme-provider): preventing flash on page load

* consolidated themes to use NextJS theme logic

* improvement: optimized latency
2025-08-20 20:20:23 -07:00
Waleed Latif
ff43528d35 fix(gpt-5): fixed verbosity and reasoning params (#1069)
* fix(gpt-5): fixed verbosity and reasoning params

* fixed dropdown

* default values for verbosity and reasoning effort

* cleanup

* use default value in dropdown
2025-08-20 20:18:02 -07:00
Vikhyath Mondreti
692ba69864 fix type 2025-08-20 20:00:41 -07:00
Adam Gough
cb7ce8659b fix(msverify): changed consent for microsoft (#1057)
* changed consent

* changed excel error message and default sheets

* changed variable res for excel

---------

Co-authored-by: Adam Gough <adamgough@Mac.attlocal.net>
2025-08-20 19:54:51 -07:00
Vikhyath Mondreti
5caef3a37d fix(input-format): first time execution bug (#1068) 2025-08-20 19:52:04 -07:00
Waleed Latif
a6888da124 fix(semantics): fix incorrect imports (#1066)
* fix(semantics): fix incorrect import

* fixed all incorrect imports
2025-08-20 19:02:52 -07:00
Vikhyath Mondreti
07b0597f4f improvement(trigger): upgrade import path for trigger (#1065) 2025-08-20 18:41:13 -07:00
Vikhyath Mondreti
71e2994f9d improvement(trigger): upgrade trigger (#1063) 2025-08-20 18:33:01 -07:00
Vikhyath Mondreti
9973b2c165 Merge branch 'staging' of github.com:simstudioai/sim into staging 2025-08-20 18:26:08 -07:00
Vikhyath Mondreti
d9e5777538 use personal access token 2025-08-20 18:24:17 -07:00
Waleed Latif
dd74267313 feat(nextjs): upgrade nextjs to 15.5 (#1062) 2025-08-20 18:22:35 -07:00
Vikhyath Mondreti
1db72dc823 pin version 2025-08-20 18:13:15 -07:00
Vikhyath Mondreti
da707fa491 improvement(gh-action): add gh action to deploy to correct environment for trigger.dev (#1060)
* improvement(gh-action): add gh action to deploy to correct environment for trigger.dev

* add dep installation

* change away from pull request target
2025-08-20 18:10:43 -07:00
Vikhyath Mondreti
9ffaf305bd feat(input-format): add value field to test input formats (#1059)
* feat(input-format): add value field to test input formats

* fix lint

* fix typing issue

* change to dropdown for boolean
2025-08-20 18:03:47 -07:00
Waleed Latif
26e6286fda fix(billing): fix team plan upgrade (#1053) 2025-08-20 17:05:35 -07:00
Waleed Latif
c795fc83aa feat(azure-openai): allow usage of azure-openai for knowledgebase uploads and wand generation (#1056)
* feat(azure-openai): allow usage of azure-openai for knowledgebase uploads

* feat(azure-openai): added azure-openai for kb and wand

* added embeddings utils, added the ability to use mistral through Azure

* fix(oauth): gdrive picker race condition, token route cleanup

* fix test

* feat(mailer): consolidated all emailing to mailer service, added support for Azure ACS (#1054)

* feat(mailer): consolidated all emailing to mailer service, added support for Azure ACS

* fix batch invitation email template

* cleanup

* improvement(emails): add help template instead of doing it inline

* remove fallback version

---------

Co-authored-by: Vikhyath Mondreti <vikhyath@simstudio.ai>
2025-08-20 17:04:52 -07:00
Waleed Latif
cea42f5135 improvement(gpt-5): added reasoning level and verbosity to gpt-5 models (#1058) 2025-08-20 17:04:39 -07:00
Waleed Latif
6fd6f921dc feat(mailer): consolidated all emailing to mailer service, added support for Azure ACS (#1054)
* feat(mailer): consolidated all emailing to mailer service, added support for Azure ACS

* fix batch invitation email template

* cleanup

* improvement(emails): add help template instead of doing it inline
2025-08-20 16:02:49 -07:00
Vikhyath Mondreti
7530fb9a4e Merge pull request #1055 from simstudioai/fix/picker-race-cond
fix(oauth): gdrive picker race condition, token route cleanup
2025-08-20 15:03:57 -07:00
Vikhyath Mondreti
9a5b035822 fix test 2025-08-20 13:55:54 -07:00
Vikhyath Mondreti
0c0b6bf967 fix(oauth): gdrive picker race condition, token route cleanup 2025-08-20 12:33:46 -07:00
87 changed files with 3645 additions and 1922 deletions


@@ -85,8 +85,8 @@ jobs:
 push: ${{ github.event_name != 'pull_request' }}
 tags: ${{ steps.meta.outputs.tags }}
 labels: ${{ steps.meta.outputs.labels }}
-cache-from: type=gha,scope=build-v2
-cache-to: type=gha,mode=max,scope=build-v2
+cache-from: type=gha,scope=build-v3
+cache-to: type=gha,mode=max,scope=build-v3
 provenance: false
 sbom: false

.github/workflows/trigger-deploy.yml (new file, 44 lines)

@@ -0,0 +1,44 @@
name: Trigger.dev Deploy

on:
  push:
    branches:
      - main
      - staging

jobs:
  deploy:
    name: Trigger.dev Deploy
    runs-on: ubuntu-latest
    concurrency:
      group: trigger-deploy-${{ github.ref }}
      cancel-in-progress: false
    env:
      TRIGGER_ACCESS_TOKEN: ${{ secrets.TRIGGER_ACCESS_TOKEN }}
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: 'lts/*'
      - name: Setup Bun
        uses: oven-sh/setup-bun@v2
        with:
          bun-version: latest
      - name: Install dependencies
        run: bun install
      - name: Deploy to Staging
        if: github.ref == 'refs/heads/staging'
        working-directory: ./apps/sim
        run: npx --yes trigger.dev@4.0.0 deploy -e staging
      - name: Deploy to Production
        if: github.ref == 'refs/heads/main'
        working-directory: ./apps/sim
        run: npx --yes trigger.dev@4.0.0 deploy


@@ -115,8 +115,7 @@ Read data from a Microsoft Excel spreadsheet
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `success` | boolean | Operation success status |
| `output` | object | Excel spreadsheet data and metadata |
| `data` | object | Range data from the spreadsheet |
### `microsoft_excel_write`
@@ -136,8 +135,11 @@ Write data to a Microsoft Excel spreadsheet
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `success` | boolean | Operation success status |
| `output` | object | Write operation results and metadata |
| `updatedRange` | string | The range that was updated |
| `updatedRows` | number | Number of rows that were updated |
| `updatedColumns` | number | Number of columns that were updated |
| `updatedCells` | number | Number of cells that were updated |
| `metadata` | object | Spreadsheet metadata |
### `microsoft_excel_table_add`
@@ -155,8 +157,9 @@ Add new rows to a Microsoft Excel table
| Parameter | Type | Description |
| --------- | ---- | ----------- |
| `success` | boolean | Operation success status |
| `output` | object | Table add operation results and metadata |
| `index` | number | Index of the first row that was added |
| `values` | array | Array of rows that were added to the table |
| `metadata` | object | Spreadsheet metadata |


@@ -84,14 +84,12 @@ export async function GET(request: NextRequest) {
return NextResponse.json({ error: 'Credential not found' }, { status: 404 })
}
// Check if the access token is valid
if (!credential.accessToken) {
logger.warn(`[${requestId}] No access token available for credential`)
return NextResponse.json({ error: 'No access token available' }, { status: 400 })
}
try {
// Refresh the token if needed
const { accessToken } = await refreshTokenIfNeeded(requestId, credential, credentialId)
return NextResponse.json({ accessToken }, { status: 200 })
} catch (_error) {


@@ -1,4 +1,4 @@
import { and, eq } from 'drizzle-orm'
import { and, desc, eq } from 'drizzle-orm'
import { getSession } from '@/lib/auth'
import { createLogger } from '@/lib/logs/console/logger'
import { refreshOAuthToken } from '@/lib/oauth/oauth'
@@ -70,7 +70,8 @@ export async function getOAuthToken(userId: string, providerId: string): Promise
})
.from(account)
.where(and(eq(account.userId, userId), eq(account.providerId, providerId)))
.orderBy(account.createdAt)
// Always use the most recently updated credential for this provider
.orderBy(desc(account.updatedAt))
.limit(1)
if (connections.length === 0) {
@@ -80,19 +81,13 @@ export async function getOAuthToken(userId: string, providerId: string): Promise
const credential = connections[0]
// Check if we have a valid access token
if (!credential.accessToken) {
logger.warn(`Access token is null for user ${userId}, provider ${providerId}`)
return null
}
// Check if the token is expired and needs refreshing
// Determine whether we should refresh: missing token OR expired token
const now = new Date()
const tokenExpiry = credential.accessTokenExpiresAt
// Only refresh if we have an expiration time AND it's expired AND we have a refresh token
const needsRefresh = tokenExpiry && tokenExpiry < now && !!credential.refreshToken
const shouldAttemptRefresh =
!!credential.refreshToken && (!credential.accessToken || (tokenExpiry && tokenExpiry < now))
if (needsRefresh) {
if (shouldAttemptRefresh) {
logger.info(
`Access token expired for user ${userId}, provider ${providerId}. Attempting to refresh.`
)
@@ -141,6 +136,13 @@ export async function getOAuthToken(userId: string, providerId: string): Promise
}
}
if (!credential.accessToken) {
logger.warn(
`Access token is null and no refresh attempted or available for user ${userId}, provider ${providerId}`
)
return null
}
logger.info(`Found valid OAuth token for user ${userId}, provider ${providerId}`)
return credential.accessToken
}
@@ -164,19 +166,21 @@ export async function refreshAccessTokenIfNeeded(
return null
}
// Check if we need to refresh the token
// Decide if we should refresh: token missing OR expired
const expiresAt = credential.accessTokenExpiresAt
const now = new Date()
// Only refresh if we have an expiration time AND it's expired
// If no expiration time is set (newly created credentials), assume token is valid
const needsRefresh = expiresAt && expiresAt <= now
const shouldRefresh =
!!credential.refreshToken && (!credential.accessToken || (expiresAt && expiresAt <= now))
const accessToken = credential.accessToken
if (needsRefresh && credential.refreshToken) {
if (shouldRefresh) {
logger.info(`[${requestId}] Token expired, attempting to refresh for credential`)
try {
const refreshedToken = await refreshOAuthToken(credential.providerId, credential.refreshToken)
const refreshedToken = await refreshOAuthToken(
credential.providerId,
credential.refreshToken!
)
if (!refreshedToken) {
logger.error(`[${requestId}] Failed to refresh token for credential: ${credentialId}`, {
@@ -217,6 +221,7 @@ export async function refreshAccessTokenIfNeeded(
return null
}
} else if (!accessToken) {
// We have no access token and either no refresh token or not eligible to refresh
logger.error(`[${requestId}] Missing access token for credential`)
return null
}
@@ -233,21 +238,20 @@ export async function refreshTokenIfNeeded(
credential: any,
credentialId: string
): Promise<{ accessToken: string; refreshed: boolean }> {
// Check if we need to refresh the token
// Decide if we should refresh: token missing OR expired
const expiresAt = credential.accessTokenExpiresAt
const now = new Date()
// Only refresh if we have an expiration time AND it's expired
// If no expiration time is set (newly created credentials), assume token is valid
const needsRefresh = expiresAt && expiresAt <= now
const shouldRefresh =
!!credential.refreshToken && (!credential.accessToken || (expiresAt && expiresAt <= now))
// If token is still valid, return it directly
if (!needsRefresh || !credential.refreshToken) {
// If token appears valid and present, return it directly
if (!shouldRefresh) {
logger.info(`[${requestId}] Access token is valid`)
return { accessToken: credential.accessToken, refreshed: false }
}
try {
const refreshResult = await refreshOAuthToken(credential.providerId, credential.refreshToken)
const refreshResult = await refreshOAuthToken(credential.providerId, credential.refreshToken!)
if (!refreshResult) {
logger.error(`[${requestId}] Failed to refresh token for credential`)
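The refresh-eligibility change in the diff above can be sketched on its own. The variable names `needsRefresh` and `shouldAttemptRefresh` come from the diff; the credential shape is a minimal assumption (the real `credential` row has more fields):

```typescript
interface OAuthCredential {
  accessToken: string | null
  refreshToken: string | null
  accessTokenExpiresAt: Date | null
}

// Old rule: refresh only when an expiry exists, is past, and a refresh token exists.
// A credential with a null access token but a valid refresh token was never refreshed.
function needsRefresh(c: OAuthCredential, now = new Date()): boolean {
  return !!(c.accessTokenExpiresAt && c.accessTokenExpiresAt < now && c.refreshToken)
}

// New rule: refresh whenever a refresh token exists AND the access token is
// missing OR expired. This covers the missing-token case the old rule dropped.
function shouldAttemptRefresh(c: OAuthCredential, now = new Date()): boolean {
  return (
    !!c.refreshToken &&
    (!c.accessToken || !!(c.accessTokenExpiresAt && c.accessTokenExpiresAt < now))
  )
}
```

The difference shows for a credential with a refresh token but a null access token: the old rule skipped the refresh and the request later failed with "Missing access token", while the new rule attempts the refresh first.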


@@ -1,12 +1,13 @@
import { type NextRequest, NextResponse } from 'next/server'
import { Resend } from 'resend'
import { z } from 'zod'
import { renderHelpConfirmationEmail } from '@/components/emails'
import { getSession } from '@/lib/auth'
import { sendEmail } from '@/lib/email/mailer'
import { getFromEmailAddress } from '@/lib/email/utils'
import { env } from '@/lib/env'
import { createLogger } from '@/lib/logs/console/logger'
import { getEmailDomain } from '@/lib/urls/utils'
const resend = env.RESEND_API_KEY ? new Resend(env.RESEND_API_KEY) : null
const logger = createLogger('HelpAPI')
const helpFormSchema = z.object({
@@ -28,18 +29,6 @@ export async function POST(req: NextRequest) {
const email = session.user.email
// Check if Resend API key is configured
if (!resend) {
logger.error(`[${requestId}] RESEND_API_KEY not configured`)
return NextResponse.json(
{
error:
'Email service not configured. Please set RESEND_API_KEY in environment variables.',
},
{ status: 500 }
)
}
// Handle multipart form data
const formData = await req.formData()
@@ -54,18 +43,18 @@ export async function POST(req: NextRequest) {
})
// Validate the form data
const result = helpFormSchema.safeParse({
const validationResult = helpFormSchema.safeParse({
subject,
message,
type,
})
if (!result.success) {
if (!validationResult.success) {
logger.warn(`[${requestId}] Invalid help request data`, {
errors: result.error.format(),
errors: validationResult.error.format(),
})
return NextResponse.json(
{ error: 'Invalid request data', details: result.error.format() },
{ error: 'Invalid request data', details: validationResult.error.format() },
{ status: 400 }
)
}
@@ -103,63 +92,60 @@ ${message}
emailText += `\n\n${images.length} image(s) attached.`
}
// Send email using Resend
const { error } = await resend.emails.send({
from: `Sim <noreply@${env.EMAIL_DOMAIN || getEmailDomain()}>`,
const emailResult = await sendEmail({
to: [`help@${env.EMAIL_DOMAIN || getEmailDomain()}`],
subject: `[${type.toUpperCase()}] ${subject}`,
replyTo: email,
text: emailText,
from: getFromEmailAddress(),
replyTo: email,
emailType: 'transactional',
attachments: images.map((image) => ({
filename: image.filename,
content: image.content.toString('base64'),
contentType: image.contentType,
disposition: 'attachment', // Explicitly set as attachment
disposition: 'attachment',
})),
})
if (error) {
logger.error(`[${requestId}] Error sending help request email`, error)
if (!emailResult.success) {
logger.error(`[${requestId}] Error sending help request email`, emailResult.message)
return NextResponse.json({ error: 'Failed to send email' }, { status: 500 })
}
logger.info(`[${requestId}] Help request email sent successfully`)
// Send confirmation email to the user
await resend.emails
.send({
from: `Sim <noreply@${env.EMAIL_DOMAIN || getEmailDomain()}>`,
try {
const confirmationHtml = await renderHelpConfirmationEmail(
email,
type as 'bug' | 'feedback' | 'feature_request' | 'other',
images.length
)
await sendEmail({
to: [email],
subject: `Your ${type} request has been received: ${subject}`,
text: `
Hello,
Thank you for your ${type} submission. We've received your request and will get back to you as soon as possible.
Your message:
${message}
${images.length > 0 ? `You attached ${images.length} image(s).` : ''}
Best regards,
The Sim Team
`,
html: confirmationHtml,
from: getFromEmailAddress(),
replyTo: `help@${env.EMAIL_DOMAIN || getEmailDomain()}`,
emailType: 'transactional',
})
.catch((err) => {
logger.warn(`[${requestId}] Failed to send confirmation email`, err)
})
} catch (err) {
logger.warn(`[${requestId}] Failed to send confirmation email`, err)
}
return NextResponse.json(
{ success: true, message: 'Help request submitted successfully' },
{ status: 200 }
)
} catch (error) {
// Check if error is related to missing API key
if (error instanceof Error && error.message.includes('API key')) {
logger.error(`[${requestId}] API key configuration error`, error)
if (error instanceof Error && error.message.includes('not configured')) {
logger.error(`[${requestId}] Email service configuration error`, error)
return NextResponse.json(
{ error: 'Email service configuration error. Please check your RESEND_API_KEY.' },
{
error:
'Email service configuration error. Please check your email service configuration.',
},
{ status: 500 }
)
}


@@ -1,4 +1,4 @@
-import { runs } from '@trigger.dev/sdk/v3'
+import { runs } from '@trigger.dev/sdk'
import { eq } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { getSession } from '@/lib/auth'


@@ -4,15 +4,50 @@
*
* @vitest-environment node
*/
import { describe, expect, it, vi } from 'vitest'
import { beforeEach, describe, expect, it, vi } from 'vitest'
vi.mock('drizzle-orm')
vi.mock('@/lib/logs/console/logger')
vi.mock('@/lib/logs/console/logger', () => ({
createLogger: vi.fn(() => ({
info: vi.fn(),
debug: vi.fn(),
warn: vi.fn(),
error: vi.fn(),
})),
}))
vi.mock('@/db')
vi.mock('@/lib/documents/utils', () => ({
retryWithExponentialBackoff: (fn: any) => fn(),
}))
import { handleTagAndVectorSearch, handleTagOnlySearch, handleVectorOnlySearch } from './utils'
vi.stubGlobal(
'fetch',
vi.fn().mockResolvedValue({
ok: true,
json: async () => ({
data: [{ embedding: [0.1, 0.2, 0.3] }],
}),
})
)
vi.mock('@/lib/env', () => ({
env: {},
isTruthy: (value: string | boolean | number | undefined) =>
typeof value === 'string' ? value === 'true' || value === '1' : Boolean(value),
}))
import {
generateSearchEmbedding,
handleTagAndVectorSearch,
handleTagOnlySearch,
handleVectorOnlySearch,
} from './utils'
describe('Knowledge Search Utils', () => {
beforeEach(() => {
vi.clearAllMocks()
})
describe('handleTagOnlySearch', () => {
it('should throw error when no filters provided', async () => {
const params = {
@@ -140,4 +175,251 @@ describe('Knowledge Search Utils', () => {
expect(params.distanceThreshold).toBe(0.8)
})
})
describe('generateSearchEmbedding', () => {
it('should use Azure OpenAI when KB-specific config is provided', async () => {
const { env } = await import('@/lib/env')
Object.keys(env).forEach((key) => delete (env as any)[key])
Object.assign(env, {
AZURE_OPENAI_API_KEY: 'test-azure-key',
AZURE_OPENAI_ENDPOINT: 'https://test.openai.azure.com',
AZURE_OPENAI_API_VERSION: '2024-12-01-preview',
KB_OPENAI_MODEL_NAME: 'text-embedding-ada-002',
OPENAI_API_KEY: 'test-openai-key',
})
const fetchSpy = vi.mocked(fetch)
fetchSpy.mockResolvedValueOnce({
ok: true,
json: async () => ({
data: [{ embedding: [0.1, 0.2, 0.3] }],
}),
} as any)
const result = await generateSearchEmbedding('test query')
expect(fetchSpy).toHaveBeenCalledWith(
'https://test.openai.azure.com/openai/deployments/text-embedding-ada-002/embeddings?api-version=2024-12-01-preview',
expect.objectContaining({
headers: expect.objectContaining({
'api-key': 'test-azure-key',
}),
})
)
expect(result).toEqual([0.1, 0.2, 0.3])
// Clean up
Object.keys(env).forEach((key) => delete (env as any)[key])
})
it('should fallback to OpenAI when no KB Azure config provided', async () => {
const { env } = await import('@/lib/env')
Object.keys(env).forEach((key) => delete (env as any)[key])
Object.assign(env, {
OPENAI_API_KEY: 'test-openai-key',
})
const fetchSpy = vi.mocked(fetch)
fetchSpy.mockResolvedValueOnce({
ok: true,
json: async () => ({
data: [{ embedding: [0.1, 0.2, 0.3] }],
}),
} as any)
const result = await generateSearchEmbedding('test query')
expect(fetchSpy).toHaveBeenCalledWith(
'https://api.openai.com/v1/embeddings',
expect.objectContaining({
headers: expect.objectContaining({
Authorization: 'Bearer test-openai-key',
}),
})
)
expect(result).toEqual([0.1, 0.2, 0.3])
// Clean up
Object.keys(env).forEach((key) => delete (env as any)[key])
})
it('should use default API version when not provided in Azure config', async () => {
const { env } = await import('@/lib/env')
Object.keys(env).forEach((key) => delete (env as any)[key])
Object.assign(env, {
AZURE_OPENAI_API_KEY: 'test-azure-key',
AZURE_OPENAI_ENDPOINT: 'https://test.openai.azure.com',
KB_OPENAI_MODEL_NAME: 'custom-embedding-model',
OPENAI_API_KEY: 'test-openai-key',
})
const fetchSpy = vi.mocked(fetch)
fetchSpy.mockResolvedValueOnce({
ok: true,
json: async () => ({
data: [{ embedding: [0.1, 0.2, 0.3] }],
}),
} as any)
await generateSearchEmbedding('test query')
expect(fetchSpy).toHaveBeenCalledWith(
expect.stringContaining('api-version='),
expect.any(Object)
)
// Clean up
Object.keys(env).forEach((key) => delete (env as any)[key])
})
it('should use custom model name when provided in Azure config', async () => {
const { env } = await import('@/lib/env')
Object.keys(env).forEach((key) => delete (env as any)[key])
Object.assign(env, {
AZURE_OPENAI_API_KEY: 'test-azure-key',
AZURE_OPENAI_ENDPOINT: 'https://test.openai.azure.com',
AZURE_OPENAI_API_VERSION: '2024-12-01-preview',
KB_OPENAI_MODEL_NAME: 'custom-embedding-model',
OPENAI_API_KEY: 'test-openai-key',
})
const fetchSpy = vi.mocked(fetch)
fetchSpy.mockResolvedValueOnce({
ok: true,
json: async () => ({
data: [{ embedding: [0.1, 0.2, 0.3] }],
}),
} as any)
await generateSearchEmbedding('test query', 'text-embedding-3-small')
expect(fetchSpy).toHaveBeenCalledWith(
'https://test.openai.azure.com/openai/deployments/custom-embedding-model/embeddings?api-version=2024-12-01-preview',
expect.any(Object)
)
// Clean up
Object.keys(env).forEach((key) => delete (env as any)[key])
})
it('should throw error when no API configuration provided', async () => {
const { env } = await import('@/lib/env')
Object.keys(env).forEach((key) => delete (env as any)[key])
await expect(generateSearchEmbedding('test query')).rejects.toThrow(
'Either OPENAI_API_KEY or Azure OpenAI configuration (AZURE_OPENAI_API_KEY + AZURE_OPENAI_ENDPOINT) must be configured'
)
})
it('should handle Azure OpenAI API errors properly', async () => {
const { env } = await import('@/lib/env')
Object.keys(env).forEach((key) => delete (env as any)[key])
Object.assign(env, {
AZURE_OPENAI_API_KEY: 'test-azure-key',
AZURE_OPENAI_ENDPOINT: 'https://test.openai.azure.com',
AZURE_OPENAI_API_VERSION: '2024-12-01-preview',
KB_OPENAI_MODEL_NAME: 'text-embedding-ada-002',
})
const fetchSpy = vi.mocked(fetch)
fetchSpy.mockResolvedValueOnce({
ok: false,
status: 404,
statusText: 'Not Found',
text: async () => 'Deployment not found',
} as any)
await expect(generateSearchEmbedding('test query')).rejects.toThrow('Embedding API failed')
// Clean up
Object.keys(env).forEach((key) => delete (env as any)[key])
})
it('should handle OpenAI API errors properly', async () => {
const { env } = await import('@/lib/env')
Object.keys(env).forEach((key) => delete (env as any)[key])
Object.assign(env, {
OPENAI_API_KEY: 'test-openai-key',
})
const fetchSpy = vi.mocked(fetch)
fetchSpy.mockResolvedValueOnce({
ok: false,
status: 429,
statusText: 'Too Many Requests',
text: async () => 'Rate limit exceeded',
} as any)
await expect(generateSearchEmbedding('test query')).rejects.toThrow('Embedding API failed')
// Clean up
Object.keys(env).forEach((key) => delete (env as any)[key])
})
it('should include correct request body for Azure OpenAI', async () => {
const { env } = await import('@/lib/env')
Object.keys(env).forEach((key) => delete (env as any)[key])
Object.assign(env, {
AZURE_OPENAI_API_KEY: 'test-azure-key',
AZURE_OPENAI_ENDPOINT: 'https://test.openai.azure.com',
AZURE_OPENAI_API_VERSION: '2024-12-01-preview',
KB_OPENAI_MODEL_NAME: 'text-embedding-ada-002',
})
const fetchSpy = vi.mocked(fetch)
fetchSpy.mockResolvedValueOnce({
ok: true,
json: async () => ({
data: [{ embedding: [0.1, 0.2, 0.3] }],
}),
} as any)
await generateSearchEmbedding('test query')
expect(fetchSpy).toHaveBeenCalledWith(
expect.any(String),
expect.objectContaining({
body: JSON.stringify({
input: ['test query'],
encoding_format: 'float',
}),
})
)
// Clean up
Object.keys(env).forEach((key) => delete (env as any)[key])
})
it('should include correct request body for OpenAI', async () => {
const { env } = await import('@/lib/env')
Object.keys(env).forEach((key) => delete (env as any)[key])
Object.assign(env, {
OPENAI_API_KEY: 'test-openai-key',
})
const fetchSpy = vi.mocked(fetch)
fetchSpy.mockResolvedValueOnce({
ok: true,
json: async () => ({
data: [{ embedding: [0.1, 0.2, 0.3] }],
}),
} as any)
await generateSearchEmbedding('test query', 'text-embedding-3-small')
expect(fetchSpy).toHaveBeenCalledWith(
expect.any(String),
expect.objectContaining({
body: JSON.stringify({
input: ['test query'],
model: 'text-embedding-3-small',
encoding_format: 'float',
}),
})
)
// Clean up
Object.keys(env).forEach((key) => delete (env as any)[key])
})
})
})
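The endpoint-selection behavior these tests pin down can be sketched independently. The env var names (`AZURE_OPENAI_API_KEY`, `AZURE_OPENAI_ENDPOINT`, `AZURE_OPENAI_API_VERSION`, `KB_OPENAI_MODEL_NAME`, `OPENAI_API_KEY`) and the error message are taken from the tests above; the helper name and return shape are hypothetical, not the repo's actual implementation:

```typescript
interface EmbeddingEnv {
  AZURE_OPENAI_API_KEY?: string
  AZURE_OPENAI_ENDPOINT?: string
  AZURE_OPENAI_API_VERSION?: string
  KB_OPENAI_MODEL_NAME?: string
  OPENAI_API_KEY?: string
}

// Hypothetical helper mirroring what the tests assert: prefer an Azure OpenAI
// deployment when one is configured, otherwise fall back to api.openai.com.
function resolveEmbeddingRequest(env: EmbeddingEnv) {
  const useAzure =
    !!env.AZURE_OPENAI_API_KEY && !!env.AZURE_OPENAI_ENDPOINT && !!env.KB_OPENAI_MODEL_NAME
  if (useAzure) {
    // Default API version is an assumption based on the value used in the tests.
    const apiVersion = env.AZURE_OPENAI_API_VERSION ?? '2024-12-01-preview'
    return {
      url: `${env.AZURE_OPENAI_ENDPOINT}/openai/deployments/${env.KB_OPENAI_MODEL_NAME}/embeddings?api-version=${apiVersion}`,
      headers: { 'api-key': env.AZURE_OPENAI_API_KEY!, 'Content-Type': 'application/json' },
    }
  }
  if (env.OPENAI_API_KEY) {
    return {
      url: 'https://api.openai.com/v1/embeddings',
      headers: { Authorization: `Bearer ${env.OPENAI_API_KEY}`, 'Content-Type': 'application/json' },
    }
  }
  throw new Error(
    'Either OPENAI_API_KEY or Azure OpenAI configuration (AZURE_OPENAI_API_KEY + AZURE_OPENAI_ENDPOINT) must be configured'
  )
}
```

Note that the Azure branch wins even when `OPENAI_API_KEY` is also set, matching the first test case above.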


@@ -1,22 +1,10 @@
import { and, eq, inArray, sql } from 'drizzle-orm'
import { retryWithExponentialBackoff } from '@/lib/documents/utils'
import { env } from '@/lib/env'
import { createLogger } from '@/lib/logs/console/logger'
import { db } from '@/db'
import { embedding } from '@/db/schema'
const logger = createLogger('KnowledgeSearchUtils')
export class APIError extends Error {
public status: number
constructor(message: string, status: number) {
super(message)
this.name = 'APIError'
this.status = status
}
}
export interface SearchResult {
id: string
content: string
@@ -41,61 +29,8 @@ export interface SearchParams {
distanceThreshold?: number
}
export async function generateSearchEmbedding(query: string): Promise<number[]> {
const openaiApiKey = env.OPENAI_API_KEY
if (!openaiApiKey) {
throw new Error('OPENAI_API_KEY not configured')
}
try {
const embedding = await retryWithExponentialBackoff(
async () => {
const response = await fetch('https://api.openai.com/v1/embeddings', {
method: 'POST',
headers: {
Authorization: `Bearer ${openaiApiKey}`,
'Content-Type': 'application/json',
},
body: JSON.stringify({
input: query,
model: 'text-embedding-3-small',
encoding_format: 'float',
}),
})
if (!response.ok) {
const errorText = await response.text()
const error = new APIError(
`OpenAI API error: ${response.status} ${response.statusText} - ${errorText}`,
response.status
)
throw error
}
const data = await response.json()
if (!data.data || !Array.isArray(data.data) || data.data.length === 0) {
throw new Error('Invalid response format from OpenAI embeddings API')
}
return data.data[0].embedding
},
{
maxRetries: 5,
initialDelayMs: 1000,
maxDelayMs: 30000,
backoffMultiplier: 2,
}
)
return embedding
} catch (error) {
logger.error('Failed to generate search embedding:', error)
throw new Error(
`Embedding generation failed: ${error instanceof Error ? error.message : 'Unknown error'}`
)
}
}
// Use shared embedding utility
export { generateSearchEmbedding } from '@/lib/embeddings/utils'
function getTagFilters(filters: Record<string, string>, embedding: any) {
return Object.entries(filters).map(([key, value]) => {

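The retry options that recur in the removed code (`maxRetries: 5`, `initialDelayMs: 1000`, `maxDelayMs`, `backoffMultiplier: 2`) describe classic exponential backoff. A minimal generic sketch with an assumed signature follows; the real `retryWithExponentialBackoff` in `@/lib/documents/utils` may differ:

```typescript
interface BackoffOptions {
  maxRetries: number
  initialDelayMs: number
  maxDelayMs: number
  backoffMultiplier: number
}

// Minimal exponential-backoff retry: wait initialDelayMs after the first
// failure, multiply the delay after each subsequent failure (capped at
// maxDelayMs), and rethrow once maxRetries attempts have failed.
async function retryWithExponentialBackoff<T>(
  fn: () => Promise<T>,
  opts: BackoffOptions
): Promise<T> {
  let delay = opts.initialDelayMs
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn()
    } catch (err) {
      if (attempt >= opts.maxRetries) throw err
      await new Promise((resolve) => setTimeout(resolve, delay))
      delay = Math.min(delay * opts.backoffMultiplier, opts.maxDelayMs)
    }
  }
}
```

With the diff's settings this yields delays of roughly 1s, 2s, 4s, 8s, 16s before giving up, which is the usual shape for absorbing embedding-API rate limits.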

@@ -252,5 +252,76 @@ describe('Knowledge Utils', () => {
expect(result.length).toBe(2)
})
it('should use Azure OpenAI when Azure config is provided', async () => {
const { env } = await import('@/lib/env')
Object.keys(env).forEach((key) => delete (env as any)[key])
Object.assign(env, {
AZURE_OPENAI_API_KEY: 'test-azure-key',
AZURE_OPENAI_ENDPOINT: 'https://test.openai.azure.com',
AZURE_OPENAI_API_VERSION: '2024-12-01-preview',
KB_OPENAI_MODEL_NAME: 'text-embedding-ada-002',
OPENAI_API_KEY: 'test-openai-key',
})
const fetchSpy = vi.mocked(fetch)
fetchSpy.mockResolvedValueOnce({
ok: true,
json: async () => ({
data: [{ embedding: [0.1, 0.2], index: 0 }],
}),
} as any)
await generateEmbeddings(['test text'])
expect(fetchSpy).toHaveBeenCalledWith(
'https://test.openai.azure.com/openai/deployments/text-embedding-ada-002/embeddings?api-version=2024-12-01-preview',
expect.objectContaining({
headers: expect.objectContaining({
'api-key': 'test-azure-key',
}),
})
)
Object.keys(env).forEach((key) => delete (env as any)[key])
})
it('should fallback to OpenAI when no Azure config provided', async () => {
const { env } = await import('@/lib/env')
Object.keys(env).forEach((key) => delete (env as any)[key])
Object.assign(env, {
OPENAI_API_KEY: 'test-openai-key',
})
const fetchSpy = vi.mocked(fetch)
fetchSpy.mockResolvedValueOnce({
ok: true,
json: async () => ({
data: [{ embedding: [0.1, 0.2], index: 0 }],
}),
} as any)
await generateEmbeddings(['test text'])
expect(fetchSpy).toHaveBeenCalledWith(
'https://api.openai.com/v1/embeddings',
expect.objectContaining({
headers: expect.objectContaining({
Authorization: 'Bearer test-openai-key',
}),
})
)
Object.keys(env).forEach((key) => delete (env as any)[key])
})
it('should throw error when no API configuration provided', async () => {
const { env } = await import('@/lib/env')
Object.keys(env).forEach((key) => delete (env as any)[key])
await expect(generateEmbeddings(['test text'])).rejects.toThrow(
'Either OPENAI_API_KEY or Azure OpenAI configuration (AZURE_OPENAI_API_KEY + AZURE_OPENAI_ENDPOINT) must be configured'
)
})
})
})


@@ -1,8 +1,7 @@
import crypto from 'crypto'
import { and, eq, isNull } from 'drizzle-orm'
import { processDocument } from '@/lib/documents/document-processor'
import { retryWithExponentialBackoff } from '@/lib/documents/utils'
import { env } from '@/lib/env'
import { generateEmbeddings } from '@/lib/embeddings/utils'
import { createLogger } from '@/lib/logs/console/logger'
import { getUserEntityPermissions } from '@/lib/permissions/utils'
import { db } from '@/db'
@@ -10,22 +9,11 @@ import { document, embedding, knowledgeBase } from '@/db/schema'
const logger = createLogger('KnowledgeUtils')
// Timeout constants (in milliseconds)
const TIMEOUTS = {
OVERALL_PROCESSING: 150000, // 150 seconds (2.5 minutes)
EMBEDDINGS_API: 60000, // 60 seconds per batch
} as const
class APIError extends Error {
public status: number
constructor(message: string, status: number) {
super(message)
this.name = 'APIError'
this.status = status
}
}
/**
* Create a timeout wrapper for async operations
*/
@@ -110,18 +98,6 @@ export interface EmbeddingData {
updatedAt: Date
}
interface OpenAIEmbeddingResponse {
data: Array<{
embedding: number[]
index: number
}>
model: string
usage: {
prompt_tokens: number
total_tokens: number
}
}
export interface KnowledgeBaseAccessResult {
hasAccess: true
knowledgeBase: Pick<KnowledgeBaseData, 'id' | 'userId'>
@@ -405,87 +381,8 @@ export async function checkChunkAccess(
}
}
/**
* Generate embeddings using OpenAI API with retry logic for rate limiting
*/
export async function generateEmbeddings(
texts: string[],
embeddingModel = 'text-embedding-3-small'
): Promise<number[][]> {
const openaiApiKey = env.OPENAI_API_KEY
if (!openaiApiKey) {
throw new Error('OPENAI_API_KEY not configured')
}
try {
const batchSize = 100
const allEmbeddings: number[][] = []
for (let i = 0; i < texts.length; i += batchSize) {
const batch = texts.slice(i, i + batchSize)
logger.info(
`Generating embeddings for batch ${Math.floor(i / batchSize) + 1} (${batch.length} texts)`
)
const batchEmbeddings = await retryWithExponentialBackoff(
async () => {
const controller = new AbortController()
const timeoutId = setTimeout(() => controller.abort(), TIMEOUTS.EMBEDDINGS_API)
try {
const response = await fetch('https://api.openai.com/v1/embeddings', {
method: 'POST',
headers: {
Authorization: `Bearer ${openaiApiKey}`,
'Content-Type': 'application/json',
},
body: JSON.stringify({
input: batch,
model: embeddingModel,
encoding_format: 'float',
}),
signal: controller.signal,
})
clearTimeout(timeoutId)
if (!response.ok) {
const errorText = await response.text()
const error = new APIError(
`OpenAI API error: ${response.status} ${response.statusText} - ${errorText}`,
response.status
)
throw error
}
const data: OpenAIEmbeddingResponse = await response.json()
return data.data.map((item) => item.embedding)
} catch (error) {
clearTimeout(timeoutId)
if (error instanceof Error && error.name === 'AbortError') {
throw new Error('OpenAI API request timed out')
}
throw error
}
},
{
maxRetries: 5,
initialDelayMs: 1000,
maxDelayMs: 60000, // Max 1 minute delay for embeddings
backoffMultiplier: 2,
}
)
allEmbeddings.push(...batchEmbeddings)
}
return allEmbeddings
} catch (error) {
logger.error('Failed to generate embeddings:', error)
throw error
}
}
// Export for external use
export { generateEmbeddings }
/**
* Process a document asynchronously with full error handling

View File

@@ -39,6 +39,8 @@ export async function POST(request: NextRequest) {
stream,
messages,
environmentVariables,
reasoningEffort,
verbosity,
} = body
logger.info(`[${requestId}] Provider request details`, {
@@ -58,6 +60,8 @@ export async function POST(request: NextRequest) {
messageCount: messages?.length || 0,
hasEnvironmentVariables:
!!environmentVariables && Object.keys(environmentVariables).length > 0,
reasoningEffort,
verbosity,
})
let finalApiKey: string
@@ -99,6 +103,8 @@ export async function POST(request: NextRequest) {
stream,
messages,
environmentVariables,
reasoningEffort,
verbosity,
})
const executionTime = Date.now() - startTime

View File

@@ -1,10 +1,10 @@
import { NextResponse } from 'next/server'
import { Logger } from '@/lib/logs/console/logger'
import { createLogger } from '@/lib/logs/console/logger'
import { getJiraCloudId } from '@/tools/jira/utils'
export const dynamic = 'force-dynamic'
const logger = new Logger('JiraIssueAPI')
const logger = createLogger('JiraIssueAPI')
export async function POST(request: Request) {
try {

View File

@@ -1,10 +1,10 @@
import { NextResponse } from 'next/server'
import { Logger } from '@/lib/logs/console/logger'
import { createLogger } from '@/lib/logs/console/logger'
import { getJiraCloudId } from '@/tools/jira/utils'
export const dynamic = 'force-dynamic'
const logger = new Logger('JiraIssuesAPI')
const logger = createLogger('JiraIssuesAPI')
export async function POST(request: Request) {
try {

View File

@@ -1,10 +1,10 @@
import { NextResponse } from 'next/server'
import { Logger } from '@/lib/logs/console/logger'
import { createLogger } from '@/lib/logs/console/logger'
import { getJiraCloudId } from '@/tools/jira/utils'
export const dynamic = 'force-dynamic'
const logger = new Logger('JiraProjectsAPI')
const logger = createLogger('JiraProjectsAPI')
export async function GET(request: Request) {
try {

View File

@@ -1,10 +1,10 @@
import { NextResponse } from 'next/server'
import { Logger } from '@/lib/logs/console/logger'
import { createLogger } from '@/lib/logs/console/logger'
import { getJiraCloudId } from '@/tools/jira/utils'
export const dynamic = 'force-dynamic'
const logger = new Logger('JiraUpdateAPI')
const logger = createLogger('JiraUpdateAPI')
export async function PUT(request: Request) {
try {

View File

@@ -1,10 +1,10 @@
import { NextResponse } from 'next/server'
import { Logger } from '@/lib/logs/console/logger'
import { createLogger } from '@/lib/logs/console/logger'
import { getJiraCloudId } from '@/tools/jira/utils'
export const dynamic = 'force-dynamic'
const logger = new Logger('JiraWriteAPI')
const logger = createLogger('JiraWriteAPI')
export async function POST(request: Request) {
try {

View File

@@ -1,6 +1,6 @@
import { unstable_noStore as noStore } from 'next/cache'
import { type NextRequest, NextResponse } from 'next/server'
import OpenAI from 'openai'
import OpenAI, { AzureOpenAI } from 'openai'
import { env } from '@/lib/env'
import { createLogger } from '@/lib/logs/console/logger'
@@ -10,14 +10,32 @@ export const maxDuration = 60
const logger = createLogger('WandGenerateAPI')
const openai = env.OPENAI_API_KEY
? new OpenAI({
apiKey: env.OPENAI_API_KEY,
})
: null
const azureApiKey = env.AZURE_OPENAI_API_KEY
const azureEndpoint = env.AZURE_OPENAI_ENDPOINT
const azureApiVersion = env.AZURE_OPENAI_API_VERSION
const wandModelName = env.WAND_OPENAI_MODEL_NAME || 'gpt-4o'
const openaiApiKey = env.OPENAI_API_KEY
if (!env.OPENAI_API_KEY) {
logger.warn('OPENAI_API_KEY not found. Wand generation API will not function.')
const useWandAzure = azureApiKey && azureEndpoint && azureApiVersion
const client = useWandAzure
? new AzureOpenAI({
apiKey: azureApiKey,
apiVersion: azureApiVersion,
endpoint: azureEndpoint,
})
: openaiApiKey
? new OpenAI({
apiKey: openaiApiKey,
})
: null
if (!useWandAzure && !openaiApiKey) {
logger.warn(
'Neither Azure OpenAI nor OpenAI API key found. Wand generation API will not function.'
)
} else {
logger.info(`Using ${useWandAzure ? 'Azure OpenAI' : 'OpenAI'} for wand generation`)
}
interface ChatMessage {
@@ -32,14 +50,12 @@ interface RequestBody {
history?: ChatMessage[]
}
// The endpoint is now generic - system prompts come from wand configs
export async function POST(req: NextRequest) {
const requestId = crypto.randomUUID().slice(0, 8)
logger.info(`[${requestId}] Received wand generation request`)
if (!openai) {
logger.error(`[${requestId}] OpenAI client not initialized. Missing API key.`)
if (!client) {
logger.error(`[${requestId}] AI client not initialized. Missing API key.`)
return NextResponse.json(
{ success: false, error: 'Wand generation service is not configured.' },
{ status: 503 }
@@ -74,16 +90,19 @@ export async function POST(req: NextRequest) {
// Add the current user prompt
messages.push({ role: 'user', content: prompt })
logger.debug(`[${requestId}] Calling OpenAI API for wand generation`, {
stream,
historyLength: history.length,
})
logger.debug(
`[${requestId}] Calling ${useWandAzure ? 'Azure OpenAI' : 'OpenAI'} API for wand generation`,
{
stream,
historyLength: history.length,
}
)
// For streaming responses
if (stream) {
try {
const streamCompletion = await openai?.chat.completions.create({
model: 'gpt-4o',
const streamCompletion = await client.chat.completions.create({
model: useWandAzure ? wandModelName : 'gpt-4o',
messages: messages,
temperature: 0.3,
max_tokens: 10000,
@@ -141,8 +160,8 @@ export async function POST(req: NextRequest) {
}
// For non-streaming responses
const completion = await openai?.chat.completions.create({
model: 'gpt-4o',
const completion = await client.chat.completions.create({
model: useWandAzure ? wandModelName : 'gpt-4o',
messages: messages,
temperature: 0.3,
max_tokens: 10000,
@@ -151,9 +170,11 @@ export async function POST(req: NextRequest) {
const generatedContent = completion.choices[0]?.message?.content?.trim()
if (!generatedContent) {
logger.error(`[${requestId}] OpenAI response was empty or invalid.`)
logger.error(
`[${requestId}] ${useWandAzure ? 'Azure OpenAI' : 'OpenAI'} response was empty or invalid.`
)
return NextResponse.json(
{ success: false, error: 'Failed to generate content. OpenAI response was empty.' },
{ success: false, error: 'Failed to generate content. AI response was empty.' },
{ status: 500 }
)
}
@@ -171,7 +192,9 @@ export async function POST(req: NextRequest) {
if (error instanceof OpenAI.APIError) {
status = error.status || 500
logger.error(`[${requestId}] OpenAI API Error: ${status} - ${error.message}`)
logger.error(
`[${requestId}] ${useWandAzure ? 'Azure OpenAI' : 'OpenAI'} API Error: ${status} - ${error.message}`
)
if (status === 401) {
clientErrorMessage = 'Authentication failed. Please check your API key configuration.'
@@ -181,6 +204,10 @@ export async function POST(req: NextRequest) {
clientErrorMessage =
'The wand generation service is currently unavailable. Please try again later.'
}
} else if (useWandAzure && error.message?.includes('DeploymentNotFound')) {
clientErrorMessage =
'Azure OpenAI deployment not found. Please check your model deployment configuration.'
status = 404
}
return NextResponse.json(

View File

@@ -1,11 +1,11 @@
import { nanoid } from 'nanoid'
import { type NextRequest, NextResponse } from 'next/server'
import { verifyCronAuth } from '@/lib/auth/internal'
import { Logger } from '@/lib/logs/console/logger'
import { createLogger } from '@/lib/logs/console/logger'
import { acquireLock, releaseLock } from '@/lib/redis'
import { pollGmailWebhooks } from '@/lib/webhooks/gmail-polling-service'
const logger = new Logger('GmailPollingAPI')
const logger = createLogger('GmailPollingAPI')
export const dynamic = 'force-dynamic'
export const maxDuration = 180 // Allow up to 3 minutes for polling to complete

View File

@@ -1,11 +1,11 @@
import { nanoid } from 'nanoid'
import { type NextRequest, NextResponse } from 'next/server'
import { verifyCronAuth } from '@/lib/auth/internal'
import { Logger } from '@/lib/logs/console/logger'
import { createLogger } from '@/lib/logs/console/logger'
import { acquireLock, releaseLock } from '@/lib/redis'
import { pollOutlookWebhooks } from '@/lib/webhooks/outlook-polling-service'
const logger = new Logger('OutlookPollingAPI')
const logger = createLogger('OutlookPollingAPI')
export const dynamic = 'force-dynamic'
export const maxDuration = 180 // Allow up to 3 minutes for polling to complete

View File

@@ -309,7 +309,7 @@ describe('Webhook Trigger API Route', () => {
const req = createMockRequest('POST', { event: 'test', id: 'test-123' })
const params = Promise.resolve({ path: 'test-path' })
vi.doMock('@trigger.dev/sdk/v3', () => ({
vi.doMock('@trigger.dev/sdk', () => ({
tasks: {
trigger: vi.fn().mockResolvedValue({ id: 'mock-task-id' }),
},
@@ -339,7 +339,7 @@ describe('Webhook Trigger API Route', () => {
const req = createMockRequest('POST', { event: 'bearer.test' }, headers)
const params = Promise.resolve({ path: 'test-path' })
vi.doMock('@trigger.dev/sdk/v3', () => ({
vi.doMock('@trigger.dev/sdk', () => ({
tasks: {
trigger: vi.fn().mockResolvedValue({ id: 'mock-task-id' }),
},
@@ -369,7 +369,7 @@ describe('Webhook Trigger API Route', () => {
const req = createMockRequest('POST', { event: 'custom.header.test' }, headers)
const params = Promise.resolve({ path: 'test-path' })
vi.doMock('@trigger.dev/sdk/v3', () => ({
vi.doMock('@trigger.dev/sdk', () => ({
tasks: {
trigger: vi.fn().mockResolvedValue({ id: 'mock-task-id' }),
},
@@ -391,7 +391,7 @@ describe('Webhook Trigger API Route', () => {
token: 'case-test-token',
})
vi.doMock('@trigger.dev/sdk/v3', () => ({
vi.doMock('@trigger.dev/sdk', () => ({
tasks: {
trigger: vi.fn().mockResolvedValue({ id: 'mock-task-id' }),
},
@@ -430,7 +430,7 @@ describe('Webhook Trigger API Route', () => {
secretHeaderName: 'X-Secret-Key',
})
vi.doMock('@trigger.dev/sdk/v3', () => ({
vi.doMock('@trigger.dev/sdk', () => ({
tasks: {
trigger: vi.fn().mockResolvedValue({ id: 'mock-task-id' }),
},

View File

@@ -1,4 +1,4 @@
import { tasks } from '@trigger.dev/sdk/v3'
import { tasks } from '@trigger.dev/sdk'
import { and, eq } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { checkServerSideUsageLimits } from '@/lib/billing'

View File

@@ -1,4 +1,4 @@
import { tasks } from '@trigger.dev/sdk/v3'
import { tasks } from '@trigger.dev/sdk'
import { eq, sql } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { v4 as uuidv4 } from 'uuid'

View File

@@ -91,6 +91,7 @@ describe('Workspace Invitations API Route', () => {
env: {
RESEND_API_KEY: 'test-resend-key',
NEXT_PUBLIC_APP_URL: 'https://test.sim.ai',
FROM_EMAIL_ADDRESS: 'Sim <noreply@test.sim.ai>',
EMAIL_DOMAIN: 'test.sim.ai',
},
}))

View File

@@ -2,12 +2,12 @@ import { randomUUID } from 'crypto'
import { render } from '@react-email/render'
import { and, eq, inArray } from 'drizzle-orm'
import { type NextRequest, NextResponse } from 'next/server'
import { Resend } from 'resend'
import { WorkspaceInvitationEmail } from '@/components/emails/workspace-invitation'
import { getSession } from '@/lib/auth'
import { sendEmail } from '@/lib/email/mailer'
import { getFromEmailAddress } from '@/lib/email/utils'
import { env } from '@/lib/env'
import { createLogger } from '@/lib/logs/console/logger'
import { getEmailDomain } from '@/lib/urls/utils'
import { db } from '@/db'
import {
permissions,
@@ -20,7 +20,6 @@ import {
export const dynamic = 'force-dynamic'
const logger = createLogger('WorkspaceInvitationsAPI')
const resend = env.RESEND_API_KEY ? new Resend(env.RESEND_API_KEY) : null
type PermissionType = (typeof permissionTypeEnum.enumValues)[number]
@@ -241,30 +240,23 @@ async function sendInvitationEmail({
})
)
if (!resend) {
logger.error('RESEND_API_KEY not configured')
return NextResponse.json(
{
error:
'Email service not configured. Please set RESEND_API_KEY in environment variables.',
},
{ status: 500 }
)
}
const emailDomain = env.EMAIL_DOMAIN || getEmailDomain()
const fromAddress = `noreply@${emailDomain}`
const fromAddress = getFromEmailAddress()
logger.info(`Attempting to send email from ${fromAddress} to ${to}`)
const result = await resend.emails.send({
from: fromAddress,
const result = await sendEmail({
to,
subject: `You've been invited to join "${workspaceName}" on Sim`,
html: emailHtml,
from: fromAddress,
emailType: 'transactional',
})
logger.info(`Invitation email sent successfully to ${to}`, { result })
if (result.success) {
logger.info(`Invitation email sent successfully to ${to}`, { result })
} else {
logger.error(`Failed to send invitation email to ${to}`, { error: result.message })
}
} catch (error) {
logger.error('Error sending invitation email:', error)
// Continue even if email fails - the invitation is still created

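The invitation hunk above swaps a hand-built `noreply@${emailDomain}` sender for a shared `getFromEmailAddress()` helper, matching the `FROM_EMAIL_ADDRESS` env var added in the test mock earlier in this diff. The helper's body is not shown; a plausible sketch, assuming the env-first fallback the surrounding code implies:

```typescript
type EmailEnv = {
  FROM_EMAIL_ADDRESS?: string
  EMAIL_DOMAIN?: string
}

// Prefer an explicitly configured sender (which may include a display name,
// e.g. 'Sim <noreply@test.sim.ai>'); otherwise derive a noreply address
// from the configured email domain.
function getFromEmailAddress(env: EmailEnv, defaultDomain = 'sim.ai'): string {
  if (env.FROM_EMAIL_ADDRESS) return env.FROM_EMAIL_ADDRESS
  return `Sim <noreply@${env.EMAIL_DOMAIN ?? defaultDomain}>`
}
```

The real helper reads the app's `env` module directly; the explicit `env` parameter here is only to keep the sketch self-contained and testable.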
View File

@@ -10,6 +10,7 @@ import { createLogger } from '@/lib/logs/console/logger'
import { getAssetUrl } from '@/lib/utils'
import '@/app/globals.css'
import { ThemeProvider } from '@/app/theme-provider'
import { ZoomPrevention } from '@/app/zoom-prevention'
const logger = createLogger('RootLayout')
@@ -45,11 +46,14 @@ if (typeof window !== 'undefined') {
}
export const viewport: Viewport = {
themeColor: '#ffffff',
width: 'device-width',
initialScale: 1,
maximumScale: 1,
userScalable: false,
themeColor: [
{ media: '(prefers-color-scheme: light)', color: '#ffffff' },
{ media: '(prefers-color-scheme: dark)', color: '#0c0c0c' },
],
}
// Generate dynamic metadata based on brand configuration
@@ -70,8 +74,7 @@ export default function RootLayout({ children }: { children: React.ReactNode })
/>
{/* Meta tags for better SEO */}
<meta name='theme-color' content='#ffffff' />
<meta name='color-scheme' content='light' />
<meta name='color-scheme' content='light dark' />
<meta name='format-detection' content='telephone=no' />
<meta httpEquiv='x-ua-compatible' content='ie=edge' />
@@ -107,16 +110,18 @@ export default function RootLayout({ children }: { children: React.ReactNode })
)}
</head>
<body suppressHydrationWarning>
<BrandedLayout>
<ZoomPrevention />
{children}
{isHosted && (
<>
<SpeedInsights />
<Analytics />
</>
)}
</BrandedLayout>
<ThemeProvider>
<BrandedLayout>
<ZoomPrevention />
{children}
{isHosted && (
<>
<SpeedInsights />
<Analytics />
</>
)}
</BrandedLayout>
</ThemeProvider>
</body>
</html>
)

View File

@@ -0,0 +1,19 @@
'use client'
import type { ThemeProviderProps } from 'next-themes'
import { ThemeProvider as NextThemesProvider } from 'next-themes'
export function ThemeProvider({ children, ...props }: ThemeProviderProps) {
return (
<NextThemesProvider
attribute='class'
defaultTheme='system'
enableSystem
disableTransitionOnChange
storageKey='sim-theme'
{...props}
>
{children}
</NextThemesProvider>
)
}

View File

@@ -2,8 +2,8 @@
import React from 'react'
import { TooltipProvider } from '@/components/ui/tooltip'
import { ThemeProvider } from '@/app/workspace/[workspaceId]/providers/theme-provider'
import { WorkspacePermissionsProvider } from '@/app/workspace/[workspaceId]/providers/workspace-permissions-provider'
import { SettingsLoader } from './settings-loader'
interface ProvidersProps {
children: React.ReactNode
@@ -11,11 +11,12 @@ interface ProvidersProps {
const Providers = React.memo<ProvidersProps>(({ children }) => {
return (
<ThemeProvider>
<>
<SettingsLoader />
<TooltipProvider delayDuration={100} skipDelayDuration={0}>
<WorkspacePermissionsProvider>{children}</WorkspacePermissionsProvider>
</TooltipProvider>
</ThemeProvider>
</>
)
})

View File

@@ -0,0 +1,27 @@
'use client'
import { useEffect, useRef } from 'react'
import { useSession } from '@/lib/auth-client'
import { useGeneralStore } from '@/stores/settings/general/store'
/**
* Loads user settings from database once per workspace session.
* This ensures settings are synced from DB on initial load but uses
* localStorage cache for subsequent navigation within the app.
*/
export function SettingsLoader() {
const { data: session, isPending: isSessionPending } = useSession()
const loadSettings = useGeneralStore((state) => state.loadSettings)
const hasLoadedRef = useRef(false)
useEffect(() => {
// Only load settings once per session for authenticated users
if (!isSessionPending && session?.user && !hasLoadedRef.current) {
hasLoadedRef.current = true
// Force load from DB on initial workspace entry
loadSettings(true)
}
}, [isSessionPending, session?.user, loadSettings])
return null
}

View File

@@ -1,23 +0,0 @@
'use client'
import { useEffect } from 'react'
import { useGeneralStore } from '@/stores/settings/general/store'
export function ThemeProvider({ children }: { children: React.ReactNode }) {
const theme = useGeneralStore((state) => state.theme)
useEffect(() => {
const root = window.document.documentElement
root.classList.remove('light', 'dark')
// If theme is system, check system preference
if (theme === 'system') {
const prefersDark = window.matchMedia('(prefers-color-scheme: dark)').matches
root.classList.add(prefersDark ? 'dark' : 'light')
} else {
root.classList.add(theme)
}
}, [theme])
return children
}

View File

@@ -22,6 +22,7 @@ interface DropdownProps {
previewValue?: string | null
disabled?: boolean
placeholder?: string
config?: import('@/blocks/types').SubBlockConfig
}
export function Dropdown({
@@ -34,6 +35,7 @@ export function Dropdown({
previewValue,
disabled,
placeholder = 'Select an option...',
config,
}: DropdownProps) {
const [storeValue, setStoreValue] = useSubBlockValue<string>(blockId, subBlockId)
const [storeInitialized, setStoreInitialized] = useState(false)
@@ -281,7 +283,7 @@ export function Dropdown({
{/* Dropdown */}
{open && (
<div className='absolute top-full left-0 z-[100] mt-1 w-full min-w-[286px]'>
<div className='absolute top-full left-0 z-[100] mt-1 w-full'>
<div className='allow-scroll fade-in-0 zoom-in-95 animate-in rounded-md border bg-popover text-popover-foreground shadow-lg'>
<div
ref={dropdownRef}

View File

@@ -237,10 +237,11 @@ export function GoogleDrivePicker({
setIsLoading(true)
try {
const url = new URL('/api/auth/oauth/token', window.location.origin)
url.searchParams.set('credentialId', effectiveCredentialId)
// include workflowId if available via global registry (server adds session owner otherwise)
const response = await fetch(url.toString())
const response = await fetch('/api/auth/oauth/token', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({ credentialId: effectiveCredentialId, workflowId }),
})
if (!response.ok) {
throw new Error(`Failed to fetch access token: ${response.status}`)

View File

@@ -13,7 +13,7 @@ import {
CommandList,
} from '@/components/ui/command'
import { Popover, PopoverContent, PopoverTrigger } from '@/components/ui/popover'
import { Logger } from '@/lib/logs/console/logger'
import { createLogger } from '@/lib/logs/console/logger'
import {
type Credential,
getProviderIdFromServiceId,
@@ -22,7 +22,7 @@ import {
} from '@/lib/oauth'
import { OAuthRequiredModal } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/workflow-block/components/sub-block/components/credential-selector/components/oauth-required-modal'
const logger = new Logger('JiraIssueSelector')
const logger = createLogger('JiraIssueSelector')
export interface JiraIssueInfo {
id: string

View File

@@ -1,4 +1,4 @@
import { useRef, useState } from 'react'
import { useEffect, useRef, useState } from 'react'
import { ChevronDown, Plus, Trash } from 'lucide-react'
import { Badge } from '@/components/ui/badge'
import { Button } from '@/components/ui/button'
@@ -8,10 +8,16 @@ import {
DropdownMenuItem,
DropdownMenuTrigger,
} from '@/components/ui/dropdown-menu'
import { formatDisplayText } from '@/components/ui/formatted-text'
import { Input } from '@/components/ui/input'
import { Label } from '@/components/ui/label'
import { checkTagTrigger, TagDropdown } from '@/components/ui/tag-dropdown'
import {
Select,
SelectContent,
SelectItem,
SelectTrigger,
SelectValue,
} from '@/components/ui/select'
import { Textarea } from '@/components/ui/textarea'
import { cn } from '@/lib/utils'
import { useSubBlockValue } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/workflow-block/components/sub-block/hooks/use-sub-block-value'
@@ -59,27 +65,31 @@ export function FieldFormat({
emptyMessage = 'No fields defined',
showType = true,
showValue = false,
valuePlaceholder = 'Enter value or <variable.name>',
valuePlaceholder = 'Enter test value',
isConnecting = false,
config,
}: FieldFormatProps) {
const [storeValue, setStoreValue] = useSubBlockValue<Field[]>(blockId, subBlockId)
const [tagDropdownStates, setTagDropdownStates] = useState<
Record<
string,
{
visible: boolean
cursorPosition: number
}
>
>({})
const [dragHighlight, setDragHighlight] = useState<Record<string, boolean>>({})
const valueInputRefs = useRef<Record<string, HTMLInputElement>>({})
const valueInputRefs = useRef<Record<string, HTMLInputElement | HTMLTextAreaElement>>({})
const [localValues, setLocalValues] = useState<Record<string, string>>({})
// Use preview value when in preview mode, otherwise use store value
const value = isPreview ? previewValue : storeValue
const fields: Field[] = value || []
useEffect(() => {
const initial: Record<string, string> = {}
;(fields || []).forEach((f) => {
if (localValues[f.id] === undefined) {
initial[f.id] = (f.value as string) || ''
}
})
if (Object.keys(initial).length > 0) {
setLocalValues((prev) => ({ ...prev, ...initial }))
}
}, [fields])
// Field operations
const addField = () => {
if (isPreview || disabled) return
@@ -88,12 +98,12 @@ export function FieldFormat({
...DEFAULT_FIELD,
id: crypto.randomUUID(),
}
setStoreValue([...fields, newField])
setStoreValue([...(fields || []), newField])
}
const removeField = (id: string) => {
if (isPreview || disabled) return
setStoreValue(fields.filter((field: Field) => field.id !== id))
setStoreValue((fields || []).filter((field: Field) => field.id !== id))
}
// Validate field name for API safety
@@ -103,38 +113,22 @@ export function FieldFormat({
return name.replace(/[\x00-\x1F"\\]/g, '').trim()
}
// Tag dropdown handlers
const handleValueInputChange = (fieldId: string, newValue: string) => {
const input = valueInputRefs.current[fieldId]
if (!input) return
const cursorPosition = input.selectionStart || 0
const shouldShow = checkTagTrigger(newValue, cursorPosition)
setTagDropdownStates((prev) => ({
...prev,
[fieldId]: {
visible: shouldShow.show,
cursorPosition,
},
}))
updateField(fieldId, 'value', newValue)
setLocalValues((prev) => ({ ...prev, [fieldId]: newValue }))
}
const handleTagSelect = (fieldId: string, newValue: string) => {
updateField(fieldId, 'value', newValue)
setTagDropdownStates((prev) => ({
...prev,
[fieldId]: { ...prev[fieldId], visible: false },
}))
}
// Value normalization: keep it simple for string types
const handleTagDropdownClose = (fieldId: string) => {
setTagDropdownStates((prev) => ({
...prev,
[fieldId]: { ...prev[fieldId], visible: false },
}))
const handleValueInputBlur = (field: Field) => {
if (isPreview || disabled) return
const inputEl = valueInputRefs.current[field.id]
if (!inputEl) return
const current = localValues[field.id] ?? inputEl.value ?? ''
const trimmed = current.trim()
if (!trimmed) return
updateField(field.id, 'value', current)
}
// Drag and drop handlers for connection blocks
@@ -152,47 +146,8 @@ export function FieldFormat({
const handleDrop = (e: React.DragEvent, fieldId: string) => {
e.preventDefault()
setDragHighlight((prev) => ({ ...prev, [fieldId]: false }))
try {
const data = JSON.parse(e.dataTransfer.getData('application/json'))
if (data.type === 'connectionBlock' && data.connectionData) {
const input = valueInputRefs.current[fieldId]
if (!input) return
// Focus the input first
input.focus()
// Get current cursor position or use end of field
const dropPosition = input.selectionStart ?? (input.value?.length || 0)
// Insert '<' at drop position to trigger the dropdown
const currentValue = input.value || ''
const newValue = `${currentValue.slice(0, dropPosition)}<${currentValue.slice(dropPosition)}`
// Update the field value
updateField(fieldId, 'value', newValue)
// Set cursor position and show dropdown
setTimeout(() => {
input.selectionStart = dropPosition + 1
input.selectionEnd = dropPosition + 1
// Trigger dropdown by simulating the tag check
const cursorPosition = dropPosition + 1
const shouldShow = checkTagTrigger(newValue, cursorPosition)
setTagDropdownStates((prev) => ({
...prev,
[fieldId]: {
visible: shouldShow.show,
cursorPosition,
},
}))
}, 0)
}
} catch (error) {
console.error('Error handling drop:', error)
}
const input = valueInputRefs.current[fieldId]
input?.focus()
}
// Update handlers
@@ -204,12 +159,14 @@ export function FieldFormat({
value = validateFieldName(value)
}
setStoreValue(fields.map((f: Field) => (f.id === id ? { ...f, [field]: value } : f)))
setStoreValue((fields || []).map((f: Field) => (f.id === id ? { ...f, [field]: value } : f)))
}
const toggleCollapse = (id: string) => {
if (isPreview || disabled) return
setStoreValue(fields.map((f: Field) => (f.id === id ? { ...f, collapsed: !f.collapsed } : f)))
setStoreValue(
(fields || []).map((f: Field) => (f.id === id ? { ...f, collapsed: !f.collapsed } : f))
)
}
// Field header
@@ -371,54 +328,66 @@ export function FieldFormat({
<div className='space-y-1.5'>
<Label className='text-xs'>Value</Label>
<div className='relative'>
<Input
ref={(el) => {
if (el) valueInputRefs.current[field.id] = el
}}
name='value'
value={field.value || ''}
onChange={(e) => handleValueInputChange(field.id, e.target.value)}
onKeyDown={(e) => {
if (e.key === 'Escape') {
handleTagDropdownClose(field.id)
{field.type === 'boolean' ? (
<Select
value={localValues[field.id] ?? (field.value as string) ?? ''}
onValueChange={(v) => {
setLocalValues((prev) => ({ ...prev, [field.id]: v }))
if (!isPreview && !disabled) updateField(field.id, 'value', v)
}}
>
<SelectTrigger className='h-9 w-full justify-between font-normal'>
<SelectValue placeholder='Select value' className='truncate' />
</SelectTrigger>
<SelectContent>
<SelectItem value='true'>true</SelectItem>
<SelectItem value='false'>false</SelectItem>
</SelectContent>
</Select>
) : field.type === 'object' || field.type === 'array' ? (
<Textarea
ref={(el) => {
if (el) valueInputRefs.current[field.id] = el
}}
name='value'
value={localValues[field.id] ?? (field.value as string) ?? ''}
onChange={(e) => handleValueInputChange(field.id, e.target.value)}
onBlur={() => handleValueInputBlur(field)}
placeholder={
field.type === 'object' ? '{\n "key": "value"\n}' : '[\n 1, 2, 3\n]'
}
}}
onDragOver={(e) => handleDragOver(e, field.id)}
onDragLeave={(e) => handleDragLeave(e, field.id)}
onDrop={(e) => handleDrop(e, field.id)}
placeholder={valuePlaceholder}
disabled={isPreview || disabled}
className={cn(
'h-9 text-transparent caret-foreground placeholder:text-muted-foreground/50',
dragHighlight[field.id] && 'ring-2 ring-blue-500 ring-offset-2',
isConnecting &&
config?.connectionDroppable !== false &&
'ring-2 ring-blue-500 ring-offset-2 focus-visible:ring-blue-500'
)}
/>
{field.value && (
<div className='pointer-events-none absolute inset-0 flex items-center px-3 py-2'>
<div className='w-full overflow-hidden text-ellipsis whitespace-nowrap text-sm'>
{formatDisplayText(field.value, true)}
</div>
</div>
disabled={isPreview || disabled}
className={cn(
'min-h-[120px] font-mono text-sm placeholder:text-muted-foreground/50',
dragHighlight[field.id] && 'ring-2 ring-blue-500 ring-offset-2',
isConnecting &&
config?.connectionDroppable !== false &&
'ring-2 ring-blue-500 ring-offset-2 focus-visible:ring-blue-500'
)}
/>
) : (
<Input
ref={(el) => {
if (el) valueInputRefs.current[field.id] = el
}}
name='value'
value={localValues[field.id] ?? field.value ?? ''}
onChange={(e) => handleValueInputChange(field.id, e.target.value)}
onBlur={() => handleValueInputBlur(field)}
onDragOver={(e) => handleDragOver(e, field.id)}
onDragLeave={(e) => handleDragLeave(e, field.id)}
onDrop={(e) => handleDrop(e, field.id)}
placeholder={valuePlaceholder}
disabled={isPreview || disabled}
className={cn(
'h-9 placeholder:text-muted-foreground/50',
dragHighlight[field.id] && 'ring-2 ring-blue-500 ring-offset-2',
isConnecting &&
config?.connectionDroppable !== false &&
'ring-2 ring-blue-500 ring-offset-2 focus-visible:ring-blue-500'
)}
/>
)}
<TagDropdown
visible={tagDropdownStates[field.id]?.visible || false}
onSelect={(newValue) => handleTagSelect(field.id, newValue)}
blockId={blockId}
activeSourceBlockId={null}
inputValue={field.value || ''}
cursorPosition={tagDropdownStates[field.id]?.cursorPosition || 0}
onClose={() => handleTagDropdownClose(field.id)}
style={{
position: 'absolute',
top: '100%',
left: 0,
right: 0,
zIndex: 9999,
}}
/>
</div>
</div>
)}
@@ -460,7 +429,7 @@ export function ResponseFormat(
emptyMessage='No response fields defined'
showType={false}
showValue={true}
valuePlaceholder='Enter value or <variable.name>'
valuePlaceholder='Enter test value'
/>
)
}

View File

@@ -1,5 +1,4 @@
import { useCallback, useEffect, useState } from 'react'
import { logger } from '@trigger.dev/sdk/v3'
import { PlusIcon, WrenchIcon, XIcon } from 'lucide-react'
import { Button } from '@/components/ui/button'
import { Popover, PopoverContent, PopoverTrigger } from '@/components/ui/popover'
@@ -13,6 +12,7 @@ import {
import { Switch } from '@/components/ui/switch'
import { Toggle } from '@/components/ui/toggle'
import { Tooltip, TooltipContent, TooltipTrigger } from '@/components/ui/tooltip'
import { createLogger } from '@/lib/logs/console/logger'
import type { OAuthProvider, OAuthService } from '@/lib/oauth/oauth'
import { cn } from '@/lib/utils'
import {
@@ -49,6 +49,8 @@ import {
type ToolParameterConfig,
} from '@/tools/params'
const logger = createLogger('ToolInput')
interface ToolInputProps {
blockId: string
subBlockId: string

View File

@@ -17,11 +17,11 @@ import {
TooltipContent,
TooltipTrigger,
} from '@/components/ui'
import { Logger } from '@/lib/logs/console/logger'
import { createLogger } from '@/lib/logs/console/logger'
import { JSONView } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/console/components'
import { ConfigSection } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/workflow-block/components/sub-block/components/webhook/components'
const logger = new Logger('GmailConfig')
const logger = createLogger('GmailConfig')
const TOOLTIPS = {
labels: 'Select which email labels to monitor.',

View File

@@ -17,11 +17,11 @@ import {
TooltipContent,
TooltipTrigger,
} from '@/components/ui'
import { Logger } from '@/lib/logs/console/logger'
import { createLogger } from '@/lib/logs/console/logger'
import { JSONView } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/panel/components/console/components'
import { ConfigSection } from '@/app/workspace/[workspaceId]/w/[workflowId]/components/workflow-block/components/sub-block/components/webhook/components'
const logger = new Logger('OutlookConfig')
const logger = createLogger('OutlookConfig')
interface OutlookFolder {
id: string

View File

@@ -126,9 +126,12 @@ export function SubBlock({
blockId={blockId}
subBlockId={config.id}
options={config.options as { label: string; id: string }[]}
defaultValue={typeof config.value === 'function' ? config.value({}) : config.value}
placeholder={config.placeholder}
isPreview={isPreview}
previewValue={previewValue}
disabled={isDisabled}
config={config}
/>
</div>
)
@@ -139,6 +142,7 @@ export function SubBlock({
blockId={blockId}
subBlockId={config.id}
options={config.options as { label: string; id: string }[]}
defaultValue={typeof config.value === 'function' ? config.value({}) : config.value}
placeholder={config.placeholder}
isPreview={isPreview}
previewValue={previewValue}
@@ -435,6 +439,7 @@ export function SubBlock({
disabled={isDisabled}
isConnecting={isConnecting}
config={config}
showValue={true}
/>
)
}

View File

@@ -548,8 +548,8 @@ export function useWorkflowExecution() {
}
})
// Merge subblock states from the appropriate store
const mergedStates = mergeSubblockState(validBlocks)
// Merge subblock states from the appropriate store (scoped to active workflow)
const mergedStates = mergeSubblockState(validBlocks, activeWorkflowId ?? undefined)
// Debug: Check for blocks with undefined types after merging
Object.entries(mergedStates).forEach(([blockId, block]) => {

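The hunk above scopes subblock merging to the active workflow by passing `activeWorkflowId` as a second argument. A minimal sketch of what that scoping looks like — the real `mergeSubblockState` reads from the subblock store, and the shapes below are simplified assumptions, not the actual store types:

```typescript
// Illustrative sketch of workflow-scoped subblock merging; shapes are
// simplified assumptions, not the app's real store types.
type SubBlockValues = Record<string, Record<string, unknown>> // blockId -> subBlockId -> value
type Block = { id: string; subBlocks: Record<string, { value: unknown }> }

function mergeSubblockState(
  blocks: Record<string, Block>,
  valuesByWorkflow: Record<string, SubBlockValues>,
  workflowId?: string
): Record<string, Block> {
  // Without a workflow id there is nothing to scope to, so return blocks unchanged.
  if (!workflowId) return blocks
  const scoped = valuesByWorkflow[workflowId] ?? {}
  const merged: Record<string, Block> = {}
  for (const [blockId, block] of Object.entries(blocks)) {
    const overrides = scoped[blockId] ?? {}
    const subBlocks = { ...block.subBlocks }
    for (const [subBlockId, value] of Object.entries(overrides)) {
      // Override only the subblocks that have a stored value for this workflow.
      subBlocks[subBlockId] = { ...subBlocks[subBlockId], value }
    }
    merged[blockId] = { ...block, subBlocks }
  }
  return merged
}
```

Passing the workflow id prevents values saved under one workflow from bleeding into another workflow that happens to reuse the same block ids.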
View File

@@ -1,11 +1,11 @@
'use client'
import { useCallback, useEffect, useRef, useState } from 'react'
import { logger } from '@sentry/nextjs'
import { Download, Folder, Plus } from 'lucide-react'
import { useParams, useRouter } from 'next/navigation'
import { Button } from '@/components/ui/button'
import { Popover, PopoverContent, PopoverTrigger } from '@/components/ui/popover'
import { createLogger } from '@/lib/logs/console/logger'
import { generateFolderName } from '@/lib/naming'
import { cn } from '@/lib/utils'
import { useUserPermissionsContext } from '@/app/workspace/[workspaceId]/providers/workspace-permissions-provider'
@@ -14,7 +14,8 @@ import { useWorkflowDiffStore } from '@/stores/workflow-diff/store'
import { useWorkflowRegistry } from '@/stores/workflows/registry/store'
import { parseWorkflowYaml } from '@/stores/workflows/yaml/importer'
// Constants
const logger = createLogger('CreateMenu')
const TIMERS = {
LONG_PRESS_DELAY: 500,
CLOSE_DELAY: 150,

View File

@@ -45,14 +45,15 @@ export function General() {
const toggleConsoleExpandedByDefault = useGeneralStore(
(state) => state.toggleConsoleExpandedByDefault
)
const loadSettings = useGeneralStore((state) => state.loadSettings)
// Sync theme from store to next-themes when theme changes
useEffect(() => {
const loadData = async () => {
await loadSettings()
if (!isLoading && theme) {
// Ensure next-themes is in sync with our store
const { syncThemeToNextThemes } = require('@/lib/theme-sync')
syncThemeToNextThemes(theme)
}
loadData()
}, [loadSettings])
}, [theme, isLoading])
const handleThemeChange = async (value: 'system' | 'light' | 'dark') => {
await setTheme(value)

View File

@@ -1,4 +1,4 @@
import { task } from '@trigger.dev/sdk/v3'
import { task } from '@trigger.dev/sdk'
import { eq, sql } from 'drizzle-orm'
import { v4 as uuidv4 } from 'uuid'
import { checkServerSideUsageLimits } from '@/lib/billing'

View File

@@ -1,4 +1,4 @@
import { task } from '@trigger.dev/sdk/v3'
import { task } from '@trigger.dev/sdk'
import { eq, sql } from 'drizzle-orm'
import { v4 as uuidv4 } from 'uuid'
import { checkServerSideUsageLimits } from '@/lib/billing'

View File

@@ -9,7 +9,9 @@ import {
getProviderIcon,
MODELS_TEMP_RANGE_0_1,
MODELS_TEMP_RANGE_0_2,
MODELS_WITH_REASONING_EFFORT,
MODELS_WITH_TEMPERATURE_SUPPORT,
MODELS_WITH_VERBOSITY,
providers,
} from '@/providers/utils'
@@ -210,6 +212,41 @@ Create a system prompt appropriately detailed for the request, using clear langu
},
},
},
{
id: 'reasoningEffort',
title: 'Reasoning Effort',
type: 'dropdown',
layout: 'half',
placeholder: 'Select reasoning effort...',
options: [
{ label: 'minimal', id: 'minimal' },
{ label: 'low', id: 'low' },
{ label: 'medium', id: 'medium' },
{ label: 'high', id: 'high' },
],
value: () => 'medium',
condition: {
field: 'model',
value: MODELS_WITH_REASONING_EFFORT,
},
},
{
id: 'verbosity',
title: 'Verbosity',
type: 'dropdown',
layout: 'half',
placeholder: 'Select verbosity...',
options: [
{ label: 'low', id: 'low' },
{ label: 'medium', id: 'medium' },
{ label: 'high', id: 'high' },
],
value: () => 'medium',
condition: {
field: 'model',
value: MODELS_WITH_VERBOSITY,
},
},
{
id: 'apiKey',
title: 'API Key',
@@ -485,6 +522,8 @@ Example 3 (Array Input):
},
},
temperature: { type: 'number', description: 'Response randomness level' },
reasoningEffort: { type: 'string', description: 'Reasoning effort level for GPT-5 models' },
verbosity: { type: 'string', description: 'Verbosity level for GPT-5 models' },
tools: { type: 'json', description: 'Available tools configuration' },
},
outputs: {

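The new `reasoningEffort` and `verbosity` sub-blocks above are gated by `condition: { field: 'model', value: MODELS_WITH_REASONING_EFFORT }`, so they only render for models in those lists. A minimal sketch of how such a condition can be evaluated — the app's real evaluator may support more operators, so treat this as an assumption:

```typescript
// Sketch of condition-gated sub-block visibility; the real evaluator in the
// app may handle additional operators and nested conditions.
interface SubBlockCondition {
  field: string
  value: string | string[]
}

function isSubBlockVisible(
  condition: SubBlockCondition | undefined,
  blockValues: Record<string, unknown>
): boolean {
  if (!condition) return true // no condition means always visible
  const current = blockValues[condition.field]
  // An array value means "visible when the field matches any entry".
  return Array.isArray(condition.value)
    ? condition.value.includes(current as string)
    : condition.value === current
}
```

With this shape, switching the model dropdown away from a GPT-5 variant hides both dropdowns without any extra wiring in the block config.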
View File

@@ -5,7 +5,7 @@ export const StarterBlock: BlockConfig = {
type: 'starter',
name: 'Starter',
description: 'Start workflow',
longDescription: 'Initiate your workflow manually with optional structured input for API calls.',
longDescription: 'Initiate your workflow manually with optional structured input.',
category: 'blocks',
bgColor: '#2FB3FF',
icon: StartIcon,
@@ -25,9 +25,11 @@ export const StarterBlock: BlockConfig = {
// Structured Input format - visible if manual run is selected (advanced mode)
{
id: 'inputFormat',
title: 'Input Format (for API calls)',
title: 'Input Format',
type: 'input-format',
layout: 'full',
description:
'Name and Type define your input schema. Value is used only for manual test runs.',
mode: 'advanced',
condition: { field: 'startWorkflow', value: 'manual' },
},

View File

@@ -31,7 +31,7 @@ export const baseStyles = {
},
button: {
display: 'inline-block',
backgroundColor: 'var(--brand-primary-hover-hex)',
backgroundColor: '#802FFF',
color: '#ffffff',
fontWeight: 'bold',
fontSize: '16px',
@@ -42,7 +42,7 @@ export const baseStyles = {
margin: '20px 0',
},
link: {
color: 'var(--brand-primary-hover-hex)',
color: '#802FFF',
textDecoration: 'underline',
},
footer: {
@@ -79,7 +79,7 @@ export const baseStyles = {
width: '249px',
},
sectionCenter: {
borderBottom: '1px solid var(--brand-primary-hover-hex)',
borderBottom: '1px solid #802FFF',
width: '102px',
},
}

View File

@@ -1,17 +1,21 @@
import {
Body,
Button,
Column,
Container,
Head,
Heading,
Hr,
Html,
Img,
Link,
Preview,
Row,
Section,
Text,
} from '@react-email/components'
import { getBrandConfig } from '@/lib/branding/branding'
import { env } from '@/lib/env'
import { getAssetUrl } from '@/lib/utils'
import { baseStyles } from './base-styles'
import EmailFooter from './footer'
interface WorkspaceInvitation {
workspaceId: string
@@ -27,6 +31,8 @@ interface BatchInvitationEmailProps {
acceptUrl: string
}
const baseUrl = env.NEXT_PUBLIC_APP_URL || 'https://sim.ai'
const getPermissionLabel = (permission: string) => {
switch (permission) {
case 'admin':
@@ -43,9 +49,9 @@ const getPermissionLabel = (permission: string) => {
const getRoleLabel = (role: string) => {
switch (role) {
case 'admin':
return 'Team Admin (can manage team and billing)'
return 'Admin'
case 'member':
return 'Team Member (billing access only)'
return 'Member'
default:
return role
}
@@ -64,217 +70,101 @@ export const BatchInvitationEmail = ({
return (
<Html>
<Head />
<Preview>
You've been invited to join {organizationName}
{hasWorkspaces ? ` and ${workspaceInvitations.length} workspace(s)` : ''}
</Preview>
<Body style={main}>
<Container style={container}>
<Section style={logoContainer}>
<Img
src={brand.logoUrl || 'https://sim.ai/logo.png'}
width='120'
height='36'
alt={brand.name}
style={logo}
/>
<Body style={baseStyles.main}>
<Preview>
You've been invited to join {organizationName}
{hasWorkspaces ? ` and ${workspaceInvitations.length} workspace(s)` : ''}
</Preview>
<Container style={baseStyles.container}>
<Section style={{ padding: '30px 0', textAlign: 'center' }}>
<Row>
<Column style={{ textAlign: 'center' }}>
<Img
src={brand.logoUrl || getAssetUrl('static/sim.png')}
width='114'
alt={brand.name}
style={{
margin: '0 auto',
}}
/>
</Column>
</Row>
</Section>
<Heading style={h1}>You're invited to join {organizationName}!</Heading>
<Text style={text}>
<strong>{inviterName}</strong> has invited you to join{' '}
<strong>{organizationName}</strong> on Sim.
</Text>
{/* Organization Invitation Details */}
<Section style={invitationSection}>
<Heading style={h2}>Team Access</Heading>
<div style={roleCard}>
<Text style={roleTitle}>Team Role: {getRoleLabel(organizationRole)}</Text>
<Text style={roleDescription}>
{organizationRole === 'admin'
? "You'll be able to manage team members, billing, and workspace access."
: "You'll have access to shared team billing and can be invited to workspaces."}
</Text>
</div>
<Section style={baseStyles.sectionsBorders}>
<Row>
<Column style={baseStyles.sectionBorder} />
<Column style={baseStyles.sectionCenter} />
<Column style={baseStyles.sectionBorder} />
</Row>
</Section>
{/* Workspace Invitations */}
{hasWorkspaces && (
<Section style={invitationSection}>
<Heading style={h2}>
Workspace Access ({workspaceInvitations.length} workspace
{workspaceInvitations.length !== 1 ? 's' : ''})
</Heading>
<Text style={text}>You're also being invited to the following workspaces:</Text>
<Section style={baseStyles.content}>
<Text style={baseStyles.paragraph}>Hello,</Text>
<Text style={baseStyles.paragraph}>
<strong>{inviterName}</strong> has invited you to join{' '}
<strong>{organizationName}</strong> on Sim.
</Text>
{workspaceInvitations.map((ws, index) => (
<div key={ws.workspaceId} style={workspaceCard}>
<Text style={workspaceName}>{ws.workspaceName}</Text>
<Text style={workspacePermission}>{getPermissionLabel(ws.permission)}</Text>
</div>
))}
</Section>
)}
{/* Team Role Information */}
<Text style={baseStyles.paragraph}>
<strong>Team Role:</strong> {getRoleLabel(organizationRole)}
</Text>
<Text style={baseStyles.paragraph}>
{organizationRole === 'admin'
? "As a Team Admin, you'll be able to manage team members, billing, and workspace access."
: "As a Team Member, you'll have access to shared team billing and can be invited to workspaces."}
</Text>
<Section style={buttonContainer}>
<Button style={button} href={acceptUrl}>
Accept Invitation
</Button>
{/* Workspace Invitations */}
{hasWorkspaces && (
<>
<Text style={baseStyles.paragraph}>
<strong>
Workspace Access ({workspaceInvitations.length} workspace
{workspaceInvitations.length !== 1 ? 's' : ''}):
</strong>
</Text>
{workspaceInvitations.map((ws) => (
<Text
key={ws.workspaceId}
style={{ ...baseStyles.paragraph, marginLeft: '20px' }}
>
• <strong>{ws.workspaceName}</strong> - {getPermissionLabel(ws.permission)}
</Text>
))}
</>
)}
<Link href={acceptUrl} style={{ textDecoration: 'none' }}>
<Text style={baseStyles.button}>Accept Invitation</Text>
</Link>
<Text style={baseStyles.paragraph}>
By accepting this invitation, you'll join {organizationName}
{hasWorkspaces
? ` and gain access to ${workspaceInvitations.length} workspace(s)`
: ''}
.
</Text>
<Text style={baseStyles.paragraph}>
This invitation will expire in 7 days. If you didn't expect this invitation, you can
safely ignore this email.
</Text>
<Text style={baseStyles.paragraph}>
Best regards,
<br />
The Sim Team
</Text>
</Section>
<Text style={text}>
By accepting this invitation, you'll join {organizationName}
{hasWorkspaces ? ` and gain access to ${workspaceInvitations.length} workspace(s)` : ''}
.
</Text>
<Hr style={hr} />
<Text style={footer}>
If you have any questions, you can reach out to {inviterName} directly or contact our
support team.
</Text>
<Text style={footer}>
This invitation will expire in 7 days. If you didn't expect this invitation, you can
safely ignore this email.
</Text>
</Container>
<EmailFooter baseUrl={baseUrl} />
</Body>
</Html>
)
}
export default BatchInvitationEmail
// Styles
const main = {
backgroundColor: '#f6f9fc',
fontFamily:
'-apple-system,BlinkMacSystemFont,"Segoe UI",Roboto,"Helvetica Neue",Ubuntu,sans-serif',
}
const container = {
backgroundColor: '#ffffff',
margin: '0 auto',
padding: '20px 0 48px',
marginBottom: '64px',
}
const logoContainer = {
margin: '32px 0',
textAlign: 'center' as const,
}
const logo = {
margin: '0 auto',
}
const h1 = {
color: '#333',
fontSize: '24px',
fontWeight: 'bold',
margin: '40px 0',
padding: '0',
textAlign: 'center' as const,
}
const h2 = {
color: '#333',
fontSize: '18px',
fontWeight: 'bold',
margin: '24px 0 16px 0',
padding: '0',
}
const text = {
color: '#333',
fontSize: '16px',
lineHeight: '26px',
margin: '16px 0',
padding: '0 40px',
}
const invitationSection = {
margin: '32px 0',
padding: '0 40px',
}
const roleCard = {
backgroundColor: '#f8f9fa',
border: '1px solid #e9ecef',
borderRadius: '8px',
padding: '16px',
margin: '16px 0',
}
const roleTitle = {
color: '#333',
fontSize: '16px',
fontWeight: 'bold',
margin: '0 0 8px 0',
}
const roleDescription = {
color: '#6c757d',
fontSize: '14px',
lineHeight: '20px',
margin: '0',
}
const workspaceCard = {
backgroundColor: '#f8f9fa',
border: '1px solid #e9ecef',
borderRadius: '6px',
padding: '12px 16px',
margin: '8px 0',
display: 'flex',
justifyContent: 'space-between',
alignItems: 'center',
}
const workspaceName = {
color: '#333',
fontSize: '15px',
fontWeight: '500',
margin: '0',
}
const workspacePermission = {
color: '#6c757d',
fontSize: '13px',
margin: '0',
}
const buttonContainer = {
margin: '32px 0',
textAlign: 'center' as const,
}
const button = {
backgroundColor: '#007bff',
borderRadius: '6px',
color: '#fff',
fontSize: '16px',
fontWeight: 'bold',
textDecoration: 'none',
textAlign: 'center' as const,
display: 'inline-block',
padding: '12px 24px',
margin: '0 auto',
}
const hr = {
borderColor: '#e9ecef',
margin: '32px 0',
}
const footer = {
color: '#6c757d',
fontSize: '14px',
lineHeight: '20px',
margin: '8px 0',
padding: '0 40px',
}

View File

@@ -0,0 +1,136 @@
import {
Body,
Column,
Container,
Head,
Html,
Img,
Preview,
Row,
Section,
Text,
} from '@react-email/components'
import { format } from 'date-fns'
import { getBrandConfig } from '@/lib/branding/branding'
import { env } from '@/lib/env'
import { getAssetUrl } from '@/lib/utils'
import { baseStyles } from './base-styles'
import EmailFooter from './footer'
interface HelpConfirmationEmailProps {
userEmail?: string
type?: 'bug' | 'feedback' | 'feature_request' | 'other'
attachmentCount?: number
submittedDate?: Date
}
const baseUrl = env.NEXT_PUBLIC_APP_URL || 'https://sim.ai'
const getTypeLabel = (type: string) => {
switch (type) {
case 'bug':
return 'Bug Report'
case 'feedback':
return 'Feedback'
case 'feature_request':
return 'Feature Request'
case 'other':
return 'General Inquiry'
default:
return 'Request'
}
}
export const HelpConfirmationEmail = ({
userEmail = '',
type = 'other',
attachmentCount = 0,
submittedDate = new Date(),
}: HelpConfirmationEmailProps) => {
const brand = getBrandConfig()
const typeLabel = getTypeLabel(type)
return (
<Html>
<Head />
<Body style={baseStyles.main}>
<Preview>Your {typeLabel.toLowerCase()} has been received</Preview>
<Container style={baseStyles.container}>
<Section style={{ padding: '30px 0', textAlign: 'center' }}>
<Row>
<Column style={{ textAlign: 'center' }}>
<Img
src={brand.logoUrl || getAssetUrl('static/sim.png')}
width='114'
alt={brand.name}
style={{
margin: '0 auto',
}}
/>
</Column>
</Row>
</Section>
<Section style={baseStyles.sectionsBorders}>
<Row>
<Column style={baseStyles.sectionBorder} />
<Column style={baseStyles.sectionCenter} />
<Column style={baseStyles.sectionBorder} />
</Row>
</Section>
<Section style={baseStyles.content}>
<Text style={baseStyles.paragraph}>Hello,</Text>
<Text style={baseStyles.paragraph}>
Thank you for your <strong>{typeLabel.toLowerCase()}</strong> submission. We've
received your request and will get back to you as soon as possible.
</Text>
{attachmentCount > 0 && (
<Text style={baseStyles.paragraph}>
You attached{' '}
<strong>
{attachmentCount} image{attachmentCount > 1 ? 's' : ''}
</strong>{' '}
with your request.
</Text>
)}
<Text style={baseStyles.paragraph}>
We typically respond to{' '}
{type === 'bug'
? 'bug reports'
: type === 'feature_request'
? 'feature requests'
: 'inquiries'}{' '}
within a few hours. If you need immediate assistance, please don't hesitate to reach
out to us directly.
</Text>
<Text style={baseStyles.paragraph}>
Best regards,
<br />
The {brand.name} Team
</Text>
<Text
style={{
...baseStyles.footerText,
marginTop: '40px',
textAlign: 'left',
color: '#666666',
}}
>
This confirmation was sent on {format(submittedDate, 'MMMM do, yyyy')} for your{' '}
{typeLabel.toLowerCase()} submission from {userEmail}.
</Text>
</Section>
</Container>
<EmailFooter baseUrl={baseUrl} />
</Body>
</Html>
)
}
export default HelpConfirmationEmail

View File

@@ -1,6 +1,7 @@
export * from './base-styles'
export { BatchInvitationEmail } from './batch-invitation-email'
export { default as EmailFooter } from './footer'
export { HelpConfirmationEmail } from './help-confirmation-email'
export { InvitationEmail } from './invitation-email'
export { OTPVerificationEmail } from './otp-verification-email'
export * from './render-email'

View File

@@ -1,6 +1,7 @@
import { render } from '@react-email/components'
import {
BatchInvitationEmail,
HelpConfirmationEmail,
InvitationEmail,
OTPVerificationEmail,
ResetPasswordEmail,
@@ -65,6 +66,21 @@ export async function renderBatchInvitationEmail(
)
}
export async function renderHelpConfirmationEmail(
userEmail: string,
type: 'bug' | 'feedback' | 'feature_request' | 'other',
attachmentCount = 0
): Promise<string> {
return await render(
HelpConfirmationEmail({
userEmail,
type,
attachmentCount,
submittedDate: new Date(),
})
)
}
export function getEmailSubject(
type:
| 'sign-in'
@@ -73,6 +89,7 @@ export function getEmailSubject(
| 'reset-password'
| 'invitation'
| 'batch-invitation'
| 'help-confirmation'
): string {
switch (type) {
case 'sign-in':
@@ -87,6 +104,8 @@ export function getEmailSubject(
return "You've been invited to join a team on Sim"
case 'batch-invitation':
return "You've been invited to join a team and workspaces on Sim"
case 'help-confirmation':
return 'Your request has been received'
default:
return 'Sim'
}

View File

@@ -0,0 +1,5 @@
ALTER TABLE "document" ADD COLUMN IF NOT EXISTS "processing_status" text DEFAULT 'pending' NOT NULL;
ALTER TABLE "document" ADD COLUMN IF NOT EXISTS "processing_started_at" timestamp;
ALTER TABLE "document" ADD COLUMN IF NOT EXISTS "processing_completed_at" timestamp;
ALTER TABLE "document" ADD COLUMN IF NOT EXISTS "processing_error" text;
CREATE INDEX IF NOT EXISTS "doc_processing_status_idx" ON "document" USING btree ("knowledge_base_id","processing_status");
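Every statement in this migration uses `IF NOT EXISTS`, so re-running it against a database that already has the columns is a no-op. A small sketch of generating statements in that idempotent form — the column name and type below mirror the migration itself, but the helper is illustrative, not part of the codebase:

```typescript
// Sketch of building idempotent ALTER TABLE statements like the ones in this
// migration; the helper itself is illustrative, not code from the repo.
interface ColumnDef {
  name: string
  type: string
  defaultClause?: string // e.g. "DEFAULT 'pending' NOT NULL"
}

function addColumnIfNotExists(table: string, col: ColumnDef): string {
  const suffix = col.defaultClause ? ` ${col.defaultClause}` : ''
  // IF NOT EXISTS makes the statement safe to replay on an up-to-date schema.
  return `ALTER TABLE "${table}" ADD COLUMN IF NOT EXISTS "${col.name}" ${col.type}${suffix};`
}

const stmt = addColumnIfNotExists('document', {
  name: 'processing_status',
  type: 'text',
  defaultClause: "DEFAULT 'pending' NOT NULL",
})
```

Idempotent DDL is what lets a "missing migration" like this one be added after the fact: environments that already received the columns through another path apply it harmlessly.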

View File

@@ -1326,5 +1326,59 @@ describe('AgentBlockHandler', () => {
expect(requestBody.model).toBe('azure/gpt-4o')
expect(requestBody.apiKey).toBe('test-azure-api-key')
})
it('should pass GPT-5 specific parameters (reasoningEffort and verbosity) through the request pipeline', async () => {
const inputs = {
model: 'gpt-5',
systemPrompt: 'You are a helpful assistant.',
userPrompt: 'Hello!',
apiKey: 'test-api-key',
reasoningEffort: 'minimal',
verbosity: 'high',
temperature: 0.7,
}
mockGetProviderFromModel.mockReturnValue('openai')
await handler.execute(mockBlock, inputs, mockContext)
expect(mockFetch).toHaveBeenCalledWith(expect.any(String), expect.any(Object))
const fetchCall = mockFetch.mock.calls[0]
const requestBody = JSON.parse(fetchCall[1].body)
// Check that GPT-5 parameters are included in the request
expect(requestBody.reasoningEffort).toBe('minimal')
expect(requestBody.verbosity).toBe('high')
expect(requestBody.provider).toBe('openai')
expect(requestBody.model).toBe('gpt-5')
expect(requestBody.apiKey).toBe('test-api-key')
})
it('should handle missing GPT-5 parameters gracefully', async () => {
const inputs = {
model: 'gpt-5',
systemPrompt: 'You are a helpful assistant.',
userPrompt: 'Hello!',
apiKey: 'test-api-key',
temperature: 0.7,
// No reasoningEffort or verbosity provided
}
mockGetProviderFromModel.mockReturnValue('openai')
await handler.execute(mockBlock, inputs, mockContext)
expect(mockFetch).toHaveBeenCalledWith(expect.any(String), expect.any(Object))
const fetchCall = mockFetch.mock.calls[0]
const requestBody = JSON.parse(fetchCall[1].body)
// Check that GPT-5 parameters are undefined when not provided
expect(requestBody.reasoningEffort).toBeUndefined()
expect(requestBody.verbosity).toBeUndefined()
expect(requestBody.provider).toBe('openai')
expect(requestBody.model).toBe('gpt-5')
})
})
})

View File

@@ -368,6 +368,8 @@ export class AgentBlockHandler implements BlockHandler {
stream: streaming,
messages,
environmentVariables: context.environmentVariables || {},
reasoningEffort: inputs.reasoningEffort,
verbosity: inputs.verbosity,
}
}

View File

@@ -10,6 +10,8 @@ export interface AgentInputs {
apiKey?: string
azureEndpoint?: string
azureApiVersion?: string
reasoningEffort?: string
verbosity?: string
}
export interface ToolInput {

View File

@@ -771,7 +771,7 @@ export class Executor {
// Get the field value from workflow input if available
// First try to access via input.field, then directly from field
// This handles both input formats: { input: { field: value } } and { field: value }
const inputValue =
let inputValue =
this.workflowInput?.input?.[field.name] !== undefined
? this.workflowInput.input[field.name] // Try to get from input.field
: this.workflowInput?.[field.name] // Fallback to direct field access
@@ -781,13 +781,25 @@ export class Executor {
inputValue !== undefined ? JSON.stringify(inputValue) : 'undefined'
)
// Convert the value to the appropriate type
if (inputValue === undefined || inputValue === null) {
if (Object.hasOwn(field, 'value')) {
inputValue = (field as any).value
}
}
let typedValue = inputValue
if (inputValue !== undefined) {
if (field.type === 'number' && typeof inputValue !== 'number') {
typedValue = Number(inputValue)
if (inputValue !== undefined && inputValue !== null) {
if (field.type === 'string' && typeof inputValue !== 'string') {
typedValue = String(inputValue)
} else if (field.type === 'number' && typeof inputValue !== 'number') {
const num = Number(inputValue)
typedValue = Number.isNaN(num) ? inputValue : num
} else if (field.type === 'boolean' && typeof inputValue !== 'boolean') {
typedValue = inputValue === 'true' || inputValue === true
typedValue =
inputValue === 'true' ||
inputValue === true ||
inputValue === 1 ||
inputValue === '1'
} else if (
(field.type === 'object' || field.type === 'array') &&
typeof inputValue === 'string'

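The executor change above tightens how manual-run input values are coerced to each field's declared type: strings are stringified, non-numeric number inputs are left untouched instead of becoming `NaN`, and booleans accept `1`/`'1'` as truthy. A standalone sketch of that coercion, simplified from the diff and not the exact implementation:

```typescript
// Standalone sketch of the starter-input type coercion shown in the diff
// above; simplified, not the executor's exact code.
type FieldType = 'string' | 'number' | 'boolean' | 'object' | 'array'

function coerceInputValue(value: unknown, type: FieldType): unknown {
  if (value === undefined || value === null) return value
  if (type === 'string' && typeof value !== 'string') return String(value)
  if (type === 'number' && typeof value !== 'number') {
    const num = Number(value)
    return Number.isNaN(num) ? value : num // keep the original if not numeric
  }
  if (type === 'boolean' && typeof value !== 'boolean') {
    return value === 'true' || value === 1 || value === '1'
  }
  if ((type === 'object' || type === 'array') && typeof value === 'string') {
    try {
      return JSON.parse(value) // parse JSON-encoded objects/arrays
    } catch {
      return value // leave unparseable strings untouched
    }
  }
  return value
}
```

Falling back to the original value on a failed `Number(...)` or `JSON.parse(...)` means a malformed test value surfaces downstream as-is rather than silently becoming `NaN` or crashing the run.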
View File

@@ -11,7 +11,6 @@ import {
} from 'better-auth/plugins'
import { and, eq } from 'drizzle-orm'
import { headers } from 'next/headers'
import { Resend } from 'resend'
import Stripe from 'stripe'
import {
getEmailSubject,
@@ -21,11 +20,12 @@ import {
} from '@/components/emails/render-email'
import { getBaseURL } from '@/lib/auth-client'
import { DEFAULT_FREE_CREDITS } from '@/lib/billing/constants'
import { sendEmail } from '@/lib/email/mailer'
import { getFromEmailAddress } from '@/lib/email/utils'
import { quickValidateEmail } from '@/lib/email/validation'
import { env, isTruthy } from '@/lib/env'
import { isBillingEnabled, isProd } from '@/lib/environment'
import { createLogger } from '@/lib/logs/console/logger'
import { getEmailDomain } from '@/lib/urls/utils'
import { db } from '@/db'
import * as schema from '@/db/schema'
@@ -45,22 +45,6 @@ if (validStripeKey) {
})
}
// If there is no resend key, it might be a local dev environment
// In that case, we don't want to send emails and just log them
const validResendAPIKEY =
env.RESEND_API_KEY && env.RESEND_API_KEY.trim() !== '' && env.RESEND_API_KEY !== 'placeholder'
const resend = validResendAPIKEY
? new Resend(env.RESEND_API_KEY)
: {
emails: {
send: async (...args: any[]) => {
logger.info('Email would have been sent in production. Details:', args)
return { id: 'local-dev-mode' }
},
},
}
export const auth = betterAuth({
baseURL: getBaseURL(),
trustedOrigins: [
@@ -165,15 +149,16 @@ export const auth = betterAuth({
const html = await renderPasswordResetEmail(username, url)
const result = await resend.emails.send({
from: `Sim <team@${env.EMAIL_DOMAIN || getEmailDomain()}>`,
const result = await sendEmail({
to: user.email,
subject: getEmailSubject('reset-password'),
html,
from: getFromEmailAddress(),
emailType: 'transactional',
})
if (!result) {
throw new Error('Failed to send reset password email')
if (!result.success) {
throw new Error(`Failed to send reset password email: ${result.message}`)
}
},
},
@@ -252,8 +237,19 @@ export const auth = betterAuth({
)
}
// In development with no RESEND_API_KEY, log verification code
if (!validResendAPIKEY) {
const html = await renderOTPEmail(data.otp, data.email, data.type)
// Send email via consolidated mailer (supports Resend, Azure, or logging fallback)
const result = await sendEmail({
to: data.email,
subject: getEmailSubject(data.type),
html,
from: getFromEmailAddress(),
emailType: 'transactional',
})
// If no email service is configured, log verification code for development
if (!result.success && result.message.includes('no email service configured')) {
logger.info('🔑 VERIFICATION CODE FOR LOGIN/SIGNUP', {
email: data.email,
otp: data.otp,
@@ -263,18 +259,8 @@ export const auth = betterAuth({
return
}
const html = await renderOTPEmail(data.otp, data.email, data.type)
// In production, send an actual email
const result = await resend.emails.send({
from: `Sim <onboarding@${env.EMAIL_DOMAIN || getEmailDomain()}>`,
to: data.email,
subject: getEmailSubject(data.type),
html,
})
if (!result) {
throw new Error('Failed to send verification code')
if (!result.success) {
throw new Error(`Failed to send verification code: ${result.message}`)
}
} catch (error) {
logger.error('Error sending verification code:', {
@@ -470,7 +456,6 @@ export const auth = betterAuth({
responseType: 'code',
accessType: 'offline',
authentication: 'basic',
prompt: 'consent',
pkce: true,
redirectURI: `${env.NEXT_PUBLIC_APP_URL}/api/auth/oauth2/callback/microsoft-teams`,
},
@@ -486,7 +471,6 @@ export const auth = betterAuth({
responseType: 'code',
accessType: 'offline',
authentication: 'basic',
prompt: 'consent',
pkce: true,
redirectURI: `${env.NEXT_PUBLIC_APP_URL}/api/auth/oauth2/callback/microsoft-excel`,
},
@@ -509,7 +493,6 @@ export const auth = betterAuth({
responseType: 'code',
accessType: 'offline',
authentication: 'basic',
prompt: 'consent',
pkce: true,
redirectURI: `${env.NEXT_PUBLIC_APP_URL}/api/auth/oauth2/callback/microsoft-planner`,
},
@@ -534,7 +517,6 @@ export const auth = betterAuth({
responseType: 'code',
accessType: 'offline',
authentication: 'basic',
prompt: 'consent',
pkce: true,
redirectURI: `${env.NEXT_PUBLIC_APP_URL}/api/auth/oauth2/callback/outlook`,
},
@@ -550,7 +532,6 @@ export const auth = betterAuth({
responseType: 'code',
accessType: 'offline',
authentication: 'basic',
prompt: 'consent',
pkce: true,
redirectURI: `${env.NEXT_PUBLIC_APP_URL}/api/auth/oauth2/callback/onedrive`,
},
@@ -573,7 +554,6 @@ export const auth = betterAuth({
responseType: 'code',
accessType: 'offline',
authentication: 'basic',
prompt: 'consent',
pkce: true,
redirectURI: `${env.NEXT_PUBLIC_APP_URL}/api/auth/oauth2/callback/sharepoint`,
},
@@ -1284,133 +1264,30 @@ export const auth = betterAuth({
})
// Auto-create organization for team plan purchases
if (subscription.plan === 'team') {
try {
// Get the user who purchased the subscription
const user = await db
.select()
.from(schema.user)
.where(eq(schema.user.id, subscription.referenceId))
.limit(1)
if (user.length > 0) {
const currentUser = user[0]
// Store the original user ID before we change the referenceId
const originalUserId = subscription.referenceId
// First, sync usage limits for the purchasing user with their new plan
try {
const { syncUsageLimitsFromSubscription } = await import('@/lib/billing')
await syncUsageLimitsFromSubscription(originalUserId)
logger.info(
'Usage limits synced for purchasing user before organization creation',
{
userId: originalUserId,
}
)
} catch (error) {
logger.error('Failed to sync usage limits for purchasing user', {
userId: originalUserId,
error,
})
}
// Create organization for the team
const orgId = `org_${Date.now()}_${Math.random().toString(36).substring(2, 15)}`
const orgSlug = `${currentUser.name?.toLowerCase().replace(/\s+/g, '-') || 'team'}-${Date.now()}`
// Create a separate Stripe customer for the organization
let orgStripeCustomerId: string | null = null
if (stripeClient) {
try {
const orgStripeCustomer = await stripeClient.customers.create({
name: `${currentUser.name || 'User'}'s Team`,
email: currentUser.email,
metadata: {
organizationId: orgId,
type: 'organization',
},
})
orgStripeCustomerId = orgStripeCustomer.id
} catch (error) {
logger.error('Failed to create Stripe customer for organization', {
organizationId: orgId,
error,
})
// Continue without Stripe customer - can be created later
}
}
const newOrg = await db
.insert(schema.organization)
.values({
id: orgId,
name: `${currentUser.name || 'User'}'s Team`,
slug: orgSlug,
metadata: orgStripeCustomerId
? { stripeCustomerId: orgStripeCustomerId }
: null,
createdAt: new Date(),
updatedAt: new Date(),
})
.returning()
// Add the user as owner of the organization
await db.insert(schema.member).values({
id: `member_${Date.now()}_${Math.random().toString(36).substring(2, 15)}`,
userId: currentUser.id,
organizationId: orgId,
role: 'owner',
createdAt: new Date(),
})
// Update the subscription to reference the organization instead of the user
await db
.update(schema.subscription)
.set({ referenceId: orgId })
.where(eq(schema.subscription.id, subscription.id))
// Update the session to set the new organization as active
await db
.update(schema.session)
.set({ activeOrganizationId: orgId })
.where(eq(schema.session.userId, currentUser.id))
logger.info('Auto-created organization for team subscription', {
organizationId: orgId,
userId: currentUser.id,
subscriptionId: subscription.id,
orgName: `${currentUser.name || 'User'}'s Team`,
})
// Update referenceId for usage limit sync
subscription.referenceId = orgId
}
} catch (error) {
logger.error('Failed to auto-create organization for team subscription', {
subscriptionId: subscription.id,
referenceId: subscription.referenceId,
error,
})
}
try {
const { handleTeamPlanOrganization } = await import(
'@/lib/billing/team-management'
)
await handleTeamPlanOrganization(subscription)
} catch (error) {
logger.error('Failed to handle team plan organization creation', {
subscriptionId: subscription.id,
referenceId: subscription.referenceId,
error,
})
}
// Initialize billing period for the user/organization
// Initialize billing period and sync usage limits
try {
const { initializeBillingPeriod } = await import(
'@/lib/billing/core/billing-periods'
)
const { syncSubscriptionUsageLimits } = await import(
'@/lib/billing/team-management'
)
// Note: Usage limits are already synced above for team plan users
// For non-team plans, sync usage limits using the referenceId (which is the user ID)
if (subscription.plan !== 'team') {
const { syncUsageLimitsFromSubscription } = await import('@/lib/billing')
await syncUsageLimitsFromSubscription(subscription.referenceId)
logger.info('Usage limits synced after subscription creation', {
referenceId: subscription.referenceId,
})
}
// Sync usage limits for user or organization members
await syncSubscriptionUsageLimits(subscription)
// Initialize billing period for new subscription using Stripe dates
if (subscription.plan !== 'free') {
@@ -1447,15 +1324,29 @@ export const auth = betterAuth({
logger.info('Subscription updated', {
subscriptionId: subscription.id,
status: subscription.status,
plan: subscription.plan,
})
// Auto-create organization for team plan upgrades (free -> team)
try {
const { handleTeamPlanOrganization } = await import(
'@/lib/billing/team-management'
)
await handleTeamPlanOrganization(subscription)
} catch (error) {
logger.error('Failed to handle team plan organization creation on update', {
subscriptionId: subscription.id,
referenceId: subscription.referenceId,
error,
})
}
// Sync usage limits for the user/organization
try {
const { syncUsageLimitsFromSubscription } = await import('@/lib/billing')
await syncUsageLimitsFromSubscription(subscription.referenceId)
logger.info('Usage limits synced after subscription update', {
referenceId: subscription.referenceId,
})
const { syncSubscriptionUsageLimits } = await import(
'@/lib/billing/team-management'
)
await syncSubscriptionUsageLimits(subscription)
} catch (error) {
logger.error('Failed to sync usage limits after subscription update', {
referenceId: subscription.referenceId,
@@ -1551,12 +1442,17 @@ export const auth = betterAuth({
invitation.email
)
const result = await sendEmail({
to: invitation.email,
subject: `${inviterName} has invited you to join ${organization.name} on Sim`,
html,
from: getFromEmailAddress(),
emailType: 'transactional',
})
if (!result.success) {
logger.error('Failed to send organization invitation email:', result.message)
}
} catch (error) {
logger.error('Error sending invitation email', { error })
}

View File

@@ -13,6 +13,24 @@ import { member, organization, subscription, user, userStats } from '@/db/schema
const logger = createLogger('Billing')
/**
* Get organization subscription directly by organization ID
*/
export async function getOrganizationSubscription(organizationId: string) {
try {
const orgSubs = await db
.select()
.from(subscription)
.where(and(eq(subscription.referenceId, organizationId), eq(subscription.status, 'active')))
.limit(1)
return orgSubs.length > 0 ? orgSubs[0] : null
} catch (error) {
logger.error('Error getting organization subscription', { error, organizationId })
return null
}
}
interface BillingResult {
success: boolean
chargedAmount?: number
@@ -89,15 +107,43 @@ async function getStripeCustomerId(referenceId: string): Promise<string | null>
.where(eq(organization.id, referenceId))
.limit(1)
if (orgRecord.length > 0) {
// First, check if organization has its own Stripe customer (legacy support)
if (orgRecord[0].metadata) {
const metadata =
typeof orgRecord[0].metadata === 'string'
? JSON.parse(orgRecord[0].metadata)
: orgRecord[0].metadata
if (metadata?.stripeCustomerId) {
return metadata.stripeCustomerId
}
}
// If organization has no Stripe customer, use the owner's customer
// This is our new pattern: subscriptions stay with user, referenceId = orgId
const ownerRecord = await db
.select({
stripeCustomerId: user.stripeCustomerId,
userId: user.id,
})
.from(user)
.innerJoin(member, eq(member.userId, user.id))
.where(and(eq(member.organizationId, referenceId), eq(member.role, 'owner')))
.limit(1)
if (ownerRecord.length > 0 && ownerRecord[0].stripeCustomerId) {
logger.debug('Using organization owner Stripe customer for billing', {
organizationId: referenceId,
ownerId: ownerRecord[0].userId,
stripeCustomerId: ownerRecord[0].stripeCustomerId,
})
return ownerRecord[0].stripeCustomerId
}
logger.warn('No Stripe customer found for organization or its owner', {
organizationId: referenceId,
})
}
return null
@@ -431,8 +477,8 @@ export async function processOrganizationOverageBilling(
organizationId: string
): Promise<BillingResult> {
try {
// Get organization subscription directly (referenceId = organizationId)
const subscription = await getOrganizationSubscription(organizationId)
if (!subscription || !['team', 'enterprise'].includes(subscription.plan)) {
logger.warn('No team/enterprise subscription found for organization', { organizationId })
@@ -759,7 +805,9 @@ export async function getSimplifiedBillingSummary(
try {
// Get subscription and usage data upfront
const [subscription, usageData] = await Promise.all([
organizationId
? getOrganizationSubscription(organizationId)
: getHighestPrioritySubscription(userId),
getUserUsageData(userId),
])

View File

@@ -1,13 +1,31 @@
import { and, eq } from 'drizzle-orm'
import { DEFAULT_FREE_CREDITS } from '@/lib/billing/constants'
import { getPlanPricing } from '@/lib/billing/core/billing'
import { getHighestPrioritySubscription } from '@/lib/billing/core/subscription'
import { createLogger } from '@/lib/logs/console/logger'
import { db } from '@/db'
import { member, organization, subscription, user, userStats } from '@/db/schema'
const logger = createLogger('OrganizationBilling')
/**
* Get organization subscription directly by organization ID
* This is for our new pattern where referenceId = organizationId
*/
async function getOrganizationSubscription(organizationId: string) {
try {
const orgSubs = await db
.select()
.from(subscription)
.where(and(eq(subscription.referenceId, organizationId), eq(subscription.status, 'active')))
.limit(1)
return orgSubs.length > 0 ? orgSubs[0] : null
} catch (error) {
logger.error('Error getting organization subscription', { error, organizationId })
return null
}
}
interface OrganizationUsageData {
organizationId: string
organizationName: string
@@ -57,8 +75,8 @@ export async function getOrganizationBillingData(
const organizationData = orgRecord[0]
// Get organization subscription directly (referenceId = organizationId)
const subscription = await getOrganizationSubscription(organizationId)
if (!subscription) {
logger.warn('No subscription found for organization', { organizationId })
@@ -191,7 +209,7 @@ export async function updateMemberUsageLimit(
}
// Get organization subscription to validate limit
const subscription = await getOrganizationSubscription(organizationId)
if (!subscription) {
throw new Error('No active subscription found')
}

View File

@@ -0,0 +1,181 @@
import { eq } from 'drizzle-orm'
import { createLogger } from '@/lib/logs/console/logger'
import { db } from '@/db'
import { member, organization, session, subscription, user } from '@/db/schema'
const logger = createLogger('TeamManagement')
type SubscriptionData = {
id: string
plan: string
referenceId: string
status: string
seats?: number
[key: string]: any
}
/**
* Auto-create organization for team plan subscriptions
*/
export async function handleTeamPlanOrganization(
subscriptionData: SubscriptionData
): Promise<void> {
if (subscriptionData.plan !== 'team') return
try {
// For team subscriptions, referenceId should be the user ID initially
// But if the organization has already been created, it might be the org ID
let userId: string = subscriptionData.referenceId
let currentUser: any = null
// First try to get user directly (most common case)
const users = await db
.select()
.from(user)
.where(eq(user.id, subscriptionData.referenceId))
.limit(1)
if (users.length > 0) {
currentUser = users[0]
userId = currentUser.id
} else {
// If referenceId is not a user ID, it might be an organization ID
// In that case, the organization already exists, so we should skip
const existingOrg = await db
.select()
.from(organization)
.where(eq(organization.id, subscriptionData.referenceId))
.limit(1)
if (existingOrg.length > 0) {
logger.info('Organization already exists for team subscription, skipping creation', {
organizationId: subscriptionData.referenceId,
subscriptionId: subscriptionData.id,
})
return
}
logger.warn('User not found for team subscription and no existing organization', {
referenceId: subscriptionData.referenceId,
})
return
}
// Check if user already has an organization membership
const existingMember = await db.select().from(member).where(eq(member.userId, userId)).limit(1)
if (existingMember.length > 0) {
logger.info('User already has organization membership, skipping auto-creation', {
userId,
existingOrgId: existingMember[0].organizationId,
})
return
}
const orgName = `${currentUser.name || 'User'}'s Team`
const orgSlug = `${currentUser.name?.toLowerCase().replace(/\s+/g, '-') || 'team'}-${Date.now()}`
// Create organization directly in database
const orgId = `org_${Date.now()}_${Math.random().toString(36).substring(2, 15)}`
const [createdOrg] = await db
.insert(organization)
.values({
id: orgId,
name: orgName,
slug: orgSlug,
createdAt: new Date(),
updatedAt: new Date(),
})
.returning()
if (!createdOrg) {
throw new Error('Failed to create organization in database')
}
// Add the user as admin of the organization (owner role for full control)
await db.insert(member).values({
id: `member_${Date.now()}_${Math.random().toString(36).substring(2, 15)}`,
userId: currentUser.id,
organizationId: orgId,
role: 'owner', // Owner gives full admin privileges
createdAt: new Date(),
})
// Update the subscription to reference the organization instead of the user
await db
.update(subscription)
.set({ referenceId: orgId })
.where(eq(subscription.id, subscriptionData.id))
// Update the user's session to set the new organization as active
await db
.update(session)
.set({ activeOrganizationId: orgId })
.where(eq(session.userId, currentUser.id))
logger.info('Auto-created organization for team subscription', {
organizationId: orgId,
userId: currentUser.id,
subscriptionId: subscriptionData.id,
orgName,
userRole: 'owner',
})
// Update subscription object for subsequent logic
subscriptionData.referenceId = orgId
} catch (error) {
logger.error('Failed to auto-create organization for team subscription', {
subscriptionId: subscriptionData.id,
referenceId: subscriptionData.referenceId,
error,
})
throw error
}
}
/**
* Sync usage limits for user or organization
* Handles the complexity of determining whether to sync for user ID or org members
*/
export async function syncSubscriptionUsageLimits(
subscriptionData: SubscriptionData
): Promise<void> {
try {
const { syncUsageLimitsFromSubscription } = await import('@/lib/billing')
// For team plans, the referenceId is now an organization ID
// We need to sync limits for the organization members
if (subscriptionData.plan === 'team') {
// Get all members of the organization
const orgMembers = await db
.select({ userId: member.userId })
.from(member)
.where(eq(member.organizationId, subscriptionData.referenceId))
// Sync usage limits for each member
for (const orgMember of orgMembers) {
await syncUsageLimitsFromSubscription(orgMember.userId)
}
logger.info('Synced usage limits for team organization members', {
organizationId: subscriptionData.referenceId,
memberCount: orgMembers.length,
})
} else {
// For non-team plans, referenceId is the user ID
await syncUsageLimitsFromSubscription(subscriptionData.referenceId)
logger.info('Synced usage limits for user', {
userId: subscriptionData.referenceId,
plan: subscriptionData.plan,
})
}
} catch (error) {
logger.error('Failed to sync subscription usage limits', {
subscriptionId: subscriptionData.id,
referenceId: subscriptionData.referenceId,
error,
})
throw error
}
}
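The organization and member rows above are keyed with a timestamp-plus-base36 pattern rather than UUIDs. A minimal sketch of that pattern; `makeId` is an illustrative helper, not a function from the codebase:

```typescript
// Illustrative sketch of the `${prefix}_${Date.now()}_${Math.random()...}` ID
// pattern used for org_/member_ rows above; `makeId` is hypothetical.
function makeId(prefix: string): string {
  return `${prefix}_${Date.now()}_${Math.random().toString(36).substring(2, 15)}`
}

const orgId = makeId('org') // e.g. org_1724271411000_k3j9x2m1q
const memberId = makeId('member')
```

This gives uniqueness in practice (millisecond timestamp plus random suffix) but no hard collision guarantee, trading that for sortable, human-readable IDs.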

View File

@@ -1,5 +1,5 @@
import { and, count, eq } from 'drizzle-orm'
import { getOrganizationSubscription } from '@/lib/billing/core/billing'
import { quickValidateEmail } from '@/lib/email/validation'
import { createLogger } from '@/lib/logs/console/logger'
import { db } from '@/db'
@@ -33,8 +33,8 @@ export async function validateSeatAvailability(
additionalSeats = 1
): Promise<SeatValidationResult> {
try {
// Get organization subscription directly (referenceId = organizationId)
const subscription = await getOrganizationSubscription(organizationId)
if (!subscription) {
return {
@@ -71,7 +71,10 @@ export async function validateSeatAvailability(
// For enterprise plans, check metadata for custom seat allowances
if (subscription.plan === 'enterprise' && subscription.metadata) {
try {
const metadata =
typeof subscription.metadata === 'string'
? JSON.parse(subscription.metadata)
: subscription.metadata
if (metadata.maxSeats) {
maxSeats = metadata.maxSeats
}
@@ -142,8 +145,8 @@ export async function getOrganizationSeatInfo(
return null
}
// Get organization subscription directly (referenceId = organizationId)
const subscription = await getOrganizationSubscription(organizationId)
if (!subscription) {
return null
@@ -163,7 +166,10 @@ export async function getOrganizationSeatInfo(
if (subscription.plan === 'enterprise' && subscription.metadata) {
try {
const metadata =
typeof subscription.metadata === 'string'
? JSON.parse(subscription.metadata)
: subscription.metadata
if (metadata.maxSeats) {
maxSeats = metadata.maxSeats
}
@@ -282,8 +288,8 @@ export async function updateOrganizationSeats(
updatedBy: string
): Promise<{ success: boolean; error?: string }> {
try {
// Get current organization subscription directly (referenceId = organizationId)
const subscriptionRecord = await getOrganizationSubscription(organizationId)
if (!subscriptionRecord) {
return { success: false, error: 'No active subscription found' }

View File

@@ -9,10 +9,9 @@ import { mistralParserTool } from '@/tools/mistral/parser'
const logger = createLogger('DocumentProcessor')
// Timeout constants (in milliseconds)
const TIMEOUTS = {
FILE_DOWNLOAD: 60000,
MISTRAL_OCR_API: 90000,
} as const
type S3Config = {
@@ -27,20 +26,19 @@ type BlobConfig = {
connectionString?: string
}
const getKBConfig = (): S3Config | BlobConfig => {
const provider = getStorageProvider()
return provider === 'blob'
? {
containerName: BLOB_KB_CONFIG.containerName,
accountName: BLOB_KB_CONFIG.accountName,
accountKey: BLOB_KB_CONFIG.accountKey,
connectionString: BLOB_KB_CONFIG.connectionString,
}
: {
bucket: S3_KB_CONFIG.bucket,
region: S3_KB_CONFIG.region,
}
}
class APIError extends Error {
@@ -53,9 +51,6 @@ class APIError extends Error {
}
}
/**
* Process a document by parsing it and chunking the content
*/
export async function processDocument(
fileUrl: string,
filename: string,
@@ -79,29 +74,23 @@ export async function processDocument(
logger.info(`Processing document: ${filename}`)
try {
// Parse the document
const { content, processingMethod, cloudUrl } = await parseDocument(fileUrl, filename, mimeType)
// Create chunker and process content
const chunker = new TextChunker({
chunkSize,
overlap: chunkOverlap,
minChunkSize,
})
const parseResult = await parseDocument(fileUrl, filename, mimeType)
const { content, processingMethod } = parseResult
const cloudUrl = 'cloudUrl' in parseResult ? parseResult.cloudUrl : undefined
const chunker = new TextChunker({ chunkSize, overlap: chunkOverlap, minChunkSize })
const chunks = await chunker.chunk(content)
// Calculate metadata
const characterCount = content.length
const tokenCount = chunks.reduce((sum: number, chunk: Chunk) => sum + chunk.tokenCount, 0)
const tokenCount = chunks.reduce((sum, chunk) => sum + chunk.tokenCount, 0)
logger.info(`Document processed successfully: ${chunks.length} chunks, ${tokenCount} tokens`)
logger.info(`Document processed: ${chunks.length} chunks, ${tokenCount} tokens`)
return {
chunks,
metadata: {
filename,
fileSize: content.length, // Using content length as file size approximation
fileSize: characterCount,
mimeType,
chunkCount: chunks.length,
tokenCount,
@@ -116,9 +105,6 @@ export async function processDocument(
}
}
/**
* Parse a document from a URL or file path
*/
async function parseDocument(
fileUrl: string,
filename: string,
@@ -128,283 +114,286 @@ async function parseDocument(
processingMethod: 'file-parser' | 'mistral-ocr'
cloudUrl?: string
}> {
const isPDF = mimeType === 'application/pdf'
const hasAzureMistralOCR =
env.AZURE_OPENAI_API_KEY && env.OCR_AZURE_ENDPOINT && env.OCR_AZURE_MODEL_NAME
const hasMistralOCR = env.MISTRAL_API_KEY
// Check Azure Mistral OCR configuration
if (isPDF && hasAzureMistralOCR) {
logger.info(`Using Azure Mistral OCR: ${filename}`)
return parseWithAzureMistralOCR(fileUrl, filename, mimeType)
}
if (isPDF && hasMistralOCR) {
logger.info(`Using Mistral OCR: ${filename}`)
return parseWithMistralOCR(fileUrl, filename, mimeType)
}
logger.info(`Using file parser: ${filename}`)
return parseWithFileParser(fileUrl, filename, mimeType)
}
async function handleFileForOCR(fileUrl: string, filename: string, mimeType: string) {
if (fileUrl.startsWith('https://')) {
return { httpsUrl: fileUrl }
}
logger.info(`Uploading "${filename}" to cloud storage for OCR`)
const buffer = await downloadFileWithTimeout(fileUrl)
const kbConfig = getKBConfig()
validateCloudConfig(kbConfig)
try {
const cloudResult = await uploadFile(buffer, filename, mimeType, kbConfig as any)
const httpsUrl = await getPresignedUrlWithConfig(cloudResult.key, kbConfig as any, 900)
logger.info(`Successfully uploaded for OCR: ${cloudResult.key}`)
return { httpsUrl, cloudUrl: httpsUrl }
} catch (uploadError) {
const message = uploadError instanceof Error ? uploadError.message : 'Unknown error'
throw new Error(`Cloud upload failed: ${message}. Cloud upload is required for OCR.`)
}
}
async function downloadFileWithTimeout(fileUrl: string): Promise<Buffer> {
const controller = new AbortController()
const timeoutId = setTimeout(() => controller.abort(), TIMEOUTS.FILE_DOWNLOAD)
try {
const response = await fetch(fileUrl, { signal: controller.signal })
clearTimeout(timeoutId)
if (!response.ok) {
throw new Error(`Failed to download file: ${response.statusText}`)
}
return Buffer.from(await response.arrayBuffer())
} catch (error) {
clearTimeout(timeoutId)
if (error instanceof Error && error.name === 'AbortError') {
throw new Error('File download timed out')
}
throw error
}
}
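`downloadFileWithTimeout` above uses the AbortController pattern: start a timer that aborts the in-flight request, map the resulting `AbortError` to a friendlier message, and always clear the timer. A generic sketch of that pattern, decoupled from `fetch` so it is self-contained; `withAbortTimeout` is an illustrative name:

```typescript
// Generic sketch of the AbortController timeout pattern used above.
// `withAbortTimeout` is a hypothetical helper, not a codebase function.
async function withAbortTimeout<T>(
  run: (signal: AbortSignal) => Promise<T>,
  timeoutMs: number,
  timeoutMessage: string
): Promise<T> {
  const controller = new AbortController()
  const timeoutId = setTimeout(() => controller.abort(), timeoutMs)
  try {
    // The callee must observe the signal (fetch does this natively).
    return await run(controller.signal)
  } catch (error) {
    if (error instanceof Error && error.name === 'AbortError') {
      throw new Error(timeoutMessage)
    }
    throw error
  } finally {
    clearTimeout(timeoutId)
  }
}
```

The `finally` block makes the timer cleanup unconditional, which the hand-rolled version above has to repeat in both the success and error paths.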
async function downloadFileForBase64(fileUrl: string): Promise<Buffer> {
// Handle different URL types for Azure Mistral OCR base64 requirement
if (fileUrl.startsWith('data:')) {
// Extract base64 data from data URI
const [, base64Data] = fileUrl.split(',')
if (!base64Data) {
throw new Error('Invalid data URI format')
}
return Buffer.from(base64Data, 'base64')
}
if (fileUrl.startsWith('http')) {
// Download from HTTP(S) URL
return downloadFileWithTimeout(fileUrl)
}
// Local file - read it
const fs = await import('fs/promises')
return fs.readFile(fileUrl)
}
function validateCloudConfig(kbConfig: S3Config | BlobConfig) {
const provider = getStorageProvider()
if (provider === 'blob') {
const config = kbConfig as BlobConfig
if (
!config.containerName ||
(!config.connectionString && (!config.accountName || !config.accountKey))
) {
throw new Error(
'Azure Blob configuration missing. Set AZURE_CONNECTION_STRING or AZURE_ACCOUNT_NAME + AZURE_ACCOUNT_KEY + AZURE_KB_CONTAINER_NAME'
)
}
} else {
const config = kbConfig as S3Config
if (!config.bucket || !config.region) {
throw new Error('S3 configuration missing. Set AWS_REGION and S3_KB_BUCKET_NAME')
}
}
}
function processOCRContent(result: any, filename: string): string {
if (!result.success) {
throw new Error(`OCR processing failed: ${result.error || 'Unknown error'}`)
}
const content = result.output?.content || ''
if (!content.trim()) {
throw new Error('OCR returned empty content')
}
logger.info(`OCR completed: ${filename}`)
return content
}
function validateOCRConfig(
apiKey?: string,
endpoint?: string,
modelName?: string,
service = 'OCR'
) {
if (!apiKey) throw new Error(`${service} API key required`)
if (!endpoint) throw new Error(`${service} endpoint required`)
if (!modelName) throw new Error(`${service} model name required`)
}
function extractPageContent(pages: any[]): string {
if (!pages?.length) return ''
return pages
.map((page) => page?.markdown || '')
.filter(Boolean)
.join('\n\n')
}
async function makeOCRRequest(endpoint: string, headers: Record<string, string>, body: any) {
const controller = new AbortController()
const timeoutId = setTimeout(() => controller.abort(), TIMEOUTS.MISTRAL_OCR_API)
try {
const response = await fetch(endpoint, {
method: 'POST',
headers,
body: JSON.stringify(body),
signal: controller.signal,
})
clearTimeout(timeoutId)
if (!response.ok) {
const errorText = await response.text()
throw new APIError(
`OCR failed: ${response.status} ${response.statusText} - ${errorText}`,
response.status
)
}
return response
} catch (error) {
clearTimeout(timeoutId)
if (error instanceof Error && error.name === 'AbortError') {
throw new Error('OCR API request timed out')
}
throw error
}
}
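The OCR calls below wrap `makeOCRRequest` in `retryWithExponentialBackoff` with `{ maxRetries: 3, initialDelayMs: 1000, maxDelayMs: 10000 }`. That helper is imported from elsewhere in the codebase, so the following is only a hedged sketch of what its call sites imply, with names and option shapes assumed from usage:

```typescript
// Hypothetical sketch of retryWithExponentialBackoff, inferred from its call
// sites below; the real implementation lives elsewhere and may differ.
async function retryWithExponentialBackoff<T>(
  fn: () => Promise<T>,
  opts: { maxRetries: number; initialDelayMs: number; maxDelayMs: number }
): Promise<T> {
  let delay = opts.initialDelayMs
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn()
    } catch (error) {
      if (attempt >= opts.maxRetries) throw error // retries exhausted
      await new Promise((resolve) => setTimeout(resolve, delay))
      delay = Math.min(delay * 2, opts.maxDelayMs) // double, capped at maxDelayMs
    }
  }
}
```

The cap keeps worst-case waits bounded while the doubling backs off quickly from transient OCR API failures.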
async function parseWithAzureMistralOCR(fileUrl: string, filename: string, mimeType: string) {
validateOCRConfig(
env.AZURE_OPENAI_API_KEY,
env.OCR_AZURE_ENDPOINT,
env.OCR_AZURE_MODEL_NAME,
'Azure Mistral OCR'
)
// Azure Mistral OCR accepts data URIs with base64 content
const fileBuffer = await downloadFileForBase64(fileUrl)
const base64Data = fileBuffer.toString('base64')
const dataUri = `data:${mimeType};base64,${base64Data}`
try {
const response = await retryWithExponentialBackoff(
() =>
makeOCRRequest(
env.OCR_AZURE_ENDPOINT!,
{
'Content-Type': 'application/json',
Authorization: `Bearer ${env.AZURE_OPENAI_API_KEY}`,
},
{
model: env.OCR_AZURE_MODEL_NAME,
document: {
type: 'document_url',
document_url: dataUri,
},
include_image_base64: false,
}
),
{ maxRetries: 3, initialDelayMs: 1000, maxDelayMs: 10000 }
)
const ocrResult = await response.json()
const content = extractPageContent(ocrResult.pages) || JSON.stringify(ocrResult, null, 2)
if (!content.trim()) {
throw new Error('Azure Mistral OCR returned empty content')
}
logger.info(`Azure Mistral OCR completed: ${filename}`)
return { content, processingMethod: 'mistral-ocr' as const, cloudUrl: undefined }
} catch (error) {
logger.error(`Azure Mistral OCR failed for ${filename}:`, {
message: error instanceof Error ? error.message : String(error),
})
return env.MISTRAL_API_KEY
? parseWithMistralOCR(fileUrl, filename, mimeType)
: parseWithFileParser(fileUrl, filename, mimeType)
}
}
async function parseWithMistralOCR(fileUrl: string, filename: string, mimeType: string) {
if (!env.MISTRAL_API_KEY) {
throw new Error('Mistral API key required')
}
if (!mistralParserTool.request?.body) {
throw new Error('Mistral parser tool not configured')
}
const { httpsUrl, cloudUrl } = await handleFileForOCR(fileUrl, filename, mimeType)
const params = { filePath: httpsUrl, apiKey: env.MISTRAL_API_KEY, resultType: 'text' as const }
try {
const response = await retryWithExponentialBackoff(
async () => {
const url =
typeof mistralParserTool.request!.url === 'function'
? mistralParserTool.request!.url(params)
: mistralParserTool.request!.url
const headers =
typeof mistralParserTool.request!.headers === 'function'
? mistralParserTool.request!.headers(params)
: mistralParserTool.request!.headers
const requestBody = mistralParserTool.request!.body!(params)
return makeOCRRequest(url, headers as Record<string, string>, requestBody)
},
{ maxRetries: 3, initialDelayMs: 1000, maxDelayMs: 10000 }
)
const result = await mistralParserTool.transformResponse!(response, params)
const content = processOCRContent(result, filename)
return { content, processingMethod: 'mistral-ocr' as const, cloudUrl }
} catch (error) {
logger.error(`Mistral OCR failed for ${filename}:`, {
message: error instanceof Error ? error.message : String(error),
stack: error instanceof Error ? error.stack : undefined,
name: error instanceof Error ? error.name : 'Unknown',
})
logger.info(`Falling back to file parser: ${filename}`)
return parseWithFileParser(fileUrl, filename, mimeType)
}
}
async function parseWithFileParser(fileUrl: string, filename: string, mimeType: string) {
try {
let content: string
if (fileUrl.startsWith('data:')) {
content = await parseDataURI(fileUrl, filename, mimeType)
} else if (fileUrl.startsWith('http')) {
content = await parseHttpFile(fileUrl, filename)
} else {
// Parse local file
const result = await parseFile(fileUrl)
content = result.content
}
@@ -413,12 +402,39 @@ async function parseWithFileParser(
throw new Error('File parser returned empty content')
}
return { content, processingMethod: 'file-parser' as const, cloudUrl: undefined }
} catch (error) {
logger.error(`File parser failed for ${filename}:`, error)
throw error
}
}
async function parseDataURI(fileUrl: string, filename: string, mimeType: string): Promise<string> {
const [header, base64Data] = fileUrl.split(',')
if (!base64Data) {
throw new Error('Invalid data URI format')
}
if (mimeType === 'text/plain') {
return header.includes('base64')
? Buffer.from(base64Data, 'base64').toString('utf8')
: decodeURIComponent(base64Data)
}
const extension = filename.split('.').pop()?.toLowerCase() || 'txt'
const buffer = Buffer.from(base64Data, 'base64')
const result = await parseBuffer(buffer, extension)
return result.content
}
async function parseHttpFile(fileUrl: string, filename: string): Promise<string> {
const buffer = await downloadFileWithTimeout(fileUrl)
const extension = filename.split('.').pop()?.toLowerCase()
if (!extension) {
throw new Error(`Could not determine file extension: ${filename}`)
}
const result = await parseBuffer(buffer, extension)
return result.content
}
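The data-URI branch above splits on the first comma and decodes the payload according to the header. A minimal standalone sketch of that decoding logic (the helper name is illustrative; the real `parseDataURI` additionally routes non-plain-text payloads through `parseBuffer`):

```typescript
// Hypothetical standalone version of the data-URI decoding step.
function decodeDataUri(dataUri: string): string {
  const [header, payload] = dataUri.split(',')
  if (payload === undefined) {
    throw new Error('Invalid data URI format')
  }
  // base64 payloads are decoded via Buffer; otherwise the payload is
  // percent-encoded text.
  return header.includes('base64')
    ? Buffer.from(payload, 'base64').toString('utf8')
    : decodeURIComponent(payload)
}

console.log(decodeDataUri('data:text/plain;base64,SGVsbG8sIFNpbSE='))
```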

View File

@@ -1,6 +1,9 @@
import { beforeEach, describe, expect, it, type Mock, vi } from 'vitest'
const mockSend = vi.fn()
const mockBatchSend = vi.fn()
const mockAzureBeginSend = vi.fn()
const mockAzurePollUntilDone = vi.fn()
vi.mock('resend', () => {
return {
@@ -8,6 +11,17 @@ vi.mock('resend', () => {
emails: {
send: (...args: any[]) => mockSend(...args),
},
batch: {
send: (...args: any[]) => mockBatchSend(...args),
},
})),
}
})
vi.mock('@azure/communication-email', () => {
return {
EmailClient: vi.fn().mockImplementation(() => ({
beginSend: (...args: any[]) => mockAzureBeginSend(...args),
})),
}
})
@@ -20,7 +34,10 @@ vi.mock('@/lib/email/unsubscribe', () => ({
vi.mock('@/lib/env', () => ({
env: {
RESEND_API_KEY: 'test-api-key',
AZURE_ACS_CONNECTION_STRING: 'test-azure-connection-string',
AZURE_COMMUNICATION_EMAIL_DOMAIN: 'test.azurecomm.net',
NEXT_PUBLIC_APP_URL: 'https://test.sim.ai',
FROM_EMAIL_ADDRESS: 'Sim <noreply@sim.ai>',
},
}))
@@ -28,7 +45,7 @@ vi.mock('@/lib/urls/utils', () => ({
getEmailDomain: vi.fn().mockReturnValue('sim.ai'),
}))
import { type EmailType, sendBatchEmails, sendEmail } from '@/lib/email/mailer'
import { generateUnsubscribeToken, isUnsubscribed } from '@/lib/email/unsubscribe'
describe('mailer', () => {
@@ -42,10 +59,27 @@ describe('mailer', () => {
vi.clearAllMocks()
;(isUnsubscribed as Mock).mockResolvedValue(false)
;(generateUnsubscribeToken as Mock).mockReturnValue('mock-token-123')
// Mock successful Resend response
mockSend.mockResolvedValue({
data: { id: 'test-email-id' },
error: null,
})
mockBatchSend.mockResolvedValue({
data: [{ id: 'batch-email-1' }, { id: 'batch-email-2' }],
error: null,
})
// Mock successful Azure response
mockAzurePollUntilDone.mockResolvedValue({
status: 'Succeeded',
id: 'azure-email-id',
})
mockAzureBeginSend.mockReturnValue({
pollUntilDone: mockAzurePollUntilDone,
})
})
describe('sendEmail', () => {
@@ -56,7 +90,7 @@ describe('mailer', () => {
})
expect(result.success).toBe(true)
expect(result.message).toBe('Email sent successfully via Resend')
expect(result.data).toEqual({ id: 'test-email-id' })
// Should not check unsubscribe status for transactional emails
@@ -119,7 +153,8 @@ describe('mailer', () => {
expect(mockSend).not.toHaveBeenCalled()
})
it.concurrent('should handle Resend API errors and fallback to Azure', async () => {
// Mock Resend to fail
mockSend.mockResolvedValue({
data: null,
error: { message: 'API rate limit exceeded' },
@@ -127,17 +162,32 @@ describe('mailer', () => {
const result = await sendEmail(testEmailOptions)
expect(result.success).toBe(true)
expect(result.message).toBe('Email sent successfully via Azure Communication Services')
expect(result.data).toEqual({ id: 'azure-email-id' })
// Should have tried Resend first
expect(mockSend).toHaveBeenCalled()
// Should have fallen back to Azure
expect(mockAzureBeginSend).toHaveBeenCalled()
})
it.concurrent('should handle unexpected errors and fallback to Azure', async () => {
// Mock Resend to throw an error
mockSend.mockRejectedValue(new Error('Network error'))
const result = await sendEmail(testEmailOptions)
expect(result.success).toBe(true)
expect(result.message).toBe('Email sent successfully via Azure Communication Services')
expect(result.data).toEqual({ id: 'azure-email-id' })
// Should have tried Resend first
expect(mockSend).toHaveBeenCalled()
// Should have fallen back to Azure
expect(mockAzureBeginSend).toHaveBeenCalled()
})
it.concurrent('should use custom from address when provided', async () => {
@@ -148,7 +198,7 @@ describe('mailer', () => {
expect(mockSend).toHaveBeenCalledWith(
expect.objectContaining({
from: 'custom@example.com',
})
)
})
@@ -184,4 +234,117 @@ describe('mailer', () => {
)
})
})
describe('Azure Communication Services fallback', () => {
it('should fallback to Azure when Resend fails', async () => {
// Mock Resend to fail
mockSend.mockRejectedValue(new Error('Resend service unavailable'))
const result = await sendEmail({
...testEmailOptions,
emailType: 'transactional',
})
expect(result.success).toBe(true)
expect(result.message).toBe('Email sent successfully via Azure Communication Services')
expect(result.data).toEqual({ id: 'azure-email-id' })
// Should have tried Resend first
expect(mockSend).toHaveBeenCalled()
// Should have fallen back to Azure
expect(mockAzureBeginSend).toHaveBeenCalledWith({
senderAddress: 'noreply@sim.ai',
content: {
subject: testEmailOptions.subject,
html: testEmailOptions.html,
},
recipients: {
to: [{ address: testEmailOptions.to }],
},
headers: {},
})
})
it('should handle Azure Communication Services failure', async () => {
// Mock both services to fail
mockSend.mockRejectedValue(new Error('Resend service unavailable'))
mockAzurePollUntilDone.mockResolvedValue({
status: 'Failed',
id: 'failed-id',
})
const result = await sendEmail({
...testEmailOptions,
emailType: 'transactional',
})
expect(result.success).toBe(false)
expect(result.message).toBe('Both Resend and Azure Communication Services failed')
// Should have tried both services
expect(mockSend).toHaveBeenCalled()
expect(mockAzureBeginSend).toHaveBeenCalled()
})
})
describe('sendBatchEmails', () => {
const testBatchEmails = [
{ ...testEmailOptions, to: 'user1@example.com' },
{ ...testEmailOptions, to: 'user2@example.com' },
]
it('should send batch emails via Resend successfully', async () => {
const result = await sendBatchEmails({ emails: testBatchEmails })
expect(result.success).toBe(true)
expect(result.message).toBe('All batch emails sent successfully via Resend')
expect(result.results).toHaveLength(2)
expect(mockBatchSend).toHaveBeenCalled()
})
it('should fallback to individual sends when Resend batch fails', async () => {
// Mock Resend batch to fail
mockBatchSend.mockRejectedValue(new Error('Batch service unavailable'))
const result = await sendBatchEmails({ emails: testBatchEmails })
expect(result.success).toBe(true)
expect(result.message).toBe('All batch emails sent successfully')
expect(result.results).toHaveLength(2)
// Should have tried Resend batch first
expect(mockBatchSend).toHaveBeenCalled()
// Should have fallen back to individual sends (which will use Resend since it's available)
expect(mockSend).toHaveBeenCalledTimes(2)
})
it('should handle mixed success/failure in individual fallback', async () => {
// Mock Resend batch to fail
mockBatchSend.mockRejectedValue(new Error('Batch service unavailable'))
// Mock first individual send to succeed, second to fail and Azure also fails
mockSend
.mockResolvedValueOnce({
data: { id: 'email-1' },
error: null,
})
.mockRejectedValueOnce(new Error('Individual send failure'))
// Mock Azure to fail for the second email (first call succeeds, but second fails)
mockAzurePollUntilDone.mockResolvedValue({
status: 'Failed',
id: 'failed-id',
})
const result = await sendBatchEmails({ emails: testBatchEmails })
expect(result.success).toBe(false)
expect(result.message).toBe('1/2 emails sent successfully')
expect(result.results).toHaveLength(2)
expect(result.results[0].success).toBe(true)
expect(result.results[1].success).toBe(false)
})
})
})

View File

@@ -1,64 +1,87 @@
import { EmailClient, type EmailMessage } from '@azure/communication-email'
import { Resend } from 'resend'
import { generateUnsubscribeToken, isUnsubscribed } from '@/lib/email/unsubscribe'
import { getFromEmailAddress } from '@/lib/email/utils'
import { env } from '@/lib/env'
import { createLogger } from '@/lib/logs/console/logger'
import { getEmailDomain } from '@/lib/urls/utils'
const logger = createLogger('Mailer')
export type EmailType = 'transactional' | 'marketing' | 'updates' | 'notifications'
export interface EmailAttachment {
filename: string
content: string | Buffer
contentType: string
disposition?: 'attachment' | 'inline'
}
export interface EmailOptions {
to: string | string[]
subject: string
html?: string
text?: string
from?: string
emailType?: EmailType
includeUnsubscribe?: boolean
attachments?: EmailAttachment[]
replyTo?: string
}
export interface BatchEmailOptions {
emails: EmailOptions[]
}
export interface SendEmailResult {
success: boolean
message: string
data?: any
}
export interface BatchSendEmailResult {
success: boolean
message: string
results: SendEmailResult[]
data?: any
}
interface ProcessedEmailData {
to: string | string[]
subject: string
html?: string
text?: string
senderEmail: string
headers: Record<string, string>
attachments?: EmailAttachment[]
replyTo?: string
}
const resendApiKey = env.RESEND_API_KEY
const azureConnectionString = env.AZURE_ACS_CONNECTION_STRING
const resend =
resendApiKey && resendApiKey !== 'placeholder' && resendApiKey.trim() !== ''
? new Resend(resendApiKey)
: null
const azureEmailClient =
azureConnectionString && azureConnectionString.trim() !== ''
? new EmailClient(azureConnectionString)
: null
export async function sendEmail(options: EmailOptions): Promise<SendEmailResult> {
try {
// Check if user has unsubscribed (skip for critical transactional emails)
if (options.emailType !== 'transactional') {
const unsubscribeType = options.emailType as 'marketing' | 'updates' | 'notifications'
// For arrays, check the first email address (batch emails typically go to similar recipients)
const primaryEmail = Array.isArray(options.to) ? options.to[0] : options.to
const hasUnsubscribed = await isUnsubscribed(primaryEmail, unsubscribeType)
if (hasUnsubscribed) {
logger.info('Email not sent (user unsubscribed):', {
to: options.to,
subject: options.subject,
emailType: options.emailType,
})
return {
success: true,
@@ -68,56 +91,41 @@ export async function sendEmail({
}
}
// Process email data with unsubscribe tokens and headers
const processedData = await processEmailData(options)
// Try Resend first if configured
if (resend) {
try {
return await sendWithResend(processedData)
} catch (error) {
logger.warn('Resend failed, attempting Azure Communication Services fallback:', error)
}
}
// Fallback to Azure Communication Services if configured
if (azureEmailClient) {
try {
return await sendWithAzure(processedData)
} catch (error) {
logger.error('Azure Communication Services also failed:', error)
return {
success: false,
message: 'Both Resend and Azure Communication Services failed',
}
}
}
// No email service configured
logger.info('Email not sent (no email service configured):', {
to: options.to,
subject: options.subject,
from: processedData.senderEmail,
})
return {
success: true,
message: 'Email logging successful (no email service configured)',
data: { id: 'mock-email-id' },
}
} catch (error) {
logger.error('Error sending email:', error)
@@ -128,183 +136,175 @@ export async function sendEmail({
}
}
async function processEmailData(options: EmailOptions): Promise<ProcessedEmailData> {
const {
to,
subject,
html,
text,
from,
emailType = 'transactional',
includeUnsubscribe = true,
attachments,
replyTo,
} = options
const senderEmail = from || getFromEmailAddress()
// Generate unsubscribe token and add to content
let finalHtml = html
let finalText = text
const headers: Record<string, string> = {}
if (includeUnsubscribe && emailType !== 'transactional') {
// For arrays, use the first email for unsubscribe (batch emails typically go to similar recipients)
const primaryEmail = Array.isArray(to) ? to[0] : to
const unsubscribeToken = generateUnsubscribeToken(primaryEmail, emailType)
const baseUrl = env.NEXT_PUBLIC_APP_URL || 'https://sim.ai'
const unsubscribeUrl = `${baseUrl}/unsubscribe?token=${unsubscribeToken}&email=${encodeURIComponent(primaryEmail)}`
headers['List-Unsubscribe'] = `<${unsubscribeUrl}>`
headers['List-Unsubscribe-Post'] = 'List-Unsubscribe=One-Click'
if (html) {
finalHtml = html.replace(/\{\{UNSUBSCRIBE_TOKEN\}\}/g, unsubscribeToken)
}
if (text) {
finalText = text.replace(/\{\{UNSUBSCRIBE_TOKEN\}\}/g, unsubscribeToken)
}
}
return {
to,
subject,
html: finalHtml,
text: finalText,
senderEmail,
headers,
attachments,
replyTo,
}
}
async function sendWithResend(data: ProcessedEmailData): Promise<SendEmailResult> {
if (!resend) throw new Error('Resend not configured')
const fromAddress = data.senderEmail
const emailData: any = {
from: fromAddress,
to: data.to,
subject: data.subject,
headers: Object.keys(data.headers).length > 0 ? data.headers : undefined,
}
if (data.html) emailData.html = data.html
if (data.text) emailData.text = data.text
if (data.replyTo) emailData.replyTo = data.replyTo
if (data.attachments) {
emailData.attachments = data.attachments.map((att) => ({
filename: att.filename,
content: typeof att.content === 'string' ? att.content : att.content.toString('base64'),
contentType: att.contentType,
disposition: att.disposition || 'attachment',
}))
}
const { data: responseData, error } = await resend.emails.send(emailData)
if (error) {
throw new Error(error.message || 'Failed to send email via Resend')
}
return {
success: true,
message: 'Email sent successfully via Resend',
data: responseData,
}
}
async function sendWithAzure(data: ProcessedEmailData): Promise<SendEmailResult> {
if (!azureEmailClient) throw new Error('Azure Communication Services not configured')
// Azure Communication Services requires at least one content type
if (!data.html && !data.text) {
throw new Error('Azure Communication Services requires either HTML or text content')
}
// For Azure, use just the email address part (no display name)
// Azure will use the display name configured in the portal for the sender address
const senderEmailOnly = data.senderEmail.includes('<')
? data.senderEmail.match(/<(.+)>/)?.[1] || data.senderEmail
: data.senderEmail
const message: EmailMessage = {
senderAddress: senderEmailOnly,
content: data.html
? {
subject: data.subject,
html: data.html,
}
: {
subject: data.subject,
plainText: data.text!,
},
recipients: {
to: Array.isArray(data.to)
? data.to.map((email) => ({ address: email }))
: [{ address: data.to }],
},
headers: data.headers,
}
const poller = await azureEmailClient.beginSend(message)
const result = await poller.pollUntilDone()
if (result.status === 'Succeeded') {
return {
success: true,
message: 'Email sent successfully via Azure Communication Services',
data: { id: result.id },
}
}
throw new Error(`Azure Communication Services failed with status: ${result.status}`)
}
export async function sendBatchEmails(options: BatchEmailOptions): Promise<BatchSendEmailResult> {
try {
const results: SendEmailResult[] = []
// Try Resend first for batch emails if available
if (resend) {
try {
return await sendBatchWithResend(options.emails)
} catch (error) {
logger.warn('Resend batch failed, falling back to individual sends:', error)
}
}
// Fallback to individual sends (works with both Azure and Resend)
logger.info('Sending batch emails individually')
for (const email of options.emails) {
try {
const result = await sendEmail(email)
results.push(result)
} catch (error) {
results.push({
success: false,
message: error instanceof Error ? error.message : 'Failed to send email',
})
}
}
const successCount = results.filter((r) => r.success).length
return {
success: successCount === results.length,
message:
successCount === results.length
? 'All batch emails sent successfully'
: `${successCount}/${results.length} emails sent successfully`,
results,
data: { count: successCount },
}
} catch (error) {
logger.error('Error in batch email sending:', error)
@@ -315,3 +315,47 @@ export async function sendBatchEmails({
}
}
}
async function sendBatchWithResend(emails: EmailOptions[]): Promise<BatchSendEmailResult> {
if (!resend) throw new Error('Resend not configured')
const results: SendEmailResult[] = []
const batchEmails = emails.map((email) => {
const senderEmail = email.from || getFromEmailAddress()
const emailData: any = {
from: senderEmail,
to: email.to,
subject: email.subject,
}
if (email.html) emailData.html = email.html
if (email.text) emailData.text = email.text
return emailData
})
try {
const response = await resend.batch.send(batchEmails as any)
if (response.error) {
throw new Error(response.error.message || 'Resend batch API error')
}
// Success - create results for each email
batchEmails.forEach((_, index) => {
results.push({
success: true,
message: 'Email sent successfully via Resend batch',
data: { id: `batch-${index}` },
})
})
return {
success: true,
message: 'All batch emails sent successfully via Resend',
results,
data: { count: results.length },
}
} catch (error) {
logger.error('Resend batch send failed:', error)
throw error // Let the caller handle fallback
}
}
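The control flow in `sendEmail` above is a provider-fallback chain: try Resend, fall back to Azure, and log-only when neither is configured. A minimal sketch of that pattern in isolation (names and the exact messages here are illustrative, not the real mailer API):

```typescript
// Hypothetical distillation of the Resend -> Azure fallback in sendEmail.
type SendResult = { success: boolean; message: string }

async function sendWithFallback(
  primary: (() => Promise<SendResult>) | null,
  secondary: (() => Promise<SendResult>) | null
): Promise<SendResult> {
  if (primary) {
    try {
      return await primary()
    } catch {
      // Primary provider failed; fall through to the secondary.
    }
  }
  if (secondary) {
    try {
      return await secondary()
    } catch {
      return { success: false, message: 'Both providers failed' }
    }
  }
  // Mirrors the mailer's log-only behavior when no service is configured.
  return { success: true, message: 'No provider configured (logged only)' }
}
```

Note the asymmetry the tests rely on: a primary failure is silent (only a warning in the real code), while a secondary failure surfaces as an overall failure.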

View File

@@ -0,0 +1,140 @@
import { beforeEach, describe, expect, it, vi } from 'vitest'
// Mock the env module
vi.mock('@/lib/env', () => ({
env: {
FROM_EMAIL_ADDRESS: undefined,
EMAIL_DOMAIN: undefined,
},
}))
// Mock the getEmailDomain function
vi.mock('@/lib/urls/utils', () => ({
getEmailDomain: vi.fn().mockReturnValue('fallback.com'),
}))
describe('getFromEmailAddress', () => {
beforeEach(() => {
// Reset mocks before each test
vi.resetModules()
})
it('should return FROM_EMAIL_ADDRESS when set', async () => {
// Mock env with FROM_EMAIL_ADDRESS
vi.doMock('@/lib/env', () => ({
env: {
FROM_EMAIL_ADDRESS: 'Sim <noreply@sim.ai>',
EMAIL_DOMAIN: 'example.com',
},
}))
const { getFromEmailAddress } = await import('./utils')
const result = getFromEmailAddress()
expect(result).toBe('Sim <noreply@sim.ai>')
})
it('should return simple email format when FROM_EMAIL_ADDRESS is set without display name', async () => {
vi.doMock('@/lib/env', () => ({
env: {
FROM_EMAIL_ADDRESS: 'noreply@sim.ai',
EMAIL_DOMAIN: 'example.com',
},
}))
const { getFromEmailAddress } = await import('./utils')
const result = getFromEmailAddress()
expect(result).toBe('noreply@sim.ai')
})
it('should return Azure ACS format when FROM_EMAIL_ADDRESS is set', async () => {
vi.doMock('@/lib/env', () => ({
env: {
FROM_EMAIL_ADDRESS: 'DoNotReply@customer.azurecomm.net',
EMAIL_DOMAIN: 'example.com',
},
}))
const { getFromEmailAddress } = await import('./utils')
const result = getFromEmailAddress()
expect(result).toBe('DoNotReply@customer.azurecomm.net')
})
it('should construct from EMAIL_DOMAIN when FROM_EMAIL_ADDRESS is not set', async () => {
vi.doMock('@/lib/env', () => ({
env: {
FROM_EMAIL_ADDRESS: undefined,
EMAIL_DOMAIN: 'example.com',
},
}))
const { getFromEmailAddress } = await import('./utils')
const result = getFromEmailAddress()
expect(result).toBe('noreply@example.com')
})
it('should use getEmailDomain fallback when both FROM_EMAIL_ADDRESS and EMAIL_DOMAIN are not set', async () => {
vi.doMock('@/lib/env', () => ({
env: {
FROM_EMAIL_ADDRESS: undefined,
EMAIL_DOMAIN: undefined,
},
}))
const mockGetEmailDomain = vi.fn().mockReturnValue('fallback.com')
vi.doMock('@/lib/urls/utils', () => ({
getEmailDomain: mockGetEmailDomain,
}))
const { getFromEmailAddress } = await import('./utils')
const result = getFromEmailAddress()
expect(result).toBe('noreply@fallback.com')
expect(mockGetEmailDomain).toHaveBeenCalled()
})
it('should prioritize FROM_EMAIL_ADDRESS over EMAIL_DOMAIN when both are set', async () => {
vi.doMock('@/lib/env', () => ({
env: {
FROM_EMAIL_ADDRESS: 'Custom <custom@custom.com>',
EMAIL_DOMAIN: 'ignored.com',
},
}))
const { getFromEmailAddress } = await import('./utils')
const result = getFromEmailAddress()
expect(result).toBe('Custom <custom@custom.com>')
})
it('should handle empty string FROM_EMAIL_ADDRESS by falling back to EMAIL_DOMAIN', async () => {
vi.doMock('@/lib/env', () => ({
env: {
FROM_EMAIL_ADDRESS: '',
EMAIL_DOMAIN: 'fallback.com',
},
}))
const { getFromEmailAddress } = await import('./utils')
const result = getFromEmailAddress()
expect(result).toBe('noreply@fallback.com')
})
it('should handle whitespace-only FROM_EMAIL_ADDRESS by falling back to EMAIL_DOMAIN', async () => {
vi.doMock('@/lib/env', () => ({
env: {
FROM_EMAIL_ADDRESS: ' ',
EMAIL_DOMAIN: 'fallback.com',
},
}))
const { getFromEmailAddress } = await import('./utils')
const result = getFromEmailAddress()
expect(result).toBe('noreply@fallback.com')
})
})

View File

@@ -0,0 +1,13 @@
import { env } from '@/lib/env'
import { getEmailDomain } from '@/lib/urls/utils'
/**
* Get the from email address, preferring FROM_EMAIL_ADDRESS over EMAIL_DOMAIN
*/
export function getFromEmailAddress(): string {
if (env.FROM_EMAIL_ADDRESS?.trim()) {
return env.FROM_EMAIL_ADDRESS
}
// Fallback to constructing from EMAIL_DOMAIN
return `noreply@${env.EMAIL_DOMAIN || getEmailDomain()}`
}
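The precedence the tests above exercise can be shown with the env passed in explicitly (a sketch; `resolveFromAddress` and `EmailEnv` are illustrative names, not part of the module):

```typescript
// Hypothetical standalone version of getFromEmailAddress's precedence:
// FROM_EMAIL_ADDRESS (if non-blank) > EMAIL_DOMAIN > fallback domain.
interface EmailEnv {
  FROM_EMAIL_ADDRESS?: string
  EMAIL_DOMAIN?: string
}

function resolveFromAddress(env: EmailEnv, fallbackDomain: string): string {
  if (env.FROM_EMAIL_ADDRESS?.trim()) {
    return env.FROM_EMAIL_ADDRESS
  }
  return `noreply@${env.EMAIL_DOMAIN || fallbackDomain}`
}
```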

View File

@@ -0,0 +1,148 @@
import { isRetryableError, retryWithExponentialBackoff } from '@/lib/documents/utils'
import { env } from '@/lib/env'
import { createLogger } from '@/lib/logs/console/logger'
const logger = createLogger('EmbeddingUtils')
export class EmbeddingAPIError extends Error {
public status: number
constructor(message: string, status: number) {
super(message)
this.name = 'EmbeddingAPIError'
this.status = status
}
}
interface EmbeddingConfig {
useAzure: boolean
apiUrl: string
headers: Record<string, string>
modelName: string
}
function getEmbeddingConfig(embeddingModel = 'text-embedding-3-small'): EmbeddingConfig {
const azureApiKey = env.AZURE_OPENAI_API_KEY
const azureEndpoint = env.AZURE_OPENAI_ENDPOINT
const azureApiVersion = env.AZURE_OPENAI_API_VERSION
const kbModelName = env.KB_OPENAI_MODEL_NAME || embeddingModel
const openaiApiKey = env.OPENAI_API_KEY
const useAzure = !!(azureApiKey && azureEndpoint)
if (!useAzure && !openaiApiKey) {
throw new Error(
'Either OPENAI_API_KEY or Azure OpenAI configuration (AZURE_OPENAI_API_KEY + AZURE_OPENAI_ENDPOINT) must be configured'
)
}
const apiUrl = useAzure
? `${azureEndpoint}/openai/deployments/${kbModelName}/embeddings?api-version=${azureApiVersion}`
: 'https://api.openai.com/v1/embeddings'
const headers: Record<string, string> = useAzure
? {
'api-key': azureApiKey!,
'Content-Type': 'application/json',
}
: {
Authorization: `Bearer ${openaiApiKey!}`,
'Content-Type': 'application/json',
}
return {
useAzure,
apiUrl,
headers,
modelName: useAzure ? kbModelName : embeddingModel,
}
}
async function callEmbeddingAPI(inputs: string[], config: EmbeddingConfig): Promise<number[][]> {
return retryWithExponentialBackoff(
async () => {
const requestBody = config.useAzure
? {
input: inputs,
encoding_format: 'float',
}
: {
input: inputs,
model: config.modelName,
encoding_format: 'float',
}
const response = await fetch(config.apiUrl, {
method: 'POST',
headers: config.headers,
body: JSON.stringify(requestBody),
})
if (!response.ok) {
const errorText = await response.text()
throw new EmbeddingAPIError(
`Embedding API failed: ${response.status} ${response.statusText} - ${errorText}`,
response.status
)
}
const data = await response.json()
return data.data.map((item: any) => item.embedding)
},
{
maxRetries: 3,
initialDelayMs: 1000,
maxDelayMs: 10000,
retryCondition: (error: any) => {
if (error instanceof EmbeddingAPIError) {
return error.status === 429 || error.status >= 500
}
return isRetryableError(error)
},
}
)
}
/**
* Generate embeddings for multiple texts with batching
*/
export async function generateEmbeddings(
texts: string[],
embeddingModel = 'text-embedding-3-small'
): Promise<number[][]> {
const config = getEmbeddingConfig(embeddingModel)
logger.info(`Using ${config.useAzure ? 'Azure OpenAI' : 'OpenAI'} for embeddings generation`)
const batchSize = 100
const allEmbeddings: number[][] = []
for (let i = 0; i < texts.length; i += batchSize) {
const batch = texts.slice(i, i + batchSize)
const batchEmbeddings = await callEmbeddingAPI(batch, config)
allEmbeddings.push(...batchEmbeddings)
logger.info(
`Generated embeddings for batch ${Math.floor(i / batchSize) + 1}/${Math.ceil(texts.length / batchSize)}`
)
}
return allEmbeddings
}
/**
* Generate embedding for a single search query
*/
export async function generateSearchEmbedding(
query: string,
embeddingModel = 'text-embedding-3-small'
): Promise<number[]> {
const config = getEmbeddingConfig(embeddingModel)
logger.info(
`Using ${config.useAzure ? 'Azure OpenAI' : 'OpenAI'} for search embedding generation`
)
const embeddings = await callEmbeddingAPI([query], config)
return embeddings[0]
}

View File

@@ -49,7 +49,9 @@ export const env = createEnv({
// Email & Communication
RESEND_API_KEY: z.string().min(1).optional(), // Resend API key for transactional emails
EMAIL_DOMAIN: z.string().min(1).optional(), // Domain for sending emails
FROM_EMAIL_ADDRESS: z.string().min(1).optional(), // Complete from address (e.g., "Sim <noreply@domain.com>" or "noreply@domain.com")
EMAIL_DOMAIN: z.string().min(1).optional(), // Domain for sending emails (fallback when FROM_EMAIL_ADDRESS not set)
AZURE_ACS_CONNECTION_STRING: z.string().optional(), // Azure Communication Services connection string
// AI/LLM Provider API Keys
OPENAI_API_KEY: z.string().min(1).optional(), // Primary OpenAI API key
@@ -64,9 +66,14 @@ export const env = createEnv({
ELEVENLABS_API_KEY: z.string().min(1).optional(), // ElevenLabs API key for text-to-speech in deployed chat
SERPER_API_KEY: z.string().min(1).optional(), // Serper API key for online search
// Azure OpenAI Configuration
AZURE_OPENAI_ENDPOINT: z.string().url().optional(), // Azure OpenAI service endpoint
AZURE_OPENAI_API_VERSION: z.string().optional(), // Azure OpenAI API version
// Azure Configuration - Shared credentials with feature-specific models
AZURE_OPENAI_ENDPOINT: z.string().url().optional(), // Shared Azure OpenAI service endpoint
AZURE_OPENAI_API_VERSION: z.string().optional(), // Shared Azure OpenAI API version
AZURE_OPENAI_API_KEY: z.string().min(1).optional(), // Shared Azure OpenAI API key
KB_OPENAI_MODEL_NAME: z.string().optional(), // Knowledge base OpenAI model name (works with both regular OpenAI and Azure OpenAI)
WAND_OPENAI_MODEL_NAME: z.string().optional(), // Wand generation OpenAI model name (works with both regular OpenAI and Azure OpenAI)
OCR_AZURE_ENDPOINT: z.string().url().optional(), // Azure Mistral OCR service endpoint
OCR_AZURE_MODEL_NAME: z.string().optional(), // Azure Mistral OCR model name for document processing
// Monitoring & Analytics
TELEMETRY_ENDPOINT: z.string().url().optional(), // Custom telemetry/analytics endpoint

View File

@@ -0,0 +1,43 @@
/**
* Theme synchronization utilities for managing theme across next-themes and database
*/
/**
* Updates the theme in next-themes by dispatching a storage event
* This works by updating localStorage and notifying next-themes of the change
*/
export function syncThemeToNextThemes(theme: 'system' | 'light' | 'dark') {
if (typeof window === 'undefined') return
// Update localStorage
localStorage.setItem('sim-theme', theme)
// Dispatch storage event to notify next-themes
window.dispatchEvent(
new StorageEvent('storage', {
key: 'sim-theme',
newValue: theme,
oldValue: localStorage.getItem('sim-theme'),
storageArea: localStorage,
url: window.location.href,
})
)
// Also update the HTML class immediately for instant feedback
const root = document.documentElement
const systemTheme = window.matchMedia('(prefers-color-scheme: dark)').matches ? 'dark' : 'light'
const actualTheme = theme === 'system' ? systemTheme : theme
// Remove existing theme classes
root.classList.remove('light', 'dark')
// Add new theme class
root.classList.add(actualTheme)
}
/**
* Gets the current theme from next-themes localStorage
*/
export function getThemeFromNextThemes(): 'system' | 'light' | 'dark' {
if (typeof window === 'undefined') return 'system'
return (localStorage.getItem('sim-theme') as 'system' | 'light' | 'dark') || 'system'
}
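The 'system' branch above resolves against prefers-color-scheme. The mapping from stored preference to the class actually applied is a pure function, sketched here:

```typescript
type ThemePreference = 'system' | 'light' | 'dark'

// Mirrors the actualTheme computation inside syncThemeToNextThemes.
function resolveAppliedTheme(preference: ThemePreference, systemPrefersDark: boolean): 'light' | 'dark' {
  if (preference === 'system') return systemPrefersDark ? 'dark' : 'light'
  return preference
}
```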

View File

@@ -1,13 +1,13 @@
import { and, eq } from 'drizzle-orm'
import { nanoid } from 'nanoid'
import { Logger } from '@/lib/logs/console/logger'
import { createLogger } from '@/lib/logs/console/logger'
import { hasProcessedMessage, markMessageAsProcessed } from '@/lib/redis'
import { getBaseUrl } from '@/lib/urls/utils'
import { getOAuthToken, refreshAccessTokenIfNeeded } from '@/app/api/auth/oauth/utils'
import { db } from '@/db'
import { account, webhook } from '@/db/schema'
const logger = new Logger('GmailPollingService')
const logger = createLogger('GmailPollingService')
interface GmailWebhookConfig {
labelIds: string[]

View File

@@ -1,13 +1,13 @@
import { and, eq } from 'drizzle-orm'
import { nanoid } from 'nanoid'
import { Logger } from '@/lib/logs/console/logger'
import { createLogger } from '@/lib/logs/console/logger'
import { hasProcessedMessage, markMessageAsProcessed } from '@/lib/redis'
import { getBaseUrl } from '@/lib/urls/utils'
import { getOAuthToken, refreshAccessTokenIfNeeded } from '@/app/api/auth/oauth/utils'
import { db } from '@/db'
import { account, webhook } from '@/db/schema'
const logger = new Logger('OutlookPollingService')
const logger = createLogger('OutlookPollingService')
interface OutlookWebhookConfig {
credentialId: string

View File

@@ -29,6 +29,7 @@
"@anthropic-ai/sdk": "^0.39.0",
"@aws-sdk/client-s3": "^3.779.0",
"@aws-sdk/s3-request-presigner": "^3.779.0",
"@azure/communication-email": "1.0.0",
"@azure/storage-blob": "12.27.0",
"@better-auth/stripe": "^1.2.9",
"@browserbasehq/stagehand": "^2.0.0",
@@ -67,7 +68,7 @@
"@radix-ui/react-tooltip": "^1.1.6",
"@react-email/components": "^0.0.34",
"@sentry/nextjs": "^9.15.0",
"@trigger.dev/sdk": "3.3.17",
"@trigger.dev/sdk": "4.0.0",
"@types/three": "0.177.0",
"@vercel/og": "^0.6.5",
"@vercel/speed-insights": "^1.2.0",
@@ -125,10 +126,11 @@
"zod": "^3.24.2"
},
"devDependencies": {
"@react-email/preview-server": "4.2.4",
"@testing-library/jest-dom": "^6.6.3",
"@testing-library/react": "^16.3.0",
"@testing-library/user-event": "^14.6.1",
"@trigger.dev/build": "3.3.17",
"@trigger.dev/build": "4.0.0",
"@types/js-yaml": "4.0.9",
"@types/jsdom": "21.1.7",
"@types/lodash": "^4.17.16",

View File

@@ -144,6 +144,10 @@ export const azureOpenAIProvider: ProviderConfig = {
if (request.temperature !== undefined) payload.temperature = request.temperature
if (request.maxTokens !== undefined) payload.max_tokens = request.maxTokens
// Add GPT-5 specific parameters
if (request.reasoningEffort !== undefined) payload.reasoning_effort = request.reasoningEffort
if (request.verbosity !== undefined) payload.verbosity = request.verbosity
// Add response format for structured output if specified
if (request.responseFormat) {
// Use Azure OpenAI's JSON schema format

View File

@@ -34,6 +34,12 @@ export interface ModelCapabilities {
}
toolUsageControl?: boolean
computerUse?: boolean
reasoningEffort?: {
values: string[]
}
verbosity?: {
values: string[]
}
}
export interface ModelDefinition {
@@ -87,6 +93,12 @@ export const PROVIDER_DEFINITIONS: Record<string, ProviderDefinition> = {
},
capabilities: {
toolUsageControl: true,
reasoningEffort: {
values: ['minimal', 'low', 'medium', 'high'],
},
verbosity: {
values: ['low', 'medium', 'high'],
},
},
},
{
@@ -99,6 +111,12 @@ export const PROVIDER_DEFINITIONS: Record<string, ProviderDefinition> = {
},
capabilities: {
toolUsageControl: true,
reasoningEffort: {
values: ['minimal', 'low', 'medium', 'high'],
},
verbosity: {
values: ['low', 'medium', 'high'],
},
},
},
{
@@ -111,6 +129,12 @@ export const PROVIDER_DEFINITIONS: Record<string, ProviderDefinition> = {
},
capabilities: {
toolUsageControl: true,
reasoningEffort: {
values: ['minimal', 'low', 'medium', 'high'],
},
verbosity: {
values: ['low', 'medium', 'high'],
},
},
},
{
@@ -233,6 +257,12 @@ export const PROVIDER_DEFINITIONS: Record<string, ProviderDefinition> = {
},
capabilities: {
toolUsageControl: true,
reasoningEffort: {
values: ['minimal', 'low', 'medium', 'high'],
},
verbosity: {
values: ['low', 'medium', 'high'],
},
},
},
{
@@ -245,6 +275,12 @@ export const PROVIDER_DEFINITIONS: Record<string, ProviderDefinition> = {
},
capabilities: {
toolUsageControl: true,
reasoningEffort: {
values: ['minimal', 'low', 'medium', 'high'],
},
verbosity: {
values: ['low', 'medium', 'high'],
},
},
},
{
@@ -257,6 +293,12 @@ export const PROVIDER_DEFINITIONS: Record<string, ProviderDefinition> = {
},
capabilities: {
toolUsageControl: true,
reasoningEffort: {
values: ['minimal', 'low', 'medium', 'high'],
},
verbosity: {
values: ['low', 'medium', 'high'],
},
},
},
{
@@ -844,3 +886,33 @@ export const EMBEDDING_MODEL_PRICING: Record<string, ModelPricing> = {
export function getEmbeddingModelPricing(modelId: string): ModelPricing | null {
return EMBEDDING_MODEL_PRICING[modelId] || null
}
/**
* Get all models that support reasoning effort
*/
export function getModelsWithReasoningEffort(): string[] {
const models: string[] = []
for (const provider of Object.values(PROVIDER_DEFINITIONS)) {
for (const model of provider.models) {
if (model.capabilities.reasoningEffort) {
models.push(model.id)
}
}
}
return models
}
/**
* Get all models that support verbosity
*/
export function getModelsWithVerbosity(): string[] {
const models: string[] = []
for (const provider of Object.values(PROVIDER_DEFINITIONS)) {
for (const model of provider.models) {
if (model.capabilities.verbosity) {
models.push(model.id)
}
}
}
return models
}
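getModelsWithReasoningEffort and getModelsWithVerbosity are the same traversal parameterized by capability key; they could be collapsed into one helper. A sketch of that refactoring (a suggestion, not the shipped code):

```typescript
// Minimal shape assumed from the capability additions above.
interface SketchModel {
  id: string
  capabilities: { reasoningEffort?: { values: string[] }; verbosity?: { values: string[] } }
}

// One traversal covering both getModelsWith* helpers.
function modelsWithCapability(
  providers: Record<string, { models: SketchModel[] }>,
  capability: 'reasoningEffort' | 'verbosity'
): string[] {
  const models: string[] = []
  for (const provider of Object.values(providers)) {
    for (const model of provider.models) {
      if (model.capabilities[capability]) models.push(model.id)
    }
  }
  return models
}
```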

View File

@@ -130,6 +130,10 @@ export const openaiProvider: ProviderConfig = {
if (request.temperature !== undefined) payload.temperature = request.temperature
if (request.maxTokens !== undefined) payload.max_tokens = request.maxTokens
// Add GPT-5 specific parameters
if (request.reasoningEffort !== undefined) payload.reasoning_effort = request.reasoningEffort
if (request.verbosity !== undefined) payload.verbosity = request.verbosity
// Add response format for structured output if specified
if (request.responseFormat) {
// Use OpenAI's JSON schema format

View File

@@ -156,6 +156,9 @@ export interface ProviderRequest {
// Azure OpenAI specific parameters
azureEndpoint?: string
azureApiVersion?: string
// GPT-5 specific parameters
reasoningEffort?: string
verbosity?: string
}
// Map of provider IDs to their configurations
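Both the OpenAI and Azure OpenAI providers map these two optional fields onto snake_case payload keys only when they are set. The shared pattern reduces to a small sketch:

```typescript
// Sketch of the conditional payload construction used by both provider configs.
function gpt5Params(request: { reasoningEffort?: string; verbosity?: string }): Record<string, string> {
  const payload: Record<string, string> = {}
  if (request.reasoningEffort !== undefined) payload.reasoning_effort = request.reasoningEffort
  if (request.verbosity !== undefined) payload.verbosity = request.verbosity
  return payload
}
```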

View File

@@ -19,7 +19,9 @@ import {
getProviderModels,
MODELS_TEMP_RANGE_0_1,
MODELS_TEMP_RANGE_0_2,
MODELS_WITH_REASONING_EFFORT,
MODELS_WITH_TEMPERATURE_SUPPORT,
MODELS_WITH_VERBOSITY,
PROVIDERS_WITH_TOOL_USAGE_CONTROL,
prepareToolsWithUsageControl,
supportsTemperature,
@@ -144,6 +146,15 @@ describe('Model Capabilities', () => {
'deepseek-chat',
'azure/gpt-4.1',
'azure/model-router',
// GPT-5 models don't support temperature (removed in our implementation)
'gpt-5',
'gpt-5-mini',
'gpt-5-nano',
'gpt-5-chat-latest',
'azure/gpt-5',
'azure/gpt-5-mini',
'azure/gpt-5-nano',
'azure/gpt-5-chat-latest',
]
for (const model of unsupportedModels) {
@@ -198,6 +209,15 @@ describe('Model Capabilities', () => {
expect(getMaxTemperature('azure/o3')).toBeUndefined()
expect(getMaxTemperature('azure/o4-mini')).toBeUndefined()
expect(getMaxTemperature('deepseek-r1')).toBeUndefined()
// GPT-5 models don't support temperature (removed in our implementation)
expect(getMaxTemperature('gpt-5')).toBeUndefined()
expect(getMaxTemperature('gpt-5-mini')).toBeUndefined()
expect(getMaxTemperature('gpt-5-nano')).toBeUndefined()
expect(getMaxTemperature('gpt-5-chat-latest')).toBeUndefined()
expect(getMaxTemperature('azure/gpt-5')).toBeUndefined()
expect(getMaxTemperature('azure/gpt-5-mini')).toBeUndefined()
expect(getMaxTemperature('azure/gpt-5-nano')).toBeUndefined()
expect(getMaxTemperature('azure/gpt-5-chat-latest')).toBeUndefined()
})
it.concurrent('should be case insensitive', () => {
@@ -266,6 +286,49 @@ describe('Model Capabilities', () => {
expect(MODELS_WITH_TEMPERATURE_SUPPORT).toContain('claude-sonnet-4-0') // From 0-1 range
}
)
it.concurrent('should have correct models in MODELS_WITH_REASONING_EFFORT', () => {
// Should contain GPT-5 models that support reasoning effort
expect(MODELS_WITH_REASONING_EFFORT).toContain('gpt-5')
expect(MODELS_WITH_REASONING_EFFORT).toContain('gpt-5-mini')
expect(MODELS_WITH_REASONING_EFFORT).toContain('gpt-5-nano')
expect(MODELS_WITH_REASONING_EFFORT).toContain('azure/gpt-5')
expect(MODELS_WITH_REASONING_EFFORT).toContain('azure/gpt-5-mini')
expect(MODELS_WITH_REASONING_EFFORT).toContain('azure/gpt-5-nano')
// Should NOT contain non-reasoning GPT-5 models
expect(MODELS_WITH_REASONING_EFFORT).not.toContain('gpt-5-chat-latest')
expect(MODELS_WITH_REASONING_EFFORT).not.toContain('azure/gpt-5-chat-latest')
// Should NOT contain other models
expect(MODELS_WITH_REASONING_EFFORT).not.toContain('gpt-4o')
expect(MODELS_WITH_REASONING_EFFORT).not.toContain('claude-sonnet-4-0')
expect(MODELS_WITH_REASONING_EFFORT).not.toContain('o1')
})
it.concurrent('should have correct models in MODELS_WITH_VERBOSITY', () => {
// Should contain GPT-5 models that support verbosity
expect(MODELS_WITH_VERBOSITY).toContain('gpt-5')
expect(MODELS_WITH_VERBOSITY).toContain('gpt-5-mini')
expect(MODELS_WITH_VERBOSITY).toContain('gpt-5-nano')
expect(MODELS_WITH_VERBOSITY).toContain('azure/gpt-5')
expect(MODELS_WITH_VERBOSITY).toContain('azure/gpt-5-mini')
expect(MODELS_WITH_VERBOSITY).toContain('azure/gpt-5-nano')
// Should NOT contain non-reasoning GPT-5 models
expect(MODELS_WITH_VERBOSITY).not.toContain('gpt-5-chat-latest')
expect(MODELS_WITH_VERBOSITY).not.toContain('azure/gpt-5-chat-latest')
// Should NOT contain other models
expect(MODELS_WITH_VERBOSITY).not.toContain('gpt-4o')
expect(MODELS_WITH_VERBOSITY).not.toContain('claude-sonnet-4-0')
expect(MODELS_WITH_VERBOSITY).not.toContain('o1')
})
it.concurrent('should have same models in both reasoning effort and verbosity arrays', () => {
// GPT-5 models that support reasoning effort should also support verbosity and vice versa
expect(MODELS_WITH_REASONING_EFFORT.sort()).toEqual(MODELS_WITH_VERBOSITY.sort())
})
})
})

View File

@@ -12,9 +12,11 @@ import {
getHostedModels as getHostedModelsFromDefinitions,
getMaxTemperature as getMaxTempFromDefinitions,
getModelPricing as getModelPricingFromDefinitions,
getModelsWithReasoningEffort,
getModelsWithTemperatureSupport,
getModelsWithTempRange01,
getModelsWithTempRange02,
getModelsWithVerbosity,
getProviderModels as getProviderModelsFromDefinitions,
getProvidersWithToolUsageControl,
PROVIDER_DEFINITIONS,
@@ -878,6 +880,8 @@ export function trackForcedToolUsage(
export const MODELS_TEMP_RANGE_0_2 = getModelsWithTempRange02()
export const MODELS_TEMP_RANGE_0_1 = getModelsWithTempRange01()
export const MODELS_WITH_TEMPERATURE_SUPPORT = getModelsWithTemperatureSupport()
export const MODELS_WITH_REASONING_EFFORT = getModelsWithReasoningEffort()
export const MODELS_WITH_VERBOSITY = getModelsWithVerbosity()
export const PROVIDERS_WITH_TOOL_USAGE_CONTROL = getProvidersWithToolUsageControl()
/**

View File

@@ -213,13 +213,24 @@ export class Serializer {
const params: Record<string, any> = {}
const isAdvancedMode = block.advancedMode ?? false
const isStarterBlock = block.type === 'starter'
// First collect all current values from subBlocks, filtering by mode
Object.entries(block.subBlocks).forEach(([id, subBlock]) => {
// Find the corresponding subblock config to check its mode
const subBlockConfig = blockConfig.subBlocks.find((config) => config.id === id)
if (subBlockConfig && shouldIncludeField(subBlockConfig, isAdvancedMode)) {
// Include field if it matches current mode OR if it's the starter inputFormat with values
const hasStarterInputFormatValues =
isStarterBlock &&
id === 'inputFormat' &&
Array.isArray(subBlock.value) &&
subBlock.value.length > 0
if (
subBlockConfig &&
(shouldIncludeField(subBlockConfig, isAdvancedMode) || hasStarterInputFormatValues)
) {
params[id] = subBlock.value
}
})
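The inclusion rule above — keep a field when it matches the current mode, or when it is the starter block's inputFormat with non-empty values — is a small predicate, sketched here (parameter names are assumptions):

```typescript
// Sketch of the serializer's field-inclusion predicate.
function shouldSerialize(
  matchesMode: boolean,
  blockType: string,
  fieldId: string,
  value: unknown
): boolean {
  const hasStarterInputFormatValues =
    blockType === 'starter' && fieldId === 'inputFormat' && Array.isArray(value) && value.length > 0
  return matchesMode || hasStarterInputFormatValues
}
```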

View File

@@ -1,11 +1,12 @@
import { create } from 'zustand'
import { devtools, persist } from 'zustand/middleware'
import { createLogger } from '@/lib/logs/console/logger'
import { syncThemeToNextThemes } from '@/lib/theme-sync'
import type { General, GeneralStore, UserSettings } from '@/stores/settings/general/types'
const logger = createLogger('GeneralStore')
const CACHE_TIMEOUT = 5000
const CACHE_TIMEOUT = 3600000 // 1 hour - settings rarely change
const MAX_ERROR_RETRIES = 2
export const useGeneralStore = create<GeneralStore>()(
@@ -14,13 +15,14 @@ export const useGeneralStore = create<GeneralStore>()(
(set, get) => {
let lastLoadTime = 0
let errorRetryCount = 0
let hasLoadedFromDb = false // Track if we've loaded from DB in this session
const store: General = {
isAutoConnectEnabled: true,
isAutoPanEnabled: true,
isConsoleExpandedByDefault: true,
isDebugModeEnabled: false,
theme: 'system' as const,
theme: 'system' as const, // Keep for compatibility but not used
telemetryEnabled: true,
isLoading: false,
error: null,
@@ -28,7 +30,7 @@ export const useGeneralStore = create<GeneralStore>()(
isAutoConnectLoading: false,
isAutoPanLoading: false,
isConsoleExpandedByDefaultLoading: false,
isThemeLoading: false,
isThemeLoading: false, // Keep for compatibility but not used
isTelemetryLoading: false,
}
@@ -99,7 +101,26 @@ export const useGeneralStore = create<GeneralStore>()(
setTheme: async (theme) => {
if (get().isThemeLoading) return
await updateSettingOptimistic('theme', theme, 'isThemeLoading', 'theme')
const originalTheme = get().theme
// Optimistic update
set({ theme, isThemeLoading: true })
// Update next-themes immediately for instant feedback
syncThemeToNextThemes(theme)
try {
// Sync to DB for authenticated users
await get().updateSetting('theme', theme)
set({ isThemeLoading: false })
} catch (error) {
// Rollback on error
set({ theme: originalTheme, isThemeLoading: false })
syncThemeToNextThemes(originalTheme)
logger.error('Failed to sync theme to database:', error)
throw error
}
},
setTelemetryEnabled: async (enabled) => {
@@ -114,6 +135,27 @@ export const useGeneralStore = create<GeneralStore>()(
// API Actions
loadSettings: async (force = false) => {
// Skip if we've already loaded from DB and not forcing
if (hasLoadedFromDb && !force) {
logger.debug('Already loaded settings from DB, using cached data')
return
}
// If we have persisted state and not forcing, check if we need to load
const persistedState = localStorage.getItem('general-settings')
if (persistedState && !force) {
try {
const parsed = JSON.parse(persistedState)
// If we have valid theme data, skip DB load unless forced
if (parsed.state?.theme) {
logger.debug('Using cached settings from localStorage')
hasLoadedFromDb = true // Mark as loaded to prevent future API calls
return
}
} catch (e) {
// If parsing fails, continue to load from DB
}
}
// Skip loading if on a subdomain or chat path
if (
typeof window !== 'undefined' &&
@@ -147,15 +189,24 @@ export const useGeneralStore = create<GeneralStore>()(
set({
isAutoConnectEnabled: data.autoConnect,
isAutoPanEnabled: data.autoPan ?? true, // Default to true if undefined
isConsoleExpandedByDefault: data.consoleExpandedByDefault ?? true, // Default to true if undefined
theme: data.theme,
isAutoPanEnabled: data.autoPan ?? true,
isConsoleExpandedByDefault: data.consoleExpandedByDefault ?? true,
theme: data.theme || 'system',
telemetryEnabled: data.telemetryEnabled,
isLoading: false,
})
// Sync theme to next-themes if it's different
if (data.theme && typeof window !== 'undefined') {
const currentTheme = localStorage.getItem('sim-theme')
if (currentTheme !== data.theme) {
syncThemeToNextThemes(data.theme)
}
}
lastLoadTime = now
errorRetryCount = 0
hasLoadedFromDb = true
} catch (error) {
logger.error('Error loading settings:', error)
set({

View File

@@ -1,8 +1,6 @@
import type { AirtableGetParams, AirtableGetResponse } from '@/tools/airtable/types'
import type { ToolConfig } from '@/tools/types'
// import { logger } from '@/utils/logger' // Removed logger due to import issues
export const airtableGetRecordTool: ToolConfig<AirtableGetParams, AirtableGetResponse> = {
id: 'airtable_get_record',
name: 'Airtable Get Record',

View File

@@ -45,7 +45,9 @@ export const readTool: ToolConfig<MicrosoftExcelToolParams, MicrosoftExcelReadRe
}
if (!params.range) {
return `https://graph.microsoft.com/v1.0/me/drive/items/${spreadsheetId}/workbook/worksheets('Sheet1')/range(address='A1:Z1000')`
// When no range is provided, first fetch the first worksheet name (to avoid hardcoding "Sheet1");
// its default range is then read in transformResponse
return `https://graph.microsoft.com/v1.0/me/drive/items/${spreadsheetId}/workbook/worksheets?$select=name&$orderby=position&$top=1`
}
const rangeInput = params.range.trim()
@@ -72,7 +74,65 @@ export const readTool: ToolConfig<MicrosoftExcelToolParams, MicrosoftExcelReadRe
},
},
transformResponse: async (response: Response) => {
transformResponse: async (response: Response, params?: MicrosoftExcelToolParams) => {
const defaultAddress = 'A1:Z1000' // Match Google Sheets default logic
// If we came from the worksheets listing (no range provided), resolve first sheet name then fetch range
if (response.url.includes('/workbook/worksheets?')) {
const listData = await response.json()
const firstSheetName: string | undefined = listData?.value?.[0]?.name
if (!firstSheetName) {
throw new Error('No worksheets found in the Excel workbook')
}
const spreadsheetIdFromUrl = response.url.split('/drive/items/')[1]?.split('/')[0] || ''
const accessToken = params?.accessToken
if (!accessToken) {
throw new Error('Access token is required to read Excel range')
}
const rangeUrl = `https://graph.microsoft.com/v1.0/me/drive/items/${encodeURIComponent(
spreadsheetIdFromUrl
)}/workbook/worksheets('${encodeURIComponent(firstSheetName)}')/range(address='${defaultAddress}')`
const rangeResp = await fetch(rangeUrl, {
headers: { Authorization: `Bearer ${accessToken}` },
})
if (!rangeResp.ok) {
// Normalize Microsoft Graph sheet/range errors to a friendly message
throw new Error(
'Invalid range provided or worksheet not found. Provide a range like "Sheet1!A1:B2"'
)
}
const data = await rangeResp.json()
const metadata = {
spreadsheetId: spreadsheetIdFromUrl,
properties: {},
spreadsheetUrl: `https://graph.microsoft.com/v1.0/me/drive/items/${spreadsheetIdFromUrl}`,
}
const result: MicrosoftExcelReadResponse = {
success: true,
output: {
data: {
range: data.range || `${firstSheetName}!${defaultAddress}`,
values: data.values || [],
},
metadata: {
spreadsheetId: metadata.spreadsheetId,
spreadsheetUrl: metadata.spreadsheetUrl,
},
},
}
return result
}
// Normal path: caller supplied a range; just return the parsed result
const data = await response.json()
const urlParts = response.url.split('/drive/items/')
@@ -102,27 +162,20 @@ export const readTool: ToolConfig<MicrosoftExcelToolParams, MicrosoftExcelReadRe
},
outputs: {
success: { type: 'boolean', description: 'Operation success status' },
output: {
data: {
type: 'object',
description: 'Excel spreadsheet data and metadata',
description: 'Range data from the spreadsheet',
properties: {
data: {
type: 'object',
description: 'Range data from the spreadsheet',
properties: {
range: { type: 'string', description: 'The range that was read' },
values: { type: 'array', description: 'Array of rows containing cell values' },
},
},
metadata: {
type: 'object',
description: 'Spreadsheet metadata',
properties: {
spreadsheetId: { type: 'string', description: 'The ID of the spreadsheet' },
spreadsheetUrl: { type: 'string', description: 'URL to access the spreadsheet' },
},
},
range: { type: 'string', description: 'The range that was read' },
values: { type: 'array', description: 'Array of rows containing cell values' },
},
},
metadata: {
type: 'object',
description: 'Spreadsheet metadata',
properties: {
spreadsheetId: { type: 'string', description: 'The ID of the spreadsheet' },
spreadsheetUrl: { type: 'string', description: 'URL to access the spreadsheet' },
},
},
},

View File

@@ -128,21 +128,14 @@ export const tableAddTool: ToolConfig<
},
outputs: {
success: { type: 'boolean', description: 'Operation success status' },
output: {
index: { type: 'number', description: 'Index of the first row that was added' },
values: { type: 'array', description: 'Array of rows that were added to the table' },
metadata: {
type: 'object',
description: 'Table add operation results and metadata',
description: 'Spreadsheet metadata',
properties: {
index: { type: 'number', description: 'Index of the first row that was added' },
values: { type: 'array', description: 'Array of rows that were added to the table' },
metadata: {
type: 'object',
description: 'Spreadsheet metadata',
properties: {
spreadsheetId: { type: 'string', description: 'The ID of the spreadsheet' },
spreadsheetUrl: { type: 'string', description: 'URL to access the spreadsheet' },
},
},
spreadsheetId: { type: 'string', description: 'The ID of the spreadsheet' },
spreadsheetUrl: { type: 'string', description: 'URL to access the spreadsheet' },
},
},
},

View File

@@ -161,23 +161,16 @@ export const writeTool: ToolConfig<MicrosoftExcelToolParams, MicrosoftExcelWrite
},
outputs: {
success: { type: 'boolean', description: 'Operation success status' },
output: {
updatedRange: { type: 'string', description: 'The range that was updated' },
updatedRows: { type: 'number', description: 'Number of rows that were updated' },
updatedColumns: { type: 'number', description: 'Number of columns that were updated' },
updatedCells: { type: 'number', description: 'Number of cells that were updated' },
metadata: {
type: 'object',
description: 'Write operation results and metadata',
description: 'Spreadsheet metadata',
properties: {
updatedRange: { type: 'string', description: 'The range that was updated' },
updatedRows: { type: 'number', description: 'Number of rows that were updated' },
updatedColumns: { type: 'number', description: 'Number of columns that were updated' },
updatedCells: { type: 'number', description: 'Number of cells that were updated' },
metadata: {
type: 'object',
description: 'Spreadsheet metadata',
properties: {
spreadsheetId: { type: 'string', description: 'The ID of the spreadsheet' },
spreadsheetUrl: { type: 'string', description: 'URL to access the spreadsheet' },
},
},
spreadsheetId: { type: 'string', description: 'The ID of the spreadsheet' },
spreadsheetUrl: { type: 'string', description: 'URL to access the spreadsheet' },
},
},
},

View File

@@ -1,4 +1,4 @@
import { defineConfig } from '@trigger.dev/sdk/v3'
import { defineConfig } from '@trigger.dev/sdk'
export default defineConfig({
project: 'proj_kufttkwzywcydwtccqhx',

bun.lock (1158)

File diff suppressed because it is too large

View File

@@ -127,9 +127,13 @@
"type": "string",
"description": "Resend API key for transactional emails"
},
"FROM_EMAIL_ADDRESS": {
"type": "string",
"description": "Complete from address (e.g., \"Sim <noreply@domain.com>\" or \"DoNotReply@domain.com\")"
},
"EMAIL_DOMAIN": {
"type": "string",
"description": "Domain for sending emails"
"description": "Domain for sending emails (fallback when FROM_EMAIL_ADDRESS not set)"
},
"GOOGLE_CLIENT_ID": {
"type": "string",

View File

@@ -66,7 +66,8 @@ app:
# Email & Communication
RESEND_API_KEY: "" # Resend API key for transactional emails
EMAIL_DOMAIN: "" # Domain for sending emails
FROM_EMAIL_ADDRESS: "" # Complete from address (e.g., "Sim <noreply@domain.com>" or "DoNotReply@domain.com")
EMAIL_DOMAIN: "" # Domain for sending emails (fallback when FROM_EMAIL_ADDRESS not set)
# OAuth Integration Credentials (leave empty if not using)
GOOGLE_CLIENT_ID: "" # Google OAuth client ID