refactor(backend): move OpenAI client to centralized clients.py

Organizational improvement:
- Moved get_openai_client() from embeddings.py to backend/util/clients.py
- Follows established pattern for external service clients (like Supabase)
- Uses @cached(ttl_seconds=3600) for process-level caching with TTL
- Makes OpenAI client reusable across codebase

Benefits:
- Consistency with existing client patterns
- Centralized location for all external service clients
- Better organization and maintainability
- Reusable for future use cases (block embeddings, library agents, etc.)

Pattern alignment:
- Similar to get_supabase() - external API client with caching
- Uses same caching decorator as other service clients
- Thread-safe process-level cache
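The `@cached(ttl_seconds=...)` decorator itself is not shown in this diff. A minimal sketch of what such a thread-safe, process-level TTL cache might look like for zero-argument factories like `get_openai_client` (the real `backend.util` implementation may differ):

```python
import threading
import time
from functools import wraps


def cached(ttl_seconds: int):
    """Hypothetical sketch of a process-level TTL cache decorator.

    Caches the wrapped zero-argument function's result and recomputes
    it once the TTL expires. Illustrative only; the actual decorator
    in the codebase may have a different implementation.
    """

    def decorator(func):
        lock = threading.Lock()
        value = None
        expires_at = 0.0
        has_value = False

        @wraps(func)
        def wrapper():
            nonlocal value, expires_at, has_value
            with lock:  # thread-safe: only one caller recomputes at a time
                now = time.monotonic()
                if not has_value or now >= expires_at:
                    value = func()  # recompute on first call or after expiry
                    expires_at = now + ttl_seconds
                    has_value = True
                return value

        return wrapper

    return decorator
```

With `ttl_seconds=3600`, every caller in the process would share the same client object for an hour before it is rebuilt.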

Files changed:
- backend/util/clients.py: Add get_openai_client() with @cached decorator
- backend/api/features/store/embeddings.py: Import from clients instead of local definition

No functional changes - purely organizational refactor.
Author: Zamil Majdy
Date: 2026-01-13 15:18:05 -06:00
Parent: 704b8a9207
Commit: 16a14ca09e
2 changed files with 20 additions and 17 deletions

File: backend/api/features/store/embeddings.py

@@ -8,16 +8,14 @@ Handles generation and storage of OpenAI embeddings for all content types
import asyncio
import logging
import time
from functools import cache
from typing import Any
import prisma
from openai import AsyncOpenAI
from prisma.enums import ContentType
from backend.data.db import execute_raw_with_schema, query_raw_with_schema
from backend.util.clients import get_openai_client
from backend.util.json import dumps
from backend.util.settings import Settings
logger = logging.getLogger(__name__)
@@ -27,20 +25,6 @@ EMBEDDING_MODEL = "text-embedding-3-small"
EMBEDDING_DIM = 1536
@cache
def get_openai_client() -> AsyncOpenAI | None:
    """
    Get or create a singleton async OpenAI client for connection reuse.

    Returns None if API key is not configured.
    """
    settings = Settings()
    api_key = settings.secrets.openai_internal_api_key
    if not api_key:
        return None
    return AsyncOpenAI(api_key=api_key)


def build_searchable_text(
    name: str,
    description: str,

File: backend/util/clients.py

@@ -10,6 +10,7 @@ from backend.util.settings import Settings
settings = Settings()

if TYPE_CHECKING:
    from openai import AsyncOpenAI
    from supabase import AClient, Client

from backend.data.execution import (
@@ -139,6 +140,24 @@ async def get_async_supabase() -> "AClient":
    )
# ============ OpenAI Client ============ #
@cached(ttl_seconds=3600)
def get_openai_client() -> "AsyncOpenAI | None":
    """
    Get a process-cached async OpenAI client for embeddings.

    Returns None if API key is not configured.
    """
    from openai import AsyncOpenAI

    api_key = settings.secrets.openai_internal_api_key
    if not api_key:
        return None
    return AsyncOpenAI(api_key=api_key)
# ============ Notification Queue Helpers ============ #