Compare commits


5 Commits

Author SHA1 Message Date
github-actions[bot]
1d7fdffdbd chore(release): Update version to v1.4.391 2026-01-24 20:02:29 +00:00
Kayvan Sylvan
bd38f5ae20 Merge pull request #1965 from infinitelyloopy-bt/fix/azure-openai-deployment-url
fix(azure): Fix deployment URL path for Azure OpenAI API
2026-01-24 12:00:09 -08:00
Baker Tamory
a61007b3b1 Apply PR review feedback from @ksylvan
- Add changelog file for PR #1965
- Fix trailing space formatting in deploymentRoutes map

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
2026-01-25 01:39:18 +11:00
Baker Tamory
4d5ee38a34 docs: Add Azure OpenAI troubleshooting guide
Documents the deployment URL bug and stream_options fix with:
- Clear explanation of the root cause (SDK route matching bug)
- Technical details for developers
- Configuration guidance
- Verification steps

Related to #1954 and PR #1965

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-24 13:34:28 +11:00
Baker Tamory
82974a6a2a fix(azure): Fix deployment URL path for Azure OpenAI API
The OpenAI Go SDK's azure.WithEndpoint() middleware has a bug where it
expects request paths like /openai/chat/completions but the SDK actually
sends paths like /chat/completions (without the /openai/ prefix since
that's included in the base URL). This causes the SDK's route matching
to fail, resulting in deployment names not being injected into the URL.

Azure OpenAI requires URLs like:
  /openai/deployments/{deployment-name}/chat/completions
But the SDK was generating:
  /openai/chat/completions

This fix:
1. Adds custom middleware that correctly transforms API paths to include
   the deployment name extracted from the request body's model field
2. Moves StreamOptions to only be set for streaming requests (Azure
   rejects stream_options for non-streaming requests)

Fixes #1954

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-24 13:31:04 +11:00
7 changed files with 227 additions and 6 deletions


@@ -1,5 +1,16 @@
# Changelog
## v1.4.391 (2026-01-24)
### PR [#1965](https://github.com/danielmiessler/Fabric/pull/1965) by [infinitelyloopy-bt](https://github.com/infinitelyloopy-bt): fix(azure): Fix deployment URL path for Azure OpenAI API
- Fixed deployment URL path construction for Azure OpenAI API to correctly include deployment names in request URLs
- Added custom middleware to transform API paths and extract deployment names from request body model fields
- Moved StreamOptions configuration to only apply to streaming requests, as Azure rejects stream_options for non-streaming requests
- Added Azure OpenAI troubleshooting documentation with technical details and configuration guidance
- Resolved SDK route matching bug that was preventing proper URL generation for Azure OpenAI endpoints
## v1.4.390 (2026-01-24)


@@ -1,3 +1,3 @@
 package main

-var version = "v1.4.390"
+var version = "v1.4.391"

Binary file not shown.


@@ -0,0 +1,127 @@
# Azure OpenAI Troubleshooting
This document describes a known issue with Azure OpenAI integration and its fix.
## Issue: DeploymentNotFound Error (404)
### Symptoms
When using Fabric with Azure OpenAI, you may encounter this error:
```
POST "https://{resource}.cognitiveservices.azure.com/openai/chat/completions?api-version=...": 404 Not Found
{
  "code": "DeploymentNotFound",
  "message": "The API deployment for this resource does not exist..."
}
```
### Root Cause
Azure OpenAI requires deployment names in the URL path:
```
✅ Correct: /openai/deployments/{deployment-name}/chat/completions
❌ Incorrect: /openai/chat/completions
```
The OpenAI Go SDK's `azure.WithEndpoint()` middleware has a bug in its URL transformation logic:
1. The SDK's `jsonRoutes` map expects paths like `/openai/chat/completions`
2. But the SDK actually sends paths like `/chat/completions` (without the `/openai/` prefix)
3. The `/openai/` prefix is included in the base URL, not the request path
4. This causes the route matching to **always fail**, so deployment names are never injected into the URL
### Technical Details
In the SDK's `azure/azure.go`:
```go
// SDK checks for these routes:
var jsonRoutes = map[string]bool{
	"/openai/chat/completions": true, // Expects /openai/ prefix
	// ...
}
// But actual request path is:
path := "chat/completions" // No /openai/ prefix!
```
The mismatch means `jsonRoutes[req.URL.Path]` never matches, and the deployment name transformation never happens.
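The failed lookup can be reproduced with a plain map, independent of the SDK; a minimal sketch (the route table is a simplified stand-in for the SDK's):

```go
package main

import "fmt"

func main() {
	// Simplified stand-in for the SDK's jsonRoutes table
	jsonRoutes := map[string]bool{
		"/openai/chat/completions": true,
	}
	// The path the SDK actually sends; the /openai/ prefix lives in the base URL
	reqPath := "/chat/completions"

	fmt.Println(jsonRoutes[reqPath]) // false: the lookup misses, so no deployment name is injected
}
```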
## Fix
The fix in `internal/plugins/ai/azure/azure.go` adds custom middleware that:
1. Intercepts outgoing requests
2. Extracts the deployment name from the request body's `model` field
3. Transforms the URL path to include `/deployments/{name}/`
```go
// Transform: /chat/completions -> /openai/deployments/{name}/chat/completions
func azureDeploymentMiddleware(req *http.Request, next option.MiddlewareNext) (*http.Response, error) {
	// Routes that need deployment name injection
	deploymentRoutes := map[string]bool{
		"/chat/completions":     true,
		"/completions":          true,
		"/embeddings":           true,
		"/audio/speech":         true,
		"/audio/transcriptions": true,
		"/audio/translations":   true,
		"/images/generations":   true,
	}
	// Extract deployment from body and transform URL...
}
```
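The path rewrite itself is a small string transformation. A standalone sketch of the same logic (the deployment name is hard-coded here for illustration; the middleware reads it from the request body's `model` field):

```go
package main

import (
	"fmt"
	"net/url"
	"strings"
)

// rewritePath mirrors the middleware's transformation for a single path.
func rewritePath(path, deployment string) string {
	// Strip a leading /openai if the base URL already contributed it
	trimmed := strings.TrimPrefix(path, "/openai")
	if !strings.HasPrefix(trimmed, "/") {
		trimmed = "/" + trimmed
	}
	return "/openai/deployments/" + url.PathEscape(deployment) + trimmed
}

func main() {
	fmt.Println(rewritePath("/chat/completions", "my-gpt4-deployment"))
	// /openai/deployments/my-gpt4-deployment/chat/completions
}
```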
## Additional Fix: StreamOptions Error
### Symptom
```
400 Bad Request
{
  "message": "The 'stream_options' parameter is only allowed when 'stream' is enabled."
}
```
### Cause
Fabric's Chat Completions client sent `stream_options` with every request, but Azure accepts this parameter only when `stream: true` is also set.
### Fix
Moved `StreamOptions` to only be set for streaming requests in `internal/plugins/ai/openai/chat_completions.go`.
## Configuration
Ensure your Azure OpenAI configuration is correct:
```bash
# In ~/.config/fabric/.env
AZURE_API_KEY=your-api-key
AZURE_API_BASE_URL=https://{your-resource}.cognitiveservices.azure.com/
AZURE_DEPLOYMENTS=your-deployment-1,your-deployment-2 # Comma-separated deployment names
AZURE_API_VERSION=2024-12-01-preview # Optional, defaults to 2024-05-01-preview
```
**Note:** The deployment name is what you specified when deploying a model in Azure AI Foundry (formerly Azure OpenAI Studio), not the model name itself (e.g., `my-gpt4-deployment` rather than `gpt-4`).
## Verification
Test your Azure OpenAI setup:
```bash
fabric --model <your-deployment-name> --pattern summarize "Hello world"
```
Replace `<your-deployment-name>` with the actual deployment name from your Azure configuration.
You should see a successful response from your Azure OpenAI deployment.
## References
- GitHub Issue: [#1954](https://github.com/danielmiessler/fabric/issues/1954)
- Pull Request: [#1965](https://github.com/danielmiessler/fabric/pull/1965)
- [Azure OpenAI REST API Reference](https://learn.microsoft.com/en-us/azure/ai-services/openai/reference)


@@ -1,13 +1,19 @@
package azure
import (
	"bytes"
	"encoding/json"
	"fmt"
	"io"
	"net/http"
	"net/url"
	"strings"

	"github.com/danielmiessler/fabric/internal/plugins"
	"github.com/danielmiessler/fabric/internal/plugins/ai/openai"
	openaiapi "github.com/openai/openai-go"
	"github.com/openai/openai-go/azure"
	"github.com/openai/openai-go/option"
)
func NewClient() (ret *Client) {
@@ -50,14 +56,90 @@ func (oi *Client) configure() error {
		oi.ApiVersion.Value = apiVersion
	}

	// Build the Azure endpoint URL with /openai/ suffix
	endpoint := strings.TrimSuffix(baseURL, "/") + "/openai/"

	// Create the client with Azure authentication and custom middleware
	// to fix the deployment URL path (workaround for SDK bug where
	// jsonRoutes expects /openai/chat/completions but SDK uses /chat/completions)
	client := openaiapi.NewClient(
		azure.WithAPIKey(apiKey),
		azure.WithEndpoint(baseURL, apiVersion),
		option.WithBaseURL(endpoint),
		option.WithQueryAdd("api-version", apiVersion),
		option.WithMiddleware(azureDeploymentMiddleware),
	)
	oi.ApiClient = &client
	return nil
}
// azureDeploymentMiddleware transforms Azure OpenAI API paths to include
// the deployment name. Azure requires URLs like:
//	/openai/deployments/{deployment-name}/chat/completions
// but the SDK sends paths like /chat/completions
func azureDeploymentMiddleware(req *http.Request, next option.MiddlewareNext) (*http.Response, error) {
	// Routes that need deployment name injection
	deploymentRoutes := map[string]bool{
		"/chat/completions":     true,
		"/completions":          true,
		"/embeddings":           true,
		"/audio/speech":         true,
		"/audio/transcriptions": true,
		"/audio/translations":   true,
		"/images/generations":   true,
	}

	path := req.URL.Path
	// Remove /openai prefix if present (SDK may add it via base URL)
	trimmedPath := strings.TrimPrefix(path, "/openai")
	if !strings.HasPrefix(trimmedPath, "/") {
		trimmedPath = "/" + trimmedPath
	}

	if deploymentRoutes[trimmedPath] {
		// Extract model/deployment name from request body
		deploymentName, err := extractDeploymentFromBody(req)
		if err != nil {
			return nil, fmt.Errorf("failed to extract deployment name: %w", err)
		}
		// Transform path: /chat/completions -> /deployments/{name}/chat/completions
		newPath := "/openai/deployments/" + url.PathEscape(deploymentName) + trimmedPath
		req.URL.Path = newPath
		req.URL.RawPath = "" // Clear RawPath to ensure Path is used
	}
	return next(req)
}

// extractDeploymentFromBody reads the model field from the JSON request body
// and restores the body for subsequent use
func extractDeploymentFromBody(req *http.Request) (string, error) {
	if req.Body == nil {
		return "", fmt.Errorf("request body is nil")
	}
	bodyBytes, err := io.ReadAll(req.Body)
	if err != nil {
		return "", err
	}
	// Restore body for subsequent reads
	req.Body = io.NopCloser(bytes.NewReader(bodyBytes))

	var payload struct {
		Model string `json:"model"`
	}
	if err := json.Unmarshal(bodyBytes, &payload); err != nil {
		return "", err
	}
	if payload.Model == "" {
		return "", fmt.Errorf("model field is empty or missing in request body")
	}
	return payload.Model, nil
}
func parseDeployments(value string) []string {
	parts := strings.Split(value, ",")
	var deployments []string


@@ -35,6 +35,10 @@ func (o *Client) sendStreamChatCompletions(
	defer close(channel)
	req := o.buildChatCompletionParams(msgs, opts)

	// Set StreamOptions only for streaming requests (required to get usage stats)
	req.StreamOptions = openai.ChatCompletionStreamOptionsParam{
		IncludeUsage: openai.Bool(true),
	}

	stream := o.ApiClient.Chat.Completions.NewStreaming(context.Background(), req)
	for stream.Next() {
		chunk := stream.Current()
@@ -82,9 +86,6 @@ func (o *Client) buildChatCompletionParams(
	ret = openai.ChatCompletionNewParams{
		Model:    shared.ChatModel(opts.Model),
		Messages: messages,
		StreamOptions: openai.ChatCompletionStreamOptionsParam{
			IncludeUsage: openai.Bool(true),
		},
	}

	if !opts.Raw {


@@ -1 +1 @@
-"1.4.390"
+"1.4.391"