## CHANGES
- Add MiniMax provider configuration with API endpoint updates
- Implement NeedsRawMode method for MiniMax model handling
- Define static MiniMax model list with M2 variants
- Add Infermatic and Novita to VS Code extensions
- Update test to use proper context.TODO() parameter
- Configure MiniMax models as static discovery
- Set ModelsURL to static:minimax for model listing (see the sketch after this list)
- Add `analyze_discord_structure` pattern for Discord server analysis
- Add `create_design_system` pattern for CSS design system generation
- Add `create_golden_rules` pattern for extracting codebase rules
- Update pattern numbering in explanations document (11-235)
- Add new patterns to suggest_pattern category mappings
- Update pattern_descriptions.json with new pattern metadata
- Update pattern_extracts.json with new pattern content
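A minimal sketch of what the MiniMax provider entries above might look like. The type names, fields, and the free-standing NeedsRawMode shown here are approximations inferred from this changelog rather than fabric's actual source, and the endpoint URL is a placeholder.

```go
// Sketch only: type, field, and function shapes are inferred from the
// changelog bullets above, not copied from fabric's code.
package minimax

import "strings"

// ProviderConfig approximates an OpenAI-compatible provider entry.
type ProviderConfig struct {
	Name      string
	BaseURL   string
	ModelsURL string // "static:<provider>" signals a hard-coded model list
}

// Stand-in for the static M2 model list; real names live in the provider code.
var staticModels = []string{"MiniMax-M2"}

var Provider = ProviderConfig{
	Name:      "MiniMax",
	BaseURL:   "https://example.invalid/v1", // placeholder, not the real endpoint
	ModelsURL: "static:minimax",             // skip the /models API call
}

// NeedsRawMode reports whether a model should bypass default prompt templating.
// In fabric this hangs off the vendor client; shown free-standing here.
func NeedsRawMode(modelName string) bool {
	return strings.HasPrefix(strings.ToLower(modelName), "minimax")
}
```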
Documents the deployment URL bug and stream_options fix with:
- Clear explanation of the root cause (SDK route matching bug)
- Technical details for developers
- Configuration guidance
- Verification steps
Related to #1954 and PR #1965
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
The OpenAI Go SDK's azure.WithEndpoint() middleware has a bug where it
expects request paths like /openai/chat/completions but the SDK actually
sends paths like /chat/completions (without the /openai/ prefix since
that's included in the base URL). This causes the SDK's route matching
to fail, resulting in deployment names not being injected into the URL.
Azure OpenAI requires URLs like:
/openai/deployments/{deployment-name}/chat/completions
But the SDK was generating:
/openai/chat/completions
This fix:
1. Adds custom middleware that correctly transforms API paths to include
the deployment name extracted from the request body's model field
2. Moves StreamOptions to only be set for streaming requests (Azure
rejects stream_options for non-streaming requests)
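Roughly, the path rewrite the custom middleware performs looks like the sketch below: it pulls the model name out of the JSON body and splices it into the Azure deployment path. The helper name and error handling are illustrative, not the exact code from this fix.

```go
// Illustrative only: the real fix installs this logic as OpenAI Go SDK
// middleware; names and error handling here are not from the actual patch.
package azurefix

import (
	"bytes"
	"encoding/json"
	"io"
	"net/http"
	"strings"
)

func rewriteAzurePath(req *http.Request) error {
	if req.Body == nil {
		return nil
	}
	body, err := io.ReadAll(req.Body)
	if err != nil {
		return err
	}
	// Put the body back so the request can still be sent downstream.
	req.Body = io.NopCloser(bytes.NewReader(body))

	var payload struct {
		Model string `json:"model"`
	}
	if err := json.Unmarshal(body, &payload); err != nil || payload.Model == "" {
		return err // no model field: leave the path untouched
	}
	// /openai/chat/completions -> /openai/deployments/{deployment}/chat/completions
	req.URL.Path = strings.Replace(req.URL.Path, "/openai/",
		"/openai/deployments/"+payload.Model+"/", 1)
	return nil
}
```

The second half of the fix is just a guard: StreamOptions is attached only when the request actually streams.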
Fixes #1954
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Three new patterns for common development workflows:
- create_design_system: Generate CSS design systems with tokens,
typography scales, and dark/light mode support from requirements
- create_golden_rules: Extract implicit and explicit rules from
codebases into testable, enforceable guidelines
- analyze_discord_structure: Audit Discord server organization,
permissions, and naming conventions
Changed from using only the last positional argument (args[len(args)-1])
to joining all positional arguments with spaces. This allows commands like:
fabric -p pattern_name How do I use fabric to list available models
to consume the entire phrase instead of just "models".
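In outline, the change is the difference between these two lines (variable name illustrative):

```go
// Before: only the final positional argument reached the pattern.
message := args[len(args)-1]

// After: every positional argument is joined into one phrase.
message := strings.Join(args, " ")
```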
Fixes #1958
Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
- Add Novita AI provider configuration with API endpoint
- Update README to include Novita AI in supported providers list
- Configure Novita AI to use OpenAI-compatible interface
- Add documentation comments for messageTextFromParts, contentBlocksFromMessage,
prependSystemContentToBlocks, contentBlockFromAttachmentURL, and parseDataURL
- Consolidate two switch statements in normalizeImageMimeType into single
idiomatic Go switch with direct returns
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
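The consolidated switch described above could look roughly like this; the accepted MIME types and the (string, bool) signature are assumptions based on these bullets, not the definitive list from the patch.

```go
package imageattach

import "strings"

// Sketch of a single-switch normalizeImageMimeType with direct returns.
// Accepted types and the signature are assumptions, not fabric's exact code.
func normalizeImageMimeType(mimeType string) (string, bool) {
	switch strings.ToLower(strings.TrimSpace(mimeType)) {
	case "image/jpg", "image/jpeg":
		return "image/jpeg", true
	case "image/png":
		return "image/png", true
	case "image/gif":
		return "image/gif", true
	case "image/webp":
		return "image/webp", true
	default:
		return "", false // unsupported; the caller can log a debug message
	}
}
```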
- Add debug logging for failed data URL parsing
- Add debug logging for unsupported MIME types
- Add debug logging for non-base64 data URLs
- Add documentation for normalizeImageMimeType with API reference
- Add documentation for isPDFURL explaining extension-only limitation
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
## CHANGES
- Update toMessages to handle multi-content messages with text and attachments
- Add contentBlocksFromMessage to convert message parts to Anthropic blocks
- Implement support for image URLs including data URLs and base64 images
- Add PDF attachment handling via data URLs and URL-based PDFs
- Introduce parseDataURL for extracting MIME type and data from data URLs
- Create normalizeImageMimeType to standardize supported image MIME types
- Add isPDFURL to detect PDF files from URL paths
- Refactor system content accumulation to use text extraction from parts
- Update tests to verify PDF attachment processing in multi-content messages
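A compact sketch of the parseDataURL and isPDFURL helpers listed above; the signatures and the base64-only restriction are assumptions drawn from these bullets, not the code from the commit.

```go
// Rough sketches only; the real signatures and error handling may differ.
package anthropicmsg

import (
	"encoding/base64"
	"fmt"
	"net/url"
	"path"
	"strings"
)

// parseDataURL splits a data URL into its MIME type and decoded payload.
// Only base64-encoded data URLs are handled, matching the notes above.
func parseDataURL(dataURL string) (string, []byte, error) {
	rest, ok := strings.CutPrefix(dataURL, "data:")
	if !ok {
		return "", nil, fmt.Errorf("not a data URL")
	}
	header, payload, ok := strings.Cut(rest, ",")
	if !ok {
		return "", nil, fmt.Errorf("malformed data URL")
	}
	if !strings.HasSuffix(header, ";base64") {
		return "", nil, fmt.Errorf("non-base64 data URL")
	}
	data, err := base64.StdEncoding.DecodeString(payload)
	return strings.TrimSuffix(header, ";base64"), data, err
}

// isPDFURL checks only the URL path extension (the documented limitation).
func isPDFURL(rawURL string) bool {
	u, err := url.Parse(rawURL)
	if err != nil {
		return false
	}
	return strings.EqualFold(path.Ext(u.Path), ".pdf")
}
```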
## CHANGES
- Add new pattern extending `extract_wisdom` with speaker attribution
- Create README and system.md for the new pattern
- Update pattern_explanations.md with new pattern entry
- Add pattern to suggest_pattern category lists
- Update pattern_descriptions.json with metadata and tags
- Update pattern_extracts.json with pattern content
Adds a pattern that suggests Gas Town (gt) CLI commands based on
natural language descriptions. Covers 85+ commands across work
management, agents, communication, services, diagnostics, and recovery.
Pipe-friendly output: first line is the raw command, enabling:
echo "message the mayor" | fabric -p suggest_gt_command | head -1
Gas Town: https://github.com/steveyegge/gastown
Generated with [Claude Code](https://claude.ai/code)
via [Happy](https://happy.engineering)
Co-Authored-By: Claude <noreply@anthropic.com>
Co-Authored-By: Happy <yesreply@happy.engineering>
## CHANGES
- Add Spotify plugin with OAuth token handling and metadata
- Wire --spotify flag into CLI processing and output
- Register Spotify in plugin setup, env, and registry
- Update shell completions to include --spotify option
- Add i18n strings for Spotify configuration errors
- Add unit and integration tests for Spotify API
- Set gopls integration build tags for workspace
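The OAuth step mentioned in the first bullet typically boils down to Spotify's client-credentials token exchange. The sketch below shows that exchange in isolation and says nothing about how fabric's plugin actually wires it up.

```go
// Stand-alone sketch of Spotify's client-credentials flow; fabric's plugin
// structure, config keys, and error strings are not represented here.
package spotifysketch

import (
	"context"
	"encoding/json"
	"net/http"
	"net/url"
	"strings"
)

func fetchSpotifyToken(ctx context.Context, clientID, clientSecret string) (string, error) {
	form := url.Values{"grant_type": {"client_credentials"}}
	req, err := http.NewRequestWithContext(ctx, http.MethodPost,
		"https://accounts.spotify.com/api/token", strings.NewReader(form.Encode()))
	if err != nil {
		return "", err
	}
	req.SetBasicAuth(clientID, clientSecret)
	req.Header.Set("Content-Type", "application/x-www-form-urlencoded")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		return "", err
	}
	defer resp.Body.Close()

	var out struct {
		AccessToken string `json:"access_token"`
	}
	if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
		return "", err
	}
	return out.AccessToken, nil
}
```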
## CHANGES
- Add warning to stderr when using incompatible models with image generation
- Add GPT-5, GPT-5-nano, and GPT-5.2 to supported image generation models
- Create `checkImageGenerationCompatibility` function in OpenAI plugin
- Add comprehensive tests for image generation compatibility warnings
- Add integration test scenarios for CLI image generation workflows
- Suggest gpt-4o as alternative in incompatibility warning messages
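A rough sketch of the compatibility check these bullets describe; the supported-model set and warning wording are pieced together from the bullets above, not taken from the patch itself.

```go
package openaiplugin

import (
	"fmt"
	"os"
	"strings"
)

// Approximation of checkImageGenerationCompatibility; the real model list,
// message text, and return type may differ from this sketch.
func checkImageGenerationCompatibility(model string) {
	supported := map[string]bool{
		"gpt-4o":     true,
		"gpt-5":      true,
		"gpt-5-nano": true,
		"gpt-5.2":    true,
	}
	if !supported[strings.ToLower(model)] {
		fmt.Fprintf(os.Stderr,
			"Warning: model %q may not support image generation; consider gpt-4o instead\n",
			model)
	}
}
```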