Compare commits

...

18 Commits

Author SHA1 Message Date
github-actions[bot]
93f8978085 chore(release): Update version to v1.4.340 2025-12-08 00:36:16 +00:00
Kayvan Sylvan
4d91bf837f Merge pull request #1856 from ksylvan/kayvan/claude-haiku-4-5
Add support for new ClaudeHaiku 4.5 models
2025-12-08 08:33:51 +08:00
Changelog Bot
cb29a0d606 chore: incoming 1856 changelog entry 2025-12-08 08:30:17 +08:00
Kayvan Sylvan
b1eb7a82d9 feat: add support for new ClaudeHaiku models in client
### CHANGES

- Add `ModelClaudeHaiku4_5` to supported models
- Add `ModelClaudeHaiku4_5_20251001` to supported models
2025-12-08 08:21:18 +08:00
github-actions[bot]
bc8f5add00 chore(release): Update version to v1.4.339 2025-12-08 00:10:02 +00:00
Kayvan Sylvan
c3f874f985 Merge pull request #1855 from ksylvan/kayvan/ollama_image_handling
feat: add image attachment support for Ollama vision models
2025-12-08 08:07:33 +08:00
Changelog Bot
922df52d0c chore: incoming 1855 changelog entry 2025-12-08 08:00:59 +08:00
Kayvan Sylvan
4badfecadb feat: add multi-modal image support to Ollama client
## CHANGES

- Add base64 and io imports for image handling
- Store httpClient separately in Client struct for reuse
- Convert createChatRequest to return error for validation
- Implement convertMessage to handle multi-content chat messages
- Add loadImageBytes to fetch images from URLs
- Support base64 data URLs for inline images
- Handle HTTP image URLs with context propagation
- Replace debug print with proper debuglog usage
2025-12-08 07:48:36 +08:00
github-actions[bot]
83139a64d5 chore(release): Update version to v1.4.338 2025-12-04 13:34:00 +00:00
Kayvan Sylvan
78fd836532 Merge pull request #1852 from ksylvan/kayvan/add-abacus-provider-for-chatllm-models
Add Abacus vendor for ChatLLM models with static model list
2025-12-04 21:31:34 +08:00
Kayvan Sylvan
894459ddec feat: add static model support and register Abacus provider
CHANGES

- feat: detect modelsURL starting with 'static:' and route appropriately
- feat: implement getStaticModels returning curated Abacus model list
- feat: register Abacus provider with ModelsURL 'static:abacus'
- chore: add fmt import for error formatting in provider code
- test: extend provider tests to include Abacus existence
- chore: update .vscode settings to add 'kimi' and 'qwen' contributors
2025-12-04 21:22:57 +08:00
github-actions[bot]
920c22c889 chore(release): Update version to v1.4.337 2025-12-04 04:21:35 +00:00
Kayvan Sylvan
a0f931feb0 Merge pull request #1851 from ksylvan/kayvan/add-z-ai-vendor-support
Add Z AI provider and glm model support
2025-12-04 12:19:13 +08:00
Kayvan Sylvan
4b080fd6dd feat: add Z AI provider and glm model support
- Add Z AI provider configuration to ProviderMap
- Include BaseURL for Z AI API endpoint
- Add test case for Z AI provider existence
- Add glm to OpenAI model prefixes list
- Reorder gpt-5 in model prefixes list
- Support new Z AI provider in OpenAI compatible plugins
2025-12-04 12:06:55 +08:00
github-actions[bot]
298abecb3f chore(release): Update version to v1.4.336 2025-12-01 11:37:19 +00:00
Kayvan Sylvan
e2d4aab775 Merge pull request #1848 from zeddy303/fix/localStorage-ssr-issue 2025-12-01 19:34:45 +08:00
Changelog Bot
17cac13584 chore: incoming 1848 changelog entry 2025-12-01 18:41:32 +08:00
zeddy303
e4a004cf88 Fix localStorage SSR error in favorites-store
Use SvelteKit's browser constant instead of a typeof localStorage check
to properly handle server-side rendering. Prevents a 'localStorage.getItem
is not a function' error when running the dev server.
2025-11-29 13:06:54 -07:00
12 changed files with 246 additions and 22 deletions

View File

@@ -96,6 +96,7 @@
"joho",
"kballard",
"Keploy",
"kimi",
"Kore",
"ksylvan",
"Langdock",
@@ -151,6 +152,7 @@
"Pulcherrima",
"pycache",
"pyperclip",
"qwen",
"readystream",
"restapi",
"rmextension",

View File

@@ -1,5 +1,49 @@
# Changelog
## v1.4.340 (2025-12-08)
### PR [#1856](https://github.com/danielmiessler/Fabric/pull/1856) by [ksylvan](https://github.com/ksylvan): Add support for new ClaudeHaiku 4.5 models
- Add support for new ClaudeHaiku models in client
- Add `ModelClaudeHaiku4_5` to supported models
- Add `ModelClaudeHaiku4_5_20251001` to supported models
## v1.4.339 (2025-12-08)
### PR [#1855](https://github.com/danielmiessler/Fabric/pull/1855) by [ksylvan](https://github.com/ksylvan): feat: add image attachment support for Ollama vision models
- Add multi-modal image support to Ollama client
- Implement convertMessage to handle multi-content chat messages
- Add loadImageBytes to fetch images from URLs
- Support base64 data URLs for inline images
- Handle HTTP image URLs with context propagation
## v1.4.338 (2025-12-04)
### PR [#1852](https://github.com/danielmiessler/Fabric/pull/1852) by [ksylvan](https://github.com/ksylvan): Add Abacus vendor for ChatLLM models with static model list
- Add static model support and register Abacus provider
- Detect modelsURL starting with 'static:' and route appropriately
- Implement getStaticModels returning curated Abacus model list
- Register Abacus provider with ModelsURL 'static:abacus'
- Extend provider tests to include Abacus existence
## v1.4.337 (2025-12-04)
### PR [#1851](https://github.com/danielmiessler/Fabric/pull/1851) by [ksylvan](https://github.com/ksylvan): Add Z AI provider and glm model support
- Add Z AI provider configuration to ProviderMap
- Include BaseURL for Z AI API endpoint
- Add test case for Z AI provider existence
- Add glm to OpenAI model prefixes list
- Support new Z AI provider in OpenAI compatible plugins
## v1.4.336 (2025-12-01)
### PR [#1848](https://github.com/danielmiessler/Fabric/pull/1848) by [zeddy303](https://github.com/zeddy303): Fix localStorage SSR error in favorites-store
- Fix localStorage SSR error in favorites-store: use SvelteKit's browser constant instead of a typeof localStorage check to handle server-side rendering and prevent a 'localStorage.getItem is not a function' error when running the dev server
## v1.4.335 (2025-11-28)
### PR [#1847](https://github.com/danielmiessler/Fabric/pull/1847) by [ksylvan](https://github.com/ksylvan): Improve model name matching for NeedsRaw in Ollama plugin

View File

@@ -73,6 +73,9 @@ Below are the **new features and capabilities** we've added (newest first):
### Recent Major Features
- [v1.4.338](https://github.com/danielmiessler/fabric/releases/tag/v1.4.338) (Dec 4, 2025) — Add Abacus vendor support for Chat-LLM models (see [RouteLLM APIs](https://abacus.ai/app/route-llm-apis)).
- [v1.4.337](https://github.com/danielmiessler/fabric/releases/tag/v1.4.337) (Dec 4, 2025) — Add "Z AI" vendor support. See the [Z AI overview](https://docs.z.ai/guides/overview/overview) page for more details.
- [v1.4.334](https://github.com/danielmiessler/fabric/releases/tag/v1.4.334) (Nov 26, 2025) — **Claude Opus 4.5**: Updates the Anthropic SDK to the latest and adds the new [Claude Opus 4.5](https://www.anthropic.com/news/claude-opus-4-5) to the available models.
- [v1.4.331](https://github.com/danielmiessler/fabric/releases/tag/v1.4.331) (Nov 23, 2025) — **Support for GitHub Models**: Adds support for using GitHub Models.
- [v1.4.322](https://github.com/danielmiessler/fabric/releases/tag/v1.4.322) (Nov 5, 2025) — **Interactive HTML Concept Maps and Claude Sonnet 4.5**: Adds `create_conceptmap` pattern for visual knowledge representation using Vis.js, introduces WELLNESS category with psychological analysis patterns, and upgrades to Claude Sonnet 4.5

View File

@@ -1,3 +1,3 @@
package main
var version = "v1.4.335"
var version = "v1.4.340"

Binary file not shown.

View File

@@ -52,6 +52,8 @@ func NewClient() (ret *Client) {
string(anthropic.ModelClaudeSonnet4_5_20250929),
string(anthropic.ModelClaudeOpus4_5_20251101),
string(anthropic.ModelClaudeOpus4_5),
string(anthropic.ModelClaudeHaiku4_5),
string(anthropic.ModelClaudeHaiku4_5_20251001),
}
ret.modelBetas = map[string][]string{

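A note on the two added constants: judging by the SDK naming convention and the `claude-haiku-4-5-20251001` entry in the Abacus model list later in this compare, they presumably resolve to the wire IDs `claude-haiku-4-5` and `claude-haiku-4-5-20251001`, which are the strings a user would pass when selecting the model.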
View File

@@ -2,7 +2,9 @@ package ollama
import (
"context"
"encoding/base64"
"fmt"
"io"
"net/http"
"net/url"
"os"
@@ -10,11 +12,10 @@ import (
"time"
"github.com/danielmiessler/fabric/internal/chat"
ollamaapi "github.com/ollama/ollama/api"
"github.com/samber/lo"
"github.com/danielmiessler/fabric/internal/domain"
debuglog "github.com/danielmiessler/fabric/internal/log"
"github.com/danielmiessler/fabric/internal/plugins"
ollamaapi "github.com/ollama/ollama/api"
)
const defaultBaseUrl = "http://localhost:11434"
@@ -48,6 +49,7 @@ type Client struct {
apiUrl *url.URL
client *ollamaapi.Client
ApiHttpTimeout *plugins.SetupQuestion
httpClient *http.Client
}
type transport_sec struct {
@@ -84,7 +86,8 @@ func (o *Client) configure() (err error) {
}
}
o.client = ollamaapi.NewClient(o.apiUrl, &http.Client{Timeout: timeout, Transport: &transport_sec{underlyingTransport: http.DefaultTransport, ApiKey: o.ApiKey}})
o.httpClient = &http.Client{Timeout: timeout, Transport: &transport_sec{underlyingTransport: http.DefaultTransport, ApiKey: o.ApiKey}}
o.client = ollamaapi.NewClient(o.apiUrl, o.httpClient)
return
}
@@ -104,15 +107,18 @@ func (o *Client) ListModels() (ret []string, err error) {
}
func (o *Client) SendStream(msgs []*chat.ChatCompletionMessage, opts *domain.ChatOptions, channel chan string) (err error) {
req := o.createChatRequest(msgs, opts)
ctx := context.Background()
var req ollamaapi.ChatRequest
if req, err = o.createChatRequest(ctx, msgs, opts); err != nil {
return
}
respFunc := func(resp ollamaapi.ChatResponse) (streamErr error) {
channel <- resp.Message.Content
return
}
ctx := context.Background()
if err = o.client.Chat(ctx, &req, respFunc); err != nil {
return
}
@@ -124,7 +130,10 @@ func (o *Client) SendStream(msgs []*chat.ChatCompletionMessage, opts *domain.Cha
func (o *Client) Send(ctx context.Context, msgs []*chat.ChatCompletionMessage, opts *domain.ChatOptions) (ret string, err error) {
bf := false
req := o.createChatRequest(msgs, opts)
var req ollamaapi.ChatRequest
if req, err = o.createChatRequest(ctx, msgs, opts); err != nil {
return
}
req.Stream = &bf
respFunc := func(resp ollamaapi.ChatResponse) (streamErr error) {
@@ -133,15 +142,18 @@ func (o *Client) Send(ctx context.Context, msgs []*chat.ChatCompletionMessage, o
}
if err = o.client.Chat(ctx, &req, respFunc); err != nil {
fmt.Printf("FRED --> %s\n", err)
debuglog.Debug(debuglog.Basic, "Ollama chat request failed: %v\n", err)
}
return
}
func (o *Client) createChatRequest(msgs []*chat.ChatCompletionMessage, opts *domain.ChatOptions) (ret ollamaapi.ChatRequest) {
messages := lo.Map(msgs, func(message *chat.ChatCompletionMessage, _ int) (ret ollamaapi.Message) {
return ollamaapi.Message{Role: message.Role, Content: message.Content}
})
func (o *Client) createChatRequest(ctx context.Context, msgs []*chat.ChatCompletionMessage, opts *domain.ChatOptions) (ret ollamaapi.ChatRequest, err error) {
messages := make([]ollamaapi.Message, len(msgs))
for i, message := range msgs {
if messages[i], err = o.convertMessage(ctx, message); err != nil {
return
}
}
options := map[string]interface{}{
"temperature": opts.Temperature,
@@ -162,6 +174,77 @@ func (o *Client) createChatRequest(msgs []*chat.ChatCompletionMessage, opts *dom
return
}
func (o *Client) convertMessage(ctx context.Context, message *chat.ChatCompletionMessage) (ret ollamaapi.Message, err error) {
ret = ollamaapi.Message{Role: message.Role, Content: message.Content}
if len(message.MultiContent) == 0 {
return
}
// Pre-allocate with capacity hint
textParts := make([]string, 0, len(message.MultiContent))
if strings.TrimSpace(ret.Content) != "" {
textParts = append(textParts, strings.TrimSpace(ret.Content))
}
for _, part := range message.MultiContent {
switch part.Type {
case chat.ChatMessagePartTypeText:
if trimmed := strings.TrimSpace(part.Text); trimmed != "" {
textParts = append(textParts, trimmed)
}
case chat.ChatMessagePartTypeImageURL:
// Nil guard
if part.ImageURL == nil || part.ImageURL.URL == "" {
continue
}
var img []byte
if img, err = o.loadImageBytes(ctx, part.ImageURL.URL); err != nil {
return
}
ret.Images = append(ret.Images, ollamaapi.ImageData(img))
}
}
ret.Content = strings.Join(textParts, "\n")
return
}
func (o *Client) loadImageBytes(ctx context.Context, imageURL string) (ret []byte, err error) {
// Handle data URLs (base64 encoded)
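// e.g. "data:image/png;base64,iVBORw0KGgo..."; only the payload after the first comma is decoded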
if strings.HasPrefix(imageURL, "data:") {
parts := strings.SplitN(imageURL, ",", 2)
if len(parts) != 2 {
err = fmt.Errorf("invalid data URL format")
return
}
if ret, err = base64.StdEncoding.DecodeString(parts[1]); err != nil {
err = fmt.Errorf("failed to decode data URL: %w", err)
}
return
}
// Handle HTTP URLs with context
var req *http.Request
if req, err = http.NewRequestWithContext(ctx, http.MethodGet, imageURL, nil); err != nil {
return
}
var resp *http.Response
if resp, err = o.httpClient.Do(req); err != nil {
return
}
defer resp.Body.Close()
if resp.StatusCode >= http.StatusBadRequest {
err = fmt.Errorf("failed to fetch image %s: %s", imageURL, resp.Status)
return
}
ret, err = io.ReadAll(resp.Body)
return
}
func (o *Client) NeedsRawMode(modelName string) bool {
ollamaSearchStrings := []string{
"llama3",

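For context, here is a minimal sketch of the message shape the new `convertMessage` path handles. The part and image-URL type names are assumptions mirroring the field accesses visible in the diff (`part.Type`, `part.Text`, `part.ImageURL.URL`), not confirmed API:

```go
// Sketch only: a user message mixing text with an inline base64 image.
// The types chat.ChatMessagePart and chat.ChatMessageImageURL are assumed.
msg := &chat.ChatCompletionMessage{
	Role: "user",
	MultiContent: []chat.ChatMessagePart{
		{Type: chat.ChatMessagePartTypeText, Text: "What is in this image?"},
		{
			Type:     chat.ChatMessagePartTypeImageURL,
			ImageURL: &chat.ChatMessageImageURL{URL: "data:image/png;base64,iVBORw0KGgo..."},
		},
	},
}
// convertMessage joins the text parts into Content with newlines and
// appends the decoded image bytes to Images on the ollamaapi.Message.
```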
View File

@@ -172,10 +172,11 @@ func (o *Client) supportsResponsesAPI() bool {
func (o *Client) NeedsRawMode(modelName string) bool {
openaiModelsPrefixes := []string{
"glm",
"gpt-5",
"o1",
"o3",
"o4",
"gpt-5",
}
openAIModelsNeedingRaw := []string{
"gpt-4o-mini-search-preview",

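The matching logic itself is outside this hunk; presumably the prefix list is consulted with a loop along these lines (assumed, not part of the diff):

```go
// Assumed matching logic: any model whose name starts with one of the
// listed prefixes is treated as needing raw mode.
for _, prefix := range openaiModelsPrefixes {
	if strings.HasPrefix(modelName, prefix) {
		return true
	}
}
```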
View File

@@ -2,6 +2,7 @@ package openai_compatible
import (
"context"
"fmt"
"os"
"strings"
@@ -38,8 +39,12 @@ func NewClient(providerConfig ProviderConfig) *Client {
// ListModels overrides the default ListModels to handle different response formats
func (c *Client) ListModels() ([]string, error) {
// If a custom models URL is provided, use direct fetch with that URL
// If a custom models URL is provided, handle it
if c.modelsURL != "" {
// Check for static model list
if strings.HasPrefix(c.modelsURL, "static:") {
return c.getStaticModels(c.modelsURL)
}
// TODO: Handle context properly in Fabric by accepting and propagating a context.Context
// instead of creating a new one here.
return openai.FetchModelsDirectly(context.Background(), c.modelsURL, c.Client.ApiKey.Value, c.GetName())
@@ -55,6 +60,68 @@ func (c *Client) ListModels() ([]string, error) {
return c.DirectlyGetModels(context.Background())
}
// getStaticModels returns a predefined list of models for providers that don't support model discovery
func (c *Client) getStaticModels(modelsKey string) ([]string, error) {
switch modelsKey {
case "static:abacus":
return []string{
"route-llm",
"gpt-4o-2024-11-20",
"gpt-4o-mini",
"o4-mini",
"o3-pro",
"o3",
"o3-mini",
"gpt-4.1",
"gpt-4.1-mini",
"gpt-4.1-nano",
"gpt-5",
"gpt-5-mini",
"gpt-5-nano",
"gpt-5.1",
"gpt-5.1-chat-latest",
"openai/gpt-oss-120b",
"claude-3-7-sonnet-20250219",
"claude-sonnet-4-20250514",
"claude-opus-4-20250514",
"claude-opus-4-1-20250805",
"claude-sonnet-4-5-20250929",
"claude-haiku-4-5-20251001",
"claude-opus-4-5-20251101",
"meta-llama/Llama-4-Maverick-17B-128E-Instruct-FP8",
"meta-llama/Meta-Llama-3.1-405B-Instruct-Turbo",
"meta-llama/Meta-Llama-3.1-70B-Instruct",
"meta-llama/Meta-Llama-3.1-8B-Instruct",
"llama-3.3-70b-versatile",
"gemini-2.0-flash-001",
"gemini-2.0-pro-exp-02-05",
"gemini-2.5-pro",
"gemini-2.5-flash",
"gemini-3-pro-preview",
"qwen-2.5-coder-32b",
"Qwen/Qwen2.5-72B-Instruct",
"Qwen/QwQ-32B",
"Qwen/Qwen3-235B-A22B-Instruct-2507",
"Qwen/Qwen3-32B",
"qwen/qwen3-coder-480b-a35b-instruct",
"qwen/qwen3-Max",
"grok-4-0709",
"grok-4-fast-non-reasoning",
"grok-4-1-fast-non-reasoning",
"grok-code-fast-1",
"kimi-k2-turbo-preview",
"deepseek/deepseek-v3.1",
"deepseek-ai/DeepSeek-V3.1-Terminus",
"deepseek-ai/DeepSeek-R1",
"deepseek-ai/DeepSeek-V3.2",
"zai-org/glm-4.5",
"zai-org/glm-4.6",
}, nil
default:
return nil, fmt.Errorf("unknown static model list: %s", modelsKey)
}
}
// ProviderMap is a map of provider name to ProviderConfig for O(1) lookup
var ProviderMap = map[string]ProviderConfig{
"AIML": {
@@ -123,6 +190,17 @@ var ProviderMap = map[string]ProviderConfig{
BaseURL: "https://api.venice.ai/api/v1",
ImplementsResponses: false,
},
"Z AI": {
Name: "Z AI",
BaseURL: "https://api.z.ai/api/paas/v4",
ImplementsResponses: false,
},
"Abacus": {
Name: "Abacus",
BaseURL: "https://routellm.abacus.ai/v1/",
ModelsURL: "static:abacus", // Special marker for static model list
ImplementsResponses: false,
},
}
// GetProviderByName returns the provider configuration for a given name with O(1) lookup

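Taken together, the `static:` marker and `getStaticModels` form a pattern for any provider that lacks a model-discovery endpoint. A hypothetical sketch of a second such registration (the provider name, URL, and key below are invented for illustration):

```go
// Hypothetical ProviderMap entry: a second discovery-less provider wired
// through the same "static:" mechanism. It would also need a matching
// case in getStaticModels returning its curated model list.
"Example": {
	Name:                "Example",
	BaseURL:             "https://api.example.com/v1",
	ModelsURL:           "static:example",
	ImplementsResponses: false,
},
```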
View File

@@ -20,6 +20,16 @@ func TestCreateClient(t *testing.T) {
provider: "Groq",
exists: true,
},
{
name: "Existing provider - Z AI",
provider: "Z AI",
exists: true,
},
{
name: "Existing provider - Abacus",
provider: "Abacus",
exists: true,
},
{
name: "Non-existent provider",
provider: "NonExistent",

View File

@@ -1 +1 @@
"1.4.335"
"1.4.340"

View File

@@ -1,13 +1,14 @@
import { writable } from 'svelte/store';
import { browser } from '$app/environment';
// Load favorites from localStorage if available
const storedFavorites = typeof localStorage !== 'undefined'
const storedFavorites = browser
? JSON.parse(localStorage.getItem('favoritePatterns') || '[]')
: [];
const createFavoritesStore = () => {
const { subscribe, set, update } = writable<string[]>(storedFavorites);
return {
subscribe,
toggleFavorite: (patternName: string) => {
@@ -17,7 +18,7 @@ const createFavoritesStore = () => {
: [...favorites, patternName];
// Save to localStorage
if (typeof localStorage !== 'undefined') {
if (browser) {
localStorage.setItem('favoritePatterns', JSON.stringify(newFavorites));
}
@@ -26,11 +27,11 @@ const createFavoritesStore = () => {
},
reset: () => {
set([]);
if (typeof localStorage !== 'undefined') {
if (browser) {
localStorage.removeItem('favoritePatterns');
}
}
};
};
export const favorites = createFavoritesStore();
export const favorites = createFavoritesStore();
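Design note on the last change: SvelteKit's `browser` constant is statically `false` during server-side rendering, whereas a `typeof localStorage !== 'undefined'` check can pass in server environments that expose a non-functional `localStorage` global, which is exactly the 'localStorage.getItem is not a function' failure the commit message describes.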