Compare commits

16 Commits

Author SHA1 Message Date
github-actions[bot]
fd5530d38b chore(release): Update version to v1.4.378 2026-01-14 19:11:24 +00:00
Kayvan Sylvan
8ec09be550 Merge pull request #1933 from ksylvan/kayvan/digital-ocean-provider
Add DigitalOcean Gradient AI support
2026-01-14 11:08:58 -08:00
Kayvan Sylvan
6bac79703e chore: Add Digital Ocean vendor to Updates section. 2026-01-14 11:03:15 -08:00
Kayvan Sylvan
24afe127f1 chore: incoming 1933 changelog entry 2026-01-13 22:56:36 -08:00
Kayvan Sylvan
c26a56a368 feat: add DigitalOcean Gradient AI Agents as a new vendor
## CHANGES

- Add DigitalOcean as a new AI provider in plugin registry
- Implement DigitalOcean client with OpenAI-compatible inference endpoint
- Support model access key authentication for inference requests
- Add optional control plane token for model discovery
- Create DigitalOcean setup documentation with environment variables
- Update README to list DigitalOcean in supported providers
- Handle model listing via control plane API with fallback
2026-01-13 22:52:13 -08:00
Kayvan Sylvan
84470eac3f chore: Update README with links to other docs 2026-01-13 13:58:14 -08:00
github-actions[bot]
74250bbcbd chore(release): Update version to v1.4.377 2026-01-12 17:39:02 +00:00
Kayvan Sylvan
a593e83d9f Merge pull request #1929 from ksylvan/kayvan/add-mammouth-ai-provider
Add Mammouth as new OpenAI-compatible AI provider
2026-01-12 09:36:05 -08:00
Kayvan Sylvan
62e6812c7f chore: incoming 1929 changelog entry 2026-01-12 09:33:38 -08:00
Kayvan Sylvan
7e7ab9e5f2 feat: add Mammouth as new OpenAI-compatible AI provider
## CHANGES

- Add Mammouth provider configuration with API base URL
- Configure Mammouth to use standard OpenAI-compatible interface
- Disable Responses API implementation for Mammouth provider
- Add "Mammouth" to VSCode spell check dictionary
2026-01-12 09:27:28 -08:00
github-actions[bot]
df2938a7ee chore(release): Update version to v1.4.376 2026-01-12 05:22:38 +00:00
Kayvan Sylvan
014985c407 Merge pull request #1928 from ksylvan/kayvan/refactor-new-plugin-base
Eliminate repetitive boilerplate across eight vendor implementations
2026-01-11 21:20:07 -08:00
Kayvan Sylvan
a3d9bec537 chore: incoming 1928 changelog entry 2026-01-11 21:18:06 -08:00
Kayvan Sylvan
febae215f3 chore: exempt json files from VSCode format-on-save 2026-01-11 20:56:13 -08:00
Kayvan Sylvan
cf55be784f refactor: add NewVendorPluginBase factory function to reduce duplication
Add centralized factory function for AI vendor plugin initialization:
- Add NewVendorPluginBase(name, configure) to internal/plugins/plugin.go
- Update 8 vendor files (anthropic, bedrock, gemini, lmstudio, ollama,
  openai, perplexity, vertexai) to use the factory function
- Add 3 test cases for the new factory function

This removes ~40 lines of duplicated boilerplate code and ensures
consistent plugin initialization across all vendors.

MAESTRO: Loop 00001 refactoring implementation

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-11 20:12:58 -08:00
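The refactor above replaces a struct literal that every vendor repeated with a single constructor. A minimal, self-contained Go sketch of the pattern — the `PluginBase` here is reduced to the three fields this diff touches, and `BuildEnvVariablePrefix` is reconstructed from the test expectations shown later (e.g. `"LM Studio"` → `"LM_STUDIO_"`), so treat it as an approximation rather than the library's exact implementation:

```go
package main

import (
	"fmt"
	"strings"
)

// PluginBase, reduced to the fields the refactor initializes.
type PluginBase struct {
	Name            string
	EnvNamePrefix   string
	ConfigureCustom func() error
}

// BuildEnvVariablePrefix is reconstructed from the test expectations:
// uppercase the vendor name, replace spaces with underscores, and
// append a trailing underscore ("LM Studio" -> "LM_STUDIO_").
func BuildEnvVariablePrefix(name string) string {
	return strings.ToUpper(strings.ReplaceAll(name, " ", "_")) + "_"
}

// NewVendorPluginBase centralizes the initialization that each of the
// eight vendor files previously wrote out by hand.
func NewVendorPluginBase(name string, configure func() error) *PluginBase {
	return &PluginBase{
		Name:            name,
		EnvNamePrefix:   BuildEnvVariablePrefix(name),
		ConfigureCustom: configure,
	}
}

func main() {
	// Before: every vendor spelled out the struct literal.
	// After: one call per vendor.
	p := NewVendorPluginBase("LM Studio", func() error { return nil })
	fmt.Println(p.Name, p.EnvNamePrefix) // LM Studio LM_STUDIO_
}
```

With eight call sites, moving the three-field literal behind this constructor is what accounts for the roughly 40 lines of removed boilerplate the commit mentions.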
Daniel Miessler
6d2180e69a docs: Add GitHub sponsor section to README
I spend hundreds of hours a year on open source. If you'd like to help support this project, you can sponsor me here.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-09 12:40:13 -08:00
20 changed files with 334 additions and 48 deletions


@@ -117,6 +117,7 @@
"listvendors",
"lmstudio",
"Makefiles",
"Mammouth",
"markmap",
"matplotlib",
"mattn",
@@ -157,6 +158,7 @@
"pyperclip",
"qwen",
"readystream",
"reflexion",
"restapi",
"rmextension",
"Sadachbia",
@@ -247,5 +249,8 @@
]
},
"MD041": false
},
"[json]": {
"editor.formatOnSave": false
}
}


@@ -1,5 +1,45 @@
# Changelog
## v1.4.378 (2026-01-14)
### PR [#1933](https://github.com/danielmiessler/Fabric/pull/1933) by [ksylvan](https://github.com/ksylvan): Add DigitalOcean Gradient AI support
- Feat: add DigitalOcean Gradient AI Agents as a new vendor
- Add DigitalOcean as a new AI provider in plugin registry
- Implement DigitalOcean client with OpenAI-compatible inference endpoint
- Support model access key authentication for inference requests
- Add optional control plane token for model discovery
### Direct commits
- Chore: Update README with links to other docs
## v1.4.377 (2026-01-12)
### PR [#1929](https://github.com/danielmiessler/Fabric/pull/1929) by [ksylvan](https://github.com/ksylvan): Add Mammouth as new OpenAI-compatible AI provider
- Feat: add Mammouth as new OpenAI-compatible AI provider
- Add Mammouth provider configuration with API base URL
- Configure Mammouth to use standard OpenAI-compatible interface
- Disable Responses API implementation for Mammouth provider
- Add "Mammouth" to VSCode spell check dictionary
## v1.4.376 (2026-01-12)
### PR [#1928](https://github.com/danielmiessler/Fabric/pull/1928) by [ksylvan](https://github.com/ksylvan): Eliminate repetitive boilerplate across eight vendor implementations
- Refactor: add NewVendorPluginBase factory function to reduce duplication
- Update 8 vendor files (anthropic, bedrock, gemini, lmstudio, ollama, openai, perplexity, vertexai) to use the factory function
- Add 3 test cases for the new factory function
- Add centralized factory function for AI vendor plugin initialization
- Chore: exempt json files from VSCode format-on-save
### Direct commits
- Docs: Add GitHub sponsor section to README
I spend hundreds of hours a year on open source. If you'd like to help support this project, you can sponsor me here.
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
## v1.4.375 (2026-01-08)
### PR [#1925](https://github.com/danielmiessler/Fabric/pull/1925) by [ksylvan](https://github.com/ksylvan): docs: update README to document new AI providers and features


@@ -63,6 +63,9 @@ Fabric organizes prompts by real-world task, allowing people to create, collect,
## Updates
For a deep dive into Fabric and its internals, read the documentation in the [docs folder](https://github.com/danielmiessler/Fabric/tree/main/docs). There is
also the extremely useful and regularly updated [DeepWiki](https://deepwiki.com/danielmiessler/Fabric) for Fabric.
<details>
<summary>Click to view recent updates</summary>
@@ -74,6 +77,7 @@ Below are the **new features and capabilities** we've added (newest first):
### Recent Major Features
- [v1.4.378](https://github.com/danielmiessler/fabric/releases/tag/v1.4.378) (Jan 14, 2026) — **Digital Ocean GenAI Support**: Added support for Digital Ocean GenAI, along with a [guide for how to use it](./docs/DigitalOcean-Agents-Setup.md).
- [v1.4.356](https://github.com/danielmiessler/fabric/releases/tag/v1.4.356) (Dec 22, 2025) — **Complete Internationalization**: Full i18n support for setup prompts across all 10 languages with intelligent environment variable handling—making Fabric truly accessible worldwide while maintaining configuration consistency.
- [v1.4.350](https://github.com/danielmiessler/fabric/releases/tag/v1.4.350) (Dec 18, 2025) — **Interactive API Documentation**: Adds Swagger/OpenAPI UI at `/swagger/index.html` with comprehensive REST API documentation, enhanced developer guides, and improved endpoint discoverability for easier integration.
- [v1.4.338](https://github.com/danielmiessler/fabric/releases/tag/v1.4.338) (Dec 4, 2025) — Add Abacus vendor support for Chat-LLM
@@ -196,6 +200,7 @@ Keep in mind that many of these were recorded when Fabric was Python-based, so r
- [Meta](#meta)
- [Primary contributors](#primary-contributors)
- [Contributors](#contributors)
- [💜 Support This Project](#-support-this-project)
<br />
@@ -376,6 +381,7 @@ Fabric supports a wide range of AI providers:
- AIML
- Cerebras
- DeepSeek
- DigitalOcean
- GitHub Models
- GrokAI
- Groq
@@ -1091,3 +1097,13 @@ Made with [contrib.rocks](https://contrib.rocks).
`fabric` was created by <a href="https://danielmiessler.com/subscribe" target="_blank">Daniel Miessler</a> in January of 2024.
<br /><br />
<a href="https://twitter.com/intent/user?screen_name=danielmiessler">![X (formerly Twitter) Follow](https://img.shields.io/twitter/follow/danielmiessler)</a>
## 💜 Support This Project
<div align="center">
<img src="https://img.shields.io/badge/Sponsor-❤️-EA4AAA?style=for-the-badge&logo=github-sponsors&logoColor=white" alt="Sponsor">
**I spend hundreds of hours a year on open source. If you'd like to help support this project, you can [sponsor me here](https://github.com/sponsors/danielmiessler). 🙏🏼**
</div>


@@ -1,3 +1,3 @@
package main
var version = "v1.4.375"
var version = "v1.4.378"

Binary file not shown.


@@ -0,0 +1,55 @@
# DigitalOcean Gradient AI Agents
Fabric can talk to DigitalOcean Gradient™ AI Agents by using DigitalOcean's OpenAI-compatible
inference endpoint. You provide a **model access key** for inference plus an optional **DigitalOcean
API token** for model discovery.
## Prerequisites
1. Create or locate a Gradient AI Agent in the DigitalOcean control panel.
2. Create a **model access key** for inference (this is not the same as your DigitalOcean API token).
3. (Optional) Keep a DigitalOcean API token handy if you want `fabric --listmodels` to query the
control plane for available models.
The official walkthrough for creating and using agents is here:
<https://docs.digitalocean.com/products/gradient-ai-platform/how-to/use-agents/>
## Environment variables
Set the following environment variables before running `fabric --setup`:
```bash
# Required: model access key for inference
export DIGITALOCEAN_INFERENCE_KEY="your-model-access-key"
# Optional: control-plane token for model listing
export DIGITALOCEAN_TOKEN="your-digitalocean-api-token"
# Optional: override the default inference base URL
export DIGITALOCEAN_INFERENCE_BASE_URL="https://inference.do-ai.run/v1"
```
If you need a region-specific inference URL, you can retrieve it from the GenAI regions API:
```bash
curl -H "Authorization: Bearer $DIGITALOCEAN_TOKEN" \
"https://api.digitalocean.com/v2/gen-ai/regions"
```
## Fabric setup
Run setup and select the DigitalOcean vendor:
```bash
fabric --setup
```
Then list models (requires `DIGITALOCEAN_TOKEN`) and pick the inference name:
```bash
fabric --listmodels
fabric --vendor DigitalOcean --model <inference_name> --pattern summarize
```
If you skip `DIGITALOCEAN_TOKEN`, you can still use Fabric by supplying the model name directly
based on the agent or model you created in DigitalOcean.
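Once the inference key is set, requests are ordinary OpenAI-style chat completions against the base URL above. A hypothetical sketch — the `/chat/completions` path and payload shape are assumed from the OpenAI-compatible convention this endpoint advertises, and `<inference_name>` is a placeholder for the name returned by `fabric --listmodels`:

```bash
# Hypothetical direct call; assumes the endpoint follows the
# OpenAI-compatible /chat/completions convention described above.
BODY='{"model":"<inference_name>","messages":[{"role":"user","content":"Say hello"}]}'
if [ -n "$DIGITALOCEAN_INFERENCE_KEY" ]; then
  curl -s "${DIGITALOCEAN_INFERENCE_BASE_URL:-https://inference.do-ai.run/v1}/chat/completions" \
    -H "Authorization: Bearer $DIGITALOCEAN_INFERENCE_KEY" \
    -H "Content-Type: application/json" \
    -d "$BODY"
fi
```

This is the same request Fabric issues under the hood, which is why the `--model` value must be the inference name rather than the agent's display name or UUID.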


@@ -15,6 +15,7 @@ import (
"github.com/danielmiessler/fabric/internal/plugins/ai/anthropic"
"github.com/danielmiessler/fabric/internal/plugins/ai/azure"
"github.com/danielmiessler/fabric/internal/plugins/ai/bedrock"
"github.com/danielmiessler/fabric/internal/plugins/ai/digitalocean"
"github.com/danielmiessler/fabric/internal/plugins/ai/dryrun"
"github.com/danielmiessler/fabric/internal/plugins/ai/exolab"
"github.com/danielmiessler/fabric/internal/plugins/ai/gemini"
@@ -98,6 +99,7 @@ func NewPluginRegistry(db *fsdb.Db) (ret *PluginRegistry, err error) {
// Add non-OpenAI compatible clients
vendors = append(vendors,
openai.NewClient(),
digitalocean.NewClient(),
ollama.NewClient(),
azure.NewClient(),
gemini.NewClient(),


@@ -29,11 +29,7 @@ func NewClient() (ret *Client) {
vendorName := "Anthropic"
ret = &Client{}
ret.PluginBase = &plugins.PluginBase{
Name: vendorName,
EnvNamePrefix: plugins.BuildEnvVariablePrefix(vendorName),
ConfigureCustom: ret.configure,
}
ret.PluginBase = plugins.NewVendorPluginBase(vendorName, ret.configure)
ret.ApiBaseURL = ret.AddSetupQuestion("API Base URL", false)
ret.ApiBaseURL.Value = defaultBaseUrl


@@ -51,13 +51,9 @@ func NewClient() (ret *BedrockClient) {
cfg, err := config.LoadDefaultConfig(ctx)
if err != nil {
// Create a minimal client that will fail gracefully during configuration
ret.PluginBase = &plugins.PluginBase{
Name: vendorName,
EnvNamePrefix: plugins.BuildEnvVariablePrefix(vendorName),
ConfigureCustom: func() error {
return fmt.Errorf("unable to load AWS Config: %w", err)
},
}
ret.PluginBase = plugins.NewVendorPluginBase(vendorName, func() error {
return fmt.Errorf("unable to load AWS Config: %w", err)
})
ret.bedrockRegion = ret.PluginBase.AddSetupQuestion("AWS Region", true)
return
}
@@ -67,11 +63,7 @@ func NewClient() (ret *BedrockClient) {
runtimeClient := bedrockruntime.NewFromConfig(cfg)
controlPlaneClient := bedrock.NewFromConfig(cfg)
ret.PluginBase = &plugins.PluginBase{
Name: vendorName,
EnvNamePrefix: plugins.BuildEnvVariablePrefix(vendorName),
ConfigureCustom: ret.configure,
}
ret.PluginBase = plugins.NewVendorPluginBase(vendorName, ret.configure)
ret.runtimeClient = runtimeClient
ret.controlPlaneClient = controlPlaneClient


@@ -0,0 +1,151 @@
package digitalocean
import (
"context"
"encoding/json"
"fmt"
"io"
"net/http"
"net/url"
"time"
"github.com/danielmiessler/fabric/internal/i18n"
"github.com/danielmiessler/fabric/internal/plugins"
"github.com/danielmiessler/fabric/internal/plugins/ai/openai"
)
const (
defaultInferenceBaseURL = "https://inference.do-ai.run/v1"
controlPlaneModelsURL = "https://api.digitalocean.com/v2/gen-ai/models"
errorResponseLimit = 1024
maxResponseSize = 10 * 1024 * 1024
)
type Client struct {
*openai.Client
ControlPlaneToken *plugins.SetupQuestion
httpClient *http.Client
}
type modelsResponse struct {
Models []modelDetails `json:"models"`
}
type modelDetails struct {
InferenceName string `json:"inference_name"`
Name string `json:"name"`
UUID string `json:"uuid"`
}
func NewClient() *Client {
base := openai.NewClientCompatibleNoSetupQuestions("DigitalOcean", nil)
base.ApiKey = base.AddSetupQuestion("Inference Key", true)
base.ApiBaseURL = base.AddSetupQuestion("Inference Base URL", false)
base.ApiBaseURL.Value = defaultInferenceBaseURL
base.ImplementsResponses = false
client := &Client{
Client: base,
}
client.ControlPlaneToken = client.AddSetupQuestion("Token", false)
return client
}
func (c *Client) ListModels() ([]string, error) {
if c.ControlPlaneToken.Value == "" {
models, err := c.Client.ListModels()
if err == nil && len(models) > 0 {
return models, nil
}
if err != nil {
return nil, fmt.Errorf(
"DigitalOcean model list unavailable: %w. Set DIGITALOCEAN_TOKEN to fetch models from the control plane",
err,
)
}
return nil, fmt.Errorf("DigitalOcean model list unavailable. Set DIGITALOCEAN_TOKEN to fetch models from the control plane")
}
return c.fetchModelsFromControlPlane(context.Background())
}
func (c *Client) fetchModelsFromControlPlane(ctx context.Context) ([]string, error) {
if ctx == nil {
ctx = context.Background()
}
fullURL, err := url.Parse(controlPlaneModelsURL)
if err != nil {
return nil, fmt.Errorf("failed to parse DigitalOcean control plane URL: %w", err)
}
req, err := http.NewRequestWithContext(ctx, http.MethodGet, fullURL.String(), nil)
if err != nil {
return nil, err
}
req.Header.Set("Authorization", fmt.Sprintf("Bearer %s", c.ControlPlaneToken.Value))
req.Header.Set("Accept", "application/json")
client := c.httpClient
if client == nil {
client = &http.Client{Timeout: 10 * time.Second}
}
resp, err := client.Do(req)
if err != nil {
return nil, err
}
defer resp.Body.Close()
if resp.StatusCode != http.StatusOK {
bodyBytes, readErr := io.ReadAll(io.LimitReader(resp.Body, errorResponseLimit))
if readErr != nil {
return nil, fmt.Errorf(
"DigitalOcean models request failed with status %d: %w",
resp.StatusCode,
readErr,
)
}
return nil, fmt.Errorf(
"DigitalOcean models request failed with status %d: %s",
resp.StatusCode,
string(bodyBytes),
)
}
bodyBytes, err := io.ReadAll(io.LimitReader(resp.Body, maxResponseSize+1))
if err != nil {
return nil, err
}
if len(bodyBytes) > maxResponseSize {
return nil, fmt.Errorf(i18n.T("openai_models_response_too_large"), c.GetName(), maxResponseSize)
}
var payload modelsResponse
if err := json.Unmarshal(bodyBytes, &payload); err != nil {
return nil, err
}
models := make([]string, 0, len(payload.Models))
seen := make(map[string]struct{}, len(payload.Models))
for _, model := range payload.Models {
var value string
switch {
case model.InferenceName != "":
value = model.InferenceName
case model.Name != "":
value = model.Name
case model.UUID != "":
value = model.UUID
}
if value == "" {
continue
}
if _, ok := seen[value]; ok {
continue
}
seen[value] = struct{}{}
models = append(models, value)
}
return models, nil
}


@@ -46,10 +46,7 @@ func NewClient() (ret *Client) {
vendorName := "Gemini"
ret = &Client{}
ret.PluginBase = &plugins.PluginBase{
Name: vendorName,
EnvNamePrefix: plugins.BuildEnvVariablePrefix(vendorName),
}
ret.PluginBase = plugins.NewVendorPluginBase(vendorName, nil)
ret.ApiKey = ret.PluginBase.AddSetupQuestion("API key", true)


@@ -27,11 +27,7 @@ func NewClientCompatible(vendorName string, defaultBaseUrl string, configureCust
if configureCustom == nil {
configureCustom = ret.configure
}
ret.PluginBase = &plugins.PluginBase{
Name: vendorName,
EnvNamePrefix: plugins.BuildEnvVariablePrefix(vendorName),
ConfigureCustom: configureCustom,
}
ret.PluginBase = plugins.NewVendorPluginBase(vendorName, configureCustom)
ret.ApiUrl = ret.AddSetupQuestionCustom("API URL", true,
fmt.Sprintf("Enter your %v URL (as a reminder, it is usually %v')", vendorName, defaultBaseUrl))
return


@@ -24,11 +24,7 @@ func NewClient() (ret *Client) {
vendorName := "Ollama"
ret = &Client{}
ret.PluginBase = &plugins.PluginBase{
Name: vendorName,
EnvNamePrefix: plugins.BuildEnvVariablePrefix(vendorName),
ConfigureCustom: ret.configure,
}
ret.PluginBase = plugins.NewVendorPluginBase(vendorName, ret.configure)
ret.ApiUrl = ret.AddSetupQuestionCustom("API URL", true,
"Enter your Ollama URL (as a reminder, it is usually http://localhost:11434')")


@@ -52,11 +52,7 @@ func NewClientCompatibleNoSetupQuestions(vendorName string, configureCustom func
configureCustom = ret.configure
}
ret.PluginBase = &plugins.PluginBase{
Name: vendorName,
EnvNamePrefix: plugins.BuildEnvVariablePrefix(vendorName),
ConfigureCustom: configureCustom,
}
ret.PluginBase = plugins.NewVendorPluginBase(vendorName, configureCustom)
return
}


@@ -206,6 +206,11 @@ var ProviderMap = map[string]ProviderConfig{
ModelsURL: "static:abacus", // Special marker for static model list
ImplementsResponses: false,
},
"Mammouth": {
Name: "Mammouth",
BaseURL: "https://api.mammouth.ai/v1",
ImplementsResponses: false,
},
}
// GetProviderByName returns the provider configuration for a given name with O(1) lookup


@@ -31,11 +31,7 @@ type Client struct {
func NewClient() *Client {
c := &Client{}
c.PluginBase = &plugins.PluginBase{
Name: providerName,
EnvNamePrefix: plugins.BuildEnvVariablePrefix(providerName),
ConfigureCustom: c.Configure, // Assign the Configure method
}
c.PluginBase = plugins.NewVendorPluginBase(providerName, c.Configure)
c.APIKey = c.AddSetupQuestion("API_KEY", true)
return c
}


@@ -28,11 +28,7 @@ func NewClient() (ret *Client) {
vendorName := "VertexAI"
ret = &Client{}
ret.PluginBase = &plugins.PluginBase{
Name: vendorName,
EnvNamePrefix: plugins.BuildEnvVariablePrefix(vendorName),
ConfigureCustom: ret.configure,
}
ret.PluginBase = plugins.NewVendorPluginBase(vendorName, ret.configure)
ret.ProjectID = ret.AddSetupQuestion("Project ID", true)
ret.Region = ret.AddSetupQuestion("Region", false)


@@ -36,6 +36,16 @@ func (o *PluginBase) GetName() string {
return o.Name
}
// NewVendorPluginBase creates a standardized PluginBase for AI vendor plugins.
// This centralizes the common initialization pattern used by all vendors.
func NewVendorPluginBase(name string, configure func() error) *PluginBase {
return &PluginBase{
Name: name,
EnvNamePrefix: BuildEnvVariablePrefix(name),
ConfigureCustom: configure,
}
}
func (o *PluginBase) GetSetupDescription() (ret string) {
if ret = o.SetupDescription; ret == "" {
ret = o.GetName()


@@ -8,6 +8,43 @@ import (
"github.com/stretchr/testify/assert"
)
func TestNewVendorPluginBase(t *testing.T) {
// Test with configure function
configureCalled := false
configureFunc := func() error {
configureCalled = true
return nil
}
plugin := NewVendorPluginBase("TestVendor", configureFunc)
assert.Equal(t, "TestVendor", plugin.Name)
assert.Equal(t, "TESTVENDOR_", plugin.EnvNamePrefix)
assert.NotNil(t, plugin.ConfigureCustom)
// Test that configure function is properly stored
err := plugin.ConfigureCustom()
assert.NoError(t, err)
assert.True(t, configureCalled)
}
func TestNewVendorPluginBase_NilConfigure(t *testing.T) {
// Test with nil configure function
plugin := NewVendorPluginBase("TestVendor", nil)
assert.Equal(t, "TestVendor", plugin.Name)
assert.Equal(t, "TESTVENDOR_", plugin.EnvNamePrefix)
assert.Nil(t, plugin.ConfigureCustom)
}
func TestNewVendorPluginBase_EnvPrefixWithSpaces(t *testing.T) {
// Test that spaces are converted to underscores
plugin := NewVendorPluginBase("LM Studio", nil)
assert.Equal(t, "LM Studio", plugin.Name)
assert.Equal(t, "LM_STUDIO_", plugin.EnvNamePrefix)
}
func TestConfigurable_AddSetting(t *testing.T) {
conf := &PluginBase{
Settings: Settings{},


@@ -1 +1 @@
"1.4.375"
"1.4.378"