Compare commits

..

10 Commits

Author SHA1 Message Date
github-actions[bot]
fd5530d38b chore(release): Update version to v1.4.378 2026-01-14 19:11:24 +00:00
Kayvan Sylvan
8ec09be550 Merge pull request #1933 from ksylvan/kayvan/digital-ocean-provider
Add DigitalOcean Gradient AI support
2026-01-14 11:08:58 -08:00
Kayvan Sylvan
6bac79703e chore: Add Digital Ocean vendor to Updates section. 2026-01-14 11:03:15 -08:00
Kayvan Sylvan
24afe127f1 chore: incoming 1933 changelog entry 2026-01-13 22:56:36 -08:00
Kayvan Sylvan
c26a56a368 feat: add DigitalOcean Gradient AI Agents as a new vendor
## CHANGES

- Add DigitalOcean as a new AI provider in plugin registry
- Implement DigitalOcean client with OpenAI-compatible inference endpoint
- Support model access key authentication for inference requests
- Add optional control plane token for model discovery
- Create DigitalOcean setup documentation with environment variables
- Update README to list DigitalOcean in supported providers
- Handle model listing via control plane API with fallback
2026-01-13 22:52:13 -08:00
Kayvan Sylvan
84470eac3f chore: Update README with links to other docs 2026-01-13 13:58:14 -08:00
github-actions[bot]
74250bbcbd chore(release): Update version to v1.4.377 2026-01-12 17:39:02 +00:00
Kayvan Sylvan
a593e83d9f Merge pull request #1929 from ksylvan/kayvan/add-mammouth-ai-provider
Add Mammouth as new OpenAI-compatible AI provider
2026-01-12 09:36:05 -08:00
Kayvan Sylvan
62e6812c7f chore: incoming 1929 changelog entry 2026-01-12 09:33:38 -08:00
Kayvan Sylvan
7e7ab9e5f2 feat: add Mammouth as new OpenAI-compatible AI provider
## CHANGES

- Add Mammouth provider configuration with API base URL
- Configure Mammouth to use standard OpenAI-compatible interface
- Disable Responses API implementation for Mammouth provider
- Add "Mammouth" to VSCode spell check dictionary
2026-01-12 09:27:28 -08:00
10 changed files with 248 additions and 3 deletions

View File

@@ -117,6 +117,7 @@
"listvendors",
"lmstudio",
"Makefiles",
"Mammouth",
"markmap",
"matplotlib",
"mattn",
@@ -157,6 +158,7 @@
"pyperclip",
"qwen",
"readystream",
"reflexion",
"restapi",
"rmextension",
"Sadachbia",

View File

@@ -1,5 +1,29 @@
# Changelog
## v1.4.378 (2026-01-14)
### PR [#1933](https://github.com/danielmiessler/Fabric/pull/1933) by [ksylvan](https://github.com/ksylvan): Add DigitalOcean Gradient AI support
- Feat: add DigitalOcean Gradient AI Agents as a new vendor
- Add DigitalOcean as a new AI provider in plugin registry
- Implement DigitalOcean client with OpenAI-compatible inference endpoint
- Support model access key authentication for inference requests
- Add optional control plane token for model discovery
### Direct commits
- Chore: Update README with links to other docs
## v1.4.377 (2026-01-12)
### PR [#1929](https://github.com/danielmiessler/Fabric/pull/1929) by [ksylvan](https://github.com/ksylvan): Add Mammouth as new OpenAI-compatible AI provider
- Feat: add Mammouth as new OpenAI-compatible AI provider
- Add Mammouth provider configuration with API base URL
- Configure Mammouth to use standard OpenAI-compatible interface
- Disable Responses API implementation for Mammouth provider
- Add "Mammouth" to VSCode spell check dictionary
## v1.4.376 (2026-01-12)
### PR [#1928](https://github.com/danielmiessler/Fabric/pull/1928) by [ksylvan](https://github.com/ksylvan): Eliminate repetitive boilerplate across eight vendor implementations

View File

@@ -63,6 +63,9 @@ Fabric organizes prompts by real-world task, allowing people to create, collect,
## Updates
For a deep dive into Fabric and its internals, read the documentation in the [docs folder](https://github.com/danielmiessler/Fabric/tree/main/docs). There is
also the extremely useful and regularly updated [DeepWiki](https://deepwiki.com/danielmiessler/Fabric) for Fabric.
<details>
<summary>Click to view recent updates</summary>
@@ -74,6 +77,7 @@ Below are the **new features and capabilities** we've added (newest first):
### Recent Major Features
- [v1.4.378](https://github.com/danielmiessler/fabric/releases/tag/v1.4.378) (Jan 14, 2026) — **Digital Ocean GenAI Support**: Added support for Digital Ocean GenAI, along with a [guide on how to use it](./docs/DigitalOcean-Agents-Setup.md).
- [v1.4.356](https://github.com/danielmiessler/fabric/releases/tag/v1.4.356) (Dec 22, 2025) — **Complete Internationalization**: Full i18n support for setup prompts across all 10 languages with intelligent environment variable handling—making Fabric truly accessible worldwide while maintaining configuration consistency.
- [v1.4.350](https://github.com/danielmiessler/fabric/releases/tag/v1.4.350) (Dec 18, 2025) — **Interactive API Documentation**: Adds Swagger/OpenAPI UI at `/swagger/index.html` with comprehensive REST API documentation, enhanced developer guides, and improved endpoint discoverability for easier integration.
- [v1.4.338](https://github.com/danielmiessler/fabric/releases/tag/v1.4.338) (Dec 4, 2025) — Add Abacus vendor support for Chat-LLM
@@ -196,6 +200,7 @@ Keep in mind that many of these were recorded when Fabric was Python-based, so r
- [Meta](#meta)
- [Primary contributors](#primary-contributors)
- [Contributors](#contributors)
- [💜 Support This Project](#-support-this-project)
<br />
@@ -376,6 +381,7 @@ Fabric supports a wide range of AI providers:
- AIML
- Cerebras
- DeepSeek
- DigitalOcean
- GitHub Models
- GrokAI
- Groq
@@ -1098,6 +1104,6 @@ Made with [contrib.rocks](https://contrib.rocks).
<img src="https://img.shields.io/badge/Sponsor-❤️-EA4AAA?style=for-the-badge&logo=github-sponsors&logoColor=white" alt="Sponsor">
**I spend hundreds of hours a year on open source. If you'd like to help support this project, you can sponsor me [here](https://github.com/sponsors/danielmiessler). 🙏🏼**
**I spend hundreds of hours a year on open source. If you'd like to help support this project, you can [sponsor me here](https://github.com/sponsors/danielmiessler). 🙏🏼**
</div>

View File

@@ -1,3 +1,3 @@
package main
var version = "v1.4.376"
var version = "v1.4.378"

Binary file not shown.

View File

@@ -0,0 +1,55 @@
# DigitalOcean Gradient AI Agents
Fabric can talk to DigitalOcean Gradient™ AI Agents by using DigitalOcean's OpenAI-compatible
inference endpoint. You provide a **model access key** for inference plus an optional **DigitalOcean
API token** for model discovery.
## Prerequisites
1. Create or locate a Gradient AI Agent in the DigitalOcean control panel.
2. Create a **model access key** for inference (this is not the same as your DigitalOcean API token).
3. (Optional) Keep a DigitalOcean API token handy if you want `fabric --listmodels` to query the
control plane for available models.
The official walkthrough for creating and using agents is here:
<https://docs.digitalocean.com/products/gradient-ai-platform/how-to/use-agents/>
## Environment variables
Set the following environment variables before running `fabric --setup`:
```bash
# Required: model access key for inference
export DIGITALOCEAN_INFERENCE_KEY="your-model-access-key"
# Optional: control-plane token for model listing
export DIGITALOCEAN_TOKEN="your-digitalocean-api-token"
# Optional: override the default inference base URL
export DIGITALOCEAN_INFERENCE_BASE_URL="https://inference.do-ai.run/v1"
```
If you need a region-specific inference URL, you can retrieve it from the GenAI regions API:
```bash
curl -H "Authorization: Bearer $DIGITALOCEAN_TOKEN" \
"https://api.digitalocean.com/v2/gen-ai/regions"
```
## Fabric setup
Run setup and select the DigitalOcean vendor:
```bash
fabric --setup
```
Then list models (requires `DIGITALOCEAN_TOKEN`) and pick the inference name:
```bash
fabric --listmodels
fabric --vendor DigitalOcean --model <inference_name> --pattern summarize
```
If you skip `DIGITALOCEAN_TOKEN`, you can still use Fabric by supplying the model name directly
based on the agent or model you created in DigitalOcean.
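For readers who want to call the inference endpoint outside of Fabric, the request shape follows the standard OpenAI chat completions convention. Below is a minimal Go sketch that builds such a request; the `/chat/completions` path and the `model`/`messages` body are assumptions based on OpenAI compatibility (only the base URL and bearer-key auth scheme come from this document), and the model name is a placeholder.

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
	"os"
)

// buildChatRequest constructs a request against DigitalOcean's
// OpenAI-compatible inference endpoint. The /chat/completions path
// is assumed from OpenAI compatibility; the base URL and the
// bearer-token auth scheme match the environment variables above.
func buildChatRequest(baseURL, key, model, prompt string) (*http.Request, error) {
	body, err := json.Marshal(map[string]any{
		"model": model,
		"messages": []map[string]string{
			{"role": "user", "content": prompt},
		},
	})
	if err != nil {
		return nil, err
	}
	req, err := http.NewRequest(http.MethodPost, baseURL+"/chat/completions", bytes.NewReader(body))
	if err != nil {
		return nil, err
	}
	req.Header.Set("Authorization", "Bearer "+key)
	req.Header.Set("Content-Type", "application/json")
	return req, nil
}

func main() {
	key := os.Getenv("DIGITALOCEAN_INFERENCE_KEY")
	// "example-model" is a placeholder; use an inference name from
	// `fabric --listmodels` or your DigitalOcean control panel.
	req, err := buildChatRequest("https://inference.do-ai.run/v1", key, "example-model", "Say hello")
	if err != nil {
		fmt.Println("error:", err)
		return
	}
	fmt.Println(req.Method, req.URL.String())
}
```

Sending the request with an `http.Client` and decoding the JSON response is left out; the sketch only shows where the model access key and base URL fit.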

View File

@@ -15,6 +15,7 @@ import (
"github.com/danielmiessler/fabric/internal/plugins/ai/anthropic"
"github.com/danielmiessler/fabric/internal/plugins/ai/azure"
"github.com/danielmiessler/fabric/internal/plugins/ai/bedrock"
"github.com/danielmiessler/fabric/internal/plugins/ai/digitalocean"
"github.com/danielmiessler/fabric/internal/plugins/ai/dryrun"
"github.com/danielmiessler/fabric/internal/plugins/ai/exolab"
"github.com/danielmiessler/fabric/internal/plugins/ai/gemini"
@@ -98,6 +99,7 @@ func NewPluginRegistry(db *fsdb.Db) (ret *PluginRegistry, err error) {
// Add non-OpenAI compatible clients
vendors = append(vendors,
openai.NewClient(),
digitalocean.NewClient(),
ollama.NewClient(),
azure.NewClient(),
gemini.NewClient(),

View File

@@ -0,0 +1,151 @@
package digitalocean
import (
"context"
"encoding/json"
"fmt"
"io"
"net/http"
"net/url"
"time"
"github.com/danielmiessler/fabric/internal/i18n"
"github.com/danielmiessler/fabric/internal/plugins"
"github.com/danielmiessler/fabric/internal/plugins/ai/openai"
)
const (
defaultInferenceBaseURL = "https://inference.do-ai.run/v1"
controlPlaneModelsURL = "https://api.digitalocean.com/v2/gen-ai/models"
errorResponseLimit = 1024
maxResponseSize = 10 * 1024 * 1024
)
type Client struct {
*openai.Client
ControlPlaneToken *plugins.SetupQuestion
httpClient *http.Client
}
type modelsResponse struct {
Models []modelDetails `json:"models"`
}
type modelDetails struct {
InferenceName string `json:"inference_name"`
Name string `json:"name"`
UUID string `json:"uuid"`
}
func NewClient() *Client {
base := openai.NewClientCompatibleNoSetupQuestions("DigitalOcean", nil)
base.ApiKey = base.AddSetupQuestion("Inference Key", true)
base.ApiBaseURL = base.AddSetupQuestion("Inference Base URL", false)
base.ApiBaseURL.Value = defaultInferenceBaseURL
base.ImplementsResponses = false
client := &Client{
Client: base,
}
client.ControlPlaneToken = client.AddSetupQuestion("Token", false)
return client
}
func (c *Client) ListModels() ([]string, error) {
if c.ControlPlaneToken.Value == "" {
models, err := c.Client.ListModels()
if err == nil && len(models) > 0 {
return models, nil
}
if err != nil {
return nil, fmt.Errorf(
"DigitalOcean model list unavailable: %w. Set DIGITALOCEAN_TOKEN to fetch models from the control plane",
err,
)
}
return nil, fmt.Errorf("DigitalOcean model list unavailable. Set DIGITALOCEAN_TOKEN to fetch models from the control plane")
}
return c.fetchModelsFromControlPlane(context.Background())
}
func (c *Client) fetchModelsFromControlPlane(ctx context.Context) ([]string, error) {
if ctx == nil {
ctx = context.Background()
}
fullURL, err := url.Parse(controlPlaneModelsURL)
if err != nil {
return nil, fmt.Errorf("failed to parse DigitalOcean control plane URL: %w", err)
}
req, err := http.NewRequestWithContext(ctx, http.MethodGet, fullURL.String(), nil)
if err != nil {
return nil, err
}
req.Header.Set("Authorization", fmt.Sprintf("Bearer %s", c.ControlPlaneToken.Value))
req.Header.Set("Accept", "application/json")
client := c.httpClient
if client == nil {
client = &http.Client{Timeout: 10 * time.Second}
}
resp, err := client.Do(req)
if err != nil {
return nil, err
}
defer resp.Body.Close()
if resp.StatusCode != http.StatusOK {
bodyBytes, readErr := io.ReadAll(io.LimitReader(resp.Body, errorResponseLimit))
if readErr != nil {
return nil, fmt.Errorf(
"DigitalOcean models request failed with status %d: %w",
resp.StatusCode,
readErr,
)
}
return nil, fmt.Errorf(
"DigitalOcean models request failed with status %d: %s",
resp.StatusCode,
string(bodyBytes),
)
}
bodyBytes, err := io.ReadAll(io.LimitReader(resp.Body, maxResponseSize+1))
if err != nil {
return nil, err
}
if len(bodyBytes) > maxResponseSize {
return nil, fmt.Errorf(i18n.T("openai_models_response_too_large"), c.GetName(), maxResponseSize)
}
var payload modelsResponse
if err := json.Unmarshal(bodyBytes, &payload); err != nil {
return nil, err
}
models := make([]string, 0, len(payload.Models))
seen := make(map[string]struct{}, len(payload.Models))
for _, model := range payload.Models {
var value string
switch {
case model.InferenceName != "":
value = model.InferenceName
case model.Name != "":
value = model.Name
case model.UUID != "":
value = model.UUID
}
if value == "" {
continue
}
if _, ok := seen[value]; ok {
continue
}
seen[value] = struct{}{}
models = append(models, value)
}
return models, nil
}

View File

@@ -206,6 +206,11 @@ var ProviderMap = map[string]ProviderConfig{
ModelsURL: "static:abacus", // Special marker for static model list
ImplementsResponses: false,
},
"Mammouth": {
Name: "Mammouth",
BaseURL: "https://api.mammouth.ai/v1",
ImplementsResponses: false,
},
}
// GetProviderByName returns the provider configuration for a given name with O(1) lookup

View File

@@ -1 +1 @@
"1.4.376"
"1.4.378"