Compare commits


49 Commits

Author SHA1 Message Date
github-actions[bot]
3584f83b30 chore(release): Update version to v1.4.276 2025-08-08 02:24:57 +00:00
Kayvan Sylvan
056791233a Merge pull request #1677 from ksylvan/0807-fix-release-notes-ci-cd-permission
Grant GITHUB_TOKEN write permissions for release notes job
2025-08-07 19:22:22 -07:00
Kayvan Sylvan
dc435dcc6e ci: add write permissions to update_release_notes job
## CHANGES

- Add contents write permission to release notes job
- Enable GitHub Actions to modify repository contents
- Fix potential permission issues during release process
2025-08-07 19:17:19 -07:00
github-actions[bot]
6edbc9dd38 chore(release): Update version to v1.4.275 2025-08-07 19:27:24 +00:00
Kayvan Sylvan
fd60d66c0d Merge pull request #1676 from ksylvan/0807-fix-gh-token-access-for-automated-release
Refactor authentication to support GITHUB_TOKEN and GH_TOKEN
2025-08-07 12:24:47 -07:00
Changelog Bot
08ec89bbe1 chore: incoming 1676 changelog entry 2025-08-07 12:16:21 -07:00
Kayvan Sylvan
836557f41c feat: add 'gpt-5' to raw-mode models in OpenAI client
## CHANGES
- Add gpt-5 to raw mode model requirements list.
- Ensure gpt-5 responses bypass structured chat message formatting.
- Align NeedsRawMode logic with expanded OpenAI model support.
2025-08-07 12:15:44 -07:00
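As a rough illustration of the change this commit describes, the sketch below shows the kind of prefix check a NeedsRawMode-style helper might perform. The function signature and the non-gpt-5 prefixes are assumptions for illustration only, not Fabric's actual OpenAI client code.

```go
package openai

import "strings"

// rawModePrefixes approximates the "raw mode model requirements list" the
// commit mentions; entries other than "gpt-5" are placeholders, not Fabric's
// real list.
var rawModePrefixes = []string{"o1", "o3", "gpt-5"}

// needsRawMode reports whether a model name should bypass structured chat
// message formatting and receive raw input instead.
func needsRawMode(model string) bool {
	for _, prefix := range rawModePrefixes {
		if strings.HasPrefix(model, prefix) {
			return true
		}
	}
	return false
}
```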
Kayvan Sylvan
f7c5c6d344 docs: document GetTokenFromEnv behavior and token environment fallback 2025-08-07 11:42:48 -07:00
Kayvan Sylvan
9d18ad523e docs: document GetTokenFromEnv behavior and token environment fallback 2025-08-07 11:38:19 -07:00
Changelog Bot
efcd7dcac2 chore: incoming 1676 changelog entry 2025-08-07 11:33:36 -07:00
Kayvan Sylvan
768e87879e refactor: centralize GitHub token retrieval logic into utility function
## CHANGES

- Extract token retrieval into `util.GetTokenFromEnv` function
- Support both `GITHUB_TOKEN` and `GH_TOKEN` environment variables
- Replace direct `os.Getenv` calls with utility function
- Add new `util/token.go` file for token handling
- Update walker.go to use centralized token logic
- Update main.go to use token utility function
2025-08-07 10:01:11 -07:00
github-actions[bot]
3c51cad614 chore(release): Update version to v1.4.274 2025-08-07 04:38:49 +00:00
Kayvan Sylvan
bc642904e0 Merge pull request #1673 from ksylvan/0806-update-anthropic-to-support-opus-4-1
Add Support for Claude Opus 4.1 Model
2025-08-06 21:36:19 -07:00
Changelog Bot
fa135036f4 chore: incoming 1673 changelog entry 2025-08-06 21:25:57 -07:00
Kayvan Sylvan
2d414ec394 fix: ensure Anthropic client always sets temperature to override API default
## CHANGES

- Always set temperature parameter for consistent behavior
- Prioritize TopP over temperature when explicitly set
- Override Anthropic's default 1.0 with Fabric's 0.7
- Add comprehensive tests for parameter precedence logic
- Update VSCode dictionary with Keploy entry
- Simplify conditional logic for temperature/TopP selection
2025-08-06 21:25:24 -07:00
Changelog Bot
9e72df9c6c chore: incoming 1673 changelog entry 2025-08-06 20:56:26 -07:00
Kayvan Sylvan
1a933e1c9a refactor: improve chat parameter defaults handling with domain constants
## CHANGES

- Add domain constants for default chat parameter values
- Update Anthropic client to check explicitly set parameters
- Add documentation linking CLI flags to domain defaults
- Improve temperature and TopP parameter selection logic
- Ensure consistent default values across CLI and domain
- Replace zero-value checks with explicit default comparisons
- Centralize chat option defaults in domain package
2025-08-06 20:56:08 -07:00
Changelog Bot
d5431f9843 chore: incoming 1673 changelog entry 2025-08-06 20:29:04 -07:00
Kayvan Sylvan
e2dabc406d ci: refactor release workflow to use shared version job and simplify OS handling 2025-08-06 20:18:46 -07:00
Changelog Bot
31f7f22629 chore: incoming 1673 changelog entry 2025-08-06 19:58:24 -07:00
Kayvan Sylvan
29aaf430ca fix: update anthropic SDK and refactor release workflow for release notes generation
## CHANGES

- Upgrade anthropic-sdk-go from v1.4.0 to v1.7.0
- Move changelog generation to separate workflow job
- Add Claude Opus 4.1 model support
- Fix temperature/topP parameter conflict for models
- Separate release artifact upload from changelog update
- Add dedicated update_release_notes job configuration
2025-08-06 19:50:36 -07:00
github-actions[bot]
9ef3518a07 chore(release): Update version to v1.4.273 2025-08-05 04:06:55 +00:00
Kayvan Sylvan
0b40bad986 Merge pull request #1671 from queryfast/main
chore: remove redundant words
2025-08-04 21:04:35 -07:00
queryfast
34ff4d30f2 chore: remove redundant words
Signed-off-by: queryfast <queryfast@outlook.com>
2025-08-05 11:16:43 +08:00
Kayvan Sylvan
1d9596bf3d Merge pull request #1660 from pbulteel/main 2025-07-29 19:18:44 -07:00
Patrick Bulteel
72d099d40a Fix typos in t_ patterns 2025-07-29 11:59:22 +01:00
github-actions[bot]
7ab6fe3baa chore(release): Update version to v1.4.272 2025-07-28 04:52:22 +00:00
Kayvan Sylvan
198964df82 Merge pull request #1658 from ksylvan/0727-fix-release-note-updates
Update Release Process for Data Consistency
2025-07-27 21:49:56 -07:00
Changelog Bot
f0998d3686 chore: incoming 1658 changelog entry 2025-07-27 21:47:56 -07:00
Kayvan Sylvan
75875ba9f5 chore: Update changelog cache db 2025-07-27 21:43:39 -07:00
Kayvan Sylvan
ea009ff64b feat: add database sync before generating changelog in release workflow
### CHANGES
- Add database sync command to release workflow
- Ensure changelog generation includes latest database updates
2025-07-27 21:40:04 -07:00
github-actions[bot]
3c317f088b chore(release): Update version to v1.4.271 2025-07-28 04:33:07 +00:00
Kayvan Sylvan
f91ee2ce3c Merge pull request #1657 from ksylvan/0727-automated-release-notes
Add GitHub Release Description Update Feature
2025-07-27 21:30:40 -07:00
Changelog Bot
98968d972f chore: incoming 1657 changelog entry 2025-07-27 21:24:46 -07:00
Kayvan Sylvan
8ea264e96c feat: add GitHub release description update with AI summary
## CHANGES

- Add `--release` flag to command line options documentation
- Enable AI summary updates for GitHub releases
- Support version-specific release description updates
- Reorder internal package imports for consistency
2025-07-27 21:22:30 -07:00
Kayvan Sylvan
5203cba5a7 feat: add GitHub release description update via --release flag
### CHANGES

- Add `--release` flag to generate_changelog to update GitHub release
- Implement `ReleaseManager` for managing release descriptions
- Create `release.go` for handling release updates
- Update `release.yml` to run changelog generation
- Ensure mutual exclusivity for `--release` with other flags
- Modify `Config` struct to include `Release` field
- Update `main.go` to handle new release functionality
2025-07-27 21:12:04 -07:00
github-actions[bot]
f5fba12360 chore(release): Update version to v1.4.270 2025-07-27 05:39:14 +00:00
Kayvan Sylvan
d7cc3ff8f1 Merge pull request #1654 from ksylvan/0726-prevent-file-overwrite-and-send-file-create-message-to-stderr
Refine Output File Handling for Safety
2025-07-26 22:36:43 -07:00
Changelog Bot
4887cdc353 chore: incoming 1654 changelog entry 2025-07-26 22:29:17 -07:00
Kayvan Sylvan
6aa38d2abc fix: prevent file overwrite and improve output messaging in CreateOutputFile
## CHANGES

- Add file existence check before creating output file
- Return error if target file already exists
- Change success message to write to stderr
- Update message format with brackets for clarity
- Prevent accidental file overwrites during output creation
2025-07-26 20:16:47 -07:00
github-actions[bot]
737e37f00e chore(release): Update version to v1.4.269 2025-07-26 23:37:08 +00:00
Kayvan Sylvan
42bb72ab65 Merge pull request #1653 from ksylvan/0726-minor-fix-for-gemini-tts-models
docs: update Gemini TTS model references to gemini-2.5-flash-preview-tts
2025-07-26 16:34:38 -07:00
Changelog Bot
612ae4e3b5 chore: incoming 1653 changelog entry 2025-07-26 16:26:28 -07:00
Kayvan Sylvan
27f9134912 docs: update Gemini TTS model references to gemini-2.5-flash-preview-tts
## CHANGES

- Update documentation examples to use gemini-2.5-flash-preview-tts
- Replace gemini-2.0-flash-tts references throughout Gemini-TTS.md
- Update voice selection example commands
- Modify CLI help text example command
- Update changelog database binary file
2025-07-26 16:23:56 -07:00
github-actions[bot]
c02718855d chore(release): Update version to v1.4.268 2025-07-26 23:03:37 +00:00
Kayvan Sylvan
4f16222b31 Merge pull request #1652 from ksylvan/0726-gemini-tts-voices
Implement Voice Selection for Gemini Text-to-Speech
2025-07-26 16:01:07 -07:00
Kayvan Sylvan
8c27b34d0f chore: differentiate voice descriptions 2025-07-26 15:58:29 -07:00
Changelog Bot
0b71b54698 chore: incoming 1652 changelog entry 2025-07-26 15:23:00 -07:00
Kayvan Sylvan
614b1322d5 feat: add Gemini TTS voice selection and listing functionality
## CHANGES

- Add `--voice` flag for TTS voice selection
- Add `--list-gemini-voices` command for voice discovery
- Implement voice validation for Gemini TTS models
- Update shell completions for voice options
- Add comprehensive Gemini TTS documentation
- Create voice samples directory structure
- Extend spell checker dictionary with voice names
2025-07-26 15:11:30 -07:00
47 changed files with 876 additions and 86 deletions

View File

@@ -27,8 +27,39 @@ jobs:
- name: Run tests
run: go test -v ./...
get_version:
name: Get version
runs-on: ubuntu-latest
outputs:
latest_tag: ${{ steps.get_version.outputs.latest_tag }}
steps:
- name: Checkout code
uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Get version from source
id: get_version
shell: bash
run: |
if [ ! -f "nix/pkgs/fabric/version.nix" ]; then
echo "Error: version.nix file not found"
exit 1
fi
version=$(cat nix/pkgs/fabric/version.nix | tr -d '"' | tr -cd '0-9.')
if [ -z "$version" ]; then
echo "Error: version is empty"
exit 1
fi
if ! echo "$version" | grep -E '^[0-9]+\.[0-9]+\.[0-9]+' > /dev/null; then
echo "Error: Invalid version format: $version"
exit 1
fi
echo "latest_tag=v$version" >> $GITHUB_OUTPUT
build:
name: Build binaries for Windows, macOS, and Linux
needs: [test, get_version]
runs-on: ${{ matrix.os }}
permissions:
contents: write
@@ -51,25 +82,14 @@ jobs:
with:
go-version-file: ./go.mod
- name: Determine OS Name
id: os-name
run: |
if [ "${{ matrix.os }}" == "ubuntu-latest" ]; then
echo "OS=linux" >> $GITHUB_ENV
elif [ "${{ matrix.os }}" == "macos-latest" ]; then
echo "OS=darwin" >> $GITHUB_ENV
else
echo "OS=windows" >> $GITHUB_ENV
fi
shell: bash
- name: Build binary on Linux and macOS
if: matrix.os != 'windows-latest'
env:
GOOS: ${{ env.OS }}
GOOS: ${{ matrix.os == 'ubuntu-latest' && 'linux' || 'darwin' }}
GOARCH: ${{ matrix.arch }}
run: |
go build -o fabric-${OS}-${{ matrix.arch }} ./cmd/fabric
OS_NAME="${{ matrix.os == 'ubuntu-latest' && 'linux' || 'darwin' }}"
go build -o fabric-${OS_NAME}-${{ matrix.arch }} ./cmd/fabric
- name: Build binary on Windows
if: matrix.os == 'windows-latest'
@@ -83,8 +103,8 @@ jobs:
if: matrix.os != 'windows-latest'
uses: actions/upload-artifact@v4
with:
name: fabric-${OS}-${{ matrix.arch }}
path: fabric-${OS}-${{ matrix.arch }}
name: fabric-${{ matrix.os == 'ubuntu-latest' && 'linux' || 'darwin' }}-${{ matrix.arch }}
path: fabric-${{ matrix.os == 'ubuntu-latest' && 'linux' || 'darwin' }}-${{ matrix.arch }}
- name: Upload build artifact
if: matrix.os == 'windows-latest'
@@ -93,34 +113,15 @@ jobs:
name: fabric-windows-${{ matrix.arch }}.exe
path: fabric-windows-${{ matrix.arch }}.exe
- name: Get version from source
id: get_version
shell: bash
run: |
if [ ! -f "nix/pkgs/fabric/version.nix" ]; then
echo "Error: version.nix file not found"
exit 1
fi
version=$(cat nix/pkgs/fabric/version.nix | tr -d '"' | tr -cd '0-9.')
if [ -z "$version" ]; then
echo "Error: version is empty"
exit 1
fi
if ! echo "$version" | grep -E '^[0-9]+\.[0-9]+\.[0-9]+' > /dev/null; then
echo "Error: Invalid version format: $version"
exit 1
fi
echo "latest_tag=v$version" >> $GITHUB_ENV
- name: Create release if it doesn't exist
shell: bash
env:
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: |
if ! gh release view ${{ env.latest_tag }} >/dev/null 2>&1; then
gh release create ${{ env.latest_tag }} --title "Release ${{ env.latest_tag }}" --notes "Automated release for ${{ env.latest_tag }}"
if ! gh release view ${{ needs.get_version.outputs.latest_tag }} >/dev/null 2>&1; then
gh release create ${{ needs.get_version.outputs.latest_tag }} --title "Release ${{ needs.get_version.outputs.latest_tag }}" --notes "Automated release for ${{ needs.get_version.outputs.latest_tag }}"
else
echo "Release ${{ env.latest_tag }} already exists."
echo "Release ${{ needs.get_version.outputs.latest_tag }} already exists."
fi
- name: Upload release artifact
@@ -128,11 +129,35 @@ jobs:
env:
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: |
gh release upload ${{ env.latest_tag }} fabric-windows-${{ matrix.arch }}.exe
gh release upload ${{ needs.get_version.outputs.latest_tag }} fabric-windows-${{ matrix.arch }}.exe
- name: Upload release artifact
if: matrix.os != 'windows-latest'
env:
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: |
gh release upload ${{ env.latest_tag }} fabric-${OS}-${{ matrix.arch }}
OS_NAME="${{ matrix.os == 'ubuntu-latest' && 'linux' || 'darwin' }}"
gh release upload ${{ needs.get_version.outputs.latest_tag }} fabric-${OS_NAME}-${{ matrix.arch }}
update_release_notes:
needs: [build, get_version]
runs-on: ubuntu-latest
permissions:
contents: write
steps:
- name: Checkout code
uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Set up Go
uses: actions/setup-go@v4
with:
go-version-file: ./go.mod
- name: Update release description
env:
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: |
go run ./cmd/generate_changelog --sync-db
go run ./cmd/generate_changelog --release ${{ needs.get_version.outputs.latest_tag }}

.vscode/settings.json (vendored, 17 changes)
View File

@@ -1,23 +1,32 @@
{
"cSpell.words": [
"Achird",
"addextension",
"adduser",
"AIML",
"anthropics",
"Aoede",
"atotto",
"Autonoe",
"badfile",
"Behrens",
"blindspots",
"Bombal",
"Callirhoe",
"Callirrhoe",
"Cerebras",
"compadd",
"compdef",
"compinit",
"creatordate",
"curcontext",
"custompatterns",
"danielmiessler",
"davidanson",
"Debugf",
"dedup",
"deepseek",
"Despina",
"direnv",
"dryrun",
"dsrp",
@@ -25,6 +34,7 @@
"Eisler",
"elif",
"envrc",
"Erinome",
"Errorf",
"eugeis",
"Eugen",
@@ -63,9 +73,11 @@
"jessevdk",
"Jina",
"joho",
"Keploy",
"Kore",
"ksylvan",
"Langdock",
"Laomedeia",
"ldflags",
"libexec",
"listcontexts",
@@ -89,6 +101,7 @@
"openaiapi",
"opencode",
"openrouter",
"Orus",
"otiai",
"pdflatex",
"pipx",
@@ -97,11 +110,14 @@
"presencepenalty",
"printcontext",
"printsession",
"Pulcherrima",
"pycache",
"pyperclip",
"readystream",
"restapi",
"rmextension",
"Sadachbia",
"Sadaltager",
"samber",
"sashabaranov",
"sdist",
@@ -112,6 +128,7 @@
"Streamlit",
"stretchr",
"subchunk",
"Sulafat",
"talkpanel",
"Telos",
"testpattern",

View File

@@ -1,5 +1,92 @@
# Changelog
## v1.4.276 (2025-08-08)
### Direct commits
- Ci: add write permissions to update_release_notes job
- Add contents write permission to release notes job
- Enable GitHub Actions to modify repository contents
- Fix potential permission issues during release process
## v1.4.275 (2025-08-07)
### PR [#1676](https://github.com/danielmiessler/Fabric/pull/1676) by [ksylvan](https://github.com/ksylvan): Refactor authentication to support GITHUB_TOKEN and GH_TOKEN
- Refactor: centralize GitHub token retrieval logic into utility function
- Support both GITHUB_TOKEN and GH_TOKEN environment variables with fallback handling
- Add new util/token.go file for centralized token handling across the application
- Update walker.go and main.go to use the new centralized token utility function
- Feat: add 'gpt-5' to raw-mode models in OpenAI client to bypass structured chat message formatting
## v1.4.274 (2025-08-07)
### PR [#1673](https://github.com/danielmiessler/Fabric/pull/1673) by [ksylvan](https://github.com/ksylvan): Add Support for Claude Opus 4.1 Model
- Add Claude Opus 4.1 model support
- Upgrade anthropic-sdk-go from v1.4.0 to v1.7.0
- Fix temperature/topP parameter conflict for models
- Refactor release workflow to use shared version job and simplify OS handling
- Improve chat parameter defaults handling with domain constants
## v1.4.273 (2025-08-05)
### Direct commits
- Chore: remove redundant words
Signed-off-by: queryfast <queryfast@outlook.com>
- Fix typos in t_ patterns
## v1.4.272 (2025-07-28)
### PR [#1658](https://github.com/danielmiessler/Fabric/pull/1658) by [ksylvan](https://github.com/ksylvan): Update Release Process for Data Consistency
- Add database sync before generating changelog in release workflow
- Ensure changelog generation includes latest database updates
- Update changelog cache database
## v1.4.271 (2025-07-28)
### PR [#1657](https://github.com/danielmiessler/Fabric/pull/1657) by [ksylvan](https://github.com/ksylvan): Add GitHub Release Description Update Feature
- Add GitHub release description update via `--release` flag
- Implement `ReleaseManager` for managing release descriptions
- Create `release.go` for handling release updates
- Update `release.yml` to run changelog generation
- Enable AI summary updates for GitHub releases
## v1.4.270 (2025-07-27)
### PR [#1654](https://github.com/danielmiessler/Fabric/pull/1654) by [ksylvan](https://github.com/ksylvan): Refine Output File Handling for Safety
- Fix: prevent file overwrite and improve output messaging in CreateOutputFile
- Add file existence check before creating output file
- Return error if target file already exists
- Change success message to write to stderr
- Update message format with brackets for clarity
## v1.4.269 (2025-07-26)
### PR [#1653](https://github.com/danielmiessler/Fabric/pull/1653) by [ksylvan](https://github.com/ksylvan): docs: update Gemini TTS model references to gemini-2.5-flash-preview-tts
- Updated Gemini TTS model references from gemini-2.0-flash-tts to gemini-2.5-flash-preview-tts throughout documentation
- Modified documentation examples to use the new gemini-2.5-flash-preview-tts model
- Updated voice selection example commands in Gemini-TTS.md
- Revised CLI help text example commands to reflect model changes
- Updated changelog database binary file
## v1.4.268 (2025-07-26)
### PR [#1652](https://github.com/danielmiessler/Fabric/pull/1652) by [ksylvan](https://github.com/ksylvan): Implement Voice Selection for Gemini Text-to-Speech
- Feat: add Gemini TTS voice selection and listing functionality
- Add `--voice` flag for TTS voice selection
- Add `--list-gemini-voices` command for voice discovery
- Implement voice validation for Gemini TTS models
- Update shell completions for voice options
## v1.4.267 (2025-07-26)
### PR [#1650](https://github.com/danielmiessler/Fabric/pull/1650) by [ksylvan](https://github.com/ksylvan): Update Gemini Plugin to New SDK with TTS Support

View File

@@ -548,6 +548,9 @@ Application Options:
--think-start-tag= Start tag for thinking sections (default: <think>)
--think-end-tag= End tag for thinking sections (default: </think>)
--disable-responses-api Disable OpenAI Responses API (default: false)
--voice= TTS voice name for supported models (e.g., Kore, Charon, Puck)
(default: Kore)
--list-gemini-voices List all available Gemini TTS voices
Help Options:
-h, --help Show this help message

View File

@@ -1,3 +1,3 @@
package main
var version = "v1.4.267"
var version = "v1.4.276"

View File

@@ -101,6 +101,7 @@ generate_changelog --cache /path/to/cache.db
| `--force-pr-sync` | | Force a full PR sync from GitHub | false |
| `--token` | | GitHub API token | `$GITHUB_TOKEN` |
| `--ai-summarize` | | Generate AI-enhanced summaries using Fabric | false |
| `--release` | | Update GitHub release description with AI summary for version | |
## Output Format

Binary file not shown.

View File

@@ -17,4 +17,5 @@ type Config struct {
IncomingDir string
Push bool
SyncDB bool
Release string
}

View File

@@ -2,12 +2,12 @@ package git
import (
"fmt"
"os"
"regexp"
"strconv"
"strings"
"time"
"github.com/danielmiessler/fabric/cmd/generate_changelog/util"
"github.com/go-git/go-git/v5"
"github.com/go-git/go-git/v5/plumbing"
"github.com/go-git/go-git/v5/plumbing/object"
@@ -520,7 +520,7 @@ func (w *Walker) PushToRemote() error {
pushOptions := &git.PushOptions{}
// Check if we have a GitHub token for authentication
if githubToken := os.Getenv("GITHUB_TOKEN"); githubToken != "" {
if githubToken := util.GetTokenFromEnv(""); githubToken != "" {
// Get remote URL to check if it's a GitHub repository
remotes, err := w.repo.Remotes()
if err == nil && len(remotes) > 0 {

View File

@@ -0,0 +1,81 @@
package internal
import (
"context"
"fmt"
"github.com/danielmiessler/fabric/cmd/generate_changelog/internal/cache"
"github.com/danielmiessler/fabric/cmd/generate_changelog/internal/config"
"github.com/google/go-github/v66/github"
"golang.org/x/oauth2"
)
type ReleaseManager struct {
cache *cache.Cache
githubToken string
owner string
repo string
}
func NewReleaseManager(cfg *config.Config) (*ReleaseManager, error) {
cache, err := cache.New(cfg.CacheFile)
if err != nil {
return nil, fmt.Errorf("failed to create cache: %w", err)
}
return &ReleaseManager{
cache: cache,
githubToken: cfg.GitHubToken,
owner: "danielmiessler",
repo: "fabric",
}, nil
}
func (rm *ReleaseManager) Close() error {
return rm.cache.Close()
}
func (rm *ReleaseManager) UpdateReleaseDescription(version string) error {
versions, err := rm.cache.GetVersions()
if err != nil {
return fmt.Errorf("failed to get versions from cache: %w", err)
}
versionData, exists := versions[version]
if !exists {
return fmt.Errorf("version %s not found in versions table", version)
}
if versionData.AISummary == "" {
return fmt.Errorf("ai_summary is empty for version %s", version)
}
releaseBody := fmt.Sprintf("## Changes\n\n%s", versionData.AISummary)
ctx := context.Background()
var client *github.Client
if rm.githubToken != "" {
ts := oauth2.StaticTokenSource(
&oauth2.Token{AccessToken: rm.githubToken},
)
tc := oauth2.NewClient(ctx, ts)
client = github.NewClient(tc)
} else {
client = github.NewClient(nil)
}
release, _, err := client.Repositories.GetReleaseByTag(ctx, rm.owner, rm.repo, version)
if err != nil {
return fmt.Errorf("failed to get release for version %s: %w", version, err)
}
release.Body = &releaseBody
_, _, err = client.Repositories.EditRelease(ctx, rm.owner, rm.repo, *release.ID, release)
if err != nil {
return fmt.Errorf("failed to update release description for version %s: %w", version, err)
}
fmt.Printf("Successfully updated release description for %s\n", version)
return nil
}

View File

@@ -5,8 +5,10 @@ import (
"os"
"path/filepath"
"github.com/danielmiessler/fabric/cmd/generate_changelog/internal"
"github.com/danielmiessler/fabric/cmd/generate_changelog/internal/changelog"
"github.com/danielmiessler/fabric/cmd/generate_changelog/internal/config"
"github.com/danielmiessler/fabric/cmd/generate_changelog/util"
"github.com/joho/godotenv"
"github.com/spf13/cobra"
)
@@ -42,6 +44,7 @@ func init() {
rootCmd.Flags().StringVar(&cfg.IncomingDir, "incoming-dir", "./cmd/generate_changelog/incoming", "Directory for incoming PR files")
rootCmd.Flags().BoolVar(&cfg.Push, "push", false, "Enable automatic git push after creating an incoming entry")
rootCmd.Flags().BoolVar(&cfg.SyncDB, "sync-db", false, "Synchronize and validate database integrity with git history and GitHub PRs")
rootCmd.Flags().StringVar(&cfg.Release, "release", "", "Update GitHub release description with AI summary for version (e.g., v1.2.3)")
}
func run(cmd *cobra.Command, args []string) error {
@@ -49,10 +52,12 @@ func run(cmd *cobra.Command, args []string) error {
return fmt.Errorf("--incoming-pr and --process-prs are mutually exclusive flags")
}
if cfg.GitHubToken == "" {
cfg.GitHubToken = os.Getenv("GITHUB_TOKEN")
if cfg.Release != "" && (cfg.IncomingPR > 0 || cfg.ProcessPRsVersion != "" || cfg.SyncDB) {
return fmt.Errorf("--release cannot be used with other processing flags")
}
cfg.GitHubToken = util.GetTokenFromEnv(cfg.GitHubToken)
generator, err := changelog.New(cfg)
if err != nil {
return fmt.Errorf("failed to create changelog generator: %w", err)
@@ -70,6 +75,15 @@ func run(cmd *cobra.Command, args []string) error {
return generator.SyncDatabase()
}
if cfg.Release != "" {
releaseManager, err := internal.NewReleaseManager(cfg)
if err != nil {
return fmt.Errorf("failed to create release manager: %w", err)
}
defer releaseManager.Close()
return releaseManager.UpdateReleaseDescription(cfg.Release)
}
output, err := generator.Generate()
if err != nil {
return fmt.Errorf("failed to generate changelog: %w", err)

View File

@@ -0,0 +1,31 @@
package util
import (
"os"
)
// GetTokenFromEnv returns a GitHub token based on the following precedence order:
// 1. If tokenValue is non-empty, it is returned.
// 2. Otherwise, if the GITHUB_TOKEN environment variable is set, its value is returned.
// 3. Otherwise, if the GH_TOKEN environment variable is set, its value is returned.
// 4. If none of the above are set, an empty string is returned.
//
// Example:
//
// os.Setenv("GITHUB_TOKEN", "abc")
// os.Setenv("GH_TOKEN", "def")
// GetTokenFromEnv("xyz") // returns "xyz"
// GetTokenFromEnv("") // returns "abc"
// os.Unsetenv("GITHUB_TOKEN")
// GetTokenFromEnv("") // returns "def"
// os.Unsetenv("GH_TOKEN")
// GetTokenFromEnv("") // returns ""
func GetTokenFromEnv(tokenValue string) string {
if tokenValue == "" {
tokenValue = os.Getenv("GITHUB_TOKEN")
if tokenValue == "" {
tokenValue = os.Getenv("GH_TOKEN")
}
}
return tokenValue
}

View File

@@ -14,16 +14,19 @@ _fabric_models() {
models=(${(f)"$(fabric --listmodels --shell-complete-list 2>/dev/null)"})
compadd -X "Models:" ${models}
}
_fabric_contexts() {
local -a contexts
contexts=(${(f)"$(fabric --listcontexts --shell-complete-list 2>/dev/null)"})
compadd -X "Contexts:" ${contexts}
}
_fabric_sessions() {
local -a sessions
sessions=(${(f)"$(fabric --listsessions --shell-complete-list 2>/dev/null)"})
compadd -X "Sessions:" ${sessions}
}
_fabric_strategies() {
local -a strategies
strategies=(${(f)"$(fabric --liststrategies --shell-complete-list 2>/dev/null)"})
@@ -34,14 +37,12 @@ _fabric_extensions() {
local -a extensions
extensions=(${(f)"$(fabric --listextensions --shell-complete-list 2>/dev/null)"})
compadd -X "Extensions:" ${extensions}
'(-L --listmodels)'{-L,--listmodels}'[List all available models]:list models:_fabric_models' \
'(-x --listcontexts)'{-x,--listcontexts}'[List all contexts]:list contexts:_fabric_contexts' \
'(-X --listsessions)'{-X,--listsessions}'[List all sessions]:list sessions:_fabric_sessions' \
'(--listextensions)--listextensions[List all registered extensions]' \
'(--liststrategies)--liststrategies[List all strategies]:list strategies:_fabric_strategies' \
'(--listvendors)--listvendors[List all vendors]' \
vendors=(${(f)"$(fabric --listvendors 2>/dev/null)"})
compadd -X "Vendors:" ${vendors}
}
_fabric_gemini_voices() {
local -a voices
voices=(${(f)"$(fabric --list-gemini-voices --shell-complete-list 2>/dev/null)"})
compadd -X "Gemini TTS Voices:" ${voices}
}
_fabric() {
@@ -109,6 +110,8 @@ _fabric() {
'(--strategy)--strategy[Choose a strategy from the available strategies]:strategy:_fabric_strategies' \
'(--liststrategies)--liststrategies[List all strategies]' \
'(--listvendors)--listvendors[List all vendors]' \
'(--voice)--voice[TTS voice name for supported models]:voice:_fabric_gemini_voices' \
'(--list-gemini-voices)--list-gemini-voices[List all available Gemini TTS voices]' \
'(--shell-complete-list)--shell-complete-list[Output raw list without headers/formatting (for shell completion)]' \
'(--suppress-think)--suppress-think[Suppress text enclosed in thinking tags]' \
'(--think-start-tag)--think-start-tag[Start tag for thinking sections (default: <think>)]:start tag:' \
@@ -119,4 +122,3 @@ _fabric() {
}
_fabric "$@"

View File

@@ -13,7 +13,7 @@ _fabric() {
_get_comp_words_by_ref -n : cur prev words cword
# Define all possible options/flags
local opts="--pattern -p --variable -v --context -C --session --attachment -a --setup -S --temperature -t --topp -T --stream -s --presencepenalty -P --raw -r --frequencypenalty -F --listpatterns -l --listmodels -L --listcontexts -x --listsessions -X --updatepatterns -U --copy -c --model -m --modelContextLength --output -o --output-session --latest -n --changeDefaultModel -d --youtube -y --playlist --transcript --transcript-with-timestamps --comments --metadata --language -g --scrape_url -u --scrape_question -q --seed -e --wipecontext -w --wipesession -W --printcontext --printsession --readability --input-has-vars --dry-run --serve --serveOllama --address --api-key --config --search --search-location --image-file --image-size --image-quality --image-compression --image-background --suppress-think --think-start-tag --think-end-tag --disable-responses-api --version --listextensions --addextension --rmextension --strategy --liststrategies --listvendors --shell-complete-list --help -h"
local opts="--pattern -p --variable -v --context -C --session --attachment -a --setup -S --temperature -t --topp -T --stream -s --presencepenalty -P --raw -r --frequencypenalty -F --listpatterns -l --listmodels -L --listcontexts -x --listsessions -X --updatepatterns -U --copy -c --model -m --modelContextLength --output -o --output-session --latest -n --changeDefaultModel -d --youtube -y --playlist --transcript --transcript-with-timestamps --comments --metadata --language -g --scrape_url -u --scrape_question -q --seed -e --wipecontext -w --wipesession -W --printcontext --printsession --readability --input-has-vars --dry-run --serve --serveOllama --address --api-key --config --search --search-location --image-file --image-size --image-quality --image-compression --image-background --suppress-think --think-start-tag --think-end-tag --disable-responses-api --voice --list-gemini-voices --version --listextensions --addextension --rmextension --strategy --liststrategies --listvendors --shell-complete-list --help -h"
# Helper function for dynamic completions
_fabric_get_list() {
@@ -62,6 +62,10 @@ _fabric() {
COMPREPLY=($(compgen -W "$(_fabric_get_list --liststrategies)" -- "${cur}"))
return 0
;;
--voice)
COMPREPLY=($(compgen -W "$(_fabric_get_list --list-gemini-voices)" -- "${cur}"))
return 0
;;
# Options requiring file/directory paths
-a | --attachment | -o | --output | --config | --addextension | --image-file)
_filedir

View File

@@ -31,6 +31,10 @@ function __fabric_get_extensions
fabric --listextensions --shell-complete-list 2>/dev/null
end
function __fabric_get_gemini_voices
fabric --list-gemini-voices --shell-complete-list 2>/dev/null
end
# Main completion function
complete -c fabric -f
@@ -71,6 +75,7 @@ complete -c fabric -l rmextension -d "Remove a registered extension by name" -a
complete -c fabric -l strategy -d "Choose a strategy from the available strategies" -a "(__fabric_get_strategies)"
complete -c fabric -l think-start-tag -d "Start tag for thinking sections (default: <think>)"
complete -c fabric -l think-end-tag -d "End tag for thinking sections (default: </think>)"
complete -c fabric -l voice -d "TTS voice name for supported models (e.g., Kore, Charon, Puck)" -a "(__fabric_get_gemini_voices)"
# Boolean flags (no arguments)
complete -c fabric -s S -l setup -d "Run setup for all reconfigurable parts of fabric"
@@ -99,6 +104,7 @@ complete -c fabric -l version -d "Print current version"
complete -c fabric -l listextensions -d "List all registered extensions"
complete -c fabric -l liststrategies -d "List all strategies"
complete -c fabric -l listvendors -d "List all vendors"
complete -c fabric -l list-gemini-voices -d "List all available Gemini TTS voices"
complete -c fabric -l shell-complete-list -d "Output raw list without headers/formatting (for shell completion)"
complete -c fabric -l suppress-think -d "Suppress text enclosed in thinking tags"
complete -c fabric -l disable-responses-api -d "Disable OpenAI Responses API (default: false)"

View File

@@ -6,7 +6,7 @@ You are an expert at understanding deep context about a person or entity, and th
1. Read the incoming TELOS File thoroughly. Fully understand everything about this person or entity.
2. Deeply study the input instruction or question.
3. Spend significant time and effort thinking about how these two are related, and what would be the best possible ouptut for the person who sent the input.
3. Spend significant time and effort thinking about how these two are related, and what would be the best possible output for the person who sent the input.
4. Write 8 16-word bullets describing how well or poorly I'm addressing my challenges. Call me out if I'm not putting work into them, and/or if you can see evidence of them affecting me in my journal or elsewhere.
# OUTPUT INSTRUCTIONS

View File

@@ -6,7 +6,7 @@ You are an expert at understanding deep context about a person or entity, and th
1. Read the incoming TELOS File thoroughly. Fully understand everything about this person or entity.
2. Deeply study the input instruction or question.
3. Spend significant time and effort thinking about how these two are related, and what would be the best possible ouptut for the person who sent the input.
3. Spend significant time and effort thinking about how these two are related, and what would be the best possible output for the person who sent the input.
4. Check this person's Metrics or KPIs (M's or K's) to see their current state and if they've been improved recently.
# OUTPUT INSTRUCTIONS

View File

@@ -6,7 +6,7 @@ You are an expert at understanding deep context about a person or entity, and th
1. Read the incoming TELOS File thoroughly. Fully understand everything about this person or entity.
2. Deeply study the input instruction or question.
3. Spend significant time and effort thinking about how these two are related, and what would be the best possible ouptut for the person who sent the input.
3. Spend significant time and effort thinking about how these two are related, and what would be the best possible output for the person who sent the input.
4. Analyze everything in my TELOS file and think about what I could and should do after my legacy corporate / technical skills are automated away. What can I contribute that's based on human-to-human interaction and exchanges of value?
# OUTPUT INSTRUCTIONS

View File

@@ -6,7 +6,7 @@ You are an expert at understanding deep context about a person or entity, and th
1. Read the incoming TELOS File thoroughly. Fully understand everything about this person or entity.
2. Deeply study the input instruction or question.
3. Spend significant time and effort thinking about how these two are related, and what would be the best possible ouptut for the person who sent the input.
3. Spend significant time and effort thinking about how these two are related, and what would be the best possible output for the person who sent the input.
4. Write 4 32-word bullets describing who I am and what I do in a non-douchey way. Use the who I am, the problem I see in the world, and what I'm doing about it as the template. Something like:
a. I'm a programmer by trade, and one thing that really bothers me is kids being so stuck inside of tech and games. So I started a school where I teach kids to build things with their hands.

View File

@@ -6,7 +6,7 @@ You are an expert at understanding deep context about a person or entity, and th
1. Read the incoming TELOS File thoroughly. Fully understand everything about this person or entity.
2. Deeply study the input instruction or question.
3. Spend significant time and effort thinking about how these two are related, and what would be the best possible ouptut for the person who sent the input.
3. Spend significant time and effort thinking about how these two are related, and what would be the best possible output for the person who sent the input.
4. Write 5 16-word bullets describing this person's life outlook.
# OUTPUT INSTRUCTIONS

View File

@@ -6,7 +6,7 @@ You are an expert at understanding deep context about a person or entity, and th
1. Read the incoming TELOS File thoroughly. Fully understand everything about this person or entity.
2. Deeply study the input instruction or question.
3. Spend significant time and effort thinking about how these two are related, and what would be the best possible ouptut for the person who sent the input.
3. Spend significant time and effort thinking about how these two are related, and what would be the best possible output for the person who sent the input.
4. Write 5 16-word bullets describing who this person is, what they do, and what they're working on. The goal is to concisely and confidently project who they are while being humble and grounded.
# OUTPUT INSTRUCTIONS

View File

@@ -6,7 +6,7 @@ You are an expert at understanding deep context about a person or entity, and th
1. Read the incoming TELOS File thoroughly. Fully understand everything about this person or entity.
2. Deeply study the input instruction or question.
3. Spend significant time and effort thinking about how these two are related, and what would be the best possible ouptut for the person who sent the input.
3. Spend significant time and effort thinking about how these two are related, and what would be the best possible output for the person who sent the input.
4. Write 5 48-word bullet points, each including a 3-5 word panel title, that would be wonderful panels for this person to participate on.
5. Write them so that they'd be good panels for others to participate in as well, not just me.

View File

@@ -6,7 +6,7 @@ You are an expert at understanding deep context about a person or entity, and th
1. Read the incoming TELOS File thoroughly. Fully understand everything about this person or entity.
2. Deeply study the input instruction or question.
3. Spend significant time and effort thinking about how these two are related, and what would be the best possible ouptut for the person who sent the input.
3. Spend significant time and effort thinking about how these two are related, and what would be the best possible output for the person who sent the input.
4. Write 8 16-word bullets describing possible blindspots in my thinking, i.e., flaws in my frames or models that might leave me exposed to error or risk.
# OUTPUT INSTRUCTIONS

View File

@@ -6,7 +6,7 @@ You are an expert at understanding deep context about a person or entity, and th
1. Read the incoming TELOS File thoroughly. Fully understand everything about this person or entity.
2. Deeply study the input instruction or question.
3. Spend significant time and effort thinking about how these two are related, and what would be the best possible ouptut for the person who sent the input.
3. Spend significant time and effort thinking about how these two are related, and what would be the best possible output for the person who sent the input.
4. Write 4 16-word bullets identifying negative thinking either in my main document or in my journal.
5. Add some tough love encouragement (not fluff) to help get me out of that mindset.

View File

@@ -6,7 +6,7 @@ You are an expert at understanding deep context about a person or entity, and th
1. Read the incoming TELOS File thoroughly. Fully understand everything about this person or entity.
2. Deeply study the input instruction or question.
3. Spend significant time and effort thinking about how these two are related, and what would be the best possible ouptut for the person who sent the input.
3. Spend significant time and effort thinking about how these two are related, and what would be the best possible output for the person who sent the input.
4. Write 5 16-word bullets describing which of their goals and/or projects don't seem to have been worked on recently.
# OUTPUT INSTRUCTIONS

View File

@@ -6,7 +6,7 @@ You are an expert at understanding deep context about a person or entity, and th
1. Read the incoming TELOS File thoroughly. Fully understand everything about this person or entity.
2. Deeply study the input instruction or question.
3. Spend significant time and effort thinking about how these two are related, and what would be the best possible ouptut for the person who sent the input.
3. Spend significant time and effort thinking about how these two are related, and what would be the best possible output for the person who sent the input.
4. Write 8 16-word bullets looking at what I'm trying to do, and any progress I've made, and give some encouragement on the positive aspects and recommendations to continue the work.
# OUTPUT INSTRUCTIONS

View File

@@ -6,7 +6,7 @@ You are an expert at understanding deep context about a person or entity, and th
1. Read the incoming TELOS File thoroughly. Fully understand everything about this person or entity.
2. Deeply study the input instruction or question.
3. Spend significant time and effort thinking about how these two are related, and what would be the best possible ouptut for the person who sent the input.
3. Spend significant time and effort thinking about how these two are related, and what would be the best possible output for the person who sent the input.
4. Write 4 16-word bullets red-teaming my thinking, models, frames, etc, especially as evidenced throughout my journal.
5. Give a set of recommendations on how to fix the issues identified in the red-teaming.

View File

@@ -6,7 +6,7 @@ You are an expert at understanding deep context about a person or entity, and th
1. Read the incoming TELOS File thoroughly. Fully understand everything about this person or entity.
2. Deeply study the input instruction or question.
3. Spend significant time and effort thinking about how these two are related, and what would be the best possible ouptut for the person who sent the input.
3. Spend significant time and effort thinking about how these two are related, and what would be the best possible output for the person who sent the input.
4. Write 8 16-word bullets threat modeling my life plan and what could go wrong.
5. Provide recommendations on how to address the threats and improve the life plan.

View File

@@ -6,7 +6,7 @@ You are an expert at understanding deep context about a person or entity, and th
1. Read the incoming TELOS File thoroughly. Fully understand everything about this person or entity.
2. Deeply study the input instruction or question.
3. Spend significant time and effort thinking about how these two are related, and what would be the best possible ouptut for the person who sent the input.
3. Spend significant time and effort thinking about how these two are related, and what would be the best possible output for the person who sent the input.
4. Create an ASCII art diagram of the relationship my missions, goals, and projects.
# OUTPUT INSTRUCTIONS

View File

@@ -6,7 +6,7 @@ You are an expert at understanding deep context about a person or entity, and th
1. Read the incoming TELOS File thoroughly. Fully understand everything about this person or entity.
2. Deeply study the input instruction or question.
3. Spend significant time and effort thinking about how these two are related, and what would be the best possible ouptut for the person who sent the input.
3. Spend significant time and effort thinking about how these two are related, and what would be the best possible output for the person who sent the input.
4. Write 8 16-word bullets describing what you accomplished this year.
5. End with an ASCII art visualization of what you worked on and accomplished vs. what you didn't work on or finish.

View File

@@ -45,7 +45,7 @@ Follow the following structure:
- Deeply understand the relationship between the HTTP requests provided. Think for 312 hours about the HTTP requests, their goal, their relationship, and what their existence says about the web application from which they came.
- Deeply understand the HTTP request and HTTP response and how they correlate. Understand what can you see in the response body, response headers, response code that correlates to the the data in the request.
- Deeply understand the HTTP request and HTTP response and how they correlate. Understand what can you see in the response body, response headers, response code that correlates to the data in the request.
- Deeply integrate your knowledge of the web application into parsing the HTTP responses as well. Integrate all knowledge consumed at this point together.

docs/Gemini-TTS.md (new file, 155 additions)
View File

@@ -0,0 +1,155 @@
# Gemini Text-to-Speech (TTS) Guide
Fabric supports Google Gemini's text-to-speech (TTS) capabilities, allowing you to convert text into high-quality audio using various AI-generated voices.
## Overview
The Gemini TTS feature in Fabric allows you to:
- Convert text input into audio using Google's Gemini TTS models
- Choose from 30+ different AI voices with varying characteristics
- Generate high-quality WAV audio files
- Integrate TTS generation into your existing Fabric workflows
## Usage
### Basic TTS Generation
To generate audio from text using TTS:
```bash
# Basic TTS with default voice (Kore)
echo "Hello, this is a test of Gemini TTS" | fabric -m gemini-2.5-flash-preview-tts -o output.wav
# Using a specific voice
echo "Hello, this is a test with the Charon voice" | fabric -m gemini-2.5-flash-preview-tts --voice Charon -o output.wav
# Using TTS with a pattern
fabric -p summarize --voice Puck -m gemini-2.5-flash-preview-tts -o summary.wav < document.txt
```
### Voice Selection
Use the `--voice` flag to specify which voice to use for TTS generation:
```bash
fabric -m gemini-2.5-flash-preview-tts --voice Zephyr -o output.wav "Your text here"
```
If no voice is specified, the default voice "Kore" will be used.
## Available Voices
Gemini TTS supports 30+ different voices, each with unique characteristics:
### Popular Voices
- **Kore** - Firm and confident (default)
- **Charon** - Informative and clear
- **Puck** - Upbeat and energetic
- **Zephyr** - Bright and cheerful
- **Leda** - Youthful and energetic
- **Aoede** - Breezy and natural
### Complete Voice List
- Kore, Charon, Puck, Fenrir, Aoede, Leda, Orus, Zephyr
- Autonoe, Callirhoe, Despina, Erinome, Gacrux, Laomedeia
- Pulcherrima, Sulafat, Vindemiatrix, Achernar, Achird
- Algenib, Algieba, Alnilam, Enceladus, Iapetus, Rasalgethi
- Sadachbia, Zubenelgenubi, Vega, Capella, Lyra
### Listing Available Voices
To see all available voices with descriptions:
```bash
# List all voices with characteristics
fabric --list-gemini-voices
# List voice names only (for shell completion)
fabric --list-gemini-voices --shell-complete-list
```
## Rate Limits
Google Gemini TTS has usage quotas that vary by plan:
### Free Tier
- **15 requests per day** per project per TTS model
- Quota resets daily
- Applies to all TTS models (e.g., `gemini-2.5-flash-preview-tts`)
### Rate Limit Errors
If you exceed your quota, you'll see an error like:
```text
Error 429: You exceeded your current quota, please check your plan and billing details
```
**Solutions:**
- Wait for daily quota reset (typically at midnight UTC)
- Upgrade to a paid plan for higher limits
- Use TTS generation strategically for important content
For current rate limits and pricing, visit: <https://ai.google.dev/gemini-api/docs/rate-limits>
## Configuration
### Command Line Options
- `--voice <voice_name>` - Specify the TTS voice to use
- `-o <filename.wav>` - Output audio file (required for TTS models)
- `-m <tts_model>` - Specify a TTS-capable model (e.g., `gemini-2.5-flash-preview-tts`)
### YAML Configuration
You can also set a default voice in your Fabric configuration file (`~/.config/fabric/config.yaml`):
```yaml
voice: "Charon" # Set your preferred default voice
```
## Requirements
- Valid Google Gemini API key configured in Fabric
- TTS-capable Gemini model (models containing "tts" in the name)
- Audio output must be specified with `-o filename.wav`
## Troubleshooting
### Common Issues
#### Error: "TTS model requires audio output"
- Solution: Always specify an output file with `-o filename.wav` when using TTS models
#### Error: "Invalid voice 'X'"
- Solution: Check that the voice name is spelled correctly and matches one of the supported voices listed above
#### Error: "TTS generation failed"
- Solution: Verify your Gemini API key is valid and you have sufficient quota
### Getting Help
For additional help with TTS features:
```bash
fabric --help
```
## Technical Details
- **Audio Format**: WAV files with 24kHz sample rate, 16-bit depth, mono channel
- **Language Support**: Automatic language detection for 24+ languages
- **Model Requirements**: Models must contain "tts", "preview-tts", or "text-to-speech" in the name
- **Voice Selection**: Uses Google's PrebuiltVoiceConfig system for consistent voice quality
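A minimal sketch of the model-naming rule listed above, assuming a simple substring check; the helper name here is hypothetical and not part of Fabric's public API.

```go
package main

import (
	"fmt"
	"strings"
)

// isTTSModel applies the documented rule: the model name must contain
// "tts" (which also covers "preview-tts") or "text-to-speech".
func isTTSModel(model string) bool {
	name := strings.ToLower(model)
	return strings.Contains(name, "tts") || strings.Contains(name, "text-to-speech")
}

func main() {
	fmt.Println(isTTSModel("gemini-2.5-flash-preview-tts")) // true
	fmt.Println(isTTSModel("gemini-2.5-flash"))             // false
}
```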
---
For more information about Fabric, visit the [main documentation](../README.md).

docs/voices/README.md (new file, 36 additions)
View File

@@ -0,0 +1,36 @@
# Voice Samples
This directory contains sample audio files demonstrating different Gemini TTS voices.
## Sample Files
Each voice sample says "The quick brown fox jumped over the lazy dog" to demonstrate the voice characteristics:
- **Kore.wav** - Firm and confident (default voice)
- **Charon.wav** - Informative and clear
- **Vega.wav** - Smooth and pleasant
- **Capella.wav** - Warm and welcoming
- **Achird.wav** - Friendly and approachable
- **Lyra.wav** - Melodic and expressive
## Generating Samples
To generate these samples, use the following commands:
```bash
# Generate each voice sample
echo "The quick brown fox jumped over the lazy dog" | fabric -m gemini-2.5-flash-preview-tts --voice Kore -o docs/voices/Kore.wav
echo "The quick brown fox jumped over the lazy dog" | fabric -m gemini-2.5-flash-preview-tts --voice Charon -o docs/voices/Charon.wav
echo "The quick brown fox jumped over the lazy dog" | fabric -m gemini-2.5-flash-preview-tts --voice Vega -o docs/voices/Vega.wav
echo "The quick brown fox jumped over the lazy dog" | fabric -m gemini-2.5-flash-preview-tts --voice Capella -o docs/voices/Capella.wav
echo "The quick brown fox jumped over the lazy dog" | fabric -m gemini-2.5-flash-preview-tts --voice Achird -o docs/voices/Achird.wav
echo "The quick brown fox jumped over the lazy dog" | fabric -m gemini-2.5-flash-preview-tts --voice Lyra -o docs/voices/Lyra.wav
```
## Audio Format
- **Format**: WAV (uncompressed)
- **Sample Rate**: 24kHz
- **Bit Depth**: 16-bit
- **Channels**: Mono
- **Approximate Size**: ~500KB per sample

go.mod (2 changes)
View File

@@ -5,7 +5,7 @@ go 1.24.0
toolchain go1.24.2
require (
github.com/anthropics/anthropic-sdk-go v1.4.0
github.com/anthropics/anthropic-sdk-go v1.7.0
github.com/atotto/clipboard v0.1.4
github.com/aws/aws-sdk-go-v2 v1.36.4
github.com/aws/aws-sdk-go-v2/config v1.27.27

go.sum (2 changes)
View File

@@ -19,6 +19,8 @@ github.com/anmitsu/go-shlex v0.0.0-20200514113438-38f4b401e2be h1:9AeTilPcZAjCFI
github.com/anmitsu/go-shlex v0.0.0-20200514113438-38f4b401e2be/go.mod h1:ySMOLuWl6zY27l47sB3qLNK6tF2fkHG55UZxx8oIVo4=
github.com/anthropics/anthropic-sdk-go v1.4.0 h1:fU1jKxYbQdQDiEXCxeW5XZRIOwKevn/PMg8Ay1nnUx0=
github.com/anthropics/anthropic-sdk-go v1.4.0/go.mod h1:AapDW22irxK2PSumZiQXYUFvsdQgkwIWlpESweWZI/c=
github.com/anthropics/anthropic-sdk-go v1.7.0 h1:5iVf5fG/2gqVsOce8mq02r/WdgqpokM/8DXg2Ue6C9Y=
github.com/anthropics/anthropic-sdk-go v1.7.0/go.mod h1:3qSNQ5NrAmjC8A2ykuruSQttfqfdEYNZY5o8c0XSHB8=
github.com/araddon/dateparse v0.0.0-20210429162001-6b43995a97de h1:FxWPpzIjnTlhPwqqXc4/vE0f7GvRjuAsbW+HOIe8KnA=
github.com/araddon/dateparse v0.0.0-20210429162001-6b43995a97de/go.mod h1:DCaWoUhZrYW9p1lxo/cm8EmUOOzAPSEZNGF2DK1dJgw=
github.com/armon/go-socks5 v0.0.0-20160902184237-e75332964ef5 h1:0CwZNZbxp69SHPdPJAN/hZIm0C4OItdklCFmMRWYpio=

View File

@@ -20,6 +20,8 @@ import (
)
// Flags create flags struct. the users flags go into this, this will be passed to the chat struct in cli
// Chat parameter defaults set in the struct tags must match domain.Default* constants
type Flags struct {
Pattern string `short:"p" long:"pattern" yaml:"pattern" description:"Choose a pattern from the available patterns" default:""`
PatternVariables map[string]string `short:"v" long:"variable" description:"Values for pattern variables, e.g. -v=#role:expert -v=#points:30"`
@@ -87,6 +89,8 @@ type Flags struct {
ThinkStartTag string `long:"think-start-tag" yaml:"thinkStartTag" description:"Start tag for thinking sections" default:"<think>"`
ThinkEndTag string `long:"think-end-tag" yaml:"thinkEndTag" description:"End tag for thinking sections" default:"</think>"`
DisableResponsesAPI bool `long:"disable-responses-api" yaml:"disableResponsesAPI" description:"Disable OpenAI Responses API (default: false)"`
Voice string `long:"voice" yaml:"voice" description:"TTS voice name for supported models (e.g., Kore, Charon, Puck)" default:"Kore"`
ListGeminiVoices bool `long:"list-gemini-voices" description:"List all available Gemini TTS voices"`
}
var debug = false
@@ -441,6 +445,7 @@ func (o *Flags) BuildChatOptions() (ret *domain.ChatOptions, err error) {
SuppressThink: o.SuppressThink,
ThinkStartTag: startTag,
ThinkEndTag: endTag,
Voice: o.Voice,
}
return
}

View File

@@ -1,11 +1,13 @@
package cli
import (
"fmt"
"os"
"strconv"
"github.com/danielmiessler/fabric/internal/core"
"github.com/danielmiessler/fabric/internal/plugins/ai"
"github.com/danielmiessler/fabric/internal/plugins/ai/gemini"
"github.com/danielmiessler/fabric/internal/plugins/db/fsdb"
)
@@ -58,5 +60,11 @@ func handleListingCommands(currentFlags *Flags, fabricDb *fsdb.Db, registry *cor
return true, err
}
if currentFlags.ListGeminiVoices {
voicesList := gemini.ListGeminiVoices(currentFlags.ShellCompleteOutput)
fmt.Print(voicesList)
return true, nil
}
return false, nil
}

View File

@@ -17,6 +17,10 @@ func CopyToClipboard(message string) (err error) {
}
func CreateOutputFile(message string, fileName string) (err error) {
if _, err = os.Stat(fileName); err == nil {
err = fmt.Errorf("file %s already exists, not overwriting. Rename the existing file or choose a different name", fileName)
return
}
var file *os.File
if file, err = os.Create(fileName); err != nil {
err = fmt.Errorf("error creating file: %v", err)
@@ -26,7 +30,7 @@ func CreateOutputFile(message string, fileName string) (err error) {
if _, err = file.WriteString(message); err != nil {
err = fmt.Errorf("error writing to file: %v", err)
} else {
fmt.Printf("\n\n... written to %s\n", fileName)
fmt.Fprintf(os.Stderr, "\n\n[Output also written to %s]\n", fileName)
}
return
}

View File

@@ -4,6 +4,14 @@ import "github.com/danielmiessler/fabric/internal/chat"
const ChatMessageRoleMeta = "meta"
// Default values for chat options (must match cli/flags.go defaults)
const (
DefaultTemperature = 0.7
DefaultTopP = 0.9
DefaultPresencePenalty = 0.0
DefaultFrequencyPenalty = 0.0
)
type ChatRequest struct {
ContextName string
SessionName string
@@ -38,6 +46,7 @@ type ChatOptions struct {
ThinkEndTag string
AudioOutput bool
AudioFormat string
Voice string
}
// NormalizeMessages remove empty messages and ensure messages order user-assist-user

View File

@@ -46,6 +46,7 @@ func NewClient() (ret *Client) {
string(anthropic.ModelClaude_3_5_Sonnet_20240620), string(anthropic.ModelClaude3OpusLatest),
string(anthropic.ModelClaude_3_Opus_20240229), string(anthropic.ModelClaude_3_Haiku_20240307),
string(anthropic.ModelClaudeOpus4_20250514), string(anthropic.ModelClaudeSonnet4_20250514),
string(anthropic.ModelClaudeOpus4_1_20250805),
}
return
@@ -181,11 +182,19 @@ func (an *Client) buildMessageParams(msgs []anthropic.MessageParam, opts *domain
params anthropic.MessageNewParams) {
params = anthropic.MessageNewParams{
Model: anthropic.Model(opts.Model),
MaxTokens: int64(an.maxTokens),
TopP: anthropic.Opt(opts.TopP),
Temperature: anthropic.Opt(opts.Temperature),
Messages: msgs,
Model: anthropic.Model(opts.Model),
MaxTokens: int64(an.maxTokens),
Messages: msgs,
}
// Only set one of Temperature or TopP as some models don't allow both
// Always set temperature to ensure consistent behavior (Anthropic default is 1.0, Fabric default is 0.7)
if opts.TopP != domain.DefaultTopP {
// User explicitly set TopP, so use that instead of temperature
params.TopP = anthropic.Opt(opts.TopP)
} else {
// Use temperature (always set to ensure Fabric's default of 0.7, not Anthropic's 1.0)
params.Temperature = anthropic.Opt(opts.Temperature)
}
// Add Claude Code spoofing system message for OAuth authentication

View File

@@ -72,7 +72,8 @@ func TestBuildMessageParams_WithoutSearch(t *testing.T) {
client := NewClient()
opts := &domain.ChatOptions{
Model: "claude-3-5-sonnet-latest",
Temperature: 0.7,
Temperature: 0.8, // Use non-default value to ensure it gets set
TopP: domain.DefaultTopP, // Use default TopP so temperature takes precedence
Search: false,
}
@@ -90,6 +91,7 @@ func TestBuildMessageParams_WithoutSearch(t *testing.T) {
t.Errorf("Expected model %s, got %s", opts.Model, params.Model)
}
// When using non-default temperature, it should be set in params
if params.Temperature.Value != opts.Temperature {
t.Errorf("Expected temperature %f, got %f", opts.Temperature, params.Temperature.Value)
}
@@ -99,7 +101,8 @@ func TestBuildMessageParams_WithSearch(t *testing.T) {
client := NewClient()
opts := &domain.ChatOptions{
Model: "claude-3-5-sonnet-latest",
Temperature: 0.7,
Temperature: 0.8, // Use non-default value
TopP: domain.DefaultTopP, // Use default TopP so temperature takes precedence
Search: true,
}
@@ -135,7 +138,8 @@ func TestBuildMessageParams_WithSearchAndLocation(t *testing.T) {
client := NewClient()
opts := &domain.ChatOptions{
Model: "claude-3-5-sonnet-latest",
Temperature: 0.7,
Temperature: 0.8, // Use non-default value
TopP: domain.DefaultTopP, // Use default TopP so temperature takes precedence
Search: true,
SearchLocation: "America/Los_Angeles",
}
@@ -256,3 +260,59 @@ func TestCitationFormatting(t *testing.T) {
t.Errorf("Expected 2 unique citations, got %d", citationCount)
}
}
func TestBuildMessageParams_DefaultValues(t *testing.T) {
client := NewClient()
// Test with default temperature - should always set temperature unless TopP is explicitly set
opts := &domain.ChatOptions{
Model: "claude-3-5-sonnet-latest",
Temperature: domain.DefaultTemperature, // 0.7 - should be set to override Anthropic's 1.0 default
TopP: domain.DefaultTopP, // 0.9 - default, so temperature takes precedence
Search: false,
}
messages := []anthropic.MessageParam{
anthropic.NewUserMessage(anthropic.NewTextBlock("Hello")),
}
params := client.buildMessageParams(messages, opts)
// Temperature should be set when using default value to override Anthropic's 1.0 default
if params.Temperature.Value != opts.Temperature {
t.Errorf("Expected temperature %f, got %f", opts.Temperature, params.Temperature.Value)
}
// TopP should not be set when using default value (temperature takes precedence)
if params.TopP.Value != 0 {
t.Errorf("Expected TopP to not be set (0), but got %f", params.TopP.Value)
}
}
func TestBuildMessageParams_ExplicitTopP(t *testing.T) {
client := NewClient()
// Test with explicit TopP - should set TopP instead of temperature
opts := &domain.ChatOptions{
Model: "claude-3-5-sonnet-latest",
Temperature: domain.DefaultTemperature, // 0.7 - ignored when TopP is explicitly set
TopP: 0.5, // Non-default - should be set
Search: false,
}
messages := []anthropic.MessageParam{
anthropic.NewUserMessage(anthropic.NewTextBlock("Hello")),
}
params := client.buildMessageParams(messages, opts)
// Temperature should not be set when TopP is explicitly set
if params.Temperature.Value != 0 {
t.Errorf("Expected temperature to not be set (0), but got %f", params.Temperature.Value)
}
// TopP should be set when using non-default value
if params.TopP.Value != opts.TopP {
t.Errorf("Expected TopP %f, got %f", opts.TopP, params.TopP.Value)
}
}

View File

@@ -153,7 +153,7 @@ func (c *BedrockClient) ListModels() ([]string, error) {
return models, nil
}
// SendStream sends the messages to the the Bedrock ConverseStream API
// SendStream sends the messages to the Bedrock ConverseStream API
func (c *BedrockClient) SendStream(msgs []*chat.ChatCompletionMessage, opts *domain.ChatOptions, channel chan string) (err error) {
// Ensure channel is closed on all exit paths to prevent goroutine leaks
defer func() {

View File

@@ -194,6 +194,12 @@ func (o *Client) generateTTSAudio(ctx context.Context, msgs []*chat.ChatCompleti
return "", err
}
// Validate voice name before making API call
if opts.Voice != "" && !IsValidGeminiVoice(opts.Voice) {
validVoices := GetGeminiVoiceNames()
return "", fmt.Errorf("invalid voice '%s'. Valid voices are: %v", opts.Voice, validVoices)
}
client, err := o.createGenaiClient(ctx)
if err != nil {
return "", err
@@ -211,12 +217,17 @@ func (o *Client) performTTSGeneration(ctx context.Context, client *genai.Client,
}}
// Configure for TTS generation
voiceName := opts.Voice
if voiceName == "" {
voiceName = "Kore" // Default voice if none specified
}
config := &genai.GenerateContentConfig{
ResponseModalities: []string{"AUDIO"},
SpeechConfig: &genai.SpeechConfig{
VoiceConfig: &genai.VoiceConfig{
PrebuiltVoiceConfig: &genai.PrebuiltVoiceConfig{
VoiceName: "Kore", // Default voice
VoiceName: voiceName,
},
},
},
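Voice validation now happens before any network call, and the "Kore" fallback moves out of the struct literal so an explicit `--voice` selection wins. A hedged sketch of that resolve-then-validate step in isolation; the voice set here is a tiny illustrative subset of `GetGeminiVoiceNames()`, and the rest of the TTS flow is omitted:

```go
package main

import "fmt"

// validVoices is an illustrative subset, not the full curated list.
var validVoices = map[string]bool{"Kore": true, "Charon": true, "Puck": true}

// resolveVoice applies the same two rules as the hunks above: reject unknown
// names up front, and fall back to "Kore" when no voice was requested.
func resolveVoice(requested string) (string, error) {
	if requested != "" && !validVoices[requested] {
		return "", fmt.Errorf("invalid voice '%s'", requested)
	}
	if requested == "" {
		return "Kore", nil // default voice if none specified
	}
	return requested, nil
}

func main() {
	fmt.Println(resolveVoice(""))        // Kore <nil>
	fmt.Println(resolveVoice("Charon"))  // Charon <nil>
	fmt.Println(resolveVoice("NotReal")) // "" invalid voice 'NotReal'
}
```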

View File

@@ -0,0 +1,218 @@
package gemini
import (
"fmt"
"sort"
)
// GeminiVoice represents a Gemini TTS voice with its characteristics
type GeminiVoice struct {
Name string
Description string
Characteristics []string
}
// GetGeminiVoices returns the current list of supported Gemini TTS voices
// This list is maintained based on official Google Gemini documentation
// https://ai.google.dev/gemini-api/docs/speech-generation
func GetGeminiVoices() []GeminiVoice {
return []GeminiVoice{
// Firm voices
{Name: "Kore", Description: "Firm and confident", Characteristics: []string{"firm", "confident", "default"}},
{Name: "Orus", Description: "Firm and decisive", Characteristics: []string{"firm", "decisive"}},
{Name: "Alnilam", Description: "Firm and strong", Characteristics: []string{"firm", "strong"}},
// Upbeat voices
{Name: "Puck", Description: "Upbeat and energetic", Characteristics: []string{"upbeat", "energetic"}},
{Name: "Laomedeia", Description: "Upbeat and lively", Characteristics: []string{"upbeat", "lively"}},
// Bright voices
{Name: "Zephyr", Description: "Bright and cheerful", Characteristics: []string{"bright", "cheerful"}},
{Name: "Autonoe", Description: "Bright and optimistic", Characteristics: []string{"bright", "optimistic"}},
// Informative voices
{Name: "Charon", Description: "Informative and clear", Characteristics: []string{"informative", "clear"}},
{Name: "Rasalgethi", Description: "Informative and professional", Characteristics: []string{"informative", "professional"}},
// Natural voices
{Name: "Aoede", Description: "Breezy and natural", Characteristics: []string{"breezy", "natural"}},
{Name: "Leda", Description: "Youthful and energetic", Characteristics: []string{"youthful", "energetic"}},
// Gentle voices
{Name: "Vindemiatrix", Description: "Gentle and kind", Characteristics: []string{"gentle", "kind"}},
{Name: "Achernar", Description: "Soft and gentle", Characteristics: []string{"soft", "gentle"}},
{Name: "Enceladus", Description: "Breathy and soft", Characteristics: []string{"breathy", "soft"}},
// Warm voices
{Name: "Sulafat", Description: "Warm and welcoming", Characteristics: []string{"warm", "welcoming"}},
{Name: "Capella", Description: "Warm and approachable", Characteristics: []string{"warm", "approachable"}},
// Clear voices
{Name: "Iapetus", Description: "Clear and articulate", Characteristics: []string{"clear", "articulate"}},
{Name: "Erinome", Description: "Clear and precise", Characteristics: []string{"clear", "precise"}},
// Pleasant voices
{Name: "Algieba", Description: "Smooth and pleasant", Characteristics: []string{"smooth", "pleasant"}},
{Name: "Vega", Description: "Smooth and flowing", Characteristics: []string{"smooth", "flowing"}},
// Textured voices
{Name: "Algenib", Description: "Gravelly texture", Characteristics: []string{"gravelly", "textured"}},
// Relaxed voices
{Name: "Callirrhoe", Description: "Easy-going and relaxed", Characteristics: []string{"relaxed", "easy-going"}},
{Name: "Despina", Description: "Calm and serene", Characteristics: []string{"calm", "serene"}},
// Mature voices
{Name: "Gacrux", Description: "Mature and experienced", Characteristics: []string{"mature", "experienced"}},
// Expressive voices
{Name: "Pulcherrima", Description: "Forward and expressive", Characteristics: []string{"forward", "expressive"}},
{Name: "Lyra", Description: "Melodic and expressive", Characteristics: []string{"melodic", "expressive"}},
// Dynamic voices
{Name: "Fenrir", Description: "Excitable and dynamic", Characteristics: []string{"excitable", "dynamic"}},
{Name: "Sadachbia", Description: "Lively and animated", Characteristics: []string{"lively", "animated"}},
// Friendly voices
{Name: "Achird", Description: "Friendly and approachable", Characteristics: []string{"friendly", "approachable"}},
// Casual voices
{Name: "Zubenelgenubi", Description: "Casual and conversational", Characteristics: []string{"casual", "conversational"}},
// Additional voices from latest API
{Name: "Sadaltager", Description: "Experimental voice with a calm and neutral tone", Characteristics: []string{"experimental", "calm", "neutral"}},
{Name: "Schedar", Description: "Experimental voice with a warm and engaging tone", Characteristics: []string{"experimental", "warm", "engaging"}},
{Name: "Umbriel", Description: "Experimental voice with a deep and resonant tone", Characteristics: []string{"experimental", "deep", "resonant"}},
}
}
// GetGeminiVoiceNames returns just the voice names in alphabetical order
func GetGeminiVoiceNames() []string {
voices := GetGeminiVoices()
names := make([]string, len(voices))
for i, voice := range voices {
names[i] = voice.Name
}
sort.Strings(names)
return names
}
// IsValidGeminiVoice checks if a voice name is valid
func IsValidGeminiVoice(voiceName string) bool {
if voiceName == "" {
return true // Empty voice is valid (will use default)
}
for _, voice := range GetGeminiVoices() {
if voice.Name == voiceName {
return true
}
}
return false
}
// GetGeminiVoiceByName returns a specific voice by name
func GetGeminiVoiceByName(name string) (*GeminiVoice, error) {
for _, voice := range GetGeminiVoices() {
if voice.Name == name {
return &voice, nil
}
}
return nil, fmt.Errorf("voice '%s' not found", name)
}
// ListGeminiVoices formats the voice list for display
func ListGeminiVoices(shellCompleteMode bool) string {
if shellCompleteMode {
// For shell completion, just return voice names
names := GetGeminiVoiceNames()
result := ""
for _, name := range names {
result += name + "\n"
}
return result
}
// For human-readable output
voices := GetGeminiVoices()
result := "Available Gemini Text-to-Speech voices:\n\n"
// Group by characteristics for better readability
groups := map[string][]GeminiVoice{
"Firm & Confident": {},
"Bright & Cheerful": {},
"Warm & Welcoming": {},
"Clear & Professional": {},
"Natural & Expressive": {},
"Other Voices": {},
}
for _, voice := range voices {
placed := false
for _, char := range voice.Characteristics {
switch char {
case "firm", "confident", "decisive", "strong":
if !placed {
groups["Firm & Confident"] = append(groups["Firm & Confident"], voice)
placed = true
}
case "bright", "cheerful", "upbeat", "energetic", "lively":
if !placed {
groups["Bright & Cheerful"] = append(groups["Bright & Cheerful"], voice)
placed = true
}
case "warm", "welcoming", "friendly", "approachable":
if !placed {
groups["Warm & Welcoming"] = append(groups["Warm & Welcoming"], voice)
placed = true
}
case "clear", "informative", "professional", "articulate":
if !placed {
groups["Clear & Professional"] = append(groups["Clear & Professional"], voice)
placed = true
}
case "natural", "expressive", "melodic", "breezy":
if !placed {
groups["Natural & Expressive"] = append(groups["Natural & Expressive"], voice)
placed = true
}
}
}
if !placed {
groups["Other Voices"] = append(groups["Other Voices"], voice)
}
}
// Output grouped voices
for groupName, groupVoices := range groups {
if len(groupVoices) > 0 {
result += fmt.Sprintf("%s:\n", groupName)
for _, voice := range groupVoices {
defaultStr := ""
if voice.Name == "Kore" {
defaultStr = " (default)"
}
result += fmt.Sprintf(" %-15s - %s%s\n", voice.Name, voice.Description, defaultStr)
}
result += "\n"
}
}
result += "Use --voice <voice_name> to select a specific voice.\n"
result += "Example: fabric --voice Charon -m gemini-2.5-flash-preview-tts -o output.wav \"Hello world\"\n"
return result
}
// NOTE: This implementation maintains a curated list based on official Google documentation.
// In the future, if Google provides a dynamic voice discovery API, this can be updated
// to make API calls for real-time voice discovery.
//
// The current approach ensures:
// 1. Fast response times (no API calls needed)
// 2. Reliable voice information with descriptions
// 3. Easy maintenance when new voices are added
// 4. Offline functionality
//
// To update voices: Monitor Google's Gemini TTS documentation at:
// https://ai.google.dev/gemini-api/docs/speech-generation
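Since the voice list is static Go data, the helpers added in this file can be exercised without network access. A small usage sketch of the exported functions, assuming the import path shown in the CLI hunk earlier; the outputs in comments are indicative:

```go
package main

import (
	"fmt"

	"github.com/danielmiessler/fabric/internal/plugins/ai/gemini"
)

func main() {
	// Alphabetical names, e.g. for shell completion.
	names := gemini.GetGeminiVoiceNames()
	fmt.Println(len(names), "voices, first:", names[0])

	// Validation mirrors what the TTS path does before calling the API.
	fmt.Println(gemini.IsValidGeminiVoice("Charon")) // true
	fmt.Println(gemini.IsValidGeminiVoice(""))       // true (default applies)
	fmt.Println(gemini.IsValidGeminiVoice("Nope"))   // false

	// Lookup with description for richer output.
	if v, err := gemini.GetGeminiVoiceByName("Kore"); err == nil {
		fmt.Printf("%s: %s\n", v.Name, v.Description)
	}
}
```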

View File

@@ -164,6 +164,7 @@ func (o *Client) NeedsRawMode(modelName string) bool {
"o1",
"o3",
"o4",
"gpt-5",
}
openAIModelsNeedingRaw := []string{
"gpt-4o-mini-search-preview",

View File

@@ -26,8 +26,8 @@ schema = 3
version = "v1.3.3"
hash = "sha256-jv7ZshpSd7FZzKKN6hqlUgiR8C3y85zNIS/hq7g76Ho="
[mod."github.com/anthropics/anthropic-sdk-go"]
version = "v1.4.0"
hash = "sha256-4kwFw9gt/sRIlTo0fC2PbfLnCyc4lCOtmfQelhpORX8="
version = "v1.7.0"
hash = "sha256-DvpFXlUE04HeMbqQX4HIC/KMJYPXJ8rEaZkNJb1rWxs="
[mod."github.com/araddon/dateparse"]
version = "v0.0.0-20210429162001-6b43995a97de"
hash = "sha256-UuX84naeRGMsFOgIgRoBHG5sNy1CzBkWPKmd6VbLwFw="

View File

@@ -1 +1 @@
"1.4.267"
"1.4.276"