Compare commits

...

162 Commits

Author SHA1 Message Date
github-actions[bot]
64411cdc02 chore(release): Update version to v1.4.263 2025-07-21 19:29:11 +00:00
Kayvan Sylvan
9a2ff983a4 Merge pull request #1641 from ksylvan/0721-web-timeout-fix
Fix Fabric Web timeout error
2025-07-21 12:26:53 -07:00
Changelog Bot
a522d4a411 chore: incoming 1641 changelog entry 2025-07-21 12:24:36 -07:00
Kayvan Sylvan
9bdd77c277 chore: extend proxy timeout in vite.config.ts to 15 minutes
### CHANGES

- Increase `/api` proxy timeout to 900,000 ms
- Increase `/names` proxy timeout to 900,000 ms
2025-07-21 12:16:15 -07:00
github-actions[bot]
cc68dddfe8 chore(release): Update version to v1.4.262 2025-07-21 18:20:58 +00:00
Kayvan Sylvan
07ee7f8b21 Merge pull request #1640 from ksylvan/0720-changelog-during-ci-cd
Implement Automated Changelog System for CI/CD Integration
2025-07-21 11:18:26 -07:00
Kayvan Sylvan
15a355f08a docs: Remove duplicated section. 2025-07-21 11:07:27 -07:00
Kayvan Sylvan
c50486b611 chore: fix tests for generate_changelog 2025-07-21 10:52:43 -07:00
Kayvan Sylvan
edaca7a045 chore: adjust insertVersionAtTop for consistent newline handling 2025-07-21 10:33:00 -07:00
Kayvan Sylvan
28432a50f0 chore: adjust newline handling in insertVersionAtTop method 2025-07-21 10:28:42 -07:00
Kayvan Sylvan
8ab891fcff chore: trim leading newline in changelog entry content 2025-07-21 10:16:31 -07:00
Kayvan Sylvan
cab6df88ea chore: simplify direct commits content handling in changelog generation 2025-07-21 09:56:17 -07:00
Kayvan Sylvan
42afd92f31 refactor: rename ProcessIncomingPRs to CreateNewChangelogEntry for clarity
## CHANGES

- Rename ProcessIncomingPRs method to CreateNewChangelogEntry
- Update method comment to reflect new name
- Update main.go to call renamed method
- Reduce newline spacing in content formatting
2025-07-21 09:37:02 -07:00
Changelog Bot
76d6b1721e chore: incoming 1640 changelog entry 2025-07-21 08:42:35 -07:00
Kayvan Sylvan
7d562096d1 fix: formatting fixes in tests. 2025-07-21 08:25:13 -07:00
Kayvan Sylvan
91c1aca0dd feat: enhance changelog generator to accept version parameter for PR processing
## CHANGES

- Pass version parameter to changelog generation workflow
- Update ProcessIncomingPRs method to accept version string
- Add commit SHA tracking to prevent duplicate entries
- Modify process-prs flag to require version parameter
- Improve changelog formatting with proper spacing
- Update configuration to use ProcessPRsVersion string field
- Enhance direct commit filtering with SHA exclusion
- Update documentation to reflect version parameter requirement
2025-07-21 07:36:30 -07:00
Kayvan Sylvan
b8008a34fb feat: enhance changelog generation to avoid duplicate commit entries
## CHANGES

- Extract PR numbers from processed changelog files
- Pass processed PRs map to direct commits function
- Filter out commits already included via PR files
- Reduce extra newlines in changelog version insertion
- Add strconv import for PR number parsing
- Prevent duplicate entries between PRs and direct commits
- Improve changelog formatting consistency
2025-07-20 22:49:05 -07:00
Kayvan Sylvan
482759ae72 fix: ensure the PR#.txt file ends with a newline. 2025-07-20 20:38:52 -07:00
Kayvan Sylvan
b0d096d0ea feat: change push behavior from opt-out to opt-in with GitHub token auth
## CHANGES

- Change `NoPush` config field to `Push` boolean
- Update CLI flag from `--no-push` to `--push`
- Add GitHub token authentication for push operations
- Import `os` and HTTP transport packages
- Reverse push logic to require explicit enable
- Update documentation for new push behavior
- Add automatic GitHub repository detection for auth
2025-07-20 20:33:23 -07:00
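A minimal sketch of token-authenticated pushing with go-git, as this commit describes; the environment variable, the username string, and the function name are illustrative assumptions rather than the project's actual wiring.

```go
package main

import (
	"fmt"
	"os"

	git "github.com/go-git/go-git/v5"
	githttp "github.com/go-git/go-git/v5/plumbing/transport/http"
)

// pushWithToken pushes origin using a GitHub token from the environment.
// A sketch of the opt-in push path; the flag wiring and repository detection
// mentioned in the commit are omitted.
func pushWithToken(repoPath string) error {
	repo, err := git.PlainOpen(repoPath)
	if err != nil {
		return err
	}
	token := os.Getenv("GITHUB_TOKEN") // assumed variable name
	if token == "" {
		return fmt.Errorf("GITHUB_TOKEN is not set")
	}
	return repo.Push(&git.PushOptions{
		RemoteName: "origin",
		// With a token, GitHub accepts any non-empty username over HTTPS.
		Auth: &githttp.BasicAuth{Username: "x-access-token", Password: token},
	})
}

func main() {
	if err := pushWithToken("."); err != nil {
		fmt.Fprintln(os.Stderr, err)
	}
}
```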
Kayvan Sylvan
e56ecfb7ae chore: update gitignore and simplify changelog generator error handling
## CHANGES

- Add .claude/ directory to gitignore exclusions
- Update comment clarity for SilenceUsage flag
- Remove redundant error handling in main function
- Simplify command execution without explicit error checking
2025-07-20 18:49:32 -07:00
Kayvan Sylvan
951bd134eb Fix CLI error handling and improve git status validation
- Add SilenceUsage to prevent help output on errors
- Add GetStatusDetails method to show which files are dirty
- Include direct commits in ProcessIncomingPRs for complete AI summaries
2025-07-20 18:20:53 -07:00
Kayvan Sylvan
7ff04658f3 chore: add automated changelog processing for CI/CD integration
## CHANGES

- Add incoming PR preprocessing with validation
- Implement release aggregation for incoming files
- Create git operations for staging changes
- Add comprehensive test coverage for processing
- Extend GitHub client with validation methods
- Support version detection from nix files
- Include documentation for automated workflow
- Add command flags for PR processing
2025-07-20 17:47:34 -07:00
Kayvan Sylvan
272f04dd32 docs: Update CHANGELOG after v1.4.261 2025-07-19 07:22:24 -07:00
github-actions[bot]
29cb3796bf Update version to v1.4.261 and commit 2025-07-19 14:20:50 +00:00
Kayvan Sylvan
f51f9e75a9 Merge pull request #1637 from ksylvan/0719-add-mistral-to-list-of-raw-mode-models
chore: update `NeedsRawMode` to include `mistral` prefix for Ollama
2025-07-19 07:19:13 -07:00
Kayvan Sylvan
63475784c7 chore: update NeedsRawMode to include mistral prefix
### CHANGES

- Add `mistral` to `ollamaPrefixes` list.
2025-07-19 07:13:23 -07:00
Kayvan Sylvan
1a7bb27370 docs: Update CHANGELOG after v1.4.260 2025-07-18 13:01:15 -07:00
github-actions[bot]
4badaa4c85 Update version to v1.4.260 and commit 2025-07-18 19:57:27 +00:00
Kayvan Sylvan
bf6be964fd Merge pull request #1634 from ksylvan/0718-fix-exo-labs-client
Fix abort in Exo-Labs provider plugin; with credit to @sakithahSenid
2025-07-18 12:55:58 -07:00
Kayvan Sylvan
cdbcb0a512 chore: add API key setup question to Exolab AI plugin configuration
## CHANGES

- Add "openaiapi" to VSCode spell check dictionary
- Include API key setup question in Exolab client
- Configure API key as required field for setup
- Maintain existing API base URL configuration order
2025-07-18 12:40:55 -07:00
Kayvan Sylvan
f81cf193a2 docs: Update CHANGELOG after v1.4.259 2025-07-18 12:01:19 -07:00
github-actions[bot]
cba56fcde6 Update version to v1.4.259 and commit 2025-07-18 18:57:13 +00:00
Kayvan Sylvan
72cbd13917 Merge pull request #1633 from ksylvan/0718-youtube-vtt-transcript-duplication-bug
YouTube VTT Processing Enhancement
2025-07-18 11:55:44 -07:00
Kayvan Sylvan
dc722f9724 feat: improve timestamp parsing to handle fractional seconds in YouTube tool
## CHANGES

- Move timestamp regex initialization to init function
- Add parseSeconds helper function for fractional seconds
- Replace direct strconv.Atoi calls with parseSeconds function
- Support decimal seconds in timestamp format parsing
- Extract seconds parsing logic into reusable function
2025-07-18 11:53:28 -07:00
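The fractional-seconds handling above can be sketched as a small helper. The name `parseSeconds` comes from the commit message; the signature and error handling here are assumptions.

```go
package main

import (
	"fmt"
	"strconv"
)

// parseSeconds converts a seconds field that may include a fractional part
// (e.g. "07" or "07.240") into a float64, replacing separate strconv.Atoi calls.
func parseSeconds(s string) (float64, error) {
	v, err := strconv.ParseFloat(s, 64)
	if err != nil {
		return 0, fmt.Errorf("invalid seconds value %q: %w", s, err)
	}
	return v, nil
}

func main() {
	secs, _ := parseSeconds("07.240")
	fmt.Println(secs) // 7.24
}
```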
Kayvan Sylvan
1a35f32a48 fix: Youtube VTT parsing gap test 2025-07-18 11:27:23 -07:00
Kayvan Sylvan
65bd2753c2 feat: enhance VTT duplicate filtering to allow legitimate repeated content
## CHANGES

- Fix regex escape sequence for timestamp parsing
- Add configurable time gap constant for repeat detection
- Track content with timestamps instead of simple deduplication
- Implement time-based repeat inclusion logic for choruses
- Add timestamp parsing helper functions for calculations
- Allow repeated content after significant time gaps
- Preserve legitimate recurring phrases while filtering duplicates
2025-07-18 11:08:37 -07:00
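A minimal sketch of the time-gap idea behind this commit: identical caption text is dropped only when it reappears within a configurable window, so choruses and other legitimately repeated lines survive. The constant, function, and variable names are illustrative, not taken from the Fabric source.

```go
package main

import "fmt"

// repeatGapSeconds is an assumed threshold: identical text seen again after
// this many seconds counts as a legitimate repeat, not a duplicate.
const repeatGapSeconds = 30.0

// filterSegments keeps a segment when its text is new, or when the same text
// last appeared long enough ago (e.g. a song chorus).
func filterSegments(texts []string, times []float64) []string {
	lastSeen := make(map[string]float64)
	var out []string
	for i, t := range texts {
		prev, seen := lastSeen[t]
		if !seen || times[i]-prev >= repeatGapSeconds {
			out = append(out, t)
		}
		lastSeen[t] = times[i]
	}
	return out
}

func main() {
	texts := []string{"hello", "hello", "hello"}
	times := []float64{0, 2, 45}
	fmt.Println(filterSegments(texts, times)) // [hello hello]: the near-duplicate at 2s is dropped
}
```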
Kayvan Sylvan
570c9a9404 chore: refactor timestamp regex and seenSegments logic
### CHANGES

- Update `timestampRegex` to support optional seconds/milliseconds
- Change `seenSegments` to use `struct{}` for memory efficiency
- Refactor duplicate check using `struct{}` pattern
- Improve readability by restructuring timestamp logic
2025-07-18 09:44:31 -07:00
Kayvan Sylvan
15151fe9ee chore: refactor timestamp regex to global scope and add spell check words
## CHANGES

- Move timestamp regex to global package scope
- Remove duplicate regex compilation from isTimeStamp function
- Add "horts", "mbed", "WEBVTT", "youtu" to spell checker
- Improve regex performance by avoiding repeated compilation
- Clean up code organization in YouTube module
2025-07-18 09:33:46 -07:00
Kayvan Sylvan
2aad4caf9b fix: prevent duplicate segments in VTT file processing
- Add deduplication map to track seen segments
- Skip duplicate text segments in plain VTT processing
- Skip duplicate segments in timestamped VTT processing
- Improve timestamp regex to handle more formats
- Use clean text as deduplication key consistently
2025-07-18 09:17:03 -07:00
Kayvan Sylvan
289fda8c74 docs: Update CHANGELOG after v1.4.258 2025-07-17 15:30:50 -07:00
github-actions[bot]
fd40778472 Update version to v1.4.258 and commit 2025-07-17 22:28:08 +00:00
Kayvan Sylvan
bc1641a68c Merge pull request #1629 from ksylvan/0717-ensure-envFile
Create Default (empty) .env in ~/.config/fabric on Demand
2025-07-17 15:26:37 -07:00
Kayvan Sylvan
5cf15d22d3 chore: define constants for file and directory permissions 2025-07-17 15:14:15 -07:00
Kayvan Sylvan
2b2a25daaa chore: improve error handling in ensureEnvFile function 2025-07-17 15:00:26 -07:00
Kayvan Sylvan
75a7f25642 refactor: improve error handling and permissions in ensureEnvFile 2025-07-17 14:46:03 -07:00
Kayvan Sylvan
8bab58f225 feat: add startup check to initialize config and .env file
### CHANGES
- Introduce ensureEnvFile function to create ~/.config/fabric/.env if missing.
- Add directory creation for config path in ensureEnvFile.
- Integrate setup flag in CLI to call ensureEnvFile on demand.
- Handle errors for home directory detection and file operations.
2025-07-17 14:27:19 -07:00
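A sketch of what such an `ensureEnvFile` startup check could look like, assuming placeholder permission constants; only the behavior (create `~/.config/fabric/.env` and its directory if missing) comes from the commit messages.

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// Assumed permission constants, mirroring the "define constants for file and
// directory permissions" commit; the real values may differ.
const (
	dirPerm  = 0o755
	filePerm = 0o644
)

// ensureEnvFile creates ~/.config/fabric/.env (and its directory) if missing.
func ensureEnvFile() error {
	home, err := os.UserHomeDir()
	if err != nil {
		return fmt.Errorf("cannot determine home directory: %w", err)
	}
	configDir := filepath.Join(home, ".config", "fabric")
	if err := os.MkdirAll(configDir, dirPerm); err != nil {
		return fmt.Errorf("cannot create config directory: %w", err)
	}
	envPath := filepath.Join(configDir, ".env")
	if _, err := os.Stat(envPath); os.IsNotExist(err) {
		return os.WriteFile(envPath, nil, filePerm) // create an empty .env
	} else if err != nil {
		return fmt.Errorf("cannot access %s: %w", envPath, err)
	}
	return nil
}

func main() {
	if err := ensureEnvFile(); err != nil {
		fmt.Fprintln(os.Stderr, err)
	}
}
```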
Kayvan Sylvan
8ec006e02c docs: Update README and CHANGELOG after v1.4.257 2025-07-17 11:15:24 -07:00
github-actions[bot]
2508dc6397 Update version to v1.4.257 and commit 2025-07-17 18:03:45 +00:00
Kayvan Sylvan
7670df35ad Merge pull request #1628 from ksylvan/0717-give-users-who-use-llama-server-ability-to-say-dont-use-responses-api
Introduce CLI Flag to Disable OpenAI Responses API
2025-07-17 11:02:18 -07:00
Kayvan Sylvan
3b9782f942 feat: add disable-responses-api flag for OpenAI compatibility
## CHANGES

- Add disable-responses-api flag to CLI completions
- Update zsh completion with new API flag
- Update bash completion options list
- Add fish shell completion for API flag
- Add testpattern to VSCode spell checker dictionary
- Configure disableResponsesAPI in example YAML config
- Enable flag for llama-server compatibility
2025-07-17 10:47:35 -07:00
Kayvan Sylvan
3fca3489fb feat: add OpenAI Responses API configuration control via CLI flag
## CHANGES

- Add `--disable-responses-api` CLI flag for OpenAI control
- Implement `SetResponsesAPIEnabled` method in OpenAI client
- Configure OpenAI Responses API setting during CLI initialization
- Update default config path to `~/.config/fabric/config.yaml`
- Add OpenAI import to CLI package dependencies
2025-07-17 10:37:15 -07:00
Kayvan Sylvan
bb2d58eae0 docs: Update CHANGELOG after v1.4.256 2025-07-17 07:17:26 -07:00
github-actions[bot]
87df7dc383 Update version to v1.4.256 and commit 2025-07-17 14:14:50 +00:00
Kayvan Sylvan
1d69afa1c9 Merge pull request #1624 from ksylvan/0716-default-config-yaml
Feature: Add Automatic ~/.fabric.yaml Config Detection
2025-07-17 07:13:17 -07:00
Kayvan Sylvan
96c18b4c99 refactor: extract flag parsing logic into separate extractFlag function 2025-07-17 06:51:40 -07:00
Kayvan Sylvan
dd5173963b fix: improve error handling for default config path resolution
## CHANGES

- Update `GetDefaultConfigPath` to return error alongside path
- Add proper error handling in flags initialization
- Include debug logging for config path failures
- Move channel close to defer in dryrun SendStream
- Return wrapped errors with context messages
- Handle non-existent config as valid case
2025-07-17 06:18:55 -07:00
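A hedged sketch of the path-resolution behavior described above: a missing `~/.fabric.yaml` is treated as a valid case, while real failures come back as wrapped errors. The function name and details are assumptions.

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// getDefaultConfigPath returns ~/.fabric.yaml when it exists, an empty string
// when it does not, and an error only for genuine failures (home directory
// lookup, permission problems).
func getDefaultConfigPath() (string, error) {
	home, err := os.UserHomeDir()
	if err != nil {
		return "", fmt.Errorf("resolve home directory: %w", err)
	}
	path := filepath.Join(home, ".fabric.yaml")
	if _, err := os.Stat(path); err != nil {
		if os.IsNotExist(err) {
			return "", nil // missing config is a valid case, not an error
		}
		return "", fmt.Errorf("access %s: %w", path, err)
	}
	return path, nil
}

func main() {
	path, err := getDefaultConfigPath()
	fmt.Println(path, err)
}
```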
Kayvan Sylvan
da1c8ec979 fix: improve dry run output formatting and config path error handling
## CHANGES

- Remove leading newline from DryRunResponse constant
- Add newline separator in SendStream method output
- Add newline separator in Send method output
- Improve GetDefaultConfigPath error handling logic
- Add stderr error message for config access failures
- Return empty string when config file doesn't exist
2025-07-16 23:46:07 -07:00
Kayvan Sylvan
ac97f9984f chore: refactor constructRequest method for consistency
### CHANGES

- Rename `_ConstructRequest` to `constructRequest` for consistency
- Update `SendStream` to use `constructRequest`
- Update `Send` method to use `constructRequest`
2025-07-16 23:36:05 -07:00
Kayvan Sylvan
181b812eaf chore: remove unneeded parenthesis around function call 2025-07-16 23:31:17 -07:00
Kayvan Sylvan
fe94165d31 chore: update Send method to append request to DryRunResponse
### CHANGES

- Assign `_ConstructRequest` output to `request` variable
- Concatenate `request` with `DryRunResponse` in `Send` method
2025-07-16 23:28:48 -07:00
Kayvan Sylvan
16e92690aa feat: improve flag handling and add default config support
## CHANGES

- Map both short and long flags to yaml tags
- Add support for short flag parsing with dashes
- Implement default ~/.fabric.yaml config file detection
- Fix think block suppression in dry run mode
- Add think options to dry run output formatting
- Refactor dry run response construction into helper method
- Return actual response content from dry run client
- Create utility function for default config path resolution
2025-07-16 23:09:56 -07:00
Kayvan Sylvan
1c33799aa8 docs: Update CHANGELOG after v1.4.255 2025-07-16 14:44:32 -07:00
github-actions[bot]
9559e618c3 Update version to v1.4.255 and commit 2025-07-16 21:38:03 +00:00
Kayvan Sylvan
ac32e8e64a Merge branch 'danielmiessler:main' into main 2025-07-16 14:35:00 -07:00
Kayvan Sylvan
82340e6126 chore: add more paths to update-version-andcreate-tag workflow to reduce unnecessary tagging 2025-07-16 14:34:29 -07:00
github-actions[bot]
5dec53726a Update version to v1.4.254 and commit 2025-07-16 21:21:12 +00:00
Kayvan Sylvan
b0eb136cbb Merge pull request #1621 from robertocarvajal/main
Adds generate code rules pattern
2025-07-16 14:19:46 -07:00
Roberto Carvajal
63f4370ff1 Adds generate code rules pattern
Signed-off-by: Roberto Carvajal <roberto.carvajal@gmail.com>
2025-07-16 11:15:55 -04:00
Kayvan Sylvan
b3cc2c737d docs: Update CHANGELOG after v1.4.253 2025-07-15 22:36:26 -07:00
github-actions[bot]
e43b4191e4 Update version to v1.4.253 and commit 2025-07-16 05:34:13 +00:00
Kayvan Sylvan
744c565120 Merge pull request #1620 from ksylvan/0715-thinking-flags-completions-scripts
Update Shell Completions for New Think-Block Suppression Options
2025-07-15 22:32:41 -07:00
Kayvan Sylvan
1473ac1465 feat: add 'think' tag options for text suppression and completion
### CHANGES

- Remove outdated update notes from README
- Add `--suppress-think` option to suppress 'think' tags
- Introduce `--think-start-tag` and `--think-end-tag` options
- Update bash completion with 'think' tag options
- Update fish completion with 'think' tag options
2025-07-15 22:26:12 -07:00
Kayvan Sylvan
c38c16f0db docs: Update CHANGELOG after v.1.4.252 2025-07-15 22:08:43 -07:00
github-actions[bot]
a4b1db4193 Update version to v1.4.252 and commit 2025-07-16 05:05:47 +00:00
Kayvan Sylvan
d44bc19a84 Merge pull request #1619 from ksylvan/0715-suppress-think
Feature: Optional Hiding of Model Thinking Process with Configurable Tags
2025-07-15 22:04:12 -07:00
Kayvan Sylvan
a2e618e11c perf: add regex caching to StripThinkBlocks function for improved performance 2025-07-15 22:02:16 -07:00
Kayvan Sylvan
cb90379b30 feat: add suppress-think feature to filter AI reasoning output
## CHANGES

- Add suppress-think flag to hide thinking blocks
- Configure customizable start and end thinking tags
- Strip thinking content from final response output
- Update streaming logic to respect suppress-think setting
- Add YAML configuration support for thinking options
- Implement StripThinkBlocks utility function for content filtering
- Add comprehensive tests for thinking suppression functionality
2025-07-15 21:52:27 -07:00
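A minimal sketch of think-block stripping with a cached, tag-specific regex, combining this commit with the regex-caching commit above; the cache layout and function signature are assumptions.

```go
package main

import (
	"fmt"
	"regexp"
	"sync"
)

// thinkRegexCache memoizes compiled patterns per tag pair, mirroring the
// regex-caching optimization; the cache layout is an assumption.
var (
	thinkRegexCache = map[string]*regexp.Regexp{}
	thinkCacheMu    sync.Mutex
)

// StripThinkBlocks removes everything between startTag and endTag, inclusive.
func StripThinkBlocks(text, startTag, endTag string) string {
	key := startTag + "\x00" + endTag
	thinkCacheMu.Lock()
	re, ok := thinkRegexCache[key]
	if !ok {
		// (?s) lets "." span newlines; the non-greedy match keeps separate blocks apart.
		re = regexp.MustCompile("(?s)" + regexp.QuoteMeta(startTag) + ".*?" + regexp.QuoteMeta(endTag))
		thinkRegexCache[key] = re
	}
	thinkCacheMu.Unlock()
	return re.ReplaceAllString(text, "")
}

func main() {
	out := StripThinkBlocks("<think>internal reasoning</think>The answer is 42.", "<think>", "</think>")
	fmt.Println(out) // The answer is 42.
}
```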
Kayvan Sylvan
4868687746 chore: Update CHANGELOG after v1.4.251 2025-07-15 21:44:14 -07:00
github-actions[bot]
85780fee76 Update version to v1.4.251 and commit 2025-07-16 03:49:08 +00:00
Kayvan Sylvan
497b1ed682 Merge pull request #1618 from ksylvan/0715-refrain-from-version-bumping-when-only-changelog-cache-changes
Update GitHub Workflow to Ignore Additional File Paths
2025-07-15 20:47:35 -07:00
Kayvan Sylvan
135433b749 ci: update workflow to ignore additional paths during version updates
## CHANGES

- Add `data/strategies/**` to paths-ignore list
- Add `cmd/generate_changelog/*.db` to paths-ignore list
- Prevent workflow triggers from strategy data changes
- Prevent workflow triggers from changelog database files
2025-07-15 20:38:44 -07:00
github-actions[bot]
f185dedb37 Update version to v1.4.250 and commit 2025-07-16 02:31:31 +00:00
Kayvan Sylvan
c74a157dcf docs: Update changelog with v1.4.249 changes 2025-07-15 19:29:46 -07:00
github-actions[bot]
91a336e870 Update version to v1.4.249 and commit 2025-07-16 01:32:15 +00:00
Kayvan Sylvan
5212fbcc37 Merge pull request #1617 from ksylvan/0715-really-really-fix-changelog-pr-sync-issue
Improve PR Sync Logic for Changelog Generator
2025-07-15 18:30:42 -07:00
Kayvan Sylvan
6d8eb3d2b9 chore: add log message for missing PRs in cache 2025-07-15 18:27:06 -07:00
Kayvan Sylvan
d3bba5d026 feat: preserve PR numbers during version cache merges
### CHANGES

- Enhance changelog to associate PR numbers with version tags
- Improve PR number parsing with proper error handling
- Collect all PR numbers for commits between version tags
- Associate aggregated PR numbers with each version entry
- Update cached versions with newly found PR numbers
- Add check for missing PRs to trigger sync if needed
2025-07-15 18:12:07 -07:00
github-actions[bot]
699762b694 Update version to v1.4.248 and commit 2025-07-16 00:24:44 +00:00
Kayvan Sylvan
f2a6f1bd98 Merge pull request #1616 from ksylvan/0715-fix-changelog-cache-logic
Preserve PR Numbers During Version Cache Merges
2025-07-15 17:23:17 -07:00
Kayvan Sylvan
3176adf59b fix: improve PR number parsing with proper error handling 2025-07-15 17:18:45 -07:00
Kayvan Sylvan
7e29966622 feat: enhance changelog to correctly associate PR numbers with version tags
### CHANGES

-   Collect all PR numbers for commits between version tags.
-   Associate aggregated PR numbers with each version entry.
-   Update cached versions with newly found PR numbers.
-   Attribute all changes in a version to relevant PRs.
2025-07-15 17:06:40 -07:00
Kayvan Sylvan
0af0ab683d docs: reorganize v1.4.247 changelog to attribute changes to PR #1613 2025-07-15 07:03:04 -07:00
github-actions[bot]
e72e67de71 Update version to v1.4.247 and commit 2025-07-15 07:07:13 +00:00
Kayvan Sylvan
414b6174e7 Merge pull request #1613 from ksylvan/0714-fixes-for-custom-directory-unique-patterns-list-changelog-cache
Improve AI Summarization for Consistent Professional Changelog Entries
2025-07-15 00:05:44 -07:00
Kayvan Sylvan
f63e0dfc05 chore: update logging output to use os.Stderr 2025-07-15 00:00:09 -07:00
Kayvan Sylvan
4ef8578e47 fix: improve error handling in plugin registry configuration 2025-07-14 23:54:05 -07:00
Kayvan Sylvan
12ee690ae4 chore: remove debug logging and sync custom patterns directory configuration
## CHANGES

- Remove debug stderr logging from content summarization
- Add custom patterns directory to PatternsLoader configuration
- Ensure consistent patterns directory setup across components
- Clean up unnecessary console output during summarization
2025-07-14 23:19:05 -07:00
Kayvan Sylvan
cc378be485 feat: improve error handling in plugin_registry and patterns_loader
### CHANGES

- Adjust prompt formatting in `summarize.go`
- Add error handling for `CustomPatterns` configuration
- Enhance error messages in `patterns_loader`
- Check for patterns in multiple directories
2025-07-14 23:09:39 -07:00
Kayvan Sylvan
06fc8d8732 chore: reorder plugin configuration sequence in PluginRegistry.Configure method
## CHANGES

- Move CustomPatterns.Configure() before PatternsLoader.Configure()
- Adjust plugin initialization order in Configure method
- Ensure proper dependency sequence for pattern loading
2025-07-14 22:51:52 -07:00
Kayvan Sylvan
9e4ed8ecb3 fix: improve git walking termination and error handling in changelog generator
## CHANGES

- Add storer import for proper git iteration control
- Use storer.ErrStop instead of nil for commit iteration termination
- Handle storer.ErrStop as expected condition in git walker
- Update cache comment to clarify Unreleased version skipping
- Change custom patterns warning to stderr output
- Add storer to VSCode spell checker dictionary
2025-07-14 22:34:13 -07:00
Kayvan Sylvan
c369425708 chore: clean up changelog and add debug logging for content length validation 2025-07-14 20:53:59 -07:00
Kayvan Sylvan
cf074d3411 feat: enhance changelog generation with incremental caching and improved AI summarization
## CHANGES

- Add incremental processing for new Git tags since cache
- Implement `WalkHistorySinceTag` method for efficient history traversal
- Cache new versions and commits after AI processing
- Update AI summarization prompt for better release note formatting
- Remove conventional commit prefix stripping from commit messages
- Add custom patterns directory support to plugin registry
- Generate unique patterns file including custom directory patterns
- Improve session string formatting with switch statement
2025-07-14 15:12:57 -07:00
Kayvan Sylvan
47f75237ff docs: update README for GraphQL optimization and AI summary features
### CHANGES

- Detail GraphQL API usage for faster PR fetching
- Introduce AI-powered summaries via Fabric integration
- Explain content-based caching for AI summaries
- Document support for loading secrets from .env files
- Add usage examples for new AI summary feature
- Clarify project license is The MIT License
2025-07-13 21:57:11 -07:00
github-actions[bot]
fad0a065d4 Update version to v1.4.246 and commit 2025-07-14 03:44:10 +00:00
Kayvan Sylvan
a59a3517d8 Merge pull request #1611 from ksylvan/0711-changelog
Changelog Generator: AI-Powered Automation for Fabric Project
2025-07-13 20:42:41 -07:00
Kayvan Sylvan
04c3c0c512 docs: Update CHANGELOG 2025-07-13 20:39:21 -07:00
Kayvan Sylvan
cb837bde2d feat: add AI-powered changelog generation with high-performance Go tool and comprehensive caching
## CHANGES

- Add high-performance Go changelog generator with GraphQL integration
- Implement SQLite-based persistent caching for incremental updates
- Create one-pass git history walking algorithm with concurrent processing
- Add comprehensive CLI with cobra framework and tag-based caching
- Integrate AI summarization using Fabric CLI for enhanced output
- Support batch PR fetching with GitHub Search API optimization
- Add VSCode configuration with spell checking and markdown linting
- Include extensive documentation with PRD and README files
- Implement commit-PR mapping for lightning-fast git operations
- Add content hashing for change detection and cache optimization
2025-07-13 20:26:23 -07:00
github-actions[bot]
2ad454b6dc Update version to v1.4.245 and commit 2025-07-11 20:56:33 +00:00
Kayvan Sylvan
c0ea25f816 Merge pull request #1603 from ksylvan/0711-together-ai-implementation
Together AI Support with OpenAI Fallback Mechanism Added
2025-07-11 13:55:02 -07:00
Kayvan Sylvan
87796d4fa9 chore: optimize model ID extraction and remove redundant comment
## CHANGES

- Remove duplicate comment about reading response body
- Preallocate slice capacity in extractModelIDs function
- Initialize modelIDs slice with known capacity
2025-07-11 13:48:30 -07:00
Kayvan Sylvan
e1945a0b62 fix: improve error message truncation in DirectlyGetModels method
## CHANGES

- Add proper bounds checking for response body truncation
- Prevent slice out of bounds errors in error messages
- Add ellipsis indicator when response body is truncated
- Improve error message clarity for debugging purposes
2025-07-11 13:33:16 -07:00
Kayvan Sylvan
ecac2b4c34 refactor: clean up HTTP request handling and improve error response formatting
## CHANGES

- Remove unnecessary else block in HTTP request creation
- Move header setting outside conditional block for clarity
- Add TODO comment about reusing HTTP client instance
- Truncate error response output to prevent excessive logging
2025-07-11 13:19:44 -07:00
Kayvan Sylvan
7ed4de269e chore: refactor DirectlyGetModels to read response body once
## CHANGES
* Read response body once for efficiency
* Use io.ReadAll for response body
* Unmarshal json from bodyBytes
* Return error with raw response bytes
* Improve error handling for json parsing
2025-07-11 13:11:30 -07:00
Kayvan Sylvan
6bd305906d fix: increase error response limit and simplify model extraction logic
## CHANGES

- Increase error response limit from 500 to 1024 bytes
- Add documentation comment for ErrorResponseLimit constant
- Remove unnecessary error return from extractModelIDs function
- Fix return statements in DirectlyGetModels parsing logic
- Add TODO comment for proper context handling
- Simplify model ID extraction without error propagation
2025-07-11 12:59:55 -07:00
Kayvan Sylvan
6aeca6e4da fix: improve error message in DirectlyGetModels to include provider name
## CHANGES

- Add provider name to API base URL error message
- Enhance error context for better debugging experience
- Include GetName() method call in error formatting
2025-07-11 12:50:57 -07:00
Kayvan Sylvan
b34f249e24 feat: add context support to DirectlyGetModels method
## CHANGES

- Add context parameter to DirectlyGetModels method signature
- Add nil context check with Background fallback
- Extract magic number 500 into errorResponseLimit constant
- Update DirectlyGetModels call to pass context.Background
- Import context package in providers_config.go file
2025-07-11 12:43:31 -07:00
Kayvan Sylvan
b187a80275 refactor: replace string manipulation with url.JoinPath for models endpoint construction 2025-07-11 12:29:12 -07:00
Kayvan Sylvan
a6fc54a991 refactor: improve OpenAI compatible models API client with timeout and cleaner parsing 2025-07-11 12:09:20 -07:00
Kayvan Sylvan
b9f4b9837a refactor: extract model ID parsing logic into reusable helper function 2025-07-11 11:57:04 -07:00
Kayvan Sylvan
2bedf35957 fix: enhance error messages in OpenAI compatible models endpoint with response body details 2025-07-11 11:17:22 -07:00
Kayvan Sylvan
b9df64a0d8 feat: add direct model fetching support for non-standard providers
- Add `DirectlyGetModels` function to handle non-standard API responses
- Add Together provider configuration to ProviderMap
- Implement fallback to direct model fetching when standard method fails
2025-07-11 09:12:53 -07:00
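The direct-fetch fallback described here amounts to calling the provider's `/models` endpoint and extracting IDs from the JSON. This sketch assumes an OpenAI-style `{"data":[...]}` payload for brevity; the real helper exists precisely because some providers return non-standard shapes, so its parsing, timeouts, and error truncation differ.

```go
package main

import (
	"context"
	"encoding/json"
	"fmt"
	"io"
	"net/http"
	"net/url"
	"time"
)

// directlyGetModels fetches baseURL/models and extracts model IDs; a sketch of
// the fallback path used when the standard client call fails.
func directlyGetModels(ctx context.Context, baseURL string) ([]string, error) {
	endpoint, err := url.JoinPath(baseURL, "models")
	if err != nil {
		return nil, err
	}
	req, err := http.NewRequestWithContext(ctx, http.MethodGet, endpoint, nil)
	if err != nil {
		return nil, err
	}
	client := &http.Client{Timeout: 30 * time.Second}
	resp, err := client.Do(req)
	if err != nil {
		return nil, err
	}
	defer resp.Body.Close()
	body, err := io.ReadAll(resp.Body) // read once, reuse for parsing and error output
	if err != nil {
		return nil, err
	}
	var parsed struct {
		Data []struct {
			ID string `json:"id"`
		} `json:"data"`
	}
	if err := json.Unmarshal(body, &parsed); err != nil {
		return nil, fmt.Errorf("unexpected response: %.100s: %w", body, err)
	}
	ids := make([]string, 0, len(parsed.Data))
	for _, m := range parsed.Data {
		ids = append(ids, m.ID)
	}
	return ids, nil
}

func main() {
	models, err := directlyGetModels(context.Background(), "https://api.together.xyz/v1")
	fmt.Println(models, err)
}
```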
Kayvan Sylvan
6b07b33ff2 fix: broken image link 2025-07-09 11:41:33 -07:00
Kayvan Sylvan
ff245edd51 Merge pull request #1599 from ksylvan/0708-fix-documentation-paths-for-restructured-repo
Update file paths to reflect new data directory structure
2025-07-09 08:47:21 -07:00
Kayvan Sylvan
2e0a4da876 docs: update file paths to reflect new data directory structure
## CHANGES

- Move fabric logo image path to docs directory
- Update patterns directory reference to data/patterns location
- Update strategies directory reference to data/strategies location
- Fix create_coding_feature README path reference
- Update code_helper install path to cmd directory
2025-07-09 08:44:12 -07:00
github-actions[bot]
1f3befbbbc Update version to v1.4.244 and commit 2025-07-09 14:58:49 +00:00
Kayvan Sylvan
8988206fbe Merge pull request #1598 from jaredmontoya/flake-enhance 2025-07-09 07:57:12 -07:00
jaredmontoya
1bd5f9d7e4 shell: fix typo 2025-07-09 14:00:59 +02:00
github-actions[bot]
832fd2f718 Update version to v1.4.243 and commit 2025-07-09 10:53:52 +00:00
Kayvan Sylvan
dd0935fb70 Merge pull request #1597 from ksylvan/0708-more-refactoring-fixes-for-patterns-loading
CLI Refactoring: Modular Command Processing and Pattern Loading Improvements
2025-07-09 03:52:25 -07:00
Kayvan Sylvan
e64bdd849c fix: improve error handling and temporary file management in patterns loader
## CHANGES

- Replace println with fmt.Fprintln to stderr for errors
- Use os.MkdirTemp for secure temporary directory creation
- Remove unused time import from patterns loader
- Add proper error wrapping for file operations
- Handle RemoveAll errors with warning messages
- Improve error messages with context information
- Add explicit error checking for cleanup operations
2025-07-09 03:40:03 -07:00
Kayvan Sylvan
be82b4b013 chore: improve error handling for scraping configuration in tools.go 2025-07-09 03:23:29 -07:00
Kayvan Sylvan
6e2f00090c chore: enhance error handling and early returns in CLI
### CHANGES
- Add early return if registry is nil to prevent panics.
- Introduce early return for non-chat tool operations.
- Update error message to use original input on HTML readability failure.
- Enhance error wrapping for playlist video fetching.
- Modify temp patterns folder name with timestamp for uniqueness.
- Improve error handling for patterns directory access.
2025-07-09 03:15:57 -07:00
jaredmontoya
7d6505fe98 update-mod: fix generation path 2025-07-09 12:07:56 +02:00
jaredmontoya
23c1437794 shell: rename command 2025-07-09 12:07:07 +02:00
jaredmontoya
dd5e57477f nix:pkgs:fabric: use self reference 2025-07-09 12:07:07 +02:00
Kayvan Sylvan
2c2b374664 chore: remove fabric binary 2025-07-09 02:58:09 -07:00
Kayvan Sylvan
b884c529bd chore: update command handlers to return 'handled' boolean
### CHANGES

- Add `handled` boolean return to command handlers
- Modify `handleSetupAndServerCommands` to use `handled`
- Update `handleConfigurationCommands` with `handled` logic
- Implement `handled` return in `handleExtensionCommands`
- Revise `handleListingCommands` to support `handled` return
- Adjust `handleManagementCommands` to return `handled`
2025-07-09 02:57:29 -07:00
Kayvan Sylvan
137aff2268 feat: refactor CLI to modularize command handling
### CHANGES

* Extract chat processing logic into separate function
* Create modular command handlers for setup, configuration, listing, management, and extensions
* Improve patterns loader with migration support and better error handling
* Simplify main CLI logic by delegating to specialized handlers
* Enhance code organization and maintainability
* Add tool processing for YouTube and web scraping functionality
2025-07-09 02:29:38 -07:00
github-actions[bot]
42d3f45c57 Update version to v1.4.242 and commit 2025-07-09 07:49:40 +00:00
Kayvan Sylvan
f8ada0b148 Merge pull request #1596 from ksylvan/0708-fix-patterns-workflow
Fix patterns zipping workflow
2025-07-09 00:48:08 -07:00
Kayvan Sylvan
cb5fa50f68 chore: update workflow paths to reflect directory structure change
### CHANGES

- Modify trigger path to `data/patterns/**`
- Update `git diff` command to new path
- Change zip command to include `data/patterns/` directory
2025-07-09 00:43:55 -07:00
github-actions[bot]
921d12a153 Update version to v1.4.241 and commit 2025-07-09 07:40:23 +00:00
Kayvan Sylvan
725c6f9327 Merge pull request #1595 from ksylvan/0708-project-restructure
Restructure project to align with standard Go layout
2025-07-09 00:38:49 -07:00
Kayvan Sylvan
9a64238f18 docs: Update README with new go install commands 2025-07-09 00:33:21 -07:00
Kayvan Sylvan
af318aca17 fix: minor edit 2025-07-09 00:24:35 -07:00
Kayvan Sylvan
4d7bc7deb8 docs: update restructure guide with Homebrew and go install details
### CHANGES

- Document required Homebrew formula update for new structure.
- Add new `go install` commands for all tools.
- Specify new build path is `./cmd/fabric`.
- Include link to the draft Homebrew PR.
2025-07-09 00:11:15 -07:00
Kayvan Sylvan
da1336e8cb feat: add new patterns for content tagging and cognitive bias analysis
## CHANGES

- Fix static directory path in extract_patterns.py script
- Add apply_ul_tags pattern for content categorization
- Add t_check_dunning_kruger pattern for bias analysis
- Update pattern descriptions with new entries
- Sync web static data with latest patterns
- Include pattern extracts for new functionality
- Support standardized content topic classification
- Enable cognitive bias identification capabilities
2025-07-08 23:48:30 -07:00
Kayvan Sylvan
81adb3b050 docs: update project restructuring status and reorganize pattern scripts
## CHANGES

- Mark all 10 migration steps as completed
- Add restructuring completion status section
- Move pattern generation scripts to pattern_descriptions
- Update completion checkmarks throughout migration plan
- Document remaining external packaging verification tasks
- Consolidate pattern description files under new directory
- Confirm all binaries compile and tests pass
- Note standard Go project layout achieved
2025-07-08 23:36:56 -07:00
Kayvan Sylvan
ebc59ee82a refactor: move common package to domain and util packages for better organization
## CHANGES

- Move domain types from common to domain package
- Move utility functions from common to util package
- Update all import statements across codebase
- Reorganize OAuth storage functionality into util package
- Move file management functions to domain package
- Update test files to use new package structure
- Maintain backward compatibility for existing functionality
2025-07-08 23:26:11 -07:00
Kayvan Sylvan
e242e0fc52 refactor: alias server package import as restapi for clarity
### CHANGES

*   Rename the `server` package import to `restapi`.
*   Improve code readability and prevent naming collisions.
2025-07-08 23:03:52 -07:00
Kayvan Sylvan
4004c51b9e refactor: restructure project to align with standard Go layout
### CHANGES

- Introduce `cmd` directory for all main application binaries.
- Move all Go packages into the `internal` directory.
- Rename the `restapi` package to `server` for clarity.
- Consolidate patterns and strategies into a new `data` directory.
- Group all auxiliary scripts into a new `scripts` directory.
- Move all documentation and images into a `docs` directory.
- Update all Go import paths to reflect the new structure.
- Adjust CI/CD workflows and build commands for new layout.
2025-07-08 22:47:17 -07:00
Kayvan Sylvan
6d67223a4b Merge pull request #1594 from amancioandre/main
Adds check Dunning-Kruger Telos self-evaluation pattern
2025-07-07 18:38:28 -07:00
github-actions[bot]
265f2b807e Update version to v1.4.240 and commit 2025-07-07 21:25:55 +00:00
Kayvan Sylvan
dc63e0d1cc Merge pull request #1593 from ksylvan/0707-claude-oauth-improvement
Refactor: Generalize OAuth flow for improved token handling.
2025-07-07 14:24:22 -07:00
Kayvan Sylvan
75842d8610 chore: refactor token path to use authTokenIdentifier 2025-07-07 13:59:13 -07:00
Kayvan Sylvan
bcd4c6caea test: add comprehensive OAuth testing suite for Anthropic plugin
## CHANGES

- Add OAuth test file with 434 lines coverage
- Create mock token server for safe testing
- Implement PKCE generation and validation tests
- Add token expiration logic verification tests
- Create OAuth transport round-trip testing
- Add benchmark tests for performance validation
- Implement helper functions for test token creation
- Add comprehensive error path testing scenarios
2025-07-07 13:50:57 -07:00
Kayvan Sylvan
a6a63698e1 fix: update RefreshToken to use tokenIdentifier parameter 2025-07-07 13:31:08 -07:00
Kayvan Sylvan
0528556b5c refactor: replace hardcoded "claude" with configurable authTokenIdentifier parameter
## CHANGES

- Replace hardcoded "claude" string with `authTokenIdentifier` constant
- Update `RunOAuthFlow` to accept token identifier parameter
- Modify `RefreshToken` to use configurable token identifier
- Update `exchangeToken` to accept token identifier parameter
- Enhance `getValidToken` to use parameterized token identifier
- Add token refresh attempt before full OAuth flow
- Improve OAuth flow with existing token validation
2025-07-07 13:19:00 -07:00
apollo
a43f267a69 Merge branch 'main' of https://github.com/amancioandre/Fabric 2025-07-01 16:12:23 -07:00
apollo
c78fe41ebc fix: sections as heading 1, typos 2025-07-01 16:12:13 -07:00
Andre Amnc
cab246bc74 Merge branch 'danielmiessler:main' into main 2025-07-01 16:07:25 -07:00
apollo
50c05e2d5c feat: adds pattern telos check dunning kruger 2025-07-01 16:06:59 -07:00
505 changed files with 9529 additions and 3264 deletions


@@ -4,13 +4,13 @@ on:
   push:
     branches: ["main"]
     paths-ignore:
-      - 'patterns/**'
-      - '**/*.md'
+      - "data/patterns/**"
+      - "**/*.md"
   pull_request:
     branches: ["main"]
     paths-ignore:
-      - 'patterns/**'
-      - '**/*.md'
+      - "data/patterns/**"
+      - "**/*.md"
 jobs:
   test:


@@ -3,7 +3,7 @@ name: Patterns Artifact
 on:
   push:
     paths:
-      - "patterns/**" # Trigger only on changes to files in the patterns folder
+      - "data/patterns/**" # Trigger only on changes to files in the patterns folder
 jobs:
   zip-and-upload:
@@ -18,13 +18,13 @@ jobs:
       - name: Verify Changes in Patterns Folder
         run: |
           git fetch origin
-          if git diff --quiet HEAD~1 -- patterns; then
+          if git diff --quiet HEAD~1 -- data/patterns; then
             echo "No changes detected in patterns folder."
             exit 1
           fi
       - name: Zip the Patterns Folder
-        run: zip -r patterns.zip patterns/
+        run: zip -r patterns.zip data/patterns/
       - name: Upload Patterns Artifact
         uses: actions/upload-artifact@v4


@@ -69,7 +69,7 @@ jobs:
           GOOS: ${{ env.OS }}
           GOARCH: ${{ matrix.arch }}
         run: |
-          go build -o fabric-${OS}-${{ matrix.arch }} .
+          go build -o fabric-${OS}-${{ matrix.arch }} ./cmd/fabric
       - name: Build binary on Windows
         if: matrix.os == 'windows-latest'
@@ -77,7 +77,7 @@
           GOOS: windows
           GOARCH: ${{ matrix.arch }}
         run: |
-          go build -o fabric-windows-${{ matrix.arch }}.exe .
+          go build -o fabric-windows-${{ matrix.arch }}.exe ./cmd/fabric
       - name: Upload build artifact
         if: matrix.os != 'windows-latest'


@@ -3,13 +3,17 @@ name: Update Version File and Create Tag
on:
push:
branches:
- main # Monitor the main branch
- main # Monitor the main branch
paths-ignore:
- 'patterns/**'
- '**/*.md'
- "data/patterns/**"
- "**/*.md"
- "data/strategies/**"
- "cmd/generate_changelog/*.db"
- "scripts/pattern_descriptions/*.json"
- "web/static/data/pattern_descriptions.json"
permissions:
contents: write # Ensure the workflow has write permissions
contents: write # Ensure the workflow has write permissions
concurrency:
group: version-update
@@ -63,14 +67,14 @@ jobs:
- name: Update version.go file
run: |
echo "package main" > version.go
echo "" >> version.go
echo "var version = \"${{ env.new_tag }}\"" >> version.go
echo "package main" > cmd/fabric/version.go
echo "" >> cmd/fabric/version.go
echo "var version = \"${{ env.new_tag }}\"" >> cmd/fabric/version.go
- name: Update version.nix file
run: |
echo "\"${{ env.new_version }}\"" > nix/pkgs/fabric/version.nix
- name: Format source code
run: |
nix fmt
@@ -79,21 +83,30 @@ jobs:
run: |
nix run .#gomod2nix -- --outdir nix/pkgs/fabric
- name: Generate Changelog Entry
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: |
go run ./cmd/generate_changelog --process-prs ${{ env.new_tag }}
- name: Commit changes
run: |
git add version.go
# These files are modified by the version bump process
git add cmd/fabric/version.go
git add nix/pkgs/fabric/version.nix
git add nix/pkgs/fabric/gomod2nix.toml
git add .
# The changelog tool is responsible for staging CHANGELOG.md, changelog.db,
# and removing the incoming/ directory.
if ! git diff --staged --quiet; then
git commit -m "Update version to ${{ env.new_tag }} and commit $commit_hash"
git commit -m "chore(release): Update version to ${{ env.new_tag }}"
else
echo "No changes to commit."
fi
- name: Push changes
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} # Use GITHUB_TOKEN to authenticate the push
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} # Use GITHUB_TOKEN to authenticate the push
run: |
git push origin main # Push changes to the main branch
@@ -106,7 +119,7 @@ jobs:
- name: Dispatch event to trigger release workflow
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} # Use GITHUB_TOKEN to authenticate the dispatch
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} # Use GITHUB_TOKEN to authenticate the dispatch
run: |
curl -X POST \
-H "Authorization: token $GITHUB_TOKEN" \

.gitignore (10 changed lines)

@@ -131,9 +131,7 @@ celerybeat.pid
# Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/
@@ -349,5 +347,9 @@ web/package-lock.json
.gitignore_backup
web/static/*.png
# Local VSCode project settings
.vscode/
# Local tmp directory
.tmp/
tmp/
# Ignore .claude/
.claude/

.vscode/extensions.json (new file, 3 lines)

@@ -0,0 +1,3 @@
+{
+  "recommendations": ["davidanson.vscode-markdownlint"]
+}

.vscode/settings.json (new file, 153 lines)

@@ -0,0 +1,153 @@
{
"cSpell.words": [
"addextension",
"AIML",
"anthropics",
"badfile",
"Behrens",
"blindspots",
"Bombal",
"Cerebras",
"compinit",
"creatordate",
"custompatterns",
"danielmiessler",
"davidanson",
"Debugf",
"dedup",
"deepseek",
"direnv",
"dryrun",
"dsrp",
"editability",
"Eisler",
"elif",
"envrc",
"Errorf",
"eugeis",
"Eugen",
"excalidraw",
"exolab",
"fabriclogo",
"fpath",
"frequencypenalty",
"fsdb",
"gantt",
"genai",
"githelper",
"gjson",
"GOARCH",
"godotenv",
"gofmt",
"goimports",
"gomod",
"gonic",
"goopenai",
"GOPATH",
"gopkg",
"GOROOT",
"Graphviz",
"grokai",
"Groq",
"hackerone",
"Haddix",
"hasura",
"hormozi",
"Hormozi's",
"horts",
"HTMLURL",
"jaredmontoya",
"jessevdk",
"Jina",
"joho",
"ksylvan",
"Langdock",
"ldflags",
"libexec",
"listcontexts",
"listextensions",
"listmodels",
"listpatterns",
"listsessions",
"liststrategies",
"listvendors",
"lmstudio",
"Makefiles",
"markmap",
"matplotlib",
"mattn",
"mbed",
"Miessler",
"nometa",
"numpy",
"ollama",
"ollamaapi",
"openaiapi",
"opencode",
"openrouter",
"otiai",
"pdflatex",
"pipx",
"PKCE",
"pkgs",
"presencepenalty",
"printcontext",
"printsession",
"pycache",
"pyperclip",
"readystream",
"restapi",
"rmextension",
"samber",
"sashabaranov",
"sdist",
"seaborn",
"semgrep",
"sess",
"storer",
"Streamlit",
"stretchr",
"talkpanel",
"Telos",
"testpattern",
"Thacker",
"tidwall",
"topp",
"ttrc",
"unalias",
"unmarshalling",
"updatepatterns",
"videoid",
"webp",
"WEBVTT",
"wipecontext",
"wipesession",
"Worktree",
"writeups",
"xclip",
"yourpatternname",
"youtu"
],
"cSpell.ignorePaths": ["go.mod", ".gitignore", "CHANGELOG.md"],
"markdownlint.config": {
"MD004": false,
"MD011": false,
"MD024": false,
"MD025": false,
"M032": false,
"MD033": {
"allowed_elements": [
"a",
"br",
"code",
"div",
"em",
"h4",
"img",
"module",
"p"
]
},
"MD041": false
}
}

CHANGELOG.md (2425 changed lines; diff suppressed because it is too large)


@@ -3,7 +3,7 @@ Fabric is graciously supported by…
[![Github Repo Tagline](https://github.com/user-attachments/assets/96ab3d81-9b13-4df4-ba09-75dee7a5c3d2)](https://warp.dev/fabric)
<img src="./images/fabric-logo-gif.gif" alt="fabriclogo" width="400" height="400"/>
<img src="./docs/images/fabric-logo-gif.gif" alt="fabriclogo" width="400" height="400"/>
# `fabric`
@@ -29,7 +29,7 @@ Fabric is graciously supported by…
[Helper Apps](#helper-apps) •
[Meta](#meta)
![Screenshot of fabric](images/fabric-summarize.png)
![Screenshot of fabric](./docs/images/fabric-summarize.png)
</div>
@@ -113,30 +113,9 @@ Keep in mind that many of these were recorded when Fabric was Python-based, so r
## Updates
> [!NOTE]
>
> July 4, 2025
>
> - **Web Search**: Fabric now supports web search for Anthropic and OpenAI models using the `--search` and `--search-location` flags. This replaces the previous plugin-based search, so you may want to remove the old `ANTHROPIC_WEB_SEARCH_TOOL_*` variables from your `~/.config/fabric/.env` file.
> - **Image Generation**: Fabric now has powerful image generation capabilities with OpenAI.
> - Generate images from text prompts and save them using `--image-file`.
> - Edit existing images by providing an input image with `--attachment`.
> - Control image `size`, `quality`, `compression`, and `background` with the new `--image-*` flags.
>
>June 17, 2025
>
>- Fabric now supports Perplexity AI. Configure it by using `fabric -S` to add your Perplexity AI API Key,
> and then try:
>
> ```bash
> fabric -m sonar-pro "What is the latest world news?"
> ```
>
>June 11, 2025
>
>- Fabric's YouTube transcription now needs `yt-dlp` to be installed. Make sure to install the latest
> version (2025.06.09 as of this note). The YouTube API key is only needed for comments (the `--comments` flag)
> and metadata extraction (the `--metadata` flag).
Fabric is evolving rapidly.
Stay current with the latest features by reviewing the [CHANGELOG](./CHANGELOG.md) for all recent changes.
## Philosophy
@@ -218,7 +197,7 @@ To install Fabric, [make sure Go is installed](https://go.dev/doc/install), and
```bash
# Install Fabric directly from the repo
go install github.com/danielmiessler/fabric@latest
go install github.com/danielmiessler/fabric/cmd/fabric@latest
```
### Environment Variables
@@ -428,7 +407,7 @@ pipx uninstall fabric
# Clear any old Fabric aliases
(check your .bashrc, .zshrc, etc.)
# Install the Go version
go install github.com/danielmiessler/fabric@latest
go install github.com/danielmiessler/fabric/cmd/fabric@latest
# Run setup for the new version. Important because things have changed
fabric --setup
```
@@ -440,7 +419,7 @@ Then [set your environmental variables](#environment-variables) as shown above.
The great thing about Go is that it's super easy to upgrade. Just run the same command you used to install it in the first place and you'll always get the latest version.
```bash
go install github.com/danielmiessler/fabric@latest
go install github.com/danielmiessler/fabric/cmd/fabric@latest
```
### Shell Completions
@@ -565,10 +544,13 @@ Application Options:
--image-compression= Compression level 0-100 for JPEG/WebP formats (default: not set)
--image-background= Background type: opaque, transparent (default: opaque, only for
PNG/WebP)
--suppress-think Suppress text enclosed in thinking tags
--think-start-tag= Start tag for thinking sections (default: <think>)
--think-end-tag= End tag for thinking sections (default: </think>)
--disable-responses-api Disable OpenAI Responses API (default: false)
Help Options:
-h, --help Show this help message
```
## Our approach to prompting
@@ -628,7 +610,7 @@ Now let's look at some things you can do with Fabric.
<br />
<br />
If you're not looking to do anything fancy, and you just want a lot of great prompts, you can navigate to the [`/patterns`](https://github.com/danielmiessler/fabric/tree/main/patterns) directory and start exploring!
If you're not looking to do anything fancy, and you just want a lot of great prompts, you can navigate to the [`/patterns`](https://github.com/danielmiessler/fabric/tree/main/data/patterns) directory and start exploring!
We hope that if you used nothing else from Fabric, the Patterns by themselves will make the project useful.
@@ -644,7 +626,7 @@ be used in addition to the basic patterns.
See the [Thinking Faster by Writing Less](https://arxiv.org/pdf/2502.18600) paper and
the [Thought Generation section of Learn Prompting](https://learnprompting.org/docs/advanced/thought_generation/introduction) for examples of prompt strategies.
Each strategy is available as a small `json` file in the [`/strategies`](https://github.com/danielmiessler/fabric/tree/main/strategies) directory.
Each strategy is available as a small `json` file in the [`/strategies`](https://github.com/danielmiessler/fabric/tree/main/data/strategies) directory.
The prompt modification of the strategy is applied to the system prompt and passed on to the
LLM in the chat session.
@@ -725,7 +707,7 @@ This will create a PDF file named `output.pdf` in the current directory.
To install `to_pdf`, install it the same way as you install Fabric, just with a different repo name.
```bash
go install github.com/danielmiessler/fabric/plugins/tools/to_pdf@latest
go install github.com/danielmiessler/fabric/cmd/to_pdf@latest
```
Make sure you have a LaTeX distribution (like TeX Live or MiKTeX) installed on your system, as `to_pdf` requires `pdflatex` to be available in your system's PATH.
@@ -736,12 +718,12 @@ Make sure you have a LaTeX distribution (like TeX Live or MiKTeX) installed on y
It generates a `json` representation of a directory of code that can be fed into an AI model
with instructions to create a new feature or edit the code in a specified way.
See [the Create Coding Feature Pattern README](./patterns/create_coding_feature/README.md) for details.
See [the Create Coding Feature Pattern README](./data/patterns/create_coding_feature/README.md) for details.
Install it first using:
```bash
go install github.com/danielmiessler/fabric/plugins/tools/code_helper@latest
go install github.com/danielmiessler/fabric/cmd/code_helper@latest
```
## pbpaste


@@ -1,361 +0,0 @@
package cli
import (
"encoding/json"
"fmt"
"os"
"path/filepath"
"strconv"
"strings"
"github.com/danielmiessler/fabric/plugins/tools/youtube"
"github.com/danielmiessler/fabric/common"
"github.com/danielmiessler/fabric/core"
"github.com/danielmiessler/fabric/plugins/ai"
"github.com/danielmiessler/fabric/plugins/db/fsdb"
"github.com/danielmiessler/fabric/plugins/tools/converter"
"github.com/danielmiessler/fabric/restapi"
)
// Cli Controls the cli. It takes in the flags and runs the appropriate functions
func Cli(version string) (err error) {
var currentFlags *Flags
if currentFlags, err = Init(); err != nil {
return
}
if currentFlags.Version {
fmt.Println(version)
return
}
var homedir string
if homedir, err = os.UserHomeDir(); err != nil {
return
}
fabricDb := fsdb.NewDb(filepath.Join(homedir, ".config/fabric"))
if err = fabricDb.Configure(); err != nil {
if !currentFlags.Setup {
println(err.Error())
currentFlags.Setup = true
}
}
var registry *core.PluginRegistry
if registry, err = core.NewPluginRegistry(fabricDb); err != nil {
return
}
// if the setup flag is set, run the setup function
if currentFlags.Setup {
err = registry.Setup()
return
}
if currentFlags.Serve {
registry.ConfigureVendors()
err = restapi.Serve(registry, currentFlags.ServeAddress, currentFlags.ServeAPIKey)
return
}
if currentFlags.ServeOllama {
registry.ConfigureVendors()
err = restapi.ServeOllama(registry, currentFlags.ServeAddress, version)
return
}
if currentFlags.UpdatePatterns {
err = registry.PatternsLoader.PopulateDB()
return
}
if currentFlags.ChangeDefaultModel {
if err = registry.Defaults.Setup(); err != nil {
return
}
err = registry.SaveEnvFile()
return
}
if currentFlags.LatestPatterns != "0" {
var parsedToInt int
if parsedToInt, err = strconv.Atoi(currentFlags.LatestPatterns); err != nil {
return
}
if err = fabricDb.Patterns.PrintLatestPatterns(parsedToInt); err != nil {
return
}
return
}
if currentFlags.ListPatterns {
err = fabricDb.Patterns.ListNames(currentFlags.ShellCompleteOutput)
return
}
if currentFlags.ListAllModels {
var models *ai.VendorsModels
if models, err = registry.VendorManager.GetModels(); err != nil {
return
}
models.Print(currentFlags.ShellCompleteOutput)
return
}
if currentFlags.ListAllContexts {
err = fabricDb.Contexts.ListNames(currentFlags.ShellCompleteOutput)
return
}
if currentFlags.ListAllSessions {
err = fabricDb.Sessions.ListNames(currentFlags.ShellCompleteOutput)
return
}
if currentFlags.WipeContext != "" {
err = fabricDb.Contexts.Delete(currentFlags.WipeContext)
return
}
if currentFlags.WipeSession != "" {
err = fabricDb.Sessions.Delete(currentFlags.WipeSession)
return
}
if currentFlags.PrintSession != "" {
err = fabricDb.Sessions.PrintSession(currentFlags.PrintSession)
return
}
if currentFlags.PrintContext != "" {
err = fabricDb.Contexts.PrintContext(currentFlags.PrintContext)
return
}
if currentFlags.HtmlReadability {
if msg, cleanErr := converter.HtmlReadability(currentFlags.Message); cleanErr != nil {
fmt.Println("use original input, because can't apply html readability", err)
} else {
currentFlags.Message = msg
}
}
if currentFlags.ListExtensions {
err = registry.TemplateExtensions.ListExtensions()
return
}
if currentFlags.AddExtension != "" {
err = registry.TemplateExtensions.RegisterExtension(currentFlags.AddExtension)
return
}
if currentFlags.RemoveExtension != "" {
err = registry.TemplateExtensions.RemoveExtension(currentFlags.RemoveExtension)
return
}
if currentFlags.ListStrategies {
err = registry.Strategies.ListStrategies(currentFlags.ShellCompleteOutput)
return
}
if currentFlags.ListVendors {
err = registry.ListVendors(os.Stdout)
return
}
// if the interactive flag is set, run the interactive function
// if currentFlags.Interactive {
// interactive.Interactive()
// }
// if none of the above currentFlags are set, run the initiate chat function
var messageTools string
if currentFlags.YouTube != "" {
if !registry.YouTube.IsConfigured() {
err = fmt.Errorf("YouTube is not configured, please run the setup procedure")
return
}
var videoId string
var playlistId string
if videoId, playlistId, err = registry.YouTube.GetVideoOrPlaylistId(currentFlags.YouTube); err != nil {
return
} else if (videoId == "" || currentFlags.YouTubePlaylist) && playlistId != "" {
if currentFlags.Output != "" {
err = registry.YouTube.FetchAndSavePlaylist(playlistId, currentFlags.Output)
} else {
var videos []*youtube.VideoMeta
if videos, err = registry.YouTube.FetchPlaylistVideos(playlistId); err != nil {
err = fmt.Errorf("error fetching playlist videos: %v", err)
return
}
for _, video := range videos {
var message string
if message, err = processYoutubeVideo(currentFlags, registry, video.Id); err != nil {
return
}
if !currentFlags.IsChatRequest() {
if err = WriteOutput(message, fmt.Sprintf("%v.md", video.TitleNormalized)); err != nil {
return
}
} else {
messageTools = AppendMessage(messageTools, message)
}
}
}
return
}
if messageTools, err = processYoutubeVideo(currentFlags, registry, videoId); err != nil {
return
}
if !currentFlags.IsChatRequest() {
err = currentFlags.WriteOutput(messageTools)
return
}
}
if (currentFlags.ScrapeURL != "" || currentFlags.ScrapeQuestion != "") && registry.Jina.IsConfigured() {
// Check if the scrape_url flag is set and call ScrapeURL
if currentFlags.ScrapeURL != "" {
var website string
if website, err = registry.Jina.ScrapeURL(currentFlags.ScrapeURL); err != nil {
return
}
messageTools = AppendMessage(messageTools, website)
}
// Check if the scrape_question flag is set and call ScrapeQuestion
if currentFlags.ScrapeQuestion != "" {
var website string
if website, err = registry.Jina.ScrapeQuestion(currentFlags.ScrapeQuestion); err != nil {
return
}
messageTools = AppendMessage(messageTools, website)
}
if !currentFlags.IsChatRequest() {
err = currentFlags.WriteOutput(messageTools)
return
}
}
if messageTools != "" {
currentFlags.AppendMessage(messageTools)
}
var chatter *core.Chatter
if chatter, err = registry.GetChatter(currentFlags.Model, currentFlags.ModelContextLength,
currentFlags.Strategy, currentFlags.Stream, currentFlags.DryRun); err != nil {
return
}
var session *fsdb.Session
var chatReq *common.ChatRequest
if chatReq, err = currentFlags.BuildChatRequest(strings.Join(os.Args[1:], " ")); err != nil {
return
}
if chatReq.Language == "" {
chatReq.Language = registry.Language.DefaultLanguage.Value
}
var chatOptions *common.ChatOptions
if chatOptions, err = currentFlags.BuildChatOptions(); err != nil {
return
}
if session, err = chatter.Send(chatReq, chatOptions); err != nil {
return
}
result := session.GetLastMessage().Content
if !currentFlags.Stream {
// print the result if it was not streamed already
fmt.Println(result)
}
// if the copy flag is set, copy the message to the clipboard
if currentFlags.Copy {
if err = CopyToClipboard(result); err != nil {
return
}
}
// if the output flag is set, create an output file
if currentFlags.Output != "" {
if currentFlags.OutputSession {
sessionAsString := session.String()
err = CreateOutputFile(sessionAsString, currentFlags.Output)
} else {
err = CreateOutputFile(result, currentFlags.Output)
}
}
return
}
func processYoutubeVideo(
flags *Flags, registry *core.PluginRegistry, videoId string) (message string, err error) {
if (!flags.YouTubeComments && !flags.YouTubeMetadata) || flags.YouTubeTranscript || flags.YouTubeTranscriptWithTimestamps {
var transcript string
var language = "en"
if flags.Language != "" || registry.Language.DefaultLanguage.Value != "" {
if flags.Language != "" {
language = flags.Language
} else {
language = registry.Language.DefaultLanguage.Value
}
}
if flags.YouTubeTranscriptWithTimestamps {
if transcript, err = registry.YouTube.GrabTranscriptWithTimestamps(videoId, language); err != nil {
return
}
} else {
if transcript, err = registry.YouTube.GrabTranscript(videoId, language); err != nil {
return
}
}
message = AppendMessage(message, transcript)
}
if flags.YouTubeComments {
var comments []string
if comments, err = registry.YouTube.GrabComments(videoId); err != nil {
return
}
commentsString := strings.Join(comments, "\n")
message = AppendMessage(message, commentsString)
}
if flags.YouTubeMetadata {
var metadata *youtube.VideoMetadata
if metadata, err = registry.YouTube.GrabMetadata(videoId); err != nil {
return
}
metadataJson, _ := json.MarshalIndent(metadata, "", " ")
message = AppendMessage(message, string(metadataJson))
}
return
}
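// WriteOutput prints the message to stdout and, when outputFile is set, also writes it to that file.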
func WriteOutput(message string, outputFile string) (err error) {
fmt.Println(message)
if outputFile != "" {
err = CreateOutputFile(message, outputFile)
}
return
}

View File

@@ -6,7 +6,7 @@ import (
"github.com/jessevdk/go-flags"
"github.com/danielmiessler/fabric/cli"
"github.com/danielmiessler/fabric/internal/cli"
)
func main() {

cmd/fabric/version.go Normal file
View File

@@ -0,0 +1,3 @@
package main
var version = "v1.4.263"

View File

@@ -0,0 +1,151 @@
# Product Requirements Document: Changelog Generator
## Overview
The Changelog Generator is a high-performance Go tool that automatically generates comprehensive changelogs from git history and GitHub pull requests.
## Goals
1. **Performance**: Fast and efficient enough to run in CI/CD as part of the release process
2. **Completeness**: Capture ALL commits including unreleased changes
3. **Efficiency**: Minimize API calls through caching and batch operations
4. **Reliability**: Handle errors gracefully with proper Go error handling
5. **Simplicity**: Single binary with no runtime dependencies
## Key Features
### 1. One-Pass Git History Algorithm
- Walk git history once from newest to oldest
- Start with "Unreleased" bucket for all new commits
- Switch buckets when encountering version commits
- No need to calculate ranges between versions (see the sketch below)
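A minimal sketch of this bucketing idea, using a simplified `Commit` type and the version-bump message pattern described elsewhere in this document (the real walker is built on go-git):

```go
package main

import (
	"fmt"
	"regexp"
)

// Commit is a simplified stand-in for the walker's commit type.
type Commit struct {
	SHA     string
	Message string
}

var versionRe = regexp.MustCompile(`Update version to (v\d+\.\d+\.\d+)`)

// bucketCommits walks a newest-first commit list once, assigning each commit to
// the current bucket and switching buckets whenever a version-bump commit appears.
func bucketCommits(commits []Commit) map[string][]Commit {
	buckets := map[string][]Commit{}
	current := "Unreleased"
	for _, c := range commits {
		if m := versionRe.FindStringSubmatch(c.Message); m != nil {
			current = m[1] // this commit and everything older belongs to this version
		}
		buckets[current] = append(buckets[current], c)
	}
	return buckets
}

func main() {
	commits := []Commit{
		{SHA: "abc", Message: "feat: add a new flag"},
		{SHA: "def", Message: "chore(release): Update version to v1.4.263"},
		{SHA: "ghi", Message: "fix: correct a typo"},
	}
	for name, cs := range bucketCommits(commits) {
		fmt.Printf("%s: %d commit(s)\n", name, len(cs))
	}
}
```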
### 2. Native Library Integration
- **go-git**: Pure Go git implementation (no git binary required)
- **go-github**: Official GitHub Go client library
- Benefits: Type safety, better error handling, no subprocess overhead
### 3. Smart Caching System
- SQLite-based persistent cache
- Stores: versions, commits, PR details, last processed commit
- Enables incremental updates on subsequent runs
- Instant changelog regeneration from cache
### 4. Concurrent Processing
- Parallel GitHub API calls (up to 10 concurrent; see the sketch below)
- Batch PR fetching with deduplication
- Rate limiting awareness
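A minimal sketch of bounding concurrency with a buffered channel; `fetchPR` is a hypothetical stand-in for the real GitHub client call, and error collection plus PR deduplication are omitted:

```go
package main

import (
	"fmt"
	"sync"
)

// fetchAll fetches PRs with at most maxConcurrent requests in flight at once.
func fetchAll(prNumbers []int, fetchPR func(int) error) {
	const maxConcurrent = 10
	sem := make(chan struct{}, maxConcurrent) // caps in-flight API calls
	var wg sync.WaitGroup
	for _, n := range prNumbers {
		wg.Add(1)
		sem <- struct{}{} // acquire a slot
		go func(pr int) {
			defer wg.Done()
			defer func() { <-sem }() // release the slot
			_ = fetchPR(pr)          // the real tool would collect and report errors
		}(n)
	}
	wg.Wait()
}

func main() {
	fetchAll([]int{1640, 1641}, func(pr int) error {
		fmt.Println("fetched PR", pr)
		return nil
	})
}
```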
### 5. Enhanced Output
- "Unreleased" section for commits since last version
- Clean markdown formatting
- Configurable version limiting
- Direct commit tracking (non-PR commits)
## Technical Architecture
### Module Structure
```text
cmd/generate_changelog/
├── main.go # CLI entry point with cobra
├── internal/
│ ├── git/ # Git operations (go-git)
│ ├── github/ # GitHub API client (go-github)
│ ├── cache/ # SQLite caching layer
│ ├── changelog/ # Core generation logic
│ └── config/ # Configuration management
└── changelog.db # SQLite cache (generated)
```
### Data Flow
1. Git walker collects all commits in one pass
2. Commits bucketed by version (starting with "Unreleased")
3. PR numbers extracted from merge commits
4. GitHub API batch-fetches PR details
5. Cache stores everything for future runs
6. Formatter generates markdown output
### Cache Schema
- **metadata**: Last processed commit SHA
- **versions**: Version names, dates, commit SHAs
- **commits**: Full commit details with version associations
- **pull_requests**: PR details including commits
- Indexes on version and PR number for fast lookups
### Features
- **Unreleased section**: Shows all new commits
- **Better caching**: SQLite vs JSON, incremental updates
- **Smarter deduplication**: Removes consecutive duplicate commits
- **Direct commit tracking**: Shows non-PR commits
### Reliability
- **No subprocess errors**: Direct library usage
- **Type safety**: Compile-time checking
- **Better error handling**: Go's explicit error returns
### Deployment
- **Single binary**: No Python/pip/dependencies
- **Cross-platform**: Compile for any OS/architecture
- **No git CLI required**: Uses go-git library
## Configuration
### Environment Variables
- `GITHUB_TOKEN`: GitHub API authentication token
### Command Line Flags
- `--repo, -r`: Repository path (default: current directory)
- `--output, -o`: Output file (default: stdout)
- `--limit, -l`: Version limit (default: all)
- `--version, -v`: Target specific version
- `--save-data`: Export debug JSON
- `--cache`: Cache file location
- `--no-cache`: Disable caching
- `--rebuild-cache`: Force cache rebuild
- `--token`: GitHub token override
## Success Metrics
1. **Performance**: Generate the full changelog in under 5 seconds for the fabric repo
2. **Completeness**: 100% commit coverage including unreleased
3. **Accuracy**: Correct PR associations and change extraction
4. **Reliability**: Handle network failures gracefully
5. **Usability**: Simple CLI with sensible defaults
## Future Enhancements
1. **Multiple output formats**: JSON, HTML, etc.
2. **Custom version patterns**: Configurable regex
3. **Change categorization**: feat/fix/docs auto-grouping
4. **Conventional commits**: Full support for semantic versioning
5. **GitLab/Bitbucket**: Support other platforms
6. **Web UI**: Interactive changelog browser
7. **Incremental updates**: Update existing CHANGELOG.md file
8. **Breaking change detection**: Highlight breaking changes
## Implementation Status
- ✅ Core architecture and modules
- ✅ One-pass git walking algorithm
- ✅ GitHub API integration with concurrency
- ✅ SQLite caching system
- ✅ Changelog formatting and generation
- ✅ CLI with all planned flags
- ✅ Documentation (README and PRD)
## Conclusion
This Go implementation provides a modern, efficient, and feature-rich changelog generator.

View File

@@ -0,0 +1,263 @@
# Changelog Generator
A high-performance changelog generator for Git repositories that automatically creates comprehensive, well-formatted changelogs from your git history and GitHub pull requests.
## Features
- **One-pass git history walking**: Efficiently processes entire repository history in a single pass
- **Automatic PR detection**: Extracts pull request information from merge commits
- **GitHub API integration**: Fetches detailed PR information including commits, authors, and descriptions
- **Smart caching**: SQLite-based caching for instant incremental updates
- **Unreleased changes**: Tracks all commits since the last release
- **Concurrent processing**: Parallel GitHub API calls for improved performance
- **Flexible output**: Generate complete changelogs or target specific versions
- **GraphQL optimization**: Ultra-fast PR fetching using GitHub GraphQL API (~5-10 calls vs 1000s)
- **Intelligent sync**: Automatically syncs new PRs every 24 hours or when missing PRs are detected
- **AI-powered summaries**: Optional Fabric integration for enhanced changelog summaries
- **Advanced caching**: Content-based change detection for AI summaries with hash comparison
- **Author type detection**: Distinguishes between users, bots, and organizations
- **Lightning-fast incremental updates**: SHA→PR mapping for instant git operations
## Installation
```bash
go install github.com/danielmiessler/fabric/cmd/generate_changelog@latest
```
## Usage
### Basic usage (generate complete changelog)
```bash
generate_changelog
```
### Save to file
```bash
generate_changelog -o CHANGELOG.md
```
### Generate for specific version
```bash
generate_changelog -v v1.4.244
```
### Limit to recent versions
```bash
generate_changelog -l 10
```
### Using GitHub token for private repos or higher rate limits
```bash
export GITHUB_TOKEN=your_token_here
generate_changelog
# Or pass directly
generate_changelog --token your_token_here
```
### AI-enhanced summaries
```bash
# Enable AI summaries using Fabric
generate_changelog --ai-summarize
# Use custom model for AI summaries
FABRIC_CHANGELOG_SUMMARIZE_MODEL=claude-opus-4 generate_changelog --ai-summarize
```
### Cache management
```bash
# Rebuild cache from scratch
generate_changelog --rebuild-cache
# Force a full PR sync from GitHub
generate_changelog --force-pr-sync
# Disable cache usage
generate_changelog --no-cache
# Use custom cache location
generate_changelog --cache /path/to/cache.db
```
## Command Line Options
| Flag | Short | Description | Default |
|------|-------|-------------|---------|
| `--repo` | `-r` | Repository path | `.` (current directory) |
| `--output` | `-o` | Output file | stdout |
| `--limit` | `-l` | Limit number of versions | 0 (all) |
| `--version` | `-v` | Generate for specific version | |
| `--save-data` | | Save version data to JSON | false |
| `--cache` | | Cache database file | `./cmd/generate_changelog/changelog.db` |
| `--no-cache` | | Disable cache usage | false |
| `--rebuild-cache` | | Rebuild cache from scratch | false |
| `--force-pr-sync` | | Force a full PR sync from GitHub | false |
| `--token` | | GitHub API token | `$GITHUB_TOKEN` |
| `--ai-summarize` | | Generate AI-enhanced summaries using Fabric | false |
## Output Format
The generated changelog follows this structure:
```markdown
# Changelog
## Unreleased
### PR [#1601](url) by [author](profile): PR Title
- Change description 1
- Change description 2
### Direct commits
- Direct commit message 1
- Direct commit message 2
## v1.4.244 (2025-07-09)
### PR [#1598](url) by [author](profile): PR Title
- Change description
...
```
## How It Works
1. **Git History Walking**: The tool walks through your git history from newest to oldest commits
2. **Version Detection**: Identifies version bump commits (pattern: "Update version to vX.Y.Z")
3. **PR Extraction**: Detects merge commits and extracts PR numbers (see the sketch after this list)
4. **GitHub API Calls**: Fetches detailed PR information in parallel batches
5. **Change Extraction**: Extracts changes from PR commit messages or PR body
6. **Formatting**: Generates clean, organized markdown output
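A minimal sketch of the detection step in items 2 and 3, assuming the commit-message patterns described above (the tool's actual regular expressions may differ):

```go
package main

import (
	"fmt"
	"regexp"
)

var (
	versionRe = regexp.MustCompile(`Update version to (v\d+\.\d+\.\d+)`)
	mergeRe   = regexp.MustCompile(`Merge pull request #(\d+)`)
)

// classify labels a commit message as a version bump, a PR merge, or a direct commit.
func classify(message string) string {
	if m := versionRe.FindStringSubmatch(message); m != nil {
		return "version bump: " + m[1]
	}
	if m := mergeRe.FindStringSubmatch(message); m != nil {
		return "PR merge: #" + m[1]
	}
	return "direct commit"
}

func main() {
	fmt.Println(classify("Merge pull request #1641 from ksylvan/0721-web-timeout-fix"))
	fmt.Println(classify("chore(release): Update version to v1.4.263"))
	fmt.Println(classify("docs: Remove duplicated section."))
}
```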
## Performance
- **Native Go libraries**: Uses go-git and go-github for maximum performance
- **Concurrent API calls**: Processes up to 10 GitHub API requests in parallel
- **Smart caching**: SQLite cache eliminates redundant API calls
- **Incremental updates**: Only processes new commits on subsequent runs
- **GraphQL optimization**: Uses GitHub GraphQL API to fetch all PR data in ~5-10 calls
- **AI-powered summaries**: Optional Fabric integration with intelligent caching
- **Content-based change detection**: AI summaries only regenerated when content changes
- **Lightning-fast git operations**: SHA→PR mapping stored in database for instant lookups
### Major Optimization: GraphQL + Advanced Caching
The tool has been optimized to drastically reduce GitHub API calls and improve performance:
**Previous approach**: Individual API calls for each PR (2 API calls per PR)
- For a repo with 500 PRs: 1,000 API calls
**Current approach**: GraphQL batch fetching with intelligent caching
- For a repo with 500 PRs: ~5-10 GraphQL calls (initial fetch) + 0 calls (subsequent runs with cache)
- **99%+ reduction in API calls after initial run!**
The optimization includes:
1. **GraphQL Batch Fetch**: Uses GitHub's GraphQL API to fetch all merged PRs with commits in minimal calls
2. **Smart Caching**: Stores complete PR data, commits, and SHA mappings in SQLite
3. **Incremental Sync**: Only fetches PRs merged after the last sync timestamp
4. **Automatic Refresh**: PRs are synced every 24 hours or when missing PRs are detected
5. **AI Summary Caching**: Content-based change detection prevents unnecessary AI regeneration
6. **Fallback Support**: If GraphQL fails, falls back to REST API batch fetching
7. **Lightning Git Operations**: Pre-computed SHA→PR mappings for instant commit association
## Requirements
- Go 1.24+ (for installation from source)
- Git repository
- GitHub token (optional, for private repos or higher rate limits)
- Fabric CLI (optional, for AI-enhanced summaries)
## Authentication
The tool supports GitHub authentication via:
1. Environment variable: `export GITHUB_TOKEN=your_token`
2. Command line flag: `--token your_token`
3. `.env` file in the same directory as the binary
### Environment File Support
Create a `.env` file next to the `generate_changelog` binary:
```bash
GITHUB_TOKEN=your_github_token_here
FABRIC_CHANGELOG_SUMMARIZE_MODEL=claude-sonnet-4-20250514
```
The tool automatically loads `.env` files for convenient configuration management.
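A minimal sketch of that loading step, assuming the widely used `github.com/joho/godotenv` package (the tool's actual loader may differ):

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"

	"github.com/joho/godotenv" // assumed dependency for this sketch
)

func main() {
	// Look for a .env file next to the running binary and load it if present.
	if exe, err := os.Executable(); err == nil {
		envPath := filepath.Join(filepath.Dir(exe), ".env")
		if err := godotenv.Load(envPath); err == nil {
			fmt.Fprintln(os.Stderr, "loaded", envPath)
		}
	}
	fmt.Println("GITHUB_TOKEN set:", os.Getenv("GITHUB_TOKEN") != "")
}
```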
Without authentication, the tool is limited to 60 GitHub API requests per hour.
## Caching
The SQLite cache stores:
- Version information and commit associations
- Pull request details (title, body, commits, authors)
- Last processed commit SHA for incremental updates
- Last PR sync timestamp for intelligent refresh
- AI summaries with content-based change detection
- SHA→PR mappings for lightning-fast git operations
Cache benefits:
- Instant changelog regeneration
- Drastically reduced GitHub API usage (99%+ reduction after initial run)
- Offline changelog generation (after initial cache build)
- Automatic PR data refresh every 24 hours
- Batch database transactions for better performance
- Content-aware AI summary regeneration
## AI-Enhanced Summaries
The tool can generate AI-powered summaries using Fabric for more polished, professional changelogs:
```bash
# Enable AI summarization
generate_changelog --ai-summarize
# Custom model (default: claude-sonnet-4-20250514)
FABRIC_CHANGELOG_SUMMARIZE_MODEL=claude-opus-4 generate_changelog --ai-summarize
```
### AI Summary Features
- **Content-based change detection**: AI summaries are only regenerated when version content changes
- **Intelligent caching**: Preserves existing summaries and only processes changed versions
- **Content hash comparison**: Uses SHA256 hashing to detect when "Unreleased" content changes
- **Automatic fallback**: Falls back to raw content if AI processing fails
- **Error detection**: Identifies and handles AI processing errors gracefully
- **Minimum content filtering**: Skips AI processing for very brief content (< 256 characters)
### AI Model Configuration
Set the model via environment variable:
```bash
export FABRIC_CHANGELOG_SUMMARIZE_MODEL=claude-opus-4
# or
export FABRIC_CHANGELOG_SUMMARIZE_MODEL=gpt-4
```
AI summaries are cached and only regenerated when:
- Version content changes (detected via hash comparison)
- No existing AI summary exists for the version
- Force rebuild is requested
## Contributing
This tool is part of the Fabric project. Contributions are welcome!
## License
The MIT License. Same as the Fabric project.

Binary file not shown.

View File

@@ -0,0 +1,7 @@
### PR [#1640](https://github.com/danielmiessler/Fabric/pull/1640) by [ksylvan](https://github.com/ksylvan): Implement Automated Changelog System for CI/CD Integration
- Add automated changelog processing for CI/CD integration with comprehensive test coverage and GitHub client validation methods
- Implement release aggregation for incoming files with git operations for staging changes and support for version detection from nix files
- Change push behavior from opt-out to opt-in with GitHub token authentication and automatic repository detection
- Enhance changelog generation to avoid duplicate commit entries by extracting PR numbers and filtering commits already included via PR files
- Add version parameter requirement for PR processing with commit SHA tracking to prevent duplicate entries and improve formatting consistency

View File

@@ -0,0 +1,5 @@
### PR [#1641](https://github.com/danielmiessler/Fabric/pull/1641) by [ksylvan](https://github.com/ksylvan): Fix Fabric Web timeout error
- Chore: extend proxy timeout in `vite.config.ts` to 15 minutes
- Increase `/api` proxy timeout to 900,000 ms
- Increase `/names` proxy timeout to 900,000 ms

View File

@@ -0,0 +1,448 @@
package cache
import (
"database/sql"
"encoding/json"
"fmt"
"time"
"github.com/danielmiessler/fabric/cmd/generate_changelog/internal/git"
"github.com/danielmiessler/fabric/cmd/generate_changelog/internal/github"
_ "github.com/mattn/go-sqlite3"
)
type Cache struct {
db *sql.DB
}
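// New opens (or creates) the SQLite cache at dbPath and ensures the schema exists.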
func New(dbPath string) (*Cache, error) {
db, err := sql.Open("sqlite3", dbPath)
if err != nil {
return nil, fmt.Errorf("failed to open database: %w", err)
}
cache := &Cache{db: db}
if err := cache.createTables(); err != nil {
return nil, fmt.Errorf("failed to create tables: %w", err)
}
return cache, nil
}
func (c *Cache) Close() error {
return c.db.Close()
}
func (c *Cache) createTables() error {
queries := []string{
`CREATE TABLE IF NOT EXISTS metadata (
key TEXT PRIMARY KEY,
value TEXT NOT NULL,
updated_at DATETIME DEFAULT CURRENT_TIMESTAMP
)`,
`CREATE TABLE IF NOT EXISTS versions (
name TEXT PRIMARY KEY,
date DATETIME,
commit_sha TEXT,
pr_numbers TEXT,
ai_summary TEXT,
created_at DATETIME DEFAULT CURRENT_TIMESTAMP
)`,
`CREATE TABLE IF NOT EXISTS commits (
sha TEXT PRIMARY KEY,
version TEXT NOT NULL,
message TEXT,
author TEXT,
email TEXT,
date DATETIME,
is_merge BOOLEAN,
pr_number INTEGER,
created_at DATETIME DEFAULT CURRENT_TIMESTAMP,
FOREIGN KEY (version) REFERENCES versions(name)
)`,
`CREATE TABLE IF NOT EXISTS pull_requests (
number INTEGER PRIMARY KEY,
title TEXT,
body TEXT,
author TEXT,
author_url TEXT,
author_type TEXT DEFAULT 'user',
url TEXT,
merged_at DATETIME,
merge_commit TEXT,
commits TEXT,
created_at DATETIME DEFAULT CURRENT_TIMESTAMP
)`,
`CREATE INDEX IF NOT EXISTS idx_commits_version ON commits(version)`,
`CREATE INDEX IF NOT EXISTS idx_commits_pr_number ON commits(pr_number)`,
`CREATE TABLE IF NOT EXISTS commit_pr_mapping (
commit_sha TEXT PRIMARY KEY,
pr_number INTEGER NOT NULL,
created_at DATETIME DEFAULT CURRENT_TIMESTAMP,
FOREIGN KEY (pr_number) REFERENCES pull_requests(number)
)`,
`CREATE INDEX IF NOT EXISTS idx_commit_pr_mapping_sha ON commit_pr_mapping(commit_sha)`,
}
for _, query := range queries {
if _, err := c.db.Exec(query); err != nil {
return fmt.Errorf("failed to execute query: %w", err)
}
}
return nil
}
func (c *Cache) GetLastProcessedTag() (string, error) {
var tag string
err := c.db.QueryRow("SELECT value FROM metadata WHERE key = 'last_processed_tag'").Scan(&tag)
if err == sql.ErrNoRows {
return "", nil
}
return tag, err
}
func (c *Cache) SetLastProcessedTag(tag string) error {
_, err := c.db.Exec(`
INSERT OR REPLACE INTO metadata (key, value, updated_at)
VALUES ('last_processed_tag', ?, CURRENT_TIMESTAMP)
`, tag)
return err
}
func (c *Cache) SaveVersion(v *git.Version) error {
prNumbers, _ := json.Marshal(v.PRNumbers)
_, err := c.db.Exec(`
INSERT OR REPLACE INTO versions (name, date, commit_sha, pr_numbers, ai_summary)
VALUES (?, ?, ?, ?, ?)
`, v.Name, v.Date, v.CommitSHA, string(prNumbers), v.AISummary)
return err
}
// UpdateVersionAISummary updates only the AI summary for a specific version
func (c *Cache) UpdateVersionAISummary(versionName, aiSummary string) error {
_, err := c.db.Exec(`
UPDATE versions SET ai_summary = ? WHERE name = ?
`, aiSummary, versionName)
return err
}
func (c *Cache) SaveCommit(commit *git.Commit, version string) error {
_, err := c.db.Exec(`
INSERT OR REPLACE INTO commits
(sha, version, message, author, email, date, is_merge, pr_number)
VALUES (?, ?, ?, ?, ?, ?, ?, ?)
`, commit.SHA, version, commit.Message, commit.Author, commit.Email,
commit.Date, commit.IsMerge, commit.PRNumber)
return err
}
func (c *Cache) SavePR(pr *github.PR) error {
commits, _ := json.Marshal(pr.Commits)
_, err := c.db.Exec(`
INSERT OR REPLACE INTO pull_requests
(number, title, body, author, author_url, author_type, url, merged_at, merge_commit, commits)
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
`, pr.Number, pr.Title, pr.Body, pr.Author, pr.AuthorURL, pr.AuthorType,
pr.URL, pr.MergedAt, pr.MergeCommit, string(commits))
return err
}
func (c *Cache) GetPR(number int) (*github.PR, error) {
var pr github.PR
var commitsJSON string
err := c.db.QueryRow(`
SELECT number, title, body, author, author_url, COALESCE(author_type, 'user'), url, merged_at, merge_commit, commits
FROM pull_requests WHERE number = ?
`, number).Scan(
&pr.Number, &pr.Title, &pr.Body, &pr.Author, &pr.AuthorURL, &pr.AuthorType,
&pr.URL, &pr.MergedAt, &pr.MergeCommit, &commitsJSON,
)
if err == sql.ErrNoRows {
return nil, nil
}
if err != nil {
return nil, err
}
if err := json.Unmarshal([]byte(commitsJSON), &pr.Commits); err != nil {
return nil, fmt.Errorf("failed to unmarshal commits: %w", err)
}
return &pr, nil
}
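// GetVersions loads every cached version together with its commits.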
func (c *Cache) GetVersions() (map[string]*git.Version, error) {
rows, err := c.db.Query(`
SELECT name, date, commit_sha, pr_numbers, ai_summary FROM versions
`)
if err != nil {
return nil, err
}
defer rows.Close()
versions := make(map[string]*git.Version)
for rows.Next() {
var v git.Version
var dateStr sql.NullString
var prNumbersJSON string
var aiSummary sql.NullString
if err := rows.Scan(&v.Name, &dateStr, &v.CommitSHA, &prNumbersJSON, &aiSummary); err != nil {
return nil, err
}
if dateStr.Valid {
v.Date, _ = time.Parse(time.RFC3339, dateStr.String)
}
if prNumbersJSON != "" {
json.Unmarshal([]byte(prNumbersJSON), &v.PRNumbers)
}
if aiSummary.Valid {
v.AISummary = aiSummary.String
}
v.Commits, err = c.getCommitsForVersion(v.Name)
if err != nil {
return nil, err
}
versions[v.Name] = &v
}
return versions, rows.Err()
}
func (c *Cache) getCommitsForVersion(version string) ([]*git.Commit, error) {
rows, err := c.db.Query(`
SELECT sha, message, author, email, date, is_merge, pr_number
FROM commits WHERE version = ?
ORDER BY date DESC
`, version)
if err != nil {
return nil, err
}
defer rows.Close()
var commits []*git.Commit
for rows.Next() {
var commit git.Commit
if err := rows.Scan(
&commit.SHA, &commit.Message, &commit.Author, &commit.Email,
&commit.Date, &commit.IsMerge, &commit.PRNumber,
); err != nil {
return nil, err
}
commits = append(commits, &commit)
}
return commits, rows.Err()
}
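// Clear deletes all rows from the core cache tables; used when rebuilding the cache from scratch.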
func (c *Cache) Clear() error {
tables := []string{"metadata", "versions", "commits", "pull_requests"}
for _, table := range tables {
if _, err := c.db.Exec("DELETE FROM " + table); err != nil {
return err
}
}
return nil
}
// GetLastPRSync returns the timestamp of the last PR sync
func (c *Cache) GetLastPRSync() (time.Time, error) {
var timestamp string
err := c.db.QueryRow("SELECT value FROM metadata WHERE key = 'last_pr_sync'").Scan(&timestamp)
if err == sql.ErrNoRows {
return time.Time{}, nil
}
if err != nil {
return time.Time{}, err
}
return time.Parse(time.RFC3339, timestamp)
}
// SetLastPRSync updates the timestamp of the last PR sync
func (c *Cache) SetLastPRSync(timestamp time.Time) error {
_, err := c.db.Exec(`
INSERT OR REPLACE INTO metadata (key, value, updated_at)
VALUES ('last_pr_sync', ?, CURRENT_TIMESTAMP)
`, timestamp.Format(time.RFC3339))
return err
}
// SavePRBatch saves multiple PRs in a single transaction for better performance
func (c *Cache) SavePRBatch(prs []*github.PR) error {
tx, err := c.db.Begin()
if err != nil {
return fmt.Errorf("failed to begin transaction: %w", err)
}
defer tx.Rollback()
stmt, err := tx.Prepare(`
INSERT OR REPLACE INTO pull_requests
(number, title, body, author, author_url, author_type, url, merged_at, merge_commit, commits)
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
`)
if err != nil {
return fmt.Errorf("failed to prepare statement: %w", err)
}
defer stmt.Close()
for _, pr := range prs {
commits, _ := json.Marshal(pr.Commits)
_, err := stmt.Exec(
pr.Number, pr.Title, pr.Body, pr.Author, pr.AuthorURL, pr.AuthorType,
pr.URL, pr.MergedAt, pr.MergeCommit, string(commits),
)
if err != nil {
return fmt.Errorf("failed to save PR #%d: %w", pr.Number, err)
}
}
return tx.Commit()
}
// GetAllPRs returns all cached PRs
func (c *Cache) GetAllPRs() (map[int]*github.PR, error) {
rows, err := c.db.Query(`
SELECT number, title, body, author, author_url, COALESCE(author_type, 'user'), url, merged_at, merge_commit, commits
FROM pull_requests
`)
if err != nil {
return nil, err
}
defer rows.Close()
prs := make(map[int]*github.PR)
for rows.Next() {
var pr github.PR
var commitsJSON string
if err := rows.Scan(
&pr.Number, &pr.Title, &pr.Body, &pr.Author, &pr.AuthorURL, &pr.AuthorType,
&pr.URL, &pr.MergedAt, &pr.MergeCommit, &commitsJSON,
); err != nil {
return nil, err
}
if err := json.Unmarshal([]byte(commitsJSON), &pr.Commits); err != nil {
return nil, fmt.Errorf("failed to unmarshal commits for PR #%d: %w", pr.Number, err)
}
prs[pr.Number] = &pr
}
return prs, rows.Err()
}
// MarkPRAsNonExistent marks a PR number as non-existent to avoid future fetches
func (c *Cache) MarkPRAsNonExistent(prNumber int) error {
_, err := c.db.Exec(`
INSERT OR REPLACE INTO metadata (key, value, updated_at)
VALUES (?, 'non_existent', CURRENT_TIMESTAMP)
`, fmt.Sprintf("pr_non_existent_%d", prNumber))
return err
}
// IsPRMarkedAsNonExistent checks if a PR is marked as non-existent
func (c *Cache) IsPRMarkedAsNonExistent(prNumber int) bool {
var value string
err := c.db.QueryRow("SELECT value FROM metadata WHERE key = ?",
fmt.Sprintf("pr_non_existent_%d", prNumber)).Scan(&value)
return err == nil && value == "non_existent"
}
// SaveCommitPRMappings saves SHA→PR mappings for all commits in PRs
func (c *Cache) SaveCommitPRMappings(prs []*github.PR) error {
tx, err := c.db.Begin()
if err != nil {
return fmt.Errorf("failed to begin transaction: %w", err)
}
defer tx.Rollback()
stmt, err := tx.Prepare(`
INSERT OR REPLACE INTO commit_pr_mapping (commit_sha, pr_number)
VALUES (?, ?)
`)
if err != nil {
return fmt.Errorf("failed to prepare statement: %w", err)
}
defer stmt.Close()
for _, pr := range prs {
for _, commit := range pr.Commits {
_, err := stmt.Exec(commit.SHA, pr.Number)
if err != nil {
return fmt.Errorf("failed to save commit mapping %s→%d: %w", commit.SHA, pr.Number, err)
}
}
}
return tx.Commit()
}
// GetPRNumberBySHA returns the PR number for a given commit SHA
func (c *Cache) GetPRNumberBySHA(sha string) (int, bool) {
var prNumber int
err := c.db.QueryRow("SELECT pr_number FROM commit_pr_mapping WHERE commit_sha = ?", sha).Scan(&prNumber)
if err == sql.ErrNoRows {
return 0, false
}
if err != nil {
return 0, false
}
return prNumber, true
}
// GetCommitSHAsForPR returns all commit SHAs for a given PR number
func (c *Cache) GetCommitSHAsForPR(prNumber int) ([]string, error) {
rows, err := c.db.Query("SELECT commit_sha FROM commit_pr_mapping WHERE pr_number = ?", prNumber)
if err != nil {
return nil, err
}
defer rows.Close()
var shas []string
for rows.Next() {
var sha string
if err := rows.Scan(&sha); err != nil {
return nil, err
}
shas = append(shas, sha)
}
return shas, rows.Err()
}
// GetUnreleasedContentHash returns the cached content hash for Unreleased
func (c *Cache) GetUnreleasedContentHash() (string, error) {
var hash string
err := c.db.QueryRow("SELECT value FROM metadata WHERE key = 'unreleased_content_hash'").Scan(&hash)
if err == sql.ErrNoRows {
return "", fmt.Errorf("no content hash found")
}
return hash, err
}
// SetUnreleasedContentHash stores the content hash for Unreleased
func (c *Cache) SetUnreleasedContentHash(hash string) error {
_, err := c.db.Exec(`
INSERT OR REPLACE INTO metadata (key, value, updated_at)
VALUES ('unreleased_content_hash', ?, CURRENT_TIMESTAMP)
`, hash)
return err
}

View File

@@ -0,0 +1,699 @@
package changelog
import (
"crypto/sha256"
"fmt"
"os"
"regexp"
"sort"
"strings"
"time"
"github.com/danielmiessler/fabric/cmd/generate_changelog/internal/cache"
"github.com/danielmiessler/fabric/cmd/generate_changelog/internal/config"
"github.com/danielmiessler/fabric/cmd/generate_changelog/internal/git"
"github.com/danielmiessler/fabric/cmd/generate_changelog/internal/github"
)
type Generator struct {
cfg *config.Config
gitWalker *git.Walker
ghClient *github.Client
cache *cache.Cache
versions map[string]*git.Version
prs map[int]*github.PR
}
func New(cfg *config.Config) (*Generator, error) {
gitWalker, err := git.NewWalker(cfg.RepoPath)
if err != nil {
return nil, fmt.Errorf("failed to create git walker: %w", err)
}
owner, repo, err := gitWalker.GetRepoInfo()
if err != nil {
return nil, fmt.Errorf("failed to get repo info: %w", err)
}
ghClient := github.NewClient(cfg.GitHubToken, owner, repo)
var c *cache.Cache
if !cfg.NoCache {
c, err = cache.New(cfg.CacheFile)
if err != nil {
return nil, fmt.Errorf("failed to create cache: %w", err)
}
if cfg.RebuildCache {
if err := c.Clear(); err != nil {
return nil, fmt.Errorf("failed to clear cache: %w", err)
}
}
}
return &Generator{
cfg: cfg,
gitWalker: gitWalker,
ghClient: ghClient,
cache: c,
prs: make(map[int]*github.PR),
}, nil
}
func (g *Generator) Generate() (string, error) {
if err := g.collectData(); err != nil {
return "", fmt.Errorf("failed to collect data: %w", err)
}
if err := g.fetchPRs(); err != nil {
return "", fmt.Errorf("failed to fetch PRs: %w", err)
}
return g.formatChangelog(), nil
}
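// collectData loads versions from the cache when possible, walking only the history
// added since the last processed tag, and falls back to a full history walk
// (populating the cache) on a cold or rebuilt cache.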
func (g *Generator) collectData() error {
if g.cache != nil && !g.cfg.RebuildCache {
cachedTag, err := g.cache.GetLastProcessedTag()
if err != nil {
return fmt.Errorf("failed to get last processed tag: %w", err)
}
if cachedTag != "" {
// Get the current latest tag from git
currentTag, err := g.gitWalker.GetLatestTag()
if err == nil {
// Load cached data - we can use it even if there are new tags
cachedVersions, err := g.cache.GetVersions()
if err == nil && len(cachedVersions) > 0 {
g.versions = cachedVersions
// Load cached PRs
for _, version := range g.versions {
for _, prNum := range version.PRNumbers {
if pr, err := g.cache.GetPR(prNum); err == nil && pr != nil {
g.prs[prNum] = pr
}
}
}
// If we have new tags since cache, process the new versions only
if currentTag != cachedTag {
fmt.Fprintf(os.Stderr, "Processing new versions since %s...\n", cachedTag)
newVersions, err := g.gitWalker.WalkHistorySinceTag(cachedTag)
if err != nil {
fmt.Fprintf(os.Stderr, "Warning: Failed to walk history since tag %s: %v\n", cachedTag, err)
} else {
// Merge new versions into cached versions (only add if not already cached)
for name, version := range newVersions {
if name != "Unreleased" { // Handle Unreleased separately
if existingVersion, exists := g.versions[name]; !exists {
g.versions[name] = version
} else {
// Update existing version with new PR numbers if they're missing
if len(existingVersion.PRNumbers) == 0 && len(version.PRNumbers) > 0 {
existingVersion.PRNumbers = version.PRNumbers
}
}
}
}
}
}
// Always update Unreleased section with latest commits
unreleasedVersion, err := g.gitWalker.WalkCommitsSinceTag(currentTag)
if err != nil {
fmt.Fprintf(os.Stderr, "Warning: Failed to walk commits since tag %s: %v\n", currentTag, err)
} else if unreleasedVersion != nil {
// Preserve existing AI summary if available
if existingUnreleased, exists := g.versions["Unreleased"]; exists {
unreleasedVersion.AISummary = existingUnreleased.AISummary
}
// Replace or add the unreleased version
g.versions["Unreleased"] = unreleasedVersion
}
// Save any new versions to cache (after potential AI processing)
if currentTag != cachedTag {
for _, version := range g.versions {
// Skip versions that were already cached and Unreleased
if version.Name != "Unreleased" {
if err := g.cache.SaveVersion(version); err != nil {
fmt.Fprintf(os.Stderr, "Warning: Failed to save version to cache: %v\n", err)
}
for _, commit := range version.Commits {
if err := g.cache.SaveCommit(commit, version.Name); err != nil {
fmt.Fprintf(os.Stderr, "Warning: Failed to save commit to cache: %v\n", err)
}
}
}
}
// Update the last processed tag
if err := g.cache.SetLastProcessedTag(currentTag); err != nil {
fmt.Fprintf(os.Stderr, "Warning: Failed to update last processed tag: %v\n", err)
}
}
return nil
}
}
}
}
versions, err := g.gitWalker.WalkHistory()
if err != nil {
return fmt.Errorf("failed to walk history: %w", err)
}
g.versions = versions
if g.cache != nil {
for _, version := range versions {
if err := g.cache.SaveVersion(version); err != nil {
return fmt.Errorf("failed to save version to cache: %w", err)
}
for _, commit := range version.Commits {
if err := g.cache.SaveCommit(commit, version.Name); err != nil {
return fmt.Errorf("failed to save commit to cache: %w", err)
}
}
}
// Save the latest tag as our cache anchor point
if latestTag, err := g.gitWalker.GetLatestTag(); err == nil && latestTag != "" {
if err := g.cache.SetLastProcessedTag(latestTag); err != nil {
return fmt.Errorf("failed to save last processed tag: %w", err)
}
}
}
return nil
}
func (g *Generator) fetchPRs() error {
// First, load all cached PRs
if g.cache != nil {
cachedPRs, err := g.cache.GetAllPRs()
if err != nil {
fmt.Fprintf(os.Stderr, "Warning: Failed to load cached PRs: %v\n", err)
} else {
g.prs = cachedPRs
}
}
// Check if we need to fetch new PRs
var lastSync time.Time
if g.cache != nil {
lastSync, _ = g.cache.GetLastPRSync()
}
// Check if we need to sync for missing PRs
missingPRs := false
for _, version := range g.versions {
for _, prNum := range version.PRNumbers {
if _, exists := g.prs[prNum]; !exists {
missingPRs = true
break
}
}
if missingPRs {
break
}
}
if missingPRs {
fmt.Fprintf(os.Stderr, "Full sync triggered due to missing PRs in cache.\n")
}
// If we have never synced or it's been more than 24 hours, do a full sync
// Also sync if we have versions with PR numbers that aren't cached
needsSync := lastSync.IsZero() || time.Since(lastSync) > 24*time.Hour || g.cfg.ForcePRSync || missingPRs
if !needsSync {
fmt.Fprintf(os.Stderr, "Using cached PR data (last sync: %s)\n", lastSync.Format("2006-01-02 15:04:05"))
return nil
}
fmt.Fprintf(os.Stderr, "Fetching merged PRs from GitHub using GraphQL...\n")
// Use GraphQL for ultimate performance - gets everything in ~5-10 calls
prs, err := g.ghClient.FetchAllMergedPRsGraphQL(lastSync)
if err != nil {
fmt.Fprintf(os.Stderr, "GraphQL fetch failed, falling back to REST API: %v\n", err)
// Fall back to REST API
prs, err = g.ghClient.FetchAllMergedPRs(lastSync)
if err != nil {
return fmt.Errorf("both GraphQL and REST API failed: %w", err)
}
}
// Update our PR map with new data
for _, pr := range prs {
g.prs[pr.Number] = pr
}
// Save all PRs to cache in a batch transaction
if g.cache != nil && len(prs) > 0 {
// Save PRs
if err := g.cache.SavePRBatch(prs); err != nil {
fmt.Fprintf(os.Stderr, "Warning: Failed to cache PRs: %v\n", err)
}
// Save SHA→PR mappings for lightning-fast git operations
if err := g.cache.SaveCommitPRMappings(prs); err != nil {
fmt.Fprintf(os.Stderr, "Warning: Failed to cache commit mappings: %v\n", err)
}
// Update last sync timestamp
if err := g.cache.SetLastPRSync(time.Now()); err != nil {
fmt.Fprintf(os.Stderr, "Warning: Failed to update last sync timestamp: %v\n", err)
}
}
if len(prs) > 0 {
fmt.Fprintf(os.Stderr, "Fetched %d PRs with commits (total cached: %d)\n", len(prs), len(g.prs))
}
return nil
}
func (g *Generator) formatChangelog() string {
var sb strings.Builder
sb.WriteString("# Changelog\n")
versionList := g.getSortedVersions()
for _, version := range versionList {
if g.cfg.Version != "" && version.Name != g.cfg.Version {
continue
}
versionText := g.formatVersion(version)
if versionText != "" {
sb.WriteString("\n")
sb.WriteString(versionText)
}
}
return sb.String()
}
func (g *Generator) getSortedVersions() []*git.Version {
var versions []*git.Version
var releasedVersions []*git.Version
// Collect all released versions (non-"Unreleased")
for name, version := range g.versions {
if name != "Unreleased" {
releasedVersions = append(releasedVersions, version)
}
}
// Sort released versions by date (newest first)
sort.Slice(releasedVersions, func(i, j int) bool {
return releasedVersions[i].Date.After(releasedVersions[j].Date)
})
// Add "Unreleased" first if it exists and has commits
if unreleased, exists := g.versions["Unreleased"]; exists && len(unreleased.Commits) > 0 {
versions = append(versions, unreleased)
}
// Add sorted released versions
versions = append(versions, releasedVersions...)
if g.cfg.Limit > 0 && len(versions) > g.cfg.Limit {
versions = versions[:g.cfg.Limit]
}
return versions
}
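// formatVersion renders one version section, reusing cached AI summaries where
// possible and falling back to the raw content if AI summarization fails.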
func (g *Generator) formatVersion(version *git.Version) string {
var sb strings.Builder
// Generate raw content
rawContent := g.generateRawVersionContent(version)
if rawContent == "" {
return ""
}
header := g.formatVersionHeader(version)
sb.WriteString("\n")
sb.WriteString(header)
// If AI summarization is enabled, enhance with AI
if g.cfg.EnableAISummary {
// For "Unreleased", check if content has changed since last AI summary
if version.Name == "Unreleased" && version.AISummary != "" && g.cache != nil {
// Get cached content hash
cachedHash, err := g.cache.GetUnreleasedContentHash()
if err == nil {
// Calculate current content hash
currentHash := hashContent(rawContent)
if cachedHash == currentHash {
// Content unchanged, use cached summary
fmt.Fprintf(os.Stderr, "✅ %s content unchanged (skipping AI)\n", version.Name)
sb.WriteString(version.AISummary)
return fixMarkdown(sb.String())
}
}
}
// For released versions, if we have cached AI summary, use it!
if version.Name != "Unreleased" && version.AISummary != "" {
fmt.Fprintf(os.Stderr, "✅ %s already summarized (skipping)\n", version.Name)
sb.WriteString(version.AISummary)
return fixMarkdown(sb.String())
}
fmt.Fprintf(os.Stderr, "🤖 AI summarizing %s...", version.Name)
aiSummary, err := SummarizeVersionContent(rawContent)
if err != nil {
fmt.Fprintf(os.Stderr, " Failed: %v\n", err)
sb.WriteString(rawContent)
return fixMarkdown(sb.String())
}
if checkForAIError(aiSummary) {
fmt.Fprintf(os.Stderr, " AI error detected, using raw content instead\n")
sb.WriteString(rawContent)
fmt.Fprintf(os.Stderr, "Raw Content was: (%d bytes) %s \n", len(rawContent), rawContent)
fmt.Fprintf(os.Stderr, "AI Summary was: (%d bytes) %s\n", len(aiSummary), aiSummary)
return fixMarkdown(sb.String())
}
fmt.Fprintf(os.Stderr, " Done!\n")
aiSummary = strings.TrimSpace(aiSummary)
// Cache the AI summary and content hash
version.AISummary = aiSummary
if g.cache != nil {
if err := g.cache.UpdateVersionAISummary(version.Name, aiSummary); err != nil {
fmt.Fprintf(os.Stderr, "Warning: Failed to cache AI summary: %v\n", err)
}
// Cache content hash for "Unreleased" to detect changes
if version.Name == "Unreleased" {
if err := g.cache.SetUnreleasedContentHash(hashContent(rawContent)); err != nil {
fmt.Fprintf(os.Stderr, "Warning: Failed to cache content hash: %v\n", err)
}
}
}
sb.WriteString(aiSummary)
return fixMarkdown(sb.String())
}
sb.WriteString(rawContent)
return fixMarkdown(sb.String())
}
func checkForAIError(summary string) bool {
// Check for common AI error patterns
errorPatterns := []string{
"I don't see any", "please provide",
"content you've provided appears to be incomplete",
}
for _, pattern := range errorPatterns {
if strings.Contains(summary, pattern) {
return true
}
}
return false
}
// formatVersionHeader formats just the version header (## ...)
func (g *Generator) formatVersionHeader(version *git.Version) string {
if version.Name == "Unreleased" {
return "## Unreleased\n\n"
}
return fmt.Sprintf("\n## %s (%s)\n\n", version.Name, version.Date.Format("2006-01-02"))
}
// generateRawVersionContent generates the raw content (PRs + commits) for a version
func (g *Generator) generateRawVersionContent(version *git.Version) string {
var sb strings.Builder
// Build a set of commit SHAs that are part of fetched PRs
prCommitSHAs := make(map[string]bool)
for _, prNum := range version.PRNumbers {
if pr, exists := g.prs[prNum]; exists {
for _, prCommit := range pr.Commits {
prCommitSHAs[prCommit.SHA] = true
}
}
}
prCommits := make(map[int][]*git.Commit)
directCommits := []*git.Commit{}
for _, commit := range version.Commits {
// Skip version bump commits from output
if commit.IsVersion {
continue
}
// If this commit is part of a fetched PR, don't include it in direct commits
if prCommitSHAs[commit.SHA] {
continue
}
if commit.PRNumber > 0 {
prCommits[commit.PRNumber] = append(prCommits[commit.PRNumber], commit)
} else {
directCommits = append(directCommits, commit)
}
}
// There are occasionally no PRs or direct commits other than version bumps, so we handle that gracefully
if len(prCommits) == 0 && len(directCommits) == 0 {
return ""
}
prependNewline := ""
for _, prNum := range version.PRNumbers {
if pr, exists := g.prs[prNum]; exists {
sb.WriteString(prependNewline)
sb.WriteString(g.formatPR(pr))
prependNewline = "\n"
}
}
if len(directCommits) > 0 {
// Sort direct commits by date (newest first) for consistent ordering
sort.Slice(directCommits, func(i, j int) bool {
return directCommits[i].Date.After(directCommits[j].Date)
})
sb.WriteString(prependNewline + "### Direct commits\n\n")
for _, commit := range directCommits {
message := g.formatCommitMessage(strings.TrimSpace(commit.Message))
if message != "" && !g.isDuplicateMessage(message, directCommits) {
sb.WriteString(fmt.Sprintf("- %s\n", message))
}
}
}
return fixMarkdown(
strings.ReplaceAll(sb.String(), "\n-\n", "\n"), // Remove empty list items
)
}
func fixMarkdown(content string) string {
// Fix MD032/blank-around-lists: Lists should be surrounded by blank lines
lines := strings.Split(content, "\n")
inList := false
preListNewline := false
for i := range lines {
line := strings.TrimSpace(lines[i])
if strings.HasPrefix(line, "- ") || strings.HasPrefix(line, "* ") {
if !inList {
inList = true
// Ensure there's a blank line before the list starts
if !preListNewline && i > 0 && lines[i-1] != "" {
line = "\n" + line
preListNewline = true
}
}
} else {
if inList {
inList = false
preListNewline = false
}
}
lines[i] = strings.TrimRight(line, " \t")
}
fixedContent := strings.TrimSpace(strings.Join(lines, "\n"))
return fixedContent + "\n"
}
func (g *Generator) formatPR(pr *github.PR) string {
var sb strings.Builder
pr.Title = strings.TrimRight(strings.TrimSpace(pr.Title), ".")
// Add type indicator for non-users
authorName := pr.Author
switch pr.AuthorType {
case "bot":
authorName += "[bot]"
case "organization":
authorName += "[org]"
}
sb.WriteString(fmt.Sprintf("### PR [#%d](%s) by [%s](%s): %s\n\n",
pr.Number, pr.URL, authorName, pr.AuthorURL, strings.TrimSpace(pr.Title)))
changes := g.extractChanges(pr)
for _, change := range changes {
if change != "" {
sb.WriteString(fmt.Sprintf("- %s\n", change))
}
}
return sb.String()
}
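// extractChanges builds the bullet list for a PR from its deduplicated commit messages,
// falling back to list items in the PR body when no commit messages survive formatting.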
func (g *Generator) extractChanges(pr *github.PR) []string {
var changes []string
seen := make(map[string]bool)
for _, commit := range pr.Commits {
message := g.formatCommitMessage(commit.Message)
if message != "" && !seen[message] {
seen[message] = true
changes = append(changes, message)
}
}
if len(changes) == 0 && pr.Body != "" {
lines := strings.Split(pr.Body, "\n")
for _, line := range lines {
line = strings.TrimSpace(line)
if strings.HasPrefix(line, "- ") || strings.HasPrefix(line, "* ") {
change := strings.TrimPrefix(strings.TrimPrefix(line, "- "), "* ")
if change != "" {
changes = append(changes, change)
}
}
}
}
return changes
}
func normalizeLineEndings(content string) string {
return strings.ReplaceAll(content, "\r\n", "\n")
}
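// formatCommitMessage strips boilerplate headings from a commit message, capitalizes it,
// replaces hard tabs, and applies markdown formatting fixes.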
func (g *Generator) formatCommitMessage(message string) string {
strings_to_remove := []string{
"### CHANGES\n", "## CHANGES\n", "# CHANGES\n",
"...\n", "---\n", "## Changes\n", "## Change",
"Update version to v..1 and commit\n",
"# What this Pull Request (PR) does\n",
"# Conflicts:",
}
message = normalizeLineEndings(message)
// No hard tabs
message = strings.ReplaceAll(message, "\t", " ")
if len(message) > 0 {
message = strings.ToUpper(message[:1]) + message[1:]
}
for _, str := range strings_to_remove {
if strings.Contains(message, str) {
message = strings.ReplaceAll(message, str, "")
}
}
message = fixFormatting(message)
return message
}
func fixFormatting(message string) string {
// Turn "*" lists into "-" lists
message = strings.ReplaceAll(message, "* ", "- ")
// Remove extra spaces around dashes
message = strings.ReplaceAll(message, "-   ", "- ")
message = strings.ReplaceAll(message, "-  ", "- ")
// turn bare URL into <URL>
if strings.Contains(message, "http://") || strings.Contains(message, "https://") {
// Use regex to wrap bare URLs with angle brackets
urlRegex := regexp.MustCompile(`\b(https?://[^\s<>]+)`)
message = urlRegex.ReplaceAllString(message, "<$1>")
}
// Replace "## LINKS\n" with "- "
message = strings.ReplaceAll(message, "## LINKS\n", "- ")
// Dependabot messages: "- [Commits]" should become "\n- [Commits]"
message = strings.TrimSpace(message)
// Turn multiple newlines into a single newline
message = strings.TrimSpace(strings.ReplaceAll(message, "\n\n", "\n"))
// Fix inline trailing spaces
message = strings.ReplaceAll(message, " \n", "\n")
// Fix weird indent before list,
message = strings.ReplaceAll(message, "\n - ", "\n- ")
// blanks-around-lists MD032 fix
// Use regex to ensure blank line before list items that don't already have one
listRegex := regexp.MustCompile(`(?m)([^\n-].*[^:\n])\n([-*] .*)`)
message = listRegex.ReplaceAllString(message, "$1\n\n$2")
// Change random first-level "#" to 4th level "####"
// This is a hack to fix spurious first-level headings that are not actual headings
// but rather just comments or notes in the commit message.
message = strings.ReplaceAll(message, "# ", "\n#### ")
message = strings.ReplaceAll(message, "\n\n\n", "\n\n")
// Wrap any non-wrapped Emails with angle brackets
emailRegex := regexp.MustCompile(`([a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,})`)
message = emailRegex.ReplaceAllString(message, "<$1>")
// Wrap any non-wrapped URLs with angle brackets
urlRegex := regexp.MustCompile(`(https?://[^\s<]+)`)
message = urlRegex.ReplaceAllString(message, "<$1>")
message = strings.ReplaceAll(message, "<<", "<")
message = strings.ReplaceAll(message, ">>", ">")
// Fix some spurious Issue/PR links at the beginning of a commit message line
prOrIssueLinkRegex := regexp.MustCompile("\n" + `(#\d+)`)
message = prOrIssueLinkRegex.ReplaceAllString(message, " $1")
// Remove leading/trailing whitespace
message = strings.TrimSpace(message)
return message
}
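// isDuplicateMessage reports whether a trivial message ("." or "fix") appears more than
// once in the given commits, so repeated noise can be dropped from the output.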
func (g *Generator) isDuplicateMessage(message string, commits []*git.Commit) bool {
if message == "." || strings.ToLower(message) == "fix" {
count := 0
for _, commit := range commits {
formatted := g.formatCommitMessage(commit.Message)
if formatted == message {
count++
if count > 1 {
return true
}
}
}
}
return false
}
// hashContent generates a SHA256 hash of the content for change detection
func hashContent(content string) string {
hash := sha256.Sum256([]byte(content))
return fmt.Sprintf("%x", hash)
}

View File

@@ -0,0 +1,115 @@
package changelog
import (
"os"
"path/filepath"
"regexp"
"testing"
"github.com/danielmiessler/fabric/cmd/generate_changelog/internal/config"
)
func TestDetectVersionFromNix(t *testing.T) {
tempDir := t.TempDir()
t.Run("version.nix exists", func(t *testing.T) {
versionNixContent := `"1.2.3"`
versionNixPath := filepath.Join(tempDir, "version.nix")
err := os.WriteFile(versionNixPath, []byte(versionNixContent), 0644)
if err != nil {
t.Fatalf("Failed to write version.nix: %v", err)
}
data, err := os.ReadFile(versionNixPath)
if err != nil {
t.Fatalf("Failed to read version.nix: %v", err)
}
versionRegex := regexp.MustCompile(`"([^"]+)"`)
matches := versionRegex.FindStringSubmatch(string(data))
if len(matches) <= 1 {
t.Fatalf("No version found in version.nix")
}
version := matches[1]
if version != "1.2.3" {
t.Errorf("Expected version 1.2.3, got %s", version)
}
})
}
func TestEnsureIncomingDir(t *testing.T) {
tempDir := t.TempDir()
incomingDir := filepath.Join(tempDir, "incoming")
cfg := &config.Config{
IncomingDir: incomingDir,
}
g := &Generator{cfg: cfg}
err := g.ensureIncomingDir()
if err != nil {
t.Fatalf("ensureIncomingDir failed: %v", err)
}
if _, err := os.Stat(incomingDir); os.IsNotExist(err) {
t.Errorf("Incoming directory was not created")
}
}
func TestInsertVersionAtTop(t *testing.T) {
tempDir := t.TempDir()
changelogPath := filepath.Join(tempDir, "CHANGELOG.md")
cfg := &config.Config{
RepoPath: tempDir,
}
g := &Generator{cfg: cfg}
t.Run("new changelog", func(t *testing.T) {
entry := "## v1.0.0 (2025-01-01)\n\n- Initial release"
err := g.insertVersionAtTop(entry)
if err != nil {
t.Fatalf("insertVersionAtTop failed: %v", err)
}
content, err := os.ReadFile(changelogPath)
if err != nil {
t.Fatalf("Failed to read changelog: %v", err)
}
expected := "# Changelog\n\n## v1.0.0 (2025-01-01)\n\n- Initial release\n"
if string(content) != expected {
t.Errorf("Expected:\n%s\nGot:\n%s", expected, string(content))
}
})
t.Run("existing changelog", func(t *testing.T) {
existingContent := "# Changelog\n\n## v0.9.0 (2024-12-01)\n\n- Previous release"
err := os.WriteFile(changelogPath, []byte(existingContent), 0644)
if err != nil {
t.Fatalf("Failed to write existing changelog: %v", err)
}
entry := "## v1.0.0 (2025-01-01)\n\n- New release"
err = g.insertVersionAtTop(entry)
if err != nil {
t.Fatalf("insertVersionAtTop failed: %v", err)
}
content, err := os.ReadFile(changelogPath)
if err != nil {
t.Fatalf("Failed to read changelog: %v", err)
}
expected := "# Changelog\n\n## v1.0.0 (2025-01-01)\n\n- New release\n## v0.9.0 (2024-12-01)\n\n- Previous release"
if string(content) != expected {
t.Errorf("Expected:\n%s\nGot:\n%s", expected, string(content))
}
})
}

View File

@@ -0,0 +1,407 @@
package changelog
import (
"fmt"
"os"
"path/filepath"
"regexp"
"sort"
"strconv"
"strings"
"time"
"github.com/danielmiessler/fabric/cmd/generate_changelog/internal/git"
)
// ProcessIncomingPR processes a single PR for changelog entry creation
func (g *Generator) ProcessIncomingPR(prNumber int) error {
if err := g.validatePRState(prNumber); err != nil {
return fmt.Errorf("PR validation failed: %w", err)
}
if err := g.validateGitStatus(); err != nil {
return fmt.Errorf("git status validation failed: %w", err)
}
// Now fetch the full PR with commits for content generation
pr, err := g.ghClient.GetPRWithCommits(prNumber)
if err != nil {
return fmt.Errorf("failed to fetch PR %d: %w", prNumber, err)
}
content := g.formatPR(pr)
if g.cfg.EnableAISummary {
aiContent, err := SummarizeVersionContent(content)
if err != nil {
fmt.Fprintf(os.Stderr, "Warning: AI summarization failed: %v\n", err)
} else if !checkForAIError(aiContent) {
content = strings.TrimSpace(aiContent)
}
}
if err := g.ensureIncomingDir(); err != nil {
return fmt.Errorf("failed to create incoming directory: %w", err)
}
filename := filepath.Join(g.cfg.IncomingDir, fmt.Sprintf("%d.txt", prNumber))
// Ensure content ends with a single newline
content = strings.TrimSpace(content) + "\n"
if err := os.WriteFile(filename, []byte(content), 0644); err != nil {
return fmt.Errorf("failed to write incoming file: %w", err)
}
if err := g.commitAndPushIncoming(prNumber, filename); err != nil {
return fmt.Errorf("failed to commit and push: %w", err)
}
fmt.Printf("Successfully created incoming changelog entry: %s\n", filename)
return nil
}
// CreateNewChangelogEntry aggregates all incoming PR files for release and includes direct commits
func (g *Generator) CreateNewChangelogEntry(version string) error {
files, err := filepath.Glob(filepath.Join(g.cfg.IncomingDir, "*.txt"))
if err != nil {
return fmt.Errorf("failed to scan incoming directory: %w", err)
}
var content strings.Builder
var processingErrors []string
// First, aggregate all incoming PR files
for _, file := range files {
data, err := os.ReadFile(file)
if err != nil {
processingErrors = append(processingErrors, fmt.Sprintf("failed to read %s: %v", file, err))
continue // Continue to attempt processing other files
}
content.WriteString(string(data))
content.WriteString("\n")
}
if len(processingErrors) > 0 {
return fmt.Errorf("encountered errors while processing incoming files: %s", strings.Join(processingErrors, "; "))
}
// Extract PR numbers and their commit SHAs from processed files to avoid including their commits as "direct"
processedPRs := make(map[int]bool)
processedCommitSHAs := make(map[string]bool)
for _, file := range files {
// Extract PR number from filename (e.g., "1640.txt" -> 1640)
filename := filepath.Base(file)
if prNumStr := strings.TrimSuffix(filename, ".txt"); prNumStr != filename {
if prNum, err := strconv.Atoi(prNumStr); err == nil {
processedPRs[prNum] = true
// Fetch the PR to get its commit SHAs
if pr, err := g.ghClient.GetPRWithCommits(prNum); err == nil {
for _, commit := range pr.Commits {
processedCommitSHAs[commit.SHA] = true
}
}
}
}
}
// Now add direct commits since the last release, excluding commits from processed PRs
directCommitsContent, err := g.getDirectCommitsSinceLastRelease(processedPRs, processedCommitSHAs)
if err != nil {
return fmt.Errorf("failed to get direct commits since last release: %w", err)
}
content.WriteString(directCommitsContent)
// Check if we have any content at all
if content.Len() == 0 {
if len(files) == 0 {
fmt.Fprintf(os.Stderr, "No incoming PR files found in %s and no direct commits since last release\n", g.cfg.IncomingDir)
} else {
fmt.Fprintf(os.Stderr, "No content found in incoming files and no direct commits since last release\n")
}
return nil
}
entry := fmt.Sprintf("## %s (%s)\n\n%s",
version, time.Now().Format("2006-01-02"), strings.TrimLeft(content.String(), "\n"))
if err := g.insertVersionAtTop(entry); err != nil {
return fmt.Errorf("failed to update CHANGELOG.md: %w", err)
}
if g.cache != nil {
// Create a proper new version entry for the database
newVersionEntry := &git.Version{
Name: version,
Date: time.Now(),
CommitSHA: "", // Will be set when the release commit is made
PRNumbers: []int{}, // Could be extracted from incoming files if needed
AISummary: content.String(),
}
if err := g.cache.SaveVersion(newVersionEntry); err != nil {
return fmt.Errorf("failed to save new version entry to database: %w", err)
}
}
for _, file := range files {
if err := os.Remove(file); err != nil {
fmt.Fprintf(os.Stderr, "Warning: Failed to remove %s: %v\n", file, err)
}
}
if err := g.stageChangesForRelease(); err != nil {
return fmt.Errorf("critical: failed to stage changes for release: %w", err)
}
fmt.Printf("Successfully processed %d incoming PR files for version %s\n", len(files), version)
return nil
}
// getDirectCommitsSinceLastRelease gets all direct commits (not part of PRs) since the last release
func (g *Generator) getDirectCommitsSinceLastRelease(processedPRs map[int]bool, processedCommitSHAs map[string]bool) (string, error) {
// Get the latest tag to determine what commits are unreleased
latestTag, err := g.gitWalker.GetLatestTag()
if err != nil {
return "", fmt.Errorf("failed to get latest tag: %w", err)
}
// Get all commits since the latest tag
unreleasedVersion, err := g.gitWalker.WalkCommitsSinceTag(latestTag)
if err != nil {
return "", fmt.Errorf("failed to walk commits since tag %s: %w", latestTag, err)
}
if unreleasedVersion == nil || len(unreleasedVersion.Commits) == 0 {
return "", nil // No unreleased commits
}
// Filter out commits that are part of PRs (we already have those from incoming files)
// and format the direct commits
var directCommits []*git.Commit
for _, commit := range unreleasedVersion.Commits {
// Skip version bump commits
if commit.IsVersion {
continue
}
// Skip commits that belong to PRs we've already processed from incoming files (by PR number)
if commit.PRNumber > 0 && processedPRs[commit.PRNumber] {
continue
}
// Skip commits whose SHA is already included in processed PRs (this catches commits
// that might not have been detected as part of a PR but are actually in the PR)
if processedCommitSHAs[commit.SHA] {
continue
}
// Only include commits that are NOT part of any PR (direct commits)
if commit.PRNumber == 0 {
directCommits = append(directCommits, commit)
}
}
if len(directCommits) == 0 {
return "", nil // No direct commits
}
// Format the direct commits similar to how it's done in generateRawVersionContent
var sb strings.Builder
sb.WriteString("### Direct commits\n\n")
// Sort direct commits by date (newest first) for consistent ordering
sort.Slice(directCommits, func(i, j int) bool {
return directCommits[i].Date.After(directCommits[j].Date)
})
for _, commit := range directCommits {
message := g.formatCommitMessage(strings.TrimSpace(commit.Message))
if message != "" && !g.isDuplicateMessage(message, directCommits) {
sb.WriteString(fmt.Sprintf("- %s\n", message))
}
}
return sb.String(), nil
}
// validatePRState validates that a PR is in the correct state for processing
func (g *Generator) validatePRState(prNumber int) error {
// Use lightweight validation call that doesn't fetch commits
details, err := g.ghClient.GetPRValidationDetails(prNumber)
if err != nil {
return fmt.Errorf("failed to fetch PR %d: %w", prNumber, err)
}
if details.State != "open" {
return fmt.Errorf("PR %d is not open (current state: %s)", prNumber, details.State)
}
if !details.Mergeable {
return fmt.Errorf("PR %d is not mergeable - please resolve conflicts first", prNumber)
}
return nil
}
// validateGitStatus ensures the working directory is clean
func (g *Generator) validateGitStatus() error {
isClean, err := g.gitWalker.IsWorkingDirectoryClean()
if err != nil {
return fmt.Errorf("failed to check git status: %w", err)
}
if !isClean {
// Get detailed status for better error message
statusDetails, statusErr := g.gitWalker.GetStatusDetails()
if statusErr == nil && statusDetails != "" {
return fmt.Errorf("working directory is not clean - please commit or stash changes before proceeding:\n%s", statusDetails)
}
return fmt.Errorf("working directory is not clean - please commit or stash changes before proceeding")
}
return nil
}
// ensureIncomingDir creates the incoming directory if it doesn't exist
func (g *Generator) ensureIncomingDir() error {
if err := os.MkdirAll(g.cfg.IncomingDir, 0755); err != nil {
return fmt.Errorf("failed to create directory %s: %w", g.cfg.IncomingDir, err)
}
return nil
}
// commitAndPushIncoming commits and optionally pushes the incoming changelog file
func (g *Generator) commitAndPushIncoming(prNumber int, filename string) error {
relativeFilename, err := filepath.Rel(g.cfg.RepoPath, filename)
if err != nil {
relativeFilename = filename
}
// Add file to git index
if err := g.gitWalker.AddFile(relativeFilename); err != nil {
return fmt.Errorf("failed to add file %s: %w", relativeFilename, err)
}
// Commit changes
commitMessage := fmt.Sprintf("chore: incoming %d changelog entry", prNumber)
_, err = g.gitWalker.CommitChanges(commitMessage)
if err != nil {
return fmt.Errorf("failed to commit changes: %w", err)
}
// Push to remote if enabled
if g.cfg.Push {
if err := g.gitWalker.PushToRemote(); err != nil {
return fmt.Errorf("failed to push to remote: %w", err)
}
} else {
fmt.Println("Commit created successfully. Please review and push manually.")
}
return nil
}
// detectVersion detects the current version from version.nix or git tags
func (g *Generator) detectVersion() (string, error) {
versionNixPath := filepath.Join(g.cfg.RepoPath, "version.nix")
if _, err := os.Stat(versionNixPath); err == nil {
data, err := os.ReadFile(versionNixPath)
if err != nil {
return "", fmt.Errorf("failed to read version.nix: %w", err)
}
versionRegex := regexp.MustCompile(`"([^"]+)"`)
matches := versionRegex.FindStringSubmatch(string(data))
if len(matches) > 1 {
return matches[1], nil
}
}
latestTag, err := g.gitWalker.GetLatestTag()
if err != nil {
return "", fmt.Errorf("failed to get latest tag: %w", err)
}
if latestTag == "" {
return "v1.0.0", nil
}
return latestTag, nil
}
// insertVersionAtTop inserts a new version entry at the top of CHANGELOG.md
func (g *Generator) insertVersionAtTop(entry string) error {
changelogPath := filepath.Join(g.cfg.RepoPath, "CHANGELOG.md")
header := "# Changelog"
headerRegex := regexp.MustCompile(`(?m)^# Changelog\s*`)
existingContent, err := os.ReadFile(changelogPath)
if err != nil {
if !os.IsNotExist(err) {
return fmt.Errorf("failed to read existing CHANGELOG.md: %w", err)
}
// File doesn't exist, create it.
newContent := fmt.Sprintf("%s\n\n%s\n", header, entry)
return os.WriteFile(changelogPath, []byte(newContent), 0644)
}
contentStr := string(existingContent)
var newContent string
if loc := headerRegex.FindStringIndex(contentStr); loc != nil {
// Found the header, insert after it.
insertionPoint := loc[1]
// Skip any existing newlines after the header to avoid double spacing
for insertionPoint < len(contentStr) && (contentStr[insertionPoint] == '\n' || contentStr[insertionPoint] == '\r') {
insertionPoint++
}
// Insert with proper spacing: single newline after header, then entry, then newline before existing content
newContent = contentStr[:loc[1]] + entry + "\n" + contentStr[insertionPoint:]
} else {
// Header not found, prepend everything.
newContent = fmt.Sprintf("%s\n\n%s\n\n%s", header, entry, contentStr)
}
return os.WriteFile(changelogPath, []byte(newContent), 0644)
}
// stageChangesForRelease stages the modified files for the release commit
func (g *Generator) stageChangesForRelease() error {
changelogPath := filepath.Join(g.cfg.RepoPath, "CHANGELOG.md")
relativeChangelog, err := filepath.Rel(g.cfg.RepoPath, changelogPath)
if err != nil {
relativeChangelog = "CHANGELOG.md"
}
relativeCacheFile, err := filepath.Rel(g.cfg.RepoPath, g.cfg.CacheFile)
if err != nil {
relativeCacheFile = g.cfg.CacheFile
}
// Add CHANGELOG.md to git index
if err := g.gitWalker.AddFile(relativeChangelog); err != nil {
return fmt.Errorf("failed to add %s: %w", relativeChangelog, err)
}
// Add cache file to git index
if err := g.gitWalker.AddFile(relativeCacheFile); err != nil {
return fmt.Errorf("failed to add %s: %w", relativeCacheFile, err)
}
// Remove incoming directory if it exists
if g.cfg.IncomingDir != "" {
relativeIncomingDir, err := filepath.Rel(g.cfg.RepoPath, g.cfg.IncomingDir)
if err != nil {
relativeIncomingDir = g.cfg.IncomingDir
}
if err := g.gitWalker.RemoveFile(relativeIncomingDir); err != nil {
return fmt.Errorf("failed to remove incoming directory %s: %w", relativeIncomingDir, err)
}
}
return nil
}
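
A note on the `detectVersion` helper above: it falls back to the latest git tag, but when `version.nix` exists it simply takes the first double-quoted string in the file as the version. A minimal standalone sketch of just that parsing step (the file contents shown are hypothetical, not taken from the repository):

```go
package main

import (
	"fmt"
	"regexp"
)

func main() {
	// Assumed version.nix contents; the real file lives at <repo>/version.nix.
	data := `"1.4.263"`
	versionRegex := regexp.MustCompile(`"([^"]+)"`)
	if matches := versionRegex.FindStringSubmatch(data); len(matches) > 1 {
		fmt.Println(matches[1]) // prints: 1.4.263
	}
}
```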

View File

@@ -0,0 +1,262 @@
package changelog
import (
"os"
"path/filepath"
"strings"
"testing"
"github.com/danielmiessler/fabric/cmd/generate_changelog/internal/config"
)
func TestDetectVersion(t *testing.T) {
tempDir := t.TempDir()
tests := []struct {
name string
versionNixContent string
expectedVersion string
shouldError bool
}{
{
name: "valid version.nix",
versionNixContent: `"1.2.3"`,
expectedVersion: "1.2.3",
shouldError: false,
},
{
name: "version with extra whitespace",
versionNixContent: `"1.2.3" `,
expectedVersion: "1.2.3",
shouldError: false,
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
// Create version.nix file
versionNixPath := filepath.Join(tempDir, "version.nix")
if err := os.WriteFile(versionNixPath, []byte(tt.versionNixContent), 0644); err != nil {
t.Fatalf("Failed to create version.nix: %v", err)
}
cfg := &config.Config{
RepoPath: tempDir,
}
g := &Generator{cfg: cfg}
version, err := g.detectVersion()
if tt.shouldError && err == nil {
t.Errorf("Expected error but got none")
}
if !tt.shouldError && err != nil {
t.Errorf("Unexpected error: %v", err)
}
if version != tt.expectedVersion {
t.Errorf("Expected version '%s', got '%s'", tt.expectedVersion, version)
}
// Clean up
os.Remove(versionNixPath)
})
}
}
func TestInsertVersionAtTop_ImprovedRobustness(t *testing.T) {
tempDir := t.TempDir()
changelogPath := filepath.Join(tempDir, "CHANGELOG.md")
cfg := &config.Config{
RepoPath: tempDir,
}
g := &Generator{cfg: cfg}
tests := []struct {
name string
existingContent string
entry string
expectedContent string
}{
{
name: "header with trailing spaces",
existingContent: "# Changelog \n\n## v1.0.0\n- Old content",
entry: "## v2.0.0\n- New content",
expectedContent: "# Changelog \n\n## v2.0.0\n- New content\n## v1.0.0\n- Old content",
},
{
name: "header with different line endings",
existingContent: "# Changelog\r\n\r\n## v1.0.0\r\n- Old content",
entry: "## v2.0.0\n- New content",
expectedContent: "# Changelog\r\n\r\n## v2.0.0\n- New content\n## v1.0.0\r\n- Old content",
},
{
name: "no existing header",
existingContent: "Some existing content without header",
entry: "## v1.0.0\n- New content",
expectedContent: "# Changelog\n\n## v1.0.0\n- New content\n\nSome existing content without header",
},
{
name: "new file creation",
existingContent: "",
entry: "## v1.0.0\n- Initial release",
expectedContent: "# Changelog\n\n## v1.0.0\n- Initial release\n",
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
// Write existing content (or create empty file)
if tt.existingContent != "" {
if err := os.WriteFile(changelogPath, []byte(tt.existingContent), 0644); err != nil {
t.Fatalf("Failed to write existing content: %v", err)
}
} else {
// Remove file if it exists to test new file creation
os.Remove(changelogPath)
}
// Insert new version
if err := g.insertVersionAtTop(tt.entry); err != nil {
t.Fatalf("insertVersionAtTop failed: %v", err)
}
// Read result
result, err := os.ReadFile(changelogPath)
if err != nil {
t.Fatalf("Failed to read result: %v", err)
}
if string(result) != tt.expectedContent {
t.Errorf("Expected:\n%q\nGot:\n%q", tt.expectedContent, string(result))
}
})
}
}
func TestProcessIncomingPRs_FileAggregation(t *testing.T) {
tempDir := t.TempDir()
incomingDir := filepath.Join(tempDir, "incoming")
// Create incoming directory and files
if err := os.MkdirAll(incomingDir, 0755); err != nil {
t.Fatalf("Failed to create incoming dir: %v", err)
}
// Create test incoming files
file1Content := "## PR #1\n- Feature A"
file2Content := "## PR #2\n- Feature B"
if err := os.WriteFile(filepath.Join(incomingDir, "1.txt"), []byte(file1Content), 0644); err != nil {
t.Fatalf("Failed to create test file: %v", err)
}
if err := os.WriteFile(filepath.Join(incomingDir, "2.txt"), []byte(file2Content), 0644); err != nil {
t.Fatalf("Failed to create test file: %v", err)
}
// Test file aggregation logic by replicating the aggregation steps inline
files, err := filepath.Glob(filepath.Join(incomingDir, "*.txt"))
if err != nil {
t.Fatalf("Failed to glob files: %v", err)
}
if len(files) != 2 {
t.Fatalf("Expected 2 files, got %d", len(files))
}
// Test content aggregation
var content strings.Builder
var processingErrors []string
for _, file := range files {
data, err := os.ReadFile(file)
if err != nil {
processingErrors = append(processingErrors, err.Error())
continue
}
content.WriteString(string(data))
content.WriteString("\n")
}
if len(processingErrors) > 0 {
t.Fatalf("Unexpected processing errors: %v", processingErrors)
}
aggregatedContent := content.String()
if !strings.Contains(aggregatedContent, "Feature A") {
t.Errorf("Aggregated content should contain 'Feature A'")
}
if !strings.Contains(aggregatedContent, "Feature B") {
t.Errorf("Aggregated content should contain 'Feature B'")
}
}
func TestFileProcessing_ErrorHandling(t *testing.T) {
tempDir := t.TempDir()
incomingDir := filepath.Join(tempDir, "incoming")
// Create incoming directory with one good file and one unreadable file
if err := os.MkdirAll(incomingDir, 0755); err != nil {
t.Fatalf("Failed to create incoming dir: %v", err)
}
// Create a good file
if err := os.WriteFile(filepath.Join(incomingDir, "1.txt"), []byte("content"), 0644); err != nil {
t.Fatalf("Failed to create test file: %v", err)
}
// Create an unreadable file (simulate permission error)
unreadableFile := filepath.Join(incomingDir, "2.txt")
if err := os.WriteFile(unreadableFile, []byte("content"), 0000); err != nil {
t.Fatalf("Failed to create unreadable file: %v", err)
}
defer os.Chmod(unreadableFile, 0644) // Clean up
// Test error aggregation logic
files, err := filepath.Glob(filepath.Join(incomingDir, "*.txt"))
if err != nil {
t.Fatalf("Failed to glob files: %v", err)
}
var content strings.Builder
var processingErrors []string
for _, file := range files {
data, err := os.ReadFile(file)
if err != nil {
processingErrors = append(processingErrors, err.Error())
continue
}
content.WriteString(string(data))
content.WriteString("\n")
}
if len(processingErrors) == 0 {
t.Errorf("Expected processing errors due to unreadable file")
}
// Verify error message format
errorMsg := strings.Join(processingErrors, "; ")
if !strings.Contains(errorMsg, "2.txt") {
t.Errorf("Error message should mention the problematic file")
}
}
func TestEnsureIncomingDirCreation(t *testing.T) {
tempDir := t.TempDir()
incomingDir := filepath.Join(tempDir, "incoming")
cfg := &config.Config{
IncomingDir: incomingDir,
}
g := &Generator{cfg: cfg}
err := g.ensureIncomingDir()
if err != nil {
t.Fatalf("ensureIncomingDir failed: %v", err)
}
if _, err := os.Stat(incomingDir); os.IsNotExist(err) {
t.Errorf("Incoming directory was not created")
}
}

View File

@@ -0,0 +1,79 @@
package changelog
import (
"fmt"
"os"
"os/exec"
"strings"
)
const DefaultSummarizeModel = "claude-sonnet-4-20250514"
const MinContentLength = 256 // Minimum content length to consider for summarization
const prompt = `# ROLE
You are an expert Technical Writer specializing in creating clear, concise,
and professional release notes from raw Git commit logs.
# TASK
Your goal is to transform a provided block of Git commit logs into a clean,
human-readable changelog summary. You will identify the most important changes,
format them as a bulleted list, and preserve the associated Pull Request (PR)
information.
# INSTRUCTIONS:
Follow these steps in order:
1. Deeply analyze the input. You will be given a block of text containing PR
information and commit log messages. Carefully read through the logs
to identify individual commits and their descriptions.
2. Identify Key Changes: Focus on commits that represent significant changes,
such as new features ("feat"), bug fixes ("fix"), performance improvements ("perf"),
or breaking changes ("BREAKING CHANGE").
3. Select the Top 5: From the identified key changes, select at most the five
(5) most impactful ones to include in the summary.
If there are five or fewer total changes, include all of them.
4. Format the Output:
- Where you see a PR header, include the PR header verbatim. NO CHANGES.
**This is a critical rule: Do not modify the PR header, as it contains
important links.** What follows the PR header are the related changes.
- Do not add any additional text or preamble. Begin directly with the output.
- Use bullet points for each key change, starting each point with a hyphen ("-").
- Ensure that the summary is concise and focused on the main changes.
- The summary should be in American English (en-US), using proper grammar and punctuation.
5. If the content is too brief or you do not see any PR headers, return the content as is.
`
// getSummarizeModel returns the model to use for AI summarization
func getSummarizeModel() string {
if model := os.Getenv("FABRIC_CHANGELOG_SUMMARIZE_MODEL"); model != "" {
return model
}
return DefaultSummarizeModel
}
// SummarizeVersionContent takes raw version content and returns AI-enhanced summary
func SummarizeVersionContent(content string) (string, error) {
if strings.TrimSpace(content) == "" {
return "", fmt.Errorf("no content to summarize")
}
if len(content) < MinContentLength {
// If content is too brief, return it as is
return content, nil
}
model := getSummarizeModel()
cmd := exec.Command("fabric", "-m", model, prompt)
cmd.Stdin = strings.NewReader(content)
output, err := cmd.Output()
if err != nil {
return "", fmt.Errorf("fabric command failed: %w", err)
}
summary := strings.TrimSpace(string(output))
if summary == "" {
return "", fmt.Errorf("fabric returned empty summary")
}
return summary, nil
}
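
A minimal sketch of how a caller might guard the summarization step, assuming it sits in the same `changelog` package; `summarizeOrFallback` is a hypothetical helper and not part of the file above. The model can be overridden through the `FABRIC_CHANGELOG_SUMMARIZE_MODEL` environment variable before the call.

```go
// Hypothetical helper (sketch only): keep the raw entry if the fabric CLI
// is unavailable or returns an error, otherwise use the AI summary.
func summarizeOrFallback(raw string) string {
	summary, err := SummarizeVersionContent(raw)
	if err != nil {
		return raw
	}
	return summary
}
```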

View File

@@ -0,0 +1,19 @@
package config
type Config struct {
RepoPath string
OutputFile string
Limit int
Version string
SaveData bool
CacheFile string
NoCache bool
RebuildCache bool
GitHubToken string
ForcePRSync bool
EnableAISummary bool
IncomingPR int
ProcessPRsVersion string
IncomingDir string
Push bool
}
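
The same struct drives both CI modes. A hedged illustration of typical field values (the PR number, version string, and paths are placeholders, not taken from any specific run):

```go
// Pre-process a single merged PR into an incoming file, pushing the commit.
var incomingCfg = &config.Config{
	RepoPath:    ".",
	IncomingPR:  1234,
	IncomingDir: "./cmd/generate_changelog/incoming",
	Push:        true, // opt-in; uses GITHUB_TOKEN for authenticated push
}

// At release time, fold all incoming files into CHANGELOG.md for a version.
var releaseCfg = &config.Config{
	RepoPath:          ".",
	ProcessPRsVersion: "v1.4.263",
	IncomingDir:       "./cmd/generate_changelog/incoming",
	CacheFile:         "./cmd/generate_changelog/changelog.db",
}
```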

View File

@@ -0,0 +1,26 @@
package git
import (
"time"
)
type Commit struct {
SHA string
Message string
Author string
Email string
Date time.Time
IsMerge bool
PRNumber int
IsVersion bool
Version string
}
type Version struct {
Name string
Date time.Time
CommitSHA string
Commits []*Commit
PRNumbers []int
AISummary string
}
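
As a small illustration (all values made up), a merge commit for a hypothetical PR #1234 would be recorded as:

```go
var prMerge = &Commit{
	SHA:      "0123456789abcdef0123456789abcdef01234567",
	Message:  "Merge pull request #1234 from someuser/some-branch",
	Author:   "Jane Developer",
	Email:    "jane@example.com",
	Date:     time.Date(2025, 7, 21, 12, 0, 0, 0, time.UTC),
	IsMerge:  true,
	PRNumber: 1234,
}
```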

View File

@@ -0,0 +1,566 @@
package git
import (
"fmt"
"os"
"regexp"
"strconv"
"strings"
"time"
"github.com/go-git/go-git/v5"
"github.com/go-git/go-git/v5/plumbing"
"github.com/go-git/go-git/v5/plumbing/object"
"github.com/go-git/go-git/v5/plumbing/storer"
"github.com/go-git/go-git/v5/plumbing/transport/http"
)
var (
versionPattern = regexp.MustCompile(`Update version to (v\d+\.\d+\.\d+)`)
prPattern = regexp.MustCompile(`Merge pull request #(\d+)`)
)
type Walker struct {
repo *git.Repository
}
func NewWalker(repoPath string) (*Walker, error) {
repo, err := git.PlainOpen(repoPath)
if err != nil {
return nil, fmt.Errorf("failed to open repository: %w", err)
}
return &Walker{repo: repo}, nil
}
// GetLatestTag returns the name of the most recent tag by committer date
func (w *Walker) GetLatestTag() (string, error) {
tagRefs, err := w.repo.Tags()
if err != nil {
return "", err
}
var latestTagCommit *object.Commit
var latestTagName string
err = tagRefs.ForEach(func(tagRef *plumbing.Reference) error {
revision := plumbing.Revision(tagRef.Name().String())
tagCommitHash, err := w.repo.ResolveRevision(revision)
if err != nil {
return err
}
commit, err := w.repo.CommitObject(*tagCommitHash)
if err != nil {
return err
}
if latestTagCommit == nil {
latestTagCommit = commit
latestTagName = tagRef.Name().Short() // Get short name like "v1.4.245"
}
if commit.Committer.When.After(latestTagCommit.Committer.When) {
latestTagCommit = commit
latestTagName = tagRef.Name().Short()
}
return nil
})
if err != nil {
return "", err
}
return latestTagName, nil
}
// WalkCommitsSinceTag walks commits from the specified tag to HEAD and returns only "Unreleased" version
func (w *Walker) WalkCommitsSinceTag(tagName string) (*Version, error) {
// Get the tag reference
tagRef, err := w.repo.Tag(tagName)
if err != nil {
return nil, fmt.Errorf("failed to find tag %s: %w", tagName, err)
}
// Get the commit that the tag points to
tagCommit, err := w.repo.CommitObject(tagRef.Hash())
if err != nil {
return nil, fmt.Errorf("failed to get tag commit: %w", err)
}
// Get HEAD
headRef, err := w.repo.Head()
if err != nil {
return nil, fmt.Errorf("failed to get HEAD: %w", err)
}
// Walk from HEAD back to the tag commit (exclusive)
commitIter, err := w.repo.Log(&git.LogOptions{
From: headRef.Hash(),
Order: git.LogOrderCommitterTime,
})
if err != nil {
return nil, fmt.Errorf("failed to get commit log: %w", err)
}
version := &Version{
Name: "Unreleased",
Commits: []*Commit{},
}
prNumbers := []int{}
err = commitIter.ForEach(func(c *object.Commit) error {
// Stop when we reach the tag commit (don't include it)
if c.Hash == tagCommit.Hash {
return fmt.Errorf("reached tag commit") // Use error to break out of iteration
}
commit := &Commit{
SHA: c.Hash.String(),
Message: strings.TrimSpace(c.Message),
Date: c.Committer.When,
}
// Check for version patterns
if versionMatch := versionPattern.FindStringSubmatch(commit.Message); versionMatch != nil {
commit.IsVersion = true
}
// Check for PR merge patterns
if prMatch := prPattern.FindStringSubmatch(commit.Message); prMatch != nil {
if prNumber, err := strconv.Atoi(prMatch[1]); err == nil {
commit.PRNumber = prNumber
prNumbers = append(prNumbers, prNumber)
}
}
version.Commits = append(version.Commits, commit)
return nil
})
// Ignore the "reached tag commit" error - it's expected
if err != nil && !strings.Contains(err.Error(), "reached tag commit") {
return nil, fmt.Errorf("failed to walk commits: %w", err)
}
// Remove duplicates from prNumbers and set them
prNumbersMap := make(map[int]bool)
for _, prNum := range prNumbers {
prNumbersMap[prNum] = true
}
version.PRNumbers = make([]int, 0, len(prNumbersMap))
for prNum := range prNumbersMap {
version.PRNumbers = append(version.PRNumbers, prNum)
}
return version, nil
}
func (w *Walker) WalkHistory() (map[string]*Version, error) {
ref, err := w.repo.Head()
if err != nil {
return nil, fmt.Errorf("failed to get HEAD: %w", err)
}
commitIter, err := w.repo.Log(&git.LogOptions{
From: ref.Hash(),
Order: git.LogOrderCommitterTime,
})
if err != nil {
return nil, fmt.Errorf("failed to get commit log: %w", err)
}
versions := make(map[string]*Version)
currentVersion := "Unreleased"
versions[currentVersion] = &Version{
Name: currentVersion,
Commits: []*Commit{},
}
prNumbers := make(map[string][]int)
err = commitIter.ForEach(func(c *object.Commit) error {
// c.Message = Summarize(c.Message)
commit := &Commit{
SHA: c.Hash.String(),
Message: strings.TrimSpace(c.Message),
Author: c.Author.Name,
Email: c.Author.Email,
Date: c.Author.When,
IsMerge: len(c.ParentHashes) > 1,
}
if matches := versionPattern.FindStringSubmatch(commit.Message); len(matches) > 1 {
commit.IsVersion = true
commit.Version = matches[1]
currentVersion = commit.Version
if _, exists := versions[currentVersion]; !exists {
versions[currentVersion] = &Version{
Name: currentVersion,
Date: commit.Date,
CommitSHA: commit.SHA,
Commits: []*Commit{},
}
}
return nil
}
if matches := prPattern.FindStringSubmatch(commit.Message); len(matches) > 1 {
prNumber := 0
fmt.Sscanf(matches[1], "%d", &prNumber)
commit.PRNumber = prNumber
prNumbers[currentVersion] = append(prNumbers[currentVersion], prNumber)
}
versions[currentVersion].Commits = append(versions[currentVersion].Commits, commit)
return nil
})
if err != nil {
return nil, fmt.Errorf("failed to walk commits: %w", err)
}
for version, prs := range prNumbers {
versions[version].PRNumbers = dedupInts(prs)
}
return versions, nil
}
func (w *Walker) GetRepoInfo() (owner string, name string, err error) {
remotes, err := w.repo.Remotes()
if err != nil {
return "", "", fmt.Errorf("failed to get remotes: %w", err)
}
// First try upstream (preferred for forks)
for _, remote := range remotes {
if remote.Config().Name == "upstream" {
urls := remote.Config().URLs
if len(urls) > 0 {
owner, name = parseGitHubURL(urls[0])
if owner != "" && name != "" {
return owner, name, nil
}
}
}
}
// Then try origin
for _, remote := range remotes {
if remote.Config().Name == "origin" {
urls := remote.Config().URLs
if len(urls) > 0 {
owner, name = parseGitHubURL(urls[0])
if owner != "" && name != "" {
return owner, name, nil
}
}
}
}
return "danielmiessler", "fabric", nil
}
func parseGitHubURL(url string) (owner, repo string) {
patterns := []string{
`github\.com[:/]([^/]+)/([^/.]+)`,
`github\.com[:/]([^/]+)/([^/]+)\.git$`,
}
for _, pattern := range patterns {
re := regexp.MustCompile(pattern)
matches := re.FindStringSubmatch(url)
if len(matches) > 2 {
return matches[1], matches[2]
}
}
return "", ""
}
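// Illustrative addition (not part of the original file): parseGitHubURL
// handles both SSH- and HTTPS-style remote URLs, for example:
//
//	parseGitHubURL("git@github.com:danielmiessler/fabric.git") // -> "danielmiessler", "fabric"
//	parseGitHubURL("https://github.com/danielmiessler/fabric") // -> "danielmiessler", "fabric"
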
// WalkHistorySinceTag walks git history from HEAD down to (but not including) the specified tag
// and returns any version commits found along the way
func (w *Walker) WalkHistorySinceTag(sinceTag string) (map[string]*Version, error) {
// Get the commit SHA for the sinceTag
tagRef, err := w.repo.Tag(sinceTag)
if err != nil {
return nil, fmt.Errorf("failed to get tag %s: %w", sinceTag, err)
}
tagCommit, err := w.repo.CommitObject(tagRef.Hash())
if err != nil {
return nil, fmt.Errorf("failed to get commit for tag %s: %w", sinceTag, err)
}
// Get HEAD reference
ref, err := w.repo.Head()
if err != nil {
return nil, fmt.Errorf("failed to get HEAD: %w", err)
}
// Walk from HEAD down to the tag commit (excluding it)
commitIter, err := w.repo.Log(&git.LogOptions{
From: ref.Hash(),
Order: git.LogOrderCommitterTime,
})
if err != nil {
return nil, fmt.Errorf("failed to create commit iterator: %w", err)
}
defer commitIter.Close()
versions := make(map[string]*Version)
currentVersion := "Unreleased"
prNumbers := make(map[string][]int)
err = commitIter.ForEach(func(c *object.Commit) error {
// Stop iteration when the hash of the current commit matches the hash of the specified sinceTag commit
if c.Hash == tagCommit.Hash {
return storer.ErrStop
}
commit := &Commit{
SHA: c.Hash.String(),
Message: strings.TrimSpace(c.Message),
Author: c.Author.Name,
Email: c.Author.Email,
Date: c.Author.When,
IsMerge: len(c.ParentHashes) > 1,
}
// Check for version pattern
if matches := versionPattern.FindStringSubmatch(commit.Message); len(matches) > 1 {
commit.IsVersion = true
commit.Version = matches[1]
currentVersion = commit.Version
if _, exists := versions[currentVersion]; !exists {
versions[currentVersion] = &Version{
Name: currentVersion,
Date: commit.Date,
CommitSHA: commit.SHA,
Commits: []*Commit{},
}
}
return nil
}
// Check for PR merge pattern
if matches := prPattern.FindStringSubmatch(commit.Message); len(matches) > 1 {
prNumber, err := strconv.Atoi(matches[1])
if err != nil {
// Handle parsing error (e.g., log it or skip processing)
return fmt.Errorf("failed to parse PR number: %v", err)
}
commit.PRNumber = prNumber
prNumbers[currentVersion] = append(prNumbers[currentVersion], prNumber)
}
// Add commit to current version
if _, exists := versions[currentVersion]; !exists {
versions[currentVersion] = &Version{
Name: currentVersion,
Date: time.Time{}, // Zero value, will be set by version commit
CommitSHA: "",
Commits: []*Commit{},
}
}
versions[currentVersion].Commits = append(versions[currentVersion].Commits, commit)
return nil
})
// Handle the stop condition - storer.ErrStop is expected
if err == storer.ErrStop {
err = nil
}
// Assign collected PR numbers to each version
for version, prs := range prNumbers {
versions[version].PRNumbers = dedupInts(prs)
}
return versions, err
}
func dedupInts(ints []int) []int {
seen := make(map[int]bool)
result := []int{}
for _, i := range ints {
if !seen[i] {
seen[i] = true
result = append(result, i)
}
}
return result
}
// Worktree returns the git worktree for performing git operations
func (w *Walker) Worktree() (*git.Worktree, error) {
return w.repo.Worktree()
}
// Repository returns the underlying git repository
func (w *Walker) Repository() *git.Repository {
return w.repo
}
// IsWorkingDirectoryClean checks if the working directory has any uncommitted changes
func (w *Walker) IsWorkingDirectoryClean() (bool, error) {
worktree, err := w.repo.Worktree()
if err != nil {
return false, fmt.Errorf("failed to get worktree: %w", err)
}
status, err := worktree.Status()
if err != nil {
return false, fmt.Errorf("failed to get git status: %w", err)
}
return status.IsClean(), nil
}
// GetStatusDetails returns a detailed status of the working directory
func (w *Walker) GetStatusDetails() (string, error) {
worktree, err := w.repo.Worktree()
if err != nil {
return "", fmt.Errorf("failed to get worktree: %w", err)
}
status, err := worktree.Status()
if err != nil {
return "", fmt.Errorf("failed to get git status: %w", err)
}
if status.IsClean() {
return "", nil
}
var details strings.Builder
for file, fileStatus := range status {
details.WriteString(fmt.Sprintf(" %c%c %s\n", fileStatus.Staging, fileStatus.Worktree, file))
}
return details.String(), nil
}
// AddFile adds a file to the git index
func (w *Walker) AddFile(filename string) error {
worktree, err := w.repo.Worktree()
if err != nil {
return fmt.Errorf("failed to get worktree: %w", err)
}
_, err = worktree.Add(filename)
if err != nil {
return fmt.Errorf("failed to add file %s: %w", filename, err)
}
return nil
}
// CommitChanges creates a commit with the given message
func (w *Walker) CommitChanges(message string) (plumbing.Hash, error) {
worktree, err := w.repo.Worktree()
if err != nil {
return plumbing.ZeroHash, fmt.Errorf("failed to get worktree: %w", err)
}
// Get git config for author information
cfg, err := w.repo.Config()
if err != nil {
return plumbing.ZeroHash, fmt.Errorf("failed to get git config: %w", err)
}
var authorName, authorEmail string
if cfg.User.Name != "" {
authorName = cfg.User.Name
} else {
authorName = "Changelog Bot"
}
if cfg.User.Email != "" {
authorEmail = cfg.User.Email
} else {
authorEmail = "bot@changelog.local"
}
commit, err := worktree.Commit(message, &git.CommitOptions{
Author: &object.Signature{
Name: authorName,
Email: authorEmail,
When: time.Now(),
},
})
if err != nil {
return plumbing.ZeroHash, fmt.Errorf("failed to commit: %w", err)
}
return commit, nil
}
// PushToRemote pushes the current branch to the remote repository
// It automatically detects GitHub repositories and uses token authentication when available
func (w *Walker) PushToRemote() error {
pushOptions := &git.PushOptions{}
// Check if we have a GitHub token for authentication
if githubToken := os.Getenv("GITHUB_TOKEN"); githubToken != "" {
// Get remote URL to check if it's a GitHub repository
remotes, err := w.repo.Remotes()
if err == nil && len(remotes) > 0 {
// Get the origin remote (or first remote if origin doesn't exist)
var remote *git.Remote
for _, r := range remotes {
if r.Config().Name == "origin" {
remote = r
break
}
}
if remote == nil {
remote = remotes[0]
}
// Check if this is a GitHub repository
urls := remote.Config().URLs
if len(urls) > 0 {
url := urls[0]
if strings.Contains(url, "github.com") {
// Use token authentication for GitHub repositories
pushOptions.Auth = &http.BasicAuth{
Username: "token", // GitHub expects "token" as username
Password: githubToken,
}
}
}
}
}
err := w.repo.Push(pushOptions)
if err != nil {
return fmt.Errorf("failed to push: %w", err)
}
return nil
}
// RemoveFile removes a file from both the working directory and git index
func (w *Walker) RemoveFile(filename string) error {
worktree, err := w.repo.Worktree()
if err != nil {
return fmt.Errorf("failed to get worktree: %w", err)
}
_, err = worktree.Remove(filename)
if err != nil {
return fmt.Errorf("failed to remove file %s: %w", filename, err)
}
return nil
}
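
Taken together, the Walker methods above support the release flow in the changelog generator. A minimal sketch of a caller, assumed to live in the same `git` package (`listUnreleased` is a hypothetical function), that collects unreleased, non-version commits since the latest tag:

```go
func listUnreleased(repoPath string) ([]*Commit, error) {
	w, err := NewWalker(repoPath)
	if err != nil {
		return nil, err
	}
	tag, err := w.GetLatestTag()
	if err != nil {
		return nil, err
	}
	unreleased, err := w.WalkCommitsSinceTag(tag)
	if err != nil {
		return nil, err
	}
	var out []*Commit
	for _, c := range unreleased.Commits {
		if c.IsVersion {
			continue // skip "Update version to vX.Y.Z" bump commits
		}
		out = append(out, c)
	}
	return out, nil
}
```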

View File

@@ -0,0 +1,417 @@
package github
import (
"context"
"fmt"
"net/http"
"os"
"strings"
"sync"
"time"
"github.com/google/go-github/v66/github"
"github.com/hasura/go-graphql-client"
"golang.org/x/oauth2"
)
type Client struct {
client *github.Client
graphqlClient *graphql.Client
owner string
repo string
token string
}
func NewClient(token, owner, repo string) *Client {
var githubClient *github.Client
var httpClient *http.Client
var gqlClient *graphql.Client
if token != "" {
ts := oauth2.StaticTokenSource(
&oauth2.Token{AccessToken: token},
)
httpClient = oauth2.NewClient(context.Background(), ts)
githubClient = github.NewClient(httpClient)
gqlClient = graphql.NewClient("https://api.github.com/graphql", httpClient)
} else {
httpClient = http.DefaultClient
githubClient = github.NewClient(nil)
gqlClient = graphql.NewClient("https://api.github.com/graphql", httpClient)
}
return &Client{
client: githubClient,
graphqlClient: gqlClient,
owner: owner,
repo: repo,
token: token,
}
}
func (c *Client) FetchPRs(prNumbers []int) ([]*PR, error) {
if len(prNumbers) == 0 {
return []*PR{}, nil
}
ctx := context.Background()
prs := make([]*PR, 0, len(prNumbers))
prsChan := make(chan *PR, len(prNumbers))
errChan := make(chan error, len(prNumbers))
var wg sync.WaitGroup
semaphore := make(chan struct{}, 10)
for _, prNumber := range prNumbers {
wg.Add(1)
go func(num int) {
defer wg.Done()
semaphore <- struct{}{}
defer func() { <-semaphore }()
pr, err := c.fetchSinglePR(ctx, num)
if err != nil {
errChan <- fmt.Errorf("failed to fetch PR #%d: %w", num, err)
return
}
prsChan <- pr
}(prNumber)
}
go func() {
wg.Wait()
close(prsChan)
close(errChan)
}()
var errors []error
for pr := range prsChan {
prs = append(prs, pr)
}
for err := range errChan {
errors = append(errors, err)
}
if len(errors) > 0 {
return prs, fmt.Errorf("some PRs failed to fetch: %v", errors)
}
return prs, nil
}
// GetPRValidationDetails fetches only the data needed for validation (lightweight).
func (c *Client) GetPRValidationDetails(prNumber int) (*PRDetails, error) {
ctx := context.Background()
ghPR, _, err := c.client.PullRequests.Get(ctx, c.owner, c.repo, prNumber)
if err != nil {
return nil, fmt.Errorf("failed to get PR %d: %w", prNumber, err)
}
// Only return validation data, no commits fetched
details := &PRDetails{
PR: nil, // Will be populated later if needed
State: getString(ghPR.State),
Mergeable: ghPR.Mergeable != nil && *ghPR.Mergeable,
}
return details, nil
}
// GetPRWithCommits fetches the full PR and its commits.
func (c *Client) GetPRWithCommits(prNumber int) (*PR, error) {
ctx := context.Background()
ghPR, _, err := c.client.PullRequests.Get(ctx, c.owner, c.repo, prNumber)
if err != nil {
return nil, fmt.Errorf("failed to get PR %d: %w", prNumber, err)
}
return c.buildPRWithCommits(ctx, ghPR)
}
// GetPRDetails fetches a comprehensive set of details for a single PR.
// Deprecated: Use GetPRValidationDetails + GetPRWithCommits for better performance
func (c *Client) GetPRDetails(prNumber int) (*PRDetails, error) {
ctx := context.Background()
ghPR, _, err := c.client.PullRequests.Get(ctx, c.owner, c.repo, prNumber)
if err != nil {
return nil, fmt.Errorf("failed to get PR %d: %w", prNumber, err)
}
// Reuse the existing logic to build the base PR object
pr, err := c.buildPRWithCommits(ctx, ghPR)
if err != nil {
return nil, fmt.Errorf("failed to build PR details for %d: %w", prNumber, err)
}
details := &PRDetails{
PR: pr,
State: getString(ghPR.State),
Mergeable: ghPR.Mergeable != nil && *ghPR.Mergeable,
}
return details, nil
}
// buildPRWithCommits fetches commits and constructs a PR object from a GitHub API response
func (c *Client) buildPRWithCommits(ctx context.Context, ghPR *github.PullRequest) (*PR, error) {
commits, _, err := c.client.PullRequests.ListCommits(ctx, c.owner, c.repo, *ghPR.Number, nil)
if err != nil {
return nil, fmt.Errorf("failed to fetch commits for PR %d: %w", *ghPR.Number, err)
}
return c.convertGitHubPR(ghPR, commits), nil
}
// convertGitHubPR transforms GitHub API data into our internal PR struct (pure function)
func (c *Client) convertGitHubPR(ghPR *github.PullRequest, commits []*github.RepositoryCommit) *PR {
result := &PR{
Number: *ghPR.Number,
Title: getString(ghPR.Title),
Body: getString(ghPR.Body),
URL: getString(ghPR.HTMLURL),
Commits: make([]PRCommit, 0, len(commits)),
}
if ghPR.MergedAt != nil {
result.MergedAt = ghPR.MergedAt.Time
}
if ghPR.User != nil {
result.Author = getString(ghPR.User.Login)
result.AuthorURL = getString(ghPR.User.HTMLURL)
userType := getString(ghPR.User.Type)
switch userType {
case "User":
result.AuthorType = "user"
case "Organization":
result.AuthorType = "organization"
case "Bot":
result.AuthorType = "bot"
default:
result.AuthorType = "user"
}
}
if ghPR.MergeCommitSHA != nil {
result.MergeCommit = *ghPR.MergeCommitSHA
}
for _, commit := range commits {
if commit.Commit != nil {
prCommit := PRCommit{
SHA: getString(commit.SHA),
Message: strings.TrimSpace(getString(commit.Commit.Message)),
}
if commit.Commit.Author != nil {
prCommit.Author = getString(commit.Commit.Author.Name)
}
result.Commits = append(result.Commits, prCommit)
}
}
return result
}
func (c *Client) fetchSinglePR(ctx context.Context, prNumber int) (*PR, error) {
ghPR, _, err := c.client.PullRequests.Get(ctx, c.owner, c.repo, prNumber)
if err != nil {
return nil, err
}
return c.buildPRWithCommits(ctx, ghPR)
}
func getString(s *string) string {
if s == nil {
return ""
}
return *s
}
// FetchAllMergedPRs fetches all merged PRs using GitHub's search API
// This is much more efficient than fetching PRs individually
func (c *Client) FetchAllMergedPRs(since time.Time) ([]*PR, error) {
ctx := context.Background()
var allPRs []*PR
// Build search query for merged PRs
query := fmt.Sprintf("repo:%s/%s is:pr is:merged", c.owner, c.repo)
if !since.IsZero() {
query += fmt.Sprintf(" merged:>=%s", since.Format("2006-01-02"))
}
opts := &github.SearchOptions{
Sort: "created",
Order: "desc",
ListOptions: github.ListOptions{
PerPage: 100, // Maximum allowed
},
}
for {
result, resp, err := c.client.Search.Issues(ctx, query, opts)
if err != nil {
return allPRs, fmt.Errorf("failed to search PRs: %w", err)
}
// Process PRs in parallel
prsChan := make(chan *PR, len(result.Issues))
errChan := make(chan error, len(result.Issues))
var wg sync.WaitGroup
semaphore := make(chan struct{}, 10) // Limit concurrent requests
for _, issue := range result.Issues {
if issue.PullRequestLinks == nil {
continue // Not a PR
}
wg.Add(1)
go func(prNumber int) {
defer wg.Done()
semaphore <- struct{}{}
defer func() { <-semaphore }()
pr, err := c.fetchSinglePR(ctx, prNumber)
if err != nil {
errChan <- fmt.Errorf("failed to fetch PR #%d: %w", prNumber, err)
return
}
prsChan <- pr
}(*issue.Number)
}
go func() {
wg.Wait()
close(prsChan)
close(errChan)
}()
// Collect results
for pr := range prsChan {
allPRs = append(allPRs, pr)
}
// Check for errors
for err := range errChan {
// Log error but continue processing
fmt.Fprintf(os.Stderr, "Warning: %v\n", err)
}
if resp.NextPage == 0 {
break
}
opts.Page = resp.NextPage
}
return allPRs, nil
}
// FetchAllMergedPRsGraphQL fetches all merged PRs with their commits using GraphQL
// This is the ultimate optimization - gets everything in ~5-10 API calls
func (c *Client) FetchAllMergedPRsGraphQL(since time.Time) ([]*PR, error) {
ctx := context.Background()
var allPRs []*PR
var after *string
totalFetched := 0
for {
// Prepare variables
variables := map[string]interface{}{
"owner": graphql.String(c.owner),
"repo": graphql.String(c.repo),
"after": (*graphql.String)(after),
}
// Execute GraphQL query
var query PullRequestsQuery
err := c.graphqlClient.Query(ctx, &query, variables)
if err != nil {
return allPRs, fmt.Errorf("GraphQL query failed: %w", err)
}
prs := query.Repository.PullRequests.Nodes
fmt.Fprintf(os.Stderr, "Fetched %d PRs via GraphQL (page %d)\n", len(prs), (totalFetched/100)+1)
// Convert GraphQL PRs to our PR struct
for _, gqlPR := range prs {
// If we have a since filter, stop when we reach older PRs
if !since.IsZero() && gqlPR.MergedAt.Before(since) {
fmt.Fprintf(os.Stderr, "Reached PRs older than %s, stopping\n", since.Format("2006-01-02"))
return allPRs, nil
}
pr := &PR{
Number: gqlPR.Number,
Title: gqlPR.Title,
Body: gqlPR.Body,
URL: gqlPR.URL,
MergedAt: gqlPR.MergedAt,
Commits: make([]PRCommit, 0, len(gqlPR.Commits.Nodes)),
}
// Handle author - check if it's nil first
if gqlPR.Author != nil {
pr.Author = gqlPR.Author.Login
pr.AuthorURL = gqlPR.Author.URL
switch gqlPR.Author.Typename {
case "Bot":
pr.AuthorType = "bot"
case "Organization":
pr.AuthorType = "organization"
case "User":
pr.AuthorType = "user"
default:
pr.AuthorType = "user" // fallback
if gqlPR.Author.Typename != "" {
fmt.Fprintf(os.Stderr, "PR #%d: Unknown author typename '%s'\n", gqlPR.Number, gqlPR.Author.Typename)
}
}
} else {
// Author is nil - try to fetch from REST API as fallback
fmt.Fprintf(os.Stderr, "PR #%d: Author is nil in GraphQL response, fetching from REST API\n", gqlPR.Number)
// Fetch this specific PR from REST API
restPR, err := c.fetchSinglePR(ctx, gqlPR.Number)
if err == nil && restPR != nil && restPR.Author != "" {
pr.Author = restPR.Author
pr.AuthorURL = restPR.AuthorURL
pr.AuthorType = restPR.AuthorType
} else {
// Fallback if REST API also fails
pr.Author = "[unknown]"
pr.AuthorURL = ""
pr.AuthorType = "user"
}
}
// Convert commits
for _, commitNode := range gqlPR.Commits.Nodes {
commit := PRCommit{
SHA: commitNode.Commit.OID,
Message: strings.TrimSpace(commitNode.Commit.Message),
Author: commitNode.Commit.Author.Name,
}
pr.Commits = append(pr.Commits, commit)
}
allPRs = append(allPRs, pr)
}
totalFetched += len(prs)
// Check if we need to fetch more pages
if !query.Repository.PullRequests.PageInfo.HasNextPage {
break
}
after = &query.Repository.PullRequests.PageInfo.EndCursor
}
fmt.Fprintf(os.Stderr, "Total PRs fetched via GraphQL: %d\n", len(allPRs))
return allPRs, nil
}
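
A minimal sketch of constructing the client and fetching merged PRs via GraphQL, assumed to live in the same `github` package; `fetchRecentPRs` and the 30-day cutoff are placeholders:

```go
func fetchRecentPRs() ([]*PR, error) {
	client := NewClient(os.Getenv("GITHUB_TOKEN"), "danielmiessler", "fabric")
	// Only PRs merged in roughly the last 30 days (illustrative cutoff).
	since := time.Now().AddDate(0, 0, -30)
	return client.FetchAllMergedPRsGraphQL(since)
}
```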

View File

@@ -0,0 +1,64 @@
package github
import "time"
type PR struct {
Number int
Title string
Body string
Author string
AuthorURL string
AuthorType string // "user", "organization", or "bot"
URL string
MergedAt time.Time
Commits []PRCommit
MergeCommit string
}
// PRDetails encapsulates all relevant information about a Pull Request.
type PRDetails struct {
*PR
State string
Mergeable bool
}
type PRCommit struct {
SHA string
Message string
Author string
}
// GraphQL query structures for hasura client
type PullRequestsQuery struct {
Repository struct {
PullRequests struct {
PageInfo struct {
HasNextPage bool
EndCursor string
}
Nodes []struct {
Number int
Title string
Body string
URL string
MergedAt time.Time
Author *struct {
Typename string `graphql:"__typename"`
Login string `graphql:"login"`
URL string `graphql:"url"`
}
Commits struct {
Nodes []struct {
Commit struct {
OID string `graphql:"oid"`
Message string
Author struct {
Name string
}
}
}
} `graphql:"commits(first: 250)"`
}
} `graphql:"pullRequests(first: 100, after: $after, states: MERGED, orderBy: {field: UPDATED_AT, direction: DESC})"`
} `graphql:"repository(owner: $owner, name: $repo)"`
}

View File

@@ -0,0 +1,98 @@
package main
import (
"fmt"
"os"
"path/filepath"
"github.com/danielmiessler/fabric/cmd/generate_changelog/internal/changelog"
"github.com/danielmiessler/fabric/cmd/generate_changelog/internal/config"
"github.com/joho/godotenv"
"github.com/spf13/cobra"
)
var (
cfg = &config.Config{}
)
var rootCmd = &cobra.Command{
Use: "generate_changelog",
Short: "Generate changelog from git history and GitHub PRs",
Long: `A high-performance changelog generator that walks git history,
collects version information and pull requests, and generates a
comprehensive changelog in markdown format.`,
RunE: run,
SilenceUsage: true, // Don't show usage on runtime errors, only on flag errors
}
func init() {
rootCmd.Flags().StringVarP(&cfg.RepoPath, "repo", "r", ".", "Repository path")
rootCmd.Flags().StringVarP(&cfg.OutputFile, "output", "o", "", "Output file (default: stdout)")
rootCmd.Flags().IntVarP(&cfg.Limit, "limit", "l", 0, "Limit number of versions (0 = all)")
rootCmd.Flags().StringVarP(&cfg.Version, "version", "v", "", "Generate changelog for specific version")
rootCmd.Flags().BoolVar(&cfg.SaveData, "save-data", false, "Save version data to JSON for debugging")
rootCmd.Flags().StringVar(&cfg.CacheFile, "cache", "./cmd/generate_changelog/changelog.db", "Cache database file")
rootCmd.Flags().BoolVar(&cfg.NoCache, "no-cache", false, "Disable cache usage")
rootCmd.Flags().BoolVar(&cfg.RebuildCache, "rebuild-cache", false, "Rebuild cache from scratch")
rootCmd.Flags().StringVar(&cfg.GitHubToken, "token", "", "GitHub API token (or set GITHUB_TOKEN env var)")
rootCmd.Flags().BoolVar(&cfg.ForcePRSync, "force-pr-sync", false, "Force a full PR sync from GitHub (ignores cache age)")
rootCmd.Flags().BoolVar(&cfg.EnableAISummary, "ai-summarize", false, "Generate AI-enhanced summaries using Fabric")
rootCmd.Flags().IntVar(&cfg.IncomingPR, "incoming-pr", 0, "Pre-process PR for changelog (provide PR number)")
rootCmd.Flags().StringVar(&cfg.ProcessPRsVersion, "process-prs", "", "Process all incoming PR files for release (provide version like v1.4.262)")
rootCmd.Flags().StringVar(&cfg.IncomingDir, "incoming-dir", "./cmd/generate_changelog/incoming", "Directory for incoming PR files")
rootCmd.Flags().BoolVar(&cfg.Push, "push", false, "Enable automatic git push after creating an incoming entry")
}
func run(cmd *cobra.Command, args []string) error {
if cfg.IncomingPR > 0 && cfg.ProcessPRsVersion != "" {
return fmt.Errorf("--incoming-pr and --process-prs are mutually exclusive flags")
}
if cfg.GitHubToken == "" {
cfg.GitHubToken = os.Getenv("GITHUB_TOKEN")
}
generator, err := changelog.New(cfg)
if err != nil {
return fmt.Errorf("failed to create changelog generator: %w", err)
}
if cfg.IncomingPR > 0 {
return generator.ProcessIncomingPR(cfg.IncomingPR)
}
if cfg.ProcessPRsVersion != "" {
return generator.CreateNewChangelogEntry(cfg.ProcessPRsVersion)
}
output, err := generator.Generate()
if err != nil {
return fmt.Errorf("failed to generate changelog: %w", err)
}
if cfg.OutputFile != "" {
if err := os.WriteFile(cfg.OutputFile, []byte(output), 0644); err != nil {
return fmt.Errorf("failed to write output file: %w", err)
}
fmt.Printf("Changelog written to %s\n", cfg.OutputFile)
} else {
fmt.Print(output)
}
return nil
}
func main() {
// Load .env file from the same directory as the binary
if exePath, err := os.Executable(); err == nil {
envPath := filepath.Join(filepath.Dir(exePath), ".env")
if _, err := os.Stat(envPath); err == nil {
// .env file exists, load it
if err := godotenv.Load(envPath); err != nil {
fmt.Fprintf(os.Stderr, "Warning: Failed to load .env file: %v\n", err)
}
}
}
if err := rootCmd.Execute(); err != nil {
os.Exit(1)
}
}
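
As a usage sketch (flag names are taken from the definitions above; the PR number and version string are placeholders): in CI, a merged PR would typically be pre-processed with something like `generate_changelog --incoming-pr 1234 --push`, and a release entry cut later with `generate_changelog --process-prs v1.4.263 --ai-summarize`. The two flags are mutually exclusive, as enforced at the top of `run`.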

View File

@@ -110,6 +110,10 @@ _fabric() {
'(--liststrategies)--liststrategies[List all strategies]' \
'(--listvendors)--listvendors[List all vendors]' \
'(--shell-complete-list)--shell-complete-list[Output raw list without headers/formatting (for shell completion)]' \
'(--suppress-think)--suppress-think[Suppress text enclosed in thinking tags]' \
'(--think-start-tag)--think-start-tag[Start tag for thinking sections (default: <think>)]:start tag:' \
'(--think-end-tag)--think-end-tag[End tag for thinking sections (default: </think>)]:end tag:' \
'(--disable-responses-api)--disable-responses-api[Disable OpenAI Responses API (default: false)]' \
'(-h --help)'{-h,--help}'[Show this help message]' \
'*:arguments:'
}

View File

@@ -13,7 +13,7 @@ _fabric() {
_get_comp_words_by_ref -n : cur prev words cword
# Define all possible options/flags
local opts="--pattern -p --variable -v --context -C --session --attachment -a --setup -S --temperature -t --topp -T --stream -s --presencepenalty -P --raw -r --frequencypenalty -F --listpatterns -l --listmodels -L --listcontexts -x --listsessions -X --updatepatterns -U --copy -c --model -m --modelContextLength --output -o --output-session --latest -n --changeDefaultModel -d --youtube -y --playlist --transcript --transcript-with-timestamps --comments --metadata --language -g --scrape_url -u --scrape_question -q --seed -e --wipecontext -w --wipesession -W --printcontext --printsession --readability --input-has-vars --dry-run --serve --serveOllama --address --api-key --config --search --search-location --image-file --image-size --image-quality --image-compression --image-background --version --listextensions --addextension --rmextension --strategy --liststrategies --listvendors --shell-complete-list --help -h"
local opts="--pattern -p --variable -v --context -C --session --attachment -a --setup -S --temperature -t --topp -T --stream -s --presencepenalty -P --raw -r --frequencypenalty -F --listpatterns -l --listmodels -L --listcontexts -x --listsessions -X --updatepatterns -U --copy -c --model -m --modelContextLength --output -o --output-session --latest -n --changeDefaultModel -d --youtube -y --playlist --transcript --transcript-with-timestamps --comments --metadata --language -g --scrape_url -u --scrape_question -q --seed -e --wipecontext -w --wipesession -W --printcontext --printsession --readability --input-has-vars --dry-run --serve --serveOllama --address --api-key --config --search --search-location --image-file --image-size --image-quality --image-compression --image-background --suppress-think --think-start-tag --think-end-tag --disable-responses-api --version --listextensions --addextension --rmextension --strategy --liststrategies --listvendors --shell-complete-list --help -h"
# Helper function for dynamic completions
_fabric_get_list() {
@@ -81,7 +81,7 @@ _fabric() {
return 0
;;
# Options requiring simple arguments (no specific completion logic here)
-v | --variable | -t | --temperature | -T | --topp | -P | --presencepenalty | -F | --frequencypenalty | --modelContextLength | -n | --latest | -y | --youtube | -g | --language | -u | --scrape_url | -q | --scrape_question | -e | --seed | --address | --api-key | --search-location | --image-compression)
-v | --variable | -t | --temperature | -T | --topp | -P | --presencepenalty | -F | --frequencypenalty | --modelContextLength | -n | --latest | -y | --youtube | -g | --language | -u | --scrape_url | -q | --scrape_question | -e | --seed | --address | --api-key | --search-location | --image-compression | --think-start-tag | --think-end-tag)
# No specific completion suggestions, user types the value
return 0
;;

View File

@@ -69,6 +69,8 @@ complete -c fabric -l image-background -d "Background type: opaque, transparent
complete -c fabric -l addextension -d "Register a new extension from config file path" -r -a "*.yaml *.yml"
complete -c fabric -l rmextension -d "Remove a registered extension by name" -a "(__fabric_get_extensions)"
complete -c fabric -l strategy -d "Choose a strategy from the available strategies" -a "(__fabric_get_strategies)"
complete -c fabric -l think-start-tag -d "Start tag for thinking sections (default: <think>)"
complete -c fabric -l think-end-tag -d "End tag for thinking sections (default: </think>)"
# Boolean flags (no arguments)
complete -c fabric -s S -l setup -d "Run setup for all reconfigurable parts of fabric"
@@ -98,4 +100,6 @@ complete -c fabric -l listextensions -d "List all registered extensions"
complete -c fabric -l liststrategies -d "List all strategies"
complete -c fabric -l listvendors -d "List all vendors"
complete -c fabric -l shell-complete-list -d "Output raw list without headers/formatting (for shell completion)"
complete -c fabric -l suppress-think -d "Suppress text enclosed in thinking tags"
complete -c fabric -l disable-responses-api -d "Disable OpenAI Responses API (default: false)"
complete -c fabric -s h -l help -d "Show this help message"

File diff suppressed because it is too large.

Some files were not shown because too many files have changed in this diff.