Compare commits

...

58 Commits

Author SHA1 Message Date
github-actions[bot]
d37a1acc9b chore(release): Update version to v1.4.264 2025-07-22 04:55:41 +00:00
Kayvan Sylvan
7254571501 Merge pull request #1642 from ksylvan/0721-fix-changelog-processing
Add --sync-db to `generate_changelog`, plus many fixes
2025-07-21 21:53:20 -07:00
Changelog Bot
c300262804 chore: incoming 1642 changelog entry 2025-07-21 21:17:44 -07:00
Kayvan Sylvan
e8ba57be90 fix: improve error message formatting in version date parsing
## CHANGES

- Add actual error details to date parsing failure message
- Include error variable in stderr output formatting
- Enhance debugging information for invalid date formats
2025-07-21 21:17:06 -07:00
Kayvan Sylvan
15fad3da87 refactor: simplify merge pattern management by removing unnecessary struct wrapper
## CHANGES

- Remove mergePatternManager struct wrapper for patterns
- Replace struct fields with package-level variables
- Simplify getMergePatterns function implementation
- Clean up merge commit detection documentation
- Reduce code complexity in pattern initialization
- Maintain thread-safe lazy initialization with sync.Once
2025-07-21 20:59:25 -07:00
Kayvan Sylvan
e2b0d3c368 chore: standardize logging output format and improve error messages in changelog generator
## CHANGES

- Replace emoji prefixes with bracketed text labels
- Standardize synchronization step logging format across methods
- Simplify version existence check error message text
- Update commit author email extraction comment clarity
- Maintain consistent stderr output formatting throughout sync process
2025-07-21 20:36:11 -07:00
Changelog Bot
3de85eb50e chore: incoming 1642 changelog entry 2025-07-21 20:26:17 -07:00
Kayvan Sylvan
58e635c873 refactor: improve error handling and simplify merge pattern management in changelog generation
## CHANGES

- Remove unused runtime import from processing.go
- Simplify date parsing error messages in cache
- Replace global merge pattern variables with struct
- Use sync.Once for thread-safe pattern initialization
- Remove OS-specific file deletion instructions from errors
- Clean up merge commit detection function documentation
- Eliminate redundant error variable in date parsing
2025-07-21 20:25:52 -07:00
Changelog Bot
dde21d2337 chore: incoming 1642 changelog entry 2025-07-21 20:08:33 -07:00
Kayvan Sylvan
e3fcbcb12b fix: improve error reporting in date parsing and merge commit detection
## CHANGES

- Capture first RFC3339Nano parsing error for better diagnostics
- Display both RFC3339Nano and RFC3339 errors in output
- Extract merge patterns to variable for cleaner code
- Improve error message clarity in date parsing failures
2025-07-21 20:08:13 -07:00
Kayvan Sylvan
839296e3ba feat: add cross-platform file removal instructions for changelog generation
## CHANGES

- Import runtime package for OS detection
- Add Windows-specific file deletion commands in error messages
- Provide both Command Prompt and PowerShell alternatives
- Maintain existing Unix/Linux rm command for non-Windows systems
- Improve user experience across different operating systems
2025-07-21 19:58:56 -07:00
Changelog Bot
5b97b0e56a chore: incoming 1642 changelog entry 2025-07-21 19:45:07 -07:00
Kayvan Sylvan
38ff2288da refactor: replace sync.Once with mutex for merge patterns initialization
## CHANGES

- Replace sync.Once with mutex and boolean flag
- Add thread-safe initialization check for merge patterns
- Remove overly broad merge pattern regex rule
- Improve error messaging for file removal failures
- Clarify filesystem vs git index error contexts
- Add detailed manual intervention instructions for failures
2025-07-21 19:44:46 -07:00
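
A minimal sketch of the mutex-plus-boolean-flag initialization this commit describes (names like `mergePatternsMu` and `mergePatternsInit` are assumptions; later commits in this PR replace this approach with `sync.Once`, as seen in the processing.go hunk further down):

```go
package changelog

import (
	"regexp"
	"sync"
)

var (
	mergePatternsMu   sync.Mutex // guards lazy initialization
	mergePatternsInit bool       // set once patterns are compiled
	mergePatterns     []*regexp.Regexp
)

// getMergePatterns compiles the merge-commit patterns on first use,
// guarded by a mutex and a boolean flag (an intermediate design in this PR).
func getMergePatterns() []*regexp.Regexp {
	mergePatternsMu.Lock()
	defer mergePatternsMu.Unlock()
	if !mergePatternsInit {
		mergePatterns = []*regexp.Regexp{
			regexp.MustCompile(`^Merge pull request #\d+`),
			regexp.MustCompile(`^Merge branch '.*' into .*`),
		}
		mergePatternsInit = true
	}
	return mergePatterns
}
```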
Changelog Bot
771a1ac2e6 chore: incoming 1642 changelog entry 2025-07-21 18:59:17 -07:00
Kayvan Sylvan
f6fd6f535a perf: optimize merge pattern matching with lazy initialization and sync.Once
## CHANGES

- Add sync package import for concurrency safety
- Implement lazy initialization for merge patterns using sync.Once
- Wrap merge patterns in getMergePatterns function
- Replace direct mergePatterns access with function call
- Ensure thread-safe pattern compilation on first use
2025-07-21 18:57:51 -07:00
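
For contrast, the `sync.Once` form this commit introduces (and which the final processing.go below keeps) compiles the patterns exactly once on first call, with no explicit locking at the call site; a condensed sketch assuming the standard `regexp` and `sync` imports:

```go
var (
	mergePatterns     []*regexp.Regexp
	mergePatternsOnce sync.Once
)

// getMergePatterns lazily compiles the patterns; sync.Once guarantees the
// function literal runs exactly once even with concurrent callers.
func getMergePatterns() []*regexp.Regexp {
	mergePatternsOnce.Do(func() {
		mergePatterns = []*regexp.Regexp{
			regexp.MustCompile(`^Merge pull request #\d+`),
			regexp.MustCompile(`^Merge branch '.*' into .*`),
		}
	})
	return mergePatterns
}
```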
Changelog Bot
f548ca5f82 chore: incoming 1642 changelog entry 2025-07-21 18:39:15 -07:00
Kayvan Sylvan
616f51748e refactor: improve merge commit detection and update error messages
## CHANGES

- Move merge patterns to package-level variables
- Update date parsing error message for clarity
- Simplify author email field comment
- Extract regex compilation from function scope
- Improve merge commit detection performance
- Clarify RFC3339 fallback error context
2025-07-21 18:37:31 -07:00
Kayvan Sylvan
db5aaf9da6 fix: improve warning message clarity for invalid commit timestamps
## CHANGES

- Simplify warning message for invalid commit timestamps
- Remove parenthetical explanation about git history rewrites
- Make error message more concise and readable
2025-07-21 18:24:03 -07:00
Kayvan Sylvan
a922032756 chore: optimize error logging and regex pattern matching for better performance
## CHANGES

- Remove redundant RFC3339Nano parsing error message
- Enhance RFC3339 error message with version name
- Pre-compile regex patterns for merge commit detection
- Replace regexp.MatchString with compiled pattern matching
- Improve merge commit pattern matching performance
- Add structured regex compilation for better efficiency
2025-07-21 18:21:23 -07:00
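
The regex optimization in this commit swaps per-call `regexp.MatchString` (which compiles the pattern on every invocation) for patterns compiled once and reused; a rough before/after sketch assuming the standard `regexp` import (the helper names are hypothetical):

```go
// Before: the pattern string is recompiled on every call.
func isMergeMessageSlow(message string) bool {
	matched, _ := regexp.MatchString(`^Merge pull request #\d+`, message)
	return matched
}

// After: compile once, reuse the *regexp.Regexp on every call.
var mergePRPattern = regexp.MustCompile(`^Merge pull request #\d+`)

func isMergeMessageFast(message string) bool {
	return mergePRPattern.MatchString(message)
}
```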
Kayvan Sylvan
a415409a48 fix: improve error handling and guidance for file removal failures
## CHANGES

- Replace generic warning with detailed error message
- Add step-by-step manual intervention instructions
- Provide multiple recovery options for users
- Separate git and filesystem error reporting
- Include specific commands for manual cleanup
2025-07-21 17:35:53 -07:00
Kayvan Sylvan
19d95b9014 docs: improve code comments for version pattern and PR commit fields
## CHANGES

- Expand version pattern regex documentation with examples
- Add matching and non-matching commit message examples
- Clarify version pattern behavior with optional prefix
- Update PR commit field comments for clarity
- Document email field availability from GitHub API
- Simplify timestamp and parents field descriptions
2025-07-21 17:21:28 -07:00
Kayvan Sylvan
73c7a8c147 feat: improve changelog entry creation and error messages
### CHANGES

- Rename `changelogDate` to `versionDate` for clarity
- Enhance error message for git index removal failure
- Add comments for `versionPattern` regex in `walker.go`
2025-07-21 17:14:14 -07:00
Changelog Bot
4dc84bd64d chore: incoming 1642 changelog entry 2025-07-21 17:00:03 -07:00
Kayvan Sylvan
dd96014f9b chore: improve error message clarity in changelog generation and cache operations
## CHANGES

- Clarify RFC3339Nano date parsing error message
- Improve PR batch cache save error description
- Add context for commit timestamp fallback warning
- Specify git index in file removal error message
2025-07-21 16:59:50 -07:00
Changelog Bot
3cf2557af3 chore: incoming 1642 changelog entry 2025-07-21 16:46:04 -07:00
Kayvan Sylvan
fcda0338cb chore: improve error logging and documentation in changelog generation components
## CHANGES

- Add date string to RFC3339 parsing error messages
- Enhance isMergeCommit function documentation with detailed explanation
- Document calculateVersionDate function with comprehensive behavior description
- Improve error context for date parsing failures
- Add implementation details for merge commit detection methods
- Clarify fallback behavior in version date calculation
2025-07-21 16:45:52 -07:00
Changelog Bot
ac19c81ef0 chore: incoming 1642 changelog entry 2025-07-21 16:37:49 -07:00
Kayvan Sylvan
a83d57065f chore: improve error message clarity in version existence check for git history sync
## CHANGES

- Enhance warning message with additional context details
- Add guidance for users when version check fails
- Improve error handling feedback in sync operation
- Provide actionable steps for troubleshooting sync issues
2025-07-21 16:37:32 -07:00
Changelog Bot
055ed32ab8 chore: incoming 1642 changelog entry 2025-07-21 16:15:16 -07:00
Kayvan Sylvan
8d62165444 feat: add email field support and improve error logging in changelog generation
## CHANGES

- Add Email field to PRCommit struct for author information
- Extract version date calculation into reusable helper function
- Redirect error messages from stdout to stderr properly
- Populate commit email from GitHub API responses correctly
- Add comprehensive test coverage for email field handling
- Remove duplicate version date calculation code blocks
- Import os package for proper stderr output handling
2025-07-21 16:14:12 -07:00
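
A sketch of the `PRCommit` struct after the Email addition, with field names and types taken from the test code later in this diff and assuming the `time` import; the exact ordering and any fields not exercised by the tests are assumptions:

```go
type PRCommit struct {
	SHA     string
	Message string
	Author  string
	Email   string    // author email, populated from the GitHub API response
	Date    time.Time // actual commit timestamp from the API
	Parents []string  // parent SHAs, used for merge-commit detection
}
```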
Kayvan Sylvan
63bc7a7e79 feat: improve timestamp handling and merge commit detection in changelog generator
## CHANGES

- Add debug logging for date parsing failures
- Pass forcePRSync parameter explicitly to fetchPRs method
- Implement comprehensive merge commit detection using parents
- Capture actual commit timestamps from GitHub API
- Calculate version dates from most recent commit
- Add parent commit SHAs for merge detection
- Use real commit dates instead of current time
- Add timestamp validation with fallback handling
2025-07-21 15:43:14 -07:00
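
Two behaviors listed above are compact enough to see in isolation: a commit counts as a merge when it has more than one parent, and a zero commit timestamp falls back to the current time. A sketch under those assumptions, with hypothetical helper names and the standard `time` import (the full versions appear in the processing.go hunk below):

```go
// mergeByParents reports whether a commit looks like a merge based solely
// on its parent count, the primary signal checked before message patterns.
func mergeByParents(parents []string) bool {
	return len(parents) > 1
}

// normalizeCommitDate substitutes the current time when the GitHub API
// returned a zero-value timestamp for a commit.
func normalizeCommitDate(d time.Time) time.Time {
	if d.IsZero() {
		return time.Now()
	}
	return d
}
```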
Changelog Bot
f2b2501767 chore: incoming 1642 changelog entry 2025-07-21 14:38:32 -07:00
Kayvan Sylvan
be1e2485ee feat: add database synchronization and improve changelog processing workflow
## CHANGES

- Add database sync command with comprehensive validation
- Implement version and commit existence checking methods
- Enhance time parsing with RFC3339Nano fallback support
- Cache fetched PRs during changelog entry creation
- Remove individual incoming files using git operations
- Add sync-db flag for database integrity validation
- Improve commit-PR mapping verification process
- Exclude incoming directory from workflow trigger paths
2025-07-21 14:33:05 -07:00
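
The RFC3339Nano fallback mentioned above tries the higher-precision layout first and only then plain RFC3339, matching the cache.go hunk later in this diff; a minimal standalone sketch assuming the `time` import (the function name is hypothetical):

```go
// parseVersionDate accepts timestamps in either RFC3339Nano or RFC3339 form.
func parseVersionDate(s string) (time.Time, error) {
	if t, err := time.Parse(time.RFC3339Nano, s); err == nil {
		return t, nil
	}
	return time.Parse(time.RFC3339, s)
}
```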
Kayvan Sylvan
38c4211649 docs: clean up duplicate CHANGELOG for v1.4.262 2025-07-21 13:27:37 -07:00
Kayvan Sylvan
71e6355c10 docs: Update CHANGELOG after v1.4.263 2025-07-21 12:35:55 -07:00
github-actions[bot]
64411cdc02 chore(release): Update version to v1.4.263 2025-07-21 19:29:11 +00:00
Kayvan Sylvan
9a2ff983a4 Merge pull request #1641 from ksylvan/0721-web-timeout-fix
Fix Fabric Web timeout error
2025-07-21 12:26:53 -07:00
Changelog Bot
a522d4a411 chore: incoming 1641 changelog entry 2025-07-21 12:24:36 -07:00
Kayvan Sylvan
9bdd77c277 chore: extend proxy timeout in vite.config.ts to 15 minutes
### CHANGES

- Increase `/api` proxy timeout to 900,000 ms
- Increase `/names` proxy timeout to 900,000 ms
2025-07-21 12:16:15 -07:00
github-actions[bot]
cc68dddfe8 chore(release): Update version to v1.4.262 2025-07-21 18:20:58 +00:00
Kayvan Sylvan
07ee7f8b21 Merge pull request #1640 from ksylvan/0720-changelog-during-ci-cd
Implement Automated Changelog System for CI/CD Integration
2025-07-21 11:18:26 -07:00
Kayvan Sylvan
15a355f08a docs: Remove duplicated section. 2025-07-21 11:07:27 -07:00
Kayvan Sylvan
c50486b611 chore: fix tests for generate_changelog 2025-07-21 10:52:43 -07:00
Kayvan Sylvan
edaca7a045 chore: adjust insertVersionAtTop for consistent newline handling 2025-07-21 10:33:00 -07:00
Kayvan Sylvan
28432a50f0 chore: adjust newline handling in insertVersionAtTop method 2025-07-21 10:28:42 -07:00
Kayvan Sylvan
8ab891fcff chore: trim leading newline in changelog entry content 2025-07-21 10:16:31 -07:00
Kayvan Sylvan
cab6df88ea chore: simplify direct commits content handling in changelog generation 2025-07-21 09:56:17 -07:00
Kayvan Sylvan
42afd92f31 refactor: rename ProcessIncomingPRs to CreateNewChangelogEntry for clarity
## CHANGES

- Rename ProcessIncomingPRs method to CreateNewChangelogEntry
- Update method comment to reflect new name
- Update main.go to call renamed method
- Reduce newline spacing in content formatting
2025-07-21 09:37:02 -07:00
Changelog Bot
76d6b1721e chore: incoming 1640 changelog entry 2025-07-21 08:42:35 -07:00
Kayvan Sylvan
7d562096d1 fix: formatting fixes in tests. 2025-07-21 08:25:13 -07:00
Kayvan Sylvan
91c1aca0dd feat: enhance changelog generator to accept version parameter for PR processing
## CHANGES

- Pass version parameter to changelog generation workflow
- Update ProcessIncomingPRs method to accept version string
- Add commit SHA tracking to prevent duplicate entries
- Modify process-prs flag to require version parameter
- Improve changelog formatting with proper spacing
- Update configuration to use ProcessPRsVersion string field
- Enhance direct commit filtering with SHA exclusion
- Update documentation to reflect version parameter requirement
2025-07-21 07:36:30 -07:00
Kayvan Sylvan
b8008a34fb feat: enhance changelog generation to avoid duplicate commit entries
## CHANGES

- Extract PR numbers from processed changelog files
- Pass processed PRs map to direct commits function
- Filter out commits already included via PR files
- Reduce extra newlines in changelog version insertion
- Add strconv import for PR number parsing
- Prevent duplicate entries between PRs and direct commits
- Improve changelog formatting consistency
2025-07-20 22:49:05 -07:00
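
De-duplication between PR files and direct commits hinges on recovering PR numbers from the incoming file names ("1640.txt" becomes 1640), which the processing.go hunk below does with `strings.TrimSuffix` and `strconv.Atoi`; a compact sketch assuming the `path/filepath`, `strings`, and `strconv` imports (the helper name is hypothetical):

```go
// prNumberFromFile extracts the PR number from an incoming changelog file
// name such as "incoming/1640.txt"; ok is false for non-matching names.
func prNumberFromFile(path string) (num int, ok bool) {
	name := filepath.Base(path)
	numStr := strings.TrimSuffix(name, ".txt")
	if numStr == name { // no ".txt" suffix present
		return 0, false
	}
	n, err := strconv.Atoi(numStr)
	if err != nil {
		return 0, false
	}
	return n, true
}
```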
Kayvan Sylvan
482759ae72 fix: ensure the PR#.txt file ends with a newline. 2025-07-20 20:38:52 -07:00
Kayvan Sylvan
b0d096d0ea feat: change push behavior from opt-out to opt-in with GitHub token auth
## CHANGES

- Change `NoPush` config field to `Push` boolean
- Update CLI flag from `--no-push` to `--push`
- Add GitHub token authentication for push operations
- Import `os` and HTTP transport packages
- Reverse push logic to require explicit enable
- Update documentation for new push behavior
- Add automatic GitHub repository detection for auth
2025-07-20 20:33:23 -07:00
Kayvan Sylvan
e56ecfb7ae chore: update gitignore and simplify changelog generator error handling
## CHANGES

- Add .claude/ directory to gitignore exclusions
- Update comment clarity for SilenceUsage flag
- Remove redundant error handling in main function
- Simplify command execution without explicit error checking
2025-07-20 18:49:32 -07:00
Kayvan Sylvan
951bd134eb Fix CLI error handling and improve git status validation
- Add SilenceUsage to prevent help output on errors
- Add GetStatusDetails method to show which files are dirty
- Include direct commits in ProcessIncomingPRs for complete AI summaries
2025-07-20 18:20:53 -07:00
Kayvan Sylvan
7ff04658f3 chore: add automated changelog processing for CI/CD integration
## CHANGES

- Add incoming PR preprocessing with validation
- Implement release aggregation for incoming files
- Create git operations for staging changes
- Add comprehensive test coverage for processing
- Extend GitHub client with validation methods
- Support version detection from nix files
- Include documentation for automated workflow
- Add command flags for PR processing
2025-07-20 17:47:34 -07:00
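
Version detection from nix files, listed above, amounts to reading version.nix and extracting the first quoted string, using the same `"([^"]+)"` regexp that appears in `detectVersion` and its test further down; a minimal sketch assuming the `fmt`, `os`, and `regexp` imports (the function name is hypothetical):

```go
// versionFromNix reads a version.nix file containing a quoted version
// string such as "1.2.3" and returns the unquoted value.
func versionFromNix(path string) (string, error) {
	data, err := os.ReadFile(path)
	if err != nil {
		return "", err
	}
	m := regexp.MustCompile(`"([^"]+)"`).FindStringSubmatch(string(data))
	if len(m) < 2 {
		return "", fmt.Errorf("no version found in %s", path)
	}
	return m[1], nil
}
```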
Kayvan Sylvan
272f04dd32 docs: Update CHANGELOG after v1.4.261 2025-07-19 07:22:24 -07:00
22 changed files with 2146 additions and 51 deletions

View File

@@ -9,6 +9,7 @@ on:
- "**/*.md"
- "data/strategies/**"
- "cmd/generate_changelog/*.db"
- "cmd/generate_changelog/incoming/*.txt"
- "scripts/pattern_descriptions/*.json"
- "web/static/data/pattern_descriptions.json"
@@ -83,14 +84,23 @@ jobs:
run: |
nix run .#gomod2nix -- --outdir nix/pkgs/fabric
- name: Generate Changelog Entry
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: |
go run ./cmd/generate_changelog --process-prs ${{ env.new_tag }}
- name: Commit changes
run: |
# These files are modified by the version bump process
git add cmd/fabric/version.go
git add nix/pkgs/fabric/version.nix
git add nix/pkgs/fabric/gomod2nix.toml
git add .
# The changelog tool is responsible for staging CHANGELOG.md, changelog.db,
# and removing the incoming/ directory.
if ! git diff --staged --quiet; then
git commit -m "Update version to ${{ env.new_tag }} and commit $commit_hash"
git commit -m "chore(release): Update version to ${{ env.new_tag }}"
else
echo "No changes to commit."
fi

.gitignore (vendored): 3 changes
View File

@@ -350,3 +350,6 @@ web/static/*.png
# Local tmp directory
.tmp/
tmp/
# Ignore .claude/
.claude/

View File

@@ -23,6 +23,7 @@
"Eisler",
"elif",
"envrc",
"Errorf",
"eugeis",
"Eugen",
"excalidraw",
@@ -80,6 +81,7 @@
"nometa",
"numpy",
"ollama",
"ollamaapi",
"openaiapi",
"opencode",
"openrouter",
@@ -120,6 +122,7 @@
"WEBVTT",
"wipecontext",
"wipesession",
"Worktree",
"writeups",
"xclip",
"yourpatternname",

View File

@@ -1,5 +1,59 @@
# Changelog
## v1.4.264 (2025-07-22)
### PR [#1642](https://github.com/danielmiessler/Fabric/pull/1642) by [ksylvan](https://github.com/ksylvan): Add --sync-db to `generate_changelog`, plus many fixes
- Add database synchronization command with comprehensive validation and sync-db flag for database integrity validation
- Implement version and commit existence checking methods with enhanced time parsing using RFC3339Nano fallback support
- Improve timestamp handling and merge commit detection in changelog generator with comprehensive merge commit detection using parents
- Add email field support to PRCommit struct for author information and improve error logging throughout changelog generation
- Optimize merge pattern matching with lazy initialization and thread-safe pattern compilation for better performance
### Direct commits
- Chore: incoming 1642 changelog entry
- Fix: improve error message formatting in version date parsing
- Add actual error details to date parsing failure message
- Include error variable in stderr output formatting
- Enhance debugging information for invalid date formats
- Docs: Update CHANGELOG after v1.4.263
## v1.4.263 (2025-07-21)
### PR [#1641](https://github.com/danielmiessler/Fabric/pull/1641) by [ksylvan](https://github.com/ksylvan): Fix Fabric Web timeout error
- Chore: extend proxy timeout in `vite.config.ts` to 15 minutes
- Increase `/api` proxy timeout to 900,000 ms
- Increase `/names` proxy timeout to 900,000 ms
## v1.4.262 (2025-07-21)
### PR [#1640](https://github.com/danielmiessler/Fabric/pull/1640) by [ksylvan](https://github.com/ksylvan): Implement Automated Changelog System for CI/CD Integration
- Add automated changelog processing for CI/CD integration with comprehensive test coverage and GitHub client validation methods
- Implement release aggregation for incoming files with git operations for staging changes and support for version detection from nix files
- Change push behavior from opt-out to opt-in with GitHub token authentication and automatic repository detection
- Enhance changelog generation to avoid duplicate commit entries by extracting PR numbers and filtering commits already included via PR files
- Add version parameter requirement for PR processing with commit SHA tracking to prevent duplicate entries and improve formatting consistency
### Direct commits
- Docs: Update CHANGELOG after v1.4.261
## v1.4.261 (2025-07-19)
### PR [#1637](https://github.com/danielmiessler/Fabric/pull/1637) by [ksylvan](https://github.com/ksylvan): chore: update `NeedsRawMode` to include `mistral` prefix for Ollama
- Updated `NeedsRawMode` to include `mistral` prefix for Ollama compatibility
- Added `mistral` to `ollamaPrefixes` list for improved model support
### Direct commits
- Updated CHANGELOG after v1.4.260 release
## v1.4.260 (2025-07-18)
### PR [#1634](https://github.com/danielmiessler/Fabric/pull/1634) by [ksylvan](https://github.com/ksylvan): Fix abort in Exo-Labs provider plugin; with credit to @sakithahSenid

View File

@@ -1,3 +1,3 @@
package main
var version = "v1.4.261"
var version = "v1.4.264"

Binary file not shown.

View File

@@ -4,6 +4,7 @@ import (
"database/sql"
"encoding/json"
"fmt"
"os"
"time"
"github.com/danielmiessler/fabric/cmd/generate_changelog/internal/git"
@@ -201,7 +202,14 @@ func (c *Cache) GetVersions() (map[string]*git.Version, error) {
}
if dateStr.Valid {
v.Date, _ = time.Parse(time.RFC3339, dateStr.String)
// Try RFC3339Nano first (for nanosecond precision), then fall back to RFC3339
v.Date, err = time.Parse(time.RFC3339Nano, dateStr.String)
if err != nil {
v.Date, err = time.Parse(time.RFC3339, dateStr.String)
if err != nil {
fmt.Fprintf(os.Stderr, "Error parsing date '%s' for version '%s': %v. Expected format: RFC3339 or RFC3339Nano.\n", dateStr.String, v.Name, err)
}
}
}
if prNumbersJSON != "" {
@@ -260,6 +268,26 @@ func (c *Cache) Clear() error {
return nil
}
// VersionExists checks if a version already exists in the cache
func (c *Cache) VersionExists(version string) (bool, error) {
var count int
err := c.db.QueryRow("SELECT COUNT(*) FROM versions WHERE name = ?", version).Scan(&count)
if err != nil {
return false, err
}
return count > 0, nil
}
// CommitExists checks if a commit already exists in the cache
func (c *Cache) CommitExists(hash string) (bool, error) {
var count int
err := c.db.QueryRow("SELECT COUNT(*) FROM commits WHERE sha = ?", hash).Scan(&count)
if err != nil {
return false, err
}
return count > 0, nil
}
// GetLastPRSync returns the timestamp of the last PR sync
func (c *Cache) GetLastPRSync() (time.Time, error) {
var timestamp string

View File

@@ -65,7 +65,7 @@ func (g *Generator) Generate() (string, error) {
return "", fmt.Errorf("failed to collect data: %w", err)
}
if err := g.fetchPRs(); err != nil {
if err := g.fetchPRs(g.cfg.ForcePRSync); err != nil {
return "", fmt.Errorf("failed to fetch PRs: %w", err)
}
@@ -193,7 +193,7 @@ func (g *Generator) collectData() error {
return nil
}
func (g *Generator) fetchPRs() error {
func (g *Generator) fetchPRs(forcePRSync bool) error {
// First, load all cached PRs
if g.cache != nil {
cachedPRs, err := g.cache.GetAllPRs()
@@ -229,7 +229,7 @@ func (g *Generator) fetchPRs() error {
}
// If we have never synced or it's been more than 24 hours, do a full sync
// Also sync if we have versions with PR numbers that aren't cached
needsSync := lastSync.IsZero() || time.Since(lastSync) > 24*time.Hour || g.cfg.ForcePRSync || missingPRs
needsSync := lastSync.IsZero() || time.Since(lastSync) > 24*time.Hour || forcePRSync || missingPRs
if !needsSync {
fmt.Fprintf(os.Stderr, "Using cached PR data (last sync: %s)\n", lastSync.Format("2006-01-02 15:04:05"))
@@ -697,3 +697,109 @@ func hashContent(content string) string {
hash := sha256.Sum256([]byte(content))
return fmt.Sprintf("%x", hash)
}
// SyncDatabase performs a comprehensive database synchronization and validation
func (g *Generator) SyncDatabase() error {
if g.cache == nil {
return fmt.Errorf("cache is disabled, cannot sync database")
}
fmt.Fprintf(os.Stderr, "[SYNC] Starting database synchronization...\n")
// Step 1: Force PR sync (pass true explicitly)
fmt.Fprintf(os.Stderr, "[PR_SYNC] Forcing PR sync from GitHub...\n")
if err := g.fetchPRs(true); err != nil {
return fmt.Errorf("failed to sync PRs: %w", err)
}
// Step 2: Rebuild git history and verify versions/commits completeness
fmt.Fprintf(os.Stderr, "[VERIFY] Verifying git history and version completeness...\n")
if err := g.syncGitHistory(); err != nil {
return fmt.Errorf("failed to sync git history: %w", err)
}
// Step 3: Verify commit-PR mappings
fmt.Fprintf(os.Stderr, "[MAPPING] Verifying commit-PR mappings...\n")
if err := g.verifyCommitPRMappings(); err != nil {
return fmt.Errorf("failed to verify commit-PR mappings: %w", err)
}
fmt.Fprintf(os.Stderr, "[SUCCESS] Database synchronization completed successfully!\n")
return nil
}
// syncGitHistory walks the complete git history and ensures all versions and commits are cached
func (g *Generator) syncGitHistory() error {
// Walk complete git history (reuse existing logic)
versions, err := g.gitWalker.WalkHistory()
if err != nil {
return fmt.Errorf("failed to walk git history: %w", err)
}
// Save only new versions and commits (preserve existing data)
var newVersions, newCommits int
for _, version := range versions {
// Only save version if it doesn't exist
exists, err := g.cache.VersionExists(version.Name)
if err != nil {
fmt.Fprintf(os.Stderr, "Warning: Failed to check existence of version %s: %v. This may affect the completeness of the sync operation.\n", version.Name, err)
continue
}
if !exists {
if err := g.cache.SaveVersion(version); err != nil {
fmt.Fprintf(os.Stderr, "Warning: Failed to save version %s: %v\n", version.Name, err)
} else {
newVersions++
}
}
// Only save commits that don't exist
for _, commit := range version.Commits {
exists, err := g.cache.CommitExists(commit.SHA)
if err != nil {
fmt.Fprintf(os.Stderr, "Warning: Failed to check commit %s existence: %v\n", commit.SHA, err)
continue
}
if !exists {
if err := g.cache.SaveCommit(commit, version.Name); err != nil {
fmt.Fprintf(os.Stderr, "Warning: Failed to save commit %s: %v\n", commit.SHA, err)
} else {
newCommits++
}
}
}
}
// Update last processed tag
if latestTag, err := g.gitWalker.GetLatestTag(); err == nil && latestTag != "" {
if err := g.cache.SetLastProcessedTag(latestTag); err != nil {
fmt.Fprintf(os.Stderr, "Warning: Failed to update last processed tag: %v\n", err)
}
}
fmt.Fprintf(os.Stderr, " Added %d new versions and %d new commits (preserved existing data)\n", newVersions, newCommits)
return nil
}
// verifyCommitPRMappings ensures all PR commits have proper mappings
func (g *Generator) verifyCommitPRMappings() error {
// Get all cached PRs
allPRs, err := g.cache.GetAllPRs()
if err != nil {
return fmt.Errorf("failed to get cached PRs: %w", err)
}
// Convert to slice for batch operations (reuse existing logic)
var prSlice []*github.PR
for _, pr := range allPRs {
prSlice = append(prSlice, pr)
}
// Save commit-PR mappings (reuse existing logic)
if err := g.cache.SaveCommitPRMappings(prSlice); err != nil {
return fmt.Errorf("failed to save commit-PR mappings: %w", err)
}
fmt.Fprintf(os.Stderr, " Verified mappings for %d PRs\n", len(prSlice))
return nil
}

View File

@@ -0,0 +1,115 @@
package changelog
import (
"os"
"path/filepath"
"regexp"
"testing"
"github.com/danielmiessler/fabric/cmd/generate_changelog/internal/config"
)
func TestDetectVersionFromNix(t *testing.T) {
tempDir := t.TempDir()
t.Run("version.nix exists", func(t *testing.T) {
versionNixContent := `"1.2.3"`
versionNixPath := filepath.Join(tempDir, "version.nix")
err := os.WriteFile(versionNixPath, []byte(versionNixContent), 0644)
if err != nil {
t.Fatalf("Failed to write version.nix: %v", err)
}
data, err := os.ReadFile(versionNixPath)
if err != nil {
t.Fatalf("Failed to read version.nix: %v", err)
}
versionRegex := regexp.MustCompile(`"([^"]+)"`)
matches := versionRegex.FindStringSubmatch(string(data))
if len(matches) <= 1 {
t.Fatalf("No version found in version.nix")
}
version := matches[1]
if version != "1.2.3" {
t.Errorf("Expected version 1.2.3, got %s", version)
}
})
}
func TestEnsureIncomingDir(t *testing.T) {
tempDir := t.TempDir()
incomingDir := filepath.Join(tempDir, "incoming")
cfg := &config.Config{
IncomingDir: incomingDir,
}
g := &Generator{cfg: cfg}
err := g.ensureIncomingDir()
if err != nil {
t.Fatalf("ensureIncomingDir failed: %v", err)
}
if _, err := os.Stat(incomingDir); os.IsNotExist(err) {
t.Errorf("Incoming directory was not created")
}
}
func TestInsertVersionAtTop(t *testing.T) {
tempDir := t.TempDir()
changelogPath := filepath.Join(tempDir, "CHANGELOG.md")
cfg := &config.Config{
RepoPath: tempDir,
}
g := &Generator{cfg: cfg}
t.Run("new changelog", func(t *testing.T) {
entry := "## v1.0.0 (2025-01-01)\n\n- Initial release"
err := g.insertVersionAtTop(entry)
if err != nil {
t.Fatalf("insertVersionAtTop failed: %v", err)
}
content, err := os.ReadFile(changelogPath)
if err != nil {
t.Fatalf("Failed to read changelog: %v", err)
}
expected := "# Changelog\n\n## v1.0.0 (2025-01-01)\n\n- Initial release\n"
if string(content) != expected {
t.Errorf("Expected:\n%s\nGot:\n%s", expected, string(content))
}
})
t.Run("existing changelog", func(t *testing.T) {
existingContent := "# Changelog\n\n## v0.9.0 (2024-12-01)\n\n- Previous release"
err := os.WriteFile(changelogPath, []byte(existingContent), 0644)
if err != nil {
t.Fatalf("Failed to write existing changelog: %v", err)
}
entry := "## v1.0.0 (2025-01-01)\n\n- New release"
err = g.insertVersionAtTop(entry)
if err != nil {
t.Fatalf("insertVersionAtTop failed: %v", err)
}
content, err := os.ReadFile(changelogPath)
if err != nil {
t.Fatalf("Failed to read changelog: %v", err)
}
expected := "# Changelog\n\n## v1.0.0 (2025-01-01)\n\n- New release\n## v0.9.0 (2024-12-01)\n\n- Previous release"
if string(content) != expected {
t.Errorf("Expected:\n%s\nGot:\n%s", expected, string(content))
}
})
}

View File

@@ -0,0 +1,82 @@
package changelog
import (
"testing"
"time"
"github.com/danielmiessler/fabric/cmd/generate_changelog/internal/github"
)
func TestIsMergeCommit(t *testing.T) {
tests := []struct {
name string
commit github.PRCommit
expected bool
}{
{
name: "Regular commit with single parent",
commit: github.PRCommit{
SHA: "abc123",
Message: "Fix bug in user authentication",
Author: "John Doe",
Date: time.Now(),
Parents: []string{"def456"},
},
expected: false,
},
{
name: "Merge commit with multiple parents",
commit: github.PRCommit{
SHA: "abc123",
Message: "Merge pull request #42 from feature/auth",
Author: "GitHub",
Date: time.Now(),
Parents: []string{"def456", "ghi789"},
},
expected: true,
},
{
name: "Merge commit detected by message pattern only",
commit: github.PRCommit{
SHA: "abc123",
Message: "Merge pull request #123 from user/feature-branch",
Author: "GitHub",
Date: time.Now(),
Parents: []string{}, // Empty parents - fallback to message detection
},
expected: true,
},
{
name: "Merge branch commit pattern",
commit: github.PRCommit{
SHA: "abc123",
Message: "Merge branch 'feature' into main",
Author: "Developer",
Date: time.Now(),
Parents: []string{"def456"}, // Single parent but merge pattern
},
expected: true,
},
{
name: "Regular commit with no merge patterns",
commit: github.PRCommit{
SHA: "abc123",
Message: "Add new feature for user management",
Author: "Jane Doe",
Date: time.Now(),
Parents: []string{"def456"},
},
expected: false,
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
result := isMergeCommit(tt.commit)
if result != tt.expected {
t.Errorf("isMergeCommit() = %v, expected %v for commit: %s",
result, tt.expected, tt.commit.Message)
}
})
}
}

View File

@@ -0,0 +1,521 @@
package changelog
import (
"fmt"
"os"
"path/filepath"
"regexp"
"sort"
"strconv"
"strings"
"sync"
"time"
"github.com/danielmiessler/fabric/cmd/generate_changelog/internal/git"
"github.com/danielmiessler/fabric/cmd/generate_changelog/internal/github"
)
var (
mergePatterns []*regexp.Regexp
mergePatternsOnce sync.Once
)
// getMergePatterns returns the compiled merge patterns, initializing them lazily
func getMergePatterns() []*regexp.Regexp {
mergePatternsOnce.Do(func() {
mergePatterns = []*regexp.Regexp{
regexp.MustCompile(`^Merge pull request #\d+`), // "Merge pull request #123 from..."
regexp.MustCompile(`^Merge branch '.*' into .*`), // "Merge branch 'feature' into main"
regexp.MustCompile(`^Merge remote-tracking branch`), // "Merge remote-tracking branch..."
regexp.MustCompile(`^Merge '.*' into .*`), // "Merge 'feature' into main"
}
})
return mergePatterns
}
// isMergeCommit determines if a commit is a merge commit based on its parents and message patterns.
func isMergeCommit(commit github.PRCommit) bool {
// Primary method: Check parent count (merge commits have multiple parents)
if len(commit.Parents) > 1 {
return true
}
// Fallback method: Check commit message patterns
mergePatterns := getMergePatterns()
for _, pattern := range mergePatterns {
if pattern.MatchString(commit.Message) {
return true
}
}
return false
}
// calculateVersionDate determines the version date based on the most recent commit date from the provided PRs.
//
// If no valid commit dates are found, the function falls back to the current time.
// The function iterates through the provided PRs and their associated commits, comparing commit dates
// to identify the most recent one. If a valid date is found, it is returned; otherwise, the fallback is used.
func calculateVersionDate(fetchedPRs []*github.PR) time.Time {
versionDate := time.Now() // fallback to current time
if len(fetchedPRs) > 0 {
var mostRecentCommitDate time.Time
for _, pr := range fetchedPRs {
for _, commit := range pr.Commits {
if commit.Date.After(mostRecentCommitDate) {
mostRecentCommitDate = commit.Date
}
}
}
if !mostRecentCommitDate.IsZero() {
versionDate = mostRecentCommitDate
}
}
return versionDate
}
// ProcessIncomingPR processes a single PR for changelog entry creation
func (g *Generator) ProcessIncomingPR(prNumber int) error {
if err := g.validatePRState(prNumber); err != nil {
return fmt.Errorf("PR validation failed: %w", err)
}
if err := g.validateGitStatus(); err != nil {
return fmt.Errorf("git status validation failed: %w", err)
}
// Now fetch the full PR with commits for content generation
pr, err := g.ghClient.GetPRWithCommits(prNumber)
if err != nil {
return fmt.Errorf("failed to fetch PR %d: %w", prNumber, err)
}
content := g.formatPR(pr)
if g.cfg.EnableAISummary {
aiContent, err := SummarizeVersionContent(content)
if err != nil {
fmt.Fprintf(os.Stderr, "Warning: AI summarization failed: %v\n", err)
} else if !checkForAIError(aiContent) {
content = strings.TrimSpace(aiContent)
}
}
if err := g.ensureIncomingDir(); err != nil {
return fmt.Errorf("failed to create incoming directory: %w", err)
}
filename := filepath.Join(g.cfg.IncomingDir, fmt.Sprintf("%d.txt", prNumber))
// Ensure content ends with a single newline
content = strings.TrimSpace(content) + "\n"
if err := os.WriteFile(filename, []byte(content), 0644); err != nil {
return fmt.Errorf("failed to write incoming file: %w", err)
}
if err := g.commitAndPushIncoming(prNumber, filename); err != nil {
return fmt.Errorf("failed to commit and push: %w", err)
}
fmt.Printf("Successfully created incoming changelog entry: %s\n", filename)
return nil
}
// CreateNewChangelogEntry aggregates all incoming PR files for release and includes direct commits
func (g *Generator) CreateNewChangelogEntry(version string) error {
files, err := filepath.Glob(filepath.Join(g.cfg.IncomingDir, "*.txt"))
if err != nil {
return fmt.Errorf("failed to scan incoming directory: %w", err)
}
var content strings.Builder
var processingErrors []string
// First, aggregate all incoming PR files
for _, file := range files {
data, err := os.ReadFile(file)
if err != nil {
processingErrors = append(processingErrors, fmt.Sprintf("failed to read %s: %v", file, err))
continue // Continue to attempt processing other files
}
content.WriteString(string(data))
content.WriteString("\n")
}
if len(processingErrors) > 0 {
return fmt.Errorf("encountered errors while processing incoming files: %s", strings.Join(processingErrors, "; "))
}
// Extract PR numbers and their commit SHAs from processed files to avoid including their commits as "direct"
processedPRs := make(map[int]bool)
processedCommitSHAs := make(map[string]bool)
var fetchedPRs []*github.PR
var prNumbers []int
for _, file := range files {
// Extract PR number from filename (e.g., "1640.txt" -> 1640)
filename := filepath.Base(file)
if prNumStr := strings.TrimSuffix(filename, ".txt"); prNumStr != filename {
if prNum, err := strconv.Atoi(prNumStr); err == nil {
processedPRs[prNum] = true
prNumbers = append(prNumbers, prNum)
// Fetch the PR to get its commit SHAs
if pr, err := g.ghClient.GetPRWithCommits(prNum); err == nil {
fetchedPRs = append(fetchedPRs, pr)
for _, commit := range pr.Commits {
processedCommitSHAs[commit.SHA] = true
}
}
}
}
}
// Now add direct commits since the last release, excluding commits from processed PRs
directCommitsContent, err := g.getDirectCommitsSinceLastRelease(processedPRs, processedCommitSHAs)
if err != nil {
return fmt.Errorf("failed to get direct commits since last release: %w", err)
}
content.WriteString(directCommitsContent)
// Check if we have any content at all
if content.Len() == 0 {
if len(files) == 0 {
fmt.Fprintf(os.Stderr, "No incoming PR files found in %s and no direct commits since last release\n", g.cfg.IncomingDir)
} else {
fmt.Fprintf(os.Stderr, "No content found in incoming files and no direct commits since last release\n")
}
return nil
}
// Calculate the version date for the changelog entry as the most recent commit date from processed PRs
versionDate := calculateVersionDate(fetchedPRs)
entry := fmt.Sprintf("## %s (%s)\n\n%s",
version, versionDate.Format("2006-01-02"), strings.TrimLeft(content.String(), "\n"))
if err := g.insertVersionAtTop(entry); err != nil {
return fmt.Errorf("failed to update CHANGELOG.md: %w", err)
}
if g.cache != nil {
// Cache the fetched PRs using the same logic as normal changelog generation
if len(fetchedPRs) > 0 {
// Save PRs to cache
if err := g.cache.SavePRBatch(fetchedPRs); err != nil {
fmt.Fprintf(os.Stderr, "Warning: Failed to save PR batch to cache: %v\n", err)
}
// Save SHA→PR mappings for lightning-fast git operations
if err := g.cache.SaveCommitPRMappings(fetchedPRs); err != nil {
fmt.Fprintf(os.Stderr, "Warning: Failed to cache commit mappings: %v\n", err)
}
// Save individual commits to cache for each PR
for _, pr := range fetchedPRs {
for _, commit := range pr.Commits {
// Use actual commit timestamp, with fallback to current time if invalid
commitDate := commit.Date
if commitDate.IsZero() {
commitDate = time.Now()
fmt.Fprintf(os.Stderr, "Warning: Commit %s has invalid timestamp, using current time as fallback\n", commit.SHA)
}
// Convert github.PRCommit to git.Commit
gitCommit := &git.Commit{
SHA: commit.SHA,
Message: commit.Message,
Author: commit.Author,
Email: commit.Email, // Use email from GitHub API
Date: commitDate, // Use actual commit timestamp from GitHub API
IsMerge: isMergeCommit(commit), // Detect merge commits using parents and message patterns
PRNumber: pr.Number,
}
if err := g.cache.SaveCommit(gitCommit, version); err != nil {
fmt.Fprintf(os.Stderr, "Warning: Failed to save commit %s to cache: %v\n", commit.SHA, err)
}
}
}
}
// Create a proper new version entry for the database
newVersionEntry := &git.Version{
Name: version,
Date: versionDate, // Use most recent commit date instead of current time
CommitSHA: "", // Will be set when the release commit is made
PRNumbers: prNumbers, // Now we have the actual PR numbers
AISummary: content.String(),
}
if err := g.cache.SaveVersion(newVersionEntry); err != nil {
return fmt.Errorf("failed to save new version entry to database: %w", err)
}
}
for _, file := range files {
// Convert to relative path for git operations
relativeFile, err := filepath.Rel(g.cfg.RepoPath, file)
if err != nil {
relativeFile = file
}
// Use git remove to handle both filesystem and git index
if err := g.gitWalker.RemoveFile(relativeFile); err != nil {
fmt.Fprintf(os.Stderr, "Warning: Failed to remove %s from git index: %v\n", relativeFile, err)
// Fallback to filesystem-only removal
if err := os.Remove(file); err != nil {
fmt.Fprintf(os.Stderr, "Error: Failed to remove %s from the filesystem after failing to remove it from the git index.\n", relativeFile)
fmt.Fprintf(os.Stderr, "Filesystem error: %v\n", err)
fmt.Fprintf(os.Stderr, "Manual intervention required:\n")
fmt.Fprintf(os.Stderr, " 1. Remove the file %s manually (using the OS-specific command)\n", file)
fmt.Fprintf(os.Stderr, " 2. Remove from git index: git rm --cached %s\n", relativeFile)
fmt.Fprintf(os.Stderr, " 3. Or reset git index: git reset HEAD %s\n", relativeFile)
}
}
}
if err := g.stageChangesForRelease(); err != nil {
return fmt.Errorf("critical: failed to stage changes for release: %w", err)
}
fmt.Printf("Successfully processed %d incoming PR files for version %s\n", len(files), version)
return nil
}
// getDirectCommitsSinceLastRelease gets all direct commits (not part of PRs) since the last release
func (g *Generator) getDirectCommitsSinceLastRelease(processedPRs map[int]bool, processedCommitSHAs map[string]bool) (string, error) {
// Get the latest tag to determine what commits are unreleased
latestTag, err := g.gitWalker.GetLatestTag()
if err != nil {
return "", fmt.Errorf("failed to get latest tag: %w", err)
}
// Get all commits since the latest tag
unreleasedVersion, err := g.gitWalker.WalkCommitsSinceTag(latestTag)
if err != nil {
return "", fmt.Errorf("failed to walk commits since tag %s: %w", latestTag, err)
}
if unreleasedVersion == nil || len(unreleasedVersion.Commits) == 0 {
return "", nil // No unreleased commits
}
// Filter out commits that are part of PRs (we already have those from incoming files)
// and format the direct commits
var directCommits []*git.Commit
for _, commit := range unreleasedVersion.Commits {
// Skip version bump commits
if commit.IsVersion {
continue
}
// Skip commits that belong to PRs we've already processed from incoming files (by PR number)
if commit.PRNumber > 0 && processedPRs[commit.PRNumber] {
continue
}
// Skip commits whose SHA is already included in processed PRs (this catches commits
// that might not have been detected as part of a PR but are actually in the PR)
if processedCommitSHAs[commit.SHA] {
continue
}
// Only include commits that are NOT part of any PR (direct commits)
if commit.PRNumber == 0 {
directCommits = append(directCommits, commit)
}
}
if len(directCommits) == 0 {
return "", nil // No direct commits
}
// Format the direct commits similar to how it's done in generateRawVersionContent
var sb strings.Builder
sb.WriteString("### Direct commits\n\n")
// Sort direct commits by date (newest first) for consistent ordering
sort.Slice(directCommits, func(i, j int) bool {
return directCommits[i].Date.After(directCommits[j].Date)
})
for _, commit := range directCommits {
message := g.formatCommitMessage(strings.TrimSpace(commit.Message))
if message != "" && !g.isDuplicateMessage(message, directCommits) {
sb.WriteString(fmt.Sprintf("- %s\n", message))
}
}
return sb.String(), nil
}
// validatePRState validates that a PR is in the correct state for processing
func (g *Generator) validatePRState(prNumber int) error {
// Use lightweight validation call that doesn't fetch commits
details, err := g.ghClient.GetPRValidationDetails(prNumber)
if err != nil {
return fmt.Errorf("failed to fetch PR %d: %w", prNumber, err)
}
if details.State != "open" {
return fmt.Errorf("PR %d is not open (current state: %s)", prNumber, details.State)
}
if !details.Mergeable {
return fmt.Errorf("PR %d is not mergeable - please resolve conflicts first", prNumber)
}
return nil
}
// validateGitStatus ensures the working directory is clean
func (g *Generator) validateGitStatus() error {
isClean, err := g.gitWalker.IsWorkingDirectoryClean()
if err != nil {
return fmt.Errorf("failed to check git status: %w", err)
}
if !isClean {
// Get detailed status for better error message
statusDetails, statusErr := g.gitWalker.GetStatusDetails()
if statusErr == nil && statusDetails != "" {
return fmt.Errorf("working directory is not clean - please commit or stash changes before proceeding:\n%s", statusDetails)
}
return fmt.Errorf("working directory is not clean - please commit or stash changes before proceeding")
}
return nil
}
// ensureIncomingDir creates the incoming directory if it doesn't exist
func (g *Generator) ensureIncomingDir() error {
if err := os.MkdirAll(g.cfg.IncomingDir, 0755); err != nil {
return fmt.Errorf("failed to create directory %s: %w", g.cfg.IncomingDir, err)
}
return nil
}
// commitAndPushIncoming commits and optionally pushes the incoming changelog file
func (g *Generator) commitAndPushIncoming(prNumber int, filename string) error {
relativeFilename, err := filepath.Rel(g.cfg.RepoPath, filename)
if err != nil {
relativeFilename = filename
}
// Add file to git index
if err := g.gitWalker.AddFile(relativeFilename); err != nil {
return fmt.Errorf("failed to add file %s: %w", relativeFilename, err)
}
// Commit changes
commitMessage := fmt.Sprintf("chore: incoming %d changelog entry", prNumber)
_, err = g.gitWalker.CommitChanges(commitMessage)
if err != nil {
return fmt.Errorf("failed to commit changes: %w", err)
}
// Push to remote if enabled
if g.cfg.Push {
if err := g.gitWalker.PushToRemote(); err != nil {
return fmt.Errorf("failed to push to remote: %w", err)
}
} else {
fmt.Println("Commit created successfully. Please review and push manually.")
}
return nil
}
// detectVersion detects the current version from version.nix or git tags
func (g *Generator) detectVersion() (string, error) {
versionNixPath := filepath.Join(g.cfg.RepoPath, "version.nix")
if _, err := os.Stat(versionNixPath); err == nil {
data, err := os.ReadFile(versionNixPath)
if err != nil {
return "", fmt.Errorf("failed to read version.nix: %w", err)
}
versionRegex := regexp.MustCompile(`"([^"]+)"`)
matches := versionRegex.FindStringSubmatch(string(data))
if len(matches) > 1 {
return matches[1], nil
}
}
latestTag, err := g.gitWalker.GetLatestTag()
if err != nil {
return "", fmt.Errorf("failed to get latest tag: %w", err)
}
if latestTag == "" {
return "v1.0.0", nil
}
return latestTag, nil
}
// insertVersionAtTop inserts a new version entry at the top of CHANGELOG.md
func (g *Generator) insertVersionAtTop(entry string) error {
changelogPath := filepath.Join(g.cfg.RepoPath, "CHANGELOG.md")
header := "# Changelog"
headerRegex := regexp.MustCompile(`(?m)^# Changelog\s*`)
existingContent, err := os.ReadFile(changelogPath)
if err != nil {
if !os.IsNotExist(err) {
return fmt.Errorf("failed to read existing CHANGELOG.md: %w", err)
}
// File doesn't exist, create it.
newContent := fmt.Sprintf("%s\n\n%s\n", header, entry)
return os.WriteFile(changelogPath, []byte(newContent), 0644)
}
contentStr := string(existingContent)
var newContent string
if loc := headerRegex.FindStringIndex(contentStr); loc != nil {
// Found the header, insert after it.
insertionPoint := loc[1]
// Skip any existing newlines after the header to avoid double spacing
for insertionPoint < len(contentStr) && (contentStr[insertionPoint] == '\n' || contentStr[insertionPoint] == '\r') {
insertionPoint++
}
// Insert with proper spacing: single newline after header, then entry, then newline before existing content
newContent = contentStr[:loc[1]] + entry + "\n" + contentStr[insertionPoint:]
} else {
// Header not found, prepend everything.
newContent = fmt.Sprintf("%s\n\n%s\n\n%s", header, entry, contentStr)
}
return os.WriteFile(changelogPath, []byte(newContent), 0644)
}
// stageChangesForRelease stages the modified files for the release commit
func (g *Generator) stageChangesForRelease() error {
changelogPath := filepath.Join(g.cfg.RepoPath, "CHANGELOG.md")
relativeChangelog, err := filepath.Rel(g.cfg.RepoPath, changelogPath)
if err != nil {
relativeChangelog = "CHANGELOG.md"
}
relativeCacheFile, err := filepath.Rel(g.cfg.RepoPath, g.cfg.CacheFile)
if err != nil {
relativeCacheFile = g.cfg.CacheFile
}
// Add CHANGELOG.md to git index
if err := g.gitWalker.AddFile(relativeChangelog); err != nil {
return fmt.Errorf("failed to add %s: %w", relativeChangelog, err)
}
// Add cache file to git index
if err := g.gitWalker.AddFile(relativeCacheFile); err != nil {
return fmt.Errorf("failed to add %s: %w", relativeCacheFile, err)
}
// Note: Individual incoming files are now removed during the main processing loop
// No need to remove the entire directory here
return nil
}

View File

@@ -0,0 +1,262 @@
package changelog
import (
"os"
"path/filepath"
"strings"
"testing"
"github.com/danielmiessler/fabric/cmd/generate_changelog/internal/config"
)
func TestDetectVersion(t *testing.T) {
tempDir := t.TempDir()
tests := []struct {
name string
versionNixContent string
expectedVersion string
shouldError bool
}{
{
name: "valid version.nix",
versionNixContent: `"1.2.3"`,
expectedVersion: "1.2.3",
shouldError: false,
},
{
name: "version with extra whitespace",
versionNixContent: `"1.2.3" `,
expectedVersion: "1.2.3",
shouldError: false,
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
// Create version.nix file
versionNixPath := filepath.Join(tempDir, "version.nix")
if err := os.WriteFile(versionNixPath, []byte(tt.versionNixContent), 0644); err != nil {
t.Fatalf("Failed to create version.nix: %v", err)
}
cfg := &config.Config{
RepoPath: tempDir,
}
g := &Generator{cfg: cfg}
version, err := g.detectVersion()
if tt.shouldError && err == nil {
t.Errorf("Expected error but got none")
}
if !tt.shouldError && err != nil {
t.Errorf("Unexpected error: %v", err)
}
if version != tt.expectedVersion {
t.Errorf("Expected version '%s', got '%s'", tt.expectedVersion, version)
}
// Clean up
os.Remove(versionNixPath)
})
}
}
func TestInsertVersionAtTop_ImprovedRobustness(t *testing.T) {
tempDir := t.TempDir()
changelogPath := filepath.Join(tempDir, "CHANGELOG.md")
cfg := &config.Config{
RepoPath: tempDir,
}
g := &Generator{cfg: cfg}
tests := []struct {
name string
existingContent string
entry string
expectedContent string
}{
{
name: "header with trailing spaces",
existingContent: "# Changelog \n\n## v1.0.0\n- Old content",
entry: "## v2.0.0\n- New content",
expectedContent: "# Changelog \n\n## v2.0.0\n- New content\n## v1.0.0\n- Old content",
},
{
name: "header with different line endings",
existingContent: "# Changelog\r\n\r\n## v1.0.0\r\n- Old content",
entry: "## v2.0.0\n- New content",
expectedContent: "# Changelog\r\n\r\n## v2.0.0\n- New content\n## v1.0.0\r\n- Old content",
},
{
name: "no existing header",
existingContent: "Some existing content without header",
entry: "## v1.0.0\n- New content",
expectedContent: "# Changelog\n\n## v1.0.0\n- New content\n\nSome existing content without header",
},
{
name: "new file creation",
existingContent: "",
entry: "## v1.0.0\n- Initial release",
expectedContent: "# Changelog\n\n## v1.0.0\n- Initial release\n",
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
// Write existing content (or create empty file)
if tt.existingContent != "" {
if err := os.WriteFile(changelogPath, []byte(tt.existingContent), 0644); err != nil {
t.Fatalf("Failed to write existing content: %v", err)
}
} else {
// Remove file if it exists to test new file creation
os.Remove(changelogPath)
}
// Insert new version
if err := g.insertVersionAtTop(tt.entry); err != nil {
t.Fatalf("insertVersionAtTop failed: %v", err)
}
// Read result
result, err := os.ReadFile(changelogPath)
if err != nil {
t.Fatalf("Failed to read result: %v", err)
}
if string(result) != tt.expectedContent {
t.Errorf("Expected:\n%q\nGot:\n%q", tt.expectedContent, string(result))
}
})
}
}
func TestProcessIncomingPRs_FileAggregation(t *testing.T) {
tempDir := t.TempDir()
incomingDir := filepath.Join(tempDir, "incoming")
// Create incoming directory and files
if err := os.MkdirAll(incomingDir, 0755); err != nil {
t.Fatalf("Failed to create incoming dir: %v", err)
}
// Create test incoming files
file1Content := "## PR #1\n- Feature A"
file2Content := "## PR #2\n- Feature B"
if err := os.WriteFile(filepath.Join(incomingDir, "1.txt"), []byte(file1Content), 0644); err != nil {
t.Fatalf("Failed to create test file: %v", err)
}
if err := os.WriteFile(filepath.Join(incomingDir, "2.txt"), []byte(file2Content), 0644); err != nil {
t.Fatalf("Failed to create test file: %v", err)
}
// Test file aggregation logic by calling the internal functions
files, err := filepath.Glob(filepath.Join(incomingDir, "*.txt"))
if err != nil {
t.Fatalf("Failed to glob files: %v", err)
}
if len(files) != 2 {
t.Fatalf("Expected 2 files, got %d", len(files))
}
// Test content aggregation
var content strings.Builder
var processingErrors []string
for _, file := range files {
data, err := os.ReadFile(file)
if err != nil {
processingErrors = append(processingErrors, err.Error())
continue
}
content.WriteString(string(data))
content.WriteString("\n")
}
if len(processingErrors) > 0 {
t.Fatalf("Unexpected processing errors: %v", processingErrors)
}
aggregatedContent := content.String()
if !strings.Contains(aggregatedContent, "Feature A") {
t.Errorf("Aggregated content should contain 'Feature A'")
}
if !strings.Contains(aggregatedContent, "Feature B") {
t.Errorf("Aggregated content should contain 'Feature B'")
}
}
func TestFileProcessing_ErrorHandling(t *testing.T) {
tempDir := t.TempDir()
incomingDir := filepath.Join(tempDir, "incoming")
// Create incoming directory with one good file and one unreadable file
if err := os.MkdirAll(incomingDir, 0755); err != nil {
t.Fatalf("Failed to create incoming dir: %v", err)
}
// Create a good file
if err := os.WriteFile(filepath.Join(incomingDir, "1.txt"), []byte("content"), 0644); err != nil {
t.Fatalf("Failed to create test file: %v", err)
}
// Create an unreadable file (simulate permission error)
unreadableFile := filepath.Join(incomingDir, "2.txt")
if err := os.WriteFile(unreadableFile, []byte("content"), 0000); err != nil {
t.Fatalf("Failed to create unreadable file: %v", err)
}
defer os.Chmod(unreadableFile, 0644) // Clean up
// Test error aggregation logic
files, err := filepath.Glob(filepath.Join(incomingDir, "*.txt"))
if err != nil {
t.Fatalf("Failed to glob files: %v", err)
}
var content strings.Builder
var processingErrors []string
for _, file := range files {
data, err := os.ReadFile(file)
if err != nil {
processingErrors = append(processingErrors, err.Error())
continue
}
content.WriteString(string(data))
content.WriteString("\n")
}
if len(processingErrors) == 0 {
t.Errorf("Expected processing errors due to unreadable file")
}
// Verify error message format
errorMsg := strings.Join(processingErrors, "; ")
if !strings.Contains(errorMsg, "2.txt") {
t.Errorf("Error message should mention the problematic file")
}
}
func TestEnsureIncomingDirCreation(t *testing.T) {
tempDir := t.TempDir()
incomingDir := filepath.Join(tempDir, "incoming")
cfg := &config.Config{
IncomingDir: incomingDir,
}
g := &Generator{cfg: cfg}
err := g.ensureIncomingDir()
if err != nil {
t.Fatalf("ensureIncomingDir failed: %v", err)
}
if _, err := os.Stat(incomingDir); os.IsNotExist(err) {
t.Errorf("Incoming directory was not created")
}
}

View File

@@ -1,15 +1,20 @@
package config
type Config struct {
RepoPath string
OutputFile string
Limit int
Version string
SaveData bool
CacheFile string
NoCache bool
RebuildCache bool
GitHubToken string
ForcePRSync bool
EnableAISummary bool
RepoPath string
OutputFile string
Limit int
Version string
SaveData bool
CacheFile string
NoCache bool
RebuildCache bool
GitHubToken string
ForcePRSync bool
EnableAISummary bool
IncomingPR int
ProcessPRsVersion string
IncomingDir string
Push bool
SyncDB bool
}

View File

@@ -2,6 +2,7 @@ package git
import (
"fmt"
"os"
"regexp"
"strconv"
"strings"
@@ -11,10 +12,19 @@ import (
"github.com/go-git/go-git/v5/plumbing"
"github.com/go-git/go-git/v5/plumbing/object"
"github.com/go-git/go-git/v5/plumbing/storer"
"github.com/go-git/go-git/v5/plumbing/transport/http"
)
var (
versionPattern = regexp.MustCompile(`Update version to (v\d+\.\d+\.\d+)`)
// The versionPattern matches version commit messages with or without the optional "chore(release): " prefix.
// Examples of matching commit messages:
// - "chore(release): Update version to v1.2.3"
// - "Update version to v1.2.3"
// Examples of non-matching commit messages:
// - "fix: Update version to v1.2.3" (missing "chore(release): " or "Update version to")
// - "chore(release): Update version to 1.2.3" (missing "v" prefix in version)
// - "Update version to v1.2" (incomplete version number)
versionPattern = regexp.MustCompile(`(?:chore\(release\): )?Update version to (v\d+\.\d+\.\d+)`)
prPattern = regexp.MustCompile(`Merge pull request #(\d+)`)
)
@@ -400,3 +410,165 @@ func dedupInts(ints []int) []int {
return result
}
// Worktree returns the git worktree for performing git operations
func (w *Walker) Worktree() (*git.Worktree, error) {
return w.repo.Worktree()
}
// Repository returns the underlying git repository
func (w *Walker) Repository() *git.Repository {
return w.repo
}
// IsWorkingDirectoryClean checks if the working directory has any uncommitted changes
func (w *Walker) IsWorkingDirectoryClean() (bool, error) {
worktree, err := w.repo.Worktree()
if err != nil {
return false, fmt.Errorf("failed to get worktree: %w", err)
}
status, err := worktree.Status()
if err != nil {
return false, fmt.Errorf("failed to get git status: %w", err)
}
return status.IsClean(), nil
}
// GetStatusDetails returns a detailed status of the working directory
func (w *Walker) GetStatusDetails() (string, error) {
worktree, err := w.repo.Worktree()
if err != nil {
return "", fmt.Errorf("failed to get worktree: %w", err)
}
status, err := worktree.Status()
if err != nil {
return "", fmt.Errorf("failed to get git status: %w", err)
}
if status.IsClean() {
return "", nil
}
var details strings.Builder
for file, fileStatus := range status {
details.WriteString(fmt.Sprintf(" %c%c %s\n", fileStatus.Staging, fileStatus.Worktree, file))
}
return details.String(), nil
}
// AddFile adds a file to the git index
func (w *Walker) AddFile(filename string) error {
worktree, err := w.repo.Worktree()
if err != nil {
return fmt.Errorf("failed to get worktree: %w", err)
}
_, err = worktree.Add(filename)
if err != nil {
return fmt.Errorf("failed to add file %s: %w", filename, err)
}
return nil
}
// CommitChanges creates a commit with the given message
func (w *Walker) CommitChanges(message string) (plumbing.Hash, error) {
worktree, err := w.repo.Worktree()
if err != nil {
return plumbing.ZeroHash, fmt.Errorf("failed to get worktree: %w", err)
}
// Get git config for author information
cfg, err := w.repo.Config()
if err != nil {
return plumbing.ZeroHash, fmt.Errorf("failed to get git config: %w", err)
}
var authorName, authorEmail string
if cfg.User.Name != "" {
authorName = cfg.User.Name
} else {
authorName = "Changelog Bot"
}
if cfg.User.Email != "" {
authorEmail = cfg.User.Email
} else {
authorEmail = "bot@changelog.local"
}
commit, err := worktree.Commit(message, &git.CommitOptions{
Author: &object.Signature{
Name: authorName,
Email: authorEmail,
When: time.Now(),
},
})
if err != nil {
return plumbing.ZeroHash, fmt.Errorf("failed to commit: %w", err)
}
return commit, nil
}
// PushToRemote pushes the current branch to the remote repository
// It automatically detects GitHub repositories and uses token authentication when available
func (w *Walker) PushToRemote() error {
pushOptions := &git.PushOptions{}
// Check if we have a GitHub token for authentication
if githubToken := os.Getenv("GITHUB_TOKEN"); githubToken != "" {
// Get remote URL to check if it's a GitHub repository
remotes, err := w.repo.Remotes()
if err == nil && len(remotes) > 0 {
// Get the origin remote (or first remote if origin doesn't exist)
var remote *git.Remote
for _, r := range remotes {
if r.Config().Name == "origin" {
remote = r
break
}
}
if remote == nil {
remote = remotes[0]
}
// Check if this is a GitHub repository
urls := remote.Config().URLs
if len(urls) > 0 {
url := urls[0]
if strings.Contains(url, "github.com") {
// Use token authentication for GitHub repositories
pushOptions.Auth = &http.BasicAuth{
Username: "token", // GitHub expects "token" as username
Password: githubToken,
}
}
}
}
}
err := w.repo.Push(pushOptions)
if err != nil {
return fmt.Errorf("failed to push: %w", err)
}
return nil
}
// RemoveFile removes a file from both the working directory and git index
func (w *Walker) RemoveFile(filename string) error {
worktree, err := w.repo.Worktree()
if err != nil {
return fmt.Errorf("failed to get worktree: %w", err)
}
_, err = worktree.Remove(filename)
if err != nil {
return fmt.Errorf("failed to remove file %s: %w", filename, err)
}
return nil
}

View File

@@ -100,35 +100,89 @@ func (c *Client) FetchPRs(prNumbers []int) ([]*PR, error) {
return prs, nil
}
func (c *Client) fetchSinglePR(ctx context.Context, prNumber int) (*PR, error) {
pr, _, err := c.client.PullRequests.Get(ctx, c.owner, c.repo, prNumber)
// GetPRValidationDetails fetches only the data needed for validation (lightweight).
func (c *Client) GetPRValidationDetails(prNumber int) (*PRDetails, error) {
ctx := context.Background()
ghPR, _, err := c.client.PullRequests.Get(ctx, c.owner, c.repo, prNumber)
if err != nil {
return nil, err
return nil, fmt.Errorf("failed to get PR %d: %w", prNumber, err)
}
commits, _, err := c.client.PullRequests.ListCommits(ctx, c.owner, c.repo, prNumber, nil)
if err != nil {
return nil, fmt.Errorf("failed to fetch commits: %w", err)
// Only return validation data, no commits fetched
details := &PRDetails{
PR: nil, // Will be populated later if needed
State: getString(ghPR.State),
Mergeable: ghPR.Mergeable != nil && *ghPR.Mergeable,
}
return details, nil
}
// GetPRWithCommits fetches the full PR and its commits.
func (c *Client) GetPRWithCommits(prNumber int) (*PR, error) {
ctx := context.Background()
ghPR, _, err := c.client.PullRequests.Get(ctx, c.owner, c.repo, prNumber)
if err != nil {
return nil, fmt.Errorf("failed to get PR %d: %w", prNumber, err)
}
return c.buildPRWithCommits(ctx, ghPR)
}
// GetPRDetails fetches a comprehensive set of details for a single PR.
// Deprecated: Use GetPRValidationDetails + GetPRWithCommits for better performance
func (c *Client) GetPRDetails(prNumber int) (*PRDetails, error) {
ctx := context.Background()
ghPR, _, err := c.client.PullRequests.Get(ctx, c.owner, c.repo, prNumber)
if err != nil {
return nil, fmt.Errorf("failed to get PR %d: %w", prNumber, err)
}
// Reuse the existing logic to build the base PR object
pr, err := c.buildPRWithCommits(ctx, ghPR)
if err != nil {
return nil, fmt.Errorf("failed to build PR details for %d: %w", prNumber, err)
}
details := &PRDetails{
PR: pr,
State: getString(ghPR.State),
Mergeable: ghPR.Mergeable != nil && *ghPR.Mergeable,
}
return details, nil
}
// buildPRWithCommits fetches commits and constructs a PR object from a GitHub API response
func (c *Client) buildPRWithCommits(ctx context.Context, ghPR *github.PullRequest) (*PR, error) {
commits, _, err := c.client.PullRequests.ListCommits(ctx, c.owner, c.repo, *ghPR.Number, nil)
if err != nil {
return nil, fmt.Errorf("failed to fetch commits for PR %d: %w", *ghPR.Number, err)
}
return c.convertGitHubPR(ghPR, commits), nil
}
// convertGitHubPR transforms GitHub API data into our internal PR struct (pure function)
func (c *Client) convertGitHubPR(ghPR *github.PullRequest, commits []*github.RepositoryCommit) *PR {
result := &PR{
Number: prNumber,
Title: getString(pr.Title),
Body: getString(pr.Body),
URL: getString(pr.HTMLURL),
Number: *ghPR.Number,
Title: getString(ghPR.Title),
Body: getString(ghPR.Body),
URL: getString(ghPR.HTMLURL),
Commits: make([]PRCommit, 0, len(commits)),
}
if pr.MergedAt != nil {
result.MergedAt = pr.MergedAt.Time
if ghPR.MergedAt != nil {
result.MergedAt = ghPR.MergedAt.Time
}
if pr.User != nil {
result.Author = getString(pr.User.Login)
result.AuthorURL = getString(pr.User.HTMLURL)
userType := getString(pr.User.Type) // GitHub API returns "User", "Organization", or "Bot"
if ghPR.User != nil {
result.Author = getString(ghPR.User.Login)
result.AuthorURL = getString(ghPR.User.HTMLURL)
userType := getString(ghPR.User.Type)
// Convert GitHub API type to lowercase
switch userType {
case "User":
result.AuthorType = "user"
@@ -137,12 +191,12 @@ func (c *Client) fetchSinglePR(ctx context.Context, prNumber int) (*PR, error) {
case "Bot":
result.AuthorType = "bot"
default:
result.AuthorType = "user" // Default fallback
result.AuthorType = "user"
}
}
if pr.MergeCommitSHA != nil {
result.MergeCommit = *pr.MergeCommitSHA
if ghPR.MergeCommitSHA != nil {
result.MergeCommit = *ghPR.MergeCommitSHA
}
for _, commit := range commits {
@@ -153,12 +207,34 @@ func (c *Client) fetchSinglePR(ctx context.Context, prNumber int) (*PR, error) {
}
if commit.Commit.Author != nil {
prCommit.Author = getString(commit.Commit.Author.Name)
prCommit.Email = getString(commit.Commit.Author.Email) // Extract author email from GitHub API response
// Capture actual commit timestamp from GitHub API
if commit.Commit.Author.Date != nil {
prCommit.Date = commit.Commit.Author.Date.Time
}
}
// Capture parent commit SHAs for merge detection
if commit.Parents != nil {
for _, parent := range commit.Parents {
if parent.SHA != nil {
prCommit.Parents = append(prCommit.Parents, *parent.SHA)
}
}
}
result.Commits = append(result.Commits, prCommit)
}
}
return result, nil
return result
}
func (c *Client) fetchSinglePR(ctx context.Context, prNumber int) (*PR, error) {
ghPR, _, err := c.client.PullRequests.Get(ctx, c.owner, c.repo, prNumber)
if err != nil {
return nil, err
}
return c.buildPRWithCommits(ctx, ghPR)
}
func getString(s *string) string {
@@ -332,6 +408,7 @@ func (c *Client) FetchAllMergedPRsGraphQL(since time.Time) ([]*PR, error) {
SHA: commitNode.Commit.OID,
Message: strings.TrimSpace(commitNode.Commit.Message),
Author: commitNode.Commit.Author.Name,
Date: commitNode.Commit.AuthoredDate, // Use actual commit timestamp
}
pr.Commits = append(pr.Commits, commit)
}

View File

@@ -0,0 +1,59 @@
package github
import (
"testing"
"time"
)
func TestPRCommitEmailHandling(t *testing.T) {
tests := []struct {
name string
commit PRCommit
expected string
}{
{
name: "Valid email field",
commit: PRCommit{
SHA: "abc123",
Message: "Fix bug in authentication",
Author: "John Doe",
Email: "john.doe@example.com",
Date: time.Now(),
Parents: []string{"def456"},
},
expected: "john.doe@example.com",
},
{
name: "Empty email field",
commit: PRCommit{
SHA: "abc123",
Message: "Fix bug in authentication",
Author: "John Doe",
Email: "",
Date: time.Now(),
Parents: []string{"def456"},
},
expected: "",
},
{
name: "Email field with proper initialization",
commit: PRCommit{
SHA: "def789",
Message: "Add new feature",
Author: "Jane Smith",
Email: "jane.smith@company.org",
Date: time.Now(),
Parents: []string{"ghi012"},
},
expected: "jane.smith@company.org",
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
if tt.commit.Email != tt.expected {
t.Errorf("Expected email %q, got %q", tt.expected, tt.commit.Email)
}
})
}
}

View File

@@ -15,10 +15,20 @@ type PR struct {
MergeCommit string
}
// PRDetails encapsulates all relevant information about a Pull Request.
type PRDetails struct {
*PR
State string
Mergeable bool
}
type PRCommit struct {
SHA string
Message string
Author string
Email string // Author email from GitHub API, empty if not public
Date time.Time // Timestamp field
Parents []string // Parent commits (for merge detection)
}
// GraphQL query structures for hasura client
@@ -43,9 +53,10 @@ type PullRequestsQuery struct {
Commits struct {
Nodes []struct {
Commit struct {
OID string `graphql:"oid"`
Message string
Author struct {
OID string `graphql:"oid"`
Message string
AuthoredDate time.Time `graphql:"authoredDate"`
Author struct {
Name string
}
}

View File

@@ -21,7 +21,8 @@ var rootCmd = &cobra.Command{
Long: `A high-performance changelog generator that walks git history,
collects version information and pull requests, and generates a
comprehensive changelog in markdown format.`,
RunE: run,
RunE: run,
SilenceUsage: true, // Don't show usage on runtime errors, only on flag errors
}
func init() {
@@ -36,9 +37,18 @@ func init() {
rootCmd.Flags().StringVar(&cfg.GitHubToken, "token", "", "GitHub API token (or set GITHUB_TOKEN env var)")
rootCmd.Flags().BoolVar(&cfg.ForcePRSync, "force-pr-sync", false, "Force a full PR sync from GitHub (ignores cache age)")
rootCmd.Flags().BoolVar(&cfg.EnableAISummary, "ai-summarize", false, "Generate AI-enhanced summaries using Fabric")
rootCmd.Flags().IntVar(&cfg.IncomingPR, "incoming-pr", 0, "Pre-process PR for changelog (provide PR number)")
rootCmd.Flags().StringVar(&cfg.ProcessPRsVersion, "process-prs", "", "Process all incoming PR files for release (provide version like v1.4.262)")
rootCmd.Flags().StringVar(&cfg.IncomingDir, "incoming-dir", "./cmd/generate_changelog/incoming", "Directory for incoming PR files")
rootCmd.Flags().BoolVar(&cfg.Push, "push", false, "Enable automatic git push after creating an incoming entry")
rootCmd.Flags().BoolVar(&cfg.SyncDB, "sync-db", false, "Synchronize and validate database integrity with git history and GitHub PRs")
}
func run(cmd *cobra.Command, args []string) error {
if cfg.IncomingPR > 0 && cfg.ProcessPRsVersion != "" {
return fmt.Errorf("--incoming-pr and --process-prs are mutually exclusive flags")
}
if cfg.GitHubToken == "" {
cfg.GitHubToken = os.Getenv("GITHUB_TOKEN")
}
@@ -48,6 +58,18 @@ func run(cmd *cobra.Command, args []string) error {
return fmt.Errorf("failed to create changelog generator: %w", err)
}
if cfg.IncomingPR > 0 {
return generator.ProcessIncomingPR(cfg.IncomingPR)
}
if cfg.ProcessPRsVersion != "" {
return generator.CreateNewChangelogEntry(cfg.ProcessPRsVersion)
}
if cfg.SyncDB {
return generator.SyncDatabase()
}
output, err := generator.Generate()
if err != nil {
return fmt.Errorf("failed to generate changelog: %w", err)
@@ -77,8 +99,5 @@ func main() {
}
}
if err := rootCmd.Execute(); err != nil {
fmt.Fprintf(os.Stderr, "Error: %v\n", err)
os.Exit(1)
}
rootCmd.Execute()
}

373
docs/Automated-ChangeLog.md Normal file
View File

@@ -0,0 +1,373 @@
# Automated CHANGELOG Entry System for CI/CD
## Overview
This document outlines a comprehensive system for automatically generating and maintaining CHANGELOG.md entries during the CI/CD process. The system builds upon the existing `generate_changelog` tool and integrates seamlessly with GitHub's pull request workflow.
## Current State Analysis
### Existing Infrastructure
The `generate_changelog` tool already provides:
- **High-performance Git history walking** with one-pass algorithm
- **GitHub API integration** with GraphQL optimization and smart caching
- **SQLite-based caching** for instant incremental updates
- **AI-powered summaries** using Fabric integration
- **Concurrent processing** for optimal performance
- **Version detection** from git tags and commit patterns
### Key Components
- **Main entry point**: `cmd/generate_changelog/main.go`
- **Core generation logic**: `internal/changelog/generator.go`
- **AI summarization**: `internal/changelog/summarize.go`
- **Caching system**: `internal/cache/cache.go`
- **GitHub integration**: `internal/github/client.go`
- **Git operations**: `internal/git/walker.go`
## Proposed Automated System
### Developer Workflow
```mermaid
graph TD
A[Developer creates feature branch] --> B[Codes feature]
B --> C[Creates Pull Request]
C --> D[PR is open and ready]
D --> E[Developer runs: generate_changelog --incoming-pr XXXX]
E --> F[Tool validates PR is open/mergeable]
F --> G[Tool creates incoming/XXXX.txt with AI summary]
G --> H[Auto-commit and push to branch]
H --> I[PR includes pre-processed changelog entry]
I --> J[PR gets reviewed and merged]
```
### CI/CD Integration
```mermaid
graph TD
A[PR merged to main] --> B[Version bump workflow triggered]
B --> C[generate_changelog --process-prs]
C --> D[Scan incoming/ directory]
D --> E[Concatenate all incoming/*.txt files]
E --> F[Insert new version at top of CHANGELOG.md]
F --> G[Store entry in versions table]
G --> H[git rm incoming/*.txt files]
H --> I[git add CHANGELOG.md and changelog.db, done by the tool]
I --> J[Increment version number]
J --> K[Commit and tag release]
```
## Implementation Details
### Phase 1: Pre-Processing PRs
#### New Command: `--incoming-pr`
**Usage**: `generate_changelog --incoming-pr 1672`
**Functionality**:
1. **Validation**:
- Verify PR exists and is open
- Check PR is mergeable
- Ensure branch is up-to-date
- Verify that the current git repo is clean (everything committed); do not continue otherwise (a minimal check is sketched below)
2. **Content Generation**:
- Extract PR metadata (title, author, description)
- Collect all commit messages from the PR
- Use existing `SummarizeVersionContent` function for AI enhancement
- Format as standard changelog entry
3. **File Creation**:
- Generate `./cmd/generate_changelog/incoming/{PR#}.txt`
- Include the PR header: `### PR [#1672](url) by [author](profile): Title` (as is currently done in the code)
- Consider extracting the existing PR header code into a helper function for reuse
- Include the AI-summarized changes (generated when we ran all the commit messages through `SummarizeVersionContent`)
4. **Auto-commit**:
- Commit file with message: `chore: incoming 1672 changelog entry`
- Optionally push to current branch (use `--push` flag)
(The PR is now completely ready to be merged, with its CHANGELOG entry already prepared and integrated.)
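For the repository-cleanliness check in step 1, a minimal sketch could use the `Walker` helpers (`IsWorkingDirectoryClean`, `GetStatusDetails`) shown in the `internal/git` diff above; the `ensureCleanWorkingTree` name is illustrative and not part of the current code:
```go
// Sketch only: refuse to continue when the working tree has uncommitted changes.
// Assumes the Walker type and helpers from internal/git shown above.
func ensureCleanWorkingTree(w *git.Walker) error {
	clean, err := w.IsWorkingDirectoryClean()
	if err != nil {
		return fmt.Errorf("failed to check working directory: %w", err)
	}
	if clean {
		return nil
	}
	details, err := w.GetStatusDetails()
	if err != nil {
		return fmt.Errorf("failed to get status details: %w", err)
	}
	return fmt.Errorf("working directory is not clean; commit or stash these changes first:\n%s", details)
}
```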
#### File Format Example
```markdown
### PR [#1672](https://github.com/danielmiessler/Fabric/pull/1672) by [ksylvan](https://github.com/ksylvan): Changelog Generator Enhancement
- Added automated CI/CD integration for changelog generation
- Implemented pre-processing of PR entries during development
- Enhanced caching system for better performance
- Added validation for mergeable PR states
```
### Phase 2: Release Processing
#### New Command: `--process-prs`
**Usage**: `generate_changelog --process-prs {new_version_string}` (for example, `--process-prs v1.4.262`)
**Integration Point**: `.github/workflows/update-version-and-create-tag.yml`
(we can do this AFTER the "Update gomod2nix.toml file" step in the workflow, where we
already have generated the next version in the "version.nix" file)
**Functionality**:
1. **Discovery**: Scan `./cmd/generate_changelog/incoming/` directory
2. **Aggregation**: Read and concatenate all `*.txt` files
3. **Version Creation**: Generate new version header with current date
4. **CHANGELOG Update**: Insert new version at top of existing CHANGELOG.md
5. **Database Update**: Store complete entry in `versions` table as `ai_summary`
6. **Cleanup**: Remove all processed incoming files
7. **Stage Changes**: Add modified files to git staging area
#### Example Output in CHANGELOG.md
```markdown
# Changelog
## v1.4.259 (2025-07-18)
### PR [#1672](https://github.com/danielmiessler/Fabric/pull/1672) by [ksylvan](https://github.com/ksylvan): Changelog Generator Enhancement
- Added automated CI/CD integration for changelog generation
- Implemented pre-processing of PR entries during development
- Enhanced caching system for better performance
### PR [#1671](https://github.com/danielmiessler/Fabric/pull/1671) by [contributor](https://github.com/contributor): Bug Fix
- Fixed memory leak in caching system
- Improved error handling for GitHub API failures
## v1.4.258 (2025-07-14)
[... rest of file ...]
```
## Technical Implementation
### Configuration Extensions
Add to `internal/config/config.go`:
```go
type Config struct {
// ... existing fields
IncomingPR int // PR number for --incoming-pr
ProcessPRsVersion string // Flag for --process-prs (new version string)
IncomingDir string // Directory for incoming files (default: ./cmd/generate_changelog/incoming/)
}
```
### New Command Line Flags
```go
rootCmd.Flags().IntVar(&cfg.IncomingPR, "incoming-pr", 0, "Pre-process PR for changelog (provide PR number)")
rootCmd.Flags().StringVar(&cfg.ProcessPRsVersion, "process-prs", "", "Process all incoming PR files for release (provide version like v1.4.262)")
rootCmd.Flags().StringVar(&cfg.IncomingDir, "incoming-dir", "./cmd/generate_changelog/incoming", "Directory for incoming PR files")
```
### Core Logic Extensions
#### PR Pre-processing
```go
func (g *Generator) ProcessIncomingPR(prNumber int) error {
// 1. Validate PR state via GitHub API
pr, err := g.ghClient.GetPR(prNumber)
if err != nil || pr.State != "open" || !pr.Mergeable {
return fmt.Errorf("PR %d is not in valid state for processing", prNumber)
}
// 2. Generate changelog content using existing logic
content := g.formatPR(pr)
// 3. Apply AI summarization if enabled
if g.cfg.EnableAISummary {
content, _ = SummarizeVersionContent(content)
}
// 4. Write to incoming file
filename := filepath.Join(g.cfg.IncomingDir, fmt.Sprintf("%d.txt", prNumber))
err = os.WriteFile(filename, []byte(content), 0644)
if err != nil {
return fmt.Errorf("failed to write incoming file: %w", err)
}
// 5. Auto-commit and push
return g.commitAndPushIncoming(prNumber, filename)
}
```
#### Release Processing
```go
func (g *Generator) ProcessIncomingPRs(version string) error {
// 1. Scan incoming directory
files, err := filepath.Glob(filepath.Join(g.cfg.IncomingDir, "*.txt"))
if err != nil || len(files) == 0 {
return fmt.Errorf("no incoming PR files found")
}
// 2. Read and concatenate all files
var content strings.Builder
for _, file := range files {
data, err := os.ReadFile(file)
if err == nil {
content.WriteString(string(data))
content.WriteString("\n")
}
}
// 3. Generate version entry
entry := fmt.Sprintf("\n## %s (%s)\n\n%s",
version, time.Now().Format("2006-01-02"), content.String())
// 4. Update CHANGELOG.md
err = g.insertVersionAtTop(entry)
if err != nil {
return fmt.Errorf("failed to update CHANGELOG.md: %w", err)
}
// 5. Update database
err = g.cache.SaveVersionEntry(version, content.String())
if err != nil {
return fmt.Errorf("failed to save to database: %w", err)
}
// 6. Cleanup incoming files
for _, file := range files {
os.Remove(file)
}
return nil
}
```
## Workflow Integration
### GitHub Actions Modification
Update `.github/workflows/update-version-and-create-tag.yml`.
```yaml
- name: Generate Changelog Entry
run: |
# Process all incoming PR entries
./cmd/generate_changelog/generate_changelog --process-prs
# The tool will make the needed changes in the CHANGELOG.md,
# and the changelog.db, and will remove the PR#.txt file(s)
# In effect, doing the following:
# 1. Generate the new CHANGELOG (and store the entry in the changelog.db)
# 2. git add CHANGELOG.md
# 3. git add ./cmd/generate_changelog/changelog.db
# 4. git rm -rf ./cmd/generate_changelog/incoming/
#
```
### Developer Instructions
1. **During Development**:
```bash
# After PR is ready for review (commit locally only)
generate_changelog --incoming-pr 1672 --ai-summarize
# Or to automatically push to remote
generate_changelog --incoming-pr 1672 --ai-summarize --push
```
2. **Validation**:
- Check that `incoming/1672.txt` was created
- Verify auto-commit occurred
- Confirm file is included in PR
- Scan the file and make any changes you need to the auto-generated summary
## Benefits
### For Developers
- **Automated changelog entries** - no manual CHANGELOG.md editing
- **AI-enhanced summaries** - professional, consistent formatting
- **Early visibility** - changelog content visible during PR review
- **Reduced merge conflicts** - no multiple PRs editing CHANGELOG.md
### For Project Maintainers
- **Consistent formatting** - all entries follow same structure
- **Complete coverage** - no missed changelog entries
- **Automated releases** - seamless integration with version bumps
- **Historical accuracy** - each PR's contribution properly documented
### For CI/CD
- **Deterministic process** - reliable, repeatable changelog generation
- **Performance optimized** - leverages existing caching and AI systems
- **Error resilience** - validates PR states before processing
- **Clean integration** - minimal changes to existing workflows
## Implementation Strategy
### Phase 1: Implement Developer Tooling
- [x] Add new command line flags and configuration
- [x] Implement `--incoming-pr` functionality
- [x] Add validation for PR states and git status
- [x] Create auto-commit logic
### Phase 2: Integration (CI/CD) Readiness
- [x] Implement `--process-prs` functionality
- [x] Add CHANGELOG.md insertion logic
- [x] Update database storage for version entries
### Phase 3: Deployment
- [x] Update GitHub Actions workflow
- [x] Create developer documentation in ./docs/ directory
- [x] Test full end-to-end workflow (the PR that includes these modifications can be its first production test)
### Phase 4: Adoption
- [ ] Train development team - Consider creating a full tutorial blog post/page to fully walk developers through the process.
- [ ] Monitor first few releases
- [ ] Gather feedback and iterate
- [ ] Document lessons learned
## Error Handling
### PR Validation Failures
- **Closed/Merged PR**: Error suggesting the user check the PR status
- **Non-mergeable PR**: Error instructing the user to resolve merge conflicts
- **Missing PR**: Error asking the user to verify the PR number (a validation sketch follows this list)
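A minimal sketch of this validation, assuming the `GetPRValidationDetails` helper added to the GitHub client above (the `validateIncomingPR` name and the exact error wording are illustrative):
```go
// Sketch only: map PR validation failures to the errors described above.
func validateIncomingPR(gh *github.Client, prNumber int) error {
	details, err := gh.GetPRValidationDetails(prNumber)
	if err != nil {
		return fmt.Errorf("failed to fetch PR %d; verify the PR number and GitHub token: %w", prNumber, err)
	}
	if details.State != "open" {
		return fmt.Errorf("PR %d is %s; only open PRs can be processed", prNumber, details.State)
	}
	if !details.Mergeable {
		return fmt.Errorf("PR %d is not mergeable; resolve conflicts first", prNumber)
	}
	return nil
}
```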
### File System Issues
- **Permission errors**: Clear error with directory permission requirements
- **Disk space**: Graceful handling with cleanup suggestions
- **Network failures**: Retry logic with exponential backoff (see the sketch below)
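For the network-failure case, a small retry helper with exponential backoff could look like the following sketch (the `retryWithBackoff` name and the limits are illustrative):
```go
// Sketch only: retry a GitHub API call, doubling the delay between attempts.
func retryWithBackoff(attempts int, call func() error) error {
	delay := time.Second
	var err error
	for i := 0; i < attempts; i++ {
		if err = call(); err == nil {
			return nil
		}
		time.Sleep(delay)
		delay *= 2
	}
	return fmt.Errorf("giving up after %d attempts: %w", attempts, err)
}
```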
### Git Operations
- **Commit failures**: Check for dirty working directory
- **Push failures**: Handle authentication and remote issues
- **Merge conflicts**: Clear instructions for manual resolution
## Future Enhancements
### Advanced Features
- **Custom categorization** - group changes by type (feat/fix/docs)
- **Breaking change detection** - special handling for BREAKING CHANGE commits
- **Release notes generation** - enhanced formatting for GitHub releases (our release pages are pretty bare)
## Conclusion
This automated changelog system builds upon the robust foundation of the existing `generate_changelog` tool while providing a seamless developer experience and reliable CI/CD integration. By pre-processing PR entries during development and aggregating them during releases, we achieve both accuracy and automation without sacrificing quality or developer productivity.
The phased approach ensures smooth adoption while the extensive error handling and validation provide confidence in production deployment. The system's design leverages existing infrastructure and patterns, making it a natural evolution of the current changelog generation capabilities.

View File

@@ -0,0 +1,195 @@
# Automated Changelog System - Developer Guide
This guide explains how to use the new automated changelog system for the Fabric project.
## Overview
The automated changelog system allows developers to pre-process their PR changelog entries during development, which are then automatically aggregated during the release process. This eliminates manual CHANGELOG.md editing and reduces merge conflicts.
## Developer Workflow
### Step 1: Create Your Feature Branch and PR
Work on your feature as usual and create a pull request.
### Step 2: Generate Changelog Entry
Once your PR is ready for review, generate a changelog entry:
```bash
cd cmd/generate_changelog
go build -o generate_changelog .
./generate_changelog --incoming-pr YOUR_PR_NUMBER
```
For example, if your PR number is 1672:
```bash
./generate_changelog --incoming-pr 1672
```
### Step 3: Validation
The tool will validate:
- ✅ PR exists and is open
- ✅ PR is mergeable (no conflicts)
- ✅ Your working directory is clean
If any validation fails, fix the issues and try again.
### Step 4: Review Generated Entry
The tool will:
1. Create `./cmd/generate_changelog/incoming/1672.txt`
2. Generate an AI-enhanced summary (if `--ai-summarize` is enabled)
3. Auto-commit the file to your branch (use `--push` to also push to remote)
Review the generated file and edit if needed:
```bash
cat ./cmd/generate_changelog/incoming/1672.txt
```
### Step 5: Include in PR
The incoming changelog entry is now part of your PR and will be reviewed along with your code changes.
## Example Generated Entry
```markdown
### PR [#1672](https://github.com/danielmiessler/fabric/pull/1672) by [ksylvan](https://github.com/ksylvan): Changelog Generator Enhancement
- Added automated CI/CD integration for changelog generation
- Implemented pre-processing of PR entries during development
- Enhanced caching system for better performance
- Added validation for mergeable PR states
```
## Command Options
### `--incoming-pr`
Pre-process a specific PR for changelog generation.
**Usage**: `./generate_changelog --incoming-pr PR_NUMBER`
**Requirements**:
- PR must be open
- PR must be mergeable (no conflicts)
- Working directory must be clean (no uncommitted changes)
- GitHub token must be available (`GITHUB_TOKEN` env var or `--token` flag)
**Mutual Exclusivity**: Cannot be used with `--process-prs` flag
### `--incoming-dir`
Specify custom directory for incoming PR files (default: `./cmd/generate_changelog/incoming`).
**Usage**: `./generate_changelog --incoming-pr 1672 --incoming-dir ./custom/path`
### `--process-prs`
Process all incoming PR files for release aggregation. Used by CI/CD during release creation.
**Usage**: `./generate_changelog --process-prs {new_version_string}`
**Mutual Exclusivity**: Cannot be used with `--incoming-pr` flag
### `--ai-summarize`
Enable AI-enhanced summaries using Fabric integration.
**Usage**: `./generate_changelog --incoming-pr 1672 --ai-summarize`
### `--push`
Enable automatic git push after creating an incoming entry. By default, the commit is created locally but not pushed to the remote repository.
**Usage**: `./generate_changelog --incoming-pr 1672 --push`
**Note**: When using `--push`, ensure you have proper authentication configured (SSH keys or GITHUB_TOKEN environment variable).
## Troubleshooting
### "PR is not open"
Your PR has been closed or merged. Only open PRs can be processed.
### "PR is not mergeable"
Your PR has merge conflicts or other issues preventing it from being merged. Resolve conflicts and ensure the PR is in a mergeable state.
### "Working directory is not clean"
You have uncommitted changes. Commit or stash them before running the tool.
### "Failed to fetch PR"
Check your GitHub token and network connection. Ensure the PR number exists.
## CI/CD Integration
The system automatically processes all incoming PR files during the release workflow. No manual intervention is required.
When a release is created:
1. All `incoming/*.txt` files are aggregated using `--process-prs` (see the sketch after this list)
2. The new version is read from `version.nix` (or the latest git tag) and passed to the tool
3. A new version entry is created in CHANGELOG.md
4. Incoming files are cleaned up (removed)
5. Changes are staged for the release commit (CHANGELOG.md and cache file)
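For reference, the release-time invocation is roughly the following (a sketch; the exact workflow step and how `NEW_VERSION` is derived from `version.nix` may differ):
```bash
cd cmd/generate_changelog
go build -o generate_changelog .
# NEW_VERSION is assumed to hold the version just written to version.nix, e.g. v1.4.264
./generate_changelog --process-prs "$NEW_VERSION"
```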
## Best Practices
1. **Run early**: Generate your changelog entry as soon as your PR is ready for review
2. **Review content**: Always review the generated entry and edit if necessary
3. **Keep it updated**: If you make significant changes to your PR, regenerate the entry
4. **Use AI summaries**: Enable `--ai-summarize` for more professional, consistent formatting
## Advanced Usage
### Custom GitHub Token
```bash
./generate_changelog --incoming-pr 1672 --token YOUR_GITHUB_TOKEN
```
### Custom Repository Path
```bash
./generate_changelog --incoming-pr 1672 --repo /path/to/repo
```
### Disable Caching
```bash
./generate_changelog --incoming-pr 1672 --no-cache
```
### Enable Auto-Push
```bash
./generate_changelog --incoming-pr 1672 --push
```
This creates the commit locally and pushes it to the remote repository. By default, commits are only created locally, allowing you to review changes before pushing manually.
**Authentication**: The tool automatically detects GitHub repositories and uses the GITHUB_TOKEN environment variable for authentication when pushing. For SSH repositories, ensure your SSH keys are properly configured.
## Integration with Existing Workflow
This system is fully backward compatible. The existing changelog generation continues to work unchanged. The new features are opt-in and only activated when using the new flags.
## Support
If you encounter issues:
1. Check this documentation
2. Verify your GitHub token has appropriate permissions
3. Ensure your PR meets the validation requirements
4. Check the tool's help: `./generate_changelog --help`
For bugs or feature requests, please create an issue in the repository.

View File

@@ -1 +1 @@
"1.4.261"
"1.4.264"

View File

@@ -44,7 +44,7 @@ export default defineConfig({
'/api': {
target: FABRIC_BASE_URL,
changeOrigin: true,
timeout: 30000,
timeout: 900000,
rewrite: (path) => path.replace(/^\/api/, ''),
configure: (proxy, _options) => {
proxy.on('error', (err, req, res) => {
@@ -59,7 +59,7 @@ export default defineConfig({
'^/(patterns|models|sessions)/names': {
target: FABRIC_BASE_URL,
changeOrigin: true,
timeout: 30000,
timeout: 900000,
configure: (proxy, _options) => {
proxy.on('error', (err, req, res) => {
console.log('proxy error', err);