Compare commits

...

22 Commits

Author SHA1 Message Date
di-sukharev
9855ed1f69 3.2.19 2026-04-10 15:27:56 +03:00
GPT8
7e41139d9c Merge pull request #552 from di-sukharev/improve_tests
Improve CLI test coverage and harden proxy configuration behavior
2026-04-10 15:26:34 +03:00
di-sukharev
66a8c2b52a fix(build): remove mistral cjs warning
Replace the createRequire fallback with a direct Mistral import so the CJS build stops emitting the import.meta warning.

Refresh the bundled out/cli.cjs artifact after the change.
2026-04-10 15:24:09 +03:00
di-sukharev
57fb52a3c5 style(prettier): format source files 2026-04-10 15:20:27 +03:00
di-sukharev
88964cbc5e test(cli): overhaul e2e coverage and CI
Split smoke, core, and prompt-module suites and make test:e2e self-contained with an explicit build step and version-check bypass.

Move core CLI coverage to a process-based harness with mock OpenAI boundary checks, add user-path scenarios, refresh CI jobs, and commit the rebuilt out/cli.cjs artifact.
2026-04-10 15:16:32 +03:00
di-sukharev
cf27085ac9 fix(cli): tighten proxy and setup behavior
Support explicit proxy disabling and ambient proxy fallback without leaking env state into config.

Improve first-run detection, endpoint-specific error messaging, diff exclusions, and runtime helper boundaries covered by unit tests.
2026-04-10 15:16:11 +03:00
GPT8
7fa2384761 Merge pull request #548 from SOV710/master
chore: add biome.json as an opt-in alternative to Prettier and ESLint
2026-04-08 16:41:04 +03:00
GPT8
fa1482d8b1 Merge pull request #549 from SOV710/fix-cli-flags
fix: make -c/--context, -y/--yes, and --fgm flags actually work
2026-04-08 16:40:35 +03:00
GPT8
f656c39f63 Merge pull request #550 from keith666666/master
feat(ollama): add OCO_OLLAMA_THINK config to control thinking mode
2026-04-08 16:36:09 +03:00
GPT8
420a15343c Merge pull request #538 from barnabasbusa/master
feat: display PR URL and auto-set upstream when pushing new branches
2026-04-08 16:35:37 +03:00
Barnabas Busa
fd9820dd64 Merge remote-tracking branch 'upstream/master' 2026-04-08 14:43:22 +02:00
Keith
2d9a26dc37 feat(ollama): add OCO_OLLAMA_THINK config to control thinking mode
Adds support for passing the `think` param to Ollama's /api/chat endpoint,
allowing users to disable reasoning blocks on models like qwen3.5 via
`oco config set OCO_OLLAMA_THINK=false`.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-07 18:27:38 +08:00
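The conditional payload construction this commit describes can be sketched as follows. This is an illustrative standalone sketch, not the actual OpenCommit implementation; `buildOllamaParams` and its field names are assumptions, though the "only attach `think` when it is an explicit boolean" rule mirrors the diff below.

```typescript
// Sketch: only attach `think` when OCO_OLLAMA_THINK was explicitly set to a
// boolean, so servers that predate the field never receive an unknown key.
type OllamaParams = {
  model: string;
  messages: unknown[];
  stream: boolean;
  think?: boolean;
};

function buildOllamaParams(
  model: string,
  messages: unknown[],
  ollamaThink?: boolean
): OllamaParams {
  const params: OllamaParams = { model, messages, stream: false };
  if (typeof ollamaThink === 'boolean') {
    params.think = ollamaThink;
  }
  return params;
}
```

With `oco config set OCO_OLLAMA_THINK=false`, the request body sent to `/api/chat` would carry `think: false`; with the key unset, the payload is unchanged from before this commit.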
SOV710
8cbaa36e82 build 2026-04-05 03:58:40 +00:00
SOV710
42029fff4e refactor(prompts): null-safe, trim-aware user context handling
The previous userInputCodeContext only skipped the context block
when context was exactly '' or ' '. Anything else (e.g. a string of
whitespace, null, undefined) would inject an empty or
whitespace-only <context>…</context> tag into the system prompt.

Trim the input and guard against null/undefined:
  - accept string | undefined | null
  - normalize via `(context ?? '').trim()`
  - skip the injection whenever the trimmed value is empty

Also inline the INIT_MAIN_PROMPT IIFE into a normal function body
and introduce a `content` local, removing a layer of nesting that
obscured the prompt assembly. Behavior is unchanged.
2026-04-05 03:58:40 +00:00
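The normalization described above reduces to a few lines; this sketch follows the shape of the change in the prompts diff further down (the real function also appends guidance text after the `<context>` tag).

```typescript
// Sketch: treat null/undefined as empty and trim before deciding whether to
// inject the <context> block, so whitespace-only input never reaches the prompt.
function userInputCodeContext(context: string | undefined | null): string {
  const trimmed = (context ?? '').trim();
  if (trimmed === '') {
    return '';
  }
  return `Additional context provided by the user: <context>${trimmed}</context>`;
}
```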
SOV710
4d767da9e5 fix(prompts): make --fgm override OCO_EMOJI config
getCommitConvention gated the entire GitMoji branch on
config.OCO_EMOJI, so --fgm was silently ignored unless the user had
previously run `oco config set OCO_EMOJI true`. Since OCO_EMOJI
defaults to false, --fgm was a no-op for most users.

This violates the standard CLI convention that command-line flags
should override configuration. Restructure getCommitConvention so
that --fgm forces FULL_GITMOJI_SPEC regardless of OCO_EMOJI:

  --fgm=true                    → FULL_GITMOJI_SPEC
  --fgm=false + OCO_EMOJI=true  → GITMOJI_HELP (unchanged)
  --fgm=false + OCO_EMOJI=false → CONVENTIONAL_COMMIT_KEYWORDS (unchanged)

No other files need changes — the fgm flag was already threaded
correctly through cli.ts → commit.ts → generateCommitMessageByDiff
→ getMainCommitPrompt → getCommitConvention.
2026-04-05 03:58:40 +00:00
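The precedence table above can be expressed directly. In the repository, `OCO_EMOJI` is read from config rather than passed as a parameter; this sketch takes it as an argument so the three cases are testable in isolation.

```typescript
const FULL_GITMOJI_SPEC = 'FULL_GITMOJI_SPEC';
const GITMOJI_HELP = 'GITMOJI_HELP';
const CONVENTIONAL_COMMIT_KEYWORDS = 'CONVENTIONAL_COMMIT_KEYWORDS';

// --fgm wins unconditionally; OCO_EMOJI only selects the lighter GitMoji help.
function getCommitConvention(fullGitMojiSpec: boolean, ocoEmoji: boolean): string {
  return fullGitMojiSpec
    ? FULL_GITMOJI_SPEC
    : ocoEmoji
    ? GITMOJI_HELP
    : CONVENTIONAL_COMMIT_KEYWORDS;
}
```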
SOV710
361327a8fe fix(generate): forward context through chunked large-diff prompt path
When a staged diff exceeds MAX_REQUEST_TOKENS, generateCommitMessageByDiff
routes through getCommitMsgsPromisesFromFileDiffs →
getMessagesPromisesByChangesInFile → generateCommitMessageChatCompletionPrompt
to produce one sub-prompt per chunk. That entire chain was threading
`fullGitMojiSpec` but never `context`, so `-c/--context` was silently
dropped for any diff large enough to trigger chunking, even though
the simple (non-chunked) path forwarded it correctly.

Add a `context` parameter to each of the three helpers and thread it
through to generateCommitMessageChatCompletionPrompt so the user's
context is present in every sub-prompt.
2026-04-05 03:58:40 +00:00
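The threading fix amounts to passing `context` through every chunk-level helper so each sub-prompt sees it. A minimal sketch of the pattern (the helper names here are illustrative, not the actual functions in the chain):

```typescript
// Sketch: each per-chunk prompt builder receives `context` and forwards it,
// so no sub-prompt silently loses the user's -c/--context input.
function buildPrompt(diffChunk: string, context: string): string {
  return context ? `<context>${context}</context>\n${diffChunk}` : diffChunk;
}

function promptsForChunks(chunks: string[], context: string): string[] {
  return chunks.map((chunk) => buildPrompt(chunk, context));
}
```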
SOV710
3a2fa11fcd fix(commit): preserve context and skip-confirm flag across regenerate
When the user answers "No" at the confirmation prompt and chooses to
regenerate, the recursive call to generateCommitMessageFromGitDiff
forwarded only `diff`, `extraArgs`, and `fullGitMojiSpec`. Both
`context` and `skipCommitConfirmation` were silently dropped, so:

- `-c/--context` was honored only on the first attempt and lost on
  every regeneration;
- `-y/--yes` was honored only on the first attempt, forcing a manual
  confirmation after regeneration.

Forward both fields through the recursive call so the user's flags
are respected for the full lifetime of the commit() invocation.
2026-04-05 03:58:40 +00:00
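The bug class here is a recursive call that cherry-picks fields from an options object. A hedged sketch of the fixed shape (the interface and function names are illustrative; the real code re-invokes generateCommitMessageFromGitDiff):

```typescript
interface GenerateOptions {
  diff: string;
  extraArgs: string[];
  context: string;
  fullGitMojiSpec: boolean;
  skipCommitConfirmation: boolean;
}

// Sketch: on regeneration, recurse with the SAME options object so context
// and skipCommitConfirmation survive every attempt, not just the first.
function generateWithConfirm(
  options: GenerateOptions,
  confirm: (attempt: number) => boolean,
  attempt = 1
): { options: GenerateOptions; attempts: number } {
  if (confirm(attempt)) return { options, attempts: attempt };
  return generateWithConfirm(options, confirm, attempt + 1);
}
```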
SOV710
4056bfa547 fix(cli): strip -y/--fgm from extraArgs to prevent git commit conflict
Same class of bug as the -c/--context fix: these flags could leak
into extraArgs and be forwarded to the internal `git commit` call,
causing unexpected behavior.

Extend the extraArgs sanitization to also strip -y, --yes, --fgm,
and their values.
2026-04-05 03:58:40 +00:00
SOV710
a48d33096a fix(cli): strip -c/--context from extraArgs to prevent git commit conflict
cleye's ignoreArgv passes unconsumed flags and arguments through to
the internal `git commit` execa call. Although -c/--context is
defined as a known cleye flag, a defensive guard is needed to strip
it from extraArgs in case it leaks through, which would conflict
with git's own handling.

Add a sanitization step at the entry of commit() that filters -c,
--context, and their values from extraArgs before they are forwarded
to the git commit invocation.
2026-04-05 03:58:40 +00:00
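The sanitization step for the `-c/--context` case can be sketched as below; the version that actually landed (visible in the cli.ts diff further down) generalizes this to also strip `-y`, `--yes`, and `--fgm`.

```typescript
// Sketch: drop -c/--context and their values, in both the separate-token
// form ("-c val") and the equals form ("-c=val"), before the remaining
// args are forwarded to the internal `git commit` invocation.
function stripContextFlags(argv: string[]): string[] {
  const out: string[] = [];
  for (let i = 0; i < argv.length; i++) {
    const arg = argv[i];
    if (arg === '-c' || arg === '--context') {
      i++; // also skip the value token
      continue;
    }
    if (arg.startsWith('-c=') || arg.startsWith('--context=')) {
      continue;
    }
    out.push(arg);
  }
  return out;
}
```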
SOV710
d5dcd42d2c chore: add biome.json as an alternative to Prettier and ESLint
For developers who prefer Biome, this config mirrors the existing
.prettierrc rules (single quotes, no trailing commas) and ESLint rules
(recommended rules, no-console as error, import sorting via assist).

This is not a replacement — Prettier and ESLint remain the primary
tooling. biome.json is an opt-in for those who already use Biome.
2026-04-04 17:19:56 +00:00
di-sukharev
f300b5dd4e build 2026-04-03 19:07:22 +03:00
Barnabas Busa
ccc227ed85 feat: display PR URL and auto-set upstream when pushing new branches
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-16 20:12:09 +01:00
41 changed files with 3280 additions and 1354 deletions

View File

@@ -8,49 +8,40 @@ on:
- main
jobs:
unit-test:
linux-tests:
runs-on: ubuntu-latest
strategy:
matrix:
node-version: [20.x]
steps:
- uses: actions/checkout@v4
- name: Use Node.js ${{ matrix.node-version }}
uses: actions/setup-node@v3
- name: Use Node.js
uses: actions/setup-node@v4
with:
node-version: ${{ matrix.node-version }}
node-version: '20.x'
cache: 'npm'
- name: Install dependencies
run: npm install
- name: Run Unit Tests
run: npm run test:unit
e2e-test:
runs-on: ubuntu-latest
strategy:
matrix:
node-version: [20.x]
steps:
- uses: actions/checkout@v4
- name: Use Node.js ${{ matrix.node-version }}
uses: actions/setup-node@v3
with:
node-version: ${{ matrix.node-version }}
cache: 'npm'
- name: Install git
run: |
sudo apt-get update
sudo apt-get install -y git
git --version
- name: Setup git
run: |
git config --global user.email "test@example.com"
git config --global user.name "Test User"
- name: Install dependencies
run: npm install
- name: Build
run: npm run build
- name: Run E2E Tests
run: npm run test:e2e
run: npm ci
- name: Run Unit Tests
run: npm run test:unit
- name: Run Core E2E Tests
run: npm run test:e2e:core
- name: Run Prompt Module E2E Tests
run: npm run test:e2e:prompt-module
macos-smoke:
runs-on: macos-latest
steps:
- uses: actions/checkout@v4
- name: Use Node.js
uses: actions/setup-node@v4
with:
node-version: '20.x'
cache: 'npm'
- name: Install dependencies
run: npm ci
- name: Run Smoke E2E Tests
run: npm run test:e2e:smoke
prettier:
runs-on: ubuntu-latest
steps:

View File

@@ -245,7 +245,13 @@ If you are behind a proxy, you can set it in the config:
oco config set OCO_PROXY=http://127.0.0.1:7890
```
Or it will automatically use `HTTPS_PROXY` or `HTTP_PROXY` environment variables.
If `OCO_PROXY` is unset, OpenCommit will automatically use `HTTPS_PROXY` or `HTTP_PROXY` environment variables.
To explicitly disable proxy use for OpenCommit, even when those environment variables are set:
```sh
oco config set OCO_PROXY=null
```
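The fallback and opt-out order described here can be sketched as follows. The name `resolveProxy` matches the import added in the cli.ts diff below, but this signature is an assumption, not the actual implementation in `src/utils/proxy`.

```typescript
// Sketch of the resolution order: an explicit OCO_PROXY wins, a literal null
// disables the proxy entirely, and only an *unset* value falls back to the
// ambient HTTPS_PROXY / HTTP_PROXY environment variables.
function resolveProxy(
  configProxy: string | null | undefined,
  env: Record<string, string | undefined> = process.env
): string | undefined {
  if (configProxy === null) return undefined; // explicit opt-out
  if (configProxy) return configProxy;
  return env.HTTPS_PROXY || env.HTTP_PROXY || undefined;
}
```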
### Locale configuration

biome.json Normal file
View File

@@ -0,0 +1,52 @@
{
"$schema": "https://biomejs.dev/schemas/2.4.10/schema.json",
"vcs": {
"enabled": true,
"clientKind": "git",
"useIgnoreFile": true
},
"files": {
"ignoreUnknown": true,
"includes": ["**", "!!build", "!!dist", "!!out"]
},
"formatter": {
"enabled": true,
"indentStyle": "space",
"indentWidth": 2,
"lineEnding": "lf"
},
"javascript": {
"formatter": {
"quoteStyle": "single",
"jsxQuoteStyle": "double",
"trailingCommas": "none",
"semicolons": "always"
}
},
"linter": {
"enabled": true,
"rules": {
"recommended": true,
"suspicious": {
"noConsole": "error"
},
"style": {
"noNonNullAssertion": "off"
}
}
},
"assist": {
"enabled": true,
"actions": {
"source": {
"organizeImports": "on"
}
}
}
}

File diff suppressed because it is too large.

package-lock.json generated
View File

@@ -1,12 +1,12 @@
{
"name": "opencommit",
"version": "3.2.18",
"version": "3.2.19",
"lockfileVersion": 3,
"requires": true,
"packages": {
"": {
"name": "opencommit",
"version": "3.2.18",
"version": "3.2.19",
"license": "MIT",
"dependencies": {
"@actions/core": "^1.10.0",

View File

@@ -1,6 +1,6 @@
{
"name": "opencommit",
"version": "3.2.18",
"version": "3.2.19",
"description": "Auto-generate impressive commits in 1 second. Killing lame commits with AI 🤯🔫",
"keywords": [
"git",
@@ -57,8 +57,15 @@
"test:docker-build": "docker build -t oco-test -f test/Dockerfile .",
"test:unit": "NODE_OPTIONS=--experimental-vm-modules jest test/unit",
"test:unit:docker": "npm run test:docker-build && DOCKER_CONTENT_TRUST=0 docker run --rm oco-test npm run test:unit",
"test:e2e": "npm run test:e2e:setup && jest test/e2e",
"test:e2e:setup": "sh test/e2e/setup.sh",
"test:e2e": "npm run build && npm run test:e2e:smoke:run && npm run test:e2e:core:run && npm run test:e2e:prompt-module:run",
"test:e2e:smoke": "npm run build && npm run test:e2e:smoke:run",
"test:e2e:smoke:run": "OCO_TEST_SKIP_VERSION_CHECK=true jest test/e2e/smoke.test.ts",
"test:e2e:core": "npm run build && npm run test:e2e:core:run",
"test:e2e:core:run": "OCO_TEST_SKIP_VERSION_CHECK=true jest test/e2e/cliBehavior.test.ts test/e2e/gitPush.test.ts test/e2e/oneFile.test.ts test/e2e/noChanges.test.ts",
"test:e2e:setup": "npm run test:e2e:prompt-module:setup",
"test:e2e:prompt-module:setup": "sh test/e2e/setup.sh",
"test:e2e:prompt-module": "npm run build && npm run test:e2e:prompt-module:run",
"test:e2e:prompt-module:run": "npm run test:e2e:prompt-module:setup && OCO_TEST_SKIP_VERSION_CHECK=true jest test/e2e/prompt-module",
"test:e2e:docker": "npm run test:docker-build && DOCKER_CONTENT_TRUST=0 docker run --rm oco-test npm run test:e2e",
"mlx:start": "OCO_AI_PROVIDER='mlx' node ./out/cli.cjs"
},

View File

@@ -8,7 +8,7 @@ import { commitlintConfigCommand } from './commands/commitlint';
import { configCommand, getConfig } from './commands/config';
import { hookCommand, isHookCalled } from './commands/githook.js';
import { prepareCommitMessageHook } from './commands/prepare-commit-msg-hook';
import { setupProxy } from './utils/proxy';
import { resolveProxy, setupProxy } from './utils/proxy';
import {
setupCommand,
isFirstRun,
@@ -20,9 +20,36 @@ import { checkIsLatestVersion } from './utils/checkIsLatestVersion';
import { runMigrations } from './migrations/_run.js';
const config = getConfig();
setupProxy(config.OCO_PROXY);
setupProxy(resolveProxy(config.OCO_PROXY));
const extraArgs = process.argv.slice(2);
const OCO_FLAGS_WITH_VALUE = new Set(['-c', '--context']);
const OCO_BOOLEAN_FLAGS = new Set(['-y', '--yes', '--fgm']);
const OCO_EQUALS_PREFIXES = ['-c=', '--context=', '-y=', '--yes=', '--fgm='];
const stripOcoFlags = (argv: string[]): string[] => {
const out: string[] = [];
for (let i = 0; i < argv.length; i++) {
const a = argv[i];
// String flags with a separate value token: -c <val>, --context <val>
if (OCO_FLAGS_WITH_VALUE.has(a)) {
i++; // skip the value token too
continue;
}
// Boolean flags: -y, --yes, --fgm
if (OCO_BOOLEAN_FLAGS.has(a)) {
continue;
}
// Equals form: -c=…, --context=…, -y=…, --yes=…, --fgm=…
if (OCO_EQUALS_PREFIXES.some((prefix) => a.startsWith(prefix))) {
continue;
}
out.push(a);
}
return out;
};
const rawArgv = process.argv.slice(2);
const extraArgs = stripOcoFlags(rawArgv);
cli(
{
@@ -82,5 +109,5 @@ cli(
commit(extraArgs, flags.context, false, flags.fgm, flags.yes);
},
extraArgs
rawArgv
);

View File

@@ -29,6 +29,32 @@ const getGitRemotes = async () => {
return stdout.split('\n').filter((remote) => Boolean(remote.trim()));
};
const hasUpstreamBranch = async (): Promise<boolean> => {
try {
await execa('git', [
'rev-parse',
'--abbrev-ref',
'--symbolic-full-name',
'@{u}'
]);
return true;
} catch {
return false;
}
};
const getCurrentBranch = async (): Promise<string> => {
const { stdout } = await execa('git', ['branch', '--show-current']);
return stdout.trim();
};
const displayPushUrl = (stderr: string) => {
const urlMatch = stderr.match(/https?:\/\/\S+/);
if (urlMatch) {
outro(`${chalk.cyan('Create a pull request:')} ${urlMatch[0]}`);
}
};
// Check for the presence of message templates
const checkMessageTemplate = (extraArgs: string[]): string | false => {
for (const key in extraArgs) {
@@ -130,8 +156,13 @@ ${chalk.grey('——————————————————')}`
if (config.OCO_GITPUSH === false) return;
if (!remotes.length) {
const { stdout } = await execa('git', ['push']);
const pushArgs = ['push'];
if (!(await hasUpstreamBranch())) {
pushArgs.push('--set-upstream', 'origin', await getCurrentBranch());
}
const { stdout, stderr } = await execa('git', pushArgs);
if (stdout) outro(stdout);
displayPushUrl(stderr);
process.exit(0);
}
@@ -147,11 +178,11 @@ ${chalk.grey('——————————————————')}`
pushSpinner.start(`Running 'git push ${remotes[0]}'`);
const { stdout } = await execa('git', [
'push',
'--verbose',
remotes[0]
]);
const pushArgs = ['push', '--verbose', remotes[0]];
if (!(await hasUpstreamBranch())) {
pushArgs.push('--set-upstream', await getCurrentBranch());
}
const { stdout, stderr } = await execa('git', pushArgs);
pushSpinner.stop(
`${chalk.green('✔')} Successfully pushed all commits to ${
@@ -160,6 +191,7 @@ ${chalk.grey('——————————————————')}`
);
if (stdout) outro(stdout);
displayPushUrl(stderr);
} else {
outro('`git push` aborted');
process.exit(0);
@@ -181,7 +213,11 @@ ${chalk.grey('——————————————————')}`
pushSpinner.start(`Running 'git push ${selectedRemote}'`);
const { stdout } = await execa('git', ['push', selectedRemote]);
const pushArgs = ['push', selectedRemote];
if (!(await hasUpstreamBranch())) {
pushArgs.push('--set-upstream', await getCurrentBranch());
}
const { stdout, stderr } = await execa('git', pushArgs);
if (stdout) outro(stdout);
@@ -190,6 +226,8 @@ ${chalk.grey('——————————————————')}`
'✔'
)} successfully pushed all commits to ${selectedRemote}`
);
displayPushUrl(stderr);
}
}
} else {
@@ -203,7 +241,9 @@ ${chalk.grey('——————————————————')}`
await generateCommitMessageFromGitDiff({
diff,
extraArgs,
fullGitMojiSpec
context,
fullGitMojiSpec,
skipCommitConfirmation
});
}
}
@@ -214,7 +254,9 @@ ${chalk.grey('——————————————————')}`
const errorConfig = getConfig();
const provider = errorConfig.OCO_AI_PROVIDER || 'openai';
const formatted = formatUserFriendlyError(error, provider);
const formatted = formatUserFriendlyError(error, provider, {
baseURL: errorConfig.OCO_API_URL
});
outro(printFormattedError(formatted));
process.exit(1);

View File

@@ -29,7 +29,8 @@ export enum CONFIG_KEYS {
OCO_API_CUSTOM_HEADERS = 'OCO_API_CUSTOM_HEADERS',
OCO_OMIT_SCOPE = 'OCO_OMIT_SCOPE',
OCO_GITPUSH = 'OCO_GITPUSH', // todo: deprecate
OCO_HOOK_AUTO_UNCOMMENT = 'OCO_HOOK_AUTO_UNCOMMENT'
OCO_HOOK_AUTO_UNCOMMENT = 'OCO_HOOK_AUTO_UNCOMMENT',
OCO_OLLAMA_THINK = 'OCO_OLLAMA_THINK'
}
export enum CONFIG_MODES {
@@ -722,7 +723,7 @@ export const configValidators = {
[CONFIG_KEYS.OCO_API_URL](value: any) {
validateConfig(
CONFIG_KEYS.OCO_API_URL,
typeof value === 'string',
typeof value === 'string' && /^(https?:\/\/)/.test(value),
`${value} is not a valid URL. It should start with 'http://' or 'https://'.`
);
return value;
@@ -731,7 +732,8 @@ export const configValidators = {
[CONFIG_KEYS.OCO_PROXY](value: any) {
validateConfig(
CONFIG_KEYS.OCO_PROXY,
typeof value === 'string',
value === null ||
(typeof value === 'string' && /^(https?:\/\/)/.test(value)),
`${value} is not a valid URL. It should start with 'http://' or 'https://'.`
);
return value;
@@ -838,6 +840,15 @@ export const configValidators = {
typeof value === 'boolean',
'Must be true or false'
);
},
[CONFIG_KEYS.OCO_OLLAMA_THINK](value: any) {
validateConfig(
CONFIG_KEYS.OCO_OLLAMA_THINK,
typeof value === 'boolean',
'Must be true or false'
);
return value;
}
};
@@ -890,7 +901,7 @@ export type ConfigType = {
[CONFIG_KEYS.OCO_TOKENS_MAX_INPUT]: number;
[CONFIG_KEYS.OCO_TOKENS_MAX_OUTPUT]: number;
[CONFIG_KEYS.OCO_API_URL]?: string;
[CONFIG_KEYS.OCO_PROXY]?: string;
[CONFIG_KEYS.OCO_PROXY]?: string | null;
[CONFIG_KEYS.OCO_API_CUSTOM_HEADERS]?: string;
[CONFIG_KEYS.OCO_DESCRIPTION]: boolean;
[CONFIG_KEYS.OCO_EMOJI]: boolean;
@@ -905,6 +916,7 @@ export type ConfigType = {
[CONFIG_KEYS.OCO_OMIT_SCOPE]: boolean;
[CONFIG_KEYS.OCO_TEST_MOCK_TYPE]: string;
[CONFIG_KEYS.OCO_HOOK_AUTO_UNCOMMENT]: boolean;
[CONFIG_KEYS.OCO_OLLAMA_THINK]?: boolean;
};
export const defaultConfigPath = pathJoin(homedir(), '.opencommit');
@@ -975,10 +987,7 @@ const getEnvConfig = (envPath: string) => {
return {
OCO_MODEL: process.env.OCO_MODEL,
OCO_API_URL: process.env.OCO_API_URL,
OCO_PROXY:
process.env.OCO_PROXY ||
process.env.HTTPS_PROXY ||
process.env.HTTP_PROXY,
OCO_PROXY: process.env.OCO_PROXY,
OCO_API_KEY: process.env.OCO_API_KEY,
OCO_API_CUSTOM_HEADERS: process.env.OCO_API_CUSTOM_HEADERS,
OCO_AI_PROVIDER: process.env.OCO_AI_PROVIDER as OCO_AI_PROVIDER_ENUM,
@@ -1016,16 +1025,13 @@ export const getIsGlobalConfigFileExist = (
};
export const getGlobalConfig = (configPath: string = defaultConfigPath) => {
let globalConfig: ConfigType;
const isGlobalConfigFileExist = getIsGlobalConfigFileExist(configPath);
if (!isGlobalConfigFileExist) globalConfig = initGlobalConfig(configPath);
else {
const configFile = readFileSync(configPath, 'utf8');
globalConfig = iniParse(configFile) as ConfigType;
if (!isGlobalConfigFileExist) {
return { ...DEFAULT_CONFIG };
}
return globalConfig;
const configFile = readFileSync(configPath, 'utf8');
return iniParse(configFile) as ConfigType;
};
/**
@@ -1038,7 +1044,10 @@ export const getGlobalConfig = (configPath: string = defaultConfigPath) => {
const mergeConfigs = (main: Partial<ConfigType>, fallback: ConfigType) => {
const allKeys = new Set([...Object.keys(main), ...Object.keys(fallback)]);
return Array.from(allKeys).reduce((acc, key) => {
acc[key] = parseConfigVarValue(main[key] ?? fallback[key]);
const mainValue = main[key];
acc[key] = parseConfigVarValue(
mainValue !== undefined ? mainValue : fallback[key]
);
return acc;
}, {} as ConfigType);
};
@@ -1207,7 +1216,10 @@ function getConfigKeyDetails(key) {
case CONFIG_KEYS.OCO_PROXY:
return {
description: 'HTTP/HTTPS Proxy URL',
values: ["URL string (must start with 'http://' or 'https://')"]
values: [
"URL string (must start with 'http://' or 'https://')",
'null (disable proxy even when HTTP_PROXY/HTTPS_PROXY are set)'
]
};
case CONFIG_KEYS.OCO_MESSAGE_TEMPLATE_PLACEHOLDER:
return {

View File

@@ -427,22 +427,23 @@ export async function runSetup(): Promise<boolean> {
}
export function isFirstRun(): boolean {
const hasGlobalConfig = getIsGlobalConfigFileExist();
const config = getConfig();
// Check if API key is missing for providers that need it
const provider = config.OCO_AI_PROVIDER || OCO_AI_PROVIDER_ENUM.OPENAI;
if (MODEL_REQUIRED_PROVIDERS.includes(provider as OCO_AI_PROVIDER_ENUM)) {
// For Ollama/MLX, check if model is set
return !config.OCO_MODEL;
}
if (provider === OCO_AI_PROVIDER_ENUM.TEST) {
return false;
}
// For other providers, check if API key is set
return !config.OCO_API_KEY;
const hasRequiredConfig = MODEL_REQUIRED_PROVIDERS.includes(
provider as OCO_AI_PROVIDER_ENUM
)
? Boolean(config.OCO_MODEL)
: Boolean(config.OCO_API_KEY);
// Trigger the full setup wizard only when nothing usable was configured yet.
return !hasGlobalConfig && !hasRequiredConfig;
}
export async function promptForMissingApiKey(): Promise<boolean> {

View File

@@ -11,8 +11,9 @@ export interface AiEngineConfig {
maxTokensOutput: number;
maxTokensInput: number;
baseURL?: string;
proxy?: string;
proxy?: string | null;
customHeaders?: Record<string, string>;
ollamaThink?: boolean;
}
type Client =

View File

@@ -5,8 +5,8 @@ import {
MessageParam
} from '@anthropic-ai/sdk/resources/messages.mjs';
import { OpenAI } from 'openai';
import { GenerateCommitMessageErrorEnum } from '../generateCommitMessageFromGitDiff';
import { normalizeEngineError } from '../utils/engineErrorHandler';
import { GenerateCommitMessageErrorEnum } from '../utils/generateCommitMessageErrors';
import { removeContentTags } from '../utils/removeContentTags';
import { tokenCount } from '../utils/tokenCount';
import { AiEngine, AiEngineConfig } from './Engine';
@@ -21,8 +21,7 @@ export class AnthropicEngine implements AiEngine {
this.config = config;
const clientOptions: any = { apiKey: this.config.apiKey };
const proxy =
config.proxy || process.env.HTTPS_PROXY || process.env.HTTP_PROXY;
const proxy = config.proxy;
if (proxy) {
clientOptions.httpAgent = new HttpsProxyAgent(proxy);
}

View File

@@ -3,8 +3,8 @@ import {
OpenAIClient as AzureOpenAIClient
} from '@azure/openai';
import { OpenAI } from 'openai';
import { GenerateCommitMessageErrorEnum } from '../generateCommitMessageFromGitDiff';
import { normalizeEngineError } from '../utils/engineErrorHandler';
import { GenerateCommitMessageErrorEnum } from '../utils/generateCommitMessageErrors';
import { removeContentTags } from '../utils/removeContentTags';
import { tokenCount } from '../utils/tokenCount';
import { AiEngine, AiEngineConfig } from './Engine';

View File

@@ -1,6 +1,6 @@
import { OpenAI } from 'openai';
import { GenerateCommitMessageErrorEnum } from '../generateCommitMessageFromGitDiff';
import { normalizeEngineError } from '../utils/engineErrorHandler';
import { GenerateCommitMessageErrorEnum } from '../utils/generateCommitMessageErrors';
import { removeContentTags } from '../utils/removeContentTags';
import { tokenCount } from '../utils/tokenCount';
import { OpenAiEngine, OpenAiConfig } from './openAi';

View File

@@ -1,7 +1,7 @@
import { Mistral } from '@mistralai/mistralai';
import { OpenAI } from 'openai';
import { HttpsProxyAgent } from 'https-proxy-agent';
import { GenerateCommitMessageErrorEnum } from '../generateCommitMessageFromGitDiff';
import { normalizeEngineError } from '../utils/engineErrorHandler';
import { GenerateCommitMessageErrorEnum } from '../utils/generateCommitMessageErrors';
import { removeContentTags } from '../utils/removeContentTags';
import { tokenCount } from '../utils/tokenCount';
import { AiEngine, AiEngineConfig } from './Engine';
@@ -10,10 +10,6 @@ import { AiEngine, AiEngineConfig } from './Engine';
export interface MistralAiConfig extends AiEngineConfig {}
export type MistralCompletionMessageParam = Array<any>;
// Import Mistral dynamically to avoid TS errors
// eslint-disable-next-line @typescript-eslint/no-var-requires
const Mistral = require('@mistralai/mistralai').Mistral;
export class MistralAiEngine implements AiEngine {
config: MistralAiConfig;
client: any; // Using any type for Mistral client to avoid TS errors

View File

@@ -4,7 +4,9 @@ import { normalizeEngineError } from '../utils/engineErrorHandler';
import { removeContentTags } from '../utils/removeContentTags';
import { AiEngine, AiEngineConfig } from './Engine';
interface OllamaConfig extends AiEngineConfig {}
interface OllamaConfig extends AiEngineConfig {
ollamaThink?: boolean;
}
const DEFAULT_OLLAMA_URL = 'http://localhost:11434';
const OLLAMA_CHAT_PATH = '/api/chat';
@@ -32,12 +34,15 @@ export class OllamaEngine implements AiEngine {
async generateCommitMessage(
messages: Array<OpenAI.Chat.Completions.ChatCompletionMessageParam>
): Promise<string | undefined> {
const params = {
const params: Record<string, any> = {
model: this.config.model ?? 'mistral',
messages,
options: { temperature: 0, top_p: 0.1 },
stream: false
};
if (typeof this.config.ollamaThink === 'boolean') {
params.think = this.config.ollamaThink;
}
try {
const response = await this.client.post(this.chatUrl, params);

View File

@@ -1,8 +1,8 @@
import { OpenAI } from 'openai';
import { HttpsProxyAgent } from 'https-proxy-agent';
import { GenerateCommitMessageErrorEnum } from '../generateCommitMessageFromGitDiff';
import { parseCustomHeaders } from '../utils/engine';
import { parseCustomHeaders } from '../utils/customHeaders';
import { normalizeEngineError } from '../utils/engineErrorHandler';
import { GenerateCommitMessageErrorEnum } from '../utils/generateCommitMessageErrors';
import { removeContentTags } from '../utils/removeContentTags';
import { tokenCount } from '../utils/tokenCount';
import { AiEngine, AiEngineConfig } from './Engine';
@@ -24,8 +24,7 @@ export class OpenAiEngine implements AiEngine {
clientOptions.baseURL = config.baseURL;
}
const proxy =
config.proxy || process.env.HTTPS_PROXY || process.env.HTTP_PROXY;
const proxy = config.proxy;
if (proxy) {
clientOptions.httpAgent = new HttpsProxyAgent(proxy);
}

View File

@@ -16,6 +16,7 @@ import {
getSuggestedModels,
ModelNotFoundError
} from './utils/errors';
import { GenerateCommitMessageErrorEnum } from './utils/generateCommitMessageErrors';
import { mergeDiffs } from './utils/mergeDiffs';
import { tokenCount } from './utils/tokenCount';
@@ -43,13 +44,6 @@ const generateCommitMessageChatCompletionPrompt = async (
return chatContextAsCompletionRequest;
};
export enum GenerateCommitMessageErrorEnum {
tooMuchTokens = 'TOO_MUCH_TOKENS',
internalError = 'INTERNAL_ERROR',
emptyMessage = 'EMPTY_MESSAGE',
outputTokensTooHigh = `Token limit exceeded, OCO_TOKENS_MAX_OUTPUT must not be much higher than the default ${DEFAULT_TOKEN_LIMITS.DEFAULT_MAX_TOKENS_OUTPUT} tokens.`
}
async function handleModelNotFoundError(
error: Error,
provider: string,
@@ -130,7 +124,7 @@ async function handleModelNotFoundError(
...existingConfig,
OCO_MODEL: newModel
} as any);
console.log(chalk.green('') + ' Model saved as default\n');
console.log(chalk.green('') + ' Model saved as default\n');
}
return newModel;
@@ -168,7 +162,8 @@ export const generateCommitMessageByDiff = async (
const commitMessagePromises = await getCommitMsgsPromisesFromFileDiffs(
diff,
MAX_REQUEST_TOKENS,
fullGitMojiSpec
fullGitMojiSpec,
context
);
const commitMessages = [] as string[];
@@ -228,7 +223,8 @@ function getMessagesPromisesByChangesInFile(
fileDiff: string,
separator: string,
maxChangeLength: number,
fullGitMojiSpec: boolean
fullGitMojiSpec: boolean,
context: string
) {
const hunkHeaderSeparator = '@@ ';
const [fileHeader, ...fileDiffByLines] = fileDiff.split(hunkHeaderSeparator);
@@ -256,7 +252,8 @@ function getMessagesPromisesByChangesInFile(
async (lineDiff) => {
const messages = await generateCommitMessageChatCompletionPrompt(
separator + lineDiff,
fullGitMojiSpec
fullGitMojiSpec,
context
);
return engine.generateCommitMessage(messages);
@@ -305,7 +302,8 @@ function splitDiff(diff: string, maxChangeLength: number) {
export const getCommitMsgsPromisesFromFileDiffs = async (
diff: string,
maxDiffLength: number,
fullGitMojiSpec: boolean
fullGitMojiSpec: boolean,
context: string
) => {
const separator = 'diff --git ';
@@ -323,14 +321,16 @@ export const getCommitMsgsPromisesFromFileDiffs = async (
fileDiff,
separator,
maxDiffLength,
fullGitMojiSpec
fullGitMojiSpec,
context
);
commitMessagePromises.push(...messagesPromises);
} else {
const messages = await generateCommitMessageChatCompletionPrompt(
separator + fileDiff,
fullGitMojiSpec
fullGitMojiSpec,
context
);
const engine = getEngine();

View File

@@ -95,10 +95,10 @@ const CONVENTIONAL_COMMIT_KEYWORDS =
'Do not preface the commit with anything, except for the conventional commit keywords: fix, feat, build, chore, ci, docs, style, refactor, perf, test.';
const getCommitConvention = (fullGitMojiSpec: boolean) =>
config.OCO_EMOJI
? fullGitMojiSpec
? FULL_GITMOJI_SPEC
: GITMOJI_HELP
fullGitMojiSpec
? FULL_GITMOJI_SPEC
: config.OCO_EMOJI
? GITMOJI_HELP
: CONVENTIONAL_COMMIT_KEYWORDS;
const getDescriptionInstruction = () =>
@@ -123,36 +123,36 @@ const getScopeInstruction = () =>
* $ oco -- This is a context used to generate the commit message
* @returns - The context of the user input
*/
const userInputCodeContext = (context: string) => {
if (context !== '' && context !== ' ') {
return `Additional context provided by the user: <context>${context}</context>\nConsider this context when generating the commit message, incorporating relevant information when appropriate.`;
const userInputCodeContext = (context: string | undefined | null) => {
const trimmed = (context ?? '').trim();
if (trimmed === '') {
return '';
}
return '';
return `Additional context provided by the user: <context>${trimmed}</context>\nConsider this context when generating the commit message, incorporating relevant information when appropriate.`;
};
const INIT_MAIN_PROMPT = (
  language: string,
  fullGitMojiSpec: boolean,
  context: string
): OpenAI.Chat.Completions.ChatCompletionMessageParam => {
  const commitConvention = fullGitMojiSpec
    ? 'GitMoji specification'
    : 'Conventional Commit Convention';
  const missionStatement = `${IDENTITY} Your mission is to create clean and comprehensive commit messages as per the ${commitConvention} and explain WHAT were the changes and mainly WHY the changes were done.`;
  const diffInstruction =
    "I'll send you an output of 'git diff --staged' command, and you are to convert it into a commit message.";
  const conventionGuidelines = getCommitConvention(fullGitMojiSpec);
  const descriptionGuideline = getDescriptionInstruction();
  const oneLineCommitGuideline = getOneLineCommitInstruction();
  const scopeInstruction = getScopeInstruction();
  const generalGuidelines = `Use the present tense. Lines must not be longer than 74 characters. Use ${language} for the commit message.`;
  const userInputContext = userInputCodeContext(context);
  const content = `${missionStatement}\n${diffInstruction}\n${conventionGuidelines}\n${descriptionGuideline}\n${oneLineCommitGuideline}\n${scopeInstruction}\n${generalGuidelines}\n${userInputContext}`;
  return { role: 'system', content };
};
export const INIT_DIFF_PROMPT: OpenAI.Chat.Completions.ChatCompletionMessageParam =
{


@@ -6,6 +6,10 @@ import currentPackage from '../../package.json';
import { getOpenCommitLatestVersion } from '../version';
export const checkIsLatestVersion = async () => {
if (process.env.OCO_TEST_SKIP_VERSION_CHECK === 'true') {
return;
}
const latestVersion = await getOpenCommitLatestVersion();
if (latestVersion) {


@@ -0,0 +1,21 @@
export function parseCustomHeaders(headers: any): Record<string, string> {
let parsedHeaders = {};
if (!headers) {
return parsedHeaders;
}
try {
if (typeof headers === 'object' && !Array.isArray(headers)) {
parsedHeaders = headers;
} else {
parsedHeaders = JSON.parse(headers);
}
} catch {
console.warn(
'Invalid OCO_API_CUSTOM_HEADERS format, ignoring custom headers'
);
}
return parsedHeaders;
}
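The three input shapes the new helper supports can be exercised in isolation (a standalone copy, assuming nothing else from the opencommit codebase):

```typescript
// Standalone copy of the helper above, for illustration.
function parseCustomHeaders(headers: any): Record<string, string> {
  let parsedHeaders: Record<string, string> = {};
  if (!headers) {
    return parsedHeaders;
  }
  try {
    if (typeof headers === 'object' && !Array.isArray(headers)) {
      parsedHeaders = headers;
    } else {
      parsedHeaders = JSON.parse(headers);
    }
  } catch {
    console.warn('Invalid OCO_API_CUSTOM_HEADERS format, ignoring custom headers');
  }
  return parsedHeaders;
}

// Objects pass through, JSON strings are parsed, anything else falls back to {}.
console.log(parseCustomHeaders({ 'X-Team': 'core' })); // { 'X-Team': 'core' }
console.log(parseCustomHeaders('{"X-Team":"core"}')); // { 'X-Team': 'core' }
console.log(parseCustomHeaders('not-json')); // {} (after a warning)
```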


@@ -13,48 +13,32 @@ import { MLXEngine } from '../engine/mlx';
import { DeepseekEngine } from '../engine/deepseek';
import { AimlApiEngine } from '../engine/aimlapi';
import { OpenRouterEngine } from '../engine/openrouter';
import { parseCustomHeaders } from './customHeaders';
import { resolveProxy } from './proxy';
export function getEngine(): AiEngine {
const config = getConfig();
const provider = config.OCO_AI_PROVIDER;
const customHeaders = parseCustomHeaders(config.OCO_API_CUSTOM_HEADERS);
const resolvedProxy = resolveProxy(config.OCO_PROXY);
const DEFAULT_CONFIG = {
model: config.OCO_MODEL!,
maxTokensOutput: config.OCO_TOKENS_MAX_OUTPUT!,
maxTokensInput: config.OCO_TOKENS_MAX_INPUT!,
baseURL: config.OCO_API_URL!,
    proxy: resolvedProxy,
apiKey: config.OCO_API_KEY!,
customHeaders
};
switch (provider) {
case OCO_AI_PROVIDER_ENUM.OLLAMA:
      return new OllamaEngine({
        ...DEFAULT_CONFIG,
        ollamaThink: config.OCO_OLLAMA_THINK
      });
case OCO_AI_PROVIDER_ENUM.ANTHROPIC:
return new AnthropicEngine(DEFAULT_CONFIG);


@@ -349,10 +349,44 @@ export interface FormattedError {
suggestion: string | null;
}
export interface ErrorFormattingContext {
baseURL?: string;
}
function getCustomEndpointLabel(baseURL?: string): string | null {
if (!baseURL) {
return null;
}
try {
return new URL(baseURL).host;
} catch {
return null;
}
}
function getServiceUnavailableMessage(
provider: string,
context?: ErrorFormattingContext
): string {
const endpointLabel = getCustomEndpointLabel(context?.baseURL);
if (endpointLabel) {
return `The configured API endpoint (${endpointLabel}) is temporarily unavailable.`;
}
if (context?.baseURL) {
return 'The configured API endpoint is temporarily unavailable.';
}
return `The ${provider} service is temporarily unavailable.`;
}
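The fallback chain these two helpers implement can be seen in isolation (standalone copies; only the `baseURL` shape of the context is assumed): a parseable custom base URL yields a host-specific message, an unparseable one a generic endpoint message, and no base URL the original provider message.

```typescript
// Standalone copies of the two helpers above, for illustration.
function getCustomEndpointLabel(baseURL?: string): string | null {
  if (!baseURL) {
    return null;
  }
  try {
    return new URL(baseURL).host; // host includes the port, e.g. "host:8443"
  } catch {
    return null; // unparseable URL: fall through to the generic message
  }
}

function getServiceUnavailableMessage(
  provider: string,
  context?: { baseURL?: string }
): string {
  const endpointLabel = getCustomEndpointLabel(context?.baseURL);
  if (endpointLabel) {
    return `The configured API endpoint (${endpointLabel}) is temporarily unavailable.`;
  }
  if (context?.baseURL) {
    return 'The configured API endpoint is temporarily unavailable.';
  }
  return `The ${provider} service is temporarily unavailable.`;
}

console.log(getServiceUnavailableMessage('openai'));
// -> The openai service is temporarily unavailable.
console.log(getServiceUnavailableMessage('openai', { baseURL: 'https://proxy.internal:8443/v1' }));
// -> The configured API endpoint (proxy.internal:8443) is temporarily unavailable.
console.log(getServiceUnavailableMessage('openai', { baseURL: 'not a url' }));
// -> The configured API endpoint is temporarily unavailable.
```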
// Format an error into a user-friendly structure
export function formatUserFriendlyError(
error: unknown,
  provider: string,
  context?: ErrorFormattingContext
): FormattedError {
const billingUrl = PROVIDER_BILLING_URLS[provider] || null;
@@ -381,7 +415,7 @@ export function formatUserFriendlyError(
if (error instanceof ServiceUnavailableError) {
return {
title: 'Service Unavailable',
      message: getServiceUnavailableMessage(provider, context),
helpUrl: null,
suggestion: 'Please try again in a few moments.'
};
@@ -427,7 +461,7 @@ export function formatUserFriendlyError(
if (isServiceUnavailableError(error)) {
return {
title: 'Service Unavailable',
      message: getServiceUnavailableMessage(provider, context),
helpUrl: null,
suggestion: 'Please try again in a few moments.'
};


@@ -0,0 +1,8 @@
import { DEFAULT_TOKEN_LIMITS } from '../commands/config';
export enum GenerateCommitMessageErrorEnum {
tooMuchTokens = 'TOO_MUCH_TOKENS',
internalError = 'INTERNAL_ERROR',
emptyMessage = 'EMPTY_MESSAGE',
outputTokensTooHigh = `Token limit exceeded, OCO_TOKENS_MAX_OUTPUT must not be much higher than the default ${DEFAULT_TOKEN_LIMITS.DEFAULT_MAX_TOKENS_OUTPUT} tokens.`
}


@@ -93,36 +93,34 @@ export const gitAdd = async ({ files }: { files: string[] }) => {
gitAddSpinner.stop(`Staged ${files.length} files`);
};
const isFileExcludedFromDiff = (file: string) =>
file.includes('.lock') ||
file.includes('-lock.') ||
file.includes('.svg') ||
file.includes('.png') ||
file.includes('.jpg') ||
file.includes('.jpeg') ||
file.includes('.webp') ||
file.includes('.gif');
export const getDiff = async ({ files }: { files: string[] }) => {
const gitDir = await getGitDir();
  const excludedFiles = files.filter(isFileExcludedFromDiff);
  if (excludedFiles.length) {
outro(
      `Some files are excluded by default from 'git diff'. No commit messages are generated for this files:\n${excludedFiles.join(
'\n'
)}`
);
}
  const diffableFiles = files.filter((file) => !isFileExcludedFromDiff(file));
const { stdout: diff } = await execa(
'git',
    ['diff', '--staged', '--', ...diffableFiles],
{ cwd: gitDir }
);

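As a quick illustration of the shared exclusion predicate introduced in this hunk (a standalone copy, not the commit itself), lockfiles and image assets are still committed but filtered out of the prompt diff:

```typescript
// Standalone copy of the predicate above: lockfiles and image assets are
// kept out of the diff that is sent to the model.
const isFileExcludedFromDiff = (file: string): boolean =>
  file.includes('.lock') ||
  file.includes('-lock.') ||
  file.includes('.svg') ||
  file.includes('.png') ||
  file.includes('.jpg') ||
  file.includes('.jpeg') ||
  file.includes('.webp') ||
  file.includes('.gif');

const staged = ['src/index.ts', 'package-lock.json', 'logo.svg', 'yarn.lock'];
console.log(staged.filter((file) => !isFileExcludedFromDiff(file)));
// -> [ 'src/index.ts' ]
```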

@@ -1,21 +1,52 @@
import axios from 'axios';
import { HttpsProxyAgent } from 'https-proxy-agent';
import { Agent, ProxyAgent, setGlobalDispatcher } from 'undici';
export type ProxySetting = string | null | undefined;
export function resolveProxy(proxySetting?: ProxySetting): ProxySetting {
if (proxySetting === null) {
return null;
}
if (typeof proxySetting === 'string' && proxySetting.trim().length > 0) {
return proxySetting;
}
return process.env.HTTPS_PROXY || process.env.HTTP_PROXY;
}
function resetProxySetup(disableEnvProxy: boolean) {
setGlobalDispatcher(new Agent());
axios.defaults.httpAgent = undefined;
axios.defaults.httpsAgent = undefined;
axios.defaults.proxy = disableEnvProxy ? false : undefined;
}
export function setupProxy(proxySetting?: ProxySetting) {
try {
if (proxySetting === null) {
resetProxySetup(true);
return;
}
resetProxySetup(false);
if (!proxySetting) {
return;
}
// Set global dispatcher for undici (affects globalThis.fetch used by Gemini and others)
const dispatcher = new ProxyAgent(proxySetting);
setGlobalDispatcher(dispatcher);
// Set axios global agents and disable axios built-in proxy handling.
const agent = new HttpsProxyAgent(proxySetting);
axios.defaults.httpAgent = agent;
axios.defaults.httpsAgent = agent;
axios.defaults.proxy = false;
} catch (error) {
const message = error instanceof Error ? error.message : String(error);
console.warn(`[Proxy Error] Failed to set proxy: ${message}`);
}
}
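The precedence that `resolveProxy` encodes can be exercised in isolation (a standalone copy; the env var names match the commit): explicit `null` disables proxying, a non-blank string wins outright, and anything else falls back to the ambient variables.

```typescript
// Standalone copy of resolveProxy, for illustration.
type ProxySetting = string | null | undefined;

function resolveProxy(proxySetting?: ProxySetting): ProxySetting {
  if (proxySetting === null) {
    return null; // explicit opt-out: no proxy, not even from the environment
  }
  if (typeof proxySetting === 'string' && proxySetting.trim().length > 0) {
    return proxySetting; // explicit setting wins
  }
  return process.env.HTTPS_PROXY || process.env.HTTP_PROXY; // ambient fallback
}

process.env.HTTPS_PROXY = 'http://ambient-proxy:8080';
console.log(resolveProxy('http://explicit-proxy:3128')); // explicit wins
console.log(resolveProxy(null)); // null: proxying disabled entirely
console.log(resolveProxy('   ')); // blank string falls back to the env var
console.log(resolveProxy(undefined)); // unset falls back to the env var
```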


@@ -0,0 +1,736 @@
import {
existsSync,
lstatSync,
readFileSync,
realpathSync,
rmSync,
writeFileSync
} from 'fs';
import { resolve } from 'path';
import 'cli-testing-library/extend-expect';
import {
assertGitStatus,
assertHeadCommit,
getHeadCommitFiles,
getMockOpenAiEnv,
prepareEnvironment,
prepareRepo,
prepareTempDir,
runCli,
runGit,
runProcess,
seedMigrations,
seedModelCache,
startMockOpenAiServer,
waitForExit,
writeGlobalConfig,
writeRepoFile
} from './utils';
it('cli flow passes --context through to the model prompt and skips confirmation with --yes', async () => {
const { gitDir, cleanup } = await prepareEnvironment();
const server = await startMockOpenAiServer(
'fix(context): handle production incident'
);
try {
await prepareRepo(
gitDir,
{
'index.ts': 'console.log("Hello World");\n'
},
{ stage: true }
);
const oco = await runCli(['--yes', '--context=production-incident'], {
cwd: gitDir,
env: getMockOpenAiEnv(server.baseUrl)
});
expect(
await oco.queryByText('Confirm the commit message?')
).not.toBeInTheConsole();
expect(
await oco.queryByText('Do you want to run `git push`?')
).not.toBeInTheConsole();
expect(await oco.findByText('Successfully committed')).toBeInTheConsole();
expect(await waitForExit(oco)).toBe(0);
await assertHeadCommit(gitDir, 'fix(context): handle production incident');
const requestPayload = server.requestBodies[
server.requestBodies.length - 1
] as { messages: Array<{ content: string }> };
const requestContents = requestPayload.messages
.map((message) => message.content)
.join('\n');
expect(requestContents).toContain(
'<context>production-incident</context>'
);
expect(requestContents).toContain('console.log("Hello World");');
expect(server.authHeaders).toContain('Bearer test-openai-key');
} finally {
await server.cleanup();
await cleanup();
}
});
it('cli flow passes --fgm through to the full GitMoji prompt', async () => {
const { gitDir, cleanup } = await prepareEnvironment();
const server = await startMockOpenAiServer(
'feat(fgm): use the extended gitmoji specification'
);
try {
await prepareRepo(
gitDir,
{
'index.ts': 'console.log("Hello World");\n'
},
{ stage: true }
);
const oco = await runCli(['--fgm', '--yes'], {
cwd: gitDir,
env: getMockOpenAiEnv(server.baseUrl)
});
expect(await oco.findByText('Successfully committed')).toBeInTheConsole();
expect(await waitForExit(oco)).toBe(0);
await assertHeadCommit(
gitDir,
'feat(fgm): use the extended gitmoji specification'
);
const requestPayload = server.requestBodies[
server.requestBodies.length - 1
] as { messages: Array<{ content: string }> };
const requestContents = requestPayload.messages
.map((message) => message.content)
.join('\n');
expect(requestContents).toContain(
'🎨, Improve structure / format of the code;'
);
expect(requestContents).toContain('GitMoji specification');
} finally {
await server.cleanup();
await cleanup();
}
});
it('cli flow allows editing the generated commit message before committing', async () => {
const { gitDir, cleanup } = await prepareEnvironment();
const server = await startMockOpenAiServer(
'fix(cli): allow editing the generated message'
);
try {
await prepareRepo(
gitDir,
{
'index.ts': 'console.log("Hello World");\n'
},
{ stage: true }
);
const oco = await runCli([], {
cwd: gitDir,
env: getMockOpenAiEnv(server.baseUrl)
});
expect(await oco.findByText('Confirm the commit message?')).toBeInTheConsole();
oco.userEvent.keyboard('[ArrowDown][ArrowDown][Enter]');
expect(
await oco.findByText(
'Please edit the commit message: (press Enter to continue)'
)
).toBeInTheConsole();
oco.userEvent.keyboard(' before commit[Enter]');
expect(await oco.findByText('Successfully committed')).toBeInTheConsole();
expect(await waitForExit(oco)).toBe(0);
await assertHeadCommit(
gitDir,
'fix(cli): allow editing the generated message before commit'
);
} finally {
await server.cleanup();
await cleanup();
}
});
it('cli flow regenerates the message when the user rejects the first suggestion', async () => {
const { gitDir, cleanup } = await prepareEnvironment();
const server = await startMockOpenAiServer(({ requestIndex }) => ({
body: {
choices: [
{
message: {
content:
requestIndex === 0
? 'fix(cli): first generated message'
: 'fix(cli): regenerated message after retry'
}
}
]
}
}));
try {
await prepareRepo(
gitDir,
{
'index.ts': 'console.log("Hello World");\n'
},
{ stage: true }
);
const oco = await runCli([], {
cwd: gitDir,
env: getMockOpenAiEnv(server.baseUrl)
});
expect(await oco.findByText('Confirm the commit message?')).toBeInTheConsole();
oco.userEvent.keyboard('[ArrowDown][Enter]');
expect(
await oco.findByText('Do you want to regenerate the message?')
).toBeInTheConsole();
oco.userEvent.keyboard('[Enter]');
oco.clear();
expect(
await oco.findByText('fix(cli): regenerated message after retry')
).toBeInTheConsole();
expect(await oco.findByText('Confirm the commit message?')).toBeInTheConsole();
oco.userEvent.keyboard('[Enter]');
expect(await oco.findByText('Successfully committed')).toBeInTheConsole();
expect(await waitForExit(oco)).toBe(0);
await assertHeadCommit(
gitDir,
'fix(cli): regenerated message after retry'
);
expect(server.requestBodies).toHaveLength(2);
} finally {
await server.cleanup();
await cleanup();
}
});
it('cli flow lets the user select only specific unstaged files', async () => {
const { gitDir, cleanup } = await prepareEnvironment();
const server = await startMockOpenAiServer(
'fix(cli): commit only the selected files'
);
try {
await prepareRepo(gitDir, {
'alpha.ts': 'console.log("alpha");\n',
'beta.ts': 'console.log("beta");\n'
});
const oco = await runCli([], {
cwd: gitDir,
env: getMockOpenAiEnv(server.baseUrl)
});
expect(await oco.findByText('No files are staged')).toBeInTheConsole();
expect(
await oco.findByText(
'Do you want to stage all files and generate commit message?'
)
).toBeInTheConsole();
oco.userEvent.keyboard('[ArrowDown][Enter]');
expect(
await oco.findByText('Select the files you want to add to the commit:')
).toBeInTheConsole();
oco.userEvent.keyboard('[Space][Enter]');
expect(await oco.findByText('Confirm the commit message?')).toBeInTheConsole();
oco.userEvent.keyboard('[Enter]');
expect(await oco.findByText('Successfully committed')).toBeInTheConsole();
expect(await waitForExit(oco)).toBe(0);
expect(await getHeadCommitFiles(gitDir)).toEqual(['alpha.ts']);
await assertGitStatus(gitDir, '?? beta.ts');
} finally {
await server.cleanup();
await cleanup();
}
});
it('cli applies the documented message template placeholder from extra args', async () => {
const { gitDir, cleanup } = await prepareEnvironment();
const server = await startMockOpenAiServer(
'feat(template): keep generated subject'
);
try {
await prepareRepo(
gitDir,
{
'index.ts': 'console.log("Hello World");\n'
},
{ stage: true }
);
const oco = await runCli(["'$msg #205'"], {
cwd: gitDir,
env: getMockOpenAiEnv(server.baseUrl)
});
expect(await oco.findByText('Confirm the commit message?')).toBeInTheConsole();
oco.userEvent.keyboard('[Enter]');
expect(await oco.findByText('Successfully committed')).toBeInTheConsole();
expect(await waitForExit(oco)).toBe(0);
await assertHeadCommit(gitDir, 'feat(template): keep generated subject #205');
} finally {
await server.cleanup();
await cleanup();
}
});
it('hook command sets and unsets the prepare-commit-msg symlink', async () => {
const { gitDir, cleanup } = await prepareEnvironment();
const hookPath = resolve(gitDir, '.git/hooks/prepare-commit-msg');
const cliPath = resolve('./out/cli.cjs');
try {
const setHook = await runCli(['hook', 'set'], {
cwd: gitDir
});
expect(await setHook.findByText('Hook set')).toBeInTheConsole();
expect(await waitForExit(setHook)).toBe(0);
expect(existsSync(hookPath)).toBe(true);
expect(lstatSync(hookPath).isSymbolicLink()).toBe(true);
expect(realpathSync(hookPath)).toBe(cliPath);
const unsetHook = await runCli(['hook', 'unset'], {
cwd: gitDir
});
expect(await unsetHook.findByText('Hook is removed')).toBeInTheConsole();
expect(await waitForExit(unsetHook)).toBe(0);
expect(existsSync(hookPath)).toBe(false);
} finally {
await cleanup();
}
});
it('prepare-commit-msg hook writes the generated message into the commit message file', async () => {
const { gitDir, cleanup } = await prepareEnvironment();
const server = await startMockOpenAiServer(
'fix(hook): populate the commit message file'
);
const hookPath = resolve(gitDir, '.git/hooks/prepare-commit-msg');
const messageFile = resolve(gitDir, '.git/COMMIT_EDITMSG');
try {
await prepareRepo(
gitDir,
{
'index.ts': 'console.log("Hello World");\n'
},
{ stage: true }
);
const setHook = await runCli(['hook', 'set'], {
cwd: gitDir
});
expect(await setHook.findByText('Hook set')).toBeInTheConsole();
expect(await waitForExit(setHook)).toBe(0);
writeFileSync(messageFile, '# existing\n');
const hookRun = await runProcess(hookPath, [messageFile], {
cwd: gitDir,
env: getMockOpenAiEnv(server.baseUrl)
});
expect(await hookRun.findByText('Done')).toBeInTheConsole();
expect(await waitForExit(hookRun)).toBe(0);
const commitMessage = readFileSync(messageFile, 'utf8');
expect(commitMessage).toContain(
'# fix(hook): populate the commit message file'
);
expect(commitMessage).toContain('# ---------- [OpenCommit] ---------- #');
expect(commitMessage).toContain('# existing');
} finally {
await server.cleanup();
await cleanup();
}
});
it('cli flow prompts for a missing API key, saves it, and completes the commit', async () => {
const { gitDir, cleanup } = await prepareEnvironment();
const homeDir = await prepareTempDir();
const server = await startMockOpenAiServer(
'fix(api): recovered after prompt'
);
try {
const configPath = writeGlobalConfig(homeDir, [
'OCO_AI_PROVIDER=openai',
'OCO_MODEL=gpt-4o-mini',
`OCO_API_URL=${server.baseUrl}`,
'OCO_GITPUSH=false'
]);
seedMigrations(homeDir);
await prepareRepo(
gitDir,
{
'index.ts': 'console.log("Hello World");\n'
},
{ stage: true }
);
const oco = await runCli([], {
cwd: gitDir,
env: {
HOME: homeDir
}
});
expect(
await oco.findByText("API key missing for openai. Let's set it up.")
).toBeInTheConsole();
expect(await oco.findByText('Enter your API key:')).toBeInTheConsole();
oco.userEvent.keyboard('test-openai-key[Enter]');
expect(await oco.findByText('API key saved')).toBeInTheConsole();
expect(await oco.findByText('Confirm the commit message?')).toBeInTheConsole();
oco.userEvent.keyboard('[Enter]');
expect(await oco.findByText('Successfully committed')).toBeInTheConsole();
expect(await waitForExit(oco)).toBe(0);
await assertHeadCommit(gitDir, 'fix(api): recovered after prompt');
expect(server.authHeaders).toContain('Bearer test-openai-key');
expect(readFileSync(configPath, 'utf8')).toContain(
'OCO_API_KEY=test-openai-key'
);
} finally {
await server.cleanup();
rmSync(homeDir, { force: true, recursive: true });
await cleanup();
}
});
it('cli ignores files listed in .opencommitignore when they are the only staged changes', async () => {
const { gitDir, cleanup } = await prepareEnvironment();
try {
await prepareRepo(
gitDir,
{
'.opencommitignore': 'ignored.ts\n'
},
{
stage: true,
commitMessage: 'add opencommit ignore'
}
);
writeRepoFile(gitDir, 'ignored.ts', 'console.log("ignored");\n');
await runGit(['add', 'ignored.ts'], gitDir);
const oco = await runCli([], {
cwd: gitDir,
env: {
OCO_AI_PROVIDER: 'openai',
OCO_API_KEY: 'dummy-openai-key',
OCO_GITPUSH: 'false'
}
});
expect(await oco.findByText('No changes detected')).toBeInTheConsole();
expect(await waitForExit(oco)).toBe(1);
await assertHeadCommit(gitDir, 'add opencommit ignore');
} finally {
await cleanup();
}
});
it('cli excludes .opencommitignore files from the generated prompt while still committing staged changes', async () => {
const { gitDir, cleanup } = await prepareEnvironment();
const server = await startMockOpenAiServer(
'fix(ignore): keep only relevant diff context'
);
try {
await prepareRepo(
gitDir,
{
'.opencommitignore': 'ignored.ts\n'
},
{
stage: true,
commitMessage: 'add opencommit ignore'
}
);
writeRepoFile(gitDir, 'kept.ts', 'console.log("kept");\n');
writeRepoFile(gitDir, 'ignored.ts', 'console.log("ignored");\n');
await runGit(['add', 'kept.ts', 'ignored.ts'], gitDir);
const oco = await runCli([], {
cwd: gitDir,
env: getMockOpenAiEnv(server.baseUrl)
});
expect(await oco.findByText('Confirm the commit message?')).toBeInTheConsole();
oco.userEvent.keyboard('[Enter]');
expect(await oco.findByText('Successfully committed')).toBeInTheConsole();
expect(await waitForExit(oco)).toBe(0);
const requestPayload = server.requestBodies[
server.requestBodies.length - 1
] as { messages: Array<{ content: string }> };
const requestContents = requestPayload.messages
.map((message) => message.content)
.join('\n');
expect(requestContents).toContain('kept.ts');
expect(requestContents).toContain('console.log("kept");');
expect(requestContents).not.toContain('ignored.ts');
expect(requestContents).not.toContain('console.log("ignored");');
expect(await getHeadCommitFiles(gitDir)).toEqual(['ignored.ts', 'kept.ts']);
} finally {
await server.cleanup();
await cleanup();
}
});
it('first run launches setup, saves config, and completes a commit with the configured provider', async () => {
const { gitDir, cleanup } = await prepareEnvironment();
const homeDir = await prepareTempDir();
const server = await startMockOpenAiServer(
'feat(setup): finish first run successfully'
);
try {
const configPath = resolve(homeDir, '.opencommit');
await seedModelCache(homeDir, {
openai: ['gpt-4o-mini', 'gpt-4o']
});
seedMigrations(homeDir);
await prepareRepo(
gitDir,
{
'index.ts': 'console.log("Hello World");\n'
},
{ stage: true }
);
const oco = await runCli([], {
cwd: gitDir,
env: {
HOME: homeDir,
OCO_API_URL: server.baseUrl,
OCO_GITPUSH: 'false'
}
});
expect(await oco.findByText('Select your AI provider:')).toBeInTheConsole();
oco.userEvent.keyboard('[Enter]');
expect(await oco.findByText('Enter your API key:')).toBeInTheConsole();
oco.userEvent.keyboard('first-run-openai-key[Enter]');
expect(await oco.findByText('Select a model:')).toBeInTheConsole();
oco.userEvent.keyboard('[Enter]');
expect(
await oco.findByText('Configuration saved to ~/.opencommit')
).toBeInTheConsole();
expect(await oco.findByText('Confirm the commit message?')).toBeInTheConsole();
oco.userEvent.keyboard('[Enter]');
expect(await oco.findByText('Successfully committed')).toBeInTheConsole();
expect(await waitForExit(oco)).toBe(0);
await assertHeadCommit(
gitDir,
'feat(setup): finish first run successfully'
);
expect(readFileSync(configPath, 'utf8')).toContain('OCO_AI_PROVIDER=openai');
expect(readFileSync(configPath, 'utf8')).toContain(
'OCO_API_KEY=first-run-openai-key'
);
expect(readFileSync(configPath, 'utf8')).toContain('OCO_MODEL=gpt-4o-mini');
expect(server.authHeaders).toContain('Bearer first-run-openai-key');
} finally {
await server.cleanup();
rmSync(homeDir, { force: true, recursive: true });
await cleanup();
}
});
it('cli recovers from a missing model by prompting for an alternative and retrying', async () => {
const { gitDir, cleanup } = await prepareEnvironment();
const homeDir = await prepareTempDir();
const server = await startMockOpenAiServer(({ requestIndex, body }) => {
if (requestIndex === 0) {
return {
status: 404,
body: {
error: {
message: `The model '${body?.model}' does not exist`,
type: 'invalid_request_error',
code: 'model_not_found'
}
}
};
}
return {
body: {
choices: [
{
message: {
content: 'fix(model): recover from invalid default model'
}
}
]
}
};
});
try {
const configPath = writeGlobalConfig(homeDir, [
'OCO_AI_PROVIDER=openai',
'OCO_API_KEY=test-openai-key',
'OCO_MODEL=missing-model',
`OCO_API_URL=${server.baseUrl}`,
'OCO_GITPUSH=false'
]);
seedMigrations(homeDir);
await prepareRepo(
gitDir,
{
'index.ts': 'console.log("Hello World");\n'
},
{ stage: true }
);
const oco = await runCli([], {
cwd: gitDir,
env: {
HOME: homeDir
}
});
expect(
await oco.findByText("Model 'missing-model' not found")
).toBeInTheConsole();
expect(
await oco.findByText('Select an alternative model:')
).toBeInTheConsole();
oco.userEvent.keyboard('[Enter]');
expect(await oco.findByText('Save as default model?')).toBeInTheConsole();
oco.userEvent.keyboard('[Enter]');
expect(await oco.findByText('Model saved as default')).toBeInTheConsole();
expect(await oco.findByText('Confirm the commit message?')).toBeInTheConsole();
oco.userEvent.keyboard('[Enter]');
expect(await oco.findByText('Successfully committed')).toBeInTheConsole();
expect(await waitForExit(oco)).toBe(0);
await assertHeadCommit(
gitDir,
'fix(model): recover from invalid default model'
);
expect(server.requestBodies.map((request) => request.model)).toEqual([
'missing-model',
'gpt-4o-mini'
]);
expect(readFileSync(configPath, 'utf8')).toContain('OCO_MODEL=gpt-4o-mini');
} finally {
await server.cleanup();
rmSync(homeDir, { force: true, recursive: true });
await cleanup();
}
});
it('cli excludes lockfiles and assets from the generated prompt while still committing them', async () => {
const { gitDir, cleanup } = await prepareEnvironment();
const server = await startMockOpenAiServer(
'fix(diff): focus prompt on meaningful source changes'
);
try {
await prepareRepo(
gitDir,
{
'kept.ts': 'console.log("kept");\n',
'package-lock.json': '{"name":"opencommit","lockfileVersion":3}\n',
'logo.svg': '<svg viewBox="0 0 1 1"><rect width="1" height="1" /></svg>\n'
},
{ stage: true }
);
const oco = await runCli([], {
cwd: gitDir,
env: getMockOpenAiEnv(server.baseUrl)
});
expect(await oco.findByText('Confirm the commit message?')).toBeInTheConsole();
oco.userEvent.keyboard('[Enter]');
expect(await oco.findByText('Successfully committed')).toBeInTheConsole();
expect(await waitForExit(oco)).toBe(0);
const requestPayload = server.requestBodies[
server.requestBodies.length - 1
] as { messages: Array<{ content: string }> };
const requestContents = requestPayload.messages
.map((message) => message.content)
.join('\n');
expect(requestContents).toContain('kept.ts');
expect(requestContents).toContain('console.log("kept");');
expect(requestContents).not.toContain('package-lock.json');
expect(requestContents).not.toContain('lockfileVersion');
expect(requestContents).not.toContain('logo.svg');
expect(requestContents).not.toContain('<svg');
expect(await getHeadCommitFiles(gitDir)).toEqual([
'kept.ts',
'logo.svg',
'package-lock.json'
]);
} finally {
await server.cleanup();
await cleanup();
}
});
it('fails with a non-zero exit code outside a git repository', async () => {
const tempDir = await prepareTempDir();
try {
const oco = await runCli([], {
cwd: tempDir,
env: {
OCO_AI_PROVIDER: 'openai',
OCO_API_KEY: 'dummy-openai-key',
OCO_GITPUSH: 'false'
}
});
expect(await waitForExit(oco)).toBe(1);
expect(oco.getStdallStr()).toMatch(/No changes detected|not a git repository/);
} finally {
rmSync(tempDir, { force: true, recursive: true });
}
});


@@ -1,210 +1,268 @@
import path from 'path';
import 'cli-testing-library/extend-expect';
import { exec } from 'child_process';
import { prepareTempDir } from './utils';
import { promisify } from 'util';
import { render } from 'cli-testing-library';
import { resolve } from 'path';
import { rm } from 'fs';
const fsExec = promisify(exec);
const fsRemove = promisify(rm);
import {
assertHeadCommit,
getCurrentBranchName,
getMockOpenAiEnv,
getRemoteBranchHeadSubject,
prepareEnvironment,
prepareRepo,
remoteBranchExists,
runCli,
startMockOpenAiServer,
waitForExit
} from './utils';
const waitForCommitConfirmation = async (
  findByText: (text: string) => Promise<any>
) => {
expect(await findByText('Generating the commit message')).toBeInTheConsole();
expect(await findByText('Confirm the commit message?')).toBeInTheConsole();
};
/**
* git remote -v
*
* [no remotes]
*/
const prepareNoRemoteGitRepository = async (): Promise<{
gitDir: string;
cleanup: () => Promise<void>;
}> => {
const tempDir = await prepareTempDir();
await fsExec('git init test', { cwd: tempDir });
const gitDir = path.resolve(tempDir, 'test');
const cleanup = async () => {
return fsRemove(tempDir, { recursive: true });
};
return {
gitDir,
cleanup
};
};
/**
* git remote -v
*
* origin /tmp/remote.git (fetch)
* origin /tmp/remote.git (push)
*/
const prepareOneRemoteGitRepository = async (): Promise<{
gitDir: string;
cleanup: () => Promise<void>;
}> => {
const tempDir = await prepareTempDir();
await fsExec('git init --bare remote.git', { cwd: tempDir });
await fsExec('git clone remote.git test', { cwd: tempDir });
const gitDir = path.resolve(tempDir, 'test');
const cleanup = async () => {
return fsRemove(tempDir, { recursive: true });
};
return {
gitDir,
cleanup
};
};
/**
* git remote -v
*
* origin /tmp/remote.git (fetch)
* origin /tmp/remote.git (push)
* other ../remote2.git (fetch)
* other ../remote2.git (push)
*/
const prepareTwoRemotesGitRepository = async (): Promise<{
gitDir: string;
cleanup: () => Promise<void>;
}> => {
const tempDir = await prepareTempDir();
await fsExec('git init --bare remote.git', { cwd: tempDir });
await fsExec('git init --bare other.git', { cwd: tempDir });
await fsExec('git clone remote.git test', { cwd: tempDir });
const gitDir = path.resolve(tempDir, 'test');
await fsExec('git remote add other ../other.git', { cwd: gitDir });
const cleanup = async () => {
return fsRemove(tempDir, { recursive: true });
};
return {
gitDir,
cleanup
};
};
describe('cli flow to push git branch', () => {
  it('does nothing when OCO_GITPUSH is set to false', async () => {
    const { gitDir, cleanup } = await prepareEnvironment({ remotes: 0 });
    const server = await startMockOpenAiServer(
      'fix(push): keep the commit local when push is disabled'
    );
try {
await prepareRepo(
gitDir,
{
'index.ts': 'console.log("Hello World");\n'
},
{ stage: true }
);
const oco = await runCli([], {
cwd: gitDir,
env: getMockOpenAiEnv(server.baseUrl, {
OCO_GITPUSH: 'false'
})
});
await waitForCommitConfirmation(oco.findByText);
oco.userEvent.keyboard('[Enter]');
expect(await oco.findByText('Successfully committed')).toBeInTheConsole();
expect(
await oco.queryByText('Choose a remote to push to')
).not.toBeInTheConsole();
expect(
await oco.queryByText('Do you want to run `git push`?')
).not.toBeInTheConsole();
expect(
await oco.queryByText('Successfully pushed all commits to origin')
).not.toBeInTheConsole();
expect(
await oco.queryByText('Command failed with exit code 1')
).not.toBeInTheConsole();
expect(await waitForExit(oco)).toBe(0);
await assertHeadCommit(
gitDir,
'fix(push): keep the commit local when push is disabled'
);
} finally {
await server.cleanup();
await cleanup();
}
});
  it('fails after committing when push is enabled but there is no remote', async () => {
    const { gitDir, cleanup } = await prepareEnvironment({ remotes: 0 });
    const server = await startMockOpenAiServer(
      'fix(push): commit even when the push later fails'
    );
    try {
      await prepareRepo(
        gitDir,
        {
          'index.ts': 'console.log("Hello World");\n'
        },
        { stage: true }
      );
      const oco = await runCli([], {
        cwd: gitDir,
        env: getMockOpenAiEnv(server.baseUrl, {
          OCO_GITPUSH: 'true'
        })
      });
      await waitForCommitConfirmation(oco.findByText);
      oco.userEvent.keyboard('[Enter]');
      expect(
        await oco.queryByText('Choose a remote to push to')
      ).not.toBeInTheConsole();
      expect(
        await oco.queryByText('Do you want to run `git push`?')
      ).not.toBeInTheConsole();
      expect(
        await oco.queryByText('Successfully pushed all commits to origin')
      ).not.toBeInTheConsole();
      expect(
        await oco.findByText('Command failed with exit code 1')
      ).toBeInTheConsole();
      expect(await waitForExit(oco)).toBe(1);
      await assertHeadCommit(
        gitDir,
        'fix(push): commit even when the push later fails'
      );
    } finally {
      await server.cleanup();
      await cleanup();
    }
  });
  it('pushes to the only configured remote', async () => {
    const { gitDir, remoteDir, cleanup } = await prepareEnvironment({
      remotes: 1
    });
    const server = await startMockOpenAiServer(
      'feat(push): publish the commit to the only remote'
    );
    try {
      await prepareRepo(
        gitDir,
        {
          'index.ts': 'console.log("Hello World");\n'
        },
        { stage: true }
      );
      const oco = await runCli([], {
        cwd: gitDir,
        env: getMockOpenAiEnv(server.baseUrl, {
          OCO_GITPUSH: 'true'
        })
      });
      await waitForCommitConfirmation(oco.findByText);
      oco.userEvent.keyboard('[Enter]');
      expect(
        await oco.findByText('Do you want to run `git push`?')
      ).toBeInTheConsole();
      oco.userEvent.keyboard('[Enter]');
      expect(
        await oco.findByText('Successfully pushed all commits to origin')
      ).toBeInTheConsole();
      expect(await waitForExit(oco)).toBe(0);
      await assertHeadCommit(
        gitDir,
        'feat(push): publish the commit to the only remote'
      );
      expect(
        await getRemoteBranchHeadSubject(
          remoteDir!,
          await getCurrentBranchName(gitDir)
        )
      ).toBe('feat(push): publish the commit to the only remote');
    } finally {
      await server.cleanup();
      await cleanup();
    }
  });
  it('pushes to the selected remote when multiple remotes are configured', async () => {
    const { gitDir, remoteDir, cleanup } = await prepareEnvironment({
      remotes: 2
    });
    const server = await startMockOpenAiServer(
      'feat(push): choose a remote explicitly when several exist'
    );
    try {
      await prepareRepo(
        gitDir,
        {
          'index.ts': 'console.log("Hello World");\n'
        },
        { stage: true }
      );
      const oco = await runCli([], {
        cwd: gitDir,
        env: getMockOpenAiEnv(server.baseUrl, {
          OCO_GITPUSH: 'true'
        })
      });
      await waitForCommitConfirmation(oco.findByText);
      oco.userEvent.keyboard('[Enter]');
      expect(await oco.findByText('Choose a remote to push to')).toBeInTheConsole();
      oco.userEvent.keyboard('[Enter]');
      expect(
        await oco.findByText('Successfully pushed all commits to origin')
      ).toBeInTheConsole();
      expect(await waitForExit(oco)).toBe(0);
      await assertHeadCommit(
        gitDir,
        'feat(push): choose a remote explicitly when several exist'
      );
      expect(
        await getRemoteBranchHeadSubject(
          remoteDir!,
          await getCurrentBranchName(gitDir)
        )
      ).toBe('feat(push): choose a remote explicitly when several exist');
    } finally {
      await server.cleanup();
      await cleanup();
    }
  });
it("keeps the commit local when the user chooses 'don't push'", async () => {
const { gitDir, remoteDir, otherRemoteDir, cleanup } =
await prepareEnvironment({ remotes: 2 });
const server = await startMockOpenAiServer(
"fix(push): skip the remote step when the user chooses don't push"
);
try {
await prepareRepo(
gitDir,
{
'index.ts': 'console.log("Hello World");\n'
},
{ stage: true }
);
const oco = await runCli([], {
cwd: gitDir,
env: getMockOpenAiEnv(server.baseUrl, {
OCO_GITPUSH: 'true'
})
});
await waitForCommitConfirmation(oco.findByText);
oco.userEvent.keyboard('[Enter]');
expect(await oco.findByText('Choose a remote to push to')).toBeInTheConsole();
oco.userEvent.keyboard('[ArrowDown][ArrowDown][Enter]');
expect(
await oco.queryByText('Successfully pushed all commits to origin')
).not.toBeInTheConsole();
expect(await waitForExit(oco)).toBe(0);
await assertHeadCommit(
gitDir,
"fix(push): skip the remote step when the user chooses don't push"
);
const branchName = await getCurrentBranchName(gitDir);
expect(await remoteBranchExists(remoteDir!, branchName)).toBe(false);
expect(await remoteBranchExists(otherRemoteDir!, branchName)).toBe(false);
} finally {
await server.cleanup();
await cleanup();
}
});
});


@@ -1,12 +1,22 @@
import 'cli-testing-library/extend-expect';
import { prepareEnvironment, runCli, waitForExit } from './utils';
it('cli flow when there are no changes', async () => {
  const { gitDir, cleanup } = await prepareEnvironment();
  try {
    const oco = await runCli([], {
      cwd: gitDir,
      env: {
        OCO_AI_PROVIDER: 'openai',
        OCO_API_KEY: 'dummy-openai-key',
        OCO_GITPUSH: 'false'
      }
    });
    expect(await oco.findByText('No changes detected')).toBeInTheConsole();
    expect(await waitForExit(oco)).toBe(1);
  } finally {
    await cleanup();
  }
});


@@ -1,55 +1,125 @@
import 'cli-testing-library/extend-expect';
import {
  assertHeadCommit,
  getCurrentBranchName,
  getMockOpenAiEnv,
  getRemoteBranchHeadSubject,
  prepareEnvironment,
  prepareRepo,
  runCli,
  startMockOpenAiServer,
  appendRepoFile,
  waitForExit
} from './utils';
it('cli flow to generate commit message for 1 new file (staged)', async () => {
  const { gitDir, remoteDir, cleanup } = await prepareEnvironment();
  const server = await startMockOpenAiServer(
    'feat(cli): commit one staged file through the CLI'
  );
  try {
    await prepareRepo(
      gitDir,
      {
        'index.ts': 'console.log("Hello World");\n'
      },
      { stage: true }
    );
    const oco = await runCli([], {
      cwd: gitDir,
      env: getMockOpenAiEnv(server.baseUrl, {
        OCO_GITPUSH: 'true'
      })
    });
    expect(await oco.queryByText('No files are staged')).not.toBeInTheConsole();
    expect(
      await oco.queryByText(
        'Do you want to stage all files and generate commit message?'
      )
    ).not.toBeInTheConsole();
    expect(await oco.findByText('Generating the commit message')).toBeInTheConsole();
    expect(await oco.findByText('Confirm the commit message?')).toBeInTheConsole();
    oco.userEvent.keyboard('[Enter]');
    expect(await oco.findByText('Do you want to run `git push`?')).toBeInTheConsole();
    oco.userEvent.keyboard('[Enter]');
    expect(
      await oco.findByText('Successfully pushed all commits to origin')
    ).toBeInTheConsole();
    expect(await waitForExit(oco)).toBe(0);
    await assertHeadCommit(
      gitDir,
      'feat(cli): commit one staged file through the CLI'
    );
    expect(
      await getRemoteBranchHeadSubject(
        remoteDir!,
        await getCurrentBranchName(gitDir)
      )
    ).toBe('feat(cli): commit one staged file through the CLI');
  } finally {
    await server.cleanup();
    await cleanup();
  }
});
it('cli flow to generate commit message for 1 changed file (not staged)', async () => {
  const { gitDir, cleanup } = await prepareEnvironment();
  const server = await startMockOpenAiServer(
    'fix(cli): stage modified files before committing'
  );
  try {
    await prepareRepo(
      gitDir,
      {
        'index.ts': 'console.log("Hello World");\n'
      },
      {
        stage: true,
        commitMessage: 'add new file'
      }
    );
    appendRepoFile(gitDir, 'index.ts', 'console.log("Good night World");\n');
    const oco = await runCli([], {
      cwd: gitDir,
      env: getMockOpenAiEnv(server.baseUrl, {
        OCO_GITPUSH: 'true'
      })
    });
    expect(await oco.findByText('No files are staged')).toBeInTheConsole();
    expect(
      await oco.findByText(
        'Do you want to stage all files and generate commit message?'
      )
    ).toBeInTheConsole();
    oco.userEvent.keyboard('[Enter]');
    expect(await oco.findByText('Generating the commit message')).toBeInTheConsole();
    expect(await oco.findByText('Confirm the commit message?')).toBeInTheConsole();
    oco.userEvent.keyboard('[Enter]');
    expect(await oco.findByText('Successfully committed')).toBeInTheConsole();
    expect(await oco.findByText('Do you want to run `git push`?')).toBeInTheConsole();
    oco.userEvent.keyboard('[Enter]');
    expect(
      await oco.findByText('Successfully pushed all commits to origin')
    ).toBeInTheConsole();
    expect(await waitForExit(oco)).toBe(0);
    await assertHeadCommit(
      gitDir,
      'fix(cli): stage modified files before committing'
    );
  } finally {
    await server.cleanup();
    await cleanup();
  }
});


@@ -1,151 +1,145 @@
import { cpSync } from 'fs';
import path from 'path';
import { execFile } from 'child_process';
import { promisify } from 'util';
import 'cli-testing-library/extend-expect';
import {
  assertHeadCommit,
  prepareEnvironment,
  prepareRepo,
  runCli,
  waitForExit
} from '../utils';
const execFileAsync = promisify(execFile);
const getFixturePath = (version: 9 | 18 | 19, fileName: string) =>
  path.resolve(
    process.cwd(),
    `test/e2e/prompt-module/data/commitlint_${version}/${fileName}`
  );
const getPromptModuleEnv = (
  mockType: 'commit-message' | 'prompt-module-commitlint-config'
): NodeJS.ProcessEnv => ({
  OCO_TEST_MOCK_TYPE: mockType,
  OCO_PROMPT_MODULE: '@commitlint',
  OCO_AI_PROVIDER: 'test',
  OCO_GITPUSH: 'true'
});
async function setupCommitlint(dir: string, version: 9 | 18 | 19) {
  cpSync(getFixturePath(version, 'node_modules'), path.join(dir, 'node_modules'), {
    recursive: true
  });
  cpSync(getFixturePath(version, 'package.json'), path.join(dir, 'package.json'));
  cpSync(
    getFixturePath(version, 'commitlint.config.js'),
    path.join(dir, 'commitlint.config.js')
  );
}
async function assertInstalledCommitlintVersion(
  cwd: string,
  version: string
): Promise<void> {
  const { stdout = '', stderr = '' } = await execFileAsync(
    'npm',
    ['list', '@commitlint/load'],
    { cwd }
  );
  expect(`${stdout}\n${stderr}`).toContain(`@commitlint/load@${version}`);
}
describe('cli flow to run "oco commitlint force"', () => {
  it('on commitlint@9 using CJS', async () => {
    const { gitDir, cleanup } = await prepareEnvironment();
    try {
      await setupCommitlint(gitDir, 9);
      await assertInstalledCommitlintVersion(gitDir, '9');
      const oco = await runCli(['commitlint', 'force'], {
        cwd: gitDir,
        env: getPromptModuleEnv('prompt-module-commitlint-config')
      });
      expect(
        await oco.findByText('opencommit — configure @commitlint')
      ).toBeInTheConsole();
      expect(
        await oco.findByText('Read @commitlint configuration')
      ).toBeInTheConsole();
      expect(
        await oco.findByText('Generating consistency with given @commitlint rules')
      ).toBeInTheConsole();
      expect(
        await oco.findByText('Done - please review contents of')
      ).toBeInTheConsole();
      expect(await waitForExit(oco)).toBe(0);
    } finally {
      await cleanup();
    }
  });
  it('on commitlint@18 using CJS', async () => {
    const { gitDir, cleanup } = await prepareEnvironment();
    try {
      await setupCommitlint(gitDir, 18);
      await assertInstalledCommitlintVersion(gitDir, '18');
      const oco = await runCli(['commitlint', 'force'], {
        cwd: gitDir,
        env: getPromptModuleEnv('prompt-module-commitlint-config')
      });
      expect(
        await oco.findByText('opencommit — configure @commitlint')
      ).toBeInTheConsole();
      expect(
        await oco.findByText('Read @commitlint configuration')
      ).toBeInTheConsole();
      expect(
        await oco.findByText('Generating consistency with given @commitlint rules')
      ).toBeInTheConsole();
      expect(
        await oco.findByText('Done - please review contents of')
      ).toBeInTheConsole();
      expect(await waitForExit(oco)).toBe(0);
    } finally {
      await cleanup();
    }
  });
  it('on commitlint@19 using ESM', async () => {
    const { gitDir, cleanup } = await prepareEnvironment();
    try {
      await setupCommitlint(gitDir, 19);
      await assertInstalledCommitlintVersion(gitDir, '19');
      const oco = await runCli(['commitlint', 'force'], {
        cwd: gitDir,
        env: getPromptModuleEnv('prompt-module-commitlint-config')
      });
      expect(
        await oco.findByText('opencommit — configure @commitlint')
      ).toBeInTheConsole();
      expect(
        await oco.findByText('Read @commitlint configuration')
      ).toBeInTheConsole();
      expect(
        await oco.findByText('Generating consistency with given @commitlint rules')
      ).toBeInTheConsole();
      expect(
        await oco.findByText('Done - please review contents of')
      ).toBeInTheConsole();
      expect(await waitForExit(oco)).toBe(0);
    } finally {
      await cleanup();
    }
  });
});
@@ -153,75 +147,57 @@ describe('cli flow to generate commit message using @commitlint prompt-module',
  it('on commitlint@19 using ESM', async () => {
    const { gitDir, cleanup } = await prepareEnvironment();
    try {
      // Setup commitlint@19
      await setupCommitlint(gitDir, 19);
      await assertInstalledCommitlintVersion(gitDir, '19');
      // Run `oco commitlint force`
      const commitlintForce = await runCli(['commitlint', 'force'], {
        cwd: gitDir,
        env: getPromptModuleEnv('prompt-module-commitlint-config')
      });
      expect(
        await commitlintForce.findByText('Done - please review contents of')
      ).toBeInTheConsole();
      expect(await waitForExit(commitlintForce)).toBe(0);
      // Run `oco commitlint get`
      const commitlintGet = await runCli(['commitlint', 'get'], {
        cwd: gitDir,
        env: getPromptModuleEnv('prompt-module-commitlint-config')
      });
      expect(await commitlintGet.findByText('consistency')).toBeInTheConsole();
      expect(await waitForExit(commitlintGet)).toBe(0);
      // Run 'oco' using .opencommit-commitlint
      await prepareRepo(
        gitDir,
        {
          'index.ts': 'console.log("Hello World");\n'
        },
        { stage: true }
      );
      const oco = await runCli([], {
        cwd: gitDir,
        env: getPromptModuleEnv('commit-message')
      });
      expect(
        await oco.findByText('Generating the commit message')
      ).toBeInTheConsole();
      expect(
        await oco.findByText('Confirm the commit message?')
      ).toBeInTheConsole();
      oco.userEvent.keyboard('[Enter]');
      expect(
        await oco.findByText('Do you want to run `git push`?')
      ).toBeInTheConsole();
      oco.userEvent.keyboard('[Enter]');
      expect(
        await oco.findByText('Successfully pushed all commits to origin')
      ).toBeInTheConsole();
      expect(await waitForExit(oco)).toBe(0);
      await assertHeadCommit(gitDir, 'fix(testAi.ts): test commit message');
    } finally {
      await cleanup();
    }
  });
});


@@ -1,5 +1,7 @@
#!/bin/sh
set -eu
current_dir=$(pwd)
setup_dir="$(cd "$(dirname "$0")" && pwd)"

test/e2e/smoke.test.ts Normal file

@@ -0,0 +1,26 @@
import packageJson from '../../package.json';
import 'cli-testing-library/extend-expect';
import { runCli, waitForExit } from './utils';
it('prints help without entering the interactive flow', async () => {
const help = await runCli(['--help'], {
cwd: process.cwd()
});
expect(await help.findByText('opencommit')).toBeInTheConsole();
expect(await help.findByText('--context')).toBeInTheConsole();
expect(await help.findByText('--yes')).toBeInTheConsole();
expect(await help.queryByText('Select your AI provider:')).not.toBeInTheConsole();
expect(await help.queryByText('Enter your API key:')).not.toBeInTheConsole();
expect(await waitForExit(help)).toBe(0);
});
it('prints the current version without booting the CLI runtime', async () => {
const version = await runCli(['--version'], {
cwd: process.cwd()
});
expect(await version.findByText(packageJson.version)).toBeInTheConsole();
expect(await version.queryByText('Generating the commit message')).not.toBeInTheConsole();
expect(await waitForExit(version)).toBe(0);
});


@@ -1,37 +1,441 @@
import path from 'path';
import {
  appendFileSync,
  existsSync,
  mkdirSync,
  mkdtemp,
  rm,
  writeFileSync
} from 'fs';
import http from 'http';
import { tmpdir } from 'os';
import { execFile } from 'child_process';
import { promisify } from 'util';
import type { AddressInfo } from 'net';
import { render } from 'cli-testing-library';
import type { RenderResult } from 'cli-testing-library';
const fsMakeTempDir = promisify(mkdtemp);
const fsExecFile = promisify(execFile);
const fsRemove = promisify(rm);
const CLI_PATH = path.resolve(process.cwd(), 'out/cli.cjs');
const DEFAULT_TEST_ENV = {
OCO_TEST_SKIP_VERSION_CHECK: 'true'
};
const COMPLETED_MIGRATIONS = [
'00_use_single_api_key_and_url',
'01_remove_obsolete_config_keys_from_global_file',
'02_set_missing_default_values'
];
type ProcessOptions = {
cwd: string;
env?: NodeJS.ProcessEnv;
};
type PrepareEnvironmentOptions = {
remotes?: 0 | 1 | 2;
};
export const getCliPath = () => CLI_PATH;
export const runProcess = async (
command: string,
args: string[] = [],
{ cwd, env = {} }: ProcessOptions
): Promise<RenderResult> => {
return render(command, args, {
cwd,
spawnOpts: {
env: {
...process.env,
...DEFAULT_TEST_ENV,
...env
}
}
});
};
export const runCli = async (
args: string[] = [],
options: ProcessOptions
): Promise<RenderResult> => {
return runProcess(process.execPath, [getCliPath(), ...args], options);
};
export const runGit = async (
args: string[],
cwd: string
): Promise<{ stdout: string; stderr: string }> => {
const { stdout = '', stderr = '' } = await fsExecFile('git', args, { cwd });
return { stdout, stderr };
};
export const configureGitUser = async (gitDir: string): Promise<void> => {
await runGit(['config', 'user.email', 'test@example.com'], gitDir);
await runGit(['config', 'user.name', 'Test User'], gitDir);
};
/**
 * Prepare the environment for the test:
 * create a temporary git repository (and optional bare remotes) in the temp directory.
 */
export const prepareEnvironment = async ({
  remotes = 1
}: PrepareEnvironmentOptions = {}): Promise<{
tempDir: string;
gitDir: string;
remoteDir?: string;
otherRemoteDir?: string;
cleanup: () => Promise<void>;
}> => {
  const tempDir = await prepareTempDir();
  const gitDir = path.resolve(tempDir, 'test');
let remoteDir: string | undefined;
let otherRemoteDir: string | undefined;
if (remotes === 0) {
await fsExecFile('git', ['init', 'test'], { cwd: tempDir });
} else {
await fsExecFile('git', ['init', '--bare', 'remote.git'], {
cwd: tempDir
});
remoteDir = path.resolve(tempDir, 'remote.git');
if (remotes === 2) {
await fsExecFile('git', ['init', '--bare', 'other.git'], {
cwd: tempDir
});
otherRemoteDir = path.resolve(tempDir, 'other.git');
}
await fsExecFile('git', ['clone', 'remote.git', 'test'], { cwd: tempDir });
if (remotes === 2) {
await runGit(['remote', 'add', 'other', '../other.git'], gitDir);
}
}
await configureGitUser(gitDir);
  const cleanup = async () => {
    if (existsSync(tempDir)) {
      await fsRemove(tempDir, { force: true, recursive: true });
    }
  };
  return {
    tempDir,
    gitDir,
    remoteDir,
    otherRemoteDir,
    cleanup
  };
};
export const prepareTempDir = async (): Promise<string> => {
return fsMakeTempDir(path.join(tmpdir(), 'opencommit-test-'));
};
export const prepareRepo = async (
  gitDir: string,
  files: Record<string, string>,
  options: {
    stage?: string[] | true;
    commitMessage?: string;
  } = {}
): Promise<void> => {
  for (const [relativePath, content] of Object.entries(files)) {
    writeRepoFile(gitDir, relativePath, content);
  }
  const stageFiles =
    options.stage === true
      ? Object.keys(files)
      : Array.isArray(options.stage)
      ? options.stage
      : options.commitMessage
      ? Object.keys(files)
      : [];
  if (stageFiles.length > 0) {
    await runGit(['add', ...stageFiles], gitDir);
  }
  if (options.commitMessage) {
    await runGit(['commit', '-m', options.commitMessage], gitDir);
  }
};
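The staging rule inside `prepareRepo` is a chained ternary that is easy to misread. A minimal standalone sketch of the same selection logic (the name `resolveStagedFiles` is illustrative, not part of the suite):

```typescript
// Pure version of the staging-selection rule: stage === true stages every
// prepared file, an explicit array stages only those paths, and when only a
// commitMessage is given all files are staged so the commit has content.
type StageOptions = { stage?: string[] | true; commitMessage?: string };

function resolveStagedFiles(files: string[], options: StageOptions): string[] {
  return options.stage === true
    ? files
    : Array.isArray(options.stage)
    ? options.stage
    : options.commitMessage
    ? files
    : [];
}

console.log(resolveStagedFiles(['a.ts', 'b.ts'], { stage: true }).join(',')); // a.ts,b.ts
console.log(resolveStagedFiles(['a.ts', 'b.ts'], { stage: ['a.ts'] }).join(',')); // a.ts
console.log(resolveStagedFiles(['a.ts'], { commitMessage: 'init' }).join(',')); // a.ts
console.log(resolveStagedFiles(['a.ts'], {}).length); // 0
```

Writing the rule as a pure function also makes it trivially unit-testable without a real git repository.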
export const writeRepoFile = (
gitDir: string,
relativePath: string,
content: string
): void => {
const filePath = path.resolve(gitDir, relativePath);
mkdirSync(path.dirname(filePath), { recursive: true });
writeFileSync(filePath, content);
};
export const appendRepoFile = (
gitDir: string,
relativePath: string,
content: string
): void => {
const filePath = path.resolve(gitDir, relativePath);
mkdirSync(path.dirname(filePath), { recursive: true });
appendFileSync(filePath, content);
};
export const writeGlobalConfig = (
homeDir: string,
lines: string[]
): string => {
const configPath = path.resolve(homeDir, '.opencommit');
writeFileSync(configPath, lines.join('\n'));
return configPath;
};
export const seedMigrations = (
homeDir: string,
completedMigrations: string[] = COMPLETED_MIGRATIONS
): string => {
const migrationsPath = path.resolve(homeDir, '.opencommit_migrations');
writeFileSync(migrationsPath, JSON.stringify(completedMigrations));
return migrationsPath;
};
export const seedModelCache = async (
homeDir: string,
models: Record<string, string[]>
): Promise<void> => {
const modelCachePath = path.resolve(homeDir, '.opencommit-models.json');
writeFileSync(
modelCachePath,
JSON.stringify(
{
timestamp: Date.now(),
models
},
null,
2
)
);
};
export const getMockOpenAiEnv = (
baseUrl: string,
overrides: NodeJS.ProcessEnv = {}
): NodeJS.ProcessEnv => ({
OCO_AI_PROVIDER: 'openai',
OCO_API_KEY: 'test-openai-key',
OCO_MODEL: 'gpt-4o-mini',
OCO_API_URL: baseUrl,
OCO_GITPUSH: 'false',
...overrides
});
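`getMockOpenAiEnv` relies on object-spread ordering to let callers override its defaults; a small sketch of that precedence rule (variable names here are illustrative):

```typescript
// Object-spread precedence: keys listed later overwrite earlier ones, which
// is how an `overrides` argument spread after the defaults replaces them.
const defaults = { OCO_GITPUSH: 'false', OCO_MODEL: 'gpt-4o-mini' };
const merged = { ...defaults, OCO_GITPUSH: 'true' };

console.log(merged.OCO_GITPUSH); // true (override wins)
console.log(merged.OCO_MODEL); // gpt-4o-mini (default preserved)
```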
export const wait = (ms: number) =>
new Promise((resolve) => setTimeout(resolve, ms));
export const waitForExit = async (
instance: RenderResult,
timeoutMs: number = 10_000
): Promise<number> => {
const startedAt = Date.now();
while (Date.now() - startedAt < timeoutMs) {
const exit = instance.hasExit();
if (exit) {
return exit.exitCode;
}
await wait(25);
}
throw new Error('Process did not exit within the expected timeout');
};
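`waitForExit` above polls `hasExit()` in a loop rather than subscribing to an exit event. The same poll-until-ready pattern can be sketched in isolation (the `pollUntil` helper below is illustrative, not part of the suite):

```typescript
// Poll a probe function until it yields a defined value or a timeout elapses,
// mirroring the hasExit() loop used by waitForExit.
const wait = (ms: number) => new Promise<void>((resolve) => setTimeout(resolve, ms));

async function pollUntil<T>(
  probe: () => T | undefined,
  timeoutMs = 1000,
  intervalMs = 10
): Promise<T> {
  const startedAt = Date.now();
  while (Date.now() - startedAt < timeoutMs) {
    const result = probe();
    if (result !== undefined) {
      return result;
    }
    await wait(intervalMs);
  }
  throw new Error('Condition not met within the expected timeout');
}

// Usage: resolve once a fake process reports its exit code.
let exitCode: number | undefined;
setTimeout(() => {
  exitCode = 0;
}, 30);
pollUntil(() => exitCode).then((code) => console.log(`exit: ${code}`));
```

Polling keeps the helper agnostic of how the underlying library signals completion, at the cost of up to one interval of extra latency.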
export const getHeadCommitSubject = async (gitDir: string): Promise<string> => {
const { stdout } = await runGit(['log', '-1', '--pretty=%s'], gitDir);
return stdout.trim();
};
export const getHeadCommitFiles = async (gitDir: string): Promise<string[]> => {
const { stdout } = await runGit(
['diff-tree', '--root', '--no-commit-id', '--name-only', '-r', 'HEAD'],
gitDir
);
return stdout
.split('\n')
.map((file) => file.trim())
.filter(Boolean)
.sort();
};
export const getShortGitStatus = async (gitDir: string): Promise<string> => {
const { stdout } = await runGit(['status', '--short'], gitDir);
return stdout.trim();
};
export const getCurrentBranchName = async (gitDir: string): Promise<string> => {
const { stdout } = await runGit(['branch', '--show-current'], gitDir);
return stdout.trim();
};
export const getRemoteBranchHeadSubject = async (
remoteGitDir: string,
branchName: string
): Promise<string> => {
const { stdout = '' } = await fsExecFile(
'git',
['--git-dir', remoteGitDir, 'log', '-1', '--pretty=%s', `refs/heads/${branchName}`],
{ cwd: process.cwd() }
);
return stdout.trim();
};
export const remoteBranchExists = async (
remoteGitDir: string,
branchName: string
): Promise<boolean> => {
try {
await fsExecFile(
'git',
[
'--git-dir',
remoteGitDir,
'rev-parse',
'--verify',
'--quiet',
`refs/heads/${branchName}`
],
{ cwd: process.cwd() }
);
return true;
} catch {
return false;
}
};
export const assertHeadCommit = async (
gitDir: string,
expectedSubject: string
): Promise<void> => {
expect(await getHeadCommitSubject(gitDir)).toBe(expectedSubject);
};
export const assertGitStatus = async (
gitDir: string,
expected: string | RegExp
): Promise<void> => {
const status = await getShortGitStatus(gitDir);
if (typeof expected === 'string') {
expect(status).toContain(expected);
return;
}
expect(status).toMatch(expected);
};
export const startMockOpenAiServer = async (
response:
| string
| ((request: {
authorization?: string;
body: Record<string, any> | undefined;
requestIndex: number;
}) => {
status?: number;
body: Record<string, any>;
headers?: Record<string, string>;
})
): Promise<{
authHeaders: string[];
requestBodies: Array<Record<string, any>>;
baseUrl: string;
cleanup: () => Promise<void>;
}> => {
const authHeaders: string[] = [];
const requestBodies: Array<Record<string, any>> = [];
const server = http.createServer((req, res) => {
const authorization = req.headers.authorization;
if (authorization) {
authHeaders.push(
Array.isArray(authorization) ? authorization[0] : authorization
);
}
const chunks: Buffer[] = [];
req.on('data', (chunk) => {
chunks.push(Buffer.isBuffer(chunk) ? chunk : Buffer.from(chunk));
});
req.on('end', () => {
const rawBody = Buffer.concat(chunks).toString('utf8');
let parsedBody: Record<string, any> | undefined;
if (rawBody) {
try {
parsedBody = JSON.parse(rawBody);
requestBodies.push(parsedBody);
} catch {
requestBodies.push({ rawBody });
}
}
if (req.method === 'POST' && req.url?.includes('/chat/completions')) {
const payload =
typeof response === 'string'
? {
status: 200,
body: {
choices: [
{
message: {
content: response
}
}
]
}
}
: response({
authorization: Array.isArray(authorization)
? authorization[0]
: authorization,
body: parsedBody,
requestIndex: requestBodies.length - 1
});
res.writeHead(payload.status ?? 200, {
'Content-Type': 'application/json',
...payload.headers
});
res.end(JSON.stringify(payload.body));
return;
}
res.writeHead(404, { 'Content-Type': 'application/json' });
res.end(JSON.stringify({ error: 'not found' }));
});
});
await new Promise<void>((resolve) => {
server.listen(0, '127.0.0.1', () => resolve());
});
const { port } = server.address() as AddressInfo;
return {
authHeaders,
requestBodies,
baseUrl: `http://127.0.0.1:${port}/v1`,
cleanup: () =>
new Promise<void>((resolve, reject) => {
server.close((error) => {
if (error) {
reject(error);
return;
}
resolve();
});
})
};
};


@@ -6,6 +6,7 @@ import { configure } from 'cli-testing-library';
global.jest = jest;
/**
 * CLI rendering gets noticeably slower under coverage and on CI, so keep a
 * slightly roomier timeout than the library default.
 */
configure({ asyncUtilTimeout: 5000 });


@@ -199,6 +199,48 @@ describe('config', () => {
expect(config).not.toEqual(null);
expect(config.OCO_API_KEY).toEqual(undefined);
});
it('should not create a global config file when only reading defaults', async () => {
globalConfigFile = await generateConfig('.opencommit', {});
rmSync(globalConfigFile.filePath);
const config = getConfig({
globalPath: globalConfigFile.filePath
});
expect(config.OCO_MODEL).toEqual(DEFAULT_CONFIG.OCO_MODEL);
expect(existsSync(globalConfigFile.filePath)).toBe(false);
});
it('should not materialize ambient proxy env vars into OCO_PROXY', async () => {
process.env.HTTPS_PROXY = 'http://127.0.0.1:7890';
globalConfigFile = await generateConfig('.opencommit', {});
envConfigFile = await generateConfig('.env', {});
const config = getConfig({
globalPath: globalConfigFile.filePath,
envPath: envConfigFile.filePath
});
expect(config.OCO_PROXY).toEqual(undefined);
});
it('should parse OCO_PROXY=null from local .env as explicit disable', async () => {
globalConfigFile = await generateConfig('.opencommit', {
OCO_PROXY: 'http://global-proxy:8080'
});
envConfigFile = await generateConfig('.env', {
OCO_PROXY: 'null'
});
const config = getConfig({
globalPath: globalConfigFile.filePath,
envPath: envConfigFile.filePath
});
expect(config.OCO_PROXY).toEqual(null);
});
});
describe('setConfig', () => {
@@ -325,5 +367,20 @@ describe('config', () => {
const fileContent2 = readFileSync(globalConfigFile.filePath, 'utf8');
expect(fileContent2).toContain('OCO_MODEL=gpt-4');
});
it('should persist OCO_PROXY=null as an explicit disable', async () => {
await setConfig(
[[CONFIG_KEYS.OCO_PROXY, null]],
globalConfigFile.filePath
);
const config = getConfig({
globalPath: globalConfigFile.filePath
});
const fileContent = readFileSync(globalConfigFile.filePath, 'utf8');
expect(config.OCO_PROXY).toEqual(null);
expect(fileContent).toContain('OCO_PROXY=null');
});
});
});
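The `OCO_PROXY` tests above encode a three-state scheme: unset (fall back to ambient env later), an explicit proxy URL, and the literal string `null` meaning "explicitly disabled". A minimal sketch of that parsing rule, using a hypothetical `parseProxyValue` helper (not the project's actual function name):

```typescript
// Hypothetical helper illustrating the three OCO_PROXY states the config
// tests above pin down; the real parsing lives in src/commands/config.
function parseProxyValue(raw: string | undefined): string | null | undefined {
  if (raw === undefined || raw === '') return undefined; // unset: resolve later
  if (raw === 'null') return null; // explicit disable
  return raw; // explicit proxy URL
}
```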

test/unit/errors.test.ts (new file, 29 lines)

@@ -0,0 +1,29 @@
import {
formatUserFriendlyError,
ServiceUnavailableError
} from '../../src/utils/errors';
describe('formatUserFriendlyError', () => {
it('should keep provider wording when no custom API URL is configured', () => {
const formatted = formatUserFriendlyError(
new ServiceUnavailableError('openai'),
'openai'
);
expect(formatted.message).toEqual(
'The openai service is temporarily unavailable.'
);
});
it('should use configured endpoint wording when a custom API URL is provided', () => {
const formatted = formatUserFriendlyError(
new ServiceUnavailableError('openai'),
'openai',
{ baseURL: 'http://127.0.0.1:1234/v1' }
);
expect(formatted.message).toContain('configured API endpoint');
expect(formatted.message).toContain('127.0.0.1:1234');
expect(formatted.message).not.toContain('openai service');
});
});
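The wording rule these two tests assert can be sketched as follows. This is an illustrative reimplementation, not the actual body of `formatUserFriendlyError` in `src/utils/errors`:

```typescript
// Sketch of the endpoint-aware message selection the tests above assert:
// a configured base URL switches the wording from the provider name to the
// configured endpoint's host.
function unavailableMessage(provider: string, baseURL?: string): string {
  if (baseURL) {
    const { host } = new URL(baseURL);
    return `The configured API endpoint (${host}) is temporarily unavailable.`;
  }
  return `The ${provider} service is temporarily unavailable.`;
}
```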


@@ -1,96 +1,65 @@
import { OpenAI } from 'openai';
import { GeminiEngine } from '../../src/engine/gemini';
describe('GeminiEngine', () => {
it('maps OpenAI-style chat messages into Gemini request payloads', async () => {
const engine = new GeminiEngine({
apiKey: 'mock-api-key',
model: 'gemini-1.5-flash',
baseURL: 'http://127.0.0.1:8080/v1',
maxTokensOutput: 256,
maxTokensInput: 4096
});
const generateContent = jest.fn().mockResolvedValue({
response: {
text: () => 'feat(gemini): translate the diff<think>hidden</think>'
}
});
const getGenerativeModel = jest.fn().mockReturnValue({
generateContent
});
engine.client = {
getGenerativeModel
} as any;
const messages: Array<OpenAI.Chat.Completions.ChatCompletionMessageParam> = [
{ role: 'system', content: 'system message' },
{ role: 'assistant', content: 'assistant guidance' },
{ role: 'user', content: 'diff --git a/file b/file' }
];
const result = await engine.generateCommitMessage(messages);
expect(result).toEqual('feat(gemini): translate the diff');
expect(getGenerativeModel).toHaveBeenCalledWith(
{
model: 'gemini-1.5-flash',
systemInstruction: 'system message'
},
{
baseUrl: 'http://127.0.0.1:8080/v1'
}
);
expect(generateContent).toHaveBeenCalledWith(
expect.objectContaining({
contents: [
{
parts: [{ text: 'assistant guidance' }],
role: 'model'
},
{
parts: [{ text: 'diff --git a/file b/file' }],
role: 'user'
}
],
generationConfig: expect.objectContaining({
maxOutputTokens: 256,
temperature: 0,
topP: 0.1
})
})
);
});
});
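The message translation asserted above can be sketched as a standalone function: the system message becomes `systemInstruction`, assistant turns map to Gemini's `model` role, and each content string is wrapped in a `parts` array. A hypothetical `toGeminiRequest` helper (not the engine's actual internal name):

```typescript
// Sketch of the OpenAI-to-Gemini message mapping the test above asserts.
type ChatMessage = { role: 'system' | 'assistant' | 'user'; content: string };

function toGeminiRequest(messages: ChatMessage[]) {
  const systemInstruction = messages.find((m) => m.role === 'system')?.content;
  const contents = messages
    .filter((m) => m.role !== 'system')
    .map((m) => ({
      role: m.role === 'assistant' ? 'model' : 'user',
      parts: [{ text: m.content }]
    }));
  return { systemInstruction, contents };
}
```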

test/unit/ollama.test.ts (new file, 64 lines)

@@ -0,0 +1,64 @@
import { OllamaEngine } from '../../src/engine/ollama';
describe('OllamaEngine', () => {
it('sends think=false when configured', async () => {
const engine = new OllamaEngine({
apiKey: 'ollama',
model: 'qwen3.5:2b',
maxTokensOutput: 500,
maxTokensInput: 4096,
ollamaThink: false
});
const post = jest.fn().mockResolvedValue({
data: {
message: {
content: 'feat: add support for ollama think config'
}
}
});
engine.client = { post } as any;
await engine.generateCommitMessage([
{ role: 'user', content: 'diff --git a/file b/file' }
]);
expect(post).toHaveBeenCalledWith(
'http://localhost:11434/api/chat',
expect.objectContaining({
think: false
})
);
});
it('omits think when not configured', async () => {
const engine = new OllamaEngine({
apiKey: 'ollama',
model: 'qwen3.5:2b',
maxTokensOutput: 500,
maxTokensInput: 4096
});
const post = jest.fn().mockResolvedValue({
data: {
message: {
content: 'feat: add support for ollama think config'
}
}
});
engine.client = { post } as any;
await engine.generateCommitMessage([
{ role: 'user', content: 'diff --git a/file b/file' }
]);
expect(post).toHaveBeenCalledWith(
'http://localhost:11434/api/chat',
expect.not.objectContaining({
think: expect.anything()
})
);
});
});
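The request-body rule these two tests encode is that `think` is forwarded only when `OCO_OLLAMA_THINK` was configured, never defaulted. A sketch with a hypothetical `buildOllamaBody` helper (the real payload construction lives in `src/engine/ollama`):

```typescript
// Sketch of the think-flag handling asserted above: an undefined think value
// is omitted from the request body entirely rather than sent as a default.
function buildOllamaBody(
  model: string,
  messages: unknown[],
  think?: boolean
): Record<string, unknown> {
  const body: Record<string, unknown> = { model, messages };
  if (think !== undefined) body.think = think;
  return body;
}
```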


@@ -1,26 +1,71 @@
import { OpenAI } from 'openai';
import { OpenAiEngine } from '../../src/engine/openAi';
describe('OpenAiEngine', () => {
const baseConfig = {
apiKey: 'test-openai-key',
maxTokensInput: 4096,
maxTokensOutput: 256
};
const messages: Array<OpenAI.Chat.Completions.ChatCompletionMessageParam> = [
{ role: 'system', content: 'system message' },
{ role: 'user', content: 'diff --git a/file b/file' }
];
it('uses max_completion_tokens for reasoning models', async () => {
const engine = new OpenAiEngine({
...baseConfig,
model: 'o3-mini'
});
const create = jest
.spyOn(engine.client.chat.completions, 'create')
.mockResolvedValue({
choices: [{ message: { content: 'feat(openai): reasoning path' } }]
} as any);
await engine.generateCommitMessage(messages);
expect(create).toHaveBeenCalledWith(
expect.objectContaining({
model: 'o3-mini',
max_completion_tokens: 256
})
);
expect(create).toHaveBeenCalledWith(
expect.not.objectContaining({
max_tokens: expect.anything()
})
);
});
it('uses max_tokens and sampling params for non-reasoning models', async () => {
const engine = new OpenAiEngine({
...baseConfig,
model: 'gpt-4o-mini'
});
const create = jest
.spyOn(engine.client.chat.completions, 'create')
.mockResolvedValue({
choices: [{ message: { content: 'feat(openai): standard path' } }]
} as any);
await engine.generateCommitMessage(messages);
expect(create).toHaveBeenCalledWith(
expect.objectContaining({
model: 'gpt-4o-mini',
max_tokens: 256,
temperature: 0,
top_p: 0.1
})
);
expect(create).toHaveBeenCalledWith(
expect.not.objectContaining({
max_completion_tokens: expect.anything()
})
);
});
});
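The parameter split these two tests assert can be sketched with the detection regex the previous revision of this file tested directly; the real selection logic lives in `src/engine/openAi`:

```typescript
// Reasoning models (o1/o3/..., gpt-5 family) take max_completion_tokens and
// no sampling params; other models take max_tokens plus temperature/top_p.
const REASONING_MODEL_RE = /^(o[1-9]|gpt-5)/;

function tokenParams(model: string, maxTokensOutput: number) {
  return REASONING_MODEL_RE.test(model)
    ? { max_completion_tokens: maxTokensOutput }
    : { max_tokens: maxTokensOutput, temperature: 0, top_p: 0.1 };
}
```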

test/unit/proxy.test.ts (new file, 126 lines)

@@ -0,0 +1,126 @@
import axios from 'axios';
import { getGlobalDispatcher } from 'undici';
import { AnthropicEngine } from '../../src/engine/anthropic';
import { OpenAiEngine } from '../../src/engine/openAi';
import { resolveProxy, setupProxy } from '../../src/utils/proxy';
describe('proxy utilities', () => {
const originalEnv = { ...process.env };
const originalAxiosProxy = axios.defaults.proxy;
const originalAxiosHttpAgent = axios.defaults.httpAgent;
const originalAxiosHttpsAgent = axios.defaults.httpsAgent;
function resetEnv(env: NodeJS.ProcessEnv) {
Object.keys(process.env).forEach((key) => {
if (!(key in env)) {
delete process.env[key];
} else {
process.env[key] = env[key];
}
});
}
beforeEach(() => {
resetEnv(originalEnv);
setupProxy(undefined);
});
afterEach(() => {
resetEnv(originalEnv);
setupProxy(undefined);
axios.defaults.proxy = originalAxiosProxy;
axios.defaults.httpAgent = originalAxiosHttpAgent;
axios.defaults.httpsAgent = originalAxiosHttpsAgent;
});
it('should prefer an explicit proxy URL over ambient proxy env vars', () => {
process.env.HTTPS_PROXY = 'http://ambient-proxy:8080';
expect(resolveProxy('http://explicit-proxy:8080')).toEqual(
'http://explicit-proxy:8080'
);
});
it('should return null when proxy is explicitly disabled', () => {
process.env.HTTPS_PROXY = 'http://ambient-proxy:8080';
expect(resolveProxy(null)).toEqual(null);
});
it('should fall back to ambient proxy env vars when proxy is unset', () => {
process.env.HTTPS_PROXY = 'http://ambient-proxy:8080';
expect(resolveProxy(undefined)).toEqual('http://ambient-proxy:8080');
});
it('should disable proxy usage when setupProxy receives null', () => {
process.env.HTTPS_PROXY = 'http://ambient-proxy:8080';
setupProxy(null);
expect(getGlobalDispatcher().constructor.name).toEqual('Agent');
expect(axios.defaults.proxy).toEqual(false);
expect(axios.defaults.httpAgent).toBeUndefined();
expect(axios.defaults.httpsAgent).toBeUndefined();
});
it('should install proxy agents when setupProxy receives a proxy URL', () => {
setupProxy('http://127.0.0.1:7890');
expect(getGlobalDispatcher().constructor.name).toEqual('ProxyAgent');
expect(axios.defaults.proxy).toEqual(false);
expect(axios.defaults.httpAgent).toBeDefined();
expect(axios.defaults.httpsAgent).toBeDefined();
});
});
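The three `resolveProxy` tests above establish a precedence order: an explicit URL wins, explicit `null` disables the proxy even when ambient env vars are set, and only an unset value falls back to the environment. A sketch under those assumptions (the exact env var names checked, beyond `HTTPS_PROXY`, are a guess):

```typescript
// Sketch of the precedence the proxy-utility tests above assert:
// explicit URL > explicit null (disable) > ambient proxy env vars.
function resolveProxySketch(
  configured: string | null | undefined,
  env: Record<string, string | undefined> = process.env
): string | null {
  if (configured === null) return null; // explicit disable
  if (configured) return configured; // explicit proxy wins
  return env.HTTPS_PROXY ?? env.https_proxy ?? env.HTTP_PROXY ?? null;
}
```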
describe('engine proxy handling', () => {
const originalEnv = { ...process.env };
const baseConfig = {
apiKey: 'test-key',
model: 'gpt-4o-mini',
maxTokensInput: 4096,
maxTokensOutput: 256
};
function resetEnv(env: NodeJS.ProcessEnv) {
Object.keys(process.env).forEach((key) => {
if (!(key in env)) {
delete process.env[key];
} else {
process.env[key] = env[key];
}
});
}
beforeEach(() => {
resetEnv(originalEnv);
});
afterEach(() => {
resetEnv(originalEnv);
});
it('should not let OpenAI engine re-read proxy env vars when proxy is unset', () => {
process.env.HTTPS_PROXY = 'http://ambient-proxy:8080';
const engine = new OpenAiEngine({
...baseConfig,
proxy: undefined
});
expect(engine.client.httpAgent).toBeUndefined();
});
it('should not let Anthropic engine re-read proxy env vars when proxy is unset', () => {
process.env.HTTPS_PROXY = 'http://ambient-proxy:8080';
const engine = new AnthropicEngine({
...baseConfig,
model: 'claude-sonnet-4-20250514',
proxy: undefined
});
expect(engine.client.httpAgent).toBeUndefined();
});
});