Compare commits

...

10 Commits

Author SHA1 Message Date
di-sukharev
041465a81c 3.2.3 2024-12-14 20:03:11 +01:00
GPT8
40fa275b4f 3.2.3 (#431)
* 378: fix hook env (#402)

* fix(prepare-commit-msg-hook): update error handling to provide clearer instructions for setting API keys and improve user guidance

* Fix: a bug that causes an error when pushing without setting git remote (#396)

* update deploy commands

* feat(cli): add context flag for providing additional commit message input

* Fix [Bug]: `punycode` module is deprecated #426 (#433)

Signed-off-by: Tiger Kaovilai <passawit.kaovilai@gmail.com>

* npm audit fix (#432)

Signed-off-by: Tiger Kaovilai <passawit.kaovilai@gmail.com>

* Feat: Add an option to `Don't push` when there are multiple git remotes (#434)

---------

Co-authored-by: GPT8 <57486732+di-sukharev@users.noreply.github.com>

* feat(engine): add support for MLX AI provider (#437)

---------

Signed-off-by: Emmanuel Ferdman <emmanuelferdman@gmail.com>
Co-authored-by: Emmanuel Ferdman <emmanuelferdman@gmail.com>

* feat(config, engine): add support for Mistral AI provider and engine (#436)

* docs(CONTRIBUTING.md): update `TODO.md` reference (#435)

Signed-off-by: Emmanuel Ferdman <emmanuelferdman@gmail.com>

* feat(config, engine): add support for Mistral AI provider and engine

* feat(package): add mistralai and zod dependencies

* fix: recreate package-lock.json with node20

* fix: recreate package-lock.json with node v20.18.1 based on branch dev

---------

Signed-off-by: Emmanuel Ferdman <emmanuelferdman@gmail.com>
Co-authored-by: Emmanuel Ferdman <emmanuelferdman@gmail.com>
Co-authored-by: pedro-valentim <>

---------

Signed-off-by: Tiger Kaovilai <passawit.kaovilai@gmail.com>
Signed-off-by: Emmanuel Ferdman <emmanuelferdman@gmail.com>
Co-authored-by: Takanori Matsumoto <matscube@gmail.com>
Co-authored-by: BILLY Maxime <ozeliurs@gmail.com>
Co-authored-by: Welington Sampaio <welington.sampaio@icloud.com>
Co-authored-by: Tiger Kaovilai <passawit.kaovilai@gmail.com>
Co-authored-by: albi ️ <sigismondi.alberto@gmail.com>
Co-authored-by: Emmanuel Ferdman <emmanuelferdman@gmail.com>
Co-authored-by: Pedro Valentim Silva Leite <18179935+pedro-valentim@users.noreply.github.com>
2024-12-14 20:02:25 +01:00
Emmanuel Ferdman
6f16191af2 docs(CONTRIBUTING.md): update TODO.md reference (#435)
Signed-off-by: Emmanuel Ferdman <emmanuelferdman@gmail.com>
2024-11-29 21:33:39 +01:00
di-sukharev
25105e4c3a docs(CONTRIBUTING.md): update links to point to the correct repository name for consistency and clarity 2024-09-07 19:11:37 +03:00
GPT8
2769121842 3.2.2 (#413)
* feat(config): add support for groq AI provider, including config validation and engine implementation (#381)

* fix migrations (#414)

---------

Co-authored-by: Takanori Matsumoto <matscube@gmail.com>
Co-authored-by: BILLY Maxime <ozeliurs@gmail.com>
2024-09-07 18:17:17 +03:00
di-sukharev
8ae927e2dc build 2024-09-06 13:59:28 +03:00
di-sukharev
2859d4ebe3 3.2.1 2024-09-06 13:59:25 +03:00
GPT8
306522e796 3.2.0 (#412)
* 378: fix hook env (#402)

* fix(prepare-commit-msg-hook): update error handling to provide clearer instructions for setting API keys and improve user guidance

* Fix: a bug that causes an error when pushing without setting git remote (#396)

* refactoring v2 (#408)

* 3.2.0

* update deploy commands

---------

Co-authored-by: Takanori Matsumoto <matscube@gmail.com>
2024-09-06 13:58:54 +03:00
di-sukharev
69b3c00a52 docs(README): update OCO_AI_PROVIDER and OCO_MODEL instructions for clarity and add valid model name options to OCO_MODEL description 2024-09-02 11:13:02 +03:00
di-sukharev
6f4afbfb52 build 2024-09-02 10:19:33 +03:00
32 changed files with 5927 additions and 1252 deletions

View File

@@ -18,7 +18,7 @@ To get started, follow these steps:
1. Clone the project repository locally.
2. Install dependencies with `npm install`.
3. Run the project with `npm run dev`.
4. See [issues](https://github.com/di-sukharev/open-commit/issues) or [TODO.md](../TODO.md) to help the project.
4. See [issues](https://github.com/di-sukharev/opencommit/issues) or [TODO.md](TODO.md) to help the project.
## Commit message guidelines
@@ -30,7 +30,7 @@ If you encounter any issues while using the project, please report them on the G
## Contacts
If you have any questions about contributing to the project, please contact by [creating an issue](https://github.com/di-sukharev/open-commit/issues) on the GitHub issue tracker.
If you have any questions about contributing to the project, please contact by [creating an issue](https://github.com/di-sukharev/opencommit/issues) on the GitHub issue tracker.
## License

View File

@@ -28,30 +28,19 @@ You can use OpenCommit by simply running it via the CLI like this `oco`. 2 secon
npm install -g opencommit
```
Alternatively run it via `npx opencommit` or `bunx opencommit`
MacOS may ask to run the command with `sudo` when installing a package globally.
2. Get your API key from [OpenAI](https://platform.openai.com/account/api-keys). Make sure that you add your payment details, so the API works.
2. Get your API key from [OpenAI](https://platform.openai.com/account/api-keys) or other supported LLM providers (we support them all). Make sure that you add your OpenAI payment details to your account, so the API works.
3. Set the key to OpenCommit config:
```sh
oco config set OCO_OPENAI_API_KEY=<your_api_key>
oco config set OCO_API_KEY=<your_api_key>
```
Your API key is stored locally in the `~/.opencommit` config file.
## Usage
You can call OpenCommit directly to generate a commit message for your staged changes:
```sh
git add <files...>
opencommit
```
You can also use the `oco` shortcut:
You can call OpenCommit with `oco` command to generate a commit message for your staged changes:
```sh
git add <files...>
@@ -70,21 +59,17 @@ You can also run it with local model through ollama:
```sh
git add <files...>
oco config set OCO_AI_PROVIDER='ollama'
oco config set OCO_AI_PROVIDER='ollama' OCO_MODEL='llama3:8b'
```
If you want to use a model other than mistral (default), you can do so by setting the `OCO_AI_PROVIDER` environment variable as follows:
```sh
oco config set OCO_AI_PROVIDER='ollama/llama3:8b'
```
Default model is `mistral`.
If you have ollama that is set up in docker/ on another machine with GPUs (not locally), you can change the default endpoint url.
You can do so by setting the `OCO_OLLAMA_API_URL` environment variable as follows:
You can do so by setting the `OCO_API_URL` environment variable as follows:
```sh
oco config set OCO_OLLAMA_API_URL='http://192.168.1.10:11434/api/chat'
oco config set OCO_API_URL='http://192.168.1.10:11434/api/chat'
```
where 192.168.1.10 is example of endpoint URL, where you have ollama set up.
@@ -121,22 +106,21 @@ Create a `.env` file and add OpenCommit config variables there like this:
```env
...
OCO_OPENAI_API_KEY=<your OpenAI API token>
OCO_AI_PROVIDER=<openai (default), anthropic, azure, ollama, gemini, flowise>
OCO_API_KEY=<your OpenAI API token> // or other LLM provider API token
OCO_API_URL=<may be used to set proxy path to OpenAI api>
OCO_TOKENS_MAX_INPUT=<max model token limit (default: 4096)>
OCO_TOKENS_MAX_OUTPUT=<max response tokens (default: 500)>
OCO_OPENAI_BASE_PATH=<may be used to set proxy path to OpenAI api>
OCO_DESCRIPTION=<postface a message with ~3 sentences description of the changes>
OCO_EMOJI=<boolean, add GitMoji>
OCO_MODEL=<either 'gpt-4o', 'gpt-4', 'gpt-4-turbo', 'gpt-3.5-turbo' (default), 'gpt-3.5-turbo-0125', 'gpt-4-1106-preview', 'gpt-4-turbo-preview' or 'gpt-4-0125-preview' or any string basically, but it should be a valid model name>
OCO_MODEL=<either 'gpt-4o', 'gpt-4', 'gpt-4-turbo', 'gpt-3.5-turbo' (default), 'gpt-3.5-turbo-0125', 'gpt-4-1106-preview', 'gpt-4-turbo-preview' or 'gpt-4-0125-preview' or any Anthropic or Ollama model or any string basically, but it should be a valid model name>
OCO_LANGUAGE=<locale, scroll to the bottom to see options>
OCO_MESSAGE_TEMPLATE_PLACEHOLDER=<message template placeholder, default: '$msg'>
OCO_PROMPT_MODULE=<either conventional-commit or @commitlint, default: conventional-commit>
OCO_ONE_LINE_COMMIT=<one line commit message, default: false>
OCO_AI_PROVIDER=<openai (default), anthropic, azure, ollama or ollama/model>
...
```
This are not all the config options, but you get the point.
Global configs are same as local configs, but they are stored in the global `~/.opencommit` config file and set with `oco config set` command, e.g. `oco config set OCO_MODEL=gpt-4o`.
### Global config for all repos
@@ -188,26 +172,26 @@ or for as a cheaper option:
oco config set OCO_MODEL=gpt-3.5-turbo
```
### Switch to Azure OpenAI
### Switch to other LLM providers with a custom URL
By default OpenCommit uses [OpenAI](https://openai.com).
You could switch to [Azure OpenAI Service](https://learn.microsoft.com/azure/cognitive-services/openai/)🚀
You could switch to [Azure OpenAI Service](https://learn.microsoft.com/azure/cognitive-services/openai/) or Flowise or Ollama.
```sh
opencommit config set OCO_AI_PROVIDER=azure
```
oco config set OCO_AI_PROVIDER=azure OCO_API_KEY=<your_azure_api_key> OCO_API_URL=<your_azure_endpoint>
Of course need to set 'OCO_OPENAI_API_KEY'. And also need to set the
'OPENAI_BASE_PATH' for the endpoint and set the deployment name to
'model'.
oco config set OCO_AI_PROVIDER=flowise OCO_API_KEY=<your_flowise_api_key> OCO_API_URL=<your_flowise_endpoint>
oco config set OCO_AI_PROVIDER=ollama OCO_API_KEY=<your_ollama_api_key> OCO_API_URL=<your_ollama_endpoint>
```
### Locale configuration
To globally specify the language used to generate commit messages:
```sh
# de, German ,Deutsch
# de, German, Deutsch
oco config set OCO_LANGUAGE=de
oco config set OCO_LANGUAGE=German
oco config set OCO_LANGUAGE=Deutsch
@@ -223,12 +207,14 @@ All available languages are currently listed in the [i18n](https://github.com/di
### Push to git (gonna be deprecated)
A prompt to ushing to git is on by default but if you would like to turn it off just use:
A prompt for pushing to git is on by default but if you would like to turn it off just use:
```sh
oco config set OCO_GITPUSH=false
```
and it will exit right after commit is confirmed without asking if you would like to push to remote.
### Switch to `@commitlint`
OpenCommit allows you to choose the prompt module used to generate commit messages. By default, OpenCommit uses its conventional-commit message generator. However, you can switch to using the `@commitlint` prompt module if you prefer. This option lets you generate commit messages in respect with the local config.
@@ -403,7 +389,7 @@ jobs:
# set openAI api key in repo actions secrets,
# for openAI keys go to: https://platform.openai.com/account/api-keys
# for repo secret go to: <your_repo_url>/settings/secrets/actions
OCO_OPENAI_API_KEY: ${{ secrets.OCO_OPENAI_API_KEY }}
OCO_API_KEY: ${{ secrets.OCO_API_KEY }}
# customization
OCO_TOKENS_MAX_INPUT: 4096
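
The README changes above all follow from one consolidation: provider-specific settings such as `OCO_OPENAI_API_KEY`, `OCO_OPENAI_BASE_PATH` and `OCO_OLLAMA_API_URL` collapse into the generic `OCO_API_KEY` / `OCO_API_URL` pair. As an editorial illustration only (not code taken from this diff), reading the consolidated values with a fallback to the legacy names could look roughly like this:

```typescript
// Illustration of the consolidated config keys; the legacy fallbacks are an
// assumption for users upgrading an old ~/.opencommit or .env file.
const apiKey =
  process.env.OCO_API_KEY ?? process.env.OCO_OPENAI_API_KEY; // legacy name
const apiUrl =
  process.env.OCO_API_URL ?? process.env.OCO_OPENAI_BASE_PATH; // legacy name

if (!apiKey) {
  throw new Error(
    'No API key configured. Run: oco config set OCO_API_KEY=<your_api_key>'
  );
}
```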

File diff suppressed because it is too large.

File diff suppressed because it is too large.

649
package-lock.json generated
View File

@@ -1,12 +1,12 @@
{
"name": "opencommit",
"version": "3.1.2",
"version": "3.2.3",
"lockfileVersion": 3,
"requires": true,
"packages": {
"": {
"name": "opencommit",
"version": "3.1.2",
"version": "3.2.3",
"license": "MIT",
"dependencies": {
"@actions/core": "^1.10.0",
@@ -17,9 +17,9 @@
"@clack/prompts": "^0.6.1",
"@dqbd/tiktoken": "^1.0.2",
"@google/generative-ai": "^0.11.4",
"@mistralai/mistralai": "^1.3.5",
"@octokit/webhooks-schemas": "^6.11.0",
"@octokit/webhooks-types": "^6.11.0",
"ai": "^2.2.14",
"axios": "^1.3.4",
"chalk": "^5.2.0",
"cleye": "^1.3.2",
@@ -28,7 +28,9 @@
"ignore": "^5.2.4",
"ini": "^3.0.1",
"inquirer": "^9.1.4",
"openai": "^4.56.0"
"openai": "^4.57.0",
"punycode": "^2.3.1",
"zod": "^3.23.8"
},
"bin": {
"oco": "out/cli.cjs",
@@ -108,6 +110,7 @@
"version": "2.3.0",
"resolved": "https://registry.npmjs.org/@ampproject/remapping/-/remapping-2.3.0.tgz",
"integrity": "sha512-30iZtAPgz+LTIYoeivqYo853f02jBYSd5uGnGpkFV0M3xOt9aN73erkgYAmZU43x4VfqcnLxW9Kpg3R5LC4YYw==",
"dev": true,
"dependencies": {
"@jridgewell/gen-mapping": "^0.3.5",
"@jridgewell/trace-mapping": "^0.3.24"
@@ -574,6 +577,7 @@
"version": "7.24.1",
"resolved": "https://registry.npmjs.org/@babel/parser/-/parser-7.24.1.tgz",
"integrity": "sha512-Zo9c7N3xdOIQrNip7Lc9wvRPzlRtovHVE4lkz8WEDr7uYh/GMQhSiIgFxGIArRHYdJE5kxtZjAf8rT0xhdLCzg==",
"dev": true,
"bin": {
"parser": "bin/babel-parser.js"
},
@@ -1728,6 +1732,7 @@
"version": "0.3.5",
"resolved": "https://registry.npmjs.org/@jridgewell/gen-mapping/-/gen-mapping-0.3.5.tgz",
"integrity": "sha512-IzL8ZoEDIBRWEzlCcRhOaCupYyN5gdIK+Q6fbFdPDg6HqX6jpkItn7DFIpW9LQzXG6Df9sA7+OKnq0qlz/GaQg==",
"dev": true,
"dependencies": {
"@jridgewell/set-array": "^1.2.1",
"@jridgewell/sourcemap-codec": "^1.4.10",
@@ -1741,6 +1746,7 @@
"version": "3.1.2",
"resolved": "https://registry.npmjs.org/@jridgewell/resolve-uri/-/resolve-uri-3.1.2.tgz",
"integrity": "sha512-bRISgCIjP20/tbWSPWMEi54QVPRZExkuD9lJL+UIxUKtwVJA8wW1Trb1jMs1RFXo1CBTNZ/5hpC9QvmKWdopKw==",
"dev": true,
"engines": {
"node": ">=6.0.0"
}
@@ -1749,6 +1755,7 @@
"version": "1.2.1",
"resolved": "https://registry.npmjs.org/@jridgewell/set-array/-/set-array-1.2.1.tgz",
"integrity": "sha512-R8gLRTZeyp03ymzP/6Lil/28tGeGEzhx1q2k703KGWRAI1VdvPIXdG70VJc2pAMw3NA6JKL5hhFu1sJX0Mnn/A==",
"dev": true,
"engines": {
"node": ">=6.0.0"
}
@@ -1756,12 +1763,14 @@
"node_modules/@jridgewell/sourcemap-codec": {
"version": "1.4.15",
"resolved": "https://registry.npmjs.org/@jridgewell/sourcemap-codec/-/sourcemap-codec-1.4.15.tgz",
"integrity": "sha512-eF2rxCRulEKXHTRiDrDy6erMYWqNw4LPdQ8UQA4huuxaQsVeRPFl2oM8oDGxMFhJUWZf9McpLtJasDDZb/Bpeg=="
"integrity": "sha512-eF2rxCRulEKXHTRiDrDy6erMYWqNw4LPdQ8UQA4huuxaQsVeRPFl2oM8oDGxMFhJUWZf9McpLtJasDDZb/Bpeg==",
"dev": true
},
"node_modules/@jridgewell/trace-mapping": {
"version": "0.3.25",
"resolved": "https://registry.npmjs.org/@jridgewell/trace-mapping/-/trace-mapping-0.3.25.tgz",
"integrity": "sha512-vNk6aEwybGtawWmy/PzwnGDOjCkLWSD2wqvjGGAgOAwCGWySYXfYoxt00IJkTF+8Lb57DwOb3Aa0o9CApepiYQ==",
"dev": true,
"dependencies": {
"@jridgewell/resolve-uri": "^3.1.0",
"@jridgewell/sourcemap-codec": "^1.4.14"
@@ -1778,6 +1787,16 @@
"node": ">= 0.4"
}
},
"node_modules/@mistralai/mistralai": {
"version": "1.3.5",
"resolved": "https://registry.npmjs.org/@mistralai/mistralai/-/mistralai-1.3.5.tgz",
"integrity": "sha512-yC91oJ5ScEPqbXmv3mJTwTFgu/ZtsYoOPOhaVXSsy6x4zXTqTI57yEC1flC9uiA8GpG/yhpn2BBUXF95+U9Blw==",
"peerDependencies": {
"react": "^18 || ^19",
"react-dom": "^18 || ^19",
"zod": ">= 3"
}
},
"node_modules/@nodelib/fs.scandir": {
"version": "2.1.5",
"resolved": "https://registry.npmjs.org/@nodelib/fs.scandir/-/fs.scandir-2.1.5.tgz",
@@ -2013,12 +2032,6 @@
"@babel/types": "^7.20.7"
}
},
"node_modules/@types/estree": {
"version": "1.0.5",
"resolved": "https://registry.npmjs.org/@types/estree/-/estree-1.0.5.tgz",
"integrity": "sha512-/kYRxGDLWzHOB7q+wtSUQlFrtcdUccpfy+X+9iMBpHK8QLLhx2wIPYuS5DYtR9Wa/YlZAbIovy7qVdB1Aq6Lyw==",
"peer": true
},
"node_modules/@types/graceful-fs": {
"version": "4.1.9",
"resolved": "https://registry.npmjs.org/@types/graceful-fs/-/graceful-fs-4.1.9.tgz",
@@ -2098,6 +2111,11 @@
"form-data": "^4.0.0"
}
},
"node_modules/@types/qs": {
"version": "6.9.15",
"resolved": "https://registry.npmjs.org/@types/qs/-/qs-6.9.15.tgz",
"integrity": "sha512-uXHQKES6DQKKCLh441Xv/dwxOq1TVS3JPUMlEqoEglvlhR6Mxnlew/Xq/LRVHpLyk7iK3zODe1qYHIMltO7XGg=="
},
"node_modules/@types/semver": {
"version": "7.5.8",
"resolved": "https://registry.npmjs.org/@types/semver/-/semver-7.5.8.tgz",
@@ -2328,117 +2346,6 @@
"integrity": "sha512-zuVdFrMJiuCDQUMCzQaD6KL28MjnqqN8XnAqiEq9PNm/hCPTSGfrXCOfwj1ow4LFb/tNymJPwsNbVePc1xFqrQ==",
"dev": true
},
"node_modules/@vue/compiler-core": {
"version": "3.4.21",
"resolved": "https://registry.npmjs.org/@vue/compiler-core/-/compiler-core-3.4.21.tgz",
"integrity": "sha512-MjXawxZf2SbZszLPYxaFCjxfibYrzr3eYbKxwpLR9EQN+oaziSu3qKVbwBERj1IFIB8OLUewxB5m/BFzi613og==",
"peer": true,
"dependencies": {
"@babel/parser": "^7.23.9",
"@vue/shared": "3.4.21",
"entities": "^4.5.0",
"estree-walker": "^2.0.2",
"source-map-js": "^1.0.2"
}
},
"node_modules/@vue/compiler-core/node_modules/estree-walker": {
"version": "2.0.2",
"resolved": "https://registry.npmjs.org/estree-walker/-/estree-walker-2.0.2.tgz",
"integrity": "sha512-Rfkk/Mp/DL7JVje3u18FxFujQlTNR2q6QfMSMB7AvCBx91NGj/ba3kCfza0f6dVDbw7YlRf/nDrn7pQrCCyQ/w==",
"peer": true
},
"node_modules/@vue/compiler-dom": {
"version": "3.4.21",
"resolved": "https://registry.npmjs.org/@vue/compiler-dom/-/compiler-dom-3.4.21.tgz",
"integrity": "sha512-IZC6FKowtT1sl0CR5DpXSiEB5ayw75oT2bma1BEhV7RRR1+cfwLrxc2Z8Zq/RGFzJ8w5r9QtCOvTjQgdn0IKmA==",
"peer": true,
"dependencies": {
"@vue/compiler-core": "3.4.21",
"@vue/shared": "3.4.21"
}
},
"node_modules/@vue/compiler-sfc": {
"version": "3.4.21",
"resolved": "https://registry.npmjs.org/@vue/compiler-sfc/-/compiler-sfc-3.4.21.tgz",
"integrity": "sha512-me7epoTxYlY+2CUM7hy9PCDdpMPfIwrOvAXud2Upk10g4YLv9UBW7kL798TvMeDhPthkZ0CONNrK2GoeI1ODiQ==",
"peer": true,
"dependencies": {
"@babel/parser": "^7.23.9",
"@vue/compiler-core": "3.4.21",
"@vue/compiler-dom": "3.4.21",
"@vue/compiler-ssr": "3.4.21",
"@vue/shared": "3.4.21",
"estree-walker": "^2.0.2",
"magic-string": "^0.30.7",
"postcss": "^8.4.35",
"source-map-js": "^1.0.2"
}
},
"node_modules/@vue/compiler-sfc/node_modules/estree-walker": {
"version": "2.0.2",
"resolved": "https://registry.npmjs.org/estree-walker/-/estree-walker-2.0.2.tgz",
"integrity": "sha512-Rfkk/Mp/DL7JVje3u18FxFujQlTNR2q6QfMSMB7AvCBx91NGj/ba3kCfza0f6dVDbw7YlRf/nDrn7pQrCCyQ/w==",
"peer": true
},
"node_modules/@vue/compiler-ssr": {
"version": "3.4.21",
"resolved": "https://registry.npmjs.org/@vue/compiler-ssr/-/compiler-ssr-3.4.21.tgz",
"integrity": "sha512-M5+9nI2lPpAsgXOGQobnIueVqc9sisBFexh5yMIMRAPYLa7+5wEJs8iqOZc1WAa9WQbx9GR2twgznU8LTIiZ4Q==",
"peer": true,
"dependencies": {
"@vue/compiler-dom": "3.4.21",
"@vue/shared": "3.4.21"
}
},
"node_modules/@vue/reactivity": {
"version": "3.4.21",
"resolved": "https://registry.npmjs.org/@vue/reactivity/-/reactivity-3.4.21.tgz",
"integrity": "sha512-UhenImdc0L0/4ahGCyEzc/pZNwVgcglGy9HVzJ1Bq2Mm9qXOpP8RyNTjookw/gOCUlXSEtuZ2fUg5nrHcoqJcw==",
"peer": true,
"dependencies": {
"@vue/shared": "3.4.21"
}
},
"node_modules/@vue/runtime-core": {
"version": "3.4.21",
"resolved": "https://registry.npmjs.org/@vue/runtime-core/-/runtime-core-3.4.21.tgz",
"integrity": "sha512-pQthsuYzE1XcGZznTKn73G0s14eCJcjaLvp3/DKeYWoFacD9glJoqlNBxt3W2c5S40t6CCcpPf+jG01N3ULyrA==",
"peer": true,
"dependencies": {
"@vue/reactivity": "3.4.21",
"@vue/shared": "3.4.21"
}
},
"node_modules/@vue/runtime-dom": {
"version": "3.4.21",
"resolved": "https://registry.npmjs.org/@vue/runtime-dom/-/runtime-dom-3.4.21.tgz",
"integrity": "sha512-gvf+C9cFpevsQxbkRBS1NpU8CqxKw0ebqMvLwcGQrNpx6gqRDodqKqA+A2VZZpQ9RpK2f9yfg8VbW/EpdFUOJw==",
"peer": true,
"dependencies": {
"@vue/runtime-core": "3.4.21",
"@vue/shared": "3.4.21",
"csstype": "^3.1.3"
}
},
"node_modules/@vue/server-renderer": {
"version": "3.4.21",
"resolved": "https://registry.npmjs.org/@vue/server-renderer/-/server-renderer-3.4.21.tgz",
"integrity": "sha512-aV1gXyKSN6Rz+6kZ6kr5+Ll14YzmIbeuWe7ryJl5muJ4uwSwY/aStXTixx76TwkZFJLm1aAlA/HSWEJ4EyiMkg==",
"peer": true,
"dependencies": {
"@vue/compiler-ssr": "3.4.21",
"@vue/shared": "3.4.21"
},
"peerDependencies": {
"vue": "3.4.21"
}
},
"node_modules/@vue/shared": {
"version": "3.4.21",
"resolved": "https://registry.npmjs.org/@vue/shared/-/shared-3.4.21.tgz",
"integrity": "sha512-PuJe7vDIi6VYSinuEbUIQgMIRZGgM8e4R+G+/dQTk0X1NEdvgvvgv7m+rfmDH1gZzyA1OjjoWskvHlfRNfQf3g==",
"peer": true
},
"node_modules/abort-controller": {
"version": "3.0.0",
"resolved": "https://registry.npmjs.org/abort-controller/-/abort-controller-3.0.0.tgz",
@@ -2454,6 +2361,7 @@
"version": "8.11.3",
"resolved": "https://registry.npmjs.org/acorn/-/acorn-8.11.3.tgz",
"integrity": "sha512-Y9rRfJG5jcKOE0CLisYbojUjIrIEE7AGMzA/Sm4BslANhbS+cDMpgBdcPT91oJ7OuJ9hYJBx59RjbhxVnrF8Xg==",
"dev": true,
"bin": {
"acorn": "bin/acorn"
},
@@ -2501,43 +2409,6 @@
"node": ">= 8.0.0"
}
},
"node_modules/ai": {
"version": "2.2.37",
"resolved": "https://registry.npmjs.org/ai/-/ai-2.2.37.tgz",
"integrity": "sha512-JIYm5N1muGVqBqWnvkt29FmXhESoO5TcDxw74OE41SsM+uIou6NPDDs0XWb/ABcd1gmp6k5zym64KWMPM2xm0A==",
"dependencies": {
"eventsource-parser": "1.0.0",
"nanoid": "3.3.6",
"solid-swr-store": "0.10.7",
"sswr": "2.0.0",
"swr": "2.2.0",
"swr-store": "0.10.6",
"swrv": "1.0.4"
},
"engines": {
"node": ">=14.6"
},
"peerDependencies": {
"react": "^18.2.0",
"solid-js": "^1.7.7",
"svelte": "^3.0.0 || ^4.0.0",
"vue": "^3.3.4"
},
"peerDependenciesMeta": {
"react": {
"optional": true
},
"solid-js": {
"optional": true
},
"svelte": {
"optional": true
},
"vue": {
"optional": true
}
}
},
"node_modules/ajv": {
"version": "6.12.6",
"resolved": "https://registry.npmjs.org/ajv/-/ajv-6.12.6.tgz",
@@ -2624,15 +2495,6 @@
"integrity": "sha512-8+9WqebbFzpX9OR+Wa6O29asIogeRMzcGtAINdpMHHyAg10f05aSFVBbcEqGf/PXw1EjAZ+q2/bEBg3DvurK3Q==",
"dev": true
},
"node_modules/aria-query": {
"version": "5.3.0",
"resolved": "https://registry.npmjs.org/aria-query/-/aria-query-5.3.0.tgz",
"integrity": "sha512-b0P0sZPKtyu8HkeRAfCq0IfURZK+SuwMjY1UXGBU27wpAiTwQAIlq56IbIO+ytk/JjS1fMR14ee5WBBfKi5J6A==",
"peer": true,
"dependencies": {
"dequal": "^2.0.3"
}
},
"node_modules/array-union": {
"version": "2.1.0",
"resolved": "https://registry.npmjs.org/array-union/-/array-union-2.1.0.tgz",
@@ -2666,15 +2528,6 @@
"proxy-from-env": "^1.1.0"
}
},
"node_modules/axobject-query": {
"version": "4.0.0",
"resolved": "https://registry.npmjs.org/axobject-query/-/axobject-query-4.0.0.tgz",
"integrity": "sha512-+60uv1hiVFhHZeO+Lz0RYzsVHy5Wr1ayX0mwda9KPDVLNJgZ1T9Ny7VmFbLDzxsH0D87I86vgj3gFrjTJUYznw==",
"peer": true,
"dependencies": {
"dequal": "^2.0.3"
}
},
"node_modules/babel-jest": {
"version": "29.7.0",
"resolved": "https://registry.npmjs.org/babel-jest/-/babel-jest-29.7.0.tgz",
@@ -3251,19 +3104,6 @@
"node": ">= 0.12.0"
}
},
"node_modules/code-red": {
"version": "1.0.4",
"resolved": "https://registry.npmjs.org/code-red/-/code-red-1.0.4.tgz",
"integrity": "sha512-7qJWqItLA8/VPVlKJlFXU+NBlo/qyfs39aJcuMT/2ere32ZqvF5OSxgdM5xOfJJ7O429gg2HM47y8v9P+9wrNw==",
"peer": true,
"dependencies": {
"@jridgewell/sourcemap-codec": "^1.4.15",
"@types/estree": "^1.0.1",
"acorn": "^8.10.0",
"estree-walker": "^3.0.3",
"periscopic": "^3.1.0"
}
},
"node_modules/collect-v8-coverage": {
"version": "1.0.2",
"resolved": "https://registry.npmjs.org/collect-v8-coverage/-/collect-v8-coverage-1.0.2.tgz",
@@ -3385,9 +3225,10 @@
"dev": true
},
"node_modules/cross-spawn": {
"version": "7.0.3",
"resolved": "https://registry.npmjs.org/cross-spawn/-/cross-spawn-7.0.3.tgz",
"integrity": "sha512-iRDPJKUPVEND7dHPO8rkbOnPpyDygcDFtWjpeWNCgy8WP2rXcxXL8TskReQl6OrB2G7+UJrags1q15Fudc7G6w==",
"version": "7.0.6",
"resolved": "https://registry.npmjs.org/cross-spawn/-/cross-spawn-7.0.6.tgz",
"integrity": "sha512-uV2QOWP2nWzsy2aMp8aRibhi9dlzF5Hgh5SHaB9OiTGEyDTiJJyx0uy51QXdyWbtAHNua4XJzUKca3OzKUd3vA==",
"license": "MIT",
"dependencies": {
"path-key": "^3.1.0",
"shebang-command": "^2.0.0",
@@ -3403,25 +3244,6 @@
"integrity": "sha512-VxBKmeNcqQdiUQUW2Tzq0t377b54N2bMtXO/qiLa+6eRRmmC4qT3D4OnTGoT/U6O9aklQ/jTwbOtRMTTY8G0Ig==",
"deprecated": "This package is no longer supported. It's now a built-in Node module. If you've depended on crypto, you should switch to the one that's built-in."
},
"node_modules/css-tree": {
"version": "2.3.1",
"resolved": "https://registry.npmjs.org/css-tree/-/css-tree-2.3.1.tgz",
"integrity": "sha512-6Fv1DV/TYw//QF5IzQdqsNDjx/wc8TrMBZsqjL9eW01tWb7R7k/mq+/VXfJCl7SoD5emsJop9cOByJZfs8hYIw==",
"peer": true,
"dependencies": {
"mdn-data": "2.0.30",
"source-map-js": "^1.0.1"
},
"engines": {
"node": "^10 || ^12.20.0 || ^14.13.0 || >=15.0.0"
}
},
"node_modules/csstype": {
"version": "3.1.3",
"resolved": "https://registry.npmjs.org/csstype/-/csstype-3.1.3.tgz",
"integrity": "sha512-M1uQkMl8rQK/szD0LNhtqxIPLpimGm8sOBwU7lLnCpSbTyY3yeU1Vc7l4KT5zT4s/yOxHH5O7tIuuLOCnLADRw==",
"peer": true
},
"node_modules/debug": {
"version": "4.3.4",
"resolved": "https://registry.npmjs.org/debug/-/debug-4.3.4.tgz",
@@ -3507,14 +3329,6 @@
"resolved": "https://registry.npmjs.org/deprecation/-/deprecation-2.3.1.tgz",
"integrity": "sha512-xmHIy4F3scKVwMsQ4WnVaS8bHOx0DmVwRywosKhaILI0ywMDWPtBSku2HNxRvF7jtwDRsoEwYQSfbxj8b7RlJQ=="
},
"node_modules/dequal": {
"version": "2.0.3",
"resolved": "https://registry.npmjs.org/dequal/-/dequal-2.0.3.tgz",
"integrity": "sha512-0je+qPKHEMohvfRTCEo3CrPG6cAzAYgmzKyxRiYSSDkS6eGJdyVJm7WaYA5ECaAD9wLB2T4EEeymA5aFVcYXCA==",
"engines": {
"node": ">=6"
}
},
"node_modules/detect-newline": {
"version": "3.1.0",
"resolved": "https://registry.npmjs.org/detect-newline/-/detect-newline-3.1.0.tgz",
@@ -3601,18 +3415,6 @@
"resolved": "https://registry.npmjs.org/emoji-regex/-/emoji-regex-8.0.0.tgz",
"integrity": "sha512-MSjYzcWNOA0ewAHpz0MxpYFvwg6yjy1NG3xteoqz644VCo/RPgnr1/GGt+ic3iJTzQ8Eu3TdM14SawnVUmGE6A=="
},
"node_modules/entities": {
"version": "4.5.0",
"resolved": "https://registry.npmjs.org/entities/-/entities-4.5.0.tgz",
"integrity": "sha512-V0hjH4dGPh9Ao5p0MoRY6BVqtwCjhz6vI5LT8AJ55H+4g9/4vbHx1I54fS0XuclLhDHArPQCiMjDxjaL8fPxhw==",
"peer": true,
"engines": {
"node": ">=0.12"
},
"funding": {
"url": "https://github.com/fb55/entities?sponsor=1"
}
},
"node_modules/error-ex": {
"version": "1.3.2",
"resolved": "https://registry.npmjs.org/error-ex/-/error-ex-1.3.2.tgz",
@@ -4254,15 +4056,6 @@
"node": ">=4.0"
}
},
"node_modules/estree-walker": {
"version": "3.0.3",
"resolved": "https://registry.npmjs.org/estree-walker/-/estree-walker-3.0.3.tgz",
"integrity": "sha512-7RUKfXgSMMkzt6ZuXmqapOurLGPPfgj6l9uRZ7lRGolvk0y2yocc35LdcxKC5PQZdn2DMqioAQ2NoWcrTKmm6g==",
"peer": true,
"dependencies": {
"@types/estree": "^1.0.0"
}
},
"node_modules/esutils": {
"version": "2.0.3",
"resolved": "https://registry.npmjs.org/esutils/-/esutils-2.0.3.tgz",
@@ -4280,14 +4073,6 @@
"node": ">=6"
}
},
"node_modules/eventsource-parser": {
"version": "1.0.0",
"resolved": "https://registry.npmjs.org/eventsource-parser/-/eventsource-parser-1.0.0.tgz",
"integrity": "sha512-9jgfSCa3dmEme2ES3mPByGXfgZ87VbP97tng1G2nWwWx6bV2nYxm2AWCrbQjXToSe+yYlqaZNtxffR9IeQr95g==",
"engines": {
"node": ">=14.18"
}
},
"node_modules/execa": {
"version": "7.2.0",
"resolved": "https://registry.npmjs.org/execa/-/execa-7.2.0.tgz",
@@ -5151,15 +4936,6 @@
"node": ">=0.10.0"
}
},
"node_modules/is-reference": {
"version": "3.0.2",
"resolved": "https://registry.npmjs.org/is-reference/-/is-reference-3.0.2.tgz",
"integrity": "sha512-v3rht/LgVcsdZa3O2Nqs+NMowLOxeOm7Ay9+/ARQ2F+qEoANRcqrjAZKGN0v8ymUetZGgkp26LTnGT7H0Qo9Pg==",
"peer": true,
"dependencies": {
"@types/estree": "*"
}
},
"node_modules/is-stream": {
"version": "3.0.0",
"resolved": "https://registry.npmjs.org/is-stream/-/is-stream-3.0.0.tgz",
@@ -6821,12 +6597,6 @@
"integrity": "sha512-7ylylesZQ/PV29jhEDl3Ufjo6ZX7gCqJr5F7PKrqc93v7fzSymt1BpwEU8nAUXs8qzzvqhbjhK5QZg6Mt/HkBg==",
"dev": true
},
"node_modules/locate-character": {
"version": "3.0.0",
"resolved": "https://registry.npmjs.org/locate-character/-/locate-character-3.0.0.tgz",
"integrity": "sha512-SW13ws7BjaeJ6p7Q6CO2nchbYEc3X3J6WrmTTDto7yMPqVSZTUyY5Tjbid+Ab8gLnATtygYtiDIJGQRRn2ZOiA==",
"peer": true
},
"node_modules/locate-path": {
"version": "6.0.0",
"resolved": "https://registry.npmjs.org/locate-path/-/locate-path-6.0.0.tgz",
@@ -6923,6 +6693,7 @@
"version": "1.4.0",
"resolved": "https://registry.npmjs.org/loose-envify/-/loose-envify-1.4.0.tgz",
"integrity": "sha512-lyuxPGr/Wfhrlem2CL/UcnUc1zcqKAImBDzukY7Y5F/yQiNdko6+fRLevlw1HgMySw7f611UIY408EtxRSoK3Q==",
"license": "MIT",
"peer": true,
"dependencies": {
"js-tokens": "^3.0.0 || ^4.0.0"
@@ -6949,18 +6720,6 @@
"lz-string": "bin/bin.js"
}
},
"node_modules/magic-string": {
"version": "0.30.8",
"resolved": "https://registry.npmjs.org/magic-string/-/magic-string-0.30.8.tgz",
"integrity": "sha512-ISQTe55T2ao7XtlAStud6qwYPZjE4GK1S/BeVPus4jrq6JuOnQ00YKQC581RWhR122W7msZV263KzVeLoqidyQ==",
"peer": true,
"dependencies": {
"@jridgewell/sourcemap-codec": "^1.4.15"
},
"engines": {
"node": ">=12"
}
},
"node_modules/make-dir": {
"version": "4.0.0",
"resolved": "https://registry.npmjs.org/make-dir/-/make-dir-4.0.0.tgz",
@@ -6991,12 +6750,6 @@
"tmpl": "1.0.5"
}
},
"node_modules/mdn-data": {
"version": "2.0.30",
"resolved": "https://registry.npmjs.org/mdn-data/-/mdn-data-2.0.30.tgz",
"integrity": "sha512-GaqWWShW4kv/G9IEucWScBx9G1/vsFZZJUO+tD26M8J8z3Kw5RDQjaoZe03YAClgeS/SWPOcb4nkFBTEi5DUEA==",
"peer": true
},
"node_modules/merge-stream": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/merge-stream/-/merge-stream-2.0.0.tgz",
@@ -7012,12 +6765,13 @@
}
},
"node_modules/micromatch": {
"version": "4.0.5",
"resolved": "https://registry.npmjs.org/micromatch/-/micromatch-4.0.5.tgz",
"integrity": "sha512-DMy+ERcEW2q8Z2Po+WNXuw3c5YaUSFjAO5GsJqfEl7UjvtIuFKO6ZrKvcItdy98dwFI2N1tg3zNIdKaQT+aNdA==",
"version": "4.0.8",
"resolved": "https://registry.npmjs.org/micromatch/-/micromatch-4.0.8.tgz",
"integrity": "sha512-PXwfBhYu0hBCPw8Dn0E+WDYb7af3dSLVWKi3HGv84IdF4TyFoC0ysxFd0Goxw7nSv4T/PzEJQxsYsEiFCKo2BA==",
"dev": true,
"license": "MIT",
"dependencies": {
"braces": "^3.0.2",
"braces": "^3.0.3",
"picomatch": "^2.3.1"
},
"engines": {
@@ -7088,23 +6842,6 @@
"node": "^14.17.0 || ^16.13.0 || >=18.0.0"
}
},
"node_modules/nanoid": {
"version": "3.3.6",
"resolved": "https://registry.npmjs.org/nanoid/-/nanoid-3.3.6.tgz",
"integrity": "sha512-BGcqMMJuToF7i1rt+2PWSNVnWIkGCU78jBG3RxO/bZlnZPK2Cmi2QaffxGO/2RvWi9sL+FAiRiXMgsyxQ1DIDA==",
"funding": [
{
"type": "github",
"url": "https://github.com/sponsors/ai"
}
],
"bin": {
"nanoid": "bin/nanoid.cjs"
},
"engines": {
"node": "^10 || ^12 || ^13.7 || ^14 || >=15.0.1"
}
},
"node_modules/natural-compare": {
"version": "1.4.0",
"resolved": "https://registry.npmjs.org/natural-compare/-/natural-compare-1.4.0.tgz",
@@ -7219,6 +6956,17 @@
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/object-inspect": {
"version": "1.13.2",
"resolved": "https://registry.npmjs.org/object-inspect/-/object-inspect-1.13.2.tgz",
"integrity": "sha512-IRZSRuzJiynemAXPYtPe5BoI/RESNYR7TYm50MC5Mqbd3Jmw5y790sErYw3V6SryFJD64b74qQQs9wn5Bg/k3g==",
"engines": {
"node": ">= 0.4"
},
"funding": {
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/once": {
"version": "1.4.0",
"resolved": "https://registry.npmjs.org/once/-/once-1.4.0.tgz",
@@ -7242,17 +6990,19 @@
}
},
"node_modules/openai": {
"version": "4.56.0",
"resolved": "https://registry.npmjs.org/openai/-/openai-4.56.0.tgz",
"integrity": "sha512-zcag97+3bG890MNNa0DQD9dGmmTWL8unJdNkulZzWRXrl+QeD+YkBI4H58rJcwErxqGK6a0jVPZ4ReJjhDGcmw==",
"version": "4.57.0",
"resolved": "https://registry.npmjs.org/openai/-/openai-4.57.0.tgz",
"integrity": "sha512-JnwBSIYqiZ3jYjB5f2in8hQ0PRA092c6m+/6dYB0MzK0BEbn+0dioxZsPLBm5idJbg9xzLNOiGVm2OSuhZ+BdQ==",
"dependencies": {
"@types/node": "^18.11.18",
"@types/node-fetch": "^2.6.4",
"@types/qs": "^6.9.7",
"abort-controller": "^3.0.0",
"agentkeepalive": "^4.2.1",
"form-data-encoder": "1.7.2",
"formdata-node": "^4.3.2",
"node-fetch": "^2.6.7"
"node-fetch": "^2.6.7",
"qs": "^6.10.3"
},
"bin": {
"openai": "bin/cli"
@@ -7476,17 +7226,6 @@
"node": ">=8"
}
},
"node_modules/periscopic": {
"version": "3.1.0",
"resolved": "https://registry.npmjs.org/periscopic/-/periscopic-3.1.0.tgz",
"integrity": "sha512-vKiQ8RRtkl9P+r/+oefh25C3fhybptkHKCZSPlcXiJux2tJF55GnEj3BVn4A5gKfq9NWWXXrxkHBwVPUfH0opw==",
"peer": true,
"dependencies": {
"@types/estree": "^1.0.0",
"estree-walker": "^3.0.0",
"is-reference": "^3.0.0"
}
},
"node_modules/picocolors": {
"version": "1.0.0",
"resolved": "https://registry.npmjs.org/picocolors/-/picocolors-1.0.0.tgz",
@@ -7577,52 +7316,6 @@
"node": ">=8"
}
},
"node_modules/postcss": {
"version": "8.4.38",
"resolved": "https://registry.npmjs.org/postcss/-/postcss-8.4.38.tgz",
"integrity": "sha512-Wglpdk03BSfXkHoQa3b/oulrotAkwrlLDRSOb9D0bN86FdRyE9lppSp33aHNPgBa0JKCoB+drFLZkQoRRYae5A==",
"funding": [
{
"type": "opencollective",
"url": "https://opencollective.com/postcss/"
},
{
"type": "tidelift",
"url": "https://tidelift.com/funding/github/npm/postcss"
},
{
"type": "github",
"url": "https://github.com/sponsors/ai"
}
],
"peer": true,
"dependencies": {
"nanoid": "^3.3.7",
"picocolors": "^1.0.0",
"source-map-js": "^1.2.0"
},
"engines": {
"node": "^10 || ^12 || >=14"
}
},
"node_modules/postcss/node_modules/nanoid": {
"version": "3.3.7",
"resolved": "https://registry.npmjs.org/nanoid/-/nanoid-3.3.7.tgz",
"integrity": "sha512-eSRppjcPIatRIMC1U6UngP8XFcz8MQWGQdt1MTBQ7NaAmvXDfvNxbvWV3x2y6CdEUciCSsDHDQZbhYaB8QEo2g==",
"funding": [
{
"type": "github",
"url": "https://github.com/sponsors/ai"
}
],
"peer": true,
"bin": {
"nanoid": "bin/nanoid.cjs"
},
"engines": {
"node": "^10 || ^12 || ^13.7 || ^14 || >=15.0.1"
}
},
"node_modules/prelude-ls": {
"version": "1.2.1",
"resolved": "https://registry.npmjs.org/prelude-ls/-/prelude-ls-1.2.1.tgz",
@@ -7683,7 +7376,7 @@
"version": "2.3.1",
"resolved": "https://registry.npmjs.org/punycode/-/punycode-2.3.1.tgz",
"integrity": "sha512-vYt7UD1U9Wg6138shLtLOvdAu+8DsC/ilFtEVHcH+wydcSpNE20AfSOduf6MkRFahL5FY7X1oU7nKVZFtfq8Fg==",
"dev": true,
"license": "MIT",
"engines": {
"node": ">=6"
}
@@ -7704,6 +7397,20 @@
}
]
},
"node_modules/qs": {
"version": "6.13.0",
"resolved": "https://registry.npmjs.org/qs/-/qs-6.13.0.tgz",
"integrity": "sha512-+38qI9SOr8tfZ4QmJNplMUxqjbe7LKvvZgWdExBOmd+egZTtjLB67Gu0HRX3u/XOq7UU2Nx6nsjvS16Z9uwfpg==",
"dependencies": {
"side-channel": "^1.0.6"
},
"engines": {
"node": ">=0.6"
},
"funding": {
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/queue-microtask": {
"version": "1.2.3",
"resolved": "https://registry.npmjs.org/queue-microtask/-/queue-microtask-1.2.3.tgz",
@@ -7725,9 +7432,10 @@
]
},
"node_modules/react": {
"version": "18.2.0",
"resolved": "https://registry.npmjs.org/react/-/react-18.2.0.tgz",
"integrity": "sha512-/3IjMdb2L9QbBdWiW5e3P2/npwMBaU9mHCSCUzNln0ZCYbcfTsGbTJrU/kGemdH2IWmB2ioZ+zkxtmq6g09fGQ==",
"version": "18.3.1",
"resolved": "https://registry.npmjs.org/react/-/react-18.3.1.tgz",
"integrity": "sha512-wS+hAgJShR0KhEvPJArfuPVN1+Hz1t0Y6n5jLrGQbkb4urgPE/0Rve+1kMB1v/oWgHgm4WIcV+i7F2pTVj+2iQ==",
"license": "MIT",
"peer": true,
"dependencies": {
"loose-envify": "^1.1.0"
@@ -7736,6 +7444,20 @@
"node": ">=0.10.0"
}
},
"node_modules/react-dom": {
"version": "18.3.1",
"resolved": "https://registry.npmjs.org/react-dom/-/react-dom-18.3.1.tgz",
"integrity": "sha512-5m4nQKp+rZRb09LNH59GM4BxTh9251/ylbKIbpe7TpGxfJ+9kv6BLkLBXIjjspbgbnIBNqlI23tRnTWT0snUIw==",
"license": "MIT",
"peer": true,
"dependencies": {
"loose-envify": "^1.1.0",
"scheduler": "^0.23.2"
},
"peerDependencies": {
"react": "^18.3.1"
}
},
"node_modules/react-is": {
"version": "18.2.0",
"resolved": "https://registry.npmjs.org/react-is/-/react-is-18.2.0.tgz",
@@ -7961,6 +7683,16 @@
"resolved": "https://registry.npmjs.org/safer-buffer/-/safer-buffer-2.1.2.tgz",
"integrity": "sha512-YZo3K82SD7Riyi0E1EQPojLz7kpepnSQI9IyPbHHg1XXXevb5dJI7tpyN2ADxGcQbHG7vcyRHk0cbwqcQriUtg=="
},
"node_modules/scheduler": {
"version": "0.23.2",
"resolved": "https://registry.npmjs.org/scheduler/-/scheduler-0.23.2.tgz",
"integrity": "sha512-UOShsPwz7NrMUqhR6t0hWjFduvOzbtv7toDH1/hIrfRNIDBnnBWd0CwJTGvTpngVlmwGCdP9/Zl/tVrDqcuYzQ==",
"license": "MIT",
"peer": true,
"dependencies": {
"loose-envify": "^1.1.0"
}
},
"node_modules/semver": {
"version": "7.6.0",
"resolved": "https://registry.npmjs.org/semver/-/semver-7.6.0.tgz",
@@ -7994,27 +7726,6 @@
"integrity": "sha512-3wdGidZyq5PB084XLES5TpOSRA3wjXAlIWMhum2kRcv/41Sn2emQ0dycQW4uZXLejwKvg6EsvbdlVL+FYEct7A==",
"dev": true
},
"node_modules/seroval": {
"version": "1.0.5",
"resolved": "https://registry.npmjs.org/seroval/-/seroval-1.0.5.tgz",
"integrity": "sha512-TM+Z11tHHvQVQKeNlOUonOWnsNM+2IBwZ4vwoi4j3zKzIpc5IDw8WPwCfcc8F17wy6cBcJGbZbFOR0UCuTZHQA==",
"peer": true,
"engines": {
"node": ">=10"
}
},
"node_modules/seroval-plugins": {
"version": "1.0.5",
"resolved": "https://registry.npmjs.org/seroval-plugins/-/seroval-plugins-1.0.5.tgz",
"integrity": "sha512-8+pDC1vOedPXjKG7oz8o+iiHrtF2WswaMQJ7CKFpccvSYfrzmvKY9zOJWCg+881722wIHfwkdnRmiiDm9ym+zQ==",
"peer": true,
"engines": {
"node": ">=10"
},
"peerDependencies": {
"seroval": "^1.0"
}
},
"node_modules/set-function-length": {
"version": "1.2.2",
"resolved": "https://registry.npmjs.org/set-function-length/-/set-function-length-1.2.2.tgz",
@@ -8050,6 +7761,23 @@
"node": ">=8"
}
},
"node_modules/side-channel": {
"version": "1.0.6",
"resolved": "https://registry.npmjs.org/side-channel/-/side-channel-1.0.6.tgz",
"integrity": "sha512-fDW/EZ6Q9RiO8eFG8Hj+7u/oW+XrPTIChwCOM2+th2A6OblDtYYIpve9m+KvI9Z4C9qSEXlaGR6bTEYHReuglA==",
"dependencies": {
"call-bind": "^1.0.7",
"es-errors": "^1.3.0",
"get-intrinsic": "^1.2.4",
"object-inspect": "^1.13.1"
},
"engines": {
"node": ">= 0.4"
},
"funding": {
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/signal-exit": {
"version": "3.0.7",
"resolved": "https://registry.npmjs.org/signal-exit/-/signal-exit-3.0.7.tgz",
@@ -8119,29 +7847,6 @@
"integrity": "sha512-dOy+3AuW3a2wNbZHIuMZpTcgjGuLU/uBL/ubcZF9OXbDo8ff4O8yVp5Bf0efS8uEoYo5q4Fx7dY9OgQGXgAsQA==",
"dev": true
},
"node_modules/solid-js": {
"version": "1.8.16",
"resolved": "https://registry.npmjs.org/solid-js/-/solid-js-1.8.16.tgz",
"integrity": "sha512-rja94MNU9flF3qQRLNsu60QHKBDKBkVE1DldJZPIfn2ypIn3NV2WpSbGTQIvsyGPBo+9E2IMjwqnqpbgfWuzeg==",
"peer": true,
"dependencies": {
"csstype": "^3.1.0",
"seroval": "^1.0.4",
"seroval-plugins": "^1.0.3"
}
},
"node_modules/solid-swr-store": {
"version": "0.10.7",
"resolved": "https://registry.npmjs.org/solid-swr-store/-/solid-swr-store-0.10.7.tgz",
"integrity": "sha512-A6d68aJmRP471aWqKKPE2tpgOiR5fH4qXQNfKIec+Vap+MGQm3tvXlT8n0I8UgJSlNAsSAUuw2VTviH2h3Vv5g==",
"engines": {
"node": ">=10"
},
"peerDependencies": {
"solid-js": "^1.2",
"swr-store": "^0.10"
}
},
"node_modules/source-map": {
"version": "0.6.1",
"resolved": "https://registry.npmjs.org/source-map/-/source-map-0.6.1.tgz",
@@ -8151,15 +7856,6 @@
"node": ">=0.10.0"
}
},
"node_modules/source-map-js": {
"version": "1.2.0",
"resolved": "https://registry.npmjs.org/source-map-js/-/source-map-js-1.2.0.tgz",
"integrity": "sha512-itJW8lvSA0TXEphiRoawsCksnlf8SyvmFzIhltqAHluXd88pkCd+cXJVHTDwdCr0IzwptSm035IHQktUu1QUMg==",
"peer": true,
"engines": {
"node": ">=0.10.0"
}
},
"node_modules/source-map-support": {
"version": "0.5.13",
"resolved": "https://registry.npmjs.org/source-map-support/-/source-map-support-0.5.13.tgz",
@@ -8176,17 +7872,6 @@
"integrity": "sha512-D9cPgkvLlV3t3IzL0D0YLvGA9Ahk4PcvVwUbN0dSGr1aP0Nrt4AEnTUbuGvquEC0mA64Gqt1fzirlRs5ibXx8g==",
"dev": true
},
"node_modules/sswr": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/sswr/-/sswr-2.0.0.tgz",
"integrity": "sha512-mV0kkeBHcjcb0M5NqKtKVg/uTIYNlIIniyDfSGrSfxpEdM9C365jK0z55pl9K0xAkNTJi2OAOVFQpgMPUk+V0w==",
"dependencies": {
"swrev": "^4.0.0"
},
"peerDependencies": {
"svelte": "^4.0.0"
}
},
"node_modules/stack-utils": {
"version": "2.0.6",
"resolved": "https://registry.npmjs.org/stack-utils/-/stack-utils-2.0.6.tgz",
@@ -8318,66 +8003,6 @@
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/svelte": {
"version": "4.2.12",
"resolved": "https://registry.npmjs.org/svelte/-/svelte-4.2.12.tgz",
"integrity": "sha512-d8+wsh5TfPwqVzbm4/HCXC783/KPHV60NvwitJnyTA5lWn1elhXMNWhXGCJ7PwPa8qFUnyJNIyuIRt2mT0WMug==",
"peer": true,
"dependencies": {
"@ampproject/remapping": "^2.2.1",
"@jridgewell/sourcemap-codec": "^1.4.15",
"@jridgewell/trace-mapping": "^0.3.18",
"@types/estree": "^1.0.1",
"acorn": "^8.9.0",
"aria-query": "^5.3.0",
"axobject-query": "^4.0.0",
"code-red": "^1.0.3",
"css-tree": "^2.3.1",
"estree-walker": "^3.0.3",
"is-reference": "^3.0.1",
"locate-character": "^3.0.0",
"magic-string": "^0.30.4",
"periscopic": "^3.1.0"
},
"engines": {
"node": ">=16"
}
},
"node_modules/swr": {
"version": "2.2.0",
"resolved": "https://registry.npmjs.org/swr/-/swr-2.2.0.tgz",
"integrity": "sha512-AjqHOv2lAhkuUdIiBu9xbuettzAzWXmCEcLONNKJRba87WAefz8Ca9d6ds/SzrPc235n1IxWYdhJ2zF3MNUaoQ==",
"dependencies": {
"use-sync-external-store": "^1.2.0"
},
"peerDependencies": {
"react": "^16.11.0 || ^17.0.0 || ^18.0.0"
}
},
"node_modules/swr-store": {
"version": "0.10.6",
"resolved": "https://registry.npmjs.org/swr-store/-/swr-store-0.10.6.tgz",
"integrity": "sha512-xPjB1hARSiRaNNlUQvWSVrG5SirCjk2TmaUyzzvk69SZQan9hCJqw/5rG9iL7xElHU784GxRPISClq4488/XVw==",
"dependencies": {
"dequal": "^2.0.3"
},
"engines": {
"node": ">=10"
}
},
"node_modules/swrev": {
"version": "4.0.0",
"resolved": "https://registry.npmjs.org/swrev/-/swrev-4.0.0.tgz",
"integrity": "sha512-LqVcOHSB4cPGgitD1riJ1Hh4vdmITOp+BkmfmXRh4hSF/t7EnS4iD+SOTmq7w5pPm/SiPeto4ADbKS6dHUDWFA=="
},
"node_modules/swrv": {
"version": "1.0.4",
"resolved": "https://registry.npmjs.org/swrv/-/swrv-1.0.4.tgz",
"integrity": "sha512-zjEkcP8Ywmj+xOJW3lIT65ciY/4AL4e/Or7Gj0MzU3zBJNMdJiT8geVZhINavnlHRMMCcJLHhraLTAiDOTmQ9g==",
"peerDependencies": {
"vue": ">=3.2.26 < 4"
}
},
"node_modules/terminal-columns": {
"version": "1.4.1",
"resolved": "https://registry.npmjs.org/terminal-columns/-/terminal-columns-1.4.1.tgz",
@@ -8618,7 +8243,7 @@
"version": "4.9.5",
"resolved": "https://registry.npmjs.org/typescript/-/typescript-4.9.5.tgz",
"integrity": "sha512-1FXk9E2Hm+QzZQ7z+McJiHL4NW1F2EzMu9Nq9i3zAaGqibafqYwCVU6WyWAuyQRRzOlxou8xZSyXLEN8oKj24g==",
"devOptional": true,
"dev": true,
"bin": {
"tsc": "bin/tsc",
"tsserver": "bin/tsserver"
@@ -8687,14 +8312,6 @@
"punycode": "^2.1.0"
}
},
"node_modules/use-sync-external-store": {
"version": "1.2.0",
"resolved": "https://registry.npmjs.org/use-sync-external-store/-/use-sync-external-store-1.2.0.tgz",
"integrity": "sha512-eEgnFxGQ1Ife9bzYs6VLi8/4X6CObHMw9Qr9tPY43iKwsPw8xE8+EFsf/2cFZ5S3esXgpWgtSCtLNS41F+sKPA==",
"peerDependencies": {
"react": "^16.8.0 || ^17.0.0 || ^18.0.0"
}
},
"node_modules/util-deprecate": {
"version": "1.0.2",
"resolved": "https://registry.npmjs.org/util-deprecate/-/util-deprecate-1.0.2.tgz",
@@ -8728,27 +8345,6 @@
"node": ">=10.12.0"
}
},
"node_modules/vue": {
"version": "3.4.21",
"resolved": "https://registry.npmjs.org/vue/-/vue-3.4.21.tgz",
"integrity": "sha512-5hjyV/jLEIKD/jYl4cavMcnzKwjMKohureP8ejn3hhEjwhWIhWeuzL2kJAjzl/WyVsgPY56Sy4Z40C3lVshxXA==",
"peer": true,
"dependencies": {
"@vue/compiler-dom": "3.4.21",
"@vue/compiler-sfc": "3.4.21",
"@vue/runtime-dom": "3.4.21",
"@vue/server-renderer": "3.4.21",
"@vue/shared": "3.4.21"
},
"peerDependencies": {
"typescript": "*"
},
"peerDependenciesMeta": {
"typescript": {
"optional": true
}
}
},
"node_modules/walker": {
"version": "1.0.8",
"resolved": "https://registry.npmjs.org/walker/-/walker-1.0.8.tgz",
@@ -8911,6 +8507,15 @@
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/zod": {
"version": "3.23.8",
"resolved": "https://registry.npmjs.org/zod/-/zod-3.23.8.tgz",
"integrity": "sha512-XBx9AXhXktjUqnepgTiE5flcKIYWi/rme0Eaj+5Y0lftuGBq+jyRu/md4WnuxqgP1ubdpNCsYEYPxrzVHD8d6g==",
"license": "MIT",
"funding": {
"url": "https://github.com/sponsors/colinhacks"
}
}
}
}

View File

@@ -1,6 +1,6 @@
{
"name": "opencommit",
"version": "3.1.2",
"version": "3.2.3",
"description": "Auto-generate impressive commits in 1 second. Killing lame commits with AI 🤯🔫",
"keywords": [
"git",
@@ -46,8 +46,9 @@
"dev:gemini": "OCO_AI_PROVIDER='gemini' ts-node ./src/cli.ts",
"build": "rimraf out && node esbuild.config.js",
"build:push": "npm run build && git add . && git commit -m 'build' && git push",
"deploy": "npm run build:push && git push --tags && npm publish --tag latest",
"deploy:patch": "npm version patch && npm run deploy",
"deploy": "npm publish --tag latest",
"deploy:build": "npm run build:push && git push --tags && npm run deploy",
"deploy:patch": "npm version patch && npm run deploy:build",
"lint": "eslint src --ext ts && tsc --noEmit",
"format": "prettier --write src",
"test": "node --no-warnings --experimental-vm-modules $( [ -f ./node_modules/.bin/jest ] && echo ./node_modules/.bin/jest || which jest ) test/unit",
@@ -57,7 +58,8 @@
"test:unit:docker": "npm run test:docker-build && DOCKER_CONTENT_TRUST=0 docker run --rm oco-test npm run test:unit",
"test:e2e": "npm run test:e2e:setup && jest test/e2e",
"test:e2e:setup": "sh test/e2e/setup.sh",
"test:e2e:docker": "npm run test:docker-build && DOCKER_CONTENT_TRUST=0 docker run --rm oco-test npm run test:e2e"
"test:e2e:docker": "npm run test:docker-build && DOCKER_CONTENT_TRUST=0 docker run --rm oco-test npm run test:e2e",
"mlx:start": "OCO_AI_PROVIDER='mlx' node ./out/cli.cjs"
},
"devDependencies": {
"@commitlint/types": "^17.4.4",
@@ -86,9 +88,9 @@
"@clack/prompts": "^0.6.1",
"@dqbd/tiktoken": "^1.0.2",
"@google/generative-ai": "^0.11.4",
"@mistralai/mistralai": "^1.3.5",
"@octokit/webhooks-schemas": "^6.11.0",
"@octokit/webhooks-types": "^6.11.0",
"ai": "^2.2.14",
"axios": "^1.3.4",
"chalk": "^5.2.0",
"cleye": "^1.3.2",
@@ -97,6 +99,8 @@
"ignore": "^5.2.4",
"ini": "^3.0.1",
"inquirer": "^9.1.4",
"openai": "^4.56.0"
"openai": "^4.57.0",
"punycode": "^2.3.1",
"zod": "^3.23.8"
}
}

View File

@@ -9,6 +9,7 @@ import { configCommand } from './commands/config';
import { hookCommand, isHookCalled } from './commands/githook.js';
import { prepareCommitMessageHook } from './commands/prepare-commit-msg-hook';
import { checkIsLatestVersion } from './utils/checkIsLatestVersion';
import { runMigrations } from './migrations/_run.js';
const extraArgs = process.argv.slice(2);
@@ -19,6 +20,12 @@ cli(
commands: [configCommand, hookCommand, commitlintConfigCommand],
flags: {
fgm: Boolean,
context: {
type: String,
alias: 'c',
description: 'Additional user input context for the commit message',
default: ''
},
yes: {
type: Boolean,
alias: 'y',
@@ -30,12 +37,13 @@ cli(
help: { description: packageJSON.description }
},
async ({ flags }) => {
await runMigrations();
await checkIsLatestVersion();
if (await isHookCalled()) {
prepareCommitMessageHook();
} else {
commit(extraArgs, false, flags.fgm, flags.yes);
commit(extraArgs, flags.context, false, flags.fgm, flags.yes);
}
},
extraArgs
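
The `runMigrations()` call imported above is what moves existing user configs onto the consolidated key names before the CLI does anything else. The migration code itself is not part of this compare view, so the following is a hypothetical sketch of the idea rather than the project's implementation:

```typescript
// Hypothetical sketch only: the real migrations live under src/migrations and
// are not shown in this diff. Key names below come from the config changes.
type ConfigRecord = Record<string, string | undefined>;

const RENAMED_KEYS: Record<string, string> = {
  OCO_OPENAI_API_KEY: 'OCO_API_KEY',
  OCO_ANTHROPIC_API_KEY: 'OCO_API_KEY',
  OCO_AZURE_API_KEY: 'OCO_API_KEY',
  OCO_OPENAI_BASE_PATH: 'OCO_API_URL',
  OCO_OLLAMA_API_URL: 'OCO_API_URL'
};

export function migrateConfigKeys(config: ConfigRecord): ConfigRecord {
  const migrated: ConfigRecord = { ...config };
  for (const [oldKey, newKey] of Object.entries(RENAMED_KEYS)) {
    // Copy the old value over only if the new key is not already set, then drop the old key.
    if (migrated[oldKey] !== undefined && migrated[newKey] === undefined) {
      migrated[newKey] = migrated[oldKey];
    }
    delete migrated[oldKey];
  }
  return migrated;
}
```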

View File

@@ -39,6 +39,7 @@ const checkMessageTemplate = (extraArgs: string[]): string | false => {
interface GenerateCommitMessageFromGitDiffParams {
diff: string;
extraArgs: string[];
context?: string;
fullGitMojiSpec?: boolean;
skipCommitConfirmation?: boolean;
}
@@ -46,6 +47,7 @@ interface GenerateCommitMessageFromGitDiffParams {
const generateCommitMessageFromGitDiff = async ({
diff,
extraArgs,
context = '',
fullGitMojiSpec = false,
skipCommitConfirmation = false
}: GenerateCommitMessageFromGitDiffParams): Promise<void> => {
@@ -56,7 +58,8 @@ const generateCommitMessageFromGitDiff = async ({
try {
let commitMessage = await generateCommitMessageByDiff(
diff,
fullGitMojiSpec
fullGitMojiSpec,
context
);
const messageTemplate = checkMessageTemplate(extraArgs);
@@ -107,13 +110,16 @@ ${chalk.grey('——————————————————')}`
const remotes = await getGitRemotes();
// user isn't pushing, return early
if (config.OCO_GITPUSH === false) return;
if (!remotes.length) {
const { stdout } = await execa('git', ['push']);
if (stdout) outro(stdout);
process.exit(0);
}
if (remotes.length === 1 && config.OCO_GITPUSH !== true) {
if (remotes.length === 1) {
const isPushConfirmedByUser = await confirm({
message: 'Do you want to run `git push`?'
});
@@ -132,8 +138,7 @@ ${chalk.grey('——————————————————')}`
]);
pushSpinner.stop(
`${chalk.green('✔')} Successfully pushed all commits to ${
remotes[0]
`${chalk.green('✔')} Successfully pushed all commits to ${remotes[0]
}`
);
@@ -143,26 +148,29 @@ ${chalk.grey('——————————————————')}`
process.exit(0);
}
} else {
const skipOption = `don't push`
const selectedRemote = (await select({
message: 'Choose a remote to push to',
options: remotes.map((remote) => ({ value: remote, label: remote }))
options: [...remotes, skipOption].map((remote) => ({ value: remote, label: remote })),
})) as string;
if (isCancel(selectedRemote)) process.exit(1);
const pushSpinner = spinner();
pushSpinner.start(`Running 'git push ${selectedRemote}'`);
const { stdout } = await execa('git', ['push', selectedRemote]);
pushSpinner.stop(
`${chalk.green(
'✔'
)} Successfully pushed all commits to ${selectedRemote}`
);
if (stdout) outro(stdout);
if (selectedRemote !== skipOption) {
const pushSpinner = spinner();
pushSpinner.start(`Running 'git push ${selectedRemote}'`);
const { stdout } = await execa('git', ['push', selectedRemote]);
if (stdout) outro(stdout);
pushSpinner.stop(
`${chalk.green(
'✔'
)} successfully pushed all commits to ${selectedRemote}`
);
}
}
} else {
const regenerateMessage = await confirm({
@@ -180,7 +188,11 @@ ${chalk.grey('——————————————————')}`
}
}
} catch (error) {
commitGenerationSpinner.stop('📝 Commit message generated');
commitGenerationSpinner.stop(
`${chalk.red('✖')} Failed to generate the commit message`
);
console.log(error);
const err = error as Error;
outro(`${chalk.red('✖')} ${err?.message || err}`);
@@ -190,6 +202,7 @@ ${chalk.grey('——————————————————')}`
export async function commit(
extraArgs: string[] = [],
context: string = '',
isStageAllFlag: Boolean = false,
fullGitMojiSpec: boolean = false,
skipCommitConfirmation: boolean = false
@@ -231,7 +244,7 @@ export async function commit(
if (isCancel(isStageAllAndCommitConfirmedByUser)) process.exit(1);
if (isStageAllAndCommitConfirmedByUser) {
await commit(extraArgs, true, fullGitMojiSpec);
await commit(extraArgs, context, true, fullGitMojiSpec);
process.exit(1);
}
@@ -249,7 +262,7 @@ export async function commit(
await gitAdd({ files });
}
await commit(extraArgs, false, fullGitMojiSpec);
await commit(extraArgs, context, false, fullGitMojiSpec);
process.exit(1);
}
@@ -263,6 +276,7 @@ export async function commit(
generateCommitMessageFromGitDiff({
diff: await getDiff({ files: stagedFiles }),
extraArgs,
context,
fullGitMojiSpec,
skipCommitConfirmation
})
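
Besides threading `context` through to `generateCommitMessageByDiff`, the hunk above adds a `don't push` entry to the remote selector so that multi-remote repositories can skip pushing. A self-contained sketch of that selection pattern (remote names are examples; the real logic sits inline in `commit.ts`):

```typescript
// Sketch of the "don't push" selection pattern added above, using the same
// @clack/prompts primitives that commit.ts already relies on.
import { select, isCancel } from '@clack/prompts';

async function choosePushTarget(remotes: string[]): Promise<string | null> {
  const skipOption = `don't push`;

  const selected = (await select({
    message: 'Choose a remote to push to',
    options: [...remotes, skipOption].map((remote) => ({
      value: remote,
      label: remote
    }))
  })) as string;

  if (isCancel(selected)) process.exit(1);

  // Returning null tells the caller to skip `git push` entirely.
  return selected === skipOption ? null : selected;
}

// Example: choosePushTarget(['origin', 'upstream']) offers both remotes plus "don't push".
```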

View File

@@ -11,14 +11,9 @@ import { TEST_MOCK_TYPES } from '../engine/testAi';
import { getI18nLocal, i18n } from '../i18n';
export enum CONFIG_KEYS {
OCO_OPENAI_API_KEY = 'OCO_OPENAI_API_KEY',
OCO_ANTHROPIC_API_KEY = 'OCO_ANTHROPIC_API_KEY',
OCO_AZURE_API_KEY = 'OCO_AZURE_API_KEY',
OCO_GEMINI_API_KEY = 'OCO_GEMINI_API_KEY',
OCO_GEMINI_BASE_PATH = 'OCO_GEMINI_BASE_PATH',
OCO_API_KEY = 'OCO_API_KEY',
OCO_TOKENS_MAX_INPUT = 'OCO_TOKENS_MAX_INPUT',
OCO_TOKENS_MAX_OUTPUT = 'OCO_TOKENS_MAX_OUTPUT',
OCO_OPENAI_BASE_PATH = 'OCO_OPENAI_BASE_PATH',
OCO_DESCRIPTION = 'OCO_DESCRIPTION',
OCO_EMOJI = 'OCO_EMOJI',
OCO_MODEL = 'OCO_MODEL',
@@ -27,14 +22,10 @@ export enum CONFIG_KEYS {
OCO_MESSAGE_TEMPLATE_PLACEHOLDER = 'OCO_MESSAGE_TEMPLATE_PLACEHOLDER',
OCO_PROMPT_MODULE = 'OCO_PROMPT_MODULE',
OCO_AI_PROVIDER = 'OCO_AI_PROVIDER',
OCO_GITPUSH = 'OCO_GITPUSH', // todo: deprecate
OCO_ONE_LINE_COMMIT = 'OCO_ONE_LINE_COMMIT',
OCO_AZURE_ENDPOINT = 'OCO_AZURE_ENDPOINT',
OCO_TEST_MOCK_TYPE = 'OCO_TEST_MOCK_TYPE',
OCO_API_URL = 'OCO_API_URL',
OCO_OLLAMA_API_URL = 'OCO_OLLAMA_API_URL',
OCO_FLOWISE_ENDPOINT = 'OCO_FLOWISE_ENDPOINT',
OCO_FLOWISE_API_KEY = 'OCO_FLOWISE_API_KEY'
OCO_GITPUSH = 'OCO_GITPUSH' // todo: deprecate
}
export enum CONFIG_MODES {
@@ -85,6 +76,58 @@ export const MODEL_LIST = {
'gemini-1.0-pro',
'gemini-pro-vision',
'text-embedding-004'
],
groq: [
'llama3-70b-8192', // Meta Llama 3 70B (default one, no daily token limit and 14 400 reqs/day)
'llama3-8b-8192', // Meta Llama 3 8B
'llama-guard-3-8b', // Llama Guard 3 8B
'llama-3.1-8b-instant', // Llama 3.1 8B (Preview)
'llama-3.1-70b-versatile', // Llama 3.1 70B (Preview)
'gemma-7b-it', // Gemma 7B
'gemma2-9b-it' // Gemma 2 9B
],
mistral: [
'ministral-3b-2410',
'ministral-3b-latest',
'ministral-8b-2410',
'ministral-8b-latest',
'open-mistral-7b',
'mistral-tiny',
'mistral-tiny-2312',
'open-mistral-nemo',
'open-mistral-nemo-2407',
'mistral-tiny-2407',
'mistral-tiny-latest',
'open-mixtral-8x7b',
'mistral-small',
'mistral-small-2312',
'open-mixtral-8x22b',
'open-mixtral-8x22b-2404',
'mistral-small-2402',
'mistral-small-2409',
'mistral-small-latest',
'mistral-medium-2312',
'mistral-medium',
'mistral-medium-latest',
'mistral-large-2402',
'mistral-large-2407',
'mistral-large-2411',
'mistral-large-latest',
'pixtral-large-2411',
'pixtral-large-latest',
'codestral-2405',
'codestral-latest',
'codestral-mamba-2407',
'open-codestral-mamba',
'codestral-mamba-latest',
'pixtral-12b-2409',
'pixtral-12b',
'pixtral-12b-latest',
'mistral-embed',
'mistral-moderation-2411',
'mistral-moderation-latest',
]
};
@@ -92,10 +135,16 @@ const getDefaultModel = (provider: string | undefined): string => {
switch (provider) {
case 'ollama':
return '';
case 'mlx':
return '';
case 'anthropic':
return MODEL_LIST.anthropic[0];
case 'gemini':
return MODEL_LIST.gemini[0];
case 'groq':
return MODEL_LIST.groq[0];
case 'mistral':
return MODEL_LIST.mistral[0];
default:
return MODEL_LIST.openai[0];
}
@@ -123,65 +172,19 @@ const validateConfig = (
};
export const configValidators = {
[CONFIG_KEYS.OCO_OPENAI_API_KEY](value: any, config: any = {}) {
[CONFIG_KEYS.OCO_API_KEY](value: any, config: any = {}) {
if (config.OCO_AI_PROVIDER !== 'openai') return value;
validateConfig(
'OCO_OPENAI_API_KEY',
'OCO_API_KEY',
typeof value === 'string' && value.length > 0,
'Empty value is not allowed'
);
validateConfig(
'OCO_OPENAI_API_KEY',
'OCO_API_KEY',
value,
'You need to provide the OCO_OPENAI_API_KEY when OCO_AI_PROVIDER is set to "openai" (default). Run `oco config set OCO_OPENAI_API_KEY=your_key`'
);
return value;
},
[CONFIG_KEYS.OCO_AZURE_API_KEY](value: any, config: any = {}) {
if (config.OCO_AI_PROVIDER !== 'azure') return value;
validateConfig(
'OCO_AZURE_API_KEY',
!!value,
'You need to provide the OCO_AZURE_API_KEY when OCO_AI_PROVIDER is set to "azure". Run: `oco config set OCO_AZURE_API_KEY=your_key`'
);
return value;
},
[CONFIG_KEYS.OCO_GEMINI_API_KEY](value: any, config: any = {}) {
if (config.OCO_AI_PROVIDER !== 'gemini') return value;
validateConfig(
'OCO_GEMINI_API_KEY',
value || config.OCO_GEMINI_API_KEY || config.OCO_AI_PROVIDER === 'test',
'You need to provide the OCO_GEMINI_API_KEY when OCO_AI_PROVIDER is set to "gemini". Run: `oco config set OCO_GEMINI_API_KEY=your_key`'
);
return value;
},
[CONFIG_KEYS.OCO_ANTHROPIC_API_KEY](value: any, config: any = {}) {
if (config.OCO_AI_PROVIDER !== 'anthropic') return value;
validateConfig(
'ANTHROPIC_API_KEY',
!!value,
'You need to provide the OCO_ANTHROPIC_API_KEY key when OCO_AI_PROVIDER is set to "anthropic". Run: `oco config set OCO_ANTHROPIC_API_KEY=your_key`'
);
return value;
},
[CONFIG_KEYS.OCO_FLOWISE_API_KEY](value: any, config: any = {}) {
validateConfig(
CONFIG_KEYS.OCO_FLOWISE_API_KEY,
value || config.OCO_AI_PROVIDER !== 'flowise',
'You need to provide the OCO_FLOWISE_API_KEY when OCO_AI_PROVIDER is set to "flowise". Run: `oco config set OCO_FLOWISE_API_KEY=your_key`'
'You need to provide the OCO_API_KEY when OCO_AI_PROVIDER set to "openai" (default) or "ollama" or "mlx" or "azure" or "gemini" or "flowise" or "anthropic". Run `oco config set OCO_API_KEY=your_key OCO_AI_PROVIDER=openai`'
);
return value;
@@ -241,11 +244,11 @@ export const configValidators = {
return getI18nLocal(value);
},
[CONFIG_KEYS.OCO_OPENAI_BASE_PATH](value: any) {
[CONFIG_KEYS.OCO_API_URL](value: any) {
validateConfig(
CONFIG_KEYS.OCO_OPENAI_BASE_PATH,
CONFIG_KEYS.OCO_API_URL,
typeof value === 'string',
'Must be string'
`${value} is not a valid URL. It should start with 'http://' or 'https://'.`
);
return value;
},
@@ -296,10 +299,17 @@ export const configValidators = {
validateConfig(
CONFIG_KEYS.OCO_AI_PROVIDER,
['openai', 'anthropic', 'gemini', 'azure', 'test', 'flowise'].includes(
value
) || value.startsWith('ollama'),
`${value} is not supported yet, use 'ollama', 'anthropic', 'azure', 'gemini', 'flowise' or 'openai' (default)`
[
'openai',
'mistral',
'anthropic',
'gemini',
'azure',
'test',
'flowise',
'groq'
].includes(value) || value.startsWith('ollama'),
`${value} is not supported yet, use 'ollama', 'mlx', 'anthropic', 'azure', 'gemini', 'flowise', 'mistral' or 'openai' (default)`
);
return value;
@@ -315,26 +325,6 @@ export const configValidators = {
return value;
},
[CONFIG_KEYS.OCO_AZURE_ENDPOINT](value: any) {
validateConfig(
CONFIG_KEYS.OCO_AZURE_ENDPOINT,
value.includes('openai.azure.com'),
'Must be in format "https://<resource name>.openai.azure.com/"'
);
return value;
},
[CONFIG_KEYS.OCO_FLOWISE_ENDPOINT](value: any) {
validateConfig(
CONFIG_KEYS.OCO_FLOWISE_ENDPOINT,
typeof value === 'string' && value.includes(':'),
'Value must be string and should include both I.P. and port number' // Considering the possibility of DNS lookup or feeding the I.P. explicitly, there is no pattern to verify, except a column for the port number
);
return value;
},
[CONFIG_KEYS.OCO_TEST_MOCK_TYPE](value: any) {
validateConfig(
CONFIG_KEYS.OCO_TEST_MOCK_TYPE,
@@ -346,11 +336,11 @@ export const configValidators = {
return value;
},
[CONFIG_KEYS.OCO_OLLAMA_API_URL](value: any) {
[CONFIG_KEYS.OCO_WHY](value: any) {
validateConfig(
CONFIG_KEYS.OCO_OLLAMA_API_URL,
typeof value === 'string' && value.startsWith('http'),
`${value} is not a valid URL. It should start with 'http://' or 'https://'.`
CONFIG_KEYS.OCO_WHY,
typeof value === 'boolean',
'Must be true or false'
);
return value;
}
@@ -363,18 +353,17 @@ export enum OCO_AI_PROVIDER_ENUM {
GEMINI = 'gemini',
AZURE = 'azure',
TEST = 'test',
FLOWISE = 'flowise'
FLOWISE = 'flowise',
GROQ = 'groq',
MISTRAL = 'mistral',
MLX = 'mlx'
}
export type ConfigType = {
[CONFIG_KEYS.OCO_OPENAI_API_KEY]?: string;
[CONFIG_KEYS.OCO_ANTHROPIC_API_KEY]?: string;
[CONFIG_KEYS.OCO_AZURE_API_KEY]?: string;
[CONFIG_KEYS.OCO_GEMINI_API_KEY]?: string;
[CONFIG_KEYS.OCO_GEMINI_BASE_PATH]?: string;
[CONFIG_KEYS.OCO_API_KEY]?: string;
[CONFIG_KEYS.OCO_TOKENS_MAX_INPUT]: number;
[CONFIG_KEYS.OCO_TOKENS_MAX_OUTPUT]: number;
[CONFIG_KEYS.OCO_OPENAI_BASE_PATH]?: string;
[CONFIG_KEYS.OCO_API_URL]?: string;
[CONFIG_KEYS.OCO_DESCRIPTION]: boolean;
[CONFIG_KEYS.OCO_EMOJI]: boolean;
[CONFIG_KEYS.OCO_WHY]: boolean;
@@ -385,16 +374,11 @@ export type ConfigType = {
[CONFIG_KEYS.OCO_AI_PROVIDER]: OCO_AI_PROVIDER_ENUM;
[CONFIG_KEYS.OCO_GITPUSH]: boolean;
[CONFIG_KEYS.OCO_ONE_LINE_COMMIT]: boolean;
[CONFIG_KEYS.OCO_AZURE_ENDPOINT]?: string;
[CONFIG_KEYS.OCO_TEST_MOCK_TYPE]: string;
[CONFIG_KEYS.OCO_API_URL]?: string;
[CONFIG_KEYS.OCO_OLLAMA_API_URL]?: string;
[CONFIG_KEYS.OCO_FLOWISE_ENDPOINT]: string;
[CONFIG_KEYS.OCO_FLOWISE_API_KEY]?: string;
};
const defaultConfigPath = pathJoin(homedir(), '.opencommit');
const defaultEnvPath = pathResolve(process.cwd(), '.env');
export const defaultConfigPath = pathJoin(homedir(), '.opencommit');
export const defaultEnvPath = pathResolve(process.cwd(), '.env');
const assertConfigsAreValid = (config: Record<string, any>) => {
for (const [key, value] of Object.entries(config)) {
@@ -436,7 +420,6 @@ export const DEFAULT_CONFIG = {
OCO_AI_PROVIDER: OCO_AI_PROVIDER_ENUM.OPENAI,
OCO_ONE_LINE_COMMIT: false,
OCO_TEST_MOCK_TYPE: 'commit-message',
OCO_FLOWISE_ENDPOINT: ':',
OCO_WHY: false,
OCO_GITPUSH: true // todo: deprecate
};
@@ -446,7 +429,7 @@ const initGlobalConfig = (configPath: string = defaultConfigPath) => {
return DEFAULT_CONFIG;
};
const parseEnvVarValue = (value?: any) => {
const parseConfigVarValue = (value?: any) => {
try {
return JSON.parse(value);
} catch (error) {
@@ -459,41 +442,45 @@ const getEnvConfig = (envPath: string) => {
return {
OCO_MODEL: process.env.OCO_MODEL,
OCO_API_URL: process.env.OCO_API_URL,
OCO_API_KEY: process.env.OCO_API_KEY,
OCO_AI_PROVIDER: process.env.OCO_AI_PROVIDER as OCO_AI_PROVIDER_ENUM,
OCO_OPENAI_API_KEY: process.env.OCO_OPENAI_API_KEY,
OCO_ANTHROPIC_API_KEY: process.env.OCO_ANTHROPIC_API_KEY,
OCO_AZURE_API_KEY: process.env.OCO_AZURE_API_KEY,
OCO_GEMINI_API_KEY: process.env.OCO_GEMINI_API_KEY,
OCO_FLOWISE_API_KEY: process.env.OCO_FLOWISE_API_KEY,
OCO_TOKENS_MAX_INPUT: parseConfigVarValue(process.env.OCO_TOKENS_MAX_INPUT),
OCO_TOKENS_MAX_OUTPUT: parseConfigVarValue(
process.env.OCO_TOKENS_MAX_OUTPUT
),
OCO_TOKENS_MAX_INPUT: parseEnvVarValue(process.env.OCO_TOKENS_MAX_INPUT),
OCO_TOKENS_MAX_OUTPUT: parseEnvVarValue(process.env.OCO_TOKENS_MAX_OUTPUT),
OCO_OPENAI_BASE_PATH: process.env.OCO_OPENAI_BASE_PATH,
OCO_GEMINI_BASE_PATH: process.env.OCO_GEMINI_BASE_PATH,
OCO_AZURE_ENDPOINT: process.env.OCO_AZURE_ENDPOINT,
OCO_FLOWISE_ENDPOINT: process.env.OCO_FLOWISE_ENDPOINT,
OCO_OLLAMA_API_URL: process.env.OCO_OLLAMA_API_URL,
OCO_DESCRIPTION: parseEnvVarValue(process.env.OCO_DESCRIPTION),
OCO_EMOJI: parseEnvVarValue(process.env.OCO_EMOJI),
OCO_DESCRIPTION: parseConfigVarValue(process.env.OCO_DESCRIPTION),
OCO_EMOJI: parseConfigVarValue(process.env.OCO_EMOJI),
OCO_LANGUAGE: process.env.OCO_LANGUAGE,
OCO_MESSAGE_TEMPLATE_PLACEHOLDER:
process.env.OCO_MESSAGE_TEMPLATE_PLACEHOLDER,
OCO_PROMPT_MODULE: process.env.OCO_PROMPT_MODULE as OCO_PROMPT_MODULE_ENUM,
OCO_AI_PROVIDER: process.env.OCO_AI_PROVIDER as OCO_AI_PROVIDER_ENUM,
OCO_ONE_LINE_COMMIT: parseEnvVarValue(process.env.OCO_ONE_LINE_COMMIT),
OCO_ONE_LINE_COMMIT: parseConfigVarValue(process.env.OCO_ONE_LINE_COMMIT),
OCO_TEST_MOCK_TYPE: process.env.OCO_TEST_MOCK_TYPE,
OCO_GITPUSH: parseEnvVarValue(process.env.OCO_GITPUSH) // todo: deprecate
OCO_GITPUSH: parseConfigVarValue(process.env.OCO_GITPUSH) // todo: deprecate
};
};
const getGlobalConfig = (configPath: string) => {
export const setGlobalConfig = (
config: ConfigType,
configPath: string = defaultConfigPath
) => {
writeFileSync(configPath, iniStringify(config), 'utf8');
};
export const getIsGlobalConfigFileExist = (
configPath: string = defaultConfigPath
) => {
return existsSync(configPath);
};
export const getGlobalConfig = (configPath: string = defaultConfigPath) => {
let globalConfig: ConfigType;
const isGlobalConfigFileExist = existsSync(configPath);
const isGlobalConfigFileExist = getIsGlobalConfigFileExist(configPath);
if (!isGlobalConfigFileExist) globalConfig = initGlobalConfig(configPath);
else {
const configFile = readFileSync(configPath, 'utf8');
@@ -510,18 +497,39 @@ const getGlobalConfig = (configPath: string) => {
* @param fallback - global ~/.opencommit config file
* @returns merged config
*/
const mergeConfigs = (main: Partial<ConfigType>, fallback: ConfigType) =>
Object.keys(CONFIG_KEYS).reduce((acc, key) => {
acc[key] = parseEnvVarValue(main[key] ?? fallback[key]);
const mergeConfigs = (main: Partial<ConfigType>, fallback: ConfigType) => {
const allKeys = new Set([...Object.keys(main), ...Object.keys(fallback)]);
return Array.from(allKeys).reduce((acc, key) => {
acc[key] = parseConfigVarValue(main[key] ?? fallback[key]);
return acc;
}, {} as ConfigType);
};
interface GetConfigOptions {
globalPath?: string;
envPath?: string;
setDefaultValues?: boolean;
}
const cleanUndefinedValues = (config: ConfigType) => {
return Object.fromEntries(
Object.entries(config).map(([_, v]) => {
try {
if (typeof v === 'string') {
if (v === 'undefined') return [_, undefined];
if (v === 'null') return [_, null];
const parsedValue = JSON.parse(v);
return [_, parsedValue];
}
return [_, v];
} catch (error) {
return [_, v];
}
})
);
};
export const getConfig = ({
envPath = defaultEnvPath,
globalPath = defaultConfigPath
@@ -531,17 +539,21 @@ export const getConfig = ({
const config = mergeConfigs(envConfig, globalConfig);
return config;
const cleanConfig = cleanUndefinedValues(config);
return cleanConfig as ConfigType;
};
export const setConfig = (
keyValues: [key: string, value: string][],
keyValues: [key: string, value: string | boolean | number | null][],
globalConfigPath: string = defaultConfigPath
) => {
const config = getConfig({
globalPath: globalConfigPath
});
const configToSet = {};
for (let [key, value] of keyValues) {
if (!configValidators.hasOwnProperty(key)) {
const supportedKeys = Object.keys(configValidators).join('\n');
@@ -553,7 +565,8 @@ export const setConfig = (
let parsedConfigValue;
try {
parsedConfigValue = JSON.parse(value);
if (typeof value === 'string') parsedConfigValue = JSON.parse(value);
else parsedConfigValue = value;
} catch (error) {
parsedConfigValue = value;
}
@@ -563,10 +576,10 @@ export const setConfig = (
config
);
config[key] = validValue;
configToSet[key] = validValue;
}
writeFileSync(globalConfigPath, iniStringify(config), 'utf8');
setGlobalConfig(mergeConfigs(configToSet, config), globalConfigPath);
outro(`${chalk.green('✔')} config successfully set`);
};
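
The block above consolidates all provider credentials behind the single OCO_API_KEY / OCO_API_URL pair and exports setConfig, getConfig, setGlobalConfig and the default paths. A minimal usage sketch of that exported API, assuming the import path and placeholder values shown below (not part of the diff):

```
import { CONFIG_KEYS, getConfig, setConfig } from './src/commands/config';

// Persist the provider-agnostic key and URL to the global ~/.opencommit file
// (setGlobalConfig above writes it with writeFileSync under the hood).
setConfig([
  [CONFIG_KEYS.OCO_API_KEY, 'sk-placeholder-key'],
  [CONFIG_KEYS.OCO_API_URL, 'https://api.openai.com/v1']
]);

// Read the merged result; values from a local .env still win over the global file.
const config = getConfig();
console.log(config.OCO_API_KEY, config.OCO_AI_PROVIDER);
```
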

View File

@@ -39,14 +39,11 @@ export const prepareCommitMessageHook = async (
const config = getConfig();
if (
!config.OCO_OPENAI_API_KEY &&
!config.OCO_ANTHROPIC_API_KEY &&
!config.OCO_AZURE_API_KEY
) {
throw new Error(
'No OPEN_AI_API or OCO_ANTHROPIC_API_KEY or OCO_AZURE_API_KEY exists. Set your key in ~/.opencommit'
if (!config.OCO_API_KEY) {
outro(
'No OCO_API_KEY is set. Set your key via `oco config set OCO_API_KEY=<value>`. For more info see https://github.com/di-sukharev/opencommit'
);
return;
}
const spin = spinner();

View File

@@ -3,6 +3,7 @@ import { OpenAIClient as AzureOpenAIClient } from '@azure/openai';
import { GoogleGenerativeAI as GeminiClient } from '@google/generative-ai';
import { AxiosInstance as RawAxiosClient } from 'axios';
import { OpenAI as OpenAIClient } from 'openai';
import { Mistral as MistralClient } from '@mistralai/mistralai';
export interface AiEngineConfig {
apiKey: string;
@@ -17,7 +18,8 @@ type Client =
| AzureOpenAIClient
| AnthropicClient
| RawAxiosClient
| GeminiClient;
| GeminiClient
| MistralClient;
export interface AiEngine {
config: AiEngineConfig;

View File

@@ -4,7 +4,7 @@ import { AiEngine, AiEngineConfig } from './Engine';
interface FlowiseAiConfig extends AiEngineConfig {}
export class FlowiseAi implements AiEngine {
export class FlowiseEngine implements AiEngine {
config: FlowiseAiConfig;
client: AxiosInstance;

View File

@@ -11,7 +11,7 @@ import { AiEngine, AiEngineConfig } from './Engine';
interface GeminiConfig extends AiEngineConfig {}
export class Gemini implements AiEngine {
export class GeminiEngine implements AiEngine {
config: GeminiConfig;
client: GoogleGenerativeAI;

src/engine/groq.ts Normal file
View File

@@ -0,0 +1,10 @@
import { OpenAiConfig, OpenAiEngine } from './openAi';
interface GroqConfig extends OpenAiConfig {}
export class GroqEngine extends OpenAiEngine {
constructor(config: GroqConfig) {
config.baseURL = 'https://api.groq.com/openai/v1';
super(config);
}
}
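
GroqEngine works by pinning baseURL to Groq's OpenAI-compatible endpoint and deferring everything else to OpenAiEngine. A hypothetical sketch of the same pattern for some other OpenAI-compatible provider; the class name and URL below are invented purely for illustration:

```
import { OpenAiConfig, OpenAiEngine } from './src/engine/openAi';

interface ExampleProviderConfig extends OpenAiConfig {}

// Invented provider: pin the endpoint, reuse the OpenAI request logic as-is.
export class ExampleProviderEngine extends OpenAiEngine {
  constructor(config: ExampleProviderConfig) {
    config.baseURL = 'https://api.example-provider.com/openai/v1';
    super(config);
  }
}
```
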

src/engine/mistral.ts Normal file
View File

@@ -0,0 +1,82 @@
import axios from 'axios';
import { Mistral } from '@mistralai/mistralai';
import { OpenAI } from 'openai';
import { GenerateCommitMessageErrorEnum } from '../generateCommitMessageFromGitDiff';
import { tokenCount } from '../utils/tokenCount';
import { AiEngine, AiEngineConfig } from './Engine';
import {
AssistantMessage as MistralAssistantMessage,
SystemMessage as MistralSystemMessage,
ToolMessage as MistralToolMessage,
UserMessage as MistralUserMessage
} from '@mistralai/mistralai/models/components';
export interface MistralAiConfig extends AiEngineConfig {}
export type MistralCompletionMessageParam = Array<
| (MistralSystemMessage & { role: "system" })
| (MistralUserMessage & { role: "user" })
| (MistralAssistantMessage & { role: "assistant" })
| (MistralToolMessage & { role: "tool" })
>
export class MistralAiEngine implements AiEngine {
config: MistralAiConfig;
client: Mistral;
constructor(config: MistralAiConfig) {
this.config = config;
if (!config.baseURL) {
this.client = new Mistral({ apiKey: config.apiKey });
} else {
this.client = new Mistral({ apiKey: config.apiKey, serverURL: config.baseURL });
}
}
public generateCommitMessage = async (
messages: Array<OpenAI.Chat.Completions.ChatCompletionMessageParam>
): Promise<string | null> => {
const params = {
model: this.config.model,
messages: messages as MistralCompletionMessageParam,
topP: 0.1,
maxTokens: this.config.maxTokensOutput
};
try {
const REQUEST_TOKENS = messages
.map((msg) => tokenCount(msg.content as string) + 4)
.reduce((a, b) => a + b, 0);
if (
REQUEST_TOKENS >
this.config.maxTokensInput - this.config.maxTokensOutput
)
throw new Error(GenerateCommitMessageErrorEnum.tooMuchTokens);
const completion = await this.client.chat.complete(params);
if (!completion.choices)
throw Error('No completion choice available.')
const message = completion.choices[0].message;
if (!message || !message.content)
throw Error('No completion choice available.')
return message.content as string;
} catch (error) {
const err = error as Error;
if (
axios.isAxiosError<{ error?: { message: string } }>(error) &&
error.response?.status === 401
) {
const mistralError = error.response.data.error;
if (mistralError) throw new Error(mistralError.message);
}
throw err;
}
};
}
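
A hedged sketch of constructing the Mistral engine directly (in the CLI it is built by getEngine() from the merged config, shown further below). The model id, token limits and import path are placeholders, and top-level await assumes an ESM module:

```
import { MistralAiEngine } from './src/engine/mistral';

const engine = new MistralAiEngine({
  apiKey: process.env.OCO_API_KEY ?? '',
  model: 'mistral-small-latest', // placeholder model id
  maxTokensInput: 4096,          // placeholder token limits
  maxTokensOutput: 500,
  baseURL: process.env.OCO_API_URL ?? '' // empty: the client uses Mistral's default server
});

const message = await engine.generateCommitMessage([
  { role: 'system', content: 'Generate a conventional commit message.' },
  { role: 'user', content: 'diff --git a/index.ts b/index.ts ...' }
]);
```
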

src/engine/mlx.ts Normal file
View File

@@ -0,0 +1,47 @@
import axios, { AxiosInstance } from 'axios';
import { OpenAI } from 'openai';
import { AiEngine, AiEngineConfig } from './Engine';
interface MLXConfig extends AiEngineConfig {}
export class MLXEngine implements AiEngine {
config: MLXConfig;
client: AxiosInstance;
constructor(config: MLXConfig) {
this.config = config;
this.client = axios.create({
url: config.baseURL
? `${config.baseURL}/${config.apiKey}`
: 'http://localhost:8080/v1/chat/completions',
headers: { 'Content-Type': 'application/json' }
});
}
async generateCommitMessage(
messages: Array<OpenAI.Chat.Completions.ChatCompletionMessageParam>):
Promise<string | undefined> {
const params = {
messages,
temperature: 0,
top_p: 0.1,
repetition_penalty: 1.5,
stream: false
};
try {
const response = await this.client.post(
this.client.getUri(this.config),
params
);
const choices = response.data.choices;
const message = choices[0].message;
return message?.content;
} catch (err: any) {
const message = err.response?.data?.error ?? err.message;
throw new Error(`MLX provider error: ${message}`);
}
}
}
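
Since MLXEngine defaults to http://localhost:8080/v1/chat/completions, pointing the CLI at a locally running MLX server is only a matter of provider and URL config. A small sketch, assuming the setConfig export shown earlier; the URL simply restates the default baked into the engine above:

```
import { CONFIG_KEYS, setConfig } from './src/commands/config';

// Select the MLX provider and point it at the local server.
setConfig([
  [CONFIG_KEYS.OCO_AI_PROVIDER, 'mlx'],
  [CONFIG_KEYS.OCO_API_URL, 'http://localhost:8080/v1/chat/completions']
]);
```
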

View File

@@ -4,7 +4,7 @@ import { AiEngine, AiEngineConfig } from './Engine';
interface OllamaConfig extends AiEngineConfig {}
export class OllamaAi implements AiEngine {
export class OllamaEngine implements AiEngine {
config: OllamaConfig;
client: AxiosInstance;

View File

@@ -4,7 +4,7 @@ import { GenerateCommitMessageErrorEnum } from '../generateCommitMessageFromGitD
import { tokenCount } from '../utils/tokenCount';
import { AiEngine, AiEngineConfig } from './Engine';
interface OpenAiConfig extends AiEngineConfig {}
export interface OpenAiConfig extends AiEngineConfig {}
export class OpenAiEngine implements AiEngine {
config: OpenAiConfig;
@@ -12,7 +12,12 @@ export class OpenAiEngine implements AiEngine {
constructor(config: OpenAiConfig) {
this.config = config;
this.client = new OpenAI({ apiKey: config.apiKey });
if (!config.baseURL) {
this.client = new OpenAI({ apiKey: config.apiKey });
} else {
this.client = new OpenAI({ apiKey: config.apiKey, baseURL: config.baseURL });
}
}
public generateCommitMessage = async (

View File

@@ -6,17 +6,15 @@ import { mergeDiffs } from './utils/mergeDiffs';
import { tokenCount } from './utils/tokenCount';
const config = getConfig();
const MAX_TOKENS_INPUT =
config.OCO_TOKENS_MAX_INPUT || DEFAULT_TOKEN_LIMITS.DEFAULT_MAX_TOKENS_INPUT;
const MAX_TOKENS_OUTPUT =
config.OCO_TOKENS_MAX_OUTPUT ||
DEFAULT_TOKEN_LIMITS.DEFAULT_MAX_TOKENS_OUTPUT;
const MAX_TOKENS_INPUT = config.OCO_TOKENS_MAX_INPUT;
const MAX_TOKENS_OUTPUT = config.OCO_TOKENS_MAX_OUTPUT;
const generateCommitMessageChatCompletionPrompt = async (
diff: string,
fullGitMojiSpec: boolean
fullGitMojiSpec: boolean,
context: string
): Promise<Array<OpenAI.Chat.Completions.ChatCompletionMessageParam>> => {
const INIT_MESSAGES_PROMPT = await getMainCommitPrompt(fullGitMojiSpec);
const INIT_MESSAGES_PROMPT = await getMainCommitPrompt(fullGitMojiSpec, context);
const chatContextAsCompletionRequest = [...INIT_MESSAGES_PROMPT];
@@ -39,10 +37,14 @@ const ADJUSTMENT_FACTOR = 20;
export const generateCommitMessageByDiff = async (
diff: string,
fullGitMojiSpec: boolean = false
fullGitMojiSpec: boolean = false,
context: string = ""
): Promise<string> => {
try {
const INIT_MESSAGES_PROMPT = await getMainCommitPrompt(fullGitMojiSpec);
const INIT_MESSAGES_PROMPT = await getMainCommitPrompt(
fullGitMojiSpec,
context
);
const INIT_MESSAGES_PROMPT_LENGTH = INIT_MESSAGES_PROMPT.map(
(msg) => tokenCount(msg.content as string) + 4
@@ -72,7 +74,8 @@ export const generateCommitMessageByDiff = async (
const messages = await generateCommitMessageChatCompletionPrompt(
diff,
fullGitMojiSpec
fullGitMojiSpec,
context,
);
const engine = getEngine();
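
The new third parameter threads user-supplied context all the way into the system prompt (see prompts.ts below). A minimal sketch of calling it directly, assuming an ESM module where top-level await is available:

```
import { generateCommitMessageByDiff } from './src/generateCommitMessageFromGitDiff';

const diff = `diff --git a/index.ts b/index.ts
--- a/index.ts
+++ b/index.ts
...`;

const commitMessage = await generateCommitMessageByDiff(
  diff,
  false,                         // fullGitMojiSpec
  'part of the config refactor'  // context, as passed on the CLI via `oco -- <context>`
);
console.log(commitMessage);
```
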

View File

@@ -0,0 +1,45 @@
import {
CONFIG_KEYS,
getConfig,
OCO_AI_PROVIDER_ENUM,
setConfig
} from '../commands/config';
export default function () {
const config = getConfig({ setDefaultValues: false });
const aiProvider = config.OCO_AI_PROVIDER;
let apiKey: string | undefined;
let apiUrl: string | undefined;
if (aiProvider === OCO_AI_PROVIDER_ENUM.OLLAMA) {
apiKey = config['OCO_OLLAMA_API_KEY'];
apiUrl = config['OCO_OLLAMA_API_URL'];
} else if (aiProvider === OCO_AI_PROVIDER_ENUM.ANTHROPIC) {
apiKey = config['OCO_ANTHROPIC_API_KEY'];
apiUrl = config['OCO_ANTHROPIC_BASE_PATH'];
} else if (aiProvider === OCO_AI_PROVIDER_ENUM.OPENAI) {
apiKey = config['OCO_OPENAI_API_KEY'];
apiUrl = config['OCO_OPENAI_BASE_PATH'];
} else if (aiProvider === OCO_AI_PROVIDER_ENUM.AZURE) {
apiKey = config['OCO_AZURE_API_KEY'];
apiUrl = config['OCO_AZURE_ENDPOINT'];
} else if (aiProvider === OCO_AI_PROVIDER_ENUM.GEMINI) {
apiKey = config['OCO_GEMINI_API_KEY'];
apiUrl = config['OCO_GEMINI_BASE_PATH'];
} else if (aiProvider === OCO_AI_PROVIDER_ENUM.FLOWISE) {
apiKey = config['OCO_FLOWISE_API_KEY'];
apiUrl = config['OCO_FLOWISE_ENDPOINT'];
} else {
throw new Error(
`Migration failed, set AI provider first. Run "oco config set OCO_AI_PROVIDER=<provider>", where <provider> is one of: ${Object.values(
OCO_AI_PROVIDER_ENUM
).join(', ')}`
);
}
if (apiKey) setConfig([[CONFIG_KEYS.OCO_API_KEY, apiKey]]);
if (apiUrl) setConfig([[CONFIG_KEYS.OCO_API_URL, apiUrl]]);
}

View File

@@ -0,0 +1,26 @@
import { getGlobalConfig, setGlobalConfig } from '../commands/config';
export default function () {
const obsoleteKeys = [
'OCO_OLLAMA_API_KEY',
'OCO_OLLAMA_API_URL',
'OCO_ANTHROPIC_API_KEY',
'OCO_ANTHROPIC_BASE_PATH',
'OCO_OPENAI_API_KEY',
'OCO_OPENAI_BASE_PATH',
'OCO_AZURE_API_KEY',
'OCO_AZURE_ENDPOINT',
'OCO_GEMINI_API_KEY',
'OCO_GEMINI_BASE_PATH',
'OCO_FLOWISE_API_KEY',
'OCO_FLOWISE_ENDPOINT'
];
const globalConfig = getGlobalConfig();
const configToOverride = { ...globalConfig };
for (const key of obsoleteKeys) delete configToOverride[key];
setGlobalConfig(configToOverride);
}

View File

@@ -0,0 +1,22 @@
import {
ConfigType,
DEFAULT_CONFIG,
getGlobalConfig,
setConfig
} from '../commands/config';
export default function () {
const setDefaultConfigValues = (config: ConfigType) => {
const entriesToSet: [key: string, value: string | boolean | number][] = [];
for (const entry of Object.entries(DEFAULT_CONFIG)) {
const [key, _value] = entry;
if (config[key] === 'undefined' || config[key] === undefined)
entriesToSet.push(entry);
}
if (entriesToSet.length > 0) setConfig(entriesToSet);
console.log(entriesToSet);
};
setDefaultConfigValues(getGlobalConfig());
}

View File

@@ -0,0 +1,18 @@
import migration00 from './00_use_single_api_key_and_url';
import migration01 from './01_remove_obsolete_config_keys_from_global_file';
import migration02 from './02_set_missing_default_values';
export const migrations = [
{
name: '00_use_single_api_key_and_url',
run: migration00
},
{
name: '01_remove_obsolete_config_keys_from_global_file',
run: migration01
},
{
name: '02_set_missing_default_values',
run: migration02
}
];
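
Each entry in the array above is a named, idempotent function; the name doubles as the idempotency key recorded by the runner in _run.ts below. A sketch of what a single entry looks like, with an invented name used only for illustration:

```
import type { migrations } from './src/migrations/_migrations';

// Shape of one registry entry, matching the array above.
const exampleEntry: (typeof migrations)[number] = {
  name: '99_example_noop', // invented, illustration only
  run: () => {
    /* nothing to migrate */
  }
};
```
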

src/migrations/_run.ts Normal file
View File

@@ -0,0 +1,71 @@
import fs from 'fs';
import { homedir } from 'os';
import { join as pathJoin } from 'path';
import { migrations } from './_migrations';
import { outro } from '@clack/prompts';
import chalk from 'chalk';
import {
getConfig,
getIsGlobalConfigFileExist,
OCO_AI_PROVIDER_ENUM
} from '../commands/config';
const migrationsFile = pathJoin(homedir(), '.opencommit_migrations');
const getCompletedMigrations = (): string[] => {
if (!fs.existsSync(migrationsFile)) {
return [];
}
const data = fs.readFileSync(migrationsFile, 'utf-8');
return data ? JSON.parse(data) : [];
};
const saveCompletedMigration = (migrationName: string) => {
const completedMigrations = getCompletedMigrations();
completedMigrations.push(migrationName);
fs.writeFileSync(
migrationsFile,
JSON.stringify(completedMigrations, null, 2)
);
};
export const runMigrations = async () => {
// if no config file, we assume it's a new installation and no migrations are needed
if (!getIsGlobalConfigFileExist()) return;
const config = getConfig();
if (config.OCO_AI_PROVIDER === OCO_AI_PROVIDER_ENUM.TEST) return;
const completedMigrations = getCompletedMigrations();
let isMigrated = false;
for (const migration of migrations) {
if (!completedMigrations.includes(migration.name)) {
try {
console.log('Applying migration', migration.name);
migration.run();
console.log('Migration applied successfully', migration.name);
saveCompletedMigration(migration.name);
} catch (error) {
outro(
`${chalk.red('Failed to apply migration')} ${
migration.name
}: ${error}`
);
process.exit(1);
}
isMigrated = true;
}
}
if (isMigrated) {
outro(
`${chalk.green(
'✔'
)} Migrations to your config were applied successfully. Please rerun.`
);
process.exit(0);
}
};
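
runMigrations is meant to run before the normal CLI flow: it skips fresh installs (no global config file), skips the test provider, applies each pending migration exactly once, and exits with a "please rerun" notice if anything changed. A sketch of wiring it into an entry point; the surrounding cli.ts-style bootstrapping is assumed, not shown in this diff:

```
import { runMigrations } from './src/migrations/_run';

const main = async () => {
  // May exit the process after applying migrations, asking the user to rerun.
  await runMigrations();
  // ...continue with the normal commit-message flow...
};

main();
```
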

View File

@@ -111,9 +111,24 @@ const getOneLineCommitInstruction = () =>
? 'Craft a concise commit message that encapsulates all changes made, with an emphasis on the primary updates. If the modifications share a common theme or scope, mention it succinctly; otherwise, leave the scope out to maintain focus. The goal is to provide a clear and unified overview of the changes in a one single message, without diverging into a list of commit per file change.'
: '';
/**
* Get the context of the user input
* @param extraArgs - The arguments passed to the command line
* @example
* $ oco -- This is a context used to generate the commit message
* @returns - The context of the user input
*/
const userInputCodeContext = (context: string) => {
if (context !== '' && context !== ' ') {
return `Additional context provided by the user: <context>${context}</context>\nConsider this context when generating the commit message, incorporating relevant information when appropriate.`;
}
return '';
};
const INIT_MAIN_PROMPT = (
language: string,
fullGitMojiSpec: boolean
fullGitMojiSpec: boolean,
context: string
): OpenAI.Chat.Completions.ChatCompletionMessageParam => ({
role: 'system',
content: (() => {
@@ -127,15 +142,16 @@ const INIT_MAIN_PROMPT = (
const descriptionGuideline = getDescriptionInstruction();
const oneLineCommitGuideline = getOneLineCommitInstruction();
const generalGuidelines = `Use the present tense. Lines must not be longer than 74 characters. Use ${language} for the commit message.`;
const userInputContext = userInputCodeContext(context);
return `${missionStatement}\n${diffInstruction}\n${conventionGuidelines}\n${descriptionGuideline}\n${oneLineCommitGuideline}\n${generalGuidelines}`;
return `${missionStatement}\n${diffInstruction}\n${conventionGuidelines}\n${descriptionGuideline}\n${oneLineCommitGuideline}\n${generalGuidelines}\n${userInputContext}`;
})()
});
export const INIT_DIFF_PROMPT: OpenAI.Chat.Completions.ChatCompletionMessageParam =
{
role: 'user',
content: `diff --git a/src/server.ts b/src/server.ts
{
role: 'user',
content: `diff --git a/src/server.ts b/src/server.ts
index ad4db42..f3b18a9 100644
--- a/src/server.ts
+++ b/src/server.ts
@@ -159,7 +175,7 @@ export const INIT_DIFF_PROMPT: OpenAI.Chat.Completions.ChatCompletionMessagePara
+app.listen(process.env.PORT || PORT, () => {
+ console.log(\`Server listening on port \${PORT}\`);
});`
};
};
const getContent = (translation: ConsistencyPrompt) => {
const fix = config.OCO_EMOJI
@@ -185,7 +201,8 @@ const INIT_CONSISTENCY_PROMPT = (
});
export const getMainCommitPrompt = async (
fullGitMojiSpec: boolean
fullGitMojiSpec: boolean,
context: string
): Promise<Array<OpenAI.Chat.Completions.ChatCompletionMessageParam>> => {
switch (config.OCO_PROMPT_MODULE) {
case '@commitlint':
@@ -207,14 +224,14 @@ export const getMainCommitPrompt = async (
INIT_DIFF_PROMPT,
INIT_CONSISTENCY_PROMPT(
commitLintConfig.consistency[
translation.localLanguage
translation.localLanguage
] as ConsistencyPrompt
)
];
default:
return [
INIT_MAIN_PROMPT(translation.localLanguage, fullGitMojiSpec),
INIT_MAIN_PROMPT(translation.localLanguage, fullGitMojiSpec, context),
INIT_DIFF_PROMPT,
INIT_CONSISTENCY_PROMPT(translation)
];

View File

@@ -2,11 +2,14 @@ import { getConfig, OCO_AI_PROVIDER_ENUM } from '../commands/config';
import { AnthropicEngine } from '../engine/anthropic';
import { AzureEngine } from '../engine/azure';
import { AiEngine } from '../engine/Engine';
import { FlowiseAi } from '../engine/flowise';
import { Gemini } from '../engine/gemini';
import { OllamaAi } from '../engine/ollama';
import { FlowiseEngine } from '../engine/flowise';
import { GeminiEngine } from '../engine/gemini';
import { OllamaEngine } from '../engine/ollama';
import { OpenAiEngine } from '../engine/openAi';
import { MistralAiEngine } from '../engine/mistral';
import { TestAi, TestMockType } from '../engine/testAi';
import { GroqEngine } from '../engine/groq';
import { MLXEngine } from '../engine/mlx';
export function getEngine(): AiEngine {
const config = getConfig();
@@ -16,50 +19,39 @@ export function getEngine(): AiEngine {
model: config.OCO_MODEL!,
maxTokensOutput: config.OCO_TOKENS_MAX_OUTPUT!,
maxTokensInput: config.OCO_TOKENS_MAX_INPUT!,
baseURL: config.OCO_OPENAI_BASE_PATH!
baseURL: config.OCO_API_URL!,
apiKey: config.OCO_API_KEY!
};
switch (provider) {
case OCO_AI_PROVIDER_ENUM.OLLAMA:
return new OllamaAi({
...DEFAULT_CONFIG,
apiKey: '',
baseURL: config.OCO_OLLAMA_API_URL!
});
return new OllamaEngine(DEFAULT_CONFIG);
case OCO_AI_PROVIDER_ENUM.ANTHROPIC:
return new AnthropicEngine({
...DEFAULT_CONFIG,
apiKey: config.OCO_ANTHROPIC_API_KEY!
});
return new AnthropicEngine(DEFAULT_CONFIG);
case OCO_AI_PROVIDER_ENUM.TEST:
return new TestAi(config.OCO_TEST_MOCK_TYPE as TestMockType);
case OCO_AI_PROVIDER_ENUM.GEMINI:
return new Gemini({
...DEFAULT_CONFIG,
apiKey: config.OCO_GEMINI_API_KEY!,
baseURL: config.OCO_GEMINI_BASE_PATH!
});
return new GeminiEngine(DEFAULT_CONFIG);
case OCO_AI_PROVIDER_ENUM.AZURE:
return new AzureEngine({
...DEFAULT_CONFIG,
apiKey: config.OCO_AZURE_API_KEY!
});
return new AzureEngine(DEFAULT_CONFIG);
case OCO_AI_PROVIDER_ENUM.FLOWISE:
return new FlowiseAi({
...DEFAULT_CONFIG,
baseURL: config.OCO_FLOWISE_ENDPOINT || DEFAULT_CONFIG.baseURL,
apiKey: config.OCO_FLOWISE_API_KEY!
});
return new FlowiseEngine(DEFAULT_CONFIG);
case OCO_AI_PROVIDER_ENUM.GROQ:
return new GroqEngine(DEFAULT_CONFIG);
case OCO_AI_PROVIDER_ENUM.MISTRAL:
return new MistralAiEngine(DEFAULT_CONFIG);
case OCO_AI_PROVIDER_ENUM.MLX:
return new MLXEngine(DEFAULT_CONFIG);
default:
return new OpenAiEngine({
...DEFAULT_CONFIG,
apiKey: config.OCO_OPENAI_API_KEY!
});
return new OpenAiEngine(DEFAULT_CONFIG);
}
}
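
After this refactor every provider branch receives the same DEFAULT_CONFIG derived from OCO_API_KEY, OCO_API_URL, OCO_MODEL and the token limits. A minimal consumer sketch, assuming the exports shown above and an ESM module with top-level await:

```
import { getEngine } from './src/utils/engine';

const engine = getEngine(); // provider picked from OCO_AI_PROVIDER

const commitMessage = await engine.generateCommitMessage([
  { role: 'user', content: 'diff --git a/index.ts b/index.ts ...' }
]);
console.log(commitMessage);
```
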

test/e2e/gitPush.test.ts Normal file
View File

@@ -0,0 +1,205 @@
import path from 'path';
import 'cli-testing-library/extend-expect';
import { exec } from 'child_process';
import { prepareTempDir } from './utils';
import { promisify } from 'util';
import { render } from 'cli-testing-library';
import { resolve } from 'path';
import { rm } from 'fs';
const fsExec = promisify(exec);
const fsRemove = promisify(rm);
/**
* git remote -v
*
* [no remotes]
*/
const prepareNoRemoteGitRepository = async (): Promise<{
gitDir: string;
cleanup: () => Promise<void>;
}> => {
const tempDir = await prepareTempDir();
await fsExec('git init test', { cwd: tempDir });
const gitDir = path.resolve(tempDir, 'test');
const cleanup = async () => {
return fsRemove(tempDir, { recursive: true });
};
return {
gitDir,
cleanup
};
};
/**
* git remote -v
*
* origin /tmp/remote.git (fetch)
* origin /tmp/remote.git (push)
*/
const prepareOneRemoteGitRepository = async (): Promise<{
gitDir: string;
cleanup: () => Promise<void>;
}> => {
const tempDir = await prepareTempDir();
await fsExec('git init --bare remote.git', { cwd: tempDir });
await fsExec('git clone remote.git test', { cwd: tempDir });
const gitDir = path.resolve(tempDir, 'test');
const cleanup = async () => {
return fsRemove(tempDir, { recursive: true });
};
return {
gitDir,
cleanup
};
};
/**
* git remote -v
*
* origin /tmp/remote.git (fetch)
* origin /tmp/remote.git (push)
* other ../remote2.git (fetch)
* other ../remote2.git (push)
*/
const prepareTwoRemotesGitRepository = async (): Promise<{
gitDir: string;
cleanup: () => Promise<void>;
}> => {
const tempDir = await prepareTempDir();
await fsExec('git init --bare remote.git', { cwd: tempDir });
await fsExec('git init --bare other.git', { cwd: tempDir });
await fsExec('git clone remote.git test', { cwd: tempDir });
const gitDir = path.resolve(tempDir, 'test');
await fsExec('git remote add other ../other.git', { cwd: gitDir });
const cleanup = async () => {
return fsRemove(tempDir, { recursive: true });
};
return {
gitDir,
cleanup
};
};
describe('cli flow to push git branch', () => {
it('do nothing when OCO_GITPUSH is set to false', async () => {
const { gitDir, cleanup } = await prepareNoRemoteGitRepository();
await render('echo', [`'console.log("Hello World");' > index.ts`], {
cwd: gitDir
});
await render('git', ['add index.ts'], { cwd: gitDir });
const { queryByText, findByText, userEvent } = await render(
`OCO_AI_PROVIDER='test' OCO_GITPUSH='false' node`,
[resolve('./out/cli.cjs')],
{ cwd: gitDir }
);
expect(await findByText('Confirm the commit message?')).toBeInTheConsole();
userEvent.keyboard('[Enter]');
expect(
await queryByText('Choose a remote to push to')
).not.toBeInTheConsole();
expect(
await queryByText('Do you want to run `git push`?')
).not.toBeInTheConsole();
expect(
await queryByText('Successfully pushed all commits to origin')
).not.toBeInTheConsole();
expect(
await queryByText('Command failed with exit code 1')
).not.toBeInTheConsole();
await cleanup();
});
it('push and cause error when there is no remote', async () => {
const { gitDir, cleanup } = await prepareNoRemoteGitRepository();
await render('echo', [`'console.log("Hello World");' > index.ts`], {
cwd: gitDir
});
await render('git', ['add index.ts'], { cwd: gitDir });
const { queryByText, findByText, userEvent } = await render(
`OCO_AI_PROVIDER='test' node`,
[resolve('./out/cli.cjs')],
{ cwd: gitDir }
);
expect(await findByText('Confirm the commit message?')).toBeInTheConsole();
userEvent.keyboard('[Enter]');
expect(
await queryByText('Choose a remote to push to')
).not.toBeInTheConsole();
expect(
await queryByText('Do you want to run `git push`?')
).not.toBeInTheConsole();
expect(
await queryByText('Successfully pushed all commits to origin')
).not.toBeInTheConsole();
expect(
await findByText('Command failed with exit code 1')
).toBeInTheConsole();
await cleanup();
});
it('push when one remote is set', async () => {
const { gitDir, cleanup } = await prepareOneRemoteGitRepository();
await render('echo', [`'console.log("Hello World");' > index.ts`], {
cwd: gitDir
});
await render('git', ['add index.ts'], { cwd: gitDir });
const { findByText, userEvent } = await render(
`OCO_AI_PROVIDER='test' node`,
[resolve('./out/cli.cjs')],
{ cwd: gitDir }
);
expect(await findByText('Confirm the commit message?')).toBeInTheConsole();
userEvent.keyboard('[Enter]');
expect(
await findByText('Do you want to run `git push`?')
).toBeInTheConsole();
userEvent.keyboard('[Enter]');
expect(
await findByText('Successfully pushed all commits to origin')
).toBeInTheConsole();
await cleanup();
});
it('push when two remotes are set', async () => {
const { gitDir, cleanup } = await prepareTwoRemotesGitRepository();
await render('echo', [`'console.log("Hello World");' > index.ts`], {
cwd: gitDir
});
await render('git', ['add index.ts'], { cwd: gitDir });
const { findByText, userEvent } = await render(
`OCO_AI_PROVIDER='test' node`,
[resolve('./out/cli.cjs')],
{ cwd: gitDir }
);
expect(await findByText('Confirm the commit message?')).toBeInTheConsole();
userEvent.keyboard('[Enter]');
expect(await findByText('Choose a remote to push to')).toBeInTheConsole();
userEvent.keyboard('[Enter]');
expect(
await findByText('Successfully pushed all commits to origin')
).toBeInTheConsole();
await cleanup();
});
});

View File

@@ -17,7 +17,7 @@ it('cli flow to generate commit message for 1 new file (staged)', async () => {
expect(await findByText('Confirm the commit message?')).toBeInTheConsole();
userEvent.keyboard('[Enter]');
expect(await findByText('Choose a remote to push to')).toBeInTheConsole();
expect(await findByText('Do you want to run `git push`?')).toBeInTheConsole();
userEvent.keyboard('[Enter]');
expect(await findByText('Successfully pushed all commits to origin')).toBeInTheConsole();
@@ -46,7 +46,7 @@ it('cli flow to generate commit message for 1 changed file (not staged)', async
expect(await findByText('Successfully committed')).toBeInTheConsole();
expect(await findByText('Choose a remote to push to')).toBeInTheConsole();
expect(await findByText('Do you want to run `git push`?')).toBeInTheConsole();
userEvent.keyboard('[Enter]');
expect(await findByText('Successfully pushed all commits to origin')).toBeInTheConsole();

View File

@@ -209,7 +209,7 @@ describe('cli flow to generate commit message using @commitlint prompt-module',
oco.userEvent.keyboard('[Enter]');
expect(
await oco.findByText('Choose a remote to push to')
await oco.findByText('Do you want to run `git push`?')
).toBeInTheConsole();
oco.userEvent.keyboard('[Enter]');

View File

@@ -15,7 +15,7 @@ export const prepareEnvironment = async (): Promise<{
gitDir: string;
cleanup: () => Promise<void>;
}> => {
const tempDir = await fsMakeTempDir(path.join(tmpdir(), 'opencommit-test-'));
const tempDir = await prepareTempDir();
// Create a remote git repository in the temp directory. This is necessary to execute the `git push` command
await fsExec('git init --bare remote.git', { cwd: tempDir });
await fsExec('git clone remote.git test', { cwd: tempDir });
@@ -30,4 +30,8 @@ export const prepareEnvironment = async (): Promise<{
}
}
export const prepareTempDir = async(): Promise<string> => {
return await fsMakeTempDir(path.join(tmpdir(), 'opencommit-test-'));
}
export const wait = (ms: number) => new Promise(resolve => setTimeout(resolve, ms));

View File

@@ -1,5 +1,6 @@
import { existsSync, readFileSync, rmSync } from 'fs';
import {
CONFIG_KEYS,
DEFAULT_CONFIG,
getConfig,
setConfig
@@ -50,14 +51,13 @@ describe('config', () => {
describe('getConfig', () => {
it('should prioritize local .env over global .opencommit config', async () => {
globalConfigFile = await generateConfig('.opencommit', {
OCO_OPENAI_API_KEY: 'global-key',
OCO_API_KEY: 'global-key',
OCO_MODEL: 'gpt-3.5-turbo',
OCO_LANGUAGE: 'en'
});
envConfigFile = await generateConfig('.env', {
OCO_OPENAI_API_KEY: 'local-key',
OCO_ANTHROPIC_API_KEY: 'local-anthropic-key',
OCO_API_KEY: 'local-key',
OCO_LANGUAGE: 'fr'
});
@@ -67,22 +67,21 @@ describe('config', () => {
});
expect(config).not.toEqual(null);
expect(config.OCO_OPENAI_API_KEY).toEqual('local-key');
expect(config.OCO_API_KEY).toEqual('local-key');
expect(config.OCO_MODEL).toEqual('gpt-3.5-turbo');
expect(config.OCO_LANGUAGE).toEqual('fr');
expect(config.OCO_ANTHROPIC_API_KEY).toEqual('local-anthropic-key');
});
it('should fallback to global config when local config is not set', async () => {
globalConfigFile = await generateConfig('.opencommit', {
OCO_OPENAI_API_KEY: 'global-key',
OCO_API_KEY: 'global-key',
OCO_MODEL: 'gpt-4',
OCO_LANGUAGE: 'de',
OCO_DESCRIPTION: 'true'
});
envConfigFile = await generateConfig('.env', {
OCO_ANTHROPIC_API_KEY: 'local-anthropic-key'
OCO_API_URL: 'local-api-url'
});
const config = getConfig({
@@ -91,8 +90,8 @@ describe('config', () => {
});
expect(config).not.toEqual(null);
expect(config.OCO_OPENAI_API_KEY).toEqual('global-key');
expect(config.OCO_ANTHROPIC_API_KEY).toEqual('local-anthropic-key');
expect(config.OCO_API_KEY).toEqual('global-key');
expect(config.OCO_API_URL).toEqual('local-api-url');
expect(config.OCO_MODEL).toEqual('gpt-4');
expect(config.OCO_LANGUAGE).toEqual('de');
expect(config.OCO_DESCRIPTION).toEqual(true);
@@ -124,7 +123,7 @@ describe('config', () => {
it('should handle empty local config correctly', async () => {
globalConfigFile = await generateConfig('.opencommit', {
OCO_OPENAI_API_KEY: 'global-key',
OCO_API_KEY: 'global-key',
OCO_MODEL: 'gpt-4',
OCO_LANGUAGE: 'es'
});
@@ -137,20 +136,20 @@ describe('config', () => {
});
expect(config).not.toEqual(null);
expect(config.OCO_OPENAI_API_KEY).toEqual('global-key');
expect(config.OCO_API_KEY).toEqual('global-key');
expect(config.OCO_MODEL).toEqual('gpt-4');
expect(config.OCO_LANGUAGE).toEqual('es');
});
it('should override global config with null values in local .env', async () => {
globalConfigFile = await generateConfig('.opencommit', {
OCO_OPENAI_API_KEY: 'global-key',
OCO_API_KEY: 'global-key',
OCO_MODEL: 'gpt-4',
OCO_LANGUAGE: 'es'
});
envConfigFile = await generateConfig('.env', {
OCO_OPENAI_API_KEY: 'null'
OCO_API_KEY: 'null'
});
const config = getConfig({
@@ -159,7 +158,7 @@ describe('config', () => {
});
expect(config).not.toEqual(null);
expect(config.OCO_OPENAI_API_KEY).toEqual(null);
expect(config.OCO_API_KEY).toEqual(null);
});
it('should handle empty global config', async () => {
@@ -172,7 +171,7 @@ describe('config', () => {
});
expect(config).not.toEqual(null);
expect(config.OCO_OPENAI_API_KEY).toEqual(undefined);
expect(config.OCO_API_KEY).toEqual(undefined);
});
});
@@ -188,12 +187,12 @@ describe('config', () => {
expect(isGlobalConfigFileExist).toBe(false);
await setConfig(
[['OCO_OPENAI_API_KEY', 'persisted-key_1']],
[[CONFIG_KEYS.OCO_API_KEY, 'persisted-key_1']],
globalConfigFile.filePath
);
const fileContent = readFileSync(globalConfigFile.filePath, 'utf8');
expect(fileContent).toContain('OCO_OPENAI_API_KEY=persisted-key_1');
expect(fileContent).toContain('OCO_API_KEY=persisted-key_1');
Object.entries(DEFAULT_CONFIG).forEach(([key, value]) => {
expect(fileContent).toContain(`${key}=${value}`);
});
@@ -203,42 +202,48 @@ describe('config', () => {
globalConfigFile = await generateConfig('.opencommit', {});
await setConfig(
[
['OCO_OPENAI_API_KEY', 'new-key'],
['OCO_MODEL', 'gpt-4']
[CONFIG_KEYS.OCO_API_KEY, 'new-key'],
[CONFIG_KEYS.OCO_MODEL, 'gpt-4']
],
globalConfigFile.filePath
);
const config = getConfig({ globalPath: globalConfigFile.filePath });
expect(config.OCO_OPENAI_API_KEY).toEqual('new-key');
const config = getConfig({
globalPath: globalConfigFile.filePath
});
expect(config.OCO_API_KEY).toEqual('new-key');
expect(config.OCO_MODEL).toEqual('gpt-4');
});
it('should update existing config values', async () => {
globalConfigFile = await generateConfig('.opencommit', {
OCO_OPENAI_API_KEY: 'initial-key'
OCO_API_KEY: 'initial-key'
});
await setConfig(
[['OCO_OPENAI_API_KEY', 'updated-key']],
[[CONFIG_KEYS.OCO_API_KEY, 'updated-key']],
globalConfigFile.filePath
);
const config = getConfig({ globalPath: globalConfigFile.filePath });
expect(config.OCO_OPENAI_API_KEY).toEqual('updated-key');
const config = getConfig({
globalPath: globalConfigFile.filePath
});
expect(config.OCO_API_KEY).toEqual('updated-key');
});
it('should handle boolean and numeric values correctly', async () => {
globalConfigFile = await generateConfig('.opencommit', {});
await setConfig(
[
['OCO_TOKENS_MAX_INPUT', '8192'],
['OCO_DESCRIPTION', 'true'],
['OCO_ONE_LINE_COMMIT', 'false']
[CONFIG_KEYS.OCO_TOKENS_MAX_INPUT, '8192'],
[CONFIG_KEYS.OCO_DESCRIPTION, 'true'],
[CONFIG_KEYS.OCO_ONE_LINE_COMMIT, 'false']
],
globalConfigFile.filePath
);
const config = getConfig({ globalPath: globalConfigFile.filePath });
const config = getConfig({
globalPath: globalConfigFile.filePath
});
expect(config.OCO_TOKENS_MAX_INPUT).toEqual(8192);
expect(config.OCO_DESCRIPTION).toEqual(true);
expect(config.OCO_ONE_LINE_COMMIT).toEqual(false);
@@ -266,12 +271,12 @@ describe('config', () => {
expect(isGlobalConfigFileExist).toBe(false);
await setConfig(
[['OCO_OPENAI_API_KEY', 'persisted-key']],
[[CONFIG_KEYS.OCO_API_KEY, 'persisted-key']],
globalConfigFile.filePath
);
const fileContent = readFileSync(globalConfigFile.filePath, 'utf8');
expect(fileContent).toContain('OCO_OPENAI_API_KEY=persisted-key');
expect(fileContent).toContain('OCO_API_KEY=persisted-key');
});
it('should set multiple configs in a row and keep the changes', async () => {
@@ -279,14 +284,17 @@ describe('config', () => {
expect(isGlobalConfigFileExist).toBe(false);
await setConfig(
[['OCO_OPENAI_API_KEY', 'persisted-key']],
[[CONFIG_KEYS.OCO_API_KEY, 'persisted-key']],
globalConfigFile.filePath
);
const fileContent1 = readFileSync(globalConfigFile.filePath, 'utf8');
expect(fileContent1).toContain('OCO_OPENAI_API_KEY=persisted-key');
expect(fileContent1).toContain('OCO_API_KEY=persisted-key');
await setConfig([['OCO_MODEL', 'gpt-4']], globalConfigFile.filePath);
await setConfig(
[[CONFIG_KEYS.OCO_MODEL, 'gpt-4']],
globalConfigFile.filePath
);
const fileContent2 = readFileSync(globalConfigFile.filePath, 'utf8');
expect(fileContent2).toContain('OCO_MODEL=gpt-4');

View File

@@ -1,4 +1,4 @@
import { Gemini } from '../../src/engine/gemini';
import { GeminiEngine } from '../../src/engine/gemini';
import { GenerativeModel, GoogleGenerativeAI } from '@google/generative-ai';
import {
@@ -9,7 +9,7 @@ import {
import { OpenAI } from 'openai';
describe('Gemini', () => {
let gemini: Gemini;
let gemini: GeminiEngine;
let mockConfig: ConfigType;
let mockGoogleGenerativeAi: GoogleGenerativeAI;
let mockGenerativeModel: GenerativeModel;
@@ -20,8 +20,8 @@ describe('Gemini', () => {
const mockGemini = () => {
mockConfig = getConfig() as ConfigType;
gemini = new Gemini({
apiKey: mockConfig.OCO_GEMINI_API_KEY,
gemini = new GeminiEngine({
apiKey: mockConfig.OCO_API_KEY,
model: mockConfig.OCO_MODEL
});
};
@@ -45,12 +45,10 @@ describe('Gemini', () => {
mockConfig = getConfig() as ConfigType;
mockConfig.OCO_AI_PROVIDER = OCO_AI_PROVIDER_ENUM.GEMINI;
mockConfig.OCO_GEMINI_API_KEY = 'mock-api-key';
mockConfig.OCO_API_KEY = 'mock-api-key';
mockConfig.OCO_MODEL = 'gemini-1.5-flash';
mockGoogleGenerativeAi = new GoogleGenerativeAI(
mockConfig.OCO_GEMINI_API_KEY
);
mockGoogleGenerativeAi = new GoogleGenerativeAI(mockConfig.OCO_API_KEY);
mockGenerativeModel = mockGoogleGenerativeAi.getGenerativeModel({
model: mockConfig.OCO_MODEL
});