Fabric/internal/cli/example.yaml
Kayvan Sylvan 3b9782f942 feat: add disable-responses-api flag for OpenAI compatibility
## CHANGES

- Add disable-responses-api flag to CLI completions
- Update zsh completion with new API flag
- Update bash completion options list
- Add fish shell completion for API flag
- Add testpattern to VSCode spell checker dictionary
- Configure disableResponsesAPI in example YAML config
- Enable flag for llama-server compatibility
2025-07-17 10:47:35 -07:00
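Per the commit message, the new flag exists because servers such as llama-server expose the OpenAI-compatible `/v1/chat/completions` endpoint but not the newer Responses API. As a hypothetical sketch (not Fabric's actual code), a `disableResponsesAPI` setting might gate endpoint selection like this; the function name and base URL are illustrative assumptions:

```python
# Hypothetical sketch: how a disableResponsesAPI flag could select which
# OpenAI-compatible endpoint a client calls. Not Fabric's real implementation.

def select_endpoint(base_url: str, disable_responses_api: bool) -> str:
    """Return the completion endpoint for an OpenAI-compatible server.

    llama-server and many local servers implement /v1/chat/completions
    but not the newer /v1/responses (Responses API) endpoint.
    """
    path = "/v1/chat/completions" if disable_responses_api else "/v1/responses"
    return base_url.rstrip("/") + path

# With the flag enabled, a local llama-server would be addressed via
# chat completions:
print(select_endpoint("http://localhost:8080", disable_responses_api=True))
# → http://localhost:8080/v1/chat/completions
```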


# This is an example YAML config file for Fabric.
# use fabric pattern names
pattern: ai
# or use a filename
# pattern: ~/testpattern.md
model: phi3:latest
# for models that support context length
modelContextLength: 2048
# sampling parameters
frequencypenalty: 0.5
presencepenalty: 0.5
topp: 0.67
temperature: 0.88
seed: 42
stream: true
raw: false
# suppress vendor thinking output
suppressThink: false
thinkStartTag: "<think>"
thinkEndTag: "</think>"
# OpenAI Responses API settings
# (use this for llama-server or other OpenAI-compatible local servers)
disableResponsesAPI: true
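The config above is a flat set of `key: value` pairs. As a minimal sketch of that shape (not how Fabric itself loads its config, which would use a real YAML parser), the following reads such flat entries and coerces booleans and numbers; it assumes no nesting and is for illustration only:

```python
# Minimal sketch: parse a flat "key: value" config like the example above,
# without a YAML library. Assumes no nested structures (illustration only).

def parse_flat_yaml(text: str) -> dict:
    config = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blank lines and comments
        key, _, raw = line.partition(":")  # split on the first colon only
        raw = raw.strip().strip('"')
        # Coerce booleans and numbers; leave everything else as a string.
        if raw in ("true", "false"):
            value = raw == "true"
        else:
            try:
                value = int(raw)
            except ValueError:
                try:
                    value = float(raw)
                except ValueError:
                    value = raw
        config[key.strip()] = value
    return config

example = """
pattern: ai
temperature: 0.88
seed: 42
disableResponsesAPI: true
"""
cfg = parse_flat_yaml(example)
print(cfg["disableResponsesAPI"])  # → True
```

Splitting on the first colon keeps values like `phi3:latest` intact, since `str.partition` only breaks at the first occurrence.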