OpenAI
OpenHands uses LiteLLM to make calls to OpenAI's chat models. You can find their documentation on using OpenAI as a provider here.
Configuration
When running OpenHands, you'll need to set the following in the OpenHands UI through the Settings under the LLM tab:
- `LLM Provider` to `OpenAI`
- `LLM Model` to the model you will be using. Visit here to see a full list of OpenAI models that LiteLLM supports. If the model is not in the list, enable `Advanced` options, and enter it in `Custom Model` (e.g. `openai/<model-name>` like `openai/gpt-4o`).
- `API Key` to your OpenAI API key. To find or create your OpenAI Project API Key, see here.
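If you run OpenHands outside the UI (e.g. in headless or CLI mode), the same values can be supplied in a config file. The sketch below assumes OpenHands' `config.toml` format with an `[llm]` section; the API key is a placeholder:

```toml
# Sketch of the equivalent config.toml settings (keys assumed from
# OpenHands' TOML config format; the API key value is a placeholder).
[llm]
model = "openai/gpt-4o"
api_key = "sk-..."
```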
Using OpenAI-Compatible Endpoints
Just as for OpenAI Chat completions, we use LiteLLM for OpenAI-compatible endpoints. You can find their full documentation on this topic here.
Using an OpenAI Proxy
If you're using an OpenAI proxy, set the following in the OpenHands UI through the Settings under the LLM tab:
- Enable `Advanced` options
- Set the following:
  - `Custom Model` to `openai/<model-name>` (e.g. `openai/gpt-4o` or `openai/<proxy-prefix>/<model-name>`)
  - `Base URL` to the URL of your OpenAI proxy
  - `API Key` to your OpenAI API key
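As a rough illustration, the three UI fields above map onto the keyword arguments of a LiteLLM-style completion call. This is only a sketch: the proxy URL and API key below are placeholders, and the helper function is hypothetical, not part of OpenHands.

```python
# Sketch: how the three UI fields (Custom Model, Base URL, API Key) combine
# into a LiteLLM-style completion call. URL and key are placeholders.

def build_completion_kwargs(custom_model: str, base_url: str, api_key: str) -> dict:
    """Assemble the keyword arguments a LiteLLM-style client would receive."""
    # The "openai/" prefix tells LiteLLM to speak the OpenAI chat format.
    assert custom_model.startswith("openai/"), "model must use the openai/ prefix"
    return {
        "model": custom_model,   # e.g. "openai/gpt-4o"
        "api_base": base_url,    # the URL of your OpenAI proxy
        "api_key": api_key,
        "messages": [{"role": "user", "content": "Hello"}],
    }

kwargs = build_completion_kwargs(
    "openai/gpt-4o", "https://my-proxy.example.com/v1", "sk-...")
print(kwargs["model"])  # openai/gpt-4o
```

Passing these kwargs to `litellm.completion(**kwargs)` would then route the request through the proxy instead of api.openai.com.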