# Groq
OpenHands uses LiteLLM to make calls to chat models on Groq. You can find LiteLLM's documentation on using Groq as a provider here.
## Configuration
When running OpenHands, you'll need to set the following in the OpenHands UI through the Settings under the LLM tab:
- `LLM Provider` to `Groq`
- `LLM Model` to the model you will be using. Visit here to see the list of models that Groq hosts. If the model is not in the list, enable `Advanced` options and enter it in `Custom Model` (e.g. `groq/<model-name>` like `groq/llama3-70b-8192`).
- `API Key` to your Groq API key. To find or create your Groq API key, see here.
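If you run OpenHands headless or from the CLI instead of the UI, the same settings can go in a `config.toml`. This is a minimal sketch; the exact key names assume OpenHands' standard `[llm]` section, and the API key shown is a placeholder.

```toml
# Sketch of a config.toml [llm] section for Groq (key names assumed from
# OpenHands' standard configuration; replace the api_key placeholder).
[llm]
model = "groq/llama3-70b-8192"
api_key = "gsk-..."
```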
## Using Groq as an OpenAI-Compatible Endpoint
The Groq endpoint for chat completion is mostly OpenAI-compatible. Therefore, you can access Groq models as you
would access any OpenAI-compatible endpoint. In the OpenHands UI through the Settings under the LLM tab:
- Enable `Advanced` options
- Set the following:
  - `Custom Model` to the prefix `openai/` + the model you will be using (e.g. `openai/llama3-70b-8192`)
  - `Base URL` to `https://api.groq.com/openai/v1`
  - `API Key` to your Groq API key
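To see what "OpenAI-compatible" means concretely, here is a sketch of the request OpenHands' client ends up sending: a standard OpenAI-style chat-completions POST aimed at the Groq base URL. The model name and API key are placeholders; only the endpoint URL and payload shape come from the text above.

```python
# Sketch: Groq's OpenAI-compatible chat completions endpoint takes the same
# JSON payload and Bearer-token header as OpenAI's API, just at a different
# base URL. Model name and API key below are illustrative placeholders.
import json
import os
import urllib.request

BASE_URL = "https://api.groq.com/openai/v1"

def build_chat_request(model: str, prompt: str, api_key: str) -> urllib.request.Request:
    """Build (but do not send) an OpenAI-style chat completions request."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )

req = build_chat_request(
    "llama3-70b-8192",
    "Hello!",
    os.environ.get("GROQ_API_KEY", "gsk-placeholder"),
)
# urllib.request.urlopen(req) would send it; omitted here to avoid a live call.
```

Because the payload is plain OpenAI format, any OpenAI-compatible client (the official `openai` SDK, LiteLLM, curl) works the same way once pointed at this base URL.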