mirror of
https://github.com/All-Hands-AI/OpenHands.git
synced 2026-01-08 22:38:05 -05:00
Add claude-sonnet-latest to supported models lists (#8365)
Co-authored-by: openhands <openhands@all-hands.dev>
Co-authored-by: Engel Nyst <enyst@users.noreply.github.com>
This commit is contained in:
.github/workflows/openhands-resolver.yml (vendored): 4 lines changed
@@ -232,11 +232,11 @@ jobs:
           // Perform package installation
           if (isExperimentalLabel || isIssueCommentExperimental || isReviewCommentExperimental) {
             console.log("Installing experimental OpenHands...");
             await exec.exec("pip install git+https://github.com/all-hands-ai/openhands.git");
           } else {
             console.log("Installing from requirements.txt...");
             await exec.exec("pip install -r /tmp/requirements.txt");
           }
@@ -12,7 +12,7 @@ $ npm install
 $ npm run start
 ```

 This command starts a local development server and opens up a browser window.
 Most changes are reflected live without having to restart the server.

 Alternatively, you can pass a `--locale` argument to render a specific language in dev mode as in:
@@ -1,6 +1,6 @@
 # OpenHands Cloud API

 OpenHands Cloud provides a REST API that allows you to programmatically interact with the service. This is useful if
 you want to kick off your own jobs from your programs in a flexible way.

 This guide explains how to obtain an API key and use the API to start conversations.
@@ -1,6 +1,6 @@
 # CLI Mode

 CLI mode provides a powerful interactive Command-Line Interface (CLI) that lets you engage with OpenHands directly
 from your terminal.

 This mode is different from the [headless mode](headless-mode), which is non-interactive and better for scripting.
@@ -52,12 +52,12 @@ The `-e SANDBOX_USER_ID=$(id -u)` ensures files created by the agent in your wor

 ### What is CLI Mode?

 CLI mode enables real-time interaction with OpenHands agents. You can type natural language tasks, use interactive
 commands, and receive instant feedback—all inside your terminal.

 ### Starting a Conversation

 When you start the CLI, you'll see a welcome message and a prompt (`>`). Enter your first task or type a command to
 begin your conversation.

 ### Available Commands
@@ -76,7 +76,7 @@ You can use the following commands whenever the prompt (`>`) is displayed:

 #### Settings and Configuration

 You can update your model, API key, agent, and other preferences interactively using the `/settings` command. Just
 follow the prompts:

 - **Basic settings**: Choose a model/provider and enter your API key.
@@ -86,12 +86,12 @@ Settings can also be managed via the `config.toml` file.

 #### Repository Initialization

 The `/init` command helps the agent understand your project by creating a `.openhands/microagents/repo.md` file with
 project details and structure. Use this when onboarding the agent to a new codebase.

 #### Agent Pause/Resume Feature

 You can pause the agent while it is running by pressing `Ctrl-P`. To continue the conversation after pausing, simply
 type `/resume` at the prompt.

 ## Tips and Troubleshooting
@@ -1,14 +1,14 @@
 # Model Context Protocol (MCP)

 :::note
 This page outlines how to configure and use the Model Context Protocol (MCP) in OpenHands, allowing you to extend the
 agent's capabilities with custom tools.
 :::

 ## Overview

 Model Context Protocol (MCP) is a mechanism that allows OpenHands to communicate with external tool servers. These
 servers can provide additional functionality to the agent, such as specialized data processing, external API access,
 or custom tools. MCP is based on the open standard defined at [modelcontextprotocol.io](https://modelcontextprotocol.io).

 ## Configuration
@@ -40,7 +40,7 @@ On initial prompt, an error is seen with `Permission Denied` or `PermissionError

 **Description**

 When accessing OpenHands through a non-localhost URL (such as a LAN IP address), the VS Code tab shows a "Forbidden"
 error, while other parts of the UI work fine.

 **Resolution**
@@ -49,6 +49,7 @@ LLM_RETRY_EXCEPTIONS: tuple[type[Exception], ...] = (
 # remove this when we gemini and deepseek are supported
 CACHE_PROMPT_SUPPORTED_MODELS = [
     'claude-3-7-sonnet-20250219',
+    'claude-sonnet-3-7-latest',
     'claude-3-5-sonnet-20241022',
     'claude-3-5-sonnet-20240620',
     'claude-3-5-haiku-20241022',
@@ -59,6 +60,7 @@ CACHE_PROMPT_SUPPORTED_MODELS = [
 # function calling supporting models
 FUNCTION_CALLING_SUPPORTED_MODELS = [
     'claude-3-7-sonnet-20250219',
+    'claude-sonnet-3-7-latest',
     'claude-3-5-sonnet',
     'claude-3-5-sonnet-20240620',
     'claude-3-5-sonnet-20241022',