diff --git a/docs/DOC_STYLE_GUIDE.md b/docs/DOC_STYLE_GUIDE.md
index a55af799b1..93b916b0e8 100644
--- a/docs/DOC_STYLE_GUIDE.md
+++ b/docs/DOC_STYLE_GUIDE.md
@@ -46,3 +46,11 @@ docker run -it \
-e THAT=that
...
```
+
+### Referring to UI Elements
+
+When referencing UI elements, wrap the element name in backticks so it renders in monospace.
+
+Example:
+1. Toggle the `Advanced` options.
+2. Enter your model in the `Custom Model` text box.
diff --git a/docs/modules/usage/how-to/gui-mode.md b/docs/modules/usage/how-to/gui-mode.md
index 483f8869e9..200e4ce3e0 100644
--- a/docs/modules/usage/how-to/gui-mode.md
+++ b/docs/modules/usage/how-to/gui-mode.md
@@ -1,9 +1,6 @@
# GUI Mode
-## Introduction
-
-OpenHands provides a user-friendly Graphical User Interface (GUI) mode for interacting with the AI assistant.
-This mode offers an intuitive way to set up the environment, manage settings, and communicate with the AI.
+OpenHands provides a Graphical User Interface (GUI) mode for interacting with the AI assistant.
## Installation and Setup
@@ -14,104 +11,95 @@ This mode offers an intuitive way to set up the environment, manage settings, an
### Initial Setup
-1. Upon first launch, you'll see a settings modal.
-2. Select an `LLM Provider` and `LLM Model` from the dropdown menus.
+1. Upon first launch, you'll see a Settings page.
+2. Select an `LLM Provider` and `LLM Model` from the dropdown menus. If the required model does not exist in the list,
+ toggle `Advanced` options and enter it with the correct prefix in the `Custom Model` text box.
3. Enter the corresponding `API Key` for your chosen provider.
-4. Click "Save" to apply the settings.
+4. Click `Save Changes` to apply the settings.
### GitHub Token Setup
OpenHands automatically exports a `GITHUB_TOKEN` to the shell environment if it is available. This can happen in two ways:
-- **Locally (OSS)**: The user directly inputs their GitHub token.
-- **Online (SaaS)**: The token is obtained through GitHub OAuth authentication.
-
-#### Setting Up a Local GitHub Token
-
-1. **Generate a Personal Access Token (PAT)**:
- - Go to GitHub Settings > Developer Settings > Personal Access Tokens > Tokens (classic).
- - Click "Generate new token (classic)".
+- **Local Installation**: The user directly inputs their GitHub token.
+
+  **Setting Up a GitHub Token**
+ 1. **Generate a Personal Access Token (PAT)**:
+ - On GitHub, go to Settings > Developer Settings > Personal Access Tokens > Tokens (classic).
+ - Click `Generate new token (classic)`.
- Required scopes:
- `repo` (Full control of private repositories)
- - `workflow` (Update GitHub Action workflows)
- - `read:org` (Read organization data)
+ 2. **Enter Token in OpenHands**:
+ - Click the Settings button (gear icon).
+ - Navigate to the `GitHub Settings` section.
+ - Paste your token in the `GitHub Token` field.
+ - Click `Save Changes` to apply the changes.
+
-2. **Enter Token in OpenHands**:
- - Click the Settings button (gear icon) in the top right.
- - Navigate to the "GitHub" section.
- - Paste your token in the "GitHub Token" field.
- - Click "Save" to apply the changes.
+
+  **Organizational Token Policies**
-#### Organizational Token Policies
+ If you're working with organizational repositories, additional setup may be required:
-If you're working with organizational repositories, additional setup may be required:
-
-1. **Check Organization Requirements**:
+ 1. **Check Organization Requirements**:
- Organization admins may enforce specific token policies.
- Some organizations require tokens to be created with SSO enabled.
- Review your organization's [token policy settings](https://docs.github.com/en/organizations/managing-programmatic-access-to-your-organization/setting-a-personal-access-token-policy-for-your-organization).
-
-2. **Verify Organization Access**:
+ 2. **Verify Organization Access**:
- Go to your token settings on GitHub.
- - Look for the organization under "Organization access".
- - If required, click "Enable SSO" next to your organization.
+ - Look for the organization under `Organization access`.
+ - If required, click `Enable SSO` next to your organization.
- Complete the SSO authorization process.
+
-#### OAuth Authentication (Online Mode)
+
+  **Troubleshooting**
-When using OpenHands in online mode, the GitHub OAuth flow:
+ Common issues and solutions:
-1. Requests the following permissions:
+ - **Token Not Recognized**:
+ - Ensure the token is properly saved in settings.
+ - Check that the token hasn't expired.
+ - Verify the token has the required scopes.
+ - Try regenerating the token.
+
+ - **Organization Access Denied**:
+ - Check if SSO is required but not enabled.
+ - Verify organization membership.
+ - Contact organization admin if token policies are blocking access.
+
+ - **Verifying Token Works**:
+ - The app will show a green checkmark if the token is valid.
+    - Try accessing a repository to confirm permissions, or run the command-line check sketched at the end of this section.
+ - Check the browser console for any error messages.
+
+
+- **OpenHands Cloud**: The token is obtained through GitHub OAuth authentication.
+
+
+  **OAuth Authentication**
+
+ When using OpenHands Cloud, the GitHub OAuth flow requests the following permissions:
- Repository access (read/write)
- Workflow management
- Organization read access
-2. Authentication steps:
- - Click "Sign in with GitHub" when prompted.
+ To authenticate OpenHands:
+ - Click `Sign in with GitHub` when prompted.
- Review the requested permissions.
- Authorize OpenHands to access your GitHub account.
- If using an organization, authorize organization access if prompted.
-
-#### Troubleshooting
-
-Common issues and solutions:
-
-- **Token Not Recognized**:
- - Ensure the token is properly saved in settings.
- - Check that the token hasn't expired.
- - Verify the token has the required scopes.
- - Try regenerating the token.
-
-- **Organization Access Denied**:
- - Check if SSO is required but not enabled.
- - Verify organization membership.
- - Contact organization admin if token policies are blocking access.
-
-- **Verifying Token Works**:
- - The app will show a green checkmark if the token is valid.
- - Try accessing a repository to confirm permissions.
- - Check the browser console for any error messages.
- - Use the "Test Connection" button in settings if available.
+
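+To confirm a classic token outside the app (as mentioned in the troubleshooting notes above), one option is to query the GitHub API directly. This is only an illustrative check; any HTTP client works. For classic tokens, the `X-OAuth-Scopes` response header lists the granted scopes:
+
+```bash
+# Illustrative check only: replace <your-token> with the classic PAT you entered in OpenHands.
+# A valid token returns your GitHub profile; the X-OAuth-Scopes header lists its scopes.
+curl -i -H "Authorization: token <your-token>" https://api.github.com/user
+```
+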
### Advanced Settings
-1. Toggle `Advanced Options` to access additional settings.
+1. On the Settings page, toggle `Advanced` options to access additional settings.
2. Use the `Custom Model` text box to manually enter a model if it's not in the list.
3. Specify a `Base URL` if required by your LLM provider.
-### Main Interface
-
-The main interface consists of several key components:
-
-- **Chat Window**: The central area where you can view the conversation history with the AI assistant.
-- **Input Box**: Located at the bottom of the screen, use this to type your messages or commands to the AI.
-- **Send Button**: Click this to send your message to the AI.
-- **Settings Button**: A gear icon that opens the settings modal, allowing you to adjust your configuration at any time.
-- **Workspace Panel**: Displays the files and folders in your workspace, allowing you to navigate and view files, or the agent's past commands or web browsing history.
-
### Interacting with the AI
-1. Type your question, request, or task description in the input box.
+1. Type your prompt in the input box.
2. Click the send button or press Enter to submit your message.
3. The AI will process your input and provide a response in the chat window.
4. You can continue the conversation by asking follow-up questions or providing additional information.
diff --git a/docs/modules/usage/installation.mdx b/docs/modules/usage/installation.mdx
index 64a6014e7d..610be444fe 100644
--- a/docs/modules/usage/installation.mdx
+++ b/docs/modules/usage/installation.mdx
@@ -12,7 +12,8 @@ A system with a modern processor and a minimum of **4GB RAM** is recommended to
MacOS
- ### Docker Desktop
+
+ **Docker Desktop**
1. [Install Docker Desktop on Mac](https://docs.docker.com/desktop/setup/install/mac-install).
2. Open Docker Desktop, go to `Settings > Advanced` and ensure `Allow the default Docker socket to be used` is enabled.
@@ -25,7 +26,7 @@ A system with a modern processor and a minimum of **4GB RAM** is recommended to
Tested with Ubuntu 22.04.
:::
- ### Docker Desktop
+ **Docker Desktop**
1. [Install Docker Desktop on Linux](https://docs.docker.com/desktop/setup/install/linux/).
@@ -33,12 +34,13 @@ A system with a modern processor and a minimum of **4GB RAM** is recommended to
Windows
- ### WSL
+
+ **WSL**
1. [Install WSL](https://learn.microsoft.com/en-us/windows/wsl/install).
2. Run `wsl --version` in powershell and confirm `Default Version: 2`.
- ### Docker Desktop
+ **Docker Desktop**
1. [Install Docker Desktop on Windows](https://docs.docker.com/desktop/setup/install/windows-install).
2. Open Docker Desktop, go to `Settings` and confirm the following:
@@ -78,24 +80,22 @@ or run it on tagged issues with [a github action](https://docs.all-hands.dev/mod
## Setup
-Upon launching OpenHands, you'll see a settings modal. You **must** select an `LLM Provider` and `LLM Model` and enter a corresponding `API Key`.
+Upon launching OpenHands, you'll see a Settings page. You **must** select an `LLM Provider` and `LLM Model` and enter a corresponding `API Key`.
These can be changed at any time by selecting the `Settings` button (gear icon) in the UI.
-If the required `LLM Model` does not exist in the list, you can toggle `Advanced Options` and manually enter it with the correct prefix
+If the required model does not exist in the list, you can toggle `Advanced` options and manually enter it with the correct prefix
in the `Custom Model` text box.
-The `Advanced Options` also allow you to specify a `Base URL` if required.
+The `Advanced` options also allow you to specify a `Base URL` if required.
-![Settings screenshot](/img/settings-screenshot.png)
-
-![Settings advanced screenshot](/img/settings-advanced.png)
-
+Now you're ready to [get started with OpenHands](./getting-started).
## Versions
-The command above pulls the most recent stable release of OpenHands. You have other options as well:
-- For a specific release, use `docker.all-hands.dev/all-hands-ai/openhands:$VERSION`, replacing $VERSION with the version number.
-- We use semver, and release major, minor, and patch tags. So `0.9` will automatically point to the latest `0.9.x` release, and `0` will point to the latest `0.x.x` release.
-- For the most up-to-date development version, you can use `docker.all-hands.dev/all-hands-ai/openhands:main`. This version is unstable and is recommended for testing or development purposes only.
+The [docker command above](./installation#start-the-app) pulls the most recent stable release of OpenHands. You have other options as well:
+- For a specific release, replace `$VERSION` in `openhands:$VERSION` and `runtime:$VERSION` with the version number.
+  We use SemVer, so `0.9` will automatically point to the latest `0.9.x` release, and `0` will point to the latest `0.x.x` release.
+- For the most up-to-date development version, replace `$VERSION` in `openhands:$VERSION` and `runtime:$VERSION` with `main`.
+  This version is unstable and is recommended for testing or development purposes only.
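+
+For example, a minimal sketch of pulling a pinned release versus the development build (the `0.19` tag is only an illustration; substitute the release you want, and the `runtime:$VERSION` tag in the full docker run command follows the same pattern):
+
+```bash
+# Pin the app image to a specific release instead of the latest stable tag
+docker pull docker.all-hands.dev/all-hands-ai/openhands:0.19
+
+# Or track the unstable development build
+docker pull docker.all-hands.dev/all-hands-ai/openhands:main
+```
+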
You can choose the tag that best suits your needs based on stability requirements and desired features.
diff --git a/docs/modules/usage/llms/azure-llms.md b/docs/modules/usage/llms/azure-llms.md
index 7046fe7bf5..84f16627ab 100644
--- a/docs/modules/usage/llms/azure-llms.md
+++ b/docs/modules/usage/llms/azure-llms.md
@@ -25,7 +25,7 @@ You will need your ChatGPT deployment name which can be found on the deployments
<deployment-name> below.
:::
-1. Enable `Advanced Options`
+1. Enable `Advanced` options
2. Set the following:
- `Custom Model` to azure/<deployment-name>
- `Base URL` to your Azure API Base URL (e.g. `https://example-endpoint.openai.azure.com`)
diff --git a/docs/modules/usage/llms/google-llms.md b/docs/modules/usage/llms/google-llms.md
index d89ba389f0..74e9015ffb 100644
--- a/docs/modules/usage/llms/google-llms.md
+++ b/docs/modules/usage/llms/google-llms.md
@@ -10,7 +10,7 @@ OpenHands uses LiteLLM to make calls to Google's chat models. You can find their
When running OpenHands, you'll need to set the following in the OpenHands UI through the Settings:
- `LLM Provider` to `Gemini`
- `LLM Model` to the model you will be using.
-If the model is not in the list, toggle `Advanced Options`, and enter it in `Custom Model` (e.g. gemini/<model-name> like `gemini/gemini-1.5-pro`).
+If the model is not in the list, toggle `Advanced` options, and enter it in `Custom Model` (e.g. gemini/<model-name> like `gemini/gemini-1.5-pro`).
- `API Key` to your Gemini API key
## VertexAI - Google Cloud Platform Configs
@@ -27,4 +27,4 @@ VERTEXAI_LOCATION=""
Then set the following in the OpenHands UI through the Settings:
- `LLM Provider` to `VertexAI`
- `LLM Model` to the model you will be using.
-If the model is not in the list, toggle `Advanced Options`, and enter it in `Custom Model` (e.g. vertex_ai/<model-name>).
+If the model is not in the list, toggle `Advanced` options, and enter it in `Custom Model` (e.g. vertex_ai/<model-name>).
diff --git a/docs/modules/usage/llms/groq.md b/docs/modules/usage/llms/groq.md
index d484d5e3a4..0de104cf14 100644
--- a/docs/modules/usage/llms/groq.md
+++ b/docs/modules/usage/llms/groq.md
@@ -8,7 +8,7 @@ When running OpenHands, you'll need to set the following in the OpenHands UI thr
- `LLM Provider` to `Groq`
- `LLM Model` to the model you will be using. [Visit here to see the list of
models that Groq hosts](https://console.groq.com/docs/models). If the model is not in the list, toggle
-`Advanced Options`, and enter it in `Custom Model` (e.g. groq/<model-name> like `groq/llama3-70b-8192`).
+`Advanced` options, and enter it in `Custom Model` (e.g. groq/<model-name> like `groq/llama3-70b-8192`).
- `API key` to your Groq API key. To find or create your Groq API Key, [see here](https://console.groq.com/keys).
@@ -17,7 +17,7 @@ models that Groq hosts](https://console.groq.com/docs/models). If the model is n
The Groq endpoint for chat completion is [mostly OpenAI-compatible](https://console.groq.com/docs/openai). Therefore, you can access Groq models as you
would access any OpenAI-compatible endpoint. In the OpenHands UI through the Settings:
-1. Enable `Advanced Options`
+1. Enable `Advanced` options
2. Set the following:
- `Custom Model` to the prefix `openai/` + the model you will be using (e.g. `openai/llama3-70b-8192`)
- `Base URL` to `https://api.groq.com/openai/v1`
diff --git a/docs/modules/usage/llms/litellm-proxy.md b/docs/modules/usage/llms/litellm-proxy.md
index 9178bc5c33..21413e0ef1 100644
--- a/docs/modules/usage/llms/litellm-proxy.md
+++ b/docs/modules/usage/llms/litellm-proxy.md
@@ -8,7 +8,7 @@ To use LiteLLM proxy with OpenHands, you need to:
1. Set up a LiteLLM proxy server (see [LiteLLM documentation](https://docs.litellm.ai/docs/proxy/quick_start))
2. When running OpenHands, you'll need to set the following in the OpenHands UI through the Settings:
- * Enable `Advanced Options`
+ * Enable `Advanced` options
* `Custom Model` to the prefix `litellm_proxy/` + the model you will be using (e.g. `litellm_proxy/anthropic.claude-3-5-sonnet-20241022-v2:0`)
* `Base URL` to your LiteLLM proxy URL (e.g. `https://your-litellm-proxy.com`)
* `API Key` to your LiteLLM proxy API key
diff --git a/docs/modules/usage/llms/llms.md b/docs/modules/usage/llms/llms.md
index f4fa118dd0..c2b08d0134 100644
--- a/docs/modules/usage/llms/llms.md
+++ b/docs/modules/usage/llms/llms.md
@@ -38,7 +38,7 @@ The following can be set in the OpenHands UI through the Settings:
- `LLM Provider`
- `LLM Model`
- `API Key`
-- `Base URL` (through `Advanced Settings`)
+- `Base URL` (through `Advanced` settings)
There are some settings that may be necessary for some LLMs/providers that cannot be set through the UI. Instead, these
can be set through environment variables passed to the [docker run command](/modules/usage/installation#start-the-app)
diff --git a/docs/modules/usage/llms/openai-llms.md b/docs/modules/usage/llms/openai-llms.md
index 9157c7cac8..d035898969 100644
--- a/docs/modules/usage/llms/openai-llms.md
+++ b/docs/modules/usage/llms/openai-llms.md
@@ -8,7 +8,7 @@ When running OpenHands, you'll need to set the following in the OpenHands UI thr
* `LLM Provider` to `OpenAI`
* `LLM Model` to the model you will be using.
[Visit here to see a full list of OpenAI models that LiteLLM supports.](https://docs.litellm.ai/docs/providers/openai#openai-chat-completion-models)
-If the model is not in the list, toggle `Advanced Options`, and enter it in `Custom Model` (e.g. openai/<model-name> like `openai/gpt-4o`).
+If the model is not in the list, toggle `Advanced` options, and enter it in `Custom Model` (e.g. openai/<model-name> like `openai/gpt-4o`).
* `API Key` to your OpenAI API key. To find or create your OpenAI Project API Key, [see here](https://platform.openai.com/api-keys).
## Using OpenAI-Compatible Endpoints
@@ -18,7 +18,7 @@ Just as for OpenAI Chat completions, we use LiteLLM for OpenAI-compatible endpoi
## Using an OpenAI Proxy
If you're using an OpenAI proxy, in the OpenHands UI through the Settings:
-1. Enable `Advanced Options`
+1. Enable `Advanced` options
2. Set the following:
- `Custom Model` to openai/<model-name> (e.g. `openai/gpt-4o` or openai/<proxy-prefix>/<model-name>)
- `Base URL` to the URL of your OpenAI proxy
diff --git a/docs/modules/usage/llms/openrouter.md b/docs/modules/usage/llms/openrouter.md
index 247d0a0558..2b5204d26c 100644
--- a/docs/modules/usage/llms/openrouter.md
+++ b/docs/modules/usage/llms/openrouter.md
@@ -8,5 +8,5 @@ When running OpenHands, you'll need to set the following in the OpenHands UI thr
* `LLM Provider` to `OpenRouter`
* `LLM Model` to the model you will be using.
[Visit here to see a full list of OpenRouter models](https://openrouter.ai/models).
-If the model is not in the list, toggle `Advanced Options`, and enter it in `Custom Model` (e.g. openrouter/<model-name> like `openrouter/anthropic/claude-3.5-sonnet`).
+If the model is not in the list, toggle `Advanced` options, and enter it in `Custom Model` (e.g. openrouter/<model-name> like `openrouter/anthropic/claude-3.5-sonnet`).
* `API Key` to your OpenRouter API key.
diff --git a/docs/sidebars.ts b/docs/sidebars.ts
index da416ac30b..f71e36a0b5 100644
--- a/docs/sidebars.ts
+++ b/docs/sidebars.ts
@@ -66,7 +66,7 @@ const sidebars: SidebarsConfig = {
},
{
type: 'doc',
- label: 'Github Actions',
+      label: 'GitHub Action',
id: 'usage/how-to/github-action',
},
{
diff --git a/docs/src/pages/index.tsx b/docs/src/pages/index.tsx
index a2df79a259..6f20f1eb77 100644
--- a/docs/src/pages/index.tsx
+++ b/docs/src/pages/index.tsx
@@ -23,6 +23,17 @@ export default function Home(): JSX.Element {
})}
>
+
+
+
Most Popular Links
+
+
);
}
diff --git a/docs/static/img/settings-advanced.png b/docs/static/img/settings-advanced.png
deleted file mode 100644
index 43a9cf05ab..0000000000
Binary files a/docs/static/img/settings-advanced.png and /dev/null differ
diff --git a/docs/static/img/settings-screenshot.png b/docs/static/img/settings-screenshot.png
deleted file mode 100644
index 987dd8c255..0000000000
Binary files a/docs/static/img/settings-screenshot.png and /dev/null differ