Add 404 troubleshooting section (#1505)
If you're running on Windows and having trouble, check out our [guide for Windows users](troubleshooting/windows)
:::

## Unable to connect to docker

[GitHub Issue](https://github.com/OpenDevin/OpenDevin/issues/1226)
### Symptoms

### Workarounds

* Make sure you don't need `sudo` to run docker ([see here](https://www.baeldung.com/linux/docker-run-without-sudo)); a quick check is sketched below the list
* If you are on a Mac, check the [permissions requirements](https://docs.docker.com/desktop/mac/permission-requirements/); in particular, consider enabling "Allow the default Docker socket to be used" under Settings > Advanced in Docker Desktop
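
A minimal sketch of the `sudo` check, assuming a standard Linux Docker install (these are stock Docker commands, not OpenDevin-specific):

```bash
# If this lists containers, your user can already reach the Docker daemon:
docker ps

# If it fails with "permission denied", add yourself to the docker group,
# then start a fresh session (logging out and back in also works):
sudo usermod -aG docker $USER
newgrp docker
```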

## Unable to connect to SSH box

[GitHub Issue](https://github.com/OpenDevin/OpenDevin/issues/1156)
### Symptoms

### Workarounds

- Try [this reinstallation guide](https://github.com/OpenDevin/OpenDevin/issues/1156#issuecomment-2064549427)
- Set `-e SANDBOX_TYPE=exec` to switch to the ExecBox docker container (see the sketch below)
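
A hedged example of passing that flag, assuming you launch OpenDevin with `docker run`; the image name and mount below are placeholders, so keep whatever flags you already use:

```bash
# Only -e SANDBOX_TYPE=exec is the workaround here; adapt the rest
# to your existing run command.
docker run -it \
    -e SANDBOX_TYPE=exec \
    -v /var/run/docker.sock:/var/run/docker.sock \
    ghcr.io/opendevin/opendevin
```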

## Unable to connect to LLM

[GitHub Issue](https://github.com/OpenDevin/OpenDevin/issues/1208)
### Symptoms

AttributeError: 'NoneType' object has no attribute 'request'

### Details

[GitHub Issues](https://github.com/OpenDevin/OpenDevin/issues?q=is%3Aissue+is%3Aopen+404)

This usually happens with local LLM setups, when OpenDevin can't connect to the LLM server.
See our guide for [local LLMs](llms/localLLMs) for more information.

### Workarounds

- Check your `LLM_BASE_URL`
- Check that ollama is running OK
- Make sure you're using `--add-host host.docker.internal=host-gateway` when running in docker (an example invocation is sketched below)
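
A hedged sketch combining these checks, assuming ollama on its default port; the base URL and image name are placeholders to match to your own setup:

```bash
# Check that ollama is up and has your model pulled:
ollama list

# When OpenDevin itself runs in docker, let the container reach the host:
docker run --add-host host.docker.internal=host-gateway \
    -e LLM_BASE_URL="http://host.docker.internal:11434" \
    ghcr.io/opendevin/opendevin
```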

## 404 Resource not found
### Symptoms

```
Traceback (most recent call last):
  File "/app/.venv/lib/python3.12/site-packages/litellm/llms/openai.py", line 414, in completion
    raise e
  File "/app/.venv/lib/python3.12/site-packages/litellm/llms/openai.py", line 373, in completion
    response = openai_client.chat.completions.create(**data, timeout=timeout)  # type: ignore
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/openai/_utils/_utils.py", line 277, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/openai/resources/chat/completions.py", line 579, in create
    return self._post(
           ^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1232, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 921, in request
    return self._request(
           ^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1012, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.NotFoundError: Error code: 404 - {'error': {'code': '404', 'message': 'Resource not found'}}
```

### Details

This happens when LiteLLM (our library for connecting to different LLM providers) can't find the API you're trying to connect to. Most often this happens for Azure or ollama users.
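
For instance, a 404 here often just means the base URL or model path is wrong. A hedged sketch of the variables involved, with Azure-style placeholder values to substitute (the `azure/` model prefix follows the LiteLLM provider docs):

```bash
# Placeholder values; substitute your real resource and deployment names.
export LLM_BASE_URL="https://<your-resource>.openai.azure.com"
export LLM_MODEL="azure/<your-deployment-name>"   # LiteLLM's azure/ model prefix
```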

### Workarounds

- Check that you've set `LLM_BASE_URL` properly
- Check that the model is set properly, based on the [LiteLLM docs](https://docs.litellm.ai/docs/providers)
  - If you're running inside the UI, be sure to set the `model` in the settings modal
  - If you're running headless (via main.py), be sure to set `LLM_MODEL` in your env/config
- Make sure you've followed any special instructions for your LLM provider
  - [ollama](/OpenDevin/modules/usage/llms/localLLMs)
  - [Azure](/OpenDevin/modules/usage/llms/azureLLMs)
  - [Google](/OpenDevin/modules/usage/llms/googleLLMs)
- Make sure your API key is correct
- See if you can connect to the LLM using `curl` (a sketch follows this list)
- Try [connecting via LiteLLM directly](https://github.com/BerriAI/litellm) to test your setup
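
A minimal `curl` sketch against an OpenAI-compatible endpoint; the URL, key, and model are placeholders (ollama's `/api/tags` is its real model-listing route):

```bash
# List the models a local ollama server exposes:
curl -s http://localhost:11434/api/tags

# Or exercise an OpenAI-style chat endpoint directly:
curl -s "$LLM_BASE_URL/chat/completions" \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $LLM_API_KEY" \
  -d '{"model": "<your-model>", "messages": [{"role": "user", "content": "hello"}]}'
```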