mirror of
https://github.com/microsoft/autogen.git
synced 2026-04-20 03:02:16 -04:00
guide on the usage of docker (#1111)
* docker documentation
* docker doc
* clean contribute.md
* minor change
* Add more detailed description
* add docker instructions
* more dockerfiles
* readme update
* latest python
* dev docker python version
* add version
* readme
* improve doc
* improve doc
* path name
* naming
* Update website/docs/Installation.md Co-authored-by: Chi Wang <wang.chi@microsoft.com>
* Update website/docs/Installation.md Co-authored-by: Chi Wang <wang.chi@microsoft.com>
* Add suggestion to install colima for Mac users
* Update website/docs/Installation.md Co-authored-by: Chi Wang <wang.chi@microsoft.com>
* Update website/docs/Installation.md Co-authored-by: olgavrou <olgavrou@gmail.com>
* update doc
* typo
* improve doc
* add more options in dev file
* contrib
* add link to doc
* add link
* Update website/docs/Installation.md Co-authored-by: Chi Wang <wang.chi@microsoft.com>
* Update website/docs/Installation.md Co-authored-by: Chi Wang <wang.chi@microsoft.com>
* instruction
* Update website/docs/FAQ.md Co-authored-by: Chi Wang <wang.chi@microsoft.com>
* FAQ
* comment autogen studio

---------

Co-authored-by: Yuandong Tian <yuandong@fb.com>
Co-authored-by: Chi Wang <wang.chi@microsoft.com>
Co-authored-by: olgavrou <olgavrou@gmail.com>
18  README.md
@@ -62,18 +62,12 @@ The easiest way to start playing is
3. Start playing with the notebooks!

*NOTE*: OAI_CONFIG_LIST_sample lists GPT-4 as the default model because it is our current recommendation and is known to work well with AutoGen. If you use a model other than GPT-4, you may need to revise various system prompts (especially if using weaker models like GPT-3.5-turbo). Moreover, if you use models other than those hosted by OpenAI or Azure, you may incur additional risks related to alignment and safety. Proceed with caution when updating this default.
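Switching away from the GPT-4 default amounts to editing the model entries in your config list. As a minimal sketch (the file contents and keys here are illustrative, not AutoGen's exact loader), filtering an OAI_CONFIG_LIST-style JSON document down to one model looks like:

```python
import json

# Hedged sketch: pick out only the gpt-4 entries from an OAI_CONFIG_LIST-style
# JSON document. The entries below are illustrative; real lists may also carry
# fields such as api_base and api_type.
raw = '[{"model": "gpt-4", "api_key": "sk-xxx"}, {"model": "gpt-3.5-turbo", "api_key": "sk-yyy"}]'
config_list = [entry for entry in json.loads(raw) if entry["model"] == "gpt-4"]
```

Passing a narrowed `config_list` like this into an agent's `llm_config` is the usual way to pin the model choice.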
## [Installation](https://microsoft.github.io/autogen/docs/Installation)

### Option 1. Install and Run AutoGen in Docker

## Using existing docker image

Install docker, save your OpenAI API key into an environment variable named OPENAI_API_KEY, and then run the following.
Find detailed instructions for users [here](https://microsoft.github.io/autogen/docs/Installation#option-1-install-and-run-autogen-in-docker), and for developers [here](https://microsoft.github.io/autogen/docs/Contribute#docker).

```bash
docker pull yuandongtian/autogen:latest
docker run -it -e OPENAI_API_KEY=$OPENAI_API_KEY -p 8081:8081 docker.io/yuandongtian/autogen:latest
```

Then open `http://localhost:8081/` in your browser to use AutoGen. The UI comes from `./samples/apps/autogen-assistant`. See the Docker Hub [page](https://hub.docker.com/r/yuandongtian/autogen) for more details.

## Installation

### Option 2. Install AutoGen Locally

AutoGen requires **Python version >= 3.8, < 3.12**. It can be installed from pip:

@@ -88,11 +82,11 @@ Minimal dependencies are installed without extra options. You can install extra
pip install "pyautogen[blendsearch]"
``` -->

Find more options in [Installation](https://microsoft.github.io/autogen/docs/Installation).
Find more options in [Installation](https://microsoft.github.io/autogen/docs/Installation#option-2-install-autogen-locally-using-virtual-environment).

<!-- Each of the [`notebook examples`](https://github.com/microsoft/autogen/tree/main/notebook) may require a specific option to be installed. -->

For [code execution](https://microsoft.github.io/autogen/docs/FAQ/#code-execution), we strongly recommend installing the Python docker package and using docker.
Even if you are installing AutoGen locally outside of docker, we recommend performing [code execution](https://microsoft.github.io/autogen/docs/FAQ/#code-execution) in docker. Find more instructions [here](https://microsoft.github.io/autogen/docs/Installation#docker).

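Concretely, docker-based execution is selected through the `use_docker` entry of `code_execution_config`. A minimal sketch, with illustrative values rather than AutoGen defaults:

```python
# Hedged sketch: a code_execution_config that routes generated code into docker.
# The work_dir and image name below are illustrative values, not AutoGen defaults.
code_execution_config = {
    "work_dir": "coding",      # host folder where generated code is written
    "use_docker": "python:3",  # docker image to execute the code in
}
# Setting "use_docker": False instead would force local execution.
```

A dict like this is what gets passed as the `code_execution_config` argument of a `UserProxyAgent`.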
For LLM inference configurations, check the [FAQs](https://microsoft.github.io/autogen/docs/FAQ#set-your-api-endpoints).

21  samples/dockers/Dockerfile.base  Normal file
@@ -0,0 +1,21 @@
|
||||
FROM python:3.11-slim-bookworm
|
||||
|
||||
RUN : \
|
||||
&& apt-get update \
|
||||
&& DEBIAN_FRONTEND=noninteractive apt-get install -y --no-install-recommends \
|
||||
software-properties-common \
|
||||
&& DEBIAN_FRONTEND=noninteractive apt-get install -y --no-install-recommends \
|
||||
python3-venv \
|
||||
&& apt-get clean \
|
||||
&& rm -rf /var/lib/apt/lists/* \
|
||||
&& :
|
||||
|
||||
RUN python3 -m venv /venv
|
||||
ENV PATH=/venv/bin:$PATH
|
||||
EXPOSE 8081
|
||||
|
||||
RUN cd /venv; pip install pyautogen
|
||||
# Pre-load popular packages as per https://learnpython.com/blog/most-popular-python-packages/
|
||||
RUN pip install numpy pandas matplotlib seaborn scikit-learn requests urllib3 nltk pillow pytest beautifulsoup4
|
||||
|
||||
ENTRYPOINT []
|
||||
@@ -1,5 +1,5 @@
# basic setup
FROM python:3.10
FROM python:3.11-slim-bookworm
RUN apt-get update && apt-get -y update
RUN apt-get install -y sudo git npm

@@ -13,8 +13,8 @@ USER autogen-dev
RUN cd /home/autogen-dev && git clone https://github.com/microsoft/autogen.git
WORKDIR /home/autogen-dev/autogen

# Install autogen (Note: extra components can be installed if needed)
RUN sudo pip install -e .[test]
# Install autogen in editable mode (Note: extra components can be installed if needed)
RUN sudo pip install -e .[test,teachable,lmm,retrievechat,mathchat,blendsearch]

# Install precommit hooks
RUN pre-commit install
@@ -25,6 +25,9 @@ RUN sudo pip install pydoc-markdown
RUN cd website
RUN yarn install --frozen-lockfile --ignore-engines

# Pre-load popular packages as per https://learnpython.com/blog/most-popular-python-packages/
RUN pip install numpy pandas matplotlib seaborn scikit-learn requests urllib3 nltk pillow pytest beautifulsoup4

# override default image starting point
CMD /bin/bash
ENTRYPOINT []
21  samples/dockers/Dockerfile.full  Normal file
@@ -0,0 +1,21 @@
FROM python:3.11-slim-bookworm

RUN : \
    && apt-get update \
    && DEBIAN_FRONTEND=noninteractive apt-get install -y --no-install-recommends \
        software-properties-common \
    && DEBIAN_FRONTEND=noninteractive apt-get install -y --no-install-recommends \
        python3-venv \
    && apt-get clean \
    && rm -rf /var/lib/apt/lists/* \
    && :

RUN python3 -m venv /venv
ENV PATH=/venv/bin:$PATH
EXPOSE 8081

RUN cd /venv; pip install pyautogen[teachable,lmm,retrievechat,mathchat,blendsearch] autogenra
# Pre-load popular packages as per https://learnpython.com/blog/most-popular-python-packages/
RUN pip install numpy pandas matplotlib seaborn scikit-learn requests urllib3 nltk pillow pytest beautifulsoup4

ENTRYPOINT []
@@ -97,12 +97,18 @@ pip install -e autogen

### Docker

We provide a simple [Dockerfile](https://github.com/microsoft/autogen/blob/main/Dockerfile).
We provide [Dockerfiles](https://github.com/microsoft/autogen/blob/main/samples/dockers/Dockerfile.dev) for developers to use.

Use the following command line to build and run a docker image.

```bash
docker build https://github.com/microsoft/autogen.git#main -t autogen-dev
docker run -it autogen-dev
```
```bash
docker build -f samples/dockers/Dockerfile.dev -t autogen_dev_img https://github.com/microsoft/autogen.git#main
docker run -it autogen_dev_img
```

Detailed instructions can be found [here](Installation.md#option-1-install-and-run-autogen-in-docker).

### Develop in Remote Container

@@ -71,8 +71,8 @@ The `AssistantAgent` doesn't save all the code by default, because there are cas
|
||||
|
||||
We strongly recommend using docker to execute code. There are two ways to use docker:
|
||||
|
||||
1. Run autogen in a docker container. For example, when developing in GitHub codespace, the autogen runs in a docker container.
|
||||
2. Run autogen outside of a docker, while perform code execution with a docker container. For this option, make sure the python package `docker` is installed. When it is not installed and `use_docker` is omitted in `code_execution_config`, the code will be executed locally (this behavior is subject to change in future).
|
||||
1. Run AutoGen in a docker container. For example, when developing in [GitHub codespace](https://codespaces.new/microsoft/autogen?quickstart=1), AutoGen runs in a docker container. If you are not developing in Github codespace, follow instructions [here](Installation.md#option-1-install-and-run-autogen-in-docker) to install and run AutoGen in docker.
|
||||
2. Run AutoGen outside of a docker, while performing code execution with a docker container. For this option, set up docker and make sure the python package `docker` is installed. When not installed and `use_docker` is omitted in `code_execution_config`, the code will be executed locally (this behavior is subject to change in future).
|
||||
|
||||
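For the second way of using docker, the presence of the python `docker` package is what decides between docker-based and local execution. A hedged sketch of that check (illustrative, not AutoGen's internal code):

```python
import importlib.util

# Hedged sketch (illustrative, not AutoGen's internal code): fall back to local
# execution when the `docker` python package is not importable, mirroring the
# behavior described in option 2 above.
docker_installed = importlib.util.find_spec("docker") is not None
use_docker = "python:3" if docker_installed else False
```

Here `False` stands for local execution, while an image name such as `"python:3"` selects docker-based execution.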
### Enable Python 3 docker image

@@ -1,10 +1,50 @@
# Installation

## Setup Virtual Environment
## Option 1: Install and Run AutoGen in Docker

When not using a docker container, we recommend using a virtual environment to install AutoGen. This will ensure that the dependencies for AutoGen are isolated from the rest of your system.
[Docker](https://www.docker.com/) is a containerization platform that simplifies the setup and execution of your code. A properly built docker image can provide an isolated and consistent environment to run your code securely across platforms. One way to use AutoGen is to install and run it in a docker container. You can do that in [GitHub codespace](https://codespaces.new/microsoft/autogen?quickstart=1) or follow the instructions below.

### Option 1: venv
#### Step 1. Install Docker.

Install docker following [these instructions](https://docs.docker.com/get-docker/).

For Mac users: if there are any issues starting the docker daemon, you may alternatively install [colima](https://smallsharpsoftwaretools.com/tutorials/use-colima-to-run-docker-containers-on-macos/) to run docker containers.

#### Step 2. Build a docker image

AutoGen provides [dockerfiles](https://github.com/microsoft/autogen/tree/main/samples/dockers/) that can be used to build docker images. Use the following command line to build a docker image named `autogen_img` (or any other name you prefer) from the provided `Dockerfile.base`:

```bash
docker build -f samples/dockers/Dockerfile.base -t autogen_img https://github.com/microsoft/autogen.git#main
```
which includes some common python libraries and the essential dependencies of AutoGen. Alternatively, build from `Dockerfile.full`, which includes additional dependencies for more advanced features of AutoGen, with the following command line:

```bash
docker build -f samples/dockers/Dockerfile.full -t autogen_full_img https://github.com/microsoft/autogen.git
```
Once you have built the docker image, you can use `docker images` to check whether it was created successfully.

#### Step 3. Run applications built with AutoGen from a docker image.

**Mount your code to the docker image and run your application from there:** Now suppose you have your application built with AutoGen in a main script named `twoagent.py` ([example](https://github.com/microsoft/autogen/blob/main/test/twoagent.py)) in a folder named `myapp`. With the command line below, you can mount your folder and run the application in docker.

```bash
# Mount the local folder `myapp` into the docker image and run the script named "twoagent.py" in docker.
docker run -it -v `pwd`/myapp:/myapp autogen_img:latest python /myapp/twoagent.py
```

<!-- You may also run [AutoGen Studio](https://github.com/microsoft/autogen/tree/main/samples/apps/autogen-studio) (assuming that you have built a docker image named `autogen_full_img` with `Dockerfile.full` and you have set the environment variable `OPENAI_API_KEY` to your OpenAI API key) as below:

```bash
docker run -it -e OPENAI_API_KEY=$OPENAI_API_KEY -p 8081:8081 autogen_full_img:latest autogenra ui --host 0.0.0.0
```
Then open `http://localhost:8081/` in your browser to use AutoGen Studio. -->

## Option 2: Install AutoGen Locally Using Virtual Environment

When installing AutoGen locally, we recommend using a virtual environment. This will ensure that the dependencies for AutoGen are isolated from the rest of your system.

### Option a: venv

You can create a virtual environment with `venv` as below:
```bash

@@ -17,9 +57,9 @@ The following command will deactivate the current `venv` environment:

deactivate
```

### Option 2: conda
### Option b: conda

Another option is with `Conda`, Conda works better at solving dependency conflicts than pip. You can install it by following [this doc](https://docs.conda.io/projects/conda/en/stable/user-guide/install/index.html),
Another option is with `Conda`. You can install it by following [this doc](https://docs.conda.io/projects/conda/en/stable/user-guide/install/index.html),
and then create a virtual environment as below:
```bash
conda create -n pyautogen python=3.10  # python 3.10 is recommended as it's stable and not too old

@@ -31,7 +71,7 @@ The following command will deactivate the current `conda` environment:

conda deactivate
```

### Option 3: poetry
### Option c: poetry

Another option is with `poetry`, which is a dependency manager for Python.

@@ -93,13 +133,26 @@ Inference parameter tuning can be done via [`flaml.tune`](https://microsoft.gith
|
||||
### Optional Dependencies
|
||||
- #### docker
|
||||
|
||||
For the best user experience and seamless code execution, we highly recommend using Docker with AutoGen. Docker is a containerization platform that simplifies the setup and execution of your code. Developing in a docker container, such as GitHub Codespace, also makes the development convenient.
|
||||
Even if you install AutoGen locally, we highly recommend using Docker for [code execution](FAQ.md#enable-python-3-docker-image).
|
||||
|
||||
When running AutoGen out of a docker container, to use docker for code execution, you also need to install the python package `docker`:
|
||||
To use docker for code execution, you also need to install the python package `docker`:
|
||||
```bash
|
||||
pip install docker
|
||||
```
|
||||
|
||||
You might want to override the default docker image used for code execution. To do that, set the `use_docker` key of the `code_execution_config` property to the name of the image, e.g.:
```python
user_proxy = autogen.UserProxyAgent(
    name="agent",
    human_input_mode="TERMINATE",
    max_consecutive_auto_reply=10,
    code_execution_config={"work_dir": "_output", "use_docker": "python:3"},
    llm_config=llm_config,
    system_message="""Reply TERMINATE if the task has been solved at full satisfaction.
Otherwise, reply CONTINUE, or the reason why the task is not solved yet.""",
)
```

- #### blendsearch

`pyautogen<0.2` offers a cost-effective hyperparameter optimization technique, [EcoOptiGen](https://arxiv.org/abs/2303.04673), for tuning Large Language Models. Please install with the [blendsearch] option to use it.