# Optional Dependencies

## LLM Caching

To use LLM caching with Redis, you need to install the Python package with the option `redis`:

```bash
pip install "pyautogen[redis]"
```

See LLM Caching for details.
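A minimal sketch of wrapping a chat in a Redis-backed cache (assuming a Redis server at `redis://localhost:6379/0`, the `autogen.cache.Cache.redis` helper, and a local `OAI_CONFIG_LIST` file):

```python
from autogen import AssistantAgent, UserProxyAgent, config_list_from_json
from autogen.cache import Cache

config_list = config_list_from_json("OAI_CONFIG_LIST")
assistant = AssistantAgent("assistant", llm_config={"config_list": config_list})
user = UserProxyAgent("user", human_input_mode="NEVER", code_execution_config=False)

# LLM responses generated inside this block are cached in Redis,
# so re-running the same chat reuses the cached completions.
with Cache.redis(redis_url="redis://localhost:6379/0") as cache:
    user.initiate_chat(assistant, message="What is 123 * 4?", cache=cache)
```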

## IPython Code Executor

To use the IPython code executor, you need to install the `jupyter-client` and `ipykernel` packages:

```bash
pip install "pyautogen[ipython]"
```

To use the IPython code executor:

```python
from autogen import UserProxyAgent

proxy = UserProxyAgent(name="proxy", code_execution_config={"executor": "ipython-embedded"})
```
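As a rough sketch (assuming an OpenAI configuration in a local `OAI_CONFIG_LIST` file), the executor-backed proxy can run the code an assistant agent writes:

```python
from autogen import AssistantAgent, UserProxyAgent, config_list_from_json

config_list = config_list_from_json("OAI_CONFIG_LIST")
assistant = AssistantAgent(name="assistant", llm_config={"config_list": config_list})

# Code blocks produced by the assistant are executed in an embedded IPython kernel.
proxy = UserProxyAgent(
    name="proxy",
    human_input_mode="NEVER",
    code_execution_config={"executor": "ipython-embedded"},
)

proxy.initiate_chat(assistant, message="Compute the first 10 Fibonacci numbers with Python.")
```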

## blendsearch

`pyautogen<0.2` offers a cost-effective hyperparameter optimization technique, EcoOptiGen, for tuning Large Language Models. Please install with the `[blendsearch]` option to use it.

```bash
pip install "pyautogen[blendsearch]<0.2"
```

Example notebooks:

- Optimize for Code Generation
- Optimize for Math

## retrievechat

`pyautogen` supports retrieval-augmented generation tasks such as question answering and code generation with RAG agents. Please install with the `[retrievechat]` option to use it.

```bash
pip install "pyautogen[retrievechat]"
```

RetrieveChat can handle various types of documents. By default, it can process plain text and PDF files, including formats such as 'txt', 'json', 'csv', 'tsv', 'md', 'html', 'htm', 'rtf', 'rst', 'jsonl', 'log', 'xml', 'yaml', 'yml' and 'pdf'. If you install unstructured (`pip install "unstructured[all-docs]"`), additional document types such as 'docx', 'doc', 'odt', 'pptx', 'ppt', 'xlsx', 'eml', 'msg', 'epub' will also be supported.

You can find a list of all supported document types by using `autogen.retrieve_utils.TEXT_FORMATS`.
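For example, a minimal sketch of checking the supported formats and wiring up a RAG pair (assuming the contrib agents `RetrieveAssistantAgent` and `RetrieveUserProxyAgent`, a local `OAI_CONFIG_LIST`, and a hypothetical `./docs` folder):

```python
import autogen
from autogen.agentchat.contrib.retrieve_assistant_agent import RetrieveAssistantAgent
from autogen.agentchat.contrib.retrieve_user_proxy_agent import RetrieveUserProxyAgent
from autogen.retrieve_utils import TEXT_FORMATS

print(TEXT_FORMATS)  # list of supported document types

config_list = autogen.config_list_from_json("OAI_CONFIG_LIST")

assistant = RetrieveAssistantAgent(
    name="assistant", llm_config={"config_list": config_list}
)
ragproxyagent = RetrieveUserProxyAgent(
    name="ragproxyagent",
    human_input_mode="NEVER",
    retrieve_config={"task": "qa", "docs_path": "./docs"},  # hypothetical local path
)

# Retrieves relevant chunks from docs_path and asks the assistant to answer.
ragproxyagent.initiate_chat(assistant, problem="Which file formats does RetrieveChat support?")
```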

Example notebooks:

- Automated Code Generation and Question Answering with Retrieval Augmented Agents
- Group Chat with Retrieval Augmented Generation (with 5 group member agents and 1 manager agent)
- Automated Code Generation and Question Answering with Qdrant based Retrieval Augmented Agents

## Teachability

To use Teachability, please install AutoGen with the `[teachable]` option.

```bash
pip install "pyautogen[teachable]"
```

Example notebook: Chatting with a teachable agent

## Large Multimodal Model (LMM) Agents

We offer the Multimodal Conversable Agent and the LLaVA Agent. Please install with the `[lmm]` option to use them.

```bash
pip install "pyautogen[lmm]"
```

Example notebooks:

- LLaVA Agent

## mathchat

`pyautogen<0.2` offers an experimental agent for math problem solving. Please install with the `[mathchat]` option to use it.

```bash
pip install "pyautogen[mathchat]<0.2"
```

Example notebooks:

- Using MathChat to Solve Math Problems

## Graph

To use a graph in `GroupChat`, particularly for graph visualization, please install AutoGen with the `[graph]` option.

```bash
pip install "pyautogen[graph]"
```

Example notebook: Graph Modeling Language with using select_speaker