Optional Dependencies

LLM Caching

To use LLM caching with Redis, you need to install the Python package with the option redis:

pip install "pyautogen[redis]"

See LLM Caching for details.
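For reference, here is a minimal sketch of passing a Redis-backed cache to a chat; the Redis URL, model name, and prompt are placeholder assumptions:

```python
import os

from autogen import AssistantAgent, UserProxyAgent
from autogen.cache import Cache

config_list = [{"model": "gpt-4", "api_key": os.environ["OPENAI_API_KEY"]}]

assistant = AssistantAgent("assistant", llm_config={"config_list": config_list})
user = UserProxyAgent("user", human_input_mode="NEVER", code_execution_config=False)

# Completions produced inside this block are cached in Redis, so re-running the
# same chat returns cached responses instead of calling the LLM again.
with Cache.redis(redis_url="redis://localhost:6379/0") as cache:
    user.initiate_chat(assistant, message="What is the 10th Fibonacci number?", max_turns=2, cache=cache)
```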

IPython Code Executor

To use the IPython code executor, you need to install the jupyter-client and ipykernel packages:

pip install "pyautogen[ipython]"

To use the IPython code executor:

from autogen import UserProxyAgent

proxy = UserProxyAgent(name="proxy", code_execution_config={"executor": "ipython-embedded"})

blendsearch

pyautogen<0.2 offers a cost-effective hyperparameter optimization technique, EcoOptiGen, for tuning Large Language Models. Please install with the [blendsearch] option to use it.

pip install "pyautogen[blendsearch]<0.2"

Example notebooks:

Optimize for Code Generation

Optimize for Math

retrievechat

pyautogen supports retrieval-augmented generation tasks such as question answering and code generation with RAG agents. Please install with the [retrievechat] option to use it with ChromaDB.

pip install "pyautogen[retrievechat]"

Alternatively, pyautogen also supports PGVector and Qdrant, which can be installed in place of ChromaDB or alongside it.

pip install "pyautogen[retrievechat-pgvector]"
pip install "pyautogen[retrievechat-qdrant]"

RetrieveChat can handle various types of documents. By default, it can process plain text and PDF files, including formats such as 'txt', 'json', 'csv', 'tsv', 'md', 'html', 'htm', 'rtf', 'rst', 'jsonl', 'log', 'xml', 'yaml', 'yml' and 'pdf'. If you install unstructured (pip install "unstructured[all-docs]"), additional document types such as 'docx', 'doc', 'odt', 'pptx', 'ppt', 'xlsx', 'eml', 'msg', 'epub' will also be supported.

You can find a list of all supported document types by using autogen.retrieve_utils.TEXT_FORMATS.
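As a rough sketch, a ChromaDB-backed RAG setup might look like the following; the model name, docs_path, and question are placeholder assumptions:

```python
import os

from autogen import AssistantAgent
from autogen.agentchat.contrib.retrieve_user_proxy_agent import RetrieveUserProxyAgent
from autogen.retrieve_utils import TEXT_FORMATS

print(TEXT_FORMATS)  # lists every document type RetrieveChat can currently ingest

config_list = [{"model": "gpt-4", "api_key": os.environ["OPENAI_API_KEY"]}]

assistant = AssistantAgent("assistant", llm_config={"config_list": config_list})
ragproxyagent = RetrieveUserProxyAgent(
    name="ragproxyagent",
    human_input_mode="NEVER",
    retrieve_config={
        "task": "qa",           # question answering over the indexed documents
        "docs_path": "./docs",  # placeholder path to your own documents
    },
)

ragproxyagent.initiate_chat(assistant, message=ragproxyagent.message_generator, problem="What is AutoGen?")
```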

Example notebooks:

Automated Code Generation and Question Answering with Retrieval Augmented Agents

Group Chat with Retrieval Augmented Generation (with 5 group member agents and 1 manager agent)

Automated Code Generation and Question Answering with Qdrant based Retrieval Augmented Agents

Teachability

To use Teachability, please install AutoGen with the [teachable] option.

pip install "pyautogen[teachable]"

Example notebook: Chatting with a teachable agent

Large Multimodal Model (LMM) Agents

We offer the Multimodal Conversable Agent and the LLaVA Agent. Please install with the [lmm] option to use them.

pip install "pyautogen[lmm]"

Example notebooks:

LLaVA Agent

mathchat

pyautogen<0.2 offers an experimental agent for math problem solving. Please install with the [mathchat] option to use it.

pip install "pyautogen[mathchat]<0.2"

Example notebooks:

Using MathChat to Solve Math Problems

Graph

To use a graph in GroupChat, particularly for graph visualization, please install AutoGen with the [graph] option.

pip install "pyautogen[graph]"

Example notebook: Graph Modeling Language with using select_speaker

Long Context Handling

AutoGen includes support for handling long textual contexts by leveraging the LLMLingua library for text compression. To enable this functionality, please install AutoGen with the [long-context] option:

pip install "pyautogen[long-context]"