update ecosystem (#1624)
* update ollama ecosystem
* update ollama ecosystem

Co-authored-by: skzhang1 <shaokunzhang529@gmail.com>
@@ -19,3 +19,11 @@ MemGPT enables LLMs to manage their own memory and overcome limited context wind
[Microsoft Fabric](https://learn.microsoft.com/en-us/fabric/get-started/microsoft-fabric-overview) is an all-in-one analytics solution for enterprises that covers everything from data movement to data science, Real-Time Analytics, and business intelligence. It offers a comprehensive suite of services, including data lake, data engineering, and data integration, all in one place. In this notebook, we give a simple example for using AutoGen in Microsoft Fabric; a minimal sketch also follows the link below.
- [Microsoft Fabric + AutoGen Code Examples](https://github.com/microsoft/autogen/blob/main/notebook/agentchat_microsoft_fabric.ipynb)
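The linked notebook runs AutoGen inside a Fabric notebook environment. As a rough orientation only, a minimal two-agent chat looks like the sketch below; the `OAI_CONFIG_LIST` source and the prompt are illustrative assumptions, not values taken from the notebook.

```python
# Minimal two-agent AutoGen chat, as one might run it in a Fabric notebook.
# The OAI_CONFIG_LIST env var/file and the prompt are assumptions for illustration.
import autogen

config_list = autogen.config_list_from_json("OAI_CONFIG_LIST")

assistant = autogen.AssistantAgent("assistant", llm_config={"config_list": config_list})
user_proxy = autogen.UserProxyAgent(
    "user_proxy",
    human_input_mode="NEVER",
    max_consecutive_auto_reply=1,
    code_execution_config=False,  # keep the sketch free of local code execution
)

user_proxy.initiate_chat(assistant, message="Summarize what Microsoft Fabric offers.")
```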
## Ollama + AutoGen
![Ollama](img/ecosystem-ollama.png)
[Ollama](https://ollama.com/) allows users to run open-source large language models, such as Llama 2, locally. Ollama bundles model weights, configuration, and data into a single package defined by a Modelfile. It optimizes setup and configuration details, including GPU usage. A minimal configuration sketch follows the instruction link below.
- [Ollama + AutoGen instructions](https://ollama.ai/blog/openai-compatibility)
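The instructions above use Ollama's OpenAI-compatible endpoint, so AutoGen can talk to a local model by pointing its client at that endpoint. The sketch below assumes `ollama serve` is running and a `llama2` model has been pulled; the `base_url` and dummy `api_key` values follow the linked post but should be adjusted to your setup.

```python
# Sketch: use AutoGen with a local Ollama server via its OpenAI-compatible API.
# Assumes `ollama serve` is running and `ollama pull llama2` has been done;
# the base_url and model name are assumptions, adjust for your setup.
from autogen import AssistantAgent, UserProxyAgent

config_list = [
    {
        "model": "llama2",                        # any model pulled into Ollama
        "base_url": "http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
        "api_key": "ollama",                      # dummy value; Ollama ignores it
    }
]

assistant = AssistantAgent("assistant", llm_config={"config_list": config_list})
user_proxy = UserProxyAgent(
    "user_proxy",
    human_input_mode="NEVER",
    max_consecutive_auto_reply=1,
    code_execution_config=False,
)

user_proxy.initiate_chat(assistant, message="Explain what a Modelfile is in one paragraph.")
```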
BIN  website/docs/img/ecosystem-ollama.png (new file, 14 KiB)
Binary file not shown.