# AI Crew for Instagram Post
## Introduction
This project is an example of using the CrewAI framework to automate the process of coming up with an Instagram post. CrewAI orchestrates autonomous AI agents, enabling them to collaborate and execute complex tasks efficiently.
Instagram Post
By @joaomdmoura
## CrewAI Framework
CrewAI is designed to facilitate the collaboration of role-playing AI agents. In this example, these agents work together to generate a creative and trendy Instagram post.
## Running the Script
This example uses OpenHermes 2.5 through Ollama by default, so you will need to install Ollama and pull the OpenHermes model.
You can change the model by setting the `MODEL` env var in the `.env` file.
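Projects like this typically read the `.env` file with a helper such as python-dotenv. As a rough stdlib-only sketch of what that loading amounts to (the `load_env` helper and the `openhermes` fallback here are illustrative, not the project's actual code):

```python
import os

def load_env(path=".env"):
    # Parse simple KEY=VALUE lines, skipping blanks and comments,
    # without overwriting variables already set in the environment.
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip())

# Fall back to the documented default model if no .env is present.
if os.path.exists(".env"):
    load_env()
model = os.environ.get("MODEL", "openhermes")
```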
- **Configure Environment**: Copy `.env.example` to `.env` and set up the environment variables for Browserless and Serper.
- **Install Dependencies**: Run `poetry install --no-root` (uses `crewAI==0.130.0`).
- **Execute the Script**: Run `python main.py` and input your idea.
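A configured `.env` might look like the following; the exact key names for the Browserless and Serper credentials are assumptions here, so check `.env.example` for the keys the project actually expects:

```
MODEL=openhermes
BROWSERLESS_API_KEY=your-browserless-key
SERPER_API_KEY=your-serper-key
```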
## Details & Explanation
- **Running the Script**: Execute `python main.py` and input your idea when prompted. The script will leverage the CrewAI framework to process the idea and generate an Instagram post.
- **Key Components**:
  - `./main.py`: Main script file.
  - `./tasks.py`: Main file with the tasks prompts.
  - `./agents.py`: Main file with the agents creation.
  - `./tools/`: Contains tool classes used by the agents.
## Using Local Models with Ollama
This example runs entirely on local models. The CrewAI framework supports integration with both closed and local models through tools such as Ollama, for enhanced flexibility and customization. This allows you to use your own models, which can be particularly useful for specialized tasks or data-privacy concerns.
### Setting Up Ollama
- **Install Ollama**: Ensure that Ollama is properly installed in your environment. Follow the installation guide provided by Ollama for detailed instructions.
- **Configure Ollama**: Set up Ollama to work with your local model. You will probably need to tweak the model using a Modelfile; I'd recommend playing with `top_p` and `temperature`.
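For example, a custom Modelfile adjusting those two sampling parameters might look like this (the base model name, the parameter values, and the `instagram-crew` tag below are illustrative, not values the project prescribes):

```
FROM openhermes
PARAMETER temperature 0.6
PARAMETER top_p 0.9
```

You could then build the variant with `ollama create instagram-crew -f Modelfile` and point the script at it by setting `MODEL=instagram-crew` in `.env`.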
## License
This project is released under the MIT License.
