Mirror of https://github.com/Significant-Gravitas/AutoGPT.git, synced 2026-04-08 03:00:28 -04:00

Merge branch 'master' into zamilmajdy/code-validation
.github/workflows/autogpt-server-ci.yml (vendored, 2 lines changed)

@@ -265,4 +265,4 @@ jobs:
       uses: actions/upload-artifact@v4
       with:
         name: autogptserver-AppImage-${{ matrix.platform-os }}
-        path: /Users/runner/work/AutoGPT/AutoGPT/rnd/autogpt_server/build/*.AppImage
+        path: /Users/runner/work/AutoGPT/AutoGPT/rnd/autogpt_server/dist/*.AppImage
.gitignore (vendored, 4 lines changed)

@@ -162,7 +162,7 @@ agbenchmark/reports/
 
 # Nodejs
-package-lock.json
+package.json
 
 
 # Allow for locally private items
 # private
@@ -170,3 +170,5 @@ pri*
 # ignore
 ig*
 .github_access_token
+LICENSE.rtf
+rnd/autogpt_server/settings.py
@@ -97,7 +97,7 @@ repos:
       alias: pyright-benchmark
       entry: poetry -C benchmark run pyright
       args: [-p, benchmark, benchmark]
-      files: ^benchmark/(agbenchmark|tests)/
+      files: ^benchmark/(agbenchmark/|tests/|poetry\.lock$)
       types: [file]
      language: system
      pass_filenames: false
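As a sketch of what the `files` pattern change in the hook above does (pre-commit matches `files` against repo-relative paths with Python's `re.search`; the file paths here are hypothetical examples), the broadened regex now also fires on the benchmark lockfile:

```python
import re

# Old and new `files` patterns from the pyright-benchmark hook above.
old_pattern = re.compile(r"^benchmark/(agbenchmark|tests)/")
new_pattern = re.compile(r"^benchmark/(agbenchmark/|tests/|poetry\.lock$)")

# The lockfile now triggers the hook; source and test paths still do.
matches_old = bool(old_pattern.search("benchmark/poetry.lock"))
matches_new = bool(new_pattern.search("benchmark/poetry.lock"))
still_matches = bool(new_pattern.search("benchmark/agbenchmark/main.py"))
```

Anchoring `poetry\.lock$` inside the alternation means only the lockfile at the top of `benchmark/` matches, while the trailing slashes keep directory matches unchanged.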
@@ -2,11 +2,11 @@
 
 > For the complete getting started [tutorial series](https://aiedge.medium.com/autogpt-forge-e3de53cc58ec) <- click here
 
-Welcome to the Quickstart Guide! This guide will walk you through the process of setting up and running your own AutoGPT agent. Whether you're a seasoned AI developer or just starting out, this guide will provide you with the necessary steps to jumpstart your journey in the world of AI development with AutoGPT.
+Welcome to the Quickstart Guide! This guide will walk you through setting up, building, and running your own AutoGPT agent. Whether you're a seasoned AI developer or just starting out, this guide will provide you with the steps to jumpstart your journey in AI development with AutoGPT.
 
 ## System Requirements
 
-This project supports Linux (Debian based), Mac, and Windows Subsystem for Linux (WSL). If you are using a Windows system, you will need to install WSL. You can find the installation instructions for WSL [here](https://learn.microsoft.com/en-us/windows/wsl/).
+This project supports Linux (Debian-based), Mac, and Windows Subsystem for Linux (WSL). If you use a Windows system, you must install WSL. You can find the installation instructions for WSL [here](https://learn.microsoft.com/en-us/windows/wsl/).
 
 
 ## Getting Setup
@@ -18,11 +18,11 @@ This project supports Linux (Debian based), Mac, and Windows Subsystem for Linux
 - In the top-right corner of the page, click Fork.
 
 
-- On the next page, select your GitHub account to create the fork under.
+- On the next page, select your GitHub account to create the fork.
 - Wait for the forking process to complete. You now have a copy of the repository in your GitHub account.
 
 2. **Clone the Repository**
-   To clone the repository, you need to have Git installed on your system. If you don't have Git installed, you can download it from [here](https://git-scm.com/downloads). Once you have Git installed, follow these steps:
+   To clone the repository, you need to have Git installed on your system. If you don't have Git installed, download it from [here](https://git-scm.com/downloads). Once you have Git installed, follow these steps:
 - Open your terminal.
 - Navigate to the directory where you want to clone the repository.
 - Run the git clone command for the fork you just created
@@ -34,11 +34,11 @@ This project supports Linux (Debian based), Mac, and Windows Subsystem for Linux
 
 
 4. **Setup the Project**
-   Next we need to setup the required dependencies. We have a tool for helping you do all the tasks you need to on the repo.
+   Next, we need to set up the required dependencies. We have a tool to help you perform all the tasks on the repo.
    It can be accessed by running the `run` command by typing `./run` in the terminal.
 
-   The first command you need to use is `./run setup` This will guide you through the process of setting up your system.
-   Initially you will get instructions for installing flutter, chrome and setting up your github access token like the following image:
+   The first command you need to use is `./run setup.` This will guide you through setting up your system.
+   Initially, you will get instructions for installing Flutter and Chrome and setting up your GitHub access token like the following image:
 
 
@@ -47,7 +47,7 @@ This project supports Linux (Debian based), Mac, and Windows Subsystem for Linux
 If you're a Windows user and experience issues after installing WSL, follow the steps below to resolve them.
 
 #### Update WSL
-Run the following command in Powershell or Command Prompt to:
+Run the following command in Powershell or Command Prompt:
 1. Enable the optional WSL and Virtual Machine Platform components.
 2. Download and install the latest Linux kernel.
 3. Set WSL 2 as the default.
@@ -73,7 +73,7 @@ dos2unix ./run
 After executing the above commands, running `./run setup` should work successfully.
 
 #### Store Project Files within the WSL File System
-If you continue to experience issues, consider storing your project files within the WSL file system instead of the Windows file system. This method avoids issues related to path translations and permissions and provides a more consistent development environment.
+If you continue to experience issues, consider storing your project files within the WSL file system instead of the Windows file system. This method avoids path translations and permissions issues and provides a more consistent development environment.
 
 You can keep running the command to get feedback on where you are up to with your setup.
 When setup has been completed, the command will return an output like this:
@@ -83,7 +83,7 @@ When setup has been completed, the command will return an output like this:
 ## Creating Your Agent
 
 After completing the setup, the next step is to create your agent template.
-Execute the command `./run agent create YOUR_AGENT_NAME`, where `YOUR_AGENT_NAME` should be replaced with a name of your choosing.
+Execute the command `./run agent create YOUR_AGENT_NAME`, where `YOUR_AGENT_NAME` should be replaced with your chosen name.
 
 Tips for naming your agent:
 * Give it its own unique name, or name it after yourself
@@ -101,21 +101,21 @@ This starts the agent on the URL: `http://localhost:8000/`
 
 
 
-The frontend can be accessed from `http://localhost:8000/`, you will first need to login using either a google account or your github account.
+The front end can be accessed from `http://localhost:8000/`; first, you must log in using either a Google account or your GitHub account.
 
 
 
-Upon logging in you will get a page that looks something like this. With your task history down the left hand side of the page and the 'chat' window to send tasks to your agent.
+Upon logging in, you will get a page that looks something like this: your task history down the left-hand side of the page, and the 'chat' window to send tasks to your agent.
 
 
 
-When you have finished with your agent, or if you just need to restart it, use Ctl-C to end the session then you can re-run the start command.
+When you have finished with your agent or just need to restart it, use Ctl-C to end the session. Then, you can re-run the start command.
 
-If you are having issues and want to ensure the agent has been stopped there is a `./run agent stop` command which will kill the process using port 8000, which should be the agent.
+If you are having issues and want to ensure the agent has been stopped, there is a `./run agent stop` command, which will kill the process using port 8000, which should be the agent.
 
 ## Benchmarking your Agent
 
-The benchmarking system can also be accessed using the cli too:
+The benchmarking system can also be accessed using the CLI too:
 
 ```bash
 agpt % ./run benchmark
@@ -163,7 +163,7 @@ The benchmark has been split into different categories of skills you can test yo
 
 
 
-Finally you can run the benchmark with
+Finally, you can run the benchmark with
 
 ```bash
 ./run benchmark start YOUR_AGENT_NAME
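The quickstart's `./run agent stop` behavior described above (it kills whatever process is bound to port 8000) can be sanity-checked with a small sketch; `port_in_use` is a hypothetical helper for illustration, not part of the repo:

```python
import socket

def port_in_use(port: int, host: str = "127.0.0.1") -> bool:
    """Return True if something (e.g. the agent) is listening on the port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        # connect_ex returns 0 when the TCP connection succeeds.
        return s.connect_ex((host, port)) == 0

# After `./run agent start`, this would be True; after `./run agent stop`, False.
agent_running = port_in_use(8000)
```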
assets/gpt_dark_RGB.png (new binary file, not shown; after: 49 KiB)
autogpt/poetry.lock (generated, 149 lines changed)

@@ -25,7 +25,7 @@ pexpect = "^4.8.0"
 psutil = "^5.9.5"
 pydantic = "^1.10.9"
 pytest = "^7.3.2"
-pytest-asyncio = "^0.21.1"
+pytest-asyncio = "^0.23.3"
 python-dotenv = "^1.0.0"
 python-multipart = "^0.0.7"
 pyvis = "^0.3.2"
@@ -1131,67 +1131,33 @@ files = [
 ]
 
 [[package]]
-name = "cx-freeze"
-version = "7.0.0"
+name = "cx_Freeze"
+version = "7.1.1"
 description = "Create standalone executables from Python scripts"
 optional = false
 python-versions = ">=3.8"
-files = [
-    {file = "cx_Freeze-7.0.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:421920dbee2b4aab53a81f6c99d18b00baf622a328eae8e489f162154a46192a"},
-    {file = "cx_Freeze-7.0.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:5fa1ca4cf20c6ce45ce2e26bf8b2086525aaaa774e2ee1b16da4e0f9f18c7272"},
-    {file = "cx_Freeze-7.0.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a520bc6772325c0e38924da1d827fe370702f8df397f483691b94d36179beef6"},
-    {file = "cx_Freeze-7.0.0-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:b89ed99a2f99fd2f3e28a91c85bdd75b4bfaf11b04729ba3282bfebdadadf880"},
-    {file = "cx_Freeze-7.0.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5174821c82826e5a57e43960877087f5af6073e3877b0b38a0be244111fe1a76"},
-    {file = "cx_Freeze-7.0.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:cfd18cc00f3240b03d5bdf9515d59ace0881b5b5b6f2e7655d857d1fb71f351d"},
-    {file = "cx_Freeze-7.0.0-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:bac44e65bdfce0839b9a6d15373ea084fda3cdbd902351cde530991b450c2b2d"},
-    {file = "cx_Freeze-7.0.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:51a374f273d575827efe4f7ed9a88b6cab78abffacb858c829d7cbe4dc4ff56e"},
-    {file = "cx_Freeze-7.0.0-cp310-cp310-win32.whl", hash = "sha256:6603e6c47a15bd84bfbb20d92dc01d5e586b54928eb618461d2f14305471d570"},
-    {file = "cx_Freeze-7.0.0-cp310-cp310-win_amd64.whl", hash = "sha256:d7ec847af5afbe3c638a096aae4ff5982a17d95e2fb7975e525ecf9505a185ea"},
-    {file = "cx_Freeze-7.0.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:04b7a2e5c53f5d537f3d958ebf2b0a0a7cbe8daf980cb0087559a3e698abc582"},
-    {file = "cx_Freeze-7.0.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:50e7e78001d4f78e70a403ecb5507685854ce1e6c3ff37bec1920eb6f2256534"},
-    {file = "cx_Freeze-7.0.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2d37ed560e86ca7958684701a6ae7f3300226d0d7c861ca5b90c78bf4c619ad2"},
-    {file = "cx_Freeze-7.0.0-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:42145dc5c2c7a98c620b30b7e25661954817a13355c50c4219a4a4954b39db39"},
-    {file = "cx_Freeze-7.0.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6f9034d6f9c10d84d7edc0e4f4020e878de367e83c5877c039aa3c8b733bc318"},
-    {file = "cx_Freeze-7.0.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:adc6bdba9ff8705745831620efb6ee5eff9ec6d31d9b8c56d2a61d6555299157"},
-    {file = "cx_Freeze-7.0.0-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:37a3234c0e54b4afd561b47be4f22a6496f9436275fb7b59d90d7a3269fb4d6f"},
-    {file = "cx_Freeze-7.0.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:83549f9f817cafa59ea2f6e2045c8fe119628458ff14bb732649b01b0a637f6d"},
-    {file = "cx_Freeze-7.0.0-cp311-cp311-win32.whl", hash = "sha256:c508cd354728367311a7deb5bb616eee441bf79c900e3129a49fd54a372dc223"},
-    {file = "cx_Freeze-7.0.0-cp311-cp311-win_amd64.whl", hash = "sha256:8fb71d23dba27dc40393a8b460bbf64759899246cd595860f66493cee64f27a5"},
-    {file = "cx_Freeze-7.0.0-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:648fd0acb439efe22dced2430cbaeca87e5ca9ab315d148933104376cca9553d"},
-    {file = "cx_Freeze-7.0.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:3125a8408da3ff4b0cf767689d678909f840dfe08633f5f2d3cfe333111dc334"},
-    {file = "cx_Freeze-7.0.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:07e56b06c7ca0bd2fc37e3783908767dbe1926e1e2609edcaefcc749ab584329"},
-    {file = "cx_Freeze-7.0.0-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:25531d5c61bb5e974d8a5d042f29a37a786e91c1d6f66e018fc50342a416f4e1"},
-    {file = "cx_Freeze-7.0.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f97154b4b60f6e1953ebce05803a5e11a35047d097fad60d7c181303b7c6ef6e"},
-    {file = "cx_Freeze-7.0.0-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:2333db5cfa6db700c79fd45d614d38e9d698f1df2a3c7e21ccbcc63cc8a7a9b7"},
-    {file = "cx_Freeze-7.0.0-cp312-cp312-musllinux_1_1_ppc64le.whl", hash = "sha256:d45a58e0a9b010e0823c30fb8eb2077560d2bb0f78e4481a55bdb6ad0729f390"},
-    {file = "cx_Freeze-7.0.0-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:0422dbd426fd9f4f4ec0cadc7e3192d38227464daa3eb215b03eb577cd9a49d4"},
-    {file = "cx_Freeze-7.0.0-cp312-cp312-win32.whl", hash = "sha256:2018e9cbf8172da09b311cfc3906503ee6ae88665ec77c543013173b2532b731"},
-    {file = "cx_Freeze-7.0.0-cp312-cp312-win_amd64.whl", hash = "sha256:ae5facd782b220bca6828eb6fb1834540cf431b1a615cc63652641bd070b11e6"},
-    {file = "cx_Freeze-7.0.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:f06368dd41568572818c4abfcf9b45449dced3fa9f1b5f29e3523ba4ff7fcfbb"},
-    {file = "cx_Freeze-7.0.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e203d90d8fa1cc4489b15edac7dfdd983518a02999f275897160fc0ecfa98e4c"},
-    {file = "cx_Freeze-7.0.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:f800b0bc2df14c66fcf2f220ecf273c5942d0b982268d8e5ccc9ef2fa56e576f"},
-    {file = "cx_Freeze-7.0.0-cp38-cp38-win32.whl", hash = "sha256:c52641ce2484222f4d60f0acbc79b2dfbfb984493101a4806c5af0ad379ebc82"},
-    {file = "cx_Freeze-7.0.0-cp38-cp38-win_amd64.whl", hash = "sha256:92a15613be3fcc7a310e825c92ae3e83a7e689ade00ce2ea981403e4317c7234"},
-    {file = "cx_Freeze-7.0.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:60a0f674b6a55fdf46d0cc59122551a79221ceecd038fed8533dcbceb9714435"},
-    {file = "cx_Freeze-7.0.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bb15314e8395e9658a8a5e4e19558d0e096a68b76c744ba81ebc249061b7dd9e"},
-    {file = "cx_Freeze-7.0.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:3290127acc67e830265265a911d9018640ffffb7fddb86eacb1e3d83ed4136c4"},
-    {file = "cx_Freeze-7.0.0-cp39-cp39-win32.whl", hash = "sha256:aa885f2fb29b9f7d9a7d8af223d38d98905484cc2356c474bb1d6fd1704323ad"},
-    {file = "cx_Freeze-7.0.0-cp39-cp39-win_amd64.whl", hash = "sha256:fe4dbbfd52454c8ddb550f112713ee2ac36cc024303557763b605e35cdb6b9a8"},
-    {file = "cx_freeze-7.0.0.tar.gz", hash = "sha256:b03f2854a15dd1e8962660d18882a71fefba0e1b6f68337193d4a072d1fc36e6"},
-]
+files = []
+develop = true
 
 [package.dependencies]
 cx-Logging = {version = ">=3.1", markers = "sys_platform == \"win32\""}
-filelock = {version = ">=3.11.0", markers = "sys_platform == \"linux\""}
+dmgbuild = {version = ">=1.6.1", markers = "sys_platform == \"darwin\""}
+filelock = {version = ">=3.12.3", markers = "sys_platform == \"linux\""}
 lief = {version = ">=0.12.0,<0.15.0", markers = "sys_platform == \"win32\""}
 patchelf = {version = ">=0.14", markers = "sys_platform == \"linux\" and (platform_machine == \"aarch64\" or platform_machine == \"armv7l\" or platform_machine == \"i686\" or platform_machine == \"ppc64le\" or platform_machine == \"s390x\" or platform_machine == \"x86_64\")"}
-setuptools = ">=62.6,<70"
+setuptools = ">=65.6.3,<71"
 wheel = ">=0.42.0,<=0.43.0"
 
 [package.extras]
-dev = ["bump-my-version (==0.20.1)", "cibuildwheel (==2.17.0)", "pre-commit (==3.7.0)"]
-doc = ["furo (==2024.1.29)", "myst-parser (==2.0.0)", "sphinx (==7.3.7)", "sphinx-new-tab-link (==0.4.0)", "sphinx-tabs (==3.4.5)"]
-test = ["coverage (==7.4.4)", "pluggy (==1.4.0)", "pytest (==8.1.1)", "pytest-cov (==5.0.0)", "pytest-datafiles (==3.0.0)", "pytest-mock (==3.14.0)", "pytest-timeout (==2.3.1)", "pytest-xdist[psutil] (==3.6.0)"]
+dev = ["bump-my-version (==0.24.0)", "cibuildwheel (==2.19.1)", "pre-commit (==3.5.0)", "pre-commit (==3.7.1)"]
+doc = ["furo (==2024.5.6)", "myst-parser (==3.0.1)", "sphinx (==7.3.7)", "sphinx-new-tab-link (==0.4.0)", "sphinx-tabs (==3.4.5)"]
+test = ["coverage (==7.5.4)", "pluggy (==1.5.0)", "pytest (==8.2.2)", "pytest-cov (==5.0.0)", "pytest-datafiles (==3.0.0)", "pytest-mock (==3.14.0)", "pytest-timeout (==2.3.1)", "pytest-xdist[psutil] (==3.6.1)"]
 
+[package.source]
+type = "git"
+url = "https://github.com/ntindle/cx_Freeze.git"
+reference = "main"
+resolved_reference = "876fe77c97db749b7b0aed93c12142a7226ee7e4"
+
 [[package]]
 name = "cx-logging"
@@ -1340,6 +1306,27 @@ files = [
     {file = "distro-1.9.0.tar.gz", hash = "sha256:2fa77c6fd8940f116ee1d6b94a2f90b13b5ea8d019b98bc8bafdcabcdd9bdbed"},
 ]
 
+[[package]]
+name = "dmgbuild"
+version = "1.6.1"
+description = "macOS command line utility to build disk images"
+optional = false
+python-versions = ">=3.7"
+files = [
+    {file = "dmgbuild-1.6.1-py3-none-any.whl", hash = "sha256:45dba6af4a64872c6a91eb335ebeaf5e1f4f4f39c89fd77cf40e841bd1226166"},
+    {file = "dmgbuild-1.6.1.tar.gz", hash = "sha256:7ced2603d684e29c22b4cd507d1e15a1907e91b86259924b8cfe480d80553b43"},
+]
+
+[package.dependencies]
+ds-store = ">=1.1.0"
+mac-alias = ">=2.0.1"
+
+[package.extras]
+badge-icons = ["pyobjc-framework-Quartz (>=3.0.4)"]
+dev = ["pre-commit", "tox"]
+docs = ["sphinx", "sphinx-autobuild", "sphinx-rtd-theme"]
+test = ["pytest", "pytest-cov", "pytest-tldr"]
+
 [[package]]
 name = "docker"
 version = "7.0.0"
@@ -1361,6 +1348,25 @@ urllib3 = ">=1.26.0"
 ssh = ["paramiko (>=2.4.3)"]
 websockets = ["websocket-client (>=1.3.0)"]
 
+[[package]]
+name = "ds-store"
+version = "1.3.1"
+description = "Manipulate Finder .DS_Store files from Python"
+optional = false
+python-versions = ">=3.7"
+files = [
+    {file = "ds_store-1.3.1-py3-none-any.whl", hash = "sha256:fbacbb0bd5193ab3e66e5a47fff63619f15e374ffbec8ae29744251a6c8f05b5"},
+    {file = "ds_store-1.3.1.tar.gz", hash = "sha256:c27d413caf13c19acb85d75da4752673f1f38267f9eb6ba81b3b5aa99c2d207c"},
+]
+
+[package.dependencies]
+mac-alias = ">=2.0.1"
+
+[package.extras]
+dev = ["pre-commit", "tox"]
+docs = ["sphinx", "sphinx-autobuild", "sphinx-rtd-theme"]
+test = ["pytest", "pytest-cov", "pytest-tldr"]
+
 [[package]]
 name = "duckduckgo-search"
 version = "6.1.7"
@@ -3000,6 +3006,22 @@ html5 = ["html5lib"]
 htmlsoup = ["BeautifulSoup4"]
 source = ["Cython (>=3.0.7)"]
 
+[[package]]
+name = "mac-alias"
+version = "2.2.2"
+description = "Generate/parse Mac OS Alias records from Python"
+optional = false
+python-versions = ">=3.7"
+files = [
+    {file = "mac_alias-2.2.2-py3-none-any.whl", hash = "sha256:504ab8ac546f35bbd75ad014d6ad977c426660aa721f2cd3acf3dc2f664141bd"},
+    {file = "mac_alias-2.2.2.tar.gz", hash = "sha256:c99c728eb512e955c11f1a6203a0ffa8883b26549e8afe68804031aa5da856b7"},
+]
+
+[package.extras]
+dev = ["pre-commit", "tox"]
+docs = ["sphinx", "sphinx-autobuild", "sphinx-rtd-theme"]
+test = ["pytest", "pytest-cov", "pytest-tldr"]
+
 [[package]]
 name = "markupsafe"
 version = "2.1.3"
@@ -4763,21 +4785,21 @@ testing = ["argcomplete", "attrs (>=19.2.0)", "hypothesis (>=3.56)", "mock", "no
 
 [[package]]
 name = "pytest-asyncio"
-version = "0.21.1"
+version = "0.23.7"
 description = "Pytest support for asyncio"
 optional = false
-python-versions = ">=3.7"
+python-versions = ">=3.8"
 files = [
-    {file = "pytest-asyncio-0.21.1.tar.gz", hash = "sha256:40a7eae6dded22c7b604986855ea48400ab15b069ae38116e8c01238e9eeb64d"},
-    {file = "pytest_asyncio-0.21.1-py3-none-any.whl", hash = "sha256:8666c1c8ac02631d7c51ba282e0c69a8a452b211ffedf2599099845da5c5c37b"},
+    {file = "pytest_asyncio-0.23.7-py3-none-any.whl", hash = "sha256:009b48127fbe44518a547bddd25611551b0e43ccdbf1e67d12479f569832c20b"},
+    {file = "pytest_asyncio-0.23.7.tar.gz", hash = "sha256:5f5c72948f4c49e7db4f29f2521d4031f1c27f86e57b046126654083d4770268"},
 ]
 
 [package.dependencies]
-pytest = ">=7.0.0"
+pytest = ">=7.0.0,<9"
 
 [package.extras]
 docs = ["sphinx (>=5.3)", "sphinx-rtd-theme (>=1.0)"]
-testing = ["coverage (>=6.2)", "flaky (>=3.5.0)", "hypothesis (>=5.7.1)", "mypy (>=0.931)", "pytest-trio (>=0.7.0)"]
+testing = ["coverage (>=6.2)", "hypothesis (>=5.7.1)"]
 
 [[package]]
 name = "pytest-benchmark"
@@ -5455,19 +5477,18 @@ tornado = ["tornado (>=5)"]
 
 [[package]]
 name = "setuptools"
-version = "69.0.3"
+version = "70.1.1"
 description = "Easily download, build, install, upgrade, and uninstall Python packages"
 optional = false
 python-versions = ">=3.8"
 files = [
-    {file = "setuptools-69.0.3-py3-none-any.whl", hash = "sha256:385eb4edd9c9d5c17540511303e39a147ce2fc04bc55289c322b9e5904fe2c05"},
-    {file = "setuptools-69.0.3.tar.gz", hash = "sha256:be1af57fc409f93647f2e8e4573a142ed38724b8cdd389706a867bb4efcf1e78"},
+    {file = "setuptools-70.1.1-py3-none-any.whl", hash = "sha256:a58a8fde0541dab0419750bcc521fbdf8585f6e5cb41909df3a472ef7b81ca95"},
+    {file = "setuptools-70.1.1.tar.gz", hash = "sha256:937a48c7cdb7a21eb53cd7f9b59e525503aa8abaf3584c730dc5f7a5bec3a650"},
 ]
 
 [package.extras]
-docs = ["furo", "jaraco.packaging (>=9.3)", "jaraco.tidelift (>=1.4)", "pygments-github-lexers (==0.0.5)", "rst.linker (>=1.9)", "sphinx (<7.2.5)", "sphinx (>=3.5)", "sphinx-favicon", "sphinx-inline-tabs", "sphinx-lint", "sphinx-notfound-page (>=1,<2)", "sphinx-reredirects", "sphinxcontrib-towncrier"]
-testing = ["build[virtualenv]", "filelock (>=3.4.0)", "flake8-2020", "ini2toml[lite] (>=0.9)", "jaraco.develop (>=7.21)", "jaraco.envs (>=2.2)", "jaraco.path (>=3.2.0)", "pip (>=19.1)", "pytest (>=6)", "pytest-black (>=0.3.7)", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=2.2)", "pytest-mypy (>=0.9.1)", "pytest-perf", "pytest-ruff", "pytest-timeout", "pytest-xdist", "tomli-w (>=1.0.0)", "virtualenv (>=13.0.0)", "wheel"]
-testing-integration = ["build[virtualenv] (>=1.0.3)", "filelock (>=3.4.0)", "jaraco.envs (>=2.2)", "jaraco.path (>=3.2.0)", "packaging (>=23.1)", "pytest", "pytest-enabler", "pytest-xdist", "tomli", "virtualenv (>=13.0.0)", "wheel"]
+docs = ["furo", "jaraco.packaging (>=9.3)", "jaraco.tidelift (>=1.4)", "pygments-github-lexers (==0.0.5)", "pyproject-hooks (!=1.1)", "rst.linker (>=1.9)", "sphinx (>=3.5)", "sphinx-favicon", "sphinx-inline-tabs", "sphinx-lint", "sphinx-notfound-page (>=1,<2)", "sphinx-reredirects", "sphinxcontrib-towncrier"]
+testing = ["build[virtualenv] (>=1.0.3)", "filelock (>=3.4.0)", "importlib-metadata", "ini2toml[lite] (>=0.14)", "jaraco.develop (>=7.21)", "jaraco.envs (>=2.2)", "jaraco.path (>=3.2.0)", "jaraco.test", "mypy (==1.10.0)", "packaging (>=23.2)", "pip (>=19.1)", "pyproject-hooks (!=1.1)", "pytest (>=6,!=8.1.1)", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=2.2)", "pytest-home (>=0.5)", "pytest-mypy", "pytest-perf", "pytest-ruff (>=0.3.2)", "pytest-subprocess", "pytest-timeout", "pytest-xdist (>=3)", "tomli", "tomli-w (>=1.0.0)", "virtualenv (>=13.0.0)", "wheel"]
 
 [[package]]
 name = "shellingham"
@@ -7004,4 +7025,4 @@ benchmark = ["agbenchmark"]
 [metadata]
 lock-version = "2.0"
 python-versions = "^3.10"
-content-hash = "a6022fadc0c861e7947bf4367a76242c83adcd0f963a742e2dd70a0196807bf6"
+content-hash = "d233c43b771f77779c2687cdd2c0a5d227cbea60e150e1d262cebcf83b86b046"
@@ -60,7 +60,7 @@ agbenchmark = { path = "../benchmark", optional = true }
 # agbenchmark = {git = "https://github.com/Significant-Gravitas/AutoGPT.git", subdirectory = "benchmark", optional = true}
 psycopg2-binary = "^2.9.9"
 multidict = "6.0.5"
-cx-freeze = "7.0.0"
+cx-freeze = { git = "https://github.com/ntindle/cx_Freeze.git", rev = "main", develop = true }
 
 [tool.poetry.extras]
 benchmark = ["agbenchmark"]
@@ -1,7 +1,9 @@
+import platform
 from pathlib import Path
 from pkgutil import iter_modules
 from shutil import which
+from typing import Union
 
-from cx_Freeze import Executable, setup
+from cx_Freeze import Executable, setup  # type: ignore
 
 packages = [
     m.name
@@ -11,11 +13,47 @@ packages = [
     and ("poetry" in m.module_finder.path)  # type: ignore
 ]
 
-icon = (
-    "../../assets/gpt_dark_RGB.icns"
-    if which("sips")
-    else "../../assets/gpt_dark_RGB.ico"
-)
+# set the icon based on the platform
+icon = "../../assets/gpt_dark_RGB.ico"
+if platform.system() == "Darwin":
+    icon = "../../assets/gpt_dark_RGB.icns"
+elif platform.system() == "Linux":
+    icon = "../../assets/gpt_dark_RGB.png"
+
+
+def txt_to_rtf(input_file: Union[str, Path], output_file: Union[str, Path]) -> None:
+    """
+    Convert a text file to RTF format.
+
+    Args:
+        input_file (Union[str, Path]): Path to the input text file.
+        output_file (Union[str, Path]): Path to the output RTF file.
+
+    Returns:
+        None
+    """
+    input_path = Path(input_file)
+    output_path = Path(output_file)
+
+    with input_path.open("r", encoding="utf-8") as txt_file:
+        content = txt_file.read()
+
+    # RTF header
+    rtf = r"{\rtf1\ansi\deff0 {\fonttbl {\f0 Times New Roman;}}\f0\fs24 "
+
+    # Replace newlines with RTF newline
+    rtf += content.replace("\n", "\\par ")
+
+    # Close RTF document
+    rtf += "}"
+
+    with output_path.open("w", encoding="utf-8") as rtf_file:
+        rtf_file.write(rtf)
+
+
+# Convert LICENSE to LICENSE.rtf
+license_file = "LICENSE.rtf"
+txt_to_rtf("../LICENSE", license_file)
 
 
 setup(
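The platform-based icon selection introduced in the hunk above can be factored into a small helper for illustration (the `pick_icon` name is hypothetical; the asset paths are taken from the diff):

```python
import platform

def pick_icon(system: str) -> str:
    """Mirror the platform-based icon selection from the build script."""
    if system == "Darwin":
        return "../../assets/gpt_dark_RGB.icns"
    if system == "Linux":
        return "../../assets/gpt_dark_RGB.png"
    # Windows (and anything else) falls back to the .ico asset.
    return "../../assets/gpt_dark_RGB.ico"

icon = pick_icon(platform.system())
```

Note the contrast with the replaced logic, which probed for the macOS `sips` tool instead of asking `platform.system()` directly; the new form also gains a dedicated `.png` icon for Linux (the new binary asset added in this commit).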
@@ -55,6 +93,7 @@ setup(
             "target_name": "AutoGPT",
             "add_to_path": True,
             "install_icon": "../assets/gpt_dark_RGB.ico",
+            "license_file": license_file,
         },
     },
 )
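As a quick sanity sketch of what the `txt_to_rtf` helper added above produces, a trimmed in-memory version (same RTF header string and newline substitution, with hypothetical input text) yields a minimal RTF document:

```python
def text_to_rtf_string(content: str) -> str:
    # Same RTF header and newline substitution as the txt_to_rtf helper.
    rtf = r"{\rtf1\ansi\deff0 {\fonttbl {\f0 Times New Roman;}}\f0\fs24 "
    rtf += content.replace("\n", "\\par ")
    # Close the RTF group opened by the header.
    return rtf + "}"

rtf_doc = text_to_rtf_string("MIT License\nCopyright (c) example")
```

The result opens with the `{\rtf1` signature and closes its outer group, which is what the Windows installer's `license_file` option expects of the generated `LICENSE.rtf`.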
benchmark/poetry.lock (generated, 14 lines changed)

@@ -2192,21 +2192,21 @@ testing = ["argcomplete", "attrs (>=19.2.0)", "hypothesis (>=3.56)", "mock", "no
 
 [[package]]
 name = "pytest-asyncio"
-version = "0.21.1"
+version = "0.23.7"
 description = "Pytest support for asyncio"
 optional = false
-python-versions = ">=3.7"
+python-versions = ">=3.8"
 files = [
-    {file = "pytest-asyncio-0.21.1.tar.gz", hash = "sha256:40a7eae6dded22c7b604986855ea48400ab15b069ae38116e8c01238e9eeb64d"},
-    {file = "pytest_asyncio-0.21.1-py3-none-any.whl", hash = "sha256:8666c1c8ac02631d7c51ba282e0c69a8a452b211ffedf2599099845da5c5c37b"},
+    {file = "pytest_asyncio-0.23.7-py3-none-any.whl", hash = "sha256:009b48127fbe44518a547bddd25611551b0e43ccdbf1e67d12479f569832c20b"},
+    {file = "pytest_asyncio-0.23.7.tar.gz", hash = "sha256:5f5c72948f4c49e7db4f29f2521d4031f1c27f86e57b046126654083d4770268"},
 ]
 
 [package.dependencies]
-pytest = ">=7.0.0"
+pytest = ">=7.0.0,<9"
 
 [package.extras]
 docs = ["sphinx (>=5.3)", "sphinx-rtd-theme (>=1.0)"]
-testing = ["coverage (>=6.2)", "flaky (>=3.5.0)", "hypothesis (>=5.7.1)", "mypy (>=0.931)", "pytest-trio (>=0.7.0)"]
+testing = ["coverage (>=6.2)", "hypothesis (>=5.7.1)"]
 
 [[package]]
 name = "pytest-cov"
@@ -2864,4 +2864,4 @@ multidict = ">=4.0"
 [metadata]
 lock-version = "2.0"
 python-versions = "^3.10"
-content-hash = "4a980e6d8f54a2f7f6a3c55d4f40ac3a4b27b5ac6573dd2a39e11213a4b126dd"
+content-hash = "b252cb57f23080c8e381a27c4ba8177ff541bbe98592270f923f2e61bae95323"
@@ -25,7 +25,7 @@ networkx = "^3.1"
 colorama = "^0.4.6"
 pyvis = "^0.3.2"
 selenium = "^4.11.2"
-pytest-asyncio = "^0.21.1"
+pytest-asyncio = "^0.23.3"
 uvicorn = "^0.23.2"
 fastapi = "^0.109.1"
 python-multipart = "^0.0.7"
forge/poetry.lock (generated, 16 lines changed)

@@ -25,7 +25,7 @@ pexpect = "^4.8.0"
 psutil = "^5.9.5"
 pydantic = "^1.10.9"
 pytest = "^7.3.2"
-pytest-asyncio = "^0.21.1"
+pytest-asyncio = "^0.23.3"
 python-dotenv = "^1.0.0"
 python-multipart = "^0.0.7"
 pyvis = "^0.3.2"
@@ -4832,21 +4832,21 @@ testing = ["argcomplete", "attrs (>=19.2.0)", "hypothesis (>=3.56)", "mock", "no
 
 [[package]]
 name = "pytest-asyncio"
-version = "0.21.1"
+version = "0.23.7"
 description = "Pytest support for asyncio"
 optional = false
-python-versions = ">=3.7"
+python-versions = ">=3.8"
 files = [
-    {file = "pytest-asyncio-0.21.1.tar.gz", hash = "sha256:40a7eae6dded22c7b604986855ea48400ab15b069ae38116e8c01238e9eeb64d"},
-    {file = "pytest_asyncio-0.21.1-py3-none-any.whl", hash = "sha256:8666c1c8ac02631d7c51ba282e0c69a8a452b211ffedf2599099845da5c5c37b"},
+    {file = "pytest_asyncio-0.23.7-py3-none-any.whl", hash = "sha256:009b48127fbe44518a547bddd25611551b0e43ccdbf1e67d12479f569832c20b"},
+    {file = "pytest_asyncio-0.23.7.tar.gz", hash = "sha256:5f5c72948f4c49e7db4f29f2521d4031f1c27f86e57b046126654083d4770268"},
 ]
 
 [package.dependencies]
-pytest = ">=7.0.0"
+pytest = ">=7.0.0,<9"
 
 [package.extras]
 docs = ["sphinx (>=5.3)", "sphinx-rtd-theme (>=1.0)"]
-testing = ["coverage (>=6.2)", "flaky (>=3.5.0)", "hypothesis (>=5.7.1)", "mypy (>=0.931)", "pytest-trio (>=0.7.0)"]
+testing = ["coverage (>=6.2)", "hypothesis (>=5.7.1)"]
 
 [[package]]
 name = "pytest-cov"
@@ -6862,4 +6862,4 @@ benchmark = ["agbenchmark"]
 [metadata]
 lock-version = "2.0"
 python-versions = "^3.10"
-content-hash = "b02486b166870b64f778f117708c9dab9ce9305340ad5ef28994273925f09c4c"
+content-hash = "970e0207fb787dd11f7b721a17c03bb2f42c2bd54d6ea71446da440b5295ae24"
@@ -67,7 +67,7 @@ pre-commit = "^3.3.3"
 boto3-stubs = { extras = ["s3"], version = "^1.33.6" }
 types-requests = "^2.31.0.2"
 pytest = "^7.4.0"
-pytest-asyncio = "^0.21.1"
+pytest-asyncio = "^0.23.3"
 pytest-cov = "^5.0.0"
 mock = "^5.1.0"
 pydevd-pycharm = "^233.6745.319"
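Several hunks in this PR bump pytest-asyncio from the 0.21 series to 0.23. As an illustrative aside (none of this appears in the diff itself), pytest-asyncio 0.23 defaults to "strict" mode, so suites upgrading across that boundary typically either mark each coroutine test with `@pytest.mark.asyncio` or opt in to "auto" mode in their pytest configuration:

```toml
# Illustrative only — not part of this PR. With pytest-asyncio >= 0.23,
# "strict" mode is the default; "auto" treats every async def test as
# an asyncio test without requiring an explicit marker.
[tool.pytest.ini_options]
asyncio_mode = "auto"
```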
1 rnd/autogpt_builder/.env.example Normal file
@@ -0,0 +1 @@
AGPT_SERVER_URL=http://localhost:8000
3 rnd/autogpt_builder/.eslintrc.json Normal file
@@ -0,0 +1,3 @@
{
  "extends": "next/core-web-vitals"
}
36 rnd/autogpt_builder/.gitignore vendored Normal file
@@ -0,0 +1,36 @@
# See https://help.github.com/articles/ignoring-files/ for more about ignoring files.

# dependencies
/node_modules
/.pnp
.pnp.js
.yarn/install-state.gz

# testing
/coverage

# next.js
/.next/
/out/

# production
/build

# misc
.DS_Store
*.pem

# debug
npm-debug.log*
yarn-debug.log*
yarn-error.log*

# local env files
.env*.local

# vercel
.vercel

# typescript
*.tsbuildinfo
next-env.d.ts
25 rnd/autogpt_builder/README.md Normal file
@@ -0,0 +1,25 @@
This is the frontend for AutoGPT's next generation

## Getting Started

First, run the development server:

```bash
npm run dev
# or
yarn dev
# or
pnpm dev
# or
bun dev
```

Open [http://localhost:3000](http://localhost:3000) with your browser to see the result.

You can start editing the page by modifying `app/page.tsx`. The page auto-updates as you edit the file.

This project uses [`next/font`](https://nextjs.org/docs/basic-features/font-optimization) to automatically optimize and load Inter, a custom Google Font.

## Deploy

TODO
4 rnd/autogpt_builder/next.config.mjs Normal file
@@ -0,0 +1,4 @@
/** @type {import('next').NextConfig} */
const nextConfig = {};

export default nextConfig;
29 rnd/autogpt_builder/package.json Normal file
@@ -0,0 +1,29 @@
{
  "name": "autogpt_builder",
  "version": "0.1.0",
  "private": true,
  "scripts": {
    "dev": "next dev",
    "build": "next build",
    "start": "next start",
    "lint": "next lint"
  },
  "dependencies": {
    "next": "14.2.4",
    "react": "^18",
    "react-dom": "^18",
    "react-modal": "^3.16.1",
    "reactflow": "^11.11.4"
  },
  "devDependencies": {
    "@types/node": "^20",
    "@types/react": "^18",
    "@types/react-dom": "^18",
    "@types/react-modal": "^3.16.3",
    "eslint": "^8",
    "eslint-config-next": "14.2.4",
    "postcss": "^8",
    "tailwindcss": "^3.4.1",
    "typescript": "^5"
  }
}
8 rnd/autogpt_builder/postcss.config.mjs Normal file
@@ -0,0 +1,8 @@
/** @type {import('postcss-load-config').Config} */
const config = {
  plugins: {
    tailwindcss: {},
  },
};

export default config;
72 rnd/autogpt_builder/public/autogpt.svg Normal file
@@ -0,0 +1,72 @@
<?xml version="1.0" encoding="utf-8"?>
<!-- Generator: Adobe Illustrator 27.8.1, SVG Export Plug-In . SVG Version: 6.00 Build 0) -->
<svg version="1.1" id="AUTOgpt_logo" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" x="0px"
	 y="0px" viewBox="0 0 2000 2000" style="enable-background:new 0 0 2000 2000;" xml:space="preserve">
<style type="text/css">
	.st0{fill:url(#SVGID_1_);}
	.st1{fill:url(#SVGID_00000044859330063917736280000017916509329539228544_);}
	.st2{fill:url(#SVGID_00000140714777961496567230000017473346511890493859_);}
	.st3{fill:url(#SVGID_00000016043459524955834950000015278934287808704695_);}
	.st4{fill:url(#SVGID_00000133526441615091004900000013561443639704575621_);}
	.st5{fill:#000030;}
	.st6{fill:#669CF6;}
</style>
<g>
	<linearGradient id="SVGID_1_" gradientUnits="userSpaceOnUse" x1="17241.2793" y1="15058.8164" x2="17241.2793" y2="16623.8047" gradientTransform="matrix(7.200000e-02 0 0 7.200000e-02 0.928 1.072)">
		<stop offset="0" style="stop-color:#000030"/>
		<stop offset="1" style="stop-color:#9900FF"/>
	</linearGradient>
	<path class="st0" d="M1216.7,1078.8v86.8c0,6.4-5.2,11.6-11.6,11.6c-6.9,0-12.6-4.4-12.6-11.6V1036c0-27.5,22.3-49.8,49.8-49.8
		s49.8,22.3,49.8,49.8c0,27.5-22.3,49.8-49.8,49.8C1233,1085.8,1224.2,1083.2,1216.7,1078.8L1216.7,1078.8z M1226.9,1020.6
		c8.5,0,15.4,6.9,15.4,15.4s-6.9,15.4-15.4,15.4c-1.6,0-3.1-0.2-4.5-0.7c4.5,6.1,11.8,10.1,19.9,10.1c13.7,0,24.8-11.1,24.8-24.8
		s-11.1-24.8-24.8-24.8c-8.2,0-15.4,4-19.9,10.1C1223.8,1020.9,1225.3,1020.6,1226.9,1020.6L1226.9,1020.6z"/>

	<linearGradient id="SVGID_00000085938981603410528570000012380000869662973629_" gradientUnits="userSpaceOnUse" x1="15312.8066" y1="15057.3965" x2="15312.8066" y2="16624.1172" gradientTransform="matrix(7.200000e-02 0 0 7.200000e-02 0.928 1.072)">
		<stop offset="0" style="stop-color:#000030"/>
		<stop offset="1" style="stop-color:#4285F4"/>
	</linearGradient>
	<path style="fill:url(#SVGID_00000085938981603410528570000012380000869662973629_);" d="M1154.5,1078.8v55.8c0,5.1-2.1,9.7-5.4,13
		c-7.3,7.3-20.9,7.3-28.2,0c-9.6-9.6-0.5-25.9-17.7-43.1c-16.7-16.7-45.8-16.7-62.5,0c-7.7,7.7-12.5,18.4-12.5,30.1
		c0,6.4,5.2,11.6,11.6,11.6c6.9,0,12.6-4.4,12.6-11.6c0-5.1,2.1-9.7,5.4-13c7.3-7.3,20.9-7.3,28.2,0c10.5,10.5-0.1,25.3,17.7,43.1
		c16.7,16.7,45.8,16.7,62.5,0c7.7-7.7,12.5-18.4,12.5-30.1v-98.2v-0.3c0-27.5-22.3-49.8-49.8-49.8c-27.5,0-49.8,22.3-49.8,49.8
		c0,27.5,22.3,49.8,49.8,49.8C1138.3,1085.8,1147,1083.2,1154.5,1078.8z M1128.9,1060.8c-8.2,0-15.4-4-19.9-10.1
		c1.4,0.4,3,0.7,4.5,0.7c8.5,0,15.4-6.9,15.4-15.4s-6.9-15.4-15.4-15.4c-1.6,0-3.1,0.2-4.5,0.7c4.5-6.1,11.8-10.1,19.9-10.1
		c13.7,0,24.8,11.1,24.8,24.8C1153.7,1049.7,1142.6,1060.8,1128.9,1060.8L1128.9,1060.8z"/>

	<linearGradient id="SVGID_00000127739374497564837560000013534033995177318078_" gradientUnits="userSpaceOnUse" x1="18088.9141" y1="13182.8672" x2="15383.333" y2="11899.5996" gradientTransform="matrix(7.200000e-02 0 0 7.200000e-02 0.928 1.072)">
		<stop offset="0" style="stop-color:#4285F4"/>
		<stop offset="1" style="stop-color:#9900FF"/>
	</linearGradient>
	<path style="fill:url(#SVGID_00000127739374497564837560000013534033995177318078_);" d="M1328.4,937.5c0-30.6-12.2-59.7-33.8-81.3
		c-21.6-21.6-50.7-33.8-81.3-33.8c-30.6,0-59.7,12.2-81.3,33.8c-21.6,21.6-33.8,50.7-33.8,81.3v5.2c0,6.7,5.4,12.1,12.1,12.1
		c6.7,0,12.1-5.4,12.1-12.1v-5.2c0-24.2,9.7-47.2,26.7-64.2c17.1-17.1,40.1-26.7,64.2-26.7s47.2,9.7,64.2,26.7
		c17.1,17.1,26.7,40.1,26.7,64.2c0,6.7,5.4,12.1,12.1,12.1C1323,949.5,1328.4,944.1,1328.4,937.5z"/>

	<linearGradient id="SVGID_00000026880830724572405890000002574533588083035832_" gradientUnits="userSpaceOnUse" x1="18708.3613" y1="14393.377" x2="18708.3613" y2="16782.8711" gradientTransform="matrix(7.200000e-02 0 0 7.200000e-02 0.928 1.072)">
		<stop offset="0" style="stop-color:#000030"/>
		<stop offset="1" style="stop-color:#4285F4"/>
	</linearGradient>
	<path style="fill:url(#SVGID_00000026880830724572405890000002574533588083035832_);" d="M1328.4,973.9v14.9h19.4
		c6.5,0,11.8,5.3,11.8,11.8c0,6.8-4.6,12.4-11.8,12.4h-19.4v122c0,5.1,2.1,9.7,5.4,13c7.3,7.3,20.9,7.3,28.2,0
		c3.3-3.3,5.4-7.9,5.4-13v-4.1c0-7.2,5.7-11.6,12.6-11.6c6.4,0,11.6,5.2,11.6,11.6v4.1c0,11.8-4.8,22.4-12.5,30.1
		c-16.7,16.7-45.7,16.7-62.4,0c-7.7-7.7-12.5-18.4-12.5-30.1V973.9c0-7,5.6-11.8,12.4-11.8C1323.1,962.2,1328.3,967.4,1328.4,973.9
		L1328.4,973.9z"/>

	<linearGradient id="SVGID_00000018229338295230736120000011477717140636842910_" gradientUnits="userSpaceOnUse" x1="17447.4375" y1="15469.0166" x2="17540.1348" y2="16329.7832" gradientTransform="matrix(7.200000e-02 0 0 7.200000e-02 0.928 1.072)">
		<stop offset="0" style="stop-color:#4285F4"/>
		<stop offset="1" style="stop-color:#9900FF"/>
	</linearGradient>
	<path style="fill:url(#SVGID_00000018229338295230736120000011477717140636842910_);" d="M1272.6,1165.5c0,6.4-5.2,11.6-11.6,11.6
		c-6.9,0-12.6-4.4-12.6-11.6c0-35.5,0-3.9,0-39.4c0-6.4,5.2-11.6,11.6-11.6c6.9,0,12.6,4.4,12.6,11.6
		C1272.6,1161.6,1272.6,1130.1,1272.6,1165.5z"/>
	<path class="st5" d="M707.2,1020.3v82.9h-25.1v-41.6h-54.3v41.6h-25.1v-82.9C602.7,952,707.2,951.1,707.2,1020.3z M996.8,1103.2
		c37.1,0,67.2-30.1,67.2-67.2s-30.1-67.2-67.2-67.2s-67.2,30.1-67.2,67.2C929.6,1073.2,959.7,1103.2,996.8,1103.2z M996.8,1077.5
		c-22.9,0-41.5-18.6-41.5-41.5c0-22.9,18.6-41.5,41.5-41.5s41.5,18.6,41.5,41.5C1038.3,1058.9,1019.8,1077.5,996.8,1077.5z
		M934.1,968.8V993h-36.5v110.3h-24.2V993h-36.5v-24.2C869.3,968.8,901.7,968.8,934.1,968.8z M824.8,1051.7v-82.9h-25.1v82.9
		c0,37.3-54.3,36.7-54.3,0v-82.9h-25.1v82.9C720.3,1120,824.8,1120.9,824.8,1051.7z M682.1,1037.4v-17.1c0-37.3-54.3-36.7-54.3,0
		v17.1H682.1z"/>
	<circle class="st6" cx="1379.5" cy="1096.4" r="12.4"/>
	<circle class="st6" cx="1039.8" cy="1164.7" r="12.4"/>
</g>
</svg>
After Width: | Height: | Size: 5.8 KiB
BIN rnd/autogpt_builder/src/app/favicon.ico Normal file
Binary file not shown.
After Width: | Height: | Size: 15 KiB
33 rnd/autogpt_builder/src/app/globals.css Normal file
@@ -0,0 +1,33 @@
@tailwind base;
@tailwind components;
@tailwind utilities;

:root {
  --foreground-rgb: 0, 0, 0;
  --background-start-rgb: 214, 219, 220;
  --background-end-rgb: 255, 255, 255;
}

@media (prefers-color-scheme: dark) {
  :root {
    --foreground-rgb: 255, 255, 255;
    --background-start-rgb: 0, 0, 0;
    --background-end-rgb: 0, 0, 0;
  }
}

body {
  color: rgb(var(--foreground-rgb));
  background: linear-gradient(
      to bottom,
      transparent,
      rgb(var(--background-end-rgb))
    )
    rgb(var(--background-start-rgb));
}

@layer utilities {
  .text-balance {
    text-wrap: balance;
  }
}
22 rnd/autogpt_builder/src/app/layout.tsx Normal file
@@ -0,0 +1,22 @@
import type { Metadata } from "next";
import { Inter } from "next/font/google";
import "./globals.css";

const inter = Inter({ subsets: ["latin"] });

export const metadata: Metadata = {
  title: "NextGen AutoGPT",
  description: "Your one stop shop to creating AI Agents",
};

export default function RootLayout({
  children,
}: Readonly<{
  children: React.ReactNode;
}>) {
  return (
    <html lang="en">
      <body className={inter.className}>{children}</body>
    </html>
  );
}
40 rnd/autogpt_builder/src/app/page.tsx Normal file
@@ -0,0 +1,40 @@
import Image from "next/image";
import Flow from '../components/Flow';

export default function Home() {
  return (
    <main className="flex min-h-screen flex-col items-center justify-between p-24">
      <div className="z-10 w-full max-w-5xl items-center justify-between font-mono text-sm lg:flex">
        <p className="fixed left-0 top-0 flex w-full justify-center border-b border-gray-300 bg-gradient-to-b from-zinc-200 pb-6 pt-8 backdrop-blur-2xl dark:border-neutral-800 dark:bg-zinc-800/30 dark:from-inherit lg:static lg:w-auto lg:rounded-xl lg:border lg:bg-gray-200 lg:p-4 lg:dark:bg-zinc-800/30">
          Get started by adding a
          <code className="font-mono font-bold">node</code>
        </p>
        <div
          className="fixed bottom-0 left-0 flex h-48 w-full items-end justify-center bg-gradient-to-t from-white via-white dark:from-black dark:via-black lg:static lg:size-auto lg:bg-none">
          <a
            className="pointer-events-none flex place-items-center gap-2 p-8 lg:pointer-events-auto lg:p-0"
            href="https://news.agpt.co/"
            target="_blank"
            rel="noopener noreferrer"
          >
            By{" "}
            <Image
              src="/autogpt.svg"
              alt="AutoGPT Logo"
              className="dark:invert"
              width={100}
              height={24}
              priority
            />
          </a>
        </div>
      </div>

      <div className="w-full flex justify-center mt-10">
        <div className="flow-container w-full h-full">
          <Flow/>
        </div>
      </div>
    </main>
  );
}
224 rnd/autogpt_builder/src/components/CustomNode.tsx Normal file
@@ -0,0 +1,224 @@
import React, { useState, useEffect, FC, memo } from 'react';
import { Handle, Position, NodeProps } from 'reactflow';
import 'reactflow/dist/style.css';
import './customnode.css';
import ModalComponent from './ModalComponent';

type Schema = {
  type: string;
  properties: { [key: string]: any };
  required?: string[];
};

const CustomNode: FC<NodeProps> = ({ data, id }) => {
  const [isPropertiesOpen, setIsPropertiesOpen] = useState(data.isPropertiesOpen || false);
  const [isModalOpen, setIsModalOpen] = useState(false);
  const [activeKey, setActiveKey] = useState<string | null>(null);
  const [modalValue, setModalValue] = useState<string>('');

  useEffect(() => {
    if (data.output_data || data.status) {
      setIsPropertiesOpen(true);
    }
  }, [data.output_data, data.status]);

  const toggleProperties = () => {
    setIsPropertiesOpen(!isPropertiesOpen);
  };

  const generateHandles = (schema: Schema, type: 'source' | 'target') => {
    if (!schema?.properties) return null;
    const keys = Object.keys(schema.properties);
    return keys.map((key) => (
      <div key={key} className="handle-container">
        {type === 'target' && (
          <>
            <Handle
              type={type}
              position={Position.Left}
              id={key}
              style={{ background: '#555', borderRadius: '50%' }}
            />
            <span className="handle-label">{key}</span>
          </>
        )}
        {type === 'source' && (
          <>
            <span className="handle-label">{key}</span>
            <Handle
              type={type}
              position={Position.Right}
              id={key}
              style={{ background: '#555', borderRadius: '50%' }}
            />
          </>
        )}
      </div>
    ));
  };

  const handleInputChange = (key: string, value: any) => {
    const newValues = { ...data.hardcodedValues, [key]: value };
    data.setHardcodedValues(newValues);
  };

  const isHandleConnected = (key: string) => {
    return data.connections && data.connections.some((conn: string) => {
      const [source, target] = conn.split(' -> ');
      return target.includes(key) && target.includes(data.title);
    });
  };

  const handleInputClick = (key: string) => {
    setActiveKey(key);
    setModalValue(data.hardcodedValues[key] || '');
    setIsModalOpen(true);
  };

  const handleModalSave = (value: string) => {
    if (activeKey) {
      handleInputChange(activeKey, value);
    }
    setIsModalOpen(false);
    setActiveKey(null);
  };

  const renderInputField = (key: string, schema: any) => {
    switch (schema.type) {
      case 'string':
        return schema.enum ? (
          <div key={key} className="input-container">
            <select
              value={data.hardcodedValues[key] || ''}
              onChange={(e) => handleInputChange(key, e.target.value)}
              className="select-input"
            >
              {schema.enum.map((option: string) => (
                <option key={option} value={option}>
                  {option}
                </option>
              ))}
            </select>
          </div>
        ) : (
          <div key={key} className="input-container">
            <div className="clickable-input" onClick={() => handleInputClick(key)}>
              {data.hardcodedValues[key] || `Enter ${key}`}
            </div>
          </div>
        );
      case 'boolean':
        return (
          <div key={key} className="input-container">
            <label className="radio-label">
              <input
                type="radio"
                value="true"
                checked={data.hardcodedValues[key] === true}
                onChange={() => handleInputChange(key, true)}
              />
              True
            </label>
            <label className="radio-label">
              <input
                type="radio"
                value="false"
                checked={data.hardcodedValues[key] === false}
                onChange={() => handleInputChange(key, false)}
              />
              False
            </label>
          </div>
        );
      case 'integer':
      case 'number':
        return (
          <div key={key} className="input-container">
            <input
              type="number"
              value={data.hardcodedValues[key] || ''}
              onChange={(e) => handleInputChange(key, parseFloat(e.target.value))}
              className="number-input"
            />
          </div>
        );
      case 'array':
        if (schema.items && schema.items.type === 'string' && schema.items.enum) {
          return (
            <div key={key} className="input-container">
              <select
                value={data.hardcodedValues[key] || ''}
                onChange={(e) => handleInputChange(key, e.target.value)}
                className="select-input"
              >
                {schema.items.enum.map((option: string) => (
                  <option key={option} value={option}>
                    {option}
                  </option>
                ))}
              </select>
            </div>
          );
        }
        return null;
      default:
        return null;
    }
  };

  return (
    <div className="custom-node">
      <div className="node-header">
        <div className="node-title">{data?.title.replace(/\d+/g, '')}</div>
        <button onClick={toggleProperties} className="toggle-button">
          ☰
        </button>
      </div>
      <div className="node-content">
        <div className="input-section">
          {data.inputSchema &&
            Object.keys(data.inputSchema.properties).map((key) => (
              <div key={key}>
                <div className="handle-container">
                  <Handle
                    type="target"
                    position={Position.Left}
                    id={key}
                    style={{ background: '#555', borderRadius: '50%' }}
                  />
                  <span className="handle-label">{key}</span>
                </div>
                {!isHandleConnected(key) && renderInputField(key, data.inputSchema.properties[key])}
              </div>
            ))}
        </div>
        <div className="output-section">
          {data.outputSchema && generateHandles(data.outputSchema, 'source')}
        </div>
      </div>
      {isPropertiesOpen && (
        <div className="node-properties">
          <h4>Node Output</h4>
          <p>
            <strong>Status:</strong>{' '}
            {typeof data.status === 'object' ? JSON.stringify(data.status) : data.status || 'N/A'}
          </p>
          <p>
            <strong>Output Data:</strong>{' '}
            {typeof data.output_data === 'object'
              ? JSON.stringify(data.output_data)
              : data.output_data || 'N/A'}
          </p>
        </div>
      )}
      <ModalComponent
        isOpen={isModalOpen}
        onClose={() => setIsModalOpen(false)}
        onSave={handleModalSave}
        value={modalValue}
      />
    </div>
  );
};

export default memo(CustomNode);
590 rnd/autogpt_builder/src/components/Flow.tsx Normal file
@@ -0,0 +1,590 @@
"use client";

import React, { useState, useCallback, useEffect } from 'react';
import ReactFlow, {
  addEdge,
  applyNodeChanges,
  applyEdgeChanges,
  Node,
  Edge,
  OnNodesChange,
  OnEdgesChange,
  OnConnect,
  NodeTypes,
  EdgeRemoveChange,
} from 'reactflow';
import 'reactflow/dist/style.css';
import Modal from 'react-modal';
import CustomNode from './CustomNode';
import './flow.css';

const initialNodes: Node[] = [];
const initialEdges: Edge[] = [];
const nodeTypes: NodeTypes = {
  custom: CustomNode,
};

interface AvailableNode {
  id: string;
  name: string;
  description: string;
  inputSchema?: { properties: { [key: string]: any }; required?: string[] };
  outputSchema?: { properties: { [key: string]: any } };
}

interface ExecData {
  node_id: string;
  status: string;
  output_data: any;
}

const Flow: React.FC = () => {
  const [nodes, setNodes] = useState<Node[]>(initialNodes);
  const [edges, setEdges] = useState<Edge[]>(initialEdges);
  const [nodeId, setNodeId] = useState<number>(1);
  const [modalIsOpen, setModalIsOpen] = useState<boolean>(false);
  const [selectedNode, setSelectedNode] = useState<Node | null>(null);
  const [title, setTitle] = useState<string>('');
  const [description, setDescription] = useState<string>('');
  const [variableName, setVariableName] = useState<string>('');
  const [variableValue, setVariableValue] = useState<string>('');
  const [printVariable, setPrintVariable] = useState<string>('');
  const [isSidebarOpen, setIsSidebarOpen] = useState<boolean>(false);
  const [searchQuery, setSearchQuery] = useState<string>('');
  const [availableNodes, setAvailableNodes] = useState<AvailableNode[]>([]);
  const [loadingStatus, setLoadingStatus] = useState<'loading' | 'failed' | 'loaded'>('loading');
  const [agentId, setAgentId] = useState<string | null>(null);

  const apiUrl = 'http://localhost:8000'

  useEffect(() => {
    fetch(`${apiUrl}/blocks`)
      .then(response => {
        if (!response.ok) {
          throw new Error(`HTTP error! Status: ${response.status}`);
        }
        return response.json();
      })
      .then(data => {
        setAvailableNodes(data.map((node: AvailableNode) => ({
          ...node,
          description: typeof node.description === 'object' ? JSON.stringify(node.description) : node.description,
        })));
        setLoadingStatus('loaded');
      })
      .catch(error => {
        console.error('Error fetching nodes:', error);
        setLoadingStatus('failed');
      });
  }, []);

  const onNodesChange: OnNodesChange = useCallback(
    (changes) => setNodes((nds) => applyNodeChanges(changes, nds).map(node => ({
      ...node,
      data: {
        ...node.data,
        metadata: {
          ...node.data.metadata,
          position: node.position
        }
      }
    }))),
    []
  );

  const onEdgesChange: OnEdgesChange = useCallback(
    (changes) => {
      const removedEdges = changes.filter((change): change is EdgeRemoveChange => change.type === 'remove');
      setEdges((eds) => applyEdgeChanges(changes, eds));

      if (removedEdges.length > 0) {
        setNodes((nds) =>
          nds.map((node) => {
            const updatedConnections = node.data.connections.filter(
              (conn: string) =>
                !removedEdges.some((edge) => edge.id && conn.includes(edge.id))
            );
            return { ...node, data: { ...node.data, connections: updatedConnections } };
          })
        );
      }
    },
    []
  );

  const onConnect: OnConnect = useCallback(
    (connection) => {
      setEdges((eds) => addEdge(connection, eds));
      setNodes((nds) =>
        nds.map((node) => {
          if (node.id === connection.source) {
            const connections = node.data.connections || [];
            connections.push(`${node.data.title} ${connection.sourceHandle} -> ${connection.targetHandle}`);
            return { ...node, data: { ...node.data, connections } };
          }
          if (node.id === connection.target) {
            const connections = node.data.connections || [];
            connections.push(`${connection.sourceHandle} -> ${node.data.title} ${connection.targetHandle}`);
            return { ...node, data: { ...node.data, connections } };
          }
          return node;
        })
      );
    },
    [setEdges, setNodes]
  );

  const addNode = (type: string, label: string, description: string) => {
    const nodeSchema = availableNodes.find(node => node.name === label);
    const position = { x: Math.random() * 400, y: Math.random() * 400 };

    const newNode: Node = {
      id: nodeId.toString(),
      type: 'custom',
      data: {
        label: label,
        title: `${type} ${nodeId}`,
        description: `${description}`,
        inputSchema: nodeSchema?.inputSchema,
        outputSchema: nodeSchema?.outputSchema,
        connections: [],
        variableName: '',
        variableValue: '',
        printVariable: '',
        setVariableName,
        setVariableValue,
        setPrintVariable,
        hardcodedValues: {},
        setHardcodedValues: (values: { [key: string]: any }) => {
          setNodes((nds) => nds.map((node) =>
            node.id === nodeId.toString()
              ? { ...node, data: { ...node.data, hardcodedValues: values } }
              : node
          ));
        },
        block_id: nodeSchema?.id || '',
        metadata: {
          position // Store position in metadata
        }
      },
      position,
    };
    setNodes((nds) => [...nds, newNode]);
    setNodeId((id) => id + 1);
  };

  const closeModal = () => {
    setModalIsOpen(false);
    setSelectedNode(null);
  };

  const saveNodeData = () => {
    if (selectedNode) {
      setNodes((nds) =>
        nds.map((node) =>
          node.id === selectedNode.id
            ? {
                ...node,
                data: {
                  ...node.data,
                  title,
                  description,
                  label: title,
                  variableName,
                  variableValue: typeof variableValue === 'object' ? JSON.stringify(variableValue) : variableValue,
                  printVariable: typeof printVariable === 'object' ? JSON.stringify(printVariable) : printVariable,
                },
              }
            : node
        )
      );
      closeModal();
    }
  };

  const toggleSidebar = () => {
    setIsSidebarOpen(!isSidebarOpen);
  };

  const filteredNodes = availableNodes.filter(node => node.name.toLowerCase().includes(searchQuery.toLowerCase()));

  const prepareNodeInputData = (node: Node, allNodes: Node[], allEdges: Edge[]) => {
    const nodeSchema = availableNodes.find(n => n.id === node.data.block_id);
    if (!nodeSchema || !nodeSchema.inputSchema) return {};

    let inputData: { [key: string]: any } = {};
    const inputProperties = nodeSchema.inputSchema.properties;
    const requiredProperties = nodeSchema.inputSchema.required || [];

    // Initialize inputData with default values for all required properties
    requiredProperties.forEach(prop => {
      inputData[prop] = node.data.hardcodedValues[prop] || '';
    });

    Object.keys(inputProperties).forEach(prop => {
      const inputEdge = allEdges.find(edge => edge.target === node.id && edge.targetHandle === prop);
      if (inputEdge) {
        const sourceNode = allNodes.find(n => n.id === inputEdge.source);
        inputData[prop] = sourceNode?.data.output_data || sourceNode?.data.hardcodedValues[prop] || '';
      } else if (node.data.hardcodedValues && node.data.hardcodedValues[prop]) {
        inputData[prop] = node.data.hardcodedValues[prop];
      }
    });

    return inputData;
  };

  const updateNodeData = (execData: ExecData) => {
    setNodes((nds) =>
      nds.map((node) => {
        if (node.id === execData.node_id) {
          return {
            ...node,
            data: {
              ...node.data,
              status: execData.status,
              output_data: execData.output_data,
              isPropertiesOpen: true, // Open the properties
            },
          };
        }
        return node;
      })
    );
  };

  const runAgent = async () => {
    try {
      const formattedNodes = nodes.map(node => ({
        id: node.id,
        block_id: node.data.block_id,
        input_default: prepareNodeInputData(node, nodes, edges),
        input_nodes: edges.filter(edge => edge.target === node.id).reduce((acc, edge) => {
          if (edge.targetHandle) {
            acc[edge.targetHandle] = edge.source;
          }
          return acc;
        }, {} as { [key: string]: string }),
        output_nodes: edges.filter(edge => edge.source === node.id).reduce((acc, edge) => {
          if (edge.sourceHandle) {
            acc[edge.sourceHandle] = edge.target;
          }
          return acc;
        }, {} as { [key: string]: string }),
        metadata: node.data.metadata,
        connections: node.data.connections // Ensure connections are preserved
      }));

      const payload = {
        id: '',
        name: 'Agent Name',
        description: 'Agent Description',
        nodes: formattedNodes,
      };

      const createResponse = await fetch(`${apiUrl}/agents`, {
        method: 'POST',
        headers: {
          'Content-Type': 'application/json',
        },
        body: JSON.stringify(payload),
      });

      if (!createResponse.ok) {
        throw new Error(`HTTP error! Status: ${createResponse.status}`);
      }

      const createData = await createResponse.json();
      const agentId = createData.id;
      setAgentId(agentId);

      const responseNodes = createData.nodes.map((node: any) => {
        const block = availableNodes.find(n => n.id === node.block_id);
        const connections = edges.filter(edge => edge.source === node.id || edge.target === node.id).map(edge => ({
          id: edge.id,
          source: edge.source,
          sourceHandle: edge.sourceHandle,
          target: edge.target,
          targetHandle: edge.targetHandle
        }));
        return {
          id: node.id,
          type: 'custom',
          position: node.metadata.position,
          data: {
            label: block?.name || 'Unknown',
            title: `${block?.name || 'Unknown'}`,
            description: `${block?.description || ''}`,
            inputSchema: block?.inputSchema,
            outputSchema: block?.outputSchema,
            connections: connections.map(c => `${c.source}-${c.sourceHandle} -> ${c.target}-${c.targetHandle}`),
            variableName: '',
            variableValue: '',
            printVariable: '',
            setVariableName,
            setVariableValue,
            setPrintVariable,
            hardcodedValues: node.input_default,
            setHardcodedValues: (values: { [key: string]: any }) => {
              setNodes((nds) => nds.map((n) =>
                n.id === node.id
                  ? { ...n, data: { ...n.data, hardcodedValues: values } }
                  : n
              ));
            },
            block_id: node.block_id,
            metadata: node.metadata
          },
        };
      });

      const newEdges = createData.nodes.flatMap((node: any) => {
        return Object.entries(node.output_nodes).map(([sourceHandle, targetNodeId]) => ({
          id: `${node.id}-${sourceHandle}-${targetNodeId}`,
          source: node.id,
          sourceHandle: sourceHandle,
          target: targetNodeId,
          targetHandle: Object.keys(node.input_nodes).find(key => node.input_nodes[key] === targetNodeId) || '',
        }));
      });

      setNodes(responseNodes);
      setEdges(newEdges);

      const initialNodeInput = nodes.reduce((acc, node) => {
        acc[node.id] = prepareNodeInputData(node, nodes, edges);
        return acc;
      }, {} as { [key: string]: any });

      const nodeInputForExecution = Object.keys(initialNodeInput).reduce((acc, key) => {
        const blockId = nodes.find(node => node.id === key)?.data.block_id;
        const nodeSchema = availableNodes.find(n => n.id === blockId);
        if (nodeSchema && nodeSchema.inputSchema) {
          Object.keys(nodeSchema.inputSchema.properties).forEach(prop => {
            acc[prop] = initialNodeInput[key][prop];
          });
        }
        return acc;
      }, {} as { [key: string]: any });

      const executeResponse = await fetch(`${apiUrl}/agents/${agentId}/execute`, {
        method: 'POST',
        headers: {
          'Content-Type': 'application/json',
        },
        body: JSON.stringify(nodeInputForExecution),
      });

      if (!executeResponse.ok) {
        throw new Error(`HTTP error! Status: ${executeResponse.status}`);
      }

      const executeData = await executeResponse.json();
      const runId = executeData.run_id;

      const startPolling = () => {
        const endTime = Date.now() + 60000;

        const poll = async () => {
          if (Date.now() >= endTime) {
            console.log('Polling timeout reached.');
|
||||
return;
|
||||
}
|
||||
|
||||
try {
|
||||
const response = await fetch(`${apiUrl}/agents/${agentId}/executions/${runId}`);
|
||||
if (!response.ok) {
|
||||
throw new Error(`HTTP error! Status: ${response.status}`);
|
||||
}
|
||||
|
||||
const data = await response.json();
|
||||
data.forEach(updateNodeData);
|
||||
|
||||
const allCompleted = data.every((exec: any) => exec.status === 'COMPLETED');
|
||||
if (allCompleted) {
|
||||
console.log('All nodes are completed.');
|
||||
return;
|
||||
}
|
||||
|
||||
setTimeout(poll, 100);
|
||||
} catch (error) {
|
||||
console.error('Error during polling:', error);
|
||||
setTimeout(poll, 100);
|
||||
}
|
||||
};
|
||||
|
||||
poll();
|
||||
};
|
||||
|
||||
startPolling();
|
||||
} catch (error) {
|
||||
console.error('Error running agent:', error);
|
||||
}
|
||||
};
|
||||
|
||||
return (
|
||||
<div className="flow-container">
|
||||
<div className={`flow-controls ${isSidebarOpen ? 'open' : ''}`}>
|
||||
<button className="nodes-button" onClick={toggleSidebar}>
|
||||
Nodes
|
||||
</button>
|
||||
<button className="run-button" onClick={runAgent}>
|
||||
Run
|
||||
</button>
|
||||
{agentId && (
|
||||
<span style={{ marginLeft: '10px', color: '#fff', fontSize: '16px' }}>
|
||||
Agent ID: {agentId}
|
||||
</span>
|
||||
)}
|
||||
</div>
|
||||
<div className="flow-wrapper">
|
||||
<ReactFlow
|
||||
nodes={nodes}
|
||||
edges={edges}
|
||||
nodeTypes={nodeTypes}
|
||||
onNodesChange={onNodesChange}
|
||||
onEdgesChange={onEdgesChange}
|
||||
onConnect={onConnect}
|
||||
fitView
|
||||
/>
|
||||
</div>
|
||||
{selectedNode && (
|
||||
<Modal isOpen={modalIsOpen} onRequestClose={closeModal} contentLabel="Node Info" className="modal" overlayClassName="overlay">
|
||||
<h2>Edit Node</h2>
|
||||
<form
|
||||
onSubmit={(e) => {
|
||||
e.preventDefault();
|
||||
saveNodeData();
|
||||
}}
|
||||
>
|
||||
<div>
|
||||
<label>
|
||||
Title:
|
||||
<input
|
||||
type="text"
|
||||
value={title}
|
||||
onChange={(e) => setTitle(e.target.value)}
|
||||
required
|
||||
/>
|
||||
</label>
|
||||
</div>
|
||||
<div>
|
||||
<label>
|
||||
Description:
|
||||
<textarea
|
||||
value={description}
|
||||
onChange={(e) => setDescription(e.target.value)}
|
||||
required
|
||||
/>
|
||||
</label>
|
||||
</div>
|
||||
{selectedNode.data.title.includes('Variable') && (
|
||||
<>
|
||||
<div>
|
||||
<label>
|
||||
Variable Name:
|
||||
<input
|
||||
type="text"
|
||||
value={variableName}
|
||||
onChange={(e) => setVariableName(e.target.value)}
|
||||
required
|
||||
/>
|
||||
</label>
|
||||
</div>
|
||||
<div>
|
||||
<label>
|
||||
Variable Value:
|
||||
<input
|
||||
type="text"
|
||||
value={variableValue}
|
||||
onChange={(e) => setVariableValue(e.target.value)}
|
||||
required
|
||||
/>
|
||||
</label>
|
||||
</div>
|
||||
</>
|
||||
)}
|
||||
{selectedNode.data.title.includes('Print') && (
|
||||
<>
|
||||
<div>
|
||||
<label>
|
||||
Variable to Print:
|
||||
<input
|
||||
type="text"
|
||||
value={printVariable}
|
||||
onChange={(e) => setPrintVariable(e.target.value)}
|
||||
required
|
||||
/>
|
||||
</label>
|
||||
</div>
|
||||
</>
|
||||
)}
|
||||
<button type="submit">Save</button>
|
||||
<button type="button" onClick={closeModal}>
|
||||
Cancel
|
||||
</button>
|
||||
</form>
|
||||
</Modal>
|
||||
)}
|
||||
<div className={`sidebar ${isSidebarOpen ? 'open' : ''}`}>
|
||||
<h3 style={{ margin: '0 0 10px 0' }}>Nodes</h3>
|
||||
<input
|
||||
type="text"
|
||||
placeholder="Search nodes..."
|
||||
value={searchQuery}
|
||||
onChange={(e) => setSearchQuery(e.target.value)}
|
||||
style={{
|
||||
padding: '10px',
|
||||
fontSize: '16px',
|
||||
backgroundColor: '#333',
|
||||
color: '#e0e0e0',
|
||||
border: '1px solid #555',
|
||||
borderRadius: '4px',
|
||||
marginBottom: '10px',
|
||||
width: 'calc(100% - 22px)',
|
||||
boxSizing: 'border-box',
|
||||
}}
|
||||
/>
|
||||
<div style={{ display: 'flex', flexDirection: 'column', gap: '10px' }}>
|
||||
{loadingStatus === 'loading' && <p>Loading...</p>}
|
||||
{loadingStatus === 'failed' && <p>Failed To Load Nodes</p>}
|
||||
{loadingStatus === 'loaded' && filteredNodes.map(node => (
|
||||
<div key={node.id} style={sidebarNodeRowStyle}>
|
||||
<div>
|
||||
<strong>{node.name}</strong>
|
||||
<p>{node.description}</p>
|
||||
</div>
|
||||
<button
|
||||
onClick={() => addNode(node.name, node.name, node.description)}
|
||||
style={sidebarButtonStyle}
|
||||
>
|
||||
Add
|
||||
</button>
|
||||
</div>
|
||||
))}
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
);
|
||||
};
|
||||
|
||||
const sidebarNodeRowStyle = {
|
||||
display: 'flex',
|
||||
justifyContent: 'space-between',
|
||||
alignItems: 'center',
|
||||
backgroundColor: '#444',
|
||||
padding: '10px',
|
||||
borderRadius: '4px',
|
||||
};
|
||||
|
||||
const sidebarButtonStyle = {
|
||||
padding: '10px 20px',
|
||||
fontSize: '16px',
|
||||
backgroundColor: '#007bff',
|
||||
color: '#fff',
|
||||
border: 'none',
|
||||
borderRadius: '4px',
|
||||
cursor: 'pointer',
|
||||
};
|
||||
|
||||
export default Flow;
|
||||
40
rnd/autogpt_builder/src/components/ModalComponent.tsx
Normal file
@@ -0,0 +1,40 @@
import React, { FC } from 'react';
import './modal.css';

interface ModalProps {
  isOpen: boolean;
  onClose: () => void;
  onSave: (value: string) => void;
  value: string;
}

const ModalComponent: FC<ModalProps> = ({ isOpen, onClose, onSave, value }) => {
  const [tempValue, setTempValue] = React.useState(value);

  const handleSave = () => {
    onSave(tempValue);
    onClose();
  };

  if (!isOpen) {
    return null;
  }

  return (
    <div className="modal-overlay">
      <div className="modal">
        <textarea
          className="modal-textarea"
          value={tempValue}
          onChange={(e) => setTempValue(e.target.value)}
        />
        <div className="modal-actions">
          <button onClick={onClose}>Cancel</button>
          <button onClick={handleSave}>Save</button>
        </div>
      </div>
    </div>
  );
};

export default ModalComponent;
143
rnd/autogpt_builder/src/components/customnode.css
Normal file
@@ -0,0 +1,143 @@
.custom-node {
  padding: 20px;
  border: 2px solid #fff;
  border-radius: 12px;
  background: #1e1e1e;
  color: #e0e0e0;
  width: 260px;
  box-shadow: 0 4px 8px rgba(0, 0, 0, 0.2);
  font-family: 'Arial', sans-serif;
}

.node-header {
  display: flex;
  justify-content: space-between;
  align-items: center;
  margin-bottom: 15px;
}

.node-title {
  font-size: 20px;
  font-weight: bold;
  color: #61dafb;
}

.toggle-button {
  background: transparent;
  border: none;
  cursor: pointer;
  color: #e0e0e0;
  font-size: 20px;
}

.node-content {
  display: flex;
  flex-direction: column;
  gap: 10px;
}

.input-section, .output-section {
  flex: 1;
}

.handle-container {
  display: flex;
  align-items: center;
  position: relative;
  margin-bottom: 10px;
}

.handle-label {
  color: #e0e0e0;
  margin-left: 10px;
}

.input-container {
  margin-bottom: 10px;
}

.clickable-input {
  padding: 10px;
  border-radius: 8px;
  border: 1px solid #555;
  background: #2a2a2a;
  color: #e0e0e0;
  cursor: pointer;
  text-align: center;
  transition: background 0.3s;
}

.clickable-input:hover {
  background: #444;
}

.select-input {
  width: 100%;
  padding: 8px;
  border-radius: 4px;
  border: 1px solid #555;
  background: #2a2a2a;
  color: #e0e0e0;
  cursor: pointer;
}

.radio-group {
  display: flex;
  flex-direction: column;
  gap: 5px;
}

.radio-label {
  color: #e0e0e0;
}

.radio-label input {
  margin-right: 10px;
}

.number-input {
  width: 100%;
  padding: 8px;
  border-radius: 4px;
  border: 1px solid #555;
  background: #2a2a2a;
  color: #e0e0e0;
}

.node-properties {
  margin-top: 20px;
  background: #2a2a2a;
  padding: 15px;
  border-radius: 8px;
}

.node-properties h4 {
  margin: 0 0 10px 0;
}

.node-properties p {
  margin: 5px 0;
}

@media screen and (max-width: 768px) {
  .custom-node {
    width: 90%;
    padding: 15px;
  }

  .node-content {
    flex-direction: column;
  }

  .toggle-button {
    font-size: 16px;
  }

  .node-title {
    font-size: 18px;
  }

  .clickable-input {
    padding: 8px;
  }
}
150
rnd/autogpt_builder/src/components/flow.css
Normal file
@@ -0,0 +1,150 @@
/* flow.css or index.css */

body {
  margin: 0;
  font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', 'Roboto', 'Oxygen',
    'Ubuntu', 'Cantarell', 'Fira Sans', 'Droid Sans', 'Helvetica Neue',
    sans-serif;
  -webkit-font-smoothing: antialiased;
  -moz-osx-font-smoothing: grayscale;
  background-color: #121212;
  color: #e0e0e0;
}

code {
  font-family: source-code-pro, Menlo, Monaco, Consolas, 'Courier New',
    monospace;
}

button {
  background-color: #444;
  color: #e0e0e0;
  padding: 10px;
  border: none;
  border-radius: 4px;
  cursor: pointer;
  transition: background-color 0.3s ease;
}

button:hover {
  background-color: #666;
}

input, textarea {
  background-color: #333;
  color: #e0e0e0;
  border: 1px solid #555;
  padding: 8px;
  border-radius: 4px;
  width: calc(100% - 18px);
  box-sizing: border-box;
  margin-top: 5px;
}

input::placeholder, textarea::placeholder {
  color: #aaa;
}

.modal {
  position: absolute;
  top: 50%;
  left: 50%;
  right: auto;
  bottom: auto;
  margin-right: -50%;
  transform: translate(-50%, -50%);
  background: #333;
  padding: 20px;
  border: 1px solid #ccc;
  border-radius: 4px;
  color: #e0e0e0;
}

.overlay {
  position: fixed;
  top: 0;
  left: 0;
  right: 0;
  bottom: 0;
  background-color: rgba(0, 0, 0, 0.75);
}

.modal h2 {
  margin-top: 0;
}

.modal button {
  margin-right: 10px;
}

.modal form {
  display: flex;
  flex-direction: column;
}

.modal form div {
  margin-bottom: 15px;
}

.sidebar {
  position: fixed;
  top: 0;
  left: -600px;
  width: 350px;
  height: 100%;
  background-color: #333;
  color: #fff;
  padding: 20px;
  transition: left 0.3s ease;
  z-index: 1000;
  overflow-y: auto;
}

.sidebar.open {
  left: 0;
}

.sidebar h3 {
  margin: 0 0 20px;
}

.sidebarNodeRowStyle {
  display: flex;
  justify-content: space-between;
  align-items: center;
  background-color: #444;
  padding: 10px;
  border-radius: 4px;
  cursor: grab;
}

.sidebarNodeRowStyle.dragging {
  opacity: 0.5;
}

.flow-container {
  width: 100%;
  height: 600px; /* Adjust this height as needed */
  position: relative;
}

.flow-wrapper {
  height: 100%;
  width: 100%;
  display: flex;
  justify-content: center;
  align-items: center;
}

.flow-controls {
  position: absolute;
  left: -80px;
  z-index: 1001;
  display: flex;
  gap: 10px;
  transition: transform 0.3s ease;
}

.flow-controls.open {
  transform: translateX(350px);
}
34
rnd/autogpt_builder/src/components/modal.css
Normal file
@@ -0,0 +1,34 @@
.modal-overlay {
  position: fixed;
  top: 0;
  left: 0;
  right: 0;
  bottom: 0;
  background: rgba(0, 0, 0, 0.6);
  display: flex;
  justify-content: center;
  align-items: center;
}

.modal {
  background: #fff;
  padding: 20px;
  border-radius: 8px;
  width: 500px;
  max-width: 90%;
}

.modal-textarea {
  width: 100%;
  height: 200px;
  padding: 10px;
  border-radius: 4px;
  border: 1px solid #ccc;
}

.modal-actions {
  display: flex;
  justify-content: flex-end;
  gap: 10px;
  margin-top: 10px;
}
20
rnd/autogpt_builder/tailwind.config.ts
Normal file
@@ -0,0 +1,20 @@
import type { Config } from "tailwindcss";

const config: Config = {
  content: [
    "./src/pages/**/*.{js,ts,jsx,tsx,mdx}",
    "./src/components/**/*.{js,ts,jsx,tsx,mdx}",
    "./src/app/**/*.{js,ts,jsx,tsx,mdx}",
  ],
  theme: {
    extend: {
      backgroundImage: {
        "gradient-radial": "radial-gradient(var(--tw-gradient-stops))",
        "gradient-conic":
          "conic-gradient(from 180deg at 50% 50%, var(--tw-gradient-stops))",
      },
    },
  },
  plugins: [],
};
export default config;
26
rnd/autogpt_builder/tsconfig.json
Normal file
@@ -0,0 +1,26 @@
{
  "compilerOptions": {
    "lib": ["dom", "dom.iterable", "esnext"],
    "allowJs": true,
    "skipLibCheck": true,
    "strict": true,
    "noEmit": true,
    "esModuleInterop": true,
    "module": "esnext",
    "moduleResolution": "bundler",
    "resolveJsonModule": true,
    "isolatedModules": true,
    "jsx": "preserve",
    "incremental": true,
    "plugins": [
      {
        "name": "next"
      }
    ],
    "paths": {
      "@/*": ["./src/*"]
    }
  },
  "include": ["next-env.d.ts", "**/*.ts", "**/*.tsx", ".next/types/**/*.ts"],
  "exclude": ["node_modules"]
}
5
rnd/autogpt_server/.gitignore
vendored
@@ -1,3 +1,6 @@
database.db
database.db-journal
build/
config.json
secrets/*
!secrets/.gitkeep
@@ -18,3 +18,57 @@ It will also trigger the agent execution by pushing its execution request to the

A component that will execute the agents.
This component will be a pool of processes/threads that will consume the ExecutionQueue and execute the agent accordingly.
The result and progress of its execution will be persisted in the database.
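The executor component described above (a pool of workers consuming the ExecutionQueue and recording results) can be sketched as follows. This is a minimal illustration using threads and a standard-library queue rather than the server's actual process-based ExecutionManager; the names `NodeExecution` and `run_node` here are hypothetical stand-ins, not the server's API.

```python
import queue
import threading

class NodeExecution:
    """Illustrative execution request: which node to run, with what data."""
    def __init__(self, node_id, data):
        self.node_id = node_id
        self.data = data

def run_node(execution):
    # Stand-in for real block execution; just echoes its input.
    return {"node_id": execution.node_id, "output": execution.data}

def worker(q, results):
    # Each worker loops, pulling executions off the shared queue.
    while True:
        execution = q.get()
        if execution is None:  # sentinel: shut this worker down
            q.task_done()
            break
        results.append(run_node(execution))  # "persist" the result
        q.task_done()

def execute_all(executions, pool_size=5):
    q = queue.Queue()
    results = []
    threads = [threading.Thread(target=worker, args=(q, results))
               for _ in range(pool_size)]
    for t in threads:
        t.start()
    for e in executions:
        q.put(e)
    for _ in threads:  # one sentinel per worker
        q.put(None)
    for t in threads:
        t.join()
    return results
```

The real server uses a process pool sized by `settings.config.num_workers` (see the `cli.py` change below) and persists results via Prisma instead of an in-memory list.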
## Setup

This setup is for macOS/Linux.
To set up the project, follow these steps inside the project directory:

1. Enter the poetry shell
   ```
   poetry shell
   ```

1. Install dependencies
   ```
   poetry install
   ```

1. Generate the Prisma client
   ```
   poetry run prisma generate
   ```

   If Prisma generates the client for the global Python installation instead of the virtual environment, the current mitigation is to uninstall the global prisma package:
   ```
   pip uninstall prisma
   ```

   Then run the generation again.
   The path *should* look something like this:
   `<some path>/pypoetry/virtualenvs/autogpt-server-TQIRSwR6-py3.12/bin/prisma`

1. Migrate the database. Be careful: this deletes the current data in the database.
   ```
   poetry run prisma migrate dev
   ```

1. Start the server. This starts the server in the background.
   ```
   poetry run python ./autogpt_server/cli.py start
   ```

   You may need to change the permissions of the file to make it executable:
   ```
   chmod +x autogpt_server/cli.py
   ```

1. Stop the server
   ```
   poetry run python ./autogpt_server/cli.py stop
   ```

1. Run the tests
   ```
   poetry run pytest
   ```
@@ -1,9 +1,17 @@
from multiprocessing import freeze_support, set_start_method
from autogpt_server.executor import ExecutionManager, ExecutionScheduler
from autogpt_server.server import AgentServer
from autogpt_server.util.process import AppProcess
from autogpt_server.util.service import PyroNameServer


def get_config_and_secrets():
    from autogpt_server.util.settings import Settings

    settings = Settings()
    return settings


def run_processes(processes: list[AppProcess], **kwargs):
    """
    Execute all processes in the app. The last process is run in the foreground.
@@ -19,10 +27,14 @@ def run_processes(processes: list[AppProcess], **kwargs):


def main(**kwargs):
    settings = get_config_and_secrets()
    set_start_method("spawn", force=True)
    freeze_support()

    run_processes(
        [
            PyroNameServer(),
            ExecutionManager(pool_size=5),
            ExecutionManager(pool_size=settings.config.num_workers),
            ExecutionScheduler(),
            AgentServer(),
        ],
@@ -1,9 +1,8 @@
import json
import jsonschema

from abc import ABC, abstractmethod
from typing import Any, ClassVar
from typing import Any, Generator, ClassVar

import jsonschema
from prisma.models import AgentBlock
from pydantic import BaseModel

@@ -92,6 +91,15 @@ class BlockSchema(BaseModel):
        except jsonschema.ValidationError as e:
            return str(e)

    def get_fields(self) -> set[str]:
        return set(self.jsonschema["properties"].keys())

    def get_required_fields(self) -> set[str]:
        return set(self.jsonschema["required"])


BlockOutput = Generator[tuple[str, Any], None, None]


class Block(ABC, BaseModel):
    @classmethod
@@ -126,13 +134,15 @@ class Block(ABC, BaseModel):
        pass

    @abstractmethod
    def run(self, input_data: BlockData) -> tuple[str, Any]:
    def run(self, input_data: BlockData) -> BlockOutput:
        """
        Run the block with the given input data.
        Args:
            input_data: The input data with the structure of input_schema.
        Returns:
            The (output name, output data), matching the type in output_schema.
            A Generator that yields (output_name, output_data).
            output_name: One of the output names defined in the Block's output_schema.
            output_data: The data for the output_name, matching the defined schema.
        """
        pass

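The change above turns a block's `run` into a generator yielding `(output_name, output_data)` pairs, which `execute` drains while validating each pair. A minimal standalone sketch of that contract (`SplitWordsBlock` is an illustrative block, not one from the server, and schema validation is omitted):

```python
from typing import Any, Generator

# Same shape as the BlockOutput alias introduced in the diff above.
BlockOutput = Generator[tuple[str, Any], None, None]

class SplitWordsBlock:
    """Illustrative block: yields one "word" output per word of its input."""

    def run(self, input_data: dict[str, Any]) -> BlockOutput:
        for word in input_data["text"].split():
            yield "word", word

def execute(block: SplitWordsBlock, input_data: dict[str, Any]) -> list[tuple[str, Any]]:
    # Mirrors the new Block.execute loop: drain the generator, collecting
    # every yielded output (real code validates each pair against the
    # output_schema before yielding it onward).
    return list(block.run(input_data))
```

The generator shape is what lets one output pin produce multiple values per run, which the `ExecutionResult.output_data: dict[str, list[Any]]` change below relies on.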
@@ -149,20 +159,18 @@
            "outputSchema": self.output_schema.jsonschema,
        }

    def execute(self, input_data: BlockData) -> tuple[str, Any]:
    def execute(self, input_data: BlockData) -> BlockOutput:
        if error := self.input_schema.validate_data(input_data):
            raise ValueError(
                f"Unable to execute block with invalid input data: {error}"
            )

        output_name, output_data = self.run(input_data)

        if error := self.output_schema.validate_field(output_name, output_data):
            raise ValueError(
                f"Unable to execute block with invalid output data: {error}"
            )

        return output_name, output_data
        for output_name, output_data in self.run(input_data):
            if error := self.output_schema.validate_field(output_name, output_data):
                raise ValueError(
                    f"Unable to execute block with invalid output data: {error}"
                )
            yield output_name, output_data


# ===================== Inline-Block Implementations ===================== #
@@ -181,16 +189,19 @@ class ParrotBlock(Block):
        }
    )

    def run(self, input_data: BlockData) -> tuple[str, Any]:
        return "output", input_data["input"]
    def run(self, input_data: BlockData) -> BlockOutput:
        yield "output", input_data["input"]


class TextCombinerBlock(Block):
class TextFormatterBlock(Block):
    id: ClassVar[str] = "db7d8f02-2f44-4c55-ab7a-eae0941f0c30"  # type: ignore
    input_schema: ClassVar[BlockSchema] = BlockSchema(  # type: ignore
        {
            "text1": "string",
            "text2": "string",
            "texts": {
                "type": "array",
                "items": {"type": "string"},
                "minItems": 1,
            },
            "format": "string",
        }
    )
@@ -200,11 +211,8 @@ class TextCombinerBlock(Block):
        }
    )

    def run(self, input_data: BlockData) -> tuple[str, Any]:
        return "combined_text", input_data["format"].format(
            text1=input_data["text1"],
            text2=input_data["text2"],
        )
    def run(self, input_data: BlockData) -> BlockOutput:
        yield "combined_text", input_data["format"].format(texts=input_data["texts"])


class PrintingBlock(Block):
@@ -220,8 +228,8 @@ class PrintingBlock(Block):
        }
    )

    def run(self, input_data: BlockData) -> tuple[str, Any]:
        return "status", "printed"
    def run(self, input_data: BlockData) -> BlockOutput:
        yield "status", "printed"


# ======================= Block Helper Functions ======================= #

@@ -1,23 +1,27 @@
|
||||
import json
|
||||
from collections import defaultdict
|
||||
from datetime import datetime
|
||||
from enum import Enum
|
||||
from multiprocessing import Queue
|
||||
from multiprocessing import Manager
|
||||
from typing import Any
|
||||
|
||||
from prisma.models import AgentNodeExecution
|
||||
|
||||
from autogpt_server.data.db import BaseDbModel
|
||||
from prisma.models import (
|
||||
AgentGraphExecution,
|
||||
AgentNodeExecution,
|
||||
AgentNodeExecutionInputOutput,
|
||||
)
|
||||
from pydantic import BaseModel
|
||||
|
||||
|
||||
class Execution(BaseDbModel):
|
||||
"""Data model for an execution of an Agent"""
|
||||
|
||||
run_id: str
|
||||
class NodeExecution(BaseModel):
|
||||
graph_exec_id: str
|
||||
node_exec_id: str
|
||||
node_id: str
|
||||
data: dict[str, Any]
|
||||
|
||||
|
||||
class ExecutionStatus(str, Enum):
|
||||
INCOMPLETE = "INCOMPLETE"
|
||||
QUEUED = "QUEUED"
|
||||
RUNNING = "RUNNING"
|
||||
COMPLETED = "COMPLETED"
|
||||
@@ -31,103 +35,228 @@ class ExecutionQueue:
|
||||
"""
|
||||
|
||||
def __init__(self):
|
||||
self.queue: Queue[Execution] = Queue()
|
||||
self.queue = Manager().Queue()
|
||||
|
||||
def add(self, execution: Execution) -> Execution:
|
||||
def add(self, execution: NodeExecution) -> NodeExecution:
|
||||
self.queue.put(execution)
|
||||
return execution
|
||||
|
||||
def get(self) -> Execution:
|
||||
def get(self) -> NodeExecution:
|
||||
return self.queue.get()
|
||||
|
||||
def empty(self) -> bool:
|
||||
return self.queue.empty()
|
||||
|
||||
|
||||
class ExecutionResult(BaseDbModel):
|
||||
run_id: str
|
||||
execution_id: str
|
||||
class ExecutionResult(BaseModel):
|
||||
graph_exec_id: str
|
||||
node_exec_id: str
|
||||
node_id: str
|
||||
status: ExecutionStatus
|
||||
input_data: dict[str, Any]
|
||||
output_name: str
|
||||
output_data: Any
|
||||
creation_time: datetime
|
||||
input_data: dict[str, Any] # 1 input pin should consume exactly 1 data.
|
||||
output_data: dict[str, list[Any]] # but 1 output pin can produce multiple output.
|
||||
add_time: datetime
|
||||
queue_time: datetime | None
|
||||
start_time: datetime | None
|
||||
end_time: datetime | None
|
||||
|
||||
@staticmethod
|
||||
def from_db(execution: AgentNodeExecution):
|
||||
input_data = defaultdict()
|
||||
for data in execution.Input or []:
|
||||
input_data[data.name] = json.loads(data.data)
|
||||
|
||||
output_data = defaultdict(list)
|
||||
for data in execution.Output or []:
|
||||
output_data[data.name].append(json.loads(data.data))
|
||||
|
||||
return ExecutionResult(
|
||||
run_id=execution.executionId,
|
||||
graph_exec_id=execution.agentGraphExecutionId,
|
||||
node_exec_id=execution.id,
|
||||
node_id=execution.agentNodeId,
|
||||
execution_id=execution.id,
|
||||
status=ExecutionStatus(execution.executionStatus),
|
||||
input_data=json.loads(execution.inputData or "{}"),
|
||||
output_name=execution.outputName or "",
|
||||
output_data=json.loads(execution.outputData or "{}"),
|
||||
creation_time=execution.creationTime,
|
||||
start_time=execution.startTime,
|
||||
end_time=execution.endTime,
|
||||
input_data=input_data,
|
||||
output_data=output_data,
|
||||
add_time=execution.addedTime,
|
||||
queue_time=execution.queuedTime,
|
||||
start_time=execution.startedTime,
|
||||
end_time=execution.endedTime,
|
||||
)
|
||||
|
||||
|
||||
# --------------------- Model functions --------------------- #
|
||||
|
||||
|
||||
async def enqueue_execution(execution: Execution) -> None:
|
||||
await AgentNodeExecution.prisma().create(
|
||||
async def create_graph_execution(
|
||||
graph_id: str,
|
||||
node_ids: list[str],
|
||||
data: dict[str, Any]
|
||||
) -> tuple[str, list[ExecutionResult]]:
|
||||
"""
|
||||
Create a new AgentGraphExecution record.
|
||||
Returns:
|
||||
The id of the AgentGraphExecution and the list of ExecutionResult for each node.
|
||||
"""
|
||||
result = await AgentGraphExecution.prisma().create(
|
||||
data={
|
||||
"id": execution.id,
|
||||
"executionId": execution.run_id,
|
||||
"agentNodeId": execution.node_id,
|
||||
"executionStatus": ExecutionStatus.QUEUED,
|
||||
"inputData": json.dumps(execution.data),
|
||||
"creationTime": datetime.now(),
|
||||
"agentGraphId": graph_id,
|
||||
"AgentNodeExecutions": {
|
||||
"create": [ # type: ignore
|
||||
{
|
||||
"agentNodeId": node_id,
|
||||
"executionStatus": ExecutionStatus.INCOMPLETE,
|
||||
"Input": {
|
||||
"create": [
|
||||
{"name": name, "data": json.dumps(data)}
|
||||
for name, data in data.items()
|
||||
]
|
||||
},
|
||||
}
|
||||
for node_id in node_ids
|
||||
]
|
||||
},
|
||||
},
|
||||
include={"AgentNodeExecutions": True}
|
||||
)
|
||||
|
||||
return result.id, [
|
||||
ExecutionResult.from_db(execution)
|
||||
for execution in result.AgentNodeExecutions or []
|
||||
]
|
||||
|
||||
|
||||
async def upsert_execution_input(
|
||||
node_id: str,
|
||||
graph_exec_id: str,
|
||||
input_name: str,
|
||||
data: Any,
|
||||
) -> str:
|
||||
"""
|
||||
Insert AgentNodeExecutionInputOutput record for as one of AgentNodeExecution.Input.
|
||||
If there is no AgentNodeExecution that has no `input_name` as input, create new one.
|
||||
|
||||
Returns:
|
||||
The id of the created or existing AgentNodeExecution.
|
||||
"""
|
||||
existing_execution = await AgentNodeExecution.prisma().find_first(
|
||||
where={ # type: ignore
|
||||
"agentNodeId": node_id,
|
||||
"agentGraphExecutionId": graph_exec_id,
|
||||
"Input": {"every": {"name": {"not": input_name}}},
|
||||
},
|
||||
order={"addedTime": "asc"},
|
||||
)
|
||||
    json_data = json.dumps(data)

    if existing_execution:
        print(f"Adding input {input_name}={data} to execution #{existing_execution.id}")
        await AgentNodeExecutionInputOutput.prisma().create(
            data={
                "name": input_name,
                "data": json_data,
                "referencedByInputExecId": existing_execution.id,
            }
        )
        return existing_execution.id

    else:
        print(f"Creating new execution for input {input_name}={data}")
        result = await AgentNodeExecution.prisma().create(
            data={
                "agentNodeId": node_id,
                "agentGraphExecutionId": graph_exec_id,
                "executionStatus": ExecutionStatus.INCOMPLETE,
                "Input": {"create": {"name": input_name, "data": json_data}},
            }
        )
        return result.id


async def upsert_execution_output(
    node_exec_id: str,
    output_name: str,
    output_data: Any,
) -> None:
    """
    Insert an AgentNodeExecutionInputOutput record as one of AgentNodeExecution.Output.
    """
    await AgentNodeExecutionInputOutput.prisma().create(
        data={
            "name": output_name,
            "data": json.dumps(output_data),
            "referencedByOutputExecId": node_exec_id,
        }
    )


async def start_execution(exec_id: str) -> None:
    await AgentNodeExecution.prisma().update(
        where={"id": exec_id},
        data={
            "executionStatus": ExecutionStatus.RUNNING,
            "startTime": datetime.now(),
        },
async def update_execution_status(node_exec_id: str, status: ExecutionStatus) -> None:
    now = datetime.now()
    data = {
        **({"executionStatus": status}),
        **({"queuedTime": now} if status == ExecutionStatus.QUEUED else {}),
        **({"startedTime": now} if status == ExecutionStatus.RUNNING else {}),
        **({"endedTime": now} if status == ExecutionStatus.FAILED else {}),
        **({"endedTime": now} if status == ExecutionStatus.COMPLETED else {}),
    }

    count = await AgentNodeExecution.prisma().update(
        where={"id": node_exec_id},
        data=data  # type: ignore
    )
    if count == 0:
        raise ValueError(f"Execution {node_exec_id} not found.")
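The conditional dict-merge in `update_execution_status` above stamps exactly the timestamp column that matches the new status. A minimal standalone sketch of the same pattern; the `ExecutionStatus` enum here is a stand-in for the real one, and the two `endedTime` branches are condensed into one:

```python
from datetime import datetime
from enum import Enum


class ExecutionStatus(str, Enum):
    # stand-in for the enum in autogpt_server.data.execution
    QUEUED = "QUEUED"
    RUNNING = "RUNNING"
    COMPLETED = "COMPLETED"
    FAILED = "FAILED"


def build_status_update(status: ExecutionStatus) -> dict:
    # only the timestamp matching the new status is included
    now = datetime.now()
    return {
        "executionStatus": status,
        **({"queuedTime": now} if status == ExecutionStatus.QUEUED else {}),
        **({"startedTime": now} if status == ExecutionStatus.RUNNING else {}),
        **(
            {"endedTime": now}
            if status in (ExecutionStatus.FAILED, ExecutionStatus.COMPLETED)
            else {}
        ),
    }
```

Because `**({} if ... else {})` merges nothing when the condition is false, each update touches only the relevant column.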
async def complete_execution(exec_id: str, output: tuple[str, Any]) -> None:
    output_name, output_data = output

    await AgentNodeExecution.prisma().update(
        where={"id": exec_id},
        data={
            "executionStatus": ExecutionStatus.COMPLETED,
            "outputName": output_name,
            "outputData": json.dumps(output_data),
            "endTime": datetime.now(),
        },
    )


async def fail_execution(exec_id: str, error: Exception) -> None:
    await AgentNodeExecution.prisma().update(
        where={"id": exec_id},
        data={
            "executionStatus": ExecutionStatus.FAILED,
            "outputName": "error",
            "outputData": str(error),
            "endTime": datetime.now(),
        },
    )


async def get_executions(run_id: str) -> list[ExecutionResult]:
async def get_executions(graph_exec_id: str) -> list[ExecutionResult]:
    executions = await AgentNodeExecution.prisma().find_many(
        where={"executionId": run_id},
        order={"startTime": "asc"},
        where={"agentGraphExecutionId": graph_exec_id},
        include={"Input": True, "Output": True},
        order={"addedTime": "asc"},
    )
    res = [ExecutionResult.from_db(execution) for execution in executions]
    return res


async def get_node_execution_input(node_exec_id: str) -> dict[str, Any]:
    """
    Get execution node input data from the previous node execution result.

    Returns:
        dictionary of input data, key is the input name, value is the input data.
    """
    execution = await AgentNodeExecution.prisma().find_unique_or_raise(
        where={"id": node_exec_id},
        include={
            "Input": True,
            "AgentNode": True,
        },
    )
    if not execution.AgentNode:
        raise ValueError(f"Node {execution.agentNodeId} not found.")

    exec_input = json.loads(execution.AgentNode.constantInput)
    for input_data in execution.Input or []:
        exec_input[input_data.name] = json.loads(input_data.data)

    return merge_execution_input(exec_input)


SPLIT = "_$_"


def merge_execution_input(data: dict[str, Any]) -> dict[str, Any]:
    list_input = []
    for key, value in data.items():
        if SPLIT not in key:
            continue

        name, index = key.split(SPLIT)
        if not index.isdigit():
            list_input.append((name, value, 0))
        else:
            list_input.append((name, value, int(index)))

    for name, value, _ in sorted(list_input, key=lambda x: x[2]):
        data[name] = data.get(name, [])
        data[name].append(value)

    return data
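The `_$_` convention handled by `merge_execution_input` above packs list inputs into indexed keys (e.g. `items_$_0`, `items_$_1`) and merges them back into a single list, ordered by index. A condensed standalone sketch of that behavior, with the non-digit-index branch folded into one expression:

```python
SPLIT = "_$_"  # same separator constant as in the diff


def merge_execution_input(data: dict) -> dict:
    # collect (name, value, index) triples from keys like "items_$_0"
    list_input = []
    for key, value in data.items():
        if SPLIT not in key:
            continue
        name, index = key.split(SPLIT)
        list_input.append((name, value, int(index) if index.isdigit() else 0))

    # append values under the bare name, in index order
    for name, value, _ in sorted(list_input, key=lambda x: x[2]):
        data[name] = data.get(name, [])
        data[name].append(value)
    return data


merged = merge_execution_input({"items_$_1": "b", "items_$_0": "a", "other": 1})
```

Note that the indexed keys themselves stay in the dict; only the merged list is added under the bare name.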
@@ -1,20 +1,30 @@
import asyncio
import json
import uuid

from typing import Any
from prisma.models import AgentGraph, AgentNode, AgentNodeExecution, AgentNodeLink

from prisma.models import AgentGraph, AgentNode, AgentNodeLink
from pydantic import BaseModel

from autogpt_server.data.db import BaseDbModel


class Link(BaseModel):
    name: str
    node_id: str

    def __init__(self, name: str, node_id: str):
        super().__init__(name=name, node_id=node_id)

    def __iter__(self):
        return iter((self.name, self.node_id))
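`Link.__iter__` above is what lets the rest of the diff write `for name, node_id in node.output_nodes` over a `list[Link]`. A plain-Python sketch of the trick (pydantic omitted for brevity):

```python
class Link:
    def __init__(self, name: str, node_id: str):
        self.name = name
        self.node_id = node_id

    def __iter__(self):
        # yields (name, node_id), so a Link unpacks like a 2-tuple
        return iter((self.name, self.node_id))


links = [Link("out", "node-1"), Link("out", "node-2"), Link("err", "node-3")]
# filter sink nodes for one output name, as enqueue_next_nodes does
next_ids = [nid for name, nid in links if name == "out"]
```

This keeps the edge list readable while still allowing one output name to fan out to multiple sink nodes, which the old `dict[str, str]` representation could not express.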
class Node(BaseDbModel):
    block_id: str
    input_default: dict[str, Any] = {}  # dict[input_name, default_value]
    input_nodes: dict[str, str] = {}  # dict[input_name, node_id]
    # TODO: Make it `dict[str, list[str]]`; an output can be connected to multiple blocks.
    # Another option is an edge list, but that would complicate the rest of the code.
    output_nodes: dict[str, str] = {}  # dict[output_name, node_id]
    input_nodes: list[Link] = []  # list of Link(input_name, node_id)
    output_nodes: list[Link] = []  # list of Link(output_name, node_id)
    metadata: dict[str, Any] = {}

    @staticmethod
@@ -26,14 +36,20 @@ class Node(BaseDbModel):
            id=node.id,
            block_id=node.AgentBlock.id,
            input_default=json.loads(node.constantInput),
            input_nodes={v.sinkName: v.agentNodeSourceId for v in node.Input or []},
            output_nodes={v.sourceName: v.agentNodeSinkId for v in node.Output or []},
            input_nodes=[
                Link(v.sinkName, v.agentNodeSourceId)
                for v in node.Input or []
            ],
            output_nodes=[
                Link(v.sourceName, v.agentNodeSinkId)
                for v in node.Output or []
            ],
            metadata=json.loads(node.metadata),
        )

    def connect(self, node: "Node", source_name: str, sink_name: str):
        self.output_nodes[source_name] = node.id
        node.input_nodes[sink_name] = self.id
        self.output_nodes.append(Link(source_name, node.id))
        node.input_nodes.append(Link(sink_name, self.id))


class Graph(BaseDbModel):
@@ -85,41 +101,7 @@ async def get_graph(graph_id: str) -> Graph | None:
    return Graph.from_db(graph) if graph else None


async def get_node_input(node: Node, exec_id: str) -> dict[str, Any]:
    """
    Get execution node input data from the previous node execution result.
    Args:
        node: The execution node.
        exec_id: The execution ID.
    Returns:
        dictionary of input data, key is the input name, value is the input data.
    """
    query = await AgentNodeExecution.prisma().find_many(
        where={  # type: ignore
            "executionId": exec_id,
            "agentNodeId": {"in": list(node.input_nodes.values())},
            "executionStatus": "COMPLETED",
        },
        distinct=["agentNodeId"],  # type: ignore
        order={"creationTime": "desc"},
    )

    latest_executions: dict[str, AgentNodeExecution] = {
        execution.agentNodeId: execution for execution in query
    }

    return {
        **node.input_default,
        **{
            name: json.loads(latest_executions[node_id].outputData or "{}")
            for name, node_id in node.input_nodes.items()
            if node_id in latest_executions and latest_executions[node_id].outputData
        },
    }


async def create_graph(graph: Graph) -> Graph:

    await AgentGraph.prisma().create(
        data={
            "id": graph.id,
@@ -142,12 +124,12 @@ async def create_graph(graph: Graph) -> Graph:
    edge_source_names = {
        (source_node.id, sink_node_id): output_name
        for source_node in graph.nodes
        for output_name, sink_node_id in source_node.output_nodes.items()
        for output_name, sink_node_id in source_node.output_nodes
    }
    edge_sink_names = {
        (source_node_id, sink_node.id): input_name
        for sink_node in graph.nodes
        for input_name, source_node_id in sink_node.input_nodes.items()
        for input_name, source_node_id in sink_node.input_nodes
    }

    # TODO: replace bulk creation using create_many
@@ -2,14 +2,13 @@ import json
from datetime import datetime
from typing import Optional, Any

from prisma.models import AgentExecutionSchedule
from prisma.models import AgentGraphExecutionSchedule

from autogpt_server.data.db import BaseDbModel


class ExecutionSchedule(BaseDbModel):
    id: str
    agent_id: str
    graph_id: str
    schedule: str
    is_enabled: bool
    input_data: dict[str, Any]
@@ -25,10 +24,10 @@ class ExecutionSchedule(BaseDbModel):
        super().__init__(is_enabled=is_enabled, **kwargs)

    @staticmethod
    def from_db(schedule: AgentExecutionSchedule):
    def from_db(schedule: AgentGraphExecutionSchedule):
        return ExecutionSchedule(
            id=schedule.id,
            agent_id=schedule.agentGraphId,
            graph_id=schedule.agentGraphId,
            schedule=schedule.schedule,
            is_enabled=schedule.isEnabled,
            last_updated=schedule.lastUpdated.replace(tzinfo=None),
@@ -37,7 +36,7 @@ class ExecutionSchedule(BaseDbModel):


async def get_active_schedules(last_fetch_time: datetime) -> list[ExecutionSchedule]:
    query = AgentExecutionSchedule.prisma().find_many(
    query = AgentGraphExecutionSchedule.prisma().find_many(
        where={
            "isEnabled": True,
            "lastUpdated": {"gt": last_fetch_time}
@@ -51,17 +50,17 @@ async def get_active_schedules(last_fetch_time: datetime) -> list[ExecutionSched


async def disable_schedule(schedule_id: str):
    await AgentExecutionSchedule.prisma().update(
    await AgentGraphExecutionSchedule.prisma().update(
        where={"id": schedule_id},
        data={"isEnabled": False}
    )


async def get_schedules(agent_id: str) -> list[ExecutionSchedule]:
    query = AgentExecutionSchedule.prisma().find_many(
async def get_schedules(graph_id: str) -> list[ExecutionSchedule]:
    query = AgentGraphExecutionSchedule.prisma().find_many(
        where={
            "isEnabled": True,
            "agentGraphId": agent_id,
            "agentGraphId": graph_id,
        },
    )
    return [
@@ -70,20 +69,21 @@ async def get_schedules(agent_id: str) -> list[ExecutionSchedule]:
    ]


async def add_schedule(schedule: ExecutionSchedule):
    await AgentExecutionSchedule.prisma().create(
async def add_schedule(schedule: ExecutionSchedule) -> ExecutionSchedule:
    obj = await AgentGraphExecutionSchedule.prisma().create(
        data={
            "id": schedule.id,
            "agentGraphId": schedule.agent_id,
            "agentGraphId": schedule.graph_id,
            "schedule": schedule.schedule,
            "isEnabled": schedule.is_enabled,
            "inputData": json.dumps(schedule.input_data),
        }
    )
    return ExecutionSchedule.from_db(obj)


async def update_schedule(schedule_id: str, is_enabled: bool):
    await AgentExecutionSchedule.prisma().update(
    await AgentGraphExecutionSchedule.prisma().update(
        where={"id": schedule_id},
        data={"isEnabled": is_enabled}
    )
@@ -1,30 +1,36 @@
import asyncio
import logging
import uuid
from concurrent.futures import ProcessPoolExecutor
from typing import Optional, Any
from typing import Any, Coroutine, Generator, TypeVar

from autogpt_server.data import db
from autogpt_server.data.block import Block, get_block
from autogpt_server.data.graph import Node, get_node, get_node_input, get_graph
from autogpt_server.data.execution import (
    Execution,
    create_graph_execution,
    get_node_execution_input,
    merge_execution_input,
    update_execution_status as execution_update,
    upsert_execution_output,
    upsert_execution_input,
    NodeExecution as Execution,
    ExecutionStatus,
    ExecutionQueue,
    enqueue_execution,
    complete_execution,
    fail_execution,
    start_execution,
)
from autogpt_server.data.graph import Node, get_node, get_graph
from autogpt_server.util.service import AppService, expose

logger = logging.getLogger(__name__)


def get_log_prefix(run_id: str, exec_id: str, block_name: str = "-"):
    return f"[ExecutionManager] [graph-{run_id}|node-{exec_id}|{block_name}]"
def get_log_prefix(graph_eid: str, node_eid: str, block_name: str = "-"):
    return f"[ExecutionManager] [graph-{graph_eid}|node-{node_eid}|{block_name}]"


def execute_node(loop: asyncio.AbstractEventLoop, data: Execution) -> Execution | None:
T = TypeVar("T")
ExecutionStream = Generator[Execution, None, None]


def execute_node(loop: asyncio.AbstractEventLoop, data: Execution) -> ExecutionStream:
    """
    Execute a node in the graph. This will trigger a block execution on a node,
    persist the execution result, and return the subsequent node to be executed.
@@ -36,57 +42,103 @@ def execute_node(loop: asyncio.AbstractEventLoop, data: Execution) -> Execution
    Returns:
        The subsequent node to be enqueued, or None if there is no subsequent node.
    """
    run_id = data.run_id
    exec_id = data.id
    graph_exec_id = data.graph_exec_id
    node_exec_id = data.node_exec_id
    exec_data = data.data
    node_id = data.node_id

    asyncio.set_event_loop(loop)
    wait = lambda f: loop.run_until_complete(f)

    node: Optional[Node] = wait(get_node(node_id))
    def wait(f: Coroutine[T, Any, T]) -> T:
        return loop.run_until_complete(f)

    node = wait(get_node(node_id))
    if not node:
        logger.error(f"Node {node_id} not found.")
        return None
        return

    node_block: Optional[Block] = wait(get_block(node.block_id))
    node_block = wait(get_block(node.block_id))
    if not node_block:
        logger.error(f"Block {node.block_id} not found.")
        return None
        return

    # Execute the node
    prefix = get_log_prefix(run_id, exec_id, node_block.name)
    prefix = get_log_prefix(graph_exec_id, node_exec_id, node_block.name)
    logger.warning(f"{prefix} execute with input:\n`{exec_data}`")
    wait(start_execution(exec_id))
    wait(execution_update(node_exec_id, ExecutionStatus.RUNNING))

    try:
        output_name, output_data = node_block.execute(exec_data)
        logger.warning(f"{prefix} executed with output [{output_name}]:`{output_data}`")
        wait(complete_execution(exec_id, (output_name, output_data)))
        for output_name, output_data in node_block.execute(exec_data):
            logger.warning(f"{prefix} Executed, output [{output_name}]:`{output_data}`")
            wait(execution_update(node_exec_id, ExecutionStatus.COMPLETED))
            wait(upsert_execution_output(node_exec_id, output_name, output_data))

            for execution in enqueue_next_nodes(
                loop, node, output_name, output_data, graph_exec_id
            ):
                yield execution
    except Exception as e:
        logger.exception(f"{prefix} failed with error: %s", e)
        wait(fail_execution(exec_id, e))
        error_msg = f"{e.__class__.__name__}: {e}"
        logger.exception(f"{prefix} failed with error. `%s`", error_msg)
        wait(execution_update(node_exec_id, ExecutionStatus.FAILED))
        wait(upsert_execution_output(node_exec_id, "error", error_msg))
        raise e


def enqueue_next_nodes(
    loop: asyncio.AbstractEventLoop,
    node: Node,
    output_name: str,
    output_data: Any,
    graph_exec_id: str,
) -> list[Execution]:
    def wait(f: Coroutine[T, Any, T]) -> T:
        return loop.run_until_complete(f)

    prefix = get_log_prefix(graph_exec_id, node.id)
    node_id = node.id

    # Try to enqueue next eligible nodes
    if output_name not in node.output_nodes:
    next_node_ids = [nid for name, nid in node.output_nodes if name == output_name]
    if not next_node_ids:
        logger.error(f"{prefix} Output [{output_name}] has no subsequent node.")
        return None
        return []

    next_node_id = node.output_nodes[output_name]
    next_node: Optional[Node] = wait(get_node(next_node_id))
    if not next_node:
        logger.error(f"{prefix} Error, next node {next_node_id} not found.")
        return None
    def validate_node_execution(next_node_id: str):
        next_node = wait(get_node(next_node_id))
        if not next_node:
            logger.error(f"{prefix} Error, next node {next_node_id} not found.")
            return

    next_node_input: dict[str, Any] = wait(get_node_input(next_node, run_id))
    is_valid, validation_resp = wait(validate_exec(next_node, next_node_input))
    if not is_valid:
        logger.warning(f"{prefix} Skipped {next_node_id}: {validation_resp}")
        return None
        next_node_input_name = next(
            name for name, nid in next_node.input_nodes if nid == node_id
        )
        next_node_exec_id = wait(upsert_execution_input(
            node_id=next_node_id,
            graph_exec_id=graph_exec_id,
            input_name=next_node_input_name,
            data=output_data
        ))

    logger.warning(f"{prefix} Enqueue next node {next_node_id}-{validation_resp}")
    return Execution(run_id=run_id, node_id=next_node_id, data=next_node_input)
        next_node_input = wait(get_node_execution_input(next_node_exec_id))
        is_valid, validation_resp = wait(validate_exec(next_node, next_node_input))
        if not is_valid:
            logger.warning(f"{prefix} Skipped {next_node_id}: {validation_resp}")
            return

        logger.warning(f"{prefix} Enqueue next node {next_node_id}-{validation_resp}")
        return Execution(
            graph_exec_id=graph_exec_id,
            node_exec_id=next_node_exec_id,
            node_id=next_node_id,
            data=next_node_input
        )

    executions = []
    for nid in next_node_ids:
        if execution := validate_node_execution(nid):
            executions.append(execution)
    return executions


async def validate_exec(node: Node, data: dict[str, Any]) -> tuple[bool, str]:
@@ -101,14 +153,20 @@ async def validate_exec(node: Node, data: dict[str, Any]) -> tuple[bool, str]:
        A tuple of a boolean indicating if the data is valid, and a message if not.
        Return the executed block name if the data is valid.
    """
    node_block: Block | None = await(get_block(node.block_id))
    node_block: Block | None = await get_block(node.block_id)
    if not node_block:
        return False, f"Block for {node.block_id} not found."

    if not set(node.input_nodes).issubset(data):
        return False, f"Input data missing: {set(node.input_nodes) - set(data)}"
    input_fields_from_schema = node_block.input_schema.get_required_fields()
    if not input_fields_from_schema.issubset(data):
        return False, f"Input data missing: {input_fields_from_schema - set(data)}"

    input_fields_from_nodes = {name for name, _ in node.input_nodes}
    if not input_fields_from_nodes.issubset(data):
        return False, f"Input data missing: {input_fields_from_nodes - set(data)}"

    if error := node_block.input_schema.validate_data(data):
        logger.error("Input value doesn't match schema: %s", error)
        return False, f"Input data doesn't match {node_block.name}: {error}"

    return True, node_block.name
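The new `validate_exec` above checks two sources of required inputs before a node may run: the block schema's required fields and the names wired in via links. A simplified sketch of that set logic (the function and parameter names here are illustrative, not the real API):

```python
def check_required_inputs(
    schema_fields: set[str], link_names: set[str], data: dict
) -> tuple[bool, str]:
    # fields the block's input schema declares as required
    if missing := schema_fields - set(data):
        return False, f"Input data missing: {missing}"
    # names that upstream nodes are wired to supply
    if missing := link_names - set(data):
        return False, f"Input data missing: {missing}"
    return True, "ok"
```

Both checks must pass; a node with a satisfied schema but an unfilled link input is still skipped.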
@@ -123,16 +181,16 @@ class Executor:
        cls.loop.run_until_complete(db.connect())

    @classmethod
    def on_start_execution(cls, data: Execution) -> Optional[Execution | None]:
        """
        A synchronous version of `execute_node`, to be used in the ProcessPoolExecutor.
        """
        prefix = get_log_prefix(data.run_id, data.id)
    def on_start_execution(cls, q: ExecutionQueue, data: Execution) -> bool:
        prefix = get_log_prefix(data.graph_exec_id, data.node_exec_id)
        try:
            logger.warning(f"{prefix} Start execution")
            return execute_node(cls.loop, data)
            for execution in execute_node(cls.loop, data):
                q.add(execution)
            return True
        except Exception as e:
            logger.error(f"{prefix} Error: {e}")
            logger.exception(f"{prefix} Error: {e}")
            return False


class ExecutionManager(AppService):
@@ -142,59 +200,63 @@ class ExecutionManager(AppService):
        self.queue = ExecutionQueue()

    def run_service(self):
        def on_complete_execution(f: asyncio.Future[Execution | None]):
            exception = f.exception()
            if exception:
                logger.exception("Error during execution!! %s", exception)
                return exception

            execution = f.result()
            if execution:
                return self.add_node_execution(execution)

            return None

        with ProcessPoolExecutor(
            max_workers=self.pool_size,
            initializer=Executor.on_executor_start,
        ) as executor:
            logger.warning(f"Execution manager started with {self.pool_size} workers.")
            while True:
                future = executor.submit(
                executor.submit(
                    Executor.on_start_execution,
                    self.queue.get()
                    self.queue,
                    self.queue.get(),
                )
                future.add_done_callback(on_complete_execution)  # type: ignore

    @expose
    def add_execution(self, graph_id: str, data: dict[str, Any]) -> dict:
        run_id = str(uuid.uuid4())

        agent = self.run_and_wait(get_graph(graph_id))
        if not agent:
            raise Exception(f"Agent #{graph_id} not found.")
        graph = self.run_and_wait(get_graph(graph_id))
        if not graph:
            raise Exception(f"Graph #{graph_id} not found.")

        # Currently, there is no constraint on the number of root nodes in the graph.
        for node in agent.starting_nodes:
            valid, error = self.run_and_wait(validate_exec(node, data))
        for node in graph.starting_nodes:
            input_data = merge_execution_input({**node.input_default, **data})
            valid, error = self.run_and_wait(validate_exec(node, input_data))
            if not valid:
                raise Exception(error)

        graph_exec_id, node_execs = self.run_and_wait(create_graph_execution(
            graph_id=graph_id,
            node_ids=[node.id for node in graph.starting_nodes],
            data=data
        ))

        executions = []
        for node in agent.starting_nodes:
            exec_id = self.add_node_execution(
                Execution(run_id=run_id, node_id=node.id, data=data)
        for node_exec in node_execs:
            input_data = self.run_and_wait(
                get_node_execution_input(node_exec.node_exec_id)
            )
            self.add_node_execution(
                Execution(
                    graph_exec_id=node_exec.graph_exec_id,
                    node_exec_id=node_exec.node_exec_id,
                    node_id=node_exec.node_id,
                    data=input_data,
                )
            )
            executions.append({
                "exec_id": exec_id,
                "node_id": node.id,
                "id": node_exec.node_exec_id,
                "node_id": node_exec.node_id,
            })

        return {
            "run_id": run_id,
            "id": graph_exec_id,
            "executions": executions,
        }

    def add_node_execution(self, execution: Execution) -> Execution:
        self.run_and_wait(enqueue_execution(execution))
        self.run_and_wait(execution_update(
            execution.node_exec_id,
            ExecutionStatus.QUEUED
        ))
        return self.queue.add(execution)

@@ -45,20 +45,20 @@ class ExecutionScheduler(AppService):

        log(f"Adding recurring job {schedule.id}: {schedule.schedule}")
        scheduler.add_job(
            self.__execute_agent,
            self.__execute_graph,
            CronTrigger.from_crontab(schedule.schedule),
            id=schedule.id,
            args=[schedule.agent_id, schedule.input_data],
            args=[schedule.graph_id, schedule.input_data],
            replace_existing=True,
        )

    def __execute_agent(self, agent_id: str, input_data: dict):
    def __execute_graph(self, graph_id: str, input_data: dict):
        try:
            log(f"Executing recurring job for agent #{agent_id}")
            log(f"Executing recurring job for graph #{graph_id}")
            execution_manager = self.execution_manager_client
            execution_manager.add_execution(agent_id, input_data)
            execution_manager.add_execution(graph_id, input_data)
        except Exception as e:
            logger.error(f"Error executing agent {agent_id}: {e}")
            logger.exception(f"Error executing graph {graph_id}: {e}")

    @expose
    def update_schedule(self, schedule_id: str, is_enabled: bool) -> str:
@@ -66,17 +66,16 @@ class ExecutionScheduler(AppService):
        return schedule_id

    @expose
    def add_execution_schedule(self, agent_id: str, cron: str, input_data: dict) -> str:
    def add_execution_schedule(self, graph_id: str, cron: str, input_data: dict) -> str:
        schedule = model.ExecutionSchedule(
            agent_id=agent_id,
            graph_id=graph_id,
            schedule=cron,
            input_data=input_data,
        )
        self.run_and_wait(model.add_schedule(schedule))
        return schedule.id
        return self.run_and_wait(model.add_schedule(schedule)).id

    @expose
    def get_execution_schedules(self, agent_id: str) -> dict[str, str]:
        query = model.get_schedules(agent_id)
    def get_execution_schedules(self, graph_id: str) -> dict[str, str]:
        query = model.get_schedules(graph_id)
        schedules: list[model.ExecutionSchedule] = self.run_and_wait(query)
        return {v.id: v.schedule for v in schedules}
@@ -1,13 +1,25 @@
|
||||
from typing import Annotated, Any, Dict
|
||||
import uuid
|
||||
from fastapi.responses import JSONResponse
|
||||
from fastapi.staticfiles import StaticFiles
|
||||
import uvicorn
|
||||
|
||||
from contextlib import asynccontextmanager
|
||||
from fastapi import APIRouter, FastAPI, HTTPException
|
||||
from fastapi import APIRouter, Body, FastAPI, HTTPException
|
||||
|
||||
from autogpt_server.data import db, execution, graph, block
|
||||
from autogpt_server.data import db, execution, block
|
||||
from autogpt_server.data.graph import (
|
||||
create_graph,
|
||||
get_graph,
|
||||
get_graph_ids,
|
||||
Graph,
|
||||
Link,
|
||||
)
|
||||
from autogpt_server.executor import ExecutionManager, ExecutionScheduler
|
||||
from autogpt_server.util.data import get_frontend_path
|
||||
from autogpt_server.util.process import AppProcess
|
||||
from autogpt_server.util.service import get_service_client
|
||||
from autogpt_server.util.settings import Settings
|
||||
|
||||
|
||||
class AgentServer(AppProcess):
|
||||
@@ -34,49 +46,60 @@ class AgentServer(AppProcess):
|
||||
router = APIRouter()
|
||||
router.add_api_route(
|
||||
path="/blocks",
|
||||
endpoint=self.get_agent_blocks,
|
||||
endpoint=self.get_graph_blocks,
|
||||
methods=["GET"],
|
||||
)
|
||||
router.add_api_route(
|
||||
path="/agents",
|
||||
endpoint=self.get_agents,
|
||||
path="/graphs",
|
||||
endpoint=self.get_graphs,
|
||||
methods=["GET"],
|
||||
)
|
||||
router.add_api_route(
|
||||
path="/agents/{agent_id}",
|
||||
endpoint=self.get_agent,
|
||||
path="/graphs/{graph_id}",
|
||||
endpoint=self.get_graph,
|
||||
methods=["GET"],
|
||||
)
|
||||
router.add_api_route(
|
||||
path="/agents",
|
||||
endpoint=self.create_agent,
|
||||
path="/graphs",
|
||||
endpoint=self.create_new_graph,
|
||||
methods=["POST"],
|
||||
)
|
||||
router.add_api_route(
|
||||
path="/agents/{agent_id}/execute",
|
||||
endpoint=self.execute_agent,
|
||||
path="/graphs/{graph_id}/execute",
|
||||
endpoint=self.execute_graph,
|
||||
methods=["POST"],
|
||||
)
|
||||
router.add_api_route(
|
||||
path="/agents/{agent_id}/executions/{run_id}",
|
||||
path="/graphs/{graph_id}/executions/{run_id}",
|
||||
endpoint=self.get_executions,
|
||||
methods=["GET"],
|
||||
)
|
||||
router.add_api_route(
|
||||
path="/agents/{agent_id}/schedules",
|
||||
endpoint=self.schedule_agent,
|
||||
path="/graphs/{graph_id}/schedules",
|
||||
endpoint=self.create_schedule,
|
||||
methods=["POST"],
|
||||
)
|
||||
router.add_api_route(
|
||||
path="/agents/{agent_id}/schedules",
|
||||
path="/graphs/{graph_id}/schedules",
|
||||
endpoint=self.get_execution_schedules,
|
||||
methods=["GET"],
|
||||
)
|
||||
router.add_api_route(
|
||||
path="/agents/schedules/{schedule_id}",
|
||||
path="/graphs/schedules/{schedule_id}",
|
||||
endpoint=self.update_schedule,
|
||||
methods=["PUT"],
|
||||
)
|
||||
router.add_api_route(
|
||||
path="/settings",
|
||||
endpoint=self.update_configuration,
|
||||
methods=["POST"],
|
||||
)
|
||||
|
||||
app.mount(
|
||||
path="/frontend",
|
||||
app=StaticFiles(directory=get_frontend_path(), html=True),
|
||||
name="example_files",
|
||||
)
|
||||
|
||||
app.include_router(router)
|
||||
uvicorn.run(app, host="0.0.0.0", port=8000)
|
||||
@@ -89,52 +112,52 @@ class AgentServer(AppProcess):
|
||||
def execution_scheduler_client(self) -> ExecutionScheduler:
|
||||
return get_service_client(ExecutionScheduler)
|
||||
|
||||
async def get_agent_blocks(self) -> list[dict]:
|
||||
async def get_graph_blocks(self) -> list[dict]:
|
||||
return [v.to_dict() for v in await block.get_blocks()]
|
||||
|
||||
async def get_agents(self) -> list[str]:
|
||||
return await graph.get_graph_ids()
|
||||
async def get_graphs(self) -> list[str]:
|
||||
return await get_graph_ids()
|
||||
|
||||
async def get_agent(self, agent_id: str) -> graph.Graph:
|
||||
agent = await graph.get_graph(agent_id)
|
||||
if not agent:
|
||||
raise HTTPException(status_code=404, detail=f"Agent #{agent_id} not found.")
|
||||
async def get_graph(self, graph_id: str) -> Graph:
|
||||
graph = await get_graph(graph_id)
|
||||
if not graph:
|
||||
raise HTTPException(status_code=404, detail=f"Graph #{graph_id} not found.")
|
||||
return graph
|
||||
|
||||
return agent
|
||||
|
||||
async def create_agent(self, agent: graph.Graph) -> graph.Graph:
|
||||
agent.id = str(uuid.uuid4())
|
||||
|
||||
id_map = {node.id: str(uuid.uuid4()) for node in agent.nodes}
|
||||
for node in agent.nodes:
|
||||
async def create_new_graph(self, graph: Graph) -> Graph:
|
||||
# TODO: replace uuid generation here to DB generated uuids.
|
||||
graph.id = str(uuid.uuid4())
|
||||
id_map = {node.id: str(uuid.uuid4()) for node in graph.nodes}
|
||||
for node in graph.nodes:
|
||||
node.id = id_map[node.id]
|
||||
node.input_nodes = {k: id_map[v] for k, v in node.input_nodes.items()}
node.output_nodes = {k: id_map[v] for k, v in node.output_nodes.items()}
node.input_nodes = [Link(k, id_map[v]) for k, v in node.input_nodes]
node.output_nodes = [Link(k, id_map[v]) for k, v in node.output_nodes]

return await graph.create_graph(agent)
return await create_graph(graph)

async def execute_agent(self, agent_id: str, node_input: dict) -> dict:
async def execute_graph(self, graph_id: str, node_input: dict) -> dict:
try:
return self.execution_manager_client.add_execution(agent_id, node_input)
return self.execution_manager_client.add_execution(graph_id, node_input)
except Exception as e:
msg = e.__str__().encode().decode('unicode_escape')
msg = e.__str__().encode().decode("unicode_escape")
raise HTTPException(status_code=400, detail=msg)

async def get_executions(
self, agent_id: str, run_id: str) -> list[execution.ExecutionResult]:
agent = await graph.get_graph(agent_id)
if not agent:
raise HTTPException(status_code=404, detail=f"Agent #{agent_id} not found.")
self, graph_id: str, run_id: str
) -> list[execution.ExecutionResult]:
graph = await get_graph(graph_id)
if not graph:
raise HTTPException(status_code=404, detail=f"Agent #{graph_id} not found.")

return await execution.get_executions(run_id)

async def schedule_agent(self, agent_id: str, cron: str, input_data: dict) -> dict:
agent = await graph.get_graph(agent_id)
if not agent:
raise HTTPException(status_code=404, detail=f"Agent #{agent_id} not found.")
async def create_schedule(self, graph_id: str, cron: str, input_data: dict) -> dict:
graph = await get_graph(graph_id)
if not graph:
raise HTTPException(status_code=404, detail=f"Graph #{graph_id} not found.")
execution_scheduler = self.execution_scheduler_client
return {
"id": execution_scheduler.add_execution_schedule(agent_id, cron, input_data)
"id": execution_scheduler.add_execution_schedule(graph_id, cron, input_data)
}

def update_schedule(self, schedule_id: str, input_data: dict) -> dict:
@@ -143,6 +166,34 @@ class AgentServer(AppProcess):
execution_scheduler.update_schedule(schedule_id, is_enabled)
return {"id": schedule_id}

def get_execution_schedules(self, agent_id: str) -> dict[str, str]:
def get_execution_schedules(self, graph_id: str) -> dict[str, str]:
execution_scheduler = self.execution_scheduler_client
return execution_scheduler.get_execution_schedules(agent_id)
return execution_scheduler.get_execution_schedules(graph_id)

def update_configuration(
self,
updated_settings: Annotated[
Dict[str, Any], Body(examples=[{"config": {"num_workers": 10}}])
],
):
settings = Settings()
try:
updated_fields = {"config": [], "secrets": []}
for key, value in updated_settings.get("config", {}).items():
if hasattr(settings.config, key):
setattr(settings.config, key, value)
updated_fields["config"].append(key)
for key, value in updated_settings.get("secrets", {}).items():
if hasattr(settings.secrets, key):
setattr(settings.secrets, key, value)
updated_fields["secrets"].append(key)
settings.save()
return JSONResponse(
content={
"message": "Settings updated successfully",
"updated_fields": updated_fields,
},
status_code=200,
)
except Exception as e:
raise HTTPException(status_code=400, detail=str(e))
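The `update_configuration` handler accepts a body with optional `config` and `secrets` maps and applies only the keys that already exist on the settings objects. A minimal standalone sketch of that filtering logic (the `FakeConfig`/`FakeSecrets` classes and `apply_updates` helper are hypothetical stand-ins, not part of the codebase):

```python
# Standalone sketch of update_configuration's filtering: only keys that
# already exist on the settings objects are applied and reported back.
class FakeConfig:
    num_workers = 9

class FakeSecrets:
    database_password = ""

def apply_updates(updated_settings: dict, config, secrets) -> dict:
    updated_fields = {"config": [], "secrets": []}
    for key, value in updated_settings.get("config", {}).items():
        if hasattr(config, key):
            setattr(config, key, value)
            updated_fields["config"].append(key)
    for key, value in updated_settings.get("secrets", {}).items():
        if hasattr(secrets, key):
            setattr(secrets, key, value)
            updated_fields["secrets"].append(key)
    return updated_fields

payload = {
    "config": {"num_workers": 10, "bogus_key": 1},  # unknown keys are ignored
    "secrets": {"database_password": "s3cr3t"},
}
updated = apply_updates(payload, FakeConfig(), FakeSecrets())
assert updated == {"config": ["num_workers"], "secrets": ["database_password"]}
```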
35
rnd/autogpt_server/autogpt_server/util/data.py
Normal file
@@ -0,0 +1,35 @@
import os
import pathlib
import sys


def get_secrets_path() -> pathlib.Path:
    return get_data_path() / "secrets"


def get_config_path() -> pathlib.Path:
    return get_data_path()


def get_frontend_path() -> pathlib.Path:
    if getattr(sys, "frozen", False):
        # The application is frozen
        datadir = pathlib.Path(os.path.dirname(sys.executable)) / "example_files"
    else:
        # The application is not frozen
        # Change this bit to match where you store your data files:
        filedir = os.path.dirname(__file__)
        datadir = pathlib.Path(filedir).parent.parent.parent / "example_files"
    return pathlib.Path(datadir)


def get_data_path() -> pathlib.Path:
    if getattr(sys, "frozen", False):
        # The application is frozen
        datadir = os.path.dirname(sys.executable)
    else:
        # The application is not frozen
        # Change this bit to match where you store your data files:
        filedir = os.path.dirname(__file__)
        datadir = pathlib.Path(filedir).parent.parent
    return pathlib.Path(datadir)
@@ -2,7 +2,6 @@ import os
import sys
from abc import ABC, abstractmethod
from multiprocessing import Process, freeze_support, set_start_method
from multiprocessing.spawn import freeze_support as freeze_support_spawn
from typing import Optional

@@ -10,10 +9,9 @@ class AppProcess(ABC):
"""
A class to represent an object that can be executed in a background process.
"""

process: Optional[Process] = None
set_start_method('spawn', force=True)
freeze_support()
freeze_support_spawn()
set_start_method("spawn", force=True)

@abstractmethod
def run(self):
@@ -15,6 +15,7 @@ from autogpt_server.util.process import AppProcess

logger = logging.getLogger(__name__)
conn_retry = retry(stop=stop_after_delay(5), wait=wait_exponential(multiplier=0.1))
T = TypeVar("T")


def expose(func: Callable) -> Callable:
@@ -23,7 +24,7 @@ def expose(func: Callable) -> Callable:
return func(*args, **kwargs)
except Exception as e:
msg = f"Error in {func.__name__}: {e.__str__()}"
logger.error(msg)
logger.exception(msg)
raise Exception(msg, e)

return pyro.expose(wrapper)
@@ -51,10 +52,10 @@ class AppService(AppProcess):
while True:
time.sleep(10)

def run_async(self, coro: Coroutine):
def run_async(self, coro: Coroutine[T, Any, T]):
return asyncio.run_coroutine_threadsafe(coro, self.shared_event_loop)

def run_and_wait(self, coro: Coroutine):
def run_and_wait(self, coro: Coroutine[T, Any, T]) -> T:
future = self.run_async(coro)
return future.result()
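The `run_async`/`run_and_wait` pair above submits coroutines onto `AppService`'s shared event loop and blocks on the resulting `Future`. A self-contained sketch of that pattern, assuming only a background thread running a plain asyncio loop (the `add` coroutine is illustrative):

```python
import asyncio
import threading

# A shared event loop running in a daemon thread, as AppService keeps one.
loop = asyncio.new_event_loop()
thread = threading.Thread(target=loop.run_forever, daemon=True)
thread.start()

async def add(a: int, b: int) -> int:
    return a + b

# Equivalent of run_and_wait(add(2, 3)): schedule thread-safely, then block.
future = asyncio.run_coroutine_threadsafe(add(2, 3), loop)
result = future.result()  # blocks until the coroutine finishes
assert result == 5

# Shut the loop down cleanly.
loop.call_soon_threadsafe(loop.stop)
```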
108
rnd/autogpt_server/autogpt_server/util/settings.py
Normal file
@@ -0,0 +1,108 @@
import json
import os
from typing import Any, Dict, Generic, Set, Tuple, Type, TypeVar

from pydantic import BaseModel, Field, PrivateAttr
from pydantic_settings import (
    BaseSettings,
    JsonConfigSettingsSource,
    PydanticBaseSettingsSource,
    SettingsConfigDict,
)

from autogpt_server.util.data import get_config_path, get_data_path, get_secrets_path

T = TypeVar("T", bound=BaseSettings)


class UpdateTrackingModel(BaseModel, Generic[T]):
    _updated_fields: Set[str] = PrivateAttr(default_factory=set)

    def __setattr__(self, name: str, value) -> None:
        if name in self.model_fields:
            self._updated_fields.add(name)
        super().__setattr__(name, value)

    def mark_updated(self, field_name: str) -> None:
        if field_name in self.model_fields:
            self._updated_fields.add(field_name)

    def clear_updates(self) -> None:
        self._updated_fields.clear()

    def get_updates(self) -> Dict[str, Any]:
        return {field: getattr(self, field) for field in self._updated_fields}


class Config(UpdateTrackingModel["Config"], BaseSettings):
    """Config for the server."""

    num_workers: int = Field(
        default=9, ge=1, le=100, description="Number of workers to use for execution."
    )
    # Add more configuration fields as needed

    model_config = SettingsConfigDict(
        json_file=[
            get_config_path() / "config.default.json",
            get_config_path() / "config.json",
        ],
        env_file=".env",
        env_file_encoding="utf-8",
        extra="allow",
    )

    @classmethod
    def settings_customise_sources(
        cls,
        settings_cls: Type[BaseSettings],
        init_settings: PydanticBaseSettingsSource,
        env_settings: PydanticBaseSettingsSource,
        dotenv_settings: PydanticBaseSettingsSource,
        file_secret_settings: PydanticBaseSettingsSource,
    ) -> Tuple[PydanticBaseSettingsSource, ...]:
        return (JsonConfigSettingsSource(settings_cls),)


class Secrets(UpdateTrackingModel["Secrets"], BaseSettings):
    """Secrets for the server."""

    database_password: str = ""
    # Add more secret fields as needed

    model_config = SettingsConfigDict(
        secrets_dir=get_secrets_path(),
        env_file=".env",
        env_file_encoding="utf-8",
        extra="allow",
    )


class Settings(BaseModel):
    config: Config = Config()
    secrets: Secrets = Secrets()

    def save(self) -> None:
        # Save updated config to JSON file
        if self.config._updated_fields:
            config_to_save = self.config.get_updates()
            config_path = os.path.join(get_data_path(), "config.json")
            if os.path.exists(config_path):
                with open(config_path, "r+") as f:
                    existing_config: Dict[str, Any] = json.load(f)
                    existing_config.update(config_to_save)
                    f.seek(0)
                    json.dump(existing_config, f, indent=2)
                    f.truncate()
            else:
                with open(config_path, "w") as f:
                    json.dump(config_to_save, f, indent=2)
            self.config.clear_updates()

        # Save updated secrets to individual files
        secrets_dir = get_secrets_path()
        for key in self.secrets._updated_fields:
            secret_file = os.path.join(secrets_dir, key)
            with open(secret_file, "w") as f:
                f.write(str(getattr(self.secrets, key)))
        self.secrets.clear_updates()
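The core idea of `UpdateTrackingModel` is an overridden `__setattr__` that records which fields were assigned, so `save()` can persist only the changed values. A minimal, dependency-free sketch of that mechanism (illustrative only; the real class builds on pydantic's `BaseModel`, `PrivateAttr`, and `model_fields`):

```python
class UpdateTracking:
    """Track which known fields have been assigned since the last clear."""

    _fields = {"num_workers"}  # stand-in for pydantic's model_fields

    def __init__(self) -> None:
        # Bypass tracking while setting defaults.
        object.__setattr__(self, "_updated_fields", set())
        object.__setattr__(self, "num_workers", 9)

    def __setattr__(self, name, value):
        if name in self._fields:
            self._updated_fields.add(name)
        object.__setattr__(self, name, value)

    def get_updates(self):
        return {f: getattr(self, f) for f in self._updated_fields}

    def clear_updates(self):
        self._updated_fields.clear()


config = UpdateTracking()
assert config.get_updates() == {}          # defaults are not tracked
config.num_workers = 10
assert config.get_updates() == {"num_workers": 10}
config.clear_updates()
assert config.get_updates() == {}
```

This is why `save()` writes only `get_updates()` into `config.json` instead of serializing the whole model.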
3
rnd/autogpt_server/config.default.json
Normal file
@@ -0,0 +1,3 @@
{
  "num_workers": 5
}
149
rnd/autogpt_server/poetry.lock
generated
@@ -1,4 +1,4 @@
# This file is automatically @generated by Poetry 1.7.1 and should not be changed by hand.
# This file is automatically @generated by Poetry 1.8.3 and should not be changed by hand.

[[package]]
name = "annotated-types"
@@ -132,67 +132,33 @@ python-dateutil = "*"
pytz = ">2021.1"

[[package]]
name = "cx-freeze"
version = "7.0.0"
name = "cx_Freeze"
version = "7.1.1"
description = "Create standalone executables from Python scripts"
optional = false
python-versions = ">=3.8"
files = [
{file = "cx_Freeze-7.0.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:421920dbee2b4aab53a81f6c99d18b00baf622a328eae8e489f162154a46192a"},
{file = "cx_Freeze-7.0.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:5fa1ca4cf20c6ce45ce2e26bf8b2086525aaaa774e2ee1b16da4e0f9f18c7272"},
{file = "cx_Freeze-7.0.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a520bc6772325c0e38924da1d827fe370702f8df397f483691b94d36179beef6"},
{file = "cx_Freeze-7.0.0-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:b89ed99a2f99fd2f3e28a91c85bdd75b4bfaf11b04729ba3282bfebdadadf880"},
{file = "cx_Freeze-7.0.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5174821c82826e5a57e43960877087f5af6073e3877b0b38a0be244111fe1a76"},
{file = "cx_Freeze-7.0.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:cfd18cc00f3240b03d5bdf9515d59ace0881b5b5b6f2e7655d857d1fb71f351d"},
{file = "cx_Freeze-7.0.0-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:bac44e65bdfce0839b9a6d15373ea084fda3cdbd902351cde530991b450c2b2d"},
{file = "cx_Freeze-7.0.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:51a374f273d575827efe4f7ed9a88b6cab78abffacb858c829d7cbe4dc4ff56e"},
{file = "cx_Freeze-7.0.0-cp310-cp310-win32.whl", hash = "sha256:6603e6c47a15bd84bfbb20d92dc01d5e586b54928eb618461d2f14305471d570"},
{file = "cx_Freeze-7.0.0-cp310-cp310-win_amd64.whl", hash = "sha256:d7ec847af5afbe3c638a096aae4ff5982a17d95e2fb7975e525ecf9505a185ea"},
{file = "cx_Freeze-7.0.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:04b7a2e5c53f5d537f3d958ebf2b0a0a7cbe8daf980cb0087559a3e698abc582"},
{file = "cx_Freeze-7.0.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:50e7e78001d4f78e70a403ecb5507685854ce1e6c3ff37bec1920eb6f2256534"},
{file = "cx_Freeze-7.0.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2d37ed560e86ca7958684701a6ae7f3300226d0d7c861ca5b90c78bf4c619ad2"},
{file = "cx_Freeze-7.0.0-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:42145dc5c2c7a98c620b30b7e25661954817a13355c50c4219a4a4954b39db39"},
{file = "cx_Freeze-7.0.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6f9034d6f9c10d84d7edc0e4f4020e878de367e83c5877c039aa3c8b733bc318"},
{file = "cx_Freeze-7.0.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:adc6bdba9ff8705745831620efb6ee5eff9ec6d31d9b8c56d2a61d6555299157"},
{file = "cx_Freeze-7.0.0-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:37a3234c0e54b4afd561b47be4f22a6496f9436275fb7b59d90d7a3269fb4d6f"},
{file = "cx_Freeze-7.0.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:83549f9f817cafa59ea2f6e2045c8fe119628458ff14bb732649b01b0a637f6d"},
{file = "cx_Freeze-7.0.0-cp311-cp311-win32.whl", hash = "sha256:c508cd354728367311a7deb5bb616eee441bf79c900e3129a49fd54a372dc223"},
{file = "cx_Freeze-7.0.0-cp311-cp311-win_amd64.whl", hash = "sha256:8fb71d23dba27dc40393a8b460bbf64759899246cd595860f66493cee64f27a5"},
{file = "cx_Freeze-7.0.0-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:648fd0acb439efe22dced2430cbaeca87e5ca9ab315d148933104376cca9553d"},
{file = "cx_Freeze-7.0.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:3125a8408da3ff4b0cf767689d678909f840dfe08633f5f2d3cfe333111dc334"},
{file = "cx_Freeze-7.0.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:07e56b06c7ca0bd2fc37e3783908767dbe1926e1e2609edcaefcc749ab584329"},
{file = "cx_Freeze-7.0.0-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:25531d5c61bb5e974d8a5d042f29a37a786e91c1d6f66e018fc50342a416f4e1"},
{file = "cx_Freeze-7.0.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f97154b4b60f6e1953ebce05803a5e11a35047d097fad60d7c181303b7c6ef6e"},
{file = "cx_Freeze-7.0.0-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:2333db5cfa6db700c79fd45d614d38e9d698f1df2a3c7e21ccbcc63cc8a7a9b7"},
{file = "cx_Freeze-7.0.0-cp312-cp312-musllinux_1_1_ppc64le.whl", hash = "sha256:d45a58e0a9b010e0823c30fb8eb2077560d2bb0f78e4481a55bdb6ad0729f390"},
{file = "cx_Freeze-7.0.0-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:0422dbd426fd9f4f4ec0cadc7e3192d38227464daa3eb215b03eb577cd9a49d4"},
{file = "cx_Freeze-7.0.0-cp312-cp312-win32.whl", hash = "sha256:2018e9cbf8172da09b311cfc3906503ee6ae88665ec77c543013173b2532b731"},
{file = "cx_Freeze-7.0.0-cp312-cp312-win_amd64.whl", hash = "sha256:ae5facd782b220bca6828eb6fb1834540cf431b1a615cc63652641bd070b11e6"},
{file = "cx_Freeze-7.0.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:f06368dd41568572818c4abfcf9b45449dced3fa9f1b5f29e3523ba4ff7fcfbb"},
{file = "cx_Freeze-7.0.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e203d90d8fa1cc4489b15edac7dfdd983518a02999f275897160fc0ecfa98e4c"},
{file = "cx_Freeze-7.0.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:f800b0bc2df14c66fcf2f220ecf273c5942d0b982268d8e5ccc9ef2fa56e576f"},
{file = "cx_Freeze-7.0.0-cp38-cp38-win32.whl", hash = "sha256:c52641ce2484222f4d60f0acbc79b2dfbfb984493101a4806c5af0ad379ebc82"},
{file = "cx_Freeze-7.0.0-cp38-cp38-win_amd64.whl", hash = "sha256:92a15613be3fcc7a310e825c92ae3e83a7e689ade00ce2ea981403e4317c7234"},
{file = "cx_Freeze-7.0.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:60a0f674b6a55fdf46d0cc59122551a79221ceecd038fed8533dcbceb9714435"},
{file = "cx_Freeze-7.0.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bb15314e8395e9658a8a5e4e19558d0e096a68b76c744ba81ebc249061b7dd9e"},
{file = "cx_Freeze-7.0.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:3290127acc67e830265265a911d9018640ffffb7fddb86eacb1e3d83ed4136c4"},
{file = "cx_Freeze-7.0.0-cp39-cp39-win32.whl", hash = "sha256:aa885f2fb29b9f7d9a7d8af223d38d98905484cc2356c474bb1d6fd1704323ad"},
{file = "cx_Freeze-7.0.0-cp39-cp39-win_amd64.whl", hash = "sha256:fe4dbbfd52454c8ddb550f112713ee2ac36cc024303557763b605e35cdb6b9a8"},
{file = "cx_freeze-7.0.0.tar.gz", hash = "sha256:b03f2854a15dd1e8962660d18882a71fefba0e1b6f68337193d4a072d1fc36e6"},
]
files = []
develop = true

[package.dependencies]
cx-Logging = {version = ">=3.1", markers = "sys_platform == \"win32\""}
filelock = {version = ">=3.11.0", markers = "sys_platform == \"linux\""}
cx_Logging = {version = ">=3.1", markers = "sys_platform == \"win32\""}
dmgbuild = {version = ">=1.6.1", markers = "sys_platform == \"darwin\""}
filelock = {version = ">=3.12.3", markers = "sys_platform == \"linux\""}
lief = {version = ">=0.12.0,<0.15.0", markers = "sys_platform == \"win32\""}
patchelf = {version = ">=0.14", markers = "sys_platform == \"linux\" and (platform_machine == \"aarch64\" or platform_machine == \"armv7l\" or platform_machine == \"i686\" or platform_machine == \"ppc64le\" or platform_machine == \"s390x\" or platform_machine == \"x86_64\")"}
setuptools = ">=62.6,<70"
patchelf = {version = ">=0.14", markers = "sys_platform == \"linux\" and (platform_machine == \"x86_64\" or platform_machine == \"i686\" or platform_machine == \"aarch64\" or platform_machine == \"armv7l\" or platform_machine == \"ppc64le\" or platform_machine == \"s390x\")"}
setuptools = ">=65.6.3,<71"
wheel = ">=0.42.0,<=0.43.0"

[package.extras]
dev = ["bump-my-version (==0.20.1)", "cibuildwheel (==2.17.0)", "pre-commit (==3.7.0)"]
doc = ["furo (==2024.1.29)", "myst-parser (==2.0.0)", "sphinx (==7.3.7)", "sphinx-new-tab-link (==0.4.0)", "sphinx-tabs (==3.4.5)"]
test = ["coverage (==7.4.4)", "pluggy (==1.4.0)", "pytest (==8.1.1)", "pytest-cov (==5.0.0)", "pytest-datafiles (==3.0.0)", "pytest-mock (==3.14.0)", "pytest-timeout (==2.3.1)", "pytest-xdist[psutil] (==3.6.0)"]
dev = ["bump-my-version (==0.24.0)", "cibuildwheel (==2.19.1)", "pre-commit (==3.5.0)", "pre-commit (==3.7.1)"]
doc = ["furo (==2024.5.6)", "myst-parser (==3.0.1)", "sphinx (==7.3.7)", "sphinx-new-tab-link (==0.4.0)", "sphinx-tabs (==3.4.5)"]
test = ["coverage (==7.5.4)", "pluggy (==1.5.0)", "pytest (==8.2.2)", "pytest-cov (==5.0.0)", "pytest-datafiles (==3.0.0)", "pytest-mock (==3.14.0)", "pytest-timeout (==2.3.1)", "pytest-xdist[psutil] (==3.6.1)"]

[package.source]
type = "git"
url = "https://github.com/ntindle/cx_Freeze.git"
reference = "main"
resolved_reference = "876fe77c97db749b7b0aed93c12142a7226ee7e4"

[[package]]
name = "cx-logging"
@@ -224,6 +190,46 @@ files = [
{file = "cx_Logging-3.2.0.tar.gz", hash = "sha256:bdbad6d2e6a0cc5bef962a34d7aa1232e88ea9f3541d6e2881675b5e7eab5502"},
]

[[package]]
name = "dmgbuild"
version = "1.6.1"
description = "macOS command line utility to build disk images"
optional = false
python-versions = ">=3.7"
files = [
{file = "dmgbuild-1.6.1-py3-none-any.whl", hash = "sha256:45dba6af4a64872c6a91eb335ebeaf5e1f4f4f39c89fd77cf40e841bd1226166"},
{file = "dmgbuild-1.6.1.tar.gz", hash = "sha256:7ced2603d684e29c22b4cd507d1e15a1907e91b86259924b8cfe480d80553b43"},
]

[package.dependencies]
ds-store = ">=1.1.0"
mac-alias = ">=2.0.1"

[package.extras]
badge-icons = ["pyobjc-framework-Quartz (>=3.0.4)"]
dev = ["pre-commit", "tox"]
docs = ["sphinx", "sphinx-autobuild", "sphinx-rtd-theme"]
test = ["pytest", "pytest-cov", "pytest-tldr"]

[[package]]
name = "ds-store"
version = "1.3.1"
description = "Manipulate Finder .DS_Store files from Python"
optional = false
python-versions = ">=3.7"
files = [
{file = "ds_store-1.3.1-py3-none-any.whl", hash = "sha256:fbacbb0bd5193ab3e66e5a47fff63619f15e374ffbec8ae29744251a6c8f05b5"},
{file = "ds_store-1.3.1.tar.gz", hash = "sha256:c27d413caf13c19acb85d75da4752673f1f38267f9eb6ba81b3b5aa99c2d207c"},
]

[package.dependencies]
mac-alias = ">=2.0.1"

[package.extras]
dev = ["pre-commit", "tox"]
docs = ["sphinx", "sphinx-autobuild", "sphinx-rtd-theme"]
test = ["pytest", "pytest-cov", "pytest-tldr"]

[[package]]
name = "exceptiongroup"
version = "1.2.1"
@@ -506,6 +512,22 @@ files = [
{file = "lief-0.14.1-cp39-cp39-win_amd64.whl", hash = "sha256:2db3eb282a35daf51f89c6509226668a08fb6a6d1f507dd549dd9f077585db11"},
]

[[package]]
name = "mac-alias"
version = "2.2.2"
description = "Generate/parse Mac OS Alias records from Python"
optional = false
python-versions = ">=3.7"
files = [
{file = "mac_alias-2.2.2-py3-none-any.whl", hash = "sha256:504ab8ac546f35bbd75ad014d6ad977c426660aa721f2cd3acf3dc2f664141bd"},
{file = "mac_alias-2.2.2.tar.gz", hash = "sha256:c99c728eb512e955c11f1a6203a0ffa8883b26549e8afe68804031aa5da856b7"},
]

[package.extras]
dev = ["pre-commit", "tox"]
docs = ["sphinx", "sphinx-autobuild", "sphinx-rtd-theme"]
test = ["pytest", "pytest-cov", "pytest-tldr"]

[[package]]
name = "markupsafe"
version = "2.1.5"
@@ -846,6 +868,25 @@ files = [
[package.dependencies]
typing-extensions = ">=4.6.0,<4.7.0 || >4.7.0"

[[package]]
name = "pydantic-settings"
version = "2.3.4"
description = "Settings management using Pydantic"
optional = false
python-versions = ">=3.8"
files = [
{file = "pydantic_settings-2.3.4-py3-none-any.whl", hash = "sha256:11ad8bacb68a045f00e4f862c7a718c8a9ec766aa8fd4c32e39a0594b207b53a"},
{file = "pydantic_settings-2.3.4.tar.gz", hash = "sha256:c5802e3d62b78e82522319bbc9b8f8ffb28ad1c988a99311d04f2a6051fca0a7"},
]

[package.dependencies]
pydantic = ">=2.7.0"
python-dotenv = ">=0.21.0"

[package.extras]
toml = ["tomli (>=2.0.1)"]
yaml = ["pyyaml (>=6.0.1)"]

[[package]]
name = "pyflakes"
version = "3.2.0"
@@ -1631,4 +1672,4 @@ test = ["pytest (>=6.0.0)", "setuptools (>=65)"]
[metadata]
lock-version = "2.0"
python-versions = "^3.10"
content-hash = "17f25b61da5f54bb4bb13cecfedda56d23c097aacb95bb213f13ce63ee08c761"
content-hash = "a243c28b48b60e14513fc18629096a7f9d1c60ae7b05a6c50125c1d4c045033e"
@@ -25,10 +25,11 @@ tenacity = "^8.3.0"
apscheduler = "^3.10.4"
croniter = "^2.0.5"
pytest-asyncio = "^0.23.7"
pydantic-settings = "^2.3.4"


[tool.poetry.group.dev.dependencies]
cx-freeze = "7.0.0"
cx-freeze = { git = "https://github.com/ntindle/cx_Freeze.git", rev = "main", develop = true }
poethepoet = "^0.26.1"
httpx = "^0.27.0"
pytest-watcher = "^0.4.2"
@@ -85,4 +86,4 @@ patterns = ["*.py"]
ignore_patterns = []

[tool.pytest.ini_options]
asyncio_mode = "auto"
asyncio_mode = "auto"
@@ -11,23 +11,24 @@ generator client {

// This model describes the Agent Graph/Flow (Multi Agent System).
model AgentGraph {
id String @id
id String @id @default(uuid())
name String?
description String?

AgentNodes AgentNode[] @relation("AgentGraphNodes")
AgentExecutionSchedule AgentExecutionSchedule[]
AgentNodes AgentNode[]
AgentGraphExecution AgentGraphExecution[]
AgentGraphExecutionSchedule AgentGraphExecutionSchedule[]
}

// This model describes a single node in the Agent Graph/Flow (Multi Agent System).
model AgentNode {
id String @id
id String @id @default(uuid())

agentBlockId String
AgentBlock AgentBlock @relation(fields: [agentBlockId], references: [id])

agentGraphId String
AgentGraph AgentGraph @relation("AgentGraphNodes", fields: [agentGraphId], references: [id])
AgentGraph AgentGraph @relation(fields: [agentGraphId], references: [id])

// List of consumed input, that the parent node should provide.
Input AgentNodeLink[] @relation("AgentNodeSink")
@@ -46,7 +47,7 @@ model AgentNode {

// This model describes the link between two AgentNodes.
model AgentNodeLink {
id String @id
id String @id @default(uuid())

// Output of a node is connected to the source of the link.
agentNodeSourceId String
@@ -61,7 +62,7 @@ model AgentNodeLink {

// This model describes a component that will be executed by the AgentNode.
model AgentBlock {
id String @id
id String @id @default(uuid())
name String @unique

// We allow a block to have multiple types of input & output.
@@ -73,45 +74,55 @@ model AgentBlock {
ReferencedByAgentNode AgentNode[]
}

// This model describes the execution of an AgentGraph.
model AgentGraphExecution {
id String @id @default(uuid())

agentGraphId String
AgentGraph AgentGraph @relation(fields: [agentGraphId], references: [id])

AgentNodeExecutions AgentNodeExecution[]
}

// This model describes the execution of an AgentNode.
model AgentNodeExecution {
id String @id
executionId String
id String @id @default(uuid())

agentGraphExecutionId String
AgentGraphExecution AgentGraphExecution @relation(fields: [agentGraphExecutionId], references: [id])

agentNodeId String
AgentNode AgentNode @relation(fields: [agentNodeId], references: [id])

inputData String?
inputFiles FileDefinition[] @relation("InputFiles")
outputName String?
outputData String?
outputFiles FileDefinition[] @relation("OutputFiles")
Input AgentNodeExecutionInputOutput[] @relation("AgentNodeExecutionInput")
Output AgentNodeExecutionInputOutput[] @relation("AgentNodeExecutionOutput")

// sqlite does not support enum
// enum Status { QUEUED, RUNNING, SUCCESS, FAILED }
// enum Status { INCOMPLETE, QUEUED, RUNNING, SUCCESS, FAILED }
executionStatus String
creationTime DateTime
startTime DateTime?
endTime DateTime?
addedTime DateTime @default(now())
queuedTime DateTime?
startedTime DateTime?
endedTime DateTime?
}

// This model describes a file that can be used as input/output of an AgentNodeExecution.
model FileDefinition {
id String @id
path String
metadata String? // JSON serialized object
mimeType String?
size Int?
hash String?
encoding String?
// This model describes the output of an AgentNodeExecution.
model AgentNodeExecutionInputOutput {
id String @id @default(uuid())

name String
data String
time DateTime @default(now())

// Prisma requires explicit back-references.
ReferencedByInputFiles AgentNodeExecution[] @relation("InputFiles")
ReferencedByOutputFiles AgentNodeExecution[] @relation("OutputFiles")
referencedByInputExecId String?
ReferencedByInputExec AgentNodeExecution? @relation("AgentNodeExecutionInput", fields: [referencedByInputExecId], references: [id])
referencedByOutputExecId String?
ReferencedByOutputExec AgentNodeExecution? @relation("AgentNodeExecutionOutput", fields: [referencedByOutputExecId], references: [id])
}

// This model describes the recurring execution schedule of an Agent.
model AgentExecutionSchedule {
model AgentGraphExecutionSchedule {
id String @id

agentGraphId String
0
rnd/autogpt_server/secrets/.gitkeep
Normal file
@@ -1,7 +1,9 @@
import platform
from pathlib import Path
from pkgutil import iter_modules
from shutil import which
from typing import Union

from cx_Freeze import Executable, setup
from cx_Freeze import Executable, setup  # type: ignore

packages = [
m.name
@@ -9,13 +11,52 @@ packages = [
if m.ispkg and m.module_finder and "poetry" in m.module_finder.path  # type: ignore
]
packages.append("collections")
packages.append("autogpt_server.util.service")
packages.append("autogpt_server.executor.manager")
packages.append("autogpt_server.util.service")

# set the icon based on the platform
icon = "../../assets/gpt_dark_RGB.ico"
if platform.system() == "Darwin":
icon = "../../assets/gpt_dark_RGB.icns"
elif platform.system() == "Linux":
icon = "../../assets/gpt_dark_RGB.png"


def txt_to_rtf(input_file: Union[str, Path], output_file: Union[str, Path]) -> None:
"""
Convert a text file to RTF format.

Args:
input_file (Union[str, Path]): Path to the input text file.
output_file (Union[str, Path]): Path to the output RTF file.

Returns:
None
"""
input_path = Path(input_file)
output_path = Path(output_file)

with input_path.open("r", encoding="utf-8") as txt_file:
content = txt_file.read()

# RTF header
rtf = r"{\rtf1\ansi\deff0 {\fonttbl {\f0 Times New Roman;}}\f0\fs24 "

# Replace newlines with RTF newline
rtf += content.replace("\n", "\\par ")

# Close RTF document
rtf += "}"

with output_path.open("w", encoding="utf-8") as rtf_file:
rtf_file.write(rtf)


# Convert LICENSE to LICENSE.rtf
license_file = "LICENSE.rtf"
txt_to_rtf("../../LICENSE", license_file)

# if mac use the icns file, otherwise use the ico file
icon = (
"../../assets/gpt_dark_RGB.icns"
if which("sips")
else "../../assets/gpt_dark_RGB.ico"
)

setup(
name="AutoGPT Server",
@@ -41,30 +82,48 @@ setup(
"packages": packages,
"includes": [
"autogpt_server",
"uvicorn.loops.auto",
"uvicorn.protocols.http.auto",
"uvicorn.protocols.websockets.auto",
"uvicorn.lifespan.on",
"prisma",
],
# Exclude the two module from readability.compat as it causes issues
"excludes": ["readability.compat.two"],
"include_files": [
# source, destination in the bundle
# (../frontend, example_files) would also work but you'd need to load the frontend differently in the data.py to correctly get the path when frozen
("../example_files", "example_files"),
],
},
# Mac .app specific options
"bdist_mac": {
"bundle_name": "AutoGPT",
"iconfile": "../../assets/gpt_dark_RGB.icns",
# "include_resources": ["IMG_3775.jpeg"],
},
# Mac .dmg specific options
"bdist_dmg": {
"applications_shortcut": True,
"volume_label": "AutoGPTServer",
"background": "builtin-arrow",

"license": {
"default-language": "en_US",
"licenses": {"en_US": license_file},
"buttons": {
"en_US": [
"English",
"Agree",
"Disagree",
"Print",
"Save",
"If you agree, click Agree to continue the installation. If you do not agree, click Disagree to cancel the installation.",
]
},
},
},
# Windows .msi specific options
"bdist_msi": {
"target_name": "AutoGPTServer",
"add_to_path": True,
"install_icon": "../../assets/gpt_dark_RGB.ico",
"license_file": license_file,
},
# Linux .appimage specific options
"bdist_appimage": {},
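The `txt_to_rtf` helper added in setup.py wraps the text in a minimal RTF header and replaces newlines with `\par`. It can be exercised like this (the conversion logic is copied here in condensed form so the sketch is self-contained; the sample file names are hypothetical):

```python
import pathlib
import tempfile

def txt_to_rtf(input_file, output_file) -> None:
    # Condensed copy of setup.py's txt_to_rtf: wrap plain text in an RTF shell.
    content = pathlib.Path(input_file).read_text(encoding="utf-8")
    rtf = r"{\rtf1\ansi\deff0 {\fonttbl {\f0 Times New Roman;}}\f0\fs24 "
    rtf += content.replace("\n", "\\par ")  # RTF paragraph breaks
    rtf += "}"
    pathlib.Path(output_file).write_text(rtf, encoding="utf-8")

with tempfile.TemporaryDirectory() as d:
    src = pathlib.Path(d) / "LICENSE"
    src.write_text("MIT License\nCopyright (c) 2024", encoding="utf-8")
    dst = pathlib.Path(d) / "LICENSE.rtf"
    txt_to_rtf(src, dst)
    out = dst.read_text(encoding="utf-8")
    assert out.startswith("{\\rtf1") and out.endswith("}")
    assert "\\par" in out
```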
@@ -12,7 +12,7 @@ async def create_test_graph() -> graph.Graph:
     """
     ParrotBlock
        \
-        ---- TextCombinerBlock ---- PrintingBlock
+        ---- TextFormatterBlock ---- PrintingBlock
        /
     ParrotBlock
     """
@@ -20,13 +20,16 @@ async def create_test_graph() -> graph.Graph:
         graph.Node(block_id=block.ParrotBlock.id),
         graph.Node(block_id=block.ParrotBlock.id),
         graph.Node(
-            block_id=block.TextCombinerBlock.id,
-            input_default={"format": "{text1},{text2}"},
+            block_id=block.TextFormatterBlock.id,
+            input_default={
+                "format": "{texts[0]},{texts[1]},{texts[2]}",
+                "texts_$_3": "!!!",
+            },
         ),
         graph.Node(block_id=block.PrintingBlock.id),
     ]
-    nodes[0].connect(nodes[2], "output", "text1")
-    nodes[1].connect(nodes[2], "output", "text2")
+    nodes[0].connect(nodes[2], "output", "texts_$_1")
+    nodes[1].connect(nodes[2], "output", "texts_$_2")
     nodes[2].connect(nodes[3], "combined_text", "text")

     test_graph = graph.Graph(
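The new `texts_$_N` sink names encode 1-based positions in a single list input, which the format string then indexes as `{texts[0]}` and so on. A hypothetical stdlib-only sketch of how such keys could be folded back into a list before formatting (the helper is illustrative, not the repo's actual code):

```python
import re


def merge_list_inputs(data: dict) -> dict:
    # Fold keys like "texts_$_1" back into a list under "texts", ordered by
    # the 1-based position after the "_$_" separator; other keys pass through.
    out: dict = {}
    positional: dict = {}
    for key, value in data.items():
        m = re.fullmatch(r"(.+)_\$_(\d+)", key)
        if m:
            positional.setdefault(m.group(1), {})[int(m.group(2))] = value
        else:
            out[key] = value
    for name, items in positional.items():
        out[name] = [items[i] for i in sorted(items)]
    return out


merged = merge_list_inputs({
    "format": "{texts[0]},{texts[1]},{texts[2]}",
    "texts_$_1": "Hello, World!",
    "texts_$_2": "Hello, World!",
    "texts_$_3": "!!!",
})
formatted = merged["format"].format(texts=merged["texts"])
```

Formatting the merged list reproduces the combined text the updated test asserts on.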
@@ -45,18 +48,18 @@ async def create_test_graph() -> graph.Graph:
     return test_graph


-async def execute_agent(test_manager: ExecutionManager, test_graph: graph.Graph):
+async def execute_graph(test_manager: ExecutionManager, test_graph: graph.Graph):
     # --- Test adding new executions --- #
     text = "Hello, World!"
     input_data = {"input": text}
     agent_server = AgentServer()
-    response = await agent_server.execute_agent(test_graph.id, input_data)
+    response = await agent_server.execute_graph(test_graph.id, input_data)
     executions = response["executions"]
-    run_id = response["run_id"]
+    graph_exec_id = response["id"]
     assert len(executions) == 2

     async def is_execution_completed():
-        execs = await agent_server.get_executions(test_graph.id, run_id)
+        execs = await agent_server.get_executions(test_graph.id, graph_exec_id)
         return test_manager.queue.empty() and len(execs) == 4

     # Wait for the executions to complete
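The hunk ends just before the test waits for `is_execution_completed` to turn true. The repo's actual wait helper is not shown here; a generic poll-until-timeout sketch of that pattern (all names below are assumptions for illustration):

```python
import asyncio


async def wait_until(predicate, timeout: float = 5.0, interval: float = 0.1) -> bool:
    # Poll an async zero-argument predicate until it reports completion
    # or the timeout elapses; return whether it completed in time.
    loop = asyncio.get_running_loop()
    deadline = loop.time() + timeout
    while loop.time() < deadline:
        if await predicate():
            return True
        await asyncio.sleep(interval)
    return False


async def demo() -> bool:
    calls = {"n": 0}

    async def is_execution_completed():
        calls["n"] += 1
        return calls["n"] >= 3  # "completes" on the third poll

    return await wait_until(is_execution_completed, timeout=1.0, interval=0.01)


completed = asyncio.run(demo())
```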
@@ -67,46 +70,41 @@ async def execute_agent(test_manager: ExecutionManager, test_graph: graph.Graph)

     # Execution queue should be empty
     assert await is_execution_completed()
-    executions = await agent_server.get_executions(test_graph.id, run_id)
+    executions = await agent_server.get_executions(test_graph.id, graph_exec_id)

     # Executing ParrotBlock1
     exec = executions[0]
     assert exec.status == execution.ExecutionStatus.COMPLETED
-    assert exec.run_id == run_id
-    assert exec.output_name == "output"
-    assert exec.output_data == "Hello, World!"
-    assert exec.input_data == input_data
+    assert exec.graph_exec_id == graph_exec_id
+    assert exec.output_data == {"output": ["Hello, World!"]}
+    assert exec.input_data == {"input": text}
     assert exec.node_id == test_graph.nodes[0].id

     # Executing ParrotBlock2
     exec = executions[1]
     assert exec.status == execution.ExecutionStatus.COMPLETED
-    assert exec.run_id == run_id
-    assert exec.output_name == "output"
-    assert exec.output_data == "Hello, World!"
-    assert exec.input_data == input_data
+    assert exec.graph_exec_id == graph_exec_id
+    assert exec.output_data == {"output": ["Hello, World!"]}
+    assert exec.input_data == {"input": text}
     assert exec.node_id == test_graph.nodes[1].id

-    # Executing TextCombinerBlock
+    # Executing TextFormatterBlock
     exec = executions[2]
     assert exec.status == execution.ExecutionStatus.COMPLETED
-    assert exec.run_id == run_id
-    assert exec.output_name == "combined_text"
-    assert exec.output_data == "Hello, World!,Hello, World!"
+    assert exec.graph_exec_id == graph_exec_id
+    assert exec.output_data == {"combined_text": ["Hello, World!,Hello, World!,!!!"]}
     assert exec.input_data == {
-        "format": "{text1},{text2}",
-        "text1": "Hello, World!",
-        "text2": "Hello, World!",
+        "texts_$_1": "Hello, World!",
+        "texts_$_2": "Hello, World!",
     }
     assert exec.node_id == test_graph.nodes[2].id

     # Executing PrintingBlock
     exec = executions[3]
     assert exec.status == execution.ExecutionStatus.COMPLETED
-    assert exec.run_id == run_id
-    assert exec.output_name == "status"
-    assert exec.output_data == "printed"
-    assert exec.input_data == {"text": "Hello, World!,Hello, World!"}
+    assert exec.graph_exec_id == graph_exec_id
+    assert exec.output_data == {"status": ["printed"]}
+    assert exec.input_data == {"text": "Hello, World!,Hello, World!,!!!"}
     assert exec.node_id == test_graph.nodes[3].id
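The updated assertions change `output_data` from a single scalar to a dict mapping each output name to a list of emitted values (e.g. `{"output": ["Hello, World!"]}`). A hypothetical sketch of a collector producing that shape — illustrative only, not the repo's implementation:

```python
from collections import defaultdict


class OutputCollector:
    # Accumulates block outputs so that output_data maps each output name
    # to the list of values emitted under that name.
    def __init__(self) -> None:
        self._outputs: defaultdict = defaultdict(list)

    def emit(self, name: str, value) -> None:
        self._outputs[name].append(value)

    def as_dict(self) -> dict:
        return dict(self._outputs)


collector = OutputCollector()
collector.emit("output", "Hello, World!")
collector.emit("status", "printed")
```

A list per name allows a block to emit the same output several times without clobbering earlier values.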
@@ -116,4 +114,4 @@ async def test_agent_execution():
     with ExecutionManager(1) as test_manager:
         await db.connect()
         test_graph = await create_test_graph()
-        await execute_agent(test_manager, test_graph)
+        await execute_graph(test_manager, test_graph)
19
rnd/example_files/index.html
Normal file
@@ -0,0 +1,19 @@
+<!DOCTYPE html>
+<html lang="en">
+
+<head>
+    <title>Example Files</title>
+    <meta charset="utf-8">
+    <meta name="viewport" content="width=device-width, initial-scale=1">
+</head>
+
+<body>
+    <h1>Example Files</h1>
+    <ul>
+        <li><a href="example1.txt">Example 1</a></li>
+        <li><a href="example2.txt">Example 2</a></li>
+        <li><a href="example3.txt">Example 3</a></li>
+    </ul>
+</body>
+
+</html>
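The new page lists three example files as relative links. A quick stdlib-only sanity check one could run over that markup to confirm the links are intact (the parser class is illustrative, not part of the repo; the page is reproduced abridged):

```python
from html.parser import HTMLParser

# The new index.html from the diff, abridged to the parts the check needs.
PAGE = """\
<!DOCTYPE html>
<html lang="en">
<head><title>Example Files</title></head>
<body>
    <h1>Example Files</h1>
    <ul>
        <li><a href="example1.txt">Example 1</a></li>
        <li><a href="example2.txt">Example 2</a></li>
        <li><a href="example3.txt">Example 3</a></li>
    </ul>
</body>
</html>
"""


class LinkCollector(HTMLParser):
    # Collects every href attribute from <a> tags, in document order.
    def __init__(self) -> None:
        super().__init__()
        self.links: list[str] = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.extend(value for name, value in attrs if name == "href")


parser = LinkCollector()
parser.feed(PAGE)
```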