Mirror of https://github.com/Significant-Gravitas/AutoGPT.git, synced 2026-04-08 03:00:28 -04:00
Merge remote-tracking branch 'origin/master' into remove-forge-and-autogpt
@@ -36,3 +36,5 @@ rnd/autogpt_builder/.env.example
rnd/autogpt_builder/.env.local
rnd/autogpt_server/.env
rnd/autogpt_server/.venv/

rnd/market/.env

138 rnd/README.md
@@ -1,36 +1,114 @@
This is a guide to setting up and running the AutoGPT Server and Builder, covering how to download the necessary files, set up the server, and test the system.
# AutoGPT Platform

https://github.com/user-attachments/assets/fd0d0f35-3155-4263-b575-ba3efb126cb4

Welcome to the AutoGPT Platform - a powerful system for creating and running AI agents to solve business problems. This platform enables you to harness the power of artificial intelligence to automate tasks, analyze data, and generate insights for your organization.

1. Navigate to the AutoGPT GitHub repository.
2. Click the "Code" button, then select "Download ZIP".
3. Once downloaded, extract the ZIP file to a folder of your choice.
## Getting Started

4. Open the extracted folder and navigate to the "rnd" directory.
5. Enter the "AutoGPT server" folder.
6. Open a terminal window in this directory.
7. Locate and open the README file in the AutoGPT server folder: [doc](./autogpt_server/README.md#setup).
8. Copy and paste each command from the setup section of that README into your terminal.
   - Important: Wait for each command to finish before running the next one.
9. If all commands run without errors, run the final command: `poetry run app`
10. You should now see the server running in your terminal.
### Prerequisites

- Docker
- Docker Compose V2 (comes with Docker Desktop, or can be installed separately)
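
You can verify that both prerequisites are available before continuing:

```sh
docker --version
docker compose version
```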
### Running the System

To run the AutoGPT Platform, follow these steps:

1. Clone this repository to your local machine.
2. Navigate to the project directory.
3. Run the following command:

```
docker compose up -d
```

This command will start all the necessary services defined in the `docker-compose.yml` file in detached mode.
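
For orientation, the sketch below shows roughly what such a compose file defines. It mirrors the service names used later in this guide (`api_srv`, `ws_srv`, `executor`, plus PostgreSQL and Redis), but the images, build contexts, and ports are assumptions - the repository's actual `docker-compose.yml` is the source of truth:

```yaml
# Illustrative sketch only - not the repository's actual docker-compose.yml
services:
  api_srv:
    build: .        # assumed build context
    ports:
      - "8000:8000" # assumed port
  ws_srv:
    build: .
    ports:
      - "8001:8001"
  executor:
    build: .
  postgres:
    image: postgres:15
  redis:
    image: redis:7
```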

### Docker Compose Commands

Here are some useful Docker Compose commands for managing your AutoGPT Platform:

- `docker compose up -d`: Start the services in detached mode.
- `docker compose stop`: Stop the running services without removing them.
- `docker compose rm`: Remove stopped service containers.
- `docker compose build`: Build or rebuild services.
- `docker compose down`: Stop and remove containers and networks (add `-v` to also remove volumes).
- `docker compose watch`: Watch for changes in your services and automatically update them (requires a `develop` section in the compose file; see the sketch below).
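
Note that `docker compose watch` only acts on services that declare a `develop.watch` section. A minimal sketch, assuming the server code lives under `rnd/autogpt_server` (the path is an assumption; adjust it to the real layout):

```yaml
services:
  api_srv:
    develop:
      watch:
        # Rebuild the image whenever the (assumed) source directory changes
        - action: rebuild
          path: ./rnd/autogpt_server
```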

### Sample Scenarios

Here are some common scenarios where you might use multiple Docker Compose commands:

1. Updating and restarting a specific service:

```
docker compose build api_srv
docker compose up -d --no-deps api_srv
```

This rebuilds the `api_srv` service and restarts it without affecting other services.

2. Viewing logs for troubleshooting:

```
docker compose logs -f api_srv ws_srv
```

This shows and follows the logs for both `api_srv` and `ws_srv` services.

3. Scaling a service for increased load:

```
docker compose up -d --scale executor=3
```

This scales the `executor` service to 3 instances to handle increased load.
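
Note that a service can only be scaled if it does not set a fixed `container_name` or publish a fixed host port; otherwise the second replica would collide with the first. You can confirm the replica count afterwards with `docker compose ps executor`.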

4. Stopping the entire system for maintenance:

```
docker compose stop
docker compose rm -f
docker compose pull
docker compose up -d
```

This stops all services, removes containers, pulls the latest images, and restarts the system.
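
On recent Compose versions the same refresh can be expressed more compactly; this is an optional shortcut, assuming your Compose release supports the `--pull` flag on `up`:

```sh
docker compose down
docker compose up -d --pull always
```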

5. Developing with live updates:

```
docker compose watch
```

This watches for changes in your code and automatically updates the relevant services.

6. Checking the status of services:

```
docker compose ps
```

This shows the current status of all services defined in your `docker-compose.yml` file.

These scenarios demonstrate how to use Docker Compose commands in combination to manage your AutoGPT Platform effectively.

### Persisting Data

To persist data for PostgreSQL and Redis, you can modify the `docker-compose.yml` file to add volumes. Here's how:

1. Open the `docker-compose.yml` file in a text editor.
2. Add volume configurations for the PostgreSQL and Redis services:

```yaml
services:
  postgres:
    # ... other configurations ...
    volumes:
      - postgres_data:/var/lib/postgresql/data

  redis:
    # ... other configurations ...
    volumes:
      - redis_data:/data

volumes:
  postgres_data:
  redis_data:
```

3. Save the file and run `docker compose up -d` to apply the changes.

This configuration will create named volumes for PostgreSQL and Redis, ensuring that your data persists across container restarts.
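
After restarting, you can confirm that the named volumes were created:

```sh
docker volume ls
```

Keep in mind that `docker compose down -v` deletes these named volumes along with the data they hold, so omit `-v` during routine shutdowns.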

11. Navigate back to the "rnd" folder.
12. Open the "AutoGPT builder" folder.
13. Open the README file in this folder: [doc](./autogpt_builder/README.md#getting-started).
14. In your terminal, run the following commands:

```
npm install
```

```
npm run dev
```

15. Once the front-end is running, click the link to navigate to `localhost:3000`.
16. Click on the "Build" option.
17. Add a few blocks to test the functionality.
18. Connect the blocks together.
19. Click "Run".
20. Check your terminal window - you should see that the server has received the request, is processing it, and has executed it.

And there you have it! You've successfully set up and tested AutoGPT.
@@ -1,6 +1,6 @@
NEXT_PUBLIC_AGPT_SERVER_URL=http://localhost:8000/api
NEXT_PUBLIC_AGPT_WS_SERVER_URL=ws://localhost:8001/ws
NEXT_PUBLIC_AGPT_MARKETPLACE_URL=http://localhost:8001/api/v1/market
NEXT_PUBLIC_AGPT_MARKETPLACE_URL=http://localhost:8005/api/v1/market

## Supabase credentials
## YOU ONLY NEED THEM IF YOU WANT TO USE SUPABASE USER AUTHENTICATION
@@ -1,19 +1,19 @@
# Base stage for both dev and prod
FROM node:21-alpine AS base
WORKDIR /app
COPY autogpt_builder/package.json autogpt_builder/yarn.lock ./
COPY rnd/autogpt_builder/package.json rnd/autogpt_builder/yarn.lock ./
RUN yarn install --frozen-lockfile

# Dev stage
FROM base AS dev
ENV NODE_ENV=development
COPY autogpt_builder/ .
COPY rnd/autogpt_builder/ .
EXPOSE 3000
CMD ["npm", "run", "dev"]
CMD ["yarn", "run", "dev"]

# Build stage for prod
FROM base AS build
COPY autogpt_builder/ .
COPY rnd/autogpt_builder/ .
RUN npm run build

# Prod stage
@@ -1,5 +1,5 @@
|
||||
"use client";
|
||||
import React, { useEffect, useState } from "react";
|
||||
import React, { useCallback, useEffect, useMemo, useState } from "react";
|
||||
|
||||
import AutoGPTServerAPI, {
|
||||
GraphMeta,
|
||||
@@ -22,54 +22,57 @@ const Monitor = () => {
|
||||
const [selectedFlow, setSelectedFlow] = useState<GraphMeta | null>(null);
|
||||
const [selectedRun, setSelectedRun] = useState<FlowRun | null>(null);
|
||||
|
||||
const api = new AutoGPTServerAPI();
|
||||
const api = useMemo(() => new AutoGPTServerAPI(), []);
|
||||
|
||||
useEffect(() => fetchFlowsAndRuns(), []);
|
||||
const refreshFlowRuns = useCallback(
|
||||
(flowID: string) => {
|
||||
// Fetch flow run IDs
|
||||
api.listGraphRunIDs(flowID).then((runIDs) =>
|
||||
runIDs.map((runID) => {
|
||||
let run;
|
||||
if (
|
||||
(run = flowRuns.find((fr) => fr.id == runID)) &&
|
||||
!["waiting", "running"].includes(run.status)
|
||||
) {
|
||||
return;
|
||||
}
|
||||
|
||||
// Fetch flow run
|
||||
api.getGraphExecutionInfo(flowID, runID).then((execInfo) =>
|
||||
setFlowRuns((flowRuns) => {
|
||||
if (execInfo.length == 0) return flowRuns;
|
||||
|
||||
const flowRunIndex = flowRuns.findIndex((fr) => fr.id == runID);
|
||||
const flowRun = flowRunFromNodeExecutionResults(execInfo);
|
||||
if (flowRunIndex > -1) {
|
||||
flowRuns.splice(flowRunIndex, 1, flowRun);
|
||||
} else {
|
||||
flowRuns.push(flowRun);
|
||||
}
|
||||
return [...flowRuns];
|
||||
}),
|
||||
);
|
||||
}),
|
||||
);
|
||||
},
|
||||
[api, flowRuns],
|
||||
);
|
||||
|
||||
const fetchFlowsAndRuns = useCallback(() => {
|
||||
api.listGraphs().then((flows) => {
|
||||
setFlows(flows);
|
||||
flows.map((flow) => refreshFlowRuns(flow.id));
|
||||
});
|
||||
}, [api, refreshFlowRuns]);
|
||||
|
||||
useEffect(() => fetchFlowsAndRuns(), [fetchFlowsAndRuns]);
|
||||
useEffect(() => {
|
||||
const intervalId = setInterval(
|
||||
() => flows.map((f) => refreshFlowRuns(f.id)),
|
||||
5000,
|
||||
);
|
||||
return () => clearInterval(intervalId);
|
||||
}, []);
|
||||
|
||||
function fetchFlowsAndRuns() {
|
||||
api.listGraphs().then((flows) => {
|
||||
setFlows(flows);
|
||||
flows.map((flow) => refreshFlowRuns(flow.id));
|
||||
});
|
||||
}
|
||||
|
||||
function refreshFlowRuns(flowID: string) {
|
||||
// Fetch flow run IDs
|
||||
api.listGraphRunIDs(flowID).then((runIDs) =>
|
||||
runIDs.map((runID) => {
|
||||
let run;
|
||||
if (
|
||||
(run = flowRuns.find((fr) => fr.id == runID)) &&
|
||||
!["waiting", "running"].includes(run.status)
|
||||
) {
|
||||
return;
|
||||
}
|
||||
|
||||
// Fetch flow run
|
||||
api.getGraphExecutionInfo(flowID, runID).then((execInfo) =>
|
||||
setFlowRuns((flowRuns) => {
|
||||
if (execInfo.length == 0) return flowRuns;
|
||||
|
||||
const flowRunIndex = flowRuns.findIndex((fr) => fr.id == runID);
|
||||
const flowRun = flowRunFromNodeExecutionResults(execInfo);
|
||||
if (flowRunIndex > -1) {
|
||||
flowRuns.splice(flowRunIndex, 1, flowRun);
|
||||
} else {
|
||||
flowRuns.push(flowRun);
|
||||
}
|
||||
return [...flowRuns];
|
||||
}),
|
||||
);
|
||||
}),
|
||||
);
|
||||
}
|
||||
}, [flows, refreshFlowRuns]);
|
||||
|
||||
const column1 = "md:col-span-2 xl:col-span-3 xxl:col-span-2";
|
||||
const column2 = "md:col-span-3 lg:col-span-2 xl:col-span-3 space-y-4";
|
||||
|
||||
@@ -1,4 +1,4 @@
|
||||
import React, { useContext, useEffect, useState } from "react";
|
||||
import React, { useCallback, useContext, useEffect, useState } from "react";
|
||||
import {
|
||||
BaseEdge,
|
||||
EdgeLabelRenderer,
|
||||
@@ -65,24 +65,27 @@ export function CustomEdge({
|
||||
const beadDiameter = 12;
|
||||
const deltaTime = 16;
|
||||
|
||||
function setTargetPositions(beads: Bead[]) {
|
||||
const distanceBetween = Math.min(
|
||||
(length - beadDiameter) / (beads.length + 1),
|
||||
beadDiameter,
|
||||
);
|
||||
const setTargetPositions = useCallback(
|
||||
(beads: Bead[]) => {
|
||||
const distanceBetween = Math.min(
|
||||
(length - beadDiameter) / (beads.length + 1),
|
||||
beadDiameter,
|
||||
);
|
||||
|
||||
return beads.map((bead, index) => {
|
||||
const distanceFromEnd = beadDiameter * 1.35;
|
||||
const targetPosition = distanceBetween * index + distanceFromEnd;
|
||||
const t = getTForDistance(-targetPosition);
|
||||
return beads.map((bead, index) => {
|
||||
const distanceFromEnd = beadDiameter * 1.35;
|
||||
const targetPosition = distanceBetween * index + distanceFromEnd;
|
||||
const t = getTForDistance(-targetPosition);
|
||||
|
||||
return {
|
||||
...bead,
|
||||
t: visualizeBeads === "animate" ? bead.t : t,
|
||||
targetT: t,
|
||||
} as Bead;
|
||||
});
|
||||
}
|
||||
return {
|
||||
...bead,
|
||||
t: visualizeBeads === "animate" ? bead.t : t,
|
||||
targetT: t,
|
||||
} as Bead;
|
||||
});
|
||||
},
|
||||
[getTForDistance, length, visualizeBeads],
|
||||
);
|
||||
|
||||
useEffect(() => {
|
||||
if (data?.beadUp === 0 && data?.beadDown === 0) {
|
||||
@@ -170,7 +173,7 @@ export function CustomEdge({
|
||||
}, deltaTime);
|
||||
|
||||
return () => clearInterval(interval);
|
||||
}, [data]);
|
||||
}, [data, setTargetPositions, visualizeBeads]);
|
||||
|
||||
const middle = getPointForT(0.5);
|
||||
|
||||
|
||||
@@ -12,8 +12,10 @@ import InputModalComponent from "./InputModalComponent";
|
||||
import OutputModalComponent from "./OutputModalComponent";
|
||||
import {
|
||||
BlockIORootSchema,
|
||||
BlockIOStringSubSchema,
|
||||
Category,
|
||||
NodeExecutionResult,
|
||||
BlockUIType,
|
||||
} from "@/lib/autogpt-server-api/types";
|
||||
import { beautifyString, cn, setNestedProperty } from "@/lib/utils";
|
||||
import { Button } from "@/components/ui/button";
|
||||
@@ -21,7 +23,10 @@ import { Switch } from "@/components/ui/switch";
|
||||
import { Copy, Trash2 } from "lucide-react";
|
||||
import { history } from "./history";
|
||||
import NodeHandle from "./NodeHandle";
|
||||
import { NodeGenericInputField } from "./node-input-components";
|
||||
import {
|
||||
NodeGenericInputField,
|
||||
NodeTextBoxInput,
|
||||
} from "./node-input-components";
|
||||
import SchemaTooltip from "./SchemaTooltip";
|
||||
import { getPrimaryCategoryColor } from "@/lib/utils";
|
||||
import { FlowContext } from "./Flow";
|
||||
@@ -59,6 +64,7 @@ export type CustomNodeData = {
|
||||
backend_id?: string;
|
||||
errors?: { [key: string]: string };
|
||||
isOutputStatic?: boolean;
|
||||
uiType: BlockUIType;
|
||||
};
|
||||
|
||||
export type CustomNode = Node<CustomNodeData, "custom">;
|
||||
@@ -96,7 +102,7 @@ export function CustomNode({ data, id, width, height }: NodeProps<CustomNode>) {
|
||||
|
||||
useEffect(() => {
|
||||
setIsAnyModalOpen?.(isModalOpen || isOutputModalOpen);
|
||||
}, [isModalOpen, isOutputModalOpen, data]);
|
||||
}, [isModalOpen, isOutputModalOpen, data, setIsAnyModalOpen]);
|
||||
|
||||
useEffect(() => {
|
||||
isInitialSetup.current = false;
|
||||
@@ -118,8 +124,16 @@ export function CustomNode({ data, id, width, height }: NodeProps<CustomNode>) {
|
||||
setIsAdvancedOpen(checked);
|
||||
};
|
||||
|
||||
const generateOutputHandles = (schema: BlockIORootSchema) => {
|
||||
if (!schema?.properties) return null;
|
||||
const generateOutputHandles = (
|
||||
schema: BlockIORootSchema,
|
||||
nodeType: BlockUIType,
|
||||
) => {
|
||||
if (
|
||||
!schema?.properties ||
|
||||
nodeType === BlockUIType.OUTPUT ||
|
||||
nodeType === BlockUIType.NOTE
|
||||
)
|
||||
return null;
|
||||
const keys = Object.keys(schema.properties);
|
||||
return keys.map((key) => (
|
||||
<div key={key}>
|
||||
@@ -133,6 +147,137 @@ export function CustomNode({ data, id, width, height }: NodeProps<CustomNode>) {
|
||||
));
|
||||
};
|
||||
|
||||
const generateInputHandles = (
|
||||
schema: BlockIORootSchema,
|
||||
nodeType: BlockUIType,
|
||||
) => {
|
||||
if (!schema?.properties) return null;
|
||||
let keys = Object.entries(schema.properties);
|
||||
switch (nodeType) {
|
||||
case BlockUIType.INPUT:
|
||||
// For INPUT blocks, don't include connection handles
|
||||
return keys.map(([propKey, propSchema]) => {
|
||||
const isRequired = data.inputSchema.required?.includes(propKey);
|
||||
const isConnected = isHandleConnected(propKey);
|
||||
const isAdvanced = propSchema.advanced;
|
||||
return (
|
||||
(isRequired || isAdvancedOpen || !isAdvanced) && (
|
||||
<div key={propKey}>
|
||||
<span className="text-m green -mb-1 text-gray-900">
|
||||
{propSchema.title || beautifyString(propKey)}
|
||||
</span>
|
||||
<div key={propKey} onMouseOver={() => {}}>
|
||||
{!isConnected && (
|
||||
<NodeGenericInputField
|
||||
className="mb-2 mt-1"
|
||||
propKey={propKey}
|
||||
propSchema={propSchema}
|
||||
currentValue={getValue(propKey)}
|
||||
connections={data.connections}
|
||||
handleInputChange={handleInputChange}
|
||||
handleInputClick={handleInputClick}
|
||||
errors={data.errors ?? {}}
|
||||
displayName={propSchema.title || beautifyString(propKey)}
|
||||
/>
|
||||
)}
|
||||
</div>
|
||||
</div>
|
||||
)
|
||||
);
|
||||
});
|
||||
|
||||
case BlockUIType.NOTE:
|
||||
// For NOTE blocks, don't render any input handles
|
||||
const [noteKey, noteSchema] = keys[0];
|
||||
return (
|
||||
<div key={noteKey}>
|
||||
<NodeTextBoxInput
|
||||
className=""
|
||||
selfKey={noteKey}
|
||||
schema={noteSchema as BlockIOStringSubSchema}
|
||||
value={getValue(noteKey)}
|
||||
handleInputChange={handleInputChange}
|
||||
handleInputClick={handleInputClick}
|
||||
error={data.errors?.[noteKey] ?? ""}
|
||||
displayName={noteSchema.title || beautifyString(noteKey)}
|
||||
/>
|
||||
</div>
|
||||
);
|
||||
|
||||
case BlockUIType.OUTPUT:
|
||||
// For OUTPUT blocks, only show the 'value' property
|
||||
return keys.map(([propKey, propSchema]) => {
|
||||
const isRequired = data.inputSchema.required?.includes(propKey);
|
||||
const isConnected = isHandleConnected(propKey);
|
||||
const isAdvanced = propSchema.advanced;
|
||||
return (
|
||||
(isRequired || isAdvancedOpen || !isAdvanced) && (
|
||||
<div key={propKey} onMouseOver={() => {}}>
|
||||
{propKey !== "value" ? (
|
||||
<span className="text-m green -mb-1 text-gray-900">
|
||||
{propSchema.title || beautifyString(propKey)}
|
||||
</span>
|
||||
) : (
|
||||
<NodeHandle
|
||||
keyName={propKey}
|
||||
isConnected={isConnected}
|
||||
isRequired={isRequired}
|
||||
schema={propSchema}
|
||||
side="left"
|
||||
/>
|
||||
)}
|
||||
{!isConnected && (
|
||||
<NodeGenericInputField
|
||||
className="mb-2 mt-1"
|
||||
propKey={propKey}
|
||||
propSchema={propSchema}
|
||||
currentValue={getValue(propKey)}
|
||||
connections={data.connections}
|
||||
handleInputChange={handleInputChange}
|
||||
handleInputClick={handleInputClick}
|
||||
errors={data.errors ?? {}}
|
||||
displayName={propSchema.title || beautifyString(propKey)}
|
||||
/>
|
||||
)}
|
||||
</div>
|
||||
)
|
||||
);
|
||||
});
|
||||
|
||||
default:
|
||||
return keys.map(([propKey, propSchema]) => {
|
||||
const isRequired = data.inputSchema.required?.includes(propKey);
|
||||
const isConnected = isHandleConnected(propKey);
|
||||
const isAdvanced = propSchema.advanced;
|
||||
return (
|
||||
(isRequired || isAdvancedOpen || isConnected || !isAdvanced) && (
|
||||
<div key={propKey} onMouseOver={() => {}}>
|
||||
<NodeHandle
|
||||
keyName={propKey}
|
||||
isConnected={isConnected}
|
||||
isRequired={isRequired}
|
||||
schema={propSchema}
|
||||
side="left"
|
||||
/>
|
||||
{!isConnected && (
|
||||
<NodeGenericInputField
|
||||
className="mb-2 mt-1"
|
||||
propKey={propKey}
|
||||
propSchema={propSchema}
|
||||
currentValue={getValue(propKey)}
|
||||
connections={data.connections}
|
||||
handleInputChange={handleInputChange}
|
||||
handleInputClick={handleInputClick}
|
||||
errors={data.errors ?? {}}
|
||||
displayName={propSchema.title || beautifyString(propKey)}
|
||||
/>
|
||||
)}
|
||||
</div>
|
||||
)
|
||||
);
|
||||
});
|
||||
}
|
||||
};
|
||||
const handleInputChange = (path: string, value: any) => {
|
||||
const keys = parseKeys(path);
|
||||
const newValues = JSON.parse(JSON.stringify(data.hardcodedValues));
|
||||
@@ -378,13 +523,13 @@ export function CustomNode({ data, id, width, height }: NodeProps<CustomNode>) {
|
||||
|
||||
return (
|
||||
<div
|
||||
className={`${blockClasses} ${errorClass} ${statusClass}`}
|
||||
className={`${data.uiType === BlockUIType.NOTE ? "w-[300px]" : "w-[500px]"} ${blockClasses} ${errorClass} ${statusClass} ${data.uiType === BlockUIType.NOTE ? "bg-yellow-100" : "bg-white"}`}
|
||||
onMouseEnter={handleHovered}
|
||||
onMouseLeave={handleMouseLeave}
|
||||
data-id={`custom-node-${id}`}
|
||||
>
|
||||
<div
|
||||
className={`mb-2 p-3 ${getPrimaryCategoryColor(data.categories)} rounded-t-xl`}
|
||||
className={`mb-2 p-3 ${data.uiType === BlockUIType.NOTE ? "bg-yellow-100" : getPrimaryCategoryColor(data.categories)} rounded-t-xl`}
|
||||
>
|
||||
<div className="flex items-center justify-between">
|
||||
<div className="font-roboto p-3 text-lg font-semibold">
|
||||
@@ -417,53 +562,24 @@ export function CustomNode({ data, id, width, height }: NodeProps<CustomNode>) {
|
||||
)}
|
||||
</div>
|
||||
</div>
|
||||
<div className="flex items-start justify-between gap-2 p-3">
|
||||
{data.uiType !== BlockUIType.NOTE ? (
|
||||
<div className="flex items-start justify-between p-3">
|
||||
<div>
|
||||
{data.inputSchema &&
|
||||
generateInputHandles(data.inputSchema, data.uiType)}
|
||||
</div>
|
||||
<div className="flex-none">
|
||||
{data.outputSchema &&
|
||||
generateOutputHandles(data.outputSchema, data.uiType)}
|
||||
</div>
|
||||
</div>
|
||||
) : (
|
||||
<div>
|
||||
{data.inputSchema &&
|
||||
Object.entries(data.inputSchema.properties).map(
|
||||
([propKey, propSchema]) => {
|
||||
const isRequired = data.inputSchema.required?.includes(propKey);
|
||||
const isConnected = isHandleConnected(propKey);
|
||||
const isAdvanced = propSchema.advanced;
|
||||
return (
|
||||
(isRequired ||
|
||||
isAdvancedOpen ||
|
||||
isConnected ||
|
||||
!isAdvanced) && (
|
||||
<div key={propKey} onMouseOver={() => {}}>
|
||||
<NodeHandle
|
||||
keyName={propKey}
|
||||
isConnected={isConnected}
|
||||
isRequired={isRequired}
|
||||
schema={propSchema}
|
||||
side="left"
|
||||
/>
|
||||
{!isConnected && (
|
||||
<NodeGenericInputField
|
||||
className="mb-2 mt-1"
|
||||
propKey={propKey}
|
||||
propSchema={propSchema}
|
||||
currentValue={getValue(propKey)}
|
||||
connections={data.connections}
|
||||
handleInputChange={handleInputChange}
|
||||
handleInputClick={handleInputClick}
|
||||
errors={data.errors ?? {}}
|
||||
displayName={
|
||||
propSchema.title || beautifyString(propKey)
|
||||
}
|
||||
/>
|
||||
)}
|
||||
</div>
|
||||
)
|
||||
);
|
||||
},
|
||||
)}
|
||||
generateInputHandles(data.inputSchema, data.uiType)}
|
||||
</div>
|
||||
<div className="flex-none">
|
||||
{data.outputSchema && generateOutputHandles(data.outputSchema)}
|
||||
</div>
|
||||
</div>
|
||||
{isOutputOpen && (
|
||||
)}
|
||||
{isOutputOpen && data.uiType !== BlockUIType.NOTE && (
|
||||
<div
|
||||
data-id="latest-output"
|
||||
className="nodrag m-3 break-words rounded-md border-[1.5px] p-2"
|
||||
@@ -486,25 +602,27 @@ export function CustomNode({ data, id, width, height }: NodeProps<CustomNode>) {
|
||||
)}
|
||||
</div>
|
||||
)}
|
||||
<div className="mt-2.5 flex items-center pb-4 pl-4">
|
||||
<Switch checked={isOutputOpen} onCheckedChange={toggleOutput} />
|
||||
<span className="m-1 mr-4">Output</span>
|
||||
{hasAdvancedFields && (
|
||||
<>
|
||||
<Switch onCheckedChange={toggleAdvancedSettings} />
|
||||
<span className="m-1">Advanced</span>
|
||||
</>
|
||||
)}
|
||||
{data.status && (
|
||||
<Badge
|
||||
variant="outline"
|
||||
data-id={`badge-${id}-${data.status}`}
|
||||
className={cn(data.status.toLowerCase(), "ml-auto mr-5")}
|
||||
>
|
||||
{data.status}
|
||||
</Badge>
|
||||
)}
|
||||
</div>
|
||||
{data.uiType !== BlockUIType.NOTE && (
|
||||
<div className="mt-2.5 flex items-center pb-4 pl-4">
|
||||
<Switch checked={isOutputOpen} onCheckedChange={toggleOutput} />
|
||||
<span className="m-1 mr-4">Output</span>
|
||||
{hasAdvancedFields && (
|
||||
<>
|
||||
<Switch onCheckedChange={toggleAdvancedSettings} />
|
||||
<span className="m-1">Advanced</span>
|
||||
</>
|
||||
)}
|
||||
{data.status && (
|
||||
<Badge
|
||||
variant="outline"
|
||||
data-id={`badge-${id}-${data.status}`}
|
||||
className={cn(data.status.toLowerCase(), "ml-auto mr-5")}
|
||||
>
|
||||
{data.status}
|
||||
</Badge>
|
||||
)}
|
||||
</div>
|
||||
)}
|
||||
<InputModalComponent
|
||||
title={activeKey ? `Enter ${beautifyString(activeKey)}` : undefined}
|
||||
isOpen={isModalOpen}
|
||||
|
||||
@@ -121,7 +121,7 @@ const FlowEditor: React.FC<{
|
||||
localStorage.setItem("shepherd-tour", "yes");
|
||||
}
|
||||
}
|
||||
}, [availableNodes, tutorialStarted]);
|
||||
}, [availableNodes, tutorialStarted, router, pathname]);
|
||||
|
||||
useEffect(() => {
|
||||
const handleKeyDown = (event: KeyboardEvent) => {
|
||||
@@ -256,7 +256,7 @@ const FlowEditor: React.FC<{
|
||||
}
|
||||
|
||||
const edgeColor = getTypeColor(
|
||||
getOutputType(connection.source!, connection.sourceHandle!),
|
||||
getOutputType(nodes, connection.source!, connection.sourceHandle!),
|
||||
);
|
||||
const sourceNode = getNode(connection.source!);
|
||||
const newEdge: CustomEdge = {
|
||||
@@ -295,6 +295,7 @@ const FlowEditor: React.FC<{
|
||||
addEdges,
|
||||
deleteElements,
|
||||
clearNodesStatusAndOutput,
|
||||
nodes,
|
||||
edges,
|
||||
formatEdgeID,
|
||||
getOutputType,
|
||||
@@ -377,7 +378,7 @@ const FlowEditor: React.FC<{
|
||||
clearNodesStatusAndOutput();
|
||||
}
|
||||
},
|
||||
[setNodes, clearNodesStatusAndOutput],
|
||||
[setNodes, clearNodesStatusAndOutput, setEdges],
|
||||
);
|
||||
|
||||
const getNextNodeId = useCallback(() => {
|
||||
@@ -416,6 +417,7 @@ const FlowEditor: React.FC<{
|
||||
isOutputOpen: false,
|
||||
block_id: blockId,
|
||||
isOutputStatic: nodeSchema.staticOutput,
|
||||
uiType: nodeSchema.uiType,
|
||||
},
|
||||
};
|
||||
|
||||
@@ -434,7 +436,6 @@ const FlowEditor: React.FC<{
|
||||
nodeId,
|
||||
availableNodes,
|
||||
addNodes,
|
||||
setNodes,
|
||||
deleteElements,
|
||||
clearNodesStatusAndOutput,
|
||||
x,
|
||||
|
||||
@@ -1,6 +1,5 @@
|
||||
.custom-node {
|
||||
color: #000000;
|
||||
width: 500px;
|
||||
box-sizing: border-box;
|
||||
transition: border-color 0.3s ease-in-out;
|
||||
}
|
||||
|
||||
@@ -1,5 +1,5 @@
|
||||
import AutoGPTServerAPI, { GraphMeta } from "@/lib/autogpt-server-api";
|
||||
import React, { useEffect, useState } from "react";
|
||||
import React, { useEffect, useMemo, useState } from "react";
|
||||
import { Card, CardContent, CardHeader, CardTitle } from "@/components/ui/card";
|
||||
import { Button } from "@/components/ui/button";
|
||||
import Link from "next/link";
|
||||
@@ -45,10 +45,10 @@ export const AgentFlowList = ({
|
||||
className?: string;
|
||||
}) => {
|
||||
const [templates, setTemplates] = useState<GraphMeta[]>([]);
|
||||
const api = new AutoGPTServerAPI();
|
||||
const api = useMemo(() => new AutoGPTServerAPI(), []);
|
||||
useEffect(() => {
|
||||
api.listTemplates().then((templates) => setTemplates(templates));
|
||||
}, []);
|
||||
}, [api]);
|
||||
|
||||
return (
|
||||
<Card className={className}>
|
||||
|
||||
@@ -1,4 +1,4 @@
|
||||
import React, { useEffect, useState } from "react";
|
||||
import React, { useEffect, useMemo, useState } from "react";
|
||||
import AutoGPTServerAPI, {
|
||||
Graph,
|
||||
GraphMeta,
|
||||
@@ -28,7 +28,7 @@ export const FlowInfo: React.FC<
|
||||
flowVersion?: number | "all";
|
||||
}
|
||||
> = ({ flow, flowRuns, flowVersion, ...props }) => {
|
||||
const api = new AutoGPTServerAPI();
|
||||
const api = useMemo(() => new AutoGPTServerAPI(), []);
|
||||
|
||||
const [flowVersions, setFlowVersions] = useState<Graph[] | null>(null);
|
||||
const [selectedVersion, setSelectedFlowVersion] = useState(
|
||||
@@ -41,7 +41,7 @@ export const FlowInfo: React.FC<
|
||||
|
||||
useEffect(() => {
|
||||
api.getGraphAllVersions(flow.id).then((result) => setFlowVersions(result));
|
||||
}, [flow.id]);
|
||||
}, [flow.id, api]);
|
||||
|
||||
return (
|
||||
<Card {...props}>
|
||||
|
||||
@@ -10,7 +10,7 @@ import {
|
||||
BlockIONumberSubSchema,
|
||||
BlockIOBooleanSubSchema,
|
||||
} from "@/lib/autogpt-server-api/types";
|
||||
import { FC, useEffect, useState } from "react";
|
||||
import React, { FC, useCallback, useEffect, useState } from "react";
|
||||
import { Button } from "./ui/button";
|
||||
import { Switch } from "./ui/switch";
|
||||
import {
|
||||
@@ -296,7 +296,7 @@ const NodeKeyValueInput: FC<{
|
||||
className,
|
||||
displayName,
|
||||
}) => {
|
||||
const getPairValues = () => {
|
||||
const getPairValues = useCallback(() => {
|
||||
let defaultEntries = new Map<string, any>();
|
||||
|
||||
connections
|
||||
@@ -311,7 +311,7 @@ const NodeKeyValueInput: FC<{
|
||||
});
|
||||
|
||||
return Array.from(defaultEntries, ([key, value]) => ({ key, value }));
|
||||
};
|
||||
}, [connections, entries, schema.default, selfKey]);
|
||||
|
||||
const [keyValuePairs, setKeyValuePairs] = useState<
|
||||
{ key: string; value: string | number | null }[]
|
||||
@@ -319,7 +319,7 @@ const NodeKeyValueInput: FC<{
|
||||
|
||||
useEffect(
|
||||
() => setKeyValuePairs(getPairValues()),
|
||||
[connections, entries, schema.default],
|
||||
[connections, entries, schema.default, getPairValues],
|
||||
);
|
||||
|
||||
function updateKeyValuePairs(newPairs: typeof keyValuePairs) {
|
||||
@@ -587,6 +587,52 @@ const NodeStringInput: FC<{
|
||||
);
|
||||
};
|
||||
|
||||
export const NodeTextBoxInput: FC<{
|
||||
selfKey: string;
|
||||
schema: BlockIOStringSubSchema;
|
||||
value?: string;
|
||||
error?: string;
|
||||
handleInputChange: NodeObjectInputTreeProps["handleInputChange"];
|
||||
handleInputClick: NodeObjectInputTreeProps["handleInputClick"];
|
||||
className?: string;
|
||||
displayName: string;
|
||||
}> = ({
|
||||
selfKey,
|
||||
schema,
|
||||
value = "",
|
||||
error,
|
||||
handleInputChange,
|
||||
handleInputClick,
|
||||
className,
|
||||
displayName,
|
||||
}) => {
|
||||
return (
|
||||
<div className={className}>
|
||||
<div
|
||||
className="nodrag relative m-0 h-[200px] w-full bg-yellow-100 p-4"
|
||||
onClick={schema.secret ? () => handleInputClick(selfKey) : undefined}
|
||||
>
|
||||
<textarea
|
||||
id={selfKey}
|
||||
value={schema.secret && value ? "********" : value}
|
||||
readOnly={schema.secret}
|
||||
placeholder={
|
||||
schema?.placeholder || `Enter ${beautifyString(displayName)}`
|
||||
}
|
||||
onChange={(e) => handleInputChange(selfKey, e.target.value)}
|
||||
onBlur={(e) => handleInputChange(selfKey, e.target.value)}
|
||||
className="h-full w-full resize-none overflow-hidden border-none bg-transparent text-lg text-black outline-none"
|
||||
style={{
|
||||
fontSize: "min(1em, 16px)",
|
||||
lineHeight: "1.2",
|
||||
}}
|
||||
/>
|
||||
</div>
|
||||
{error && <span className="error-message">{error}</span>}
|
||||
</div>
|
||||
);
|
||||
};
|
||||
|
||||
const NodeNumberInput: FC<{
|
||||
selfKey: string;
|
||||
schema: BlockIONumberSubSchema;
|
||||
|
||||
@@ -19,7 +19,7 @@ const Input = React.forwardRef<HTMLInputElement, InputProps>(
|
||||
) {
|
||||
ref.current.value = value;
|
||||
}
|
||||
}, [value, type]);
|
||||
}, [value, type, ref]);
|
||||
return (
|
||||
<input
|
||||
type={type}
|
||||
|
||||
@@ -61,8 +61,10 @@ export default function useAgentGraph(
|
||||
const [nodes, setNodes] = useState<CustomNode[]>([]);
|
||||
const [edges, setEdges] = useState<CustomEdge[]>([]);
|
||||
|
||||
const apiUrl = process.env.NEXT_PUBLIC_AGPT_SERVER_URL!;
|
||||
const api = useMemo(() => new AutoGPTServerAPI(apiUrl), [apiUrl]);
|
||||
const api = useMemo(
|
||||
() => new AutoGPTServerAPI(process.env.NEXT_PUBLIC_AGPT_SERVER_URL!),
|
||||
[],
|
||||
);
|
||||
|
||||
// Connect to WebSocket
|
||||
useEffect(() => {
|
||||
@@ -89,16 +91,227 @@ export default function useAgentGraph(
|
||||
.getBlocks()
|
||||
.then((blocks) => setAvailableNodes(blocks))
|
||||
.catch();
|
||||
}, [api]);
|
||||
|
||||
//TODO to utils? repeated in Flow
|
||||
const formatEdgeID = useCallback((conn: Link | Connection): string => {
|
||||
if ("sink_id" in conn) {
|
||||
return `${conn.source_id}_${conn.source_name}_${conn.sink_id}_${conn.sink_name}`;
|
||||
} else {
|
||||
return `${conn.source}_${conn.sourceHandle}_${conn.target}_${conn.targetHandle}`;
|
||||
}
|
||||
}, []);
|
||||
|
||||
const getOutputType = useCallback(
|
||||
(nodes: CustomNode[], nodeId: string, handleId: string) => {
|
||||
const node = nodes.find((n) => n.id === nodeId);
|
||||
if (!node) return "unknown";
|
||||
|
||||
const outputSchema = node.data.outputSchema;
|
||||
if (!outputSchema) return "unknown";
|
||||
|
||||
const outputHandle = outputSchema.properties[handleId];
|
||||
if (!("type" in outputHandle)) return "unknown";
|
||||
return outputHandle.type;
|
||||
},
|
||||
[],
|
||||
);
|
||||
|
||||
// Load existing graph
|
||||
const loadGraph = useCallback(
|
||||
(graph: Graph) => {
|
||||
setSavedAgent(graph);
|
||||
setAgentName(graph.name);
|
||||
setAgentDescription(graph.description);
|
||||
|
||||
setNodes(() => {
|
||||
const newNodes = graph.nodes.map((node) => {
|
||||
const block = availableNodes.find(
|
||||
(block) => block.id === node.block_id,
|
||||
)!;
|
||||
const newNode: CustomNode = {
|
||||
id: node.id,
|
||||
type: "custom",
|
||||
position: {
|
||||
x: node.metadata.position.x,
|
||||
y: node.metadata.position.y,
|
||||
},
|
||||
data: {
|
||||
block_id: block.id,
|
||||
blockType: block.name,
|
||||
categories: block.categories,
|
||||
description: block.description,
|
||||
title: `${block.name} ${node.id}`,
|
||||
inputSchema: block.inputSchema,
|
||||
outputSchema: block.outputSchema,
|
||||
hardcodedValues: node.input_default,
|
||||
connections: graph.links
|
||||
.filter((l) => [l.source_id, l.sink_id].includes(node.id))
|
||||
.map((link) => ({
|
||||
edge_id: formatEdgeID(link),
|
||||
source: link.source_id,
|
||||
sourceHandle: link.source_name,
|
||||
target: link.sink_id,
|
||||
targetHandle: link.sink_name,
|
||||
})),
|
||||
isOutputOpen: false,
|
||||
},
|
||||
};
|
||||
return newNode;
|
||||
});
|
||||
setEdges((_) =>
|
||||
graph.links.map((link) => ({
|
||||
id: formatEdgeID(link),
|
||||
type: "custom",
|
||||
data: {
|
||||
edgeColor: getTypeColor(
|
||||
getOutputType(newNodes, link.source_id, link.source_name!),
|
||||
),
|
||||
sourcePos: newNodes.find((node) => node.id === link.source_id)
|
||||
?.position,
|
||||
isStatic: link.is_static,
|
||||
beadUp: 0,
|
||||
beadDown: 0,
|
||||
beadData: [],
|
||||
},
|
||||
markerEnd: {
|
||||
type: MarkerType.ArrowClosed,
|
||||
strokeWidth: 2,
|
||||
color: getTypeColor(
|
||||
getOutputType(newNodes, link.source_id, link.source_name!),
|
||||
),
|
||||
},
|
||||
source: link.source_id,
|
||||
target: link.sink_id,
|
||||
sourceHandle: link.source_name || undefined,
|
||||
targetHandle: link.sink_name || undefined,
|
||||
})),
|
||||
);
|
||||
return newNodes;
|
||||
});
|
||||
},
|
||||
[availableNodes, formatEdgeID, getOutputType],
|
||||
);
|
||||
|
||||
const getFrontendId = useCallback(
|
||||
(backendId: string, nodes: CustomNode[]) => {
|
||||
const node = nodes.find((node) => node.data.backend_id === backendId);
|
||||
return node?.id;
|
||||
},
|
||||
[],
|
||||
);
|
||||
|
||||
const updateEdgeBeads = useCallback(
|
||||
(executionData: NodeExecutionResult) => {
|
||||
setEdges((edges) => {
|
||||
return edges.map((e) => {
|
||||
const edge = { ...e, data: { ...e.data } } as CustomEdge;
|
||||
|
||||
if (executionData.status === "COMPLETED") {
|
||||
// Produce output beads
|
||||
for (let key in executionData.output_data) {
|
||||
if (
|
||||
edge.source !== getFrontendId(executionData.node_id, nodes) ||
|
||||
edge.sourceHandle !== key
|
||||
) {
|
||||
continue;
|
||||
}
|
||||
edge.data!.beadUp = (edge.data!.beadUp ?? 0) + 1;
|
||||
// For static edges beadDown is always one less than beadUp
|
||||
// Because there's no queueing and one bead is always at the connection point
|
||||
if (edge.data?.isStatic) {
|
||||
edge.data!.beadDown = (edge.data!.beadUp ?? 0) - 1;
|
||||
edge.data!.beadData = edge.data!.beadData!.slice(0, -1);
|
||||
continue;
|
||||
}
|
||||
//todo kcze this assumes output at key is always array with one element
|
||||
edge.data!.beadData = [
|
||||
executionData.output_data[key][0],
|
||||
...edge.data!.beadData!,
|
||||
];
|
||||
}
|
||||
} else if (executionData.status === "RUNNING") {
|
||||
// Consume input beads
|
||||
for (let key in executionData.input_data) {
|
||||
if (
|
||||
edge.target !== getFrontendId(executionData.node_id, nodes) ||
|
||||
edge.targetHandle !== key
|
||||
) {
|
||||
continue;
|
||||
}
|
||||
// Skip decreasing bead count if edge doesn't match or if it's static
|
||||
if (
|
||||
edge.data!.beadData![edge.data!.beadData!.length - 1] !==
|
||||
executionData.input_data[key] ||
|
||||
edge.data?.isStatic
|
||||
) {
|
||||
continue;
|
||||
}
|
||||
edge.data!.beadDown = (edge.data!.beadDown ?? 0) + 1;
|
||||
edge.data!.beadData = edge.data!.beadData!.slice(0, -1);
|
||||
}
|
||||
}
|
||||
return edge;
|
||||
});
|
||||
});
|
||||
},
|
||||
[getFrontendId, nodes],
|
||||
);
|
||||
|
||||
const updateNodesWithExecutionData = useCallback(
|
||||
(executionData: NodeExecutionResult) => {
|
||||
if (passDataToBeads) {
|
||||
updateEdgeBeads(executionData);
|
||||
}
|
||||
setNodes((nodes) => {
|
||||
const nodeId = nodes.find(
|
||||
(node) => node.data.backend_id === executionData.node_id,
|
||||
)?.id;
|
||||
if (!nodeId) {
|
||||
console.error(
|
||||
"Node not found for execution data:",
|
||||
executionData,
|
||||
"This shouldn't happen and means that the frontend and backend are out of sync.",
|
||||
);
|
||||
return nodes;
|
||||
}
|
||||
return nodes.map((node) =>
|
||||
node.id === nodeId
|
||||
? {
|
||||
...node,
|
||||
data: {
|
||||
...node.data,
|
||||
status: executionData.status,
|
||||
executionResults:
|
||||
Object.keys(executionData.output_data).length > 0
|
||||
? [
|
||||
...(node.data.executionResults || []),
|
||||
{
|
||||
execId: executionData.node_exec_id,
|
||||
data: executionData.output_data,
|
||||
},
|
||||
]
|
||||
: node.data.executionResults,
|
||||
isOutputOpen: true,
|
||||
},
|
||||
}
|
||||
: node,
|
||||
);
|
||||
});
|
||||
},
|
||||
[passDataToBeads, updateEdgeBeads],
|
||||
);
|
||||
|
||||
useEffect(() => {
|
||||
if (!flowID || availableNodes.length == 0) return;
|
||||
|
||||
(template ? api.getTemplate(flowID) : api.getGraph(flowID)).then((graph) =>
|
||||
loadGraph(graph),
|
||||
(template ? api.getTemplate(flowID) : api.getGraph(flowID)).then(
|
||||
(graph) => {
|
||||
console.log("Loading graph");
|
||||
loadGraph(graph);
|
||||
},
|
||||
);
|
||||
}, [flowID, template, availableNodes]);
|
||||
}, [flowID, template, availableNodes, api, loadGraph]);
|
||||
|
||||
// Update nodes with execution data
|
||||
useEffect(() => {
|
||||
@@ -119,7 +332,68 @@ export default function useAgentGraph(
|
||||
});
|
||||
return [];
|
||||
});
|
||||
}, [updateQueue, nodesSyncedWithSavedAgent]);
|
||||
}, [updateQueue, nodesSyncedWithSavedAgent, updateNodesWithExecutionData]);
|
||||
|
||||
const validateNodes = useCallback((): boolean => {
|
||||
let isValid = true;
|
||||
|
||||
nodes.forEach((node) => {
|
||||
const validate = ajv.compile(node.data.inputSchema);
|
||||
const errors = {} as { [key: string]: string };
|
||||
|
||||
// Validate values against schema using AJV
|
||||
const valid = validate(node.data.hardcodedValues);
|
||||
if (!valid) {
|
||||
// Populate errors if validation fails
|
||||
validate.errors?.forEach((error) => {
|
||||
// Skip error if there's an edge connected
|
||||
const path =
|
||||
"dataPath" in error
|
||||
? (error.dataPath as string)
|
||||
: error.instancePath;
|
||||
const handle = path.split(/[\/.]/)[0];
|
||||
if (
|
||||
node.data.connections.some(
|
||||
(conn) => conn.target === node.id || conn.targetHandle === handle,
|
||||
)
|
||||
) {
|
||||
return;
|
||||
}
|
||||
console.warn("Error", error);
|
||||
isValid = false;
|
||||
if (path && error.message) {
|
||||
const key = path.slice(1);
|
||||
console.log("Error", key, error.message);
|
||||
setNestedProperty(
|
||||
errors,
|
||||
key,
|
||||
error.message[0].toUpperCase() + error.message.slice(1),
|
||||
);
|
||||
} else if (error.keyword === "required") {
|
||||
const key = error.params.missingProperty;
|
||||
setNestedProperty(errors, key, "This field is required");
|
||||
}
|
||||
});
|
||||
}
|
||||
// Set errors
|
||||
setNodes((nodes) => {
|
||||
return nodes.map((n) => {
|
||||
if (n.id === node.id) {
|
||||
return {
|
||||
...n,
|
||||
data: {
|
||||
...n.data,
|
||||
errors,
|
||||
},
|
||||
};
|
||||
}
|
||||
return n;
|
||||
});
|
||||
});
|
||||
});
|
||||
|
||||
return isValid;
|
||||
}, [nodes]);
|
||||
|
||||
// Handle user requests
|
||||
useEffect(() => {
|
||||
@@ -224,7 +498,13 @@ export default function useAgentGraph(
|
||||
.stopGraphExecution(savedAgent.id, saveRunRequest.activeExecutionID)
|
||||
.then(() => setSaveRunRequest({ request: "none", state: "none" }));
|
||||
}
|
||||
}, [saveRunRequest, savedAgent, nodesSyncedWithSavedAgent]);
|
||||
}, [
|
||||
api,
|
||||
saveRunRequest,
|
||||
savedAgent,
|
||||
nodesSyncedWithSavedAgent,
|
||||
validateNodes,
|
||||
]);
|
||||
|
||||
// Check if node ids are synced with saved agent
|
||||
useEffect(() => {
|
||||
@@ -241,275 +521,6 @@ export default function useAgentGraph(
|
||||
setNodesSyncedWithSavedAgent(oneNodeSynced);
|
||||
}, [savedAgent, nodes]);
|
||||
|
||||
const validateNodes = useCallback((): boolean => {
|
||||
let isValid = true;
|
||||
|
||||
nodes.forEach((node) => {
|
||||
const validate = ajv.compile(node.data.inputSchema);
|
||||
const errors = {} as { [key: string]: string };
|
||||
|
||||
// Validate values against schema using AJV
|
||||
const valid = validate(node.data.hardcodedValues);
|
||||
if (!valid) {
|
||||
// Populate errors if validation fails
|
||||
validate.errors?.forEach((error) => {
|
||||
// Skip error if there's an edge connected
|
||||
const path =
|
||||
"dataPath" in error
|
||||
? (error.dataPath as string)
|
||||
: error.instancePath;
|
||||
const handle = path.split(/[\/.]/)[0];
|
||||
if (
|
||||
node.data.connections.some(
|
||||
(conn) => conn.target === node.id || conn.targetHandle === handle,
|
||||
)
|
||||
) {
|
||||
return;
|
||||
}
|
||||
console.warn("Error", error);
|
||||
isValid = false;
|
||||
if (path && error.message) {
|
||||
const key = path.slice(1);
|
||||
console.log("Error", key, error.message);
|
||||
setNestedProperty(
|
||||
errors,
|
||||
key,
|
||||
error.message[0].toUpperCase() + error.message.slice(1),
|
||||
);
|
||||
} else if (error.keyword === "required") {
|
||||
const key = error.params.missingProperty;
|
||||
setNestedProperty(errors, key, "This field is required");
|
||||
}
|
||||
});
|
||||
}
|
||||
// Set errors
|
||||
setNodes((nodes) => {
|
||||
return nodes.map((n) => {
|
||||
if (n.id === node.id) {
|
||||
return {
|
||||
...n,
|
||||
data: {
|
||||
...n.data,
|
||||
errors,
|
||||
},
|
||||
};
|
||||
}
|
||||
return n;
|
||||
});
|
||||
});
|
||||
});
|
||||
|
||||
return isValid;
|
||||
}, [nodes]);
|
||||
|
||||
const getFrontendId = useCallback(
|
||||
(backendId: string, nodes: CustomNode[]) => {
|
||||
const node = nodes.find((node) => node.data.backend_id === backendId);
|
||||
return node?.id;
|
||||
},
|
||||
[],
|
||||
);
|
||||
|
||||
const updateEdgeBeads = useCallback(
|
||||
(executionData: NodeExecutionResult) => {
|
||||
setEdges((edges) => {
|
||||
return edges.map((e) => {
|
||||
const edge = { ...e, data: { ...e.data } } as CustomEdge;
|
||||
|
||||
if (executionData.status === "COMPLETED") {
|
||||
// Produce output beads
|
||||
for (let key in executionData.output_data) {
|
||||
if (
|
||||
edge.source !== getFrontendId(executionData.node_id, nodes) ||
|
||||
edge.sourceHandle !== key
|
||||
) {
|
||||
continue;
|
||||
}
|
||||
edge.data!.beadUp = (edge.data!.beadUp ?? 0) + 1;
|
||||
// For static edges beadDown is always one less than beadUp
|
||||
// Because there's no queueing and one bead is always at the connection point
|
||||
if (edge.data?.isStatic) {
|
||||
edge.data!.beadDown = (edge.data!.beadUp ?? 0) - 1;
|
||||
edge.data!.beadData = edge.data!.beadData!.slice(0, -1);
|
||||
continue;
|
||||
}
|
||||
//todo kcze this assumes output at key is always array with one element
|
||||
edge.data!.beadData = [
|
||||
executionData.output_data[key][0],
|
||||
...edge.data!.beadData!,
|
||||
];
|
||||
}
|
||||
} else if (executionData.status === "RUNNING") {
|
||||
// Consume input beads
|
||||
for (let key in executionData.input_data) {
|
||||
if (
|
||||
edge.target !== getFrontendId(executionData.node_id, nodes) ||
|
||||
edge.targetHandle !== key
|
||||
) {
|
||||
continue;
|
||||
}
|
||||
// Skip decreasing bead count if edge doesn't match or if it's static
|
||||
if (
|
||||
edge.data!.beadData![edge.data!.beadData!.length - 1] !==
|
||||
executionData.input_data[key] ||
|
||||
edge.data?.isStatic
|
||||
) {
|
||||
continue;
|
||||
}
|
||||
edge.data!.beadDown = (edge.data!.beadDown ?? 0) + 1;
|
||||
edge.data!.beadData = edge.data!.beadData!.slice(0, -1);
|
||||
}
|
||||
}
|
||||
return edge;
|
||||
});
|
||||
});
|
||||
},
|
||||
[edges],
|
||||
);
|
||||
|
||||
const updateNodesWithExecutionData = useCallback(
|
||||
(executionData: NodeExecutionResult) => {
|
||||
if (passDataToBeads) {
|
||||
updateEdgeBeads(executionData);
|
||||
}
|
||||
setNodes((nodes) => {
|
||||
const nodeId = nodes.find(
|
||||
(node) => node.data.backend_id === executionData.node_id,
|
||||
)?.id;
|
||||
if (!nodeId) {
|
||||
console.error(
|
||||
"Node not found for execution data:",
|
||||
executionData,
|
||||
"This shouldn't happen and means that the frontend and backend are out of sync.",
|
||||
);
|
||||
return nodes;
|
||||
}
|
||||
return nodes.map((node) =>
|
||||
node.id === nodeId
|
||||
? {
|
||||
...node,
|
||||
data: {
|
||||
...node.data,
|
||||
status: executionData.status,
|
||||
executionResults:
|
||||
Object.keys(executionData.output_data).length > 0
|
||||
? [
|
||||
...(node.data.executionResults || []),
|
||||
{
|
||||
execId: executionData.node_exec_id,
|
||||
data: executionData.output_data,
|
||||
},
|
||||
]
|
||||
: node.data.executionResults,
|
||||
isOutputOpen: true,
|
||||
},
|
||||
}
|
||||
: node,
|
||||
);
|
||||
});
|
||||
},
|
||||
[nodes],
|
||||
);
|
||||
|
||||
//TODO to utils? repeated in Flow
|
||||
const formatEdgeID = useCallback((conn: Link | Connection): string => {
|
||||
if ("sink_id" in conn) {
|
||||
return `${conn.source_id}_${conn.source_name}_${conn.sink_id}_${conn.sink_name}`;
|
||||
} else {
|
||||
return `${conn.source}_${conn.sourceHandle}_${conn.target}_${conn.targetHandle}`;
|
||||
}
|
||||
}, []);
|
||||
|
||||
const getOutputType = useCallback(
|
||||
(nodeId: string, handleId: string) => {
|
||||
const node = nodes.find((n) => n.id === nodeId);
|
||||
if (!node) return "unknown";
|
||||
|
||||
const outputSchema = node.data.outputSchema;
|
||||
if (!outputSchema) return "unknown";
|
||||
|
||||
const outputHandle = outputSchema.properties[handleId];
|
||||
if (!("type" in outputHandle)) return "unknown";
|
||||
return outputHandle.type;
|
||||
},
|
||||
[nodes],
|
||||
);
|
||||
|
||||
const loadGraph = useCallback(
|
||||
(graph: Graph) => {
|
||||
setSavedAgent(graph);
|
||||
setAgentName(graph.name);
|
||||
setAgentDescription(graph.description);
|
||||
|
||||
setNodes(() => {
|
||||
const newNodes = graph.nodes.map((node) => {
|
||||
const block = availableNodes.find(
|
||||
(block) => block.id === node.block_id,
|
||||
)!;
|
||||
const newNode: CustomNode = {
|
||||
id: node.id,
|
||||
type: "custom",
|
||||
position: {
|
||||
x: node.metadata.position.x,
|
||||
y: node.metadata.position.y,
|
||||
},
|
||||
data: {
|
||||
block_id: block.id,
|
||||
blockType: block.name,
|
||||
categories: block.categories,
|
||||
description: block.description,
|
||||
title: `${block.name} ${node.id}`,
|
||||
inputSchema: block.inputSchema,
|
||||
outputSchema: block.outputSchema,
|
||||
hardcodedValues: node.input_default,
|
||||
connections: graph.links
|
||||
.filter((l) => [l.source_id, l.sink_id].includes(node.id))
|
||||
.map((link) => ({
|
||||
edge_id: formatEdgeID(link),
|
||||
source: link.source_id,
|
||||
sourceHandle: link.source_name,
|
||||
target: link.sink_id,
|
||||
targetHandle: link.sink_name,
|
||||
})),
|
||||
isOutputOpen: false,
|
||||
},
|
||||
};
|
||||
return newNode;
|
||||
});
|
||||
setEdges((_) =>
|
||||
graph.links.map((link) => ({
|
||||
id: formatEdgeID(link),
|
||||
type: "custom",
|
||||
data: {
|
||||
edgeColor: getTypeColor(
|
||||
getOutputType(link.source_id, link.source_name!),
|
||||
),
|
||||
sourcePos: nodes.find((node) => node.id === link.source_id)
|
||||
?.position,
|
||||
isStatic: link.is_static,
|
||||
beadUp: 0,
|
||||
beadDown: 0,
|
||||
beadData: [],
|
||||
},
|
||||
markerEnd: {
|
||||
type: MarkerType.ArrowClosed,
|
||||
strokeWidth: 2,
|
||||
color: getTypeColor(
|
||||
getOutputType(link.source_id, link.source_name!),
|
||||
),
|
||||
},
|
||||
source: link.source_id,
|
||||
target: link.sink_id,
|
||||
sourceHandle: link.source_name || undefined,
|
||||
targetHandle: link.sink_name || undefined,
|
||||
})),
|
||||
);
|
||||
return newNodes;
|
||||
});
|
||||
},
|
||||
[availableNodes],
|
||||
);
|
||||
|
||||
const prepareNodeInputData = useCallback(
|
||||
(node: CustomNode) => {
|
||||
console.debug(
|
||||
@@ -696,7 +707,15 @@ export default function useAgentGraph(
|
||||
}));
|
||||
});
|
||||
},
|
||||
[nodes, edges, savedAgent],
|
||||
[
|
||||
api,
|
||||
nodes,
|
||||
edges,
|
||||
savedAgent,
|
||||
agentName,
|
||||
agentDescription,
|
||||
prepareNodeInputData,
|
||||
],
|
||||
);
|
||||
|
||||
const requestSave = useCallback(
|
||||
|
||||
@@ -84,12 +84,12 @@ export function useBezierPath(
|
||||
|
||||
return length;
|
||||
},
|
||||
[path],
|
||||
[getPointForT],
|
||||
);
|
||||
|
||||
const length = useMemo(() => {
|
||||
return getArcLength(1);
|
||||
}, [path]);
|
||||
}, [getArcLength]);
|
||||
|
||||
const getBezierDerivative = useCallback(
|
||||
(t: number) => {
|
||||
@@ -131,7 +131,7 @@ export function useBezierPath(
|
||||
|
||||
return t;
|
||||
},
|
||||
[path],
|
||||
[getArcLength, getBezierDerivative, length],
|
||||
);
|
||||
|
||||
const getPointAtDistance = useCallback(
|
||||
@@ -143,7 +143,7 @@ export function useBezierPath(
|
||||
const t = getTForDistance(distance);
|
||||
return getPointForT(t);
|
||||
},
|
||||
[path],
|
||||
[getTForDistance, getPointForT, length],
|
||||
);
|
||||
|
||||
return {
|
||||
|
||||
@@ -13,6 +13,7 @@ export type Block = {
|
||||
inputSchema: BlockIORootSchema;
|
||||
outputSchema: BlockIORootSchema;
|
||||
staticOutput: boolean;
|
||||
uiType: BlockUIType;
|
||||
};
|
||||
|
||||
export type BlockIORootSchema = {
|
||||
@@ -182,3 +183,10 @@ export type User = {
|
||||
id: string;
|
||||
email: string;
|
||||
};
|
||||
|
||||
export enum BlockUIType {
|
||||
STANDARD = "Standard",
|
||||
INPUT = "Input",
|
||||
OUTPUT = "Output",
|
||||
NOTE = "Note",
|
||||
}
|
||||
|
||||
(File diff suppressed because it is too large.)
@@ -1,4 +1,4 @@
FROM python:3.11-slim-buster as server_base
FROM python:3.11-slim-buster AS server_base

# Set environment variables
ENV PYTHONDONTWRITEBYTECODE 1
@@ -6,8 +6,9 @@ ENV PYTHONUNBUFFERED 1

WORKDIR /app

# postgresql-client is needed to check if the postgres service is ready for running migrations
RUN apt-get update \
&& apt-get install -y build-essential curl ffmpeg wget libcurl4-gnutls-dev libexpat1-dev gettext libz-dev libssl-dev \
&& apt-get install -y build-essential curl ffmpeg wget libcurl4-gnutls-dev libexpat1-dev gettext libz-dev libssl-dev postgresql-client \
&& apt-get clean \
&& rm -rf /var/lib/apt/lists/* \
&& wget https://github.com/git/git/archive/v2.28.0.tar.gz -O git.tar.gz \
@@ -23,27 +24,30 @@ ENV POETRY_VERSION=1.8.3 \
PATH="$POETRY_HOME/bin:$PATH"
RUN pip3 install poetry

FROM server_base AS server_dependencies

RUN mkdir -p /app/rnd/autogpt_libs
RUN mkdir -p /app/rnd/autogpt_server

COPY rnd/autogpt_libs /app/rnd/autogpt_libs

COPY rnd/autogpt_server/poetry.lock rnd/autogpt_server/pyproject.toml /app/rnd/autogpt_server/

WORKDIR /app/rnd/autogpt_server

COPY rnd/autogpt_server/pyproject.toml rnd/autogpt_server/poetry.lock ./
RUN poetry install --no-interaction --no-ansi

FROM server_dependencies AS server_prisma

COPY rnd/autogpt_server/schema.prisma ./
RUN poetry run prisma generate

COPY rnd/autogpt_server /app/rnd/autogpt_server
FROM server_base as server
FROM server_prisma AS server

COPY rnd/autogpt_server /app/rnd/autogpt_server

ENV PORT=8000
ENV DATABASE_URL=""
ENV PORT=8000

CMD ["poetry", "run", "rest"]

FROM server_base as executor

ENV PORT=8002
ENV DATABASE_URL=""

CMD ["poetry", "run", "executor"]
@@ -1,45 +0,0 @@
FROM python:3.11-slim-buster as server_base

# Set environment variables
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1

WORKDIR /app

RUN apt-get update \
&& apt-get install -y build-essential curl ffmpeg wget libcurl4-gnutls-dev libexpat1-dev gettext libz-dev libssl-dev \
&& apt-get clean \
&& rm -rf /var/lib/apt/lists/* \
&& wget https://github.com/git/git/archive/v2.28.0.tar.gz -O git.tar.gz \
&& tar -zxf git.tar.gz \
&& cd git-* \
&& make prefix=/usr all \
&& make prefix=/usr install

ENV POETRY_VERSION=1.8.3 \
POETRY_HOME="/opt/poetry" \
POETRY_NO_INTERACTION=1 \
POETRY_VIRTUALENVS_CREATE=false \
PATH="$POETRY_HOME/bin:$PATH"
RUN pip3 install poetry

COPY rnd/autogpt_libs /app/rnd/autogpt_libs

WORKDIR /app/rnd/autogpt_server

COPY rnd/autogpt_server/pyproject.toml rnd/autogpt_server/poetry.lock ./
RUN poetry install --no-interaction --no-ansi

COPY rnd/autogpt_server/schema.prisma ./
RUN poetry run prisma generate

COPY rnd/autogpt_server /app/rnd/autogpt_server

FROM server_base as server

ENV PORT=8001
ENV DATABASE_URL=""

CMD ["poetry", "run", "ws"]
@@ -101,7 +101,7 @@ docker compose down
If you run into issues with dangling orphans, try:

```sh
docker-compose down --volumes --remove-orphans && docker-compose up --force-recreate --renew-anon-volumes --remove-orphans
docker compose down --volumes --remove-orphans && docker-compose up --force-recreate --renew-anon-volumes --remove-orphans
```

## Testing
@@ -55,15 +55,15 @@ for cls in all_subclasses(Block):
raise ValueError(f"Block ID {block.name} error: {block.id} is already in use")

# Prevent duplicate field name in input_schema and output_schema
duplicate_field_names = set(block.input_schema.__fields__.keys()) & set(
block.output_schema.__fields__.keys()
duplicate_field_names = set(block.input_schema.model_fields.keys()) & set(
block.output_schema.model_fields.keys()
)
if duplicate_field_names:
raise ValueError(
f"{block.name} has duplicate field names in input_schema and output_schema: {duplicate_field_names}"
)

for field in block.input_schema.__fields__.values():
for field in block.input_schema.model_fields.values():
if field.annotation is bool and field.default not in (True, False):
raise ValueError(f"{block.name} has a boolean field with no default value")
@@ -1,5 +1,7 @@
+import re
 from typing import Any, List

+from jinja2 import BaseLoader, Environment
 from pydantic import Field

 from autogpt_server.data.block import (
@@ -12,6 +14,8 @@ from autogpt_server.data.block import (
 from autogpt_server.data.model import SchemaField
 from autogpt_server.util.mock import MockObject

+jinja = Environment(loader=BaseLoader())
+

 class StoreValueBlock(Block):
     """
@@ -136,7 +140,7 @@ class FindInDictionaryBlock(Block):
         yield "missing", input_data.input


-class InputBlock(Block):
+class AgentInputBlock(Block):
     """
     This block is used to provide input to the graph.

@@ -148,13 +152,20 @@ class InputBlock(Block):
     class Input(BlockSchema):
         value: Any = SchemaField(description="The value to be passed as input.")
         name: str = SchemaField(description="The name of the input.")
-        description: str = SchemaField(description="The description of the input.")
+        description: str = SchemaField(
+            description="The description of the input.",
+            default="",
+            advanced=True,
+        )
         placeholder_values: List[Any] = SchemaField(
-            description="The placeholder values to be passed as input."
+            description="The placeholder values to be passed as input.",
+            default=[],
+            advanced=True,
         )
         limit_to_placeholder_values: bool = SchemaField(
             description="Whether to limit the selection to placeholder values.",
             default=False,
+            advanced=True,
         )

     class Output(BlockSchema):
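The new `advanced=True`/`default=...` arguments suggest `SchemaField` wraps `pydantic.Field` with extra UI metadata. The repository's real implementation is not shown in this diff, so the stand-in below is purely hypothetical:

```python
from typing import Any

from pydantic import Field

def SchemaField(description: str = "", default: Any = ..., advanced: bool = False, **extra: Any) -> Any:
    # Hypothetical wrapper (not the project's actual code): forwards to
    # pydantic.Field and stashes UI-only flags like `advanced` in the
    # generated JSON schema, where a builder frontend can read them.
    return Field(default, description=description, json_schema_extra={"advanced": advanced, **extra})
```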
@@ -164,8 +175,8 @@ class InputBlock(Block):
         super().__init__(
             id="c0a8e994-ebf1-4a9c-a4d8-89d09c86741b",
             description="This block is used to provide input to the graph.",
-            input_schema=InputBlock.Input,
-            output_schema=InputBlock.Output,
+            input_schema=AgentInputBlock.Input,
+            output_schema=AgentInputBlock.Output,
             test_input=[
                 {
                     "value": "Hello, World!",
@@ -194,7 +205,7 @@ class InputBlock(Block):
         yield "result", input_data.value


-class OutputBlock(Block):
+class AgentOutputBlock(Block):
     """
     Records the output of the graph for users to see.
@@ -215,13 +226,17 @@ class OutputBlock(Block):
     """

     class Input(BlockSchema):
-        recorded_value: Any = SchemaField(
-            description="The value to be recorded as output."
-        )
+        value: Any = SchemaField(description="The value to be recorded as output.")
         name: str = SchemaField(description="The name of the output.")
-        description: str = SchemaField(description="The description of the output.")
-        fmt_string: str = SchemaField(
-            description="The format string to be used to format the recorded_value."
+        description: str = SchemaField(
+            description="The description of the output.",
+            default="",
+            advanced=True,
+        )
+        format: str = SchemaField(
+            description="The format string to be used to format the recorded_value.",
+            default="",
+            advanced=True,
         )

     class Output(BlockSchema):
@@ -238,31 +253,31 @@ class OutputBlock(Block):
                 "This block is key for capturing and presenting final results or "
                 "important intermediate outputs of the graph execution."
             ),
-            input_schema=OutputBlock.Input,
-            output_schema=OutputBlock.Output,
+            input_schema=AgentOutputBlock.Input,
+            output_schema=AgentOutputBlock.Output,
             test_input=[
                 {
-                    "recorded_value": "Hello, World!",
+                    "value": "Hello, World!",
                     "name": "output_1",
                     "description": "This is a test output.",
-                    "fmt_string": "{value}",
+                    "format": "{{ output_1 }}!!",
                 },
                 {
-                    "recorded_value": 42,
+                    "value": "42",
                     "name": "output_2",
                     "description": "This is another test output.",
-                    "fmt_string": "{value}",
+                    "format": "{{ output_2 }}",
                 },
                 {
-                    "recorded_value": MockObject(value="!!", key="key"),
+                    "value": MockObject(value="!!", key="key"),
                     "name": "output_3",
                     "description": "This is a test output with a mock object.",
-                    "fmt_string": "{value}",
+                    "format": "{{ output_3 }}",
                 },
             ],
             test_output=[
-                ("output", "Hello, World!"),
-                ("output", 42),
+                ("output", "Hello, World!!!"),
+                ("output", "42"),
                 ("output", MockObject(value="!!", key="key")),
             ],
             categories={BlockCategory.OUTPUT, BlockCategory.BASIC},
@@ -274,13 +289,15 @@ class OutputBlock(Block):
         Attempts to format the recorded_value using the fmt_string if provided.
         If formatting fails or no fmt_string is given, returns the original recorded_value.
         """
-        if input_data.fmt_string:
+        if input_data.format:
             try:
-                yield "output", input_data.fmt_string.format(input_data.recorded_value)
-            except Exception:
-                yield "output", input_data.recorded_value
+                fmt = re.sub(r"(?<!{){[ a-zA-Z0-9_]+}", r"{\g<0>}", input_data.format)
+                template = jinja.from_string(fmt)
+                yield "output", template.render({input_data.name: input_data.value})
+            except Exception as e:
+                yield "output", f"Error: {e}, {input_data.value}"
         else:
-            yield "output", input_data.recorded_value
+            yield "output", input_data.value


 class AddToDictionaryBlock(Block):
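The rewritten `run` escapes legacy single-brace `{name}` placeholders into Jinja's `{{ name }}` syntax before rendering, so both styles work. A self-contained sketch of that trick, reusing the `jinja` environment pattern from the imports hunk above:

```python
import re

from jinja2 import BaseLoader, Environment

jinja = Environment(loader=BaseLoader())

def render_output(name: str, value, format_str: str) -> str:
    # Wrap bare single-brace placeholders ({name}) in a second pair of braces
    # so Jinja evaluates them; the (?<!{) lookbehind leaves native {{ ... }}
    # expressions untouched.
    fmt = re.sub(r"(?<!{){[ a-zA-Z0-9_]+}", r"{\g<0>}", format_str)
    return jinja.from_string(fmt).render({name: value})

assert render_output("output_1", "Hello, World!", "{{ output_1 }}!!") == "Hello, World!!!"
assert render_output("output_1", "Hello, World!", "{output_1}!!") == "Hello, World!!!"
```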
@@ -422,7 +439,8 @@ class NoteBlock(Block):
     class Input(BlockSchema):
         text: str = SchemaField(description="The text to display in the sticky note.")

-    class Output(BlockSchema): ...
+    class Output(BlockSchema):
+        output: str = SchemaField(description="The text to display in the sticky note.")

     def __init__(self):
         super().__init__(
@@ -432,8 +450,11 @@ class NoteBlock(Block):
             input_schema=NoteBlock.Input,
             output_schema=NoteBlock.Output,
             test_input={"text": "Hello, World!"},
-            test_output=None,
+            test_output=[
+                ("output", "Hello, World!"),
+            ],
             ui_type=BlockUIType.NOTE,
         )

-    def run(self, input_data: Input) -> BlockOutput: ...
+    def run(self, input_data: Input) -> BlockOutput:
+        yield "output", input_data.text
@@ -438,7 +438,7 @@ class Message(BlockSchema):
 class AIConversationBlock(Block):
     class Input(BlockSchema):
         messages: List[Message] = SchemaField(
-            description="List of messages in the conversation.", min_items=1
+            description="List of messages in the conversation.", min_length=1
         )
         model: LlmModel = SchemaField(
             default=LlmModel.GPT4_TURBO,
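Another Pydantic v2 rename: the v1 `min_items` constraint is gone, and `min_length` now constrains sequence length as well as string length. A small illustration with plain Pydantic (illustrative types, not the project's `Message`):

```python
from typing import List

from pydantic import BaseModel, Field, ValidationError

class Message(BaseModel):
    role: str
    content: str

class Input(BaseModel):
    # Pydantic v2: `min_length` constrains list size; v1's `min_items` is removed.
    messages: List[Message] = Field(min_length=1)

try:
    Input(messages=[])
except ValidationError as err:
    print(err)  # "List should have at least 1 item after validation"
```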
@@ -9,7 +9,7 @@ from prisma.models import AgentGraph, AgentNode, AgentNodeLink
 from pydantic import BaseModel, PrivateAttr
 from pydantic_core import PydanticUndefinedType

-from autogpt_server.blocks.basic import InputBlock, OutputBlock
+from autogpt_server.blocks.basic import AgentInputBlock, AgentOutputBlock
 from autogpt_server.data.block import BlockInput, get_block, get_blocks
 from autogpt_server.data.db import BaseDbModel, transaction
 from autogpt_server.data.user import DEFAULT_USER_ID
@@ -106,7 +106,9 @@ class Graph(GraphMeta):
     def starting_nodes(self) -> list[Node]:
         outbound_nodes = {link.sink_id for link in self.links}
         input_nodes = {
-            v.id for v in self.nodes if isinstance(get_block(v.block_id), InputBlock)
+            v.id
+            for v in self.nodes
+            if isinstance(get_block(v.block_id), AgentInputBlock)
         }
         return [
             node
@@ -116,7 +118,9 @@ class Graph(GraphMeta):

     @property
     def ending_nodes(self) -> list[Node]:
-        return [v for v in self.nodes if isinstance(get_block(v.block_id), OutputBlock)]
+        return [
+            v for v in self.nodes if isinstance(get_block(v.block_id), AgentOutputBlock)
+        ]

     @property
     def subgraph_map(self) -> dict[str, str]:
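The selection logic itself appears unchanged: a node starts the graph if nothing links into it, or if it is an agent input block. The hunk cuts off mid-`return`, so the combination below is an inference; dicts stand in for `Node`/`Link` objects:

```python
# Toy graph: a -> b -> c, where "a" is an input block.
nodes = [
    {"id": "a", "is_input": True},
    {"id": "b", "is_input": False},
    {"id": "c", "is_input": False},
]
links = [{"source_id": "a", "sink_id": "b"}, {"source_id": "b", "sink_id": "c"}]

outbound = {link["sink_id"] for link in links}  # nodes that receive an inbound link
input_ids = {n["id"] for n in nodes if n["is_input"]}
# Inferred combination: no inbound link, or explicitly an input node.
starting = [n["id"] for n in nodes if n["id"] not in outbound or n["id"] in input_ids]
assert starting == ["a"]
```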
@@ -179,7 +183,9 @@ class Graph(GraphMeta):
                 + [sanitize(link.sink_name) for link in node.input_links]
             )
             for name in block.input_schema.get_required_fields():
-                if name not in provided_inputs and not isinstance(block, InputBlock):
+                if name not in provided_inputs and not isinstance(
+                    block, AgentInputBlock
+                ):
                     raise ValueError(
                         f"Node {block.name} #{node.id} required input missing: `{name}`"
                     )
@@ -193,7 +199,7 @@ class Graph(GraphMeta):
         def is_input_output_block(nid: str) -> bool:
             bid = node_map[nid].block_id
             b = get_block(bid)
-            return isinstance(b, InputBlock) or isinstance(b, OutputBlock)
+            return isinstance(b, AgentInputBlock) or isinstance(b, AgentOutputBlock)

         # subgraphs: all nodes in subgraph must be present in the graph.
         for subgraph_id, node_ids in self.subgraphs.items():
@@ -14,7 +14,7 @@ from typing import TYPE_CHECKING, Any, Coroutine, Generator, TypeVar
 if TYPE_CHECKING:
     from autogpt_server.server.rest_api import AgentServer

-from autogpt_server.blocks.basic import InputBlock
+from autogpt_server.blocks.basic import AgentInputBlock
 from autogpt_server.data import db
 from autogpt_server.data.block import Block, BlockData, BlockInput, get_block
 from autogpt_server.data.execution import (
@@ -699,7 +699,7 @@ class ExecutionManager(AppService):
         nodes_input = []
         for node in graph.starting_nodes:
             input_data = {}
-            if isinstance(get_block(node.block_id), InputBlock):
+            if isinstance(get_block(node.block_id), AgentInputBlock):
                 name = node.input_default.get("name")
                 if name and name in data:
                     input_data = {"value": data[name]}
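This is the step that routes named graph inputs to their blocks: each `AgentInputBlock` node stores a `name` in its `input_default`, and the caller's input dict is matched against it. A stripped-down illustration of just that lookup:

```python
# Caller-supplied inputs for one graph execution.
data = {"input_1": "Hello", "input_2": "World"}

# An AgentInputBlock node's stored defaults (see the test-graph hunk below).
node_input_default = {"name": "input_1"}

name = node_input_default.get("name")
input_data = {"value": data[name]} if name and name in data else {}
assert input_data == {"value": "Hello"}
```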
@@ -1,6 +1,6 @@
 from prisma.models import User

-from autogpt_server.blocks.basic import InputBlock, PrintToConsoleBlock
+from autogpt_server.blocks.basic import AgentInputBlock, PrintToConsoleBlock
 from autogpt_server.blocks.text import FillTextTemplateBlock
 from autogpt_server.data import graph
 from autogpt_server.data.graph import create_graph
@@ -28,22 +28,12 @@ def create_test_graph() -> graph.Graph:
     """
     nodes = [
         graph.Node(
-            block_id=InputBlock().id,
-            input_default={
-                "name": "input_1",
-                "description": "First input value",
-                "placeholder_values": [],
-                "limit_to_placeholder_values": False,
-            },
+            block_id=AgentInputBlock().id,
+            input_default={"name": "input_1"},
         ),
         graph.Node(
-            block_id=InputBlock().id,
-            input_default={
-                "name": "input_2",
-                "description": "Second input value",
-                "placeholder_values": [],
-                "limit_to_placeholder_values": False,
-            },
+            block_id=AgentInputBlock().id,
+            input_default={"name": "input_2"},
        ),
         graph.Node(
             block_id=FillTextTemplateBlock().id,
@@ -70,6 +70,7 @@ format = "linter:format"
 lint = "linter:lint"
 test = "run_tests:test"
+# https://poethepoet.natn.io/index.html

 [tool.poe]
 poetry_command = ""
@@ -8,7 +8,8 @@ def wait_for_postgres(max_retries=5, delay=5):
         try:
             result = subprocess.run(
                 [
-                    "docker-compose",
+                    "docker",
+                    "compose",
                     "-f",
                     "docker-compose.test.yaml",
                     "exec",
@@ -45,7 +46,8 @@ def test():
     # Start PostgreSQL with Docker Compose
     run_command(
         [
-            "docker-compose",
+            "docker",
+            "compose",
             "-f",
             "docker-compose.test.yaml",
             "up",
@@ -55,7 +57,7 @@ def test():
     )

     if not wait_for_postgres():
-        run_command(["docker-compose", "-f", "docker-compose.test.yaml", "down"])
+        run_command(["docker", "compose", "-f", "docker-compose.test.yaml", "down"])
         sys.exit(1)

     # Run Prisma migrations
@@ -64,6 +66,6 @@ def test():
     # Run the tests
     result = subprocess.run(["pytest"] + sys.argv[1:], check=False)

-    run_command(["docker-compose", "-f", "docker-compose.test.yaml", "down"])
+    run_command(["docker", "compose", "-f", "docker-compose.test.yaml", "down"])

     sys.exit(result.returncode)
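Compose V2 ships as a Docker CLI plugin rather than a standalone binary, so each `subprocess` invocation changes from one argv entry (`docker-compose`) to two (`docker`, `compose`). A minimal sketch of the pattern (assuming the compose plugin is installed and `docker-compose.test.yaml` exists):

```python
import subprocess

# V1 (standalone binary):  ["docker-compose", ...]
# V2 (docker CLI plugin):  ["docker", "compose", ...]
cmd = ["docker", "compose", "-f", "docker-compose.test.yaml", "down"]
subprocess.run(cmd, check=True)  # raises CalledProcessError on failure
```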
@@ -2,7 +2,7 @@ from uuid import UUID

 import pytest

-from autogpt_server.blocks.basic import InputBlock, StoreValueBlock
+from autogpt_server.blocks.basic import AgentInputBlock, StoreValueBlock
 from autogpt_server.data.graph import Graph, Link, Node
 from autogpt_server.data.user import DEFAULT_USER_ID, create_default_user
 from autogpt_server.server.model import CreateGraph
@@ -25,7 +25,7 @@ async def test_graph_creation(server: SpinTestServer):
     await create_default_user("false")

     value_block = StoreValueBlock().id
-    input_block = InputBlock().id
+    input_block = AgentInputBlock().id

     graph = Graph(
         id="test_graph",
@@ -36,23 +36,19 @@ async def assert_sample_graph_executions(
     graph_exec_id: str,
 ):
     executions = await agent_server.get_graph_run_node_execution_results(
-        test_graph.id, graph_exec_id, test_user.id
+        test_graph.id,
+        graph_exec_id,
+        test_user.id,
     )

     output_list = [{"result": ["Hello"]}, {"result": ["World"]}]
     input_list = [
         {
             "name": "input_1",
-            "description": "First input value",
-            "placeholder_values": [],
-            "limit_to_placeholder_values": False,
             "value": "Hello",
         },
         {
             "name": "input_2",
-            "description": "Second input value",
-            "placeholder_values": [],
-            "limit_to_placeholder_values": False,
             "value": "World",
         },
     ]
@@ -61,16 +57,24 @@ async def assert_sample_graph_executions(
     exec = executions[0]
     assert exec.status == execution.ExecutionStatus.COMPLETED
     assert exec.graph_exec_id == graph_exec_id
-    assert exec.output_data in output_list
-    assert exec.input_data in input_list
+    assert (
+        exec.output_data in output_list
+    ), f"Output data: {exec.output_data} and {output_list}"
+    assert (
+        exec.input_data in input_list
+    ), f"Input data: {exec.input_data} and {input_list}"
     assert exec.node_id in [test_graph.nodes[0].id, test_graph.nodes[1].id]

     # Executing StoreValueBlock
     exec = executions[1]
     assert exec.status == execution.ExecutionStatus.COMPLETED
     assert exec.graph_exec_id == graph_exec_id
-    assert exec.output_data in output_list
-    assert exec.input_data in input_list
+    assert (
+        exec.output_data in output_list
+    ), f"Output data: {exec.output_data} and {output_list}"
+    assert (
+        exec.input_data in input_list
+    ), f"Input data: {exec.input_data} and {input_list}"
     assert exec.node_id in [test_graph.nodes[0].id, test_graph.nodes[1].id]

     # Executing FillTextTemplateBlock
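One detail worth noting in this rewrite: parenthesizing the condition (rather than the whole `cond, msg` pair) is what lets the assert wrap across lines without becoming an always-truthy tuple. A tiny standalone example:

```python
output_data = {"result": ["Hello"]}
output_list = [{"result": ["Hello"]}, {"result": ["World"]}]

# The parentheses wrap only the condition; the message stays outside them,
# avoiding the classic `assert (cond, msg)` pitfall.
assert (
    output_data in output_list
), f"Output data: {output_data} and {output_list}"
```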
@@ -1,4 +1,3 @@
-version: "3"
 services:
   postgres:
     image: ankane/pgvector:latest
@@ -16,6 +15,32 @@ services:
     networks:
       - app-network

+  server_base:
+    build:
+      context: ../
+      dockerfile: rnd/autogpt_server/Dockerfile
+      target: server
+    image: autogpt_server:latest
+    command: ["echo", "This is a base image and should not be run directly"]
+
+  migrate:
+    image: autogpt_server:latest
+    command: ["sh", "-c", "until pg_isready -h postgres -U agpt_user -d agpt_local; do echo 'Waiting for postgres...'; sleep 2; done; poetry run prisma migrate deploy"]
+    depends_on:
+      postgres:
+        condition: service_healthy
+    environment:
+      - DATABASE_URL=postgresql://agpt_user:pass123@postgres:5432/agpt_local
+    networks:
+      - app-network
+    restart: on-failure
+    healthcheck:
+      test: ["CMD", "poetry", "run", "prisma", "migrate", "status"]
+      interval: 10s
+      timeout: 5s
+      retries: 5
+
   redis:
     image: redis:latest
     command: redis-server --requirepass password
@@ -25,10 +50,8 @@ services:
       - app-network

   rest_server:
-    build:
-      context: ../
-      dockerfile: rnd/autogpt_server/Dockerfile
-      target: server
+    image: autogpt_server:latest
     command: ["poetry", "run", "rest"]
     develop:
       watch:
         - path: ./
@@ -39,6 +62,8 @@ services:
         condition: service_started
       postgres:
         condition: service_healthy
+      migrate:
+        condition: service_started
     environment:
       - DATABASE_URL=postgresql://agpt_user:pass123@postgres:5432/agpt_local
       - REDIS_HOST=redis
@@ -47,17 +72,15 @@ services:
       - AUTH_ENABLED=false
       - PYRO_HOST=0.0.0.0
       - EXECUTIONMANAGER_HOST=executor
-      - EXECUTIONSCHEDULER_HOST=execution_scheduler
     ports:
       - "8000:8000"
+      - "8003:8003" # execution scheduler
     networks:
       - app-network

   executor:
-    build:
-      context: ../
-      dockerfile: rnd/autogpt_server/Dockerfile
-      target: executor
+    image: autogpt_server:latest
     command: ["poetry", "run", "executor"]
     develop:
       watch:
         - path: ./
@@ -68,6 +91,8 @@ services:
         condition: service_started
       postgres:
         condition: service_healthy
+      migrate:
+        condition: service_started
     environment:
       - DATABASE_URL=postgresql://agpt_user:pass123@postgres:5432/agpt_local
       - REDIS_HOST=redis
@@ -77,14 +102,13 @@ services:
       - PYRO_HOST=0.0.0.0
       - AGENTSERVER_HOST=rest_server
     ports:
-      - "8002:8002"
+      - "8002:8000"
     networks:
       - app-network

-  ws_server:
-    build:
-      context: ../
-      dockerfile: rnd/autogpt_server/Dockerfile.ws
+  websocket_server:
+    image: autogpt_server:latest
+    command: ["poetry", "run", "ws"]
     develop:
       watch:
         - path: ./
@@ -93,6 +117,7 @@ services:
     depends_on:
       - postgres
       - redis
+      - migrate
     environment:
       - DATABASE_URL=postgresql://agpt_user:pass123@postgres:5432/agpt_local
       - REDIS_HOST=redis
@@ -101,10 +126,43 @@ services:
       - AUTH_ENABLED=false
       - PYRO_HOST=0.0.0.0
     ports:
-      - "8001:8001"
+      - "8001:8000"
     networks:
       - app-network

+  market:
+    build:
+      context: ../
+      dockerfile: rnd/market/Dockerfile
+    depends_on:
+      - postgres
+      - migrate
+    environment:
+      - DATABASE_URL=postgresql://agpt_user:pass123@postgres:5432/agpt_local
+    ports:
+      - "8015:8015"
+    networks:
+      - app-network
+
+  frontend:
+    build:
+      context: ../
+      dockerfile: rnd/autogpt_builder/Dockerfile
+      target: dev
+    depends_on:
+      - postgres
+      - rest_server
+      - websocket_server
+      - migrate
+    environment:
+      - DATABASE_URL=postgresql://agpt_user:pass123@postgres:5432/agpt_local
+      - NEXT_PUBLIC_AGPT_SERVER_URL=http://localhost:8000/api
+      - NEXT_PUBLIC_AGPT_WS_SERVER_URL=ws://localhost:8001/ws
+      - NEXT_PUBLIC_AGPT_MARKETPLACE_URL=http://localhost:8015/api/v1/market
+    ports:
+      - "3000:3000"
+    networks:
+      - app-network
+
 networks:
   app-network:
     driver: bridge
@@ -13,7 +13,7 @@ serviceAccount:
 service:
   type: ClusterIP
   port: 8000
-  targetPort: 8000
+  targetPort: 8005
   annotations:
     cloud.google.com/neg: '{"ingress": true}'
     beta.cloud.google.com/backend-config: '{"default": "autogpt-market"}'
@@ -39,6 +39,7 @@ spec:
           {{- toYaml .Values.securityContext | nindent 12 }}
         image: "{{ .Values.image.repository }}:{{ .Values.image.tag | default .Chart.AppVersion }}"
         imagePullPolicy: {{ .Values.image.pullPolicy }}
+        command: ["poetry", "run", "rest"]
         ports:
           - name: http
             containerPort: {{ .Values.service.port }}
@@ -39,6 +39,7 @@ spec:
          {{- toYaml .Values.securityContext | nindent 12 }}
         image: "{{ .Values.image.repository }}:{{ .Values.image.tag | default .Chart.AppVersion }}"
         imagePullPolicy: {{ .Values.image.pullPolicy }}
+        command: ["poetry", "run", "ws"]
         ports:
           - name: ws
             containerPort: {{ .Values.service.port }}
@@ -1,7 +1,7 @@
 replicaCount: 1 # not scaling websocket server for now

 image:
-  repository: us-east1-docker.pkg.dev/agpt-dev/agpt-ws-server-dev/agpt-ws-server-dev
+  repository: us-east1-docker.pkg.dev/agpt-dev/agpt-server-dev/agpt-server-dev
   tag: latest
   pullPolicy: Always
@@ -1,14 +1,15 @@
-FROM python:3.11-slim-buster as server_base
+FROM python:3.11-slim-buster AS server_base

 ENV PYTHONDONTWRITEBYTECODE 1
 ENV PYTHONUNBUFFERED 1

 WORKDIR /app

+# postgresql-client is needed to check if the postgres service is ready for running migrations
 # We need to check if the rest of the packages need to be installed
 RUN apt-get update \
-    && apt-get install -y build-essential curl ffmpeg wget libcurl4-gnutls-dev libexpat1-dev gettext libz-dev libssl-dev \
+    && apt-get install -y build-essential curl ffmpeg wget libcurl4-gnutls-dev libexpat1-dev gettext libz-dev libssl-dev postgresql-client \
     && apt-get clean \
     && rm -rf /var/lib/apt/lists/* \
     && wget https://github.com/git/git/archive/v2.28.0.tar.gz -O git.tar.gz \
@@ -25,19 +26,31 @@ ENV POETRY_VERSION=1.8.3 \
     PATH="$POETRY_HOME/bin:$PATH"
 RUN pip3 install poetry

-COPY rnd/market /app/rnd/market
+FROM server_base AS server_dependencies
+
+RUN mkdir -p /app/autogpt
+RUN mkdir -p /app/forge
+RUN mkdir -p /app/rnd/autogpt_libs
+RUN mkdir -p /app/rnd/market
+COPY rnd/autogpt_libs /app/rnd/autogpt_libs
+
+COPY rnd/market/poetry.lock rnd/market/pyproject.toml /app/rnd/market/

 WORKDIR /app/rnd/market

 # Install dependencies
 RUN poetry install --no-interaction --no-ansi

+FROM server_dependencies AS server_prisma
+
+# Need the market/utils/partial_types.py
+COPY rnd/market /app/rnd/market
+
 COPY rnd/market/schema.prisma ./
 RUN poetry run prisma generate

-FROM server_base as server
+FROM server_prisma AS server

-ENV PORT=8000
+ENV PORT=8005
 ENV DATABASE_URL=""

 CMD ["poetry", "run", "app"]
@@ -87,5 +87,11 @@ def health():
         content="<h1>Marketplace API</h1>", status_code=200
     )

+@app.get("/")
+def default():
+    return fastapi.responses.HTMLResponse(
+        content="<h1>Marketplace API</h1>", status_code=200
+    )
+

 prometheus_fastapi_instrumentator.Instrumentator().instrument(app).expose(app)
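With the new root route, `/` now serves the same landing page as the health handler. A quick smoke test, assuming the `market` service from the compose file above is running on its published port 8015:

```python
import urllib.request

# Assumes `docker compose up -d market` left the service listening on 8015.
with urllib.request.urlopen("http://localhost:8015/") as resp:
    assert resp.status == 200
    assert b"Marketplace API" in resp.read()
```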
@@ -58,7 +58,8 @@ def format():


 def app():
-    run("uvicorn", "market.app:app", "--reload", "--port", "8001")
+    port = os.getenv("PORT", "8015")
+    run("uvicorn", "market.app:app", "--reload", "--port", port)


 def setup():