Compare commits


17 Commits

Author SHA1 Message Date
Aarushi
e94654aa11 fix typo 2024-09-20 10:22:50 +01:00
Aarushi
e9de43b0d2 all paths 2024-09-20 10:20:14 +01:00
Aarushi
de8add92e7 psuh 2024-09-20 10:18:47 +01:00
Aarushi
597cbdcc58 updated docker ci file 2024-09-20 10:17:42 +01:00
Aarushi
e3f35d79c7 tweak(.github): Update pr template wording (#8103)
* update pr template wording

* add what and how

* Update .github/PULL_REQUEST_TEMPLATE.md

---------

Co-authored-by: Toran Bruce Richards <toran.richards@gmail.com>
2024-09-19 12:44:50 +00:00
Aarushi
0040495143 tweak(.github): Update PR template (#8100)
* update PR template

* Update .github/PULL_REQUEST_TEMPLATE.md

Co-authored-by: Krzysztof Czerwinski <34861343+kcze@users.noreply.github.com>

* add note

* typo

---------

Co-authored-by: Krzysztof Czerwinski <34861343+kcze@users.noreply.github.com>
2024-09-19 13:00:16 +01:00
Aarushi
d3eac86f9a fix(frontend): Update REST API port (#8096)
update server port to 8006

Co-authored-by: Zamil Majdy <zamil.majdy@agpt.co>
2024-09-19 01:06:04 +02:00
Zamil Majdy
c3cb90ac20 feat(rnd): Add initial block execution credit accounting UI on AutoGPT Builder (#8078) 2024-09-19 04:21:40 +07:00
matanm
9b5bf81d7c Fix typo in Groq setup docs (#8018)
Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
2024-09-18 20:22:57 +00:00
Nicholas Tindle
86db4deef9 feat(server): backend analytics endpoints (#8030) 2024-09-18 18:23:20 +00:00
Aarushi
d8f989daf8 docs(rnd): Update submodules info in readme (#8095)
update submodules info in readme
2024-09-18 18:59:23 +01:00
Aarushi
00f2b134cb tweak(rnd): add env var to docker compose so no messing with .env (#8091)
add env var to docker compose so no messing with .env
2024-09-18 16:39:15 +01:00
Aarushi
a3959712dc tweak(builder): Update .env.example server url with right port (#8090)
update server url with right port
2024-09-18 15:51:44 +01:00
Aarushi
8477b25c5a tweak(builder) Add local supabse credentials (#8089)
add local supabse credentials
2024-09-18 15:45:09 +01:00
Swifty
f133c9c1ef fix(rnd): incorrect docker image for migrate (#8086)
fix incorrect docker image for migrate
2024-09-18 15:21:38 +02:00
Aarushi
dc72ec97bc feat(rnd): Add support for supabase locally (#8077)
* add just auth for now

* add supabase script

* add to docker compose

* update docker compose

* tweak(rnd) Add prefix in logs (#8001)

* add prefix

* fix typos

* fix conflicts

* feat(rnd): Reduce container size remove dep with forge and autogpt (#8040)

* Remove forge and autogpt

* update lock files

* Update build process to reduce image size

* Reduced built image size

* fixed docker compose watch

* Updated logging

* updated env.example

* formatting

* linting issue

* linting not working in github actions..

* trying to get around github action linting issue

* updated version

* sleep for prisma issues

* add exp backoff on connection issues

* updated config based on review comments

* Sorting alphabetical

* updated default config

* updated depends checks

* fixed missing prisma binaries

* remove dead layer

* remove try

* remove dead layer

* updated lock file

* add to docker compose

* update for init

* add local supabase variables to docker compose

* wip supbase connectioon

* subabase submodule

* combined docker file wth new supbase url pointing to kong

* updated combined

* ngix

* updated docker compose without frontend

* updated docker compose

* update to remove frontend

* update docs

* update newline

* remove unescessary change

---------

Co-authored-by: Swifty <craigswift13@gmail.com>
2024-09-18 09:50:39 +01:00
Nicholas Tindle
0c915cb558 feat(server): anthropic updates, csv, sampling, and code blocks (#7803)
Co-authored-by: Bentlybro <tomnoon9@gmail.com>
2024-09-17 21:29:35 -05:00
39 changed files with 1013 additions and 404 deletions

View File

@@ -6,26 +6,18 @@
<!-- Concisely describe all of the changes made in this pull request: -->
### PR Quality Scorecard ✨
### Testing 🔍
> [!NOTE]
Only for the new autogpt platform, currently in rnd/
<!--
Check out our contribution guide:
https://github.com/Significant-Gravitas/AutoGPT/wiki/Contributing
1. Avoid duplicate work, issues, PRs etc.
2. Also consider contributing something other than code; see the [contribution guide]
for options.
3. Clearly explain your changes.
4. Avoid making unnecessary changes, especially if they're purely based on personal
preferences. Doing so is the maintainers' job. ;-)
Please make sure your changes have been tested and are in good working condition.
Here is a list of our critical paths, if you need some inspiration on what and how to test:
-->
- [x] Have you used the PR description template? &ensp; `+2 pts`
- [ ] Is your pull request atomic, focusing on a single change? &ensp; `+5 pts`
- [ ] Have you linked the GitHub issue(s) that this PR addresses? &ensp; `+5 pts`
- [ ] Have you documented your changes clearly and comprehensively? &ensp; `+5 pts`
- [ ] Have you changed or added a feature? &ensp; `-4 pts`
- [ ] Have you added/updated corresponding documentation? &ensp; `+4 pts`
- [ ] Have you added/updated corresponding integration tests? &ensp; `+5 pts`
- [ ] Have you changed the behavior of AutoGPT? &ensp; `-5 pts`
- [ ] Have you also run `agbenchmark` to verify that these changes do not regress performance? &ensp; `+10 pts`
- Create from scratch and execute an agent with at least 3 blocks
- Import an agent from file upload, and confirm it executes correctly
- Upload agent to marketplace
- Import an agent from marketplace and confirm it executes correctly
- Edit an agent from monitor, and confirm it executes correctly

View File

@@ -0,0 +1,41 @@
name: AutoGPT Server Docker Build & Push
on:
push:
branches: [ updated-docker-ci ]
paths:
- '**'
defaults:
run:
shell: bash
working-directory: AutoGPT
env:
PROJECT_ID: agpt-dev
IMAGE_NAME: agpt-server-dev
REGION: us-central1
jobs:
build-and-push:
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v2
- name: Set up Cloud SDK
uses: google-github-actions/setup-gcloud@v0.2.1
with:
project_id: ${{ env.PROJECT_ID }}
service_account_key: ${{ secrets.GCP_SA_KEY }}
export_default_credentials: true
- name: Configure Docker
run: gcloud auth configure-docker ${{ env.REGION }}-docker.pkg.dev
- name: Build Docker image
run: docker build -t ${{ env.REGION }}-docker.pkg.dev/${{ env.PROJECT_ID }}/${{ env.IMAGE_NAME }}:${{ github.sha }} -f rnd/autogpt_server/Dockerfile .
- name: Push Docker image
run: docker push ${{ env.REGION }}-docker.pkg.dev/${{ env.PROJECT_ID }}/${{ env.IMAGE_NAME }}:${{ github.sha }}

3 .gitmodules vendored
View File

@@ -1,3 +1,6 @@
[submodule "forge/tests/vcr_cassettes"]
path = forge/tests/vcr_cassettes
url = https://github.com/Significant-Gravitas/Auto-GPT-test-cassettes
[submodule "rnd/supabase"]
path = rnd/supabase
url = https://github.com/supabase/supabase.git

View File

@@ -185,7 +185,7 @@ If you don't know which to choose, you can safely go with OpenAI*.
1. Get your Groq API key from [Settings > API keys][groq/api-keys]
2. Open `.env`
3. Find the line that says `GROQ_API_KEY=`
4. Insert your Anthropic API Key directly after = without quotes or spaces:
4. Insert your Groq API Key directly after = without quotes or spaces:
```ini
GROQ_API_KEY=gsk_xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
```

View File

@@ -102,21 +102,14 @@ poetry run prisma generate
Without running this command, the necessary Python modules (prisma.models) won't be available, leading to a `ModuleNotFoundError`.
### Running the server without Docker
### Running the server
To run the server, you can run the following commands in the same terminal you ran the `poetry install` command:
```bash
poetry run app
```
### Running the server within Docker
To run the server, you can run the following commands in the same terminal you ran the `poetry install` command:
```bash
docker compose build
docker compose up
cp supabase/docker/.env.example .env
docker compose -f docker-compose.combined.yml build
docker compose -f docker-compose.combined.yml up -d
```
In the other terminal from autogpt_builder, you can run the following command to start the frontend:

View File

@@ -14,21 +14,40 @@ Welcome to the AutoGPT Platform - a powerful system for creating and running AI
To run the AutoGPT Platform, follow these steps:
1. Clone this repository to your local machine.
2. Navigate to the project directory.
2. Navigate to rnd/supabase
3. Run the following command:
```
git submodule update --init --recursive
```
4. Navigate back to rnd (cd ..)
5. Run the following command:
```
cp supabase/docker/.env.example .env
```
6. Run the following command:
```
docker compose up -d
docker compose -f docker-compose.combined.yml up -d
```
This command will start all the necessary services defined in the `docker-compose.yml` file in detached mode.
This command will start all the necessary backend services defined in the `docker-compose.combined.yml` file in detached mode.
7. Navigate to rnd/autogpt_builder.
8. Run the following command:
```
cp .env.example .env.local
```
9. Run the following command:
```
yarn dev
```
### Docker Compose Commands
Here are some useful Docker Compose commands for managing your AutoGPT Platform:
- `docker compose up -d`: Start the services in detached mode.
- `docker compose stop`: Stop the running services without removing them.
- `docker compose -f docker-compose.combined.yml up -d`: Start the services in detached mode.
- `docker compose -f docker-compose.combined.yml stop`: Stop the running services without removing them.
- `docker compose rm`: Remove stopped service containers.
- `docker compose build`: Build or rebuild services.
- `docker compose down`: Stop and remove containers, networks, and volumes.

View File

@@ -1,12 +1,12 @@
NEXT_PUBLIC_AGPT_SERVER_URL=http://localhost:8000/api
NEXT_PUBLIC_AUTH_CALLBACK_URL=http://localhost:8006/auth/callback
NEXT_PUBLIC_AGPT_SERVER_URL=http://localhost:8006/api
NEXT_PUBLIC_AGPT_WS_SERVER_URL=ws://localhost:8001/ws
NEXT_PUBLIC_AGPT_MARKETPLACE_URL=http://localhost:8005/api/v1/market
NEXT_PUBLIC_AGPT_MARKETPLACE_URL=http://localhost:8015/api/v1/market
## Supabase credentials
## YOU ONLY NEED THEM IF YOU WANT TO USE SUPABASE USER AUTHENTICATION
## If you're using self-hosted version then you most likely don't need to set this
# NEXT_PUBLIC_SUPABASE_URL=your-project-url
# NEXT_PUBLIC_SUPABASE_ANON_KEY=your-anon-key
NEXT_PUBLIC_SUPABASE_URL=http://localhost:8000
NEXT_PUBLIC_SUPABASE_ANON_KEY=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyAgCiAgICAicm9sZSI6ICJhbm9uIiwKICAgICJpc3MiOiAic3VwYWJhc2UtZGVtbyIsCiAgICAiaWF0IjogMTY0MTc2OTIwMCwKICAgICJleHAiOiAxNzk5NTM1NjAwCn0.dc_X5iR_VP_qT0zsiyj_I_OZ2T9FtRU2BBNWN8Bu4GE
## OAuth Callback URL
## This should be {domain}/auth/callback

View File

@@ -35,6 +35,7 @@ export async function login(values: z.infer<typeof loginFormSchema>) {
}
export async function signup(values: z.infer<typeof loginFormSchema>) {
"use server";
return await Sentry.withServerActionInstrumentation(
"signup",
{},

View File

@@ -7,7 +7,7 @@ import AgentDetailContent from "@/components/marketplace/AgentDetailContent";
async function getAgentDetails(id: string): Promise<AgentDetailResponse> {
const apiUrl =
process.env.NEXT_PUBLIC_AGPT_MARKETPLACE_URL ||
"http://localhost:8001/api/v1/market";
"http://localhost:8015/api/v1/market";
const api = new MarketplaceAPI(apiUrl);
try {
console.log(`Fetching agent details for id: ${id}`);

View File

@@ -185,7 +185,7 @@ const Pagination: React.FC<{
const Marketplace: React.FC = () => {
const apiUrl =
process.env.NEXT_PUBLIC_AGPT_MARKETPLACE_URL ||
"http://localhost:8001/api/v1/market";
"http://localhost:8015/api/v1/market";
const api = useMemo(() => new MarketplaceAPI(apiUrl), [apiUrl]);
const [searchValue, setSearchValue] = useState("");

View File

@@ -0,0 +1,32 @@
"use client";
import { useState, useEffect } from "react";
import { Button } from "@/components/ui/button";
import { IconRefresh } from "@/components/ui/icons";
import AutoGPTServerAPI from "@/lib/autogpt-server-api";
export default function CreditButton() {
const [credit, setCredit] = useState<number | null>(null);
const api = new AutoGPTServerAPI();
const fetchCredit = async () => {
const response = await api.getUserCredit();
setCredit(response.credits);
};
useEffect(() => {
fetchCredit();
}, [api]);
return (
credit !== null && (
<Button
onClick={fetchCredit}
variant="outline"
className="flex items-center space-x-2 text-muted-foreground"
>
<span>Credits: {credit}</span>
<IconRefresh />
</Button>
)
);
}

View File

@@ -16,6 +16,7 @@ import {
Category,
NodeExecutionResult,
BlockUIType,
BlockCost,
} from "@/lib/autogpt-server-api/types";
import { beautifyString, cn, setNestedProperty } from "@/lib/utils";
import { Button } from "@/components/ui/button";
@@ -45,6 +46,7 @@ export type ConnectionData = Array<{
export type CustomNodeData = {
blockType: string;
blockCosts: BlockCost[];
title: string;
description: string;
categories: Category[];
@@ -521,6 +523,18 @@ export function CustomNode({ data, id, width, height }: NodeProps<CustomNode>) {
);
});
const inputValues = data.hardcodedValues;
const blockCost =
data.blockCosts &&
data.blockCosts.find((cost) =>
Object.entries(cost.cost_filter).every(
// Undefined, null, or empty values are considered equal
([key, value]) =>
value === inputValues[key] || (!value && !inputValues[key]),
),
);
console.debug(`Block cost ${inputValues}|${data.blockCosts}=${blockCost}`);
return (
<div
className={`${data.uiType === BlockUIType.NOTE ? "w-[300px]" : "w-[500px]"} ${blockClasses} ${errorClass} ${statusClass} ${data.uiType === BlockUIType.NOTE ? "bg-yellow-100" : "bg-white"}`}
@@ -562,6 +576,11 @@ export function CustomNode({ data, id, width, height }: NodeProps<CustomNode>) {
)}
</div>
</div>
{blockCost && (
<div className="p-3 text-right font-semibold">
Cost: {blockCost.cost_amount} / {blockCost.cost_type}
</div>
)}
{data.uiType !== BlockUIType.NOTE ? (
<div className="flex items-start justify-between p-3">
<div>
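
The cost lookup above matches a node's hardcoded input values against each entry's `cost_filter`, treating undefined, null, and empty values as equal. A minimal standalone sketch of that predicate, with an illustrative helper name and example values that are not part of this change:

```typescript
import { BlockCost, BlockCostType } from "@/lib/autogpt-server-api/types";

// Hypothetical helper mirroring the cost-filter match used in CustomNode.
function matchesCostFilter(
  cost: BlockCost,
  inputValues: { [key: string]: any },
): boolean {
  // A filter entry matches when the input equals the filter value,
  // or when both are "empty" (undefined, null, "", etc.).
  return Object.entries(cost.cost_filter).every(
    ([key, value]) =>
      value === inputValues[key] || (!value && !inputValues[key]),
  );
}

// Example: a cost entry that only applies when no custom api_key is set.
const defaultCost: BlockCost = {
  cost_amount: 5, // illustrative amount
  cost_type: BlockCostType.RUN,
  cost_filter: { api_key: null },
};
console.log(matchesCostFilter(defaultCost, {})); // true  – no key provided
console.log(matchesCostFilter(defaultCost, { api_key: "sk-..." })); // false – custom key set
```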

View File

@@ -414,6 +414,7 @@ const FlowEditor: React.FC<{
position: viewportCenter, // Set the position to the calculated viewport center
data: {
blockType: nodeType,
blockCosts: nodeSchema.costs,
title: `${nodeType} ${nodeId}`,
description: nodeSchema.description,
categories: nodeSchema.categories,

View File

@@ -9,9 +9,12 @@ import {
IconCircleUser,
IconMenu,
IconPackage2,
IconRefresh,
IconSquareActivity,
IconWorkFlow,
} from "@/components/ui/icons";
import AutoGPTServerAPI from "@/lib/autogpt-server-api";
import CreditButton from "@/components/CreditButton";
export async function NavBar() {
const isAvailable = Boolean(
@@ -96,6 +99,8 @@ export async function NavBar() {
</a>
</div>
<div className="flex flex-1 items-center justify-end gap-4">
{isAvailable && user && <CreditButton />}
{isAvailable && !user && (
<Link
href="/login"

View File

@@ -1,5 +1,4 @@
"use server";
import AutoGPTServerAPI from "@/lib/autogpt-server-api";
import MarketplaceAPI from "@/lib/marketplace-api";
import { revalidatePath } from "next/cache";
import * as Sentry from "@sentry/nextjs";

View File

@@ -81,7 +81,7 @@ function convertGraphToReactFlow(graph: any): { nodes: Node[]; edges: Edge[] } {
async function installGraph(id: string): Promise<void> {
const apiUrl =
process.env.NEXT_PUBLIC_AGPT_MARKETPLACE_URL ||
"http://localhost:8001/api/v1/market";
"http://localhost:8015/api/v1/market";
const api = new MarketplaceAPI(apiUrl);
const serverAPIUrl = process.env.AGPT_SERVER_API_URL;

View File

@@ -1,3 +1,4 @@
import { sendGAEvent } from "@next/third-parties/google";
import Shepherd from "shepherd.js";
import "shepherd.js/dist/css/shepherd.css";
@@ -493,6 +494,15 @@ export const startTutorial = (
localStorage.setItem("shepherd-tour", "completed"); // Optionally mark the tutorial as completed
});
for (const step of tour.steps) {
step.on("show", () => {
"use client";
console.debug("sendTutorialStep");
sendGAEvent("event", "tutorial_step_shown", { value: step.id });
});
}
tour.on("cancel", () => {
setPinBlocksPopover(false);
localStorage.setItem("shepherd-tour", "canceled"); // Optionally mark the tutorial as canceled

View File

@@ -264,6 +264,43 @@ export const IconCircleUser = createIcon((props) => (
</svg>
));
/**
* Refresh icon component.
*
* @component IconRefresh
* @param {IconProps} props - The props object containing additional attributes and event handlers for the icon.
* @returns {JSX.Element} - The refresh icon.
*
* @example
* // Default usage this is the standard usage
* <IconRefresh />
*
* @example
* // With custom color and size these should be used sparingly and only when necessary
* <IconRefresh className="text-primary" size="lg" />
*
* @example
* // With custom size and onClick handler
* <IconRefresh size="sm" onClick={handleOnClick} />
*/
export const IconRefresh = createIcon((props) => (
<svg
xmlns="http://www.w3.org/2000/svg"
viewBox="0 0 24 24"
fill="none"
stroke="currentColor"
strokeWidth="2"
strokeLinecap="round"
strokeLinejoin="round"
{...props}
>
<polyline points="23 4 23 10 17 10" />
<polyline points="1 20 1 14 7 14" />
<path d="M3.51 9a9 9 0 0 1 14.136 -5.36L23 10" />
<path d="M20.49 15a9 9 0 0 1 -14.136 5.36L1 14" />
</svg>
));
/**
* Menu icon component.
*

View File

@@ -145,6 +145,7 @@ export default function useAgentGraph(
data: {
block_id: block.id,
blockType: block.name,
blockCosts: block.costs,
categories: block.categories,
description: block.description,
title: `${block.name} ${node.id}`,

View File

@@ -0,0 +1,321 @@
import { SupabaseClient } from "@supabase/supabase-js";
import {
Block,
Graph,
GraphCreatable,
GraphUpdateable,
GraphMeta,
GraphExecuteResponse,
NodeExecutionResult,
User,
AnalyticsMetrics,
AnalyticsDetails,
} from "./types";
export default class BaseAutoGPTServerAPI {
private baseUrl: string;
private wsUrl: string;
private webSocket: WebSocket | null = null;
private wsConnecting: Promise<void> | null = null;
private wsMessageHandlers: Record<string, Set<(data: any) => void>> = {};
private supabaseClient: SupabaseClient | null = null;
constructor(
baseUrl: string = process.env.NEXT_PUBLIC_AGPT_SERVER_URL ||
"http://localhost:8006/api",
wsUrl: string = process.env.NEXT_PUBLIC_AGPT_WS_SERVER_URL ||
"ws://localhost:8001/ws",
supabaseClient: SupabaseClient | null = null,
) {
this.baseUrl = baseUrl;
this.wsUrl = wsUrl;
this.supabaseClient = supabaseClient;
}
async createUser(): Promise<User> {
return this._request("POST", "/auth/user", {});
}
async getUserCredit(): Promise<{ credits: number }> {
return this._get(`/credits`);
}
async getBlocks(): Promise<Block[]> {
return await this._get("/blocks");
}
async listGraphs(): Promise<GraphMeta[]> {
return this._get("/graphs");
}
async listTemplates(): Promise<GraphMeta[]> {
return this._get("/templates");
}
async getGraph(id: string, version?: number): Promise<Graph> {
const query = version !== undefined ? `?version=${version}` : "";
return this._get(`/graphs/${id}` + query);
}
async getTemplate(id: string, version?: number): Promise<Graph> {
const query = version !== undefined ? `?version=${version}` : "";
return this._get(`/templates/${id}` + query);
}
async getGraphAllVersions(id: string): Promise<Graph[]> {
return this._get(`/graphs/${id}/versions`);
}
async getTemplateAllVersions(id: string): Promise<Graph[]> {
return this._get(`/templates/${id}/versions`);
}
async createGraph(graphCreateBody: GraphCreatable): Promise<Graph>;
async createGraph(
fromTemplateID: string,
templateVersion: number,
): Promise<Graph>;
async createGraph(
graphOrTemplateID: GraphCreatable | string,
templateVersion?: number,
): Promise<Graph> {
let requestBody: GraphCreateRequestBody;
if (typeof graphOrTemplateID == "string") {
if (templateVersion == undefined) {
throw new Error("templateVersion not specified");
}
requestBody = {
template_id: graphOrTemplateID,
template_version: templateVersion,
};
} else {
requestBody = { graph: graphOrTemplateID };
}
return this._request("POST", "/graphs", requestBody);
}
async createTemplate(templateCreateBody: GraphCreatable): Promise<Graph> {
const requestBody: GraphCreateRequestBody = { graph: templateCreateBody };
return this._request("POST", "/templates", requestBody);
}
async updateGraph(id: string, graph: GraphUpdateable): Promise<Graph> {
return await this._request("PUT", `/graphs/${id}`, graph);
}
async updateTemplate(id: string, template: GraphUpdateable): Promise<Graph> {
return await this._request("PUT", `/templates/${id}`, template);
}
async setGraphActiveVersion(id: string, version: number): Promise<Graph> {
return this._request("PUT", `/graphs/${id}/versions/active`, {
active_graph_version: version,
});
}
async executeGraph(
id: string,
inputData: { [key: string]: any } = {},
): Promise<GraphExecuteResponse> {
return this._request("POST", `/graphs/${id}/execute`, inputData);
}
async listGraphRunIDs(
graphID: string,
graphVersion?: number,
): Promise<string[]> {
const query =
graphVersion !== undefined ? `?graph_version=${graphVersion}` : "";
return this._get(`/graphs/${graphID}/executions` + query);
}
async getGraphExecutionInfo(
graphID: string,
runID: string,
): Promise<NodeExecutionResult[]> {
return (await this._get(`/graphs/${graphID}/executions/${runID}`)).map(
parseNodeExecutionResultTimestamps,
);
}
async stopGraphExecution(
graphID: string,
runID: string,
): Promise<NodeExecutionResult[]> {
return (
await this._request("POST", `/graphs/${graphID}/executions/${runID}/stop`)
).map(parseNodeExecutionResultTimestamps);
}
async logMetric(metric: AnalyticsMetrics) {
return this._request("POST", "/analytics/log_raw_metric", metric);
}
async logAnalytic(analytic: AnalyticsDetails) {
return this._request("POST", "/analytics/log_raw_analytics", analytic);
}
private async _get(path: string) {
return this._request("GET", path);
}
private async _request(
method: "GET" | "POST" | "PUT" | "PATCH",
path: string,
payload?: { [key: string]: any },
) {
if (method != "GET") {
console.debug(`${method} ${path} payload:`, payload);
}
const token =
(await this.supabaseClient?.auth.getSession())?.data.session
?.access_token || "";
const response = await fetch(this.baseUrl + path, {
method,
headers:
method != "GET"
? {
"Content-Type": "application/json",
Authorization: token ? `Bearer ${token}` : "",
}
: {
Authorization: token ? `Bearer ${token}` : "",
},
body: JSON.stringify(payload),
});
const response_data = await response.json();
if (!response.ok) {
console.warn(
`${method} ${path} returned non-OK response:`,
response_data.detail,
response,
);
throw new Error(`HTTP error ${response.status}! ${response_data.detail}`);
}
return response_data;
}
async connectWebSocket(): Promise<void> {
this.wsConnecting ??= new Promise(async (resolve, reject) => {
try {
const token =
(await this.supabaseClient?.auth.getSession())?.data.session
?.access_token || "";
const wsUrlWithToken = `${this.wsUrl}?token=${token}`;
this.webSocket = new WebSocket(wsUrlWithToken);
this.webSocket.onopen = () => {
console.debug("WebSocket connection established");
resolve();
};
this.webSocket.onclose = (event) => {
console.debug("WebSocket connection closed", event);
this.webSocket = null;
};
this.webSocket.onerror = (error) => {
console.error("WebSocket error:", error);
reject(error);
};
this.webSocket.onmessage = (event) => {
const message: WebsocketMessage = JSON.parse(event.data);
if (message.method == "execution_event") {
message.data = parseNodeExecutionResultTimestamps(message.data);
}
this.wsMessageHandlers[message.method]?.forEach((handler) =>
handler(message.data),
);
};
} catch (error) {
console.error("Error connecting to WebSocket:", error);
reject(error);
}
});
return this.wsConnecting;
}
disconnectWebSocket() {
if (this.webSocket && this.webSocket.readyState === WebSocket.OPEN) {
this.webSocket.close();
}
}
sendWebSocketMessage<M extends keyof WebsocketMessageTypeMap>(
method: M,
data: WebsocketMessageTypeMap[M],
callCount = 0,
) {
if (this.webSocket && this.webSocket.readyState === WebSocket.OPEN) {
this.webSocket.send(JSON.stringify({ method, data }));
} else {
this.connectWebSocket().then(() => {
callCount == 0
? this.sendWebSocketMessage(method, data, callCount + 1)
: setTimeout(
() => {
this.sendWebSocketMessage(method, data, callCount + 1);
},
2 ** (callCount - 1) * 1000,
);
});
}
}
onWebSocketMessage<M extends keyof WebsocketMessageTypeMap>(
method: M,
handler: (data: WebsocketMessageTypeMap[M]) => void,
): () => void {
this.wsMessageHandlers[method] ??= new Set();
this.wsMessageHandlers[method].add(handler);
// Return detacher
return () => this.wsMessageHandlers[method].delete(handler);
}
subscribeToExecution(graphId: string) {
this.sendWebSocketMessage("subscribe", { graph_id: graphId });
}
}
/* *** UTILITY TYPES *** */
type GraphCreateRequestBody =
| {
template_id: string;
template_version: number;
}
| {
graph: GraphCreatable;
};
type WebsocketMessageTypeMap = {
subscribe: { graph_id: string };
execution_event: NodeExecutionResult;
};
type WebsocketMessage = {
[M in keyof WebsocketMessageTypeMap]: {
method: M;
data: WebsocketMessageTypeMap[M];
};
}[keyof WebsocketMessageTypeMap];
/* *** HELPER FUNCTIONS *** */
function parseNodeExecutionResultTimestamps(result: any): NodeExecutionResult {
return {
...result,
add_time: new Date(result.add_time),
queue_time: result.queue_time ? new Date(result.queue_time) : undefined,
start_time: result.start_time ? new Date(result.start_time) : undefined,
end_time: result.end_time ? new Date(result.end_time) : undefined,
};
}
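
Taken together with the subclasses further down, a rough usage sketch of the refactored client; the default URL follows this change, and the metric values are illustrative:

```typescript
import AutoGPTServerAPI from "@/lib/autogpt-server-api";

// Browser-side subclass; defaults to http://localhost:8006/api per this change.
const api = new AutoGPTServerAPI();

async function showCredits() {
  // New credit endpoint surfaced by the CreditButton component.
  const { credits } = await api.getUserCredit();
  console.log(`Credits: ${credits}`);
}

async function logPageView() {
  // Payload shape follows the new AnalyticsMetrics type.
  await api.logMetric({
    metric_name: "page_view", // illustrative metric
    metric_value: 1,
    data_string: "/build",
  });
}
```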

View File

@@ -1,305 +1,14 @@
import { createClient } from "../supabase/client";
import {
Block,
Graph,
GraphCreatable,
GraphUpdateable,
GraphMeta,
GraphExecuteResponse,
NodeExecutionResult,
User,
} from "./types";
export default class AutoGPTServerAPI {
private baseUrl: string;
private wsUrl: string;
private webSocket: WebSocket | null = null;
private wsConnecting: Promise<void> | null = null;
private wsMessageHandlers: Record<string, Set<(data: any) => void>> = {};
private supabaseClient = createClient();
import BaseAutoGPTServerAPI from "./baseClient";
export default class AutoGPTServerAPI extends BaseAutoGPTServerAPI {
constructor(
baseUrl: string = process.env.NEXT_PUBLIC_AGPT_SERVER_URL ||
"http://localhost:8000/api",
"http://localhost:8006/api",
wsUrl: string = process.env.NEXT_PUBLIC_AGPT_WS_SERVER_URL ||
"ws://localhost:8001/ws",
) {
this.baseUrl = baseUrl;
this.wsUrl = wsUrl;
}
async createUser(): Promise<User> {
return this._request("POST", "/auth/user", {});
}
async getBlocks(): Promise<Block[]> {
return await this._get("/blocks");
}
async listGraphs(): Promise<GraphMeta[]> {
return this._get("/graphs");
}
async listTemplates(): Promise<GraphMeta[]> {
return this._get("/templates");
}
async getGraph(id: string, version?: number): Promise<Graph> {
const query = version !== undefined ? `?version=${version}` : "";
return this._get(`/graphs/${id}` + query);
}
async getTemplate(id: string, version?: number): Promise<Graph> {
const query = version !== undefined ? `?version=${version}` : "";
return this._get(`/templates/${id}` + query);
}
async getGraphAllVersions(id: string): Promise<Graph[]> {
return this._get(`/graphs/${id}/versions`);
}
async getTemplateAllVersions(id: string): Promise<Graph[]> {
return this._get(`/templates/${id}/versions`);
}
async createGraph(graphCreateBody: GraphCreatable): Promise<Graph>;
async createGraph(
fromTemplateID: string,
templateVersion: number,
): Promise<Graph>;
async createGraph(
graphOrTemplateID: GraphCreatable | string,
templateVersion?: number,
): Promise<Graph> {
let requestBody: GraphCreateRequestBody;
if (typeof graphOrTemplateID == "string") {
if (templateVersion == undefined) {
throw new Error("templateVersion not specified");
}
requestBody = {
template_id: graphOrTemplateID,
template_version: templateVersion,
};
} else {
requestBody = { graph: graphOrTemplateID };
}
return this._request("POST", "/graphs", requestBody);
}
async createTemplate(templateCreateBody: GraphCreatable): Promise<Graph> {
const requestBody: GraphCreateRequestBody = { graph: templateCreateBody };
return this._request("POST", "/templates", requestBody);
}
async updateGraph(id: string, graph: GraphUpdateable): Promise<Graph> {
return await this._request("PUT", `/graphs/${id}`, graph);
}
async updateTemplate(id: string, template: GraphUpdateable): Promise<Graph> {
return await this._request("PUT", `/templates/${id}`, template);
}
async setGraphActiveVersion(id: string, version: number): Promise<Graph> {
return this._request("PUT", `/graphs/${id}/versions/active`, {
active_graph_version: version,
});
}
async executeGraph(
id: string,
inputData: { [key: string]: any } = {},
): Promise<GraphExecuteResponse> {
return this._request("POST", `/graphs/${id}/execute`, inputData);
}
async listGraphRunIDs(
graphID: string,
graphVersion?: number,
): Promise<string[]> {
const query =
graphVersion !== undefined ? `?graph_version=${graphVersion}` : "";
return this._get(`/graphs/${graphID}/executions` + query);
}
async getGraphExecutionInfo(
graphID: string,
runID: string,
): Promise<NodeExecutionResult[]> {
return (await this._get(`/graphs/${graphID}/executions/${runID}`)).map(
parseNodeExecutionResultTimestamps,
);
}
async stopGraphExecution(
graphID: string,
runID: string,
): Promise<NodeExecutionResult[]> {
return (
await this._request("POST", `/graphs/${graphID}/executions/${runID}/stop`)
).map(parseNodeExecutionResultTimestamps);
}
private async _get(path: string) {
return this._request("GET", path);
}
private async _request(
method: "GET" | "POST" | "PUT" | "PATCH",
path: string,
payload?: { [key: string]: any },
) {
if (method != "GET") {
console.debug(`${method} ${path} payload:`, payload);
}
const token =
(await this.supabaseClient?.auth.getSession())?.data.session
?.access_token || "";
const response = await fetch(this.baseUrl + path, {
method,
headers:
method != "GET"
? {
"Content-Type": "application/json",
Authorization: token ? `Bearer ${token}` : "",
}
: {
Authorization: token ? `Bearer ${token}` : "",
},
body: JSON.stringify(payload),
});
const response_data = await response.json();
if (!response.ok) {
console.warn(
`${method} ${path} returned non-OK response:`,
response_data.detail,
response,
);
throw new Error(`HTTP error ${response.status}! ${response_data.detail}`);
}
return response_data;
}
async connectWebSocket(): Promise<void> {
this.wsConnecting ??= new Promise(async (resolve, reject) => {
try {
const token =
(await this.supabaseClient?.auth.getSession())?.data.session
?.access_token || "";
const wsUrlWithToken = `${this.wsUrl}?token=${token}`;
this.webSocket = new WebSocket(wsUrlWithToken);
this.webSocket.onopen = () => {
console.debug("WebSocket connection established");
resolve();
};
this.webSocket.onclose = (event) => {
console.debug("WebSocket connection closed", event);
this.webSocket = null;
};
this.webSocket.onerror = (error) => {
console.error("WebSocket error:", error);
reject(error);
};
this.webSocket.onmessage = (event) => {
const message: WebsocketMessage = JSON.parse(event.data);
if (message.method == "execution_event") {
message.data = parseNodeExecutionResultTimestamps(message.data);
}
this.wsMessageHandlers[message.method]?.forEach((handler) =>
handler(message.data),
);
};
} catch (error) {
console.error("Error connecting to WebSocket:", error);
reject(error);
}
});
return this.wsConnecting;
}
disconnectWebSocket() {
if (this.webSocket && this.webSocket.readyState === WebSocket.OPEN) {
this.webSocket.close();
}
}
sendWebSocketMessage<M extends keyof WebsocketMessageTypeMap>(
method: M,
data: WebsocketMessageTypeMap[M],
callCount = 0,
) {
if (this.webSocket && this.webSocket.readyState === WebSocket.OPEN) {
this.webSocket.send(JSON.stringify({ method, data }));
} else {
this.connectWebSocket().then(() => {
callCount == 0
? this.sendWebSocketMessage(method, data, callCount + 1)
: setTimeout(
() => {
this.sendWebSocketMessage(method, data, callCount + 1);
},
2 ** (callCount - 1) * 1000,
);
});
}
}
onWebSocketMessage<M extends keyof WebsocketMessageTypeMap>(
method: M,
handler: (data: WebsocketMessageTypeMap[M]) => void,
): () => void {
this.wsMessageHandlers[method] ??= new Set();
this.wsMessageHandlers[method].add(handler);
// Return detacher
return () => this.wsMessageHandlers[method].delete(handler);
}
subscribeToExecution(graphId: string) {
this.sendWebSocketMessage("subscribe", { graph_id: graphId });
const supabaseClient = createClient();
super(baseUrl, wsUrl, supabaseClient);
}
}
/* *** UTILITY TYPES *** */
type GraphCreateRequestBody =
| {
template_id: string;
template_version: number;
}
| {
graph: GraphCreatable;
};
type WebsocketMessageTypeMap = {
subscribe: { graph_id: string };
execution_event: NodeExecutionResult;
};
type WebsocketMessage = {
[M in keyof WebsocketMessageTypeMap]: {
method: M;
data: WebsocketMessageTypeMap[M];
};
}[keyof WebsocketMessageTypeMap];
/* *** HELPER FUNCTIONS *** */
function parseNodeExecutionResultTimestamps(result: any): NodeExecutionResult {
return {
...result,
add_time: new Date(result.add_time),
queue_time: result.queue_time ? new Date(result.queue_time) : undefined,
start_time: result.start_time ? new Date(result.start_time) : undefined,
end_time: result.end_time ? new Date(result.end_time) : undefined,
};
}

View File

@@ -0,0 +1,14 @@
import { createServerClient } from "../supabase/server";
import BaseAutoGPTServerAPI from "./baseClient";
export default class AutoGPTServerAPIServerSide extends BaseAutoGPTServerAPI {
constructor(
baseUrl: string = process.env.NEXT_PUBLIC_AGPT_SERVER_URL ||
"http://localhost:8006/api",
wsUrl: string = process.env.NEXT_PUBLIC_AGPT_WS_SERVER_URL ||
"ws://localhost:8001/ws",
) {
const supabaseClient = createServerClient();
super(baseUrl, wsUrl, supabaseClient);
}
}

View File

@@ -5,6 +5,18 @@ export type Category = {
description: string;
};
export enum BlockCostType {
RUN = "run",
BYTE = "byte",
SECOND = "second",
}
export type BlockCost = {
cost_amount: number;
cost_type: BlockCostType;
cost_filter: { [key: string]: any };
};
export type Block = {
id: string;
name: string;
@@ -14,6 +26,7 @@ export type Block = {
outputSchema: BlockIORootSchema;
staticOutput: boolean;
uiType: BlockUIType;
costs: BlockCost[];
};
export type BlockIORootSchema = {
@@ -190,3 +203,15 @@ export enum BlockUIType {
OUTPUT = "Output",
NOTE = "Note",
}
export type AnalyticsMetrics = {
metric_name: string;
metric_value: number;
data_string: string;
};
export type AnalyticsDetails = {
type: string;
data: { [key: string]: any };
index: string;
};

View File

@@ -17,7 +17,7 @@ export default class MarketplaceAPI {
constructor(
baseUrl: string = process.env.NEXT_PUBLIC_AGPT_MARKETPLACE_URL ||
"http://localhost:8001/api/v1/market",
"http://localhost:8015/api/v1/market",
) {
this.baseUrl = baseUrl;
}

View File

@@ -1,7 +1,7 @@
DB_USER=agpt_user
DB_PASS=pass123
DB_NAME=agpt_local
DB_PORT=5432
DB_PORT=5433
DATABASE_URL="postgresql://${DB_USER}:${DB_PASS}@localhost:${DB_PORT}/${DB_NAME}"
PRISMA_SCHEMA="postgres/schema.prisma"
@@ -14,6 +14,8 @@ ENABLE_CREDIT=false
APP_ENV="local"
PYRO_HOST=localhost
SENTRY_DSN=
# This is needed when ENABLE_AUTH is true
SUPABASE_JWT_SECRET=
## ===== OPTIONAL API KEYS ===== ##

View File

@@ -1,3 +1,6 @@
{
"python.analysis.typeCheckingMode": "basic",
"python.testing.pytestArgs": ["test"],
"python.testing.unittestEnabled": false,
"python.testing.pytestEnabled": true
}

View File

@@ -0,0 +1,43 @@
import logging
import prisma.types
logger = logging.getLogger(__name__)
async def log_raw_analytics(
user_id: str,
type: str,
data: dict,
data_index: str,
):
details = await prisma.models.AnalyticsDetails.prisma().create(
data={
"userId": user_id,
"type": type,
"data": prisma.Json(data),
"dataIndex": data_index,
}
)
return details
async def log_raw_metric(
user_id: str,
metric_name: str,
metric_value: float,
data_string: str,
):
if metric_value < 0:
raise ValueError("metric_value must be non-negative")
result = await prisma.models.AnalyticsMetrics.prisma().create(
data={
"value": metric_value,
"analyticMetric": metric_name,
"userId": user_id,
"dataString": data_string,
},
)
return result

View File

@@ -15,6 +15,7 @@ from autogpt_server.blocks.llm import (
AIStructuredResponseGeneratorBlock,
AITextGeneratorBlock,
AITextSummarizerBlock,
LlmModel,
)
from autogpt_server.blocks.talking_head import CreateTalkingAvatarVideoBlock
from autogpt_server.data.block import Block, BlockInput
@@ -57,6 +58,12 @@ llm_cost = [
cost_amount=metadata.cost_factor,
)
for model, metadata in MODEL_METADATA.items()
] + [
BlockCost(
# Default cost is running LlmModel.GPT4O.
cost_amount=MODEL_METADATA[LlmModel.GPT4O].cost_factor,
cost_filter={"api_key": None},
),
]
BLOCK_COSTS: dict[Type[Block], list[BlockCost]] = {
@@ -175,7 +182,11 @@ class UserCredit(UserCreditBase):
return 0, {}
for block_cost in block_costs:
if all(input_data.get(k) == b for k, b in block_cost.cost_filter.items()):
if all(
# None, [], {}, "", are considered the same value.
input_data.get(k) == b or (not input_data.get(k) and not b)
for k, b in block_cost.cost_filter.items()
):
if block_cost.cost_type == BlockCostType.RUN:
return block_cost.cost_amount, block_cost.cost_filter

View File

@@ -78,130 +78,161 @@ class AgentServer(AppService):
api_router.dependencies.append(Depends(auth_middleware))
# Import & Attach sub-routers
from .integrations import integrations_api_router
import autogpt_server.server.routers.analytics
import autogpt_server.server.routers.integrations
api_router.include_router(integrations_api_router, prefix="/integrations")
api_router.include_router(
autogpt_server.server.routers.integrations.router,
prefix="/integrations",
tags=["integrations"],
dependencies=[Depends(auth_middleware)],
)
api_router.include_router(
autogpt_server.server.routers.analytics.router,
prefix="/analytics",
tags=["analytics"],
dependencies=[Depends(auth_middleware)],
)
api_router.add_api_route(
path="/auth/user",
endpoint=self.get_or_create_user_route,
methods=["POST"],
tags=["auth"],
)
api_router.add_api_route(
path="/blocks",
endpoint=self.get_graph_blocks,
methods=["GET"],
)
api_router.add_api_route(
path="/blocks/costs",
endpoint=self.get_graph_block_costs,
methods=["GET"],
tags=["blocks"],
)
api_router.add_api_route(
path="/blocks/{block_id}/execute",
endpoint=self.execute_graph_block,
methods=["POST"],
tags=["blocks"],
)
api_router.add_api_route(
path="/graphs",
endpoint=self.get_graphs,
methods=["GET"],
tags=["graphs"],
)
api_router.add_api_route(
path="/templates",
endpoint=self.get_templates,
methods=["GET"],
tags=["templates", "graphs"],
)
api_router.add_api_route(
path="/graphs",
endpoint=self.create_new_graph,
methods=["POST"],
tags=["graphs"],
)
api_router.add_api_route(
path="/templates",
endpoint=self.create_new_template,
methods=["POST"],
tags=["templates", "graphs"],
)
api_router.add_api_route(
path="/graphs/{graph_id}",
endpoint=self.get_graph,
methods=["GET"],
tags=["graphs"],
)
api_router.add_api_route(
path="/templates/{graph_id}",
endpoint=self.get_template,
methods=["GET"],
tags=["templates", "graphs"],
)
api_router.add_api_route(
path="/graphs/{graph_id}",
endpoint=self.update_graph,
methods=["PUT"],
tags=["graphs"],
)
api_router.add_api_route(
path="/templates/{graph_id}",
endpoint=self.update_graph,
methods=["PUT"],
tags=["templates", "graphs"],
)
api_router.add_api_route(
path="/graphs/{graph_id}/versions",
endpoint=self.get_graph_all_versions,
methods=["GET"],
tags=["graphs"],
)
api_router.add_api_route(
path="/templates/{graph_id}/versions",
endpoint=self.get_graph_all_versions,
methods=["GET"],
tags=["templates", "graphs"],
)
api_router.add_api_route(
path="/graphs/{graph_id}/versions/{version}",
endpoint=self.get_graph,
methods=["GET"],
tags=["graphs"],
)
api_router.add_api_route(
path="/graphs/{graph_id}/versions/active",
endpoint=self.set_graph_active_version,
methods=["PUT"],
tags=["graphs"],
)
api_router.add_api_route(
path="/graphs/{graph_id}/input_schema",
endpoint=self.get_graph_input_schema,
methods=["GET"],
tags=["graphs"],
)
api_router.add_api_route(
path="/graphs/{graph_id}/execute",
endpoint=self.execute_graph,
methods=["POST"],
tags=["graphs"],
)
api_router.add_api_route(
path="/graphs/{graph_id}/executions",
endpoint=self.list_graph_runs,
methods=["GET"],
tags=["graphs"],
)
api_router.add_api_route(
path="/graphs/{graph_id}/executions/{graph_exec_id}",
endpoint=self.get_graph_run_node_execution_results,
methods=["GET"],
tags=["graphs"],
)
api_router.add_api_route(
path="/graphs/{graph_id}/executions/{graph_exec_id}/stop",
endpoint=self.stop_graph_run,
methods=["POST"],
tags=["graphs"],
)
api_router.add_api_route(
path="/graphs/{graph_id}/schedules",
endpoint=self.create_schedule,
methods=["POST"],
tags=["graphs"],
)
api_router.add_api_route(
path="/graphs/{graph_id}/schedules",
endpoint=self.get_execution_schedules,
methods=["GET"],
tags=["graphs"],
)
api_router.add_api_route(
path="/graphs/schedules/{schedule_id}",
endpoint=self.update_schedule,
methods=["PUT"],
tags=["graphs"],
)
api_router.add_api_route(
path="/credits",
@@ -213,13 +244,14 @@ class AgentServer(AppService):
path="/settings",
endpoint=self.update_configuration,
methods=["POST"],
tags=["settings"],
)
app.add_exception_handler(500, self.handle_internal_http_error)
app.include_router(api_router)
uvicorn.run(app, host="0.0.0.0", port=8000, log_config=None)
uvicorn.run(app, host="0.0.0.0", port=Config().agent_api_port, log_config=None)
def set_test_dependency_overrides(self, overrides: dict):
self._test_dependency_overrides = overrides
@@ -275,11 +307,9 @@ class AgentServer(AppService):
@classmethod
def get_graph_blocks(cls) -> list[dict[Any, Any]]:
return [v.to_dict() for v in block.get_blocks().values()]
@classmethod
def get_graph_block_costs(cls) -> dict[Any, Any]:
return get_block_costs()
blocks = block.get_blocks()
costs = get_block_costs()
return [{**b.to_dict(), "costs": costs.get(b.id, [])} for b in blocks.values()]
@classmethod
def execute_graph_block(

View File

@@ -0,0 +1,49 @@
"""Analytics API"""
from typing import Annotated
import fastapi
import autogpt_server.data.analytics
from autogpt_server.server.utils import get_user_id
router = fastapi.APIRouter()
@router.post(path="/log_raw_metric")
async def log_raw_metric(
user_id: Annotated[str, fastapi.Depends(get_user_id)],
metric_name: Annotated[str, fastapi.Body(..., embed=True)],
metric_value: Annotated[float, fastapi.Body(..., embed=True)],
data_string: Annotated[str, fastapi.Body(..., embed=True)],
):
result = await autogpt_server.data.analytics.log_raw_metric(
user_id=user_id,
metric_name=metric_name,
metric_value=metric_value,
data_string=data_string,
)
return result.id
@router.post("/log_raw_analytics")
async def log_raw_analytics(
user_id: Annotated[str, fastapi.Depends(get_user_id)],
type: Annotated[str, fastapi.Body(..., embed=True)],
data: Annotated[
dict,
fastapi.Body(..., embed=True, description="The data to log"),
],
data_index: Annotated[
str,
fastapi.Body(
...,
embed=True,
description="Indexable field for any count based analytical measures like page order clicking, tutorial step completion, etc.",
),
],
):
result = await autogpt_server.data.analytics.log_raw_analytics(
user_id, type, data, data_index
)
return result.id
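
For callers outside the TypeScript client, the equivalent raw request looks roughly like the sketch below: the fields become top-level keys of the JSON body because the router declares them with `Body(..., embed=True)`, and both routes sit behind `auth_middleware`, so a bearer token is assumed. Values are illustrative.

```typescript
// Hypothetical direct call to the new metric endpoint.
async function logRawMetric(token: string) {
  const res = await fetch("http://localhost:8006/api/analytics/log_raw_metric", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${token}`,
    },
    body: JSON.stringify({
      metric_name: "page_view",
      metric_value: 1,
      data_string: "/marketplace",
    }),
  });
  return res.json(); // returns the created record's id
}
```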

View File

@@ -15,11 +15,11 @@ from supabase import Client
from autogpt_server.integrations.oauth import HANDLERS_BY_NAME, BaseOAuthHandler
from autogpt_server.util.settings import Settings
from .utils import get_supabase, get_user_id
from ..utils import get_supabase, get_user_id
logger = logging.getLogger(__name__)
settings = Settings()
integrations_api_router = APIRouter()
router = APIRouter()
def get_store(supabase: Client = Depends(get_supabase)):
@@ -30,7 +30,7 @@ class LoginResponse(BaseModel):
login_url: str
@integrations_api_router.get("/{provider}/login")
@router.get("/{provider}/login")
async def login(
provider: Annotated[str, Path(title="The provider to initiate an OAuth flow for")],
user_id: Annotated[str, Depends(get_user_id)],
@@ -59,7 +59,7 @@ class CredentialsMetaResponse(BaseModel):
username: str | None
@integrations_api_router.post("/{provider}/callback")
@router.post("/{provider}/callback")
async def callback(
provider: Annotated[str, Path(title="The target provider for this OAuth exchange")],
code: Annotated[str, Body(title="Authorization code acquired by user login")],
@@ -91,7 +91,7 @@ async def callback(
)
@integrations_api_router.get("/{provider}/credentials")
@router.get("/{provider}/credentials")
async def list_credentials(
provider: Annotated[str, Path(title="The provider to list credentials for")],
user_id: Annotated[str, Depends(get_user_id)],
@@ -110,7 +110,7 @@ async def list_credentials(
]
@integrations_api_router.get("/{provider}/credentials/{cred_id}")
@router.get("/{provider}/credentials/{cred_id}")
async def get_credential(
provider: Annotated[str, Path(title="The provider to retrieve credentials for")],
cred_id: Annotated[str, Path(title="The ID of the credentials to retrieve")],

View File

@@ -11,7 +11,7 @@ from autogpt_server.data.user import DEFAULT_USER_ID
from autogpt_server.server.conn_manager import ConnectionManager
from autogpt_server.server.model import ExecutionSubscription, Methods, WsMessage
from autogpt_server.util.service import AppProcess
from autogpt_server.util.settings import Settings
from autogpt_server.util.settings import Config, Settings
logger = logging.getLogger(__name__)
settings = Settings()
@@ -174,4 +174,4 @@ async def websocket_router(
class WebsocketServer(AppProcess):
def run(self):
uvicorn.run(app, host="0.0.0.0", port=8001)
uvicorn.run(app, host="0.0.0.0", port=Config().websocket_server_port)

View File

@@ -80,6 +80,11 @@ class Config(UpdateTrackingModel["Config"], BaseSettings):
extra="allow",
)
websocket_server_port: int = Field(
default=8001,
description="The port for the websocket server to run on",
)
execution_manager_port: int = Field(
default=8002,
description="The port for execution manager daemon to run on",
@@ -95,6 +100,11 @@ class Config(UpdateTrackingModel["Config"], BaseSettings):
description="The port for agent server daemon to run on",
)
agent_api_port: int = Field(
default=8006,
description="The port for agent server API to run on",
)
@classmethod
def settings_customise_sources(
cls,

View File

@@ -0,0 +1,37 @@
-- CreateTable
CREATE TABLE "AnalyticsDetails" (
"id" TEXT NOT NULL DEFAULT gen_random_uuid(),
"createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"userId" TEXT NOT NULL,
"type" TEXT NOT NULL,
"data" JSONB,
"dataIndex" TEXT,
CONSTRAINT "AnalyticsDetails_pkey" PRIMARY KEY ("id")
);
-- CreateTable
CREATE TABLE "AnalyticsMetrics" (
"id" TEXT NOT NULL DEFAULT gen_random_uuid(),
"createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" TIMESTAMP(3) NOT NULL,
"analyticMetric" TEXT NOT NULL,
"value" DOUBLE PRECISION NOT NULL,
"dataString" TEXT,
"userId" TEXT NOT NULL,
CONSTRAINT "AnalyticsMetrics_pkey" PRIMARY KEY ("id")
);
-- CreateIndex
CREATE INDEX "analyticsDetails" ON "AnalyticsDetails"("userId", "type");
-- CreateIndex
CREATE INDEX "AnalyticsDetails_type_idx" ON "AnalyticsDetails"("type");
-- AddForeignKey
ALTER TABLE "AnalyticsDetails" ADD CONSTRAINT "AnalyticsDetails_userId_fkey" FOREIGN KEY ("userId") REFERENCES "User"("id") ON DELETE RESTRICT ON UPDATE CASCADE;
-- AddForeignKey
ALTER TABLE "AnalyticsMetrics" ADD CONSTRAINT "AnalyticsMetrics_userId_fkey" FOREIGN KEY ("userId") REFERENCES "User"("id") ON DELETE RESTRICT ON UPDATE CASCADE;

View File

@@ -22,6 +22,8 @@ model User {
AgentGraphs AgentGraph[]
AgentGraphExecutions AgentGraphExecution[]
AgentGraphExecutionSchedules AgentGraphExecutionSchedule[]
AnalyticsDetails AnalyticsDetails[]
AnalyticsMetrics AnalyticsMetrics[]
UserBlockCredit UserBlockCredit[]
@@index([id])
@@ -212,6 +214,49 @@ model AgentGraphExecutionSchedule {
@@index([isEnabled])
}
model AnalyticsDetails {
// PK uses gen_random_uuid() to allow the db inserts to happen outside of prisma
// typical uuid() inserts are handled by prisma
id String @id @default(dbgenerated("gen_random_uuid()"))
createdAt DateTime @default(now())
updatedAt DateTime @default(now()) @updatedAt
// Link to User model
userId String
user User @relation(fields: [userId], references: [id])
// Analytics Categorical data used for filtering (indexable w and w/o userId)
type String
// Analytic Specific Data. We should use a union type here, but prisma doesn't support it.
data Json?
// Indexable field for any count based analytical measures like page order clicking, tutorial step completion, etc.
dataIndex String?
@@index([userId, type], name: "analyticsDetails")
@@index([type])
}
model AnalyticsMetrics {
id String @id @default(dbgenerated("gen_random_uuid()"))
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
// Analytics Categorical data used for filtering (indexable w and w/o userId)
analyticMetric String
// Any numeric data that should be counted upon, summed, or otherwise aggregated.
value Float
// Any string data that should be used to identify the metric as distinct.
// ex: '/build' vs '/market'
dataString String?
// Link to User model
userId String
user User @relation(fields: [userId], references: [id])
}
enum UserBlockCreditType {
TOP_UP
USAGE

View File

@@ -0,0 +1,147 @@
version: '3.8'
networks:
app-network:
name: app-network
shared-network:
name: shared-network
volumes:
db-config:
x-agpt-services:
&agpt-services
networks:
- app-network
- shared-network
x-supabase-services:
&supabase-services
networks:
- app-network
- shared-network
services:
# AGPT services
postgres:
<<: *agpt-services
extends:
file: ./docker-compose.yml
service: postgres
migrate:
<<: *agpt-services
extends:
file: ./docker-compose.yml
service: migrate
redis:
<<: *agpt-services
extends:
file: ./docker-compose.yml
service: redis
rest_server:
<<: *agpt-services
extends:
file: ./docker-compose.yml
service: rest_server
executor:
<<: *agpt-services
extends:
file: ./docker-compose.yml
service: executor
websocket_server:
<<: *agpt-services
extends:
file: ./docker-compose.yml
service: websocket_server
market:
<<: *agpt-services
extends:
file: ./docker-compose.yml
service: market
# frontend:
# <<: *agpt-services
# extends:
# file: ./docker-compose.yml
# service: frontend
# Supabase services
studio:
<<: *supabase-services
extends:
file: ./supabase/docker/docker-compose.yml
service: studio
kong:
<<: *supabase-services
extends:
file: ./supabase/docker/docker-compose.yml
service: kong
auth:
<<: *supabase-services
extends:
file: ./supabase/docker/docker-compose.yml
service: auth
environment:
GOTRUE_MAILER_AUTOCONFIRM: true
rest:
<<: *supabase-services
extends:
file: ./supabase/docker/docker-compose.yml
service: rest
realtime:
<<: *supabase-services
extends:
file: ./supabase/docker/docker-compose.yml
service: realtime
storage:
<<: *supabase-services
extends:
file: ./supabase/docker/docker-compose.yml
service: storage
imgproxy:
<<: *supabase-services
extends:
file: ./supabase/docker/docker-compose.yml
service: imgproxy
meta:
<<: *supabase-services
extends:
file: ./supabase/docker/docker-compose.yml
service: meta
functions:
<<: *supabase-services
extends:
file: ./supabase/docker/docker-compose.yml
service: functions
analytics:
<<: *supabase-services
extends:
file: ./supabase/docker/docker-compose.yml
service: analytics
db:
<<: *supabase-services
extends:
file: ./supabase/docker/docker-compose.yml
service: db
vector:
<<: *supabase-services
extends:
file: ./supabase/docker/docker-compose.yml
service: vector

View File

@@ -11,21 +11,21 @@ services:
timeout: 5s
retries: 5
ports:
- "5432:5432"
- "5433:5432"
networks:
- app-network
migrate:
build:
context: ../
dockerfile: rnd/autogpt_server/Dockerfile
target: server
command: ["sh", "-c", "poetry run prisma migrate deploy"]
develop:
watch:
- path: ./
target: rnd/autogpt_server/migrate
action: rebuild
command: ["poetry", "run", "prisma", "migrate", "deploy"]
depends_on:
postgres:
condition: service_healthy
@@ -72,15 +72,18 @@ services:
migrate:
condition: service_completed_successfully
environment:
- SUPABASE_URL=http://kong:8000
- SUPABASE_JWT_SECRET=your-super-secret-jwt-token-with-at-least-32-characters-long
- SUPABASE_ANON_KEY=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyAgCiAgICAicm9sZSI6ICJhbm9uIiwKICAgICJpc3MiOiAic3VwYWJhc2UtZGVtbyIsCiAgICAiaWF0IjogMTY0MTc2OTIwMCwKICAgICJleHAiOiAxNzk5NTM1NjAwCn0.dc_X5iR_VP_qT0zsiyj_I_OZ2T9FtRU2BBNWN8Bu4GE
- DATABASE_URL=postgresql://agpt_user:pass123@postgres:5432/agpt_local?connect_timeout=60
- REDIS_HOST=redis
- REDIS_PORT=6379
- REDIS_PASSWORD=password
- AUTH_ENABLED=false
- ENABLE_AUTH=true
- PYRO_HOST=0.0.0.0
- EXECUTIONMANAGER_HOST=executor
ports:
- "8000:8000"
- "8006:8006"
- "8003:8003" # execution scheduler
networks:
- app-network
@@ -104,11 +107,14 @@ services:
migrate:
condition: service_completed_successfully
environment:
- NEXT_PUBLIC_SUPABASE_URL=http://kong:8000
- SUPABASE_JWT_SECRET=your-super-secret-jwt-token-with-at-least-32-characters-long
- SUPABASE_ANON_KEY=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyAgCiAgICAicm9sZSI6ICJhbm9uIiwKICAgICJpc3MiOiAic3VwYWJhc2UtZGVtbyIsCiAgICAiaWF0IjogMTY0MTc2OTIwMCwKICAgICJleHAiOiAxNzk5NTM1NjAwCn0.dc_X5iR_VP_qT0zsiyj_I_OZ2T9FtRU2BBNWN8Bu4GE
- DATABASE_URL=postgresql://agpt_user:pass123@postgres:5432/agpt_local?connect_timeout=60
- REDIS_HOST=redis
- REDIS_PORT=6379
- REDIS_PASSWORD=password
- AUTH_ENABLED=false
- ENABLE_AUTH=true
- PYRO_HOST=0.0.0.0
- AGENTSERVER_HOST=rest_server
ports:
@@ -135,11 +141,14 @@ services:
migrate:
condition: service_completed_successfully
environment:
- SUPABASE_URL=http://kong:8000
- SUPABASE_JWT_SECRET=your-super-secret-jwt-token-with-at-least-32-characters-long
- SUPABASE_ANON_KEY=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyAgCiAgICAicm9sZSI6ICJhbm9uIiwKICAgICJpc3MiOiAic3VwYWJhc2UtZGVtbyIsCiAgICAiaWF0IjogMTY0MTc2OTIwMCwKICAgICJleHAiOiAxNzk5NTM1NjAwCn0.dc_X5iR_VP_qT0zsiyj_I_OZ2T9FtRU2BBNWN8Bu4GE
- DATABASE_URL=postgresql://agpt_user:pass123@postgres:5432/agpt_local?connect_timeout=60
- REDIS_HOST=redis
- REDIS_PORT=6379
- REDIS_PASSWORD=password
- AUTH_ENABLED=false
- ENABLE_AUTH=true
- PYRO_HOST=0.0.0.0
ports:
- "8001:8001"
@@ -161,40 +170,40 @@ services:
migrate:
condition: service_completed_successfully
environment:
- DATABASE_URL=postgresql://agpt_user:pass123@postgres:5432/agpt_local?connect_timeout=60&schema=market
- SUPABASE_URL=http://kong:8000
- SUPABASE_JWT_SECRET=your-super-secret-jwt-token-with-at-least-32-characters-long
- SUPABASE_ANON_KEY=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyAgCiAgICAicm9sZSI6ICJhbm9uIiwKICAgICJpc3MiOiAic3VwYWJhc2UtZGVtbyIsCiAgICAiaWF0IjogMTY0MTc2OTIwMCwKICAgICJleHAiOiAxNzk5NTM1NjAwCn0.dc_X5iR_VP_qT0zsiyj_I_OZ2T9FtRU2BBNWN8Bu4GE
- DATABASE_URL=postgresql://agpt_user:pass123@postgres:5432/agpt_local?connect_timeout=60
ports:
- "8015:8015"
networks:
- app-network
frontend:
build:
context: ../
dockerfile: rnd/autogpt_builder/Dockerfile
target: dev
develop:
watch:
- path: ./
target: rnd/autogpt_builder/
action: rebuild
depends_on:
postgres:
condition: service_healthy
rest_server:
condition: service_started
websocket_server:
condition: service_started
migrate:
condition: service_completed_successfully
environment:
- DATABASE_URL=postgresql://agpt_user:pass123@postgres:5432/agpt_local?connect_timeout=60
- NEXT_PUBLIC_AGPT_SERVER_URL=http://localhost:8000/api
- NEXT_PUBLIC_AGPT_WS_SERVER_URL=ws://localhost:8001/ws
- NEXT_PUBLIC_AGPT_MARKETPLACE_URL=http://localhost:8015/api/v1/market
ports:
- "3000:3000"
networks:
- app-network
# frontend:
# build:
# context: ../
# dockerfile: rnd/autogpt_builder/Dockerfile
# target: dev
# depends_on:
# postgres:
# condition: service_healthy
# rest_server:
# condition: service_started
# websocket_server:
# condition: service_started
# migrate:
# condition: service_completed_successfully
# environment:
# - NEXT_PUBLIC_SUPABASE_URL=http://kong:8000
# - NEXT_PUBLIC_SUPABASE_ANON_KEY=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyAgCiAgICAicm9sZSI6ICJhbm9uIiwKICAgICJpc3MiOiAic3VwYWJhc2UtZGVtbyIsCiAgICAiaWF0IjogMTY0MTc2OTIwMCwKICAgICJleHAiOiAxNzk5NTM1NjAwCn0.dc_X5iR_VP_qT0zsiyj_I_OZ2T9FtRU2BBNWN8Bu4GE
# - DATABASE_URL=postgresql://agpt_user:pass123@postgres:5432/agpt_local?connect_timeout=60
# - NEXT_PUBLIC_AGPT_SERVER_URL=http://localhost:8006/api
# - NEXT_PUBLIC_AGPT_WS_SERVER_URL=ws://localhost:8001/ws
# - NEXT_PUBLIC_AGPT_MARKETPLACE_URL=http://localhost:8015/api/v1/market
# ports:
# - "3000:3000"
# networks:
# - app-network
networks:
app-network:

View File

@@ -45,7 +45,7 @@ def populate_database():
keywords=["test"],
)
response = requests.post(
"http://localhost:8001/api/v1/market/admin/agent", json=req.model_dump()
"http://localhost:8015/api/v1/market/admin/agent", json=req.model_dump()
)
print(response.text)

1 rnd/supabase Submodule

Submodule rnd/supabase added at 5e4e7d521b