Waleed 1d08796853 v0.5.12: memory optimizations, sentry, incidentio, posthog, zendesk, pylon, intercom, mailchimp, loading optimizations (#2132)
* fix(memory-util): fixed unbounded array of gmail/outlook pollers causing high memory util, added missing db indexes/removed unused ones, auto-disable schedules/webhooks after 10 consecutive failures (#2115)

* fix(memory-util): fixed unbounded array of gmail/outlook pollers causing high memory util, added missing db indexes/removed unused ones, auto-disable schedules/webhooks after 10 consecutive failures

* ack PR comments

* ack

* improvement(teams-plan): seats increase simplification + not triggering checkout session (#2117)

* improvement(teams-plan): seats increase simplification + not triggering checkout session

* cleanup via helper

* feat(tools): added sentry, incidentio, and posthog tools (#2116)

* feat(tools): added sentry, incidentio, and posthog tools

* update docs

* fixed docs to use native fumadocs for llms.txt and copy markdown, fixed tool issues

* cleanup

* enhance error extractor, fixed posthog tools

* docs enhancements, cleanup

* added more incident io ops, remove zustand/shallow in favor of zustand/react/shallow

* fix type errors

* remove unnecessary comments

* added vllm to docs

* feat(i18n): update translations (#2120)

* feat(i18n): update translations

* fix build

---------

Co-authored-by: waleedlatif1 <waleedlatif1@users.noreply.github.com>

* improvement(workflow-execution): perf improvements to passing workflow state + decrypted env vars (#2119)

* improvement(execution): load workflow state once instead of 2-3 times

* decrypt only in get helper

* remove comments

* remove comments

* feat(models): host google gemini models (#2122)

* feat(models): host google gemini models

* remove unused primary key

* feat(i18n): update translations (#2123)

Co-authored-by: waleedlatif1 <waleedlatif1@users.noreply.github.com>

* feat(tools): added zendesk, pylon, intercom, & mailchimp (#2126)

* feat(tools): added zendesk, pylon, intercom, & mailchimp

* finish zendesk and pylon

* updated docs

* feat(i18n): update translations (#2129)

* feat(i18n): update translations

* fixed build

---------

Co-authored-by: waleedlatif1 <waleedlatif1@users.noreply.github.com>

* fix(permissions): add client-side permissions validation to prevent unauthorized actions, upgraded custom tool modal (#2130)

* fix(permissions): add client-side permissions validation to prevent unauthorized actions, upgraded custom tool modal

* fix failing test

* fix test

* cleanup

* fix(custom-tools): add composite index on custom tool names & workspace id (#2131)

---------

Co-authored-by: Vikhyath Mondreti <vikhyathvikku@gmail.com>
Co-authored-by: waleedlatif1 <waleedlatif1@users.noreply.github.com>
2025-11-28 16:08:06 -08:00

Sim Logo

Build and deploy AI agent workflows in minutes.

Sim.ai · Discord · Twitter · Documentation

Build Workflows with Ease

Design agent workflows visually on a canvas—connect agents, tools, and blocks, then run them instantly.

Workflow Builder Demo

Supercharge with Copilot

Leverage Copilot to generate nodes, fix errors, and iterate on flows directly from natural language.

Copilot Demo

Integrate Vector Databases

Upload documents to a vector store and let agents answer questions grounded in your specific content.

Knowledge Uploads and Retrieval Demo
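The grounding idea behind knowledge retrieval can be sketched as: documents and the query are embedded as vectors, and the agent answers from the documents ranked closest to the query. This is a minimal illustrative sketch, not Sim's actual retrieval implementation; the `Doc` type and `topK` helper are hypothetical.

```typescript
// Hypothetical sketch of vector retrieval; Sim's pgvector-backed
// implementation works differently under the hood.

type Doc = { id: string; embedding: number[] };

// Cosine similarity between two equal-length vectors.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Rank stored documents by similarity to the query embedding.
function topK(query: number[], docs: Doc[], k: number): Doc[] {
  return [...docs]
    .sort(
      (x, y) =>
        cosineSimilarity(query, y.embedding) -
        cosineSimilarity(query, x.embedding)
    )
    .slice(0, k);
}
```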

Quickstart

Cloud-hosted: sim.ai


Self-hosted: NPM Package

npx simstudio

Once the container is up, open http://localhost:3000 in your browser.

Note

Docker must be installed and running on your machine.

Options

Flag                 Description
-p, --port <port>    Port to run Sim on (default: 3000)
--no-pull            Skip pulling the latest Docker images
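A launcher that accepts these flags could parse them roughly as follows. This is a hypothetical sketch of how the documented flags map to options; the real `simstudio` CLI's internals may differ.

```typescript
// Hypothetical flag parser mirroring the documented CLI options.
interface Options {
  port: number;  // -p, --port (documented default: 3000)
  pull: boolean; // --no-pull disables pulling latest images
}

function parseArgs(argv: string[]): Options {
  const opts: Options = { port: 3000, pull: true };
  for (let i = 0; i < argv.length; i++) {
    const arg = argv[i];
    if (arg === "-p" || arg === "--port") {
      opts.port = Number(argv[++i]); // consume the flag's value
    } else if (arg === "--no-pull") {
      opts.pull = false;
    }
  }
  return opts;
}
```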

Self-hosted: Docker Compose

# Clone the repository
git clone https://github.com/simstudioai/sim.git

# Navigate to the project directory
cd sim

# Start Sim
docker compose -f docker-compose.prod.yml up -d

Access the application at http://localhost:3000/

Using Local Models with Ollama

Run Sim with local AI models using Ollama - no external APIs required:

# Start with GPU support (automatically downloads gemma3:4b model)
docker compose -f docker-compose.ollama.yml --profile setup up -d

# For CPU-only systems:
docker compose -f docker-compose.ollama.yml --profile cpu --profile setup up -d

Wait for the model to download, then visit http://localhost:3000. Add more models with:

docker compose -f docker-compose.ollama.yml exec ollama ollama pull llama3.1:8b

Self-hosted: Dev Containers

  1. Open VS Code with the Remote - Containers extension
  2. Open the project and click "Reopen in Container" when prompted
  3. Run bun run dev:full in the terminal or use the sim-start alias
    • This starts both the main application and the realtime socket server

Self-hosted: Manual Setup

Requirements:

  • Bun
  • PostgreSQL with the pgvector extension

Note: Sim uses vector embeddings for AI features like knowledge bases and semantic search, which require the pgvector PostgreSQL extension.

  1. Clone and install dependencies:
git clone https://github.com/simstudioai/sim.git
cd sim
bun install
  2. Set up PostgreSQL with pgvector:

You need PostgreSQL with the vector extension for embedding support. Choose one option:

Option A: Using Docker (Recommended)

# Start PostgreSQL with pgvector extension
docker run --name simstudio-db \
  -e POSTGRES_PASSWORD=your_password \
  -e POSTGRES_DB=simstudio \
  -p 5432:5432 -d \
  pgvector/pgvector:pg17

Option B: Manual Installation

Install PostgreSQL and the pgvector extension directly on your system.

  3. Set up environment:
cd apps/sim
cp .env.example .env  # Configure with required variables (DATABASE_URL, BETTER_AUTH_SECRET, BETTER_AUTH_URL)

Update your .env file with the database URL:

DATABASE_URL="postgresql://postgres:your_password@localhost:5432/simstudio"
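The connection string packs several settings into one value. As a sketch of what it encodes, Node's built-in WHATWG `URL` parser can split it apart; the credentials and database name below are the placeholder values from the example above, not real secrets, and `parseDatabaseUrl` is a hypothetical helper.

```typescript
// Sketch: decompose a DATABASE_URL into its parts using Node's
// built-in URL parser (no extra dependencies needed).
function parseDatabaseUrl(url: string) {
  const u = new URL(url);
  return {
    user: u.username,
    password: u.password,
    host: u.hostname,
    port: Number(u.port || 5432), // PostgreSQL's default port
    database: u.pathname.replace(/^\//, ""), // strip leading slash
  };
}
```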
  4. Set up the database:

First, configure the database package environment:

cd packages/db
cp .env.example .env 

Update your packages/db/.env file with the database URL:

DATABASE_URL="postgresql://postgres:your_password@localhost:5432/simstudio"

Then run the migrations:

bunx drizzle-kit migrate --config=./drizzle.config.ts
  5. Start the development servers:

Recommended approach - run both servers together (from project root):

bun run dev:full

This starts both the main Next.js application and the realtime socket server required for full functionality.

Alternative - run servers separately:

Next.js app (from project root):

bun run dev

Realtime socket server (from apps/sim directory in a separate terminal):

cd apps/sim
bun run dev:sockets

Copilot API Keys

Copilot is a Sim-managed service. To use Copilot on a self-hosted instance:

  • Go to https://sim.ai → Settings → Copilot and generate a Copilot API key
  • Set the COPILOT_API_KEY environment variable in your self-hosted apps/sim/.env file to that value
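Because Copilot silently degrades without the key, it can help to fail fast at startup. A minimal sketch of that pattern, for your own self-hosted wrappers (the `requireEnv` helper is hypothetical; Sim's own startup checks may differ):

```typescript
// Sketch: fail fast when a required environment variable is missing,
// instead of discovering it later through broken Copilot requests.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}
```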

Tech Stack

  • Next.js (main application)
  • Bun (runtime and package manager)
  • PostgreSQL with pgvector, migrations via Drizzle
  • Realtime socket server

Contributing

We welcome contributions! Please see our Contributing Guide for details.

License

This project is licensed under the Apache License 2.0 - see the LICENSE file for details.

Made with ❤️ by the Sim Team
