Waleed 0ce0f98aa5 v0.5.65: gemini updates, textract integration, ui updates (#2909)
* fix(google): wrap primitive tool responses for Gemini API compatibility (#2900)

* fix(canonical): copilot path + update parent (#2901)

* fix(rss): add top-level title, link, pubDate fields to RSS trigger output (#2902)

* fix(rss): add top-level title, link, pubDate fields to RSS trigger output

* fix(imap): add top-level fields to IMAP trigger output

* improvement(browseruse): add profile id param (#2903)

* improvement(browseruse): add profile id param

* make request a stub since we have directExec

* improvement(executor): upgraded abort controller to handle aborts for loops and parallels (#2880)

* improvement(executor): upgraded abort controller to handle aborts for loops and parallels

* comments

* improvement(files): update execution for passing base64 strings (#2906)

* progress

* improvement(execution): update execution for passing base64 strings

* fix types

* cleanup comments

* path security vuln

* reject promise correctly

* fix redirect case

* remove proxy routes

* fix tests

* use ipaddr

* feat(tools): added textract, added v2 for mistral, updated tag dropdown (#2904)

* feat(tools): added textract

* cleanup

* ack pr comments

* reorder

* removed upload for textract async version

* fix additional fields dropdown in editor, update parser to leave validation to be done on the server

* added mistral v2, files v2, and finalized textract

* updated the rest of the old file patterns, updated mistral outputs for v2

* updated tag dropdown to parse non-operation fields as well

* updated extension finder

* cleanup

* added description for inputs to workflow

* use helper for internal route check

* fix tag dropdown merge conflict change

* remove duplicate code

---------

Co-authored-by: Vikhyath Mondreti <vikhyath@simstudio.ai>

* fix(ui): change add inputs button to match output selector (#2907)

* fix(canvas): removed invite to workspace from canvas popover (#2908)

* fix(canvas): removed invite to workspace

* removed unused props

* fix(copilot): legacy tool display names (#2911)

* fix(a2a): canonical merge  (#2912)

* fix canonical merge

* fix empty array case

* fix(change-detection): copilot diffs have extra field (#2913)

* improvement(logs): improved logs ui bugs, added subflow disable UI (#2910)

* improvement(logs): improved logs ui bugs, added subflow disable UI

* added duplicate to action bar for subflows

* feat(broadcast): email v0.5 (#2905)

---------

Co-authored-by: Vikhyath Mondreti <vikhyathvikku@gmail.com>
Co-authored-by: Vikhyath Mondreti <vikhyath@simstudio.ai>
Co-authored-by: Emir Karabeg <78010029+emir-karabeg@users.noreply.github.com>
2026-01-20 23:54:55 -08:00

Sim Logo

Build and deploy AI agent workflows in minutes.

Sim.ai Discord Twitter Documentation

Ask DeepWiki Set Up with Cursor

Build Workflows with Ease

Design agent workflows visually on a canvas: connect agents, tools, and blocks, then run them instantly.

Workflow Builder Demo

Supercharge with Copilot

Leverage Copilot to generate nodes, fix errors, and iterate on flows directly from natural language.

Copilot Demo

Integrate Vector Databases

Upload documents to a vector store and let agents answer questions grounded in your specific content.

Knowledge Uploads and Retrieval Demo

Quickstart

Cloud-hosted: sim.ai

Get started instantly at https://sim.ai.

Self-hosted: NPM Package

npx simstudio

Then open http://localhost:3000.

Note

Docker must be installed and running on your machine.
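Since npx simstudio drives Docker under the hood, a quick preflight check can save a failed start. A minimal sketch in plain shell (no Sim-specific assumptions):

```shell
# Preflight: is the Docker CLI installed, and is the daemon reachable?
if command -v docker >/dev/null 2>&1 && docker info >/dev/null 2>&1; then
  DOCKER_READY=yes
else
  DOCKER_READY=no
fi
echo "Docker ready: $DOCKER_READY"
```

If this prints "Docker ready: no", start Docker Desktop (or dockerd) before running npx simstudio.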

Options

Flag               Description
-p, --port <port>  Port to run Sim on (default 3000)
--no-pull          Skip pulling latest Docker images

Self-hosted: Docker Compose

git clone https://github.com/simstudioai/sim.git && cd sim
docker compose -f docker-compose.prod.yml up -d

Open http://localhost:3000

Using Local Models with Ollama

Run Sim with local AI models using Ollama - no external APIs required:

# Start with GPU support (automatically downloads gemma3:4b model)
docker compose -f docker-compose.ollama.yml --profile setup up -d

# For CPU-only systems:
docker compose -f docker-compose.ollama.yml --profile cpu --profile setup up -d

Wait for the model to download, then visit http://localhost:3000. Add more models with:

docker compose -f docker-compose.ollama.yml exec ollama ollama pull llama3.1:8b

Using an External Ollama Instance

If Ollama is running on your host machine, use host.docker.internal instead of localhost:

OLLAMA_URL=http://host.docker.internal:11434 docker compose -f docker-compose.prod.yml up -d

On Linux, use your host's IP address or add extra_hosts: ["host.docker.internal:host-gateway"] to the compose file.
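On Linux, that extra_hosts entry can also live in a compose override file instead of an edit to docker-compose.prod.yml. A sketch, assuming the app service is named simstudio (check the actual service name in the compose file first):

```shell
# Write an override adding the host-gateway mapping (service name is an assumption)
cat > docker-compose.override.yml <<'EOF'
services:
  simstudio:
    extra_hosts:
      - "host.docker.internal:host-gateway"
EOF
# Then start with both files:
# docker compose -f docker-compose.prod.yml -f docker-compose.override.yml up -d
```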

Using vLLM

Sim supports vLLM for self-hosted models. Set VLLM_BASE_URL and optionally VLLM_API_KEY in your environment.
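For example, assuming a vLLM server exposing its OpenAI-compatible endpoint locally (URL and key are placeholders, not values from this project):

```shell
# Point Sim at a self-hosted vLLM server (placeholder values)
export VLLM_BASE_URL="http://localhost:8000/v1"
export VLLM_API_KEY="token-abc123"  # optional; only if the server enforces a key
```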

Self-hosted: Dev Containers

  1. Open VS Code with the Remote - Containers extension
  2. Open the project and click "Reopen in Container" when prompted
  3. Run bun run dev:full in the terminal or use the sim-start alias
    • This starts both the main application and the realtime socket server

Self-hosted: Manual Setup

Requirements: Bun, Node.js v20+, PostgreSQL 12+ with pgvector

  1. Clone and install:
git clone https://github.com/simstudioai/sim.git
cd sim
bun install
  2. Set up PostgreSQL with pgvector:
docker run --name simstudio-db -e POSTGRES_PASSWORD=your_password -e POSTGRES_DB=simstudio -p 5432:5432 -d pgvector/pgvector:pg17

Or install manually via the pgvector guide.

  3. Configure environment:
cp apps/sim/.env.example apps/sim/.env
cp packages/db/.env.example packages/db/.env
# Edit both .env files to set DATABASE_URL="postgresql://postgres:your_password@localhost:5432/simstudio"
  4. Run migrations:
cd packages/db && bunx drizzle-kit migrate --config=./drizzle.config.ts
  5. Start development servers:
bun run dev:full  # Starts both Next.js app and realtime socket server

Or run separately: bun run dev (Next.js) and cd apps/sim && bun run dev:sockets (realtime).
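The DATABASE_URL set in both .env files is the standard PostgreSQL connection-string shape; assembling it from parts makes it easy to swap the password or port. The values below mirror the docker run command above:

```shell
# Compose the connection string used in both .env files
DB_USER=postgres
DB_PASS=your_password   # matches POSTGRES_PASSWORD from the docker run above
DB_HOST=localhost
DB_PORT=5432
DB_NAME=simstudio
DATABASE_URL="postgresql://${DB_USER}:${DB_PASS}@${DB_HOST}:${DB_PORT}/${DB_NAME}"
echo "$DATABASE_URL"
```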

Copilot API Keys

Copilot is a Sim-managed service. To use Copilot on a self-hosted instance:

  • Go to https://sim.ai → Settings → Copilot and generate a Copilot API key
  • Set the COPILOT_API_KEY environment variable in your self-hosted apps/sim/.env file to that value
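Concretely, appending the key to the env file might look like this (the key shown is a placeholder; paste the one generated at sim.ai):

```shell
ENV_FILE="apps/sim/.env"
mkdir -p "$(dirname "$ENV_FILE")"   # no-op inside the repo, where apps/sim exists
# Placeholder value; use the key from sim.ai → Settings → Copilot
printf 'COPILOT_API_KEY="%s"\n' "replace-with-your-key" >> "$ENV_FILE"
```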

Environment Variables

Key environment variables for self-hosted deployments. See .env.example for defaults or env.ts for the full list.

Variable             Required  Description
DATABASE_URL         Yes       PostgreSQL connection string with pgvector
BETTER_AUTH_SECRET   Yes       Auth secret (openssl rand -hex 32)
BETTER_AUTH_URL      Yes       Your app URL (e.g., http://localhost:3000)
NEXT_PUBLIC_APP_URL  Yes       Public app URL (same as above)
ENCRYPTION_KEY       Yes       Encrypts environment variables (openssl rand -hex 32)
INTERNAL_API_SECRET  Yes       Secures internal API routes (openssl rand -hex 32)
API_ENCRYPTION_KEY   Yes       Encrypts API keys (openssl rand -hex 32)
COPILOT_API_KEY      No        API key from sim.ai for Copilot features
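The openssl-generated secrets above can be produced in one go; each is 32 random bytes, printed as 64 hex characters:

```shell
# Generate the random secrets required by a self-hosted deployment
BETTER_AUTH_SECRET=$(openssl rand -hex 32)
ENCRYPTION_KEY=$(openssl rand -hex 32)
INTERNAL_API_SECRET=$(openssl rand -hex 32)
API_ENCRYPTION_KEY=$(openssl rand -hex 32)
# 32 bytes hex-encoded is 64 characters
echo "${#BETTER_AUTH_SECRET}"  # prints 64
```

Paste each value into apps/sim/.env; never reuse one secret for multiple variables.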

Troubleshooting

Ollama models not showing in dropdown (Docker)

If you're running Ollama on your host machine and Sim in Docker, change OLLAMA_URL from localhost to host.docker.internal:

OLLAMA_URL=http://host.docker.internal:11434 docker compose -f docker-compose.prod.yml up -d

See Using an External Ollama Instance for details.

Database connection issues

Ensure PostgreSQL has the pgvector extension installed. When using Docker, wait for the database to be healthy before running migrations.

Port conflicts

If ports 3000, 3002, or 5432 are in use, configure alternatives:

# Custom ports
NEXT_PUBLIC_APP_URL=http://localhost:3100 POSTGRES_PORT=5433 docker compose up -d
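A quick way to find which of the defaults is taken, using bash's /dev/tcp redirection (no extra tools needed; a port that refuses the connection reports free):

```shell
# Report whether each default Sim port is already bound on localhost (bash)
port_status() {
  if (exec 3<>"/dev/tcp/127.0.0.1/$1") 2>/dev/null; then
    echo "in use"
  else
    echo "free"
  fi
}
for p in 3000 3002 5432; do
  echo "port $p: $(port_status "$p")"
done
```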

Tech Stack

Contributing

We welcome contributions! Please see our Contributing Guide for details.

License

This project is licensed under the Apache License 2.0 - see the LICENSE file for details.

Made with ❤️ by the Sim Team
