Compare commits

...

12 Commits

Author SHA1 Message Date
Nicholas Tindle
fdb7ff8111 docs(blocks): complete block documentation migration cleanup
Move remaining block docs to block-integrations/ subdirectory:
- Delete old docs from docs/integrations/ root
- Add new docs under docs/integrations/block-integrations/
- Add guides/ directory with LLM and voice provider docs
- Update SUMMARY.md with correct navigation structure

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-22 14:18:10 -06:00
Nicholas Tindle
0e42efb7d5 docs(blocks): migrate block documentation to docs/integrations
Restructure block documentation for GitBook integration:
- Move all block docs from docs/platform/blocks to docs/integrations/block-integrations
- Add generate_block_docs.py script to auto-generate documentation from block schemas
- Generate README.md with overview table of all 463 blocks
- Generate SUMMARY.md for GitBook navigation
- Add guides/ directory for LLM and voice provider documentation
- Fix bug in generate_block_docs.py where summary_path was incorrectly passed instead of summary_content

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-22 14:17:18 -06:00
bobby.gaffin
f2d82d8802 GITBOOK-73: No subject 2026-01-22 18:59:20 +00:00
Zamil Majdy
f9f984a8f4 fix(db): Remove redundant migration and fix pgvector schema handling (#11822)
### Changes 🏗️

This PR includes two database migration fixes:

#### 1. Remove redundant Supabase extensions migration

Removes the `20260112173500_add_supabase_extensions_to_platform_schema`
migration which was attempting to manage Supabase-provided extensions
and schemas.

**What was removed:**
- Migration that created extensions (pgcrypto, uuid-ossp,
pg_stat_statements, pg_net, pgjwt, pg_graphql, pgsodium, supabase_vault)
- Schema creation for these extensions

**Why it was removed:**
- These extensions and schemas are pre-installed and managed by Supabase
automatically
- The migration was redundant and could cause schema drift warnings
- Attempting to manage Supabase-owned resources in our migrations is an
anti-pattern

#### 2. Fix pgvector extension schema handling

Improves the `20260109181714_add_docs_embedding` migration to handle
cases where pgvector exists in the wrong schema.

**Problem:**
- If pgvector was previously installed in `public` schema, `CREATE
EXTENSION IF NOT EXISTS` would succeed but not actually install it in
the `platform` schema
- This causes `type "vector" does not exist` errors because the type isn't in the search_path, as the sketch below illustrates
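
For illustration, a minimal reproduction sketch (assuming pgvector was originally enabled in `public` while the app connects with `?schema=platform`; the table name is hypothetical and exact messages vary by PostgreSQL version):

```sql
-- pgvector is already installed, but in "public" (e.g. enabled via the Supabase dashboard)
CREATE EXTENSION IF NOT EXISTS vector;
-- NOTICE: extension "vector" already exists, skipping   <- a no-op, nothing lands in "platform"

SET search_path TO platform;  -- roughly what DATABASE_URL's ?schema=platform does
CREATE TABLE demo_embeddings (embedding vector(3));  -- hypothetical demo table
-- ERROR: type "vector" does not exist
```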

**Solution:**
- Detect if the vector extension exists in a different schema than the current one (the same catalog lookup as the verification query below)
- Drop it with CASCADE and reinstall in the correct schema (platform)
- Use dynamic SQL with `EXECUTE format()` to explicitly specify the
target schema
- Split exception handling: catch errors during removal, but let
installation fail naturally with clear PostgreSQL errors
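
To verify which schema pgvector actually landed in (the same `pg_extension`/`pg_namespace` lookup the migration uses for detection), a quick check from `psql`:

```sql
-- Lists the vector extension together with the schema that owns it
SELECT e.extname, n.nspname AS schema
FROM pg_extension e
JOIN pg_namespace n ON e.extnamespace = n.oid
WHERE e.extname = 'vector';
-- After a successful migration this should report schema = 'platform'
```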

**Impact:**
- No functional changes - Supabase continues to provide extensions as
before
- pgvector now correctly installs in the platform schema
- Cleaner migration history
- Prevents schema-related errors

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Verified migrations run successfully without the redundant file
  - [x] Confirmed Supabase extensions are still available
  - [x] Tested pgvector migration handles wrong-schema scenario
  - [x] No schema drift warnings

#### For configuration changes:
- [x] .env.default is updated or already compatible with my changes
- [x] docker-compose.yml is updated or already compatible with my
changes
- [x] I have included a list of my configuration changes in the PR
description (under **Changes**)
  - N/A - No configuration changes required
2026-01-22 12:06:00 +00:00
Abhimanyu Yadav
fc87ed4e34 feat(ci): add integration test job and rename e2e test job (#11820)
### Changes 🏗️

- Renamed the `test` job to `e2e_test` in the CI workflow for better
clarity
- Added a new `integration_test` job to the CI workflow that runs unit
tests using `pnpm test:unit`
- Created a basic integration test for the MainMarketplacePage component
to verify CI functionality

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Verified the CI workflow runs both e2e and integration tests
  - [x] Confirmed the integration test for MainMarketplacePage passes

#### For configuration changes:

- [x] `.env.default` is updated or already compatible with my changes
- [x] `docker-compose.yml` is updated or already compatible with my
changes
2026-01-22 11:14:48 +00:00
Nicholas Tindle
446c71fec8 Merge branch 'dev' into gitbook 2026-01-15 12:59:51 -07:00
claude[bot]
ec4c2caa14 Merge remote-tracking branch 'origin/dev' into gitbook 2026-01-12 21:45:54 +00:00
Nicholas Tindle
516e8b4b25 fix: move files to the right places 2026-01-12 13:46:56 -06:00
Nicholas Tindle
e7e118b5a8 wip: fixes 2026-01-09 10:23:31 -07:00
Nicholas Tindle
92a7a7e6d6 wip: fixes 2026-01-09 10:21:06 -07:00
Nicholas Tindle
e16995347f Refactor/gitbook platform structure (#11739)
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-09 11:17:32 -06:00
Nicholas Tindle
234d3acb4c refactor(docs): restructure platform docs for GitBook and remove MkDocs (#11738)
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-09 11:09:17 -06:00
166 changed files with 891 additions and 561 deletions

CI workflow:

@@ -128,7 +128,7 @@ jobs:
           token: ${{ secrets.GITHUB_TOKEN }}
           exitOnceUploaded: true

-  test:
+  e2e_test:
     runs-on: big-boi
     needs: setup
     strategy:

@@ -258,3 +258,39 @@ jobs:
      - name: Print Final Docker Compose logs
        if: always()
        run: docker compose -f ../docker-compose.yml logs
+
+  integration_test:
+    runs-on: ubuntu-latest
+    needs: setup
+    steps:
+      - name: Checkout repository
+        uses: actions/checkout@v4
+        with:
+          submodules: recursive
+      - name: Set up Node.js
+        uses: actions/setup-node@v4
+        with:
+          node-version: "22.18.0"
+      - name: Enable corepack
+        run: corepack enable
+      - name: Restore dependencies cache
+        uses: actions/cache@v4
+        with:
+          path: ~/.pnpm-store
+          key: ${{ needs.setup.outputs.cache-key }}
+          restore-keys: |
+            ${{ runner.os }}-pnpm-${{ hashFiles('autogpt_platform/frontend/pnpm-lock.yaml') }}
+            ${{ runner.os }}-pnpm-
+      - name: Install dependencies
+        run: pnpm install --frozen-lockfile
+      - name: Generate API client
+        run: pnpm generate:api
+      - name: Run Integration Tests
+        run: pnpm test:unit

Migration `20260109181714_add_docs_embedding`:

@@ -1,12 +1,37 @@
 -- CreateExtension
 -- Supabase: pgvector must be enabled via Dashboard → Database → Extensions first
--- Creates extension in current schema (determined by search_path from DATABASE_URL ?schema= param)
+-- Ensures vector extension is in the current schema (from DATABASE_URL ?schema= param)
+-- If it exists in a different schema (e.g., public), we drop and recreate it in the current schema
 -- This ensures vector type is in the same schema as tables, making ::vector work without explicit qualification
 DO $$
+DECLARE
+    current_schema_name text;
+    vector_schema text;
 BEGIN
-    CREATE EXTENSION IF NOT EXISTS "vector";
-EXCEPTION WHEN OTHERS THEN
-    RAISE NOTICE 'vector extension not available or already exists, skipping';
+    -- Get the current schema from search_path
+    SELECT current_schema() INTO current_schema_name;
+
+    -- Check if vector extension exists and which schema it's in
+    SELECT n.nspname INTO vector_schema
+    FROM pg_extension e
+    JOIN pg_namespace n ON e.extnamespace = n.oid
+    WHERE e.extname = 'vector';
+
+    -- Handle removal if in wrong schema
+    IF vector_schema IS NOT NULL AND vector_schema != current_schema_name THEN
+        BEGIN
+            -- Vector exists in a different schema, drop it first
+            RAISE WARNING 'pgvector found in schema "%" but need it in "%". Dropping and reinstalling...',
+                vector_schema, current_schema_name;
+            EXECUTE 'DROP EXTENSION IF EXISTS vector CASCADE';
+        EXCEPTION WHEN OTHERS THEN
+            RAISE EXCEPTION 'Failed to drop pgvector from schema "%": %. You may need to drop it manually.',
+                vector_schema, SQLERRM;
+        END;
+    END IF;
+
+    -- Create extension in current schema (let it fail naturally if not available)
+    EXECUTE format('CREATE EXTENSION IF NOT EXISTS vector SCHEMA %I', current_schema_name);
 END $$;

 -- CreateEnum

Migration `20260112173500_add_supabase_extensions_to_platform_schema` (deleted):

@@ -1,71 +0,0 @@
-- Acknowledge Supabase-managed extensions to prevent drift warnings
-- These extensions are pre-installed by Supabase in specific schemas
-- This migration ensures they exist where available (Supabase) or skips gracefully (CI)

-- Create schemas (safe in both CI and Supabase)
CREATE SCHEMA IF NOT EXISTS "extensions";

-- Extensions that exist in both CI and Supabase
DO $$
BEGIN
    CREATE EXTENSION IF NOT EXISTS "pgcrypto" WITH SCHEMA "extensions";
EXCEPTION WHEN OTHERS THEN
    RAISE NOTICE 'pgcrypto extension not available, skipping';
END $$;

DO $$
BEGIN
    CREATE EXTENSION IF NOT EXISTS "uuid-ossp" WITH SCHEMA "extensions";
EXCEPTION WHEN OTHERS THEN
    RAISE NOTICE 'uuid-ossp extension not available, skipping';
END $$;

-- Supabase-specific extensions (skip gracefully in CI)
DO $$
BEGIN
    CREATE EXTENSION IF NOT EXISTS "pg_stat_statements" WITH SCHEMA "extensions";
EXCEPTION WHEN OTHERS THEN
    RAISE NOTICE 'pg_stat_statements extension not available, skipping';
END $$;

DO $$
BEGIN
    CREATE EXTENSION IF NOT EXISTS "pg_net" WITH SCHEMA "extensions";
EXCEPTION WHEN OTHERS THEN
    RAISE NOTICE 'pg_net extension not available, skipping';
END $$;

DO $$
BEGIN
    CREATE EXTENSION IF NOT EXISTS "pgjwt" WITH SCHEMA "extensions";
EXCEPTION WHEN OTHERS THEN
    RAISE NOTICE 'pgjwt extension not available, skipping';
END $$;

DO $$
BEGIN
    CREATE SCHEMA IF NOT EXISTS "graphql";
    CREATE EXTENSION IF NOT EXISTS "pg_graphql" WITH SCHEMA "graphql";
EXCEPTION WHEN OTHERS THEN
    RAISE NOTICE 'pg_graphql extension not available, skipping';
END $$;

DO $$
BEGIN
    CREATE SCHEMA IF NOT EXISTS "pgsodium";
    CREATE EXTENSION IF NOT EXISTS "pgsodium" WITH SCHEMA "pgsodium";
EXCEPTION WHEN OTHERS THEN
    RAISE NOTICE 'pgsodium extension not available, skipping';
END $$;

DO $$
BEGIN
    CREATE SCHEMA IF NOT EXISTS "vault";
    CREATE EXTENSION IF NOT EXISTS "supabase_vault" WITH SCHEMA "vault";
EXCEPTION WHEN OTHERS THEN
    RAISE NOTICE 'supabase_vault extension not available, skipping';
END $$;

-- Return to platform
CREATE SCHEMA IF NOT EXISTS "platform";

`generate_block_docs.py`:

@@ -34,7 +34,10 @@ logger = logging.getLogger(__name__)

 # Default output directory relative to repo root
 DEFAULT_OUTPUT_DIR = (
-    Path(__file__).parent.parent.parent.parent / "docs" / "integrations"
+    Path(__file__).parent.parent.parent.parent
+    / "docs"
+    / "integrations"
+    / "block-integrations"
 )

@@ -421,6 +424,14 @@ def generate_block_markdown(
         lines.append("<!-- END MANUAL -->")
         lines.append("")

+    # Optional per-block extras (only include if has content)
+    extras = manual_content.get("extras", "")
+    if extras:
+        lines.append("<!-- MANUAL: extras -->")
+        lines.append(extras)
+        lines.append("<!-- END MANUAL -->")
+        lines.append("")
+
     lines.append("---")
     lines.append("")

@@ -456,25 +467,52 @@ def get_block_file_mapping(blocks: list[BlockDoc]) -> dict[str, list[BlockDoc]]:
     return dict(file_mapping)


-def generate_overview_table(blocks: list[BlockDoc]) -> str:
-    """Generate the overview table markdown (blocks.md)."""
+def generate_overview_table(blocks: list[BlockDoc], block_dir_prefix: str = "") -> str:
+    """Generate the overview table markdown (blocks.md).
+
+    Args:
+        blocks: List of block documentation objects
+        block_dir_prefix: Prefix for block file links (e.g., "block-integrations/")
+    """
     lines = []
+
+    # GitBook YAML frontmatter
+    lines.append("---")
+    lines.append("layout:")
+    lines.append("  width: default")
+    lines.append("  title:")
+    lines.append("    visible: true")
+    lines.append("  description:")
+    lines.append("    visible: true")
+    lines.append("  tableOfContents:")
+    lines.append("    visible: false")
+    lines.append("  outline:")
+    lines.append("    visible: true")
+    lines.append("  pagination:")
+    lines.append("    visible: true")
+    lines.append("  metadata:")
+    lines.append("    visible: true")
+    lines.append("---")
+    lines.append("")
+
     lines.append("# AutoGPT Blocks Overview")
     lines.append("")
     lines.append(
         'AutoGPT uses a modular approach with various "blocks" to handle different tasks. These blocks are the building blocks of AutoGPT workflows, allowing users to create complex automations by combining simple, specialized components.'
     )
     lines.append("")
-    lines.append('!!! info "Creating Your Own Blocks"')
-    lines.append("    Want to create your own custom blocks? Check out our guides:")
-    lines.append("    ")
+    lines.append('{% hint style="info" %}')
+    lines.append("**Creating Your Own Blocks**")
+    lines.append("")
+    lines.append("Want to create your own custom blocks? Check out our guides:")
+    lines.append("")
     lines.append(
-        "    - [Build your own Blocks](https://docs.agpt.co/platform/new_blocks/) - Step-by-step tutorial with examples"
+        "* [Build your own Blocks](https://docs.agpt.co/platform/new_blocks/) - Step-by-step tutorial with examples"
     )
     lines.append(
-        "    - [Block SDK Guide](https://docs.agpt.co/platform/block-sdk-guide/) - Advanced SDK patterns with OAuth, webhooks, and provider configuration"
+        "* [Block SDK Guide](https://docs.agpt.co/platform/block-sdk-guide/) - Advanced SDK patterns with OAuth, webhooks, and provider configuration"
     )
+    lines.append("{% endhint %}")
     lines.append("")
     lines.append(
         "Below is a comprehensive list of all available blocks, categorized by their primary function. Click on any block name to view its detailed documentation."
     )

@@ -537,7 +575,8 @@ def generate_overview_table(blocks: list[BlockDoc]) -> str:
                 else "No description"
             )
             short_desc = short_desc.replace("\n", " ").replace("|", "\\|")
-            lines.append(f"| [{block.name}]({file_path}#{anchor}) | {short_desc} |")
+            link_path = f"{block_dir_prefix}{file_path}"
+            lines.append(f"| [{block.name}]({link_path}#{anchor}) | {short_desc} |")
             lines.append("")
             continue

@@ -563,7 +602,49 @@ def generate_overview_table(blocks: list[BlockDoc]) -> str:
             )
             short_desc = short_desc.replace("\n", " ").replace("|", "\\|")
-            lines.append(f"| [{block.name}]({file_path}#{anchor}) | {short_desc} |")
+            link_path = f"{block_dir_prefix}{file_path}"
+            lines.append(f"| [{block.name}]({link_path}#{anchor}) | {short_desc} |")
         lines.append("")

     return "\n".join(lines)


+def generate_summary_md(
+    blocks: list[BlockDoc], root_dir: Path, block_dir_prefix: str = ""
+) -> str:
+    """Generate SUMMARY.md for GitBook navigation.
+
+    Args:
+        blocks: List of block documentation objects
+        root_dir: The root docs directory (e.g., docs/integrations/)
+        block_dir_prefix: Prefix for block file links (e.g., "block-integrations/")
+    """
+    lines = []
+    lines.append("# Table of contents")
+    lines.append("")
+    lines.append("* [AutoGPT Blocks Overview](README.md)")
+    lines.append("")
+
+    # Check for guides/ directory at the root level (docs/integrations/guides/)
+    guides_dir = root_dir / "guides"
+    if guides_dir.exists():
+        lines.append("## Guides")
+        lines.append("")
+        for guide_file in sorted(guides_dir.glob("*.md")):
+            # Use just the file name for title (replace hyphens/underscores with spaces)
+            title = file_path_to_title(guide_file.stem.replace("-", "_") + ".md")
+            lines.append(f"* [{title}](guides/{guide_file.name})")
+        lines.append("")
+
+    lines.append("## Block Integrations")
+    lines.append("")
+    file_mapping = get_block_file_mapping(blocks)
+    for file_path in sorted(file_mapping.keys()):
+        title = file_path_to_title(file_path)
+        link_path = f"{block_dir_prefix}{file_path}"
+        lines.append(f"* [{title}]({link_path})")
+    lines.append("")
+
+    return "\n".join(lines)

@@ -653,6 +734,16 @@ def write_block_docs(
             )
         )

+        # Add file-level additional_content section if present
+        file_additional = extract_manual_content(existing_content).get(
+            "additional_content", ""
+        )
+        if file_additional:
+            content_parts.append("<!-- MANUAL: additional_content -->")
+            content_parts.append(file_additional)
+            content_parts.append("<!-- END MANUAL -->")
+            content_parts.append("")
+
         full_content = file_header + "\n" + "\n".join(content_parts)
         generated_files[str(file_path)] = full_content

@@ -661,14 +752,28 @@ def write_block_docs(
         full_path.write_text(full_content)

-    # Generate overview file
-    overview_content = generate_overview_table(blocks)
-    overview_path = output_dir / "README.md"
+    # Generate overview file at the parent directory (docs/integrations/)
+    # with links prefixed to point into block-integrations/
+    root_dir = output_dir.parent
+    block_dir_name = output_dir.name  # "block-integrations"
+    block_dir_prefix = f"{block_dir_name}/"
+    overview_content = generate_overview_table(blocks, block_dir_prefix)
+    overview_path = root_dir / "README.md"
     generated_files["README.md"] = overview_content
     overview_path.write_text(overview_content)
     if verbose:
-        print("  Writing README.md (overview)")
+        print("  Writing README.md (overview) to parent directory")
+
+    # Generate SUMMARY.md for GitBook navigation at the parent directory
+    summary_content = generate_summary_md(blocks, root_dir, block_dir_prefix)
+    summary_path = root_dir / "SUMMARY.md"
+    generated_files["SUMMARY.md"] = summary_content
+    summary_path.write_text(summary_content)
+    if verbose:
+        print("  Writing SUMMARY.md (navigation) to parent directory")

     return generated_files

@@ -748,6 +853,16 @@ def check_docs_in_sync(output_dir: Path, blocks: list[BlockDoc]) -> bool:
             elif block_match.group(1).strip() != expected_block_content.strip():
                 mismatched_blocks.append(block.name)

+        # Add file-level additional_content to expected content (matches write_block_docs)
+        file_additional = extract_manual_content(existing_content).get(
+            "additional_content", ""
+        )
+        if file_additional:
+            content_parts.append("<!-- MANUAL: additional_content -->")
+            content_parts.append(file_additional)
+            content_parts.append("<!-- END MANUAL -->")
+            content_parts.append("")
+
         expected_content = file_header + "\n" + "\n".join(content_parts)

         if existing_content.strip() != expected_content.strip():

@@ -757,11 +872,15 @@ def check_docs_in_sync(output_dir: Path, blocks: list[BlockDoc]) -> bool:
             out_of_sync_details.append((file_path, mismatched_blocks))
             all_match = False

-    # Check overview
-    overview_path = output_dir / "README.md"
+    # Check overview at the parent directory (docs/integrations/)
+    root_dir = output_dir.parent
+    block_dir_name = output_dir.name  # "block-integrations"
+    block_dir_prefix = f"{block_dir_name}/"
+    overview_path = root_dir / "README.md"
     if overview_path.exists():
         existing_overview = overview_path.read_text()
-        expected_overview = generate_overview_table(blocks)
+        expected_overview = generate_overview_table(blocks, block_dir_prefix)
         if existing_overview.strip() != expected_overview.strip():
             print("OUT OF SYNC: README.md (overview)")
             print("  The blocks overview table needs regeneration")

@@ -772,6 +891,21 @@ def check_docs_in_sync(output_dir: Path, blocks: list[BlockDoc]) -> bool:
             out_of_sync_details.append(("README.md", ["overview table"]))
             all_match = False

+    # Check SUMMARY.md at the parent directory
+    summary_path = root_dir / "SUMMARY.md"
+    if summary_path.exists():
+        existing_summary = summary_path.read_text()
+        expected_summary = generate_summary_md(blocks, root_dir, block_dir_prefix)
+        if existing_summary.strip() != expected_summary.strip():
+            print("OUT OF SYNC: SUMMARY.md (navigation)")
+            print("  The GitBook navigation needs regeneration")
+            out_of_sync_details.append(("SUMMARY.md", ["navigation"]))
+            all_match = False
+    else:
+        print("MISSING: SUMMARY.md (navigation)")
+        out_of_sync_details.append(("SUMMARY.md", ["navigation"]))
+        all_match = False
+
     # Check for unfilled manual sections
     unfilled_patterns = [
         "_Add a description of this category of blocks._",

MainMarketplacePage integration test (new file):

@@ -0,0 +1,15 @@
import { expect, test } from "vitest";
import { render, screen } from "@/tests/integrations/test-utils";
import { MainMarkeplacePage } from "../MainMarketplacePage";
import { server } from "@/mocks/mock-server";
import { getDeleteV2DeleteStoreSubmissionMockHandler422 } from "@/app/api/__generated__/endpoints/store/store.msw";

// Only for CI testing purpose, will remove it in future PR
test("MainMarketplacePage", async () => {
  server.use(getDeleteV2DeleteStoreSubmissionMockHandler422());

  render(<MainMarkeplacePage />);
  expect(
    await screen.findByText("Featured agents", { exact: false }),
  ).toBeDefined();
});

10 binary files (new images) not shown; sizes: 115 KiB, 29 KiB, 6.0 KiB, 81 KiB, 116 KiB, 504 KiB, 43 KiB, 47 KiB, 20 KiB, 68 KiB.

File diff suppressed because it is too large.

`SUMMARY.md` (new file):

@@ -0,0 +1,133 @@
# Table of contents

* [AutoGPT Blocks Overview](README.md)

## Guides

* [LLM Providers](guides/llm-providers.md)
* [Voice Providers](guides/voice-providers.md)

## Block Integrations

* [Airtable Bases](block-integrations/airtable/bases.md)
* [Airtable Records](block-integrations/airtable/records.md)
* [Airtable Schema](block-integrations/airtable/schema.md)
* [Airtable Triggers](block-integrations/airtable/triggers.md)
* [Apollo Organization](block-integrations/apollo/organization.md)
* [Apollo People](block-integrations/apollo/people.md)
* [Apollo Person](block-integrations/apollo/person.md)
* [Ayrshare Post To Bluesky](block-integrations/ayrshare/post_to_bluesky.md)
* [Ayrshare Post To Facebook](block-integrations/ayrshare/post_to_facebook.md)
* [Ayrshare Post To GMB](block-integrations/ayrshare/post_to_gmb.md)
* [Ayrshare Post To Instagram](block-integrations/ayrshare/post_to_instagram.md)
* [Ayrshare Post To LinkedIn](block-integrations/ayrshare/post_to_linkedin.md)
* [Ayrshare Post To Pinterest](block-integrations/ayrshare/post_to_pinterest.md)
* [Ayrshare Post To Reddit](block-integrations/ayrshare/post_to_reddit.md)
* [Ayrshare Post To Snapchat](block-integrations/ayrshare/post_to_snapchat.md)
* [Ayrshare Post To Telegram](block-integrations/ayrshare/post_to_telegram.md)
* [Ayrshare Post To Threads](block-integrations/ayrshare/post_to_threads.md)
* [Ayrshare Post To TikTok](block-integrations/ayrshare/post_to_tiktok.md)
* [Ayrshare Post To X](block-integrations/ayrshare/post_to_x.md)
* [Ayrshare Post To YouTube](block-integrations/ayrshare/post_to_youtube.md)
* [Baas Bots](block-integrations/baas/bots.md)
* [Bannerbear Text Overlay](block-integrations/bannerbear/text_overlay.md)
* [Basic](block-integrations/basic.md)
* [Compass Triggers](block-integrations/compass/triggers.md)
* [Data](block-integrations/data.md)
* [Dataforseo Keyword Suggestions](block-integrations/dataforseo/keyword_suggestions.md)
* [Dataforseo Related Keywords](block-integrations/dataforseo/related_keywords.md)
* [Discord Bot Blocks](block-integrations/discord/bot_blocks.md)
* [Discord OAuth Blocks](block-integrations/discord/oauth_blocks.md)
* [Enrichlayer LinkedIn](block-integrations/enrichlayer/linkedin.md)
* [Exa Answers](block-integrations/exa/answers.md)
* [Exa Code Context](block-integrations/exa/code_context.md)
* [Exa Contents](block-integrations/exa/contents.md)
* [Exa Research](block-integrations/exa/research.md)
* [Exa Search](block-integrations/exa/search.md)
* [Exa Similar](block-integrations/exa/similar.md)
* [Exa Webhook Blocks](block-integrations/exa/webhook_blocks.md)
* [Exa Websets](block-integrations/exa/websets.md)
* [Exa Websets Enrichment](block-integrations/exa/websets_enrichment.md)
* [Exa Websets Import Export](block-integrations/exa/websets_import_export.md)
* [Exa Websets Items](block-integrations/exa/websets_items.md)
* [Exa Websets Monitor](block-integrations/exa/websets_monitor.md)
* [Exa Websets Polling](block-integrations/exa/websets_polling.md)
* [Exa Websets Search](block-integrations/exa/websets_search.md)
* [Fal AI Video Generator](block-integrations/fal/ai_video_generator.md)
* [Firecrawl Crawl](block-integrations/firecrawl/crawl.md)
* [Firecrawl Extract](block-integrations/firecrawl/extract.md)
* [Firecrawl Map](block-integrations/firecrawl/map.md)
* [Firecrawl Scrape](block-integrations/firecrawl/scrape.md)
* [Firecrawl Search](block-integrations/firecrawl/search.md)
* [Generic Webhook Triggers](block-integrations/generic_webhook/triggers.md)
* [GitHub Checks](block-integrations/github/checks.md)
* [GitHub CI](block-integrations/github/ci.md)
* [GitHub Issues](block-integrations/github/issues.md)
* [GitHub Pull Requests](block-integrations/github/pull_requests.md)
* [GitHub Repo](block-integrations/github/repo.md)
* [GitHub Reviews](block-integrations/github/reviews.md)
* [GitHub Statuses](block-integrations/github/statuses.md)
* [GitHub Triggers](block-integrations/github/triggers.md)
* [Google Calendar](block-integrations/google/calendar.md)
* [Google Docs](block-integrations/google/docs.md)
* [Google Gmail](block-integrations/google/gmail.md)
* [Google Sheets](block-integrations/google/sheets.md)
* [HubSpot Company](block-integrations/hubspot/company.md)
* [HubSpot Contact](block-integrations/hubspot/contact.md)
* [HubSpot Engagement](block-integrations/hubspot/engagement.md)
* [Jina Chunking](block-integrations/jina/chunking.md)
* [Jina Embeddings](block-integrations/jina/embeddings.md)
* [Jina Fact Checker](block-integrations/jina/fact_checker.md)
* [Jina Search](block-integrations/jina/search.md)
* [Linear Comment](block-integrations/linear/comment.md)
* [Linear Issues](block-integrations/linear/issues.md)
* [Linear Projects](block-integrations/linear/projects.md)
* [LLM](block-integrations/llm.md)
* [Logic](block-integrations/logic.md)
* [Misc](block-integrations/misc.md)
* [Multimedia](block-integrations/multimedia.md)
* [Notion Create Page](block-integrations/notion/create_page.md)
* [Notion Read Database](block-integrations/notion/read_database.md)
* [Notion Read Page](block-integrations/notion/read_page.md)
* [Notion Read Page Markdown](block-integrations/notion/read_page_markdown.md)
* [Notion Search](block-integrations/notion/search.md)
* [Nvidia Deepfake](block-integrations/nvidia/deepfake.md)
* [Replicate Flux Advanced](block-integrations/replicate/flux_advanced.md)
* [Replicate Replicate Block](block-integrations/replicate/replicate_block.md)
* [Search](block-integrations/search.md)
* [Slant3D Filament](block-integrations/slant3d/filament.md)
* [Slant3D Order](block-integrations/slant3d/order.md)
* [Slant3D Slicing](block-integrations/slant3d/slicing.md)
* [Slant3D Webhook](block-integrations/slant3d/webhook.md)
* [Smartlead Campaign](block-integrations/smartlead/campaign.md)
* [Stagehand Blocks](block-integrations/stagehand/blocks.md)
* [System Library Operations](block-integrations/system/library_operations.md)
* [System Store Operations](block-integrations/system/store_operations.md)
* [Text](block-integrations/text.md)
* [Todoist Comments](block-integrations/todoist/comments.md)
* [Todoist Labels](block-integrations/todoist/labels.md)
* [Todoist Projects](block-integrations/todoist/projects.md)
* [Todoist Sections](block-integrations/todoist/sections.md)
* [Todoist Tasks](block-integrations/todoist/tasks.md)
* [Twitter Blocks](block-integrations/twitter/blocks.md)
* [Twitter Bookmark](block-integrations/twitter/bookmark.md)
* [Twitter Follows](block-integrations/twitter/follows.md)
* [Twitter Hide](block-integrations/twitter/hide.md)
* [Twitter Like](block-integrations/twitter/like.md)
* [Twitter List Follows](block-integrations/twitter/list_follows.md)
* [Twitter List Lookup](block-integrations/twitter/list_lookup.md)
* [Twitter List Members](block-integrations/twitter/list_members.md)
* [Twitter List Tweets Lookup](block-integrations/twitter/list_tweets_lookup.md)
* [Twitter Manage](block-integrations/twitter/manage.md)
* [Twitter Manage Lists](block-integrations/twitter/manage_lists.md)
* [Twitter Mutes](block-integrations/twitter/mutes.md)
* [Twitter Pinned Lists](block-integrations/twitter/pinned_lists.md)
* [Twitter Quote](block-integrations/twitter/quote.md)
* [Twitter Retweet](block-integrations/twitter/retweet.md)
* [Twitter Search Spaces](block-integrations/twitter/search_spaces.md)
* [Twitter Spaces Lookup](block-integrations/twitter/spaces_lookup.md)
* [Twitter Timeline](block-integrations/twitter/timeline.md)
* [Twitter Tweet Lookup](block-integrations/twitter/tweet_lookup.md)
* [Twitter User Lookup](block-integrations/twitter/user_lookup.md)
* [Wolfram LLM API](block-integrations/wolfram/llm_api.md)
* [Zerobounce Validate Emails](block-integrations/zerobounce/validate_emails.md)

Some files were not shown because too many files have changed in this diff.