Replicate hub data into Postgres

This example shows how to quickly start ingesting data from a Farcaster hub into a traditional database like Postgres.
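If you want a sense of what the example does under the hood before running it, here is a minimal sketch of the core idea: subscribe to merge events from a hub with @farcaster/hub-nodejs and write a few fields from each message into Postgres. The hub address, connection string, and the `messages` table and columns are illustrative assumptions; the actual example in this folder manages its own schema and also backfills historical data.

```ts
// Illustrative sketch only: stream new merge events from a hub and persist a
// few fields per message. The hub address, the Postgres connection string, and
// the `messages` table/columns are assumptions, not the schema this example uses.
import { getSSLHubRpcClient, HubEventType } from "@farcaster/hub-nodejs";
import { Pool } from "pg";

const client = getSSLHubRpcClient("nemes.farcaster.xyz:2283"); // assumption: any reachable hub
const pool = new Pool({ connectionString: "postgres://app:password@localhost:5432/hub" }); // assumption

const main = async () => {
  // Subscribe to message-merge events. The result is a neverthrow Result
  // wrapping a gRPC stream that can be consumed with `for await`.
  const subscription = await client.subscribe({ eventTypes: [HubEventType.MERGE_MESSAGE] });
  if (subscription.isErr()) throw subscription.error;

  for await (const event of subscription.value) {
    const message = event.mergeMessageBody?.message;
    if (!message?.data) continue;

    // Store a few basic fields; the real example persists far more detail.
    await pool.query(
      "INSERT INTO messages (fid, type, timestamp) VALUES ($1, $2, $3)",
      [message.data.fid, message.data.type, message.data.timestamp],
    );
  }
};

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```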

Run on StackBlitz

Open in StackBlitz

Run locally

  1. Clone the repo locally
  2. Navigate to this folder with cd packages/hub-nodejs/examples/replicate-data-postgres
  3. Run yarn install to install dependencies
  4. Run docker compose up -d to start a Postgres instance (install Docker first if you don't already have it)
  5. Run yarn start

To wipe your local data, run docker compose down -v from this directory.
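
While yarn start is running, you can confirm that rows are arriving by querying the local Postgres instance directly. The snippet below is a rough sketch using the pg client; the credentials and the `messages` table name are assumptions, so match them to docker-compose.yml and the schema this example creates.

```ts
// Rough verification sketch: count replicated rows in the local Postgres
// started by `docker compose up -d`. The connection string and the `messages`
// table name are assumptions; align them with docker-compose.yml and the schema.
import { Pool } from "pg";

const pool = new Pool({ connectionString: "postgres://app:password@localhost:5432/hub" });

const main = async () => {
  const { rows } = await pool.query("SELECT count(*) AS total FROM messages");
  console.log(`Messages replicated so far: ${rows[0].total}`);
  await pool.end();
};

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```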