Single-script category. The only first-class backup tool is scripts/backup-database.ts. There is no dedicated restore script: restores are done by hand with psql -f <dumpfile>, or with restore-from-archive for the archive-table-specific case.

scripts/backup-database.ts

Wraps pg_dump to produce a timestamped SQL dump under ./backups/.

  • Output filename: ./backups/pre-archive-backup-<ISO-timestamp>.sql (the pre-archive- prefix reflects its primary use case but it is just a name).
  • Uses child_process.spawn('pg_dump', ...) directly — pg_dump must be on $PATH.
  • Reads the standard PG* env vars with sensible localhost defaults (PGHOST=localhost, PGPORT=5433, PGUSER=user, PGPASSWORD=password, PGDATABASE=enrichnodedb).
  • Creates the ./backups/ directory if missing (mkdir -p).
Run it with:

bun run scripts/backup-database.ts

Sample output: Backup file: ./backups/pre-archive-backup-2026-04-06T12-34-56-789Z.sql.
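The filename and environment handling described above can be sketched as pure functions. This is a hypothetical illustration consistent with the sample output, not the actual body of backup-database.ts; the function names backupPath and pgEnv are invented here.

```typescript
// Sketch of the naming/env-default logic described above (hypothetical;
// the real script's internals may differ).
function backupPath(now: Date): string {
  // ISO timestamp with ':' and '.' replaced so it is filesystem-safe,
  // matching the sample output (…2026-04-06T12-34-56-789Z.sql).
  const stamp = now.toISOString().replace(/[:.]/g, "-");
  return `./backups/pre-archive-backup-${stamp}.sql`;
}

function pgEnv(env: Record<string, string | undefined>): Record<string, string> {
  // Standard PG* vars with the localhost defaults listed above.
  return {
    PGHOST: env.PGHOST ?? "localhost",
    PGPORT: env.PGPORT ?? "5433",
    PGUSER: env.PGUSER ?? "user",
    PGPASSWORD: env.PGPASSWORD ?? "password",
    PGDATABASE: env.PGDATABASE ?? "enrichnodedb",
  };
}
```

These env values would then be merged into the environment passed to child_process.spawn('pg_dump', ...).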

When to run it

  • Before every archive run. This is non-negotiable: the archive scripts perform destructive deletes.
  • Before any migration that drops or alters a populated column.
  • Before bulk re-imports that re-key on org_nr.

Restoring from a backup

There is no project-specific restore script. Use stock psql:

psql -h localhost -p 5433 -U user -d enrichnodedb -f ./backups/pre-archive-backup-<timestamp>.sql

For the narrower case of undoing an archive run only (not a full restore), use restore-from-archive.ts instead; it operates on the archive tables, which is much faster than replaying the full dump.
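If you find yourself typing the psql flags repeatedly, the argument list can be derived from the same PG* defaults the backup script uses. This helper is hypothetical (the project has no restore script) and only mirrors the manual invocation shown above:

```typescript
// Hypothetical helper that builds the psql argument list for a restore,
// using the same localhost defaults as the backup script.
function psqlRestoreArgs(
  dumpFile: string,
  env: Record<string, string | undefined>,
): string[] {
  return [
    "-h", env.PGHOST ?? "localhost",
    "-p", env.PGPORT ?? "5433",
    "-U", env.PGUSER ?? "user",
    "-d", env.PGDATABASE ?? "enrichnodedb",
    "-f", dumpFile,
  ];
}
```

The result would be handed to child_process.spawn("psql", args, ...), with PGPASSWORD supplied via the child's environment rather than a flag.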

Backups directory

./backups/ is gitignored (it appears under untracked entries in git status only when populated). Rotate manually — the script never deletes old dumps. A 1.8M-company dump is several GB, so disk usage grows quickly if archive runs are frequent.
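Since rotation is manual, one way to decide which dumps to prune is to sort by filename: the embedded ISO timestamps make lexicographic order chronological. This selection logic is a sketch (there is no rotation helper in the project; dumpsToDelete is an invented name):

```typescript
// Hypothetical rotation helper: pick which dumps to delete, keeping the
// newest `keep`. Lexicographic sort of the filenames is chronological
// because they embed ISO timestamps.
function dumpsToDelete(files: string[], keep: number): string[] {
  const dumps = files
    .filter((f) => f.startsWith("pre-archive-backup-") && f.endsWith(".sql"))
    .sort(); // oldest first
  return keep >= dumps.length ? [] : dumps.slice(0, dumps.length - keep);
}
```

The caller would read ./backups/ with fs.readdirSync and unlink the returned names; actually deleting files is deliberately left out of the sketch.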

See also

Archive Scripts, Schema Migrations, Local Development.