Use Basecut in GitHub Actions to automatically seed test databases, create snapshots on schedule, and validate schema migrations with production-like data.

Overview

This guide covers:
  • Restoring snapshots before running tests
  • Creating snapshots on a schedule
  • Testing schema migrations against production-like data
  • Caching Basecut CLI for faster runs
  • Best practices for GitHub Actions workflows
For other CI/CD platforms, see the CI/CD Integration guide. For local development with Docker, see Docker Compose Integration.

Quick Start: Test with Production-Like Data

Restore a snapshot before running your test suite:
# .github/workflows/test.yml
name: Test
on: [push, pull_request]

jobs:
  test:
    runs-on: ubuntu-latest

    services:
      postgres:
        image: postgres:15
        env:
          POSTGRES_USER: postgres
          POSTGRES_PASSWORD: postgres
          POSTGRES_DB: test_db
        ports:
          - 5432:5432
        options: >-
          --health-cmd pg_isready
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5

    steps:
      - uses: actions/checkout@v4

      - name: Install Basecut CLI
        run: |
          curl -fsSL https://basecut.dev/install.sh | sh
          echo "$HOME/.basecut/bin" >> $GITHUB_PATH

      - name: Restore Test Snapshot
        env:
          BASECUT_API_KEY: ${{ secrets.BASECUT_API_KEY }}
          AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
          AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
        run: |
          basecut snapshot restore test-data:latest \
            --target "postgresql://postgres:postgres@localhost:5432/test_db"

      - name: Run Tests
        run: npm test
Required Secrets (Settings → Secrets and variables → Actions):
  • BASECUT_API_KEY: Your Basecut API key
  • AWS_ACCESS_KEY_ID: AWS credentials (if snapshot stored in S3)
  • AWS_SECRET_ACCESS_KEY: AWS secret key
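
These can be added through the repository UI or scripted with the GitHub CLI. A dry-run sketch (it only prints the commands; drop the `echo` and supply the real values once `gh` is authenticated for this repository):

```shell
# Print a `gh secret set` command for each required secret.
# To actually store a value, run e.g.:
#   gh secret set BASECUT_API_KEY --body "your-api-key"
for name in BASECUT_API_KEY AWS_ACCESS_KEY_ID AWS_SECRET_ACCESS_KEY; do
  echo gh secret set "$name"
done
```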

Create Snapshots on Schedule

Refresh your dev snapshot weekly with the latest production data:
# .github/workflows/snapshot.yml
name: Create Weekly Dev Snapshot
on:
  schedule:
    - cron: '0 2 * * 1' # Every Monday at 2am UTC
  workflow_dispatch: # Manual trigger

jobs:
  create-snapshot:
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v4

      - name: Install Basecut CLI
        run: |
          curl -fsSL https://basecut.dev/install.sh | sh
          echo "$HOME/.basecut/bin" >> $GITHUB_PATH

      - name: Create Snapshot
        env:
          BASECUT_API_KEY: ${{ secrets.BASECUT_API_KEY }}
          BASECUT_DATABASE_URL: ${{ secrets.PROD_DATABASE_URL }}
          AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
          AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
        run: |
          basecut snapshot create \
            --config basecut.yml \
            --name "dev-seed" \
            --source "$BASECUT_DATABASE_URL"

      - name: Notify Team
        if: success()
        uses: slackapi/slack-github-action@v1
        env:
          SLACK_WEBHOOK_URL: ${{ secrets.SLACK_WEBHOOK }}
          SLACK_WEBHOOK_TYPE: INCOMING_WEBHOOK
        with:
          payload: |
            {
              "text": "✅ New dev-seed snapshot created. Run `basecut snapshot restore dev-seed:latest` to update your local DB."
            }

Test Schema Migrations

Validate migrations against production-like data before deploying:
# .github/workflows/migration-test.yml
name: Test Migration
on:
  pull_request:
    paths:
      - 'migrations/**'

jobs:
  test-migration:
    runs-on: ubuntu-latest

    services:
      postgres:
        image: postgres:15
        env:
          POSTGRES_PASSWORD: postgres
        ports:
          - 5432:5432
        options: >-
          --health-cmd pg_isready
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5

    steps:
      - uses: actions/checkout@v4

      - name: Install Basecut CLI
        run: |
          curl -fsSL https://basecut.dev/install.sh | sh
          echo "$HOME/.basecut/bin" >> $GITHUB_PATH

      - name: Restore Pre-Migration Snapshot
        env:
          BASECUT_API_KEY: ${{ secrets.BASECUT_API_KEY }}
        run: |
          # Restore production-like data with OLD schema
          basecut snapshot restore prod-schema:latest \
            --target "postgresql://postgres:postgres@localhost:5432/postgres"

      - name: Run Migration
        run: |
          npm run migrate

      - name: Verify Migration Success
        run: |
          # Run tests against migrated schema
          npm run test:integration

      - name: Comment on PR
        if: success()
        uses: actions/github-script@v7
        with:
          script: |
            github.rest.issues.createComment({
              issue_number: context.issue.number,
              owner: context.repo.owner,
              repo: context.repo.repo,
              body: '✅ Migration tested successfully against production-like snapshot.'
            })
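
If your migration tool supports down migrations, the same job can also exercise the rollback path. A sketch, assuming a `migrate:down` script exists in your package.json (it is not part of this guide's setup):

```yaml
      - name: Verify Rollback
        run: |
          # Roll back, then re-apply, to confirm the migration is reversible
          npm run migrate:down
          npm run migrate
          npm run test:integration
```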

Optimize Performance

Cache Basecut CLI Installation

Speed up workflows by caching the Basecut CLI binary:
- name: Cache Basecut CLI
  uses: actions/cache@v4
  with:
    path: ~/.basecut/bin
    key: basecut-cli-${{ runner.os }}-v1
    restore-keys: |
      basecut-cli-${{ runner.os }}-

- name: Install Basecut CLI
  run: |
    if [ ! -f ~/.basecut/bin/basecut ]; then
      curl -fsSL https://basecut.dev/install.sh | sh
    fi
    echo "$HOME/.basecut/bin" >> $GITHUB_PATH

Parallel Test Jobs

Restore snapshot once, run tests in parallel:
jobs:
  setup:
    runs-on: ubuntu-latest
    services:
      postgres:
        image: postgres:15
        # ... postgres config ...
    steps:
      - uses: actions/checkout@v4

      - name: Install Basecut CLI
        run: |
          curl -fsSL https://basecut.dev/install.sh | sh
          echo "$HOME/.basecut/bin" >> $GITHUB_PATH

      - name: Restore Snapshot
        env:
          BASECUT_API_KEY: ${{ secrets.BASECUT_API_KEY }}
        run: |
          basecut snapshot restore test-data:latest \
            --target "postgresql://postgres:postgres@localhost:5432/test_db"

      - name: Dump Database
        run: |
          pg_dump "postgresql://postgres:postgres@localhost:5432/test_db" > test-db.sql

      - name: Upload DB Snapshot
        uses: actions/upload-artifact@v4
        with:
          name: test-db
          path: test-db.sql
          retention-days: 1

  test:
    needs: setup
    strategy:
      matrix:
        shard: [1, 2, 3, 4]
    runs-on: ubuntu-latest
    services:
      postgres:
        image: postgres:15
        # ... postgres config ...
    steps:
      - uses: actions/checkout@v4

      - name: Download DB Snapshot
        uses: actions/download-artifact@v4
        with:
          name: test-db

      - name: Restore DB
        run: |
          psql "postgresql://postgres:postgres@localhost:5432/test_db" < test-db.sql

      - name: Run Tests
        run: npm test -- --shard=${{ matrix.shard }}/4

Best Practices

1. Use Dedicated Test Snapshots

Create separate snapshots for CI with:
  • Stable sampling: Set sampling.seed alongside sampling.mode: random so every run samples the same rows
  • Small size: Cap rows with limits.rows.per_table to keep restores fast
  • Stable schema: Refresh the snapshot when the schema changes, not on every run
# ci-test-data.yml
version: '1'
name: 'ci-test-data'
from:
  - table: users
limits:
  rows:
    per_table: 500 # Keep CI fast
sampling:
  mode: random
  seed: 12345
anonymize: auto
output: s3://my-team-snapshots/ci-test-data

2. Handle Restore Failures Gracefully

- name: Restore Snapshot with Fallback
  run: |
    if ! basecut snapshot restore test-data:latest \
      --target "postgresql://postgres:postgres@localhost:5432/test_db"; then
      echo "⚠️ Snapshot restore failed, using seed data instead"
      npm run seed:test-data
    fi
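
Restore failures are sometimes transient (network hiccups, storage throttling), so retrying before falling back can save a run. A minimal retry helper (a sketch, not part of the Basecut CLI):

```shell
# retry <attempts> <command...>: run the command up to <attempts> times,
# pausing briefly between failed attempts; returns the last exit status.
retry() {
  local attempts=$1; shift
  local i
  for i in $(seq 1 "$attempts"); do
    if "$@"; then
      return 0
    fi
    if [ "$i" -lt "$attempts" ]; then
      echo "attempt $i/$attempts failed, retrying..." >&2
      sleep 5
    fi
  done
  return 1
}

# Usage:
# retry 3 basecut snapshot restore test-data:latest --target "$DB_URL"
```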

3. Use Matrix Strategy for Multiple Environments

Test against different snapshot versions:
jobs:
  test:
    strategy:
      matrix:
        snapshot: [v1, v2, latest]
    steps:
      - name: Restore Snapshot
        run: |
          basecut snapshot restore test-data:${{ matrix.snapshot }} \
            --target "$BASECUT_DATABASE_URL"

4. Conditional Snapshot Creation

Only create snapshots on main branch:
- name: Create Snapshot
  if: github.ref == 'refs/heads/main'
  run: |
    basecut snapshot create --config basecut.yml --name "dev-seed"

Troubleshooting

Snapshot Restore Too Slow

  • Use a smaller snapshot: Reduce limits.rows.per_table in your config
  • Restore once: Reuse a pg_dump of the restored database across jobs (see Parallel Test Jobs above)
  • Keep storage close: On self-hosted runners, store CI snapshots on the local filesystem rather than S3
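
On hosted runners, the restore can also be skipped on repeat runs by caching the dump, keyed on the snapshot config. A sketch using actions/cache (the file name and key scheme are assumptions; adjust to your setup):

```yaml
- name: Cache restored dump
  id: db-cache
  uses: actions/cache@v4
  with:
    path: test-db.sql
    key: test-db-${{ hashFiles('ci-test-data.yml') }}

- name: Restore Snapshot and Dump
  if: steps.db-cache.outputs.cache-hit != 'true'
  env:
    BASECUT_API_KEY: ${{ secrets.BASECUT_API_KEY }}
  run: |
    basecut snapshot restore test-data:latest \
      --target "postgresql://postgres:postgres@localhost:5432/test_db"
    pg_dump "postgresql://postgres:postgres@localhost:5432/test_db" > test-db.sql

- name: Load Cached Dump
  if: steps.db-cache.outputs.cache-hit == 'true'
  run: |
    psql "postgresql://postgres:postgres@localhost:5432/test_db" < test-db.sql
```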

Authentication Failures

# Verify API key works
basecut whoami

# Check cloud storage credentials
aws s3 ls s3://your-bucket/  # S3
gsutil ls gs://your-bucket/  # GCS

Database Connection Refused

  • Ensure the service container is healthy before restoring; the runner waits for the --health-cmd check to pass before running steps
  • Remember that GitHub Actions has no depends_on; service containers start alongside the job, and health checks are the only ordering mechanism
  • Increase --health-retries or --health-interval if the database is slow to start
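
If restores still race the database after tuning the health checks, an explicit wait in the restore step removes the ambiguity. A minimal sketch (the pg_isready example assumes the Postgres client tools are installed on the runner):

```shell
# wait_for <command...>: retry a readiness check up to 30 times,
# two seconds apart, before giving up.
wait_for() {
  local i
  for i in $(seq 1 30); do
    if "$@" >/dev/null 2>&1; then
      return 0
    fi
    sleep 2
  done
  echo "service never became healthy" >&2
  return 1
}

# Example: block until Postgres accepts connections
# wait_for pg_isready -h localhost -p 5432 -U postgres
```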

Next Steps