Integrate Basecut into your CI/CD workflow to automatically seed test databases, refresh dev snapshots, or validate schema migrations with production-like data.

GitHub Actions

Test with Production-Like Data

Restore a snapshot before running your test suite:
# .github/workflows/test.yml
name: Test
on: [push, pull_request]

jobs:
  test:
    runs-on: ubuntu-latest

    services:
      postgres:
        image: postgres:15
        env:
          POSTGRES_USER: postgres
          POSTGRES_PASSWORD: postgres
          POSTGRES_DB: test_db
        ports:
          - 5432:5432
        options: >-
          --health-cmd pg_isready
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5

    steps:
      - uses: actions/checkout@v4

      - name: Install Basecut CLI
        run: |
          curl -fsSL https://basecut.dev/install.sh | sh
          echo "$HOME/.basecut/bin" >> $GITHUB_PATH

      - name: Restore Test Snapshot
        env:
          BASECUT_API_KEY: ${{ secrets.BASECUT_API_KEY }}
          AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
          AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
        run: |
          basecut snapshot restore test-data:latest \
            --target "postgresql://postgres:postgres@localhost:5432/test_db"

      - name: Run Tests
        run: npm test
Secrets to configure (Settings → Secrets and variables → Actions):
  • BASECUT_API_KEY: Your Basecut API key
  • AWS_ACCESS_KEY_ID: AWS credentials (if snapshot stored in S3)
  • AWS_SECRET_ACCESS_KEY: AWS secret key

Create Snapshot on Schedule

Refresh your dev snapshot weekly with latest production data:
# .github/workflows/snapshot.yml
name: Create Weekly Dev Snapshot
on:
  schedule:
    - cron: '0 2 * * 1' # Every Monday at 2am UTC
  workflow_dispatch: # Manual trigger

jobs:
  create-snapshot:
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v4

      - name: Install Basecut CLI
        run: |
          curl -fsSL https://basecut.dev/install.sh | sh
          echo "$HOME/.basecut/bin" >> $GITHUB_PATH

      - name: Create Snapshot
        env:
          BASECUT_API_KEY: ${{ secrets.BASECUT_API_KEY }}
          BASECUT_DATABASE_URL: ${{ secrets.PROD_DATABASE_URL }}
          AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
          AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
        run: |
          basecut snapshot create \
            --config basecut.yml \
            --name "dev-seed"

      - name: Notify Team
        if: success()
        uses: slackapi/slack-github-action@v1
        env:
          SLACK_WEBHOOK_URL: ${{ secrets.SLACK_WEBHOOK }}
        with:
          payload: |
            {
              "text": "✅ New dev-seed snapshot created. Run `basecut snapshot restore dev-seed:latest` to update your local DB."
            }
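The workflow above overwrites `dev-seed` every week. If you would rather keep a history of snapshots, a dated name can be generated in the Create Snapshot step, as the GitLab example below does. A minimal sketch (the naming scheme is a suggestion, not a Basecut convention):

```shell
# Build a dated snapshot name, e.g. dev-seed-20240610.
dated_name() {
  echo "dev-seed-$(date +%Y%m%d)"
}

# Would be passed to: basecut snapshot create --name "$(dated_name)"
dated_name
```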

Test Schema Migrations

Validate migrations against production-like data before deploying:
# .github/workflows/migration-test.yml
name: Test Migration
on:
  pull_request:
    paths:
      - 'migrations/**'

jobs:
  test-migration:
    runs-on: ubuntu-latest

    services:
      postgres:
        image: postgres:15
        env:
          POSTGRES_PASSWORD: postgres
        ports:
          - 5432:5432
        options: >-
          --health-cmd pg_isready
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5

    steps:
      - uses: actions/checkout@v4

      - name: Install Basecut CLI
        run: |
          curl -fsSL https://basecut.dev/install.sh | sh
          echo "$HOME/.basecut/bin" >> $GITHUB_PATH

      - name: Restore Pre-Migration Snapshot
        env:
          BASECUT_API_KEY: ${{ secrets.BASECUT_API_KEY }}
        run: |
          # Restore production-like data with OLD schema
          basecut snapshot restore prod-schema:latest \
            --target "postgresql://postgres:postgres@localhost:5432/postgres"

      - name: Run Migration
        run: |
          npm run migrate

      - name: Verify Migration Success
        run: |
          # Run tests against migrated schema
          npm run test:integration

      - name: Comment on PR
        if: success()
        uses: actions/github-script@v7
        with:
          script: |
            github.rest.issues.createComment({
              issue_number: context.issue.number,
              owner: context.repo.owner,
              repo: context.repo.repo,
              body: '✅ Migration tested successfully against production-like snapshot.'
            })
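Integration tests can pass even when a migration silently drops data, so a quick row-count assertion between the migration and the test suite is a cheap extra guard. A sketch of an additional step (the `users` table and the zero-row threshold are placeholders, not part of the workflow above):

```yaml
      - name: Sanity-Check Row Counts
        run: |
          # Fail if the migrated table lost all of its rows
          count=$(psql "postgresql://postgres:postgres@localhost:5432/postgres" \
            -tAc "SELECT count(*) FROM users;")
          test "$count" -gt 0
```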

GitLab CI

Test Pipeline with Snapshot Restore

# .gitlab-ci.yml
stages:
  - test

test:
  stage: test
  image: node:20
  services:
    - postgres:15
  variables:
    POSTGRES_DB: test_db
    POSTGRES_USER: postgres
    POSTGRES_PASSWORD: postgres
    BASECUT_DATABASE_URL: 'postgresql://postgres:postgres@postgres:5432/test_db'

  before_script:
    # Install Basecut CLI
    - curl -fsSL https://basecut.dev/install.sh | sh
    - export PATH="$HOME/.basecut/bin:$PATH"

    # Restore test snapshot
    - |
      basecut snapshot restore test-data:latest \
        --target "$BASECUT_DATABASE_URL"

  script:
    - npm ci
    - npm test

  only:
    - merge_requests
    - main
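GitLab has deprecated `only`/`except` in favor of `rules`. An equivalent form for the job above:

```yaml
  rules:
    - if: '$CI_PIPELINE_SOURCE == "merge_request_event"'
    - if: '$CI_COMMIT_BRANCH == "main"'
```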
CI/CD Variables (Settings → CI/CD → Variables):
  • BASECUT_API_KEY: Your Basecut API key (masked)
  • AWS_ACCESS_KEY_ID: AWS credentials (masked)
  • AWS_SECRET_ACCESS_KEY: AWS secret key (masked, protected)

Scheduled Snapshot Creation

# .gitlab-ci.yml
stages:
  - snapshot

create-snapshot:
  stage: snapshot
  image: node:20
  only:
    - schedules # Run only on pipeline schedules

  before_script:
    - curl -fsSL https://basecut.dev/install.sh | sh
    - export PATH="$HOME/.basecut/bin:$PATH"

  script:
    - |
      basecut snapshot create \
        --config basecut.yml \
        --name "dev-seed-$(date +%Y%m%d)" \
        --source "$PROD_DATABASE_URL"

  after_script:
    # Notify team on success
    - |
      if [ "$CI_JOB_STATUS" = "success" ]; then
        curl -X POST "$SLACK_WEBHOOK" \
          -H 'Content-Type: application/json' \
          -d '{"text":"✅ New dev snapshot created"}'
      fi
Setup: Go to CI/CD → Schedules → Create new schedule (e.g., weekly on Mondays).

CircleCI

Test Job with Snapshot

# .circleci/config.yml
version: 2.1

executors:
  node-postgres:
    docker:
      - image: cimg/node:20.0
      - image: cimg/postgres:15.0
        environment:
          POSTGRES_USER: postgres
          POSTGRES_PASSWORD: postgres
          POSTGRES_DB: test_db

jobs:
  test:
    executor: node-postgres
    steps:
      - checkout

      - run:
          name: Install Basecut CLI
          command: |
            curl -fsSL https://basecut.dev/install.sh | sh
            echo 'export PATH="$HOME/.basecut/bin:$PATH"' >> $BASH_ENV

      - run:
          name: Wait for PostgreSQL
          command: |
            dockerize -wait tcp://localhost:5432 -timeout 1m

      - run:
          name: Restore Test Snapshot
          # BASECUT_API_KEY is read from Project Settings → Environment Variables;
          # CircleCI injects it into every step automatically, and `environment:`
          # blocks do not expand `$VAR` references.
          command: |
            basecut snapshot restore test-data:latest \
              --target "postgresql://postgres:postgres@localhost:5432/test_db"

      - run:
          name: Run Tests
          command: npm test

workflows:
  test:
    jobs:
      - test
Environment Variables (Project Settings → Environment Variables):
  • BASECUT_API_KEY
  • AWS_ACCESS_KEY_ID
  • AWS_SECRET_ACCESS_KEY

Jenkins

Declarative Pipeline

// Jenkinsfile
pipeline {
  agent any

  environment {
    BASECUT_API_KEY = credentials('basecut-api-key')
    BASECUT_DATABASE_URL = credentials('test-database-url')
  }

  stages {
    stage('Setup Database') {
      steps {
        sh '''
          curl -fsSL https://basecut.dev/install.sh | sh
          export PATH="$HOME/.basecut/bin:$PATH"

          basecut snapshot restore test-data:latest \
            --target "$BASECUT_DATABASE_URL"
        '''
      }
    }

    stage('Test') {
      steps {
        sh 'npm test'
      }
    }
  }

  post {
    always {
      junit 'test-results/*.xml'
    }
  }
}
Credentials: Add basecut-api-key in Jenkins → Manage Credentials.

Docker Compose for CI

For local CI testing or custom CI systems:
# docker-compose.ci.yml
version: '3.8'

services:
  postgres:
    image: postgres:15
    environment:
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: postgres
      POSTGRES_DB: test_db
    ports:
      - '5432:5432'
    healthcheck:
      test: ['CMD-SHELL', 'pg_isready -U postgres']
      interval: 5s
      timeout: 5s
      retries: 5

  test:
    image: node:20
    depends_on:
      postgres:
        condition: service_healthy
    environment:
      BASECUT_DATABASE_URL: postgresql://postgres:postgres@postgres:5432/test_db
      BASECUT_API_KEY: ${BASECUT_API_KEY}
    volumes:
      - .:/app
    working_dir: /app
    # exec form avoids fragile nested quoting; `$$` escapes `$` so Compose
    # does not interpolate host environment variables into the command
    command:
      - sh
      - -c
      - |
        curl -fsSL https://basecut.dev/install.sh | sh &&
        export PATH="$$HOME/.basecut/bin:$$PATH" &&
        basecut snapshot restore test-data:latest &&
        npm ci &&
        npm test
Run locally:
export BASECUT_API_KEY=your_key
# --exit-code-from makes the command fail when tests fail (it implies --abort-on-container-exit)
docker compose -f docker-compose.ci.yml up --exit-code-from test

Best Practices

1. Use Dedicated Test Snapshots

Create separate snapshots for CI with:
  • Stable sampling: Set sampling.seed so sampling.mode: random selects the same rows on every run
  • Small size: Faster restore (use limits.rows.per_table)
  • Stable schema: Update when schema changes, not on every run
# ci-test-data.yml
version: '1'
name: 'ci-test-data'
from:
  - table: users
limits:
  rows:
    per_table: 500 # Keep CI fast
sampling:
  mode: random
  seed: 12345
anonymize: auto
output: s3://my-team-snapshots/ci-test-data

2. Cache Basecut CLI Installation

GitHub Actions:
- name: Cache Basecut CLI
  uses: actions/cache@v4
  with:
    path: ~/.basecut/bin
    key: cli-${{ runner.os }}

- name: Install Basecut CLI
  run: |
    if [ ! -f ~/.basecut/bin/basecut ]; then
      curl -fsSL https://basecut.dev/install.sh | sh
    fi
    echo "$HOME/.basecut/bin" >> $GITHUB_PATH
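A cache keyed only on the OS will keep serving an old binary after a CLI upgrade. If you pin the CLI to a version, include it in the cache key; a sketch assuming a repository variable BASECUT_VERSION (not defined above):

```yaml
- name: Cache Basecut CLI
  uses: actions/cache@v4
  with:
    path: ~/.basecut/bin
    key: cli-${{ runner.os }}-${{ vars.BASECUT_VERSION }}
```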
GitLab CI:
cache:
  paths:
    - .basecut/

before_script:
  - |
    if [ ! -f .basecut/basecut ]; then
      curl -fsSL https://basecut.dev/install.sh | sh
      mkdir -p .basecut
      mv "$HOME/.basecut/bin/basecut" .basecut/
    fi
  - export PATH="$PWD/.basecut:$PATH"

3. Handle Restore Failures Gracefully

- name: Restore Snapshot with Fallback
  run: |
    if ! basecut snapshot restore test-data:latest --target "$BASECUT_DATABASE_URL"; then
      echo "⚠️ Snapshot restore failed, using seed data instead"
      npm run seed:test-data
    fi
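Transient network errors against S3 or GCS are common in CI, so it can be worth retrying the restore before falling back to seed data. A minimal POSIX-sh retry helper (the helper name and backoff policy are our own, not part of the Basecut CLI):

```shell
# retry MAX CMD...: run CMD until it succeeds, up to MAX attempts,
# sleeping 2, 4, 6, ... seconds between tries (linear backoff).
retry() {
  max=$1; shift
  n=0
  until "$@"; do
    n=$((n + 1))
    [ "$n" -ge "$max" ] && return 1
    sleep $((n * 2))
  done
}

# Example:
# retry 3 basecut snapshot restore test-data:latest --target "$BASECUT_DATABASE_URL"
```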

4. Parallel Test Jobs

Restore snapshot once, run tests in parallel:
# .github/workflows/test.yml
jobs:
  setup:
    runs-on: ubuntu-latest
    services:
      postgres: [...]
    steps:
      - name: Restore Snapshot
        run: basecut snapshot restore test-data:latest --target "$BASECUT_DATABASE_URL"

      - name: Dump Database
        run: pg_dump "$BASECUT_DATABASE_URL" > test-db.sql

      - name: Upload DB Snapshot
        uses: actions/upload-artifact@v4
        with:
          name: test-db
          path: test-db.sql

  test:
    needs: setup
    strategy:
      matrix:
        shard: [1, 2, 3, 4]
    runs-on: ubuntu-latest
    services:
      postgres: [...]
    steps:
      - name: Download DB Snapshot
        uses: actions/download-artifact@v4
        with:
          name: test-db

      - name: Restore DB
        run: psql "$BASECUT_DATABASE_URL" < test-db.sql

      - name: Run Tests
        run: npm test -- --shard=${{ matrix.shard }}/4
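If your test runner has no built-in `--shard` flag, files can be assigned to shards deterministically by hashing their paths; every worker computes the same assignment with no coordination. A sketch (the helper name is ours):

```shell
# shard_for FILE TOTAL: map FILE to a shard number in 1..TOTAL.
# cksum's CRC is stable across runs and machines, so the assignment
# is deterministic.
shard_for() {
  crc=$(printf '%s' "$1" | cksum | cut -d' ' -f1)
  echo $(( crc % $2 + 1 ))
}

# Example: list the files belonging to shard 2 of 4
# for f in tests/*.test.js; do
#   [ "$(shard_for "$f" 4)" -eq 2 ] && echo "$f"
# done
```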

5. Use Read-Only Database Credentials

Never commit credentials to source control. Store BASECUT_API_KEY and database URLs in your CI platform's secret manager, and point snapshot creation at a read-only database user so a misconfigured job can never write to production. See Agent Deployment for the recommended read-only user setup.

Troubleshooting

Snapshot Restore Too Slow

  • Use smaller snapshot: Reduce limits.rows.per_table
  • Cache restore: Restore once, reuse DB dump (see parallel tests above)
  • Local storage: Use local filesystem for CI snapshots (faster than S3)

Authentication Failures

# Verify API key works
basecut whoami

# Check cloud storage credentials
aws s3 ls s3://your-bucket/  # S3
gsutil ls gs://your-bucket/  # GCS

Database Connection Refused

  • Ensure the database service container is healthy before restoring
  • Check depends_on with condition: service_healthy
  • Increase --health-interval if database starts slowly
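When platform health checks aren't available (for example, on a bare shell runner), a small polling loop before the restore avoids racing the database startup. A sketch (the helper is ours; `pg_isready` ships with the PostgreSQL client tools):

```shell
# wait_for RETRIES CMD...: poll CMD once per second until it succeeds
# or RETRIES attempts are exhausted.
wait_for() {
  retries=$1; shift
  i=0
  until "$@"; do
    i=$((i + 1))
    [ "$i" -ge "$retries" ] && return 1
    sleep 1
  done
}

# Example:
# wait_for 30 pg_isready -h localhost -p 5432 -U postgres
```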

Next Steps

Common Workflows

Other patterns for development, debugging, and data sharing

Agent Deployment

Run your own agents for private databases

Environment Variables

Complete CLI configuration reference

Troubleshooting

Debug common CI/CD issues