Chunked-resume

Use this profile when:

  • one or two large SQLite tables dominate the run
  • you want checkpoints and row-count validation
  • restart safety matters more than raw speed
```toml
schema = "app"
on_schema_exists = "error"
unlogged_tables = false
chunk_size = 100000
resume = true
validation = "row_count"

[source]
type = "sqlite"
dsn = "/path/to/database.db"

[target]
dsn = "postgres://postgres:postgres@127.0.0.1:5432/target_db?sslmode=disable"
```

SQLite still runs with one effective worker, but chunking keeps resume and progress tracking useful.
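The principle behind `chunk_size`, `resume`, and `validation = "row_count"` can be sketched in a few lines. This is not the tool's implementation, just a minimal illustration under assumed names (`items` table, an integer primary key used as the checkpoint cursor): each chunk is read in key order, the highest key copied so far is the checkpoint, and a restart re-enters the loop from that checkpoint instead of from zero.

```python
import sqlite3

CHUNK_SIZE = 3  # tiny for the demo; the config above uses 100000

# Throwaway SQLite source with an integer-keyed table (hypothetical schema).
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT)")
src.executemany("INSERT INTO items (id, name) VALUES (?, ?)",
                [(i, f"row-{i}") for i in range(1, 11)])

def copy_chunks(last_id, sink):
    """Copy rows in id-ordered chunks; return the new checkpoint id."""
    while True:
        rows = src.execute(
            "SELECT id, name FROM items WHERE id > ? ORDER BY id LIMIT ?",
            (last_id, CHUNK_SIZE),
        ).fetchall()
        if not rows:
            return last_id
        sink.extend(rows)               # stand-in for the Postgres write
        last_id = rows[-1][0]           # checkpoint: highest id copied

target = []
checkpoint = copy_chunks(0, target)     # first run
# A crash here loses nothing before `checkpoint`; a restart resumes from it.
checkpoint = copy_chunks(checkpoint, target)  # restart copies nothing new

# Row-count validation: source and target totals must agree.
(src_count,) = src.execute("SELECT COUNT(*) FROM items").fetchone()
assert src_count == len(target)
```

Note that a single writer still does all the work here, matching SQLite's one effective worker; chunking buys restartability and progress visibility, not parallelism.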

Raw file: migration.toml