# CLI

```sh
pnpm add -D @datrix/cli
```

```sh
datrix <command> [options]
```
## datrix migrate

Compares your schemas against the current database state and runs the necessary migrations.
```sh
datrix migrate
datrix migrate --dry-run   # preview plan without applying
datrix migrate --status    # show pending change counts only
datrix migrate --verbose   # show field/index details in the plan
```
### What happens

- Loads your `datrix.config.ts` and connects to the database
- Diffs each schema against the live table structure
- If there are pending changes, prints the migration plan:
  - green `+` — tables/columns to create
  - red `-` — tables/columns to drop
  - yellow `~` — tables/columns to alter
- Prompts for confirmation, then applies
### Ambiguous changes

When a field is removed and a new one is added at the same time, Datrix cannot tell whether this is a rename or a drop + add. You will be prompted to choose:

```
Ambiguous changes detected:

  user.name -> fullName

⚠ Renaming will preserve data. Dropping will lose it.

1. Rename column 'name' to 'fullName' (data preserved)
2. Drop column 'name' and add column 'fullName' (data lost)

Choose option (1-2):
```
Call `session.resolveAmbiguous(id, type)` in code if you want to resolve programmatically instead of interactively.
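As a rough sketch of what a programmatic resolution might look like, the snippet below stubs out the session shape. The `MigrationSession` interface, the resolution values, and the id format are assumptions for illustration; only the `resolveAmbiguous(id, type)` call itself is documented.

```typescript
// Hypothetical shapes, not part of the documented @datrix API.
type AmbiguousResolution = "rename" | "drop"

interface MigrationSession {
  // id identifies the ambiguous change; type picks how to resolve it.
  resolveAmbiguous(id: string, type: AmbiguousResolution): void
}

// Resolve every detected ambiguity as a rename, so data is preserved.
function resolveAllAsRename(session: MigrationSession, ids: string[]): void {
  for (const id of ids) {
    session.resolveAmbiguous(id, "rename")
  }
}
```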
### Auto-migrate on startup

Set `migration.auto: true` in `defineConfig` to apply migrations automatically when the app starts — no CLI needed:
```ts
export default defineConfig(() => ({
  adapter: new PostgresAdapter({ ... }),
  schemas: [...],
  migration: {
    auto: true,
  },
}))
```
## datrix generate schema <Name>

Generates a `defineSchema` template file.
```sh
datrix generate schema User
datrix generate schema BlogPost --output ./src/schemas
```
Output: `schemas/user.schema.ts` (or the path specified by `--output`)
```ts
import { defineSchema, type SchemaDefinition } from "@datrix/core"

export const userSchema = defineSchema({
  name: "user",
  fields: {
    // Add your fields here
    // id, createdAt, updatedAt are added automatically
    // name: {
    //   type: "string",
    //   required: true,
    //   minLength: 2,
    //   maxLength: 100,
    // },
  },
  indexes: [
    // { fields: ["email"], unique: true },
  ],
  // permission: Only needed if you are using @datrix/api for HTTP access control.
  // permission: {
  //   create: true,
  //   read: true,
  //   update: true,
  //   delete: true,
  // },
} satisfies SchemaDefinition)
```
After editing, run `datrix migrate` to apply the new table.
## datrix generate types

Generates TypeScript types from all registered schemas into a single file. For each schema it produces a Base interface, Relation/RelationUpdate interfaces for nested types, and Create/Update input types — ready to import across your codebase.
```sh
datrix generate types
datrix generate types --output ./src/types/db.ts
```
Output: `types/generated.ts` (or the path specified by `--output`)
```ts
// generated.ts (example)
import type {
  DatrixEntry,
  RelationBelongsTo,
  RelationHasMany,
} from "@datrix/core"

// ─────────────────────────────────────────
// User (table: user)
// ─────────────────────────────────────────

export interface UserBase extends DatrixEntry {
  email: string
  name?: string
  role: "admin" | "editor" | "user"
}

export interface UserRelation {
  posts?: PostBase[]
}

export interface UserRelationUpdate {
  posts?: RelationHasMany<PostBase>
}

export type User = UserBase & UserRelation
export type CreateUserInput = Omit<UserBase, keyof DatrixEntry> & UserRelationUpdate
export type UpdateUserInput = Partial<Omit<UserBase, keyof DatrixEntry>> & UserRelationUpdate

// ─────────────────────────────────────────
// Post (table: post)
// relations: author → belongsTo(user)
// ─────────────────────────────────────────

export interface PostBase extends DatrixEntry {
  title: string
  slug: string
  status: "draft" | "published" | "archived"
}

export interface PostRelation {
  author?: UserBase
}

export interface PostRelationUpdate {
  author?: RelationBelongsTo<UserBase>
}

export type Post = PostBase & PostRelation
export type CreatePostInput = Omit<PostBase, keyof DatrixEntry> & PostRelationUpdate
export type UpdatePostInput = Partial<Omit<PostBase, keyof DatrixEntry>> & PostRelationUpdate
```
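The `Omit<..., keyof DatrixEntry>` pattern in the generated inputs strips the auto-managed base fields from create payloads. A minimal, self-contained illustration follows; the `DatrixEntry` stand-in here assumes `id`/`createdAt`/`updatedAt` are its fields, per the schema template's comment, and is not the real `@datrix/core` type.

```typescript
// Minimal stand-ins for illustration; the real types come from the
// generated file and @datrix/core.
interface DatrixEntry {
  id: number
  createdAt: string
  updatedAt: string
}

interface PostBase extends DatrixEntry {
  title: string
  slug: string
  status: "draft" | "published" | "archived"
}

// Create inputs omit the auto-managed DatrixEntry fields.
type CreatePostInput = Omit<PostBase, keyof DatrixEntry>

const input: CreatePostInput = {
  title: "Hello",
  slug: "hello",
  status: "draft",
  // id/createdAt/updatedAt would be type errors here: the database manages them
}
```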
## datrix export

Exports all data from your database into a single zip file. The zip contains a `metadata.json` with schema definitions and one or more CSV chunks per table.
```sh
datrix export
datrix export --output ./backups/prod-2024-01-15.zip
```
| Option | Description |
|---|---|
| `--output <path>` | Output file path (default: `./export_<timestamp>.zip`) |
| `--verbose` | Detailed output |
### What gets exported

- All tables, including junction tables (`manyToMany`)
- All rows, chunked in batches of 1000 for memory efficiency
- Schema definitions — so the importer can recreate tables without a running app
- Table creation order, sorted by FK dependency to ensure a safe import sequence
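The FK-dependency ordering in the last point amounts to a topological sort: a table must be created after every table it references. A sketch of that idea (an illustration, not Datrix's actual implementation; the `deps` map shape is assumed):

```typescript
// Order tables so each appears after the tables it references via FK.
// deps maps a table name to the tables it depends on.
function sortByFkDependency(deps: Record<string, string[]>): string[] {
  const sorted: string[] = []
  const visiting = new Set<string>()
  const done = new Set<string>()

  function visit(table: string): void {
    if (done.has(table)) return
    if (visiting.has(table)) throw new Error(`FK cycle at ${table}`)
    visiting.add(table)
    for (const dep of deps[table] ?? []) visit(dep) // referenced tables first
    visiting.delete(table)
    done.add(table)
    sorted.push(table)
  }

  for (const table of Object.keys(deps)) visit(table)
  return sorted
}
```

With `post` referencing `user` and `post_tag` referencing both `post` and `tag`, the junction table lands last, matching the safe import sequence described above.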
### Format

```
export_2024-01-15T10-30-00.zip
├── metadata.json     # schema definitions + chunk file index
├── users_0.csv       # first 1000 rows of users
├── users_1.csv       # next 1000 rows of users
├── posts_0.csv
└── post_tag_0.csv    # junction table
```
### Exporting media files

If your project uses `@datrix/api-upload`, pass `--include-files` to also download all media files alongside the database export.
```sh
datrix export --include-files
datrix export --include-files --output ./backups/full-export
```
This creates an output directory instead of a single zip:

```
full-export/
├── export.zip           # database export (same format as above)
├── files-progress.txt   # download ledger
└── files/
    ├── 1710000000-photo.jpg
    └── 1710000000-thumb.jpg
```
| Option | Description |
|---|---|
| `--include-files` | Download media files in addition to DB data |
| `--output <path>` | Output directory path (default: `./export_<timestamp>/`) |
| `--pack-files` | Pack downloaded files into zip chunks instead of leaving them loose |
| `--pack-files-chunk-size <bytes>` | Max bytes per chunk zip (default: 1 GB) |
| `--resume <path>` | Resume a previously interrupted file download |
### Progress ledger

`files-progress.txt` tracks the download status of every media file. Each line has the format:

```
<id> <key> <status>
```
| Status | Meaning |
|---|---|
| `pending` | Not yet downloaded |
| `done` | Successfully downloaded |
| `missing` | Server returned 404 — file no longer exists at the source |
| `restricted` | Server returned 403 — file exists but credentials are not supported yet |
`missing` and `restricted` entries are skipped silently and will not be retried on resume. All other errors (network failures, 5xx) stop the export so you can resume later.
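Since a resume pass only retries `pending` entries, parsing the ledger is a simple line filter. A sketch under the `<id> <key> <status>` line format documented above (the function name is mine, not part of the CLI):

```typescript
interface LedgerEntry {
  id: string
  key: string
  status: "pending" | "done" | "missing" | "restricted"
}

// Parse `<id> <key> <status>` lines and keep only entries still pending.
function pendingEntries(ledger: string): LedgerEntry[] {
  return ledger
    .split("\n")
    .filter((line) => line.trim().length > 0)
    .map((line) => {
      const [id, key, status] = line.trim().split(/\s+/)
      return { id, key, status: status as LedgerEntry["status"] }
    })
    .filter((entry) => entry.status === "pending")
}
```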
### Resuming an interrupted export

If a download is interrupted, run the same command with `--resume` pointing to the output directory:
```sh
datrix export --include-files --resume ./backups/full-export
```
Only `pending` entries in `files-progress.txt` will be retried.
### Packing files into zip chunks

Use `--pack-files` to archive downloaded files into chunks after downloading. Original files are removed after packing. Useful when transferring large exports.
```sh
datrix export --include-files --pack-files
datrix export --include-files --pack-files --pack-files-chunk-size 536870912  # 512 MB chunks
```
```
full-export/
├── export.zip
├── files-progress.txt
└── files/
    ├── chunk_0.zip
    └── chunk_1.zip
```
## datrix import

Imports data from a previously exported zip file. Drops all existing data and restores from the export.
```sh
datrix import ./backups/prod-2024-01-15.zip
datrix import ./backups/prod-2024-01-15.zip --agree   # skip confirmation prompt
```
| Option | Description |
|---|---|
| `--agree` | Skip the "drop all data" confirmation prompt |
| `--verbose` | Detailed output |
### What happens

- Prompts for confirmation (skipped with `--agree`)
- Drops all existing tables
- Recreates tables from the exported schema definitions — without FK constraints
- Inserts all rows chunk by chunk
- Adds FK constraints via `ALTER TABLE`
- Resets auto-increment sequences to continue from the last imported ID
### Use cases

- Database migrations — move data between adapters (e.g. JSON → PostgreSQL)
- Environment sync — copy production data to staging
- Backups — restore from a known good state
```sh
# Copy production data to a local JSON adapter for development
datrix export --config ./config/prod.config.ts --output ./prod-snapshot.zip
datrix import ./prod-snapshot.zip --config ./config/local.config.ts --agree
```
## Global options

| Option | Description |
|---|---|
| `--config <path>` | Path to config file (default: `./datrix.config.ts`) |
| `--verbose` | Detailed output |
| `--help` | Show help |
```sh
datrix migrate --config ./config/datrix.config.ts --verbose
```