A SQLite client for Node.js applications with a SQL-based migrations API, written in TypeScript.
SQL-based migration system supporting up/down migrations, rollback functionality, and migration state management for database schema evolution.
Run database migrations with comprehensive configuration options and rollback support.
// Available on Database instance
migrate(config?: MigrationParams): Promise<void>;
interface MigrationParams {
/** Force rollback and re-apply the latest migration on each app launch */
force?: boolean;
/** Migration table name (default: 'migrations') */
table?: string;
/** Path to migrations folder (default: path.join(process.cwd(), 'migrations')) */
migrationsPath?: string;
/** Migration data array (if provided, migrationsPath is ignored) */
migrations?: readonly MigrationData[];
}

Migrations are stored as SQL files with a specific naming convention and structure:
interface MigrationFile {
/** Migration ID extracted from filename */
id: number;
/** Migration name extracted from filename */
name: string;
/** Full filename */
filename: string;
}
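As an illustration of how these fields relate to a filename, here is a hypothetical parser sketch (not part of the library's API), assuming the {id}-{name}.sql naming convention described below:

```typescript
// Sketch only: map a filename such as "002-add-users-table.sql"
// onto the MigrationFile fields above.
function parseMigrationFilename(
  filename: string
): { id: number; name: string; filename: string } | null {
  // {id}-{name}.sql — digits, a dash, the name, then the .sql extension
  const match = /^(\d+)-(.+)\.sql$/.exec(filename);
  if (!match) return null;
  return { id: Number(match[1]), name: match[2], filename };
}

// parseMigrationFilename("002-add-users-table.sql")
// → { id: 2, name: "add-users-table", filename: "002-add-users-table.sql" }
```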
interface MigrationData {
/** Migration ID */
id: number;
/** Migration name */
name: string;
/** SQL for applying the migration */
up: string;
/** SQL for rolling back the migration */
down: string;
}

Migration File Naming Convention:
{id}-{name}.sql

Examples: 001-initial.sql, 002-add-users-table.sql, 003-user-indexes.sql

Migration File Content Structure:
-- Up migration (applied when migrating forward)
CREATE TABLE users (
id INTEGER PRIMARY KEY AUTOINCREMENT,
name TEXT NOT NULL,
email TEXT UNIQUE NOT NULL,
created_at DATETIME DEFAULT CURRENT_TIMESTAMP
);
-- down
-- Down migration (applied when rolling back)
DROP TABLE users;

Usage Examples:
import { open } from "sqlite";
import sqlite3 from "sqlite3";
const db = await open({
filename: "./app.db",
driver: sqlite3.Database
});
// Run migrations from default ./migrations directory
await db.migrate();
// Run migrations from custom directory
await db.migrate({
migrationsPath: "./database/migrations"
});
// Use custom migration table name
await db.migrate({
table: "schema_versions",
migrationsPath: "./migrations"
});
// Force re-apply latest migration (useful for development)
await db.migrate({
force: true
});
// Use programmatic migration data instead of files
const migrationData: MigrationData[] = [
{
id: 1,
name: "initial",
up: "CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT);",
down: "DROP TABLE users;"
},
{
id: 2,
name: "add-email",
up: "ALTER TABLE users ADD COLUMN email TEXT;",
down: "ALTER TABLE users DROP COLUMN email;"
}
];
await db.migrate({
migrations: migrationData
});

Create a migrations directory with properly structured SQL files:
migrations/
├── 001-initial.sql
├── 002-create-users.sql
├── 003-create-posts.sql
└── 004-add-indexes.sql

Example Migration Files:
001-initial.sql:
-- Create initial database structure
CREATE TABLE schema_info (
version INTEGER PRIMARY KEY,
applied_at DATETIME DEFAULT CURRENT_TIMESTAMP
);
-- down
DROP TABLE schema_info;

002-create-users.sql:
-- Create users table
CREATE TABLE users (
id INTEGER PRIMARY KEY AUTOINCREMENT,
username TEXT UNIQUE NOT NULL,
email TEXT UNIQUE NOT NULL,
password_hash TEXT NOT NULL,
created_at DATETIME DEFAULT CURRENT_TIMESTAMP,
updated_at DATETIME DEFAULT CURRENT_TIMESTAMP
);
-- Create trigger to update updated_at
CREATE TRIGGER update_users_updated_at
AFTER UPDATE ON users
BEGIN
UPDATE users SET updated_at = CURRENT_TIMESTAMP WHERE id = NEW.id;
END;
-- down
DROP TRIGGER IF EXISTS update_users_updated_at;
DROP TABLE users;

003-create-posts.sql:
-- Create posts table
CREATE TABLE posts (
id INTEGER PRIMARY KEY AUTOINCREMENT,
title TEXT NOT NULL,
content TEXT,
author_id INTEGER NOT NULL,
published BOOLEAN DEFAULT FALSE,
created_at DATETIME DEFAULT CURRENT_TIMESTAMP,
updated_at DATETIME DEFAULT CURRENT_TIMESTAMP,
FOREIGN KEY (author_id) REFERENCES users(id) ON DELETE CASCADE
);
-- down
DROP TABLE posts;

004-add-indexes.sql:
-- Add performance indexes
CREATE INDEX idx_users_email ON users(email);
CREATE INDEX idx_users_username ON users(username);
CREATE INDEX idx_posts_author ON posts(author_id);
CREATE INDEX idx_posts_published ON posts(published);
CREATE INDEX idx_posts_created ON posts(created_at);
-- down
DROP INDEX IF EXISTS idx_posts_created;
DROP INDEX IF EXISTS idx_posts_published;
DROP INDEX IF EXISTS idx_posts_author;
DROP INDEX IF EXISTS idx_users_username;
DROP INDEX IF EXISTS idx_users_email;

The migration system provides intelligent migration management:
- Applied migrations are recorded in the migrations table (or a custom table name), so each migration runs only once.
- With force: true, the latest migration is rolled back and re-applied on every run.

Development Workflow with Force Mode:
// During development, force re-apply latest migration
if (process.env.NODE_ENV === 'development') {
await db.migrate({ force: true });
} else {
await db.migrate();
}

Multiple Database Migration:
const databases = await Promise.all([
open({ filename: "./users.db", driver: sqlite3.Database }),
open({ filename: "./products.db", driver: sqlite3.Database }),
open({ filename: "./orders.db", driver: sqlite3.Database })
]);
// Run migrations on all databases
await Promise.all(databases.map(db => db.migrate({
migrationsPath: "./migrations"
})));

Custom Migration Logic:
// Read and modify migrations programmatically
import { readMigrations } from "sqlite/utils/migrate";
const migrationData = await readMigrations("./migrations");
// Add environment-specific modifications
const modifiedMigrations = migrationData.map(migration => ({
...migration,
up: process.env.NODE_ENV === 'test'
? migration.up.replace('users', 'test_users')
: migration.up
}));
await db.migrate({
migrations: modifiedMigrations
});

Proper error handling for migration operations:
try {
await db.migrate();
console.log("Migrations completed successfully");
} catch (error) {
console.error("Migration failed:", error);
// Migrations are automatically rolled back on failure
// Check migration state
const appliedMigrations = await db.all(
"SELECT id, name FROM migrations ORDER BY id"
);
console.log("Currently applied migrations:", appliedMigrations);
throw error; // Re-throw to halt application startup
}

/**
* Read migration files from a directory and parse their content
* @param migrationPath - Path to migrations directory (default: './migrations')
* @returns Promise resolving to array of migration data with parsed up/down SQL
*/
function readMigrations(migrationPath?: string): Promise<MigrationData[]>;

This utility function is available for advanced use cases where you need to read and process migration files programmatically before applying them. It reads migration files that follow the {id}-{name}.sql naming pattern and parses each file's SQL content into up and down sections.
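The up/down split can be sketched roughly as follows. This is a simplified, hypothetical illustration, not the library's implementation; the actual utility also strips comments and trims whitespace, as noted under Return Value Structure below:

```typescript
// Sketch: split a migration file's SQL into up and down sections
// at the first line starting with "-- down" (case-insensitive).
function splitMigration(sql: string): { up: string; down: string } {
  const [up, down = ""] = sql.split(/^--\s*down\b.*$/im);
  return { up: up.trim(), down: down.trim() };
}

const sql = [
  "CREATE TABLE users (id INTEGER PRIMARY KEY);",
  "-- down",
  "DROP TABLE users;",
].join("\n");

const { up, down } = splitMigration(sql);
// up   → "CREATE TABLE users (id INTEGER PRIMARY KEY);"
// down → "DROP TABLE users;"
```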
Usage Examples:
import { readMigrations } from "sqlite/utils/migrate";
// Read migrations from default directory
const migrations = await readMigrations();
console.log(`Found ${migrations.length} migration files`);
// Read migrations from custom directory
const customMigrations = await readMigrations("./database/migrations");
// Process migrations programmatically
const processedMigrations = migrations.map(migration => ({
...migration,
up: `-- Environment: ${process.env.NODE_ENV}\n${migration.up}`,
down: migration.down
}));
// Use with database migrate method
await db.migrate({
migrations: processedMigrations
});

Return Value Structure:
Each migration object returned contains:
- id: Migration ID extracted from the filename
- name: Migration name extracted from the filename
- up: SQL statements for applying the migration (comments removed)
- down: SQL statements for rolling back the migration (comments removed)

Install with Tessl CLI
npx tessl i tessl/npm-sqlite