tessl/npm-mariadb

Fast MariaDB and MySQL connector for Node.js with Promise and callback APIs

Query Operations

Advanced querying capabilities, including prepared statements, batch operations, streaming queries, and transaction management, all with comprehensive TypeScript type definitions. The MariaDB connector provides multiple query execution methods optimized for different use cases.

Capabilities

Basic Query Execution

Execute SQL queries using the text protocol for maximum compatibility.

/**
 * Execute SQL query using text protocol
 * @param sql - SQL string or query options object
 * @param values - Parameter values for placeholders
 * @returns Promise resolving to query results
 */
query<T = any>(sql: string | QueryOptions, values?: any): Promise<T>;

Usage Examples:

// Simple query
const users = await connection.query("SELECT * FROM users");

// Parameterized query with placeholders
const user = await connection.query(
  "SELECT * FROM users WHERE id = ? AND status = ?",
  [123, 'active']
);

// Named placeholders (requires namedPlaceholders: true in config)
const result = await connection.query({
  sql: "SELECT * FROM users WHERE name = :name AND age > :minAge",
  namedPlaceholders: true
}, { name: "Alice", minAge: 18 });

Prepared Statements

Execute SQL queries using the binary protocol with prepared statements for better performance and security.

/**
 * Execute SQL query using binary (prepared statement) protocol
 * @param sql - SQL string or query options object  
 * @param values - Parameter values for placeholders
 * @returns Promise resolving to query results
 */
execute<T = any>(sql: string | QueryOptions, values?: any): Promise<T>;

/**
 * Prepare SQL statement for repeated execution
 * @param sql - SQL string or query options object
 * @returns Promise resolving to Prepare object
 */
prepare(sql: string | QueryOptions): Promise<Prepare>;

interface Prepare {
  /** Statement identifier */
  id: number;
  
  /** Number of parameters in the prepared statement */
  parameterCount: number;
  
  /** Result columns metadata */
  columns: FieldInfo[];
  
  /** Database name where statement was prepared */
  database: string;
  
  /** Original SQL query string */
  query: string;
  
  /** Execute prepared statement */
  execute<T = any>(values?: any, options?: QueryOptions): Promise<T>;
  
  /** Execute prepared statement with streaming */
  executeStream(values?: any, options?: QueryOptions): Readable;
  
  /** Close prepared statement */
  close(): void;
  
  /** Check if prepared statement is closed */
  isClose(): boolean;
}

Usage Examples:

// Execute with binary protocol (auto-prepare)
const users = await connection.execute(
  "SELECT * FROM users WHERE created_at > ?",
  [new Date('2023-01-01')]
);

// Manual prepare for repeated execution
const stmt = await connection.prepare("INSERT INTO logs (level, message, timestamp) VALUES (?, ?, ?)");

try {
  await stmt.execute(['info', 'User logged in', new Date()]);
  await stmt.execute(['error', 'Database timeout', new Date()]);
  await stmt.execute(['debug', 'Cache miss', new Date()]);
} finally {
  stmt.close(); // Always close prepared statements
}

Batch Operations

Execute multiple operations efficiently using batch processing.

/**
 * Execute batch operations with multiple value sets
 * @param sql - SQL string or query options object
 * @param values - Array of parameter value arrays
 * @returns Promise resolving to batch results
 */
batch<T = UpsertResult | UpsertResult[]>(sql: string | QueryOptions, values?: any): Promise<T>;

Usage Examples:

// Insert multiple records in one batch
const users = [
  ['Alice', 'alice@example.com', 25],
  ['Bob', 'bob@example.com', 30], 
  ['Charlie', 'charlie@example.com', 35]
];

const results = await connection.batch(
  "INSERT INTO users (name, email, age) VALUES (?, ?, ?)",
  users
);

console.log(`Inserted ${results.affectedRows} users`);

// Update multiple records
const updates = [
  ['active', 123],
  ['inactive', 456],
  ['pending', 789]
];

await connection.batch(
  "UPDATE users SET status = ? WHERE id = ?",
  updates
);

Streaming Queries

Execute queries that return large result sets using streaming for memory efficiency. Streaming avoids loading the entire result set into memory, so arbitrarily large datasets can be processed row by row.

/**
 * Execute query returning a Readable stream
 * @param sql - SQL string or query options object
 * @param values - Parameter values for placeholders
 * @returns Readable stream of query results
 */
queryStream(sql: string | QueryOptions, values?: any): Readable;

Stream Events:

The returned stream emits the following events:

  • 'columns' - Emitted with column metadata before rows
  • 'data' - Emitted for each row of results
  • 'end' - Emitted when query completes successfully
  • 'error' - Emitted on query errors

Usage Examples:

const { pipeline } = require('stream/promises');
const { Transform } = require('stream');

// Stream large result set with pipeline
const queryStream = connection.queryStream("SELECT * FROM large_table WHERE active = ?", [true]);

const processRow = new Transform({
  objectMode: true,
  transform(row, encoding, callback) {
    // Process each row without loading all into memory
    const processed = {
      ...row,
      processed_at: new Date(),
      full_name: `${row.first_name} ${row.last_name}`
    };
    callback(null, processed);
  }
});

// Process rows as they arrive
await pipeline(
  queryStream,
  processRow,
  // Write processed data to another stream (file, another DB, etc.)
  process.stdout // Example: output to console
);

// Event-based streaming for more control
const stream = connection.queryStream("SELECT id, name, data FROM massive_table");

stream.on('columns', (columns) => {
  console.log('Query columns:', columns.map(col => col.name()));
});

stream.on('data', (row) => {
  console.log('Processing row:', row.id);
  // Process individual row
});

stream.on('end', () => {
  console.log('Query streaming completed');
});

stream.on('error', (err) => {
  console.error('Stream error:', err);
});

// Prepared statement streaming
const stmt = await connection.prepare("SELECT * FROM logs WHERE created_at > ? AND level = ?");
const logStream = stmt.executeStream([new Date('2023-01-01'), 'error']);

logStream.on('data', (logEntry) => {
  console.log('Log entry:', logEntry);
});

// Clean up when done
logStream.on('end', () => {
  stmt.close();
});

Memory Efficiency Benefits:

Streaming is essential for:

  • Large result sets (millions of rows)
  • Export operations
  • Data migration
  • Real-time data processing
  • Memory-constrained environments

// Example: Export large table to CSV
const fs = require('fs');
const { Transform } = require('stream');
const { pipeline } = require('stream/promises');

const csvTransform = new Transform({
  objectMode: true,
  transform(row, encoding, callback) {
    const csvLine = Object.values(row).map(val => 
      typeof val === 'string' ? `"${val.replace(/"/g, '""')}"` : val
    ).join(',') + '\n';
    callback(null, csvLine);
  }
});

const exportStream = connection.queryStream("SELECT * FROM large_exports_table");
const writeStream = fs.createWriteStream('./export.csv');

// Add CSV header (assumes the table's columns are id, name, email, created_at)
writeStream.write('id,name,email,created_at\n');

await pipeline(
  exportStream,
  csvTransform,
  writeStream
);

console.log('Export completed');

Query Options

Comprehensive options for controlling query behavior and result formatting.

interface QueryOptions {
  /** SQL command to execute */
  sql: string;
  
  /** Present result-sets by table to avoid colliding fields */
  nestTables?: boolean | string;
  
  /** Custom type casting function */
  typeCast?: TypeCastFunction;
  
  /** Return result-sets as arrays instead of objects */
  rowsAsArray?: boolean;
  
  /** Return metadata as array [rows, metadata] */
  metaAsArray?: boolean;
  
  /** Force insertId as Number instead of BigInt */
  insertIdAsNumber?: boolean;
  
  /** Return dates as strings instead of Date objects */
  dateStrings?: boolean;
  
  /** Force timezone usage */
  timezone?: string;
  
  /** Use named placeholders */
  namedPlaceholders?: boolean;
  
  /** Allow multi-parameter entries */
  permitSetMultiParamEntries?: boolean;
  
  /** Disable bulk operations in batch */
  bulk?: boolean;
  
  /** Send queries without waiting for previous results */
  pipelining?: boolean;
  
  /** Query execution timeout */
  timeout?: number;
  
  /** Auto-map JSON fields */
  autoJsonMap?: boolean;
  
  /** Include arrays in parentheses */
  arrayParenthesis?: boolean;
  
  /** Check for duplicate column names */
  checkDuplicate?: boolean;
  
  /** Return decimals as numbers */
  decimalAsNumber?: boolean;
  
  /** Return BIGINT as numbers */
  bigIntAsNumber?: boolean;
  
  /** Check number conversion safety */
  checkNumberRange?: boolean;
}

Query Options Examples:

// High-performance array results
const fastResults = await connection.query({
  sql: "SELECT id, name, email FROM users",
  rowsAsArray: true,
  timeout: 5000
});

// Type-safe number handling
const stats = await connection.query({
  sql: "SELECT COUNT(*) as total, AVG(price) as avg_price FROM products",
  decimalAsNumber: true,
  bigIntAsNumber: true,
  checkNumberRange: true
});

// Custom type casting
const customResults = await connection.query({
  sql: "SELECT * FROM users",
  typeCast: (field, next) => {
    if (field.type === 'TINY' && field.columnLength === 1) {
      return field.string() === '1'; // Convert TINYINT(1) to boolean
    }
    return next();
  }
});

Transaction Management

Manage database transactions with full ACID compliance.

/**
 * Start database transaction
 */
beginTransaction(): Promise<void>;

/**
 * Commit current transaction
 */
commit(): Promise<void>;

/**
 * Rollback current transaction  
 */
rollback(): Promise<void>;

Transaction Examples:

// Basic transaction
await connection.beginTransaction();
try {
  await connection.query("INSERT INTO orders (user_id, total) VALUES (?, ?)", [userId, total]);
  await connection.query("UPDATE inventory SET quantity = quantity - ? WHERE product_id = ?", [quantity, productId]);
  await connection.commit();
  console.log('Transaction completed successfully');
} catch (error) {
  await connection.rollback();
  console.error('Transaction failed, rolled back:', error);
  throw error;
}

// Nested transaction pattern with savepoints
await connection.beginTransaction();
try {
  // First operation: capture the generated id for later use
  const userResult = await connection.query("INSERT INTO users (name, email) VALUES (?, ?)", [name, email]);
  const userId = userResult.insertId;
  
  // Savepoint for partial rollback
  await connection.query("SAVEPOINT user_created");
  
  try {
    // Second operation that might fail
    await connection.query("INSERT INTO user_preferences (user_id, theme) VALUES (?, ?)", [userId, theme]);
    await connection.commit();
  } catch (error) {
    // Rollback to savepoint, keep user creation
    await connection.query("ROLLBACK TO SAVEPOINT user_created");
    await connection.commit(); // Commit partial transaction
    console.log('User created but preferences failed');
  }
} catch (error) {
  await connection.rollback();
  throw error;
}
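The begin/try/commit/rollback pattern above can be wrapped in a small reusable helper so that every code path either commits or rolls back. This is a minimal sketch; `withTransaction` is illustrative and not part of the connector API:

```javascript
// Hypothetical helper (not part of the connector): run `work` inside a
// transaction, committing on success and rolling back on any error.
async function withTransaction(conn, work) {
  await conn.beginTransaction();
  try {
    const result = await work(conn);
    await conn.commit();
    return result;
  } catch (err) {
    await conn.rollback();
    throw err;
  }
}

// Usage sketch (assumes an open `connection` and in-scope parameters):
// const order = await withTransaction(connection, async (conn) => {
//   const res = await conn.query("INSERT INTO orders (user_id, total) VALUES (?, ?)", [userId, total]);
//   await conn.query("UPDATE inventory SET quantity = quantity - 1 WHERE product_id = ?", [productId]);
//   return res;
// });
```

Because the helper rethrows after rolling back, callers still see the original error while the connection is left in a clean state.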

Connection Utilities

Utility methods for connection management and SQL safety.

/**
 * Send ping to ensure connection is active
 */
ping(): Promise<void>;

/**
 * Reset connection state (rollback transactions, reset variables)
 */
reset(): Promise<void>;

/**
 * Check if connection is valid and active
 */
isValid(): boolean;

/**
 * Escape SQL parameter to prevent injection
 * @param value - Value to escape
 * @returns Escaped string safe for SQL
 */
escape(value: any): string;

/**
 * Escape SQL identifier (table/column names)
 * @param identifier - Identifier to escape
 * @returns Escaped identifier
 */
escapeId(identifier: string): string;

Utility Examples:

// Keep connection alive
setInterval(async () => {
  if (connection.isValid()) {
    try {
      await connection.ping();
      console.log('Connection ping successful');
    } catch (error) {
      console.error('Connection ping failed:', error);
    }
  }
}, 30000); // Ping every 30 seconds

// Safe dynamic queries (use parameterization when possible)
const tableName = connection.escapeId('user_data');
const value = connection.escape(userInput);
const query = `SELECT * FROM ${tableName} WHERE data = ${value}`;

// Better: Use parameterized queries
const saferQuery = "SELECT * FROM user_data WHERE data = ?";
const results = await connection.query(saferQuery, [userInput]);

Result Types

Types representing query execution results.

interface UpsertResult {
  /** Number of affected rows */
  affectedRows: number;
  
  /** Auto-generated ID from INSERT */
  insertId: number | bigint;
  
  /** Warning status code */
  warningStatus: number;
}
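Because `insertId` may be a BigInt (for values beyond Number.MAX_SAFE_INTEGER), code that serializes results needs to handle both types. A small sketch; the `normalizeInsertId` helper is illustrative, not part of the connector:

```javascript
// Hypothetical helper: normalize an UpsertResult's insertId for JSON output,
// since JSON.stringify cannot serialize BigInt values directly.
function normalizeInsertId(result) {
  const id = result.insertId;
  if (typeof id === 'bigint') {
    // Downcast when safe; otherwise keep full precision as a string
    return id <= BigInt(Number.MAX_SAFE_INTEGER) ? Number(id) : id.toString();
  }
  return id;
}

// Example UpsertResult shapes as returned by INSERT statements
console.log(normalizeInsertId({ affectedRows: 1, insertId: 42n, warningStatus: 0 }));       // → 42
console.log(normalizeInsertId({ affectedRows: 1, insertId: 2n ** 63n, warningStatus: 0 })); // → "9223372036854775808"
```

Alternatively, the `insertIdAsNumber` query option forces a Number directly, at the cost of silent precision loss for very large ids (unless `checkNumberRange` is also set).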

interface FieldInfo {
  /** Column collation information */
  collation: Collation;
  
  /** Column maximum length */  
  columnLength: number;
  
  /** Column type number */
  columnType: TypeNumbers;
  
  /** Decimal scale for numeric types */
  scale: number;
  
  /** Column type string */
  type: Types;
  
  /** Column attribute flags */
  flags: Flags;
  
  /** Get database name */
  db(): string;
  
  /** Get table name */
  table(): string;
  
  /** Get column name */
  name(): string;
  
  /** Get original table name */
  orgTable(): string;
  
  /** Get original column name */
  orgName(): string;
  
  /** Get value as string */
  string(): string | null;
  
  /** Get value as buffer */
  buffer(): Buffer | null;
  
  /** Get value as float */
  float(): number | null;
  
  /** Get value as integer */
  int(): number | null;
  
  /** Get value as long integer */
  long(): number | null;
  
  /** Get value as decimal */
  decimal(): number | null;
  
  /** Get value as date */
  date(): Date | null;
  
  /** Get value as geometry */
  geometry(): Geometry | null;
}
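The FieldInfo accessors can be used to summarize a result set's shape, for example when logging the columns of a prepared statement. In this sketch the `mockColumns` array stands in for real metadata (in real use it comes from `stmt.columns` or the stream's column metadata event):

```javascript
// Sketch: summarize result-set metadata using the FieldInfo accessors
// (table() and name() are methods; type is a string property per the interface above).
function describeColumns(columns) {
  return columns.map(col => `${col.table()}.${col.name()}: ${col.type}`);
}

// Mocked FieldInfo-like objects for illustration only
const mockColumns = [
  { table: () => 'users', name: () => 'id', type: 'LONGLONG' },
  { table: () => 'users', name: () => 'email', type: 'VAR_STRING' },
];

console.log(describeColumns(mockColumns)); // → [ 'users.id: LONGLONG', 'users.email: VAR_STRING' ]
```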

Performance Optimization

Best practices for query performance optimization.

// Use execute() for repeated queries (automatic prepare)
async function getUserById(id: number) {
  return await connection.execute("SELECT * FROM users WHERE id = ?", [id]);
}

// Manual prepare for high-frequency queries
const getUserStmt = await connection.prepare("SELECT * FROM users WHERE id = ?");
const getOrderStmt = await connection.prepare("SELECT * FROM orders WHERE user_id = ?");

// Batch operations for bulk inserts
const batchInsert = async (records: any[]) => {
  const values = records.map(r => [r.name, r.email, r.age]);
  return await connection.batch(
    "INSERT INTO users (name, email, age) VALUES (?, ?, ?)",
    values
  );
};

// Streaming for large result sets
const exportData = async () => {
  const stream = connection.queryStream("SELECT * FROM large_table");
  return stream;
};

// Connection pooling for concurrent operations
const mariadb = require('mariadb');
const pool = mariadb.createPool(config);

// Direct pool queries for simple operations
const users = await pool.query("SELECT * FROM users");

// Dedicated connections for transactions
const connection = await pool.getConnection();
try {
  await connection.beginTransaction();
  // ... transaction operations
  await connection.commit();
} catch (err) {
  await connection.rollback();
  throw err;
} finally {
  connection.release(); // Return the connection to the pool
}
