# Query Operations

Advanced querying capabilities including prepared statements, batch operations, streaming queries, and transaction management with comprehensive type safety. The MariaDB connector provides multiple query execution methods optimized for different use cases.

## Capabilities

### Basic Query Execution

Execute SQL queries using the text protocol for maximum compatibility.

```typescript { .api }
/**
 * Execute SQL query using text protocol
 * @param sql - SQL string or query options object
 * @param values - Parameter values for placeholders
 * @returns Promise resolving to query results
 */
query<T = any>(sql: string | QueryOptions, values?: any): Promise<T>;
```

**Usage Examples:**

```typescript
// Simple query
const users = await connection.query("SELECT * FROM users");

// Parameterized query with placeholders
const user = await connection.query(
  "SELECT * FROM users WHERE id = ? AND status = ?",
  [123, 'active']
);

// Named placeholders (requires namedPlaceholders: true in config)
const result = await connection.query({
  sql: "SELECT * FROM users WHERE name = :name AND age > :minAge",
  namedPlaceholders: true
}, { name: "Alice", minAge: 18 });
```
### Prepared Statements

Execute SQL queries using the binary protocol with prepared statements for better performance and security.

```typescript { .api }
/**
 * Execute SQL query using binary (prepared statement) protocol
 * @param sql - SQL string or query options object
 * @param values - Parameter values for placeholders
 * @returns Promise resolving to query results
 */
execute<T = any>(sql: string | QueryOptions, values?: any): Promise<T>;

/**
 * Prepare SQL statement for repeated execution
 * @param sql - SQL string or query options object
 * @returns Promise resolving to Prepare object
 */
prepare(sql: string | QueryOptions): Promise<Prepare>;

interface Prepare {
  /** Statement identifier */
  id: number;

  /** Number of parameters in the prepared statement */
  parameterCount: number;

  /** Result columns metadata */
  columns: FieldInfo[];

  /** Database name where statement was prepared */
  database: string;

  /** Original SQL query string */
  query: string;

  /** Execute prepared statement */
  execute<T = any>(values?: any, options?: QueryOptions): Promise<T>;

  /** Execute prepared statement with streaming */
  executeStream(values?: any, options?: QueryOptions): Readable;

  /** Close prepared statement */
  close(): void;

  /** Check if prepared statement is closed */
  isClose(): boolean;
}
```

**Usage Examples:**

```typescript
// Execute with binary protocol (auto-prepare)
const users = await connection.execute(
  "SELECT * FROM users WHERE created_at > ?",
  [new Date('2023-01-01')]
);

// Manual prepare for repeated execution
const stmt = await connection.prepare("INSERT INTO logs (level, message, timestamp) VALUES (?, ?, ?)");

try {
  await stmt.execute(['info', 'User logged in', new Date()]);
  await stmt.execute(['error', 'Database timeout', new Date()]);
  await stmt.execute(['debug', 'Cache miss', new Date()]);
} finally {
  stmt.close(); // Always close prepared statements
}
```
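The metadata exposed on the `Prepare` object can be inspected before execution, for example to confirm how many parameters must be bound. A small sketch using only the fields documented in the interface above (table and column names are illustrative):

```typescript
const stmt = await connection.prepare("SELECT id, name FROM users WHERE id = ?");

console.log(stmt.id);              // statement identifier
console.log(stmt.parameterCount);  // 1
console.log(stmt.columns.map(col => col.name())); // e.g. ['id', 'name']

const rows = await stmt.execute([42]);
stmt.close();
```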
### Batch Operations

Execute multiple operations efficiently using batch processing.

```typescript { .api }
/**
 * Execute batch operations with multiple value sets
 * @param sql - SQL string or query options object
 * @param values - Array of parameter value arrays
 * @returns Promise resolving to batch results
 */
batch<T = UpsertResult | UpsertResult[]>(sql: string | QueryOptions, values?: any): Promise<T>;
```

**Usage Examples:**

```typescript
// Insert multiple records in one batch
const users = [
  ['Alice', 'alice@example.com', 25],
  ['Bob', 'bob@example.com', 30],
  ['Charlie', 'charlie@example.com', 35]
];

const results = await connection.batch(
  "INSERT INTO users (name, email, age) VALUES (?, ?, ?)",
  users
);

console.log(`Inserted ${results.affectedRows} users`);

// Update multiple records
const updates = [
  ['active', 123],
  ['inactive', 456],
  ['pending', 789]
];

await connection.batch(
  "UPDATE users SET status = ? WHERE id = ?",
  updates
);
```
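A batch is not implicitly atomic; if partial failure matters, it can be wrapped in a transaction so that either every row is applied or none are. A minimal sketch combining `batch` with the transaction methods documented later in this section (the `importUsers` helper and table layout are illustrative):

```typescript
// Hypothetical bulk import: all rows commit together or not at all
async function importUsers(connection: any, rows: Array<[string, string, number]>) {
  await connection.beginTransaction();
  try {
    const result = await connection.batch(
      "INSERT INTO users (name, email, age) VALUES (?, ?, ?)",
      rows
    );
    await connection.commit();
    return result;
  } catch (error) {
    // Roll back the whole batch on any failure
    await connection.rollback();
    throw error;
  }
}
```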
### Streaming Queries

Execute queries that return large result sets using streaming for memory efficiency. Streaming prevents loading entire result sets into memory and allows processing of arbitrarily large datasets.

```typescript { .api }
/**
 * Execute query returning a Readable stream
 * @param sql - SQL string or query options object
 * @param values - Parameter values for placeholders
 * @returns Readable stream of query results
 */
queryStream(sql: string | QueryOptions, values?: any): Readable;
```

**Stream Events:**

The returned stream emits the following events:

- `'columns'` - Emitted with column metadata before rows
- `'data'` - Emitted for each row of results
- `'end'` - Emitted when query completes successfully
- `'error'` - Emitted on query errors

**Usage Examples:**

```javascript
const { pipeline } = require('stream/promises');
const { Transform } = require('stream');

// Stream large result set with pipeline
const queryStream = connection.queryStream("SELECT * FROM large_table WHERE active = ?", [true]);

const processRow = new Transform({
  objectMode: true,
  transform(row, encoding, callback) {
    // Process each row without loading all into memory
    const processed = {
      ...row,
      processed_at: new Date(),
      full_name: `${row.first_name} ${row.last_name}`
    };
    // Serialize so the downstream writable receives strings, not objects
    callback(null, JSON.stringify(processed) + '\n');
  }
});

// Process rows as they arrive
await pipeline(
  queryStream,
  processRow,
  // Write processed data to another stream (file, another DB, etc.)
  process.stdout // Example: output to console
);

// Event-based streaming for more control
const stream = connection.queryStream("SELECT id, name, data FROM massive_table");

stream.on('columns', (columns) => {
  console.log('Query columns:', columns.map(col => col.name()));
});

stream.on('data', (row) => {
  console.log('Processing row:', row.id);
  // Process individual row
});

stream.on('end', () => {
  console.log('Query streaming completed');
});

stream.on('error', (err) => {
  console.error('Stream error:', err);
});

// Prepared statement streaming
const stmt = await connection.prepare("SELECT * FROM logs WHERE created_at > ? AND level = ?");
const logStream = stmt.executeStream([new Date('2023-01-01'), 'error']);

logStream.on('data', (logEntry) => {
  console.log('Log entry:', logEntry);
});

// Clean up when done
logStream.on('end', () => {
  stmt.close();
});
```
**Streaming with Prepared Statements:**

Prepared statements also support streaming through the `executeStream` method:

```typescript { .api }
interface Prepare {
  /** Execute prepared statement returning a stream */
  executeStream(values?: any, options?: QueryOptions): Readable;
}
```
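Because `executeStream` returns a standard Node.js `Readable`, rows can also be consumed with `for await...of` instead of event handlers. A brief sketch (the `logs` table and its columns are illustrative):

```typescript
const stmt = await connection.prepare("SELECT id, level, message FROM logs WHERE level = ?");
const stream = stmt.executeStream(['error']);

try {
  // Object-mode Readable streams support async iteration
  for await (const row of stream) {
    console.log(`${row.id}: ${row.message}`);
  }
} finally {
  stmt.close();
}
```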
**Memory Efficiency Benefits:**

Streaming is essential for:
- Large result sets (millions of rows)
- Export operations
- Data migration
- Real-time data processing
- Memory-constrained environments

```javascript
// Example: Export large table to CSV
const fs = require('fs');
const { Transform } = require('stream');
const { pipeline } = require('stream/promises');

const csvTransform = new Transform({
  objectMode: true,
  transform(row, encoding, callback) {
    const csvLine = Object.values(row).map(val =>
      typeof val === 'string' ? `"${val.replace(/"/g, '""')}"` : val
    ).join(',') + '\n';
    callback(null, csvLine);
  }
});

const exportStream = connection.queryStream("SELECT id, name, email, created_at FROM large_exports_table");
const writeStream = fs.createWriteStream('./export.csv');

// Add CSV header
writeStream.write('id,name,email,created_at\n');

await pipeline(
  exportStream,
  csvTransform,
  writeStream
);

console.log('Export completed');
```
### Query Options

Comprehensive options for controlling query behavior and result formatting.

```typescript { .api }
interface QueryOptions {
  /** SQL command to execute */
  sql: string;

  /** Present result-sets by table to avoid colliding fields */
  nestTables?: boolean | string;

  /** Custom type casting function */
  typeCast?: TypeCastFunction;

  /** Return result-sets as arrays instead of objects */
  rowsAsArray?: boolean;

  /** Return metadata as array [rows, metadata] */
  metaAsArray?: boolean;

  /** Force insertId as Number instead of BigInt */
  insertIdAsNumber?: boolean;

  /** Return dates as strings instead of Date objects */
  dateStrings?: boolean;

  /** Force timezone usage */
  timezone?: string;

  /** Use named placeholders */
  namedPlaceholders?: boolean;

  /** Allow multi-parameter entries */
  permitSetMultiParamEntries?: boolean;

  /** Use bulk commands for batch operations (set to false to disable) */
  bulk?: boolean;

  /** Send queries without waiting for previous results */
  pipelining?: boolean;

  /** Query execution timeout */
  timeout?: number;

  /** Auto-map JSON fields */
  autoJsonMap?: boolean;

  /** Include arrays in parentheses */
  arrayParenthesis?: boolean;

  /** Check for duplicate column names */
  checkDuplicate?: boolean;

  /** Return decimals as numbers */
  decimalAsNumber?: boolean;

  /** Return BIGINT as numbers */
  bigIntAsNumber?: boolean;

  /** Check number conversion safety */
  checkNumberRange?: boolean;
}
```

**Query Options Examples:**

```typescript
// High-performance array results
const fastResults = await connection.query({
  sql: "SELECT id, name, email FROM users",
  rowsAsArray: true,
  timeout: 5000
});

// Type-safe number handling
const stats = await connection.query({
  sql: "SELECT COUNT(*) as total, AVG(price) as avg_price FROM products",
  decimalAsNumber: true,
  bigIntAsNumber: true,
  checkNumberRange: true
});

// Custom type casting
const customResults = await connection.query({
  sql: "SELECT * FROM users",
  typeCast: (field, next) => {
    if (field.type === 'TINY' && field.columnLength === 1) {
      return field.string() === '1'; // Convert TINYINT(1) to boolean
    }
    return next();
  }
});
```
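The remaining options follow the same pattern. As an illustration of two of them, the sketch below assumes a `users`/`orders` join: `metaAsArray` resolves to a `[rows, metadata]` pair, and `nestTables` groups each row's fields under their table name or alias so identically named columns do not collide.

```typescript
// Rows and column metadata returned separately
const [rows, meta] = await connection.query({
  sql: "SELECT id, name FROM users",
  metaAsArray: true
});
console.log(meta.map(col => col.name()));

// Nest result fields by table to avoid column-name collisions in joins
const joined = await connection.query({
  sql: "SELECT u.id, u.name, o.id, o.total FROM users u JOIN orders o ON o.user_id = u.id",
  nestTables: true
});
// Each row is expected to look like: { u: { id, name }, o: { id, total } }
```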
### Transaction Management

Manage database transactions with full ACID compliance.

```typescript { .api }
/**
 * Start database transaction
 */
beginTransaction(): Promise<void>;

/**
 * Commit current transaction
 */
commit(): Promise<void>;

/**
 * Rollback current transaction
 */
rollback(): Promise<void>;
```

**Transaction Examples:**

```typescript
// Basic transaction
await connection.beginTransaction();
try {
  await connection.query("INSERT INTO orders (user_id, total) VALUES (?, ?)", [userId, total]);
  await connection.query("UPDATE inventory SET quantity = quantity - ? WHERE product_id = ?", [quantity, productId]);
  await connection.commit();
  console.log('Transaction completed successfully');
} catch (error) {
  await connection.rollback();
  console.error('Transaction failed, rolled back:', error);
  throw error;
}

// Partial rollback with savepoints
await connection.beginTransaction();
try {
  // First operation
  await connection.query("INSERT INTO users (name, email) VALUES (?, ?)", [name, email]);

  // Savepoint for partial rollback
  await connection.query("SAVEPOINT user_created");

  try {
    // Second operation that might fail
    await connection.query("INSERT INTO user_preferences (user_id, theme) VALUES (?, ?)", [userId, theme]);
  } catch (error) {
    // Rollback to savepoint, keep user creation
    await connection.query("ROLLBACK TO SAVEPOINT user_created");
    console.log('User created but preferences failed');
  }

  await connection.commit(); // Commit whatever survived the savepoint
} catch (error) {
  await connection.rollback();
  throw error;
}
```
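For application code that runs many transactions, the begin/commit/rollback boilerplate can be factored into a small helper. A sketch of such a wrapper (the `withTransaction` name and callback shape are illustrative, not part of the connector API):

```typescript
// Run a unit of work inside a transaction, committing on success
// and rolling back on any thrown error.
async function withTransaction<T>(
  connection: any,
  work: (conn: any) => Promise<T>
): Promise<T> {
  await connection.beginTransaction();
  try {
    const result = await work(connection);
    await connection.commit();
    return result;
  } catch (error) {
    await connection.rollback();
    throw error;
  }
}

// Usage
const orderId = await withTransaction(connection, async (conn) => {
  const res = await conn.query("INSERT INTO orders (user_id, total) VALUES (?, ?)", [userId, total]);
  await conn.query("UPDATE inventory SET quantity = quantity - 1 WHERE product_id = ?", [productId]);
  return res.insertId;
});
```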
### Connection Utilities

Utility methods for connection management and SQL safety.

```typescript { .api }
/**
 * Send ping to ensure connection is active
 */
ping(): Promise<void>;

/**
 * Reset connection state (rollback transactions, reset variables)
 */
reset(): Promise<void>;

/**
 * Check if connection is valid and active
 */
isValid(): boolean;

/**
 * Escape SQL parameter to prevent injection
 * @param value - Value to escape
 * @returns Escaped string safe for SQL
 */
escape(value: any): string;

/**
 * Escape SQL identifier (table/column names)
 * @param identifier - Identifier to escape
 * @returns Escaped identifier
 */
escapeId(identifier: string): string;
```

**Utility Examples:**

```typescript
// Keep connection alive
setInterval(async () => {
  if (connection.isValid()) {
    try {
      await connection.ping();
      console.log('Connection ping successful');
    } catch (error) {
      console.error('Connection ping failed:', error);
    }
  }
}, 30000); // Ping every 30 seconds

// Safe dynamic queries (use parameterization when possible)
const tableName = connection.escapeId('user_data');
const value = connection.escape(userInput);
const query = `SELECT * FROM ${tableName} WHERE data = ${value}`;

// Better: Use parameterized queries
const saferQuery = "SELECT * FROM user_data WHERE data = ?";
const results = await connection.query(saferQuery, [userInput]);
```
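`reset()` is useful when a long-lived connection is reused across logically separate jobs and any leftover state (an open transaction, session variables) should be discarded first. A minimal sketch of that pattern; note that resetting depends on server support for the reset command, so it may fail on older servers:

```typescript
// Hypothetical job runner that clears connection state between jobs
async function runJob(connection: any, job: () => Promise<void>) {
  try {
    await job();
  } finally {
    // Discard leftover state before the connection is reused;
    // may reject on servers without reset support.
    await connection.reset();
  }
}
```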
### Result Types

Types representing query execution results.

```typescript { .api }
interface UpsertResult {
  /** Number of affected rows */
  affectedRows: number;

  /** Auto-generated ID from INSERT */
  insertId: number | bigint;

  /** Warning status code */
  warningStatus: number;
}

interface FieldInfo {
  /** Column collation information */
  collation: Collation;

  /** Column maximum length */
  columnLength: number;

  /** Column type number */
  columnType: TypeNumbers;

  /** Decimal scale for numeric types */
  scale: number;

  /** Column type string */
  type: Types;

  /** Column attribute flags */
  flags: Flags;

  /** Get database name */
  db(): string;

  /** Get table name */
  table(): string;

  /** Get column name */
  name(): string;

  /** Get original table name */
  orgTable(): string;

  /** Get original column name */
  orgName(): string;

  /** Get value as string */
  string(): string | null;

  /** Get value as buffer */
  buffer(): Buffer | null;

  /** Get value as float */
  float(): number | null;

  /** Get value as integer */
  int(): number | null;

  /** Get value as long integer */
  long(): number | null;

  /** Get value as decimal */
  decimal(): number | null;

  /** Get value as date */
  date(): Date | null;

  /** Get value as geometry */
  geometry(): Geometry | null;
}
```
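Write statements resolve to an `UpsertResult`. A short example of reading it; per the interface above, `insertId` may be a `bigint`, and the `insertIdAsNumber` query option forces a plain number instead (table and values are illustrative):

```typescript
const res: UpsertResult = await connection.query(
  "INSERT INTO users (name, email) VALUES (?, ?)",
  ['Dana', 'dana@example.com']
);

console.log(res.affectedRows);   // 1
console.log(res.insertId);       // number | bigint; see insertIdAsNumber option
console.log(res.warningStatus);  // 0 when no warnings were raised
```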
### Performance Optimization

Best practices for query performance optimization.

```typescript
// Use execute() for repeated queries (automatic prepare)
async function getUserById(id: number) {
  return await connection.execute("SELECT * FROM users WHERE id = ?", [id]);
}

// Manual prepare for high-frequency queries
const getUserStmt = await connection.prepare("SELECT * FROM users WHERE id = ?");
const getOrderStmt = await connection.prepare("SELECT * FROM orders WHERE user_id = ?");

// Batch operations for bulk inserts
const batchInsert = async (records: any[]) => {
  const values = records.map(r => [r.name, r.email, r.age]);
  return await connection.batch(
    "INSERT INTO users (name, email, age) VALUES (?, ?, ?)",
    values
  );
};

// Streaming for large result sets
const exportData = async () => {
  const stream = connection.queryStream("SELECT * FROM large_table");
  return stream;
};

// Connection pooling for concurrent operations
const pool = mariadb.createPool(config);

// Direct pool queries for simple operations
const users = await pool.query("SELECT * FROM users");

// Dedicated connections for transactions
const connection = await pool.getConnection();
try {
  await connection.beginTransaction();
  // ... transaction operations
  await connection.commit();
} finally {
  await connection.release();
}
```