# Query Processing

Advanced query execution modes including streaming, cursors, result formats, and performance optimization techniques.

## Capabilities

### Query Execution Modes

Control how queries are executed and results are processed with specialized methods for different use cases.

```javascript { .api }
/**
 * Execute query with simple protocol (no prepared statements)
 * @returns Query instance for method chaining
 */
query.simple(): PendingQuery;

/**
 * Execute query immediately without waiting for the result
 * @returns Query instance for method chaining
 */
query.execute(): PendingQuery;

/**
 * Describe query without executing (get statement metadata)
 * @returns Promise resolving to statement description
 */
query.describe(): Promise<StatementDescription>;

/**
 * Cancel a running query
 * @returns Void (cancellation is synchronous)
 */
query.cancel(): void;
```

**Usage Examples:**

```javascript
// Force simple protocol (no prepared statements)
const result = await sql`SELECT * FROM large_table`.simple();

// Execute immediately
const query = sql`SELECT * FROM users WHERE id = ${userId}`;
query.execute();

// Get query metadata without executing
const description = await sql`SELECT * FROM products`.describe();
console.log(description.fields); // Column information

// Cancel a long-running query
const longQuery = sql`SELECT * FROM huge_table`;
setTimeout(() => longQuery.cancel(), 5000);
const longResult = await longQuery;
```
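
The cancellation pattern above can be wrapped in a small helper so that slow queries are cancelled automatically. This is a minimal sketch built only on the `cancel()` method documented here; `withTimeout` is a hypothetical name, and the caller should be prepared for the awaited query to reject once it is cancelled.

```javascript
// Sketch: cancel a query if it runs longer than `ms` milliseconds
async function withTimeout(query, ms) {
  const timer = setTimeout(() => query.cancel(), ms);
  try {
    return await query; // rejects if the query was cancelled
  } finally {
    clearTimeout(timer);
  }
}

// Usage
const rows = await withTimeout(sql`SELECT * FROM huge_table`, 5000);
```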

### Cursor-Based Processing

Process large result sets efficiently using cursors to stream results in batches.

```javascript { .api }
/**
 * Execute query with cursor for streaming results
 * @param rows - Number of rows to fetch per batch (default: 1000)
 * @returns AsyncIterable yielding batches of rows
 */
query.cursor(rows?: number): AsyncIterable<Row[]>;

/**
 * Execute query with cursor using callback
 * @param fn - Function called for each batch of rows
 * @returns Promise resolving to execution result
 */
query.cursor(fn: (rows: Row[]) => void): Promise<ExecutionResult>;

/**
 * Execute query with cursor specifying batch size and callback
 * @param rows - Number of rows per batch
 * @param fn - Function called for each batch
 * @returns Promise resolving to execution result
 */
query.cursor(rows: number, fn: (rows: Row[]) => void): Promise<ExecutionResult>;
```

**Usage Examples:**

```javascript
// Async iterator with default batch size
for await (const batch of sql`SELECT * FROM large_table`.cursor()) {
  console.log(`Processing ${batch.length} rows`);
  await processBatch(batch);
}

// Custom batch size with async iterator
for await (const batch of sql`SELECT * FROM orders`.cursor(500)) {
  await processOrders(batch);
}

// Callback-based processing
await sql`SELECT * FROM transactions`.cursor(1000, (rows) => {
  rows.forEach(transaction => {
    validateTransaction(transaction);
  });
});

// Smaller batches for memory-constrained processing
await sql`SELECT * FROM logs`.cursor(100, async (batch) => {
  await sendToAnalytics(batch);
});
```
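
Because the cursor form returns an async iterable, iteration can be stopped early with a plain `break`; ending the loop ends the iteration, and the driver is expected to clean up the underlying cursor. A minimal sketch, assuming the `events` table used elsewhere in these examples:

```javascript
// Collect at most ~1000 rows, then stop fetching further batches
let collected = [];
for await (const batch of sql`SELECT * FROM events ORDER BY created_at`.cursor(250)) {
  collected = collected.concat(batch);
  if (collected.length >= 1000) {
    break; // no more batches are requested after this point
  }
}
console.log(`Collected ${collected.length} rows before stopping`);
```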

### Row-by-Row Processing

Process individual rows as they arrive for maximum memory efficiency.

```javascript { .api }
/**
 * Process query results row by row
 * @param fn - Function called for each row
 * @returns Promise resolving to execution result
 */
query.forEach(fn: (row: Row, result: ExecutionResult) => void): Promise<ExecutionResult>;
```

**Usage Examples:**

```javascript
// Process each row individually
let processedCount = 0;
const result = await sql`SELECT * FROM users`.forEach((user) => {
  processUser(user);
  processedCount++;

  if (processedCount % 1000 === 0) {
    console.log(`Processed ${processedCount} users`);
  }
});

console.log(`Total processed: ${result.count}`);

// Async processing within forEach
await sql`SELECT * FROM orders`.forEach(async (order) => {
  await validateOrder(order);
  await updateInventory(order.items);
});
```
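
Row-by-row processing also suits streaming aggregation, where only a running total is kept and each row is dropped as soon as it has been used. A minimal sketch with illustrative table and column names:

```javascript
// Sum a column without buffering the result set
let total = 0;
let counted = 0;
await sql`SELECT amount FROM orders WHERE status = 'paid'`.forEach((order) => {
  total += Number(order.amount);
  counted++;
});
console.log(`Sum of ${counted} paid orders: ${total}`);
```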

### Result Format Options

Control the format and structure of query results.

```javascript { .api }
/**
 * Return raw buffer data instead of parsed values
 * @returns Query returning raw buffer results
 */
query.raw(): PendingQuery<Buffer[][]>;

/**
 * Return results as 2D array instead of objects
 * @returns Query returning array of arrays
 */
query.values(): PendingQuery<any[][]>;
```

**Usage Examples:**

```javascript
// Raw buffer data for binary processing
const rawData = await sql`SELECT image_data FROM photos`.raw();
rawData.forEach(row => {
  const imageBuffer = row[0]; // Buffer containing image data
  processImage(imageBuffer);
});

// Values as 2D array for CSV export
const userData = await sql`SELECT name, email, age FROM users`.values();
// Returns: [["John", "john@example.com", 30], ["Jane", "jane@example.com", 25]]

const csvContent = userData.map(row => row.join(',')).join('\n');
```
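
For a CSV export that also needs a header row, `describe()` and `values()` can be combined: the field names come from the statement description and the data from the array-form results. This is a sketch only; it assumes the described field order matches the value order, and it does no CSV quoting or escaping.

```javascript
// Build the same query twice: once to describe it, once to run it
const buildQuery = () => sql`SELECT name, email, age FROM users`;

const { fields } = await buildQuery().describe();
const header = fields.map(f => f.name).join(',');

const rows = await buildQuery().values();
const body = rows.map(row => row.join(',')).join('\n');

const csv = `${header}\n${body}`;
```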

### COPY Operations

Efficiently import and export large datasets using PostgreSQL's COPY functionality.

```javascript { .api }
/**
 * Get readable stream for COPY TO operations
 * @returns Promise resolving to readable stream
 */
query.readable(): Promise<Readable>;

/**
 * Get writable stream for COPY FROM operations
 * @returns Promise resolving to writable stream
 */
query.writable(): Promise<Writable>;
```

**Usage Examples:**

```javascript
import { createWriteStream, createReadStream } from 'fs';
import { Transform } from 'stream';

// Export data to file using COPY TO
const readable = await sql`
  COPY (SELECT * FROM users) TO STDOUT WITH CSV HEADER
`.readable();

const fileStream = createWriteStream('users.csv');
readable.pipe(fileStream);

// Import data from file using COPY FROM
const writable = await sql`
  COPY products (name, price, category) FROM STDIN WITH CSV HEADER
`.writable();

const inputStream = createReadStream('products.csv');
inputStream.pipe(writable);

// Stream transformation during COPY
const transform = new Transform({
  transform(chunk, encoding, callback) {
    // Process data during copy
    const processed = processChunk(chunk);
    callback(null, processed);
  }
});

inputStream.pipe(transform).pipe(writable);
```
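
The `.pipe()` calls above do not surface errors or signal completion. Node's standard `stream.pipeline` (from `stream/promises`) can be used with the same `readable()`/`writable()` streams so the COPY can be awaited and a failure on either side tears the chain down; this is a sketch using only standard Node APIs:

```javascript
import { pipeline } from 'stream/promises';
import { createReadStream, createWriteStream } from 'fs';

// Await a COPY TO export and propagate stream errors
await pipeline(
  await sql`COPY (SELECT * FROM users) TO STDOUT WITH CSV HEADER`.readable(),
  createWriteStream('users.csv')
);

// Await a COPY FROM import the same way
await pipeline(
  createReadStream('products.csv'),
  await sql`COPY products (name, price, category) FROM STDIN WITH CSV HEADER`.writable()
);
```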

## Result Metadata

### ExecutionResult Interface

Query execution results include comprehensive metadata about the operation.

```javascript { .api }
interface ExecutionResult {
  /** Number of rows affected by the query */
  count: number;

  /** SQL command that was executed */
  command: string;

  /** Backend connection state information */
  state: ConnectionState;

  /** Column information for SELECT queries */
  columns?: ColumnInfo[];

  /** Statement metadata including parameter types */
  statement?: StatementInfo;
}

interface ColumnInfo {
  /** Column name */
  name: string;

  /** PostgreSQL type OID */
  type: number;

  /** Table OID (0 if not from a table) */
  table: number;

  /** Column attribute number in table */
  number: number;

  /** Type modifier */
  modifier: number;

  /** Format code (0 = text, 1 = binary) */
  format: number;
}
```

**Usage Examples:**

```javascript
// Access result metadata
const result = await sql`SELECT name, age FROM users WHERE active = true`;

console.log(`Command: ${result.command}`); // "SELECT"
console.log(`Rows returned: ${result.count}`); // Number of rows
console.log(`Columns: ${result.columns?.map(col => col.name)}`); // ["name", "age"]

// Check column types
result.columns?.forEach(col => {
  console.log(`${col.name}: type ${col.type}`);
});
```
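
The same metadata is useful for write operations, where `count` reports the number of affected rows. A short sketch with an illustrative `UPDATE`:

```javascript
// Inspect the outcome of a write using result metadata
const updated = await sql`
  UPDATE users SET active = false
  WHERE last_login < NOW() - INTERVAL '1 year'
`;

console.log(`Command: ${updated.command}`); // "UPDATE"
if (updated.count === 0) {
  console.log('No stale accounts found');
} else {
  console.log(`Deactivated ${updated.count} accounts`);
}
```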

### StatementDescription Interface

Detailed information about prepared statements obtained via describe().

```javascript { .api }
interface StatementDescription {
  /** Parameter type OIDs */
  parameters: number[];

  /** Result field information */
  fields: FieldDescription[];
}

interface FieldDescription {
  /** Field name */
  name: string;

  /** Table OID */
  tableOid: number;

  /** Column attribute number */
  columnAttrNumber: number;

  /** Data type OID */
  dataTypeOid: number;

  /** Data type size */
  dataTypeSize: number;

  /** Type modifier */
  typeModifier: number;

  /** Format code */
  format: number;
}
```

**Usage Examples:**

```javascript
// Analyze query structure before execution
const description = await sql`
  SELECT u.name, u.email, p.title
  FROM users u
  JOIN posts p ON u.id = p.author_id
  WHERE u.created_at > $1
`.describe();

console.log(`Parameters needed: ${description.parameters.length}`);
description.fields.forEach(field => {
  console.log(`Field: ${field.name}, Type: ${field.dataTypeOid}`);
});
```
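
Raw type OIDs are hard to read, so it can help to map the common built-in ones to names when inspecting a description. The lookup below is a partial, hand-maintained table of standard PostgreSQL OIDs; anything not listed falls back to the numeric OID.

```javascript
// Partial map of built-in PostgreSQL type OIDs to readable names
const TYPE_NAMES = {
  16: 'bool',
  20: 'int8',
  23: 'int4',
  25: 'text',
  701: 'float8',
  1043: 'varchar',
  1114: 'timestamp',
  1184: 'timestamptz',
  1700: 'numeric',
  2950: 'uuid'
};

const described = await sql`SELECT id, email, created_at FROM users`.describe();

described.fields.forEach(field => {
  const typeName = TYPE_NAMES[field.dataTypeOid] ?? `oid ${field.dataTypeOid}`;
  console.log(`${field.name}: ${typeName}`);
});
```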

## Performance Optimization

### Query Preparation

Control prepared statement usage for optimal performance.

```javascript { .api }
// Global preparation setting
const sql = postgres(connectionConfig, {
  prepare: true // Enable prepared statements (default)
});

// Per-query preparation control
await sql`SELECT * FROM users WHERE id = ${id}`.simple(); // Skip preparation
```

**Usage Examples:**

```javascript
// Disable preparation for one-time queries
const oneTimeResult = await sql`
  SELECT * FROM system_stats WHERE collected_at = NOW()
`.simple();

// Use preparation for repeated queries
const getUserById = (id) => sql`SELECT * FROM users WHERE id = ${id}`;

// These will reuse the same prepared statement
const user1 = await getUserById(1);
const user2 = await getUserById(2);
```
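
The `prepare` option shown above can also be switched off for the whole connection. A minimal sketch, assuming the motivation is a connection pooler running in transaction mode, where named prepared statements are generally not supported (check your pooler's documentation):

```javascript
// Disable prepared statements globally; every query uses the unprepared path
const sql = postgres(connectionConfig, {
  prepare: false
});
```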

### Memory Management

Optimize memory usage for large result sets.

```javascript { .api }
// Process large datasets without loading everything into memory
for await (const batch of sql`SELECT * FROM large_table`.cursor(1000)) {
  // Process batch and let it be garbage collected
  await processBatch(batch);
}

// Row-by-row processing for minimal memory footprint
await sql`SELECT * FROM huge_table`.forEach((row) => {
  processRow(row);
  // Each row can be garbage collected after processing
});
```

**Usage Examples:**

```javascript
import { createWriteStream } from 'fs';

// Memory-efficient data export
async function exportLargeTable(tableName, outputFile) {
  const writeStream = createWriteStream(outputFile);

  await sql`SELECT * FROM ${sql(tableName)}`.forEach((row) => {
    const csvLine = Object.values(row).join(',') + '\n';
    writeStream.write(csvLine);
  });

  writeStream.end();
}

// Batch processing with backpressure control
async function processWithBackpressure() {
  const cursor = sql`SELECT * FROM events ORDER BY created_at`.cursor(500);

  for await (const batch of cursor) {
    await processBatch(batch);
    // Natural backpressure - won't fetch next batch until current is processed
  }
}
```