# COPY Operations

Support for PostgreSQL COPY through Node.js streams, enabling high-performance bulk data import and export.

## Capabilities

### COPY Stream Creation

Get a duplex stream for PostgreSQL COPY operations.

```javascript { .api }
/**
 * Get a duplex stream for COPY operations
 * @returns CopyStream instance for reading/writing bulk data
 */
client.getCopyStream(): CopyStream;

interface CopyStream extends Duplex {
  /** Write data to the COPY stream */
  write(chunk: Buffer, encoding?: string, callback?: Function): boolean;
  /** End the COPY operation */
  end(chunk?: Buffer, encoding?: string, callback?: Function): void;
  /** Emitted when the COPY operation finishes */
  on(event: 'finish', listener: () => void): this;
  /** Emitted when COPY data is available for reading */
  on(event: 'data', listener: (chunk: Buffer) => void): this;
  /** Emitted when the stream ends */
  on(event: 'end', listener: () => void): this;
  /** Emitted on errors */
  on(event: 'error', listener: (error: Error) => void): this;
}
```

### COPY FROM (Data Import)

Import bulk data into PostgreSQL tables using COPY FROM.

**Usage Examples:**

```javascript
const Client = require('pg-native');

const client = new Client();
client.connectSync();

// Create a test table
client.querySync('CREATE TEMP TABLE users(name text, age int, city text)');

// Start the COPY FROM operation
client.querySync('COPY users FROM stdin');

// Get the COPY stream
const stream = client.getCopyStream();

// Attach the error handler before writing so no failure is missed
stream.on('error', function(err) {
  console.error('COPY error:', err.message);
});

// Write tab-delimited rows to the stream
stream.write(Buffer.from('Alice\t25\tNew York\n', 'utf8'));
stream.write(Buffer.from('Bob\t30\tLos Angeles\n', 'utf8'));
stream.write(Buffer.from('Charlie\t28\tChicago\n', 'utf8'));

// Handle completion
stream.once('finish', function() {
  console.log('COPY operation completed');

  // Verify the data was imported
  const rows = client.querySync('SELECT COUNT(*) FROM users');
  console.log('Imported rows:', rows[0].count);

  client.end();
});

// End the COPY operation
stream.end();
```

### COPY TO (Data Export)

Export data from PostgreSQL tables using COPY TO.

**Usage Examples:**

```javascript
const Client = require('pg-native');
const fs = require('fs');

const client = new Client();
client.connectSync();

// Start the COPY TO operation
client.querySync('COPY users TO stdout');

// Get the COPY stream for reading
const stream = client.getCopyStream();

// Create the output file
const outputFile = fs.createWriteStream('users_export.txt');

// Handle incoming data
stream.on('data', function(chunk) {
  console.log('Received data chunk:', chunk.toString());
  outputFile.write(chunk);
});

// Handle stream end
stream.on('end', function() {
  console.log('COPY TO operation completed');
  outputFile.end();
  client.end();
});

// Handle errors
stream.on('error', function(err) {
  console.error('COPY error:', err.message);
  outputFile.end();
});
```

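Because the COPY stream is a standard duplex stream, the same export can also be written with `pipe()`, which forwards the data and handles backpressure in one call. A minimal sketch, assuming the same `users` table and output file:

```javascript
const Client = require('pg-native');
const fs = require('fs');

const client = new Client();
client.connectSync();

client.querySync('COPY users TO stdout');

// pipe() forwards each chunk to the file and applies backpressure
const copyOut = client.getCopyStream();
const file = fs.createWriteStream('users_export.txt');

copyOut.pipe(file).on('finish', function() {
  console.log('Export written to users_export.txt');
  client.end();
});
```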

### CSV Format COPY Operations

Working with CSV format data in COPY operations.

**CSV Import Example:**

```javascript
const Client = require('pg-native');

const client = new Client();
client.connectSync();

// Create a table whose columns match the CSV
client.querySync(`
  CREATE TEMP TABLE products(
    id serial,
    name text,
    price decimal,
    category text
  )
`);

// Start COPY in CSV format; HEADER true tells COPY to skip the first row
client.querySync(`
  COPY products(name, price, category)
  FROM stdin
  WITH (FORMAT csv, HEADER true)
`);

const stream = client.getCopyStream();

// Write CSV data including the header row
const csvData = `name,price,category
"Laptop",999.99,"Electronics"
"Book",29.99,"Education"
"Coffee Mug",12.50,"Kitchen"
`;

stream.write(Buffer.from(csvData, 'utf8'));
stream.end();

stream.once('finish', function() {
  const rows = client.querySync('SELECT * FROM products ORDER BY id');
  console.log('Imported products:', rows);
  client.end();
});
```

### Binary Format COPY Operations

Working with binary format COPY operations for better performance.

**Binary Import Example:**

```javascript
const Client = require('pg-native');

const client = new Client();
client.connectSync();

// Create a table for binary data
client.querySync(`
  CREATE TEMP TABLE binary_data(
    id int,
    data bytea,
    timestamp timestamp
  )
`);

// Start a binary COPY
client.querySync('COPY binary_data FROM stdin WITH (FORMAT binary)');

const stream = client.getCopyStream();

// Note: the binary format is PostgreSQL-specific and more involved than
// text or CSV. The file starts with an 11-byte signature, a 4-byte flags
// field, and a 4-byte header-extension length (both zero here).
const header = Buffer.concat([
  Buffer.from('PGCOPY\n\xff\r\n\0', 'latin1'), // signature
  Buffer.alloc(8)                              // flags + extension length
]);
stream.write(header);

// ... write binary tuples according to the PostgreSQL binary format ...

// The file ends with a 16-bit trailer of -1
stream.write(Buffer.from([0xff, 0xff]));
stream.end();

stream.once('finish', function() {
  console.log('Binary COPY completed');
  client.end();
});
```

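To make the elided tuple-writing step concrete, here is a sketch of encoding one row for the `binary_data` table above. In the PostgreSQL binary format, each tuple is a 16-bit field count followed, per field, by a 32-bit big-endian byte length (-1 marks NULL) and the field's binary payload; the timestamp is left NULL here to sidestep its epoch conversion. `stream` is the COPY stream from the example above:

```javascript
// Encode one tuple for binary_data(id int, data bytea, timestamp timestamp)
function encodeRow(id, data) {
  const fieldCount = Buffer.alloc(2);
  fieldCount.writeInt16BE(3); // three fields follow

  const idField = Buffer.alloc(8);
  idField.writeInt32BE(4, 0);  // int4 payload is 4 bytes
  idField.writeInt32BE(id, 4); // big-endian integer value

  const dataLen = Buffer.alloc(4);
  dataLen.writeInt32BE(data.length); // bytea payload is the raw bytes

  const nullTimestamp = Buffer.alloc(4);
  nullTimestamp.writeInt32BE(-1); // length -1 marks a NULL field

  return Buffer.concat([fieldCount, idField, dataLen, data, nullTimestamp]);
}

stream.write(encodeRow(1, Buffer.from('payload')));
```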

### Large Dataset Processing

Efficiently process large datasets using COPY operations.

**Streaming Large Files:**

```javascript
const Client = require('pg-native');
const fs = require('fs');
const readline = require('readline');

const client = new Client();
client.connectSync();

// Prepare the table
client.querySync('CREATE TEMP TABLE large_dataset(col1 text, col2 int, col3 text)');
client.querySync('COPY large_dataset FROM stdin');

const copyStream = client.getCopyStream();

// Stream the large file line by line
const fileStream = fs.createReadStream('large_data_file.txt');
const rl = readline.createInterface({
  input: fileStream,
  crlfDelay: Infinity
});

let lineCount = 0;

rl.on('line', (line) => {
  // Write each line, respecting backpressure: pause the reader when
  // the COPY stream's buffer is full and resume once it drains
  const ok = copyStream.write(Buffer.from(line + '\n', 'utf8'));
  lineCount++;

  if (!ok) {
    rl.pause();
    copyStream.once('drain', () => rl.resume());
  }

  if (lineCount % 10000 === 0) {
    console.log(`Processed ${lineCount} lines`);
  }
});

rl.on('close', () => {
  copyStream.end();
});

copyStream.once('finish', function() {
  console.log(`COPY completed. Total lines: ${lineCount}`);

  const result = client.querySync('SELECT COUNT(*) FROM large_dataset');
  console.log('Rows in database:', result[0].count);

  client.end();
});
```

### Error Handling in COPY Operations

Proper error handling for COPY operations.

```javascript
const Client = require('pg-native');

const client = new Client();
client.connectSync();

try {
  // Create a table with constraints
  client.querySync(`
    CREATE TEMP TABLE constrained_table(
      id int PRIMARY KEY,
      email text UNIQUE,
      age int CHECK (age > 0)
    )
  `);

  client.querySync('COPY constrained_table FROM stdin');
  const stream = client.getCopyStream();

  // Attach handlers before writing so no event is missed
  stream.on('error', function(err) {
    console.error('COPY operation failed:', err.message);
    // Handle constraint violations or other COPY errors
  });

  stream.once('finish', function() {
    console.log('COPY operation completed successfully');
  });

  // Write data that violates the constraints
  stream.write(Buffer.from('1\talice@example.com\t25\n', 'utf8'));
  stream.write(Buffer.from('2\tbob@example.com\t30\n', 'utf8'));
  stream.write(Buffer.from('1\tcharlie@example.com\t-5\n', 'utf8')); // duplicate id and negative age

  stream.end();
} catch (err) {
  console.error('Setup error:', err.message);
}
```

## Performance Considerations

1. **Batch Size**: COPY operations are most efficient when data is sent in large batches
2. **Binary Format**: Use binary format for maximum performance with large datasets
3. **Transaction Context**: COPY runs inside the surrounding transaction; wrap it in an explicit transaction for all-or-nothing loads (see the sketch below)
4. **Memory Usage**: COPY streams follow standard Node.js backpressure; check the return value of `write()` and wait for `'drain'`, as in the large-dataset example above
5. **Network Efficiency**: COPY carries far less protocol overhead than row-by-row INSERTs

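A minimal sketch of the transaction point: wrapping the COPY in an explicit transaction makes the load all-or-nothing, so a failed COPY leaves no partial data behind. The table and row here are illustrative:

```javascript
const Client = require('pg-native');

const client = new Client();
client.connectSync();

client.querySync('BEGIN');
client.querySync('COPY users FROM stdin');

const stream = client.getCopyStream();

stream.on('error', function(err) {
  // The COPY failed; discard anything already sent
  console.error('COPY failed, rolling back:', err.message);
  client.querySync('ROLLBACK');
  client.end();
});

stream.once('finish', function() {
  // The COPY succeeded; make the rows permanent
  client.querySync('COMMIT');
  client.end();
});

stream.write(Buffer.from('Dana\t41\tBoston\n', 'utf8'));
stream.end();
```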

## COPY Format Options

PostgreSQL COPY supports several format options, each selected in the COPY statement itself, as shown below:

- **TEXT** (default): Tab-delimited text with `\N` for NULL
- **CSV**: Comma-separated values with an optional header row
- **BINARY**: PostgreSQL's native binary format for maximum performance

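For reference, here is how each format is selected (`t` stands in for a real table):

```javascript
// Text format (the default): tab-delimited columns, \N for NULL
client.querySync('COPY t FROM stdin');

// CSV format; HEADER true makes COPY skip the first row on import
client.querySync('COPY t FROM stdin WITH (FORMAT csv, HEADER true)');

// Binary format: PostgreSQL's native binary representation
client.querySync('COPY t FROM stdin WITH (FORMAT binary)');
```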

## Best Practices

1. **Use the appropriate format**: CSV for human-readable data, binary for performance
2. **Handle errors gracefully**: Always attach error listeners to COPY streams before writing
3. **Monitor progress**: Log progress for large operations
4. **Use transactions**: Wrap COPY operations in transactions for data integrity
5. **Validate data**: Pre-validate rows when possible; a single bad row aborts the entire COPY (see the sketch below)

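A sketch of the pre-validation point; the rules and sample rows are illustrative. Because a single malformed row aborts the entire COPY, cheap client-side checks before writing can save a failed load:

```javascript
const Client = require('pg-native');

const client = new Client();
client.connectSync();
client.querySync('CREATE TEMP TABLE users(name text, age int, city text)');
client.querySync('COPY users FROM stdin');

const stream = client.getCopyStream();

// Illustrative rules: three tab-separated fields, non-empty name
// and city, integer age
function isValidRow(fields) {
  return fields.length === 3 &&
    fields[0].length > 0 &&
    Number.isInteger(Number(fields[1])) &&
    fields[2].length > 0;
}

const rawLines = ['Alice\t25\tNew York', 'Bob\tthirty\tLos Angeles'];

for (const line of rawLines) {
  const fields = line.split('\t');
  if (isValidRow(fields)) {
    stream.write(Buffer.from(line + '\n', 'utf8'));
  } else {
    // One bad row aborts the whole COPY, so skip it up front
    console.warn('Skipping invalid row:', line);
  }
}

stream.once('finish', function() {
  client.end();
});
stream.end();
```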