tessl/npm-pump

Pipes streams together and destroys all of them when one closes, with proper cleanup and callback support

Workspace: tessl
Visibility: Public
Describes: npmpkg:npm/pump@3.0.x

To install, run

npx @tessl/cli install tessl/npm-pump@3.0.0


Pump

Pump is a small Node.js module that pipes streams together and destroys all of them if one of them closes or errors. It addresses a well-known shortcoming of Node.js's standard pipe(): when the destination emits an error or closes, the source is not destroyed, and there is no callback to tell you when the pipe has finished. Pump tears down every stream in the pipeline as soon as any of them fails and invokes a callback on completion or error.
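
The difference is easiest to see side by side; the commented-out line below shows the plain pipe() call that pump replaces (file names are just placeholders):

const pump = require("pump");
const fs = require("fs");

const source = fs.createReadStream("bigfile.txt"); // placeholder input path
const dest = fs.createWriteStream("copy.txt");     // placeholder output path

// source.pipe(dest);  // plain pipe: source stays open if dest errors or closes early

// pump: every stream is destroyed when one fails, and the callback fires once
pump(source, dest, function(err) {
  console.log("pipe finished", err);
});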

Package Information

  • Package Name: pump
  • Package Type: npm
  • Language: JavaScript
  • Installation: npm install pump

Core Imports

const pump = require("pump");

Note: This package uses CommonJS format and does not provide native ESM exports.
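
If your project uses ES modules, Node's CommonJS interop still lets you load it with a default import (a small sketch relying on Node's standard CJS/ESM interop):

// In an ES module (.mjs or "type": "module"), the CommonJS export is
// exposed as the default export by Node's interop layer.
import pump from "pump";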

Basic Usage

const pump = require("pump");
const fs = require("fs");

const source = fs.createReadStream("/dev/random");
const dest = fs.createWriteStream("/dev/null");

pump(source, dest, function(err) {
  console.log("pipe finished", err);
});

Architecture

Pump is built around a single core function that:

  • Stream Pipeline Management: Creates and manages multi-stream pipelines with proper resource cleanup
  • Error Propagation: Ensures errors from any stream in the pipeline are properly propagated to the callback
  • Resource Management: Automatically destroys all streams when any stream closes, preventing resource leaks
  • Special Stream Handling: Provides optimized cleanup for filesystem streams and HTTP requests
  • Callback Interface: Offers a completion callback that standard pipe() lacks
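
As a rough illustration of the pattern (a simplified sketch, not pump's actual source), the core idea is to wire the pipes and then destroy every stream as soon as any one of them fails:

// Simplified sketch of pump's core idea, not the real implementation.
function tinyPump(streams, callback) {
  let finished = false;

  function destroyAll(err) {
    if (finished) return;           // real pump uses once() for this guard
    finished = true;
    streams.forEach(function(stream) {
      if (typeof stream.destroy === "function") stream.destroy();
    });
    callback(err);
  }

  // Tear everything down as soon as any stream errors.
  streams.forEach(function(stream) {
    stream.on("error", destroyAll);
  });

  // Wire up the pipeline exactly like chained .pipe() calls.
  return streams.reduce(function(source, dest) {
    return source.pipe(dest);
  });
}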

Capabilities

Stream Piping with Cleanup

The core functionality that pipes multiple streams together and ensures proper cleanup when any stream closes or errors.

/**
 * Pipes streams together and destroys all when one closes
 * @param {...(Stream|Function)} streams - Streams to pipe together, with optional callback as last argument
 * @returns {Stream} The last stream in the pipeline
 * @throws {Error} If fewer than 2 streams are provided
 */
function pump(...streams: (Stream | ((err?: Error) => void))[]): Stream;

/**
 * Alternative signature with array of streams
 * @param {Stream[]} streams - Array of streams to pipe together
 * @param {Function} [callback] - Optional completion callback
 * @returns {Stream} The last stream in the pipeline
 */
function pump(streams: Stream[], callback?: (err?: Error) => void): Stream;

/**
 * Two-stream signature with callback
 * @param {Stream} source - Source stream
 * @param {Stream} destination - Destination stream  
 * @param {Function} [callback] - Optional completion callback
 * @returns {Stream} The destination stream
 */
function pump(source: Stream, destination: Stream, callback?: (err?: Error) => void): Stream;

/**
 * Multi-stream signature with callback
 * @param {Stream} stream1 - First stream
 * @param {Stream} stream2 - Second stream
 * @param {...Stream} additionalStreams - Additional streams
 * @param {Function} [callback] - Optional completion callback
 * @returns {Stream} The last stream in the pipeline
 */
function pump(stream1: Stream, stream2: Stream, ...additionalStreams: (Stream | ((err?: Error) => void))[]): Stream;

Usage Examples:

const pump = require("pump");
const fs = require("fs");

// Basic two-stream piping
pump(
  fs.createReadStream("input.txt"),
  fs.createWriteStream("output.txt"),
  function(err) {
    if (err) console.error("Pipeline failed:", err);
    else console.log("Pipeline succeeded");
  }
);

// Multi-stream transformation pipeline
const { Transform } = require("stream");

const upperCase = new Transform({
  transform(chunk, encoding, callback) {
    callback(null, chunk.toString().toUpperCase());
  }
});

const addPrefix = new Transform({
  transform(chunk, encoding, callback) {
    callback(null, "PREFIX: " + chunk);
  }
});

pump(
  fs.createReadStream("input.txt"),
  upperCase,
  addPrefix,
  fs.createWriteStream("output.txt"),
  function(err) {
    console.log("Transformation pipeline finished", err);
  }
);

// Using array syntax
const streams = [
  fs.createReadStream("input.txt"),
  upperCase,
  fs.createWriteStream("output.txt")
];

const result = pump(streams, function(err) {
  console.log("Array pipeline finished", err);
});

// Return value can be used for further chaining
console.log(result === streams[streams.length - 1]); // true

Error Handling

Pump provides comprehensive error handling:

  • Automatic Cleanup: When any stream in the pipeline emits an error or closes, all other streams are automatically destroyed
  • Callback Errors: The callback receives the first error that occurred in the pipeline
  • Uncaught Exception Prevention: Pump handles stream errors internally to prevent uncaught exceptions
  • Special Stream Support:
    • Filesystem streams use .close() to prevent file descriptor leaks
    • HTTP request streams use .abort() for proper cleanup
    • Standard streams use .destroy() for cleanup

pump(source, dest, function(err) {
  if (err) {
    // Error occurred - could be from any stream in the pipeline
    console.error("Pipeline failed:", err.message);
  } else {
    // All streams completed successfully
    console.log("Pipeline completed successfully");
  }
});
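
For example, here is a sketch (using a hypothetical missing input file) in which the read stream fails and pump tears down the write stream:

const pump = require("pump");
const fs = require("fs");

// "missing.txt" is a hypothetical path that does not exist: the read stream
// emits ENOENT, pump destroys the write stream, and the callback gets the error.
pump(
  fs.createReadStream("missing.txt"),
  fs.createWriteStream("copy.txt"),
  function(err) {
    if (err) console.error("Pipeline failed:", err.code); // "ENOENT"
  }
);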

Stream Requirements

Pump works with any Node.js stream that follows the standard stream interface:

  • Readable streams: Must emit 'data', 'end', and 'error' events
  • Writable streams: Must implement .write() and emit 'finish' and 'error' events
  • Transform streams: Must implement both readable and writable interfaces
  • Duplex streams: Must implement both readable and writable interfaces
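
Any object that meets these interfaces works, including hand-rolled streams; here is a minimal sketch built with Node's stream module:

const pump = require("pump");
const { Readable, Writable } = require("stream");

// A readable built from an iterable and a writable that collects its chunks.
const source = Readable.from(["hello", " ", "pump"]);
const chunks = [];

const sink = new Writable({
  write(chunk, encoding, callback) {
    chunks.push(chunk.toString());
    callback();
  }
});

pump(source, sink, function(err) {
  console.log("finished:", err || chunks.join(""));
});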

Browser Compatibility

Pump is browser-compatible when used with browserify or webpack. The filesystem (fs) module is automatically stubbed out in browser environments, so filesystem-specific optimizations are disabled but core functionality remains intact.

Dependencies

Pump relies on two external modules:

  • end-of-stream: Detects when streams have ended or errored
  • once: Ensures callback functions are called only once
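
To see what each dependency contributes, here is a small sketch that uses them directly, outside of pump (the file name is a placeholder):

const eos = require("end-of-stream");
const once = require("once");
const fs = require("fs");

// end-of-stream: invokes its callback when a stream ends, closes, or errors.
eos(fs.createReadStream("input.txt"), function(err) {
  console.log("stream finished", err);
});

// once: wraps a function so repeated calls run it only a single time,
// which is how pump guards its completion callback.
const report = once(function(err) {
  console.log("reported once", err);
});
report();
report(); // ignored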