Get a stream as a string, Buffer, ArrayBuffer or array
```sh
npx @tessl/cli install tessl/npm-get-stream@9.0.0
```

get-stream is a modern JavaScript library that converts streams to various data formats, including strings, Buffers, ArrayBuffers, and arrays. It provides a unified API for consuming different stream types across Node.js and browser environments, with maximum buffer size limits, error handling with partial data recovery, and support for async iterables.

```sh
npm install get-stream
```

```js
import getStream, {
  getStreamAsBuffer,
  getStreamAsArrayBuffer,
  getStreamAsArray,
  MaxBufferError
} from 'get-stream';
```

Note: get-stream is an ESM-only package (it uses `"type": "module"`). CommonJS `require()` is not supported.
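From a CommonJS module, the package can still be loaded lazily through dynamic `import()`. A minimal sketch — the wrapper name `readStreamAsString` is illustrative, not part of get-stream:

```js
// CommonJS cannot require() an ESM-only package, but dynamic import() works.
// readStreamAsString is an illustrative wrapper, not a get-stream export.
async function readStreamAsString(stream) {
  const { default: getStream } = await import('get-stream');
  return getStream(stream);
}
```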
```js
import fs from 'node:fs';
import getStream, { getStreamAsBuffer, getStreamAsArrayBuffer, getStreamAsArray } from 'get-stream';

// Convert stream to string (default export)
const stream = fs.createReadStream('unicorn.txt');
const content = await getStream(stream);
console.log(content);

// Convert stream to Buffer (Node.js only)
const bufferStream = fs.createReadStream('image.png');
const buffer = await getStreamAsBuffer(bufferStream);

// Convert stream to ArrayBuffer (cross-platform)
const {body: readableStream} = await fetch('https://example.com/data');
const arrayBuffer = await getStreamAsArrayBuffer(readableStream);

// Convert stream to array (supports object streams)
const arrayStream = fs.createReadStream('data.json');
const array = await getStreamAsArray(arrayStream);
```

Get stream contents as a UTF-8 string, with automatic text decoding for binary data.
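Conceptually, the string conversion resembles concatenating the chunks of an async-iterated stream, decoding binary chunks as UTF-8. A minimal sketch — this is not get-stream's actual implementation:

```js
// Conceptual sketch of stream-to-string conversion (NOT get-stream's
// implementation): concatenate chunks from any async iterable,
// decoding binary chunks as UTF-8.
async function streamToString(stream) {
  const decoder = new TextDecoder();
  let result = '';
  for await (const chunk of stream) {
    result += typeof chunk === 'string'
      ? chunk
      : decoder.decode(chunk, { stream: true });
  }
  return result + decoder.decode(); // flush any buffered trailing bytes
}
```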
```ts
/**
 * Get the given stream as a string
 * @param stream - Stream to convert (Node.js Readable, Web ReadableStream, or AsyncIterable)
 * @param options - Optional configuration
 * @returns Promise resolving to string content
 */
function getStream(stream: AnyStream, options?: Options): Promise<string>;
```

Get stream contents as a Node.js Buffer (Node.js environments only).
```ts
/**
 * Get the given stream as a Node.js Buffer
 * @param stream - Stream to convert
 * @param options - Optional configuration
 * @returns Promise resolving to Buffer content
 * @throws Error if Buffer is not available (browser environments)
 */
function getStreamAsBuffer(stream: AnyStream, options?: Options): Promise<Buffer>;
```

Get stream contents as an ArrayBuffer with optimized memory management.
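The ArrayBuffer conversion can be pictured as collecting binary chunks and copying them into one contiguous buffer. A conceptual sketch — not get-stream's actual implementation:

```js
// Conceptual sketch of stream-to-ArrayBuffer conversion (NOT get-stream's
// implementation): collect binary chunks, then copy them into a single
// contiguous buffer.
async function streamToArrayBuffer(stream) {
  const chunks = [];
  let total = 0;
  for await (const chunk of stream) {
    const view = ArrayBuffer.isView(chunk)
      ? new Uint8Array(chunk.buffer, chunk.byteOffset, chunk.byteLength)
      : new Uint8Array(chunk);
    chunks.push(view);
    total += view.byteLength;
  }
  const merged = new Uint8Array(total);
  let offset = 0;
  for (const view of chunks) {
    merged.set(view, offset);
    offset += view.byteLength;
  }
  return merged.buffer;
}
```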
```ts
/**
 * Get the given stream as an ArrayBuffer
 * @param stream - Stream to convert
 * @param options - Optional configuration
 * @returns Promise resolving to ArrayBuffer content
 */
function getStreamAsArrayBuffer(stream: AnyStream, options?: Options): Promise<ArrayBuffer>;
```

Get stream contents as an array. Unlike the other methods, this supports streams in object mode.
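Object mode works because items are collected as-is rather than concatenated. A conceptual sketch — not get-stream's actual implementation:

```js
import { Readable } from 'node:stream';

// Conceptual sketch of array collection (NOT get-stream's implementation):
// items from an object-mode stream are pushed as-is, not concatenated.
async function streamToArray(stream) {
  const items = [];
  for await (const item of stream) {
    items.push(item);
  }
  return items;
}

// Readable.from() over an array of objects yields an object-mode stream.
const objectStream = Readable.from([{ id: 1 }, { id: 2 }]);
const items = await streamToArray(objectStream);
```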
```ts
/**
 * Get the given stream as an array
 * @param stream - Stream to convert (supports object streams)
 * @param options - Optional configuration
 * @returns Promise resolving to array of stream items
 */
function getStreamAsArray<Item>(stream: AnyStream<Item>, options?: Options): Promise<Item[]>;
```

MaxBufferError is thrown when a stream exceeds the configured maximum buffer size.
```ts
/**
 * Error thrown when stream exceeds maxBuffer option
 */
class MaxBufferError extends Error {
  readonly name: 'MaxBufferError';
  constructor();
}
```

Error data recovery: all errors thrown during stream processing (including MaxBufferError) have a `bufferedData` property added dynamically, containing the partial data read before the error occurred. The format depends on the conversion method used:

- `getStream()`: `bufferedData` is a string
- `getStreamAsBuffer()`: `bufferedData` is a Buffer
- `getStreamAsArrayBuffer()`: `bufferedData` is an ArrayBuffer
- `getStreamAsArray()`: `bufferedData` is an array

Usage example:
```js
import getStream, { MaxBufferError } from 'get-stream';

try {
  const content = await getStream(largeStream, { maxBuffer: 1024 });
} catch (error) {
  if (error instanceof MaxBufferError) {
    console.log('Partial content:', error.bufferedData);
    console.log('Stream was too large, got', error.bufferedData.length, 'characters');
  }
  // Note: error.bufferedData is available for any error during stream processing
}
```

Configuration object for all stream conversion methods.
```ts
interface Options {
  /**
   * Maximum length of the stream. If exceeded, MaxBufferError is thrown.
   * Length measurement varies by method:
   * - getStream(): string.length
   * - getStreamAsBuffer(): buffer.length
   * - getStreamAsArrayBuffer(): arrayBuffer.byteLength
   * - getStreamAsArray(): array.length
   * @default Infinity
   */
  maxBuffer?: number;
}
```

All methods accept various stream types for maximum compatibility.
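The `maxBuffer` option above can be pictured as a running length check applied after each chunk. A conceptual sketch for the string case — not the library's implementation, though it mirrors how get-stream attaches `bufferedData` to the error:

```js
// Conceptual sketch of a maxBuffer check (NOT get-stream's implementation):
// the accumulated length is compared after each chunk, and collection
// aborts with the partial data attached to the error.
async function streamToStringLimited(stream, maxBuffer = Infinity) {
  let result = '';
  for await (const chunk of stream) {
    result += chunk;
    if (result.length > maxBuffer) {
      const error = new Error('maxBuffer exceeded');
      error.bufferedData = result; // partial data, mirroring get-stream
      throw error;
    }
  }
  return result;
}
```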
```ts
/**
 * Union type representing supported stream types
 */
type AnyStream<StreamItem = TextStreamItem> =
  | Readable                    // Node.js streams
  | ReadableStream<StreamItem>  // Web ReadableStream
  | AsyncIterable<StreamItem>;  // Async iterables

/**
 * Supported data types for text-based streams
 */
type TextStreamItem = string | Buffer | ArrayBuffer | ArrayBufferView;
```

```js
import fs from 'node:fs';
import getStream from 'get-stream';

const stream = fs.createReadStream('file.txt');
const content = await getStream(stream);
```

```js
import getStream from 'get-stream';

const {body: readableStream} = await fetch('https://example.com');
const content = await getStream(readableStream);
```

```js
import {opendir} from 'node:fs/promises';
import {getStreamAsArray} from 'get-stream';

const asyncIterable = await opendir(directory);
const entries = await getStreamAsArray(asyncIterable);
```

All methods provide partial data recovery when a stream errors before completion: any error thrown during stream processing has a `bufferedData` property containing the data read before the error occurred.
```js
import getStream from 'get-stream';

try {
  const content = await getStream(faultyStream);
} catch (error) {
  // error.bufferedData contains data read before the error
  console.log('Recovered partial data:', error.bufferedData);
}
```

For browser builds, bundlers can rely on the browser condition of the `exports` field in `package.json` or strip `node:*` imports.