tessl/npm-pino

Super fast, all natural JSON logger with exceptional performance for structured logging applications.

Streams

Optimized destination streams and multistream functionality for directing logs to multiple outputs with high performance. Pino provides specialized stream implementations that offer significantly better throughput than standard Node.js streams.

Capabilities

Destination Streams

Create high-performance destination streams using Sonic Boom for optimal log writing throughput.

/**
 * Create a high-performance destination stream
 * @param dest - File descriptor, path, stream, or options object
 * @returns SonicBoom stream optimized for log writing
 */
function destination(
  dest?: number | string | object | DestinationStream | NodeJS.WritableStream | SonicBoomOpts
): SonicBoom;

interface SonicBoomOpts {
  /** File descriptor or path for output */
  dest?: number | string;
  
  /** Minimum buffer size before flushing */
  minLength?: number;
  
  /** Enable synchronous writing */
  sync?: boolean;
  
  /** Append to existing file */
  append?: boolean;
  
  /** Create directory if needed */
  mkdir?: boolean;
  
  /** File mode for created files */
  mode?: number;
  
  /** Retry writes when the file system returns EAGAIN or EBUSY */
  retryEAGAIN?: (err: NodeJS.ErrnoException, writeBufferLen: number, remainingBufferLen: number) => boolean;
  
  /** Perform an fsync after every write */
  fsync?: boolean;
}

interface SonicBoom {
  /** Write data to the stream */
  write(data: string): boolean;
  
  /** Flush buffered data */
  flush(cb?: (err?: Error) => void): void;
  
  /** End the stream */
  end(cb?: () => void): void;
  
  /** Destroy the stream */
  destroy(): void;
  
  /** Check if stream was destroyed */
  destroyed: boolean;
  
  /** Get current file descriptor */
  fd: number;
  
  /** Reopen the file (useful for log rotation) */
  reopen(file?: string | number): void;
}

Usage Examples:

// Default stdout destination
const logger = pino(pino.destination());

// File destination
const logger = pino(pino.destination('./logs/app.log'));

// File destination with options
const logger = pino(pino.destination({
  dest: './logs/app.log',
  minLength: 4096, // Buffer 4KB before writing
  sync: false,     // Async writing for performance
  mkdir: true      // Create directory if needed
}));

// Async destination with no buffering (lowest latency)
const logger = pino(pino.destination({
  dest: './logs/app.log',
  sync: false,
  minLength: 0 // Write immediately
}));

// Stderr destination
const logger = pino(pino.destination(2)); // File descriptor 2 = stderr
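The `minLength` option determines how much data accumulates before a flush. A toy model of that threshold behaviour (this is only an illustration of the idea; the real buffering lives inside Sonic Boom, not this class):

```javascript
// Toy model of the minLength flush threshold -- illustration only.
class ToyBufferedWriter {
  constructor(minLength, sink) {
    this.minLength = minLength; // bytes to accumulate before flushing
    this.sink = sink;           // function that receives flushed chunks
    this.buffer = '';
  }

  write(data) {
    this.buffer += data;
    // With minLength: 0 every write flushes immediately; larger
    // values batch many small writes into fewer I/O calls.
    if (this.buffer.length >= this.minLength) this.flush();
  }

  flush() {
    if (this.buffer.length === 0) return;
    this.sink(this.buffer);
    this.buffer = '';
  }
}

const chunks = [];
const writer = new ToyBufferedWriter(10, (chunk) => chunks.push(chunk));
writer.write('short'); // 5 bytes buffered, below the threshold
writer.write('lines'); // 10 bytes total -> flushed as one chunk
writer.flush();        // buffer is empty, nothing happens
console.log(chunks);   // ['shortlines']
```

This is why a large `minLength` boosts throughput (fewer syscalls) at the cost of latency, and `minLength: 0` does the opposite.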

Multistream Functionality

Send logs to multiple destinations with different configurations and level filtering.

/**
 * Create a multistream that writes to multiple destinations
 * @param streams - Array of stream entries or single stream
 * @param opts - Multistream options
 * @returns Multistream result object
 */
function multistream<TLevel = Level>(
  streams: (DestinationStream | StreamEntry<TLevel>)[] | DestinationStream | StreamEntry<TLevel>,
  opts?: MultiStreamOptions
): MultiStreamRes<TLevel>;

interface StreamEntry<TLevel = Level> {
  /** Destination stream */
  stream: DestinationStream;
  
  /** Minimum level for this stream */
  level?: TLevel;
}

interface MultiStreamOptions {
  /** Custom level definitions */
  levels?: Record<string, number>;
  
  /** Send each log only to the stream with the highest matching level */
  dedupe?: boolean;
}

interface MultiStreamRes<TOriginLevel = Level> {
  /** Write data to all appropriate streams */
  write(data: any): void;
  
  /** Add a new stream to the multistream */
  add<TLevel = Level>(dest: StreamEntry<TLevel> | DestinationStream): MultiStreamRes<TOriginLevel & TLevel>;
  
  /** Flush all streams synchronously */
  flushSync(): void;
  
  /** Minimum level across all streams */
  minLevel: number;
  
  /** Array of all stream entries */
  streams: StreamEntry<TOriginLevel>[];
  
  /** Clone multistream with new level */
  clone<TLevel = Level>(level: TLevel): MultiStreamRes<TLevel>;
}

Usage Examples:

// Multiple destinations with different levels
const streams = pino.multistream([
  { stream: pino.destination('./logs/info.log'), level: 'info' },
  { stream: pino.destination('./logs/error.log'), level: 'error' },
  { stream: process.stdout, level: 'debug' }
]);

// The logger's own level gates emission before multistream routes,
// so it must be set at or below the lowest stream level
const logger = pino({ level: 'debug' }, streams);

logger.debug('Debug message'); // Only to stdout
logger.info('Info message');   // To info.log and stdout
logger.error('Error message'); // To all three streams
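The routing rule behind the comments above (and the `dedupe` option) can be sketched with in-memory sinks standing in for real destinations. This is not pino's implementation, just a model of the behaviour, using pino's default numeric levels:

```javascript
// Sketch of multistream's level routing, including dedupe.
// Numeric levels follow pino's defaults (debug=20, info=30, error=50).
const LEVELS = { debug: 20, info: 30, error: 50 };

function route(streams, recordLevel, record, { dedupe = false } = {}) {
  // A stream matches when its threshold is at or below the record's level.
  const matching = streams.filter((s) => LEVELS[s.level] <= recordLevel);
  if (dedupe && matching.length > 0) {
    // With dedupe, only the stream with the highest level receives it.
    const highest = matching.reduce((a, b) =>
      LEVELS[a.level] >= LEVELS[b.level] ? a : b);
    highest.stream.push(record);
  } else {
    for (const s of matching) s.stream.push(record);
  }
}

const out = { level: 'debug', stream: [] };
const errFile = { level: 'error', stream: [] };

route([out, errFile], LEVELS.info, 'info line');   // only `out` matches
route([out, errFile], LEVELS.error, 'error line'); // both match

console.log(out.stream);     // ['info line', 'error line']
console.log(errFile.stream); // ['error line']
```

With `dedupe: true`, the error line would land only in `errFile`, which is useful when an error file should not be duplicated into a catch-all log.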

Advanced Stream Configuration

High-Performance File Streams

Configure destination streams for maximum throughput in high-volume applications.

Usage Examples:

// High-performance async file writing
const asyncDest = pino.destination({
  dest: './logs/high-volume.log',
  minLength: 4096,  // 4KB buffer
  sync: false,      // Async writes
  mkdir: true
});

// Sync file writing for critical logs
const syncDest = pino.destination({
  dest: './logs/critical.log',
  sync: true,       // Synchronous writes
  fsync: true       // Force flush to disk
});

// Memory-efficient streaming
const streamDest = pino.destination({
  dest: './logs/stream.log',
  minLength: 0,     // Write immediately
  sync: false
});

Stream Events and Error Handling

Handle stream lifecycle events and errors gracefully.

Usage Examples:

const dest = pino.destination('./logs/app.log');

// Handle stream ready
dest.on('ready', () => {
  console.log('Stream ready for writing');
});

// Handle stream errors
dest.on('error', (err) => {
  console.error('Stream error:', err);
  // Implement fallback logging strategy
});

// Handle stream close
dest.on('close', () => {
  console.log('Stream closed');
});

// Manual flush
dest.flush((err) => {
  if (err) {
    console.error('Flush error:', err);
  } else {
    console.log('Stream flushed successfully');
  }
});
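One possible fallback strategy for the `error` handler above (a sketch, not part of pino's API): wrap a primary and a secondary sink, and divert writes once the primary fails. The sink names and shape here are assumptions for illustration:

```javascript
// Hypothetical fallback wrapper: divert writes to a secondary sink
// after the primary sink fails. Not a pino API.
function createFallbackSink(primary, secondary) {
  let usePrimary = true;
  return {
    write(data) {
      if (usePrimary) {
        try {
          primary.write(data);
          return;
        } catch (err) {
          usePrimary = false; // divert all future writes
        }
      }
      secondary.write(data);
    },
  };
}

// Primary sink that always fails, standing in for a broken file stream
const fileSink = {
  write() { throw new Error('EACCES: disk not writable'); },
};
const stderrLines = [];
const stderrSink = { write: (d) => stderrLines.push(d) };

const sink = createFallbackSink(fileSink, stderrSink);
sink.write('first\n');  // primary throws, write is diverted
sink.write('second\n'); // goes straight to the secondary
console.log(stderrLines); // ['first\n', 'second\n']
```

A real implementation would also want to emit a one-time warning and perhaps periodically retry the primary sink.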

Log Rotation Support

Configure streams for log rotation using file reopening.

Usage Examples:

const dest = pino.destination('./logs/app.log');

// Reopen file for log rotation (called by rotation tools)
function rotateLog() {
  dest.reopen('./logs/app.log');
}

// Handle SIGUSR2 for log rotation
process.on('SIGUSR2', rotateLog);

// Manual rotation example
setInterval(() => {
  const timestamp = new Date().toISOString().slice(0, 10);
  const newFile = `./logs/app-${timestamp}.log`;
  dest.reopen(newFile);
}, 24 * 60 * 60 * 1000); // Daily rotation

Multistream Patterns

Level-Based Stream Routing

Route different log levels to appropriate destinations.

Usage Examples:

const streams = pino.multistream([
  // Debug info to console during development
  { 
    stream: process.stdout, 
    level: 'debug' 
  },
  // Application logs to file
  { 
    stream: pino.destination('./logs/app.log'), 
    level: 'info' 
  },
  // Errors to separate file for monitoring
  { 
    stream: pino.destination('./logs/error.log'), 
    level: 'error' 
  },
  // Fatal errors to a dedicated file (monitor this file for alerting)
  { 
    stream: pino.destination('./logs/fatal.log'), 
    level: 'fatal' 
  }
]);

// Logger level must be at or below the lowest stream level ('debug')
const logger = pino({ level: 'debug' }, streams);

Environment-Specific Streaming

Configure different stream setups for different environments.

Usage Examples:

function createStreams() {
  if (process.env.NODE_ENV === 'production') {
    return pino.multistream([
      {
        stream: pino.destination({
          dest: './logs/app.log',
          sync: false,
          minLength: 4096
        }),
        level: 'info'
      },
      {
        stream: pino.destination({
          dest: './logs/error.log',
          sync: true // Sync for errors in production
        }),
        level: 'error'
      }
    ]);
  } else if (process.env.NODE_ENV === 'development') {
    return pino.multistream([
      {
        stream: process.stdout,
        level: 'debug'
      },
      {
        stream: pino.destination('./logs/dev.log'),
        level: 'info'
      }
    ]);
  } else {
    // Test environment - minimal logging
    return pino.destination({
      dest: './logs/test.log',
      minLength: 0
    });
  }
}

const logger = pino(createStreams());

Dynamic Stream Management

Add and remove streams dynamically during application runtime.

Usage Examples:

let multistream = pino.multistream([
  { stream: process.stdout, level: 'info' }
]);

const logger = pino(multistream);

// Add error file logging when needed
function enableErrorLogging() {
  multistream = multistream.add({
    stream: pino.destination('./logs/error.log'),
    level: 'error'
  });
}

// Add debug logging during troubleshooting
function enableDebugLogging() {
  multistream = multistream.add({
    stream: pino.destination('./logs/debug.log'),
    level: 'debug'
  });
}

// Clone for testing specific scenarios
const testMultistream = multistream.clone('trace');
const testLogger = pino(testMultistream);

Stream Performance Optimization

Buffer Configuration

Optimize buffering settings for different use cases.

Usage Examples:

// High-throughput logging (batch processing)
const highThroughputDest = pino.destination({
  dest: './logs/batch.log',
  minLength: 65536, // 64KB buffer
  sync: false
});

// Low-latency logging (real-time processing)
const lowLatencyDest = pino.destination({
  dest: './logs/realtime.log',
  minLength: 0,     // No buffering
  sync: false
});

// Critical logging (data integrity)
const criticalDest = pino.destination({
  dest: './logs/critical.log',
  sync: true,       // Synchronous writes
  fsync: true       // Force disk sync
});

Memory Management

Configure streams to minimize memory usage in long-running applications.

Usage Examples:

// Memory-efficient configuration
const dest = pino.destination({
  dest: './logs/app.log',
  minLength: 1024,  // Small buffer
  sync: false
});

// Periodic flushing to free memory
setInterval(() => {
  dest.flush();
}, 5000); // Flush every 5 seconds

// Graceful shutdown
process.on('SIGTERM', () => {
  dest.flush(() => {
    dest.end(() => {
      process.exit(0);
    });
  });
});

Install with Tessl CLI

npx tessl i tessl/npm-pino
