tessl/npm-pino

Super fast, all natural JSON logger with exceptional performance for structured logging applications.

Transports

Transports are a high-performance log processing system that uses worker threads to handle log output without blocking the main thread. They allow you to send logs to multiple destinations, transform log data, and offload log processing to separate threads for optimal performance.

Capabilities

Transport Function

Create transport streams for processing logs in worker threads.

/**
 * Creates a transport stream that processes logs in a separate worker thread
 * @param options - Transport configuration options
 * @returns ThreadStream for use as logger destination
 */
function transport(options: TransportSingleOptions | TransportMultiOptions | TransportPipelineOptions): ThreadStream;

type ThreadStream = any; // Worker thread stream interface

Usage Examples:

import pino from 'pino';

const transport = pino.transport({
  target: 'pino-pretty'
});

const logger = pino(transport);
logger.info('This will be pretty printed');

Single Transport

Single Transport Configuration

Configure a single transport target for log processing.

interface TransportSingleOptions {
  /** Target module name or file path for transport */
  target: string;
  
  /** Options passed to the transport */
  options?: Record<string, any>;
  
  /** Worker thread options */
  worker?: WorkerOptions & { autoEnd?: boolean };
}

interface WorkerOptions {
  /** Worker thread environment settings */
  env?: Record<string, string>;
  /** Worker execution arguments */
  execArgv?: string[];
  /** Worker resource limits */
  resourceLimits?: {
    maxOldGenerationSizeMb?: number;
    maxYoungGenerationSizeMb?: number;
    codeRangeSizeMb?: number;
  };
}

Usage Examples:

// Pretty printing transport
const prettyTransport = pino.transport({
  target: 'pino-pretty',
  options: {
    colorize: true,
    translateTime: 'SYS:standard'
  }
});

// File transport
const fileTransport = pino.transport({
  target: 'pino/file',
  options: {
    destination: './logs/app.log',
    mkdir: true
  }
});

// Custom transport with worker options
const customTransport = pino.transport({
  target: './my-custom-transport.js',
  options: {
    customOption: 'value'
  },
  worker: {
    autoEnd: false,
    env: { NODE_ENV: 'production' }
  }
});

Multiple Transports

Multi-Transport Configuration

Send logs to multiple destinations with different configurations and filtering.

interface TransportMultiOptions {
  /** Array of transport targets */
  targets: readonly (TransportTargetOptions | TransportPipelineOptions)[];
  
  /** Custom levels configuration */
  levels?: Record<string, number>;
  
  /** Send each log only to the matching target with the highest level, instead of to every matching target */
  dedupe?: boolean;
  
  /** Options passed to each transport */
  options?: Record<string, any>;
  
  /** Worker thread options */
  worker?: WorkerOptions & { autoEnd?: boolean };
}

interface TransportTargetOptions {
  /** Target module name or file path */
  target: string;
  
  /** Minimum level for this transport */
  level?: string;
  
  /** Transport-specific options */
  options?: Record<string, any>;
}

Usage Examples:

const transport = pino.transport({
  targets: [
    {
      level: 'error',
      target: 'pino/file',
      options: {
        destination: './logs/error.log'
      }
    },
    {
      level: 'info',
      target: 'pino-pretty',
      options: {
        colorize: true
      }
    },
    {
      level: 'warn',
      target: 'pino-elasticsearch',
      options: {
        node: 'http://localhost:9200',
        index: 'app-logs'
      }
    }
  ]
});

const logger = pino(transport);

logger.info('Info message'); // Goes to pino-pretty only (below the 'warn' and 'error' thresholds)
logger.error('Error message'); // Goes to all three transports
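With `dedupe: true`, each log line is routed only to the matching target with the highest level, rather than to every target whose threshold it meets. A minimal sketch (file paths are illustrative):

```javascript
import pino from 'pino';

const transport = pino.transport({
  dedupe: true,
  targets: [
    {
      level: 'info',
      target: 'pino/file',
      options: { destination: './logs/app.log' }
    },
    {
      level: 'error',
      target: 'pino/file',
      options: { destination: './logs/error.log' }
    }
  ]
});

const logger = pino(transport);

logger.info('Routine event'); // app.log only
logger.error('Failure');      // error.log only, not duplicated into app.log
```

Without `dedupe`, the error line above would land in both files, since both targets accept `error`-level logs.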

Pipeline Transports

Pipeline Configuration

Chain multiple transports together for sequential processing.

interface TransportPipelineOptions {
  /** Array of transports to chain together */
  pipeline: TransportSingleOptions[];
  
  /** Minimum level for the pipeline */
  level?: string;
  
  /** Options passed to the pipeline */
  options?: Record<string, any>;
  
  /** Worker thread options */
  worker?: WorkerOptions & { autoEnd?: boolean };
}

Usage Examples:

const transport = pino.transport({
  pipeline: [
    {
      target: './transform-logs.js', // First: transform log format
      options: { addTimezone: true }
    },
    {
      target: './filter-logs.js', // Second: filter sensitive data
      options: { removePasswords: true }
    },
    {
      target: 'pino/file', // Third: write to file
      options: { destination: './logs/processed.log' }
    }
  ]
});
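Each pipeline step must resolve to a Transform stream. One way to write such a step is with the separate `pino-abstract-transport` package and its `enablePipelining` option; the file name and the `addTimezone` behavior below are illustrative, not part of pino itself:

```javascript
// transform-logs.js -- a pipeline step (illustrative sketch)
import build from 'pino-abstract-transport';
import { pipeline, Transform } from 'stream';

export default function (options) {
  return build(function (source) {
    // source yields each log line parsed into an object
    const transform = new Transform({
      objectMode: true,
      autoDestroy: true,
      transform(chunk, encoding, callback) {
        if (options.addTimezone) {
          chunk.timezone = Intl.DateTimeFormat().resolvedOptions().timeZone;
        }
        // re-serialize as NDJSON for the next step in the pipeline
        callback(null, JSON.stringify(chunk) + '\n');
      }
    });
    pipeline(source, transform, () => {});
    return transform;
  }, { enablePipelining: true });
}
```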

Built-in Transports

File Transport

Write logs directly to files. Rotation is not built into `pino/file`; use a dedicated transport such as `pino-roll` for rotation.

// Built-in file transport: 'pino/file'
interface FileTransportOptions {
  /** File path for log output */
  destination: string;
  
  /** Create directory if it doesn't exist */
  mkdir?: boolean;
  
  /** Append to existing file */
  append?: boolean;
}

Usage Examples:

const fileTransport = pino.transport({
  target: 'pino/file',
  options: {
    destination: './logs/app.log',
    mkdir: true
  }
});

// With daily rotation (requires the pino-roll package)
const rollTransport = pino.transport({
  target: 'pino-roll',
  options: {
    file: './logs/app.log',
    frequency: 'daily',
    size: '10m'
  }
});

Custom Transports

Creating Custom Transports

Build custom transport modules for specialized log processing.

// Custom transport module structure
export interface TransportStream {
  /** Write method for processing log data */
  write(chunk: string): void;
  
  /** Optional end method for cleanup */
  end?(): void;
  
  /** Optional flush method */
  flush?(): void;
}

Custom Transport Example:

// custom-transport.js
import { Transform } from 'stream';

export default function customTransport(options) {
  return new Transform({
    autoDestroy: true,
    transform(chunk, encoding, callback) {
      // chunk arrives as a serialized NDJSON line
      const logObj = JSON.parse(chunk.toString());

      // Custom processing
      if (logObj.level >= 50) { // 50 = 'error' in pino's default levels
        // Send to external monitoring system
        // (sendToMonitoring is a placeholder for your own integration)
        sendToMonitoring(logObj);
      }

      // Transform and pass through
      logObj.processed = true;
      logObj.processingTime = Date.now();

      callback(null, JSON.stringify(logObj) + '\n');
    }
  });
}

// Usage
const transport = pino.transport({
  target: './custom-transport.js',
  options: {
    apiKey: 'monitoring-api-key'
  }
});
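As an alternative to a raw Transform, the separate `pino-abstract-transport` package (which most published transports build on) handles the stream plumbing and yields each line already parsed; a minimal sketch, with the monitoring call as a placeholder:

```javascript
// abstract-transport-example.js (illustrative sketch)
import build from 'pino-abstract-transport';

export default async function (options) {
  return build(async function (source) {
    // source is an async iterable of parsed log objects
    for await (const obj of source) {
      if (obj.level >= 50) { // 50 = 'error' in pino's default levels
        // placeholder for forwarding to your monitoring system
        console.error('ALERT:', obj.msg);
      }
    }
  });
}
```

Because `build()` consumes the source for you, there is no need to manage backpressure or re-serialize lines when the transport is a final destination.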

Transport Error Handling

Error Management

Handle transport errors without affecting main application performance.

Usage Examples:

const transport = pino.transport({
  targets: [
    {
      target: 'pino/file',
      options: { destination: './logs/app.log' }
    },
    {
      target: 'unreliable-transport', // May fail
      options: { endpoint: 'https://api.example.com/logs' }
    }
  ]
});

// Transport errors are isolated and don't crash the main process
const logger = pino(transport);

// Main application continues even if transport fails
logger.info('Application continues running');

Transport Events

Monitor transport status and handle transport lifecycle events.

Usage Examples:

const transport = pino.transport({
  target: 'pino-pretty'
});

// Monitor transport events (if supported by the transport)
transport.on('ready', () => {
  console.log('Transport ready');
});

transport.on('error', (err) => {
  console.error('Transport error:', err);
});

transport.on('close', () => {
  console.log('Transport closed');
});
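The stream returned by `pino.transport()` is also writable, so when `autoEnd` is disabled you can end it yourself and wait for `'close'` to know buffered logs have been flushed. A hedged sketch of a graceful shutdown, assuming a `pino/file` target:

```javascript
import pino from 'pino';

const transport = pino.transport({
  target: 'pino/file',
  options: { destination: './logs/app.log' },
  worker: { autoEnd: false } // we will end the transport ourselves
});

const logger = pino(transport);

function shutdown() {
  logger.info('Shutting down');
  // wait for the worker to flush and close before exiting
  transport.once('close', () => process.exit(0));
  transport.end();
}

process.on('SIGTERM', shutdown);
```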

Popular Transport Modules

Community Transports

Common transport modules available from the Pino ecosystem:

// Popular transport targets (npm packages)
type PopularTransports =
  | "pino-pretty"           // Pretty printing for development
  | "pino-elasticsearch"    // Elasticsearch integration
  | "pino-mongodb"          // MongoDB storage
  | "pino-syslog"           // Syslog protocol
  | "pino-socket"           // TCP/UDP socket transport
  | "pino-http-send"        // HTTP endpoint transport
  | "pino-datadog"          // Datadog integration
  | "pino-cloudwatch"       // AWS CloudWatch
  | "pino-slack"            // Slack notifications
  | "pino/file";            // Built-in file transport

Usage Examples:

// Development setup with pretty printing
const devTransport = pino.transport({
  target: 'pino-pretty',
  options: {
    colorize: true,
    translateTime: 'SYS:standard',
    ignore: 'pid,hostname'
  }
});

// Production setup with multiple outputs
const prodTransport = pino.transport({
  targets: [
    {
      level: 'info',
      target: 'pino/file',
      options: { destination: './logs/app.log' }
    },
    {
      level: 'error',
      target: 'pino-elasticsearch',
      options: {
        node: 'http://elasticsearch:9200',
        index: 'error-logs'
      }
    },
    {
      level: 'fatal',
      target: 'pino-slack',
      options: {
        webhookUrl: process.env.SLACK_WEBHOOK_URL,
        channel: '#alerts'
      }
    }
  ]
});

// Environment-specific logger
const logger = pino(
  process.env.NODE_ENV === 'production' ? prodTransport : devTransport
);

Install with Tessl CLI

npx tessl i tessl/npm-pino
