tessl/npm-cache-manager

tessl install tessl/npm-cache-manager@7.2.0

Cache Manager for Node.js with support for multi-store caching, background refresh, and Keyv-compatible storage adapters


cache-manager

Flexible Node.js caching module with TypeScript support, multi-store tiered caching, background refresh, and Keyv-compatible storage adapters.

Quick Reference

| Feature | Description |
| --- | --- |
| Multi-Store | Tiered caching (L1 memory, L2 Redis, etc.) |
| Background Refresh | Proactive cache updates before expiration |
| Type Safety | Full TypeScript with generic type preservation |
| Storage Adapters | Any Keyv-compatible store (Redis, MongoDB, SQLite, etc.) |
| Event System | Monitor all cache operations |
| Function Wrapping | Automatic memoization with cache coalescing |

Installation

npm install cache-manager

For storage adapters:

npm install keyv @keyv/redis cacheable

Basic Usage

import { createCache } from 'cache-manager';

// Create cache
const cache = createCache();

// Basic operations
await cache.set('key', 'value', 60000); // 60 second TTL
const value = await cache.get('key');
await cache.del('key');
await cache.clear();

Core Concepts

Cache Factory

function createCache(options?: CreateCacheOptions): Cache;

type CreateCacheOptions = {
  stores?: Keyv[];              // Array of Keyv store instances
  ttl?: number;                 // Default TTL in milliseconds
  refreshThreshold?: number;    // Background refresh threshold
  refreshAllStores?: boolean;   // Refresh all stores or only up to found store
  nonBlocking?: boolean;        // Non-blocking multi-store operations
  cacheId?: string;             // Custom cache instance identifier
};

Cache Interface

interface Cache {
  // Read operations
  get<T>(key: string): Promise<T | undefined>;
  mget<T>(keys: string[]): Promise<Array<T | undefined>>;
  ttl(key: string): Promise<number | undefined>;
  
  // Write operations
  set<T>(key: string, value: T, ttl?: number): Promise<T>;
  mset<T>(list: Array<{ key: string; value: T; ttl?: number }>): Promise<Array<{ key: string; value: T; ttl?: number }>>;
  
  // Delete operations
  del(key: string): Promise<boolean>;
  mdel(keys: string[]): Promise<boolean>;
  clear(): Promise<boolean>;
  
  // Function wrapping
  wrap<T>(key: string, fnc: () => T | Promise<T>, ttl?: number, refreshThreshold?: number): Promise<T>;
  wrap<T>(key: string, fnc: () => T | Promise<T>, options: WrapOptions<T>): Promise<T>;
  
  // Event system
  on<E extends keyof Events>(event: E, listener: Events[E]): EventEmitter;
  off<E extends keyof Events>(event: E, listener: Events[E]): EventEmitter;
  
  // Metadata
  cacheId(): string;
  stores: Keyv[];
  disconnect(): Promise<undefined>;
}

Multi-Store Caching

import { createCache } from 'cache-manager';
import { Keyv } from 'keyv';
import KeyvRedis from '@keyv/redis';
import { CacheableMemory } from 'cacheable';

const cache = createCache({
  stores: [
    new Keyv({ store: new CacheableMemory({ ttl: 60000, lruSize: 500 }) }),  // L1: Fast
    new Keyv({ store: new KeyvRedis('redis://localhost:6379') }),            // L2: Persistent
  ],
  ttl: 300000,
  refreshThreshold: 60000,
});

Behavior:

  • Get: Searches stores in order, returns first match, promotes to higher tiers
  • Set: Writes to all stores (or uses Promise.race if nonBlocking: true)
  • Refresh: Updates stores up to where key was found (or all if refreshAllStores: true)
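The get-and-promote behavior can be sketched with plain Maps standing in for the Keyv stores (an illustrative sketch, not the library's internals; `tieredGet` is a hypothetical helper):

```typescript
// Search tiers in order, return the first hit, and copy it into
// every faster tier so later reads are served from L1.
type Store = Map<string, unknown>;

function tieredGet(stores: Store[], key: string): unknown {
  for (let i = 0; i < stores.length; i++) {
    const value = stores[i].get(key);
    if (value !== undefined) {
      // Promote the hit into all higher (faster) tiers.
      for (let j = 0; j < i; j++) stores[j].set(key, value);
      return value;
    }
  }
  return undefined;
}

const l1: Store = new Map();                      // fast tier, empty
const l2: Store = new Map([['user:1', 'alice']]); // slow tier, has the value

tieredGet([l1, l2], 'user:1'); // 'alice', found in L2
l1.get('user:1');              // 'alice', promoted into L1
```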

Function Wrapping

Automatic memoization with cache coalescing (prevents thundering herd):

async function fetchUser(id: number) {
  return await db.users.findById(id);
}

// Wrap function - executes only once for concurrent calls
const user = await cache.wrap('user:123', () => fetchUser(123), 60000);
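The coalescing itself can be sketched as a map of in-flight promises (a simplified illustration; `coalesce` and `inFlight` are not part of the cache-manager API):

```typescript
// Concurrent callers for the same key share one in-flight promise,
// so the wrapped function executes only once per key at a time.
const inFlight = new Map<string, Promise<unknown>>();

function coalesce<T>(key: string, fn: () => Promise<T>): Promise<T> {
  const pending = inFlight.get(key);
  if (pending) return pending as Promise<T>;
  const p = fn().finally(() => inFlight.delete(key));
  inFlight.set(key, p);
  return p;
}

let calls = 0;
const load = async () => { calls++; return 'data'; };

// Three concurrent calls, one execution of `load`.
Promise.all([coalesce('k', load), coalesce('k', load), coalesce('k', load)])
  .then(() => console.log(calls)); // 1
```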

Background Refresh:

// When remaining TTL < refreshThreshold, triggers background refresh
await cache.wrap('key', fetchData, 10000, 3000);
// At 7+ seconds: returns cached value, refreshes in background
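The trigger condition can be expressed as a simple check (the `shouldRefresh` helper is illustrative, not a library export):

```typescript
// A background refresh fires when the key's remaining TTL drops
// below refreshThreshold (both values in milliseconds).
function shouldRefresh(remainingTtl: number, refreshThreshold: number): boolean {
  return remainingTtl < refreshThreshold;
}

shouldRefresh(8000, 3000); // false: 8s remain, serve from cache only
shouldRefresh(2500, 3000); // true: under threshold, refresh in background
```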

Event System

Monitor cache operations:

cache.on('get', ({ key, value, store, error }) => {
  console.log(value !== undefined ? 'hit' : 'miss');
});

cache.on('set', ({ key, value, error }) => {
  console.log(`Cached: ${key}`);
});

cache.on('refresh', ({ key, value, error }) => {
  console.log(`Background refresh: ${key}`);
});

Available Events: get, mget, set, mset, del, mdel, clear, ttl, refresh

Key Types

type Events = {
  get<T>(data: { key: string; value?: T; store?: string; error?: unknown }): void;
  set<T>(data: { key: string; value: T; store?: string; error?: unknown }): void;
  refresh<T>(data: { key: string; value: T; error?: unknown }): void;
  // ... other events
};

type WrapOptions<T> = {
  ttl?: number | ((value: T) => number);
  refreshThreshold?: number | ((value: T) => number);
};
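Resolving the number-or-function form of `ttl` against a freshly produced value can be sketched as follows (the `resolveTtl` helper is hypothetical, not part of the library):

```typescript
// WrapOptions lets ttl (and refreshThreshold) be a constant or a
// function of the value being cached; either form resolves to a number.
type TtlOption<T> = number | ((value: T) => number);

function resolveTtl<T>(ttl: TtlOption<T>, value: T): number {
  return typeof ttl === 'function' ? ttl(value) : ttl;
}

const byPriority = (v: { priority: string }) =>
  v.priority === 'high' ? 3_600_000 : 60_000;

resolveTtl(60_000, { priority: 'low' });      // 60000
resolveTtl(byPriority, { priority: 'high' }); // 3600000
```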

Documentation Structure

📘 Guides

  • Quick Start - Step-by-step setup and common workflows
  • Multi-Store Setup - Configuring tiered caching
  • Background Refresh - Proactive cache updates
  • Error Handling - Patterns for robust caching

📖 Examples

📚 Reference

Quick Examples

API Response Caching

app.get('/api/users/:id', async (req, res) => {
  const user = await cache.wrap(
    `user:${req.params.id}`,
    () => db.users.findById(req.params.id),
    60000
  );
  res.json(user);
});

Metrics Collection

const metrics = { hits: 0, misses: 0 };

cache.on('get', ({ value }) => {
  value !== undefined ? metrics.hits++ : metrics.misses++;
});

Graceful Shutdown

process.on('SIGTERM', async () => {
  await cache.disconnect();
  process.exit(0);
});

Performance Characteristics

| Operation | In-Memory | Redis | Notes |
| --- | --- | --- | --- |
| Single get/set | ~1ms | ~10-50ms | Network latency for Redis |
| Batch operations | ~0.05ms/item | ~1-5ms/item | 10-100x faster than individual ops |
| Multi-store overhead | Negligible | +10-20% | Proportional to slowest store |

Common Patterns

Cache with Fallback:

const value = await cache.get('key') ?? await fetchFromSource();

Batch Loading:

const values = await cache.mget(keys);
const missing = keys.filter((_, i) => values[i] === undefined);
// Fetch missing, then cache
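Completed end-to-end, the batch-loading pattern looks like this (sketched against a minimal in-memory mock of the mget/mset batch API; `fetchMany` is a stand-in for the real data source):

```typescript
// In-memory stand-in exposing the two batch calls used below.
const store = new Map<string, string>();
const cache = {
  mget: async (keys: string[]) => keys.map((k) => store.get(k)),
  mset: async (list: Array<{ key: string; value: string }>) => {
    for (const { key, value } of list) store.set(key, value);
  },
};

// Hypothetical loader for whatever source backs the cache.
async function fetchMany(keys: string[]): Promise<string[]> {
  return keys.map((k) => `value-of-${k}`);
}

async function batchLoad(keys: string[]): Promise<string[]> {
  const values = await cache.mget(keys);
  const missing = keys.filter((_, i) => values[i] === undefined);
  if (missing.length > 0) {
    const fetched = await fetchMany(missing);
    // Cache the fetched entries, then re-read so every slot is filled.
    await cache.mset(missing.map((key, i) => ({ key, value: fetched[i] })));
  }
  return (await cache.mget(keys)) as string[];
}
```

With `'a'` already cached, `batchLoad(['a', 'b'])` returns the cached value for `'a'` and fetches, caches, and returns a fresh value for `'b'`.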

Conditional Caching:

const ttl = (data) => data.priority === 'high' ? 3600000 : 60000;
await cache.wrap('key', fetchData, ttl);

Troubleshooting

| Issue | Solution |
| --- | --- |
| Cache not working | Verify TTL is set (default or per-operation) |
| Stale data | Check the refreshAllStores setting and refresh threshold |
| High memory | Configure LRU eviction: new CacheableMemory({ lruSize: 1000 }) |
| Refresh not triggering | Ensure refreshThreshold < ttl and enough time has passed |

See Troubleshooting Guide for detailed solutions.

Migration from v5

Use KeyvAdapter to wrap legacy stores:

import { KeyvAdapter } from 'cache-manager';
import { redisStore } from 'cache-manager-redis-yet';

const legacyStore = await redisStore(/* config */);
const adapter = new KeyvAdapter(legacyStore);
const cache = createCache({ stores: [new Keyv({ store: adapter })] });

See Legacy Adapter Guide for details.

External Dependencies

  • keyv - Storage abstraction layer
  • @keyv/redis - Redis adapter (optional)
  • cacheable - In-memory LRU cache (optional)
  • cache-manager-redis-yet - Legacy Redis adapter (v5 compatibility)

Additional Resources

  • npm package
  • GitHub repository
  • Keyv documentation

Support

For issues and questions: