tessl/npm-cache-manager

tessl install tessl/npm-cache-manager@7.2.0

Cache Manager for Node.js with support for multi-store caching, background refresh, and Keyv-compatible storage adapters

docs/guides/quick-start.md

Quick Start Guide

Get started with cache-manager in minutes.

Installation

npm install cache-manager

Basic Setup

1. Create a Simple Cache

import { createCache } from 'cache-manager';

const cache = createCache();

// Set a value (with optional TTL in milliseconds)
await cache.set('user:123', { name: 'Alice', role: 'admin' }, 60000);

// Get a value
const user = await cache.get('user:123');
console.log(user); // { name: 'Alice', role: 'admin' }

// Delete a value
await cache.del('user:123');

// Clear all entries
await cache.clear();

2. Configure Default TTL

const cache = createCache({
  ttl: 300000, // 5 minutes default for all entries
});

// Uses default TTL (5 minutes)
await cache.set('key', 'value');

// Override with custom TTL (10 seconds)
await cache.set('temporary', 'data', 10000);

Multi-Store Setup

3. Add Redis for Persistence

npm install keyv @keyv/redis

import { createCache } from 'cache-manager';
import { Keyv } from 'keyv';
import KeyvRedis from '@keyv/redis';

const cache = createCache({
  stores: [
    new Keyv({ 
      store: new KeyvRedis('redis://localhost:6379')
    }),
  ],
  ttl: 300000, // 5 minutes
});

await cache.set('key', 'value');
// Stored in Redis

4. Two-Tier Caching (Memory + Redis)

npm install keyv @keyv/redis cacheable

import { createCache } from 'cache-manager';
import { Keyv } from 'keyv';
import KeyvRedis from '@keyv/redis';
import { CacheableMemory } from 'cacheable';

const cache = createCache({
  stores: [
    // L1: Fast in-memory cache
    new Keyv({
      store: new CacheableMemory({ 
        ttl: 60000,    // 1 minute
        lruSize: 500   // Store max 500 items
      }),
    }),
    // L2: Persistent Redis cache
    new Keyv({
      store: new KeyvRedis('redis://localhost:6379'),
    }),
  ],
  ttl: 300000, // 5 minutes default
});

await cache.set('product:456', { name: 'Widget', price: 29.99 });
// Stored in both memory and Redis

const product = await cache.get('product:456');
// Retrieved from memory (fastest), falls back to Redis if needed

Function Wrapping

5. Cache Expensive Operations

async function fetchUserFromDB(id: number) {
  console.log('Fetching from database...');
  // Expensive database query
  return await db.users.findById(id);
}

// First call - executes function and caches result
const user1 = await cache.wrap('user:123', () => fetchUserFromDB(123), 60000);
// Output: "Fetching from database..."

// Second call - returns from cache without executing function
const user2 = await cache.wrap('user:123', () => fetchUserFromDB(123), 60000);
// No output - function not called

6. Background Refresh

Automatically refresh cache before expiration:

const cache = createCache({
  ttl: 300000,           // 5 minutes
  refreshThreshold: 60000 // Refresh when < 1 minute remains
});

async function fetchData() {
  return await api.getData();
}

// When TTL drops below 60 seconds:
// - Returns cached value immediately
// - Refreshes in background
const data = await cache.wrap('api:data', fetchData, 300000, 60000);

Batch Operations

7. Work with Multiple Keys

// Set multiple values
await cache.mset([
  { key: 'user:1', value: { name: 'Alice' }, ttl: 60000 },
  { key: 'user:2', value: { name: 'Bob' }, ttl: 60000 },
  { key: 'user:3', value: { name: 'Charlie' }, ttl: 60000 },
]);

// Get multiple values
const users = await cache.mget(['user:1', 'user:2', 'user:3']);
// [{ name: 'Alice' }, { name: 'Bob' }, { name: 'Charlie' }]

// Delete multiple keys
await cache.mdel(['user:1', 'user:2', 'user:3']);

Event Monitoring

8. Track Cache Operations

const metrics = { hits: 0, misses: 0, writes: 0 };

cache.on('get', ({ key, value }) => {
  if (value !== undefined) {
    metrics.hits++;
  } else {
    metrics.misses++;
  }
});

cache.on('set', ({ key }) => {
  metrics.writes++;
});

// Use the cache...
await cache.set('key', 'value');
await cache.get('key');  // hit
await cache.get('missing'); // miss

console.log(metrics); 
// { hits: 1, misses: 1, writes: 1 }
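
Recent cache-manager releases also document 'del' and 'clear' events alongside 'get' and 'set'; the exact payload shape here is an assumption, so check your installed version before relying on it:

// Assumption: your cache-manager version emits 'del' (with the affected key)
// and 'clear' events, as recent releases document
cache.on('del', ({ key }) => {
  console.log(`deleted ${key}`);
});

cache.on('clear', () => {
  console.log('cache cleared');
});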

Common Patterns

9. Cache with Fallback

async function getCachedOrFetch(key: string) {
  // Try cache first
  let value = await cache.get(key);
  
  if (value === undefined) {
    // Cache miss - fetch from source
    value = await fetchFromSource(key);
    await cache.set(key, value, 60000);
  }
  
  return value;
}
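
This read-through pattern is roughly what cache.wrap from step 5 already does, including TTL handling, so when you don't need custom miss logic the call below is the shorter equivalent (fetchFromSource remains a placeholder):

const value = await cache.wrap(key, () => fetchFromSource(key), 60000);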

10. Express.js Integration

import express from 'express';
import { createCache } from 'cache-manager';

const app = express();
const cache = createCache({ ttl: 60000 });

app.get('/api/users/:id', async (req, res) => {
  const user = await cache.wrap(
    `user:${req.params.id}`,
    async () => {
      return await db.users.findById(req.params.id);
    },
    60000 // 60 second cache
  );
  
  if (!user) {
    return res.status(404).json({ error: 'User not found' });
  }
  
  res.json(user);
});

app.listen(3000);
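
When a user changes, invalidate the cached entry so the next request repopulates it. A sketch of a companion route, where db.users.delete is a placeholder mirroring the db.users.findById placeholder above:

app.delete('/api/users/:id', async (req, res) => {
  await db.users.delete(req.params.id); // placeholder data-access call

  // Remove the stale cache entry so the next GET misses and re-caches
  await cache.del(`user:${req.params.id}`);

  res.status(204).end();
});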

Graceful Shutdown

11. Clean Up Connections

// Disconnect from stores when application shuts down
process.on('SIGTERM', async () => {
  console.log('Shutting down...');
  await cache.disconnect();
  process.exit(0);
});

process.on('SIGINT', async () => {
  console.log('Interrupted...');
  await cache.disconnect();
  process.exit(0);
});

Next Steps

  • Multi-Store Setup Guide - Advanced multi-tier caching
  • Background Refresh Guide - Proactive cache updates
  • Error Handling Guide - Robust error patterns
  • Real-World Scenarios - Production examples
  • Core Operations Reference - Complete API documentation

Troubleshooting

Cache Not Working

  • Check TTL: Ensure a TTL is set, either as a default on createCache or per operation
  • Check Serialization: Verify values are serializable (no functions or symbols); a quick round-trip check is sketched below
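
If the cache seems to silently do nothing, a minimal round-trip probe narrows the problem down. This sketch only uses the set/get calls from step 1; the debug:probe key and sample value are placeholders:

import { createCache } from 'cache-manager';

const cache = createCache({ ttl: 60000 });

// If this logs undefined, the value was never stored or expired immediately
await cache.set('debug:probe', { ok: true });
console.log(await cache.get('debug:probe'));

// Persistent stores serialize values; anything that does not survive a JSON
// round trip (functions, symbols, class instances) may come back changed
const candidate = { name: 'Alice', lastLogin: new Date().toISOString() };
console.log(JSON.parse(JSON.stringify(candidate)));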

Values Not Persisting

  • Check Store: Ensure the store is connected; connection errors surface on the underlying Keyv instance (see the sketch below)
  • Check TTL: A very short TTL may expire entries before you read them back
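
Connection problems usually surface as 'error' events on the underlying Keyv instance, so keep a reference to the store and attach a listener. A minimal sketch, reusing the Redis setup from step 3:

import { createCache } from 'cache-manager';
import { Keyv } from 'keyv';
import KeyvRedis from '@keyv/redis';

const redisStore = new Keyv({ store: new KeyvRedis('redis://localhost:6379') });

// Keyv re-emits connection failures from the Redis client as 'error' events;
// log them so a broken connection does not go unnoticed
redisStore.on('error', (err) => {
  console.error('Redis store error:', err);
});

const cache = createCache({ stores: [redisStore] });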

Performance Issues

  • Use Batch Operations: mget/mset/mdel issue one request for many keys instead of one round trip per key, which is significantly faster against remote stores like Redis (see step 7)
  • Add Memory Tier: Put an in-memory L1 cache in front of Redis for hot data (see step 4)
  • Configure LRU: Limit the memory cache size with new CacheableMemory({ lruSize: 1000 }) so the L1 tier cannot grow without bound

For detailed troubleshooting, see Core Operations Reference.