tessl/npm-cache-manager (describes npm/cache-manager@7.2.x)

tessl install tessl/npm-cache-manager@7.2.0

Cache Manager for Node.js with support for multi-store caching, background refresh, and Keyv-compatible storage adapters

docs/guides/background-refresh.md

Background Refresh Guide

Keep your cache fresh without blocking user requests.

Overview

Background refresh automatically updates cache entries shortly before they expire:

  • Users get fast responses - Always returns cached value immediately
  • Cache stays fresh - Refreshes in background before expiration
  • Prevents cache stampedes - Avoids thundering herd problem

Basic Usage

import { createCache } from 'cache-manager';

const cache = createCache({
  ttl: 300000,           // 5 minutes total
  refreshThreshold: 60000 // Refresh when < 1 minute remains
});

await cache.wrap('key', fetchData, 300000, 60000);

Timeline:

Time 0s:   Call wrap() → executes fetchData(), caches for 5 minutes
Time 2m:   Call wrap() → returns cached (3m remaining > 1m threshold)
Time 4.5m: Call wrap() → returns cached immediately
                       → triggers background refresh (30s remaining < 1m threshold)
Time 4.6m: Background refresh completes → cache updated
Time 5m:   (would have expired, but already refreshed at 4.5m)
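
The same sequence can be reproduced as a small script. Below is a minimal sketch with the TTLs shrunk to seconds; fetchData here is just an illustrative counter, and the file must run as an ES module for top-level await:

import { createCache } from 'cache-manager';

const cache = createCache({ ttl: 5000, refreshThreshold: 1000 });

let calls = 0;
const fetchData = async () => `value-${++calls}`; // stand-in data source
const sleep = (ms: number) => new Promise((resolve) => setTimeout(resolve, ms));

console.log(await cache.wrap('key', fetchData)); // 0s: miss → runs fetchData → "value-1"
await sleep(2000);
console.log(await cache.wrap('key', fetchData)); // 2s: 3s remain > 1s threshold → cached "value-1"
await sleep(2500);
console.log(await cache.wrap('key', fetchData)); // 4.5s: 0.5s remain < threshold → "value-1", refresh starts
await sleep(200);
console.log(await cache.wrap('key', fetchData)); // refresh finished → "value-2"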

Configuration

Global Refresh Threshold

const cache = createCache({
  ttl: 600000,            // 10 minutes
  refreshThreshold: 120000 // Refresh when < 2 minutes remain
});

// Uses global settings
await cache.wrap('key1', fetchData);

Per-Operation Threshold

// Override global settings
await cache.wrap('key2', fetchData, 300000, 60000);
// TTL: 5 minutes, Threshold: 1 minute

Dynamic Threshold

// Compute threshold based on value
await cache.wrap(
  'key',
  fetchData,
  (data) => data.priority === 'high' ? 600000 : 300000,  // TTL
  (data) => data.priority === 'high' ? 180000 : 60000    // Threshold
);

Monitoring Refresh

Track Refresh Events

let refreshCount = 0;

cache.on('refresh', ({ key, value, error }) => {
  if (error) {
    console.error(`Refresh failed for ${key}:`, error);
    // Old value remains in cache
  } else {
    refreshCount++;
    console.log(`Refreshed ${key} (total: ${refreshCount})`);
  }
});

Measure Refresh Performance

const refreshMetrics = {
  total: 0,
  success: 0,
  failed: 0,
};

cache.on('refresh', ({ key, error }) => {
  refreshMetrics.total++;
  
  if (error) {
    refreshMetrics.failed++;
  } else {
    refreshMetrics.success++;
  }
});

setInterval(() => {
  const successRate = refreshMetrics.total > 0 
    ? (refreshMetrics.success / refreshMetrics.total) * 100 
    : 0;
  
  console.log(`Refresh: ${successRate.toFixed(1)}% success rate`);
}, 60000);

Multi-Store Refresh

Default Behavior (refreshAllStores: false)

const cache = createCache({
  stores: [memoryStore, redisStore],
  refreshAllStores: false, // Default
});

Behavior:

  • Key in L1 (memory) → Refresh L1 only
  • Key in L2 (Redis) → Refresh L1 + L2
  • Reduces write amplification

Refresh All Stores

const cache = createCache({
  stores: [memoryStore, redisStore],
  refreshAllStores: true,
});

Behavior:

  • Always refreshes all stores
  • Ensures consistency
  • Higher write load
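
In both configurations, memoryStore and redisStore are assumed to be Keyv-compatible stores. A minimal wiring sketch, assuming the keyv and @keyv/redis packages and a local Redis instance:

import Keyv from 'keyv';
import KeyvRedis from '@keyv/redis';
import { createCache } from 'cache-manager';

// L1: in-process memory store, L2: shared Redis store behind the Keyv adapter
const memoryStore = new Keyv();
const redisStore = new Keyv({ store: new KeyvRedis('redis://localhost:6379') });

const cache = createCache({
  stores: [memoryStore, redisStore],
  refreshAllStores: false, // or true, per the trade-offs above
});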

Advanced Patterns

Priority-Based Refresh

interface DataWithPriority {
  value: any;
  priority: 'critical' | 'high' | 'normal' | 'low';
}

async function fetchData(): Promise<DataWithPriority> {
  return {
    value: await api.getData(),
    priority: 'high',
  };
}

const getTtl = (data: DataWithPriority) => {
  switch (data.priority) {
    case 'critical': return 600000;  // 10 minutes
    case 'high': return 300000;      // 5 minutes
    case 'normal': return 120000;    // 2 minutes
    case 'low': return 60000;        // 1 minute
  }
};

const getThreshold = (data: DataWithPriority) => {
  // Refresh when 20% of TTL remains
  return getTtl(data) * 0.2;
};

const data = await cache.wrap('key', fetchData, getTtl, getThreshold);

Conditional Refresh

interface APIResponse {
  data: any;
  cacheControl: 'must-revalidate' | 'cache' | 'no-cache';
  maxAge: number;
}

async function fetchAPI(): Promise<APIResponse> {
  const response = await fetch('/api/data');
  return {
    data: await response.json(),
    cacheControl: response.headers.get('cache-control') as any,
    maxAge: 300000,
  };
}

const getTtl = (response: APIResponse) => {
  if (response.cacheControl === 'no-cache') return 0;
  if (response.cacheControl === 'must-revalidate') return 30000;
  return response.maxAge;
};

const getThreshold = (response: APIResponse) => {
  if (response.cacheControl === 'must-revalidate') {
    return response.maxAge * 0.5; // Refresh at 50%
  }
  return response.maxAge * 0.2; // Refresh at 20%
};

await cache.wrap('api:data', fetchAPI, getTtl, getThreshold);

Size-Based Refresh Strategy

interface CachedData {
  value: any;
  size: number;
}

// Larger data gets longer cache (avoid refetch overhead)
const getTtl = (data: CachedData) => {
  if (data.size > 1024 * 1024) { // > 1MB
    return 3600000; // 1 hour
  } else if (data.size > 10 * 1024) { // > 10KB
    return 600000; // 10 minutes
  }
  return 300000; // 5 minutes
};

const getThreshold = (data: CachedData) => {
  // Refresh earlier for large data
  return getTtl(data) * (data.size > 1024 * 1024 ? 0.3 : 0.2);
};
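
As with the other patterns, both functions go straight into wrap(); fetchLargeData below is a hypothetical loader that resolves to a CachedData object:

const data = await cache.wrap('dataset:exports', fetchLargeData, getTtl, getThreshold);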

Error Handling

Refresh Failure Recovery

const refreshFailures = new Map<string, number>();
const MAX_FAILURES = 3;

cache.on('refresh', ({ key, error }) => {
  if (error) {
    const failures = (refreshFailures.get(key) || 0) + 1;
    refreshFailures.set(key, failures);
    
    if (failures >= MAX_FAILURES) {
      console.error(`ALERT: ${key} has failed ${failures} refresh attempts`);
      // Consider disabling refresh for this key
    }
  } else {
    // Success - reset failure count
    refreshFailures.delete(key);
  }
});

Fallback on Refresh Failure

async function fetchWithFallback() {
  try {
    return await api.fetchData();
  } catch (error) {
    console.error('Primary API failed, using fallback:', error);
    return await fallbackAPI.fetchData();
  }
}

await cache.wrap('data', fetchWithFallback, 300000, 60000);
// If refresh fails, old value stays in cache until expiration

Performance Optimization

Prevent Refresh Storms

// Stagger refresh thresholds to avoid simultaneous refreshes
const keys = ['key1', 'key2', 'key3', 'key4', 'key5'];

keys.forEach((key, index) => {
  const baseThreshold = 60000; // 1 minute
  const stagger = index * 10000; // 10 second stagger
  
  // Fire-and-forget: each wrap() call seeds the key with its own staggered threshold
  void cache.wrap(
    key,
    () => fetchData(key),
    300000, // 5 minute TTL
    baseThreshold + stagger // Staggered thresholds
  );
});

Batch Refresh

// Refresh related keys together
async function batchRefresh(keys: string[]) {
  const data = await api.batchFetch(keys); // Single API call
  
  // Populate each individual key...
  await cache.mset(
    keys.map((key, i) => ({
      key,
      value: data[i],
      ttl: 300000,
    }))
  );
  
  // ...and return the batch so wrap() caches a real value under the batch key
  return data;
}

// Use wrap with batch refresh
await cache.wrap('batch:users:1-100', () => batchRefresh(userKeys), 300000, 60000);

Best Practices

1. Choose Appropriate Threshold

// Too low: Key may expire before refresh completes
await cache.wrap('key', slowFunction, 10000, 500); // ❌ Only 0.5s to refresh

// Too high: Refreshes too often
await cache.wrap('key', fastFunction, 300000, 280000); // ❌ Refreshes when 93% of TTL still remains

// Good: threshold of 10-30% of TTL
await cache.wrap('key', fetchData, 300000, 60000); // ✅ Refreshes once 80% of TTL has elapsed

2. Ensure Refresh Can Complete

// If function takes 5s, threshold should be > 5s
async function slowFunction() {
  await new Promise(resolve => setTimeout(resolve, 5000));
  return data;
}

// Allow 10s margin
await cache.wrap('key', slowFunction, 300000, 15000); // ✅ 15s threshold

3. Monitor Refresh Effectiveness

let proactiveRefreshes = 0;
let cacheMisses = 0;

cache.on('refresh', () => {
  proactiveRefreshes++;
});

cache.on('get', ({ key, value }) => {
  if (value === undefined && key.startsWith('refreshable:')) {
    cacheMisses++;
  }
});

// Goal: proactiveRefreshes >> cacheMisses

Troubleshooting

Refresh Not Triggering

Causes:

  1. refreshThreshold >= ttl
  2. Not enough time has passed
  3. No subsequent wrap() calls after threshold reached

Solution:

// Ensure threshold < TTL
await cache.wrap('key', fetchData, 10000, 3000); // ✅ 3s < 10s

// Call wrap() again after threshold
await cache.wrap('key', fetchData, 10000, 3000);
await new Promise(resolve => setTimeout(resolve, 8000));
await cache.wrap('key', fetchData, 10000, 3000); // Triggers refresh

Refresh Happening Too Often

Cause: Threshold too high

Solution:

// Instead of 90% refresh
await cache.wrap('key', fetchData, 300000, 270000); // ❌

// Use 20-30% refresh
await cache.wrap('key', fetchData, 300000, 60000); // ✅

Next Steps

  • Error Handling Guide - Handle refresh failures
  • Function Wrapping Reference - Complete wrap() documentation
  • Event System Reference - Monitor refresh events