
Troubleshooting Guide

Solutions to common cache-manager issues, with diagnosis steps and fixes.

Cache Not Working

Issue: Values Not Being Cached

Symptoms:

  • Function executes every time despite using wrap()
  • get() always returns undefined

Possible Causes & Solutions:

1. No TTL Set

// ❌ Problem: No TTL specified
const cache = createCache();
await cache.set('key', 'value'); // May not persist

// ✅ Solution: Set TTL
const cache = createCache({ ttl: 60000 });
await cache.set('key', 'value'); // Uses default TTL

// Or per-operation
await cache.set('key', 'value', 60000);

2. TTL Too Short

// ❌ Problem: Expires immediately
await cache.set('key', 'value', 0);

// ✅ Solution: Use appropriate TTL
await cache.set('key', 'value', 60000); // 60 seconds

3. Values Not Serializable

// ❌ Problem: Function can't be serialized
await cache.set('func', () => 'hello'); // Error

// ✅ Solution: Only cache serializable values
await cache.set('data', { value: 'hello' }); // Works
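
If the symptom is wrap() executing every time rather than set() not persisting, the same TTL rules apply to wrap(): pass the TTL explicitly. A small sketch reusing the wrap() signature shown elsewhere in this guide; fetchUser is an illustrative placeholder:

// Pass a TTL to wrap() as well; otherwise the configured default (if any) applies
const user = await cache.wrap('user:123', () => fetchUser('123'), 60000);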

Issue: Cache Always Misses

Symptoms:

  • get() returns undefined immediately after set()

Diagnosis:

// Check if set succeeded
await cache.set('test', 'value', 60000);
const value = await cache.get('test');
console.log('Retrieved:', value); // Should be 'value'

// Check TTL
const ttl = await cache.ttl('test');
console.log('TTL:', ttl); // Should be future timestamp

// Check store connection
cache.on('set', ({ error }) => {
  if (error) console.error('Set failed:', error);
});

Multi-Store Issues

Issue: Inconsistent Data Across Stores

Symptoms:

  • Different values in memory vs Redis
  • Unexpected stale data

Cause: Using nonBlocking: true or refreshAllStores: false

Solution:

// For strong consistency
const cache = createCache({
  stores: [memoryStore, redisStore],
  nonBlocking: false,      // Wait for all stores
  refreshAllStores: true,  // Refresh all stores
});

Issue: L1 Cache Always Empty

Symptoms:

  • Memory cache hit rate very low
  • Always fetching from Redis (L2)

Causes & Solutions:

1. LRU Size Too Small

// ❌ Problem
new CacheableMemory({ lruSize: 10 }) // Only 10 items

// ✅ Solution
new CacheableMemory({ lruSize: 1000 }) // 1000 items

2. TTL Too Short

// ❌ Problem: Expires too quickly
new CacheableMemory({ ttl: 1000 }) // 1 second

// ✅ Solution: Longer TTL for L1
new CacheableMemory({ ttl: 60000 }) // 1 minute

3. Check Hit Rate

let l1Hits = 0, l2Hits = 0, misses = 0;

cache.on('get', ({ value, store }) => {
  if (value === undefined) misses++;
  else if (store === 'primary') l1Hits++;
  else l2Hits++;
});

console.log({ l1Hits, l2Hits, misses });
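
To turn those counters into an actual hit rate, a small follow-up sketch (plain JavaScript built on the counters above, not a cache-manager API):

// Report the L1 hit rate periodically from the counters above
setInterval(() => {
  const total = l1Hits + l2Hits + misses;
  if (total > 0) {
    console.log(`L1 hit rate: ${((l1Hits / total) * 100).toFixed(1)}%`);
  }
}, 60000);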

Performance Issues

Issue: Slow Cache Operations

Symptoms:

  • Cache operations taking longer than expected
  • Application slowdown when using cache

Diagnosis:

// Measure cache operation time
console.time('cache-get');
await cache.get('key');
console.timeEnd('cache-get');

// Should be:
// - In-memory: < 1ms
// - Redis: 1-10ms
// - Network Redis: 10-50ms

Solutions:

1. Use Batch Operations

// ❌ Slow: Individual operations
for (const id of ids) {
  await cache.get(`user:${id}`); // 1ms each = 100ms for 100 items
}

// ✅ Fast: Batch operation
await cache.mget(ids.map(id => `user:${id}`)); // ~2ms for 100 items

2. Add Memory Tier

// ❌ Slow: Redis only
const cache = createCache({
  stores: [new Keyv({ store: new KeyvRedis(redisUrl) })],
});

// ✅ Fast: Memory + Redis
const cache = createCache({
  stores: [
    new Keyv({ store: new CacheableMemory({ lruSize: 1000 }) }),
    new Keyv({ store: new KeyvRedis(redisUrl) }),
  ],
});

3. Enable Non-Blocking Mode

// For read-heavy workloads
const cache = createCache({
  stores: [memoryStore, redisStore],
  nonBlocking: true, // Don't wait for all stores
});

Issue: High Memory Usage

Symptoms:

  • Node.js process memory constantly growing
  • Out of memory errors

Solutions:

1. Configure LRU Eviction

// ❌ Problem: Unlimited cache growth
const cache = createCache({
  stores: [new Keyv()], // No size limit
});

// ✅ Solution: Set LRU limit
const cache = createCache({
  stores: [
    new Keyv({ 
      store: new CacheableMemory({ lruSize: 1000 })
    }),
  ],
});

2. Calculate Appropriate Size

const avgItemSize = 1024; // 1KB per item
const memoryBudget = 50 * 1024 * 1024; // 50MB
const lruSize = Math.floor(memoryBudget / avgItemSize);

console.log(`LRU size: ${lruSize} items`);

3. Monitor Memory Usage

setInterval(() => {
  const usage = process.memoryUsage();
  console.log({
    heapUsed: `${Math.round(usage.heapUsed / 1024 / 1024)}MB`,
    heapTotal: `${Math.round(usage.heapTotal / 1024 / 1024)}MB`,
  });
}, 60000);

Connection Issues

Issue: "Connection Refused" or "ECONNREFUSED"

Symptoms:

  • Redis connection errors
  • Cache operations fail immediately

Solutions:

1. Check Redis is Running

# Test Redis connection
redis-cli ping
# Should return: PONG

# If not running:
redis-server

2. Verify Connection String

// Check Redis URL
console.log('Redis URL:', process.env.REDIS_URL);

// Test connection
import Redis from 'ioredis';
const redis = new Redis(process.env.REDIS_URL);

redis.on('connect', () => console.log('Connected'));
redis.on('error', (err) => console.error('Error:', err));

3. Add Retry Logic

const cache = createCache({
  stores: [
    new Keyv({
      store: new KeyvRedis(redisUrl, {
        maxRetriesPerRequest: 3,
        retryStrategy: (times) => Math.min(times * 50, 2000),
      }),
    }),
  ],
});

Issue: Connection Timeouts

Symptoms:

  • Operations hang and timeout
  • Intermittent failures

Solutions:

1. Configure Timeouts

const cache = createCache({
  stores: [
    new Keyv({
      store: new KeyvRedis(redisUrl, {
        connectTimeout: 5000,  // 5 second connect timeout
        commandTimeout: 3000,  // 3 second command timeout
      }),
    }),
  ],
});

2. Monitor Connection State

cache.on('set', ({ error }) => {
  if (error && error.message.includes('timeout')) {
    console.error('Redis timeout, check network/load');
  }
});

Background Refresh Issues

Issue: Refresh Not Triggering

Symptoms:

  • No refresh events being emitted
  • Cache expires instead of refreshing

Causes & Solutions:

1. Threshold >= TTL

// ❌ Problem
await cache.wrap('key', fetchData, 10000, 15000);
// refreshThreshold (15s) >= ttl (10s)

// ✅ Solution
await cache.wrap('key', fetchData, 10000, 3000);
// refreshThreshold (3s) < ttl (10s)

2. Not Enough Time Passed

// Refresh only triggers when:
// - Entry exists
// - wrap() is called
// - remaining TTL < refreshThreshold

await cache.wrap('key', fetchData, 10000, 3000);
// Wait until the remaining TTL drops below the 3s threshold (7+ seconds here)
await new Promise(resolve => setTimeout(resolve, 7500));
await cache.wrap('key', fetchData, 10000, 3000); // Triggers refresh

3. Monitor Refresh Events

cache.on('refresh', ({ key, error }) => {
  if (error) {
    console.error(`Refresh failed for ${key}:`, error);
  } else {
    console.log(`Refreshed ${key}`);
  }
});

Issue: Refresh Happening Too Often

Symptoms:

  • High load from refresh operations
  • Function executing too frequently

Solution:

// ❌ Problem: Threshold too high (90% of TTL)
await cache.wrap('key', fetchData, 10000, 9000);

// ✅ Solution: Threshold 10-30% of TTL
await cache.wrap('key', fetchData, 10000, 2000); // 20%
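
To keep that ratio intact when the TTL changes, you can derive the threshold instead of hard-coding both values (plain arithmetic, not a library feature):

// Derive refreshThreshold as ~20% of the TTL so the ratio survives TTL changes
const ttl = 10_000;
const refreshThreshold = Math.round(ttl * 0.2);

await cache.wrap('key', fetchData, ttl, refreshThreshold);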

Wrap Function Issues

Issue: Function Still Executing Multiple Times

Symptoms:

  • Expected cache coalescing not happening
  • Duplicate function executions

Possible Causes:

1. Different Cache Instances

// ❌ Problem: Multiple caches don't coalesce
const cache1 = createCache();
const cache2 = createCache();

await cache1.wrap('key', expensiveFn, 60000); // Executes
await cache2.wrap('key', expensiveFn, 60000); // Executes again

// ✅ Solution: Use a single shared cache instance
const cache = createCache();
await cache.wrap('key', expensiveFn, 60000); // Concurrent calls coalesce to one execution

2. Errors Not Cached

// Errors are never cached
async function failing() {
  throw new Error('Fail');
}

await cache.wrap('key', failing, 60000); // Throws
await cache.wrap('key', failing, 60000); // Throws again
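
If a flaky upstream means the function re-executes on every call, one possible mitigation (a sketch, not a built-in cache-manager feature; fetchData and the fallback shape are illustrative) is to catch inside the wrapped function and cache a short-lived fallback value:

// Hypothetical wrapper: on failure, return a fallback that gets cached briefly
// so the failing upstream is not hit on every request.
async function fetchDataOrFallback() {
  try {
    return await fetchData();
  } catch (error) {
    console.error('fetchData failed, caching fallback:', error);
    return { unavailable: true }; // cached like any other value
  }
}

// Use a shorter TTL than usual so the fallback is retried soon
await cache.wrap('key', fetchDataOrFallback, 5000);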

Type Issues

Issue: TypeScript Errors with Generic Types

Solution:

// ❌ Problem: Type not preserved
const user = await cache.get('user:123');
console.log(user.name); // Error: user might be undefined

// ✅ Solution 1: Explicit type
const user = await cache.get<User>('user:123');
if (user) {
  console.log(user.name); // OK
}

// ✅ Solution 2: Type assertion
const user = (await cache.get('user:123')) as User;
console.log(user.name); // OK (but unsafe)

Issue: Serialization Errors

Symptoms:

  • TypeError: Converting circular structure to JSON
  • Values not stored correctly

Solutions:

1. Break Circular References

const circular: any = { a: 1 };
circular.self = circular;

// ❌ Problem
await cache.set('bad', circular); // Error

// ✅ Solution
const safe = { a: circular.a }; // Omit circular.self
await cache.set('good', safe); // Works

2. Custom Serializer

import { Keyv } from 'keyv';

const cache = createCache({
  stores: [
    new Keyv({
      serialize: (data) => {
        // Track objects seen during this serialization pass to break cycles
        const seen = new WeakSet();
        return JSON.stringify(data, (key, value) => {
          if (typeof value === 'object' && value !== null) {
            if (seen.has(value)) return '[Circular]';
            seen.add(value);
          }
          return value;
        });
      },
      deserialize: JSON.parse,
    }),
  ],
});

Event Listener Issues

Issue: Event Listeners Not Firing

Symptoms:

  • Registered listeners not being called

Solutions:

1. Verify Event Name

// ❌ Problem: Wrong event name
cache.on('gets', handler); // No such event

// ✅ Solution: Correct event name
cache.on('get', handler); // Correct

2. Check Listener Signature

// ❌ Problem: Wrong signature
cache.on('get', (key: string) => {
  console.log(key); // Wrong - receives object, not string
});

// ✅ Solution: Correct signature
cache.on('get', ({ key, value }) => {
  console.log(key, value); // Correct
});

Issue: Memory Leak from Event Listeners

Symptoms:

  • Memory usage grows over time
  • Many listeners registered

Solution:

// Always remove listeners when done
const listener = ({ key, value }) => {
  console.log(key, value);
};

cache.on('get', listener);

// Later, cleanup
cache.off('get', listener);

Diagnostic Tools

Enable Debug Logging

// Log all cache operations
cache.on('get', ({ key, value, store }) => {
  console.log(`GET ${key}: ${value !== undefined ? 'HIT' : 'MISS'} ${store || ''}`);
});

cache.on('set', ({ key, value }) => {
  console.log(`SET ${key}: ${JSON.stringify(value)?.slice(0, 50) ?? ''}`); // guard: JSON.stringify can return undefined
});

cache.on('del', ({ key }) => {
  console.log(`DEL ${key}`);
});

Cache Health Check

async function checkCacheHealth() {
  const testKey = 'health-check';
  const testValue = { timestamp: Date.now() };
  
  try {
    // Test write
    await cache.set(testKey, testValue, 60000);
    
    // Test read
    const retrieved = await cache.get(testKey);
    if (!retrieved) {
      throw new Error('Write succeeded but read failed');
    }
    
    // Test delete
    await cache.del(testKey);
    
    console.log('✓ Cache health check passed');
    return true;
  } catch (error) {
    console.error('✗ Cache health check failed:', error);
    return false;
  }
}

// Run periodically
setInterval(checkCacheHealth, 60000);

Performance Profiling

const perfMetrics = {
  get: [] as number[],
  set: [] as number[],
  wrap: [] as number[],
};

// Measure operation times
async function profiledGet<T>(key: string): Promise<T | undefined> {
  const start = performance.now();
  const result = await cache.get<T>(key);
  const duration = performance.now() - start;
  
  perfMetrics.get.push(duration);
  
  return result;
}

// Analyze metrics
function analyzePerformance() {
  const avg = (arr: number[]) => 
    arr.length > 0 ? arr.reduce((a, b) => a + b, 0) / arr.length : 0;
  
  console.log({
    avgGet: `${avg(perfMetrics.get).toFixed(2)}ms`,
    avgSet: `${avg(perfMetrics.set).toFixed(2)}ms`,
    avgWrap: `${avg(perfMetrics.wrap).toFixed(2)}ms`,
  });
}
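
The set and wrap arrays are populated the same way; for example, a matching wrapper for set (same pattern as profiledGet above; the name is illustrative):

async function profiledSet<T>(key: string, value: T, ttl?: number): Promise<void> {
  const start = performance.now();
  await cache.set(key, value, ttl);
  perfMetrics.set.push(performance.now() - start);
}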

Getting Help

Gather Diagnostic Information

// Cache configuration
console.log('Cache ID:', cache.cacheId());
console.log('Store count:', cache.stores.length);

// Environment
console.log('Node version:', process.version);
console.log('Platform:', process.platform);
console.log('Memory:', process.memoryUsage());

// Package versions
console.log('cache-manager version:', require('cache-manager/package.json').version);
console.log('keyv version:', require('keyv/package.json').version);

Enable Verbose Logging

// Log all events
const events: Array<keyof Events> = [
  'get', 'mget', 'set', 'mset', 
  'del', 'mdel', 'clear', 'ttl', 'refresh'
];

events.forEach(event => {
  cache.on(event, (data) => {
    console.log(`[${event}]`, data);
  });
});

Next Steps

  • Configuration Reference - Verify configuration
  • Core Operations Reference - API documentation
  • Error Handling Guide - Error patterns