```sh
tessl install tessl/npm-cache-manager@7.2.0
```

Cache Manager for Node.js with support for multi-store caching, background refresh, and Keyv-compatible storage adapters.
Flexible Node.js caching module with TypeScript support, multi-store tiered caching, background refresh, and Keyv-compatible storage adapters.
| Feature | Description |
|---|---|
| Multi-Store | Tiered caching (L1 memory, L2 Redis, etc.) |
| Background Refresh | Proactive cache updates before expiration |
| Type Safety | Full TypeScript with generic type preservation |
| Storage Adapters | Any Keyv-compatible store (Redis, MongoDB, SQLite, etc.) |
| Event System | Monitor all cache operations |
| Function Wrapping | Automatic memoization with cache coalescing |
Installation:

```sh
npm install cache-manager
```

For storage adapters:

```sh
npm install keyv @keyv/redis cacheable
```

Quick start:

```ts
import { createCache } from 'cache-manager';

// Create cache
const cache = createCache();
// Basic operations
await cache.set('key', 'value', 60000); // 60 second TTL
const value = await cache.get('key');
await cache.del('key');
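// Batch operations (a usage sketch based on the mset/mget signatures in the API section below)
await cache.mset([
  { key: 'a', value: 1, ttl: 60000 },
  { key: 'b', value: 2, ttl: 60000 },
]);
const values = await cache.mget<number>(['a', 'b']); // [1, 2]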
await cache.clear();
```

API:

```ts
function createCache(options?: CreateCacheOptions): Cache;

type CreateCacheOptions = {
  stores?: Keyv[];            // Array of Keyv store instances
  ttl?: number;               // Default TTL in milliseconds
  refreshThreshold?: number;  // Background refresh threshold
  refreshAllStores?: boolean; // Refresh all stores or only up to found store
  nonBlocking?: boolean;      // Non-blocking multi-store operations
  cacheId?: string;           // Custom cache instance identifier
};

interface Cache {
  // Read operations
  get<T>(key: string): Promise<T | undefined>;
  mget<T>(keys: string[]): Promise<Array<T | undefined>>;
  ttl(key: string): Promise<number | undefined>;

  // Write operations
  set<T>(key: string, value: T, ttl?: number): Promise<T>;
  mset<T>(list: Array<{ key: string; value: T; ttl?: number }>): Promise<Array<{ key: string; value: T; ttl?: number }>>;

  // Delete operations
  del(key: string): Promise<boolean>;
  mdel(keys: string[]): Promise<boolean>;
  clear(): Promise<boolean>;

  // Function wrapping
  wrap<T>(key: string, fnc: () => T | Promise<T>, ttl?: number, refreshThreshold?: number): Promise<T>;
  wrap<T>(key: string, fnc: () => T | Promise<T>, options: WrapOptions<T>): Promise<T>;

  // Event system
  on<E extends keyof Events>(event: E, listener: Events[E]): EventEmitter;
  off<E extends keyof Events>(event: E, listener: Events[E]): EventEmitter;

  // Metadata
  cacheId(): string;
  stores: Keyv[];
  disconnect(): Promise<undefined>;
}
```

Multi-store tiered caching:

```ts
import { createCache } from 'cache-manager';
import { Keyv } from 'keyv';
import KeyvRedis from '@keyv/redis';
import { CacheableMemory } from 'cacheable';
const cache = createCache({
  stores: [
    new Keyv({ store: new CacheableMemory({ ttl: 60000, lruSize: 500 }) }), // L1: Fast
    new Keyv({ store: new KeyvRedis('redis://localhost:6379') }),           // L2: Persistent
  ],
  ttl: 300000,
  refreshThreshold: 60000,
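  // Remaining CreateCacheOptions, shown commented out as a sketch (enable as needed):
  // nonBlocking: true,      // don't wait for every store on multi-store operations
  // refreshAllStores: true, // background-refresh every store, not only up to the store where the value was found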
});
```

Behavior:

- Reads check the stores in order and return the first value found.
- Writes (`set`, `del`, `clear`) go to every store; with `nonBlocking: true` they resolve without waiting for every store to finish.
- Background refresh updates the stores up to the one where the value was found, or every store when `refreshAllStores: true` is set.

Automatic memoization with cache coalescing (prevents thundering herd):

```ts
async function fetchUser(id: number) {
  return await db.users.findById(id);
}

// Wrap function - executes only once for concurrent calls
const user = await cache.wrap('user:123', () => fetchUser(123), 60000);
```

Background Refresh:

```ts
// When remaining TTL < refreshThreshold, triggers background refresh
await cache.wrap('key', fetchData, 10000, 3000);
// After ~7 seconds (remaining TTL < 3000 ms): returns the cached value and refreshes in the background
```

Monitor cache operations:

```ts
cache.on('get', ({ key, value, store, error }) => {
  console.log(value !== undefined ? 'hit' : 'miss');
});

cache.on('set', ({ key, value, error }) => {
  console.log(`Cached: ${key}`);
});

cache.on('refresh', ({ key, value, error }) => {
  console.log(`Background refresh: ${key}`);
});
```

Available Events: `get`, `mget`, `set`, `mset`, `del`, `mdel`, `clear`, `ttl`, `refresh`
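Event payloads include an optional `error` field; a minimal sketch that adds error logging on top of the listeners shown above (the log messages are illustrative):

```ts
cache.on('get', ({ key, error }) => {
  if (error) console.error(`cache get failed for ${key}`, error);
});

cache.on('set', ({ key, error }) => {
  if (error) console.error(`cache set failed for ${key}`, error);
});
```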
```ts
type Events = {
  get<T>(data: { key: string; value?: T; store?: string; error?: unknown }): void;
  set<T>(data: { key: string; value: T; store?: string; error?: unknown }): void;
  refresh<T>(data: { key: string; value: T; error?: unknown }): void;
  // ... other events
};

type WrapOptions<T> = {
  ttl?: number | ((value: T) => number);
  refreshThreshold?: number | ((value: T) => number);
};
```

Express route caching:

```ts
app.get('/api/users/:id', async (req, res) => {
  const user = await cache.wrap(
    `user:${req.params.id}`,
    () => db.users.findById(req.params.id),
    60000
  );
  res.json(user);
});
```

Hit/miss metrics:

```ts
const metrics = { hits: 0, misses: 0 };

cache.on('get', ({ value }) => {
  value !== undefined ? metrics.hits++ : metrics.misses++;
});
```

Graceful shutdown:

```ts
process.on('SIGTERM', async () => {
  await cache.disconnect();
  process.exit(0);
});
```

Performance:

| Operation | In-Memory | Redis | Notes |
|---|---|---|---|
| Single get/set | ~1ms | ~10-50ms | Network latency for Redis |
| Batch operations | ~0.05ms/item | ~1-5ms/item | 10-100x faster than individual |
| Multi-store overhead | Negligible | +10-20% | Proportional to slowest store |
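The batch advantage above comes from amortizing per-operation overhead; a minimal sketch replacing per-key lookups with a single `mget`, assuming the `cache` instance from the examples above (the `User` type and key list are illustrative):

```ts
type User = { id: string; name: string };
const keys = ['user:1', 'user:2', 'user:3'];

// One store round-trip per key
const oneByOne = await Promise.all(keys.map((key) => cache.get<User>(key)));

// The whole key set in a single batched call
const batched = await cache.mget<User>(keys);
```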
Cache with Fallback:
```ts
const value = await cache.get('key') ?? await fetchFromSource();
```

Batch Loading:

```ts
const values = await cache.mget(keys);
const missing = keys.filter((_, i) => values[i] === undefined);
// Fetch missing, then cache
```

Conditional Caching:

```ts
const ttl = (data) => data.priority === 'high' ? 3600000 : 60000;
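// The options-object overload of wrap (see WrapOptions above) accepts the same
// value-based functions for both ttl and refreshThreshold, e.g. (sketch):
// await cache.wrap('key', fetchData, { ttl, refreshThreshold: () => 30000 });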
await cache.wrap('key', fetchData, ttl);
```

Troubleshooting:

| Issue | Solution |
|---|---|
| Cache not working | Verify a TTL is set (default or per-operation) |
| Stale data | Check the `refreshAllStores` setting and the refresh threshold |
| High memory | Configure LRU eviction: `new CacheableMemory({ lruSize: 1000 })` |
| Refresh not triggering | Ensure `refreshThreshold` < `ttl` and enough time has passed |
See Troubleshooting Guide for detailed solutions.
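For the high-memory case, a sketch of capping the in-memory store with LRU eviction (the 1000-entry limit is illustrative, following the troubleshooting row above):

```ts
import { createCache } from 'cache-manager';
import { Keyv } from 'keyv';
import { CacheableMemory } from 'cacheable';

// Least-recently-used entries are evicted once the store holds lruSize items
const boundedCache = createCache({
  stores: [new Keyv({ store: new CacheableMemory({ ttl: 60000, lruSize: 1000 }) })],
});
```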
Use KeyvAdapter to wrap legacy stores:
```ts
import { createCache, KeyvAdapter } from 'cache-manager';
import { Keyv } from 'keyv';
import { redisStore } from 'cache-manager-redis-yet';
const legacyStore = await redisStore(/* config */);
const adapter = new KeyvAdapter(legacyStore);
const cache = createCache({ stores: [new Keyv({ store: adapter })] });
```

See Legacy Adapter Guide for details.
For issues and questions: