Cache layers for the Metro bundler: a multi-layered caching system supporting local file storage and remote HTTP caching
Core cache orchestration for Metro: a coordinator that traverses multiple cache stores sequentially, checking faster stores first, propagating values back to the stores that missed, and reporting errors through Metro's logging system.
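The sequential traversal described above can be sketched as follows. This is a simplified, hypothetical illustration (the MultiStoreCache name is invented here), not Metro's actual implementation:

```javascript
// Simplified sketch of the traversal: check each store in order and, on a
// hit, write the value back to the faster stores that missed, so later
// reads succeed earlier in the chain. Not Metro's actual source.
class MultiStoreCache {
  constructor(stores) {
    this.stores = stores;
  }

  async get(key) {
    for (let i = 0; i < this.stores.length; i++) {
      const value = await this.stores[i].get(key);
      if (value != null) {
        // Propagate the hit back to the stores checked before this one.
        await Promise.all(
          this.stores.slice(0, i).map((store) => store.set(key, value))
        );
        return value;
      }
    }
    return null; // miss in every store
  }
}
```

A slower, shared store (such as an HTTP cache) can thus transparently warm a faster local store on the first hit.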
/**
 * Main cache class that receives an array of cache instances and sequentially
 * traverses them to return a previously stored value. Ensures the value is
 * set in all instances, with comprehensive error handling.
 * @template T - Type of cached values
 */
class Cache<T> {
  /**
   * Create a new cache instance with multiple stores
   * @param stores - Array of cache store instances to traverse sequentially
   */
  constructor(stores: Array<CacheStore<T>>);

  /**
   * Retrieve a cached value by key, checking stores sequentially
   * @param key - Cache key as Buffer
   * @returns Promise resolving to the cached value, or null if not found
   */
  get(key: Buffer): Promise<T | null>;

  /**
   * Store a value in all cache stores (up to the store that had the cached value)
   * @param key - Cache key as Buffer
   * @param value - Value to cache
   * @returns Promise that resolves when all store operations complete
   * @throws AggregateError if any store write operation fails
   */
  set(key: Buffer, value: T): Promise<void>;

  /**
   * Check whether the cache is disabled (has no stores)
   * @returns true if the cache has no stores configured
   */
  get isDisabled(): boolean;
}
Usage Examples:
const { Cache, FileStore, HttpStore } = require("metro-cache");

// Create a cache with multiple stores (checked in order)
const cache = new Cache([
  new FileStore({ root: "./local-cache" }),
  new HttpStore({ endpoint: "https://shared-cache.example.com" }),
]);

async function getOrCompute() {
  // The get operation checks FileStore first, then HttpStore
  const key = Buffer.from("my-cache-key");
  const result = await cache.get(key);
  if (result === null) {
    // Cache miss - compute the value
    const computedValue = { data: "processed content" };
    // The set operation stores in FileStore (the faster store that didn't have it)
    await cache.set(key, computedValue);
    return computedValue;
  }
  // Cache hit - return the cached value
  return result;
}
Contract that all cache store implementations must follow to be compatible with the Cache orchestrator.
/**
 * Interface that all cache store implementations must implement
 * @template T - Type of values stored in the cache
 */
interface CacheStore<T> {
  /**
   * Optional name for the store (used in logging)
   */
  name?: string;

  /**
   * Retrieve a cached value by key
   * @param key - Cache key as Buffer
   * @returns Cached value, null if not found, or a Promise resolving to either
   */
  get(key: Buffer): T | null | Promise<T | null>;

  /**
   * Store a value by key
   * @param key - Cache key as Buffer
   * @param value - Value to store
   * @returns void, or a Promise that resolves when storage is complete
   */
  set(key: Buffer, value: T): void | Promise<void>;

  /**
   * Clear all cached values
   * @returns void, or a Promise that resolves when clearing is complete
   */
  clear(): void | Promise<void>;
}
The Cache class provides comprehensive error handling:
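As a sketch of a conforming store, here is a minimal in-memory implementation of the CacheStore contract. The InMemoryStore name is a hypothetical example, not part of metro-cache:

```javascript
// Hypothetical in-memory store satisfying the CacheStore contract.
// Keys are Buffers, so they are normalized to hex strings for Map lookup.
class InMemoryStore {
  constructor() {
    this.name = "InMemoryStore"; // optional, used in logging
    this.map = new Map();
  }

  get(key) {
    const hit = this.map.get(key.toString("hex"));
    return hit === undefined ? null : hit;
  }

  set(key, value) {
    this.map.set(key.toString("hex"), value);
  }

  clear() {
    this.map.clear();
  }
}
```

Because get, set, and clear may return either plain values or promises, a synchronous store like this can be mixed freely with asynchronous stores such as HttpStore.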
Error Handling Example:
const cache = new Cache([fileStore, flakyHttpStore]);
try {
  await cache.set(key, value);
} catch (error) {
  if (error instanceof AggregateError) {
    console.log(`Write failed for ${error.errors.length} stores`);
    for (const storeError of error.errors) {
      console.error("Store error:", storeError.message);
    }
  }
}
The Cache class includes several performance optimizations: