tessl/maven-com-github-ben-manes-caffeine--caffeine

A high-performance caching library for Java, providing a Guava-inspired API, advanced eviction policies, and a comprehensive feature set.

Functional Interfaces

Caffeine provides several functional interfaces for customizing cache behavior, including value loading, removal notifications, entry weighing, custom expiration policies, and time/scheduling abstractions.

CacheLoader Interface

The CacheLoader interface enables automatic value computation for loading caches.

@FunctionalInterface
public interface CacheLoader<K, V> extends AsyncCacheLoader<K, V> {
    // Primary loading method
    V load(K key) throws Exception;
    
    // Bulk loading method (optional override)
    default Map<? extends K, ? extends V> loadAll(Set<? extends K> keys) throws Exception {
        throw new UnsupportedOperationException();
    }
    
    // Inherited async methods (default implementations omitted here)
    default CompletableFuture<? extends V> asyncLoad(K key, Executor executor) throws Exception;
    default CompletableFuture<? extends Map<? extends K, ? extends V>> asyncLoadAll(
        Set<? extends K> keys, Executor executor) throws Exception;
    default CompletableFuture<? extends V> asyncReload(K key, V oldValue, Executor executor) throws Exception;
    
    // Static factory method for bulk loading
    static <K, V> CacheLoader<K, V> bulk(
        Function<? super Set<? extends K>, ? extends Map<? extends K, ? extends V>> mappingFunction);
}

Basic CacheLoader Implementation

// Simple lambda-based loader
CacheLoader<String, String> simpleLoader = key -> "loaded_" + key.toUpperCase();

LoadingCache<String, String> cache = Caffeine.newBuilder()
    .maximumSize(1000)
    .build(simpleLoader);

String value = cache.get("test"); // Returns "loaded_TEST"

Advanced CacheLoader with Error Handling

CacheLoader<String, UserData> userLoader = new CacheLoader<String, UserData>() {
    @Override
    public UserData load(String userId) throws Exception {
        // Simulate database lookup with potential failures
        if (userId.startsWith("invalid_")) {
            throw new IllegalArgumentException("Invalid user ID: " + userId);
        }
        
        if (userId.equals("slow_user")) {
            Thread.sleep(2000); // Simulate slow operation
        }
        
        return new UserData(userId, "User " + userId, "user" + userId + "@example.com");
    }
    
    @Override
    public Map<String, UserData> loadAll(Set<? extends String> userIds) throws Exception {
        Map<String, UserData> result = new HashMap<>();
        
        // Efficient bulk loading from database
        List<UserData> users = database.batchFetchUsers(new ArrayList<>(userIds));
        for (UserData user : users) {
            result.put(user.getId(), user);
        }
        
        return result;
    }
};

LoadingCache<String, UserData> userCache = Caffeine.newBuilder()
    .maximumSize(10000)
    .expireAfterWrite(Duration.ofMinutes(30))
    .recordStats()
    .build(userLoader);

// Single load
UserData user = userCache.get("user123");

// Bulk load - uses efficient loadAll implementation
Map<String, UserData> users = userCache.getAll(Set.of("user1", "user2", "user3"));

CacheLoader Bulk Factory Method

// Create a bulk-only loader using the static factory method
CacheLoader<String, Product> bulkProductLoader = CacheLoader.bulk(productIds -> {
    // Only the bulk path is implemented - single loads delegate to it
    return productService.fetchProducts(productIds)
        .stream()
        .collect(Collectors.toMap(Product::getId, p -> p));
});

LoadingCache<String, Product> productCache = Caffeine.newBuilder()
    .maximumSize(5000)
    .expireAfterWrite(Duration.ofHours(1))
    .build(bulkProductLoader);

// Both single and bulk loads work efficiently
Product single = productCache.get("prod123");  // Delegates to bulk load with single key
Map<String, Product> bulk = productCache.getAll(Set.of("prod1", "prod2", "prod3"));

AsyncCacheLoader Interface

The AsyncCacheLoader interface provides asynchronous value loading for async caches.

@FunctionalInterface
public interface AsyncCacheLoader<K, V> {
    // Primary async loading method
    CompletableFuture<? extends V> asyncLoad(K key, Executor executor) throws Exception;
    
    // Bulk async loading (optional override)
    default CompletableFuture<? extends Map<? extends K, ? extends V>> asyncLoadAll(
        Set<? extends K> keys, Executor executor) throws Exception {
        throw new UnsupportedOperationException();
    }
    
    // Async reload for refresh operations (optional override)
    default CompletableFuture<? extends V> asyncReload(
        K key, V oldValue, Executor executor) throws Exception {
        return asyncLoad(key, executor);
    }
    
    // Static factory methods for bulk loading
    static <K, V> AsyncCacheLoader<K, V> bulk(
        Function<? super Set<? extends K>, ? extends Map<? extends K, ? extends V>> mappingFunction);
    static <K, V> AsyncCacheLoader<K, V> bulk(
        BiFunction<? super Set<? extends K>, ? super Executor, 
                   ? extends CompletableFuture<? extends Map<? extends K, ? extends V>>> mappingFunction);
}

AsyncCacheLoader Implementation

AsyncCacheLoader<String, String> asyncLoader = (key, executor) -> {
    return CompletableFuture.supplyAsync(() -> {
        // Simulate async computation
        try {
            Thread.sleep(100);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            throw new RuntimeException(e);
        }
        return "async_loaded_" + key;
    }, executor);
};

AsyncLoadingCache<String, String> asyncCache = Caffeine.newBuilder()
    .maximumSize(1000)
    .buildAsync(asyncLoader);

CompletableFuture<String> future = asyncCache.get("test");
String value = future.join(); // "async_loaded_test"
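An AsyncLoadingCache also exposes a blocking view over the same entries via synchronous(). A minimal sketch (the loader and cache names here are illustrative):

```java
import com.github.benmanes.caffeine.cache.AsyncLoadingCache;
import com.github.benmanes.caffeine.cache.Caffeine;
import com.github.benmanes.caffeine.cache.LoadingCache;

public class SynchronousViewExample {
    public static void main(String[] args) {
        // The synchronous() view shares entries with the async API, so a value
        // loaded through either side is visible through the other.
        AsyncLoadingCache<String, String> asyncCache = Caffeine.newBuilder()
            .maximumSize(1_000)
            .buildAsync(key -> "async_loaded_" + key);

        LoadingCache<String, String> syncView = asyncCache.synchronous();
        String value = syncView.get("test"); // blocks until the loader completes
        System.out.println(value);           // "async_loaded_test"
    }
}
```

The blocking view is convenient when most callers are synchronous but a few hot paths want CompletableFuture composition.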

AsyncCacheLoader with Bulk Loading

AsyncCacheLoader<String, ProductData> productLoader = new AsyncCacheLoader<String, ProductData>() {
    @Override
    public CompletableFuture<ProductData> asyncLoad(String productId, Executor executor) {
        return CompletableFuture.supplyAsync(() -> {
            return productService.fetchProduct(productId);
        }, executor);
    }
    
    @Override
    public CompletableFuture<Map<String, ProductData>> asyncLoadAll(
            Set<? extends String> productIds, Executor executor) {
        return CompletableFuture.supplyAsync(() -> {
            // Efficient bulk loading
            return productService.batchFetchProducts(new ArrayList<>(productIds))
                .stream()
                .collect(Collectors.toMap(ProductData::getId, p -> p));
        }, executor);
    }
    
    @Override
    public CompletableFuture<ProductData> asyncReload(
            String productId, ProductData oldValue, Executor executor) {
        return CompletableFuture.supplyAsync(() -> {
            // Custom refresh logic - may use old value for optimization
            if (oldValue.getLastModified().isAfter(Instant.now().minus(Duration.ofMinutes(5)))) {
                return oldValue; // Recent enough, no need to refresh
            }
            return productService.fetchProduct(productId);
        }, executor);
    }
};

AsyncCacheLoader Bulk Factory Methods

// Synchronous bulk loader
AsyncCacheLoader<String, Product> syncBulkLoader = AsyncCacheLoader.bulk(
    productIds -> {
        // Synchronous bulk loading - will be wrapped in CompletableFuture
        return productService.fetchProductsSync(productIds)
            .stream()
            .collect(Collectors.toMap(Product::getId, p -> p));
    }
);

// Asynchronous bulk loader with executor
AsyncCacheLoader<String, Product> asyncBulkLoader = AsyncCacheLoader.bulk(
    (productIds, executor) -> {
        return CompletableFuture.supplyAsync(() -> {
            // Long-running bulk operation using provided executor
            return productService.fetchProductsBatch(productIds)
                .stream()
                .collect(Collectors.toMap(Product::getId, p -> p));
        }, executor);
    }
);

AsyncLoadingCache<String, Product> bulkCache = Caffeine.newBuilder()
    .maximumSize(10_000)
    .buildAsync(asyncBulkLoader);

// Efficient bulk loading through factory method
CompletableFuture<Map<String, Product>> productsFuture = 
    bulkCache.getAll(Set.of("prod1", "prod2", "prod3"));

RemovalListener Interface

The RemovalListener interface provides notifications when entries are removed from the cache.

@FunctionalInterface
public interface RemovalListener<K, V> {
    void onRemoval(K key, V value, RemovalCause cause);
}

Basic RemovalListener

RemovalListener<String, String> basicListener = (key, value, cause) -> {
    System.out.println("Removed: " + key + " -> " + value + " (cause: " + cause + ")");
};

Cache<String, String> cache = Caffeine.newBuilder()
    .maximumSize(100)
    .removalListener(basicListener)
    .build();

cache.put("key1", "value1");
cache.invalidate("key1"); // Eventually prints (the listener runs asynchronously): Removed: key1 -> value1 (cause: EXPLICIT)
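Because removal notifications are delivered on the cache's executor, tests that need to observe them deterministically can supply a same-thread executor. A sketch, assuming a direct executor is acceptable for the workload:

```java
import com.github.benmanes.caffeine.cache.Cache;
import com.github.benmanes.caffeine.cache.Caffeine;
import com.github.benmanes.caffeine.cache.RemovalListener;
import java.util.concurrent.atomic.AtomicReference;

public class DeterministicRemovalExample {
    public static void main(String[] args) {
        // A same-thread executor makes the (normally asynchronous) removal
        // notification observable immediately after invalidate() returns.
        AtomicReference<String> lastRemovedKey = new AtomicReference<>();
        RemovalListener<String, String> listener =
            (key, value, cause) -> lastRemovedKey.set(key);

        Cache<String, String> cache = Caffeine.newBuilder()
            .executor(Runnable::run) // deliver notifications on the calling thread
            .removalListener(listener)
            .build();

        cache.put("key1", "value1");
        cache.invalidate("key1");
        System.out.println(lastRemovedKey.get()); // "key1"
    }
}
```

Note that a direct executor also runs cache maintenance on the caller's thread, so this trade-off is best reserved for tests.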

Advanced RemovalListener with Resource Cleanup

RemovalListener<String, DatabaseConnection> connectionListener = (key, connection, cause) -> {
    System.out.println("Connection removed: " + key + " (cause: " + cause + ")");
    
    if (connection != null) {
        try {
            connection.close();
            System.out.println("Connection closed for: " + key);
        } catch (Exception e) {
            System.err.println("Failed to close connection for " + key + ": " + e.getMessage());
        }
    }
    
    // Log different removal causes
    switch (cause) {
        case EXPIRED:
            metricsCollector.increment("cache.connection.expired");
            break;
        case SIZE:
            metricsCollector.increment("cache.connection.evicted"); 
            System.out.println("WARNING: Connection evicted due to cache size limit");
            break;
        case EXPLICIT:
            metricsCollector.increment("cache.connection.manual_removal");
            break;
        case COLLECTED:
            metricsCollector.increment("cache.connection.gc_collected");
            System.out.println("Connection garbage collected: " + key);
            break;
        case REPLACED:
            metricsCollector.increment("cache.connection.replaced");
            break;
    }
};

Cache<String, DatabaseConnection> connectionCache = Caffeine.newBuilder()
    .maximumSize(50)
    .expireAfterAccess(Duration.ofMinutes(30))
    .removalListener(connectionListener)
    .build();

RemovalListener vs EvictionListener

// RemovalListener - called asynchronously for ALL removals
RemovalListener<String, String> removalListener = (key, value, cause) -> {
    System.out.println("REMOVAL: " + key + " (cause: " + cause + ")");
    // Heavy operations like logging, cleanup, external notifications
    slowExternalService.notifyRemoval(key, value, cause);
};

// EvictionListener - called synchronously, and only for evicted entries
RemovalListener<String, String> evictionListener = (key, value, cause) -> {
    // cause.wasEvicted() is always true here, so no filtering is needed
    System.out.println("EVICTION: " + key + " (cause: " + cause + ")");
    // Lightweight operations only - this runs inline with the cache operation
    quickMetrics.recordEviction(cause);
};

Cache<String, String> cache = Caffeine.newBuilder()
    .maximumSize(100)
    .removalListener(removalListener)    // Async, all removals
    .evictionListener(evictionListener)  // Sync, evictions only
    .build();

Weigher Interface

The Weigher interface calculates custom weights for cache entries used in weight-based eviction.

@FunctionalInterface
public interface Weigher<K, V> {
    int weigh(K key, V value);
    
    // Static factory methods
    static <K, V> Weigher<K, V> singletonWeigher();
    static <K, V> Weigher<K, V> boundedWeigher(Weigher<K, V> delegate);
}
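As a point of reference, singletonWeigher() assigns every entry a weight of one, which makes maximumWeight behave like maximumSize. A brief sketch:

```java
import com.github.benmanes.caffeine.cache.Cache;
import com.github.benmanes.caffeine.cache.Caffeine;
import com.github.benmanes.caffeine.cache.Weigher;

public class SingletonWeigherExample {
    public static void main(String[] args) {
        // With the singleton weigher every entry weighs 1, so a maximum
        // weight of 100 bounds the cache to roughly 100 entries.
        Cache<String, String> unitWeightCache = Caffeine.newBuilder()
            .maximumWeight(100)
            .weigher(Weigher.<String, String>singletonWeigher())
            .build();

        unitWeightCache.put("a", "1");
        System.out.println(unitWeightCache.getIfPresent("a")); // "1"
    }
}
```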

String-Based Weigher

Weigher<String, String> stringWeigher = (key, value) -> {
    // Weight based on string length
    return key.length() + value.length();
};

Cache<String, String> stringCache = Caffeine.newBuilder()
    .maximumWeight(1000)
    .weigher(stringWeigher)
    .build();

stringCache.put("short", "a");        // weight: 6
stringCache.put("medium_key", "data"); // weight: 14  
stringCache.put("very_long_key_name", "very_long_value_content"); // weight: 41

Complex Object Weigher

Weigher<String, CacheableObject> objectWeigher = (key, obj) -> {
    int baseWeight = key.length();
    
    if (obj == null) {
        return baseWeight;
    }
    
    // Calculate weight based on object characteristics
    int objectWeight = obj.getSerializedSize();
    
    // Add extra weight for resource-intensive objects
    if (obj.hasLargeData()) {
        objectWeight *= 2;
    }
    
    // Ensure minimum weight
    return Math.max(baseWeight + objectWeight, 1);
};

Cache<String, CacheableObject> objectCache = Caffeine.newBuilder()
    .maximumWeight(100_000)
    .weigher(objectWeigher)
    .recordStats()
    .build();

Dynamic Weigher with Validation

Weigher<String, byte[]> byteArrayWeigher = (key, data) -> {
    // Validate inputs
    if (key == null || data == null) {
        return 1; // Minimum weight for null values
    }
    
    // Base weight from key
    int keyWeight = key.length();
    
    // Data weight
    int dataWeight = data.length;
    
    // Apply scaling factor for very large objects
    if (dataWeight > 10_000) {
        dataWeight = 10_000 + (dataWeight - 10_000) / 10;
    }
    
    int totalWeight = keyWeight + dataWeight;
    
    // Ensure weight is positive and reasonable
    return Math.max(totalWeight, 1);
};

Bounded Weigher for Safety

// Potentially unsafe weigher that might return negative values
Weigher<String, String> unsafeWeigher = (key, value) -> {
    // Hypothetical calculation that could go negative
    return key.length() - value.length();
};

// Wrap with boundedWeigher to validate that every computed weight is non-negative
Weigher<String, String> safeWeigher = Weigher.boundedWeigher(unsafeWeigher);

Cache<String, String> safeCache = Caffeine.newBuilder()
    .maximumWeight(1000)
    .weigher(safeWeigher)  // Rejects weights < 0 with IllegalArgumentException
    .build();

safeCache.put("long_key", "short");         // OK: weight is 8 - 5 = 3
safeCache.put("key", "much_longer_value");  // Throws IllegalArgumentException: 3 - 17 = -14

Expiry Interface

The Expiry interface enables custom expiration policies based on entry characteristics and access patterns.

public interface Expiry<K, V> {
    long expireAfterCreate(K key, V value, long currentTime);
    long expireAfterUpdate(K key, V value, long currentTime, long currentDuration);
    long expireAfterRead(K key, V value, long currentTime, long currentDuration);
    
    // Static factory methods (each takes a single duration function)
    static <K, V> Expiry<K, V> creating(BiFunction<K, V, Duration> duration);   // set at creation only
    static <K, V> Expiry<K, V> writing(BiFunction<K, V, Duration> duration);    // reset on create and update
    static <K, V> Expiry<K, V> accessing(BiFunction<K, V, Duration> duration);  // reset on create, update, and read
}

Dynamic Expiry Based on Value Characteristics

Expiry<String, String> dynamicExpiry = new Expiry<String, String>() {
    @Override
    public long expireAfterCreate(String key, String value, long currentTime) {
        // Different expiration based on key pattern
        if (key.startsWith("temp_")) {
            return Duration.ofMinutes(5).toNanos();
        } else if (key.startsWith("session_")) {
            return Duration.ofHours(2).toNanos();
        } else if (value.length() > 1000) {
            // Large values expire sooner
            return Duration.ofMinutes(15).toNanos();
        } else {
            return Duration.ofHours(1).toNanos();
        }
    }
    
    @Override
    public long expireAfterUpdate(String key, String value, long currentTime, long currentDuration) {
        // Reset to creation expiration on update
        return expireAfterCreate(key, value, currentTime);
    }
    
    @Override  
    public long expireAfterRead(String key, String value, long currentTime, long currentDuration) {
        // Extend expiration for frequently accessed items
        if (key.startsWith("hot_")) {
            return currentDuration + Duration.ofMinutes(10).toNanos();
        }
        // No change for other items
        return currentDuration;
    }
};

Cache<String, String> dynamicCache = Caffeine.newBuilder()
    .maximumSize(1000)
    .expireAfter(dynamicExpiry)
    .build();

Simplified Expiry with Factory Methods

// Creation-only expiry
Expiry<String, UserSession> creationExpiry = Expiry.creating((key, session) -> {
    // Different expiration based on user role
    return session.isAdminUser() ? Duration.ofHours(8) : Duration.ofHours(2);
});

// Access-based expiry - duration recomputed on create, update, and read
Expiry<String, String> accessExpiry = Expiry.accessing(
    (key, value) -> Duration.ofMinutes(30));

// Write-based expiry - duration recomputed on create and update
Expiry<String, String> writeExpiry = Expiry.writing(
    (key, value) -> Duration.ofHours(1));
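Whichever way the Expiry is built, it is installed with expireAfter(...). A sketch using a creation-based policy (the key prefixes are illustrative; the factory methods require a recent Caffeine 3.x release):

```java
import com.github.benmanes.caffeine.cache.Cache;
import com.github.benmanes.caffeine.cache.Caffeine;
import com.github.benmanes.caffeine.cache.Expiry;
import java.time.Duration;

public class ExpiryWiringExample {
    public static void main(String[] args) {
        // Per-entry expiration: the duration is computed from the entry itself.
        Expiry<String, String> perEntryExpiry = Expiry.creating((key, value) ->
            key.startsWith("temp_") ? Duration.ofMinutes(5) : Duration.ofHours(1));

        Cache<String, String> cache = Caffeine.newBuilder()
            .maximumSize(1_000)
            .expireAfter(perEntryExpiry) // variable expiration, per entry
            .build();

        cache.put("temp_token", "abc");  // expires in ~5 minutes
        cache.put("config_flag", "on");  // expires in ~1 hour
    }
}
```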

Ticker Interface

The Ticker interface provides time source abstraction for testing and custom time handling.

@FunctionalInterface
public interface Ticker {
    long read();
    
    static Ticker systemTicker();
    static Ticker disabledTicker();
}

Custom Ticker for Testing

public class ManualTicker implements Ticker {
    private volatile long nanos = 0;
    
    @Override
    public long read() {
        return nanos;
    }
    
    public void advance(Duration duration) {
        nanos += duration.toNanos();
    }
    
    public void advance(long time, TimeUnit unit) {
        nanos += unit.toNanos(time);
    }
}

// Use in tests
ManualTicker ticker = new ManualTicker();
Cache<String, String> testCache = Caffeine.newBuilder()
    .maximumSize(100)
    .expireAfterWrite(Duration.ofMinutes(10))
    .ticker(ticker)
    .build();

testCache.put("key", "value");

// Simulate time passage
ticker.advance(Duration.ofMinutes(15));
testCache.cleanUp(); // Triggers expiration

String value = testCache.getIfPresent("key"); // null - expired

Scheduler Interface

The Scheduler interface controls background maintenance scheduling.

@FunctionalInterface
public interface Scheduler {
    Future<?> schedule(Executor executor, Runnable command, long delay, TimeUnit unit);
    
    static Scheduler forScheduledExecutorService(ScheduledExecutorService scheduledExecutorService);
    static Scheduler systemScheduler();
    static Scheduler disabledScheduler();
}
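For most applications the built-in system scheduler is sufficient: without a scheduler, expired entries are only reclaimed lazily during other cache activity, while systemScheduler() uses a JVM-wide daemon thread to trigger prompt removal. A sketch:

```java
import com.github.benmanes.caffeine.cache.Cache;
import com.github.benmanes.caffeine.cache.Caffeine;
import com.github.benmanes.caffeine.cache.Scheduler;
import java.time.Duration;

public class SystemSchedulerExample {
    public static void main(String[] args) {
        // Expired entries are removed in the background shortly after they
        // expire, instead of waiting for a subsequent cache access.
        Cache<String, String> promptCache = Caffeine.newBuilder()
            .expireAfterWrite(Duration.ofMinutes(5))
            .scheduler(Scheduler.systemScheduler())
            .build();

        promptCache.put("k", "v");
        System.out.println(promptCache.getIfPresent("k")); // "v"
    }
}
```

Prompt removal matters mainly when removal listeners release resources or when stale entries pin significant memory.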

Custom Scheduler Implementation

public class CustomScheduler implements Scheduler {
    private final ScheduledExecutorService scheduler = 
        Executors.newScheduledThreadPool(2, r -> {
            Thread t = new Thread(r, "caffeine-maintenance");
            t.setDaemon(true);
            return t;
        });
    
    @Override
    public Future<?> schedule(Executor executor, Runnable command, long delay, TimeUnit unit) {
        // Add logging around maintenance operations
        Runnable wrappedCommand = () -> {
            System.out.println("Running cache maintenance");
            long start = System.nanoTime();
            try {
                command.run();
            } finally {
                long duration = System.nanoTime() - start;
                System.out.println("Maintenance completed in " + 
                    TimeUnit.NANOSECONDS.toMillis(duration) + "ms");
            }
        };
        
        // Honor the contract: after the delay, run the command on the provided executor
        return scheduler.schedule(() -> executor.execute(wrappedCommand), delay, unit);
    }
}

Cache<String, String> scheduledCache = Caffeine.newBuilder()
    .maximumSize(1000)
    .expireAfterAccess(Duration.ofMinutes(30))
    .scheduler(new CustomScheduler())
    .build();

Interner Interface

The Interner interface provides object interning functionality similar to String.intern() for any immutable type.

@FunctionalInterface
public interface Interner<E> {
    E intern(E sample);
    
    static <E> Interner<E> newStrongInterner();
    static <E> Interner<E> newWeakInterner();
}

Strong Interner

Strong interners retain strong references to interned instances, preventing garbage collection.

Interner<String> strongInterner = Interner.newStrongInterner();

// Intern strings
String s1 = strongInterner.intern("hello");
String s2 = strongInterner.intern(new String("hello"));
String s3 = strongInterner.intern("hello");

// All return the same instance
assert s1 == s2;
assert s2 == s3;
assert s1 == s3;

// Memory efficient for frequently used immutable objects
Interner<ImmutableConfig> configInterner = Interner.newStrongInterner();
ImmutableConfig config1 = configInterner.intern(new ImmutableConfig("prod", "database"));
ImmutableConfig config2 = configInterner.intern(new ImmutableConfig("prod", "database"));
assert config1 == config2; // Same instance

Weak Interner

Weak interners use weak references, allowing garbage collection when no other references exist.

Interner<String> weakInterner = Interner.newWeakInterner();

// Intern strings with weak references
String s1 = weakInterner.intern("temporary");
String s2 = weakInterner.intern(new String("temporary"));
assert s1 == s2;

// After gc, instances may be collected
System.gc();
// Subsequent interns may return different instances if previous ones were collected

// Useful for reducing memory usage of temporary objects
Interner<RequestKey> keyInterner = Interner.newWeakInterner();
RequestKey key1 = keyInterner.intern(new RequestKey("user123", "action"));
RequestKey key2 = keyInterner.intern(new RequestKey("user123", "action"));
assert key1 == key2; // Same instance while both are reachable

Performance Comparison

public class InternerPerformanceExample {
    public void demonstrateInterning() {
        Interner<String> strongInterner = Interner.newStrongInterner();
        Interner<String> weakInterner = Interner.newWeakInterner();
        
        // Strong interner - better for frequently accessed objects
        // (a rough nanoTime loop; use JMH for meaningful benchmarks)
        long start = System.nanoTime();
        for (int i = 0; i < 10000; i++) {
            String interned = strongInterner.intern("common_value_" + (i % 100));
            // Process interned string
        }
        long strongTime = System.nanoTime() - start;
        
        // Weak interner - better for memory-constrained scenarios
        start = System.nanoTime();
        for (int i = 0; i < 10000; i++) {
            String interned = weakInterner.intern("common_value_" + (i % 100));
            // Process interned string
        }
        long weakTime = System.nanoTime() - start;
        
        System.out.println("Strong interner time: " + strongTime / 1_000_000 + "ms");
        System.out.println("Weak interner time: " + weakTime / 1_000_000 + "ms");
    }
}

Best Practices

When to use Strong Interner:

  • Frequently accessed immutable objects
  • Objects with long lifetimes
  • Performance-critical scenarios
  • Small to medium number of unique instances

When to use Weak Interner:

  • Large number of potentially unique instances
  • Memory-constrained environments
  • Temporary objects that can be garbage collected
  • Scenarios where canonical instances may not always be needed

Thread Safety: Both strong and weak interners are thread-safe and can be safely accessed from multiple threads concurrently.
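That thread-safety guarantee can be exercised directly: threads racing to intern distinct-but-equal values all end up with the same canonical instance. A sketch:

```java
import com.github.benmanes.caffeine.cache.Interner;
import java.util.concurrent.ConcurrentLinkedQueue;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class ConcurrentInterningExample {
    public static void main(String[] args) {
        // Each task interns a fresh String instance with the same contents;
        // the interner canonicalizes them all to one instance.
        Interner<String> interner = Interner.newStrongInterner();
        ConcurrentLinkedQueue<String> results = new ConcurrentLinkedQueue<>();

        ExecutorService pool = Executors.newFixedThreadPool(4);
        for (int i = 0; i < 100; i++) {
            pool.execute(() -> results.add(interner.intern(new String("shared"))));
        }
        pool.shutdown();
        try {
            pool.awaitTermination(5, TimeUnit.SECONDS);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }

        String canonical = results.peek();
        boolean allSame = results.stream().allMatch(s -> s == canonical);
        System.out.println(allSame); // true: every thread got the same instance
    }
}
```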

Install with Tessl CLI

npx tessl i tessl/maven-com-github-ben-manes-caffeine--caffeine
