# Synchronous Caching

Caffeine provides two main synchronous cache interfaces: `Cache` for manual cache operations and `LoadingCache` for automatic value loading. Both interfaces are thread-safe and designed for high-concurrency access patterns.

## Cache Interface

The `Cache` interface provides basic manual caching operations where values must be explicitly computed and stored.

```java { .api }
public interface Cache<K, V> {
    // Retrieval operations
    V getIfPresent(K key);
    V get(K key, Function<? super K, ? extends V> mappingFunction);
    Map<K, V> getAllPresent(Iterable<? extends K> keys);
    Map<K, V> getAll(Iterable<? extends K> keys,
        Function<? super Set<? extends K>, ? extends Map<? extends K, ? extends V>> mappingFunction);

    // Storage operations
    void put(K key, V value);
    void putAll(Map<? extends K, ? extends V> map);

    // Removal operations
    void invalidate(K key);
    void invalidateAll(Iterable<? extends K> keys);
    void invalidateAll();

    // Inspection operations
    long estimatedSize();
    CacheStats stats();
    ConcurrentMap<K, V> asMap();
    void cleanUp();
    Policy<K, V> policy();
}
```

### Basic Cache Operations

#### Retrieval Operations

```java
Cache<String, String> cache = Caffeine.newBuilder()
    .maximumSize(1000)
    .build();

// Simple retrieval - returns null if not present
String value = cache.getIfPresent("key1");

// Get with compute function - computes and stores if missing
String computed = cache.get("key2", k -> "computed_" + k);

// Bulk retrieval of present values only
Set<String> keys = Set.of("key1", "key2", "key3");
Map<String, String> present = cache.getAllPresent(keys);

// Bulk retrieval with compute function for missing values
Map<String, String> all = cache.getAll(keys, missingKeys -> {
    Map<String, String> result = new HashMap<>();
    for (String key : missingKeys) {
        result.put(key, "bulk_computed_" + key);
    }
    return result;
});
```

#### Storage Operations

```java
// Store single value
cache.put("key", "value");

// Store multiple values
Map<String, String> data = Map.of(
    "key1", "value1",
    "key2", "value2"
);
cache.putAll(data);
```

#### Removal Operations

```java
// Remove single entry
cache.invalidate("key1");

// Remove multiple entries
cache.invalidateAll(Set.of("key1", "key2"));

// Remove all entries
cache.invalidateAll();
```

### Advanced Cache Operations

#### Map View Operations

The view returned by `asMap()` is backed by the cache: modifications made through either are reflected in the other.

```java
// Get concurrent map view
ConcurrentMap<String, String> mapView = cache.asMap();

// Standard map operations work on the cache
mapView.putIfAbsent("key", "value");
mapView.computeIfPresent("key", (k, v) -> v.toUpperCase());
mapView.merge("key", "suffix", (oldVal, newVal) -> oldVal + "_" + newVal);

// Iteration over cache entries
for (Map.Entry<String, String> entry : mapView.entrySet()) {
    System.out.println(entry.getKey() + " -> " + entry.getValue());
}
```
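The write-through behavior of the view can be seen directly; a minimal sketch:

```java
import com.github.benmanes.caffeine.cache.Cache;
import com.github.benmanes.caffeine.cache.Caffeine;
import java.util.concurrent.ConcurrentMap;

public class MapViewDemo {
    public static void main(String[] args) {
        Cache<String, String> cache = Caffeine.newBuilder().maximumSize(100).build();
        ConcurrentMap<String, String> view = cache.asMap();

        // A write through the cache is visible in the view
        cache.put("a", "1");
        System.out.println(view.get("a")); // 1

        // A write through the view is visible in the cache
        view.put("b", "2");
        System.out.println(cache.getIfPresent("b")); // 2

        // Removal through the view removes from the cache as well
        view.remove("a");
        System.out.println(cache.getIfPresent("a")); // null
    }
}
```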

#### Cache Maintenance

```java
// Manual cleanup - performs any pending maintenance operations
cache.cleanUp();

// Get approximate size
long size = cache.estimatedSize();

// Access cache statistics (all-zero unless recordStats() was enabled on the builder)
CacheStats stats = cache.stats();
System.out.println("Hit rate: " + stats.hitRate());
System.out.println("Miss count: " + stats.missCount());

// Access cache policies
Policy<String, String> policy = cache.policy();
policy.eviction().ifPresent(eviction ->
    System.out.println("Max size: " + eviction.getMaximum()));
```

## LoadingCache Interface

The `LoadingCache` interface extends `Cache` and provides automatic value loading using a `CacheLoader`.

```java { .api }
public interface LoadingCache<K, V> extends Cache<K, V> {
    // Automatic loading operations
    V get(K key);
    Map<K, V> getAll(Iterable<? extends K> keys);

    // Refresh operations
    CompletableFuture<V> refresh(K key);
    CompletableFuture<Map<K, V>> refreshAll(Iterable<? extends K> keys);
}
```

### Loading Cache Operations

#### Automatic Loading

```java
LoadingCache<String, String> loadingCache = Caffeine.newBuilder()
    .maximumSize(1000)
    .build(key -> {
        // Simulate expensive computation
        Thread.sleep(100);
        return "loaded_" + key.toUpperCase();
    });

// Get value - loads automatically if not present
String value = loadingCache.get("key1"); // Returns "loaded_KEY1"

// Bulk loading - uses CacheLoader.loadAll() if implemented
Map<String, String> values = loadingCache.getAll(Set.of("key1", "key2", "key3"));
```

#### Bulk Loading with Custom LoadAll

```java
LoadingCache<String, UserData> userCache = Caffeine.newBuilder()
    .maximumSize(1000)
    .build(new CacheLoader<String, UserData>() {
        @Override
        public UserData load(String userId) throws Exception {
            return database.fetchUser(userId);
        }

        @Override
        public Map<String, UserData> loadAll(Set<? extends String> userIds) throws Exception {
            // Efficient bulk loading from database
            return database.fetchUsers(userIds);
        }
    });

// Uses efficient bulk loading
Map<String, UserData> users = userCache.getAll(Set.of("user1", "user2", "user3"));
```
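Note that `getAll` only asks the loader for keys that are not already cached; present entries are served from the cache. A sketch demonstrating this (with a simple in-memory loader standing in for the database above, and a set recording which keys reached the loader):

```java
import com.github.benmanes.caffeine.cache.CacheLoader;
import com.github.benmanes.caffeine.cache.Caffeine;
import com.github.benmanes.caffeine.cache.LoadingCache;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

public class BulkLoadDemo {
    public static void main(String[] args) {
        Set<String> requested = new HashSet<>();
        LoadingCache<String, String> cache = Caffeine.newBuilder()
            .maximumSize(100)
            .build(new CacheLoader<String, String>() {
                @Override
                public String load(String key) {
                    requested.add(key);
                    return "loaded_" + key;
                }

                @Override
                public Map<String, String> loadAll(Set<? extends String> keys) {
                    requested.addAll(keys);
                    Map<String, String> result = new HashMap<>();
                    for (String key : keys) {
                        result.put(key, "loaded_" + key);
                    }
                    return result;
                }
            });

        // "key1" is already cached, so only key2 and key3 reach loadAll
        cache.put("key1", "manual_value");
        Map<String, String> all = cache.getAll(Set.of("key1", "key2", "key3"));

        System.out.println(requested); // contains only key2 and key3
        System.out.println(all.get("key1")); // manual_value
    }
}
```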

#### Refresh Operations

```java
LoadingCache<String, String> refreshingCache = Caffeine.newBuilder()
    .maximumSize(1000)
    .refreshAfterWrite(Duration.ofMinutes(5))
    .build(key -> fetchFromSlowService(key));

// Asynchronous refresh - the old value remains available during the refresh
CompletableFuture<String> refreshFuture = refreshingCache.refresh("key1");

// The cache continues serving the old value while refreshing
String currentValue = refreshingCache.get("key1"); // Returns the old value immediately

// Wait for the refresh to complete if needed (join() avoids checked exceptions)
String newValue = refreshFuture.join();

// Bulk refresh operations
Set<String> keysToRefresh = Set.of("key1", "key2", "key3");
CompletableFuture<Map<String, String>> bulkRefreshFuture = refreshingCache.refreshAll(keysToRefresh);

// The cache continues serving old values while refreshing all keys
Map<String, String> currentValues = refreshingCache.getAll(keysToRefresh);

// Wait for all refreshes to complete
Map<String, String> newValues = bulkRefreshFuture.join();
```

## Error Handling

### Cache Operations Error Handling

```java
Cache<String, String> cache = Caffeine.newBuilder()
    .maximumSize(1000)
    .build();

try {
    // The compute function can throw exceptions
    String value = cache.get("key", k -> {
        if (k.equals("invalid")) {
            throw new IllegalArgumentException("Invalid key");
        }
        return "valid_" + k;
    });
} catch (RuntimeException e) {
    // Handle computation exceptions
    System.err.println("Failed to compute value: " + e.getMessage());
}
```
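When the compute function throws, no entry is stored, so a later `get` for the same key retries the computation rather than caching the failure. A minimal sketch (the attempt counter is purely illustrative):

```java
import com.github.benmanes.caffeine.cache.Cache;
import com.github.benmanes.caffeine.cache.Caffeine;
import java.util.concurrent.atomic.AtomicInteger;

public class RetryDemo {
    public static void main(String[] args) {
        Cache<String, String> cache = Caffeine.newBuilder().maximumSize(100).build();
        AtomicInteger attempts = new AtomicInteger();

        for (int i = 0; i < 2; i++) {
            try {
                cache.get("flaky", k -> {
                    // Fail on the first attempt only
                    if (attempts.incrementAndGet() == 1) {
                        throw new IllegalStateException("transient failure");
                    }
                    return "ok";
                });
            } catch (IllegalStateException e) {
                // First attempt failed; nothing was cached
            }
        }

        System.out.println(attempts.get()); // 2 - the failure was not cached
        System.out.println(cache.getIfPresent("flaky")); // ok
    }
}
```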

### Loading Cache Error Handling

```java
LoadingCache<String, String> loadingCache = Caffeine.newBuilder()
    .maximumSize(1000)
    .build(key -> {
        if (key.startsWith("error_")) {
            throw new IOException("Simulated loading error");
        }
        return "loaded_" + key;
    });

try {
    String value = loadingCache.get("error_key");
} catch (CompletionException e) {
    // Checked loading exceptions are wrapped in CompletionException;
    // unchecked exceptions and errors propagate as-is
    Throwable cause = e.getCause();
    System.err.println("Loading failed: " + cause.getMessage());
}
```

## Thread Safety and Concurrency

All cache operations are thread-safe and designed for high-concurrency access:

```java
Cache<String, String> cache = Caffeine.newBuilder()
    .maximumSize(1000)
    .build();

// Multiple threads can safely access the cache concurrently
ExecutorService executor = Executors.newFixedThreadPool(10);

for (int i = 0; i < 100; i++) {
    final int threadId = i;
    executor.submit(() -> {
        // Thread-safe operations
        cache.put("key_" + threadId, "value_" + threadId);
        String value = cache.getIfPresent("key_" + threadId);
        cache.get("computed_" + threadId, k -> "computed_" + k);
    });
}
executor.shutdown();
```

### Atomic Operations

Cache operations are atomic at the individual operation level:

```java
// These operations are atomic
cache.get("key", k -> expensiveComputation(k)); // Only computed once per key
cache.asMap().computeIfAbsent("key", k -> defaultValue(k)); // Atomic compute-if-absent
cache.asMap().merge("key", "addition", (oldVal, newVal) -> oldVal + newVal); // Atomic merge
```
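The compute-once guarantee can be observed directly: concurrent callers for the same key block until one computation completes, and the compute function runs exactly once. A sketch:

```java
import com.github.benmanes.caffeine.cache.Cache;
import com.github.benmanes.caffeine.cache.Caffeine;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class ComputeOnceDemo {
    public static void main(String[] args) throws InterruptedException {
        Cache<String, String> cache = Caffeine.newBuilder().maximumSize(100).build();
        AtomicInteger computations = new AtomicInteger();

        // Eight threads race to compute the same key
        ExecutorService executor = Executors.newFixedThreadPool(8);
        for (int i = 0; i < 8; i++) {
            executor.submit(() -> cache.get("shared", k -> {
                computations.incrementAndGet();
                return "expensive_" + k;
            }));
        }
        executor.shutdown();
        executor.awaitTermination(10, TimeUnit.SECONDS);

        System.out.println(computations.get()); // 1 - computed once despite 8 callers
    }
}
```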

## Performance Characteristics

- **Hash Table**: The implementation provides O(1) average-case performance, similar to `ConcurrentHashMap`
- **Lock-Free Reads**: Most read operations are lock-free, giving excellent read performance
- **Write Combining**: Multiple writes may be combined for better throughput
- **Weakly Consistent Iteration**: Iterators reflect the cache state at some point at or since their creation, and may or may not reflect concurrent modifications
- **No ConcurrentModificationException**: Iterators never throw `ConcurrentModificationException` during concurrent modification
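
The last two points can be demonstrated directly: mutating the cache while iterating over its map view completes without a `ConcurrentModificationException`. A minimal sketch:

```java
import com.github.benmanes.caffeine.cache.Cache;
import com.github.benmanes.caffeine.cache.Caffeine;
import java.util.Map;

public class IterationDemo {
    public static void main(String[] args) {
        Cache<String, Integer> cache = Caffeine.newBuilder().maximumSize(100).build();
        for (int i = 0; i < 5; i++) {
            cache.put("key_" + i, i);
        }

        // Mutating during iteration is safe: no ConcurrentModificationException.
        // Whether the added entries appear in this iteration is unspecified
        // (weakly consistent iteration).
        for (Map.Entry<String, Integer> entry : cache.asMap().entrySet()) {
            cache.put("new_" + entry.getKey(), entry.getValue() + 100);
            cache.invalidate("key_0");
        }
        System.out.println("iteration completed without CME");
        System.out.println(cache.getIfPresent("key_0")); // null
    }
}
```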