High Dynamic Range (HDR) Histogram for recording and analyzing value distributions with configurable precision across wide dynamic ranges.
Memory-optimized histogram implementations designed for specific use cases and constraints. These variants trade off maximum count capacity or other features for reduced memory usage.
HDR histogram using int arrays for count storage, providing lower memory usage with count limited to Integer.MAX_VALUE per bucket.
public class IntCountsHistogram extends AbstractHistogram {
    // Constructors
    public IntCountsHistogram(int numberOfSignificantValueDigits);
    public IntCountsHistogram(long highestTrackableValue, int numberOfSignificantValueDigits);
    public IntCountsHistogram(long lowestDiscernibleValue,
                              long highestTrackableValue,
                              int numberOfSignificantValueDigits);
    public IntCountsHistogram(AbstractHistogram source);

    // Factory methods
    static IntCountsHistogram decodeFromByteBuffer(ByteBuffer buffer,
                                                   long minBarForHighestTrackableValue);
    static IntCountsHistogram decodeFromCompressedByteBuffer(ByteBuffer buffer,
                                                             long minBarForHighestTrackableValue);
    static IntCountsHistogram fromString(String base64CompressedHistogramString);

    // Implementation methods
    public IntCountsHistogram copy();
    public IntCountsHistogram copyCorrectedForCoordinatedOmission(long expectedInterval);
}

Memory Savings: Approximately 50% of the memory usage of the standard Histogram (int vs. long count arrays).
Count Limitations: Each bucket limited to 2,147,483,647 (Integer.MAX_VALUE) counts.
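To judge whether the int-count limit matters for a given workload, a quick back-of-envelope check helps. The sketch below is plain Java arithmetic with an illustrative (hypothetical) recording rate, assuming the worst case where every recorded value lands in a single bucket:

```java
// Back-of-envelope saturation check for int-width bucket counts.
// Worst case assumed: EVERY recorded value lands in the same bucket.
public class IntCountCapacityCheck {
    // Seconds until one bucket reaches Integer.MAX_VALUE at the given rate.
    static long secondsToSaturate(long eventsPerSecond) {
        return Integer.MAX_VALUE / eventsPerSecond;
    }

    public static void main(String[] args) {
        long rate = 1_000_000; // hypothetical: 1M recordings/sec into one bucket
        System.out.printf("Worst-case bucket saturation after ~%d seconds (~%.1f minutes)%n",
                secondsToSaturate(rate), secondsToSaturate(rate) / 60.0);
    }
}
```

In practice recordings spread across many buckets, so real headroom is far larger than this worst case suggests.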
Usage Examples:
// Create int counts histogram for moderate volume measurements
IntCountsHistogram histogram = new IntCountsHistogram(1_000_000, 3);
// Record values - same API as standard histogram
for (int i = 0; i < 1_000_000; i++) {
    histogram.recordValue(ThreadLocalRandom.current().nextInt(1000));
}
// Analysis methods unchanged
double mean = histogram.getMean();
long p95 = histogram.getValueAtPercentile(95.0);
long totalCount = histogram.getTotalCount();
// Memory usage comparison
int intHistMemory = histogram.getEstimatedFootprintInBytes();
Histogram standardHist = new Histogram(1_000_000, 3);
int standardMemory = standardHist.getEstimatedFootprintInBytes();
System.out.printf("IntCountsHistogram: %d bytes%n", intHistMemory);
System.out.printf("Standard Histogram: %d bytes%n", standardMemory);
System.out.printf("Memory savings: %.1f%%%n",
    100.0 * (standardMemory - intHistMemory) / standardMemory);

Good For:
- Memory-constrained environments with moderate measurement volumes
- Workloads where no single bucket's count will approach Integer.MAX_VALUE

Avoid When:
- Very high-volume recording where a bucket count may exceed 2,147,483,647
- Memory is plentiful (the standard Histogram imposes no count-width limits)
HDR histogram using short arrays for count storage, providing minimal memory usage with count limited to Short.MAX_VALUE per bucket.
public class ShortCountsHistogram extends AbstractHistogram {
    // Constructors
    public ShortCountsHistogram(int numberOfSignificantValueDigits);
    public ShortCountsHistogram(long highestTrackableValue, int numberOfSignificantValueDigits);
    public ShortCountsHistogram(long lowestDiscernibleValue,
                                long highestTrackableValue,
                                int numberOfSignificantValueDigits);
    public ShortCountsHistogram(AbstractHistogram source);

    // Factory methods
    static ShortCountsHistogram decodeFromByteBuffer(ByteBuffer buffer,
                                                     long minBarForHighestTrackableValue);
    static ShortCountsHistogram decodeFromCompressedByteBuffer(ByteBuffer buffer,
                                                               long minBarForHighestTrackableValue);
    static ShortCountsHistogram fromString(String base64CompressedHistogramString);

    // Implementation methods
    public ShortCountsHistogram copy();
    public ShortCountsHistogram copyCorrectedForCoordinatedOmission(long expectedInterval);
}

Memory Savings: Approximately 75% memory usage reduction compared to the standard Histogram (short vs. long count arrays).
Count Limitations: Each bucket limited to 32,767 (Short.MAX_VALUE) counts.
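Whether 32,767 counts per bucket is enough depends on volume, sampling rate, and how concentrated the values are. This sketch (all figures are illustrative assumptions, not measurements) estimates an average per-bucket count to sanity-check the fit:

```java
// Rough fit check for short-width counts under sampling.
// All inputs are illustrative assumptions, not measured values.
public class ShortCountFitCheck {
    static boolean fitsInShort(long rawEvents, double samplingRate, long distinctBuckets) {
        long recorded = (long) (rawEvents * samplingRate); // values actually recorded
        long avgPerBucket = recorded / distinctBuckets;    // average count per bucket
        return avgPerBucket <= Short.MAX_VALUE;
    }

    public static void main(String[] args) {
        // 10M raw events, 10% sampling, spread over ~1,000 buckets:
        // ~1,000,000 recordings, ~1,000 per bucket on average - fits easily.
        System.out.println(fitsInShort(10_000_000, 0.10, 1_000));
        // The same volume concentrated into 10 buckets averages 100,000
        // per bucket, well past Short.MAX_VALUE.
        System.out.println(fitsInShort(10_000_000, 0.10, 10));
    }
}
```

Note this checks only the average; a skewed distribution can overflow a hot bucket even when the average fits.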
Usage Examples:
// Create short counts histogram for low-volume or sampling scenarios
ShortCountsHistogram histogram = new ShortCountsHistogram(10_000, 2);
// Suitable for sampled data or low-frequency measurements
Random random = new Random();
for (int i = 0; i < 10_000; i++) {
    if (random.nextDouble() < 0.1) { // 10% sampling rate
        histogram.recordValue(random.nextInt(10_000));
    }
}
// Memory efficiency demonstration
int shortHistMemory = histogram.getEstimatedFootprintInBytes();
System.out.printf("ShortCountsHistogram memory: %d bytes%n", shortHistMemory);
// Same analysis capabilities
histogram.outputPercentileDistribution(System.out, 1.0);

Good For:
- Sampling or testing scenarios where per-bucket counts stay small
- Maximum memory savings in tightly constrained environments

Avoid When:
- Any bucket count may exceed 32,767 (Short.MAX_VALUE)
- Recording volume is high or hard to bound in advance
Memory-optimized histogram using packed array representation, ideal for sparse value distributions with significant memory savings.
public class PackedHistogram extends AbstractHistogram {
    // Constructors
    public PackedHistogram(int numberOfSignificantValueDigits);
    public PackedHistogram(long highestTrackableValue, int numberOfSignificantValueDigits);
    public PackedHistogram(long lowestDiscernibleValue,
                           long highestTrackableValue,
                           int numberOfSignificantValueDigits);
    public PackedHistogram(AbstractHistogram source);

    // Factory methods
    static PackedHistogram decodeFromByteBuffer(ByteBuffer buffer,
                                                long minBarForHighestTrackableValue);
    static PackedHistogram decodeFromCompressedByteBuffer(ByteBuffer buffer,
                                                          long minBarForHighestTrackableValue);
    static PackedHistogram fromString(String base64CompressedHistogramString);

    // Implementation methods
    public PackedHistogram copy();
    public PackedHistogram copyCorrectedForCoordinatedOmission(long expectedInterval);
}

PackedHistogram uses dynamic packed arrays that:
- Store counts in a compact packed representation rather than a flat long[] array
- Grow on demand as new value ranges or larger counts are recorded
- Use far less memory when most buckets are empty (sparse distributions)

Usage Examples:
// Create packed histogram - excellent for sparse distributions
PackedHistogram histogram = new PackedHistogram(3);
// Record sparse data (many values have zero counts)
Random random = new Random();
for (int i = 0; i < 100_000; i++) {
    // Sparse distribution - most values between 1000-2000, few outliers
    if (random.nextDouble() < 0.95) {
        histogram.recordValue(1000 + random.nextInt(1000)); // Dense region
    } else {
        histogram.recordValue(random.nextInt(1_000_000)); // Sparse outliers
    }
}
// Memory efficiency shines with sparse data
System.out.printf("PackedHistogram memory: %d bytes%n",
    histogram.getEstimatedFootprintInBytes());
// Compare with standard histogram for same data
Histogram standard = new Histogram(histogram);
System.out.printf("Standard histogram memory: %d bytes%n",
    standard.getEstimatedFootprintInBytes());
// Full functionality preserved
double mean = histogram.getMean();
long p50 = histogram.getValueAtPercentile(50.0);
long p99 = histogram.getValueAtPercentile(99.0);

Memory: Excellent for sparse distributions, but may use more memory than the standard Histogram for dense distributions.
Performance: Slight CPU overhead due to packing/unpacking operations.
Auto-resize: Supports auto-resizing with efficient memory management.
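To see why packing pays off for sparse data, compare the payload of a dense counts array against storing only the non-zero buckets. This toy sketch is NOT HdrHistogram's actual packed encoding; it is a simplified size comparison with hypothetical bucket counts:

```java
// Toy illustration of why packing helps sparse data. This is NOT
// HdrHistogram's real packed encoding - just a payload-size comparison
// between a dense long[] counts array and storing only non-zero entries.
public class SparseStorageSketch {
    // Approximate payload bytes for a dense long[] of the given length.
    static long denseBytes(int buckets) {
        return 8L * buckets;
    }

    // Approximate payload bytes if only non-zero buckets are stored as
    // (index, count) pairs: 4-byte index + 8-byte count, ignoring overhead.
    static long sparseBytes(int nonZeroBuckets) {
        return 12L * nonZeroBuckets;
    }

    public static void main(String[] args) {
        int totalBuckets = 65_536; // hypothetical histogram array length
        int nonZero = 1_000;       // sparse data: few buckets actually used
        System.out.printf("Dense:  %d bytes%n", denseBytes(totalBuckets)); // 524288
        System.out.printf("Sparse: %d bytes%n", sparseBytes(nonZero));     // 12000
    }
}
```

The same arithmetic also shows the failure mode: once most buckets are non-zero, the per-entry index overhead makes the sparse form larger than the dense array.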
Ideal For:
- Sparse value distributions where most buckets are empty
- Memory-constrained environments tracking wide dynamic ranges

Consider Alternatives For:
- Dense, uniform distributions (the standard Histogram avoids packing overhead)
- Hot recording paths where the packing/unpacking CPU cost matters
Thread-safe version of PackedHistogram combining packed array memory efficiency with concurrent recording support.
public class PackedConcurrentHistogram extends ConcurrentHistogram {
    // Constructors
    public PackedConcurrentHistogram(int numberOfSignificantValueDigits);
    public PackedConcurrentHistogram(long highestTrackableValue, int numberOfSignificantValueDigits);
    public PackedConcurrentHistogram(long lowestDiscernibleValue,
                                     long highestTrackableValue,
                                     int numberOfSignificantValueDigits);
    public PackedConcurrentHistogram(AbstractHistogram source);

    // Factory methods
    static PackedConcurrentHistogram decodeFromByteBuffer(ByteBuffer buffer,
                                                          long minBarForHighestTrackableValue);
    static PackedConcurrentHistogram decodeFromCompressedByteBuffer(ByteBuffer buffer,
                                                                    long minBarForHighestTrackableValue);

    // Implementation methods
    public PackedConcurrentHistogram copy();
    public PackedConcurrentHistogram copyCorrectedForCoordinatedOmission(long expectedInterval);
}

Usage Examples:
// Thread-safe packed histogram for concurrent sparse data recording
PackedConcurrentHistogram histogram = new PackedConcurrentHistogram(3);
// Multiple threads recording sparse latency data
ExecutorService executor = Executors.newFixedThreadPool(8);
for (int t = 0; t < 8; t++) {
    executor.submit(() -> {
        Random random = new Random();
        for (int i = 0; i < 50_000; i++) {
            // Simulate network latency in microseconds: mostly 1-10ms, occasional spikes
            long latency = random.nextDouble() < 0.9
                ? 1000 + random.nextInt(9000)       // Normal: 1-10ms
                : 50_000 + random.nextInt(950_000); // Spikes: 50-1000ms
            histogram.recordValue(latency); // Thread-safe, wait-free recording
        }
    });
}
// Wait for all recording threads to finish before reading,
// so reads observe a quiescent histogram
executor.shutdown();
executor.awaitTermination(1, TimeUnit.MINUTES);

System.out.printf("Memory usage: %d bytes%n",
    histogram.getEstimatedFootprintInBytes());
histogram.outputPercentileDistribution(System.out, 1000.0); // Scale microsecond values to ms

Here's a practical comparison of memory usage across variants:
// Test configuration: 1M max value, 3 significant digits
int maxValue = 1_000_000;
int precision = 3;
// Create different histogram types
Histogram standard = new Histogram(maxValue, precision);
IntCountsHistogram intCounts = new IntCountsHistogram(maxValue, precision);
ShortCountsHistogram shortCounts = new ShortCountsHistogram(maxValue, precision);
PackedHistogram packed = new PackedHistogram(maxValue, precision);
// Record same sparse data pattern
Random random = new Random(12345); // Fixed seed for reproducible results
for (int i = 0; i < 100_000; i++) {
    long value = random.nextDouble() < 0.8
        ? random.nextInt(1000)      // 80% in range 0-1000
        : random.nextInt(maxValue); // 20% spread across full range
    standard.recordValue(value);
    intCounts.recordValue(value);
    shortCounts.recordValue(value);
    packed.recordValue(value);
}
// Compare memory footprints
System.out.printf("Standard Histogram:   %6d bytes%n",
    standard.getEstimatedFootprintInBytes());
System.out.printf("IntCountsHistogram:   %6d bytes (%.1f%% of standard)%n",
    intCounts.getEstimatedFootprintInBytes(),
    100.0 * intCounts.getEstimatedFootprintInBytes() / standard.getEstimatedFootprintInBytes());
System.out.printf("ShortCountsHistogram: %6d bytes (%.1f%% of standard)%n",
    shortCounts.getEstimatedFootprintInBytes(),
    100.0 * shortCounts.getEstimatedFootprintInBytes() / standard.getEstimatedFootprintInBytes());
System.out.printf("PackedHistogram:      %6d bytes (%.1f%% of standard)%n",
    packed.getEstimatedFootprintInBytes(),
    100.0 * packed.getEstimatedFootprintInBytes() / standard.getEstimatedFootprintInBytes());

| Use Case | Recommended Variant | Reason |
|---|---|---|
| Memory-constrained, moderate volume | IntCountsHistogram | 50% memory savings, reasonable count limits |
| Sampling/testing scenarios | ShortCountsHistogram | Maximum memory savings |
| Sparse distributions | PackedHistogram | Adaptive memory based on actual data |
| Concurrent sparse data | PackedConcurrentHistogram | Thread safety + memory efficiency |
| Dense uniform distributions | Standard Histogram | No packing overhead |
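The table above can be condensed into a simple decision rule. The helper below is hypothetical (not part of HdrHistogram) and encodes only the three inputs the table turns on: expected maximum per-bucket count, sparsity, and concurrency:

```java
// Hypothetical selection helper (NOT part of HdrHistogram) encoding the
// variant-selection table: pick by expected per-bucket count, sparsity,
// and whether multiple threads record concurrently.
public class VariantChooser {
    static String choose(long expectedMaxBucketCount, boolean sparse, boolean concurrent) {
        if (sparse) {
            return concurrent ? "PackedConcurrentHistogram" : "PackedHistogram";
        }
        if (expectedMaxBucketCount <= Short.MAX_VALUE) return "ShortCountsHistogram";
        if (expectedMaxBucketCount <= Integer.MAX_VALUE) return "IntCountsHistogram";
        return "Histogram"; // dense data with very large counts
    }

    public static void main(String[] args) {
        System.out.println(choose(10_000, false, false));    // low counts -> short
        System.out.println(choose(1_000_000, false, false)); // moderate counts -> int
        System.out.println(choose(1_000_000, true, true));   // sparse + threads -> packed concurrent
    }
}
```

Treat the thresholds as starting points; skewed distributions can push a single hot bucket past a width limit even when totals look safe.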
All specialized variants maintain API compatibility:
// Easy migration between variants
AbstractHistogram source = getExistingHistogram();
// Convert to memory-optimized variant
PackedHistogram optimized = new PackedHistogram(source);
// Or create concurrent version
PackedConcurrentHistogram concurrent = new PackedConcurrentHistogram(source);
// API remains identical
optimized.recordValue(1234);
long p99 = optimized.getValueAtPercentile(99.0);

Recording Performance: The standard, IntCounts, and ShortCounts variants record at comparable speed; packed variants add a slight CPU cost from packing/unpacking operations.
Query Performance: Queries and percentile iteration are similar across variants, with packed variants paying the same modest unpacking overhead.
Memory vs Performance Trade-off: Choose based on whether memory savings outweigh performance costs in your specific use case.
Install with Tessl CLI
npx tessl i tessl/maven-org-hdrhistogram--hdr-histogram