# Cramjam

Extremely thin Python bindings to de/compression algorithms in Rust. Cramjam provides high-performance compression and decompression for multiple algorithms with a consistent API.

## Package Information

- **Package Name**: cramjam
- **Package Type**: pypi
- **Language**: Python
- **Installation**: `pip install cramjam`

## Supported Algorithms

- **Snappy** - Fast compression with optional raw format
- **Brotli** - Web-optimized compression (default level: 11)
- **Bzip2** - Traditional compression (default level: 6)
- **LZ4** - Ultra-fast compression with block operations
- **Gzip** - Standard compression (default level: 6)
- **Zlib** - Zlib-format deflate compression (default level: 6)
- **Deflate** - Raw deflate compression (default level: 6)
- **ZSTD** - Modern compression (default level: 6)
- **XZ/LZMA** - High-ratio compression with advanced configuration

## Core Imports

```python { .api }
# Main package with core utilities
import cramjam
from cramjam import Buffer, File, CompressionError, DecompressionError

# Individual compression modules
from cramjam import snappy, gzip, bzip2, lz4, deflate, zlib, zstd, brotli, xz
```

## Quick Start Examples

### Basic Compression

```python { .api }
import cramjam

# Simple compression/decompression
data = b"Hello, World!" * 100
compressed = cramjam.gzip.compress(data)
decompressed = cramjam.gzip.decompress(compressed)

# Try different algorithms
compressed_brotli = cramjam.brotli.compress(data, level=6)
compressed_zstd = cramjam.zstd.compress(data, level=3)
```

### Memory-Efficient Operations

```python { .api }
import cramjam

# Pre-allocate output buffer for efficiency
input_data = b"Large dataset" * 10000
output = cramjam.Buffer()

# Compress directly into buffer
bytes_written = cramjam.gzip.compress_into(input_data, output)
print(f"Compressed {len(input_data)} bytes to {bytes_written} bytes")
```

### Streaming Compression

```python { .api }
import cramjam

# Use streaming compressor for large data
compressor = cramjam.zstd.Compressor(level=5)

# Process data in chunks
compressor.compress(b"First chunk of data")
compressor.compress(b"Second chunk of data")

# Get final compressed result
compressed_data = compressor.finish()
```

### Buffer Management

```python { .api }
import cramjam

# Create and manipulate buffers
buffer = cramjam.Buffer(b"Initial data")
buffer.write(b" additional data")
buffer.seek(0)
content = buffer.read()

# File operations
file_obj = cramjam.File("data.txt", read=True, write=True)
file_obj.write(b"File content")
file_obj.seek(0)
data = file_obj.read()
```

## Architecture

### Buffer Protocol Support

All cramjam functions accept `BufferProtocol` objects:

- `bytes` - Immutable byte strings
- `bytearray` - Mutable byte arrays (often faster due to reduced allocations)
- `memoryview` - Memory views of other buffer objects
- Custom objects implementing the `__buffer__` protocol

### Core Utility Classes

**[Buffer](./core-utilities.md#buffer)** - Memory buffer with file-like interface supporting read/write operations

**[File](./core-utilities.md#file)** - Rust-backed file object with buffer protocol support

**Exceptions** - Specific error types for compression failures:

- `CompressionError` - Raised when compression operations fail
- `DecompressionError` - Raised when decompression operations fail

### Compression Patterns

1. **Standard Functions**: `compress(data, level=None) -> Buffer` and `decompress(data) -> Buffer`
2. **Direct Buffer Operations**: `compress_into(input, output) -> int` for pre-allocated buffers
3. **Streaming Classes**: `Compressor` and `Decompressor` classes for chunk processing
4. **Special Features**: Algorithm-specific enhancements (LZ4 blocks, Snappy raw, XZ filters)

## Algorithm-Specific Features

### Standard Compression Modules

All standard modules follow consistent patterns with compression level support:

- **[Gzip, Zlib, Deflate, Bzip2, ZSTD, Brotli](./standard-compression.md)** - Traditional algorithms with level control

### Advanced Compression Modules

Modules with enhanced capabilities beyond standard compress/decompress:

- **[Snappy](./advanced-compression.md#snappy)** - Framed and raw format support
- **[LZ4](./advanced-compression.md#lz4)** - Block operations with advanced parameters
- **[XZ/LZMA](./advanced-compression.md#xz-lzma)** - Comprehensive filter chains and format control

## Performance Tips

- Use `bytearray` instead of `bytes` for input when possible (avoids double allocation)
- Use `*_into` functions with pre-allocated buffers for memory efficiency
- Use streaming classes (`Compressor`/`Decompressor`) for large datasets
- Buffer objects provide reference counting for memory management

## Module Documentation

- **[Core Utilities](./core-utilities.md)** - Buffer, File classes and exception handling
- **[Standard Compression](./standard-compression.md)** - Gzip, Zlib, Deflate, Bzip2, ZSTD, Brotli
- **[Advanced Compression](./advanced-compression.md)** - Snappy, LZ4, XZ with special features

## Version Information

```python { .api }
import cramjam

print(cramjam.__version__)  # Package version string
```