# File & I/O Operations

Atomic file operations, advanced I/O utilities, path manipulation, and file system operations. Includes spooled I/O that automatically switches to disk for large data, atomic file saving with backup support, and comprehensive path manipulation utilities.

## Capabilities

### Atomic File Operations

Safe file-writing operations that prevent data corruption: output goes to a temporary part file and only replaces the destination once the write completes successfully.

```python { .api }
def atomic_save(dest_path, **kwargs):
    """
    Atomic file saving with backup options.

    Parameters:
    - dest_path (str): Destination file path
    - text_mode (bool): Open in text mode (default: False)
    - backup (bool): Create backup before overwrite
    - part_file (str): Temporary file suffix

    Returns:
    AtomicSaver: Context manager for atomic operations
    """

class AtomicSaver:
    """Context manager for atomic file writing operations."""
    def __init__(self, dest_path, **kwargs): ...
    def __enter__(self): ...
    def __exit__(self, exc_type, exc_val, exc_tb): ...
```
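The underlying pattern can be sketched with the standard library alone: write to a temporary file in the destination's directory, then atomically rename it into place. The `atomic_write` helper below is a hypothetical, simplified illustration of the technique, not the API documented above.

```python
import os
import tempfile
from contextlib import contextmanager

@contextmanager
def atomic_write(dest_path, mode='w'):
    # Create the temp file in the destination's directory so the final
    # os.replace() stays on one filesystem and is therefore atomic.
    dir_name = os.path.dirname(os.path.abspath(dest_path))
    fd, tmp_path = tempfile.mkstemp(dir=dir_name, suffix='.part')
    try:
        with os.fdopen(fd, mode) as f:
            yield f
        os.replace(tmp_path, dest_path)  # atomic on POSIX and Windows
    except BaseException:
        os.remove(tmp_path)  # on failure, the destination is left untouched
        raise

with atomic_write('example_out.txt') as f:
    f.write('Critical data')
```

Because the rename happens only after the write block finishes, a crash or exception mid-write can never leave a half-written destination file.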
### File System Operations

Directory and file system manipulation utilities.

```python { .api }
def mkdir_p(path):
    """
    Create a directory, including parent directories (like mkdir -p).

    Parameters:
    - path (str): Directory path to create

    Returns:
    None
    """

def copy_tree(src, dst, symlinks=False, ignore=None):
    """
    Copy a directory tree.

    Parameters:
    - src (str): Source directory
    - dst (str): Destination directory
    - symlinks (bool): Copy symlinks as symlinks
    - ignore (callable): Function that names files to skip

    Returns:
    None
    """

def iter_find_files(directory, patterns, **kwargs):
    """
    Iterate over files matching glob patterns.

    Parameters:
    - directory (str): Directory to search
    - patterns (list): List of glob patterns

    Yields:
    str: Matching file paths
    """
```
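The recursive pattern-matching walk that `iter_find_files` performs can be approximated with `os.walk` and `fnmatch` from the standard library. This is a simplified sketch of the idea, not the real implementation:

```python
import fnmatch
import os

def find_files(directory, patterns):
    # Yield paths under `directory` whose basename matches any glob pattern.
    for root, _dirs, files in os.walk(directory):
        for name in files:
            if any(fnmatch.fnmatch(name, pat) for pat in patterns):
                yield os.path.join(root, name)

# Example: build a small tree and search it
os.makedirs('demo_tree/sub', exist_ok=True)
for p in ('demo_tree/a.py', 'demo_tree/b.txt', 'demo_tree/sub/c.py'):
    open(p, 'w').close()

matches = sorted(find_files('demo_tree', ['*.py']))
print(matches)
```

Being a generator, this never materializes the whole file list, which matters when walking large trees.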
### Path Manipulation

Advanced path manipulation and normalization.

```python { .api }
def path_to_unicode(path):
    """
    Convert a path to a unicode string.

    Parameters:
    - path: Path to convert

    Returns:
    str: Unicode path string
    """

def augpath(path, suffix='', prefix='', ext=None, base=None, dpath=None, multidot=False):
    """
    Augment a path by replacing or decorating its components.

    Parameters:
    - path (str): Base path
    - suffix (str): Suffix to append to the filename stem
    - prefix (str): Prefix to prepend to the filename
    - ext (str): New extension (with or without leading dot)
    - base (str): New basename for the file
    - dpath (str): New directory path
    - multidot (bool): Treat multiple extensions (e.g. .tar.gz) as one

    Returns:
    str: Augmented path
    """

def shrinkuser(path, home='~'):
    """
    Contract the user home directory in a path (inverse of expanduser).

    Parameters:
    - path (str): Path to contract
    - home (str): Home directory representation

    Returns:
    str: Path with contracted home directory
    """

def expandpath(path):
    """
    Expand environment variables and the user directory in a path.

    Parameters:
    - path (str): Path to expand

    Returns:
    str: Expanded path
    """
```
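`augpath` is essentially split-modify-rejoin. The hypothetical `augment_path` below sketches the core cases with `os.path` (ignoring `prefix` and `multidot` for brevity); it is a model of the behavior, not the documented function:

```python
import os

def augment_path(path, suffix='', ext=None, base=None, dpath=None):
    # Split into directory, stem, and extension, swap in any replacements,
    # then reassemble.
    orig_dpath, fname = os.path.split(path)
    stem, orig_ext = os.path.splitext(fname)
    if base is not None:
        stem = base
    if ext is not None:
        orig_ext = ext if ext.startswith('.') or ext == '' else '.' + ext
    if dpath is not None:
        orig_dpath = dpath
    return os.path.join(orig_dpath, stem + suffix + orig_ext)

print(augment_path('/data/report.txt', suffix='_v2', ext='.csv'))
# '/data/report_v2.csv'
```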
### Spooled I/O

Memory-efficient I/O that buffers in memory and spools to disk once the data exceeds a size threshold.

```python { .api }
class SpooledIOBase(IOBase):
    """Abstract base for spooled I/O implementations."""
    def __init__(self, max_size=5000000, dir=None): ...
    def rollover(self): ...
    @property
    def rolled(self): ...

class SpooledBytesIO(SpooledIOBase):
    """In-memory bytes I/O that spools to disk when large."""
    def write(self, s): ...
    def read(self, n=-1): ...
    def seek(self, pos, whence=0): ...
    def tell(self): ...

class SpooledStringIO(SpooledIOBase):
    """In-memory text I/O that spools to disk when large."""
    def write(self, s): ...
    def read(self, n=-1): ...
    def readline(self, length=None): ...

class MultiFileReader:
    """Read from multiple file objects as a single contiguous stream."""
    def __init__(self, *fileobjs): ...
    def read(self, amt=None): ...
    def seek(self, offset, whence=0): ...
```
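The standard library's `tempfile.SpooledTemporaryFile` offers the same spill-to-disk behavior and is a useful mental model: data lives in memory until `max_size` is crossed, then rolls over to a real temporary file. (The `_rolled` attribute read below is a private CPython detail, shown only to make the rollover visible.)

```python
import tempfile

# Keep up to 100 bytes in memory; anything larger rolls over to disk.
buf = tempfile.SpooledTemporaryFile(max_size=100)
buf.write(b'small')
in_memory = not buf._rolled   # still buffered in memory at this point
buf.write(b'x' * 200)         # exceeds max_size: spills to a temp file
rolled = buf._rolled

buf.seek(0)
data = buf.read()
print(len(data))  # 205
```

Callers read and write exactly as with an ordinary file object; the rollover is transparent, which is the same design goal as the spooled classes above.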
### File Utilities

Miscellaneous file handling utilities.

```python { .api }
def is_text_fileobj(fileobj):
    """
    Check whether a file object is open in text mode.

    Parameters:
    - fileobj: File object to check

    Returns:
    bool: True if the file object is in text mode
    """

class FilePerms:
    """File permissions representation and manipulation."""
    def __init__(self, user='', group='', other=''): ...
    @property
    def user(self): ...
    @property
    def group(self): ...
    @property
    def other(self): ...
    def __int__(self): ...

class DummyFile:
    """File-like object that discards all writes."""
    def write(self, data): ...
    def flush(self): ...
    def close(self): ...
```
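Permission triples map directly to octal digits. The `perms_to_int` helper below is a hypothetical sketch of the `'rwx'`-string-to-integer conversion that a class like `FilePerms` performs; it is not part of the API above.

```python
def perms_to_int(user='', group='', other=''):
    """Convert 'rwx'-style triples to a single permission integer."""
    bit = {'r': 4, 'w': 2, 'x': 1}

    def digit(chars):
        # Each of r, w, x contributes 4, 2, 1 to one octal digit.
        return sum(bit[c] for c in set(chars))

    return digit(user) * 64 + digit(group) * 8 + digit(other)

print(oct(perms_to_int('rwx', 'rx', 'rx')))  # '0o755'
print(oct(perms_to_int('rw', 'rw', 'rw')))   # '0o666'
```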
## Usage Examples

```python
from boltons.fileutils import atomic_save, mkdir_p
from boltons.ioutils import SpooledBytesIO, MultiFileReader
from boltons.pathutils import augpath, shrinkuser, expandpath

# Atomic file writing: the file only replaces the destination
# if no exception occurs inside the with-block
with atomic_save('/path/to/important.txt', text_mode=True) as f:
    f.write('Critical data')

# Create nested directories safely (no error if they already exist)
mkdir_p('/path/to/nested/directories')

# Path manipulation
original = '/home/user/documents/file.txt'
backup = augpath(original, suffix='_backup', ext='.bak')
print(backup)  # '/home/user/documents/file_backup.bak'

# Contract and expand paths
short_path = shrinkuser('/home/user/file.txt')
print(short_path)  # '~/file.txt' (when /home/user is the home directory)
full_path = expandpath('~/file.txt')
print(full_path)   # '/home/user/file.txt'

# Spooled I/O for memory efficiency
spooled = SpooledBytesIO(max_size=1024 * 1024)  # 1 MB threshold
spooled.write(b'small data')                    # stays in memory
spooled.write(b'x' * (2 * 1024 * 1024))         # spools to disk once the threshold is exceeded

# Read from multiple files as one stream
with open('file1.txt', 'rb') as f1, open('file2.txt', 'rb') as f2:
    multi_reader = MultiFileReader(f1, f2)
    combined_data = multi_reader.read()
```
## Types

```python { .api }
# Constants
FULL_PERMS = 0o777       # Full read/write/execute permissions for all users
RW_PERMS = 438           # Read/write permissions for all users (0o666)
READ_CHUNK_SIZE = 21333  # Default chunk size for reading operations
```
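Note that 438 is simply the decimal spelling of `0o666`; the relationship between the two permission constants is easy to verify:

```python
FULL_PERMS = 0o777
RW_PERMS = 438

assert RW_PERMS == 0o666               # decimal 438 and octal 666 are the same value
assert FULL_PERMS - RW_PERMS == 0o111  # the difference is exactly the three execute bits
print(oct(RW_PERMS))  # '0o666'
```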