# Storage Operations

File upload, download, and management operations, including bucket management, file metadata, access control, and URL generation. Provides comprehensive file storage capabilities through integration with Supabase Storage.

## Capabilities

### Storage Client Access

Access the storage client through the main Supabase client instance.

```python { .api }
@property
def storage(self) -> SupabaseStorageClient | AsyncSupabaseStorageClient:
    """
    Storage client providing complete file storage functionality.

    Returns:
        Storage client instance (sync or async) with methods for:
        - File upload, download, update, delete operations
        - Bucket creation, listing, and management
        - Access control and permissions
        - Signed URL generation for secure access
        - File metadata and transformations
    """
```

### Bucket Operations

Manage storage buckets for organizing and controlling file access.

```python { .api }
# Bucket management methods (from storage3)
def list_buckets(self) -> List[dict]:
    """
    List all storage buckets.

    Returns:
        List of bucket information dictionaries

    Raises:
        StorageException: If listing fails
    """

def get_bucket(self, bucket_id: str) -> dict:
    """
    Get information about a specific bucket.

    Parameters:
    - bucket_id: Unique identifier for the bucket

    Returns:
        Bucket information dictionary

    Raises:
        StorageException: If the bucket is not found
    """

def create_bucket(self, bucket_id: str, options: Optional[dict] = None) -> dict:
    """
    Create a new storage bucket.

    Parameters:
    - bucket_id: Unique identifier for the new bucket
    - options: Bucket configuration (public, file_size_limit, allowed_mime_types, etc.)

    Returns:
        Created bucket information

    Raises:
        StorageException: If creation fails or the bucket already exists
    """

def empty_bucket(self, bucket_id: str) -> dict:
    """
    Remove all files from a bucket.

    Parameters:
    - bucket_id: Bucket to empty

    Returns:
        Operation result

    Raises:
        StorageException: If the operation fails
    """

def delete_bucket(self, bucket_id: str) -> dict:
    """
    Delete a storage bucket.

    Parameters:
    - bucket_id: Bucket to delete

    Returns:
        Deletion result

    Raises:
        StorageException: If deletion fails or the bucket is not empty
    """
```

**Usage Examples:**

```python
# List all buckets
buckets = supabase.storage.list_buckets()
for bucket in buckets:
    print(f"Bucket: {bucket['name']}, Public: {bucket['public']}")

# Create a new bucket
bucket = supabase.storage.create_bucket("user-uploads", {
    "public": False,
    "file_size_limit": 1048576,  # 1 MB limit
    "allowed_mime_types": ["image/jpeg", "image/png", "application/pdf"]
})

# Get bucket information
bucket_info = supabase.storage.get_bucket("user-uploads")
print(f"File size limit: {bucket_info['file_size_limit']} bytes")

# Empty a bucket, then delete it (buckets must be empty before deletion)
supabase.storage.empty_bucket("old-bucket")
supabase.storage.delete_bucket("old-bucket")
```

### File Operations

Access files within a specific bucket for upload, download, and management operations.

```python { .api }
def from_(self, bucket_id: str):
    """
    Select a bucket for file operations.

    Parameters:
    - bucket_id: Name of the storage bucket

    Returns:
        Bucket file client for file operations

    Note: This provides access to file-specific methods
    """
```

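The object path passed to the file methods is a key relative to the bucket root. As a sketch (a hypothetical helper, not part of supabase-py), user-assembled keys can be normalized so they never carry leading or duplicate slashes:

```python
def object_path(*parts: str) -> str:
    """Join path segments into a clean object key for from_(bucket) file methods.

    Hypothetical helper: strips stray slashes so the key never starts with "/"
    and contains no empty segments.
    """
    cleaned = [part.strip("/") for part in parts if part and part.strip("/")]
    return "/".join(cleaned)

print(object_path("users", "42/", "/avatar.png"))  # users/42/avatar.png
```

A normalized key can then be used as, e.g., `supabase.storage.from_("user-content").upload(object_path("users", user_id, name), data)`.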
### File Upload

Upload files to storage buckets with various options and metadata.

```python { .api }
# File upload methods (accessed via supabase.storage.from_("bucket"))
def upload(
    self,
    path: str,
    file: Union[str, bytes, IO],
    file_options: Optional[dict] = None
) -> dict:
    """
    Upload a file to the bucket.

    Parameters:
    - path: File path within the bucket (including filename)
    - file: File content as string, bytes, or file-like object
    - file_options: Upload options (content_type, cache_control, upsert, etc.)

    Returns:
        Upload result with file metadata

    Raises:
        StorageException: If the upload fails
    """

def upload_from_path(
    self,
    path: str,
    local_path: str,
    file_options: Optional[dict] = None
) -> dict:
    """
    Upload a file from the local filesystem.

    Parameters:
    - path: Destination path in the bucket
    - local_path: Local file path to upload
    - file_options: Upload options

    Returns:
        Upload result

    Raises:
        StorageException: If the upload or file access fails
    """
```

**Usage Examples:**

```python
# Upload bytes data
file_data = b"Hello, World!"
result = supabase.storage.from_("documents").upload(
    "greeting.txt",
    file_data,
    {"content_type": "text/plain"}
)

# Upload from a string
content = "This is a text document."
result = supabase.storage.from_("documents").upload(
    "doc.txt",
    content,
    {"content_type": "text/plain"}
)

# Upload from a file object
with open("local_file.pdf", "rb") as f:
    result = supabase.storage.from_("documents").upload(
        "uploaded_file.pdf",
        f,
        {"content_type": "application/pdf"}
    )

# Upload from a local file path
result = supabase.storage.from_("images").upload_from_path(
    "profile.jpg",
    "/path/to/local/image.jpg",
    {
        "content_type": "image/jpeg",
        "cache_control": "3600",
        "upsert": True  # Overwrite if the file exists
    }
)

# Upload with metadata
result = supabase.storage.from_("assets").upload(
    "data/report.json",
    '{"report": "data"}',
    {
        "content_type": "application/json",
        "metadata": {
            "author": "system",
            "version": "1.0"
        }
    }
)
```

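In the examples above, `content_type` is written by hand for each upload. A hedged convenience wrapper (hypothetical, built only on the standard `mimetypes` module) can derive it from the destination filename instead:

```python
import mimetypes

def guessed_file_options(path: str, default: str = "application/octet-stream") -> dict:
    """Build a file_options dict with a content type guessed from the filename.

    Hypothetical helper: falls back to `default` when the extension is unknown.
    """
    content_type, _ = mimetypes.guess_type(path)
    return {"content_type": content_type or default}

print(guessed_file_options("profile.jpg"))  # {'content_type': 'image/jpeg'}
```

It slots in as, e.g., `supabase.storage.from_("images").upload(path, data, guessed_file_options(path))`.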
### File Download

Download files from storage buckets.

```python { .api }
def download(self, path: str) -> bytes:
    """
    Download a file from the bucket.

    Parameters:
    - path: File path within the bucket

    Returns:
        File content as bytes

    Raises:
        StorageException: If the file is not found or the download fails
    """

def download_to_path(self, path: str, local_path: str) -> None:
    """
    Download a file directly to the local filesystem.

    Parameters:
    - path: File path in the bucket
    - local_path: Local destination path

    Raises:
        StorageException: If the download fails
    """
```

**Usage Examples:**

```python
# Download file content
content = supabase.storage.from_("documents").download("report.pdf")
with open("local_report.pdf", "wb") as f:
    f.write(content)

# Download directly to a file
supabase.storage.from_("images").download_to_path(
    "profile.jpg",
    "/local/path/downloaded_profile.jpg"
)

# Download and process text content
text_content = supabase.storage.from_("documents").download("data.txt")
text = text_content.decode('utf-8')
print(f"File content: {text}")
```

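`download()` always returns raw bytes, and the bare `.decode('utf-8')` above raises on non-UTF-8 content. A defensive decoding sketch (plain Python, no Supabase dependency; the encoding list is an assumption to adapt):

```python
def decode_text(content: bytes, encodings: tuple = ("utf-8", "utf-16")) -> str:
    """Try each encoding in order; replace undecodable bytes as a last resort."""
    for encoding in encodings:
        try:
            return content.decode(encoding)
        except (UnicodeDecodeError, UnicodeError):
            continue
    # Last resort: never raises, but may insert U+FFFD replacement characters
    return content.decode("utf-8", errors="replace")

print(decode_text("héllo".encode("utf-8")))  # héllo
```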
### File Management

List, move, copy, and delete files within storage buckets.

```python { .api }
def list(
    self,
    path: str = "",
    options: Optional[dict] = None
) -> List[dict]:
    """
    List files in the bucket.

    Parameters:
    - path: Directory path to list (default: root)
    - options: Listing options (limit, offset, sort_by, search, etc.)

    Returns:
        List of file/folder information dictionaries

    Raises:
        StorageException: If listing fails
    """

def update(
    self,
    path: str,
    file: Union[str, bytes, IO],
    file_options: Optional[dict] = None
) -> dict:
    """
    Update/replace an existing file.

    Parameters:
    - path: File path to update
    - file: New file content
    - file_options: Update options

    Returns:
        Update result

    Raises:
        StorageException: If the file is not found or the update fails
    """

def move(self, from_path: str, to_path: str) -> dict:
    """
    Move/rename a file within the bucket.

    Parameters:
    - from_path: Current file path
    - to_path: New file path

    Returns:
        Move operation result

    Raises:
        StorageException: If the move fails
    """

def copy(self, from_path: str, to_path: str) -> dict:
    """
    Copy a file within the bucket.

    Parameters:
    - from_path: Source file path
    - to_path: Destination file path

    Returns:
        Copy operation result

    Raises:
        StorageException: If the copy fails
    """

def remove(self, paths: List[str]) -> List[dict]:
    """
    Delete one or more files.

    Parameters:
    - paths: List of file paths to delete

    Returns:
        List of deletion results

    Raises:
        StorageException: If deletion fails
    """
```

**Usage Examples:**

```python
# List all files in a bucket
files = supabase.storage.from_("documents").list()
for file in files:
    print(f"File: {file['name']}, Size: {file['metadata']['size']} bytes")

# List files in a specific directory
files = supabase.storage.from_("images").list("avatars/", {
    "limit": 50,
    "sort_by": {"column": "name", "order": "asc"}
})

# Search for files
files = supabase.storage.from_("documents").list("", {
    "search": "report",
    "limit": 10
})

# Update an existing file
with open("updated_document.pdf", "rb") as f:
    result = supabase.storage.from_("documents").update(
        "report.pdf",
        f,
        {"content_type": "application/pdf"}
    )

# Move a file to a new location
result = supabase.storage.from_("temp").move(
    "draft.txt",
    "published/final.txt"
)

# Copy a file
result = supabase.storage.from_("originals").copy(
    "master.jpg",
    "copies/backup.jpg"
)

# Delete a single file
result = supabase.storage.from_("temp").remove(["old_file.txt"])

# Delete multiple files
result = supabase.storage.from_("cache").remove([
    "temp1.txt",
    "temp2.txt",
    "old_data.json"
])
```

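`remove()` deletes all the given paths in a single request; for very large cleanups it can be safer to delete in bounded batches. A small chunking sketch (the batch size of 100 is an arbitrary assumption, not a documented limit):

```python
from typing import Iterator, List

def batched(paths: List[str], size: int = 100) -> Iterator[List[str]]:
    """Yield successive slices of at most `size` paths."""
    for start in range(0, len(paths), size):
        yield paths[start:start + size]

# e.g. for chunk in batched(all_paths): supabase.storage.from_("cache").remove(chunk)
print(list(batched(["a.txt", "b.txt", "c.txt"], size=2)))  # [['a.txt', 'b.txt'], ['c.txt']]
```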
### URL Generation

Generate public and signed URLs for file access.

```python { .api }
def get_public_url(self, path: str) -> str:
    """
    Get the public URL for a file in a public bucket.

    Parameters:
    - path: File path within the bucket

    Returns:
        Public URL string

    Note: Only works for files in public buckets
    """

def create_signed_url(
    self,
    path: str,
    expires_in: int,
    options: Optional[dict] = None
) -> dict:
    """
    Create a signed URL for temporary access to private files.

    Parameters:
    - path: File path within the bucket
    - expires_in: URL expiration time in seconds
    - options: Additional options (download, transform, etc.)

    Returns:
        Dict with the signed URL and expiration info

    Raises:
        StorageException: If URL generation fails
    """

def create_signed_urls(
    self,
    paths: List[str],
    expires_in: int,
    options: Optional[dict] = None
) -> List[dict]:
    """
    Create signed URLs for multiple files.

    Parameters:
    - paths: List of file paths
    - expires_in: URL expiration time in seconds
    - options: Additional options

    Returns:
        List of signed URL dictionaries

    Raises:
        StorageException: If URL generation fails
    """
```

**Usage Examples:**

```python
# Get a public URL (for public buckets)
public_url = supabase.storage.from_("public-images").get_public_url("logo.png")
print(f"Public URL: {public_url}")

# Create a signed URL for a private file
signed_url_data = supabase.storage.from_("private-docs").create_signed_url(
    "confidential.pdf",
    3600,  # Expires in 1 hour
    {"download": True}  # Force download
)
print(f"Signed URL: {signed_url_data['signed_url']}")
print(f"Expires at: {signed_url_data['expires_at']}")

# Create signed URLs for multiple files
signed_urls = supabase.storage.from_("user-files").create_signed_urls(
    ["document1.pdf", "document2.pdf", "image.jpg"],
    7200,  # 2 hours
    {"download": False}
)

for url_data in signed_urls:
    print(f"File: {url_data['path']}, URL: {url_data['signed_url']}")

# Generate a download URL with a transformation (for images)
download_url = supabase.storage.from_("images").create_signed_url(
    "photo.jpg",
    3600,
    {
        "download": True,
        "transform": {
            "width": 300,
            "height": 300,
            "resize": "cover"
        }
    }
)
```

### File Metadata

Get detailed information about files and their properties.

```python { .api }
def get_file_info(self, path: str) -> dict:
    """
    Get detailed metadata for a file.

    Parameters:
    - path: File path within the bucket

    Returns:
        File metadata dictionary with size, type, created_at, etc.

    Raises:
        StorageException: If the file is not found
    """
```

**Usage Example:**

```python
# Get file metadata
info = supabase.storage.from_("documents").get_file_info("report.pdf")
print(f"File size: {info['metadata']['size']} bytes")
print(f"Content type: {info['metadata']['mimetype']}")
print(f"Last modified: {info['updated_at']}")
print(f"ETag: {info['metadata']['eTag']}")
```

### Error Handling

Handle storage operation errors and edge cases.

```python { .api }
# Storage exceptions (from storage3)
class StorageException(Exception):
    """
    Storage operation errors including:
    - File not found
    - Permission denied
    - Bucket full or quota exceeded
    - Invalid file type
    - Network errors
    """
```

**Error Handling Examples:**

```python
from storage3.utils import StorageException

# Upload error handling
try:
    result = supabase.storage.from_("documents").upload(
        "large_file.pdf",
        large_file_data
    )
except StorageException as e:
    if "file size limit" in str(e).lower():
        print("File is too large")
    elif "mime type" in str(e).lower():
        print("File type not allowed")
    else:
        print(f"Upload failed: {e}")

# Download error handling
try:
    content = supabase.storage.from_("documents").download("missing_file.pdf")
except StorageException as e:
    if "not found" in str(e).lower():
        print("File does not exist")
    else:
        print(f"Download failed: {e}")

# Bucket operation error handling
try:
    supabase.storage.create_bucket("existing-bucket")
except StorageException as e:
    if "already exists" in str(e).lower():
        print("Bucket name already taken")
    else:
        print(f"Bucket creation failed: {e}")
```

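Network errors and timeouts are often transient and worth retrying. A generic retry sketch; the attempt count, backoff schedule, and the broad `Exception` catch are assumptions to adapt (real code would catch `StorageException` or something narrower):

```python
import time

def with_retries(operation, attempts: int = 3, base_delay: float = 0.5):
    """Run `operation` (a zero-argument callable), retrying with exponential backoff.

    Sketch only: re-raises the last error once all attempts are exhausted.
    """
    for attempt in range(attempts):
        try:
            return operation()
        except Exception:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))  # 0.5s, 1s, 2s, ...

# e.g. with_retries(lambda: supabase.storage.from_("documents").download("report.pdf"))
```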
### Performance and Best Practices

```python
# Efficient file uploads
# Good: upload with an appropriate content type and cache policy
result = supabase.storage.from_("images").upload(
    "photo.jpg",
    image_data,
    {
        "content_type": "image/jpeg",
        "cache_control": "31536000"  # Cache for 1 year
    }
)

# Batch operations for multiple files
files_to_delete = ["temp1.txt", "temp2.txt", "temp3.txt"]
result = supabase.storage.from_("temp").remove(files_to_delete)

# Use signed URLs for secure access
signed_url = supabase.storage.from_("private").create_signed_url(
    "sensitive.pdf",
    300,  # Short expiration for sensitive files
    {"download": True}
)

# Organize files with a proper path structure
# Good: organized structure keyed by user
supabase.storage.from_("user-content").upload(
    f"users/{user_id}/documents/{filename}",
    file_data
)

# Handle large files
# For large files, consider chunked uploads or progress tracking
def upload_with_progress(bucket, path, file_data):
    try:
        return supabase.storage.from_(bucket).upload(path, file_data)
    except StorageException as e:
        if "timeout" in str(e).lower():
            # Retry here, or fall back to a chunked upload
            pass
        raise
```
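
The timeout branch above mentions chunked uploads without showing a building block. One useful primitive is a chunk reader that keeps memory flat regardless of file size; how the chunks are then transmitted (for example to a resumable-upload endpoint) is out of scope here, and the 1 MB default is an arbitrary assumption:

```python
import io
from typing import BinaryIO, Iterator

def read_chunks(fileobj: BinaryIO, chunk_size: int = 1024 * 1024) -> Iterator[bytes]:
    """Yield fixed-size chunks from a binary file object until EOF."""
    while True:
        chunk = fileobj.read(chunk_size)
        if not chunk:
            break
        yield chunk

print(list(read_chunks(io.BytesIO(b"abcde"), chunk_size=2)))  # [b'ab', b'cd', b'e']
```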