Python asyncio client library for Google Cloud Storage with full CRUD operations and streaming support
npx @tessl/cli install tessl/pypi-gcloud-aio-storage@9.6.00
# gcloud-aio-storage
An asyncio-compatible Python client library for Google Cloud Storage that provides full CRUD operations for buckets and blobs with streaming support for large files, parallel upload capabilities, and built-in session management. Designed for high-performance cloud storage operations with modern async/await patterns.
## Package Information
- **Package Name**: gcloud-aio-storage
- **Package Type**: pypi
- **Language**: Python
- **Installation**: `pip install gcloud-aio-storage`
## Core Imports
```python
from gcloud.aio.storage import Storage, Bucket, Blob, StreamResponse, SCOPES
```
## Basic Usage
```python
import asyncio
from gcloud.aio.storage import Storage

async def main():
    # Initialize storage client
    async with Storage() as storage:
        # List buckets
        buckets = await storage.list_buckets('my-project')

        # Upload a file
        with open('local-file.txt', 'rb') as f:
            file_data = f.read()
        result = await storage.upload('my-bucket', 'remote-file.txt', file_data)

        # Download a file
        content = await storage.download('my-bucket', 'remote-file.txt')

        # Stream download for large files
        async with await storage.download_stream('my-bucket', 'large-file.txt') as stream:
            while True:
                chunk = await stream.read(8192)
                if not chunk:
                    break
                # Process chunk

    # Can also be used without a context manager
    storage = Storage()
    try:
        content = await storage.download('my-bucket', 'file.txt')
    finally:
        await storage.close()

asyncio.run(main())
```
## Architecture
The library follows a hierarchical structure that mirrors Google Cloud Storage's organization:
- **Storage**: Main client that handles authentication, session management, and high-level operations
- **Bucket**: Container that groups related blobs and provides bucket-specific operations
- **Blob**: Individual object that represents files stored in Cloud Storage with metadata and content operations
- **StreamResponse**: Wrapper for efficient streaming of large file downloads
This design enables both direct operations through the Storage client and object-oriented manipulation through Bucket and Blob instances, providing flexibility for different usage patterns while maintaining efficient session reuse.
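Both access paths can be mixed freely within a single session. A minimal sketch of the two patterns, assuming a bucket named `my-bucket` and an object named `file.txt` already exist:

```python
from gcloud.aio.storage import Storage

async def read_both_ways():
    async with Storage() as storage:
        # Direct operation through the Storage client
        data = await storage.download('my-bucket', 'file.txt')

        # Equivalent object-oriented path through Bucket and Blob
        bucket = storage.get_bucket('my-bucket')
        blob = await bucket.get_blob('file.txt')
        data = await blob.download()
        return data
```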
The library supports both production use and emulator-based testing via the `STORAGE_EMULATOR_HOST` environment variable. The asyncio client is built on aiohttp; a synchronous counterpart built on requests is available through the gcloud-rest-* variants.
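For emulator-based testing, the override is a plain environment variable read when the client is constructed; the host and port below are illustrative only:

```python
import os

# Illustrative values: route requests to a locally running emulator
# (e.g. fake-gcs-server) instead of storage.googleapis.com.
# Set this before creating the Storage client.
os.environ['STORAGE_EMULATOR_HOST'] = 'localhost:4443'
```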
## Capabilities
### Storage Client Operations
Core storage client functionality for bucket management, object operations, and session handling. Provides direct access to all Cloud Storage operations with automatic authentication and session management.
```python { .api }
class Storage:
    def __init__(self, *, service_file=None, token=None, session=None, api_root=None): ...
    async def list_buckets(self, project, *, params=None, headers=None, session=None, timeout=10): ...
    def get_bucket(self, bucket_name): ...
    async def copy(self, bucket, object_name, destination_bucket, *, new_name=None, metadata=None, params=None, headers=None, timeout=10, session=None): ...
    async def delete(self, bucket, object_name, *, timeout=10, params=None, headers=None, session=None): ...
    async def download(self, bucket, object_name, *, headers=None, timeout=10, session=None): ...
    async def download_to_filename(self, bucket, object_name, filename, **kwargs): ...
    async def download_metadata(self, bucket, object_name, *, headers=None, session=None, timeout=10): ...
    async def download_stream(self, bucket, object_name, *, headers=None, timeout=10, session=None): ...
    async def list_objects(self, bucket, *, params=None, headers=None, session=None, timeout=10): ...
    async def upload(self, bucket, object_name, file_data, *, content_type=None, parameters=None, headers=None, metadata=None, session=None, force_resumable_upload=None, zipped=False, timeout=30): ...
    async def upload_from_filename(self, bucket, object_name, filename, **kwargs): ...
    async def compose(self, bucket, object_name, source_object_names, *, content_type=None, params=None, headers=None, session=None, timeout=10): ...
    async def patch_metadata(self, bucket, object_name, metadata, *, params=None, headers=None, session=None, timeout=10): ...
    async def get_bucket_metadata(self, bucket, *, params=None, headers=None, session=None, timeout=10): ...
    async def close(self): ...
    async def __aenter__(self): ...
    async def __aexit__(self, *args): ...
```
[Storage Client](./storage-client.md)
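
A brief usage sketch combining a few of these operations; bucket and object names are placeholders:

```python
from gcloud.aio.storage import Storage

async def copy_and_clean_up():
    async with Storage() as storage:
        # Copy an object into another bucket under a new name
        await storage.copy('src-bucket', 'report.csv', 'dst-bucket',
                           new_name='report-copy.csv')

        # List objects under a prefix and inspect metadata
        listing = await storage.list_objects('dst-bucket',
                                             params={'prefix': 'report'})
        metadata = await storage.download_metadata('dst-bucket', 'report-copy.csv')

        # Remove the copy again
        await storage.delete('dst-bucket', 'report-copy.csv')
```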
### Bucket Operations
Bucket-level operations for managing Cloud Storage buckets and their contained blobs. Provides a container abstraction that simplifies working with groups of related objects.
```python { .api }
class Bucket:
    def __init__(self, storage, name): ...
    async def get_blob(self, blob_name, timeout=10, session=None): ...
    async def blob_exists(self, blob_name, session=None): ...
    async def list_blobs(self, prefix='', match_glob='', delimiter='', session=None): ...
    def new_blob(self, blob_name): ...
    async def get_metadata(self, params=None, session=None): ...
```
[Bucket Operations](./bucket-operations.md)
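
A short sketch of the container abstraction, assuming `list_blobs` returns the names of matching objects; bucket, prefix, and blob names are placeholders:

```python
from gcloud.aio.storage import Storage

async def inspect_bucket():
    async with Storage() as storage:
        bucket = storage.get_bucket('my-bucket')

        # Enumerate blob names under a prefix and check for a specific one
        names = await bucket.list_blobs(prefix='logs/')
        exists = await bucket.blob_exists('logs/app.log')

        # Fetch a Blob instance, metadata included, when it exists
        if exists:
            blob = await bucket.get_blob('logs/app.log')
```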
### Blob Management
Individual object operations for Cloud Storage blobs including content manipulation, metadata management, and signed URL generation for secure access.
```python { .api }
class Blob:
    def __init__(self, bucket, name, metadata): ...
    async def download(self, timeout=10, session=None, auto_decompress=True): ...
    async def upload(self, data, content_type=None, session=None): ...
    async def get_signed_url(self, expiration, headers=None, query_params=None, http_method='GET', iam_client=None, service_account_email=None, token=None, session=None): ...
    @staticmethod
    def get_pem_signature(str_to_sign, private_key): ...
    @staticmethod
    async def get_iam_api_signature(str_to_sign, iam_client, service_account_email, session): ...
```
[Blob Management](./blob-management.md)
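
A sketch of blob-level access and signed URL generation; the object name is a placeholder and the expiration here is given in seconds:

```python
from gcloud.aio.storage import Storage

async def share_blob():
    async with Storage() as storage:
        bucket = storage.get_bucket('my-bucket')
        blob = await bucket.get_blob('report.pdf')

        # Read the blob's contents
        data = await blob.download()

        # Generate a signed URL valid for one hour
        # (requires credentials capable of signing)
        url = await blob.get_signed_url(3600)
        return url
```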
### Streaming Operations
Efficient streaming functionality for handling large files without loading entire contents into memory. Supports both upload and download streaming with automatic chunk management.
```python { .api }
class StreamResponse:
    def __init__(self, response): ...
    async def read(self, size=-1): ...
    @property
    def content_length(self): ...
    async def __aenter__(self): ...
    async def __aexit__(self, *exc_info): ...
```
[Streaming Operations](./streaming-operations.md)
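
A download-streaming sketch that writes chunks to disk without buffering the whole object in memory; bucket and file names are placeholders:

```python
from gcloud.aio.storage import Storage

async def stream_to_disk():
    async with Storage() as storage:
        stream = await storage.download_stream('my-bucket', 'large-file.bin')
        with open('large-file.bin', 'wb') as file_obj:
            while True:
                chunk = await stream.read(64 * 1024)
                if not chunk:
                    break
                file_obj.write(chunk)
```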
## Constants and Configuration
```python { .api }
SCOPES = ['https://www.googleapis.com/auth/devstorage.read_write']
DEFAULT_TIMEOUT = 10
MAX_CONTENT_LENGTH_SIMPLE_UPLOAD = 5242880  # 5MB
HOST = 'storage.googleapis.com'  # Can be overridden via STORAGE_EMULATOR_HOST
PKCS1_MARKER = ('-----BEGIN RSA PRIVATE KEY-----', '-----END RSA PRIVATE KEY-----')
PKCS8_MARKER = ('-----BEGIN PRIVATE KEY-----', '-----END PRIVATE KEY-----')
```
158
159
## Types
160
161
```python { .api }
from enum import Enum

class UploadType(Enum):
    SIMPLE = 1
    RESUMABLE = 2
    MULTIPART = 3

class PemKind(Enum):
    INVALID = -1
    PKCS1 = 0
    PKCS8 = 1
```
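
As a rough guide, `UploadType` reflects how `upload` transmits data: payloads above `MAX_CONTENT_LENGTH_SIMPLE_UPLOAD` are expected to use the resumable protocol, and `force_resumable_upload=True` requests it explicitly. A minimal sketch, with bucket and object names as placeholders:

```python
from gcloud.aio.storage import Storage

async def upload_large(payload: bytes):
    async with Storage() as storage:
        # Illustrative: explicitly request the resumable upload protocol
        # regardless of payload size.
        await storage.upload(
            'my-bucket', 'large-object.bin', payload,
            force_resumable_upload=True,
        )
```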
```