# Cloud Storage

Firebase Cloud Storage integration providing access to Google Cloud Storage buckets for file upload, download, and management operations, with full access to the underlying Google Cloud Storage client capabilities.

## Capabilities

### Bucket Access

Get Cloud Storage bucket instances for file operations with automatic Firebase configuration integration.

```python { .api }
def bucket(name=None, app=None):
    """
    Return a handle to a Cloud Storage bucket.

    Args:
        name: Bucket name string (optional). If not provided, uses the
            default bucket from the Firebase app configuration.
        app: Firebase app instance (optional)

    Returns:
        google.cloud.storage.Bucket: Storage bucket instance with the full
        Google Cloud Storage API.
    """
```

## Google Cloud Storage Integration

The Firebase Admin SDK provides direct access to Google Cloud Storage buckets through the `google.cloud.storage.Bucket` class. This provides complete file storage capabilities including upload, download, metadata management, and access control.
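
The examples in this document assume the Admin SDK has already been initialized with a default bucket. A minimal initialization sketch, where the credential path and bucket name are placeholders:

```python
import firebase_admin
from firebase_admin import credentials

# Initialize the Admin SDK once per process. The 'storageBucket' option
# sets the default bucket returned by storage.bucket(). The credential
# path and bucket name below are placeholders for illustration.
cred = credentials.Certificate('/path/to/serviceAccountKey.json')
firebase_admin.initialize_app(cred, {
    'storageBucket': 'my-project.appspot.com'
})
```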

### Common Usage Patterns

#### File Upload Operations

```python
import firebase_admin
from firebase_admin import storage

# Get default bucket
bucket = storage.bucket()

# Upload file from local path
blob = bucket.blob('uploads/document.pdf')
blob.upload_from_filename('/local/path/document.pdf')

# Upload from memory
data = b'Hello, Cloud Storage!'
blob = bucket.blob('text/hello.txt')
blob.upload_from_string(data, content_type='text/plain')

# Upload with custom metadata
blob = bucket.blob('images/photo.jpg')
blob.metadata = {
    'uploaded_by': 'user123',
    'category': 'profile_photo'
}
blob.upload_from_filename('/local/path/photo.jpg')
```

#### File Download Operations

```python
# Download to local file
blob = bucket.blob('uploads/document.pdf')
blob.download_to_filename('/local/path/downloaded.pdf')

# Download to memory
blob = bucket.blob('text/hello.txt')
content = blob.download_as_bytes()
text_content = blob.download_as_text()

# Check if file exists
if blob.exists():
    print('File exists in storage')
```

#### File Metadata and Properties

```python
# Get file information
blob = bucket.blob('uploads/document.pdf')
blob.reload()  # Refresh metadata from the server

print(f'Size: {blob.size} bytes')
print(f'Content type: {blob.content_type}')
print(f'Created: {blob.time_created}')
print(f'Updated: {blob.updated}')
print(f'MD5 hash: {blob.md5_hash}')
print(f'Custom metadata: {blob.metadata}')

# Update custom metadata
blob.metadata = {'description': 'Updated document'}
blob.patch()

# Set content type
blob.content_type = 'application/pdf'
blob.patch()
```

#### File Listing and Search

```python
# List all files in the bucket
blobs = bucket.list_blobs()
for blob in blobs:
    print(f'File: {blob.name}')

# List files with a prefix
blobs = bucket.list_blobs(prefix='uploads/')
for blob in blobs:
    print(f'Upload: {blob.name}')

# List with a delimiter (folder-like structure)
blobs = bucket.list_blobs(prefix='images/', delimiter='/')
for blob in blobs:
    print(f'Image: {blob.name}')
# "Subfolder" prefixes are available once the iterator has been consumed
print(f'Subfolders: {blobs.prefixes}')

# Limit the number of results
blobs = bucket.list_blobs(max_results=10)
```

#### File Operations

```python
# Copy file (copy_blob returns the new blob)
source_blob = bucket.blob('original/file.txt')
destination_blob = bucket.copy_blob(source_blob, bucket, 'backup/file.txt')

# Move/rename file (copy, then delete the original)
bucket.copy_blob(source_blob, bucket, 'new/location/file.txt')
source_blob.delete()

# Delete file
blob = bucket.blob('uploads/temp.txt')
blob.delete()

# Delete multiple files
blobs_to_delete = bucket.list_blobs(prefix='temp/')
bucket.delete_blobs(list(blobs_to_delete))
```

#### Access Control and Permissions

```python
from datetime import timedelta

# Make file publicly readable
blob = bucket.blob('public/image.jpg')
blob.make_public()
print(f'Public URL: {blob.public_url}')

# Generate a signed URL for temporary read access
blob = bucket.blob('private/document.pdf')
url = blob.generate_signed_url(
    expiration=timedelta(hours=1),
    method='GET'
)
print(f'Signed URL: {url}')

# Generate a signed URL for upload
upload_url = blob.generate_signed_url(
    expiration=timedelta(hours=1),
    method='PUT',
    content_type='application/pdf'
)
```

#### Bucket Management

```python
# Get bucket information
bucket.reload()
print(f'Bucket name: {bucket.name}')
print(f'Location: {bucket.location}')
print(f'Storage class: {bucket.storage_class}')

# List bucket contents page by page
iterator = bucket.list_blobs()
for page in iterator.pages:
    for blob in page:
        print(f'File: {blob.name}')
```

### Advanced Operations

#### Batch Operations

```python
# Batch upload multiple files in parallel
import concurrent.futures
import os

def upload_file(file_path):
    blob_name = os.path.basename(file_path)
    blob = bucket.blob(f'batch_upload/{blob_name}')
    blob.upload_from_filename(file_path)
    return blob_name

file_paths = ['/path/to/file1.txt', '/path/to/file2.txt']

with concurrent.futures.ThreadPoolExecutor() as executor:
    futures = [executor.submit(upload_file, path) for path in file_paths]
    for future in concurrent.futures.as_completed(futures):
        print(f'Uploaded: {future.result()}')
```

#### Custom Storage Classes

```python
# Set the storage class for cost optimization
blob = bucket.blob('archive/old_data.zip')
blob.storage_class = 'COLDLINE'  # or 'NEARLINE', 'ARCHIVE'
blob.upload_from_filename('/local/path/old_data.zip')
```

#### Content Encoding and Compression

```python
# Upload compressed content
import gzip

# Compress data before upload
data = b'Large amount of text data...'
compressed_data = gzip.compress(data)

# Mark the object as gzip-encoded so clients can decompress transparently
blob = bucket.blob('compressed/data.txt.gz')
blob.content_encoding = 'gzip'
blob.content_type = 'text/plain'
blob.upload_from_string(compressed_data)
```

### Error Handling

```python
from google.cloud.exceptions import NotFound, Forbidden, GoogleCloudError
from google.api_core.exceptions import ServiceUnavailable

try:
    blob = bucket.blob('nonexistent/file.txt')
    content = blob.download_as_bytes()
except NotFound:
    print('File not found')
except Forbidden:
    print('Access denied')
except ServiceUnavailable:
    print('Storage service temporarily unavailable')
except GoogleCloudError as e:
    print(f'Google Cloud error: {e}')
```

### Integration with Firebase Authentication

```python
# Use Firebase Auth tokens for client-side access
from firebase_admin import auth

# Generate a custom token for a client
custom_token = auth.create_custom_token('user123')

# The client exchanges this token to authenticate with Firebase Storage
# and can then access files according to the Storage security rules.
```

## Named Buckets

For projects with multiple storage buckets:

```python
# Default bucket (from the Firebase app configuration)
default_bucket = storage.bucket()

# Named bucket
named_bucket = storage.bucket('my-other-bucket-name')

# Use the named bucket
blob = named_bucket.blob('data/file.txt')
blob.upload_from_string('data for named bucket')
```

## Security Rules

Firebase Storage uses security rules to control access from client SDKs; requests made through the Admin SDK bypass these rules and have full access:

```javascript
// Example storage rules (defined in the Firebase Console)
rules_version = '2';
service firebase.storage {
  match /b/{bucket}/o {
    // Public read access
    match /public/{allPaths=**} {
      allow read;
    }

    // User-specific files
    match /users/{userId}/{allPaths=**} {
      allow read, write: if request.auth != null && request.auth.uid == userId;
    }

    // Admin only
    match /admin/{allPaths=**} {
      allow read, write: if request.auth != null &&
                            request.auth.token.admin == true;
    }
  }
}
```

## Performance and Best Practices

- **Use appropriate storage classes**: Choose based on access patterns
- **Implement retry logic**: Handle transient failures gracefully (see the sketch after this list)
- **Use signed URLs**: Grant temporary access without sharing credentials
- **Optimize file naming**: Use consistent naming conventions
- **Monitor usage**: Track storage costs and usage patterns
- **Implement caching**: Cache frequently accessed files
- **Use compression**: Reduce storage costs and transfer time
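
One way to implement the retry logic mentioned above is with the `google.api_core.retry` helpers that ship alongside the storage client. A minimal sketch; the backoff values and object name are illustrative:

```python
from google.api_core import retry
from google.api_core.exceptions import ServiceUnavailable, TooManyRequests

# Retry transient errors with exponential backoff; values are illustrative.
transient_retry = retry.Retry(
    predicate=retry.if_exception_type(ServiceUnavailable, TooManyRequests),
    initial=1.0,      # first delay, in seconds
    maximum=30.0,     # cap on the delay between attempts
    multiplier=2.0,   # exponential backoff factor
    deadline=120.0,   # give up after two minutes overall
)

blob = bucket.blob('uploads/document.pdf')
content = blob.download_as_bytes(retry=transient_retry)
```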

## Types

```python { .api }
# The bucket() function returns google.cloud.storage.Bucket,
# which provides access to all Google Cloud Storage types.

# Core types from google.cloud.storage:
# - Bucket: Storage bucket container
# - Blob: Individual file/object in storage
# - BlobReader: For streaming downloads
# - BlobWriter: For streaming uploads

# Common properties and methods available on Bucket:
# - name: Bucket name
# - location: Geographic location
# - storage_class: Storage class (STANDARD, NEARLINE, COLDLINE, ARCHIVE)
# - list_blobs(): List objects in the bucket
# - blob(name): Get a blob reference
# - copy_blob(): Copy objects
# - delete_blobs(): Delete multiple objects

# Common properties and methods available on Blob:
# - name: Object name/path
# - size: Object size in bytes
# - content_type: MIME type
# - metadata: Custom metadata dict
# - time_created: Creation timestamp
# - updated: Last modified timestamp
# - upload_from_filename(): Upload from a file
# - upload_from_string(): Upload from memory
# - download_to_filename(): Download to a file
# - download_as_bytes(): Download to memory
# - generate_signed_url(): Create a temporary access URL
# - make_public(): Make publicly accessible
```
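
The `BlobReader` and `BlobWriter` types listed above are normally obtained through `Blob.open()`, which streams data instead of buffering whole objects in memory. A minimal sketch; the object name and data are illustrative:

```python
# Stream a large object without holding it fully in memory.
blob = bucket.blob('large/dataset.csv')

# Streaming upload: open() in 'wb' mode returns a BlobWriter
with blob.open('wb') as writer:
    writer.write(b'col_a,col_b\n')
    writer.write(b'1,2\n')

# Streaming download: open() in 'rb' mode returns a BlobReader
with blob.open('rb') as reader:
    first_chunk = reader.read(1024)
    print(first_chunk)
```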