# Utility Functions

Convenient helper functions for common blob operations without requiring explicit client instantiation. These functions provide simplified access for basic upload and download scenarios, making them ideal for simple scripts and one-off operations.

## Capabilities

### Upload to URL

Upload data directly to a blob URL without creating a client instance. This function handles client creation, upload, and cleanup automatically.

```python { .api }
def upload_blob_to_url(blob_url: str, data, credential=None, **kwargs) -> dict:
    """
    Upload data directly to a blob URL.

    Args:
        blob_url (str): Complete URL to the destination blob
        data: Data to upload (bytes, str, or file-like object)
        credential: Optional credential for authentication. Can be:
            - str: Account key or SAS token string
            - dict: Account name and key mapping
            - AzureNamedKeyCredential: Named key credential
            - AzureSasCredential: SAS credential
            - TokenCredential: Azure AD token credential
            - None: For SAS URLs or public access

    Keyword Args:
        overwrite (bool): Whether to overwrite an existing blob (default: False)
        max_concurrency (int): Maximum concurrent uploads for large blobs
        length (int, optional): Number of bytes to upload
        metadata (dict, optional): Blob metadata as key-value pairs
        validate_content (bool): Validate content integrity during upload
        encoding (str, optional): Text encoding if data is a string (default: UTF-8)

    Returns:
        dict: Upload response containing:
            - etag: Entity tag of the uploaded blob
            - last_modified: Last modified timestamp
            - content_md5: MD5 hash of content (if calculated)
            - client_request_id: Request ID for tracking
            - request_id: Server request ID
            - version: Blob service version
            - date: Response date

    Raises:
        ResourceExistsError: If the blob exists and overwrite=False
        HttpResponseError: For other service errors
    """
```

### Download from URL

Download blob content from a URL to a local file or stream without creating a client instance.

```python { .api }
def download_blob_from_url(blob_url: str, output, credential=None, **kwargs) -> None:
    """
    Download blob content from a URL to a file or stream.

    Args:
        blob_url (str): Complete URL to the source blob
        output: Download destination. Can be:
            - str: Local file path to write to
            - file-like object: Stream to write to (must have a 'write' method)
        credential: Optional credential for authentication. Can be:
            - str: Account key or SAS token string
            - dict: Account name and key mapping
            - AzureNamedKeyCredential: Named key credential
            - AzureSasCredential: SAS credential
            - TokenCredential: Azure AD token credential
            - None: For SAS URLs or public access

    Keyword Args:
        overwrite (bool): Whether to overwrite an existing file (default: False)
        max_concurrency (int): Maximum concurrent downloads for large blobs
        offset (int, optional): Start byte position for partial download
        length (int, optional): Number of bytes to download
        validate_content (bool): Validate content integrity during download

    Returns:
        None

    Raises:
        ValueError: If the file exists and overwrite=False
        ResourceNotFoundError: If the blob does not exist
        HttpResponseError: For other service errors
    """
```

### Async Utility Functions

Asynchronous versions of the utility functions for concurrent operations.

```python { .api }
# Available in the azure.storage.blob.aio module
async def upload_blob_to_url(blob_url: str, data, credential=None, **kwargs) -> dict:
    """
    Async version of upload_blob_to_url.

    Same parameters and return value as the sync version.
    """

async def download_blob_from_url(blob_url: str, output, credential=None, **kwargs) -> None:
    """
    Async version of download_blob_from_url.

    Same parameters as the sync version.
    """
```

## Usage Examples

### Basic Upload Examples

```python
from azure.storage.blob import upload_blob_to_url

# Upload string data to a blob with a SAS token in the URL
blob_url = "https://account.blob.core.windows.net/container/file.txt?sp=w&st=..."
upload_blob_to_url(blob_url, "Hello, World!")

# Upload binary data with a separate credential
blob_url = "https://account.blob.core.windows.net/container/image.jpg"
with open("local_image.jpg", "rb") as data:
    upload_blob_to_url(
        blob_url,
        data,
        credential="account_key_here",
        overwrite=True,
        metadata={"source": "camera", "date": "2023-01-01"}
    )

# Upload with an Azure AD credential
from azure.identity import DefaultAzureCredential
credential = DefaultAzureCredential()
upload_blob_to_url(
    "https://account.blob.core.windows.net/container/document.pdf",
    pdf_data,
    credential=credential,
    overwrite=True
)
```

### Basic Download Examples

```python
from azure.storage.blob import download_blob_from_url

# Download to a local file
blob_url = "https://account.blob.core.windows.net/container/data.csv?sp=r&st=..."
download_blob_from_url(blob_url, "local_data.csv")

# Download to a stream
blob_url = "https://account.blob.core.windows.net/container/log.txt"
with open("downloaded_log.txt", "wb") as file_handle:
    download_blob_from_url(
        blob_url,
        file_handle,
        credential="account_key_here",
        max_concurrency=4
    )

# Download partial content
download_blob_from_url(
    blob_url,
    "first_1mb.dat",
    credential=credential,
    offset=0,
    length=1024 * 1024  # First 1 MB only
)
```

### Error Handling

```python
from azure.storage.blob import upload_blob_to_url, download_blob_from_url
from azure.core.exceptions import ResourceExistsError, ResourceNotFoundError, HttpResponseError

# Upload with error handling
try:
    upload_blob_to_url(blob_url, data, overwrite=False)
except ResourceExistsError:
    print("Blob already exists. Use overwrite=True to replace.")
except HttpResponseError as e:
    print(f"Upload failed: {e.status_code} - {e.message}")

# Download with error handling
try:
    download_blob_from_url(blob_url, "output.txt", overwrite=False)
except ValueError as e:
    print(f"File operation error: {e}")
except ResourceNotFoundError:
    print("Blob not found")
except HttpResponseError as e:
    print(f"Download failed: {e.status_code} - {e.message}")
```

### Async Usage Examples

```python
import asyncio
from azure.storage.blob.aio import upload_blob_to_url, download_blob_from_url

async def async_operations():
    # Async upload
    await upload_blob_to_url(
        "https://account.blob.core.windows.net/container/async_file.txt",
        "Async upload data",
        credential=credential
    )

    # Async download
    await download_blob_from_url(
        "https://account.blob.core.windows.net/container/source.txt",
        "async_downloaded.txt",
        credential=credential
    )

# Concurrent operations
async def concurrent_uploads():
    urls_and_data = [
        ("https://account.blob.core.windows.net/container/file1.txt", "Data 1"),
        ("https://account.blob.core.windows.net/container/file2.txt", "Data 2"),
        ("https://account.blob.core.windows.net/container/file3.txt", "Data 3"),
    ]

    # Upload all files concurrently
    tasks = [
        upload_blob_to_url(url, data, credential=credential, overwrite=True)
        for url, data in urls_and_data
    ]

    results = await asyncio.gather(*tasks)
    print(f"Uploaded {len(results)} files concurrently")

asyncio.run(concurrent_uploads())
```

### Advanced Usage Patterns

```python
import time

# Upload a large file with parallelism and integrity validation
def upload_large_file():
    with open("large_file.zip", "rb") as data:
        result = upload_blob_to_url(
            blob_url,
            data,
            credential=credential,
            overwrite=True,
            max_concurrency=8,      # Parallel chunk uploads
            validate_content=True,  # Verify integrity
            # Note: progress callbacks are not supported by the utility
            # functions; use BlobClient for advanced progress tracking.
        )
    return result

# Conditional upload based on blob existence
def conditional_upload():
    try:
        # Try upload without overwrite
        upload_blob_to_url(blob_url, data, overwrite=False)
        print("New blob uploaded")
    except ResourceExistsError:
        # Blob exists; decide whether to update
        from azure.storage.blob import BlobClient
        blob_client = BlobClient.from_blob_url(blob_url, credential=credential)

        properties = blob_client.get_blob_properties()
        if properties.size != len(data):
            upload_blob_to_url(blob_url, data, overwrite=True)
            print("Blob updated with new content")
        else:
            print("Blob unchanged, skipping upload")

# Download with retry logic
def download_with_retry(max_retries=3):
    for attempt in range(max_retries):
        try:
            download_blob_from_url(blob_url, output_file, credential=credential)
            print("Download successful")
            break
        except HttpResponseError as e:
            if e.status_code >= 500 and attempt < max_retries - 1:
                print(f"Server error, retrying... (attempt {attempt + 1})")
                time.sleep(2 ** attempt)  # Exponential backoff
            else:
                raise
```

## Use Cases

### When to Use Utility Functions

**Ideal for:**

- Simple upload/download operations
- One-off scripts and automation
- Quick prototyping and testing
- Scenarios where you don't need advanced client features
- Working with SAS URLs or public blobs

**Examples:**

```python
# Quick backup script
def backup_file(local_path, backup_url):
    with open(local_path, "rb") as data:
        upload_blob_to_url(backup_url, data, overwrite=True)

# Simple data processing pipeline
def process_remote_data(source_url, output_path):
    # Download data
    download_blob_from_url(source_url, "temp_data.csv")

    # Process data (your processing logic here)
    processed_data = process_csv("temp_data.csv")

    # Upload results
    upload_blob_to_url(output_path, processed_data, overwrite=True)
```

### When to Use Client Classes Instead

**Use BlobClient/ContainerClient/BlobServiceClient for:**

- Long-running applications requiring connection reuse
- Complex operations requiring multiple API calls
- Advanced features such as lease management and batch operations
- Custom retry policies and error handling
- Progress tracking for large transfers
- Metadata and property management beyond basic upload/download

**Migration from utility functions to clients:**

```python
from azure.storage.blob import BlobClient

# Utility function approach
upload_blob_to_url(blob_url, data, credential=credential, overwrite=True)

# Equivalent using BlobClient
blob_client = BlobClient.from_blob_url(blob_url, credential=credential)
blob_client.upload_blob(data, overwrite=True)
blob_client.close()  # Or use a context manager
```

## Performance Considerations

### Concurrency

- Utility functions create a new client for each operation
- For multiple operations, consider using client classes for better performance
- Async versions support concurrent operations efficiently
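
The concurrency benefit of the async variants comes from scheduling many awaitables at once with `asyncio.gather`, so total wall time approaches one round-trip rather than the sum of all of them. A minimal, self-contained sketch of the pattern (no Azure calls; `fake_upload` is a purely illustrative stand-in for `upload_blob_to_url`):

```python
import asyncio

async def fake_upload(name: str) -> str:
    # Stand-in for upload_blob_to_url: simulates network latency only.
    await asyncio.sleep(0.01)
    return name

async def upload_all(names):
    # All uploads run concurrently; gather preserves input order.
    return await asyncio.gather(*(fake_upload(n) for n in names))

results = asyncio.run(upload_all(["a.txt", "b.txt", "c.txt"]))
print(results)  # ['a.txt', 'b.txt', 'c.txt']
```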

### Memory Usage

- Utility functions automatically clean up resources
- Client classes allow reuse of connections and internal buffers
- For large files, both approaches support chunked transfers
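
Chunked transfer is what keeps memory bounded: data moves through a fixed-size buffer regardless of blob size. A small local illustration of the pattern (in-memory streams stand in for the blob and the destination file; the chunk size here is arbitrary):

```python
import io

def copy_in_chunks(src, dst, chunk_size=4 * 1024 * 1024):
    # Move one fixed-size buffer at a time, so peak memory stays at
    # chunk_size no matter how large the source is.
    while True:
        chunk = src.read(chunk_size)
        if not chunk:
            break
        dst.write(chunk)

src = io.BytesIO(b"x" * 10)
dst = io.BytesIO()
copy_in_chunks(src, dst, chunk_size=4)
print(len(dst.getvalue()))  # 10
```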

### Error Recovery

- Utility functions use default retry policies
- Client classes allow custom retry configuration
- Both support the same authentication and security features
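
The retry policies back off exponentially between attempts. Computing the waits for an assumed policy (2-second initial delay, doubling each retry; these numbers are illustrative, not the SDK defaults):

```python
def backoff_delays(initial: float, retries: int) -> list:
    # Delay before retry N is initial * 2**N (exponential backoff).
    return [initial * (2 ** attempt) for attempt in range(retries)]

print(backoff_delays(2.0, 4))  # [2.0, 4.0, 8.0, 16.0]
```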