
# S3 Transfer Operations

High-level S3 transfer functionality with automatic multipart handling, progress callbacks, and retry logic. These operations are automatically injected into S3 clients and resources, providing simplified interfaces for common file transfer tasks while handling the complexities of multipart uploads, downloads, and error recovery.

## Capabilities

### Transfer Configuration

Configuration class for customizing S3 transfer behavior, including multipart thresholds, concurrency settings, and bandwidth limits.

```python { .api }
from typing import Optional

from boto3.s3.transfer import TransferConfig

class TransferConfig:
    """
    Configuration for S3 transfer operations.

    Controls multipart upload/download behavior, concurrency, and performance tuning.
    """

    def __init__(self, multipart_threshold: int = 8 * 1024 * 1024,
                 max_concurrency: int = 10, multipart_chunksize: int = 8 * 1024 * 1024,
                 num_download_attempts: int = 5, max_io_queue: int = 100,
                 io_chunksize: int = 256 * 1024, use_threads: bool = True,
                 max_bandwidth: Optional[int] = None):
        """
        Parameters:
        - multipart_threshold: File size threshold above which multipart transfers are used (bytes)
        - max_concurrency: Maximum number of concurrent transfer threads
        - multipart_chunksize: Size of each multipart chunk (bytes)
        - num_download_attempts: Number of download retry attempts
        - max_io_queue: Maximum number of queued IO operations
        - io_chunksize: Size of each IO chunk (bytes)
        - use_threads: Whether to use threads for transfers
        - max_bandwidth: Maximum bandwidth usage in bytes/second (None means unlimited)
        """
```
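
To illustrate the threshold semantics, here is a minimal sketch of how the part count falls out of these settings (the 100 MB file size is hypothetical):

```python
import math

# Default values from TransferConfig
multipart_threshold = 8 * 1024 * 1024   # 8 MB
multipart_chunksize = 8 * 1024 * 1024   # 8 MB

file_size = 100 * 1024 * 1024           # hypothetical 100 MB file

if file_size >= multipart_threshold:
    # The file is split into chunksize-sized parts uploaded concurrently
    num_parts = math.ceil(file_size / multipart_chunksize)
    print(f"Multipart upload with {num_parts} parts")  # 13 parts
else:
    print("Single PutObject request")
```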

### Client Transfer Methods

These methods are automatically injected into all S3 client instances, providing high-level transfer functionality.

```python { .api }
from typing import Any, Callable, Dict, Optional

from boto3.s3.transfer import TransferConfig

# Methods available on boto3.client('s3') instances.
# Note: these injected methods take CamelCase keyword arguments
# (ExtraArgs, Callback, Config).

def upload_file(Filename: str, Bucket: str, Key: str,
                ExtraArgs: Optional[Dict[str, Any]] = None,
                Callback: Optional[Callable] = None,
                Config: Optional[TransferConfig] = None) -> None:
    """
    Upload a file to S3.

    Automatically handles multipart uploads for large files and provides
    progress callbacks and error handling.

    Parameters:
    - Filename: Path to the local file to upload
    - Bucket: S3 bucket name
    - Key: S3 object key (path within bucket)
    - ExtraArgs: Additional arguments for the upload (e.g., ContentType, ACL)
    - Callback: Progress callback invoked with the number of bytes transferred per chunk
    - Config: TransferConfig for customizing transfer behavior
    """

def download_file(Bucket: str, Key: str, Filename: str,
                  ExtraArgs: Optional[Dict[str, Any]] = None,
                  Callback: Optional[Callable] = None,
                  Config: Optional[TransferConfig] = None) -> None:
    """
    Download a file from S3.

    Automatically handles multipart downloads for large files and provides
    progress callbacks and retry logic.

    Parameters:
    - Bucket: S3 bucket name
    - Key: S3 object key to download
    - Filename: Local path where the file will be saved
    - ExtraArgs: Additional arguments for the download
    - Callback: Progress callback function
    - Config: TransferConfig for customizing transfer behavior
    """

def upload_fileobj(Fileobj, Bucket: str, Key: str,
                   ExtraArgs: Optional[Dict[str, Any]] = None,
                   Callback: Optional[Callable] = None,
                   Config: Optional[TransferConfig] = None) -> None:
    """
    Upload a file-like object to S3.

    Accepts any file-like object that supports the read() method.

    Parameters:
    - Fileobj: File-like object to upload (must support read())
    - Bucket: S3 bucket name
    - Key: S3 object key
    - ExtraArgs: Additional arguments for the upload
    - Callback: Progress callback function
    - Config: TransferConfig for customizing transfer behavior
    """

def download_fileobj(Bucket: str, Key: str, Fileobj,
                     ExtraArgs: Optional[Dict[str, Any]] = None,
                     Callback: Optional[Callable] = None,
                     Config: Optional[TransferConfig] = None) -> None:
    """
    Download an S3 object to a file-like object.

    Writes to any file-like object that supports the write() method.

    Parameters:
    - Bucket: S3 bucket name
    - Key: S3 object key to download
    - Fileobj: File-like object to write to (must support write())
    - ExtraArgs: Additional arguments for the download
    - Callback: Progress callback function
    - Config: TransferConfig for customizing transfer behavior
    """
```
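
ExtraArgs forwards the subset of PutObject parameters that the transfer manager allows. A short sketch setting a content type and server-side encryption on upload (bucket and file names are illustrative):

```python
import boto3

s3_client = boto3.client('s3')

# ExtraArgs forwards allowed PutObject parameters such as
# ContentType, ACL, Metadata, and ServerSideEncryption.
s3_client.upload_file(
    'report.html',
    'my-bucket',
    'reports/report.html',
    ExtraArgs={
        'ContentType': 'text/html',
        'ServerSideEncryption': 'AES256'
    }
)
```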

### Resource Transfer Methods

These methods are automatically injected into S3 resource objects (Bucket and Object instances).

```python { .api }
from typing import Any, Callable, Dict, Optional

from boto3.s3.transfer import TransferConfig

# Methods available on S3 Bucket resources
class Bucket:
    def upload_file(self, Filename: str, Key: str,
                    ExtraArgs: Optional[Dict[str, Any]] = None,
                    Callback: Optional[Callable] = None,
                    Config: Optional[TransferConfig] = None) -> None:
        """Upload a file to this bucket."""

    def download_file(self, Key: str, Filename: str,
                      ExtraArgs: Optional[Dict[str, Any]] = None,
                      Callback: Optional[Callable] = None,
                      Config: Optional[TransferConfig] = None) -> None:
        """Download a file from this bucket."""

    def upload_fileobj(self, Fileobj, Key: str,
                       ExtraArgs: Optional[Dict[str, Any]] = None,
                       Callback: Optional[Callable] = None,
                       Config: Optional[TransferConfig] = None) -> None:
        """Upload a file-like object to this bucket."""

    def download_fileobj(self, Key: str, Fileobj,
                         ExtraArgs: Optional[Dict[str, Any]] = None,
                         Callback: Optional[Callable] = None,
                         Config: Optional[TransferConfig] = None) -> None:
        """Download an object from this bucket to a file-like object."""

# Methods available on S3 Object resources
class Object:
    def upload_file(self, Filename: str,
                    ExtraArgs: Optional[Dict[str, Any]] = None,
                    Callback: Optional[Callable] = None,
                    Config: Optional[TransferConfig] = None) -> None:
        """Upload a file to this S3 object."""

    def download_file(self, Filename: str,
                      ExtraArgs: Optional[Dict[str, Any]] = None,
                      Callback: Optional[Callable] = None,
                      Config: Optional[TransferConfig] = None) -> None:
        """Download this S3 object to a file."""

    def upload_fileobj(self, Fileobj,
                       ExtraArgs: Optional[Dict[str, Any]] = None,
                       Callback: Optional[Callable] = None,
                       Config: Optional[TransferConfig] = None) -> None:
        """Upload a file-like object to this S3 object."""

    def download_fileobj(self, Fileobj,
                         ExtraArgs: Optional[Dict[str, Any]] = None,
                         Callback: Optional[Callable] = None,
                         Config: Optional[TransferConfig] = None) -> None:
        """Download this S3 object to a file-like object."""
```

### Copy Operations

High-level copy operations for moving objects within S3, with support for cross-region and cross-account copying.

```python { .api }
from typing import Any, Callable, Dict, Optional

from boto3.s3.transfer import TransferConfig

# Method available on S3 client, bucket, and object resources
# (Bucket.copy omits the Bucket parameter; Object.copy omits Bucket and Key)

def copy(CopySource: Dict[str, str], Bucket: str, Key: str,
         ExtraArgs: Optional[Dict[str, Any]] = None,
         Callback: Optional[Callable] = None,
         SourceClient=None,
         Config: Optional[TransferConfig] = None) -> None:
    """
    Copy an S3 object from one location to another.

    Supports copying within the same bucket, between buckets, and across regions.

    Parameters:
    - CopySource: Source object specification (dict with 'Bucket' and 'Key', optionally 'VersionId')
    - Bucket: Destination bucket name
    - Key: Destination object key
    - ExtraArgs: Additional arguments for the copy operation
    - Callback: Progress callback function
    - SourceClient: S3 client used to read the source object (for cross-region copies)
    - Config: TransferConfig for customizing copy behavior
    """
```
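
The SourceClient parameter is easy to miss. A sketch of a cross-region copy where the source is read with a client pinned to a different region (bucket names and regions are illustrative):

```python
import boto3

# Read the source object from eu-west-1, write the copy via us-east-1
source_client = boto3.client('s3', region_name='eu-west-1')
dest_client = boto3.client('s3', region_name='us-east-1')

copy_source = {'Bucket': 'eu-source-bucket', 'Key': 'data/file.bin'}
dest_client.copy(
    copy_source,
    'us-dest-bucket',
    'data/file.bin',
    SourceClient=source_client
)
```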

### Transfer Constants

Constants for selecting which transfer client implementation boto3 uses.

```python { .api }
from boto3.s3.constants import CLASSIC_TRANSFER_CLIENT, AUTO_RESOLVE_TRANSFER_CLIENT

CLASSIC_TRANSFER_CLIENT = "classic"  # Always use the classic transfer client
AUTO_RESOLVE_TRANSFER_CLIENT = "auto"  # Automatically resolve the best available transfer client
```
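
A sketch of how these constants are typically consumed, assuming a boto3 version recent enough (roughly 1.26+) for TransferConfig to accept a preferred_transfer_client argument:

```python
import boto3
from boto3.s3.constants import CLASSIC_TRANSFER_CLIENT
from boto3.s3.transfer import TransferConfig

# Pin the classic (pure-Python) transfer client; with "auto", boto3 may
# select the CRT-based client when the awscrt extra is installed.
config = TransferConfig(preferred_transfer_client=CLASSIC_TRANSFER_CLIENT)

s3_client = boto3.client('s3')
s3_client.upload_file('local-file.txt', 'my-bucket', 'uploads/file.txt', Config=config)
```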

## Usage Examples

### Basic File Uploads and Downloads

```python
import boto3

s3_client = boto3.client('s3')

# Upload a local file
s3_client.upload_file('local-file.txt', 'my-bucket', 'files/remote-file.txt')

# Download a file
s3_client.download_file('my-bucket', 'files/remote-file.txt', 'downloaded-file.txt')
```

### Using Transfer Configuration

```python
import boto3
from boto3.s3.transfer import TransferConfig

# Configure transfer settings
config = TransferConfig(
    multipart_threshold=25 * 1024 * 1024,  # 25 MB threshold for multipart
    max_concurrency=10,
    multipart_chunksize=25 * 1024 * 1024,  # 25 MB per part
    use_threads=True
)

s3_client = boto3.client('s3')

# Upload a large file with custom configuration
s3_client.upload_file(
    'large-file.zip',
    'my-bucket',
    'uploads/large-file.zip',
    Config=config
)
```

### Progress Callbacks

```python
import boto3
import sys
import threading

class UploadProgress:
    """Progress callback; boto3 invokes it with the bytes transferred per chunk."""

    def __init__(self):
        self._seen_so_far = 0
        self._lock = threading.Lock()  # callbacks may fire from multiple threads

    def __call__(self, bytes_transferred):
        with self._lock:
            self._seen_so_far += bytes_transferred
            sys.stdout.write(f"\rUploaded: {self._seen_so_far} bytes")
            sys.stdout.flush()

s3_client = boto3.client('s3')

# Upload with progress callback
s3_client.upload_file(
    'my-file.txt',
    'my-bucket',
    'uploads/my-file.txt',
    Callback=UploadProgress()
)
print()  # New line after progress output
```
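
For a percentage display, the total size must be known before the transfer starts. A sketch for downloads that fetches it with head_object (bucket and key are illustrative):

```python
import sys
import threading
import boto3

class DownloadPercentage:
    """Accumulating callback that prints percent complete."""

    def __init__(self, total_bytes):
        self._total = total_bytes
        self._seen_so_far = 0
        self._lock = threading.Lock()

    def __call__(self, bytes_transferred):
        with self._lock:
            self._seen_so_far += bytes_transferred
            percent = (self._seen_so_far / self._total) * 100
            sys.stdout.write(f"\rDownloaded: {percent:.1f}%")
            sys.stdout.flush()

s3_client = boto3.client('s3')

# The object's size is needed up front for a percentage
total = s3_client.head_object(
    Bucket='my-bucket', Key='files/remote-file.txt'
)['ContentLength']

s3_client.download_file(
    'my-bucket',
    'files/remote-file.txt',
    'local-copy.txt',
    Callback=DownloadPercentage(total)
)
print()  # New line after progress output
```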

### Working with File-like Objects

```python
import boto3
from io import BytesIO

s3_client = boto3.client('s3')

# Upload from an in-memory buffer
data = b"Hello, World! This is test data."
buffer = BytesIO(data)

s3_client.upload_fileobj(buffer, 'my-bucket', 'text/hello.txt')

# Download into an in-memory buffer
download_buffer = BytesIO()
s3_client.download_fileobj('my-bucket', 'text/hello.txt', download_buffer)

# Read the downloaded data
download_buffer.seek(0)
downloaded_data = download_buffer.read()
print(downloaded_data.decode('utf-8'))
```

### Using S3 Resources for Transfers

```python
import boto3

s3_resource = boto3.resource('s3')

# Work with a bucket resource
bucket = s3_resource.Bucket('my-bucket')
bucket.upload_file('local-file.txt', 'uploads/file.txt')
bucket.download_file('uploads/file.txt', 'downloaded-file.txt')

# Work with an object resource
obj = s3_resource.Object('my-bucket', 'documents/report.pdf')
obj.upload_file('local-report.pdf')
obj.download_file('downloaded-report.pdf')
```

### Copy Operations

```python
import boto3

s3_client = boto3.client('s3')

# Copy within the same bucket
copy_source = {'Bucket': 'my-bucket', 'Key': 'old-path/file.txt'}
s3_client.copy(copy_source, 'my-bucket', 'new-path/file.txt')

# Copy between buckets
copy_source = {'Bucket': 'source-bucket', 'Key': 'path/file.txt'}
s3_client.copy(copy_source, 'destination-bucket', 'path/file.txt')

# Copy with replacement metadata
s3_client.copy(
    copy_source,
    'destination-bucket',
    'path/file.txt',
    ExtraArgs={
        'MetadataDirective': 'REPLACE',
        'Metadata': {'author': 'John Doe', 'version': '1.0'}
    }
)
```

### Advanced Transfer Configuration

```python
import boto3
from boto3.s3.transfer import TransferConfig

# High-performance configuration for large files
high_perf_config = TransferConfig(
    multipart_threshold=100 * 1024 * 1024,  # 100 MB threshold
    max_concurrency=20,                     # 20 concurrent threads
    multipart_chunksize=100 * 1024 * 1024,  # 100 MB chunks
    use_threads=True,
    max_bandwidth=10 * 1024 * 1024          # 10 MB/s bandwidth limit
)

# Conservative configuration for limited resources
conservative_config = TransferConfig(
    multipart_threshold=10 * 1024 * 1024,  # 10 MB threshold
    max_concurrency=2,                     # ignored when use_threads=False
    multipart_chunksize=5 * 1024 * 1024,   # 5 MB chunks
    use_threads=False                      # run the transfer in the main thread
)

s3_client = boto3.client('s3')

# Choose the configuration that matches file size and system resources
s3_client.upload_file(
    'very-large-file.zip',
    'my-bucket',
    'uploads/large-file.zip',
    Config=high_perf_config
)
```

### Error Handling for Transfers

```python
import boto3
from boto3.exceptions import S3UploadFailedError, S3TransferFailedError
from botocore.exceptions import ClientError

s3_client = boto3.client('s3')

try:
    s3_client.upload_file('local-file.txt', 'my-bucket', 'uploads/file.txt')
    print("Upload successful")

except S3UploadFailedError as e:
    # Raised when a managed upload fails
    # (managed downloads raise S3TransferFailedError instead)
    print(f"S3 upload failed: {e}")

except ClientError as e:
    error_code = e.response['Error']['Code']
    if error_code == 'NoSuchBucket':
        print("Bucket does not exist")
    elif error_code == 'AccessDenied':
        print("Access denied - check permissions")
    else:
        print(f"AWS error: {e}")

except FileNotFoundError:
    print("Local file not found")

except Exception as e:
    print(f"Unexpected error: {e}")
```
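
Downloads fail differently: a missing key surfaces as a ClientError rather than an upload error. A sketch, assuming the HeadObject-style 404 error code that managed downloads typically report:

```python
import boto3
from botocore.exceptions import ClientError

s3_client = boto3.client('s3')

try:
    s3_client.download_file('my-bucket', 'missing-key.txt', 'local.txt')
except ClientError as e:
    # Managed downloads probe the object first, so a missing key
    # typically surfaces as an HTTP 404 error code
    if e.response['Error']['Code'] == '404':
        print("Object does not exist")
    else:
        raise
```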