# Legacy Transfer Interface

The original S3Transfer class provides simple, high-level methods for uploading files to and downloading files from Amazon S3. It is still supported, but the modern TransferManager is recommended for new development.

## Capabilities

### S3Transfer Class

The main transfer class providing upload and download functionality with automatic multipart handling based on file size thresholds.

```python { .api }
class S3Transfer:
    """
    High-level S3 transfer interface with automatic multipart handling.

    Args:
        client: boto3 S3 client instance
        config: TransferConfig for controlling transfer behavior
        osutil: OSUtils instance for file operations
    """
    def __init__(self, client, config=None, osutil=None): ...

    def upload_file(self, filename, bucket, key, callback=None, extra_args=None):
        """
        Upload a file to S3.

        Args:
            filename (str): Path to the local file to upload
            bucket (str): S3 bucket name
            key (str): S3 object key/name
            callback (callable, optional): Progress callback function(bytes_transferred)
            extra_args (dict, optional): Additional S3 operation arguments

        Raises:
            S3UploadFailedError: If the upload fails
            ValueError: If extra_args contains invalid keys
        """

    def download_file(self, bucket, key, filename, extra_args=None, callback=None):
        """
        Download an S3 object to a file.

        Args:
            bucket (str): S3 bucket name
            key (str): S3 object key/name
            filename (str): Path where the downloaded file is saved
            extra_args (dict, optional): Additional S3 operation arguments
            callback (callable, optional): Progress callback function(bytes_transferred)

        Raises:
            S3DownloadFailedError: If the download fails
            ValueError: If extra_args contains invalid keys
        """

    # Class constants listing the allowed operation arguments
    ALLOWED_DOWNLOAD_ARGS: List[str]
    ALLOWED_UPLOAD_ARGS: List[str]
```
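The multipart handling described above is driven purely by file size: files at or above the configured threshold use multipart transfers, smaller files use a single operation. A minimal sketch of that decision, assuming the documented 8 MB default (the helper name and return values are illustrative, not part of the s3transfer API):

```python
# Illustrative only: mirrors the documented size-threshold behavior of
# S3Transfer. The function itself is hypothetical, not an s3transfer API.
def choose_upload_strategy(file_size: int,
                           multipart_threshold: int = 8 * 1024 * 1024) -> str:
    """Return 'multipart' for files at or above the threshold, else 'single'."""
    return 'multipart' if file_size >= multipart_threshold else 'single'

print(choose_upload_strategy(5 * 1024 * 1024))    # small file
print(choose_upload_strategy(100 * 1024 * 1024))  # large file
```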

### Legacy TransferConfig

Configuration class for controlling S3Transfer behavior with basic options for multipart thresholds and concurrency.

```python { .api }
class TransferConfig:
    """
    Configuration for S3Transfer operations.

    Args:
        multipart_threshold (int): Size threshold in bytes for multipart transfers (default: 8 MB)
        max_concurrency (int): Maximum number of concurrent transfers (default: 10)
        multipart_chunksize (int): Size of each multipart chunk in bytes (default: 8 MB)
        num_download_attempts (int): Number of download retry attempts (default: 5)
        max_io_queue (int): Maximum size of the IO queue (default: 100)
    """
    def __init__(
        self,
        multipart_threshold=8 * 1024 * 1024,
        max_concurrency=10,
        multipart_chunksize=8 * 1024 * 1024,
        num_download_attempts=5,
        max_io_queue=100,
    ): ...

    multipart_threshold: int
    max_concurrency: int
    multipart_chunksize: int
    num_download_attempts: int
    max_io_queue: int
```
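The threshold and chunk size interact through simple arithmetic: once a file crosses the threshold, the number of parts is the file size divided by `multipart_chunksize`, rounded up. A quick illustration of that arithmetic (the helper is for exposition only, not an s3transfer API):

```python
import math

# Illustrative arithmetic: how many parts a multipart transfer of a given
# file would use under a given chunk size, using the documented 8 MB default.
def multipart_part_count(file_size: int,
                         multipart_chunksize: int = 8 * 1024 * 1024) -> int:
    return math.ceil(file_size / multipart_chunksize)

# A 100 MB file with the default 8 MB chunk size needs 13 parts.
print(multipart_part_count(100 * 1024 * 1024))
```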

### Legacy Utility Classes

Supporting classes for file operations and progress tracking used by the S3Transfer interface.

```python { .api }
class OSUtils:
    """
    OS utility functions for file operations.
    """
    def get_file_size(self, filename: str) -> int:
        """Get file size in bytes."""

    def open_file_chunk_reader(self, filename: str, start_byte: int, size: int, callback):
        """Open a file chunk reader with a progress callback."""

    def open(self, filename: str, mode: str):
        """Open a file."""

    def remove_file(self, filename: str):
        """Remove a file (no-op if it doesn't exist)."""

    def rename_file(self, current_filename: str, new_filename: str):
        """Rename a file."""

class ReadFileChunk:
    """
    File-like object for reading chunks of files with progress callbacks.

    Args:
        fileobj: File object to read from
        start_byte (int): Starting position in the file
        chunk_size (int): Maximum chunk size to read
        full_file_size (int): Total file size
        callback (callable, optional): Progress callback function
        enable_callback (bool): Whether callbacks are enabled initially
    """
    def __init__(
        self,
        fileobj,
        start_byte: int,
        chunk_size: int,
        full_file_size: int,
        callback=None,
        enable_callback: bool = True,
    ): ...

    @classmethod
    def from_filename(
        cls,
        filename: str,
        start_byte: int,
        chunk_size: int,
        callback=None,
        enable_callback: bool = True,
    ):
        """Create a ReadFileChunk from a filename."""

    def read(self, amount=None) -> bytes:
        """Read data from the chunk."""

    def seek(self, where: int):
        """Seek to a position within the chunk."""

    def tell(self) -> int:
        """Get the current position."""

    def close(self):
        """Close the file handle."""

    def enable_callback(self):
        """Enable progress callbacks."""

    def disable_callback(self):
        """Disable progress callbacks."""

class StreamReaderProgress:
    """
    Wrapper for read-only streams that adds progress callbacks.

    Args:
        stream: Stream to wrap
        callback (callable, optional): Progress callback function
    """
    def __init__(self, stream, callback=None): ...

    def read(self, *args, **kwargs) -> bytes:
        """Read from the stream with progress tracking."""
```
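The progress-wrapper idea behind StreamReaderProgress can be demonstrated without S3 at all. The following is a minimal stand-in with the same shape — wrap a read-only stream, report each read's byte count to a callback — written here for illustration; it is not the s3transfer class itself:

```python
import io

class ProgressStream:
    """Illustrative stand-in mirroring StreamReaderProgress's shape:
    wrap a read-only stream and report bytes read to a callback."""
    def __init__(self, stream, callback=None):
        self._stream = stream
        self._callback = callback

    def read(self, *args, **kwargs) -> bytes:
        data = self._stream.read(*args, **kwargs)
        if self._callback is not None:
            self._callback(len(data))  # report how many bytes this read returned
        return data

seen = []
wrapped = ProgressStream(io.BytesIO(b'0123456789'), callback=seen.append)
wrapped.read(4)   # reads 4 bytes
wrapped.read()    # reads the remaining 6 bytes
print(seen)       # prints [4, 6]
```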

## Usage Examples

### Basic Upload and Download

```python
import boto3
from s3transfer import S3Transfer

# Create a client and transfer object
client = boto3.client('s3', region_name='us-west-2')
transfer = S3Transfer(client)

# Simple upload
transfer.upload_file('/tmp/data.csv', 'my-bucket', 'data.csv')

# Simple download
transfer.download_file('my-bucket', 'data.csv', '/tmp/downloaded.csv')
```

### Upload with Progress Callback

```python
import os
import threading

import boto3
from s3transfer import S3Transfer

class ProgressPercentage:
    def __init__(self, filename):
        self._filename = filename
        self._size = float(os.path.getsize(filename))
        self._seen_so_far = 0
        self._lock = threading.Lock()

    def __call__(self, bytes_amount):
        with self._lock:
            self._seen_so_far += bytes_amount
            percentage = (self._seen_so_far / self._size) * 100
            print(f"\r{self._filename} {self._seen_so_far} / {self._size} ({percentage:.2f}%)", end='')

# Upload with progress tracking
client = boto3.client('s3')
transfer = S3Transfer(client)
progress = ProgressPercentage('/tmp/large_file.dat')
transfer.upload_file('/tmp/large_file.dat', 'my-bucket', 'large_file.dat', callback=progress)
```

### Upload with Extra Arguments

```python
import boto3
from s3transfer import S3Transfer

transfer = S3Transfer(boto3.client('s3'))

# Upload with metadata and an ACL
transfer.upload_file(
    '/tmp/document.pdf',
    'my-bucket',
    'documents/document.pdf',
    extra_args={
        'ACL': 'public-read',
        'Metadata': {'author': 'John Doe', 'version': '1.0'},
        'ContentType': 'application/pdf'
    }
)
```

### Configuration for Large Files

```python
import boto3
from s3transfer import S3Transfer, TransferConfig

# Configure for large files
config = TransferConfig(
    multipart_threshold=64 * 1024 * 1024,   # 64 MB
    max_concurrency=20,                     # more concurrent transfers
    multipart_chunksize=64 * 1024 * 1024,   # 64 MB chunks
    num_download_attempts=10                # more retry attempts
)

client = boto3.client('s3')
transfer = S3Transfer(client, config)

# Large files automatically use multipart transfers
transfer.upload_file('/tmp/large_dataset.zip', 'my-bucket', 'datasets/large_dataset.zip')
```

## Allowed Extra Arguments

### Upload Arguments

The following arguments can be passed in the `extra_args` parameter for uploads:

- `ACL`: Access control list permissions
- `CacheControl`: Cache control directives
- `ContentDisposition`: Content disposition header
- `ContentEncoding`: Content encoding (e.g., `gzip`)
- `ContentLanguage`: Content language
- `ContentType`: MIME type of the content
- `Expires`: Expiration date
- `GrantFullControl`: Full control permissions
- `GrantRead`: Read permissions
- `GrantReadACP`: Read ACP permissions
- `GrantWriteACP`: Write ACP permissions
- `Metadata`: User-defined metadata dictionary
- `RequestPayer`: Request payer setting
- `ServerSideEncryption`: Server-side encryption method
- `StorageClass`: Storage class (e.g., `STANDARD`, `REDUCED_REDUNDANCY`)
- `SSECustomerAlgorithm`: Customer-provided encryption algorithm
- `SSECustomerKey`: Customer-provided encryption key
- `SSECustomerKeyMD5`: MD5 hash of the customer encryption key
- `SSEKMSKeyId`: KMS key ID for encryption
- `SSEKMSEncryptionContext`: KMS encryption context
- `Tagging`: Object tags
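Passing any key outside the allowed list raises ValueError, as noted in the API above. A minimal sketch of that kind of check, assuming an abbreviated allowed list (the function is illustrative; the real validation happens inside S3Transfer against `ALLOWED_UPLOAD_ARGS`):

```python
# Illustrative only: mimics the documented ValueError behavior for
# unknown extra_args keys. Not the actual s3transfer implementation.
def validate_extra_args(extra_args: dict, allowed: list) -> None:
    """Raise ValueError if extra_args contains keys outside the allowed list."""
    for key in extra_args:
        if key not in allowed:
            raise ValueError(
                f"Invalid extra_args key '{key}', must be one of: {', '.join(allowed)}"
            )

allowed_upload_args = ['ACL', 'ContentType', 'Metadata']  # abbreviated for the example
validate_extra_args({'ContentType': 'text/csv'}, allowed_upload_args)  # passes silently
try:
    validate_extra_args({'BadKey': 1}, allowed_upload_args)
except ValueError as exc:
    print(exc)
```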

### Download Arguments

The following arguments can be passed in the `extra_args` parameter for downloads:

- `VersionId`: Specific version of the object to download
- `SSECustomerAlgorithm`: Customer-provided encryption algorithm
- `SSECustomerKey`: Customer-provided encryption key
- `SSECustomerKeyMD5`: MD5 hash of the customer encryption key
- `RequestPayer`: Request payer setting

## Migration Notes

When migrating from S3Transfer to TransferManager:

1. **Method signatures**: TransferManager methods accept file-like objects in addition to filenames
2. **Return values**: TransferManager returns futures for asynchronous handling
3. **Configuration**: The modern TransferConfig offers more granular options
4. **Resource management**: TransferManager requires an explicit shutdown
5. **Progress tracking**: Use subscribers instead of direct callbacks
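The futures pattern in point 2 can be illustrated with the standard library alone. The sketch below stands in for TransferManager's behavior: submitting work returns a future immediately, and calling `.result()` blocks until it completes. `fake_upload` is a hypothetical placeholder, not an s3transfer API:

```python
from concurrent.futures import ThreadPoolExecutor

def fake_upload(filename: str, bucket: str, key: str) -> str:
    # Hypothetical stand-in for a transfer operation; a real TransferManager
    # upload returns a future whose .result() blocks until the transfer completes.
    return f"s3://{bucket}/{key}"

with ThreadPoolExecutor(max_workers=2) as executor:
    future = executor.submit(fake_upload, '/tmp/data.csv', 'my-bucket', 'data.csv')
    # Other work can continue here while the "transfer" runs.
    print(future.result())  # prints s3://my-bucket/data.csv
```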