# Cloud Storage Integration

Cloud storage integration supporting Amazon S3, Azure Blob Storage, and Google Cloud Storage for automated backup archiving and retrieval. The cloud storage system enables offsite backup storage: archives are uploaded automatically after a backup completes and downloaded on demand for restore operations.

## Capabilities

### AWS S3 Integration

Complete integration with Amazon S3 and S3-compatible services for backup storage and retrieval.

#### S3 Upload

```python { .api }
def main(args, settings):
    """
    Upload backup archive to an Amazon S3 bucket.

    Module: grafana_backup.s3_upload
    Features: Multi-part upload, encryption support, custom endpoints
    Requirements: AWS credentials and bucket configuration
    """
```

#### S3 Download

```python { .api }
def main(args, settings):
    """
    Download backup archive from an Amazon S3 bucket.

    Module: grafana_backup.s3_download
    Returns: BytesIO stream of compressed backup data
    Features: Streaming download, encryption support, custom endpoints
    """
```

#### S3 Common Utilities

```python { .api }
# S3 utility functions
def get_s3_client(settings): ...      # Create configured S3 client
def get_s3_object_key(settings): ...  # Generate S3 object key with timestamp
```
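
To picture how a timestamped, prefixed object key like the one `get_s3_object_key` produces can be assembled, here is a minimal standard-library sketch. Note that `build_object_key` and its parameters are hypothetical illustrations; the real utility reads its values from the settings dictionary:

```python
from datetime import datetime

def build_object_key(prefix, file_format="%Y%m%d%H%M", now=None):
    # Hypothetical sketch: join an optional key prefix with a
    # timestamped archive name (the real helper uses settings)
    stamp = (now or datetime.now()).strftime(file_format)
    name = "backup_{0}.tar.gz".format(stamp)
    return "{0}/{1}".format(prefix, name) if prefix else name

# A fixed datetime makes the result deterministic:
print(build_object_key("production", now=datetime(2025, 1, 1, 12, 0)))
# → production/backup_202501011200.tar.gz
```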

### Azure Storage Integration

Integration with Azure Blob Storage for backup archiving and retrieval.

#### Azure Upload

```python { .api }
def main(args, settings):
    """
    Upload backup archive to Azure Blob Storage.

    Module: grafana_backup.azure_storage_upload
    Features: Block blob upload, container management, connection string auth
    Requirements: Azure storage connection string and container configuration
    """
```

#### Azure Download

```python { .api }
def main(args, settings):
    """
    Download backup archive from Azure Blob Storage.

    Module: grafana_backup.azure_storage_download
    Returns: BytesIO stream of compressed backup data
    Features: Streaming download, container management
    """
```

### Google Cloud Storage Integration

Integration with Google Cloud Storage for backup archiving and retrieval.

#### GCS Upload

```python { .api }
def main(args, settings):
    """
    Upload backup archive to a Google Cloud Storage bucket.

    Module: grafana_backup.gcs_upload
    Features: Resumable upload, service account authentication, bucket management
    Requirements: GCS service account credentials and bucket configuration
    """
```

#### GCS Download

```python { .api }
def main(args, settings):
    """
    Download backup archive from a Google Cloud Storage bucket.

    Module: grafana_backup.gcs_download
    Returns: BytesIO stream of compressed backup data
    Features: Streaming download, service account authentication
    """
```

## Configuration Requirements

### AWS S3 Configuration

```python { .api }
# Required AWS S3 settings
AWS_S3_BUCKET_NAME: str     # S3 bucket name for backup storage
AWS_S3_BUCKET_KEY: str      # Object key prefix for organizing backups
AWS_DEFAULT_REGION: str     # AWS region of the S3 bucket
AWS_ACCESS_KEY_ID: str      # AWS access key for authentication
AWS_SECRET_ACCESS_KEY: str  # AWS secret key for authentication
AWS_ENDPOINT_URL: str       # Custom endpoint URL for S3-compatible services (optional)
```

### Azure Storage Configuration

```python { .api }
# Required Azure storage settings
AZURE_STORAGE_CONTAINER_NAME: str     # Azure container name for backup storage
AZURE_STORAGE_CONNECTION_STRING: str  # Azure storage account connection string
```

### Google Cloud Storage Configuration

```python { .api }
# Required GCS settings
GCS_BUCKET_NAME: str                 # GCS bucket name for backup storage
GOOGLE_APPLICATION_CREDENTIALS: str  # Path to service account JSON key file
```
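
Before running a backup, the settings above can be sanity-checked with a small helper. This is an illustrative sketch only — `missing_settings` is not part of the tool, and the exact set of keys a deployment needs may differ (e.g. IAM roles can replace explicit AWS keys):

```python
# Hypothetical required-key sets per provider, derived from the sections above
REQUIRED_SETTINGS = {
    "s3": ["AWS_S3_BUCKET_NAME", "AWS_DEFAULT_REGION",
           "AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY"],
    "azure": ["AZURE_STORAGE_CONTAINER_NAME", "AZURE_STORAGE_CONNECTION_STRING"],
    "gcs": ["GCS_BUCKET_NAME", "GOOGLE_APPLICATION_CREDENTIALS"],
}

def missing_settings(provider, settings):
    # Return the required keys that are absent or empty for a provider
    return [key for key in REQUIRED_SETTINGS[provider] if not settings.get(key)]

print(missing_settings("azure", {"AZURE_STORAGE_CONTAINER_NAME": "grafana-backups"}))
# → ['AZURE_STORAGE_CONNECTION_STRING']
```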

## Upload Workflow Integration

### Automatic Upload After Backup

Cloud storage upload is automatically triggered after successful backup completion:

```python
# Backup workflow with automatic cloud upload
from grafana_backup.save import main as save_backup

# The backup process automatically handles cloud upload
save_backup(args, settings)
# 1. Performs backup of selected components
# 2. Creates local archive (unless --no-archive is specified)
# 3. Uploads to configured cloud storage providers
# 4. Reports upload status
```

### Upload Priority Order

When multiple cloud storage providers are configured, uploads occur in this order:

1. **AWS S3**: if `AWS_S3_BUCKET_NAME` is configured
2. **Azure Storage**: if `AZURE_STORAGE_CONTAINER_NAME` is configured
3. **Google Cloud Storage**: if `GCS_BUCKET_NAME` is configured
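
The ordered check above can be pictured as a simple dispatch sketch. This is illustrative, not the tool's internals — `planned_uploads` is a hypothetical name:

```python
def planned_uploads(settings):
    # Check providers in the documented priority order; every configured
    # provider receives the archive, in this sequence.
    order = [
        ("AWS S3", "AWS_S3_BUCKET_NAME"),
        ("Azure Storage", "AZURE_STORAGE_CONTAINER_NAME"),
        ("Google Cloud Storage", "GCS_BUCKET_NAME"),
    ]
    return [name for name, key in order if settings.get(key)]

print(planned_uploads({"GCS_BUCKET_NAME": "g", "AWS_S3_BUCKET_NAME": "b"}))
# → ['AWS S3', 'Google Cloud Storage']
```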

## Download Workflow Integration

### Automatic Download During Restore

Cloud storage download is automatically triggered during restore operations when cloud storage is configured and no local file is specified:

```python
# Restore workflow with automatic cloud download
from grafana_backup.restore import main as restore_backup

# Configure cloud storage
settings['AWS_S3_BUCKET_NAME'] = 'my-backups'

# The restore process automatically downloads from the cloud
restore_backup(args, settings)
# 1. Detects cloud storage configuration
# 2. Downloads the specified archive from cloud storage
# 3. Extracts and processes backup components
# 4. Restores to the Grafana instance
```
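
Since the download helpers return a `BytesIO` stream of the compressed archive, such a stream can be handled with standard-library tools. For example, listing the members of an in-memory `.tar.gz` — an illustrative sketch, not the tool's actual restore path:

```python
import io
import tarfile

def list_archive_members(stream):
    # Rewind the in-memory stream and read it as a gzip-compressed tar
    stream.seek(0)
    with tarfile.open(fileobj=stream, mode="r:gz") as tar:
        return [member.name for member in tar.getmembers()]

# Build a tiny archive in memory to demonstrate:
buf = io.BytesIO()
with tarfile.open(fileobj=buf, mode="w:gz") as tar:
    payload = b"{}"
    info = tarfile.TarInfo(name="dashboards/home.json")
    info.size = len(payload)
    tar.addfile(info, io.BytesIO(payload))

print(list_archive_members(buf))
# → ['dashboards/home.json']
```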

## Usage Examples

### AWS S3 Backup and Restore

```python
from grafana_backup.save import main as save_backup
from grafana_backup.restore import main as restore_backup
from grafana_backup.grafanaSettings import main as load_config

# Load configuration with S3 settings
settings = load_config('/path/to/config.json')
settings.update({
    'AWS_S3_BUCKET_NAME': 'my-grafana-backups',
    'AWS_S3_BUCKET_KEY': 'production',
    'AWS_DEFAULT_REGION': 'us-east-1',
    'AWS_ACCESS_KEY_ID': 'AKIAIOSFODNN7EXAMPLE',
    'AWS_SECRET_ACCESS_KEY': 'wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY'
})

# Backup with automatic S3 upload
save_args = {
    'save': True,
    '--components': None,
    '--no-archive': False,
    '--config': None
}
save_backup(save_args, settings)

# Restore from S3
restore_args = {
    'restore': True,
    '<archive_file>': 'backup_202501011200.tar.gz',
    '--components': None,
    '--config': None
}
restore_backup(restore_args, settings)
```

### Azure Storage Backup and Restore

```python
# Configure Azure Storage
settings.update({
    'AZURE_STORAGE_CONTAINER_NAME': 'grafana-backups',
    'AZURE_STORAGE_CONNECTION_STRING': 'DefaultEndpointsProtocol=https;AccountName=mystorageaccount;AccountKey=...'
})

# Backup with automatic Azure upload
save_backup(save_args, settings)

# Restore from Azure Storage
restore_backup(restore_args, settings)
```

### Google Cloud Storage Backup and Restore

```python
import os

# Configure GCS with a service account
os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = '/path/to/service-account.json'
settings.update({
    'GCS_BUCKET_NAME': 'my-grafana-backups'
})

# Backup with automatic GCS upload
save_backup(save_args, settings)

# Restore from GCS
restore_backup(restore_args, settings)
```

### Multi-Cloud Configuration

```python
# Configure multiple cloud providers (all will be used)
settings.update({
    # AWS S3
    'AWS_S3_BUCKET_NAME': 'primary-backups',
    'AWS_DEFAULT_REGION': 'us-east-1',

    # Azure Storage
    'AZURE_STORAGE_CONTAINER_NAME': 'secondary-backups',
    'AZURE_STORAGE_CONNECTION_STRING': 'DefaultEndpointsProtocol=https;...',

    # Google Cloud Storage
    'GCS_BUCKET_NAME': 'tertiary-backups'
})

# Backup will upload to all configured providers
save_backup(save_args, settings)
```

## Storage Organization

### Backup File Naming

Cloud storage objects use consistent naming with timestamps:

- **Format**: `backup_{timestamp}.tar.gz`
- **Timestamp**: Based on the `BACKUP_FILE_FORMAT` setting (default: `%Y%m%d%H%M`)
- **Example**: `backup_202501011200.tar.gz`
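
The timestamp portion follows `strftime` semantics, so the default format can be checked directly with the standard library:

```python
from datetime import datetime

BACKUP_FILE_FORMAT = "%Y%m%d%H%M"  # the documented default

# Formatting a fixed datetime reproduces the example name above
archive_name = "backup_{0}.tar.gz".format(
    datetime(2025, 1, 1, 12, 0).strftime(BACKUP_FILE_FORMAT))
print(archive_name)
# → backup_202501011200.tar.gz
```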

### S3 Object Keys

S3 objects are organized using the configured key prefix:

- **Pattern**: `{AWS_S3_BUCKET_KEY}/backup_{timestamp}.tar.gz`
- **Example**: `production/backup_202501011200.tar.gz`

### Azure Blob Names

Azure blobs use simple naming without additional prefixes:

- **Pattern**: `backup_{timestamp}.tar.gz`
- **Container**: Specified by `AZURE_STORAGE_CONTAINER_NAME`

### GCS Object Names

GCS objects use simple naming without additional prefixes:

- **Pattern**: `backup_{timestamp}.tar.gz`
- **Bucket**: Specified by `GCS_BUCKET_NAME`

## Error Handling

### Upload Error Handling

- **Network failures**: Retry logic with exponential backoff
- **Authentication errors**: Clear error messages for credential issues
- **Permission errors**: Detailed bucket/container permission guidance
- **Storage quota**: Appropriate error messages for storage limits

### Download Error Handling

- **Missing objects**: Clear messages when backup files don't exist
- **Network failures**: Retry logic for temporary connectivity issues
- **Authentication errors**: Credential validation before download attempts
- **Corruption detection**: Integrity checks on downloaded data
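
The retry-with-exponential-backoff behavior described above can be sketched generically. This is illustrative only — `with_retries` and its parameters are not the tool's actual API:

```python
import time

def with_retries(operation, attempts=4, base_delay=0.5):
    # Retry a transient-failure-prone call, doubling the delay each
    # attempt: base_delay, 2*base_delay, 4*base_delay, ...
    for attempt in range(attempts):
        try:
            return operation()
        except OSError:
            if attempt == attempts - 1:
                raise  # retries exhausted; surface the error
            time.sleep(base_delay * (2 ** attempt))

# Example: an upload that fails twice before succeeding
state = {"calls": 0}

def flaky_upload():
    state["calls"] += 1
    if state["calls"] < 3:
        raise OSError("temporary network failure")
    return "uploaded"

print(with_retries(flaky_upload, attempts=4, base_delay=0.01))
# → uploaded
```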

## Security Considerations

### Credential Management

- **Environment variables**: Use environment variables for production credentials
- **IAM roles**: Prefer IAM roles over access keys for EC2/container deployments
- **Service accounts**: Use service accounts with minimal required permissions
- **Connection strings**: Protect Azure connection strings as sensitive data

### Encryption

- **S3**: Supports server-side encryption (SSE-S3, SSE-KMS)
- **Azure**: Automatic encryption at rest for Blob Storage
- **GCS**: Automatic encryption at rest with optional customer-managed keys

### Access Control

- **S3**: Configure bucket policies and IAM permissions appropriately
- **Azure**: Use Azure RBAC for container access control
- **GCS**: Configure IAM permissions for bucket and object access

## Performance Optimization

### Upload Performance

- **Multi-part uploads**: Automatic for large archives
- **Streaming uploads**: Memory-efficient upload process
- **Parallel uploads**: Multiple cloud providers upload in the same run

### Download Performance

- **Streaming downloads**: Memory-efficient download process
- **Compression**: Archives are compressed to minimize transfer time
- **Resume capability**: Support for resuming interrupted downloads where available

The cloud storage integration provides robust, production-ready backup storage with comprehensive error handling and security features.