A Python-based application to backup Grafana settings using the Grafana API
---
Cloud storage integration supports Amazon S3, Azure Blob Storage, and Google Cloud Storage for automated backup archiving and retrieval: backups are uploaded offsite automatically after completion and downloaded automatically for restore operations.
Complete integration with Amazon S3 and S3-compatible services for backup storage and retrieval.
```python
def main(args, settings):
    """
    Upload backup archive to Amazon S3 bucket

    Module: grafana_backup.s3_upload
    Features: Multi-part upload, encryption support, custom endpoints
    Requirements: AWS credentials and bucket configuration
    """

def main(args, settings):
    """
    Download backup archive from Amazon S3 bucket

    Module: grafana_backup.s3_download
    Returns: BytesIO stream of compressed backup data
    Features: Streaming download, encryption support, custom endpoints
    """

# S3 utility functions
def get_s3_client(settings): ...      # Create configured S3 client
def get_s3_object_key(settings): ...  # Generate S3 object key with timestamp
```

Integration with Azure Blob Storage for backup archiving and retrieval.
```python
def main(args, settings):
    """
    Upload backup archive to Azure Blob Storage
    
    Module: grafana_backup.azure_storage_upload
    Features: Block blob upload, container management, connection string auth
    Requirements: Azure storage connection string and container configuration
    """

def main(args, settings):
    """
    Download backup archive from Azure Blob Storage

    Module: grafana_backup.azure_storage_download
    Returns: BytesIO stream of compressed backup data
    Features: Streaming download, container management
    """
```

Integration with Google Cloud Storage for backup archiving and retrieval.
```python
def main(args, settings):
    """
    Upload backup archive to Google Cloud Storage bucket

    Module: grafana_backup.gcs_upload
    Features: Resumable upload, service account authentication, bucket management
    Requirements: GCS service account credentials and bucket configuration
    """

def main(args, settings):
    """
    Download backup archive from Google Cloud Storage bucket

    Module: grafana_backup.gcs_download
    Returns: BytesIO stream of compressed backup data
    Features: Streaming download, service account authentication
    """
```

```python
# Required AWS S3 settings
AWS_S3_BUCKET_NAME: str     # S3 bucket name for backup storage
AWS_S3_BUCKET_KEY: str      # Object key prefix for organizing backups
AWS_DEFAULT_REGION: str     # AWS region for S3 bucket
AWS_ACCESS_KEY_ID: str      # AWS access key for authentication
AWS_SECRET_ACCESS_KEY: str  # AWS secret key for authentication
AWS_ENDPOINT_URL: str       # Custom endpoint URL for S3-compatible services (optional)

# Required Azure storage settings
AZURE_STORAGE_CONTAINER_NAME: str     # Azure container name for backup storage
AZURE_STORAGE_CONNECTION_STRING: str  # Azure storage account connection string

# Required GCS settings
GCS_BUCKET_NAME: str                 # GCS bucket name for backup storage
GOOGLE_APPLICATION_CREDENTIALS: str  # Path to service account JSON key file
```

Cloud storage upload is triggered automatically after a successful backup completes.
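The per-provider requirements above can be sanity-checked before running a backup. The sketch below is illustrative only; `missing_settings` and the `REQUIRED` table are hypothetical helpers, not part of the package:

```python
# Hypothetical pre-flight check: report required settings that are
# absent or empty for a given provider. Not part of grafana_backup.
REQUIRED = {
    'aws': ['AWS_S3_BUCKET_NAME', 'AWS_DEFAULT_REGION',
            'AWS_ACCESS_KEY_ID', 'AWS_SECRET_ACCESS_KEY'],
    'azure': ['AZURE_STORAGE_CONTAINER_NAME', 'AZURE_STORAGE_CONNECTION_STRING'],
    'gcs': ['GCS_BUCKET_NAME', 'GOOGLE_APPLICATION_CREDENTIALS'],
}

def missing_settings(settings, provider):
    """Return the required keys for a provider that are absent or empty."""
    return [key for key in REQUIRED[provider] if not settings.get(key)]

settings = {'AWS_S3_BUCKET_NAME': 'my-backups', 'AWS_DEFAULT_REGION': 'us-east-1'}
print(missing_settings(settings, 'aws'))
# → ['AWS_ACCESS_KEY_ID', 'AWS_SECRET_ACCESS_KEY']
```

Failing fast on incomplete credentials avoids discovering a misconfiguration only after the backup itself has finished.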
```python
# Backup workflow with automatic cloud upload
from grafana_backup.save import main as save_backup

# The backup process automatically handles cloud upload:
save_backup(args, settings)
# 1. Performs backup of selected components
# 2. Creates local archive (unless --no-archive specified)
# 3. Uploads to configured cloud storage providers
# 4. Reports upload status
```

When multiple cloud storage providers are configured, uploads occur in this order:

1. Amazon S3, when `AWS_S3_BUCKET_NAME` is configured
2. Azure Blob Storage, when `AZURE_STORAGE_CONTAINER_NAME` is configured
3. Google Cloud Storage, when `GCS_BUCKET_NAME` is configured

Cloud storage download is triggered automatically during restore operations when cloud storage is configured and no local file is specified.
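The selection order just described can be sketched as a small helper. This is a hypothetical illustration of the dispatch logic, not the package's API:

```python
# Hypothetical sketch of the provider selection order described above.
UPLOAD_ORDER = [
    ('s3', 'AWS_S3_BUCKET_NAME'),
    ('azure', 'AZURE_STORAGE_CONTAINER_NAME'),
    ('gcs', 'GCS_BUCKET_NAME'),
]

def upload_targets(settings):
    """Return the configured providers in the order uploads occur."""
    return [name for name, key in UPLOAD_ORDER if settings.get(key)]

# Order follows UPLOAD_ORDER, not the order keys were set
print(upload_targets({'GCS_BUCKET_NAME': 'g', 'AWS_S3_BUCKET_NAME': 'a'}))
# → ['s3', 'gcs']
```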
```python
# Restore workflow with automatic cloud download
from grafana_backup.restore import main as restore_backup

# Configure cloud storage
settings['AWS_S3_BUCKET_NAME'] = 'my-backups'

# The restore process automatically downloads from cloud:
restore_backup(args, settings)
# 1. Detects cloud storage configuration
# 2. Downloads specified archive from cloud storage
# 3. Extracts and processes backup components
# 4. Restores to Grafana instance
```

```python
from grafana_backup.save import main as save_backup
from grafana_backup.restore import main as restore_backup
from grafana_backup.grafanaSettings import main as load_config

# Load configuration with S3 settings
settings = load_config('/path/to/config.json')
settings.update({
    'AWS_S3_BUCKET_NAME': 'my-grafana-backups',
    'AWS_S3_BUCKET_KEY': 'production',
    'AWS_DEFAULT_REGION': 'us-east-1',
    'AWS_ACCESS_KEY_ID': 'AKIAIOSFODNN7EXAMPLE',
    'AWS_SECRET_ACCESS_KEY': 'wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY'
})

# Backup with automatic S3 upload
save_args = {
    'save': True,
    '--components': None,
    '--no-archive': False,
    '--config': None
}
save_backup(save_args, settings)

# Restore from S3
restore_args = {
    'restore': True,
    '<archive_file>': 'backup_202501011200.tar.gz',
    '--components': None,
    '--config': None
}
restore_backup(restore_args, settings)
```

```python
# Configure Azure Storage
settings.update({
    'AZURE_STORAGE_CONTAINER_NAME': 'grafana-backups',
    'AZURE_STORAGE_CONNECTION_STRING': 'DefaultEndpointsProtocol=https;AccountName=mystorageaccount;AccountKey=...'
})

# Backup with automatic Azure upload
save_backup(save_args, settings)

# Restore from Azure Storage
restore_backup(restore_args, settings)
```

```python
import os

# Configure GCS with service account
os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = '/path/to/service-account.json'
settings.update({
    'GCS_BUCKET_NAME': 'my-grafana-backups'
})

# Backup with automatic GCS upload
save_backup(save_args, settings)

# Restore from GCS
restore_backup(restore_args, settings)
```

```python
# Configure multiple cloud providers (all will be used)
settings.update({
    # AWS S3
    'AWS_S3_BUCKET_NAME': 'primary-backups',
    'AWS_DEFAULT_REGION': 'us-east-1',
    # Azure Storage
    'AZURE_STORAGE_CONTAINER_NAME': 'secondary-backups',
    'AZURE_STORAGE_CONNECTION_STRING': 'DefaultEndpointsProtocol=https;...',
    # Google Cloud Storage
    'GCS_BUCKET_NAME': 'tertiary-backups'
})

# Backup will upload to all configured providers
save_backup(save_args, settings)
```

Cloud storage objects use consistent naming with timestamps:
```
backup_{timestamp}.tar.gz
```

The timestamp format is controlled by the `BACKUP_FILE_FORMAT` setting (default: `%Y%m%d%H%M`), producing names such as `backup_202501011200.tar.gz`.

S3 objects are organized using the configured key prefix:

```
{AWS_S3_BUCKET_KEY}/backup_{timestamp}.tar.gz
```

For example: `production/backup_202501011200.tar.gz`.

Azure blobs use simple naming without additional prefixes and are stored directly in the container named by `AZURE_STORAGE_CONTAINER_NAME`:

```
backup_{timestamp}.tar.gz
```

GCS objects likewise use simple naming without additional prefixes and are stored in the bucket named by `GCS_BUCKET_NAME`:

```
backup_{timestamp}.tar.gz
```

The cloud storage integration provides robust, production-ready backup storage with error handling and security features.
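The naming scheme can be reproduced with a few lines of standard-library Python. This is an illustrative sketch; `object_key` is a hypothetical helper, not the package's API:

```python
from datetime import datetime

def object_key(settings, now=None):
    """Hypothetical sketch: build an object name per the scheme above."""
    # BACKUP_FILE_FORMAT defaults to %Y%m%d%H%M
    fmt = settings.get('BACKUP_FILE_FORMAT', '%Y%m%d%H%M')
    stamp = (now or datetime.now()).strftime(fmt)
    name = 'backup_{0}.tar.gz'.format(stamp)
    # S3 prepends the configured key prefix; Azure and GCS use the bare name
    prefix = settings.get('AWS_S3_BUCKET_KEY')
    return '{0}/{1}'.format(prefix, name) if prefix else name

print(object_key({'AWS_S3_BUCKET_KEY': 'production'},
                 now=datetime(2025, 1, 1, 12, 0)))
# → production/backup_202501011200.tar.gz
```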
Install with Tessl CLI:

```shell
npx tessl i tessl/pypi-grafana-backup
```