A Python-based application to backup Grafana settings using the Grafana API
---
Archive creation and extraction functionality with support for the compressed tar.gz format and automatic timestamping. The archive management system packages backup files into portable archives for storage and distribution, creating compressed tar.gz archives from backup directory structures with optional cleanup.
```python
def main(args, settings):
    """
    Create compressed archive from backup directory

    Module: grafana_backup.archive

    Args:
        args (dict): Command-line arguments, including the --no-archive flag
        settings (dict): Configuration settings with backup directory and timestamp

    Features: gzip compression, timestamped filenames, automatic cleanup
    Output: backup_{timestamp}.tar.gz in the backup directory
    """
```

Archive creation is automatically triggered after successful backup completion unless disabled.
```python
# Archive creation workflow
from grafana_backup.save import main as save_backup

# The normal backup process includes automatic archiving
save_args = {
    'save': True,
    '--components': None,
    '--no-archive': False,  # Archive will be created
    '--config': None
}
save_backup(save_args, settings)

# Skip archive creation for troubleshooting
save_args['--no-archive'] = True  # Archive creation skipped
save_backup(save_args, settings)
```

Archives contain the complete backup directory structure:
```text
backup_{timestamp}.tar.gz
├── dashboards/
│   └── {timestamp}/
│       ├── dashboard1.json
│       ├── dashboard2.json
│       └── ...
├── datasources/
│   └── {timestamp}/
│       ├── datasource1.json
│       ├── datasource2.json
│       └── ...
├── folders/
│   └── {timestamp}/
│       ├── folder1.json
│       └── ...
└── [other components]/
    └── {timestamp}/
        └── ...
```

Archive files use consistent timestamped naming.
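These names follow directly from `datetime.strftime`; a small illustration of both the default and a custom format (the timestamp value below is arbitrary):

```python
from datetime import datetime

moment = datetime(2025, 1, 1, 12, 0, 0)

# Default timestamp format
default_name = "backup_" + moment.strftime("%Y%m%d%H%M") + ".tar.gz"

# Custom format supplied via the BACKUP_FILE_FORMAT setting
custom_name = "backup_" + moment.strftime("%Y-%m-%d_%H-%M-%S") + ".tar.gz"

print(default_name)  # backup_202501011200.tar.gz
print(custom_name)   # backup_2025-01-01_12-00-00.tar.gz
```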
```python
# Archive filename format
filename_pattern = "backup_{timestamp}.tar.gz"

# Example with the default timestamp format (%Y%m%d%H%M)
example_filename = "backup_202501011200.tar.gz"

# Customizable via the BACKUP_FILE_FORMAT setting
custom_format = "%Y-%m-%d_%H-%M-%S"
custom_filename = "backup_2025-01-01_12-00-00.tar.gz"
```

Archive extraction is automatically handled during restore operations:
```python
# The restore process automatically handles extraction
from grafana_backup.restore import main as restore_backup

restore_args = {
    'restore': True,
    '<archive_file>': 'backup_202501011200.tar.gz',
    '--components': None,
    '--config': None
}

# Extraction process:
# 1. Validates archive format and integrity
# 2. Creates temporary directory for extraction
# 3. Extracts archive contents
# 4. Processes component files
# 5. Cleans up temporary directory
restore_backup(restore_args, settings)
```

Archives can also be manually extracted using standard tools.
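The five extraction steps above can be sketched with the standard library's `tarfile` and `tempfile` modules. This is a hypothetical simplification of the restore flow, not the package's actual code:

```python
import json
import os
import shutil
import tarfile
import tempfile

def extract_archive_sketch(archive_file):
    """Mirror the restore steps: validate, extract, process, clean up."""
    # 1. Validate the archive format before touching anything else
    if not tarfile.is_tarfile(archive_file):
        raise ValueError("not a valid tar archive: {0}".format(archive_file))
    # 2. Create a temporary directory for extraction
    tmp_dir = tempfile.mkdtemp()
    processed = []
    try:
        # 3. Extract the archive contents
        with tarfile.open(archive_file, "r:gz") as tar:
            tar.extractall(tmp_dir)
        # 4. Process the component files (here: just parse the JSON)
        for dirpath, _, filenames in os.walk(tmp_dir):
            for name in sorted(filenames):
                if name.endswith(".json"):
                    with open(os.path.join(dirpath, name)) as f:
                        processed.append((name, json.load(f)))
    finally:
        # 5. Clean up the temporary directory
        shutil.rmtree(tmp_dir)
    return processed

# Build a tiny archive to run the sketch against
work = tempfile.mkdtemp()
component = os.path.join(work, "dashboards", "202501011200")
os.makedirs(component)
with open(os.path.join(component, "home.json"), "w") as f:
    json.dump({"title": "Home"}, f)
archive = os.path.join(work, "backup_202501011200.tar.gz")
with tarfile.open(archive, "w:gz") as tar:
    tar.add(os.path.join(work, "dashboards"), arcname="dashboards")

items = extract_archive_sketch(archive)
```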
```shell
# Extract archive manually
tar -xzf backup_202501011200.tar.gz

# List archive contents
tar -tzf backup_202501011200.tar.gz

# Extract specific components
tar -xzf backup_202501011200.tar.gz dashboards/
```

Configuration settings affecting archive operations:

```python
BACKUP_DIR: str           # Directory for backup files and archives
BACKUP_FILE_FORMAT: str   # Timestamp format for archive naming
TIMESTAMP: str            # Current timestamp using BACKUP_FILE_FORMAT
```

Archive creation can be controlled through command-line arguments:
```python
# Control archive creation
args = {
    '--no-archive': False  # Create archive (default)
}
args = {
    '--no-archive': True   # Skip archive creation
}
```

An archive can also be created directly from an existing backup:

```python
from grafana_backup.archive import main as create_archive
from grafana_backup.grafanaSettings import main as load_config

# Load configuration
settings = load_config('/path/to/grafanaSettings.json')

# Create archive from existing backup
args = {
    '--no-archive': False,
    '--config': None
}
create_archive(args, settings)
```

A full backup with automatic archive creation:

```python
from grafana_backup.save import main as save_backup

# Backup with automatic archive creation
save_args = {
    'save': True,
    '--components': 'dashboards,datasources',
    '--no-archive': False,  # Archive will be created
    '--config': None
}
save_backup(save_args, settings)
# Result: individual JSON files + backup_{timestamp}.tar.gz
```

And without the archive, for troubleshooting:

```python
# Backup without archive for troubleshooting
save_args = {
    'save': True,
    '--components': 'dashboards',
    '--no-archive': True,  # Skip archive creation
    '--config': None
}
save_backup(save_args, settings)
# Result: only individual JSON files, no archive
```

Archives use gzip compression for an optimal balance of size and compatibility.
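Gzip is a good fit here because Grafana exports are highly repetitive JSON. This can be checked directly with the standard library (the payload below is a stand-in, not real Grafana output):

```python
import gzip
import json

# Dashboard-style JSON is highly repetitive, so it compresses well
payload = json.dumps({"panels": [{"type": "graph", "title": "cpu"}] * 200}).encode()
compressed = gzip.compress(payload)
ratio = len(compressed) / len(payload)
print(len(payload), len(compressed))
```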
The archive maintains the complete backup directory structure, and archive creation includes integrity validation. Comprehensive error handling covers archive creation:

```python
# Common archive creation errors and handling:
# - Insufficient disk space: clear error message with space requirements
# - Permission issues: detailed error about directory access
# - Compression failures: fall back to uncompressed tar if needed
# - File access errors: skip problematic files with warnings
```

Archive extraction is handled just as robustly:

```python
# Common extraction errors and handling:
# - Corrupted archives: clear error message with integrity check results
# - Insufficient space: space requirements and available space reporting
# - Permission issues: temporary directory creation and access validation
# - Format errors: validation of archive format before extraction
```

Archive creation is optimized for performance.
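The creation-side failure modes listed above map onto standard Python exceptions. A minimal, hypothetical sketch (not the package's actual error handling; note that `PermissionError` and `ENOSPC` both surface as `OSError` subclasses):

```python
import os
import tarfile

def create_archive_safely(backup_dir, timestamp):
    """Archive creation wrapped in the error handling described above."""
    archive_path = os.path.join(backup_dir, "backup_{0}.tar.gz".format(timestamp))
    try:
        with tarfile.open(archive_path, "w:gz") as tar:
            for entry in sorted(os.listdir(backup_dir)):
                full_path = os.path.join(backup_dir, entry)
                if not os.path.isdir(full_path):
                    continue
                try:
                    tar.add(full_path, arcname=entry)
                except OSError as exc:
                    # File access errors: skip problematic files with a warning
                    print("warning: skipping {0}: {1}".format(entry, exc))
    except PermissionError as exc:
        # Permission issues: report which directory could not be written
        raise RuntimeError("cannot write archive in {0}: {1}".format(backup_dir, exc))
    except OSError as exc:
        # Covers e.g. ENOSPC (insufficient disk space)
        raise RuntimeError("archive creation failed: {0}".format(exc))
    return archive_path
```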
Archives are optimized for size while maintaining data integrity, and they are designed for cloud storage integration:
```python
# Archive creation automatically triggers cloud uploads
# if cloud storage is configured
save_backup(save_args, settings)
# 1. Creates individual backup files
# 2. Creates compressed archive
# 3. Uploads archive to configured cloud storage
# 4. Optionally cleans up local files
```

```python
# Cloud downloads provide archives ready for extraction
restore_backup(restore_args, settings)
# 1. Downloads archive from cloud storage
# 2. Extracts archive to temporary directory
# 3. Processes extracted component files
# 4. Cleans up temporary files
```

The archive management system provides reliable, portable backup packaging with comprehensive error handling and performance optimization.
Install with the Tessl CLI:

```shell
npx tessl i tessl/pypi-grafana-backup
```