tessl/pypi-grafana-backup

A Python-based application to back up Grafana settings using the Grafana API.

# Archive Management

Archive creation and extraction functionality with support for the compressed tar.gz format and automatic timestamping. The archive management system packages backup files into portable archives for storage and distribution.

## Capabilities

### Archive Creation

Creates compressed tar.gz archives from backup directory structures with automatic timestamping and optional cleanup.

```python
def main(args, settings):
    """
    Create a compressed archive from the backup directory.

    Module: grafana_backup.archive
    Args:
        args (dict): Command-line arguments, including the --no-archive flag
        settings (dict): Configuration settings with backup directory and timestamp

    Features: gzip compression, timestamped filenames, automatic cleanup
    Output: backup_{timestamp}.tar.gz in the backup directory
    """
```
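
The behavior described above can be sketched with Python's standard `tarfile` module. This is an illustrative implementation, not the tool's actual code; `create_archive` and its parameters are hypothetical names:

```python
import os
import tarfile
from datetime import datetime

def create_archive(backup_dir: str, timestamp_format: str = "%Y%m%d%H%M") -> str:
    """Package the contents of backup_dir into backup_{timestamp}.tar.gz."""
    timestamp = datetime.now().strftime(timestamp_format)
    archive_path = os.path.join(backup_dir, "backup_{}.tar.gz".format(timestamp))
    with tarfile.open(archive_path, "w:gz") as tar:
        for entry in os.listdir(backup_dir):
            if entry.endswith(".tar.gz"):
                continue  # skip previously created archives
            tar.add(os.path.join(backup_dir, entry), arcname=entry)
    return archive_path
```

`tarfile.open(..., "w:gz")` streams gzip compression as members are added, and `arcname` keeps the archive paths relative to the backup directory.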

## Archive Creation Process

### Automatic Archive Creation

Archive creation is automatically triggered after successful backup completion unless disabled:

```python
# Archive creation workflow
from grafana_backup.save import main as save_backup

# The normal backup process includes automatic archiving
save_args = {
    'save': True,
    '--components': None,
    '--no-archive': False,  # archive will be created
    '--config': None
}
save_backup(save_args, settings)

# Skip archive creation for troubleshooting
save_args['--no-archive'] = True  # archive creation skipped
save_backup(save_args, settings)
```

### Archive File Structure

Archives contain the complete backup directory structure:

```text
backup_{timestamp}.tar.gz
├── dashboards/
│   └── {timestamp}/
│       ├── dashboard1.json
│       ├── dashboard2.json
│       └── ...
├── datasources/
│   └── {timestamp}/
│       ├── datasource1.json
│       ├── datasource2.json
│       └── ...
├── folders/
│   └── {timestamp}/
│       ├── folder1.json
│       └── ...
└── [other components]/
    └── {timestamp}/
        └── ...
```

### Archive Naming Convention

Archive files use consistent timestamped naming:

```python
# Archive filename format
filename_pattern = "backup_{timestamp}.tar.gz"

# Example with the default timestamp format (%Y%m%d%H%M)
example_filename = "backup_202501011200.tar.gz"

# Customizable via the BACKUP_FILE_FORMAT setting
custom_format = "%Y-%m-%d_%H-%M-%S"
custom_filename = "backup_2025-01-01_12-00-00.tar.gz"
```
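
A minimal sketch of how such a filename can be produced with `strftime` (the helper name is hypothetical; the format strings match the examples above):

```python
from datetime import datetime
from typing import Optional

def archive_filename(fmt: str = "%Y%m%d%H%M", now: Optional[datetime] = None) -> str:
    """Build a timestamped archive filename, e.g. backup_202501011200.tar.gz."""
    now = now or datetime.now()
    return "backup_{}.tar.gz".format(now.strftime(fmt))
```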

## Archive Extraction

### Automatic Extraction During Restore

Archive extraction is automatically handled during restore operations:

```python
# The restore process automatically handles extraction
from grafana_backup.restore import main as restore_backup

restore_args = {
    'restore': True,
    '<archive_file>': 'backup_202501011200.tar.gz',
    '--components': None,
    '--config': None
}

# Extraction process:
# 1. Validates archive format and integrity
# 2. Creates a temporary directory for extraction
# 3. Extracts the archive contents
# 4. Processes component files
# 5. Cleans up the temporary directory
restore_backup(restore_args, settings)
```

### Manual Archive Extraction

Archives can be manually extracted using standard tools:

```shell
# Extract the archive
tar -xzf backup_202501011200.tar.gz

# List archive contents
tar -tzf backup_202501011200.tar.gz

# Extract specific components only
tar -xzf backup_202501011200.tar.gz dashboards/
```
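
The same inspection and selective extraction can be done from Python with the standard `tarfile` module (a sketch, not part of the tool's API; both function names are hypothetical):

```python
import tarfile

def list_archive(path: str) -> list:
    """List member paths inside a backup archive."""
    with tarfile.open(path, "r:gz") as tar:
        return tar.getnames()

def extract_component(path: str, component: str, dest: str) -> None:
    """Extract only the members under a given component directory."""
    with tarfile.open(path, "r:gz") as tar:
        members = [m for m in tar.getmembers() if m.name.startswith(component)]
        tar.extractall(dest, members=members)
```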

## Configuration Options

### Archive Control Settings

```python
# Configuration settings affecting archive operations
BACKUP_DIR: str          # Directory for backup files and archives
BACKUP_FILE_FORMAT: str  # Timestamp (strftime) format for archive naming
TIMESTAMP: str           # Current timestamp, rendered using BACKUP_FILE_FORMAT
```
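
For illustration, these settings might appear as follows in a loaded configuration dictionary (the values are invented examples, not defaults confirmed by the tool):

```python
# Hypothetical values illustrating the settings above
settings = {
    'BACKUP_DIR': '/var/backups/grafana',   # directory for backup files and archives
    'BACKUP_FILE_FORMAT': '%Y%m%d%H%M',     # strftime pattern for timestamps
    'TIMESTAMP': '202501011200',            # current run's timestamp
}

# The archive path is derived from these settings
archive_path = "{}/backup_{}.tar.gz".format(settings['BACKUP_DIR'], settings['TIMESTAMP'])
```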

### Archive Creation Behavior

Archive creation can be controlled through command-line arguments:

```python
# Create the archive (default behavior)
args = {
    '--no-archive': False
}

# Skip archive creation
args = {
    '--no-archive': True
}
```

## Usage Examples

### Standard Archive Creation

```python
from grafana_backup.archive import main as create_archive
from grafana_backup.grafanaSettings import main as load_config

# Load configuration
settings = load_config('/path/to/grafanaSettings.json')

# Create an archive from an existing backup
args = {
    '--no-archive': False,
    '--config': None
}

create_archive(args, settings)
```

### Integrated Backup with Archive

```python
from grafana_backup.save import main as save_backup

# Backup with automatic archive creation
save_args = {
    'save': True,
    '--components': 'dashboards,datasources',
    '--no-archive': False,  # archive will be created
    '--config': None
}

save_backup(save_args, settings)
# Result: individual JSON files + backup_{timestamp}.tar.gz
```

### Backup Without Archive (Troubleshooting)

```python
# Backup without an archive, for troubleshooting
save_args = {
    'save': True,
    '--components': 'dashboards',
    '--no-archive': True,   # skip archive creation
    '--config': None
}

save_backup(save_args, settings)
# Result: only individual JSON files, no archive
```

## Archive Format Details

### Compression Algorithm

Archives use gzip compression for a good balance of size and compatibility:

- **Format**: tar.gz (tar archive with gzip compression)
- **Compression level**: default gzip compression level
- **Compatibility**: standard format supported by all major platforms

### Archive Structure Preservation

The archive maintains the complete backup directory structure:

- **Timestamps**: all timestamped directories are preserved
- **File hierarchy**: folder structure exactly matches the backup directory
- **Metadata**: file modification times and permissions are preserved
- **Completeness**: all backup files are included without filtering

### Archive Integrity

Archive creation includes integrity validation:

- **Compression verification**: ensures the archive can be properly decompressed
- **Content validation**: verifies that all expected files are included
- **Size validation**: confirms the archive size is reasonable for its contents
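
An integrity check along these lines can be sketched with `tarfile`: open the archive and force each entry to be decompressed. This is illustrative, not the tool's internal validation; `verify_archive` is a hypothetical name:

```python
import tarfile

def verify_archive(path: str) -> bool:
    """Return True if every entry in the tar.gz can be read back."""
    try:
        with tarfile.open(path, "r:gz") as tar:
            for member in tar.getmembers():
                if member.isfile():
                    fobj = tar.extractfile(member)
                    if fobj is not None:
                        fobj.read()  # force full decompression of the entry
        return True
    except (tarfile.TarError, OSError):
        return False
```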

## Error Handling

### Archive Creation Errors

Archive creation handles common failure modes:

```python
# Common archive creation errors and their handling:
# - Insufficient disk space: clear error message with space requirements
# - Permission issues: detailed error about directory access
# - Compression failures: fallback to an uncompressed tar if needed
# - File access errors: problematic files are skipped with warnings
```

### Archive Extraction Errors

Archive extraction also handles common failure modes:

```python
# Common extraction errors and their handling:
# - Corrupted archives: clear error message with integrity check results
# - Insufficient space: reports space requirements and available space
# - Permission issues: validates temporary directory creation and access
# - Format errors: archive format is validated before extraction
```
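
A defensive extraction wrapper in this spirit might look like the following sketch (`safe_extract` is a hypothetical helper, not the tool's restore code):

```python
import tarfile
import tempfile

def safe_extract(archive_path: str) -> str:
    """Validate an archive and extract it into a fresh temporary directory."""
    if not tarfile.is_tarfile(archive_path):
        raise ValueError("Not a valid tar archive: {}".format(archive_path))
    tmp_dir = tempfile.mkdtemp(prefix="grafana_restore_")
    try:
        with tarfile.open(archive_path, "r:gz") as tar:
            tar.extractall(tmp_dir)
    except tarfile.ReadError as exc:
        raise ValueError("Corrupted archive: {}".format(exc)) from exc
    return tmp_dir
```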

## Performance Considerations

### Archive Creation Performance

Archive creation is optimized for performance:

- **Streaming compression**: files are compressed as the archive is written
- **Memory efficiency**: large files are processed in chunks
- **I/O optimization**: efficient buffering minimizes disk I/O
- **Progress reporting**: feedback is provided for large archive operations

### Archive Size Optimization

Archives are optimized for size while maintaining data integrity:

- **JSON compression**: JSON files compress well with gzip
- **Duplicate elimination**: files are deduplicated where appropriate
- **Structure preservation**: necessary directory structure is maintained
- **Metadata retention**: essential file metadata is preserved
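
The first point can be demonstrated directly: repetitive JSON, such as exported dashboards, shrinks substantially under gzip (the sample data below is invented for illustration):

```python
import gzip
import json

# Build a repetitive, dashboard-like JSON document
dashboard = {"panels": [{"type": "graph", "title": "Panel %d" % i} for i in range(200)]}
raw = json.dumps(dashboard).encode("utf-8")
compressed = gzip.compress(raw)

# Repetitive structure yields a large size reduction
ratio = len(compressed) / len(raw)
```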

## Integration with Cloud Storage

Archives are designed for cloud storage integration.

### Upload Integration

```python
# If cloud storage is configured, archive creation automatically
# triggers an upload:
save_backup(save_args, settings)
# 1. Creates individual backup files
# 2. Creates the compressed archive
# 3. Uploads the archive to the configured cloud storage
# 4. Optionally cleans up local files
```

### Download Integration

```python
# Cloud downloads provide archives ready for extraction:
restore_backup(restore_args, settings)
# 1. Downloads the archive from cloud storage
# 2. Extracts it to a temporary directory
# 3. Processes the extracted component files
# 4. Cleans up temporary files
```

## Best Practices

### Archive Management

- **Regular cleanup**: implement retention policies for local archives
- **Storage monitoring**: monitor disk space usage in the archive directory
- **Backup verification**: periodically test archive integrity
- **Documentation**: maintain records of archive contents and purposes

### Performance Optimization

- **Disk space**: ensure sufficient space for both backup files and archives
- **I/O performance**: use fast storage for the backup directory during operations
- **Cleanup timing**: balance storage usage against troubleshooting needs
- **Compression trade-offs**: weigh compression level against creation time for large backups

### Troubleshooting

- **Use --no-archive**: disable archiving when troubleshooting backup issues
- **Manual extraction**: use standard tar tools for archive inspection
- **Temporary retention**: keep uncompressed files temporarily for validation
- **Error logging**: review detailed logs for archive operation issues

The archive management system provides reliable, portable backup packaging with comprehensive error handling and performance optimization.

## Install with Tessl CLI

```shell
npx tessl i tessl/pypi-grafana-backup
```
