An interactive data visualization platform built on SQLAlchemy and Druid.io
Panoramix supports two types of data sources: SQL databases and Druid clusters. This module provides the core functionality for connecting to, managing, and synchronizing metadata from these data sources.
Register and manage connections to SQL databases using SQLAlchemy. Supports multiple database backends including PostgreSQL, MySQL, SQLite, and others supported by SQLAlchemy.
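For reference, SQLAlchemy connection strings for a few common backends look like this (hosts, ports, and credentials are placeholders, not values from this package):

```python
# Example SQLAlchemy URIs for backends Panoramix can connect to.
# Format: dialect[+driver]://user:password@host:port/database
EXAMPLE_URIS = {
    "postgresql": "postgresql://user:pass@localhost:5432/sales",
    "mysql": "mysql://user:pass@localhost:3306/sales",
    # SQLite uses a file path; four slashes means an absolute path.
    "sqlite": "sqlite:////tmp/sales.db",
}
```

Any URI accepted by SQLAlchemy's `create_engine` can be stored in `sqlalchemy_uri`.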
class Database(Model, AuditMixin):
    """
    SQL database connection model.

    Attributes:
        id (int): Primary key
        database_name (str): Unique database identifier
        sqlalchemy_uri (str): SQLAlchemy connection string
    """

    def get_sqla_engine(self):
        """
        Get SQLAlchemy engine for this database.

        Returns:
            sqlalchemy.engine.Engine: Configured database engine
        """

    def get_table(self, table_name):
        """
        Get table metadata from the database.

        Args:
            table_name (str): Name of the table to retrieve

        Returns:
            sqlalchemy.Table: Table metadata object
        """

    def __repr__(self):
        """String representation of the database."""
        return self.database_name

Usage example:
from panoramix.models import Database

# Create a database connection
db = Database(
    database_name='sales_db',
    sqlalchemy_uri='postgresql://user:pass@localhost/sales'
)

# Get database engine
engine = db.get_sqla_engine()

# Get table metadata
table_meta = db.get_table('customer_orders')

Manage connections to Druid clusters for real-time analytics. Handles both coordinator and broker endpoints, with automatic datasource discovery and metadata synchronization.
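Under the hood, methods like `get_sqla_engine()` and `get_table()` typically wrap SQLAlchemy's engine creation and table reflection. This is a minimal sketch of those calls, using an in-memory SQLite database as a stand-in (the table name and columns are invented for the demo):

```python
from sqlalchemy import MetaData, Table, create_engine, text

# Stand-in for Database.get_sqla_engine(): build an engine from a URI.
engine = create_engine("sqlite://")  # in-memory SQLite, demo only

# Create a demo table so there is something to reflect.
with engine.begin() as conn:
    conn.execute(text("CREATE TABLE customer_orders (id INTEGER, total REAL)"))

# Stand-in for Database.get_table(): reflect table metadata from the live DB.
orders = Table("customer_orders", MetaData(), autoload_with=engine)
print(sorted(c.name for c in orders.columns))
```

Reflection discovers columns and types directly from the database, which is how table metadata can stay in sync without hand-written schema definitions.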
class Cluster(Model, AuditMixin):
    """
    Druid cluster configuration and management.

    Attributes:
        id (int): Primary key
        cluster_name (str): Unique cluster identifier
        coordinator_host (str): Druid coordinator hostname
        coordinator_port (int): Druid coordinator port
        coordinator_endpoint (str): Coordinator endpoint path
        broker_host (str): Druid broker hostname
        broker_port (int): Druid broker port
        broker_endpoint (str): Broker endpoint path
        metadata_last_refreshed (datetime): Last metadata sync timestamp
    """

    def get_pydruid_client(self):
        """
        Get PyDruid client for querying this cluster.

        Returns:
            pydruid.client.Client: Configured PyDruid client
        """

    def refresh_datasources(self):
        """
        Sync datasources from the Druid cluster.

        Connects to the cluster coordinator to discover available
        datasources and updates the local metadata cache.
        """

    def __repr__(self):
        """String representation of the cluster."""
        return self.cluster_name

Usage example:
from panoramix.models import Cluster

# Configure Druid cluster
cluster = Cluster(
    cluster_name='prod_cluster',
    coordinator_host='druid-coordinator.company.com',
    coordinator_port=8082,
    broker_host='druid-broker.company.com',
    broker_port=8083
)

# Get PyDruid client
client = cluster.get_pydruid_client()

# Refresh datasource metadata
cluster.refresh_datasources()

Both database and cluster models include connection validation to ensure connectivity and proper configuration.
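Such validation usually amounts to a cheap probe of each backend. The helpers below are a hedged sketch, not the package's actual API: a trivial `SELECT 1` against the SQL engine, and an HTTP request to the Druid coordinator's `/status` endpoint (both function names are hypothetical):

```python
import urllib.request

from sqlalchemy import create_engine, text


def validate_sql_connection(sqlalchemy_uri):
    """Hypothetical helper: True if a trivial query succeeds against the URI."""
    try:
        engine = create_engine(sqlalchemy_uri)
        with engine.connect() as conn:
            conn.execute(text("SELECT 1"))
        return True
    except Exception:
        return False


def validate_druid_cluster(coordinator_host, coordinator_port):
    """Hypothetical helper: True if the coordinator status endpoint responds."""
    url = "http://%s:%s/status" % (coordinator_host, coordinator_port)
    try:
        with urllib.request.urlopen(url, timeout=5) as resp:
            return resp.status == 200
    except Exception:
        return False


print(validate_sql_connection("sqlite://"))  # in-memory SQLite always connects
```

Running the checks before saving a connection lets the UI surface misconfigured hosts or bad credentials immediately rather than at query time.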
# Health check endpoints
@app.route('/health')
def health():
    """System health check endpoint"""

@app.route('/ping')
def ping():
    """Simple ping endpoint"""

Data source management integrates with Flask-AppBuilder to provide admin interfaces for managing databases and clusters through the web UI.
class DatabaseView(ModelView):
    """Admin view for managing SQL databases"""

class ClusterModelView(ModelView):
    """Admin view for managing Druid clusters"""

These views provide CRUD operations, connection testing, and metadata management through the web interface.
Install with Tessl CLI
npx tessl i tessl/pypi-panoramix