tessl/pypi-sqllineage

SQL Lineage Analysis Tool powered by Python

CLI Interface

Command-line interface for analyzing SQL files and generating lineage reports, with options for different output formats and visualization modes. The CLI provides a convenient way to perform lineage analysis without writing Python code.

Capabilities

Main Entry Point

def main(args=None) -> None:
    """
    The command line interface entry point.

    Parameters:
    - args: command line arguments for sqllineage command (optional)
    """

Console Script

The package installs a console script entry point:

sqllineage = sqllineage.cli:main

Command Line Arguments

Input Options

# SQL from command line
sqllineage -e "SELECT * FROM customers"

# SQL from file
sqllineage -f queries.sql

# Multiple files
sqllineage -f query1.sql -f query2.sql

Output Control

# Verbose mode - show statement level lineage
sqllineage -v -e "SELECT * FROM customers"

# Set lineage level (table or column)
sqllineage -l table -f queries.sql
sqllineage -l column -f queries.sql

Visualization Options

# Launch graph visualization
sqllineage -g -f queries.sql

# Specify visualization server host and port
sqllineage -g -H 0.0.0.0 -p 8080 -f queries.sql

SQL Dialect Options

# Specify SQL dialect
sqllineage -d snowflake -f snowflake_queries.sql
sqllineage -d postgres -f postgres_queries.sql

# List available dialects
sqllineage --dialects

Error Handling

# Silent mode - skip unsupported statements
sqllineage --silent_mode -f mixed_queries.sql

Metadata Integration

# Use SQLAlchemy metadata provider
sqllineage --sqlalchemy_url "postgresql://user:pass@host/db" -f queries.sql

Complete Argument Reference

| Argument | Short | Type | Description |
|----------|-------|------|-------------|
| `--version` | | flag | Show version and exit |
| `-e <sql>` | `-e` | string | SQL from command line |
| `-f <filename>` | `-f` | string | SQL from files (can be repeated) |
| `--verbose` | `-v` | flag | Show statement-level lineage result |
| `--level <level>` | `-l` | choice | Lineage level: `table` or `column` |
| `--graph-visualization` | `-g` | flag | Show graph visualization |
| `--host <host>` | `-H` | string | Visualization webserver host (default: `localhost`) |
| `--port <port>` | `-p` | integer | Visualization webserver port (default: `5000`) |
| `--dialect <dialect>` | `-d` | string | SQL dialect (default: `ansi`) |
| `--dialects` | | flag | List available dialects |
| `--silent_mode` | | flag | Skip unsupported statements |
| `--sqlalchemy_url <url>` | | string | SQLAlchemy URL for metadata |
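Because these options are all plain argv flags, the CLI is straightforward to wrap from Python via `subprocess`. The sketch below is illustrative only: `build_argv` and `run_lineage` are hypothetical helpers, not part of sqllineage, and `run_lineage` assumes the `sqllineage` console script is on `PATH`.

```python
import subprocess

def build_argv(sql=None, files=(), level=None, dialect=None, verbose=False):
    """Build a sqllineage argv list from keyword options (hypothetical helper)."""
    argv = ["sqllineage"]
    if sql is not None:
        argv += ["-e", sql]
    for f in files:
        argv += ["-f", f]
    if level is not None:
        argv += ["-l", level]
    if dialect is not None:
        argv += ["-d", dialect]
    if verbose:
        argv.append("-v")
    return argv

def run_lineage(**kwargs):
    """Run the CLI and return its stdout (requires sqllineage on PATH)."""
    result = subprocess.run(build_argv(**kwargs), capture_output=True, text=True, check=True)
    return result.stdout

print(build_argv(sql="SELECT * FROM customers", dialect="ansi"))
```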

Usage Examples

Basic Table Lineage Analysis

# Analyze a single SQL statement
sqllineage -e "INSERT INTO summary SELECT * FROM details"

# Output:
# <default>.details
# =================
# <default>.summary
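For scripting, the two-part output shown above (source tables, an `=` divider, then target tables) can be split with a small helper. `parse_table_lineage` is a hypothetical parser written against exactly the format printed in this example, not a sqllineage API:

```python
def parse_table_lineage(output: str):
    """Split table-level CLI output into (source_tables, target_tables) at the '=' divider."""
    sources, targets, seen_divider = [], [], False
    for line in output.splitlines():
        line = line.strip()
        if not line:
            continue
        if set(line) == {"="}:  # the ===== separator between sources and targets
            seen_divider = True
        elif seen_divider:
            targets.append(line)
        else:
            sources.append(line)
    return sources, targets

sample = "<default>.details\n=================\n<default>.summary\n"
print(parse_table_lineage(sample))
```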

File-Based Analysis

# Create a SQL file
cat > analysis.sql << 'EOF'
CREATE TABLE staging.customers AS
SELECT 
    customer_id,
    customer_name,
    email
FROM raw.customer_data
WHERE active = 'Y';

INSERT INTO analytics.customer_summary
SELECT 
    customer_id,
    customer_name,
    COUNT(*) as interaction_count
FROM staging.customers c
JOIN raw.interactions i ON c.customer_id = i.customer_id
GROUP BY customer_id, customer_name;
EOF

# Analyze the file
sqllineage -f analysis.sql

Verbose Output

sqllineage -v -f analysis.sql

# Output includes statement-level breakdown:
# ************** Statement #1 **************
# CREATE TABLE staging.customers AS SELECT customer_id, customer_name, email FROM raw.customer_data WHERE active = 'Y'
# raw.customer_data
# =================
# staging.customers
# 
# ************** Statement #2 **************  
# INSERT INTO analytics.customer_summary SELECT customer_id, customer_name, COUNT(*) as interaction_count FROM staging.customers c JOIN raw.interactions i ON c.customer_id = i.customer_id GROUP BY customer_id, customer_name
# staging.customers
# raw.interactions
# =================
# analytics.customer_summary

Column-Level Lineage

sqllineage -l column -f analysis.sql

# Output shows column-level relationships:
# raw.customer_data.customer_id -> staging.customers.customer_id
# raw.customer_data.customer_name -> staging.customers.customer_name
# raw.customer_data.email -> staging.customers.email
# staging.customers.customer_id -> analytics.customer_summary.customer_id
# staging.customers.customer_name -> analytics.customer_summary.customer_name

Dialect-Specific Analysis

# List available SQL dialects
sqllineage --dialects

# Output:
# {
#   "sqlparse": ["non-validating"],
#   "sqlfluff": ["ansi", "bigquery", "clickhouse", "databricks", "db2", "duckdb", "exasol", "hive", "materialize", "mysql", "oracle", "postgres", "redshift", "snowflake", "soql", "sparksql", "sqlite", "teradata", "trino", "tsql"]
# }
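Since the `--dialects` listing is JSON, a script can check dialect availability before kicking off an analysis. A small stdlib-only sketch (the `supported_dialects` helper and the trimmed sample string are illustrative, not part of sqllineage):

```python
import json

# Trimmed sample of `sqllineage --dialects` output
dialects_json = """
{
  "sqlparse": ["non-validating"],
  "sqlfluff": ["ansi", "bigquery", "postgres", "snowflake", "tsql"]
}
"""

def supported_dialects(raw: str):
    """Flatten the parser-grouped dialect JSON into a sorted list of dialect names."""
    data = json.loads(raw)
    return sorted(d for group in data.values() for d in group)

print("snowflake" in supported_dialects(dialects_json))
```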

# Analyze Snowflake SQL
sqllineage -d snowflake -e "CREATE OR REPLACE TABLE analytics.daily_sales AS SELECT DATE_TRUNC('DAY', order_timestamp) as sale_date, SUM(amount) as daily_revenue FROM raw.transactions GROUP BY DATE_TRUNC('DAY', order_timestamp)"

Web Visualization

# Launch web visualization (opens browser)
sqllineage -g -f complex_queries.sql

# Specify custom host/port for visualization
sqllineage -g -H 0.0.0.0 -p 8080 -f queries.sql

Database Metadata Integration

# PostgreSQL metadata
sqllineage --sqlalchemy_url "postgresql://analyst:password@db.company.com:5432/analytics" -l column -f postgres_queries.sql

# Snowflake metadata
sqllineage --sqlalchemy_url "snowflake://user:password@account/database/schema" -d snowflake -l column -f snowflake_queries.sql

# MySQL metadata
sqllineage --sqlalchemy_url "mysql://user:password@localhost:3306/analytics" -d mysql -f mysql_queries.sql
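These URLs embed credentials, so avoid echoing them verbatim in logs or CI output. One stdlib-only approach is to redact the password before printing; `mask_password` below is a hypothetical helper, not part of sqllineage:

```python
from urllib.parse import urlsplit, urlunsplit

def mask_password(url: str) -> str:
    """Redact the password component of a SQLAlchemy-style URL before logging it."""
    parts = urlsplit(url)
    if parts.password:
        netloc = parts.netloc.replace(":" + parts.password + "@", ":***@")
        parts = parts._replace(netloc=netloc)
    return urlunsplit(parts)

print(mask_password("postgresql://analyst:password@db.company.com:5432/analytics"))
```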

Error Handling and Silent Mode

# Create mixed SQL file with supported and unsupported statements
cat > mixed.sql << 'EOF'
-- Supported
CREATE VIEW sales_view AS SELECT * FROM sales;

-- Unsupported (DDL permissions)
GRANT SELECT ON sales_view TO analyst_role;

-- Supported
INSERT INTO summary SELECT customer_id, SUM(amount) FROM sales_view GROUP BY customer_id;

-- Unsupported (procedural)
CREATE PROCEDURE update_summary() AS BEGIN UPDATE summary SET last_updated = NOW(); END;
EOF

# Normal mode (shows errors)
sqllineage -f mixed.sql

# Silent mode (skips unsupported statements)
sqllineage --silent_mode -f mixed.sql

Complex Analysis Pipeline

# Multi-step analysis with different outputs
echo "CREATE TABLE mart.customer_360 AS 
SELECT 
    c.customer_id,
    c.customer_name,
    p.total_purchases,
    s.support_tickets,
    m.email_engagement
FROM raw.customers c
LEFT JOIN analytics.purchase_summary p ON c.customer_id = p.customer_id  
LEFT JOIN analytics.support_summary s ON c.customer_id = s.customer_id
LEFT JOIN analytics.marketing_summary m ON c.customer_id = m.customer_id" > customer_360.sql

# Table-level analysis
echo "=== TABLE LINEAGE ==="
sqllineage -f customer_360.sql

# Column-level analysis  
echo -e "\n=== COLUMN LINEAGE ==="
sqllineage -l column -f customer_360.sql

# Verbose analysis
echo -e "\n=== VERBOSE ANALYSIS ==="
sqllineage -v -f customer_360.sql

# Launch visualization
sqllineage -g -f customer_360.sql

Environment Integration

# Set default schema via environment
export SQLLINEAGE_DEFAULT_SCHEMA="analytics"

# Analyze unqualified table references
sqllineage -e "SELECT * FROM customers"  # Interprets as analytics.customers

# Override with different dialect
sqllineage -d postgres -e "SELECT customer_id, email FROM users WHERE created_date >= CURRENT_DATE - INTERVAL '30 days'"

Batch Processing

# Process multiple SQL files
for file in queries/*.sql; do
    echo "=== Analyzing $file ==="
    sqllineage -f "$file"
    echo
done

# Generate lineage reports for all files
mkdir -p reports
for file in queries/*.sql; do
    basename=$(basename "$file" .sql)
    sqllineage -v -f "$file" > "reports/${basename}_lineage.txt"
done
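The same report loop can be written in Python when more control is needed. `report_path` and `generate_reports` below are hypothetical helpers that mirror the shell loop above and assume the `sqllineage` CLI is on `PATH`:

```python
from pathlib import Path
import subprocess

def report_path(sql_file: Path, reports_dir: Path = Path("reports")) -> Path:
    """Map queries/foo.sql to reports/foo_lineage.txt, mirroring the shell loop."""
    return reports_dir / f"{sql_file.stem}_lineage.txt"

def generate_reports(queries_dir="queries", reports_dir="reports"):
    """Run `sqllineage -v -f <file>` for each .sql file and save the output."""
    out = Path(reports_dir)
    out.mkdir(parents=True, exist_ok=True)
    for sql_file in sorted(Path(queries_dir).glob("*.sql")):
        result = subprocess.run(
            ["sqllineage", "-v", "-f", str(sql_file)],
            capture_output=True, text=True,
        )
        report_path(sql_file, out).write_text(result.stdout)

print(report_path(Path("queries/daily_sales.sql")))
```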

Integration with CI/CD

#!/bin/bash
# lineage_check.sh - CI script to validate SQL lineage

set -e

echo "Validating SQL lineage in migration files..."

# Check for unsupported SQL patterns
if ! sqllineage --silent_mode -f migrations/*.sql > /dev/null 2>&1; then
    echo "ERROR: SQL lineage analysis failed"
    exit 1
fi

# Generate lineage documentation
sqllineage -v -f migrations/*.sql > docs/lineage_report.txt

echo "SQL lineage validation completed successfully"

Install with Tessl CLI

npx tessl i tessl/pypi-sqllineage
