Additional Commands

Additional utility commands for project management, exposure generation, and code validation.

Capabilities

Refresh

Refreshes a Lightdash project from its connected remote git repository.

lightdash refresh [options]

Options:

  • --verbose - Enable debug logging (default: false)

Usage Examples:

# Refresh active project from git
lightdash refresh

# Refresh with verbose output
lightdash refresh --verbose

Requirements:

  • Project must be connected to a remote git repository
  • Git integration must be configured in Lightdash
  • User must have appropriate permissions

Behavior:

  • Triggers git pull from remote repository
  • Updates dbt project files from git
  • Recompiles dbt models
  • Updates Lightdash explores
  • Refreshes all content based on latest code

Use Cases:

  • Sync Lightdash with latest git commits
  • Update project after git push
  • Refresh after team members push changes
  • Ensure Lightdash matches repository state

Example Workflow:

# Developer pushes changes to git
git add models/
git commit -m "Update customer model"
git push

# Refresh Lightdash to pull changes
lightdash refresh

# Lightdash now reflects git changes

Error Handling:

Error: "Project not connected to git"

  • Project has no git integration
  • Solution: Configure git connection in Lightdash UI

Error: "Git refresh failed"

  • Repository access issues
  • Invalid credentials
  • Solution: Check git configuration, verify access tokens

Generate Exposures

[EXPERIMENTAL] Generates a dbt exposures YAML file from Lightdash charts and dashboards.

lightdash generate-exposures [options]

Options:

  • --project-dir <path> - dbt project directory (default: .)
  • --output <path> - Output file path (default: <project-dir>/models/lightdash_exposures.yml)
  • --verbose - Enable debug logging (default: false)

Usage Examples:

# Generate exposures with defaults
lightdash generate-exposures

# Generate to custom output file
lightdash generate-exposures --output ./exposures/lightdash.yml

# Generate with custom project directory
lightdash generate-exposures --project-dir ./my-dbt-project

# Generate with custom output in custom project
lightdash generate-exposures \
  --project-dir ./dbt \
  --output ./dbt/models/exposures/lightdash.yml

# Generate with verbose output
lightdash generate-exposures --verbose

What Are dbt Exposures?

dbt exposures represent downstream uses of dbt models (charts, dashboards, reports). They:

  • Document how models are used
  • Track data lineage
  • Show dependencies
  • Appear in dbt docs

Generated File Format:

# lightdash_exposures.yml
version: 2

exposures:
  - name: monthly_revenue_chart
    label: Monthly Revenue
    type: dashboard
    maturity: high
    url: https://app.lightdash.cloud/saved/abc-123
    description: Track monthly revenue trends
    depends_on:
      - ref('orders')
      - ref('customers')
    owner:
      name: Analytics Team
      email: analytics@example.com

  - name: executive_dashboard
    label: Executive Dashboard
    type: dashboard
    maturity: high
    url: https://app.lightdash.cloud/dashboards/def-456
    description: Key business metrics at a glance
    depends_on:
      - ref('orders')
      - ref('customers')
      - ref('products')
    owner:
      name: Executive Team
      email: exec@example.com

Behavior:

  1. Fetch Content - Retrieves all charts and dashboards from Lightdash
  2. Analyze Dependencies - Identifies dbt models used by each chart/dashboard
  3. Generate YAML - Creates dbt exposures file with proper format (adapts format for dbt 1.10+)
  4. Write File - Saves to specified output path

dbt Version Compatibility:

The command automatically adapts the generated format for dbt 1.10+:

  • dbt 1.10+: Uses config.tags instead of top-level tags
  • dbt <1.10: Uses standard metadata structure
  • This ensures compatibility across dbt versions
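
The adaptation can be illustrated with a small sketch. This mirrors the documented behavior (tags move under config for dbt 1.10+); the CLI's actual internal logic is not published and may differ.

```python
# Sketch: relocate a top-level `tags` key under `config` for dbt >= 1.10.
# Illustrative only; the CLI's real transformation may differ in detail.
def adapt_exposure(exposure: dict, dbt_version: tuple[int, int]) -> dict:
    adapted = dict(exposure)
    tags = adapted.pop("tags", None)
    if tags is not None and dbt_version >= (1, 10):
        # dbt 1.10+ expects tags inside the `config` block
        adapted.setdefault("config", {})["tags"] = tags
    elif tags is not None:
        # Older dbt keeps tags at the top level
        adapted["tags"] = tags
    return adapted

print(adapt_exposure({"name": "rev", "tags": ["finance"]}, (1, 8)))
# {'name': 'rev', 'tags': ['finance']}
print(adapt_exposure({"name": "rev", "tags": ["finance"]}, (1, 10)))
# {'name': 'rev', 'config': {'tags': ['finance']}}
```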

Exposure Details:

Each exposure includes:

  • name - Slugified chart/dashboard name
  • label - Display name
  • type - Always "dashboard" (dbt requirement)
  • maturity - Set to "high"
  • url - Link to Lightdash
  • description - Chart/dashboard description
  • depends_on - List of dbt model references
  • owner - Chart/dashboard owner information
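
The exact slugification rules are internal to the CLI; a plausible sketch of how a display name could become an exposure name (the `slugify` helper here is hypothetical, not the CLI's implementation):

```python
import re

# Hypothetical slugifier: lowercase, collapse runs of non-alphanumeric
# characters into underscores, trim leading/trailing underscores.
def slugify(label: str) -> str:
    slug = re.sub(r"[^a-z0-9]+", "_", label.lower())
    return slug.strip("_")

print(slugify("Monthly Revenue"))   # monthly_revenue
print(slugify("Q4 Sales / EMEA"))   # q4_sales_emea
```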

Use Cases:

  1. Documentation - Document downstream usage in dbt
  2. Lineage - Track data flow from models to visualizations
  3. Impact Analysis - See which charts are affected by model changes
  4. Governance - Understand model dependencies

Example Workflow:

# 1. Generate exposures
lightdash generate-exposures

# 2. Review generated file
cat models/lightdash_exposures.yml

# 3. Commit to git
git add models/lightdash_exposures.yml
git commit -m "Add Lightdash exposures"

# 4. Generate dbt docs
dbt docs generate

# 5. View in dbt docs
dbt docs serve
# Navigate to model → "Used By" section

Integration with dbt:

After generating exposures:

# Generate dbt docs
dbt docs generate

# Serve docs
dbt docs serve

# In docs UI:
# - Navigate to any model
# - See "Used By" section
# - Shows Lightdash charts/dashboards
# - Click links to view in Lightdash

Limitations:

  • EXPERIMENTAL - Feature may change
  • Only includes charts/dashboards that query models directly
  • May miss indirect dependencies
  • Requires manual regeneration after content changes
  • Owner information may be incomplete

Best Practices:

  1. Regenerate Regularly:
# Daily or weekly regeneration
lightdash generate-exposures
git add models/lightdash_exposures.yml
git commit -m "Update Lightdash exposures"
  2. Automate in CI/CD:
# .github/workflows/update-exposures.yml
name: Update Exposures
on:
  schedule:
    - cron: '0 0 * * 0'  # Weekly
jobs:
  update:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - run: npm install -g @lightdash/cli
      - run: lightdash generate-exposures
      - run: git add models/lightdash_exposures.yml
      - run: git commit -m "Update exposures" || true
      - run: git push
  3. Review Generated File:
# Check output before committing
lightdash generate-exposures
git diff models/lightdash_exposures.yml

Lint

Validates Lightdash Code files (YAML) against JSON schemas.

lightdash lint [options]

Options:

  • -p, --path <path> - File or directory to lint (default: current directory)
  • --verbose - Show detailed output (default: false)
  • -f, --format <format> - Output format: cli or json (SARIF) (default: cli)

Usage Examples:

# Lint current directory
lightdash lint

# Lint specific directory
lightdash lint --path ./lightdash

# Lint specific file
lightdash lint --path ./lightdash/charts/revenue.yml

# Lint with verbose output
lightdash lint --verbose

# Lint with JSON output (SARIF format)
lightdash lint --format json

# Lint and save JSON output
lightdash lint --format json > lint-results.sarif.json

File Types Validated:

  1. Models:

    • Type: model
    • Version: model/v1, model/v1beta
    • Schema validation for Lightdash model YAML
  2. Charts:

    • Version: 1
    • Must have metricQuery field
    • Schema validation for chart configuration
  3. Dashboards:

    • Version: 1
    • Must have tiles field
    • Schema validation for dashboard layout

Validation Checks:

  • Schema Compliance - YAML matches JSON schema
  • Required Fields - All required fields present
  • Field Types - Correct data types used
  • Valid Values - Enum values are valid
  • Structure - Proper YAML structure and nesting
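
The required-field and field-type checks above can be sketched with a hand-rolled validator. The real CLI validates against complete JSON schemas; the rules below are an illustrative subset for a chart file.

```python
# Illustrative subset of the lint checks (required fields, field types)
# for a chart document. The actual CLI uses full JSON schema validation.
def lint_chart(doc: dict) -> list[str]:
    errors = []
    if doc.get("version") != 1:
        errors.append("'version' must be the number 1")
    mq = doc.get("metricQuery")
    if not isinstance(mq, dict):
        errors.append("missing required field 'metricQuery'")
    elif "exploreName" not in mq:
        errors.append("missing required field 'metricQuery.exploreName'")
    return errors

ok = {"version": 1, "metricQuery": {"exploreName": "orders"}}
bad = {"version": "1", "metricQuery": {}}  # string version, no exploreName
print(lint_chart(ok))   # []
print(lint_chart(bad))  # two error messages
```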

CLI Output Format:

lightdash lint --path ./lightdash

Success Output:

Linting ./lightdash...

✓ charts/monthly-revenue.yml - Valid
✓ charts/customer-segmentation.yml - Valid
✓ dashboards/executive-dashboard.yml - Valid

3 files validated, 0 errors

Error Output:

Linting ./lightdash...

✓ charts/monthly-revenue.yml - Valid
✗ charts/order-analysis.yml - Invalid
  Error: Missing required field 'metricQuery.exploreName'
  Line 5, column 3

✗ dashboards/sales-dashboard.yml - Invalid
  Error: Invalid value for 'version'. Expected '1', got '2'
  Line 1, column 10

1 valid, 2 errors found

JSON Output Format (SARIF):

lightdash lint --format json

Outputs SARIF (Static Analysis Results Interchange Format):

{
  "$schema": "https://raw.githubusercontent.com/oasis-tcs/sarif-spec/master/Schemata/sarif-schema-2.1.0.json",
  "version": "2.1.0",
  "runs": [
    {
      "tool": {
        "driver": {
          "name": "Lightdash Lint",
          "version": "0.2231.0"
        }
      },
      "results": [
        {
          "ruleId": "schema-validation",
          "level": "error",
          "message": {
            "text": "Missing required field 'metricQuery.exploreName'"
          },
          "locations": [
            {
              "physicalLocation": {
                "artifactLocation": {
                  "uri": "charts/order-analysis.yml"
                },
                "region": {
                  "startLine": 5,
                  "startColumn": 3
                }
              }
            }
          ]
        }
      ]
    }
  ]
}
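
Because SARIF is plain JSON, the report is easy to post-process. A minimal summarizer (the embedded sample mirrors the result shown above):

```python
import json

# Sample SARIF payload, shaped like the lint output above.
sarif = json.loads("""
{
  "version": "2.1.0",
  "runs": [{
    "tool": {"driver": {"name": "Lightdash Lint"}},
    "results": [{
      "ruleId": "schema-validation",
      "level": "error",
      "message": {"text": "Missing required field 'metricQuery.exploreName'"},
      "locations": [{
        "physicalLocation": {
          "artifactLocation": {"uri": "charts/order-analysis.yml"},
          "region": {"startLine": 5}
        }
      }]
    }]
  }]
}
""")

# Emit one "file:line: message" summary line per result.
lines = []
for run in sarif["runs"]:
    for result in run["results"]:
        loc = result["locations"][0]["physicalLocation"]
        lines.append(
            f'{loc["artifactLocation"]["uri"]}:{loc["region"]["startLine"]}: '
            f'{result["message"]["text"]}'
        )
print("\n".join(lines))
# charts/order-analysis.yml:5: Missing required field 'metricQuery.exploreName'
```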

Use Cases:

1. Pre-Commit Validation

#!/bin/bash
# Git pre-commit hook
if ! lightdash lint --path ./lightdash; then
  echo "Lint errors found. Fix before committing."
  exit 1
fi

2. CI/CD Validation

name: Validate Lightdash Code

on:
  pull_request:
    paths:
      - 'lightdash/**'

jobs:
  lint:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3

      - name: Install Lightdash CLI
        run: npm install -g @lightdash/cli

      - name: Lint Lightdash Code
        run: lightdash lint --path ./lightdash

      - name: Generate SARIF Report
        if: always()
        run: lightdash lint --path ./lightdash --format json > results.sarif.json

      - name: Upload SARIF
        if: always()
        uses: github/codeql-action/upload-sarif@v2
        with:
          sarif_file: results.sarif.json

3. Pre-Upload Validation

# Before uploading
lightdash lint --path ./lightdash

# If valid, upload
if [ $? -eq 0 ]; then
  lightdash upload --path ./lightdash
else
  echo "Fix lint errors before uploading"
  exit 1
fi

4. Continuous Monitoring

# Daily validation
0 6 * * * cd /project && lightdash lint --path ./lightdash >> /var/log/lint.log 2>&1

Common Validation Errors:

Missing Required Fields

# Error: Missing 'exploreName'
metricQuery:
  dimensions:
    - orders_order_date

# Fix: Add exploreName
metricQuery:
  exploreName: orders
  dimensions:
    - orders_order_date

Invalid Field Types

# Error: 'version' must be number, not string
version: "1"

# Fix: Use number
version: 1

Invalid Enum Values

# Error: Invalid chart type
chartConfig:
  type: invalid_type

# Fix: Use valid type
chartConfig:
  type: cartesian

Malformed YAML

# Error: Invalid YAML syntax
metricQuery:
  exploreName orders  # Missing colon

# Fix: Add colon
metricQuery:
  exploreName: orders

Exit Codes:

  • Exit 0 - All files valid
  • Exit 1 - Validation errors found

CI/CD Integration:

# Fail pipeline on lint errors
lightdash lint --path ./lightdash
if [ $? -ne 0 ]; then
  echo "Lint failed!"
  exit 1
fi

SARIF Integration:

SARIF output integrates with:

  • GitHub Code Scanning - Shows errors in PR
  • Azure DevOps - Security scanning reports
  • GitLab - Security dashboards
  • IDE Extensions - VS Code SARIF viewer

GitHub Code Scanning Example:

- name: Lint
  run: lightdash lint --format json > results.sarif.json

- name: Upload to GitHub
  uses: github/codeql-action/upload-sarif@v2
  with:
    sarif_file: results.sarif.json

Result: Lint errors appear as annotations in pull requests.

Best Practices:

1. Lint Before Committing

# Add to .git/hooks/pre-commit
lightdash lint --path ./lightdash

2. Lint in CI/CD

# Always lint in pipelines
steps:
  - run: lightdash lint

3. Use Verbose for Debugging

# When errors are unclear
lightdash lint --verbose

4. Generate SARIF for Reports

# Create reports for review
lightdash lint --format json > report.sarif.json

5. Lint After Download

# Validate downloaded content
lightdash download --path ./lightdash
lightdash lint --path ./lightdash

Command Comparison

Refresh vs. Deploy

Refresh:

  • Pulls from git repository
  • Updates from remote code
  • For git-connected projects
  • Automatic compilation

Deploy:

  • Pushes from local to Lightdash
  • Updates from local code
  • For any project
  • Manual compilation

Generate Exposures vs. Download

Generate Exposures:

  • Creates dbt exposures YAML
  • Documents model usage
  • For dbt documentation
  • One-way (Lightdash → dbt)

Download:

  • Downloads charts/dashboards as YAML
  • For version control
  • For editing content
  • Two-way (Lightdash ↔ local)

Lint vs. Validate

Lint:

  • Validates YAML syntax
  • Checks schema compliance
  • Local file validation
  • Fast, offline

Validate:

  • Validates content in Lightdash
  • Checks queries work
  • Server-side validation
  • Slower, requires API

Troubleshooting

Refresh Errors

Error: "Project not connected to git"

  • Solution: Configure git integration in Lightdash UI
  • Or use lightdash deploy instead

Error: "Git authentication failed"

  • Solution: Update git credentials in Lightdash
  • Check repository access

Generate Exposures Errors

Error: "No charts or dashboards found"

  • Project has no content
  • Solution: Create charts/dashboards first

Error: "Cannot write to output path"

  • Path doesn't exist
  • Permission denied
  • Solution: Create directory or check permissions

Lint Errors

Error: "Invalid YAML syntax"

  • YAML is malformed
  • Solution: Fix YAML syntax, check indentation

Error: "Schema validation failed"

  • YAML doesn't match schema
  • Solution: Review error message, fix fields

Error: "File not found"

  • Path doesn't exist
  • Solution: Check path spelling

Permission Requirements

Refresh

Required permissions:

  • Developer or Admin role on project
  • Project must have git integration

Generate Exposures

Required permissions:

  • Viewer or higher role on project
  • Read access to charts and dashboards
  • Write access to local filesystem

Lint

Required permissions:

  • Read access to local filesystem
  • No Lightdash permissions needed (local operation)

Best Practices

1. Automate Exposure Generation

# Weekly cron job
0 0 * * 0 cd /project && lightdash generate-exposures && git add models/lightdash_exposures.yml && git commit -m "Update exposures" && git push

2. Lint Before Every Upload

# Validation pipeline
lightdash lint --path ./lightdash && lightdash upload --path ./lightdash

3. Use Refresh for Git-Based Workflows

# After git push
git push origin main
lightdash refresh

4. Integrate Lint with IDE

# VS Code task
{
  "label": "Lint Lightdash",
  "type": "shell",
  "command": "lightdash lint --path ./lightdash",
  "problemMatcher": []
}

5. Document Exposures in README

# dbt Project

## Lightdash Exposures

Lightdash exposures are auto-generated weekly:
- File: `models/lightdash_exposures.yml`
- Command: `lightdash generate-exposures`
- View in dbt docs: `dbt docs serve`

Integration Examples

Pre-Commit Hook

#!/bin/bash
# .git/hooks/pre-commit

echo "Linting Lightdash Code..."
if ! lightdash lint --path ./lightdash; then
  echo "❌ Lint failed. Fix errors before committing."
  exit 1
fi
echo "✓ Lint passed"

Makefile

.PHONY: lint exposures refresh

lint:
	lightdash lint --path ./lightdash

exposures:
	lightdash generate-exposures
	git add models/lightdash_exposures.yml

refresh:
	lightdash refresh
	lightdash validate

all: lint exposures refresh

Package.json Scripts

{
  "scripts": {
    "lightdash:lint": "lightdash lint --path ./lightdash",
    "lightdash:exposures": "lightdash generate-exposures",
    "lightdash:refresh": "lightdash refresh",
    "lightdash:validate": "lightdash lint && lightdash validate"
  }
}

Summary

Refresh

  • Sync project from git
  • Updates from remote repository
  • Requires git integration
  • Use for git-based workflows

Generate Exposures

  • Document Lightdash usage in dbt
  • Creates exposures YAML file
  • Improves lineage tracking
  • Experimental feature

Lint

  • Validate YAML files
  • Check schema compliance
  • Fast, local validation
  • Essential for code quality