Warehouse-specific SQL generation for field quoting, string escaping, and metric aggregations.
| Warehouse | Field Quote | Float Type | Percentile Function | Concat |
|---|---|---|---|---|
| BigQuery | ` | FLOAT64 | APPROX_QUANTILES | CONCAT() |
| ClickHouse | " | FLOAT | quantile() / median() | CONCAT() |
| Databricks | ` | FLOAT | PERCENTILE() | CONCAT() |
| PostgreSQL | " | FLOAT | PERCENTILE_CONT | || |
| Redshift | " | FLOAT | PERCENTILE_CONT | || |
| Snowflake | " | FLOAT | PERCENTILE_CONT | CONCAT() |
| Trino | " | DOUBLE | APPROX_PERCENTILE | CONCAT() |
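For orientation, the field-quoting and float-type differences from the table above can be captured in a standalone lookup. This is an illustrative sketch only, not the library's implementation; the `DIALECTS` map and `quoteField` helper are hypothetical names:

```typescript
// Hypothetical lookup mirroring the comparison table above (not library code).
type WarehouseDialect = {
  fieldQuoteChar: string;
  floatType: string;
};

const DIALECTS: Record<string, WarehouseDialect> = {
  bigquery: { fieldQuoteChar: '`', floatType: 'FLOAT64' },
  clickhouse: { fieldQuoteChar: '"', floatType: 'FLOAT' },
  databricks: { fieldQuoteChar: '`', floatType: 'FLOAT' },
  postgres: { fieldQuoteChar: '"', floatType: 'FLOAT' },
  redshift: { fieldQuoteChar: '"', floatType: 'FLOAT' },
  snowflake: { fieldQuoteChar: '"', floatType: 'FLOAT' },
  trino: { fieldQuoteChar: '"', floatType: 'DOUBLE' },
};

// Wrap a field name in the dialect's quote character.
function quoteField(name: string, warehouse: string): string {
  const q = DIALECTS[warehouse].fieldQuoteChar;
  return `${q}${name}${q}`;
}
```

The real builders expose the same information through `getFieldQuoteChar()` and `getFloatingType()`, shown below.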
```typescript
import { warehouseSqlBuilderFromType } from '@lightdash/warehouses';
import { SupportedDbtAdapter, WeekDay } from '@lightdash/common';

const builder = warehouseSqlBuilderFromType(SupportedDbtAdapter.POSTGRES, WeekDay.MONDAY);
```

All builders implement the `WarehouseSqlBuilder` interface:
```typescript
/**
 * Common SQL builder interface
 * All warehouse-specific builders implement these methods
 * Note: The base class is not exported - use the factory function or client instances
 */
interface WarehouseSqlBuilder {
  /**
   * Get the configured week start day
   * @returns WeekDay enum value (0=Monday, 6=Sunday), null, or undefined
   */
  getStartOfWeek(): WeekDay | null | undefined;

  /**
   * Get the dbt adapter type for this warehouse
   * @returns SupportedDbtAdapter enum value
   */
  getAdapterType(): SupportedDbtAdapter;

  /**
   * Get the character used for quoting field/column names
   * @returns Quote character: '"' (most warehouses), '`' (BigQuery, Databricks)
   */
  getFieldQuoteChar(): string;

  /**
   * Get the character used for quoting string literals
   * @returns Quote character: "'" (single quote for all warehouses)
   */
  getStringQuoteChar(): string;

  /**
   * Get the character used for escaping string quotes
   * @returns Escape character: "\\" or "'" (varies by warehouse)
   */
  getEscapeStringQuoteChar(): string;

  /**
   * Get the floating-point type name for this warehouse
   * @returns Type name: 'FLOAT', 'FLOAT64', 'DOUBLE', or 'DOUBLE PRECISION'
   */
  getFloatingType(): string;

  /**
   * Generate SQL for a metric aggregation
   * @param sql - Column SQL expression
   * @param metric - Metric definition (type, sql, optional percentile)
   * @returns SQL string for the metric aggregation
   */
  getMetricSql(sql: string, metric: Metric): string;

  /**
   * Generate SQL for string concatenation
   * @param args - String expressions to concatenate
   * @returns SQL concatenation expression
   */
  concatString(...args: string[]): string;

  /**
   * Escape a string value for use in a SQL query (helps prevent SQL injection)
   * @param value - Raw string value
   * @returns Escaped string safe to embed in a SQL query
   */
  escapeString(value: string): string;
}
```
Clients implement the SQL builder interface:

```typescript
import { MetricType } from '@lightdash/common';
import { warehouseClientFromCredentials } from '@lightdash/warehouses';

const client = warehouseClientFromCredentials(credentials);

// Get quote characters
const quote = client.getFieldQuoteChar();
const sql = `SELECT ${quote}user_id${quote} FROM ${quote}users${quote}`;

// Escape user input
const safe = client.escapeString(userInput);

// Generate metric SQL
const sumSql = client.getMetricSql('revenue', {
  type: MetricType.SUM,
  sql: 'revenue',
  // Note: the full Metric interface in @lightdash/common extends Field
  // and includes additional properties. For SQL builder usage, only
  // type, sql, and the optional percentile are used.
});
```

Metric Interface (simplified for SQL builder usage):
```typescript
import {
  MetricType,
  FieldType,
  type MetricFilterRule,
  type CustomFormat,
  type Format,
  type FieldUrl,
} from '@lightdash/common';

/**
 * Metric definition for SQL generation
 * Note: The full Metric interface in @lightdash/common extends Field and includes
 * many additional required properties. SQL builders primarily use type, sql, and percentile.
 *
 * For getMetricSql() usage, you typically only need to provide:
 */
interface MetricForSqlBuilder {
  /** Metric aggregation type */
  type: MetricType;
  /** SQL expression for the metric (column name or calculation) */
  sql: string;
  /** Optional percentile value (0-100) for percentile metrics */
  percentile?: number;
}

/**
 * Complete Metric interface (from @lightdash/common)
 * Extends the Field interface with metric-specific properties
 */
interface Metric extends Field {
  /** Field type discriminator - must be FieldType.METRIC */
  fieldType: FieldType.METRIC;
  /** Metric aggregation type */
  type: MetricType;
  /** SQL expression for the metric */
  sql: string;
  /** Unique metric name within the table */
  name: string;
  /** Human-readable label */
  label: string;
  /** Table name this metric belongs to */
  table: string;
  /** Human-readable table label */
  tableLabel: string;
  /** Whether this metric is hidden */
  hidden: boolean;
  /** Optional percentile value (0-100) for percentile metrics */
  percentile?: number;
  /** Optional metric filters */
  filters?: MetricFilterRule[];
  /** Optional format options */
  formatOptions?: CustomFormat;
  /** Optional description */
  description?: string;
  // ... plus additional optional fields from the Field interface
}

/**
 * Base Field interface (from @lightdash/common)
 * All metrics, dimensions, and table calculations extend this
 */
interface Field {
  fieldType: FieldType;
  type: string;
  name: string;
  label: string;
  table: string;
  tableLabel: string;
  sql: string;
  description?: string;
  hidden: boolean;
  format?: Format | string;
  groups?: string[];
  urls?: FieldUrl[];
  index?: number;
  tags?: string[];
}
```

getMetricSql() maps basic metric types to standard aggregations:

```
MetricType.SUM            → SUM(col)
MetricType.COUNT          → COUNT(col)
MetricType.COUNT_DISTINCT → COUNT(DISTINCT col)
MetricType.MIN            → MIN(col)
MetricType.MAX            → MAX(col)
MetricType.AVERAGE        → AVG(col)
// PostgreSQL: AVG(col::DOUBLE PRECISION)
```

Median and percentile metrics vary by warehouse.

BigQuery (approximate):
```
MEDIAN     → APPROX_QUANTILES(col, 100)[OFFSET(50)]
PERCENTILE → APPROX_QUANTILES(col, 100)[OFFSET(95)]
```

ClickHouse:

```
MEDIAN     → median(col)
PERCENTILE → quantile(0.95)(col)
```

Databricks:

```
MEDIAN     → PERCENTILE(col, 0.5)
PERCENTILE → PERCENTILE(col, 0.95)
```

PostgreSQL/Snowflake (exact):

```
MEDIAN     → PERCENTILE_CONT(0.5) WITHIN GROUP (ORDER BY col)
PERCENTILE → PERCENTILE_CONT(0.95) WITHIN GROUP (ORDER BY col)
```

Trino (approximate):

```
MEDIAN     → APPROX_PERCENTILE(col, 0.5)
PERCENTILE → APPROX_PERCENTILE(col, 0.95)
```
All warehouses apply these escaping rules to prevent SQL injection.

Security Implementation: The `escapeString()` method performs multiple passes to sanitize input.

Common escaping for all warehouses:

- Removes SQL comments: `--` (line comments) and `/* */` (block comments)
- Removes null bytes (`\x00`)

Warehouse-specific rules (the per-warehouse headings were lost from this section, so the groups below are listed by behaviour):

- Single quotes doubled, backslashes doubled: `O'Reilly` → `O''Reilly`, `path\file` → `path\\file`. Example: `client.escapeString("O'Reilly's path\\file")` → `O''Reilly''s path\\\\file`
- Backslash escaping of quotes and backslashes: `O'Reilly` → `O\'Reilly`, `Say "Hi"` → `Say \"Hi\"`, `path\file` → `path\\file`. Example: `client.escapeString("O'Reilly's \"path\"")` → `O\'Reilly\'s \"path\"`
- Single quotes doubled only: `O'Reilly` → `O''Reilly`
Security Notes:

- Always use `escapeString()` for user input in dynamic SQL
- Prefer parameterized queries (`values` or `queryParams`) when available

PostgreSQL/Redshift (`||` operator):
```
concatString('"first"', "' '", '"last"')
→ ("first" || ' ' || "last")
```

Others (CONCAT function):

```
concatString('"first"', "' '", '"last"')
→ CONCAT("first", ' ', "last")
```
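The two concatenation styles above can be sketched as one standalone function. `concatSketch` is a hypothetical name and the adapter strings are illustrative; the real builders implement `concatString()` per warehouse class:

```typescript
// Hypothetical sketch of the two concatenation styles (not library code).
function concatSketch(adapter: string, ...args: string[]): string {
  if (adapter === 'postgres' || adapter === 'redshift') {
    // PostgreSQL/Redshift: chain with the || operator, wrapped in parentheses.
    return `(${args.join(' || ')})`;
  }
  // All other warehouses: use the CONCAT() function.
  return `CONCAT(${args.join(', ')})`;
}
```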
```typescript
import {
  BigquerySqlBuilder,
  // ClickhouseSqlBuilder - NOT exported
  DatabricksSqlBuilder,
  PostgresSqlBuilder,
  RedshiftSqlBuilder,
  SnowflakeSqlBuilder,
  TrinoSqlBuilder,
} from '@lightdash/warehouses';

const builder = new PostgresSqlBuilder();
```

Or use the factory function:

```typescript
const builder = warehouseSqlBuilderFromType(
  SupportedDbtAdapter.SNOWFLAKE,
  WeekDay.MONDAY,
);
builder.getStartOfWeek(); // 0 (Monday)
```