Factory functions that create warehouse clients and SQL builders based on configuration, providing a unified entry point without requiring direct instantiation of specific warehouse classes.
Key Points:
- warehouseClientFromCredentials() - creates a warehouse client from a credentials object
- warehouseSqlBuilderFromType() - creates a SQL builder from an adapter type string
- The type field in the credentials object (e.g., WarehouseTypes.POSTGRES) selects the client
- The optional startOfWeek parameter (WeekDay enum) configures week-based date calculations

When to Use:
Use these factories when the warehouse or adapter type is only known at runtime (e.g., loaded from configuration). When the type is known at compile time, the specific classes can be imported and instantiated directly (see the decision tree at the end of this page).
warehouseClientFromCredentials:
Creates the appropriate warehouse client instance based on the credentials type.
```typescript
// Import from @lightdash/warehouses and @lightdash/common
import { warehouseClientFromCredentials } from '@lightdash/warehouses';
import { type CreateWarehouseCredentials, type WarehouseClient, WarehouseTypes } from '@lightdash/common';

/**
 * Creates a warehouse client instance based on credentials type
 * @param credentials - Warehouse-specific credentials object with type discriminator
 * @returns WarehouseClient instance for the specified warehouse type
 * @throws {UnexpectedServerError} If credentials type is not recognized
 */
function warehouseClientFromCredentials(
  credentials: CreateWarehouseCredentials
): WarehouseClient;
```

Usage Examples:
```typescript
import { warehouseClientFromCredentials } from '@lightdash/warehouses';
import { WarehouseTypes } from '@lightdash/common';

// Create a PostgreSQL client
const pgClient = warehouseClientFromCredentials({
  type: WarehouseTypes.POSTGRES,
  host: 'localhost',
  port: 5432,
  user: 'myuser',
  password: 'mypassword',
  dbname: 'analytics',
  schema: 'public',
});

// Create a BigQuery client
const bqClient = warehouseClientFromCredentials({
  type: WarehouseTypes.BIGQUERY,
  project: 'my-project-id',
  dataset: 'my_dataset',
  location: 'US',
  timeoutSeconds: 300,
  priority: 'interactive',
  retries: 3,
  maximumBytesBilled: 1000000000,
  keyfileContents: {
    type: 'service_account',
    project_id: 'my-project-id',
    private_key: '-----BEGIN PRIVATE KEY-----\n...',
    client_email: 'service-account@my-project.iam.gserviceaccount.com',
  },
});

// Create a Snowflake client
const sfClient = warehouseClientFromCredentials({
  type: WarehouseTypes.SNOWFLAKE,
  account: 'my-account',
  user: 'myuser',
  password: 'mypassword',
  database: 'ANALYTICS',
  warehouse: 'COMPUTE_WH',
  schema: 'PUBLIC',
});

// Test connection
try {
  await pgClient.test();
  console.log('Connection successful');
} catch (error) {
  console.error('Connection failed:', error.message);
}
```

Supported Warehouse Types:
The function supports the following warehouse types from the WarehouseTypes enum:
- WarehouseTypes.BIGQUERY - creates BigqueryWarehouseClient
- WarehouseTypes.CLICKHOUSE - creates ClickhouseWarehouseClient (note: ClickhouseWarehouseClient is not directly importable; it is only accessible via this factory. See Clickhouse Client)
- WarehouseTypes.DATABRICKS - creates DatabricksWarehouseClient
- WarehouseTypes.POSTGRES - creates PostgresWarehouseClient
- WarehouseTypes.REDSHIFT - creates RedshiftWarehouseClient
- WarehouseTypes.SNOWFLAKE - creates SnowflakeWarehouseClient
- WarehouseTypes.TRINO - creates TrinoWarehouseClient

warehouseSqlBuilderFromType:
Creates the appropriate SQL builder instance based on the adapter type string.
```typescript
// Import from @lightdash/warehouses and @lightdash/common
import { warehouseSqlBuilderFromType } from '@lightdash/warehouses';
import { SupportedDbtAdapter, WeekDay } from '@lightdash/common';

/**
 * Creates a SQL builder instance based on adapter type
 * @param adapterType - DBT adapter type string or SupportedDbtAdapter enum value
 * @param startOfWeek - Optional week start day for date calculations (WeekDay enum value, null, or undefined)
 * @returns SQL builder instance for the specified adapter type
 * @throws {Error} If adapter type is not recognized
 *
 * Note: ClickhouseSqlBuilder is only accessible via this factory function.
 * See [SQL Builders - Export Status](./sql-builders.md#export-status) for details.
 */
function warehouseSqlBuilderFromType(
  adapterType: string | SupportedDbtAdapter,
  startOfWeek?: WeekDay | null
):
  | BigquerySqlBuilder
  | ClickhouseSqlBuilder
  | DatabricksSqlBuilder
  | PostgresSqlBuilder
  | RedshiftSqlBuilder
  | SnowflakeSqlBuilder
  | TrinoSqlBuilder;
```

Usage Examples:
```typescript
import { warehouseSqlBuilderFromType } from '@lightdash/warehouses';
import { SupportedDbtAdapter, WeekDay, MetricType, FieldType } from '@lightdash/common';

// Create a BigQuery SQL builder
const bqBuilder = warehouseSqlBuilderFromType(SupportedDbtAdapter.BIGQUERY);

// Create a Snowflake SQL builder with week start configuration
const sfBuilder = warehouseSqlBuilderFromType(
  SupportedDbtAdapter.SNOWFLAKE,
  WeekDay.MONDAY
);

// Use the builder
const fieldQuote = bqBuilder.getFieldQuoteChar(); // Returns '`'
const escapedString = sfBuilder.escapeString("O'Reilly"); // Returns 'O''Reilly'
const floatType = bqBuilder.getFloatingType(); // Returns 'FLOAT64'

// Generate metric SQL (requires a complete Metric object with ALL required Field properties)
const metricSql = bqBuilder.getMetricSql(
  'amount', // SQL expression to aggregate
  {
    fieldType: FieldType.METRIC, // Required: Field type identifier
    type: MetricType.SUM, // Required: Metric aggregation type
    sql: 'amount', // Required: SQL expression
    name: 'total_amount', // Required: Unique field name
    label: 'Total Amount', // Required: Display label
    table: 'orders', // Required: Source table name
    tableLabel: 'Orders', // Required: Table display label
    hidden: false, // Required: Visibility flag
  }
);
// Result: 'SUM(amount)'
```
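The quote-doubling behavior of escapeString shown above can be reproduced in a few lines. The escapeSingleQuotes function below is a local illustration of the idea, not the library's implementation (real builders may escape additional characters, such as backslashes, depending on the warehouse dialect):

```typescript
// Local sketch of SQL single-quote escaping (not the library implementation):
// each embedded single quote is doubled so the value can be safely wrapped
// in a '...' SQL string literal.
function escapeSingleQuotes(value: string): string {
  return value.replace(/'/g, "''");
}

const escaped = escapeSingleQuotes("O'Reilly");
console.log(escaped); // O''Reilly
console.log(`'${escaped}'`); // 'O''Reilly' - ready to embed in a SQL literal
```

This is why the sfBuilder.escapeString("O'Reilly") call above returns O''Reilly: doubled quotes are the standard SQL way to represent a literal quote inside a quoted string.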
Supported Adapter Types:
The function supports the following adapter types from the SupportedDbtAdapter enum:
- SupportedDbtAdapter.BIGQUERY - creates BigquerySqlBuilder
- SupportedDbtAdapter.CLICKHOUSE - creates ClickhouseSqlBuilder (note: ClickhouseSqlBuilder is not directly importable; it is only accessible via this factory. See SQL Builders - Export Status)
- SupportedDbtAdapter.DATABRICKS - creates DatabricksSqlBuilder
- SupportedDbtAdapter.POSTGRES - creates PostgresSqlBuilder
- SupportedDbtAdapter.REDSHIFT - creates RedshiftSqlBuilder
- SupportedDbtAdapter.SNOWFLAKE - creates SnowflakeSqlBuilder
- SupportedDbtAdapter.TRINO - creates TrinoSqlBuilder

When to use direct imports vs factory:
- Every SQL builder except ClickhouseSqlBuilder can also be instantiated directly (e.g., new BigquerySqlBuilder())
- ClickhouseSqlBuilder is not exported and can only be created via this factory function

Valid Inputs:
- credentials.type must be one of: 'bigquery', 'clickhouse', 'databricks', 'postgres', 'redshift', 'snowflake', 'trino'

Invalid Inputs:
- Unrecognized type value → throws UnexpectedServerError with message "Warehouse credentials type were not recognised"
- Invalid credential values → error from the warehouse client constructor
- null or undefined credentials → throws TypeError

Example Error Handling:
```typescript
try {
  const client = warehouseClientFromCredentials({
    type: 'invalid-type' as any, // Invalid warehouse type
    // ...
  });
} catch (error) {
  console.error('Error creating client:', error.message);
  // "Warehouse credentials type were not recognised"
}

try {
  const client = warehouseClientFromCredentials({
    type: WarehouseTypes.POSTGRES,
    host: 'localhost',
    // Missing required fields: port, user, password, dbname, schema
  } as any);
} catch (error) {
  console.error('Missing required fields:', error.message);
}
```

Valid Inputs:
- adapterType must be one of: 'bigquery', 'clickhouse', 'databricks', 'postgres', 'redshift', 'snowflake', 'trino'
- startOfWeek (optional) must be a WeekDay enum value (0-6), null, or undefined

Invalid Inputs:
- Unrecognized adapterType → throws Error with message "Invalid adapter type: <type>. Must be one of: bigquery, clickhouse, databricks, postgres, redshift, snowflake, trino"
- Out-of-range startOfWeek value (not 0-6) → may cause unexpected behavior in date calculations
- null or undefined adapterType → throws Error

Example Error Handling:
```typescript
try {
  const builder = warehouseSqlBuilderFromType('invalid-adapter');
} catch (error) {
  console.error('Invalid adapter type:', error.message);
  // "Invalid adapter type: invalid-adapter. Must be one of: bigquery, clickhouse, databricks, postgres, redshift, snowflake, trino"
}

// Valid with week start
const builder = warehouseSqlBuilderFromType(
  SupportedDbtAdapter.POSTGRES,
  WeekDay.SUNDAY
);
```

Default Parameter Values (warehouseClientFromCredentials):
No default parameter values - all credentials are passed through to the warehouse constructors as-is. Each warehouse client may apply its own defaults:
- BigQuery: timeoutSeconds has no default in the credentials
- Postgres: port defaults to 5432, sslmode defaults to 'disable'
- Redshift: port defaults to 5439, sslmode defaults to 'prefer'
- Snowflake: port defaults to 443, threads defaults to 4, clientSessionKeepAlive defaults to false
- Trino: port defaults to 8080, http_scheme defaults to 'https'

Default Parameter Values (warehouseSqlBuilderFromType):
- startOfWeek: undefined (not set) - date calculations use warehouse defaults

Common Mistakes:

```typescript
// ❌ WRONG - Missing required fields
const client = warehouseClientFromCredentials({
  type: WarehouseTypes.POSTGRES,
  host: 'localhost',
  // Missing: port, user, password, dbname, schema
} as any);

// ✅ CORRECT - All required fields provided
const client = warehouseClientFromCredentials({
  type: WarehouseTypes.POSTGRES,
  host: 'localhost',
  port: 5432,
  user: 'myuser',
  password: 'mypassword',
  dbname: 'mydb',
  schema: 'public',
  sslmode: 'disable',
});
```

```typescript
// ⚠️ ACCEPTABLE but not type-safe
const client = warehouseClientFromCredentials({
  type: 'postgres', // String literal - no compile-time checking
  // ...
});

// ✅ BETTER - Use enum for type safety
const client = warehouseClientFromCredentials({
  type: WarehouseTypes.POSTGRES, // Enum - compile-time checking
  // ...
});
```

```typescript
// ❌ WRONG - Incomplete metric object
const sql = builder.getMetricSql('amount', {
  type: MetricType.SUM,
  sql: 'amount',
  // Missing required Field properties
} as any);

// ✅ CORRECT - Complete metric object
const sql = builder.getMetricSql('amount', {
  fieldType: FieldType.METRIC,
  type: MetricType.SUM,
  sql: 'amount',
  name: 'total_amount',
  label: 'Total Amount',
  table: 'orders',
  tableLabel: 'Orders',
  hidden: false,
});
```

Edge Cases:

```typescript
// null credentials → throws TypeError
try {
  warehouseClientFromCredentials(null as any);
} catch (error) {
  // TypeError: Cannot read properties of null
}

// undefined credentials → throws TypeError
try {
  warehouseClientFromCredentials(undefined as any);
} catch (error) {
  // TypeError: Cannot read properties of undefined
}

// null adapter type → throws Error
try {
  warehouseSqlBuilderFromType(null as any);
} catch (error) {
  // Error: Invalid adapter type: null. Must be one of: bigquery, clickhouse, databricks, postgres, redshift, snowflake, trino
}

// startOfWeek = null is valid (resets to undefined)
const builder = warehouseSqlBuilderFromType(
  SupportedDbtAdapter.POSTGRES,
  null // Valid - resets week start to undefined
);
```

```typescript
// Warehouse types are case-sensitive
// ❌ WRONG - incorrect casing
const client = warehouseClientFromCredentials({
  type: 'POSTGRES' as any, // Wrong case
  // ...
});

// ✅ CORRECT - lowercase
const client = warehouseClientFromCredentials({
  type: WarehouseTypes.POSTGRES, // 'postgres'
  // ...
});
```

Dynamic Warehouse Selection:

```typescript
// Dynamic warehouse selection based on runtime config
function createClientFromConfig(config: any) {
  // Map config to warehouse type
  const typeMap: Record<string, WarehouseTypes> = {
    pg: WarehouseTypes.POSTGRES,
    bq: WarehouseTypes.BIGQUERY,
    sf: WarehouseTypes.SNOWFLAKE,
  };
  const warehouseType = typeMap[config.database_type];
  if (!warehouseType) {
    throw new Error(`Unsupported database type: ${config.database_type}`);
  }
  // Build credentials object
  const credentials = {
    type: warehouseType,
    ...config.credentials,
  };
  return warehouseClientFromCredentials(credentials);
}
```

Troubleshooting:

Error: "Warehouse credentials type were not recognised"
Cause: Invalid type field in credentials
Solution:
```typescript
// Check that type matches WarehouseTypes enum
console.log(Object.values(WarehouseTypes));
// ['bigquery', 'clickhouse', 'databricks', 'postgres', 'redshift', 'snowflake', 'trino']

// Ensure credentials.type is one of these values
const client = warehouseClientFromCredentials({
  type: WarehouseTypes.POSTGRES, // Use enum, not string literal
  // ...
});
```

Cause: Invalid adapterType parameter for SQL builder
Full Error Message: "Invalid adapter type: {type}. Must be one of: bigquery, clickhouse, databricks, postgres, redshift, snowflake, trino"
Solution:
```typescript
// Check that adapter type matches SupportedDbtAdapter enum
console.log(Object.values(SupportedDbtAdapter));
// ['bigquery', 'clickhouse', 'databricks', 'postgres', 'redshift', 'snowflake', 'trino']

// Ensure adapterType is one of these values
const builder = warehouseSqlBuilderFromType(
  SupportedDbtAdapter.POSTGRES, // Use enum, not string literal
  WeekDay.MONDAY
);
```

Cause: Incomplete credentials object
Solution: Refer to warehouse-specific documentation for required fields:
Type Definitions:

```typescript
/**
 * Discriminated union of all warehouse credential types
 * Type discriminator is the 'type' property
 */
type CreateWarehouseCredentials =
  | CreateBigqueryCredentials
  | CreateClickhouseCredentials
  | CreateDatabricksCredentials
  | CreatePostgresCredentials
  | CreateRedshiftCredentials
  | CreateSnowflakeCredentials
  | CreateTrinoCredentials;

/**
 * DBT adapter type enum for SQL builder selection
 */
enum SupportedDbtAdapter {
  BIGQUERY = 'bigquery',
  CLICKHOUSE = 'clickhouse',
  DATABRICKS = 'databricks',
  POSTGRES = 'postgres',
  REDSHIFT = 'redshift',
  SNOWFLAKE = 'snowflake',
  TRINO = 'trino',
}

/**
 * Warehouse type enum for credential type discrimination
 */
enum WarehouseTypes {
  BIGQUERY = 'bigquery',
  CLICKHOUSE = 'clickhouse',
  DATABRICKS = 'databricks',
  POSTGRES = 'postgres',
  REDSHIFT = 'redshift',
  SNOWFLAKE = 'snowflake',
  TRINO = 'trino',
}

/**
 * Week start day configuration for SQL builders
 */
enum WeekDay {
  MONDAY = 0,
  TUESDAY = 1,
  WEDNESDAY = 2,
  THURSDAY = 3,
  FRIDAY = 4,
  SATURDAY = 5,
  SUNDAY = 6,
}
```
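The CreateWarehouseCredentials union relies on TypeScript discriminated-union narrowing via the type property, which is how the factory can safely route to warehouse-specific constructors. The sketch below demonstrates the mechanism with minimal local stand-in types; PostgresCreds, BigqueryCreds, and describe are illustrative only, not the real union members (which carry many more fields):

```typescript
// Local stand-ins mirroring the discriminated-union shape above (assumption:
// these are NOT the real CreateWarehouseCredentials members).
type PostgresCreds = { type: 'postgres'; host: string; port: number };
type BigqueryCreds = { type: 'bigquery'; project: string };
type Creds = PostgresCreds | BigqueryCreds;

function describe(credentials: Creds): string {
  switch (credentials.type) {
    case 'postgres':
      // Narrowed to PostgresCreds: host and port are accessible here
      return `postgres at ${credentials.host}:${credentials.port}`;
    case 'bigquery':
      // Narrowed to BigqueryCreds
      return `bigquery project ${credentials.project}`;
    default: {
      // Exhaustiveness check: fails to compile if a variant is unhandled
      const unreachable: never = credentials;
      throw new Error(`Unrecognized type: ${unreachable}`);
    }
  }
}

console.log(describe({ type: 'postgres', host: 'localhost', port: 5432 }));
```

The never-typed default branch is the compile-time analogue of the factory's runtime "credentials type were not recognised" error: adding a new union member without handling it becomes a type error.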
The package also exports additional warehouse-specific helper functions, such as OAuth authentication helpers for Databricks. These are documented in detail in Databricks Client.
```typescript
/**
 * Exchange OAuth client credentials for access token (M2M flow)
 */
function exchangeDatabricksOAuthCredentials(
  host: string,
  clientId: string,
  clientSecret: string
): Promise<{ accessToken: string; refreshToken?: string }>;

/**
 * Refresh OAuth access token using refresh token
 */
function refreshDatabricksOAuthToken(
  host: string,
  clientId: string,
  refreshToken: string
): Promise<{ accessToken: string; refreshToken: string; expiresIn: number }>;
```

All warehouse-specific client classes are exported for direct instantiation if needed:
```typescript
import {
  BigqueryWarehouseClient,
  DatabricksWarehouseClient,
  PostgresWarehouseClient,
  RedshiftWarehouseClient,
  SnowflakeWarehouseClient,
  TrinoWarehouseClient,
} from '@lightdash/warehouses';
```

Note: It's recommended to use the factory function for consistent client creation across all warehouse types.
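One practical benefit of standardizing on the factory is that creation logic can be wrapped once for all warehouse types. The sketch below shows the idea with local stand-ins; WarehouseClientLike and createTestedClient are hypothetical helpers, not part of the package, and assume only that clients expose the test() method shown in the usage examples above:

```typescript
// Local sketch (assumption): a minimal client shape with the test() method
// demonstrated earlier. The real WarehouseClient interface has more members.
interface WarehouseClientLike {
  test(): Promise<void>;
}

// Wrap any factory-style creation with an eager connectivity check,
// so bad credentials surface at creation time rather than at first query.
async function createTestedClient(
  makeClient: () => WarehouseClientLike
): Promise<WarehouseClientLike> {
  const client = makeClient();
  await client.test();
  return client;
}

// Usage with a fake client for demonstration:
const fakeClient: WarehouseClientLike = { test: async () => {} };
createTestedClient(() => fakeClient).then(() => console.log('connection verified'));
```

Because the factory returns the common WarehouseClient type for every warehouse, a wrapper like this works uniformly; with direct class imports the same logic would need to be repeated or generically parameterized per class.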
All warehouse-specific SQL builder classes are exported:
```typescript
import {
  BigquerySqlBuilder,
  DatabricksSqlBuilder,
  PostgresSqlBuilder,
  RedshiftSqlBuilder,
  SnowflakeSqlBuilder,
  TrinoSqlBuilder,
} from '@lightdash/warehouses';
```

When to use direct imports vs factory:
Use a direct import when the type is known at compile time (e.g., new BigquerySqlBuilder()); use the factory when the type is determined at runtime.

```text
Do you need to create a client or SQL builder?
├─ Client
│   └─ Do you know the warehouse type at compile time?
│       ├─ Yes (e.g., always PostgreSQL)
│       │   └─ Use direct import: new PostgresWarehouseClient(credentials)
│       └─ No (determined at runtime from config)
│           └─ Use factory: warehouseClientFromCredentials(credentials)
│
└─ SQL Builder
    └─ Do you know the adapter type at compile time?
        ├─ Yes (e.g., always BigQuery)
        │   └─ Use direct import: new BigquerySqlBuilder(startOfWeek)
        └─ No (determined at runtime from config)
            └─ Use factory: warehouseSqlBuilderFromType(adapterType, startOfWeek)
```
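The two branches of this decision tree can be sketched side by side with local stand-in classes; StubPostgresClient, StubBigqueryClient, and stubClientFromType below are illustrative only, not the real exports:

```typescript
// Local stand-ins (assumption): minimal stubs in place of the real
// PostgresWarehouseClient / BigqueryWarehouseClient classes.
class StubPostgresClient {
  constructor(public readonly host: string) {}
}
class StubBigqueryClient {
  constructor(public readonly project: string) {}
}

type StubType = 'postgres' | 'bigquery';

// Runtime-known type: a factory dispatches on the type string and
// returns the union of possible client types.
function stubClientFromType(type: StubType): StubPostgresClient | StubBigqueryClient {
  switch (type) {
    case 'postgres':
      return new StubPostgresClient('localhost');
    case 'bigquery':
      return new StubBigqueryClient('my-project');
  }
}

// Compile-time-known type: direct construction preserves the concrete type.
const direct = new StubPostgresClient('localhost'); // type: StubPostgresClient

// Runtime-known type: the caller sees only the union type.
const fromConfig = stubClientFromType('bigquery');
console.log(fromConfig instanceof StubBigqueryClient); // true
```

This is the trade-off the tree encodes: direct construction gives the narrowest static type, while the factory trades that precision for the ability to choose the implementation at runtime.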