Comprehensive Table/SQL distribution for Apache Flink with the Blink planner for optimized table processing in both batch and streaming modes.
```bash
npx @tessl/cli install tessl/maven-org-apache-flink--flink-table-uber-blink-2-12@1.13.0
```

Apache Flink Table Uber Blink is a comprehensive distribution package that bundles all necessary components for Table/SQL programming within the Apache Flink ecosystem. It provides a unified JAR containing the table common APIs, SQL parsers (including Hive support), the Table APIs for both Java and Scala, bridge APIs for DataStream integration, the Blink query planner for optimization, runtime components, and Complex Event Processing (CEP) capabilities.
This uber JAR is designed for applications that need complete table processing functionality without managing multiple dependencies, supporting both batch and streaming table operations with the Blink planner's advanced query optimization capabilities.
For Maven:

```xml
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-table-uber-blink_2.12</artifactId>
    <version>1.13.6</version>
</dependency>
```

For Gradle:
```groovy
implementation 'org.apache.flink:flink-table-uber-blink_2.12:1.13.6'
```

Java:
```java
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import static org.apache.flink.table.api.Expressions.*;
```

Scala:
```scala
import org.apache.flink.table.api._
import org.apache.flink.table.api.bridge.scala._
import org.apache.flink.streaming.api.scala._
```

Creating Table Environment (Java):
```java
// Pure table environment
EnvironmentSettings settings = EnvironmentSettings.newInstance()
    .useBlinkPlanner()
    .inStreamingMode()
    .build();
TableEnvironment tEnv = TableEnvironment.create(settings);

// DataStream integration
StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
StreamTableEnvironment streamTableEnv = StreamTableEnvironment.create(env);
```

Basic Table Operations:
```java
// Create table from SQL DDL
tEnv.executeSql(
    "CREATE TABLE source_table (" +
    "  user_id BIGINT," +
    "  item_id BIGINT," +
    "  behavior STRING," +
    "  ts TIMESTAMP(3)" +
    ") WITH (" +
    "  'connector' = 'filesystem'," +
    "  'path' = '/path/to/data'," +
    "  'format' = 'csv'" +
    ")"
);

// Query with the Table API
Table sourceTable = tEnv.from("source_table");
Table result = sourceTable
    .where($("behavior").isEqual("click"))
    .groupBy($("user_id"))
    .select($("user_id"), $("user_id").count().as("click_count"));

// Execute the query and print the result
result.execute().print();
```

The package contains several key architectural components:
Essential table environment setup, table creation, and basic operations.
Key APIs:

```java
// Table environment factory
static TableEnvironment create(EnvironmentSettings settings);

// SQL execution
TableResult executeSql(String statement);
Table sqlQuery(String query);

// Table operations
Table from(String path);
void createTable(String path, TableDescriptor descriptor);
```
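As a quick sketch of these entry points, the following runs a query through `sqlQuery` and iterates the collected result; it assumes the `source_table` DDL from the example above has been executed, and uses `Row` from `org.apache.flink.types` and `CloseableIterator` from `org.apache.flink.util`:

```java
// Sketch: query the earlier source_table and print each result row.
Table clicks = tEnv.sqlQuery(
    "SELECT user_id, COUNT(*) AS click_count " +
    "FROM source_table WHERE behavior = 'click' GROUP BY user_id");

// execute() produces a TableResult; collect() streams the rows back.
try (CloseableIterator<Row> rows = clicks.execute().collect()) {
    while (rows.hasNext()) {
        System.out.println(rows.next());
    }
}
```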
Type-safe expression building DSL for Table API operations.

Key APIs:

```java
// Field references and literals
static ApiExpression $(String name);
static ApiExpression lit(Object v);
static ApiExpression lit(Object v, DataType dataType);
// Logical operations
static ApiExpression and(Object predicate0, Object predicate1, Object... predicates);
static ApiExpression or(Object predicate0, Object predicate1, Object... predicates);
// Function calls
static ApiExpression call(String path, Object... arguments);
static ApiExpression call(UserDefinedFunction function, Object... arguments);
static ApiExpression callSql(String sqlExpression);
// Collections
static ApiExpression array(Object head, Object... tail);
static ApiExpression row(Object head, Object... tail);
static ApiExpression map(Object key, Object value, Object... tail);
```
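A minimal sketch of how these builders compose; the column names reuse the earlier `source_table` example, and `$`, `lit`, `and`, and `row` come from the static `Expressions.*` import shown above:

```java
// Filter with a composed predicate and build a nested row-typed column.
Table expressive = tEnv.from("source_table")
    .where(and(
        $("behavior").isEqual(lit("click")),
        $("user_id").isGreater(lit(0L))))
    .select(
        $("user_id"),
        row($("item_id"), $("ts")).as("item_with_time"));
```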
Complete SQL DDL, DML, and query capabilities with Hive compatibility.

Key APIs:

```java
// SQL query execution
Table sqlQuery(String query);
TableResult executeSql(String statement);
// SQL parsing and validation
SqlParser createParser(String sql);
```
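As a sketch of DDL plus DML execution, the following declares a sink with the built-in `print` connector and submits an `INSERT INTO`; table names follow the earlier example:

```java
// DDL: declare a sink table backed by the built-in 'print' connector.
tEnv.executeSql(
    "CREATE TABLE sink_table (user_id BIGINT, click_count BIGINT) " +
    "WITH ('connector' = 'print')");

// DML: INSERT INTO submits a job; the TableResult can be used to
// wait for completion or to access the job client.
TableResult insertion = tEnv.executeSql(
    "INSERT INTO sink_table " +
    "SELECT user_id, COUNT(*) FROM source_table " +
    "WHERE behavior = 'click' GROUP BY user_id");
```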
Seamless conversion between Flink Tables and DataStreams for hybrid processing.

Key APIs:

```java
// DataStream to Table conversion
Table fromDataStream(DataStream<T> dataStream);
Table fromDataStream(DataStream<T> dataStream, Expression... fields);
// Table to DataStream conversion
DataStream<T> toDataStream(Table table, Class<T> targetClass);
DataStream<Row> toChangelogStream(Table table);
```
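A minimal round-trip sketch, assuming the `env` and `streamTableEnv` variables from the setup example plus `Row` (`org.apache.flink.types`) and `Types` (`org.apache.flink.api.common.typeinfo`) imports:

```java
// DataStream -> Table: derive a table and rename the default f0/f1 columns.
DataStream<Row> input = env
    .fromElements(Row.of(1L, "click"), Row.of(2L, "view"))
    .returns(Types.ROW(Types.LONG, Types.STRING));
Table asTable = streamTableEnv.fromDataStream(input).as("user_id", "behavior");

// Table -> DataStream: a changelog stream also carries updates/retractions.
DataStream<Row> back = streamTableEnv.toChangelogStream(asTable);
back.print();
```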
Type System and Schema Management

Rich type definitions, schema inference, and catalog management.

Key APIs:

```java
// Data type factory
static DataType STRING();
static DataType INT();
static DataType TIMESTAMP(int precision);
// Schema management
ResolvedSchema getResolvedSchema();
List<Column> getColumns();
```
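For example, a schema can be declared with the `DataTypes` factory and an existing table's schema inspected through `getResolvedSchema` (a sketch; `Schema` and `DataTypes` live in `org.apache.flink.table.api`, `ResolvedSchema` in `org.apache.flink.table.catalog`):

```java
// Declare a schema programmatically with the DataTypes factory.
Schema schema = Schema.newBuilder()
    .column("user_id", DataTypes.BIGINT())
    .column("behavior", DataTypes.STRING())
    .column("ts", DataTypes.TIMESTAMP(3))
    .build();

// Inspect the resolved schema of an existing table.
ResolvedSchema resolved = tEnv.from("source_table").getResolvedSchema();
resolved.getColumns().forEach(System.out::println);
```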
Support for custom scalar, table, and aggregate functions.
Key APIs:

```java
// Function registration
void createFunction(String path, Class<? extends UserDefinedFunction> functionClass);
void createTemporaryFunction(String name, Class<? extends UserDefinedFunction> functionClass);
// Function base classes
abstract class ScalarFunction extends UserDefinedFunction;
abstract class TableFunction<T> extends UserDefinedFunction;
abstract class AggregateFunction<T, ACC> extends UserDefinedFunction;
```
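A minimal scalar-function sketch showing definition, registration, and use from the Table API; the `HashCode` class and name are illustrative, and `ScalarFunction` is imported from `org.apache.flink.table.functions`:

```java
// A user-defined scalar function: one public eval method per call signature.
public static class HashCode extends ScalarFunction {
    public int eval(String s) {
        return s == null ? 0 : s.hashCode();
    }
}

// Register under a name, then invoke it via call(...) or from SQL.
tEnv.createTemporarySystemFunction("HashCode", HashCode.class);
Table hashed = tEnv.from("source_table")
    .select($("user_id"), call("HashCode", $("behavior")).as("behavior_hash"));
```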
Window Operations and Time Processing

Time-based and count-based windowing for streaming data analysis.

Key APIs:

```java
// Window definitions
static Tumble over(Expression size);
static Slide over(Expression size);
static Session withGap(Expression gap);
// Windowed operations
GroupWindowedTable window(GroupWindow groupWindow);
Table select(Expression... fields);
```
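A tumbling-window sketch over the earlier `source_table`; it assumes `ts` was declared as an event-time attribute (e.g. with a `WATERMARK` clause in the DDL), with `Tumble` imported from `org.apache.flink.table.api`:

```java
// 10-minute tumbling windows keyed by user, counting clicks per window.
Table windowed = tEnv.from("source_table")
    .window(Tumble.over(lit(10).minutes()).on($("ts")).as("w"))
    .groupBy($("user_id"), $("w"))
    .select(
        $("user_id"),
        $("w").start().as("window_start"),
        $("user_id").count().as("clicks"));
```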
Pattern matching and complex event detection on streaming data.
Key APIs:

```java
// Pattern definitions
static Pattern<T, T> begin(String name);
Pattern<T, T> next(String name);
Pattern<T, T> followedBy(String name);
Pattern<T, F> within(Time within);
// Pattern application
static <T> PatternStream<T> pattern(DataStream<T> input, Pattern<T, ?> pattern);
```
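A hedged CEP sketch: detect a "login" event followed by a "purchase" within ten minutes. The `Event` POJO, its `getType()` accessor, and `eventStream` are hypothetical; `Pattern`, `SimpleCondition`, `CEP`, and `PatternStream` come from `org.apache.flink.cep`, and `Time` from `org.apache.flink.streaming.api.windowing.time`:

```java
// eventStream: a hypothetical DataStream<Event>.
Pattern<Event, ?> loginThenPurchase = Pattern.<Event>begin("login")
    .where(new SimpleCondition<Event>() {
        @Override
        public boolean filter(Event e) {
            return "login".equals(e.getType()); // Event#getType is hypothetical
        }
    })
    .next("purchase")
    .where(new SimpleCondition<Event>() {
        @Override
        public boolean filter(Event e) {
            return "purchase".equals(e.getType());
        }
    })
    .within(Time.minutes(10));

PatternStream<Event> matches = CEP.pattern(eventStream, loginThenPurchase);
```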
Multi-catalog support with database and table metadata management.

Key APIs:

```java
// Catalog operations
void registerCatalog(String catalogName, Catalog catalog);
void useCatalog(String catalogName);
void useDatabase(String databaseName);
// Metadata access
String[] listCatalogs();
String[] listDatabases();
String[] listTables();
```
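For instance, an additional in-memory catalog can be registered and made current (a sketch using `GenericInMemoryCatalog` from `org.apache.flink.table.catalog`):

```java
// Register a second catalog and switch the current namespace to it.
tEnv.registerCatalog("my_catalog", new GenericInMemoryCatalog("my_catalog"));
tEnv.useCatalog("my_catalog");
tEnv.useDatabase("default"); // GenericInMemoryCatalog's default database

// Enumerate metadata in the current catalog/database.
for (String table : tEnv.listTables()) {
    System.out.println(table);
}
```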