
tessl/maven-org-apache-flink--flink-table-uber-blink_2.12

Comprehensive Table/SQL distribution for Apache Flink with Blink planner for optimized table processing in both batch and streaming modes.

Describes: `mavenpkg:maven/org.apache.flink/flink-table-uber-blink_2.12@1.13.x`

To install, run:

```
npx @tessl/cli install tessl/maven-org-apache-flink--flink-table-uber-blink_2.12@1.13.0
```

# Apache Flink Table Uber Blink

Apache Flink Table Uber Blink is a comprehensive distribution package that bundles all necessary components for Table/SQL programming within the Apache Flink ecosystem. It provides a unified JAR containing the table common APIs, SQL parsers (including Hive support), Table APIs for both Java and Scala, bridge APIs for DataStream integration, the Blink query planner, runtime components, and Complex Event Processing (CEP) capabilities.

This uber JAR is designed for applications that need complete table processing functionality without managing multiple dependencies. It supports both batch and streaming table operations with the Blink planner's advanced query optimization.

## Package Information

- **Package Name**: flink-table-uber-blink_2.12
- **Package Type**: maven
- **Language**: Java/Scala
- **Installation**: Add the Maven dependency:

```xml
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-table-uber-blink_2.12</artifactId>
  <version>1.13.6</version>
</dependency>
```

For Gradle:

```gradle
implementation 'org.apache.flink:flink-table-uber-blink_2.12:1.13.6'
```

## Core Imports

**Java:**

```java
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import static org.apache.flink.table.api.Expressions.*;
```

**Scala:**

```scala
import org.apache.flink.table.api._
import org.apache.flink.table.api.bridge.scala._
import org.apache.flink.streaming.api.scala._
```

## Basic Usage

**Creating a Table Environment (Java):**

```java
// Pure table environment
EnvironmentSettings settings = EnvironmentSettings.newInstance()
    .useBlinkPlanner()
    .inStreamingMode()
    .build();
TableEnvironment tEnv = TableEnvironment.create(settings);

// DataStream integration
StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
StreamTableEnvironment streamTableEnv = StreamTableEnvironment.create(env);
```

**Basic Table Operations:**

```java
// Create a table from SQL DDL
tEnv.executeSql(
    "CREATE TABLE source_table (" +
    "  user_id BIGINT," +
    "  item_id BIGINT," +
    "  behavior STRING," +
    "  ts TIMESTAMP(3)" +
    ") WITH (" +
    "  'connector' = 'filesystem'," +
    "  'path' = '/path/to/data'," +
    "  'format' = 'csv'" +
    ")"
);

// Query with the Table API
Table sourceTable = tEnv.from("source_table");
Table result = sourceTable
    .where($("behavior").isEqual("click"))
    .groupBy($("user_id"))
    .select($("user_id"), $("user_id").count().as("click_count"));

// Execute the query and print the results
result.execute().print();
```

## Architecture

The package contains several key architectural components:

- **Table Environment**: Central entry point for table operations and SQL execution
- **Table API**: Fluent API for building table transformation pipelines
- **SQL Engine**: Full SQL support with DDL, DML, and query capabilities
- **Blink Planner**: Advanced query optimizer with cost-based optimization
- **Type System**: Rich data type definitions with schema inference
- **Catalog System**: Metadata management supporting multiple catalogs
- **Connector Ecosystem**: Pluggable connectors for various data sources
- **DataStream Integration**: Seamless interoperability with Flink's streaming API

## Capabilities

### Core Table Operations

Essential table environment setup, table creation, and basic operations.

**Key APIs:**

```java { .api }
// Table environment factory
static TableEnvironment create(EnvironmentSettings settings);

// SQL execution
TableResult executeSql(String statement);
Table sqlQuery(String query);

// Table operations
Table from(String path);
void createTable(String path, TableDescriptor descriptor);
```

[Core Table Operations](./core-table-operations.md)

### Expressions API

Type-safe expression-building DSL for Table API operations.

**Key APIs:**

```java { .api }
// Field references and literals
static ApiExpression $(String name);
static ApiExpression lit(Object v);
static ApiExpression lit(Object v, DataType dataType);

// Logical operations
static ApiExpression and(Object predicate0, Object predicate1, Object... predicates);
static ApiExpression or(Object predicate0, Object predicate1, Object... predicates);

// Function calls
static ApiExpression call(String path, Object... arguments);
static ApiExpression call(UserDefinedFunction function, Object... arguments);
static ApiExpression callSql(String sqlExpression);

// Collections
static ApiExpression array(Object head, Object... tail);
static ApiExpression row(Object head, Object... tail);
static ApiExpression map(Object key, Object value, Object... tail);
```
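These constructors compose into filter and projection logic. A minimal sketch, assuming a hypothetical `orders` table (e.g. obtained via `tEnv.from("orders")`) with `status`, `amount`, and `user_id` columns:

```java
import static org.apache.flink.table.api.Expressions.*;

import org.apache.flink.table.api.Table;

// Keep paid orders above a threshold, then project two columns.
// `orders` is a hypothetical Table; column names are assumptions.
Table highValuePaid = orders
    .where(and(
        $("status").isEqual(lit("PAID")),
        $("amount").isGreater(lit(100))))
    .select(
        $("user_id"),
        row($("amount"), $("status")).as("detail"));
```

Expressions are plain values, so shared predicates can be assigned to variables and reused across queries.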

### SQL and Query Processing

Complete SQL DDL, DML, and query capabilities with Hive compatibility.

**Key APIs:**

```java { .api }
// SQL query execution
Table sqlQuery(String query);
TableResult executeSql(String statement);

// SQL parsing and validation
SqlParser createParser(String sql);
```

[SQL and Query Processing](./sql-processing.md)
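A short sketch combining both entry points, assuming the `source_table` defined in Basic Usage and a compatible, hypothetical `sink_table`:

```java
// sqlQuery returns a Table that can be further transformed or executed.
Table clicksPerUser = tEnv.sqlQuery(
    "SELECT user_id, COUNT(*) AS click_count " +
    "FROM source_table WHERE behavior = 'click' GROUP BY user_id");

// executeSql runs DDL/DML directly and returns a TableResult;
// for INSERT statements it submits the job asynchronously.
tEnv.executeSql("INSERT INTO sink_table SELECT * FROM source_table");
```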

### DataStream Integration

Seamless conversion between Flink Tables and DataStreams for hybrid processing.

**Key APIs:**

```java { .api }
// DataStream to Table conversion
Table fromDataStream(DataStream<T> dataStream);
Table fromDataStream(DataStream<T> dataStream, Expression... fields);

// Table to DataStream conversion
DataStream<T> toDataStream(Table table, Class<T> targetClass);
DataStream<Row> toChangelogStream(Table table);
```

[DataStream Integration](./datastream-integration.md)
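A minimal round-trip sketch using the bridge API from the Core Imports section:

```java
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
import org.apache.flink.types.Row;

StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
StreamTableEnvironment tEnv = StreamTableEnvironment.create(env);

// DataStream -> Table: an atomic type becomes a single column (f0 by default).
DataStream<String> words = env.fromElements("alice", "bob");
Table wordTable = tEnv.fromDataStream(words);

// Table -> DataStream: toChangelogStream also carries retraction
// messages, so it works for updating query results.
DataStream<Row> rows = tEnv.toChangelogStream(wordTable);
```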

186

187

### Type System and Schema Management

188

189

Rich type definitions, schema inference, and catalog management.

190

191

**Key APIs:**

192

```java { .api }

193

// Data type factory

194

static DataType STRING();

195

static DataType INT();

196

static DataType TIMESTAMP(int precision);

197

198

// Schema management

199

ResolvedSchema getResolvedSchema();

200

List<Column> getColumns();

201

```

202

203

[Type System and Schema Management](./type-system.md)
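A sketch of building a composite type and a declarative schema with the `DataTypes` factory; the column names mirror the `source_table` example above:

```java
import org.apache.flink.table.api.DataTypes;
import org.apache.flink.table.api.Schema;
import org.apache.flink.table.types.DataType;

// Composite row type assembled from factory methods.
DataType rowType = DataTypes.ROW(
    DataTypes.FIELD("user_id", DataTypes.BIGINT()),
    DataTypes.FIELD("ts", DataTypes.TIMESTAMP(3)));

// Declarative schema with an event-time watermark on ts.
Schema schema = Schema.newBuilder()
    .column("user_id", DataTypes.BIGINT())
    .column("ts", DataTypes.TIMESTAMP(3))
    .watermark("ts", "ts - INTERVAL '5' SECOND")
    .build();
```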

### User-Defined Functions

Support for custom scalar, table, and aggregate functions.

**Key APIs:**

```java { .api }
// Function registration
void createFunction(String name, UserDefinedFunction function);
void createTemporaryFunction(String name, Class<? extends UserDefinedFunction> functionClass);

// Function base classes
abstract class ScalarFunction extends UserDefinedFunction;
abstract class TableFunction<T> extends UserDefinedFunction;
abstract class AggregateFunction<T, ACC> extends UserDefinedFunction;
```

[User-Defined Functions](./user-defined-functions.md)
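A minimal scalar function sketch, registered and then called from SQL against the `source_table` from Basic Usage:

```java
import org.apache.flink.table.functions.ScalarFunction;

// A simple deterministic scalar function; eval() defines the call signature.
public static class HashCode extends ScalarFunction {
    public int eval(String s) {
        return s == null ? 0 : s.hashCode();
    }
}

// Register under a temporary name, then call it like a built-in.
tEnv.createTemporaryFunction("HashCode", HashCode.class);
Table hashed = tEnv.sqlQuery("SELECT HashCode(behavior) FROM source_table");
```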

### Window Operations and Time Processing

Time-based and count-based windowing for streaming data analysis.

**Key APIs:**

```java { .api }
// Window definitions
static Tumble over(Expression size);
static Slide over(Expression size);
static Session withGap(Expression gap);

// Windowed operations
WindowGroupedTable window(GroupWindow window);
Table select(Expression... fields);
```

[Window Operations and Time Processing](./window-operations.md)
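A tumbling-window sketch over the `source_table` example, assuming `ts` is declared as the event-time attribute:

```java
import static org.apache.flink.table.api.Expressions.*;

import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.Tumble;

// Count events per user in 10-minute tumbling windows on ts.
Table windowed = sourceTable
    .window(Tumble.over(lit(10).minutes()).on($("ts")).as("w"))
    .groupBy($("w"), $("user_id"))
    .select(
        $("user_id"),
        $("w").start().as("window_start"),
        $("user_id").count().as("events"));
```

Note that the window alias `w` must appear in `groupBy` alongside the grouping keys.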

### Complex Event Processing (CEP)

Pattern matching and complex event detection on streaming data.

**Key APIs:**

```java { .api }
// Pattern definitions
static Pattern<T, T> begin(String name);
Pattern<T, F> next(String name);
Pattern<T, F> followedBy(String name);
Pattern<T, F> within(Time within);

// Pattern application
static <T> PatternStream<T> pattern(DataStream<T> input, Pattern<T, ?> pattern);
```

[Complex Event Processing](./complex-event-processing.md)
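A sketch detecting a "view" followed by a "buy" within ten minutes. `Event` is a hypothetical POJO with a `String behavior` field, and `events` a hypothetical `DataStream<Event>`:

```java
import org.apache.flink.cep.CEP;
import org.apache.flink.cep.PatternStream;
import org.apache.flink.cep.pattern.Pattern;
import org.apache.flink.cep.pattern.conditions.SimpleCondition;
import org.apache.flink.streaming.api.windowing.time.Time;

// Match a view event followed (not necessarily immediately) by a buy event.
Pattern<Event, ?> viewThenBuy = Pattern.<Event>begin("view")
    .where(new SimpleCondition<Event>() {
        @Override
        public boolean filter(Event e) { return "view".equals(e.behavior); }
    })
    .followedBy("buy")
    .where(new SimpleCondition<Event>() {
        @Override
        public boolean filter(Event e) { return "buy".equals(e.behavior); }
    })
    .within(Time.minutes(10));

// Apply the pattern to the stream to obtain matches.
PatternStream<Event> matches = CEP.pattern(events, viewThenBuy);
```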

### Catalog and Metadata Management

Multi-catalog support with database and table metadata management.

**Key APIs:**

```java { .api }
// Catalog operations
void registerCatalog(String catalogName, Catalog catalog);
void useCatalog(String catalogName);
void useDatabase(String databaseName);

// Metadata access
String[] listCatalogs();
String[] listDatabases();
String[] listTables();
```

[Catalog and Metadata Management](./catalog-management.md)
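A small sketch switching the session to the built-in defaults and listing what they contain; `listTables()` is scoped to the current catalog and database:

```java
// Point the session at a catalog and database (these are Flink's defaults).
tEnv.useCatalog("default_catalog");
tEnv.useDatabase("default_database");

// Enumerate tables visible in the current catalog/database.
for (String name : tEnv.listTables()) {
    System.out.println(name);
}
```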