
tessl/maven-org-apache-flink--flink-sql-jdbc-driver-bundle

A bundled JDBC driver for Apache Flink SQL that packages the JDBC driver implementation along with its dependencies into a single JAR file.

- Workspace: tessl
- Visibility: Public
- Describes: `mavenpkg:maven/org.apache.flink/flink-sql-jdbc-driver-bundle@2.1.x`

To install, run:

```
npx @tessl/cli install tessl/maven-org-apache-flink--flink-sql-jdbc-driver-bundle@2.1.0
```

# Apache Flink SQL JDBC Driver Bundle

A comprehensive JDBC driver bundle for Apache Flink SQL that enables Java applications to connect to the Flink SQL Gateway and execute SQL queries against Flink's distributed stream and batch processing engine. The bundle packages all necessary dependencies with the Maven Shade plugin, producing a standalone JAR that can be integrated into applications without dependency conflicts.

## Package Information

- **Package Name**: flink-sql-jdbc-driver-bundle
- **Language**: Java
- **Package Type**: Maven JAR bundle
- **Maven Coordinates**: `org.apache.flink:flink-sql-jdbc-driver-bundle:2.1.0`
- **Installation**: Include as a Maven dependency or add the JAR to the classpath

## Core Imports

Standard JDBC imports:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;
import java.sql.ResultSet;
import java.sql.SQLException;
```

Flink-specific imports (optional):

```java
import org.apache.flink.table.jdbc.FlinkDriver;
import org.apache.flink.table.jdbc.FlinkDataSource;
```

## Basic Usage

### Using DriverManager (Standard JDBC)

```java
import java.sql.*;

public class FlinkJdbcExample {
    public static void main(String[] args) throws SQLException {
        // Connection URL format: jdbc:flink://host:port[/catalog[/database]]
        String url = "jdbc:flink://localhost:8083";

        // Connect to Flink SQL Gateway
        Connection connection = DriverManager.getConnection(url);

        // Create and execute a query
        Statement statement = connection.createStatement();
        ResultSet results = statement.executeQuery("SELECT * FROM my_table LIMIT 10");

        // Process results
        while (results.next()) {
            System.out.println(results.getString(1));
        }

        // Clean up
        results.close();
        statement.close();
        connection.close();
    }
}
```

### Using DataSource

```java
import org.apache.flink.table.jdbc.FlinkDataSource;
import java.sql.*;
import java.util.Properties;

public class FlinkDataSourceExample {
    public static void main(String[] args) throws SQLException {
        Properties props = new Properties();
        props.setProperty("catalog", "my_catalog");

        FlinkDataSource dataSource = new FlinkDataSource(
                "jdbc:flink://localhost:8083",
                props
        );

        Connection connection = dataSource.getConnection();
        // Use connection as normal...
    }
}
```

## Architecture

The Flink JDBC driver follows standard JDBC architecture patterns while providing specific integration with the Flink SQL Gateway:

- **FlinkDriver**: Main JDBC driver implementation, auto-registered via META-INF/services
- **FlinkConnection**: Connection to Flink SQL Gateway (NOT thread-safe)
- **FlinkStatement**: Statement execution interface (NOT thread-safe)
- **FlinkResultSet**: Result iteration with comprehensive data type support
- **FlinkDatabaseMetaData**: Metadata access for catalogs, schemas, and database capabilities

**Important**: Connection and Statement implementations are explicitly NOT thread-safe. Use a separate connection for each thread.
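
The one-connection-per-thread rule can be illustrated with a plain-Java sketch. The `Integer` stand-in and counter below are illustrative only; in real code the `ThreadLocal` initializer would call `DriverManager.getConnection(url)`:

```java
import java.util.concurrent.atomic.AtomicInteger;

public class PerThreadSketch {
    static final AtomicInteger created = new AtomicInteger();

    // Each thread lazily receives its own instance; a real initializer
    // would be () -> DriverManager.getConnection(url).
    static final ThreadLocal<Integer> conn =
            ThreadLocal.withInitial(created::incrementAndGet);

    public static void main(String[] args) throws InterruptedException {
        Runnable task = () -> System.out.println(
                Thread.currentThread().getName() + " uses connection #" + conn.get());
        Thread t1 = new Thread(task, "worker-1");
        Thread t2 = new Thread(task, "worker-2");
        t1.start(); t2.start();
        t1.join(); t2.join();
        // Two threads touched conn, so two instances were created.
        System.out.println("connections created: " + created.get());
    }
}
```

Because the driver's Connection and Statement objects are not thread-safe, confining each connection to the thread that created it avoids the need for external locking.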

## Capabilities

### Driver Registration and Connection Management

Core JDBC driver functionality, including automatic driver registration, connection establishment to the Flink SQL Gateway, and connection management with catalog/schema support.

```java { .api }
// Automatic driver registration via META-INF/services
public class FlinkDriver implements Driver {
    public Connection connect(String url, Properties driverProperties) throws SQLException;
    public boolean acceptsURL(String url) throws SQLException;
    public int getMajorVersion();
    public int getMinorVersion();
    public boolean jdbcCompliant(); // Returns false
}

public class FlinkDataSource implements DataSource {
    public FlinkDataSource(String url, Properties properties);
    public Connection getConnection() throws SQLException;
}
```

[Driver Registration and Connection Management](./connection-management.md)
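
The META-INF/services mechanism can be observed directly: `DriverManager` discovers drivers declared in `META-INF/services/java.sql.Driver` via `ServiceLoader`. A small sketch (with the bundle JAR on the classpath, `FlinkDriver` would appear in the listing; without it, the loop simply finds whatever drivers are present):

```java
import java.sql.Driver;
import java.util.ServiceLoader;

public class DriverDiscoverySketch {
    // Lists every JDBC driver discoverable on the classpath and
    // returns how many were found.
    static int listDrivers() {
        int count = 0;
        for (Driver d : ServiceLoader.load(Driver.class)) {
            System.out.println(d.getClass().getName()
                    + " " + d.getMajorVersion() + "." + d.getMinorVersion());
            count++;
        }
        return count;
    }

    public static void main(String[] args) {
        System.out.println("drivers discovered: " + listDrivers());
    }
}
```

This is why no explicit `Class.forName("org.apache.flink.table.jdbc.FlinkDriver")` call is required before `DriverManager.getConnection`.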

### SQL Statement Execution

Statement execution capabilities supporting both DDL and DML operations, with special handling for INSERT statements, which return job IDs as result sets.

```java { .api }
public class FlinkStatement extends BaseStatement {
    public ResultSet executeQuery(String sql) throws SQLException;
    public boolean execute(String sql) throws SQLException;
    public ResultSet getResultSet() throws SQLException;
    public int getUpdateCount() throws SQLException;
    public Connection getConnection() throws SQLException;
    public void close() throws SQLException;
    public void cancel() throws SQLException;
}
```

[SQL Statement Execution](./statement-execution.md)

### Result Set Processing

Comprehensive result set processing with support for all Java primitive types, temporal data, decimal precision, and complex data structures including Maps.

```java { .api }
public class FlinkResultSet extends BaseResultSet {
    // Navigation
    public boolean next() throws SQLException;
    public boolean wasNull() throws SQLException;

    // Data retrieval by index
    public String getString(int columnIndex) throws SQLException;
    public boolean getBoolean(int columnIndex) throws SQLException;
    public int getInt(int columnIndex) throws SQLException;
    public long getLong(int columnIndex) throws SQLException;
    public double getDouble(int columnIndex) throws SQLException;
    public BigDecimal getBigDecimal(int columnIndex) throws SQLException;
    public Date getDate(int columnIndex) throws SQLException;
    public Time getTime(int columnIndex) throws SQLException;
    public Timestamp getTimestamp(int columnIndex) throws SQLException;
    public Object getObject(int columnIndex) throws SQLException;

    // Data retrieval by label
    public String getString(String columnLabel) throws SQLException;
    // ... all types also available by column label

    public ResultSetMetaData getMetaData() throws SQLException;
    public int findColumn(String columnLabel) throws SQLException;
}
```

[Result Set Processing](./result-set-processing.md)

### Database Metadata Access

Database metadata functionality for discovering available catalogs, schemas, and database capabilities, with specific Flink SQL Gateway integration.

```java { .api }
public class FlinkDatabaseMetaData extends BaseDatabaseMetaData {
    public ResultSet getCatalogs() throws SQLException;
    public ResultSet getSchemas() throws SQLException;
    public String getDatabaseProductName() throws SQLException; // "Apache Flink"
    public String getDatabaseProductVersion() throws SQLException;
    public String getDriverName() throws SQLException;
    public String getDriverVersion() throws SQLException;
    public boolean isReadOnly() throws SQLException; // true
    public String getIdentifierQuoteString() throws SQLException; // "`"
}
```

[Database Metadata Access](./database-metadata.md)
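
The backtick quote string matters when building SQL with identifiers that contain spaces or reserved words. A minimal plain-Java sketch of how a client would use the value returned by `getIdentifierQuoteString()` (the `quote` helper is hypothetical, not part of the driver):

```java
public class QuoteIdentifierSketch {
    // Wraps an identifier in the quote string, doubling any embedded
    // quote characters, as SQL identifier quoting conventionally requires.
    static String quote(String identifier, String q) {
        return q + identifier.replace(q, q + q) + q;
    }

    public static void main(String[] args) {
        // Flink reports "`" as its identifier quote string.
        System.out.println(quote("my table", "`"));   // `my table`
        System.out.println(quote("weird`name", "`")); // `weird``name`
    }
}
```

Reading the quote string from metadata, rather than hard-coding backticks, keeps client code portable across JDBC drivers.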

## Types

### Connection URI Format

```java { .api }
// URL Format: jdbc:flink://host:port[/catalog[/database]][?param=value&...]
public class DriverUri {
    public static DriverUri create(String url, Properties properties) throws SQLException;
    public static boolean acceptsURL(String url);
    public InetSocketAddress getAddress();
    public Optional<String> getCatalog();
    public Optional<String> getDatabase();
    public Properties getProperties();
}
```
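
The documented URL format can be pulled apart with plain `java.net.URI` once the `jdbc:` prefix is stripped. This is a hypothetical sketch for illustration; the driver's real `DriverUri` parsing may differ in detail:

```java
import java.net.URI;

public class UrlFormatSketch {
    // Parses jdbc:flink://host:port[/catalog[/database]] and returns
    // "host:port catalog database", with "-" for missing parts.
    static String parse(String jdbcUrl) {
        URI uri = URI.create(jdbcUrl.substring("jdbc:".length()));
        String path = uri.getPath() == null ? "" : uri.getPath();
        String[] parts = path.isEmpty() ? new String[0] : path.substring(1).split("/");
        String catalog = parts.length > 0 ? parts[0] : "-";
        String database = parts.length > 1 ? parts[1] : "-";
        return uri.getHost() + ":" + uri.getPort() + " " + catalog + " " + database;
    }

    public static void main(String[] args) {
        System.out.println(parse("jdbc:flink://localhost:8083"));
        System.out.println(parse("jdbc:flink://localhost:8083/my_catalog/my_db"));
    }
}
```

Both catalog and database are optional in the URL, which is why `DriverUri` exposes them as `Optional<String>`.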

### Result Set Metadata

```java { .api }
public class FlinkResultSetMetaData implements ResultSetMetaData {
    public int getColumnCount() throws SQLException;
    public String getColumnName(int column) throws SQLException;
    public String getColumnLabel(int column) throws SQLException;
    public int getColumnType(int column) throws SQLException;
    public String getColumnTypeName(int column) throws SQLException;
    public int getPrecision(int column) throws SQLException;
    public int getScale(int column) throws SQLException;
    public int isNullable(int column) throws SQLException;
    public String getColumnClassName(int column) throws SQLException;
}
```

### Column Information

```java { .api }
public class ColumnInfo {
    public static ColumnInfo fromLogicalType(String columnName, LogicalType type);
    public int getColumnType();
    public boolean isSigned();
    public int getPrecision();
    public int getScale();
    public int getColumnDisplaySize();
    public String getColumnName();
    public boolean isNullable();
    public String columnTypeName();
}
```

## Supported Data Types

- **Primitive Types**: boolean, byte, short, int, long, float, double
- **Text**: String, binary data (byte[])
- **Numeric**: BigDecimal with precision/scale support
- **Temporal**: Date, Time, Timestamp
- **Complex**: Maps (converted to Java Map objects)
- **Generic**: Object (for any type)
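
Decimal precision and scale carry through to the `java.math.BigDecimal` values returned by `getBigDecimal`. A small standalone sketch of what that means in practice (the literal value stands in for a column read):

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

public class DecimalSketch {
    // Summarizes the precision (total significant digits) and scale
    // (digits after the decimal point) of a BigDecimal.
    static String describe(BigDecimal value) {
        return "precision=" + value.precision() + " scale=" + value.scale();
    }

    public static void main(String[] args) {
        // Stand-in for a value read from a DECIMAL(10, 2) column.
        BigDecimal price = new BigDecimal("12345.67");
        System.out.println(describe(price)); // precision=7 scale=2

        // Rescale before comparing values of differing scale.
        BigDecimal rescaled = price.setScale(4, RoundingMode.HALF_UP);
        System.out.println(rescaled); // 12345.6700
    }
}
```

Constructing `BigDecimal` from a `String` (as drivers do internally) preserves exact precision, unlike constructing from a `double`.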

## Limitations

- **Thread Safety**: Connection and Statement implementations are NOT thread-safe
- **JDBC Compliance**: Not fully JDBC compliant (`jdbcCompliant()` returns false)
- **Query Support**: Batch-mode queries only; streaming queries are not supported via JDBC
- **Unsupported Features**: Prepared statements, callable statements, transactions, savepoints, result set updates, stored procedures, generated keys, Array data type retrieval

## Error Handling

The driver throws standard JDBC exceptions:

- `SQLException` for general database errors
- `SQLFeatureNotSupportedException` for unsupported JDBC features
- `SQLClientInfoException` for client info operations
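
Because `SQLFeatureNotSupportedException` extends `SQLException`, clients that want to distinguish an unsupported feature from a genuine error must catch the more specific type first. A self-contained sketch (the `unsupportedOperation` helper is hypothetical, standing in for a driver call such as `prepareStatement`):

```java
import java.sql.SQLException;
import java.sql.SQLFeatureNotSupportedException;

public class ErrorHandlingSketch {
    // Stand-in for an unsupported driver operation.
    static void unsupportedOperation() throws SQLException {
        throw new SQLFeatureNotSupportedException("prepared statements are not supported");
    }

    public static void main(String[] args) {
        try {
            unsupportedOperation();
        } catch (SQLFeatureNotSupportedException e) {
            // Most specific first: an unsupported JDBC feature, not a failure.
            System.out.println("feature not supported: " + e.getMessage());
        } catch (SQLException e) {
            // Anything else is a genuine database error.
            System.out.println("general SQL error: " + e.getMessage());
        }
    }
}
```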