tessl/maven-org-apache-flink--flink-sql-parser-hive

SQL parser component for Apache Flink that provides Hive dialect support for parsing Hive-specific DDL and DML statements.

Workspace: tessl · Visibility: Public
Describes: `mavenpkg:maven/org.apache.flink/flink-sql-parser-hive@1.17.x`

To install, run:

```
npx @tessl/cli install tessl/maven-org-apache-flink--flink-sql-parser-hive@1.17.0
```

# Flink SQL Parser Hive

Flink SQL Parser Hive is a specialized SQL parser component for Apache Flink that extends the core SQL parser to support Hive dialect syntax. It provides comprehensive parsing capabilities for Hive-specific DDL (Data Definition Language) and DML (Data Manipulation Language) statements, enabling seamless integration with Hive metastores and compatibility with existing Hive data warehouses within Flink's distributed stream and batch processing engine.

## Package Information

- **Package Name**: flink-sql-parser-hive
- **Package Type**: maven
- **Language**: Java
- **Installation**: Include as a Maven dependency with coordinates `org.apache.flink:flink-sql-parser-hive:1.17.2`

## Core Imports

```java
import org.apache.flink.sql.parser.hive.impl.FlinkHiveSqlParserImpl;
import org.apache.flink.sql.parser.hive.ddl.*;
import org.apache.flink.sql.parser.hive.dml.*;
import org.apache.flink.sql.parser.hive.type.*;
```

## Basic Usage

```java
import org.apache.calcite.avatica.util.Casing;
import org.apache.calcite.avatica.util.Quoting;
import org.apache.calcite.sql.SqlNode;
import org.apache.calcite.sql.parser.SqlParser;
import org.apache.flink.sql.parser.hive.impl.FlinkHiveSqlParserImpl;

// Example: create a Hive table with partitions
String createTableSql = """
    CREATE TABLE IF NOT EXISTS sales_data (
      id BIGINT,
      customer_name STRING,
      amount DECIMAL(10,2)
    )
    PARTITIONED BY (year INT, month INT)
    STORED AS PARQUET
    LOCATION '/data/sales'
    TBLPROPERTIES ('transactional'='true')
    """;

// Create a Hive SQL parser for the statement
SqlParser parser = SqlParser.create(createTableSql,
        SqlParser.config()
                .withParserFactory(FlinkHiveSqlParserImpl.FACTORY)
                .withQuoting(Quoting.DOUBLE_QUOTE)
                .withUnquotedCasing(Casing.TO_UPPER)
                .withQuotedCasing(Casing.UNCHANGED));

// Parse the Hive SQL statement
SqlNode sqlNode = parser.parseStmt();
```

## Architecture

The Flink SQL Parser Hive is built around several key components:

- **Generated Parser**: `FlinkHiveSqlParserImpl` provides the main parsing entry point using JavaCC and FMPP code generation
- **DDL System**: Comprehensive support for Hive DDL operations including databases, tables, views, and partitions
- **DML Extensions**: Enhanced INSERT statements with static and dynamic partition support
- **Type System**: Extended type specifications for Hive-specific data types like STRUCT with field comments
- **Constraint Handling**: Three-dimensional constraint system (ENABLE/DISABLE, VALIDATE/NOVALIDATE, RELY/NORELY)
- **Property Management**: Reserved property validation and escaping for SQL client compatibility

## Capabilities

### Parser Integration

Core parser factory and integration point for creating Hive SQL parsers within the Apache Calcite framework.

```java { .api }
/**
 * Main parser factory for creating Hive SQL parser instances
 */
public class FlinkHiveSqlParserImpl {
    public static final SqlParserImplFactory FACTORY;
}
```

[Parser Integration](./parser-integration.md)

### Database Operations

Complete database lifecycle management including creation, alteration, and property management for Hive databases.

```java { .api }
/**
 * CREATE DATABASE statement for Hive dialect
 */
public class SqlCreateHiveDatabase extends SqlCreateDatabase {
    public SqlCreateHiveDatabase(SqlParserPos pos, SqlIdentifier databaseName,
            SqlNodeList propertyList, SqlCharStringLiteral comment,
            SqlCharStringLiteral location, boolean ifNotExists) throws ParseException;
}

/**
 * Base class for ALTER DATABASE operations
 */
public abstract class SqlAlterHiveDatabase extends SqlAlterDatabase {
    public enum AlterHiveDatabaseOp { CHANGE_PROPS, CHANGE_LOCATION, CHANGE_OWNER }
}
```

[Database Operations](./database-operations.md)
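As a concrete illustration of the statements these classes model, the sketch below holds Hive-dialect DDL strings exercising the database features above: a CREATE DATABASE with comment, location, and properties, and an owner change corresponding to `AlterHiveDatabaseOp.CHANGE_OWNER`. The database name, path, and property values are made up for the example.

```java
public class DatabaseDdlExample {
    // CREATE DATABASE with the optional clauses SqlCreateHiveDatabase captures
    static String createDatabaseDdl() {
        return """
            CREATE DATABASE IF NOT EXISTS analytics
            COMMENT 'analytics warehouse'
            LOCATION '/warehouse/analytics'
            WITH DBPROPERTIES ('creator'='etl')
            """;
    }

    // Owner change, one of the three AlterHiveDatabaseOp variants
    static String alterOwnerDdl() {
        return "ALTER DATABASE analytics SET OWNER USER etl_admin";
    }

    public static void main(String[] args) {
        System.out.println(createDatabaseDdl());
        System.out.println(alterOwnerDdl());
    }
}
```

Either string could be fed to the parser shown in Basic Usage in place of `createTableSql`.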

### Table Operations

Comprehensive table management including creation with Hive-specific features, alteration, and column management.

```java { .api }
/**
 * CREATE TABLE statement for Hive dialect with full Hive table features
 */
public class SqlCreateHiveTable extends SqlCreateTable {
    public SqlCreateHiveTable(SqlParserPos pos, SqlIdentifier tableName, SqlNodeList columnList,
            HiveTableCreationContext creationContext, SqlNodeList propertyList,
            SqlNodeList partColList, SqlCharStringLiteral comment, boolean isTemporary,
            boolean isExternal, HiveTableRowFormat rowFormat,
            HiveTableStoredAs storedAs, SqlCharStringLiteral location,
            boolean ifNotExists) throws ParseException;
}

/**
 * ROW FORMAT specification for Hive tables
 */
public static class HiveTableRowFormat {
    public static HiveTableRowFormat withDelimited(...) throws ParseException;
    public static HiveTableRowFormat withSerDe(...) throws ParseException;
}

/**
 * STORED AS specification for Hive tables
 */
public static class HiveTableStoredAs {
    public static HiveTableStoredAs ofFileFormat(...) throws ParseException;
    public static HiveTableStoredAs ofInputOutputFormat(...) throws ParseException;
}
```

[Table Operations](./table-operations.md)
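The sketch below shows example DDL exercising these features: EXTERNAL maps to `isExternal`, the ROW FORMAT DELIMITED clause to `HiveTableRowFormat.withDelimited`, and STORED AS to `HiveTableStoredAs.ofFileFormat`. The table name and paths are hypothetical.

```java
public class CreateTableExample {
    // An external table with a delimited row format and explicit file format
    static String externalTableDdl() {
        return """
            CREATE EXTERNAL TABLE logs (
              ts BIGINT,
              line STRING
            )
            ROW FORMAT DELIMITED FIELDS TERMINATED BY '\\t'
            STORED AS TEXTFILE
            LOCATION '/data/logs'
            """;
    }

    public static void main(String[] args) {
        System.out.println(externalTableDdl());
    }
}
```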

### Partition Management

Partition operations for adding, renaming, and managing Hive table partitions.

```java { .api }
/**
 * ADD PARTITION statement for Hive tables
 */
public class SqlAddHivePartitions extends SqlCall {
    public SqlAddHivePartitions(SqlParserPos pos, SqlIdentifier tableName, boolean ifNotExists,
            List<SqlNodeList> partSpecs, List<SqlCharStringLiteral> partLocations);
}

/**
 * PARTITION RENAME statement for Hive tables
 */
public class SqlAlterHivePartitionRename extends SqlAlterHiveTable {
    public SqlNodeList getNewPartSpec();
}
```

[Partition Management](./partition-management.md)
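To make the shape of an ADD PARTITION statement concrete, here is a hypothetical helper that renders a partition spec; each spec can carry its own optional LOCATION, mirroring the paired `partSpecs` and `partLocations` lists in `SqlAddHivePartitions`. The helper and table name are illustrative, not part of the library.

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.stream.Collectors;

public class AddPartitionSketch {
    // Render an ordered partition spec as a PARTITION (k=v, ...) clause
    static String partSpec(Map<String, String> spec) {
        return spec.entrySet().stream()
                .map(e -> e.getKey() + "=" + e.getValue())
                .collect(Collectors.joining(", ", "PARTITION (", ")"));
    }

    public static void main(String[] args) {
        Map<String, String> spec = new LinkedHashMap<>();
        spec.put("year", "2024");
        spec.put("month", "7");
        String ddl = "ALTER TABLE sales_data ADD IF NOT EXISTS "
                + partSpec(spec) + " LOCATION '/data/sales/2024/07'";
        System.out.println(ddl);
    }
}
```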

### View Operations

View creation and management with Hive-specific properties and syntax.

```java { .api }
/**
 * CREATE VIEW statement for Hive dialect
 */
public class SqlCreateHiveView extends SqlCreateView {
    public SqlCreateHiveView(SqlParserPos pos, SqlIdentifier viewName, SqlNodeList fieldList,
            SqlNode query, boolean ifNotExists, SqlCharStringLiteral comment,
            SqlNodeList properties);
}
```

[View Operations](./view-operations.md)

### Data Manipulation

Enhanced INSERT statements with comprehensive partition support for both static and dynamic partitioning.

```java { .api }
/**
 * Enhanced INSERT statement for Hive tables with partition support
 */
public class RichSqlHiveInsert extends RichSqlInsert {
    public RichSqlHiveInsert(SqlParserPos pos, SqlNodeList keywords, SqlNodeList extendedKeywords,
            SqlNode targetTable, SqlNode source, SqlNodeList columnList,
            SqlNodeList staticPartitions, SqlNodeList allPartKeys);
}
```

[Data Manipulation](./data-manipulation.md)
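The static/dynamic distinction that `RichSqlHiveInsert` models (`staticPartitions` versus `allPartKeys`) can be illustrated with a hypothetical string-building helper: statically specified partition columns carry values in the PARTITION clause, while dynamic ones are listed without values and resolved from the SELECT output. The helper, table, and column names are made up for the example.

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.stream.Collectors;

public class HiveInsertSketch {
    // Null value => dynamic partition column (name only); non-null => static
    static String insertWithPartitions(String table, Map<String, String> partSpec, String select) {
        String spec = partSpec.entrySet().stream()
                .map(e -> e.getValue() == null ? e.getKey() : e.getKey() + "=" + e.getValue())
                .collect(Collectors.joining(", "));
        return "INSERT INTO " + table + " PARTITION (" + spec + ") " + select;
    }

    public static void main(String[] args) {
        Map<String, String> parts = new LinkedHashMap<>();
        parts.put("year", "2024");   // static partition: value given
        parts.put("month", null);    // dynamic partition: resolved at runtime
        System.out.println(insertWithPartitions("sales_data", parts,
                "SELECT id, customer_name, amount, month FROM staging"));
    }
}
```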

### Constraint System

Three-dimensional constraint system supporting ENABLE/DISABLE, VALIDATE/NOVALIDATE, and RELY/NORELY traits.

```java { .api }
/**
 * Complete constraint trait specification
 */
public class SqlHiveConstraintTrait {
    public SqlHiveConstraintTrait(SqlLiteral enable, SqlLiteral validate, SqlLiteral rely);
    public boolean isEnable();
    public boolean isValidate();
    public boolean isRely();
}

/**
 * Constraint enable/disable options
 */
public enum SqlHiveConstraintEnable { ENABLE, DISABLE }

/**
 * Constraint validate/no-validate options
 */
public enum SqlHiveConstraintValidate { VALIDATE, NOVALIDATE }

/**
 * Constraint rely/no-rely options
 */
public enum SqlHiveConstraintRely { RELY, NORELY }
```

[Constraint System](./constraint-system.md)
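Because each trait is an independent boolean dimension, a full trait fits naturally in three bits. The sketch below shows one such packing in the spirit of `HiveDDLUtils.encodeConstraintTrait`; the actual bit layout Flink uses is an internal detail, so treat this encoding as a hypothetical illustration rather than the library's.

```java
public class ConstraintTraitSketch {
    // One bit per constraint dimension (hypothetical layout)
    static final byte ENABLE = 1 << 0;
    static final byte VALIDATE = 1 << 1;
    static final byte RELY = 1 << 2;

    static byte encode(boolean enable, boolean validate, boolean rely) {
        byte b = 0;
        if (enable)   b |= ENABLE;
        if (validate) b |= VALIDATE;
        if (rely)     b |= RELY;
        return b;
    }

    public static void main(String[] args) {
        // ENABLE NOVALIDATE RELY -> bits 0 and 2 set
        byte trait = encode(true, false, true);
        System.out.println(trait); // prints 5
    }
}
```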

### Type System Extensions

Extended type system supporting Hive-specific data types including enhanced STRUCT types with field comments.

```java { .api }
/**
 * STRUCT type specification with field names, types, and comments
 */
public class ExtendedHiveStructTypeNameSpec extends ExtendedSqlRowTypeNameSpec {
    public ExtendedHiveStructTypeNameSpec(SqlParserPos pos, List<SqlIdentifier> fieldNames,
            List<SqlDataTypeSpec> fieldTypes,
            List<SqlCharStringLiteral> comments) throws ParseException;
}
```

[Type System Extensions](./type-system.md)
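Hive's STRUCT syntax allows a COMMENT per field, which is what the `comments` list above captures. The sketch below holds an example column declaration using that syntax; the table and field names are invented for illustration.

```java
public class StructDdlExample {
    // A STRUCT column type with a per-field COMMENT, as Hive's DDL allows
    static String structColumn() {
        return "STRUCT<street : STRING COMMENT 'street name', zip : STRING COMMENT 'postal code'>";
    }

    public static void main(String[] args) {
        String ddl = "CREATE TABLE users (id BIGINT, address " + structColumn() + ")";
        System.out.println(ddl);
    }
}
```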

### Utilities and Helpers

Utility classes for property validation, data type conversion, and constraint trait management.

```java { .api }
/**
 * Utility methods for Hive DDL operations
 */
public class HiveDDLUtils {
    public static final String COL_DELIMITER;

    public static SqlNodeList checkReservedTableProperties(SqlNodeList props) throws ParseException;
    public static void convertDataTypes(SqlNodeList columns) throws ParseException;
    public static byte encodeConstraintTrait(SqlHiveConstraintTrait trait);
    public static SqlCharStringLiteral unescapeStringLiteral(SqlCharStringLiteral literal);
}
```

[Utilities and Helpers](./utilities.md)
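The idea behind reserved-property validation can be sketched in plain Java: reject user-supplied table properties whose keys the connector manages itself. This is an illustration in the spirit of `checkReservedTableProperties`; the reserved key set below is hypothetical, not Flink's actual list.

```java
import java.util.Map;
import java.util.Set;

public class ReservedPropsSketch {
    // Hypothetical reserved keys; the real list lives inside HiveDDLUtils
    static final Set<String> RESERVED = Set.of("connector", "is_generic");

    static void checkProperties(Map<String, String> props) {
        for (String key : props.keySet()) {
            if (RESERVED.contains(key)) {
                throw new IllegalArgumentException("Reserved table property: " + key);
            }
        }
    }

    public static void main(String[] args) {
        checkProperties(Map.of("transactional", "true")); // passes silently
        try {
            checkProperties(Map.of("connector", "hive"));
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage());
        }
    }
}
```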

## Exception Handling

Statement constructors and parser internals throw `ParseException` when SQL syntax is invalid or validation fails; when parsing through Calcite's `SqlParser`, these errors surface as `SqlParseException`:

```java
try {
    SqlNode node = parser.parseStmt();
} catch (SqlParseException e) {
    // Handle parsing errors
    System.err.println("SQL parsing failed: " + e.getMessage());
}
```

## Integration with Flink

This parser integrates with Apache Flink's table ecosystem:

1. **SQL Gateway**: Use with Flink SQL Gateway for Hive compatibility
2. **Table API**: Parse Hive DDL statements in Table API programs
3. **Catalog Integration**: Works with HiveCatalog for metadata management
4. **Streaming/Batch**: Supports both streaming and batch processing scenarios