# Server Management

Core server lifecycle management and initialization for the Spark Hive Thrift Server with Spark SQL integration.

## Capabilities

### HiveThriftServer2 Object

Main entry point for starting and managing the Spark Hive Thrift Server.

```scala { .api }
/**
 * The main entry point for the Spark SQL port of HiveServer2
 */
object HiveThriftServer2 {
  /**
   * Starts a new thrift server with the given SQL context
   * @param sqlContext The Spark SQL context to use for query execution
   * @return HiveThriftServer2 instance representing the running server
   */
  def startWithContext(sqlContext: SQLContext): HiveThriftServer2

  /**
   * Command-line entry point for the thrift server
   * @param args Command line arguments including Hive configuration options
   */
  def main(args: Array[String]): Unit

  /**
   * Execution state enumeration for tracking operation states
   * Note: This is private[thriftserver] - not part of the public API
   */
  private[thriftserver] object ExecutionState extends Enumeration {
    type ExecutionState = Value
    val STARTED, COMPILED, CANCELED, TIMEDOUT, FAILED, FINISHED, CLOSED = Value
  }
}
```
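The `ExecutionState` values above track an operation's lifecycle for the server's UI and listeners. As a rough illustration, the following plain-Scala sketch mirrors the enumeration and adds a hypothetical `isTerminal` helper (not part of Spark) to show which states an operation can end in:

```scala
// Mirrors the private[thriftserver] enumeration shown above (illustrative copy).
object ExecutionState extends Enumeration {
  val STARTED, COMPILED, CANCELED, TIMEDOUT, FAILED, FINISHED, CLOSED = Value
}

// Hypothetical helper, not part of Spark: states with no further transitions.
def isTerminal(s: ExecutionState.Value): Boolean =
  Set(ExecutionState.CANCELED, ExecutionState.TIMEDOUT,
      ExecutionState.FAILED, ExecutionState.CLOSED).contains(s)

// A successful statement typically moves STARTED -> COMPILED -> FINISHED -> CLOSED.
val happyPath = Seq(ExecutionState.STARTED, ExecutionState.COMPILED,
                    ExecutionState.FINISHED, ExecutionState.CLOSED)
```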

**Usage Examples:**

```scala
import org.apache.spark.sql.hive.thriftserver.{HiveThriftServer2, SparkSQLEnv}

// Initialize the SparkSQL environment first
SparkSQLEnv.init()

// Start the server with the initialized SQL context
val server = HiveThriftServer2.startWithContext(SparkSQLEnv.sqlContext)

// The server is now running and accepting connections
// It automatically sets up UI tabs and listeners
```

```bash
# Start from the command line with a custom port
spark-submit --class org.apache.spark.sql.hive.thriftserver.HiveThriftServer2 \
  --conf spark.sql.hive.thriftServer.singleSession=true \
  spark-hive-thriftserver_2.12-3.5.6.jar \
  --hiveconf hive.server2.thrift.port=10001
```

### HiveThriftServer2 Class

The server instance class that extends HiveServer2 with Spark SQL capabilities.

```scala { .api }
/**
 * The server instance that provides HiveServer2 compatibility
 * Note: This class is private[hive] - not directly instantiated by users
 */
private[hive] class HiveThriftServer2(sqlContext: SQLContext) extends HiveServer2 {
  /**
   * Initialize the server with Hive configuration
   * @param hiveConf Hive configuration object
   */
  def init(hiveConf: HiveConf): Unit

  /**
   * Start the server services
   */
  def start(): Unit

  /**
   * Stop the server services
   */
  def stop(): Unit
}
```
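The class follows Hive's usual service lifecycle: `init` with a configuration, then `start`, and eventually `stop`. A plain-Scala sketch of that ordering contract (the trait and class below are illustrative stand-ins, not the actual HiveServer2 types):

```scala
// Illustrative lifecycle contract in the style of Hive's service classes.
trait ServiceLifecycle {
  def init(conf: Map[String, String]): Unit
  def start(): Unit
  def stop(): Unit
}

// Records the order in which lifecycle methods are invoked.
class RecordingService extends ServiceLifecycle {
  val events = scala.collection.mutable.Buffer[String]()
  def init(conf: Map[String, String]): Unit = events += s"init(${conf.size} settings)"
  def start(): Unit = events += "start"
  def stop(): Unit = events += "stop"
}

val svc = new RecordingService
svc.init(Map("hive.server2.thrift.port" -> "10000"))
svc.start()
svc.stop()
```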

### SparkSQLEnv Object

Singleton managing the global SparkContext and SQLContext lifecycle for the thrift server.

```scala { .api }
/**
 * Singleton environment manager for SparkContext and SQLContext
 */
object SparkSQLEnv {
  /**
   * Initialize the Spark SQL environment
   * Creates SparkContext and SQLContext if not already initialized
   */
  def init(): Unit

  /**
   * Stop the Spark SQL environment
   * @param exitCode Exit code for the application
   */
  def stop(exitCode: Int): Unit

  /**
   * Global SQL context instance
   */
  def sqlContext: SQLContext

  /**
   * Global Spark context instance
   */
  def sparkContext: SparkContext
}
```

**Usage Examples:**

```scala
import org.apache.spark.sql.hive.thriftserver.SparkSQLEnv

// Initialize the environment (creates SparkContext and SQLContext)
SparkSQLEnv.init()

// Access the global contexts
val sqlCtx = SparkSQLEnv.sqlContext
val sparkCtx = SparkSQLEnv.sparkContext

// Check that the environment is initialized
if (SparkSQLEnv.sparkContext != null) {
  println(s"Spark master: ${SparkSQLEnv.sparkContext.master}")
  println(s"Application ID: ${SparkSQLEnv.sparkContext.applicationId}")
}

// Clean shutdown
SparkSQLEnv.stop(0)
```

### Server Configuration

The server supports various configuration options through Hive configuration and Spark configuration.

```scala { .api }
// Common configuration options (set via --hiveconf or spark.conf)

// Transport mode
hive.server2.transport.mode = "binary" | "http"

// Port configuration
hive.server2.thrift.port = 10000       // Binary mode port
hive.server2.thrift.http.port = 10001  // HTTP mode port

// Authentication
hive.server2.authentication = "NONE" | "KERBEROS" | "CUSTOM"
hive.server2.authentication.kerberos.principal = "principal@REALM"
hive.server2.authentication.kerberos.keytab = "/path/to/keytab"

// Session management
hive.server2.session.check.interval = "6h"
hive.server2.idle.session.timeout = "7d"
```
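These options are usually supplied as `--hiveconf key=value` pairs on the command line. A minimal, self-contained sketch of collecting such pairs into a map (this parser is illustrative only; Spark's real argument handling is more involved):

```scala
// Illustrative parser for "--hiveconf key=value" style arguments.
// Not Spark's actual CLI handling; it only shows the idea.
def parseHiveConf(args: Array[String]): Map[String, String] =
  args.sliding(2).collect {
    case Array("--hiveconf", kv) if kv.contains('=') =>
      val Array(k, v) = kv.split("=", 2)
      k -> v
  }.toMap

val conf = parseHiveConf(Array(
  "--hiveconf", "hive.server2.thrift.port=10001",
  "--hiveconf", "hive.server2.transport.mode=http"))
```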

### Error Handling

Server management includes comprehensive error handling for startup and lifecycle management.

```scala { .api }
// Factory methods for the exceptions thrown during server management
object HiveThriftServerErrors {
  def taskExecutionRejectedError(rejected: RejectedExecutionException): Throwable
  def runningQueryError(e: Throwable, format: ErrorMessageFormat.Value): Throwable
  def failedToOpenNewSessionError(e: Throwable): Throwable
  def cannotLoginToKerberosError(e: Exception): Throwable
  def cannotLoginToSpnegoError(principal: String, keyTabFile: String, e: IOException): Throwable
}
```

**Common Error Scenarios:**

```scala
// Server startup with error handling
try {
  SparkSQLEnv.init()
  val server = HiveThriftServer2.startWithContext(SparkSQLEnv.sqlContext)

  // Check whether the SparkContext was stopped during startup
  if (SparkSQLEnv.sparkContext.isStopped) {
    throw new RuntimeException("SparkContext stopped during server startup")
  }
} catch {
  case e: Exception =>
    println(s"Error starting HiveThriftServer2: ${e.getMessage}")
    SparkSQLEnv.stop(-1)
    System.exit(-1)
}
```
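The `HiveThriftServerErrors` helpers translate low-level failures into user-facing errors. The same pattern can be sketched in plain Scala (the function name and messages below are illustrative, not Spark's actual ones):

```scala
import java.util.concurrent.RejectedExecutionException

// Illustrative error translation in the style of HiveThriftServerErrors.
// The real exception types and messages in Spark differ.
def describeStartupFailure(e: Throwable): String = e match {
  case _: RejectedExecutionException =>
    "Task execution rejected: the server's thread pool is saturated or shut down"
  case _: SecurityException =>
    "Authentication failure: check Kerberos principal and keytab settings"
  case other =>
    s"Failed to start HiveThriftServer2: ${other.getMessage}"
}
```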

### Shutdown Hooks

The server automatically registers shutdown hooks for clean resource cleanup.

```scala { .api }
// Automatic shutdown hook registration
ShutdownHookManager.addShutdownHook { () =>
  SparkSQLEnv.stop(0)
  // UI tabs are automatically detached
  // Server services are stopped
}
```
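In application code, a similar cleanup hook can be registered with Scala's standard `sys.addShutdownHook` (a sketch; the thrift server registers its own hook through Spark's ShutdownHookManager automatically):

```scala
// Plain-Scala analogue of the automatic hook above. In a real application
// the body would call SparkSQLEnv.stop(0); here it just prints.
val hook = sys.addShutdownHook {
  println("Shutting down Spark SQL environment")
}

// The returned handle can deregister the hook if shutdown is handled elsewhere.
val removed = hook.remove()
```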