# Server Management

Core server lifecycle management including startup, configuration, and shutdown operations for the Thrift server.

## Capabilities

### HiveThriftServer2 Object

Main entry point singleton object for the Spark SQL port of HiveServer2.
```scala { .api }
/**
 * The main entry point for the Spark SQL port of HiveServer2.
 * Starts up a SparkSQLContext and a HiveThriftServer2 thrift server.
 */
object HiveThriftServer2 extends Logging {
  /** Optional UI tab for web interface monitoring */
  var uiTab: Option[ThriftServerTab]

  /** Event listener for server events and monitoring */
  var listener: HiveThriftServer2Listener

  /**
   * Starts a new thrift server with the given context.
   * @param sqlContext The SQL context to use for the server
   */
  def startWithContext(sqlContext: SQLContext): Unit

  /**
   * Main method for standalone server execution.
   * @param args Command-line arguments for server configuration
   */
  def main(args: Array[String]): Unit
}
```

**Usage Example:**

```scala
import org.apache.spark.sql.hive.thriftserver.{HiveThriftServer2, SparkSQLEnv}

// Initialize the environment first
SparkSQLEnv.init()

// Start the server with the current SQL context
HiveThriftServer2.startWithContext(SparkSQLEnv.sqlContext)

// The server is now running and accepting connections
```

### HiveThriftServer2 Class

Private class that extends HiveServer2 and provides the actual server implementation.

```scala { .api }
/**
 * Private implementation class for HiveThriftServer2.
 * @param sqlContext The SQL context for this server instance
 */
private[hive] class HiveThriftServer2(sqlContext: SQLContext)
  extends HiveServer2 with ReflectedCompositeService {

  /**
   * Initialize the server with Hive configuration.
   * @param hiveConf Hive configuration object
   */
  def init(hiveConf: HiveConf): Unit

  /** Start the thrift server. */
  def start(): Unit

  /** Stop the thrift server. */
  def stop(): Unit
}
```

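Although the class is private to the `hive` package and is normally constructed internally by `startWithContext`, its lifecycle follows the usual composite-service ordering of `init`, `start`, and `stop`. A minimal sketch of that ordering (assuming code running inside the `org.apache.spark.sql.hive.thriftserver` package, with a live `SQLContext` named `sqlContext` already in scope):

```scala
import org.apache.hadoop.hive.conf.HiveConf

// Sketch only: this mirrors what startWithContext does internally.
val server = new HiveThriftServer2(sqlContext)
server.init(new HiveConf())  // wire up child services from the Hive configuration
server.start()               // begin accepting Thrift client connections
// ... serve clients ...
server.stop()                // release ports and shut down child services
```

In normal use, prefer `HiveThriftServer2.startWithContext`, which performs this sequence for you.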
### HiveThriftServer2Listener

Event listener for tracking server events and providing monitoring data.

```scala { .api }
/**
 * Spark listener for tracking HiveThriftServer2 events.
 * @param server The HiveServer2 instance
 * @param conf SQL configuration
 */
private[thriftserver] class HiveThriftServer2Listener(
    server: HiveServer2,
    conf: SQLConf) extends SparkListener {

  /**
   * Get the number of online sessions.
   * @return Current online session count
   */
  def getOnlineSessionNum: Int

  /**
   * Get the total number of running operations.
   * @return Current running operation count
   */
  def getTotalRunning: Int

  /**
   * Get the list of all sessions.
   * @return Sequence of session information
   */
  def getSessionList: Seq[SessionInfo]

  /**
   * Get session information by ID.
   * @param sessionId The session identifier
   * @return Optional session information
   */
  def getSession(sessionId: String): Option[SessionInfo]

  /**
   * Get the list of all executions.
   * @return Sequence of execution information
   */
  def getExecutionList: Seq[ExecutionInfo]

  /**
   * Called when a new session is created.
   * @param ip Client IP address
   * @param sessionId Session identifier
   * @param userName User name (defaults to "UNKNOWN")
   */
  def onSessionCreated(ip: String, sessionId: String, userName: String = "UNKNOWN"): Unit

  /**
   * Called when a session is closed.
   * @param sessionId Session identifier
   */
  def onSessionClosed(sessionId: String): Unit

  /**
   * Called when a statement starts execution.
   * @param id Statement identifier
   * @param sessionId Session identifier
   * @param statement SQL statement
   * @param groupId Job group identifier
   * @param userName User name (defaults to "UNKNOWN")
   */
  def onStatementStart(
      id: String,
      sessionId: String,
      statement: String,
      groupId: String,
      userName: String = "UNKNOWN"): Unit

  /**
   * Called when a statement is parsed.
   * @param id Statement identifier
   * @param executionPlan The execution plan
   */
  def onStatementParsed(id: String, executionPlan: String): Unit

  /**
   * Called when a statement encounters an error.
   * @param id Statement identifier
   * @param errorMessage Error message
   * @param errorTrace Error stack trace
   */
  def onStatementError(id: String, errorMessage: String, errorTrace: String): Unit

  /**
   * Called when a statement finishes execution.
   * @param id Statement identifier
   */
  def onStatementFinish(id: String): Unit
}
```

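The listener's accessors can be polled for lightweight monitoring once the server is up. A hedged sketch (assuming `HiveThriftServer2.listener` has been populated by `startWithContext`, per the object API above; the session ID string is a hypothetical placeholder):

```scala
// Poll the listener registered by HiveThriftServer2.startWithContext.
val listener = HiveThriftServer2.listener

println(s"Online sessions:    ${listener.getOnlineSessionNum}")
println(s"Running operations: ${listener.getTotalRunning}")

// Enumerate known sessions and executions.
listener.getSessionList.foreach(session => println(s"Session: $session"))
listener.getExecutionList.foreach(exec => println(s"Execution: $exec"))

// Look up a single session by its identifier (placeholder ID shown).
listener.getSession("example-session-id") match {
  case Some(info) => println(s"Found session: $info")
  case None       => println("No such session")
}
```

Because the listener extends `SparkListener`, these counters are updated on Spark's listener-event path; treat the values as eventually consistent snapshots rather than transactional reads.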
### Configuration Support

The server supports extensive configuration through Hive configuration parameters:

**Transport Configuration:**
- `hive.server2.transport.mode`: Transport mode, either `binary` or `http`
- `hive.server2.thrift.port`: Port for binary transport
- `hive.server2.thrift.http.port`: Port for HTTP transport

**Authentication Configuration:**
- `hive.server2.authentication`: Authentication mode (NONE, KERBEROS, etc.)
- `hive.server2.authentication.kerberos.principal`: Kerberos principal
- `hive.server2.authentication.kerberos.keytab`: Kerberos keytab file

**Security Configuration:**
- `hive.server2.enable.doAs`: Enable user impersonation
- `hive.server2.authentication.spnego.principal`: SPNEGO principal
- `hive.server2.authentication.spnego.keytab`: SPNEGO keytab

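These parameters are typically supplied as `--hiveconf` overrides when launching the server via the `start-thriftserver.sh` script that ships with Spark (the port and mode values below are illustrative, not defaults you must use):

```shell
# Launch the Thrift server in HTTP mode on a non-default port (example values).
./sbin/start-thriftserver.sh \
  --hiveconf hive.server2.transport.mode=http \
  --hiveconf hive.server2.thrift.http.port=10010 \
  --hiveconf hive.server2.authentication=NONE \
  --hiveconf hive.server2.enable.doAs=false
```

The same keys can instead be placed in `hive-site.xml` on the server's classpath; command-line `--hiveconf` values take precedence for that launch.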
### Error Handling

Common server startup and runtime errors:

- **Configuration Errors**: Invalid Hive configuration parameters
- **Authentication Errors**: Kerberos or SPNEGO authentication failures
- **Port Binding Errors**: Port already in use or insufficient permissions
- **Spark Context Errors**: SparkContext initialization or connection failures
- **Resource Errors**: Insufficient memory or compute resources

The server logs detailed error messages and exits with appropriate error codes for troubleshooting.