# Interactive Shell

The Interactive Shell provides the main REPL functionality with Spark integration, command processing, and session management. It serves as the foundation for the `spark-shell` command-line tool.

## Capabilities

### Main Entry Point

The primary entry point for the Spark REPL application, providing global access to the interpreter instance.
```scala { .api }
/**
 * Entry point object for the Spark REPL application
 */
object Main extends Logging {
  /**
   * Application entry point that starts the REPL
   * @param args Command line arguments passed to the REPL
   */
  def main(args: Array[String]): Unit

  /**
   * Gets the current interpreter instance
   * @return Current SparkILoop instance
   */
  def interp: SparkILoop

  /**
   * Sets the interpreter instance
   * @param i SparkILoop instance to set
   */
  def interp_=(i: SparkILoop): Unit
}
```
**Usage Examples:**

```scala
import org.apache.spark.repl.Main

// Start REPL with default settings
Main.main(Array())

// Start with specific arguments
Main.main(Array("-master", "local[*]", "-i", "init.scala"))

// Access current interpreter
val currentRepl = Main.interp
```
### SparkILoop Class

The main interactive loop class that provides the read-eval-print functionality with Spark-specific features.

```scala { .api }
/**
 * The Scala interactive shell with Spark integration
 * @param in0 Optional input reader for commands
 * @param out Output writer for results
 * @param master Optional Spark master URL
 */
@DeveloperApi
class SparkILoop(
    private val in0: Option[BufferedReader] = None,
    protected val out: JPrintWriter = new JPrintWriter(Console.out, true),
    val master: Option[String] = None
) extends AnyRef with LoopCommands with SparkILoopInit with Logging {

  /**
   * Constructor with BufferedReader and JPrintWriter
   */
  def this(in0: BufferedReader, out: JPrintWriter) =
    this(Some(in0), out, None)

  /**
   * Constructor with BufferedReader, JPrintWriter, and master URL
   */
  def this(in0: BufferedReader, out: JPrintWriter, master: String) =
    this(Some(in0), out, Some(master))

  /**
   * Default constructor
   */
  def this() = this(None, new JPrintWriter(Console.out, true), None)
}
```
**Core Properties:**

```scala { .api }
/**
 * The active Spark context for the REPL session
 */
@DeveloperApi
var sparkContext: SparkContext

/**
 * Sets the REPL prompt string
 * @param prompt New prompt string to display
 */
@DeveloperApi
def setPrompt(prompt: String): Unit

/**
 * Gets the current prompt string
 * @return Current prompt string
 */
@DeveloperApi
def prompt: String

/**
 * Gets the list of available REPL commands
 * @return List of LoopCommand instances
 */
@DeveloperApi
def commands: List[LoopCommand]
```
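As a brief sketch of how these properties work together, the snippet below customizes the prompt and enumerates the built-in colon-commands on a freshly constructed loop. It assumes `LoopCommand` exposes a `name` field, as in the standard Scala REPL.

```scala
import org.apache.spark.repl.SparkILoop

// Sketch only: exercises the prompt and command-list API above.
val repl = new SparkILoop()

// Replace the default prompt and read it back
repl.setPrompt("spark> ")
println(s"Prompt is now: ${repl.prompt}")

// Enumerate the built-in colon-commands (:help, :quit, ...).
// `cmd.name` is assumed from the Scala REPL's LoopCommand.
repl.commands.foreach(cmd => println(s":${cmd.name}"))
```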
**Spark Integration Methods:**

```scala { .api }
/**
 * Creates and initializes a SparkSession for the REPL
 * @return Configured SparkSession instance
 */
@DeveloperApi
def createSparkSession(): SparkSession

/**
 * Gets array of JAR files added to the classpath
 * @return Array of JAR file paths
 */
@DeveloperApi
def getAddedJars(): Array[String]
```
**Main Processing Method:**

```scala { .api }
/**
 * Processes command line arguments and starts the REPL loop
 * @param args Command line arguments
 * @return Boolean indicating successful completion
 */
def process(args: Array[String]): Boolean
```
**Usage Examples:**

```scala
import org.apache.spark.repl.SparkILoop
import java.io.{BufferedReader, StringReader}
import scala.tools.nsc.interpreter.JPrintWriter

// Create REPL with default settings
val repl = new SparkILoop()
repl.process(Array("-master", "local[*]"))

// Create REPL with custom input/output
val input = new BufferedReader(new StringReader("val x = 1\n:quit\n"))
val output = new JPrintWriter(System.out, true)
val customRepl = new SparkILoop(input, output)
customRepl.process(Array())

// Access Spark integration
val repl2 = new SparkILoop()
repl2.createSparkSession()
val jars = repl2.getAddedJars()
println(s"Added JARs: ${jars.mkString(", ")}")
```
### Initialization and Welcome

REPL initialization functionality, including welcome messages and Spark context setup.

```scala { .api }
/**
 * Trait providing asynchronous initialization for the REPL
 */
trait SparkILoopInit {
  /**
   * Display the welcome message to users
   */
  def printWelcome(): Unit

  /**
   * Initialize Spark context and session
   */
  def initializeSpark(): Unit
}
```
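These hooks can be overridden by a subclass to customize startup behavior. The sketch below assumes `SparkILoop` (which mixes in `SparkILoopInit`) is open for extension, and writes through the protected `out` writer declared on `SparkILoop` above.

```scala
import org.apache.spark.repl.SparkILoop

// Sketch: replace the default welcome banner. `out` is the
// protected JPrintWriter from the SparkILoop constructor.
class QuietSparkILoop extends SparkILoop {
  override def printWelcome(): Unit = {
    out.println("Custom Spark shell -- type :help for commands")
  }
}

// Usage: run the customized loop like the stock one
// new QuietSparkILoop().process(Array("-master", "local[*]"))
```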
### Command Line Processing

Command-line argument parsing with Spark-specific options.

```scala { .api }
/**
 * Command-line parser with Spark-specific options
 * @param args List of command line arguments
 * @param error Error handling function
 */
@DeveloperApi
class SparkCommandLine(
    args: List[String],
    error: String => Unit = System.err.println
) {
  /**
   * Constructor with default error handling
   */
  def this(args: List[String]) = this(args, System.err.println)

  /**
   * Constructor with custom settings
   */
  def this(args: List[String], settings: Settings) = this(args, System.err.println)
}
```
**Usage Examples:**

```scala
import org.apache.spark.repl.SparkCommandLine

// Parse command line arguments
val cmdLine = new SparkCommandLine(List("-master", "local[*]", "-i", "init.scala"))

// With custom error handling
val cmdLine2 = new SparkCommandLine(
  List("-master", "spark://localhost:7077"),
  (error: String) => println(s"Error: $error")
)
```
## Integration Patterns

### Starting a Custom REPL

```scala
import org.apache.spark.repl.SparkILoop

// Create REPL with a specific Spark configuration
val repl = new SparkILoop()
val success = repl.process(Array(
  "-master", "local[4]",
  "-conf", "spark.app.name=MyREPL",
  "-i", "init.scala"
))

// process() returns only after the interactive loop exits
if (success) {
  println("REPL session completed successfully")
}
```
### Accessing REPL State

```scala
import org.apache.spark.repl.Main

// Start the REPL. Note: Main.main blocks until the session exits,
// so run it on a separate thread if you need concurrent access.
Main.main(Array("-master", "local[*]"))

// Access the running REPL instance
val runningRepl = Main.interp
if (runningRepl != null) {
  val context = runningRepl.sparkContext
  val jars = runningRepl.getAddedJars()
  println(s"Spark Context: ${context.appName}")
  println(s"Added JARs: ${jars.length}")
}
```