# Interactive Shell Management

Core REPL loop functionality for interactive Scala development with Spark integration, providing command processing, prompt customization, session management, and automatic Spark context initialization.

## Capabilities

### SparkILoop Class

Main interactive shell class that manages the REPL session, user input processing, and the Spark context lifecycle.
```scala { .api }
/**
 * Main Scala interactive shell for Spark
 * @param in0 Optional buffered reader for input
 * @param out Print writer for output
 * @param master Optional Spark master URL
 */
@DeveloperApi
class SparkILoop(
  in0: Option[BufferedReader] = None,
  out: JPrintWriter,
  master: Option[String] = None
)

/**
 * Alternative constructors for common usage patterns
 */
def this(in0: BufferedReader, out: JPrintWriter, master: String)
def this(in0: BufferedReader, out: JPrintWriter)
def this()
```
**Usage Examples:**

```scala
import org.apache.spark.repl.SparkILoop
import java.io.{BufferedReader, InputStreamReader, PrintWriter}

// Default REPL with standard I/O
val repl = new SparkILoop()

// Custom I/O streams
val in = new BufferedReader(new InputStreamReader(System.in))
val out = new PrintWriter(System.out, true)
val customRepl = new SparkILoop(in, out)

// With a specific Spark master
val clusterRepl = new SparkILoop(in, out, "spark://master:7077")
```
### Session Management

Methods for controlling the REPL session lifecycle and processing command-line arguments.

```scala { .api }
/**
 * Start REPL processing with command line arguments
 * @param args Command line arguments array
 * @return true if processing completed successfully
 */
def process(args: Array[String]): Boolean
```
**Usage Examples:**

```scala
// Start with no arguments (interactive mode)
repl.process(Array.empty)

// Start with an initialization script
repl.process(Array("-i", "init.scala"))

// Start with specific settings
repl.process(Array("-classpath", "/path/to/classes"))
```
### Prompt Customization

Methods for customizing the REPL prompt display.

```scala { .api }
/**
 * Set custom prompt string
 * @param prompt New prompt string to display
 */
@DeveloperApi
def setPrompt(prompt: String): Unit

/**
 * Get current prompt string
 * @return Current prompt string
 */
@DeveloperApi
def prompt: String
```
**Usage Examples:**

```scala
// Set a custom prompt
repl.setPrompt("spark> ")

// Get the current prompt
val currentPrompt = repl.prompt
println(s"Current prompt: $currentPrompt")
```
### Command System

Access to the available REPL commands and command-processing functionality.

```scala { .api }
/**
 * Get list of available REPL commands
 * @return List of LoopCommand instances
 */
@DeveloperApi
def commands: List[LoopCommand]
```
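**Usage Examples:**

As a minimal sketch, the registered commands can be enumerated as follows; the `name` and `help` members are assumed from the standard Scala REPL `LoopCommand` interface rather than documented above:

```scala
import org.apache.spark.repl.SparkILoop

val repl = new SparkILoop()

// List each built-in command with its help text; `name` and `help`
// are assumed members of the underlying LoopCommand type.
repl.commands.foreach { cmd =>
  println(s":${cmd.name} - ${cmd.help}")
}
```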
### Spark Context Management

Methods for creating and managing the Spark and SQL contexts within the REPL session.

```scala { .api }
/**
 * Create SparkContext for the REPL session
 * @return New SparkContext instance
 */
@DeveloperApi
def createSparkContext(): SparkContext

/**
 * Create SQLContext for the REPL session
 * @return New SQLContext instance
 */
@DeveloperApi
def createSQLContext(): SQLContext

/**
 * Current SparkContext instance (mutable)
 */
@DeveloperApi
var sparkContext: SparkContext

/**
 * Current SQLContext instance (mutable)
 */
var sqlContext: SQLContext
```
**Usage Examples:**

```scala
// Create contexts
val sc = repl.createSparkContext()
val sqlContext = repl.createSQLContext()

// Access current contexts
val currentSc = repl.sparkContext
val currentSql = repl.sqlContext

// Replace the current context (newSparkContext stands in for an
// already constructed SparkContext instance)
repl.sparkContext = newSparkContext
```
### Main Entry Point

Object providing the primary application entry point and global interpreter management.

```scala { .api }
object Main {
  /**
   * Application main method
   * @param args Command line arguments
   */
  def main(args: Array[String]): Unit

  /**
   * Get current interpreter instance
   * @return Current SparkILoop instance
   */
  def interp: SparkILoop

  /**
   * Set interpreter instance
   * @param i New SparkILoop instance
   */
  def interp_=(i: SparkILoop): Unit
}
```
**Usage Examples:**

```scala
import org.apache.spark.repl.Main

// Start the REPL application
Main.main(Array("-i", "startup.scala"))

// Access the global interpreter
val globalInterp = Main.interp

// Set a custom interpreter
val customInterp = new SparkILoop()
Main.interp = customInterp
```
### Utility Methods

Companion object providing utility methods and implicit conversions.

```scala { .api }
object SparkILoop {
  /**
   * Get array of JAR files added to the classpath
   * @return Array of JAR file paths
   */
  def getAddedJars: Array[String]

  /**
   * Implicit conversion from SparkILoop to SparkIMain
   * @param repl SparkILoop instance
   * @return Underlying SparkIMain interpreter
   */
  implicit def loopToInterpreter(repl: SparkILoop): SparkIMain
}
```
**Usage Examples:**

```scala
import org.apache.spark.repl.SparkILoop._

// Get the JARs added to the classpath
val jars = SparkILoop.getAddedJars
jars.foreach(println)

// Use the implicit conversion
val repl = new SparkILoop()
val interpreter: SparkIMain = repl // implicit conversion
```