# Code Interpreter

The Code Interpreter provides the core functionality for compiling, executing, and managing Scala code within the REPL environment. It handles code compilation and execution, variable management, and introspection.

## Capabilities

### SparkIMain Class

The core interpreter class that handles all aspects of code interpretation, including compilation, execution, and state management.

```scala { .api }
/**
 * Core interpreter for Spark REPL handling code compilation and execution
 * @param initialSettings Scala compiler settings
 * @param out Output writer for results and errors
 * @param propagateExceptions Whether to propagate exceptions to caller
 */
@DeveloperApi
class SparkIMain(
    initialSettings: Settings,
    val out: JPrintWriter,
    propagateExceptions: Boolean = false
) extends SparkImports with Logging { imain =>

  /**
   * Constructor with default settings
   */
  def this() = this(new Settings())

  /**
   * Constructor with custom settings
   */
  def this(settings: Settings) = this(settings, new NewLinePrintWriter(new ConsoleWriter, true))
}
```

**Core Properties:**

```scala { .api }
/**
 * Output directory for compiled classes
 */
@DeveloperApi
lazy val getClassOutputDirectory: File

/**
 * The underlying Scala compiler instance
 */
@DeveloperApi
lazy val global: Global

/**
 * Code wrapper used for execution context
 */
@DeveloperApi
def executionWrapper: String

/**
 * Whether compilation errors were reported
 */
@DeveloperApi
def isReportingErrors: Boolean
```

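These properties can be inspected directly once the interpreter is initialized. A brief sketch, using only the members documented above:

```scala
import org.apache.spark.repl.SparkIMain

val interpreter = new SparkIMain()
interpreter.initializeSynchronous()

// Where compiled REPL classes are written
val outputDir = interpreter.getClassOutputDirectory
println(s"Class output directory: ${outputDir.getAbsolutePath}")

// The current execution wrapper (empty unless one has been set)
println(s"Execution wrapper: '${interpreter.executionWrapper}'")

// Whether compilation errors have been reported
println(s"Reporting errors: ${interpreter.isReportingErrors}")
```
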
### Initialization

Methods for initializing the interpreter and preparing it for code execution.

```scala { .api }
/**
 * Initialize the interpreter synchronously
 * Must be called before using interpretation methods
 */
@DeveloperApi
def initializeSynchronous(): Unit
```

**Usage Example:**

```scala
import org.apache.spark.repl.SparkIMain

val interpreter = new SparkIMain()
interpreter.initializeSynchronous()
// Now ready for code interpretation
```

### Code Execution

Core methods for interpreting and compiling Scala code.

```scala { .api }
/**
 * Interpret a line of Scala code
 * @param line Scala code to interpret
 * @return Result indicating success, error, or incomplete input
 */
@DeveloperApi
def interpret(line: String): IR.Result

/**
 * Interpret code with synthetic naming (internal use)
 * @param line Scala code to interpret
 * @return Result indicating success, error, or incomplete input
 */
@DeveloperApi
def interpretSynthetic(line: String): IR.Result

/**
 * Compile source files
 * @param sources Variable number of SourceFile instances
 * @return Boolean indicating compilation success
 */
@DeveloperApi
def compileSources(sources: SourceFile*): Boolean

/**
 * Compile a string of Scala code
 * @param code Scala code to compile
 * @return Boolean indicating compilation success
 */
@DeveloperApi
def compileString(code: String): Boolean
```

**Usage Examples:**

```scala
import org.apache.spark.repl.SparkIMain
import scala.tools.nsc.interpreter.{Results => IR}

val interpreter = new SparkIMain()
interpreter.initializeSynchronous()

// Interpret simple expressions
val result1 = interpreter.interpret("val x = 42")
result1 match {
  case IR.Success    => println("Successfully defined x")
  case IR.Error      => println("Error in code")
  case IR.Incomplete => println("Code is incomplete")
}

// Interpret complex code
val result2 = interpreter.interpret("""
  def fibonacci(n: Int): Int = {
    if (n <= 1) n else fibonacci(n - 1) + fibonacci(n - 2)
  }
""")

// Compile a string directly
val compiled = interpreter.compileString("case class Person(name: String, age: Int)")
```

### Code Analysis

Methods for parsing and analyzing code structure without execution.

```scala { .api }
/**
 * Parse Scala code into an abstract syntax tree
 * @param line Scala code to parse
 * @return Optional list of Tree nodes representing the parsed code
 */
@DeveloperApi
def parse(line: String): Option[List[Tree]]

/**
 * Get symbol information for a line of code
 * @param code Scala code to analyze
 * @return Symbol representing the code
 */
@DeveloperApi
def symbolOfLine(code: String): Symbol

/**
 * Get type information for an expression
 * @param expr Scala expression to analyze
 * @param silent Whether to suppress error messages
 * @return Type information for the expression
 */
@DeveloperApi
def typeOfExpression(expr: String, silent: Boolean = true): Type
```

**Usage Examples:**

```scala
import org.apache.spark.repl.SparkIMain

val interpreter = new SparkIMain()
interpreter.initializeSynchronous()

// Parse code structure
val parsed = interpreter.parse("val x: Int = 42")
parsed.foreach(trees => println(s"Parsed ${trees.length} tree nodes"))

// Get symbol information
val sym = interpreter.symbolOfLine("def double(n: Int) = n * 2")
println(s"Symbol: $sym")

// Get type information
val exprType = interpreter.typeOfExpression("List(1, 2, 3)")
println(s"Expression type: $exprType")
```

### Variable Management

Methods for binding variables and managing the interpreter's variable namespace.

```scala { .api }
/**
 * Bind a variable to the interpreter namespace
 * @param name Variable name
 * @param boundType Type of the variable as string
 * @param value Value to bind
 * @param modifiers List of modifiers (e.g., "lazy", "implicit")
 * @return Result indicating binding success
 */
@DeveloperApi
def bind(name: String, boundType: String, value: Any, modifiers: List[String] = Nil): IR.Result

/**
 * Direct variable binding without wrapper
 * @param name Variable name
 * @param boundType Type of the variable as string
 * @param value Value to bind
 * @return Result indicating binding success
 */
@DeveloperApi
def directBind(name: String, boundType: String, value: Any): IR.Result

/**
 * Rebind an existing variable with a new value
 * @param p NamedParam containing name and value
 * @return Result indicating rebinding success
 */
@DeveloperApi
def rebind(p: NamedParam): IR.Result

/**
 * Add import statements to the interpreter
 * @param ids Variable number of import strings
 * @return Result indicating import success
 */
@DeveloperApi
def addImports(ids: String*): IR.Result
```

**Usage Examples:**

```scala
import org.apache.spark.repl.SparkIMain
import scala.tools.nsc.interpreter.{Results => IR}

val interpreter = new SparkIMain()
interpreter.initializeSynchronous()

// Bind variables
val bindResult = interpreter.bind("myList", "List[Int]", List(1, 2, 3, 4, 5))
val directResult = interpreter.directBind("pi", "Double", 3.14159)

// Add imports
val importResult = interpreter.addImports("scala.util.Random", "java.time.LocalDateTime")

// Check results
if (bindResult == IR.Success) {
  println("Successfully bound myList")
}
```

### State Management

Methods for managing the interpreter's state and lifecycle.

```scala { .api }
/**
 * Reset the interpreter state, clearing all definitions
 */
@DeveloperApi
def reset(): Unit

/**
 * Close the interpreter and clean up resources
 */
@DeveloperApi
def close(): Unit
```

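A typical lifecycle sketch using these methods:

```scala
import org.apache.spark.repl.SparkIMain

val interpreter = new SparkIMain()
interpreter.initializeSynchronous()

interpreter.interpret("val x = 42")

// Clear all definitions; x is no longer defined afterwards
interpreter.reset()

// Release compiler resources when finished
interpreter.close()
```
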
### Introspection

Methods for inspecting the current state of defined names, symbols, and variables.

```scala { .api }
/**
 * Get all defined names in the interpreter
 * @return List of all defined names
 */
@DeveloperApi
def allDefinedNames: List[Name]

/**
 * Get all defined term names
 * @return List of defined term names
 */
@DeveloperApi
def definedTerms: List[TermName]

/**
 * Get all defined type names
 * @return List of defined type names
 */
@DeveloperApi
def definedTypes: List[TypeName]

/**
 * Get all defined symbols as a set
 * @return Set of defined symbols
 */
@DeveloperApi
def definedSymbols: Set[Symbol]

/**
 * Get all defined symbols as a list
 * @return List of defined symbols
 */
@DeveloperApi
def definedSymbolList: List[Symbol]

/**
 * Get user-defined term names (excluding res0, res1, etc.)
 * @return List of user-defined term names
 */
@DeveloperApi
def namedDefinedTerms: List[TermName]

/**
 * Get the name of the most recent result variable
 * @return Most recent result variable name
 */
@DeveloperApi
def mostRecentVar: String

/**
 * Get recent compiler warnings
 * @return List of warning tuples (position, message)
 */
@DeveloperApi
def lastWarnings: List[(Position, String)]
```

**Usage Examples:**

```scala
import org.apache.spark.repl.SparkIMain

val interpreter = new SparkIMain()
interpreter.initializeSynchronous()

// Define some variables
interpreter.interpret("val x = 42")
interpreter.interpret("def square(n: Int) = n * n")
interpreter.interpret("case class Point(x: Int, y: Int)")

// Inspect defined names
val allNames = interpreter.allDefinedNames
val terms = interpreter.definedTerms
val types = interpreter.definedTypes
val userTerms = interpreter.namedDefinedTerms

println(s"All names: ${allNames.mkString(", ")}")
println(s"Terms: ${terms.mkString(", ")}")
println(s"Types: ${types.mkString(", ")}")
println(s"User terms: ${userTerms.mkString(", ")}")

// Get the most recent result variable
val recentVar = interpreter.mostRecentVar
println(s"Most recent variable: $recentVar")
```

### Value Access

Methods for accessing values, types, and runtime information of defined terms.

```scala { .api }
/**
 * Get the runtime value of a term
 * @param id Term identifier
 * @return Optional runtime value
 */
@DeveloperApi
def valueOfTerm(id: String): Option[AnyRef]

/**
 * Get the runtime class of a term
 * @param id Term identifier
 * @return Optional Java class
 */
@DeveloperApi
def classOfTerm(id: String): Option[JClass]

/**
 * Get the compile-time type of a term
 * @param id Term identifier
 * @return Type information
 */
@DeveloperApi
def typeOfTerm(id: String): Type

/**
 * Get the symbol of a term
 * @param id Term identifier
 * @return Symbol information
 */
@DeveloperApi
def symbolOfTerm(id: String): Symbol

/**
 * Get both runtime class and type information
 * @param id Term identifier
 * @return Optional tuple of (Java class, Type)
 */
@DeveloperApi
def runtimeClassAndTypeOfTerm(id: String): Option[(JClass, Type)]

/**
 * Get the runtime type of a term
 * @param id Term identifier
 * @return Runtime type information
 */
@DeveloperApi
def runtimeTypeOfTerm(id: String): Type
```

**Usage Examples:**

```scala
import org.apache.spark.repl.SparkIMain

val interpreter = new SparkIMain()
interpreter.initializeSynchronous()

// Define a variable
interpreter.interpret("val numbers = List(1, 2, 3, 4, 5)")

// Access its value and type information
val value = interpreter.valueOfTerm("numbers")
val clazz = interpreter.classOfTerm("numbers")
val tpe = interpreter.typeOfTerm("numbers")

value.foreach(v => println(s"Value: $v"))
clazz.foreach(c => println(s"Class: ${c.getName}"))
println(s"Type: $tpe")

// Check runtime information
val runtimeInfo = interpreter.runtimeClassAndTypeOfTerm("numbers")
runtimeInfo.foreach { case (runtimeClass, runtimeType) =>
  println(s"Runtime class: ${runtimeClass.getName}")
  println(s"Runtime type: $runtimeType")
}
```

### Code Generation

Methods for working with generated names and code paths.

```scala { .api }
/**
 * Get the real generated name for a REPL-defined name
 * @param simpleName Simple name used in REPL
 * @return Optional real generated name
 */
@DeveloperApi
def generatedName(simpleName: String): Option[String]

/**
 * Get the full path to access a name
 * @param name Name to get path for
 * @return Full access path
 */
@DeveloperApi
def pathToName(name: Name): String
```

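These methods map a REPL-visible name to its compiler-generated form and full access path. A sketch, assuming a `Name` is built via the `global` compiler instance documented above:

```scala
import org.apache.spark.repl.SparkIMain

val interpreter = new SparkIMain()
interpreter.initializeSynchronous()

interpreter.interpret("val greeting = \"hello\"")

// Look up the compiler-generated name behind the simple REPL name
interpreter.generatedName("greeting").foreach(real => println(s"Generated name: $real"))

// Full path used to reference the term from generated code
val path = interpreter.pathToName(interpreter.global.newTermName("greeting"))
println(s"Access path: $path")
```
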
### Execution Control

Methods for controlling output and execution behavior.

```scala { .api }
/**
 * Execute code block while suppressing normal output
 * @param body Code block to execute
 * @return Result of code block execution
 */
@DeveloperApi
def beQuietDuring[T](body: => T): T

/**
 * Execute code block while masking all output
 * @param operation Code block to execute
 * @return Result of code block execution
 */
@DeveloperApi
def beSilentDuring[T](operation: => T): T

/**
 * Set custom execution wrapper code
 * @param code Wrapper code string
 */
@DeveloperApi
def setExecutionWrapper(code: String): Unit

/**
 * Clear the execution wrapper
 */
@DeveloperApi
def clearExecutionWrapper(): Unit
```

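The output-control methods are useful for running setup code without echoing REPL result lines. A sketch, using only methods documented in this section and in Code Analysis:

```scala
import org.apache.spark.repl.SparkIMain

val interpreter = new SparkIMain()
interpreter.initializeSynchronous()

// Run setup code without printing result lines
interpreter.beQuietDuring {
  interpreter.interpret("val hidden = 1")
}

// Mask all output entirely during an operation
val tpe = interpreter.beSilentDuring {
  interpreter.typeOfExpression("1 + 1")
}
println(s"Type resolved silently: $tpe")
```
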
### Classpath Management

Methods for managing the interpreter's classpath.

```scala { .api }
/**
 * Add URLs to the interpreter's classpath
 * @param urls Variable number of URL instances
 */
@DeveloperApi
def addUrlsToClassPath(urls: URL*): Unit
```

**Usage Example:**

```scala
import java.net.URL

val interpreter = new SparkIMain()
interpreter.initializeSynchronous()

// Add JARs to the classpath
val jarUrl = new URL("file:///path/to/library.jar")
interpreter.addUrlsToClassPath(jarUrl)

// Classes from the JAR can now be used
interpreter.interpret("import com.example.SomeClass")
```

## Integration Patterns

### Complete Interpreter Usage

```scala
import org.apache.spark.repl.SparkIMain
import scala.tools.nsc.interpreter.{Results => IR}
import java.net.URL

// Create and initialize interpreter
val interpreter = new SparkIMain()
interpreter.initializeSynchronous()

// Add external dependencies
val externalJar = new URL("file:///path/to/spark-sql.jar")
interpreter.addUrlsToClassPath(externalJar)

// Execute code
val result = interpreter.interpret("""
  import org.apache.spark.sql.SparkSession
  val spark = SparkSession.builder()
    .appName("REPL Session")
    .master("local[*]")
    .getOrCreate()
""")

if (result == IR.Success) {
  // Access the created SparkSession
  val sparkSession = interpreter.valueOfTerm("spark")
  println(s"SparkSession created: $sparkSession")

  // Get information about defined variables
  val definedTerms = interpreter.namedDefinedTerms
  println(s"User-defined terms: ${definedTerms.mkString(", ")}")
}

// Clean up
interpreter.close()
```