
# Scala 2.11 Compatibility

Specialized interpreter and expression-typing components that provide Scala 2.11 compatibility fixes and enhanced import handling.

## Capabilities
### SparkILoopInterpreter

Scala 2.11 specific interpreter with import handling fixes for REPL functionality.

```scala { .api }
/**
 * Scala 2.11 specific interpreter that extends IMain with Spark-specific fixes
 * Addresses import resolution bugs and flattened symbol issues in Scala 2.11
 * @param settings Scala compiler settings
 * @param out Output writer for interpreter results
 */
class SparkILoopInterpreter(settings: Settings, out: JPrintWriter) extends IMain(settings, out) {

  /** Custom member handlers with Spark-specific import handling */
  lazy val memberHandlers: MemberHandlers

  /** Expression typer instance for symbol and type resolution */
  object expressionTyper extends SparkExprTyper

  /**
   * Get symbol for a line of code
   * Overrides IMain implementation with enhanced symbol resolution
   * @param code Source code line
   * @return Symbol representing the code line
   */
  override def symbolOfLine(code: String): global.Symbol

  /**
   * Get type of an expression with optional silent mode
   * @param expr Expression string to type
   * @param silent Whether to suppress error messages
   * @return Type of the expression
   */
  override def typeOfExpression(expr: String, silent: Boolean): global.Type

  /**
   * Generate import code for request wrapper
   * Custom implementation to handle Scala 2.11 import bugs
   * @param wanted Set of names that need to be imported
   * @param wrapper Request wrapper for import context
   * @param definesClass Whether the request defines a class
   * @param generousImports Whether to include generous imports
   * @return Computed imports with header, code, and access path
   */
  override def importsCode(
      wanted: Set[Name],
      wrapper: Request#Wrapper,
      definesClass: Boolean,
      generousImports: Boolean
  ): ComputedImports
}
```

**Usage Examples:**

```scala
import org.apache.spark.repl.SparkILoopInterpreter
import scala.tools.nsc.Settings
import scala.tools.nsc.interpreter.JPrintWriter
import java.io.StringWriter

// Create a Scala 2.11 compatible interpreter
val settings = new Settings
val output = new StringWriter
val interpreter = new SparkILoopInterpreter(settings, new JPrintWriter(output))

// Type expressions
val exprType = interpreter.typeOfExpression("List(1, 2, 3)", silent = false)

// Get symbols for code lines
val symbol = interpreter.symbolOfLine("val x = 42")
```

### SparkExprTyper

Expression typing trait that provides enhanced symbol and type resolution for REPL expressions.

```scala { .api }
/**
 * Expression typing support for Spark REPL with proper phase management
 * Extends ExprTyper with REPL-specific functionality
 */
trait SparkExprTyper extends ExprTyper {

  /** Reference to the REPL interpreter */
  val repl: SparkILoopInterpreter

  /**
   * Interpret code while preserving compiler phase
   * Ensures phase consistency during expression evaluation
   * @param code Scala code to interpret
   * @return Interpretation result
   */
  def doInterpret(code: String): IR.Result

  /**
   * Get symbol for a line of code with enhanced resolution
   * Tries multiple strategies: expression, definition, error handling
   * @param code Source code line
   * @return Symbol representing the code, NoSymbol if not found
   */
  override def symbolOfLine(code: String): Symbol
}
```

**Usage Examples:**

```scala
// SparkExprTyper is typically used through SparkILoopInterpreter
val interpreter = new SparkILoopInterpreter(settings, output)
val typer = interpreter.expressionTyper

// Interpret code with phase preservation
val result = typer.doInterpret("val data = sc.parallelize(1 to 10)")

// Get a symbol using multiple resolution strategies
val symbol = typer.symbolOfLine("def process(x: Int) = x * 2")
```

## Import Handling Fixes

### SparkImportHandler

Custom import handler that addresses Scala 2.11 import resolution issues.

```scala { .api }
/**
 * Spark-specific import handler for Scala 2.11
 * Fixes issues with flattened symbols and wildcard imports
 */
class SparkImportHandler(imp: Import) extends ImportHandler(imp) {

  /** Override target type resolution for better symbol handling */
  override def targetType: Type

  /** Check if a selector is an individual import (not a wildcard) */
  def isIndividualImport(s: ImportSelector): Boolean

  /** Check if a selector is a wildcard import */
  def isWildcardImport(s: ImportSelector): Boolean

  /** Importable symbols with their rename mappings */
  lazy val importableSymbolsWithRenames: List[(Symbol, Name)]

  /** Individually imported (non-wildcard) symbols */
  override lazy val individualSymbols: List[Symbol]

  /** Wildcard-imported symbols */
  override lazy val wildcardSymbols: List[Symbol]
}
```

### Import Resolution Strategies

The import handling system uses multiple strategies to resolve symbols:

**Target Type Resolution**:

1. Check for a module definition first
2. Fall back to expression type resolution
3. Handle flattened symbol cases

**Symbol Filtering**:

- Excludes flattened symbols (those containing name-join strings)
- Handles individual vs. wildcard imports differently
- Manages symbol renaming and aliasing

**Import Code Generation**:

- Generates proper import wrapper code
- Handles class-based vs. object-based imports
- Manages import precedence and conflicts
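The symbol-filtering rule can be illustrated with a plain-Scala model. This is a sketch only: `ImportFilterSketch` and `isFlattenedName` are hypothetical stand-ins for the compiler's `Symbol` and `Name` APIs, and it assumes `$` as the name-join string.

```scala
object ImportFilterSketch {
  // Stand-in for nme.NAME_JOIN_STRING ("$" in the Scala compiler)
  val NameJoinString = "$"

  // A name is treated as "flattened" when it contains the join string and
  // the prefix before the join names an existing member of its owner
  // (the owner's members are modeled here as a plain Set of strings)
  def isFlattenedName(name: String, ownerMembers: Set[String]): Boolean = {
    val idx = name.indexOf(NameJoinString)
    idx >= 0 && ownerMembers.contains(name.take(idx))
  }
}
```

Names like `Outer$Inner`, where `Outer` is already a member of the owning package, are the ones the filtering step excludes from generated imports.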

## Phase Management

### Compiler Phase Preservation

Critical for maintaining consistent compilation state during interpretation:

```scala
def doInterpret(code: String): IR.Result = {
  // Preserve the current compiler phase
  val savedPhase = phase
  try interpretSynthetic(code)
  finally phase = savedPhase
}
```

### Symbol Resolution Phases

Multi-phase symbol resolution for robust code interpretation:

1. **Expression Phase**: Try the line as an expression by typing a synthetic definition that wraps it
2. **Definition Phase**: Try it as a definition or declaration
3. **Error Phase**: Handle interpretation errors gracefully

```scala
def symbolOfLine(code: String): Symbol = {
  def asExpr(): Symbol = {
    val name = freshInternalVarName()
    // Wrap the line in a synthetic definition so it can be typed
    val line = "def " + name + " = " + code
    doInterpret(line) match {
      case IR.Success =>
        val sym0 = symbolOfTerm(name)
        sym0.cloneSymbol setInfo exitingTyper(sym0.tpe_*.finalResultType)
      case _ => NoSymbol
    }
  }

  def asDefn(): Symbol = { /* definition resolution */ }
  def asError(): Symbol = { /* error handling */ }

  // Try strategies in order
  beSilentDuring(asExpr()) orElse beSilentDuring(asDefn()) orElse asError()
}
```

## Bug Fixes and Workarounds

### Flattened Symbol Handling

Scala 2.11 has issues with flattened symbols in imports:

```scala
private def isFlattenedSymbol(sym: Symbol): Boolean =
  sym.owner.isPackageClass &&
    sym.name.containsName(nme.NAME_JOIN_STRING) &&
    sym.owner.info.member(
      sym.name.take(safeIndexOf(sym.name, nme.NAME_JOIN_STRING))) != NoSymbol
```

### Safe String Operations

Enhanced string operations to handle edge cases:

```scala
private def safeIndexOf(name: Name, s: String): Int = fixIndexOf(name, pos(name, s))

private def fixIndexOf(name: Name, idx: Int): Int = if (idx == name.length) -1 else idx
```

243

244

### Import Context Management

245

246

Proper import context handling for nested scopes:

247

248

- **Wrapper Management**: Creates import wrappers as needed

249

- **Conflict Resolution**: Handles import name conflicts

250

- **Scope Isolation**: Manages different import scopes correctly

251

252
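The wrapper and precedence behavior above can be modeled with ordinary collections. A minimal sketch, assuming each wrapper is a map from the name visible in user code to its fully qualified target (`ImportScopeSketch` and `resolve` are hypothetical names, not the REPL's API):

```scala
object ImportScopeSketch {
  // Each REPL request gets an import wrapper; model one as a Map from
  // visible name to fully qualified target
  type Wrapper = Map[String, String]

  // Later wrappers shadow earlier ones, mirroring import precedence:
  // search from the most recent wrapper backwards
  def resolve(wrappers: List[Wrapper], name: String): Option[String] =
    wrappers.reverse.collectFirst { case w if w.contains(name) => w(name) }
}
```

With two wrappers both binding `x`, the later binding wins, which is the conflict-resolution behavior described above.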

## Integration with Main REPL

### Conditional Usage

SparkILoopInterpreter is used conditionally based on the Scala version:

```scala
override def createInterpreter(): Unit = {
  if (isScala2_11) {
    // Use SparkILoopInterpreter for Scala 2.11
    val interpreterClass = Class.forName("org.apache.spark.repl.SparkILoopInterpreter")
    intp = interpreterClass
      .getDeclaredConstructor(Seq(classOf[Settings], classOf[JPrintWriter]): _*)
      .newInstance(Seq(settings, out): _*)
      .asInstanceOf[IMain]
  } else {
    // Use the standard interpreter for newer Scala versions
    super.createInterpreter()
  }
}
```

### Version Detection

```scala
private val isScala2_11 = versionNumberString.startsWith("2.11")
```

## Error Handling

### Interpretation Errors

Robust error handling for interpretation failures:

```scala
case IR.Success => /* handle success */
case _ => NoSymbol // Handle interpretation failures
```

### Symbol Resolution Errors

Graceful fallback for symbol resolution issues:

```scala
beSilentDuring(asExpr()) orElse beSilentDuring(asDefn()) orElse asError()
```

### Import Resolution Errors

Safe handling of import resolution problems:

- Failed symbol lookups return appropriate defaults
- Import conflicts are resolved with proper precedence
- Malformed imports are handled without crashing
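These fallback rules have the same shape as the `orElse` chain used for symbol resolution: try each strategy in order and return a safe default rather than throwing. A plain-Scala sketch (all names here are hypothetical, not the REPL's API):

```scala
object ResolutionFallbackSketch {
  // Stand-in for the compiler's NoSymbol sentinel
  val NoResult = "<no symbol>"

  // Run strategies in order; keep the first non-default answer,
  // and fall back to the sentinel instead of crashing
  def resolve(strategies: List[() => String]): String =
    strategies.iterator
      .map(strategy => strategy())
      .find(_ != NoResult)
      .getOrElse(NoResult)
}
```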