
tessl/maven-org-apache-spark--spark-repl-2-11

Interactive Scala shell for Apache Spark with distributed computing capabilities


docs/scala-compatibility.md

Scala 2.11 Compatibility

Specialized interpreter and expression-typing components that fix Scala 2.11 compatibility issues in the REPL and provide enhanced import handling.

Capabilities

SparkILoopInterpreter

A Scala 2.11-specific interpreter with import-handling fixes for REPL functionality.

/**
 * Scala 2.11 specific interpreter that extends IMain with Spark-specific fixes
 * Addresses import resolution bugs and flattened symbol issues in Scala 2.11
 * @param settings Scala compiler settings
 * @param out Output writer for interpreter results
 */
class SparkILoopInterpreter(settings: Settings, out: JPrintWriter) extends IMain(settings, out) {
  
  /** Custom member handlers with Spark-specific import handling */
  lazy val memberHandlers: MemberHandlers
  
  /** Expression typer instance for symbol and type resolution */
  object expressionTyper extends SparkExprTyper
  
  /**
   * Get symbol for a line of code
   * Overrides IMain implementation with enhanced symbol resolution
   * @param code Source code line
   * @return Symbol representing the code line
   */
  override def symbolOfLine(code: String): global.Symbol
  
  /**
   * Get type of an expression with optional silent mode
   * @param expr Expression string to type
   * @param silent Whether to suppress error messages
   * @return Type of the expression
   */
  override def typeOfExpression(expr: String, silent: Boolean): global.Type
  
  /**
   * Generate import code for request wrapper
   * Custom implementation to handle Scala 2.11 import bugs
   * @param wanted Set of names that need to be imported
   * @param wrapper Request wrapper for import context
   * @param definesClass Whether the request defines a class
   * @param generousImports Whether to include generous imports
   * @return Computed imports with header, code, and access path
   */
  override def importsCode(
    wanted: Set[Name], 
    wrapper: Request#Wrapper,
    definesClass: Boolean, 
    generousImports: Boolean
  ): ComputedImports
}

Usage Examples:

import org.apache.spark.repl.SparkILoopInterpreter
import scala.tools.nsc.Settings
import scala.tools.nsc.interpreter.JPrintWriter
import java.io.StringWriter

// Create Scala 2.11 compatible interpreter
val settings = new Settings
val output = new StringWriter
val interpreter = new SparkILoopInterpreter(settings, new JPrintWriter(output))

// Type expressions
val exprType = interpreter.typeOfExpression("List(1, 2, 3)", silent = false)

// Get symbols for code lines
val symbol = interpreter.symbolOfLine("val x = 42")

SparkExprTyper

Expression typing trait that provides enhanced symbol and type resolution for REPL expressions.

/**
 * Expression typing support for Spark REPL with proper phase management
 * Extends ExprTyper with REPL-specific functionality
 */
trait SparkExprTyper extends ExprTyper {
  
  /** Reference to the REPL interpreter */
  val repl: SparkILoopInterpreter
  
  /**
   * Interpret code while preserving compiler phase
   * Ensures phase consistency during expression evaluation
   * @param code Scala code to interpret
   * @return Interpretation result
   */
  def doInterpret(code: String): IR.Result
  
  /**
   * Get symbol for a line of code with enhanced resolution
   * Tries multiple strategies: expression, definition, error handling
   * @param code Source code line
   * @return Symbol representing the code, NoSymbol if not found
   */
  override def symbolOfLine(code: String): Symbol
}

Usage Examples:

// SparkExprTyper is typically used through SparkILoopInterpreter
val interpreter = new SparkILoopInterpreter(settings, new JPrintWriter(output))
val typer = interpreter.expressionTyper

// Interpret code with phase preservation
val result = typer.doInterpret("val data = sc.parallelize(1 to 10)")

// Get symbol with multiple resolution strategies
val symbol = typer.symbolOfLine("def process(x: Int) = x * 2")

Import Handling Fixes

SparkImportHandler

Custom import handler that addresses Scala 2.11 import resolution issues.

/**
 * Spark-specific import handler for Scala 2.11
 * Fixes issues with flattened symbols and wildcard imports
 */
class SparkImportHandler(imp: Import) extends ImportHandler(imp) {
  
  /** Override target type resolution for better symbol handling */
  override def targetType: Type
  
  /** Check if selector is individual import (not wildcard) */
  def isIndividualImport(s: ImportSelector): Boolean
  
  /** Check if selector is wildcard import */
  def isWildcardImport(s: ImportSelector): Boolean
  
  /** Get importable symbols with their rename mappings */
  lazy val importableSymbolsWithRenames: List[(Symbol, Name)]
  
  /** Get individual (non-wildcard) symbols */
  override lazy val individualSymbols: List[Symbol]
  
  /** Get wildcard-imported symbols */
  override lazy val wildcardSymbols: List[Symbol]
}
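The selector predicates above can be modeled with a minimal, self-contained sketch. `ImportSelector` here is a simplified stand-in for the compiler's type, and all names are illustrative, not the handler's real API:

```scala
// Simplified stand-in for the compiler's ImportSelector: a wildcard
// selector has the name "_"; an individual selector names a member
// and may rename it.
case class ImportSelector(name: String, rename: String)

object SelectorModel {
  private val Wildcard = "_"

  def isIndividualImport(s: ImportSelector): Boolean = s.name != Wildcard
  def isWildcardImport(s: ImportSelector): Boolean = s.name == Wildcard

  // Pair each individually imported member with its (possibly renamed)
  // target, mirroring the idea behind importableSymbolsWithRenames.
  def symbolsWithRenames(sels: List[ImportSelector]): List[(String, String)] =
    sels.filter(isIndividualImport).map(s => s.name -> s.rename)
}
```

The real handler works on compiler `Symbol`s rather than strings, but the classification logic follows the same shape.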

Import Resolution Strategies

The import handling system uses multiple strategies to resolve symbols:

Target Type Resolution:

  1. Check for module definition first
  2. Fall back to expression type resolution
  3. Handle flattened symbol cases

Symbol Filtering:

  • Excludes flattened symbols (containing name join strings)
  • Handles individual vs wildcard imports differently
  • Manages symbol renaming and aliasing

Import Code Generation:

  • Generates proper import wrapper code
  • Handles class-based vs object-based imports
  • Manages import precedence and conflicts
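The code-generation step can be sketched roughly as follows; the shapes and names here are hypothetical simplifications, not the actual `IMain` internals. Each earlier REPL line is represented by its wrapper object path plus the names it defines, and only wrappers contributing a wanted name emit imports:

```scala
object ImportCodeSketch {
  // Each prior REPL line: (wrapper object path, names it defines).
  type LineObject = (String, Set[String])

  // Emit one import per wanted name, drawn from each wrapper that defines
  // it; wrappers appearing later in the generated code take precedence.
  def importsCode(wanted: Set[String], lines: List[LineObject]): String =
    lines.flatMap { case (obj, defined) =>
      (defined intersect wanted).toList.sorted.map(n => s"import $obj.$n")
    }.mkString("\n")
}
```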

Phase Management

Compiler Phase Preservation

Critical for maintaining consistent compilation state during interpretation:

def doInterpret(code: String): IR.Result = {
  // Preserve current compiler phase
  val savedPhase = phase
  try interpretSynthetic(code) 
  finally phase = savedPhase
}

Symbol Resolution Phases

Multi-phase symbol resolution for robust code interpretation:

  1. Expression Phase: Try as expression with lazy val typing pattern
  2. Definition Phase: Try as definition or declaration
  3. Error Phase: Handle interpretation errors gracefully

def symbolOfLine(code: String): Symbol = {
  def asExpr(): Symbol = {
    val name = freshInternalVarName()
    val line = "def " + name + " = " + code
    doInterpret(line) match {
      case IR.Success => 
        val sym0 = symbolOfTerm(name)
        sym0.cloneSymbol setInfo exitingTyper(sym0.tpe_*.finalResultType)
      case _ => NoSymbol
    }
  }
  
  def asDefn(): Symbol = { /* definition resolution */ }
  def asError(): Symbol = { /* error handling */ }
  
  // Try strategies in order
  beSilentDuring(asExpr()) orElse beSilentDuring(asDefn()) orElse asError()
}

Bug Fixes and Workarounds

Flattened Symbol Handling

Scala 2.11 has issues with flattened symbols in imports:

private def isFlattenedSymbol(sym: Symbol): Boolean =
  sym.owner.isPackageClass &&
    sym.name.containsName(nme.NAME_JOIN_STRING) &&
    sym.owner.info.member(sym.name.take(
      safeIndexOf(sym.name, nme.NAME_JOIN_STRING))) != NoSymbol

Safe String Operations

Enhanced string operations to handle edge cases:

private def safeIndexOf(name: Name, s: String): Int = fixIndexOf(name, pos(name, s))
private def fixIndexOf(name: Name, idx: Int): Int = if (idx == name.length) -1 else idx
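A standalone model of these helpers, using `String` in place of the compiler's `Name` (whose position lookup reports the name's length on a miss), shows the intended behavior:

```scala
object SafeOps {
  // Model of Name's position lookup: returns the name's length when the
  // substring is absent, rather than -1.
  def pos(name: String, s: String): Int = {
    val i = name.indexOf(s)
    if (i < 0) name.length else i
  }

  // Normalize the "miss" sentinel back to the conventional -1.
  def fixIndexOf(name: String, idx: Int): Int =
    if (idx == name.length) -1 else idx

  def safeIndexOf(name: String, s: String): Int =
    fixIndexOf(name, pos(name, s))
}
```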

Import Context Management

Proper import context handling for nested scopes:

  • Wrapper Management: Creates import wrappers as needed
  • Conflict Resolution: Handles import name conflicts
  • Scope Isolation: Manages different import scopes correctly
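A minimal sketch of the precedence rule, assuming imports are processed in scope order so that later bindings of the same name shadow earlier ones (mirroring Scala's innermost-scope-wins behavior):

```scala
object ImportPrecedenceSketch {
  // (name -> origin) pairs in scope order; the fold keeps the last
  // binding seen, so later imports shadow earlier ones.
  def resolve(imports: List[(String, String)]): Map[String, String] =
    imports.foldLeft(Map.empty[String, String]) {
      case (acc, (name, origin)) => acc + (name -> origin)
    }
}
```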

Integration with Main REPL

Conditional Usage

SparkILoopInterpreter is used conditionally based on Scala version:

override def createInterpreter(): Unit = {
  if (isScala2_11) {
    // Use SparkILoopInterpreter for Scala 2.11
    val interpreterClass = Class.forName("org.apache.spark.repl.SparkILoopInterpreter")
    intp = interpreterClass
      .getDeclaredConstructor(Seq(classOf[Settings], classOf[JPrintWriter]): _*)
      .newInstance(Seq(settings, out): _*)
      .asInstanceOf[IMain]
  } else {
    // Use standard interpreter for newer Scala versions
    super.createInterpreter()
  }
}

Version Detection

private val isScala2_11 = versionNumberString.startsWith("2.11")

Error Handling

Interpretation Errors

Robust error handling for interpretation failures:

case IR.Success => /* handle success */
case _ => NoSymbol  // Handle interpretation failures

Symbol Resolution Errors

Graceful fallback for symbol resolution issues:

beSilentDuring(asExpr()) orElse beSilentDuring(asDefn()) orElse asError()

Import Resolution Errors

Safe handling of import resolution problems:

  • Failed symbol lookups return appropriate defaults
  • Import conflicts are resolved with proper precedence
  • Malformed imports are handled without crashing
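These behaviors can be sketched with self-contained stand-ins; the names below are illustrative, not the handler's real API:

```scala
import scala.util.Try

object SafeImportSketch {
  val NoSymbol = "NoSymbol"

  // A failed member lookup returns a sentinel default instead of throwing,
  // mirroring the NoSymbol pattern.
  def lookup(members: Map[String, String], name: String): String =
    members.getOrElse(name, NoSymbol)

  // A malformed import statement yields None rather than crashing.
  def parseImport(stmt: String): Option[(String, String)] =
    Try {
      val body = stmt.stripPrefix("import ").trim
      val dot  = body.lastIndexOf('.')
      require(dot > 0 && dot < body.length - 1, s"malformed import: $stmt")
      (body.take(dot), body.drop(dot + 1))
    }.toOption
}
```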

Install with Tessl CLI

npx tessl i tessl/maven-org-apache-spark--spark-repl-2-11
