
# REPL Entry Point

Main application entry point and SparkSession/SparkContext lifecycle management for the interactive shell.

## Capabilities

### Main Object

Entry point object that manages the REPL application lifecycle and Spark session creation.

```scala { .api }
/**
 * Main entry point for the Spark REPL application
 * Manages SparkContext, SparkSession, and SparkILoop instances
 */
object Main extends Logging {
  /** Spark configuration object */
  val conf: SparkConf

  /** Current Spark context (mutable, null initially) */
  var sparkContext: SparkContext

  /** Current Spark session (mutable, null initially) */
  var sparkSession: SparkSession

  /** Current interpreter instance (mutable, used by tests) */
  var interp: SparkILoop

  /**
   * Main application entry point
   * @param args Command line arguments passed to the REPL
   */
  def main(args: Array[String]): Unit

  /**
   * Creates and configures a new SparkSession with proper catalog support
   * Automatically determines Hive support based on classpath availability
   * @return Configured SparkSession instance
   */
  def createSparkSession(): SparkSession
}
```

**Usage Examples:**

```scala
// Start REPL programmatically
org.apache.spark.repl.Main.main(Array("-classpath", "/path/to/jars"))

// Access current session
val session = org.apache.spark.repl.Main.sparkSession
val context = org.apache.spark.repl.Main.sparkContext

// Create new session
val newSession = org.apache.spark.repl.Main.createSparkSession()
```

### Internal Main Method

Internal main method used for testing and custom REPL initialization.

```scala { .api }
/**
 * Internal main method used by tests and custom initialization
 * @param args Command line arguments
 * @param _interp Custom SparkILoop instance to use
 */
private[repl] def doMain(args: Array[String], _interp: SparkILoop): Unit
```
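
For illustration, a test or embedding tool can supply its own interpreter instance. The sketch below is an assumption-level example: the `CustomReplRunner` object and the `-usejavacp` flag are illustrative, and `SparkILoop`'s constructors differ between Spark/Scala versions.

```scala
package org.apache.spark.repl

// Sketch: doMain is private[repl], so the caller must live in this package.
object CustomReplRunner {
  def run(): Unit = {
    val interp = new SparkILoop()             // custom interpreter instance (no-arg constructor assumed)
    Main.doMain(Array("-usejavacp"), interp)  // "-usejavacp": put the JVM classpath on the REPL classpath
  }
}
```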

## Configuration

### Spark Configuration

The Main object uses a pre-configured SparkConf instance with REPL-specific settings (see the sketch after this list):

- `spark.repl.classdir`: Directory for REPL class files
- `spark.repl.class.outputDir`: Output directory for compiled classes
- `spark.app.name`: Default application name ("Spark shell")
- `spark.executor.uri`: Executor URI from the environment
- `spark.home`: Spark home directory from the environment
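
A minimal sketch of the defaults listed above (illustrative only; the output-directory path shown is a placeholder, not the value the REPL computes):

```scala
import org.apache.spark.SparkConf

// Sketch: a SparkConf carrying the REPL-style defaults listed above.
val conf = new SparkConf()
  .setIfMissing("spark.app.name", "Spark shell")

// The compiled-class output directory is recorded so executors can fetch REPL-defined classes.
conf.set("spark.repl.class.outputDir", "/tmp/repl-classes")  // placeholder path
```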

### Class File Management

The REPL manages temporary directories for dynamically compiled classes:

```scala
val rootDir = conf.getOption("spark.repl.classdir").getOrElse(Utils.getLocalDir(conf))
val outputDir = Utils.createTempDir(root = rootDir, namePrefix = "repl")
```

### Catalog Integration

Hive catalog support is enabled automatically based on the classpath (see the sketch after this list):

- If Hive classes are present and `spark.sql.catalogImplementation=hive`: Enables Hive support
- Otherwise: Uses the in-memory catalog
- Configurable via the `spark.sql.catalogImplementation` property
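
The decision roughly follows this shape. This is a sketch only: the `buildSession` helper and the `hiveClassesArePresent` flag are illustrative stand-ins for the module's internal check.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

// Sketch: pick the catalog implementation when building the session.
def buildSession(conf: SparkConf, hiveClassesArePresent: Boolean): SparkSession = {
  val builder = SparkSession.builder.config(conf)
  if (conf.get("spark.sql.catalogImplementation", "hive") == "hive" && hiveClassesArePresent) {
    builder.enableHiveSupport().getOrCreate()  // Hive metastore-backed catalog
  } else {
    builder.getOrCreate()                      // in-memory catalog
  }
}
```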

## Error Handling

### Initialization Errors

```scala
// Shell session initialization failure
case e: Exception if isShellSession =>
  logError("Failed to initialize Spark session.", e)
  sys.exit(1)
```

### Argument Processing Errors

Invalid Scala compiler arguments are handled via an error callback:

```scala
private def scalaOptionError(msg: String): Unit = {
  hasErrors = true
  Console.err.println(msg)
}
```
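
The callback is typically installed when the compiler settings are constructed. A brief sketch, assuming Scala's `GenericRunnerSettings` (which accepts an error-reporting function):

```scala
import scala.tools.nsc.GenericRunnerSettings

// Sketch: unrecognized compiler options are routed to scalaOptionError,
// which records the failure and prints the message to stderr.
val settings = new GenericRunnerSettings(scalaOptionError)
```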

## Environment Integration

### Environment Variables

Two environment variables are read at startup (see the sketch after this list):

- `SPARK_EXECUTOR_URI`: Sets executor URI for distributed execution
- `SPARK_HOME`: Sets Spark installation directory
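
A minimal sketch of how these variables map onto the configuration keys listed in the Configuration section (the `applyEnvOverrides` helper is hypothetical):

```scala
import org.apache.spark.SparkConf

// Sketch: copy environment variables onto the SparkConf only when they are set.
def applyEnvOverrides(conf: SparkConf): SparkConf = {
  sys.env.get("SPARK_EXECUTOR_URI").foreach(conf.set("spark.executor.uri", _))
  sys.env.get("SPARK_HOME").foreach(conf.setSparkHome)
  conf
}
```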

### Command Line Integration

Arguments are processed and passed to the Scala interpreter:

```scala
val interpArguments = List(
  "-Yrepl-class-based",
  "-Yrepl-outdir", s"${outputDir.getAbsolutePath}",
  "-classpath", jars
) ++ args.toList
```
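
A sketch of how these arguments are then consumed; names follow the earlier snippets, and the exact interpreter call varies between Scala versions (`process` vs. `run`):

```scala
// Sketch: parse the REPL arguments into compiler settings, then start the loop.
settings.processArguments(interpArguments, processAll = true)
if (!hasErrors) {
  interp.process(settings)  // blocks until the user exits the shell
}
```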

### Signal Handling Integration

Automatically registers interrupt signal handling:

```scala
Signaling.cancelOnInterrupt()
```