# Command Line Configuration

Command line option handling and settings management for Spark-specific REPL configurations, compiler settings, and initialization parameters.

## Capabilities

### SparkCommandLine Class

Command class that extends Scala compiler command functionality with Spark-specific options and settings management.

```scala { .api }
/**
 * Command class enabling Spark-specific command line options
 * @param args List of command line arguments
 * @param settings Compiler settings instance
 */
@DeveloperApi
class SparkCommandLine(
  args: List[String],
  override val settings: Settings
) extends CompilerCommand(args, settings)

/**
 * Alternative constructor with an error handling function
 * @param args Command line arguments
 * @param error Error handling function
 */
def this(args: List[String], error: String => Unit)

/**
 * Constructor with command line arguments only
 * @param args Command line arguments
 */
def this(args: List[String])

/**
 * Compiler settings for the command
 */
override val settings: Settings
```

**Usage Examples:**

```scala
import org.apache.spark.repl.SparkCommandLine
import scala.tools.nsc.Settings

// Create with custom settings
val settings = new Settings()
val commandLine = new SparkCommandLine(
  List("-classpath", "/path/to/classes", "-deprecation"),
  settings
)

// Create with an error handler
val errorHandler = (msg: String) => System.err.println(s"Error: $msg")
val cmdWithError = new SparkCommandLine(
  List("-invalid-option"),
  errorHandler
)

// Simple creation
val simpleCmd = new SparkCommandLine(List("-verbose"))
```

### Settings Access

The SparkCommandLine provides access to parsed compiler settings that can be used to configure the Spark REPL environment.

```scala { .api }
/**
 * Access parsed compiler settings.
 * Contains all standard Scala compiler options plus Spark-specific configurations.
 */
val settings: Settings
```

**Common Settings Usage:**

```scala
// Access classpath settings
val classpath = commandLine.settings.classpath.value
println(s"Classpath: $classpath")

// Check for specific flags
if (commandLine.settings.deprecation.value) {
  println("Deprecation warnings enabled")
}

// Access the output directory
val outputDir = commandLine.settings.outputDirs.getSingleOutput
outputDir.foreach(dir => println(s"Output directory: $dir"))

// Check the source-compatibility version setting
val sourceVersion = commandLine.settings.source.value
println(s"Source version: $sourceVersion")
```

### SparkHelper Utility

Utility object providing access to compiler settings and parent class loader configuration.

```scala { .api }
/**
 * Utility object for Spark REPL helper functions
 */
@DeveloperApi
object SparkHelper {
  /**
   * Get the explicit parent class loader from settings
   * @param settings Compiler settings
   * @return Optional parent ClassLoader
   */
  @DeveloperApi
  def explicitParentLoader(settings: Settings): Option[ClassLoader]
}
```

**Usage Examples:**

```scala
import scala.tools.nsc.{Settings, SparkHelper}

// Get the parent class loader from settings
val settings = new Settings()
settings.usejavacp.value = true

val parentLoader = SparkHelper.explicitParentLoader(settings)
parentLoader match {
  case Some(loader) => println(s"Using parent loader: ${loader.getClass}")
  case None => println("No explicit parent loader configured")
}
```
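When `explicitParentLoader` returns `None`, callers typically need a fallback. The sketch below illustrates the usual pattern of defaulting to the current thread's context class loader; `chooseLoader` is a hypothetical helper written for this example, not part of Spark's API:

```scala
// Hypothetical helper: use the explicit parent loader when one is
// configured, otherwise fall back to the thread's context class loader.
def chooseLoader(explicit: Option[ClassLoader]): ClassLoader =
  explicit.getOrElse(Thread.currentThread().getContextClassLoader)

// In real use: chooseLoader(SparkHelper.explicitParentLoader(settings))
```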

## Integration with REPL Components

### Compiler Integration

SparkCommandLine integrates with the Spark interpreter to provide proper compiler configuration:

```scala
import org.apache.spark.repl.{SparkCommandLine, SparkIMain}

// Parse command line arguments
val args = List("-classpath", "/spark/lib/*", "-deprecation")
val commandLine = new SparkCommandLine(args)

// Use the parsed settings with the interpreter
val interpreter = new SparkIMain(commandLine.settings)
interpreter.initializeSynchronous()
```

### REPL Loop Integration

Command line options are processed by SparkILoop during initialization:

```scala
import org.apache.spark.repl.SparkILoop

// The REPL processes command line arguments
val repl = new SparkILoop()
val args = Array("-i", "init.scala", "-classpath", "/app/lib/*")

// Command line parsing happens during process()
repl.process(args)
```
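An argument array like the one above mixes REPL launcher flags (such as `-i`) with plain compiler options. As a rough illustration of that split, the sketch below separates the two groups; `partitionArgs` is a hypothetical function written for this example and is not Spark's actual parsing logic:

```scala
// REPL-only flags that consume one value each (illustrative subset)
val replOnly = Set("-i", "-e")

// Hypothetical sketch: split REPL launcher flags (with their values)
// from the remaining compiler options, preserving order.
def partitionArgs(args: List[String]): (List[String], List[String]) =
  args match {
    case flag :: value :: rest if replOnly(flag) =>
      val (r, c) = partitionArgs(rest)
      (flag :: value :: r, c)
    case other :: rest =>
      val (r, c) = partitionArgs(rest)
      (r, other :: c)
    case Nil => (Nil, Nil)
  }

partitionArgs(List("-i", "init.scala", "-classpath", "/app/lib/*"))
// (List(-i, init.scala), List(-classpath, /app/lib/*))
```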

## Standard Command Line Options

The SparkCommandLine supports all standard Scala compiler options plus Spark-specific extensions:

### Classpath Options

- `-classpath <path>` or `-cp <path>`: Set the classpath for compilation
- `-sourcepath <path>`: Set the source path for finding source files

### Compilation Options

- `-d <directory>`: Destination directory for compiled classes
- `-encoding <encoding>`: Character encoding for source files
- `-target:<version>`: Target JVM version (e.g., `jvm-1.8`)

### Warning and Error Options

- `-deprecation`: Enable deprecation warnings
- `-feature`: Enable feature warnings
- `-unchecked`: Enable unchecked warnings
- `-Werror`: Treat warnings as errors

### Debug and Verbose Options

- `-verbose`: Enable verbose compiler output
- `-Xprint:<phases>`: Print code after the specified compiler phases
- `-Ylog:<phases>`: Log the specified compiler phases

### Memory and Performance Options

These are JVM options rather than compiler options; they are typically forwarded to the underlying JVM (for example with the `-J` prefix in the Scala launcher, as in `-J-Xmx2g`):

- `-Xmx<size>`: Maximum heap size
- `-Xms<size>`: Initial heap size

### REPL-Specific Options

- `-i <file>`: Preload a file into the REPL session
- `-e <expression>`: Execute an expression and exit
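When constructing argument lists programmatically, the flag groups above compose naturally from optional pieces. The sketch below illustrates this; `replArgs` is a hypothetical convenience written for this example, not part of Spark's API:

```scala
// Hypothetical helper: assemble a REPL argument list from optional parts,
// mirroring the flag groups described above.
def replArgs(
    classpath: Option[String],
    initScript: Option[String],
    warnings: Boolean
): List[String] = {
  val cp = classpath.toList.flatMap(p => List("-classpath", p))
  val warn = if (warnings) List("-deprecation", "-feature", "-unchecked") else Nil
  val init = initScript.toList.flatMap(f => List("-i", f))
  cp ++ warn ++ init
}

replArgs(Some("/app/lib/*"), Some("setup.scala"), warnings = true)
// List(-classpath, /app/lib/*, -deprecation, -feature, -unchecked, -i, setup.scala)
```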

**Example Command Line Usage:**

```bash
# Start the REPL with a custom classpath and an initialization script
spark-shell -classpath "/app/lib/*" -i "setup.scala"

# Enable all warnings and use a specific encoding
spark-shell -deprecation -feature -unchecked -encoding "UTF-8"

# Debug mode with verbose output
spark-shell -verbose -Ylog:typer
```

### Programmatic Usage

```scala
// Parse a more complex command line
val complexArgs = List(
  "-classpath", "/spark/lib/*:/app/lib/*",
  "-deprecation",
  "-feature",
  "-usejavacp",
  "-encoding", "UTF-8"
)

val commandLine = new SparkCommandLine(complexArgs)

// Extract specific settings
val useJavaCP = commandLine.settings.usejavacp.value
val encoding = commandLine.settings.encoding.value
val deprecation = commandLine.settings.deprecation.value

println(s"Use Java classpath: $useJavaCP")
println(s"Encoding: $encoding")
println(s"Deprecation warnings: $deprecation")
```
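Because the two-argument constructor accepts a plain `String => Unit` error handler, parse errors can be collected for inspection instead of printed. The sketch below shows that pattern; `collectErrors` is a hypothetical helper written for this example, and the `SparkCommandLine` call it would feed is shown in a comment:

```scala
import scala.collection.mutable.ListBuffer

// Hypothetical helper: pair a buffer with an error handler that appends
// to it, suitable for the `String => Unit` constructor parameter.
def collectErrors(): (ListBuffer[String], String => Unit) = {
  val errors = ListBuffer.empty[String]
  val handler: String => Unit = msg => { errors += msg; () }
  (errors, handler)
}

val (errors, handler) = collectErrors()
// In real use: new SparkCommandLine(List("-no-such-option"), handler)
handler("bad option: -no-such-option")
if (errors.nonEmpty) println(s"Invalid options: ${errors.mkString("; ")}")
```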