Interactive Scala shell for Apache Spark with distributed computing capabilities

---

Signal handling utilities for interactive job cancellation and REPL interrupt management. Provides graceful job cancellation in the REPL environment.
```scala
/**
 * Signal handling utilities for REPL interrupt management.
 * Provides SIGINT handling to cancel running Spark jobs.
 */
private[repl] object Signaling extends Logging {

  /**
   * Register a SIGINT handler that cancels all active Spark jobs,
   * or exits when no jobs are currently running.
   * Makes it possible to interrupt a running shell job by pressing Ctrl+C.
   */
  def cancelOnInterrupt(): Unit
}
```

Usage Examples:
```scala
import org.apache.spark.repl.Signaling

// Register interrupt handler (typically called during REPL startup)
Signaling.cancelOnInterrupt()

// Now Ctrl+C will:
// 1. Cancel active Spark jobs if any are running
// 2. Exit the REPL if no jobs are active
```

The interrupt handler provides intelligent behavior based on Spark job status:
When Spark Jobs Are Active: all active jobs are cancelled, a warning is logged, and the REPL stays alive; pressing Ctrl+C again exits.
When No Jobs Are Active: the handler declines to handle the signal and the REPL exits normally.
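The two-branch behavior can be modeled as a small pure function, decoupled from Spark. This is a sketch only; `handleInterrupt` and the `cancelAll` callback are illustrative names, not part of the real API:

```scala
// Sketch of the handler's decision logic.
// Returns true when the signal was "handled" (jobs cancelled, don't exit),
// false when the default behavior (exit) should proceed.
def handleInterrupt(activeJobIds: Seq[Int], cancelAll: () => Unit): Boolean =
  if (activeJobIds.nonEmpty) {
    cancelAll() // cancel everything, keep the REPL alive
    true
  } else {
    false // nothing to cancel: let the process exit
  }
```

Keeping the decision pure makes it easy to test without a running SparkContext.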
The signal handler integrates with Spark's job tracking system:
```scala
def cancelOnInterrupt(): Unit = SignalUtils.register("INT") {
  SparkContext.getActive.map { ctx =>
    if (!ctx.statusTracker.getActiveJobIds().isEmpty) {
      logWarning("Cancelling all active jobs, this can take a while. " +
        "Press Ctrl+C again to exit now.")
      ctx.cancelAllJobs()
      true // Handled - don't exit yet
    } else {
      false // Not handled - allow normal exit
    }
  }.getOrElse(false) // No active context - allow normal exit
}
```

The handler uses:
- `SparkContext.getActive` to find the current Spark context
- `StatusTracker.getActiveJobIds()` to check for running jobs
- `SparkContext.cancelAllJobs()` for graceful job termination

When jobs are cancelled via Ctrl+C:
The signal handler operates safely with concurrent job execution:
When no SparkContext is available:
```scala
SparkContext.getActive.map { ctx =>
  // Handle interrupts with context
}.getOrElse(false) // No context - allow normal exit
```

Signal handling includes proper exception management:
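As a sketch of defensive exception handling (the wrapper name `safely` is an assumption for illustration, not a Spark API), a handler action can be guarded so that a failure falls back to the default exit behavior rather than leaving the signal in an undefined state:

```scala
// Sketch: run a handler action, treating any exception as "not handled"
// so the default signal behavior (exit) still applies.
def safely(action: () => Boolean): Boolean =
  try action()
  catch {
    case e: Exception =>
      Console.err.println(s"Interrupt handler failed: ${e.getMessage}")
      false // fall back to default behavior
  }
```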
Signal handling is automatically registered during REPL startup:
```scala
// In Main.scala
object Main extends Logging {
  initializeLogIfNecessary(true)
  Signaling.cancelOnInterrupt() // Register on startup
  // ... rest of initialization
}
```

Provides a smooth interactive experience:
Clear user communication during interruption:

```
Cancelling all active jobs, this can take a while. Press Ctrl+C again to exit now.
```

Uses Spark's `SignalUtils.register()` for cross-platform signal handling:
Integrates properly with JVM signal handling:
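At the JVM level, SIGINT registration ultimately relies on `sun.misc.Signal`. The sketch below shows the raw mechanism directly (the object name, counter, and messages are illustrative, and this is not Spark's `SignalUtils` implementation):

```scala
import java.util.concurrent.atomic.AtomicInteger
import sun.misc.{Signal, SignalHandler}

object SigintDemo {
  // Counts how many times SIGINT has been caught.
  val hits = new AtomicInteger(0)

  def install(): Unit =
    Signal.handle(new Signal("INT"), new SignalHandler {
      def handle(sig: Signal): Unit = {
        // A real handler would cancel jobs on the first press
        // and exit on a subsequent press.
        val n = hits.incrementAndGet()
        println(s"Caught ${sig.getName}, press #$n")
      }
    })
}
```

Note that JVM signal handlers run on a dedicated signal-dispatch thread, so any shared state they touch (like the counter above) must be thread-safe.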
While the object is private[repl], it demonstrates patterns for custom signal handling:
```scala
// Example pattern for custom signal handling
SignalUtils.register("INT") { /* custom handler */ }
SignalUtils.register("TERM") { /* termination handler */ }
```

Can be extended for more sophisticated job management:
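For example, instead of cancelling everything, a handler could target one job group via `SparkContext.cancelJobGroup(groupId)`. A pure sketch of that selection logic follows; the `jobsByGroup` map and `cancel` callback are assumptions for illustration, not Spark APIs:

```scala
// Sketch: cancel only the jobs belonging to one group, leaving others running.
// Returns true (signal handled) if anything was cancelled.
def cancelGroup(jobsByGroup: Map[String, Seq[Int]],
                group: String,
                cancel: Int => Unit): Boolean =
  jobsByGroup.get(group) match {
    case Some(jobs) if jobs.nonEmpty =>
      jobs.foreach(cancel) // cancel each job in the targeted group
      true
    case _ =>
      false // nothing to cancel in this group
  }
```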
Cancellation events are logged with appropriate levels:
```scala
logWarning("Cancelling all active jobs, this can take a while. Press Ctrl+C again to exit now.")
```

Debug-level logging for signal handling events:
Install with Tessl CLI:

```
npx tessl i tessl/maven-org-apache-spark--spark-repl-2-11
```