Java annotations module for Apache Spark test categorization and API stability markers
```bash
npx @tessl/cli install tessl/maven-org-apache-spark--spark-tags_2-13@3.5.0
```

Apache Spark Tags is a Java annotations library that provides annotation interfaces for test categorization and API stability markers within the Apache Spark ecosystem. It includes annotations for organizing Spark's extensive test suites and communicating the maturity and expected evolution of Spark's public APIs.
The annotations are designed to mark APIs as experimental or intended only for advanced usages by developers, and are used project-wide across Spark, reflected in both Scala and Java documentation.
For Maven, add the dependency to your `pom.xml`:

```xml
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-tags_2.13</artifactId>
  <version>3.5.6</version>
</dependency>
```

For Gradle:

```groovy
implementation 'org.apache.spark:spark-tags_2.13:3.5.6'
```

```java
import org.apache.spark.annotation.*;
import org.apache.spark.tags.*;
```

Use these annotations to mark the stability level of your Spark APIs:
```java
import org.apache.spark.annotation.Experimental;
import org.apache.spark.annotation.Stable;
import org.apache.spark.annotation.DeveloperApi;

@Experimental
public class MyNewFeature {

    @Stable
    public void stableMethod() {
        // Implementation
    }

    @DeveloperApi
    public void advancedDeveloperMethod() {
        // Implementation
    }
}
```

Use these annotations to categorize your tests for selective execution:
```java
import org.apache.spark.tags.SlowHiveTest;
import org.apache.spark.tags.DockerTest;
import org.apache.spark.tags.ExtendedSQLTest;
import org.junit.Test; // JUnit 4 assumed for the @Test annotation

// Class-level tag: applies to every test in the suite
@SlowHiveTest
public class MyHiveIntegrationTest {
    // Test implementation
}

// Method-level tags (the wrapper class is added here only so the snippet compiles)
public class SelectiveTests {

    @DockerTest
    @Test
    public void testWithDocker() {
        // Test requiring Docker
    }

    @ExtendedSQLTest
    @Test
    public void testComplexSQL() {
        // Extended SQL test
    }
}
```

Apache Spark Tags consists of two main annotation packages:
- API stability annotations (`org.apache.spark.annotation`): annotations for marking API maturity levels and stability guarantees
- Test category annotations (`org.apache.spark.tags`): ScalaTest-compatible annotations for categorizing and filtering test execution

All annotations use runtime retention. The stability annotations target a broad range of Java language elements (types, fields, methods, parameters, constructors, local variables, and packages), while the test tags target methods and types.
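Because every annotation in the module has runtime retention, annotated elements can be inspected reflectively. The following is a minimal sketch; the `SampleFeature` and `AnnotationInspector` classes are hypothetical, defined only for the demonstration:

```java
import org.apache.spark.annotation.Experimental;

public class AnnotationInspector {

    // Hypothetical class carrying the marker, defined only for this demonstration.
    @Experimental
    static class SampleFeature {}

    public static void main(String[] args) {
        // RetentionPolicy.RUNTIME makes the annotation visible to reflection.
        boolean experimental = SampleFeature.class.isAnnotationPresent(Experimental.class);
        System.out.println("SampleFeature is experimental: " + experimental); // prints: true
    }
}
```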
Annotations for communicating API maturity and stability expectations to users and developers.
```java
/**
 * An experimental user-facing API.
 * Experimental API's might change or be removed in minor versions of Spark,
 * or be adopted as first-class Spark API's.
 *
 * NOTE: If there exists a Scaladoc comment that immediately precedes this annotation, the first
 * line of the comment must be ":: Experimental ::" with no trailing blank line. This is because
 * of the known issue that Scaladoc displays only either the annotation or the comment, whichever
 * comes first.
 */
@Documented
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.TYPE, ElementType.FIELD, ElementType.METHOD, ElementType.PARAMETER,
    ElementType.CONSTRUCTOR, ElementType.LOCAL_VARIABLE, ElementType.PACKAGE})
public @interface Experimental {}
```

```java
/**
 * Stable APIs that retain source and binary compatibility within a major release.
 * These interfaces can change from one major release to another major release
 * (e.g. from 1.0 to 2.0).
 */
@Documented
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.TYPE, ElementType.FIELD, ElementType.METHOD, ElementType.PARAMETER,
    ElementType.CONSTRUCTOR, ElementType.LOCAL_VARIABLE, ElementType.PACKAGE})
public @interface Stable {}
```

```java
/**
 * APIs that are meant to evolve towards becoming stable APIs, but are not stable APIs yet.
 * Evolving interfaces can change from one feature release to another release (i.e. 2.1 to 2.2).
 */
@Documented
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.TYPE, ElementType.FIELD, ElementType.METHOD, ElementType.PARAMETER,
    ElementType.CONSTRUCTOR, ElementType.LOCAL_VARIABLE, ElementType.PACKAGE})
public @interface Evolving {}
```

```java
/**
 * Unstable APIs, with no guarantee on stability.
 * Classes that are unannotated are considered Unstable.
 */
@Documented
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.TYPE, ElementType.FIELD, ElementType.METHOD, ElementType.PARAMETER,
    ElementType.CONSTRUCTOR, ElementType.LOCAL_VARIABLE, ElementType.PACKAGE})
public @interface Unstable {}
```

```java
/**
 * A lower-level, unstable API intended for developers.
 * Developer API's might change or be removed in minor versions of Spark.
 *
 * NOTE: If there exists a Scaladoc comment that immediately precedes this annotation, the first
 * line of the comment must be ":: DeveloperApi ::" with no trailing blank line. This is because
 * of the known issue that Scaladoc displays only either the annotation or the comment, whichever
 * comes first.
 */
@Documented
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.TYPE, ElementType.FIELD, ElementType.METHOD, ElementType.PARAMETER,
    ElementType.CONSTRUCTOR, ElementType.LOCAL_VARIABLE, ElementType.PACKAGE})
public @interface DeveloperApi {}
```

```java
/**
 * A new component of Spark which may have unstable API's.
 *
 * NOTE: If there exists a Scaladoc comment that immediately precedes this annotation, the first
 * line of the comment must be ":: AlphaComponent ::" with no trailing blank line. This is because
 * of the known issue that Scaladoc displays only either the annotation or the comment, whichever
 * comes first.
 */
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.TYPE, ElementType.FIELD, ElementType.METHOD, ElementType.PARAMETER,
    ElementType.CONSTRUCTOR, ElementType.LOCAL_VARIABLE, ElementType.PACKAGE})
public @interface AlphaComponent {}
```

```java
/**
 * A class that is considered private to the internals of Spark -- there is a high-likelihood
 * they will be changed in future versions of Spark.
 * This should be used only when the standard Scala / Java means of protecting classes are
 * insufficient. In particular, Java has no equivalent of private[spark], so we use this annotation
 * in its place.
 *
 * NOTE: If there exists a Scaladoc comment that immediately precedes this annotation, the first
 * line of the comment must be ":: Private ::" with no trailing blank line. This is because
 * of the known issue that Scaladoc displays only either the annotation or the comment, whichever
 * comes first.
 */
@Documented
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.TYPE, ElementType.FIELD, ElementType.METHOD, ElementType.PARAMETER,
    ElementType.CONSTRUCTOR, ElementType.LOCAL_VARIABLE, ElementType.PACKAGE})
public @interface Private {}
```

ScalaTest-compatible annotations for categorizing tests to enable selective test execution during Spark development and CI/CD processes.
```java
/**
 * Tags slow Hive-related tests
 */
@TagAnnotation
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.METHOD, ElementType.TYPE})
public @interface SlowHiveTest {}
```

```java
/**
 * Tags slow SQL tests
 */
@TagAnnotation
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.METHOD, ElementType.TYPE})
public @interface SlowSQLTest {}
```

```java
/**
 * Tags extended SQL tests
 */
@TagAnnotation
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.METHOD, ElementType.TYPE})
public @interface ExtendedSQLTest {}
```

```java
/**
 * Tags extended Hive-related tests
 */
@TagAnnotation
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.METHOD, ElementType.TYPE})
public @interface ExtendedHiveTest {}
```

```java
/**
 * Tags extended LevelDB-related tests
 */
@TagAnnotation
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.METHOD, ElementType.TYPE})
public @interface ExtendedLevelDBTest {}
```

```java
/**
 * Tags extended YARN-related tests
 */
@TagAnnotation
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.METHOD, ElementType.TYPE})
public @interface ExtendedYarnTest {}
```

```java
/**
 * Tags tests that require Docker
 */
@TagAnnotation
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.METHOD, ElementType.TYPE})
public @interface DockerTest {}
```

```java
/**
 * Tags UI tests that require Chrome browser
 */
@TagAnnotation
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.METHOD, ElementType.TYPE})
public @interface ChromeUITest {}
```

All annotations use standard Java annotation interfaces and elements:

```java
import java.lang.annotation.Documented;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
```

Test category annotations integrate with ScalaTest:

```java
import org.scalatest.TagAnnotation;
```

Use API stability annotations to communicate the maturity and evolution expectations of your Spark components:
```java
import org.apache.spark.annotation.DeveloperApi;
import org.apache.spark.annotation.Evolving;
import org.apache.spark.annotation.Experimental;
import org.apache.spark.annotation.Stable;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;

// Mark experimental features that may change
@Experimental
public class StreamingMLAlgorithm {

    // Mark stable core methods
    @Stable
    public void fit(Dataset<Row> data) {
        // Implementation
    }

    // Mark evolving APIs moving toward stability
    // (MLModel is a placeholder type used only for illustration)
    @Evolving
    public MLModel getModel() {
        // Implementation
        return null;
    }

    // Mark internal developer APIs
    @DeveloperApi
    public void internalOptimize() {
        // Implementation
    }
}
```

Use test category annotations to organize tests for selective execution:
```java
import org.apache.spark.tags.DockerTest;
import org.apache.spark.tags.ExtendedSQLTest;
import org.apache.spark.tags.SlowHiveTest;
import org.junit.Test; // JUnit 4 assumed for the @Test annotation

// Slow tests that should be skipped in fast CI runs
@SlowHiveTest
public class HiveIntegrationTest {
    // Test implementation
}

// Tests requiring external infrastructure
@DockerTest
public class KafkaIntegrationTest {
    @Test
    public void testKafkaConnectivity() {
        // Test requiring Docker-based Kafka
    }
}

// Extended test suites for comprehensive validation
@ExtendedSQLTest
public class ComplexSQLTest {
    @Test
    public void testComplexJoins() {
        // Extended SQL functionality tests
    }
}
```

Annotations can be combined as needed:
```java
import org.apache.spark.annotation.DeveloperApi;
import org.apache.spark.annotation.Experimental;
import org.apache.spark.tags.ExtendedHiveTest;
import org.apache.spark.tags.SlowHiveTest;

@Experimental
@DeveloperApi
public class AdvancedInternalFeature {
    // Experimental developer-only functionality
}

@SlowHiveTest
@ExtendedHiveTest
public class ComprehensiveHiveTest {
    // Tests that are both slow and extended
}
```

These annotations do not throw exceptions; they are purely metadata markers. However, they enable:

- selective test execution, when a test runner includes or excludes suites by tag
- documentation of API stability expectations in generated Scaladoc and Javadoc
- runtime inspection of annotated elements via reflection
Dependencies: the Scala standard library (scala-library, at the version specified by ${scala.version}).
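Since the test tags are ordinary Java annotations meta-annotated with ScalaTest's `@TagAnnotation`, a project can define additional tags following the same pattern. The sketch below is a hypothetical example; the `RequiresKerberosTest` name is not part of spark-tags:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import org.scalatest.TagAnnotation;

/**
 * Hypothetical project-specific tag, mirroring the pattern used by the
 * built-in spark-tags annotations such as SlowHiveTest and DockerTest.
 */
@TagAnnotation
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.METHOD, ElementType.TYPE})
public @interface RequiresKerberosTest {}
```

Test runners that understand ScalaTest tag annotations can then include or exclude suites carrying the custom tag in the same way as the built-in ones.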