tessl/maven-com-github-spotbugs--spotbugs-annotations

Annotations the SpotBugs tool supports for static analysis control and null safety


Testing Annotations

Control expected warnings and analysis behavior when testing static analysis rules. These annotations are primarily used for testing SpotBugs itself and for validating static analysis behavior in a codebase.

Note: Most testing annotations in this package are deprecated as of SpotBugs 4.x in favor of programmatic test matchers using edu.umd.cs.findbugs.test.matcher.BugInstanceMatcher. The annotation-based approach has limitations with modern Java features like lambdas.
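The recommended replacement looks roughly like the sketch below. It assumes the SpotBugs test-harness module is on the test classpath; the class and method names (`SpotBugsRule`, `BugInstanceMatcherBuilder`, `CountMatcher`) come from that module and should be verified against the SpotBugs version in use, and the class-file path is a placeholder for your build layout.

```java
// Sketch only: assumes the SpotBugs test-harness artifact on the test classpath.
// Verify class/method names against your SpotBugs version; the analyzed
// class-file path below is a hypothetical example.
import edu.umd.cs.findbugs.BugCollection;
import edu.umd.cs.findbugs.test.SpotBugsRule;
import edu.umd.cs.findbugs.test.matcher.BugInstanceMatcher;
import edu.umd.cs.findbugs.test.matcher.BugInstanceMatcherBuilder;

public class NullDereferenceDetectorTest {
    @org.junit.Rule
    public SpotBugsRule spotbugs = new SpotBugsRule();

    @org.junit.Test
    public void detectsNullDereference() {
        // Run analysis on compiled classes rather than annotating the source
        BugCollection bugs = spotbugs.performAnalysis(
                java.nio.file.Paths.get("build/classes/java/main/AnalysisTestClass.class"));
        BugInstanceMatcher matcher = new BugInstanceMatcherBuilder()
                .bugType("NP_NULL_ON_SOME_PATH")
                .inMethod("methodWithNullIssue")
                .build();
        org.hamcrest.MatcherAssert.assertThat(bugs,
                edu.umd.cs.findbugs.test.CountMatcher.containsExactly(1, matcher));
    }
}
```

Because the matcher is built programmatically, it works for warnings reported inside lambdas, where the annotation-based approach cannot be applied.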

Capabilities

ExpectWarning Annotation

Indicate that SpotBugs should generate a warning at the annotated location.

/**
 * Annotation indicating that a FindBugs warning is expected.
 * 
 * @deprecated The annotation based approach is useless for lambdas. 
 * Write expectations using edu.umd.cs.findbugs.test.matcher.BugInstanceMatcher 
 * matchers in test source directory
 */
@Deprecated
@Retention(RetentionPolicy.CLASS)
@interface ExpectWarning {
    /**
     * The value indicates the bug code (e.g., NP) or bug pattern (e.g.,
     * IL_INFINITE_LOOP) of the expected warning. Can be a comma-separated list.
     */
    String value();

    /** Want a warning at this priority or higher */
    Confidence confidence() default Confidence.LOW;

    /** Expect a warning at least this scary */
    int rank() default 20;

    /** Expect at least this many warnings */
    int num() default 1;
}

Usage Examples:

public class AnalysisTestClass {
    
    // Expect null pointer warning
    @ExpectWarning("NP_NULL_ON_SOME_PATH")
    public void methodWithNullIssue() {
        String str = null;
        System.out.println(str.length()); // Should trigger null pointer warning
    }
    
    // Expect multiple warnings (note: deprecated approach)
    @ExpectWarning(value = "EI_EXPOSE_REP,EI_EXPOSE_REP2", num = 2)
    public Date[] getDatesWithExposure() {
        return internalDates; // Should trigger representation exposure warnings
    }
    
    // Expect specific number of warnings
    @ExpectWarning(value = "URF_UNREAD_FIELD", num = 3)
    public class ClassWithUnreadFields {
        private String unreadField1; // Should trigger warning
        private String unreadField2; // Should trigger warning
        private String unreadField3; // Should trigger warning
        private String usedField;    // Should not trigger warning
        
        public void useField() {
            System.out.println(usedField);
        }
    }
    
    // Test resource leak detection
    @ExpectWarning("OS_OPEN_STREAM")
    public void methodWithResourceLeak() throws IOException {
        FileInputStream stream = new FileInputStream("test.txt");
        // Stream not closed - should trigger warning
        stream.read();
    }
}

// Package-level testing (package annotations must be declared in package-info.java)
@ExpectWarning(value = "UWF_UNWRITTEN_FIELD", num = 5)
package com.example.test;

// Class-level testing
@ExpectWarning("SE_BAD_FIELD")
public class SerializationTestClass implements Serializable {
    private Object nonSerializableField; // Non-transient, non-serializable field: should trigger warning
}
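All of these annotations use `@Retention(RetentionPolicy.CLASS)`: they are recorded in the compiled `.class` file, where SpotBugs reads them during bytecode analysis, but they are not loaded by the JVM and are therefore invisible to reflection. A minimal, self-contained sketch of that distinction (the annotation names here are illustrative, not part of the SpotBugs API):

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;

public class RetentionDemo {
    // Written to the .class file only, like the SpotBugs testing annotations
    @Retention(RetentionPolicy.CLASS)
    @interface ClassRetained {}

    // Loaded by the JVM and visible to reflection
    @Retention(RetentionPolicy.RUNTIME)
    @interface RuntimeRetained {}

    @ClassRetained
    @RuntimeRetained
    static void annotated() {}

    public static void main(String[] args) throws Exception {
        var method = RetentionDemo.class.getDeclaredMethod("annotated");
        // CLASS-retained annotation is not visible at runtime
        System.out.println(method.getAnnotation(ClassRetained.class) == null);    // true
        // RUNTIME-retained annotation is visible
        System.out.println(method.getAnnotation(RuntimeRetained.class) != null);  // true
    }
}
```

This is why tools that consume these annotations must inspect bytecode directly, as SpotBugs does, rather than querying them reflectively at runtime.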

NoWarning Annotation

Indicate that SpotBugs should not generate warnings at the annotated location.

/**
 * Annotation indicating that no FindBugs warning is expected.
 * 
 * @deprecated The annotation based approach is useless for lambdas. 
 * Write expectations using edu.umd.cs.findbugs.test.matcher.BugInstanceMatcher 
 * matchers in test source directory
 */
@Deprecated
@Retention(RetentionPolicy.CLASS)
@interface NoWarning {
    /**
     * The value indicates the bug code (e.g., NP) or bug pattern (e.g.,
     * IL_INFINITE_LOOP) that should not be reported
     */
    String value();

    /** Want no warning at this priority or higher */
    Confidence confidence() default Confidence.LOW;

    /** Want no warning at this rank or scarier */
    int rank() default 20;

    /** Tolerate up to this many warnings */
    int num() default 0;
}

Usage Examples:

public class SafeCodeTestClass {
    
    // This method should not trigger null pointer warnings
    @NoWarning("NP_NULL_ON_SOME_PATH")
    public void safeNullHandling(@Nullable String input) {
        if (input != null) { // Proper null check
            System.out.println(input.length());
        }
    }
    
    // No resource leak warnings expected
    @NoWarning("OS_OPEN_STREAM")
    public void properResourceHandling() throws IOException {
        try (FileInputStream stream = new FileInputStream("test.txt")) {
            stream.read(); // Properly closed via try-with-resources
        }
    }
    
    // No representation exposure warnings (value is a single comma-separated String)
    @NoWarning("EI_EXPOSE_REP,EI_EXPOSE_REP2")
    public Date[] getSafeDatesCopy() {
        return Arrays.copyOf(internalDates, internalDates.length); // Returns copy
    }
    
    // value() has no default, so a bug code or pattern must always be supplied
    @NoWarning("NP")
    public void perfectlyCleanMethod(@NonNull String input) {
        String processed = input.trim().toLowerCase();
        logger.info("Processed: " + processed);
    }
}

// Class-level no warnings
@NoWarning("SE_BAD_FIELD")
public class WellDesignedSerializable implements Serializable {
    private static final long serialVersionUID = 1L;
    private String name; // Properly serializable field
    private transient Logger logger = LoggerFactory.getLogger(getClass()); // Transient non-serializable
}

DesireWarning Annotation

Indicate that you would like SpotBugs to generate a warning (for testing detector development).

/**
 * Annotation indicating that a FindBugs warning is desired.
 * 
 * @deprecated The annotation based approach is useless for lambdas. 
 * Write expectations using edu.umd.cs.findbugs.test.matcher.BugInstanceMatcher 
 * matchers in test source directory
 */
@Deprecated
@Retention(RetentionPolicy.CLASS)
@interface DesireWarning {
    /**
     * The value indicates the bug code (e.g., NP) or bug pattern (e.g.,
     * IL_INFINITE_LOOP) of the desired warning
     */
    String value();

    /** Want a warning at this priority or higher */
    Confidence confidence() default Confidence.LOW;

    /** Desire a warning at least this scary */
    int rank() default 20;

    /** Desire at least this many warnings */
    int num() default 1;
}

Usage Examples (the bug pattern names below are illustrative placeholders, not real SpotBugs patterns):

public class DetectorTestClass {
    
    // Would like SpotBugs to detect this potential issue
    @DesireWarning("POTENTIAL_SECURITY_ISSUE")
    public void methodWithPotentialSecurityProblem() {
        // Code that might have security implications
        String userInput = getUserInput();
        executeCommand(userInput); // Potentially dangerous
    }
    
    // Desire warning for performance issue
    @DesireWarning("PERFORMANCE_ISSUE")
    public void inefficientMethod() {
        for (int i = 0; i < 1000000; i++) {
            String result = ""; // String concatenation in loop
            for (int j = 0; j < 100; j++) {
                result += "x"; // Inefficient string building
            }
        }
    }
    
    // Multiple desired warnings (comma-separated, since value is a single String)
    @DesireWarning(value = "MEMORY_LEAK,RESOURCE_LEAK", num = 2)
    public void methodWithMultipleIssues() throws IOException {
        List<Object> list = new ArrayList<>();
        while (true) {
            list.add(new LargeObject()); // Potential memory leak
            if (someCondition()) {
                FileInputStream stream = new FileInputStream("file.txt");
                // Stream not closed - resource leak
                break;
            }
        }
    }
}

DesireNoWarning Annotation

Indicate that you would prefer SpotBugs not to generate warnings (for testing false positive reduction).

/**
 * Annotation indicating that no FindBugs warning of the specified type is desired.
 * 
 * @deprecated The annotation based approach is useless for lambdas. 
 * Write expectations using edu.umd.cs.findbugs.test.matcher.BugInstanceMatcher 
 * matchers in test source directory
 */
@Deprecated
@Retention(RetentionPolicy.CLASS)
@interface DesireNoWarning {
    /**
     * The value indicates the bug code (e.g., NP) or bug pattern (e.g.,
     * IL_INFINITE_LOOP) that is desired to not be reported
     */
    String value();

    /** @deprecated - use confidence instead */
    @Deprecated
    Priority priority() default Priority.LOW;

    /** Want no warning at this priority or higher */
    Confidence confidence() default Confidence.LOW;

    /** Tolerate up to this many warnings */
    int num() default 0;
}

Usage Examples:

public class FalsePositiveTestClass {
    
    // This should not trigger null pointer warnings (complex but safe logic)
    @DesireNoWarning("NP_NULL_ON_SOME_PATH")
    public void complexButSafeNullHandling(@Nullable String input) {
        // Complex logic that is actually safe but might confuse analysis
        String processed = Optional.ofNullable(input)
                                 .filter(s -> !s.isEmpty())
                                 .map(String::trim)
                                 .orElse("default");
        System.out.println(processed.length()); // Safe due to orElse
    }
    
    // Should not warn about unused field (used via reflection)
    @DesireNoWarning("URF_UNREAD_FIELD")
    public class ReflectionBasedClass {
        @SuppressWarnings("unused") // Used via reflection
        private String reflectionField = "value";
        
        // Field is accessed via reflection in framework code
    }
    
    // Should not warn about synchronization (external synchronization)
    @DesireNoWarning("IS_FIELD_NOT_GUARDED")
    public class ExternallySynchronizedClass {
        private int counter; // Synchronized externally
        
        // This field is protected by external synchronization mechanism
        public void increment() {
            counter++; // Safe due to external synchronization
        }
    }
}
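Note the asymmetry in the `num` defaults across the four annotations: the "expect/desire" annotations require at least `num` matching warnings (default 1), while the "no warning" annotations tolerate at most `num` matching warnings (default 0). Expressed as plain Java (an illustrative helper, not part of the SpotBugs API):

```java
public class WarningCountSemantics {
    // @ExpectWarning / @DesireWarning: at least `num` matching warnings must be found
    static boolean expectationMet(int found, int num) {
        return found >= num;
    }

    // @NoWarning / @DesireNoWarning: at most `num` matching warnings are tolerated
    static boolean toleranceMet(int found, int num) {
        return found <= num;
    }

    public static void main(String[] args) {
        System.out.println(expectationMet(2, 1)); // true: two warnings satisfy "at least 1"
        System.out.println(toleranceMet(1, 0));   // false: one warning violates "at most 0"
    }
}
```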

Testing Workflow Integration

Test Suite Integration

/**
 * Test class for validating SpotBugs detector behavior
 */
public class SpotBugsDetectorTest {
    
    // Test that detector correctly identifies issues
    @Test
    public void testDetectorFindsIssues() {
        // Methods annotated with @ExpectWarning should generate warnings
        testMethodWithExpectedWarnings();
    }
    
    @ExpectWarning(value = "NULL_DEREFERENCE", num = 1)
    private void testMethodWithExpectedWarnings() {
        String str = null;
        str.length(); // Should be detected
    }
    
    // Test that detector doesn't generate false positives
    @Test 
    public void testDetectorAvoidsFalsePositives() {
        // Methods annotated with @NoWarning should not generate warnings
        testSafeMethod();
    }
    
    @NoWarning("NULL_DEREFERENCE")
    private void testSafeMethod() {
        String str = getString();
        if (str != null) {
            str.length(); // Safe usage
        }
    }
}

Continuous Integration

// Package-level testing configuration (declared in package-info.java;
// SECURITY_ISSUE is an illustrative pattern name)
@NoWarning("SECURITY_ISSUE,NP_NULL_ON_SOME_PATH,OS_OPEN_STREAM") // These should be clean
package com.example.production;

Custom Detector Validation

public class CustomDetectorTest {
    
    // Test new detector finds the issue
    @DesireWarning("CUSTOM_PATTERN_ISSUE")
    public void methodWithCustomIssue() {
        // Code that should trigger custom detector
        customPatternThatShouldBeDetected();
    }
    
    // Test that custom detector doesn't over-report
    @DesireNoWarning("CUSTOM_PATTERN_ISSUE")
    public void methodWithoutCustomIssue() {
        // Similar code that should not trigger detector
        safeCustomPattern();
    }
}

Best Practices

  1. Use in test code: These annotations are primarily for testing, not production code
  2. Validate detectors: Use @ExpectWarning to ensure detectors find real issues
  3. Test false positives: Use @NoWarning to verify clean code doesn't trigger warnings
  4. Guide development: Use @DesireWarning to specify what new detectors should find
  5. Refine analysis: Use @DesireNoWarning to identify false positive patterns
  6. Document test intent: Include comments explaining why warnings are expected or not expected
  7. Regular validation: Run tests regularly to ensure detector behavior remains consistent
  8. Version control: Track changes in expected warnings as code evolves

Install with Tessl CLI

npx tessl i tessl/maven-com-github-spotbugs--spotbugs-annotations
