
semanticdb-javac: exceptions in reportException() use Kind.ERROR, breaking compilation for projects with partial classpaths (e.g. Spark) #861

@rpalcolea

Description


When semanticdb-javac encounters an exception during the ANALYZE phase (e.g. a CompletionFailure for a missing class), it calls reportException(), which reports the failure as Diagnostic.Kind.ERROR. This causes javac to exit with a non-zero status, failing the entire compilation — even though the catch block comment explicitly says "we don't want to stop the compilation".

Root cause

SemanticdbTaskListener.java lines 80–93 — the intent is correct, the implementation is not:

} catch (Throwable ex) {
    // Catch exceptions because we don't want to stop the compilation even if this
    // plugin has a bug. We report the full stack trace because it's helpful for bug reports.
    Throwable throwable = ex;
    if (e.getSourceFile() != null) {
        throwable = new CompilationUnitException(
            String.valueOf(e.getSourceFile().toUri().toString()), throwable);
    }
    this.reportException(throwable, e);  // ← ends up as Kind.ERROR → build fails
}

SemanticdbTaskListener.java lines 102–108 — reportException() calls reporter.error():

private void reportException(Throwable exception, TaskEvent e) {
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    PrintWriter pw = new PrintWriter(baos);
    exception.printStackTrace(pw);
    pw.close();
    reporter.error(baos.toString(), e.getCompilationUnit(), e.getCompilationUnit());
    //      ^^^^^ Kind.ERROR → javac exits non-zero → build fails
}
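The message passed to reporter.error() is just the rendered stack trace. That rendering step can be reproduced standalone (a minimal sketch; SemanticdbReporter, TaskEvent, and the rest of the plugin are not involved here):

```java
import java.io.ByteArrayOutputStream;
import java.io.PrintWriter;

public class StackTraceText {
  // Render a Throwable's stack trace into a String, the same way
  // reportException() builds the diagnostic message above.
  public static String render(Throwable t) {
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    PrintWriter pw = new PrintWriter(baos);
    t.printStackTrace(pw);
    pw.close(); // flush the buffered writer into baos before reading it back
    return baos.toString();
  }

  public static void main(String[] args) {
    String text = render(new IllegalStateException("boom"));
    // The first line is the exception's toString(), followed by "at ..." frames.
    System.out.println(text.startsWith("java.lang.IllegalStateException: boom"));
  }
}
```

Whatever Kind the reporter ultimately uses, this full trace ends up in the diagnostic text, which is why the output is valuable for bug reports.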

SemanticdbReporter.java lines 54–61 — error() uses Kind.ERROR:

public void error(String message, Tree tree, CompilationUnitTree root) {
    trees.printMessage(
        Diagnostic.Kind.ERROR,   // ← this is what fails the build
        String.format("semanticdb-javac: %s", message), tree, root);
}
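Why Kind.ERROR is fatal can be shown outside the plugin with the standard javax.tools API: a compilation whose diagnostics include any Kind.ERROR makes task.call() return false (javac exits non-zero), while Kind.WARNING diagnostics leave it true. A minimal sketch, not semanticdb-javac code; class names W and E are illustrative:

```java
import javax.tools.*;
import java.io.IOException;
import java.io.StringWriter;
import java.io.UncheckedIOException;
import java.net.URI;
import java.nio.file.Files;
import java.util.List;

public class KindDemo {
  // Wrap a source string as an in-memory compilation unit.
  static JavaFileObject src(String className, String code) {
    return new SimpleJavaFileObject(
        URI.create("string:///" + className + ".java"), JavaFileObject.Kind.SOURCE) {
      @Override
      public CharSequence getCharContent(boolean ignoreEncodingErrors) {
        return code;
      }
    };
  }

  // Returns true iff javac accepted the source, i.e. no Kind.ERROR was reported.
  public static boolean compiles(String className, String code) {
    try {
      JavaCompiler javac = ToolProvider.getSystemJavaCompiler();
      DiagnosticCollector<JavaFileObject> diagnostics = new DiagnosticCollector<>();
      String outDir = Files.createTempDirectory("kind-demo").toString();
      JavaCompiler.CompilationTask task = javac.getTask(
          new StringWriter(),               // swallow compiler console output
          null,                             // default file manager
          diagnostics,                      // collect diagnostics silently
          List.of("-d", outDir, "-Xlint:deprecation"),
          null,
          List.of(src(className, code)));
      return task.call();                   // false iff an ERROR diagnostic occurred
    } catch (IOException e) {
      throw new UncheckedIOException(e);
    }
  }

  public static void main(String[] args) {
    // A deprecation warning alone does not fail the build:
    System.out.println(compiles("W", "class W { Object o = new Integer(1); }"));
    // A real error (type mismatch) does:
    System.out.println(compiles("E", "class E { int x = \"oops\"; }"));
  }
}
```

This is the same mechanism the plugin triggers: printMessage with Kind.ERROR counts against the compilation exactly like a genuine source error, so the build aborts.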

Concrete trigger

Projects that use Apache Spark (or any library with anonymous inner classes compiled from Scala) have types like org.apache.spark.sql.types.DataType$1 on the Scala classpath but not on the Java compile classpath. When semanticdb-javac tries to resolve these during deep type analysis, it throws:

semanticdb-javac: CompilationUnitException: file:/path/to/BoundReflectFunction.java
  Caused by: CompletionFailure: class file for org.apache.spark.sql.types.DataType$1 not found

1 error
FAILURE: Build failed with an exception.
* What went wrong: Execution failed for task ':compileJava'.

Removing semanticdb-javac from the compiler plugins restores the build; nothing else in the project changes.

Proposed fix

Add a warning() method to SemanticdbReporter and update reportException() to use it instead of error(). Intentional error paths (e.g. -no-relative-path:error mode) are left unchanged.

// SemanticdbReporter.java
+  public void warning(String message, Tree tree, CompilationUnitTree root) {
+    trees.printMessage(
+        Diagnostic.Kind.WARNING, String.format("semanticdb-javac: %s", message), tree, root);
+  }
// SemanticdbTaskListener.java — reportException()
-    reporter.error(baos.toString(), e.getCompilationUnit(), e.getCompilationUnit());
+    reporter.warning(baos.toString(), e.getCompilationUnit(), e.getCompilationUnit());

Also change SemanticdbReporter.exception() (the Throwable overload) from Kind.ERROR to Kind.WARNING for consistency.

The full stack trace is still reported — useful for bug reports — but as a warning. The build succeeds with partial semanticdb output rather than aborting entirely.
