The Checker Framework manual claims "You can write multiple @EnsuresNonNullIf annotations on a single method", but I get an error if I try this:
@EnsuresNonNullIf(expression="getFieldNames()", result=true)
@EnsuresNonNullIf(expression="getFieldName(i)", result=true)
public boolean hasFieldNames() {
    return fFieldNames != null;
}
The resulting error message from the Eclipse Java compiler:
Duplicate annotation of non-repeatable type @EnsuresNonNullIf. Only annotation types marked @Repeatable can be used multiple times at one target.
The resulting error message from javac when building with Maven:
[ERROR] Blabla.java:[?,?] org.checkerframework.checker.nullness.qual.EnsuresNonNullIf is not a repeatable annotation type
I'm annotating 10-year-old code, so I'm hoping some configuration trick can save the day :-) Without multiple @EnsuresNonNullIf annotations I'm in for quite a bit of manual code annotation to suppress false positives that I'm not interested in...
PS: I tried using both checker-framework-2.8.1 and 2.9.0 with similar results, and always using <maven.compiler.source>1.8</maven.compiler.source>
I found this issue on the Checker Framework issue tracker: https://github.com/typetools/checker-framework/issues/1307
It describes an enhancement request to add @Repeatable to the following Checker Framework annotations:
> @DefaultQualifier -- DONE
> @EnsuresKeyFor
> @EnsuresKeyForIf
> @EnsuresLockHeldIf
> @EnsuresLTLengthOf
> @EnsuresLTLengthOfIf
> @EnsuresMinLenIf
> @EnsuresNonNullIf
> @EnsuresQualifier -- DONE
> @EnsuresQualifierIf -- DONE
> @FieldInvariant
> @GuardSatisfied
> @HasSubsequence
> @MethodVal
> @MinLenFieldInvariant
> @RequiresQualifier -- DONE
> @SubstringIndexFor
And the discussion contains a workaround, using the @EnsuresQualifiersIf container annotation, which already accepts multiple @EnsuresQualifierIf annotations:
@EnsuresQualifiersIf({
    @EnsuresQualifierIf(result=true, qualifier=NonNull.class, expression="getFoo()"),
    @EnsuresQualifierIf(result=false, qualifier=NonNull.class, expression="getBar()")
})
boolean hasFoo();
And in my case that works out to:
@EnsuresQualifiersIf({
    @EnsuresQualifierIf(result=true, qualifier=NonNull.class, expression="getFieldNames()"),
    @EnsuresQualifierIf(result=true, qualifier=NonNull.class, expression="getFieldName(i)")
})
public boolean hasFieldNames() {
    return fFieldNames != null;
}
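Since the issue list above marks @EnsuresQualifierIf as DONE, I assume (untested against 2.8.1/2.9.0) that a Checker Framework version in which it is @Repeatable would also accept the repeated form without the container:
// Sketch, assuming the installed Checker Framework version marks
// @EnsuresQualifierIf as @Repeatable; otherwise use the @EnsuresQualifiersIf
// container shown above.
@EnsuresQualifierIf(result=true, qualifier=NonNull.class, expression="getFieldNames()")
@EnsuresQualifierIf(result=true, qualifier=NonNull.class, expression="getFieldName(i)")
public boolean hasFieldNames() {
    return fFieldNames != null;
}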
I'm working on a Maven build pipeline.
Currently I'm facing the problem that a Maven project still builds even if a dependency is invalid.
I think everyone knows this warning from the log:
[WARNING] The POM for is invalid, transitive dependencies (if any) will not be available, enable debug logging for more details.
I would like the build to fail instead of only logging a warning, because in a big build pipeline it's hard to spot.
I looked into the code:
The warning happens in maven-core because of an EventType.ARTIFACT_DESCRIPTOR_INVALID event.
In DefaultArtifactDescriptorReader I found that the ModelBuildingException is caught while the effective model is being built.
There is an ArtifactDescriptorPolicy. Depending on it, either an exception is added to the result or only the EventType.ARTIFACT_DESCRIPTOR_INVALID event is fired (see invalidDescriptor()):
    model = modelBuilder.build( modelRequest ).getEffectiveModel();
}
catch ( ModelBuildingException e )
{
    for ( ModelProblem problem : e.getProblems() )
    {
        if ( problem.getException() instanceof UnresolvableModelException )
        {
            result.addException( problem.getException() );
            throw new ArtifactDescriptorException( result );
        }
    }
    invalidDescriptor( session, trace, a, e );
    if ( ( getPolicy( session, a, request ) & ArtifactDescriptorPolicy.IGNORE_INVALID ) != 0 )
    {
        return null;
    }
    result.addException( e );
    throw new ArtifactDescriptorException( result );
}
I didn't find any option to configure the ArtifactDescriptorPolicy.
I expect that ArtifactDescriptorPolicy.STRICT would solve my problem.
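I couldn't find a way to set this from the plain mvn command line; when driving Maven Resolver (Aether) programmatically, though, the policy can be set on the repository session. A sketch of that API (untested from within a regular Maven build):
// Sketch: making the artifact descriptor policy strict on a programmatically
// built resolver session (requires maven-resolver-api and maven-resolver-util).
import org.eclipse.aether.DefaultRepositorySystemSession;
import org.eclipse.aether.resolution.ArtifactDescriptorPolicy;
import org.eclipse.aether.util.repository.SimpleArtifactDescriptorPolicy;

DefaultRepositorySystemSession session = new DefaultRepositorySystemSession();
// STRICT = fail on missing or invalid artifact descriptors instead of ignoring them.
session.setArtifactDescriptorPolicy(
        new SimpleArtifactDescriptorPolicy(ArtifactDescriptorPolicy.STRICT));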
Does anyone know more about this problem?
I think you're a bit ahead of the curve. There is a feature (new switch) in Maven 4 called --fail-on-severity or -fos that does exactly this: fail the build on a specific severity.
$ mvn -fos WARN clean install
If you don't mind being on the bleeding edge, you could install it and give it a go.
I've got a Java project built with Gradle and a properties file that contains custom configuration for my testing framework (number of threads to use, test environment URLs, custom usernames & passwords for those environments, etc.).
I'm facing an issue with using properties from that file that I can't figure out:
If my Test task includes '**/*Test.class', all tests are running as expected.
If my Test task includes '**/MyTest.class', only that test is running as expected.
If my Test task includes readProperty(), the task is skipped as NO-SOURCE. <- this is the part I can't understand, as readProperty() returns the correct value.
Let's get into details:
This is how the property is defined in the properties file:
testng.class.includes='**/MyTest.class'
This is what the build.gradle file looks like:
Properties props = new Properties()
props.load(new FileInputStream(projectDir.toString() + '/common.properties'))

def testsToRunWorking(p) {
    String t = 'MyTest.class'
    println "Tests = $t"
    return t ? t : '**/*Test.class'
}

def testsToRunNotWorking(p) {
    String t = getProperty(p, "testng.class.includes")
    println "Tests = $t"
    return t ? t : '**/*Test.class'
}

task testCustom(type: Test) {
    outputs.upToDateWhen { false }
    testLogging.showStandardStreams = true
    classpath = configurations.customTest + sourceSets.customTest.output
    include testsToRunNotWorking(props) ///< Does not work!
    // include testsToRunWorking(props) ///< Works!
    useTestNG()
}
In terms of debugging:
The println properly returns the value I expect, even when the testCustom task doesn't do what I would expect.
I tried adding a dependsOn task just to print the includes configured on testCustom (testCustom.configure { println includes }), and they look correct as well.
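For reference, a cleaned-up version of that debug step might look like this (the task name printTestIncludes is mine, purely for illustration):
// Hypothetical debug task: print the include patterns that testCustom ended up with.
task printTestIncludes {
    doLast {
        // Test.getIncludes() returns the configured include patterns.
        println "testCustom includes: ${testCustom.includes}"
    }
}
testCustom.dependsOn printTestIncludes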
Output when running with --info:
Tests = '**/MyTest.class'
:clean
:compileCustomTestJava - is not incremental (e.g. outputs have changed, no previous execution, etc.).
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
:processCustomTestResources
:customTestClasses
:testCustom NO-SOURCE
The core of the issue seems to come from the fact that I'm reading that value from a property. If I hard-code it inside build.gradle, everything works as expected. If it is read from the property file, the build stops with a NO-SOURCE statement.
Any idea?
Thanks!
You are using quotation marks in the values of your property file. Everything that comes after the assignment sign in a properties file is used as the value, so the quotation marks remain in the string; they are printed in your output Tests = '**/MyTest.class'. On the other hand, if you define a string in your (Groovy) code with quotation marks, they are not included in the string. Therefore, the two strings are not the same.
Remove the quotation marks from your property file(s) and everything should work, since the class files will then match your pattern without the quotation marks.
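In other words, the line in the properties file should simply read:
testng.class.includes=**/MyTest.class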
I'm going to put a fat bounty on this as soon as the system permits.
What I'm specifically having trouble with is getting coverage and getting integration tests working. For these I see the uninformative error:
Resource not found: com.something.somethingelse.SomeITCase
Uninformative because there's nothing to relate it to, nor does it mean much to someone who has none of the context.
Here's some other weirdness I'm seeing. In cases where there are no integration tests for a subproject, I see this:
JaCoCoItSensor: JaCoCo IT report not found: /dev/build/dilithium/target/jacoco-it.exec
Why would I see target? This isn't a Maven project. A global search shows there's no target directory mentioned anywhere in the code base.
Then, there's this section of the documentation:
sonarqube {
    properties {
        properties["sonar.sources"] += sourceSets.custom.allSource.srcDirs
        properties["sonar.tests"] += sourceSets.integTest.allSource.srcDirs
    }
}
As near as I can tell, sourceSets.integTest.allSource.srcDirs returns Files, not Strings. Also, it should be:
sonarqube {
    properties {
        property "sonar.tests", "comma,separated,file,paths"
    }
}
Note that you get an error if you have a directory in there that doesn't exist. Of course there's apparently no standard for what directory to put integration tests in and for some sub-projects they may not even exist. The Gradle standard would be to simply ignore non-existent directories. Your code ends up looking like:
sonarqube {
    properties {
        StringBuilder builder = new StringBuilder()
        sourceSets.integrationTest.allSource.srcDirs.each { File dir ->
            if (dir.exists()) {
                builder.append(dir.getAbsolutePath())
                builder.append(",")
            }
        }
        if (builder.size() > 1) builder.deleteCharAt(builder.size() - 1)
        if (builder.size() > 1)
            properties["sonar.tests"] += builder.toString()
        properties["sonar.jacoco.reportPath"] +=
            "$project.buildDir/jacoco/test.exec,$project.buildDir/jacoco/integrationTest.exec"
    }
}
Sonar is reporting no coverage at all. If I search for the *.exec files, I see what I would expect. That being:
./build/jacoco/test.exec
./build/jacoco/integrationTest.exec
...but weirdly, I also see this:
./build/sonar/com.proj_name_component_management-component_proj-recordstate/jacoco-overall.exec
What is that? Why is it in such a non-standard location?
OK, I've added this code:
properties {
    println "Before: " + properties.get("sonar.tests")
    println "Before: " + properties.get("sonar.jacoco.reportPath")
    property "sonar.tests", builder.toString()
    property "sonar.jacoco.reportPath", "$project.buildDir/jacoco/test.exec,$project.buildDir/jacoco/integrationTest.exec"
    println "After: " + properties.get("sonar.tests")
    println "After: " + properties.get("sonar.jacoco.reportPath")
}
...which results in:
[still running]
I don't want any bounty or any points.
Just a suggestion.
Can you get ANY JaCoCo reports at all?
Personally, I would separate the two concerns: JaCoCo report generation and Sonar.
I would first try to simply generate the JaCoCo reports, THEN I would look at why Sonar cannot get hold of them.
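As a starting point, something like this minimal fragment (a sketch using the stock jacocoTestReport task from Gradle's jacoco plugin; you would add a similar report task for your integrationTest task) lets you confirm the reports are produced before involving Sonar at all:
// Sketch: generate JaCoCo reports on their own, independent of Sonar.
apply plugin: 'jacoco'

jacocoTestReport {
    reports {
        xml.enabled true   // machine-readable report
        html.enabled true  // human-readable report for a quick look
    }
}
// Run with: gradle test jacocoTestReport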
I've got this bit of CQL:
// <Name>A stateless class or structure might be turned into a static type</Name>
warnif count > 0 (from t in Application.Types where
   t.SizeOfInst == 0 &&
   // For accuracy, this constraint doesn't take
   // account of types that implement some interfaces,
   // classes that have a base class and don't
   // derive directly from System.Object, or classes
   // that have sub-classes.
   t.NbInterfacesImplemented == 0 &&
   ((t.IsClass && t.DepthOfInheritance == 1
       && t.NbChildren == 0)
     || t.IsStructure) &&
   !t.IsStatic &&
   !t.DeriveFrom("System.Attribute") &&
   !t.IsAttributeClass &&
   !t.IsGeneric &&
   t.Name != "Program" &&
   !(t.IsGeneratedByCompiler
     || t.HasAttribute(@"NDepend.CQL.NDependIgnoreAttribute")
     || t.HasAttribute("System.Runtime.CompilerServices.CompilerGeneratedAttribute".AllowNoMatch()))
 select new { t, t.SizeOfInst, t.NbInterfacesImplemented,
              t.DepthOfInheritance, t.NbChildren }).Take(10)

// This rule indicates stateless types that might
// eventually be turned into static classes.
// See the definition of the SizeOfInst metric here:
// http://www.ndepend.com/Metrics.aspx#SizeOfInst
It's fine in the GUI, but I get this message in the output report when I run it from the command line:
1 query syntax error: Not a valid type name {"System.Attribute"}
Any idea why?
It must come from the fact that mscorlib, the assembly that contains System.Attribute, is not resolved at analysis time. Are you running the GUI and command-line versions on the same machine? To look at assembly resolution, go to NDepend Project Properties > Code to Analyze and see where mscorlib is resolved from by unfolding the folder panel.
I have a Gradle build script with a handful of source sets that all have various dependencies defined (some common, some not), and I'm trying to use the Eclipse plugin to let Gradle generate .project and .classpath files for Eclipse. However, I can't figure out how to get all the dependency entries into .classpath: for some reason only a few of the external dependencies are actually added to .classpath, and as a result the Eclipse build fails with 1400 errors (building with Gradle works fine).
I've defined my source sets like so:
sourceSets {
    setOne
    setTwo {
        compileClasspath += setOne.runtimeClasspath
    }
    test {
        compileClasspath += setOne.runtimeClasspath
        compileClasspath += setTwo.runtimeClasspath
    }
}

dependencies {
    setOne 'external:dependency:1.0'
    setTwo 'other:dependency:2.0'
}
Since I'm not using the main source-set, I thought this might have something to do with it, so I added
sourceSets.each { ss ->
    sourceSets.main {
        compileClasspath += ss.runtimeClasspath
    }
}
but that didn't help.
I've tried to figure out what the included libraries have in common, and what the excluded ones have in common, but I can't find anything I'm sure of (although of course there has to be something). I have a feeling that all the included libraries are dependencies of the test source set, either directly or indirectly, but I haven't verified that beyond noting that all of test's dependencies are there.
How do I ensure that the dependencies of all source-sets are put in .classpath?
This was solved in a way that was closely related to a similar question I asked yesterday:
// Create a list of all the configuration names for my source sets
def ssConfigNames = sourceSets.findAll { ss -> ss.name != "main" }.collect { ss -> "${ss.name}Compile".toString() }

// Find configurations matching those of my source sets
configurations.findAll { conf -> conf.name in ssConfigNames }.each { conf ->
    // Add matching configurations to the Eclipse classpath
    eclipse.classpath {
        plusConfigurations += conf
    }
}
Update:
I also asked the same question in the Gradle forums, and got an even better solution:
eclipseClasspath.plusConfigurations = configurations.findAll { it.name.endsWith("Runtime") }
It is not as precise, in that it adds more than just the things from my source sets, but it guarantees that it will work. And it's much easier on the eyes =)
I agree with Tomas Lycken that it is better to use the second option, but it might need a small correction:
eclipse.classpath.plusConfigurations = configurations.findAll { it.name.endsWith("Runtime") }
This is what worked for me with Gradle 2.2.1:
eclipse.classpath.plusConfigurations = [configurations.compile]