I have a Java project under Eclipse Luna, with EclEmma 2.3.1.201405111647 (the latest), which uses JaCoCo 0.7.1, which has support for Java 8 as stated in the changelog:
"Version 2.3.1 (2014/05/11)
Fixed ASM 5.0.1 dependency conflicts with new ASM bundles in Eclipse 4.4 (GitHub #83).
Upgrade to JaCoCo 0.7.1 for full Java 8 support."
I now have the following toString() method:
@Override
public String toString() {
    // [BLOCK0]
    if (0 == value) {
        return "0B";
    }
    // [BLOCK1]
    final MutableLong val = new MutableLong(value);
    final StringBuilder sb = new StringBuilder();
    // [BLOCK2]
    Arrays.asList(TERA_BYTES, GIGA_BYTES, MEGA_BYTES, KILO_BYTES, BYTES).forEach(unit -> {
        // [BLOCK3]
        long divider = unit.toBytes(1);
        long n = val.longValue() / divider;
        if (0 != n) {
            sb.append(n).append(unit.getUnitCharacter());
            val.subtract(n * divider);
        }
    });
    // [BLOCK4]
    return sb.toString();
}
I won't include the JUnit test, because I know it achieves 100% coverage. I can prove it by moving the lambda expression into an appendToString method and replacing the forEach with an enhanced for loop (for (V value : Iterable<V>)), as sketched below.
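For reference, here is a minimal sketch of that refactoring. The Unit type name and the exact method signature are my assumptions; the body is just the lambda body from above, moved into a named method:

// Hypothetical refactoring: the lambda body moved into a plain method of the same class.
// "Unit" stands in for whatever type TERA_BYTES, GIGA_BYTES, etc. actually have.
private void appendToString(final Unit unit, final MutableLong val, final StringBuilder sb) {
    final long divider = unit.toBytes(1);
    final long n = val.longValue() / divider;
    if (0 != n) {
        sb.append(n).append(unit.getUnitCharacter());
        val.subtract(n * divider);
    }
}

// ...and inside toString(), the forEach call replaced with an enhanced for loop:
for (final Unit unit : Arrays.asList(TERA_BYTES, GIGA_BYTES, MEGA_BYTES, KILO_BYTES, BYTES)) {
    appendToString(unit, val, sb);
}

With that version, every line of the former lambda body is reported as covered.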
When I run "Coverage As > JUnit Test", the result is the following:
BLOCK0 is all green
BLOCK1 is all green
BLOCK2 is green, up to the forEach(unit -> {
BLOCK3 is white (as if those lines were ignored)
BLOCK4 is all green.
Can someone explain to me why JaCoCo can't detect coverage inside the lambda?
Lambda expression bodies are compiled into synthetic methods, and as far as I have read, synthetic methods are unconditionally filtered out of the code coverage analysis.
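As a small, self-contained illustration of what that means (the class and variable names here are mine, not from your project):

import java.util.function.IntUnaryOperator;

public class LambdaDesugarDemo {
    public static void main(String[] args) {
        // javac compiles the body of this lambda into a synthetic method on
        // LambdaDesugarDemo (named something like lambda$main$0), not into the
        // class that ends up implementing IntUnaryOperator.
        IntUnaryOperator twice = x -> x * 2;
        System.out.println(twice.applyAsInt(21)); // prints 42
        // A coverage tool that skips synthetic methods therefore never reports
        // the line "x -> x * 2" as executed, even though it clearly ran.
    }
}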
Looking at the Change History of JaCoCo, I see:
Snapshot Build 0.7.2.201408210455 (2014/08/21)
Fixed Bugs
Do not ignore synthetic lambda methods to get code coverage for Java 8 lambda expressions (GitHub #232).
which seems to address your issue. Since you are using EclEmma 2.3.1, which uses JaCoCo 0.7.1, you just need an update.
I've got a Java project built with Gradle and a property file that contains custom configuration for my testing framework (number of threads to use, test environment URL, custom usernames & passwords for those environments, etc.).
I'm facing an issue related to using properties from that file that I can't figure out:
if my Test task includes '**/*Test.class', all tests run as expected.
if my Test task includes '**/MyTest.class', only that test runs as expected.
if my Test task includes readProperty(), the task is skipped as NO-SOURCE. <- this is the part I can't understand, as readProperty() returns the correct value.
Let's get into details:
This is how the property is defined in the my.property file:
testng.class.includes='**/MyTest.class'
This is what the build.gradle file looks like:
Properties props = new Properties()
props.load(new FileInputStream(projectDir.toString() + '/common.properties'))

def testsToRunWorking(p) {
    String t = 'MyTest.class'
    println "Tests = $t"
    return t ? t : '**/*Test.class'
}

def testsToRunNotWorking(p) {
    String t = p.getProperty("testng.class.includes")
    println "Tests = $t"
    return t ? t : '**/*Test.class'
}

task testCustom(type: Test) {
    outputs.upToDateWhen { false }
    testLogging.showStandardStreams = true
    classpath = configurations.customTest + sourceSets.customTest.output
    include testsToRunNotWorking(props) ///< Does not work!
    // include testsToRunWorking(props) ///< Works!
    useTestNG()
}
In terms of debugging:
The println properly return the value I expect, even when the testCustom task doesn't do what I would expect.
I tried adding a dependsOn task just to print the includes of testCustom via testCustom.configure { println includes }, which looks correct as well.
Running with --info:
Tests = '**/MyTest.class'
:clean
:compileCustomTestJava - is not incremental (e.g. outputs have changed, no previous execution, etc.).
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
:processCustomTestResources
:customTestClasses
:testCustom NO-SOURCE
The core of the issue seems to come from the fact that I'm reading that value from a property file. If I hard-code the value inside build.gradle, everything works as expected. If it is read from a property file, the build stops with a NO-SOURCE statement.
Any idea?
Thanks!
You are using quotation marks in the values of your property files. Everything that comes after the assignment sign in a property file is used as the value, so the quotation marks remain in the string. You can see them in your output: Tests = '**/MyTest.class'. On the other hand, if you define a string in your (Groovy) code with quotation marks, they are not included in the string. Therefore, the strings passed are not the same.
Remove the quotation marks from your property file(s) and everything should work, since the class files will match your string without the quotation marks.
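You can reproduce this with plain java.util.Properties, which is what the build script itself uses (the class name below is just for the demo):

import java.io.StringReader;
import java.util.Properties;

public class PropertyQuotesDemo {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // Same line as in the property file: everything after '=' is the value, quotes included.
        props.load(new StringReader("testng.class.includes='**/MyTest.class'"));
        String pattern = props.getProperty("testng.class.includes");
        System.out.println(pattern);                           // '**/MyTest.class'
        System.out.println("**/MyTest.class".equals(pattern)); // false: the quotes are part of the value
    }
}

So the Test task is handed the pattern '**/MyTest.class' (quotes included), no class file matches it, and the task is skipped as NO-SOURCE.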
I'm going to put a fat bounty on this as soon as the system permits.
What I'm specifically having trouble with is getting coverage and getting integration tests working. For these I see the uninformative error:
Resource not found: com.something.somethingelse.SomeITCase
Uninformative because there's nothing to relate it to, nor does it mean much to someone who has none of the context.
Here's other weirdness I'm seeing. In cases where there's no integration tests for a subproject, I see this:
JaCoCoItSensor: JaCoCo IT report not found: /dev/build/dilithium/target/jacoco-it.exec
Why would I see target? This isn't a Maven project. A global search shows there's no target directory mentioned anywhere in the code base.
Then, there's this section of the documentation:
sonarqube {
    properties {
        properties["sonar.sources"] += sourceSets.custom.allSource.srcDirs
        properties["sonar.tests"] += sourceSets.integTest.allSource.srcDirs
    }
}
As near as I can tell, sourceSets.integTest.allSource.srcDirs returns Files, not Strings. Also, it should be:
sonarqube {
    properties {
        property "sonar.tests", "comma,separated,file,paths"
    }
}
Note that you get an error if you have a directory in there that doesn't exist. Of course, there is apparently no standard for which directory to put integration tests in, and for some sub-projects they may not even exist. The Gradle standard would be to simply ignore non-existent directories. Your code ends up looking like this:
sonarqube {
    StringBuilder builder = new StringBuilder()
    sourceSets.integrationTest.allSource.srcDirs.each { File dir ->
        if (dir.exists()) {
            builder.append(dir.getAbsolutePath())
            builder.append(",")
        }
    }
    if (builder.size() > 1) builder.deleteCharAt(builder.size() - 1)
    properties {
        if (builder.size() > 1)
            properties["sonar.tests"] += builder.toString()
        properties["sonar.jacoco.reportPath"] +=
            "$project.buildDir/jacoco/test.exec,$project.buildDir/jacoco/integrationTest.exec"
    }
}
Sonar is reporting no coverage at all. If I search for the *.exec files, I see what I would expect, namely:
./build/jacoco/test.exec
./build/jacoco/integrationTest.exec
...but weirdly, I also see this:
./build/sonar/com.proj_name_component_management-component_proj-recordstate/jacoco-overall.exec
What is that? Why is it in such a non-standard location?
OK, I've added this code:
properties {
    println "Before: " + properties.get("sonar.tests")
    println "Before: " + properties.get("sonar.jacoco.reportPath")
    property "sonar.tests", builder.toString()
    property "sonar.jacoco.reportPath", "$project.buildDir/jacoco/test.exec,$project.buildDir/jacoco/integrationTest.exec"
    println "After: " + properties.get("sonar.tests")
    println "After: " + properties.get("sonar.jacoco.reportPath")
}
...which results in:
[still running]
I don't want any bounty or any points.
Just a suggestion.
Can you get ANY JaCoCo reports at all?
Personally, I would separate the two concerns: JaCoCo report generation and Sonar.
I would first try to simply generate the JaCoCo reports, THEN I would look at why Sonar cannot get hold of them.
The following program is based on the example in the v8 Getting Started page. I have made three changes to demonstrate a problem I am encountering:
I create an empty array and put it into the global context.
The script being run references the zeroth element in the array, which should return undefined.
I run the compiled script twice.
The first run works fine. The second fails: v8 calls V8_Fatal() in Deoptimizer::DoComputeCompiledStubFrame() because descriptor->register_param_count_ == -1.
Am I doing something wrong here? How can I fix it?
Isolate* isolate = Isolate::New();
Isolate::Scope isolate_scope(isolate);
HandleScope handle_scope(isolate);
Local<Context> context = Context::New(isolate);
Context::Scope context_scope(context);
// Change 1: create an empty array and expose it as the global "a".
Local<Array> a = Array::New(isolate);
context->Global()->Set(String::NewFromUtf8(isolate, "a"), a);
// Change 2: the script reads a[0], which should evaluate to undefined.
Local<String> source = String::NewFromUtf8(isolate, "a[0];");
Local<Script> script = Script::Compile(source);
// Change 3: run the compiled script twice; the second Run() aborts in the deoptimizer.
Local<Value> result = script->Run();
Local<Value> result2 = script->Run();
return 0;
NOTES:
This is the entire body of main().
Other fragments of JavaScript code run twice without a problem. Somehow this relates to the out-of-bounds array reference, which is perhaps triggering deoptimization.
I do not want to recompile the script from scratch each time because I am typically running these scripts thousands of times, and sometimes millions of times.
I have tried compiling the script as an UnboundScript and then binding it for each execution, but the result is the same.
I have reported this as a v8 issue, but nobody has responded so I'm hoping that the StackOverflow community can help.
I am seeing this on VS2012 Update 4, but I also see it on VS2008, and in both x64 and x86 and in both Debug and Release builds.
OK, found it. The problem is an uninitialized code stub for dictionary loads; your use case triggers the failure because the stub isn't initialized through other means, e.g. compilation.
Below is a patch against v8 trunk revision 22629 that fixes the problem for me, tested on Windows with VS 2010 and Linux with g++ 4.9. Please let me know how you go with this:
Index: src/code-stubs.cc
===================================================================
--- src/code-stubs.cc (revision 22629)
+++ src/code-stubs.cc (working copy)
@@ -236,6 +236,8 @@
CODE_STUB_LIST(DEF_CASE)
#undef DEF_CASE
case UninitializedMajorKey: return "<UninitializedMajorKey>Stub";
+ case NoCache:
+ return "<NoCache>Stub";
default:
if (!allow_unknown_keys) {
UNREACHABLE();
@@ -939,6 +941,13 @@
// static
+void KeyedLoadDictionaryElementStub::InstallDescriptors(Isolate* isolate) {
+ KeyedLoadDictionaryElementStub stub(isolate);
+ InstallDescriptor(isolate, &stub);
+}
+
+
+// static
void KeyedLoadGenericElementStub::InstallDescriptors(Isolate* isolate) {
KeyedLoadGenericElementStub stub(isolate);
InstallDescriptor(isolate, &stub);
Index: src/code-stubs.h
===================================================================
--- src/code-stubs.h (revision 22629)
+++ src/code-stubs.h (working copy)
@@ -1862,6 +1862,8 @@
virtual void InitializeInterfaceDescriptor(
CodeStubInterfaceDescriptor* descriptor) V8_OVERRIDE;
+ static void InstallDescriptors(Isolate* isolate);
+
private:
Major MajorKey() const { return KeyedLoadElement; }
int NotMissMinorKey() const { return DICTIONARY_ELEMENTS; }
Index: src/isolate.cc
===================================================================
--- src/isolate.cc (revision 22629)
+++ src/isolate.cc (working copy)
@@ -2000,6 +2000,7 @@
NumberToStringStub::InstallDescriptors(this);
StringAddStub::InstallDescriptors(this);
RegExpConstructResultStub::InstallDescriptors(this);
+ KeyedLoadDictionaryElementStub::InstallDescriptors(this);
KeyedLoadGenericElementStub::InstallDescriptors(this);
}
As a workaround, if you don't want to compile your own V8 for now, you could execute some code that uses the KeyedLoadDictionaryElementStub directly on each Isolate, prior to running your own code; this should initialize the stub. Something like the following works for me:
Isolate* isolate = Isolate::New();
Isolate::Scope isolate_scope(isolate);
HandleScope handle_scope(isolate);
Local<Context> context = Context::New(isolate);
Context::Scope context_scope(context);
Local<Array> a = Array::New(isolate);
context->Global()->Set(String::NewFromUtf8(isolate, "a"), a);
// Workaround code for initializing KeyedLoadDictionaryElementStub
Local<String> workaround_source = String::NewFromUtf8(isolate, "Math.random()");
Local<Script> workaround_script = Script::Compile(workaround_source);
Local<Value> workaround_value = workaround_script->Run();
// End workaround
Local<String> source = String::NewFromUtf8(isolate, "a[0]");
Local<Script> script = Script::Compile(source);
// ...and so on
I have a Gradle build script with a handful of source sets that all have various dependencies defined (some common, some not), and I'm trying to use the Eclipse plugin to let Gradle generate .project and .classpath files for Eclipse. However, I can't figure out how to get all the dependency entries into .classpath: for some reason, only some of the external dependencies are actually added to .classpath, and as a result the Eclipse build fails with 1400 errors (building with Gradle works fine).
I've defined my source sets like so:
sourceSets {
    setOne
    setTwo {
        compileClasspath += setOne.runtimeClasspath
    }
    test {
        compileClasspath += setOne.runtimeClasspath
        compileClasspath += setTwo.runtimeClasspath
    }
}

dependencies {
    setOne 'external:dependency:1.0'
    setTwo 'other:dependency:2.0'
}
Since I'm not using the main source-set, I thought this might have something to do with it, so I added
sourceSets.each { ss ->
    sourceSets.main {
        compileClasspath += ss.runtimeClasspath
    }
}
but that didn't help.
I haven't been able to identify any property common to the libraries that are included, or to those that are left out; there is nothing I can be sure of (although of course there has to be something). I have a feeling that the included libraries are all dependencies of the test source set, either directly or indirectly, but I haven't been able to verify that beyond noting that all of test's dependencies are there.
How do I ensure that the dependencies of all source-sets are put in .classpath?
This was solved in a way that was closely related to a similar question I asked yesterday:
// Create a list of all the configuration names for my source sets
def ssConfigNames = sourceSets.findAll { ss -> ss.name != "main" }.collect { ss -> "${ss.name}Compile".toString() }

// Find configurations matching those of my source sets
configurations.findAll { conf -> "${conf.name}".toString() in ssConfigNames }.each { conf ->
    // Add matching configurations to Eclipse classpath
    eclipse.classpath {
        plusConfigurations += conf
    }
}
Update:
I also asked the same question in the Gradle forums, and got an even better solution:
eclipseClasspath.plusConfigurations = configurations.findAll { it.name.endsWith("Runtime") }
It is not as precise, in that it adds other stuff than just the things from my source sets, but it guarantees that it will work. And it's much easier on the eyes =)
I agree with Tomas Lycken that it is better to use the second option, but it might need a small correction:
eclipse.classpath.plusConfigurations = configurations.findAll { it.name.endsWith("Runtime") }
This is what worked for me with Gradle 2.2.1:
eclipse.classpath.plusConfigurations = [configurations.compile]
Current behavior:
Put a breakpoint on the case Twice(n) ... line.
On "step into", control goes to the x match { line.
On "step into", control goes to the def TwiceTest = { line.
On further "step into", control goes to the if (z % 2 == 0) ... line.
Expected behavior:
Put a breakpoint on the case Twice(n) ... line.
On "step into", control goes to the if (z % 2 == 0) ... line.
Code Snippet
object testobj extends App {
  def TwiceTest = {
    val x = Twice(21)
    x match {
      case Twice(n) => Console.println(n)
    } // prints 21
  }
  TwiceTest
}

object Twice {
  def apply(x: Int): Int = x * 2
  def unapply(z: Int): Option[Int] = {
    if (z % 2 == 0) Some(z / 2) else None
  }
}
The current behavior is irritating when debugging a Scala program with lots of nested extractors. I tried this with the new Scala debugger as well as the Java debugger, but with the same result.
Step Filtering also does not help in this case.
As a workaround, I am putting a breakpoint in the unapply method and resuming from the first breakpoint. Can someone please suggest a cleaner approach?
Edit 1
I am using Scala-IDE (latest nightly build: 2.1.0.nightly-2_09-201208250315-529cd70)
Eclipse Version: Indigo Service Release 2 Build id: 20120216-1857
OS: Windows 7 ( 64 bit)
The line number information in the bytecode is wrong. It is not an issue with the IDE, but the Scala compiler. When pattern matching is compiled, synthetic code sometimes gets the wrong position information.
I assume you are using Scala 2.9.2. In the next version of Scala (2.10.0), there are significant improvements in the pattern matcher, so it would be good to give it a try.