JaCoCo just outputs jacoco.exec, which is the input for Sonar. That file seems to contain only this information:
- Class name
- Total Class Probes
- Executed Class Probes
But then SonarQube cannot rely solely on these values, as it needs to tell you exactly which lines are uncovered, so Sonar must be performing some analysis of its own. So how does it use the JaCoCo report? And why does it need it?
SonarQube on its own doesn't and can't know anything about which tests you actually executed and how they cover your code. To obtain this information it relies on third-party test coverage tools. In the case of Java it relies on data collected and provided by JaCoCo, as explained in the answer to your similar question (JaCoCo collects execution information in the exec file and obtains line numbers and other information from the class files during generation of the report), or SonarQube can rely on data in its "generic format".
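For reference, the generic format is a plain XML file that you point SonarQube at via the sonar.coverageReportPaths analysis parameter. A minimal sketch (the path and line numbers are made up for illustration):

<coverage version="1">
  <!-- path is relative to the analysis base directory -->
  <file path="src/main/java/com/example/Parser.java">
    <lineToCover lineNumber="12" covered="true"/>
    <lineToCover lineNumber="25" covered="false" branchesToCover="2" coveredBranches="1"/>
  </file>
</coverage>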
Related
Currently at my internship, I have my JaCoCo Maven plugin set up so that after the unit and integration test reports are generated, they are merged into one.
Now, adding CircleCI into the mix, we run our tests in parallel jobs. These jobs cause only the longest-running test's .exec file to be read as the basis for the report, meaning all other tests are ignored and our coverage is lowered.
I thought the solution would be to give the files unique names and then just merge all .exec files using a wildcard character within my POM (roughly as sketched below), but there seems to be no documentation on it.
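The kind of merge execution I have in mind looks roughly like this (the directory, destination file and phase are placeholders, not our actual setup):

<plugin>
  <groupId>org.jacoco</groupId>
  <artifactId>jacoco-maven-plugin</artifactId>
  <executions>
    <execution>
      <id>merge-coverage</id>
      <phase>verify</phase>
      <goals>
        <goal>merge</goal>
      </goals>
      <configuration>
        <!-- pick up every .exec file the parallel jobs produced -->
        <fileSets>
          <fileSet>
            <directory>${project.build.directory}/coverage</directory>
            <includes>
              <include>*.exec</include>
            </includes>
          </fileSet>
        </fileSets>
        <!-- single merged file that the report goal can then consume -->
        <destFile>${project.build.directory}/jacoco-merged.exec</destFile>
      </configuration>
    </execution>
  </executions>
</plugin>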
We have a framework that splits tests across many machines. Data related to the environment, OS, etc. are important for the analysis of the final result.
Gradle provides an XML report compatible with existing tools. Passed/failed details, timing information, hostname, etc. are part of the XML. However, it does not include some data, such as the environment variables or system properties used. Another example is the lack of OS information in the result.
https://docs.gradle.org/current/userguide/java_testing.html#test_reporting covers many options, but I could not find a way to force a more detailed test report. Has anyone managed to create a more detailed XML?
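For context, this is roughly how the standard XML report is enabled and how extra data currently has to be passed in by hand; a minimal sketch in the Gradle Kotlin DSL (property names may differ slightly between Gradle versions, and the system property is only an illustration):

// build.gradle.kts - sketch only
tasks.test {
    reports {
        // the JUnit-style XML written under build/test-results
        junitXml.required.set(true)
    }
    // the XML report has no field for OS or environment data,
    // so it has to be passed to the tests explicitly
    systemProperty("test.os.name", System.getProperty("os.name"))
}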
I have multiple JMX files which are committed to a Git repository for the CI pipeline. End users are not happy with reports showing OK/KO; they asked us to change that and also to add a few details such as the total time taken for the execution, a description of the test, and the environment/hostname used.
Solutions I am looking for:
How to change OK/KO when the JMX is executed using Maven. What properties do I need to change or add in order to achieve this for Maven?
Is there a way to customize the report to have additional details like "Total Duration", "Environment" and "Description of Test" under the Test and Report information section, or anywhere else?
If it is not achievable using the default report, are there any other JMeter report plugins for Maven?
As of JMeter 5.2.1, the KO and OK labels are not controllable by any JMeter properties; you will have to go into /bin/report-template/content/js/dashboard.js.fmkr and change the labels there.
Similarly, for extra columns you will need to add them manually in the aforementioned file.
Get familiar with the Apache FreeMarker template engine; this is what JMeter uses under the hood to produce the dashboard.
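If FreeMarker is new to you, a template interpolates values with ${...} and supports directives such as <#if>; the snippet below is a generic illustration only, not the actual content of dashboard.js.fmkr:

<#-- generic FreeMarker illustration, not JMeter's real template -->
<#if summaryOk>
    var statusLabel = "${okLabel}";
<#else>
    var statusLabel = "${koLabel}";
</#if>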
Alternatively, you can consider a third-party results analysis solution like BM.Sense, which reports test duration out of the box and provides the possibility to add comments to the execution.
I have spent the whole morning trying to set up e2e test reporting via SonarQube's Generic Execution, using the Generic Test Data -> Generic Execution feature.
I created a custom XML report that gets added to the scan properties like this:
sonar.testExecutionReportPaths=**/e2e-report.xml
So far, SonarQube seems to completely ignore this property, and I see no attempt to parse the file in the logs. Has anyone made it work?
These are the links from Sonar about the Generic Execution feature:
https://docs.sonarqube.org/display/SONAR/Generic+Test+Data
https://github.com/SonarSource/sonarqube/blob/master/sonar-scanner-engine/src/main/java/org/sonar/scanner/genericcoverage/GenericTestExecutionSensor.java
This is a SonarQube 6.2+ feature. Make sure to use an appropriate SonarQube version.
In addition, sonar.testExecutionReportPaths does not allow matchers (like *).
Please provide relative or absolute paths, comma-separated (see the example below).
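For example, with sonar.testExecutionReportPaths=e2e/e2e-report.xml (relative to the project base directory; the file and test names below are purely illustrative), the report itself uses the Generic Execution format:

<testExecutions version="1">
  <file path="e2e/login.e2e-spec.ts">
    <testCase name="logs the user in" duration="250"/>
    <testCase name="rejects a wrong password" duration="180">
      <failure message="expected error banner to be visible"/>
    </testCase>
  </file>
</testExecutions>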
See also:
The official documentation of the Generic Test Data feature
The source code that looks up the generic execution files
TXTFIT = test execution time for individual test
Hello,
I'm using Sonar to analyze my Maven Java Project. I'm testing with JUnit and generating reports on the test execution time with the Maven Surefire plugin.
In my Sonar I can see the test execution time and drill down to see how long each individual test took. In the time machine I can only compare the overall test execution time between two releases.
What I want is to see how the TXTFIT changed from the last version.
For example:
In version 1.0 of my software, the htmlParserTest() takes 1 sec to complete. In version 1.1 I add a whole bunch of tests (so the overall execution time is going to be way longer), but the htmlParserTest() also suddenly takes 2 secs. I want to be notified: "Hey mate, the htmlParserTest() takes twice as long as it used to. You should take a look at it."
What I'm currently struggling to find out:
How exactly does the TXTFIT get from the Surefire XML report into Sonar?
I'm currently looking at AbstractSurefireParser.java,
but I'm not sure if that's actually the default Surefire plugin.
I was looking at 5-year-old stuff; I'm currently checking out this. I still have no idea where Sonar is getting the TXTFIT from, or how and where it connects it to the source files.
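For reference, the per-test time is recorded in the time attribute of each testcase element in the Surefire XML report, which is presumably what the parser picks up (illustrative values):

<!-- target/surefire-reports/TEST-com.example.HtmlParserTest.xml -->
<testsuite name="com.example.HtmlParserTest" tests="2" failures="0" errors="0" skipped="0" time="3.012">
  <testcase classname="com.example.HtmlParserTest" name="htmlParserTest" time="2.004"/>
  <testcase classname="com.example.HtmlParserTest" name="emptyDocumentTest" time="1.008"/>
</testsuite>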
Can I find the TXTFIT in the Sonar DB?
I'm looking at the local DB of my test Sonar with DbVisualizer and I don't really know where to look. The SNAPSHOT_DATA table doesn't seem to be human-readable.
Are the TXTFIT even saved in the DB?
Depending on the answer, I either have to write a Sensor that actually saves them or a widget that simply shows them on the dashboard.
Any help is very much appreciated!
The web service api/tests/*, introduced in version 5.2, allows you to get this information. Example: http://nemo.sonarqube.org/api/tests/list?testFileUuid=8e3347d5-8b17-45ac-a6b0-45df2b54cd3c