All morning I have been trying to set up e2e test reporting in SonarQube via the Generic Test Data -> Generic Execution feature.
I created a custom XML report that gets added to the scan properties like this:
sonar.testExecutionReportPaths=**/e2e-report.xml
So far, SonarQube seems to completely ignore this property, and I see no attempt to parse the file in the logs. Has anyone made this work?
These are the links from Sonar about the Generic Execution feature:
https://docs.sonarqube.org/display/SONAR/Generic+Test+Data
https://github.com/SonarSource/sonarqube/blob/master/sonar-scanner-engine/src/main/java/org/sonar/scanner/genericcoverage/GenericTestExecutionSensor.java
This is a SonarQube 6.2+ feature. Make sure to use an appropriate SonarQube version.
In addition, sonar.testExecutionReportPaths does not allow matchers (such as *).
Please provide relative or absolute paths, comma-separated.
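For example, a concrete path without wildcards should be picked up (the path below is only a placeholder for wherever your report actually lives):

sonar.testExecutionReportPaths=e2e/e2e-report.xml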
See also:
The official documentation of the Generic Test Data feature
The source code that looks up the generic execution files
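For reference, the file that the property points to is expected to follow the documented Generic Execution format, roughly as sketched below; the file path and test case names are placeholders:

<testExecutions version="1">
  <file path="src/e2e/login.spec.ts">
    <testCase name="logs in with valid credentials" duration="512"/>
    <testCase name="rejects invalid credentials" duration="204">
      <failure message="expected 401 but got 200">optional stack trace</failure>
    </testCase>
    <testCase name="skipped scenario" duration="0">
      <skipped message="not implemented yet"/>
    </testCase>
  </file>
</testExecutions>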
We have a framework that splits tests across many machines. Data related to the environment, OS, etc. are important for the analysis of the final result.
Gradle provides an XML report compatible with existing tools. Passed/failed details, timing information, hostname, etc. are part of the XML. However, it does not include some data, such as the environment variables or system properties that were used. Another example is the lack of OS information in the result.
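For context, the per-suite XML that Gradle writes is standard JUnit-style output, roughly shaped as below (class, test and host names are placeholders); there is no obvious slot for OS details or arbitrary environment data:

<testsuite name="com.example.SubscriptionTest" tests="2" skipped="0" failures="0" errors="0"
           timestamp="2024-01-01T10:00:00" hostname="build-agent-01" time="0.214">
  <properties/>
  <testcase name="createsSubscription" classname="com.example.SubscriptionTest" time="0.120"/>
  <testcase name="deletesSubscription" classname="com.example.SubscriptionTest" time="0.094"/>
</testsuite>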
https://docs.gradle.org/current/userguide/java_testing.html#test_reporting covers many options, but I could not find a way to force a more detailed test report. Has anyone managed to create more detailed XML?
I am new to Taurus testing. I have a set of tests inside a Taurus project.
Within the scenario 1300_Azuresubscriptions.yaml I have the following list of labels:
0100-Authorization
1300_ListAzureSubscriptions
1310_CreateAzureSubscriptions
1320_UpdateAzureSubscriptions
1330_FetchAzureSubscription
1340_CreateAVWGateway
1341_CreateVirtualWanSite
1342_ListVirtualWanSites
1343_UpdateVirtualWanSite
1344_FetchVirtualWanSite
1345_DeleteVirtualWanSite
1346_DeleteAVWGateway
1350_DeleteAzureSubscription
1351_ListADSubscription
1352_CreateAzureSubscriptions
1353_FetchADSubscription
1354_ValidateADSubscriptions
1355_GetADGroups
1356_ADSyncConfigurations
1357_ADSync
1358_CheckADCLientCreation
1359_DeleteADSubscription
1360_CheckADCLientDeletion
1361_ListAzureSubscriptionsWithInvalidAuthHeader
1362_ListAzureSubscriptionsWithNoAuthHeader
1363_CreateAzureSubscriptionsWithInvalidAuthHeader
1364_CreateAzureSubscriptionsWithNoAuthHeader
1365_UpdateAzureSubscriptionsWithInvalidAuthHeader
1366_UpdateAzureSubscriptionsWithNoAuthHeader
1367_FetchAzureSubscriptionWithInvalidAuthHeader
1368_FetchAzureSubscriptionWithNoAuthHeader
1369_CreateAVWGatewayWithInvalidAuthHeader
1370_CreateAVWGatewayWithNoAuthHeader
However, when I run bzt test-cases/1300_AzureSubscriptions.yaml, only the following labels actually get tested:
0100-Authorization
1300_ListAzureSubscriptions
1310_CreateAzureSubscriptions
1320_UpdateAzureSubscriptions
1330_FetchAzureSubscription
1340_CreateAVWGateway
1341_CreateVirtualWanSite
1342_ListVirtualWanSites
1343_UpdateVirtualWanSite
1344_FetchVirtualWanSite
1345_DeleteVirtualWanSite
1346_DeleteAVWGateway
1350_DeleteAzureSubscription
1351_ListADSubscription
1352_CreateAzureSubscriptions
1353_FetchADSubscription
1354_ValidateADSubscriptions
1355_GetADGroups
1356_ADSyncConfigurations
1357_ADSync
1358_CheckADCLientCreation
1359_DeleteADSubscription
1363_CreateAzureSubscriptionsWithInvalidAuthHeader
1364_CreateAzureSubscriptionsWithNoAuthHeader
1365_UpdateAzureSubscriptionsWithInvalidAuthHeader
1366_UpdateAzureSubscriptionsWithNoAuthHeader
1367_FetchAzureSubscriptionWithInvalidAuthHeader
1368_FetchAzureSubscriptionWithNoAuthHeader
1369_CreateAVWGatewayWithInvalidAuthHeader
1370_CreateAVWGatewayWithNoAuthHeader
Why isn't it running the labels 1360-1362? I don't understand the problem. Any help would be greatly appreciated. Thanks!
Taurus is just a wrapper around underlying load and functional testing tools; if something is not executed, there can be various reasons for it, for example:
The particular request is not enabled in the YAML configuration file
The underlying test executor fails when attempting to run the particular request
There is a lack of test data for the requests
Check the bzt.log file and any logs generated by the underlying tool. By default Taurus uses the JMeter executor, so if that is your case the reason can be found in the jmeter.log file (it is also worth checking jmeter.out and jmeter.err for any suspicious entries).
More information: Navigating your First Steps Using Taurus
You can also reach out to Taurus developers, maintainers and users at the Taurus support forum
The issue was the indentation in the YAML. If you're running these tests, I'd recommend installing a linter to check your YAML formatting: one indentation level off and the error is invisible, because the mis-indented tests are simply ignored.
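For anyone hitting the same thing, the structure Taurus expects looks roughly like this (the scenario name and URLs are placeholders, not the real project file). Every request item's leading dash has to line up with its siblings under requests; otherwise the entry either fails to parse or ends up attached to the wrong key, where Taurus never runs it:

scenarios:
  azure-subscriptions:
    requests:
      - label: 1359_DeleteADSubscription        # each item starts in the same column
        url: https://api.example.com/ad-subscriptions/123
        method: DELETE
      - label: 1360_CheckADCLientDeletion       # shift this dash one level deeper
        url: https://api.example.com/ad-clients/123
        method: GET                             # ... and the label quietly disappears

execution:
  - scenario: azure-subscriptions

A YAML linter (e.g. yamllint) flags that kind of drift immediately.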
I have multiple JMX files which are committed to a Git repository for CI. End users are not happy with reports showing OK/KO; they have asked us to change those labels and also to add a few details such as the total execution time, a description of the test, and the environment/hostname used.
Solutions I am looking for:
How do I change OK/KO when the JMX is executed using Maven? What properties do I need to change or add in order to achieve this?
Is there a way to customize the report to include additional details like "Total Duration", "Environment" and "Description of Test" under the Test and Report information section, or anywhere else?
If this is not achievable with the default report, are there any other JMeter report plugins for Maven?
As of JMeter 5.2.1, the KO and OK labels are not controllable via any JMeter properties; you will have to go into /bin/report-template/content/js/dashboard.js.fmkr and change the labels there.
Similarly, extra columns will need to be added manually in the aforementioned file.
Get familiar with the Apache FreeMarker template engine; this is what JMeter uses under the hood to produce the dashboard.
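To give an idea of what such an edit involves, FreeMarker interpolates expressions into the generated file; the fragment below is purely illustrative (the variable and label names are invented and are not the actual contents of dashboard.js.fmkr):

<#-- choose the text rendered for a sampler result; "Passed"/"Failed" stand in for OK/KO -->
<#assign passLabel = "Passed">
<#assign failLabel = "Failed">
var statusText = "${sampleOk?string(passLabel, failLabel)}";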
Alternatively, you can consider a third-party results analysis solution like BM.Sense, which reports the test duration out of the box and provides the possibility of adding comments to the execution.
JaCoCo just outputs jacoco.exec, which is the input for Sonar. That file seems to contain only the following information:
- Class name
- Total Class Probes
- Executed Class Probes
But then, SonarQube cannot rely solely on these values, because it needs to tell you which exact lines are uncovered, so Sonar must be doing some analysis of its own. So how does it use the JaCoCo report? And why does it need it?
So how does it use the JaCoCo report? And why does it need it?
SonarQube by itself doesn't / can't know anything about which tests you actually executed and how they cover your code. To obtain this information it relies on third-party test coverage tools. In the case of Java it relies on data collected and provided by JaCoCo, as explained in the answer to a similar question of yours (JaCoCo collects execution information in the exec file and obtains line numbers and other details from the class files while generating the report), or SonarQube can rely on data in the "generic format".
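As a concrete, hedged illustration of the wiring: older integrations read the binary file directly via sonar.jacoco.reportPaths (now deprecated), while recent SonarQube versions import the XML report that JaCoCo generates, e.g. in a Maven build:

# default location of the XML report produced by the jacoco-maven-plugin's report goal
sonar.coverage.jacoco.xmlReportPaths=target/site/jacoco/jacoco.xml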
I have a pre-existing Java project to which Sonar analysis was recently applied. There are a large number of Checkstyle JavadocMethod rule violations.
How would I restrict the JavadocMethod rule to apply only to Java file names matching the pattern "Controller.java"?
The JavadocMethod check does not offer an option to limit itself to certain files, so this cannot be done easily. But you could:
Write a custom filter which suppresses all JavadocMethod warnings that occur in files which do not match a pattern. This is not difficult; the example on the linked page covers just that case. But it requires you to deploy the filter, and that may be a bit of a hassle.
I am not sure whether this works in Sonar. I use custom Checkstyle checks in Sonar all the time, but I haven't tried custom filters yet.
Write a subclass of Checkstyle's JavadocMethodCheck which adds an option to apply itself only to certain files (Sonar Examples, Checkstyle tutorial); a rough sketch follows after this list. This is a sure bet if custom filters cannot be added to Sonar.
If you are using Eclipse, you can configure it to use different rule sets based on filename. You would do that using the "advanced" configuration setting in the project properties. Your regexes would be Controller\.java$ to match all controllers, and .{10}(?<!Controller)\.java$ to match the other Java files. This approach could also be applied to a stand-alone or Ant-based Checkstyle run, but not to Sonar.
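For the subclassing option, a minimal sketch might look like the code below. Treat it as an outline only: the exact base-class API (for example getFileContents()) differs between Checkstyle versions, and the class and property names here are made up:

import java.util.regex.Pattern;

import com.puppycrawl.tools.checkstyle.api.DetailAST;
import com.puppycrawl.tools.checkstyle.checks.javadoc.JavadocMethodCheck;

/**
 * Sketch: a JavadocMethodCheck that only fires in files whose name matches
 * a configurable pattern (default: *Controller.java).
 */
public class SelectiveJavadocMethodCheck extends JavadocMethodCheck {

    private Pattern filePattern = Pattern.compile("Controller\\.java$");

    // Exposed as a module property so it can be set from checkstyle.xml.
    public void setFilePattern(String regex) {
        this.filePattern = Pattern.compile(regex);
    }

    @Override
    public void visitToken(DetailAST ast) {
        // Delegate to the real check only when the current file matches the pattern.
        String fileName = getFileContents().getFileName();
        if (filePattern.matcher(fileName).find()) {
            super.visitToken(ast);
        }
    }
}

You would then register this check in checkstyle.xml in place of the stock JavadocMethod module (and, for Sonar, package it the same way as any other custom check).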
I am sorry that there is nothing easier available to you - but that's how things are at the moment ... Good luck!