jasmine-allure-reporter, display test cases separately based on spec files - jasmine

I'm using Protractor and jasmine-allure-reporter
and am trying to run multiple spec files that are defined in the config.js file:
specs: ['spec1.js','spec2.js']
spec1.js contains 3 tests and spec2.js contains a single test. But jasmine-allure-reporter displays all four test cases (3+1) together, with no indication of which spec file each one came from. How can I display the test cases separately under each spec file name in one HTML report?
Please help me with this.
I am generating the report using the command "allure generate allure-results --clean -o allure-report || true"
allure-results >> location where the XML files are generated and
allure-report >> where the HTML report is generated

Looking at the allure reporter repository, it doesn't look like this is supported. They set the outDir but do not expose a way to consolidate the output the way jasmine-reporters does with its consolidateAll option. See the jasmine-reporters GitHub repository. If you decide to switch to jasmine-reporters, consolidating XML files is pretty simple; see the Protractor cookbook for an example.
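If you do switch, a minimal Protractor setup might look roughly like this (a sketch assuming the jasmine-reporters package; paths and option values are illustrative):
// protractor.conf.js
var reporters = require('jasmine-reporters');

exports.config = {
  specs: ['spec1.js', 'spec2.js'],
  onPrepare: function () {
    jasmine.getEnv().addReporter(new reporters.JUnitXmlReporter({
      savePath: 'test-results',   // where the XML files are written
      consolidateAll: false,      // false writes one XML file per top-level suite; true merges everything into one file
      filePrefix: 'xmloutput'
    }));
  }
};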

Related

Ginkgo: how to combine test reports

I'm setting up GitLab CI.
We use Ginkgo tests for BDD.
Ginkgo creates a report per each folder where tests are located.
This creates a problem with collecting all the reports and publishing them as a single test report file.
Is it possible to configure Ginkgo in such a way that I can get all tests into a single test report file?
What I understand is that your reports lie in each test folder:
Example
testScripts
- test_1_directory (contains test spec and result files)
- test_2_directory (contains test spec and result files)
- test_3_directory (contains test spec and result files)
I'm not sure this will solve your problem exactly, but you can give it a try.
In your job, add the report paths as shown below:
artifacts:
  reports:
    junit:
      - ./packages/e2e/goProject/testScripts/**/**.xml
This assumes the generated reports are .xml files.
At the end, all tests will be displayed in the pipeline's Tests section.
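If you want a single physical report file rather than relying on GitLab to merge the results for display, Ginkgo's own JUnit reporting flags may also help. As far as I know, the Ginkgo v2 CLI can aggregate all suites into one report roughly like this (flags assumed from the v2 CLI, so verify against your version):
ginkgo -r --junit-report=report.xml --output-dir=./reports
With --output-dir set and without --keep-separate-reports, the per-suite reports should end up merged into a single ./reports/report.xml.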

SonarCloud requiring code coverage for files ignored with Istanbul

I have a JavaScript app where we generate a code coverage report using Istanbul and use SonarCloud for static analysis.
There are two ways we exclude code from Istanbul coverage. The first is to set exclusion paths. In jest.config.js we have this to exclude patterns:
"coveragePathIgnorePatterns": [
  "source/legacy"
]
The second way is to use Istanbul ignore comments in source files like /* istanbul ignore file */. In either case the ignored file will not be part of the generated report file.
In our Sonar configuration we set it to use the generated lcov.info report file with the sonar.javascript.lcov.reportPaths property. However we then also need to set sonar.coverage.exclusions to exclude patterns like source/legacy because it is not treating the lcov.info report as the source of truth. This is acceptable but duplicates configuration, which is unfortunate. The real problem is that I cannot find any way to get Sonar to handle the files excluded with /* istanbul ignore file */.
Is there some way to make Sonar treat the lcov.info file as its source of truth, such that any file that is not included in the file is excluded from coverage?
Alternately, is there a way with Istanbul where I can make it list ignored files but say that they are ignored? Maybe that way Sonar will see that they are ignored.
For those looking for a solution: unfortunately, you need to tell SonarQube to ignore these files explicitly. Not an ideal solution, but it is what I ended up with.
docs: https://docs.sonarqube.org/latest/project-administration/narrowing-the-focus/
I put my filename in sonar.coverage.exclusions.
What my organization ended up doing was making a policy to not use istanbul ignore file comments at all, and instead ignore individual functions within a file. Also not ideal.
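For reference, the relevant entries in sonar-project.properties might end up looking roughly like this (a sketch; the paths and the ignored file name are illustrative, not taken from the original setup):
# point Sonar at the Istanbul/Jest lcov report
sonar.javascript.lcov.reportPaths=coverage/lcov.info
# duplicate the Istanbul exclusions, plus any files marked with /* istanbul ignore file */
sonar.coverage.exclusions=source/legacy/**,**/SomeIgnoredFile.js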

Cannot generate coverage report using lcov

I'm trying to use lcov to generate coverage reports for my unit test suite, but I cannot even capture a tracefile. The error messages indicate that the source files cannot be found. The code is compiled by a Jenkins job on a build machine and the unit tests are executed as a downstream job on a target machine. The source code and gcno files are transferred to the downstream job, which then executes the call to lcov. All the details follow; a cup of coffee might be needed.
On the build machine, make is executed in
/var/lib/jenkins/workspace/App-Coverage/BUILD/app/
The source code which I want coverage for is in subdirectories in
/var/lib/jenkins/workspace/App-Coverage/BUILD/app/packages/
The object files and gcno files are generated in a subdirectory o relative to the corresponding cpp file. So for example
/var/lib/jenkins/workspace/App-Coverage/BUILD/app/packages/subdir/Myclass.cpp
/var/lib/jenkins/workspace/App-Coverage/BUILD/app/packages/subdir/o/Myclass.o
/var/lib/jenkins/workspace/App-Coverage/BUILD/app/packages/subdir/o/Myclass.gcno
The source files and gcno files are copied to the unit test machine, keeping the same folder structure, and end up in
/var/lib/jenkins/workspace/App-Coverage-Unittest/BUILD/app/
Note: There is a difference in the name of the workspace folder, "App-Coverage-Unittest" instead of "App-Coverage" since these two Jenkins jobs cannot have the same name.
So there is now for example
/var/lib/jenkins/workspace/App-Coverage-Unittest/BUILD/app/packages/subdir/Myclass.cpp
/var/lib/jenkins/workspace/App-Coverage-Unittest/BUILD/app/packages/subdir/o/Myclass.o
/var/lib/jenkins/workspace/App-Coverage-Unittest/BUILD/app/packages/subdir/o/Myclass.gcno
The unit tests are executed in
/opt/app/test/app
Using GCOV_PREFIX_STRIP and GCOV_PREFIX I make the gcda files appear in the same folders as the gcno files, so for example
/var/lib/jenkins/workspace/App-Coverage-Unittest/BUILD/app/packages/subdir/o/Myclass.gcno
/var/lib/jenkins/workspace/App-Coverage-Unittest/BUILD/app/packages/subdir/o/Myclass.gcda
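An illustrative sketch of such an environment setup (the values are assumed for illustration, not taken from the actual job; GCOV_PREFIX_STRIP must match the number of leading components of the compile-time path):
# strip the 7 compile-time components (var, lib, jenkins, workspace, App-Coverage, BUILD, app)
# and prepend the test machine's build directory instead
export GCOV_PREFIX=/var/lib/jenkins/workspace/App-Coverage-Unittest/BUILD/app
export GCOV_PREFIX_STRIP=7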
Now I want to generate a coverage report using lcov, but I don't seem to understand how to set the paths correctly. The following examples were executed from /var/lib/jenkins/workspace/App-Coverage-Unittest/ by the Jenkins unit test job.
For example I tried
lcov -d BUILD/app/packages/ -c --no-external -o app.info -b /var/lib/jenkins/workspace/App-Coverage-Unittest/BUILD/app/
Reasoning: "-d BUILD/app/packages/" is what I want coverage for, "-b /var/lib/jenkins/workspace/App-Coverage-Unittest/BUILD/app/" is the root of my project in which I executed make (but on the build machine with a different workspace name...).
I also tried
lcov -d BUILD/app/packages/ --capture --no-external --output-file app.info
Reasoning : "-d BUILD/app/packages/" is what I want coverage for, don't set -b since relative path between each gcno/gcda and corresponding source file is the same as on the build machine, maybe lcov can figure it out.
In both cases I get errors like "Cannot open source file /var/lib/jenkins/workspace/App-Coverage/BUILD/app/packages/subdir/Myclass.cpp"
Note: The workspace folder in this path is that of the build machine, not the unit test machine. I thought that this is what the -b option is intended to solve. Clearly this is very suspicious and a valuable clue.
I also get errors like "Cannot open source file ../../../packages/subdir/Myclass.h", which I guess has to do with how I include header files.
I have tried specifying all the paths here. Is it possible to generate the coverage report in the workspace of the unittest job using lcov, like I'm trying to do here? If yes, which are the correct paths to specify for lcov -d and -b flags? If not, what do I need to change to make it work?
Fortunately the answer is yes, it is possible. I got a reply from an lcov dev providing me with the solution, thank you Peter!
He pointed out that all source code paths are hard-coded into the .gcno files during the compile step. Even when the source files cannot be found (which is what produces the warnings), lcov will still generate code coverage output based solely on the data found in the .gcda and .gcno files. However, the genhtml step will then fail because it won't be able to find the source code to annotate with the coverage data.
The solution is to use lcov's "geninfo_adjust_src_path" configuration setting. By using this setting, lcov is instructed to change source code paths as found in the .gcno files into the correct source code paths while writing the output .info files. So in my case:
lcov -d BUILD/app/packages/ --capture --no-external --output-file app.info \
  --rc geninfo_adjust_src_path="/var/lib/jenkins/workspace/App-Coverage/BUILD/ => /var/lib/jenkins/workspace/App-Coverage-Unittest/BUILD/"
The warnings "Cannot open source file" will still be there when invoking lcov, but the resulting .info file will contain the correct paths and can therefore be converted to HTML on the test machine using genhtml.
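If you prefer not to pass this on the command line, the same setting can, as far as I know, also live in an lcovrc file (for example ~/.lcovrc, or a project file passed with --config-file):
geninfo_adjust_src_path = /var/lib/jenkins/workspace/App-Coverage/BUILD/ => /var/lib/jenkins/workspace/App-Coverage-Unittest/BUILD/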

Rspec: Allure command line tool

After installing the allure-rspec gem and adding require 'allure-rspec' to the spec_helper.rb file, I am able to generate results in .xml format.
Now I want to convert this .xml into an actual HTML report; for this, the tutorial points to https://github.com/allure-framework/allure-cli
I don't understand how this would help in generating HTML reports.
I am using this line to generate .xml in results directory
bundle exec rspec spec/create_post/post_creation_spec.rb
From that GitHub link, clone (or download the zip of) the allure-cli repo to your local machine.
Then go to the project and create a root-level directory called directory-with-results.
Copy the .xml files you have into this directory.
Open a terminal at the root directory of this Maven project
and run these commands one by one:
allure generate directory-with-results
allure report open
If JAVA_HOME and MAVEN_HOME are configured properly, you will see the generated report in your default browser.
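Note that allure-cli is the older tool; if you use the newer Allure commandline instead (the one used in the first question above), the equivalent steps would be roughly:
allure generate directory-with-results --clean -o allure-report
allure open allure-report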

Automatically generate Code Coverage during nightly build

I have some problems getting the code coverage .coverage file generated in the nightly build.
What I have: I've configured my build to use a .runsettings file, with "Type of run settings" set to CodeCoverageEnabled.
The build is correctly running all the required unit tests and measuring the code coverage, using only a selected number of assemblies (specified in the .runsettings file).
In the build report, within VS2013, I can manually export the code coverage file (a .coverage file).
What I need:
I would need to configure the build to automatically generate that .coverage file in a target folder.
How do I do that?
The .coverage file is present as a part of the test results. You can use the .runsettings file to set an output path for the test results:
<ResultsDirectory>c:\TestResults</ResultsDirectory>
The .coverage file will be present in a subfolder within the results directory.
If you want to push it to another location, you can do that via a post-build script in your nightly build's process template.
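For context, a minimal .runsettings sketch showing where that element lives (keep your existing code coverage data collector section; only the ResultsDirectory element is the point here):
<?xml version="1.0" encoding="utf-8"?>
<RunSettings>
  <RunConfiguration>
    <!-- test results, including the .coverage file in a subfolder, are written here -->
    <ResultsDirectory>c:\TestResults</ResultsDirectory>
  </RunConfiguration>
  <!-- your existing DataCollectionRunSettings / Code Coverage collector section goes here -->
</RunSettings>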
