Automatically generate Code Coverage during nightly build - visual-studio-2013

I have some problems getting the code coverage .coverage file generated in nightly build.
What I have: I've configured my build to use a .runsettings file, with "Type of run settings" set to "CodeCoverageEnabled".
The build correctly runs all the required unit tests and measures code coverage for only a selected set of assemblies (specified in the .runsettings file).
In the build report, within VS2013, I can manually export the code coverage file (a .coverage file).
What I need:
I need to configure the build to automatically place that .coverage file in a target folder.
How do I do that?

The .coverage file is generated as part of the test results. You can use the .runsettings file to set an output path for the test results:
<ResultsDirectory>C:\TestResults</ResultsDirectory>
The .coverage file will be present in a subfolder within the results directory.
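For reference, a minimal sketch of where that element sits in a .runsettings file (the path is a placeholder; your existing Code Coverage data collector section stays as it is):
<?xml version="1.0" encoding="utf-8"?>
<RunSettings>
  <RunConfiguration>
    <!-- test results, including the .coverage file, land in subfolders under this directory -->
    <ResultsDirectory>C:\TestResults</ResultsDirectory>
  </RunConfiguration>
  <!-- keep your existing DataCollectionRunSettings / Code Coverage configuration here -->
</RunSettings>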
If you want to push it to another location, you can do that via a post-build script in your nightly build's process template.
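A rough sketch of such a post-build step, assuming PowerShell; the results directory and the target share are placeholders, not part of the original setup:
# collect every .coverage file from the test results and copy it to a target folder
$results = 'C:\TestResults'            # matches the ResultsDirectory above
$target  = '\\buildshare\coverage'     # hypothetical drop location
New-Item -ItemType Directory -Force -Path $target | Out-Null
Get-ChildItem -Path $results -Filter *.coverage -Recurse |
    Copy-Item -Destination $target -Force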

Related

How to generate .trx result file after TFS build instead of code coverage?

I recently added an xxx.runsettings file to my solution to pass parameters from TFS variables to my solution (a URL). Now I'm not able to get a .trx result file after running the TFS build; in the Test summary/results page only code coverage is generated.
(.runsettings code taken from here: https://msdn.microsoft.com/en-us/library/jj635153.aspx)
Can anyone here tell me how to edit the runsettings file so the .trx result file shows up in my test summary instead of code coverage?
TFS 2015 Update 3, TFS MSBuild.
I can see this in the build logs once the build is completed (Run functional test step, Publish results log):
2017-06-21T17:20:49.9138829Z Executing the powershell script: C:\agent\tasks\PublishTestResults\1.0.22\PublishTestResults.ps1
2017-06-21T17:20:50.0628925Z ##[warning]No test result files were found using search pattern 'C:\agent_work\2\s**\TestResults\xyz*.trx'.
Just removing the <ResultsDirectory>.\TestResults</ResultsDirectory> section from the runsettings file should fix your issue.
By the way, the Publish Test Results task does not work in your scenario, since it can only publish test result files that are on the build agent, while the .trx file is usually generated on the test agent by the Run Functional Tests task.
Since you are running the VSTest step, no .trx result file is generated. The VSTest step actually uses the VSTest.Console.exe command with the /logger:TfsPublisher logger, which publishes results directly rather than writing a .trx file. So if you use the built-in tasks such as Visual Studio Test or Run Functional Tests to run tests, results are automatically published and you do not need a separate Publish Test Results task.
To log results to a Visual Studio Test Results file (.trx), use /Logger:trx; see the vstest.console.exe command-line documentation for more details. Generating a .trx file is not related to the runsettings file.
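For illustration, a hedged example of such an invocation (the test assembly name is a placeholder):
vstest.console.exe MyTests.dll /Logger:trx /Settings:xxx.runsettings
By default the .trx file is written to a TestResults folder under the current directory.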
For TFS, the test results are published automatically; you can click the test run for more details.

How to configure a sonar-project.properties file for code coverage?

Currently my scanner runs through and only scans the parent, skipping the rest of my nested files. If I run SonarLint (using the CLI and specifying some test and source files), it tries to analyze 37k files instead of the few I need. I have been able to skip ~3k files by adding the <sonar.skip>true</sonar.skip> property to a pom file. However, I still can't configure the project to run across certain sub-folders and print out some kind of code coverage report. (Is JaCoCo needed for the latest version (6.3.0)? Or can code coverage be handled through some configuration?)
If Sonar seems to be analyzing too many files, it is probably because you have not set sonar.sources=src/main/java in your sonar-project.properties file, so it defaults to the basedir and includes everything.
SonarQube can't measure code coverage itself; it just reports on coverage reports produced by a tool like JaCoCo. It's a pity these things aren't clarified in https://docs.sonarqube.org/display/SCAN/Analyzing+with+SonarQube+Scanner, but with enough digging you can find good info on that site.
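A minimal sonar-project.properties sketch along those lines; the project key, paths, and report location are placeholders, and the exact JaCoCo property name depends on your SonarQube and Java plugin versions:
# hypothetical project identity -- replace with your own values
sonar.projectKey=my.org:my-project
sonar.projectName=My Project
sonar.projectVersion=1.0

# limit analysis to the folders you actually care about
sonar.sources=src/main/java
sonar.tests=src/test/java
sonar.java.binaries=target/classes

# point SonarQube at the coverage data produced by JaCoCo
# (e.g. sonar.jacoco.reportPath for the binary .exec report in this era of SonarQube)
sonar.jacoco.reportPath=target/jacoco.exec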

Cannot generate coverage report using lcov

I'm trying to use lcov to generate coverage reports for my unit test suite, but I cannot even capture a tracefile. The error messages indicate that the source files cannot be found. The code is compiled by a Jenkins job on a build machine and the unit tests are executed as a downstream job on a target machine. The source code and gcno files are transferred to the downstream job, which then executes the call to lcov. Here follow all the details; a cup of coffee might be needed.
On the build machine, make is executed in
/var/lib/jenkins/workspace/App-Coverage/BUILD/app/
The source code which I want coverage for is in subdirectories in
/var/lib/jenkins/workspace/App-Coverage/BUILD/app/packages/
The object files and gcno files are generated in a subdirectory o relative to the corresponding cpp file. So for example
/var/lib/jenkins/workspace/App-Coverage/BUILD/app/packages/subdir/Myclass.cpp
/var/lib/jenkins/workspace/App-Coverage/BUILD/app/packages/subdir/o/Myclass.o
/var/lib/jenkins/workspace/App-Coverage/BUILD/app/packages/subdir/o/Myclass.gcno
The source files and gcno files are copied to the unit test machine keeping the same folder structure and end up in
/var/lib/jenkins/workspace/App-Coverage-Unittest/BUILD/app/
Note: There is a difference in the name of the workspace folder, "App-Coverage-Unittest" instead of "App-Coverage" since these two Jenkins jobs cannot have the same name.
So there is now for example
/var/lib/jenkins/workspace/App-Coverage-Unittest/BUILD/app/packages/subdir/Myclass.cpp
/var/lib/jenkins/workspace/App-Coverage-Unittest/BUILD/app/packages/subdir/o/Myclass.o
/var/lib/jenkins/workspace/App-Coverage-Unittest/BUILD/app/packages/subdir/o/Myclass.gcno
The unit tests are executed in
/opt/app/test/app
Using GCOV_PREFIX_STRIP and GCOV_PREFIX, I make the gcda files appear in the same folders as the gcno files (see the sketch after these paths), so for example
/var/lib/jenkins/workspace/App-Coverage-Unittest/BUILD/app/packages/subdir/o/Myclass.gcno
/var/lib/jenkins/workspace/App-Coverage-Unittest/BUILD/app/packages/subdir/o/Myclass.gcda
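For reference, a rough sketch of that redirection given the paths above; the strip count and the test-runner name are assumptions:
# drop the 5 leading components of the compile-time path (/var/lib/jenkins/workspace/App-Coverage)
# and prepend the unit-test workspace, so each .gcda lands next to its copied .gcno
export GCOV_PREFIX_STRIP=5
export GCOV_PREFIX=/var/lib/jenkins/workspace/App-Coverage-Unittest
cd /opt/app/test/app && ./run_unit_tests   # hypothetical test runner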
Now I want to generate a coverage report using lcov, but I don't seem to understand how to set the paths correctly. The following examples were executed from /var/lib/jenkins/workspace/App-Coverage-Unittest/ by the Jenkins unit test job.
For example I tried
lcov -d BUILD/app/packages/ -c --no-external -o app.info -b /var/lib/jenkins/workspace/App-Coverage-Unittest/BUILD/app/
Reasoning: "-d BUILD/app/packages/" is what I want coverage for, "-b /var/lib/jenkins/workspace/App-Coverage-Unittest/BUILD/app/" is the root of my project in which I executed make (but on the build machine with a different workspace name...).
I also tried
lcov -d BUILD/app/packages/ --capture --no-external --output-file app.info
Reasoning: "-d BUILD/app/packages/" is what I want coverage for; don't set -b, since the relative path between each gcno/gcda file and its corresponding source file is the same as on the build machine, so maybe lcov can figure it out.
In both cases I get errors like "Cannot open source file /var/lib/jenkins/workspace/App-Coverage/BUILD/app/packages/subdir/Myclass.cpp"
Note: The workspace folder in this path is that of the build machine, not the unittest machine. I thought that this is what the -b option is intended to solve. Clearly this is very suspicious and a valuable clue.
I also get errors like "Cannot open source file ../../../packages/subdir/Myclass.h", which I guess has to do with how I include header files.
I have tried specifying all the paths here. Is it possible to generate the coverage report in the workspace of the unittest job using lcov, like I'm trying to do here? If yes, which are the correct paths to specify for lcov -d and -b flags? If not, what do I need to change to make it work?
Fortunately the answer is yes, it is possible. I got a reply from an lcov dev providing me with the solution, thank you Peter!
He pointed out that all source code paths are hard-coded into the .gcno files during the compile step. lcov will still generate code coverage output even when the source files cannot be found (it just emits the warnings), since it works solely from the data in the .gcda and .gcno files. However, the genhtml step will fail, because it won't be able to find the source code to annotate with the coverage data.
The solution is to use lcov's "geninfo_adjust_src_path" configuration setting. By using this setting, lcov is instructed to change source code paths as found in the .gcno files into the correct source code paths while writing the output .info files. So in my case:
lcov -d BUILD/app/packages/ --capture --no-external --output-file app.info \
  --rc geninfo_adjust_src_path="/var/lib/jenkins/workspace/App-Coverage/BUILD/ => /var/lib/jenkins/workspace/App-Coverage-Unittest/BUILD/"
The warnings "Cannot open source file" will still be there when invoking lcov, but the resulting .info file will contain the correct paths and can therefore be converted to HTML on the test machine using genhtml.

MsTest unable to copy test data files to build server

I am trying to implement automated unit testing with each build using TFS.
Problem statement :
I have created a few XML files which store test data and are set to Copy Always. When run locally, the files are picked up from the bin folder. When I schedule a build, the build process looks for the files in the Out folder under TestResults on the build server. The Out folder contains the DLLs but not the XML files, so the tests cannot find them and the build partially fails.
You can specify additional files to deploy in your test settings file:
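A minimal sketch of what that section of a .testsettings file could look like (the settings name and file name are placeholders):
<?xml version="1.0" encoding="UTF-8"?>
<TestSettings name="NightlyBuild" xmlns="http://microsoft.com/schemas/VisualStudio/TeamTest/2010">
  <!-- files listed here are copied to the test deployment (Out) folder before the run -->
  <Deployment>
    <DeploymentItem filename="TestData\customers.xml" />
  </Deployment>
</TestSettings>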
More details here - https://msdn.microsoft.com/en-us/library/ms182475.aspx
You could also use the DeploymentItem Attribute.
https://msdn.microsoft.com/en-us/library/microsoft.visualstudio.testtools.unittesting.deploymentitemattribute.aspx
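A hedged example of the attribute-based approach (class, method, and file names are hypothetical):
using System.IO;
using System.Xml.Linq;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class CustomerTests
{
    // populated by the MSTest framework; exposes the deployment (Out) directory
    public TestContext TestContext { get; set; }

    [TestMethod]
    [DeploymentItem(@"TestData\customers.xml")]   // copies the file into the deployment folder
    public void LoadsCustomerTestData()
    {
        var path = Path.Combine(TestContext.DeploymentDirectory, "customers.xml");
        Assert.IsTrue(File.Exists(path));
        Assert.IsNotNull(XDocument.Load(path).Root);
    }
}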

TeamCity export CoverageReport.xml to a predefined directory

I am trying out TeamCity for our CI.
We have a build step running NUnit for unit tests and dotCover for code coverage. The process goes through fine, but I'm just wondering if it is possible to export the build log and dotCover's CoverageReport.xml to a predefined directory on the local machine.
Or maybe even export all the artifacts and reports produced by the build to a local folder.
Just add a Command Line build step that copies the reports you need to the desired location on the local machine.
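A rough sketch of such a step on a Windows agent; both paths are assumptions (adjust the source to wherever dotCover writes CoverageReport.xml in your configuration), and %build.number% is a standard TeamCity parameter:
rem copy the coverage report into a per-build folder on the local machine
if not exist "D:\ci-reports\%build.number%" mkdir "D:\ci-reports\%build.number%"
xcopy /Y "CoverageReport.xml" "D:\ci-reports\%build.number%\"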
