How to install NUnit 3's nunit3-console.exe in TeamCity 9.x

NUnit 3.0 is now supported by TeamCity 9.1.x, but you have to install the runner and specify the path to nunit3-console.exe in the build step. My question is: where do I copy nunit3-console.exe? Do I have to put it on all the agents? Or do I put it in a directory on my main TeamCity server so it gets shared or pulled by the agents? There doesn't seem to be good documentation on where to copy these files so that all the agents can use them.

You should have the NUnit console on each agent where you would like to run NUnit tests.
The best option is:
Add a reference to the NuGet package (https://www.nuget.org/packages/NUnit.Runners/).
To restore the package you can use the "NuGet Installer" build step; see the following blog post: https://blog.jetbrains.com/teamcity/2013/08/nuget-package-restore-with-teamcity/
After that, just set a path like "packages\NUnit.Console.3.0.0\tools\nunit3-console.exe" from the restored NuGet package.
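For reference, a minimal packages.config sketch that pulls the runner in through package restore (the version number here is illustrative; use whatever version you actually want restored):

```xml
<?xml version="1.0" encoding="utf-8"?>
<packages>
  <!-- NUnit.Runners is a meta-package that brings in the NUnit console runner -->
  <package id="NUnit.Runners" version="3.0.0" />
</packages>
```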

Building on @NikolayP's answer:
Add a reference to the NuGet package (https://www.nuget.org/packages/NUnit.Runners/).
To restore the package you can use the "NuGet Installer" build step; see the following blog post: https://blog.jetbrains.com/teamcity/2013/08/nuget-package-restore-with-teamcity/
After that, just set a path like "packages\NUnit.Console.3.0.0\tools\nunit3-console.exe" from the restored NuGet package.
I wrote the following PowerShell script to determine the correct NUnit.ConsoleRunner package directory and populate a TeamCity variable before the NUnit step runs. It uses the most recent version of the NUnit.ConsoleRunner package.
$SrcDirectory = "%src.directory%"
$PackagesDirectory = Join-Path $SrcDirectory packages
$NUnitConsoleRunnerPackageDirectory = Get-ChildItem (Join-Path $PackagesDirectory "NUnit.ConsoleRunner.*") |
    ForEach-Object {
        [pscustomobject]@{
            Directory = $_.FullName
            Version   = [Version]::Parse($_.Name -replace "^NUnit\.ConsoleRunner\.", "")
        }
    } |
    Sort-Object Version -Descending |
    Select-Object -First 1 |
    ForEach-Object { $_.Directory }
if (!$NUnitConsoleRunnerPackageDirectory) {
    throw [IO.DirectoryNotFoundException] "NUnit console runner package directory not found"
}
Write-Output "##teamcity[setParameter name='nunit.consolerunner.directory' value='$NUnitConsoleRunnerPackageDirectory']"
Note that you'll need to define the src.directory variable to point to the directory that contains the packages directory on your build agent, or otherwise supply the necessary root directory for the PowerShell script to work. You'll also need to define the nunit.consolerunner.directory variable with a default value of empty.
The script will also throw an exception if, for whatever reason, an NUnit.ConsoleRunner directory could not be found.
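Once the parameter is populated, the NUnit build step's console path field can reference it; a sketch (the tools subfolder is where the NUnit.ConsoleRunner package keeps the executable):

```
Path to NUnit console tool: %nunit.consolerunner.directory%\tools\nunit3-console.exe
```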

You can also follow these instructions: https://confluence.jetbrains.com/display/TCD9/Getting+Started+with+NUnit

Builds run on agents, so you need to install NUnit 3 on all of the agents where you want to run the build.

There are some gotchas around the TeamCity runner. Specifically, its default behaviour is not to run the specs in their own AppDomains with their own base directory, as NUnit 2 (and the NUnit 3 Visual Studio Test Adapter) does.
There is a (currently undocumented) configuration property in the TeamCity 9.x build series that enables you to change this behaviour. I've written about it here.

Try the latest version of @NathanAldenSr's script below.
You still need to add the nunit.consolerunner.directory parameter to Configuration Parameters at http://teamcityserver/admin/editProject.html?projectId=yourId&tab=projectParams
$SrcDirectory = "%teamcity.build.checkoutDir%"
$PackagesDirectory = Join-Path $SrcDirectory packages
Write-Output "PackagesDirectory" $PackagesDirectory
$NUnitConsoleRunnerPackageDirectory = Get-ChildItem (Join-Path $PackagesDirectory "NUnit.ConsoleRunner.*") |
    ForEach-Object {
        [pscustomobject]@{
            Directory = $_.FullName
            Version   = [Version]::Parse($_.Name -replace "^NUnit\.ConsoleRunner\.", "")
        }
    } |
    Sort-Object Version -Descending |
    Select-Object -First 1 |
    ForEach-Object { $_.Directory }
if (!$NUnitConsoleRunnerPackageDirectory) {
    throw [IO.DirectoryNotFoundException] "NUnit console runner package directory not found"
}
$NUnitConsoleRunnerPackageDirectory = Join-Path $NUnitConsoleRunnerPackageDirectory tools
Write-Output "NUnitConsoleRunnerPackageDirectory" $NUnitConsoleRunnerPackageDirectory
Write-Output "##teamcity[setParameter name='nunit.consolerunner.directory' value='$NUnitConsoleRunnerPackageDirectory']"

Also building on @NikolayP's answer:
NuGet supports the command-line argument -ExcludeVersion for the install operation. From the docs:
Installs the package to a folder named with only the package name and not the version number.
This results in a path that is easy to use in a subsequent NUnit runner build step and makes it possible to drop @NathanAldenSr's clever workaround.
As of TeamCity 2017.1.3 (and probably earlier versions), this feature is even exposed as a parameter for the NuGet Installer runner (see the Restore Options), but that requires a solution path. The example below is suitable for a generic, on-the-fly, transient installation of NUnit.
For easy copy and paste (adjust the NUnit version to your requirements):
Executable: %teamcity.tool.NuGet.CommandLine.DEFAULT%\tools\nuget.exe
Parameters: install NUnit.Console -Version 3.7.0 -ExcludeVersion -OutputDirectory %system.teamcity.build.tempDir%\NUnit
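With -ExcludeVersion the console ends up at a version-independent location, so the NUnit runner step can point at a stable path; a sketch (assuming the runner lands in the NUnit.ConsoleRunner package folder that NUnit.Console pulls in):

```
%system.teamcity.build.tempDir%\NUnit\NUnit.ConsoleRunner\tools\nunit3-console.exe
```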

I know that it is now July of 2018, but none of these answers were clear to me. Why do I need to install a console on each agent? There has to be a better way. When I was adding my build step for running tests, I noticed that the text under the input for the path to the NUnit console tool said, "Paths relative to the checkout directory are supported." All I did was add the NuGet packages to my test project in my solution: NUnit v3.10.1 and then NUnit.Console v3.8.0. Then in TeamCity I simply added the relative path "packages\NUnit.ConsoleRunner.3.8.0\tools\nunit3-console.exe".

Related

Team City Code Coverage and Unit Test results not showing on Sonar

I am trying to upload unit test and dotCover code analysis results from TeamCity to a Sonar server. Code coverage and unit test results show up in TeamCity, but no code coverage/unit tests appear in Sonar.
TeamCity Unit test step:
Followed by Powershell script:
I have the following additional parameters in the Sonar Runner step:
-Dsonar.cs.vstest.reportsPaths=TestResults.trc
-Dsonar.cs.dotcover.reportsPaths='%sonar.coverageReport%'
Does anyone know how to fix this?
Thanks.
Managed to fix the issue using the below script in step 3:
$Files = Get-ChildItem %system.teamcity.build.tempDir% `
    -Filter coverage_dotcover*.data `
    | Where-Object { $_.Length -gt 50 } `
    | Select-Object -ExpandProperty FullName
$snapshot = [string]::Join(";", $Files)
& %teamcity.tool.dotCover%\dotCover.exe merge `
    /Source=$snapshot `
    /Output=%env.TEMP%\dotCoverReport.dcvr
& %teamcity.tool.dotCover%\dotCover.exe report `
    /Source=%env.TEMP%\dotCoverReport.dcvr `
    /Output=%sonar.coverageReport% `
    /ReportType=HTML
What TeamCity version do you have? TeamCity 9.1.1 had a bug that caused test report files to be written to the build temp directory. In that case the Sonar plugin, which expects the trx file to be in the checkout directory, does not find it.
There are several ways to solve it: the first is to upgrade to TeamCity 9.1.2 or above; the second is to pass an absolute path in the report paths variable:
-Dsonar.cs.vstest.reportsPaths=%system.teamcity.build.tempDir%/TestResults.trc
The third way is to pass an absolute path, pointing at the checkout directory, in the results file field of the MSTest runner:
%system.teamcity.build.checkoutDir%/TestResults.trc

TeamCity dotCover report path for Sonar

I'm trying to integrate Sonar analysis into my TeamCity build process. I have an NUnit build step which runs my unit tests and then runs dotCover for the coverage.
My next step is the sonar-runner. The configuration that currently exists is gallio.mode=dotCover and sonar.gallio.mode=reuseReport, but I also need sonar.gallio.reports.path.
Does anybody know the path to the dotCover report generated in the previous step?
I spent some time on the same issue, but with a newer Sonar C# plugin (v2.3): Gallio support has been dropped, but the report is still required.
To answer the question directly: TeamCity puts the dotCover snapshot file into a temp folder with a name like coverage_dotcover27574681205420364801.data (the digits are random).
The procedure is:
Create a PowerShell build step in TeamCity after the step with test and coverage (you may use Command Line if you prefer).
Get the full dotCover snapshot name in the temp folder.
Run dotCover to produce an HTML report from the snapshot. Note: Sonar (C# plugin v2.3) supports only dotCover HTML reports.
Pass the produced HTML report to Sonar.
PowerShell script:
$snapshot = Get-ChildItem "%system.teamcity.build.tempDir%" `
-Filter coverage_dotcover*.data `
| select -ExpandProperty FullName -First 1
%teamcity.dotCover.home%\dotCover.exe report `
/ReportType=HTML /Source="$snapshot" `
/Output="%sonar.coverageReport%"
Now you can specify your report in the sonar runner as sonar.cs.dotcover.reportsPaths='%sonar.coverageReport%', where %sonar.coverageReport% is a property defined in TeamCity.
It seems TeamCity 2017 no longer creates coverage_dotcover*.data files. Instead it creates *.dcvr files.
There are potentially multiple files which need to be merged before you can create the report, so the PowerShell needs updating.
Using the steps provided by Oleksandr, just update the script to be:
$snapshotfiles = Get-ChildItem "%system.teamcity.build.tempDir%" `
    -Recurse -Filter *.dcvr `
    | Select-Object -ExpandProperty FullName
$snapshots = $snapshotfiles -join ";"
%teamcity.dotCover.home%\dotCover.exe merge /Source=$snapshots `
    /Output=dotcovermerge.dcvr
%teamcity.dotCover.home%\dotCover.exe report `
    /ReportType=HTML /Source=dotcovermerge.dcvr `
    /Output="%sonar.coverageReport%"
Then the property %sonar.coverageReport% can be passed to the SonarQube scanner. By the way, you need to create a parameter in TeamCity for %sonar.coverageReport%, e.g. "sonarcoverage.html".
I couldn't find a way to do this using the built-in NUnit runner. I managed to get it working by using a PowerShell build step to manually call the required commands.
The first step is to run the NUnit tests via Gallio within a dotCover cover call:
& dotCover cover `
/TargetExecutable="C:\Program Files\Gallio\bin\Gallio.Echo.exe" `
/TargetArguments="/report-type:XML /report-name-format:test-report /runner:IsolatedProcess /report-directory:.\Gallio .\Path\Test.dll" `
/Filters="+:WhatToCover" `
/Output=coverage.snapshot
The Gallio test report is then available to be picked up by Sonar with reuseReport, TeamCity automatically detects the test results.
You can make TeamCity directly process the coverage snapshot by writing a service message to standard output:
Write-Host "##teamcity[importData type='dotNetCoverage' tool='dotcover' path='coverage.snapshot']"
To get the coverage info into a format usable by Sonar you need to use the dotCover report command and the undocumented report type TeamCityXML:
& dotCover report /Source=coverage.snapshot /Output=coverage-report.xml /ReportType=TeamCityXML
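Putting it together, the Sonar side can then reuse the Gallio report with properties like the following sketch (the report path follows the /report-directory and /report-name-format arguments in the cover call above):

```
gallio.mode=dotCover
sonar.gallio.mode=reuseReport
sonar.gallio.reports.path=Gallio/test-report.xml
```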
We are using SonarScanner for MSBuild and needed to add the TeamCity temporary build path to the begin step.
Run the SonarScanner.MSBuild.exe begin command, specifying the temp build directory to be where the reports will be available, using
/d:sonar.cs.dotcover.reportsPaths="%system.teamcity.build.tempDir%".
Build your project using MSBuild
Run your test tool, instructing it to produce a report at the same location specified earlier to the MSBuild SonarQube Runner
Run the SonarScanner.MSBuild.exe end command
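The sequence above can be sketched as build-step commands (the project key MyProject and the solution name are placeholders):

```
SonarScanner.MSBuild.exe begin /k:"MyProject" /d:sonar.cs.dotcover.reportsPaths="%system.teamcity.build.tempDir%"
MSBuild.exe MySolution.sln
rem ...run the tests under dotCover, writing the report into %system.teamcity.build.tempDir%...
SonarScanner.MSBuild.exe end
```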
The coverage report from the nunit/dotcover build step is stored in the teamcity hidden artifacts directory. You need to add that as an artifact dependency in the sonar step. I wouldn't recommend the hidden artifact route but it can be done.
This is the artifact path I used to publish the report which worked for a few weeks then began to fail:
%env.TEAMCITY_DATA_PATH%/system\artifacts\**\%teamcity.build.id%\.teamcity\.NETCoverage\dotCover.snapshot
Once you have the report, you're home free though.
Although it might be a bit of a cumbersome solution, I'm using two chained builds.
The first build configuration builds the solution and runs the tests/coverage, plus saves the dotCover snapshot as an artifact.
The other build has an artifact dependency on the first one on .teamcity/.NETCoverage/dotCover.snapshot and runs
"C:\Program Files (x86)\JetBrains\dotCover\v2.7\bin\dotCover.exe" report /ReportType=HTML /Source="dotCover.snapshot" /Output="dotCover.html" and, as the latest step, executes SonarRunner (your project properties file will point to the "dotCover.html").
(Tried with SonarQube 5, dotCover 2.7, TC8)
Below is what worked for me. I am on TeamCity 2018.2.4, and the bundled version of dotCover was not generating anything, so I upgraded it as well, to the latest version, which is 2019.1.1.
I could not make the agent use the latest version of the dotCover tools, so I had to construct the folder path for it rather than using the default %teamcity.dotCover.home% variable.
I also did not want to use the temp folder, as things were getting deleted there by the build.
$snapshot_file_list = Get-ChildItem "%system.teamcity.build.tempDir%" -recurse -Filter coverage_dotcover*.data | select -ExpandProperty FullName
Echo "dotCover Snapshot files"
Echo $snapshot_file_list
$joined_snapshot_files = $snapshot_file_list -join ";"
Echo "Merging data files to %system.teamcity.build.checkoutDir%\dotcovermerge.dcvr"
& "%teamcity.agent.tools.dir%\JetBrains.dotCover.CommandLineTools.2019.1.1\dotCover.exe" merge /Source=$joined_snapshot_files /Output="%system.teamcity.build.checkoutDir%\dotcovermerge.dcvr" /LogFile="%system.teamcity.build.checkoutDir%\dotCover.log"
Echo "Generating dotCover Html report"
Echo "%system.teamcity.build.checkoutDir%\%sonar.coverageReport%"
& "%teamcity.agent.tools.dir%\JetBrains.dotCover.CommandLineTools.2019.1.1\dotCover.exe" report /ReportType=HTML /Source="%system.teamcity.build.checkoutDir%\dotcovermerge.dcvr" /Output="%system.teamcity.build.checkoutDir%\%sonar.coverageReport%"

Installing multiple packages via Nuget by single command

I am using NuGet via Visual Studio's Package Manager Console. For every project I need to add a few packages, e.g. xunit.contribs, fluentassertions and nsubstitute. To do that I type 3 commands in the console.
I understand that the console is just another PowerShell host, so there should be a way to create a script (a kind of Add-Test-Stuff.ps1) that will add several packages at once. What is the best way to do that?
I typically define my packages in arrays and install them using a foreach loop:
$angular = "AngularJS.Animate", "AngularJS.Core", "AngularJS.Locale", "AngularJS.Resource", "AngularJS.Route", "AngularJS.Sanitize", "AngularJS.Touch"
$angular | foreach {Install-Package $_}
This can also be written as a one-liner
"AngularJS.Animate", "AngularJS.Core", "AngularJS.Locale", "AngularJS.Resource", "AngularJS.Route", "AngularJS.Sanitize", "AngularJS.Touch" | foreach {Install-Package $_}
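The same idea can be packaged as the Add-Test-Stuff.ps1 the question asks for; a sketch to be run from the Package Manager Console (the package IDs are the ones from the question; the -ProjectName handling is an assumption about how you'd want to target a project):

```powershell
# Add-Test-Stuff.ps1 -- run from the NuGet Package Manager Console
param(
    # Target project; when omitted, Install-Package uses the console's default project
    [string]$ProjectName
)

$packages = "xunit.contribs", "fluentassertions", "nsubstitute"

foreach ($package in $packages) {
    if ($ProjectName) {
        Install-Package $package -ProjectName $ProjectName
    } else {
        Install-Package $package
    }
}
```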
This is not a direct answer to your question, sorry, but I've seen a very nice practice adopted for the Code52 IdeaStrike project.
They do not keep any packages in source control; instead, the packages are downloaded and installed automatically during the first build of the solution, based on packages.config.
The details of configuration are here:
Using NuGet without committing packages to source control
Put any script in your solution directory and you will be able to execute it from the Package Manager Console.

Using TeamCity build templates how can I remove "Clean all files before build" from the template?

I have a number of daily and in-response-to-svn-change builds that all derive from the same configuration template. I can choose what branch to look at, what build steps to carry out and, of course, what triggers the build.
However, I would like the daily builds to do a completely clean checkout, whereas the svn-dependent ones (that obviously happen throughout the day) I am happy to simply do an update.
Simply un-setting the option in the template does not let me set it in each of the derived builds. Is there a build parameter that I can use to switch on clean builds for those builds that require it?
In each Project's "Version Control Settings", look under "Checkout setting". There are options to specify the checkout dir and also a checkbox to clean files before a build.
There is an additional Build Feature (Swabra) you can add that does a clean checkout if required. I have enabled this for our nightly builds but have not yet investigated the consequences.
See here for more detail on Swabra
Seems to me that you need one template for daily and one for nightly builds.
I so disliked the accepted answer that I put a conditional scorch into my build script.
Then you can either set a default in the template and override it in the build that needs the scorching, or define it on an opt-out basis.
Get-ChildItem . -Include obj,bin -Recurse -Force | Remove-Item -Recurse -Force
Remove-Item Artifacts -Recurse -Force
Why answer with such an unsightly hack? I'm hoping someone will chime in with a real solution and [then downvote this or preferably] comment.
There's a way to declare variables in the template and substitute them with configuration parameters in each build project. More details: http://blogs.jetbrains.com/teamcity/2010/10/14/overriding-template-settings/

CruiseControl.Net: How Does One Clear Obsolete Build History?

I just started using CCNet, and in the process of getting my build projects set up I racked up a lot of build history from trial and error. I really don't want to keep that old stuff around, but I can't seem to see where/how to get rid of it. I'm sure this is a silly question, and I apologize if I'm overlooking something that should be obvious. I did RTM and Google for about a half hour, and poked around my CCNet installation, but it's not jumping out at me. I deleted the state files for the projects (don't know if that has anything to do with it), but the old builds are still there if I drill into a project's stats from the dashboard. Any suggestions? Thanks.
Answered: I had explicitly set the artifacts directory to a location that was not under the CCNet server directory and consequently never looked in it again... went looking and, disco, there's the build histories.
Don't forget you can use Artifact Cleanup Publisher to keep your build history from growing to the size of Mars over time.
Assuming you have a project called "Dev" and you've installed CCNet into the default location, you'll have a folder called:
c:\Program Files\CruiseControl.NET\server\Dev
and a Dev.state file in:
c:\Program Files\CruiseControl.NET\server
Just delete both the folder and the state file.
What you're looking for are the "artifacts" folders. Check your ccnet.config file for the artifact directory tag.
Stop your service, delete the artifact directory folder, and restart your service.
The logs are stored in the artifacts directories under artifacts\MyProjectName\Build\log*.xml.
The state file stores things like the last build date, time, and related info.
It is best to stop the service, then delete the .state files in ProgFiles\CC.net\server and also delete the artifacts\MyProjectName\Build\log*.xml files.
As mentioned above, use the Artifact Cleanup Publisher to keep the number of artifacts to a sensible level.
If you have a lot of projects and need to do a retrospective cleanup, you could use the following Powershell script to remove old log files:
$limit = (Get-Date).AddDays(-60)
Get-ChildItem -Path D:\Builds -Filter MatchMyProjects.* | ForEach-Object {
    $projectPath = $_.FullName
    $logsPath = Join-Path $projectPath "Logs"
    Write-Host Removing logs from folder $logsPath
    Get-ChildItem -Path $logsPath -Force -Filter *.xml |
        Where-Object { !$_.PSIsContainer -and $_.CreationTime -lt $limit } |
        Remove-Item -Force
}
Thanks to this answer: Delete files older than 15 days using PowerShell