Is there a simple way to have TeamCity include a text or html change-log as one of its output artifacts?
Perhaps I need to go down the route of having MSBuild or some other process create the change log, but since TeamCity generates one for every build, I'm wondering if there is already a simple way to access it as an artifact and include it in the artifact paths directives so that it can be part of a release package.
Yes, the change log is accessible as a file; the path to this file is in the TeamCity build parameter:
%system.teamcity.build.changedFiles.file%
So you could do this:
Add a command-line build step to your build.
Use type Custom Script.
Enter this script:
copy "%system.teamcity.build.changedFiles.file%" changelog.txt
Finally edit the artifact rules for your build to include the changelog.txt in your artifacts (General settings -> Artifact paths -> Add "changelog.txt").
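If you prefer a PowerShell build step over the cmd copy, here is a minimal sketch of the same step (the parameter name is the documented one; the existence check is an extra safeguard, not part of the original recipe):

# Copy TeamCity's changed-files list into the checkout directory as changelog.txt.
# TeamCity resolves %system.teamcity.build.changedFiles.file% before the step runs.
$source = "%system.teamcity.build.changedFiles.file%"
if (Test-Path $source) {
    Copy-Item -Path $source -Destination "changelog.txt"
} else {
    # A build without detected changes may lack the list; emit an empty changelog instead.
    Set-Content -Path "changelog.txt" -Value ""
}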
You can generate a change log via the REST API of TeamCity. A PowerShell script for this can be found below.
For TeamCity v10.x & above:
<#
.SYNOPSIS
Generates a project change log file.
.LINK
Script posted over:
http://open.bekk.no/generating-a-project-change-log-with-teamcity-and-powershell
Also See https://stackoverflow.com/questions/4317409/create-changelog-artifact-in-teamcity
#>
# Where the changelog file will be created
$outputFile = "%system.teamcity.build.tempDir%\releasenotesfile_%teamcity.build.id%.txt"
# the url of teamcity server
$teamcityUrl = "%teamcity.serverUrl%"
# username/password to access Teamcity REST API
$authToken=[Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes("%system.teamcity.auth.userId%:%system.teamcity.auth.password%"))
# Build id for the release notes
$buildId = %teamcity.build.id%
# Get the commit messages for the specified change id
# Ignore messages containing #ignore
# Ignore empty lines
Function GetCommitMessages($changeid)
{
    $request = [System.Net.WebRequest]::Create("$teamcityUrl/httpAuth/app/rest/changes/id:$changeid")
    $request.Headers.Add("AUTHORIZATION", "Basic $authToken");
    $xml = [xml](new-object System.IO.StreamReader $request.GetResponse().GetResponseStream()).ReadToEnd()
    Microsoft.PowerShell.Utility\Select-Xml $xml -XPath "/change" |
        where { ($_.Node["comment"].InnerText.Length -ne 0) -and (-Not $_.Node["comment"].InnerText.Contains('#ignore')) } |
        foreach { "+ $($_.Node["user"].name) : $($_.Node["comment"].InnerText.Trim().Replace("`n"," "))`n" }
}
# Grab all the changes
$request = [System.Net.WebRequest]::Create("$teamcityUrl/httpAuth/app/rest/changes?build=id:$($buildId)")
$request.Headers.Add("AUTHORIZATION", "Basic $authToken");
$xml = [xml](new-object System.IO.StreamReader $request.GetResponse().GetResponseStream()).ReadToEnd()
# Then get all commit messages for each of them
$changelog = Microsoft.PowerShell.Utility\Select-Xml $xml -XPath "/changes/change" | Foreach {GetCommitMessages($_.Node.id)}
$changelog > $outputFile
Write-Host "Changelog saved to ${outputFile}:"
$changelog
For versions before TeamCity v10.x:
<#
.SYNOPSIS
Generates a project change log file.
.LINK
Script posted over:
http://open.bekk.no/generating-a-project-change-log-with-teamcity-and-powershell
Also See https://stackoverflow.com/questions/4317409/create-changelog-artifact-in-teamcity
#>
# Where the changelog file will be created
$outputFile = "%system.teamcity.build.tempDir%\releasenotesfile_%teamcity.build.id%.txt"
# the url of teamcity server
$teamcityUrl = "%teamcity.serverUrl%"
# username/password to access Teamcity REST API
$authToken=[Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes("%system.teamcity.auth.userId%:%system.teamcity.auth.password%"))
# Build id for the release notes
$buildId = %teamcity.build.id%
# Get the commit messages for the specified change id
# Ignore messages containing #ignore
# Ignore empty lines
Function GetCommitMessages($changeid)
{
    $request = [System.Net.WebRequest]::Create("$teamcityUrl/httpAuth/app/rest/changes/id:$changeid")
    $request.Headers.Add("AUTHORIZATION", "$authToken");
    $xml = [xml](new-object System.IO.StreamReader $request.GetResponse().GetResponseStream()).ReadToEnd()
    Microsoft.PowerShell.Utility\Select-Xml $xml -XPath "/change" |
        where { ($_.Node["comment"].InnerText.Length -ne 0) -and (-Not $_.Node["comment"].InnerText.Contains('#ignore')) } |
        foreach { "+ $($_.Node["user"].name) : $($_.Node["comment"].InnerText.Trim().Replace("`n"," "))`n" }
}
# Grab all the changes
$request = [System.Net.WebRequest]::Create("$teamcityUrl/httpAuth/app/rest/changes?build=id:$($buildId)")
$request.Headers.Add("AUTHORIZATION", "$authToken");
$xml = [xml](new-object System.IO.StreamReader $request.GetResponse().GetResponseStream()).ReadToEnd()
# Then get all commit messages for each of them
$changelog = Microsoft.PowerShell.Utility\Select-Xml $xml -XPath "/changes/change" | Foreach {GetCommitMessages($_.Node.id)}
$changelog > $outputFile
Write-Host "Changelog saved to ${outputFile}:"
$changelog
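To wire either script into TeamCity, run it as a PowerShell build step and then include the generated file in the artifact paths (a sketch; the rule mirrors the $outputFile defined above and publishes it under a release-notes directory):

%system.teamcity.build.tempDir%\releasenotesfile_%teamcity.build.id%.txt => release-notes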
Possibly you should use Service Messages.
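For example, a build step can register a generated file as an artifact by printing the publishArtifacts service message (a sketch; changelog.txt stands for whatever file your script wrote):

# Publish the generated changelog as a build artifact via a TeamCity service message.
Write-Host "##teamcity[publishArtifacts 'changelog.txt']"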
The above script works; however, it only includes the check-in comments of the current build.
So I've slightly amended the script so that it includes all changes since the last successful build.
In TeamCity it's tricky to get the last successful build number, so I just get the last 10 builds and then iterate over their changes until I've found the last successful build.
<#
.SYNOPSIS
Generates a project change log file.
.LINK
Script posted over:
http://open.bekk.no/generating-a-project-change-log-with-teamcity-and-powershell
#>
# Where the changelog file will be created
$outputFile = "%system.teamcity.build.tempDir%\releasenotesfile_%teamcity.build.id%.txt"
# the url of teamcity server
$teamcityUrl = "%teamcity.serverUrl%"
# username/password to access Teamcity REST API
$authToken=[Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes("%system.teamcity.auth.userId%:%system.teamcity.auth.password%"))
# Build id for the release notes
$buildId = %teamcity.build.id%
# unique id of the project
$buildType = "%system.teamcity.buildType.id%"
$changelog = ""
# Get the commit messages for the specified change id
# Ignore messages containing #ignore
# Ignore empty lines
Function GetCommitMessages($changeid)
{
    $request = [System.Net.WebRequest]::Create("$teamcityUrl/httpAuth/app/rest/changes/id:$changeid")
    $request.Headers.Add("AUTHORIZATION", $authToken);
    $xml = [xml](new-object System.IO.StreamReader $request.GetResponse().GetResponseStream()).ReadToEnd()
    Microsoft.PowerShell.Utility\Select-Xml $xml -XPath "/change" |
        where { ($_.Node["comment"].InnerText.Length -ne 0) -and (-Not $_.Node["comment"].InnerText.Contains('#ignore')) } |
        foreach { "+ $($_.Node["user"].name) : $($_.Node["comment"].InnerText.Trim().Replace("`n"," "))`n" }
}
# Grab the previous 10 builds together with their changes
$request = [System.Net.WebRequest]::Create($teamcityUrl + '/httpAuth/app/rest/builds?locator=untilBuild:(id:' + $buildId + '),count:10,running:any,buildType:(id:' + $buildType + ')&fields=$long,build(id,number,status,changes($long))')
$request.Headers.Add("AUTHORIZATION", $authToken);
$xml = [xml](new-object System.IO.StreamReader $request.GetResponse().GetResponseStream()).ReadToEnd()
# Then get all commit messages for each of them
Foreach($x in Microsoft.PowerShell.Utility\Select-Xml $xml -XPath "/builds/build/changes/change")
{
    # We collect the changes until we've found the previous successful build, so we must always
    # collect the changes of the current build and then stop once we find a successful build.
    if($x.Node.ParentNode.ParentNode.status -eq "SUCCESS" -and $x.Node.ParentNode.ParentNode.id -ne $buildId)
    { break; }
    $changelog += GetCommitMessages($x.Node.id)
}
$changelog > $outputFile
Write-Host "Changelog saved to ${outputFile}:"
$changelog
I am writing a simple script that makes use of 7zip's command-line to extract archives within folders and then delete the original archives.
There is a part of my script that isn't behaving how I would expect it to. I can't get my if statement to trigger correctly. Here's a snippet of the code:
if($CurrentRar.Contains(".part1.rar")){
    [void] $RarGroup.Add($CurrentRar)
    # Value of CurrentRar:
    # Factory_Selection_2.part1.rar
    $CurrentRarBase = $CurrentRar.TrimEnd(".part1.rar")
    # Value: Factory_Selection_2
    for ($j = 1; $j -lt $AllRarfiles.Count; $j++){
        $NextRar = $AllRarfiles[$j].Name
        # Value: Factory_Selection_2.part2.rar
        if($NextRar.Contains("$CurrentRarBase.part$j.rar")){
            Write-Host "Test Hit" -ForegroundColor Green
            # Never fires, and I have no idea why
            # [void] $RarGroup.Add($NextRar)
        }
    }
    $RarGroups.Add($RarGroup)
}
if($NextRar.Contains("$CurrentRarBase.part$j.rar")) is the line that I can't get to fire.
If I shorten it to if($NextRar.Contains("$CurrentRarBase.part")), it fires true. But as soon as I add the inline $j it always triggers false. I've tried casting $j to string but it still doesn't work. Am I missing something stupid?
Appreciate any help.
The issue is your for statement combined with the fact that an array / list is zero-indexed (meaning indices start at 0).
In your case, index 0 of $AllRarfiles is probably part1, and your for statement starts at 1; but the file name at index 1 does not contain part1 (which is what $NextRar.Contains("$CurrentRarBase.part$j.rar") builds), it contains part2, i.e. part ($j + 1).
As a table comparison:

Index / $j | Value                         | Built string for comparison (with index)
0          | Factory_Selection_2.part1.rar | Factory_Selection_2.part0.rar
1          | Factory_Selection_2.part2.rar | Factory_Selection_2.part1.rar
2          | Factory_Selection_2.part3.rar | Factory_Selection_2.part2.rar
3          | Factory_Selection_2.part4.rar | Factory_Selection_2.part3.rar
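A minimal fix that keeps the original loop shape is to build the comparison string with $j + 1 (a sketch, assuming $AllRarfiles is sorted by part number):

for ($j = 1; $j -lt $AllRarfiles.Count; $j++){
    $NextRar = $AllRarfiles[$j].Name
    # Index 1 holds part2, index 2 holds part3, ... so compare against part ($j + 1).
    if($NextRar.Contains("$CurrentRarBase.part$($j + 1).rar")){
        [void] $RarGroup.Add($NextRar)
    }
}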
Another simpler approach
Since it seems you want to group split RAR files which belong together, you could also use a simpler approach with Group-Object:
# collect and group all RAR files
$rarGroups = Get-ChildItem -LiteralPath 'C:\somewhere\' -Filter '*.rar' | Group-Object -Property { $_.Name -replace '\.part\d+\.rar$' }

# do some stuff afterwards
foreach($rarGroup in $rarGroups){
    Write-Verbose -Verbose "Processing RAR group: $($rarGroup.Name)"
    foreach($rarFile in $rarGroup.Group) {
        Write-Verbose -Verbose "`tCurrent RAR file: $($rarFile.Name)"
        # do some stuff per file
    }
}
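If the end goal is the extract-then-delete workflow from the question, a hedged sketch on top of those groups might look like this (it assumes 7z.exe is on PATH and that extracting the first part of a set automatically consumes the remaining parts):

# For each group, extract starting from the first part, then delete all parts on success.
foreach ($rarGroup in $rarGroups) {
    $firstPart = $rarGroup.Group | Sort-Object Name | Select-Object -First 1
    & 7z x $firstPart.FullName "-o$($firstPart.DirectoryName)" -y
    if ($LASTEXITCODE -eq 0) {
        # 7-Zip reported success; remove the original archive parts.
        $rarGroup.Group | Remove-Item
    }
}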
I'm trying to run the dotnet test command on defined test projects in a MS Azure devops pipeline.
This is working:
- script: |
    dotnet test "./Project One/Project One Unit Tests.fsproj" -c Release --no-build --filter "TestCategory!=SKIP_ON_DEPLOY & TestCategory!=REQUIRES_API_KEY"
    dotnet test "./Project Two/Project Two Unit Tests.fsproj" -c Release --no-build --filter "TestCategory!=SKIP_ON_DEPLOY & TestCategory!=REQUIRES_API_KEY"
  displayName: 'run tests in cascade'
I don't want to specify the project names, only some rule (the project name has to end with "UnitTests", "Unit Tests" or "Unit_Tests"), but dotnet test <PROJECT> does not allow wildcards.
It seems wildcards work with dotnet test <DLL> (./**/*Unit?Tests.dll), but that fails because it does not find the deps.json file in the obj folder.
My solution is to loop through a filtered list of project files:

for proj in ./**/*Unit?Tests.*proj
do
  dotnet test "$proj" -c Release --no-build --filter "TestCategory!=SKIP_ON_DEPLOY & TestCategory!=REQUIRES_API_KEY"
done
It runs the tests, but unlike the cascaded calls, when a test project fails it does not fail the step, so the pipeline does not stop!
I tried to get the result from the run, but without success (this does not work):
for proj in ./**/*Unit?Tests.*proj
do
  result=$(dotnet test "$proj" -c Release --no-build --filter "TestCategory!=SKIP_ON_DEPLOY & TestCategory!=REQUIRES_API_KEY")
  if result = 1
  then
    exit 1
  fi
done
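(As an aside: $(...) captures a command's output, not its exit status; in Bash the status of the last command is in $?. A hedged PowerShell sketch of the same loop, propagating the exit code via $LASTEXITCODE:)

# Run every matching test project; stop at the first non-zero exit code.
Get-ChildItem -Recurse -Filter '*Unit?Tests.*proj' | ForEach-Object {
    dotnet test $_.FullName -c Release --no-build --filter "TestCategory!=SKIP_ON_DEPLOY & TestCategory!=REQUIRES_API_KEY"
    if ($LASTEXITCODE -ne 0) { exit 1 }   # fail the step on the first failing project
}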
Any suggestions?
Why, when dotnet test fails (exit 1), does the step not fail?
I tried putting only the failing test project in the loop, and in that case it does fail.
[UPDATE]
Full pipeline.yaml:

trigger:
- master

pool:
  vmImage: 'ubuntu-latest'

variables:
  project file: "Alex75.MySolution/Alex75.MyProject.fsproj"

steps:
- script: dotnet build -c Release
  displayName: 'Build'

- script: |
    dotnet test "./ProjectTwo Unit Tests/ProjectTwo Unit Tests.fsproj" -c Release --no-build --filter "TestCategory!=SKIP_ON_DEPLOY & TestCategory!=REQUIRES_API_KEY"
    dotnet test "./ProjectOne Tests/ProjectOne Unit Tests.fsproj" -c Release --no-build --filter "TestCategory!=SKIP_ON_DEPLOY & TestCategory!=REQUIRES_API_KEY"
  displayName: 'Test'
  condition: false

- script: |
    for proj in ./**/*Unit?Tests.*proj
    do
      echo "run tests in $proj"
      dotnet test "$proj" -c Release --no-build --filter "TestCategory!=SKIP_ON_DEPLOY & TestCategory!=REQUIRES_API_KEY"
    done
  displayName: 'Test (Unit Test projects)'
  condition: true

- powershell: |
    $URL = "$(System.CollectionUri)/$(System.TeamProject)/_apis/build/builds/$(Build.BuildId)/logs?api-version=5.1"
    Write-Host "URL = $URL"
    $logs = Invoke-RestMethod -Uri $URL -Headers @{authorization = "Basic $(PAT)"} -Method Get
    $lastLogId = $logs.value[$logs.value.count-1].id
    Write-Host "lastLogId = $lastLogId"
    $URL = "$(System.CollectionUri)/$(System.TeamProject)/_apis/build/builds/$(Build.BuildId)/logs/$lastLogId?api-version=5.1"
    $result = Invoke-RestMethod -Uri $URL -Headers @{authorization = "Basic $(PAT)"} -Method Get
    Write-Host $result
    Write-Host "Start Check result..."
    $lines = $result.Split([Environment]::NewLine)
    foreach($line in $lines) {
      if($line -match "Failed!")
      {
        throw "dotnet test fails ($line)"
      }
    }
    Write-Host "Test result check completed."
  displayName: 'Check tests result'
PowerShell script
Using the example from @vito-liu-msft, I tried to check the test logs for the error.
Here is the PowerShell script alone:
$URL = "$(System.CollectionUri)/$(System.TeamProject)/_apis/build/builds/$(Build.BuildId)/logs?api-version=5.1"
Write-Host "URL = $URL"
$logs = Invoke-RestMethod -Uri $URL -Headers @{authorization = "Basic $(PAT)"} -Method Get
$lastLogId = $logs.value[$logs.value.count-1].id
Write-Host "lastLogId = $lastLogId"
$URL = "$(System.CollectionUri)/$(System.TeamProject)/_apis/build/builds/$(Build.BuildId)/logs/$lastLogId?api-version=5.1"
$result = Invoke-RestMethod -Uri $URL -Headers @{authorization = "Basic $(PAT)"} -Method Get
Write-Host $result
Write-Host "Start Check result..."
$lines = $result.Split([Environment]::NewLine)
foreach($line in $lines) {
    if($line -match "Failed!")
    {
        throw "dotnet test fails ($line)"
    }
}
Write-Host "Test result check completed."
PAT is the Personal Access Token as it is created; it does not need to be transformed before using it in the HTTP request.
The LogId is not available up front, so there is one request to get the logs collection and then a second request to get the specific last log (the response seems ordered by date; it may be worth double-checking).
Note that if the second request returns a 404, 500 or another non-failure result, the error check will not find any errors that did occur! A proper check of the response is required.
I tested the -match "Failed!" in PowerShell, but I haven't tested the throw command in the pipeline because...
the dotnet test command in the loop is working!
After spending so much time trying to figure out how to read and check the logs through a separate PowerShell script, I found out that the initial simple solution works!
Yes, the build fails because of the test failure (it also shows the precise error pointing to the problem) and the step fails.
(So the next step, the check of the test results, is skipped!)
I think I must have made some error before when filtering the test projects, so that the failing project was not running (and not raising any error); that's why I thought it was not "intercepting" the error and the build didn't stop.
It could be that I mixed two different pipeline files while I was changing them.
Anyway, this is the incriminated step (script):
for proj in ./**/*Unit?Tests.*proj
do
  echo "run tests in $proj"
  dotnet test "$proj" -c Release --no-build --filter "TestCategory!=SKIP_ON_DEPLOY & TestCategory!=REQUIRES_API_KEY"
done
At least with the echo it is possible to see which test projects are used.
when a project test fails it does not fail the step so the pipeline does not stop!
Why when dotnet test fails (exit 1) the step does not fail?
As a workaround, we could get the dotnet test task log id via this REST API, add a PowerShell task, and enter the script below to analyze the dotnet test log. We need to enter the match text, such as fails (exit 1); if dotnet test fails, this will stop the pipeline.
We should add the PAT to a variable, set it to secret, and then use it in the script:
$connectionToken = "{PAT}"
$base64AuthInfo = [System.Convert]::ToBase64String([System.Text.Encoding]::ASCII.GetBytes(":$($connectionToken)"))
$URL = "https://dev.azure.com/{organization}/{project}/_apis/build/builds/{buildId}/logs/{logId}?api-version=6.1-preview.2"
$result = Invoke-RestMethod -Uri $URL -Headers @{authorization = "Basic $base64AuthInfo"} -Method Get
Write-Host $result
$lines = $result.Split([Environment]::NewLine)
foreach($line in $lines) {
    if ($line -match "{match sentence}") {
        throw 'dotnet test fails (exit 1)'
    }
}
Update 1
How can I find the "logId"?
We could use this REST API to check the logId:
GET https://dev.azure.com/{organization}/{project}/_apis/build/builds/{buildId}/logs?api-version=6.1-preview.2
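For example (a sketch reusing the $base64AuthInfo from the script above; the {organization}, {project} and {buildId} placeholders must be filled in):

# List all logs for a build and take the id of the last entry.
$logsUrl = "https://dev.azure.com/{organization}/{project}/_apis/build/builds/{buildId}/logs?api-version=6.1-preview.2"
$logs = Invoke-RestMethod -Uri $logsUrl -Headers @{authorization = "Basic $base64AuthInfo"} -Method Get
$lastLogId = $logs.value[$logs.value.Count - 1].id
Write-Host "last log id: $lastLogId"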
And do I really have to use an API "preview" version?
We could also use another version, such as 5.1; the available REST API versions can be switched in the documentation.
I am new to PowerShell and am having some issues with files getting corrupted when splitting them and re-attaching them back together using PowerShell.
I have a remote server from which I need to download a .bak file with a size of 44 GB. To be able to do this I am splitting the file into smaller (100 MB) pieces using this script.
$from = "D:\largebakfile\largefile.bak"
$rootName = "D:\foldertoplacelargebakfile\part"
$ext = "PART_"
$upperBound = 100MB
$fromFile = [io.file]::OpenRead($from)
$buff = new-object byte[] $upperBound
$count = $idx = 0
try {
    do {
        "Reading $upperBound"
        $count = $fromFile.Read($buff, 0, $buff.Length)
        if ($count -gt 0) {
            $to = "{0}{1}{2}" -f ($rootName, $idx, $ext)
            $toFile = [io.file]::OpenWrite($to)
            try {
                "Writing $count to $to"
                $toFile.Write($buff, 0, $count)
            } finally {
                $toFile.Close()
            }
        }
        $idx++
    } while ($count -gt 0)
}
finally {
    $fromFile.Close()
}
After this is done and the "PART_" files are downloaded to the local computer, I use this script to merge the files back together into one .bak file.
# replace with the location of the "PARTS" files
Set-Location "C:\Folderwithsplitfiles\Parts"
# replace with the SQL backup folder on your computer
$outFile = "C:\Program Files\Microsoft SQL Server\MSSQL13.MSSQLSERVER\MSSQL\Backup\newname.bak"
# the prefix for all PARTS files
$infilePrefix = "C:\Folderwithsplitfiles\Parts\PART_"
$ostream = [System.Io.File]::OpenWrite($outFile)
$chunkNum = 1
$infileName = "$infilePrefix$chunkNum"
$offset = 0
while (Test-Path $infileName) {
    $bytes = [System.IO.File]::ReadAllBytes($infileName)
    $ostream.Write($bytes, 0, $bytes.Count)
    Write-Host "read $infileName"
    $chunkNum += 1
    $infileName = "$infilePrefix$chunkNum"
}
$ostream.Close();
#Get-FileHash $outfile | Format-List
When trying to restore the database in SSMS I get an error basically saying that the file is corrupted and can't be restored.
I have been struggling with this for a couple of days now and can't seem to get my head around it.
Everything seems like it's working, but something is causing these errors. Does anyone have any ideas?
Might I suggest you look at the Background Intelligent Transfer Service? You should be able to download the whole file in one piece, which saves over-complicating anything. (It also supports starting/stopping the transfer, etc.)
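A hedged sketch using the BitsTransfer module (the source share and destination paths are placeholders; BITS handles large files and can resume interrupted transfers):

# Download the whole .bak in a single resumable BITS job.
Import-Module BitsTransfer
Start-BitsTransfer -Source '\\remoteserver\backups\largefile.bak' `
                   -Destination 'D:\localbackups\largefile.bak' `
                   -DisplayName 'bak-download'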
I am running a build on my Jenkins server and I am looking to dynamically populate the git_commit field with the commit hash from the current build. The file has multiple modules in it, and I want to use sed to match the module name core-lambda-function1 and update its git_commit field with the commit hash from the current build. Any help is appreciated. Thanks.
module "core-lambda-function1" {
source = "./lambda"
name = "core-lambda-function"
runtime = "nodejs6.10"
role = "${aws_iam_role.iam_role_for_lambda.arn}"
filename = "../Archive.zip"
source_code_hash = "${base64sha256(file("../Archive.zip"))}"
source_dir = "../"
git_commit = ""
}
module "core-lambda-function2" {
source = "./lambda"
name = "core-lambda-function"
runtime = "nodejs6.10"
role = "${aws_iam_role.iam_role_for_lambda.arn}"
filename = "../Archive.zip"
source_code_hash = "${base64sha256(file("../Archive.zip"))}"
source_dir = "../"
git_commit = ""
}
This is what I currently have:
#!/bin/bash
set -e
while read p; do
  NAME=$p
  GIT_COMMIT=`git rev-parse HEAD`
  echo $NAME | grep `xargs` main.tf -A 7 | sed -ri '7s/git_commit = ""/git_commit\ = \"'$GIT_COMMIT'"/g'
done < build_name
Why not Input Variables in Terraform?
variable "git_commit" {}
module "core-lambda-function1" {
source = "./lambda"
name = "core-lambda-function"
runtime = "nodejs6.10"
role = "${aws_iam_role.iam_role_for_lambda.arn}"
filename = "../Archive.zip"
source_code_hash = "${base64sha256(file("../Archive.zip"))}"
source_dir = "../"
git_commit = "${var.git_commit}" #### Use variable here.
}
So in your wrapper script, you can update to:

#!/bin/bash
set -e
GIT_COMMIT=$(git rev-parse HEAD)
terraform plan -var "git_commit=${GIT_COMMIT}" ...

Note the double quotes around the -var argument: with single quotes the shell would pass the literal text ${GIT_COMMIT} instead of expanding it.
Occasionally we commit a C# project file to SVN that references files we forgot to add to SVN. Are there any pre-commit hook scripts out there that parse the .csproj file and reject the commit if it references unversioned files?
#!/usr/bin/perl
# Checks for source files which have been added to a csproj in a commit
# but haven't themselves been committed.
use Modern::Perl;
use warnings FATAL => 'syntax';
use File::Basename qw(basename);
use XML::Simple;

die "usage: $0 repo transaction\n" if @ARGV != 2;
my $opt = "-t"; # -r for testing
my ($repos, $txn) = @ARGV;

# If you really, really want to add a file to the proj and
# not commit it, start your commit message with a !
my @info = `svnlook info $opt $txn "$repos"`;
exit 0 if ($info[3] =~ /\A!/);

my @lines = `svnlook changed $opt $txn "$repos"`;
my @projects = grep { /\AU/ }
               grep { /[.]csproj\z/ }
               map { chomp; $_ } @lines;

my @filelist = `svnlook tree $opt $txn "$repos" --full-paths`;
my %present;
foreach (@filelist) {
    chomp;
    $present{$_} = 1;
}

foreach (@projects) {
    m"\AU.\s\s([\w/.]+/)([\w]+\.csproj)\z" or die "bad line $_";
    my ($path, $proj) = ($1, $2);
    my $projfile = `svnlook cat $opt $txn "$repos" $path/$proj`;
    my $xml = XMLin($projfile);
    # Tested with VS 2012 project files
    my @includes = @{$xml->{ItemGroup}->[1]->{Compile}};
    # All the source files in the csproj
    my @filenames = map { $_->{Include} } @includes;
    foreach (@filenames) {
        # ignore "../etc", not below the project file in the tree
        next if /\A[.][.]/;
        # if you have files that are in the proj but shouldn't be committed,
        # eg some generated files, add checks for them here
        # next if /MyGeneratedFile.cs\z/;
        my $file = $path . $_;
        # The csproj file speaks Windows paths, but svn will output Unix ones
        $file =~ tr|\\|/|;
        if (!defined $present{$file}) {
            die "The file $file is included in the project $path\\$proj, but is not present in the tree; did you forget to commit it?";
        }
    }
}
If you are using a Windows server, you can have a look at Subversion Notify for Windows - http://sourceforge.net/projects/svn-notify/
I use it to do a simple check on the commit message, to ensure users conform to our in-house rules. I have not gone into any of the other pre-commit uses, but it might be worth a shot!
I quote from the manual:
PRE-COMMIT CHECKS
Ensure the integrity of your repository by enforcing commit message standards
Restrict the types of files that can be committed (eliminate accidental commits of temporary or developer specific configuration files)
Enforce file type checks - files that can be committed, but require a special commit tag to make sure it's being committed as per your rules
Check against your task tracking/ bug tracking system to ensure it's a valid tracking item and your standards are enforced (no commits against closed bugs for example)