I am working on a large-scale application which uses multiple small projects/solutions. Each solution is built by its own agent, similar to the screenshot below.
Now the problem is that all of the projects/solutions are compiled even if only a single project file is changed.
To reduce CI build time, I want to add a conditional expression to the build which would build a project only if its respective source has changed.
I know this can be achieved in Azure DevOps via a custom condition using variable expressions, but I am not sure how I can check whether the respective source code has changed.
Does anyone know what variable expression I need to write here?
Well, as I can see you are using a Git repo, so you can do this in the following way.
As a very first step, you need to find out which project/solution was modified.
You can find the solution in my answer here, where you can see I've used a simple PowerShell script to pull out the modified files and enable the corresponding variables.
Sample PowerShell script to pull out the modified files:
$files=$(git diff HEAD HEAD~ --name-only)
$temp=$files -split ' '
$count=$temp.Length
echo "Total changed $count files"
For ($i=0; $i -lt $temp.Length; $i++)
{
$name=$temp[$i]
echo "this is $name file"
if ($name -like "SubFolderA/*")
{
Write-Host "##vso[task.setvariable variable=MicroserviceAUpdated]True"
}
}
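With the variable set by the script, the custom condition on the corresponding build task can then test it. A minimal example, assuming the MicroserviceAUpdated variable from the script above:

and(succeeded(), eq(variables['MicroserviceAUpdated'], 'True'))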
You can't achieve that with a condition. Conditions are for controlling whether tasks inside a build run. What you are looking for is build triggers in YAML files:
trigger:
  branches:
    include:
    - master
  paths:
    include:
    - path\to\app\*
This build will only trigger when anything under the `path\to\app\` folder is changed. That way you create a build per app and can isolate each build to the files that changed.
https://learn.microsoft.com/en-us/azure/devops/pipelines/yaml-schema?view=vsts&tabs=schema#pipeline
I am creating code coverage reports for my C++ projects using gcov/lcov, and I am trying to remove all files except the ones in a certain directory from the coverage report (i.e. I do not want different dependencies in various folders to show up in the report).
However, I want to do this automatically and not manually. I tried the following:
lcov -r coverage.total '!(<path>)' -o coverage.info
But then lcov comes back with Deleted 0 files. I also tried !(<path>), '[^path]*', and slight variations of these, but nothing seems to work. I can manually remove the undesired folders; for example, the following does work:
lcov -r coverage.total '/usr/libs/*' '/usr/mylibs/*' -o coverage.info
So my question is, how can I have lcov exclude all but a specific directory?
P.S.
I am open to workarounds (for example if this can be done with a bash script)
I am using bash+CMake+gcov+lcov
P.S.
This is not a duplicate of this question. I am asking about an automated way to only include files in a specific directory (for example, the current directory) in the report. I am aware of the --remove argument, but that is not an automated solution.
Your help is greatly appreciated!
The problem:
I want to manage my database(s) using one SSDT database project, in which I want to centralize database development and automate deployments (mainly stored procedures).
In a multi-tenant environment, object names are preceded by company names, for example:
[dbo].[spu_COMPANY_NAME$Stored Procedure Name]
We have a central database in which we do our development, and every time we publish, we do a 'Replace All' on the company name.
The SQLCMD variables won't do because they cannot be included inside object names.
Is there a way I can build so that for every build configuration I get tailored stored procedures during build/publish, with a folder structure like this:
--Database.Project/
  --bin/
    --CompanyA (build.companyA.config)/
    --CompanyB (build.companyB.config)/
As @PeterSchott mentioned in a previous comment, the solution is easily doable through PowerShell scripts and pre-/post-build events in Visual Studio.
I'll post the answer here in case someone needs it for reference (or maybe in case I need it later).
First:
A pre-build PowerShell script: it takes the project directory and replaces every 'COMPANY_NAME' string with the actual company name, depending on the build configuration, in all files with a .sql extension.
The PowerShell script:
if ($args[0] -eq "Release"){
$company_name = "ReleaseCorp";
}
$dboDir = $args[1] + 'dbo\';
$sqlFiles = Get-ChildItem $dboDir -rec | Where-Object {($_.Extension -eq ".sql")}
foreach ($file in $sqlFiles)
{
echo $file;
(Get-Content $file.PSPath) |
Foreach-Object { $_ -replace "COMPANY_NAME", $company_name } |
Set-Content $file.PSPath
}
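The script expects the build configuration as its first argument and the project directory (with its trailing backslash) as its second, so the pre-build event command line could look roughly like this (the script name PreBuild.ps1 is just a placeholder):

Powershell -NoProfile -ExecutionPolicy Bypass -Command "& '$(ProjectDir)PreBuild.ps1' '$(ConfigurationName)' '$(ProjectDir)'"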
The post-build event does the inverse: this time it takes the actual company name for the configuration and puts the token back, ready for upcoming builds.
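A minimal sketch of that inverse script, under the same assumptions as the pre-build script above:

# Hypothetical post-build counterpart: swap the real company name back to the token
if ($args[0] -eq "Release"){
    $company_name = "ReleaseCorp";
}
$dboDir = $args[1] + 'dbo\';
$sqlFiles = Get-ChildItem $dboDir -rec | Where-Object {($_.Extension -eq ".sql")}
foreach ($file in $sqlFiles)
{
    (Get-Content $file.PSPath) |
        Foreach-Object { $_ -replace $company_name, "COMPANY_NAME" } |
        Set-Content $file.PSPath
}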
It is worth noting that for Visual Studio to be able to run PowerShell (x86) scripts, the following step is mandatory:
Run Windows PowerShell (x86) as administrator, and run:
set-executionpolicy unrestricted
More info about this can be found in David Frette's informative blog post:
Creating Powershell pre-build and post-build events for Visual Studio projects
I am trying to assign a value to the built-in release notes variable in a "Run a Script" step.
$OctopusParameters["Octopus.Release.Notes"] = "Some release notes"
In the next step, "Send an Email", I am using this variable in the email body, but unfortunately it is empty.
<p>#{Octopus.Release.Notes}</p>
Is it possible to set Octopus Deploy system variable value from PowerShell and use it in the next step?
I am using Octopus Deploy 3.7.11.
EDIT:
I have also tried the cmdlet Set-OctopusVariable and it did not work.
Set-OctopusVariable -name "Octopus.Release.Notes" -value "Something"
I don't think it is possible to overwrite the values of the built-in variables provided by Octopus Deploy, but you could define your own output variable and refer to that in the following steps. For example, in your 'Run a script' step use:
Set-OctopusVariable -name "MyReleaseNote" -value "Some text here"
Then the "Send an Email"-step can refer to this text by using the following (assuming the first step is called 'FirstStep'):
#{Octopus.Action[FirstStep].Output.MyReleaseNote}
The variable can also be used from a script in other steps; in that case, use the syntax:
$relnote = $OctopusParameters["Octopus.Action[FirstStep].Output.MyReleaseNote"]
(If you want to keep the generated release note, perhaps you could save it as an 'artifact' in the project.)
I tried this using Octoposh. Modifying an existing variable is covered in the Octoposh wiki at Modifying Variables - Edit a variable of a Project/Library variable set.
I wasn't able to get this to work because of timeouts on our network, but it looks like it should work - just not as straightforward as I expected.
I am using a Visual Studio C# library project to contain static resources that are needed as deployment artifacts (in my case, SQL files that are run with a combination of RoundhousE and Octopus Deploy). By convention, all files in the project must have their properties set so that the "Build action" is "Content" and "Copy to output directory" is "Copy always".
If someone on the team adds a file but forgets to set these properties, we see deployment errors. This is usually picked up in an internal environment, but I was hoping to find a way to enforce this in the CI build.
So is there a way to either fail the build or, better still, override these properties during the build with an MSBuild task? Am I tackling this the wrong way? Any suggestions welcome.
You are going to have to parse the project files and check for Content without CopyToOutputDirectory set to Always; I doubt there is another way.
That can be done using whatever scripting language you want, or you could even write a small C# tool that uses the classes from the Microsoft.Build.Evaluation namespace. Here is a possible PowerShell implementation; the hardest part is getting the regexes right. The first one checks for Content without any metadata, the second one for Content where CopyToOutputDirectory does not start with "A" (which I assume should be "Always"; no idea how to match the whole word).
FindBadContentNodes.ps1 :
param([String]$inputDir)
Function FindBadContent()
{
$lines = Get-Content $input
$text = [string]::Join( "`n", $lines )
if( $text -match "<Content Include.*/>" -Or
$text -match "<Content Include.*`n\s*<CopyToOutputDirectory>[^A]\w*<.*" )
{
"Found file with bad content node"
exit 1
}
}
Get-ChildItem -Recurse -Include *.csproj -Path $inputDir | FindBadContent
Call this from MsBuild:
<Target Name="FindBadContentNodes">
<Exec Command="Powershell FindBadContentNodes.ps1 -inputDir path\to\sourceDir"/>
</Target>
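Note that as written the target only runs when it is invoked explicitly (e.g. msbuild /t:FindBadContentNodes); to run the check on every build you could hook it into the build with BeforeTargets, for example:

<Target Name="FindBadContentNodes" BeforeTargets="BeforeBuild">
  <Exec Command="Powershell FindBadContentNodes.ps1 -inputDir path\to\sourceDir"/>
</Target>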
Note you mention "or better still override these properties during the build". I'd stay away from such a solution: you're just burying the problem and relying on the CI to produce correct builds, so local builds using just VS would not behave the same. IMO making the build fail is better, especially since most CI systems have a way of notifying the developer responsible, so the fix should be applied quickly.
Another possibility would be to have the CI apply the fix and then commit the changes so at least everyone has the correct version.
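If you did go down that road, a rough sketch of such an auto-fix could look like the following (untested; it assumes the classic csproj format with the MSBuild 2003 namespace and simply forces CopyToOutputDirectory to Always on every Content item that lacks it):

# Hypothetical auto-fix sketch: add <CopyToOutputDirectory>Always</CopyToOutputDirectory>
# to every <Content> item that is missing it, then save the project file.
param([String]$inputDir)

$ns = 'http://schemas.microsoft.com/developer/msbuild/2003'
Get-ChildItem -Recurse -Include *.csproj -Path $inputDir | ForEach-Object {
    [xml]$proj = Get-Content $_.FullName
    $changed = $false
    foreach ($content in @($proj.Project.ItemGroup.Content)) {
        if ($content -and -not $content.CopyToOutputDirectory) {
            $node = $proj.CreateElement('CopyToOutputDirectory', $ns)
            $node.InnerText = 'Always'
            [void]$content.AppendChild($node)
            $changed = $true
        }
    }
    if ($changed) { $proj.Save($_.FullName) }
}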
IIRC there is a way in Visual Studio to give a file extension certain default behaviors, much like .config files are always set to Content and copied to the output directory.
So one could do the same with .sql files (and any other files that you would want set up this way). A quick search brought me to this: http://blog.andreloker.de/post/2010/07/02/Visual-Studio-default-build-action-for-non-default-file-types.aspx
The relevant parts:
The default build action of a file type can be configured in the
registry. However, instead of hacking the registry manually, we use a
much better approach: pkgdef files (a good article about pkgdef
files). In essence, pkgdef files are configuration files similar to .reg
files that define registry keys and values that are automatically
merged into the correct location in the real registry. If the pkgdef file
is removed, the changes are automatically undone. Thus, you can safely
modify the registry without the danger of breaking anything – or at
least, it’s easy to undo the damage.
Finally, here’s an example of how to change the default build action
of a file type:
[$RootKey$\Projects\{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}\FileExtensions\.spark]
"DefaultBuildAction"="Content"

The GUID in the key refers to the project type. In this case, “{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}” means “C# projects”. A rather comprehensive list of project type GUIDs can be found here. Although it does not cover Visual Studio 2010 explicitly, the GUIDs apply to the current version as well. By the way, we can use C# as the project type here, because C#-based MVC projects are in fact C# projects (and web application projects). For Visual Basic, you’d use “{F184B08F-C81C-45F6-A57F-5ABD9991F28F}” instead.
$RootKey$ is an abstraction of the real registry key that Visual
Studio stores the configuration under:
HKEY_CURRENT_USER\Software\Microsoft\VisualStudio\10.0_Config (Note:
Do not try to manually edit anything under this key as it can be
overwritten at any time by Visual Studio).
The rest should be self explanatory: this option sets the default
build action of .spark files to “Content”, so those files are included
in the publishing process.
All you need to do now is to put this piece of text into a file with
the extension pkgdef, put it somewhere under
%PROGRAMFILES(x86)%\Microsoft Visual Studio 10.0\Common7\IDE\Extensions (on 64-bit systems) or %PROGRAMFILES%\Microsoft Visual Studio 10.0\Common7\IDE\Extensions (on 32-bit systems), and Visual Studio will load and apply the settings automatically the next time it starts. To
undo the changes, simply remove the files.
Finally, I’ve attached a bunch of pkgdef files that are used in
production that define the “Content” default Build Action for C# and
VB projects for .spark, .brail, .brailjs and .less files respectively.
Download them, save them somewhere in the Extensions folder and you’re
good to go.
The author also says that he built a utility to help do all of this for you:
http://tools.andreloker.de/dbag
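Adapted to the .sql files from the question, the pkgdef entry would presumably look like this (note that this only sets the default Build Action; the "Copy to output directory" property would still have to be handled separately):

[$RootKey$\Projects\{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}\FileExtensions\.sql]
"DefaultBuildAction"="Content"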
Expanding on @stijn's answer: instead of using regexes, it is far easier to use native XML parsing.
Here is my proposed file; it also supports customizing which files are evaluated by using a regex on the filename only.
param([String]$Path, [string]$IncludeMatch, [switch]$AllowPreserve)
Function Test-BadContentExists
{
param (
[parameter(Mandatory=$true,ValueFromPipeline=$true,ValueFromPipelineByPropertyName=$true)]
[Alias("FullName")]
[string[]]$Path,
[string]$IncludeMatch,
[switch]$AllowPreserve
)
[xml]$proj = Get-Content -Path $Path
$ContentNodes = ($proj | Select-Xml "//Content|//n:Content" -Namespace @{n='http://schemas.microsoft.com/developer/msbuild/2003'}).Node
if (![string]::IsNullOrEmpty($IncludeMatch)) {
$ContentNodes = $ContentNodes | Where-Object -Property Include -Match $IncludeMatch
}
#remove the always nodes
$ContentNodes = $ContentNodes | Where-Object -Property CopyToOutputDirectory -ne 'Always'
#optionally remove the preserve nodes
if ($AllowPreserve) {
$ContentNodes = $ContentNodes | Where-Object -Property CopyToOutputDirectory -ne 'PreserveNewest'
}
if($ContentNodes)
{
write-output "Found file with bad content node:"
write-output ($ContentNodes | Select-Object Include,CopyToOutputDirectory | sort Include | Out-String)
exit 1
}
}
[hashtable]$Options = $PSBoundParameters
[void]$Options.Remove("Path")
Get-ChildItem -Recurse -Include *.csproj -Path $Path | Test-BadContentExists @Options
And calling it, with parameters:
<Target Name="FindBadContentNodes">
<Exec Command="Powershell FindBadContentNodes.ps1 -inputDir path\to\sourceDir -IncludeMatch '^Upgrade.*\.(sql|xml)$'"/>
</Target>
I ended up using a pre-build event instead and put this ps1 file in my solution directory so I could use it with multiple projects.
echo "Build Dir: %cd%"
echo "Sol Dir: $(SolutionDir)"
echo "Proj Dir: '$(ProjectDir)"
echo.
Powershell -NoProfile -Command "& '$(SolutionDir)\FindBadContentNodes.ps1' -Path '$(ProjectDir)' -IncludeMatch '^Upgrade.*\.(sql|xml)$'"
example build output:
1> "Build Dir: C:\Source\RPS\MRM BI\MRMBI-Setup\MRMBI-Schema\bin\Debug"
1> "Sol Dir: C:\Source\RPS\MRM BI\MRMBI-Setup\"
1> "Proj Dir: 'C:\Source\RPS\MRM BI\MRMBI-Setup\MRMBI-Schema\"
1>
1> Found file with bad content node:
1>
1> Include CopyToOutputDirectory
1> ------- ----------------------
1> Upgrades\V17.09\myfile1.sql
1> Upgrades\V20.05\myfile2.sql PreserveNewest
1>
We have a large base of code that contains several shared projects, solution files, etc. in one directory in SVN. We're migrating to Mercurial. I would like to take this opportunity to reorganize our code into several repositories so that cloning for branching has less overhead. I've already successfully converted our repo from SVN to Mercurial while preserving history. My question: how do I break all the different projects into separate repositories while preserving their history?
Here is an example of what our single repository (OurPlatform) currently looks like:
/OurPlatform
---- Core
---- Core.Tests
---- Database
---- Database.Tests
---- CMS
---- CMS.Tests
---- Product1.Domain
---- Product1.Stresstester
---- Product1.Web
---- Product1.Web.Tests
---- Product2.Domain
---- Product2.Stresstester
---- Product2.Web
---- Product2.Web.Tests
==== Product1.sln
==== Product2.sln
All of those are folders containing VS projects, except for the solution files. Product1.sln and Product2.sln both reference all of the other projects. Ideally, I'd like to take each of those folders and turn them into separate Hg repos, and also add new repos for each product (they would act as parent repos). Then, if someone was going to work on Product1, they would clone the Product1 repo, which would contain Product1.sln and subrepo references to ReferenceAssemblies, Core, Core.Tests, Database, Database.Tests, CMS, and CMS.Tests.
So, it's easy to do this by just hg init'ing in the project directories. But can it be done while preserving history? Or is there a better way to arrange this?
EDIT:
Thanks to Ry4an's answer, I was able to accomplish my goal. I wanted to share how I did it here for others.
Since we had a lot of separate projects, I wrote a small bash script to automate creating the filemaps and to create the final bat script that actually does the conversion. What wasn't completely apparent from the answer is that the convert command needs to be run once for each filemap to produce a separate repository for each project. This script should be placed in the directory above an SVN working copy that you have previously converted. I used the working copy since its file structure best matched what I wanted the final new hg repos to be.
#!/bin/bash
# this requires you to be in: /path/to/svn/working/copy/, and issue: ../filemaplister.sh ./
for filename in *
do
extension=${filename##*.} #$filename|awk -F . '{print $NF}'
if [ "$extension" == "sln" -o "$extension" == "suo" -o "$extension" == "vsmdi" ]; then
base=${filename%.*}
echo "#$base.filemap" >> "$base.filemap"
echo "include $filename" >> "$base.filemap"
echo "C:\Applications\TortoiseHgPortable\hg.exe convert --filemap $base.filemap ../hg-datesort-converted ../hg-separated/$base > $base.convert.output.txt" >> "MASTERGO.convert.bat"
else
echo "#$filename.filemap" >> "$filename.filemap"
echo "include $filename" >> "$filename.filemap"
echo "rename $filename ." >> "$filename.filemap"
echo "C:\Applications\TortoiseHgPortable\hg.exe convert --filemap $filename.filemap ../hg-datesort-converted ../hg-separated/$filename > $filename.convert.output.txt" >> "MASTERGO.convert.bat"
fi
done;
mv *.filemap ../hg-conversion-filemaps/
mv *.convert.bat ../hg-conversion-filemaps/
This script looks at every file in an SVN working copy and, depending on the file type, either creates a new filemap file or appends to an existing one. The if is really just to catch miscellaneous Visual Studio files and place them in a separate repo. This is meant to be run in bash (Cygwin in my case), but the actual convert command is run through the version of hg shipped with TortoiseHg due to forking/process issues on Windows (gah, I know...).
So you run the MASTERGO.convert.bat file, which looks at your converted hg repo, and creates separate repos using the supplied filemap. After it is complete, there is a folder called hg-separated that contains a folder/repo for each project, as well as a folder/repo for each solution. You then have to manually clone all the projects into a solution repo, and add the clones to the .hgsub file. After committing, an .hgsubstate file is created and you're set to go!
With the example given above, my .hgsub file looks like this for "Product1":
Product1.Domain = /absolute/path/to/Product1.Domain
Product1.Stresstester = /absolute/path/to/Product1.Stresstester
Product1.Web = /absolute/path/to/Product1.Web
Product1.Web.Tests = /absolute/path/to/Product1.Web.Tests
Once I transfer these repos to a central server, I'll be manually changing the paths to be urls.
Also, there is no analog to the initial OurPlatform svn repo, since everything is separated now.
Thanks again!
This can absolutely be done. You'll want to use the hg convert command. Here's the process I'd use:
convert everything to a single hg repository using hg convert with a source type of svn and a dest type of hg (it sounds like you've already done this step)
create a collection of filemap files for use with hg convert's --filemap option
run hg convert with source type hg and dest type hg and the source being the mercurial repo created in step one -- and do it for each of the filemaps you created in step two.
The filemap syntax is shown in the hg help convert output, but here's the gist:
The filemap is a file that allows filtering and remapping of files and
directories. Comment lines start with '#'. Each line can contain one of
the following directives:
include path/to/file
exclude path/to/file
rename from/file to/file
So in your example your filemaps would look like this:
# this is Core.filemap
include Core
rename Core .
Note that if you have an include, the exclusion of everything else is implied. Also, that rename line ends in a dot and moves everything up one level.
# this is Core.Tests
include Core.Tests
rename Core.Tests .
and so on.
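Each of those filemaps then gets its own hg convert run, something like this (the directory names are placeholders):

hg convert --filemap Core.filemap path/to/combined-hg-repo path/to/hg-separated/Core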
Once you've created the broken-out repositories for each of the new repos, you can delete the has-everything initial repo created in step one and start setting up your subrepo configuration in .hgsub files.