TeamCity: How to set up proper trigger file wildcards

Say I have a solution containing the following projects in a folder-tree (this is C# but could in principle be anything):
MySolution
- SharedFolderA
  - File1.cs
  - File2.cs
  - SharedProject.csproj
- SharedFolderB
  - File1.cs
  - File2.cs
  - SharedProject.csproj
- Hosts
  - Host1
    - Program.cs
    - Host1.csproj
  - Host2
    - Program.cs
    - Host2.csproj
Now in TeamCity I wish to build each host as a separate build. So I will have a project called MySolution containing the following builds:
Build Host1
Build Host2
Now comes the question: for each build I want to set up a VCS trigger rule that triggers the build if the commits contain changes to any of:
- files in the root folder
- files in one or more of the shared folders
- files in the specific host folder - but not the other host folder
Example: Build of Host2 should trigger if any files have been changed in any of the following folders:
- MySolution
- SharedFolderA
- SharedFolderB
- Hosts/Host2
How should I set up the file wildcards?

Perhaps consider a different approach, since you have different build/host conditions!
Option 1:
Since you have various hosts with different build conditions, a dependency build is (IMHO) recommended, i.e. use the Artifact Dependency feature, which in turn allows wildcards in file and folder paths.
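For illustration, artifact dependency rules use the same Ant-like wildcards, in the form [+:|-:]source_path [=> destination_path]; a minimal sketch (these paths are made up, not from the question):
+:Hosts/Host1/bin/Release/** => host1-artifacts
Each host build can then depend on exactly the artifacts it needs.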
Option 2:
However, if you choose to go down your original build-trigger path, you can try this. The trigger rule format is as follows (from the TeamCity reference):
+|-[:user=VCS_username;][root=VCS_root_id;][comment=VCS_comment_regexp]:Ant_like_wildcard
Once you have set the trigger up via the UI, you can copy it and edit the rules, adding or modifying them like so:
// all files, or only .cs files, in the root folder
+:root/**
+:root/*.cs
// files in one or more of the shared folders
+:root/SharedFolderA/**
+:root/SharedFolderB/*.cs
// files in the specific host folder (here Host1) - but not the other host folder
+:root/Hosts/Host1/**
// the - prefix will exclude it
-:root/Hosts/Host2/**
You can use Ant-style wildcards: ** matches any path, * matches any string within a single name, and ? matches a single character.
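Putting it together for Build Host2, a minimal sketch (assuming the VCS root points at the MySolution folder; in TeamCity the most specific matching rule wins, and +:. means "everything"):
+:.
-:Hosts/Host1/**
This triggers on root files, both shared folders, and Hosts/Host2, but not on changes under Hosts/Host1; the mirror pair (+:. with -:Hosts/Host2/**) serves Build Host1.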

Related

Avoid path redundancy in GitLab CI include

To improve the structure of my GitLab CI file I include some specific files, for example:
include:
  - '/ci/config/linux.yml'
  - '/ci/config/windows.yml'
  # ... more includes
To avoid the error-prone redundancy of the path, I thought I'd put it into a variable, like:
variables:
  CI_CONFIG_DIR: '/ci/config'

include:
  - '${CI_CONFIG_DIR}/linux.yml' # ERROR: "Local file `${CI_CONFIG_DIR}/linux.yml` does not exist!"
  - '${CI_CONFIG_DIR}/windows.yml'
  # ... more includes
But this does not work. GitLab CI claims that ${CI_CONFIG_DIR}/linux.yml does not exist, although the documentation says that variables in include paths are allowed; see https://docs.gitlab.com/ee/ci/variables/where_variables_can_be_used.html#gitlab-ciyml-file.
What also didn't work was to include a file /ci/config/main.yml and from that include the specific configurations without paths:
# /ci/config/main.yml
include:
  - 'linux.yml' # ERROR: "Local file `linux.yml` does not exist!"
  - 'windows.yml'
  # ... more includes
How can I make this work or is there an alternative to define the path in only one place without making it too complicated?
This does not seem to be implemented at the moment; there is an open issue in the backlog.
Also, although the documentation says you can use variables within include sections, that applies only to predefined variables.
See if GitLab 14.2 (August 2021) can help:
Use CI/CD variables in include statements in .gitlab-ci.yml
You can now use variables as part of include statements in .gitlab-ci.yml files.
These variables can be instance, group, or project CI/CD variables.
This improvement provides you with more flexibility to define pipelines.
You can copy the same .gitlab-ci.yml file to multiple projects and use variables to alter its behavior.
This allows for less duplication in the .gitlab-ci.yml file and reduces the need for complicated per-project configuration.
See Documentation and Issue.
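With that release, the original approach should work once CI_CONFIG_DIR is defined as an instance, group, or project CI/CD variable (in the CI/CD settings, not in the .gitlab-ci.yml itself); a minimal sketch:
include:
  - '${CI_CONFIG_DIR}/linux.yml'
  - '${CI_CONFIG_DIR}/windows.yml'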

Got "ZIP does not support timestamps before 1980" while deploying a Go Cloud Function on GCP via Triggers

Problem:
I am trying to deploy a function with this step in a second-level compilation (second-level-compilation.yaml):
- name: 'gcr.io/cloud-builders/gcloud'
  args: ['beta', 'functions',
         'deploy', '${_FUNCTION_NAME}',
         '--source', 'path/to/function',
         '--runtime', 'go111',
         '--region', '${_GCP_CLOUD_FUNCTION_REGION}',
         '--entry-point', '${_ENTRYPOINT}',
         '--env-vars-file', '${_FUNCTION_PATH}/.env.${_DEPLOY_ENV}.yaml',
         '--trigger-topic', '${_TRIGGER_TOPIC_NAME}',
         '--timeout', '${_FUNCTION_TIMEOUT}',
         '--service-account', '${_SERVICE_ACCOUNT}']
I get this error from Cloud Build in the Console:
Step #1: Step #11: ERROR: (gcloud.beta.functions.deploy) Error creating a ZIP archive with the source code for directory path/to/function: ZIP does not support timestamps before 1980
Here is the global flow:
The following step is in a first-level compilation (first-level-compilation.yaml), which is triggered by Cloud Build from a GitHub repository (via the Google Cloud Build GitHub app):
- name: 'gcr.io/cloud-builders/gcloud'
  entrypoint: 'bash'
  args: ['-c', 'launch-second-level-compilation.sh ${_MY_VAR}']
The script "launch-second-level-compilation.sh" does specific operations based on ${_MY_VAR} and then launches a second-level compilation passing a lot of substitutions variables with "gcloud builds submit --config=second-level-compilation.yaml --substitutions=_FUNCTION_NAME=val,_GCP_CLOUD_FUNCTION_REGION=val,....."
Then, the "second-level-compilation.yaml" described at the beginning of this question is executed, using the substitutions values generated and passed through the launch-second-level-compilation.sh script.
The main idea here is to have a generic first-level-compilation.yaml in charge of calling a second-level compilation with specific dynamically generated substitutions.
Attempts / Investigations
As described in the issue Cloud Container Builder, ZIP does not support timestamps before 1980, I tried to ls the files in the /workspace directory. But none of the files at the /workspace root have a strange date.
I changed path/to/function from a relative path to /workspace/path/to/function, but no success; unsurprisingly, as the resolved directory ends up the same.
Please make sure you don't have folders without files. For example:
|--dir
|  |--subdir1
|  |  |--file1
|  |--subdir2
|     |--file2
In this example dir doesn't directly contain any file, only subdirectories. During deployment the gcloud SDK puts dir into the tarball without copying its last-modified field, so it defaults to 1 Jan 1970, which causes problems with ZIP (the format does not support timestamps before 1980).
As a possible workaround, just make sure every directory contains at least one file.
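A quick way to locate such directories before deploying is a small shell check; a sketch (path/to/function as in the question):
# print directories that directly contain no regular files (only subdirectories, or nothing)
find path/to/function -type d | while read -r d; do
  [ -z "$(find "$d" -maxdepth 1 -type f)" ] && echo "$d"
done
Dropping a placeholder file into each reported directory (e.g. touch "$d/placeholder.txt"; the file name is made up) avoids the 1970 timestamp.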

Is it possible to rebuild only updated files in GitLab CI?

I'm using this script for my GitLab CI build stage (only the relevant part is shown):
cache:
  key: "$CI_BUILD_REF"
  paths:
    - bin/
    - build/

build:
  image: <my_build_image>
  stage: build
  script:
    - "make PLATFORM='x86_64-linux-gnu' BUILD='release' JOBS=8 all"
  only:
    - master
    - tags
    - merge-requests
  artifacts:
    untracked: true
    paths:
      - bin/x86_64-linux-gnu/release
I thought that if I added the bin and build dirs to the cache, make wouldn't rebuild the whole project every time (just like it behaves locally), but it seems the CI runner overwrites my src dir every time, so the timestamps on the files are updated too and make thinks each file has changed. I thought about including the src dir in the cache, but it's tracked by the repo and I'm not sure that is correct. So, what is the best way to rebuild a GitLab CI project using previously built binaries?
I see you are using $CI_BUILD_REF as the cache key; although this variable is deprecated, it seems to work and provides the commit's SHA-1.
Is that really what you intended: to create separate caches per commit (not even per branch)? So for any new commit there wouldn't be a cache anyway.
I'd probably use a static cache key in order to maximize caching (while using minimal cache storage), or maybe one per branch.
Maybe the Git checkouts and/or branch switches also touch the source files too often.
I have implemented a similar strategy in one of my projects, but there I have a distinct "cached" folder to which I rsync the files from the checkout.
The shared runners on GitLab.com do seem to leave the file modification time intact when using a cache, and even on the main checkout.
I've put up a sample project with a CI job that demonstrates this fact at https://gitlab.com/hannibal218bc/test-build-cache-xtimes/-/jobs/1022404894. The job:
- stats the directory's contents
- creates a cached directory if it does not yet exist
- copies the README.md file
- "compiles" the file to a README.built file.
As you can see in the output, the modification timestamp of README.built is from the previous job's run:
$ cd cached
$ stat README.* || true
File: README.built
Size: 146 Blocks: 16 IO Block: 4096 regular file
Device: 809h/2057d Inode: 2101510 Links: 1
Access: (0644/-rw-r--r--) Uid: ( 0/ root) Gid: ( 0/ root)
Access: 2021-02-10 23:06:13.000000000 +0000
Modify: 2021-02-10 23:02:39.000000000 +0000 <<< timestamp from previous job
Change: 2021-02-10 23:06:13.000000000 +0000
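For reference, the static-key variant suggested above would look like this in the cache section; a minimal sketch (the key name is arbitrary):
cache:
  key: "build-cache"   # one shared cache across commits and branches
  paths:
    - bin/
    - build/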

Have an Ansible role retrieve its files from an external location as part of its own role

One thing we've encountered in our project is that we do not want to store our large files in our Git repo for our Ansible roles, because it slows down cloning (and our Git host limits files to 100 MB anyway).
What we've done is store our files in a separate internal location, where our files can sit statically and have no size restrictions. Our roles are written so that they first pull these static files into their local files folder and then continue as normal.
i.e.
roles/foo/tasks/main.yml
- name: Create role's files directory
  file:
    path: "{{ roles_files_directory }}"
    state: directory

- name: Copy static foo to local
  get_url:
    url: "{{ foo_static_gz }}"
    dest: "{{ roles_files_directory }}/{{ foo_gz }}"

# ... do rest of the tasks ...

roles/foo/vars/main.yml
roles_files_directory: "/some/path/roles/foo/files"
foo_static_gz: "https://internal.foo.tar.gz"
foo_gz: "foo.tar.gz"
The main thing I don't find really sound is the hard-coded path to the role's files directory. I would prefer to look up the path dynamically when running Ansible, but I haven't been able to find documentation on that. The issue can arise because different users may check out roles to different root paths. Does anyone know how to dynamically determine the role path, or have some other pattern that solves the overall problem?
Edit:
I discovered there's actually a {{ playbook_dir }} variable that would return "/some/path", which might be dynamic enough in this case. It still isn't safe against the role name changing, but that's a much rarer occurrence and can be handled through version control.
What about passing values from the command line?
---
- hosts: '{{ hosts }}'
  remote_user: '{{ user }}'
  tasks:
    - ...
ansible-playbook release.yml --extra-vars "hosts=vipers user=starbuck"
http://docs.ansible.com/playbooks_variables.html#passing-variables-on-the-command-line
I just want to add another possible solution: you can try adding custom facts.
Here is a link to the official documentation: http://docs.ansible.com/setup_module.html
And I found this article that might be useful: http://serverascode.com/2015/01/27/ansible-custom-facts.html
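On the original question of resolving the role's own path dynamically: newer Ansible versions also expose a role_path magic variable inside a running role, which removes the hard-coded path entirely. A minimal sketch, reusing the question's variables:
# roles/foo/tasks/main.yml - using Ansible's role_path magic variable
- name: Create role's files directory
  file:
    path: "{{ role_path }}/files"
    state: directory

- name: Copy static foo to local
  get_url:
    url: "{{ foo_static_gz }}"
    dest: "{{ role_path }}/files/{{ foo_gz }}"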

Get MSDeploy to skip specific folders and file types in folders as CCNet task

I want MSDeploy to skip specific folders, and file types within other folders, when using sync. Currently I'm using CCNet to call MSDeploy with the sync verb to take websites from a build to a staging server. Because there are files on the destination that are created by the application (user-uploaded files etc.), I need to exclude specific folders from being deleted on the destination. There are also manifest files created by the site that need to remain on the destination.
At the moment I've used -enableRule:DoNotDeleteRule but that leaves stale files on the destination.
<exec>
  <executable>$(MsDeploy)</executable>
  <baseDirectory>$(ProjectsDirectory)$(projectName)$(ProjectsWorkingDirectory)\Website\</baseDirectory>
  <buildArgs>-verb:sync
    -source:iisApp="$(ProjectsDirectory)$(projectName)$(ProjectsWorkingDirectory)\Website\"
    -dest:iisApp="$(website)/$(websiteFolder)"
    -enableRule:DoNotDeleteRule</buildArgs>
  <buildTimeoutSeconds>600</buildTimeoutSeconds>
  <successExitCodes>0,1,2</successExitCodes>
</exec>
I have tried to use the skip operation but ran into problems. Initially I dropped the DoNotDeleteRule and replaced it with multiple -skip directives:
<exec>
  <executable>$(MsDeploy)</executable>
  <baseDirectory>$(ProjectsDirectory)$(projectName)$(ProjectsWorkingDirectory)\Website\</baseDirectory>
  <buildArgs>-verb:sync
    -source:iisApp="$(ProjectsDirectory)$(projectName)$(ProjectsWorkingDirectory)\Website\"
    -dest:iisApp="$(website)/$(websiteFolder)"
    -skip:objectName=dirPath,absolutePath="assets"
    -skip:objectName=dirPath,absolutePath="survey"
    -skip:objectName=dirPath,absolutePath="completion/custom/complete*.aspx"
    -skip:objectName=dirPath,absolutePath="completion/custom/surveylist*.manifest"
    -skip:objectName=dirPath,absolutePath="content/scorecardsupport"
    -skip:objectName=dirPath,absolutePath="Desktop/docs"
    -skip:objectName=dirPath,absolutePath="_TempImageFiles"</buildArgs>
  <buildTimeoutSeconds>600</buildTimeoutSeconds>
  <successExitCodes>0,1,2</successExitCodes>
</exec>
But this results in the following:
Error: Source (iisApp) and destination (contentPath) are not compatible for the given operation.
Error count: 1.
So I changed from iisApp to contentPath, and instead of dirPath,absolutePath just Directory, like this:
<exec>
  <executable>$(MsDeploy)</executable>
  <baseDirectory>$(ProjectsDirectory)$(projectName)$(ProjectsWorkingDirectory)\Website\</baseDirectory>
  <buildArgs>-verb:sync
    -source:contentPath="$(ProjectsDirectory)$(projectName)$(ProjectsWorkingDirectory)\Website\"
    -dest:contentPath="$(website)/$(websiteFolder)"
    -skip:Directory="assets"
    -skip:Directory="survey"
    -skip:Directory="content/scorecardsupport"
    -skip:Directory="Desktop/docs"
    -skip:Directory="_TempImageFiles"</buildArgs>
  <buildTimeoutSeconds>600</buildTimeoutSeconds>
  <successExitCodes>0,1,2</successExitCodes>
</exec>
and this gives me an error: Illegal characters in path:
<buildresults>
  Info: Adding MSDeploy.contentPath (MSDeploy.contentPath).
  Info: Adding contentPath (C:\WWWRoot\MySite
    -skip:Directory=assets
    -skip:Directory=survey
    -skip:Directory=content/scorecardsupport
    -skip:Directory=Desktop/docs
    -skip:Directory=_TempImageFiles).
  Info: Adding dirPath (C:\WWWRoot\MySite
    -skip:Directory=assets
    -skip:Directory=survey
    -skip:Directory=content/scorecardsupport
    -skip:Directory=Desktop/docs
    -skip:Directory=_TempImageFiles).
</buildresults>
<buildresults>
  Error: Illegal characters in path.
  Error count: 1.
</buildresults>
So I need to know how to configure this task so that the referenced folders do not have their contents deleted in a sync, and so that the *.manifest and *.aspx files in the completion/custom folders are also skipped.
The issue with this was... line breaks!
Splitting each -skip directive onto a new line was causing the illegal characters in the path. Running all the skip directives inline has solved this:
<exec>
  <executable>$(MsDeploy)</executable>
  <baseDirectory>$(ProjectsDirectory)$(projectName)$(ProjectsWorkingDirectory)\Website\</baseDirectory>
  <buildArgs>-verb:sync -source:contentPath="$(ProjectsDirectory)$(projectName)$(ProjectsWorkingDirectory)\Website\" -dest:contentPath="C:\WWWRoot\$(websiteFolder)" -skip:Directory="assets" -skip:Directory="_TempImageFiles" -skip:objectName=dirPath,absolutePath="\\Desktop\\Docs"</buildArgs>
  <buildTimeoutSeconds>600</buildTimeoutSeconds>
  <successExitCodes>0,1,2</successExitCodes>
</exec>
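For the *.aspx and *.manifest patterns still needed from the question, the same inline approach should work with filePath skips. MSDeploy treats absolutePath as a regular expression rather than a wildcard, so a sketch (untested, appended to the buildArgs above) might be:
-skip:objectName=filePath,absolutePath="completion\\custom\\complete.*\.aspx$" -skip:objectName=filePath,absolutePath="completion\\custom\\surveylist.*\.manifest$"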
Take a look at the MSDN article entitled Web Deployment: Excluding Files and Folders via the Web Application's Project File, specifically the section "Excluding Specific Files / Folders". This will both keep directories, files, and file/dir pattern matches out of the deployment package and leave them untouched on the destination when deployed.
However, I would take a step back and ask why these files exist in your web project in the first place. The way I've handled user-uploaded content on IIS is by adding a virtual directory to my web application. The contents of virtual directories (and the provisioning of the vdir itself) are ignored when doing a sync on a web deploy package. This also gives you the benefit of hosting the client content directory anywhere you like, which has a whole score of advantages (e.g. a bigger disk drive, prevention of denial of service by unscrupulous users who try to fill your hard disk with garbage data, etc.). A sketch of creating such a vdir is shown below.
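For illustration, creating such a virtual directory with IIS's appcmd could look like this (a sketch; the site name and paths are made up):
%windir%\system32\inetsrv\appcmd.exe add vdir /app.name:"Default Web Site/" /path:"/uploads" /physicalPath:"D:\UserContent\uploads"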
