I have a pipeline that runs regularly every 8 hours. Its task is to check out a bunch of repositories, compile multiple projects and run some tests. The pipeline's YAML file is stored in Repo1:
resources:
  repositories:
  - repository: Repo1
    type: git
    name: Repo1
    ref: main
  - repository: Repo2
    type: git
    name: Repo2
    ref: main
  - repository: Repo3
    type: git
    name: Repo3

stages:
- stage: 'CompileAndTest'
  jobs:
  - job: 'x86'
    timeoutInMinutes: 240
    steps:
    - checkout: Repo1
      lfs: true
    - checkout: Repo2
      lfs: true
    - checkout: Repo3
      lfs: true

trigger: none

schedules:
- cron: "0 6,14,22 * * 1-5"
  displayName: daily every 8 hours
  branches:
    include:
    - main
  always: true
In order to link work items to commits I enabled the appropriate option within my pipeline:
Now I get a few work-items that are related to the build within the build itself:
As an aside, in the image there are 19 repos; for the sake of simplicity I only mention 3 here.
The commit associated with that item was made in Repo2, i.e. in a different repo than the one where the pipeline is stored. However, when I go to the mentioned item, the build does not show up in its "Integrated in build" field. Is that because the commit was made in another repo than the pipeline?
I have a pipeline which I want to trigger when a PR is merged into master. I have tried different things, but none of them worked; I am not getting any error, and the pipeline is not triggering.
I do not want this to be triggered on PR creation; I want it to be triggered when the PR is merged into master. That is the reason I have not added a pr section in my YML.
What am I missing here?
Approaches:
Enabled "Continuous Integration" option
Following trigger syntax per Microsoft recommendation
trigger:
  batch: True
  branches:
    include:
    - master
  paths:
    include:
    - cosmos
Setting up valid YML file
Pipeline:
# Starter pipeline
# Start with a minimal pipeline that you can customize to build and deploy your code.
# Add steps that build, run tests, deploy, and more:
# https://aka.ms/yaml

trigger:
  batch: True
  branches:
    include:
    - master
  paths:
    include:
    - cosmos

stages:
- stage: SOME_PATH_dev
  displayName: SOME_PATH_dev
  jobs:
  - deployment: 'DeployToDev'
    environment: Dev
    cancelTimeoutInMinutes: 1
    pool:
      vmImage: ubuntu-latest
    strategy:
      runOnce:
        deploy:
          steps:
          - checkout: self
          - task: AzureResourceGroupDeployment@2
            displayName: 'Azure Deployment:Create Or Update Resource Group action on SOME_PATH_dev'
            inputs:
              ConnectedServiceName: SOME_KEY
              resourceGroupName: SOME_PATH_dev
              location: West US
              csmFile: cosmos/deploy.json
              csmParametersFile: cosmos/parameters-dev.json
              deploymentName: SOME_PATH.Cosmos.DEV
Repo Structure:
References:
"Configuring the trigger failed, edit and save the pipeline again" with no noticeable error and no further details
Azure Devops build pipeline: CI triggers not working on PR merge to master when PR set to none
https://medium.com/@aksharsri/add-approval-gates-in-azure-devops-yaml-based-pipelines-a06d5b16b7f4
https://erwinstaal.nl/posts/manual-approval-in-an-azure-devops-yaml-pipeline/
It turned out I was changing the YML (pipeline) files instead of files inside the cosmos folder and expecting the trigger to fire; since the path filter only includes cosmos, those commits never matched.
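If changes to the pipeline definition itself should also trigger a run, the path filter needs to include the YAML file as well; a minimal sketch, assuming the pipeline file is named azure-pipelines.yml and lives at the repository root (that name and location are assumptions):

trigger:
  batch: True
  branches:
    include:
    - master
  paths:
    include:
    - cosmos
    - azure-pipelines.yml  # assumed path of the pipeline file; adjust to the real one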
I have self-hosted agents in multiple environments that I am trying to run the same build/deploy processes on. I would like to be able to deploy the same code from a single repo to multiple systems concurrently, so I have created an "overhead" pipeline and several "process" pipeline templates. Everything seems to be going very well, except when I try to check out the same repo twice in the same pipeline execution. I get the following error:
An error occurred while loading the YAML build pipeline. An item with the same key has already been added.
I would really like to be able to just click ONE button to trigger a main pipeline that calls all the templates it requires and passes the parameters needed to get all my jobs done at once. I could of course define this "overhead" pipeline and then queue up as many instances of it as I need, one per system I need to deploy to, but I'm lazy, hence why I'm using pipelines!
As soon as I remove the checkout from Common.yml, the validation succeeds without any issues. If I keep the checkout in there but only call Common.yml once for the entire Overhead pipeline, it also succeeds without any issues. But the problem is: I need to pull the contents of the repo onto EACH of my agents, which run in completely separate environments that can never talk to each other (I can't pull the information to one agent and have it do some sort of "copy" to all the other agent locations).
Any assistance is very much welcomed, thank you!
The following is my "overhead" pipeline:
# azure-pipelines.yml
trigger: none

parameters:
- name: vLAN
  type: string
  default: 851
  values:
  - 851
  - 1105

stages:
- stage: vLAN851
  condition: eq('${{ parameters.vLAN }}', '851')
  pool:
    name: xxxxx
    demands:
    - vLAN -equals 851
  jobs:
  - job: Common_851
    steps:
    - template: Procedures/Common.yml
  - job: Export_851
    dependsOn: Common_851
    steps:
    - template: Procedures/Export.yml
      parameters:
        Server: ABTS-01
- stage: vLAN1105
  condition: eq('${{ parameters.vLAN }}', '1105')
  pool:
    name: xxxxx
    demands:
    - vLAN -equals 1105
  jobs:
  - job: Common_1105
    steps:
    - template: Procedures/Common.yml
  - job: Export_1105
    dependsOn: Common_1105
    steps:
    - template: Procedures/Export.yml
      parameters:
        Server: OTS-01
And here is the "Procedures/Common.yml":
steps:
- checkout: git://xxxxx/yyyyy#$(Build.SourceBranchName)
  clean: true
  enabled: true
  timeoutInMinutes: 1
- task: UsePythonVersion@0
  enabled: true
  timeoutInMinutes: 1
  displayName: Select correct version of Python
  inputs:
    versionSpec: '3.8'
    addToPath: true
    architecture: 'x64'
- task: CmdLine@2
  enabled: true
  timeoutInMinutes: 5
  displayName: Ensure Python Requirements Installed
  inputs:
    script: |
      python -m pip install GitPython
And here is the "Procedures/Export.yml":
parameters:
- name: Server
  type: string

steps:
- task: PythonScript@0
  enabled: true
  timeoutInMinutes: 3
  displayName: xxxxx
  inputs:
    arguments: --name "xxxxx" --mode True --Server ${{ parameters.Server }}
    scriptSource: 'filePath'
    scriptPath: 'xxxxx/main.py'
I managed to make checkout work with variable branch names by using template expression variables ${{ ... }} instead of macro syntax $(...) variables.
The difference is that template expressions are processed at compile time, while macros are processed at runtime.
So in my case I have something like:
- checkout: git://xxx/yyy#${{ variables.BRANCH_NAME }}
For more information about variable syntax, see:
Understand variable syntax
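For the expression to resolve, the branch name has to be known when the pipeline is compiled, for example as a variable defined in the YAML of the pipeline that includes the template; a minimal sketch, where the variable name BRANCH_NAME and its default value are assumptions:

variables:
  BRANCH_NAME: main   # assumed default; can be overridden when the run is queued

steps:
- checkout: git://xxxxx/yyyyy#${{ variables.BRANCH_NAME }}
  clean: true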
I couldn't get it to work with expressions, but I was able to get it to work using repository resources, following the documentation at: https://learn.microsoft.com/en-us/azure/devops/pipelines/repos/multi-repo-checkout?view=azure-devops
resources:
  repositories:
  - repository: MyGitHubRepo # The name used to reference this repository in the checkout step
    type: git
    name: MyAzureProjectName/MyGitRepo
    ref: $(Build.SourceBranch)

trigger:
- main

pool:
  vmImage: 'ubuntu-latest'

jobs:
- job: some_job
  steps:
  - checkout: MyGitHubRepo
- job: some_other_job
  steps:
  - checkout: MyGitHubRepo
  - script: dir $(Build.SourcesDirectory)
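Mapped back to the question above, the same pattern would mean declaring the shared repository once as a resource in the overhead azure-pipelines.yml and referencing it by alias inside Common.yml instead of the inline git:// checkout; a rough sketch, where the alias CommonRepo is made up:

# azure-pipelines.yml
resources:
  repositories:
  - repository: CommonRepo   # made-up alias
    type: git
    name: xxxxx/yyyyy
    ref: main                # or a compile-time branch expression as shown earlier

# Procedures/Common.yml
steps:
- checkout: CommonRepo
  clean: true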
I'm trying to set up the Bamboo build plan configuration using Bamboo YAML specs (.yml file below). In the last stage (Create Deployment Artifact) I want to use the shared artifacts from the previous stage. By specifying the jobs' artifacts as "shared: true" I can use them in the second stage. However, they both end up in the same destination folder. Using the UI this can easily be changed.
Artifact dependencies
But how can I specify the destination folder of the two artifacts in the Bamboo YAML specs, e.g. from "Root of working directory" to "./app" and "./wwwroot", respectively?
---
version: 2
plan:
  project-key: COCKPIT
  key: BE
  name: Cockpit - Continuous Build - Windows

stages:
  - Build Stage:
      - Build Backend
      - Build Frontend
  - Build Artifact:
      - Create Deployment Artifact

Build Backend:
  requirements:
    - Visual Studio Build Tools (32-bit)
  tasks:
    - checkout:
        repository: cockpit_backend
        path: 'cockpit_backend'
        force-clean-build: false
    - script:
        - dotnet publish .\cockpit_backend\src\Cockpit.WebApi\ --configuration Release
  artifacts:
    - name: BackendBuild
      location: cockpit_backend/src/Cockpit.WebApi/bin/Release/netcoreapp3.1/publish
      pattern: '**/*.*'
      required: true
      shared: true

Build Frontend:
  requirements:
    - os_linux
  tasks:
    - checkout:
        repository: 'Cockpit / cockpit_frontend'
        path: 'cockpit_frontend'
        force-clean-build: false
    - script:
        - cd cockpit_frontend
        - npm install
    - script:
        - cd cockpit_frontend
        - npm run build-prod
  docker:
    image: node:alpine
  artifacts:
    - name: FrontendBuild
      location: cockpit_frontend/dist
      pattern: '**/*.*'
      required: true
      shared: true

Create Deployment Artifact:
  requirements:
    - os_windows
  tasks:
    - script:
        interpreter: powershell
        scripts:
          - $buildDir = "Cockpit"
          - $dest = "Cockpit_${bamboo.buildNumber}.zip"
          - Add-Type -assembly "system.io.compression.filesystem"
          - '[io.compression.zipfile]::CreateFromDirectory($buildDir, $dest)'
  artifacts:
    - name: Completebuild
      pattern: 'Cockpit_${bamboo.buildNumber}.zip'
      required: true
Bamboo YAML specs don't support artifact dependency management, so you need a script task in the "Create Deployment Artifact" job that moves the two shared artifacts into separate folders (e.g. ./app and ./wwwroot) before compressing.
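As a rough sketch of that idea, the script task in the "Create Deployment Artifact" job could stage the two artifacts into ./app and ./wwwroot before zipping; where the shared artifacts actually land in the working directory depends on your artifact definitions, so the backend-publish and frontend-dist source folders below are placeholders:

Create Deployment Artifact:
  requirements:
    - os_windows
  tasks:
    - script:
        interpreter: powershell
        scripts:
          # backend-publish and frontend-dist are placeholders for wherever the
          # shared artifacts are downloaded to in this job's working directory
          - New-Item -ItemType Directory -Force -Path Cockpit\app, Cockpit\wwwroot
          - Copy-Item -Recurse -Force backend-publish\* Cockpit\app
          - Copy-Item -Recurse -Force frontend-dist\* Cockpit\wwwroot
          - Add-Type -assembly "system.io.compression.filesystem"
          - '[io.compression.zipfile]::CreateFromDirectory("Cockpit", "Cockpit_${bamboo.buildNumber}.zip")'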
I am using CircleCI to build a project. Everything is running fine, except that my tags are not built when pushed to GitHub:
I don't understand why. I have reduced my whole configuration to a minimal config file; it's the same logic:
version: 2
jobs:
  my_dummy_job_nightly:
    working_directory: ~/build
    docker:
      - image: docker:git
    steps:
      - checkout
      - setup_remote_docker:
          reusable: true
          exclusive: true
      - run:
          name: NIGHTLY BUILD
          command: |
            apk add --update py-pip
            python -m pip install --upgrade pip
  my_dummy_job_deploy:
    working_directory: ~/build
    docker:
      - image: docker:git
    steps:
      - checkout
      - setup_remote_docker:
          reusable: true
          exclusive: true
      - run:
          name: RELEASE BUILD
          command: |
            apk add --update py-pip
            python -m pip install --upgrade pip

###################################################################################
#                              CircleCI WORKFLOWS                                 #
###################################################################################
workflows:
  version: 2
  build-and-deploy:
    jobs:
      ###############################################################################
      #                             NIGHTLY BUILDS                                  #
      ###############################################################################
      - my_dummy_job_nightly:
          filters:
            tags:
              ignore: /.*/
            branches:
              only: master
      ###############################################################################
      #                               TAGS BUILDS                                   #
      ###############################################################################
      - hold:
          type: approval
          filters:
            tags:
              only: /.*/
            branches:
              ignore: /.*/
      - my_dummy_job_deploy:
          requires:
            - hold
          filters:
            tags:
              only: /.*/
            branches:
              ignore: /.*/
I don't understand why the tags don't build ... The regex should let everything through...
I have finally found the issue. It has nothing to do with the configuration: the CircleCI interface does not show tag builds in the Workflows view, so the approval step blocks the whole process.
To access the workflow and approve the deployment, you have to click on the build and then on the workflow (see below):
Once on the workflow, it is possible to approve the process:
The only solution I have found to make the build appear is to add a dummy (and otherwise useless) step to the build process that appears before the approval.
version: 2
jobs:
  init_tag_build:
    working_directory: ~/build
    docker:
      - image: docker:git
    steps:
      - checkout
      - setup_remote_docker:
          reusable: true
          exclusive: true
      - run:
          name: Launch Build OP
          command: |
            echo "start tag workflow"
  my_deploy_job:
    working_directory: ~/build
    docker:
      - image: docker:git
    steps:
      - checkout
      - setup_remote_docker:
          reusable: true
          exclusive: true
      - run:
          name: DEPLOY BUILD
          command: |
            echo "do the deploy work"

workflows:
  version: 2
  build-and-deploy:
    jobs:
      - init_tag_build:
          filters:
            tags:
              only: /.*/
            branches:
              ignore: /.*/
      - hold:
          type: approval
          requires:
            - init_tag_build
          filters:
            tags:
              only: /.*/
            branches:
              ignore: /.*/
      - my_deploy_job:
          requires:
            - hold
          filters:
            tags:
              only: /.*/
            branches:
              ignore: /.*/
TL;DR
In the yaml you ignore every branch. Remove the following part.
branches:
  ignore: /.*/
You probably meant to build only when tags are pushed, but you ignored all the branches. If you want to build for every branch as well as tags, remove those lines. If you want to build for some branch (e.g. dev) as well as tags, then add branches: only: dev.
The connection between the two specifiers is AND rather than OR. There is a discussion on the CircleCI forum about adding a feature to change it to OR.
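For reference, a filter block that keeps the tag builds and additionally runs the job for a single branch (dev here is only an example) would look something like this:

- my_dummy_job_deploy:
    filters:
      tags:
        only: /.*/
      branches:
        only: dev   # example branch; remove the branches key to run on every branch as well as tags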
What is the best way to commit from a pipeline? The job pulls from one repo, makes some changes and builds, then pushes the new files to a different repo. Is this possible?
You should use the git-resource.
The basic steps of what you are going to want to do are to:
1. Pull from the repo into a container.
2. Do some stuff with the code.
3. Move the new code into a different container.
4. Push the contents of that new container to a different git repository.
Your pipeline configuration should look something like this:
jobs:
- name: pull-code
  plan:
  - get: git-resource-pull
  - get: git-resource-push
  - task: do-something
    inputs:
    - name: git-resource-pull
    run:
      path: /bin/bash
      args:
      - -c
      - |
        pushd git-resource-pull
        # do something
        popd
        # move the code from git-resource-pull to git-resource-push
  - put: git-resource-push
    params: {repository: git-resource-push}

resources:
- name: git-resource-pull
  type: git
  source:
    uri: https://github.com/team/repository-1.git
    branch: master
- name: git-resource-push
  type: git
  source:
    uri: https://github.com/team/repository-2.git
    branch: master
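Note that, as sketched, the task only lists git-resource-pull as an input, so anything written into git-resource-push inside the task would not be visible to the final put. A fuller (but still hypothetical) version of that task, where the container image, the build-output path, the committer identity and the commit message are all assumptions, might look like:

  - task: do-something
    config:
      platform: linux
      image_resource:
        type: registry-image
        source: {repository: alpine/git}   # assumed image that ships git
      inputs:
      - name: git-resource-pull
      - name: git-resource-push
      outputs:
      - name: git-resource-push            # re-declared as an output so the commit survives
      run:
        path: /bin/sh
        args:
        - -c
        - |
          # copy the generated files across (build-output is a placeholder path)
          cp -r git-resource-pull/build-output/. git-resource-push/
          cd git-resource-push
          git config user.email "ci@example.com"   # assumed committer identity
          git config user.name "ci-bot"
          git add -A
          git commit -m "Update files from repository-1"
  - put: git-resource-push
    params: {repository: git-resource-push}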