Can CircleCI run a conditional step even on failure - continuous-integration

I'm attempting to upload test results in a separate step from the one that executes the tests, even when one of the tests fails. But I only want to upload results from specific branches.
Can when: be used this way? Is there a better alternative?
- when: always
  condition:
    matches:
      pattern: '^dev$'
      value: << pipeline.git.branch >>
  steps:
    - run:
        name: Upload Test Results
        command: <code here>
The code above results in a build error: Unable to parse YAML: mapping values are not allowed here in 'string', condition:
https://circleci.com/docs/2.0/configuration-reference/#the-when-attribute

It can, I was just using the wrong syntax: the branch check goes under the when step's condition, while when: always belongs on the individual run step.
- when:
    condition:
      matches:
        pattern: '^dev$'
        value: << pipeline.git.branch >>
    steps:
      - run:
          name: Upload Test Results
          when: always
          command: <code here>
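As an aside (a sketch, not from the original answer; the workflow and job names are illustrative): if the branch restriction can live at the workflow level instead, CircleCI branch filters keep the whole job gated to dev, and the upload step then only needs when: always:
workflows:
  test-and-upload:
    jobs:
      - test:
          filters:
            branches:
              only: dev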

Related

GitHub Actions: make job dependent on at least one other job from an array, but not all of them

In my GitHub workflow, I have a finish job that uploads the coverage report to Coveralls once all other jobs are finished.
jobs:
  foo: ...
  bar: ...
  finish:
    needs:
      - foo
      - bar
If either foo or bar fails, though, or is skipped (why a job may be skipped is not relevant to this question), the finish job won't run.
Is there a way I could make finish run if at least one of the jobs provided in the needs field is run successfully?
It's currently not possible to access job.status or job.conclusion of other jobs natively in workflow conditionals.
A workaround could be to use outputs or artifacts to always save each job status.
Here is an example using artifacts with 3 jobs, where the third job would check the previous 2 jobs status before executing some operation:
jobs:
  JOB_01:
    name: Job 01
    . . .
    steps:
      - name: Some steps of job 01
        . . .
      - name: Create file status_job01.txt and write the job status into it
        if: always()
        run: |
          echo ${{ job.status }} > status_job01.txt
      - name: Upload file status_job01.txt as an artifact
        if: always()
        uses: actions/upload-artifact@v1
        with:
          name: pass_status_job01
          path: status_job01.txt
  JOB_02:
    name: Job 02
    . . .
    steps:
      - name: Some steps of job 02
        . . .
      - name: Create file status_job02.txt and write the job status into it
        if: always()
        run: |
          echo ${{ job.status }} > status_job02.txt
      - name: Upload file status_job02.txt as an artifact
        if: always()
        uses: actions/upload-artifact@v1
        with:
          name: pass_status_job02
          path: status_job02.txt
  JOB_03:
    needs: [JOB_01, JOB_02]
    if: always()
    name: Job 03
    . . .
    steps:
      - name: Download artifact pass_status_job01
        uses: actions/download-artifact@v1
        with:
          name: pass_status_job01
      - name: Download artifact pass_status_job02
        uses: actions/download-artifact@v1
        with:
          name: pass_status_job02
      - name: Set the statuses of Job 01 and Job 02 as output parameters
        id: set_outputs
        run: |
          echo "::set-output name=status_job01::$(<pass_status_job01/status_job01.txt)"
          echo "::set-output name=status_job02::$(<pass_status_job02/status_job02.txt)"
      - name: Show the values of the outputs
        run: |
          # using the syntax steps.<step_id>.outputs.<output_name> to access the output parameters
          echo "status_job01 = ${{ steps.set_outputs.outputs.status_job01 }}"
          echo "status_job02 = ${{ steps.set_outputs.outputs.status_job02 }}"
      - name: Some other steps of job 03
        . . .
The remaining steps of JOB_03 would then use the output values from the other jobs to decide whether or not to perform an action.
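For example (a minimal sketch, not part of the original answer), a final step in JOB_03 could fire only when at least one of the recorded statuses is success:
      - name: Run finish logic if at least one job succeeded
        if: steps.set_outputs.outputs.status_job01 == 'success' || steps.set_outputs.outputs.status_job02 == 'success'
        run: echo "at least one upstream job succeeded"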
You can find more references about this example on the links below:
Get status of multiple jobs in the same workflow
How to get status of previous job
I also wrote an example here:
workflow file
workflow run

GitHub Actions pass list of variables to shell script

Using GitHub Actions, I would like to invoke a shell script with a list of directories.
(Essentially equivalent to passing an Ansible vars list to the shell script)
I don't really know how to do this; is it even possible? Here's what I have so far; how could one improve it?
name: CI
on:
  push:
    branches:
      - master
    tags:
      - v*
  pull_request:
jobs:
  run-script:
    runs-on: ubuntu-20.04
    steps:
      - uses: actions/checkout@v2
      - name: Run script on targets
        run: ./.github/workflows/script.sh {{ targets }}
        env:
          targets:
            - FolderA/SubfolderA/
            - FolderB/SubfolderB/
Today I was able to do this with the following YAML (truncated):
...
with:
  targets: |
    FolderA/SubfolderA
    FolderB/SubfolderB
The actual GitHub Action passes this as an argument like the following:
runs:
  using: docker
  image: Dockerfile
  args:
    - "${{ inputs.targets }}"
This simply passes the parameters as a single string with embedded newline characters, which can then be iterated over like an array, in a POSIX-compliant manner, with the following shell code:
#!/bin/sh -l
targets="${1}"

for target in $targets
do
    echo "Proof that this code works: $target"
done
This should accomplish your desired task, if I understand the question correctly. You can always run something like sh ./script.sh $target inside the loop if your use case requires it.
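If you are not writing a custom Docker action, a similar trick works in a plain workflow step (a sketch following the same idea; the paths are the ones from the question) by putting the newline-separated list in an environment variable:
- name: Run script on targets
  env:
    TARGETS: |
      FolderA/SubfolderA/
      FolderB/SubfolderB/
  run: |
    # word-splitting on the embedded newlines yields one path per iteration
    for target in $TARGETS; do
      ./.github/workflows/script.sh "$target"
    done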

using for-loop in azure pipeline jobs

I want to use a for-loop that scans the files (values-f1.yaml, values-f2.yaml, ...) in a folder and each time uses the filename as a variable to run an Azure pipeline job that deploys the Helm chart based on that values file. The folder is located in the GitHub repository. So I'm thinking of something like this:
pipeline.yaml
stages:
  - stage: Deploy
    variables:
      azureResourceGroup: ''
      kubernetesCluster: ''
      subdomain: ''
    jobs:
      ${{ each filename in /myfolder/*.yaml }}:
        valueFile: $filename
        - template: Templates/deploy-helmchart.yaml@pipelinetemplates
deploy-helmchart.yaml
jobs:
  - job: Deploy
    pool:
      vmImage: 'ubuntu-latest'
    steps:
      - task: HelmInstaller@1
        displayName: 'Installing Helm'
        inputs:
          helmVersionToInstall: '2.15.1'
        condition: and(succeeded(), startsWith(variables['build.sourceBranch'], 'refs/tags/v'))
      - task: HelmDeploy@0
        displayName: 'Initializing Helm'
        inputs:
          connectionType: 'Azure Resource Manager'
          azureSubscription: $(azureSubscription)
          azureResourceGroup: $(azureResourceGroup)
          kubernetesCluster: $(kubernetesCluster)
          command: 'init'
        condition: and(succeeded(), startsWith(variables['build.sourceBranch'], 'refs/tags/v'))
      - task: PowerShell@2
        displayName: 'Fetching GitTag'
        inputs:
          targetType: 'inline'
          script: |
            # Write your PowerShell commands here.
            Write-Host "Fetching the latest GitTag"
            $gt = git describe --abbrev=0
            Write-Host "##vso[task.setvariable variable=gittag]$gt"
        condition: and(succeeded(), startsWith(variables['build.sourceBranch'], 'refs/tags/v'))
      - task: Bash@3
        displayName: 'Fetching repo-tag'
        inputs:
          targetType: 'inline'
          script: |
            echo GitTag=$(gittag)
            echo BuildID=$(Build.BuildId)
            echo SourceBranchName=$(Build.SourceBranchName)
            echo ClusterName=$(kubernetesCluster)
      - task: HelmDeploy@0
        displayName: 'Upgrading helmchart'
        inputs:
          connectionType: 'Azure Resource Manager'
          azureSubscription: $(azureSubscription)
          azureResourceGroup: $(azureResourceGroup)
          kubernetesCluster: $(kubernetesCluster)
          command: 'upgrade'
          chartType: 'FilePath'
          chartPath: $(chartPath)
          install: true
          releaseName: $(releaseName)
          valueFile: $(valueFile)
          arguments: '--set image.tag=$(gittag) --set subdomain=$(subdomain)'
        condition: and(succeeded(), startsWith(variables['build.sourceBranch'], 'refs/tags/v'))
Another thing: can the jobs access the GitHub repo by default, or do I need to do something at the job level?
Besides, how can I use a for-loop in the job for this case?
Any help would be appreciated.
Updated after getting comments from @Leo
Here is a PowerShell task that I added in deploy-helmchart.yaml for fetching the files from a folder in GitHub.
- task: PowerShell@2
  displayName: 'Fetching Files'
  inputs:
    targetType: 'inline'
    script: |
      Write-Host "Fetching values files"
      cd myfolder
      $a = git ls-files
      foreach ($i in $a) {
        Write-Host "##vso[task.setvariable variable=filename]$i"
        Write-Host "printing" $i
      }
Now the question is: how can I run the HelmDeploy@0 task for each file using parameters?
Can the jobs access the GitHub repo by default, or do I need to do something at the job level?
The answer is yes.
We could add a command line task in the jobs, like job1, to clone the GitHub repository with a GitHub PAT; then we can access those files (values-f1.yaml, values-f2.yaml, ...) in $(Build.SourcesDirectory):
git clone https://<GithubPAT>@github.com/XXXXX/TestProject.git
Besides, how can I use a for-loop in the job for this case?
You could create a template which will have a set of actions, and pass parameters across during your build, like:
deploy-helmchart.yaml:
parameters:
  param: []

steps:
  - ${{ each filename in parameters.param }}:
      - script: 'echo ${{ filename }}'
pipeline.yaml:
steps:
  - template: deploy-helmchart.yaml
    parameters:
      param: ["filename1", "filename2", "filename3"]
Check the document Solving the looping problem in Azure DevOps Pipelines for some more details.
Command line to get the latest file name in the folder:
FOR /F "delims=|" %%I IN ('DIR "$(Build.SourcesDirectory)\*.txt*" /B /O:D') DO SET NewestFile=%%I
echo "##vso[task.setvariable variable=NewFileName]%NewestFile%"
Update:
Now the question is: how can I run the HelmDeploy@0 task for each file using parameters?
It depends on whether your HelmDeploy task has options to accept the filename parameter.
As I said before, we could use the following YAML to invoke the template with parameters:
- template: deploy-helmchart.yaml
  parameters:
    param: ["filename1", "filename2", "filename3"]
But if the HelmDeploy task has no option to accept such a parameter, we cannot run the HelmDeploy@0 task for each file using parameters.
I then checked HelmDeploy@0 and found there is only one option that can accept Helm command parameters.
So the answer to this question depends on whether your file name can be passed as a Helm command parameter: if not, you cannot run the HelmDeploy@0 task for each file using parameters; if yes, you can.
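For illustration (a sketch building on the template approach above, not part of the original answer; the parameter name valueFiles is an assumption), an each expression can stamp out one HelmDeploy@0 step per values file by feeding the loop variable into the task's valueFile input:
# deploy-helmchart.yaml (hypothetical template)
parameters:
  valueFiles: []

steps:
  - ${{ each valueFile in parameters.valueFiles }}:
      - task: HelmDeploy@0
        displayName: 'Upgrading helmchart with ${{ valueFile }}'
        inputs:
          command: 'upgrade'
          chartType: 'FilePath'
          chartPath: $(chartPath)
          releaseName: $(releaseName)
          valueFile: ${{ valueFile }}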
Please check the official document Templates for some more details.
Hope this helps.

Using an array of values to repeat a step in GitHub Actions workflow

I am trying to create a GitHub Actions workflow which would collect specific paths changed in the last commit and run a step for each of the collected paths, if any.
Currently, in my workflow I'm creating an array of paths, but I'm not sure how to proceed with my array:
name: Test
on:
  push
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v1
      # This step will create "an array" of strings, e.g. "path1 path2 path3"
      - name: array
        id: arr
        run: |
          arr=()
          for i in $(git diff-tree --no-commit-id --name-only -r ${{ github.sha }})
          do
            if [[ $i == *"path1"* ]]; then
              arr+=("path1")
            fi
            if [[ $i == *"path2"* ]]; then
              arr+=("path2")
            fi
          done
          echo ::set-output name=arr::${arr[@]}
      # How to run this step by iterating the `${{ steps.arr.outputs.arr }}`?
      - name: reviewdog-lint
        uses: reviewdog/action-eslint@v1
        with:
          github_token: ${{ secrets.GITHUB_TOKEN }}
          reporter: github-pr-review
          eslint_flags: 'my_project/some_folder/${{ SINGLE_PATH }}/' # `SINGLE_PATH` would be a path from the array
Is something like this even possible in the first place? If not, what would be the recommended way to loop through some values and use them as arguments in other workflow steps?
There is some support for this in GitHub Actions. There is a very good tutorial here that explains how to do it in detail, but essentially you split the steps into two jobs: the first job outputs a JSON array that serves as the input to the matrix of the second job.
Here's a simple example:
jobs:
  setup:
    runs-on: ubuntu-latest
    outputs:
      matrix: ${{ steps.matrix.outputs.value }}
    steps:
      - id: matrix
        run: |
          echo '::set-output name=value::["a", "b", "c"]'
  build:
    needs: [ setup ]
    runs-on: ubuntu-latest
    strategy:
      matrix:
        value: ${{ fromJson(needs.setup.outputs.matrix) }}
    steps:
      - run: |
          echo "${{ matrix.value }}"
Difficult to say without running it, but I would say you need to use the output in the second step by assigning it to an environment variable, something like:
env:
  OUTPUT: ${{ steps.arr.outputs.arr }}
Then you use $OUTPUT as an environment variable inside the action.
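For instance (a minimal sketch; the step id arr matches the question's workflow, the rest is illustrative):
- name: lint each path
  env:
    OUTPUT: ${{ steps.arr.outputs.arr }}
  run: |
    # the output is a space-separated string, so word-splitting iterates it
    for single_path in $OUTPUT; do
      echo "would lint my_project/some_folder/$single_path/"
    done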
The problem with that particular action is that it takes one commit at a time. But you can check out the code; it's a shell script. You can fork it from line 15 and make it split the input and run a loop over it, applying eslint to each entry.

How to validate that a file has a specific substring with Ansible?

I would like to have an assert or fail task in my Ansible play that validates that the right code build was deployed. The deployment comes with a version.properties file which has the build info I care about.
The 'correct' code version comes from a vars file and is called desired_build_id.
How can I validate that my version.properties mentions this build ID? Some sort of substring search?
I've tried the following:
---
- name: Validate deployment success
  hosts: app-nodes
  tasks:
    - name: Read version.properties file
      shell: cat /path/to/version.properties
      register: version_prop_content

    - fail: msg="Wrong build ID found in version.properties"
      when: desired_build_id not in version_prop_content.stdout
However, that gives an error: error while evaluating conditional: desired_build_id not in version_prop_content.stdout
What's the right syntax for this? Or, is there a better way?
A much simpler Python expression would also do:
- name: Read version.properties file
  shell: cat /path/to/version.properties
  register: version_prop_content

- debug: msg="desired build installed"
  when: "'{{ desired_build_id }}' in '{{ version_prop_content.stdout }}'"
Or, as I always recommend, avoid using Ansible as far as possible (grep exits non-zero when the pattern is missing, so the task fails exactly when the build ID is absent):
- name: verify version
  shell: grep '{{ desired_build_id }}' /path/to/version.properties
Figured it out!
The way to do the substring comparison is with version_prop_content.stdout.find(desired_build_id) != -1, which is true if the substring is present.
find returns the index of the substring, and -1 if it is not present (a > 0 check would wrongly fail when the substring sits at index 0, the very start of the file).
I also changed it to an assert task to make it look a bit prettier (fail is such an ugly word ;) ).
- name: Check that desired version was deployed
  assert:
    that:
      - version_prop_content.stdout.find(desired_build_id) != -1
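As an aside (not from the original answers), the same check can be written without find by using Jinja2's in operator directly, since both values are plain strings:
- name: Check that desired version was deployed
  assert:
    that:
      - desired_build_id in version_prop_content.stdout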
