How to pass a parameter value as a tag for a new image? (YAML)

I need to force the user to change the name of an image they want to push.
The user pulls an image from JFrog and modifies it; they then HAVE TO change the name before uploading it back to JFrog.
I managed to set a parameter that the user can change in the UI by typing something:
parameters:
- name: "changeName"
  type: string
  default: "newname"
Now, in the Docker push task, this value ("newname") has to be appended to the image name after a dash:
[base-image-name]-newname
I want it to look like this:
- task: JFrog.jfrog-artifactory-vsts-extension.artifactory-docker.ArtifactoryDocker@1
  displayName: 'Artifactory Docker Push'
  inputs:
    command: push
    artifactoryService: Jfrog
    targetRepo: 'docker'
    imageName: jfrog-base-image-name-newname
I wrote a PowerShell task that reads this value and appends it to the base image name:
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      $variable = '${{ parameters.changeName }}'
      Set-Variable -Name "newname" -Value "base-image-name-$variable"
      Get-Variable -Name "newname"
      Write-Host "$newname"
It returns the correct new name, "base-image-name-newname".
Now, how do I actually pass this to the Docker push task? What should that task look like?

Have you tried adding the parameter value directly to the imageName string?
I don't have the JFrog.jfrog-artifactory-vsts-extension.artifactory-docker.ArtifactoryDocker@1 task available to test, but please try something like this:
parameters:
- name: changeName
  type: string
  default: newname
# ...etc...

- task: JFrog.jfrog-artifactory-vsts-extension.artifactory-docker.ArtifactoryDocker@1
  displayName: 'Artifactory Docker Push'
  inputs:
    command: push
    artifactoryService: Jfrog
    targetRepo: 'docker'
    imageName: 'jfrog-base-image-name-${{ parameters.changeName }}'
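If the new name really has to be computed at runtime in PowerShell (rather than expanded from the parameter at template-compile time, as above), a common pattern is to publish the computed value with a `task.setvariable` logging command and read it back with macro syntax in the push task. A minimal sketch, assuming the same parameter name as the question; the variable name `newImageName` is illustrative:

```yaml
- task: PowerShell@2
  displayName: 'Compute image name'
  inputs:
    targetType: 'inline'
    script: |
      $variable = '${{ parameters.changeName }}'
      # Publish the computed name so later tasks in this job can read it
      Write-Host "##vso[task.setvariable variable=newImageName]base-image-name-$variable"

- task: JFrog.jfrog-artifactory-vsts-extension.artifactory-docker.ArtifactoryDocker@1
  displayName: 'Artifactory Docker Push'
  inputs:
    command: push
    artifactoryService: Jfrog
    targetRepo: 'docker'
    imageName: 'jfrog-$(newImageName)'  # $(...) is expanded at runtime
```

The difference from the compile-time version is only when the value is resolved; for a value the user types into the UI, the `${{ parameters.changeName }}` expansion shown earlier is simpler.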

Related

Using an Azure DevOps pipeline and a webhook to trigger it, I cannot get the payload content

I'm trying to get the payload content coming from the webhook that triggers the pipeline when a work item is updated.
I have a PowerShell task to try to read the content; for example, when a work item is updated, I want to get the System.AreaPath that is in the work item and available in the payload.
trigger: none

pool:
  name: SYNCCHR

resources:
  webhooks:
  - webhook: GETPAYLOAD
    connection: GETPAYLOAD_CON

steps:
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      Write-Host "${{ parameters.GETPAYLOAD.resource.workItemId }}"
      Write-Host "${{ parameters.GETPAYLOAD.resource.revision.rev }}"
      Write-Host '${{ parameters.GETPAYLOAD.resource.revision.fields.System.AreaPath }}'
I can't find a way to get the System.AreaPath value. parameters.GETPAYLOAD is a YAML expression over the parameters collection, a built-in that the pipeline populates with the webhook payload.
The fact that the property System.AreaPath contains a dot in its name does not help with the YAML syntax!
The payload contains this kind of JSON:
payload content
I tried many different syntaxes, triggering the pipeline via a work item update, but I could not obtain the content of the properties under the revision or fields nodes.
The goal is to be able to read any property of the payload from the PowerShell task.
I finally found the way to obtain the System.AreaPath data. Because the property name contains dots, you have to use index syntax:
for the workitem.updated event:
"${{ parameters.GETPAYLOAD.resource.revision.fields['System.AreaPath'] }}"
for the workitem.created event:
"${{ parameters.GETPAYLOAD.resource.fields['System.AreaPath'] }}"
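Putting the pieces together, the PowerShell task from the question can be rewritten with the index syntax (a sketch; the webhook and field names match the question, everything else is unchanged):

```yaml
steps:
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      Write-Host "${{ parameters.GETPAYLOAD.resource.workItemId }}"
      Write-Host "${{ parameters.GETPAYLOAD.resource.revision.rev }}"
      # Index syntax works around the dots in the property name
      Write-Host "${{ parameters.GETPAYLOAD.resource.revision.fields['System.AreaPath'] }}"
```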

Azure Pipeline set of tasks for each folder in root DIR of repo

I have a requirement in which, for each top-level folder in the Azure repo, I need to:
print(foldername)
execute an entire set of tasks, or stages of tasks, around pylint and various other things
I am just trying to save the folder names across the whole pipeline, but I'm having trouble retrieving and saving them.
My YAML file:
trigger:
  branches:
    include: [ '*' ]
pool:
  vmImage: ubuntu-latest
stages:
- stage: Gather_Folders
  displayName: "Gather Folders"
  jobs:
  - job: "get_folder_names"
    displayName: "Query Repo for folders"
    steps:
    - bash: echo $MODEL_NAMES
      env:
        MODEL_NAMES: $(ls -d -- */)
Output:
Generating script.
Script contents:
echo $MODEL_NAMES
========================== Starting Command Output ===========================
/usr/bin/bash --noprofile --norc /home/vsts/work/_temp/jflsakjfldskjf.sh
$(ls -d -- */)
Finishing: Bash
I checked, and the variable just contains the literal command itself instead of its output. What am I missing here?
I was hoping to inject the folder names into a pipeline variable and then somehow execute a stage, or set of stages, for each folder in parallel.
For this issue, Krzysztof Madej gave an answer in this ticket. (In your attempt, $(ls -d -- */) is parsed by Azure Pipelines as macro variable syntax, not executed by the shell, so the env value stays literal.) To get the directories in a given folder you can use the following script:
- task: PowerShell@2
  displayName: Get all directories of $(Build.SourcesDirectory) and assign to variable
  inputs:
    targetType: 'inline'
    script: |
      $arr = Get-ChildItem '$(Build.SourcesDirectory)' |
             Where-Object {$_.PSIsContainer} |
             Foreach-Object {$_.Name}
      echo "##vso[task.setvariable variable=arr;]$arr"
- task: PowerShell@2
  displayName: List all directories from variable
  inputs:
    targetType: 'inline'
    script: |
      echo '$(arr)'
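If you prefer to stay in Bash, the same idea works: run the command inside the script body, where the shell actually executes it, then publish the result with a logging command. A sketch, assuming the repository is checked out at the default working directory; the variable name `modelNames` is illustrative:

```yaml
steps:
- bash: |
    # Command substitution runs here, inside the shell, not in the YAML
    MODEL_NAMES=$(ls -d -- */)
    echo "Found folders: $MODEL_NAMES"
    echo "##vso[task.setvariable variable=modelNames]$MODEL_NAMES"
  displayName: Gather top-level folders
- bash: echo "$(modelNames)"
  displayName: List folders from the variable
```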

Azure Pipeline cannot execute command after go get

I want to use rsrc to set an icon for my Windows Go application in my Azure DevOps pipeline.
I think I'm missing something trivial: my pipeline doesn't find the rsrc command after go get -u -v github.com/akavel/rsrc.
My workaround is to keep an rsrc.exe in version control.
Pipeline
- task: Go@0
  displayName: Install rsrc
  condition: eq(variables['agent.os'], 'Windows_NT')
  inputs:
    command: 'get'
    arguments: '-u -v github.com/akavel/rsrc'
    workingDirectory: $(System.DefaultWorkingDirectory)
- task: PowerShell@2
  displayName: Generate syso files
  timeoutInMinutes: 1
  condition: eq(variables['agent.os'], 'Windows_NT')
  inputs:
    targetType: 'inline'
    script: |
      $icon = ([System.IO.Path]::Combine("$(System.DefaultWorkingDirectory)", "build/App.ico"))
      $iconSyso = ([System.IO.Path]::Combine("$(System.DefaultWorkingDirectory)", "cmd/myapp/rsrc.syso"))
      rsrc.exe -ico $icon -o $iconSyso
    workingDirectory: $(System.DefaultWorkingDirectory)
Error
rsrc.exe : The term 'rsrc.exe' is not recognized as the name of a cmdlet, function, script file, or operable program.
Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
Update
go install github.com/akavel/rsrc doesn't help
$env:GOPATH is empty
I have reproduced your error.
The way I solved it:
First, before you run the Go task, set up GOPATH and GOBIN in advance.
Here is an example:
variables:
  GOBIN: '$(GOPATH)\bin' # Go binaries path
  GOPATH: '$(System.DefaultWorkingDirectory)\gopath' # Go workspace path
  modulePath: '$(GOPATH)\src\github.com\$(build.repository.name)' # Path to the module's code
Then add GOBIN to the PATH:
- script: echo '##vso[task.prependpath]$(GOBIN)'
So the complete script should be like this:
variables:
  GOBIN: '$(GOPATH)\bin' # Go binaries path
  GOPATH: '$(System.DefaultWorkingDirectory)\gopath' # Go workspace path
  modulePath: '$(GOPATH)\src\github.com\$(build.repository.name)' # Path to the module's code
steps:
- task: Go@0
  displayName: Install rsrc
  condition: eq(variables['agent.os'], 'Windows_NT')
  inputs:
    command: 'get'
    arguments: '-u -v github.com/akavel/rsrc'
    workingDirectory: $(System.DefaultWorkingDirectory)
- script: echo '##vso[task.prependpath]$(GOBIN)'
- task: PowerShell@2
  displayName: Generate syso files
  timeoutInMinutes: 1
  condition: eq(variables['agent.os'], 'Windows_NT')
  inputs:
    targetType: 'inline'
    script: |
      $icon = ([System.IO.Path]::Combine("$(System.DefaultWorkingDirectory)", "build/App.ico"))
      $iconSyso = ([System.IO.Path]::Combine("$(System.DefaultWorkingDirectory)", "cmd/myapp/rsrc.syso"))
      rsrc.exe -ico $icon -o $iconSyso
    workingDirectory: $(System.DefaultWorkingDirectory)
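Regarding the update ("go install github.com/akavel/rsrc doesn't help"): on Go 1.16 and later, `go get` no longer builds and installs binaries, and `go install` needs an explicit version suffix. An alternative sketch that skips the Go@0 task entirely, assuming Go is already on the agent; the step layout is illustrative:

```yaml
steps:
- powershell: |
    # 'go install pkg@version' places the binary in $(go env GOPATH)\bin
    go install github.com/akavel/rsrc@latest
    $gobin = Join-Path (go env GOPATH) 'bin'
    # Make rsrc.exe visible to subsequent tasks
    Write-Host "##vso[task.prependpath]$gobin"
  displayName: Install rsrc (Go 1.16+)
  condition: eq(variables['agent.os'], 'Windows_NT')
```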

Azure pipelines bash tries to execute the variable instead of expanding

This is really stupid, but it was driving me crazy for a couple of hours. I'm testing how to pass variables between PowerShell and Bash. Relevant code:
steps:
- task: PowerShell@2
  name: 'pwsh_script'
  inputs:
    targetType: 'inline'
    script: |
      $response = "6458ddcd4edd7b7f68bec10338d47b55d221e975"
      echo "latest (hardcoded) commit: $response"
      Write-Host "##vso[task.setvariable variable=LastCommit;isOutput=True]$response"
- task: Bash@3
  name: 'bash_script1'
  inputs:
    targetType: 'inline'
    script: |
      echo $(LastCommit)
And I keep getting errors like:
/d/a/_temp/b40e64e8-8b5f-42d4-8118-82e8cf8a28c2.sh: line 1: LastCommit: command not found
I tried all kinds of quotes: double, single, none. Nothing works.
If you want to use echo $(LastCommit), then you just need to remove isOutput:
Write-Host "##vso[task.setvariable variable=LastCommit]$response"
With isOutput you need to reference the variable via the task name:
steps:
- task: PowerShell@2
  name: 'pwsh_script'
  inputs:
    targetType: 'inline'
    script: |
      $response = "6458ddcd4edd7b7f68bec10338d47b55d221e975"
      echo "latest (hardcoded) commit: $response"
      Write-Host "##vso[task.setvariable variable=LastCommit;isOutput=True]$response"
- task: Bash@3
  name: 'bash_script1'
  inputs:
    targetType: 'inline'
    script: |
      echo $(pwsh_script.LastCommit)
Solution:
- Write-Host "##vso[task.setvariable variable=LastCommit;isOutput=True]$response"
+ Write-Host "##vso[task.setvariable variable=LastCommit]$response"
It turns out that isOutput was breaking it: it means I was creating a multi-job output variable and then trying to use it inside the same job.
From the official documentation:
If you want to make a variable available to future jobs, you must mark it as an output variable by using isOutput=true. Then you can map it into future jobs by using the $[] syntax and including the step name that set the variable. Multi-job output variables only work for jobs in the same stage.
To pass variables to jobs in different stages, use the stage dependencies syntax.
When you create a multi-job output variable, you should assign the expression to a variable.
For example:
myVarFromJobA: $[ dependencies.A.outputs['setvarStep.myOutputVar'] ] # map in the variable
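The documentation snippet above fits into a two-job pipeline like this (a sketch; the job and step names `A`, `B`, and `setvarStep` are illustrative):

```yaml
jobs:
- job: A
  steps:
  - powershell: |
      Write-Host "##vso[task.setvariable variable=myOutputVar;isOutput=true]some value"
    name: setvarStep  # the step name is part of the output variable's address
- job: B
  dependsOn: A
  variables:
    myVarFromJobA: $[ dependencies.A.outputs['setvarStep.myOutputVar'] ]  # map in the variable
  steps:
  - bash: echo "$(myVarFromJobA)"
```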

using for-loop in azure pipeline jobs

I'm going to use a for-loop that scans the files (values-f1.yaml, values-f2.yaml, ...) in a folder, and for each file name, runs a job in the Azure pipeline to deploy the Helm chart based on that values file. The folder is located in the GitHub repository. So I'm thinking of something like this:
pipeline.yaml
stages:
- stage: Deploy
  variables:
    azureResourceGroup: ''
    kubernetesCluster: ''
    subdomain: ''
  jobs:
    ${{ each filename in /myfolder/*.yaml }}:
      valueFile: $filename
      - template: Templates/deploy-helmchart.yaml@pipelinetemplates
deploy-helmchart.yaml
jobs:
- job: Deploy
  pool:
    vmImage: 'ubuntu-latest'
  steps:
  - task: HelmInstaller@1
    displayName: 'Installing Helm'
    inputs:
      helmVersionToInstall: '2.15.1'
    condition: and(succeeded(), startsWith(variables['build.sourceBranch'], 'refs/tags/v'))
  - task: HelmDeploy@0
    displayName: 'Initializing Helm'
    inputs:
      connectionType: 'Azure Resource Manager'
      azureSubscription: $(azureSubscription)
      azureResourceGroup: $(azureResourceGroup)
      kubernetesCluster: $(kubernetesCluster)
      command: 'init'
    condition: and(succeeded(), startsWith(variables['build.sourceBranch'], 'refs/tags/v'))
  - task: PowerShell@2
    displayName: 'Fetching GitTag'
    inputs:
      targetType: 'inline'
      script: |
        # Write your PowerShell commands here.
        Write-Host "Fetching the latest GitTag"
        $gt = git describe --abbrev=0
        Write-Host "##vso[task.setvariable variable=gittag]$gt"
    condition: and(succeeded(), startsWith(variables['build.sourceBranch'], 'refs/tags/v'))
  - task: Bash@3
    displayName: 'Fetching repo-tag'
    inputs:
      targetType: 'inline'
      script: |
        echo GitTag=$(gittag)
        echo BuildID=$(Build.BuildId)
        echo SourceBranchName=$(Build.SourceBranchName)
        echo ClusterName=$(kubernetesCluster)
  - task: HelmDeploy@0
    displayName: 'Upgrading helmchart'
    inputs:
      connectionType: 'Azure Resource Manager'
      azureSubscription: $(azureSubscription)
      azureResourceGroup: $(azureResourceGroup)
      kubernetesCluster: $(kubernetesCluster)
      command: 'upgrade'
      chartType: 'FilePath'
      chartPath: $(chartPath)
      install: true
      releaseName: $(releaseName)
      valueFile: $(valueFile)
      arguments: '--set image.tag=$(gittag) --set subdomain=$(subdomain)'
    condition: and(succeeded(), startsWith(variables['build.sourceBranch'], 'refs/tags/v'))
Another thing: do the jobs get access to the GitHub repo by default, or do I need to do something at the job level?
Besides, how can I use a for-loop in the job for this case?
Any help would be appreciated.
Updated after getting comments from @Leo
Here is a PowerShell task that I added to deploy-helmchart.yaml for fetching the files from a folder in GitHub.
- task: PowerShell@2
  displayName: 'Fetching Files'
  inputs:
    targetType: 'inline'
    script: |
      Write-Host "Fetching values files"
      cd myfolder
      $a = git ls-files
      foreach ($i in $a) {
        Write-Host "##vso[task.setvariable variable=filename]$i"
        Write-Host "printing" $i
      }
Now the question is: how can I actually run the HelmDeploy@0 task for each file, using parameters?
if the jobs can get access to the GitHub repo by default or do I need to do something in the job level?
The answer is yes.
We could add a command line task in the job, like job1, to clone the GitHub repository with a GitHub PAT; then we can access those files (values-f1.yaml, values-f2.yaml, ...) in $(Build.SourcesDirectory):
git clone https://<GithubPAT>@github.com/XXXXX/TestProject.git
Besides how can I use for-loop in the job for this case?
You could create a template which holds a set of actions, and pass parameters across during your build, like:
deploy-helmchart.yaml:
parameters:
  param: []
steps:
- ${{ each filename in parameters.param }}:
  - script: 'echo ${{ filename }}'
pipeline.yaml:
steps:
- template: deploy-helmchart.yaml
  parameters:
    param: ["filename1","filename2","filename3"]
Check the document Solving the looping problem in Azure DevOps Pipelines for some more details.
A command-line script to get the latest file name in the folder:
FOR /F "delims=|" %%I IN ('DIR "$(Build.SourcesDirectory)\*.txt*" /B /O:D') DO SET NewestFile=%%I
echo "##vso[task.setvariable variable=NewFileName]%NewestFile%"
Update:
Now the question is: how can I actually run the HelmDeploy@0 task for each file, using parameters?
It depends on whether your HelmDeploy task has an option to accept the filename parameter.
As I said before, we could use the following YAML to invoke the template with parameters:
- template: deploy-helmchart.yaml
  parameters:
    param: ["filename1","filename2","filename3"]
But if the HelmDeploy task has no option to accept the parameter, we cannot run the HelmDeploy@0 task for each file using parameters.
I then checked HelmDeploy@0 and found there is only one option that can accept Helm command parameters:
So, the answer to this question depends on whether your file name can be used in a Helm command; if not, you cannot run HelmDeploy@0 for each file using parameters. If yes, you can.
Please check the official document Templates for some more details.
Hope this helps.
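Combining the template approach with the HelmDeploy task, one way to run the deploy step once per values file could look like this (a sketch; the file list is hard-coded because ${{ each }} is evaluated at compile time, before any task runs, and the parameter name valueFiles is illustrative):

```yaml
# deploy-helmchart.yaml
parameters:
  valueFiles: []
steps:
- ${{ each valueFile in parameters.valueFiles }}:
  - task: HelmDeploy@0
    displayName: 'Upgrading helmchart with ${{ valueFile }}'
    inputs:
      connectionType: 'Azure Resource Manager'
      azureSubscription: $(azureSubscription)
      azureResourceGroup: $(azureResourceGroup)
      kubernetesCluster: $(kubernetesCluster)
      command: 'upgrade'
      chartType: 'FilePath'
      chartPath: $(chartPath)
      install: true
      releaseName: $(releaseName)
      valueFile: ${{ valueFile }}  # one task instance per file

# pipeline.yaml
steps:
- template: deploy-helmchart.yaml
  parameters:
    valueFiles: ["myfolder/values-f1.yaml", "myfolder/values-f2.yaml"]
```

Because the loop is expanded at compile time, a file list discovered at runtime (e.g. by the git ls-files task above) cannot feed it directly; the list has to be known when the pipeline is compiled.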
