I have a script that works fine locally. When running it, I pass 2 arguments and it works perfectly. Here's how I run the bash script locally:
./changeDB_connection.sh "fdevtestcuskv04" "test"
But I want to run it through an Azure DevOps pipeline, so I have a pipeline task that calls the bash script with script arguments, and it fails with this error message: ##[error]Bash exited with code '1'
Here's the pipeline task:
- task: Bash@3
  displayName: 'Update Mongo Connection String'
  condition: and(succeeded(), eq('${{ parameters.UpdateDBstr }}', 'true'))
  inputs:
    azureSubscription: $(azureSubscription)
    workingDirectory: '$(System.DefaultWorkingDirectory)/Terraform/templates'
    targetType: 'filePath'
    failOnStderr: true
    filePath: "$(System.DefaultWorkingDirectory)/Terraform/Terraform-Scripts/changeDB_connection.sh"
    ScriptArguments: '-keyVaultName $(kvname) -Stringintials $(strinitial)'
Let me know what I am doing wrong.
Below is a sample that works fine on my side.
xxx.sh
#pass arguments to the script
echo "Argument 1 is $1"
echo "Argument 2 is $2"
echo "Argument 3 is $3"
RunBashScript.yml
trigger:
- none

pool:
  vmImage: ubuntu-latest

steps:
- task: Bash@3
  inputs:
    filePath: 'xxx.sh'
    arguments: '1 2 3'
It runs successfully on my side (screenshots of the successful run and the file structure omitted).
For a Bash task with multiple arguments you can use YAML multiline syntax, e.g.:
- task: Bash@3
  inputs:
    targetType: 'filePath'
    filePath: "$(System.DefaultWorkingDirectory)/Terraform/Terraform-Scripts/changeDB_connection.sh"
    arguments: >
      -keyVaultName $(kvname)
      -Stringintials $(strinitial)
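Note that locally you invoke the script with positional arguments, while the pipeline snippet passes -keyVaultName / -Stringintials flags; whichever style you keep, the script has to match it. If you go with the flag style, a minimal, hypothetical sketch of parsing it in bash (variable names are illustrative):

#!/usr/bin/env bash
# Illustrative flag parsing for changeDB_connection.sh (variable names are assumptions).
while [ $# -gt 0 ]; do
  case "$1" in
    -keyVaultName)  keyVaultName="$2";   shift 2 ;;
    -Stringintials) stringInitials="$2"; shift 2 ;;
    *) echo "Unknown argument: $1" >&2; exit 1 ;;
  esac
done

echo "Key vault: $keyVaultName"
echo "Initials:  $stringInitials"

If you keep the positional style instead, simply pass arguments: '$(kvname) $(strinitial)' and keep reading $1 and $2.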
Bowman's answer regarding passing arguments to the script works, but can become troublesome if many arguments need to be passed.
I would instead recommend passing arguments as environment variables to the script and then consuming the same variables in the script. This would require some rewriting of your script.
The documentation for the bash task specifies how to pass environment variables to the script-execution. In short, just add them under env in your task definition.
Example:
steps:
- task: Bash@3
  inputs:
    targetType: 'filePath'
    filePath: 'my/file/path'
  env:
    MYFIRSTVARIABLE: 'some text'
    MYSECONDVARIABLE: $(aPipelineVariable)
Consume the environment variables in the bash script by referencing $MYFIRSTVARIABLE and $MYSECONDVARIABLE respectively. If the variables are secrets, you should keep them in variable groups which the pipeline consumes.
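On the script side, a minimal sketch of consuming those variables (assuming the script at my/file/path is the one invoked by the task above):

#!/usr/bin/env bash
# Reads the values injected through the task's env block.
echo "First variable:  $MYFIRSTVARIABLE"
echo "Second variable: $MYSECONDVARIABLE"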
Related
I have a requirement in which
for each top level folder in the azure repo:
print(foldername)
execute an entire set of tasks or stages of tasks around pylint and other various stuff
I am just trying to save the folder names across the whole pipeline but am having issues retrieving and saving them...
My YAML file:
trigger:
  branches:
    include: [ '*' ]

pool:
  vmImage: ubuntu-latest

stages:
  - stage: Gather_Folders
    displayName: "Gather Folders"
    jobs:
      - job: "get_folder_names"
        displayName: "Query Repo for folders"
        steps:
          - bash: echo $MODEL_NAMES
            env:
              MODEL_NAMES: $(ls -d -- */)
Output:
Generating script.
Script contents:
echo $MODEL_NAMES
========================== Starting Command Output ===========================
/usr/bin/bash --noprofile --norc /home/vsts/work/_temp/jflsakjfldskjf.sh
$(ls -d -- */)
Finishing: Bash
I checked, and the variable just contains the literal command itself instead of its output. What am I missing here?
I was hoping to inject the folder names into a pipeline variable and then somehow execute, for each folder, a stage or set of stages in parallel.
For this issue, Krzysztof Madej gave an answer in this ticket. To get directories in a given folder you can use the following script:
- task: PowerShell@2
  displayName: Get all directories of $(Build.SourcesDirectory) and assign to variable
  inputs:
    targetType: 'inline'
    script: |
      $arr = Get-ChildItem '$(Build.SourcesDirectory)' |
             Where-Object {$_.PSIsContainer} |
             Foreach-Object {$_.Name}
      echo "##vso[task.setvariable variable=arr;]$arr"

- task: PowerShell@2
  displayName: List all directories from variable
  inputs:
    targetType: 'inline'
    script: |
      echo '$(arr)'
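If you prefer to stay in bash, the same idea works: run the command inside the script body, where it is actually executed, and publish the result with a logging command. A minimal sketch (step names are arbitrary; the newline-separated listing is flattened to one line so it survives the variable hand-off):

- bash: |
    # the command substitution runs here, inside the step, not in the env mapping
    MODEL_NAMES=$(ls -d -- */ | tr '\n' ' ')
    echo "##vso[task.setvariable variable=MODEL_NAMES]$MODEL_NAMES"
  name: get_folders
- bash: |
    echo "$(MODEL_NAMES)"
  name: show_folders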
This is really stupid but was driving me crazy for a couple of hours. I'm testing how to pass variables between Powershell and Bash. Relevant code:
steps:
- task: PowerShell@2
  name: 'pwsh_script'
  inputs:
    targetType: 'inline'
    script: |
      $response = "6458ddcd4edd7b7f68bec10338d47b55d221e975"
      echo "latest (hardcoded) commit: $response"
      Write-Host "##vso[task.setvariable variable=LastCommit;isOutput=True]$response"
- task: Bash@3
  name: 'bash_script1'
  inputs:
    targetType: 'inline'
    script: |
      echo $(LastCommit)
And I keep getting errors like:
/d/a/_temp/b40e64e8-8b5f-42d4-8118-82e8cf8a28c2.sh: line 1: LastCommit: command not found
I tried all kinds of quotes: double, single, none. Nothing works.
If you want to use echo $(LastCommit), then you just need to remove isOutput:
Write-Host "##vso[task.setvariable variable=LastCommit]$response"
And with isOutput you need to reference it via the step name:
steps:
- task: PowerShell@2
  name: 'pwsh_script'
  inputs:
    targetType: 'inline'
    script: |
      $response = "6458ddcd4edd7b7f68bec10338d47b55d221e975"
      echo "latest (hardcoded) commit: $response"
      Write-Host "##vso[task.setvariable variable=LastCommit;isOutput=True]$response"
- task: Bash@3
  name: 'bash_script1'
  inputs:
    targetType: 'inline'
    script: |
      echo $(pwsh_script.LastCommit)
Solution:
- Write-Host "##vso[task.setvariable variable=LastCommit;isOutput=True]$response"
+ Write-Host "##vso[task.setvariable variable=LastCommit;]$response"
Turns out that the isOutput was breaking it, as it meant I was creating a multi-job output variable and trying to use it inside the same job.
From the official documentation:
If you want to make a variable available to future jobs, you must mark it as an output variable by using isOutput=true. Then you can map it into future jobs by using the $[] syntax and including the step name that set the variable. Multi-job output variables only work for jobs in the same stage.
To pass variables to jobs in different stages, use the stage dependencies syntax.
When you create a multi-job output variable, you should assign the expression to a variable.
For example:
myVarFromJobA: $[ dependencies.A.outputs['setvarStep.myOutputVar'] ] # map in the variable
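For completeness, a minimal sketch of what that mapping looks like in context (job and step names follow the documentation snippet above):

jobs:
  - job: A
    steps:
      - powershell: |
          Write-Host "##vso[task.setvariable variable=myOutputVar;isOutput=true]some value"
        name: setvarStep
  - job: B
    dependsOn: A
    variables:
      myVarFromJobA: $[ dependencies.A.outputs['setvarStep.myOutputVar'] ]  # map in the variable
    steps:
      - bash: echo "$(myVarFromJobA)"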
Using GitHub Actions, I would like to invoke a shell script with a list of directories.
(Essentially equivalent to passing an Ansible vars list to the shell script)
I don't really know how; is this even possible? Here's what I have so far; how could one improve it?
name: CI

on:
  push:
    branches:
      - master
    tags:
      - v*
  pull_request:

jobs:
  run-script:
    runs-on: ubuntu-20.04
    steps:
      - uses: actions/checkout@v2
      - name: Run script on targets
        run: ./.github/workflows/script.sh {{ targets }}
        env:
          targets:
            - FolderA/SubfolderA/
            - FolderB/SubfolderB/
Today I was able to do this with the following YAML (truncated):
...
with:
  targets: |
    FolderA/SubfolderA
    FolderB/SubfolderB
The actual GitHub Action passes this as an argument like the following:
runs:
  using: docker
  image: Dockerfile
  args:
    - "${{ inputs.targets }}"
What this does is simply send the parameters as a single string with the newline characters embedded, which can then be iterated over, similar to an array, in a POSIX-compliant manner via the following shell code:
#!/bin/sh -l
targets="${1}"
for target in $targets
do
echo "Proof that this code works: $target"
done
This should be capable of accomplishing your desired task, if I understand the question correctly. You can always run something like sh ./script.sh $target inside the loop if your use case requires it.
I have plenty of bash scripts with various variables that are piped into various scripts.
I've been wondering whether I can take the output of a bash script run by an Azure Pipeline and turn it into a pipeline variable for the rest of the pipeline run.
Example:
foo=$(date +%Y%m%d_%H%M%S) outputs 20200219_143400, and I'd like to use that output later in the pipeline.
Depending on how you design your pipeline, you can use Azure Pipelines variables:
Inside the same Job:
- job: Job1
  steps:
    - bash: |
        # source the script so that the $foo it sets is visible in this step's shell
        source $WORKDIR/foo.sh
        echo "##vso[task.setvariable variable=foo]$foo"
      name: FooStep
    - bash: |
        # without isOutput, later steps in the same job reference the variable as $(foo)
        $WORKDIR/nextscript.sh $(foo)
      name: NextScript
    # ...
Different jobs:
- job: Job1
  steps:
    - bash: |
        # source the script so that the $foo it sets is visible in this step's shell
        source $WORKDIR/foo.sh
        echo "##vso[task.setvariable variable=foo;isOutput=true]$foo"
      name: FooStep

- job: Job2
  dependsOn: Job1
  variables:
    # the runtime expression cannot be used inline in the script; map it into a
    # job variable first (the variable name is arbitrary)
    fooFromJob1: $[ dependencies.Job1.outputs['FooStep.foo'] ]
  steps:
    - bash: |
        $WORKDIR/job2script.sh "$(fooFromJob1)"
      name: Job2ScriptStep
    # ...
So you need to "print to the pipeline console" with ##vso[task.setvariable] every variable you want to expose, and then pass those variables on as script argument values.
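Applied to your date example, a minimal sketch (step and variable names are arbitrary):

- bash: |
    # capture the command output in a shell variable, then publish it as a pipeline variable
    foo=$(date +%Y%m%d_%H%M%S)
    echo "##vso[task.setvariable variable=foo]$foo"
  name: SetTimestamp
- bash: |
    # later steps in the same job read it back with macro syntax
    echo "Timestamp from the previous step: $(foo)"
  name: UseTimestamp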
I'm going to use a for-loop that scans the files (value-f1.yaml, values-f2.yaml, ...) in a folder and, for each one, uses the filename as a variable and runs an Azure pipeline job to deploy the Helm chart based on that values file. The folder is located in the GitHub repository. So I'm thinking of something like this:
pipeline.yaml
stages:
  - stage: Deploy
    variables:
      azureResourceGroup: ''
      kubernetesCluster: ''
      subdomain: ''
    jobs:
      ${{ each filename in /myfolder/*.yaml }}:
        valueFile: $filename
        - template: Templates/deploy-helmchart.yaml@pipelinetemplates
deploy-helmchart.yaml
jobs:
  - job: Deploy
    pool:
      vmImage: 'ubuntu-latest'
    steps:
      - task: HelmInstaller@1
        displayName: 'Installing Helm'
        inputs:
          helmVersionToInstall: '2.15.1'
        condition: and(succeeded(), startsWith(variables['build.sourceBranch'], 'refs/tags/v'))
      - task: HelmDeploy@0
        displayName: 'Initializing Helm'
        inputs:
          connectionType: 'Azure Resource Manager'
          azureSubscription: $(azureSubscription)
          azureResourceGroup: $(azureResourceGroup)
          kubernetesCluster: $(kubernetesCluster)
          command: 'init'
        condition: and(succeeded(), startsWith(variables['build.sourceBranch'], 'refs/tags/v'))
      - task: PowerShell@2
        displayName: 'Fetching GitTag'
        inputs:
          targetType: 'inline'
          script: |
            # Write your PowerShell commands here.
            Write-Host "Fetching the latest GitTag"
            $gt = git describe --abbrev=0
            Write-Host "##vso[task.setvariable variable=gittag]$gt"
        condition: and(succeeded(), startsWith(variables['build.sourceBranch'], 'refs/tags/v'))
      - task: Bash@3
        displayName: 'Fetching repo-tag'
        inputs:
          targetType: 'inline'
          script: |
            echo GitTag=$(gittag)
            echo BuildID=$(Build.BuildId)
            echo SourceBranchName=$(Build.SourceBranchName)
            echo ClusterName=$(kubernetesCluster)
      - task: HelmDeploy@0
        displayName: 'Upgrading helmchart'
        inputs:
          connectionType: 'Azure Resource Manager'
          azureSubscription: $(azureSubscription)
          azureResourceGroup: $(azureResourceGroup)
          kubernetesCluster: $(kubernetesCluster)
          command: 'upgrade'
          chartType: 'FilePath'
          chartPath: $(chartPath)
          install: true
          releaseName: $(releaseName)
          valueFile: $(valueFile)
          arguments: '--set image.tag=$(gittag) --set subdomain=$(subdomain)'
        condition: and(succeeded(), startsWith(variables['build.sourceBranch'], 'refs/tags/v'))
Another thing: do the jobs get access to the GitHub repo by default, or do I need to do something at the job level?
Besides, how can I use a for-loop in the job for this case?
Any help would be appreciated.
Updated after getting comments from @Leo
Here is a PowerShell task that I added to deploy-helmchart.yaml for fetching the files from a folder in GitHub.
- task: PowerShell@2
  displayName: 'Fetching Files'
  inputs:
    targetType: 'inline'
    script: |
      Write-Host "Fetching values files"
      cd myfolder
      $a = git ls-files
      foreach ($i in $a) {
        Write-Host "##vso[task.setvariable variable=filename]$i"
        Write-Host "printing" $i
      }
Now the question is: how can I run the HelmDeploy@0 task for each file using parameters?
Do the jobs get access to the GitHub repo by default, or do I need to do something at the job level?
The answer is yes.
We could add a command-line task in the job (e.g. job1) to clone the GitHub repository using a GitHub PAT; then we could access those files (value-f1.yaml, values-f2.yaml, ...) in $(Build.SourcesDirectory):
git clone https://<GithubPAT>@github.com/XXXXX/TestProject.git
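Wrapped in a pipeline step, that could look roughly like this (the PAT placeholder and repository URL are taken from the command above):

- script: |
    git clone https://<GithubPAT>@github.com/XXXXX/TestProject.git
  displayName: 'Clone the GitHub repository'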
Besides, how can I use a for-loop in the job for this case?
You could create a template which will have a set of actions, and pass parameters across during your build, like:
deploy-helmchart.yaml:
parameters:
  param: []

steps:
  - ${{ each filename in parameters.param }}:
      - script: 'echo ${{ filename }}'
pipeline.yaml:
steps:
  - template: deploy-helmchart.yaml
    parameters:
      param: ["filename1", "filename2", "filename3"]
Check the document Solving the looping problem in Azure DevOps Pipelines for some more details.
Command line to get the latest file name in the folder:
FOR /F "delims=|" %%I IN ('DIR "$(Build.SourcesDirectory)\*.txt*" /B /O:D') DO SET NewestFile=%%I
echo "##vso[task.setvariable variable=NewFileName]%NewestFile%"
Update:
Now the question is: how can I run the HelmDeploy@0 task for each file using parameters?
It depends on whether your HelmDeploy task has an option that accepts the filename parameter.
As I said before, we could use the following YAML to invoke the template YAML with parameters:
- template: deploy-helmchart.yaml
  parameters:
    param: ["filename1", "filename2", "filename3"]
But if the HelmDeploy task had no option to accept parameters, we could not run the HelmDeploy@0 task for each file using parameters.
Then I checked HelmDeploy@0 and found there is only one option, the arguments input, that can accept Helm command parameters.
So, the answer to this question depends on whether your file name can be passed as a Helm command argument; if not, you cannot run the HelmDeploy@0 task for each file using parameters. If yes, you can do it.
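If your chart does accept the values file through the command arguments, the template loop could invoke the task per file roughly like this (a sketch only, reusing the parameter plumbing from the template example above):

parameters:
  param: []

steps:
  - ${{ each filename in parameters.param }}:
      - task: HelmDeploy@0
        displayName: 'Upgrading helmchart with ${{ filename }}'
        inputs:
          connectionType: 'Azure Resource Manager'
          azureSubscription: $(azureSubscription)
          azureResourceGroup: $(azureResourceGroup)
          kubernetesCluster: $(kubernetesCluster)
          command: 'upgrade'
          chartType: 'FilePath'
          chartPath: $(chartPath)
          releaseName: $(releaseName)
          arguments: '-f ${{ filename }}'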
Please check the official document Templates for some more details.
Hope this helps.