How to convert bash array to ADO pipeline variable array - bash

I have an array in bash that looks like:
libs=(test1 test2)
I would like to use the output of the bash script in a subsequent step in an ADO pipeline. How can I loop over this in ADO with pipeline variables like:
- ${{ each value in $(libs) }}:
  - script: echo $value
  - task: Npm@1
    inputs:
      command: 'custom'
      customCommand: npm publish --ignore-scripts
      workingDir: 'dist/libs/$(value)'
      publishRegistry: 'useFeed'
      publishFeed: 'feed'

Unfortunately, you cannot: the each statement works only with parameters, not with variables (per the documentation).
Moreover, variables are always strings, while parameters support other data types.
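If the list of libraries can be known up front, a minimal sketch of the parameter-based alternative would look like this (parameter name and default values here are illustrative):

parameters:
  - name: libs
    type: object
    default: [test1, test2]

steps:
  - ${{ each value in parameters.libs }}:
    - script: echo ${{ value }}

Since parameters are expanded at template compile time, this only works when the list does not depend on the output of a script step at runtime.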

Related

GitLab CI - Trigger the job based on pattern match in the rules from dotenv variables of an other job

I have two jobs in the GitLab CI file.
The first job, env_var_test, generates the dotenv variables from this command:
echo '{"apple":"red","boy":"bar","cat":"white"}' | jq -r 'to_entries|map("\(.key)=\(.value|tostring)")|.[]'
The second job, env_var_retrive_test, looks for a variable from env_var_test's dotenv variables, and if the variable matches the predefined value in the CI/CD rules, it triggers:
env_var_test:
  stage: build
  image: $CFIMAGE
  script:
    - echo '{"apple":"red","boy":"bar","cat":"white"}' | jq -r 'to_entries|map("\(.key)=\(.value|tostring)")|.[]' > deploy.env
  artifacts:
    reports:
      dotenv: deploy.env
  tags:
    - linux

env_var_retrive_test:
  stage: deploy
  image: $CFIMAGE
  script:
    - echo "[ $apple - $boy - $cat ]"
  tags:
    - linux
  rules:
    - if: '$boy == "bar"'
      when: always
With this setup, I tested them and could see the variables printing correctly from echo "[ $apple - $boy - $cat ]". However, the job was not triggered when I referenced the variables in the rules section:
rules:
  - if: '$boy == "bar"'
    when: always
Please correct me if I'm doing it wrong, or help me with a better approach to do the same. Thanks.
https://docs.gitlab.com/ee/ci/yaml/#rules
You cannot use dotenv variables created in job scripts in rules, because rules are evaluated before any jobs run.
Feature request: https://gitlab.com/gitlab-org/gitlab/-/issues/235812. Please vote for it.
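Until that lands, one possible workaround (a sketch only, not from the documentation) is to move the comparison into the downstream job's script, since script lines run after the dotenv artifact is available:

env_var_retrive_test:
  stage: deploy
  image: $CFIMAGE
  script:
    - |
      # dotenv variables from env_var_test are available here, unlike in rules
      if [ "$boy" != "bar" ]; then
        echo "boy is '$boy'; skipping deploy steps"
        exit 0
      fi
    - echo "[ $apple - $boy - $cat ]"
  tags:
    - linux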
When comparing a variable to some value, you should not enclose the expression in quotes. Use - if: $boy == "bar" instead.

Gitlab CI: save command in variable

I need to run a big command in several jobs and save the results in dynamically created variables.
My idea: save the command as a variable and evaluate it in the script sections of all jobs.
For example:
.grep_command: &grep_command
  GREP_COMMAND: dotnet ef migrations list | grep "VERY_LONG_PATTERN_HERE"

job1:
  variables:
    <<: *grep_command
  script:
    # some job specific code
    - echo $GREP_COMMAND
    - VAR=$(${GREP_COMMAND}) # doesn't work

job2:
  variables:
    <<: *grep_command
  script:
    # some job specific code
    - echo $GREP_COMMAND
    - echo "VAR=$(${GREP_COMMAND})" > build.env # also doesn't work
I found the right way:
define the command as a script line and use it in the script section, not in variables:
.grep-command: &grep-command
  - dotnet ef migrations list | grep "VERY_LONG_PATTERN_HERE"

job1:
  script:
    # some job specific code
    - *grep-command
(By the way, saving the command as a variable also works if you use it carefully, but I find it less clear: variables should stay variables, and commands should stay commands. Mixing them is bad practice.)
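If the original goal of saving the result in a variable for later jobs still matters, the anchor approach can be combined with a dotenv artifact. A sketch under that assumption (variable and file names are illustrative; script lines run in one shell, so the shell variable persists across lines):

.grep-command: &grep-command
  - MIGRATION_LINE=$(dotnet ef migrations list | grep "VERY_LONG_PATTERN_HERE")

job1:
  script:
    - *grep-command
    - echo "MIGRATION_LINE=$MIGRATION_LINE" > build.env
  artifacts:
    reports:
      dotenv: build.env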

How can I get a list of directory names to pass in as an array to an Azure Pipeline template?

I'm trying to get a list of directories and then pass that array to an azure pipeline template. Here's the code for getting the array names:
- script: |
    cd $(Build.StagingDirectory)
    directoryList=( $(ls -d */ | sed 's#/##') ) # sed gets rid of the trailing slash
    echo $directoryList
    echo "##vso[task.setvariable variable=directoryList]$directoryList"
# Archive all directories in staging directory created by sam build
- template: zip-template.yml
  parameters:
    directories:
      - $(directoryList)
One thing I noticed is that echo $(directoryList) only outputs the first entry. When I try to pass this to the template it only performs one iteration, which makes me think I'm not creating the array correctly. Here's the Azure pipeline template that should receive the list of directory names:
parameters:
  - name: "directories"
    type: object
    default: {}

steps:
  - ${{ each dir in parameters.directories }}:
    - task: ArchiveFiles@2
      displayName: "Zip ${{ dir }}"
      inputs:
        rootFolderOrFile: '$(Build.StagingDirectory)/${{ dir }}'
        includeRootFolder: false
        archiveType: 'zip'
        archiveFile: '$(Build.StagingDirectory)/${{ dir }}/package.zip'
        replaceExistingArchive: true
Another odd outcome I observed is that the displayName outputs "Zip ($directoryList)" when I run the pipeline. What can I do to get the list of directories and pass it to the template to iterate through and run the zip task multiple times?
There are a number of things going on here, and ultimately I don't believe this is possible the way you're doing it.
The immediate cause of your problem with directoryList is that the syntax to access all elements of a Bash array is not $arrayName, but ${arrayName[@]}. $arrayName is equivalent to accessing element 0 (${arrayName[0]}).
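A quick illustration of the difference, as a throwaway script step (values made up):

- script: |
    directoryList=(alpha beta gamma)
    echo $directoryList           # prints: alpha (element 0 only)
    echo "${directoryList[@]}"    # prints: alpha beta gamma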
BTW, parsing ls is a bad practice. A better way here would be:
IFS=$'\n' directoryList=($(find . -mindepth 1 -maxdepth 1 -type d -printf '%f\n'))
This prevents all manner of issues that could occur when your directories have weird names (if any contain newlines, this could still get messed up, but in this context, if that happens you probably have bigger problems).
I do not know if a VSO log command (the last line of your script) will split up a variable containing spaces or newlines into an array that you can loop over -- I'm not too familiar with the for-each template syntax. You might want to check this, though, because the log command just writes a string of text onto standard output; it's not somehow passing the array itself back to Azure DevOps.
Another odd outcome I observed is that the displayName outputs "Zip ($directoryList)" when I run the pipeline. What can I do to get the list of directories and pass that to the template to iterate through and run the zip task multiple times?
${{ }} expressions are evaluated at template compile time, prior to when your script is running to populate the directoryList variable on the pipeline using the VSO log command. You have to use $[ ] expressions if you want to evaluate them at runtime. Unfortunately, I do not believe there is a way to use $[ ] expressions to loop over a step:
Compile time expressions can be used anywhere; runtime expressions can be used in variables and conditions.
I'm thinking your best bet is to do the archive-files step within a Bash script, perhaps looping over that array you created earlier. Or just hard-code the list of directories into your pipeline or put them in a variable in the pipeline YAML that you have to update when the list of directories changes. The best approach may depend on how often that list of directories changes.
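A sketch of that single-script approach, assuming the zip CLI is available on the agent (names are illustrative):

- script: |
    cd "$(Build.StagingDirectory)"
    IFS=$'\n' directoryList=($(find . -mindepth 1 -maxdepth 1 -type d -printf '%f\n'))
    for dir in "${directoryList[@]}"; do
      # create package.zip inside each directory, mirroring the template's layout
      (cd "$dir" && zip -r package.zip .)
    done
  displayName: Zip all staging directories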
I am afraid that defining an array as a variable is not supported; the variable syntax is variables: { string: string }.
Here is a ticket about the variable syntax that you can refer to.
Parameters do support passing an array.
From your requirement, you need a command to get the directory list, but YAML parameters cannot be set from a command at runtime.
In summary, we cannot pass the array to the template via a variable to run the archive-file task multiple times.
Workaround:
You could use Powershell script to get the directory list and archive them as ZIP.
Here is the Pipeline example(The tool is 7-zip):
steps:
- powershell: |
    $arr = Get-ChildItem '$(Build.SourcesDirectory)' |
      Where-Object {$_.PSIsContainer} |
      Foreach-Object {$_.Name}
    echo "##vso[task.setvariable variable=arr;]$arr"
  displayName: 'PowerShell Script'
- powershell: |
    # define the 7-Zip helper once, outside the loop
    function create-7zip([String] $aDirectory, [String] $aZipfile){
      [string]$pathToZipExe = "$($Env:ProgramFiles)\7-Zip\7z.exe";
      [Array]$arguments = "a", "-tzip", "$aZipfile", "$aDirectory", "-r";
      & $pathToZipExe $arguments;
    }
    $string = "$(arr)"
    $Data = $string.split(" ")
    foreach($item in $Data){
      # zip each folder into a package.zip inside that folder
      create-7zip "$(Build.SourcesDirectory)/$item" "$(Build.SourcesDirectory)/$item/package.zip"
    }
  displayName: 'PowerShell Script'
Explanation:
The first PowerShell task gets the folder list in the target folder (the source directory) and saves it to the variable as a string.
The second PowerShell task reads the variable and splits it.
Finally, it loops over the result and compresses each folder into a zip through the script.

Unable to print string containing double quotes in GitLab CI YAML

I'm using the CI Lint tester to try and figure out how to store an expected JSON result, which I later compare to a curl response. Neither of these work:
Attempt 1
---
image: ruby:2.1
script:
  - EXPECT_SERVER_OUTPUT='{"message": "Hello World"}'
Fails with:
did not find expected key while parsing a block mapping at line 4 column 5
Attempt 2
---
image: ruby:2.1
script:
  - EXPECT_SERVER_OUTPUT="{\"message\": \"Hello World\"}"
Fails with:
jobs:script config should be a hash
I've tried using various combinations of echo as well, without a working solution.
You could use literal block scalar [1] style notation and put the variable definition and subsequent script lines on separate lines [2] without worrying about quoting:
myjob:
script:
- |
EXPECT_SERVER_OUTPUT='{"message": "Hello World"}'
or you can escape the nested double quotes:
myjob:
  script:
    - "EXPECT_SERVER_OUTPUT='{\"message\": \"Hello World\"}'"
but you may also want to just use variables like:
myjob:
  variables:
    EXPECT_SERVER_OUTPUT: '{"message": "Hello World"}'
  script:
    - dothething.sh
Note: variables are by default expanded inside variable definitions, so take care with any $ characters inside the variable value (they must be written as $$ to be literal). This feature can also be turned off.
[1] See this answer for an explanation of this and related notation
[2] See this section of the GitLab docs for more info on multi-line commands
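A tiny sketch of the $$ escaping mentioned in the note above (variable name and value are made up):

myjob:
  variables:
    PRICE_NOTE: 'costs $$5'   # the job sees: costs $5
  script:
    - echo "$PRICE_NOTE"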
I made it work like this:

script: |
  EXPECT_SERVER_OUTPUT='{"message": "Hello World"}'
  echo $EXPECT_SERVER_OUTPUT

(Inside a literal block scalar no YAML escaping is needed; wrapping the assignment in double quotes with backslash escapes would make the shell treat the whole line as a single command name.)
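Putting this together with the original goal, a hedged sketch of comparing the stored JSON against a curl response (job name and URL are illustrative):

myjob:
  variables:
    EXPECT_SERVER_OUTPUT: '{"message": "Hello World"}'
  script:
    - ACTUAL=$(curl -s http://localhost:8080/hello)
    - |
      if [ "$ACTUAL" = "$EXPECT_SERVER_OUTPUT" ]; then
        echo "server output matches"
      else
        echo "server output mismatch: $ACTUAL"
        exit 1
      fi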

Running script with arguments through ansible

[Ansible version == 2.1.0]
In order to run a script which is present locally on the target server, we can use Ansible's "command" module. The following can be done easily:

- name: Executing getpkgs.sh to download packages.
  command: sh "/path/to/dir/scriptName.sh" arg1 arg2 arg3 arg4
I have my script names and the arguments stored in ansible variables. For example, the following variable contains all the script names and the arguments to be passed to those scripts:
scripts_to_execute:
  - { filename: "/path/to/file/file1.sh", args: "arg11 arg12 arg13" }
  - { filename: "/path/to/file/file2.sh", args: "arg21 arg22" }
  - { filename: "/path/to/file/file3.sh", args: "arg31 arg32 arg33 arg34" }
And I want all these files, which are already present on the target server, to be executed using with_items. I'm trying to achieve something like the following:
- name: Executing all files.
  command: sh "{{ item.filename }}" "{{ item.args }}"
  with_items: scripts_to_execute
I am trying to pass the script name followed by a string containing all the arguments to be passed to the script. But it treats that string of arguments as a single argument.
But it is considering that string of arguments as a single argument.
I think that makes sense, since you pass the args in quotes. Did you try without quotes?
- name: Executing all files.
  command: sh "{{ item.filename }}" {{ item.args }}
  with_items: scripts_to_execute
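If relying on shell word-splitting feels fragile, a hedged alternative sketch is to store the args as a YAML list and join them in the task (the join filter is standard Jinja2):

scripts_to_execute:
  - { filename: "/path/to/file/file1.sh", args: ["arg11", "arg12", "arg13"] }

- name: Executing all files.
  command: sh "{{ item.filename }}" {{ item.args | join(' ') }}
  with_items: scripts_to_execute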
