I'm using the CI Lint tester to try to figure out how to store an expected JSON result, which I later compare to a curl response. Neither of these works:
Attempt 1
---
image: ruby:2.1
script:
  - EXPECT_SERVER_OUTPUT='{"message": "Hello World"}'
Fails with:
did not find expected key while parsing a block mapping at line 4 column 5
Attempt 2
---
image: ruby:2.1
script:
  - EXPECT_SERVER_OUTPUT="{\"message\": \"Hello World\"}"
Fails with:
jobs:script config should be a hash
I've tried using various combinations of echo as well, without a working solution.
You could use literal block scalar [1] style notation and put the variable definition and subsequent script lines on separate lines [2], without worrying about quoting:
myjob:
  script:
    - |
      EXPECT_SERVER_OUTPUT='{"message": "Hello World"}'
or you can escape the nested double quotes:
myjob:
  script:
    - "EXPECT_SERVER_OUTPUT='{\"message\": \"Hello World\"}'"
but you may also want to just use variables like:
myjob:
  variables:
    EXPECT_SERVER_OUTPUT: '{"message": "Hello World"}'
  script:
    - dothething.sh
Note: variables are by default expanded inside variable definitions, so take care with any $ characters inside the variable value (they must be written as $$ to stay literal). This expansion can also be turned off.
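For example (a sketch with a made-up variable name), a value that must contain a literal dollar sign would be written with $$:

variables:
  PRICE_NOTE: 'costs $$10'   # the job sees: costs $10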
[1] See this answer for an explanation of this and related notation.
[2] See this section of the GitLab docs for more info on multi-line commands.
I made it work like this:
script: |
  EXPECT_SERVER_OUTPUT='{"message": "Hello World"}'
  echo "$EXPECT_SERVER_OUTPUT"
I'm trying to run a sed command in a Jenkins Groovy scripted pipeline file.
I'm passing variables as below:
def SERVICE = args.service
def RESOURCE = "Services"
regionSuffix = (action == 'failover') ? 'us-east-2' : 'us-east-1'
(environment is taken as an argument)
The sed command below works in an Ubuntu terminal:
sed "\|CFT_ENV_FILE|s|$|$RESOURCE/$SERVICE/$environment-$regionSuffix.yml|" docker-compose.yml > docker-compose-$SERVICE.yml
but when I run this command from the Groovy file it gives me an error:
sh '''sed "\\|CFT_ENV_FILE|s|$|"${RESOURCE}"/"${SERVICE}"/"${args.environment}"-"${regionSuffix}".yml|" docker-compose.yml > docker-compose-"${SERVICE}".yml'''
Jenkins error:
[2022-08-03T11:54:48.642Z] /tmp/jenkins-ac851b81/workspace/Infrastructure/failover/region-failover-test-job#tmp/durable-98781677/script.sh: line 1:
"\|CFT_ENV_FILE|s|$|"${RESOURCE}"/"${SERVICE}"/"${args.environment}"-"${regionSuffix}".yml|": bad substitution
script returned exit code 1
Your variables are in Jenkins/Groovy scope, not in shell scope.
Therefore, you need to substitute their values before passing them to shell:
Use " (double quotes) instead of ' (single quotes):
sh """sed '\\|CFT_ENV_FILE|s|$|${RESOURCE}/${SERVICE}/${args.environment}-${regionSuffix}.yml|' docker-compose.yml > docker-compose-${SERVICE}.yml"""
Please notice that you don't need separate quotes around every variable use (e.g. "${RESOURCE}").
Also, any dollar sign that is meant for sed or the shell rather than a Groovy variable must be escaped as \$, as done in the command above.
You can also use a single quote character instead of triple quotes; triple quotes are only needed for multi-line strings.
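A minimal sketch of the scoping difference (names made up), using the same sh step:

def NAME = 'world'
sh "echo hello ${NAME}"    // Groovy interpolates first; the shell runs: echo hello world
sh "echo price: \$PRICE"   // \$ reaches the shell as $, so the shell expands its own PRICE
sh 'echo $HOME'            // single-quoted Groovy string: $HOME is passed through for the shell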
I have all my env vars in .env files.
They get automatically loaded when I open my shell-terminal.
I normally render shell environment variables into my target files with envsubst, similar to the example below.
What I search is a solution where I can pass a dotenv-file as well my template-file to a script which outputs the rendered result.
Something like this:
aScript --input .env.production --template template-file.yml --output result.yml
I want to be able to substitute different environment variables into my YAML. The output should be sealed via "Sealed secrets" and finally saved in the corresponding kustomize folder:
envsub.sh .env.staging templates/secrets/backend-secrets.yml | kubeseal -o yaml > kustomize/overlays/staging
I hope you get the idea.
Example .env.production file:
FOO=bar
PASSWORD=abc
Content of template-file.yml:
stringData:
  foo: $FOO
  password: $PASSWORD
Then running this:
envsubst < template-file.yml > file-with-vars.yml
the result is:
stringData:
  foo: bar
  password: abc
My approach so far does not work, because Dotenv also supports different environments like .env, .env.production, .env.staging, and so forth.
What about:
#!/bin/sh
# envsub - substitute environment variables
env=$1
template=$2
sh -c "
. \"$env\"
cat <<EOF
$(cat "$template")
EOF"
Usage:
./envsub .env.production template-file.yaml > result.yaml
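Given the example files from the question, a run looks like this (assuming the script was saved as envsub and made executable):

$ ./envsub .env.production template-file.yml
stringData:
  foo: bar
  password: abc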
A here-doc with an unquoted delimiter (EOF) expands variables, whilst preserving quotes, backslashes, and other shell sequences.
sh -c is used like eval, to expand the command substitution, then run that output through a here-doc.
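For example, the unquoted delimiter is what makes $FOO expand here; writing <<'EOF' instead would print the literal text:

FOO=bar sh -c 'cat <<EOF
value is $FOO
EOF'
# prints: value is bar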
Be aware that this extra level of indirection creates potential for code injection, if someone can modify the yaml file.
For example, adding this to the template:
EOF
echo malicious commands
But it does get the result you want.
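If the injection risk is a concern, a safer sketch (assuming envsubst is available, as in the question; the script name here is made up) is to auto-export the dotenv variables and let envsubst render the template, since envsubst substitutes variables but never executes anything from the template:

#!/bin/sh
# envsub-safe - render a template with envsubst; the template is never evaluated by the shell
set -a             # auto-export every variable defined from here on
. "$1"             # source the dotenv file, e.g. .env.production
set +a
envsubst < "$2"    # render the template, e.g. template-file.yml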
I have the following line in GitLab CI/CD:
script:
  - echo "Backend image: $BACKEND_IMAGE"
But YAML interpreters treat this as an object. Then I googled this issue and tried this:
script:
  - echo "Backend image:: $BACKEND_IMAGE"
But it still doesn't work and GitLab job fails with the following:
jobs:deploy review:script config should be a string or a nested array of strings up to 10 levels deep
If I remove the colons it works fine. How do I write a string value containing a colon followed by a space in GitLab CI/CD?
It should work if you surround your string with single quotes:
script:
  - 'echo "Backend image: $BACKEND_IMAGE"'
Maybe you can try to put your string in a variable first, then echo that variable:
- ECHO_STRING=$(echo "Backend image:: $BACKEND_IMAGE")
- echo $ECHO_STRING
If not, try:
- ECHO_STRING=$(echo "Backend image:\ $BACKEND_IMAGE" | tr -d '\\')
- echo $ECHO_STRING
(Replace ECHO_STRING with a more meaningful variable name.)
After going through several options, I found this simple solution:
Image="Backend image:"
export RESULT="$(echo "$Image" $BACKEND_IMAGE)"
echo "Result is " $RESULT
We have a project using Azure Pipeline, relying on azure-pipelines.yml file at the repo's root.
When implementing a script step, it is possible to execute successive commands in the same step simply by writing them on different lines:
- script: |
    ls -la
    pwd
    echo $VALUE
Yet, if we have a single command that is very long, we would like to be able to break it across several lines in the YAML file, but we cannot find the corresponding syntax.
You didn't specify your agent OS, so I tested on both windows-latest and ubuntu-latest. Note that the script task runs a bit differently in these two environments: on Windows it uses cmd.exe, while on Ubuntu it uses bash. Therefore, you have to use the correct syntax for each.
On Windows:
pool:
  vmImage: 'windows-latest'
steps:
- script: |
    mkdir ^
    test ^
    -p ^
    -v
On Ubuntu:
pool:
  vmImage: 'ubuntu-latest'
steps:
- script: |
    mkdir \
    test \
    -p \
    -v
Those two files above work on my Azure DevOps.
At the moment, the only way we found to break a single command across multiple lines is using YAML folded style:
- script: >
    echo
    'hello world'
It is all about replacing | with >.
Notes:
It is not possible to introduce extra indentation on the following lines! For example, trying to align all arguments given to a command would break the behaviour.
This style replaces newlines in the provided value with a single space, which means the script can now only contain a single command. (Maybe adding a literal \n at the end of a line would actually introduce a line break in the string, but that feels backward compared to the usual approach of an automatic line break unless an explicit continuation is added.)
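For instance, a long command could be folded like this (a sketch with a made-up command; note that every continuation line keeps exactly the same indentation):

- script: >
    docker build
    --no-cache
    --tag myapp:latest
    .

which the YAML parser folds into the single line docker build --no-cache --tag myapp:latest . before the shell runs it.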
You can use '^' to break your command line into multiple lines (this is cmd.exe syntax, so it applies on Windows agents). Check the example below: the script will output 'hello world' just like the single-line command echo 'hello world'.
- script: |
    echo ^
    'hello ^
    world'
I know the basics of Bash but often miss the nuances, and I'm having trouble with what I had hoped would be a rather simple problem:
If I have the following in a bash script, which works exactly as I'd want it to:
cbType=`echo $configuration | jsawk -a 'return _.where(this,{name: "reference_data"})'`
It takes $configuration -- which is a JSON string -- and identifies the array element where name is "reference_data" and returns that object/hash definition only. Please note that this uses the very handy jsawk utility, which has been designed to exhibit good command-line behaviour.
The problem is that when I replace the hard-coded "reference_data" with a variable, it no longer sees the variable. So for instance, ...
myVar="\"reference_data\""
cbType=`echo $configuration | jsawk -a 'return _.where(this,{name: $myVar})'`
Does not work and instead returns a jsawk error of:
jsawk: js error: ReferenceError: $myVar is not defined
Is there anything I can do to enforce that first the variable is expanded, and then the command string is executed?
Declared variables won't be expanded if they're not within double quotes. So put your code inside double quotes instead of single quotes:
myVar="\"reference_data\""
cbType=$(echo "$configuration" | jsawk -a "return _.where(this,{name: $myVar})")
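As a minimal illustration of the quoting rule (variable contents made up):

myVar='"reference_data"'
echo 'name: $myVar'    # single quotes: prints  name: $myVar
echo "name: $myVar"    # double quotes: prints  name: "reference_data"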