GitLab CI/CD: create and use custom user functions - bash

I tried to create a function like this in my GitLab config file:
deploy:
  stage: dev
  services:
    - docker:dind
  script:
    - myFunction () { api_pl_tmp=$(curl -s --header "PRIVATE-TOKEN: $TOKEN_VAR" "https://git.example.ru/api/v4/projects/1/pipelines/latest" | jq .) }
    - while myFunction; do
    - if [ $(echo $api_pl_tmp | jq -r .status) = "success" ]
    - then
    - export PROJECT_CURRENT=$($api_pl_tmp | jq -r '{id:.id,sha:.sha[0:8]}' | base64)
    - break
    - fi
    - if [ $(echo $api_pl_tmp | jq -r .status) = "failed" ]
    - then
    - echo "Error: Frontend can't be deployed!"
    - exit 1
    - fi
    - if [ $(echo $api_pl_tmp | jq -r .status) = "running" ]
    - then
    - echo "Wait 5 sec... Frontend deploying!"
    - sleep 5
    - else
    - echo Unknow status $(echo $api_pl_tmp | jq -r .status)
    - exit 1
    - fi
    - done
But it doesn't work, and GitLab returns an error with this message:
This GitLab CI configuration is invalid: jobs:deploy-to-dev:script
config should be a string or a nested array of strings up to 10 levels
deep
How can I fix this problem? Or do I have an error in my custom function?

In the script: section, each - item is a separate command. Put a multi-line construct such as your function definition and your while loop into one item, not spread across several. Remember that the lines of one multi-line item are joined with spaces, so the shell needs explicit ; separators.
The other problem with your script is the colon: a colon followed by a space inside an unquoted script item makes YAML parse the item as a mapping instead of a string - see https://gitlab.com/gitlab-org/gitlab-foss/-/issues/30097 .
deploy:
  stage: dev
  services:
    - docker:dind
  script:
    - "colon=:"
    - myFunction () {
        api_pl_tmp=$(curl -s --header "PRIVATE-TOKEN$colon $TOKEN_VAR" \
          "https$colon//git.example.ru/api/v4/projects/1/pipelines/latest" | jq .);
      }
    - while myFunction; do
        if [ $(echo $api_pl_tmp | jq -r .status) = "success" ]; then
          export PROJECT_CURRENT=$($api_pl_tmp | jq -r '{id:.id,sha:.sha[0:8]}' | base64);
          break;
        fi;
        if [ $(echo $api_pl_tmp | jq -r .status) = "failed" ]; then
          echo "Error$colon Frontend cant be deployed";
          exit 1;
        fi;
        if [ $(echo $api_pl_tmp | jq -r .status) = "running" ]; then
          echo "Wait 5 sec... Frontend deploying!";
          sleep 5;
        else
          echo Unknow status $(echo $api_pl_tmp | jq -r .status);
          exit 1;
        fi;
      done
Also, jq .) } is missing a ;. Check your scripts in your own shell one at a time first, and check them with https://shellcheck.net .
Also, $($api_pl_tmp is missing an echo, and there are a lot of problems with quoting. Use consistent indentation and try to write readable code to minimize typos.
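An alternative worth mentioning: GitLab CI also accepts YAML literal block scalars (a script item starting with |), which avoids the colon workaround entirely because the whole block is one string with real newlines. A minimal sketch of the same job written that way (the URL, token variable and statuses are taken from the question; this is one possible layout, not the only fix):

deploy:
  stage: dev
  services:
    - docker:dind
  script:
    - |
      # Define the helper, then poll the latest pipeline until it finishes.
      myFunction () {
        api_pl_tmp=$(curl -s --header "PRIVATE-TOKEN: $TOKEN_VAR" \
          "https://git.example.ru/api/v4/projects/1/pipelines/latest" | jq .)
      }
      while myFunction; do
        status=$(echo "$api_pl_tmp" | jq -r .status)
        if [ "$status" = "success" ]; then
          export PROJECT_CURRENT=$(echo "$api_pl_tmp" | jq -r '{id:.id,sha:.sha[0:8]}' | base64)
          break
        elif [ "$status" = "failed" ]; then
          echo "Error: Frontend can't be deployed!"
          exit 1
        elif [ "$status" = "running" ]; then
          echo "Wait 5 sec... Frontend deploying!"
          sleep 5
        else
          echo "Unknown status $status"
          exit 1
        fi
      done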

Related

Writing a comparison bash script to verify a sha256sum against released code

Trying to write a script that takes 2 arguments ($1 and $2), one for the $hash and one for the $file_name.
I am trying to use jq to parse the data required to download and compare, printing PASS or FAIL.
I seem to be stuck trying to think this out.
Here is my code
#!/usr/bin/env sh
#
# Sifchain shasum check (revised).
#
# $1
hash_url=$( curl -R -s https://api.github.com/repos/Sifchain/sifnode/releases | jq '.[] | select(.name=="v0.10.0-rc.4")' | jq '.assets[]' | jq 'select(.name=="sifnoded-v0.10.0-rc.4-linux-amd64.zip.sha256")' | jq '.browser_download_url' | xargs $1 $2 )
echo $hash_url
# $2
hash=$( curl -s -L $hash_url | jq'.$2')
file_name=$(curl -R -s https://api.github.com/repos/Sifchain/sifnode/releases | jq '.[] | .name')
#
#
echo $hash | sha256sum
echo $file_name | sha256sum #null why?
echo "\n"
## version of the release $1, and the hash $2
## sha256 <expected_sha_256_sum> <name_of_the_file>
sha256() {
if echo "$1 $2" #| sha256sum -c --quiet
then
echo pass $1 $2
exit 0
else
echo FAIL $1 $2
exit 1
fi
}
# Invoke sha256
sha256 $hash_url $file_name
Ideally this should work for comparing any hash with the correct file, pulling in the 2 parameters when the bash script is invoked.
I can suggest the following corrections/modifications:
#!/bin/bash
#sha file
SHA_URL=$(curl -R -s https://api.github.com/repos/Sifchain/sifnode/releases | \
jq --arg VERSION v0.10.0-rc.4 -r \
'.[] | select(.name==$VERSION) | .assets[] | select(.name |test("\\.sha256$")) | .browser_download_url')
SHA_VALUE=$(curl -s -L $SHA_URL| tr 1 2)
FILENAME=$(curl -R -s https://api.github.com/repos/Sifchain/sifnode/releases | \
jq --arg VERSION v0.10.0-rc.4 -r \
'.[] | select(.name==$VERSION) | .assets[] | select(.content_type =="application/zip") | .name')
#added just for testing, I'm assuming you have the files locally already
FILEURL=$(curl -R -s https://api.github.com/repos/Sifchain/sifnode/releases | \
jq --arg VERSION v0.10.0-rc.4 -r \
'.[] | select(.name==$VERSION) | .assets[] | select(.content_type =="application/zip") | .browser_download_url')
wget --quiet $FILEURL -O $FILENAME
echo $SHA_VALUE $FILENAME | sha256sum -c --quiet >/dev/null 2>&1
RESULT=$?
if [ $RESULT -eq 0 ]; then
echo -n "PASS "
else
echo -n "FAIL "
fi
echo $SHA_VALUE $FILENAME
exit $RESULT
Notes:
jq:
--arg VERSION v0.10.0-rc.4 creates a "variable" to be used in the script
-r - raw output, strings are not quoted
test("\\.sha256$") - regular expression, used to search for a generic .sha256 asset so you don't have to hardcode the full name
select(.content_type =="application/zip") - I'm assuming that's the file you are searching for
wget is used just for demo purposes, to download the file; I'm assuming you already have the file on your machine
sha256sum -c --quiet >/dev/null 2>&1 - redirecting to /dev/null is necessary because, in case of error, sha256sum is not quiet
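To get the $1/$2 behaviour the question asks for, the same jq calls can be parameterized with the release name. A rough sketch building on the above (the argument order, the requirement that the file already exists locally, and the use of the first field of the .sha256 asset are assumptions, not something stated in the original post):

#!/bin/bash
# Usage: ./check_release.sh <release-name> [asset-name]
VERSION="$1"
RELEASES=$(curl -R -s https://api.github.com/repos/Sifchain/sifnode/releases)

# URL of the .sha256 asset for the requested release
SHA_URL=$(echo "$RELEASES" | jq --arg VERSION "$VERSION" -r \
  '.[] | select(.name==$VERSION) | .assets[] | select(.name | test("\\.sha256$")) | .browser_download_url')

# Expected hash: assume it is the first whitespace-separated field of the .sha256 file
EXPECTED=$(curl -s -L "$SHA_URL" | awk '{print $1}')

# File to check: $2 if given, otherwise the zip asset name from the release metadata
FILENAME=${2:-$(echo "$RELEASES" | jq --arg VERSION "$VERSION" -r \
  '.[] | select(.name==$VERSION) | .assets[] | select(.content_type=="application/zip") | .name')}

# Compare against the locally downloaded file
ACTUAL=$(sha256sum "$FILENAME" | awk '{print $1}')
if [ "$EXPECTED" = "$ACTUAL" ]; then
  echo "PASS $FILENAME"
else
  echo "FAIL $FILENAME"
  exit 1
fi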

How to check the stdout of a command in a Bash while loop condition?

I am trying to check the output of a command as the breakout condition of a while loop in Bash, but it keeps skipping past the block, even though the last echo confirms the value is "DELETE_IN_PROGRESS".
This is what I have:
stackname=cf_test
while [[ $(aws cloudformation describe-stacks --stack-name ${stackname} | jq '.Stacks | .[0] | .StackStatus') == "DELETE_IN_PROGRESS" ]]; do
echo -e " $(aws cloudformation describe-stacks --stack-name ${stackname} | jq '.Stacks | .[0] | .StackStatus'): waiting for current stack to delete before re-deploying..."
sleep 30
done
echo -e $(aws cloudformation describe-stacks --stack-name ${stackname} | jq '.Stacks | .[0] | .StackStatus')
What should I change?
Edit: adding the -x debug flag gives:
▶ bash -x ~/Downloads/test_script.sh
+ stackname=cf_test
++ aws cloudformation describe-stacks --stack-name cf_test
++ jq '.Stacks | .[0] | .StackStatus'
+ [[ "DELETE_IN_PROGRESS" == \D\E\L\E\T\E\_\I\N\_\P\R\O\G\R\E\S\S ]]
++ aws cloudformation describe-stacks --stack-name cf_test
++ jq '.Stacks | .[0] | .StackStatus'
+ echo -e '"DELETE_IN_PROGRESS"'
"DELETE_IN_PROGRESS"
I would move the comparison inside jq. By default, jq will succeed, regardless of the logical status of the comparison you make.
% jq -n '3 == 3'; echo $?
true
0
% jq -n '3 != 3'; echo $?
false
0
To change this, use the -e option.
% jq -en '3 == 3'; echo $?
true
0
% jq -en '3 != 3'; echo $?
false
1
Doing this eliminates the need for the [[ ... ]] command.
while x=$(aws cloudformation describe-stacks --stack-name "${stackname}");
      jq -ne --argjson x "$x" '$x.Stacks[0].StackStatus == "DELETE_IN_PROGRESS"'; do
  printf '%s: waiting for current stack to delete before re-deploying...\n' "$x"
  sleep 30
done
printf '%s\n' "$x"
Your debug log shows the problem (even though the [[ line itself weirdly obscures it):
+ echo -e '"DELETE_IN_PROGRESS"'
The value you are comparing contains literal double quotes. That's why the match fails. Your right-hand side of == contains syntactical double quotes that are not treated as part of the string.
To fix it, use jq -r to output the string without JSON formatting and escaping:
json='{ "foo": "bar" }'
jq '.foo' <<< "$json" # Shows bad 5 character value: "bar"
jq -r '.foo' <<< "$json" # Shows good 3 character value: bar
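The same difference is easy to see with the value from this question; a quick demo you can run locally (the single-key JSON object is made up just for illustration):

status=$(echo '{"StackStatus":"DELETE_IN_PROGRESS"}' | jq '.StackStatus')
echo "${#status}"    # 20 - the two JSON quote characters are part of the value
status=$(echo '{"StackStatus":"DELETE_IN_PROGRESS"}' | jq -r '.StackStatus')
echo "${#status}"    # 18 - the raw string, which is what == "DELETE_IN_PROGRESS" expects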
In your case:
while [[ $(aws cloudformation describe-stacks --stack-name ${stackname} | jq -r '.Stacks | .[0] | .StackStatus') == "DELETE_IN_PROGRESS" ]]; do
  echo "Still deleting"
  sleep 30
done

Bash variable scope issue

I am struggling to understand what the cause of the following bug is and how I can fix it.
I have this code:
set_filters() {
json=$1
filters='"Name=instance-state-name,Values=running,stopped"'
echo $json | jq -r '. | keys[]' | \
while read tag ; do
value=$(echo "$json" | jq -r ".[\"$tag\"]")
filters="$filters \"Name=tag:${tag},Values=${value}\""
done
echo $filters
}
set_filters '{"Name": "*FOO*", "Cost Center": "XX111"}'
The output I am expecting:
"Name=instance-state-name,Values=running,stopped" "Name=tag:Cost Center,Values=XX111" "Name=tag:Name,Values=*FOO*"
The output I am getting:
"Name=instance-state-name,Values=running,stopped"
If I insert echo statements to assist with debugging:
set_filters() {
json=$1
filters='"Name=instance-state-name,Values=running,stopped"'
echo $json | jq -r '. | keys[]' | \
while read tag ; do
value=$(echo "$json" | jq -r ".[\"$tag\"]")
filters="$filters \"Name=tag:${tag},Values=${value}\""
echo "FILTERS INSIDE LOOP: $filters"
done
echo "FILTERS OUTSIDE LOOP: $filters"
}
The output I then get is:
FILTERS INSIDE LOOP: "Name=instance-state-name,Values=running,stopped" "Name=tag:Cost Center,Values=XX111"
FILTERS INSIDE LOOP: "Name=instance-state-name,Values=running,stopped" "Name=tag:Cost Center,Values=XX111" "Name=tag:Name,Values=*FOO*"
FILTERS OUTSIDE LOOP: "Name=instance-state-name,Values=running,stopped"
I can't explain the behaviour. In a language other than Bash I would assume a variable scope issue for the variable $filters, but I thought the scope would basically be global.
I am using JQ version 1.3 and Bash version 4.1.2 on Red Hat Enterprise Linux 6.8.
Bash executes loops in a subshell if they are part of a pipeline, so any assignments made inside the loop are lost when that subshell exits. See for example BashFAQ/024 and "Bash Script: While-Loop Subshell Dilemma".
A possible workaround is to use process substitution:
while read tag; do
...
done < <(jq -r '. | keys[]' <<< "$1")
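Applied to the function from the question, that looks roughly like this (a sketch keeping the original variable names; the here-string feeds jq without creating a pipeline, so the while loop runs in the current shell and $filters survives the loop):

set_filters() {
  json=$1
  filters='"Name=instance-state-name,Values=running,stopped"'
  # Process substitution instead of a pipe: the loop body runs in this shell.
  while read -r tag; do
    value=$(echo "$json" | jq -r ".[\"$tag\"]")
    filters="$filters \"Name=tag:${tag},Values=${value}\""
  done < <(jq -r '. | keys[]' <<< "$json")
  echo "$filters"
}

set_filters '{"Name": "*FOO*", "Cost Center": "XX111"}'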

rake - running shell command returns error

I am trying to run the following shell command in Rake:
sh "d='jps -l | grep jar | cut -d ' ' -f 1'; if [ -z \"$d\" ]; then :; else kill \"$d\"; fi;"
However I get:
sh: -f 1: not found
If I run it in a Linux shell it works fine.
What is wrong?
I interpreted your question wrong earlier. This is what you want.
d='jps -l | grep jar | cut -d " " -f 1; if [ -z "$d" ]; then :; else kill "$d"; fi;'
system(d)
OR
If you want the output of the command (which I guess you don't in this case)
output = `jps -l | grep jar | cut -d " " -f 1; if [ -z "$d" ]; then :; else kill "$d"; fi;`
You need to escape your single quotes and quote the whole string:
d='jps -l | grep jar | cut -d \' \' -f 1; if [ -z "$d" ]; then :; else kill "$d"; fi;'
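For reference, the shell logic being wrapped here presumably wants command substitution rather than a quoted string. Run directly in bash, the intent (find the PID of the running jar process and kill it if there is one - an assumption based on the question) would look like:

# PID of the java process running a jar, empty if none is running
d=$(jps -l | grep jar | cut -d ' ' -f 1)

# Only kill if we actually found a PID
if [ -n "$d" ]; then
  kill "$d"
fi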

Bash script help/evaluation

I'm trying to learn some scripting, however I can't find a solution for one piece of functionality.
Basically I would like to ask you to evaluate my script, as it's probably possible to reduce its complexity and number of lines.
The purpose of this script is to download random, encrypted MySQL backups from Amazon S3, restore the dump and run some random MySQL queries.
I'm not sure how to email the output from the printf statements - one is for the headers and the second one is for the actual data. I've tried to format the output so it looks like the below, but I had to exclude the headers from the loop:
Database: Table: Entries:
database1 random_table 0
database2 random_table 0
database3 random_table 0
database4 random_table 0
I would like to include this output in the email and also change the email subject based on the success/failure of the script.
I probably use too many if statements, and the MySQL queries are probably too complicated.
Script:
#!/usr/bin/env bash
# DB Details:
db_user="user"
db_pass="password"
db_host="localhost"
# Date
date_stamp=$(date +%d%m%Y)
# Initial Setup
data_dir="/tmp/backup"
# Checks
if [ ! -e /usr/bin/s3cmd ]; then
echo "Required package (http://s3tools.org/s3cmd)"
exit 2
fi
if [ -e /usr/bin/gpg ]; then
gpg_key=$(gpg -K | tr -d "{<,>}" | awk '/an@example.com/ { print $4 }')
if [ "$gpg_key" != "an@example.com" ]; then
echo "No GPG key"
exit 2
fi
else
echo "No GPG package"
exit 2
fi
if [ -d $data_dir ]; then
rm -rf $data_dir/* && chmod 700 $data_dir
else
mkdir $data_dir && chmod 700 $data_dir
fi
# S3 buckets
bucket_1=s3://test/
# Download backup
for backup in $(s3cmd ls s3://test/ | awk '{ print $2 }')
do
latest=$(s3cmd ls $backup | awk '{ print $2 }' | sed -n '$p')
random=$(s3cmd ls $latest | shuf | awk '{ print $4 }' | sed -n '1p')
s3cmd get $random $data_dir >/dev/null 2>&1
done
# Decrypting Files
for file in $(ls -A $data_dir)
do
filename=$(echo $file | sed 's/\.e//')
gpg --out $data_dir/$filename --decrypt $data_dir/$file >/dev/null 2>&1 && rm -f $data_dir/$file
if [ $? -eq 0 ]; then
# Decompressing Files
bzip2 -d $data_dir/$filename
if [ $? -ne 0 ]; then
echo "Decompression Failed!"
fi
else
echo "Decryption Failed!"
exit 2
fi
done
# MySQL Restore
printf "%-40s%-30s%-30s\n\n" Database: Table: Entries:
for dump in $(ls -A $data_dir)
do
mysql -h $db_host -u $db_user -p$db_pass < $data_dir/$dump
if [ $? -eq 0 ]; then
# Random DBs query
db=$(echo $dump | sed 's/\.sql//')
random_table=$(mysql -h $db_host -u $db_user -p$db_pass $db -e "SHOW TABLES" | grep -v 'Tables' | shuf | sed -n '1p')
db_entries=$(mysql -h $db_host -u $db_user -p$db_pass $db -e "SELECT * FROM $random_table" | grep -v 'id' | wc -l)
printf "%-40s%-30s%-30s\n" $db $random_table $db_entries
mysql -h $db_host -u $db_user -p$db_pass -e "DROP DATABASE $db"
else
echo "The system was unable to restore backups!"
rm -rf $data_dir
exit 2
fi
done
#Remove backups
rm -rf $data_dir
You'll get the best answers if you ask specific questions (rather than "please review my code")... and if you limit each post to a single question. Regarding emailing the output of your printf statements:
You can group statements into a block and then pipe the output of a block into another program. For example:
{
echo "This is a header"
echo
for x in {1..10}; do
echo "This is row $x"
done
} | mail -s "Here is my output" lars@example.com
If you want to make the email subject conditional upon the success or failure of something elsewhere in the script, you can (a) save your output to a file, and then (b) email the file after building the subject line:
{
echo "This is a header"
echo
for x in {1..10}; do
echo "This is row $x"
done
} > output
if is_success; then
subject="SUCCESS: Here is your output"
else
subject="FAILURE: Here are your errors"
fi
mail -s "$subject" lars@example.com < output
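Applied to the script from the question, the header printf and the restore loop can be grouped and captured the same way (a sketch; the mail address and subject wording are placeholders, and a simple status flag stands in for the hypothetical is_success above):

# Track overall success so the subject line can reflect it afterwards.
status=0
{
  printf "%-40s%-30s%-30s\n\n" Database: Table: Entries:
  for dump in $(ls -A $data_dir); do
    if mysql -h $db_host -u $db_user -p$db_pass < $data_dir/$dump; then
      db=$(echo $dump | sed 's/\.sql//')
      random_table=$(mysql -h $db_host -u $db_user -p$db_pass $db -e "SHOW TABLES" | grep -v 'Tables' | shuf | sed -n '1p')
      db_entries=$(mysql -h $db_host -u $db_user -p$db_pass $db -e "SELECT * FROM $random_table" | grep -v 'id' | wc -l)
      printf "%-40s%-30s%-30s\n" $db $random_table $db_entries
    else
      echo "The system was unable to restore backups!"
      status=1
    fi
  done
} > output

# The braced group runs in the current shell (no pipeline), so $status is still visible here.
if [ $status -eq 0 ]; then
  subject="SUCCESS: MySQL restore report"
else
  subject="FAILURE: MySQL restore report"
fi
mail -s "$subject" user@example.com < output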
