while loop not working while reading file in sh - jenkins-pipeline

I have the following shell script deployed in my Jenkinsfile
withEnv(['PATH+EXTRA=/busybox:/kaniko']) {
sh """#!/busybox/sh
cat "images2.txt"
cat "images2.txt" | while read line || [ -n "\$line" ]; do
fields=(\$line)
echo "FROM \${fields[0]}" | /kaniko/executor --dockerfile /dev/stdin --destination \${fields[1]}
done
"""
}
I have to escape the $ since it's inside a Jenkinsfile, otherwise Groovy tries to interpolate it.
The contents of images2.txt are:
docker.io/prom/blackbox-exporter:v0.14.0 eu.gcr.io/development/infra/monitoring/blackbox-exporter:v0.14.0
docker.io/busybox:1.30.0 eu.gcr.io/development/infra/monitoring/busybox:1.30.0
When the pipeline runs, I get the error
/home/jenkins/agent/workspace/test-job#tmp/durable-9cce3aab/script.sh: line 4: syntax error: unexpected "(" (expecting "done")
I have also tried doing something like this
sh """#!/busybox/sh
input="images2.txt"
while IFS= read -r line
do
fields=(\$line)
echo "FROM \${fields[0]}" | /kaniko/executor --dockerfile /dev/stdin --destination \${fields[1]}
done < "\$input"
"""
But the result is the same. Any idea what might be wrong here?

As per @Charles Duffy's recommendation, I implemented
sh """#!/busybox/sh
cat "images2.txt" | while read field1 field2 rest; do
echo "FROM \$field1" | /kaniko/executor --dockerfile /dev/stdin --destination \$field2
done
"""
and it worked perfectly. The original failure came from fields=(\$line): array assignment is a bash feature that busybox sh does not support, which is why the parser stopped at the "(".
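For reference, outside a Jenkinsfile (so without the \$ escaping) the same approach is plain POSIX sh: read splits each line on whitespace into the named variables, so no bash-only arrays are needed. A minimal sketch, with illustrative variable names:
while read -r src dst rest; do
    printf 'source image: %s\n' "$src"
    printf 'destination image: %s\n' "$dst"
done < images2.txt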

Related

Concatenate the output of 2 commands in the same line in Unix

I have a command like below
md5sum test1.txt | cut -f 1 -d " " >> test.txt
I want output of the above result prefixed with File_CheckSum:
Expected output: File_CheckSum: <checksumvalue>
I tried as follows
echo 'File_Checksum:' >> test.txt | md5sum test.txt | cut -f 1 -d " " >> test.txt
but getting result as
File_Checksum:
adbch345wjlfjsafhals
I want the entire output in 1 line
File_Checksum: adbch345wjlfjsafhals
echo writes a newline after it finishes writing its arguments. Some versions of echo allow a -n option to suppress this, but it's better to use printf instead.
You can use a command group to concatenate the standard output of your two commands:
{ printf 'File_Checksum: '; md5sum test.txt | cut -f 1 -d " "; } >> test.txt
Note that there is a race condition here: you can theoretically write to test.txt before md5sum is done reading from it, causing you to checksum more data than you intended. (Your original command mentions test1.txt and test.txt as separate files, so it's not clear if you are really reading from and writing to the same file.)
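One way to sidestep that race, if you really are reading from and writing to the same file, is to capture the checksum into a variable first and only append afterwards (a minimal sketch):
sum=$(md5sum test.txt | cut -d ' ' -f 1)   # reading finishes before any writing starts
printf 'File_Checksum: %s\n' "$sum" >> test.txt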
You can use command grouping to have a list of commands executed as a unit and redirect the output of the group at once:
{ printf 'File_Checksum: '; md5sum test1.txt | cut -f 1 -d " "; } >> test.txt
printf "%s: %s\n" "File_Checksum:" "$(md5sum < test1.txt | cut ...)" > test.txt
Note that if you are trying to compute the hash of test.txt (the same file you are trying to write to), this changes things significantly.
Another option is:
{
printf "File_Checksum: "
md5sum ...
} > test.txt
Or:
exec > test.txt
printf "File_Checksum: "
md5sum ...
but be aware that all subsequent commands will also write their output to test.txt. The typical way to restore stdout is:
exec 3>&1
exec > test.txt # Redirect all subsequent commands to `test.txt`
printf "File_Checksum: "
md5sum ...
exec >&3 # Restore original stdout
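If the saved descriptor is no longer needed, you can also close it afterwards:
exec 3>&-   # close the spare file descriptor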
You can also chain commands with the && operator, e.g. mkdir example && cd example, though that only runs the second command when the first succeeds; it does not join their output onto one line.

Concatenate String and Variable in Shell Script

Content of file is:
#data.conf
ip=127.0.0.1
port=7890
delay=10
key=1.2.3.4
debug=true
Shell Script:
#!/bin/bash
typeset -A config
config=()
config_file_path="./data.conf"
cmd="java -jar ./myprogram.jar"
#This section will read file and put content in config variable
while read line
do
#echo "$line"
if echo $line | grep -F = &>/dev/null
then
key=$(echo "$line" | cut -d '=' -f 1)
config[$key]=$(echo "$line" | cut -d '=' -f 2)
echo "$key" "${config["$key"]}"
fi
done < "$config_file_path"
cmd="$cmd -lh ${config["ip"]} -lp ${config["port"]} -u ${config["debug"]} -hah \"${config["key"]}\" -hap ${config["delay"]}"
echo $cmd
Expected output:
java -jar myprogram.jar -lh 127.0.0.1 -lp 7890 -u true -hah "1.2.3.4" -hap 10 -b
Output:
Every time I get some unexpected output, e.g.
-lp 7890rogram.jar
It looks like it is overwriting the same line again and again.
The mangled output happens because data.conf has Windows (CRLF) line endings: each value ends with a carriage return, so the terminal jumps back to the start of the line when $cmd is echoed. Following the comments, and to add automatic data cleansing to the script itself (see How to convert DOS/Windows newline (CRLF) to Unix newline (LF) in a Bash script? and Remove carriage return in Unix), you could add
# This section will clean the input config file
sed -i 's/\r$//' "${config_file_path}"
within your script. This will prevent the error in future runs.
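If you would rather not modify the config file in place, a minimal alternative sketch is to strip a trailing carriage return from each line as it is read:
while read -r line
do
    line=${line%$'\r'}   # drop a trailing CR, if any
    # ... rest of the existing loop body unchanged ...
done < "$config_file_path"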

How to run commands off of a pipe

I would like to run commands such as "history" or "!23" off of a pipe.
How might I achieve this?
Why does the following command not work?
echo "history" | xargs eval $1
To answer (2) first:
history and eval are both bash builtins. So xargs cannot run either of them.
xargs does not take $1-style arguments; see man xargs for the correct syntax.
For (1), it doesn't really make much sense to do what you are attempting because shell history is not likely to be synchronised between invocations, but you could try something like:
{ echo 'history'; echo '!23'; } | bash -i
or:
{ echo 'history'; echo '!23'; } | while read -r cmd; do eval "$cmd"; done
Note that pipelines run inside subshells. Environment changes are not retained:
x=1; echo "x=2" | while read -r cmd; do eval "$cmd"; done; echo "$x"
You can try like this
First redirect the history commands to a file (cut out the line numbers)
history | cut -c 8- > cmd.txt
Now create this script, hcmd.sh (see Read a file line by line assigning the value to a variable):
#!/bin/bash
while IFS='' read -r line || [[ -n "$line" ]]; do
echo "Text read from file: $line"
$line
done < "cmd.txt"
Run it like this
./hcmd.sh
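Be aware that running $line unquoted only handles simple commands; a variant of hcmd.sh (just a sketch) that also honours quotes, pipes and redirections in the recorded commands is to eval each line instead:
#!/bin/bash
while IFS='' read -r line || [[ -n "$line" ]]; do
    echo "Running: $line"
    eval "$line"
done < "cmd.txt"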

Bash Script outputs Command Not Found Repeatedly

I am trying to create a bash script that will write the output of some pacman queries to text files: mainly which packages are installed locally, which packages are installed as dependencies, which packages are orphans, and which packages require which dependencies. Currently, I am in the middle of solving an issue that is preventing me from writing which packages require which dependencies. I am using the following bash code:
#!/bin/bash
set -e -u
#Switch to PWD
cd /home/$USER/System/scripts/pacman-queries-output/
#Get the current date
DATE=`date +%m%d%Y`
#Pacman Queries
pacman --query -e >pacman_installed$DATE.txt
pacman --query -d >pacman_dependencies$DATE.txt
pacman -Qdt >pacman_orphans$DATE.txt
while read package_desc
do
package_name=$(echo $package_desc| cut -d' ' -f 1)
check_if_none=$(pacman -Qi $package_name | grep "Req" | sed -e 's/Required By : //g')
if $check_if_none != "Required By : None"
then
echo $package_name >>pacman_required_by$DATE.txt
pacman -Qi $package_name | grep "Req" | sed -e 's/Required By : //g' >>pacman_required_by$DATE.txt
fi
done < $PWD/pacman_installed$DATE.txt
echo 'Completed 'basename
However, the while loop doesn't seem to create and/or write to the text file I specified instead it echoes this multiple times in the terminal:
./pacman-queries.sh: line 20: Required: command not found
The following is one of the iterations of the while loop that is displayed when running bash -x pacman-queries.sh:
+ read package_desc
++ echo aesfix 1.0.1-4
++ cut '-d ' -f 1
+ package_name=aesfix
++ pacman -Qi aesfix
++ grep Req
++ sed -e 's/Required By : //g'
+ check_if_none='Required By : None'
+ Required By : None '!=' 'Required By : None'
pacman-queries.sh: line 20: Required: command not found
Could anyone suggest any solution that they might have to solve this issue? Thank you in advance.
if $check_if_none != "Required By : None"
If the check_if_none has the string:
check_if_none="Required By : None"
Then it gets expanded to:
if Required By : None != "Required By : None"
if executes the command passed to it, so here it tries to run a command named Required, and no such command exists.
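You can reproduce the message on the command line with a small illustrative snippet:
check_if_none='Required By : None'
if $check_if_none != "Required By : None"; then :; fi
# bash: Required: command not found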
How to fix it:
use test [ ... ] or [[ ... ]] to do comparisons in bash
always quote your variable expansions, e.g. "$check_if_none"
if [ "$check_if_none" != "Required By : None" ]
Also:
don't use backticks for command substitution. They are less readable, awkward to nest, and $( ... ) is preferred.
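For example, $( ) nests without any extra escaping (illustrative snippet):
parent=$(basename "$(dirname "$PWD")")   # with backticks the inner substitution would need escaping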
Your script after some fixing may look like this:
#!/bin/bash
set -e -u
#Switch to PWD
cd "/home/$USER/System/scripts/pacman-queries-output/"
#Get the current date
DATE=$(date +%m%d%Y)
#Pacman Queries
pacman --query -e >"pacman_installed$DATE.txt"
pacman --query -d >"pacman_dependencies$DATE.txt"
pacman -Qdt >"pacman_orphans$DATE.txt"
while IFS= read -r package_desc; do
package_name=$(echo "$package_desc" | cut -d' ' -f 1)
# renamed from check_if_none
# some newlines for readability
Required=$(
pacman -Qi "$package_name" |
grep "Req" |
sed -e 's/Required By : //g'
)
if [ "$Required" != "Required By : None" ]; then
echo "$package_name"
# running `pacman -Qi` twice would just waste CPU cycles...
echo "$Required"
fi
# All output goes into required_by - moved it here
# also changed `>>` into `>`
done < "$PWD/pacman_installed$DATE.txt" > "pacman_required_by$DATE.txt"
echo "Completed $(basename "$0")"

Bash Script - using cmd instead of cat

I wrote a script, including this loop:
#!/bin/bash
cat "$1" | while read -r line; do
echo "$line"; sleep 2;
done
A shellcheck run put out the following message:
SC2002: Useless cat. Consider 'cmd < file | ..' or 'cmd file | ..' instead.
I changed the script to:
#!/bin/bash
cmd < "$1" | while read -r line; do
echo "$line"; sleep 2;
done
but now bash exits with:
cmd: command not found
what have I done wrong?
The cmd in shellcheck's message is just a placeholder; here the command reading the file is the whole while cond; do ... done compound statement, and in that case the redirection goes at the end:
while read -r line; do
echo "$line"; sleep 0.2
done < "$1"
Remove the | and have the end line as :
done < "$1"
