Unable to do an IF statement inside a bash heredoc - bash

I am trying to do an IF..ELSE inside a heredoc. The heredoc is necessary because the set of commands needs to be executed as a different user.
Unfortunately, the IF statement doesn't work as expected and always falls through to the ELSE clause. When I remove the IF block from the heredoc and place it outside, it works as expected.
It is probably a simple thing I'm missing, but I have no idea what it is.
rem=0
function1 () {
su - user1 <<'DONE'
if [[ "$rem" -eq 0 ]];
then
echo rem is even
else
echo rem is odd
fi
DONE
}
function1
It echoes rem is odd.

Change the third line of your script to this:
su - user1 <<DONE
Notice the missing quotes around DONE.
With quotes around the delimiter you basically deactivate parameter substitution inside the heredoc, so the value of $rem is not what you expect it to be (just echo it and see). Without the quotes, parameter substitution works: $rem is expanded by your own shell before the heredoc is handed to su.
Tested on a CentOS 7 with Bash 4.2.46.
See also a longer discussion here: Using variables inside a bash heredoc
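For reference, here is the whole function with the unquoted delimiter (a minimal sketch of the fix, assuming the same user1 account and rem variable as above, and that su succeeds):
rem=0
function1 () {
# Unquoted delimiter: $rem is expanded by the calling shell
# before the heredoc is passed to su as stdin.
su - user1 <<DONE
if [[ "$rem" -eq 0 ]]; then
    echo rem is even
else
    echo rem is odd
fi
DONE
}
function1   # prints: rem is even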

Related

Why can't a string variable be evaluated as a command when using ${!variable}?

This is somewhat related to Use substituted string as a command in shell script, which I asked last year. The accepted answer worked nicely.
#!/usr/bin/env bash
for user in ytu
do
cd /home/${user}/H2-Data/crons
for path in "$user"_*_monthly_report.py
do
if [ -e $path ]
then
. ../../.profile
userpython=${user^^}_PYTHON
echo ${!userpython} $path
else
break
fi
done
done
This echoes what I expected:
/home/ytu/anaconda3/bin/python ytu_clinic217_monthly_report.py
/home/ytu/anaconda3/bin/python ytu_clinic226_monthly_report.py
However, by simply changing ${user^^}_PYTHON to $YTU_PYTHON, which should be exactly the same in this case, the bash script now echoes:
ytu_clinic217_monthly_report.py
ytu_clinic226_monthly_report.py
I even tried userpython=/home/ytu/anaconda3/bin/python but that ends up the same. That said, if I echo $userpython, I still get /home/ytu/anaconda3/bin/python in the latter cases.
I wonder why userpython can no longer be evaluated indirectly after simply assigning the variable explicitly, and how I can make it right.
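For context, ${!var} is bash's indirect expansion: it uses the value of var as the name of another variable and expands that. A minimal illustration of the difference between the two assignments (variable names are taken from the script above; everything else is just for demonstration):
user=ytu
YTU_PYTHON=/home/ytu/anaconda3/bin/python
userpython=${user^^}_PYTHON    # stores the variable *name* "YTU_PYTHON"
echo "${!userpython}"          # indirect expansion prints /home/ytu/anaconda3/bin/python
userpython=$YTU_PYTHON         # stores the *value*, i.e. the path itself,
                               # so there is no variable name left to look up indirectly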

Unbound variable with bash script

I'm becoming desperate while debugging my script. I used some constructions recommended to me by my senior colleague, and I don't know how to make it work properly.
#!/bin/bash -x
set -ueo pipefail
exec &>/tmp/dq.log
source ${BASH_SOURCE%/*}/env-prd.sh
times=${2:-1}
sleep=${3:-1}
name="all-dq_hourly"
fs_lock_file="/tmp/mwa/jobs/prd-${name}.lock"
( flock -n 200
log="/var/log/mwa/prd/$(date +%Y-%m-%d)__${name}.log"
for i in $(seq 1 $times); do
if [[ ! -f /tmp/stop ]]; then
couple commands
fi
sleep $sleep
done
) 200>"$fs_lock_file" | tee -a $log
rm $fs_lock_file
From the execution trace, I can see there is an issue with an unbound variable at the tee -a $log part; the couple commands themselves get executed all right. I tried to use backticks in the log path, but to no benefit. I suspect the same issue with fs_lock_file, but I haven't fixed the logging yet.
Can somebody open my eyes and tell me what I'm missing? I'm not able to make the script log to the specified path.
You are assigning the variable log inside a subshell ( ... ). That variable is not bound outside that subshell, so the tee -a $log at the end of the pipeline, which is expanded outside the subshell, sees it as unset.
In this case it is probably best to just set log outside the subshell, i.e. move the variable assignment before the subshell block.
More generally in similar cases, you could try replacing the subshell parentheses with curly braces (group command syntax), { ...; }.
Group commands are executed in the current shell. Note that, in contrast to the subshell syntax, the list inside the braces must be terminated by a newline or semicolon; see Compound Commands and Lists in the bash(1) manpage.
And as a general best practice, setting variables, especially constants, at the beginning of a script or function helps avoid this kind of bug.
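A minimal sketch of the first suggestion, keeping the rest of the script as posted (env-prd.sh, the lock path and the placeholder commands are from the question):
#!/bin/bash -x
set -ueo pipefail
exec &>/tmp/dq.log
source ${BASH_SOURCE%/*}/env-prd.sh
times=${2:-1}
sleep=${3:-1}
name="all-dq_hourly"
fs_lock_file="/tmp/mwa/jobs/prd-${name}.lock"
# Assign log in the current shell, before the subshell,
# so the tee at the end of the pipeline can see it under set -u.
log="/var/log/mwa/prd/$(date +%Y-%m-%d)__${name}.log"
( flock -n 200
  for i in $(seq 1 "$times"); do
    if [[ ! -f /tmp/stop ]]; then
      : # couple commands
    fi
    sleep "$sleep"
  done
) 200>"$fs_lock_file" | tee -a "$log"
rm "$fs_lock_file"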

Echoing an environment variable returns the string literal rather than the environment variable's value

I have two bash scripts. The first listens to a pipe "myfifo" for input and executes the input as a command:
fifo_name="myfifo"
[ -p $fifo_name ] || mkfifo $fifo_name;
while true
do
if read line; then
$line
fi
done <"$fifo_name"
The second passes a command 'echo $SET_VAR' to the "myfifo" pipe:
command='echo $SET_VAR'
command_to_pass="echo $command"
$command_to_pass > myfifo
As you can see, I want to pass 'echo $SET_VAR' through the pipe. In the listener process, I've set a $SET_VAR environment variable. I expect the output of the command 'echo $SET_VAR' to be 'var_value', which is the value of the environment variable SET_VAR.
Running the first (the listener) script in one bash process and then passing a command via the second in another process gives the following result:
$SET_VAR
I expected "var_value" to be printed. Instead, the string literal $SET_VAR is printed. Why is this the case?
Before I get to the problem you're reporting, I have to point out that your loop won't work. The while true part (without a break somewhere in the loop) will run forever. It'll read the first line from the file, loop, try to read a second line (which fails), loop again, try to read a third line (also fails), loop again, try to read a fourth line, etc... You want the loop to exit as soon as the read command fails, so use this:
while read line
do
# something I'll get to
done <"$fifo_name"
The other problem you're having is that the shell expands variables (i.e. replaces $var with the value of the variable var) partway through the process of parsing a command line, and when it's done that it doesn't go back and re-do the earlier parsing steps. In particular, if the variable's value included something like $SET_VAR it doesn't go back and expand that, since it's just finished the bit where it expands variables. In fact, the only thing it does with the expanded value is split it into "words" (based on whitespace), and expand any filename wildcards it finds -- no variable expansions happen, no quote or escape interpretation, etc.
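A quick illustration of that single-pass expansion (a standalone sketch, not part of either script):
SET_VAR=var_value
cmd='echo $SET_VAR'
$cmd    # word-splits into: echo  $SET_VAR  -- and prints the literal text $SET_VAR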
One possible solution is to tell the shell to run the parsing process twice, with the eval command:
while read line
do
eval "$line"
done <"$fifo_name"
(Note that I used double-quotes around "$line" -- this prevents the word splitting and wildcard expansion I mentioned from happening before eval goes through the normal parsing process. If you think of your original code half-parsing the command in $line, without double-quotes it gets one and a half-parsed, which is weird. Double-quotes suppress that half-parsing stage, so the contents of the variable get parsed exactly once.)
However, this solution comes with a big warning, because eval has a well-deserved reputation as a bug magnet. eval makes it easy to do complex things without quite understanding what's going on, which means you tend to get scripts that work great in testing, then fail incomprehensibly later. And in my experience, when eval looks like the best solution, it probably means you're trying to solve the wrong problem.
So, what're you actually trying to do? If you're just trying to execute the lines coming from the fifo as shell commands, then you can use bash "$fifo_name" to run them in a subshell, or source "$fifo_name" to run them in the current shell.
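For example, a listener along those lines might look like this (a sketch that assumes every line written to the fifo is trusted shell code):
fifo_name="myfifo"
[ -p "$fifo_name" ] || mkfifo "$fifo_name"
while true
do
    # Each 'source' reads and runs commands from the fifo until the writer
    # closes it, then the loop reopens the fifo and waits for the next writer.
    source "$fifo_name"
done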
BTW, the script that feeds the fifo:
command='echo $SET_VAR'
command_to_pass="echo $command"
$command_to_pass > myfifo
is also a disaster waiting to happen. Putting commands in variables doesn't work very well in the shell (I second chepner's recommendation of BashFAQ #50: I'm trying to put a command in a variable, but the complex cases always fail!), and putting a command that prints another command in a variable is just begging for trouble.
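If the goal is simply to deliver the literal line echo $SET_VAR into the fifo, a plain write avoids the command-in-a-variable problem entirely (a sketch):
# Write the literal command text into the fifo; the listener is the one that expands $SET_VAR.
printf '%s\n' 'echo $SET_VAR' > myfifo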
bash, by its nature, reads commands from stdin. You can simply run:
bash < myfifo

Can't check checksum in bash --- apparently, wrong syntax?

I have this bit of code that is supposed to call reload if the current file ($1) has changed:
thehash="`cksum $1`"
while true
do
curhash="`cksum $1`"
if "$curhash" -ne "$thehash"; then
reload
fi
...
done
tl;dr: it doesn't work.
Since I am not very experienced with bash, I can't figure out what I did wrong. I get this error:
58003725 834183 main.pdf: command not found
Apparently, bash is trying to execute curhash? How do I fix this?
You need brackets around your condition in the if, or you need to use the test command, so it should be:
if [[ "$curhash" != "$thehash" ]]; then
and note that -ne is for integer comparison, while != is for string comparison.
Without [[ or test, the variable still gets expanded, but the result is then taken as the command to run, which is why bash was trying to execute the output of cksum: the content of curhash was being treated as a command name.
Also, as Sundeep mentioned, the generally preferred way to capture the output of a command is $(...) instead of backticks. Here is a good answer talking about that.
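Putting both fixes together, the loop might look like this (a sketch; reload and the elided part of the loop are from the question, and updating thehash is an extra suggestion so reload fires once per change rather than on every iteration afterwards):
thehash="$(cksum "$1")"
while true
do
    curhash="$(cksum "$1")"
    if [[ "$curhash" != "$thehash" ]]; then
        reload
        thehash="$curhash"   # remember the new checksum
    fi
    : # ... rest of the loop as in the question
done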

Trouble Passing Parameter into Simple ShellScript (command not found)

I am trying to write a simple shell script that prints out the first parameter if there is one, and prints "none" if there isn't. The script is called test.sh:
if [$1 = ""]
then
echo "none"
else
echo $1
fi
If I run the script without a parameter, everything works. However, if I run the command source test.sh -test, I get the error -bash: [test: command not found before the script continues on and correctly echoes test. What am I doing wrong?
You need spaces before/after the '[' and ']' characters, i.e.
if [ "$1" = "" ] ; then
#---^---------^ here
echo "none"
else
echo "$1"
fi
And you need to wrap your reference (really, all references) to $1 in quotes, as edited above.
After you fix that, you may also need to give a relative path to your script, i.e.
source ./test.sh -test
#------^^--- there
When you get a shell error message as you have here, it almost always helps to turn on shell debugging with set -vx before the lines that are causing trouble, or very near the top of your script. Then you can see each line/block of code that is being executed, and the values of the variables that the shell is using.
I hope this helps.
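For instance, a debugging version of the corrected script might look like this (a sketch; the set -vx/+vx lines are the only change from the fixed script above):
set -vx    # -v echoes each line as it is read, -x echoes each expanded command
if [ "$1" = "" ] ; then
    echo "none"
else
    echo "$1"
fi
set +vx    # turn the tracing back off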
