Bash unintentionally splitting command

I have an rsync command that fails once every ~15 minutes due to poor network conditions. I wrote a script to rerun the rsync; however, the script does not work as intended because bash is unintentionally breaking up the command I pass in:
$ cat exit-trap.sh
#!/bin/bash
count=1
while :
do
echo ==============
echo Run \#$count
$@
if [[ $? -eq 0 ]] ; then
exit
fi
echo Run \#$count failed
let count++
sleep 15
done
$ ./exit-trap.sh rsync --output-format="# %i %n%L" source::dir target
==============
Run #1
Unexpected remote arg: source::dir
rsync error: syntax or usage error (code 1) at main.c(1348) [sender=3.1.1]
After poking around for a while, I guess what rsync received in argv is `["rsync", "--output-format=#", "%i", "%n%L", "source::dir", "target"]`. The output format is apparently being split unintentionally into individual pieces, causing a syntax error. Is there a way to fix this issue?
PS: So far I've also tried sh -c $@, sh -c \"$@\", and
./exit-trap.sh rsync --output-format=\"# %i %n%L\" source::dir target
./exit-trap.sh rsync --output-format=\\\"# %i %n%L\\\" source::dir target
./exit-trap.sh "rsync --output-format=\"# %i %n%L\" source::dir target"
None of these works.

You need to use "$@" as described here https://www.gnu.org/software/bash/manual/html_node/Special-Parameters.html#Special-Parameters:
($@) Expands to the positional parameters, starting from one. When the expansion occurs within double quotes, each parameter expands to a separate word. That is, "$@" is equivalent to "$1" "$2" ….
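For example, a minimal fixed version of the retry wrapper (same logic as the question's script, only the expansion is quoted):
#!/bin/bash
count=1
while :
do
    echo ==============
    echo Run \#$count
    "$@"                        # quoted: each original argument stays one word
    if [[ $? -eq 0 ]] ; then
        exit
    fi
    echo Run \#$count failed
    let count++
    sleep 15
done
With that change, ./exit-trap.sh rsync --output-format="# %i %n%L" source::dir target hands rsync a single --output-format argument instead of three separate words.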

Related

Unexpected EOF in conditional construct in makefile

I have the following target in my makefile
omp: main_omp.c omp_impl.o
if [[ ! -e ../bin/ ]]; then mkdir ../bin/ fi
gcc $(CFLAGS) ... # compilation et cetera
Executing make omp in the same directory causes make to terminate with the following error:
if [[ ! -e ../bin ]]; then mkdir ../bin fi
/bin/sh: 1: Syntax error: end of file unexpected (expecting "fi")
make: *** [makefile:10: omp] Error 2
Executing the if ... fi statement in the terminal works as intended. I have tried different combinations of double quotes, splitting it into different lines, etc., and nothing works.
How do I fix this problem? Why is make running into an EOF over here?
You state:
Executing the if ... fi statement in the terminal works as intended.
I doubt that. If I cut-and-paste your example, I get a continuation prompt from the shell:
if [[ ! -e ../bin/ ]]; then mkdir ../bin/ fi
>
And that is logical. Your shell (either via the prompt or via make) sees that you want to execute mkdir with two arguments ../bin and fi. The solution is of course to make sure that the shell sees the fi as the next "command". To do that, you need to add a ; before the fi.
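Applied to the rule in the question, the recipe might look like this (a sketch; recipe lines must start with a tab, and [ ] is used instead of [[ ]] because make runs recipes with /bin/sh, which need not be bash):
omp: main_omp.c omp_impl.o
	if [ ! -e ../bin/ ]; then mkdir ../bin/; fi
	gcc $(CFLAGS) ... # compilation et cetera
Alternatively, mkdir -p ../bin/ creates the directory only if it is missing and drops the if entirely.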

How can I use getopts in a script that appends lines from files in a separate directory to a new file?

I am trying to write a bash script that takes in a directory, reads each file in the directory, and then appends the first line of each file in that directory to a new file. When I hard-code the variables in my script, it works fine.
This works:
#!/bin/bash
rm /local/SomePath/multigene.firstline.btab
touch /local/SomePath/multigene.firstline.btab
btabdir=/local/SomePath/test/*
outfile=/local/SomePath/multigene.firstline.btab
for f in $btabdir
do
head -1 $f >> $outfile
done
This does not work:
#!/bin/bash
while getopts ":d:o:" opt; do
case ${opt} in
d) btabdir=$OPTARG;;
o) outfile=$OPTARG;;
esac
done
rm $outfile
touch $outfile
for f in $btabdir
do
head -1 $f >> $outfile
done
Here is how I call the script:
bash /local/SomePath/Scripts/btab.besthits.wBp-q_wBm-r.sh -d /local/SomePath/test/* -o /local/SomePath/out.test/multigene.firstline.btab
And here is what I get when I run it:
rm: missing operand
Try 'rm --help' for more information.
touch: missing file operand
Try 'touch --help' for more information.
/local/SomePath/Scripts/btab.besthits.wBp-q_wBm-r.sh: line 23: $outfile: ambiguous redirect
Any suggestions? I'd like to be able to use getopts so I can make the script more generic. Thanks!
You have to pay extra attention to quoting and globbing when writing bash scripts.
When you call the script with a glob (* here), it gets expanded and split into words by your shell. This happens before your script even gets executed.
If you, for example, run cat *.txt, cat will get all the .txt files in the directory as its arguments. It is the same as calling cat afile.txt nextfile.txt (and so on). cat will never see the asterisk.
In your script it means that the input -d /local/SomePath/test/* gets expanded to something like /local/SomePath/test/someFile /local/SomePath/test/someOtherFile /local/SomePath/test/someThirdFile.
As a result, getopts assigns only the first matched file after -d to $btabdir, stops at the next non-option word, and the -o option never gets handled in the case switch.
I suggest you start by quoting every variable, preferably in the "${name}" style, and only invoking the script with quoted input.
It might also be better to pass in a directory path, test that it is a directory (test -d), and change your for loop to for f in "${btabdir}"/*, as in the sketch below.
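A minimal sketch of that approach, keeping the question's option letters and variable names:
#!/bin/bash
while getopts ":d:o:" opt; do
    case "${opt}" in
        d) btabdir="${OPTARG}" ;;
        o) outfile="${OPTARG}" ;;
    esac
done
if [ ! -d "${btabdir}" ]; then
    echo "Not a directory: ${btabdir}" >&2
    exit 1
fi
rm -f "${outfile}"
touch "${outfile}"
for f in "${btabdir}"/*
do
    head -1 "$f" >> "${outfile}"
done
Invoke it with a plain directory rather than a glob, e.g. bash btab.besthits.wBp-q_wBm-r.sh -d /local/SomePath/test -o /local/SomePath/out.test/multigene.firstline.btab.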
This also works:
head -n1 -q /local/SomePath/test/* >> /local/SomePath/out.test/multigene.firstline.btab
I think the right answer here is "don't do it that way." :-)
The reason your current script isn't working may be that the wildcard is expanded by your interactive shell, not by your script. Try running your command with an echo at the beginning of the line for a hint at what's really happening. Once getopts sees the second of the matched files in the glob, it stops processing options, so -o never gets read and $outfile remains unset. And since you don't quote your variable in rm $outfile, it's as if you're running rm with no arguments at all. Test the difference in your shell between rm alone and rm "".
Also, what happens to your for loop if there's a space in a filename? Since you have bash, you have arrays. And arrays are much better for processing lists of files.
Perhaps use something like this instead:
#!/bin/bash
# initialize an array
files=()
while getopts :d:o: opt; do
case "$opt" in
d)
if [[ ! -d "$OPTARG" ]]; then
printf 'ERROR: not a directory: %s\n' "$OPTARG" >&2
exit 65
fi
# add to the array
files+=( "$OPTARG"/* )
;;
o) outfile="$OPTARG" ;;
*)
printf 'ERROR: unknown option: %s\n' "$opt" >&2
exit 64
;;
esac
done
if ! { rm -f "$outfile" && touch "$outfile"; }; then
printf 'ERROR: cannot create %s\n' "$outfile" >&2
exit 73
fi
for f in "${files[@]}"; do
read -r < "$f"
printf '%s\n' "$REPLY"
done > "$outfile"
Here are some highlights of the changes....
We're using arrays, of course. The array ${files[@]} will contain one file per element, without relying on whitespace, so with proper quoting you'll avoid problems with special characters in filenames.
We test for more error conditions, and actually show errors and exit if we see them. (The exit values are sysexits.)
Instead of using head, we use read and a single redirect to $outfile. This saves multiple forks to an external program, and multiple fopen() calls to your output file.
Note that the argument to -d should be a directory, not a glob. And you can specify options multiple times. Multiple -d options will be added together, but only the last -o option will be used.
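For illustration, a hypothetical invocation of the rewritten script (paths borrowed from the question; the second -d directory is made up):
bash /local/SomePath/Scripts/btab.besthits.wBp-q_wBm-r.sh -d /local/SomePath/test -o /local/SomePath/out.test/multigene.firstline.btab
bash /local/SomePath/Scripts/btab.besthits.wBp-q_wBm-r.sh -d /local/SomePath/test -d /local/SomePath/test2 -o /local/SomePath/out.test/multigene.firstline.btab
Note that each -d takes a directory, not a glob; the script appends /* itself, and the matched files from every -d accumulate in the files array.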

Rules of executing shell command in Makefile

When I executed the command make, I got this error message:
Makefile:4: *** missing separator. Stop.
The command in the Makefile is:
$(shell ./makejce common/jce jce)
What's wrong with it?
-------makejce---------
#!/bin/bash
FLAGS=""
local_protoc=""
dir0=`pwd`
dir=`pwd`
......
if [ $# -gt 1 ]
then
mkdir -p $2
cd $2
dir=`pwd`
cd $dir0
fi
cd $1
jce_dir=`pwd`
#sub dir
for d in `ls -d */`
do
if [ -d $d ]
then
cd $d
for f in `find . -name '*.jce'`
do
${local_protoc} ${FLAGS} --dir=${dir} $f
done
cd $jce_dir
fi
done
#current dir
for f in `ls *.jce`
do
${local_protoc} ${FLAGS} --dir=${dir} $f
done
cd $dir0
-----makefile------
......
$(shell ./makejce common/jce jce)
......
With so little info it looks extremely bizarre (why are you running all the build steps in a shell script, then invoking that script with the make shell function? The entire point of a makefile is to manage the build steps...), but without more information I'll just answer your specific question:
The make shell function works like backticks or $(...) in shell scripts: that is, it runs the command in a shell and expands to the stdout of the command.
In your makefile if you have:
$(shell echo hi)
then it runs the shell command echo hi and expands to the stdout (i.e., hi). Then make will attempt to interpret that as some makefile text, because that's where you have put the function invocation (on a line all by itself). That's a syntax error because make doesn't know what to do with the string hi.
If you want to use the shell function anyway, then either (a) redirect the command's output so it doesn't expand to anything:
$(shell ...command... >/dev/null 2>&1)
or (b) capture the output somewhere that it won't bother make, such as in a variable like this:
_dummy := $(shell ...command...)
(by using := here we ensure the shell function is evaluated when the makefile is parsed).
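For example, a minimal sketch of option (b) applied to the makefile in the question:
# run the script when the makefile is parsed; its stdout is stored in the
# variable instead of being re-read as makefile syntax
_dummy := $(shell ./makejce common/jce jce)
If the script really is a build step, an ordinary rule (hypothetical target name) that other targets depend on is usually the better fit:
jce:
	./makejce common/jce jce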

LOCAL_DIR variable prepends the scripts current directory (totally not what I expect)

Consider the following simple rsync script I am trying to slap up:
#!/bin/bash
PROJECT="$1"
USER=stef
LOCAL_DIR="~/drupal-files/"
REMOTE_HOST="hostname.com"
REMOTE_PROJECTS_PATH=""
# Should not have anything to change below
PROJECT_LIST="proj1 proj2 proj3 quit"
echo "/nSelect project you wish to rsync\n\n"
select PROJECT in $PROJECT_LIST
do
if [ "$PROJECT" = "quit" ]; then
echo
echo "Quitting $0"
echo
exit
fi
echo "Rsynching $PROJECT from $REMOTE_HOST into" $LOCAL_DIR$PROJECT
rsync -avzrvP $USER@$REMOTE_HOST:/var/projects/$PROJECT/ $LOCAL_DIR$PROJECT
done
echo "Rsync complete."
exit;
The variable $LOCAL_DIR$PROJECT used in the rsync command always includes the script's path:
OUTPUT:
Rsynching casa from hostname.com.com into ~/drupal-files/casa
opening connection using: ssh -l stef hostname.com rsync --server --sender -vvlogDtprz e.iLsf . /var/groupe_tva/casa/
receiving incremental file list
rsync: mkdir "/home/stef/bin/~/drupal-files/proj1" failed: No such file or directory (2)
rsync error: error in file IO (code 11) at main.c(605) [Receiver=3.0.9]
The mkdir line should not contain /home/stef/bin; why is bash prepending the directory the script was run from to the variable?
Thanks
LOCAL_DIR="~/drupal-files/"
The ~ is inside double quotes, so there is no tilde expansion and the variable contains the literal string ~/drupal-files/.
Remove the quotes.
$ x="~/test"; echo $x
~/test
$ x=~/test; echo $x
/home/user/test
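If you would rather keep the double quotes (for example to protect spaces elsewhere in the path), note that $HOME expands inside double quotes even though ~ does not:
$ LOCAL_DIR="$HOME/drupal-files/"; echo "$LOCAL_DIR"
/home/user/drupal-files/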

bash: passing entire command (with arguments) to a function

I am essentially trying to implement a function which asserts the failure (non-zero exit code) of another command, and prints a message when it fails.
Here is my function:
function assert_fail () {
COMMAND=$@
if [ `$COMMAND; echo $?` -ne 0 ]; then
echo "$COMMAND failed as expected."
else
echo "$COMMAND didn't fail"
fi
}
# This works as expected
assert_fail rm nonexistent
# This works too
assert_fail rm nonexistent nonexistent2
# This one doesn't work
assert_fail rm -f nonexistent
As soon as I add options to the command, it doesn't work. Here is the output of the above:
rm: cannot remove `nonexistent': No such file or directory
rm nonexistent failed as expected.
rm: cannot remove `nonexistent': No such file or directory
rm: cannot remove `nonexistent2': No such file or directory
rm nonexistent nonexistent2 failed as expected.
rm -f nonexistent didn't fail
I have tried putting double quotes around the commands, to no avail. I would expect the third invocation in the above to produce similar output to the other two.
I appreciate any/all help!
@rici correctly pointed out the issue you're seeing, but there are a couple of real problems with your wrapper function.
First, it doesn't correctly preserve spaces (and some other funny characters) in arguments. COMMAND=$@ (or COMMAND="$@") merges all of the arguments into a single string, losing the distinction between spaces between arguments and spaces within arguments. To keep them straight, either use "$@" directly without storing it in a variable, or store it as an array (COMMAND=("$@")) and execute it as "${COMMAND[@]}".
Second, if the command prints anything to stdout, it'll wreak havoc with your exit status check; just test it directly, as @chepner said. Here's my suggested rewrite:
function assert_fail () {
if "$#"; then
echo "$* didn't fail"
else
echo "$* failed as expected."
fi
}
Note that the way I did the echo commands does lose the distinction of spaces within arguments. If that's a problem, replace the echo commands with this:
printf "%q " "$#"
echo "didn't fail"
and
printf "%q " "$#"
echo "failed as expected."
rm -f never fails on non-existent files. It has nothing to do with your wrapper. See man rm:
OPTIONS
-f, --force
ignore nonexistent files, never prompt
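You can confirm this directly in a shell (the exact wording below is GNU rm's):
$ rm -f no-such-file; echo $?
0
$ rm no-such-file; echo $?
rm: cannot remove 'no-such-file': No such file or directory
1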
