I'm trying to execute a simple if else statement in a Makefile:
check:
if [ -z "$(APP_NAME)" ]; then \
echo "Empty" \
else \
echo "Not empty" \
fi
When I execute make check I get the following error:
if [ -z "" ]; then
/bin/bash: -c: line 1: syntax error: unexpected end of file
make: *** [check] Error 2
Any idea what I'm doing wrong?
I know I could use the following, but I have a lot of logic after the echo commands, so I need to spread it out across multiple lines:
check:
[ -z "$(PATH)" ] && echo "Empty" || echo "Not empty"
Change your make target to this (adding semicolons):
check:
if [ -z "$(APP_NAME)" ]; then \
echo "Empty"; \
else \
echo "Not empty"; \
fi
To evaluate a statement in the shell without real newlines (the newlines are consumed by the trailing backslash \), you need to terminate it with a semicolon. You cannot use real newlines in a Makefile for conditional shell-script code (see the Make-specific background below).
[ -z "$(APP_NAME)" ],
echo "Empty", and
echo "Not empty" are all statements that need to be evaluated separately (similar to pressing Enter in a terminal after typing a command).
Make-specific background
make spawns a new shell for each line of a recipe, so you cannot write true multi-line shell code the way you would in a script file.
Taking it to an extreme, the following works in a shell script file, because each newline terminates and evaluates a command (just as pressing Enter in a terminal submits the command you typed):
if
[ 0 ]
then
echo "Foo"
fi
Listing 1
If you wrote this in a Makefile, though, if would be run in its own shell, after which the condition [ 0 ] would be run in yet another shell, with no connection to the previous if.
In practice, make does not even get past the first line: the shell spawned for it receives a bare if with no condition or body, reports a syntax error, and exits with a non-zero status, so make aborts the recipe.
In other words, if two commands in a make target are completely independent of each other (no conditions whatsoever), you can perfectly well separate them with a plain newline and let each one run in its own shell.
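A quick way to observe the one-shell-per-line behavior is a hypothetical target whose two lines do depend on shared shell state (recipe lines must start with a tab):
showdir:
	cd /tmp
	pwd
Here pwd prints the directory make was started from, not /tmp, because the cd ran in a different shell.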
So, to have make evaluate multi-line conditional shell code correctly, you need to collapse the whole script onto one logical line, so that all of it is evaluated by the same shell.
Hence, for evaluating the code in Listing 1 inside a Makefile, it needs to be translated to:
if \
[ 0 ]; \
then \
echo "Foo"; \
fi
The last command, fi, does not need a backslash, because nothing after it has to run in the same shell.
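Put differently, the single backslash-continued line is conceptually the same as handing the whole conditional to one shell invocation, for example:
sh -c 'if [ 0 ]; then echo "Foo"; fi'
Everything between the quotes is evaluated by one and the same shell, so if, then and fi all see each other.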
This is about shell syntax, not about Makefiles. You need to familiarize yourself with the rules for using backslashes to join a long command onto a single line of shell.
In your example, after the backslash newline pairs are removed, it looks like this:
if [ -z "$(APP_NAME)" ]; then echo "Empty" else echo "Not empty" fi
Maybe you can now see what the issue is. The shell interprets that as:
if [ -z "$(APP_NAME)" ]; then
followed by a single long command:
echo "Empty" else echo "Not empty" fi
which would echo the text Empty else echo Not empty fi, except that because the fi is swallowed as an argument to echo, the shell never sees a fi keyword to close the if, so it reports a syntax error instead.
In shell syntax you need a semicolon (or a real newline) after each individual command, so the shell knows where one command ends and the next keyword begins:
check:
if [ -z "$(APP_NAME)" ]; then \
echo "Empty"; \
else \
echo "Not empty"; \
fi
Note the semicolons after the echo commands telling the shell that the command arguments end there.
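As a quick sanity check, invoking the fixed target might look like this (a hypothetical session; note that make also echoes the recipe itself unless you prefix the line with @):
make check                  # APP_NAME empty or unset, prints "Empty"
make check APP_NAME=myapp   # APP_NAME non-empty, prints "Not empty"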
Other answers have already pointed out that the problem is a combination of Makefile design and shell syntax. The design of Makefiles makes it really cumbersome to write complex recipes. Often it is better to rethink the process and either rewrite parts of the Makefile or move the complexity into a shell script.
Here is an example of your recipe moved into a shell script:
check:
sh check.sh "$(APP_NAME)"
and the script:
if [ -z "$1" ]; then
echo "Empty"
else
echo "Not empty"
fi
advantage: you have all the power and flexibility of a shell script without any of the Makefile awkwardness. You just need to pass the right arguments (see the sketch after this list).
disadvantage: you have additional files in your build process, and your Makefile recipe logic is spread across multiple files.
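As a hedged sketch of where that flexibility pays off, check.sh could grow extra logic without touching the Makefile; the non-zero exit on an empty APP_NAME below is my assumption about the desired behavior, not something from the original question:
#!/bin/sh
# check.sh - invoked from the Makefile as: sh check.sh "$(APP_NAME)"
if [ -z "$1" ]; then
    echo "Empty"
    exit 1    # makes the check target fail when APP_NAME is not set
else
    echo "Not empty"
    # ...more per-application logic here...
fi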
If the condition is "simple" you might use the conditional construct from make itself. In your case I would argue that it is just barely simple enough to tolerate, but with any more complexity it should go into a shell script.
Here is how to write conditional recipes using makefile features:
check:
ifdef APP_NAME
echo "Not empty"
else
echo "Empty"
endif
Again, with annotations:
check: # target name
ifdef APP_NAME # makefile conditional syntax
echo "Not empty" # recipe if the condition is true (APP_NAME is defined and non-empty)
else # makefile conditional syntax
echo "Empty" # recipe if the condition is false
endif # makefile conditional syntax
For example, if APP_NAME is defined, the rule will effectively look like this during execution:
check:
echo "Not empty"
This specific example is probably semantically equivalent to your makefile. I cannot say for sure because I did not test thoroughly.
It is important to know that this conditional is evaluated when the makefile is read, before the recipe is executed. That means variables whose values are computed later may not have the value you expect (an example follows below).
advantage: all build commands in one place.
disadvantage: headaches trying to figure out when make does variable assignment and expansion, if the conditional does not behave the way you expected.
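A hedged example of that read-time pitfall: a target-specific value is not visible to ifdef, because the conditional is resolved while the makefile is being read, before any recipe runs:
# APP_NAME is empty when the makefile is read
APP_NAME :=

# target-specific value, only visible while check's recipe runs
check: APP_NAME := myapp

check:
ifdef APP_NAME
	echo "Not empty"
else
	echo "Empty"
endif

Running make check prints "Empty": ifdef saw the empty read-time value, even though $(APP_NAME) inside the recipe would expand to myapp.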
read here for more info:
https://www.gnu.org/software/make/manual/html_node/Conditional-Example.html
https://www.gnu.org/software/make/manual/html_node/Conditional-Syntax.html
https://www.gnu.org/software/make/manual/html_node/Reading-Makefiles.html
see also
Passing arguments to "make run"
Related
I am trying to automate our application backup. Part of the process is to check the exit status of egrep in an if statement:
if [ ! -f /opt/apps/SiteScope_backup/sitescope_configuration.zip ] ||
[ egrep -i -q "error|warning|fatal|missing|critical" "$File" ]
then
echo "testing"
fi
I expected it to output testing because the file exists and egrep returns success, but instead I'm getting an error:
-bash: [: too many arguments
I tried all kinds of syntax - additional brackets, quotes, etc. - but the error persists.
Please help me in understanding where I am going wrong.
You are making the common mistake of assuming that [ is part of the if statement's syntax. It is not; the syntax of if is simply
if command; then
: # ... things which should happen if command's result code was 0
else
: # ... things which should happen otherwise
fi
One of the common commands we use is [, which is another name for the command test. It is a simple command for comparing strings, numbers, and files. It accepts a fairly narrow combination of arguments, and tends to generate confusing and misleading error messages if you don't pass it the expected arguments. (Or rather, the error messages are adequate and helpful once you get used to them, but they are easily misunderstood if you're not.)
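To make the "[ is just a command" point concrete, these two lines do exactly the same thing; the trailing ] is simply the last argument that the [ command insists on:
test -f /etc/passwd && echo "it exists"
[ -f /etc/passwd ] && echo "it exists"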
Here, you want to check the result of the command egrep:
if [ ! -f /opt/apps/SiteScope_backup/sitescope_configuration.zip ] ||
egrep -i -q "error|warning|fatal|missing|critical" "$File"
then
echo "testing"
fi
In the general case, command can be a pipeline or a list of commands; then, the exit code from the final command is the status which if will examine, similarly to how the last command in a script decides the exit status from the script.
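For example (a small sketch, not taken from the question): the if below branches on the exit status of grep -q, the last command in the pipeline:
# tr lowercases the input so the match is effectively case-insensitive
if tr '[:upper:]' '[:lower:]' < "$File" | grep -q 'error'; then
    echo "found an error line in $File"
fi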
These compound commands can be arbitrarily complex, like
if read thing
case $thing in
'' | 'quit') false;;
*) true;;
esac
then ...
but in practice, you rarely see more than a single command in the if statement (though it's not unheard of; your compound statement with || is a good example!)
Just to spell this out,
if [ egrep foo bar ]
is running [ aka test on the arguments egrep foo bar. But [ without options only accepts a single argument, and then checks whether or not that argument is the empty string. (egrep is clearly not an empty string. Quoting here is optional, but would perhaps make it easier to see:
if [ "egrep" ]; then
echo "yes, 'egrep' is not equal to ''"
fi
This is obviously silly in isolation, but should hopefully work as an illustrative example.)
The historical decision to make test a general kitchen sink for checks that the authors didn't want to build into the syntax of if is one of the less attractive designs of the original Bourne shell. Bash and zsh offer less unwieldy alternatives (like the [[ double brackets in bash), and of course, POSIX test is a lot more well-tempered than the original creation from Bell Labs.
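As a hedged illustration of the bash [[ alternative for the same kind of check (bash syntax, so only for scripts that start with #!/bin/bash):
# [[ ]] is shell syntax rather than a command, so unquoted variables are not word-split inside it
if [[ ! -f /opt/apps/SiteScope_backup/sitescope_configuration.zip ]] ||
   egrep -i -q "error|warning|fatal|missing|critical" "$File"
then
    echo "testing"
fi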
Hi, I have written a small shell script, and I am not able to understand its behavior. Can anyone help me understand this script?
Script:
#!/bin/bash
if [ -z $1 ]
then
echo "fail"
else
echo "success"
fi
When I execute the script:
./test.sh one
it executes the else branch instead of the main branch, even though I am passing an argument.
Can anyone explain this behavior to me?
The -z test in bash checks whether a string is empty (zero length).
Since you're passing an argument to the script, $1 is not empty, therefore -z $1 evaluates to false and the else portion of your script runs.
Side note: since you're working with strings, I recommend quoting the variable, as follows:
if [ -z "$1" ]; then
echo "String is empty / No argument given"
else
echo "String is not empty / Argument given"
fi
Edit:
As pointed out by user1934428, it's probably better to use [[ instead of [. Among other things, this eliminates the need for quoting. See more differences here.
if [[ -z $1 ]]; then
...
However, be aware that this is a bash extension and won't work in sh scripts.
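The quoting difference shows up as soon as the value contains whitespace; a minimal sketch:
set -- "two words"             # simulate $1 containing a space
[ -z $1 ] && echo "empty"      # error: $1 splits into two words, so [ gets arguments it cannot parse
[[ -z $1 ]] && echo "empty"    # fine: no word splitting inside [[ ]], the whole string "two words" is tested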
Example script:
#!/bin/bash
printf '1\n1\n1\n1\n' | ./script2*.sh >/dev/null 2>/dev/null
Shellcheck returns the following:
In script1.sh line 3:
printf '1\n1\n1\n1\n' | ./script2*.sh >/dev/null 2>/dev/null
^-- SC2211: This is a glob used as a command name. Was it supposed to be in ${..}, array, or is it missing quoting?
According to https://github.com/koalaman/shellcheck/wiki/SC2211, there should be no exceptions to this rule.
Specifically, it suggests "If you want to specify a command name via glob, e.g. to not hard code version in ./myprogram-*/foo, expand to array or parameters first to allow handling the cases of 0 or 2+ matches."
The reason I'm using the glob in the first place is that I append the date to (or change the date in) the name of any script I have just created or changed. Interestingly enough, when I use "bash script2*.sh" instead of "./script2*.sh" the complaint goes away.
Have I fixed the problem or I am tricking shellcheck into ignoring a problem that should not be ignored? If I am using bad bash syntax, how might I execute another script that needs to be referenced to using a glob the proper way?
The problem with this is that ./script2*.sh may expand to several names, so you could end up running
./script2-20171225.sh ./script2-20180226.sh ./script2-copy.sh
that is, the first match as the command with the remaining matches as its arguments, which is a strange and probably unintentional thing to do, especially if the script is confused by such arguments, or if you wanted your most up-to-date file to be used. Your "fix" has the same fundamental problem.
The suggestion you mention would take the form:
array=(./script2*.sh)
[ "${#array[#]}" -ne 1 ] && { echo "Multiple matches" >&2; exit 1; }
"${array[0]}"
and guard against this problem.
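If the real goal is "always run the newest script2*.sh", one hedged option (assuming the date stamp in the file name sorts lexically, as YYYYMMDD does) is to take the last element of the glob's expansion, which bash returns in sorted order:
candidates=(./script2*.sh)                  # expansion is sorted, so the newest date comes last
newest=${candidates[${#candidates[@]}-1]}   # last element of the array
[ -x "$newest" ] && "$newest"               # runs nothing if no file matched (the pattern stays literal)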
Since you appear to assume that you'll only ever have exactly one matching file to be invoked without parameters, you can turn this into a function:
runByGlob() {
    if (( $# != 1 ))
    then
        echo "Expected exactly 1 match but found $#: $*" >&2
        exit 1
    elif command -v "$1" > /dev/null 2>&1
    then
        "$1"
    else
        echo "Glob is not a valid command: $*" >&2
        exit 1
    fi
}
whatever | runByGlob ./script2*.sh
Now if you ever have zero or multiple matching files, it will abort with an error instead of potentially running the wrong file with strange arguments.
First of all, hi to everyone, that's my first post here.
I swear I have checked the site for similar questions to avoid the "double post about the same argument" issue, but none of them answered my question exactly.
The problem is that in the code below I always get the "There are no files with this extension" message when I call the script and pass it an extension as the first argument.
#!/bin/bash
if [ "$1" ];
then
file=*."$1";
if [ -f "$file" ];
then
for i in "$file";
[...do something with the each file using "$i" like echo "$i"]
else
echo "There are no files with this extension";
fi;
else
echo "You have to pass an extension"
fi;
I tried using double parentheses, using and not using quotes in the nested if, and using *."$1" directly in the if, but none of these solutions worked.
One problem is how the value gets assigned to file. In this statement:
file=*."$1";
the glob is not expanded (pathname expansion does not happen in assignments), so for example if you passed in txt on the command line, file ends up holding the literal string *.txt rather than a list of matching files, which throws off your file existence test later on.
Another problem, as @sideshowbarker points out, is that you can't use wildcards with [ -f ... ].
Another variable quoting issue is that quoting inhibits wildcard expansion, such that even without the file existence test, if $file is, e.g., *.txt, then this:
for x in "$file"; do ...
Will loop over a single argument with the literal value *.txt, while this:
for x in $file; do ...
Will loop over all files that end with a .txt extension (unless there are none, in which case it will loop once with $x set to the literal value *.txt).
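A minimal sketch that makes the difference visible (run it in a directory containing a couple of .txt files):
file="*.txt"
for x in "$file"; do printf 'quoted:   %s\n' "$x"; done    # one iteration, the literal *.txt
for x in $file; do printf 'unquoted: %s\n' "$x"; done      # one iteration per matching .txt file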
Typically, you would write your script to expect a list of arguments and allow the user to call it like myscript *.txt; that is, leave wildcard handling to the interactive shell and just let your script process a list of arguments. Then it becomes simply:
for i in "$#"; do
echo do something with this file named "$x"
done
If you really want to handle the wildcard expansion in your script, something like this might work:
#!/bin/bash
if [ "$1" ]; then
    ext="$1"
    for file in *.$ext; do
        [ -f "$file" ] || continue
        echo "$file"
    done
else
    echo "You have to pass an extension"
fi
The statement [ -f "$file" ] || continue is necessary there because of the case I mentioned earlier: if there are no matching files, the loop still executes once with the literal, unexpanded pattern *.$ext.
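A bash-specific alternative to that guard, mentioned only as a hedged aside: with nullglob enabled, a pattern that matches nothing expands to an empty list, so the loop body simply never runs:
shopt -s nullglob       # bash only: unmatched globs expand to nothing instead of staying literal
for file in *.$ext; do
    echo "$file"
done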