Bash if statement not working properly - bash

I have a bash statement to test a command line argument. If the argument passed to the script is "clean", then the script removes all .o files. Otherwise, it builds a program. However, no matter what is passed (if anything), the script still thinks that the argument "clean" is being passed.
#!/bin/bash
if test "`whoami`" != "root" ; then
echo "You must be logged in as root to build (for loopback mounting)"
echo "Enter 'su' or 'sudo bash' to switch to root"
exit
fi
ARG=$1
if [ $ARG == "clean" ] ; then
echo ">>> cleaning up object files..."
rm -r src/*.o
echo ">>> done. "
echo ">>> Press enter to continue..."
read
else
#Builds program
fi

Answer for first version of question
In bash, spaces are important. Replace:
[ $ARG=="clean" ]
With:
[ "$ARG" = "clean" ]
bash interprets $ARG=="clean" as a single string. If a single string is placed in a test statement, test returns false if the string is empty and true if it is non-empty. $ARG=="clean" will never be empty, so [ $ARG=="clean" ] will always return true.
Second, $ARG should be quoted. Otherwise, if it is empty, the statement reduces to [ == "clean" ], which is an error ("unary operator expected").
Third, it is best practice to use lower or mixed case for your local variables. The system uses upper-case shell variables and you don't want to accidentally overwrite one of them.
Lastly, with [...], the POSIX operator for equal, in the string sense, is =. Bash will accept either = or == but = is more portable.
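Applying those points, a corrected version of the argument check from the question might look like this (a sketch; the build branch is just a placeholder echo):
#!/bin/bash
arg=$1
if [ "$arg" = "clean" ]; then
    echo ">>> cleaning up object files..."
    rm -r src/*.o
    echo ">>> done."
    echo ">>> Press enter to continue..."
    read
else
    echo ">>> building..."   # placeholder for the real build commands
fi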

First:
Every string must be double-quoted, or you will get an error when an argument is absent.
Second:
For strings, use only = or != (not ==), along with the -n and -z operators.
Third:
You may combine conditions with the -a and -o operators, but do not enclose your conditions in bare ( ) or you will get an error (parentheses are special to the shell). The operators follow precedence rules: -a binds more tightly than -o. For example
[ -n "$1" -a "$1" = '-h' -o "$1" = '--help' ] && { usage; exit 0; }
will succeed when at least one argument is passed to the script and it is -h or --help. All of the spaces are required. If your argument may contain spaces, you must double-quote it, both in the test and on the command line; otherwise you will get an error in the script, or the argument will be split into two or more words.

The == operator isn't used in test. For numbers (not strings), use the -eq or -ne operators. See man 1 test for the full description. test EXPRESSION... is equivalent to [ EXPRESSION... ]; the bracket form is just a shorter way of writing test.
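For example, a quick sketch contrasting the string and numeric operators (n is just an illustrative variable):
n=10
[ "$n" = "10" ] && echo "string comparison with ="
[ "$n" != "11" ] && echo "string inequality with !="
[ "$n" -eq 10 ] && echo "numeric comparison with -eq"
[ "$n" -ne 11 ] && echo "numeric inequality with -ne"
All four lines print their message.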

Related

sh shell double if statement

Can anyone see what I did wrong here? I keep getting the following error message: [[: not found
read INPUT
if [[ "$INPUT" -ge 1 ]] && [[ "$INPUT" -le 10 ]]; then
Do something
else
printf "Please enter a value between 1 and 10"
fi
[[ is not available in scripts which start with #!/bin/sh, or which are started with sh yourscript. Start your script with #!/bin/bash if you want to use it.
See also http://mywiki.wooledge.org/BashGuide/Practices#Choose_Your_Shell
If you are going to use bash, by the way, there's a better syntax for numeric comparisons:
if (( input >= 1 && input <= 10 )); then ...
Note that lower-case variable names are preferred for local use -- all-upper-case names are reserved for environment variables and shell builtins.
If you're not going to use bash, use the POSIX test operator:
if [ "$input" -ge 1 ] && [ "$input" -le 10 ]; then ...
Note that when using [ ] correct quoting is essential, whereas with [[ ]] it is often superfluous; also, [ ] is missing some extensions such as pattern-matching and regular-expression operators.
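As a sketch of those [[ ]] extensions (the file name here is made up):
file=notes.txt
[[ $file == *.txt ]] && echo "glob pattern matched"
[[ $file =~ ^notes\.[a-z]+$ ]] && echo "regular expression matched"
Neither of these works inside plain [ ].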
It's complicated:
First, there are three separate ways of constructing your if statement. Each way has its own unique syntax on how to join two booleans. (Actually, there are four ways since one way allows you to use list operators).
A little background...
The if command is a compound command built into the shell. The if command executes the commands following the if. If that command returns a zero value, the if statement is considered true and the then clause executes. Otherwise, if it exists, the else clause will execute. Remember, the if is just a command. You can do things like this:
if ! mv "$foo" "$bar"
then
echo "I can't move $foo to $bar"
exit 2
fi
What we need is a command to do some testing for us. If the test succeeds, that test command returns an exit code of zero. If not, it returns a non-zero exit code. Then, it could be used with the if command!
The test command (Yes, there's really one!).
The [ is an alias for the test command, which was created to allow you to test files, strings, and numbers for the if statement. (This is now a built-in command in Bash, but its roots are /bin/test and /bin/[.) These are the same:
if test "$foo" -eq "$bar"
then
...
fi
and
if [ "$foo" -eq "$bar" ]
then
...
fi
The test command (if you read the manpage) has an -a (and) operator and an -o (or) operator. You could have done:
if [ "$INPUT" -ge 1 -a "$INPUT" -le 10 ]
then
....
fi
This is a single test statement with three test parameters (-ge, -a, and -le).
Using List Operators
This isn't the only way to do a compound boolean test. The Bash shell has two list operators: && and ||. The list operators go in between two commands. If you use && and the left hand command returns a non-zero exit code, the right hand command is not executed, and the entire list returns the exit value of the left-hand command. If you use ||, and the left hand command succeeds, the right hand command is not executed, and the entire list returns a zero exit value. If the first command returns a non-zero exit value, the right-hand command is executed, and the entire list returns the exit value of the right-hand command.
That's why you can do things like this:
[ $bar -eq 0 ] || echo "Bar doesn't have a zero value!"
Since [ ... ] is just a command that returns a zero or non-zero value, we can use these list operators as part of our test:
if [ "$INPUT" -ge 1 ] && [ "$INPUT" -le 10 ]
then
...
fi
Note that this is two separate tests and are separated by a && list operator.
Bash's Special [[ compound command
In Kornshell, Zsh, and Bash, there are special compound commands for testing. These are the double square brackets. They appear to be just like the single square brackets command, but because they're compound commands, parsing is affected.
For example:
foo="This has white space"
bar="" #No value
if [ ! $foo = $bar ] # Doesn't work!
then
The shell expands $foo and $bar and the test will become:
if [ This has white space = ]
which just doesn't work. However,
if [[ $foo != $bar ]]
works fine because of special parsing rules. The double brackets allow you to use parentheses for grouping and && and || as boolean operators. Thus:
if [[ $INPUT -ge 1 && $INPUT -le 10 ]]
then
...
fi
Note that the && appears inside a single set of double square brackets. (Note there's no need for quotation marks)
Mathematical Boolean Expression
Bash has built in mathematical processing including mathematical boolean expressions. If you put something between double parentheses, Bash will evaluate it mathematically:
if (( $INPUT >= 1 && $INPUT <= 10 ))
then
...
fi
In this case, (( $INPUT >= 1 && $INPUT <= 10 )) is evaluated. If $INPUT is between 1 and 10 inclusively, the mathematical expression will evaluate as true (zero exit code), and thus the then clause will be executed.
So, you can:
Use the original test (single square brackets) command and use the -a to string together two boolean statements in a single test.
Use list operators to string together two separate test commands (single square brackets).
Use the newer compound test command (double square brackets) that now include && and || as boolean operators, so you have a single compound test.
Forget about the test command and just use mathematical evaluation (double parentheses) to evaluate boolean expressions.
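For reference, here are the four variants side by side (a sketch, using an illustrative value of input):
input=5
# 1. Single test command with the -a operator
if [ "$input" -ge 1 -a "$input" -le 10 ]; then echo "in range"; fi
# 2. Two test commands joined by the && list operator
if [ "$input" -ge 1 ] && [ "$input" -le 10 ]; then echo "in range"; fi
# 3. Bash's [[ ]] compound command
if [[ $input -ge 1 && $input -le 10 ]]; then echo "in range"; fi
# 4. Mathematical evaluation with (( ))
if (( input >= 1 && input <= 10 )); then echo "in range"; fi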
Test Constructs Can Vary by Shell
As has been mentioned in other posts, [[ is a Bash shell keyword that isn't present in the Bourne shell. You can see this from a Bash prompt with:
type '[['
[[ is a shell keyword
In a Bourne shell, you will instead get "command not found."
Be More Portable: Use the -a Test Operator
A more portable construct is to use the -a test operator to join conditions (see man test for details). For example:
if [ "$INPUT" -ge 1 -a "$INPUT" -le 10 ]; then
: # do something when both conditions are true
else
: # do something when either condition is false
fi
This will work in every Bourne-compatible shell I've ever used, and on any system that has a /bin/[ executable.

Square bracket (test condition) stuff in shell and advanced case statement

I have two questions. Let's talk about the simpler one first, and then we'll talk about the case statement.
Consider this simple if/else:
if fgrep -q '= ' sf
then
echo "blanks in file"
else
echo "no blanks"
fi
[[ `fgrep -q '= ' sf` ]] && echo "blanks there"
# rc=$?
# echo "rc is $rc"
The if condition works like a charm. I am trying to rewrite the same thing using a [[ test condition, and it does not quite work. What is wrong with my test condition?
Now the second question.
Here is what I am trying to do.
I have a case statement like this:
source "/path"
# die is a function that will output standard error and return 1
opta=false
optb=false
while getopts ":abf:" opt; do
case $opt in
a ) $optb && die "Cannot specify option a after specifying option b"
opta=true
;;
b ) $opta && die "Cannot specify option b after specifying option a"
optb=true
;;
f) ;; # Please see the note below for the f option
\?) die "Invalid option: -$OPTARG. Abort"
;;
esac
done
shift $(($OPTIND - 1))
test $# -eq 0 && die "You must supply SID"
test $# -eq 1 || die "Too many command-line arguments"
SID=$1
The f is a file option. So the first two options are incompatible if -f is used.
myshell -f /path/file1 -a 500
The above is not allowed. I can manage this part, so don't worry about it too much.
Here is the rock that I'm stumbling over.
-f accepts a file path. The file is a manual override file that has various parameters that will override the ones set using the source command. So when it comes to the f option, it should do the following:
1. Check if the file path is valid.
2. If it is valid, check if there is an "= " (an equals sign followed by a blank); if there is, quit. In other words, I don't want blank values, or values that start with a blank.
3. If both these conditions are met, then if the search-string parameter s is set in the file, there cannot be any positional parameters passed to the command.
Example of a manual override file below:
p1=v1
p2=v2
s=v3
p3=
# some parameters like p3 may be left empty; in that case the defaults from the sourced file are used if those aren't set here
In other words, the command below
myshell.ksh -f /path/file1 500
is valid only if the value of s is NOT set in file1; otherwise it should quit with an error that a positional parameter was supplied when an override value was already applied via -f file1.
4. Export all the parameters that are set in file1 as overrides of the parameters from the sourced file.
For example, the sourced file has
p1=sv1
and if p1 is set in file1, then export p1=v1 from file1.
I can manage #4. I mostly need some insight into #3; help with #1, #2, and #4 wouldn't hurt either.
Question 1
The [[...]] have no business being there. Use:
fgrep -q '= ' sf && echo "blanks there"
The above runs fgrep on sf. If fgrep indicates success, then the echo command is run.
Question 2
If /path/file1 contains a setting for the s variable and there are positional parameters on the command line, then we are supposed to report an error:
grep '^s=' /path/file1 && [ "$#" -gt 0 ] && echo "ERROR: file has s parameter set and there are positional arguments"
The above checks for two conditions to be true and, if they are both true, it prints an error message. The first condition is:
grep '^s=' /path/file1
This is true if the file /path/file1 has a line that begins with the characters s=. (^ signifies the beginning of a line.) The second condition is:
[ "$#" -gt 0 ]
This returns true if the number of positional parameters is greater than zero. If both of those conditions are true, then the echo statement above is executed.
Question 2: Alternate Approach
In this case, we suppose that the variable $filepath has the path and name of the file, such as /path/file1, which contains shell commands for setting variables. The following checks to see if that file is readable. If it is, then it sources all the commands in that file. The next line checks to see if s has been set. If it has and if there are still positional parameters, then it prints a message:
[ -r "$filepath" ] && source "$filepath" # Set all override variables
[ -n "$s" ] && [ "$#" -gt 0 ] && echo "message"
In the test ([...]) statements, note that $filepath and $s are enclosed in double-quotes. This prevents an error should either value be empty. It also prevents errors if the value of filepath contains spaces.
To source a file, it needs not merely to exist but also to be readable. Therefore, the first test above checks for readability (-r) instead of mere existence (-f).
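Putting the pieces together, the f handling from the question might look roughly like this (a sketch only; die is simplified here, and the variable name override_file is an assumption):
#!/bin/bash
die() { echo "$*" >&2; exit 1; }   # simplified stand-in for the question's die function
while getopts ":abf:" opt; do
    case $opt in
        a) opta=true ;;
        b) optb=true ;;
        f) override_file=$OPTARG
           # 1. the file path must be readable
           [ -r "$override_file" ] || die "Cannot read override file: $override_file"
           # 2. reject values that are empty or start with a blank
           grep -qE '^[^#]+=( |$)' "$override_file" && die "Blank or empty value in $override_file"
           source "$override_file"
           ;;
        \?) die "Invalid option: -$OPTARG. Abort" ;;
    esac
done
shift $((OPTIND - 1))
# 3. if the override file set s, no positional parameters are allowed
[ -n "$s" ] && [ "$#" -gt 0 ] && die "Positional parameter supplied but s is already set in $override_file"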
MORE
To check if the source file has any uncommented lines with variables set to empty values:
grep -qE '^[^#]+=$' file1 && echo "message"
In the above ^ matches the beginning of a line.
The regular expression works because [^#] matches any character that is not a hash sign. Since plus sign means one or more of the preceding character, the [^#]+ means a string of one or more characters, none of which are hash signs. Outside of square brackets, a ^ matches the start of a line. So, ^[^#]+ matches any string of non-hash characters starting at the beginning of the line. ^[^#]+= matches if those characters are then followed by an equal sign. Since $ matches the end of a line, then ^[^#]+=$ matches if the line starts with one or more non-hash characters, followed by an equal sign, followed by nothing (the end of the line). Thus, it matches if some variable has its value set to nothing.
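For instance, with a hypothetical file1 (a sketch):
$ cat file1
p1=v1
# comment lines are ignored because they start with #
s=v3
p3=
$ grep -E '^[^#]+=$' file1
p3=
$ grep -qE '^[^#]+=$' file1 && echo "message"
message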

What is the use case of noop [:] in bash?

I searched for noop in bash (:), but was not able to find any good information. What is the exact purpose or use case of this operator?
I tried following and it's working like this for me:
[mandy#root]$ a=11
[mandy#root]$ b=20
[mandy#root]$ c=30
[mandy#root]$ echo $a; : echo $b ; echo $c
11
30
Please let me know any real-world use case of this operator, or any place where it is mandatory to use it.
It's there more for historical reasons. The colon builtin : is exactly equivalent to true. It's traditional to use true when the return value is important, for example in an infinite loop:
while true; do
echo 'Going on forever'
done
It's traditional to use : when the shell syntax requires a command but you have nothing to do.
while keep_waiting; do
: # busy-wait
done
The : builtin dates all the way back to the Thompson shell; it was present in Unix v6. : was a label indicator for the Thompson shell's goto statement. The label could be any text, so : doubled as a comment indicator: if there is no goto to a given label, then : label is effectively a comment. The Bourne shell didn't have goto but kept :.
A common idiom that uses : is : ${var=VALUE}, which sets var to VALUE if it was unset and does nothing if var was already set. This construct only exists in the form of a variable substitution, and this variable substitution needs to be part of a command somehow: a no-op command serves nicely.
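A quick sketch of that idiom in action (var and default are just illustrative names):
unset var
: "${var=default}"   # var was unset, so this sets it to "default"
echo "$var"          # prints: default
var=custom
: "${var=default}"   # var is already set, so nothing changes
echo "$var"          # prints: custom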
See also What purpose does the colon builtin serve?.
I use it for if statements when I comment out all the code. For example you have a test:
if [ "$foo" != "1" ]
then
echo Success
fi
but you want to temporarily comment out everything contained within:
if [ "$foo" != "1" ]
then
#echo Success
fi
Which causes bash to give a syntax error:
line 4: syntax error near unexpected token `fi'
line 4: `fi'
Bash can't have empty blocks (WTF). So you add a no-op:
if [ "$foo" != "1" ]
then
#echo Success
:
fi
or you can use the no-op to comment out the lines:
if [ "$foo" != "1" ]
then
: echo Success
fi
If you use set -e, then || : is a great way to keep the script from exiting when a failure happens (it explicitly makes the command succeed).
You would use : to supply a command that succeeds but doesn't do anything. In this example the "verbosity" command is turned off by default, by setting it to :. The 'v' option turns it on.
#!/bin/sh
# example
verbosity=:
while getopts v OPT ; do
case $OPT in
v)
verbosity=/bin/realpath
;;
*)
echo "Cancelled" >&2; exit 1
;;
esac
done
# `$verbosity` always succeeds by default, but does nothing.
for i in * ; do
echo $i $($verbosity $i)
done
$ example
file
$ example -v
file /home/me/file
One use is as multiline comments, or to comment out part of your code for testing purposes by using it in conjunction with a here file.
: << 'EOF'
This part of the script is commented out
EOF
Don't forget to use quotes around EOF so that any code inside doesn't get evaluated, like $(foo). It also might be worth using an intuitive terminator name like NOTES, SCRATCHPAD, or TODO.
Ignoring alias arguments
Sometimes you want an alias that ignores any arguments passed to it. You can do it with : (the trailing no-op swallows the extra arguments):
> alias alert_with_args='echo hello there'
> alias alert='echo hello there;:'
> alert_with_args blabla
hello there blabla
> alert blabla
hello there
Two of mine.
Embed POD comments
A quite funky application of : is for embedding POD comments in bash scripts, so that man pages can be quickly generated. Of course, one would eventually rewrite the whole script in Perl ;-)
Run-time function binding
This is a sort of code pattern for binding functions at run-time.
For instance, have a debugging function that does something only if a certain flag is set:
#!/bin/bash
# noop-demo.sh
shopt -s expand_aliases
dbg=${DBG:-''}
function _log_dbg {
echo >&2 "[DBG] $*"
}
log_dbg_hook=':'
[ "$dbg" ] && log_dbg_hook='_log_dbg'
alias log_dbg=$log_dbg_hook
echo "Testing noop alias..."
log_dbg 'foo' 'bar'
You get:
$ ./noop-demo.sh
Testing noop alias...
$ DBG=1 ./noop-demo.sh
Testing noop alias...
[DBG] foo bar
Somewhat related to this answer, I find this no-op rather convenient to hack polyglot scripts. For example, here is a valid comment both for bash and for vimscript:
":" # this is a comment
":" # in bash, ‘:’ is a no-op and ‘#’ starts a comment line
":" # in vimscript, ‘"’ starts a comment line
Sure, we may have used true just as well, but : being a punctuation sign and not an irrelevant English word makes it clear that it is a syntax token.
As for why would someone do such a tricky thing as writing a polyglot script (besides it being cool): it proves helpful in situations where we would normally write several script files in several different languages, with file X referring to file Y.
In such a situation, combining both scripts in a single, polyglot file avoids any work in X for determining the path to Y (it is simply "$0"). More importantly, it makes it more convenient to move around or distribute the program.
A common example. There is a well-known, long-standing issue with shebangs: most systems (including Linux and Cygwin) allow only one argument to be passed to the interpreter. The following shebang:
#!/usr/bin/env interpreter --load-libA --load-libB
will fire the following command:
/usr/bin/env "interpreter --load-libA --load-libB" "/path/to/script"
and not the intended:
/usr/bin/env interpreter --load-libA --load-libB "/path/to/script"
Thus, you would end up writing a wrapper script, such as:
#!/usr/bin/env sh
/usr/bin/env interpreter --load-libA --load-libB "/path/to/script"
This is where polyglossia enters the stage.
A more specific example. I once wrote a bash script which, among other things, invoked Vim. I needed to give Vim additional setup, which could be done with the option --cmd "arbitrary vimscript command here". However, that setup was substantial, so that inlining it in a string would have been terrible (if ever possible). Hence, a better solution was to write it in extenso in some configuration file, then make Vim read that file with -S "/path/to/file". Hence I ended up with a polyglot bash/vimscript file.
Suppose you have a command you wish to chain to the success of another:
cmd="some command..."
$cmd
[ $? -eq 0 ] && some-other-command
but now you want to execute the commands conditionally and you want to show the commands that would be executed (dry-run):
cmd="some command..."
[ ! -z "$DEBUG" ] && echo $cmd
[ -z "$NOEXEC" ] && $cmd
[ $? -eq 0 ] && {
cmd="some-other-command"
[ ! -z "$DEBUG" ] && echo $cmd
[ -z "$NOEXEC" ] && $cmd
}
So if you set DEBUG and NOEXEC, the second command never shows up. This is because the first command never executes (NOEXEC is not empty), but the evaluation of that fact leaves you with a return status of 1, which means the subordinate command never executes (even though you want it to, because it's a dry run). To fix this, you can reset the exit status with a no-op:
[ -z "$NOEXEC" ] && $cmd || :
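One way to package the same idea is a small wrapper (a sketch; run is a hypothetical helper name, and returning 0 in the dry-run branch plays the role of the no-op above):
run() {
    [ -n "$DEBUG" ] && echo "+ $*"   # show the command when DEBUG is set
    [ -n "$NOEXEC" ] && return 0     # dry run: skip the command but report success
    "$@"                             # otherwise run it and return its real exit status
}
run mkdir -p /tmp/demo && run touch /tmp/demo/file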
Sometimes no-op clauses can make your code more readable.
That can be a matter of opinion, but here's an example. Let's suppose you've created a function that works by taking two unix paths. It calculates the 'change path' needed to cd from one path to another. You place a restriction on your function that the paths must both start with a '/' OR both must not.
function chgpath() {
# toC, fromC are the first characters of the argument paths.
if [[ "$toC" == / && "$fromC" == / ]] || [[ "$toC" != / && "$fromC" != / ]]
then
true # continue with function
else
return 1 # Skip function.
fi
Some developers will want to remove the no-op but that would mean negating the conditional:
function chgpath() {
# toC, fromC are the first characters of the argument paths.
if [[ "$toC" != / || "$fromC" != / ]] && [[ "$toC" == / || "$fromC" == / ]]
then
return 1 # Skip function.
fi
Now, in my opinion, it's not so clear from the if clause under which conditions you'd want to skip the function. To eliminate the no-op and do it clearly, you would want to move the if clause out of the function:
if [[ "$toC" == / && "$fromC" == / ]] || [[ "$toC" != / && "$fromC" != / ]]
then
cdPath=$(chgpath pathA pathB) # (we moved the conditional outside)
That looks better, but many times we can't do this; we want the check to be done inside the function.
So how often does this happen? Not very often. Maybe once or twice a year. It happens often enough, that you should be aware of it. I don't shy away from using it when I think it improves the readability of my code (regardless of the language).
I've also used it in scripts to define default variables.
: ${VARIABLE1:=my_default_value}
: ${VARIABLE2:=other_default_value}
call-my-script ${VARIABLE1} ${VARIABLE2}
I sometimes use it on Docker files to keep RUN commands aligned, as in:
RUN : \
&& somecommand1 \
&& somecommand2 \
&& somecommand3
For me, it reads better than:
RUN somecommand1 \
&& somecommand2 \
&& somecommand3
But this is just a matter of preference, of course
The null command : is actually considered a synonym for the shell builtin true. The : command is itself a Bash builtin, and its exit status is true (0).
$ :
$ echo $? # 0
while :
do
operation-1
operation-2
...
operation-n
done
# Same as:
while true
do
...
done
Placeholder in if/then test:
if condition
then : # Do nothing and branch ahead
else # Or else ...
take-some-action
fi
$ : ${username=`whoami`}
$ ${username=`whoami`} #Gives an error without the leading :
Source: TLDP
I used the noop today when I had to create a mock sleep function to use in bats testing framework. This allowed me to create an empty function with no side effects:
function sleep() {
:
}

Operations on boolean variables

In this question it has been shown how to use neat boolean variables in bash. Is there a way of performing logic operations with such variables? E.g. how to get this:
var1=true
var2=false
# ...do something interesting...
if ! $var1 -a $var2; then <--- doesn't work correctly
echo "do sth"
fi
This does work:
if ! $var1 && $var2; then
echo "do sth"
fi
Maybe somebody can explain why the -a and -o operators don't work here but &&, ||, and ! do?
Okay boys and girls, lesson time.
What's happening when you execute this line?
if true ; then echo 1 ; fi
What's happening here is that the if command is being executed. After that everything that happens is part of the if command.
What if does is it executes one or more commands (or rather, pipelines) and, if the return code from the last command executed was successful, it executes the commands after then until fi is reached. If the return code was not successful the then part is skipped and execution continues after fi.
if takes no switches; its behavior is not modifiable in any way.
In the example above the command I told if to execute was true. true is not syntax or a keyword, it's just another command. Try executing it by itself:
true
It will print nothing, but it sets its return code to 0 (aka "true"). You can more clearly see that it is a command by rewriting the above if statement like this:
if /bin/true ; then echo 1 ; fi
Which is entirely equivalent.
Always returning true is not very useful. It is typical to use if in conjunction with the test command. test is sometimes symlinked to or otherwise known as [. On your system you probably have a /bin/[ program, but if you're using bash, [ will be a builtin command. test is a more complex command, and you can read all about it:
help [
man [
But for now let us say that test performs some tests according to the options you supply and returns with either a successful return code (0) or an unsuccessful one. This allows us to say
if [ 1 -lt 2 ] ; then echo one is less than two ; fi
But again, this is always true, so it's not very useful. It would be more useful if 1 and 2 were variables
read -p' Enter first number: ' first
read -p' Enter second number: ' second
echo first: $first
echo second: $second
if [ $first -lt $second ] ; then
echo $first is less than $second
fi
Now you can see that test is doing its job. Here we are passing test four arguments. The second argument is -lt which is a switch telling test that the first argument and third argument should be tested to see if the first argument is less than the third argument. The fourth argument does nothing but mark the end of the command; when calling test as [ the final argument must always be ].
Before the above if statement is executed the variables are evaluated. Suppose that I had entered 20 for first and 25 for second, after evaluation the script will look like this:
read -p' Enter first number: ' first
read -p' Enter second number: ' second
echo first: 20
echo second: 25
if [ 20 -lt 25 ] ; then
echo 20 is less than 25
fi
And now you can see that when test is executed it will be testing whether 20 is less than 25, which is true, so if will execute the then statement.
Bringing it back to the question at hand: What's going on here?
var1=true
var2=false
if ! $var1 -a $var2 ; then
echo $var1 and $var2 are both true
fi
When the if command is evaluated it will become
if ! true -a false ; then
This is instructing if to execute true and passing the arguments -a false to the true command. Now, true doesn't take any switches or arguments, but it also will not produce an error if you supply them without need. This means that it will execute, return success, and the -a false part will be ignored. The ! will reverse the success into a failure, and the then part will not be executed.
If you were to replace the above with a version calling test it would still not work as desired:
var1=true
var2=false
if ! [ $var1 -a $var2 ] ; then
echo $var1 and $var2 are both true
fi
Because the if line would be evaluated to
if ! [ true -a false ] ; then
And test would see true not as a boolean keyword, and not as a command, but as a string. Since a non-empty string is treated as "true" by test it will always return success to if, even if you had said
if ! [ false -a yourmom ] ; then
Since both are non-empty strings, -a tests both as true and returns success, which is reversed by ! and passed to if, which does not execute the then statement.
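You can see this string behaviour for yourself (a quick sketch):
[ false ] && echo "a non-empty string, so test succeeds"
[ "" ] || echo "an empty string, so test fails"
[ false -a yourmom ] && echo "both strings are non-empty, so -a is satisfied"
All three messages print.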
If you replace the test version with this version
if ! $var1 && $var2 ; then
Then it will be evaluated in to
if ! true && false ; then
And it will be processed like this: if executes true, which returns success, and that success is reversed by !. Because the return code of the first command is now a failure, the && list short-circuits and false never gets executed. Because the final command executed returned a failure, failure is passed back to if, which does not execute the then clause.
I hope this is all clear.
It is perhaps worth pointing out that you can use constructs like this:
! false && true && echo 1
Which does not use if but still checks return codes, because that is what && and || are for.
There is kind of a black art to using test without making any mistakes. In general, when using bash, the newer [[ command should be used instead because it is more powerful and does away with lots of gotchas which must, for compatibility reasons, be kept in [.
Since the original poster did not supply a realistic example of what he's trying to accomplish it's hard to give any specific advice as to the best solution. Hopefully this has been sufficiently helpful that he can now figure out the correct thing to do.
You have mixed two different syntaxes here.
This will work:
if ! [ 1 -a 2 ]; then
echo "do sth"
fi
Note brackets around the expressions.
You need the test command ([ in the newer syntax) to use these operators (-a, -o and so on).
But test does not run commands itself.
If you want to check exit codes of commands you must not use test.
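In other words, to branch on a command's exit code, hand the command to if directly instead of wrapping it in test (a quick sketch using grep as the example command):
if grep -q root /etc/passwd; then
    echo "grep found a match (exit status 0)"
else
    echo "grep found nothing (non-zero exit status)"
fi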

How do I use a file grep comparison inside a bash if/else statement?

When our server comes up we need to check a file to see how the server is configured.
We want to search for the following string inside our /etc/aws/hosts.conf file:
MYSQL_ROLE=master
Then, we want to test whether that string exists and use an if/else statement to run one of two options depending on whether the string exists or not.
What is the BASH syntax for the if statement?
if [ ????? ]; then
#do one thing
else
#do another thing
fi
From grep --help, but also see man grep:
Exit status is 0 if any line was selected, 1 otherwise;
if any error occurs and -q was not given, the exit status is 2.
if grep --quiet MYSQL_ROLE=master /etc/aws/hosts.conf; then
echo exists
else
echo not found
fi
You may want to use a more specific regex, such as ^MYSQL_ROLE=master$, to avoid that string in comments, names that merely start with "master", etc.
This works because the if takes a command and runs it, and uses the return value of that command to decide how to proceed, with zero meaning true and non-zero meaning false. This is the same as how other return codes are interpreted by the shell, and the opposite of a language like C.
if takes a command and checks its return value. [ is just a command.
if grep -q ...
then
....
else
....
fi
Note that, for PIPE being any command or sequence of commands, then:
if PIPE ; then
# do one thing if PIPE returned with zero status ($?=0)
else
# do another thing if PIPE returned with non-zero status ($?!=0), e.g. error
fi
For the record, [ expr ] is a shell builtin† shorthand for test expr.
Since grep returns with status 0 in case of a match, and non-zero status in case of no matches, you can use:
if grep -lq '^MYSQL_ROLE=master' /etc/aws/hosts.conf ; then
# do one thing
else
# do another thing
fi
Note the use of -l which only cares about the file having at least one match (so that grep returns as soon as it finds one match, without needlessly continuing to parse the input file.)
†on some platforms [ expr ] is not a builtin, but an actual executable /bin/[ (whose last argument will be ]), which is why [ expr ] should contain blanks around the square brackets, and why it must be followed by one of the command list separators (;, &&, ||, |, &, newline)
just use bash
while read -r line
do
case "$line" in
*MYSQL_ROLE=master*)
echo "do your stuff";;
*) echo "doesn't exist";;
esac
done <"/etc/aws/hosts.conf"
The code sample below should work:
(echo "hello there" | grep -q "AAA") && [ $? -eq 0 ] && echo "hi" || echo "bye"
