Compare strings if contains in bash

I am trying to implement a bash script that reads from an error log file and compares each string with a list of exceptions.
I am trying to compare them with if:
error="[*] Text Text # level 4: 'Some text' [parent = 'Not found'] "
exception="'Not found'"
if [[ "${error}" == *"${exception}"* ]]; then
echo "Yes it contains!"
fi
In this case I would expect the script to print "Yes it contains!", but it doesn't work as I expected. It is also true that my logs contain special characters; does anyone know how I should handle those in the comparison?
The if itself also works for me, so I might have something wrong in my nested loop. I am running the script as follows.
I have file with errors called mylogfile.txt:
[*] Text Text # level 4: 'Some text' [parent = 'Not found']
Then I have another file where I have exceptions inserted exception.txt:
'Not found'
I do a loop over both files to see if I find anything:
while IFS='' read -r line || [ -n "$line" ]; do
    exception="$line"
    while IFS='' read -r line || [ -n "$line" ]; do
        err="$line"
        if [[ "${err}" == *"${exception}"* ]]; then
            echo "Yes it contains!"
        fi
    done < "mylogfile.txt"
done < "exception.txt"

I don't see anything wrong with your script, and it works when I run it.
That said, looping over files line by line in a shell script is a code smell. Sometimes it's necessary, but often you can trick some command or another into doing the hard work for you. When searching through files, think grep. In this case, you can actually get rid of both loops with a single grep!
$ grep -f exception.txt mylogfile.txt
[*] Text Text # level 4: 'Some text' [parent = 'Not found']
To use it in an if statement, add -q to suppress its normal output and just check the exit code:
if grep -qf exception.txt mylogfile.txt; then
    echo "Yes, it's contained!"
fi
From the grep(1) man page:
-f FILE, --file=FILE
Obtain patterns from FILE, one per line. The empty file contains zero patterns, and therefore matches nothing.
-q, --quiet, --silent
Quiet; do not write anything to standard output. Exit immediately with zero status if any match is found, even if an error was detected.

Use grep if you want an exact match:
if grep -q "$exception" <<< "$error"; then
    echo "Yes it contains!"
fi
Use the -i switch to ignore case.
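Note that if the exception text contains regex metacharacters (like the [*] and [ ] in the question's log line), plain grep will interpret them as a pattern. A small sketch using -F, which makes grep treat the pattern as a fixed string (the sample strings are adapted from the question):

```shell
#!/usr/bin/env bash
# Sample strings adapted from the question
error="[*] Text Text # level 4: 'Some text' [parent = 'Not found']"
exception="[parent = 'Not found']"

# -F matches a fixed string, so the brackets are taken literally
# instead of being parsed as regex bracket expressions
if grep -qF "$exception" <<< "$error"; then
    echo "Yes it contains!"
fi
```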

Related

Check if command error contains a substring

I have a lot of bash commands. Some of them fail for different reasons.
I want to check if some of my errors contain a substring.
Here's an example:
#!/bin/bash
if [[ $(cp nosuchfile /foobar) =~ "No such file" ]]; then
    echo "File does not exist. Please check your files and try again."
else
    echo "No match"
fi
When I run it, the error is printed to screen and I get "No match":
$ ./myscript
cp: cannot stat 'nosuchfile': No such file or directory
No match
Instead, I wanted the error to be captured and match my condition:
$ ./myscript
File does not exist. Please check your files and try again.
How do I correctly match against the error message?
P.S. I've found some solution, what do you think about this?
out=`cp file1 file2 2>&1`
if [[ $out =~ "No such file" ]]; then
echo "File does not exist. Please check your files and try again."
elif [[ $out =~ "omitting directory" ]]; then
echo "You have specified a directory instead of a file"
fi
I'd do it like this
# Make sure we always get error messages in the same language
# regardless of what the user has specified.
export LC_ALL=C
case $(cp file1 file2 2>&1) in
    # or use backticks; double quoting the case argument is not necessary,
    # but you can do it if you wish
    # (it won't get split or glob-expanded in either case)
    *"No such file"*)
        echo >&2 "File does not exist. Please check your files and try again."
        ;;
    *"omitting directory"*)
        echo >&2 "You have specified a directory instead of a file"
        ;;
esac
This'll work with any POSIX shell too, which might come in handy if you ever decide to
convert your bash scripts to POSIX shell (dash is quite a bit faster than bash).
You need the first 2>&1 redirection because executables normally write information that is not primarily meant for further machine processing to stderr.
You should use the >&2 redirections with the echos because what you're outputting there fits into that category.
PSkocik's answer is the correct one for when you need to check for a specific string in an error message. However, if you came looking for ways to detect errors:
I want to check whether or not a command failed
Check the exit code instead of the error messages:
if cp nosuchfile /foobar
then
    echo "The copy was successful."
else
    ret="$?"
    echo "The copy failed with exit code $ret"
fi
I want to differentiate different kinds of failures
Before looking for substrings, check the exit code documentation for your command. For example, man wget lists:
EXIT STATUS
Wget may return one of several error codes if it encounters problems.
0 No problems occurred.
1 Generic error code.
2 Parse error---for instance, when parsing command-line options
3 File I/O error.
(...)
in which case you can check it directly:
wget "$url"
case "$?" in
0) echo "No problem!";;
6) echo "Incorrect password, try again";;
*) echo "Some other error occurred :(" ;;
esac
Not all commands are this disciplined in their exit status, so you may need to check for substrings instead.
Both examples:
out=`cp file1 file2 2>&1`
and
case $(cp file1 file2 2>&1) in
have the same issue: they mix stderr and stdout into one output that can then be examined. The problem arises when you try a complex command with interactive output, e.g. top or ddrescue, and you need to preserve stdout untouched and examine only stderr.
To avoid this issue you can try the following (works only in bash > v4.2!):
shopt -s lastpipe
declare errmsg_variable="errmsg_variable UNSET"
command 3>&1 1>&2 2>&3 | read errmsg_variable
if [[ "$errmsg_variable" == *"substring to find"* ]]; then
#commands to execute only when error occurs and specific substring find in stderr
fi
Explanation
This line
command 3>&1 1>&2 2>&3 | read errmsg_variable
redirects stderr into errmsg_variable (using a file-descriptor swap and a pipe) without mixing it with stdout. Normally, pipes spawn their own sub-processes, and after a piped command finishes, assignments made inside the pipeline are not visible in the main process, so examining them in the rest of the code cannot work. To prevent this, you have to change the standard shell behavior with:
shopt -s lastpipe
which executes the last pipeline segment of a command in the current process, so:
| read errmsg_variable
assigns the content "pumped" into the pipe (in our case, the error message) to a variable that resides in the main process. Now you can examine this variable in the rest of the code to find a specific substring:
if [[ "$errmsg_variable" == *"substring to find"* ]]; then
#commands to execute only when error occurs and specific substring find in stderr
fi
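If bash 4.2 is not available, a command substitution with a file-descriptor swap achieves the same stderr-only capture while stdout passes through untouched. A sketch with a made-up mycmd standing in for the real command:

```shell
#!/usr/bin/env bash
# mycmd is a stand-in for any command that writes to both streams
mycmd() {
    echo "regular stdout"
    echo "error: substring to find" >&2
}

# Inside the substitution: stderr goes into the captured stream,
# while stdout escapes through fd 3 back to the original stdout
{ errmsg=$(mycmd 2>&1 1>&3); } 3>&1

if [[ "$errmsg" == *"substring to find"* ]]; then
    echo "stderr matched"
fi
```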

grep, else print message for no matches

In a bash script, I have a list of lines in a file I wish to grep and then display on standard out, which is easiest done with a while read:
grep "regex" "filepath" | while read line; do
printf "$line\n"
done
However, I would like to inform the user if no lines were matched by the grep. I know that one can do this by updating a variable inside the loop but it seems like a much more elegant approach (if possible) would be to try to read a line in an until loop, and if there were no output, an error message could be displayed.
This was my first attempt:
grep "regex" "filepath" | until [[ -z ${read line} ]]; do
if [[ -z $input ]]; then
printf "No matches found\n"
break
fi
printf "$line\n"
done
But in this instance the read command is malformed, and I wasn't sure of another way the phrase the query. Is this approach possible, and if not, is there a more suitable solution to the problem?
You don't need a loop at all if you simply want to display a message when there's no match. Instead you can use grep's return code. A simple if statement will suffice:
if ! grep "regex" "filepath"; then
    echo "no match" >&2
fi
This will display the results of grep matches (since that's grep's default behavior), and will display the error message if it doesn't.
A popular alternative to if ! is to use the || operator. foo || bar can be read as "do foo or else do bar", or "if not foo then bar".
grep "regex" "filepath" || echo "no match" >&2
John Kugelman's answer is the correct and succinct one and you should accept it. I am addressing your question about syntax here just for completeness.
You cannot use ${read line} to execute read -- the brace syntax actually means (vaguely) that you want the value of a variable whose name contains a space. Perhaps you were shooting for $(read line) but really, the proper way to write your until loop would be more along the lines of
grep "regex" "filepath" | until read line; [[ -z "$line" ]]; do
... but of course, when there is no output, the pipeline will receive no lines, so while and until are both wrong here.
It is worth emphasizing that the reason you need a separate do is that you can have multiple commands in there. Even something like
while output=$(grep "regex" "filepath"); echo "grep done, please wait ...";
    count=$(echo "$output" | wc -l); [[ $count -gt 0 ]]
do ...
although again, that is much more arcane than you would ever really need. (And in this particular case, you would probably actually want if, not while.)
As others already noted, there is no reason to use a loop like that here, but I wanted to sort out the question about how to write a loop like this for whenever you actually do want one.
As mentioned by #jordanm, there is no need for a loop in the use case you mentioned.
output=$(grep "regex" "file")
if [[ -n $output ]]; then
    echo "$output"
else
    echo "Sorry, no results..."
fi
If you need to iterate over the results for processing (rather than just displaying to stdout) then you can do something like this:
output=$(grep "regex" "file")
if [[ -n $output ]]; then
    while IFS= read -r line; do
        # do something with $line
    done <<< "$output"
else
    echo "Sorry, no results..."
fi
This method avoids using a pipeline or subshell so that any variable assignments made within the loop will be available to the rest of the script.
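The subshell issue is easy to demonstrate. In the sketch below, the pipeline's loop body runs in a subshell, so its assignment vanishes, while the here-string version keeps it:

```shell
#!/usr/bin/env bash
# Pipeline: the while loop runs in a subshell, so the
# incremented count is lost when the pipeline ends
count=0
printf 'a\nb\n' | while IFS= read -r line; do
    count=$((count + 1))
done
echo "after pipeline: $count"

# Here-string: the loop runs in the current shell,
# so count keeps its value afterwards
count=0
while IFS= read -r line; do
    count=$((count + 1))
done <<< $'a\nb'
echo "after here-string: $count"
```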
Also, I'm not sure if this relates to what you are trying to do at all, but grep does have the ability to load patterns from a file (one per line). It is invoked as follows:
grep -f pattern_file.txt search_target

How to redirect output of command to diff

I am trying to write a loop, and this doesn't work:
for t in `ls $TESTS_PATH1/cmd*.in` ; do
    diff $t.out <($parser_test `cat $t`)
    # Print result
    if [[ $? -eq 0 ]] ; then
        printf "$t ** TEST PASSED **"
    else
        printf "$t ** TEST FAILED **"
    fi
done
This also doesn't help:
$parser_test `cat $t` | $DIFF $t.out -
Diff shows that the output differs (strangely, I see the expected error line printed to stdout rather than being captured by diff), but when I run it with a temporary file, everything works fine:
for t in `ls $TESTS_PATH1/cmd*.in` ; do
    # Compare output with template
    $parser_test `cat $t` 1> $TMP_FILE 2> $TMP_FILE
    diff $TMP_FILE $t.out
    # Print result
    if [[ $? -eq 0 ]] ; then
        printf "$t $CGREEN** TEST PASSED **$CBLACK"
    else
        printf "$t $CRED** TEST FAILED **$CBLACK"
    fi
done
I must avoid using a temporary file. Why doesn't the first loop work, and how can I fix it?
Thank you.
P.S. The *.in files contain erroneous command-line parameters for the program, and the *.out files contain the error messages that the program must print for those parameters.
First, to your error, you need to redirect standard error:
diff $t.out <($parser_test `cat $t` 2>&1)
Second, to all the other problems you may not be aware of:
don't use ls with a for loop (it has numerous problems, such as unexpected behavior in filenames containing spaces); instead, use: for t in $TESTS_PATH1/cmd*.in; do
to support file names with spaces, quote your variable expansion: "$t" instead of $t
don't use backquotes; they are deprecated in favor of $(command)
don't use a command substitution with cat to feed a single file; instead, just run: $parser_test <$t
use either [[ $? == 0 ]] (new syntax) or [ $? -eq 0 ] (old syntax)
if you use printf instead of echo, don't forget that you need to add \n at the end of the line manually
never use 1> $TMP_FILE 2> $TMP_FILE - this just overwrites stdout with stderr in a non-predictable manner. If you want to combine standard out and standard error, use: 1>$TMP_FILE 2>&1
by convention, ALL_CAPS names are used for/by environment variables. In-script variable names are recommended to be no_caps.
you don't need to use $? right after executing a command, it's redundant. Instead, you can directly run: if command; then ...
After fixing all that, your script would look like this:
for t in $tests_path1/cmd*.in; do
    if diff "$t.out" <($parser_test <"$t" 2>&1); then
        echo "$t ** TEST PASSED **"
    else
        echo "$t ** TEST FAILED **"
    fi
done
If you don't care for the actual output of diff, you can add >/dev/null right after diff to silence it.
Third, if I understand correctly, your file names are of the form foo.in and foo.out, and not foo.in and foo.in.out (like the script above expects). If this is true, you need to change the diff line to this:
diff "${t/.in}.out" <($parser_test <"$t" 2>&1)
In your second test you are capturing standard error, but in the first one (and the pipe example) stderr remains uncaptured; perhaps that's the "diff" (pun intended).
You can probably add 2>&1 in the proper place to combine the stderr and stdout streams, e.g.:
diff $t.out <($parser_test `cat $t` 2>&1)
Not to mention, you don't say what "doesn't work" means: does it find no difference, or does it exit with an error message? Please clarify in case you need more info.

Read last line of file in bash script when reading file line by line [duplicate]

This question already has answers here:
Read a file line by line assigning the value to a variable [duplicate]
(10 answers)
Closed 3 years ago.
I am writing a script to read commands from a file and execute a specific command. I want my script to work both for single input arguments and for an argument that is a filename containing the arguments in question.
My code below works except for one problem, it ignores the last line of the file. So, if the file were as follows.
file.txt
file1
file2
The script posted below only runs the command for file1; the last line, file2, is ignored.
for currentJob in "$@"
do
    if [[ "$currentJob" != *.* ]] # single file input arg
    then
        echo $currentJob
        serverJobName="$( tr '[A-Z]' '[a-z]' <<< "$currentJob" )" # change to lowercase
        # run cURL job
        curl -o "$currentJob"PisaInterfaces.xml http://www.ebi.ac.uk/msd-srv/pisa/cgi-bin/interfaces.pisa?"$serverJobName"
    else # file with a list of pdbs
        echo "Reading "$currentJob
        while read line; do
            echo "-"$line
            serverJobName="$( tr '[A-Z]' '[a-z]' <<< "$line" )"
            curl -o "$line"PisaInterfaces.xml http://www.ebi.ac.uk/msd-srv/pisa/cgi-bin/interfaces.pisa?"$serverJobName"
        done < "$currentJob"
    fi
done
There is, of course, the obvious workaround of repeating the loop body once more after the while loop to handle the last file, but this is not desirable: any changes I make inside the while loop must then be repeated outside it as well. I have searched around online and could not find anyone asking this precise question. I am sure it is out there, but I have not found it.
The output I get is as follows.
>testScript.sh file.txt
Reading file.txt
-file1
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 642k 0 642k 0 0 349k 0 --:--:-- 0:00:01 --:--:-- 492k
My bash version is 3.2.48
It sounds like your input file is missing the newline character after its last line. When read encounters this, it sets the variable ($line in this case) to the last line, but then returns an end-of-file error, so the loop doesn't execute for that last line. To work around this, you can make the loop execute if read succeeds or if it read anything into $line:
...
while read line || [[ -n "$line" ]]; do
...
EDIT: the || in the while condition is what's known as a short-circuit boolean -- it tries the first command (read line), and if that succeeds it skips the second ([[ -n "$line" ]]) and goes through the loop (basically, as long as the read succeeds, it runs just like your original script). If the read fails, it checks the second command ([[ -n "$line" ]]) -- if read read anything into $line before hitting the end of file (i.e. if there was an unterminated last line in the file), this'll succeed, so the while condition as a whole is considered to have succeeded, and the loop runs one more time.
After that last unterminated line is processed, it'll run the test again. This time, the read will fail (it's still at the end of file), and since read didn't read anything into $line this time the [[ -n "$line" ]] test will also fail, so the while condition as a whole fails and the loop terminates.
EDIT2: The [[ ]] is a bash conditional expression -- it's not a regular command, but it can be used in place of one. Its primary purpose is to succeed or fail, based on the condition inside it. In this case, the -n test means succeed if the operand ("$line") is NONempty. There's a list of other test conditions here, as well as in the man page for the test command.
Note that a conditional expression in [[ ... ]] is subtly different from a test command [ ... ]; see BashFAQ #031 for differences and a list of available tests. And they're both different from an arithmetic expression with (( ... )), and completely different from a subshell with ( ... )...
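The behavior described above is easy to reproduce with a file whose last line lacks a trailing newline (a throwaway temp file here):

```shell
#!/usr/bin/env bash
# Create a file whose last line has no trailing newline
tmp=$(mktemp)
printf 'file1\nfile2' > "$tmp"

count=0
while read -r line || [[ -n "$line" ]]; do
    count=$((count + 1))
    echo "read: $line"
done < "$tmp"

echo "lines: $count"
rm -f "$tmp"
```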
Your problem seems to be a missing newline at the end of your file.
If you cat your file, you should see the last line appearing immediately before the prompt.
Otherwise try adding:
echo "Reading "$currentJob
echo >> $currentJob # append a newline
while read line; do
to force the last line of the file to be terminated.
Using grep with while loop:
while IFS= read -r line; do
    echo "$line"
done < <(grep "" file)
Using grep . instead of grep "" will skip the empty lines.
Note:
Using IFS= keeps any line indentation intact.
You should almost always use the -r option with read.
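A quick sketch of the difference between the two grep patterns, using a throwaway temp file with one empty line:

```shell
#!/usr/bin/env bash
tmp=$(mktemp)
printf 'a\n\nb\n' > "$tmp"   # three lines; the middle one is empty

with_empty=$(grep "" "$tmp" | wc -l)    # the empty pattern matches every line
without_empty=$(grep . "$tmp" | wc -l)  # . requires at least one character

echo "grep \"\" kept $with_empty lines"
echo "grep .  kept $without_empty lines"
rm -f "$tmp"
```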
I found some code that reads a file including the last line, and works without the [[ ]] command:
http://www.unix.com/shell-programming-and-scripting/161645-read-file-using-while-loop-not-reading-last-line.html
DONE=false
until $DONE; do
    read s || DONE=true
    # your code
done < FILE

shell script working fine on one server but not on another

the following script is working fine on one server but on the other it gives an error
#!/bin/bash
processLine(){
    line="$@" # get the complete line, which is the full script path
    name_of_file=$(basename "$line" ".php") # separate the file name from the path, excluding the extension
    ps aux | grep -v grep | grep -q "$line" || ( nohup php -f "$line" > /var/log/iphorex/$name_of_file.log & )
}
FILE=""
if [ "$1" == "" ]; then
FILE="/var/www/iphorex/live/infi_script.txt"
else
FILE="$1"
# make sure file exist and readable
if [ ! -f $FILE ]; then
echo "$FILE : does not exists. Script will terminate now."
exit 1
elif [ ! -r $FILE ]; then
echo "$FILE: can not be read. Script will terminate now."
exit 2
fi
fi
# read $FILE using file descriptors
# $IFS is a shell variable known as the internal field separator;
# its default varies from version to version
# set the loop separator to end of line
BACKUPIFS=$IFS
# use a temp variable so that $IFS can be restored later
IFS=$(echo -en "\n")
exec 3<&0
exec 0<"$FILE"
while read -r line
do
    # use the $line variable to process the line in the processLine() function
    processLine $line
done
exec 0<&3
# restore $IFS which was used to determine what the field separators are
IFS=$BACKUPIFS
exit 0
I am just trying to read a file containing the paths of various scripts, check whether those scripts are already running, and run them if they are not. The file /var/www/iphorex/live/infi_script.txt is definitely present. I get the following error on my Amazon server:
[: 24: unexpected operator
infinity.sh: 32: cannot open : No such file
Thanks for your help in advance.
You should just initialize file with
FILE=${1:-/var/www/iphorex/live/infi_script.txt}
and then skip the existence check. If the file
does not exist or is not readable, the exec 0< will
fail with a reasonable error message (there's no point
in you trying to guess what the error message will be,
just let the shell report the error.)
I think the problem is that the shell on the failing server
does not like "==" in the equality test. (Many implementations
of test only accept one '=', but I thought even older bash
had a builtin that accepted two '==' so I might be way off base.)
I would simply eliminate your lines from FILE="" down to
the end of the existence check and replace them with the
assignment above, letting the shell's standard default
mechanism work for you.
Note that if you do eliminate the existence check, you'll want
to either add
set -e
near the top of the script, or add a check on the exec:
exec 0<"$FILE" || exit 1
so that the script does not continue if the file is not usable.
For bash (and ksh and others), you want [[ "$x" == "$y" ]] with double brackets. That uses the built-in expression handling. A single bracket calls out to the test executable which is probably barfing on the ==.
Also, you can use [[ -z "$x" ]] to test for zero-length strings, instead of comparing to the empty string. See "CONDITIONAL EXPRESSIONS" in your bash manual.
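A minimal sketch of the difference (variable names are made up):

```shell
#!/usr/bin/env bash
x="hello"

# [[ ]] is a bash keyword: == is supported, and word splitting
# does not apply inside it
if [[ $x == hello ]]; then
    echo "double brackets: match"
fi

# [ ] is the POSIX test command: use a single = for portability,
# since some test implementations reject ==
if [ "$x" = "hello" ]; then
    echo "single bracket: match"
fi

# -z succeeds for an empty string, with no comparison against ""
y=""
if [[ -z $y ]]; then
    echo "y is empty"
fi
```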

Resources