So I am writing a script that curls a site I've written, which returns a string, and compares that string against a value saved in a file. I am having trouble getting my conditional statement to return true for matching values.
Here is a snippet of my bash code:
var1=$(curl -s -w %{http_code}\\n www.somesite.com)
var2=$(cat somefile)
if [[ "$var1" = "$var2" ]]; then
    echo "TRUE"
else
    echo "FALSE"
fi
I have manually looked at both strings and they seem to be identical. I've run wc on both with all applicable options. I've copied and pasted both values into Notepad++ and done a find for the expected string, and it reported that both values matched.
Obviously, if I manually put the values into the condition it returns true, so I know it's not my conditional statement.
My guess is there is some kind of hidden character on the end of the curl output ('\r' or '\n' or something I'm unaware of), or maybe at the end of the file. Is this a known issue when comparing curl output with file content?
This may be a dumb question, but for the life of me I cannot get these strings to match dynamically instead of hardcoding the values into the condition.
Any suggestions would be great. Thanks
One way to check for a stray carriage return is diff -b, which ignores differences in whitespace (including a trailing \r):
$ diff -q -b <(echo $'abc\ndef') <(echo $'abc\r\ndef') > /dev/null ; echo $?
0
$ diff -q -b <(echo $'abc\ndef') <(echo $'abc\r\nde') > /dev/null ; echo $?
1
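If a stray carriage return is indeed the culprit, one fix is to strip it from both values before comparing. A minimal sketch, using a stand-in literal value in place of the real curl output and file content:

```shell
#!/bin/bash
# Simulate a fetched value carrying a trailing CR, as curl output from a
# Windows-served site often does; note $(...) already strips trailing newlines.
var1=$'expected-value\r'
var2='expected-value'

# Remove any carriage returns from both values before comparing
var1=${var1//$'\r'/}
var2=${var2//$'\r'/}

if [[ "$var1" == "$var2" ]]; then
    echo "TRUE"
else
    echo "FALSE"
fi
# prints: TRUE
```

The same `${var//$'\r'/}` expansion can be applied directly to the question's var1 and var2 after the curl and cat assignments.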
I have this shell script; basically I need to print every <sst> result, but only when <pr> is found.
I probably have some syntax error, because when I run the script I get the message "Display all possibilities"; basically the grep does not work.
Could you please help me understand what the problem is here?
declare -a arr=(
"123"
"345"
)
for i in "${arr[@]}"
do
    echo "$i"
    if [grep -q "<pr>$i</pr>" ./archiv]
    then
        grep -r "<sst>" ./archiv
    fi
done
There is very likely no command named [grep. Drop the [
if grep -q "<pr>$i</pr>" ./archiv; then ...
[ is not, and has never been, a part of the shell grammar. It is a command, just like echo or test or grep. The exit status returned by that command determines whether or not the clause of the if statement is executed.
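Put together, the corrected loop might look like this (assuming the same ./archiv file as in the question; -r is unnecessary on a single file, so plain grep is used for the second search):

```shell
#!/bin/bash
declare -a arr=(
    "123"
    "345"
)
for i in "${arr[@]}"; do
    echo "$i"
    # grep's own exit status drives the if; no [ ... ] needed
    if grep -q "<pr>$i</pr>" ./archiv; then
        grep "<sst>" ./archiv
    fi
done
```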
I am trying to implement bash script that is reading from error log file and comparing strings with exceptions.
I am trying to compare it with if
error="[*] Text Text # level 4: 'Some text' [parent = 'Not found'] "
exception="'Not found'"
if [[ "${error}" == *"${exception}"* ]]; then
echo "Yes it contains!"
fi
In this case I would expect the script to print "Yes it contains!", but it doesn't work as I expected. It is also true that my logs contain special characters; does anyone know how I should handle that in the comparison?
For me the if also works on its own, but I might have something wrong in my nested loop. I am running the script with the following process.
I have file with errors called mylogfile.txt:
[*] Text Text # level 4: 'Some text' [parent = 'Not found']
Then I have another file where I have exceptions inserted exception.txt:
'Not found'
I do a loop over both files to see if I find anything:
while IFS='' read -r line || [ -n "$line" ]; do
    exception="$line"
    while IFS='' read -r line || [ -n "$line" ]; do
        err="$line"
        if [[ "${err}" == *"${exception}"* ]]; then
            echo "Yes it contains!"
        fi
    done < "mylogfile.txt"
done < "exception.txt"
I don't see anything wrong with your script, and it works when I run it.
That said, looping over files line by line in a shell script is a code smell. Sometimes it's necessary, but often you can trick some command or another into doing the hard work for you. When searching through files, think grep. In this case, you can actually get rid of both loops with a single grep!
$ grep -f exception.txt mylogfile.txt
[*] Text Text # level 4: 'Some text' [parent = 'Not found']
To use it in an if statement, add -q to suppress its normal output and just check the exit code:
if grep -qf exception.txt mylogfile.txt; then
echo "Yes, it's contained!"
fi
From the grep(1) man page:
-f FILE, --file=FILE
Obtain patterns from FILE, one per line. The empty file contains zero patterns, and therefore matches nothing.
-q, --quiet, --silent
Quiet; do not write anything to standard output. Exit immediately with zero status if any match is found, even if an error was detected.
Use grep if you want an exact match:
if grep -q "$exception" <<< "$error"; then
echo "Yes it contains!"
fi
Use the -i switch to ignore case.
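For example, an exception in a different case would still match (adding -F here is my own precaution so that regex metacharacters in a pattern are taken literally):

```shell
#!/bin/bash
error="[*] Text Text # level 4: 'Some text' [parent = 'NOT FOUND'] "
exception="'Not found'"
# -q suppresses output, -i ignores case, -F treats the pattern as a fixed string
if grep -qiF "$exception" <<< "$error"; then
    echo "Yes it contains!"
fi
# prints: Yes it contains!
```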
I have a list of names in a file (say name.txt), with one name per line.
name.txt
babu
praveen
kamal
sneha
A name will be passed as the runtime argument $1 to my bash script.
Now I have to check whether the given name is in my list or not.
If it's not there, I will print a message saying the name is invalid and exit. Can you help me with this?
I have tried this with
if [[ "$1" != "babu" && "$1" != "praveen" && "$1" != "kamal" ... ]]; then
exit
fi
but this doesn't look good professionally.
Is there any other simple and decent way to achieve this?
I guess you could use grep:
if ! grep -qFxf name.txt <<<"$1"; then
exit
fi
-q is "quiet mode" - the exit status indicates a match
-F matches fixed strings, rather than regular expressions
-x is a full line match, so names that are substrings won't match
-f means that the list of patterns are contained within a file
The string contained within the first argument is passed to grep using a here string. This is a bash feature, so if you're using another shell, you could go with something like if ! echo "$1" | grep -qFxf name.txt instead (but the bash way is preferred as it saves invoking a subprocess).
If you want to ensure that any error output from grep is suppressed, you can add 2>/dev/null to the command, as suggested in the comments. This prevents any messages sent to standard error from being displayed.
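A quick demonstration with the name list from the question (the sample name and the echoed messages are my own stand-ins):

```shell
#!/bin/bash
printf 'babu\npraveen\nkamal\nsneha\n' > name.txt
name="praveen"   # stands in for "$1"
# Every line of name.txt is a pattern; -Fx demands a whole-line, literal match
if ! grep -qFxf name.txt <<< "$name"; then
    echo "invalid name"
    exit 1
fi
echo "valid name"
# prints: valid name
```

With name="pra" instead, -x prevents the substring from matching, and the script prints "invalid name" and exits.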
I'm trying to use an if statement with grep to check whether a string exists in some files. The grep statement works by itself, but when I run it as part of the if statement the output is:
line 6: [: too many arguments
My Code:
#!/bin/bash
if [ $(grep -c "OutOfMemory" /my/path/to/domains/*/*/subdomains/*/logs/*.*) -ne 0 ]
then
    echo "String found"
else
    echo "String not found"
fi
I tried using a shorter path, but it didn't help.
Any suggestion will help.
Thank you,
The problem is that your grep -c does not produce the correct output.
e.g, you could get multiple files:
$ grep -c "OutOfMemory" /my/path/to/domains/*/*/subdomains/*/logs/*.*
/my/path/to/domains/a/b/subdomains/c/logs/my.log:1
/my/path/to/domains/a/b/subdomains/c/logs/another.log:2
Your if statement cannot handle the multiple lines returned by grep, so it fails with too many arguments.
If you want to see if there is any file containing the string "OutOfMemory", do this instead:
if grep -q "OutOfMemory" /my/path/to/domains/*/*/subdomains/*/logs/*.*
then
...
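Filled in with the messages from the question (the path is the question's own and needs adapting to your system), the whole test might read:

```shell
#!/bin/bash
# grep -q exits 0 as soon as any file contains the string;
# its exit status drives the if directly, so no [ ... ] or -c counting is needed
if grep -q "OutOfMemory" /my/path/to/domains/*/*/subdomains/*/logs/*.*
then
    echo "String found"
else
    echo "String not found"
fi
```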
I'm running into an issue where my argument list for echo is too long. I'd like some ideas on how to get around this issue, or at least test for the condition so I can handle it properly and it won't kill my script.
for file in `cat filelist`; do
    PROTOCOLS1=`egrep -i 'rsh|rsync|remsh' "$file" | egrep -v '^[ | ]*#'`
    FIELDS=`echo $PROTOCOLS1|wc -l`
    if [[ $FIELDS -gt 1024 ]]; then
        echo $file >> $debuglog
    else
        set -A myarray $PROTOCOLS1
        do stuff.....
    fi
done
So the problem is that when my arg list for echo is too long, $FIELDS is set to null, and thus my test for $FIELDS -gt 1024 is always true and the error does not get caught.
Problem is when it goes to the array it's obviously too big and I get a subscript out of range error and my script exits.
Any ideas are greatly appreciated.
Edit 9/18
OK so the problem is a little more basic.
myserver-v1> echo $variable
myserver-v1> /usr/bin/echo: too many args
I want to test for this in my script
I tried the following, which works, but I get all this crap on stdout, which fills up my debug log and is annoying:
echo $variable
if [[ $? -ne 0 ]]; then
write to error log
fi
Is there a way to test echo $variable without sending it to stdout?
I tried the following, but neither seemed to work, so I'm at a loss here.
[[ ! `echo $variable` ]]
[[ `echo $variable` ]]
If you keep the unquoted variable $PROTOCOLS1 in the echo, you could simplify life by replacing:
FIELDS=`echo $PROTOCOLS1|wc -l`
with
FIELDS=1
This is because when you echo $PROTOCOLS1 without any quotes around it, you will only have one (possibly very long) line of output. Alternatively, you can use:
FIELDS=$(echo "$PROTOCOLS1" | wc -l)
where the double quotes will preserve the newlines in the value of PROTOCOLS1 (but it gets you back to the 'argument list too long' problem).
So, you need to think about using:
FIELDS=$(egrep -i 'rsh|rsync|remsh' "$file" | egrep -c -v '^[ | ]*#')
which gets the second egrep to do the line counting for you. Obviously, since the later portion of the script uses $PROTOCOLS1, you will need to re-evaluate the egreps to get the data, but you should think about whether your processing scheme is appropriate. If you are running into a string value that is too long, you are probably not doing the processing in the best way. What the alternatives are depends on what you are trying to do, and the question does not reveal that. It might be appropriate to do the extra processing with a scripting language such as Perl, Python or Ruby.
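If you do need a silent test on the variable itself, note that shell builtins are not subject to the kernel's argument-length limit, so printf (a builtin in both ksh and bash) can replace the external /usr/bin/echo. A sketch with the question's variable names and some sample data of my own:

```shell
#!/bin/bash
file="somefile"                                          # one entry from filelist
printf 'rsh host1\nrsync host2\n# comment\n' > "$file"   # hypothetical sample data

# printf is a shell builtin, so ARG_MAX does not apply;
# quoting "$PROTOCOLS1" preserves its embedded newlines for the line count
PROTOCOLS1=$(egrep -i 'rsh|rsync|remsh' "$file" | egrep -v '^[ | ]*#')
FIELDS=$(printf '%s\n' "$PROTOCOLS1" | wc -l)

if [[ $FIELDS -gt 1024 ]]; then
    echo "$file" >> "$debuglog"
fi
```

With the sample data above, FIELDS is 2, so nothing is written to the debug log.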