IF statement in script and in a single command line - bash

My script below checks whether a particular window is open in the X server and prints some info in the terminal depending on the state.
#!/bin/bash
if [[ -z $(xwininfo -tree -root | grep whatsapp | grep chromium) ]]
then
echo "IT DOES NOT EXIST";
else
echo "IT EXIST";
fi
When I try to rewrite this as a one-line terminal command, I do it like this:
if -z $(xwininfo -tree -root | grep whatsapp | grep chromium); then echo "IT DOES NOT EXIST"; else echo "IT EXIST"; fi
This returns an error and the wrong state:
bash: -z: command not found
IT EXISTS
Does anyone have any advice? I tried running it through ShellCheck, but it says everything is in order...

Correctly following the advice from http://shellcheck.net/ would have looked like the following:
if xwininfo -tree -root | grep whatsapp | grep -q chromium; then
    echo "IT EXIST"
else
    echo "IT DOES NOT EXIST"
fi
...thus, in one-liner form:
if xwininfo -tree -root | grep whatsapp | grep -q chromium; then echo "IT EXIST"; else echo "IT DOES NOT EXIST"; fi
See the wiki page for SC2143, the shellcheck warning you received.
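Note that the SC2143 rewrite flips the sense of the test: [[ -z $(... | grep ...) ]] is true when nothing matches, while a plain grep -q succeeds when something does match, which is why the branches above are in the opposite order from the original script. If you would rather keep "IT DOES NOT EXIST" in the then branch, the pipeline can simply be negated (an equivalent sketch):
if ! xwininfo -tree -root | grep whatsapp | grep -q chromium; then
    echo "IT DOES NOT EXIST"
else
    echo "IT EXIST"
fi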

I got confused by an online bash code checker which said that [[ and ]] were not needed. This worked for me:
if [[ -z $(xwininfo -tree -root | grep whatsapp | grep chromium) ]]; then chromium --app="https://web.whatsapp.com/"; fi & if [[ -z $(xwininfo -tree -root | grep skype | grep chromium) ]]; then chromium --app="https://web.skype.com/en/"; fi & if [[ -z $(xwininfo -tree -root | grep Viber) ]]; then viber; fi
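Since the same check is repeated three times, it could also be factored into a small helper. This is only a sketch: launch_if_missing is a made-up name, it collapses the two chained greps into a single pattern per call, and it backgrounds each launch much like the & chaining above.
launch_if_missing() {
    # $1 is the pattern to look for in the window tree; the remaining arguments are the command to launch
    local pattern=$1; shift
    if ! xwininfo -tree -root | grep -q "$pattern"; then
        "$@" &
    fi
}

launch_if_missing whatsapp chromium --app="https://web.whatsapp.com/"
launch_if_missing skype chromium --app="https://web.skype.com/en/"
launch_if_missing Viber viber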

Related

While loop hangs and does not find string

I have a section of code in a bash script that uses a while loop to grep a file until the string I am looking for is there, then exits. Currently, it's just hanging with the following code:
hostname="test-cust-15"
VAR1=$(/bin/grep -wo -m1 "HOST ALERT: $hostname;DOWN" /var/log/logfile)
while [ ! "$VAR1" ]
do
sleep 5
done
echo $VAR1 was found
I know the part of the script responsible for inserting this string into the logfile works, as I can grep it outside of the script and find it.
One thing I have tried is to change up the variables. Like this:
hostname="test-cust-15"
VAR1="HOST ALERT: $hostname;DOWN"
while [ ! /bin/grep "$VAR1" /var/log/logfile ]
do
sleep 5
done
echo $VAR1 was found
But I get a "binary operator expected" message, and once I got a "too many arguments" message when using this:
while [ ! /bin/grep -q -wo "$VAR1" /var/log/logfile ]
What do I need to do to fix this?
while/until can work off of the exit status of a program directly.
until /bin/grep "$VAR1" /var/log/logfile
do
sleep 5
done
echo "$VAR1" was found
You also mentioned in a comment above that it prints out the match. If that's not desirable, use output redirection, or grep's -q option.
until /bin/grep "$VAR1" /var/log/logfile >/dev/null
until /bin/grep -q "$VAR1" /var/log/logfile
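Putting that together with the variables from the question, a minimal end-to-end sketch of the polling loop (assuming the same log path) might look like this:
hostname="test-cust-15"
pattern="HOST ALERT: $hostname;DOWN"
# poll the log every 5 seconds; -q suppresses the matched line
until /bin/grep -q "$pattern" /var/log/logfile; do
    sleep 5
done
echo "$pattern was found"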
No need to bother with command substitution or test operator there. Simply:
while ! grep -wo -m1 "HOST ALERT: $hostname;DOWN" /var/log/logfile; do
sleep 5
done
Don't waste resources, use tail!
#!/bin/bash
while read line
do
echo $line
break
done < <(tail -f /tmp/logfile | grep --line-buffered "HOST ALERT")

Bash script not killing all PIDs in specified file or allowing partial names for input [duplicate]

This question already has answers here:
How to kill all processes with a given partial name? [closed]
(14 answers)
Closed 6 years ago.
Right now, my bash script works for single-PID processes and I must use an exact process name as input. It will not accept *firefox*, for example. Also, I run a bash script that opens multiple rsync processes, and I would like this script to kill all of those processes. But this script only works on processes with a single PID.
Here is the script:
#!/bin/bash
createProcfile() {
ps -eLf | grep -f process.tmp | grep -v 'grep' | awk '{print $2,$10}' | sort -u | egrep -o '[0-9]{4,}' > pid.tmp
# pgrep "$(cat process.tmp)" > pid.tmp
}
PIDFile=pid.tmp
echo "Enter a process name"
read -r process
echo "$process" > process.tmp
# node_process_id=$(pidof "$process")
node_process_id=$(ps -eLf | grep $process | grep -v 'grep' | awk '{print $2,$10}' | sort -u | egrep -o '[0-9]{4,}')
if [[ -z "$node_process_id" ]]; then
echo "Please enter a valid process."
rm process.tmp
exit 0
fi
ps -eLf | grep $process | awk '{print $2,$10}' | sort -u | grep -v 'grep'
# pgrep "$(cat process.tmp)"
echo "Would you like to kill this process(es)? (y/n)"
read -r answer
if [[ "$answer" == y ]]; then
createProcfile
pkill -F "$PIDFile"
rm "$PIDFile"
sleep 1
createProcfile
node_process_id=$(pidof "$process")
if [[ -z $node_process_id ]]; then
echo "Process terminated successfully."
rm process.tmp
exit 0
else
echo "Process not terminated. Kill process manually."
ps -eLf | grep $process | awk '{print $2,$10}' | sort -u | grep -v 'grep'
# pgrep "$(cat process.tmp)"
rm "$PIDFile"
rm process.tmp
exit 0
fi
fi
I edited the script. Thanks to your comments, it works now and does the following:
Make script accept partial name as input
Kill more than 1 PID
Thank you!
pkill exists to solve your problem. It accepts a pattern to match against the process name, or the entire command line if -f is specified.
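A few illustrative invocations (the names here are examples, not taken from the question's script):
# kill every process whose name matches rsync
pkill rsync
# match against the full command line instead of just the process name
pkill -f 'rsync --archive'
# require an exact match on the process name
pkill -x firefox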
It will not accept *firefox*
Use the killall command. Example:
killall -r "process.*"
This will kill all processes whose names start with process, followed by anything.
The manual says:
-r, --regexp
Interpret process name pattern as an extended regular expression.
Sidenote:
Note that we have to double-quote the regular expression to prevent filename globbing. (Thanks @broslow for pointing this out.)

How do I store a bash command as string for multiple substitutions?

I'm trying to clean up this script I have and this piece of code is annoying me because I know it can be more DRY:
if grep --version | grep "GNU" > /dev/null ;
then
grep -P -r -l "\x0d" $dir | grep "${fileRegex}"
else
grep -r -l "\x0d" $dir | grep "{$fileRegex}"
fi
My thoughts are to somehow conditionally set a string variable to either "grep -P" or "egrep" and then in a single line do something like:
$(cmdString) -r -l "\x0d" $dir | grep "${fileRegex}"
Or something like that but it doesn't work.
Are you worried about a host which has GNU grep but not egrep? Do such hosts exist?
If not, why not just always use egrep? (Though -P and egrep are not the same thing.)
That being said you don't use strings for this (see BashFAQ#50).
You use arrays: grepcmd=(egrep) or grepcmd=(grep -P) and then "${grepcmd[@]}" ....
You can also avoid needing perl mode entirely if you use $'\r' or similar (assuming your shell understands that quoting method).
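For example, a minimal sketch of the $'\r' variant, assuming a shell with ANSI-C quoting (bash, ksh, zsh); it removes the need to detect GNU grep at all:
# a literal carriage return in the pattern works with plain grep
grep -r -l $'\r' "$dir" | grep "${fileRegex}"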
You can do this:
if grep --version | grep "GNU" > /dev/null
then
cmdString=(grep -P)
else
cmdString=(egrep)
fi
"${cmdString[#]}" -r -l "\x0d" "$dir" | grep "{$fileRegex}"
@Etan Reisner's suggestion worked well. For those who are interested in the final code (this case is for tabs, not Windows line endings, but it is similar):
fileRegex=${1:-".*\.java"}
if grep --version | grep "GNU" > /dev/null ;
then
cmdString=(grep -P)
else
cmdString=(grep)
fi
arr=$("${cmdString[#]}" -r -l "\x09" . | grep "${fileRegex}")
if [ -n "$dryRun" ]; then
for i in $arr; do echo "$i"; done
else
for i in $arr; do expand -t 7 "$i" > /tmp/e && mv /tmp/e "$i"; done
fi
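One caveat with that final version (an observation, not from the original thread): arr is a plain string, so the for loops rely on word splitting and will misbehave on file names containing spaces. With bash 4+ the matches could be collected into an array instead, for example:
# read the matching file names into an array, one element per line
mapfile -t files < <("${cmdString[@]}" -r -l $'\t' . | grep "${fileRegex}")
for i in "${files[@]}"; do
    expand -t 7 "$i" > /tmp/e && mv /tmp/e "$i"
done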

BASH - Check that it is not running the same script

I have a script /root/data/myscript, and when I run /root/data/myscript I do not know how to determine whether an instance of it is already running. Does anyone know?
I tried
if [[ "$(pidof -x /root/data/myscript | wc -w)" > "1" ]]
then echo "This script is already running!"
fi
thank you
This should work.
if [[ "$(pgrep myscript)" ]]
then echo "This script is already running!"
fi
This could work to check whether the script is already running or not.
if [[ "$(ps -ef | grep "/root/data/myscript" | grep -v "grep")" ]] ; then
echo "This script is already running!"
fi
Try this one.
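Building on the pgrep answer above, a minimal sketch of how such a guard might sit at the top of /root/data/myscript itself (this assumes the script is executable and invoked directly, so that pgrep can see it under the name myscript):
#!/bin/bash
# more than one match means another copy is running (this instance counts as one)
if [ "$(pgrep -c -x myscript)" -gt 1 ]; then
    echo "This script is already running!"
    exit 1
fi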

Lynx is stopping loop?

I'll just apologize beforehand; this is my first ever post, so I'm sorry if I'm not specific enough, if the question has already been answered and I just didn't look hard enough, and if I use incorrect formatting of some kind.
That said, here is my issue: In bash, I am trying to create a script that will read a file that lists several dozen URL's. Once it reads each line, I need it to run a set of actions on that, the first being to use lynx to navigate to the website. However, in practice, it will run once perfectly on the first line. Lynx goes, the download works, and then the subsequent renaming and organizing of that file go through as well. But then it skips all the other lines and acts like it has finished the whole file.
I have tested to see if it was lynx causing the issue by eliminating all the other parts of the code, and then by just eliminating lynx. It works without Lynx, but, of course, I need lynx for the rest of the output to be of any use to me. Let me just post the code:
!#/bin/bash
while read line; do
echo $line
lynx -accept_all_cookies $line
echo "lynx done"
od -N 2 -h *.zip | grep "4b50"
echo "od done, if 1 starting..."
if [[ $? -eq 0 ]]
then ls *.*>>logs/zips.log
else
od -N 2 -h *.exe | grep "5a4d"
echo "if 2 starting..."
if [[ $? -eq 0 ]]
then ls *.*>>logs/exes.log
else
od -N 2 -h *.exe | grep "5a4d, 4b50"
echo "if 3 starting..."
if [[ $? -eq 1 ]]
then
ls *.*>>logs/failed.log
fi
echo "if 3 done"
fi
echo "if 2 done"
fi
echo "if 1 done..."
FILE=`(ls -tr *.* | head -1)`
NOW=$(date +"%m_%d_%Y")
echo "vars set"
mv $FILE "criticalfreepri/${FILE%%.*}(ZCH,$NOW).${FILE#*.}" -u
echo "file moved"
rm *.zip *.exe
echo "file removed"
done < "lynx"
$SHELL
Just to be sure, I do have a file called "lynx" that contains the URLs, one per line. Also, I used all those "echo"s for my own sort of debugging, but I have tried it with and without them. When I execute the script, the echos all show up...
Any help is appreciated, and thank you all so much! Hope I didn't break any rules on this post!
PS: I'm on Linux Mint running things through the "terminal" program. I'm scripting with bash in Gedit, if any of that info is relevant. Thanks!
EDIT: Actually, the echo tests repeat for all three lines. So it would appear that lynx simply can't start again in the same loop?
Here is a simplified version of the script, as requested:
!#/bin/bash
while read -r line; do
echo $line
lynx $line
echo "lynx done"
done < "ref/url"
read "lynx"
$SHELL
Note that I have changed the sites the "url" file goes to:
www.google.com
www.majorgeeks.com
http://www.sophos.com/en-us/products/free-tools/virus-removal-tool.aspx
Lynx is not designed to be used in scripts because it locks the terminal; it is an interactive console browser.
If you want to access URLs in a script use wget, for example:
wget http://www.google.com/
For exit codes see: http://www.gnu.org/software/wget/manual/html_node/Exit-Status.html
To parse the HTML content, use:
VAR=`wget -qO- http://www.google.com/`
echo $VAR
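A side note on the likely cause (an inference, not something stated in the question): an interactive program run inside a while read loop reads from the same redirected stdin as the loop and swallows the remaining URLs, which makes the loop look like it stops after one pass. A minimal sketch that feeds the URL list on a separate file descriptor, so the program keeps the terminal's stdin (assuming the simplified script's ref/url file):
#!/bin/bash
# URLs arrive on file descriptor 3, so lynx still has the terminal on stdin
while read -r line <&3; do
    echo "$line"
    lynx -accept_all_cookies "$line"
    echo "lynx done"
done 3< ref/url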
I found a way that may fulfil your requirement to run the lynx command in a loop with a different URL each time.
Use
echo `lynx $line`
(echo the lynx $line wrapped in backticks (`))
instead of lynx $line. See below:
Your code:
!#/bin/bash
while read -r line; do
echo $line
lynx $line
echo "lynx done"
done < "ref/url"
read "lynx"
$SHELL
Try this instead:
!#/bin/bash
while read -r line; do
echo $line
echo `lynx $line`
echo "lynx done"
done < "ref/url"
I should have answered this question a long time ago. I got the program working, it's now on Github!
Anyway, I simply had to wrap the loop inside a function. Something like this:
progdownload () {
printlog "attmpting download from ${URL}"
if echo "${URL}" | grep -q "http://www.majorgeeks.com/" ; then
lynx -cmd_script="${WORKINGDIR}/support/mgcmd.txt" --accept-all-cookies ${URL}
else wget ${URL}
fi
}
URL="something.com"
progdownload
