Use SED to comment out cronjobs (not that simple) - bash

I have a rather large Bash function that I'm working on. The function is a cronjob generator. The script is intended to be run with sudo privileges, and it allows the user to inspect an unprivileged user's crontab file. They can create new cronjobs (it asks a few questions and produces the proper syntax for them), and they can also remove a cronjob. That's where I've hit a wall.
In this part of my case statement, the user has been asked if they want to create a new cronjob -- they reply with "N" or "n" and we get here:
#!/bin/bash
read -r -p $'Would you like to create a new cronjob? [y/n]\n\n--> ' CRON
case "$CRON" in
  y|Y)
    echo "not pertinent to this discussion"
    ;;
  n|N)
    read -r -p $'\n\nWould you like to REMOVE a crontab entry? [y/n]: ' REMOVE
    case "$REMOVE" in
      Y|y)
        declare -a CRONTAB
        while IFS= read -r LINE
        do
          CRONTAB+=("$LINE")
        done < <(grep -v '#' /var/spool/cron/"$SCRIPTUSER")
        echo -en "\nPlease select a cronjob to remove from the Crontab file:\n\n"
        PS3=$'\n\nPlease enter your selection: '
        select LINE in "${CRONTAB[@]}"
        do
          echo "Going to remove \"$LINE\""
          read -r -p $'Is this correct? [y/n]' CHOICE
          case "$CHOICE" in
            Y|y)
              sed "s/$LINE/^#&/g" -i /var/spool/cron/"$SCRIPTUSER"
              break
              ;;
            N|n)
              break
              ;;
          esac
        done
        echo -en "\n\nCurrent Crontab entries for $SCRIPTUSER:\n\n"
        echo -en "\n\n######################################\n\n$(grep -v '#' /var/spool/cron/"$SCRIPTUSER")\n\n######################################\n\n"
        sleep 3
        break
        ;;
      N|n)
        break
        ;;
    esac
    ;;
esac
The problem I'm having is these are my cronjob entries I'm testing with:
The sed statement doesn't seem to be doing anything at all. I imagine the '*' and '/' characters are probably interfering with the sed pattern; I have already tried a version where I escaped all the '/' characters, but sed still passed over the line as if nothing were there.
I appreciate the extra set of eyeballs, thank you!
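For what it's worth, one robust way around this (a sketch, not from the original thread): don't use the crontab entry as a sed pattern at all. Look the line up literally with grep -F, then have sed address it by line number, so none of the '*' or '/' characters are ever interpreted as regex syntax. Here cronfile.txt and the sample entries are stand-ins for the real /var/spool/cron file:

```shell
#!/bin/bash
# Stand-in crontab file with metacharacter-heavy entries.
CRONFILE=cronfile.txt
printf '%s\n' '*/5 * * * * /usr/local/bin/backup.sh' \
              '0 3 * * * /usr/local/bin/report.sh' > "$CRONFILE"

LINE='*/5 * * * * /usr/local/bin/backup.sh'   # the entry the user selected

# -F: fixed string (no regex), -x: whole-line match, -n: print line number.
NUM=$(grep -nxF -- "$LINE" "$CRONFILE" | head -n1 | cut -d: -f1)

# Address the line by number; the substitution only ever touches that line.
[ -n "$NUM" ] && sed -i "${NUM}s/^/#/" "$CRONFILE"
```

After this runs, the selected entry is prefixed with # and every other line is left untouched, regardless of what characters the entry contains.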


How to properly iterate through a list using sshpass with a single ssh-login

Situation: we're feeding a list of filenames to sshpass, which iterates through a remote folder to check whether files with the given names actually exist, then builds an updated list containing only the files that do exist, which is reused later in the bash script.
Problem: The list comprises sometimes tens of thousands of files, which means tens of thousands of ssh logins, which is harming performance and sometimes getting us blocked by our own security policies.
Intended solution: instead of starting the for-loop and calling sshpass each time, do it the other way around and pass the whole loop to a single sshpass call.
I've got to pass the list to the sshpass instruction in the example test below:
#!/bin/bash
all_paths=(`/bin/cat /home/user/filenames_to_be_tested.list`)
existing_paths=()
sshpass -p PASSWORD ssh -n USER@HOST bash -c "'
for (( i=0; i<${#all_paths[@]}; i++ ))
do
  echo ${all_paths[i]}
  echo \"-->\"$i
  if [[ -f ${all_paths[i]} ]]
  then
    echo ${all_paths[i]}
    existing_paths=(${all_paths[i]})
  fi
done
'
printf '%s\n' "${existing_paths[@]}"
The issue here is that it appears to loop (you see a series of echoed lines), but in the end it is not really incrementing i and is always checking/printing the same line.
Can someone help spot the bug? Thanks!
The problem is that bash parses the string and substitutes the variables first, before anything is sent to the server. If you want to stop bash from doing that, you have to escape every variable that should instead be expanded on the server.
#! /bin/bash
all_paths=(rootfs.tar derp a)
read -sp "pass? " PASS
echo
sshpass -p $PASS ssh -n $USER@$SERVER "
files=(${all_paths[@]})
existing_paths=()
for ((i=0; i<\${#files[@]}; i++)); do
  echo -n \"\${files[\$i]} --> \$i \"
  if [[ -f \${files[\$i]} ]]; then
    echo \${files[\$i]}
    existing_paths+=(\${files[\$i]})
  else
    echo 'Not found'
  fi
done
printf '%s\n' \"\${existing_paths[@]}\"
"
This becomes hard to read very fast. However, there's an option I personally like to use: define functions locally and ship them to the server to be executed there, which lets you omit most of the escaping.
#! /bin/bash
all_paths=(rootfs.tar derp a)

function files_exist {
  local files=($@)
  local found=()
  for file in ${files[@]}; do
    echo -n "$file --> "
    if [[ -f $file ]]; then
      echo "exist"
      found+=("$file")
    else
      echo "missing"
    fi
  done
  printf '%s\n' "${found[@]}"
}

read -sp "pass? " PASS
echo
sshpass -p $PASS ssh -n $USER@$SERVER "
$(typeset -f files_exist)
files_exist ${all_paths[@]}
"
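For reference, typeset -f name prints the function's full source, which is why $(typeset -f files_exist) can be spliced straight into the remote command string. A quick local illustration (greet is a made-up example function, and bash -c stands in for the remote shell):

```shell
#!/bin/bash
greet() { echo "hello $1"; }

# typeset -f emits the complete definition as reusable source text...
DEF=$(typeset -f greet)

# ...so a fresh shell (here bash -c; over ssh it works the same way)
# can recreate the function and call it.
out=$(bash -c "$DEF; greet world")
echo "$out"   # hello world
```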

How do you add a .txt file to a shell script as a variable

Hello guys, I am trying to write a basic shell script that adds, deletes, or lists multiple user accounts from a provided list, in the form of a file specified on the command line. I am very new to this and have been banging my head on the keyboard for the last few hours. Below is an example of the syntax and the code so far. (I called this script buser.)
./buser.sh -a userlist (-a is the option and userlist is the filename, it is only an example)
file=$(< `pwd`/$2)
while :
do
  case $1 in
    -a)
      useradd -m "$file"
      break
      ;;
    --add)
      useradd -m "$file"
      break
      ;;
    --delete)
      userdel -r "$file"
      break
      ;;
    -d)
      userdel -r "$file"
      break
      ;;
    -l)
      cat /etc/passwd | grep "$file"
      break
      ;;
    --list)
      cat /etc/passwd | grep "$file"
      break
      ;;
  esac
done
When the useradd command reads $file, it reads all the names as a single line and I get an error.
Any help would be greatly appreciated, thank you.
Not sure if I understood correctly.
But assuming you have a file with the following content:
file.txt
name1
name2
name3
You would like to call buser.sh -a file.txt and run useradd for name1, name2, and name3? I'm also assuming you're on Linux and that useradd is the native program; if so, I suggest reading the man page, because useradd does not support adding a list of users at once (https://www.tecmint.com/add-users-in-linux/).
You have to call useradd multiple times instead.
while read -r user
do
  useradd -m "$user"
done < "$2"
A few simplifications, plus an error handler if the option doesn't exist:
while read file ; do
  case "$1" in
    -a|--add)
      useradd -m "$file"
      ;;
    -d|--delete)
      userdel -r "$file"
      ;;
    -l|--list)
      grep -f `pwd`/"$2" /etc/passwd
      break
      ;;
    *)
      echo "no such option as '$1'..."
      exit 2
      ;;
  esac
done < `pwd`/"$2"
Note: the above logic is a bit redundant... case "$1" keeps doing the same test (with the same result) every pass. OTOH, it works, and it's less code than a while loop in each command list.
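If the repeated test bothers you, one variation (a sketch; the option names are taken from the question, and useradd/userdel are the real system commands) resolves the option to a command a single time and then loops over the file:

```shell
#!/bin/bash
# Map the option to a command once, then apply that command per input line.
buser() {
  local opt=$1 file=$2 cmd
  case "$opt" in
    -a|--add)    cmd="useradd -m" ;;
    -d|--delete) cmd="userdel -r" ;;
    -l|--list)   grep -f "$file" /etc/passwd; return ;;
    *)           echo "no such option as '$opt'..." >&2; return 2 ;;
  esac
  while read -r user; do
    $cmd "$user"
  done < "$file"
}

# usage: buser -a userlist
```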
You can use sed to create the commands, and eval to run them:
var=$( sed -e 's/^/useradd -m /' -e 's/$/;/' $file )
eval "$var"
(Edited to put in the -m flag.)

Ksh syntax error '=~'

read -p "The Process running for "$days" days continuously OK to kill this process (y/N)? " -u 4 ok
[[ "${ok}" =~ y ]] || continue
echo "Killing $pid"
kill -HUP "$pid"
fi
This is a snippet of my script. When I execute it, it shows:
`=~' is not expected.
How do I resolve it?
I'm guessing your shebang line has #!/bin/sh and so you don't have access to the full ksh syntax. If you do, ksh93 does appear to support [[ string =~ regex ]] syntax, so there's something here which doesn't add up right.
Either way, there is a construct which works just as well in classic Bourne shell which you can use instead, with the added bonus that your script will be compatible to systems where ksh is not available.
You use read -p <prompt> but that is a Bashism; the -p option to read has a quite different meaning in ksh93.
printf 'Process ran for %i days continuously, OK to kill this? ' "$days"
read -u 4 ok
case $ok in [Yy]* ) ;; *) continue ;; esac
echo "Killing $pid"
kill -HUP "$pid"
Your code looked for y anywhere in the input but I restricted that to only examine the first character.
(Your code had erratic indentation and an unpaired fi which I omitted.)
Your question is 'how to resolve' - tripleee's suggestion looks like a solution - simplify the code - try:
if [[ "${ok}" == "y" ]]
I tried copying your code snippet and I get a different error. Time for divide and conquer - a simple ksh93 script testing '=~'.

Lynx is stopping loop?

I'll just apologize beforehand; this is my first ever post, so I'm sorry if I'm not specific enough, if the question has already been answered and I just didn't look hard enough, and if I use incorrect formatting of some kind.
That said, here is my issue: in bash, I am trying to create a script that will read a file listing several dozen URLs. As it reads each line, I need it to run a set of actions on it, the first being to use lynx to navigate to the website. However, in practice, it runs perfectly once, on the first line: lynx goes, the download works, and the subsequent renaming and organizing of that file go through as well. But then it skips all the other lines and acts as if it has finished the whole file.
I have tested to see if it was lynx causing the issue by eliminating all the other parts of the code, and then by just eliminating lynx. It works without Lynx, but, of course, I need lynx for the rest of the output to be of any use to me. Let me just post the code:
#!/bin/bash
while read line; do
  echo $line
  lynx -accept_all_cookies $line
  echo "lynx done"
  od -N 2 -h *.zip | grep "4b50"
  echo "od done, if 1 starting..."
  if [[ $? -eq 0 ]]
  then ls *.*>>logs/zips.log
  else
    od -N 2 -h *.exe | grep "5a4d"
    echo "if 2 starting..."
    if [[ $? -eq 0 ]]
    then ls *.*>>logs/exes.log
    else
      od -N 2 -h *.exe | grep "5a4d, 4b50"
      echo "if 3 starting..."
      if [[ $? -eq 1 ]]
      then
        ls *.*>>logs/failed.log
      fi
      echo "if 3 done"
    fi
    echo "if 2 done"
  fi
  echo "if 1 done..."
  FILE=`(ls -tr *.* | head -1)`
  NOW=$(date +"%m_%d_%Y")
  echo "vars set"
  mv $FILE "criticalfreepri/${FILE%%.*}(ZCH,$NOW).${FILE#*.}" -u
  echo "file moved"
  rm *.zip *.exe
  echo "file removed"
done < "lynx"
$SHELL
Just to be sure, I do have a file called "lynx" that contains the URLs, one per line. Also, I added all those "echo"s for my own sort of debugging, but I have tried it with and without them. When I execute the script, the echoes all show up...
Any help is appreciated, and thank you all so much! Hope I didn't break any rules on this post!
PS: I'm on Linux Mint running things through the "terminal" program. I'm scripting with bash in Gedit, if any of that info is relevant. Thanks!
EDIT: Actually, the echo tests repeat for all three lines. So it would appear that lynx simply can't start again in the same loop?
Here is a simplified version of the script, as requested:
#!/bin/bash
while read -r line; do
  echo $line
  lynx $line
  echo "lynx done"
done < "ref/url"
read "lynx"
$SHELL
Note that I have changed the sites the "url" file goes to:
www.google.com
www.majorgeeks.com
http://www.sophos.com/en-us/products/free-tools/virus-removal-tool.aspx
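A side note on the full script above, separate from the lynx question: the echo between each od | grep pipeline and its [[ $? -eq 0 ]] test overwrites $?, so the if always sees echo's (successful) exit status. Putting the pipeline directly in the if condition avoids this. A minimal sketch with a stand-in file (sample.zip is created here just for the demonstration; od -An -tx1 is used instead of the script's od -h so the byte order is unambiguous -- "50 4b" is the same "PK" magic the script matches as the little-endian short 4b50):

```shell
#!/bin/bash
# Create a two-byte file starting with the ZIP magic "PK" (0x50 0x4b).
printf 'PK' > sample.zip

# Test the pipeline directly in the if: nothing can clobber its exit
# status, unlike  pipeline; echo "..."; if [[ $? -eq 0 ]].
if od -An -tx1 -N2 sample.zip | grep -q '50 4b'; then
  result="zip"
else
  result="other"
fi
echo "$result"   # zip
```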
Lynx is an interactive console browser; it is not designed for use in scripts, because it takes over the terminal.
If you want to access URLs in a script use wget, for example:
wget http://www.google.com/
For exit codes see: http://www.gnu.org/software/wget/manual/html_node/Exit-Status.html
To parse the HTML content, use:
VAR=`wget -qO- http://www.google.com/`
echo $VAR
I found a way that may fulfil your requirement to run the lynx command in a loop, substituting a different URL each time.
Use
echo `lynx $line`
(echo the lynx $line wrapped in backticks)
instead of lynx $line. You may refer below:
Your code:

#!/bin/bash
while read -r line; do
  echo $line
  lynx $line
  echo "lynx done"
done < "ref/url"
read "lynx"
$SHELL

Try this instead:

#!/bin/bash
while read -r line; do
  echo $line
  echo `lynx $line`
  echo "lynx done"
done < "ref/url"
I should have answered this question a long time ago. I got the program working; it's now on GitHub!
Anyway, I simply had to wrap the loop inside a function. Something like this:
progdownload () {
  printlog "attempting download from ${URL}"
  if echo "${URL}" | grep -q "http://www.majorgeeks.com/" ; then
    lynx -cmd_script="${WORKINGDIR}/support/mgcmd.txt" --accept-all-cookies ${URL}
  else
    wget ${URL}
  fi
}

URL="something.com"
progdownload
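One detail worth adding about why the original loop stopped: inside while read line, any command that reads stdin inherits the loop's redirected file and can swallow the remaining lines, which is what an interactive program like lynx will typically do and would explain the behaviour described. The usual guard is to point the inner command's stdin elsewhere (lynx "$line" < /dev/null). A demonstration using cat as a stand-in for the stdin-hungry command:

```shell
#!/bin/bash
printf '%s\n' one two three > urls.txt

# Broken: cat inherits the loop's stdin and eats the remaining lines,
# so the loop body runs only once.
broken=0
while read -r line; do
  broken=$((broken + 1))
  cat > /dev/null              # stands in for: lynx "$line"
done < urls.txt

# Fixed: give the inner command its own stdin so the loop keeps its input.
fixed=0
while read -r line; do
  fixed=$((fixed + 1))
  cat > /dev/null < /dev/null  # stands in for: lynx "$line" < /dev/null
done < urls.txt

echo "broken=$broken fixed=$fixed"   # broken=1 fixed=3
```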

How to read input from the user in a bash subshell [duplicate]

Consider this bash script:
#!/bin/bash
while true; do
  read -p "Give me an answer ? y/n : " yn
  case $yn in
    [Yy]* ) answer=true ; break;;
    [Nn]* ) answer=false ; break;;
    * ) echo "Please answer yes or no.";;
  esac
done

if $answer
then
  echo "Doing something as you answered yes"
else
  echo "Not doing anything as you answered no"
fi
When run from the command line using :
$ ./script-name.sh
It works just as expected with the script waiting for you to answer y or n.
However when I upload to a url and attempt to run it using :
$ curl http://path.to/script-name.sh | bash
I get stuck in a permanent loop with the script saying Please answer yes or no. Apparently the script is receiving some sort of input other than y or n.
Why is this? And more importantly how can I achieve user input from a bash script called from a url?
Perhaps use an explicit local redirect:
read answer < /dev/tty
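The reason read misbehaves when the script is piped in: the script's stdin is the pipe from curl, so read consumes script text (or hits EOF) instead of keyboard input, whereas /dev/tty always refers to the controlling terminal. The mechanism can be seen without curl at all (a sketch; the two-line inline script is made up for the demonstration):

```shell
#!/bin/bash
# Pipe a two-line script into bash: `read x` consumes the *next line of
# the script itself*, so the echo line never executes and $out is empty.
out=$(printf 'read x\necho "got:$x"\n' | bash)
[ -z "$out" ] && echo "the echo line was swallowed by read"
```

In a real terminal session, changing that inner line to read x < /dev/tty would instead prompt the user, which is exactly what the redirect above achieves.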
You can run it like this:
bash -c "$(curl -s http://path.to/script-name.sh)"
That way you're supplying the content of the script to the bash interpreter as an argument, which leaves stdin free for the terminal. Use curl -s for silent operation.
