do-while with ssh ls crashes - shell

The script reads a file line by line and checks folders on a remote server using the ls command.
But my do-while loop only runs once. For example, if I use rsync instead, everything is fine and the while loop works correctly; the problem only appears with ssh user@server ls $SERVER_FOLDER >> $LOG.
Am I using incorrect syntax?
Error from the console: syntax error near unexpected token 'done'
LOG="/path_to_log/log.txt"
FILE="/path_to_file/projects_id.txt"
cat $FILE | while read -r line || [[ -n $line ]]
do
    ID=$(echo $line | cut -d' ' -f3)
    SERVER_FOLDER=`echo "/path_to_id/$ID/"`
    echo "SERVER_FOLDER:" $SERVER_FOLDER
    ssh user@server ls $SERVER_FOLDER >> $LOG
    sleep 20
done

Add the -n option to ssh to prevent it from reading stdin. What is happening is that ssh is consuming all the input from the file (that is coming through stdin), so the while loop terminates after the first line because there is nothing left for it to read.
Change your code to:
while read -r line || [[ -n $line ]]
do
    ID=$(cut -d' ' -f3 <<< "$line")
    SERVER_FOLDER="/path_to_id/$ID/"
    echo "SERVER_FOLDER: $SERVER_FOLDER"
    ssh -n user@server ls $SERVER_FOLDER >> $LOG
    sleep 20
done < "$FILE"
I have also made some other improvements such as changing the way you are reading the file (cat is not necessary).
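If you want to see the stdin-draining behaviour in isolation, here is a minimal local sketch with no ssh involved; cat stands in for any command inside the loop that reads stdin:
printf 'a\nb\nc\n' | while read -r line; do
    echo "got: $line"
    cat > /dev/null   # stands in for ssh: it silently eats the rest of the loop's input
done
# prints only "got: a"; change cat to "cat < /dev/null" (or ssh to ssh -n) and all three lines print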

Related

How do I prevent my bash script (tailing a file) from repeatedly acting on the same line?

I was working on a script that keeps monitoring logins to my server or laptop via ssh.
This is the code I was working with:
slackmessenger() {
    curl -X POST -H 'Content-type: application/json' --data '{"text":"'"$1"'"}' myapilinkwashere
    ## removed the api link due to slack restriction
}
while true
do
    tail /var/log/auth.log | grep sshd | head -n 1 | while read LREAD
    do
        echo ${LREAD}
        var=$(tail -f /var/log/auth.log | grep sshd | head -n 1)
        slackmessenger "$var"
    done
done
The issue I'm facing is that it keeps sending the old logs because of the while loop. Can there be a condition so that the loop only sends new entries, as opposed to sending the old ones over and over again? I could not think of a condition that would skip the old entries and only show the new ones.
Instead of using head -n 1 to extract a line at a time, iterate over the filtered output of tail -f /var/log/auth.log | grep sshd and process each line once as it comes through.
#!/usr/bin/env bash
# ^^^^- this needs to be a bash script, not a sh script!
case $BASH_VERSION in '') echo "Needs bash, not sh" >&2; exit 1;; esac
while IFS= read -r line; do
    printf '%s\n' "$line"
    slackmessenger "$line"
done < <(tail -f /var/log/auth.log | grep --line-buffered sshd)
See BashFAQ #9 describing why --line-buffered is necessary.
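A quick way to observe the buffering problem, assuming GNU grep: when grep writes to a pipe it block-buffers its output by default, so matches can sit unseen until the buffer fills or grep exits.
# The first "match" appears immediately here; drop --line-buffered and both
# matches only show up after the producer exits five seconds later.
(echo match; sleep 5; echo match) | grep --line-buffered match | cat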
You could also write this as:
#!/usr/bin/env bash
case $BASH_VERSION in '') echo "Needs bash, not sh" >&2; exit 1;; esac
tail -f /var/log/auth.log |
    grep --line-buffered sshd |
    tee >(xargs -d $'\n' -n 1 slackmessenger)
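One caveat with this variant: xargs can only run executables, not shell functions, so slackmessenger would need to be a standalone script on your PATH. A hedged workaround, assuming bash and GNU xargs, is to export the function and invoke it through a child bash:
export -f slackmessenger
tail -f /var/log/auth.log |
    grep --line-buffered sshd |
    tee >(xargs -d $'\n' -n 1 bash -c 'slackmessenger "$1"' _)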

Loop EOF ssh -n can't create file

Hope this time it's not a duplicate. I didn't find anything.
My code:
#!/bin/bash
FILE=/home/user/srv.txt
TICKET=task
while read LINE; do
ssh -nT $LINE << 'EOF'
touch info.txt
hostname >> info.txt
ifconfig | grep inet | awk '$3 ~ "cast" {print $2}' >> info.txt
grep -i ^server /etc/zabbix/zabbix_agentd.conf >> info.txt
echo "- Done -" >> info.txt
EOF
ssh -nT $LINE "cat info.txt" >> $TICKET.txt
done < $FILE #End
My issue:
If I only use ssh $LINE, it only connects to the host on the first line and also displays the error Pseudo-terminal will not be allocated because stdin is not a terminal.
Using ssh -T fixes that error message, and the file info.txt gets created.
Using ssh -nT fixes the problem where ssh only reads the first line, but then I get the error cat: info.txt: No such file or directory. If I ssh to the hosts, I can confirm that there is no info.txt file in my home folder, whereas with ssh -T the file is there.
I also tried the -t option, and the heredoc delimiter EOF without the quotes ('...'), but no luck.
Am I missing something?
Thanks for your help,
Iswaren
You have two problems.
If you invoke ssh without -n it may consume the $FILE input (it drains its stdin)
If you invoke ssh with -n it won't read its stdin, so none of the commands will be executed
However, the first ssh has had its input redirected to come from a heredoc, so it does not need -n.
As stated in the comments, the second ssh call is not needed. Rather than piping into info.txt and then copying that into a local file, just output to the local file directly:
while read LINE; do
ssh -T $LINE >>$TICKET.txt <<'EOF'
hostname
ifconfig | grep inet | awk '$3 ~ "cast" {print $2}'
grep -i ^server /etc/zabbix/zabbix_agentd.conf
echo "- Done -"
EOF
done <$FILE
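If it helps to see why the heredoc makes -n unnecessary, here is a small local sketch (cat stands in for ssh): the heredoc feeds the command's stdin, so the loop's own stdin stays untouched and every line of the list is processed.
printf 'host1\nhost2\nhost3\n' | while read -r LINE; do
    echo "connecting to $LINE"
    cat >/dev/null <<'EOF'
these lines come from the heredoc, not from the loop's input
EOF
done
# prints "connecting to" for all three hosts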

Why does my bash script stop working

The script monitors incoming HTTP messages and forwards them to a monitoring application called Zabbix. It works fine; however, after about 1-2 days it stops working. Here's what I know so far:
Using pgrep I can see the script is still running.
The logfile gets updated properly (first command of the script).
The FIFO pipe seems to be working.
The problem must be somewhere in the while loop or the tail command.
I'm new to scripting, so maybe someone can spot the problem right away?
#!/bin/bash
tcpflow -p -c -i enp2s0 port 80 | grep --line-buffered -oE 'boo.php.* HTTP/1.[01]' >> /usr/local/bin/logfile &
pipe=/tmp/fifopipe
trap "rm -f $pipe" EXIT
if [[ ! -p $pipe ]]; then
    mkfifo $pipe
fi
tail -n0 -F /usr/local/bin/logfile > /tmp/fifopipe &
while true
do
    if read line <$pipe; then
        unset sn
        for ((c=1; c<=3; c++)) # c is no of max parameters x 2 + 1
        do
            URL="$(echo $line | awk -F'[ =&?]' '{print $'$c'}')"
            if [[ "$URL" == 'sn' ]]; then
                ((c++))
                sn="$(echo $line | awk -F'[ =&?]' '{print $'$c'}')"
            fi
        done
        if [[ "$sn" ]]; then
            hosttype="US2G_"
            host=$hosttype$sn
            zabbix_sender -z nuc -s $host -k serial -o $sn -vv
        fi
    fi
done
You're inputting from the fifo incorrectly. By writing:
while true; do read line < $pipe ....; done
you are closing and reopening the fifo on each iteration of the loop. The first time you close it, the producer to the pipe (the tail -f) gets a SIGPIPE and dies. Change the structure to:
while true; do read line; ...; done < $pipe
Note that every process inside the loop now has the potential to inadvertently read from the pipe, so you'll probably want to explicitly close stdin for each.
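Putting that advice together, a sketch of how the loop might look after the change, keeping the parsing from the question and closing stdin for the command inside the loop so it cannot drain the pipe:
while true; do
    if read line; then
        unset sn
        for ((c=1; c<=3; c++)); do
            URL="$(echo $line | awk -F'[ =&?]' '{print $'$c'}')"
            if [[ "$URL" == 'sn' ]]; then
                ((c++))
                sn="$(echo $line | awk -F'[ =&?]' '{print $'$c'}')"
            fi
        done
        if [[ "$sn" ]]; then
            # the /dev/null redirect keeps zabbix_sender from reading the fifo by accident
            zabbix_sender -z nuc -s "US2G_$sn" -k serial -o "$sn" -vv </dev/null
        fi
    fi
done < $pipe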

Reading from pasted input with line breaks in a Bash Script

I've been trying for a couple of nights to get this script to run, with no luck. I'm trying to write a Bash script that allows a user to paste a block of text; the script should grep the valid IP addresses out of the text and automatically ping them in order.
So far, after much modification, I'm stuck at this point:
#!/bin/sh
echo Paste Text with IP Addresses
read inputtext
echo "$inputtext">inputtext.txt
grep -E -o "([0-9]{1,3}[\.]){3}[0-9]{1,3}" inputtext.txt > address.txt
awk '{print $1}' < address.txt | while read ip; do
    if ping -c1 $ip >/dev/null 2>&1; then
        echo $ip IS UP
    else
        echo $ip IS DOWN
    fi
done
rm inputtext.txt
rm address.txt
After running this script, the user is prompted as desired, and if an IP address was included in the first line of text, the ping check succeeds, but then all the text after that line is spat out onto the following command prompt. So it seems that my issue lies in how I read the user input: only the first line is being read, and once a line break is encountered, the script does not consider any lines past the first.
As written, you just need an outer loop to actually read each line of user input.
#!/bin/sh
echo Paste Text with IP Addresses
while read -r inputtext
do
    echo "$inputtext">inputtext.txt
    grep -E -o "([0-9]{1,3}[\.]){3}[0-9]{1,3}" inputtext.txt > address.txt
    awk '{print $1}' < address.txt | while read ip; do
        if ping -c1 $ip >/dev/null 2>&1; then
            echo $ip IS UP
        else
            echo $ip IS DOWN
        fi
    done
    rm inputtext.txt
    rm address.txt
done
However, you can actually simplify this much further and eliminate the temporary files.
#!/bin/sh
echo Paste Text with IP Addresses
while read -r inputtext
do
    ip=$(echo "$inputtext" | grep -E -o "([0-9]{1,3}[\.]){3}[0-9]{1,3}" | awk '{print $1}')
    if ping -c1 $ip >/dev/null 2>&1; then
        echo $ip IS UP
    else
        echo $ip IS DOWN
    fi
done
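One caveat with the simplified version: it assumes at most one address per pasted line. If a line may contain several IP addresses (or none), a sketch along the lines of the first version handles each match separately:
#!/bin/sh
echo Paste Text with IP Addresses
while read -r inputtext
do
    echo "$inputtext" | grep -E -o "([0-9]{1,3}[\.]){3}[0-9]{1,3}" |
    while read -r ip; do
        if ping -c1 "$ip" >/dev/null 2>&1; then
            echo "$ip IS UP"
        else
            echo "$ip IS DOWN"
        fi
    done
done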

Bash script only read the first line of the file

I wrote a script to ssh to a remote server and find the disk usage of a user. However, this script only reads the first line; it doesn't continue to the other lines of the file. Is anything wrong with my script? Thanks.
#!/bin/bash
FILE="myfile.txt"
while read line; do
    server=`echo $line|awk '{print $1}'`
    cpid=`echo $line|awk '{print $2}'`
    echo $server "---" $cpid "---" `ssh $server grep $cpid /var/cpanel/repquota.cache|awk '{print int($3/1000) "MB"}'`
done < $FILE
myfile.txt contents:
server1 user1
server2 user2
server3 user3
The ssh call is inheriting its standard input from the while loop, which redirects from your file. This causes the ssh command to consume the rest of the file. You'll need to use a different file descriptor to supply the read command:
#!/bin/bash
FILE="myfile.txt"
while read -u 3 server cpid; do
    printf '%s---%s---' "$server" "$cpid"
    ssh $server "grep $cpid /var/cpanel/repquota.cache | awk '{print int(\$3/1000) \"MB\"}'"
done 3< $FILE
An alternative is to explicitly redirect input to ssh from /dev/null, since you're not using it anyway.
#!/bin/bash
FILE="myfile.txt"
while read server cpid; do
    printf '%s---%s---' "$server" "$cpid"
    < /dev/null ssh $server "grep $cpid /var/cpanel/repquota.cache | awk '{print int(\$3/1000) \"MB\"}'"
done < $FILE
First of all, you can simplify your read loop to
while read server cpid; do
    echo $server "---" $cpid "---" `ssh ...`
done <$FILE
and skip the parsing with awk. Another simplification is to drop the call to grep and let awk do the search for $cpid:
ssh $server "awk '/$cpid/ {print int(\$3/1000) \"MB\"}' /var/cpanel/repquota.cache"
As for your problem, I guess the ssh call doesn't return because it is waiting for a password or something, and so prevents the loop from continuing.
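Putting both simplifications together, and adding ssh -n (as in the first question above) so ssh cannot drain the loop's stdin, the whole loop might look like this sketch:
#!/bin/bash
FILE="myfile.txt"
while read -r server cpid; do
    printf '%s---%s---' "$server" "$cpid"
    # -n keeps ssh from consuming the rest of myfile.txt
    ssh -n "$server" "awk '/$cpid/ {print int(\$3/1000) \"MB\"}' /var/cpanel/repquota.cache"
done < "$FILE"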
