Using xargs to ssh to multiple hosts, receiving "Name or service not known" - bash

I am writing a shell script that will ssh to multiple hosts and perform some operations on them.
My script, test.sh, looks as follows:
cat < $1 | xargs -e -d ' ' -I % ssh % grep "example" /opt/example/test.log
I run it via the following command
./test.sh prod_hosts.txt
and the contents of prod_hosts.txt:
hostname1 hostname2 hostname3 hostname4
I have verified there is no carriage return at the end of this file, yet I am getting the following error:
[ryan@hostname1 ~]$ ./test.sh prod_hosts.txt
ssh: hostname4
: Name or service not known
xargs: ssh: exited with status 255; aborting
It looks like it successfully ssh's into the 4 hosts, but then attempts to ssh with a blank entry, hence the error.
Any idea what I'm missing here? Seems like I'm missing something obvious!

echo '1 2' | xargs -d ' ' -I % echo % produces:
1
2
<blank line>
whereas echo -n '1 2' | xargs -d ' ' -I % echo % returns:
1
2
i.e. xargs generates one extra output entry when the input ends with a newline.
Workarounds:
Use newline delimited entries in hosts.txt and: <hosts.txt xargs -I % <ssh-command>
If hosts.txt cannot be changed: < <(tr ' ' '\n' < hosts.txt) xargs -I % <ssh-command>
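To see the trailing-newline behaviour go away, here is a dry-run sketch of the second workaround; "echo ssh" stands in for the real ssh command, and the hosts file content is taken from the question:

```shell
# Simulate prod_hosts.txt: space-separated hosts plus the usual trailing newline.
printf 'hostname1 hostname2 hostname3 hostname4\n' > hosts.txt

# tr turns the spaces into newlines, so the trailing newline no longer
# produces an empty extra entry; "echo ssh" shows what would be executed.
tr ' ' '\n' < hosts.txt | xargs -I % echo ssh %
```

This prints exactly four "ssh hostnameN" lines, with no blank fifth entry.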

Set a command to a variable in bash script problem

Trying to run a command as a variable but I am getting strange results
Expected result "1" :
grep -i nosuid /etc/fstab | grep -iq nfs
echo $?
1
Unexpected result as a variable command:
cmd="grep -i nosuid /etc/fstab | grep -iq nfs"
$cmd
echo $?
0
It seems it returns 0 because the command itself was valid, not because of the actual outcome. How can I do this better?
You can only execute exactly one command stored in a variable. The pipe is passed as an argument to the first grep.
Example
$ printArgs() { printf %s\\n "$@"; }
# Two commands. The 1st command has parameters "a" and "b".
# The 2nd command prints stdin from the first command.
$ printArgs a b | cat
a
b
$ cmd='printArgs a b | cat'
# Only one command with parameters "a", "b", "|", and "cat".
$ $cmd
a
b
|
cat
How to do this better?
Don't execute the command using variables.
Use a function.
$ cmd() { grep -i nosuid /etc/fstab | grep -iq nfs; }
$ cmd
$ echo $?
1
Solution to the actual problem
I see three options for your actual problem:
Use a DEBUG trap and the BASH_COMMAND variable inside the trap.
Enable bash's history feature for your script and use the history builtin.
Use a function which takes a command string and executes it using eval.
Regarding your comment on the last approach: You only need one function. Something like
execAndLog() {
    description="$1"
    shift
    if eval "$*"; then
        info="PASSED: $description: $*"
        passed+=("${FUNCNAME[1]}")
    else
        info="FAILED: $description: $*"
        failed+=("${FUNCNAME[1]}")
    fi
}
You can use this function as follows
execAndLog 'Scanned system' 'grep -i nfs /etc/fstab | grep -iq noexec'
The first argument is the description for the log, the remaining arguments are the command to be executed.
Using bash -x or set -x will allow you to see what bash executes:
> cmd="grep -i nosuid /etc/fstab | grep -iq nfs"
> set -x
> $cmd
+ grep -i nosuid /etc/fstab '|' grep -iq nfs
As you can see, the pipe | is passed as an argument to the first grep command.
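For completeness, the eval route from option 3 can be demonstrated on its own; a toy pipeline stands in for the grep chain in the question:

```shell
# Store a pipeline in a variable, then let eval re-parse it as shell syntax.
cmd='printf "a\nb\n" | grep -c b'
eval "$cmd"    # the pipe is now interpreted by the shell; prints 1
echo $?        # the exit status now reflects the pipeline's outcome; prints 0
```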

Bash is redirecting output from command only after script has finished

Context
Got a daft script that checks whether a process is running on a group of hosts, like a watchdog. As I say, it's a daft script, so bear in mind it isn't 'perfect' by scripting standards.
Problem
I've run bash -x and can see that the script finishes its first check without actually redirecting the output of the command to the file, which is very frustrating; it means each host is actually being evaluated against the previous host's output.
Code
#!/bin/bash
FILE='OUTPUT'
for host in $(cat /etc/hosts | grep webserver.[2][1-2][0-2][0-9] | awk {' print $2 ' })
do ssh -n -f $host -i <sshkey> 'ps ax | grep myprocess | wc -l' > $FILE 2> /dev/null
cat $FILE
if grep '1' $FILE ; then
echo "Process is NOT running on $host"
cat $FILE
else
cat $FILE
echo "ALL OK on $host"
fi
cat $FILE
done
Script traceback
++ cat /etc/hosts
++ awk '{ print $2 }'
++ grep 'webserver.[2][1-2][0-2][0-9]'
+ for host in '$(cat /etc/hosts | grep webserver.[2][1-2][0-2][0-9] | awk {'\'' print $2 '\''})'
+ ssh -n -f webserver.2100 -i <omitted> 'ps ax | grep myprocess | wc -l'
+ cat OUTPUT
+ grep 1 OUTPUT
+ cat OUTPUT
+ echo 'ALL OK on webserver.2100'
ALL OK on webserver.2100
+ cat OUTPUT
+ printf 'webserver.2100 checked \n'
webserver.2100 checked
+ for host in '$(cat /etc/hosts | grep webserver.[2][1-2][0-2][0-9] | awk {'\'' print $2 '\''})'
+ ssh -n -f webserver.2101 -i <omitted> 'ps ax | grep myprocess | wc -l'
+ cat OUTPUT
2
+ grep 1 OUTPUT
+ cat OUTPUT
2
+ echo 'ALL OK on webserver.2101'
ALL OK on webserver.2101
+ cat OUTPUT
2
+ printf 'webserver.2101 checked \n'
webserver.2101 checked
Issue
As you can see, it's registering nothing for the first host; only after that iteration is done is the data written to the file, so the second host is being evaluated against the previous host's data...
I suspect it's to do with redirection, but in my eyes this should work; it doesn't, so it's frustrating.
I think you're assuming that ps ax | grep myprocess will always return at least one line (the grep process). I'm not sure that's true. I'd rewrite that like this:
awk '/webserver.[2][1-2][0-2][0-9]/ {print $2}' /etc/hosts | while IFS= read -r host; do
    output=$( ssh -n -f "$host" -i "$sshkey" 'ps ax | grep "[m]yprocess"' )
    if [[ -z "$output" ]]; then
        echo "Process is NOT running on $host"
    else
        echo "ALL OK on $host"
    fi
done
The ps ax | grep "[m]yprocess" trick effectively removes the grep process from the ps output:
the string "myprocess" matches the regular expression "[m]yprocess" (that's the running "myprocess" process), but
the string "[m]yprocess" does not match the regular expression "[m]yprocess" (that's the running "grep" process)
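The effect is easy to check without a live process; here two literal strings stand in for the ps lines of the real process and of the grep command itself:

```shell
# A ps line for the real process matches the pattern:
echo '12345 ?  Ss  0:01 myprocess --daemon' | grep -c '[m]yprocess'   # prints 1
# A ps line for the grep command itself does not:
echo '12346 ?  S   0:00 grep [m]yprocess'   | grep -c '[m]yprocess'   # prints 0
```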

Executing bash -c with xargs

I had a job to perform that involved:
grep lines from a log
find a number in the line
perform basic arithmetic on the number (say, number + 1234)
The final result is a bunch of numbers separated by a newline.
If the input was:
1000
2000
3000
Then the required output was:
2234
3234
4234
I ended up with the following command:
cat log.txt | grep "word" | cut -d'|' -f7 | cut -d' ' -f5 | xargs -n 1 bash -c 'echo $(($1 + 1234))' args
I found the xargs -n 1 bash -c 'echo $(($1 + 1234))' args snippet in an answer to this question, but I don't understand the need for the final args argument. I can change it to anything (args could be blah), but if I omit it the arithmetic fails and the output is the numbers unchanged:
1000
2000
3000
Could anyone shed some light on why args is a required argument to bash -c?
A simple awk command can do the same - in a clean way:
awk -F'|' '/word/{split($7,a," "); print a[5]+1234}' log.txt
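To check it on a fabricated sample (assuming, as in the question, that the number is the 5th space-separated word of the 7th |-delimited field, and that only lines containing "word" count):

```shell
# Two sample log lines; only the first contains "word" and is processed.
printf 'word|x|x|x|x|x|a b c d 1000\nother|x|x|x|x|x|a b c d 2000\n' |
    awk -F'|' '/word/{split($7,a," "); print a[5]+1234}'
# prints 2234
```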
Man bash:
-c      If the -c option is present, then commands are read from the first non-option argument command_string. If there are arguments after the command_string, they are assigned to the positional parameters, starting with $0.
So, in your case, 'args' is a placeholder that goes into $0, making your actual input land in $1.
You should be able to alter your command to:
grep "word" log.txt | cut -d'|' -f7 | cut -d' ' -f5 | xargs -n 1 bash -c 'echo $(($0 + 1234))'
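The $0/$1 assignment is easy to see in isolation:

```shell
# With a placeholder, the value after it lands in $1:
bash -c 'echo $(($1 + 1234))' args 1000     # prints 2234
# Without a placeholder, the value becomes $0, so use $0 in the expression:
bash -c 'echo $(($0 + 1234))' 1000          # prints 2234
```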

xargs wget extract filename from URL with Parameter

I want to do parallel downloads, but the problem is that wget does not output the correct filename.
url.txt
http://example.com/file1.zip?arg=tereef&arg2=okook
http://example.com/file2.zip?arg=tereef&arg2=okook
command
xargs -P 4 -n 1 wget <url.txt
output filename
file1.zip?arg=tereef&arg2=okook
file2.zip?arg=tereef&arg2=okook
expected output
file1.zip
file2.zip
I'm new to bash. Please suggest how to output the correct filename, and please don't suggest a for loop or & because they block.
Thank you
You can use a bash function, which you have to export so it is visible outside the current shell:
function mywget()
{
    # Strip the query string, then the leading path, to get e.g. file1.zip
    local name=${1%%\?*}
    wget -O "${name##*/}" "$1"
}
export -f mywget
xargs -P 4 -n 1 -I {} bash -c "mywget '{}'" < url.txt
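The filename extraction itself is plain parameter expansion, which can be checked with one of the URLs from the question:

```shell
url='http://example.com/file1.zip?arg=tereef&arg2=okook'
name=${url%%\?*}    # drop the query string: http://example.com/file1.zip
name=${name##*/}    # drop everything up to the last slash
echo "$name"        # prints file1.zip
```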
Process your input to produce the desired command, then run it through xargs.
perl -ne : iterate over the lines of the input file and execute the inline program
-e : Execute perl one-liner
-n : Loop over all input lines, assigning each to $_ in turn.
xargs -P 4 -n 1 -i -t wget "{}"
-P 4 : Max of 4 Processes at a time
-n 1 : Consume one input line at a time
-i : Use the replace string "{}"
-t : Print the command before executing it
perl -ne '
chomp(my ($url) = $_); # Remove trailing newline
my ($name) = $url =~ m|example.com/(.+)\?|; # Grab the filename
print "$url -O $name\n"; # Print all of the wget params
' url.txt | xargs -P 4 -n 1 -i -t wget "{}"
Output
wget http://example.com/file1.zip?arg=tereef&arg2=okook -O file1.zip
wget http://example.com/file2.zip?arg=tereef&arg2=okook -O file2.zip
--2016-07-21 22:24:44-- http://example.com/file2.zip?arg=tereef&arg2=okook%20-O%20file2.zip
--2016-07-21 22:24:44-- http://example.com/file1.zip?arg=tereef&arg2=okook%20-O%20file1.zip
Resolving example.com (example.com)... Resolving example.com (example.com)... 93.184.216.34, 2606:2800:220:1:248:1893:25c8:1946
93.184.216.34, Connecting to example.com (example.com)|93.184.216.34|:80... 2606:2800:220:1:248:1893:25c8:1946
Connecting to example.com (example.com)|93.184.216.34|:80... connected.
connected.
HTTP request sent, awaiting response... HTTP request sent, awaiting response... 404 Not Found
2016-07-21 22:24:44 ERROR 404: Not Found.
404 Not Found
2016-07-21 22:24:44 ERROR 404: Not Found.
With GNU Parallel it looks like this:
parallel -P 4 wget -O '{= s/\?.*//;s:.*/:: =}' {} <url.txt

Pass multiple file names captured in a variable to a command (vim)

I am trying to create a script that automatically opens any files containing a particular pattern.
This is what I achieved so far:
xargs -d " " vim < "$(grep --color -r test * | cut -d ':' -f 1 | uniq | sed ':a;N;$!ba;s/\n/ /g')"
The problem is that vim does not treat the result as a list of separate files, but as one long filename:
zsh: file name too long: ..............
Is there an easy way to achieve it? What am I missing?
The usual way to call xargs is just to pass the arguments with newlines via a pipe:
grep -Rl test * | xargs vim
Note that I'm also passing the -l argument to grep to list the files that contain my pattern.
Use this:
vim -- `grep -rIl test *`
-I skip matching in binary files
-l print file name at first match
Try to omit xargs, because it leads to incorrect behaviour of vim:
Vim: Warning: Input is not from a terminal
What I usually do is append the following to a command that produces a list of files:
> ~/.files.txt && vim $(cat ~/.files.txt | tr "\n" " ")
For example :
grep --color -r test * > ~/.files.txt && vim $(cat ~/.files.txt | tr "\n" " ")
I have the following in my .bashrc to bind VV (twice V in uppercase) to insert that automatically :
insertinreadline() {
    READLINE_LINE=${READLINE_LINE:0:$READLINE_POINT}$1${READLINE_LINE:$READLINE_POINT}
    READLINE_POINT=`expr $READLINE_POINT + ${#1}`
}
bind -x '"VV": insertinreadline " > ~/.files.txt && vim \$(cat ~/.files.txt | tr \"\\n\" \" \")"'