Bash script: How to ssh to a remote computer, run a command, and have the output piped back to another computer? - macos

I need to create a Bash script that can ssh into another computer (machine B), run a command, and have the output piped back to a .txt file on machine A. How do I go about doing this? Ultimately I will have a list of computers to ssh into and run a command on, with all of the output appended to the same .txt file on machine A.
UPDATE: OK, so I went and followed what That other Guy suggested, and this is what seems to work:
File=/library/logs/file.txt
ssh -n username@<ip> "$(< testscript.sh)" > "$File"
What I need to do now is, instead of manually entering an IP address, have the script read from a list of hostnames in a .txt file and place each one in a variable that substitutes for the IP address. An example would be ssh username@Variable, where Variable changes each time a word is read from the file of hostnames. Any ideas how to go about this?
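The follow-up about reading hostnames from a file is the classic while read loop. A minimal dry-run sketch, assuming a hypothetical hosts.txt (one hostname per line); the real ssh call is shown in a comment so the loop logic can be exercised without a network:

```shell
# Build a sample hostname list so the loop can be dry-run (hostA etc. are made up).
printf '%s\n' hostA hostB hostC > /tmp/hosts.txt

File=/tmp/file.txt
: > "$File"
while IFS= read -r host; do
    # Real version (needs ssh access and testscript.sh):
    #   ssh -n "username@$host" "$(< testscript.sh)" >> "$File"
    echo "would run: ssh -n username@$host" >> "$File"
done < /tmp/hosts.txt
```

The -n on ssh matters inside this loop: without it, ssh would swallow the rest of the hostname list from stdin.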

This should do it
ssh userB@machineB "some command" | ssh userA@machineA "cat - >> file.txt"
With your commands:
ssh userB@machineB <<'END' | ssh userA@machineA "cat - >> file.txt"
echo Hostname=$(hostname) LastChecked=$(date)
ls -l /applications/utilities/Disk\ Utility.app/contents/Plugins/*Partition.dumodule* | awk '{printf "Username=%s DateModified=%s %s %s\n", $3, $6, $7, $8}'
END
You could replace the ls -l | awk pipeline with a single stat call; note that the BSD stat on macOS can print the owner name with stat -f '%Su', in addition to the numeric user ID.

Related

Copy a file containing a string to another server in a shell script

I have a question:
How can I cat filea.txt and have it written to fileb.txt with a shell script, when filea and fileb are on different Linux servers?
Something like cat filea.txt >> fileb.txt, but with fileb.txt on the other server.
Many thanks!
cat filea.txt >> /tmp/fileb.txt
scp /tmp/fileb.txt user@192.168.0.30:~/fileb.txt
Just to let you know, ~/ is the home directory of that user. You'd need to replace the user and the IP, of course.
You can also skip the home directory and use a full file path instead.
scp is really cool, check out its man page.
EDIT: I do see your trouble with the appending.
You can try cat filea.txt | ssh user@192.168.0.30 "cat >> fileb.txt". This works because cat reads from stdin when no file is specified.
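The append-over-a-pipe idea can be dry-run locally; here bash -c stands in for the ssh user@192.168.0.30 "remote shell", and the filenames are illustrative:

```shell
# bash -c plays the part of "ssh user@192.168.0.30"; everything else is identical.
echo "line from filea" > /tmp/filea.txt
: > /tmp/fileb.txt
cat /tmp/filea.txt | bash -c 'cat >> /tmp/fileb.txt'
cat /tmp/filea.txt | bash -c 'cat >> /tmp/fileb.txt'   # run again: the >> appends
```

After two runs, /tmp/fileb.txt holds two copies of the line, which is exactly the appending behavior the question asks for.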

'grep' not recognized over ssh

I'm trying to close my app remotely like that:
ssh pi@192.168.0.227 "kill $(ps aux | grep '[M]yApp' | awk '{print $2}')"
It fails and prompts:
grep : The term 'grep' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
If I log in via SSH first and then run the command, it works, but I need it to be a one-liner. I've set the /etc/ssh/sshd_config variable PermitUserEnvironment to yes, tried using the full path to grep (/bin/grep), and even removed the spaces around the pipe (these were all answers to questions similar to mine), but nothing lets me pass the command. What am I missing?
The string is expanded by your local shell before being passed to the other host. Since it is a double-quoted string the command within $() runs on your local host. The easiest way to pass such a command to a remote host is with a "quoted" here document:
ssh pi@192.168.0.227 <<'EOF'
kill $(ps aux | grep '[M]yApp' | awk '{print $2}')
EOF
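To see why the double-quoted version fails, note that the command substitution happens locally before ssh is ever invoked. In this sketch, echo 1234 is a made-up stand-in for the real ps | grep | awk pipeline, just to show the already-expanded argument ssh would receive:

```shell
# The $() inside double quotes is expanded by the local shell first.
# "echo 1234" stands in for the ps|grep|awk pipeline (1234 is a made-up PID).
cmd="kill $(echo 1234)"
echo "$cmd"   # this already-expanded string is what ssh would be handed
```

With the quoted heredoc (<<'EOF'), the $( ... ) text instead reaches the remote shell intact and is expanded there, where grep exists.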
Similar: How have both local and remote variable inside an SSH command

Global Variables using Quoted vs. Unquoted Heredocs

I'm curious if I can have my cake and eat it too. I'm writing a script that needs to find the directory with the most recent date on a remote server. I then need to build that path so I can find specific .csv files on the server.
The script takes an input called folder and it needs to be appended to the end of the path. I've noticed I can pass folder into the heredoc and have it expanded, but then I lose out on the awk expansion I need to do. Here is an example:
folder='HBEP'
ssh $server /bin/bash << EOF
ls -t /projects/bison/git |
head -1 |
awk -v folder=$folder '{print "projects/bison/git/"$1"/assessment/LWR/validation/"folder}'
EOF
This produces a close but wrong output:
# output:
/projects/bison/git//assessment/LWR/validation/HBEP
# should be:
/projects/bison/git/bison_20190827/assessment/LWR/validation/HBEP
Now, when I quote EOF, I can access the piped in variable but not the folder variable:
folder='HBEP'
ssh $server /bin/bash << 'EOF'
ls -t /projects/bison/git |
head -1 |
awk -v folder="$folder" '{print "projects/bison/git/"$1"/assessment/LWR/validation/"folder}'
EOF
# output:
projects/bison/git/bison_20190826/assessment/LWR/validation/
# should be:
projects/bison/git/bison_20190826/assessment/LWR/validation/HBEP
Is there a way I can leverage expansion in the heredoc and the outside shell?
You can use the unquoted version of the heredoc; just add a \ before $ wherever you want to prevent the local parameter expansion.
eg
folder='HBEP'
ssh $server /bin/bash << EOF
ls -t /projects/bison/git |
head -1 |
awk -v folder=$folder '{print "projects/bison/git/"\$1"/assessment/LWR/validation/"folder}'
EOF
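The difference between the two delimiters can be checked locally; in this sketch, a plain bash <<EOF stands in for the remote shell that ssh would start:

```shell
folder='HBEP'

# Unquoted delimiter: $folder expands locally, while \$1 survives for the
# "remote" side (empty here, since this throwaway bash has no positional params).
unquoted=$(bash <<EOF
echo "folder=$folder pos1=\$1"
EOF
)

# Quoted delimiter: nothing expands locally; $folder is unset on the "remote".
quoted=$(bash <<'EOF'
echo "folder=$folder pos1=$1"
EOF
)
echo "$unquoted"
echo "$quoted"
```

The unquoted run keeps the locally-set folder but loses $1; the quoted run keeps $1 intact but loses folder, which is exactly the trade-off the question describes, and why escaping only the \$1 gives you both.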

How to denote that pipes on a remote machine end, and a pipe on the local machine follows

I want to get some files from a remote machine. For this I build a pipeline of commands to determine which files have to be fetched, and I want to feed these results into another, local pipe.
The remote pipeline combines into one command which is given to ssh.
I do not know how to indicate where the pipeline on the remote machine ends, so that the results go into the new local pipe. So I do:
ssh user@remote find ... | grep ... | awk ... | less
The first two pipes should be remote (find, grep, and awk run on the remote machine), and the last pipe should be local (less runs on the local machine).
Wrap the part of the command you want executed on the remote machine in double quotes. E.g. find, grep, and awk will be executed remotely, while less will be executed locally.
ssh user@remote "find ... | grep ... | awk ..." | less
As tripleee added in the comments, it's better to use single quotes if there is no variable substitution in the quoted string. So use double quotes if there is a variable inside the remote command:
ssh user@remote "find $foo | grep ... | awk ..." | less
or single quotes if no variable is involved:
ssh user@remote 'find "foo" | grep ... | awk ...' | less
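The same quoting rule can be dry-run locally, with bash -c playing the remote shell (the printf | grep pipeline is a made-up stand-in for find | grep | awk):

```shell
# The single-quoted pipeline runs in the stand-in "remote" shell;
# the final sort runs in the local shell.
out=$(bash -c 'printf "b\na\nc\n" | grep -v c' | sort)
echo "$out"
```

Everything inside the quotes travels to the "remote" side as one string; the unquoted | sort is parsed by the local shell and gets the remote pipeline's combined output.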

How to echo text and have output from a command be sent to a file in parallel

I would like to echo text and also remotely ping a computer, and have the output sent to a log file. I also need to do this in parallel, but I am having some trouble with how the output is written to the log file.
I would like the output to look like:
host | hostTo | result of ping command
but since I have this running as a background process it is outputting:
host hostTo host hostTo rtt
rtt
rtt
etc...
Is there a way to keep this a background process but make the echo part of that process, so the logfile isn't out of order?
here's my script, thanks in advance!
for host in `cat data/ips.txt`; do
echo -n "$host ";
for hostTo in `cat data/ips.txt`; do
{
echo -n "$host $hostTo " >> logs/$host.log;
(ssh -n -o StrictHostKeyChecking=no -o ConnectTimeout=1 -T username#$host ping -c 10 $hostTo | tail -1 >> logs/$host.log) &
};
done;
done
It's possible to do this with awk. What you're basically asking is how to print the hosts as well as the result at the same time,
i.e. remove the line with echo and change the ssh line to the following:
ssh .... ping -c 10 $hostTo | awk -v from="$host" -v to="$hostTo" 'END {print from, to, $0}' >> logs/${host}.log
Note that the tail -1 is effectively done inside awk as well, since $0 still holds the last record in the END block. Passing shell variables into awk tends to be a PITA; assigning them with -v avoids most of the escaping and quoting. [Updated: assign vars in awk]
PS. The title of your question is a little unclear; it sounds like you want to pipe your program output to the display and a file at the same time.
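A quick local check of the awk trick, with printf standing in for the ssh ... ping output (node1/node2 and the rtt line are made-up sample data):

```shell
host=node1; hostTo=node2
# In the END block, $0 still holds the last record read (gawk/mawk/BSD awk),
# so this folds the "tail -1" into awk itself.
line=$(printf 'PING stats\nrtt min/avg/max = 1/2/3 ms\n' |
       awk -v from="$host" -v to="$hostTo" 'END {print from, to, $0}')
echo "$line"
```

Because the host names and the ping summary now come out of one process in one write, the background jobs can no longer interleave a bare echo with another job's ping output.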
