copying a file on remote server using cat command not working - bash

I am trying to copy a file from a remote server using the expect script below.
I cannot use scp or sftp etc.
#!/usr/bin/expect
set timeout -1
spawn /usr/bin/ssh -q root@testserver cat /tmp/passfile > /tmp/localpassfile
expect "assword"
send "welcome1\r"
expect eof
It's not working.
But the command below works fine when I execute it in a shell:
ssh -q root@testserver cat /tmp/passfile > /tmp/localpassfile

You are now redirecting the output to /tmp/localpassfile on testserver. Try:
/usr/bin/ssh -q root@testserver "cat /tmp/passfile" > /tmp/localpassfile
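The difference is which shell sees the `>`. A minimal local sketch, with `sh -c` standing in for the remote shell that ssh starts (all paths illustrative):

```shell
#!/bin/sh
# Stand-in demo: each inner `sh -c` plays the remote shell that ssh starts.
mkdir -p /tmp/redirdemo && cd /tmp/redirdemo
echo secret > passfile

# Unquoted: the ">" becomes part of the remote command, so the copy is
# created on the remote side.
sh -c 'cat passfile > remotecopy'

# Quoted: the local shell handles ">", capturing the remote stdout into a
# local file.
sh -c 'cat passfile' > localcopy

cmp -s passfile localcopy && echo "local copy matches"
```

Note that expect's `spawn` does not go through a shell at all, so inside the expect script the whole command line would itself need to be wrapped, e.g. `spawn sh -c "/usr/bin/ssh -q root@testserver cat /tmp/passfile > /tmp/localpassfile"` (untested sketch).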

SSH to remote server, output to remote file?

I'm ssh'ing to a remote server and running a command. This works fine.
sshpass -p password ssh user@192.168.0.1 /bin/bash << EOF
cd /usr
ls -lh
EOF
But I want to save the output of ls -lh to a file on the remote or local server.
I've tried the following but that doesn't seem to do anything locally or remotely.
sshpass -p password ssh user@192.168.0.1 /bin/bash << EOF
cd /usr
ls -lh > /home/user/filename
EOF
How do I do this?
Thanks
UPDATE:
Quick update: the live script will be running multiple commands and I only want the results of one command sent to the file. Thanks
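For reference, the here-document body is executed by whichever bash reads it, so a redirection inside it lands on that side. A local stand-in, with plain `bash` in place of `sshpass ... ssh` and a hypothetical output path:

```shell
#!/bin/sh
# Stand-in: plain bash plays the remote bash; the redirection inside the
# here-document is performed by that bash, on its side of the connection.
bash <<'EOF'
cd /usr
ls -lh > /tmp/heredoc_listing
EOF
wc -l < /tmp/heredoc_listing
```

Note that with an unquoted `EOF` delimiter, as in the question, variables in the body are expanded by the local shell before the text reaches the remote bash.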

How to use a source file while executing a script on a remote machine over SSH

I am running a bash script on a remote machine over SSH.
ssh -T $DBHOST2 'bash -s' < $DIR/script.sh <arguments>
Within the script I am using a source file that defines the functions used in script.sh.
DIR=`dirname $0` # to get the location where the script is located
echo "Directory is $DIR"
. $DIR/source.bashrc # source file
But since the source file is not present in the remote machine it results in an error.
Directory is .
./source.bashrc: No such file or directory
I can always define the functions along with the main script rather than using a source file, but I was wondering is there any way to use a separate source file.
Edit : Neither the source file nor the script is located in the remote machine.
Here are two ways to do this - both requiring only one ssh session.
Option 1: Use tar to copy your scripts to the server
tar cf - $DIR/script.sh $DIR/source.bashrc | ssh $DBHOST2 "tar xf -; bash $DIR/script.sh <arguments>"
This 'copies' your scripts to your $DBHOST2 and executes them there.
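A local sketch of Option 1, with a subshell in another directory standing in for the ssh session (file names and contents are hypothetical):

```shell
#!/bin/sh
# Stand-in demo: a subshell in another directory plays the ssh session.
mkdir -p /tmp/tardemo/local /tmp/tardemo/remote
cd /tmp/tardemo/local
printf 'greet() { echo "hello from source"; }\n' > source.bashrc
printf '. ./source.bashrc\ngreet\n' > script.sh
# Pack both files into the pipe; the "remote" end unpacks and runs them,
# just as `ssh $DBHOST2 "tar xf -; bash script.sh"` would.
tar cf - script.sh source.bashrc | ( cd /tmp/tardemo/remote && tar xf - && bash script.sh )
```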
Option 2: Use bashpp to include all code in one script
If copying files onto $DBHOST2 is not an option, use bashpp.
Replace your . calls with #include and then run it through bashpp:
bashpp $DIR/script.sh | ssh $DBHOST2 bash -s
Alternatively, concatenate the source file and the script and feed the result to the remote shell:
ssh -T $DBHOST2 'bash -s' <<< "$(cat source_file $DIR/script.sh)"
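A local stand-in for the here-string approach, feeding the concatenated source file and script to a fresh bash (names and contents are hypothetical):

```shell
#!/bin/sh
# Stand-in for: ssh -T $DBHOST2 'bash -s' <<< "$(cat source_file script.sh)"
# A fresh bash plays the remote shell.
mkdir -p /tmp/srcdemo && cd /tmp/srcdemo
printf 'greet() { echo "hello from sourced function"; }\n' > source.bashrc
printf 'greet\n' > script.sh
# The remote shell reads the function definitions first, then the script body.
cat source.bashrc script.sh | bash -s
```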
The following achieves what I am trying to do.
1. Copy the source file to the remote machine:
scp $DIR/source.bashrc $DBHOST2:./
2. Execute the local script with arguments on the remote machine via SSH:
ssh $DBHOST2 "bash -s" -- < $DIR/script.sh <arguments>
3. Copy the remote logfile logfile.log to the local file dbhost2.log, then remove the source file and logfile from the remote machine:
ssh $DBHOST2 "cat logfile.log; rm -f source.bashrc logfile.log" > dbhost2.log

How to copy echo 'x' to file during an ssh connection

I have a script which starts an ssh connection.
The variable $SSH starts the ssh connection, so $SSH hostname prints the hostname of the host I ssh to:
SSH="ssh -tt -i key.pem user@ec2-instance"
Now I try to echo something and copy the output of the echo to a file.
When I perform a manual ssh to the host and perform:
sudo sh -c "echo 'DEVS=/dev/xvdbb' >> /etc/sysconfig/docker-storage-setup"
it works.
But when I perform
${SSH} sudo sh -c "echo 'DEVS=/dev/xvdb' > /etc/sysconfig/docker-storage-setup"
it does not seem to work.
EDIT:
Using tee also works fine after ssh'ing manually, but not after the ssh in script.sh.
The echo command after the script's ssh runs on my local host (the one I run the script from), not on the host I ssh to. So the file on my local host is changed instead of the file on the remote host.
The command passed to ssh will be executed by the remote shell, so you need to add one level of quoting:
${SSH} "sudo sh -c \"echo 'DEVS=/dev/xvdb' > /etc/sysconfig/docker-storage-setup\""
The only thing you really need on the server is the writing though, so if you don't have password prompts and such you can get rid of some of this nesting:
echo 'DEVS=/dev/xvdb' | $SSH 'sudo tee /etc/sysconfig/docker-storage-setup'
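Both forms can be checked without a remote host. In this sketch `sh -c` stands in for the remote shell that ssh starts (sudo is left out), and the target path is hypothetical:

```shell
#!/bin/sh
# Stand-in demo: `sh -c` plays the remote shell that ssh starts.
SSH="sh -c"    # the real value was: ssh -tt -i key.pem user@ec2-instance
out=/tmp/quotedemo.conf

# With the extra (escaped) quoting level, the "remote" shell still sees a
# complete sh -c command, so the redirection happens on its side:
${SSH} "sh -c \"echo 'DEVS=/dev/xvdb' > $out\""
cat "$out"

# tee variant: the content travels over stdin, so one quoting level suffices:
echo 'DEVS=/dev/xvdb' | ${SSH} "tee $out" > /dev/null
cat "$out"
```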

Wait for a process to finish

Could you please help me with a hint about the issue below?
I have to send a command to a host (the command needs a lot of time to execute and creates a file):
ssh uname1@host1 ssh uname2@host2 'command1'
After this command executes, I need to gzip the file created:
ssh uname1@host1 ssh uname2@host2 'gzip file1'
Then do the same thing for another host:
ssh uname3@host3 ssh uname4@host4 'command1'
ssh uname3@host3 ssh uname4@host4 'gzip file2'
Is it possible to run both of these commands in parallel in order to save time for script execution?
Thank you in advance.
Try something like:
ssh uname2@host2 'command1 && gzip file1' &
ssh uname2@host3 'command1 && gzip file1' &
ssh uname2@host4 'command1 && gzip file1' &
You can put all the commands in a file on the host you start from.
&& in this context works like ;, but the second command is only executed if the first succeeds.
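A local sketch of the background-job pattern, with `sleep` standing in for the long-running remote commands (the real lines would be the ssh commands above); `wait`, not shown in the answer, blocks until every background job has finished:

```shell
#!/bin/sh
# Stand-in demo: each subshell plays one `ssh ... 'command1 && gzip ...' &`.
start=$(date +%s)
( sleep 2; echo done > /tmp/par_host2 ) &
( sleep 2; echo done > /tmp/par_host3 ) &
wait    # block until both background jobs have finished
elapsed=$(( $(date +%s) - start ))
echo "elapsed: ${elapsed}s"   # roughly 2s, not 4s: the jobs overlapped
```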
Simply do:
ssh uname1@host1 ssh uname2@host2 'command1; gzip file1'
and if the gzip should run only if the first command succeeds, then:
ssh uname1@host1 ssh uname2@host2 'command1 && gzip file1'
The second command will be launched after the first one.

How to execute bash with ssh connection

I'm trying to write a bash script to automatically do stuff on client machines in the network. But after the line
ssh -i ~/.ssh/key root@machine
the bash program just stops.
What can I do to send the command to the remote machine?
Thanks
Same way as if you were invoking bash directly.
ssh ... somescriptontheserver.sh
ssh ... ls -lR /
ssh ... << EOF
ls -lR /
EOF
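A local sketch of the script-over-stdin form, with plain `bash` standing in for the shell that ssh would start on the remote machine (the script path is hypothetical):

```shell
#!/bin/sh
# Stand-in demo: plain bash plays the remote shell.
cat > /tmp/demo_remote.sh <<'EOF'
echo "commands run by the spawned shell"
EOF
# Equivalent of: ssh -i ~/.ssh/key root@machine 'bash -s' < /tmp/demo_remote.sh
bash -s < /tmp/demo_remote.sh
```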
