I am trying to copy a file from a remote server using the expect script below.
I cannot use scp, sftp, etc.
#!/usr/bin/expect
set timeout -1
spawn /usr/bin/ssh -q root@testserver cat /tmp/passfile > /tmp/localpassfile
expect "assword"
send "welcome1\r"
expect eof
It's not working.
But the command below works fine when I execute it in the shell:
ssh -q root@testserver cat /tmp/passfile > /tmp/localpassfile
You are currently sending the output to /tmp/localpassfile on testserver: ssh joins everything after the hostname into the remote command, so the remote shell performs the redirection. Quote the remote command so the redirection stays local:
/usr/bin/ssh -q root@testserver "cat /tmp/passfile" > /tmp/localpassfile
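Note that spawn does not go through a shell itself, so if the redirection has to happen from inside the expect script, one option is to wrap the whole command in sh -c. A minimal sketch, assuming the same password prompt as above:
#!/usr/bin/expect
set timeout -1
# sh performs the local redirection; ssh only runs the quoted remote command
spawn sh -c "/usr/bin/ssh -q root@testserver 'cat /tmp/passfile' > /tmp/localpassfile"
expect "assword"
send "welcome1\r"
expect eof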
I am trying to scp the latest file from a remote machine, but I am getting the error below:
can't read "(ssh root@1.1..... "ls -t /test/*txt | head -1")": no such variable
My expect script:
spawn scp -r root@$remote_ip:/test/$(ssh root@$remote_ip "ls -t /test/*txt | head -1") /mypath
How should I get the latest file from the remote machine with an expect script?
$(...) is shell syntax. To perform the same functionality in Tcl/expect, use the exec command:
spawn scp -r root@$remote_ip:/test/[exec ssh root@$remote_ip "ls -t /test/*txt | head -1"] /mypath
It doesn't have to be a single line; for maintainability, split it:
set latest [exec ssh root@$remote_ip "ls -t /test/*txt | head -1"]
spawn scp -r root@$remote_ip:/test/$latest /mypath
However, I suspect you're using expect to send the passwords, either:
spawn ssh root@$remote_ip "ls -t /test/*txt | head -1"
expect "password"
send "$passwd\r"
expect eof
# parse $expect_out(buffer) to extract the file
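For example, that parsing could look like this (a sketch, assuming the newest file name is the last non-empty line of the captured output):
set latest ""
foreach line [split $expect_out(buffer) "\n"] {
    # string trim also strips the trailing carriage returns
    set line [string trim $line]
    if {$line ne ""} { set latest $line }
}
spawn scp -r root@$remote_ip:/test/$latest /mypath
expect "password"
send "$passwd\r"
expect eof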
But, your life will be much easier if
you set up ssh key authentication and avoid expect altogether:
ssh-keygen
ssh-copy-id root@$remote_ip
latest=$(ssh root@$remote_ip "ls -t /test/*txt | head -1")
scp -r root@$remote_ip:/test/$latest /mypath
scp -r root@$remote_ip:"$(ssh root@$remote_ip 'ls /test/* -1td | head -1')" /mypath/.
A shell script needs to:
ssh to Host2 from Host1
cd /test/test1/log
grep logs.txt for the string "error"
write the grepped output to a file
and move that file to Host1
This can be accomplished by specifying the -f option to ssh:
ssh user@host -f 'echo "this is a logfile">logfile.txt'
ssh user@host -f 'grep logfile logfile.txt' > locallogfile.txt
cat locallogfile.txt
An example using a different directory, with cd changing into it:
ssh user@host -f 'mkdir -p foo/bar'
ssh user@host -f 'cd foo/bar ; echo "this is a logfile">logfile.txt'
ssh user@host -f 'cd foo/bar ; grep logfile logfile.txt' > locallogfile.txt
cat locallogfile.txt
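Applied to the paths in the question, a minimal sketch (error_output.txt is a hypothetical name for the file created on Host1):
ssh user@Host2 'cd /test/test1/log && grep error logs.txt' > error_output.txt
Redirecting locally like this writes the grep output straight to a file on Host1, so there is no separate file to move afterwards.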
I'm trying to write a script to automate connecting to various remote instances, but I am having a hard time piping the resulting hostname as an argument to ssh. I'm basically trying to do the following:
echo "example.com" | xargs -I {} ssh {}
I also tried a bunch of combinations, but to no avail. The closest I got was with the following, but it loses interactivity.
echo "example.com" | xargs -0 ssh -t -t
The end goal is to have a script that returns a hostname/IP address that can then be connected to via ssh. For example:
my_random_script | ... ssh
Maybe you want to run:
ssh `echo example.com`
ssh `your_random_script`
When you put a command in backquotes, the command is executed and its output is substituted in place of the backquoted expression. To save the result for reuse, you can do something like:
VAR=`your_random_script`
ssh $VAR
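The modern $(...) form of command substitution does the same thing and is easier to quote and nest:
ssh "$(your_random_script)"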
I've set up my public and private keys and have automated ssh login. I want to execute two commands, say command1 and command2, in one login session and store their output in the files command1.txt and command2.txt on the local machine.
I'm using this code:
ssh -i my_key user@ip 'command1; command2' and the two commands get executed in one login, but I have no clue how to store their output in two different files.
I want to do this because I don't want to repeatedly ssh into my remote host.
Unless you can parse the actual outputs of the two commands and distinguish which is which, you can't. You will need two separate ssh sessions:
ssh -i my_key user@ip command1 > command1.txt
ssh -i my_key user@ip command2 > command2.txt
You could also redirect the outputs to files on the remote machine and then copy them to your local machine:
ssh -i my_key user@ip 'command1 > command1.txt; command2 > command2.txt'
scp -i my_key user@ip:'command*.txt' .
No, you will have to do it separately with separate commands (multiple logins), as already mentioned by @lanzz. To save the output locally, do something like:
ssh -i my_key user@ip "command1" > ./file_on_local_host.txt
If you want to run multiple commands in a single login, put all your commands in a script and run that script over SSH instead of running the commands one by one.
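A sketch of that approach (my_commands.sh is a hypothetical local script; bash -s makes the remote shell read it from standard input):
# run a local script on the remote host in a single login
ssh -i my_key user@ip 'bash -s' < my_commands.sh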
It's possible, but probably more trouble than it's worth. If you can generate a unique string that is guaranteed not to be in the output of command1, you can do:
$ ssh remote 'cmd1; echo unique string; cmd2' |
awk '/^unique string$/ { output="cmd2"; next } { print > output }' output=cmd1
This simply starts printing to the file cmd1, and then changes output to the file cmd2 when it sees the unique string. You'll probably want to handle stderr as well. That's left as an exercise for the reader.
Option 1: Tell your boss he's being silly. Unless, of course, he isn't and there is a critical reason for needing it all in one session; for some reason such a case escapes my imagination.
Option 2: Why not tar?
ssh -i my_key user@ip 'command1 > out1; command2 > out2; tar cf - out*' | tar xf -
You can do this. Assuming you can set up authentication from the remote machine back to the local machine, you can use ssh to pipe the output of the commands back. The trick is getting the backslashes right.
ssh remotehost command1 \| ssh localhost cat \\\> command1.txt \; command2 \| ssh localhost cat \\\> command2.txt
Or if you aren't so into backslashes...
ssh remotehost 'command1 | ssh localhost cat \> command1.txt ; command2 | ssh localhost cat \> command2.txt'
Join them using && so you can have it like this:
ssh -i my_key user@ip "command1 > command1.txt && command2 > command2.txt && command3 > command3.txt"
Hope this helps
I was able to; here's exactly what I did:
ssh root@your_host "netstat -an;hostname;uname -a"
This performed the commands in order and cat'd the output onto my screen perfectly.
Make sure you start and finish with the quotation marks, or else it'll run the first command remotely and then run the remainder of the commands against your local machine.
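For example, without the quotes the local shell consumes the semicolons, so only the first command runs remotely:
ssh root@your_host netstat -an; hostname; uname -a
Here netstat -an runs on your_host, while hostname and uname -a run on your local machine.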
I have an RSA key pair set up for my server, so if you want to avoid the credential check, you'll obviously have to create such a pair.
I think this is what you need:
First you need to install sshpass on your machine.
Then you can write your own script:
while read pass port user ip; do
sshpass -p$pass ssh -p $port $user@$ip <<ENDSSH1
COMMAND 1 > file1
.
.
.
COMMAND n > file2
ENDSSH1
done <<____HERE
PASS PORT USER IP
. . . .
. . . .
. . . .
PASS PORT USER IP
____HERE
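For example, one data row of the HERE document might look like this (all values hypothetical):
mysecretpass 22 root 192.168.1.10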
How can I run multiple commands on a remote server using a single ssh connection?
[root@nismaster ~]# ssh 192.168.122.169 "uname -a;hostname"
root@192.168.122.169's password:
Linux nisclient2 2.6.18-164.el5 #1 SMP Tue Aug 18 15:51:54 EDT 2009 i686 i686 i386 GNU/Linux
nisclient2
OR
[root@nismaster ~]# ssh 192.168.122.169 "uname -a && hostname"
root@192.168.122.169's password:
Linux nisclient2 2.6.18-164.el5 #1 SMP Tue Aug 18 15:51:54 EDT 2009 i686 i686 i386 GNU/Linux
nisclient2