My requirement is like this:
I need to log in to a remote device (say a router or switch) and execute the following commands:
telnet xx.xx.xx.xx
//give password here
sys
interface g x/x/x
shut
desc free-port
exit
There are hundreds of devices, and I cannot waste time doing the above thing hundreds of times, so I need to write an automated script. My questions are as follows:
I use a Windows system, so what is the best scripting language to use: Ruby, shell script, or Perl? (I was formerly a RoR developer, so I know Ruby and the Linux terminal. Now I am working in the networking domain.)
What I thought was: put all the devices into an array and, using a for loop, call the devices one by one and execute the commands above.
I don't have any scripting knowledge, so please guide me further; I don't know where to start.
Step 1: decide on the file structure of your program.
For example, this is the simplest structure:
if_admin/
|--config.yml
|--run.rb
Step 2: write a config file (or a bunch of config files) that contains the parts of the commands that differ from target to target.
For example, you can use a YAML file like this:
xx.xx.xx.xx:
  password: s3cret
  router-shelf: x
  slot: x
  port: x
yy.yy.yy.yy:
  ...
Step 3: implement what you want to do:
require 'yaml'
require 'net/telnet'

config = YAML.load_file('./config.yml')

config.each do |host, conf|
  telnet = Net::Telnet.new('Host' => host)
  # The device prompts only for a password, so wait for the prompt and
  # send it; Net::Telnet#login would expect a username prompt as well.
  telnet.waitfor(/password/i)
  telnet.puts(conf['password'])
  telnet.puts <<-CMD
sys
interface g #{conf['router-shelf']}/#{conf['slot']}/#{conf['port']}
shut
desc free-port
  CMD
  telnet.close
end
If you can use an expect script, you are in luck.
#!/usr/bin/expect
set timeout 60
set cmds [list "ssh host1 ..." "ssh host2 ..." "ssh host3 ..."]
foreach cmd $cmds {
    spawn -noecho bash -c $cmd
    expect {
        -re "password" {
            exp_send "$env(PASS_WORD)\r"
            exp_continue
        }
        eof { wait } ;# at this point the last spawned process has exited
    }
}
Here is the rough idea of the above script:
set cmds [list ...] stores the set of commands to run.
foreach iterates through those commands.
spawn starts a process for each command. You can run multiple commands over a single telnet session in bash; just break the command string with \ (backslash) so it stays readable and extendable.
The expect block sends the password whenever the output matches the given regex.
eof waits until the spawned process has finished.
set timeout -1 disables the timeout entirely; the default timeout for an expect script is 10 seconds.
You can add one more foreach loop for a host list, as sketched below.
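For illustration, here is a minimal sketch of that outer loop; the host names and per-host commands are placeholders, and the same PASS_WORD environment variable is assumed:
#!/usr/bin/expect
set timeout 60
set hosts [list "host1" "host2" "host3"]
foreach host $hosts {
    set cmds [list "ssh $host uptime" "ssh $host hostname"]
    foreach cmd $cmds {
        spawn -noecho bash -c $cmd
        expect {
            -re "password" {
                exp_send "$env(PASS_WORD)\r"
                exp_continue
            }
            eof { wait }
        }
    }
}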
I think this will be enough to get you started on your automation process.
As to the question of "What is the best scripting language to be used", I would say go with one that does what you need and that you're comfortable using.
If you want to go with Perl, one module that you could use is Net::Telnet. Of course, you'll need Perl itself. I'd recommend using Strawberry Perl, which should already have Net::Telnet installed.
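For illustration, a minimal sketch with Net::Telnet, assuming a device that prompts only for a password; the host, password, prompt pattern, and interface are placeholders to adjust for your devices:
#!/usr/bin/perl
use strict;
use warnings;
use Net::Telnet;

my $host     = 'xx.xx.xx.xx';   # placeholder device address
my $password = 's3cret';        # placeholder password

my $t = Net::Telnet->new(
    Host    => $host,
    Timeout => 10,
    Prompt  => '/[>#]\s*$/',    # adjust to your device's CLI prompt
);
$t->waitfor('/password[: ]*$/i');   # wait for the password prompt
$t->print($password);
$t->waitfor($t->prompt);            # wait for the CLI prompt
$t->cmd('sys');
$t->cmd('interface g x/x/x');
$t->cmd('shut');
$t->cmd('desc free-port');
$t->close;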
Another possible route is to use PuTTY, an SSH and telnet client. You could combine it with TTY Plus, which provides a tabbed interface for multiple PuTTY sessions and lets you issue commands to several sessions at once. This is one possibility that wouldn't involve much code writing.
Related
I have a script that helps me update the IOS of my Cisco devices whenever I need to. It works fine, and I have no issues with the script itself other than the fact that it only does one device at a time.
Is there something I can research to make the script run asynchronously, so it can handle multiple sessions at one time?
The script consists of an expect script which is setup like so:
set timeout 6
set hostname [lindex $argv 0]
set password [lindex $argv 1]
spawn ssh $hostname
expect "TACACS*:"
send "$password\r"
expect "#"
send "term length 0\r"
< other similar commands >
interact
The main bash script works as follows:
IP=$(cat ./iphosts)
read -p "Please enter your TACACS Password:" password
for i in $IP
do
    expect 01.exp $i $password | tee -a bulk.log
done
Both the expect and .sh scripts have a little bit more to them, but those are mostly post-completion tasks like reporting or additional commands.
Thank you for any information that you can provide on this!
You can use the xargs tool to start a number of processes in parallel. For example:
#!/bin/sh
read -p "Please enter your TACACS Password:" password
xargs -IADDRESS -P4 expect 01.exp ADDRESS $password < ./iphosts
This uses the -P argument to xargs to run up to 4 processes at a time. You could scale up the argument to -P to run more processes in parallel.
But there's a problem here: you're calling interact in your expect script, which suggests that the script is expecting (possibly requires) interactive input from you when it is running. If this is the case, the solution presented here won't work. You would need to rewrite your expect script so that it does not require any user interaction.
You may also want to investigate a tool like Ansible, which (a) does this sort of parallel execution by default and (b) has explicit support for configuring a variety of network devices.
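For illustration, a minimal Ansible playbook sketch; the group name, interface, and module (here cisco.ios.ios_config from the cisco.ios collection) are assumptions to adapt to your platform and inventory:
# playbook.yml
- hosts: switches
  gather_facts: no
  connection: network_cli
  tasks:
    - name: shut down the free port
      cisco.ios.ios_config:
        parents:
          - interface GigabitEthernet0/1
        lines:
          - shutdown
          - description free-port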
I'm writing a script that will eventually execute a list of commands on a switch (via SSH). These commands are stored in a file, and the number of commands will vary.
However, I'm not sure how this can be done using Expect. I know Expect can use a while loop, but I can't find a clear example. Can someone here help?
/usr/bin/expect <<EOD
spawn ssh -o StrictHostKeyChecking=no admin@$switch
expect "*Enter password for admin\:"
send "password\r"
expect "*#"
send "????"
There should be a while loop that reads line by line from a file called "commands", which looks like this:
command 1
command 2
command 3
...
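For reference, a minimal sketch of such a loop, as it could appear in place of the last send; note that inside the unquoted EOD heredoc above, the Tcl variables would need their $ escaped (\$f, \$line) so the shell does not expand them:
# read the "commands" file line by line and send each line to the switch
set f [open "commands" r]
while {[gets $f line] >= 0} {
    send "$line\r"
    expect "*#"
}
close $f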
Extreme Networks XOS has an XML API. You can use this for executing arbitrary commands. See the ExtremeXOS XML API Developer Guide which is listed on the support documentation page.
Managing switches by expect-scripting their CLIs is often erratic and error-prone; I'd recommend that you avoid doing so if possible.
So, I've established a connection via ssh to a remote machine; what I would like to do now is execute a few commands, grab some files, and copy them back to my host machine.
I am aware that I can run
ssh user@host "command1; command2;....command_n"
and then close the connection, but how can I do the same without using the aforementioned syntax? I have a lot of complex commands that have a bunch of quotes and characters that would be a mess to escape.
Thanks!
My immediate thought is: why not put the commands in a script, push it over to the remote machine, and run it there? If you can't for whatever reason, I fiddled around with this and I think you could do well with a heredoc:
ssh -t jane@stackoverflow.com bash << 'EOF'
command 1 ...
command 2 ...
command 3 ...
EOF
and it seems to do the right thing. Play with your heredoc to keep your quotes safe, but it will get tricky. The only other thing I can offer (and I totally don't recommend this) is to use a toy like Perl to read from and write to the ssh process, like so:
open S, "| ssh -i ~/.ssh/host_dsa -t jane#stackoverflow.com bash";
print S "date\n"; # and so on
but this is a really crummy way to go about things. Note that you can do this in other languages.
Instead of the shell, use some scripting language (Perl, Python, Ruby, etc.) and a module that takes care of the ugly work. For example:
#!/usr/bin/perl
use Net::OpenSSH;

my $ssh = Net::OpenSSH->new($host, user => $user);
# Arguments passed as a list are quoted automatically, so shell
# metacharacters are safe:
$ssh->system('echo', 'Net::Open$$H', 'Quot%$', 'Th|s', '>For', 'You!');
$ssh->system({stdout_file => '/tmp/ls.out'}, 'ls');
$ssh->scp_put($local_path, $remote_path);
my $out = $ssh->capture("find /etc");
From here: Can I ssh somewhere, run some commands, and then leave myself a prompt?
The use of an expect script seems pretty straightforward. Copied from the above link for convenience; it's not mine, but I found it very useful.
#!/usr/bin/expect -f
spawn ssh $argv
send "export V=hello\n"
send "export W=world\n"
send "echo \$V \$W\n"
interact
I'm guessing a line like
send "scp -Cpvr someLocalFileOrDirectory you@10.10.10.10:/home/you\n"
would get you your files back...
and then:
send "exit\r"
would terminate the session - or you could end with interact and type in the exit yourself.
I've been wanting to run some Ruby scripts on remote computers (in a bash shell).
I could create a sequence of bash commands of the form ruby -e "<command>", but some of these scripts are over 100 lines.
ruby -e with a heredoc, or %{} and eval(), doesn't work well with the mixture of single and double quotes.
Is there a better way to attempt this?
Edit:
The protocol being used is Apple Remote Desktop, which executes these commands in the scope of the remote shell.
If I understand you correctly, you want to run a local Ruby script on a remote machine via SSH or a similar protocol. If the script is non-interactive (i.e. doesn't require any user input), you can create it locally and deliver it through stdin.
In other words, first write the script and save it locally as, say, foo.rb. Then:
ssh remotehost ruby < foo.rb
That will start the SSH session and execute the remote Ruby interpreter. With no arguments, the Ruby interpreter reads the program from standard input, so we feed the script to ssh on stdin.
As I also want to run ruby scripts via ARD (which I don't think can embed a ctrl-D), I first thought you could combine joraff's solution (to his own problem) with Kelvin's:
cat << ENDOFSCRIPT | ruby
#Here be code ...
ENDOFSCRIPT
Which saves creating/deleting a file.
But there's an even better way:
It turns out (duh) that ARD embeds an EOF, or otherwise terminates what it sends, in such a way that you can simply do:
ruby
#Paste whole script here
Works at least in ARD 3.6.1. Win!
This worked:
cat << 'EOF' > /tmp/myscript.rb
# ruby script contents go here. any syntax is valid, except for your limit string (EOF)
EOF
ruby /tmp/myscript.rb;
rm /tmp/myscript.rb;
Since this doesn't rely on how an interpreter binary handles stdin-style commands, it will work for most other languages as well (Python, Perl, PHP).
Why not send the script over first?
scp foo.rb remotehost:
ssh remotehost "ruby foo.rb"
You could even clean up the file after:
ssh remotehost "rm foo.rb"
I have roughly 12 computers that each have the same script on them. This script merely pings all the other machines and prints out whether each machine is "reachable" or "unreachable". However, it is inefficient to log in to each machine manually using ssh just to execute this script.
Suppose I'm logged into node 1. Is there any way for me to log in to nodes 2-12 automatically using SSH, execute the ping script, pipe the results to a file, log out, and proceed to the next machine? Some kind of bash shell script?
I'm afraid I'm at a loss here, since I haven't had any experience with shell scripting before.
Since the script is on the other machines, you can just have ssh run the command for you there:
ssh $hostname my_script >> results_file
When you specify a command like that, it's executed instead of the login shell.
I'll leave it up to you to figure out how to loop over hostnames!
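If you want a starting point, here is a minimal sketch of such a loop; the node naming scheme and output paths are placeholders:
#!/bin/bash
# Placeholder host names; adjust to match your nodes.
for hostname in node2 node3 node4 node5 node6 node7 node8 node9 node10 node11 node12
do
    ssh "$hostname" my_script >> "results_$hostname.log"
done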
One trick you'll need to use is setting up pre-authorized keys for each host. Then you can run a script on one host, running something like 'ssh hostname command > log.hostname' for each machine, as sketched below.
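For example, a minimal sketch of the key setup, assuming the standard OpenSSH tools and placeholder host names:
#!/bin/bash
# Generate a key pair once (an empty passphrase allows unattended logins),
# then copy the public key to every host.
ssh-keygen -t rsa
for host in node2 node3 node4
do
    ssh-copy-id "$host"
done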
This script might be what you are looking for: It allows you to execute one command (which can be your script) on multiple remote machines via ssh. It's a simple script with bash source available, so you should be able to customize it to your needs:
http://www.heinzi.at/projects/upgradebest.sh/
Yes, you can.
You actually need 2 small scripts, as follows:
remote_ssh.sh (which takes the name of the machine as its first argument; the rest of the arguments are the script you want to execute, with its own arguments)
Example: remote_ssh.sh node5 "echo hello world"
remote_ssh.sh is as follows:
#!/bin/bash
ALL_ARG=$*                     # all arguments as one string
FST_ARG=$1                     # target machine
REST_ARG=${ALL_ARG##$FST_ARG}  # everything after the machine name
echo "Executing REMOTE COMMAND ON $FST_ARG"
/usr/bin/ssh $FST_ARG bash execute_ssh_command.sh $FST_ARG $(pwd) $REST_ARG
execute_ssh_command.sh is as follows:
#!/bin/bash
ALL_ARG=$*                     # all arguments as one string
FST_ARG=$1                     # machine name (passed through)
DIR_ARG=$2                     # directory to run in
REM_ARG="$1 $2"
REST_ARG=${ALL_ARG##$REM_ARG}  # the command itself, with its arguments
cd $DIR_ARG
$REST_ARG
Of course, you have to put these 2 scripts in your path on all your nodes (maybe ~/bin/).
Hope that's helpful.