So here's my setup:
Laptop -> Host 1 -> Host 2 -> Host 3
Laptop can reach Host 1, but not Host 2 or Host 3
Host 1 can reach Host 2, but not Host 3
Host 3 can reach Host 2, but not Host 1
What I'm trying to do is set up remote forwards so that a process running on Host 3 gets routed to a service running on Laptop. I've successfully done this using the following code:
Run from Laptop:
require 'rubygems'
require 'net/ssh'

threads = []
config = {:user => "user", :remote_port => 3333, :service_port => 2222}

threads << Thread.new {
  Net::SSH.start("host1", config[:user]) do |ssh|
    puts "Forwarding port #{config[:remote_port]} on host1 to #{config[:service_port]} on localhost"
    ssh.forward.remote(config[:service_port], "localhost", config[:remote_port], "127.0.0.1")
    ssh.exec! "ssh #{config[:user]}@host2 -R #{config[:remote_port]}:localhost:#{config[:remote_port]}"
    ssh.loop { true }
  end
}

threads << Thread.new {
  Net::SSH.start("host3", config[:user]) do |ssh|
    puts "Creating local forward for port #{config[:service_port]} on host3 to port #{config[:remote_port]} on host2"
    ssh.exec! "ssh #{config[:user]}@host2 -L #{config[:service_port]}:localhost:#{config[:remote_port]}"
    ssh.loop { true }
  end
}

threads.each { |t| t.join }
In one thread, I'm setting up a remote forward from Laptop to Host 1 and then another remote forward from Host 1 to Host 2. In a separate thread, I'm starting another connection from Laptop to Host 3, then running a local forward from Host 3 to Host 2.
The only reason I can connect from Laptop to Host 3 at all is my .ssh/config file, which automatically routes me through Host 1 and Host 2 whenever I connect to Host 3 from Laptop.
What I want to do is cut out the second thread where I'm connecting from Laptop to Host 3 so that I can remove the dependency on the .ssh/config file. I want to do all of my connections from within the first thread.
So basically I need to do multiple hops that originate from Laptop. I can initiate the first connection from Laptop to Host 1 and then execute a command on Host 1, but after that I can't get any further. What I need to do is initiate the connection from Laptop to Host 1, set up the forward on Host 1, connect to Host 2 from Host 1 and then set up the second forward on Host 2.
Is this possible to do with net/ssh?
Thanks for your help!
I fixed this by writing out an SSH config file, then specifying the config file when initiating the connection. The config file contains proxy commands that will automatically route through the necessary hosts to reach my destination.
Example:
def write_config_file
  File.open('config_file', 'w') { |f|
    f << "Host host1\n"
    f << "  HostName host1.something.net\n"
    f << "  User user_name\n"
    f << "\n"
    f << "Host host2\n"
    f << "  HostName host2.something.net\n"
    f << "  ProxyCommand ssh host1 nc %h %p 2> /dev/null\n"
    f << "  User user_name\n"
    f << "\n"
    f << "Host host3\n"
    f << "  HostName host3.something.net\n"
    f << "  ProxyCommand ssh host2 nc %h %p 2> /dev/null\n"
    f << "  User user_name\n"
  }
end
write_config_file
Net::SSH.start("host3", "user_name", :config => './config_file') do |ssh|
#whatever you need to do...
end
I wrapped the connection in a begin/rescue block and trapped Ctrl+C; in the trap block I delete the config file and shut down the connection.
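Roughly, that wrapper looks like the sketch below. The trap/rescue details are illustrative rather than my exact code; the write_config_file helper and the config_file path are the ones from the example above.

require 'rubygems'
require 'net/ssh'

write_config_file

# On Ctrl+C, remove the generated config file before exiting.
trap("INT") do
  File.delete('config_file') if File.exist?('config_file')
  exit
end

begin
  Net::SSH.start("host3", "user_name", :config => './config_file') do |ssh|
    # whatever you need to do...
  end
rescue StandardError => e
  warn "SSH session failed: #{e.message}"
ensure
  # Always clean up the temporary config file when the session ends.
  File.delete('config_file') if File.exist?('config_file')
end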
Related
I have Docker with nginx. Nginx runs a bash script via Lua:
server {
    listen 80 default;

    location /sign {
        if ($arg_data = '') {
            return 404 '{"Error": "Param data is not set"}';
        }
        content_by_lua '
            command = "/root/test.sh " .. ngx.var.arg_data
            local handle = io.popen(command);
            local result = handle:read("*a");
            handle:close();
            ngx.print(result);';
    }
}
Bash script test.sh
#!/bin/bash
sleep 1h
echo "test"
I have 8 nginx workers, so after 8 HTTP requests I look at the output of ps ax.
Why are there 3 processes per worker?
If I make a new HTTP request via curl, I don't see an error; the connection just hangs. What is holding this connection open if all the workers are busy?
I have the following launch config for an auto-scaling group:
resource "aws_launch_configuration" "ASG-launch-config" {
#name = "ASG-launch-config" # see: https://github.com/hashicorp/terraform/issues/3665
name_prefix = "ASG-launch-config-"
image_id = "ami-a4dc46db" #Ubuntu 16.04 LTS
#image_id = "ami-b70554c8" #Amazon Linux 2
instance_type = "t2.micro"
security_groups = ["${aws_security_group.WEB-DMZ.id}"]
key_name = "MyEC2KeyPair"
#user_data = <<-EOF
# #!/bin/bash
# echo "Hello, World" > index.html
# nohup busybox httpd -f -p "${var.server_port}" &
# EOF
provisioner "file" {
source="script.sh"
destination="/tmp/script.sh"
}
provisioner "remote-exec" {
inline=[
"chmod +x /tmp/script.sh",
"sudo /tmp/script.sh"
]
}
connection {
user="ubuntu"
private_key="${file("MyEC2KeyPair.pem")}"
}
lifecycle {
create_before_destroy = true
}
}
Error: Error applying plan:
1 error(s) occurred:
aws_launch_configuration.ASG-launch-config: timeout - last error: dial tcp :22: connectex: No connection could be made because the target machine actively refused it.
I want to run a bash script that basically installs WordPress on the created instances.
The script runs fine in a resource of type "aws_instance" "example".
How can I troubleshoot this?
Sounds like the instance is denying your traffic. Start up an instance without the provisioning script and see if you can SSH to it using the key you provided. You may want to add verbose logging to the SSH command with -v.
I am trying to run the expect script below to log in to multiple hosts and run some commands on them.
I have two files: 1) host.txt, which contains the host names, and 2) commands.txt, which contains the commands to execute.
However, when I run it, it throws the error shown below. Please advise; I went through all the solutions on Stack Overflow but did not get it working.
SCRIPT CODE .....
$ cat myexpc
#!/usr/bin/expect -f
# Set up various other variables here ($user, $password)
set timeout 20
set user karn
set password 123

# Get the list of hosts, one per line
set f [open "host.txt"]
set hosts [split [read $f] "\n"]
close $f

# Get the commands to run, one per line
set f [open "commands.txt"]
set commands [split [read $f] "\n"]
close $f

# Iterate over the hosts
foreach host $hosts {
    spawn ssh $user@$host
    expect "password:"
    send "$password\r"

    # Iterate over the commands
    foreach cmd $commands {
        expect "% "
        send "$cmd\r"
    }

    # Tidy up
    expect "% "
    send "continue\r"
    expect eof
    close
}
====================
SCRIPT RUN & FAILURE CODE ..........
$ ./myexpc
spawn ssh karn@Server1
The authenticity of host 'server1 (192.2.47.194)' can't be established.
RSA key fingerprint is 33:r3:31:dd:a1:8a:38:08:b7:b1:e1:24:bc:cd:e2:5c.
Are you sure you want to continue connecting (yes/no)? yes
123
Please type 'yes' or 'no': yes
/bin/grep 192.17.14.1 /etc/ldap.conf /etc/resolv.conf
Please type 'yes' or 'no': yes
y
y
y
y
y
y
it keeps on running
I have code that requires me to connect to one server, rsync to a different server, then connect to that second server and run a bunch of commands on it. But without fail, the second SSH connection throws a 'do_open_failed': open failed (1) (Net::SSH::ChannelOpenFailed) error. Am I doing something wrong here? Is there a way to close the first connection properly so that the second one can connect?
Net::SSH.start(self.from_creds['host'], self.from_creds['user'], :password => self.from_creds['password']) do |ssh|
  channel = ssh.open_channel do |ch|
    ch.exec "/usr/bin/rsync -e ssh -varuzP --exclude=sys-export --delete #{self.from_creds['filepath']}/#{self.client_id}/ #{self.scp_to}/#{new_client_id}" do |ch, success|
      raise "could not execute command" unless success

      # "on_data" is called when the process writes something to stdout
      ch.on_data do |c, data|
        $stdout.print data
      end

      # "on_extended_data" is called when the process writes something to stderr
      ch.on_extended_data do |c, type, data|
        $stderr.print data
      end

      ch.on_close { puts "done!" }
    end
  end
  channel.wait
end

Net::SSH.start(self.to_creds['host'], self.to_creds['user'], :password => self.to_creds['password']) do |ssh1|
  # Do some other stuff here
  tmp_path = "#{self.to_creds['filepath']}/tmp/#{Time.now.to_i}"
  ssh1.exec "mkdir -p #{tmp_path}"
  ssh1.exec "cd #{self.to_creds['filepath']}/#{new_client_id}"
end
According to the documentation, exec doesn't block. Try using exec! instead.
Net::SSH.start(self.to_creds['host'], self.to_creds['user'], :password => self.to_creds['password']) do |ssh1|
  # Do some other stuff here
  tmp_path = "#{self.to_creds['filepath']}/tmp/#{Time.now.to_i}"
  ssh1.exec! "mkdir -p #{tmp_path}"
  ssh1.exec! "cd #{self.to_creds['filepath']}/#{new_client_id}"
end
Alternatively,
Net::SSH.start(self.to_creds['host'], self.to_creds['user'], :password => self.to_creds['password']) do |ssh1|
  # Do some other stuff here
  tmp_path = "#{self.to_creds['filepath']}/tmp/#{Time.now.to_i}"
  ssh1.exec "mkdir -p #{tmp_path}"
  ssh1.exec "cd #{self.to_creds['filepath']}/#{new_client_id}"
  ssh1.loop
end
I am writing a script to read the user name, password, and host info from a file.
I then parse this info to get the variables. I would like to use these variables in an expect script that reads all the IP addresses in my file and performs certain commands on the remote devices I am logging into. The script works when it connects to a known host. However, one device is not up and running, and the script fails with the following error:
ssh: connect to host 192.168.3.2 port 22: No route to host
I would like to do 2 things:
1. Skip the host and move on to the next host.
2. Log the host that is down to another file so that I can troubleshoot the network issue to that host.
Please see the script below. Any help is greatly appreciated.
#!/usr/bin/expect -f

## Read the file
set fid [open /csv_pars/employee1.csv]
set content [read $fid]
close $fid

## Split into records on newlines
set records [split $content "\n"]

## Iterate over the records
foreach rec $records {
    ## Split into fields on comma
    set fields [split $rec ","]

    ## Assign fields to variables and print some out...
    lassign $fields ipaddr username password
    puts "$ipaddr"
    puts "$username"
    puts "$password"

    if {$ipaddr == ""} continue

    spawn ssh -X "$username@$ipaddr"
    sleep 2
    expect "password:"
    sleep 2
    send "$password\r"
    expect "$"
    send -- "ls -l\r"
    expect "$"
    send -- "exit\r"
    expect eof
}
You need to expect to see one of two things: the password prompt, or the error message:
spawn ssh -X "$username@$ipaddr"
expect {
    -re "password: ?$" {
        send "$password\r"
        expect "$"
        send -- "ls -l\r"
        expect "$"
        send -- "exit\r"
        expect eof
    }
    "No route to host" {
        set fid [open error.log a]
        puts $fid "[clock format [clock seconds]]: No route to host $ipaddr"
        close $fid
    }
}