Ruby SSLServer closes when connected to by a non-SSL client

I have a simple SSL server in ruby:
require "socket"
require "openssl"
tcp_server = TCPServer.new("0.0.0.0", 8443)
ctx = OpenSSL::SSL::SSLContext.new
ctx.key = OpenSSL::PKey::RSA.new File.read params["ssl-key"]
ctx.cert = OpenSSL::X509::Certificate.new File.read params["ssl-cert"]
server = OpenSSL::SSL::SSLServer.new(tcp_server, ctx)
# client handling code
loop do
  client = server.accept
  client.puts("Hello!")
  client.close
end
When I start the server it works, and I can connect to it over SSL. But whenever a client connects without SSL, I get an OpenSSL::SSL::SSLError, the server stops, and I cannot connect again. I have seen solutions for this that involve modifying the SSLServer source code, but for my use case that is not feasible. Is there any other solution that only requires changing my own code?
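One common workaround, sketched under the assumption that the error is raised by server.accept during the failed TLS handshake: rescue it inside the loop, so a plain-TCP client only aborts that one connection instead of the whole server.

# Sketch: keep the accept loop alive when a non-SSL client connects.
# Assumes tcp_server, ctx and server are set up as in the question.
loop do
  begin
    client = server.accept          # performs the TLS handshake
  rescue OpenSSL::SSL::SSLError => e
    warn "Rejected non-SSL client: #{e.message}"
    next                            # go back to accepting connections
  end

  client.puts("Hello!")
  client.close
end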

Related

Access to SSL context in faye-websocket+eventmachine connection

I would like to get a wire dump of a secure websocket connection where I am the client.
I am using the faye-websocket gem in ruby to connect to a secure websocket service. This works well. To understand a specific issue, I need to get a wire dump of the communication. I typically use wireshark for this (running on the same machine as the client). To decrypt the SSL connection, I need to extract the master key to pass it to wireshark. I know how to extract the master key if I have direct access to the socket, but I fail to get access to it when using the faye-websocket gem.
The code to run faye-websocket is pretty standard:
EM.run {
ws = Faye::WebSocket::Client.new('wss://...')
ws.on :open do |event|
p [:open]
### authentication
end
ws.on :message do |event|
p [:message, event.data]
### message - response loop here
end
ws.on :close do |event|
p [:close, event.code, event.reason]
ws = nil
end
}
Inspecting the content of ws, it has a @socket member, but I fail to retrieve it (get_instance_var returns nil).
For the record, once I have the SSL context, I would use the code from
https://www.trustwave.com/en-us/resources/blogs/spiderlabs-blog/how-to-decrypt-ruby-ssl-communications-with-wireshark/
to extract the master key and pass it to Wireshark:
session_id = master_key = nil   # pre-declare so the values are visible outside the block
ssl_socket.session.to_text.each_line do |line|
  if match = line.match(/Session-ID\s*: (?<session_id>.*)/)
    session_id = match[:session_id]
  end
  if match = line.match(/Master-Key\s*: (?<master_key>.*)/)
    master_key = match[:master_key]
  end
end
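A hedged sketch of the next step, assuming session_id and master_key have been captured as above: write them in the NSS key log format, which Wireshark accepts as its (Pre)-Master-Secret log file.

File.open("/tmp/ssl_keylog.txt", "a") do |f|
  # "RSA Session-ID:<hex> Master-Key:<hex>" is the session-ID based entry Wireshark understands
  f.puts "RSA Session-ID:#{session_id} Master-Key:#{master_key}"
end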
Does someone have a solution to get access to the underlying socket and the SSL context?

Ruby open-uri proxy authentication fails

I'm coding a native Ruby script to scrape a website using Nokogiri. Whenever I pass proxy options to the open-uri open() method, it returns 407 Proxy Authentication Required, even though my options do include the authentication details. Here's my code:
proxy_url = URI.parse("http://12.34.567.89:PORT")
session = Nokogiri::HTML(open("http://google.com", :proxy_http_basic_authentication => [proxy_url, "username", "password"]))
Note: as my proxy is a paid one, I have replaced the real proxy credentials with fake ones.
I have a restrictive proxy at work, but the following works.
Try the code with your proxy credentials.
I used Nokogiri here for parsing, but you don't really need it just to fetch the HTML.
require 'net/http'
require 'uri'
require 'nokogiri'
url = 'http://stackoverflow.com/questions/32818853/ruby-open-uri-proxy-authentication-fails'
proxy_host, proxy_port, proxy_user, proxy_pass = '****', 8080, "*****", "*****"
uri = URI.parse(url)
Net::HTTP::Proxy(proxy_host, proxy_port, proxy_user, proxy_pass).start(uri.host, uri.port) do |http|
  http.get(uri.path) do |str|
    puts Nokogiri::HTML(str).text
  end
end
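For comparison, a hedged sketch of the original open-uri approach with the parentheses balanced; the proxy address, port and credentials below are placeholders, and the proxy URI must include the scheme and port. If the 407 persists with correct credentials, the proxy itself is rejecting the authentication.

require 'open-uri'
require 'nokogiri'

proxy_url = URI.parse("http://proxy.example.com:8080")   # placeholder proxy address and port
page = open("http://google.com",
            :proxy_http_basic_authentication => [proxy_url, "username", "password"])
puts Nokogiri::HTML(page.read).title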

web server in ruby and connection keep-alive

Web server example:
require 'rubygems'
require 'socket'
require 'thread'
class WebServer
  LINE_TERMINATOR = "\r\n".freeze

  def initialize(host, port)
    @server = TCPServer.new(host, port)
  end

  def run
    response_body = 'Hello World!'.freeze
    response_headers = "HTTP/1.1 200 OK#{LINE_TERMINATOR}Connection: Keep-Alive#{LINE_TERMINATOR}Content-Length: #{response_body.bytesize}#{LINE_TERMINATOR}".freeze
    loop do
      Thread.new(@server.accept) do |socket|
        puts "request #{socket}"
        sleep 3
        socket.setsockopt(Socket::IPPROTO_TCP, Socket::TCP_NODELAY, 1)
        socket.write(response_headers)
        socket.write(LINE_TERMINATOR)
        socket.write(response_body)
        # socket.close # if this line is uncommented then it works.
      end
    end
  end
end

WebServer.new('localhost', 8888).run
If I refresh the browser without waiting for the end of the cycle, the following requests are not processed.
How can I handle incoming requests on persistent (keep-alive) sockets?
You need to:
1. Keep around the sockets you get from the @server.accept call. Store them in an array (socket_array).
2. Use the IO.select call on that array to get the set of sockets that can be read:
ready = IO.select(socket_array)
readable = ready[0]
readable.each do |socket|
  # Read the request from socket here
  # Do the rest of the processing here
end
3. Don't close the socket after you have sent the data.
If you need more details leave a comment - I can write more of the code.
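For illustration, a hedged sketch of how those pieces might fit together, reusing @server, LINE_TERMINATOR and the response strings from the question's class; this is one possible shape for run, not the only one.

def run
  response_body    = 'Hello World!'.freeze
  response_headers = "HTTP/1.1 200 OK#{LINE_TERMINATOR}Connection: Keep-Alive#{LINE_TERMINATOR}Content-Length: #{response_body.bytesize}#{LINE_TERMINATOR}".freeze
  sockets = []

  loop do
    # Wait until the listening socket or any client socket becomes readable.
    readable, = IO.select(sockets + [@server])

    readable.each do |io|
      if io == @server
        sockets << @server.accept      # new connection: keep it for later requests
      else
        request = io.gets("\r\n\r\n")  # read one request's header block
        if request.nil?                # client closed the connection
          sockets.delete(io)
          io.close
        else
          io.write(response_headers)
          io.write(LINE_TERMINATOR)
          io.write(response_body)
          # no close here: the connection stays open for the next request
        end
      end
    end
  end
end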

Ruby: Connect to remote WebSocket

I'm trying to connect to remote websocket using Celluloid and Websocket client based on celluloid (gem 'celluloid-websocket-client'). The main advantage of this client for me is that I can use callbacks in the form of class methods instead of blocks.
require 'celluloid/websocket/client'
class WSConnection
  include Celluloid

  def initialize(url)
    @ws_client = Celluloid::WebSocket::Client.new url, Celluloid::Actor.current
  end

  # When WebSocket is opened, register callbacks
  def on_open
    puts "Websocket connection opened"
  end

  # When raw WebSocket message is received
  def on_message(msg)
    puts "Received message: #{msg}"
  end

  # When WebSocket is closed
  def on_close(code, reason)
    puts "WebSocket connection closed: #{code.inspect}, #{reason.inspect}"
  end
end

m = WSConnection.new('wss://foo.bar')
while true; sleep; end
The expected output is
"Websocket connection opened"
However, I don't get any output at all. What could be the problem?
I am using
gem 'celluloid-websocket-client', '0.0.2'
rails 4.2.1
ruby 2.1.3
As you noticed in the comments, the gem had no SSL support; that is the trouble. To expand on the answer, here is a resolution, and also some next steps of what to expect in the future:
[ now ] Override methods in Celluloid::WebSocket::Client::Connection
This is an example injection to provide SSL support to the current gem. Mine is actually highly modified, but this shows you the basic solution:
def initialize(url, handler = nil)
  @url = url
  @handler = handler || Celluloid::Actor.current
  #de If you want an auto-start:
  start
end

def start
  uri  = URI.parse(@url)
  port = uri.port || (uri.scheme == "ws" ? 80 : 443)

  @socket.close rescue nil
  @socket = Celluloid::IO::TCPSocket.new(uri.host, port)
  @socket = Celluloid::IO::SSLSocket.new(@socket) if port == 443
  @socket.connect

  @client = ::WebSocket::Driver.client(self)
  async.run
end
The above sends ripple effects through the other methods, however; for example, @handler is used to hold the calling actor, which also has the emitter methods on it. Like I said, my version is very different from the stock gem because I got fed up with it and reworked mine. But then:
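To illustrate that handler/emitter relationship, a hedged sketch of the kind of forwarding the connection does; the exact internals of the gem differ, and the event wiring here is an assumption based on websocket-driver's API and the callbacks defined in the question.

# Inside the connection actor, driver events are forwarded to the handler actor,
# which is why WSConnection above only needs to define on_open / on_message / on_close.
@client.on(:open)    { @handler.async.on_open }
@client.on(:message) { |event| @handler.async.on_message(event.data) }
@client.on(:close)   { |event| @handler.async.on_close(event.code, event.reason) }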
[ soon ] Use Reel::IO::Client and avoid near certain brain damage.
There are exciting things going on with WebSocket support, and a gem is coming to refactor both server and client implementations of websockets. No more monkeypatches required!
All websocket functionality is being extracted from Reel and being combined with a websocket-driver abstraction, as Reel::IO... in both ::Server and ::Client varieties.
Interestingly, this is prompted by Rails which is moving away from EventMachine to Celluloid::IO for websockets:
https://github.com/rails/actioncable/issues/16
https://github.com/celluloid/reel/issues/201
https://github.com/celluloid/reel-io/issues/2
A prealpha is online for preview: https://github.com/celluloid/reel-io

Encrypted Transfer With Ruby FTPS

I am trying to fetch files from a server using FTPS. I'm able to authenticate but when I try to list/fetch the files, I get a "521 Data connections must be encrypted". Is the Net::FTP module capable of this, and how would I accomplish it?
I modified Net::FTPTLS into my own class because I needed to store a self-signed cert.
require 'socket'
require 'openssl'
require 'net/ftp'
module MP
  class FTPS < Net::FTP
    def connect(host, port = FTP_PORT)
      @hostname = host
      super
    end

    def login(user = "anonymous", passwd = nil, cert_file = nil, acct = nil)
      store = OpenSSL::X509::Store.new
      if cert_file == nil
        store.set_default_paths
      else
        certraw = File.read(cert_file)
        cert = OpenSSL::X509::Certificate.new(certraw)
        store.add_cert(cert)
      end

      ctx = OpenSSL::SSL::SSLContext.new('SSLv23')
      ctx.cert_store = store
      ctx.verify_mode = OpenSSL::SSL::VERIFY_PEER
      ctx.key = nil
      ctx.cert = cert

      voidcmd("AUTH TLS")
      @sock = OpenSSL::SSL::SSLSocket.new(@sock, ctx)
      @sock.connect
      # @sock.post_connection_check(@hostname)
      super(user, passwd, acct)
      voidcmd("PBSZ 0")
    end
  end
end
And here's the snippet for trying to fetch the files:
def get_ftpclient(host)
  FTPS::new(host)
end

def check_for_files
  @ftp = get_ftpclient(@host)
  @ftp.passive = true
  @ftp.login(@user_name, @password, @cert_file)
  @ftp.chdir(@remote_dir)
  files = @ftp.nlst
  files
end
It fails on the nlst.
Edit: I tried adding voidcmd("PROT P") to the end of the login function but it just hangs for a while, then I eventually get:
IOError: Unsupported record version Unknown-48.48
___BEGIN BACKTRACE___
org/jruby/ext/openssl/SSLSocket.java:564:in `sysread'
/opt/jruby/lib/ruby/gems/1.8/gems/jruby-openssl-0.7.6.1/lib/1.8/openssl/buffering.rb:36:in `fill_rbuff'
/opt/jruby/lib/ruby/gems/1.8/gems/jruby-openssl-0.7.6.1/lib/1.8/openssl/buffering.rb:159:in `eof?'
/opt/jruby/lib/ruby/gems/1.8/gems/jruby-openssl-0.7.6.1/lib/1.8/openssl/buffering.rb:134:in `readline'
/opt/jruby/lib/ruby/1.8/net/ftp.rb:211:in `getline'
/opt/jruby/lib/ruby/1.8/net/ftp.rb:221:in `getmultiline'
/opt/jruby/lib/ruby/1.8/net/ftp.rb:235:in `getresp'
/opt/jruby/lib/ruby/1.8/net/ftp.rb:251:in `voidresp'
/opt/jruby/lib/ruby/1.8/net/ftp.rb:436:in `retrlines'
/opt/jruby/lib/ruby/1.8/monitor.rb:191:in `mon_synchronize'
/opt/jruby/lib/ruby/1.8/net/ftp.rb:422:in `retrlines'
/opt/jruby/lib/ruby/1.8/net/ftp.rb:612:in `nlst'
... etc
I realize this is an old question, but I stumbled upon it while researching FTPS ruby gems.
No, Net::FTP does not, on its own, support FTPS.
I highly recommend double-bag-ftps.
It provides a child class of Net::FTP to support implicit and explicit FTPS.
Version 0.1.1 has been working beautifully for me running daily for the past year.
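A hedged sketch of typical usage, based on the gem's README as I recall it; the method names and options shown are assumptions and worth checking against the gem's own documentation, and the host and credentials are placeholders.

require 'double_bag_ftps'

ftps = DoubleBagFTPS.new
# Explicit FTPS with an encrypted data channel; relax verify_mode or add your
# self-signed certificate to a custom store if the server's cert is not trusted.
ftps.ssl_context = DoubleBagFTPS.create_ssl_context(:verify_mode => OpenSSL::SSL::VERIFY_PEER)
ftps.connect('ftps.example.com')   # placeholder host
ftps.passive = true
ftps.login('username', 'password')
ftps.chdir('/remote/dir')
puts ftps.nlst
ftps.close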
