Can I let the connection time out but still get the response? - ruby

I have a function that gets a response over HTTP and runs some tests. Lately it started to happen that the test never finishes, so I introduced a timeout. Then I found out that if I stop the database server, the test script finishes with a DB error that is in fact a very good lead as to why the test didn't finish as expected. Getting that error could save me time, because I wouldn't have to reproduce the whole test again manually.
Q1: Is there any way to let the connection time out but then get the response after the database server is restarted? Note that I cannot send the HTTP request again, as it would start the same test again.
Q2: I think a solution would be to introduce a timer while "waiting" for the HTTP response, but I don't know how to do that. Any ideas?
My function looks like this:
def execute_db2_script(url)
  db2_database = 'RATIONAL'
  http_read_timeout = $http_read_timeout
  uri = URI.parse(url)
  start = Time.new
  connection = Net::HTTP.new(uri.host, 443)
  connection.use_ssl = true
  begin
    response = connection.start() do |http|
      http.open_timeout = 50
      http.read_timeout = http_read_timeout
      http.request_get(uri.request_uri)
    end
  rescue Timeout::Error
    time_out_message = "security time out - after #{$http_read_timeout} sec"
    return time_out_message
  end
  return response.body.gsub("\n","<BR>")
end

You can use the retry keyword:
def execute_db2_script(url)
  ...
  begin
    ...
  rescue Timeout::Error
    time_out_message = "security time out - after #{$http_read_timeout} sec"
    if server_is_restarting?  # hypothetical check: replace with your own condition
      retry # this will run the begin-rescue-end block again
    else
      return time_out_message
    end
  end
  response.body.gsub("\n", "<BR>")
end
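A minimal sketch of how that could look in practice, assuming the same $http_read_timeout global as above (the attempts counter, the retry limit and the sleep interval are my own additions). Keep in mind that retry re-runs the whole begin block, so the HTTP request is sent again:
require 'net/http'
require 'uri'

def execute_db2_script(url)
  uri = URI.parse(url)
  connection = Net::HTTP.new(uri.host, 443)
  connection.use_ssl = true
  attempts = 0
  begin
    response = connection.start do |http|
      http.open_timeout = 50
      http.read_timeout = $http_read_timeout
      http.request_get(uri.request_uri)
    end
  rescue Timeout::Error
    attempts += 1
    if attempts <= 3 # give the database server a chance to come back up
      sleep 60       # wait before sending the request again
      retry          # re-runs the begin block, i.e. re-sends the request
    end
    return "security time out - after #{$http_read_timeout} sec"
  end
  response.body.gsub("\n", "<BR>")
end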

Related

Skip an http request if the response is taking too long with ruby

I have an array of URLs. I'm going through each one, sending a GET request and printing the response code. Here is part of the code:
arr.each do |url|
  res = Faraday.get(url)
  p res.status
end
However, sometimes I get to a URL that times out, and the script crashes. Is there a way to tell Ruby "if I don't get a response in a certain amount of time, then skip to the next URL"?
You could add a timeout like this:
require 'timeout'
arr.each do |url|
  begin
    Timeout.timeout(5) do # a timeout of five seconds
      res = Faraday.get(url)
      p res.status
    end
  rescue Timeout::Error
    # handle error: show user a message?
  end
end
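Depending on your Faraday version, you may also be able to let Faraday time out by itself instead of wrapping the call. A sketch, treating the request option keys and the error class names as assumptions to verify against your Faraday version:
require 'faraday'

# connection-level timeouts (seconds); option names may differ between Faraday versions
conn = Faraday.new(request: { timeout: 5, open_timeout: 2 })

arr.each do |url|
  begin
    res = conn.get(url)
    p res.status
  rescue Faraday::TimeoutError, Faraday::ConnectionFailed => e
    puts "skipping #{url}: #{e.class}" # move on to the next URL
  end
end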

how to raise a timeout error manually?

I want to test an HTTP GET request and output something if it times out.
begin
  url = "#{url}?#{params.to_param}"
  Net::HTTP.get_response(URI.parse(url))
rescue Timeout::Error
  puts "....."
end
How can I raise a timeout error manually, or set a shorter timeout for the HTTP request?
For an HTTP request, should I change the default timeout? How long is appropriate?
Based on http://opensourceconnections.com/blog/2008/04/24/adding-timeout-to-nethttp-get_response/
http = Net::HTTP.new(url.host, url.port)
http.read_timeout = 5
http.open_timeout = 5
resp = http.start() { |http|
  http.get(url.path)
}
puts resp.kind_of? Net::HTTPResponse
puts resp.code
puts resp.body
To set a timeout, use:
http = Net::HTTP.new(host_param)
http.read_timeout = 500
There are a few types of timeouts you can set. From the docs:
open_timeout:
Number of seconds to wait for the connection to open.
Any number may be used, including Floats for fractional seconds.
If the HTTP object cannot open a connection in this many seconds,
it raises a Net::OpenTimeout exception. The default value is nil.
read_timeout:
Number of seconds to wait for one block to be read (via one read(2) call).
Any number may be used, including Floats for fractional seconds.
If the HTTP object cannot read data in this many seconds,
it raises a Net::ReadTimeout exception. The default value is 60 seconds.
ssl_timeout:
Sets the SSL timeout seconds.
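If the goal is just to exercise the rescue branch, one simple way (on Ruby 2.0+) is to set the timeouts so low that almost any real request expires, or to raise the error yourself in a test. A sketch; the URL and the 0.001 values are only illustrative:
require 'net/http'
require 'uri'

uri = URI.parse("http://example.com/slow-endpoint") # hypothetical URL
http = Net::HTTP.new(uri.host, uri.port)
http.open_timeout = 0.001 # tiny timeouts: the request is practically guaranteed to expire
http.read_timeout = 0.001

begin
  http.start { |h| h.get(uri.request_uri) }
rescue Timeout::Error, Net::OpenTimeout, Net::ReadTimeout => e
  puts "timed out: #{e.class}"
end

# Or raise the error directly, e.g. from a test double:
# raise Timeout::Error, "simulated timeout"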

Multi-Threading in Ruby

I have a TCP server that I made in Ruby. The server seems to work: I can see that two or more clients can connect and be served, but they sometimes get stuck (as in: they need to wait for the other client to disconnect, or just become unresponsive), usually after the pass_ok bit. When connecting with only one client I don't see this issue.
Here is my code:
def self.main_server
  begin
    server = TCPServer.open(@port)
  rescue Exception => e
    CoreLogging.syslog_error("Cant start server: #{e}")
  end
  @main_pid = Process.pid
  # Main Loop
  Thread.abort_on_exception = true
  while true
    Thread.fork(server.accept) do |client|
      @client = client
      sock_domain, remote_port, remote_hostname, remote_ip = @client.peeraddr # Get some info on the incoming connection
      CoreLogging.syslog_error("Got new connection from #{@client.peeraddr[3]} Handled by Thread: #{Thread.current}") # Log incoming connection
      @client.puts "Please enter password: " # Password testing (later will be from a config file or DB)
      action = @client.gets(4096).chomp # get client password response 'chomp' is super important
      if action == @password
        # what to do when password is right
        pass_ok
        Thread.exit
      else
        # what to do when password is wrong
        pass_fail
        Thread.exit
      end
    end
    begin
      CoreLogging.syslog_error("Thread Ended (SOFT)")
    rescue Exception => e
      CoreLogging.syslog_error("Thread was killed (HARD)")
    end
  end
end
I'll leave this here for future reference and hope someone in a similar situation will find it useful.
The issue was the @client instance variable, which got overwritten by every new thread and was then shared by everything called inside the thread.
Using a local client variable (without the '@') made it work as expected.
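A trimmed sketch of the accept loop after that change; passing the socket into pass_ok/pass_fail explicitly is my own variation (the originals presumably read @client). The point is that each thread only touches its own block-local client:
while true
  Thread.fork(server.accept) do |client| # 'client' is local to this thread's block
    client.puts "Please enter password: "
    action = client.gets(4096).chomp
    if action == @password
      pass_ok(client)   # hand the socket over explicitly instead of via @client
    else
      pass_fail(client)
    end
    client.close
  end
end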

Curb doesn't respond

I parse an RSS stream with Feedjira.
When I use the fetch_and_parse method it sometimes blocks and doesn't respond.
The same thing happens with manual curb downloading.
I write in a loop:
@my_logger.info "--- Before perform ---"
easy = Curl::Easy.new
easy.follow_location = true
easy.max_redirects = 3
easy.connect_timeout = 120
easy.url = url
easy.useragent = "Ruby/Curb"
easy.perform
@my_logger.info "--- After perform ---"
doc = easy.body_str
easy.close
After some time (it may be a day or an hour), the process stops on the easy.perform line and doesn't respond, e.g. the process outputs --- Before perform --- and nothing else.
It can be related to a network issue that happens randomly.
If you use a timeout you can skip this kind of situation in long-running tasks.
require 'timeout'
begin
  Timeout.timeout(5) do
    easy.perform
  end
rescue Timeout::Error
  puts 'timeout'
end
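Curb also exposes libcurl's own overall timeout, which avoids wrapping the call. A sketch, with the method and error class names being assumptions to verify against your curb version:
easy = Curl::Easy.new(url)
easy.follow_location = true
easy.max_redirects = 3
easy.connect_timeout = 120
easy.timeout = 300 # hard cap in seconds on the whole transfer (CURLOPT_TIMEOUT)

begin
  easy.perform
  doc = easy.body_str
rescue Curl::Err::TimeoutError => e
  puts "curl timed out: #{e.message}"
end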

How to implement timer that runs independently along with http request?

I have Ruby code that triggers a PHP script over HTTPS.
Use case: the PHP script usually finishes in 5 minutes, so I have set a timeout of 10 minutes on the HTTPS request. I need a timer that would trigger code, let's say, 7 minutes after the HTTPS request started.
I was thinking of creating a thread just before I initiate the HTTPS request. I am not sure if this is the correct way to approach it; maybe there is no need to use threads at all. I am using ruby 1.8.7 (2010-08-16 patchlevel 302) [i386-mingw32]. Also, I don't know if I can 'kill' the thread when the HTTPS request finishes successfully.
uri = URI.parse(url)
start = Time.new
http_read_timeout = 60 * 10
connection = Net::HTTP.new(uri.host, 443)
connection.use_ssl = true
begin
  response = connection.start() do |http|
    http.open_timeout = 50
    http.read_timeout = http_read_timeout
    http.request_get(uri.request_uri)
    # here I need to place code that is triggered
    # in case the custom timeout is reached
  end
rescue Timeout::Error
  # connection failed
  time_out_message = "security time out - after #{http_read_timeout} sec"
  return time_out_message
end
puts "finished"
The basic structure could be like this:
seconds_timer = MyDelay # your custom limit in seconds, e.g. 7 * 60
counter = 0
test_thread = Thread.new do
  run_http_php_test
end
while test_thread.alive?
  counter += 1
  if counter > seconds_timer
    handle_custom_timeout_somehow
    # if you want to halt run_http_php_test:
    test_thread.kill if test_thread.alive?
    # otherwise:
    break
  end
  sleep 1
end
# the below doesn't apply if you kill the run_http_php_test thread
test_thread.join if test_thread.alive?
...but of course you could change that sleep 1 to whatever polling interval you like. Polling is nicer than just forcing your original thread to sleep, because the code will finish faster if run_http_php_test is done before you hit your custom timeout value.
Most or all of your code above can be in the run_http_php_test method, or inserted directly...whichever you'd prefer.
Ruby 1.9.3 implements the timeout module, which has a timeout method; you can see it here. If you scroll down, you can click "show source" and see the definition of the timeout method. You can copy it if you don't want to upgrade to Ruby 1.9.3 (I recommend upgrading, since 1.8.7 is very slow compared to 1.9.3).
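For completeness, with that module the custom timer collapses to something like the sketch below. Note that, unlike the polling thread above, Timeout.timeout aborts the request when the timer fires instead of letting it keep running (handle_custom_timeout_somehow is the same placeholder as in the previous answer):
require 'timeout'

begin
  Timeout.timeout(7 * 60) do # custom 7-minute limit
    response = connection.start do |http|
      http.open_timeout = 50
      http.read_timeout = http_read_timeout
      http.request_get(uri.request_uri)
    end
    puts "finished"
  end
rescue Timeout::Error
  # the request did not finish within 7 minutes
  handle_custom_timeout_somehow
end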
