I have Sidekiq 5 workers with concurrency set to 50. My web app makes third-party API calls every few minutes, roughly 15,000 requests per hour. Each hour I see about 10 Net::OpenTimeout: execution expired errors. It doesn't look like a big problem, but I'd like to know how to deal with it. Thanks.
def grabber(url)
  response, body = nil
  uri = URI(url)
  Net::HTTP.start(uri.host, uri.port,
                  :use_ssl => uri.scheme == 'https', :read_timeout => 1000) do |http|
    request = Net::HTTP::Get.new uri
    response = http.request request
  end
  if response.code == '200'
    body = JSON.parse(response.body)
  end
  body
end
Net::HTTP raises an exception when a request times out.
You can rescue the exception and handle it:
rescue Net::OpenTimeout
  # handle the timeout
The Net::HTTP documentation has further examples.
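For the worker in the question, a minimal sketch of that idea, building on the grabber method above (the wrapper name, retry count, and sleep interval are arbitrary choices, not anything required by Net::HTTP):

def grabber_with_retry(url, attempts = 3)
  grabber(url)
rescue Net::OpenTimeout
  attempts -= 1
  if attempts > 0
    sleep 2   # arbitrary pause before retrying
    retry     # re-runs the method body with the decremented counter
  else
    nil       # give up once the retries are exhausted
  end
end

Since the calls already run inside Sidekiq jobs, another option is simply to let the exception bubble up and rely on Sidekiq's built-in retry mechanism instead of retrying inline.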
I use Ruby's net/http library to get an HTML response, but I can't get the body of the page when the status code is 3xx.
Page Body:
<div class="flash-container">
  <div class="flash flash-success">
    Your email address has been changed successfully.
    ×
  </div>
</div>
Request:
require 'net/http'
require 'uri'
http = Net::HTTP.new(uri.host, uri.port)
http.use_ssl = true
request = Net::HTTP::Post.new(uri.request_uri)
request.set_form_data({
  'email' => email,
  'email-confirm' => email_confirm,
  'password' => password
})
request['Cookie'] = 'ACCOUNT_SESSID=' + token
response = http.request(request)
Response:
response.code # '302'
response.body # ''
You'll likely need to follow the redirect (302 code). The Ruby docs have a great example for doing this.
I've included this below, along with a check to return the body if it exists. If you never want to follow the redirect, you could change the else branch to return response.code, an empty string, false, or whatever's appropriate. Here's the full example:
def fetch(uri_str, limit = 10)
  raise ArgumentError, 'too many HTTP redirects' if limit == 0
  response = Net::HTTP.get_response(URI(uri_str))
  case response
  when Net::HTTPSuccess then
    response
  when Net::HTTPRedirection then
    if response.body && !response.body.empty?
      response
    else
      location = response['location']
      warn "redirected to #{location}"
      fetch(location, limit - 1)
    end
  else
    response.value
  end
end
The code is pretty straightforward, calling itself recursively whenever Net::HTTP.get_response returns a redirect, pointing at the new location.
You can follow up to ten redirects with this approach, which should be ample, though you may want to adjust the limit to suit your circumstances.
Then, when you run fetch(your_url), it should follow the redirects until it lands on a page and can return the body, e.g.:
res = fetch(your_url)
res.body
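If, on the other hand, you only ever want to see the redirect itself rather than follow it, a minimal variation on the same skeleton (the name and structure are just a sketch) would be to return the redirect response directly and inspect it yourself:

def fetch_without_follow(uri_str)
  response = Net::HTTP.get_response(URI(uri_str))
  case response
  when Net::HTTPSuccess, Net::HTTPRedirection
    response   # check response.code and response['location'] yourself
  else
    response.value   # raises an exception for error responses
  end
end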
Let me know how you get on with this, or if you've any questions!
I'm trying to make an HTTP HEAD request using Net::HTTP.
require 'net/http'
uri = URI("https://github.com/rails/rails")
http = Net::HTTP.new(uri.host, uri.port)
request = http.head(uri)
puts request
fails.
AFAICT, this is because Net::HTTP is waiting on a response body which will never come. How do I ask Net::HTTP to make a request and not wait on the response body?
If you follow the documentation properly, it works just fine. The library implementation probably makes some assumptions about usage when it determines whether to read the payload.
response = nil
Net::HTTP.start('github.com', :use_ssl => true) do |http|
  response = http.head('/rails/rails')
end
response.each { |k, v| puts "#{k}: #{v}" }  # print the response headers
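A quick check of the result (the exact values shown are just what you would typically expect from that URL, not guaranteed):

response.code    # => "200"
response.class   # => Net::HTTPOK
response.body    # => nil, since no body is read for a HEAD request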
Similar to "getting the status code of a HTTP redirected page", but with Net::HTTP instead of curb. I am making a GET request to a page that will redirect:
response = Net::HTTP.get_response(URI.parse("http://www.wikipedia.org/wiki/URL_redirection"))
puts response.code        # => 301
puts response['location'] # => en.wikipedia.org/wiki/URL_redirection
The problem is that I want to know the status code of the redirected page. In this case it is 200, but in my app I want to check if it is 200 or something else.
The solution I've seen is to just call get_response(response['location']), but that won't work in my application because the redirect is designed so that it can only be followed once. Since the first GET consumes that one redirect, I can't then follow it again.
Is there some way to get the last status code that is a result of a GET?
EDIT: Further clarification of the situation:
The application that I'm sending GET to has a single sign-on authentication mechanism where, if I want to access 'myapp/mypage', I have to first send a post:
postResponse = Net::HTTP.post_form(URI.parse("http://myapp.com/trusted"), {"username" => @username})
Then make the GET request to:
"http://myapp.com/trusted/#{postResponse.body}/mypage"
(The postResponse.body is a 'ticket' which can be redeemed only once.)
That GET verifies that the ticket is valid and then redirects to:
myapp.com/mypage
So whether that ticket is valid or not, I get a 301.
I want to check the status code of the final get to myapp.com/mypage.
If I manually try to follow the redirect, whether it's a HEAD request or a GET, the original redirect will have already consumed the ticket, so I will get an error that the ticket is expired even if the original redirect was a 200.
The Net::HTTP documentation has example code showing how to deal with redirects. Have you tried it? It should make it easy to get inside the redirect mechanism and grab statuses for later.
Here's their example:
Following Redirection
Each Net::HTTPResponse object belongs to a class for its response code.
For example, all 2XX responses are instances of a Net::HTTPSuccess subclass, a 3XX response is an instance of a Net::HTTPRedirection subclass and a 200 response is an instance of the Net::HTTPOK class. For details of response classes, see the section “HTTP Response Classes” below.
Using a case statement you can handle various types of responses properly:
def fetch(uri_str, limit = 10)
  # You should choose a better exception.
  raise ArgumentError, 'too many HTTP redirects' if limit == 0
  response = Net::HTTP.get_response(URI(uri_str))
  case response
  when Net::HTTPSuccess then
    response
  when Net::HTTPRedirection then
    location = response['location']
    warn "redirected to #{location}"
    fetch(location, limit - 1)
  else
    response.value
  end
end
print fetch('http://www.ruby-lang.org')
A minor change like this should help:
require 'net/http'
RESPONSES = []
def fetch(uri_str, limit = 10)
  # You should choose a better exception.
  raise ArgumentError, 'too many HTTP redirects' if limit == 0
  response = Net::HTTP.get_response(URI(uri_str))
  RESPONSES << response
  case response
  when Net::HTTPSuccess then
    response
  when Net::HTTPRedirection then
    location = response['location']
    warn "redirected to #{location}"
    fetch(location, limit - 1)
  else
    response.value
  end
end
print fetch('http://jigsaw.w3.org/HTTP/300/302.html')
puts RESPONSES.join("\n") # =>
I see this when I run it:
redirected to http://jigsaw.w3.org/HTTP/300/Overview.html
#<Net::HTTPOK:0x007f9e82a1e050>#<Net::HTTPFound:0x007f9e82a2daa0>
#<Net::HTTPOK:0x007f9e82a1e050>
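Once the statuses are collected, the final page's status is just the last entry, so a check like this would do (output illustrative; note the codes are strings):

RESPONSES.map(&:code)   # => ["302", "200"] for the jigsaw example above
RESPONSES.last.code     # => "200", the status of the final page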
If it's enough just to make an HTTP HEAD request without 'consuming' your URL (this would be the usual expectation for a HEAD request), you can do it like this:
2.0.0-p195 :143 > result = Net::HTTP.start('www.google.com') { |http| http.head '/' }
=> #<Net::HTTPFound 302 Found readbody=true>
So in your example you'd do this:
...
result = Net::HTTP.start(response.uri.host) { |http| http.head response.uri.path }
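You could then check the status of that HEAD response without having spent the ticket (value illustrative):

result.code   # => "200" if the final page is reachable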
If you want to preserve a history of response codes, you could try this. This retains the last 5 response codes from calls to get_response and exposes them through a Net::HTTP.history method.
module Net
  class << HTTP
    alias_method :_get_response, :get_response

    def get_response *args, &block
      resp = _get_response *args, &block
      @history = (@history || []).push(resp.code).last 5
      resp
    end

    def history
      @history || []
    end
  end
end
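Usage would then look roughly like this (URLs reused from the example above, output illustrative):

Net::HTTP.get_response(URI('http://jigsaw.w3.org/HTTP/300/302.html'))
Net::HTTP.get_response(URI('http://jigsaw.w3.org/HTTP/300/Overview.html'))
Net::HTTP.history   # => ["302", "200"]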
(I don't entirely get the usage scenario, so adapt to your needs)
I've tried this on a few machines on different networks, all running Ruby 1.8.7, and I get the same result after a long wait.
Net::HTTP.get(URI.parse('https://encrypted.google.com/'))
Timeout::Error: execution expired
but plain HTTP works fine:
Net::HTTP.get(URI.parse('http://www.google.com/'))
After the initial timeout I get an EOFError instead:
EOFError: end of file reached
It's really got me stumped. If you have any ideas, or if you can let me know whether you get the same results, I'd really appreciate it.
I think you need to set use_ssl to true...
example:
uri = URI.parse("https://www.google.com/")
http = Net::HTTP.new(uri.host, uri.port)
http.use_ssl = true
request = Net::HTTP::Get.new(uri.request_uri)
response = http.request(request)
puts response.body
This is adapted from a Ruby Inside post.
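As a side note, on newer Ruby versions (not 1.8.7, which the question is running) the same request can be written more compactly by passing use_ssl to Net::HTTP.start; a rough equivalent:

require 'net/http'

uri = URI.parse("https://encrypted.google.com/")
response = Net::HTTP.start(uri.host, uri.port, :use_ssl => true) do |http|
  http.request(Net::HTTP::Get.new(uri.request_uri))
end
puts response.body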
I'm developing a small application which posts XML to a web service.
This is done using Net::HTTP::Post. However, the service provider recommends retrying failed requests, something like:
1st request fails -> try again after 2 seconds
2nd request fails -> try again after 5 seconds
3rd request fails -> try again after 10 seconds
...
What would be a good approach to do that? Simply running the following piece of code in a loop, catching the exception, and running it again after some delay? Or is there a cleverer way? Maybe the Net library even has some built-in functionality that I'm not aware of?
url = URI.parse("http://some.host")
request = Net::HTTP::Post.new(url.path)
request.body = xml
request.content_type = "text/xml"
#run this line in a loop??
response = Net::HTTP.start(url.host, url.port) {|http| http.request(request)}
Thanks very much, always appreciate your support.
Matt
This is one of the rare occasions when Ruby's retry comes in handy. Something along these lines:
retries = [3, 5, 10]
begin
  response = Net::HTTP.start(url.host, url.port) {|http| http.request(request)}
rescue SomeException # I'm too lazy to look it up
  if delay = retries.shift # will be nil if the list is empty
    sleep delay
    retry # backs up to just after the "begin"
  else
    raise # with no args re-raises original error
  end
end
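As for the SomeException placeholder, here is a rough (and deliberately non-exhaustive) guess at the exceptions a Net::HTTP POST commonly raises; url and request are as defined in the question, and you should adjust the list to whatever you actually see in your logs:

retries = [2, 5, 10]   # the delays from the question
begin
  response = Net::HTTP.start(url.host, url.port) {|http| http.request(request)}
rescue Timeout::Error, Errno::ECONNREFUSED, Errno::ECONNRESET,
       EOFError, Net::HTTPBadResponse, Net::ProtocolError, SocketError
  if delay = retries.shift
    sleep delay
    retry
  else
    raise
  end
end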
I use the retryable gem for retries.
With it, the code above is transformed from:
retries = [3, 5, 10]
begin
  response = Net::HTTP.start(url.host, url.port) {|http| http.request(request)}
rescue SomeException # I'm too lazy to look it up
  if delay = retries.shift # will be nil if the list is empty
    sleep delay
    retry # backs up to just after the "begin"
  else
    raise # with no args re-raises original error
  end
end
To:
retryable(:tries => 10, :on => [SomeException]) do
  response = Net::HTTP.start(url.host, url.port) {|http| http.request(request)}
end
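Depending on the gem version, retryable may also accept a :sleep option that takes a lambda, which would let you reproduce the increasing delays from the question. Treat the exact option names below as an assumption to verify against the gem's README for the version you install:

# assumes a retryable version whose :sleep option accepts a lambda of the retry count
retryable(:tries => 4, :sleep => lambda { |n| [2, 5, 10][n] || 10 }, :on => [SomeException]) do
  response = Net::HTTP.start(url.host, url.port) {|http| http.request(request)}
end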