I am making an external request with HTTParty to fetch a JSON file, which I then parse.
But if the request fails (the file no longer exists or the URI is bad), how can I handle the error so I can still display the page?
I'm not sure how best to protect the application from this point of failure, and I haven't done much error handling.
def api_fetch(url)
JSON.parse HTTParty.get(url).response.body
end
api_fetch('http://example.com/data.json')['test']
Please help
The below should work. The method returns nil on failure, so you can check for nil when you call it.
def api_fetch(url)
  JSON.parse HTTParty.get(url).response.body
rescue StandardError
  # bad URI, network failure, non-JSON body, etc.
  nil
end
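At the call site you would then guard against the nil return; a quick sketch using the example URL from the question:

data = api_fetch('http://example.com/data.json')
if data
  puts data['test']
else
  # the request failed or the body wasn't valid JSON; render the page without it
end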
I'm trying to write an API in Sinatra that accepts a temporary CSV file as a parameter. I want to raise an exception if the filetype isn't text/csv or if the csv doesn't have an email column, and I wanted the confirmation page to simply display the error message. I imagined it to look something like this:
if params[:recipients_file]
raise ArgumentError, 'Invalid file. Make sure it is of type text/csv.' unless params[:recipients_file][:type] == "text/csv"
recipients_csv = CSV.parse(params[:recipients_file][:tempfile].read, {headers: true})
raise ArgumentError, 'Invalid CSV. Make sure it has an "email" column' unless recipients_csv.headers.include?('email')
recipients += recipients_csv.map {|recipient| recipient["email"]}
end
However, any time one of those conditions isn't met, I get really ugly error messages like NoMethodErrors etc. I just want the API to stop execution and to return the error message on the confirmation page. How do I do this?
You should define an error block:
error do
env['sinatra.error'].message
end
See http://www.sinatrarb.com/intro.html#Error for more details, including how to set up different error handlers for different exception types, HTTP status codes, etc.
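For instance, a rough sketch of a handler specific to the ArgumentError raised in the route above, returning a 400 instead of the default 500:

error ArgumentError do
  status 400
  env['sinatra.error'].message
end

Note that in development mode Sinatra's own exception page can take over; setting show_exceptions to :after_handler (or false) lets your error blocks run there too.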
I have a web app written mostly in Ruby compiled with Opal. I would now like to store and retrieve files in my ownCloud instance, maybe using WebDAV. I am looking for an example of how to do this using the HTTP module.
I tried
HTTP.get("https://owncloud/foo.abc") do |req|
req.username= "user"
...
end.then do |response|
puts response
end
But that does not work: there is no method then for module HTTP.
So it seems that if I pass a block to HTTP.get, it no longer returns a promise.
When I do not pass a block, I don't know how to configure the request.
It would be best if I could find a full example of how to use HTTP from Opal. The small example on the Opal blog did not help.
I think username/password should be passed in the options hash (see the opal-jquery README).
HTTP.get("https://owncloud/foo.abc", username: 'user').then do |response|
puts response
end
A note about the promise style: the block is used as the default form of callback. To switch to the promise style, do not pass a block; instead, assign the result of HTTP.get to a variable so you can work with the request object:
req = HTTP.get("https://owncloud/foo.abc")
puts req.inspect # <= do something with the request
req.then do |response|
puts response
end
I am testing how a method handles a 302 HTTPError exception. I tried to stub the method call to raise one programmatically, but it keeps failing with a wrong number of arguments error (0 for 2).
The code under test has this particular line:
document = Nokogiri.HTML open(source_url)
and in the spec I stubbed it like this:
subject.stub(:open).and_raise(OpenURI::HTTPError)
subject.should_receive(:ended=).with(true)
subject.update_from_remote
I don't think this is related to Nokogiri.HTML() or open-uri's open(), so why is this happening?
Also, how would I make this HTTPError represent a 302 redirect? Thanks
I found out that OpenURI::HTTPError's constructor requires two parameters. RSpec by default calls the error class's new method with no parameters, which causes this error. So I need to create the error object manually, passing the required parameters.
exception_io = mock('io')
exception_io.stub_chain(:status, :[]).with(0).and_return('302')  # status[0] is the HTTP status code
subject.stub(:open).with(anything).and_raise(OpenURI::HTTPError.new('', exception_io))
This is a very late reply, but for others who may find this helpful: if you use the FakeWeb gem in conjunction with Nokogiri, you can do this kind of testing without having to get so involved with the internals of the code. You can register a URI with FakeWeb in your test, and tell it what to return. For example:
FakeWeb.register_uri(:get, 'http://www.google.com', :status => ['404', 'Not Found'])
The URI argument you provide needs to match the URI your method is calling. FakeWeb will then intercept the call, and return the status you provide.
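For the 302 case in the original question, registering the redirect status works the same way; a sketch (the URI must match whatever your code actually requests):

FakeWeb.register_uri(:get, 'http://www.google.com', :status => ['302', 'Found'])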
I'm trying to use a super simple API from is.gd:
http://is.gd/api.php?longurl=http://www.example.com
It returns the response header "HTTP/1.1 200 OK" if the URL was shortened as expected, or "HTTP/1.1 500 Internal Server Error" if there was any problem that prevented this. Assuming the request was successful, the body of the response contains only the new shortened URL.
I don't even know where to begin or if there are any available ruby methods to make sending and receiving of these API requests frictionless. I basically want to assign the response (the shortened url) to a ruby object.
How would you do this? Thanks in advance.
Super simple:
require 'open-uri'
def shorten(url)
open("http://is.gd/api.php?longurl=#{url}").read
rescue
nil
end
open-uri is part of the Ruby standard library and (among other things) makes it possible to do HTTP requests using the open method (which usually opens files). open returns an IO, and calling read on the IO returns the body. open-uri raises an exception if the server returns a 500 error; in this case I'm catching the exception and returning nil, but if you want you can let the exception bubble up to the caller, or raise another exception.
Oh, and you would use it like this:
url = "http://www.example.com"
puts "The short version of #{url} is #{shorten(url)}"
I know you already got an answer you accepted, but I still want to mention httparty because I've had very good experiences wrapping APIs (Delicious and GitHub) with it.
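For comparison, a rough httparty version of the same shortener (a sketch, not tested against the live API):

require 'httparty'

def shorten(url)
  response = HTTParty.get('http://is.gd/api.php', :query => { :longurl => url })
  response.code == 200 ? response.body : nil
end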
It seems like the methods of Ruby's Net::HTTP are all or nothing when it comes to reading the body of a web page. How can I read, say, the just the first 100 bytes of the body?
I am trying to read from a content server that returns a short error message in the body of the response if the file requested isn't available. I need to read enough of the body to determine whether the file is there. The files are huge, so I don't want to get the whole body just to check if the file is available.
This is an old thread, but the question of how to read only a portion of a file via HTTP in Ruby is still a mostly unanswered one according to my research. Here's a solution I came up with by monkey-patching Net::HTTP a bit:
require 'net/http'
# provide access to the actual socket
class Net::HTTPResponse
attr_reader :socket
end
uri = URI("http://www.example.com/path/to/file")
begin
Net::HTTP.start(uri.host, uri.port) do |http|
request = Net::HTTP::Get.new(uri.request_uri)
# calling request with a block prevents body from being read
http.request(request) do |response|
# do whatever limited reading you want to do with the socket
x = response.socket.read(100)
# be sure to call finish before exiting the block
http.finish
end
end
rescue IOError
# ignore
end
The rescue catches the IOError that's thrown when you call HTTP.finish prematurely.
FYI, the socket within the HTTPResponse object isn't a true IO object (it's an internal class called BufferedIO), but it's pretty easy to monkey-patch that, too, to mimic the IO methods you need. For example, another library I was using (exifr) needed the readchar method, which was easy to add:
class Net::BufferedIO
def readchar
read(1)[0].ord
end
end
Shouldn't you just use an HTTP HEAD request (Ruby's Net::HTTP::Head class) to see if the resource is there, and only proceed if you get a 2xx or 3xx response? This presumes your server is configured to return a 4xx error code if the document is not available. I would argue this is the correct solution.
An alternative is to make the HEAD request and look at the Content-Length header value in the result: if your server is correctly configured, you should easily be able to tell the difference in length between a short message and a long document. Another alternative: set the Range header field in the request, as shown in the sketch below (which again assumes that the server behaves correctly with respect to the HTTP spec).
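A sketch of both ideas with Net::HTTP (the URL is a placeholder, and the Range variant only works if the server honours range requests):

require 'net/http'

uri = URI('http://example.com/path/to/file')

Net::HTTP.start(uri.host, uri.port) do |http|
  # HEAD: status and headers only, no body is transferred
  head = http.request(Net::HTTP::Head.new(uri.request_uri))
  if head.is_a?(Net::HTTPSuccess)
    puts "available, Content-Length: #{head['Content-Length']}"
  else
    puts "not available (#{head.code})"
  end

  # Range: ask for just the first 100 bytes; compliant servers answer 206
  get = Net::HTTP::Get.new(uri.request_uri)
  get['Range'] = 'bytes=0-99'
  partial = http.request(get)
  puts partial.body if partial.is_a?(Net::HTTPPartialContent)
end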
I don't think that solving the problem in the client after you've sent the GET request is the way to go: by that time, the network has done the heavy lifting, and you won't really save any wasted resources.
Reference: http header definitions
I wanted to do this once, and the only thing I could think of was monkey-patching the Net::HTTPResponse#read_body and Net::HTTPResponse#read_body_0 methods to accept a length parameter, then having the former pass the length parameter through to read_body_0, where you can read only up to length bytes.
To read the body of an HTTP response in chunks, you'll need to use Net::HTTPResponse#read_body like this:
http.request_get('/large_resource') do |response|
response.read_body do |segment|
print segment
end
end
Are you sure the content server only returns a short error page?
Doesn't it also set the HTTP response status to something appropriate, like 404? In that case you can trap the exception that Net::HTTPResponse#value raises for a Net::HTTPClientError-derived response (most likely Net::HTTPNotFound).
If you get an error, your file wasn't there; if you get 200, the file is starting to download and you can close the connection.
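A minimal sketch of that status check (placeholder URL; keep in mind that this simple form still reads the whole body, so for huge files you would combine it with the early http.finish trick shown further up):

require 'net/http'

uri = URI('http://example.com/path/to/file')
response = Net::HTTP.get_response(uri)

if response.is_a?(Net::HTTPSuccess)
  # 200: the file is there
else
  # e.g. Net::HTTPNotFound: only the short error page came back
end
# alternatively, response.value raises an exception for any non-2xx status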
You can't. But why do you need to? Surely if the page just says that the file isn't available then it won't be a huge page (i.e. by definition, the file won't be there)?