Allowing cross-origin requests in Yesod - ajax

My application uses a bookmarklet, and I need to allow CORS for MyRouteR so my bookmarklet code can use this route for AJAX requests.
In my first draft of config/routes I gave MyRouteR support for only one request method, PUT. But it turned out (duh) that I'd need to support the OPTIONS method as well, which browsers use for CORS preflight requests.
I ended up with the following in config/routes:
/myroute MyRouteR PUT OPTIONS
I was kind of hoping there would be some relevant machinery in the Template Haskell that processes config/routes so that the addition of OPTIONS to this route's method list would automagically result in CORS support, but no dice. Not the end of the world, but it would have made sense and felt elegant that way.
To make CORS work, I gave the route an OPTIONS handler:
optionsMyRouteR :: Handler RepPlain
optionsMyRouteR = do
    addHeader "Access-Control-Allow-Origin" "*"
    addHeader "Access-Control-Allow-Methods" "PUT, OPTIONS"
    return $ RepPlain $ toContent ("" :: Text)

putMyRouteR :: Handler RepJson
putMyRouteR = do
    addHeader "Access-Control-Allow-Origin" "*"
    -- more stuff ...
This works, but it feels slightly un-Yesodic because it's so boilerplate. So, two questions:
Do we have a better adjective than Yesodic?
Is there another, better way to let a route support cross-origin requests?

UPDATE:
Someone else published some generic middleware for this: http://hackage.haskell.org/package/wai-cors.
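For reference, here is a rough sketch of how that middleware might be wired up. The policy fields are from my reading of the wai-cors docs, and where you apply it (e.g. to the Application built in a scaffolded Application.hs) is an assumption:
{-# LANGUAGE OverloadedStrings #-}
import Network.Wai (Middleware)
import Network.Wai.Middleware.Cors

-- Allow PUT (in addition to the simple methods) from any origin, so the
-- bookmarklet's preflight and actual requests both succeed. Apply this
-- Middleware to the WAI Application your scaffolding builds.
bookmarkletCors :: Middleware
bookmarkletCors = cors $ const $ Just simpleCorsResourcePolicy
    { corsMethods = ["GET", "PUT", "OPTIONS"]
    }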
I am currently working on the same thing and haven't implemented a solution yet; however, I imagine it can be done via a WAI Middleware, similar to the sample code on the wiki page Allowing WOFF fonts to be accessed from other domains (CORS). That would let you write the CORS code once instead of repeating it in every handler.
Sample code from the link above to add cross-origin access for WOFF fonts:
addCORStoWOFF :: W.Middleware
addCORStoWOFF app = fmap updateHeaders . app
  where
    updateHeaders (W.ResponseFile status headers fp mpart)    = W.ResponseFile status (new headers) fp mpart
    updateHeaders (W.ResponseBuilder status headers builder)  = W.ResponseBuilder status (new headers) builder
    updateHeaders (W.ResponseSource status headers src)       = W.ResponseSource status (new headers) src
    new headers | woff      = cors : headers
                | otherwise = headers
      where
        woff = lookup HT.hContentType headers == Just "application/font-woff"
        cors = ("Access-Control-Allow-Origin", "*")

Related

d3.js setRequestHeader fails in IE8

Can someone say how to set a request header using the d3js xhr interface in IE8?
Code is like this:
d3.csv(url).header("accept","text/csv").rows(function(d) {...}).get(function(e,r) {...});
This doesn't have the desired effect in IE8, but works in Firefox and Chrome.
I load the aight compatibility library before loading d3, and the aight.d3 library after, but I don't think those are relevant to this problem.
The request is sent, but the response type is incorrect (it's json instead of csv), so the rows() function fails to get any data. At the server, the "Accept:" header value is */* from IE8, but text/csv from other browsers.
When I write the equivalent in bare javascript, IE8 sets the request header correctly.
I have d3 version 3.4.3.
I'm not an expert on the subject, but looking through the d3 source code and various MSDN references, I think the problem is that d3 automatically checks whether you're using an absolute url reference, and if so assumes that it is a cross-domain request and switches to IE's XDomainRequest instead of XMLHttpRequest for older IE.
Relevant d3 source code (lines 17-26):
var xhr = {},
    dispatch = d3.dispatch("beforesend", "progress", "load", "error"),
    headers = {},
    request = new XMLHttpRequest,
    responseType = null;

// If IE does not support CORS, use XDomainRequest.
if (d3_window.XDomainRequest
    && !("withCredentials" in request)
    && /^(http(s)?:)?\/\//.test(url)) request = new XDomainRequest;
The first line tests whether the XDomainRequest object exists (it does in IE8 and up), the second tests whether the created XMLHttpRequest object lacks the withCredentials property for cross-domain requests (which only exists in IE10 and up), and the third tests whether the url starts with "http://", "https://", or a protocol-relative "//".
So, if you pass in an absolute url to any of the d3 file-grabbing functions in an IE8 or IE9 browser, it will use an XDomainRequest object instead of XMLHttpRequest.
Which is good if you want to actually grab files from a cross-origin server. Not so good if your same-domain server is expecting you to specify the accepted file type, since as far as I can tell XDomainRequest doesn't have any way of setting headers, and it certainly doesn't have the setRequestHeader method that d3 checks for (line 96):
xhr.send = function(method, data, callback) {
  /*...*/
  if (request.setRequestHeader) for (var name in headers)
    request.setRequestHeader(name, headers[name]);
  /*...*/
So how do you get it to work? If you're not doing a cross-origin request (and therefore XMLHttpRequest should work fine), specify your URL using relative notation if you can. Otherwise, you're either going to have to change the d3 source code or create the XMLHttpRequest yourself.
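If patching the source isn't appealing, here is a rough sketch of the hand-rolled approach; the URL and the draw callback are placeholders, not from the question:
// Same-domain request done by hand so IE8 uses its native XMLHttpRequest,
// not XDomainRequest, and the Accept header actually gets set.
var xhr = new XMLHttpRequest();
xhr.open("GET", "/path/to/data.csv", true);      // relative URL, same domain
xhr.setRequestHeader("Accept", "text/csv");
xhr.onreadystatechange = function () {
  if (xhr.readyState === 4 && xhr.status === 200) {
    var rows = d3.csv.parse(xhr.responseText);   // same parser d3.csv uses internally
    draw(rows);                                  // your own callback (placeholder)
  }
};
xhr.send(null);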

Maintaining session and cookies over a 302 redirect

I am trying to fetch a PDF file that gets generated on demand behind an auth wall. Based on my testing, the flow is as follows:
I make a GET request with several parameters (including auth credentials) to the appropriate page. That page validates my credentials and then processes my request. When the request is finished processing (nearly instantly), I am sent a 302 response that redirects me to the location of the generated PDF. This PDF can then only be accessed by that session.
Using a browser, there's really nothing strange that happens. I attempted to do the same via curl and wget without any optional parameters, but those both failed. I was able to get curl working by adding -L -b /tmp/cookie.txt as options, though (to follow redirects and store cookies).
According to the ruby-doc, using Net::HTTP.start should get me close to what I want. After playing around with it, I was indeed fairly close. I believe the only issue, however, was that my Set-Cookie values were different between requests, even though they were using the same http object in the same start block.
I tried keeping it as simple as possible and then expanding once I got the results I was looking for:
url = URI.parse("http://dev.example.com:8888/path/to/page.jsp?option1=test1&option2=test2&username=user1&password=password1")

Net::HTTP.start(url.host, url.port) do |http|
  # Request the first URL
  first_req = Net::HTTP::Get.new url
  first_res = http.request first_req

  # Grab the 302 redirect location (it will always be relative like "../servlet/sendfile/result/543675843657843965743895642865273847328.pdf")
  redirect_loc = URI.parse(first_res['Location'])

  # Request the PDF
  second_req = Net::HTTP::Get.new redirect_loc
  second_res = http.request second_req
end
I also attempted to use http.get instead of creating a new request each time, but still no luck.
The problem is the cookie: it needs to be passed along with the second request. Something like:
second_req = Net::HTTP::Get.new(redirect_loc.path, {'Cookie' => first_res['Set-Cookie']})
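Putting it together, a sketch of the whole flow with the cookie carried over; the output filename and the relative-redirect handling are my assumptions, not from the question:
require 'net/http'
require 'uri'

url = URI.parse("http://dev.example.com:8888/path/to/page.jsp?option1=test1&option2=test2&username=user1&password=password1")

Net::HTTP.start(url.host, url.port) do |http|
  # First request: authenticates, responds with 302 plus a Set-Cookie header
  first_res = http.request(Net::HTTP::Get.new(url.request_uri))
  cookie    = first_res['Set-Cookie']

  # Resolve the relative Location header against the original URL
  redirect = url.merge(first_res['Location'])

  # Second request: present the session cookie to fetch the PDF
  second_res = http.request(Net::HTTP::Get.new(redirect.request_uri, { 'Cookie' => cookie }))
  File.open("result.pdf", "wb") { |f| f.write(second_res.body) }
end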

Play! 2.0 easy fix to OPTIONS response for router catch-all?

I'm having some annoying issues making AJAX calls, simply because almost every browser these days makes an OPTIONS call to the server before the actual AJAX call.
Since I am using Play! 2.0, is there any easy way to make a wildcard response to any route using the OPTIONS method?
For instance, in my routes do something like:
OPTIONS /* controllers.Options.responseDef
Yes I am aware that the new Play! doesn't have a wildcard built-in, but there needs to be a solution for this since all browsers are increasingly calling OPTIONS before AJAX calls.
Not quite a wildcard, but you can use a route which spans several slash-segments:
OPTIONS /*wholepath controllers.Options.responseDef(wholepath)
OPTIONS / controllers.Options.responseDef
It should match all the requests:
OPTIONS /a
OPTIONS /a/b
OPTIONS /a/b/c
Note: that's off the top of my head, so maybe you'll need to polish it. I can't check it myself right now.
Check the section Dynamic parts spanning several / of the manual.
A very clean way to have a single controller endpoint match all OPTIONS requests is to override the onRouteRequest method of Play's Global object. The following version of onRouteRequest will route all requests to a single endpoint named OptionsController.options.
import play.api.mvc._
...

override def onRouteRequest(request: RequestHeader): Option[Handler] = {
  request.method match {
    case "OPTIONS" => Some(OptionsController.options)
    case _         => super.onRouteRequest(request)
  }
}
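The OptionsController itself isn't shown above; a minimal sketch of what its options action might look like follows, with allowed methods and headers that are assumptions you should tailor to your app:
import play.api.mvc._

object OptionsController extends Controller {
  // Answer every preflight with 200 and the CORS headers the browser expects.
  def options = Action {
    Ok("").withHeaders(
      "Access-Control-Allow-Origin"  -> "*",
      "Access-Control-Allow-Methods" -> "GET, POST, PUT, DELETE, OPTIONS",
      "Access-Control-Allow-Headers" -> "Content-Type, X-Requested-With"
    )
  }
}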

Protecting prototype.js based XHR requests against CSRF

Django has been updated to 1.3 and, ever since 1.2.5, its Cross-Site Request Forgery protection has been extended to cover XMLHttpRequests, which now need to pass a CSRF token. The Django folks helpfully provide an example for jQuery that applies the appropriate header to every XHR.
Prototype (and thus Scriptaculous) has to comply with this scheme, yet I can't find a way to tell Prototype to add the X-CSRFToken header. Ideally it would be done once, in a way that applies across the whole app (as in the jQuery example).
Is there a way to do that?
This is a wild guess but you could try extending the base AJAX class...
Ajax.Base.prototype.initialize = Ajax.Base.prototype.initialize.wrap(
  function (callOriginal, options) {
    var headers = options.requestHeaders || {};
    // getCookie is the cookie-reading helper from the Django CSRF docs
    headers["X-CSRFToken"] = getCookie("csrftoken");
    options.requestHeaders = headers;
    return callOriginal(options);
  }
);

Ruby's open-uri and cookies

I would like to store the cookies from one open-uri call and pass them to the next one. I can't seem to find the right docs for doing this. I'd appreciate it if you could tell me the right way to do this.
NOTES: w3.org is not the actual url, but it's shorter; pretend cookies matter here.
h1 = open("http://www.w3.org/")
h2 = open("http://www.w3.org/People/Berners-Lee/", "Cookie" => h1.FixThisSpot)
Update after 2 nays: While this wasn't intended as a rhetorical question, I guarantee that it's possible.
Update after tumbleweeds: See (the answer), it's possible. Took me a good while, but it works.
I thought someone would just know, but I guess it's not commonly done with open-uri.
Here's the ugly version, which doesn't check for privacy, expiration, the correct domain, or the correct path:
h1 = open("http://www.w3.org/")
h2 = open("http://www.w3.org/People/Berners-Lee/",
          "Cookie" => h1.meta['set-cookie'].split('; ', 2)[0])
Yes, it works. No, it's not pretty, nor fully compliant with recommendations, nor does it handle multiple cookies (as is).
Clearly, HTTP is a very straightforward protocol, and open-uri lets you at most of it. I guess what I really needed to know was how to get the cookie from the h1 request so that it could be passed to the h2 request (that part I already knew and showed). The surprising thing here is how many people basically felt like answering by telling me not to use open-uri, and only one of those showed how to get a cookie set in one request passed to the next request.
You need to add a "Cookie" header.
I'm not sure if open-uri can do this or not, but it can be done using Net::HTTP.
# Create a new connection object.
conn = Net::HTTP.new(site, port)
# Get the response when we login, to set the cookie.
# body is the encoded arguments to log in.
resp, data = conn.post(login_path, body, {})
cookie = resp.response['set-cookie']
# Headers need to be in a hash.
headers = { "Cookie" => cookie }
# On a get, we don't need a body.
resp, data = conn.get(path, headers)
Thanks Matthew Schinckel, your answer was really useful. Using Net::HTTP I was successful:
# Create a new connection object.
site = "google.com"
port = 80
conn = Net::HTTP.new(site, port)
# Get the response when we login, to set the cookie.
# body is the encoded arguments to log in.
resp, data = conn.post(login_path, body, {})
cookie = resp.response['set-cookie']
# Headers need to be in a hash.
headers = { "Cookie" => cookie }
# On a get, we don't need a body.
resp, data = conn.get(path, headers)
puts resp.body
Depending on what you are trying to accomplish, check out webrat. I know it is usually used for testing, but it can also hit live sites, and it does a lot of the stuff that your web browser would do for you, like store cookies between requests and follow redirects.
You would have to roll your own cookie support by parsing the meta headers when reading and adding a Cookie header when submitting a request if you are using open-uri. Consider using httpclient (http://raa.ruby-lang.org/project/httpclient/) or something like mechanize (http://mechanize.rubyforge.org/) instead, as they have cookie support built in.
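For example, a minimal sketch with mechanize, which keeps a cookie jar between requests automatically (the URLs are the same placeholders used in the question):
require 'mechanize'

agent = Mechanize.new                                       # older versions expose this as WWW::Mechanize
agent.get("http://www.w3.org/")                             # any Set-Cookie goes into agent.cookie_jar
page = agent.get("http://www.w3.org/People/Berners-Lee/")   # cookie sent back automatically
puts page.body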
There is an RFC 2109 and RFC 2965 cookie jar implementation available here, for those who want standards-compliant cookie handling:
https://github.com/dwaite/cookiejar
