I'm using an Angular service to GET a resource via a REST API. The server sets the ETag header to some value and it also sets Cache-Control: no-cache in its response.
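With those headers, the browser is expected to cache the response but revalidate it on every use, roughly like this (the path and ETag value here are made up):

GET /api/resource HTTP/1.1
If-None-Match: "abc123"

HTTP/1.1 304 Not Modified
ETag: "abc123"
Cache-Control: no-cache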
This works as expected in Firefox, but when I access the same app in Chrome, it does not send the If-None-Match header. I've tried the current Chrome dev and stable channels on both a Mac and an Ubuntu box with the same result, while Firefox added the If-None-Match correctly on both.
Now, there are other non-XHR/static resources that are fetched conditionally, and all those requests correctly get a 304 Not Modified response.
Is there anything I can do to get more information about why Chrome is not sending the If-None-Match header only for XHR requests?
If you're issuing an Ajax query in Chrome over HTTPS, any certificate errors, such as using a self-signed cert on your API server, prevent the response from being cached. This seems to be by design.
Evidently a Chrome defect existed, but it was fixed in WebKit and made it into Chromium/Chrome around 2010.
Another question recommends setting the If-Modified-Since and If-None-Match headers manually using jQuery's ifModified: true and cache: true options. Unfortunately this won't override Chrome's intended behavior of not caching HTTPS responses from a server with a self-signed certificate.
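For reference, the jQuery options mentioned there look like this in a call (a minimal sketch; the URL is a placeholder):

$.ajax({
  url: 'https://api.example.com/resource', // placeholder endpoint
  type: 'GET',
  ifModified: true, // let jQuery send If-Modified-Since / If-None-Match based on its last response
  cache: true,      // don't append a cache-busting _=timestamp parameter
  success: function (data, textStatus) {
    // textStatus is 'notmodified' when the server answers 304
  }
});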
Testing on a server with a valid signed SSL certificate solved the issue for me; Chrome received 304s for text/html content as expected, using the default jQuery AJAX methods.
Related
Is it possible to use HTTP caching for conditional GET requests over a secure HTTPS connection? I've got caching working over non-secure HTTP, but when I switch to HTTPS the browser stops sending If-None-Match and If-Modified-Since headers, so the caching breaks. I've tried various Cache-Control settings like public, max-age=3600 and whatnot, no dice.
This happens in both Safari and Chrome, so I'm assuming SSL is breaking it somehow. Is caching not allowed over SSL?
And just to be clear, the server is indeed properly setting the ETag and Last-Modified headers, but the browser is not sending If-None-Match and If-Modified-Since in the request, according to the Chrome developer tools.
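To illustrate, a response that should enable conditional revalidation carries headers along these lines (values here are illustrative):

Cache-Control: public, max-age=3600
ETag: "abc123"
Last-Modified: Tue, 15 Nov 2011 12:45:26 GMT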
Thanks for your help.
Figured it out! Turns out you have to have a trusted certificate. I was using my self-signed test certificate for SSL/HTTPS. Adding it to my keychain and marking it trusted (so the address bar turns green) made the caching work.
I have developed a web application that makes ajax requests to a web service on a server in a different domain from the server that hosts the web app.
I have configured the web service to respond to the preflight check with the necessary headers to allow a cross-domain request.
In the web app I am using a jQuery client to access the web service. I have set the properties on the jQuery call to allow cross-domain access.
$.support.cors = true;
In Chrome this all works fine. In IE9, however, the cross-domain behavior is only partially successful. All GET requests work, but POST requests with a content type of application/json fail because IE9 refuses to make POST requests with any content type except text/html. IE9 switches the content type on the request, and the request fails on the server with a 400 Bad Request.
I had read that with IE10 the cross-domain request would work as in Chrome. But after just testing this, I find that IE10 has the same behavior as IE9: the browser will not set the content type to application/json, so POST requests fail.
Does anyone know whether it is possible in IE10 to do cross-domain POST requests with content types other than text/html? This makes writing web apps that do anything more than display data extremely difficult.
Are there other settings I need to make on the jQuery request? Or in the service preflight?
What does your $.ajax() call look like? You could try adding dataType: 'json' to your jQuery call in order to force the data type to be JSON. You also shouldn't need to set $.support.cors = true; jQuery should figure this out for you (but it's OK to leave it in for now).
I do have the data type param set (dataType: 'json').
Chrome honors this, but IE switches to text/html. I had read that this was a known issue in IE9 and below, but that IE10 would be using the same AJAX implementation as Chrome; this is apparently not the case.
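For reference, a cross-domain JSON POST with jQuery typically looks something like this (a minimal sketch; the URL and payload are placeholders):

$.ajax({
  url: 'https://api.example.com/items', // placeholder cross-domain endpoint
  type: 'POST',
  contentType: 'application/json',      // non-simple content type: the browser sends a CORS preflight first
  dataType: 'json',                     // parse the response as JSON
  data: JSON.stringify({ name: 'test' }),
  success: function (result) {
    console.log(result);
  }
});

Note that IE9's XDomainRequest object (which jQuery only uses through a transport plugin) does not allow setting a custom Content-Type header at all, which may explain the behavior described above.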
I'm working on extensions for Firefox and Chrome. The data used by my extensions is mostly generated from ajax requests. The type of data being returned is private, so it needs to be secure. My server supports https and the ajax calls are being sent to an https domain. Information is being sent back and forth, and the extensions are working correctly.
My questions are:
Do the extensions actually make secure connections with the server, or is this considered the same as cross-domain posting, i.e. sending a request from an HTTP page to an HTTPS page?
Am I putting my users' information at more risk during the transfers than if the user were to access the information directly from an https web page in the browser?
Thanks in advance!
The browser absolutely makes a secure connection when you use HTTPS. A browser would never silently downgrade the security of your connection: it will either complete the request as written or throw some sort of error if that is not possible.
Extensions for both Chrome and Firefox are permitted to make cross-domain AJAX requests. In Chrome, you simply need to supply the protocol and host as a permission in your manifest.json. In Firefox, I think you may need to use Components.classes to get a cross-domain requester, as described in the MDN page for Using XMLHttpRequest, but I'm not 100% sure about that. Just try doing a normal request and see if it succeeds; if not, use the Components.classes approach.
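For Chrome, the host permission mentioned above goes in manifest.json, roughly like this (a sketch; the host pattern is a placeholder):

{
  "manifest_version": 2,
  "name": "My Extension",
  "version": "1.0",
  "permissions": [
    "https://api.example.com/*"
  ]
}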
I am trying to make cross-domain requests with Safari on Windows. My Safari version is 5.1.2.
This is a classic question. I have read in many places that Chrome and Safari allow cross-domain requests as long as the server responds with the following header:
Access-Control-Allow-Origin: *
I have read this post:
How to allow cross-domain requests in Safari?
and many others on the Stack Overflow site too. However, none of them answers my question.
I am having problems with Chrome AND Safari doing cross-domain AJAX requests even though I am sending the necessary header back from the server.
I finally ran Chrome with "--disable-web-security". Then it worked.
My questions:
1) What do I do with Safari? Do I use a similar command line argument?
2) More importantly, can someone please tell me whether cross-domain functionality is allowed in Chrome and Safari by default as long as the server responds with the header, or do I have to make sure that
a) the server responds with the header
AND
b) the browser is started with a proper argument?
I found the problem. Reading more about CORS helped (html5rocks.com/en/tutorials/cors). I realized that my requests were triggering preflight requests (OPTIONS) and the server was not set up to handle them properly. The reason they were triggering preflights was that I was using jQuery, and it was adding a custom header to my requests. I modified my code to prevent the addition of this extra header, and my requests no longer needed preflights. Now I do not have to disable web security, and it works fine.
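To illustrate the difference (a sketch only; the URL is a placeholder, and X-Custom-Header stands in for whatever header jQuery was adding):

// A "simple" CORS request: standard headers and a simple content type, so no preflight is sent
$.ajax({
  url: 'https://api.example.com/data',
  type: 'GET',
  dataType: 'json'
});

// A custom header makes the request non-simple, so the browser first sends an
// OPTIONS preflight that the server must answer with Access-Control-Allow-Origin,
// Access-Control-Allow-Methods and Access-Control-Allow-Headers
$.ajax({
  url: 'https://api.example.com/data',
  type: 'GET',
  dataType: 'json',
  headers: { 'X-Custom-Header': 'value' } // hypothetical header; triggers the preflight
});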
I'm testing my servlet using Google Chrome. When I tried to load the same URL twice, say,
localhost/myserver/servlet
Chrome only sent out one request to the server. However, if I modified the second URL to be:
localhost/myserver/servlet?id=2
it sent two different requests.
I've enabled incognito mode, but it seems that Chrome shares its cache and URLs between all incognito tabs.
Cache control is part of the HTTP specification; it's worth reading up on. Using HTTP headers like Cache-Control: no-cache or Expires: ... should help you.
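For example, a response that should keep Chrome from silently reusing its cached copy might carry headers along these lines (a sketch; the exact directives depend on your needs):

Cache-Control: no-cache, no-store, must-revalidate
Pragma: no-cache
Expires: 0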