Web app doesn't work in Internet Explorer - session

My application works beautifully on Chrome, Firefox, Safari, and Opera, but fails on Internet Explorer 11 (I haven't tried earlier versions yet). Basically, I cannot log in. I have narrowed the problem down to an issue with cookies. On a login attempt, the request reaches the server fine, and the response is exactly the same as in any other browser. But for whatever reason, even though the Set-Cookie header is present, the cookie is never saved and never sent on the next request to the server. My application has a (mostly) RESTful backend, so it authenticates on most requests, and this obviously fails because the cookie is not set!
This problem persists whether the app is served from localhost or from a real server. I should mention that it is a cross-origin request, but the proper headers are in place (and they work in other browsers).
The response headers (when served from localhost) are as follows:
HTTP/1.1 200 OK
Access-Control-Allow-Credentials: true
Access-Control-Allow-Headers: accept, content-type
Access-Control-Allow-Methods: POST, PATCH, DELETE
Access-Control-Allow-Origin: http://localhost:8000
Cache-Control: no-cache,no-store
Content-Length: 68
Content-Type: application/json
Date: Thu, 05 Mar 2015 20:00:20 GMT
Frame-Ancestors: none
Server: waitress
Set-Cookie: session-id=ZqANyJ4KRUFoOaVgtGs/hcX+fxjMcCVM0kdRqF4riHglfAeBJJK56X9wn0XsNdPwUg; Domain=; HttpOnly;
Strict-Transport-Security: max-age=31536000;
X-Frame-Options: deny
I cannot figure out why this fails only in IE11! The app is not in an iframe, and the cookie actually appears in the cookies tab of the IE developer tools when I click on the network request, but it is not sent along with any subsequent requests.
Some of my attempted solutions were:
Adding the Cache-Control header
Removing the HttpOnly flag from the cookie
Adding the Domain attribute to the cookie (set to localhost or 127.0.0.1)
Removing headers like X-Frame-Options, Frame-Ancestors, and Strict-Transport-Security
Trying multiple computers
Changing the security settings to accept all cookies
All failed. Ideas?

Related

Firefox does not keep cookies set by a cross-domain response even with all the CORS allow headers

I am experiencing a problem with Firefox while Chrome works fine. Here is the situation:
Website1.com returns an HTML page over SSL.
This page makes a request to Website2.com, also over SSL, either via an img tag or via XMLHttpRequest (same issue).
Website2.com returns a cookie to be set for itself.
Firefox ignores this cookie. It is never stored, even though it shows up in the console.
The console doesn't complain about anything.
Client sends:
Origin: https://website1.com
Server returns:
Access-Control-Allow-Credentials: true
Access-Control-Allow-Headers: *
Access-Control-Allow-Methods: *
Access-Control-Allow-Origin: https://website1.com
Access-Control-Expose-Headers: *
Set-Cookie: ...
What else am I missing about CORS?
Thanks!
Access-Control-Allow-Credentials: true
is a special flag: if one side declares it, the other side has to declare it too, or the browser treats it as a security failure and will not accept the data.
So enable credentials on the client side of the request as well. (Or, if you control the server, consider doing without cookies and passing the data through some other mechanism.)
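For an XMLHttpRequest, the client-side counterpart is the withCredentials flag rather than a header. A minimal sketch (the URL is a placeholder):

    // Cross-origin request that both sends and accepts cookies.
    var xhr = new XMLHttpRequest();
    xhr.open('GET', 'https://website2.com/api/data');
    // Pairs with Access-Control-Allow-Credentials: true on the server.
    xhr.withCredentials = true;
    xhr.onload = function () {
      console.log(xhr.responseText);
    };
    xhr.send();

With fetch, the equivalent is passing credentials: 'include' in the request options.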

Browser serving an obsolete Authorization header from cache

My client gets logged out after an innocent request to my server. I control both ends, and after a lot of debugging I've found out that the following happens:
The client sends the request with a correct Authorization header.
The server responds with 304 Not Modified without any Authorization header.
The browser serves the full response including an obsolete Authorization header as found in its cache.
From now on, the client uses the obsolete Authorization and gets kicked out.
From what I know, the browser must not cache any request containing Authorization. Nonetheless,
chrome://view-http-cache/http://localhost:10080/api/SearchHost
shows
HTTP/1.1 200 OK
Date: Thu, 23 Nov 2017 23:50:16 GMT
Vary: origin, accept-encoding, authorization, x-role
Cache-Control: must-revalidate
Server: 171123_073418-d8d7cb0 =
x-delay-seconds: 3
Authorization: Wl6pPirDLQqWqYv
Expires: Thu, 01 Jan 1970 00:00:00 GMT
ETag: "zUxy1pv3CQ3IYTFlBg3Z3vYovg3zSw2L"
Content-Encoding: gzip
Content-Type: application/json;charset=utf-8
Content-Length: 255
The odd Server header replaces the Jetty Server header (which shouldn't be exposed for security reasons) with some internal information; ignore that. This is what curl says:
< HTTP/1.1 304 Not Modified
< Date: Thu, 23 Nov 2017 23:58:18 GMT
< Vary: origin, accept-encoding, authorization, x-role
< Cache-Control: must-revalidate
< Server: 171123_073418-d8d7cb0 =
< ETag: "zUxy1pv3CQ3IYTFlBg3Z3vYovg3zSw2L"
< x-delay-seconds: 3
< Content-Encoding: gzip
This happens in Firefox, too, although I can't reproduce it at the moment.
The RFC continues, and it looks like the answer linked above is not exact:
unless a cache directive that allows such responses to be stored is present in the response
It looks like the response is cacheable. That's fine; I do want the content to be cached, but I don't want the Authorization header to be served from the cache. Is this possible?
Explanation of my problem
My server used to send the Authorization header only when responding to a login request. That used to work fine; the problems come with new requirements.
Our site allows users to stay logged in arbitrarily long (we do no sensitive business). We are changing the format of the authorization token, and we don't want to force all users to log in again because of it. Therefore, I made the server send the updated authorization token whenever it sees an obsolete but valid one. So now any response may contain an authorization token, but most of them do not.
The browser cache, by combining the still-valid response body with an obsolete authorization token, gets in the way.
As a workaround, I made the server send no ETag when an authorization token is present. It works, but I'd prefer a cleaner solution.
The quote in the linked answer is misleading because it omitted an important part: "if the cache is shared".
Here's the correct quote (RFC7234 Section 3):
A cache MUST NOT store a response to any request, unless: ... the Authorization header field (see Section 4.2 of [RFC7235]) does not appear in the request, if the cache is shared,
That part of the RFC is basically a summary.
This is the complete rule (RFC7234 Section 3.2) that says essentially the same thing:
A shared cache MUST NOT use a cached response to a request with an Authorization header field (Section 4.2 of [RFC7235]) to satisfy any subsequent request unless a cache directive that allows such responses to be stored is present in the response.
Is a browser cache a shared cache?
This is explained in the Introduction section of the RFC:
A private cache, in contrast, is dedicated to a single user; often, they are deployed as a component of a user agent.
That means a browser cache is a private cache. It is not a shared cache, so the rule above does not apply, which means both Chrome and Firefox are doing their jobs correctly.
Now the solution.
The specification mentions the possibility of reusing a cached response without its Authorization header.
Unfortunately, it also says that this feature is not widely implemented.
So the easiest, and also the most future-proof, solution I can think of is to make sure that any response containing an Authorization token isn't cached.
For instance, whenever the server sees an obsolete but valid Authorization token, it can send a new valid one along with Cache-Control: no-store to disallow caching.
Also, never send Cache-Control: must-revalidate together with an Authorization header, because the must-revalidate directive actually allows the response to be cached, including by shared caches, which can cause even more problems in the future:
... unless a cache directive that allows such responses to be stored is present in the response.
In this specification, the following Cache-Control response directives (Section 5.2.2) have such an effect: must-revalidate, public, and s-maxage.
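Concretely, the token-refresh branch could disable caching only for the responses that carry a token. A sketch in Node/Express style (the real server here is Jetty-based, and isObsoleteButValid and issueFreshToken are made-up helper names):

    // Express-style middleware sketch; the token helpers are hypothetical.
    app.use(function (req, res, next) {
      var token = req.get('Authorization');
      if (token && isObsoleteButValid(token)) {
        res.set('Authorization', issueFreshToken(token));
        res.set('Cache-Control', 'no-store'); // keep this response out of any cache
      }
      next();
    });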
My current solution is to send an Authorization header in every response, using a placeholder value of - when no authorization is wanted.
The placeholder value is obviously meaningless; the client knows that and happily ignores it.
This solution is ugly, as it adds maybe 20 bytes to every response, but that's still better than occasionally having to resend a whole response body, as with the approach mentioned in my question. Moreover, with HTTP/2 it'll be practically free.
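A sketch of that placeholder approach in the same Express style (maybeRefreshToken is again a made-up helper):

    // Every response carries an Authorization header; "-" means "nothing new".
    app.use(function (req, res, next) {
      var fresh = maybeRefreshToken(req.get('Authorization')); // hypothetical
      res.set('Authorization', fresh || '-'); // cached copies become harmless
      next();
    });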

Cookie not set after HTTP request with Set-Cookie response header

Context: I'm trying to couple a separate frontend (Nuxt.js) with a Laravel backend. Session (logged in user etc.) is maintained by the backend and should be stored and updated in the frontend using cookies. I am making API calls using Axios.
I am currently running my frontend on localhost:3000 and my backend on 127.0.0.1:8000. When I make API calls from the frontend, I get the following headers in the response:
Access-Control-Allow-Credentials:true
Access-Control-Allow-Origin:http://localhost:3000
Cache-Control:no-cache, private
Connection:close
Content-Type:application/json
Date:Sun, 15 Oct 2017 13:05:24 GMT
Host:127.0.0.1:8000
Set-Cookie:laravel_session=fuqQf1fX3ZwQYl7xORGPopgZhD4qw5Mfi8lFrHTJ; expires=Sun, 15-Oct-2017 15:05:24 GMT; Max-Age=7200; path=/;
Vary:Origin
X-Powered-By:PHP/7.0.10
From what I understand, the browser should now set/update the laravel_session cookie. However, when I check the cookies in Chrome devtools, nothing changes.
Are the different URLs (or at least ports) an issue here? Am I missing some kind of header or directive that is required? I've done some research but haven't found a solution yet.
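One thing worth checking: a cross-origin Axios request neither sends nor stores cookies unless withCredentials is enabled on the client, matching the Access-Control-Allow-Credentials: true header above. A minimal sketch (the endpoint path is a placeholder):

    // Cross-origin call that lets the browser store the session cookie,
    // assuming the server keeps sending Access-Control-Allow-Credentials: true.
    axios.get('http://127.0.0.1:8000/api/user', { withCredentials: true })
      .then(function (response) {
        console.log(response.data);
      });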

Does If-None-Match need to be set programmatically in an AJAX request if the server sends an ETag?

My question is pretty simple, although I have not found a simple, satisfying answer while searching around.
I am using a jQuery AJAX request to get data from a server. The server hosts a REST API that sets the ETag and Cache-Control headers on GET responses. The server also sets CORS headers to allow the ETag.
The client of the API is a browser web app, and I call the API with AJAX requests. Here are the response headers from the server after a simple GET request:
Status Code: 200 OK
Access-Control-Allow-Origin: *
Cache-Control: no-transform, max-age=86400
Connection: Keep-Alive
Content-Encoding: gzip
Content-Type: application/json
Date: Sun, 30 Aug 2015 13:23:41 GMT
Etag: "-783704964"
Keep-Alive: timeout=15, max=99
Server: Apache-Coyote/1.1
Transfer-Encoding: chunked
Vary: Accept-Encoding
access-control-allow-headers: X-Requested-With, Content-Type, Etag,Authorization
access-control-allow-methods: GET, POST, DELETE, PUT
All I want to know is:
All I want to know is: do I need to manually collect the ETag from the response headers sent by the server and attach an If-None-Match header to the AJAX request, or does the browser send it by default in a conditional GET request when it has an ETag?
I have done some debugging in the browser's network console, and it seems the browser is doing the conditional GET automatically and sets the If-None-Match header.
If that is right: suppose I created a new resource and then issued the GET request. It gives me the old cached data the first time, but when I reload the page, it gives the updated data. So I am confused: if the dataset on the server side has changed and the server sends a different ETag, why doesn't the browser fetch the updated data set unless I reload?
Also, in the case of pagination: suppose I have a URL /users?next=0, where next is a query param whose value changes for every new request. Since each response will get its own ETag, will the browser store the ETag per request URL, or does it just store the latest ETag from the previous GET request, irrespective of the URL?
Well, I have somehow figured out the solution myself:
The browser sends the If-None-Match header itself when it sees that the URL had an ETag header on a previous response. The browser saves the ETag with respect to that URL, so it does not matter how many requests to different URLs happen.
Also, a trick to force the browser to perform a conditional GET to check the ETag:
Set the max-age directive to a low value (for me, 60s works great).
Once the cache expires, the browser will send a conditional GET to check whether the expired cached resource is still valid. If the If-None-Match header matches the ETag, the server sends back a 304 Not Modified response, which means the expired cached resource is valid and can be reused.
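Normally the browser handles all of this transparently, but if you ever need to control validation yourself, the header can also be attached by hand. A sketch with jQuery (the URL and the tag storage are illustrative):

    var savedEtag = null; // remembered from an earlier response for this URL

    $.ajax({
      url: '/users?next=0',
      type: 'GET',
      beforeSend: function (xhr) {
        if (savedEtag) {
          xhr.setRequestHeader('If-None-Match', savedEtag); // manual conditional GET
        }
      },
      success: function (data, textStatus, xhr) {
        savedEtag = xhr.getResponseHeader('Etag'); // store the tag for next time
      }
    });

jQuery also has a built-in ifModified: true option that manages the If-None-Match header for you.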

Can Firefox be made to accept third-party cookies from an AJAX response header?

I'm writing some code that makes an AJAX request to our web server. Our server runs some logic and then responds with some JSON. It may also respond with a Set-Cookie header:
Set-Cookie: our_organisation=[uuid]; domain=.our_organisation.com; path=/; expires=[soon]
It works in Chrome and Safari as far as I can tell, but not in Firefox. Firefox will accept the cookie if it's an image request instead. Am I doing something wrong here?
I already had a problem where I couldn't read the AJAX response on the client side in Firefox; this was fixed by setting Access-Control-Allow-Origin: * in the response header.
Is this a cross-site XMLHttpRequest?
If so: per http://dev.w3.org/2006/webapi/XMLHttpRequest-2/, withCredentials defaults to false, so the "credentials flag" used for CORS is set to false; then, per http://dvcs.w3.org/hg/cors/raw-file/tip/Overview.html, the "block cookies" flag is set during the HTTP GET; and per http://www.whatwg.org/specs/web-apps/current-work/multipage/fetching-resources.html#fetch, that means Set-Cookie headers are ignored. It sounds like Chrome and Safari are simply not following the specs here.
You can set withCredentials = true on the XHR object to send cookies. But note that if you do that you have to list an actual origin in Access-Control-Allow-Origin; you can't just use *.
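On the server side, that means echoing back one concrete origin instead of the wildcard. A sketch in Node/Express style (the origin value is illustrative, taken from the cookie domain above):

    // Credentialed CORS: a concrete origin must be echoed back; "*" won't do.
    app.use(function (req, res, next) {
      var origin = req.get('Origin');
      if (origin === 'https://www.our_organisation.com') { // explicit whitelist
        res.set('Access-Control-Allow-Origin', origin);
        res.set('Access-Control-Allow-Credentials', 'true');
      }
      next();
    });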
