I found a strange cookie problem on Safari. If you surf to http://2much.ch you can enter with FF/IE and surf inside the site.
But if you use Safari, you can enter only once; you can't surf inside the site. I found that Safari doesn't store the cookie set on entry, but FF/IE do.
What is wrong here?
It looks like you hit a Safari bug here; you are redirecting any visiting browser to /entry while setting the cookie at the same time, and Safari is ignoring the Set-Cookie header when encountering the 302 HTTP status:
$ curl -so /dev/null -D - http://4much.schnickschnack.info/
HTTP/1.1 302 Moved Temporarily
Server: nginx/0.7.61
Date: Sun, 19 Jul 2009 12:20:49 GMT
Content-Type: text/html;charset=utf-8
Connection: keep-alive
Content-Length: 14260
Content-Language: de
Expires: Sat, 1 Jan 2000 00:00:00 GMT
Location: http://4much.schnickschnack.info/entry
Set-Cookie: colorstyle="bright"; Path=/; Expires=1248092449.12
Set-Cookie: _ZopeId="73230900A39w5NG7q4g"; Path=/
Technically, this would be a bug in Apple's Foundation classes; I've found a WebKit bug report stating that this is the case.
I suppose the workaround is to set the cookie not in index_html but in entry instead.
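To illustrate the workaround, here is a minimal sketch (Node/Express; the original site runs on Zope, so this only shows the shape of the fix, not its actual code):

const express = require('express')
const app = express()

// Don't set the cookie on the 302 response...
app.get('/', (req, res) => {
    res.redirect(302, '/entry')
})

// ...set it on the destination page instead, where Safari will honor it.
app.get('/entry', (req, res) => {
    res.cookie('colorstyle', 'bright', { path: '/' })
    res.send('entry page')
})

app.listen(3000)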
In the years since I first answered this question, the issue appears to have been solved; at least it was for Safari 6, when someone tested all major browsers for Set-Cookie support on 302 redirects in 2012.
A month ago, I ran into this issue. At first I thought it was a corrupted cookie jar, as I could clear the cookies and go.
However, it popped up again. This time I spent an hour going through it, watching what was sent and reviewing what Safari sent back, and I found the problem.
In this case, I had an array of cookie values being sent to the browser after login, prior to the redirect. The values looked something like 'user id', 'user full name', 'some other id', etc.
(Yes, the ids are encrypted, so no worries there.)
My user full name was actually in a <lastname>, <firstname> format.
When Safari posted the cookie back to the server, everything after the comma following the last name was dropped; it only posted back values up to that point.
When I removed the comma, the rest of the values started working just fine.
So it appears that if you send a cookie value that contains a comma, Safari doesn't properly escape it in its internal storage. Which leads me to think that if they aren't properly escaping commas, there are probably some security issues in Safari's cookie-handling code.
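If your values can contain commas (or other delimiter characters), encoding them before they go into the cookie sidesteps the problem entirely; a minimal sketch, my suggestion rather than anything from the original setup:

// Encode the value on write so the comma survives the round trip...
document.cookie = 'fullname=' + encodeURIComponent('Doe, John') + '; path=/'

// ...and decode it on read.
function readCookie(name) {
    var match = document.cookie.match(new RegExp('(?:^|; )' + name + '=([^;]*)'))
    return match ? decodeURIComponent(match[1]) : null
}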
Incidentally, this was tested on Win 7 x64 with Safari 4.0.5. I also put up a web page at http://cookietest.livelyconsulting.com/ which showed this exact problem. (I have since removed that test site.)
IE, FF, and Chrome all set the cookie correctly; Safari does not.
Looks like this is no longer an issue. See http://blog.dubbelboer.com/2012/11/25/302-cookie.html
We've run into a very similar issue where Safari (v. 7.0.6) would ignore a cookie. The cookie header looked perfectly fine, almost identical to another cookie which was remembered.
It turned out that the culprit was the previous cookie header having a malformed expires value. Safari's handling of broken cookie headers is evidently not as robust as that of the other browsers.
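For illustration only (our real headers differed), the shape of the problem was roughly this, where the malformed Expires on the first cookie made Safari mishandle the one that followed:

Set-Cookie: prefs=abc; Expires=Thu, 01-Jan-70; Path=/
Set-Cookie: session=xyz; Expires=Wed, 01 Jan 2025 00:00:00 GMT; Path=/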
I ran into the same issue with Chrome. Chrome doesn't ignore the Set-Cookie header while you are redirecting, but you never know the order (set the cookie first, or redirect first). Here is what I tried:
I have a website which supports English and French. I implemented it (in PHP) this way:
localhost has a link to localhost/fr (which sets the cookie to French and redirects to localhost). It works. (Cookie set first.)
localhost/path1 has a link to localhost/fr?return=/path1 (which sets the cookie to French and redirects to localhost/path1). It doesn't work. (Redirect happens first; the language doesn't change.)
localhost/path1 has a link to localhost/fr?return=www.google.com (which sets the cookie to French and redirects to Google). When I come back to my website again, it's in French. (Which means the Set-Cookie to French is not ignored, only applied after the redirect.)
Hope I make myself clear, English is a foreign language to me.
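A sketch of the /fr endpoint described above (my implementation is PHP; this Node/Express version is only illustrative):

app.get('/fr', (req, res) => {
    // Set the language cookie, then redirect back to where the visitor came from.
    // (Validate req.query.return against your own URLs in real code.)
    res.cookie('lang', 'fr', { path: '/' })
    res.redirect(302, req.query.return || '/')
})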
After a great deal of pain, I found out that Safari (15.3) actually does save and send my cookie; it just never shows up in the developer tools under Storage -> Cookies, but it works fine.
Here's the cookie I create and return in a Netlify function.
// assuming the 'cookie' npm package
const cookie = require('cookie')

const secureCookie = cookie.serialize('jwtToken', JSON.stringify(jwtToken), {
    secure: process.env.CONTEXT !== 'dev',
    domain: process.env.CONTEXT === 'dev' ? 'localhost' : '.domain.com',
    httpOnly: true,
    sameSite: true,
    expires: new Date(Date.now() + (1000 * jwtToken.expires_in))
})
and the Netlify function's return value:
return {
    statusCode: 200,
    headers: {
        "Cache-Control": "no-cache",
    },
    multiValueHeaders: {
        "Set-Cookie": [secureCookie],
    },
    body: JSON.stringify(body),
}
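On later requests the cookie comes back on the function's event; a minimal sketch of reading it in another handler (assuming the same 'cookie' package; the handler name and body are illustrative):

const cookie = require('cookie')

exports.handler = async (event) => {
    // Browsers return cookies in the Cookie request header
    const cookies = cookie.parse(event.headers.cookie || '')
    const jwtToken = cookies.jwtToken ? JSON.parse(cookies.jwtToken) : null
    return {
        statusCode: 200,
        body: JSON.stringify({ authenticated: !!jwtToken }),
    }
}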
Related
I'm losing my mind here - I'm looking into an issue where some signout functionality in an application I have isn't working because the authentication cookie is not being cleared. The thing is that our "signout" endpoint does include the appropriate set-cookie header in the response - here's what I get looking at the raw response in Firefox:
set-cookie: Auth.myapp=; domain=app.mydomain.com; expires=Thu, 26-Nov-2020 13:19:20 GMT; path=/; secure; HttpOnly
Firefox is reporting this error in the console:
Cookie “Auth.myapp” has been rejected because it is already expired
This is kind of confusing; not only have I successfully used Set-Cookie with a past date in Expires before, it's even codified in RFC 6265 as the accepted way to request that a client remove a cookie:
Finally, to remove a cookie, the server returns a Set-Cookie header with an expiration date in the past. The server will be successful in removing the cookie only if the Path and the Domain attribute in the Set-Cookie header match the values used when the cookie was created.
So I need to set an expires date in the past to clear the cookie ... but doing so causes the browser to reject it? Does anyone know what's going on here?
To be clear, I have checked that the cookie name, path, secure, and SameSite attributes match (update: I suspected that not having explicitly specified a SameSite might be the cause, but after making sure the cookie is both set and cleared with SameSite=None, it is still not working).
As far as I can tell, the message is harmless - the cookies are indeed deleted based on my testing. And Googling the message leads to this changeset, which indicates that the deletion does happen, just with additional logging.
// If the new cookie has expired -- i.e. the intent was simply to delete
// the old cookie -- then we're done.
if (aCookie->Expiry() <= currentTime) {
  COOKIE_LOGFAILURE(SET_COOKIE, aHostURI, aCookieHeader,
                    "previously stored cookie was deleted");
+ CookieLogging::LogMessageToConsole(
+     aCRC, aHostURI, nsIScriptError::warningFlag,
+     CONSOLE_REJECTION_CATEGORY, "CookieRejectedExpired"_ns,
+     AutoTArray<nsString, 1>{
+         NS_ConvertUTF8toUTF16(aCookie->Name()),
+     });
  NotifyChanged(oldCookie, u"deleted", oldCookieIsSession);
  return;
}
The message seems unnecessary however, and there is a bug report about it.
This occurs when using a null, false, or empty string as the cookie value.
It is only flagged as a warning in Firefox.
Other browsers do not show this warning.
I'm having a similar problem today with the same error message as the OP.
This happened in the past, and my solution then was, rather than setting a time in the past, to set the value to nothing (as a failsafe) and set 'expires' to time() + 1 (a future time, with the cookie expiring almost instantly).
setcookie('cookiename', '', [
    'expires' => time() + 1,
    'path' => '/',
    'domain' => $_SERVER['HTTP_HOST'],
    'secure' => true,
    'samesite' => 'strict'
]);
Unfortunately, this workaround seems to have suddenly started failing today, and the failure does seem to be unique to Firefox. (So, yup, perhaps an FF bug?)
Chrome and Vivaldi are still working fine with it. I haven't tested Safari or Edge yet.
I am not sure about the exact status of this HTTP header. Some sources, for instance Mozilla and Caniuse, clearly indicate that this header was removed in Firefox 70 and replaced by Content-Security-Policy: frame-ancestors.
Despite that, I can see that X-Frame-Options: ALLOW-FROM myServerURI still works: using Firefox 75, I clearly see that setting this header or not on the server side still has an impact on an iframe; the inner content is allowed or blocked depending on whether the header is present.
Examining the server's response headers using Firefox F12 / Web Developer Tools, Network, Headers clearly shows the presence of this header and its impact on the result. In this situation, there is also a Content-Security-Policy header present, but without the frame-ancestors directive.
Something must be wrong with your test.
When I try using it in Firefox 75, I get an error in the console:
Invalid X-Frame-Options: “ALLOW-FROM http://www.example.com/” header from “http://localhost:7007/” loaded into “http://localhost:8080/”.
… and the content is displayed in the frame even though the iframe is hosted on http://localhost:8080/ and not http://www.example.com/
The ALLOW-FROM value for the X-Frame-Options header is obsolete now and not supported by newer browsers.
Refer to this link for the valid possible values: https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/X-Frame-Options
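The replacement for ALLOW-FROM is the Content-Security-Policy frame-ancestors directive; a minimal sketch of sending it (Express; the partner origin is illustrative):

const express = require('express')
const app = express()

app.use((req, res, next) => {
    // Allow framing only from the listed origin; 'none' would block all framing.
    res.set('Content-Security-Policy', 'frame-ancestors https://partner.example.com')
    next()
})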
Has anybody tried?
Here is the use case. In a first request-response cycle, this would happen:
Request 1:
GET / HTTP/1.1
...
Response 1
HTTP/1.0 200 OK
Etag: version1
Cache-control: max-age=1
... angly html here
....<link href="mycss.css" >
...
Request 2:
GET /mycss.css HTTP/1.1
...
Response 2 (probably pushed):
Etag: version1
Cache-control: max-age=<duration-of-the-universe>
...
... brackety css ...
...
and then, when the browser goes to the same page a second time, it will of course fetch the "/" resource again because of the very short max-age:
GET / HTTP/1.1
...
If-None-Match: version1
But it won't fetch mycss.css if it has it in cache. However, the server can use the validator present in the If-None-Match header of the request for "/" to get an idea of the client's cache age, and may conclude that the browser's version of mycss.css is too old. In that case, before even answering the previous request, the server can "promise" (push) a new version of mycss.css.
By the specs, should the browser accept and use it?
Overview:
I still don't know what the answer to my question is from a purely theoretical side, but at least today it doesn't seem possible in practice to do cache-busting this way :-(, with either Google Chrome or Firefox. Both reject or ignore the pushed stream if they believe that the resource they have in cache is fresh.
I also got this from somebody who prefers to remain anonymous:
Browsers will typically put resources received through push in a "demilitarized zone" and only once the client asks for that resource will it be moved into the actual cache. So just pushing random things will not make them end up in the browser cache even if the browser accepts them at the push moment.
Update
As of early 2016, it is still not possible, due mainly to a lack of consensus on how this should be handled, and whether it should be allowed at all.
As this page shows, even with HTTP/2, the way to solve the stale assets issue is to create a unique URL for each asset version, and then ensure that the user receives that new URL when they re-visit the page.
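A minimal sketch of that unique-URL approach (my illustration, not from the linked page): derive a short content hash for each asset, embed it in the URL, and serve the asset with a long max-age, so a changed file automatically gets a new URL.

const crypto = require('crypto')
const fs = require('fs')

// Build a versioned URL for an asset from a hash of its current contents.
function assetUrl(path) {
    const hash = crypto.createHash('sha256')
        .update(fs.readFileSync(path))
        .digest('hex')
        .slice(0, 8)
    return '/' + path + '?v=' + hash // e.g. /mycss.css?v=3f2a9c1b
}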
My users have configured their sessions to be restored each time they reopen their Firefox browser, so the 'session' cookies come back.
But my website needs fresh authentication if the session cookie is not present or is 24 hours old, so I have the problem of needing to manually remove the expired cookies each time I reopen the browser after 24 hours.
To combat this, I tried to put a 'Logout' link on my page, which should have helped me, but unfortunately it is not helping...
I tried the code below to remove the cookies, but it seems it does not remove them from the SQLite table in which Firefox stores its cookies. After the following code is run, the cookies reappear... (or are they not getting removed???) How can I achieve that?
code:
function Delete_Cookie(name, path, domain) {
    document.cookie = name + "=" +
        ((path) ? ";path=" + path : "") +
        ((domain) ? ";domain=" + domain : "") +
        ";expires=Thu, 01 Jan 1970 00:00:01 GMT";
}

$("#Logout").click(function() {
    Delete_Cookie('SecOne', '/', '.mydomain.com');
    Delete_Cookie('SecTwo', '/', '.mydomain.com');
    alert("Bye");
});
Although I can't say for certain what the problem is, there are a few possibilities:
The domain name could be wrong.
You could write a test delete line that doesn't use the path and domain and see if it works then (see the snippet after this list). If the cookies get deleted after that change, then it's the string you build for the delete, or the parameters passed in, that are wrong.
It is also possible another section of your code is causing the cookies to reappear. Check whether any other cookie-setting code gets called after this one.
Or maybe the page needs to be reloaded for the cookies to disappear.
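For the test-delete suggestion, a minimal version might look like this (cookie name taken from the question):

// Delete without path/domain attributes; if this works, the strings built
// for path/domain in Delete_Cookie are the culprit.
document.cookie = 'SecOne=; expires=Thu, 01 Jan 1970 00:00:01 GMT'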
If the cookie is an HttpOnly cookie, it cannot be read or deleted from JavaScript. If the cookie is a normal one, the function below can be used to delete it.
function deleteCookie(keyName) {
    var allcookies = document.cookie,
        cookiearray = allcookies.split(';'),
        i, name;
    for (i = 0; i < cookiearray.length; i++) {
        name = cookiearray[i].split('=')[0].trim();
        if (name === keyName) {
            // Overwrite the cookie with a past expiry date so the browser drops it
            document.cookie = name + '=000;expires=Thu, 1 Jan 1970 00:00:00 UTC; path=/';
        }
    }
}
Note: Session cookies may fail to be deleted even when the browser is closed. This is a known bug in Chrome; refer to the link below.
Cookie issue in Chrome
I know this is an old thread, but I was having the same problem: I kept trying to set a cookie with a time in the past, thinking it would expire and not show in the FF cookie manager.
Setting the cookie to expire in the future but with a blank value got rid of it. I'm not sure why.
setcookie('mycookie', '', time() + 5000, '/');
Using Firefox v40.0.3.
It seems a few bugs have been logged against Firefox and its handling of cookie expiration.
I'm implementing a "pass-through" for X-Frame-Options to let a partner site wrap my employer's site in an iframe, as per this article: http://blogs.msdn.com/b/ieinternals/archive/2010/03/30/combating-clickjacking-with-x-frame-options.aspx
In a nutshell, our partner's page has an iframe with an URL against our domain.
For any page in our domain, they'll add a special URL argument like &mykey=topleveldomain.com, telling us what the page's top-level domain is.
Our filters pick up the partner TLD, if provided, from the URL, and validate it against a whitelist. If it's on the list, we ship the X-Frame-Options header with value ALLOW-FROM topleveldomain.com (and add a cookie for future clicks). If it's not on our whitelist, we ship SAMEORIGIN or DENY.
The problem is that sending ALLOW-FROM domain looks like a no-op overall for the latest Firefox and Google Chrome. IE8, at least, seems to implement ALLOW-FROM correctly.
Check out this page: http://www.enhanceie.com/test/clickjack. Right after the 5th (of 5) boxes that "should be showing content", is a box that should NOT be showing content, but which is. In this case, the page in the iframe is sending X-Frame-Options: ALLOW-FROM http://www.debugtheweb.com, a decidedly different TLD than http://www.enhanceie.com. Yet, the frame still displays content.
Any insight as to whether X-Frame-Options is truly implemented with ALLOW-FROM across relevant (desktop) browsers? Perhaps the syntax has changed?
Some links of interest:
Draft RFC on X-Frame-Options: https://datatracker.ietf.org/doc/html/draft-gondrom-frame-options-01
developer.mozilla.org article discussing the header as a two-option header (SAMEORIGIN or DENY): https://developer.mozilla.org/en-US/docs/Web/HTTP/X-Frame-Options
MSDN blog that initiated the whole thing: http://blogs.msdn.com/b/ie/archive/2009/01/27/ie8-security-part-vii-clickjacking-defenses.aspx
MSDN blog that talks about three values, adding ALLOW-FROM origin: http://blogs.msdn.com/b/ieinternals/archive/2010/03/30/combating-clickjacking-with-x-frame-options.aspx
ALLOW-FROM is not supported in Chrome or Safari. See MDN article: https://developer.mozilla.org/en-US/docs/Web/HTTP/X-Frame-Options
You are already doing the work to build a custom header and send it with the correct data; can you not just exclude the header when you detect the request is from a valid partner, and add DENY to every other request? I don't see the benefit of ALLOW-FROM when you are already building the logic up dynamically.
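A minimal sketch of that idea (Express; the whitelist and query parameter are illustrative):

const PARTNERS = new Set(['partner.example.com'])

app.use((req, res, next) => {
    // Only send a blocking header when the requester isn't a whitelisted partner.
    if (!PARTNERS.has(req.query.mykey)) {
        res.set('X-Frame-Options', 'DENY')
    }
    next()
})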
I posted this question and never saw the feedback (which came in several months later, it seems :).
As Kinlan mentioned, ALLOW-FROM is not supported in all browsers as an X-Frame-Options value.
The solution was to branch based on browser type. For IE, ship X-Frame-Options. For everyone else, ship X-Content-Security-Policy.
Hope this helps, and sorry for taking so long to close the loop!
For Chrome, instead of
response.AppendHeader("X-Frame-Options", "ALLOW-FROM " + host);
you need to add Content-Security-Policy
string selfAuth = System.Web.HttpContext.Current.Request.Url.Authority;
string refAuth = System.Web.HttpContext.Current.Request.UrlReferrer.Authority;
response.AppendHeader("Content-Security-Policy", "default-src 'self' 'unsafe-inline' 'unsafe-eval' data: *.msecnd.net vortex.data.microsoft.com " + selfAuth + " " + refAuth);
to the HTTP-response-headers.
Note that this assumes you checked on the server whether or not refAuth is allowed.
Also, note that you need to do browser detection in order to avoid adding the ALLOW-FROM header for Chrome (it logs an error to the console).
For details, see my answer here.