Is X-Frame-Options ALLOW-FROM really deprecated? - firefox

I am not sure about the exact status of this HTTP header. Some sources - for instance Mozilla or Caniuse - clearly indicate that this directive was removed in Firefox 70 and replaced by Content-Security-Policy: frame-ancestors.
Despite that, I can see that X-Frame-Options: ALLOW-FROM myServerURI still works: using Firefox 75, I clearly see that setting this header server-side still has an effect on an iframe: the inner content is allowed or blocked depending on whether the header is present.
Examining the server's response headers in Firefox (F12 / Web Developer Tools > Network > Headers) clearly shows the presence of this header and its impact on the result. In this situation there is also a Content-Security-Policy header present, but without the frame-ancestors directive.

Something must be wrong with your test.
When I try using it in Firefox 75, I get an error in the console:
Invalid X-Frame-Options: “ALLOW-FROM http://www.example.com/” header from “http://localhost:7007/” loaded into “http://localhost:8080/”.
… and the content is displayed in the frame even though the iframe is hosted on http://localhost:8080/ and not http://www.example.com/

The ALLOW-FROM value for the X-Frame-Options header is obsolete and no longer supported by current browsers.
See this link for the valid values: https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/X-Frame-Options
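For completeness, here is what the modern replacement looks like. This is a minimal sketch as ASP.NET Core middleware (the framework choice and the https://partner.example origin are assumptions for illustration):
app.Use(async (context, next) =>
{
    // frame-ancestors supersedes X-Frame-Options: ALLOW-FROM in browsers that
    // removed it (e.g. Firefox 70+); 'self' keeps same-origin framing working.
    context.Response.Headers["Content-Security-Policy"] =
        "frame-ancestors 'self' https://partner.example";
    await next();
});
Browsers that support frame-ancestors are supposed to ignore X-Frame-Options when both headers are present.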

Related

Session Variables getting reset in Mac [duplicate]

I found a strange cookie problem in Safari. If you surf to http://2much.ch you can enter with FF/IE and browse around inside the site.
But if you use Safari, you can enter only once; you can't browse inside the site. I found that Safari doesn't set the cookie on entry, but FF/IE do.
What is wrong here?
It looks like you've hit a Safari bug here: you are redirecting every visiting browser to /entry while setting the cookie at the same time, and Safari ignores the Set-Cookie header when it encounters the 302 HTTP status:
$ curl -so /dev/null -D - http://4much.schnickschnack.info/
HTTP/1.1 302 Moved Temporarily
Server: nginx/0.7.61
Date: Sun, 19 Jul 2009 12:20:49 GMT
Content-Type: text/html;charset=utf-8
Connection: keep-alive
Content-Length: 14260
Content-Language: de
Expires: Sat, 1 Jan 2000 00:00:00 GMT
Location: http://4much.schnickschnack.info/entry
Set-Cookie: colorstyle="bright"; Path=/; Expires=1248092449.12
Set-Cookie: _ZopeId="73230900A39w5NG7q4g"; Path=/
Technically, this would be a bug in Apple's Foundation classes; I've found a WebKit bug report that states this is the case.
I suppose the workaround is to set the cookie not in index_html but in entry instead.
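A minimal sketch of that workaround, written in ASP.NET syntax purely for illustration (the site in question runs Zope, so the names here are hypothetical):
// In the handler for /entry: the response is a plain 200 here, so Safari
// honors the Set-Cookie that a 302 would have caused it to drop.
Response.Cookies["colorstyle"].Value = "bright";
Response.Cookies["colorstyle"].Path = "/";
// The index handler then only redirects and sets no cookies:
// Response.Redirect("/entry");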
In the years since I first answered this question, the issue appears to have been solved; at least it was for Safari 6, when someone tested all major browsers for Set-Cookie support on 302 redirects in 2012.
A month ago, I ran into this issue. At first I thought it was a corrupted cookie jar, as I could clear out the cookies and carry on.
However, it popped up again. This time I spent an hour going through it, watching what was sent and reviewing what Safari sent back, and I found the problem.
In this case, I had an array of cookie values being sent to the browser after login, prior to the redirect. The values looked something like 'user id', 'user full name', 'some other id', etc.
(Yes, the IDs are encrypted, so no worries there.)
My user full name was actually in a <lastname>, <firstname> format.
When Safari posted the cookie back to the server, everything after the comma following the last name was dropped; it only posted back values up to that point.
When I removed the comma, the rest of the values started working just fine.
So it appears that if you send a cookie value containing a comma, Safari doesn't properly escape it in its internal storage. That leads me to think that if they aren't properly escaping commas, there are probably some security issues lurking in Safari's cookie-handling code.
Incidentally, this was tested on Win 7 x64 with Safari 4.0.5. I also put up a web page at http://cookietest.livelyconsulting.com/ which showed this exact problem. (I have since removed that test site.)
IE, FF, and Chrome all set the cookie correctly; Safari does not.
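A minimal sketch of the defensive fix, again in ASP.NET syntax with hypothetical names: percent-encode cookie values so a comma (or any other separator) survives every browser's cookie handling.
// Encode on the way out: "Doe, John" becomes "Doe%2C%20John".
string fullName = "Doe, John";
Response.AppendHeader("Set-Cookie",
    "fullName=" + Uri.EscapeDataString(fullName) + "; Path=/");
// Decode on the way back in:
// string name = Uri.UnescapeDataString(Request.Cookies["fullName"].Value);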
Looks like this is no longer an issue. See http://blog.dubbelboer.com/2012/11/25/302-cookie.html
We ran into a very similar issue where Safari (v7.0.6) would ignore a cookie. The cookie header looked perfectly fine, almost identical to another cookie which was remembered.
It turned out that the culprit was the preceding cookie header having a malformed expires value. Safari's handling of broken cookie headers is evidently not as robust as that of other browsers.
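For reference, a well-formed Expires attribute uses an RFC 1123 date; in .NET the "R" format specifier produces exactly that (a sketch with hypothetical names):
// Emits e.g.: Set-Cookie: theme=bright; Path=/; Expires=Wed, 21 Oct 2015 07:28:00 GMT
Response.AppendHeader("Set-Cookie",
    "theme=bright; Path=/; Expires=" + DateTime.UtcNow.AddDays(30).ToString("R"));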
I ran into the same issue with Chrome. Chrome doesn't ignore the Set-Cookie header while you are redirecting, but you never know the order (set the cookie first or redirect first). Here is something I have tried:
I have a website which supports English and French. I implemented it (with PHP) this way:
localhost has a link to localhost/fr (which sets the cookie to French and redirects to localhost). It works. (set cookie first)
localhost/path1 has a link to localhost/fr?return=/path1 (which sets the cookie to French and redirects to localhost/path1). It doesn't work; the language didn't change. (redirect first)
localhost/path1 has a link to localhost/fr?return=www.google.com (which sets the cookie to French and redirects to Google). When I come back to my website, it's in French. (This means the set-cookie to French is not ignored, just executed after the redirect.)
Hope I have made myself clear; English is a foreign language to me.
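To make the pattern under discussion concrete, here is the set-cookie-then-redirect endpoint sketched in C# rather than the original PHP (the endpoint shape and names are assumptions); it is exactly this combination whose ordering Chrome seems to treat unpredictably:
// /fr endpoint: set the language cookie, then bounce back to the return URL.
Response.Cookies["lang"].Value = "fr";
Response.Cookies["lang"].Path = "/";
string returnUrl = Request.QueryString["return"];
// Validate returnUrl against a whitelist in real code to avoid open redirects.
Response.Redirect(string.IsNullOrEmpty(returnUrl) ? "/" : returnUrl);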
After a great deal of pain, I found out that Safari (15.3) actually does save my cookie; it just never shows up in Developer Tools > Storage > Cookies, but it works fine.
Here's the cookie I create and return in a Netlify function.
// 'cookie' is the npm "cookie" package; jwtToken comes from the auth response.
const cookie = require('cookie')

const secureCookie = cookie.serialize('jwtToken', JSON.stringify(jwtToken), {
  secure: process.env.CONTEXT !== 'dev',
  domain: process.env.CONTEXT === 'dev' ? 'localhost' : '.domain.com',
  httpOnly: true,
  sameSite: true,
  expires: new Date(Date.now() + (1000 * jwtToken.expires_in))
})
and the Netlify function returns
return {
  statusCode: 200,
  headers: {
    "Cache-Control": "no-cache",
  },
  multiValueHeaders: {
    "Set-Cookie": [secureCookie],
  },
  body: JSON.stringify(body),
}

ASP.NET MVC Page with ResponseCache on action DOES return new content instead of cache

I followed this tutorial (https://learn.microsoft.com/en-us/aspnet/core/performance/caching/response?view=aspnetcore-2.1) to implement response caching on my controller action.
In short, I added
services.AddResponseCaching(); and app.UseResponseCaching(); in the startup, and this attribute [ResponseCache(Duration = 30)] on my controller.
Then I added <h2>@DateTime.Now</h2> in my view, and what I expected... was the same DateTime.Now for 30 seconds.
But it doesn't; it just shows the new time on every reload (F5).
I made sure the devtools in Chrome do not have 'Disable cache' checked.
It happens both with and without the Chrome devtools open, on my local machine, and now also in a brand-new .NET Core MVC project.
One thing I noticed (with devtools open) is that the request carries this header: Cache-Control: max-age=0. Does this influence the behaviour?
I thought it must mean something, because it looks like the request is saying 'no cache', but that strikes me as weird: I didn't put that header in, and I wouldn't expect Chrome's default behaviour to be to ignore caches.
A response header like Cache-Control: max-age=0 effectively disables all caching: resources are considered expired as soon as they come off the wire, so they are always re-fetched. Note, though, that the max-age=0 you observed is a request header: Chrome sends Cache-Control: max-age=0 on a normal reload (F5), asking caches along the way to revalidate, so it is not something you configured.
Assuming you haven't accidentally disabled response caching manually in some way, the likeliest explanation is that you're doing something that prevents the Response Caching Middleware from ever caching. The documentation lists the following conditions, all of which must be satisfied before responses will be cached, regardless of what you do:
The request must result in a server response with a 200 (OK) status code.
The request method must be GET or HEAD.
Terminal middleware, such as Static File Middleware, must not process the response prior to the Response Caching Middleware.
The Authorization header must not be present.
Cache-Control header parameters must be valid, and the response must be marked public and not marked private.
The Pragma: no-cache header must not be present if the Cache-Control header isn't present, as the Cache-Control header overrides the Pragma header when present.
The Set-Cookie header must not be present.
Vary header parameters must be valid and not equal to *.
The Content-Length header value (if set) must match the size of the response body.
The IHttpSendFileFeature isn't used.
The response must not be stale as specified by the Expires header and the max-age and s-maxage cache directives.
Response buffering must be successful, and the size of the response must be smaller than the configured or default SizeLimit.
The response must be cacheable according to the RFC 7234 specifications. For example, the no-store directive must not exist in request or response header fields. See Section 3: Storing Responses in Caches of RFC 7234 for details.
However, in such situations the server should be sending Cache-Control: no-cache, not max-age=0. As a result, I'm leaning towards a misconfiguration somewhere: you may have set this max-age value and either forgotten or overlooked it.
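As a point of comparison, a minimal action that the middleware can cache (assuming none of the disqualifying conditions above apply) might look like this; ResponseCacheLocation.Any marks the response public, satisfying the cacheability condition:
// Emits Cache-Control: public,max-age=30 on the response.
[ResponseCache(Duration = 30, Location = ResponseCacheLocation.Any)]
public IActionResult Index()
{
    return View();
}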
This works for me in a 3.1 app to keep F5/Ctrl+F5 and the developer tools in Firefox and Chrome from bypassing the server cache for a full response.
In Startup, add this little middleware before UseResponseCaching().
// Middleware that fixes server caching on F5/Reload
app.Use(async (context, next) =>
{
    const string cc = "Cache-Control";
    if (context.Request.Headers.ContainsKey(cc))
    {
        context.Request.Headers.Remove(cc);
    }
    const string pragma = "Pragma";
    if (context.Request.Headers.ContainsKey(pragma))
    {
        context.Request.Headers.Remove(pragma);
    }
    await next();
});
app.UseResponseCaching();
Haven't noticed any problems...

Spring HTTP Strict Transport Security (HSTS) And FireFox

Spring Security (4.0.1.RELEASE) sets the HSTS header by default for the HTTPS protocol, and you can see Strict-Transport-Security: max-age=31536000 ; in the response headers (I used Firefox > Web Developer > Network).
But when I look at the Firefox console I see an error which says: The site specified an invalid Strict-Transport-Security header.
I also set the HSTS header manually in the Spring config as:
<headers>
<hsts />
</headers>
The same response header is generated and Firefox shows the error again.
According to https://developer.mozilla.org/docs/Security/HTTP_Strict_Transport_Security the header should be correct!
Any comments?
I found out it was a bug in Firefox, as mentioned in 'The site specified an invalid Strict-Transport-Security header - firebug'.
A self-signed certificate seems to trigger this issue.
Also refer to: https://jira.spring.io/browse/SEC-3021
Try this; I solved a similar problem this way:
Enter about:config into the Firefox address bar (confirm the info message in case it shows up), search for the preference named security.enterprise_roots.enabled, double-click it to change its value to true, and restart Firefox.

X-Frame-Options: ALLOW-FROM in firefox and chrome

I'm implementing a "pass-through" for X-Frame-Options to let a partner site wrap my employer's site in an iframe, as per this article: http://blogs.msdn.com/b/ieinternals/archive/2010/03/30/combating-clickjacking-with-x-frame-options.aspx
(splitting up URLs to post)
In a nutshell, our partner's page has an iframe with a URL on our domain.
For any page in our domain, they'll add a special URL argument like &mykey=topleveldomain.com, telling us what the page's top-level domain is.
Our filters pick up the partner TLD, if provided, from the URL and validate it against a whitelist. If it's on the list, we ship the X-Frame-Options header with the value ALLOW-FROM topleveldomain.com (and add a cookie for future clicks). If it's not on our whitelist, we ship SAMEORIGIN or DENY.
The problem is that sending ALLOW-FROM domain appears to be a no-op overall with the latest Firefox and Google Chrome. IE8, at least, seems to implement ALLOW-FROM correctly.
Check out this page: http://www.enhanceie.com/test/clickjack. Right after the fifth of the five boxes that "should be showing content" is a box that should NOT be showing content, but which is. In this case, the page in the iframe is sending X-Frame-Options: ALLOW-FROM http://www.debugtheweb.com, a decidedly different TLD than http://www.enhanceie.com. Yet the frame still displays content.
Any insight as to whether X-Frame-Options ALLOW-FROM is truly implemented across the relevant (desktop) browsers? Perhaps the syntax has changed?
Some links of interest:
Draft RFC on X-Frame-Options: https://datatracker.ietf.org/doc/html/draft-gondrom-frame-options-01
MDN article discussing the header as a two-option header (SAMEORIGIN or DENY): https://developer.mozilla.org/en-US/docs/Web/HTTP/X-Frame-Options
MSDN blog post that initiated the whole thing: http://blogs.msdn.com/b/ie/archive/2009/01/27/ie8-security-part-vii-clickjacking-defenses.aspx
MSDN blog post that talks about 3 values, adding ALLOW-FROM origin: http://blogs.msdn.com/b/ieinternals/archive/2010/03/30/combating-clickjacking-with-x-frame-options.aspx
ALLOW-FROM is not supported in Chrome or Safari. See the MDN article: https://developer.mozilla.org/en-US/docs/Web/HTTP/X-Frame-Options
You are already doing the work to build a custom header and send it with the correct data; can you not just omit the header when you detect a valid partner and add DENY to every other request? I don't see the benefit of ALLOW-FROM when you are already building the logic dynamically.
I posted this question and never saw the feedback (which came in several months later, it seems :).
As Kinlan mentioned, ALLOW-FROM is not supported in all browsers as an X-Frame-Options value.
The solution was to branch based on browser type. For IE, ship X-Frame-Options. For everyone else, ship X-Content-Security-Policy.
Hope this helps, and sorry for taking so long to close the loop!
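For illustration, the branching could look like the sketch below, in ASP.NET syntax (the User-Agent sniffing is deliberately crude, allowedOrigin is a hypothetical value from the whitelist check, and the modern Content-Security-Policy: frame-ancestors directive stands in for the prefixed X-Content-Security-Policy of that era):
string ua = Request.UserAgent ?? "";
if (ua.Contains("MSIE") || ua.Contains("Trident"))
{
    // IE understands ALLOW-FROM.
    Response.AppendHeader("X-Frame-Options", "ALLOW-FROM " + allowedOrigin);
}
else
{
    // Everyone else gets the CSP equivalent instead.
    Response.AppendHeader("Content-Security-Policy", "frame-ancestors " + allowedOrigin);
}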
For Chrome, instead of
response.AppendHeader("X-Frame-Options", "ALLOW-FROM " + host);
you need to add a Content-Security-Policy header:
string selfAuth = System.Web.HttpContext.Current.Request.Url.Authority;
// UrlReferrer can be null (e.g. on direct navigation); guard before dereferencing.
Uri referrer = System.Web.HttpContext.Current.Request.UrlReferrer;
string refAuth = referrer != null ? referrer.Authority : "";
response.AppendHeader("Content-Security-Policy", "default-src 'self' 'unsafe-inline' 'unsafe-eval' data: *.msecnd.net vortex.data.microsoft.com " + selfAuth + " " + refAuth);
to the HTTP response headers.
Note that this assumes you checked on the server whether or not refAuth is allowed.
Also note that you need to do browser detection in order to avoid adding the ALLOW-FROM header for Chrome (it logs an error to the console).
For details, see my answer here.

Asian characters in IE 8 get garbled in Server; is this due to HTTP header Content-Type?

One of the request parameters in an HTTP request made by the client contains Japanese characters. If I make this request in Firefox and look at the parameter as soon as it reaches the server by debugging in Eclipse, the characters look fine. If I make the same request using IE 8, the characters are garbled when I look at them at the same point in the server code (they display fine in both browsers, though). I have examined the POST requests made by both browsers, and they both send the same sequence of characters:
%2C%E3%81%9D%E3%81%AE%E4%BB%96
I am therefore thinking that this has to do with the encoding. If I look at the HTTP headers of the requests, I notice the following differences. In IE:
Content-Type: application/x-www-form-urlencoded
Accept: */*
In Firefox:
Content-Type: application/x-www-form-urlencoded; charset=UTF-8
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
I'm thinking that the IE 8 header doesn't state the UTF-8 encoding explicitly, even though it's specified in the meta tag of the HTML document. I am not sure whether this is the problem. I would appreciate any help; please let me know if you need more information.
Make sure the page that contains the form has UTF-8 as its charset. In IE's case, the best way to ensure this is by sending an HTTP header (Content-Type: text/html; charset=utf-8) and adding a meta http-equiv tag with the content type/charset to your HTML, for example <meta http-equiv="Content-Type" content="text/html; charset=utf-8"> (I've seen this actually matter, even when the appropriate header was sent).
Second, your form can also specify the content type:
<form enctype="application/x-www-form-urlencoded; charset=utf-8">
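Putting the pieces together, a minimal sketch in ASP.NET syntax purely for illustration (the original server appears to be Java, but the header and markup are the same on any stack):
// Declare the charset explicitly on the response so IE has nothing to guess.
Response.ContentType = "text/html; charset=utf-8";
// And mirror it in the markup, which can matter even when the header is sent:
// <meta http-equiv="Content-Type" content="text/html; charset=utf-8">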
