X-Frame-Options: ALLOW-FROM in Firefox and Chrome

I'm implementing a "pass-through" for X-Frame-Options to let a partner site wrap my employer's site in an iframe, as per this article: http://blogs.msdn.com/b/ieinternals/archive/2010/03/30/combating-clickjacking-with-x-frame-options.aspx
In a nutshell, our partner's page has an iframe whose URL points at our domain.
For any page in our domain, they'll add a special URL argument like &mykey=topleveldomain.com, telling us the top-level domain of the framing page.
Our filters pick up the partner TLD, if provided, from the URL and validate it against a whitelist. If it's on the list, we ship the X-Frame-Options header with the value ALLOW-FROM topleveldomain.com (and add a cookie for future clicks). If it's not on our whitelist, we ship SAMEORIGIN or DENY.
The problem is it looks like sending ALLOW-FROM domain results in a no-op overall for the latest Firefox and Google Chrome. IE8, at least, seems to be correctly implementing ALLOW-FROM.
Check out this page: http://www.enhanceie.com/test/clickjack. Right after the 5th (of 5) boxes that "should be showing content" is a box that should NOT be showing content, but which is. In this case, the page in the iframe is sending X-Frame-Options: ALLOW-FROM http://www.debugtheweb.com, a decidedly different TLD from http://www.enhanceie.com. Yet the frame still displays content.
Any insight as to whether X-Frame-Options is truly implemented with ALLOW-FROM across relevant (desktop) browsers? Perhaps the syntax has changed?
Some links of interest:
Draft RFC on X-Frame-Options: https://datatracker.ietf.org/doc/html/draft-gondrom-frame-options-01
MDN article discussing the header as a two-option header (SAMEORIGIN or DENY): https://developer.mozilla.org/en-US/docs/Web/HTTP/X-Frame-Options
MSDN blog post that initiated the whole thing: http://blogs.msdn.com/b/ie/archive/2009/01/27/ie8-security-part-vii-clickjacking-defenses.aspx
MSDN blog post that talks about three values, adding ALLOW-FROM origin: http://blogs.msdn.com/b/ieinternals/archive/2010/03/30/combating-clickjacking-with-x-frame-options.aspx

ALLOW-FROM is not supported in Chrome or Safari. See MDN article: https://developer.mozilla.org/en-US/docs/Web/HTTP/X-Frame-Options
You are already doing the work to build a custom header and send it with the correct data; can you not just omit the header when you detect the request is from a valid partner and send DENY for every other request? I don't see the benefit of ALLOW-FROM when you are already building the logic dynamically.

I posted this question and never saw the feedback (which came in several months later, it seems :)).
As Kinlan mentioned, ALLOW-FROM is not supported in all browsers as an X-Frame-Options value.
The solution was to branch based on browser type. For IE, ship X-Frame-Options. For everyone else, ship X-Content-Security-Policy.
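To illustrate that branching, here is a minimal Express-style sketch. The mykey query parameter comes from the question above; the whitelist, the user-agent check, and the header values are assumptions for illustration, not the original production code. (The answer of the time used the prefixed X-Content-Security-Policy header; modern browsers use Content-Security-Policy with the frame-ancestors directive, which is what the sketch sends.)

const express = require('express');               // assumed Express app, for illustration only
const app = express();
const ALLOWED_PARTNERS = ['topleveldomain.com'];   // assumed partner whitelist

app.use(function (req, res, next) {
  const partner = req.query.mykey;                 // partner TLD passed on the URL, as in the question
  const isIE = /MSIE|Trident/.test(req.get('User-Agent') || '');

  if (partner && ALLOWED_PARTNERS.includes(partner)) {
    if (isIE) {
      res.set('X-Frame-Options', 'ALLOW-FROM https://' + partner);   // legacy IE path
    } else {
      res.set('Content-Security-Policy', "frame-ancestors 'self' https://" + partner);
    }
  } else {
    res.set('X-Frame-Options', 'SAMEORIGIN');      // unknown framer: lock it down
  }
  next();
});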
Hope this helps, and sorry for taking so long to close the loop!

For Chrome, instead of
response.AppendHeader("X-Frame-Options", "ALLOW-FROM " + host);
you need to add Content-Security-Policy
string selfAuth = System.Web.HttpContext.Current.Request.Url.Authority;
string refAuth = System.Web.HttpContext.Current.Request.UrlReferrer.Authority;
response.AppendHeader("Content-Security-Policy", "default-src 'self' 'unsafe-inline' 'unsafe-eval' data: *.msecnd.net vortex.data.microsoft.com " + selfAuth + " " + refAuth);
to the HTTP-response-headers.
Note that this assumes you checked on the server whether or not refAuth is allowed.
And also, note that you need to do browser detection to avoid sending the ALLOW-FROM header to Chrome (it logs an error to the console).
For details, see my answer here.


Is X-Frame-Options ALLOW-FROM really deprecated?

I am not sure about the exact status of this HTTP header. Some sources - for instance Mozilla or Caniuse - clearly indicate that this header value has been removed since Firefox 70 and replaced by Content-Security-Policy: frame-ancestors.
Despite that, I can see that X-Frame-Options: ALLOW-FROM myServerURI still seems to work: using Firefox 75, setting this header server-side or not still has an impact on an iframe: the inner content is allowed or blocked depending on whether the header is present.
Examining the server's response headers in Firefox (F12 / Web Developer Tools, Network, Headers) clearly shows the presence of this header and its impact on the result. In this situation there is also a Content-Security-Policy header present, but without the frame-ancestors directive.
Something must be wrong with your test.
When I try using it in Firefox 75, I get an error in the console:
Invalid X-Frame-Options: “ALLOW-FROM http://www.example.com/” header from “http://localhost:7007/” loaded into “http://localhost:8080/”.
… and the content is displayed in the frame even though the iframe is hosted on http://localhost:8080/ and not http://www.example.com/
The ALLOW-FROM value for the X-Frame-Options header is now obsolete and is not supported by modern browsers.
Refer to this link for the valid values: https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/X-Frame-Options
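For completeness, the modern replacement is the frame-ancestors directive of Content-Security-Policy; the ALLOW-FROM example above would translate roughly to a response header like this (the partner origin is purely illustrative):

Content-Security-Policy: frame-ancestors 'self' https://partner.example.com

Unlike ALLOW-FROM, frame-ancestors accepts a list of origins and is enforced by current Firefox, Chrome, and Safari.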

Session Variables getting reset in Mac [duplicate]

I found a strange cookie problem in Safari. If you surf to http://2much.ch you can enter with FF/IE and surf inside the site.
But if you use Safari, you can enter only once; you can't surf inside the site. I found that Safari doesn't set the cookie that is set on entry, but FF/IE do.
What is wrong here?
It looks like you hit a Safari bug here; you are redirecting any visiting browser to /entry while setting the cookie at the same time, and Safari is ignoring the Set-Cookie header when encountering the 302 HTTP status:
$ curl -so /dev/null -D - http://4much.schnickschnack.info/
HTTP/1.1 302 Moved Temporarily
Server: nginx/0.7.61
Date: Sun, 19 Jul 2009 12:20:49 GMT
Content-Type: text/html;charset=utf-8
Connection: keep-alive
Content-Length: 14260
Content-Language: de
Expires: Sat, 1 Jan 2000 00:00:00 GMT
Location: http://4much.schnickschnack.info/entry
Set-Cookie: colorstyle="bright"; Path=/; Expires=1248092449.12
Set-Cookie: _ZopeId="73230900A39w5NG7q4g"; Path=/
Technically, this would be a bug in Apple's Foundation classes; I've found a WebKit bug report that states this is the case.
I suppose the workaround is to set the cookie not in index_html but in entry instead.
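A minimal sketch of that workaround in Express-style JavaScript (route names taken from the description above; renderEntryPage is a hypothetical stand-in for whatever renders the page): the Set-Cookie header is moved off the 302 and onto the 200 response at /entry.

const express = require('express');
const app = express();

app.get('/', function (req, res) {
  res.redirect(302, '/entry');                        // no Set-Cookie on the 302 itself
});

app.get('/entry', function (req, res) {
  res.cookie('colorstyle', 'bright', { path: '/' });  // cookie set on a normal 200 response
  res.send(renderEntryPage());                        // hypothetical page renderer
});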
In the intervening years since I first answered this question, this issue now appears to be solved; at least it was for Safari 6 when someone tested all major browsers for Set-Cookie support on 302 redirects in 2012.
A month ago, I ran into this issue. At first I thought it was a corrupted cookie jar as I could clean out the cookies and go.
However, it popped up again. This time I spent an hour going through it, watching what was sent, reviewing what Safari sent back, and I found the problem.
In this case, I had an array of cookie values being sent to the browser after login prior to the redirect. The values looked something like 'user id', 'user full name', 'some other id', etc.
( yes, the id's are encrypted so no worries there )
My user full name was actually in a <lastname>, <firstname> format.
When Safari posted the cookie back to the server, everything after the comma after the last name was dropped. It was only posting back values up to that point.
When I removed the comma, the rest of the values started working just fine.
So it appears that if you send a cookie value that contains a comma, Safari doesn't properly escape it in its internal storage. Which leads me to think that if they aren't properly escaping commas, there are probably some security issues with Safari's cookie-handling code.
Incidentally, this was tested on Win 7 x64 with Safari 4.0.5. I also put up a web page at http://cookietest.livelyconsulting.com/ which showed this exact problem. (I have since removed that test site.)
IE, FF, and Chrome all correctly set the cookie; Safari does not.
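One way to sidestep this class of problem is to encode cookie values so they never contain raw commas or other separators. A hedged sketch (the helper names are illustrative; req.cookies assumes Express with the cookie-parser middleware):

// URL-encode cookie values so commas survive the round-trip.
function setUserCookie(res, fullName) {
  res.cookie('fullName', encodeURIComponent(fullName), { path: '/' });
}

function readUserCookie(req) {
  const raw = req.cookies.fullName;
  return raw ? decodeURIComponent(raw) : null;
}

// e.g. setUserCookie(res, 'Doe, John') stores the value as "Doe%2C%20John".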
Looks like this is no longer an issue. See http://blog.dubbelboer.com/2012/11/25/302-cookie.html
We've run into a very similar issue where Safari (v. 7.0.6) would ignore a cookie. The cookie header looked perfectly fine, almost identical to another cookie which was remembered.
It turned out that the culprit was the previous cookie header having a malformed expires value. Safari's handling of broken cookie headers is evidently not as robust as that of the other browsers.
I ran into the same issue with Chrome. Chrome doesn't ignore the Set-Cookie header while you are redirecting, but you never know the order (set cookie first or redirect first). Here is something I have tried:
I have a website which supports English and French. I implemented it (with PHP) this way:
localhost has a link to localhost/fr (which sets the cookie to French and redirects to localhost). It works. (set cookie first)
localhost/path1 has a link to localhost/fr?return=/path1 (which sets the cookie to French and redirects to localhost/path1). It doesn't work. (redirect first, the language didn't change)
localhost/path1 has a link to localhost/fr?return=www.google.com (which sets the cookie to French and redirects to Google). When I came back to my website again, it was in French. (which means the set-cookie to French is not ignored, only applied after the redirect)
Hope I made myself clear; English is a foreign language to me.
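A sketch of that pattern in Express-style JavaScript rather than the original PHP (route and parameter names taken from the description above): the cookie is attached to the 302 response itself, and the order in which the browser applies it is exactly what the experiments above probe.

const express = require('express');
const app = express();

app.get('/fr', function (req, res) {
  res.cookie('lang', 'fr', { path: '/' });        // Set-Cookie travels with the 302
  res.redirect(302, req.query.return || '/');     // back to wherever the user came from
});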
After a great deal of pain, I found out that Safari (15.3) actually does save and send my cookie; it just never shows up in Developer Tools under Storage -> Cookies, but it works fine.
Here's the cookie I create and return in a Netlify function.
const cookie = require('cookie')   // the "cookie" npm package
const secureCookie = cookie.serialize('jwtToken', JSON.stringify(jwtToken), {
  secure: process.env.CONTEXT !== 'dev',
  domain: process.env.CONTEXT === 'dev' ? 'localhost' : '.domain.com',
  httpOnly: true,
  sameSite: true,
  expires: new Date(Date.now() + (1000 * jwtToken.expires_in))
})
and the Netlify function returns
return {
  statusCode: 200,
  headers: {
    "Cache-Control": "no-cache",
  },
  multiValueHeaders: {
    "Set-Cookie": [secureCookie],
  },
  body: JSON.stringify(body),
}

Is it possible to do cache busting with HTTP/2?

Has anybody tried?
Here is the use case. In a first request-response cycle, this would happen:
Request 1:
GET / HTTP/1.1
...
Response 1
HTTP/1.0 200 OK
Etag: version1
Cache-control: max-age=1
... angly html here
....<link href="mycss.css" >
...
Request 2:
GET /mycss.css HTTP/1.1
...
Response 2 (probably pushed):
Etag: version1
Cache-control: max-age=<duration-of-the-universe>
...
... brackety css ...
...
and then, when the browser goes to the same page a second time, it will of course fetch the "/" resource again because of the very short max-age:
GET / HTTP/1.1
...
If-None-Match: version1
But it won't fetch mycss.css if it has it in cache. However, the server can use the validator present in the If-None-Match header of the request for "/" to get an idea of the age of the client's cache, and may conclude that the browser's version of mycss.css is too old. In that case, before even answering the previous request, the server can "promise" (push) a new version of mycss.css.
By the specs, should the browser accept and use it?
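To make the scenario concrete, here is a hedged sketch using Node's built-in http2 module; the key/cert paths, the ETag values, and the staleness heuristic are assumptions for illustration only, and, as discussed below, browsers may still refuse to use the pushed resource.

const http2 = require('http2');
const fs = require('fs');

const server = http2.createSecureServer({
  key: fs.readFileSync('key.pem'),    // assumed certificate paths
  cert: fs.readFileSync('cert.pem'),
});

server.on('stream', function (stream, headers) {
  if (headers[':path'] !== '/') return;

  // The validator sent for "/" hints at how old the client's copy of the page is.
  const pageEtag = headers['if-none-match'];
  const cssLooksStale = pageEtag === '"version1"';   // illustrative heuristic

  if (cssLooksStale && stream.pushAllowed) {
    stream.pushStream({ ':path': '/mycss.css' }, function (err, pushStream) {
      if (err) return;
      pushStream.respond({
        ':status': 200,
        'content-type': 'text/css',
        'etag': '"version2"',
        'cache-control': 'max-age=31536000',
      });
      pushStream.end('/* brackety css, version 2 */');
    });
  }

  stream.respond({ ':status': 200, 'content-type': 'text/html', 'etag': '"version2"' });
  stream.end('<link href="mycss.css" rel="stylesheet">');
});

server.listen(8443);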
Overview:
I still don't know what the answer to my question is from a purely theoretical side, but at least today it doesn't seem possible in practice to do cache busting this way :-( with either Google Chrome or Firefox. Both reject or ignore the pushed stream if they believe that the resource they have in cache is fresh.
I also got this from somebody who prefers to remain anonymous:
Browsers will typically put resources received through push in a
"demilitarized zone" and only once the client asks for that resource
it will be moved into the actual cache. So just pushing random
things will not make them end up in the browser cache even if the
browser accepts them at the push moment.
Update
As of early 2016, it is still not possible, mainly due to a lack of consensus on how this should be handled, and whether it should be allowed at all.
As this page shows, even with HTTP/2, the way to solve the stale assets issue is to create a unique URL for each asset version, and then ensure that the user receives that new URL when they re-visit the page.
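A hedged sketch of that conventional approach (file names and the hashing helper are illustrative): embed a content hash in each asset URL, so a new version automatically gets a new, independently cacheable URL that can be served with a very long max-age.

const crypto = require('crypto');
const fs = require('fs');

// Hypothetical build step: derive a fingerprinted file name from the asset's content.
function fingerprint(path) {
  const hash = crypto.createHash('sha256')
    .update(fs.readFileSync(path))
    .digest('hex')
    .slice(0, 8);
  return path.replace(/\.css$/, '.' + hash + '.css');   // e.g. mycss.a1b2c3d4.css
}

// The page then references the fingerprinted URL; the file itself can be served with
// Cache-Control: max-age=31536000, immutable, because any change produces a new URL.
const cssUrl = fingerprint('mycss.css');
const html = '<link href="' + cssUrl + '" rel="stylesheet">';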

Force chrome to cache static content like images

I want to improve my browsing experience by caching static content like images (jpg, png, gif) and fonts. It always happens that when I'm viewing a webpage with a lot of images and then refresh with F5, the same content is downloaded again.
I know that this is because the response headers may contain no-cache or max-age=0, and it sometimes even happens when there is no cache or max-age directive in the response at all.
But for images or fonts that never change, a max-age of 0 is useless. So I wanted to know if there is a way to override the response headers and set a max-age of 1 year. Maybe with a Chrome extension?
Yes, you can do this with a Chrome extension; the Change HTTP Headers extension already does something similar.
For your specific case, you just need to do this:
Add an event listener which is called whenever headers are received.
Read the details of the headers.
Check if the response content type is an image.
Add or update the desired header.
To accomplish this you can use the webRequest onHeadersReceived event.
Documentation of onHeadersReceived
onHeadersReceived (optionally synchronous): Fires each time that an HTTP(S) response header is received. Due to redirects and authentication requests this can happen multiple times per request. This event is intended to allow extensions to add, modify, and delete response headers, such as incoming Set-Cookie headers.
Your code will look something like this
chrome.webRequest.onHeadersReceived.addListener(function(details) {
  var found = false;
  for (var i = 0; i < details.responseHeaders.length; i++) {
    // If the response already has a Cache-Control header, overwrite it
    if (details.responseHeaders[i].name.toLowerCase() === 'cache-control') {
      details.responseHeaders[i].value = 'max-age=31536000'; // one year
      found = true;
    }
  }
  if (!found) {
    details.responseHeaders.push({name: 'Cache-Control', value: 'max-age=31536000'});
  }
  return {responseHeaders: details.responseHeaders};
},
{urls: ['https://*/*'], types: ['image']},
['blocking', 'responseHeaders']);
PS: I have not run and tested the code so please excuse the typos.
EDIT (after @RobW's comment)
No, this is not possible as of now (22 March 2014). Adding Cache-Control has no influence on the caching behavior. Check out this answer for more details.
Although this is an old question, I stumbled upon it recently. Later I found the Chrome extension "Speed-Up Browsing", which seems to do exactly what the OP asked for.
For me it worked.
https://chrome.google.com/webstore/detail/speed-up-browsing/hkhnldpdljhiooeallkmlajnogjghdfb?hl=en
You could use Fiddler https://www.telerik.com/fiddler (free) as a proxy, adding caching for selected URLs or patterns. I did that.

Serving content depending on http accept header - caching problems?

I'm developing an application which is supposed to serve different content for "normal" browser requests and for AJAX requests to the same URL
(in fact, it encapsulates the response HTML in a JSON object if the request is AJAX).
For this purpose, I'm detecting an AJAX request on the server side and processing the response appropriately; see the pseudocode below:
function process_response(request, response)
{
    if request.is_ajax
    {
        response.headers['Content-Type'] = 'application/json';
        response.headers['Cache-Control'] = 'no-cache';
        response.content = JSON( some_data... )
    }
}
The problem is that when the first AJAX request to the currently viewed URL is made, strange things happen in Google Chrome. If, right after the response comes in and is processed via JavaScript, the user clicks some link (a static one, which leads to another page) and then clicks the back button in the browser, he sees the returned JSON code instead of the rendered website (from the server logs I can say that no request is made). It seems to me that Chrome stores the latest response for the specific URL and doesn't take into account that it has a different content type, etc.
Is that a bug in Chrome, or am I misusing the HTTP protocol?
--- update 12 11 2012, 12:38 UTC
Following PatrikAkerstrand's answer, I've found the following Chrome bug: http://code.google.com/p/chromium/issues/detail?id=94369
Any ideas how to avoid this behaviour?
You should also include a Vary-header:
response.headers['Vary'] = 'Content-Type'
Vary is a standard way to control caching context in content negotiation. Unfortunately it also has buggy implementations in some browsers; see Browser cache vary broken.
I would suggest using unique URLs.
Depending on your framework's capabilities, you can redirect (302) the browser to URL + .html to force the response format and make the cache key unique within the browser session. Then for AJAX requests you can still keep the suffix-less URL. Alternatively, you may suffix the AJAX URL with .json instead.
Other options are: prefixing AJAX requests with /api, or adding a cache-busting query param such as ?rand=1234; see the sketch below.
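For instance, a minimal client-side sketch of the query-parameter variant (the URL and parameter names are illustrative); the extra parameter keeps the AJAX response from sharing a cache entry with the full-page response:

// Fetch the AJAX variant of the page under a distinct cache key.
const url = '/some/page?ajax=1&rand=' + Date.now();
fetch(url, { headers: { 'X-Requested-With': 'XMLHttpRequest' } })
  .then(function (response) { return response.json(); })
  .then(function (data) { console.log(data); });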
Setting Cache-Control to no-store did the trick in my case, while no-cache didn't. This may have unwanted side effects, though.
no-store: The response may not be stored in any cache. Although other directives may be set, this alone is the only directive you need in preventing cached responses on modern browsers.
Source: Mozilla Developer Network - HTTP Cache-Control
