IE8 has a feature called InPrivate Filtering, which blocks third-party scripts that it has seen on more than 'n' different sites.
I'm listening to the most recent 'Security Now' podcast, which is raving about this feature as being great.
At the very same time I'm screaming NOOO! What the *#&$ -- because my site (as do many, many others) includes the following (jQuery + SWFObject). i.e. I'm using Google's CDN to host my jQuery.
<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js"></script>
<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/swfobject/2.1/swfobject.js"></script>
So what's the deal -- should I stop using jQuery and SWFObject from a CDN?
What's everybody else doing?
**Edit:** I couldn't find out whether they keep a list of 'trusted sites', but according to this from Microsoft, InPrivate Filtering is per-session. So at least someone has to actively enable it every session.
> InPrivate Filtering is off by default and must be enabled on a per-session basis. To use this feature, select InPrivate Filtering from the Safety menu. To access and manage different filtering options for Internet Explorer 8, select InPrivate Filtering Settings from the Safety menu. To end your InPrivate Browsing session, simply close the browser window.
If your site has content that people would not want cached (a bank site, porn, or something else "sensitive"), then I would not use an externally hosted file. The same goes if your site is totally broken when the file fails to load. But if your site is anything else, I wouldn't worry about it. I don't think this is a feature most people will use if they want to hide their tracks. And if they really want to, let them deal with the consequences.
This may seem silly but since IE8 is out, why don't you test your site with InPrivate turned on and see how it behaves? Also if you can report back your findings here that would be great :)
It looks like there's a significant chance this will be disabled with InPrivate enabled, but it ultimately depends on each user's browsing habits.
If a user visits 10 sites in regular mode that all link to files from the same third-party domain, links to files on that domain will be blocked when InPrivate is enabled.
So while you'd lose the benefit of the CDN, you should host files like this yourself if you need them to work reliably.
> InPrivate Blocking keeps a record of third-party items like the one above as you browse. When you choose to browse with InPrivate, IE automatically blocks sites that have "seen" you across more than ten sites. You can also manually choose items to block or allow, or obtain information about the third-party content directly from the site by clicking the "More information from this website" link.
>
> Note that Internet Explorer will only record data for InPrivate Blocking when you are in "regular" browsing mode, as no browsing history is retained while browsing InPrivate. An easy way to think of it is that your normal browsing determines which items to block when you browse InPrivate.
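The heuristic described above can be sketched as a rough model: during normal browsing, tally how many distinct first-party sites reference the same third-party URL, and block it in InPrivate mode once the count passes the threshold. This is not IE's actual implementation; all names here are invented for illustration.

```javascript
// Rough model of the InPrivate Filtering heuristic quoted above.
// Hypothetical names; not IE8's actual code.
var THRESHOLD = 10; // "more than ten sites"
var seenOn = {};    // third-party URL -> set of first-party hosts

// Called during "regular" browsing whenever a page pulls in a third-party item.
function recordThirdParty(firstPartyHost, thirdPartyUrl) {
  var hosts = seenOn[thirdPartyUrl] || (seenOn[thirdPartyUrl] = {});
  hosts[firstPartyHost] = true;
}

// Consulted while browsing InPrivate: block once the item has "seen" you
// on more than THRESHOLD distinct sites.
function isBlockedInPrivate(thirdPartyUrl) {
  var hosts = seenOn[thirdPartyUrl];
  return !!hosts && Object.keys(hosts).length > THRESHOLD;
}
```

Under this model, a CDN-hosted jQuery referenced by eleven of the sites you visit normally would get blocked on every site once you switch to InPrivate.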
Disclaimer: I haven't actually tested any of this as I don't have IE8, but the document you linked to is pretty clear about this.
You should host the JS files on your own site.
Here's another reason to host the JS file on your site.
I've always wondered, would it be possible to have a safe fallback in the event the CDN is down/unavailable?
Something like:
<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js"></script>
<script type="text/javascript">
    if (typeof jQuery == 'undefined') {
        document.write(unescape("%3Cscript src='local/jquery.min.js' type='text/javascript'%3E%3C/script%3E"));
    }
</script>
I think only a low percentage of people will be using IE8 with InPrivate Filtering turned on. Google's CDN serves files from a server near the user, which improves performance (paraphrasing their documentation). IE has caused me numerous problems in the past, and I dropped support for it.
Does it work from the domain name referenced by the site (e.g. ajax.googleapis.com), or does it resolve the name? If it just logs the domain, couldn't you wrap it in a CNAME, e.g. js.yourdomain.com -> ajax.googleapis.com?
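If the filter does key on the literal host that appears in the page's markup rather than the resolved address, a zone-file alias would be one way to test the idea. A hypothetical BIND-style entry (whether Google's servers would accept requests carrying a js.yourdomain.com Host header is a separate question, so treat this as a sketch only):

```
; Hypothetical zone-file entry aliasing a local name to Google's CDN host
js.yourdomain.com.    IN    CNAME    ajax.googleapis.com.
```

You would then reference src="http://js.yourdomain.com/..." in your markup so the third-party domain never appears in the page.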
I'm trying Adblock Plus, Ghostery, Disconnect, and Self-Destructing Cookies in Firefox.
Can you recommend some tips to stop tracking?
The "Do Not Track" feature in Firefox is useful for telling sites that you do not want to be tracked. A detailed guide on enabling this feature is available on Firefox's support page: http://mzl.la/WL6fUP.
Besides that, if you want an extra level of security, I would suggest using the "NoScript" browser extension (https://addons.mozilla.org/en-US/firefox/addon/noscript/). NoScript blocks JavaScript and other executable content on websites, effectively protecting you from tracking code.
And if you want real privacy, use a proxy or VPN. Another good idea is to use the Tor Browser (torproject.org/projects/torbrowser.html.en).
I am working on a site that has to use IE6 (!) but also requires intranet users to access apps that need Firefox. Is there a slick way of doing this when launching the app? (A link that pops up a window seems old-fashioned.) I'm not sure how much access I have to the root directory, e.g. for .htaccess.
Thoughts much appreciated.
You can use a <div> element inside conditional comments and display it for intranet users only based on an IP check.
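As a minimal sketch of the IP-check half, assuming the intranet sits on the 10.x private range plus a made-up internal suffix (both are placeholders to adjust for your network):

```javascript
// Hypothetical sketch: decide client-side whether to reveal the
// Firefox-only app links. The 10.x range and the .intranet.local
// suffix are assumptions; substitute your real network's values.
function isIntranetHost(hostname) {
  return /^10\./.test(hostname) || /\.intranet\.local$/.test(hostname);
}

// Usage on the page, paired with a conditional-comment <div> for IE6:
//   if (isIntranetHost(location.hostname)) {
//     document.getElementById('ff-apps').style.display = 'block';
//   }
```

A server-side check against the requesting IP is more reliable if you have that much access, since client-side hostname checks are easy to spoof or get wrong behind proxies.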
Is there a way to make sure Magento calls secure URLs during the checkout process? The problem is that the browser complains over HTTPS because not all resources are secure. In the source I have things like <script type="text/javascript" src="httP://something"> which triggers this warning. I'm afraid customers won't think the site is secure.
I know I can use <?php $this->getUrl('something/', array('_secure'=>true)) ?>. However, I don't want all my JavaScript resources to be secure all the time, just during checkout.
It seems Magento should handle this automatically when you configure it use frontend SSL, but apparently not.
So my question is what is the best way to handle this?
Thanks
The customer would be correct - the page content is not secure.
If you hardcode protocols in markup or incorrectly specify protocols in code, the system delivers what you ask. It's incumbent on the implementer to make sure the markup is correct.
That said, asset sources can use protocol-relative URLs in markup:
<script src="//cdn.com/some.js"></script>
Also, the secured/non-secured status can be determined at runtime and passed as an argument, so checkout pages get secure URLs and the rest of the site does not.
Magento serves everything it controls securely. The problems usually come from scripts that load content from other sites; Magento has no control over these, and it would literally have to rewrite the script to fix them.
It's your responsibility to see that those scripts are properly written, or else banish them to the pages where they belong so the browser doesn't complain about insecure content.
A case where protocol-relative URLs did not work: we took on Authorize.NET and chewed them out because their security badge caused Internet Explorer to pop up the insecure-content warning during cart operations, the very place you want the badge to show so the customer knows their credit card info is being handled properly. They fixed the problem within two weeks after we told them people were not ordering and were actually complaining about site security when we showed their badge in the cart.
It was caused because the script they supplied at the time, which we tried to modify to use a relative protocol, turned around and called yet another script that retrieved plain old port-80 insecure content.
Facebook can go like itself on another page; it doesn't belong in cart operations (another script menace we had to deal with).
I am seeing something weird in some logs and I was wondering if someone could suggest how this could happen. I am looking at error logs sent from the client-side of a web application.
The logged client-side data seems to indicate that a certain <script> block within the page has not run. The client's browser is running JavaScript, though; otherwise the log results would not be available. The logging component is included in an external JavaScript file.
I can think of two scenarios where this may have happened. Perhaps there was a script error in the ignored <script> block. I have tried to rule this scenario out.
Another scenario might have to do with security settings in the browser. Maybe certain script blocks were not allowed to run due to user preferences. I'm not sure any browser works like this, though. Browsers may have JavaScript disabled altogether, but I don't know of any browsers or tools that partially disable it.
Does anyone know of a way JavaScript might be partially disabled or filtered? Any special browsers or firewall rules... any possible explanation.
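One way to narrow this down, sketched under the assumption that you can edit both the page and the logging script (the flag and function names here are invented):

```javascript
// Hypothetical sketch: have the suspect inline <script> block set a marker,
// then report that marker from the external logging script.
//
// In the page, at the end of the suspect inline block:
//   <script>window.__inlineBlockRan = true;</script>
//
// In the external logging script, include the marker in each report:
function buildClientReport(win) {
  return {
    // false here means the inline block never executed (blocked or errored out)
    inlineBlockRan: win.__inlineBlockRan === true,
    href: win.location ? win.location.href : 'unknown'
  };
}
```

If reports come in with the flag false but no recorded script error, that points at the block being filtered out before execution rather than failing at runtime.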
Some company firewalls like to block "suspicious" JavaScript; see You cannot rely on JavaScript being available. Period.
On the user side, addons like Adblock Plus and NoScript can selectively block scripts in Firefox.
There are also proxies like Privoxy that can filter and change the source code of a page before it reaches the browser. And with Firefox addons, you can do virtually anything you want with the page you are visiting.
I'm trying to come up with ways to speed up my secure web site. Because there are a lot of CSS images that need to be loaded, the site can be slow, since secure resources are not cached to disk by the browser and must be retrieved more often than they really need to be.
One thing I was considering is perhaps moving style-based images and javascript libraries to a non-secure sub-domain so that the browser could cache these resources that don't pose a security risk (a gradient isn't exactly sensitive material).
I wanted to see what other people thought about doing something like this. Is this a feasible idea or should I go about optimizing my site in other ways like using CSS sprite-maps, etc. to reduce requests and bandwidth?
Browsers (especially IE) get jumpy about this and alert users that there's mixed content on the page. We tried it and had a couple of users call in to question the security of our site. I wouldn't recommend it. Having users lose their sense of security when using your site is not worth the added speed.
Do not mix content; there is nothing more annoying than having to go and click the Yes button on that dialog. I wish IE would let me always choose to show mixed-content sites. As Chris said, don't do it.
If you want to optimize your site, there are plenty of ways; if SSL is the only bottleneck left, buy a hardware accelerator. Hmm: if you load an image over HTTP, will it be served from cache when you later load it over HTTPS? Just a side question that I need to go find out.
Be aware that in IE 7 there are issues with mixing secure and non-secure items on the same page, so this may result in some users not being able to view all the content of your pages properly. Not that I endorse IE 7, but recently I had to look into this issue, and it's a pain to deal with.
This is not advisable at all. The reason browsers give you such trouble about insecure content on secure pages is that it exposes information about the current session and leaves you vulnerable to man-in-the-middle attacks. I'll grant there probably isn't much a third party could do to sniff sensitive info if the only insecure content is images, but CSS can reference JavaScript/VBScript via behavior files (IE). If your JavaScript is served insecurely, there isn't much you can do to prevent a rogue script from scraping your webpage at an inopportune time.
At best, you might be able to get away with iframing the secure content to keep the look and feel. As a consumer I really don't like it, but as a web developer I've had to do it before when there were no other pragmatic options. Frankly, though, that approach has just as many defects if not more: you're hoping nothing compromises the integrity of the insecure page, so that it keeps hosting the genuine secure content and not some substituted alternative.
It's just not a great idea from a security perspective.