How could JavaScript be partially disabled?

I am seeing something weird in some logs and I was wondering if someone could suggest how this could happen. I am looking at error logs sent from the client-side of a web application.
The logging includes information about client-side state that would seem to indicate that a certain <script> block within the page has not run. The client's browser is running JavaScript, though; otherwise the log results would not be available. The logging component is included in an external JavaScript file.
I can think of two scenarios where this may have happened. Perhaps there was a script error in the ignored <script> block. I have tried to rule this scenario out.
Another scenario might have to do with some security settings on the browser. Maybe certain script blocks were not allowed to run due to user preferences. I'm not sure if browsers work like this, though. Browsers may have JavaScript disabled altogether, but I don't know of any browsers or tools that would partially disable JavaScript.
Does anyone know of a way JavaScript might be partially disabled or filtered? Any special browsers or firewall rules, ... any possible explanation is welcome.
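For illustration, here is one way this kind of detection can be wired up; the flag and helper names below are made up, not the actual logging component:

<script src="/js/logger.js"></script> <!-- external logging component (path is illustrative) -->
<script>
  // the inline block that sometimes appears not to run; it leaves a marker behind
  window.featureBlockRan = true; // hypothetical flag
</script>
<script>
  // the external logger can then include that marker with any error it reports
  window.addEventListener('error', function () {
    reportToServer({ featureBlockRan: !!window.featureBlockRan }); // hypothetical helper from logger.js
  });
</script>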

Some company firewalls like to block "suspicious" JavaScript; see "You cannot rely on JavaScript being available. Period."
On the user side, addons like Adblock Plus and NoScript can selectively block scripts in Firefox.

There are also proxies like Privoxy that can filter and change the source code of the page you requested before it arrives at the browser. And with Firefox add-ons, you can do virtually anything you want with the page you are visiting.

Related

Why do some websites that have SSL not work but still load if using the HTTPS version? How can I avoid it if I make a website?

Sometimes, if I go to a website, such as this one, through an HTTP link, it looks fine and works as apparently intended.
However, if I change the address to HTTPS, the page loads without any browser warnings but looks really weird and seems broken: spacing is messed up, the colors are wrong, fonts don't load, etc.
All of this same stuff happens in both Firefox and Chrome on my computer.
What causes this to happen? How can I avoid this if I make an HTTPS-secured website?
For me, the browser tells you what is wrong in a warning message: parts of the page are not secure (such as images).
What does this mean? The developer of the site has linked some content such as CSS, JS, or images using HTTPS links and some using HTTP links.
Why is this a problem? Since some content is being retrieved over an insecure connection (HTTP), it would be possible for malicious content to be injected into the page, which could then grab information that was transmitted over HTTPS. Browsers have had this warning for a very long time, but in the interest of security they now err on the more secure side.
What will fix this? There is nothing we can do as consumers of the website. The owner of the site should fix the problem. If you are really interested in viewing the site and not concerned about security, you can temporarily disable this protection from the URL bar warning message in Firefox.
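To illustrate with made-up URLs: a page served over https:// that still pulls a sub-resource over plain http:// is exactly this mixed-content case, and the owner's fix is usually just to reference the resource securely:

<!-- on an https:// page, this stylesheet reference is insecure and gets blocked or warned about -->
<link rel="stylesheet" href="http://cdn.example.com/theme.css">

<!-- fixed: request the same resource over https:// -->
<link rel="stylesheet" href="https://cdn.example.com/theme.css">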
As @micker explained, the page looks weird because not all of the resources are loading: their connections could not be made securely, so the browser denies loading them because they are not referenced over a secure connection.
To elaborate further, in case it's still not quite clear: a more accurate and technical explanation is that, for styling a webpage, Cascading Style Sheets (CSS) is the language used to describe the presentation of the document, telling the browser how elements should be rendered on the screen. If you think of these stylesheets as building blocks that you combine to define different areas of a webpage and build one masterpiece, then you can see why having multiple building blocks for a site sounds pretty normal.
To save even more time, rather than try to figure out the code for each and every stylesheet or "building block" that I want to include, I can borrow someone else's stylesheet that has the properties I want and link to it as a resource instead of making or hosting the resource myself. Now, if we pretend that there's a stylesheet for every font size change, font color variance, or font placement, then we're going to need a building block to define each of those.
Now, if I am on a secure connection, the browser ensures that connection stays secure by only connecting to other sites, or resources, that are also secure. If any of the sites hosting the CSS building blocks I want to use are not secure, i.e. not using SSL (indicated by the missing "s" in "http://" in their address), then the browser will prevent those connections from happening and thus prevent the resources from loading, because it considers them a risk to your current secure connection.
In your example's particular case, things looked fine when you entered the http:// address because the site you were visiting doesn't force visitors to use SSL and lets you connect over the less secure HTTP protocol. Your browser isn't connecting securely in the first place, so it doesn't take the extra step of blocking anything: your connection is already exposed, so it freely connects wherever it needs to and loads any resource, regardless of whether it can be transferred securely.
So when you go to the "https://" version of the site, there are no browser warnings because you're connecting to that site over a secure connection. Unfortunately, that also means that if the designer of the page linked resources from somewhere that doesn't offer SSL, or simply never updated the links to "https://", those resources are considered insecure. Since you're on a secure connection, the browser blocks those connections, which means those resources can't load, and the page comes up incomplete, missing some of its building blocks. Those were the blocks that tell your screen to move all the text on the right into a panel, give it a blue font color, and switch to a different font face. Because those definitions of look and appearance didn't make it through, those sections fall back to whatever stylesheet is present, which normally doesn't match what was intended to be there.

Does Firefox allow extensions to bypass a normal web page's CSP?

I have a web page with a CSP like this:
<meta http-equiv="Content-Security-Policy" content="script-src 'self' https://d2wy8f7a9ursnm.cloudfront.net https://cdn.polyfill.io https://browser-update.org https://static.zdassets.com https://ekr.zdassets.com https://mysite.zendesk.com wss://mysite.zendesk.com https://*.zopim.com https://*.googleapis.com 'unsafe-inline' 'unsafe-eval'">
For privacy reasons, in this post, I replaced the name of my company with mysite. Also note that the use of unsafe-eval is because I have some legacy code that requires it for templating.
My site includes Bugsnag error monitoring, and I picked up a particular error for a user where the breadcrumbs show XMLHttpRequest calls to suspicious domains that sound like adware and/or malware. There is also a console log string "swbtest loaded".
Although it's possible that the user disabled the Firefox setting security.csp.enable, I find it highly unlikely. This user is a customer I emailed with, and she doesn't seem like the type to do that.
My questions are:
(1) Does this look like a Firefox extension/plugin?
(2) If so, how is it bypassing my CSP? Or does unsafe-eval give extensions a way in?
(3) Would it help to add a connect-src rule for the CSP?
Thanks.
Extensions don't care about the page CSP at all. They can run code alongside your code (where they are not bound by the page CSP), or inject arbitrary code in the page JavaScript context.
Further, extensions have enough power to override the page CSP on the fly (e.g. by rewriting response headers), but they usually don't need it.
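As a rough sketch of the first point (all names and domains below are made up): a content script declared in an extension's manifest is loaded by the browser itself, so it runs on your page no matter what your script-src says.

// manifest.json (excerpt)
//   "content_scripts": [{ "matches": ["<all_urls>"], "js": ["content.js"] }]

// content.js - runs alongside the page's own code
console.log("swbtest loaded"); // messages like this land in the page's console
var xhr = new XMLHttpRequest(); // requests made from the content script are not bound by the page's CSP
xhr.open("GET", "https://ads.example.invalid/track"); // made-up domain
xhr.send();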
There's nothing that you, the website author, can do to prevent extensions interfering with your page.
Unfortunately, that means "noise" in reports you get.
Of note, "swbtest" seems to be related to Selenium browser automation / test suite.

Chrome caching like a mad browser

I've got a web service that, like most others, uses JS and CSS files. I use the old trick of appending a version number to the JS and CSS files, like ?v=123, which gets changed every time we update the service in production.
Now, this works fine on all browsers except Chrome. Chrome seems to prefer its cached version over getting the new one and therefore seems to ignore the appended variable. In some cases, forcing a cache refresh (Cmd+R / Ctrl+F5) wasn't enough, so I had to go into the options and clear the cache for it to load the new content.
Has anyone experienced this issue with Chrome? And if so, what was the resolution to the problem?
Chrome should certainly treat requests with varying query strings as different requests; a cached result for style.css?v=123 should never be used for style.css?v=124. If you're seeing different behavior, please file a bug at http://new.crbug.com/ and post the bug ID here.
That said, I'd first check to see whether the page was cached longer than you expected. If a new version of the page itself wasn't downloaded, then it would still be requesting ?v=123 as the HTML wouldn't have changed. If you're sending long-lived cache headers with the page, it's certainly possible that Chrome is caching it more aggressively than you expected. If that's the behavior you're seeing, please star http://crbug.com/8742 for updates.
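If it turns out the HTML itself is what's cached, the usual remedy is to serve the page with headers that force revalidation, so browsers pick up the new ?v= references. A minimal sketch in plain Node (paths, port, and version number are illustrative):

var http = require('http');

http.createServer(function (req, res) {
  // the HTML that references the versioned assets should always be revalidated...
  res.setHeader('Content-Type', 'text/html');
  res.setHeader('Cache-Control', 'no-cache');
  // ...while style.css?v=124 itself can keep long-lived caching, since its URL changes on each deploy
  res.end('<link rel="stylesheet" href="/style.css?v=124">');
}).listen(8080);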
I have also had the same experience.
You can use Ctrl + Shift + R for a cache-free reload in both Chrome and Firefox.
I have had this experience as well.
I run a membership site which displays content such as "You must be logged in as a Gold member in order to see this content" if visitors are not logged in or are trying to view content not allowed by their membership level. But even when the user is logged in, they would still see "You need to log in", due to Google Chrome's aggressive caching. In Firefox, however, it works fine as I test logging in and out of all 5 membership levels, each displaying the proper content.
While Chrome's caching problem can be solved by clearing the cache every time the user logs in and out, it would be really annoying to take that approach.
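One common way around this (a sketch assuming a Node-style server, not the poster's actual stack) is to mark member-only pages as uncacheable, so the browser never replays a stale logged-out response:

var http = require('http');

http.createServer(function (req, res) {
  // per-user pages should never be stored or reused by any cache
  res.setHeader('Content-Type', 'text/html');
  res.setHeader('Cache-Control', 'private, no-store');
  res.setHeader('Vary', 'Cookie'); // hint that the response depends on who is logged in
  res.end('<p>Members-only content for the current user</p>'); // stand-in for the real page
}).listen(8080);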

A plugin for manipulating JavaScript/HTML code

I need a tool that can parse and insert code into the JavaScript/HTML of a page before the browser starts to interpret it. I've been thinking of using a proxy to do it, but now I'd like to know whether I could implement such functionality in a Firefox plug-in.
Sounds like Greasemonkey to me.
What does Greasemonkey do?
Greasemonkey lets you add JavaScript code (called "user scripts") to any web page, which will run when its HTML code has loaded. Compared to writing extensions, user scripts often offer a light-weight alternative, requiring no browser restart on user script installation nor removal, and work with the common DOM API familiar to any web developer (with somewhat elevated privileges for doing cross domain XMLHttpRequest requests and storing small portions of private data). User scripts work more or less like bookmarklets automatically invoked for any URLs matching one or more glob patterns.
http://wiki.greasespot.net/FAQ
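For a feel of what a user script looks like, here is a tiny illustrative example (site, selector, and styling are made up):

// ==UserScript==
// @name        Highlight prices
// @namespace   http://example.com/userscripts
// @include     http://example.com/*
// @grant       none
// ==/UserScript==

// runs once the page's HTML has loaded, using the ordinary DOM API
var prices = document.querySelectorAll('.price'); // made-up selector
for (var i = 0; i < prices.length; i++) {
  prices[i].style.backgroundColor = 'yellow';
}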
I'm pretty sure something like TamperData might work. Or maybe Fiddler, but that's a separate application with additional hooks that enable it to work with Firefox.
TamperData: https://addons.mozilla.org/en-US/firefox/addon/966/
Fiddler: http://www.fiddler2.com/fiddler2/
Of course both work on a network level, so they may be a bit more arcane than what you'd need.

Is IE8 going to break my CDN hosted jQuery?

IE8 has a feature called InPrivate Filtering, which will block scripts that it has seen embedded across more than 'n' different sites.
I'm listening to the most recent 'Security Now' podcast which is raving about this feature as being great.
At the very same time I'm screaming NOOO! What the *#&$ -- because my site (as do many, many others) includes the following (jQuery + SWFObject), i.e. I'm using Google's CDN to host my jQuery.
<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js"></script>
<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/swfobject/2.1/swfobject.js"></script>
So what's the deal - should I stop using jQuery and SWFObject from a CDN?
What's everybody else doing?
Edit: I couldn't find out whether they keep a list of 'trusted sites' or not, but according to this from Microsoft, InPrivate Filtering is per-session. So at least someone has to actively enable it every session.
InPrivate Filtering is off by default and must be enabled on a per-session basis. To use this feature, select InPrivate Filtering from the Safety menu. To access and manage different filtering options for Internet Explorer 8, select InPrivate Filtering Settings from the Safety menu. To end your InPrivate Browsing session, simply close the browser window.
If your site has content that people would not want cached (a bank site, porn, or something else "sensitive"), then I would not use an externally hosted file. The same goes if your site is just totally broken when the file does not load. But if your site is anything else, I wouldn't worry about it. I don't think this is a feature most people will use if they want to hide their tracks. And if they really want to, let them deal with the consequences.
This may seem silly but since IE8 is out, why don't you test your site with InPrivate turned on and see how it behaves? Also if you can report back your findings here that would be great :)
It looks like there's a significant chance these scripts will be blocked when InPrivate Filtering is enabled, but it ultimately depends on each user's browsing habits.
If a user visits 10 sites in regular mode that all link to files from the same third-party domain, links to files on that domain will be blocked when InPrivate is enabled.
So while you won't be able to take advantage of the CDN, you should host files like this yourself if you need them to work reliably.
InPrivate Blocking keeps a record of third-party items like the one above as you browse. When you choose to browse with InPrivate, IE automatically blocks sites that have “seen” you across more than ten sites. You can also manually choose items to block or allow, or obtain information about the third-party content directly from the site by clicking the “More information from this website” link. Note that Internet Explorer will only record data for InPrivate Blocking when you are in “regular” browsing mode, as no browsing history is retained while browsing InPrivate. An easy way to think of it is that your normal browsing determines which items to block when you browse InPrivate.
Disclaimer: I haven't actually tested any of this as I don't have IE8, but the document you linked to is pretty clear about this.
You should host the JS files on your own site.
Here's another reason to host the JS file on your site.
I've always wondered, would it be possible to have a safe fallback in the event the CDN is down/unavailable?
Something like:
<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js"></script>
<script type="text/javascript">
  // if the CDN copy failed to load, jQuery won't be defined, so fall back to a local copy
  if (typeof jQuery == 'undefined') {
    // the escaped </script> keeps the HTML parser from ending this block early
    document.write(unescape("%3Cscript src='local/jquery.min.js' type='text/javascript'%3E%3C/script%3E"));
  }
</script>
I think there would be a low percentage of people using IE8 and then turning on InPrivate Browsing. Google's CDN claims to have servers near the users accessing the website, so that performance is increased (not directly quoted). IE has caused me numerous problems in the past, and I dropped support for it.
Does it work from the domain name of the site, e.g. ajax.googleapis.com, or does it resolve the name? If it just logs the domain, couldn't you just wrap it in a CNAME, e.g. js.yourdomain.com -> ajax.googleapis.com?
