Allow CloudFront Globally in NoScript - Firefox

So Amazon's CloudFront CDN is ubiquitous, and as a NoScript user it can be a little frustrating having to allow every "########.cloudfront.net" on different sites. Does anyone know how to create an ABE rule in NoScript to allow any script coming from a *.cloudfront.net domain?

While the answer by #samadadi is correct and will allow you to universally accept content from CloudFront, you might want to consider why you're running NoScript in the first place.
CloudFront is a Content Delivery Network that anyone can use, even "the bad guys". If you allow JavaScript downloaded from there to run, then you are circumventing the protection NoScript offers. If you want to remain safe, I would recommend sticking with allowing it on a site-by-site basis.
This might be an unpopular option because, yes, it is more work, but you will be safer.
Update (just to save people reading the comments below): CloudFront subdomains, while looking random, remain the same between visits. Permanently allowing the CloudFront subdomains used by sites you trust should be safe, and NoScript will only ask you once.

Add these addresses to the NoScript whitelist to allow CloudFront scripts globally:
cloudfront.net
amazonaws.com

To allow *.cloudfront.net, I checked the Debug option and touched up the JSON as suggested here: https://www.dedoimedo.com/computers/firefox-noscript-10-guide-1.html. Let me know if you have trouble doing this and I will walk you through it.
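For reference, the part of the exported NoScript JSON I edited looks roughly like this; the exact keys and entry prefixes differ between NoScript versions, so treat it purely as a sketch rather than something to paste verbatim:

{
  "sites": {
    "trusted": ["cloudfront.net", "amazonaws.com"],
    "untrusted": [],
    "custom": {}
  },
  "enforced": true
}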

Related

What is the best way to let Google know the difference between a Production, Development, and Staging environment?

We have three domain names with pretty close to duplicate content (Magento sites). Let's call them production.com, development.com and staging.com.
I have a robots noindex on development.com and staging.com. I also have .htpasswd protection enabled. A Google search of these domains shows that they haven't been indexed. However, I'm starting to get phishing warnings from Chrome when I log in to the back end of the software.
I need to stop this as soon as possible. If the warnings spread to the front end we're looking at pretty serious ramifications.
What is the best course of action?
Through robots.txt, make sure Google finds a Disallow rule and will not spider the servers that are not the production server.
Also, elaborate on what warnings are being raised.
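A minimal robots.txt for development.com and staging.com would be the following (the production site keeps its normal robots.txt):

# robots.txt on the development and staging servers only
User-agent: *
Disallow: /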

Magento - prevent browsing via non-rewritten URLs

I have a problem with someone (using many IP addresses) browsing all over my shop using:
example.com/catalog/category/view/id/$i
I have URL rewrite turned on, so the usual human browsing looks "friendly":
example.com/category_name.html
Therefore, the question is: how do I prevent browsing the shop using the "old" (non-rewritten) URLs, leaving only the "friendly" URLs allowed?
This is pretty important, since the crawler is using hundreds of threads, which is making the shop really slow.
Since there are many random IP addresses, clearly you can't just block access from a single address or a small group of addresses. You may need to implement some logging that somehow identifies this crawler uniquely (maybe by its user-agent string, or possibly with some clever use of the Modernizr JavaScript library).
Once you've been able to distinguish some unique identifiers of this crawler, you could probably use a rule in .htaccess (if it's a user agent thing) to redirect or otherwise prevent them from consuming your server's oomph.
This SO question provides details on rules for user agents.
Block all bots/crawlers/spiders for a special directory with htaccess
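If the crawler does identify itself with a distinctive user-agent string (the "BadCrawler" name below is made up; substitute whatever your logs show), an .htaccess rule along these lines would turn it away:

# Refuse requests from a specific (hypothetical) user agent
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} BadCrawler [NC]
RewriteRule ^ - [F,L]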
If the spider crawls all the URLs of the given pattern:
example.com/catalog/category/view/id/$i
then you can just kill these URLs in .htaccess. The rewrite from category.html to /catalog/category/view/id/$i is made internally, so you only block the bots, for example:
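A sketch of that, assuming the usual Magento .htaccess in the web root (test before deploying):

# Refuse direct requests to the non-rewritten category URLs; the friendly
# .html URLs are resolved internally by Magento, so they keep working.
RewriteEngine On
RewriteRule ^catalog/category/view/id/ - [F,L]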
Once the rewrites are there... they are there. They are stored in the Mage database for many reasons. One is crawlers like the one crawling your site; another is users that might have the old page bookmarked. There are a number of methods people have come up with to go through and clean up the rewrites (Google it), but as it stands, once they are there, they are not easily managed from within Magento.
I might suggest generating a new sitemap and submitting it to the search engine whose crawler is affecting your site. Not only is this crawler going to be crawling tons of pages it doesn't need to, it's going to see duplicate content (bad juju).

Copy cookies to another domain. Firefox? Chromium?

Is there any way to copy a set of cookies from one domain to another? I badly need this for web development.
You cannot just copy a set of cookies, but you can write your own PHP/Python code to set several cookies for the other domain, using the values from the old set, along the lines of the sketch below.
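A minimal sketch of that idea, written in plain Node.js rather than the PHP/Python mentioned above (the approach is the same); the cookie names and values are placeholders you would copy over from the old domain:

const http = require('http');

// Hypothetical values noted from the old domain.
const copiedCookies = {
  sessionid: 'abc123',
  locale: 'en_US',
};

http.createServer((req, res) => {
  // Re-issue each cookie for *this* domain (a site can only set its own cookies).
  res.setHeader('Set-Cookie',
    Object.entries(copiedCookies).map(([name, value]) => `${name}=${value}; Path=/`));
  res.end('Cookies set for this domain.\n');
}).listen(3000);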
No, the same origin policy prohibits sites from setting or reading cookies on behalf of other sites outside of a few special cases. Some browser extensions will allow you to copy and paste cookies to sync them manually, though.
From your comments it sounds like you want to use the same cookie for your development and your production system. How about using something like 'local.example.com' instead of 'localhost' as your development domain and setting wildcard cookies for all subdomains?
We use this pattern so we don't need to register multiple API keys for webservices since most of them have wildcard support for subdomains.
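In code, the wildcard cookie boils down to something like this (a sketch; example.com stands in for your real domain):

// A cookie scoped to .example.com is sent to local.example.com (dev)
// and www.example.com (prod) alike.
document.cookie = 'session=abc123; domain=.example.com; path=/';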
I'm not sure I would recommend something like that just so you can use the same cookies in development and production, because it has other implications as well. For example, if you serve static assets from another subdomain, the browser will send the cookie headers unnecessarily, and there might be other details that make debugging harder rather than easier this way.
If you could explain the problem at hand in a bit more detail there might be other solutions or best practices for staging and production environments that can help you.
You can do this manually, using Greasemonkey:
Go to Tools -> Page Info.
Select the Security tab.
Click the View Cookies button.
Type the domain you wish to read from.
Make a note of all the cookies and their content.
Now go to the domain you wish to copy to.
Install the Greasemonkey add-on for Firefox (or better yet, use Cookies Manager+ :: Add-ons for Firefox).
Run some JavaScript code to re-create the noted cookies and their values, for example:
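A Greasemonkey-style sketch of that last step; the cookie names and values below are placeholders for whatever you noted from the other domain:

var copied = {
  sessionid: 'abc123',
  prefs: 'dark'
};
for (var name in copied) {
  // Re-create each cookie on the current domain.
  document.cookie = encodeURIComponent(name) + '=' +
                    encodeURIComponent(copied[name]) + '; path=/';
}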

When trying to integrate one website with another what is the way to go? Iframe or pulling content?

My company has multiple vendors that all have their own websites. I am creating a website that acts as a dashboard where customers can access all of the vendors' sites. I wanted to know what is the best option for doing this?
Here's what I have so far:
Iframe
Can bring in the entire website
Seems secure enough (not sure if I'm missing any information on security issues for this)
Users can interact with the vendor's website through our site
Our website cannot fully interact with the vendor's website (Also may be missing info here)
Pulling in the content
Can bring in the entire website
Not very secure from what I hear (some websites actually say that pulling another website in is a violation of security and will alert the user of this, or something similar)...
Users can interact with their website through our site
Our website can fully interact with the vendor's website
Anyone have any other options...?
What are some of the downsides to bringing in a site with an iframe and is this really our only option for doing something like this?
Optimally, we would like to pull in their site to ours without using an iframe- What options do we have on this level? Is there anything better than an iframe?
Please add in as much information as you can about iframes, pulling content, security, and website interactions like this. Anything to add in is appreciated.
Thanks,
Matt
As far as "pulling content" is concerned I wouldn't advise it as it can break. All it takes is a simple HTML change on their end and your bot will break. Also, it's more work than you think to do this for one site, let alone the many that you speak of. However, there are 3rd party apps that can do this for you if you have the budget.
You could use an iframe/frames, however, many sites might try to bust out of them and it can ruin the user experience of the site within the frame.
My advice is to use a plain HTML link for each vendor site in your dashboard, opened in a new tab, along these lines (the vendor URL is a placeholder):
<a href="https://www.vendor-example.com/" target="_blank">Vendor Site Link</a>
If you can have the sites that you are embedding add some client-side script, then you could use easyXSS. It allows for easy transferring of data, and also calling JavaScript methods across the domain boundary.
I would recommend iframes. Whilst not the most glamorous of elements, many payment service providers use iframes for their Verified by Visa / MasterCard SecureCode integration.

Mixing Secure and Non-Secure Content on Web Pages - Is it a good idea?

I'm trying to come up with ways to speed up my secure web site. Because there are a lot of CSS images that need to be loaded, the site can be slow, since secure resources are not cached to disk by the browser and must be retrieved more often than they really need to be.
One thing I was considering is perhaps moving style-based images and javascript libraries to a non-secure sub-domain so that the browser could cache these resources that don't pose a security risk (a gradient isn't exactly sensitive material).
I wanted to see what other people thought about doing something like this. Is this a feasible idea or should I go about optimizing my site in other ways like using CSS sprite-maps, etc. to reduce requests and bandwidth?
Browsers (especially IE) get jumpy about this and alert users that there's mixed content on the page. We tried it and had a couple of users call in to question the security of our site. I wouldn't recommend it. Having users lose their sense of security when using your site is not worth the added speed.
Do not mix content; there is nothing more annoying than having to go and click the Yes button on that dialog. I wish IE would let me always choose to show mixed content. As Chris said, don't do it.
If you want to optimize your site, there are plenty of ways; if SSL overhead is the only thing left, buy a hardware accelerator. Hmm, if you load an image using HTTP, will it be served from cache when you later load it with HTTPS? Just a side question that I need to go find out.
Be aware that in IE 7 there are issues with mixing secure and non-secure items on the same page, so this may result in some users not being able to view all the content of your pages properly. Not that I endorse IE 7, but recently I had to look into this issue, and it's a pain to deal with.
This is not advisable at all. The reason browsers give you such trouble about insecure content on secure pages is that it exposes information about the current session and leaves you vulnerable to man-in-the-middle attacks. I'll grant there probably isn't much a 3rd party could do to sniff sensitive info if the only insecure content is images, but CSS can contain references to JavaScript/VBScript via behavior files (IE). If your JavaScript is served insecurely, there isn't much that can be done to prevent a rogue script from scraping your web page at an inopportune time.
At best, you might be able to get away with iframing the secure content to keep the look and feel. As a consumer I really don't like it, but as a web developer I've had to do that before due to no other pragmatic options. Frankly, there are just as many, if not more, problems with that too: after all, you're hoping that nothing violates the integrity of the insecure page, so that it keeps hosting the genuine secure content and not some alternate content.
It's just not a great idea from a security perspective.
