Mixing Secure and Non-Secure Content on Web Pages - Is it a good idea?

I'm trying to come up with ways to speed up my secure web site. Because there are a lot of CSS images that need to be loaded, it can slow down the site, since secure resources are not cached to disk by the browser and must be retrieved more often than they really need to be.
One thing I was considering is perhaps moving style-based images and javascript libraries to a non-secure sub-domain so that the browser could cache these resources that don't pose a security risk (a gradient isn't exactly sensitive material).
I wanted to see what other people thought about doing something like this. Is this a feasible idea or should I go about optimizing my site in other ways like using CSS sprite-maps, etc. to reduce requests and bandwidth?
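To illustrate, here's roughly the kind of split I'm considering (static.example.com is a made-up non-secure sub-domain; the page itself stays on https):

    <!-- the page and anything sensitive stay on https -->
    <link rel="stylesheet" href="https://www.example.com/css/main.css">
    <!-- purely decorative/static assets come from a plain-http host so they cache -->
    <script src="http://static.example.com/js/library.js"></script>
    <img src="http://static.example.com/img/gradient.png" alt="">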

Browsers (especially IE) get jumpy about this and alert users that there's mixed content on the page. We tried it and had a couple of users call in to question the security of our site. I wouldn't recommend it. Having users lose their sense of security when using your site is not worth the added speed.

Do not mix content; there is nothing more annoying than having to go and click the yes button on that dialog. I wish IE would let me always select "show mixed content" for a site. As Chris said, don't do it.
If you want to optimize your site, there are plenty of ways; if SSL is the only bottleneck left, buy a hardware SSL accelerator. Hmm... if you load an image using http, will that cached copy be used when you load it over https? Just a side question that I need to go find out.

Be aware that in IE 7 there are issues with mixing secure and non-secure items on the same page, so this may result in some users not being able to view all the content of your pages properly. Not that I endorse IE 7, but recently I had to look into this issue, and it's a pain to deal with.

This is not advisable at all. The reason browsers give you such trouble about insecure content on secure pages is that it exposes information about the current session and leaves you vulnerable to man-in-the-middle attacks. I'll grant there probably isn't much a 3rd party could do to sniff sensitive info if the only insecure content is images, but CSS can contain references to javascript/vbscript via behavior files (IE). If your javascript is served insecurely, there isn't much that can be done to prevent a rogue script scraping your webpage at an inopportune time.
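To illustrate that CSS-to-script path: IE's proprietary behavior property lets a stylesheet attach an .htc file, and .htc files carry script. A minimal sketch (the file name is made up):

    /* if this stylesheet is served insecurely, so is the script it pulls in */
    div.menu {
        behavior: url(fancy-menu.htc); /* .htc behaviors can contain JScript/VBScript */
    }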
At best, you might be able to get away with iframing secure content to keep the look and feel. As a consumer I really don't like it, but as a web developer I've had to do that before, having no other pragmatic options. Frankly, though, that approach has just as many defects, if not more: you're still hoping that nothing violates the integrity of the insecure outer page, so that it hosts the secure content and not some alternate content.
It's just not a great idea from a security perspective.

Related

Why do some websites that have SSL not work but still load if using the HTTPS version? How can I avoid it if I make a website?

Sometimes, if I go to a website, such as this one, through an HTTP link, it looks fine and works as apparently intended.
However, if you change the address to HTTPS, the page loads without any browser warnings but looks really weird and seems broken: spacing is messed up, the colors are wrong, fonts don't load, etc.
All of this same stuff happens in both Firefox and Chrome on my computer.
What causes this to happen? How can I avoid this if I make an HTTPS-secured website?
For me, the browser tells you what is wrong in a warning message: parts of the page (such as images) are not secure.
What does this mean? The developer of the site has linked some content such as CSS, JS, or images using HTTPS links and some using HTTP links.
Why is this a problem? Since some content is being retrieved over an insecure connection (http), it would be possible for malicious content to be injected into your browser, which could then grab information that was transmitted over https. Browsers have had this warning for a very long time, but in the interest of security they now err on the more secure side and block the insecure content.
What will fix this? There is nothing we can do as consumers of the website. The owner of the site should fix the problem. If you are really interested in viewing the site and not concerned about security, you can temporarily disable this protection from the URL bar warning message in Firefox.
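For the site owner, the fix is simply to reference every sub-resource securely; a sketch with placeholder URLs:

    <!-- broken: insecure reference on a secure page -->
    <link rel="stylesheet" href="http://cdn.example.com/style.css">
    <!-- fixed: explicit https -->
    <link rel="stylesheet" href="https://cdn.example.com/style.css">
    <!-- also works: protocol-relative, inherits the page's scheme -->
    <link rel="stylesheet" href="//cdn.example.com/style.css">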
As #micker explained, the page looks weird because not all of the sources are loading: their connections could not be made securely, and the browser denies the website the ability to load those sources because they are not referenced over a secure connection.
To elaborate further, in case it's still not quite clear: a more accurate and technical explanation is that Cascading Style Sheets, or CSS, is the language used to describe the presentation of a document (a webpage, in this case) and tells the browser how elements should be rendered on the screen. If you think of these stylesheets as building blocks that you combine to define different areas of a webpage and build one masterpiece, you can see why having multiple building blocks for a site sounds pretty normal.
To save even more time, rather than try to figure out the code for each and every stylesheet or "building block" that I want to include, I can borrow someone else's stylesheet that has the properties I want and link to it as a resource instead of making or hosting the resource myself. Now, if we pretend that there's a stylesheet for every font size change, font color variance, or font placement, that means we're going to need a building block to define each of those.
Now, if I am on a secure connection, the browser ensures that connection stays secure by only connecting to other sites, or resources, that are also secure. If any of the sites containing the CSS building blocks I want to use are not secure, a.k.a. not using SSL (indicated by the lack of an "s" in "http://" in their address), then the browser will prevent those connections from happening, and thus prevents the resources from loading, because it considers them a risk to your current secure connection.
In your example's particular case, things looked fine when you entered only http:// because the site you were visiting doesn't force visitors to use SSL and lets you connect over the less secure http protocol. Your browser is not connecting securely to it, so it takes no extra steps to protect you by blocking anything: your connection is already exposed, so the browser can freely connect wherever it needs to and load any resource, regardless of whether it can be transferred securely or not.
So when you go to the "https://" version of the site, there are no browser warnings because you're connecting to the site itself over a secure connection. Unfortunately, that also means that if the designer of the page linked resources from somewhere that doesn't offer SSL, or simply never updated the links to https://, those resources are considered insecure, and since you're on a secure connection, the browser blocks those connections. Blocked connections mean blocked resources, so the page loads incomplete, missing some of its building blocks: the ones that tell your screen to move all the text on the right into a panel, use a blue font color, and switch to a different font face. Those definitions of look and appearance didn't make it through, so those sections fall back to whatever stylesheet is present, which normally doesn't match what was intended to be there.

How effective is ajaxcrawling compared to serverside created website SEO?

I'm looking for real world experiences in regards to ajaxcrawling:
http://code.google.com/web/ajaxcrawling/index.html
I'm particularly concerned about the infamous Gizmodo failure of late. I know I can find them via Google now, but it's not clear to me how effective this method of ajaxcrawling is in comparison to server-side generated sites.
I would like to make a wiki that lives mostly on the client side and is populated by ajax/JSON. It just feels more fluid, and I think it would be a plus point over my competition (wikipedia, wikimedia).
Obviously, for a wiki it's incredibly important to have working SEO.
I would be very happy for any experiences you have had dealing with clientside development.
My research shows that the general consensus on the web right now is that you should absolutely avoid doing ajax sites unless you don't care about SEO (for example, a portfolio site, a corporate site, etc.).
Well, these SEO problems arise when you have a single page that loads content dynamically based on sophisticated client-side behavior. Spiders aren't always smart enough to know when content is being injected by JavaScript, so if they can't follow links to get to your content, most of them won't understand what's going on in a predictable way, and thus won't be able to fully index your site.
If you could have the option of unique URLs that lead to static content, even if they all route back to a single page by a URL rewriting scheme, that could solve the problem. Also, it will yield huge benefits down the road when you've got a lot of traffic -- the whole page can be cached at the web server/proxy level, leading to less load on your servers.
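As a rough sketch of that idea (the container id and URL scheme are made up, and this assumes the server can render each article at its own URL): let every link point at a real server-rendered page, and only hijack it for script-capable visitors:

    <a href="/wiki/Some_Article" class="ajax-link">Some Article</a>
    <div id="content"></div>
    <script>
      // Crawlers follow the real URL; browsers running JS load it in place instead.
      document.addEventListener('click', function (e) {
        var link = e.target.closest('a.ajax-link');
        if (!link) return;
        e.preventDefault();
        fetch(link.href)                       // assumes the server returns an HTML fragment here
          .then(function (res) { return res.text(); })
          .then(function (html) {
            document.getElementById('content').innerHTML = html;
            history.pushState(null, '', link.href); // keep the address bar honest
          });
      });
    </script>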
Hope that helps.

Is it possible to Drag-and-Drop images between Web sites (applications)?

There are a number of questions on stackoverflow about drag-and-drop but I can't see that any relate to this question specifically.
Question: Is it possible to drag-and-drop an image from one Web application (or site) to another Web application (not the same window etc.)?
I'm not looking for specific technologies that may help one achieve this, just if it is possible with Web application security restrictions.
For example, I've read that it's not possible for one Web application's Javascript to mess with the DOM of another Web application (for obvious reasons).
I just want to be able to drag an image displayed on one Web page into a Web application on another page (and for that application to have full access to the image).
Thanks,
Ashley.
Drag and drop is not defined within HTML. Many browsers (not IE, IIRC) support dragging and dropping image URLs into text boxes. So if you drag an image from one site and drop it into another site's textbox, you will have the full URL of the image, and you can have JavaScript take it from there.
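A rough sketch of the receiving end (the element id is made up): let the browser drop the URL into an ordinary textbox, then pick it up with script:

    <input id="dropbox" type="text" placeholder="drag an image here">
    <script>
      var box = document.getElementById('dropbox');
      box.addEventListener('drop', function () {
        // let the browser finish writing the URL into the textbox first
        setTimeout(function () {
          var url = box.value; // full URL of the dragged image
          console.log('got image URL:', url);
          // ...fetch, display, or submit the URL from here
        }, 0);
      });
    </script>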
I don't think this is directly possible without Gears.
EDIT: The Desktop API provides drag + drop.
It's really a question of data handling in the browser. If there were no security issues involved this would be a piece of cake ... but there are security issues, big ones. Any time you permit data from site X to be introduced to site Y in a programmatic way you are opening a door, and it's very difficult to find the right balance of "useful enough to permit exciting new functionality" without going all the way to "bending over in the shower in prison to pick up the soap".
Cheswick and Bellovin say there are two basic approaches to security:
That which is not explicitly forbidden is permitted.
That which is not explicitly permitted is forbidden.
Microsoft basically went with #1 and you can see where that leads to. Most paranoid sysadmins go the route of #2 with a vengeance. Opening a big door between two unrelated sites would send most of us screaming off into the woods.
Unfortunately, although browsers & web site people are mostly (somewhat?) aware of this problem and are trying to deal with it, companies like Adobe and their "flash storage" are creating more and more problems.

When trying to integrate one website with another what is the way to go? Iframe or pulling content?

My company has multiple vendors that all have their own websites. I am creating a website that acts as a dashboard where customers can access all of the vendor's sites. I wanted to know what is the best option for doing this?
Here's what I have so far:
Iframe
Can bring in the entire website
Seems secure enough (not sure if I'm missing any information on security issues for this)
Users can interact with the vendor's website through our site
Our website cannot fully interact with the vendor's website (Also may be missing info here)
Pulling in the content
Can bring in the entire website
Not very secure from what I hear (some websites actually say that pulling another website in is a violation of security and will alert the user of this, or something similar...)
Users can interact with their website through our site
Our website can fully interact with the vendor's website
Anyone have any other options...?
What are some of the downsides to bringing in a site with an iframe and is this really our only option for doing something like this?
Optimally, we would like to pull in their site to ours without using an iframe- What options do we have on this level? Is there anything better than an iframe?
Please add in as much information as you can about iframes, pulling content, security, and website interactions like this. Anything to add in is appreciated.
Thanks,
Matt
As far as "pulling content" is concerned, I wouldn't advise it, as it can break. All it takes is a simple HTML change on their end and your bot will break. Also, it's more work than you think to do this for one site, let alone the many that you speak of. However, there are 3rd-party apps that can do this for you if you have the budget.
You could use an iframe/frames, however, many sites might try to bust out of them and it can ruin the user experience of the site within the frame.
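For reference, the classic frame-busting snippet many of those sites run looks like this:

    // if this page finds itself inside someone else's frame, take over the window
    if (top !== self) {
        top.location = self.location;
    }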
My advice is to use a plain link that opens each vendor site in its own window/tab for every entry in your dashboard, along these lines (the original markup was stripped here; the URL is a placeholder and target="_blank" is the key part):

    <a href="http://vendor.example.com" target="_blank">Vendor Site Link</a>
If you can have the sites that you are embedding add some client-side script, then you could use easyXSS. It allows for easy transferring of data, and also calling javascript methods across the domain boundary.
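Under the hood, libraries like that boil down to cross-window messaging, which browsers now expose natively as window.postMessage; a minimal sketch with placeholder origins:

    // in the embedding dashboard page
    var frame = document.getElementById('vendorFrame');
    frame.contentWindow.postMessage('hello from dashboard', 'https://vendor.example.com');

    // in the embedded vendor page
    window.addEventListener('message', function (e) {
      if (e.origin !== 'https://dashboard.example.com') return; // only trust the dashboard
      console.log('received:', e.data);
    });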
I would recommend iFrames. Whilst not the most glamorous of elements, many payment service providers use iFrames for the Verified by Visa/Mastercard Secure Code integration.

Are there any disadvantages to using AJAX?

No integration with the browser's history.
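The usual workaround is to mirror application state into location.hash, which does create history entries; a bare-bones sketch (loadContentFor is a hypothetical loader):

    // store the current view in the URL fragment so Back/Forward work
    function showPage(name) {
        location.hash = name;              // pushes a history entry
        loadContentFor(name);              // hypothetical: fetch and render that view
    }

    // react when the user presses Back/Forward
    window.onhashchange = function () {
        loadContentFor(location.hash.slice(1));
    };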
If you build a site that requires Ajax to see content and perform tasks, you have several major problems. Ajax-only content/functions are invisible/unavailable to:
search bots
many mobiles
people with Javascript turned off
etc etc.
However, if you build a site using the progressive enhancement principle, those problems are solved, and you still get to serve nice-to-use Ajax to most users.
Progressive enhancement involves first creating your site using bare-bones (X)HTML, on REST-like principles (at least to the extent of requiring POST requests for state changes). Simple semantic markup; forget about CSS and Javascript.
Step one is to get that right, and have your entire site (or as much of it as makes sense) working nicely this way for search bots and Lynx-like user agents.
Then add a visual layer: CSS/graphics/media for visual polish, but don't significantly change your original (X)HTML markup; allow the original text-only site to stay intact and functioning. Keep your markup clean!
Third is to add a behavioural layer: Javascript (Ajax). Offer things that make the experience faster, smoother, nicer for users/browsers with Ajax-capable JS... but only those users.
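As a sketch of that third layer (the URL and element names are made up): start from a form that posts normally, and let script upgrade it where it can:

    <form action="/comments" method="post" id="comment-form">
      <textarea name="body"></textarea>
      <input type="submit" value="Post">
    </form>
    <script>
      // Behavioural layer: only browsers that run this ever see Ajax;
      // everyone else still has the working plain-HTML form above.
      var form = document.getElementById('comment-form');
      form.addEventListener('submit', function (e) {
        e.preventDefault();
        var xhr = new XMLHttpRequest();
        xhr.open('POST', form.action);
        xhr.send(new FormData(form));
      });
    </script>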
Browser compatibility.
Asynchronous access to data means it's harder to make things go correctly in every combination of actions.
Dependency on javascript makes the site unusable for some. Also, javascript performance can be a bottleneck in resource-limited environments.
The user may not get any indication from the client that an AJAX operation was made, or that it failed. It can be difficult to recover from client-side errors caused by a failed AJAX call.
Makes it really hard to do functional testing.
Inability to update the client without "polling", which means querying the server every X seconds.
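(Polling just means something like the following, with the interval and endpoint as placeholders:)

    // ask the server for updates every 10 seconds
    setInterval(function () {
        var xhr = new XMLHttpRequest();
        xhr.open('GET', '/updates');
        xhr.onload = function () {
            console.log('server said:', xhr.responseText);
        };
        xhr.send();
    }, 10000);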
It requires javascript. And you have to admit to your friends how "Web 2.0" you are, instead of being hardcore old school: it's all tables for layout and frames for navigation for me.
Yes, Ajax is not supported by old browsers or browsers which don't have javascript enabled. Nowadays, most browsers do have support for Ajax, even mobile browsers like the one on the iPhone.
The biggest issue for me is that Ajax adds complexity to the project.
There are many ajax libraries out there which are supposed to make life easier. In most cases, these libraries are easy to use to create a "Hello World" application. One of the main issues that is most of the time set aside by Ajax libraries is (client-side) error handling/logging.
For larger projects, the developer has to understand the internals of the library, which adds a new learning discipline to the project.
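A sketch of the kind of error handling most libraries leave to you (the endpoint and reportError logger are hypothetical):

    var xhr = new XMLHttpRequest();
    xhr.open('GET', '/api/data');
    xhr.onload = function () {
        if (xhr.status >= 400) {
            reportError('HTTP ' + xhr.status);  // hypothetical client-side logger
        }
    };
    xhr.onerror = function () {
        reportError('network failure');         // this never shows up in server logs
    };
    xhr.send();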
Some of our big clients, for security reasons, made a corporate decision to have javascript switched off. Therefore no AJAX is possible.
If you are going to develop something using AJAX for a given client, be sure that your client is allowed to use javascript.
Restrict your application to a reasonable number of browsers and browser versions.
Cross-browser compatibility can make your life miserable.
Ultimately, the problem it introduces is complexity. Most problems inherent in AJAX sites (bookmarking, browser history, graceful degradation, etc.) can be overcome with a good design, so there are not really any disadvantages to a well-designed AJAX-enabled site. The problem is that creating such a site requires a lot of design and very good developers who can manage the complexity.
