Turn all links on a page into HTTPS within Firefox

Let's just say that I wanted to be extra careful with the websites I'm visiting (irrespective of whether the site is offered over HTTPS) and wanted to convert every href in the received web page into its HTTPS equivalent.
Is there a way or add-on to do this? Or do I have to write my own? :(

As Paul said, most sites will break if you do this. However, if you wanted to do something similar to this (grabbing all the links on a page and doing something to them), a Greasemonkey script would be easier and quicker than writing a Firefox add-on.
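A minimal sketch of such a userscript (untested; the @include patterns are just examples, and as noted, blindly rewriting will break links to sites that don't serve HTTPS):
// ==UserScript==
// @name        Force HTTPS links
// @include     http://*
// @include     https://*
// ==/UserScript==
// Rewrite every http:// link on the page to https://
for (var i = 0; i < document.links.length; i++) {
    if (document.links[i].protocol === 'http:') {
        document.links[i].protocol = 'https:';
    }
}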

You can't just point all links to HTTPS; most of them will break, and secure sites will redirect you to HTTPS anyway.

Related

The bookmarklet doesn't work on an HTTPS website?

Here's how I developed a bookmarklet that gets the value of an input control on a web page:
I wrote a JavaScript function, added the bookmarklet to my browser, loaded my test web page, and tested the bookmarklet; the result was OK.
But when I test the bookmarklet on an HTTPS website, it cannot get the input control's value. Why? Why doesn't the bookmarklet work on the HTTPS website? Is there any way to make the bookmarklet work on HTTPS sites?
Three questions:
Why can't you get the input value: there is no reason why it wouldn't work; almost certainly you are looking for the wrong id.
Do bookmarklets work on HTTPS: absolutely, HTTPS is not the problem (see the minimal example below).
Can I make it work on HTTPS sites: if you provide a code sample, we might be able to tell you what is wrong with it.
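For what it's worth, a bookmarklet that reads an input's value looks the same whether the page is HTTP or HTTPS; a minimal sketch (the id 'username' is only a placeholder for whatever control you are targeting):
javascript:(function () {
    /* Look the input up by id and show its current value */
    var el = document.getElementById('username');
    alert(el ? el.value : 'No element with id "username" on this page');
})();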
I know this is a pretty old question, but since I came across it while searching for a similar problem, I will add my thoughts. If you wrote your own bookmarklet, this is most likely caused by your bookmarklet trying to load insecure content. If your bookmarklet references other static content on your own server over plain HTTP, such as HTML, JS, CSS, or image files, the browser will block that content from loading on an HTTPS page (mixed-content blocking, closely related to the Same Origin Policy). This is also discussed in this related question. If you, or someone else reading this, is having the same problem, serve your content over HTTPS, or reference only content that is already served over HTTPS.
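As a concrete illustration of that last point, a bookmarklet that pulls in its main script should reference it over HTTPS so an HTTPS page won't block it; a rough sketch (the URL is hypothetical):
javascript:(function () {
    /* Load the bookmarklet's real code over HTTPS to avoid mixed-content blocking */
    var s = document.createElement('script');
    s.src = 'https://example.com/bookmarklet.js'; /* hypothetical URL; must be HTTPS */
    document.body.appendChild(s);
})();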

Using mod_rewrite to remove www along with SWFAddress. Why does Safari lose content after the hashtag when Chrome and FF don't?

I'm curious about browser behavior when using mod_rewrite and a hashtag (#). Firefox and Chrome can rewrite a URL that has a 'www' and remove the 'www' while keeping the original URL, with the hashtag and fragment, no problem. That's awesome! But Safari and IE7/8 (not sure about 9) remove the 'www' and lose the hashtag and fragment during the rewrite. I'm wondering if there's a fix using mod_rewrite for Safari and IE, although I suspect there isn't, because I'm asking about fragments. (I have no experience with mod_rewrite; I've only been reading about it and just started using the HTML5 Boilerplate .htaccess file. I've read that hash fragments are never sent to the server, and that if I wanted to store or use the fragment I'd have to do something client-side with JavaScript.)
I can use Twitter to show an example of what I'm talking about. If you use Chrome or FF and go to http://www.twitter.com/#!biz you'll get the rewrite to http://twitter.com/#!biz, no problem, perfect. But using that same URL (with the 'www') in Safari and IE7/8 rewrites the URL back to the main Twitter URL: without the 'www', but with no hash and fragment, no deep link. It's the same result for the Twitter URL that has #!/biz (with the second /).
If Twitter doesn't (or can't) do anything about this Safari/IE 'www' behavior, then maybe I shouldn't sweat it either? Is this just a browser thing, since there's no way with mod_rewrite in an .htaccess file to capture a hashtag fragment?
Specifically in my work, I'm using SWFAddress, which uses the hashtag to deep link in Flash content, which is almost exactly like a Twitter URL, except there's no '!'. I think Twitter is using the Making AJAX Crawlable approach. And just like Twitter, my URLs will rewrite fine in FF and Chrome, but bonk in Safari and IE 7/8. It doesn't seem to make a difference if the hashtag has the '!' or not, it's still part of the fragment, no?
I started to play around with an SWFAddress version of the Making AJAX Crawlable approach back when I was looking for a way to make the content on a Flash site crawlable, but the browsers handle the mod_rewrite the same way: the removal of the 'www' works fine in FF/Chrome, and Safari and IE drop the fragment. I have a working example of my SWFAddress Making AJAX Crawlable version (had to remove the link; new users on Stack Overflow only get 2 URLs per post). In the end I didn't use that approach for making my Flash content searchable, but it looks promising. I think making the HTML snapshots was more work than using PHP pages, but that's a totally different subject/question.
It'd be funny if (for my first stackoverflow question) there was a really short answer like, Yeah, don't sweat it! :)
Thanks!
The part after the hash is never sent to the server with the request, so it's being lost by the browser, not Apache. Nothing you can do about that, except maybe open a bug report with the browser vendors.
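For context, a www-stripping rule of the kind the HTML5 Boilerplate ships looks roughly like this; note that the fragment never appears in REQUEST_URI, so whether it survives the 301 is entirely up to the browser:
# Redirect www.example.com/... to example.com/... (the #fragment is never part of the request)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
RewriteRule ^ http://%1%{REQUEST_URI} [R=301,L]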

Redirect AJAX page requests to canonical links with .htaccess

I'm coding a site that makes heavy use of AJAX to load pages for users with JavaScript, but I also want it to be friendly for users with JavaScript disabled or unavailable. I've covered all the basics; for example, all my links point to canonical links, and JavaScript loads them via AJAX. My "about" page, therefore, is located at /about/, but will load on the main page and will, once finished, utilize hash/hashbang links to enable back-button functionality.
Here's the problem I have: while a hash/hashbang link can be used to reach a specific page via AJAX for users with JavaScript, if a user with JavaScript shares such a link with someone without it, the page cannot be loaded for that person using AJAX.
As such, I'd like to be able, if possible, to use .htaccess to redirect hash/hashbang-specified pages to the canonical link. In other words, the exact opposite of what this contributor was trying to achieve.
http://example.com/#!about --> http://example.com/about/
Is it possible with .htaccess, or otherwise without JavaScript? If so, how?
Thanks!
I don't think it's possible to do this on the server side, because the part of the URL after the # is not included in the request sent to the server.
I might be a bit late to the party on this one, but I'm looking into this too. Since your URL already contains the #!, as opposed to just #, you can actually do this. Google will fetch
http://example.com/#!about
as
http://example.com/?_escaped_fragment_=about
Therefore, if you put a 301 redirect on that, and use JavaScript to redirect users from the #! version of the page, you have practically reached your desired result.
I realise you asked for a no-JavaScript solution, but I figure that was for reasons of SEO. For more information, please see this page by Google.
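Since the fragment never reaches the server, the user-facing redirect has to happen client-side; a rough sketch of that half (assuming every fragment is of the form #!name and the canonical page lives at /name/):
// On http://example.com/#!about, send the visitor to http://example.com/about/
if (window.location.hash.indexOf('#!') === 0) {
    var page = window.location.hash.slice(2); // "about"
    window.location.replace('/' + page + '/');
}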
EDIT:
A meta refresh can also handle the redirect without JavaScript:
<meta http-equiv="refresh" content="5; url=http://example.com/">
Some more on meta refresh here.
It:
1) Does not require JavaScript!
2) Can be SEO friendly!
3) Works with bookmarks and history (etc.)
I hope this helps!

Login form in a lightbox

We've been trying to implement a site with an HTTP home page but HTTPS everywhere else. In doing so we hit a rather big snag: our login form, in a lightbox, would need to fetch an HTTPS form using AJAX, embed it in an HTTP page, and then (possibly) handle the form errors, still within the lightbox.
In the end we gave up and just made the whole site HTTPS, but I'm sure I've seen a login-in-a-lightbox implementation on other sites, though I can't find any examples now that I want to.
Can anyone give examples of sites that have achieved this functionality, or explain how/why this functionality can/can't be achieved?
The Same Origin Policy prevents this. The page is either 100% HTTPS or it's not, and the Same Origin Policy treats a different protocol as a "different" site.
A "lightbox" is no different from any other HTML; it's just laid out differently. The same rules apply.
One option would be to use an iframe. It's messy, but if having the whole shebang in HTTPS isn't an option, it can get the job done.
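A rough sketch of that option (the login URL is a placeholder): the lightbox on the HTTP page simply frames the HTTPS form.
<div class="lightbox">
  <!-- The form itself is served and submitted over HTTPS, even though the parent page is HTTP -->
  <iframe src="https://example.com/login" width="400" height="300"></iframe>
</div>
One well-known caveat: the address bar still shows the plain-HTTP parent page, so users have no easy way to verify that the framed form really is secure.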
You might be able to put the login form into an iframe so that users can log in through HTTPS while it seems they are on an HTTP page,
but I'm not sure why you would want to do this.

Full AJAX site and SEO

I am planning to start a full AJAX site project, and I was wondering about SEO.
The site will have URLs like www.mysite.gr/#/category1 etc.
Can Google crawl the site?
Is there anything I have to be aware of regarding full AJAX and SEO?
Any reading suggestions are welcome.
Thanks
https://stackoverflow.com/questions/768233/do-hashes-in-urls-affect-seo
You might want to read about so-called progressive enhancement.
Google supports indexing of AJAX sites, but unfortunately it involves extra work for the developer. See http://code.google.com/web/ajaxcrawling/docs/getting-started.html
I don't think Google is capable of doing so (yet)
http://googlewebmastercentral.blogspot.com/2009/10/proposal-for-making-ajax-crawlable.html
However you can of course make your site usable with or without JavaScript. That way, browsers will have the full candy stuff, and Google (and text browsers) can still navigate your site.
In addition to SEO, you also need to think about usability standards here. A site that's that reliant on AJAX isn't going to work for things like screen readers, as well as spiders. You need a system for graceful degradation. A website that can't function without JavaScript isn't really a functioning website.
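A rough sketch of what graceful degradation looks like in practice (the .ajax-nav class and #content id are assumptions): links point at real, crawlable URLs, and a script upgrades them to AJAX loads when it is available.
// Without JavaScript the links navigate normally; with it, content loads in place
document.addEventListener('click', function (e) {
    var link = e.target.closest ? e.target.closest('a.ajax-nav') : null;
    if (!link) return;
    e.preventDefault();
    fetch(link.href)
        .then(function (r) { return r.text(); })
        .then(function (html) {
            document.getElementById('content').innerHTML = html;
            history.pushState(null, '', link.href); // keep the canonical URL in the bar
        });
});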
The search engines will spider the initial page load; what happens to the page (with AJAX) after that is irrelevant to listings.
Google itself doesn't crawl AJAX content, but it advises a mechanism for it. For this you first need to change # to #!.
The whole process of making AJAX content SEO-friendly is explained here, along with simple ASP.NET code to start working on it.
Imagine having to hit the “refresh” button in your browser to update your Twitter feed rather than just hitting the button on the page itself and having it instantly update. These are the types of problems that AJAX solves, although it does come with its pitfalls. Google might claim it’s able to crawl and parse AJAX websites, yet it’s risky to just take its word for it and leave your website’s organic traffic up to chance. Even though Google can usually index dynamic AJAX content, it’s not always that simple. This guide covers some of the things that can go wrong and how you can make sure your AJAX website is crawlable: https://prerender.io/ajax-seo/
