I want to prevent JavaScript redirects in Firefox for one domain (youtube.com), and I was wondering whether there's a plugin that will do it. I'm trying to use NoScript, and I'd like to allow scripts globally because I don't want to disable most JavaScript, but this seems to just allow JavaScript redirects. Is there a way for me to just disable JavaScript redirects (or ideally, display a prompt)?
The only other way I can think of doing it is to write my own extension that messes around with window.onbeforeunload and window.onunload, but ideally I'd like to use an existing addon.
var {utils: Cu, classes: Cc, interfaces: Ci, results: Cr} = Components;
Cu.import('resource://gre/modules/Services.jsm');
var myobserve = function(aSubject, aTopic, aData) {
var httpChannel = aSubject.QueryInterface(Ci.nsIHttpChannel);
if (httpChannel.loadFlags & Ci.nsIHttpChannel.LOAD_REPLACE) {
// it's a redirect, let's block it
httpChannel.cancel(Cr.NS_BINDING_ABORTED);
}
}
I use the presence of flags to test for a redirect. Here are some notes I took on the flags a while back; I'm not totally sure how accurate they are:
If a request has LOAD_DOCUMENT_URI it usually also has LOAD_INITIAL_DOCUMENT_URI when it's the top window, but for view-source we just see LOAD_DOCUMENT_URI. For a frame we also just see LOAD_DOCUMENT_URI. JS and CSS files etc. (I think only some images fire http-on-modify-request, not sure, maybe all, but not if cached) come in with LOAD_CLASSIFY_URI or no flags (0).
Note, however, that if there is a redirect, you will get LOAD_DOCUMENT_URI and LOAD_INITIAL_DOCUMENT_URI on the initial request and then on subsequent redirects until the final redirect. All redirects have the LOAD_REPLACE flag though.
Note: I think all images fire http-on-modify-request, but cached images don't. Images can also have the LOAD_REPLACE flag.
Of course to start observing:
Services.obs.addObserver(myobserve, 'http-on-modify-request', false);
and to stop:
Services.obs.removeObserver(myobserve, 'http-on-modify-request');
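Since the question is specifically about youtube.com, here is a minimal sketch of a variant of the observer above limited to one domain. httpChannel.URI is an nsIURI, so reading its host is fine; note the assumptions, though: this checks the host of the request being redirected to, and it treats youtube.com plus its subdomains as the target, so adjust both to taste:
var myobserve = function(aSubject, aTopic, aData) {
    var httpChannel = aSubject.QueryInterface(Ci.nsIHttpChannel);
    var host = httpChannel.URI.host;
    // assumption: youtube.com and any of its subdomains count as the target domain
    var isYoutube = (host == 'youtube.com' || host.slice(-'.youtube.com'.length) == '.youtube.com');
    if (isYoutube && (httpChannel.loadFlags & Ci.nsIHttpChannel.LOAD_REPLACE)) {
        // it's a redirect on the target domain, block it
        httpChannel.cancel(Cr.NS_BINDING_ABORTED);
    }
}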
I'm going to build my application based on ajax, and my URLs are something like:
http://server.com/module/#function_name,param1,param2...etc
After reading some discussions about Google's suggestion, the hashbang (#!), it wasn't hard for me to realize that it is not the best solution. There are several reasons:
The URL is pretty ugly, anyway.
It's terrible if someday Google (or some other search engine) suggests a better solution than the hashbang. I'd have to keep my ugly hashbang URLs, or write some JS code to keep links to my pages alive.
HTML5 pushState will be popular someday.
For all the reasons above, I've decided to do it my way: my navigation links will look like this:
<a href="http://server.com/module/for-crawler/function-name/param1/param2/...">
Some text </a>
And some jQuery code will make it load ajax content instead of changing the page like a normal link:
$(function(){
$('a').live('click',function(e){
var realURL = translateURL( $(this).attr('href') )
loadContent( realURL );
e.preventDefault();
return false;
})
})
/* -- the function translateURL will turn url like :
..... http://server.com/module/for-crawler/function-name/param1/param2/...
Into:
..... http://server.com/module/#function-name/param1/param2/...
That's the real url I think all ajaxers are used to dealing with
*/
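(For reference, a minimal sketch of what translateURL could look like, assuming the only difference between the two forms is the /for-crawler/ segment being replaced by the hash:)
function translateURL(url){
    // e.g. http://server.com/module/for-crawler/function-name/param1/...
    //   -> http://server.com/module/#function-name/param1/...
    return url.replace('/for-crawler/', '/#');
}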
When a crawler reads my page, it will follow the URL in the "href" attribute, and I will serve it a static, non-JS version of my page just for Google to read. After some days my page gets indexed, and users will see it in Google's results like this:
http://server.com/module/for-crawler/function-name/param1/param2/...
I'm then going to use JS again to redirect the user to my normal ajax version, I mean, to the real URL:
http://server.com/module/#function-name/param1/param2/...
That's the best approach I can think of at this time. Please give me advice: should I do it that way, or can I make it better? Thanks!
Depending on your audience, I would suggest using HTML5 pushState anyway.
If the client does not support HTML5 pushState, simply serve it the same version of your app that the crawlers get. In my opinion a page reload is not as bad as a hashed URL. Since users share URLs, your hashed URL gets exposed to other users, and it wouldn't work for, say, Facebook's link-sharing previews or any other client that doesn't support JavaScript.
Instead, I would only use the crawler-friendly app in combination with HTML5 pushState. With pushState you always expose a single URL, independent of the JavaScript support of your client.
First, detect whether PushState is supported:
function supports_history_api() {
return !!(window.history && history.pushState);
}
Then your click-handler would look something like this:
$('a').live('click',function(e){
var url = $(this).attr('href');
e.preventDefault();
loadContent( url );
history.pushState({"url":url}, $(this).attr('title'), url);
return false;
})
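To make the back/forward buttons work as well, you would also listen for the popstate event and reload content for the stored state; a minimal sketch, assuming loadContent is the same function used above:
window.onpopstate = function(e){
    // e.state is whatever was passed to history.pushState for that entry
    if (e.state && e.state.url) {
        loadContent(e.state.url);
    }
};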
I'm looking for a way to load a part of an external page (possibly selected by an id in the external page) into a div. Something similar to Ajax.Updater, but with the option of specifying an id to look for in the external page.
Does anything like this exist in Prototype? I've been googling for examples without luck. If I can't find it soon I'll have to do some "gymnastics" with Ajax.Request and some function tied to onSuccess.
You could do something like this, though it is by no means an elegant solution.
new Ajax.Updater("container", 'www.babes.com', {
onSuccess: function() {
$("container").update( $('idOfSomeLoadedElement') );
}
});
I don't think there is an actual elegant way of doing this purely in js. Ideally, you'd make your AJAX request only for what you need. You might be able to do some server-side stuff to lop out what you don't need (basically, offload the onsuccess functionality above to the server).
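If you do stay purely client-side, the Ajax.Request "gymnastics" mentioned in the question could look roughly like this (a sketch only, assuming the page is on your own origin and the fragment you want has the id someId):
new Ajax.Request('/some/page.html', {
    method: 'get',
    onSuccess: function(response) {
        // parse the whole response into a detached element ...
        var tmp = new Element('div');
        tmp.innerHTML = response.responseText;
        // ... then keep only the fragment we actually want
        var fragment = tmp.down('#someId');
        if (fragment) $('container').update(fragment);
    }
});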
Instead of AJAX you might get by with an iframe.
// dollar function calls Element.extend
var iframe = $(document.createElement('iframe'));
// control how and what it loads
iframe.addEventListener('load', function() {
$('container').update(iframe.contentDocument.select('#someID').first());
});
iframe.setAttribute('src', 'http://URL');
// add it as invisible, content isn't loaded until then
iframe.setAttribute('hidden', true);
document.body.appendChild(iframe);
From a minimal test it's clear that you'll have to be as conscious about cross-origin policies as with any AJAX method; that is, it's a PITA.
I have run ajax-calls on the unload event for about a year.
It has generally worked in FF and IE, but not 100% of the time; I cannot say when it has failed.
I register the event by writing in the body tag:
onunload="...."
I got error messages in FF4 since the unload handler also wanted to write into a div of the page that had just unloaded. I fixed this by making the ajax routine write nothing if the id of the target div is 'dummy'.
I am no expert on AJAX, but the following code has worked:
http://yorabbit.info/e-dog.info/tmp/ajax_ex.php (the link is a text-page)
(You call ajaxfunction2 with the following arguments: filename, queryString for PHP, string to show in target div during update, name of target div)
I don't get any error messages in the FF error console, and IE9 works. Is there any way I can make it work in FF too? I have just started trying FF4, but my impression is that it works less well than it did in FF3.
Thanks.
(I am on a trip and may not have the possibility to reply immediately, but I really appreciate suggestions and will reply in due course.)
EDIT:
I had better add this:
The AJAX call I make on unload only sends some data (how long the user stayed on the page) to the PHP/MySQL server.
I don't know what is happening here, but Firefox 4 has made notable changes to how unloading works: For example, if you do an alert() during a link click event, it will no longer freeze the page, but load the new location anyway. Maybe this is something similar.
However, in any browser you are never guaranteed that the Ajax call will finish if it is not synchronous; the request may or may not complete before the page has been closed. Whether this works will come down to chance and the user's network speed.
Try using a synchronous request first, as outlined here: How does jQuery's synchronous AJAX request work?
This will usually guarantee that the request completes. However, use it very sparingly: blocking behaviour at page unload can be very annoying for the user, and can even freeze the browser.
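A minimal sketch of what that might look like with jQuery, assuming the endpoint and payload are placeholders (async: false is exactly the blocking behaviour referred to above, so keep the request tiny):
var pageLoadedAt = new Date().getTime();
window.onunload = function(){
    $.ajax({
        type: 'POST',
        url: '/track-time.php',   // hypothetical endpoint
        async: false,             // block until the request has been sent
        data: { msOnPage: new Date().getTime() - pageLoadedAt }
    });
};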
I suggest using jQuery instead of keeping track of browser changes yourself.
Solution:
Find working sample here: http://jsfiddle.net/ezmilhouse/4PMcc/1/
Assuming that your internal links are relative, and your external links therefore start with 'http':
<a href="http://...">Leave ...</a>
<a href="/...">Stay ...</a>
You could hijack 'a' tags via jQuery events and ask the user to confirm leaving (in the case of external links). In the 'ok' case you kick off your 'onleave' ajax call (forced to be synchronous, async: false) and then redirect the user to the external link:
$('a').live('click', function(event){
// cache link
var link = $(this).attr('href');
// check if external link (assuming that internal links are relative)
if (link.substr(0,4) === "http") {
// prevent default a tag event
event.preventDefault();
// popup confirm message
var reply = confirm('Do you really want to leave?');
if (reply) {
var url = 'http://mydomain.com/ajax.php';
var data = {'foo': 'bar', 'fee':'bo'};
// kick off your 'onleave' ajax call
// forced to be synchronous
$.ajax({
type: "POST",
async: false,
url: url,
data: data,
success: function( data ) {
// ok case: leave page, cached link
window.location.href = link;
}
});
}
return false;
}
});
This is a very similar question to AJAX, Subdomains and the 200 OK response (and JavaScript Same Origin Policy - How does it apply to different subdomains?), but with a twist. I have a situation in which:
A domain (www.example.com)
Where the page at a subdomain (sd.example.com/cat/id)
Needs to make ajax-style requests to another subdomain (cdn.example.com)
In contrast to the aforementioned question, what I am requesting is images.
GET requests of images (using jQuery $.load())
This seemed to be working just fine, and because it was working, the same-origin policy didn't immediately occur to me when someone pointed out it was generating errors in Firebug.
Images ARE loading at localhost (Apache VirtualHost URL of test.sd.example.com/cat/id)
However, now that it has come to mind thanks to that question I linked, I am concerned that this will not work reliably in production.
Will this continue to work in a production environment -- and will it work reliably cross-browser?
Answer: No -- it only looked like it was working; it wasn't really
If not, how do I fix it? (I don't think I can JSONP images...can I?)
Answer: Continue setting src of image & wait to show until load event triggered.
If so, how do I stop the Firebug errors? If I can. (They're scaring fellow devs.)
Answer: Same as above -- get rid of step where actually doing GET request for image file.
Initial Code
function loadImage(imageUrl, placeTarget){
var i = new Image();
var img = $(i);
img.hide()
.load(imageUrl, function(e){
// console.log("loadImage: loaded");
placeTarget.attr("src", imageUrl);
return true;
})
.error(function(){
// error handling - do this part
// console.log("loadImage: error");
return false;
});
return;
} // loadImage
Why not insert the images into the page by creating image elements and setting the src? What could be simpler?
Edit: ... via JavaScript
I'm not sure this is exactly right, but in jQuery:
img = $('<img>');
img.load(fade_into_next); // bind the load handler before setting src
img.attr('src', 'http://somewhere.com/some_image.jpg');
$('#place_to_add').append(img);
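Applied to the loadImage function from the question, the same idea could look roughly like this: no extra GET for the image file, just bind the load/error handlers and then set src (a sketch only, keeping the original signature):
function loadImage(imageUrl, placeTarget){
    placeTarget
        .hide()
        .load(function(){
            // the image data has arrived, now it's safe to show it
            placeTarget.fadeIn();
        })
        .error(function(){
            // error handling goes here
        })
        .attr("src", imageUrl); // set src last, so the handlers are already bound
}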
Check this post: JavaScript Same Origin Policy - How does it apply to different subdomains?
How to "bookmark" page or content fetched using AJAX?
It looks like it can be easy if we just add the details to the "anchor" (the hash), and then use routing, whether in PHP code or in Ruby on Rails's routes.rb, to catch that part and show the content or page accordingly (the whole page or partial content)?
Then it can be very simple? It looks like that's how facebook does it. What are other good ways to do it?
Update: There is now the HTML5 History API (pushState, popState) which deprecates the HTML4 hashchange functionality. History.js provides cross-browser compatibility and an optional hashchange fallback for HTML4 browsers.
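A minimal History.js sketch, as documented in its README (loadContent here is a hypothetical content loader, and the URL is a placeholder):
// fires on pushState in HTML5 browsers, and on the hash fallback in HTML4 browsers
History.Adapter.bind(window, 'statechange', function(){
    var State = History.getState();
    loadContent(State.url); // hypothetical: fetch and render the content for this state
});
// change the state (and the address bar) without a full page reload
History.pushState({page: 1}, 'Page 1', '?page=1');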
To store the history of a page, the most popular and full featured/supported way is using hashchanges. This means that say you go from yoursite/page.html#page1 to yoursite/page.html#page2 you can track that change, and because we are using hashes it can be picked up by bookmarks and back and forward buttons.
You can find a great way to bind to hash changes using the jQuery History project
http://www.balupton.com/projects/jquery-history
There is also a full featured AJAX extension for it, allowing you to easily integrate Ajax requests to your states/hashes to transform your website into a full featured Web 2.0 Application:
http://www.balupton.com/projects/jquery-ajaxy
They both provide great documentation on their demo pages to explain what is happening and what is going on.
Here is an example of using jQuery History (as taken from the demo site):
// Bind a handler for ALL hash/state changes
$.History.bind(function(state){
// Update the current element to indicate which state we are now on
$current.text('Our current state is: ['+state+']');
// Update the page's title with our current state on the end
document.title = document_title + ' | ' + state;
});
// Bind a handler for state: apricots
$.History.bind('/apricots',function(state){
// Update Menu
updateMenu(state);
// Show apricots tab, hide the other tabs
$tabs.hide();
$apricots.stop(true,true).fadeIn(200);
});
And an example of jQuery Ajaxy (as taken from the demo site):
'page': {
selector: '.ajaxy-page',
matches: /^\/pages\/?/,
request: function(){
// Log what is happening
window.console.debug('$.Ajaxy.configure.Controllers.page.request', [this,arguments]);
// Adjust Menu
$menu.children('.active').removeClass('active');
// Hide Content
$content.stop(true,true).fadeOut(400);
// Return true
return true;
},
response: function(){
// Prepare
var Ajaxy = $.Ajaxy; var data = this.State.Response.data; var state = this.state;
// Log what is happening
window.console.debug('$.Ajaxy.configure.Controllers.page.response', [this,arguments], data, state);
// Adjust Menu
$menu.children(':has(a[href*="'+state+'"])').addClass('active').siblings('.active').removeClass('active');
// Show Content
var Action = this;
$content.html(data.content).fadeIn(400,function(){
Action.documentReady($content);
});
// Return true
return true;
}
}
And if you ever want to get the querystring params (so yoursite/page.html#page1?a.b=1&a.c=2) you can just use:
$.History.bind(function(state){
var params = state.queryStringToJSON(); // would give you back {a:{b:1,c:2}}
});
So check out those demo links to see them in action, and for all installation and usage details.
If you use jQuery, you can do that in a simple manner: just use the Ajaxify plugin. It can manage bookmarking of ajax pages and many other things.
Check this, something may help you:
How to change URL from javascript: http://doet.habrahabr.ru/blog/15736/
How to pack the app state into url: http://habrahabr.ru/blogs/javascript/92505/
An approach description: http://habrahabr.ru/blogs/webstandards/92300/
Note: all articles are in Russian, so either Google Translate them, or just review the code and guess the details.
Take a look at the Single Page Interface Manifesto
I tried many packages. The jQuery History plugin seems to be the most complete:
http://github.com/tkyk/jquery-history-plugin