I was playing with jQuery and async calls last night and found an unusual issue. I wanted to run multiple Ajax calls inside a loop. I wrote the code below (where rand.php just sleeps for a second and returns a random number). Somewhat surprisingly, it executes synchronously and takes 20 seconds or so to finish.
$(document).ready(function () {
    $([1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20]).each(function() {
        var number = this;
        $.get("rand.php", function(data) {
            $('#'+number).html(data);
        });
    });
});
The PHP code is as follows:
<?php
sleep(1);
echo rand();
?>
I was thinking this is clearly wrong, as the async calls should be non-blocking and return almost in parallel. After much playing around (assuming it was a server issue) I discovered that appending anything to the URL to make it look different made it work as expected. That is, it returned in 3 seconds or so (6 or so calls at a time).
$(document).ready(function () {
    $([1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20]).each(function() {
        var number = this;
        $.get("rand.php?"+number, function(data) {
            $('#'+number).html(data);
        });
    });
});
I don't suppose a jQuery/JavaScript guru can explain this behavior? Is it some browser limitation? Why is it that only when the URLs are different does it run as I would expect?
EDIT - Rather than reply, this was using Chrome (whatever the latest is) and Firefox 5/6. I did try it in IE, which did cache it, so I ignored that and focused on Chrome. Interestingly, the first version works as expected in IE9 on the first page load, but then just displays cached results when reloaded.
You are in IE, aren't you? Bad! Bad IE user! IE caches GET requests as if they were any other content. jQuery has a built-in option to address this:
$.ajaxSetup({ cache : false });
This adds a nifty cache-busting parameter to take care of it. But why add it in other browsers? So usually I do this:
if (!+"\v1") { // this expression is true only in old IE
    $.ajaxSetup({ cache : false });
}
This tests for IE and sets the option only in that browser.
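For reference, cache: false works by appending a throwaway "_={timestamp}" parameter to GET requests, which is exactly the make-every-URL-unique trick discovered in the question. A minimal hand-rolled sketch of the same idea ('#result' is a placeholder id):
// jQuery serializes the data object into the query string, so the
// timestamp makes each URL unique and defeats the cache.
$.get("rand.php", { _: new Date().getTime() }, function (data) {
    $('#result').html(data);
});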
Related
Got an issue with Safari loading old YouTube videos when the back button is clicked. I have tried adding onunload="" to the body tag (mentioned here: Preventing cache on back-button in Safari 5) but it doesn't work in this case.
Is there any way to prevent Safari loading from cache on a certain page?
Your problem is caused by the back-forward cache (bfcache). It is supposed to save the complete state of a page when the user navigates away, so that when the user navigates back with the back button the page can be loaded from cache very quickly. This is different from the normal cache, which only caches the HTML code.
When a page is loaded from the bfcache, the onload event won't be triggered. Instead you can check the persisted property of the pageshow event: it is set to false on the initial page load, and to true when the page is loaded from the bfcache.
A kludgy solution is to force a reload when the page is loaded from the bfcache.
window.onpageshow = function(event) {
    if (event.persisted) {
        window.location.reload();
    }
};
If you are using jQuery then do:
$(window).bind("pageshow", function(event) {
    if (event.originalEvent.persisted) {
        window.location.reload();
    }
});
All of those answers are a bit of a hack. In modern browsers (Safari) only the onpageshow solution works,
window.onpageshow = function (event) {
    if (event.persisted) {
        window.location.reload();
    }
};
but on slow devices you will sometimes see the previous cached view for a split second before it is reloaded. The proper way to deal with this problem is to set the Cache-Control header on the server response to the one below:
Cache-Control: no-cache, max-age=0, must-revalidate, no-store
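For example, in an Express-style Node server (an assumption on my part; any framework that can set response headers will do), that could look like:
// Hypothetical Express middleware; adapt to whatever server you use.
app.use(function (req, res, next) {
    // Send the Cache-Control header above with every response.
    res.set('Cache-Control', 'no-cache, max-age=0, must-revalidate, no-store');
    next();
});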
Yes, the Safari browser does not handle the back/forward cache the same way Firefox and Chrome do. Iframes in particular, such as Vimeo or YouTube videos, are cached aggressively, even when there is a new iframe.src.
I found three ways to handle this. Choose the best for your case.
Solutions tested on Firefox 53 and Safari 10.1
1. Detect whether the user used the back/forward button, then reload the whole page or reload only the cached iframes by replacing their src
if (!!window.performance && window.performance.navigation.type === 2) {
    // value 2 means "The page was accessed by navigating into the history"
    console.log('Reloading');
    //window.location.reload(); // reload whole page
    $('iframe').attr('src', function (i, val) { return val; }); // reload only iframes
}
2. Reload the whole page if the page is cached
window.onpageshow = function (event) {
    if (event.persisted) {
        window.location.reload();
    }
};
3. Remove the page from the history so users can't visit it again via the back/forward buttons
$(function () {
    // replace() does not keep the originating page in the session history,
    // so it clears the last entry in the history and redirects to the new URL
    document.location.replace("/Exercises#nocache");
});
You can use an anchor and watch the value of the document's location href.
Start off with http://acme.co/ and append something to the location, like '#b'.
Now your URL is http://acme.co/#b. When a person hits the back button, it goes back to http://acme.co; the interval check function sees that the hash tag we set is gone, clears the interval, and loads the referring URL with a timestamp appended to it.
There are some side-effects, but I'll leave you to figure those out ;)
<script>
document.location.hash = "#b";
var referrer = document.referrer;
// set up an interval to watch for the removal of the hash tag
var hashCheck = setInterval(function(){
    if (document.location.hash != "#b") {
        // clear the interval
        clearInterval(hashCheck);
        var ticks = new Date().getTime();
        // load the referring page with a timestamp at the end to avoid caching
        document.location.replace(referrer + '?' + ticks);
    }
}, 100);
</script>
This is untested but it should work with minimal tweaking.
The behavior is related to Safari's back/forward cache. You can learn about it in the relevant Apple documentation: http://web.archive.org/web/20070612072521/http://developer.apple.com/internet/safari/faq.html#anchor5
Apple's own fix suggestion is to add an empty iframe on your page:
<iframe style="height:0px;width:0px;visibility:hidden" src="about:blank">
this frame prevents back forward cache
</iframe>
(The previously accepted answer seems valid too; I just wanted to chip in documentation and another potential fix.)
I had the same issue using three different anchor links to the next page. When coming back from the next page and choosing a different anchor, the link did not change.
so I had
House 1
View House 2
View House 3
Changed to
House 1
View House 2
View House 3
I also used this for safety:
// JavaScript
window.onpageshow = function(event) {
    if (event.persisted) {
        window.location.reload();
    }
};
// jQuery
$(window).bind("pageshow", function(event) {
    if (event.originalEvent.persisted) {
        window.location.reload();
    }
});
None of the solutions I found online (unload, reload, and reload(true)) worked on their own. I hope this helps someone in the same situation.
First of all, insert a hidden field in your code:
<input id="reloadValue" type="hidden" name="reloadValue" value="" />
Then run this jQuery:
jQuery(document).ready(function()
{
    var d = new Date();
    d = d.getTime();
    // On a fresh load the hidden field is empty: stamp it and show the page.
    if (jQuery('#reloadValue').val().length == 0)
    {
        jQuery('#reloadValue').val(d);
        jQuery('body').show();
    }
    // If the browser restored a value into the field, the page came from
    // cache/history, so clear it and force a real reload.
    else
    {
        jQuery('#reloadValue').val('');
        location.reload();
    }
});
There are many ways to disable the bfcache. The easiest one is to set an 'unload' handler. I think it was a huge mistake to make 'unload' and 'beforeunload' handlers disable the bfcache, but that's what they did (if you want to have one of those handlers and still make the bfcache work, you can remove the beforeunload handler inside the beforeunload handler).
window.addEventListener('unload', function() {})
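And a minimal sketch of the parenthetical trick above (the handler name is mine): the handler still does its work, but unregisters itself so it no longer keeps the page out of the bfcache afterwards.
function onBeforeUnload(event) {
    // ... whatever work the handler needs to do ...
    // Remove the handler from inside itself so the bfcache stays usable.
    window.removeEventListener('beforeunload', onBeforeUnload);
}
window.addEventListener('beforeunload', onBeforeUnload);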
Read more here:
https://developer.mozilla.org/en-US/docs/Mozilla/Firefox/Releases/1.5/Using_Firefox_1.5_caching
I have a simple CasperJS test that submits a search form on my homepage. Then I assert that the title on the landing page is correct.
It works fine on my computer (OS X 10.9.2), but on my colleagues' laptops (a Win 7 and a Win 8) the test fails randomly because Casper "thinks" it is still on the search page.
casper.test.begin('Search', function(test) {
    casper.start("http://localhost:8080/site", function() {
        this.fill(searchForm, { query: goodQuery }, true);
    });
    casper.then(function() {
        // sometimes fails, says it's "My Project" main title
        test.assertTitle('Search Result', 'Search result title is ok');
    });
});
Introducing a casper.wait(3000) before checking the page title does not change the outcome. I've also tried replacing the then step with a waitForUrl, but it fails after 5 seconds, saying it is still on the current page.
Plenty of other tests work fine on all computers, but this is the only one with form submission.
Any hints on how to solve or properly work around this? I'd rather not simulate a click on the submit button (more coupling to the form internals) if possible (but it would be okay I guess).
Thanks
$ casperjs --version
1.1.0-beta3
$ phantomjs --version
1.9.7
EDIT: submitting the form and waitForUrl did not help. I found out that the test actually runs fine on its own, even on the Windows 7 machine. But when I run two tests:
01 search.js (the one described above)
02 menu.js (a simple one, merely containing assertExists)
'search.js' fails most of the time... and sometimes 'menu.js' fails instead! I suspect some mishandled concurrent access, although it consistently works on OSX. I must be doing something wrong. Both tests have the same structure:
casper.test.begin('Some test', function(test) {
    casper.start(someUrl, function() {
        // some test
    });
    casper.run(function() {
        test.done();
    });
});
Any clue?
Try:
casper.test.begin('Search', function(test) {
    casper.start("http://localhost:8080/site", function() {
        this.fill(searchForm, {
            query: goodQuery
        }, false);
        this.click("your selector for submit button");
    });
    casper.then(function() { // you should use waitForUrl/Selector/Text() instead
        // sometimes fails, says it's "My Project" main title
        test.assertTitle('Search Result', 'Search result title is ok');
    });
    casper.run(function() {
        this.test.comment('------ Tests over ----\n');
        test.done();
    });
});
It's better to submit the form by clicking. Passing true as the submit argument of fill() often doesn't work, so pass false and click the correct selector for the submit button instead.
You should wait for an item to appear on the following page. I would change your code to the following:
casper.test.begin('Search', function(test) {
    casper.start("http://localhost:8080/site", function() {
        this.fill(searchForm, { query: goodQuery }, true);
    });
    casper.waitForSelector('#someSelectorOnNextPage', function() {
        test.assertTitle('Search Result', 'Search result title is ok');
    });
});
I also experienced the same issue. Surprisingly, adding an empty then() handler fixes it in v1.1.0-beta3. I don't think this is expected behavior, though:
casper.test.begin('Search', function(test) {
    casper.start("http://localhost:8080/site", function() {
        this.fill(searchForm, { query: goodQuery }, true);
    });
    // Do nothing here, just call it as a placeholder.
    // Here http://localhost:8080/site sends us to the next endpoint.
    casper.then(function() {});
    // Now this is the final page we actually want to assert.
    casper.then(function() {
        test.assertTitle('Search Result', 'Search result title is ok');
    });
});
EDIT:
Although the question author says casper.waitForUrl() didn't work for them, it did work for me as an alternative solution.
What does look strange is that in verbose mode, whatever returns a 301 status code along with a Location header is reported as an HTTP 200 response by Casper.
EDIT 2:
Well, obviously it doesn't happen every time, but what I noticed is that Casper sometimes doubles the previous response (which is why I thought it mistakenly recognizes some specific HTTP codes as 200, and why the author's code behaved as if it stayed on the same page after form submission) and sometimes not.
waitForUrl() obviously fixes that, but there is still some underlying issue in Casper which worries me a bit; I hope to find some time to report it, with all the dumps, to the Casper issue tracker.
I admit I'm quite a noob with full-Ajax websites, so I'm surely making some mistakes.
The problem is this:
in http://lamovida.arabianessence.com
every page is loaded with an $.ajax call using this function:
function getAjaxPage() {
    $('a.ajaxc').click(function() {
        $("li.page_block").find(".wrapper").fadeOut(400).remove();
        hideSplash();
        var $thishref = $(this).attr('href'),
            $thisurl = $thishref.replace("#!/",""),
            $urlArr = $thisurl.split('-'),
            $urlOk = $urlArr[0],
            $dataOk = $urlArr[1];
        $.ajax({
            url : $urlOk + ".php",
            data : 'id='+$dataOk,
            success : function (data,stato) {
                $("#content").css({opacity:1}).fadeIn(400);
                $("li.page_block").html(data);
                $("li.page_block").css('visibility', 'visible');
                $("li.page_block").find(".wrapper").css({opacity:0}).show().animate({opacity:1},1000);
                var $whgt = $(".wrapper").height(),
                    $ctop = ( ( $(window).height() - $whgt ) /2 )-40;
                $("#content").stop().animate({height: $whgt+40, top: $ctop},1000);
                $("li.page_block").css('padding-top',20);
                $('.scrollable').jScrollPane();
                $('.slider>ul>li').jScrollPane();
                getAjaxPage();
            },
            error : function (richiesta,stato,errori) {
                alert(errori);
            }
        });
    });
}
Every time this function is called the content gets loaded more slowly, and after about 20 clicks things get really bad: the loading time grows and grows.
I tried to analyze the situation using Google Chrome's Timeline, and I saw that after each click the browser uses more memory. If I comment out the getAjaxPage(); row in the "success" section the situation gets better, but of course I lose all the internal navigation.
What could I do to avoid this problem?
Many thanks to all!
Every call to $('a.ajaxc').click() adds a new event handler, so every click causes more requests to be made. After the first click, every click will cause two requests; another click, another three requests; and so on.
Put the handler outside the function and you will have only one AJAX call per click:
$(document).ready(function() {
    $('a.ajaxc').click(getAjaxPage);
});
I also don't see the reason for calling getAjaxPage again from within the callback, so remove it as well to avoid the ever-growing pile of handlers and requests.
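Alternatively, if the Ajax response replaces the a.ajaxc links themselves, a delegated handler binds once on the document and still catches clicks on links added later. A sketch assuming jQuery 1.7+ (loadPage is a hypothetical helper wrapping the $.ajax logic above):
// One delegated binding catches clicks on current and future a.ajaxc
// links, so nothing needs re-binding after each content load.
$(document).on('click', 'a.ajaxc', function (e) {
    e.preventDefault();
    loadPage($(this).attr('href')); // hypothetical loader performing the $.ajax call
});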
I'm having a problem with, guess what, IE8. The following code, simplified for clarity, does not work at all:
alert('before get');
$.get(getActivityURL('ActionName',{
ts: new Date().getTime(), ...other params...}),
{cache:false;},
function (xml) {
alert("in get callback");
},'xml'); // End $.get()
alert('after get');
The getActivityURL() function outputs a valid URL with request parameters.
This works correctly in FF and Chrome. However, in IE8, this doesn't even get into the $.get() callback. I get the "before" and "after" alerts, but not the "in" alert and indeed, nothing happens and the request is NOT sent. I don't really know what to think here.
The response headers are "Content-Type: application/xml; charset=iso-8859-1", as confirmed in FF.
EDIT: $.post() doesn't work, either.
IE is infamous for caching. So you need to make sure you are not getting a cached result.
You can disable caching globally by setting the cache property to false with the ajaxSetup method:
$.ajaxSetup({
    cache: false
});
Or, if you want to eliminate the cached result for a specific Ajax call, append a unique number to the end of the URL. You may use the $.now() method to get a unique number:
$.get("someurl.php?" + $.now(), function(result) {
    // do something with result
});
The $.now() method returns a number representing the current time.
I'm not sure if it is the problem, but try removing the ";" in {cache:false;}.
IE doesn't like any additional stuff in object literals; e.g.
{a:a,b:b,c:c,} (note the trailing comma) will work in FF but not in IE.
I think there is a cache problem in IE.
So add one more parameter at the end, like "&mathRandom=" + Math.random();
because IE recognizes the request as the same as the previous one, it serves data from the cache instead of firing the request.
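Something like this, as a sketch (the URL and id parameter are placeholders):
// Append a random value so IE treats each GET as a brand-new request.
$.get("someurl.php?id=42&mathRandom=" + Math.random(), function (data) {
    // handle the fresh, non-cached response
});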
$J.get(getActivityURL('ActionName'
// End $.get()
Is this correct? I mean $J... Are you using more than one JS framework or something?
Have you tried:
$.ajax({
    url: getActivityURL('ActionName', {ts: new Date().getTime(), ...other params...}),
    data: data,
    success: function (xml) {
        alert("in get callback");
    },
    dataType: 'xml'
});
Just a guess
EDIT:
I found an interesting thread that might help you; check this out:
jQuery issue in Internet Explorer 8
I'm running into an issue with an async call to the server that only works one time, then it appears to become a synchronous call. Let me try to explain.
It's an ASP.NET MVC 2.0 site using Ajax. I'm using the Ajax.BeginForm helper, like so:
<% using (Ajax.BeginForm("Start", null,
    new { virtualMachineId = xyz },
    new AjaxOptions { UpdateTargetId = "VirtualMachineForm", OnBegin = "OnStartingVm" }
)) { %>
Then, while the machine is starting, I want to call back to the server and get an update every second. It works correctly the first time, then changes behavior. OnStartingVm looks something like this:
function OnStartingVm() {
    $('#StartingDiv').css('visibility', 'visible');
    $('#StartingDiv').show();
    var vmId = xyz;
    intervalId = setInterval(function () {
        updateStartingStatus(vmId);
    }, 1000);
}
function updateStartingStatus(vmId) {
    /* This part always runs */
    $.ajax({
        url: "/member/vm/getstartingstatus/" + vmId,
        dataType: 'json',
        async: true,
        success: function (data) {
            alert('This part runs every second on the first time only');
            if (data.status == "Running") {
                $('#StartingDiv').text(data.percentComplete);
            }
            else {
                $('#StartingDiv').css('visibility', 'hidden');
                $('#StartingDiv').hide();
                clearInterval(intervalId);
            }
        }
    });
}
Within the updateStartingStatus function, the first part runs every second, every time. However, the Ajax success callback fires every second on the first run only. The second time I click the start button, all of the requests queue up; after the start has completed, about 20 seconds later, I get a bunch of alert windows back to back. So I can tell that updateStartingStatus runs every second every time, but the Ajax call appears to become a synchronous call after the first time.
Refreshing the browser window doesn't help. I have to fully close it and open it again. The same occurs in IE and Chrome.
One more thing to note is that the updated div (VirtualMachineForm) contains most of the page, including the button being pressed. So it basically replaces the page from under itself. Not sure if that would cause any issues.
Additionally, if I debug in Visual Studio 2010, the call isn't made to the controller action when the issue occurs. So, it appears to be something client-side. I've ruled out any issues server-side.
I eventually figured it out. This post led to the answer.
It was session-state related: the server locked each request until the previous one completed. I didn't need to disable session state, but I had to avoid a session write from code.
That explains why a browser refresh didn't work and why I had to close and open the browser again.
Why don't you call the clearInterval function?