I have modal boxes on a page that open dynamic content in iframes.
Link 1 (fixed id)
Link 2 (fixed id)
...
But whatever I do, I can't prevent the iframe content from being cached in IE10 (other browsers seem to be fine).
I'm using HTML5, so meta tags are no help.
A cache.manifest messed up my page (and putting just NETWORK: * had no effect).
The PHP headers header("Cache-Control: no-cache"); header("Expires: -1"); also have no effect in IE10.
I couldn't correctly apply the JS/jQuery solutions I found on the web.
Is there any solution other than adding another dynamic GET parameter to the href?
I'm not sure if this will work in IE10, but a few years ago Firefox had a similar issue with caching iframe content, and a trick was to use JavaScript to force a reload like this:
var orig_src = iframe.src;
iframe.src = "blank.html"; # A blank document you'll need to create
setTimeout(function() { iframe.src = orig_src; }, 100);
or the simpler technique of
iframe.src = iframe.src;
... which also may or may not work in IE10.
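If the reload trick does work for you, here is one way to wire it into the modal links (a sketch only; the modal-link class, the modal-iframe id, and the data-iframe-src attribute are hypothetical names standing in for your own markup):
$("a.modal-link").on("click", function () {
    // Re-apply the blank-page trick on every open, so IE10 never gets a
    // chance to serve the iframe document from its cache.
    var iframe = document.getElementById("modal-iframe");
    var realSrc = this.getAttribute("data-iframe-src");
    iframe.src = "blank.html"; // the blank placeholder document from above
    setTimeout(function () { iframe.src = realSrc; }, 100);
});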
I'm currently facing an annoying (suspected) cache issue with fancybox 3.4.1:
<a data-fancybox="" data-type="ajax" data-src="src/views/forms/SpeiseplanCreateForm.php?ValidFrom=1560117600" href="javascript:;" id="fancybox-SpeiseplanCreateForm"><button id="fancybox-SpeiseplanCreateForm-button">Speiseplan bearbeiten</button></a>
has a GET parameter which is then evaluated by my PHP script. When I click on it for the first time, it works fine, but when I change the value of ValidFrom via JavaScript and try to open the box a second time, the GET parameter is still the same as in the first call. I assume it's related to fancybox caching the requests.
I have verified that the URL parameter is getting changed properly and I also tried disabling the cache by adding this to my header:
<script>
$(document).ready(function() {
    $("[data-fancybox]").fancybox({
        type : 'ajax',
        ajax : { cache: false }
    });
});
</script>
as suggested here: https://stackoverflow.com/a/17621281/4934937
Is there a way to disable the cache?
I just found a very dirty workaround for this. After editing the element in Firefox (Edit HTML), it worked, so I guessed it must be related to some weird caching (browser or fancybox, who knows).
The workaround is to create a new a element, remove the old a, and append the new one to the old a's parent.
// Assumes id and fancyboxSrc are already defined elsewhere;
// fancybox is the old <a> element and parent is its parent node.
let fancybox = document.getElementById("fancybox-" + id);
let parent = fancybox.parentNode;
let button = document.getElementById("fancybox-" + id + "-button");

let newElement = document.createElement("a");
newElement.setAttribute("id", "fancybox-" + id);
newElement.setAttribute("data-fancybox", "");
newElement.setAttribute("data-type", "ajax");
newElement.setAttribute("href", "javascript:;");
newElement.setAttribute("data-src", fancyboxSrc);
newElement.appendChild(button); // moves the button out of the old element
fancybox.outerHTML = "";        // removes the old, now-empty <a>
parent.appendChild(newElement);
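An alternative that may avoid rebuilding the DOM node entirely is a sketch along these lines, assuming fancybox 3's programmatic $.fancybox.open API, a validFrom variable holding the value you update via JavaScript, and that the data-fancybox attribute is removed from the link so only this handler opens the box:
$("#fancybox-SpeiseplanCreateForm").on("click", function () {
    // Build the URL fresh on every click, with a throwaway cache-buster,
    // so neither fancybox nor the browser can reuse the first request.
    $.fancybox.open({
        type : "ajax",
        src  : "src/views/forms/SpeiseplanCreateForm.php"
               + "?ValidFrom=" + validFrom
               + "&nocache=" + Date.now()
    });
});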
This is a very similar question to AJAX, Subdomains and the 200 OK response (and JavaScript Same Origin Policy - How does it apply to different subdomains?), but with a twist. I have a situation in which:
A domain (www.example.com)
Where the page at a subdomain (sd.example.com/cat/id)
Needs to make ajax-style requests to another subdomain (cdn.example.com)
In contrast to the aforementioned question, what I am requesting is images.
GET requests of images (using jQuery $.load())
This seems to be working just fine. Because it was working, the same-origin policy didn't immediately occur to me when someone pointed out that it was generating errors in Firebug.
Images ARE loading at localhost (an Apache VirtualHost URL of test.sd.example.com/cat/id).
However, now that it has come to mind thanks to that question I linked, I am concerned that this will not work reliably in production.
Will this continue to work in a production environment -- and will it work reliably cross-browser?
Answer: No -- it only looked like it was working; it wasn't really
If not, how do I fix it? (I don't think I can JSONP images...can I?)
Answer: Keep setting the src of the image, and wait to show it until the load event has fired.
If so, how do I stop the Firebug errors? If I can. (They're scaring fellow devs.)
Answer: Same as above -- get rid of the step where you actually do a GET request for the image file.
Initial Code
function loadImage(imageUrl, placeTarget) {
    var i = new Image();
    var img = $(i);
    img.hide()
        // NOTE: $().load(url, ...) performs an AJAX GET for the image file --
        // this is the step the answers above say to get rid of.
        .load(imageUrl, function(e) {
            // console.log("loadImage: loaded");
            placeTarget.attr("src", imageUrl);
            return true;
        })
        .error(function() {
            // error handling - do this part
            // console.log("loadImage: error");
            return false;
        });
    return;
} // loadImage
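Per the answers above, a corrected version skips the extra GET and simply sets the src, revealing the image once its load event fires (a sketch; placeTarget is assumed to be the jQuery-wrapped <img> being populated):
function loadImage(imageUrl, placeTarget) {
    placeTarget
        .hide()
        .one("load", function () {
            $(this).fadeIn(); // reveal only after the image has really loaded
        })
        .bind("error", function () {
            // error handling - do this part
        })
        .attr("src", imageUrl); // handlers are bound first, then src is set
}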
Why not insert the images into the page by creating image elements and setting the src? What could be simpler?
edit: ... via javascript
I'm not sure this is exactly right, but in jQuery:
var img = $('<img>');
img.bind('load', fade_into_next); // bind before setting src so the load event isn't missed
img.attr('src', 'http://somewhere.com/some_image.jpg');
$('#place_to_add').append(img);
Check this post: JavaScript Same Origin Policy - How does it apply to different subdomains?
This is probably going to get a resounding no, but I am wondering: is it possible to have the URL change dynamically using hashing, without invoking an HTTP request from the browser?
My client is keen on using AJAX for main navigation. This is fine when the end user goes to the front page first, but when they want to use deep linking, despite it working, it forces an extra load as the page loads the front page and then invokes the AJAX from the hash.
UPDATE: Given that what I want to avoid is the page reload (because it looks bad), could I stem the reload by catching the hash with PHP before the headers are sent and redirecting before the page loads? That way only one page loads, and the redirect is all but invisible to the user. I'm not sure how to do this, but it seems like it should be possible?
Yes, this is possible. I often do this to store state in the hash part of the URL. The result is that the page doesn't reload, but if the user does reload, they're taken to the right page.
Using this method, the URL will look like: "/index#page=home" or "/index#page=about"
You'll need to write a JavaScript function that handles navigation, and you'll need a containing div that gets rewritten with the contents fetched from AJAX.
<a href="#" onclick="link('home'); return false;">Home</a>
<a href="#" onclick="link('about'); return false;">About</a>
<a href="#" onclick="link('questions'); return false;">Questions</a>
<div id="content"></div>
<script type="text/javascript">
function link(page) {
    location.hash = "page=" + page;
    loadPage(page);
}

// NOTE: This is using MooTools. Use the AJAX method in whatever
// JavaScript framework you're using.
function loadPage(page) {
    new Request.HTML({
        url: "/ajax/" + page + ".html",
        onSuccess: function(tree, elements, html) {
            document.id('content').set('html', html);
        }
    }).get();
}
</script>
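For reference, if you're using jQuery rather than MooTools, the equivalent loadPage can lean on jQuery's .load() (a sketch under that assumption):
// jQuery version of loadPage: fetch the fragment and drop it into #content.
function loadPage(page) {
    $("#content").load("/ajax/" + page + ".html");
}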
Now, you'll also need to have something that checks the hash on page load to load the right content initially. Again, this is using MooTools, but use whatever onLoad method your JavaScript framework provides.
<script type="text/javascript">
document.addEvent('domready', function() {
    var parts = location.hash.split('=');
    if (parts.length > 1) {
        loadPage(parts[1]);
    }
});
</script>
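If you also want the back/forward buttons to update the content, you can listen for hash changes as well (hedged: window.onhashchange requires IE8+ or another reasonably modern browser; older ones would need a polling fallback):
<script type="text/javascript">
// Re-run the loader whenever the hash changes, e.g. via back/forward.
window.onhashchange = function() {
    var parts = location.hash.split('=');
    if (parts.length > 1) {
        loadPage(parts[1]);
    }
};
</script>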
OK, the problem is that opening an AJAX link of the form http://example.com/#xyz results in a full page being downloaded to the browser, and then the AJAX-altered content is changed once the page has loaded and checked the hash part of its URL. The user has a disconcerting experience.
You can hugely improve this by making a page that just contains the static elements - menus, etc. - and a loading GIF in the content area. This page checks its URL upon loading and dynamically fetches the content specified by the hash part. The page can have any URL you want; we'll use http://example.com/a. Links to this page (http://example.com/a#xyz) now provide a good user experience for users with scripting enabled.
However, new users won't come to the site by fetching http://example.com/a; they'll fetch http://example.com. This is fine - serve the full page, including the home page content and links that don't require scripting to work (e.g., http://example.com/xyz). A script run on loading this page should alter the href of AJAXable links to their AJAX form (http://example.com/a#xyz); thus the first link a user clicks on will result in a full page load but subsequent ones won't.
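That rewriting script might look something like this (a sketch; the ajaxable class is a hypothetical marker on links that have AJAX equivalents):
// Rewrite plain links like /xyz into their AJAX form /a#xyz on page load.
window.onload = function () {
    var links = document.getElementsByClassName("ajaxable");
    for (var i = 0; i < links.length; i++) {
        var path = links[i].pathname.replace(/^\//, ""); // e.g. "xyz"
        links[i].href = "/a#" + path;
    }
};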
The only remaining problem is if a no-script user gets sent an AJAX link. You can add a noscript block to the AJAX page that contains a message explaining the problem and provides a link back to the homepage; you could include instructions on how to enable scripting or even how to modify the link by removing the a# and pressing Enter.
It's not a great answer, but you can offer a different link in the page itself; e.g., if the address bar shows /#xyz you include a link to /xyz somewhere in the page. You could also add a link or button that uses script to bookmark the page, which would again use the non-AJAX form of the link.
Like so many lost souls before me, I'm floundering in the snake pit that is Ajax form submission and IE browser caching.
I'm trying to write a simple script using the jQuery Form Plugin to Ajaxify Wordpress comments. It's working fine in Firefox, Chrome, Safari, et. al., but in IE, the response text is cached with the result that Ajax is pulling in the wrong comment.
jQuery(this).ajaxSubmit({
    success: function(data) {
        var response = $("<ol>" + data + "</ol>");
        response.find('.commentlist li:last').hide().appendTo(jQuery('.commentlist')).slideDown('slow');
    }
});
ajaxSubmit sends the comment to wp-comments-post.php, which inelegantly spits back the entire page as a response. So, despite the fact that it's ugly as toads, I'm sticking the response text in a variable, using :last to isolate the most recent comment, and sliding it down in its place.
IE, however, is returning the cached version of the page, which doesn't include the new comment. So ".commentlist li:last" selects the previous comment, a duplicate of which then uselessly slides down beneath the original.
I've tried setting "cache: false" in the ajaxSubmit options, but it has no effect. I've tried setting a url option and tacking on a random number or timestamp, but it winds up being attached to the POST that submits the comment to the server rather than the GET that returns the response, and so has no effect. I'm not sure what else to try. Everything works fine in IE if I turn off browser caching, but that's obviously not something I can expect anyone viewing the page to do.
Any help will be hugely appreciated. Thanks in advance!
EDIT WITH A PROGRESS REPORT: A couple of people have suggested using PHP headers to prevent caching, and this does indeed work. The trouble is that wp-comments-post is spitting back the entire page when a new comment is submitted, and the only way I can see to add headers is to put them in the Wordpress post template, which disables caching on all posts at all times--not quite the behavior I'm looking for.
Is there a way to set a php conditional--"if is_ajax" or something like that--that would keep the headers from being applied during regular pageloads, but plug them in if the page was called by an Ajax GET?
jQuery.ajaxSubmit() takes any of the options for the standard jQuery.ajax(). You can thus use the standard cache: false option to turn off caching:
jQuery(this).ajaxSubmit({
    cache: false,
    success: function(data) {
        var response = $("<ol>" + data + "</ol>");
        response.find('.commentlist li:last').hide().appendTo(jQuery('.commentlist')).slideDown('slow');
    }
});
The way I have been doing this is by adding a rand=new Date().getTime() parameter to the end of the URL:
if (url.indexOf("?") !== -1)
    url = url + "&rand=" + new Date().getTime();
else
    url = url + "?rand=" + new Date().getTime();
The snippet above appends rand=<time> to the URL (the address of the .php). If you have already supplied GET parameters, it adds &rand=<time>; otherwise it adds ?rand=<time>.
The browser keeps caching, but since every URL is unique, the cached pages won't overlap.
You could also use PHP's header() to disable caching by setting Cache-Control: and Expires:.
Put this at the beginning of your PHP:
header("Cache-Control: no-cache, must-revalidate");
header("Expires: Sat, 26 Jul 1997 05:00:00 GMT");
This should help. If it doesn't, try putting a random number in the filename as well as sending the headers.
The POST is redirected to a GET request, and you would have to send some info to the GET page to control whether or not it should be cached.
This will prevent caching globally:
$(document).ready(function() {
$.ajaxSetup({ cache: false });
});
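For what it's worth, cache: false works by making jQuery append a throwaway _=<timestamp> parameter to every GET URL, so it's the same cache-busting idea applied automatically (the URL below is just an example):
$.ajaxSetup({ cache: false });

// Each GET now goes out with a unique _ parameter, for example
// /latest-comments.html?_=1399453012345, so IE cannot serve a stale copy.
$.get("/latest-comments.html", function (html) {
    // always a fresh response
});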
I am using an .ashx handler to retrieve an image, and I place the image inside an AJAX UpdatePanel. It retrieves the image when a new image is added to the form, but when we change the image it is not updated; it doesn't even call the .ashx file. When I refresh the browser, though, it works properly.
Sounds like a caching issue. Try adding some of the lines found here to your .ashx file and it should hopefully force the browser to re-request the image. (I know that the link is for ASP rather than ASP.NET, but things like Response.Expires = -1 should work.)
Alternatively, can you change the path to the image in the UpdatePanel? If you just add a random parameter onto the end of it, the browser will treat it as a fresh request. (We use the current date/time as a parameter when we're doing this; the parameter is ignored by ASP.NET unless you explicitly reference it.)
Do something like this:
var sPath = "../../handlers/ProcessSignature.ashx?type=View&UserID=" + userID + "&d=" + (((1 + Math.random()) * 0x10000) | 0).toString(16).substring(1);
That puts a 4-character alphanumeric string at the end of your query string. It's not needed, but it will force browsers to pick up the latest version of that image because the URL is different.
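Then point the image at that path (assuming, hypothetically, an <img id="signatureImage"> inside the UpdatePanel):
// The new query string makes the browser treat this as a brand-new URL.
document.getElementById("signatureImage").src = sPath;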
I tried the above and some browsers ignore the headers. I threw all of those in, and Chrome/Firefox 3 didn't try to update.
IE7 worked sometimes.
IE6 just twiddled its thumbs and asked why it was still in existence.
Changing the path above will fix it in all browsers.