Prevent ajax request caching in IE during POST method - ajax

How to prevent IE from caching the request sent to the server?
I tried setting "Cache-Control: no-cache" in the HTTP response object, but IE is still caching my request data.
Please find my project details below:
In my application I am sending a login request to the server. After I log in, if I take a memory dump using the WinHex tool, I am able to see the password details in memory.
I am clearing the dialog reference also, but the request data is still getting cached.
Please suggest me some workaround for this.

You could try to add a parameter to your URL with a random value; this will prevent the URL from always being the same.
Example:
Normal URL:
www.test.com/test.php
Fake different URL:
www.test.com/test.php?_dc=12353somerandomval
Make sure the _dc parameter always has a different value; you can (for example) use the JavaScript Date object for this (it returns the current time in milliseconds, which will virtually always be different):
params: {
    _dc: new Date().getTime()
}

In a project I did a while back I had the exact same issue. I searched around and saw a few suggestions to add a timestamp to the request, which works too, but this was the most elegant way that worked for me:
$(document).ready(function () {
    $.ajaxSetup({
        cache: false
    });
});

Related

Disable cache in ExtLib REST control (which uses dojox.data.JsonRestStore)

In my XPage I have a xe:djxDataGrid (dojox.grid.datagrid) which uses xe:restService which seems to use dojox.data.JsonRestStore.
Everything works fine without a proxy, but my client accesses the application via a proxy because of corporate policy. After a user updates data in the DataGrid, it shows old values when accessed behind the proxy.
When the REST Control/JsonRestStore sends an ajax GET request to fetch data, there is no Cache-Control parameter in the request headers, and Domino does not place an Expires parameter in the response headers. I believe that's why the response to the GET request gets cached by the proxy.
We have tried disabling the cache in browsers, but that does not help, which indicates the proxy is caching the requests.
I believe this could be solved either by:
Setting Cache-Control parameter in request headers OR
Setting Expires parameter in response headers
But I haven't found a way to set either of these. For the XPage itself Domino sets the Expires:-1 response header, but not for the ajax GET request, which is:
/mypage.xsp/?$$viewid=!ddrg6o7q1z!&$$axtarget=view:_id1:_id2:callback1:restService1
This returns the JSON data to JsonRestStore and gets cached by the proxy.
One option is to try to get an exception added to the proxy so requests to this site would bypass the proxy cache. But exceptions are generally not easy to get through.
Any ideas? Thanks.
Update1
My colleague suggested that I could intercept the xhr GET requests made by dojox.data.JsonRestStore and add a time parameter to the URL to prevent caching. Here is my question about that:
Prevent cache in every Dojo xhr request on page
Update2
Sven Hasselbach has a great solution for preventing caching for all xhrs:
http://openntf.org/XSnippets.nsf/snippet.xsp?id=cache-prevention-for-dojo-xhr-requests
It seems to work perfectly: the &dojo.preventCache= parameter is added to the URLs, and the requests seem to return correct JSON with this parameter too. But the DataGrid stops working when I use that code: every xhr causes an error.
Tried with Firefox and Chrome. The first page of data still loads, because the xhr interception is not yet in place, but the subsequent pages show only "..." in each cell.
The solution is Sven Hasselbach's code in the comment section of Julian Buss's blog, which needs to be slightly modified.
I changed xhrPost to xhrGet and did not place the code inside dojo.addOnLoad. When placed there, it was not effective for the first XHR made by the DataGrid/Store.
I also removed the headers modification because it overrides existing headers. When the REST control requests data from the server with xhrGet, the URL is always the same, and the rows requested are specified in an HTTP header like this:
Range: items=0-9
This (and other) headers disappear when the original code is used. To just add headers we would have to take the existing headers from args and append to them. I didn't see a need for that, because it should be enough to add the parameter to the URL. Here is the extremely simple code I'm using:
if (!dojo._xhrGet) {
    // keep a reference to the original xhrGet
    dojo._xhrGet = dojo.xhrGet;
}
dojo.xhrGet = function (args) {
    // force dojo to append its timestamp parameter to every GET URL
    args.preventCache = true;
    return dojo._xhrGet(args);
};
Now I'm getting all rows, and all XHR GET URLs have the &dojo.preventCache= parameter, which is exactly what I wanted. Next we'll test in the customer's environment to see if this solves their problem.
Update
As Julian points out in his blog I could also use a Web Site Rule to set Expires or cache-control http response headers.
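For reference, a response header combination along these lines (the exact values here are illustrative, not taken from the original setup) tells both browsers and intermediate proxies not to reuse a cached copy:

Cache-Control: no-cache, no-store, must-revalidate
Expires: -1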
Update
The customer reports it's working now for them!

What does status=canceled for a resource mean in Chrome Developer Tools?

What would cause a page to be canceled? I have a screenshot of the Chrome Developer Tools.
This happens often but not every time. It seems like once some other resources are cached, a page refresh will load the LeftPane.aspx. And what's really odd is this only happens in Google Chrome, not Internet Explorer 8. Any ideas why Chrome would cancel a request?
We fought a similar problem where Chrome was canceling requests to load things within frames or iframes, but only intermittently and it seemed dependent on the computer and/or the speed of the internet connection.
This information is a few months out of date, but I built Chromium from scratch, dug through the source to find all the places where requests could get cancelled, and slapped breakpoints on all of them to debug. From memory, these are the only places where Chrome will cancel a request:
The DOM element that caused the request to be made was deleted (e.g. an IMG is being loaded, but before the load finished, you deleted the IMG node)
You did something that made loading the data unnecessary (e.g. you started loading an iframe, then changed the src or overwrote its contents)
There are lots of requests going to the same server, and a network problem on earlier requests showed that subsequent requests weren't going to work (DNS lookup error, an earlier identical request resulted in e.g. an HTTP 400 error code, etc.)
In our case we finally traced it down to one frame trying to append HTML to another frame, that sometimes happened before the destination frame even loaded. Once you touch the contents of an iframe, it can no longer load the resource into it (how would it know where to put it?) so it cancels the request.
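As a minimal illustration of the second case above (the URLs are hypothetical), replacing an iframe's src mid-load cancels the first request:

var frame = document.createElement('iframe');
frame.src = '/slow-page.html'; // first request starts
document.body.appendChild(frame);
frame.src = '/other-page.html'; // first request shows as (canceled)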
status=canceled may also happen on ajax requests bound to JavaScript events:
<script>
$("#call_ajax").on("click", function(event){
    $.ajax({
        ...
    });
});
</script>
<button id="call_ajax">call</button>
The event successfully sends the request, but it is then canceled (though processed by the server). The reason is that button elements submit their enclosing form on click, no matter whether you also make an ajax request in the same click event.
To prevent the request from being canceled, event.preventDefault(); has to be called:
<script>
$("#call_ajax").on("click", function(event){
    event.preventDefault();
    $.ajax({
        ...
    });
});
</script>
NB: Make sure you don't have any wrapping form elements.
I had a similar issue where my button with onclick={} was wrapped in a form element. When clicking the button the form is also submitted, and that messed it all up...
Another thing to look out for could be the AdBlock extension, or extensions in general.
But "a lot" of people have AdBlock....
To rule out extensions, open a new tab in incognito, making sure that "allow in incognito" is off for the extension(s) you want to test.
In my case, I found that it was a jQuery global timeout setting: a jQuery plugin had set the global timeout to 500 ms, so whenever a request exceeded 500 ms, Chrome showed it as canceled.
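A global setting like this (the 500 ms value mirrors the plugin in question) reproduces the behavior: jQuery aborts any request that runs longer than the timeout, and DevTools reports the aborted request as canceled:

// any $.ajax request slower than 500 ms is aborted,
// which Chrome displays as (canceled)
$.ajaxSetup({
    timeout: 500
});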
You might want to check the "X-Frame-Options" header. If it's set to SAMEORIGIN or DENY, then the iframe insertion will be canceled by Chrome (and other browsers) per the spec.
Also, note that some browsers support the ALLOW-FROM setting, but Chrome does not.
To resolve this, you will need to remove the "X-Frame-Options" header. This could leave you open to clickjacking attacks, so you will need to decide what the risks are and how to mitigate them.
Here's what happened to me: the server was returning a malformed "Location" header for a 302 redirect.
Chrome failed to tell me this, of course. I opened the page in Firefox, and immediately discovered the problem.
Nice to have multiple tools :)
Another place we've encountered the (canceled) status is in a particular TLS certificate misconfiguration. If a site such as https://www.example.com is misconfigured such that its certificate does not include the www. but is valid for https://example.com, Chrome will cancel the request and automatically redirect to the latter site. This is not the case for Firefox.
Currently valid example: https://www.pthree.org/
A cancelled request happened to me when redirecting between secure and non-secure pages on separate domains within an iframe. The redirected request showed in dev tools as a "cancelled" request.
I have a page with an iframe containing a form hosted by my payment gateway. When the form in the iframe was submitted, the payment gateway would redirect back to a URL on my server. The redirect recently stopped working and ended up as a "cancelled" request instead.
It seems that Chrome (I was using Windows 7 Chrome 30.0.1599.101) no longer allowed a redirect within the iframe to go to a non-secure page on a separate domain. To fix it, I just made sure any redirected requests in the iframe were always sent to secure URLs.
When I created a simpler test page with only an iframe, there was a warning in the console (which I had previously missed, or maybe it didn't show up):
[Blocked] The page at https://mydomain.com/Payment/EnterDetails ran insecure content from http://mydomain.com/Payment/Success
The redirect turned into a cancelled request in Chrome on PC, Mac and Android. I don't know if it is specific to my website setup (SagePay Low Profile) or if something has changed in Chrome.
Chrome Version 33.0.1750.154 m consistently cancels image loads if I am using the Mobile Emulation pointed at my localhost, specifically with User Agent spoofing on (vs. just Screen settings).
When I turn User Agent spoofing off, image requests aren't canceled and I see the images.
I still don't understand why; in the former case, where the request is canceled, the Request Headers (CAUTION: Provisional headers are shown) have only:
Accept
Cache-Control
Pragma
Referer
User-Agent
In the latter case, all of those plus others like:
Cookie
Connection
Host
Accept-Encoding
Accept-Language
Shrug
I got this error in Chrome when I redirected via JavaScript:
<script>
    window.location.href = "devhost:88/somepage";
</script>
As you see I forgot the 'http://'. After I added it, it worked.
Here is another case of a request being canceled by Chrome, which I just encountered and which is not covered by any of the answers above.
In a nutshell
A self-signed certificate not being trusted on my Android phone.
Details
We are in the development/debug phase. The URL points to a self-signed host. The code is like:
location.href = 'https://some.host.com/some/path'
Chrome just canceled the request silently, leaving no clue for a web-development newbie like myself to fix the issue. Once I downloaded and installed the certificate on the Android phone, the issue was gone.
If you use axios, increasing the timeout can help:
// change the timeout delay on an axios instance:
const instance = axios.create();
instance.defaults.timeout = 2500; // milliseconds
https://github.com/axios/axios#config-order-of-precedence
For my case, I had an anchor with a click event like:
<a href="" onclick="somemethod($index, hour, $event)">
Inside the click event I had a network call, and Chrome was cancelling the request. An anchor with an empty href reloads the page, and at the same time its click event makes a network call, which gets cancelled. When I replaced the href with a void expression like:
<a href="javascript:void(0)" onclick="somemethod($index, hour, $event)">
the problem went away!
If you make use of Observable-based HTTP requests like the ones built into Angular (2+), the HTTP request can be canceled when the observable is canceled (a common thing when you're using the RxJS 6 switchMap operator to combine streams). In most cases it's enough to use the mergeMap operator instead, if you want the request to complete.
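A minimal sketch of the difference (assumes an Angular service with an injected HttpClient named http; the searchTerms$ stream and the endpoint are hypothetical):

import { switchMap, mergeMap } from 'rxjs/operators';

// switchMap unsubscribes from the previous inner observable, so an
// in-flight HTTP request shows as (canceled) in DevTools:
const latestOnly$ = searchTerms$.pipe(
    switchMap(term => http.get('/api/search?q=' + term))
);

// mergeMap keeps every subscription alive, so each request completes:
const allRequests$ = searchTerms$.pipe(
    mergeMap(term => http.get('/api/search?q=' + term))
);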
I had faced the same issue; somewhere deep in our code we had this pseudocode:
create an iframe
onload of iframe submit a form
After 2 seconds, remove the iframe
Thus, when the server took more than 2 seconds to respond, the iframe to which the server was writing the response was removed, but the response was still being written; there was no iframe to write to, so Chrome cancelled the request. To avoid this, I made sure that the iframe is removed only after the response is over; alternatively, you can change the target to "_blank".
Thus one of the reasons is:
when the resource (the iframe in my case) that you are writing something into is removed or deleted before you stop writing to it, the request will be cancelled
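A rough sketch of the fix (the form and element names are hypothetical): remove the iframe from its load event instead of on a fixed timer, so it outlives the response:

var iframe = document.createElement('iframe');
iframe.name = 'hiddenTarget';
iframe.style.display = 'none';
document.body.appendChild(iframe);

var form = document.getElementById('uploadForm');
form.target = 'hiddenTarget';

iframe.onload = function () {
    // fires once the server's response has been written into the iframe
    // (some browsers also fire an initial load for about:blank, which may
    // need to be filtered out)
    document.body.removeChild(iframe);
};

form.submit();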
I embed all font types (woff, woff2, ttf) when I embed a web font in a style sheet. Recently I noticed that Chrome cancels the requests to ttf and woff when woff2 is present. I'm using Chrome version 66.0.3359.181 right now, but I am not sure when Chrome started canceling the extra font types.
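This matches how @font-face source lists work: the browser fetches only the first source whose format it supports, so with a rule like the one below (file names are hypothetical) a woff2-capable Chrome drops the woff and ttf requests:

@font-face {
    font-family: 'MyFont';
    src: url('myfont.woff2') format('woff2'),
         url('myfont.woff') format('woff'),
         url('myfont.ttf') format('truetype');
}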
We had this problem with a <button> tag in the form that was supposed to send an ajax request from JS. But this request was canceled, because the browser submits the form automatically on any click on a button inside the form.
So if you really want to use a button instead of a regular div or span on the page, and you want to send the form through JS, you should set up a listener with the preventDefault function.
e.g.
$('button').on('click', function(e){
    e.preventDefault();
    // do ajax
    $.ajax({
        ...
    });
});
I had the exact same thing with two CSS files that were stored in another folder outside my main css folder. I'm using Expression Engine and found that the issue was in the rules in my htaccess file. I just added the folder to one of my conditions and it fixed it. Here's an example:
RewriteCond %{REQUEST_URI} !(images|css|js|new_folder|favicon.ico)
So it might be worth checking your htaccess file for any potential conflicts.
The same thing happened to me when loading a .js file with $.ajax and then making an ajax request inside it; what I did was call it normally instead.
In my case, the code to show the e-mail client window caused Chrome to stop loading images:
document.location.href = mailToLink;
Moving it to $(window).load(function () {...}) instead of $(function () {...}) helped.
In case this helps anybody: I came across the canceled status when I left out the return false; in the form submit handler. This caused the ajax send to be immediately followed by the submit action, which overwrote the current page. The code is shown below, with the important return false at the end.
$('form').submit(function () {
    $.validator.unobtrusive.parse($('form'));
    // serialize() returns a string, so append the token to it rather than
    // setting a property on the string (which silently does nothing)
    var data = $('form').serialize();
    data += '&__RequestVerificationToken=' +
        encodeURIComponent($('input[name=__RequestVerificationToken]').val());
    if ($('form').valid()) {
        $.ajax({
            url: this.action,
            type: 'POST',
            data: data,
            success: submitSuccess,
            error: submitFailed // $.ajax uses "error", not "fail"
        });
    }
    return false; // needed to stop default form submit action
});
Hope that helps someone.
For anyone coming from LoopbackJS and attempting to use the custom stream method like the one provided in their chart example: I was getting this error using a PersistedModel; switching to a basic Model fixed my issue of the EventSource status cancelling out.
Again, this is specifically for the Loopback API. Since this is a top answer and top on Google, I figured I'd throw this into the mix of answers.
For me the 'canceled' status was because the file did not exist. Strangely, Chrome does not show a 404.
It was as simple as an incorrect path for me. I would suggest the first step in debugging would be to see if you can load the file independently of ajax etc.
The requests might have been blocked by a tracking protection plugin.
It happened to me when loading 300 images as background images. I'm guessing once the first one timed out, it cancelled all the rest, or it hit the max concurrent request limit. I need to implement loading them 5 at a time.
One of the reasons could be that XMLHttpRequest.abort() was called somewhere in the code; in this case, the request will have the canceled status in the Chrome Developer Tools Network tab.
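A minimal reproduction (the endpoint is hypothetical):

var xhr = new XMLHttpRequest();
xhr.open('GET', '/slow-endpoint');
xhr.send();
xhr.abort(); // the request now shows the canceled status in the Network tab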
In my case, it started appearing after the Chrome 76 update.
Due to an issue in my JS code, window.location was being updated multiple times, which resulted in canceling the previous request.
Although the issue was present before, Chrome started cancelling the request after the update to version 76.
I had the same issue when updating a record. Inside the save() I was prepping the raw data taken from the form to match the database format (doing a lot of mapping of enum values, etc.), and this intermittently canceled the PUT request. I resolved it by taking the data prepping out of save() and creating a dedicated dataPrep() method for it. I made dataPrep async and awaited all the memory-intensive data conversion, then returned the prepped data to the save() method to use in the HTTP PUT client. I made sure to await dataPrep() before calling the PUT method:
const dataToUpdate = await dataPrep();
http.put(apiUrl, dataToUpdate);
This solved the intermittent canceling of requests.

Should I be using POST or GET when retrieving JSON data into jqGrid in my ASP.NET MVC application?

I am using jqGrid in my ASP.NET MVC application. Currently I have mtype: 'POST', like this:
jQuery("#myGrid").jqGrid({
mtype: 'POST',
toppager: true,
footerrow: haveFooter,
userDataOnFooter: haveFooter,
But I was reading this article, and I see this paragraph:
Browsers can cache images, JavaScript, CSS files on a user's hard
drive, and it can also cache XML HTTP calls if the call is a HTTP GET.
The cache is based on the URL. If it's the same URL, and it's cached
on the computer, then the response is loaded from the cache, not from
the server when it is requested again. Basically, the browser can
cache any HTTP GET call and return cached data based on the URL. If
you make an XML HTTP call as HTTP GET and the server returns some
special header which informs the browser to cache the response, on
future calls, the response will be immediately returned from the cache
and thus saves the delay of network roundtrip and download time.
Given this is the case, should I switch my jqGrid mtype all to use "GET" instead of "POST"? (It mentions XML, not JSON.) If the answer is yes, then what would be a situation where I would ever want to use POST for the jqGrid mtype, as it seems to do the same thing without this caching benefit?
The problem which you describe could exist in Internet Explorer, but it will not exist in jqGrid if you use the default options.
If you look at the full URL which will be used you will see parameters like
nd=1339350870256
It has the same meaning as cache: false of jQuery.ajax; jqGrid adds the current timestamp to the URL to make it unique.
I personally like to use HTTP GET in jqGrid, but I don't like the usage of the nd parameter (the reason is described in an old answer of mine). It would be better to use the prmNames: {nd:null} option of jqGrid, which removes the nd parameter from the URL. Instead, one can control the caching on the server side. For example, the setting of
Cache-Control: private, max-age=0
is my standard setting. To set the HTTP header, you need only include the following line in the code of the ASP.NET MVC action:
HttpContext.Current.Response.Cache.SetMaxAge (new TimeSpan (0));
You can find more details in the answer.
It's important to understand that the header Cache-Control: private, max-age=0 doesn't prevent the caching of data, but the data will never be used without revalidation on the server. Using the additional HTTP header ETag, you can make the revalidation really work. The main idea is that the value of ETag changes whenever the data changes on the server. If the previous data are already in the web browser's cache, the browser automatically sends an If-None-Match header in the HTTP request, with the ETag value from the cached data. So if the server sees that the data have not changed, it can answer with an HTTP response having 304 Not Modified status and an empty body. This allows the web browser to use the previously cached data.
In that answer and in this one you will find code examples of how to use the ETag approach.
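Schematically (the path and ETag value are illustrative), the revalidation round trip looks like this:

GET /grid/data HTTP/1.1
If-None-Match: "v42"

HTTP/1.1 304 Not Modified
ETag: "v42"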
If the data that the server sends changes, then you should use POST to avoid getting cached data every time you request it.
You should not use GET for all purposes. GET requests are supposed to be used for getting data from the server, not for saving or deleting operations. GET requests have some limitations: since the data you send to the server is appended as a query string, you can't send very large payloads, and you should not use a GET request to send sensitive information to the server. You should use POST requests in all the other cases, like adding, editing and deleting.
As far as I'm aware jqgrid appends a unique key in every GET request so you don't get any benefit from browser caching.
One way around the caching behavior is to make the GET unique each time the request is made. jQuery.ajax() does this with cache: false by appending a timestamp to the end of the request. You can replicate this behavior with something similar:
// uri represents the URI to the endpoint; use '&' if it already has a query string
uri += (uri.indexOf('?') === -1 ? '?' : '&') + '_=' + new Date().getTime();

jQuery .get caching working too well?

I'm using the jQuery .get() function to load in a template file and then display the loaded HTML into a part of the page by targeting a particular DOM element. It works well but I've recently realised for reasons that confound me that it is caching my template files and masking changes I have made.
Don't get me wrong ... I like caching as much as the next guy. I want it to cache when there's no timestamp difference between the client's cache and the file on the server. However, that's not what is happening. To make it even odder ... using the same function to load the templates ... some template files are loading the updates and others are not (preferring the cached version over recent changes).
Below is the loading function I use.
function LoadTemplateFile(DOMlocation, templateFile, addEventsFlag) {
    $.get(templateFile, function (data) {
        $(DOMlocation).html(data);
    });
}
Any help would be greatly appreciated.
NEW DETAILS:
I've been doing some debugging and now see that the "data" variable that comes back to the success function does have the newer information, but for reasons that are not yet clear to me, what's inserted into the DOM is an old version. How in the heck this could happen has now become my question.
You can turn caching off in the jQuery.get() call. See the jQuery.ajax() doc for details on the cache: false option. The details of how caching works are browser specific, so I don't think you can control how it works, just whether it's on or off for any given call.
FYI, when you turn caching off, jQuery bypasses the browser caching by appending a unique timestamp parameter to the end of the URL which makes it not match any previous URL, thus there is no cache hit.
You can probably also control cache lifetime and several other caching parameters on your server, which will set various HTTP headers that instruct the browser what types of caching to allow. When developing your app, you probably want caching off entirely. For deployment, I would suggest that you want it on, and then when you rev your app, the safest way to deal with caching is to change your URLs slightly so the new versions of your URLs are different. That way, you always get maximum caching, but users also get new versions immediately.
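For example (the template path and version value are hypothetical), bumping a version parameter on each release forces a fresh fetch while still allowing long-lived caching in between:

// bump the v parameter whenever the template changes
$.get('/templates/header.html?v=2.1.3', function (data) {
    $('#header').html(data);
});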
$.get will always cache by default, especially on IE. You will need to manually append a querystring, or use the ajaxSetup method to bust the cache.
As an alternative, I found in the jQuery docs that you can just override the default caching. For me, I didn't have to convert all of my $.get calls over to $.ajax calls. Note that this also overrides the default behavior for all other types of calls, like $.getScript, which (unlike $.get) defaults to cache: false.
https://api.jquery.com/jquery.getscript/
$.ajaxSetup({
    cache: false
});

NETWORK_ERROR: XMLHttpRequest Exception 101

I am getting this Error
NETWORK_ERROR: XMLHttpRequest Exception 101
when trying to get XML content from one site.
Here is my code:
var xmlhttp;
if (window.XMLHttpRequest) {
    xmlhttp = new XMLHttpRequest();
}
if (xmlhttp == null) {
    alert("Your browser does not support XMLHTTP!");
    return;
}
// note: the property name must be all lowercase; "onReadyStateChange"
// is never called by the browser
xmlhttp.onreadystatechange = function () {
    if (xmlhttp.readyState == 4) {
        var value = xmlhttp.responseXML;
        alert(value);
    }
};
xmlhttp.open("GET", url, false);
xmlhttp.send(null);
// alert(xmlhttp.responseXML);
Does anyone have a solution?
If the url you provide is located externally to your server, and the server has not allowed you to send requests, you have permission problems. You cannot access data from another server with an XMLHttpRequest without the server explicitly allowing you to do so.
Update: Realizing this is now visible as an answer on Google, I tried to find some documentation on this error. That was surprisingly hard.
This article though, has some background info and steps to resolve. Specifically, it mentions this error here:
As long as the server is configured to allow requests from your web application's origin, XMLHttpRequest will work. Otherwise, an INVALID_ACCESS_ERR exception is thrown
An interpretation of INVALID_ACCESS_ERR seems to be what we're looking at here.
To solve this, the server that receives the request must be configured to allow the origin. This is described in more detail at Mozilla.
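Concretely, the responding server has to send a header along these lines (the origin value is illustrative):

Access-Control-Allow-Origin: https://your-app.example.com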
The restriction that you cannot access data from another server with an XMLHttpRequest can apply even if the url merely implies a remote server.
So:
url = "http://www.myserver.com/webpage.html"
may fail,
but:
url = "/webpage.html"
may succeed - even if the request is being made from www.myserver.com
Request aborted because it was cached or previously requested? It seems the XMLHttpRequest Exception 101 error can be thrown for several reasons. I've found that it occurs when I send an XMLHttpRequest with the same URL more than once. (Changing the URL by appending a cache-defeating nonsense string to the end allows the request to be repeated. I wasn't intending to repeat the request, but events in the program caused it to happen and resulted in this exception.)
Not returning the correct responseText or responseXML in the event of a repeated request is a bug (probably in WebKit).
When this exception occurred, I did get an onload event with readyState==4, but the request object had state=0, responseText=="" and responseXML==null. This was a cross-domain request, which the server permits.
This was on an Android 2.3.5 system which uses webKit/533.1
Anyone have documentation on what the exception is supposed to mean?
Something like this happened to me when I returned incorrect XML (I put an attribute in the root node). In case this helps anyone.
xmlhttp.open("GET",url, true);
Set the async parameter to true.
I found a very nice article with 2 different solutions.
The first one implements jQuery and JSONP, explaining how simple it is.
The second approach redirects through a PHP call. Very simple and very nice.
http://mayten.com.ar/blog/42-ajax-cross-domain
Another modern method of solving this problem is Cross-Origin Resource Sharing (CORS).
HTML5 offers this feature. You can "wrap" your XMLHttpRequest in a CORS request, and if the target browser supports this feature, you can use it and won't have any problems.
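A minimal sketch of a cross-origin GET in a CORS-capable browser (the URL is hypothetical):

var xhr = new XMLHttpRequest();
xhr.open('GET', 'https://other-domain.example.com/data.xml', true);
xhr.onload = function () {
    // succeeds only if the server sent Access-Control-Allow-Origin for our origin
    console.log(xhr.responseXML);
};
xhr.send();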
EDIT:
Additionally, I have to add that there are many reasons which can cause this issue: not only a cross-domain restriction, but also simply wrong settings in the WEB.CONFIG of your web service.
Example IIS (.NET):
To enable HTTP access from external sources (in my case a compiled PhoneGap app making a CORS request), you have to add this to your WEB.CONFIG:
<webServices>
    <protocols>
        <add name="HttpGet"/>
        <add name="HttpPost"/>
    </protocols>
</webServices>
Another scenario:
I had two web services running, one on port 80 and one on port 90. This also gave me an XML HTTP Request error, and I still don't know why :). Nevertheless, I think this can help many less experienced readers.
