d3.js setRequestHeader fails in IE8

Can someone say how to set a request header using the d3js xhr interface in IE8?
Code is like this:
d3.csv(url).header("accept","text/csv").rows(function(d) {...}).get(function(e,r) {...});
This doesn't have the desired effect in IE8, but works in Firefox and Chrome.
I load the aight compatibility library before loading d3, and the aight.d3 library after, but I don't think those are relevant to this problem.
The request is sent, but the response type is incorrect (it's json instead of csv), so the rows() function fails to get any data. At the server, the "Accept:" header value is */* from IE8, but text/csv from other browsers.
When I write the equivalent in bare javascript, IE8 sets the request header correctly.
I have d3 version 3.4.3.
Thanks for any help.
Regards,
--Paul

I'm not an expert on the subject, but looking through the d3 source code and various MSDN references, I think the problem is that d3 checks whether you're using an absolute url, and if so assumes it is a cross-domain request and switches from XMLHttpRequest to IE's XDomainRequest in older IE.
Relevant d3 source code (lines 17-26):
var xhr = {},
    dispatch = d3.dispatch("beforesend", "progress", "load", "error"),
    headers = {},
    request = new XMLHttpRequest,
    responseType = null;

// If IE does not support CORS, use XDomainRequest.
if (d3_window.XDomainRequest
    && !("withCredentials" in request)
    && /^(http(s)?:)?\/\//.test(url)) request = new XDomainRequest;
The first line tests whether the XDomainRequest object exists (it does for IE8 and up), the second tests if the created XMLHttpRequest object does not have the withCredentials property for cross-domain requests (which only exists in IE10 and up), and the third tests whether the url starts with "http://" or "https://".
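The third test is just a regular expression on the url, which you can try on its own (the wrapper function name here is made up):

```javascript
// The url test d3 applies to decide between XMLHttpRequest and
// XDomainRequest: absolute and protocol-relative urls match,
// relative paths do not.
function looksCrossDomain(url) {
  return /^(http(s)?:)?\/\//.test(url);
}

looksCrossDomain("data/file.csv");             // → false: plain XHR
looksCrossDomain("http://example.com/f.csv");  // → true: XDomainRequest in IE8/9
looksCrossDomain("//example.com/f.csv");       // → true: protocol-relative also matches
```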
So, if you pass in an absolute url to any of the d3 file-grabbing functions in an IE8 or IE9 browser, it will use an XDomainRequest object instead of XMLHttpRequest.
Which is good if you want to actually grab files from a cross-origin server. Not so good if your same-domain server is expecting you to specify the accepted file type, since as far as I can tell XDomainRequest doesn't have any way of setting headers, and it certainly doesn't have the setRequestHeader method that d3 checks for (line 96):
xhr.send = function(method, data, callback) {
  /*...*/
  if (request.setRequestHeader) for (var name in headers)
    request.setRequestHeader(name, headers[name]);
  /*...*/
};
So how do you get it to work? If you're not doing a cross-origin request (and therefore XMLHttpRequest works fine), specify your URL in relative notation if you can. Otherwise, you'll either have to change the d3 source code or create the XMLHttpRequest yourself.
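If you do create the XMLHttpRequest yourself, a minimal sketch of that route, assuming the same Accept header as in the question (the function name is made up, and the XHR constructor is passed as a parameter only so the helper can be shown standalone; in the browser you'd pass window.XMLHttpRequest and hand the returned text to d3.csv.parse):

```javascript
// Build the XMLHttpRequest by hand so setRequestHeader is always used,
// sidestepping d3's XDomainRequest switch in IE8/9.
function fetchCsv(url, XHR, callback) {
  var request = new XHR();
  request.open("GET", url, true);
  request.setRequestHeader("Accept", "text/csv");  // the header d3 drops in IE8
  request.onreadystatechange = function () {
    if (request.readyState === 4) {
      var ok = request.status >= 200 && request.status < 300;
      callback(ok ? null : request.status, request.responseText);
    }
  };
  request.send(null);
}
```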

Related

How to access request body in cy.route response callback?

I have built something that can capture network requests and save them to a file. Currently I am now trying to build the second part which should return these captured requests, but running into some difficulties. Sometimes I have multiple requests going to a single method/url combination, and I need to return a different response depending on the request body. The problem I am facing is illustrated in the example below:
cy.route({
  url: 'api.example.com/**',
  method: myMethod,
  response: routeData => {
    // I can set the response here
    // But I don't have access to the request body
  },
  onRequest: xhr => {
    // I can access the request body here
    // But I am not supposed/able to set the response
  },
})
If I understand the API docs correctly, I am supposed to set the response in the response callback. However, in that callback I do not seem to have access to the XHR object from which I could read the request body.
Is there a way to access the request body in the response callback?
Or, alternatively, is there a way to set the response from the onRequest callback?
Update: just saw this post which mentions a body property which can be added to the cy.route options object. I don't see this in the cypress route docs so I don't know if this is even a valid option, and I also wouldn't know if making multiple calls to cy.route with an identical method and url, but a different body would produce the correct results. If this was of any use, I would have hoped to have seen some branching logic based on a body property somewhere in this file, so I am not super hopeful.
Cypress v6 comes with the cy.intercept API. Using that is much more convenient than using cy.server and cy.route.
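With cy.intercept, the handler receives the request object itself, so the body is readable at the same point where the response is chosen. A sketch, with the handler pulled out as a plain function (the field names and responses are made up):

```javascript
// cy.intercept passes the request to the handler; req.body is readable
// and req.reply() sets the response from the same place.
function handleExampleRequest(req) {
  if (req.body && req.body.kind === "A") {
    req.reply({ statusCode: 200, body: { result: "response for A" } });
  } else {
    req.reply({ statusCode: 200, body: { result: "response for B" } });
  }
}

// In a spec file it would be registered as:
// cy.intercept("POST", "api.example.com/**", handleExampleRequest);
```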

JSON serialization is very slow when compared to XML

I have a webapi service that I need to access from web app
If I call the webapi url directly (e.g. /api/scrccc/32) it returns data in less than 5 seconds, but when called from jQuery it takes more than 5 minutes (!!!)
My ajax call is
j$.ajax({
  type: "GET",
  url: '/api/scrccc/' + id + '?dt=' + new Date().getTime(),
  error: function (jqXHR, status, error) {
    //....
  },
  success: function (data, status, jqXHR) {
    //....
  }
});
This happens with the javascript console (Firebug) open or closed, and in both Chrome and FF
Anyone has any idea why this happens?
EDIT:
Here are the timings in Chrome for the jQuery call, for direct access to the url in the address bar, and for the requests with timestamps (screenshots omitted).
EDIT - June 1
I realized there is a difference between calling the webapi from ajax and from browser's address bar:
- ajax call requests response as json
- browser requests response as XML
So I tested both requests in Postman, with json and xml responses, and the findings puzzled me: the request with the xml response took 1261 ms while the json response took 47000 ms (!!!)
(The timings shown in Chrome console were with local IIS Express, while latest timings shown in Postman are with real app on real web server over internet, that's why they are different, but the scale remains)
So, indeed, the problem is at the server side, as some suggested, but not in the actual application code providing the data; it is at the serialization point.
My Webapi is 2.1 (version 5.1.2), and Json.Net version 6.0.3 (both are latest versions)
I have no special settings to use any particular json serializer, so, as I know, WebApi uses Json.Net.
Any idea what could cause such a HUGE difference in serialization time?
Thanks
First, thanks to all who took the time to look at this.
As I posted in a comment to #adreno, the problem was an expensive calculated property, loaded on first access.
The property was in a class defined in the BL, which was used in various places, so at first I didn't think to check it.
Why was that property excluded by the XML serializer? I don't know, and due to time constraints I didn't investigate further (it's not an open api, and we didn't need XML output).
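The trap is easy to reproduce with any serializer that walks properties; here is a JavaScript analogue (not the original C# code) using JSON.stringify and a getter:

```javascript
// A lazily computed property: the serializer touches every enumerable
// property, so the "expensive" work runs even though nothing in the
// application code asked for it.
var computeCount = 0;
var record = {
  id: 32,
  get expensive() {      // stand-in for the slow calculated property
    computeCount += 1;   // imagine a database round-trip here
    return "computed";
  }
};

var json = JSON.stringify(record);  // serializing triggers the getter
// computeCount is now 1 even though we never read record.expensive directly
```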
How do you serialize the JSON? What library are you using? I would try plain Gson first, in case you are using Jackson or another library that your framework provides.
Like I said, render a String:
String myjson = "";
Gson gson = new Gson();
myjson = gson.toJson(myuser); // myuser is my java object
return myjson;

Protecting prototype.js based XHR requests against CSRF

Django has been updated to 1.3, and in fact ever since 1.2.5, it has extended the scheme to pass a Cross Site Request Forgery protection token to XMLHttpRequests. The Django folks helpfully provide an example for jQuery to apply a specific header to every XHR.
Prototype (and thus Scriptaculous) has to comply with this scheme, yet I can't find a way to tell Prototype to add the X-CSRFToken header. The best would be to do it once, in a way that applies across the app (as with the jQuery example).
Is there a way to do that?
This is a wild guess but you could try extending the base AJAX class...
Ajax.Base.prototype.initialize = Ajax.Base.prototype.initialize.wrap(
  function (callOriginal, options) {
    var headers = options.requestHeaders || {};
    headers["X-CSRFToken"] = getCookie("csrftoken");
    options.requestHeaders = headers;
    return callOriginal(options);
  }
);
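The wrapper above assumes a getCookie helper; a minimal sketch, modelled on the helper in the Django CSRF documentation (the cookie string is passed as a parameter only so the function can be shown standalone; in the browser you would pass document.cookie):

```javascript
// Find a named cookie in a document.cookie-style string, e.g.
// "sessionid=abc; csrftoken=XYZ123", and return its decoded value.
function getCookie(name, cookieString) {
  var cookies = (cookieString || "").split(";");
  for (var i = 0; i < cookies.length; i++) {
    var cookie = cookies[i].replace(/^\s+/, "");  // strip leading whitespace
    if (cookie.indexOf(name + "=") === 0) {
      return decodeURIComponent(cookie.substring(name.length + 1));
    }
  }
  return null;  // cookie not present
}
```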

XMLHttpRequest in Ajax and PHP

In Internet Explorer we can create an ActiveXObject as follows:
xmlDoc=new ActiveXObject("Microsoft.XMLDOM");
xmlDoc.async="false";
xmlDoc.load("note_error.xml");
Is it possible to use xmlDoc.load("note_error.xml") with an XMLHttpRequest object in other browsers? If not, is there a substitute for this method when using XMLHttpRequest? Please help... I am using Firefox as my browser.
xmlDoc.async="false";
That's not doing what you think. async is a boolean property. When you assign the string "false" to it, you're getting the value true, because all non-empty strings are truthy.
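You can check this directly:

```javascript
// All non-empty strings are truthy, so the string "false" still
// behaves as true in a boolean context.
Boolean("false");  // → true
Boolean("");       // → false

var async = "false";
if (async) {
  // this branch runs: the assignment did not turn anything off
}
```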
Is it possible to use xmlDoc.load("note_error.xml") with an XMLHttpRequest object in other browsers?
Yes, in fact that's what you should be doing in IE too. There is no reason to use XMLDOM to fetch an XML Document; XMLHttpRequest can do that fine and it's much more widely supported.
var xhr = window.XMLHttpRequest ? new XMLHttpRequest() : new ActiveXObject('MSXML2.XMLHttp');
xhr.open('GET', 'note_error.xml', false); // third argument false makes the request synchronous
xhr.send();
var doc = xhr.responseXML;
If you do need an XMLDOM-like object in other browsers, it's called new DOMParser, but it's not as widely-supported as XMLHttpRequest.
The ActiveX 'concept' exists only in Internet Explorer. All other browsers implement a similar, more or less standard, version.
http://www.w3schools.com/Ajax/ajax_browsers.asp
That page shows you how to create an xmlhttp object in 'any' browser.

How can I prevent IE Caching from causing duplicate Ajax requests?

We are using the Dynamic Script Tag with JsonP mechanism to achieve cross-domain Ajax calls. The front end widget is very simple. It just calls a search web service, passing search criteria supplied by the user and receiving and dynamically rendering the results.
Note - For those that aren’t familiar with the Dynamic Script Tag with JsonP method of performing Ajax-like requests to a service that return Json formatted data, I can explain how to utilise it if you think it could be relevant to the problem.
The service is WCF hosted on IIS. It is Restful so the first thing we do when the user clicks search is to generate a Url containing the criteria. It looks like this...
https://.../service.svc?criteria=john+smith
We then use a dynamically created Html Script Tag with the source attribute set to the above Url to make the request to our service. The result is returned and we process it to show the results.
This all works fine, but we noticed that when using IE the service receives the request from the client twice. I used Fiddler to monitor the traffic leaving the browser and sure enough I see two requests with the following urls...
Request 1: https://.../service.svc?criteria=john+smith
Request 2: https://.../service.svc?criteria=john+smith&_=123456789
The second request has been appended with some kind of Id. This Id is different for every request.
My immediate thought is it was something to do with caching. Adding a random number to the end of the url is one of the classic approaches to disabling browser caching. To prove this I adjusted the cache settings in IE.
I set “Check for newer versions of stored pages” to “Never” – This resulted in only one request being made every time. The one with the random number on the end.
I set this setting value back to the default of “Automatic” and the requests immediately began to be sent twice again.
Interestingly I don’t receive both requests on the client. I found this reference where someone is suggesting this could be a bug with IE. The fact that this doesn’t happen for me on Firefox supports this theory.
Can anyone confirm if this is a bug with IE? It could be by design.
Does anyone know of a way I can stop it happening?
Some of the more vague searches that my users will run take up enough processing resource to make doubling up anything a very bad idea. I really want to avoid this if at all possible :-)
I just wrote an article on how to avoid caching of ajax requests :-)
It basically involves adding no-cache headers to the response of any ajax request that comes in
public abstract class MyWebApplication : HttpApplication
{
    protected MyWebApplication()
    {
        this.BeginRequest += new EventHandler(MyWebApplication_BeginRequest);
    }

    void MyWebApplication_BeginRequest(object sender, EventArgs e)
    {
        string requestedWith = this.Request.Headers["x-requested-with"];
        if (!string.IsNullOrEmpty(requestedWith) && requestedWith.Equals("XMLHttpRequest", StringComparison.InvariantCultureIgnoreCase))
        {
            this.Response.Expires = 0;
            this.Response.ExpiresAbsolute = DateTime.Now.AddDays(-1);
            this.Response.AddHeader("pragma", "no-cache");
            this.Response.AddHeader("cache-control", "private");
            this.Response.CacheControl = "no-cache";
        }
    }
}
I eventually established the reason for the duplicate requests. As I said, the mechanism I chose for making Ajax calls was Dynamic Script Tags. I built the request Url, created a new script element and assigned the Url to the src property...
var script = document.createElement("script");
script.src = "https://...";
Then I executed the script by appending it to the document head. Crucially, I was using the jQuery append function...
$("head").append(script);
Inside the append function, jQuery was anticipating that I was trying to make an Ajax call. If the element being appended is a script, it executes a special routine that makes an Ajax request using the XMLHttpRequest object. But the script was still being appended to the document head, and executed there by the browser too. Hence the double request.
The first came direct from the script tag – the one I intended to happen.
The second came from inside the jQuery append function. This was the request suffixed with the randomly generated query string argument in the form "&_=123456789".
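That suffix is the classic cache-buster trick mentioned earlier; the idea, as a standalone sketch (the helper name is made up):

```javascript
// Append a unique query parameter so the URL never matches a cached
// entry, the same idea behind the "_=123456789" suffix seen in Fiddler.
function cacheBust(url) {
  var sep = url.indexOf("?") === -1 ? "?" : "&";
  return url + sep + "_=" + new Date().getTime();
}

cacheBust("https://example.com/service.svc?criteria=john+smith");
// → "https://example.com/service.svc?criteria=john+smith&_=<timestamp>"
```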
I simplified things by preventing the jQuery library side effect and using the native append function...
document.getElementsByTagName("head")[0].appendChild(script);
One request now happens in the way I intended. I had no idea that the jQuery append function could have such a significant side effect built in.
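The fix, written out as a small helper (the document object is passed as a parameter only so the function can be shown standalone; in the page you would pass the global document):

```javascript
// Create and append the JSONP script tag with plain DOM calls, so no
// library intercepts the insertion and no duplicate request is made.
function appendJsonpScript(doc, url) {
  var script = doc.createElement("script");
  script.src = url;
  doc.getElementsByTagName("head")[0].appendChild(script);
  return script;
}
```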
See www.enhanceie.com/redir/?id=httpperf for further discussion.
