Cross-domain XMLHttpRequest in Chinese after Firefox update

My Firefox just updated, and now all my XMLHttpRequests (which worked 10 minutes ago) answer in Chinese...
I can read this as responseText in Firebug:
"㰿硭氠癥牳楯渽∱⸰∠敮捯摩湧㴢畴昭ㄶ∿㸍਼㽸浬⵳瑹汥獨敥琠瑹灥㴧瑥硴⽸獬✠桲敦㴧⽯扩砯硳搧㼾ഊ㱯扪慭攽≷慴捨㌢⁨牥昽≨瑴瀺⼯ㄲ㜮〮〮ㄺ㠰㠰⽯扩砯睡瑣桓敲癩捥⽷慴捨㌯∠楳㴢潢楸㩗慴捨∾ഊ†㱲敬瑩浥慭攽≬敡獥∠桲敦㴢汥慳支∠睲楴慢汥㴢瑲略∠癡氽≐吵䴢楮㴢偔こ∠⼾ഊ†㱩湴慭攽≡摤䙲敱略湣礢⁨牥昽≡摤䙲敱略湣礯∠睲楴慢汥㴢瑲略∠癡氽∵〰∠浩渽∰∠⼾ഊ†㱯瀠湡浥㴢慤搢⁨牥昽≡摤⼢⁩渽≯扩砺坡瑣桉渢畴㴢潢楸㩗慴捨併琢 㸍ਠ‼潰慭攽≲敭潶攢⁨牥昽≲敭潶支∠楮㴢潢楸㩗慴捨䥮∠潵琽≯扩砺乩氢 㸍ਠ‼潰慭攽≰潬汃桡湧敳∠桲敦㴢灯汬䍨慮来猯∠楮㴢潢楸㩎楬∠潵琽≯扩砺坡瑣桏畴∠⼾ഊ†㱯瀠湡浥㴢灯汬剥晲敳栢⁨牥昽≰潬汒敦牥獨⼢⁩渽≯扩砺乩氢畴㴢潢楸㩗慴捨併琢 㸍ਠ‼潰慭攽≤敬整攢⁨牥昽≤敬整支∠楮㴢潢楸㩎楬∠潵琽≯扩砺乩氢 㸍ਠ‼潰慭攽≳瑡琢⁨牥昽≳瑡琯∠楮㴢潢楸㩎楬∠潵琽≯扩砺坡瑣桓瑡琢 㸍਼⽯扪"
This is fun, but I would really like to have my real answer back.
Do you know if I can change a setting or something?
An important thing to know is that my Ajax code makes a cross-domain request. Is it possible that something changed in the HTTP header encoding?
Thanks a lot!

Firefox now gives precedence to the UTF-16 BOM over the HTTP charset. This is not a regression but an intentional change. Better not to use a UTF-16 BOM if the bytes after it aren't UTF-16!
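If you can't fix the server, a possible client-side workaround is to override the charset before sending the request; a minimal sketch, assuming the payload bytes are really UTF-8 ("/your/endpoint" is a placeholder):

// Force how the response is decoded, ignoring the misleading
// encoding information in the payload:
var xhr = new XMLHttpRequest();
xhr.open("GET", "/your/endpoint", true);
xhr.overrideMimeType("text/xml; charset=UTF-8"); // must be called before send()
xhr.onload = function () {
    console.log(xhr.responseText); // should now decode as UTF-8
};
xhr.send();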

Related

How to get Firefox to read UTF-16 JSON

OK, so I've got a web server serving UTF-16 JSON.
UTF-16 is needed, so don't even dare to answer "why don't you switch to UTF-8".
Apparently Firefox can't read it.
In fact I get a JSON.parse error...
I tried switching to UTF-8 to prove that's the problem, and it works...
but how can I make it successfully decode UTF-16?
The charset in my response headers is already "UTF-16".
The answer is: you can't... not with up-to-date versions of Firefox, at least...
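That said, if you control the client code, you may be able to work around it by fetching the raw bytes and decoding them yourself; a sketch, assuming a UTF-16LE payload and TextDecoder support ("/data.json" is a placeholder):

var xhr = new XMLHttpRequest();
xhr.open("GET", "/data.json", true);
xhr.responseType = "arraybuffer";          // get bytes, not text
xhr.onload = function () {
    // decode the bytes manually, then parse the resulting string
    var text = new TextDecoder("utf-16le").decode(xhr.response);
    var data = JSON.parse(text);
    console.log(data);
};
xhr.send();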

LongPoll: Stop Firefox-Throbber

I use dojo/request/xhr for long-poll requests. In Firefox (not in IE or Chrome) the throbber spins during the request.
From a technical point of view this is correct, because the request has not finished yet. But it doesn't look very nice. Is there a way to stop it?
There are some posts about the "Throbber of doom", but I can't find a solution. Is there one?
Frank.
P.S. Maybe it would be possible to emit an event that feigns a completed request, which would make the throbber stop!?
P.P.S. Is it a known bug? See here: Mozilla throbber bug
But it seems not to have been worked on!?
Further attempts:
With a basic JavaScript XMLHttpRequest object the throbber doesn't spin! It seems to be a problem with Dojo's xhr!?
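For reference, a minimal sketch of the plain-XHR long poll described above ("/poll" and handleMessage are placeholders):

function poll() {
    var xhr = new XMLHttpRequest();
    xhr.open("GET", "/poll", true);
    xhr.onload = function () {
        handleMessage(xhr.responseText); // your own message handler
        poll();                          // re-issue the request immediately
    };
    xhr.onerror = function () {
        setTimeout(poll, 5000);          // back off and retry on errors
    };
    xhr.send();
}
poll();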

How can I validate http response headers?

It's the first time I am doing something with headers. I am mainly concerned with Cache-Control but there may be others I will need to check as well. For example, I try to send the following header to the browser (based on tutorials I just read):
Cache-Control:private, max-age=2011-12-30 11:40:56
Google Chrome displays it this way in Network -> Headers -> Response headers, but how do I know if it's correct, that there aren't any typos, syntax errors and such? Will it really work? Will the browser behave as I want it to, or will it treat it as gibberish (something like "unknown header/value")? I've tried sending nonsensical headers on purpose, but they got displayed with the rest. Is there any Chrome tool or add-on for that, or any other way? Thank you in advance!
I'm afraid you won't be able to check whether the resource has been cached by proxies en route, but you can check whether your browser has cached it.
While in the Network panel of Chrome DevTools, hit F5 to reload your page. You should see something like "304 Not Modified" in the status field for the resource in question (which means the resource has not been modified and its contents were not received from the server but rather loaded from the browser's cache).
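One concrete problem with the header in the question: max-age takes a lifetime in seconds, not a date, so browsers will not interpret the value shown as intended. A valid version would be, for example:

Cache-Control: private, max-age=3600

(3600 seconds = one hour.)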

Cross-domain AJAX POST in Chrome

There are several topics about the problem with cross-domain AJAX. I've been looking at these, and the conclusion seems to be this:
Apart from using something like JSONP or a proxy solution, you should not be able to do a basic jQuery $.post() to another domain.
My test code looks something like this (running on "http://myTestdomain.tld/path/file.html"):
var myData = { datum1: "datum", datum2: "datum" };
$.post("http://External-Ip:port", myData, function (response) { alert(response); });
When I tried this (the reason I started looking), the Chrome console told me:
XMLHttpRequest cannot load
http://External-IP:port/page.php. Origin
http://myTestdomain.tld is not allowed
by Access-Control-Allow-Origin.
Now this is, as far as I can tell, expected. I should not be able to do this. The problem is that the POST actually DOES come through. I've got a simple script running that saves the $_POST to a file, and it is clear the POST gets through. Any real data I return is not delivered to my calling script, which again seems expected because of the access-control issue. But the fact that the POST actually arrived at the server got me confused.
Is it correct that I assume that above code running on "myTestdomain" should not be able to do a simple $.post() to the other domain (External-IP)?
Is it expected that the request would actually arrive at the external-ip's script, even though output is not received? or is this a bug. (I'm using Chrome 11.0.696.60 )
I posted a ticket about this on the WebKit bugtracker earlier, since I thought it was weird behaviour and possibly a security risk.
Since security-related tickets aren't publicly viewable, I'll quote the reply from Justin Schuh here:
This is implemented exactly as required by the spec. For simple cross-origin requests (http://www.w3.org/TR/cors/#simple-method) there is no pre-flight check; the request is made and the response cannot be read if the appropriate headers do not authorize the requesting origin. Functionally, this is no different than creating a form and using script to make an off-origin POST (which has always been possible).
So: you're allowed to do the POST, since you could have done that anyway by embedding a form and triggering the submit button with JavaScript, but you can't see the result, just as you wouldn't be able to in the form scenario.
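For illustration, the form equivalent referred to above might look like this (reusing the question's data; the page likewise cannot read the response):

<form id="f" method="POST" action="http://External-Ip:port/page.php">
    <input type="hidden" name="datum1" value="datum">
    <input type="hidden" name="datum2" value="datum">
</form>
<script>document.getElementById("f").submit();</script>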
A solution would be to add a header to the script running on the target server, e.g.
<?php
header("Access-Control-Allow-Origin: http://your_source_domain");
// ... rest of the script
?>
Haven't tested that, but according to the spec, that should work.
Firefox 3.6 seems to handle it differently, first sending an OPTIONS request to see whether it can do the actual POST. Firefox 4 does the same thing Chrome does, or at least it did in my quick experiment. More about that is at https://developer.mozilla.org/en/http_access_control
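For reference, such a preflight exchange might look roughly like this (hosts and headers are illustrative):

OPTIONS /page.php HTTP/1.1
Host: external-ip:port
Origin: http://myTestdomain.tld
Access-Control-Request-Method: POST

HTTP/1.1 200 OK
Access-Control-Allow-Origin: http://myTestdomain.tld
Access-Control-Allow-Methods: POST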
The important thing to note about the JavaScript same-origin policy restriction is that it is something built into modern browsers for security - it is not a limitation of the technology or something enforced by servers.
To answer your question, neither of these are bugs.
Requests are not stopped from reaching the server - this gives the server the option to allow these cross-domain requests by setting the appropriate headers.
The response is also received by the browser. Before the introduction of the access control headers, responses to cross-domain requests would be stopped dead in their tracks by a security-conscious browser - the browser would receive the response but would not hand it off to the script. With the access control headers, the server has the option of setting the appropriate headers to indicate to a compliant browser that it would like to allow certain origin URLs to make cross-domain requests.
The exact behaviour on response might differ between browsers - I can't recall for sure now, but I think Chrome calls the success callback when using jQuery's ajax(), with an empty response. IIRC, Firefox will not invoke the success function at all.
I get the same thing happening for me. You are able to post across domains but are not able to receive a response. This is what I expected, and it happens for me in Firefox, Chrome, and IE.
One way to sort of get around this caveat is to have a local PHP file that fetches the data via cURL and relays the response to your JavaScript. (Kind of restating what you said you knew already.)
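A same-origin proxy along those lines might look like this in Node (a sketch standing in for the PHP/cURL file; host, port, and path are placeholders):

var http = require('http');
http.createServer(function (req, res) {
    // forward the incoming POST to the external service...
    var proxyReq = http.request({
        host: 'external-ip', port: 8080, path: '/page.php',
        method: req.method,
        headers: { 'Content-Type': req.headers['content-type'] || 'application/x-www-form-urlencoded' }
    }, function (proxyRes) {
        // ...and relay its response back to the same-origin caller
        res.writeHead(proxyRes.statusCode, proxyRes.headers);
        proxyRes.pipe(res);
    });
    req.pipe(proxyReq); // stream the POST body through
}).listen(8001);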
Yes, that's correct: you won't be able to do that unless you use some kind of proxy.
No, the request won't go to the external IP, since there is such a limitation.

Firefox makes two HTTP requests [closed]

OK, this is weird.
If I make a request to a page that is text/html, Firefox makes one request.
If I make a request to a page that is application/xml, Firefox makes two requests.
IE and Google Chrome make one request in both cases.
Any ideas why the two requests, and how I can force just one?
I've had a similar issue when the encoding of the page didn't match the <meta> tag. If the page was encoded using the default Windows encoding but the meta tag specified UTF-8, then Firefox would stop downloading once it reached a non-ASCII character (e.g. æ, ø or å) and would re-download the page from the beginning. This would mess up view counts and lots of other logic, since the server-side script would run twice.
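To avoid that class of problem, make sure the declared charset matches the bytes actually sent; for example, for a UTF-8 page:

<!-- modern form -->
<meta charset="UTF-8">
<!-- older equivalent -->
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8">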
It might also be that if your page does not start with <?xml ?> but claims to be XML, Firefox will re-download it as HTML (text/html) and process it as HTML.
Just to add another possibility...
If the HTML code contains an empty img src attribute, this causes a second HTTP request in both Firefox and Chrome. Currently, those are the browsers that follow the standard to the letter, which states that an empty URI reference refers to the absolute base URI.
I've had a similar problem with Firefox. Might help someone.
FF was making two HTTP GET requests while Chrome wasn't.
The problem was an empty src="" attribute.
Firefox treats such empty attributes on img/script/... tags as the current URL and GETs the current page.
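In other words, markup like this is enough to trigger the extra request:

<!-- an empty src resolves to the page's own URL, so the page is fetched twice -->
<img src="">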
Maybe you're making the request in a way that causes the HTTP access control features to fire?
It is a fairly new standard, new in Firefox 3.5, that can cause double GET requests.
In case you can sniff the requests server-side: see if they contain the Origin: header.
See https://developer.mozilla.org/En/Server-Side_Access_Control (Server-Side Access Control).
In my case it was a wrong content-type header, "image/jpg", sent with a PHP-generated image. The double requests were gone after I changed the type to "image/jpeg".
More info about this bug...
https://bugzilla.mozilla.org/show_bug.cgi?id=236858
I met this problem too and I've figured it out. It may be related to a non-existent favicon.ico. You can check it using the following code (Node.js):
var http = require('http');
var server = http.createServer(function (req, res) {
    console.log(req.url); // log every URL the browser requests
    res.writeHead(200, { "Content-Type": "text/html" });
    res.end("Hello World");
});
server.listen(8000);
console.log("httpd start #8000");
The expected output is:
httpd start #8000
/
/favicon.ico
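The second line shows the browser asking for the favicon on its own. A commonly used trick (not specific to this answer) to suppress that extra request is an inline empty icon in the page's <head>:

<link rel="icon" href="data:,">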
Found the problem.
The XML packet I was returning had a root node of <feed>.
Firefox fetches this twice for some reason, perhaps because it is trying to determine whether it is a valid Atom/RSS feed and, if not, just displays it instead?
Changing the root node to something else fixed the problem.
Thanks Marcus for pointing me in the right direction.
