I'm working on a website and I have a problem. On the page I'm working on, the user enters an IP address and the ports he wants to be scanned. This is done with Ajax (client side) and PHP (server side). Ajax sends the IP and port (one by one) to the PHP file, which returns the result for that port. The goal is for the user to see which port is being tested (in a div element) at any given moment, and here is where the problem is. The code runs fine and tests all the ports the user asked for, but during the test it shows no port at all; it only shows the final port (after all previous ports have been tested) and the results of the ports (if some port had a result), which appear in a separate div element. This works perfectly in Firefox; in other browsers what I just described happens. The Google Chrome console says: Refused to set unsafe header "Content-length" and Refused to set unsafe header "Connection". I've been searching about this problem for days and I found and tried many things, but none of them solved the problem.
Here is my code.
jquery.js
function HttpRequest(endereco, portainicio)
{
    var xmlhttp;
    var params = "endereco=" + endereco + "&" + "porta=" + portainicio;
    if (window.XMLHttpRequest) // IE7+, Firefox, Chrome, Opera, Safari
    {
        xmlhttp = new XMLHttpRequest();
    }
    else // IE6, IE5
    {
        xmlhttp = new ActiveXObject("Microsoft.XMLHTTP");
    }
    // third argument false = synchronous request
    xmlhttp.open("POST", "/firewall/ajax", false);
    //alert(params);
    xmlhttp.setRequestHeader("Content-type", "application/x-www-form-urlencoded");
    xmlhttp.setRequestHeader("Content-length", params.length);
    xmlhttp.setRequestHeader("Connection", "close");
    xmlhttp.send(params);
    return xmlhttp.responseText;
}
function ajaxfirewall()
{
    (...)
    var resposta;
    $("p.ip").append("<span class='end'> " + endereco + "</span>");
    for (portainicio; portainicio <= portafinal; portainicio++)
    {
        resposta = HttpRequest(endereco, portainicio);
        $("p.porta").append(" <span class='tporta'>" + resposta + "</span><br>");
    }
    return false;
}
Another thing that's really strange: do you see that alert(params); which is commented out in the HttpRequest function? If I leave it uncommented, the page does display the port being tested, but it also shows the alert, and I don't want that.
Without the HTML your jquery.js is supposed to work on, this involves some guesswork (maybe you could post the relevant excerpt; hint, hint). I would consider it possible that $("p.porta") cannot be found or that the appended HTML reacts in an unexpected way. You should try just printing your results to the console using e.g. console.log (that is, if you are using Firebug or some such) in order to see what you get at what time. Maybe you will find something on the client side too.
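For example, inside the loop in ajaxfirewall (names taken from the question):

resposta = HttpRequest(endereco, portainicio);
console.log("porta", portainicio, "->", resposta); // shows when each result actually arrives
$("p.porta").append(" <span class='tporta'>" + resposta + "</span><br>");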
Update
Judging from this question and its accepted answer, the Chrome behavior is actually what you should expect. The standard for XMLHttpRequest prescribes that these two headers must not be set by the client, in order to prevent request smuggling attacks. You should simply not set them (even if your PHP source tells you to).
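For illustration, here is the request from the question with the two forbidden headers simply dropped; everything else can stay as it is:

xmlhttp.open("POST", "/firewall/ajax", false);
xmlhttp.setRequestHeader("Content-type", "application/x-www-form-urlencoded");
// Content-Length and Connection are "unsafe" headers: the browser sets
// them itself, and trying to set them manually is exactly what triggers
// the "Refused to set unsafe header" warnings in Chrome.
xmlhttp.send(params);
return xmlhttp.responseText;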
Related
Let's say we have several different websites: website1.com, website2.com, website3.com. We use jQuery on all of them and include it from a CDN like googleapis.com. The expected behavior from a browser would be to cache it once and use it for all the other websites. Chrome seems to do this, but Safari downloads jQuery for every domain.
Example
Using the JS code below, open nytimes.com, bbc.com and dw.de in Chrome.
Append jQuery on the first website and look at the Network tab of your DevTools. It will say that it got jQuery.
Now open any other website and append jQuery again — the answer will be “from cache”.
However, Safari will say it's loading jQuery for every domain. But open another page on one of those domains and append the script again, and you will see that now it says it got jQuery from cache. So it looks like Safari caches per domain, even if it has already downloaded a resource from the exact same URL for another domain.
Is this assumption correct and if so, how to fix it?
Code you can copy/paste:
setTimeout(function() {
    var SCRIPT_SRC = '//ajax.googleapis.com/ajax/libs/jquery/1.11.1/jquery.min.js';
    var s = document.createElement('script');
    s.type = 'text/javascript';
    s.async = true;
    s.src = SCRIPT_SRC;
    var x = document.getElementsByTagName('script')[0];
    x.parentNode.insertBefore(s, x);
}, 0);
UPD: Tested it with a static image.
test.com, test2.com and test3.com all have <img src="http://image.com/image.jpg" />. In every browser except Safari, the access log shows only one (the first) request for the image. Safari fetches the image for every new domain (but not for a subdomain).
I've noticed this too, and I suspect it is for privacy reasons.
By default, Safari blocks third-party cookies. A third-party cookie is a cookie set by b.com for a resource that is requested by a.com. This can be used, for example, to track people across domains. You can have a script on b.com that is requested by a.com and by c.com. b.com can insert a unique client ID into this script based on a third-party cookie, so that a.com and c.com can track that this is the same person.
Safari blocks this behavior. If b.com sets a cookie for a resource requested by a.com, Safari will box that cookie so it is only sent to b.com for more requests by a.com. It will not be sent to b.com for requests by c.com.
Now enter caching, and specifically the ETag header. An ETag is an arbitrary string (usually a hash of the file) that can be used to determine whether the requested resource has changed since it was last requested. This is normally a good thing: it saves re-sending the entire file if it has not changed.
However, because an ETag is an arbitrary string, b.com can set it to include a client ID. This is called ETag tracking. It allows tracking a person across domains in almost exactly the same way as cookies do.
Summary: by not sharing the cache across domains, Safari protects you from cross-domain ETag tracking.
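For illustration only, here is a hypothetical Node.js sketch of what ETag tracking could look like on the server side (every name in it is invented for this example):

var http = require('http');
var crypto = require('crypto');

http.createServer(function (req, res) {
    var etag = req.headers['if-none-match'];
    if (etag) {
        // Returning visitor: the browser echoed "our" ETag back,
        // re-identifying the client without any cookie involved.
        res.writeHead(304, { 'ETag': etag });
        res.end();
    } else {
        // First visit: hand out a unique client ID disguised as an ETag.
        var id = '"' + crypto.randomBytes(8).toString('hex') + '"';
        res.writeHead(200, { 'ETag': id, 'Content-Type': 'application/javascript' });
        res.end('/* script body */');
    }
}).listen(8080);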
This is by design, something the Safari team calls Intelligent Tracking Prevention (https://webkit.org/blog/7675/intelligent-tracking-prevention/), and the cache is double-keyed based on document origin and third-party origin.
Based on research using HTTP Archive data and the Yahoo / Facebook studies on cache lifetimes, I doubt shared caching of jQuery etc. is effective: not enough sites use the same versions of the libraries, and the libraries don't live in cache for very long. So Safari's behaviour helps prevent tracking while not really affecting performance.
Rather than simply adding a DOM element, you could try using XMLHttpRequest. It lets you define custom headers, one of which is Cache-Control.
Give this a shot, it should override whatever's going on at the browser level:
(function () {
    var newRequest = function () {
        return (window.XMLHttpRequest) ? new XMLHttpRequest() : new ActiveXObject('Msxml2.XMLHTTP');
    };
    var loadScript = function (url) {
        var http = newRequest();
        http.onreadystatechange = function () {
            if (http.readyState === 4) {
                if (http.status === 200 || http.status === 304) {
                    appendToPage(http.responseText);
                }
            }
        };
        http.open('GET', url, true);
        // This is where you set your cache; note the header can only be
        // set after open() has been called
        http.setRequestHeader('Cache-Control', 'max-age=0'); // <-- change this to a value larger than 0
        http.send(null);
    };
    var appendToPage = function (source) {
        if (source === null) return false;
        var head = document.getElementsByTagName('head')[0];
        var script = document.createElement('script');
        script.type = 'text/javascript';
        script.defer = true;
        script.text = source;
        head.appendChild(script);
    };
    loadScript('//ajax.googleapis.com/ajax/libs/jquery/1.11.1/jquery.min.js');
})();
Note: Safari has had some issues with caching in the past. However, from what I understand it was mostly about serving stale content -- not the other way around.
Here are some suggestions:
Have you checked that the "disable cache" option is not enabled?
Are you looking at the HTTP status codes in the network dev panel?
Have you tried capturing the traffic with a tool like Wireshark?
Best regards.
I tried to POST/GET some vars to a PHP file via AJAX/d3.xml, but a print_r($_POST) / print_r($_GET) inside the PHP tells me they're empty. Even the example from the docs won't work for me.
d3.xml("xxx.php")
.header("Content-Type", "application/x-www-form-url-encoded")
/*
.get("a=2&b=3", function(error, data) {
console.log(error);
})
*/
.post("a=2&b=3", function(error, data) {
console.log(error);
})
.on("load", createTable(data));
The console tells me that there is a POST request, but under POST (on Firefox, not on Chrome) it lists the vars in one line, "a=2&b=3", instead of
a = 2
b = 3
as it usually does. What is it that I'm not getting?
First, apologies for the confusion in the comments. I was looking at the source for an old version of d3. In the newer code, it is indeed true that if you do not provide a callback to d3.xml it returns an xhr request that you can configure directly. (If you do provide a callback, it executes a GET right away as before).
You're going to hate this, so get ready. :) Your issue is because you have an extra dash in your Content-Type header. You have:
application/x-www-form-url-encoded
but it should be
application/x-www-form-urlencoded
Looking at a modified version of your jsfiddle in Firefox, I see the arguments show up in the list view as expected: http://jsfiddle.net/WaT6r/3/
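With that fixed, the call from the question should populate $_POST as expected (same endpoint and parameters as above):

d3.xml("xxx.php")
    .header("Content-Type", "application/x-www-form-urlencoded")
    .post("a=2&b=3", function(error, data) {
        console.log(error, data); // print_r($_POST) should now show a and b
    });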
I'm trying to post data (a chunk of a file) using an XMLHttpRequest object with an Int8Array as the data. It fails in FF18 but works perfectly in IE 10 & Chrome.
Here's my JS:
//dataObj is an Int8Array with approx. 33,000 items
var oReq = new XMLHttpRequest();
oReq.open("POST", "Ajax/PostChunk");
oReq.onload = function (oEvent) {
//
};
oReq.send(dataObj);
I use Firebug in Firefox to debug my JS, and when I watch the activity under the Net tab, nothing ever shows up for this XHR call, as if it were never made.
Also, prior to this call, I call jQuery's .ajax() method for "Ajax/PostChunkSize" and that works fine in all browsers, although it doesn't use an Int8Array for its data. I can't use .ajax() for this since .ajax() doesn't support Int8Array objects, as far as I know.
Does anyone know why Firefox doesn't even attempt to send this? Any questions, please ask.
Thanks in advance.
The ability to send a typed array (as opposed to an arraybuffer) is a recent addition to the in-flux XMLHttpRequest2 spec. It'll be supported in Firefox 20 in April or so (see https://bugzilla.mozilla.org/show_bug.cgi?id=819741 ) but in the meantime if your Int8Array covers its entire buffer, doing send(dataObj.buffer) should work...
Note that per the old spec the code above should have sent a string that looks something like "[object Int8Array]" instead of throwing; you may want to check to make sure that other browsers really are sending the array data and not that string.
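In other words, something like this sketch should work in FF18 as well, assuming dataObj is a view over its entire buffer:

var oReq = new XMLHttpRequest();
oReq.open("POST", "Ajax/PostChunk");
oReq.onload = function (oEvent) {
    // handle the response
};
// Firefox before 20 cannot send a typed array directly, but it can send
// the underlying ArrayBuffer. The two are only equivalent when the view
// starts at offset 0 and spans the whole buffer.
if (dataObj.byteOffset === 0 && dataObj.byteLength === dataObj.buffer.byteLength) {
    oReq.send(dataObj.buffer);
} else {
    oReq.send(dataObj);
}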
I'm trying to lock a row in a db-table when a user is editing the entry.
So there's a field lock in the table that I set to 1 on page load with PHP.
Then I was trying to unlock the entry (set it to 0) when the page is unloaded.
This is my approach. It works fine in IE but not in Firefox, Chrome etc.
The window.onbeforeunload handler works in all browsers, I tested that.
They just don't send the request.
BUT
if I simply put an alert after req.send(); it works in some browsers, but not Safari or Chrome. So I tried putting something else after it, just so that there's other stuff to do after the request, but that doesn't work.
function test() {
    var req = new Request({
        url: 'inc/ajax/unlock_table.php?unlock_table=regswimmer&unlock_id='
    });
    req.send();
    alert('bla'); // ONLY WORKS WITH THIS !?!?!?
}
window.onbeforeunload = test;
I've already tried different ways to do the request but nothing seems to work. And the request itself works, just not in this setup.
ANY help would be appreciated!
Thanks
The request is asynchronous by default. This means the browser fires it off and does not wait for it to complete, which may or may not happen (the page can unload before the request finishes). By placing the alert there you ensure that there is sufficient time for the request to complete.
Basically, you may be better off trying one of these things:
Add async: false to the request object options; this will ensure the request completes before the page moves away (see the sketch after this list).
Use an image instead, like a tracking pixel.
Move over to method: "get", which is a bit lighter as there is no request body to send and may complete faster (fall back to this if the synchronous request delays unloading too much).
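For the first option, a minimal sketch based on the code from the question (someid stands in for whatever ID you append to the URL):

function test() {
    new Request({
        url: 'inc/ajax/unlock_table.php?unlock_table=regswimmer&unlock_id=' + someid,
        method: 'get',
        async: false // block the unload until the request has finished
    }).send();
}
window.addEvent('beforeunload', test);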
you can do the image like so (will also be $_GET)
new Element("img", {
src: "inc/ajax/unlock_table.php?unlock_table=regswimmer&unlock_id=" + someid + "&seed=" + $random(0, 100000),
styles: {
display: "none"
}
}).inject(document.body);
Finally, use window.addEvent("beforeunload", test); instead of assigning window.onbeforeunload directly, or you may mess up MooTools' internal garbage collection.
I have seen simple example Ajax source code in many online tutorials. What I want to know is whether using the source code from the examples is perfectly alright or not.
Is there anything more to be added to the code that goes into a real-world application?
What steps should be taken to make the application more robust and secure?
Here is a sample source code I got from the web:
function getChats() {
    xmlHttp = GetXmlHttpObject();
    if (xmlHttp == null) {
        return;
    }
    var url = "getchat.php?latest=" + latest;
    xmlHttp.onreadystatechange = stateChanged;
    xmlHttp.open("GET", url, true);
    xmlHttp.send(null);
}

function GetXmlHttpObject() {
    var xmlHttp = null;
    try {
        xmlHttp = new XMLHttpRequest();
    } catch (e) {
        try {
            xmlHttp = new ActiveXObject("Msxml2.XMLHTTP");
        } catch (e) {
            xmlHttp = new ActiveXObject("Microsoft.XMLHTTP");
        }
    }
    return xmlHttp;
}
The code you posted is missing one important ingredient: the function stateChanged.
In case you don't quite understand the code you posted yourself: when the call to getchat.php is complete, a function stateChanged is called, and that function is responsible for handling the response. Since the script you're calling and the function itself are prefixed with "get", I'm pretty sure the response is something you're going to be interested in.
That aside, there are a number of ways to improve the code you posted. I'd guess it works by declaring a single xmlHttp object and then making that available to every function (if it didn't, the stateChanged function would have no way of getting the response). This is fine until you start an AJAX request while the last one (or last few) hasn't replied yet, in which case the object reference is overwritten by the latest request each time.
Also, any AJAX code worth its salt provides functionality for success and failure (server errors, page not found, etc.) cases, so that the appropriate message can be delivered to the user.
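A minimal sketch of what stateChanged could look like, covering both cases (the element ID is made up for illustration):

function stateChanged() {
    if (xmlHttp.readyState === 4) { // request finished
        if (xmlHttp.status === 200) {
            // success: render the response somewhere useful
            document.getElementById("chatWindow").innerHTML = xmlHttp.responseText;
        } else {
            // failure: tell the user instead of failing silently
            alert("Request failed with status " + xmlHttp.status);
        }
    }
}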
If you just want to use AJAX functionality on your website then I'd point you in the direction of jQuery or a similar framework.
BUT if you actually want to understand the technology and what is happening behind the scenes, I'd continue doing what you're doing and ask specific questions as you try to build a small, lightweight AJAX class on your own. This is how I did it, and although I use the jQuery framework today, I'm still glad I know how it works behind the scenes.
I would use a framework like DOMAssistant which has already done the hard work for you and will be more robust as well as adding extra useful features.
Apart from that, your code looks like it would do the job.
I would honestly recommend using one of the many libraries available for Ajax. I use Prototype myself, while others prefer jQuery. I like Prototype because it's pretty minimal. The Prototype Ajax tutorial explains it well. It also allows you to handle errors easily.
new Ajax.Request('/some_url', {
    method: 'get',
    onSuccess: function(transport) {
        var response = transport.responseText || "no response text";
        alert("Success! \n\n" + response);
    },
    onFailure: function() { alert('Something went wrong...'); }
});