If I do an XMLHttpRequest request to some server and that server doesn't return anything (it just hangs), will the XMLHttpRequest eventually time out?
Try:
xhr.timeout = 10000; // milliseconds
xhr.ontimeout = function () { alert('Timeout!'); };
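For context, here is a minimal self-contained sketch showing the timeout alongside the other handlers; the URL is a placeholder:
// Sketch only; 'https://example.com/slow-endpoint' is a placeholder URL.
var xhr = new XMLHttpRequest();
xhr.open('GET', 'https://example.com/slow-endpoint', true);
xhr.timeout = 10000; // abort the request if no response arrives within 10 seconds
xhr.ontimeout = function () { console.log('Request timed out'); };
xhr.onload = function () { console.log('Response received with status', xhr.status); };
xhr.onerror = function () { console.log('Network error'); };
xhr.send();
Without an explicit timeout, whether the request ever gives up is left to the browser's and network stack's own limits.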
I have a page on localhost:4000 that has a cookie set by the server.
The page also contains a script that successfully makes an XHR request back to the server when the page loads. The response to this XHR request sets a second cookie. I can only see the non-XHR cookie in Chrome DevTools under Application (tab) > Storage (menu group on left) > Cookies > http://localhost:4000.
I can see the XHR cookie returned from the server in the Network tab, which, if the page is loaded a second time, shows that both the non-XHR and XHR cookies are correctly included in the XHR request:
Request Cookies
  xhr_cookie      valueABC
  from_homepage   value123
Response Cookies
  xhr_cookie      valueABC
So the XHR cookie is being persisted somewhere but I can't see it in Chrome's devtools.
Not an answer for Chrome, but a workaround is to use Firefox and enable the Storage Inspector from the gear wheel in the web developer pane.
https://developer.mozilla.org/en-US/docs/Tools/Storage_Inspector
Same-origin requests were fine.
Cross-origin requests have some limitations.
File: 1.php
<?php
setcookie("cookie_name_1", "cookie_value_1", time() + (86400 * 30), "/");
?>
<script>
var xhr = new XMLHttpRequest();
xhr.open('GET', 'http://foo.ir/2.php', true);
xhr.withCredentials = true;
xhr.onreadystatechange = function() {
    if (this.readyState == xhr.DONE) {
        get_res();
    }
};
xhr.send(null);

function get_res() {
    var xhr = new XMLHttpRequest();
    xhr.open('GET', 'http://foo.ir/2.php?is_set', true);
    xhr.withCredentials = true;
    xhr.onload = function () {
        if (xhr.readyState === xhr.DONE) {
            if (xhr.status === 200) {
                console.log(xhr.responseText);
            }
        }
    };
    xhr.send(null);
}
</script>
File: 2.php
<?php
header('Access-Control-Allow-Origin: http://localhost');
header('Access-Control-Allow-Credentials: true');
if (isset($_GET["is_set"])) {
    if (isset($_COOKIE["cookie_name_2"]))
        echo "cookies are set: " . $_COOKIE["cookie_name_2"];
    else
        echo "cookies not set";
} else {
    setcookie("cookie_name_2", "cookie_value_2", time() + (86400 * 30), "/");
}
?>
Cross-origin cookies do work, but you need to allow third-party cookies in the browser settings.
I couldn't find where third-party cookies are stored.
Chrome won't show these cookies, and they won't affect the real website.
Firefox and Edge save the cookies in the third-party website's storage, so they will affect the real third-party website.
More information can be found here.
According to XMLHttpRequest Level 1 and XMLHttpRequest Level 2, this particular response header (Set-Cookie) falls under the "forbidden" response headers that you cannot obtain using getResponseHeader(), so the only reason this could work is basically a "naughty" browser.
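As a quick illustrative sketch (the endpoint is hypothetical), a spec-compliant browser returns null here rather than the cookie header:
// Sketch only; '/login' stands in for any endpoint that responds with Set-Cookie.
var xhr = new XMLHttpRequest();
xhr.open('GET', '/login', true);
xhr.onload = function () {
    // Set-Cookie is a forbidden response-header name, so a compliant
    // browser returns null instead of the header value.
    console.log(xhr.getResponseHeader('Set-Cookie'));
};
xhr.send();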
This should show up as a separate "Cookies" tab when you inspect the XHR request. It's easy to miss because the tab only shows when withCredentials is set to true.
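For reference, a minimal sketch of a credentialed request (the URL is a placeholder for the endpoint that sets the cookie):
// Sketch only; the URL is a placeholder.
var xhr = new XMLHttpRequest();
xhr.open('GET', 'http://localhost:4000/api/session', true);
xhr.withCredentials = true; // per the note above, needed for the Cookies tab to appear
xhr.send();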
In Chrome, disable "Site isolation":
Go to chrome://flags/#site-isolation-trial-opt-out
Set Disable site isolation to Disabled
Relaunch
For further details see: https://blog.ermer.de/2018/06/11/chrome-67-provisional-headers-are-shown
I'm trying to log in from one of my servers to another in order to send cross-origin requests that require being logged in. Is it possible?
I have two web servers, A and B. Let's say www.a.com and www.b.com.
B has an API that can be used only if the client is logged in. I need to use that API from A's clients.
So, from an A client I send an Ajax (POST) login request to B. B responds with CORS headers, the session cookie, and a successful redirection to B's home/index.
But when I make a second Ajax request (a JSONP request) from the A client to the B server, this request doesn't send the previously received session cookie, so it fails as not logged in.
If I log in to www.b.com manually (in a second browser tab), all requests from A to B are correctly detected as coming from a logged-in user, so the B API works from A.
I think the session cookie received from my login request is not being saved by the browser.
This is my login request:
$.post("www.b.com/login", { 'j_username': 'username', 'j_password': 'password' } );
Using:
jqXHR.withCredentials = true;
settings.crossDomain = true;
Response headers:
Access-Control-Allow-Headers:x-requested-with
Access-Control-Allow-Methods:POST, GET, OPTIONS
Access-Control-Allow-Origin:*
...
Location:http://www.b.com/home
...
Set-Cookie:JSESSIONID=tY++VWlMSxTTUkjvyaRelZ0o; Path=/
Is the received cookie being saved for www.a.com or for www.b.com? How can I set this cookie for www.b.com from an A-client Ajax request? I think that is the problem.
As apsillers said, we can't use the wildcard Access-Control-Allow-Origin: * for credentialed requests.
But this alone didn't solve the problem.
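For reference, a credentialed cross-origin response has to echo the specific calling origin and allow credentials, roughly like this (header values are illustrative, with www.a.com as the calling origin):
Access-Control-Allow-Origin: http://www.a.com
Access-Control-Allow-Credentials: true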
I was setting jqXHR.withCredentials = true; inside a beforeSend handler function.
$.post({
    ...
    beforeSend: function(xhr) {
        xhr.withCredentials = true;
    },
    ...
});
And for some reason, that didn't work. I had to set the credentials option directly instead:
$.post({
    ...
    xhrFields: {
        withCredentials: true
    },
    ...
});
This code works perfectly! Thank you.
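For completeness, a minimal self-contained sketch of the working call (the URL and field names are taken from the question above and are illustrative):
// Sketch only; adjust the URL and fields to your setup.
$.ajax({
    type: 'POST',
    url: 'http://www.b.com/login',
    data: { j_username: 'username', j_password: 'password' },
    crossDomain: true,
    xhrFields: {
        withCredentials: true // send and accept cookies on the cross-origin request
    }
}).done(function (data) {
    console.log('Logged in:', data);
});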
I'm trying to make a connection to a Node server from a page served by an Apache server, but I'm running into issues with the connection. I was getting cross-domain errors until I set the origin to allow all, but now I'm getting a 400 Bad Request error.
server
var http = require('http');
var io = require('socket.io');
var server = http.createServer(function(request, response) {
    console.log('Connection');
    response.writeHead(200, {'Content-Type': 'text/html'});
    //response.write('hello world');
    response.end();
});
server.listen(8001);
io.listen(server);
var socket = io.listen(server);
socket.set('origins', '*');
socket.on('connection', function() {
    console.log('mooo');
});
client
<script src="https://cdn.socket.io/socket.io-1.3.5.js"></script>
<script src = "http://localhost:8001"> </script>
<script>
var socket = io.connect('http://localhost:8001');
</script>
error:
GET XHR http://localhost:8001/socket.io/ [HTTP/1.1 400 Bad Request 2ms]
Your code runs fine without the following lines, but I'm not running it behind Apache. I got the output "hello world" in the browser (http://localhost:8001/).
var socket = io.listen(server);
io.set('origins', '*');
io.on('connection', function() {
    console.log('mooo');
});
How are you accessing the page? A 404 means the page you requested could not be located.
I have a Scrapyd server running and I'm trying to schedule a job.
When I try the following using curl it works fine:
curl http://XXXXX:6800/schedule.json -d project=stackoverflow -d spider=careers.stackoverflow.com -d setting=DOWNLOAD_DELAY=2 -d arg1=val1
After that I built a small UI in Angular to have a GUI for this, and made an AJAX request to do the above:
var baseurl = GENERAL_CONFIG.WebApi_Base_URL[$scope.server];
var URI = baseurl + "schedule.json"; // http://XXXXX:6800/schedule.json
var headers = {'content-type': 'application/x-www-form-urlencoded'};
console.log(URI);
$http.post(URI, data = $scope.Filters, headers).success(function (data, status) {
    console.log(data);
}).error(function (data, status) {
    console.log(status);
    alert("AJAX failed!");
});
But I am getting a "No 'Access-Control-Allow-Origin' header is present on the requested resource" error.
Can anyone help me resolve this?
And why does it work with curl but not with my AJAX request?
Thanks.
This is because of a browser protection called the same-origin policy. It prevents Ajax requests to a different combination of scheme, hostname, and port number. curl has no such restriction.
To get around it you will either have to put both the API and the client app on the same domain and port, or add the CORS header Access-Control-Allow-Origin to the server's responses.
One other option is to use JSONP. That may be suitable in this case just to fetch JSON data, but it's not suitable for REST APIs. In Angular, use $http.jsonp for this, as sketched below.
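A minimal sketch, assuming AngularJS 1.x (before 1.6, which uses the JSON_CALLBACK placeholder) and a server endpoint that supports a JSONP callback parameter; the URL is illustrative:
// Sketch only; JSONP works for GET requests and requires the server
// to wrap the JSON response in the supplied callback function.
$http.jsonp('http://XXXXX:6800/some-endpoint.json?callback=JSON_CALLBACK')
    .then(function (response) {
        console.log(response.data);
    }, function (error) {
        console.log('JSONP request failed', error);
    });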
I'm trying to make an Ajax call from a page on a server (HTTP) on the internet to my own localhost. Firefox, Chrome, etc. work; it's only an IE issue. I'm using IE 11 and 10.
The request isn't even made; the "Access denied" error is thrown instantly.
This is the code, just for reference.
It is not the classic HTTP/HTTPS error in IE8 and IE9. This is something else, but the documentation is not helpful.
$jq.ajax({
    contentType: 'application/json',
    url: url,
    dataType: 'json',
    crossDomain: true,
    beforeSend: function (xhr) {
        xhr.withCredentials = true;
        xhr.setRequestHeader("Authorization", "Basic " + $jq.base64.encode(username + ':' + password));
    },
    success: function (data, status, headers) {},
    error: function (xhr, status, error) {}
});
The status in the xhr object is 0 and the error is "Access denied".
Internet Explorer raises this error as part of its security zones feature. Using default security settings, an "Access is Denied" error is raised when attempting to access a resource in the "Local intranet" zone from an origin in the "Internet" zone.
If you were writing your Ajax code manually, Internet Explorer would raise an error when you try to open the resource. For example:
var xhr = new XMLHttpRequest();
xhr.open('GET', 'http://localhost/', true); // This line will trigger an error
xhr.send();
You can work around this error by adding the origin site to the "Trusted sites" security zone. You can test this by adding "http://client.cors-api.appspot.com" to your "Trusted sites" zone and using this test page at test-cors.org with your localhost site as the Remote URL.
In addition to the trusted-site requirement, I found that the problem was not fixed until I used the same protocol for the request as my origin; e.g. my test site was hosted on HTTPS but failed with any destination using HTTP (without the S).
This only applies to IE; Chrome just politely logs a warning in the debug console and doesn't fail.
If you are attempting to make cross-origin ajax requests in IE9, you'll need to use XDomainRequest instead of XMLHttpRequest. There is a jQuery plug-in that wraps XDR. You should be aware that there are some notable limitations of XDR.
Another option would be to use a library like this: https://github.com/jpillora/xdomain.
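For reference, a minimal sketch of using XDomainRequest directly (the URL is a placeholder):
// Sketch only; XDomainRequest exists in IE8/IE9 and has notable limitations:
// same scheme as the page, GET/POST only, no custom headers, no cookies.
if (window.XDomainRequest) {
    var xdr = new XDomainRequest();
    xdr.open('GET', 'http://example.com/api');
    xdr.onload = function () { console.log(xdr.responseText); };
    xdr.onerror = function () { console.log('XDR request failed'); };
    xdr.send();
}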
jQuery implements Ajax calls using the XMLHttpRequest object, which does not support cross-origin requests in IE9. You have to force it to use XDomainRequest instead.
I get around this problem using this jQuery plugin:
https://github.com/MoonScript/jQuery-ajaxTransport-XDomainRequest
Note:
Do not use "http://www.domain.xxx" or "http://localhost/" or "IP" for URL in Ajax.
Only use path(directory) and page name without address.
false state:
var AJAXobj = createAjax();
AJAXobj.onreadystatechange = handlesAJAXcheck;
AJAXobj.open('POST', 'http://www.example.com/dir/getSecurityCode.php', true);
AJAXobj.setRequestHeader('Content-Type', 'application/x-www-form-urlencoded; charset=UTF-8');
AJAXobj.send(pack);
Working case:
var AJAXobj = createAjax();
AJAXobj.onreadystatechange = handlesAJAXcheck;
AJAXobj.open('POST', 'dir/getSecurityCode.php', true); // <<--- note
AJAXobj.setRequestHeader('Content-Type', 'application/x-www-form-urlencoded; charset=UTF-8');
AJAXobj.send(pack);
function createAjax()
{
    var ajaxHttp = null;
    try
    {
        // Prefer the legacy ActiveX object (old IE); otherwise fall back
        // to the native XMLHttpRequest.
        if (typeof ActiveXObject == 'function')
            ajaxHttp = new ActiveXObject("Microsoft.XMLHTTP");
        else if (window.XMLHttpRequest)
            ajaxHttp = new XMLHttpRequest();
    }
    catch (e)
    {
        alert(e.message);
        return null;
    }
    return ajaxHttp;
}