I am loading a large (~300 MB) JSON file by using the following code:
$.ajax({
    type: 'GET',
    url: path,
    dataType: 'json',
    data: {},
    async: false,
    success: function(json_object) {
        console.log("success!");
    },
    error: function(request, error) {
        console.log(request["statusText"]);
    }
});
Running it outputs "InternalError: allocation size overflow". Is there any way to get around this that does not involve making the file smaller?
You'll need to set up a buffer. However, why on earth are you passing so much data? That would be an extremely unreasonable wait for any user.
EDIT
Buffering isn't really something you can do from the ajax side (according to How to buffering an Ajax Request?). However, you can set something up server side (if it's your server returning the data) to send it in pieces, then use ajax to request each piece.
If it's not your server, or you're requesting from an API or something, then look and see whether it accepts any parameters to limit the size of the returned object - that way you can request it in chunks.
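For illustration, the client side of that chunked approach could look roughly like this, assuming the server accepted hypothetical offset/limit parameters and returned one JSON array per piece:

// Rough sketch only: "offset" and "limit" are hypothetical server parameters
function loadInChunks(path, offset, chunkSize, allItems) {
    $.ajax({
        type: 'GET',
        url: path,
        dataType: 'json',
        data: { offset: offset, limit: chunkSize },
        success: function(chunk) {
            allItems = allItems.concat(chunk);
            if (chunk.length === chunkSize) {
                // More data left; request the next piece
                loadInChunks(path, offset + chunkSize, chunkSize, allItems);
            } else {
                console.log("done, received " + allItems.length + " items");
            }
        },
        error: function(request) {
            console.log(request.statusText);
        }
    });
}

loadInChunks(path, 0, 10000, []);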
Related
I have created a chat room which uses an AJAX request to check for new messages every second via the setTimeout function. I have this working, but my only question is: is it okay to request data from the server every second, or could that cause problems? Below is my code:
function refresh(){
    setTimeout(function(){
        $.ajax({
            type: 'POST',
            url: 'checkNewMessage.php',
            data: { sender: $sender, recipient: $recipient },
            success: function(response) {
                $('#newComm').val(response);
                if($('#newComm').val() > $('#oldComm').val()){
                    $.ajax({
                        type: 'POST',
                        url: 'appendNewMessage.php',
                        data: { sender: $sender, recipient: $recipient },
                        success: function(response) {
                            $("#chatRoom").prepend(response).fadeIn(4000);
                            $('#oldComm').val($('#newComm').val());
                        }
                    });
                }else{}
            }
        });
        refresh();
    },1000);
}
Well, this is a question with an "it depends" answer.
Polling the server with timed requests is not the best way of achieving what you want. For this I recommend using WebSockets: https://developer.mozilla.org/en-US/docs/Glossary/WebSockets
But back to your question. It depends on your server and the load it is going to take. Let's say you have ten active users; your server would then take about 10 requests per second - not too much.
You could run a benchmark and see how many requests per second your server can handle. But handling requests is not the same as answering each request in a timely manner.
If you don't have many users in your chat, you might be OK with this approach. For bigger loads I highly recommend switching to WebSockets.
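For reference, a very rough sketch of what the client side could look like with WebSockets (the ws://example.com/chat endpoint and the message format are invented; a matching server is required):

// Minimal client-side sketch; endpoint and message format are made up
var socket = new WebSocket('ws://example.com/chat');

socket.onopen = function() {
    // Tell the server which conversation we want to follow
    socket.send(JSON.stringify({ sender: $sender, recipient: $recipient }));
};

socket.onmessage = function(event) {
    // The server pushes new messages as they happen - no polling needed
    $("#chatRoom").prepend(event.data).fadeIn(4000);
};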
This is how I make my POST request. data contains an array of selectedIds, which is possible thanks to the traditional property. The problem is that if I need to POST 200 selectedIds, the URL gets too long and the request breaks. What is best practice for solving this? The only idea I can think of is looping over my ids and POSTing them in smaller chunks, but I don't know the downsides - or is there a more standard way of doing this?
POST:
var addToBuffer = function(url, data) {
    return $.ajax({
        traditional: true,
        url: url,
        dataType: "html",
        data: data
    });
},
URL (there are 200+ selectedIds):
http://localhost/foo/Buffer/AddToComputerAndDevicesBuffer?selectedIds=2639&selectedIds=5386&selectedIds=3225&selectedIds=6791&selectedIds=3231&selectedIds=357 ...
Error message:
"HTTP Error 404.15 - Not Found"
"The request filtering module is configured to deny a request where the query string is too long."
"Most likely causes:"
- "Request filtering is configured on the Web server to deny the request because the query string is too long."
"Things you can try:"
- "Verify the configuration/system.webServer/security/requestFiltering/requestLimits#maxQueryString setting in the applicationhost.config or web.config file."
Update 1:
I realized, just before posting this question, that I am not POSTing but using GET. It does work with 200 ids when I use:
var addToBuffer = function(url, data) {
    return $.ajax({
        type: "POST",
        traditional: true,
        url: url,
        dataType: "html",
        data: data
    });
},
But that quickly raises another question: will this still work when I scale up to 10k+ or even 1M+ ids? Will I get a timeout, or am I DDoS'ing my own server? Am I back to the loop-with-smaller-chunks solution, or does anyone have good advice on how they solved this before? Thanks.
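For illustration, the loop-with-smaller-chunks idea mentioned above could look roughly like this (selectedIds here stands for the plain array of ids, addToBuffer is the helper from the snippet, and 500 per batch is arbitrary):

// Hypothetical sketch: split the ids into batches and POST each batch
var batchSize = 500;
var requests = [];
for (var i = 0; i < selectedIds.length; i += batchSize) {
    requests.push(addToBuffer(url, {
        selectedIds: selectedIds.slice(i, i + batchSize)
    }));
}
// $.when resolves once every batched request has completed
$.when.apply($, requests).done(function () {
    console.log("all batches sent");
});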
I'm using Web API without a deep understanding of what it is, just knowing that each editable entity becomes a resource, meaning it has a URI, and Web API provides the interpretation of the PUT, POST, GET, DELETE HTTP commands to support CRUD operations. But what if, for tracing/logging purposes, I need to send a correlation token together with e.g. a GET request? Are there any recommendations and techniques for adding such "technical parameters" to the HTTP request/routing?
I have found something that needs to be tested, https://webapicorrelator.codeplex.com/, but actually I would prefer just to understand how it could work...
Or just add it to the header using jQuery AJAX headers:
return $.ajax({
    url: url,
    data: { ElectrodeId: electrodeId },
    headers: { "X-CorrelationToken": correlationToken },
    method: "POST",
    dataType: "json",
    success: function (data) {
    }
});
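If the token needs to go out with every request rather than just this one, jQuery can also set it as a default for all subsequent calls - a small sketch, assuming correlationToken is already defined as above:

// Adds the header to every subsequent $.ajax call on the page
$.ajaxSetup({
    headers: { "X-CorrelationToken": correlationToken }
});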
I have a C# web application which uses AJAX to GET and POST data. Is there any difference between the GET and POST methods in how data is passed (in terms of contentType, data, dataType)?
$.ajax({
    type: 'GET',
    url: "url",
    contentType: "application/json; charset=utf-8",
    data: { value: "data" },
    dataType: "json",
    success: function (data) {
        alert(data);
    },
    error: function (data) {
        alert("In error");
    }
});
GET encodes the information into the URL; the more data you send, the longer your URL becomes.
POST sends the data in the body of the request, so your URL remains unmodified.
While that may not seem like a huge deal, URLs do have a maximum length, and errors will ensue if you exceed it. In addition, a call to a specific URL may fail due to the modifications GET makes to it. Apart from that, they are similar enough in function to be interchangeable for most purposes.
With a normal (non-AJAX) form submission too, GET is used to send small, non-sensitive chunks of data to the server in the query string, whereas POST is used for sending larger or sensitive data in the request body.
When using AJAX, GET is commonly used; POST makes sense when you have to do DB interactions on the server or there's sensitive data involved. Read more here: http://www.jquery4u.com/ajax/key-differences-post/
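To make the difference concrete, the same jQuery call puts the data in two different places depending on type (the "url" here is just a placeholder, as in the question):

// GET: jQuery appends the data to the query string, e.g. "url?value=data"
$.ajax({ type: 'GET', url: "url", data: { value: "data" }, dataType: "json" });

// POST: jQuery sends the data in the request body; the URL stays as "url"
$.ajax({ type: 'POST', url: "url", data: { value: "data" }, dataType: "json" });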
I have to query (via AJAX) 2 scripts at the same time.
I know for sure that one is really quick - it just displays some HTML; the second does a query using a web service.
The quick request is always sent after the first one. But with all my attempts, the fast one never completes before the slow one.
The code used to call the first, long AJAX request:
$.ajax({
    type: "POST",
    url: '/fr/ajax_flight_get_other_oneway',
    cache: false,
    dataType: 'json',
    success: function(data) {
        // some treatment
    }
});
The code for the second faster ajax request:
$.ajax({
    type: "POST",
    url: '/fr/load_back_forflight?id=SN4422_23',
    cache: false,
    data: "comps=" + compSelectedCodes + "&escale=" + escale,
    dataType: 'json',
    success: function(data) {
        // some treatment
    }
});
Is it something in Apache that should be changed or is it in jQuery?
I found the solution to my problem; it was linked to the session.
The session was file-based, so the first (long) query locks the session file, and the second one is then forced to wait for the long query to finish.
By storing the session in the DB, I resolved the problem.
Thanks for your help.
Put the slow one in the success callback of the fast one. This will guarantee that the fast request will finish first before starting the second request.
It's possible that the browser decided to use the same HTTP connection for both (using the HTTP header Keep-alive) and thus it appears queued. This is not a jQuery thing -- it's something that browsers can opt to do.
Use your browser's HTTP network traffic debugger to see if that's the case.
If not, then your web-server may be only allowing one connection per client and is queueing them. See this:
How do I configure Apache2 to allow multiple simultaneous connections from same IP address?
Try this:
$.ajax({
    type: "POST",
    url: '/fr/ajax_flight_get_other_oneway',
    cache: false,
    dataType: 'json',
    success: function(data) {
        // some treatment

        // The code for the second, faster ajax request:
        $.ajax({
            type: "POST",
            url: '/fr/load_back_forflight?id=SN4422_23',
            cache: false,
            data: "comps=" + compSelectedCodes + "&escale=" + escale,
            dataType: 'json',
            success: function(data) {
                // some treatment
            }
        });
    }
});
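As a side note, $.ajax returns a promise-like object, so the same sequencing can also be written without nesting - a sketch assuming a reasonably recent jQuery (1.8+):

$.ajax({
    type: "POST",
    url: '/fr/ajax_flight_get_other_oneway',
    cache: false,
    dataType: 'json'
}).then(function(data) {
    // some treatment of the first response, then start the second request
    return $.ajax({
        type: "POST",
        url: '/fr/load_back_forflight?id=SN4422_23',
        cache: false,
        data: "comps=" + compSelectedCodes + "&escale=" + escale,
        dataType: 'json'
    });
}).then(function(data) {
    // some treatment of the second response
});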