Should I make multiple Ajax requests or combine them into one?

I am building some HTML reports. The user can choose to view additional data for individual elements of the report, or to view all additional data at once.
To view a single line of additional data, an Ajax request is made.
My question is: if a user clicks "View all additional data", should I make 20 or so asynchronous Ajax calls, or just a single Ajax call that might take a little longer?
Aside from usability, are there any best practices regarding lots of smaller Ajax requests vs. one larger one?

I would say normally you would want to make one call. You're sending a request to the server; while you are there, just get all the data you need before coming back. Depending on the situation, you could also cache some of the data (by storing it in a variable) to limit the amount of information you have to retrieve.
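As a minimal sketch of the single-call approach, assuming jQuery; the '/report/details' endpoint and the markup hooks are made up for illustration:

```js
// Fetch details for all rows in one request instead of ~20 separate ones.
// Endpoint and markup ids are assumptions, not from the question.
$('#view-all').on('click', function () {
    var ids = $('.report-row').map(function () { return this.id; }).get();
    $.getJSON('/report/details', { ids: ids.join(',') }, function (details) {
        // Server is assumed to return a map of row id -> detail markup.
        $.each(details, function (id, html) {
            $('#' + id).find('.details').html(html).show();
        });
    });
});
```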

Related

Automatically saving data in ColdFusion

Instead of the traditional posting of forms (with a save button) to save data to a database using ColdFusion, is there a sensible way of having information saved as the user exits each field?
Is this even good practice?
All you need to do, via JavaScript, is assign a change event to every field, then define that the event makes an Ajax call to save the data in that particular field. You should only need a single target URL that takes a primary key and the field name in question. A sketch of that idea follows below.
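A minimal client-side sketch, assuming jQuery; the form selector, the 'saveField.cfm' URL, and the data attribute are hypothetical:

```js
// Assign a change handler to every field in the form.
$('#editForm :input').on('change', function () {
    $.post('saveField.cfm', {
        id: $('#editForm').data('record-id'), // primary key of the record (assumed attribute)
        field: this.name,                     // which column to update
        value: $(this).val()                  // the new value for that column
    });
});
```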
What you really need to consider though, is the bandwidth required to support such a process. What is your current load? Concurrent users? Concurrent form usage?
If you have 100 people filling out a 10-field form, you currently have 100 HTTP POST requests to deal with. Can you handle 1,000 HTTP POST calls if every field saves on its own? What about 1,000 people at a time? 10k? 100k? And larger forms: how many of those do you have?
The functionality is fairly trivial to implement; what is not trivial is the potential impact on your infrastructure.

From an AJAX action, is returning JSON or HTML preferred?

On my website's landing page, I am calling various AJAX actions, but the performance is currently poor. These actions are:
To get latest articles
To get latest news
To get latest Jobs
To get recently added users, etc.
I am showing all this information in a dashboard for each AJAX action.
My question is:
From my AJAX actions, should I return HTML or JSON? Which one would be better from a performance and maintenance point of view?
I have the following points on these approaches:
HTML
Pros:
1. Will be easy to code.
2. Easy to maintain. If there is any UI change in the dashboard, with HTML it would be easy to do.
Cons:
1. Performance hit, as the complete HTML would be sent to the client side.
JSON
Pros:
1. Good performance, as the data transfer size would be smaller.
Cons:
1. A UI change in the dashboard would be comparatively difficult, as I would need to change the JS rendering logic.
I want to understand whether my assumptions are correct, and whether there are any other points to consider with these approaches.
Loading and embedding HTML directly, as opposed to just sending the data and turning it into a DOM structure client-side, should not make much difference in performance.
Usually the greatest performance “killer” in an HTML page environment is HTTP requests – they take close to “forever” compared to everything else you do client-side. So if you have to pull data for multiple such widgets, it might be a good idea to consolidate those data transfers into just one HTTP request, and have the different widgets read their data from it once it's loaded. And for that, a data format like JSON might be preferable to HTML.
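As an illustration of that consolidation, here is a hedged sketch; the '/dashboard/data' endpoint, the payload fields, and the renderer are all hypothetical:

```js
// One request returns all four datasets as JSON; each widget reads its slice.
$.getJSON('/dashboard/data', function (data) {
    renderList('#articles', data.articles);
    renderList('#news', data.news);
    renderList('#jobs', data.jobs);
    renderList('#users', data.recentUsers);
});

// Example renderer: turn an array of { title: ... } items (assumed shape)
// into list entries client-side.
function renderList(selector, items) {
    var $list = $(selector).empty();
    $.each(items || [], function (i, item) {
        $list.append($('<li>').text(item.title));
    });
}
```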

Large number of concurrent ajax calls and ways to deal with it

I have a web page which, upon loading, needs to do a lot of JSON fetches from the server to populate various things dynamically. In particular, it updates parts of a large-ish data structure from which I derive a graphical representation of the data.
It works great in Chrome; however, Safari and Firefox appear to suffer somewhat. While the numerous JSON requests are in flight, those browsers become sluggish and unusable. I am assuming this is due to the rather expensive iteration of said data structure. Is this a valid assumption?
How can I mitigate this without changing the query language so that it's a single fetch?
I was thinking of applying a queue that could limit the number of concurrent Ajax queries (and hence also limit the number of concurrent updates to the data structure)... Any thoughts? Useful pointers? Other suggestions?
In browser-side JS, create a wrapper around jQuery.post() (or whichever method you are using) that appends the requests to a queue.
Also create a function 'queue_send' that will actually call jQuery.post(), passing the entire queue structure.
On the server, create a proxy function called 'queue_receive' that replays the JSON to your server interfaces as though it came from the browser, collects the results into a single response, and sends it back to the browser.
The browser-side queue_send_success() (success handler for queue_send) must then decode this response and populate your data structure.
With this, you should be able to reduce your initialization traffic to one actual request, and maybe consolidate some other requests on your website as well.
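A rough sketch of that wrapper, keeping the names from this answer ('queue_send', 'queue_receive'); the batch payload shape and the two update helpers are assumptions:

```js
var queue = [];

// Wrapper: queue requests instead of sending them immediately.
function queued_post(url, data) {
    queue.push({ url: url, data: data });
}

// Ship the whole queue in one request; the server-side 'queue_receive'
// proxy replays each entry and collects the results into a single response.
function queue_send() {
    $.post('/queue_receive', { batch: JSON.stringify(queue) },
           queue_send_success, 'json');
    queue = [];
}

// Decode the combined response and populate the data structure once.
function queue_send_success(responses) {
    $.each(responses, function (i, r) {
        updateDataStructure(r); // hypothetical: merge one result into the model
    });
    redrawGraph();              // hypothetical: re-render the representation once
}
```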
“In particular, it updates parts of a largish data structure from which I derive a graphical representation of the data.”
I'd try:
Queuing responses as they come in, then updating the structure once
Keeping the representation hidden until all the responses are in
Magicianeer's answer is also good - but I'm not sure whether it fits your definition of "without changing the query language so that it's a single fetch" - though it would avoid re-engineering existing logic.
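If you do want to keep the individual fetches, a small concurrency-limited queue along the lines the question suggests might look like this; the limit of 4 and all names here are arbitrary assumptions:

```js
var MAX_CONCURRENT = 4, pending = [], active = 0;

// Enqueue a JSON fetch instead of firing it immediately.
function throttled_getJSON(url, done) {
    pending.push({ url: url, done: done });
    drain();
}

// Start queued requests, at most MAX_CONCURRENT at a time.
function drain() {
    while (active < MAX_CONCURRENT && pending.length) {
        var job = pending.shift();
        active++;
        $.getJSON(job.url, job.done).always(function () {
            active--;
            drain(); // launch the next queued request as each one finishes
        });
    }
}
```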

Best practice for combining requests with possible different return types

Background
I'm working on a web application utilizing AJAX to fetch content/data and what have you - nothing out of the ordinary.
On the server side, certain events can happen that the client-side JavaScript framework needs to be notified about, and vice versa. These events are not always related to the user's immediate actions. It is not an option to wait for the next page refresh to include them in the document, or to stick them in some hidden fields, because the user might never submit a form.
Right now it is designed in such a way that events to and from the server ride along with the user's requests. For instance, if the user clicks a 'view details' link, this fires a request to the server to fetch some HTML or JSON with details about the clicked item. Along with this request, or rather the response, a server-side (invoked) event will return with the content.
Question/issue 1:
I'm unsure how to control the queue of events going to the server. They can ride along with user-invoked requests, but if those do not occur, the events will get lost. I imagine setting up a timer to send these events to the server in case the user does not perform some action. What do you think?
Question/issue 2:
With regard to the responses, some being requested as HTML and some as JSON, it is a bit tricky, as I would have to somehow wrap all this data to allow both formalized (and unrelated) events and perhaps HTML content, depending on the request, to return to the client. Any suggestions? Anything I should be aware of, for instance when returning HTML content wrapped in a JSON bundle?
Update:
Do you know of any framework that uses an approach like this, that I can look at for inspiration (that is a framework that wraps events/requests in a package along with data)?
I am tackling a similar problem to yours at the moment. On your first question, I was thinking of implementing some sort of timer on the client side that makes an asynchronous call for the content on expiry.
On your second question, I normally just return JSON representing the data I need, and then present it by manipulating the document model. I prefer to keep things consistent.
As for best practices, I can't say for sure that what I am doing complies with any best practice, but it works for our present requirements.
You might also want to consider the performance impact of having multiple clients making asynchronous calls to your web server at regular intervals.
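On the mixed HTML/JSON issue, one hedged option is a small JSON envelope that carries both an HTML fragment and any pending server events; the field names, URL, and dispatcher here are made up for illustration:

```js
// Assumed envelope shape:
// { "content": "<div>...details...</div>",
//   "events": [ { "type": "itemUpdated", "id": 42 } ] }
function handleResponse(response) {
    if (response.content) {
        $('#details-pane').html(response.content); // inject the HTML fragment
    }
    $.each(response.events || [], function (i, ev) {
        dispatchServerEvent(ev); // hypothetical client-side event dispatcher
    });
}

$.post('/item/details', { id: 42 }, handleResponse, 'json');
```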

Ajax architecture question

I have a page with three layers: one for navigation, one for database records, and one for results. When I click on a database record, the results are displayed in the results layer via Ajax. For navigation, the links will simply be different queries. I am wondering whether it would make more sense to send each different query as Ajax data and place the records into the records layer, or to have the query appended to the PHP file's URL each time. Which is the more efficient approach?
Sending a separate AJAX request would be my recommendation, because:
Performance-wise, it will reduce response times, as only the POST data is sent and the data bytes received. The page can then format the result once it receives the XMLHttpRequest response.
Security-wise, I prefer using POST over GET, as it gives at least some opaqueness as to what is being passed as a parameter, so not just anyone can edit the URL and play around. Plus, you don't have the URL length restriction when passing parameters in POST.
So, I'd say fire an XMLHttpRequest on each link and display the response in the results layer (pane/div) on the page.
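A minimal sketch of that suggestion, assuming jQuery; 'records.php', the 'query' data attribute, and the layer ids are assumptions:

```js
// Each navigation link carries its query id; POST it and show the response.
$('#navigation a').on('click', function (e) {
    e.preventDefault();
    $.post('records.php', { query: $(this).data('query') }, function (html) {
        $('#results').html(html); // display the response in the results layer
    });
});
```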
I think your question is quite unspecific and confusing.
What is "appended to the php file"?
Are you really concerned about efficiency? I mean, how fast should the results be displayed? Or are you concerned about the server workload?
Have you read this tutorial? Prototype introduction to Ajax
I think it should answer most of your questions and give enough example code to continue.
