I'm working on a project where I have to fetch data from a database and then present the results in tables containing anywhere from 200 to 3,000 rows.
I first tried PagedList and noticed that only a fixed number of records per page is displayed on the client side, and each time the user changes the page, a new request is sent to the server.
With tablesorter, I noticed that all the results are sent at once and paging is only visual (it only affects the presentation of the table); everything happens on the client side.
Is my understanding correct?
What I want to know is: which approach is better in terms of execution time?
At the moment I'm working on localhost and I'm the only user, so I can't really notice the difference: tablesorter takes longer to load the first time but is very quick afterwards, while the PagedList method loads the page faster but makes a request to the server every time the user changes page, to fetch the corresponding data.
When finished, the application will be deployed on a server and many users will have access to it to add, search, delete...
Which of these two approaches is a better choice?
Thanks
I wanted to check how quickly my web application will display results for the query: SELECT * FROM orders.
The query returns about 20k records on one page, and it takes about 15 seconds.
Why does the response stop after two seconds in every browser? Is it because the browser has trouble displaying so many records on one page? At 70k records it runs out of memory.
Database: MySQL on a hosting provider.
[Screenshots: "problem" and "correct response time"]
If you want to check how long the web app takes to process a request, you can add logging before and after running the query. You could also log the current time when receiving the request and just before returning the response.
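For instance, here is a minimal timing sketch, assuming a Node.js/Express backend using the mysql2 promise API; the endpoint, connection settings, and table name are illustrative, not taken from the original post:

```js
// Log elapsed time around the query and around the whole request.
const express = require("express");
const mysql = require("mysql2/promise");

const app = express();
const pool = mysql.createPool({ host: "localhost", user: "app", database: "shop" });

app.get("/orders", async (req, res) => {
  const requestStart = Date.now(); // when the request was received

  const queryStart = Date.now();
  const [rows] = await pool.query("SELECT * FROM orders");
  console.log(`query took ${Date.now() - queryStart} ms`);

  res.json(rows);
  console.log(`request handled in ${Date.now() - requestStart} ms`);
});

app.listen(3000);
```

Comparing the two logged numbers tells you how much of the 15 seconds is the query itself versus serializing and sending the response.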
As for why the request stops after two seconds, I don't think we have enough information to decide.
It could be due to the default configuration of the web server you use.
In my opinion, displaying 20k records might not be an efficient approach.
Beyond the query time and the response time, you might also want to consider the looping that happens on the front end to render all those rows.
Personally, I would recommend paging with a smaller page size, and if you need to display all the data at once, you might consider lazy loading as an option (see the sketch below).
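As an illustration, here is a lazy-loading sketch in plain browser JavaScript; the /orders endpoint, element ids, and column names are assumptions for the example, not part of the original question:

```js
// Append 20 rows at a time whenever a sentinel element scrolls into view.
const tableBody = document.querySelector("#orders tbody");
const sentinel = document.querySelector("#load-more-sentinel");
const PAGE_SIZE = 20;
let offset = 0;

async function loadNextPage() {
  const res = await fetch(`/orders?offset=${offset}&limit=${PAGE_SIZE}`);
  const rows = await res.json();
  for (const row of rows) {
    const tr = document.createElement("tr");
    tr.innerHTML = `<td>${row.id}</td><td>${row.total}</td>`;
    tableBody.appendChild(tr);
  }
  offset += rows.length;
  if (rows.length < PAGE_SIZE) observer.disconnect(); // no more data to fetch
}

const observer = new IntersectionObserver((entries) => {
  if (entries[0].isIntersecting) loadNextPage();
});
observer.observe(sentinel);
```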
I know this is a very generic answer, but hopefully it helps you out.
I am using the DataTables plugin in Laravel. I have about 3,000 records in one of my tables.
But when I load that page, it loads all 3,000 records into the browser and then creates the pagination, which slows down the page load.
How can I fix this, or what is the correct way to do it?
Use server-side processing.
Get help from one of the Laravel packages, such as Yajra's: https://yajrabox.com/docs/laravel-datatables/
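On the client side, server-side processing is mostly a matter of flipping the serverSide flag in the DataTables configuration. A minimal sketch, where the /users/data endpoint and the columns are illustrative (on the Laravel side, a package such as Yajra DataTables can produce the response format DataTables expects):

```js
// With serverSide: true, DataTables sends paging/sorting/search
// parameters on every draw and renders only the rows it gets back.
$(function () {
  $("#users-table").DataTable({
    processing: true,  // show a "processing" indicator during requests
    serverSide: true,  // delegate paging, sorting, and filtering to the server
    ajax: "/users/data",
    columns: [
      { data: "id" },
      { data: "name" },
      { data: "email" },
    ],
  });
});
```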
Generally you can solve pagination either on the front end, the back end (server or database side), or a combination of both.
Server-side processing without a package would mean using something like TOP/FETCH in SQL to page the rows being returned from your server (see the sketch below).
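As a sketch of what database-side paging looks like without a package, assuming a Node.js handler using the mysql2 promise API; the table, the columns, and the MySQL LIMIT/OFFSET syntax are assumptions for the example (SQL Server would use TOP or OFFSET ... FETCH instead):

```js
// Return one page of rows; the database never sends more than pageSize.
async function getPage(pool, page, pageSize) {
  const offset = (page - 1) * pageSize;
  const [rows] = await pool.query(
    `SELECT id, name, email
       FROM users
      ORDER BY id
      LIMIT ? OFFSET ?`, // MySQL; SQL Server: OFFSET ? ROWS FETCH NEXT ? ROWS ONLY
    [pageSize, offset]
  );
  return rows;
}
```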
You could also load a small amount (say 20) and then when the user scrolls to the bottom of the list, load another 20 or so. I mention the inclusion of front end processing as well because I’m not sure what your use cases are, but I imagine it’s pretty rare any given user actually needs to see 3000 rows at a time.
Given that DataTables has built-in functionality for paginating data, I think that #tersakyan is essentially correct: what you want is some form of back-end filtering or paginating of rows of data to limit what's being sent to the front end.
I don't know if that package works for you or what your setup looks like, but pagination can also be achieved directly from the database returning data via SQL (using TOP/FETCH, for example), or it could be implemented in a Controller or Service by tracking pages of data and "loading a page at a time", both from the server and then into the table. All you would need is a unique key to associate each "set of pages" with a specific request.
But for performance, you want to avoid both large data requests and operations on large sets of data. So the more you limit how much data is grabbed or processed at any stage of your application, the more performant your application will be in principle.
I have a grid with a big set of records (around 10,000 on average). I have implemented server-side paging on the grid to retrieve 50 records at a time. Everything with the grid works perfectly fine until I try to export to PDF from the grid.
When I try to do that, the export takes around 5-6 minutes on average to complete. I tried to debug on the server side and realised that multiple calls to the server were being made to retrieve the data for the export, which was probably eating up the time. I then tried increasing the set of records retrieved to 1,000 at a time, so as to reduce the number of server calls and eventually the export time. But now the page crashes when exporting to PDF. I changed the set to 500 records, but the crash still happens when exporting.
On another note, the export to Excel works pretty fast, around 4 seconds. When debugging I found that only a single call to the server was made while exporting to Excel, and this renders the entire set of data as well.
Please note that I am using the Kendo defaults for exporting to PDF and Excel with the grid.
Thanks.
You need to implement server-side export.
The Kendo documentation says:
Important
When the allPages option is set to true and serverPaging is enabled, the Grid will make a "read" request for all data. If the data items are too many, the browser may become unresponsive. Consider implementing server-side export for such cases.
See this page for details.
Full example: link
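For reference, a minimal sketch of the grid setup the warning describes; the data source URL, schema fields, and columns are illustrative:

```js
// With serverPaging enabled and allPages: true, the built-in export
// issues a single "read" for ALL records, which is what can make the
// browser unresponsive on large data sets; server-side export avoids it.
$("#grid").kendoGrid({
  toolbar: ["excel", "pdf"],
  excel: { allPages: true }, // export every page, not just the current one
  pdf: { allPages: true },   // same for PDF
  dataSource: {
    transport: { read: { url: "/api/orders", dataType: "json" } },
    serverPaging: true,
    pageSize: 50,
    schema: { data: "data", total: "total" },
  },
  columns: [{ field: "id" }, { field: "customer" }, { field: "total" }],
});
```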
I am using the Kendo UI Grid with 20k records. When I change the page size from 20 to 200, the grid takes 40 to 50 seconds to respond, sometimes a minute. Paging is client-side only.
For large datasets, it's better to use server paging, mainly because:
Faster loads
Avoid wasting memory
Avoid unnecessary database and network load; the database only returns the records of the current page
You should consider enabling server paging at the DataSource level, and then reading the pagination values on the backend before performing the query against the database (see the sketch below).
http://docs.telerik.com/kendo-ui/api/javascript/data/datasource#configuration-serverPaging
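A minimal sketch of server paging at the DataSource level; the URL and the schema field names are illustrative. With serverPaging: true the grid sends paging parameters (page, pageSize, skip, take) on each read, and the backend is expected to return only that page plus the total count:

```js
const dataSource = new kendo.data.DataSource({
  transport: {
    read: { url: "/api/orders", dataType: "json" },
  },
  serverPaging: true, // let the server slice the data per page
  pageSize: 50,
  schema: {
    data: "data",   // field holding the current page of records
    total: "total", // total record count, needed to draw the pager
  },
});

$("#grid").kendoGrid({
  dataSource: dataSource,
  pageable: true,
  columns: [{ field: "id" }, { field: "customer" }],
});
```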
Good question, I have also faced this type of issue.
Try using MVVM binding instead of MVC to bind the grid;
for more, see the link below.
http://demos.telerik.com/kendo-ui/grid/mvvm
Binding an MVC grid to a stored procedure with a large amount of data:
I want to bind an MVC grid to an object result returned from a stored procedure (SP).
Normally the grid requests only the data that needs to be shown to the user, which is very good when binding to a table with a large amount of data: it keeps the grid fast and its performance good.
I have two ways to bind the MVC grid to the SP:
Binding to the SP without using .ToList() gives me the error "The result of a query can not be enumerated more than once."
Binding to the SP using .ToList() resolves that error, but it loads all records from the database first, so performance is bad and grid loading, paging, sorting, and filtering are very slow.
Please tell me a solution for binding an MVC grid to an SP that returns a large amount of data, with good performance.
Thank you
I have used jqGrid in the past. jqGrid implements paging, so not all the content of the grid is actually loaded at once: you pick the number of rows per page and jqGrid automatically hooks the navigation up to your controller. There are a lot of examples of this on the web. If you click the next page, it retrieves the appropriate data for that page, and so forth.
The jqGrid page has a lot of data-loading examples that illustrate this: http://www.trirand.com/blog/jqgrid/jqgrid.html, and here is another page that explains how it is implemented on the server side using MVC: http://www.codersource.net/AspNet/ASPNetAdvanced/jqGridPaginginaspnetmvc.aspx. I'm sure if you look around you'll find a lot of information on how to go about this approach. A minimal configuration sketch is below.
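In this sketch the URL, pager element, and column model are illustrative, not from a real project:

```js
// rowNum sets the page size; jqGrid sends page/rows parameters to the
// server on every pager click, so only one page travels at a time.
$("#orders-grid").jqGrid({
  url: "/orders/data",
  datatype: "json",
  colModel: [
    { name: "id", width: 80 },
    { name: "customer", width: 200 },
    { name: "total", width: 100 },
  ],
  rowNum: 20,             // rows per page
  rowList: [20, 50, 100], // page-size options offered to the user
  pager: "#orders-pager", // element that hosts the navigation buttons
  viewrecords: true,      // show "View 1 - 20 of N" in the pager
});
```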
Finally, I usually avoid showing the user a lot of data anyway, mainly because it is hard for a human being to make any sense of more than 100 rows of data without a way to search further within it. So I would rather try to shrink the data size by giving users a way to filter it down further, but this is not always possible.
Hope this helps.